“Review of NHS diagnostic services”
Item 2 concerns a section 23 report. I ask the Auditor General for Scotland to brief the committee on Audit Scotland's "Review of NHS diagnostic services".
If it is all right, convener, I will do the briefing.
Thank you for the introduction and for another comprehensive report.
I have a short question. Barbara Hurst said that England's endoscopy units are performing better than Scotland's, but paragraph 85 on page 25 suggests that
I will ask Tricia Meldrum to help me out with the comparisons.
That quotation is right. We weighted procedures using a standard system that takes account of the time spent on different procedures, so that we had a means of standardisation across the range of procedures that are carried out. The data showed that, on average, a slightly higher number of procedures was carried out in hospitals in England. That is based on data that were collected by the Healthcare Commission in England. The information from the global rating scale relates to the quality of services, so it examines how the work is carried out. England has been part of the global rating scale system for a year longer than Scotland, so it has more data to use. The commission has had more feedback and is able to make more use of that information for benchmarking, because it is a year further down the line than we are in using the information to improve services.
I will not pretend to understand that full explanation, but there seemed to be a contradiction between the information that Barbara Hurst said is in appendix 5 and what paragraph 85 says about the same subject—endoscopy. I do not fully understand that.
The appendix and paragraph 85 relate to different things. Paragraph 85 relates to the activity—how many procedures have been carried out and their efficiency—and the appendix relates to qualities such as how patients are managed and treated and matters such as getting patient consent and patient feedback, which contribute to the quality of the patient experience and indicate how patient-focused services are. I hope that that is clear.
I will leave it at that.
If you do not know where you are, it makes it much more difficult to get to where you want to be. The report states that the national health service lacks
We would say that it is clearly quite difficult: if it were not, the NHS would have solved it by now. Quite a lot of work is going on around benchmarking some of the services. That is a start, but it is throwing up a lot of issues around the information that is currently collected on costs through the cost book, for example.
The radiology benchmarking project was in its first year of collecting data on diagnostic activity. Work obviously needs to be done to refine that data set, particularly in relation to how boards count examinations. Some boards count an examination of the body as one, but others count it as, for example, three if it covers three different areas. That is quite a simple issue, but definitions of how boards count activity and calculate costs are required. The radiology benchmarking scheme is an iterative process, so I hope that some of the definitions will be standardised.
Are there in-built technical difficulties or disagreements on standards and on what the data should be?
There are inconsistencies in how boards count things. It is not necessarily the case that there is disagreement, but they have counted things differently. The way they provide information for the cost book data is also different. A standardised definition is required and boards need to apply it.
So the problem is solvable.
Yes.
And the solution is not too distant. It can be done.
We made a number of recommendations about the Government, ISD Scotland and the boards working together so that clearer guidance, standard definitions and much better, usable information exist. We found that boards do not use information as much as they might for benchmarking because they have concerns about its quality and usefulness.
I wish you well in getting a solution, as we are talking about a fundamental building block.
I want to return to the general point about the collection of information that Barbara Hurst made in her introductory remarks—Andrew Welsh has just made the same point. A number of reports have said that there seems to be a great lack of information, and of structures to collect information centrally, in the health service. Surely it makes Audit Scotland's job of producing such reports difficult if it does not have such information. Is that right?
The health service has a lot of information—or, perhaps more accurately, a lot of data. The difficulty lies in translating those data into good management information. For the "Review of NHS diagnostic services", we collected some information ourselves that is such basic management information that boards should already have collected it. For example, we did sample testing on how long it takes to turn around a test. We are not talking about getting such data just for the sake of it, but because the information would help people to manage the service better. That is the issue for us.
We are talking about a lot of money. Page 3 of the report states that, in 2006-07, the national health service
We were certainly surprised that we had problems getting any costings on endoscopy, given that it is quite a big service. That is why we said that its costs are likely to be significant.
Is it not difficult to see how you can make recommendations on how to save money on a service if you do not know how much money is being spent on it in the first place?
There are pointers in our report to areas that the NHS might look to in order to save money or, as we like to say, to free up resources for redeployment in the service. Exhibit 8 on page 24 of the report contains simple diagrams; I suspect that, as ever, quite a lot of work went into them. The exhibit simply shows costs per test or per request by laboratory discipline. I will not go into details, but we can see that boards' unit costs vary quite a bit. We are talking about large numbers; significant sums of money are involved when things are grossed up. National health service managers should find the report useful in helping them to home in on areas in which they must capture better data to improve the service's efficiency and free up resources for other purposes.
I want to talk specifically about laboratory services, as I do not have so much anecdotal evidence of other services. We keep hearing stories about blood not being sent, going back, getting lost, not arriving on time and so on. The system seems to be a bit haphazard. Did you study the procedure by which blood is taken, classified, taken to a laboratory, dealt with and taken back? Did you study whether the process is efficient? My experience is that it is not but that it is a bit ad hoc.
It is fair to say that the team did not track a test from the ward down to the laboratory and back again. The available information was laboratory information rather than information about the processes that are involved in getting samples to the laboratory and back again. It is clear that considering only what goes on in the laboratory provides only a partial picture.
The people getting these tests are very worried about their condition. They are usually cancer patients, who feel great anxiety as they await the results of tests. They must get clear results on time and the system must be efficient and cost effective. Did those concerns come out in your study? You suggest that you did not go into that much detail.
The health directorates have—although I will probably get the name wrong—a diagnostic collaborative, which has been working with boards on how to improve their systems and looking at whether they can strip out some processes to make things more efficient. A lot of work has been done in different boards on exactly the kind of issue that you raise. We hope that our report will give that work a bit of a push. We want all boards to consider the variations and to find where they might be able to improve their systems.
George Foulkes and Andrew Welsh have both raised the point that, to know where we want to go, we have to understand where we are coming from—we have to have basic data. As both have said, that point has been a consistent theme.
Across the United Kingdom, a lot of work is being done on information within the health service—although we are not saying that the issue is easy to crack. The Atkinson review is considering ways of measuring productivity and of improving information across the piece. If there were no information, we would not have been able to produce a report such as this one. We have included that information, but we have had to caveat some of it.
In a previous life, I worked for an electronics firm. We dealt with figures every single day, and I do not think that health boards or Governments should be any different. They should be looking for the biggest bang for their buck.
I absolutely agree. We have not used the word "logistics", but it is clear that some of these processes align closely with the logistics of managing processes in the private sector. We are keen for the report to be taken seriously in driving forward the efficiency agenda and in taking on board the committee's comments that such services matter to patients in achieving a quick diagnosis and the right treatment.
We would be remiss if we did not reinforce Barbara Hurst's opening remarks about the reduction in the number of people waiting for more than nine weeks, from 10,600 two years ago to only two this year. I am interested to know how many people are nearer the six-week target. We must thank the NHS for its efforts and record that we are delighted with that progress, which has continued in recent years. How crucial is driving the waiting time down to six weeks to meeting the 18-week referral-to-treatment target? Is achieving six weeks an absolute requirement, or is it just part of a package of measures that will enable us to meet the 18-week target?
We are not sure where the six weeks came from within the 18 weeks. To meet the 18-week referral-to-treatment target, the time for diagnostics must be reduced, because they are the key to getting people into the treatment process. I do not know whether the waiting time should be six, four or eight weeks, but it probably needs to reduce from nine weeks if the 18-week target is to be hit.
The report makes recommendations on the potential for more direct referrals for tests from GPs, which mean that patients do not have to wait to go to out-patient clinics. Boards could consider when direct referrals would be appropriate and put in place appropriate protocols with GPs to extend direct referrals.
Paragraph 76 on page 21 describes the differences in boards' costs, which are shown in exhibit 8. The costs per test differ hugely between the two boards that serve my constituency—NHS Lanarkshire and NHS Greater Glasgow and Clyde. What analysis has been done of those figures? What were the outcomes? Were they repeat tests? NHS Greater Glasgow and Clyde's tests cost a lot more, but which organisation was most efficient?
In exhibit 8, it is interesting that NHS Greater Glasgow and Clyde is the most expensive for the tests. We think that that is to do with the fact that it has many—
Sites.
Yes—exactly. I ask Catherine Vallely whether we have information that we have not put in the report about the underlying reasons for those costs.
We have no such information. We obtained the figures from the Keele University UK benchmarking laboratory scheme; 2006-07 was the first year in which boards participated in that scheme. We had the total costs and the activity for the discipline and we calculated the cost per test or per request. One reason that NHS Greater Glasgow and Clyde gave us for such variation was its number of sites, which Barbara Hurst mentioned. The board plans to centralise services by 2011, when it projects a lower cost per test and per request. The board's complex case mix is also a factor in the variation. However, we found variation between the figures for another teaching board—NHS Lothian—and those of NHS Greater Glasgow and Clyde. As a result, we make recommendations on making better use of resources and achieving a lower unit cost for laboratory tests.
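For clarity, the unit costs that are described here come from a simple division of each discipline's total reported cost by its reported activity. The worked figure below is purely illustrative and is not taken from exhibit 8.

\[
\text{cost per test} = \frac{\text{total reported cost of the discipline}}{\text{number of tests reported}},
\qquad \text{for example} \quad \frac{\pounds 2{,}400{,}000}{300{,}000\ \text{tests}} = \pounds 8\ \text{per test.}
\]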
But no work or weighting has been done to find out how efficient NHS Lanarkshire is in comparison with NHS Greater Glasgow and Clyde.
No.
Do we know how the boards use the benchmarking information?
It has not been published, so it is just being used in the service. Again, as 2006-07 was only the first year of assessment, there are issues to do with the way in which laboratory tests are counted. Keele University is trying to refine that. More definitions will have been agreed for the current year, but that work is on-going. Data quality for laboratory services was also an issue. However, the definitions must be standardised.
So we cannot make the comparison that I suggested until we get standardisation.
No.
Thank you for that.
What is the ballpark figure for spending on the endoscopy service? Is it around £200 million or £300 million? The amount spent on radiology is £178 million, and the amount spent on laboratory services is £246 million.
I would hesitate to give a figure.
Really?
The staff who carry out the procedures are very skilled.
Can you give a rough figure?
The tariff in England for an interventional or diagnostic endoscopy procedure is around £200.
How many such procedures take place in Scotland?
Oh, how many did we say there were in Scotland?
We are meant to be good at the maths, are we not?
Yes.
If we add the three figures for diagnostic services together, we are getting towards £1 billion for those NHS services. Is the Scottish Executive's total budget about £30 billion?
It is £35 billion.
The figure of £1 billion is 2 or 3 per cent of the Scottish Government's whole budget. As was said, a lot of money is spent on diagnostic services, so the work that the witnesses have done on the matter is important, as is the follow-up to it.
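For reference, the percentage cited follows directly from the figures quoted in this exchange, taking roughly £1 billion for diagnostic services against the £35 billion total budget:

\[
\frac{\pounds 1\ \text{billion}}{\pounds 35\ \text{billion}} \approx 0.029 \approx 3\ \text{per cent.}
\]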
We could answer Mr Foulkes's question on the basis of a figure of £200 for an endoscopy procedure, if I could add up a column of numbers quickly.
That is a dangerous route for the Auditor General to go down.
You are right, convener, so I delegate that happily. The fourth column of exhibit 11 has total numbers for weighted endoscopy procedures.
Yes.
So we could work something out from that, if members were interested.
We would first have to unweight the figures.
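The calculation that the Auditor General declines to attempt on the spot would, on the assumptions stated in this exchange, take the form below; the procedure numbers are in exhibit 11 and are not reproduced here.

\[
\text{approximate cost of endoscopy procedures} \approx \text{number of procedures (unweighted)} \times \pounds 200.
\]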
It would be useful to know the ballpark figure for the cost of endoscopy services.
I am sure that we can write a letter to the committee about that.
Can we do that rather than try to add up the figures now?
Yes.
That is acceptable. It has been a useful discussion, and I thank the witnesses for the information. Does the committee agree to note the report?