Audit Committee, 26 Nov 2008

Meeting date: Wednesday, November 26, 2008


Section 23 Report


“Review of NHS diagnostic services”

Item 2 concerns a section 23 report. I ask the Auditor General for Scotland to brief the committee on Audit Scotland's "Review of NHS diagnostic services".

Barbara Hurst (Audit Scotland):

If it is all right, convener, I will do the briefing.

We decided to examine diagnostic services because they are an important part of the health care system that helps patients to get accurate diagnoses and the right treatment. Delays in getting tests or their results can be difficult for patients and might affect the start of treatment.

We examined three of the main types of diagnostic services: radiology, endoscopy and laboratory services. At times, we regretted being so ambitious in scope, because they represented quite a lot of services for us to look at. For many patients, diagnosis involves testing by each of those services—I am thinking in particular of cancer patients. The three services face similar challenges, including ensuring that enough staff and equipment are available to meet demand, and ensuring that services are efficient and high quality. The services that we examined provide more than 87 million procedures and tests each year, at a cost of more than £280 million, so they are a significant area of expenditure. We assessed the efficiency of diagnostic services, considered how NHS boards are taking action to improve services for patients and examined performance against waiting time targets.

The four main findings that I want to bring to the committee's attention cover waiting times, quality, efficiency and—a recurring theme for the committee—the use and quality of management information.

First, NHS boards have made significant progress in reducing waiting times for eight key diagnostic radiology and endoscopy tests. Exhibit 4 on page 10 of the report shows that the number of people waiting more than nine weeks for a test fell from more than 10,000 in July 2006 to just two people at the end of June this year. Boards achieved the reductions by doing additional work funded by waiting time money and by making some longer-term sustainable improvements, such as streamlining processes and recruiting more staff when there was a clear need to do so.

As a milestone in achieving a new 18-week referral-to-treatment target, from March next year patients should not wait longer than six weeks for the key diagnostic tests. Exhibit 4 shows that the total number of patients waiting for the tests has been increasing since September 2007. The trend suggests to us that it will be challenging for boards to maintain shorter waits, particularly with the more demanding target.

Secondly, we looked at what boards are doing to improve services for patients. Endoscopy units are now using a quality tool called the global rating scale. Appendix 5 of the report shows progress that is being made on a number of fronts, including seeking patient feedback and improving patient safety. However, the appendix also shows that endoscopy units in England perform better than those in Scotland on all elements in the scale.

We looked at the speed with which tests are carried out and results are reported, which are both important indicators of efficiency and quality. We found that although hospitals perform well in how quickly they carry out in-patient radiology scans, the time that is taken to report the results varies among the hospitals that we examined.

We also found that boards could do more to make appointments more convenient for patients. Just half the hospitals that we reviewed offer patients a choice of date, time and location for endoscopy appointments, and fewer offer such choice in respect of radiology tests. The offer of choice can have the added benefit of reducing the number of patients who do not attend appointments.

Thirdly, we looked at a range of efficiency indicators including unit costs, productivity and the extent to which equipment is used, which showed that there is scope to improve efficiency. For example, we found variation among hospitals in productivity of radiology, endoscopy and laboratory staff, and we found variation among laboratory services in the cost of carrying out tests. We also found that around one in 10 scheduled endoscopy sessions was not used in 2006-07. There are also differences among laboratories in the numbers of repeat tests that are performed on patients. The variation cannot be fully explained by the type of hospital and the complexity of the work that it carries out, or by differences in how hospitals record activity data. We tried a number of statistical tests to determine whether there was any such relationship.
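The kind of check described above—testing whether hospital type accounts for the variation in unit costs—might look something like the following sketch. The cost figures and the choice of a one-way ANOVA are illustrative assumptions only; the report does not set out which statistical tests were used.

```python
# Illustrative sketch only: tests whether hospital type explains variation in
# cost per test. All figures below are made up for illustration.
from scipy import stats

# Hypothetical cost-per-test figures (pounds), grouped by hospital type
teaching_hospitals = [4.10, 3.80, 4.50, 3.95]
general_hospitals = [3.20, 4.70, 2.90, 4.40, 3.60]

# One-way ANOVA: a large p-value suggests that hospital type alone does not
# account for the cost variation, echoing the report's finding
f_stat, p_value = stats.f_oneway(teaching_hospitals, general_hospitals)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```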

Finally, there has been a lot of work aimed at improving the information that boards hold on the performance of diagnostic services, but frustratingly we still found problems with data quality and a lack of standard definitions. There are inconsistencies in how boards count laboratory and radiology activity and how they calculate costs, which makes it difficult to compare published data. National data are therefore not robust enough to estimate potential savings from improved efficiency. There are no national data on the cost of endoscopy services, although the cost is likely to be significant.

I will stop there. As always, we are happy to answer any questions.

Thank you for the introduction and for another comprehensive report.

Willie Coffey (Kilmarnock and Loudoun) (SNP):

I have a short question. Barbara Hurst said that England's endoscopy units are performing better than Scotland's, but paragraph 85 on page 25 suggests that

"Scottish hospitals carried out more weighted procedures per endoscopy room than English hospitals".

Could you clarify the situation?

Barbara Hurst:

I will ask Tricia Meldrum to help me out with the comparisons.

Tricia Meldrum (Audit Scotland):

That quotation is right. We weighted procedures using a standard system that takes account of the amount of time spent on different procedures, so that we could standardise across the range of procedures that are carried out. The data showed that, on average, a slightly higher number of procedures were carried out in hospitals in England; that is based on data collected by the Healthcare Commission in England. The information from the global rating scale relates to the quality of services, so it examines how work is carried out. England has been part of the global rating scale system for a year longer than Scotland, so it has more data to use. The commission has had more feedback and is able to make more use of such information for benchmarking, because it is a year further down the line than we are in using that information to improve services.
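The time-weighting approach described above can be shown with a minimal sketch. The procedure weights, activity figures and room count below are assumptions for illustration, not values from the report or the Healthcare Commission.

```python
# Minimal sketch of time-weighted procedures per endoscopy room.
# Weights and activity figures are illustrative assumptions only.
procedure_time_weights = {      # relative time taken per procedure type (assumed)
    "gastroscopy": 1.0,
    "colonoscopy": 2.0,
    "flexible_sigmoidoscopy": 1.2,
}

# Hypothetical annual activity for one hospital
activity = {"gastroscopy": 3000, "colonoscopy": 1500, "flexible_sigmoidoscopy": 800}
endoscopy_rooms = 4

weighted_procedures = sum(procedure_time_weights[p] * n for p, n in activity.items())
print(f"Weighted procedures per room: {weighted_procedures / endoscopy_rooms:.0f}")
```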

Willie Coffey:

I will not pretend to understand that full explanation, but there seemed to be a contradiction between the information that Barbara Hurst said is in appendix 5 and what paragraph 85 says about the same subject—endoscopy. I do not fully understand that.

Tricia Meldrum:

The appendix and paragraph 85 relate to different things. Paragraph 85 relates to activity—how many procedures have been carried out and how efficiently—whereas the appendix relates to aspects of quality, such as how patients are managed and treated, getting patient consent and seeking patient feedback, which contribute to the quality of the patient experience and indicate how patient focused services are. I hope that that is clear.

I will leave it at that.

Andrew Welsh (Angus) (SNP):

If you do not know where you are, it makes it much more difficult to get to where you want to be. The report states that the national health service lacks

"basic information it needs to ensure diagnostic services are provided efficiently. Where data do exist, they are not consistent."

It also states that data quality is an issue and that NHS boards are unable to provide basic performance information. There is a catalogue of problems. How difficult is it to provide that information? Is there a need for standardisation? What level of finance would be required? Is there any difficulty in getting agreed standards? There is a lack of standard definitions and comparators for costs, without which we cannot really know whether progress is being made. How difficult a problem is that to solve?

Barbara Hurst:

We would say that it is clearly quite difficult: if it were not, the NHS would have solved it by now. Quite a lot of work is going on around benchmarking some of the services. That is a start, but it is throwing up a lot of issues around the information that is currently collected on costs through the cost book, for example.

It is fair to say that there is variation in how good the systems are in different hospitals. One of the key challenges will be to ensure that those systems can talk to one another, because a patient does not just get one set of services—he or she will need a range of services. A key issue is to ensure that that can happen more effectively.

Would Tricia Meldrum or Catherine Vallely like to add anything?

Catherine Vallely (Audit Scotland):

The radiology benchmarking project was in its first year of collecting data on diagnostic activity. Work obviously needs to be done to refine that data set, particularly in relation to how boards count examinations. Some boards count an examination of the body as one examination, but others count it as, for example, three if it covers three different areas. That is quite a simple issue, but definitions of how boards count activity and calculate costs are required. The radiology benchmarking scheme is an iterative process, so I hope that some of the definitions will be standardised.
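The counting inconsistency described here can be shown with a small example; the examination record below is hypothetical.

```python
# Illustrative example of the counting issue described above: the same
# examination covering three body areas can be recorded as one examination
# or as three, depending on the board's convention.
examination = {"patient_id": "X", "modality": "CT",
               "body_areas": ["chest", "abdomen", "pelvis"]}

count_per_attendance = 1                               # convention A: one examination
count_per_body_area = len(examination["body_areas"])   # convention B: three examinations

print(count_per_attendance, count_per_body_area)       # 1 3 for identical activity
```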

Are there in-built technical difficulties or disagreements on standards and on what the data should be?

Catherine Vallely:

There are inconsistencies in how boards count things. It is not necessarily the case that there is disagreement, but they have counted things differently. The way they provide information for the cost book data is also different. A standardised definition is required and boards need to apply it.

So the problem is solvable.

Catherine Vallely:

Yes.

And the solution is not too distant. It can be done.

Tricia Meldrum:

We made a number of recommendations about the Government, ISD Scotland and the boards working together so that clearer guidance, standard definitions and much better, usable information exist. We found that boards do not use information as much as they might for benchmarking because they have concerns about its quality and usefulness.

I wish you well in getting a solution, as we are talking about a fundamental building block.

George Foulkes (Lothians) (Lab):

I want to return to the general point about the collection of information that Barbara Hurst made in her introductory remarks—Andrew Welsh has just made the same point. A number of reports have said that there seems to be a great lack of information, and of structures to collect information centrally, in the health service. Surely it makes Audit Scotland's job of producing such reports difficult if it does not have such information. Is that right?

Barbara Hurst:

The health service has a lot of information—or, perhaps more accurately, a lot of data. The difficulty lies in translating those data into good management information. For the "Review of NHS diagnostic services", we collected some information ourselves that is such basic management information that boards should already have collected it. For example, we did sample testing on how long it takes to turn around a test. We are not talking about getting such data just for the sake of it, but because the information would help people to manage the service better. That is the issue for us.

George Foulkes:

We are talking about a lot of money. Page 3 of the report states that, in 2006-07, the national health service

"spent over £178 million on radiology services"

and

"£246 million on laboratory services".

It also states:

"There is no published information on how much the NHS in Scotland spends on endoscopy".

I find that astonishing. Do you?

Barbara Hurst:

We were certainly surprised that we had problems getting any costings on endoscopy, given that it is quite a big service. That is why we said that its costs are likely to be significant.

Is it not difficult to see how you can make recommendations on how to save money on a service if you do not know how much money is being spent on it in the first place?

Mr Robert Black (Auditor General for Scotland):

Our report contains pointers to areas that the NHS might look to in order to save money or—as we like to say—to free resources for redeployment in the service. Exhibit 8 on page 24 of the report contains simple diagrams; I suspect that, as ever, quite a lot of work went into them. The exhibit simply shows costs per test or request by laboratory discipline. I will not go into details, but we can see that boards' unit costs vary quite a bit. We are talking about large numbers; significant sums of money are involved when things are grossed up. National health service managers should find the report useful in helping them to home in on areas in which they must capture better data to improve the service's efficiency and free up resources for other purposes.

George Foulkes:

I want to talk specifically about laboratory services, as I do not have so much anecdotal evidence of other services. We keep hearing stories about blood not being sent, going back, getting lost, not arriving on time and so on. The system seems to be a bit haphazard. Did you study the procedure by which blood is taken, classified, taken to a laboratory, dealt with and taken back? Did you study whether the process is efficient? My experience is that it is not but that it is a bit ad hoc.

Barbara Hurst:

It is fair to say that the team did not track a test from the ward down to the laboratory and back again. The available information was laboratory information rather than information about the processes that are involved in getting samples to the laboratory and back again. It is clear that considering only what goes on in the laboratory provides only a partial picture.

The report highlights issues to do with repeat testing. Those issues would suggest that there is something in what George Foulkes says—information may not be readily available to the clinician on the ward, who may therefore ask for a test when one has already been carried out.

George Foulkes:

The people getting these tests are very worried about their condition. They are usually cancer patients, who feel great anxiety as they await the results of tests. They must get clear results on time and the system must be efficient and cost effective. Did those concerns come out in your study? You suggest that you did not go into that much detail.

Barbara Hurst:

The health directorates have—although I will probably get the name wrong—a diagnostic collaborative, which has been working with boards on how to improve their systems and looking at whether they can strip out some processes to make things more efficient. A lot of work has been done in different boards on exactly the kind of issue that you raise. We hope that our report will give that work a bit of a push. We want all boards to consider the variations and to find where they might be able to improve their systems.

In all our reports, we have started adding a self-assessment checklist at the back. Even when boards have not been part of our sample, we expect them to assess themselves against some of the processes. Those assessments are being worked on by our local audit teams in each health board.

The Convener:

George Foulkes and Andrew Welsh have both raised the point that, to know where we want to go, we have to understand where we are coming from—we have to have basic data. As both have said, that point has been a consistent theme.

My questions are not specific to this report. Do you see signs of progress in the way in which information and data are collected and used? Is there something that Audit Scotland, or the committee, needs to do to encourage better practice, so that we do not find ourselves sitting here in a year or two making the same points?

Barbara Hurst:

Across the United Kingdom, a lot of work is being done on information within the health service—although we are not saying that the issue is easy to crack. An Atkinson review is considering ways of measuring productivity and of improving information across the piece. If there were no information, we would not have been able to produce a report such as this one. We have included that information, but we have had to caveat some of it.

Catherine Vallely said earlier that, for these particular services, standardisation is key. We hope that there will be quite a move towards standardisation in diagnostic services within the next 18 months. However, we cannot guarantee that that will happen in all services; as we start to consider services, we start to see the difficulties in reporting on them objectively.

We do not want to give the committee the impression that nothing is going on, because quite a lot is going on. However, it will take time, and some of the work will have to be done at UK level.

Stuart McMillan (West of Scotland) (SNP):

In a previous life, I worked for an electronics firm. We dealt with figures every single day, and I do not think that health boards or Governments should be any different. They should be looking for the biggest bang for their buck.

George Foulkes picked up on a point that is made in paragraph 5 on page 3 of the report, which says:

"There is no published information on how much the NHS in Scotland spends on endoscopy".

The Parliament came into being in 1999, and I find it surprising that, since then, health boards and Governments have not got to grips with that.

My second point follows from that and is about the more efficient use of resources. If we do not know exactly what is out there, how can we use resources more efficiently? Last night, I saw Alistair Darling saying on the news that all Government departments should be more efficient. Efficiencies can be made, but we must know where we are starting from in order to move forward.

Barbara Hurst:

I absolutely agree. We have not used the word "logistics", but it is clear that some of these processes align closely with logistics management in the private sector. We are keen for the report to be taken seriously in driving forward the efficiency agenda, and we take on board the committee's comments that such services matter to patients, who need a quick diagnosis and the right treatment.

Willie Coffey:

We would be remiss if we did not reinforce Barbara Hurst's opening remarks about the reduction in the number of people waiting for more than nine weeks, from 10,600 two years ago to only two this year. I am interested to know how many people are nearer the six-week target. We must thank the NHS for its efforts and record that we are delighted with that progress, which has continued in recent years. How crucial is driving the waiting time down to six weeks to meeting the 18-week referral-to-treatment target? Is achieving six weeks an absolute requirement, or is it just part of a package of measures that will enable us to meet the 18-week target?

Barbara Hurst:

We are not sure where the six weeks came from within the 18 weeks. To meet the 18-week referral-to-treatment target, the time for diagnostics must be reduced, because they are the key to getting people into the treatment process. I do not know whether the waiting time should be six, four or eight weeks, but it probably needs to reduce from nine weeks if the 18-week target is to be hit.

Tricia Meldrum:

The report makes recommendations on the potential for more direct referrals for tests from GPs, which mean that patients do not have to wait to go to out-patient clinics. Boards could consider when direct referrals would be appropriate and put in place appropriate protocols with GPs to extend direct referrals.

Cathie Craigie (Cumbernauld and Kilsyth) (Lab):

Paragraph 76 on page 21 describes the differences in boards' costs, which are shown in exhibit 8. The costs per test differ hugely between the two boards that serve my constituency—NHS Lanarkshire and NHS Greater Glasgow and Clyde. What analysis has been done of those figures? What were the outcomes? Were they repeat tests? NHS Greater Glasgow and Clyde's tests cost a lot more, but which organisation was most efficient?

Barbara Hurst:

In exhibit 8, it is interesting that NHS Greater Glasgow and Clyde is the most expensive for the tests. We think that that is to do with the fact that it has many—

Sites.

Barbara Hurst:

Yes—exactly. I ask Catherine Vallely whether we have information that we have not put in the report about the underlying reasons for those costs.

Catherine Vallely:

We have no such information. We obtained the figures from the Keele University UK benchmarking laboratory scheme; 2006-07 was the first year in which boards participated in that scheme. We had the total costs and the activity for the discipline and we calculated the cost per test or per request. One reason that NHS Greater Glasgow and Clyde gave us for such variation was its number of sites, which Barbara Hurst mentioned. The board plans to centralise services by 2011, when it projects a lower cost per test and per request. The board's complex case mix is also a factor in the variation. However, we found variation between the figures for another teaching board—NHS Lothian—and those of NHS Greater Glasgow and Clyde. As a result, we make recommendations on making better use of resources and achieving a lower unit cost for laboratory tests.
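The unit-cost calculation described here—total cost for a discipline divided by activity—can be sketched as follows. The board names and figures are placeholders, not the values in exhibit 8 or the Keele benchmarking scheme.

```python
# Sketch of the cost-per-test calculation described above: total laboratory
# cost for a discipline divided by the number of tests (or requests).
# The figures are hypothetical placeholders.
boards = {
    "Board A": {"total_cost": 6_000_000, "tests": 2_400_000},
    "Board B": {"total_cost": 5_000_000, "tests": 1_250_000},
}

for name, d in boards.items():
    cost_per_test = d["total_cost"] / d["tests"]
    print(f"{name}: £{cost_per_test:.2f} per test")
```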

But no work or weighting has been done to find out how efficient NHS Lanarkshire is in comparison with NHS Greater Glasgow and Clyde.

Catherine Vallely:

No.

Barbara Hurst:

Do we know how the boards use the benchmarking information?

Catherine Vallely:

It has not been published, so it is just being used in the service. Again, as 2006-07 was only the first year of assessment, there are issues to do with the way in which laboratory tests are counted. Keele University is trying to refine that. More definitions will have been agreed for the current year, but that work is on-going. Data quality for laboratory services was also an issue. However, the definitions must be standardised.

So we cannot make the comparison that I suggested until we get standardisation.

Catherine Vallely:

No.

Thank you for that.

What is the ballpark figure for spending on the endoscopy service? Is it around £200 million or £300 million? The amount spent on radiology is £178 million, and the amount spent on laboratory services is £246 million.

Barbara Hurst:

I would hesitate to give a figure.

Really?

Barbara Hurst:

The staff who carry out the procedures are very skilled.

Can you give a rough figure?

Catherine Vallely:

The tariff in England for an interventional or diagnostic endoscopy procedure is around £200.

How many such procedures take place in Scotland?

Catherine Vallely:

Oh, how many did we say there were in Scotland?

Mr Black:

We are meant to be good at the maths, are we not?

Catherine Vallely:

Yes.

If we add the three figures for diagnostic services together, we are getting towards £1 billion for those NHS services. Is the Scottish Executive's total budget about £30 billion?

It is £35 billion.

George Foulkes:

The figure of £1 billion is 2 or 3 per cent of the Scottish Government's whole budget. As was said, a lot of money is spent on diagnostic services, so the work that the witnesses have done on the matter is important, as is the follow-up to it.

Mr Black:

We could answer Mr Foulkes's question on the basis of a figure of £200 for an endoscopy procedure, if I could add up a column of numbers quickly.

That is a dangerous route for the Auditor General to go down.

Mr Black:

You are right, convener, so I delegate that happily. The fourth column of exhibit 11 has total numbers for weighted endoscopy procedures.

Catherine Vallely:

Yes.

Mr Black:

So we could work something out from that, if members were interested.

Catherine Vallely:

We would first have to unweight the figures.
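The back-of-envelope estimate discussed in this exchange—unweighting the exhibit 11 totals and multiplying by a tariff of roughly £200 per procedure—might be sketched as follows. The procedure count below is a placeholder, not a figure from the report.

```python
# Back-of-envelope estimate discussed above: unweighted annual endoscopy
# procedures multiplied by an indicative English tariff of roughly £200.
# The procedure count is a placeholder; the real total would come from
# unweighting the figures in exhibit 11 of the report.
TARIFF_PER_PROCEDURE = 200          # pounds, indicative English tariff
annual_procedures = 150_000         # placeholder value, not from the report

estimated_spend = TARIFF_PER_PROCEDURE * annual_procedures
print(f"Indicative endoscopy spend: £{estimated_spend / 1e6:.0f} million")
```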

It would be useful to know the ballpark figure for the cost of endoscopy services.

Mr Black:

I am sure that we can write a letter to the committee about that.

Barbara Hurst:

Can we do that rather than try to add up the figures now?

Yes.

That is acceptable. It has been a useful discussion, and I thank the witnesses for the information. Does the committee agree to note the report?

Members indicated agreement.