Official Report
Item 3 is a briefing from the Auditor General on information issues identified in recent Audit Scotland reports on the national health service in Scotland.
As the committee is well aware, a recurrent theme in reports on the NHS presented in my name is limited management information on the cost, quality and accessibility of services. Following a discussion at its meeting on 12 November, the committee asked for a briefing paper from Audit Scotland that would pull together the main themes relating to information issues that have arisen in our recent health reports. As a result, this is not a formal report in my name; it has been prepared for the committee by Audit Scotland and we hope that members find it informative and useful.
The briefing paper looks at the 11 reports on the NHS that we published between January 2007 and April 2009. I draw the committee's attention to five main findings.
Thank you for that briefing and the summary report. You mentioned pages 2 to 3 of the report. On one level, what we read there is worrying. For example, the gap identified under "Financial information" is:
The two points are closely linked. We are seeing signs of progress through the information that the Government and other bodies have provided, but we will want to follow up some of the issues as we carry out follow-up work on some of our studies. We routinely assess the impact of our reports, at a high level at least, to get a sense of what has happened and what has been developing. That helps us to identify areas in which we want to do further, more detailed follow-up work, perhaps because we are less confident that progress is being made and things are happening. We will continue to consider information issues in all our studies, and we will continue to report on them in our reports and to the committee.
Can we reasonably say that, although there are still weaknesses, we are confident that progress is being made?
We are confident that progress is being made. We cannot comment on whether the developments will address all the issues that we have identified, because we have not done the validation work on that.
In the specific pieces of work that you will undertake in the near future, is there anything that is likely to come back to the committee that will enable us to consider some of the issues in more detail?
That is a good question. Towards the end of the year, we will produce our biennial performance and financial overview of the NHS. If the committee felt that it would be useful, we could ensure that that report, which examines general performance issues, includes a theme on information issues and what progress is being made. If the committee was so minded and felt that it was appropriate, that might be a good opportunity for it to take evidence on those matters and on any other matters relating to the general performance of the health service. We must recognise that information is there for a purpose, so it is rather good to link the issue to how the information is used for performance measurement and performance management purposes.
I have a comment and a question. The paper is an excellent summary and it sets out the information in a helpful way. In particular, the tabular format in appendix 2 is helpful for identifying the issues. My comment relates to Tricia Meldrum's important point that there is a substantial cost burden on NHS boards in collating the information. We heard earlier about the backdrop of a great deal of pressure on public finances, which will mean a great deal of pressure on politicians to ensure that in the health service, for example, front-line services are preserved. That will inevitably mean that much greater pressure will be put on backroom functions, such as the collection of data. In the years ahead, we must all be aware of the extent to which it will be possible to maintain robustness in the collection of information, given that severe efficiency targets will be put on health boards. That is just a comment, although the Audit Scotland team are welcome to respond if they wish.
Sorry, but which table are you referring to?
The table in appendix 2. I am looking at the right-hand column, which is on monitoring and evaluation information. You identify several issues on which on-going work is being done, but at no point are there target dates for progress.
The updates that are in italics are based on information that was provided by the Government. We gave the Government the table and, in some cases, it provided more detailed information, which we summarised to make it easier to understand. However, we did not strip out any dates from the information that the Government provided. If we follow up on individual reports and studies, we would ask for the details of timescales and timelines.
Yes, of course.
We are talking about core information that bodies need to manage their services efficiently and appropriately. We do not see it as an optional add-on; rather, it is core to managing services in the best way possible. The issue is ensuring that it is fit for purpose; it is not about collecting information for the sake of it. The information is core to business.
I will ask about the comments on page 3 of the paper, on monitoring and evaluation and the lack of national information to allow benchmarking to take place across the health boards. The paper mentions a national benchmarking project with more than 90 indicators in place. Is everybody embracing that project? If they are not, why are they not? If consistency in reporting is lacking across the boards, what is that national benchmarking project doing?
We have tried to use some of the benchmarking information in the past in considering the diagnostics project, for example, and there is also work on benchmarking radiology information. However, we have found problems with consistency and data quality. The project is taking forward work to improve that information, on which the Government has given us an update. Even so, we could not draw robust conclusions from the data, given the differences in definitions and the data quality issues. We want to consider benchmarking work that is relevant to individual studies, and we have done so in the past, but we have not always found it to be as robust as it could be.
Does an across-Scotland knowledge management strategy need to emerge or develop in the NHS to assist us in getting consistency of reporting across the boards? Is something lacking? You have said several times that no national information is available to us to allow us to benchmark, and the benchmarking framework does not seem to be quite what we want. Do we need to move things a step forward and consider knowledge management in a different way? Obviously, clinical and IT management expertise would be used to bring information together so that we get what we are looking for in the long run.
I think that there is a national knowledge management strategy—it might not be called that, although it could be called something similar—but I am not sure about the extent to which it takes in some of the clinical information. I think that it is more to do with things such as access to evidence-based health care and evidence-based management. We have looked more at individual topics related to individual studies and consistency in that context rather than across the whole of knowledge management.
As members have no other questions, do they agree to note the report? We thank Audit Scotland for providing a helpful report, and we look forward to its working the themes into future reports so that we will have the opportunity to return to the issues and consider them in more detail.
We would be happy to do that, convener.
Thank you very much.