Agenda item 3 is consideration of the Audit Scotland report “Our impact: Monitoring and Evaluation Report 2025”. We are again joined by the Auditor General, Stephen Boyle, and alongside Mr Boyle is Mark MacPherson—you are welcome back, Mr MacPherson—and Michelle Borland, who is the head of organisational improvement at Audit Scotland. We have one or two questions to put to you but, before we get to those, Auditor General, I invite you to make an opening statement.
I am pleased to present to the committee our annual impact monitoring and evaluation report, which we published in September of this year, based on work up to May 2025. The report includes data covering our performance audits and our annual audit work. It also includes follow-up work on annual audits in the financial year ending 31 March 2024, together with performance audits published between July 2022 and February 2024, many of which will have been considered by this committee. For the communication monitoring part of the report, we looked at reports that were published more recently, between February 2024 and December of last year.
We think that we have good evidence that public bodies are accepting our recommendations, which are the primary vehicle by which we look to support assurance and improvement, and that they are responding positively to them and taking steps to implement our recommendations. However, there is also evidence that the pace of implementation of some audit recommendations is slow. Our performance audit teams assessed that more than half of the recommendations that were in scope for this audit impact report are work in progress, and, regrettably, that only 6 per cent have been fully implemented.
Recommendations about financial sustainability and the use of resources to improve outcomes have the lowest rates of implementation. Annual reports on the college sector and the NHS that I produce have shone light on fiscal sustainability issues and the need to accelerate progress on transformation and reform. We will continue to follow up those issues in the recent college report and in the upcoming NHS annual report later this year.
The annual audit recommendations are being implemented more quickly. They focus on a single body rather than being sector wide, which in some ways makes implementing an audit recommendation more straightforward, as it does not require collaboration across multiple public bodies. Of the 2022-23 audit recommendations, 56 per cent have been implemented and a further 33 per cent are in progress.
We aim to be proportionate when evaluating impact. We are interested in a broad understanding of what is changing in public services due to our audit work. That is about more than just the number of audit recommendations that have or have not been implemented. Monitoring and evaluating our work is important, as it helps us understand whether we are making the difference that is expected of us by the Parliament. It also helps to identify improvements to our audit approach and increase our impact in the years to come. Later this year, we will publish our first evaluation report on how well we are delivering against the outcomes that the Audit Scotland board, the Accounts Commission and I agreed as part of “Public audit in Scotland 2023-28”.
Lastly, following up audit recommendations is a critical part of understanding our impact. We recognise that there are boundaries to this work, with public bodies being responsible for implementing audit recommendations as they choose to do so. As we have discussed in recent times with the committee, we cannot compel a public body to implement a recommendation that it has previously accepted. However, we also aim to follow up recommendations to support continuous improvement and to better understand the pace of progress and change in public services and, as ever, to support parliamentary scrutiny.
Michelle Borland, Mark MacPherson and I look forward to answering the committee’s questions.
Thank you very much indeed. You mentioned the importance of monitoring and evaluation in making a difference. Could you explain how the monitoring and evaluation framework makes a difference?
I am happy to do that. I might pass straight to Michelle Borland to set out some of our wider approaches and then I will be happy to come back in, with Mark MacPherson.
Thank you for the question about the impact framework. We share that framework in exhibit 1 in the report. We take a three-pronged approach to monitoring and evaluating our impact within public audit. We monitor early impacts, so that we can get an understanding of the visibility of our work and the initial traction that it is getting. We then evaluate the early impact, within the first 12 to 18 months of a report being produced. That looks at what is happening with our recommendations, how our work is influencing change, where we are seeing change in public bodies as a result of recommendations that we have made, and what stakeholder feedback is telling us about the quality of our work and how it is landing with stakeholders. The third tier of the framework, which is a critical aspect, is thinking about how that change is influencing the outcomes that we set within “Public audit in Scotland”.
The report that the committee has in front of it today looks at the first two layers of that framework: the early impact that we are having through our social media, media and communications engagement and then the early impact that we are having through our recommendations and the change that is happening as a result. We look forward to publishing, later this year, a further report on how that impact is influencing and contributing to the outcomes in “Public audit in Scotland”.
Mr Brown will ask some questions about social media and media in general and so on but, before we get to those, I invite the deputy convener to put a couple of questions.
Thank you for your report. I can see that you have sent somebody on a course on how to do lots of bar charts and pie charts. However, they were not all clear to me in setting out how you report data. I will go to the beginning of the report and to the high-level exhibit 6. Over the years 2022-23 and 2023-24, the data set that you used to produce the report covers 11 performance audits, which made 63 recommendations, and 235 annual audits, which made a total of 949 recommendations across the two years. Is that correct?
You are correct. For those in scope for this report, there were 11 performance audits. For the annual audits, it is not quite correct to add those two numbers together, because, in effect, that is just the number of public bodies that we audited in one year and then the number in the following year. You will see that there is a difference in the number of audit reports in each year, with one being 115 and the other 118. The number of public bodies changes at the margins each year, depending on whether new bodies have been created, and the committee is sighted on some of those. That is broadly reflective of the scope of what we are evaluating this year.
We are grateful for any feedback that the committee has on the report. We try to recognise that there is a range of audiences. We consider accessibility, and we look, where appropriate, to bring in tabular or graphical formats to convey information. We are grateful for any views that the committee has on how effective that is.
I was trying to get my head round the bigger picture. The report goes into quite granular detail on the different types of recommendations and the various stages that they are at, if you can work out what the dark blue and the light blue shading mean. However, once I have got over that, there is still a point that I am trying to get my head around. Taking the 2023-24 year, because it had the higher number of annual audits of public bodies, and the 459 recommendations that came out of all those reports, I simply want a top line. Of those 459, how many to date are fully implemented, how many are in progress and how many have not been touched at all? I could not find that information in any of the bar charts. If it is there, please point me to it.
Again, I will pass to Michelle Borland to set that out very precisely for the committee, because I think that it is captured in the report. It is structured around four or five main headlines, which are drawn directly from the code of audit practice that frames how an external audit of a public body in Scotland is undertaken. You will see recommendations on the financial statements, which are on anything that an external auditor has directly identified to support improvement in a public body. Then the report goes into what is referred to as the wider scope of public audit in Scotland, with recommendations on financial management, financial sustainability, use of resources and governance and leadership factors. Those are the headlines.
Broadly speaking, a public body will sometimes receive no recommendations from an auditor, as there is nothing to bring to its attention but, in other cases, there could be a double-digit number of recommendations. The average, give or take, is around four recommendations per public body, but perhaps it is a bit too crude to say that that is reflected across the piece. Michelle Borland can build on my opening remarks about how many recommendations have or have not been implemented.
11:30
I can appreciate why that is confusing, Mr Greene. It is important to be clear about which audits are in scope for different aspects of the report. The annual audits in scope for 2022-23 are where we can assess the implementation progress that has been made, because that assessment is undertaken during the 2023-24 audit. For the 2023-24 audit, we have not yet undertaken all the associated follow-up, because that is done during the 2024-25 audit. That is why you would not be able to find that information in the report.
For 2023-24 audits that are in scope, we are reporting on whether the recommendations have been accepted. For the 2022-23 audits, we are reporting on the implementation status. In next year’s report, we will be able to report on the progress made on the recommendations in 2023-24, because that is followed up in 2024-25. I appreciate that there are a lot of financial years in there, but the reason is that the follow-up happens over a number of years.
As a member of the Public Audit Committee, I am looking for trends and patterns. There is a broadly similar number of bodies and recommendations, but is there a pattern of fewer or more recommendations being implemented in the early stage or in the long term, or in those that are just completely ignored? We make a big deal of the 93 per cent figure. Public bodies sit where you are sitting now and say, “Yes, we accept the recommendations of the report,” but those are just words. It is about how many of those actually translate into action. We have a bigger remit. You have no statutory duty to follow up on the recommendations or any locus in that respect, but we do, so that is the sort of data that I need to see the direction of travel.
I absolutely recognise that. Michelle Borland can come back in and Mark MacPherson might want to say a word or two about this as well.
You are correct that we have no statutory duty. I think that I have said a number of times in recent weeks to the committee that it is for public bodies to either accept or reject an audit recommendation and for them to follow up on that. We value the role that the Public Audit Committee plays in exploring whether a recommendation has or has not been accepted or implemented. However, we think that we have a locus in that, too, to explore through follow-up work, annual audits and our programme of follow-up of performance audits—and to an extent in the monitoring report—whether the original audit recommendations have or have not made a difference.
As we refer to in the report, we keep our methodology under review and consider the vehicles that we use to explore follow-up and progress and audit impact. As Michelle Borland mentioned, we refer to the further thinking that we will do following the evaluation of the “Public audit in Scotland” report later this year.
I will bring in Michelle Borland to say more on that point.
I will come back on the point about trends, because we are interested in that as well. As Stephen Boyle mentioned, our approach is evolving and we are continually improving it. We implemented the framework in our annual audit teams, for our annual audit work in 2022-23, so we do not have trend data yet but we will have that next year.
We have been working with the impact framework within performance audit for a little bit longer. That is why you can see some trend data on performance audit in this report. We have more years of data for performance audit where we have done the impact work. Through this year’s audit work, 2024-25, we will be looking at the progress with recommendations made in the 2023-24 audit. We will be able to provide better trend data to the committee on that in next year’s report.
That is helpful—thank you. On performance audits, in paragraph 36, you state:
“We do not currently have systems in place to follow-up recommendations made in national reports to all local authorities or multiple bodies.”
The term “follow-up” is a bit vague. What do you mean by that?
Effectively, we mean whether the recommendation was implemented. We go through a process to explore evidence. That involves referring back to the original recommendation and considering whether it was accepted and implemented and what difference it has made. That involves going back a certain time after the original report to see whether it was effective and implemented.
But you cannot do that at the moment.
We can, actually. Mark MacPherson, who leads our programme of performance audit development, can say where we have got to and set out some of our additional thinking.
Obviously, it is a resource challenge to go back and check progress in every public body in Scotland on a range of reports. We are trying to find the right balance in how we get that information. We talk about the fact that, after publication of a report, we now engage more readily with audited bodies to determine when they will implement recommendations or whether they are not able to implement the recommendation for some reason.
It definitely requires a lot more thought to get that from multiple bodies. With a single-body audit, that is relatively straightforward. With a report in which most of the recommendations are to an individual body or to a small number of bodies—for colleges, that is the Scottish Government, the Scottish Funding Council and maybe the college sector as a whole—we can do it. We have always had an approach to assessing progress on recommendations but, in the past couple of years, we have tried to make that more systematic and consistent. However, we recognise that we still have work to do to capture information where reports relate to many bodies across the public sector.
That issue has made us quite thoughtful about making recommendations to multiple bodies. We have explored that topic with the committee in previous discussions on reports on our impact. I hope that, through our reporting over the past 18 months, you will have seen that we have moved away from making almost catch-all recommendations to a particular sector in a performance audit to being more precise and making SMART—specific, measurable, achievable, relevant and time-bound—recommendations, for the reasons that Mark MacPherson outlined.
Effective follow-up and getting evidence from across multiple bodies would be extremely time-consuming and laborious. Almost as an upstream alternative to that, the recommendations are focusing on a particular organisation rather than multiple ones. In some cases, we will still do it. In future reporting that we do on the NHS, we will likely still make recommendations to NHS boards, because that makes sense for them, as a homogeneous group. However, we are quite thoughtful about that and will deploy that approach only where it is appropriate to do so.
Do you think that Audit Scotland gives enough cognisance to some of the factors that may explain why audited bodies have been unable to implement recommendations? There are a number of external factors. Your recommendations talk a lot about financial sustainability and workforce issues, but there is also a wider regulatory and legislative environment that these bodies are working in, which is outside of their control. Do you think that your focus is too much on whether a body did or did not deliver on the recommendations and does not acknowledge that, even if they wanted to deliver on them, it might be impossible for them to do so?
I recognise the complexity of the issue. Many of our reports go into challenging territory, whether on fiscal sustainability, workforce reform or wider public service reform, rather than being issue specific. Our view is that that feels like the area in which public audit can best support assurance around public spending and public service delivery, and—this is at the heart of today’s report—make well-considered, effective recommendations that support improvement.
I will bring in colleagues in a moment, because a couple of case studies in the report illustrate the point about accepting more challenging recommendations that can have an impact. We did a report on the criminal courts backlog. There is no question but that it was a challenging recommendation about the need for a wider consideration of reform. Some of those recommendations have gone through public consultation and, ultimately, in the past few weeks, have been the subject of legislation that was considered by Parliament around digital evidence taking and so on.
I acknowledge that, even if our recommendations are accepted, they will not always be implemented in the desired timescale because they might be complex and consultation and engagement might be required with more than one body. However, I assure you that, when we make recommendations, we absolutely understand the complexity of the issue and the challenging circumstances that face public bodies.
Mark MacPherson can comment on the courts backlog recommendation and the impact that that audit work has had in particular. I will say that it is important that we reflect on the barriers to implementation for public bodies and recognise where it is challenging and, where progress is not being made, why that is. That can help us as an organisation to understand how to make recommendations in a better and more impactful way.
Stephen Boyle mentioned SMART recommendations, and it is important that our recommendations are realistic and achievable for public bodies to implement. The more that we reflect on the barriers and challenges, the more that we can craft recommendations that are meaningful and can be delivered by public bodies.
Since developing this work and looking at how we can improve, part of the guidance that we have developed for auditors has been that, where we make recommendations, we should consult public bodies on what those recommendations look like in light of the evidence that we find through the audit, and give them an opportunity to comment on them, so that we can shape recommendations that are realistic and achievable and that will add value and impact to what the bodies are doing. That consultation also gives us a sense of how long it might take bodies to make progress on the recommendations, so that we can put timescales in our reports, given our understanding of their ability to implement them.
All of that is key and is part of what auditors will reflect on post-publication. When we look at the pace of implementation, the barriers are as Stephen Boyle mentioned: capacity, workforce shortages, financial pressures and so on. We will take all of those things into account when we craft the recommendations, so that we can make sure that they are realistic.
I will bring in Keith Brown at this point.
I have only a couple of questions. Looking at the data that you provided for the validation that you might get from media reach—I think that that might be the way to describe it—do you see dangers in that? Some people say that there is a formulaic approach whereby reports are produced that are relatively straightforward and discuss the pros and cons of an issue, but contain a soundbite quote at the end that, if you publish it on the right day, is guaranteed to stir up a good amount of parliamentary discussion and media coverage. Is there a danger that you might be seen to be chasing headlines and adding to a preponderance of negative stories, given that the media will always prefer those to positive ones? Do you recognise that danger if you are looking to that metric for validation?
It is a point to be really careful about. We need to get that balance right, so that there is a reach not just to media but to the public and Parliament. We look to make our reports accessible, first to people in the Parliament, always initially through this committee, but also to the subject committees that we give evidence to, whether that concerns our finance reports or reports such as the recent one that we produced on the adult disability payment, which is likely to be of interest to the Social Justice and Social Security Committee.
Parliament will always be the primary audience for our work but we recognise that public bodies and the public are interested in how public services are performing, and we want to present that information in a considered and balanced way. That is informed by the consultation that we do on our work programme—I consult this committee, and the committee decides thereafter whether to go out to the rest of the Parliament. We ask whether the topics that the Accounts Commission and I propose to audit are the right ones or whether there are other areas that we should be thinking about, and we look to capture that in the reporting that comes from the audits. There will always be a question about what happens next. What feels like the more important question is whether the audit recommendations made a difference, and we should approach that question in a balanced way.
The last point to make is that, when we produce a report that validates how a public service is performing, it does not always get the traction that a more critical report gets. I have referenced this before, but, on the day that I presented the report on the Scottish National Investment Bank, the deputy convener challenged me about the positive tone of the report. It was true to say that it was positive—it was an audit report that said that a public body had largely done what it was set up to do.
Underpinning all of that is that our reports are evidence based and our quality framework is a robust process that follows auditing standards. We are subject to review by the Institute of Chartered Accountants in England and Wales—it currently provides that service. It reviews our work and goes through our files to confirm that we have the evidence to support the conclusions and judgments that we make publicly and, most importantly, to the Parliament.
11:45
I remember that, 20-odd years ago, Audit Scotland did a best-value report on local authorities, in which the council of which I was leader—Clackmannanshire Council—came top. We never heard any comment on that apart from the comments of Clackmannanshire Council and myself, who shouted about it for years afterwards. However, that kind of positive evidence is important, not just for its own sake or because it might make people feel good, but because it demonstrates what is being done right and can be used as an example to help others to improve.
Sometimes there is a lack of context. When an area of public life in Scotland is being looked at, comparisons with what is happening in similar areas in England and Wales—whether things are better or worse—are meaningful to people because there is a relatively similar financial environment in all three areas and people can see whether, for example, Brexit is impacting on an area. However, instead of that, we get told that, for example, there is underfunding, which is a loaded term that relies on a value judgment. Surely, there must be a case for saying that there are times when, if something has been done well, we should broadcast that. Similarly, if something has been done badly—especially in relation to other parts of the UK where it has been done better, given the similar environment—that would be meaningful information for the public.
I come back to the public because, by talking about your social media reach and your public relations, you are recognising that the public are interested in these issues, which might be expressed through the Parliament. However, by and large, beyond the headlines that are generated, how much does your work register with the public? I hope that you can make sense of that question.
Understanding the reach beyond the Parliament is challenging. As you will see in our report, we used to use downloads from our website as a proxy indication of engagement, but with changes in technology—bots, AI and so on—that has now become a bit redundant.
On the issue of good-news stories, I think that we are at a good point. The Accounts Commission and I are considering the next cycle of the work that we will ask auditors to do. As you know from your local authority background, the Accounts Commission has a cyclical approach. It will produce what is known as a best-value assurance report on every local authority over the course of a five-year period. I take a slightly different approach, as I produce more sectoral reports, such as the college one that you referenced earlier, or ones on the NHS.
The work programme is the real driver in this area. We look to select audit topics that might be challenging or might be new—we try to strike a balance in that regard. The SNIB report is a good example of our approach. When we started the audit, I had no preconceptions about what that might produce, but it was good to see a validation of the bank as a vital and significant funding element in public service. At other times, we will produce a report that has more recommendations for improvement. We are always cognisant of the fiscal challenges, and we want our audit work, rather than drawing attention to issues in isolation, to make recommendations about how some of the fiscal challenges might be addressed and to stress the importance of public service reform.
On that last point, we were pleased to see the development of clearer strategies and plans around public service reform. Similarly, Audit Scotland has been making recommendations about the importance of medium-term financial planning for many years, so it is good to see the production of an annual medium-term financial strategy being integrated into the calendar of the Government’s approach. Those are both good examples of audit impact.
I hope that I can reassure you that we are always aware of the importance of balance and clarity, and that the way in which we report our work is evidence based.
I am delighted to hear what you say about SNIB, having had Cabinet responsibility for establishing it. Off the top of your head, could you mention one or two other examples where one of your reports on an area in which you have found good practice or excellent work has generated any kind of media response?
The Queensferry crossing is a high-profile example—
Another one for which I was responsible—thank you for that.
You are on a roll.
I am ticking all the boxes.
I do not want to repeat myself, but the public service reform strategy, together with the fiscal sustainability delivery plan and medium-term financial strategy, are important evidence of audit impact, and I will add another example. Through our audit of the Scottish Government, we have drawn attention a number of times to how the national performance framework and the national outcomes are working. Those were clear ambitions, but, through our audit of them, we saw that they were not underpinned by good evidence or metrics. Thereafter, the Deputy First Minister signalled a pause and embarked on a stocktaking exercise. That is good evidence of the audit process delivering clearer outcomes that can support the performance of public services and demonstrate how public money is being spent.
Graham Simpson has indicated that he does not have any questions. Colin Beattie, do you have any questions?
I am conscious of time, convener, but I have a couple of quick ones.
Exhibit 11 shows the progress on the implementation of the 490 recommendations from the 115 audit reports. It states that the status of 10 recommendations is unknown. Why is that?
That is related to how we pull the data through from the annual audit reports. Obviously, there are a lot of annual audit reports in scope. The data for this report comes from the assessment of recommendations at the back of each annual audit report, and this is an area for improvement that we have identified in the report. Sometimes the assessment that we have made does not draw a clear conclusion about progress on the recommendation, so we are not able to identify exactly what progress has been made and classify it in one of the four categories—not implemented; work in progress; implemented; or superseded.
One of the things that we have identified in the “Next steps” section of the report is a need to improve the consistency with which we assess progress against recommendations within the annual audit, so that we do not have 10 of those recommendations where the status is unknown.
Thank you. I will leave it there, convener.
On that note, I draw this session to a close. I thank Michelle Borland, Mark MacPherson and Stephen Boyle for their evidence on that report. It is a theme that we will return to, and I am sure it will be part of our deliberations when we consider our legacy report for the next session of Parliament.
As agreed earlier by the committee, I will now move us into private session.
11:52
Meeting continued in private until 12:05.