Item 2 is the Auditor General's report "Scottish Executive: supporting new initiatives", which is part of the "How government works" series. Caroline Gardner will brief us on it.
In November 2002, the Auditor General published the report "How government works in Scotland", which was essentially a reference document that explained the organisation of government in Scotland, the responsibilities of public servants and how they are held accountable, including the role of public audit in all that. We are following it up with a series of reports that comment in detail on aspects of the public sector's business that cut across individual organisations' responsibilities.
How did Audit Scotland decide which areas it would examine?
Do you mean the initiatives that we considered?
No. You said that you considered six areas. I was struck that those areas—and therefore the emphasis of the report—are geared towards the processes and internal arrangements of initiatives rather than towards the experiences of those on the receiving end and towards the impact of the investment. Why are you looking at one side of the coin?
You are right: we have focused on process. That is partly because, at this stage, it is not possible to consider the impact of each initiative. Most of them started in 2004 and will need to run for longer before we can consider the outcomes. The Executive has to have these process elements in place at the outset if the initiatives are to work through to outcomes.
I appreciate that answer, but if anything it reinforces my view that the report is just on one side of the coin. I am pleased that Audit Scotland is paying ever greater attention to wide-ranging aspects of how government works, but I have to be honest and say that I am very disappointed that the report does not address whether substantial areas of investment are joining up at the front end.
I recognise your point. We are looking at this piece of work from one end of the telescope, if you like. There is also a parallel piece of work—a joint study in which the Auditor General and the Accounts Commission will look at community planning. One of the issues that that study is examining is how easy it is for local authorities and their partners, statutory and otherwise, to join up the range of priorities and the funding that is available for them, to come up with a coherent set of strategies for what they are trying to achieve, and to deliver that in ways that produce not only savings and efficiencies but outcomes for the communities that they serve. This piece of work is one cut across that issue; the community planning study will look at the much bigger area and will, I hope, join some of those pieces together.
I note that, too. However, community planning is just one part of the jigsaw. In previous meetings, the committee discussed the concern that community planning too is getting lost in internal process and organisation rather than actually achieving its stated aims.
The point that we try to make in the report—although we are clearly not doing it very well at the moment—is that to demonstrate whether results are being achieved, the Executive needs to be clear at the outset what the intended outcomes are.
Okay?
I am still a bit confused. There were two different issues there. One was whether objectives were clear. I absolutely agree that unless objectives are clear, we will not be able to establish whether they are being achieved.
I think that that has now been registered.
Sorry.
I want to pursue the issue of clear and consistent monitoring standards. It sounds great in theory, but can all projects be judged by the same standards? What common criteria can be used to judge radically different projects and yet ensure that they have the flexibility they need to accomplish their aims?
You are quite right to identify the tremendous range in the size of the funding that is available and what it is intended to achieve. We strongly recommend a proportionate approach.
I understand why the very high-spending projects get more attention, but does that represent a problem? For example, exhibit 1 in the overview document demonstrates that most problems with oversight, monitoring and evaluation are experienced with projects that have lower levels of funding. We are not talking about a random sample. If one assumes that there are more medium-value projects than high-value projects and more low-value projects than medium-value projects, the total cost of the low-value programmes could be quite high, but there is comparatively little oversight of them. The two highest-value projects, which are worth £318 million and £127 million respectively, lie way ahead of the rest. There is a great bunching at the lower-value end. Surely that is where there should be much closer monitoring. From exhibit 1, oversight of such projects appears to be quite low.
There are two points to make on that. The first is that the size of the fund is not the only factor that affects the level of risk associated with it. Examples of other relevant factors are the number of partners involved, the size of the partner bodies and their ability to put good governance in place. Secondly, in doing our work, we were concerned to discover that good practice in one part of the Executive is not always picked up elsewhere in the organisation. The committee might want to follow that up with the Executive.
Oh yes—we certainly want to do that.
I can give you an indication of that, but it would be helpful if you could give me a moment to find the information.
No problem.
Thanks very much.
We can come back to that answer. In the meantime, Eleanor Scott has a question.
Can I follow up on Andrew Welsh's question before I ask the question that I wanted to ask?
Certainly—go ahead.
Is it possible to measure success against objectives, or are targets necessary? Is measuring achievement against an objective just too nebulous? Some of the projects that you mention in exhibit 8, for which the measures of success were not clearly defined, might have been amenable to targets. You could have put a target on the business broadband incentive, for example, although that might have been harder to do with some of the other projects. Do we need more targets?
We do not necessarily need more targets. This goes to the heart of the question Susan Deacon asked, which was about how we can know whether we have achieved our objectives. First, we need to be clear about what we are trying to achieve. In some cases, that will translate directly into clear targets but, in others—with community regeneration, for example—the process is more complex and there is a range of ways of measuring achievement. We want a more structured approach to be adopted—we want the Executive to consider what indicators would tell it whether it was meeting an objective and whether they could be turned into targets that could be built into the agreements that are made with the delivery partners that receive the funds. We do not advocate a one-size-fits-all approach, but there is room for improvement in the management of the funds generally.
Can I ask my other question?
Certainly.
My understanding of an initiative is that someone has a big idea and a pot of money that will last for three years and they put the idea into practice. I have issues with that approach, but that is beside the point. Some of the projects that you mention do not seem to fit that model. An example of that is the rural stewardship scheme. I thought it slightly odd that you included as an initiative something that appears in the Scottish Executive Environment and Rural Affairs Department's budget. That scheme goes on all the time, not just for a particular period. It is funded by modulated money that comes from pillar 2 of the common agricultural policy. It seems not quite to sit with the other initiatives that are mentioned. There may be other examples of that that I do not know about, but I happen to know about that one because I was on the Environment and Rural Development Committee for a while.
We identified as initiatives projects that have pots of money that are outside organisations' mainstream funding and for which they bid. Some projects are short or time-limited and have particular issues associated with them. Some of them, such as the scheme you identified, continue for longer periods. We were interested in how the funds that are available are identified and allocated and how their outcomes are monitored and evaluated. In that sense, we considered a mixed bag.
So funding for some of the initiatives could come from different departments' budget lines?
That is right.
We are not talking about a separate initiative budget?
Funding for all the initiatives will have been included in the budget act, but will not have been earmarked for initiatives at the beginning of the financial allocation process.
One of my concerns about such initiatives is the way in which they move from being initiatives to becoming practice, or not. I suspect, from your answer to Susan Deacon's question, that it is too early to discuss how these projects will progress. I will therefore leave that matter hanging for now.
The Executive has a range of reasons for funding initiatives in this way rather than through core funding, and those reasons vary from initiative to initiative. Two concerns that delivery partners raised were the amount of time and effort that is required of them when they bid for relatively small amounts of money, and the impact on services as they head towards the end of time-limited funding, when they have to consider whether they can and should continue to fund the work as the allocated money winds down. The latter concern is especially relevant in the context of community planning, in which people are increasingly trying to look from the ground up at what they want to deliver, rather than doing the Humpty Dumpty thing of trying to put together funding from a range of different sources. In the case of initiatives, that funding is often ring fenced and difficult to use in the flexible ways that might make more sense across the piece.
Initiatives can have benefits. In developing new ideas and new ways of working across the piece, they can be good at highlighting issues that have previously lost out. However, how they fit into the wider plan and whether the decision is made to continue with them depends, in large part, on whether they have been of benefit. Our judgment on that will come with time.
I am concerned that the allocation of funding to organisations is not always as open and transparent as it should be. Audit Scotland talks only about a clear and consistent approach; you make no comment about transparency. Is that deliberate, or is it something that you will revisit?
It is not deliberate; it is simply that we were looking at the variation in how funds are allocated—whether we start with an overall fund that is distributed or with several bids that add up to a total. The transparency of that is something we should keep in mind for future work with the Executive, especially in the context of community planning and the growing importance of this sort of funding.
Certainly for me, that transparency is long overdue. Elected politicians are unable to find out whether there is a structure to the allocation and whether it is consistent in addressing local needs. That is where the issue of the wide variation in the allocation of funding comes in.
I do not think that it does, but it takes us back to the importance of community planning.
Do you think that it would be more helpful to delivery partners if there were a standardised bidding process and audit process?
The range and size of the initiatives that we are looking at could mean that in some cases that might be more of a burden, rather than less, but there is certainly scope for spreading good practice and ensuring that the guidance available to Scottish Executive staff is consistent and consistently used.
Have you had any indication as to when the Scottish Executive will respond to the report?
In the main, the Executive does not respond formally to us but responds instead to any inquiry by the committee, so there is no date by which we would expect a response. Unless the committee holds an inquiry, the report is part of the continuing dialogue that we have with the Executive.
Is it an agreed report?
Yes.
I would just like to ask for clarification about something I may have missed.
Almost all of them do. We have come at it from two directions. We chose a sample of 20 of the 74 initiatives announced during 2004 and got a range from large ones through to small ones. We also surveyed a sample of more than 70 delivery partners. We asked them about their experiences on the basis of the good practice criteria that we have identified and about what has worked and what causes them difficulties.
The next thing I want to say is more of a comment than a question. You can probably tell from my colleagues' questions that our concerns are about the community impact of initiatives. You have talked about balance, distortion and identifying need, but I want to put on record the fact that I often get feedback from smaller and medium-sized organisations in my constituency to the effect that, for the amount of money involved, there is an awful lot of staff input and a lot of distortion of what they see as their core, mainstream business. I take on board your comments about the community planning report that is under way. It might be worth coming back to us to say whether there is anything beyond that that you think you could do to pick up on that side of the initiatives.
The study looks first at the way in which consultation happened, measured against the good practice principles that we identified. That was picked up again in the survey of delivery organisations and partners, which were asked how well they felt that they had been consulted on what the scheme was intended to achieve and on the way in which that was being done.
Do you feel that, even with those limitations, the consultations were geared towards the delivery partners, even though the partners did not get extra time to ask their own communities or to consult more widely?
It is difficult to generalise. There were some good examples, but there were also examples in which, simply because of the limited time that was available, it would have been very hard for partners to enter into a wider discussion about how various things fitted with what they were trying to do locally. That variation is one of the key themes that we have identified. There is room for better spreading of good practice within the Executive to ensure that, within the bounds of the need to keep government working, there is good local consultation and that as much as possible is done to tie the national priorities that the initiatives are intended to achieve into the local circumstances and priorities of communities.
Before I invite a further question from Andrew Welsh, I wish to develop a theme that a number of members have touched on. In paragraph 49 of your report, you say:
I do not think that it is possible to identify such a trend. I have said a couple of times that one of the striking things is the variation. It is not routinely the case that the larger initiatives have an agreement but the smaller ones do not. It is not the case that all the initiatives that come from one particular department or area in the Executive will tend to have outcome measures in place. There is simply a lot of variation.
I do not sense from colleagues that our concern is about a waste of money, in the sense of money being spent inappropriately; it seems to be more a question of value for money, what we are achieving, whether it is working and whether we are achieving our goals.
I want to follow through the idea of bunching. I realise that I have asked you a complicated question on that.
I have a note of it.
That is fine. I also realise that you are dealing with a complex situation and a range of organisations. If my arithmetic is right, a sample of 20 initiatives out of 74 is 27 per cent. How typical is that sample? Can we reasonably assume that those 20 projects are representative of the other 73 per cent and that the same findings would apply to the other projects?
We selected the sample of 20 not at random, but to reflect the types of initiative that are around and to get a distribution among different Executive departments. Because of that variation, it is probably not fair to say that if we multiply that 27 per cent by four, that is what the whole picture will look like. The most important finding is the degree of variation, which is explained neither by the size and complexity of the funds nor by the outcomes. By spreading good practice from the outset, the Executive might have scope to be clearer about what it is achieving with a significant tranche of money.
Thank you for that explanation. Half of the initiatives complied with most aspects of good practice; therefore, half did not. How badly out was that other half? Is that a major cause for concern? Why did the non-compliance come about? Do you have any suggestions about that?
As the convener said, the shortcomings are to do with not achieving best practice and not always being able to demonstrate what outcomes are being achieved in areas of expenditure and policy that are always complex.
So it is a matter of education and encouraging the spread of best practice.
Yes.
Thank you.
Susan, I believe that you had a small question that you wanted to ask.
I wanted to clarify a first-principles point about the scope of the report. Forgive me if this has already been covered and I have not appreciated it.
As always with these pieces of work, we are trying to come up with a specification that is tailored, does not take too long to do and is clearly bounded, but which has useful findings that can be applied more widely. This is the first follow-up "How government works" report that we have done and we thought that it would make sense to focus on live policy areas because the history of the way in which they were announced is still clear and readily available. We thought that we could use them as a baseline for examining the arrangements that were put in place and could determine what outcomes have been achieved at a future point.
One of the advantages of what we call baseline reports is that they give us an opportunity to identify best practice and encourage that best practice to be shared across the Executive.
I thank Caroline Gardner for briefing us on that report, in which members showed a great deal of interest.
Meeting suspended until 11:17 and thereafter continued in private until 11:47.