Under the next item on our agenda, we will take evidence from the permanent secretary. We will take a minute or so to allow those who wish to leave to do so. I am sure that we would not want Mr Elvidge to be trampled in the rush of the press.
This is like putting the support act on after the main band.
This session is to allow us to take evidence from Mr Elvidge on performance monitoring in the Executive. We wrote to Mr Elvidge after the interesting session at which we took evidence from Professor Michael Barber, who, at that time, was the head of the Prime Minister's delivery unit. Mr Elvidge's response has been circulated to members and we have taken him up on his offer to come along and talk to us. The committee has also signalled that it wants to speak to Mr Elvidge about resource allocations under the previous spending review.
Those commitments are selected—
I am sorry, I have not given you the opportunity to make your opening statement.
That is okay. In these circumstances, I do not need an opportunity to make an opening statement. My evidence is a response to a series of questions that you asked; I am not seeking to introduce anything into the discussion that does not flow directly from your questions. I am perfectly happy to plunge straight into the pursuit of your questions.
You said that there are areas that need more intensive support. Can you give us an example of a policy commitment that has had that sort of treatment?
Yes. I am sure that, without looking at my lists, I can pick a few out of my memory. The commitment on recycling waste is a good example. On the one hand, we are aiming at levels of achievement that are paralleled in other countries. On the other hand, we are aiming at a rate of movement from where we started towards those levels of achievement that is enormously challenging and, on the face of it, requires interventions that are not currently understood. Once we get past what we might describe as the easy bit of encouraging recycling, there are issues about changing public behaviours that involve interventions by Government for which it is not easy to find parallels. Therefore, it is reasonable to assume that there will have to be a degree of innovation, which might require support from outside a small team.
The example of waste recycling is interesting because, in that case, the delivery is in the hands of the local authorities, not the Scottish Executive. Some local authorities are performing up to target, whereas others, such as my local authority, are most certainly not. Your requirement for local authorities to sign up to your priorities is quite challenging.
Indeed. It also raises questions about the nature of partnership working between the Executive and local authorities on delivering an objective in which working practices have to be different from those that apply with regard to the generality of local government business. That is part of the distinctive challenge of delivering that commitment.
I have some questions about the delivery unit at Westminster. As the convener said, Professor Barber gave us an extremely interesting presentation in May, in which he described in some detail the on-going process of achieving a target. For any specific priority, one maps a trajectory that ends with the target being achieved; I believe that Professor Barber said that that trajectory was measured monthly.
It was certainly achieved in a different way. What the Scottish and English approaches have in common is close and regular performance monitoring. Data flow to the delivery unit in Whitehall, and to us, no less frequently than monthly, and they are enormously detailed about the nature of delivery. In that sense, the processes are similar.
But in that case the department does not have any overall responsibility; it is responsible only for delivering its own targets. If that happens on a departmental basis, some of the raft of cross-cutting priorities and measures in the partnership agreement may not be managed in a detailed way.
That is a question of the way in which information is shared in the Government. If the Health Department was monitoring performance but was not obliged to tell anyone about the outcome, we would be running the risk that you have identified. However, the processes of government are such that the outcome of the department's performance monitoring is widely shared around the organisation.
Does that sharing take place at a political level or at an official level?
It takes place at both levels.
Is it possible to make that argument the other way round? If you are trying to measure performance, in whatever way and however rudimentarily, across the full range of the partnership commitments, does not that go further down the route of introducing bean counters or button counters? I cannot remember whether Mr McCabe referred to bean or button counters.
Both, I think. He seems to be against all of them.
Is that not a more elaborate process, involving unnecessary checking, than what Professor Barber described as the highly concentrated and focused approach of the United Kingdom Government towards delivering targets in key service delivery areas?
I will try to reply to that question with as much fact and as little opinion as I can. When I was reflecting on the evidence that the committee had received, it occurred to me that Professor Barber had come and told you about one bit of what the UK Government does in monitoring delivery; it was not his responsibility to come and tell you about everything that the UK Government does in monitoring delivery. Every department of the UK Government contains a substantial body of people who are concerned with the tracking of performance across the board. The Prime Minister's delivery unit simply sits above that process and selects out of it a small number of things that it wants to monitor with particular intensity. I do not think that the delivery unit has been established as a substitute for the kind of comprehensive analysis of performance that you would find in the Executive or in the Department of Health or any other Whitehall department; the unit is an addition to it.
You are saying that the Prime Minister's delivery unit does what you do. However, there is no partnership agreement at Westminster. As it has been described to us, the focus in the Executive is on the partnership agreement. In the past, the committee has expressed the view that there is perhaps a need for greater priority to be placed on specific targets and outcomes rather than on the full range of targets that the Executive has identified both in budgetary documents and in the partnership agreement. Would you concede that the regime of checking whether something is geared towards the partnership agreement across the board is perhaps unnecessarily elaborate and that there is a danger of simply testing compliance with partnership commitments rather than actual delivery of services?
I am sure that the proper civil servant's answer is that that is a legitimate area for debate. Ultimately, it is for ministers to decide how they want to exercise political accountability to the electorate. They have chosen to exercise political accountability through the partnership agreement and its 450 individual commitments. Given the fact that that is the choice that ministers have made, it is absolutely clear to me that the right thing for the organisation to do is to have a monitoring framework that enables accountability to be exercised in that way.
That is a very judicious answer. However, it is a difficult task to monitor delivery over the 450 commitments. Would it not be easier if there was a clearer prioritisation of monitoring delivery on targets that particularly matter, rather than the broad-brush approach?
Would it be easier?
Could it be done more effectively? That might be a more concrete way of asking the question.
I would start from the proposition that one must not fly in the face of reality. If expectations of Government are broad, and if political commitments have been given in a particular way, the monitoring processes of Government have to reflect the breadth of accountability that must be discharged. We need to remember that this is not simply a product of the partnership agreement; ministers can be held to account by the Parliament in any way that the Parliament chooses on any subject that the Parliament chooses. It is part of our responsibility to enable ministers to respond to that legitimate accountability.
I want to take you to one of the hot spots and follow up the point that Elaine Murray made about your statistics on health waiting times and what that says about performance monitoring in the Executive.
If I had known I was going to be giving evidence to you in a week with two by-elections, I might have been a little bit more cautious about including health as one of the examples in my evidence, because I very much do not want to be drawn into an active political debate about which particular measure is more important than which other particular measure in the complex field of health.
Obviously performance monitoring highlights the deterioration in the median wait and the increase in waiting times for day cases. I take from your answer that effort in another direction may have skewed the figures. The median wait for out-patients tells much the same story: waiting times have gone from 46 days to 62 days. I find it a bit difficult to understand how the shifting of effort has led to that deterioration. What intrigues me about this is the availability of the performance monitoring information that clearly shows deterioration in median waiting times for patients. There does not seem to be any obvious intervention to do something about that.
I do not think that I would draw that conclusion. There are lots of interventions. Few things are more actively managed than waiting list performance in the NHS.
You consider that the indicators I have cited—and which, to be fair, you cited in your own evidence—are fundamental indicators of performance. We have just had a discussion with Mr McCabe about trying to avoid too much bean counting, or button counting, or both. We have those indicators and they show us the deteriorating position, and I am just trying to work out how that percolates through the system to lead to changes of emphasis, changes of policy or changes of resource allocation to tackle what seems to be a pretty big problem.
For the sake of clarity, I am not entering into the debate about which is the most important measure of waiting times. There is a well-rehearsed effect—that as one concentrates on the longest waiting times, one automatically tends to push up median waiting times. There is a perfectly reasonable debate about which of those two numbers is the better measure of the effectiveness of a health service. What is difficult is to argue that you can improve both of them simultaneously. I am saying that waiting times are a generally accepted measure of effectiveness, but the particular slice that one takes through waiting times is a subject of active debate.
Forgive me for simply taking the lead from your evidence, Mr Elvidge.
You stated in your memo:
I can do my best to describe what is quite a complex document. The First Minister receives the comprehensive report in the same form as the Cabinet receives it. He also has access, as do other ministers, to the more regular summary reports of overall performance that the management group receives. As in any organisation, there is a system of exception reporting if there are particular subjects of concern.
In the presentation that we had from Professor Michael Barber, it was interesting to see him tabling the characteristics of the system whereby there are mechanisms to hold ministers and their civil servants to a single and somewhat narrower set of objectives. There is then a mechanism in place to make an objective assessment of how well people are doing and hence to produce a league table that is available to the Prime Minister, and a devil-take-the-hindmost approach to management. Do you have any plans to move in that direction here in Scotland?
That takes us back to our earlier territory. One could operate a system of that kind only if ministers had made a prior decision that there was a smaller group of targets or objectives that were overriding priorities for the organisation. In the absence of that, one cannot simply transpose the delivery unit approach. As I have no reason to suppose that ministers will change their view of political accountability to the electorate, I have no plans to build any systems that assume that they will.
Therefore, we continue producing a report containing 450 or so measures, with the progress against them, and that is the document that is put before the Cabinet for the First Minister and his colleagues to assimilate and respond to.
Yes. Perhaps I did not explain clearly enough that, although the full data are there for ministers, they are summarised in various ways to help them to see the key elements. It would not be very responsible behaviour on our part simply to bang down the detail of 450 commitments and say, "There you are. Make what you will of that."
Does that summarisation not take us closer to having a subset of objectives that can be managed using the Barber method?
Yes and no. It takes us closer to knowing what subset of objectives might be proving particularly challenging to deliver at a particular point in time. The delivery sub-group of the management group uses a mechanism that is specifically targeted on those commitments where there appears to be a delivery issue to resolve. That is different from the approach of the United Kingdom Government's delivery unit, with its fixed list of priorities, which are there irrespective of whether there are concerns about delivery.
In a situation where there is a focus on challenges within a given portfolio, along with scrutiny of why they have arisen and an assessment of how to proceed, is there a function whereby both the minister responsible and his civil servants are answerable to the Cabinet on that?
There is a parallel process. The time sequence does not always work perfectly but, where possible, the first action is that the team responsible is held accountable to me and to my management group colleagues on the delivery sub-group, with emphasis being put on whether, in a problem-solving sense, we can assist the team with the delivery challenge that it faces. I would describe that as a support activity, rather than as an accountability function. My interest is in getting the delivery to happen, rather than in criticising people about any difficulties that they might be having in achieving that delivery.
Is there a mechanism whereby the First Minister, you and Mr McCabe can communicate to individual ministers in charge of portfolios where they sit on a spectrum between highly likely to achieve their objectives and very unlikely to achieve their objectives? Can you position them in that way and thereby give them an awareness of the impression that has been created as to their management of their portfolios?
That kind of assessment is fundamental to the material that is reported to Cabinet. Each commitment is assessed in terms of the risk or absence of risk to delivery. In addition, I would expect individual departmental management teams to share with ministers the flow of management information that each will have about performance across the objectives for its portfolio.
If you were asked to define the three or four key skills for the head of the Health Department, what would you say they were?
That is a fast ball. I would say that the key skills are a capacity for strategic management; the ability to organise and manage a large volume of human and financial resources; an understanding of the processes of delivery in a complex delivery organisation; and the ability to manage relationships in the context of government and the public sector.
So why has the post of director of delivery for the Health Department been advertised at a salary of £100,000 if that is the job of the head honcho?
It is a feature of all organisations that the head honcho cannot spend all of their time doing all the things that they need to do. Departments consist of support structures that allow the head of department to carry out the various aspects of their job. You might as well ask me why we employ some people to think about strategy in the health service if an ability to think about strategy is part of the skill set of the head of department. I suppose that the short answer is that we cannot expect one individual to do the work on their own.
Is the head of the Health Department the equivalent of John Birt?
No. I would not like to characterise the role of head of the Health Department by using John Birt as a description. The head of the Health Department is the accountable officer for the health budget and the senior manager of the department.
The fun is over. If we are to get a director of delivery, why should we start with the Health Department? Could that model be replicated in other departments?
The answer lies in the unique relationship that exists between the Executive and the NHS. No other part of the public sector is similar to the NHS in its core relationship with the Executive. That is to say that the NHS is directly accountable to the Executive; unlike local government, it does not have a separate democratic existence. The health service is unique in the scale of its spending—it is responsible for slightly more than a third of the Executive's total budget—and in its operational decentralisation and complexity, which make the tracking of delivery an enormously difficult business.
I would probably concede that point. My experience in the Health Department was that it requires something such as this post.
My comments are inevitably speculative, given that we are running ahead of the objective setting for the individual and that a sound principle of objective setting is that it should be done in discussion with the individual who will be subject to it. Generally, I would expect to select a limited number of the most challenging delivery issues in the health service and specify the difference that we would like to see the director of delivery make to performance against those delivery objectives. That comes very close to what Michael Barber is doing.
Is the post linked directly with the strategy and delivery unit? Is there an intrinsic link?
There is not an intrinsic link.
Should there be?
No, I do not think so. There should be a channel of discussion about delivery lessons that are being learned in the health service that might be of value elsewhere, but it is fundamental to the models that we use in the Executive that we try to keep responsibility for delivery decentralised in the organisation rather than pull it together. The delivery unit is a very slim structure, which is part flying squad to help teams that are in trouble, part sharer of best practice and part monitoring body. It is best to keep the delivery unit in its whole-Executive role and regard the director of delivery and their team in the health service as the largest example of decentralised delivery responsibility in the organisation.
There are 10 commandments but 450 commitments. How many of the commitments can you put under the focus at any one time? You obviously cannot cover all 450.
It depends what is meant by focus. Tracking them all simultaneously is not a challenge, but we can intervene in only a small number at any one time. As a general rule of thumb, it takes at least an hour of senior management time to have a sensible discussion about delivery problems on any individual commitment. An hour of senior management time—three or four senior managers would be involved—is a scarce commodity in any organisation.
What triggers that sort of intensive focus? Does it come from you, does it come from within the civil service or are most of the initiatives triggered by ministerial diktat?
It is a bit of all those things. There is a cyclical issue. Early in the four years of the session, one does not know how delivery on many of those things is going, so one cannot immediately say, "Let's focus our attention on the ones that are in trouble." At the outset of the four-year cycle, there are management decisions about where it might be useful to invest some time, where the risks look greatest. As we move through the cycle, the emphasis shifts to focusing on those things where the teams are identifying delivery problems. Overriding all of that, if ministers are particularly worried about something, we will spend some time looking closely at it.
There is a nice phrase that I got from a Canadian who worked for one of the legislative assemblies there: people in her position are "we be's"; "Whatever happens to politicians, we be here." You have seen several Administrations in Scotland and there are some fundamental underlying issues that we have to address, whether health, education—which I know is your background—or economic growth, on which there is broad consensus between the parties about where we want to be.
No, I do not think so. The process of narrowing down that I described would undoubtedly be a process of narrowing down that had political consent. There is absolutely no point in the organisation focusing on things that ministers do not believe are the right things to focus on. By definition, at any point in time the organisation is managing a number of things that do not appear in the 450 partnership agreement commitments. In particular, it is paying attention to some of the large, long-term outcomes that Mr Mather was talking about in the earlier discussion with the minister. I do not see that as a disconnect. As the minister said, those larger outcomes are often implicit in the more detailed objectives that ministers have set.
I am sure that that is the case. I was certainly not implying that there should be, or is, a separate agenda for civil servants. To cast the question in a different way, I suppose that the challenge to ministers is to identify their core priorities and what is most important for them to deliver in order for you to manage the resources towards that delivery process. That is how it seems to me. Do ministers take on board the fact that, to achieve some of the things that they deem most important, they must set clear priorities and pathways towards delivering those things, and is that then reflected in the way in which the strategy and delivery unit, or indeed the civil service as a whole, operates?
A process of discussing priorities is part of the day-to-day business of ministers. I would be surprised if that were not so in any political system. Does it feed through into what the strategy and delivery unit does? Yes, up to a point. It does not detract from the centrality of the partnership agreement commitments, but it might generate additional issues to which the strategy and delivery unit needs to pay attention. If that were the case, those additional demands on its efforts would arise from that kind of process of ministerial discussion of priorities.
I think that we have probably finished with performance management. Perhaps we can move on to allocations from the spending review and issues of efficiency. As accounting officer, how do you satisfy yourself that the level of efficiency that is being achieved or proposed for each department is realistic and acceptable?
It is tempting to say that I use my judgment and experience and look for as much evidence of deliverability as I can. In a perfect world, I would want to see a clear delivery plan that gives me complete confidence that particular sums of money will be delivered from a particular action. Unfortunately, life is not always that tidy and sometimes I have to work down through a hierarchy of assurances, taking lesser standards of evidence, which I supplement with judgment and sources of reassurance wherever I can find them, by asking questions such as "Has anyone else ever succeeded in doing that?"
In the spending review, how was it determined where the greater scope for efficiency savings was and what level of savings should be achieved?
Outside the Executive, that is done through a process of discussion across departments, drawing on individual departments' knowledge of the areas of government for which they are responsible. Inside the Executive, it is done through a process of discussion of relative challenges that takes place inside the management group. Although quite a lot of the Executive's efficiency savings are designed to be achieved through activities that are organisation wide, there is an issue about how the impact falls on individual departments, which we examine as closely as we can.
It has been highlighted to us that the Scottish Executive Environment and Rural Affairs Department and the Scottish Executive Enterprise, Transport and Lifelong Learning Department have made relatively limited contributions to efficiency savings. Why is that?
It is because of varying patterns of business. The crucial thing to understand about efficiency savings is that they do not take place in a static setting. That is why it is not possible to put a number on the scale of the efficiency savings challenge for the Executive in percentage terms or, realistically, in cash terms. One can identify what the minimum saving is if business is unchanged, but of course that is not the reality, because business changes all the time and additional pressures fall on the organisation. In areas of business in which the additional pressures are greatest, the net saving will be smaller than the average. In bits of the organisation in which people are able to manage an overall decline in activity in some way, there is greater scope for making efficiency savings.
I am intrigued by the fact that the smallest contributions come from those two departments. Will you say any more about why SEERAD appears to be unable to deliver efficiency savings when it has a substantial number of areas in which it could identify such savings?
I am not saying that; I am talking about the net position. SEERAD has one of the best examples of a single project saving money in a department. The changes to the administration of the common agricultural policy payments have released substantial savings by delivering the payments in a much more streamlined way. In part, those savings have been consumed by new activities in the department, the introduction of land management contracts being the primary example of that. Therefore, I would not be critical of my SEERAD colleagues and their commitment to achieving efficiency where they can. One needs to recognise that they are being asked to take on additional activity in a number of areas and that that changes the net picture.
The overall statistics seem to suggest that SEERAD and the Enterprise, Transport and Lifelong Learning Department are making the smallest contribution to efficiency savings, so I am not sure that I have quite got an answer to my question.
Yes, we are. It would be surprising if the pattern stayed uniform over time. In addition, we have set ourselves a savings target for this year that is greater than the savings target that the budget forced us into. That links slightly to Mr McAveety's point about supertankers—if one wants to turn a tanker, the sooner one starts turning the wheel the better. We have tried to pull forward some of the savings pressure to ensure that when we need to hit the peaks of those savings over the three years, we have the measures in place. We are certainly looking beyond 2008 at what we need to achieve.
I will pick up on one of the convener's points about the enterprise field. You explained the relationship between the Executive and the health service as a unique and direct relationship for the delivery of services, and you suggested that the closest comparator is the enterprise network. I am struck by the fact that the 230-page manual to which Mr McCabe referred earlier identified only £4 million of enterprise network savings. Off the top of my head, that is probably less than 1 per cent of the enterprise network's budget, whereas the health service savings are formidably higher. I am struck by the inconsistency between the impact of the two—or their targets. The convener raised a point about the Enterprise, Transport and Lifelong Learning Department's limited contribution to the process. Can you explain why the disparity to which I refer exists?
On the enterprise network, we must take into account the fact that, if we were not starting the clock on 1 April this year but looking back two further years, we would see a different picture of efficiency savings that have now been driven out of the network. Its argument is that so much cost has been taken out in previous years that the scope for additional efficiency savings is necessarily more limited. No one would thank me for accepting that anyone has reached the limits of the efficiency savings that they can achieve, so I am not saying that, but cutting in at a particular point in time is always an issue with an exercise of this kind, as it can distort comparisons between one organisation and another.
That is a reasonable point. I sought an assurance that you were doing exactly what you said you were doing in that area and were not just saying something like, "Well, that's fine, lads. Thanks very much. Let's move on." Given the debate about the outcomes and impact of public expenditure, it strikes me that there is still an opportunity for the Executive to make significant gains in the effectiveness of public expenditure. I hope that the process will accommodate that in due course.
I think that you can take it for granted that there is continued scrutiny of the scope for efficiency savings in all areas, including the one to which you referred.
I want to move on to an issue that I raised with Mr McCabe. It strikes me that there is an inconsistency between how efficiency savings are identified and utilised in local authorities and how that is done in other areas of the Executive's work. The Executive tells local authorities that it believes that they have the capacity to make a certain amount of savings, so the Executive will retain that amount of money. The Executive tells the authorities to make their efficiency savings within the settlement that they achieve. However, in other areas of government, the Executive says that it wants a certain amount of efficiency savings to be made, and those areas are then free to reinvest the savings as they see fit. What is the justification for that inconsistency in approach and for dealing with local authorities differently?
I am tempted to say that you are taking me into areas that Mr McCabe has dealt with comprehensively. It seems to me that ministers are saying to all parts of the public sector that they are making a judgment about the savings that they believe the public sector has the capacity to achieve and are setting budgets in the light of that judgment. Ministers then say to some parts of the public sector that, because of decisions about priorities, they want to create capacity for growth in spending in those parts, but they are not saying that universally. That is the broad nature of the process that has gone on. However, I am close to territory that I feel is exclusively Mr McCabe's and not mine.
What is the rationale for deciding who fits into which category?
The same rationale that lies behind any decision about prioritisation that ministers make, as they do constantly, applies in the resource allocation decisions that they make.
You were here earlier when we had a discussion with Mr McCabe about the information that is required to underpin public confidence in the efficiency exercise that the Executive is undertaking. In your capacity as accounting officer, with a particular responsibility in the relationship that exists between the Executive and Audit Scotland in the verification of Government spending, what further information do you think that the Executive must generate to meet the standards that Audit Scotland's letter of 10 August appears to demand?
I am increasingly getting a sense of pits being dug around me.
Not at all. It is a genuine search for an answer. I have no other motivation whatever.
I was not imputing any base motive; I was simply saying that I had a sense of pits being dug around me.
I feel that you are trying to make a distinction between the pursuit of value for money and the pursuit of the efficient government programme, as though value for money is somehow your responsibility and efficient government is not. Am I misinterpreting what you are saying?
No, you are not. However, I am saying that value for money, as a principle, will take you only so far into the detail of the issues that you are pursuing. The obligation on an accountable officer is to draw attention to those things that are demonstrably not value for money, not to provide a running commentary on the comparative value for money of everything that happens. I am sorry if such distinctions are unhelpful, but I think that the limitations of the accountable officer role are genuinely important. To say that a core responsibility of the accountable officer is to track the detail of every efficient government saving is not an interpretation that accountable officer responsibilities will bear.
Is it not part of the responsibility of the accountable officer to ensure that the baseline data are in place to allow an assessment to be made of the effectiveness of any efficient government programme?
It is the responsibility of an accountable officer to have whatever data he or she needs to satisfy himself or herself that there is not a demonstrable case of poor value for money—or, indeed, improper expenditure or the other things with which an accountable officer is concerned. It is not necessarily an accountable officer's responsibility to insist on particular standards of evidence, especially when—as your discussion with Mr McCabe and committee discussions have brought out—value for money is at the heart of the decision on how many data to collect before the cost of collecting the data outweighs the value of using them.
If baseline data are not in place to allow us to establish a starting point and we say that having a range of measures to prove that an achievement has been made is unjustifiable financially and in other resource terms, we are in danger of being unable to assess realistically a programme's effectiveness. Do you accept the need, which I take from Audit Scotland's letter, for a quantum improvement in the volume of information that is available before anyone can be confident that so-called efficiency savings have been achieved?
I recognise that at the heart of your discussion with Mr McCabe is the question of what an adequate standard of proof is. We have acknowledged that some additional data that neither we nor other organisations had a prior business need to collect will need to be collected in various ways to show delivery of efficiency savings. Therefore, some form of additional collection is built into the plans. Your disagreement with Mr McCabe seems to be about where precisely one draws the line in that process. I do not wish to position myself between you and Mr McCabe on that subject.
You laid out your information requirements to meet your responsibilities as accountable officer to protect the public purse and ensure value for money. Is what you ask for or deem necessary different from what Audit Scotland requires for its purposes, which are broadly similar?
Beyond a point, we are in the realm of judgment about evidence standards and the cost-versus-benefit balance about which we have talked. It is to be expected that different individuals and organisations will judge that balance differently, even though they may all be motivated by a concern for value for money. Therefore, it is not axiomatic that I, as accountable officer, and Audit Scotland, with its responsibilities, will automatically arrive at precisely the same view on how that balance should be struck.
Do you as permanent secretary have a responsibility in relation to communication of the Government's message? The Government might claim that £1.2 billion of efficiency savings will be made by 2007-08. As permanent secretary, do you have a responsibility to ensure that that statement is justifiable?
Yes. I have a responsibility to ensure that such statements are made in good faith and are supportable. That does not take one automatically to the magic answer to the question of what constitutes the right support.
In effect, then, there are two tests. One is to do with whether Audit Scotland has a view that the information can be signed off and the other is to do with whether you, as permanent secretary, believe that it is credible for the Government to say that it has made £1.2 billion in savings.
Yes. You are right to say that those are different roles. The second one is not part of the accountable officer role; it is part of my responsibility as permanent secretary to ensure that all the statements that ministers make in their ministerial capacity are made in good faith. You are right to say that that is a different dimension of my responsibilities.
Audit Scotland has said that financial information is not enough for its purposes of demonstrating that the financial savings and efficiency improvements have been realised, and that outturn measures for services are required.
Yes. Since efficiency is the measure of a relationship between costs and outputs, that must, in principle, be true. However, that does not take one easily away from the point in the Audit Scotland letter about the intrinsic difficulty of measuring many outputs in the public sector. Some outputs are easily measured but many others are not. One of the challenges is that a large chunk of the responsibilities of the Executive do not have easily measurable outputs. Again, I will end up having to make a judgment about what measurement I need.
On what you said about the role of the permanent secretary in relation to ministers' statements, do you think that it was wise for the Executive to state on 8 September that it will publish targets and figures only when it is convinced that they are robust and deliverable and then go on to publish the document that we have before us today?
I do not see any inconsistency between the statement and the action.
Are you the accountable officer or the accounting officer? They are slightly different things.
They are not different things. The Scotland Act 1998 uses the term "accountable officer", while the UK Government's framework uses the term "accounting officer"—same animal, different names.
My question builds on the perception that the Scottish Executive departments are not being asked to make the same level of efficiency savings as other agencies are. Since devolution, the Executive has grown. It is not surprising that it is bigger than the Scottish Office was, because of the support system that is required to cope with the fact that there are more ministers and so on. However, as accountable officer, how do you demonstrate that the operation of the civil service represents value for money?
That is a big challenge because of the particular difficulty of measuring outputs at the level of the Executive. As you know, the Executive is in almost no respect the deliverer of ultimate outputs. Therefore, in almost all respects, its role is intermediate. Those elements that we can measure seem to me to be an unsatisfactory proxy for the business of the organisation. I can easily enough talk about the number of pieces of legislation that we support ministers to take through the Parliament, and we commonly talk about the number of pieces of ministerial correspondence, the number of parliamentary questions, the number of requests under the Freedom of Information Act 2000 and so on. We can talk about all those matters, and I can draw some ratios. However, I do not think that that is the best way of getting at value for money, because it leaves too much of the organisation's real business unmeasured.
Is that not because the UK civil service delivers more front-line services?
Actually, I am not sure that that is the case. The benchmark data that I referred to cover the totality of civil service employees in Scotland. That includes the whole of the Scottish Prison Service, which accounts for a huge proportion of total civil service activity within the Executive. Allowing for that factor in assessing benchmarking perhaps counts in our favour rather than in the UK Government's favour.
There have been fairly significant real-terms increases in Executive spending over recent years. Do you have evidence that the monitorable outputs for that spending have grown in line with its rate of increase?
We have evidence on almost every output that can be measured. In the short term, do those outputs move in line with spending increases? No, that does not happen, either here or in other places. Has the expenditure-to-outputs ratio improved over that period? No, but that truth is generally demonstrable across the range of Government activities elsewhere. The more challenging question is whether any of that will happen over a long time. One can say only that it is too soon to tell.
I do not want to quote you inaccurately, but you said something to the effect that you did not want to suggest that there was a limit to savings or that we had reached such a point yet. However, in some of your areas of responsibility, there is a limit to the amount of savings that can be made without having to adjust services. For example, at some point, you must have to stop carrying out functions instead of simply making incremental savings.
That must be true in principle. All organisations have a point at which they shift from achieving efficiencies to cutting costs and reducing outputs accordingly. The managerial challenge is to find that transition point.
That is a fair point. Where, then, along the spectrum of efficiency are we?
That takes me back to what I said to Dr Murray about benchmarking. The benchmarking suggests that we are a reasonably long way along the spectrum of achieving efficiency in terms of that kind of comparison. What I mean by that is that shifting very many percentage points—or, to be accurate, tenths of percentage points—will prove to be challenging in relation to the delivery of genuine efficiency.
My question is on the challenge of continuing to squeeze out efficiencies. On the current plans for savings as a percentage of the departmental expenditure limit, it seems that they amount to 1.8 per cent in 2005-06, 2.9 per cent in 2006-07 and 4.4 per cent in 2007-08. However, because those figures are cumulative, achieving that total of 4.4 per cent amounts, on a year-by-year basis, to a 1.8 per cent cut in the first year, which declines to an additional 1.1 per cent cut in the second year and a further 1.5 per cent cut in the third year.
No. That takes me back to my earlier point. The figures that you used are minimum figures for the savings that have to be achieved, all of which are predicated on a standstill budget. The savings are measures of what would have to be delivered simply for the organisation to stand absolutely still. However, the reality is that neither activity nor some of our cost pressures will stand absolutely still. The actual scale of the efficiency savings that are delivered will be larger than those numbers. Because the uncertainties about costs and activity grow over time, the reality is likely to be that the scale of the challenge grows over time—notwithstanding the fact that those baseline figures give a different impression.
Fair enough. There has been discussion of, and reports to previous meetings of the committee about, the fact that ministers are considering taking the efficiency work up to 2010 and have already set their targets for that period. Given the pressures that you have talked about, what figures are being set, and if, as you say, the figure for 2007-08 is a minimum, how can the process of continuing the programme to 2010 work?
How can it work? To an extent, that is a question of keeping up the pressure on us as well as on other parts of the public sector. It is very difficult to know, at this point in time, what the component parts of those additional savings are going to be. In essence, it is a statement of political judgment that further savings of a particular magnitude must, in principle, be achievable. One needs time to work out how that is translated into a particular set of efficiency measures.
As you said earlier, the pattern might change. You talked about the pattern in relation to the enterprise networks. Do you see those efficiency savings being of the same magnitude as the 1.8 per cent saving that we saw for 2005-06?
The honest answer is that I have absolutely no idea. That is very much a political judgment about where ministers wish to set the scale of the challenge. I would be very surprised if the scale of the challenge were any less than it has been in the period up to 2008.
I would like to return to the baseline data issue, to see whether we can inject some additional value into the process. If we are to have genuine efficiency savings, we have to start off with all operational levels and the macro-entity under some level of statistical control, so that we can monitor what is going to be incremental by way of improvement over time—and perpetually. I am worried that, in the absence of that control, not only are we asking civil servants and members of the public services in the local government area to fly blind, but we are putting in jeopardy the credibility of Scotland and the Executive with the business and political media, with competitors, with potential inward investors and with members of our own business community, who are going through exactly that process of achieving greater efficiency after, first of all, having their operations under statistical control. The absence of statistical control really makes the whole thing, sadly, a bit of a laughing stock.
I paid particular attention to the bit of your exchanges with Mr McCabe that dealt with that, as it presented an interesting set of ideas. There is an immense volume of statistical data, and it is difficult to argue that there are key outcomes from the public sector that are not measured. Opinions vary about that, but we certainly measure what is currently the consensus view of the key outcomes in each area of business.
That is the one thing on which I wish to take issue with you. I acknowledge the political judgment that is required to set a target, to approve the programme whereby that target is achieved and to be responsible for that target. In the end, it is the responsibility of the civil service to have a mechanism in place to measure and monitor progress and to report back through ministers. I cannot imagine a senior captain of industry doing the same three things—setting the target, approving the programme and stepping up the target—but then not expecting his statisticians, accountants and so on to deliver the data that will allow him to put the proposition to his shareholders and the wider community.
My argument is that we are doing precisely that. I am not conscious of any key outcome that we do not measure. You are arguing that it is our job to measure things—and we are measuring them. The argument is essentially about the use that is made of those data, rather than about their existence.
In relation to the savings package and the efficiency exercise, Audit Scotland has identified some deficiencies in information and is highlighting the requirement to put baselines on a firmer basis. Even judging from our earlier robust exchange, it is clear that ministers acknowledge the need to put flesh on the bones of baselines and to improve financial information and output information. We take that as a given.
I think that Caroline Gardner is saying that she cannot audit what is not there.
Yes.
She is saying that she would expect the bread and butter of what Audit Scotland audits to be reports that flow through the management process for the purposes of tracking. That seems right to me. I think that auditors should audit the things that management requires for its purposes rather than impose additional demands on organisations. I think that your interpretation of the process is right, but I still do not see that it takes you any nearer the question that you were discussing with Mr McCabe, about what the precise content of the report should be.
If the Government's efficiency savings initiative has been genuine and effective, should we not have confidence in the measurements that are taken to substantiate the process in order to satisfy parliamentary and public opinion?
No. I think that Mr McCabe said that the process is evolving. I regard us as being in partnership with Audit Scotland to work out how best to do something that one cannot simply take down off the shelf. It seems to me that both Mr McCabe and Audit Scotland are saying that the process will have to develop and that they must still make judgments in some areas about how precisely to get the quality of evidence on particular efficiency savings that they would like to have. I do not recognise the situation as being one in which Mr McCabe says that the Executive has reached the limit of what can be done and someone else says that it has not.
It certainly sounded like that when he came into the room, I must say—that was just a pejorative remark.
To put it another way, there is a shared recognition that not all the mechanisms were in place to account adequately for and audit the process from the beginning. The Executive has indicated that, and it is also the Finance Committee's and Audit Scotland's view. You are saying, Mr Elvidge, that there is a continuing debate, or negotiation, between Audit Scotland and the Executive on the information that is required to provide the assurances that Audit Scotland feels it requires. Can you assure us that, as accountable officer, you will co-operate with Audit Scotland to the fullest extent in ensuring that it has the information that it needs to audit the efficiencies and provide the reassurances that it feels are appropriate?
The short answer is yes, but let me elaborate on that. I emphasise that I do not think that it is right for the audit process to ask for information that is not required for management purposes—audit should not be an additional burden on organisations. Therefore, because I believe that there will be adequate information for management purposes, I think that I can say that all that information will be freely available to Audit Scotland. The other important point to make is that Audit Scotland's role in the process is, to a substantial extent, the result of an invitation from the Executive to act as adviser and partner in developing the framework. We are not talking about Audit Scotland acting in its formal audit capacity in this case. Therefore, it would be wrong to postulate an adversarial model in which things might be withheld from Audit Scotland or there might be a difference of purpose, because that would contradict the way in which the relationship was entered into. That is my qualified yes.
Okay. We can take that qualified yes and look forward to further information in due course from you, ministers and other officials.
I also look forward to my next appearance. Thank you.