
Finance Committee, 27 Sep 2005

Meeting date: Tuesday, September 27, 2005




Performance Monitoring

The Convener:

Under the next item on our agenda, we will take evidence from the permanent secretary. We will take a minute or so to allow those who wish to leave to do so. I am sure that we would not want Mr Elvidge to be trampled in the rush of the press.

John Elvidge (Scottish Executive Permanent Secretary):

This is like putting the support act on after the main band.

The Convener:

This session is to allow us to take evidence from Mr Elvidge on performance monitoring in the Executive. We wrote to Mr Elvidge after the interesting session at which we took evidence from Professor Michael Barber, who, at that time, was the head of the Prime Minister's delivery unit. Mr Elvidge's response has been circulated to members and we have taken him up on his offer to come along and talk to us. The committee has also signalled that it wants to speak to Mr Elvidge about resource allocations under the previous spending review.

Mr Elvidge, in your letter to us, you say:

"Ministers are committed to delivery across the full range of their commitments".

You also say:

"specific policy commitments will require more intensive monitoring and support".

Will you tell us about that intensive monitoring and support? Is it monitoring of performance and, if so, against which benchmarks or criteria?

John Elvidge:

Those commitments are selected—

The Convener:

I am sorry; I have not given you the opportunity to make your opening statement.

John Elvidge:

That is okay. In these circumstances, I do not need an opportunity to make an opening statement. My evidence is a response to a series of questions that you asked; I am not seeking to introduce anything into the discussion that does not flow directly from your questions. I am perfectly happy to plunge straight into the pursuit of your questions.

The commitments to which you refer were selected for a variety of reasons. One criterion was that their delivery might be more complex than delivery of some of the other partnership agreement commitments and that, therefore, there might be a need for greater support for the teams charged with delivering them than for teams charged with delivering other commitments in the agreement for which the delivery path was clear and, if not straightforward, at least well understood. The first significance of their selection was to ensure that individual teams in the organisation were properly backed up by the rest of the resources of the organisation or, indeed, by resources from elsewhere, if that was necessary to secure the delivery of the commitment.

Another feature of that set of commitments is that they are subjected to a more intense process of monitoring. Across all the commitments, we largely rely on self-assessment of how the process is going by the teams that are delivering the commitment. For the group of commitments in question, the self-assessment is supplemented by assessment by our central analytical services group, which subjects the monitoring of delivery to more rigorous scrutiny to ensure that, in areas where delivery may be most challenging, we are being most robust in our monitoring of progress towards delivery.

You said that there are areas that need more intensive support. Can you give us an example of a policy commitment that has had that sort of treatment?

John Elvidge:

Yes. I am sure that, without looking at my lists, I can pick a few out of my memory. The commitment on recycling waste is a good example. On the one hand, we are aiming at levels of achievement that are paralleled in other countries. On the other hand, we are aiming at a rate of movement from where we started towards those levels of achievement that is enormously challenging and, on the face of it, requires interventions that are not currently understood. Once we get past what we might describe as the easy bit of encouraging recycling, there are issues about changing public behaviours that involve interventions by Government for which it is not easy to find parallels. Therefore, it is reasonable to assume that there will have to be a degree of innovation, which might require support from outside a small team.

Dr Murray:

The example of waste recycling is interesting because, in that case, the delivery is in the hands of the local authorities, not the Scottish Executive. Some local authorities are performing up to target, whereas others, such as my local authority, are most certainly not. Your requirement for local authorities to sign up to your priorities is quite challenging.

John Elvidge:

Indeed. It also raises questions about the nature of partnership working between the Executive and local authorities on delivering an objective in which working practices have to be different from those that apply with regard to the generality of local government business. That is part of the distinctive challenge of delivering that commitment.

Dr Murray:

I have some questions about the delivery unit at Westminster. As the convener said, Professor Barber gave us an extremely interesting presentation in May, in which he described in some detail the on-going process of achieving a target. For any specific priority, one maps a trajectory that ends with the target being achieved; I believe that Professor Barber said that that trajectory was measured monthly.

In paragraph 17 of your submission, you mention "improvements in waiting times" in NHS boards. Waiting times were one of the examples that Professor Barber highlighted when he explained how the delivery unit down south worked. You believe that Scotland has done rather better than England with regard to waiting times. As I do not have my glasses on, I will have to squint at your submission, but you say:

"Latest data indicates that 31.5 of every 100,000 population waited over six months in Scotland, compared to 83 in England."

Did you use a process similar to that used at Westminster of putting in resources and measuring the progress towards achieving the target, or was the target achieved in a different way?

John Elvidge:

It was certainly achieved in a different way. What the Scottish and English approaches have in common is close and regular performance monitoring. The data to the delivery unit in Whitehall and the data to us flow no less regularly than monthly and are enormously detailed about the nature of delivery. In that sense, the processes are similar.

The difference is characteristic of the difference between the Scottish and English models. In the example that you have raised, the delivery unit in England holds the Department of Health—the intermediary, as it were—to account for the health service's performance. In Scotland, the Scottish Executive Health Department holds the NHS to account for delivery. We have not thought it necessary to establish a third player to hold the department to account. Given the way in which Scottish government works, such a system seems unduly complex.

Dr Murray:

But in that case the department does not have any overall responsibility; it is responsible only for delivering targets. If that happens on a departmental basis, some of the raft of cross-cutting priorities and measures in the partnership agreement may not be managed in a detailed way.

John Elvidge:

That is a question of the way in which information is shared in the Government. If the Health Department was monitoring performance but was not obliged to tell anyone about the outcome, we would be running the risk that you have identified. However, the processes of government are such that the department's monitoring of performance is widely shared around the organisation.

Because that level of sharing goes on, we do not feel the same necessity to charge a group of people—on behalf of the First Minister, if we take the model literally—to monitor the Health Department in that way. Such a layer of formal checking would be redundant, given the ease with which Scottish ministers share knowledge of performance.

Does that sharing take place at a political level or at an official level?

John Elvidge:

It takes place at both levels.

The Convener:

Is it possible to make that argument the other way round? If you are trying to measure performance, in whatever way and however rudimentarily, across the full range of the partnership commitments, does that not go further down the route of introducing bean counters or button counters? I cannot remember whether Mr McCabe referred to bean or button counters.

Both, I think. He seems to be against all of them.

The Convener:

Is that not a more elaborate process, involving unnecessary checking, than what Professor Barber described as the highly concentrated and focused approach of the United Kingdom Government towards delivering targets in key service delivery areas?

John Elvidge:

I will try to reply to that question with as much fact and as little opinion as I can. When I was reflecting on the evidence that the committee had received, it occurred to me that Professor Barber had come and told you about one bit of what the UK Government does in monitoring delivery; it was not his responsibility to come and tell you about everything that the UK Government does in monitoring delivery. Every department of the UK Government contains a substantial body of people who are concerned with the tracking of performance across the board. The Prime Minister's delivery unit simply sits above that process and selects out of it a small number of things that it wants to monitor with particular intensity. I do not think that the delivery unit has been established as a substitute for the kind of comprehensive analysis of performance that you would find in the Executive or in the Department of Health or any other Whitehall department; the unit is an addition to it.

The Convener:

You are saying that the Prime Minister's delivery unit does what you do. However, there is no partnership agreement at Westminster. As it has been described to us, the focus in the Executive is on the partnership agreement. In the past, the committee has expressed the view that there is perhaps a need for greater priority to be placed on specific targets and outcomes rather than on the full range of targets that the Executive has identified both in budgetary documents and in the partnership agreement. Would you concede that the regime of checking whether something is geared towards the partnership agreement across the board is perhaps unnecessarily elaborate and that there is a danger of simply testing compliance with partnership commitments rather than actual delivery of services?

John Elvidge:

I am sure that the proper civil servant's answer is that that is a legitimate area for debate. Ultimately, it is for ministers to decide how they want to exercise political accountability to the electorate. They have chosen to exercise political accountability through the partnership agreement and its 450 individual commitments. Given the fact that that is the choice that ministers have made, it is absolutely clear to me that the right thing for the organisation to do is to have a monitoring framework that enables accountability to be exercised in that way.

The Convener:

That is a very judicious answer. However, it is a difficult task to monitor delivery over the 450 commitments. Would it not be easier if there was a clearer prioritisation of monitoring delivery on targets that particularly matter, rather than the broad-brush approach?

John Elvidge:

Would it be easier?

The Convener:

Could it be done more effectively? That might be a more concrete way of asking the question.

John Elvidge:

I would start from the proposition that one must not fly in the face of reality. If expectations of Government are broad, and if political commitments have been given in a particular way, the monitoring processes of Government have to reflect the breadth of accountability that must be discharged. We need to remember that this is not simply a product of the partnership agreement; ministers can be held to account by the Parliament in any way that the Parliament chooses on any subject that the Parliament chooses. It is part of our responsibility to enable ministers to respond to that legitimate accountability.

The process of tracking a lot of things is inherent in the nature of accountability. Arguably, in Scotland there is closer and more detailed accountability from ministers to Parliament than at Westminster. I am not sure that any system of prioritisation would relieve the organisation of the need to be in command of information about what is happening across a wide range of activity—nor would I argue that that would necessarily be a good thing. If one is undertaking activities, one should know what is happening with them. That seems to me to be a reasonable principle. Therefore, a certain amount of monitoring is appropriate for everything that we do.

If you ask me whether it would in some sense be easier to be on the hot spot for a smaller rather than a larger amount of things, I guess that there is a sense in which the answer to that must be yes. However, I am not sure that ease of accountability is a legitimate aspiration for us. I certainly do not think that any of our activity is wasteful in an accountable officer sense.

Mr Swinney:

I want to take you to one of the hot spots and follow up the point that Elaine Murray made about your statistics on health waiting times and what that says about performance monitoring in the Executive.

I am sure that the comparison between median waiting times in Scotland and in England is legitimate. However, the median waiting time for in-patients and day cases in Scotland has, since 1999, gone up from 30 days to 50 days. I am rather perplexed as to what that says. If we are talking about a performance monitoring culture, what does a deterioration of such magnitude tell us about the effectiveness of performance monitoring on what I consider a fundamental priority of the Government, which is to reduce waiting times? Reducing waiting times has been a fundamental priority of the Government for as long as I can remember.

John Elvidge:

If I had known I was going to be giving evidence to you in a week with two by-elections, I might have been a little bit more cautious about including health as one of the examples in my evidence, because I very much do not want to be drawn into an active political debate about which particular measure is more important than which other particular measure in the complex field of health.

What I would say is that I think that it is a generally accepted principle of target setting—this is related to the question of prioritisation—that when one puts emphasis on one thing, performance on other things may suffer as a consequence and that effort goes towards the priorities of the moment. Therefore, one sometimes sees the kind of effect that you have identified. The purpose of a performance monitoring system is to understand what is happening and, if the outcome is not acceptable, to create the opportunity to amend priorities to change that. Performance monitoring will not of itself prevent the consequences of shifting effort from one activity to another.

Mr Swinney:

Obviously, performance monitoring highlights the deterioration in the median wait and the increase in waiting times for day cases. I take from your answer that effort in another direction may have skewed the figures. The picture for the median wait for out-patients is much the same: waiting times have gone from 46 days to 62 days. I find it a bit difficult to understand how the shifting of effort has led to deterioration. What intrigues me about this is the availability of the performance monitoring information that clearly shows deterioration in median waiting times for patients. There does not seem to be any obvious intervention to do something about that.

John Elvidge:

I do not think that I would draw that conclusion. There are lots of interventions. Few things are more actively managed than waiting list performance in the NHS.

We can point to quite dramatic shifts from one time period to another in some of the indicators, which are evidence of active management having an effect. If the charge is that that does not always prevent some indicators from deteriorating or that it does not simultaneously improve all indicators, that would be true, but I do not think that one can draw from that the conclusion that there is not an active management process in play.

Mr Swinney:

You consider that the indicators I have cited—and which, to be fair, you cited in your own evidence—are fundamental indicators of performance. We have just had a discussion with Mr McCabe about trying to avoid too much bean counting, or button counting, or both. We have those indicators and they show us the deteriorating position, and I am just trying to work out how that percolates through the system to lead to changes of emphasis, changes of policy or changes of resource allocation to tackle what seems to be a pretty big problem.

John Elvidge:

For the sake of clarity, I am not entering into the debate about which is the most important measure of waiting times. There is a well-rehearsed effect—that as one concentrates on the longest waiting times, one automatically tends to push up median waiting times. There is a perfectly reasonable debate about which of those two numbers is the better measure of the effectiveness of a health service. What is difficult is to argue that you can improve both of them simultaneously. I am saying that waiting times is a generally accepted measure of effectiveness, but the particular slice that one takes through waiting times is a subject of active debate.

Mr Swinney:

Forgive me for simply taking the lead from your evidence, Mr Elvidge.

Jim Mather:

You stated in your memo:

"Mr McCabe reports to Cabinet twice a year on progress on the Partnership Agreement."

What progress reports does the First Minister receive as an output of that process and how frequently does he get them? Could you also tell us what format those progress reports take?

John Elvidge:

I can do my best to describe what is quite a complex document. The First Minister receives the comprehensive report in the same form as the Cabinet receives it. He also has access, as do other ministers, to the more regular summary reports of overall performance that the management group receives. As in any organisation, there is a system of exception reporting if there are particular subjects of concern.

It is fair to say that the report to Cabinet is an evolving document, but at its heart is an account of performance against every one of the 450-odd partnership agreement commitments and a separate report on the centrally monitored commitments within that. In so far as it is possible to draw out of that any general observations about delivery, that would happen as part of that process.

Jim Mather:

In the presentation that we had from Professor Michael Barber, it was interesting to see him tabling the characteristics of the system, whereby there are mechanisms to hold ministers and their civil servants to a single and somewhat narrower set of objectives. There is then a mechanism in place to make an objective assessment of how well people are doing and hence to produce a league table that is available to the Prime Minister, together with a devil-take-the-hindmost approach to management. Do you have any plans to move in that direction here in Scotland?

John Elvidge:

That takes us back to our earlier territory. One could operate a system of that kind only if ministers had made a prior decision that there was a smaller group of targets or objectives that were overriding priorities for the organisation. In the absence of that, one cannot simply transpose the delivery unit approach. As I have no reason to suppose that ministers will change their view of political accountability to the electorate, I have no plans to build any systems that assume that they will.

Therefore, we continue producing a report containing 450 or so measures, with the progress against them, and that is the document that is put before the Cabinet for the First Minister and his colleagues to assimilate and respond to.

John Elvidge:

Yes. Perhaps I did not explain clearly enough that, although the full data are there for ministers, they are summarised in various ways to help them to see the key elements. It would not be very responsible behaviour on our part simply to bang down the detail of 450 commitments and say, "There you are. Make what you will of that."

Does that summarisation not take us closer to having a subset of objectives that can be managed using the Barber method?

John Elvidge:

Yes and no. It takes us closer to knowing what subset of objectives might be proving particularly challenging to deliver at a particular point in time. The delivery sub-group of the management group uses a mechanism that is specifically targeted on those commitments where there appears to be a delivery issue to resolve. That is different from the approach of the United Kingdom Government's delivery unit, with its fixed list of priorities, which are there irrespective of whether there are concerns about delivery.

Jim Mather:

In a situation where there is a focus on challenges within a given portfolio, along with scrutiny of why they have arisen and an assessment of how to proceed, is there a function whereby both the minister responsible and his civil servants are answerable to the Cabinet on that?

John Elvidge:

There is a parallel process. The time sequence does not always work perfectly but, where possible, the first action is that the team responsible is held accountable to me and to my management group colleagues on the delivery sub-group, with emphasis being put on whether, in a problem-solving sense, we can assist the team with the delivery challenge that it faces. I would describe that as a support activity, rather than as an accountability function. My interest is in getting the delivery to happen, rather than in criticising people about any difficulties that they might be having in achieving that delivery.

Mr McCabe sees other ministers to discuss performance across their portfolios. Normally, the same commitments that are engaging me will form the core of Mr McCabe's discussion. In that setting, the ministers and their heads of department are perhaps situated in more of a context of accountability to Mr McCabe, rather than in the support context that I have described in relation to the work of the delivery sub-group.

Jim Mather:

Is there a mechanism whereby the First Minister, you and Mr McCabe can communicate to individual ministers in charge of portfolios where they sit on a spectrum between highly likely to achieve their objectives and very unlikely to achieve their objectives? Can you position them in that way and thereby give them an awareness of the impression that has been created as to their management of their portfolios?

John Elvidge:

That kind of assessment is fundamental to the material that is reported to Cabinet. Each commitment is assessed in terms of the risk or absence of risk to delivery. In addition, I would expect individual departmental management teams to share with ministers the flow of management information that each will have about performance across the objectives for its portfolio.

I am perhaps making my answer more elaborate than it need be. Fundamentally, there is no risk of any Executive minister being unaware of threats to delivery performance in their portfolio over the time cycles that we are talking about.

If you were asked to define the three or four key skills for the head of the Health Department, what would you say they were?

John Elvidge:

That is a fast ball. I would say that the key skills are a capacity for strategic management; the ability to organise and manage a large volume of human and financial resources; an understanding of the processes of delivery in a complex delivery organisation; and the ability to manage relationships in the context of government and the public sector.

So why has the post of director of delivery for the Health Department been advertised at a salary of £100,000 if that is the job of the head honcho?

John Elvidge:

It is a feature of all organisations that the head honcho cannot spend all their time all day doing all the things that they need to do. Departments consist of support structures that allow the head of department to carry out the various aspects of their job. You might as well ask me why we employ some people to think about strategy in the health service if an ability to think about strategy is part of the skill set of the head of department. I suppose that the short answer is that we cannot expect one individual to do the work on their own.

Is the head of the Health Department the equivalent of John Birt?

John Elvidge:

No. I would not like to characterise the role of head of the Health Department by using John Birt as a description. The head of the Health Department is the accountable officer for the health budget and the senior manager of the department.

The fun is over. If we are to get a director of delivery, why should we start with the Health Department? Could that model be replicated in other departments?

John Elvidge:

The answer lies in the unique relationship that exists between the Executive and the NHS. No other part of the public sector is similar to the NHS in its core relationship with the Executive. That is to say that the NHS is directly accountable to the Executive; unlike local government, it does not have a separate democratic existence. The health service is unique in the scale of its spending—it is responsible for slightly more than a third of the Executive's total budget—and in its operational decentralisation and complexity, which make the tracking of delivery an enormously difficult business.

I am thinking hard, but the closest analogy I can come up with is with the enterprise network. However, the vast difference in scale between the health service and the enterprise network is immediately apparent. I simply do not think that one would invent a post of such weight to exercise the delivery scrutiny function for any other part of the public sector. I would not regard the proposed model as transferable. Even if it were, the Health Department would still seem to be the most logical place to start.

Mr McAveety:

I would probably concede that point. My experience in the Health Department was that it requires something such as this post.

The metaphor that I always used was of a supertanker that could have a hole blasted in its hull but still get to its destination. Only then would someone tell you that they had left a big hole in the supertanker, saying, "By the way, minister, gonnae sort it out for me." A delivery model is required to deal with that.

We had a discussion with the minister about outcomes. How will you measure the performance of the individual who is in the post? What value-for-money outcomes would you expect them to achieve over the next year or two?

John Elvidge:

My comments are inevitably speculative, given that we are running ahead of the objective setting for the individual and that a sound principle of objective setting is that it should be done in discussion with the individual who will be subject to it. Generally, I would expect to select a limited number of the most challenging delivery issues in the health service and specify the difference that we would like to see the director of delivery make to performance against those delivery objectives. That comes very close to what Michael Barber is doing.

Is the post linked directly with the strategy and delivery unit? Is there an intrinsic link?

John Elvidge:

There is not an intrinsic link.

Should there be?

John Elvidge:

No, I do not think so. There should be a channel of discussion about delivery lessons that are being learned in the health service that might be of value elsewhere, but it is fundamental to the models that we use in the Executive that we try to keep responsibility for delivery decentralised in the organisation rather than pull it together. The delivery unit is a very slim structure, which is part flying squad to help teams that are in trouble, part sharer of best practice and part monitoring body. It is best to keep the delivery unit in its whole-Executive role and regard the director of delivery and their team in the health service as the largest example of decentralised delivery responsibility in the organisation.

There are 10 commandments but 450 commitments. How many of the commitments can you put under the focus at any one time? You obviously cannot cover all 450.

John Elvidge:

It depends what is meant by focus. Tracking them all simultaneously is not a challenge, but we can intervene in only a small number at any one time. As a general rule of thumb, it takes at least an hour of senior management time to have a sensible discussion about delivery problems on any individual commitment. An hour of senior management time—three or four senior managers would be involved—is a scarce commodity in any organisation.

In practice, we can bear down intensively from the centre of the organisation on only two or three targets at any particular time. However, one must bear in mind that there is a hierarchy of intervention. The senior management of a department will intervene within the department if particular teams are struggling, so the organisation's overall capacity to focus on commitments is greater, but it is difficult to put a figure on it. If, in very crude terms, we said that it was probably possible to have 10 per cent of the commitments under reasonably close scrutiny at any time, that might be a rough measure of capacity.

What triggers that sort of intensive focus? Does it come from you, does it come from within the civil service or are most of the initiatives triggered by ministerial diktat?

John Elvidge:

It is a bit of all those things. There is a cyclical issue. Early in the four years of the session, one does not know how delivery on many of those things is going, so one cannot immediately say, "Let's focus our attention on the ones that are in trouble." At the outset of the four-year cycle, there are management decisions about where it might be useful to invest some time, where the risks look greatest. As we move through the cycle, the emphasis shifts to focusing on those things where the teams are identifying delivery problems. Overriding all of that, if ministers are particularly worried about something we will spend some time looking closely at it.

The Convener:

There is a nice phrase that I got from a Canadian who worked for one of the legislative assemblies there: people in her position are "we be's"; "Whatever happens to politicians, we be here." You have seen several Administrations in Scotland and there are some fundamental underlying issues that we have to address, whether health, education—which I know is your background—or economic growth, on which there is broad consensus between the parties about where we want to be.

You have described a politically driven process that is geared towards meeting the partnership commitments, but when you responded to Frank McAveety's question about what you would ask the director of delivery to do, you said that you would narrow down the focus to four or five main things that were most important and required to be delivered.

Is there a disjuncture between a politically driven process that is geared towards meeting the outcomes of a coalition agreement in the form of the partnership agreement, and a managerial approach that says, "We've got to deliver on these 10, 12, 15 or 20 things"? Are you trying to manage something that is driving you in opposite directions?

John Elvidge:

No, I do not think so. The process of narrowing down that I described would undoubtedly be a process of narrowing down that had political consent. There is absolutely no point in the organisation focusing on things that ministers do not believe are the right things to focus on. By definition, at any point in time the organisation is managing a number of things that do not appear in the 450 partnership agreement commitments. In particular, it is paying attention to some of the large, long-term outcomes that Mr Mather was talking about in the earlier discussion with the minister. I do not see that as a disconnect. As the minister said, those larger outcomes are often implicit in the more detailed objectives that ministers have set.

To put it another way, is the organisation busy managing a longer-term agenda of its own? The answer is no. Time will tell whether senior civil servants prove to be around longer than individual ministers in the Scottish set-up. I do not think that we should assume that experience in Canada or anywhere else is a guide to experience in Scotland.

The Convener:

I am sure that that is the case. I was certainly not implying that there should be, or is, a separate agenda for civil servants. To cast the question in a different way, I suppose that the challenge to ministers is to identify their core priorities and what is most important for them to deliver in order for you to manage the resources towards that delivery process. That is how it seems to me. Do ministers take on board the fact that, to achieve some of the things that they deem most important, they must set clear priorities and pathways towards delivering those things, and is that then reflected in the way in which the strategy and delivery unit, or indeed the civil service as a whole, operates?

John Elvidge:

A process of discussing priorities is part of the day-to-day business of ministers. I would be surprised if that were not so in any political system. Does it feed through into what the strategy and delivery unit does? Yes, up to a point. It does not detract from the centrality of the partnership agreement commitments, but it might generate additional issues to which the strategy and delivery unit needs to pay attention. If that were the case, those additional demands on its efforts would arise from that kind of process of ministerial discussion of priorities.

The Convener:

I think that we have probably finished with performance management. Perhaps we can move on to allocations from the spending review and issues of efficiency. As accounting officer, how do you satisfy yourself that the level of efficiency that is being achieved or proposed for each department is realistic and acceptable?

John Elvidge:

It is tempting to say that I use my judgment and experience and look for as much evidence of deliverability as I can. In a perfect world, I would want to see a clear delivery plan that gives me complete confidence that particular sums of money will be delivered from a particular action. Unfortunately, life is not always that tidy and sometimes I have to work down through a hierarchy of assurances, taking lesser standards of evidence, which I supplement with judgment and sources of reassurance wherever I can find them, by asking questions such as "Has anyone else ever succeeded in doing that?"

In the spending review, how was it determined where the greater scope for efficiency savings was and what level of savings should be achieved?

John Elvidge:

Outside the Executive, that is done through a process of discussion across departments, drawing on individual departments' knowledge of the areas of government for which they are responsible. Inside the Executive, it is done through a process of discussion of relative challenges that takes place inside the management group. Although quite a lot of the Executive's efficiency savings are designed to be achieved through activities that are organisation wide, there is an issue about how the impact falls on individual departments, which we examine as closely as we can.

The Convener:

It has been highlighted to us that the Scottish Executive Environment and Rural Affairs Department and the Scottish Executive Enterprise, Transport and Lifelong Learning Department have made relatively limited contributions to efficiency savings. Why is that?

John Elvidge:

It is because of varying patterns of business. The crucial thing to understand about efficiency savings is that they do not take place in a static setting. That is why it is not possible to put a number on the scale of the efficiency savings challenge for the Executive in percentage terms or, realistically, in cash terms. One can identify what the minimum saving is if business is unchanged, but of course that is not the reality, because business changes all the time and additional pressures fall on the organisation. In areas of business in which the additional pressures are greatest, the net saving will be smaller than the average. In bits of the organisation in which people are able to manage an overall decline in activity in some way, there is greater scope for making efficiency savings.

There are other factors. Procurement here, as elsewhere, is a major driver of efficiency savings. Depending on the nature of their business, different departments procure in different volumes. One cannot save money on procurement if one is not engaged in a significant volume of procurement activity. However, if one is, the scope is very large. I would not expect to see uniformity.

The Convener:

I am intrigued by the fact that the smallest contributions come from those two departments. Will you say any more about why SEERAD appears to be unable to deliver efficiency savings when it has a substantial number of areas in which it could identify such savings?

John Elvidge:

I am not saying that; I am talking about the net position. SEERAD has one of the best examples of a single project saving money in a department. The changes to the administration of the common agricultural policy payments have released substantial savings by delivering the payments in a much more streamlined way. In part, those savings have been consumed by new activities in the department, the introduction of land management contracts being the primary example of that. Therefore, I would not be critical of my SEERAD colleagues and their commitment to achieving efficiency where they can. One needs to recognise that they are being asked to take on additional activity in a number of areas and that that changes the net picture.

The Convener:

The overall statistics seem to suggest that SEERAD and the Enterprise, Transport and Lifelong Learning Department are making the smallest contribution to efficiency savings, so I am not sure that I have quite got an answer to my question.

One of the issues is the timescales during which commitments are made to make savings, with 2007-08 being the present time horizon. In some areas, it will take longer to make the changes that are required to deliver savings. Are you considering the savings that could be made over a longer timescale, such as the period to 2010 or 2012? Might that show a different pattern from what we have seen up until now?

John Elvidge:

Yes, we are. It would be surprising if the pattern stayed uniform over time. In addition, we have set ourselves a savings target for this year that is greater than the savings target that the budget forced us into. That links slightly to Mr McAveety's point about supertankers—if one wants to turn a tanker, the sooner one starts turning the wheel the better. We have tried to pull forward some of the savings pressure to ensure that when we need to hit the peaks of those savings over the three years, we have the measures in place. We are certainly looking beyond 2008 at what we need to achieve.

One of the reasons for that is that there are a number of uncertainties—and, in particular, one major uncertainty—about costs even in the run-up to 2008, which makes it difficult for me to judge quite the volume of savings that I might have to deliver over and above the minimum. The major uncertainty is the unknown cost of pay movement over that period. We are in the process of negotiating a new pay agreement with the unions—or we will be over the next few months—and I cannot know what that pay agreement will do to the pay bill. I might need to make more efficiency savings.

Mr Swinney:

I will pick up on one of the convener's points about the enterprise field. You explained the relationship between the Executive and the health service as a unique and direct relationship for the delivery of services, and you suggested that the closest comparator is the enterprise network. I am struck by the fact that the 230-page manual to which Mr McCabe referred earlier identified only £4 million of enterprise network savings. Off the top of my head, that is probably less than 1 per cent of the enterprise network's budget, whereas the health service savings are formidably higher. I am struck by the inconsistency between the impact of the two—or their targets. The convener raised a point about the Enterprise, Transport and Lifelong Learning Department's limited contribution to the process. Can you explain why the disparity to which I refer exists?

John Elvidge:

On the enterprise network, we must take into account the fact that, if we were not starting the clock on 1 April this year but looking back two further years, we would see a different picture of efficiency savings that have now been driven out of the network. Its argument is that so much cost has been taken out in previous years that the scope for additional efficiency savings is necessarily more limited. No one would thank me for accepting that anyone has reached the limits of the efficiency savings that they can achieve, so I am not saying that, but cutting in at a particular point in time is always an issue with an exercise of this kind, as it can distort comparisons between one organisation and another.

Mr Swinney:

That is a reasonable point. I sought an assurance that you were doing exactly what you said you were doing in that area and were not just saying something like, "Well, that's fine, lads. Thanks very much. Let's move on." Given the debate about the outcomes and impact of public expenditure, it strikes me that there is still an opportunity for the Executive to make significant gains in the effectiveness of public expenditure. I hope that the process will accommodate that in due course.

John Elvidge:

I think that you can take it for granted that there is continued scrutiny of the scope for efficiency savings in all areas, including the one to which you referred.

Mr Swinney:

I want to move on to an issue that I raised with Mr McCabe. It strikes me that there is an inconsistency between how efficiency savings are identified and utilised in local authorities and how that is done in other areas of the Executive's work. The Executive tells local authorities that it believes that they have the capacity to make a certain amount of savings, so the Executive will retain that amount of money. The Executive tells the authorities to make their efficiency savings within the settlement that they achieve. However, in other areas of government, the Executive says that it wants a certain amount of efficiency savings to be made, and those areas are then free to reinvest the savings as they see fit. What is the justification for that inconsistency in approach and for dealing with local authorities differently?

John Elvidge:

I am tempted to say that you are taking me into areas that Mr McCabe has dealt with comprehensively. It seems to me that ministers are saying to all parts of the public sector that they are making a judgment about the savings that they believe the public sector has the capacity to achieve and are setting budgets in the light of that judgment. Ministers then say to some parts of the public sector that, because of decisions about priorities, they want to create capacity for growth in spending in those parts, but they are not saying that universally. That is the broad nature of the process that has gone on. However, I am close to territory that I feel is exclusively Mr McCabe's and not mine.

What is the rationale for deciding who fits into which category?

John Elvidge:

The same rationale that lies behind any decision about prioritisation that ministers make, as they do constantly, applies in the resource allocation decisions that they make.

Mr Swinney:

You were here earlier when we had a discussion with Mr McCabe about the information that is required to underpin public confidence in the efficiency exercise that the Executive is undertaking. In your capacity as accounting officer, with a particular responsibility in the relationship that exists between the Executive and Audit Scotland in the verification of Government spending, what further information do you think that the Executive must generate to meet the standards that Audit Scotland's letter of 10 August appears to demand?

John Elvidge:

I am increasingly getting a sense of pits being dug around me.

Not at all. It is a genuine search for an answer. I have no other motivation whatever.

John Elvidge:

I was not imputing any base motive; I was simply saying that I had a sense of pits being dug around me.

For me to answer that question properly, we have to be a bit formal about the nature of accountable officer responsibilities. The accountable officer's responsibility to ensure that spending is proper and that value for money is achieved does not quite take us into the territory of ensuring that every detail of a set of spending plans turns out precisely as it was announced to be. Accountable officers are not an alternative to ministers or second-guessers of ministers; they safeguard the propriety of money when it is spent, and the efficiency process is, by definition, a process of not spending money.

Mr Swinney:

I feel that you are trying to make a distinction between the pursuit of value for money and the pursuit of the efficient government programme, as though value for money is somehow your responsibility and efficient government is not. Am I misinterpreting what you are saying?

John Elvidge:

No, you are not. However, I am saying that value for money, as a principle, will take you only so far into the detail of the issues that you are pursuing. The obligation on an accountable officer is to draw attention to those things that are demonstrably not value for money, not to provide a running commentary on the comparative value for money of everything that happens. I am sorry if such distinctions are unhelpful, but I think that the limitations of the accountable officer role are genuinely important. To say that a core responsibility of the accountable officer is to track the detail of every efficient government saving is not an interpretation that accountable officer responsibilities will bear.

Is it not part of the responsibility of the accountable officer to ensure that the baseline data are in place to allow an assessment to be made of the effectiveness of any efficient government programme?

John Elvidge:

It is the responsibility of an accountable officer to have whatever data he or she needs to satisfy himself or herself that there is not a demonstrable case of poor value for money—or, indeed, improper expenditure or the other things with which an accountable officer is concerned. It is not necessarily an accountable officer's responsibility to insist on particular standards of evidence, especially when—as your discussion with Mr McCabe and committee discussions have brought out—value for money is at the heart of the decision on how many data to collect before the cost of collecting the data outweighs the value of using them.

Mr Swinney:

If baseline data are not in place to allow us to establish a starting point and we say that having a range of measures to prove that an achievement has been made is unjustifiable financially and in other resource terms, we are in danger of being unable to assess realistically a programme's effectiveness. Do you accept the need, which I take from Audit Scotland's letter, for a quantum improvement in the volume of information that is available before anyone can be confident that so-called efficiency savings have been achieved?

John Elvidge:

I recognise that at the heart of your discussion with Mr McCabe is the question of what an adequate standard of proof is. We have acknowledged that some additional data that neither we nor other organisations had a prior business need to collect will need to be collected in various ways to show delivery of efficiency savings. Therefore, some form of additional collection is built into the plans. Your disagreement with Mr McCabe seems to be about where precisely one draws the line in that process. I do not wish to position myself between you and Mr McCabe on that subject.

The Convener:

You laid out your information requirements to meet your responsibilities as accountable officer to protect the public purse and ensure value for money. Is what you ask for or deem necessary different from what Audit Scotland requires for its purposes, which are broadly similar?

John Elvidge:

Beyond a point, we are in the realm of judgment about evidence standards and the cost-versus-benefit balance about which we have talked. It is to be expected that different individuals and organisations will judge that balance differently, even though they may all be motivated by a concern for value for money. Therefore, it is not axiomatic that I, as accountable officer, and Audit Scotland, with its responsibilities, will automatically arrive at precisely the same view on how that balance should be struck.

We return to my points about the accountable officer role. My responsibility is to ensure that what happens is not on the unacceptable side of a minimum standard. Audit Scotland's role is to take a view on where an optimum position in the range of possibilities might lie. Those functions are different.

Mr Swinney:

Do you as permanent secretary have a responsibility in relation to communication of the Government's message? The Government might claim that £1.2 billion of efficiency savings will be made by 2007-08. As permanent secretary, do you have a responsibility to ensure that that statement is justifiable?

John Elvidge:

Yes. I have a responsibility to ensure that such statements are made in good faith and are supportable. That does not take one automatically to the magic answer to the question of what constitutes the right support.

Mr Swinney:

In effect, then, there are two tests. One is to do with whether Audit Scotland has a view that the information can be signed off and the other is to do with whether you, as permanent secretary, believe that it is credible for the Government to say that it has made £1.2 billion in savings.

John Elvidge:

Yes. You are right to say that those are different roles. The second one is not part of the accountable officer role; it is part of my responsibility as permanent secretary to ensure that all the statements that ministers make in their ministerial capacity are made in good faith. You are right to say that that is a different dimension of my responsibilities.

The Convener:

Audit Scotland has said that financial information is not enough for its purposes of demonstrating that the financial savings and efficiency improvements have been realised, and that outturn measures for services are required.

In the context of your definition of your role, do you think that output measures are required for you to satisfy yourself that what is being stated is being delivered?

John Elvidge:

Yes. Since efficiency is the measure of a relationship between costs and outputs, that must, in principle, be true. However, that does not take one easily away from the point in the Audit Scotland letter about the intrinsic difficulty of measuring many outputs in the public sector. Some outputs are easily measured but many others are not. One of the challenges is that a large chunk of the responsibilities of the Executive do not have easily measurable outputs. Again, I will end up having to make a judgment about what measurement I need.

For the Executive's activity, that might be less to do with measurement than with observing activity. It ought to be apparent to me, managerially, if things that are designed to happen are not happening. Therefore, that shortfall in output ought to be apparent to me without my needing to construct measurement systems to achieve that knowledge.

Mr Swinney:

On what you said about the role of the permanent secretary in relation to ministers' statements, do you think that it was wise for the Executive to state on 8 September that it will publish targets and figures only when it is convinced that they are robust and deliverable and then go on to publish the document that we have before us today?

John Elvidge:

I do not see any inconsistency between the statement and the action.

Are you the accountable officer or the accounting officer? They are slightly different things.

John Elvidge:

They are not different things. The Scotland Act 1998 uses the term "accountable officer", while the UK Government's framework uses the term "accounting officer"—same animal, different names.

Dr Murray:

My question builds on the perception that the Scottish Executive departments are not being asked to make the same level of efficiency savings as other agencies are. Since devolution, the Executive has grown. It is not surprising that it is bigger than the Scottish Office was, because of the support system that is required to cope with the fact that there are more ministers and so on. However, as accountable officer, how do you demonstrate that the operation of the civil service represents value for money?

John Elvidge:

That is a big challenge because of the particular difficulty of measuring outputs at the level of the Executive. As you know, the Executive is in almost no respect the deliverer of ultimate outputs. Therefore, in almost all respects, its role is intermediate. Those elements that we can measure seem to me to be an unsatisfactory proxy for the business of the organisation. I can easily enough talk about the number of pieces of legislation that we support ministers to take through the Parliament, and we commonly talk about the number of pieces of ministerial correspondence, the number of parliamentary questions, the number of requests under the Freedom of Information Act 2000 and so on. We can talk about all those matters, and I can draw some ratios. However, I do not think that that is the best way of getting at value for money, because it leaves too much of the organisation's real business unmeasured.

I find it more helpful to benchmark against other Governments and to compare, for example, our spending on the civil service, proportionate to some measure of the volume of activity, with theirs. That is the significance of benchmarking against the UK Government. Even when it has achieved all its efficiency targets, it will spend more on the civil service as a proportion of total activity measured by expenditure than we are spending now and certainly than we shall spend when we have carried through our efficient government reductions.

Is that not because the UK civil service delivers more front-line services?

John Elvidge:

Actually, I am not sure that that is the case. The benchmark data that I referred to cover the totality of civil service employees in Scotland. That includes the whole of the Scottish Prison Service, which accounts for a huge proportion of total civil service activity within the Executive. Allowing for that factor in assessing benchmarking perhaps counts in our favour rather than in the UK Government's favour.

There have been fairly significant real-terms increases in Executive spending over recent years. Do you have evidence that the monitorable outputs for that spending have grown in line with its rate of increase?

John Elvidge:

We have evidence on almost every output that can be measured. In the short term, do those outputs move in line with spending increases? No, that does not happen, either here or in other places. Has the expenditure-to-outputs ratio improved over that period? No, but that truth is generally demonstrable across the range of Government activities elsewhere. The more challenging question is whether any of that will happen over a long time. One can say only that it is too soon to tell.

Derek Brownlee:

I do not want to quote you inaccurately, but you said something to the effect that you did not want to suggest that there was a limit to savings or that we had reached such a point yet. However, in some of your areas of responsibility, there is a limit to the amount of savings that can be made without having to adjust services. For example, at some point, you must have to stop carrying out functions instead of simply making incremental savings.

John Elvidge:

That must be true in principle. All organisations have a point at which they shift from achieving efficiencies to cutting costs and reducing outputs accordingly. The managerial challenge is to find that transition point.

Do I think that we have reached a state of perfect efficiency? No, but then I do not know of any organisation that has.

That is a fair point. Where, then, along the spectrum of efficiency are we?

John Elvidge:

That takes me back to what I said to Dr Murray about benchmarking. The benchmarking suggests that we are a reasonably long way along the spectrum of achieving efficiency in terms of that kind of comparison. What I mean by that is that shifting very many percentage points—or, to be accurate, tenths of percentage points—will prove to be challenging in relation to the delivery of genuine efficiency.

Two years ago, the comparison between the UK Government and the Scottish Executive showed that the figure for the UK Government was 4.3 per cent of expenditure on civil service, relative to total expenditure, as compared to 2.6 per cent in Scotland. Clearly, our efficiency was not double that of the UK Government, but we were more than one and a half times more efficient. By definition, the benchmarking suggests that we must be getting into the areas of efficiency that are more challenging to deliver.

Some of the things that we can point to having done—things that are at the heart of the efficiency debate—reinforce that. As I said earlier, both the Scottish Executive and the UK Government have put procurement at the heart of the efficiency process; it is the largest single source of efficiency. In Scotland, we have built an e-procurement system that is capable of serving the whole of the public sector. It has been widely praised, won a number of industry awards and is regarded, certainly by the Financial Times, as a leading-edge, world-class system of its kind.

Making such progress in those significant areas raises the question of how much further progress we can squeeze out, particularly given that our present projected efficiency savings already take into account the projected increase in the volume of transactions that will go through the e-procurement system. Inevitably, one will begin to hit diminishing returns in some of the key areas.
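
As a rough check of the 4.3 per cent and 2.6 per cent figures quoted above, and taking civil service spending as a share of total expenditure as the basis of comparison, the ratio works out as:

4.3 / 2.6 ≈ 1.65

That is, the UK Government's share was roughly 1.65 times the Scottish figure, which is the "more than one and a half times" comparison drawn in the evidence.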

Mark Ballard:

My question is on the challenge of continuing to squeeze out efficiencies. Under the current plans, savings as a percentage of the departmental expenditure limit amount to 1.8 per cent in 2005-06, 2.9 per cent in 2006-07 and 4.4 per cent in 2007-08. However, those are cumulative figures: to achieve that total of 4.4 per cent, on a year-by-year basis, there is a 1.8 per cent cut in the first year, which declines to a further 1.1 per cent cut in the second year, followed by a further 1.5 per cent cut in the third year.

Using the analogy of trying to turn a supertanker round, one would expect you to start slowly and then proceed at a greater pace. In this case, however, the opposite appears to have happened: you started at a fast pace, only for the speed to decline. Is that because you have squeezed out most of the easy efficiencies, or is there a lack of ambition when it comes to future efficiency gains?
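
For clarity, the year-by-year arithmetic behind those figures, taking the cumulative percentages at face value, is:

2005-06: 1.8 per cent
2006-07: 2.9 - 1.8 = 1.1 per cent
2007-08: 4.4 - 2.9 = 1.5 per cent

That is, cumulative targets of 1.8, 2.9 and 4.4 per cent of the departmental expenditure limit translate into incremental savings of 1.8, 1.1 and 1.5 per cent in successive years.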

John Elvidge:

No. That takes me back to my earlier point. The figures that you used are minimum figures for the savings that have to be achieved, all of which are predicated on a standstill budget. The savings are measures of what would have to be delivered simply for the organisation to stand absolutely still. However, the reality is that neither activity nor some of our cost pressures will stand absolutely still. The actual scale of the efficiency savings that are delivered will be larger than those numbers. Because the uncertainties about costs and activity grow over time, the reality is likely to be that the scale of the challenge grows over time—notwithstanding the fact that those baseline figures give a different impression.

Mark Ballard:

Fair enough. There has been discussion of, and reports to previous meetings of the committee about, the fact that ministers are considering taking the efficiency work up to 2010 and have already set their targets for that period. Given the pressures that you have talked about, what figures are being set, and if, as you say, the figure for 2007-08 is a minimum, how can the process of continuing the programme to 2010 work?

John Elvidge:

How can it work? To an extent, that is a question of keeping up the pressure on us as well as on other parts of the public sector. It is very difficult to know, at this point in time, what the component parts of those additional savings are going to be. In essence, it is a statement of political judgment that further savings of a particular magnitude must, in principle, be achievable. One needs time to work out how that is translated into a particular set of efficiency measures.

As you said earlier, the pattern might change. You talked about the pattern in relation to the enterprise networks. Do you see those efficiency savings being of the same magnitude as the 1.8 per cent saving that we saw for 2005-06?

John Elvidge:

The honest answer is that I have absolutely no idea. That is very much a political judgment about where ministers wish to set the scale of the challenge. I would be very surprised if the scale of the challenge were any less than it has been in the period up to 2008.

Jim Mather:

I would like to return to the baseline data issue, to see whether we can inject some additional value into the process. If we are to have genuine efficiency savings, we have to start off with all operational levels and the macro-entity under some level of statistical control, so that we can monitor what is going to be incremental by way of improvement over time—and perpetually. I am worried that, in the absence of that control, not only are we asking civil servants and members of the public services in the local government area to fly blind, but we are putting in jeopardy the credibility of Scotland and the Executive with the business and political media, with competitors, with potential inward investors and with members of our own business community, who are going through exactly that process of achieving greater efficiency after, first of all, having their operations under statistical control. The absence of statistical control really makes the whole thing, sadly, a bit of a laughing stock.

John Elvidge:

I paid particular attention to the bit of your exchanges with Mr McCabe that dealt with that, as it presented an interesting set of ideas. There is an immense volume of statistical data, and it is difficult to argue that there are key outcomes from the public sector that are not measured. Opinions vary about that, but we certainly measure what is currently the consensus view of the key outcomes in each area of business.

It is not that the toolkit for statistical control is not there. We are perfectly able to track such things and, to an extent, we do. They are the bread and butter of political debate. For example, what is performance like in schools, what proportion of young people are going on to tertiary education and what is happening to mortality rates? Those are all measured and the information is common currency.

What ministers have not chosen to do—as I understand it, the committee is arguing that they should do this—is to construct a definitive set of those measures. That could be some version of the happiness index that some people argue countries should use. That index would comprise gross domestic product plus a series of social measures. I have no professional difficulty with how one would produce such a thing, but the question whether one does so is clearly political, not professional.

Jim Mather:

That is the one thing on which I wish to take issue with you. I acknowledge the political judgment that is required to set a target, to approve the programme whereby that target is achieved and to be responsible for that target. In the end, it is the responsibility of the civil service to have a mechanism in place to measure and monitor progress and to report back through ministers. I cannot imagine a senior captain of industry doing the same three things—setting the target, approving the programme and stepping up the target—but then not expecting his statisticians, accountants and so on to deliver the data that will allow him to put the proposition to his shareholders and the wider community.

John Elvidge:

My argument is that we are doing precisely that. I am not conscious of any key outcome that we do not measure. You are arguing that it is our job to measure things—and we are measuring them. The argument is essentially about the use that is made of those data, rather than about their existence.

The Convener:

In relation to the savings package and the efficiency exercise, Audit Scotland has identified some deficiencies in information and is highlighting the requirement to put baselines on a firmer basis. Even judging from our earlier robust exchange, it is clear that ministers acknowledge the need to put flesh on the bones of baselines and to improve financial information and output information. We take that as a given.

If I understand correctly, Caroline Gardner said that her work in validating or auditing the process will depend on reports given to accountable officers. You are the chief accountable officer in that regard. It is for you to put the mechanisms in place that allow us to carry out the necessary testing and to provide the required information. Am I wrong about that?

John Elvidge:

I think that Caroline Gardner is saying that she cannot audit what is not there.

Yes.

John Elvidge:

She is saying that she would expect the bread and butter of what Audit Scotland audits to be reports that flow through the management process for the purposes of tracking. That seems right to me. I think that auditors should audit the things that management requires for its purposes rather than impose additional demands on organisations. I think that your interpretation of the process is right, but I still do not see that it takes you any nearer the question that you were discussing with Mr McCabe, about what the precise content of the report should be.

Mr Swinney:

If the Government's efficiency savings initiative has been genuine and effective, should we not have confidence in the measurements that are taken to substantiate the process in order to satisfy parliamentary and public opinion?

The position is that Audit Scotland, which Parliament established as an independent body, requires sufficient detail before it will be satisfied about the process. However, the minister's response is that the Executive has sufficient measures in place to be comfortable about its judgment, and you are saying that this comes down to a matter of judgment. If I interpret the committee's position correctly, however, it feels that all that should be there is not there.

John Elvidge:

No. I think that Mr McCabe said that the process is evolving. I regard us as being in partnership with Audit Scotland to work out how best to do something that one cannot simply take down off the shelf. It seems to me that both Mr McCabe and Audit Scotland are saying that the process will have to develop and that they must still make judgments in some areas about how precisely to get the quality of evidence on particular efficiency savings that they would like to have. I do not recognise the situation as being one in which Mr McCabe says that the Executive has reached the limit of what can be done and someone else says that it has not.

It certainly sounded like that when he came into the room, I must say—that was just a pejorative remark.

The Convener:

To put it another way, there is a shared recognition that not all the mechanisms needed to account adequately for and audit the process were in place from the beginning. The Executive has indicated that, and it is also the Finance Committee's and Audit Scotland's view. You are saying, Mr Elvidge, that there is a continuing debate, or negotiation, between Audit Scotland and the Executive on the information that is required to provide the assurances that Audit Scotland feels it requires. Can you assure us that, as accountable officer, you will co-operate with Audit Scotland to the fullest extent in ensuring that it has the information that it needs to audit the efficiencies and provide the reassurances that it feels are appropriate?

John Elvidge:

The short answer is yes, but let me elaborate on that. I emphasise that I do not think that it is right for the audit process to ask for information that is not required for management purposes—audit should not be an additional burden on organisations. Therefore, because I believe that there will be adequate information for management purposes, I think that I can say that all that information will be freely available to Audit Scotland. The other important point to make is that Audit Scotland's role in the process is, to a substantial extent, the result of an invitation from the Executive to act as adviser and partner in developing the framework. We are not talking about Audit Scotland acting in its formal audit capacity in this case. Therefore, it would be wrong to postulate an adversarial model in which things might be withheld from Audit Scotland or there might be a difference of purpose, because that would contradict the way in which the relationship was entered into. That is my qualified yes.

The Convener:

Okay. We can take that qualified yes and look forward to further information in due course from you, ministers and other officials.

On behalf of the committee, I thank you for coming along to the meeting and responding to our questions. Clearly, we will continue with work in this area as part of our business and we look forward to having a chance to discuss the issues with you again at some stage.

John Elvidge:

I also look forward to my next appearance. Thank you.