Scottish Further Education Funding Council
Agenda item 6 is our inquiry into Audit Scotland's report on the Scottish Further Education Funding Council. Before we proceed, I remind everyone to switch off their mobile phones and pagers.
We will observe the two minutes' silence at 11 o'clock, at which time I will ask members and all those in the chamber to stand. I advise everyone that proceedings will be filmed and may be used by the BBC.
I welcome the witnesses, who are Mr Roger McClure, the chief executive of the Scottish Further Education Funding Council, and Mr Graham Donaldson, from Her Majesty's Inspectorate of Education. I ask them to introduce their teams.
Mr Roger McClure (Scottish Further Education Funding Council):
On my right, I have Mr David Wann, who is the deputy chief executive and secretary to the council. On my left, I have Mr Liam McCabe, who is the director of our governance and management appraisal and policy directorate.
Mr Graham Donaldson (Her Majesty’s Inspectorate of Education):
On my right is Dr Wray Bodys, who is the chief inspector with responsibility for our work in further education. On my left is Mr Kish Srinivasan, who is the lead inspector for most of our direct work with the funding council.
Thank you. I ask Rhona Brankin to make a declaration of interest.
Although it is not technically a registrable interest, I declare that Graham Donaldson is my brother-in-law.
I think that it is worth recognising that the report shows a positive picture of the efforts that the funding council has made to begin to influence the sector in a number of key areas. Normally at this point, we would move to questions from members. Before we do so, would any of the witnesses like to make an opening statement?
Although I do not wish to make an opening statement, I am grateful for your comment about the efforts of the funding council. I would add that the efforts of the sector and the colleges on the ground, where the differences are made, are what really matter.
As Mr Donaldson has indicated that he does not wish to make an opening statement, we will proceed to questions from members.
The accountability arrangements that cover the further education sector reflect the respective roles of ministers, the funding council and the colleges. How do those arrangements work in practice? In particular, how do the targets and standards that you set for college activity and the quality of FE provision fit within the framework?
On college activity, we go through a process each year of allocating annual funding to each college; as one would expect, that process is elaborate. At the end of the process, each college receives an offer of grant from the funding council, which boils down to an amount of money and a target volume of activity. The volume of activity is measured in a unit that has come to be known as the weighted SUM—which stands for student unit of measurement—and that boils down to a 40-hour increment of teaching. Those two figures are offered annually to each college with a set of standard conditions of grant or, in some cases, specific conditions, which apply to the offer in a particular year. In the vast majority of cases, the colleges accept the offer, which is then turned into a formal agreement to provide a particular volume of activity for a certain amount of funding.
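By way of illustration, here is a minimal sketch in Python of the arithmetic just described. The teaching hours, subject weighting and rate per weighted SUM are invented for the example and are not the council's actual tariff.

```python
# Minimal sketch of the grant-offer arithmetic described above. The
# subject weighting and the rate per weighted SUM are invented for the
# example; the real tariff is set by the funding council.

TEACHING_HOURS_PER_SUM = 40  # one SUM is a 40-hour increment of teaching

def weighted_sums(teaching_hours: float, subject_weight: float) -> float:
    """Convert planned teaching hours into weighted SUMs (WSUMs)."""
    return (teaching_hours / TEACHING_HOURS_PER_SUM) * subject_weight

# An offer of grant boils down to two figures: a target volume of
# activity and an amount of money.
target_wsums = weighted_sums(teaching_hours=400_000, subject_weight=1.3)
rate_per_wsum = 55.0  # hypothetical pounds per weighted SUM
offer_of_grant = target_wsums * rate_per_wsum

print(f"Target volume: {target_wsums:,.0f} WSUMs")
print(f"Offer of grant: £{offer_of_grant:,.0f}")
```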
In the following year, through our statistics collection processes, we assess whether colleges have delivered the volume of activity that they agreed to deliver in the formal agreement. That process is audited and, provided that we get a satisfactory audited return for each college, there is nothing further to do. If a college falls short of its target, and depending on how far short it has fallen, there is likely to be follow-up action, which will depend on the specific circumstances. Typically, we would recover the funding that goes with the volume of provision that the college did not manage to provide under its agreement with us, subject to a de minimis amount.
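The recovery rule lends itself to a similar sketch. The proportional clawback and the 2 per cent de minimis threshold are assumptions for illustration; the evidence does not state the council's actual tolerance.

```python
# Sketch of the recovery rule described above: funding proportional to
# the shortfall is clawed back, subject to a de minimis tolerance. The
# 2 per cent threshold is an assumption for illustration only.

DE_MINIMIS_FRACTION = 0.02

def clawback(grant: float, target_wsums: float, delivered_wsums: float) -> float:
    """Return the amount of grant to recover for under-delivery."""
    shortfall = max(0.0, target_wsums - delivered_wsums)
    if shortfall / target_wsums <= DE_MINIMIS_FRACTION:
        return 0.0  # within tolerance: nothing further to do
    return grant * (shortfall / target_wsums)

# Example: a college funded at £10m for 130,000 WSUMs delivers 124,000.
print(f"£{clawback(10_000_000, 130_000, 124_000):,.0f} recovered")
```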
It is a condition of our funding that colleges must deliver their provision to an acceptable level of quality, which has been defined and discussed with the colleges. We assess whether they are achieving that through the inspection cycle, which is conducted on our behalf by Her Majesty's Inspectorate of Education, colleagues from which are with us today. Colleges are inspected on a four-year cycle—at present, we have completed three quarters of a cycle. Each inspection covers a standard range of items; it examines performance in particular subjects and looks more generally at how colleges manage the education process as a whole. A detailed report is produced and discussed with the college, which has the chance to challenge the report if it thinks that it is unfair. When the report has been agreed, it is published.
Every report comes to me and to the chairman of the funding council to decide whether to agree to the recommendations of our staff on what follow-up action should take place. We have a fairly straightforward rule: if there are significant indicators of weakness in the report, the college will receive a letter immediately that draws attention to the matter and which, depending on the degree of weakness, requires the college to produce a plan of action as to how it will tackle the deficiencies. We set a time scale against which the colleges have to do that, and they are then reinspected by HMIE to ensure that they have made the necessary improvements.
That was an outline description; we can go into more detail if you want.
You talked about making specific requirements of colleges when the funding negotiations are under way. Will you give me an example of what one of those requirements would entail?
Most FE colleges in Scotland run higher education programmes, for example. It is a condition of funding that they do not charge fees for full-time higher education students.
You do not set targets for colleges in areas such as efficiency. Do you consider that there is a gap in the chain of accountability between colleges and Parliament, in respect of college performance?
No, not really. College efficiency, and how we approach it, is worth a little exploration. Colleges are under extremely strong pressures to be efficient, and those pressures should be explained. Once they have been, I hope that you will see why we do not set a separate efficiency target: the other targets that we set all come together to put pressure on colleges to be efficient.
I described the first pressure at the beginning of this evidence session. Each college receives an annual allocation for a given volume of activity. For most colleges, that allocation is the majority of their income. Any other income that the colleges generate will generally require corresponding expenditure, so, in terms of balancing their books, the grant from the funding council is far and away their most important source of funds.
Each college is funded at the same rate for producing the same outcome. There are variations in the funding that I could go into, but each college's production of a unit of tuition, if you like, is funded in the same way. That level of funding is, in effect, set each year by the Scottish Executive. Every college in the land has to live within that relationship between funding and delivery of outcome. I do not mind telling the committee that the funding is not generous. The committee will know from other evidence that the financial difficulties that are faced by colleges are an indication not that the majority of college management is hopelessly inefficient, but that the colleges have been under a great deal of financial stress in the past few years. That is the first pressure.
Even if a college wanted to be inefficient, what could it do, especially given that its income is fixed and tied to a given volume of output? It might try to reduce the quality of its output, but as I explained in my second answer, there is a rigorous and relentless process for quality assessment so there would be no escape there. It might decide not to spend any money on its estates, but it has to provide estates strategies, the prime objective of which is to eliminate unsatisfactory accommodation. That is also followed up by the funding council.
The only route left to an inefficient college is to let its finances get out of control and fall into financial difficulty. The funding council monitors the finances of every college in the land three times a year, analysing them in detail, and when we find that a college is in financial difficulties we follow that up closely and require the college to produce a recovery plan.
The expectations and requirements are clearly set out in the financial memorandum and in our conditions of funding to the colleges. I do not believe that there is a college in the land that does not understand those pressures, which in some ways are analogous to market forces in the private sector. The pressures relating to income, quality, estates and financial monitoring are all so tight that it is hard for a college to be inefficient to a significant extent, although that is not to say that some colleges could not learn from others regarding individual items of expenditure.
If the committee would like to follow up that question, I would be happy to comment on our work on the detailed benchmarking of costs, which helps individual colleges to see where they can improve their performance in specific, detailed areas. We have launched that exercise and it is under way, funded by the Scottish Further Education Funding Council. Every college will have the same rules applied to it so that we have consistent and truly comparable definitions and analyses of costs. The colleges can make good use of that information.
Please be upstanding for two minutes' silence for those who gave their lives for future generations.
Thank you.
Mr McClure said that he would be glad to say something about benchmarking in relation to college efficiency. Have there been discussions about sustainability benchmarking and sustainability targets to help, guide and instruct the colleges?
Not in the sense that you might mean. If you are talking about wider environmental sustainability, the sector is certainly aware of those issues. However, we are focused on a much narrower exercise that looks at the actual monetary costs that are incurred and tries to help colleges to balance their budgets.
The committee will be well aware from other reports that the Auditor General has produced that the sector has been through some difficult financial times and the Scottish Further Education Funding Council has put a high priority on helping the sector to put its finances in order. We have focused on the monetary aspects, although I recognise what you are saying and, with my other hat on, I would say that the higher education sector is becoming increasingly aware of the sustainability agenda. I have no doubt that the further education sector will move on to that agenda.
I would like further clarification on accountability. The Scottish Further Education Funding Council is accountable to the Parliament; you are the accounting officer, and you have to report on how the £400 million is spent, but that £400 million is then handed over to the colleges. To whom are the colleges accountable, to ensure that that money is well spent?
Under the legislation, the money is passed from the council to the college's board of management, which is responsible for the proper expenditure of that money, securing best value and value for money. That is all secured in the financial memorandum between the council and each college, and the boards of management are accountable to the council for how they use the funds that it has allocated to them.
So there are no lines of accountability back to the Parliament or to you for the spending of that money.
I am the link; I am the accounting officer for the funding council and you call me here to answer questions.
What actions can you take to ensure that colleges do not fall short in delivering expected standards or in their performance?
Let us take some examples, the simplest of which is what happens if a college fails to deliver the volume of education for which it has received funding. We do not apply the rules blindly and rigidly, but we ensure that we understand why a college's performance is what it is, and if we are satisfied that the college has fallen short without a very good reason, we recover the funding that went with the shortfall. In other words, the college is at least not paid for work that it did not do.
If a college gets a poor or critical inspection report, we follow that up immediately: I write to the chair of the college—again, the link is with the board of management—and require the college to provide within a set time scale, which is usually a month or so, a plan explaining how it will correct the deficiencies that HMIE has found. If the case is sufficiently serious, we set a time by which the college should have carried out the remedial activity, and HMIE reinspects the college. We receive HMIE's reinspection report, which we also follow up. If a reinspection report highlights deficiencies, that is a very serious matter. More usually, when a reinspection happens—it does not happen that often—we are able to say that we are satisfied with the action that the college is taking because HMIE tells us that it is satisfied, and that deals with that issue.
If the problem is a financial matter—if a college is not balancing its books—we intervene strongly and require the college to prepare a financial recovery plan, which has to be prepared to our satisfaction. We then monitor the college quarterly and it has to provide additional returns to show that it is sticking to its financial recovery plan.
I assure you that, in those key areas, the follow-up is strong.
I have a follow-up question on the efficiency of colleges. There must come a point when no further efficiencies can be gained in the unit cost—the student cost—for particular courses. I am aware that we are talking about a £3,000 funding gap for higher education courses that are delivered in colleges, as opposed to those that are delivered in universities, which causes a number of colleges significant difficulties. How will the funding council overcome that problem?
It would not be right to compare directly the levels of funding in higher and further education. I should explain to the committee that we fund both those sectors on an average basis. For example, in further education colleges, we fund on the weighted SUM that I described, which is applied to higher and further education. There is no differential rate for higher and further education, nor are there differential rates for the different years of a programme—if we start funding in year 1 of a programme, the college does not get different funding for a student who is in year 4.
The same thing applies in higher education. We do not differentiate between the first year of a degree programme, for example, and a postgraduate programme. However, if one were to look at the way in which universities and colleges of higher education deploy their costs, one would find that much more is spent per student on a postgraduate course than on a student in the first year of a degree. One would probably see changes during the course of the degree.
Although it is tempting for FE colleges to make the comparison, I am not sure that it is valid. We are comparing two averages. If we were able to carry out a study that showed what colleges were spending typically on higher national students compared with what universities spend typically on first-year degree students, I would not be surprised if universities were spending less on some of their degree students. There can be large groups of degree students and that is, largely, how the cost is driven—the staff-student ratio is by far the biggest influence on the cost that is allocated to a particular student group.
You asked how far we can go with efficiency. The UK Treasury has taken the view—for a century, I think—that 1.5 per cent a year is achievable. However, techniques change. I remember that, when I first came into education administration with the University Grants Committee in the mid-1980s, people said that if the staff-student ratio went beyond 10:1, the sky would fall in. Now, that ratio in universities is typically well in excess of 20:1. One will not find many universities that will say that their education is of a lower quality; they say that it is different, but not necessarily of a lower quality.
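A short calculation illustrates why the staff-student ratio dominates the cost per student, as mentioned earlier. The salary figure is invented.

```python
# Why the staff-student ratio dominates unit cost: staffing cost per
# student falls in inverse proportion to the ratio. The annual staff
# cost figure is invented for illustration.

def staffing_cost_per_student(annual_staff_cost: float, students_per_lecturer: float) -> float:
    return annual_staff_cost / students_per_lecturer

for ratio in (10, 20):  # the 10:1 and 20:1 ratios mentioned above
    cost = staffing_cost_per_student(40_000, ratio)
    print(f"{ratio}:1 -> £{cost:,.0f} per student per year")
# Doubling the ratio roughly halves the staffing cost per student.
```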
With the advent of new technology and e-learning, although it is expensive to produce the necessary materials, one can spread that method of learning across a large population and the unit cost will come down rapidly. It is difficult to say at any point, in relation to efficiency, "Thus far and no further." All that we can do is work at a sector level from year to year, through the spending review, in which the Scottish Executive decides what it thinks is the right level of resource for a certain level of output. We can also implement the benchmarking study and publish other indicators to help institutions compare themselves in detailed areas in which they might be able to effect some improvement. The culture in both sectors is geared towards continuous improvement. We expect them to work continuously to try to improve in a range of different areas.
Just before we stopped for the two minutes' silence, you made a point about benchmarking. Our predecessor committee first mentioned that matter several years ago. Colleges are seeking more analysis of information. Are you satisfied with the progress that is being made on benchmarking?
I am satisfied with the particular exercise to which I referred, which is part of the campaign for financial security that the funding council launched jointly with the sector. That was quite new and the principals came together with us to launch the campaign. We have been through the stage of tendering the work, which was a major exercise that involved going through the EU procedures. We have appointed a consultant, who has done pilot work. A group from the sector is assessing that pilot work to ensure that the definitions will work.
As I am sure members know, benchmarking rests on consistent definitions and figure work. As soon as that falls apart, one is left with results in which people do not have confidence and they simply will not pay any attention to them. A lot of effort is being made to ensure that the methodology is consistent and appropriate and that it can be applied to every college. Colleges do not organise themselves in exactly the same way, so whichever way one does it, one finds oneself cutting across the organisation of a particular college and asking the college to allocate its resources in a way that does not quite match the way it organises itself—into schools or different departments or whatever. Unless one can enforce rigidly that consistency of definition, one will not end up with a useful result. That is the process that we are going through at the moment.
We are making good progress, but the proof of the pudding will be in the eating when the consultants produce a detailed report for each college. Each report will cover a college's entire expenditure, item by item, and provide comparisons with a sector mean; it will also enable comparisons to be made with other groups of colleges that might better represent the particular college's circumstances.
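The comparisons that each report is described as providing can be sketched as follows, with invented figures standing in for the consultants' data.

```python
# Sketch of the comparison each benchmarking report is described as
# providing: a college's cost on one expenditure item against the
# sector mean and a peer-group mean. All figures are invented.
from statistics import mean

cost_per_wsum = {'A': 11.2, 'B': 9.8, 'C': 14.1, 'D': 10.5, 'E': 12.0}
peer_group = ['A', 'C', 'E']  # hypothetical group of similar colleges

sector_mean = mean(cost_per_wsum.values())
peer_mean = mean(cost_per_wsum[c] for c in peer_group)
print(f"College C: £{cost_per_wsum['C']:.2f} per WSUM "
      f"(sector mean £{sector_mean:.2f}, peer mean £{peer_mean:.2f})")
```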
As there seem to be no further questions on comparative information about colleges, I would like to move to performance information on ministerial priorities.
Will the witnesses tell us a little more about the relationship between the four ministerial priorities for further education in Scotland and the funding council's five corporate goals for further education in Scotland? If I were a college principal, where would I look for my guiding principles?
We probably have a chicken-and-egg situation. The council developed its five corporate goals in its first corporate plan, in 2000. That plan ran from 2000 to 2003. It has been updated each year but has held to those five broad corporate goals. The four ministerial priorities that Audit Scotland has identified are, I think, taken from the most recent letter of guidance that is available to us, which would have come out in December 2002. We do not expect the minister to fit the objectives for a particular year to our corporate goals. However, we ask ourselves whether our corporate goals adequately cover the ground that the minister has put in the letter of guidance. We were broadly satisfied that that was the case. In any case, we would take the letter of guidance as a kind of agenda, whether or not it included something that we might not have included specifically in the corporate plan.
Let me turn on its head something that you said there. You said that you would not expect the minister to make his priorities fit the funding council's corporate goals. Should we expect the council to take steps to ensure that its corporate goals better fit ministerial priorities?
That has to do with the level at which things are set. The corporate goals are set very broadly. We have just gone through the process of preparing a new corporate plan for the funding council. Again, we say that we have four broad aims, but they are very broad. The plan is about to be published, having been approved by the minister. We talk about improvement in learning and skills; fair access and progression; the creation and transfer of knowledge—I should have mentioned that this is a joint corporate plan for both further and higher education; and a coherent system of well-led and innovative institutions. You can see the level at which those are set. We would be surprised if any future guidance letter plucked something out of the air that was outside the scope of those broad aims.
Within those broad aims, we set specific targets. If necessary, those targets will be adjusted in the light of a letter of guidance. Nobody can predict how, in two or three years' time, circumstances in the country might change and necessitate specific guidance from the minister that has not been fully foreseen in the corporate plan. If that happened, we would go with the letter of guidance.
Let me put that to the test in relation to one specific area—planning for future supply and demand. I note that you said that ministerial priorities are very broad, and the implication was that the council's corporate goals were less broad. The ministerial priorities specifically identify the development of "Skills for tomorrow's jobs".
The closest match among the council's corporate goals is
"A sector where the pattern of provision meets Scotland's needs".
I accept that such statements are very much open to interpretation, but my reading is that the ministerial priority is a deal more specific and identifies something that politicians and employers are increasingly identifying as being one of the key issues that needs to be addressed.
That is exactly what I said. I did not say—or, at least, I did not mean to say—that the ministerial objectives that are set out in the letter of guidance are broader than our goals; what I said was exactly the other way round. I said that our goals were broader and that we would expect the ministerial priorities to be embraced by them, in exactly the way that you have just illustrated.
The statement
"A sector where the pattern of provision meets Scotland's needs"
is very broad. Within that, the minister has recently begun to put emphasis on matching skills to jobs, which, as I think you said, is a subset of that much broader goal. Therefore, there is no conflict as far as we are concerned.
Okay. It is reassuring to hear that there is no conflict in that area. What is the funding council doing to gather data in that area?
On "Skills for tomorrow's jobs" specifically, the letter of guidance says that it recognises that the council has to pursue such a role in partnership with other stakeholder bodies. If one dissects that, one finds that the content of courses is determined by the Scottish Qualifications Authority. The funding council does not have any direct remit in what skills are included in programmes. As you would expect, we meet the SQA regularly—we have a quarterly meeting with the organisation and there are other meetings at officer level all the time. We do not have a remit to determine what the make-up of courses should be. Our remit is more to do with creating the capacity in Scotland within which appropriate programmes can be delivered. Our contribution to that includes ensuring that we meet the Scottish Executive's targets on the volume of provision that is produced. If we fell down on that, one would expect there to be less chance of the "Skills for tomorrow's jobs" requirement being met.
I have already referred to the HMIE inspections. A component of the routine inspection has to do with programme design, the relevance of programmes and how far colleges are engaging locally with employers, the local enterprise companies, local labour-market intelligence and so on. That is a further part of our contribution to the overall effort.
The report refers to the employer satisfaction survey, which demonstrates that the sector is doing pretty well in meeting employer needs. That is another contribution that we can make to the ministerial objective of "Skills for tomorrow's jobs".
That is what we are doing now, but there are things that we are planning to do that will add to that. Earlier this year, the council, along with the Scottish Higher Education Funding Council, agreed to put significant resources into a longitudinal survey of what happens to people once they pass through FE and HE. It is a great pity that that has not been done before because, if it had been started 10 years ago, we would now have data that we could use. The survey that we have launched will be a bit like the Scottish household survey; through it, we will be able to see what happens to people in later life, how they have used their further and higher education and what impact it has had on them, which will be very interesting.
Area mapping was mentioned, which we think is probably the single most important way in which we can understand what is happening in detail throughout Scotland, where there are such different communities and such variety of need. We have done one such exercise, which was partially successful, but it was quickly recognised that it was partial and the process of planning for and delivering the second exercise is already under way. It will be much more comprehensive and will involve Scottish Enterprise, which, as the committee knows, funds volume training in Scotland. In due course, we would also like to involve local authorities and voluntary organisations, although that will be much harder, because the scale of such involvement will be much bigger.
An assessment of whether further education is adequate in an area cannot really be made unless provision of a comparable kind is taken into account. In fact, legislation requires colleges, when they are deciding what provision to make, to take account of other provision in the area. Such provision must be brought into the analysis, which is now being done. There is a steering group with all the different stakeholders and I am glad that Audit Scotland has agreed to be an observer on that steering group.
The exercise is significant in trying to achieve a much more sophisticated and intelligent understanding of what is happening in a complex public service. We intend to repeat it every couple of years and to do analyses on industries and on a regional basis. We want to see that what colleges and other providers deliver matches up to perceived demand.
I want to clarify something. I hear what you say about work that is under way, but how robust is the information that is currently available to you to enable you to assess how effectively colleges are meeting the needs of the Scottish economy and local economies?
That is an exceedingly difficult question. There are figures from the employer survey in the report: some 80 per cent of employers believe that students who come out of further and higher education are well prepared for work. At the other end of the spectrum, if I were asked how well the Scottish economy is doing and how much of its lack of performance is due to further education, I do not think that I could answer such questions. We know that interactions between employers and colleges are extremely strong. The inspections that I have described, which cover three quarters of the colleges in the land, consider closely such links at a local level as well as programme design and so on, and they show good results.
The question whether that investment of public funds has the hoped-for impact on the economy is difficult to answer—I have exactly the same difficulty with higher education. Some people would say that the economy would be much worse if we did not invest such money and some might say that colleges are not doing what they should be doing and the economy ought to be much better. Others would say that training has a certain impact on the economy, but that the economy's performance has to do with many other complex and global factors. The web is so complex that one can never satisfactorily say whether the sector is doing its bit to support the economy.
I have a final practical question. Imagine that I am a college principal, sitting in my local economic forum, and that there is universal agreement in that forum that there is a profound shortage of joiners in the area, which is likely to continue for years to come. Imagine that I think that I can go some way towards meeting the local demand. To what extent would it be in my gift to do so? To what extent would I need to go cap in hand to the funding council and make the case for additional sums?
It would not only be in your gift to meet such demand, but doing so would be a positive requirement. There is a positive requirement on colleges to respond to local needs. Our funding agreement with colleges does not differentiate between what colleges produce; there is a pure volume measure. What goes into that volume, its make-up and the make-up of courses is entirely a matter for colleges, but we assess their responsiveness to local needs. When we consider strategic plans and so on, we consider how well colleges appear to be identifying, understanding and responding to local needs. Meeting local needs is not just in the college's gift—doing so is a positive requirement of funding.
I want to follow up on Susan Deacon's questions about the information that you seek to demonstrate how effectively colleges are supplying to meet demand. What is the timetable for having all that information in place?
The steering group has already begun to meet. We expect the main consultancy work to be carried out next year. If I remember correctly, we expect to have the results of that early in 2005. It is a major undertaking.
Will that influence future funding for colleges?
It is possible—one might even say desirable—that it will influence the distribution of funding among colleges. That is one of the reasons why the exercise has to be extremely robust. If it is going to be used to shift funding from one college to another on the basis that priority in one area is greater than in another, we have to be on pretty secure ground. As I indicated, shifting funding from one college to another will certainly inflict financial pain on the first college, which will have to adjust. We would not wish to engage in that until we were confident that we understood what was happening. We would also need guidance from the Scottish Executive on priorities, albeit at a high level. Those priorities have to be set by the political process, because that is how the funding is arrived at.
Would you consider using some of the principles that applied to health service funding with the introduction of the Arbuthnott formula, which covered rurality and deprivation? You would have to consider unemployment levels, for example.
Those factors already apply in our funding system. There are remoteness elements and social inclusion elements, which are intended to compensate colleges for the additional costs that tend to be associated with attracting students from particular backgrounds. It becomes more difficult when participation levels vary from region to region as a result of supply and demand. That is what the first exercise showed, although it was a partial account and we did not have sufficient confidence that it was telling us the whole story. If we had the whole story and there were different participation levels between different parts of the country, a judgment would have to be made about whether we could live with that or whether our objective should be to bring greater equality to participation rates. Unless the Scottish Executive provided the funding necessary to bring everybody up to the highest level, the participation level in some areas would go down. Those are difficult judgments and are for the Government of the day rather than the funding council to make. However, nobody will be able to make such judgments until we have reliable information on which to base them.
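A sketch of how such compensating elements might enter the funding rate follows. The uplift values, and the additive structure itself, are assumptions rather than the council's actual method.

```python
# Sketch of the funding elements described above: a base rate per WSUM
# with remoteness and social inclusion uplifts. The rates, and the
# additive structure, are assumptions for illustration only.

def funded_rate(base_rate: float, remoteness_uplift: float, inclusion_uplift: float) -> float:
    """Rate per weighted SUM after the compensating elements are applied."""
    return base_rate * (1 + remoteness_uplift + inclusion_uplift)

# A remote college drawing many students from deprived backgrounds:
print(f"£{funded_rate(55.0, remoteness_uplift=0.08, inclusion_uplift=0.05):.2f} per WSUM")
```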
Will you explain the difficulties that you had in the previous attempt at an industry exercise? Will you elaborate on how you hope to overcome them in the new exercise?
I am afraid that the industry exercise was an unhappy experience. The work was commissioned in a perfectly normal way: a tender was put out, consultants were contracted and the work was done. At a fairly early stage, we were unhappy with the reports that were being produced. All I should say is that we got into discussions with the consultants that were not productive. We got to a point at which we did not think that the consultants would be able to produce a report to the standard that we expected.
The consultants did not wish to do any more work and we withheld the final payment, which immediately introduced a legal situation whereby the consultants wanted to sue the council. We took legal advice and, in the end, a settlement was reached, which—I am pleased to say—meant that we still avoided paying a substantial chunk of the full cost of the exercise. The situation was very unsatisfactory because, as soon as we got into the legal situation, none of the material could be used because it was covered by the legal process. By the time the situation was resolved, the material was not worth publishing because not only was it not of publishable quality, but the data were getting too old to be valuable anyway.
I am afraid that that is the sorry saga of the industry-mapping exercise. I believe that the council did everything it could in how it specified the work and in trying to ensure that, when the work began to go wrong, it got back on track immediately. When it did not work out, we took legal advice and wrapped it up on the basis of getting the best value for money that we could.
The important part of your question was about whether we can learn from that for the future. The answer is very much so. We will do what we can to ensure that the specification is as detailed and clear as it can be. However, there is a real difficulty. When we took legal advice about whether we should rise to the challenge of being sued in court, our legal advisers told us that, as with any major report in a complex area, we would need an expert witness to go through the specification and the report and then say whether the report met the specification. They said that, even then, our opponents would do the same and that they could give us no guarantee that the court would find in our favour.
With such complex work, unless one writes something that is so detailed as to be unwieldy and unworkable, it is hard to see how that situation could be avoided. Fortunately, it does not happen very often. It is the first time that it has happened in my experience and I very much hope that it will not happen again. We will do whatever we can to specify the study. We will not have a separate industry study this time; we are carrying out one holistic study, which is a much better way of doing it. We did not have any problems with the bigger, area-mapping study that covered the whole of Scotland geographically. That went absolutely fine. The consultants did a very good job in the circumstances and we had no difficulty with them.
Does that mean that the geographic exercise will be built upon, or is it to be redone as part of the new exercise?
We will take what we can from the original work on the industry mapping, but that goes back two or three years and it is a fast-moving area—the average FE programme lasts for less than a year. I do not think that there is much of the previous exercise that we can use. We learned a bit about contacts and mechanisms—the ways in which industry influences colleges and how colleges work. There is a certain amount of background information that we can use in the new study, but I am afraid that we cannot really use the figures and the gutsy stuff that we got.
How long will it take to complete the new piece of work?
As I said in answer to a previous question, we expect to have the results by spring 2005. We are planning the exercise at the moment. The consultancy work will go on through 2004 and will cover the whole of Scotland, area by area, on a geographical basis, as well as the industry work. There will then have to be time for us to digest the consultants' results and to finalise reports, which we expect to publish in spring 2005.
I would like to take you back to the point that Margaret Jamieson raised about assessment and the criteria that are used in a variety of areas. I am thinking specifically of the city of Edinburgh, where a judgment requires to be made on future needs, in terms of where the economy is going, against current wants—for example in relation to social exclusion, which might apply in other areas. Are we adequately addressing the need to front-load the Edinburgh economy, or is there a danger that some of the criteria that have been assessed miss out the intangibles about where we are going as opposed to the reality about social exclusion, which does not exist?
There are a number of issues. One is whether it is possible to get leading-edge intelligence that will help us to plan adequately. I am afraid that the record has not been terribly strong in a wide range of areas, as I am sure members will know. I will take a higher education example—I hope without offending anybody here. The planning for initial teacher education is an exercise that is carried out every year. We have had a lot of difficulty trying to match the provision of places to the number of trainee teachers. At first sight, one might think that we would know what the demands will be, but in fact that has proven to be difficult.
The first question is whether it is possible to get leading-edge intelligence. The second is who should collect and disseminate it and cause things to happen. There could be a central body that tries to collect and interpret the information. It could come up with the answers and could then require the colleges to respond to them. Alternatively, the responsibility could be put on the colleges. They could be told that it is their job to keep in touch with the sources of intelligence, to understand what they mean in their sphere of influence and to respond to them. We currently operate in the latter context. The funding council can, through inspection, ascertain whether colleges are being responsive. The funding council has a different role in trying to support the process of identifying the relevant intelligence and bringing it together so that colleges can use and respond to it.
There are examples of us beginning to support colleges more. We worked closely with Scottish Enterprise Glasgow on the construction requirements there, which are currently very much to the fore. Scottish Enterprise Glasgow has been working directly with the colleges to find out how they can respond more effectively to that particular need. Part of the difficulty in Glasgow is that the need has arisen far faster than people have been able to respond. We should ask ourselves why that has happened. It will take the colleges quite a long time to train enough people to meet the construction need in Glasgow, and I suspect that a lot of people have to be imported into Glasgow to meet the immediate need there. That relates to the original question about intelligence and how it is collected and identified.
To an extent, you are flying blind in trying to measure the extent to which you are meeting the needs of the community and closing the skills gap, are you not? You do not have the fundamental research in front of you to tell you whether you are meeting the targets.
The supply-and-demand survey will tell us that.
That is what I am saying: you do not have the information at the moment.
That is why we have introduced the exercise. There is a statutory requirement on the funding council
"to secure the provision of adequate and efficient further education in Scotland",
whatever that means. We interpret that to mean that demands and needs are being met locally as far as they can be reasonably assessed, and we developed the idea of the supply-and-demand exercise as a result.
You will have found that it is taking a long time before the answers are available. We would like to have the answers tomorrow but, if the exercise is to be robust, we cannot complete it any faster than is sensible. We would like to establish a rolling trend. We will probably carry out the exercise every couple of years and, each time, we will try to build on the work that we have done before. The survey will become a reliable indicator of how the world is developing and of the trends in a particular area. As the information from the exercise accumulates, it will become increasingly meaningful. However, in the initial stages, when we first do it, it will be a big data collection and analysis exercise.
My question is how you make decisions about shifting bundles of money from one part of the sector to another without those basic data. What do you base such decisions on? You do not have any information at all to base them on.
That is not quite the case. If a college is not recruiting up to its numbers—in other words if there seems to have been a fall-off in demand in one area—we will claw back money from that college, which will return to the central coffers. If the shortfall is significant enough, that will affect the college's allocation the following year—it will have the effect of reducing the volume of activity for which that particular college is funded in the next year. There is an adjustment process. If a sufficient shortfall leads to money being clawed back, those colleges that exceed their target can get additional funds as a result.
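The adjustment process can be sketched end to end. The pro rata redistribution rule shown here is an assumption for illustration; de minimis tolerances are omitted and all figures are invented.

```python
# Sketch of the adjustment process described above: money clawed back
# from colleges that fall short feeds additional funds to colleges that
# exceed their targets.

def redistribute(colleges: dict) -> dict:
    """colleges maps name -> {'grant', 'target', 'delivered'}.
    Returns the net funding adjustment in pounds per college."""
    adjustments, over, pool = {}, {}, 0.0
    for name, c in colleges.items():
        rate = c['grant'] / c['target']      # funding per WSUM
        diff = c['delivered'] - c['target']
        if diff < 0:
            adjustments[name] = diff * rate  # negative: clawed back
            pool += -diff * rate
        else:
            adjustments[name] = 0.0
            over[name] = diff
    total_over = sum(over.values())
    if total_over > 0:
        for name, diff in over.items():      # share the pool pro rata
            adjustments[name] += pool * diff / total_over
    return adjustments

example = {
    'College A': {'grant': 8e6, 'target': 100_000, 'delivered': 95_000},
    'College B': {'grant': 6e6, 'target': 80_000, 'delivered': 84_000},
}
print(redistribute(example))
```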
In the main allocation process, if there are sufficient funds in the baseline to fund expansion, the letter of guidance will identify the main priorities that the minister wants the council to support. Typically, the agenda has been one of widening access. For example, we have put in money for part-time provision and we have identified those colleges that are drawing students from the lowest socioeconomic groups—or whatever definition you want to use—and used such criteria to support a specific ministerial objective.
I am conscious that we have extended that area of questioning considerably, as members will realise when they come to ask further questions. Margaret Jamieson has a question for Mr Donaldson.
I have a question about the reviews of further education colleges, which include comments on how well colleges meet the needs of the communities that they serve. How do you make those judgments? We have heard a lot about the research that is going to be undertaken by SFEFC. Will the results of that research assist you? Will you take account of other quality assurance audits of further education in future?
I will expand a little on what Mr McClure said about the nature of our work with the sector and set my answer in that context. As he said, we operate under the terms of a memorandum of understanding that we drew up with SFEFC in 1999-2000. That memorandum requires us to carry out reviews of all 46 colleges within the four-year period to July 2004. For each college we undertake subject and college reviews, using a set of quality indicators that have been developed jointly by us and the sector. There is an agreed framework within which the evaluation of the work of colleges can take place. That set of quality indicators is intended to be a tool for colleges to use for internal self-evaluation, and for us to use for external evaluation, so the indicators form the basis of a common language that we can use to talk about the nature of quality when reviews are conducted. We use a four-point scale in our evaluations, ranging from very good to unsatisfactory.
As Roger McClure said, the reports that we produce following inspections are published by us and provided to SFEFC, but they are independent reports. We take responsibility for the content. Specific quality elements and quality indicators are used within the overall framework by the team of HMIE and associate assessors. Associate assessors are people drawn directly from the sector whom we train and who work with us in the inspection process; therefore, within the inspection process as a whole there is a strong element of peer review, as well as external review that is undertaken by HMIE. Much of the effort that we put into developing our staff—both associate assessors and HMIE staff—is about ensuring consistency of interpretation of the quality indicators.
There have been a number of discussions about benchmarking this morning, and an important part of what we do involves benchmarking information. We are using our team's collective experience of the variety of colleges that it has engaged with to build a continuous improvement process inside the sector; the evidence that is gained as part of the review process is used to inform the improvement agenda. Our role is not only to exercise an accountability function but to take good practice that we find in one area and ensure that it is built into the evaluations that we are undertaking across the sector. An engine of improvement is built into the quality assurance process.
From our point of view, better external reference points assist that process of benchmarking, and the discussions that the committee has been having with Mr McClure relate to that. Sensitivity to the local labour market is an important part of what we consider in the context of programme design. The inspection team interacts directly with the college and with local employers to determine how the college is engaging with and serving its local labour market. However, higher-quality information about the economy as a whole would assist that process. The kind of exercise that Mr McClure talked about would improve the sensitivity of the inspection process.
I was involved in the Enterprise and Lifelong Learning Committee's lifelong learning inquiry last year, during which concerns were raised about the volume of quality assurance that colleges were going through. I am conscious that various bodies are involved in that process. Could you say something about that, and about what you see as the future direction?
Do you mean the burden on colleges of the different types of inspections and so on?
Yes.
It is fair to say that by far the biggest component of quality assurance is the college's own systems, which should be an inherent part of the college's activity. No excuse needs to be made for that—it goes with the territory of making education provision. There has to be a rigorous quality assurance process in the institution. Only the institution can assure the quality of what is done—no inspector or funding council can do that. What we can do is support the colleges and inspect on behalf of the taxpayer to try to ensure that the colleges are assuring the quality of what is done. There are areas where colleges feel that they are subject to too much scrutiny, one of which is where they are undertaking volume training provision. They need to meet the SQMS—Scottish Quality Management System—standards.
I am pleased to say that, in discussions with Scottish Enterprise, we have agreed a mechanism whereby a kind of credit arrangement works, so that if a college has been inspected by HMIE and has passed parts of that inspection satisfactorily, that reads across into achieving SQMS standards. That has come into play in the past six or eight months. Now when I write to colleges confirming the results of their inspection, the letter identifies which of the SQMS standards they are deemed to have achieved—they do not have to be reinspected to achieve those standards, so at least there has been a positive step forward in that area.
It is a difficulty. The burden probably lies less in the case of quality inspection and more in the case of producing similar data for various organisations for funding purposes and so on, where, unfortunately, the definitions and data requirements always vary just enough for each organisation to insist on having its own data. That is where we need to consider having a common set of data, as that would improve things enormously.
The involvement of various bodies in different kinds of quality assurance means that it is quite a crowded field. It is important to distinguish those aspects, such as Investors in People, that a college decides on its own initiative to take on but which are not part of the compulsory framework within which it operates. The area that Mr McClure referred to when he spoke about the relationship between the work that we do and SQMS is critical. There has been good progress to achieve cross-recognition of the work that is done.
We are conscious of the need to continue that work, particularly in the area of the information that we ask of colleges. We need to ensure that the information system that we are using does not require us to ask for the same information that the college has given to someone else in a slightly different form, so that the college has to translate the information into a particular form for us—quite rightly, that is an irritation for the college concerned.
Rhona Brankin referred to the lifelong learning inquiry. Following on from that, a review of quality assurance systems is being undertaken by the department. My own belief is that there is further scope for rationalisation in that area.
I think that all the points have been covered. We move on to address efficiency information, including unit costs.
In an answer to an earlier question, Mr McClure, you made the cogent argument that, in order to ascertain whether underfunding was taking place, the per capita unit costs needed to be known. Back in 2000, I think, the Audit Committee recommended that the funding council should develop performance indicators. In particular, we recommended that the funding council should commit itself to the further development of unit cost information. What progress have you made against the commitment that was made to the committee? Are you satisfied with the progress that you have made?
First, I will get the question of the wider performance indicators out of the way. They were published in August 2003—my colleague is about to wave a copy at the committee. It is a comprehensive publication, which was also published on the web in August. It was sent to the chair of every board of management in the land with a covering letter from my chair. The letter pointed out that the document was a management tool and that boards were expected to consider the results in the document and take them into account in the planning and assessment of their own performance. The report describes the fact that, although the wider performance indicators were suspended for a while, they are now fully back on track. The publication to which I am referring is the first robust publication that we have produced on that subject.
As I think the report acknowledges, we have never abandoned the unit cost measure that we inherited. That said, during the first few years of the funding council, the publication of unit costs was delayed. In one case, it was not carried out for the very good reason that the sector was going from one accounting year to another—we had 16 months of financial expenditure in 12 months of activity and it would not have made sense to publish a unit cost in that particular year.
We have to make it very clear to the committee that the calculation of the unit cost, which comes out as a tantalisingly simple set of numbers, is incredibly complex. We are trying to identify the recurring costs per weighted SUM—I mentioned that at the very start—for the teaching activity that is carried out by the colleges that we fund. Colleges do quite a lot of other activity: they run full-cost courses, run residences, provide catering, run farms, undertake special initiatives that might be funded by us or somebody else, and so on. In arriving at the figure, colleges have to do a full-cost allocation exercise and apportion all their costs in order to arrive at the eight independent numbers that we ask them to produce in the exercise.
The process is likely to take nine months from the college's year end. First, we require the audited financial statements. The statement of the eight items that colleges give us has to be reconciled with the audited financial accounts; otherwise we cannot be sure that the figures tie up. To produce the denominator, we need the audited WSUMs position for each college. That process goes on through the autumn, but we would not expect to have it fully resolved for every college until perhaps January of the following year. Those various numbers must then be brought together and checked to ensure that the data are clean and that they make sense. We would expect to publish the data in March or April for the college year that ended the previous July. That is the standard that we aspire to—we expect to achieve it for the year that has just finished. We expect to publish in March or April of next year the figures for the financial year that finished in July 2003.
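Once the apportionment is done, the calculation reduces to a simple quotient. The cost lines below are invented placeholders standing in for the eight numbers in the real return.

```python
# Sketch of the unit cost calculation described above: costs apportioned
# to council-funded teaching (eight reconciled lines in the real return;
# four invented placeholders here) divided by audited weighted SUMs.

funded_cost_lines = {  # pounds; must reconcile to the audited accounts
    'teaching staff': 6_200_000,
    'support staff': 1_500_000,
    'estates': 900_000,
    'other recurring': 700_000,
}
audited_wsums = 160_000

unit_cost = sum(funded_cost_lines.values()) / audited_wsums
print(f"Unit cost: £{unit_cost:,.2f} per weighted SUM")
```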
On the progress that has been made, we have had some bumps along the way in previous years. In the first year, it took us about 12 months to get the figures out, which is slower than the eight to nine months that I indicated. We then had a year in the middle when it took nearly twice the length of time to publish the figures, but there was a good reason for that. For the most recent publication, it took us about 11 months to get the figures out. As I said, we expect to publish the figures against the nine-month standard that I described.
We have been through a period of improving the data, which is the most fundamental thing. As I alluded to when I started, given all the definitional problems and the need for a full-cost apportionment, we had a situation where the figures that we were receiving from colleges were simply not reliable. Publishing information that is known to be unreliable is not a sensible thing for a public body to do. We feel that the data are now in much better shape and, therefore, we expect now that we will publish each year according to the timetable that I mentioned.
Before I move on from that point, it is worth offering the committee some reassurance, in case members think that we are excessively dilatory in this area. Higher education has a comparable UK-wide exercise, which is called the transparency review. The transparency review seeks to identify, for higher education institutions in the UK, the split of expenditure between teaching and research and the split of that expenditure between what is publicly funded and what is non-publicly funded. As there is a fifth category, "Other", higher education has to produce numbers for five categories.
That exercise has now been running for four or five years, with consultants appointed throughout that entire time. It is expected that universities will produce reliable figures in another year or two. Until then, the exercise will not be considered to be fully transparent. I mention that just to emphasise how difficult it is, in such complex areas, to arrive at satisfactory definitions that we can be sure are used consistently by institutions, so that when we finally publish the figures, we do not get garbage out because garbage went in. We have to be sure that we have reliable data.
All that brings me to another question, on which there may have been some shading of opinion between SFEFC and Audit Scotland, although that shading was not significant. The question concerns what we can do with the numbers once they have been produced. The figures that we publish are for the costs of activity that we fund per weighted SUM. Those figures take account of different courses, so they correct for the fact that each college has a different mix of courses. However, they do not correct for different college locations. For example, they do not reflect the differences in remoteness funding that we give to colleges. Nor do they correct for demography. For example, colleges get different amounts of funding per student for the purposes of widening access and social inclusion. The figures already vary significantly for those reasons.
When you run your eye down the list and compare the figures for two years, you find big shifts from year to year for certain colleges. To me, that suggests either that the data are still not reliable or that a college has overshot its target significantly in one year, perhaps in recruitment, and come in slightly under in another. Put together, those two things produce an apparently big change in the college's efficiency. However, I do not think that there is really a big change in the college's efficiency. The college may have more students, but it may not have planned to have more students. Such changes do not necessarily tell me that the college's management processes have been enhanced or improved; indeed, one could say that, in not managing its recruitment better, the college has not managed so well.
That is one difficulty with the figures. A more fundamental difficulty relates to the fact that, although we provide the bulk of the colleges' income, they are encouraged to earn income from other sources. If they manage to make a contribution from that income, they are encouraged to plough it back into publicly funded education. That means that there could be two reasons for one college having a unit cost that is slightly higher than that of another. The first is that the college is less efficient than the other college; the second is that it has been more effective at generating additional income and ploughing it back into its operation, which has allowed it to sustain a higher cost operation.
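A short hypothetical sketch, with invented figures, may help to separate the two effects just described: an apparent swing in efficiency that is really recruitment variance, and a higher unit cost that reflects external income ploughed back rather than inefficiency.

```python
# Hypothetical figures throughout; this illustrates the two effects
# described above, not any real college's accounts.

def unit_cost(recurring_cost: float, delivered_wsums: float) -> float:
    """Recurring cost per weighted SUM actually delivered."""
    return recurring_cost / delivered_wsums

# Effect 1: the same cost base, but recruitment overshoots the target
# one year and undershoots it the next.
overshoot_year = unit_cost(6_800_000, 41_000)   # ~165.85 per WSUM
undershoot_year = unit_cost(6_800_000, 37_000)  # ~183.78 per WSUM
# An apparent swing of roughly 11 per cent, with no underlying change
# in the college's management processes.

# Effect 2: two colleges delivering the same volume. College B earns a
# surplus on other activity and ploughs it back into funded provision,
# sustaining a higher-cost operation without being less efficient.
college_a = unit_cost(6_800_000, 38_000)             # ~178.95 per WSUM
college_b = unit_cost(6_800_000 + 400_000, 38_000)   # ~189.47 per WSUM
```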
We have continued to collect and publish that information. We think that it is of use to colleges but not half as useful to them as the detailed benchmarking exercise that we described earlier, in which every college gets a detailed report covering every area of its expenditure. Colleges will be able to base judgments and decisions on that information.
The other information is useful, but will not greatly affect the efficiency of the sector. To go back to my first answer, the factor that will affect the efficiency of the sector more than anything else is the level of funding per student that the Scottish Executive determines in any one year. I have not been in Scotland long enough to have a feel for the run of figures but, when I worked at the Further Education Funding Council in England, I observed that, over four years, the unit of funding declined by 40 per cent. That was the result of a combination of expansion targets and levels of funding set by the Government. Efficiency came from distributing that money to colleges and ensuring that they balanced their books while delivering a high-quality service. Publishing that information, which we did, would not have had the same impact on the sector as the inescapable fact of how much money is available to deliver a set volume of education.
You have told us how difficult the task is, but when will you be in a position to assure us that we have some reliable information on the unit costs that the Auditor General can use? He seems to think that those are quite important numbers, although you do not seem to think that they are.
We are trying to improve those data all the time. The question is one of definitions. Colleges organise themselves and use staff differently: some have staff on a permanent payroll, while others contract out the service. I know that you think that it is a straightforward matter, but I can assure you that it is not. That is why I used the higher education example to illustrate the situation. In a complex, multimillion-pound operation, it is extremely difficult to get definitions applied consistently.
The data are getting better all the time and we have continued to publish them and make them available to colleges. However, for all the reasons that I have mentioned, it is difficult to judge when they are wholly reliable.
Are you saying that there will never be a time when we can say that the data are entirely reliable?
I am suggesting that the benchmarking exercise that will report in the course of next year, which will cover in detail the expenditure of all colleges and will be carried out by an agency working to a consistent set of rules, will be reliable and of immense value to colleges.
What is the difference between unit costs and unit prices? How have you been able to set prices for colleges without having details of their relative efficiency?
By unit price, I think that you are referring to the funding methodology and how we allocate funding to colleges. Quite separate strands of activity are involved. In order to explain what funding prices are, I have to take you back to how the sector came into being. Before it came into being, each college was funded by its local authority, which determined how much funding was needed to meet the expenditure of particular colleges. When the colleges were transferred into a single sector run by the Scottish Office, they were funded initially on a broadly similar basis, which was concerned with what their budgets had been previously and how budgets needed to be increased to recognise inflation, increased costs or whatever.
Gradually, that system was transformed into a standard formula, the basic principle of which was that colleges should get broadly the same funding for doing broadly the same things. In the past, each local authority had funded its colleges differently. Over the years, colleges were brought into the formula, which, in the simplest terms, identifies a volume of activity and assigns an amount of grant to it. The volume of activity is the weighted SUM that we referred to this morning, and it is a standard measure. A college's allocation was constructed from the total number of weighted SUMs that it was expected to deliver, multiplied by the amount of funding per weighted SUM that was available for that particular year. That is what is referred to when one talks about the unit price. It is purely an allocation device and its roots lie in the total budgets that colleges lived under when they came into the sector.
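By way of illustration, here is a minimal sketch of the allocation device just described, with invented volumes, weights and unit price: the grant is simply the target number of weighted SUMs multiplied by the funding available per weighted SUM for the year.

```python
# Hypothetical illustration of the allocation formula described above:
# allocation = target weighted SUMs x unit price for the year.

# A SUM is a 40-hour increment of teaching; the weights below, which
# adjust for course type, are invented for illustration.
course_plan = [
    # (planned 40-hour increments, course weight)
    (12_000, 1.0),  # e.g. classroom-based provision
    (6_000, 1.5),   # e.g. workshop-based provision
    (2_000, 2.3),   # e.g. high-cost specialist provision
]

target_wsums = sum(increments * weight for increments, weight in course_plan)

unit_price = 150.00  # illustrative funding per weighted SUM this year
allocation = target_wsums * unit_price

print(f"Target volume: {target_wsums:,.0f} weighted SUMs")  # 25,600
print(f"Block allocation: GBP {allocation:,.2f}")           # 3,840,000.00
```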
In subsequent years, how that unit price is affected depends on the baseline funding that the Scottish Executive makes available to SFEFC and the various objectives that the minister asks SFEFC to follow. In the current period, the baseline has increased by about the rate of inflation and the minister did not ask the sector to expand. The implication of that was that the current unit price should be broadly maintained and that is how SFEFC allocated the funds.
Members can see that the allocation is not the same as the amount that a college chooses to spend per SUM because it will have other activities and other sources of income that have to be brought together. Its total expenditure has to be met from its total income.
That is as clear as mud.
I am sorry that you think that it is as clear as mud. I would like to clarify it for you because it is really rather important.
I am just trying to establish how on earth the price is calculated when you do not know the cost of anything.
One does not need to know the cost of individual items to arrive at a figure that is, as members should remember, a block allocation. Each college receives a block allocation of public funding to meet its needs for the coming year. As I indicated, those figures were derived by an historical process. Individual local authorities knew what it took to balance the books for their college, and those figures were brought into the new sector when it was managed by the Scottish Office. The process has been refined gradually over the years.
Members will appreciate the gradual translation from a budget that is adequate to support a college to a national formula that ensures that all colleges are funded on the same basis. If colleges were funded on the basis of unit costs or specific costs that they incurred, each college would have different costs for different things and they would be funded according to what they chose to spend. I do not think that any funding council, either here or south of the border, has ever thought it appropriate to fund on that basis. Is that any clearer to you?
I think that you have given us your answer.
Very helpful. I think that we have finished that section of questions. We move on to quality in further education provision.
Why has HMIE been employed on a service agreement and what benefits will accrue?
Do you mean as a result of employing HMIE or as a result of employing it on a service agreement?
What is the logic behind employing HMIE on a service agreement to review colleges and what benefits will accrue as a result of that?
We are separate organisations and the funding council is not equipped to carry out that type of work. It makes no sense to set up a parallel organisation when one already exists that has the skills and experience to do the job. The question is how best to set out an understanding of the work that will be done. We make a payment to HMIE for the work that it does and, in those circumstances, a service agreement is standard procedure: it is important for both sides to set out clearly the work that will be done, the expectations and the payment to be made by the principal to the agent for that work. Our experience over the years has been that that has worked pretty well. There have not been major disagreements about what was intended to be done, and time scales were clearly set out, so that if there were issues about whether reports were coming on stream on time, we could refer to the agreement. It is a standard bit of management practice and it is very valuable.
Very few colleges are found to be unsatisfactory, although some are having financial problems. Is there any correlation between those matters or are they entirely separate?
There is no obvious correlation, although from my experience in other sectors of education I think that good managers are good managers. Often, if a college is tightly financially managed, the people who do that also manage course delivery tightly and are tight on quality assurance and so on. Management is an holistic process; we cannot compartmentalise it and say that someone is very good at this, but does not bother with that. However, there is no evident correlation between the odd unsatisfactory grade—there have been only one or two—and the financial performance of colleges.
Given the importance of HMIE work as a measure of quality, why has that not been included in previous plans?
We refer to quality in our corporate plan. We state that the quality of programmes is of high importance and reference to HMIE reports is made in the targets.
What we have not done previously in our corporate plan is draw up a schedule of how we would report against HMIE reports. We did not think that that was necessary, but we have taken note of what is said in the Auditor General's report and, in our latest corporate plan, to which I referred, we have drawn up a comprehensive list of all the measures that we are using. We have stated what the indicators are, how we are going to measure them and so on. We have tried to rectify the situation.
Mr Donaldson, how does the work that you do for the Scottish Further Education Funding Council differ from the inspection work that you do for other educational establishments?
The fundamental strategy that we use in further education is similar to the approach that we use elsewhere. The process that I described earlier, of developing a set of quality indicators in conjunction with the sector so that we use a common language when we talk about quality and pursue it as part of an improvement agenda as well as an accountability agenda, is common across the range of our work. That work runs from the inspection of education authorities and pre-school centres through to higher education and teacher education. The broad strategy is the same.
The biggest difference—which we are currently examining—is that the memorandum of understanding with the funding council confines the quality indicators to educational issues and the student experience. Some of the questions that members have been asking about the relationship between broader strategic management and educational management are not currently part of the inspection process. However, we are reaching the end of the first cycle and we are in discussion with SFEFC about the way in which the inspection process will operate from 2004 onwards, as the process matures. Some of the issues that the committee has raised are under active discussion as part of consideration of the new quality framework.
Can I take it from that answer that, in the negotiations about the future memorandum, you can draw on your experience of inspecting other educational establishments and advise that you could examine other areas such as educational administration, as distinct from financial matters that are already otherwise covered?
Yes. I have to be careful to ensure that, as an inspectorate, we concentrate on the core business—the area where our strengths lie and where we can do well.
We are at the moment discussing with SFEFC the relationship between educational management and some of the more strategic aspects of college management, although that work will probably not take us into the kind of detailed discussions about financial management that the committee has had this morning. However, as Roger McClure said, it can be difficult in practice to distinguish between educational management and broader strategic management. As we move into the next inspection cycle, we need to consider how we can bring the two together.
The fact that SFEFC has not produced summary results on your work on performance potentially robs Parliament of information. Do you have an alternative means of providing Parliament with comprehensive results of your work across the range of colleges that you visit, or are you limited in what you can provide?
We publish overview results in relation to our work in further education. A publication is in preparation—it will be published on 26 November—which looks across the suite of inspections that we have undertaken in the first three years of our work for SFEFC. I am very conscious of the need to ensure that the intelligence that we have gathered about the system's performance is made available in a form that is accessible to the people who have to make decisions. The information is in the public domain.
David Wann waved a copy of SFEFC's publication, "Student and staff performance indicators for further education colleges in Scotland 2001-02", at you earlier. Although the bulk of that document is taken up by individual performance indicators for colleges, it starts with a chunky summary of what is happening at sector level. It brings together all the different types of indicator and includes a summary of the HMIE reports, so that everything is in one place.
The final area that we want to consider is the scope for comparing further education provision in Scotland with provision elsewhere in the United Kingdom and internationally.
What difficulties do you see in making comparisons of relative performance across Great Britain? In these days of joined-up government, why have you opted out of exploring comparative performance indicators with your counterparts in England and Wales?
That boils down to a kind of cost-benefit analysis. I hope that I have given a flavour of some of the difficulties of producing internal measures within the system. It is axiomatic that if we do not have reliable data, not only are our indicators not useful, but they can be downright dangerous, because they might suggest that an institution is underperforming in a particular area when, in fact, the result has arisen from an error in the data and has nothing to do with the institution's performance. We would not want institutions to take action on the basis of incorrect data.
The question is, are there indicators that come from outside Scotland that we can use? We have tried to use such indicators in the past—I assure members that there is no lack of willingness to do so. However, the amount of effort that is required to obtain data that are sufficiently reliable and consistent is disproportionate, so such analysis tends to be pushed further down the agenda. That has not prevented us from interacting with the bodies that collect such data; we have a great deal of interaction with them because it is useful to compare processes, approaches and mechanisms with people who do broadly the same things that we do.
It is much harder to achieve a return on that effort when trying to analyse data from outside. The councils in England and Wales have a different data collection system and different data definitions, and they fund differently. One would have to unscramble all of that in order to make comparisons, having first reached a point at which one believes the data within one's own sphere. The fact that it is not done reflects a cost-benefit judgment rather than a lack of willingness. The Auditor General has asked us to reconsider the matter, which we will certainly do, to see whether the situation has moved on sufficiently and whether, in selected areas or at a sufficiently high level, we can identify data that would be useful for comparisons.
So, in essence, you are saying that there are very few points for rational comparison because the system is so different down there.
Yes.
You say that you have investigated the issue. Is there, therefore, any evidence that the councils in England and Wales will introduce—or are considering—sustainability benchmarking and indicators?
I am afraid that I cannot speak for Wales. I am not sure what has been done there, although I would be surprised if something broadly similar to what we are doing is not being done there, for the reasons that I gave earlier. Each funding council is well aware of what the others are doing because we have regular contacts on themes such as funding, and opposite numbers keep in touch with one another. However, I cannot give an authoritative answer on what performance indicators will be published.
As there are no further questions, I thank our guests for their full answers and their help. No doubt we will be in touch if we need to follow up on any points. You will be able to read our report when we publish it.
We now move into private session for the next agenda item.
Meeting suspended until 12:27 and thereafter continued in private until 12:33.