Statutory Performance Indicators
We are joined by Caroline Gardner, the deputy Auditor General of Audit Scotland; Alec Taylor, the performance indicators manager; and Lesley Bloomer, whom we have met before and who is the director of the performance audit. They will go through the usual procedure of a 10 or 15-minute presentation, after which there will be time for questions.
Caroline Gardner (Audit Scotland):
Thank you, convener. We are pleased to have the opportunity to talk to the committee about the Accounts Commission's work in publishing performance indicators. We hope that their publication will be a useful source of information for the committee in future. We will talk for no more than 15 minutes. I shall give the committee a quick update on the new audit arrangements under the Scottish Parliament. Alec Taylor will then talk about the context of the performance indicators and the way in which they have developed to date. Lesley Bloomer will pick up some of the issues that we face concerning the way in which the PIs should develop in future.
Members have in front of them a pack that contains a hard copy of the slides that we will be talking to, which is entitled "Statutory performance indicators for local government". They also have a couple of examples of the ways in which we publish performance information, to which Alec will refer later. We will take members through the main points that we want to get across this afternoon.
The first slide aims to set out how the new audit arrangements look under the Scottish Parliament. The Public Finance and Accountability (Scotland) Act 2000 established the new arrangements and will have effect from 1 April. The Accounts Commission is still responsible for securing the audit of local government in a range of ways and has the power to make reports in the public interest when required. It can also censure, suspend or disqualify members and officers of councils if necessary. That set-up recognises specifically the fact that councils are democratically elected, rather than being accountable to either the Scottish Executive or the Parliament. Therefore, the Accounts Commission retains its previous role in relation to local government.
A new post has been created of Auditor General for Scotland, who is responsible for the audit of almost all the other spending bodies that spend public money in Scotland. The Accounts Commission and the Auditor General are served by a new audit delivery agency, called Audit Scotland, which employs all the staff who carry out the work of the Auditor General and the Accounts Commission and exists solely to provide services to them.
We think that those audit arrangements are much more effective and streamlined than those that apply in the rest of the United Kingdom. On the one hand, they give us the critical mass to be able to carry out our work effectively and on the other, they enable us to take a cross-cutting look across the entire Scottish public sector, in addressing such issues as partnership working. We can also trace the process through from the Executive to the local spending bodies, to ensure that the implementation of policy and the delivery of services are being carried out effectively.
The new arrangements are in place. They recognise the separateness of local government, but give us the opportunity to have a cross-cutting look at the public sector, which can offer real benefits.
The next slide is headed "Performance audit". Within the new arrangements, we are working hard to ensure that our responsibilities can make a real contribution to improving the quality of public services in Scotland, through holding spending bodies to account more effectively and by helping them to improve, rather than by simply looking back at what happened in the past.
There are two key questions on performance, in which members will have an interest. The first question is straightforward and concerns how good performance is. The second—in line with best-value policy—concerns whether improvement is taking place. Are spending bodies continuing to drive up their performance to ensure that they are matching existing best practice?
Our approach can contribute to establishing better public services in the six ways that are set out on the slide, and the challenge for us is to ensure that we strike the right balance between supporting good practice and innovation, and challenging people to improve when evidence suggests that that is possible. That is the approach that we are developing across the public sector, with a special remit for local government under the auspices of the Accounts Commission.
Members might be interested to note that, although some aspects of that approach apply in most parts of the public sector, they are most advanced as an integrated package in local government because of the history of the legislation under which local government operates.
I will move on to slide 3, before handing over to Alec Taylor. Within the performance audit, we have three main tools at our disposal. The first is a value-for-money study, which looks down through a specific service or function and makes comparisons across a range of bodies to identify what the range of performance is, what works most effectively and where there is room for improvement. The second tool is performance indicators, which we are here to talk to the committee about in more detail. The third tool is management arrangements—the processes that underpin especially good or poor performance, as shown up by the PIs. We are aiming to integrate those three tools much more effectively in future, to ensure that we can give Parliament and councils information that can be relied on.
Alec Taylor will talk in more detail about the PIs.
Alec Taylor (Audit Scotland):
I shall make a few points about the way in which the PIs have developed and the way in which the information is published, which is the final outcome of the process. The key is contained in the Local Government Act 1992, which sets out the framework under which we operate.
Slide 4 shows the criteria that we have to use. They are
"cost, economy, efficiency and effectiveness".
The slide also shows that we must be able to facilitate comparison over time as well as between councils in any financial year. Each year, as we review what our direction should be for the following year, we must retain an awareness of the requirement to balance continuity with any change or development that needs to take place.
Part of the problem of change lies in finding a balance between the practical and the desirable. As slide 5 shows, councils must collect and publish information—audited arrangements must be in place for them to do that. We must ensure that we are not asking too much of councils and that, when we ask for information, they can provide it. If their systems cannot provide the information, there will be little point in introducing new indicators. We have put such arrangements in place in one or two councils—with the agreement of the relevant professionals—to assist them in ensuring that they have information in place to report on important issues. That is one way in which we can assist the process.
In determining our annual direction, we put considerable effort into working with a wide range of interest groups. Before we issue our formal consultation paper in the summer months, we spend time talking to professional associations, social workers, planners, education directors, various units within the Scottish Executive and other interest groups to ensure that much of the paper's content is already understood and known by the people who are likely to be affected by it. Once we have taken account of the responses to that consultation paper, we issue our direction, which happens about this time each year. Indeed, the direction for the next financial year will be going to the Accounts Commission next week. The direction is then backed up by a lot of guidance—on the definition of terms and the interpretation of information—for the relevant officers in councils and their auditors.
Echoing the need for consistency over time, our direction for the next financial year is very similar to this year's direction. For next year, we are introducing one new indicator and dropping one, which means that we have the same number of indicators. Very little change will take place. However, I reiterate that that is all done on the basis of extensive consultation.
Slide 7 illustrates that, at the other end of the process, we publish the information for the whole of Scotland in several ways. Each year, we distribute a series of pamphlets widely. One of those pamphlets, entitled "Benefits, Finance and Housing", has been included in the pack; it gives a national picture and highlights important messages about a range of indicators. In addition to those pamphlets, we issue a compendium of the data without analysis to allow people who want to undertake any analysis of their own to do so. The data and pamphlets are on our website. The data are also available on disk and are sent out to all sorts of people, such as council officers, students and civil servants in the Scottish Executive.
During the summer, we also sent chief executives a profile of their councils for the first time. Those profiles contained detailed comparative analyses of many of each council's performance indicators in relation to other councils. Next February, we will distribute more widely a similar document for the 1999-2000 data. All our information for a financial year is published in January and February of the following year. Next time round, we expect to issue about eight pamphlets covering 50 to 54 indicators.
Slide 8 reproduces one of the figures that are contained in a pamphlet, in this case a figure that relates to rent arrears. The figure exemplifies how, where we can, we present data by illustrating the national picture and reflecting change over time. It shows rent arrears for council housing since 1993-94 and indicates a trend of increasing arrears against an increase in the level of rent that is due. As a result of receiving such information, we undertook a value-for-money study on the management of rent arrears, which was published in June. That demonstrates that, as well as simply being presented in this way, PI data are used for other purposes.
The ninth slide shows last year's rent arrears for the 32 councils. That information is presented in tabular form in the pamphlet, with the comparative arrears figures for the previous two years. There was clearly a wide variation in the level of rent arrears, but I wanted to illustrate how, for some performance indicators, we have used family groups to provide a more like-for-like comparison between councils. Lesley Bloomer will now say something more about issues that arise from the information.
Lesley Bloomer (Audit Scotland):
Alec Taylor has talked about PI work. I will describe the concerns that councils have had about our work and the ways in which we have addressed those concerns. I will then speak briefly about future developments.
We take seriously concerns that are raised by councils. If councils are not happy with the PIs, they will not use them. An increasing number of councils use the indicators to plan targets for improvement with their heads of service. PIs should be used in that way to contribute to improvements in performance. We need to ensure that councils are happy with the PIs so that more councils use them to improve performance.
I wish to address the two issues that are outlined on slide 10. First, concern has been expressed that there are too many PIs. Three years ago, the Scottish Office established a set of PIs to reflect the introduction of the best-value regime. That meant that there were two sets of PIs, as in England, which gave rise to a lot of work.
We worked hard with the Executive to combine the two sets of indicators 18 months ago. It is worth noting that unification of the sets of indicators has only recently been achieved in England, where the work was carried out by the Department of the Environment, Transport and the Regions rather than by the Audit Commission, which is our equivalent body. There is still a perception among councillors and council officers that there are two sets of indicators, but that is not the case. We have worked hard to reduce the number of indicators.
Secondly, there is a view that the PIs lead to unfair comparisons and that, in reporting on them, we do not compare like with like. Complaints have been levelled at a few indicators, in particular at those that involve local targets or those that concentrate solely on expenditure. In the most extreme example, it is hard to compare Clackmannanshire's need for expenditure with Glasgow's. In those cases, we have developed the criteria that we use so that we select and define PIs in a way that makes them much more readily comparable. The committee will be aware of that from the document that we sent on statutory performance indicators.
We are also using family groups where there is evidence that external factors influence a council's performance. We use family groups to group councils for rent collection rates, council tax collection rates and refuse collection. Rent and council tax collection rates are affected by levels of deprivation and refuse collection is affected by how dispersed the population is. We will continue to work hard to improve PIs.
I will finish by talking about the future development of PIs. We will continue to develop the indicators to ensure that they are robust and that they are strictly comparable between councils. We want to begin to develop some voluntary cross-cutting partnership indicators for measures such as the number of racially motivated incidents, the number of people who participate in sport and so on. To make a difference in cross-cutting areas, councils must work in partnership with other bodies, such as health bodies, enterprise agencies, the police, fire services and so on. Councils will be able to use indicators that reflect partnership working in their community leadership role. They will be able to use such indicators with their partners to help to drive change.
Audit Scotland and the performance audit team will work to develop a more rounded picture of performance. That relates to Caroline Gardner's point about the three areas of our work: management arrangements and best value audit; performance indicators; and value-for-money studies. We are working to pull together reporting of our indicators with our other work—in particular with the best value audit. That will mean that we will be able to put together information on how good a council's processes are and on how good the service that users receive is. That will give us a much more rounded picture of a council's performance.
Equally, we will explore, using VFM studies, why councils vary so much in performance. Alec Taylor mentioned rent arrears, which offer a good example of the extent of variation between councils' performance indicators. We followed that up with a VFM study and we will be doing more of such work in future.
That is all that I would like to say now about how we plan to proceed. I will be happy to take questions.
I will start with two easy questions—I think. Alec Taylor said that you had introduced a new performance indicator and dropped an old one. I would be interested to know why you did that, and what the indicators were.
If you find that a council—let me put this delicately—has delays with repairs or has invoices that are not paid within 30 days, and if it does not change that for a couple of years, how long is that situation allowed to go on? What can you do about it? I have used two easy examples, although there might be other more serious matters that councils do not address, even once you have issued a report.
I will respond with the easy answer first. The indicator that we propose to drop relates to staff costs in the library service. We did that because of our developing criteria—it is difficult to compare costs of an element of service provision between councils. That is particularly evident in Glasgow: the Mitchell library does not have a lending facility, but it is a very expensive high-quality facility. To compare Glasgow's library staffing costs with those of other councils was considered inappropriate for the purposes of our criteria.
We also wanted to consider more carefully the end product—the outcomes and the outputs—rather than the resource inputs into services. We have developed indicators for the library service that relate to use of libraries. That is far more important in general terms. We therefore decided to drop the cost indicator.
The indicator that we are introducing relates to the educational attainment of looked-after children—a social work matter. It stems from the social justice action plan and from the proposal that there will be a target for children who are leaving care: achievement of standard grade English and maths. That means achieving literacy and numeracy. That is a developing target and it was felt that a new policy initiative ought to be reflected in the PIs.
To some extent, the length of time that situations—such as that which you described, convener—would be allowed to continue would depend on the importance of the matter in hand and on the extent to which the council's performance had varied. There is now an issue about best value, including consideration of continuing improvement. There are one or two indicators on which some councils are not doing particularly well, but they are making a determined effort to improve. That is important.
On the other hand, if a council is doing very well, but is somewhat complacent about service delivery, that also shows up in the indicators. In serious cases, a council's auditors will stay with a matter and draw issues to the council's attention annually.
If no council is doing particularly well in relation to an indicator, we might—as we did with the housing rent arrears service—decide that it is time for more general consideration through a value-for-money study. We would investigate whether examples of good practice could be drawn to councils' attention. We have a number of ways of dealing with such matters—as you mentioned, convener—but it depends what the issue is.
I would like clarification on the graph on slide 9. I might think of a more intelligent question in due course, but I simply do not understand the "Family group analysis" thing that is shown on that slide.
Slide 9 relates to an analysis that was carried out on the basis of deprivation and population density. That analysis stemmed from a concern that was held by a few councils that their situation was very different from that of other councils and that a comparison across the 32 councils—whether in tabular or graphic form—was inappropriate. Glasgow is the most common example. It was considered inappropriate to argue that Glasgow City Council should be doing as well on some indicators as the Scottish Borders Council or the Western Isles Council were. Similarly, it was considered inappropriate that Highland Council, with its dispersion, should do as well as some of the more tight-knit council areas.
We carried out an analysis that divided the councils into three groups according to population density and deprivation, to show that those councils that were more alike still suffered from significant variations in performance. We wanted to ensure that Glasgow City Council, Clackmannanshire Council and Highland Council were not being compared unfairly.
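As a purely illustrative sketch of the kind of grouping described here, the following Python fragment partitions councils into three family groups by a simple combined rank of deprivation and population density; the councils, scores and ranking rule are hypothetical and are not Audit Scotland's actual methodology.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Council:
    name: str
    deprivation_score: float   # hypothetical measure of deprivation
    population_density: float  # hypothetical persons per square km

def family_groups(councils: List[Council], n_groups: int = 3) -> List[List[Council]]:
    """Split councils into up to n_groups of similar size, ordered by a
    simple combined rank of deprivation and population density.
    An illustrative proxy only, not Audit Scotland's actual method."""
    by_deprivation = sorted(councils, key=lambda c: c.deprivation_score)
    by_density = sorted(councils, key=lambda c: c.population_density)
    combined_rank = {
        c.name: by_deprivation.index(c) + by_density.index(c) for c in councils
    }
    ordered = sorted(councils, key=lambda c: combined_rank[c.name])
    size = -(-len(ordered) // n_groups)  # ceiling division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

# Hypothetical councils: comparisons (for example, of rent arrears) would
# then be made within each family group rather than across all 32 councils.
councils = [
    Council("Urban A", deprivation_score=0.32, population_density=3400),
    Council("Urban B", deprivation_score=0.28, population_density=2900),
    Council("Semi-rural C", deprivation_score=0.18, population_density=350),
    Council("Mixed F", deprivation_score=0.22, population_density=600),
    Council("Rural D", deprivation_score=0.10, population_density=9),
    Council("Rural E", deprivation_score=0.12, population_density=14),
]
for i, group in enumerate(family_groups(councils), start=1):
    print(f"Family group {i}: {[c.name for c in group]}")
```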
I assume that the bars on the chart relate to the list of councils in figure 2(a).
That is correct.
In table 4—on the percentage of invoices that were sampled—I note that Glasgow City Council used a sample of 803,126 invoices—I assume that that is all the council's invoices—but the West Lothian Council sample included only 568 invoices. How confident are you that the authorities are presenting comparable figures? Given the fact that other local authorities seem to have provided all their invoices, are you sure that West Lothian Council sampled 568 invoices at random? Could the results have been skewed? Might it be appropriate to use a common measure, such as all a council's invoices?
How appropriate is the number of repairs per house as an indicator, given the fact that houses come in different sizes, designs and ages? Repairs could range from fixing a bolt in a gutter to a major re-roofing exercise.
I will deal with the first question and Lesley Bloomer will deal with the second.
The issue of a common measure is very important. Initially, the indicator was designed to allow for a minimum sample size of 500, which was agreed as being statistically relevant. However, the guidance that we provide to both the auditor and the council is that the sample should come from a range of departments and should consist of different types of invoice. Increasingly, we have encouraged councils to provide a 100 per cent sample. We can take some comfort from the fact that many councils provided huge samples.
Mr Gibson identified the fact that West Lothian provided a small sample. However, it would be clear to the people of West Lothian and to the council's auditors that that is a small sample. We hope that there will be a movement towards a near 100 per cent sample size in all councils. However, the specification of the indicator was based on a minimum sample size and we have taken the matter forward from there.
I do not have the figures for 1999-2000, but I hope to see an improvement in performance and an increase in the sample sizes that councils provide.
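To give a rough sense of why a minimum sample of about 500 invoices was judged statistically meaningful, the sketch below computes an approximate 95 per cent margin of error for a sampled proportion. The 70 per cent payment rate echoes the overall figure discussed below; the confidence level and the normal approximation are assumptions, and a finite-population correction would narrow the interval further for near-complete samples.

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a sample proportion at a 95%
    confidence level, using the normal approximation to the binomial."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

# Hypothetical proportion: roughly 70% of sampled invoices paid within 30 days.
for n in (500, 568, 5_000, 803_126):
    moe = margin_of_error(0.70, n)
    print(f"sample of {n:>7}: 70% plus or minus {moe * 100:.1f} percentage points")
```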
Is there any reason why there is such variation in the number of invoices that are paid within 30 days? I know that that is a bugbear of many members of the business community. The Forum of Private Business indicated that several of its members have gone bust because of that. According to the table, there has been a substantial decrease in the number of invoices that are paid within 30 days by Inverclyde Council and Fife Council although, commendably, councils such as the City of Edinburgh Council have improved significantly. Overall, there does not seem to be any change—the rate is 70 per cent across all local authorities. Invoice payment seems to be a bit of a rollercoaster ride.
You raise one of the key issues about the performance indicators, which is that they raise questions rather than give answers. However, that type of information allows the sort of question that you are asking to be asked. The Federation of Small Businesses has raised the issue with us many times. In a few months, we will publish the third-year data and examine them to see whether there has been any improvement. At this stage, however, I do not know why there is such variation. Clearly, there is variation in the quality of systems that are available in councils. That is why some can give us a 100 per cent sample and others can produce only a small sample. Perhaps that is part of the answer. We will keep an eye on the situation and we hope that it will improve.
You asked how we can compare the level of repairs, but it is important to remember that the condition of the stock will vary enormously in differing local authority areas.
I wondered how relevant the information was.
It is relevant in a couple of ways. We took the information about the number of repairs per dwelling and the number of emergency repairs per dwelling. Because of the variability in the data, we did a value-for-money study on it, which was published a couple of years ago. In that study, we used data from Scottish Homes, which conducts a three-yearly stock condition survey, to examine the stock condition. We found that the age and condition of the stock play a part in the number of repairs: older stock in poorer condition will need more repairs. More important, however, was the way in which the council managed the housing repairs work. If the council was firm about charging tenants for repairs that were needed because of wilful damage, fewer of those repairs were needed. Also, if a council was proactive regarding its maintenance work, fewer ad hoc repairs were required.
The performance indicator data showed us what the variability was. A classic example was the fact that East Dunbartonshire had a higher number of repairs per house than North Lanarkshire. That did not fit, as East Dunbartonshire is a less deprived community and I believe that the stock condition was broadly similar. On examining that situation, we discovered that the difference had arisen because of the management style, rather than external factors such as stock condition.
The same situation is found repeatedly when an area is examined in a value-for-money study. We examine factors that are external to the council and factors that are in the control of the council's management. External factors play a part but, frequently, the bigger part is the way in which the council manages its processes. That is why we think there is good value in the value-for-money studies: they allow us to pull out examples of good practice that other councils can use.
Table 5b in the leaflet, which deals with the overall proportion of housing response repairs completed within target, shows that Angus has 91.7 per cent, which seems good, and that West Dunbartonshire has only 57.8 per cent, which does not seem very good. I take your point about East Dunbartonshire, but those figures do not tell us whether a council is doing all those repairs because its stock is not in good nick or whether the opposite is true. From a lay perspective, there does not appear to be a sufficient explanation of what those figures might mean. Other figures, however, will be perfectly understandable to members of the public and will clearly show whether the local authority is doing better than it did in previous years.
That is a good point. We need to keep reviewing how user friendly our pamphlets are and how much explanation is contained in them. We will have to keep working at that to improve it.
The table with the information about the overall proportion of housing response repairs completed within target is interesting and we will treat it with more care in future. Authority response target times vary greatly. It is a heck of a lot harder to hit the target if the target is four hours, for example. A council that achieves 90 per cent when its target is four hours is possibly doing better than a council whose target time is 24 hours.
I am surprised that the criteria are not standardised. I assumed that they were, as that would allow relevant comparisons to be made.
That is one of the issues that have arisen. We have had complaints about the fact that it is difficult to make like-with-like comparisons. There is no uniform categorisation of repairs. There is no agreement on what constitutes an emergency repair and no agreement on the time scale in which repairs should be done. It is not our job to set those criteria.
Because standardisation of criteria is so important, we carry that information in the pamphlets, and the compendium contains all the councils' exact target times so that informed analyses can be made. This year, pending what we hope will be the development of standardised response times, we are taking the repairs that councils aim to complete within 24 hours and asking how many they managed to do. We will report that in the pamphlet, which will be easier to interpret. Our difficulty is that there is no standardised response time, and we cannot impose one, but we are working with directors of housing to try to move the matter forward.
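A minimal sketch of the like-for-like measure described here, using hypothetical returns (the field names and figures are illustrative, not the actual data specification), contrasts performance against each council's own target with performance against the common 24-hour yardstick.

```python
from dataclasses import dataclass

@dataclass
class RepairsReturn:
    council: str
    own_target_hours: int        # the council's own response target (varies)
    pct_within_own_target: float
    aimed_within_24h: int        # repairs the council aimed to complete in 24 hours
    done_within_24h: int         # of those, the number actually completed in 24 hours

def standardised_24h_rate(r: RepairsReturn) -> float:
    """Share of 24-hour repairs completed on time: a comparison that does
    not depend on how tightly each council sets its own target."""
    return 100.0 * r.done_within_24h / r.aimed_within_24h

# Hypothetical figures: Council X misses its tight four-hour target more
# often, yet completes more repairs within 24 hours than Council Y.
returns = [
    RepairsReturn("Council X", 4, 78.0, aimed_within_24h=1_000, done_within_24h=960),
    RepairsReturn("Council Y", 24, 90.0, aimed_within_24h=1_000, done_within_24h=900),
]
for r in returns:
    print(f"{r.council}: {r.pct_within_own_target:.1f}% within its own "
          f"{r.own_target_hours}-hour target, "
          f"{standardised_24h_rate(r):.1f}% within 24 hours")
```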
May I add one brief point? During her presentation, Lesley Bloomer talked about the changes that we made recently to the criteria for performance indicators; two important changes will affect indicators such as response times. First, it should be obvious which way is up for an indicator—that is, it should be clear which is good performance and which is poor, so that if there is a change, it should be clear whether things are getting better or worse.
The second change is that we should focus on national, not local, targets. If that means working with local government to establish what national targets ought to be, we will do that.
That is important when looking at comparisons.
First, I wish to make an observation. I welcome the reports on the improvement in rent arrears collection and the fact that housing repair responses are within target in West Lothian. I am sure the council's excellent convener at the time contributed to that.
Namely yourself.
More seriously, I was concerned about the section at the beginning of the report, on council tax collection levels. I know that you are just identifying questions and that it is for the Executive and local authorities to consider them, but there is a puzzling and concerning gap between the level of council tax collection in Scotland and that in England and Wales. In addition, the situation did not get better during the three years that are covered by the report. Why is there such a gap between our collection levels and those in England and Wales?
We did a VFM study on that subject three years ago, and went back and highlighted the position last year when we published the data. Several things affect council tax collection levels, particularly income levels and deprivation levels. The significant factor in Scotland, which England and Wales did not have, was the non-payment campaign, which focused particularly on the west of Scotland, and which we reflected in our report. That is one of the reasons the position in Scotland now is worse than it is in England and Wales.
Our reason for revisiting the issue when we published the data last year was the concern that you raised, which was that the position did not seem to be getting better. For each of the pamphlets that we publish we put out a press release, in which we try to be balanced. The press release that went with this pamphlet was challenging. We said that there had to be improvement, and that it was not acceptable for collection levels to continue as they were, because those who paid were subsidising those who did not. We will keep an eye on what happens with collection rates. If they do not start to improve, we will maintain a press-release focus on the issue so that the message gets through in the media.
However, we have found improvements. When we published our original report, Glasgow City Council sat down with us and the external auditor and agreed a set of improvement targets. Those targets are still low, but the council is meeting them, which is excellent. We understand that there have been problems with the introduction of new information technology systems, which have disrupted collection regimes, and that that may be part of the problem for the other two councils—West Dunbartonshire Council and Inverclyde Council—that we highlighted as having low performance. Those new systems should be bedding down now, and we will be keeping an eye on performance in those councils and across the piece.
My other point, which also concerns something that does not make much sense to me, relates to table 3, which shows the cost of collecting council tax. For example, Falkirk Council records a huge reduction in the collection cost per dwelling, from about £11 to just under £3, and its collection rate remains the same throughout the period. In contrast, Fife records an increase in the collection cost per dwelling, from about £4 to £14, and its collection rate remains roughly the same. It is difficult to understand that. Falkirk seems to be making huge savings, or efficiencies, in its collection cost and that is not affecting its collection rate, whereas another authority seems to be putting much more effort into its collection and is not getting any benefit.
It is necessary to examine what the councils are doing. As Alec Taylor pointed out, the indicators do two things. They allow us to make comparisons across all councils on the council tax collection rates. They also allow us to make comparisons within a council over time. The focus tends to be on the comparison over time on indicators on which we ask for costs. We work closely with the Chartered Institute of Public Finance and Accountancy to ensure that the costs are allocated uniformly. That is a hard job. For that reason, there may be variation in how councils allocate costs between headings. Allocation of costs is complicated by the fact that some staff who work on council tax may also work on housing benefit, so the council has to split up those costs somehow. It can be difficult to allocate costs.
As a result, when we have cost indicators we tend to focus on what is happening within a council over time rather than between councils. Bristow Muldoon used the example of the decreased cost in Falkirk. We would need to examine the detail to find out what has happened there. Has the council altered its procedures? Has it cut its costs or changed the way in which it allocates them? We work hard to make indicators comparable between councils, but it is more difficult to do that on cost indicators than on some others.
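A small sketch of the within-council, over-time reading of a cost indicator described here; the council labels and intermediate values are hypothetical, loosely echoing the movements quoted above.

```python
# Hypothetical cost-of-collection figures per dwelling (GBP) over three years.
cost_per_dwelling = {
    "Council A": [11.0, 7.0, 3.0],   # falling cost, collection rate unchanged
    "Council B": [4.0, 9.0, 14.0],   # rising cost, collection rate unchanged
}

for council, costs in cost_per_dwelling.items():
    # Year-on-year change within the same council, rather than a ranking
    # of councils against each other.
    changes = [f"{later - earlier:+.2f}" for earlier, later in zip(costs, costs[1:])]
    print(f"{council}: year-on-year change in cost per dwelling: {', '.join(changes)}")
```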
I know from my experience of talking to councils that they feel overburdened and that they question the value and usefulness of much of the data that they are asked to produce. How do you intend to overcome those concerns?
We have been working hard to overcome them. The work that we did with the Scottish Office to combine the sets has helped enormously.
The demands to produce data come from Audit Scotland, but also result from a host of statutory returns that, as you know, councils make to the Scottish Executive on a variety of matters. We are working with the Society of Local Authority Chief Executives and Senior Managers—SOLACE—the Convention of Scottish Local Authorities and the Scottish Executive on the joint performance information group. Our objective is to examine the demands that are placed on councils to see whether we can streamline and harmonise them.
The work that has been undertaken on social work is a good example of that. All the relevant bodies—Audit Scotland, the Scottish Executive, COSLA and the Association of Directors of Social Work—came together to consider the performance information that had to be reported and that was needed to run the service. They recommended improvements on the information that they needed to have and alterations to what they wanted to report. That process has worked well. The work of the joint performance information group is likely to lead to further work of that type to streamline and co-ordinate all the reporting requirements. That should reduce the overall burden on councils.
Am I right in assuming that there is no performance indicator in this report on finance for the collection of non-domestic rates?
There is no performance indicator for non-domestic rate collection.
Why not? Non-domestic rates are one of the biggest revenue earners for local councils. Why is no check made on that collection?
It would be easy to say that I do not know, because that is fundamental to any other answer that I might give. That issue has not been raised directly with us by any of the bodies with which we have discussed the performance indicators. In general, our PIs relate directly to services that are provided by the council to a range of publics. That is probably why we have not examined that income stream.
But the council is the service provider: it collects the rates.
Absolutely.
It seems strange that one of the biggest sources of income generation for councils is not subject to a performance indicator. You are very critical of councils for not collecting council tax. How do we know who collects what in terms of business rates?
The answer is that we do not know at the moment. We have done a couple of VFM studies, which explore in more detail the collection of rates and other corporate services of that type. We should perhaps explore the broader question of the effectiveness with which councils gather all their revenues. We will certainly take that point away with us.
To go back to the issue raised by Kenny Gibson, the impression that is being created is that we are comparing apples with oranges. If that is the case, what value do the statistics have? For example, many people question school league tables, because different backgrounds and catchment areas affect the results; there is little value in comparing a high-performing school with a low-performing school, without taking into account the background of the students who go to each school.
Is there no way in which you could add a qualitative dimension to the statistics, instead of carrying out quantitative data analysis, collecting figures and presenting them in a name-and-shame fashion? What possibility is there of producing a value-added table that shows improvements in the level of collection of rent arrears in, for example, Glasgow compared with a more rural or better-off community? Unless there is a qualitative dimension, the statistics have little impact, other than to say that one council is better off socio-economically than another.
There are at least two answers to that question. First, we already do that to some extent through the family groupings, which Alec Taylor talked about, which deal with fairly complex statistics to show what affects performance and then group councils according to how they are affected by those factors. That starts to make comparisons.
Secondly, the PIs are designed to be comparative across councils and over time. The comparison over time works in any case, and councils find the cross-council comparisons useful, because they understand what some of the variations are. We have the power to use value-for-money studies to explore what is driving better or worse performance in much more depth than is possible with PIs. In that way, the comparisons can be made much more valuable. I talked about the ways in which we are developing performance audit as a technique for the future. One of the things that I am keen we should do is to make more use either of contextual information or of standardisation to ensure that the comparisons are as robust as they can be.
If we had a bit more background, it would be nice to talk about how we try hard to ensure that the tables are not just league tables and that they are reported sensitively, and that councils are helped to make use of them. We understand that the tables are sometimes used in the way that was described, especially by the media, but we try to ensure that the information that lets people make sense of the indicators is as widely available as the indicators themselves.
My next question concerns the point that you have just made. Your report contains a commentary on each set of statistics, but the commentary just confirms what the table says; it says that councils A, B and C are doing quite well, and councils E, F and G are not. It contains no analysis of the statistics or anything that would suggest why councils A, B and C are doing better. Statistics relating to the PIs and why they might be counted in a particular way may be encapsulated, but the commentary does not suggest why that might be the case; it simply reflects that a certain council is doing well in a certain statistic.
It is always difficult for us to get the right balance and to present the PIs in a form that is not hugely indigestible—the full compendium is telephone book-sized. We need something that gets the main messages across to the public, but also gets information to councils and to people whose interest is more detailed in a way that allows them to make sense of it.
We work hard at getting the balance right but we do not claim to be entirely there yet. The use of family groups is one way of trying to make the comparisons more useful in a visually simple way—people can simply look at the graph and see quickly which councils come together and are most like theirs for the service in question.
Would it be too difficult to add a paragraph or two in the commentary, by way of analysis of the statistics, to suggest that a simple reading of the table is not enough to indicate that one council is performing better than another?
You are right. Sorry—three of us are trying to answer the question.
Inside the front cover of each of the pamphlets is a statement about a number of factors that affect the council's ability to perform. It says that those
"local factors may mean that a council with a performance which, at first sight, appears to be worse than that of another has, in fact, done better given the circumstances it faces."
We try to flag up caveats to avoid the problem of over-simplistic interpretation—the league table effect, where somebody comes out top and somebody comes out bottom.
But the commentary does not say, "This table could be affected by some of the caveats at the start of the document." It says, "Councils A, B and C are at the top and councils D, E and F are at the bottom"; no analysis of the statistics is included to point out that there may be a reason for that.
As Alec Taylor said, factors are included on the introductory page of each pamphlet. The commentary on council tax collection costs says:
"the cost of collection may be affected by: the ability and willingness of taxpayers to pay"
and so on.
In the commentary, we try to list the factors that may affect the performance levels. Without doing the volume of work that is involved in a VFM study, it is difficult for us to say why there is such a range of performance. Many internal and external factors can be involved, and it is incredibly difficult to unpick them. We would do that in a VFM study, but we have 65 council indicators and another 11 for fire and police. Understanding properly what is happening cannot be done in the commentary. We need to go round an area, which is what we do for council tax, rent collection and so on. However, we outline in the commentary external factors that may affect a council's performance. We try to do that routinely in the commentary—I am sorry if there are issues on which we have not done so.
Performance indicators are just numbers. It is impossible to reflect the quality or detail of service provision in a number. That is why we use the jigsaw analogy—our statutory performance indicators are one part of the jigsaw. It is incredibly important to understand quality issues about service provision—councils should speak to their users and staff about that. Performance indicators are important, however, in that they prompt questions. If, for example, North Lanarkshire has a much lower level of rent arrears than Fife, we ask, "What is going on here?"
We produce audited figures. Yes, there are queries about the comparability of one or two of them, but across the board they are audited, robust figures that allow councils to say, "Hang on a minute—what is happening here?" That is their real value. We cannot encapsulate everything in a number, but we can get the number to prompt questions. We reflect the caveats that are associated with the figures in the commentary and in our press releases.
I hope that my points are related. First, I am reminded of performance tables in education. One of the difficulties is the use that is made of the statistics, not only—obviously—by the press but, in this case, by councils. They could feel devalued. They might, underneath, be doing a fairly good job but, as Michael McMahon said, the statistics do not reflect that.
The starkest example that we have had to date was when we discussed the refuse collection figures the last time that Audit Scotland was at the committee. Those figures showed efficiency in terms of time per wheelie bin. However, the big message to emerge from the report was concern about what we are doing about waste management in general. If a council were thinking of having different wheelie bins for paper and glass, or a similar waste management structure, the time taken per bin might be higher and there could be cost implications. We know that there will be cost implications when we move over to more sophisticated waste management systems.
Lesley Bloomer will remember that the last time she came before the committee, we were at great pains to emphasise the importance of the qualitative analysis that needed to go alongside the quantitative data, which can too easily be misread or misunderstood. Taking the example of refuse collection, how would that report appear next time?
The performance indicators for refuse collection are the cost of refuse collection and the number of complaints—that is a new indicator. Alec Taylor will correct me if I am wrong.
I am sorry to interrupt, but I think there is also an indicator that relates to the proportion of bulk uplift for disabled people and so on.
Yes. I was referring to the household indicators. If a council introduced different bins for households, the costs would increase. At the moment, all councils use systems that are based on wheelie bins, so the figures are comparable. If a council were to change its system, we would have to reconsider the way in which we report that indicator, because it would become a transitional indicator. The situation would not be clear; if a council introduced different bins to improve its recycling rate, the costs would rise and that rise would not necessarily be a bad thing. We would provide additional commentary in support of that indicator, saying that additional costs for recycling would push the figure up. Does that answer your question?
I am trying to say two things. First, there may need to be a change in what you are reporting. It could be important to know the amount of money that is being put into investigating recycling. Secondly, as Keith suggested, we need to keep abreast of what the performance indicators are doing and whether they are truly related to necessary changes in policy, such as waste management. Are we adjusting the performance indicators to reflect policy changes? Have I made myself clear?
I think so.
Apart from the fact that she said Keith and not Kenny. [Laughter.]
Perhaps Alec Taylor would like to come in on that point and link it to policy.
There seems to be a bit of buck-passing going on here.
I think that it was what rugby players call a hospital pass.
We are working with units in the Executive all the time, asking what policy initiatives are coming through and in what direction we need to be going. One of the fundamental problems is that we establish a direction in summer 2000 for the financial year 2001-02 and it is January 2003 by the time that those figures are published. I often ask people whether they can tell me what the key issues for public reporting will be in two and half years' time—if they could, I would ask for the six lottery numbers for Saturday and retire now.
However, there are some areas, such as social work, where there have been several policy initiatives and we have worked closely with people in the department. I mentioned the new indicator in the social justice action plan, and that is a key example.
Last year, we introduced a new indicator for refuse recycling that asked about methods of refuse disposal and the proportion that goes to landfill. Cost is pertinent to that, because over time we will be able to identify those councils with high costs and to find out whether there is any correlation between cost and the proportion of waste that goes to recycling schemes.
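As an illustration of the kind of check described here, the sketch below correlates hypothetical per-council collection costs with recycling proportions; a real analysis would use the audited indicator data over several years.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-council figures: (net cost of refuse collection per
# household in GBP, percentage of household waste going to recycling).
observations = [(42.0, 4.0), (55.0, 9.0), (48.0, 6.5),
                (61.0, 12.0), (39.0, 3.0), (58.0, 10.5)]
costs = [cost for cost, _ in observations]
recycling = [pct for _, pct in observations]

print(f"correlation between collection cost and recycling proportion: "
      f"{pearson(costs, recycling):.2f}")
```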
Our difficulty is that we get one shot at this every year. Getting any trend information takes three years, so this is a long game. We must be patient, even though sometimes that is frustrating. Part of the frustration is that when we introduce a new indicator for inter-authority comparison, for the first year we have only one year's data. We cannot make comparisons over time; we can make only bland comment on which councils are and are not doing well.
We try desperately to avoid making the obvious judgment that there is a straight league table of 32 authorities and that they should be ranked as if they were part of a football league. Going beyond that requires several years of consistency. That takes me back to what I said earlier about the Local Government Act 1992, which requires consistency over time. In some areas there is fairly rapid policy change or new initiatives are built on to old policies. One has to strike a balance when deciding on performance indicators.
Keith Harding has touched on areas where constructive work may be done in future. One way in which a council can make a difference in running its finances is in the management and rescheduling of borrowings. By that I mean borrowing long, borrowing short and changing the portfolio. That can have a revenue impact on budgets. Do you have any plans to look into that form of management in the 32 Scottish local authorities? I accept the point that you make about having to build up the statistical evidence. However, would you consider providing examples of best value that might be useful to Scottish local authorities?
Given the complexity of the issue, I suspect that that might be hard to capture in one PI or even a small number of PIs. However, we might profitably do a VFM study that compares treasury management practices across the 32 councils in Scotland. As a new audit organisation, we will over the next three months consult local authorities and the range of stakeholders in local government about our priorities in using VFM resources over the next three years. We can include the management and rescheduling of borrowings as a topic in those discussions to gauge whether people feel that an in-depth examination of the issue would be beneficial.
I welcome what you say; that is a good, constructive reply.
In my next question I display my ignorance, as I ought to know the answer but do not. When you develop something, what mechanisms for reporting to the Scottish Executive have been established or may need to be established?
At the moment, we produce two types of reports. One is a report to individual councils that sets out how they measure up against the good practice that we have identified, the benchmarks of performance information and the areas in which councils have shortfalls to address. As a follow up to that, councils are required to agree an action plan with their local auditor.
At the national level, we produce a report that sets out what good practice is and how local government in Scotland as a whole compares with that. The Accounts Commission does not report to the Executive or the Parliament. Its power is the power of publicity. We would publish our report, aim to get media attention for that through the responsible use of press releases and briefings for journalists, and make it known that, through the audit process, we will revisit what has happened. Publicity is our key weapon in getting councils to take action.
When we were talking about refuse collection, I was reminded that when we last discussed that subject one of my colleagues concluded that, because there had been such a decline in the number of people involved in refuse collection, that would certainly not be a career for his children.
However, that is not the point that I wanted to make. To what extent might it be worth considering the effect on staff of meeting performance targets? Can that be measured by the number of premature retirements through illness? The other day, I talked to somebody who works in the public sector. It is clear that in many areas the pressure on people is intolerable, and I thought that that might be a valid area for inquiry.
That is an interesting point. For all of us, as public servants, the focus is increasingly on performance, and measurable performance is part of that. The value lies in prompting questions about how things are going. We can put the information into the public domain, so that councils can use it, and surround it with appropriate caveats. It is then up to councils to use that information sensitively.
We aim to examine good practice rather than cost cutting. Cost cutting is never the focus; the focus is the balance of economy, efficiency and effectiveness.
I understand that the motives are pure and honourable all round. However, there is a definite human cost, which someone should measure.
There is a lot of interest in replacing the ring fencing of Government money that is given to councils by measuring inputs, outputs and performance indicators. Do you think that you have satisfied the Executive or COSLA, or both, that you have made enough progress to provide a good set of proposals for measuring outputs? Where does the issue lie at the moment?
We genuinely think that we have made a great deal of progress over the past couple of years, mainly through working closely with COSLA, the professional associations and the Executive to ensure that we strike the right balance between challenge and support. It is right that councils are held to account for their performance and that hard questions are asked. However, the process should not be about developing sticks to hit people with; it is much more about helping people to identify what works and to make meaningful comparisons between themselves and others. We have moved forward a long way.
We are also keeping a close eye on the interesting work that COSLA and the Executive are doing on, for example, local outcome agreements, whereby, instead of expecting everybody to hit the same target, agreements between councils and the Executive would reflect local circumstances and priorities. We think that audit could play an important part in ensuring that the targets are reasonable and challenging enough without being impossible and that performance against them is reported accurately. I hope that the Society of Local Authority Chief Executives and Senior Managers would agree that we have gone a long way towards getting the balance right. The system will inevitably continue to develop and there is still a lot that we can do to get that rounded picture of performance to councils and bodies such as this committee.
Thank you for that. I have a couple of points to make in rounding up—this is not a question, so do not panic. The relevance of the repair indicators has been pointed out and Lesley Bloomer has said that she hopes to make them more user friendly and to standardise times and repairs according to national targets. In the context of Sylvia Jackson's point about the amount of data that councils must produce to give you the kind of information that you want, I think that streamlining those requirements in your talks with COSLA and the Scottish Executive will be useful.
Keith Harding's point about the fact that nobody appears to be considering the non-domestic business rate—whether it is being collected and, if not, why not—is important. The committee will certainly try to find out whether something can be done about that, whether by Audit Scotland or by someone else. The issue will not go away.
Michael McMahon and Sylvia Jackson asked about how we explain statistics: how one day things can be all right, but the next day the opposite can be shown.
This has been a useful hour and 10 minutes. You are now allowed to go off and have a coffee; we must meet the Executive. Thank you for coming along.