Item 3 is our annual review of the local government benchmarking system. We will have an oral evidence session with witnesses from the Society of Local Authority Chief Executives and Senior Managers Scotland and the Improvement Service on the progress of the local government benchmarking framework. As members know, we have taken a keen interest in the framework’s progress and have held annual evidence sessions on its development for the past few years.
I welcome Angela Leitch, who is the chair of SOLACE Scotland and is also the chief executive of East Lothian Council; Colin Mair, the chief executive of the Improvement Service; and Emily Lynch, senior project manager at the Improvement Service. Before we move to questions, would any of the witnesses like to make opening remarks?
We will be brief, because I hope that our submission is reasonably self-explanatory and raises the necessary questions. We thank the committee for its continuing interest in and support for developing the benchmarking framework, and I note with interest the comments that the committee received from its online survey of people’s responses to the framework.
The submission refreshes the point that the framework’s purpose is to create a range of high-level comparable measures across the 32 councils. We have explicitly used the language of “can openers”. The indicators pose, rather than answer, questions for councils. The language of “drill down” has also been used recurrently. If a council looks off trend on an indicator, that poses a question, and the council then drills down within the services and engages with communities to explore why that is the case and how improvements can be made.
We hope that the framework is one contribution to a range of improvement tools that councils are using. We reassure some of the people who responded to the online survey that the framework fits in with self-assessment using the EFQM model, for example—or variants of it—across the 32 councils and community planning partnerships.
Will you spell out what that acronym means?
I beg your pardon. EFQM is the European Foundation for Quality Management, which has developed a self-assessment model that is used in the private sector. An adapted version called the public service improvement framework is used in the public sector in Scotland.
The benchmarking framework continues to be developed. There are areas that need to be strengthened, including our understanding of children’s learning, growth and development across the pre-school period and throughout their primary and secondary schooling, and work is on-going to identify comparable measures on that.
My final point is that we are still operating within the framework of an Accounts Commission directive. The committee has discussed in the past the fact that the benchmarking framework replaced the statutory performance indicators that were previously laid down by the Accounts Commission.
However, the Accounts Commission places a directive on councils to report annually on the local government benchmarking framework data and to put the data in their local public performance reporting. Reporting is happening at council level and down to communities—the framework is there to support councils in having the data that they need to report to communities.
I ask Angela Leitch to say a bit about how the framework is being used at council level and with communities.
Good morning, everyone. I took the opportunity to speak to some of my colleagues at a local level in advance of coming here, and I think that it is fair to say that the benchmarking framework is now firmly embedded in the public performance reporting that local authorities do. All the indicators appear to some degree in each of the 32 reports that are produced annually. Beneath that is a range of further measures that support the benchmarking indicators. Those measures allow us to drill down into more thematic groups or into geographical areas. I have some examples that I can explain further later in the meeting.
As Colin Mair said, on a practical basis, benchmarking is used as part of the improvement toolkit that we have. We have a variety of examples that show how the benchmarking data is used to develop improvement plans and future service plans. The plans are then monitored through the benchmarking indicators.
Now that we have three years’ analysis of the data, we are seeing trends, and the trend data is becoming very useful in further engagement with our communities. That engagement includes discussion with communities about priorities, how we might change policy decisions and whether efficiencies could be made. It also includes making comparisons with elsewhere in Scotland—for which the family groupings are increasingly important—and the learning that we can individually take back to our local authority areas.
Thank you. As part of this exercise, we asked the public for their views on the framework. We had varied responses. I will start with one from a Fife environmentalist—it will probably come as no surprise that I am starting with that one. When asked for views on the benchmarking system, his immediate response was:
“My view is that it is mince. On a good day I might be generous and give it 2/10.”
How would you persuade the Fife environmentalist that the framework is working and that it is making a difference compared with the previous performance indicators that were used?
I will first take one of the points that the Fife environmentalist raised forcibly and succinctly. If the benchmarking framework was being presented as a measure of environmental outcomes, including carbon emissions and so on in Fife, it would not be achieving that. However, it is a framework of what councils call environmental services—waste collection, street management and so on.
If the Fife environmentalist wants a statement of outcome, a parallel bit of work is going on with community planning partnerships—which represent all the public agencies in an area—to get to a set of fairly standard outcome statements. The statements will be published annually and will allow the public to look at how things are changing in their area.
The local government benchmarking framework is intended to help people who run services to compare their services with others elsewhere. The Fife environmentalist is entirely right that it is not measuring the type of environmental outcome for Fife that he wishes to see, but I reassure him that a parallel bit of work is going on to do that.
When councils seem to be significantly out of kilter with how other councils like them are performing, the framework is being used to drill down quite hard and ask why that is. For example, is it a fault in the systems, or is it something to do with delivery mechanisms? The questioning then leads into improvement planning and improvement delivery. We can give examples—Angela Leitch can talk a bit more about them—in which the process is being used in practical ways to drive forward change in services.
The core point about outcomes that the Fife environmentalist raises is valid and is being addressed in a parallel stream of work.
Before I bring in Ms Leitch, Alex Rowley has a supplementary question.
My question is on the point about what is being measured and whether it is necessarily the right thing. I remember that, when the benchmarking figures came out a couple of years ago, Fife Council’s costs for children in care were much higher than those of similar-sized authorities, such as South Lanarkshire Council. It was necessary to ask what sat behind the difference—if I remember correctly, the Fife homes were much smaller than the South Lanarkshire homes. What sat behind that? Did the smaller size of home in Fife mean that children in care had more chance of succeeding and that the care was of better quality?
I am trying to get to what you are measuring and whether it is meaningful. Are you measuring like for like between authorities?
I think that we are. Benchmarking is very much a can opener.
You gave the example of children’s services. We use benchmarking extensively and the key thing is that services are self-aware and know why the differences exist.
We should expect differences, because each of our local authority areas is different and we have different practices. Elected members are elected on a manifesto and determine different local policies. It is on the basis of those policies that some of the practices in local authorities are undertaken.
On the example of children’s services, I would expect deeper analysis to look at whether, through having smaller homes and a higher staff ratio, the outcomes for the young people are any better. If not, what can we do to improve—is a policy or practice change needed? That is where the wider improvement agenda comes in, so that the issue is looked at not just through the raw data but perhaps through engaging with other professionals. What we could do differently, particularly on attainment, is crucial.
We have a lot of examples in which, on the face of it, the raw data shows an individual authority not performing as well as the national average or other comparable local authorities. When we look beneath the data, however, we can understand the reason. It is then up to individual councils to determine whether to continue with the relevant practice.
In my area, we spend more than the national average on our roads services. One reason for that is that we are investing not just in resurfacing but in drainage and kerbing, particularly on rural roads. We know that that work will help the resurfacing to last longer and make the investment better value. It is the self-awareness that is crucial.
We know from previous evidence sessions that authorities were to caveat all the metrics, as you did in the case of East Lothian’s roads. Is that happening? Do the public understand that those caveats exist and that local decisions have led some authorities to do things differently from others?
We worked with councils recently to look at how they could provide the information to the local public and how they could provide the caveats and the local context. It was identified as critical that councils should not put the data out there without some supporting narrative to help people understand local priorities, the council’s starting position and the policy objectives that the council was pursuing. Councils are working to improve how they include the information in their reports in order to provide those caveats and that narrative.
We have an event with councils next week to look at the good practice that is emerging on how data is being reported to the public, the feedback that we are getting from the public on that reporting and how we can continue to improve that. We are working with Audit Scotland, which is undertaking its reviews of public performance reporting, to build on the findings from those reviews so that they can shape how councils address the issue.
10:15
Does Alex Rowley want to come back on that point?
Convener, it might be useful, following that event, if the committee were supplied with information on how councils are using the data and reporting it to the public, so that we could use it for a further discussion.
We can supply that.
When the general numbers are being published and discussed, as they always are, it would be ideal if the reasons behind, for example, the Fife data on children’s services were provided to local authorities, as they might choose to make the same decisions. We will come back to outcomes shortly.
In the run-up to the formation of the new framework, we discussed at length some of the previous indicators—the number of library books borrowed per 1,000 population, for example, did not represent all the services that libraries provide in today’s world. We have a statement from Elma Murray, the chief executive of North Ayrshire Council, about the correct metrics or indicators. She says that
“There are some indicators that would be more appropriately measured by alternative means. For example, one of the indicators is the cost of parks and open spaces per 1,000 population—would this metric be better suited to acres/hectares of parks and open spaces?”
Is the Improvement Service continuing to look at each indicator and to modernise the measures, which can become irrelevant, sometimes over short spaces of time?
Yes—we are. That has been a key priority for the programme and it continues to be a priority.
We have identified limitations in some of the measures, and there are still gaps in the framework. We have been working with all 32 authorities to identify limitations and concerns over the robustness of the measures and where the key gaps are.
Priorities for the period ahead are to improve the guidance on the financial measures to ensure consistency; to strengthen the indicator on gender equality, because it focuses on women in the top 5 per cent of positions, although we are interested in gender equality across the workforce; and, as Colin Mair mentioned, to strengthen the measures on outcomes for children in pre-school and primary education, because we have cost measures but we do not have an outcome measure. We are working with the Association of Directors of Education in Scotland and other professional and educational authorities to look at how we address that point. We also want to strengthen the measures on older people’s and adult social care, because we recognise that that area of the framework requires to be improved.
A specific example that we are working on with councils is whether a net measure of cost in relation to sport, culture and leisure would be more relevant for authorities than the current gross measure. Directors of finance have identified that as a piece of work that they would like to take forward.
There are certainly areas that we are looking to improve.
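To make the denominator question above concrete, here is a minimal sketch, using entirely hypothetical councils and figures rather than real framework data, of how normalising the same parks spend per 1,000 population or per hectare can rank authorities differently.

```python
# Illustrative only: hypothetical councils and figures, not real LGBF data.
# Shows how normalising the same parks spend by population versus by
# hectares of open space can reorder councils in a comparison.

councils = {
    # name: (annual parks spend in GBP, population, hectares of open space)
    "Council A": (2_000_000, 100_000, 400),
    "Council B": (1_500_000, 60_000, 600),
}

for name, (spend, population, hectares) in councils.items():
    per_1000_pop = spend / (population / 1000)  # current LGBF-style denominator
    per_hectare = spend / hectares              # alternative raised in evidence
    print(f"{name}: £{per_1000_pop:,.0f} per 1,000 population, "
          f"£{per_hectare:,.0f} per hectare")
```

On these made-up numbers, Council A looks cheaper per head of population but dearer per hectare than Council B, which is precisely why the choice of denominator matters.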
Good morning. There are a number of issues that I would like to follow up on in relation to the submission from SOLACE and the Improvement Service and the issues raised by the public in response to the committee’s call for views.
What is the engagement of the Convention of Scottish Local Authorities in the process? From what I have heard so far, it all seems to be officer led. Ms Lynch referred to an event next week. Who has been invited to that? Is it purely officers, or are elected members invited to such events as well?
We run a number of events across the year for a range of audiences. The event next week is specifically for officers. However, we also run events for elected members—we ran four different events for elected members last year, and we are scheduled to offer four regional events for elected members this year. We recognise the importance of elected members.
COSLA is on our project board as the vice chair, so it provides an on-going steer and is involved in the development of the programme. We made a presentation at the recent leaders’ meeting in January to ensure that all the elected member leaders were up to date with the programme developments and understood the key themes that are emerging from the project. We prioritise that area in the work programme.
Most local authorities will embed this work within their development programme for elected members. Only yesterday, our benchmarking information went up to our policy and performance committee. The elected members spent about an hour scrutinising the data that came from the benchmarking, and that is where the narrative is as well.
At a local level, having the benchmarking information embedded in our public performance reporting in a variety of ways allows members to scrutinise it and to make the challenge to officers that you would expect.
You said that the council committee—not all the councillors—spent an hour scrutinising the information.
Yes.
How long does it take the officials to scrutinise the same information? I am sure that it is longer than an hour.
Scrutinising the benchmarking information is part and parcel of the way that we look at improving services. We use it with a range of other measures. Part of the benchmarking is satisfaction responses. Most local authorities take that to a much more refined level—in Fife, there is the people panel, and a lot of other local authorities have citizens panels. Engagement with the community is crucial, and the time that is spent is part of the whole improvement agenda.
Increasingly, we are looking at aligning the feedback from complaints or compliments with areas where we want to improve or where there are policy issues that we need to think about as a council. The process is about not just engagement with people who are interested in that type of engagement, but using the passive responses from individuals concerning our services. We tie their responses back into performance information in a way that helps us to think about how we can improve, how we can make better use of our resources and how we can provide a better service to the people of our communities.
Are you confident that all elected members in local government are aware of the benchmarking process and the criteria that are used in their local authority for measuring service delivery? Ms Leitch mentioned that councillors are elected on manifestos. How does that tie in with the benchmarking criteria and a local authority’s performance, and how does it stand against the manifesto on which the party that forms the administration was elected?
I can give you a practical example from my area. Our elected members have prioritised the environment of East Lothian—parks and open spaces. The convener mentioned one of the responses to the committee on the subject. We spend the most out of all the local authorities on parks and open spaces, and that is a policy decision. That said, we are looking at whether we want to continue with that level of expenditure. First and foremost, that involves a process of engagement with our elected members, but increasingly it includes an engagement process with the electorate—the people of East Lothian.
As a result of the benchmarking information, we have taken a selection of people from our citizens panel and we are looking at a citizen-led review of parks and open spaces to help to inform the policy decision. That is an example of where policy, benchmarking and engagement are starting to come together.
I am very confident that councillors across the council are familiar with their performance framework and how the benchmarking framework is embedded in it. If you are asking me whether all 1,200 councillors in Scotland are entirely aware of the benchmarking framework as it would appear in our overview report or on the website, my answer is, “Very possibly not.” However, that would not bother me very much, because the point of the benchmarking framework is to support a council’s local performance scrutiny; it should not be treated separately from what the council is doing as regards more scrutiny and improvement.
I am confident on John Wilson’s latter point; I could not honestly give an accurate figure for the first point.
I will add two technical points. One thing that the project board that oversees the programme looks at regularly is the extent to which individual councils include the information within reports to elected members, so that is an on-going focus. Also, we were asked by councils last year to deliver a training programme for officers to support them in developing awareness sessions and approaches with their local elected members, in order to support elected members to engage with and interpret the information. We delivered that programme earlier this year.
You can deliver training to officers that they can take to the elected members, but the issue is whether those elected members feel that there is any value in participating in those training events. I still get information regarding training events that are held in the local authority that I was a member of, and sometimes only half a dozen elected members will turn up to particular training events.
Benchmarking is not just about measures within the local authority. My understanding is that the benchmarking process and framework were established so that each local authority could compare its performance with other local authorities and within the families that have been identified. How do we make sure that that type of measurement against delivery across the families or other local authorities of a similar nature or size is taking place so that local authorities understand how they can improve? My understanding is that part of the aim of benchmarking is to try to deliver things better through using examples of best practice from other local authorities.
I can say a little about the programme of family group work. As you rightly say, it is of particular interest to elected members when we present and share the information at a high level. When we are able to share further, richer detail about the information that is emerging from the family groups, that makes the data far more relevant. Family groups are being established in areas such as services for looked-after children—which we talked about earlier—waste management, council tax and sports services.
All 32 councils are working within those families in order to come together to use the pieces of high-level data as can openers and then drill down to try to understand better what is behind the differences between those councils and what opportunities there are for them to learn from each other. As we have discussed, some of the differences are about policy priorities, local decisions and factors to do with the local context. However, some of the differences in performance are about new or innovative practice or different ways of working. That comparison work is on-going and we continue to work with councils to roll it out across other areas of the framework. Examples of good practice are already being highlighted within those groups.
Convener, there are a thousand more questions that I would like to ask, but I think that I have taken up enough time.
You may get the opportunity to ask more questions later. Clare Adamson is next.
10:30
As someone who has served as a councillor, I am interested in the differences in the areas that are benchmarked. I can easily see how a financial comparison can be done of the delivery of services such as bin uplifts, lighting and so on, but my colleague Alex Rowley talked about a more intrinsic exercise that is more about outcomes. I draw your attention to the submission from Museums Galleries Scotland and the point about the focus being on financial indicators and visitor numbers rather than anything to do with the contribution to wellbeing.
In the context of the Government’s strategic objectives from 2007—a Scotland that is wealthier and fairer, safer and stronger, healthier, greener and smarter—if we are having to look to other reports and other pieces of work to get the full picture, is there not something fundamentally wrong with how we are approaching the benchmarking process? Should we not be able to do all of that within this process?
On the point about outcomes that our colleagues from Museums Galleries Scotland raise, for us, purely and simply, benchmarks are measures that pose questions for people. We are satisfied that the financial measures are standardised and accurate. The footfall measures are taken on exactly the same basis as National Galleries of Scotland takes its measures, so if they were wrong for local government, they would be wrong for everyone else as well. On that model, a lot of the figures for attendance at galleries and museums would simply emerge as being wrong. We are satisfied that the measurement is right.
What do we see the point of the measurement as being? If some museums have managed to increase their footfall significantly—most of the change is increased use rather than decreased expenditure; as a result, the unit cost per person has come down—that is an important success story. How have they managed to do that? In some cases, it will be because a decision was taken to run free bus services for certain communities so that they could access the council’s art resources. On the basis that it would cost people on low incomes quite a lot if they had to pay to catch a bus to get there, consideration was given to what could be done on transport connections.
We see the benchmark as posing a question for people. For a small number of museums, that question is posed quite starkly, because there is a genuine issue with footfall declining quite sharply. Why is that? How can that be reversed?
On the outcomes for museums and galleries, I am involved in a separate piece of work with the directors of culture and leisure in Scotland that is entirely about how we begin to better demonstrate the value that we believe comes from participation in music, sport and the arts. I certainly believe that those activities have value, but even when we look at evaluative studies, we are still quite clunky when it comes to defining and measuring outcomes in that area. There is a lot of assertion, but if we poke a stick at it, it often dissolves fairly quickly. There is quite a lot of work to be done in such areas to arrive at a much clearer understanding of outcomes.
An issue that the committee has raised on previous occasions is who the outcomes are for. If we are running art galleries and museums in Scotland on an uncharged basis, we presumably want to make sure that the whole community benefits from that, and not just some sections of it. We need to start to find out more about the segmentation of our audience. Are museums and galleries being used disproportionately by some communities and not at all by others? If so, what are we going to do about it? That is the drill-down work that Angela Leitch mentioned.
We know that some councils are looking in some detail at who is using certain facilities and who is not, and what they could do to get the people who are not using them to use them. It is axiomatic, I assume, that art galleries and museums have value, but we need to demonstrate their value and to ask whether we are getting the footfall that we hoped that we would get from funding universal free access to them. That is the question that the benchmark poses, and we need to go on and answer it. I absolutely accept the point that Museums Galleries Scotland makes.
This is where the can-opener phrase comes into its own. I am interested in the observations that are made in the submissions. In my local authority, we would take that information back into our improvement framework, which is the assessment of how well the service is performing. That work is done by staff and not by managers. They gather the evidence—benchmarking is part of that—and they do the comparisons with others to determine how we compare.
That process is then scrutinised, to mixed extents, with different publics. People engage because they are interested in a particular subject area, so the next stage is to take the information to individuals who are particularly interested in a certain topic and service. In addition, we do further scrutiny of the improvement plan with our local area network, which includes all the scrutiny partners. They help us to look at the improvement journey, and benchmarking is very much a part of that.
The approach is being embraced across the country in relation to housing. I have with me our “Landlord Performance Report to Tenants 2013/14”, which we put out to tenants and residents associations. It contains the benchmarking data, but it also drills down into far more detail. The detail in such reports has been developed by tenants and residents, who have told local authorities about the types of information that they feel are of value in demonstrating the worth or the performance of particular services.
The housing service has probably done an awful lot more than some services, and it is a good model that we can adopt. That links to the committee’s questions about the engagement process and the journey that we are on.
My concern is that you may well go through all of that process in the housing sector and find that you are comparable with the other authorities in the group that your authority is in, but if you went to the families involved and asked them whether their housing has improved, would there be evidence that outcomes have changed? Even if the financial targets, the process and the delivery are similar, you may not necessarily be materially changing the outcome for the people who are affected. Where is that information captured?
I will use housing as an example, although it is not the only one. The benchmarking families are useful. I can illustrate that with our performance report on housing, in which we compare our performance with that of others. Where we do not perform as well, we go out and ask why, according to the indicators, others appear to be working more effectively and to have better outcomes for people. That is where the analysis takes place, and it is where both the narrative and the whole improvement journey are important.
As Emily Lynch said, the process is about taking ideas from others and adapting them to your circumstances so that improvement becomes much more embedded. That is where the workforce is crucial, because it is not about a certain few who understand the figures. It is about the workforce being committed to the journey of improvement and linking everything back to better outcomes for local people.
Good morning. I speak as a former member of the Public Audit Committee and as someone who was a quality assurance manager at some point in my previous career. For many years, the Public Audit Committee was interested in the follow-up from the many good recommendations that come out of reports and benchmarking documents. The part of the circle that is often not completed is the part about who does the follow-up. I imagine that the benchmarking reports tell you about what you have done and also look forward to what you might want to do. Who reports on whether your good recommendations, advice and so on are taken up by authorities across the board? Furthermore, how does the public see that being done?
On an individual basis, it comes back to embedding improvement as part of the culture. When the annual data from benchmarking comes out, local authorities sit down with their administrations and with various scrutiny committees to establish which areas they need to look at. We also need to tie that back into each local authority’s priorities. At different stages, authorities will have different priorities to which they are committed through their single outcome agreement or their plan. It is at that level that they will look at how other councils are doing.
Attainment is a major priority for my local authority, and we have a big piece of work looking at how other councils are improving attainment, particularly for people in disadvantaged areas. West Dunbartonshire is doing particularly well on that, so we are looking to learn from it—although not exclusively—and the practices that it has introduced to equalise the attainment levels across the local authority.
The follow-up work very much happens at a local authority level.
If someone picked up a report from last year or two years ago and said, “These are great recommendations—were they done?”, how would they find out whether a council had taken forward the recommendations?
Our overview report is a report on data and the questions and issues that the data poses. That is all that it is—it does not make recommendations at all.
Improvement plans at council level would follow on from that. Such plans are tracked through the improvement process and scrutinised by audit and scrutiny committees in councils. If an improvement is agreed, there is a process for tracking whether it is being delivered over time. For example, I am working with a council which, similarly to Angela Leitch’s council, is looking to see how it can improve and prioritise educational attainment for kids from the most disadvantaged backgrounds. That work started from the benchmarking framework. Although that showed that the council was improving, it was not doing so as fast as others. That led it to engage with a range of other councils.
The council now has in place a set of improvement plans with its schools, its community learning and development people, its home-school link people and its employability people to say, “We are now going to shift this onwards sharply.” That will be built into the performance appraisal of headteachers, who will have targets based on what is expected of them given their school’s composition. That will be used to judge the education department, and it will be routinely reported on.
I reassure you that there are mechanisms in place. When a council moves from the high-level benchmark comparison to the question that that poses and then to the improvement action, that is built into quite formalised processes for councils to take forward improvements and report on them.
Can I stop you there, Mr Mair? We have heard from Ms Leitch about embedding improvement. We have all heard, time and time again, about continuous improvement. You mentioned putting targets into headteachers’ appraisal systems. We have found that the front-line staff delivering the services often know what the improvements should be, but they are the least involved in terms of the outcomes that we all require. How are front-line staff involved in the benchmarking? Are the benchmarks communicated to front-line staff? Are they asked for their opinions on how to make improvements? All we are hearing about is top-level stuff.
As you would expect, practice varies. I will use Western Isles Council as an example. It has a fairly innovative practice of producing an annual performance report, which is based on—
Can I stop you there, Ms Leitch? Western Isles Council is different in some regards. It is a small council and, quite frankly, the chief executive is likely to know the vast bulk of the staff and he is particularly approachable. We saw that elsewhere when we went into smaller local authorities, where the chief executive knows everyone and everyone knows the chief executive. What is the situation in councils such as North Lanarkshire, Glasgow and Aberdeen?
10:45
A key feature, which Colin Mair touched on, is embedding the self-assessment or self-evaluation process. The process works particularly well when staff at all levels in the organisation are involved and an understanding of performance is very much at the heart of what they do. We would certainly encourage that approach. Obviously, we give a lot of feedback to staff. Local authorities adopt different approaches, such as the lean, six sigma and vanguard approaches. A variety of improvement mechanisms are used.
Equally, a number of measures are put in place to engage people. For example, Glasgow City Council has been going through a two-year programme of engagement—I do not know whether it has completed it yet—that has included all its front-line staff. That is about trying to explain what the corporate objectives are and giving people at the front line an opportunity to feed back on improvements and how they think things could be done differently. A variety of techniques are being used.
Willie, I am sorry that I interrupted your questions.
No—thanks very much. I was interested in that response.
I want to talk about the framework in general. In answer to a question from the convener, Emily Lynch talked about modernising elements of the framework. Generally speaking, how do you see the framework developing as a result of the provisions of the Community Empowerment (Scotland) Bill? Will the framework evolve significantly as a result of that? Will the public at large be able to influence and, indeed, determine what is in the framework? Will there be measures that are meaningful to them? Will they be able to shape the frameworks rather than have the frameworks done by the local authorities?
There are two elements. First, Willie Coffey talked about closing the loop. We are very much focusing on how we use information to understand what is happening in local communities and to engage more effectively with them. As we become more successful in doing that, that will help us to shape and refine the measures and the way that we present them. We very much see that as a loop. Once we have more effective engagement, that will help us to understand what measures are important and how information should be shared.
The other thing is to reiterate the point that Colin Mair made earlier about the development of the community planning approach, which very much has at its heart ensuring that information helps us to understand what is happening in local communities, for example, where significant inequalities exist across or within local communities; engaging more effectively with those local communities to understand what is behind that; and helping to ensure that local communities shape the solutions.
We plan to work with community planning partners and local communities to shape and develop that approach in the year ahead. The intention is absolutely that local communities will shape the measures that are important to them and—this is important—shape the way in which information should be shared with and reported to them, so that they can engage with it and make sense of it. That is certainly an intention for the year ahead.
I worry that we may have overhyped the framework to members. Some of the questions imply that it does things that it does not do. It is no substitute for all the other good things that Angela Leitch talked about. Robust self-assessment needs to be built into the way that our councils are run, and robust and properly resourced community engagement and development need to be part of what councils do.
The framework does what we have said on the tin, but it does not do anything more than that. It is one tool, but it is no substitute at all for all the other improvement tools and mechanisms that a council uses, and it needs to be linked to them.
Willie Coffey’s points are utterly germane. Is the underlying improvement planning sufficiently robust? Is it sufficiently engaged with the communities on whose behalf we are trying to improve things and do they understand what we are trying to do? I reassure members that every council has other mechanisms for collecting data from communities as part of service reviews and so on.
My final point is in line with what our colleagues in Audit Scotland have said. It is clear that councils publish the data—they have to do so under a directive from the Accounts Commission for Scotland. Councils’ local auditors will look at whether to respond to the data that they have published. If they are off the mark with other councils in their family group in some respect, the auditors are then perfectly entitled to say, “Fine, but what are you going to do about it?” The statutory audit function plays into the process, too. It is not just a free-floating, voluntaristic approach; it exists within a framework in which councils are statutorily audited for best value and improvement as a routine part of how they are dealt with.
We understand the linkages between the framework and the other bits and pieces of improvements. Most members of the committee, excepting Mr Buchanan, have been councillors, some of them very recently—some members may still be councillors, if I remember rightly.
Not any more.
I stand corrected.
There was a lot of hype about the framework, and one of the key things for us is to ensure that hype becomes improvement. That is one reason why you are going to come back to the committee year on year to talk about the framework.
I have one last question about the issue of family groupings. When I was a local councillor, I well remember the family grouping that East Ayrshire was part of, but from time to time I wondered why we could not get a comparison between an activity in East Ayrshire and an activity in Glasgow—we were never part of the Glasgow family because of its size.
Is the system now developed enough to allow elected members, officials or the public to choose which comparators to group? I know that software would be needed to do that, rather than a paper report, but can we do that kind of thing? Can we look at different deprivation indexes around Scotland and group them together and explore comparators for ourselves?
Absolutely. Refining the family groups is one of the priorities that we have identified in improving the framework.
We had to make a start, and the original groups were agreed as a starting point. They provided a practical structure, but there is also a similarity in the challenges that the councils in the groups face. As the families have started to work together, we have realised that the family groups are not always right for all of the subjects that we are looking at. Colin Mair might want to talk about education, as we have been looking at that.
We are keen to work with councils to identify better groupings or ways of arranging the groupings that would work better. Ultimately they are simply a structure to support councils to come together and share and learn; there is an opportunity to ensure that they are more refined so that they are more appropriate.
There is a visualisation tool—I hope that I am using the right term—that would allow one to explore and make comparisons between any councils across the whole range of indicators, if one wanted to make up one’s own sense of what a family should be for certain purposes.
Emily Lynch’s point about education is interesting and is similar to the point that the convener made earlier about whether we are benchmarking our past or trying to move to our future. If we put every council with a high level of deprivation together because we know that it affects education, are we building in an element of self-fulfilling prophecy by saying that we do not really expect people from deprived backgrounds to perform as well as others? The challenge is how we help them to perform better. We should not simply create local authority families and stick with them and so imply that, if a council has deprivation, its education results should necessarily be worse than those of other local authorities.
Angela Leitch alluded to West Dunbartonshire, but there are other councils that have made spectacular improvement in the performance of kids from disadvantaged backgrounds over the four-year trend period. Not all of those would be regarded as very disadvantaged councils, but they are doing well with disadvantaged communities. There are things there for bigger councils that have a lot of disadvantage to learn from. Mr Coffey is right that we should not be too rigid about the family boundaries.
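The kind of bespoke comparator grouping discussed here can be sketched simply: given a single deprivation score per council, an ad hoc "family" is just the k councils with the closest scores. All council names and scores below are hypothetical, and a real tool would compare several dimensions, not one.

```python
# Minimal sketch of user-chosen comparator groups, assuming one
# hypothetical deprivation score per council (higher = more deprived).

deprivation = {
    "Council A": 0.32, "Council B": 0.18, "Council C": 0.29,
    "Council D": 0.41, "Council E": 0.25,
}

def family_for(council, k=3):
    """Return the k councils whose scores are closest to the given council's."""
    base = deprivation[council]
    others = [(abs(score - base), name)
              for name, score in deprivation.items() if name != council]
    return [name for _, name in sorted(others)[:k]]

print(family_for("Council A"))  # ['Council C', 'Council E', 'Council D']
```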
I am a big fan of the approach. I was a councillor when the first information on benchmarking came out and I remember being excited as I tried to plough my way through the information that was available then. To go back to John Wilson’s point, benchmarking information can empower councillors to a great extent by enabling them to look across the country. I had certainly never felt as empowered before. That was in the early days, and a lot of the data was raw. I interpreted it one way and I was told that it meant something different. I assume that the information has improved, so it would be good to get an update on that.
Secondly, the submission from Maryhill and Summerston community council says that it would help to offer a
“wider range of performance indicators for comparison, eg departmental costs, number of successful appeals to planning decisions ... etc.”
It highlights the indicator “How clean is my street” and suggests that the detail should go down to community council level.
Going back to what Colin Mair said, I accept that benchmarking does exactly what it says on the tin and that it is not about the Improvement Service pulling things together for every community. However, have you done an analysis of how councils use the benchmarking to improve services and—to bring in Willie Coffey’s point about the Community Empowerment (Scotland) Bill—get the information down to a meaningful level in communities so that they can compare themselves with each other as well as see the wider performance across Scotland?
That is absolutely key. There is a real push towards what we call place-based approaches to service delivery rather than just assuming that one size fits all regardless of the size of the local authority. I and others across the country have been using and expanding the benchmarking data to see what it means in local areas. Most local authorities have some type of forum, such as a partnership or an area committee, and the detail is now broken down on that basis, although perhaps not to community council level. However, it is certainly broken down to ward or area level.
The document that members can see in my hand has been furnished to our area partnership, and the partnership is now setting the priorities for the local area on the basis of the data in it. The process is to use the benchmarking as a can opener, then distil it down with communities so that they can make sound judgments about where they would like to see our services being focused.
It would be good to get a copy of that, convener.
I would be happy to provide it.
That would be brilliant. Thank you.
We present data at a very high level for a whole council. For example, for the performance of children in secondary 4 or 5, there is an average for the council area as a whole, but the variation around the average will be staggering, and we need to take that right down to the community level.
We have been doing work with community councils across Scotland in the past year. Frankly, I do not think that they are well supported at present, but we can do a lot with routine public domain statistics to get them down to the community council level. We have created something called Viewstat—I am happy to send the link for it to the committee—which will allow any public statistics to be taken down to the level of communities of 600 to 1,000 people. The difficulty is that the geographies that are used for public statistics will not always tidily correspond to a community council’s sense of its community’s identity. However, the statistics will at least allow someone to take all the education results for an authority and find out what happened in the two streets next door to them. People can pull up the information and look at the pattern over time.
We have put 10 years’ worth of data into Viewstat so that, if people want to look at trends, they can do so. It is by no means perfect, but its design standard was that it needed to work for someone who was at least able to book a Ryanair ticket. If you can book a Ryanair ticket, you can use the damn thing. It is relatively simple, but it does at least allow much more ready access to public data.
The Viewstat data sits below the benchmarking data. However, Mr Rowley’s point is valid because drilling down really matters, as the real action happens in quite small communities—that is where lives are varying and are getting better, worse or whatever. We need to be able to get our analysis down to that level. As Angela Leitch said, most councils now have mechanisms for not only council planning but, increasingly, community planning that goes down to neighbourhood level—Edinburgh is a good example of that—and even to sub-neighbourhood level in some cases.
We need to link the benchmarking at one end to that pattern of engagement and working with communities at the other end to get value out of the Community Empowerment (Scotland) Bill. It will be helpful, because it will give communities the right to challenge us on whether we furnish them with enough information. In a way, one of the rights that the bill will confer on a community will be to allow it to say, “You aren’t actually achieving the outcomes that we want in terms of how informed we are—sort it.” Public authorities, whether health boards, councils or whoever, will have to ensure that they are satisfying communities in terms of what they feel they need to know and what they feel they want. Therefore, the Community Empowerment (Scotland) Bill has an important role in driving forward the agenda on an informed public alongside that on empowered communities.
11:00
My question ties into the issue of how we better meet the needs of every community. In its written evidence, CLEAR Fife says that there is a democratic deficit in that, in many communities, there is no opportunity for residents to be consulted, and it wonders whether it would be a good idea to develop an indicator for the level and quality of local authority consultation and, indeed, local councillors’ activity. Will the witnesses comment on that?
CLEAR is right. We used the household survey data to get a measure of residents’ experience of local services. Because the Scottish household survey is an all-Scotland survey, it gives us the information; however, it is thin, and the chances of an individual being involved in it in any given year are utterly negligible. If that is what we mean by a democratic deficit, there is no question but that one exists. Councils have measures such as citizens panels and residents surveys, but I accept that we could and should do better.
We have tried to outline the costs of powering up the household survey and allowing a lot more people to be involved in it, because we need to get board approval to do that. It is a high-quality survey, but it has quite a narrow base of about 12,000 people throughout Scotland. In order to get it up to a decent enough level from a community point of view that would allow us to disaggregate the data a bit, we are talking probably about a base of 1 million to 1.5 million people. The problem is that we never have that many.
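The scaling behind those figures can be checked with rough arithmetic. Assuming around 7,000 community-scale areas in Scotland and a floor of roughly 200 responses per area for usable estimates (both assumptions for illustration, not figures from the evidence), the required base lands in the 1 million to 1.5 million range cited.

```python
# Back-of-envelope check of the survey base discussed above.
# Assumed figures: ~7,000 community-scale areas in Scotland and ~200
# responses per area as a plausible floor for usable local estimates.

current_base = 12_000        # Scottish household survey base cited in evidence
communities = 7_000          # assumption, not from the evidence
responses_needed = 200       # assumption, not from the evidence

required_base = communities * responses_needed
print(f"Current base per community: ~{current_base / communities:.1f} responses")
print(f"Base needed for community-level estimates: ~{required_base:,}")
```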
Many local authorities, community planning partnerships and other bodies regularly carry out similar—pretty comprehensive—surveys. Why can we not co-operate and bring them together instead of reinventing the wheel and creating something much bigger? There are councils with citizens juries that are regularly in contact with 1,000-plus people in their areas. Why can we not collate that information?
I was picking up on the point that was raised about getting down to community level. There is nothing that I can do with information from a citizens jury of 1,000 people in Glasgow, because that will be a statement about Glasgow as a whole, not a particular community in Glasgow. Indeed, the jury will be weighted to represent Glasgow’s whole population, not the population of any community in Glasgow.
I took the question that I was asked to be about how we get much closer to actual communities and allow many more of them to express their opinions about public services. Taking an annual sample of 1,000 might be perfectly decent from a representative statistical point of view but if that is all that we do, it is pretty thin engagement with the almost 500,000 people who live in Glasgow.
I take your point that a range of things is happening, but our problem is that they are not standardised. If you asked me whether I could benchmark them throughout Scotland, my answer would be no, because people use completely different instruments that they have evolved locally and which suit their local purposes and their members’ priorities. The merit of the household survey is that it is standardised throughout Scotland and is already part of the Government’s commissioning. The idea was that we could piggyback on it.
We have considered linking up the work that people already do at local level. However, one piece of pushback is that, if people have an instrument that they think works particularly well for their communities, they ask why they should sacrifice it for a standard instrument that allows us to take a measurement throughout Scotland. They ask us not to interfere with local practice and tell us that they are not going to sacrifice any practice that works in engaging communities so that we can get a better measurement.
There is always a tension between the ways in which we can get to the information. However, I take a sympathetic view. We need to get much closer to communities with regard to engagement and information and ensure that we are aware of different communities’ different views.
That was extremely helpful. Thank you.
I will take some quick-fire questions, because there is some ground that we have not covered. I am hoping for quick-fire questions and quick-fire answers.
I welcome the comments about the household survey. I remember having the same debate about extending the household survey with Scottish Government officials more than 10 years ago, so I wish you the best of luck, Mr Mair.
You referred to visualisation and the Viewstat criteria. I have to say that I find it very difficult to navigate around some local government websites. I think that I can book a Ryanair, easyJet or Jet2 ticket, but—
I wish I had not made that comment earlier.
Some local government websites are tortuous for those who are trying to get some information. How can we improve things to allow people to view the local stats that you are referring to?
That was not very quick-fire, Mr Wilson.
I am sorry, convener.
A quick-fire answer, please, Mr Mair.
If you go on to the benchmarking website, Mr Wilson, you will see that we have done dashboards for each council. They are dead simple, and they allow you to make a comparison over time as well as comparisons between councils. It is literally a dashboard system, and we hope that that makes the information more accessible.
We would certainly welcome your feedback. We get feedback from the public who use the websites, and they tell us what they like and—quite forcefully—what they do not like about the way that we have designed things. We would welcome your contribution, too. All councils now have similar dashboards, and we are asking them to put the dashboards on their websites to make it easier for the public to get information quickly.
As you will know, the committee has spent some time on community empowerment, and we have produced a report about trying to engage with local communities. The responses that we got from our consultation with the public clearly showed that community councils felt that they were not being engaged with. They were not aware of the benchmarking process. How can we do this better?
It is perhaps a matter of differentiating between the terminology and the information. The notion of benchmarking is a bit like community planning, and I am not sure that all community groups really associate themselves with it. It is a different matter, however, when we discuss engagement with them or give them information.
There is a groundswell of support for getting relevant information out to the appropriate individuals, particularly those on community councils, without swamping them. The feedback that I have received suggests that it is the easiest thing in the world for some of our services just to throw things out to community councils, but they can be swamped with information and they find it difficult to differentiate between what is really important and of value to them and what they can ignore. We need to do a bit of work to ensure that we push relevant information to community councils at appropriate points.
We have previously heard how Scottish Water improved after it brought in a new benchmarking regime and started making comparisons with other bodies elsewhere. To what extent are you now using external comparators in the rest of these islands or the rest of the world?
Colin Mair will probably be able to supply more detail about international comparisons.
But he looked at you.
I know. [Laughter.] I remember speaking to a couple of colleagues earlier about this. Glasgow City Council would like to include in this approach more detail about cities across the United Kingdom, and with our steering group, we are working to find out what information we can include from those cities. It will be at the drill-down stage and the family group stage, rather than within the framework, that such discussions would be stimulated and supported. That broader approach will cover the UK.
I hope that that will be considered. As I have said, the committee has heard evidence that the vast bulk of improvement at Scottish Water happened when it started comparing what it was doing with other bodies outwith the UK.
We have also heard that it has been very difficult to put certain councils into family groupings, because of what they are. I hope that you will consider that issue, too.
To what extent do you feel that the general public and stakeholders are using the framework to challenge local authorities? Ms Leitch, do you think that the framework is being used by the public in that way?
The situation is variable. We now have three years’ data—four years in some places—and we are starting to see much more evident trends with regard to whether something has been adopted as practice or is just a one-off and whether there is anything that we can do differently.
Putting the information out through public performance reports is one thing, and certain groups will be particularly interested in that. However, if we really want true engagement, we need to distil the information a bit more and make it relevant to the particular groups that want to engage with us on different subjects.
The committee has considered community planning a great deal over the past few years, and it has heard that certain targets that are put in place at council level for single outcome agreements might come at things from a completely different angle than targets for the health service. In their responses, a number of folk have asked whether we should be considering the council and health board outcomes in the single outcome agreement frameworks instead of measuring local authority business alone. Is there a view on that?
There is: we need to do both. We need a framework in which performance against outcomes and SOAs is consistently measured and publicly available, but that does not mean that we would not want to keep carrying out service-level benchmarking on the cost efficiency and effectiveness of service delivery in different councils. The two things are related, and such a framework would allow us to explore that relationship. A significant bit of work that Emily Lynch is leading on is developing and will develop things along the lines of that outcome approach. The point has been well made by many of your correspondents.
There are major areas where we are shamefully short of clarity about any outcomes at all. When we speak about outcomes for older people, for instance, we are still struggling to put some coherent sense into what we imagine those outcomes to be. We use words such as “dignity” and “choice”, but what do they mean and how do we show that they are happening? A lot of work is going on in that arena and if you would welcome it, we can report back to you on progress on that particular dimension as well as on the dimension that we have reported on today.
You will be back next year anyway, but it would be really interesting if you could continue to apprise us of any changes. Things such as the integration of health and social care might require you to put different measures in place.
Indeed.
However, that jigsaw will soon be completed. I think that, once the Community Empowerment (Scotland) Bill is passed and begins to kick in, engagement with the public will grow in certain areas. At that point, you might consider the various measures that you are using.
It would certainly be useful if you could continue to keep us apprised of developments instead of waiting until next year. Members have made a number of requests for information, and we would be grateful if we could receive that. Thank you very much for your evidence today.
We now move into private session.
11:12 Meeting continued in private until 11:36.