Under item 2, we will take oral evidence from three panels: from the Convention of Scottish Local Authorities, from local authorities and from the Improvement Service. The first panel represents COSLA. I welcome David O’Neill, president; Michael Cook, vice-president; Barbara Lindsay, deputy chief executive; and Adam Stewart, policy manager. I invite Councillor O’Neill to make an opening statement.
Thank you, convener. I have not switched off my mobile phone, but the good news is that it is on my desk in Verity house, so if we hear it from here, it is very loud.
Thank you very much for those comments.
I think that it would be fair to say that the public sector as a whole was not particularly good at a number of things, including procurement and unit costs, but we are much better at those things now and we recognise the value in them. We now generally know the unit costs and what we are using our money for.
I have had the experience of being an elected member in local government for some time, so I realise that things sometimes take a long time. That is one of my main frustrations about local government.
The history is important. It is particularly important to recognise that councils did not wake up one day and suddenly think, “We need to carry out comparative analysis.” Actually, that went on all the time when you were a member, convener, and it has certainly been going on throughout my time as a member. However, the current change reflects a difference in approach. We have been particularly good at trying to respond to best value since the Local Government in Scotland Act 2003. Certainly, my council has been engaged in that on two occasions. We have also been involved in a pilot project for the latest iteration of dealing with performance information and using it effectively in the management of councils.
We recognise that there is to be some uniformity in benchmarking. We have been told in evidence on the issue that one difficulty with many previous benchmarking exercises by local authorities was that they often compared apples with pears, which led to the abandonment of projects because the benchmarking did not stack up, as folk saw it. Therefore, we welcome the uniformity. Although we realise that benchmarking was done previously, we must recognise that there were difficulties because of the lack of uniformity. Perhaps David O’Neill or Michael Cook will pick up on that.
I can say a few words about that. You will be aware of the report on the issue by the Society of Local Authority Chief Executives and Senior Managers that went to council leaders. One strength of what we are trying to do is that the people who actually deliver the services on the ground and who take the lead—the chief executives and senior officers in local government—have developed the process. They have developed it so that it will maximise the benefits to local government. If the approach had been imposed from the outside, we might have had a bit more difficulty with it, but this benchmarking approach is a creature of local government and a tool that has been developed by local government officers. That is one of its strengths.
Councillor O’Neill mentioned that benchmarking is a tool of local government. I imagine that a key requirement will be to ensure that staff and elected members are on board so that the approach becomes an automatic feature of service management. How do you ensure that that happens?
Performance is one of the four pillars of public sector reform, which are—if I have remembered them correctly—performance, people, prevention and integration.
To put it simply, the people who have the strongest interest in how councils perform are elected members, senior managers and those who are employed in the councils. As soon as we come into local government, we are engaged in a journey in which we try to ensure that there is a process of continuous improvement in delivering the best that we can for the taxpayer, the constituent and the visitor to our area. As I said earlier, we have long been engaged in a process of continuous improvement, and the best-value approach is evidence of that.
Are there any comments from the other two panellists?
It is worth considering at this point that councils are subject to a range of performance indicators, audit frameworks and external challenges. One of the opportunities in the project that we are discussing is that it allows us to take ownership of those indicators that we think add the greatest value to our understanding of management information locally.
As we indicated in our written evidence, the need to improve outcomes to get best value for money and to bear down on performance to do that has been the theme of almost every visit that we and the presidential team have made to councils.
Following on from that, do you consider that you have enough relevant resources to make that interpretation and to make effective use of the benchmarking data? If not, can the Scottish Government or other national bodies do more?
Are you offering us more money?
Absolutely—not.
It is more important, because of the reduced resource, that we do that work to ensure that we are getting best value. The current mayor of Chicago, Rahm Emanuel, who used to be Barack Obama’s chief of staff, said that you should never let a good crisis go to waste. Here we are, right in the middle of the biggest financial crisis since the second world war, which should be a driver for us all to do things better than we currently are.
Resources do not necessarily have to be cash; they could, for example, come in the form of a secondment from another body to help with comparability. For example, Scottish Water has given us some excellent examples of how the organisation has been turned around, and there might be some value in listening to or seconding someone from that body to find out how it achieved that.
If someone has a better experience than us, we are happy to learn from them. Indeed, we are happy to learn from anyone. If anyone has a good idea, we will be happy to steal it.
As the vice-president has suggested, we have to make this part of everyone’s job instead of bringing in someone from the outside. We have the commitment of chief executives, leaders and senior staff and support from the Improvement Service and, as we embark on this journey of learning and improvement, we must be open to looking at other sources of good practice, where appropriate.
We also need to reflect on where we are moving from, as well as where we are moving to. We are moving from a situation in which, as Adam Stewart correctly pointed out, a whole series of auditing, assessment and scrutiny bodies was looking at councils’ performance. Since the Crerar review, there has been an effort to declutter the landscape, which very much fits with local authorities’ aspiration for effective, efficient and proportionate scrutiny. If we develop a model that is fit for purpose, allows proper comparisons to be made Scotland-wide and provides proper information for management purposes so that we can drive best value and continuous improvement, we will have reached the right place. Part of the process, however, is ensuring that we have proportionate scrutiny, and we think that this tool will help us get there.
You said that you have some performance indicators, one of which relates to Audit Scotland and independent audit. Do you see the information that we get from benchmarking fitting into the other performance indicators?
It might be better to ask SOLACE about some of the technicalities. However, what I can say is that this is not just one thing; it is part of a suite of tools. We must certainly take account of what external regulators and auditors are saying. Indeed, Audit Scotland is aware of the benchmarking and I have had informal discussions with John Baillie of the Accounts Commission, who knows that we are discussing these matters this morning. We need to instil in the public sector the ethos that benchmarking, continuous improvement and best value should be embedded in the culture.
We already have a series of statutory performance indicators, some of which, frankly, are pretty redundant. For example, you might look up a statutory performance indicator that tells you the number of people who go to the local swimming pool. The genuine value of that is questionable, and you need to look not only at the inputs at the top but also at the local authority’s performance in the service that it is providing. There is an expectation—I think that Audit Scotland is at least thinking in this direction—that as we develop these new comparative analysis mechanisms the existing SPIs will fall away and become genuinely redundant.
I suppose that preventative spending would come into that. Instead of examining the raw data, you would be looking at the effect of something, whether there were health outcomes and so on.
It is absolutely not just about the raw data. The information that we are gathering from this exercise comes principally from local financial information and, if you are going to use it effectively as a management tool in your local authority, you will need to examine it in detail and consider all the different angles.
Good morning. I was interested in the comments about chief executives and senior staff delivering on the ground. What discussions have taken place with the genuine front-line staff who deliver services in communities and on the ground, and with their unions? After all, this should be about benchmarking the delivery of the service on the ground and not about people behind a desk saying that they are delivering this or that. How do we match the benchmarking of what is being delivered against what is genuinely being delivered on the ground by front-line staff?
That question is probably better answered by individual councils. However, I know from experience as a North Ayrshire councillor that that council spends a considerable amount of time engaging with the workforce to ensure that we are fully aware of the issues both as they see them and as they deliver services. I am very conscious that councillors, chief executives and senior officers of local authorities do not necessarily deliver the services to communities and that services are delivered by people at the front line, the coalface or whatever you wish to call it.
I am keen to endorse a couple of David O’Neill’s points. First of all, although we should remember that benchmarking is principally a management tool, we also need to understand that it is embedded in councils’ DNA. I say in response to Mr Wilson’s question that that is demonstrably true. As an example of that, I have brought with me a copy of Scottish Borders Council’s most recent best value audit, which was done in 2010. In the audit, councils are asked about the pervasiveness of performance culture right down through the edifice; I am happy to say that our council’s answer to that question was, “Pretty good.” That kind of question has been asked since 2003 and will continue to be asked as we move forward, and managers will be determined to secure that sort of response from staff.
Fortunately, as four ex-councillors—a majority, I note—sit on the committee, we know both as councillors and as MSPs what public service delivery is all about and are keen to see the best public performance possible. [Interruption.] I am sorry—I forgot that Margaret Mitchell was also a councillor. I knew that her husband was. That makes five ex-councillor committee members, who make up a formidable team when it comes to looking at council services.
To be honest, I think that your last point misses the point. The important point flowing from that is the need to examine contextual information about swimming pools or any other factor that we are looking at. For example, if you are comparing information from Highland on cost per primary school pupil with information from another local authority, it is important that you do that—to follow the convener’s analogy—on an apples-with-apples and pears-with-pears basis. That is why there is a move in the direction of families, which aggregate councils that have similar social demographics—a similar population base and similar content, in terms of the people whom they service—so that we can make proper judgments about those things.
I listened to Councillor Cook with interest and, for me, what he said raises the point that benchmarking will not be a uniform tool across the 32 local authorities. There will be variances in benchmarking, which goes back to the convener’s earlier comment on apples and pears. If we are looking for uniform delivery in benchmarking or, as other members have mentioned, in best value, how can we be certain that the benchmarking measures that are being used are applicable to the local authorities, if local authorities do not opt out of the measures that can be uniformly attributed to the delivery of services? Are we saying that we will have benchmarking but that we will not be able to compare the delivery of services by the 32 local authorities or to compare each authority against the others, because the comparison will be based on families and possibly on different service delivery?
I am slightly confused by the question. The simple fact is that we will have consistent benchmarking indicators across Scotland, but we will need to aggregate councils, recognising that the data and the information from different councils with different characteristics—different population bases—will be different. It is no more complicated than that.
We are talking about uniform benchmarking, but we are not talking about uniform delivery of services. Local government does not deliver uniformly across its communities. In my council, North Ayrshire, the least deprived community has a life expectancy that is 18 years longer than our most deprived community. We do not deliver the same services to those two communities—we deliver services according to the needs of each community.
I have at last discovered why the two people who have no council experience are the furthest from the salt at this particular dining table.
I would not take that as being the case, Mr Stevenson.
I welcome David O’Neill’s initial comment about his commitment to improving outcomes and absolutely accept that that is the case. He also said that he would copy a good idea from anywhere, so I am going to give him one. Perhaps if we want to get staff involved we should not talk about benchmarking at all because it is a techie thing; we should talk instead about a self-improvement programme, then everybody will realise that they have to do it for themselves. That is an idea for you to discard or use, as you wish. Certainly, when Michael Cook said that benchmarking is principally a management tool, I could envisage front-line staff immediately distancing themselves from it and saying, “It’s nothing to do with me.” However, that is enough observation—you can deal with my observations in any way that suits you.
I can say a few words about that. Thank you for your good ideas; we will take those away and ruminate on them.
I would emphasise the points that Councillor O’Neill has made and will also pick up on the points that were made by Mr Stevenson and Mr Wilson on the self-improvement agenda. We are talking about one tool that is used as part of the wider approach to performance management, an aspect of which is a much greater focus on outcomes. We have talked a little about the challenges that are related to developing that approach. It is also about the heavy investment that councils have made in self-assessment over the past few years through the public sector improvement framework or other European Foundation for Quality Management methodologies. All councils are adopting those as part of their internal scrutiny mechanisms and are subject to some external challenge.
What you have said has not answered the specific question about why you have chosen particular inputs, which I assert—without evidence—are to support your assessment of the journey to the outcomes, which may be relatively long term. I expected a limited answer, but I think that the committee needs to ensure that, by whatever appropriate means, we get that answer. We may return to that, convener.
Yes.
Let me move on to another point, because I do not want to take up valuable time wearing out something that we are probably not going to make too much progress on.
We are confident that the process will work because the people who designed it are practitioners; they do the work day to day.
I am sorry to interrupt. Are you talking about the practitioners who are already employed by the councils?
Yes. I am talking about the members of SOLACE: it was the chief executives, senior officers and senior managers who drew the project together.
I think that that is absolutely right and proper—leadership would have to lie there, in a technical sense.
You would really need to ask the people in SOLACE whether they thought that that was the case. In my experience, they are not shy in saying that they have, or do not have, the necessary skills.
We can pick that up with the third panel of the day.
That is fine.
Good morning, panel. I will take us back to community planning. What challenges are faced in applying the benchmarking approach to community planning partnerships, especially as regards their contribution to delivery of outcomes?
I am a great fan of community planning. It would be good to get the whole of the public sector signed up to community planning in at least as strong a way as local government is signed up to it. We have the review of community planning—
Can I stop you there? You mentioned getting all of the public sector signed up; from evidence that we have taken, it seems that much of the public sector is signed up. The difficulty is that the private sector and, in particular, the third sector do not feel that they are allowed to add the value that they could add. Could you comment on that?
That is probably quite a fair comment to make, but it is also true that although we have legislation that requires local authorities to participate in the community planning process, it does not yet apply to the rest of the public sector. The rest of the public sector come along to CPPs, but they are not under the legal requirement that the local authorities are under.
To embellish what David O’Neill is saying, community planning is still very much in development. There is the national oversight group, which involves ministers meeting representatives from the whole public sector to shape the agenda. Concurrently, there is the performance improvement and benchmarking agenda. We will not wake up next Monday and find that we have a series of benchmarking propositions across the whole public sector, but what is important from the committee’s perspective is that the aspiration—certainly on the part of local government—exists to get to that place. There is a view in COSLA and across local government that community planning is a very good answer to the aspiration that we have across the country to improve outcomes. That is what it is all about. We need to build benchmarking mechanisms that allow us to drive performance on a cross-sectoral basis—for example, when work cuts across a local authority, a health board and the third sector. If we can identify outcomes and indicators and use those as weapons to drive performance, we will get to the right place, but it will take us time to get there.
Anne, do you want to come back on that?
No. That is fine.
I will follow up on the point, in that case. We have found from evidence that we have taken on community planning partnerships that there is good practice and bad practice. Committee members can correct me if I am wrong, but I think that we found that the partnerships that seem to be stronger are those that are influenced by the private and third sectors. David O’Neill talked about compelling folk to get involved. Would it be wise to compel people to become involved, rather than to persuade them to come on board, as has happened in many places?
Some of the other public agencies are direct agencies of Government, so if Government tells them to do it, they will do it, and if Government tells them not to do it, they will not do it. I know that this is not meant to be an evidence session about community planning, but I am more than happy to talk about it.
I think that it is extremely important for benchmarking services to ensure that there is a level playing field. As Ms McTaggart does, I think that community planning partnerships have an immense part to play in that regard. We will do well in benchmarking and service improvement in the areas where community planning partnerships seem to work better.
You are right that community planning works well in some areas but not in others. However, we are clear that we do not want to be just as good as what is currently the best in community planning; we want to see a step change in community planning so that everybody moves up to a higher plane than the one that we are currently on and all deliver positive outcomes for our communities. I will leave it at that.
If we do not make the step change, we will have a big difficulty. We all know what is coming to us from the financial context in which we operate. What is driving the national group’s community planning discussion is anxiety about what is out there and the need to improve outcomes in a context in which resources are under pressure for a range of reasons. There is not only growing demand but a substantial reduction in resources. To be frank, unless we make a step change, we will struggle to find the £3.9 billion that we have to find by 2016-17. On the part of local government and—we detect—on the part of national Government, there is a genuine desire to use community planning to come up with some answers. That is the direction that we are driving in.
I want to say something about an aspect of community planning that has proven to be successful. At its beginning, it was seen as a community of delivery organisations, and it worked quite well in that respect. However, we have moved beyond that, and community planning is now not only about a community of delivery organisations but about place: it is about geographical communities and delivering for those communities.
I am sure that we will have you back at some point to talk further about community planning.
David O’Neill’s letter to the convener says that many discussions have taken place on benchmarking, but that his
Who is going to deal with that?
I am happy to start, convener—then anyone else can put their tuppence-worth in.
David O’Neill has touched on the wee story that I was going to offer by way of example. At the COSLA conference back in March, a representative from Swedish local government came along and spoke about benchmarking. She was asked about how that played out in the Swedish public domain, and her response was—as David O’Neill hinted—that, in the Swedish culture, when there is a deficiency or defect, an effort is made to rally round and everyone pulls in the same direction to ensure that the last ship in the convoy is running at the same speed as the others. There is a maturity to the political and public reporting culture in Sweden that would be a noble aspiration for us in Scotland. It would be very helpful indeed if the committee could help us to reach that aspiration.
I have a point about some of the practical things that we can do around benchmarking. One action that we are taking, alongside SOLACE and the Improvement Service, is to develop media messages that explain the indicators, and perhaps give some insights into what they do or do not tell us about variation, and into how people can interpret the information locally. We are doing that with a view to ensuring that, when people look at the information, they have in mind some of the legitimate variation that might exist. Those messages will accompany the information when we make it public early in the new year.
Does John Pentland want to come back in?
No—I think that the witnesses have answered the question. I have one further question with regard to—
I will take Stewart Stevenson and Margaret Mitchell, because they have things to say on that point. Does Stuart McMillan have something to say on that point too?
I have a different question.
Okay. I will let you in afterwards.
Have councils considered the presentational approach to information provision that the Government uses in its Scotland performs website? I imagine that each council’s progress would be marked as “improving”, “steady” or “deteriorating”. If councils were to provide the media with information in that way, there could be a proper political debate around it. It is proper to compare councils with each other not on how they are doing but on whether they are moving in the right direction. If councils were to take that approach, it would give the media something that it could use without much effort and give the debate a different focus. In Government we have not always found that approach to be pleasant, because the arrows can sometimes go down, but we can focus on those things. I commend that approach to you and ask whether you have had a wee think about it.
Is that an approach that people have considered?
That is very much in keeping with the kind of approach that individual councils use in their public performance reporting. Obviously, technology has a large part to play in that. On the publication of information, Mark McAteer will be better placed to give specifics on the format, but certainly web availability and analysis of information are very much on the cards.
Okay. Good.
I am afraid that what is coming over from the panel is, “Please don’t criticise us. There will be bad information out there, but don’t say that this is unacceptable and shouldn’t have happened.” I accept that people will make such comments, but for me the key thing is that issues are identified and councils do something about them. If councils really are signed up to benchmarking, inherent in that should be a robust response that benchmarking has actually achieved what you wanted it to achieve, that benchmarking has identified what councils are doing well and, equally, that it has identified weaknesses and shortcomings. Crucially, as a result of that, councils could work on those weaknesses and shortcomings and make improvements.
There are a number of aspects to that. Plainly, we are not saying, “Don’t criticise poor performance or failure.” There is an acceptance that poor performance will be criticised. However, we need to recognise that the arch-critics will be councillors and people within councils who will sometimes have a better grasp of some of the issues that are developing—notwithstanding the huge experience that exists on the committee.
I absolutely do not think that the message is “Don’t criticise us.” We need to create a culture across the public sector of change and improvement, and that means creating an environment in which it is entirely feasible to say, “Yes—we’ve collected this information and produced it”, so that everybody is clear about what it means and what it does not mean. Rather than just making crude comparisons, we need a culture in the public sector such that it is possible to say that we will use that information to improve.
I was very much encouraged by the beginning of Councillor Cook’s response but discouraged by what he went on to say. If there is a problem with late payment of bills, say that there is a problem and look for why. If there is not a problem, justify why you have done what you have done, but ensure that you are robust. The whole ethos needs to be about improvement. I think that we both agree on that.
We are absolutely disposed to that. The point is that we would do that on the basis of the proper benchmarking indicator.
I see that 50 per cent of the identified indicators in the SOLACE benchmarking suite that was sent to the committee have the word “cost” in them, from “Cost per Primary School Pupil” through “Total HR Cost per 1,000 Employees” down to “Cost per Visit to Libraries”, “Cost per Visit to Museums” and “Cost of Waste Collection per Premise”. How do we get round that issue? How do we compare such measures, particularly the cost of road maintenance, across 32 local authorities that will have varying costs for the delivery of those services? How do we benchmark when 50 per cent of the identified indicators are about the cost of services in local authorities? Clearly, as Councillor Cook said earlier, cost factors for the delivery of services will differ between local authorities. As Councillor O’Neill mentioned, the cost of service delivery in deprived areas will be different from that in other areas.
I am not at all clear why we would not want to know what it costs us to deliver a service.
The point that I am trying to make is that it is clear that some local authorities will spend more per visit to a museum, for example, than others. If we are benchmarking 32 local authorities using the suite of indicators that you have provided to measure the performance of local authority against local authority, some people—I am not saying all people—will then ask why North Ayrshire Council pays X amount for a visit to a museum whereas Glasgow City Council pays only Y amount.
I will say three quick things about that. First, as the president of COSLA said, there is no reason why we would not want to know the cost. There may be a perfectly reasonable explanation for why an authority’s costs for something are more than an adjacent authority’s costs, but the first authority might want to address that because it represents a saving or efficiency that could easily be made.
They will also be caveated, of course.
Regardless of how robust local authorities or COSLA are in using the big-stick approach, there will always be a story for somebody in the media to use against them, because a benchmarking exercise might show clearly that a service must be withdrawn or improved, so we know that they are up for criticism one way or another. That is probably somewhere down the road. I agree with David O’Neill that we must be grown up and realise that local authorities are not improving or taking away services just because they want to but because they are benchmarking.
The people to whom local authorities are primarily responsible and accountable are their electorates but, across the public sector, there are agencies—regulators that are appointed by the Scottish Government or the Scottish Parliament, such as the Accounts Commission and Audit Scotland or Social Care and Social Work Improvement Scotland—that monitor what local authorities do. They do the holding to account, if you like, on behalf of either Parliament or the public. It is right and proper that we should engage with them as fully as we can. As I said earlier, I had a brief discussion with the chair of the Accounts Commission about what we were doing with benchmarking. There has certainly been a change in attitude. The Accounts Commission and Audit Scotland support what we are trying to do—we are singing from the same hymn sheet.
Good morning, panel. I have a couple of quick comments, one of which is about Sweden, which Councillor O’Neill and Councillor Cook spoke about. I studied in Sweden, and I absolutely agree with what you said about the country. It is a model that we should look at and learn from.
Mr McMillan, I do not want to get into a debate on last week’s politics.
I will ask my question, convener. The benchmarking suite was extremely interesting, but one point stood out for me, which was sickness absence days per employee. I am very much aware that with any action there will be a reaction. This ties in with a question from John Wilson about ensuring that you take the workforce with you rather than impose something on the workforce. Whether it is in annual league tables or when information is published, if it is seen that there is an increase in absence days in a local authority, how will COSLA help the local authority to deal with that? I pose that question bearing in mind Councillor Cook’s comment:
I can say a little about absence management in particular, because it is a good example of how we might use information to drive improvement. Absence management information is probably one of the most well-established areas of information that councils have been collecting for a number of years. You can follow the trends quite closely over that period. It is also an area in which councils will look closely at the relative performance of other councils with a view to identifying how their absence management policies are driving improvement. There is a host of what we might call benchmarking clubs—formal and informal—in the professional human resources community to look at that issue.
I can add a political dimension to Adam Stewart’s helpful explanation. It so happens that my responsibility in Scottish Borders Council includes HR. Let us say that our absence figure is 12 days per employee and that a comparator local authority’s figure is nine days. I will want to challenge my managers about why the other local authority is at that level of performance and we are not. What are the contextual factors behind that? Why are we in that position and it is not? What is it doing differently in its approach to operational management? That quite neatly demonstrates what benchmarking is principally about. We expect it to be used by the public, we expect it to be used by parliamentarians and we expect it to be used by the media but, primarily, we are benchmarking so that we in councils can get our internal processes right in order to serve the public and to ensure that we are delivering as well as we can. That is the objective that sits behind benchmarking.
Thank you, that is helpful.
When does all this go live?
The intention is that it will go live later this year, so figures will be published towards the end of the year.
I thank the witnesses very much for coming in. It has been most enlightening. We will suspend the meeting for five minutes to change the seating for witnesses.
I want to thank Stewart Stevenson for both his good ideas—we will take them away.
They are free.
Our next witnesses represent council leaders. I welcome Councillor Jim Fletcher, who is leader of East Renfrewshire Council; Councillor Ken Guild, who is leader of Dundee City Council; and Councillor Bill McIntosh, who is leader of South Ayrshire Council. Gentlemen, do you wish to make any opening remarks?
Good morning and thank you for inviting me and my colleagues along this morning. From my perspective and from the perspective of East Renfrewshire Council, we very broadly welcome benchmarking. I represent a council that was perhaps a political creation and is a small to medium-sized council. We are very alive to the fact that many people—certainly at the outset—felt that we would not cope with the big-ticket items such as education and social work. We always felt and we probably still feel that we have a point to prove. The only way to prove that we deliver those services well is by having some sort of evidence. Benchmarking provides that, so, as a council, we have never been afraid of it.
The fact that this morning’s panel of local authority spokespeople reflects a broad cross-section of local councils suggests that the committee realises that, when it comes to benchmarking, one size does not fit all. There will be massive differences—geographic, political and socioeconomic—between one council and another. Whatever the results of the benchmarking, the important point is that it is contextualised so that the exercise does not come down simply to raw statistics.
I welcome benchmarking, as I welcome anything that gives us something to compare. However, I would be disappointed if, when we get the benchmarking figures, they do anything other than confirm what we already know. We should know what we are good at and what we are bad at, although not necessarily how good and how bad. One example from the list of items is housing voids, on which we are doing quite poorly just now. We are well aware of that, and we are targeting it. That issue is mentioned almost on a daily basis, and we are grilled by our own members and scrutiny panels, so we are looking at it. Any council should be self-aware.
Thank you. It has taken two years to complete the project, which is a pretty long time. Would you like to comment on the length of time that it has taken to get us to where we are now?
The two-year project is a good way of trying to distil performance down to the key indicators. As councillors, we look at—by and large—around 50 key indicators that are very much related to our single outcome agreement, so this is the right direction of travel. We used to get a raft of indicators before, and I do not think that councillors ever read them in enough depth—you could spend a month of Sundays reading them.
You have never spent much time in Aberdeen City Council, Jim. I was the biggest anorak under the sun when it came to those figures, and there were many others. It is disappointing to hear that not many councillors would read them all.
In my experience, councillors had reams of performance indicators, and their eye would inevitably be drawn to the negative ones. Those are the things that we must pay attention to, but if we are dealing with hundreds of performance indicators, I do not think that we can give them all due weight. Professionals in fields such as education can do that. As a council, we have looked at exam results. Councillors look in great detail at those results school by school, and professionals can use those detailed indicators to drive performance and look for best practice across the board. Rather than councillors looking at all the indicators for everything that the council does, it would seem to be much more productive to distil them down and relate them to the single outcome agreement. The work that we are doing, driven by SOLACE, and other organisations such as COSLA and the Improvement Service, is the right direction of travel.
The main point of the exercise is to improve the level of public service to encourage best practice. That is not something that we particularly want to rush. Okay, it has taken us two years to get here, but we are now in quite a good place. What is happening now is that these benchmarking points will be embedded in the day-to-day working of every department in every council in Scotland. They will be the same benchmarks for each.
Two years is a long time. In the day job of being a council leader, it is frustrating how long it can take for many things to come to pass in a local authority. There are 101 reasons for that, but the main thing in this case is that we should ensure that when benchmarking comes in, it is right. I am hoping that that will be the case, given all the work that has been done that has got us to this stage. We heard earlier from COSLA that it is hoping to introduce it this year, so let us get the show on the road and make it work.
Good morning, gentlemen. You will have heard the questions to the previous panel. How will you build in ownership of benchmarking among staff at officer level, those providing front-line services and elected members so that it becomes automatic in service management?
I hope that we have a culture in our council such that people are proud to work for the council and want to work for it. We have to work in partnership with our staff. When it comes to dealing with trade unions and staff representatives, I operate a system in which my door is always open. People can speak to me informally if there are issues that they want to discuss, and we have formal joint consultative committee meetings at which all those things are touched on.
The term “ownership” has been used, but it is more about responsibility than ownership. There is very much joint responsibility between elected members and officers at all levels, and that is certainly the way in which we are approaching the whole benchmarking ethos. We meet regularly with our senior management team and senior conveners, and our conveners are always in contact with the directors. Senior members of the administration, including me, are now sitting in on a regular basis on management-union meetings, so information is being passed on.
I am interested in that response. For me, responsibility can be a double-edged sword. It can be a burden, whereas ownership suggests more of a voluntary willingness and enthusiasm to engage in the process. I ask you to reflect on that.
The word that I would use is “partnership”, which suggests exactly that.
I think that front-line staff want to do well, and they know when things are not working, but we need to ensure that there is good communication. There is no point in having front-line staff who are good at their job but who are—for whatever reason—not being empowered to do it. They must be able to go to their next in line and say, “I’m not happy with this.” The communication needs to go up, down and across. We can easily just get the report and say, “We’re not happy with this—we need to do something about it,” and that will filter down. The senior manager will talk to the middle manager, and they will have a staff meeting and tell people. That is fine, but we need buy-in as a whole from the bottom up.
You have identified an important factor in communication. In any organisation, success depends on how good the lines of communication are.
We are in a slightly strange position as an administration. We took over in mid-term in 2009, and, two months later, we had a new chief executive, so we had very little baggage. We more or less had a blank sheet to begin with. With the chief executive, we agreed to set up a changing for the future board and to invite members from all the groups along. However, the other groups boycotted it because they thought that, if they came along to the board to discuss and agree on concepts, despite the fact that the board had no decision-making powers, they would somehow be dragged into our decisions. They thought that, if those concepts proved to be wrong, they would be tarred with the same brush. That is the in-built attitude that I would most like to change.
Thank you—that is helpful.
East Renfrewshire Council has not had too many problems with getting people to buy into the agenda of continuous improvement or with people failing to understand why the council needs to perform well. We do not have huge numbers of staff leaving and there seem to be a fair number of people who would like to work for the council. To me, that suggests that the overall ethos is right and positive.
Jim Fletcher is right about having pride in the council. That goes back to the point that I made about front-line staff wanting to deliver good services. Selling benchmarking to elected members is no bother, because on a daily basis we are already hypercritical of just about everything that we do, to the point of silliness at times. That is the nature of the beast. Even members of my political group, which is the largest in my council, criticise some of the stuff that we do. That criticism probably gets louder outwith my political group and, when it gets to our scrutiny panels, they are ready to tear the whole thing apart, which is right and proper. I see no problem with getting buy-in for benchmarking as another useful tool.
I want to explore the 47 or so indicators—the number depends on how we count them. A proportion of them measure inputs, while a somewhat larger proportion seek to identify outcomes. To what extent are you satisfied that SOLACE has come up with a set of input measures that will help you to identify whether you are making the longer-distance journeys to the outcomes that you seek? Were you involved in the process of coming up with those measures? Generally, how useful will the list be? Of course, it is not your list, but one that has been provided to you.
To take an example, I suppose that geography comes into “Cost per Visit to Libraries”. I have picked up on indicators that are particularly relevant to my council, one of which is “Percentage of Rent Due in the Year that was Lost Due to Voids”. That is an issue, and we are improving on that.
I am largely happy with the points that have been made. The work was done through SOLACE, and COSLA leaders discussed it. The leaders of the various councils had the opportunity to make an input if they did not like what was there.
I broadly agree. As I said earlier, we had too many performance indicators to manage them properly. Good work has been done to distil them down to 47—or whatever the figure is—and I have not heard anybody in my council or other councils in COSLA saying that they are wide of the mark. The people who produced them, largely driven by SOLACE, are wise individuals who know their business, and the indicators are robust.
I have to say that, as MSPs, we, too, get scars.
Jim Fletcher has mentioned performance indicators a number of times. Obviously, the eye will automatically go to where a council is performing least well. My question is probably for all of you. If an indicator shows underperformance, do you take that seriously, or do you take into account the unit price or affordability of improving on the indicator before you take action?
We absolutely take that seriously. If there is criticism of the performance indicator, you are right that we need to find out whether the criticism is valid. If the criticism is valid, we need to do something about it, and that is the catalyst for putting up our hand and saying, “Yes, this needs to improve and we will do it better.”
I agree. No council wants to perform poorly in anything, so any negative performance indicator will be looked at. I go through all the screeds of papers that come in and I circle numerous ones that I then discuss at my next meeting with the chief exec. He has his response, and that will filter down.
Absolutely. If a council is performing poorly, that has an adverse effect on the morale not just of the voters but of the council’s elected members and employees. It is in everyone’s interests to continue to try to improve performance.
I absolutely agree with Jim Fletcher: like many other MSPs, if I want to know what is going on in a community, I first call the councillor to find that out.
In East Renfrewshire, we have used benchmarking as a tool to drive performance. For example, in education, East Renfrewshire Council did not perform particularly well in mathematics or modern languages when it was set up. We looked at that and at how to drive up performance.
Would I be right in characterising benchmarking, in your council at least, as a positive encouragement to improve rather than a negative comment on performance?
Absolutely. It is important not to demonise an area that is performing less well—
Those situations are opportunities for improvement.
Absolutely, yes.
As I think I mentioned, we see the approach as a means of embedding benchmarking into the process. Rather than relying on external assessments, we see that as part of the day-to-day working of the council.
In a way, I think that benchmarking is a useful add-on, but in South Ayrshire we have had a form of benchmarking for a number of years. When we took over the council in 2007, it had a history of neglect and mismanagement, which, coincidentally, had been identified by Audit Scotland in a less than rosy best-value report at that time.
Good morning, panel. I have a question about community planning partnerships. If you listened to the earlier witnesses, you will know that they had good advice in that regard. What are the challenges in applying the benchmarking approach to community planning partnerships, particularly with regard to contributing to outcome delivery?
We take community planning partnerships very seriously indeed. We have the Dundee partnership, which I chair and which involves the various local government departments, the police, the fire service, the two universities, the local college and a number of charities and community organisations. We also have a local community planning partnership in each of the eight wards in the city, each of which is chaired by a member of the senior management team.
We have approached community planning in a slightly different fashion from other councils. Historically, we had the same sort of meeting that would be familiar to many people—we met quarterly, people exchanged reports, and the police, fire service, enterprise bodies and Strathclyde partnership for transport all turned up. There was a tick in the box, and that was the community planning partnership.
The important word in community planning is “community”. To be honest, I would say that I am not happy with how we do community planning, but I am assured that we are a lot better at it than I perceive; we are not at the top, but we are doing well, which is fine.
Bill McIntosh mentioned community councils. In what other ways will you—or do you—encourage community members to become involved?
We have three main groupings of community councils in Ayrshire. We have the rural area in the south, the towns in the middle and the other bits. The groupings have come together naturally through the geography of the area. As well as being our community councils, they are very much parts of the wider community. In addition, a lot of good stuff is going on in Girvan just now. We have a town team, for example, and there are maybe half a dozen other good organisations. I cannot remember all their names, but there are business associations and so on.
As the convener mentioned to the previous panel, we have heard in evidence that community planning partnerships work best when they have the community at their heart and community members are involved.
We have two community members on our community planning board. I have been the council leader for three years, and those members have been on the board since before that time. It is vital for that representation to continue, but I cannot say in what way it will continue, because we are still in the transition period. I will not give an answer just now, because I am still getting my head round it, and we will not be able to get much further forward until we have the guidance. I understand that it is due in a week or so, and it will give us a chance to move forward.
Councillor Guild, do you want to comment?
Yes. As I mentioned, we have a broad range of community councils and they are very different in character.
I do not necessarily agree with Ken Guild’s experience of community councils—we cover an urban area and there are two community councils in my ward, let alone the whole council.
In asking this question, my intention is not to knock community councils. How have the three councillors encouraged other community members to become involved in the benchmarking process and community planning partnerships? I do not need to hear a response today; perhaps the councillors could write in with that information.
If the witnesses can provide brief oral answers, they should feel free to do so. If not, it would be useful if they could provide that information in writing to the committee.
Are you asking for us to provide information on how we approach a wide range of community groups?
The committee, as part of its investigations into benchmarking and community planning, is interested in finding out which community organisations and bodies are represented on community planning partnerships. That information would be useful.
That is fine.
Good morning, gentlemen. I have a hypothetical question about the suite of indicators that will probably not be easy to answer. On the back page of COSLA’s submission, indicator HSN1 covers current tenants’ arrears as a percentage of net rent due. With events that are outwith local authorities’ control, I imagine that activities related to welfare reform—which Councillor McIntosh touched on earlier—will affect some of the proposed indicators and HSN1 in particular. What can you and the rest of the 32 local authorities do to get across the message that the negative impacts are not of your making but are a result of something that has happened elsewhere?
Gentlemen, I do not want us to stray too much into the issue of welfare reform, but what will happen with the benchmarking figures because of the changes?
Welfare reform is just something that we have to handle, whether the changes come from Edinburgh or London. Once a decision is made at a higher political level, we are there to implement it and make the policy work. We can spend all day criticising something or not criticising something, but that is irrelevant. The reality is—
I do not want to spend any time whatsoever in criticism. I would love to sit here all day and criticise the Westminster Government for its welfare reform policies, but that is not what we are here for today.
That is exactly right. What I am saying is that we are here to implement the policy and we have to do so as best we can. One example of the welfare reform proposals is that, if a tenant is deemed to be in accommodation that is too big for their needs, their housing benefit can be reduced. We are anticipating that the knock-on effect of that—
I will stop you there, as you are straying into issues around welfare reform. We are here to talk about benchmarking. Welfare reform will have an impact on the statistics, but I do not want to get into all of the ins and outs of welfare reform today. I ask you to concentrate on the statistics and the effects on them.
I was trying to illustrate a point, but I take on board what you say.
What are you doing to plan for things?
I am sorry, but we are not going down that path today, because it involves the issue of welfare reform, which another committee of the Parliament is considering. We are here to discuss benchmarking. There will be other opportunities for this committee to discuss the impact of welfare reform on local government.
Of course, welfare reform will have a drastic effect on councils, on our performance and on the figures that you will no doubt be considering over the next few years as part of the benchmarking process.
Again we are straying into the realm of welfare reform, and I do not want to stray any further. The counter-argument to your point is that the Scottish Government does not have the money to mitigate the impact of welfare reform, but I do not want to get into those issues today. We are here to deal with benchmarking. Please stick to the subject, as our time is limited.
I understand that, but I am trying to make the point that there are financial pressures on councils now and that the administrations—in our council we have a Labour-SNP administration—will have to sit down and decide whether they are going to use some of their budget to mitigate the effects of welfare reform.
Okay. I am trying to stick to benchmarking. I could sit here for hours and talk about welfare reform—I am a member of the Welfare Reform Committee—but I do not want to do that today because we are here for a specific purpose.
All our figures will look a lot poorer as the reform kicks in. I go back to the point that I made right at the beginning, which is that the raw statistics may not mean a thing and that they must be set in context.
I think that we are talking about the numbers and not about benchmarking. If everybody goes down equally, the relativities remain unchanged, whereas benchmarking is about using the opportunities—even on the way down, if that is the way things are going—to say that somebody else is handling something better and that we can learn from them. Is that the approach that you are going to take? As Ken Guild said that, generically, everybody is going down together, perhaps he can respond.
I did make the point that everyone is going down together. Is that a reflection on the councils or on some other place? We are not talking just about figures. At the risk of straying—I am sure that the convener will tell me if I am—I suggest that we are going to have lots of additional voids due to unpaid rent because of people’s personal problems rather than because of how the council looks.
I do not want us to stray into that, as we will get into the same position as earlier. Stewart, do you want to repeat your question? We need to get to grips with that point.
I was making the point that benchmarking is a neutral tool that works whether everybody is improving or deteriorating. I hope that you will all view it as something that reveals to you opportunities to improve in a relative sense. It would be useful to hear your comments on that.
Bill McIntosh is dying to comment on that.
Yes. I am just thinking about what I can say without being red-carded. I am in South Ayrshire, and North Ayrshire is just up the road. The housing side of the welfare reform will impact significantly on South Ayrshire Council, but the impact on North Ayrshire Council will be non-existent. That makes for a less-than-level playing field.
But will benchmarking help you to pick up from what you deem comparable councils opportunities to better mitigate the effects of a difficult circumstance? That is the fundamental question in what we are discussing today, which is benchmarking.
Yes. I have identified housing as an issue that my council has to address. Benchmarking—when it is introduced in due course—should confirm where I think my council is on that particular issue and will give an indication of where it is compared with other councils. Because of a particular issue, I might expect my council to score lower than North Ayrshire Council in relation to housing, but that might be acceptable for reasons that I will not mention.
I am sorry to press the point, but what I really want to hear—I may not hear it; you may say that I am off the mark—is that you see benchmarking as helping you to find the answer outside the boundaries of your own talent, expertise, experience and numbers. If benchmarking does not do that, I am not sure that it has a huge purpose.
We may fail if it does not do that.
I cannot answer that just now. It will be an interesting issue to watch. I do not think that we can assume that benchmarking will give the answer; I would need to wait and see.
There seems to be a thread running through your evidence that suggests that you are one of the benchmarking sceptics. I think that that is the essence of what you are saying.
No, that is contrary to what I have said on a number of occasions today. I started off by saying that I welcome benchmarking and see it as a useful add-on. I see benchmarking as providing confirmation—hopefully, if I am doing my job right—that my council is mid-range rather than up at the top or down at the bottom. I am sorry if, for some reason, all those positive comments have come out as a negative reaction to benchmarking, because that is not my position.
No, I am just—
I think that we should move on.
I will leave it there.
Do any of the other gentlemen want to comment on what Mr Stevenson has said?
My only comment is that different councils will be affected in different ways. I suspect that the impact in East Renfrewshire of the housing benefit changes might be more akin to the impact in areas of the south-east of England, where there are higher house prices, whereas in areas such as Dundee there might be more of an impact on social housing. We need to learn from elsewhere. I do not think that any of us would not use benchmarking to learn from good practice.
I agree. I do not know exactly what role benchmarking would play here. I think that, in his couching of the original question, Mr McMillan said that many of the changes will be beyond the control of the councils. As Jim Fletcher has just said, a lot will depend on the percentage of residents in a particular council area who are already on low pay or social benefits. I think that that will have as much impact on the figures as anything that a council does to try to allay the situation.
I call Margaret Mitchell—very briefly, please.
I think that the point that the committee is trying to get over is that, regardless of where the pressures have come from, benchmarking will allow you to look at the data and realise that your council might have a problem. Can you use the benchmarking information from throughout Scotland to identify other councils that seem to be addressing the issue and use that as an opportunity to learn and get some answers to the problem?
Can we have very brief answers—a yes or a no, if possible?
I think that it is a hypothetical question.
Yes, we would use benchmarking, but it is new. Everyone is on a learning curve. Of course we would look to learn from best practice elsewhere.
Absolutely.
Good morning, panel. I have a brief question.
We have certainly been discussing it for quite some time in Dundee and the answer to your question is no, the officers are not asking for any extra staff or even suggesting as much, because the approach is already embedded in what they are doing. As I have said, we have had extra costs as a result of the external assessments, because we had to divert staff from their normal duties. Preparing the information for benchmarking is built into, or embedded in, the jobs that they are already doing.
I agree. We have always looked at our 50 or so performance indicators, and I have certainly not been asked for any additional resources to deal with the new approach to them. This is simply part and parcel of what councils do, and I expect officers to prepare the information properly for the relevant committee, cabinet or whatever.
My answer is the same. The work is already incorporated into what we do.
Will the reduction in the number of indicators compared with what you had in the past not save your officers some time and money? Do you expect whichever audit body it might be to take account of the new suite of indicators in future audits?
I certainly hope so.
We used to have a thick raft of papers, and I think that we will now save a forest’s worth.
The answer to your first question is yes, and I certainly think that the audit bodies will use the indicators.
As the previous panel pointed out, the benchmarking will be publicised, which might lead to league tables and the identification of the worst-performing councils. Could the process be better managed and should other organisations provide some support on how best to approach the matter?
I see no particular need for external support. We all get bad press, but that is part of the job. What we need to do is to put out good news stories throughout the year on the good things that are happening, if for no other reason than to allow our own staff to read about them. After all, they need to know that we appreciate what they are doing, and we need the local communities to appreciate what is being done.
If an organisation wants that kind of support, that is fine. There have been occasions when councils have had very poor best-value audits and a team of experts—for want of a better term—largely led by COSLA has gone in to help.
Every council’s immediate fear was that benchmarking would be used as an excuse either by the press or by Government organisations on either side of the border for compiling league tables, but we have been assured that the matter has been considered right from the start and that, for example, this committee is determined to avoid such a situation where possible.
I think that you are giving this committee much more power than it actually has. No matter what, league tables will be inevitable.
The power of wishful thinking, perhaps.
I thank the witnesses for their time and I suspend the meeting for a changeover of witnesses.
Our final witness is Mark McAteer, director of governance and performance management at the Improvement Service. Do you wish to make an opening statement, Mark?
I just want to thank the committee for inviting me to come and talk again about benchmarking. It is much appreciated. I also want to pass on apologies from David Martin and Ronnie Hinds of SOLACE, who tried to rearrange their diaries to be here but had commitments that they could not get out of.
I will ask the same questions that I asked the other witnesses about inputs and outputs. To what extent have the input measures, which are short term and help us to understand progress towards long-term outputs, been appropriately selected and evidenced as being good in that respect?
We carried out a significant amount of consultation with local authorities, professional associations and audit and inspection bodies on the indicators that we have adopted in the suite and we are confident that, collectively, they tell us the direction of travel towards broad outcomes.
In my earlier questions, I referred to the process of normalisation to extract from measures that are taken by different councils and in different contexts similarities and things that one might validly compare. How much support are you giving to councils in that respect? Are you satisfied that there is a proper normalisation process to allow that to be done and to ensure that councils can see good practice that they might bring into their own practice?
A key principle from the outset was that we would not invent new data but work with data that was already part of the public sector and which, as it had already gone through a degree of what you describe as normalisation, we were reasonably confident was good. However, as we have worked with some of that data, it has become clear that it was never designed for benchmarking purposes. For example, local financial returns are a useful data source—indeed, they form our best data source for comparative cost information for councils—but they are by no means perfect for benchmarking purposes.
I should say that normalisation is not my term; it is broadly used, particularly by financial analysts.
It is a fairly standard term.
My wife had a seven-dimensional mathematical model for all of this, but that is perhaps for another time.
We started off working with SOLACE, and the agreement was on what it called big-ticket issues—the major spend areas of councils. Procurement is a big-spend area, but work on that was already under way through Scotland Excel.
It is a service with small inputs but big outputs.
Yes—it has a big impact. We have agreed with SOLACE that, once we get through this year, those gaps will be plugged, so we will see something on procurement, some indicators on economic development and some on a couple of other areas. Planning is another area on which we need more indicators.
Do you have a quick answer as to roughly what proportion of council expenditure in Scotland goes through a procurement process?
It would be a back-of-a-fag-packet calculation.
If you do not—
It would be somewhere between 35 and 40 per cent.
So it is quite a big omission.
It is a big area but, as I said, when we started, work on procurement was already under way through Scotland Excel, working as the collective procurement agency for Scotland. We have said that we will wait until it has completed that work and then we can build on to the framework.
I understand that, but quis custodiet ipsos custodes? Who is Scotland Excel benchmarking against?
It is the collective procurement agency for councils—
I understand that.
So it is working with councils on that, and looking at their collective procurement processes. From that, we can simply adopt what it believes is good practice collectively.
This line of questioning is extremely important because many councils are not using Scotland Excel to the same extent as others.
I agree—some councils are not using Scotland Excel to full effect.
In Mr Stevenson’s and my part of the world we are doing things rather differently, with the Aberdeen city and shire joint procurement unit.
That is fine.
The issue needs to be looked at seriously.
Mr McAteer, you spoke about some of the indicators that exist. We heard earlier that they are not set in stone and that they might change. Do you anticipate that the suite of indicators will increase dramatically so that it is seen not so much as an add-on, as some earlier panellists described it, but as the key element of performance indicators within local authorities?
As was mentioned earlier, the key purpose behind the set of indicators is that they are can-openers—they are strategic-level indicators. We do not anticipate a massive growth in the number of indicators. We have just talked about some additions that need to be built in, but we might see some of the current indicators drop off over time and others replace them. We do not envisage that hundreds of new strategic-level indicators will come into the framework.
How developed is that other piece of work?
We are doing some work with some of the agencies at present. As regards economic development, for example, we are doing some work with the Scottish local authorities economic development group to look at a raft of indicators that it produced last year. Again, that will help to streamline some of the indicators. That information is much more at a management level, but it will still be important. It is necessary to drill down into the indicators to explain why councils are performing differently.
I seek clarification on who is driving the agenda. I heard you say that the Improvement Service is working with SOLACE and you talked about “we”. Who is the “we”?
It is local government.
I am sorry, Mr McAteer, but I would like clarification of that. You said, “We are doing work with other agencies and SLAED”, “We are in discussions with directors of finance”, “We want to see procurement on the list” and “We are working towards increasing the list of benchmarking indicators”. I am curious about who the “we” is and whether the agenda is being driven by the Improvement Service, SOLACE or—as you just said—local government. If it is being driven by local government, who is that? There seems to be a range of organisations on the periphery with the Improvement Service in the middle.
We were asked by SOLACE—the chief executives association for the 32 local authorities in Scotland—to support it in that work. In effect, we are the day-to-day project managers, for want of a better term.
I sought that clarification because COSLA’s submission does not give any indication that it was involved in the initial discussions on the benchmarking indices. It states that SOLACE worked on them and that they then went to the directors of finance before they went to COSLA. If there is a collective approach to benchmarking, at what stage do the various organisations become engaged in the discussions?
I will take your last point first. It is an improvement agenda about how local government can better use comparative performance data to help councils to drive their own improvement. That is where it started. SOLACE, being merely the representative of the chief executives, asked us—because we are the local authorities’ improvement agency—to support it in that activity, which we have done.
A number of other questions arise from that, but I will leave them for another day.
I have just heard the work put in the context of being an “improvement” process. I would not wish to disagree with that. However, in a territory where a lot of improvement processes and so on are already embedded, what makes benchmarking distinctively different and what does it add? If you havenae worked out where I am coming from by hearing my previous questions, I will follow up with a further question.
I am sure that you will.
Right. So it will enable councils to steal other people’s good ideas.
Yes. When I taught at universities, we used to call that plagiarism, but we now call it knowledge management sharing.
In the academic world, people get punished if they copy, but in the business world, people get punished if they do not. The big challenge that I used to have with the graduates whom I used to recruit was getting them to change their mindset.
Indeed.
It is a tool to identify opportunities for positive change, and you will help councils to use it. I will make a personal observation, as opposed to one as a member of the committee. I think that the politicians whom we heard this morning have perhaps not fully got that. They have to varying degrees—I could see them glimpsing it—but they have not taken it to their hearts, and you will help them to do that.
Yes. That is exactly right. Our role will be to support the councils. They will have to drive the improvement, but our role is critical in helping to capture and share learning, and in advising people, including politicians, how they may take that forward.
After thinking about what I have just said, I do not intend to make any comments about officials, as I am not in a position to do that.
What timescale has been put in place for benchmarking to become the central point of performance indicators as opposed to being an add-on? We heard about that from the previous panel.
That drifts into the world of policy, unfortunately. As my organisation does not do policy—we do the implementation once policy has been created—I have to assent to the line that COSLA took. Our job with councils has been to make the benchmarking framework and the supporting processes for improvement as strong and robust as we can, which will allow councils to make a legitimate argument to other public bodies about how the broader landscape can shift and change in order to create space to drive improvement forward. That will include discussions with inspection and audit bodies and the Scottish Government. Our role is simply to work on the improvement end. COSLA’s role at the political end is to carry out lobbying activities to create space to allow the process to grow and be embedded and strengthened further. I am sorry, but that is not my job.
Okay. Are you aware of any discussions on any timescales that have taken place that have involved the various bodies?
We have certainly been part of the discussion in briefing the Accounts Commission and Audit Scotland, for example, so that they are aware of the project, how it has developed, the stage of development that it is at and so on. The Improvement Service would not be involved in policy discussions about what will happen next to the broader performance frameworks that govern local government. That is not our role.
I note from the COSLA paper that the Improvement Service, SOLACE and COSLA are working on a communications strategy. Obviously, part of that will be about managing the media. You heard me ask my previous questions. If the benchmarking data is publicised, that will ultimately lead to a league table that will identify the worst councils. Where are you with your strategy on handling that type of publicity?
There are two elements to that. One element is to get the data and analyse it. The question is what it starts to tell local government collectively and individual councils. We are currently working on that. The intention is that the public report—for want of a better term—will be published early in the new year. We are waiting for key data from the Scottish Government that will not be published until December so that we can finalise some of the indicators, which is why the report will not appear until then. There will be contextual explanatory information in that report to help the public to understand what the information tells them about their authority, how well it is performing and so on, and some of the background pressures, such as the impact of welfare reform, will have to be captured.
Mr McAteer, good afternoon—it is afternoon now. Having sat through our previous two evidence sessions today with COSLA and the three council leaders, you will know that my questions are on how we achieve ownership of benchmarking in three distinct areas. First, how do we get buy-in from members and from staff, both at the front-line level and at managerial level? Perhaps you can indicate whether that has already begun. Secondly, what are the challenges, given the variations that we have heard clearly exist among local authorities? Thirdly, what role does local authority leadership play, both at officer level and at political level, in ensuring the project’s success?
Taking your last point first, I think that the role of leaders is critical. Both political leaders and senior managers have to show that they value the process and, equally, that they use the process, so that it is not just a lot of work for people with no real gain or pay-off at the end of the day. Commitment from leaders has been very strong from the outset of the project and must be sustained going forward. Leaders must show that they use the process to full effect in order to keep up the momentum among staff within each of the organisations. That is absolutely critical.
To reinforce Stewart Stevenson’s point, as well as applying pressure to do all this improvement work and get the data out there, do you consider it key that councils understand this as an opportunity?
Absolutely. Get this right for service improvement purposes and it will help people, but it will also help in dealing with those other pressures that we have talked about.
Does the Improvement Service always make a point of saying that to councils? I did not actually hear that in your response to Stewart Stevenson, although I think that he teased that out in his supplementary questions. I think that that is key.
Yes, I think that that is key.
This is going to work or fail depending on how front-line staff respond. They will buy into it if there is something in it for them. What is in it for them?
I have worked with councils in a variety of roles for 20 years or so and I have yet to work with any member of front-line staff who does not turn up to try to do a good job. At the end of the day, the benchmarking process is about helping them to do that. It is about focusing on things that absolutely matter to drive services forward, and my experience is that staff are committed to that. That is where the benchmarking process ties in with them. If you like, it helps to liberate some of their imagination and effort and to focus them on ways that can drive services forward. I am not an expert, but they are experts. My job is to put in place the framework or architecture that enables them to drive performance improvement. That is where benchmarking will help.
So you are saying that the approach will energise front-line staff.
I genuinely hope so.
Ah—the weasel word in there is “hope”.
I cannot control that, but that is what I expect and hope to see.
I am not holding you accountable for doing it, because that is the councils’ job; I am asking for professional feedback on whether we are on the right track that will lead to staff on the front line being energised and feeling that they have a contribution to make. Of course, your answer might not apply in all 32 cases; it might apply in only 20 cases.
I expect that to be the case. If it is not, we will have a serious issue and challenge, because it is a necessary element.
Sorry, but I am going to be persistent. Are we on the right track to do that?
I would say so. Given that I have been the key architect in much of the process, you would expect me to say that.
Right. You will be held accountable for that at a later date, I am sure.
What are the particular challenges of applying the benchmarking approach to community planning partnerships, particularly in relation to the contribution to the delivery of outcomes?
We need to be clear about a couple of things when we talk about how the work that we have been doing with councils on benchmarking might apply to community planning. Benchmarking is ultimately about services and how they perform but, at present, community planning partnerships do not deliver services. They are co-ordination bodies that allow the key public partners to agree the key outcomes that they then try to reflect in their delivery of services. Therefore, benchmarking applied in that context would be slightly different.
We heard earlier from COSLA that it is all going to roll out from December.
Do you mean that particular piece of work—the SOLACE work?
Yes.
It is likely to be into the new year when that is published. As I say, we are waiting for a couple of data sources from the Scottish Government. Those are controlled through national data standards. The Government just cannot give us access to the data until mid-December. Until we get that, we cannot populate a couple of our key indicators on children’s services. By the time that we get the data, we will be close to Christmas, so we will have to make a judgment about whether people might think that we are trying to sneak out data when nobody is looking. Therefore, I think that it will be into the new year before we do the final publication of all the data.
We have heard lots of positives today, but what are the negatives? What are the impediments that are still holding up the process? You have mentioned one about a data source, but what other things might be holding back the process?
I do not think that things are holding us back; the issue is just that we are dealing with an on-going and complex set of issues. That is the biggest challenge that we face. The issue is not that people lack commitment or are not putting in effort; it is just that the process takes a lot of work.
So there are no negatives or impediments.
There are no negatives, but there are challenges. That sounds like a horrible cliché when you say it out loud—we have challenges, not problems, these days. There are challenges, but they are more technical, rather than being about people lacking commitment or ambition.
I am tempted to follow on from the convener’s question by saying that the current process is part of a long line of best-practice and best-value processes in which local authorities have engaged and which preceded the current indices.
Ultimately, the local authorities pay for our service. We are a shared service of local government. At the end of the day, we are councils’ improvement body. Increasingly, we are also the improvement body for community planning partnerships, through the councils’ role of supporting CPPs. Our total budget is about £1.3 million per annum, which is paid for from the local authority settlement. The work on benchmarking is a substantial resource commitment on our part. A substantial part of my and my team’s time goes on it. There is no cash involved; it is just a work commitment from the Improvement Service, because we think that it is a strategically important development for local government and we are absolutely happy to support it. That is why we have ensured that benchmarking is one of our business priorities as an organisation. The work is taking up staff time, not cash.
So £1.3 million comes out of the local government settlement.
That is for the whole of the Improvement Service, to cover all our activities working with 32 councils and all the services in between.
I thank Mark McAteer for giving us his time again today.