Agenda item 3 is an oral evidence session on the benchmarking system that Scottish Water operates. The session follows the seminar on local government benchmarking that the committee held on Monday. A recording of that seminar is now available on YouTube, no less. Somebody will have to teach me how to get on to YouTube.
Okay. Thank you very much for inviting Scottish Water to give evidence. Scottish Water has certainly learned a lot over the past 10 or 11 years with the advent of regulation and benchmarking, and we are keen to share what we have learned with other public authorities in Scotland.
Thank you very much.
We were quite fortunate in the water industry, as the Water Services Regulation Authority—Ofwat—developed benchmarking tools and techniques back in the late 1980s. Benchmarking tools were therefore available in England and Wales. Our biggest challenge in Scotland was that we had never done benchmarking before, and we did not necessarily collect information in a way that was consistent with how the benchmarking definitions required it to be collected. We had the added complexity that benchmarking had started with the three predecessor authorities, which had been merged into one authority. Therefore, we had three different ways of collecting information and three different interpretations of the definition of benchmarking.
Besides the existing benchmarking for the three organisations, did Scottish Water look for examples of good practice or benchmarking outside the water industry or in other countries?
We have been a party to international benchmarking for some time. We also supply information to an international benchmarking league table that is organised by Ofwat.
You referred to initial resistance to benchmarking. How was that overcome and does any resistance remain?
As I said earlier, when we were given the results of the first set of benchmarking information we were initially asked to achieve 40 per cent efficiency reductions in our operating costs. That was quite a difficult concept for the senior management team. At that time they had not bought into benchmarking. We believed that in Scotland we were different.
Has the organisation now bought into benchmarking completely?
Absolutely. It is easy with hindsight; we are 10 or 11 years down the road. The benchmarking process was directionally correct and it has helped us to deliver huge improvements for customers, in both cost efficiency and service. There is no downside.
Can we go a little further back in history? You talked about the three previous water authorities. I do not know whether you have the detail but, in the days when regional councils controlled water services, did benchmarking between those councils take place?
Not that I am aware of. I was part of Central Regional Council before the water authorities were established and I am certainly not aware of benchmarking having been done at that time, but I might be incorrect.
When you embark on adopting best practice, you identify new indicators as part of the benchmarking process. Indicators can become outdated or irrelevant. How does the review process work in practice? Are there difficulties in removing indicators and adding new indicators?
One of our key indicators is the overall performance assessment—OPA. That is a basket of 17 indicators that all add up to one number and that is fairly straightforward to follow. The components of that indicator change and have changed over time. In practice, we change them in discussion with regulators and other companies. When we see that a measure no longer adds value, rather than merely that it is becoming redundant, we can change components of the benchmarking. There has been evidence of that in the past five or six years across the UK.
In your relationship with the regulator, do you negotiate with that body or do you have to listen to it?
We now work with that body. Because of our business performance and the service improvements that we have made, the regulator is a lot more comfortable that we understand our business and are driving it in the right direction.
You have a number of regulators—four, in fact. They are the drinking water quality regulator for Scotland, the Scottish Environment Protection Agency, Waterwatch Scotland and the WICS. Is that correct?
Waterwatch Scotland is now Consumer Focus Scotland.
Do you have to talk to all those people about changes to indicators?
Historically, we have not done that. Because the Water Industry Commission runs the benchmarking and because it is based on a UK-wide set of information, it has involved a bilateral conversation between us and the commission.
Do those bodies talk to one another about changes to your indicators?
Yes. The commission has discussed with the drinking water quality regulator and SEPA the relevance of indicators, to ensure that they remain relevant. It would not be correct to chase benchmarking for the sake of it; benchmarking must deliver improvements.
How are the outputs of benchmarking tools used in practice? Can Scottish Water give practical examples of changes that have been made as a result of the benchmarking data?
The two most significant examples that I can give concern cost efficiency and service improvement. We have driven quite a lot of efficiency through Scottish Water and have reduced our costs significantly. The improvement in our service is shown in the overall performance assessment. When we were first benchmarked with England and Wales, we were definitely in the lower quartile of the service indicator. At that point, we were very much a low-service, high-cost organisation. Over the past five or six years, however, we have moved that overall performance assessment right up into the top quartile, as is evident from the commission’s published performance reports. We are now working towards—and we are pretty much at—being a high-service, low-cost organisation.
Can Scottish Water explain in more detail how “special factors” work in benchmarking?
Some of the benchmarking tools are quite complex, are not accurate and do not explain every organisation’s particular set of circumstances. There is, therefore, a need for explanatory or special factors that explain a different set of circumstances. We have used those for all our operations in the north-west region, where we deliver the service to a much more remote population and incur more cost because we drive much longer distances in responding to customer service issues. We set about explaining that, which gave us the ability to make an adjustment to some of the benchmarking models.
It might be useful for the committee to catch sight of one of those indicators along with the explanatory notes showing how you look at Aberdeen compared to the north-west Highlands or wherever it may be. If that could be sent to the committee, that would be very useful.
We can supply that.
Are there still shortcomings in the Scottish Water benchmarking system? If so, how are they dealt with?
It is all relative. We certainly had shortcomings when we first started and even now the information is not perfect, as it is difficult to get perfect information. Are you asking how we overcame those challenges, or have I misunderstood your question?
It was about the shortcomings in the Scottish Water benchmarking system. How are you dealing with them?
Sorry—I beg your pardon. In the early days, we had shortcomings in things such as data quality, so the organisation looked at the key parts of benchmarking. Quite a lot of benchmark information is asked for, and it is important to understand that some of that information is more important than the rest. It is a matter of trying to see the wood for the trees.
Obviously, we have read about and you have spoken about Scottish Water’s improvements in productivity. Were they down to benchmarking or were they a product of the merger of the three bodies?
They were down to a combination of both. The benchmarking told us exactly where Scottish Water lay in the league tables and what the efficiency gap was, but there were undoubtedly merger efficiencies. We did not need to have three head offices or three management teams. Efficiencies automatically came from going from three bodies to one body, but the benchmarking took us on that efficient service delivery journey by establishing what the efficiency gap was.
Good morning, Belinda. The committee is interested in looking at benchmarking in general to see whether lessons can be learned for local government. From your experience at Scottish Water and with the benefit of hindsight, is there anything that you would have done differently in establishing the system?
If I were to do things differently, I think that it would be helpful to have greater clarity and understanding at the outset about why benchmarking is needed and what benefits it will deliver. I do not think that the conversations between the regulator and the predecessor authorities were full enough. The conversations became fuller when they were between the regulator and Scottish Water.
I will tease out the league table issues a little later. Basically, it is a matter of putting benchmarking in context and spending a lot of time up front selling it, on the presumption that one volunteer is worth 10 conscripts and that there will be a better chance of success if we take everybody with us.
Probably, although it is difficult to be specific, as I am not clear about the type of benchmarking that the local authorities might do.
Was there an independent assessment of what you did in that process?
Yes, there was. We had what used to be called the reporter, who had a formal audit role for Scottish Water. He would come and audit us and all the information and would then report independently back to the commission. He would also report to our audit committee.
So independent assessment was crucial at the beginning and, thereafter, is used as and when the organisation thinks that it would be helpful.
Yes. The added benefit of the independent scrutiny is that those individuals—the assessor is more than one person; it is a team of auditors—bring experience from other organisations to the task. Therefore, in conversation, they can tell us that one of the other organisations does something in a slightly different way that works better and is more effective. That means that we have an in-built learning and feedback loop that comes with the assessor services.
That is helpful.
We have certainly had adverse coverage over the years, although we do not have so much now. For our reputation, we do not want adverse coverage, so it creates added stimulus for more focus on improving services. It drives behaviour within the business. I am sure that if, when we had the bad press, we had been asked whether we saw the upside of it, we would have said that we did not. With the benefit of hindsight, I would say that, although it is not a place where we want to be, it drives the right behaviours in the business for ensuring that we create the service change that we need.
Was there a positive attempt to tell the press that you realised that you fell short of what you would like and would take steps to remedy it?
We have always been consistent about being aware of any shortcomings that are picked up and about being committed to service improvement and to providing the service more effectively and efficiently. We have always been positive about what we take out of bad press, but it provides the impetus for not wanting to be in that position year after year.
Convener, from the evidence that we have taken so far, it seems that people are very wary of press coverage. It would be useful to get as much information as possible on good practice and how it was managed well.
Indeed.
The purpose of a benchmarking exercise is to determine how Scottish Water is doing on a range of indicators by comparison to some of its peers—the other companies in the UK or, if the same indicators are used, the six Scandinavian cities that have been mentioned. That is useful, because it allows us and the public to see how Scottish Water is doing in comparison to other similar organisations. However, I imagine that it might be even more useful, certainly to the people whom I represent in Cumbernauld and Kilsyth, to know how you are doing against the range of indicators in Cumbernauld and Kilsyth compared to how you are doing against those indicators in other parts of Scotland. Do you undertake that form of area-to-area benchmarking exercise in Scotland and, if so, how do you do it?
We do some internal area-to-area benchmarking. That starts to get difficult because of the regional boundaries and where the cut-offs are. This is perhaps adding a bit of technical complexity, but the treatment works at Loch Turret could be delivering water to Cumbernauld, when the operational boundary stops at Stirling. Therefore, it is sometimes tricky to give local benchmarking information and for that to be precise and accurate. It is much easier when the information is taken up to a higher level and we look at a whole country. We do some area benchmarking, but that is regional and changes over time because we change our operating areas, although that is for service reasons, not benchmarking ones.
So the areas that you use do not compare to Scottish Parliament constituency boundaries or local authority boundaries.
No.
That begs the question about how useful that information is. What areas do you use? Would people recognise them? Is that information useful only for internal purposes?
It is useful only for internal purposes at this stage. I am happy to have a think about whether there is merit in dropping down to other levels of detail. I can get back to you on that, but my initial reflection is that there is no merit in it. The lowest level that we drop down to is that of our current operational areas, such as the north-east and south-west. We find that regional and council boundaries change over time, so—
To be fair, they do not change that often. The most recent local government reorganisation was in the mid-1990s, and before that it happened in the 1970s. We are not talking about something that happens regularly.
Possibly not, but keeping information at a high level enables us to do process benchmarking, which is a slightly different approach. It involves considering, for example, how a works in Aberdeen compares with one in Edinburgh. That operational information is important for us. I would need to think about what would be more helpful for customers in Cumbernauld, for example.
Your customers are our constituents, so that would be worth while.
Absolutely.
Scottish Water was established in 2002, but it was not until 2005-06 that there were annual savings of well over £100 million and, since then, the figure has improved. Between the establishment of Scottish Water and the years when a return was made, was there an on-going cost associated with the efforts to make savings? Were you spending money to get the return in the end? What was the cost of embarking on the benchmarking process?
The cost of embarking on the benchmarking process itself was pretty modest—it certainly was not material. Given the number of people who were involved, we probably did that for hundreds of thousands of pounds, and certainly not millions of pounds. In the price control, or financial settlement, that the water industry commissioner set for us for 2002 to 2006, he recognised that there was a necessity to spend money up front to deliver efficiencies down the line. That was called a spend-to-save allowance. An amount of money was allowed in the first price control to facilitate the delivery of efficiency savings.
This is perhaps an unfair question, but will you say a bit more about the additional skills that, in your opinion, are required to allow local authorities to successfully introduce benchmarking?
That is not an unfair question. We were in the same situation ourselves. Our experience was that we needed to have much higher levels of analytical capability. We were being regulated, which had never happened to us before, so we needed regulatory economists, or what other people might call industrial economists, to help us through the whole sea of benchmarking and regulation.
When you agreed that benchmarking was the way forward, what was the timescale for the indicators being agreed and implemented?
There was a fairly short timescale, because the UK benchmarking system, which had been developed in England and Wales, was already in place. That system had been developed and we were asked to benchmark our information on that basis, so there was no transitional period. We were given the tables of information that were being requested and had to complete them.
As there are no other questions, I thank Ms Oldfield very much for her evidence. It has been extremely useful for the committee to hear about Scottish Water’s experience. As agreed, we now move into private session.