
Constitution, Europe, External Affairs and Culture Committee

Meeting date: Thursday, June 23, 2022

Agenda: Decision on Taking Business in Private, Channel 4, Scotland’s Census


Contents


Scotland’s Census

The Convener

Item 3 is Scotland’s census. From the National Records of Scotland, I welcome Paul Lowe, registrar general and chief executive, Peter Whitehouse, director of statistical services, and Anne Slater, director of operations and deputy registrar general. I invite the registrar general to make a brief opening statement.

Paul Lowe (National Records of Scotland)

Good morning. Thank you, convener, and good morning, committee. Scotland’s census is a highly complex programme and, like many other modern censuses, it consists of a number of different elements. It brings together high-quality census returns, a coverage survey, peer-reviewed statistical techniques and the use of high-quality administrative data to provide additional quality assurance.

Our approach to delivering the census was informed by stakeholder engagement, work with other census-taking bodies and user research. Our responsibilities were to implement the legislation and put in place the tools and support to enable citizens to meet their personal legal obligation to complete their census.

The 2022 census provided more options and greater flexibility to complete the census than had previously been available, whether online, on paper or through assisted completion. More than 2 million households, or 89 per cent of respondents, selected the online route, showing a clear public preference for that approach. However, paper questionnaires were widely available, and more than 600,000 were issued during the census.

In advance of today’s meeting, we provided the committee with some facts and figures about the activities from Scotland’s census. They demonstrate the phenomenal effort of census staff to support the public to complete their census. I would like to thank everyone involved in delivering the census and the many organisations and individuals who have engaged with us.

As the committee is aware, on 28 April, the cabinet secretary announced to the Parliament that the census collection would be extended by one month to give households that had not yet completed their returns an additional opportunity to do so. On 31 May the public awareness campaign came to an end and our field operations ceased. In line with practice in other UK censuses, we continued to accept late returns for a short period afterwards. As of yesterday, the national return rate was 89 per cent, with more than 2.3 million household returns.

The month-long extension to the collection period had a positive impact on return rates. The national return rate increased by 9.8 percentage points after 1 May; 30 out of 32 local authorities met the NRS target of an 85 per cent local authority response rate, whereas only one had met it by 1 May; and 18 local authorities met or exceeded 90 per cent. The most notable difference was in Glasgow, where the return rate increased by 12.4 percentage points.

I regret that we were not quite able to secure the 90 per cent or better that we advised your predecessor committee would be met. It is clear that returns have been lower than they were in 2011. It is important to understand the reasons for that and what it means for future census exercises. However, at 89 per cent we are very close to what we set out to achieve.

My panel of international experts has confirmed that we have a solid foundation to move to the next phase, and that is what we are now doing. The census coverage survey, which is the second-largest social survey undertaken in Scotland, is now under way. The CCS has been used in the past two censuses in Scotland. It is critical to our understanding of who has been missed by the census collection, and it allows our statisticians to estimate the volume and characteristics of the people and households that are missing from the census. Together with other measures, it underpins the production of high-quality estimates of the size and structure of Scotland’s population.

I look forward to answering your questions today. Thank you.

The Convener

Mr Lowe, the census work is not yet complete, although the deadline has passed. Can you give us a bit of background on what remains to be done and possible timescales for when we might have a report on the learning points from this year’s census?

Paul Lowe

The next phase is the census coverage survey, which is getting under way now and will run until the end of July or early August. It is a doorstep survey of approximately 50,000 households.

We are continuing to gather lessons learned as part of the programme. In common with previous censuses, we will prepare an evaluation of the census that will go to the Parliament. It will usually be produced after the first output results from the census, so that will probably be in 2023, but we are happy to keep the committee up to date with the lessons and information that we gather in the intervening time.

The Convener

I will move to questions from the committee. Ms Boyack is first.

Sarah Boyack

It was very useful to get your written evidence. I will ask you a couple of questions about the timing. I understand that, when asked, almost one third of the population was not aware of the census, particularly given the change of timing, and I also want to look at the digital issue. My understanding is that, when the 2021 census was carried out in the rest of the UK, there was a safety-net approach to try to include people. You made a big deal of the digital response rate, but having to send out 600,000 paper forms is not a safety-net approach that targets areas of lower-income households and a disproportionately older population, and also rural areas.

Can you comment on that, and can you give us comparable statistics on local authority turnouts in terms of households and individuals? I want to make sure that the local authority turnout data that we have is comparable. How are you going to go below the local authority level to make sure that people who did not respond to the census, or areas where people disproportionately did not respond, do not miss out? Will you be producing evidence or analysing the census output areas so that we get accurate knowledge about who has been missed in the census?

Paul Lowe

There are three questions, so I will pick them up in turn. The information that you quoted was based on a survey of people who had not completed the census at the end of the census, so it was not a general survey of the population to assess their understanding or awareness. I just wanted to clarify that point.

Obviously, the largest group of people who responded—35 per cent—reported that they were too busy or just did not have the time to do the census; other reasons were also stated. There clearly are a number of reasons why people in that final group did not return their response, and we need to understand them. There will be things that we can take into account and lessons that we can learn from this and build into the future design of the census. I also think that the situation flags some potential changes in public and societal attitudes, which will also require close thought when censuses are launched and run in the future.

On your point about the Office for National Statistics, you are right that at the very start of the census the ONS issued some targeted paper forms. We did not do that, but from 28 February, people were able to request paper forms well in advance of census day on 20 March. We received in excess of 360,000 requests for paper forms through that route alone. We did, however, issue some forms proactively, taking into account some of the circumstances that you spoke about—digital exclusion and various other factors—and we issued more than 115,000 forms proactively to that group. That was not on day 1; it was some weeks later. Our field teams also issued around 92,000 forms, some of which were posted through doors where there was no response and some of which were issued because householders requested them.

One of the biggest enigmas is that, of the 600,000 forms that we issued, we received fewer than half back. Even if we focus on the 363,000 cases in which somebody proactively got in touch with us and asked for a form, only about two thirds of those forms were returned to us.

On the final point about sub-local authority data, I will hand you over to Pete Whitehouse, our chief statistician, to give you a bit more information.

Peter Whitehouse (National Records of Scotland)

Good morning, everybody. I hope that you can all hear me. The question, as I understood it, was about return rates at local authority level and below, so thank you for that question.

What we have presented at various points through the programme, and continue to present today and provide to the committee, are local authority and national return rates. We have a household register from which we send out forms and invitations to take part in the census. Return rates are measured against that register and against the work that our field force does to make sure that we capture all the households that are in scope: we exclude, for example, vacant properties and businesses, and we pick up conversions where flats or houses have changed in nature, size or the number of homes at that location. We do a lot of work to understand what a household group is, and that is what we report on, so the return rates are for the responses that we have gathered from households. As I say, we present those figures at local authority level.

As Paul Lowe has mentioned, we then carry out—and are carrying out at this moment—a census coverage survey, which all the census bodies across the UK have been using as a statistical tool since 2001. That allows us to get a good understanding of the households, or the types of areas, where the numbers of returns have been lower than we were looking for, and it helps us to understand any gaps in the census data. We then add the administrative data that we are continuing to develop and evolve. Many people on the committee will be fully aware of how administrative data is now a much fuller part of the analytical base of statistics across all dimensions of the economy and society.

We are working with colleagues across the UK, but also very particularly within Scotland, to make sure that we make full use of all that information. For example, knowing roughly from the pupil census how many school-aged children are in a particular area helps us to understand how many the census should be covering. If we do not see some of those figures, we know that we need to use statistical techniques to make sure that we cover them.

The technique is to gather the information from household returns, use the CCS to understand where we need to make adjustments, and use administrative data to help us with those adjustments and with any biases that may be in the data. Then, to get to the end of my answer, we present our outputs: our census and statistical estimates. Those will be presented for census output areas and our other low-level geographies, and they will be our estimates of the size and characteristics of the population. The smaller the area that you look at, the less data you get, which also protects confidentiality and privacy, so at that level you get population estimates; as you aggregate areas, you get more and more detail and much more of the richness. I hope that that has answered the question.
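[Editorial note: the estimation approach that Mr Whitehouse describes—combining census returns with an independent coverage survey to estimate those who were missed—is conventionally implemented as dual-system (capture-recapture) estimation. The following is a minimal illustrative sketch of that calculation only; the function name and the figures are hypothetical and do not represent NRS’s actual methodology.]

```python
def dual_system_estimate(census_count: int, ccs_count: int, matched: int) -> float:
    """Dual-system (Lincoln-Petersen) estimate of the true population of an area.

    census_count: people counted by the census in the area
    ccs_count:    people counted by the coverage survey in the same area
    matched:      people found in both sources after record matching
    """
    if matched == 0:
        raise ValueError("no matched records; the estimate is undefined")
    # The CCS match rate (matched / ccs_count) estimates the census coverage
    # rate, so the population is estimated as census_count / coverage rate,
    # which rearranges to the classic formula below.
    return census_count * ccs_count / matched

# Hypothetical example: the census finds 8,900 people, the CCS finds 500,
# and 445 CCS records match census records, implying 89 per cent coverage
# and an estimated population of 8,900 / 0.89 = 10,000.
print(dual_system_estimate(8_900, 500, 445))  # → 10000.0
```

In practice the estimation is far richer than this sketch: adjustments are made separately by area and demographic group, and administrative data is used to correct biases, as described above.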

Sarah Boyack

It does not quite answer the question, because I was asking for the comparable figures from local authority level data for the 2022 census and the 2011 census. I am particularly interested in credibility. I have looked at the statistics, and I want to double-check that my interpretation is right. The gap is significant—for example, the figure for West Dunbartonshire was 11 per cent down from 2011. However, I want to check that I am using the right figures in terms of households and individual responses.

I want to go back to the information about people not knowing about the census or their personal responsibility. There would be even more of an impact if, several weeks into the census programme, a third of the population were still not aware of their obligations or the impact of the census.

All that goes back to the credibility of the 2022 census, given the aspirations to hit a response rate of around 94 per cent. What do those figures do for the effectiveness and usefulness of this year’s census?

Peter Whitehouse

At the moment, the data that we have on 2022 is household returns—you may have population returns in front of you. I do not have the 2011 stuff to hand, but a comparison can be made—we will do this later in the process—between the individual population returns from 2022 and those from 2011. However, at this point in time, we have the household returns.

My point on comparisons is that the census is increasingly not just an administrative count. In 1991, the census was run as an administrative count and what we got was what we got. We understand that there was probably an undercount of the population. Since 2001, along with the ONS, the Northern Ireland Statistics and Research Agency and other census organisations across the globe, we have taken a different approach. We have done lots of work, particularly during the extension, to make sure, as far as possible, that coverage is across Scotland and all communities. Then we do our census coverage survey and administrative data work. It is about the combination of data.

10:45  

Comparisons between return rates from 2022 and from 2011 or earlier are of some use and of interest, but they are not the sole measure of the quality of the census outputs. We try to build on our census returns, add in the knowledge that we get from our census coverage survey, add in administrative data and use that to produce high-quality census outputs. We will bring together those pieces of evidence. That is where we are getting advice from our international steering group on different statistical methodologies and how to maximise those.

On awareness of the census, as Paul Lowe said, the information was gathered by field force staff on the doorstep, as they were seeking final completion. That information is not from the population in its entirety. It is from the people who at that point, right at the end of the census period, had not yet completed a census form. Of those, a third said that they were too busy.

As I said, we had 2.32 million returns, which is a significant amount of the population. However, as I say, it is not just about the census returns; it is about all the other valuable work that we add in. That now happens in all censuses that run that kind of modern approach to gathering the data.

In the presentation of our outputs, we will produce high-quality population estimates, bounded by statistical confidence intervals, to allow users to understand the variability in them. That is also how we will present all our other data.

Sarah Boyack

So, in your view, there is no issue about the credibility of the census. When I visited with your enumerators, I was struck by the fact that I was in a very significant area and, with just under two weeks to go, there was a turnout rate of 57 per cent. It just did not tick the box of 94 per cent. How will those missing households and missing people be accounted for so that their needs are not ignored in future investment or Government policy? Even after today’s answers, I have significant worries about that.

Paul Lowe

I was grateful to you for coming out and seeing the experience of field force staff. I entirely appreciate the concern, which I know you have illuminated in other places. The census coverage survey goes out to some of those same places and gathers information—its purpose is to fill in the gaps where people have not responded. The first thing is that the extension period increased the response rate across Glasgow by 12.4 percentage points. We focused a lot of effort on that. We had field teams and put additional people into Glasgow. There was an additional focus over that four-week period to target the places that had the lowest response rates and bring them up.

I could have done a very cynical exercise of sending out field force staff to low-hanging-fruit areas across the country and got a 92 per cent response rate, but it would not have been a good-quality census, because I would not have good-quality data about the communities and areas that you are rightly concerned about. That is not what we did. We picked the areas where we had the lowest response rates and where we needed to know more. We focused on and targeted our resource at those areas to drive up the response rates and gather more data. If you look at the shift in response rates, you will see that they are most significant in places such as Glasgow, Dundee and other places where deprivation is a factor—that is because we took that approach.

I agree that there are still some differences compared to what was achieved in 2011, but in 2011 there was also variability across the country. The census coverage survey and the additional data and work that Peter Whitehouse has talked about are used to address those gaps and issues. We do not see any issues with credibility at the moment. I understand the public interest in the issue, which is why I brought together an international expert panel of individuals to look at what we had done. We had a number of sessions in which we presented what had been achieved, what we were doing, how we had done it, where we had got to and what we were planning to do next. That is an important independent assurance to anyone who has a concern on the issue. Those experts said that we have a solid foundation and that it was right for us to move on to the census coverage survey, which is what we have done.

On the point about lack of awareness, we have all picked up on the point that the figure on that came from gathering views at the very tail end of the census—the last week, rather than a few weeks in—from people who had not responded. We also have to remember that everybody in the country received a letter on how to take part—2.7 million letters were issued—and that was not a digital or email approach. Everyone received a physical letter around 28 February. For people who did not respond, up to five reminder letters were then put through their doors. There were hundreds of television adverts and thousands of social media adverts and physical advertising. There was work with a range of partners, including local government and others, which put out communications across their areas and to the different groups that they work with. There was an extensive campaign to reach people.

It is always difficult to measure reach exactly, but the communications industry has measures for that and, on those measures, in the first phase of our marketing campaign, which covered the first five weeks of the census, approximately 98 per cent of the population had access to a minimum of six advertised messages about the census. I do not think that there is an issue about lack of public awareness; people were reached in a range and combination of ways.

Donald Cameron

Good morning to the panel. I want to ask about the target. There has been a suggestion this morning, and in your letter to us of yesterday, that the target was 90 per cent or thereabouts. Do you accept that, in the November 2019 document from which one of the key performance indicators that you cite comes, you defined, as the overarching definition of success, a person response rate of at least 94 per cent? Further, I refer to an evidence session with our predecessor committee in September 2020. In that, Mr Whitehouse mentioned the figure of 90 per cent, but he went on to say that the 2011 figure of 94 per cent

“gives us what we are aiming for.”

He went on to talk about

“a good mid-90s response.”—[Official Report, Culture, Tourism, Europe and External Affairs Committee, 17 September 2020; c 32, 33.]

Do you accept that, both in evidence to the Parliament and on paper, you said that 94 per cent was the target that you were aiming for?

Paul Lowe

There is a document that quotes that figure. The figure somewhat predates my arrival in the organisation, but it is based on replicating the response rate in 2011. It is not based on an assessment that, if the organisation or Scotland’s people failed to return 94 per cent, the census would suddenly become worthless. It is not an either/or argument. I think that the cabinet secretary at the time, Ms Hyslop, said that we were looking for around 90 per cent plus, and that was what we articulated in our evidence at the time. We will always want to get as high a rate as possible, because that improves the quality of the data but, at 89 per cent at the moment, we are not in a position in which we do not have a credible response. The data that we have is more than sufficient. It is very challenging territory to suggest that, when over 2.32 million Scottish households have responded to the census, the census data is of no worth or value.

Donald Cameron

Do you not think it reasonable to expect a response rate in 2022 of at least the response rate that you achieved in 2011?

Paul Lowe

No—not necessarily. In the 2001 census, the figure was 96 per cent, so there was a two-percentage-point reduction in the response rate in the following census. Response rates fluctuate; it cannot be assumed that you will always replicate the rate in the previous census.

Donald Cameron

I turn to Mr Whitehouse, given that he said in September 2020 that a 94 per cent response rate

“gives us what we are aiming for”,

and then spoke about

“a good mid-90s response.”

Do you stand by those comments?

Peter Whitehouse

The conversation, as I remember it, was in the context of where we might be if we were trying to deliver in 2021. There were conversations in the committee and elsewhere about our concern that the percentage return rate would be in the 60s and 70s. My language around 90 per cent was a broad message that that is where we wanted to be. The performance indicator was set as a programme performance indicator because that was what was achieved in 2011. However, it is absolutely clear—I put my hands up and say that I could have been clearer at the time—that this is about getting as many census returns as possible within a reasonable timeframe. That then allows us to move forward, add the data that we are gathering through the census coverage survey, add the administrative data, innovate and add our statistical estimation methodology and produce bounded statistical estimates of the characteristics and numbers of the Scottish population.

It is perhaps unfortunate that there is a focus on one key performance indicator as opposed to the KPI around an 85 per cent threshold. As Paul Lowe set out at the beginning, more than half of the local authorities were above 90 per cent, 30 were above 85 per cent and Glasgow was the lowest, at just under that. Therefore, on that indicator, we are in a good place. I am confident that, with the census coverage survey, the administrative data and our statistical methodology, we will deliver those bounded high-quality census outputs that we are all driving to achieve.

Donald Cameron

Thank you for those answers. I will turn to the question of the safety net that Sarah Boyack was asking you about, because I think that this is an important distinction between what happened in England and Wales and what happened in Scotland. In England and Wales, as we have heard, where the take-up of online completion was expected to be low—for example, in digitally excluded areas, areas of deprivation and rural areas, where there was perhaps a disproportionately high elderly population—the ONS sent paper copies out at the outset, I think to about 10 per cent of households, of whom half responded by filling out the paper copy. In Scotland, that was not done. Given the eventual return rate, do you accept that that was an error?

Paul Lowe

The issue here was one of timing, in that we did that a few weeks later than the ONS. There is no evidence to support the assertion that that was a critical difference. We will obviously see whether we can determine and learn lessons from it. It is a difference but, equally, in Scotland we issued considerably more reminders to people than the ONS did. There are a number of differences in design and approach, including some things that we did earlier or did in higher quantities or more frequently than the ONS did. As I said, we proactively issued large quantities of forms to that group of people well within the timescale for people to respond to the census. Certainly, we will look at the effectiveness of what the ONS did as part of our lessons-learned process.

Peter Whitehouse

I am not sure that I have anything to add to what Paul Lowe has said.

Jenni Minto

I have a very quick question. Mr Lowe, you have twice referred to the census results as being a “solid foundation”. As I am a layperson, can you explain what that message from the international steering group means?

Paul Lowe

As we articulated in evidence to your predecessor committee around the decision to delay the census due to Covid, we anticipated that we would achieve a response rate somewhere in the 60 to 70 per cent range, which we assessed as far too low to provide a credible census return.

11:00  

We have an understanding of what constitutes a credible return that we can then take and use with the other elements, versus something that would not produce data of the quality required of a census. Our sense was that we were looking to deliver a census with a response rate of 90 per cent plus and to get as high as we could. We have ended up at 89 per cent. Our internal assessment as an organisation is that that is a very high level of response. I think that anyone would struggle to think of any other exercise in public engagement that would get that response rate.

Because of the scale of the public debate and some of the criticisms about the census response rate in Scotland, I thought that it was important to provide additional reassurance that that was not just our advice and that we were not just marking our own homework but had brought in some very credible worldwide experts in censuses and coverage surveys to look at where we had got to with the census. They could have come back and said, “The census return rate is inadequate. You should continue to collect census data.”

The quote that you reference is from the chair of the panel, Professor James Brown, who is a professor of official statistics in Sydney, but it was endorsed by the panel members. Their judgment is that we have reached a reasonable, sensible, credible point to stop the census collection and move on to the next phase of the census. That is essentially what it is trying to capture.

Jenni Minto

If I understand correctly, you are saying that the level of returns, and the information that will be able to be gathered from the census returns, is a suitable level to allow the decisions that are required to be made on the basis of the census to be made.

Paul Lowe

Absolutely correct.

Graham Simpson

Mr Lowe, let us see whether I picked you up correctly. You said that, when you were considering whether to go ahead in 2021, you expected that, if you had gone ahead, you would have got a return rate of 60 to 70 per cent. Am I correct? Mr Lowe is nodding. Of course, the census went ahead in the rest of the UK and was more successful than the Scottish census, which was delayed by a year, has been. You got that wrong, didn’t you?

Paul Lowe

No, because that is comparing two different things. The context of the evidence to the committee that I am referring to was that there were a number of different circumstances in Scotland that would have resulted in significant changes being made to the census design, which would have resulted in a lower response rate. If you recall, during the pandemic at that time in 2020, we did not have mass testing and we did not have clarity on when we were going to get a vaccine. Our colleagues in the United States were running a census during the pandemic that ran into considerable difficulties and they had to double their collection period. Other events were obviously being rescheduled, including the local government elections. The census is not something that you can decide to cancel a day or two before. Either you decide to run it or you decide to reschedule it, and you have to do that far in advance. We were having to take some decisions based on the evidence and information that was available to us at the time.

The ONS was also undertaking similar considerations, but its circumstances were different, so it estimated that, if it were to reschedule its census by a year, it would cost £365 million, which was about 39 per cent of its total programme costs. Also, as the national statistics institute, it had a resource of 6,000 people and a budget of close to £1 billion with which to manage the additional pressures and issues that resulted from the pandemic. The final element that was relevant to it is that it had been working independently for many years on the development of administrative data. That was related to its wider functions in economic and social survey statistics but, as the Covid pandemic hit, it started to look at how it could use that data so that, if it encountered a situation with low response rates, it could mitigate that by using those administrative data resources.

In Scotland, and as confirmed by the chief statistician for Scotland, that data was not available. Each organisation was looking to make a risk-based decision, and there were a number of things that made the situation with the ONS in England different from what was the case in Scotland. In Scotland, we could not have run the census. We had exhausted our contingency time. We were dealing with other demands, including the production of Covid statistics, and we were also moving resource to deal with the radical changes that were being made to the registration system in Scotland at the time of the pandemic. You may recall that there was huge concern about the ability to manage and register the deaths of people from Covid at that time. I am responsible for the death registration system; the ONS is not responsible for the death registration system in England. I had to pivot resource and people to deal with those tasks. I did not have a 6,000-strong organisation that I could just borrow additional people from to do that, so choices had to be made about priorities and what could and could not be delivered at that time.

Graham Simpson

You are saying that it was impossible for you to have run the census in 2021. Who took that decision? Was it you or was it ministers?

Paul Lowe

Ultimately it was ministers. To clarify that process, we undertook a detailed impact assessment analysis of the threats and risks of Covid to the delivery of the census programme. Having undertaken that exercise—and we published a summary of the results back in 2020—we reached the conclusion that we could not deliver the census as conceived for March 2021. We did not have the time or the people left in order to do that and deliver it in a different way.

There was also a set of circumstances to do with the public response and reaction to the possibility of gathering the census data while Scotland was in lockdown. Again, if you recall, in March 2021 Scotland was still in lockdown although England and Wales had come out of lockdown, so there were differences in the restrictions. We looked at alternative options to deliver the census that would have maintained the date. Those involved using an all-paper approach, using an online-only approach, and using both but without a field force. Our conclusion about all of those options was that we would have seen a massively reduced response rate.

There have been questions today about getting to 89 per cent and whether that is good enough or what it means. We would have had a considerably lower response rate in Scotland than anything that we have achieved at this time. To be honest, there would have been real credibility issues about the nature of the census data gathered at that time. We were taking decisions based on the information that was available at the time, the risks that existed, and the fact that there were differences between Scotland, and England and Wales, as part of that risk-based decision making. We also recognised that censuses are extremely costly to cancel at the 11th hour and extremely costly to run a follow-up for if the results are not achieved.

Graham Simpson

I am aware that other members want in, convener, but I have a final question. Concerns were raised when the decision was taken to delay for a year. Various experts—I do not need to list them; you will know who they are—came out and said that that could have an impact, and that appears to have been the case. We always speak about lessons learned. Do you think, moving ahead, that Scotland’s census and the rest of the UK’s census could get back into lockstep next time around?

Paul Lowe

To clarify, this was not a political decision. This was the result of an analysis undertaken by the NRS as a census-taking organisation, based on the threat to the delivery of the census in Scotland, for all the reasons that we have talked about. We made those recommendations to ministers and ministers agreed them, but it was not ministers asking us to delay the census by a year. Ultimately, the decision about when the next census is taken is for Parliament to make, but we have to reflect that the census has been moved out of step only twice in its 200-plus-year history: once during the second world war and once, in Scotland, during the pandemic.

I appreciate and understand entirely why comparisons are being made with what happened in England and Wales, but we have to remember that 71 per cent of the countries in the world that were planning to take a census in 2020 and 2021 delayed it, including Ireland, Germany and Italy. It was not an unusual decision that was taken here; it was a decision that many nations across the world, including western democracies, were taking at that very same time for that very same reason.

Alasdair Allan

On that point, I have a hypothetical question based on what you have been talking about. You have indicated how difficult it would have been from a practical point of view to organise a census if the decision had been taken to go ahead with a census at the low point—or high point, however you want to look at it—of the restrictions around the pandemic. However, would it also have created some very strange data for historians looking back?

Paul Lowe

That is a very insightful question. One of the purposes of the census is to ask the same questions of everyone at the same time and to gather data that reflects society as it exists and remains usable in future years. One of the challenges of taking a census during a pandemic is that it gathers data at an unusual point in society, which is a source of criticism of such censuses by some academics. On one hand, people can say that it is helpful to get data about that unusual thing that happened but, on the other hand, others will say that it is not representative of society in a normal state and that it is representative only of society in a lockdown or near-lockdown position.

If you take a census during such a time, you get into difficulties because, for example, students are not in the places that they would normally be. If we had taken a census in 2021, the population of St Andrews would have looked and felt very different. People were working at home, so the data that you get on where people work, how they travel and how they get there, which informs transport decisions and other decisions, is skewed. One of the things that our colleagues in the ONS had to do—it is a hugely capable organisation, so it was able to do it—was to make adjustments for the fact that the population was not in the same places doing exactly the same things during the pandemic in 2021.

Alasdair Allan

My other question is about household visits. The data that you have provided suggests that there were more than 1.5 million household visits across the country by field staff, and that more than half the households in my local authority area had such a visit. Can you explain for us what a household visit constitutes?

Paul Lowe

It is true to say that 1.7 million address visits were undertaken. That could have covered situations where people were not in at the time. Anne Slater, who is operations director and managed the field teams, can talk about the protocols and what was done under those circumstances.

11:15  

Anne Slater (National Records of Scotland)

I hope that everybody can hear me. Let me know if you cannot. Our field force would get a note of the addresses that they were to visit on a daily basis. A household visit would mean that the field force person would go to the household and make contact with whoever was living there. If they made contact, they would talk to them about the census and the different ways that it could be completed. Before the extension period, they would directly contact our contact centre if there were issues and do telephone data capture—[Inaudible.]—digital. They would offer the householder a paper form if they did not have one, and they would make sure that they knew how to complete the census online. They were making sure that the householder was aware of all the channels and also exploring with the householder whether there were any barriers to completion and what else they could do to help.

In some instances, the householder would complete the paper questionnaire and the field person would arrange to go back the next day and collect it from them and post it. If the householder was not in when the field person rang the doorbell or whatever, they would put a calling card through the door, which had some information about the census on it and the phone number for our contact centre if they needed further help. We had a system that recorded whether there had been no contact made and that address would come back to be subsequently enumerated. Hence, as has been mentioned in other explanations, householders often had more than one visit.

In the first instance, there was roughly 70 per cent non-contact and 30 per cent contact. That dropped slightly when we moved into May, possibly as a result of people being out because there was better weather. I hope that that gives you everything that you need to know.

The Convener

I am conscious of time, but Mr Ruskell wants to ask a question.

Mark Ruskell

I appreciate the technical nature of the evidence this morning and, as you said, it was a technical decision to delay rather than a political one. Most of my questions have already been answered, but I wanted to pick up on one thing that Paul Lowe alluded to earlier, around changing attitudes in society towards these censuses. Could you expand on that? Did I pick you up correctly that there may be a changing attitude?

Paul Lowe

I have to be careful, because some of this involves understanding of other data outwith the census, but we have to reflect on the fact that there were a significant number of events in the past year or two. We have had more than two years of people living under considerable Covid restrictions and having to follow Government guidance, rules and instructions. There is some data that suggests that that is starting to shift society's attitudes and how people interact with Government and officialdom. There is also information that suggests that people have recently been distracted by a number of different things happening in their lives: the post-Brexit landscape, the cost of living crisis and various other things.

The survey that we talked about earlier was based on 1,213 households that agreed to answer in the last week of the census, and 35 per cent of those people said that they were too busy to do the census. That also chimes with some of the feedback that we were getting on the doorstep. People did not see the census as important enough for them to do, or they thought that they would do it later, but that later time never came. We changed the design of the census as it was running because we saw that start to happen. We ended up issuing five reminder letters to people who did not respond. Like the ONS, we were originally planning to issue a couple of reminder letters, but we added three additional rounds of reminder letters, and we had to add advertising activity that we had not planned to do and that our colleagues in the ONS did not have to run. Therefore, there was a range of things that we had to build in to deal with the fact that we were not getting the response rates that we expected.

However, I am surprised at the high proportion of people, about a third, who phoned up or contacted us to request a paper form and then did not return it. Those are people who were very proactive in requesting one but did not send it back, for whatever reason. Therefore, there are some fundamental questions that might inform not just future censuses but future engagements with the public on a range of different policy issues and how we carry those out.

Mark Ruskell

If you had to sum it up in one word, would you say that there was a sense of fatigue?

Paul Lowe

There is certainly an element of that at play here, yes.

The Convener

Mr Lowe, I thank you and your officials for attending this session.

The committee will now consider its final agenda item in private.

11:20 Meeting continued in private until 11:23.  
