
Meeting date: Wednesday, May 18, 2022

Criminal Justice Committee 18 May 2022 [Draft]

Agenda: Policing and Mental Health, Online Child Abuse, Grooming and Exploitation, Subordinate Legislation, Proposed Bail and Release from Custody Bill


Online Child Abuse, Grooming and Exploitation

Our next item of business is a round-table evidence session on online child abuse, grooming and exploitation. I refer members to papers 3 and 4.

It is my pleasure to welcome Stuart Allardyce, the director of Stop It Now! Scotland; Alison Penman of Social Work Scotland; Gina Wilson from the office of the Children and Young People’s Commissioner Scotland; Assistant Chief Constable Bex Smith of Police Scotland; and Miles Bonfield from the National Crime Agency. We are also due to be joined by Joanne Smith of the NSPCC in Scotland, who will be with us soon.

I thank the witnesses for providing the committee with written evidence. If they would like to answer a question, they should catch my eye or that of Stephen Imrie, the clerk, and we will do our best to bring them into the discussion.

We have only about 80 minutes for this evidence session and, as the previous evidence session showed members, we often do not have time to cover everything that we would like to cover, so I ask that we keep the questions short and focused and the answers as brief as possible.

I will kick off with a question for Mr Allardyce. Page 4 of the Stop It Now! submission states that the issue of people having a sexual interest in children should be

“beyond one of law enforcement”.

Is the general public with you on that? What work needs to be done to persuade people of that position?

Stuart Allardyce (Stop It Now! Scotland, Lucy Faithfull Foundation)

There is good evidence for that approach. Police colleagues will present information about the number of individuals who are arrested for crimes in relation to online sexual exploitation of children and viewing indecent images of children, but those figures are just the tip of the iceberg.

We did not mention this in our written submission but, three or four years ago, there was a fairly big study from Germany about the online behaviour of around 8,000 individuals, all of whom were men. Around 2 per cent said that they had viewed child sexual exploitation material. Indeed, about 4 per cent of the overall sample, which was a normative sample, said that they had sexual fantasies or thoughts in relation to children in some capacity. Therefore, we know that there is a massive issue and that we will address only the tip of the iceberg through law enforcement—it clearly needs to be an aspect of solutions, but we need to pivot towards prevention as well.

On whether the public is with us on that, there is sometimes an assumption that organisations such as ours that work directly with people who perpetrate such offences are vilified. We have not seen as much evidence of that as you would customarily expect, so I think that there is a lot of support for prevention and the contribution that it makes to protecting children from harm.

Your organisation supports people who come to you but also people who are referred to you by the police, or the criminal justice system. Is your service ever provided as part of sentencing?

Very rarely. In fact, most of the individuals who contact the Scotland team have just been arrested and are given information about our services by Police Scotland colleagues at the time, partly because there is a significantly elevated risk of suicide among individuals who have been arrested for such offences. Most of the people who contact us have been signposted to us by police but not referred directly by them.

We do not work further downstream, because the criminal justice social work system usually picks up those individuals further to conviction. However, there is a discussion to be had about whether approaches such as diversion from prosecution could have more of a role.

Thank you. We have a lot to get through, but I draw your attention to evidence that was sent to the committee by the Internet Watch Foundation. Page 16 of its submission says:

“In 2021, we investigated more reports of suspected child sexual abuse imagery than the entire first 15 years we were in existence.”

That goes some way towards illustrating how widespread the problem is.

My next question is open to anyone, but perhaps Mr Bonfield can start. Are the resources and the investigations matching the level of criminality that we are seeing?

First, thank you for the opportunity to give evidence today. Part of my work is to command the National Crime Agency’s units that are involved in investigating child sexual exploitation and abuse online.

The NCA agrees with the estimation of threat that you outline. We are seeing a steady increase in the scale, complexity and severity of the offending online, which goes along with the growth of social media and applications for collaboration over the internet. However, that also provides us with opportunities to do more. As Stuart Allardyce said, we have greater opportunities to collaborate with industry to change the circumstances and prevent the offending in the first instance. It provides us with more opportunities to protect parents and carers and their children, and it gives us more opportunities to investigate online activity and disrupt and deter offenders. Although I see that there is an increase in the threat, there is also an increase in opportunities to do more about it.

On the resources that are applied, I refer you to an earlier answer. A choice must be made by those who are democratically elected by the public about the level of investment that is made in the response. Although any police officer will say that, if we have more resources we can do more, there is a point of limitation, when there is a decrease in the efficiency and effectiveness of the law enforcement response in relation to the scale of the issue. There is a choice to be made by democratic representatives about whether that is the right level of investment. That is as far as I would comment on that.

Good morning. From my perspective, I want to reassure the committee that online child sexual abuse is a massive priority for Police Scotland, as I think that you will see in my written submission. It features highly in our strategic assessment and it is front and centre in our strategic workforce plan and how we allocate resources. That goes to the heart of your question about whether we have the appropriate resources to meet the demand. We know that demand is increasing—you will have seen the statistics in my report. We are not there yet, but we are prioritising the issue as a real threat—it is high up in our cyberstrategy. ACC Andy Freeburn and I work closely together to ensure that we allocate adequate resources to this growing threat.

I am happy to talk a little more about demand and resource. We are starting a piece of work in public protection to look at our resource across that area, so that we can ensure that we have the right resource for the demand that we face, and that that is future proofed. As we know, online crime and cybercrime will only increase as we move further into the digital space.

We are not quite there in terms of resource, but we are definitely moving in the right direction.

Alison Penman, from the perspective of child protection, do you think that enough resource is going into investigating this kind of stuff?

The fundamental challenges around resourcing activity in this area relate to the type of approach that is needed to the work with families, which involves relationship-based practice. It takes time to build up relationships of trust with families, including the perpetrators, the victims and the family members, and that requires a significant resource that social work departments are probably not resourced for at this point.

On your original question, I would highlight the prevalence of this type of dangerous behaviour by children against other children. Obviously, that is a significant area of our work but, again, it requires time to build relationships of trust, which we need to do so that we can intervene effectively rather than tokenistically or superficially.

I have a question for Stuart Allardyce. I am interested to know how you can prevent someone from being an online child abuser. When someone is referred to you, or when you hear from someone—however it is that contact is established—what form does your intervention take? Do you give them counselling? I am interested to know what your organisation does. You mentioned that some people had been arrested. At what point does that kick in?

There are a few different strands. We have a UK helpline that is based in Epsom and is funded by the Home Office and the Ministry of Justice, and it makes referrals to the Scotland services that I manage. Last year, the helpline got 14,000 calls and contacts—including emails, live chats and so on—from about 7,000 individuals.

Were those calls and emails from people referring themselves?

The contacts were made by a wide variety of individuals with concerns around child sexual abuse. Around half of the calls were from individuals who were worried about their own sexual thoughts and feelings towards children. The majority of them were involved in online activity in some way. Many of them had been arrested already, but a significant proportion had not been arrested.

People who call us are given an anonymous and confidential space, because they are reaching out for help. They are told clearly at the start of the call that, if they identify themselves and give us any information about a crime, we will have to pass on that information, but that they do not need to identify themselves in the call. That is how we provide that anonymous advice.

It is important to make the point that we normally think about people who are committing these types of crime as being similar to contact sex offenders, but the evidence is stacking up that they are quite a different population. Often, they are quite worried about their online behaviour and are looking for help and support to stop. We provide advice that starts with what support the person is looking for to stop the behaviour at that stage.

The majority of the people with whom we work in Scotland have already been arrested. We work in that space between arrest and conviction, getting people the right kind of help at the stage that they need it. At the moment, it takes around two years for the individuals who are arrested for these crimes to end up in court.

Are you talking about a one-off phone call, or is there on-going dialogue with people who call up?

It can be a one-off call. We signpost all those individuals to online resources. We have a resource called “Get Help”, which is, in effect, a manualised treatment programme that people can do anonymously. Many individuals keep calling back. We have a call-back service that means that, although we never find out the identity of the individual, we can do telephone support with them over an extended period of time.

If your counsellors hear something alarming—I am sure that they will; it would all be alarming, in my estimation—and believe that someone is having dangerous thoughts, what do you do then, given that you allow them to remain anonymous?

At the end of every call to the helpline, we get the individual to sign off some actions in relation to safety. Clearly, we need to preserve people’s anonymity because, if we were not on the front foot in that regard, people would not call in the first place.

Fortunately, the sort of situations that you describe are few and far between. A lot of concerning information comes in to our professional team of helpline staff but, although situations arise in which somebody presents a significant risk to a child and we do not know who they are or who the child is, they are relatively rare.

11:15  

I have a supplementary question to that opening line of questioning. It is clear that you work in a very difficult area involving health, justice and prevention. To be frank, I dare say that some people would find the approach to be controversial. There is clearly a wider societal, moral and philosophical discussion about how to deal with people who have these kinds of thoughts or engage in such actions.

My understanding is that the National Crime Agency has given evidence in public that there are around 500,000 to 800,000, or possibly even 900,000, individuals who pose various degrees of risk to children. What sort of numbers do you deal with? I ask that because those numbers seem disproportionate to the number of people out there who could be helped. When I say that they could be helped, I am talking about those who have not committed a crime.

I think that you are right about that. When we talk about child sexual abuse in society, there is almost a weary inevitability about the conversation—an assumption that that sort of issue will always be with us. However, in Scotland, we have really shifted the discourse around violence, for instance. I think that there is recognition in professional populations, and among the public more generally, that we can treat violence as a public health problem—as a treatable issue. We need to think about child sexual abuse in that way as well. That would be the way to increase the number of people who contact our service.

As I said, around 7,000 people call our UK helpline every year, and much larger numbers of people use our online resources. We would push for those figures to be much higher because of the scale of the problem, as you have described it.

What is lacking, then? In your paper, you say that

“The key challenge ... is the lack of an overarching strategy to tackle online child sexual abuse”,

and that

“there is no government leadership with the issue straddling multiple government departments and Ministerial portfolios.”

In effect, what are you asking the Government to do? Would you say that the lack of take-up of your service is due to a lack of awareness among the community of those who might benefit from it or simply a fear of contacting you, because of what might happen thereafter if they pick up the phone or access a website?

I did not want to come here to have conversations about the resourcing of that. It is not just about organisations such as ours; it is about how we work in partnership with other organisations. I am sure that police colleagues can speak to that, but the deterrence campaign that we ran with Police Scotland, #GetHelpOrGetCaught, was remarkably successful. I looked at the figure just last week: in Scotland, just under 9,000 people accessed our online resources to seek help in relation to their online behaviour.

That has been driven by a Police Scotland-led campaign—let us be absolutely frank about that; it has not been driven by us—so it absolutely must be about partnerships. However, there is something about the way in which the issue sits in different silos across the Scottish Government. Indeed, so does violence, given the involvement of health, education, law enforcement and justice. All those are important in the violence debate, and the same is true with respect to the prevention of online harm.

We had the “National Action Plan on Internet Safety for Children and Young People”, which I think ended a year or two ago—people might correct me on that—as well as “Scotland’s National Action Plan to tackle Child Sexual Exploitation”. Once again, they sat in silos—they overlapped in places, but they were separate. We think about these things in different ways and in different contexts in Government.

Neither of those action plans still exists. The danger with such plans is that they sometimes become a list of things to tick off instead of a means of evidencing impact and change. I therefore say with caution that it is important to have a national action plan, but it has to sit next to a strong research and evaluation strand. Personally, I do not think that we know enough about what is effective with regard to preventative work with families and children themselves or, as Alison Penman has said, with young people who might present a risk of harm to others in online spaces. There is lots of stuff that we still need to find out about, so we should not simply get on with doing lots of activity. An action plan would be a good start, nevertheless.

I might come back with some questions later, convener.

Sure. I think that ACC Smith would like to comment.

I agree with Stuart Allardyce. It is important to understand that policing is just one aspect of the issue and that we cannot just arrest our way out of the problem. It is a much wider societal problem to which the partnership approach is key. In fact, we have already seen that to be really successful. Stuart mentioned the campaign that we have run, and we are running another one in 2022-23 that will focus on perpetrators. We rely heavily on our partners in this space, and working together is genuinely important. We have seen success in the use of joint interview models and joint training, and I am keen to push that forward in a policing sense. I just wanted to add that I am very supportive of the point that Stuart made.

Stuart Allardyce, how do you measure your service’s success rate?

We use pre and post measures both in our work with individuals in Scotland who have been arrested for online offences and in our group work, looking at mental health issues, the risks that are presented by those individuals and reducing those factors over time.

As I am sure you will appreciate, there are significant issues with how we evidence our impact with regard to reducing reoffending in this area, which I know will be a key question for the committee, but the evidence that we have so far suggests that only a minority of individuals who are arrested for online offences seem to go on and commit further offences. Indeed, in most of the international studies that have been carried out, the figure is usually below 10 per cent. Interestingly, the majority of those individuals commit the same kind of offence again. The assumption is that there will be an escalation and that they will go on to commit contact abuse; although that happens—and we need to get very good at assessing such situations to identify the minority who present really significant risks with regard to contact abuse—that is not the case for the majority. The baseline for reoffending is therefore really low.

Good morning. I want to explore any gaps that there might be in the law and what lies at the root of all this. I have to say that I found your submission quite shocking; the issue is shocking anyway, although it is perhaps not surprising or shocking to see the extent to which girls and females are the victims and men tend to be the perpetrators. That said, I was surprised to learn in your submission that the amount of

“self-generated Child Sexual Abuse Material”

has gone

“up 374% in the last two years, ... disproportionately affecting ... girls.”

We are talking about imagery that is produced on webcams by children themselves, but adults are taking advantage of it, and the child is still the victim. Can you attempt to give us any insight into why such a rise has happened over the past two years? What do you think is driving children to do this?

I will say something about that, but Miles Bonfield might be in a better position to respond.

The Internet Watch Foundation data suggests significant increases in crimes involving self-generated images by children—and, most shocking of all, by younger and younger children who are being pulled into this space.

There are two different components to that, one of which is relatively recent, which is the impact of Covid and lockdown, when more children have been at home spending more and more time online, often in unsupervised ways. We did a bit of research that looked at callers to our helpline during the first six months of the Covid lockdown. Individuals who were worried about their online behaviour talked about furlough, isolation and anxiety, and they talked about sexualising some of those stressors, spending more and more time online looking at pornography and drifting into more and more extreme material. There are a number of factors around Covid that we need to accept.

The reality is that smartphone accessibility for children and young people has increased over the past 10 years. As young people go through adolescence, they are, increasingly, expressing themselves in online spaces in terms of intimacy and relationships, and there are adults who take advantage of that, which leads to some of the exploitation. Some of it is about technology, and some is about how technology begins to impact on children’s sexual development.

I agree with Stuart Allardyce. There is a point here about the wider availability of technology for and use of technology by young people. That links to the wider societal issue of the normalisation of this behaviour, which we just talked about. It is a very concerning issue for us all.

Part of the issue is that we are getting better at identifying material. The National Crime Agency has put an awful lot of effort into working with industry to enable it to identify material for us and refer it to us in an efficient and effective way, so that we can do something about it. That has led to an increase in reports of aspects of the material.

Are there any gaps that need to be plugged that will not be addressed by the Online Safety Bill? Some social media companies such as TikTok, which is a big one for younger kids, are meant to have age restrictions, but I am fully aware that it is much harder to catch that when there is live streaming and ways that people can be ingenious around that. As a layperson, it strikes me that those companies are not doing enough, so do we need more laws? I appreciate that TikTok is not a UK-based company, so there would need to be international collaboration.

Stuart Allardyce

I can speak to the Online Safety Bill, but perhaps other witnesses want to come in.

From a National Crime Agency perspective, we do not see any legislative gaps or any issues that are not being addressed. We have put an awful lot of effort into working with Government on the Online Safety Bill, which will make some important changes in legislation that will have an impact.

We are not relying on legislation alone to change the circumstances. One of our priorities is to have industry engagement with social media companies and make it very clear to them what the threat looks like, so that we have shared awareness. We direct them on the work that they can do, so that we have a common purpose and have an effect on the offending space, to prevent opportunities for offenders and enable people to protect themselves.

We are very clear that industry can do more, and we work with industry every day to keep on with that and ensure that it does more. As for working in partnership with Police Scotland, engaging with industry at national and international levels is a job that the National Crime Agency leads, with wider law enforcement also involved.

11:30  

Miles Bonfield has highlighted that the NCA engages with the industry. That is right and proper to ensure consistency and that we go into the big tech companies in one way only.

However, there are gaps in the legislation in Scotland specifically. Page 5 of my written submission highlights some of those. For example, section 52 of the Civic Government (Scotland) Act 1982 came in before the internet. We now understand that there are definite workarounds in the legal system. We know that the abhorrent acts are prosecuted as far as they can be with the current legislation, but there are gaps. We are working with the Scottish Government on understanding what those gaps are and pushing those forward, but there are a couple of things that would be quite straightforward. For example, there is no Scottish legislation that is specific to prohibited images. That is a gap. The current criteria for an application for a risk of sexual harm order do not cover online offences. I think that it would be quite straightforward to close that gap, and we would be interested in seeing that.

I reassure the committee that a lot of work and communication on that is going on. We have a multi-agency group in which the legislation and the gaps are discussed. We are looking at how we can deal with those gaps. Ultimately, there are some quite straightforward gaps that need to be closed. The risk is that, if we do not do that, legal challenges might prevent our using the legislation that we currently use. We have not seen that, but I suspect that we might see that after a period of time. We are keen to understand what the gaps look like and how we can help to close those gaps.

I want to go back to Pauline McNeill’s original question about the significant rise in the number of young girls in particular falling victim to stuff online. I will turn first to Joanne Smith from NSPCC Scotland. From memory, your organisation has a very useful website for parents of young people who might have concerns. Will you expand a bit on the scale of the threat and what can be done to help to protect children?

Yes. I absolutely concur with everything that has already been said. Our organisation has been aware of a growing trend towards tech-enabled forms of child sexual abuse for many years. However, we saw a significant spike in referral rates when lockdown measures to prevent the spread of Covid were introduced. As Stuart Allardyce mentioned, there was a perfect storm, with children and abusers spending more time at home and online, the exponential growth in the use of smartphones and the new, more sophisticated types of technology, such as live streaming. That means that rates and forms of abuse can escalate with virality because of the ways in which groomers are able to move people from one rather open platform on to much more private and encrypted forms of communication. That is a really worrying trend.

All of that can feel overwhelming, because it feels entrenched, but it is important to say that, in many ways, online sexual abuse is entirely preventable. A lot of the rapid rise in online offending that we are seeing is the result of corporations having sidestepped their responsibilities. It is really important that they step up to the plate.

As has been said, we hear from professionals and parents who are desperate for information and tools to help them to better protect children. We are overwhelmed by the demand for that type of material. Realistically, however, the scale and pace of the development of online sexual abuse is such that that is insufficient. We need platforms to take responsibility. Just as we would expect safety measures to be implemented in children’s spaces offline, we must expect the same level of rigour online.

On gaps, we have an issue in Scotland in that, despite our having high-quality practice, pockets of expertise and brilliant work that is done by Stop It Now! Scotland and others, we do not have a co-ordinated, overarching strategy. We need that to bring together disparate strands of work so that we have a cohesive and co-ordinated programme of national activity and strategic leadership that brings together the responsibilities of all agencies that work with children, families, communities and, critically, industry to ensure that we seek to prevent harm before it arises.

The scale of the problem is such that we will not be able to arrest our way out of it. We must look at preventative measures that better protect victims and we must provide support and referral tools for prospective offenders. We must be honest about what is required of us—if we are to try to keep children safe, we must have a much more cohesive and collaborative national strategy.

Would Gina Wilson like to say anything about the rise in the number of children becoming victims and what can be done about that? You might also like to comment on Joanne Smith’s point about taking a co-ordinated approach and what that would look like.

Gina Wilson (Office of the Children and Young People’s Commissioner Scotland)

I wholly support the comments from NSPCC Scotland. The issue speaks to the fact that law enforcement alone will not solve the huge increase in the number of self-generated children’s images. That is of huge concern for a number of reasons. We have concerns about the approach that is being taken—the ways in which such situations are dealt with are inconsistent and non-child centred. We would always welcome sensitive and inclusive approaches to awareness raising among children and young people, with a focus on healthy and safe relationships, rather than punitive and criminal justice approaches. Law enforcement alone will not resolve the issues in this area.

I absolutely concur with NSPCC Scotland about the need to shift expectations to digital service providers, and to shift resource towards education and technical solutions. Digital service providers must be held accountable and liable for the welfare of children and young people who use their services. The digital world was not designed for children and they are at significant risk of harm in accessing it.

It is important to consider all children’s rights in the round. Although they have the right to be protected from harm, they also have the right to act autonomously and access and make use of the online world. Therefore, it is absolutely incumbent on service providers to ensure that they provide safe environments.

In terms of gaps, we have spoken a little bit about the Online Safety Bill. Ofcom, as the regulator, will be tasked with producing codes of practice for TikTok and other services to follow. We would want to see children and young people, and the organisations that represent them, involved in producing those codes of practice. The online world is a hugely fast-paced and changing environment. We need to understand directly from children and young people themselves about how we can help to protect them in the online world. We really want them to be part of that process.

I return to Stuart Allardyce’s original point. In relation to children who display harmful sexual behaviour online, any strategy must take cognisance of the different pathways by which children come to do that. That must take into account the context of online relationships with peers, children’s normative expectations and their becoming desensitised to what is and what is not harmful behaviour. When considering interventions, we must take account of a child’s development, because children’s brains are still developing, as well as take account of trauma. Therefore, we would need to approach the matter from a trauma-aware and trauma-informed perspective. At the same time, we must remember that a number of those children will also be experiencing undiagnosed speech, language and communication difficulties, which will significantly impact on how they view peer relationships and their understanding of what is and what is not harmful behaviour.

We have a significant number of children and young people who become involved in harmful behaviour online but do not realise that it is harmful. Therefore, the issue is how we respond by taking a preventative approach.

Thank you very much. I call Katy Clark, who will be followed by Fulton MacGregor.

I was going to ask about organised crime—perhaps we will come on to that later. First, though, I would be interested to hear from those involved in this area how they think perpetrators are created. We have heard that there are a lot of parallels between perpetrators and those who have experienced violence, and there has been a lot of work on violence. We know that experiencing poverty, trauma and violence leads people to be more violent when they grow older. Are there any themes in relation to why people become perpetrators? Is it because they have been victims themselves? That might be one factor, but there might be others. We need to be able to understand those in order to frame a co-ordinated strategy.

Do any of the witnesses who have direct experience have any evidence that might be of use to the committee on that? Perhaps it would be best to start with Stuart Allardyce.

I am happy to start the ball rolling on that. The research into why people commit online harms is contested and there are lots of different arguments. Some academics take the position that it is always about people having a significant paedophilic profile, which might start in adolescence and continue across their life course. We push back against that because of what we see in our work. We certainly work with individuals who would describe themselves in that way, but we need better and more nuanced descriptions that are congruent with what we see in practice.

You mentioned people having experienced trauma. That factor is significantly overrepresented in adults who commit contact sexual offences and particularly adolescents who display contact harmful sexual behaviour. It is perhaps not as overrepresented in the population of online offenders, but it is there. We recently did a study of 800 people with whom we had worked at Stop It Now! Scotland over the past 10 years. Of those, 12 per cent identified themselves as having been sexually abused in childhood, which is roughly three times what we would expect in the Scottish population.

Adverse childhood experiences are certainly a factor for some individuals, but the key factors are to do with the way that the internet provides opportunities for people to do things anonymously online—sometimes things of a sexual nature. The story that we hear day in, day out at Stop It Now! Scotland is of adults who describe consuming huge amounts of legal online pornography, becoming desensitised to it over time and looking for more extreme and transgressive material. That is not to say that those people do not have a capacity to be sexually interested in children, because they do. However, they did not set out on a pathway looking for that material but drifted towards it over time. That is why there are many opportunities for deterrence and disruption.

I also point out that, in at least half of the 800 men—they are almost all men—with whom we worked, we saw significant low-level mental health issues, such as depression and anxiety, that predated their offending behaviour. The collision of online behaviour and low-level mental health issues is an explanation that is often more congruent with what we see than arguments about paedophilia.

That is interesting. It is a massive topic that we do not have the opportunity to explore properly now.

We have been discussing organised crime. Obviously, there are links between organised crime and some of the other issues that we are discussing. Perhaps Bex Smith would be a good person to talk a little bit about that.

I think that you have heard evidence recently about the way that we tackle organised crime in Police Scotland—we have a separate command that deals with it. There are certain ways that we would tackle these cases—we would do it both covertly and overtly. We would consider the risk posed to individuals and, ultimately, if there was a safeguarding risk to children and there was an organised element to it, we would absolutely deal with that. We would look for that and prioritise it over other areas of organised criminality.

There has been a real cultural shift in policing over the past 10 years, as we would previously have focused much of our work on more traditional organised criminals such as those involved in drugs or firearms.

However, I can say absolutely hand on heart that, if we were to face organised criminality in an online child sexual abuse case, we would deal with the safeguarding issues as a priority.

11:45  

It is important to bring Miles Bonfield in here. Police Scotland would utilise some of the NCA’s unique capabilities. We are well linked in to the national and international aspects of this because, as you will know, a great deal of offending occurs overseas—I think that there is information on that in the briefing. We would look to work in partnership on that, and we would definitely tackle organised criminality in an online child sexual abuse context.

Miles, are you able to talk about how big a factor organised crime is?

Organised crime in relation to child sexual abuse sits at the looser and more disorganised end of offending rather than the hierarchical and highly structured offending you might find in a drugs trafficking network or firearms supply network. At the higher end of offending, we see loose social networks of offenders working together in their offending behaviour, sharing things such as tradecraft and how to protect themselves from law enforcement interest, how to distance themselves and how to show out law enforcement activity. We are seeing more and more of that more highly sophisticated, higher-end offending and use of that tradecraft. However, there is an opportunity for us to use some of the high-end national capabilities and our national security capabilities in order to disrupt that offending and pursue those offenders.

Over recent years, we have formed really strong relationships with Government Communications Headquarters and our intelligence community partners to disrupt that activity. We are now using techniques that we would use to tackle serious organised crime offences, such as firearms and drugs trafficking, in child sexual abuse cases. That goes back to Bex Smith’s point about Police Scotland, which also goes for the National Crime Agency and law enforcement generally—UK plc. Child sexual abuse is one of the highest priorities. Therefore, if there is any opportunity to use any capability to disrupt that offending, it is applied.

Gina Wilson, do you want to come in on that? From your perspective, is that a major issue on your radar?

The only thing that I will add on that point is that we have been aware and are concerned that some asylum-seeking children have been prosecuted when involved in criminal behaviour. In some instances, they have been detained in adult prisons, pending trial. Therefore, there is a connection to victims of trafficking and online grooming in the way that we are responding.

I thank the witnesses for coming to speak to us about this very difficult subject. It is important that the committee hears about it. I should have said at the outset that I chair the adult survivors of childhood sexual abuse cross-party group in this Parliament, and Collette Stevenson is also a member of that group. The group has real concerns about some of the stuff that we have been hearing about today. I want to ask about the increase in abuse, particularly during the Covid pandemic, because almost every witness has talked about it. I think that I know the answer to this, but it would be good to get it on the record. Are we talking about a real increase overall—I think that we are—or are we talking about better detection methods, particularly on the part of the police? The police have attended several times to talk to the group about how, over the past few years, they have been able to deploy technology that they would not previously have thought it possible to deploy. Does anyone want to comment on the increase and the scale of such abuse? Are we uncovering it more or has there been an actual increase because of Covid and other factors?

Bex Smith is keen to answer that.

To be honest, from a Police Scotland perspective, it is a combination of those things. We are absolutely seeing an increase. I think that the issue was highlighted to parents during Covid. With children at home and under their eyes a lot more, doing home schooling and so on, parents became a lot more aware of what they were looking at on the internet, and there was an escalation in the number of referrals to Police Scotland from parents and individuals who were concerned about what their children were seeing or who had found certain images. That has been a factor.

Our detection methods are also better. I mentioned the culture in policing a little earlier, and we are more alive to the fact that we can use traditional techniques of law enforcement more successfully in the safeguarding arena. That is what we are doing, and we are really pushing the boundaries to try to understand and detect that kind of offending in relation to organised criminal activity. I genuinely think that, when we look across the piece, from my perspective there is a real increase in demand but also an increase in reporting and in understanding this horrendous sort of offending. People are a lot more aware and able to come forward and report incidents.

We have also opened up our reporting channels, and people are able to report in different ways and are a lot happier to come forward and discuss these things in a way that they previously were not. Society is a lot more able to talk about these issues. I would suggest that it is absolutely a combination of all those factors, but I am sure that other colleagues will have a view on the matter.

It is helpful to get that on the record, because it allows us to clearly say that we have evidence of an overall increase. Katy Clark and Collette Stevenson will agree that that is what the agencies that are represented on the cross-party group that I chair are reporting. They feel that we are on the precipice of another pandemic in the coming period.

My substantive question is about young people’s use of the internet and what we can do to increase safety in educational terms. I know that we have talked a wee bit about that already, but I would like to bring in some of the witnesses at the top half of the table, because the discussion has probably been more focused on those sitting at the bottom half. My question, which is for Alison Penman, is: what more can we do to make young people safe? I have three young children, but it is my eight-year-old who probably falls into the category that we are discussing. She has asked me several times for a TikTok account. There is absolutely no chance that she is getting it—I do not have an account myself; in fact, I do not understand it—but, to be realistic, I will not be able to say no for ever, whether it be for TikTok or whatever replaces it. What can we do to educate our children about this? To be frank—and I am probably not the only parent who will say this—I think that my eight-year-old is more tech savvy than I am. That worries and concerns me as a parent; it worries and concerns my peers and friends whom I speak to; and it is a concern for my constituents. Do you have any advice in that respect?

It comes back to having an educative programme and recognising the role of schools in delivering this. I do not want to sound patronising—I am quite sure that everyone is well aware of this—but we need to address the culture of internet use. We also need to support our education staff and think about how we build up their resilience to deliver some of this work. I come from rural Dumfries and Galloway, where we do not have as much access to certain third sector providers as some of our other colleagues. As a result, a lot of this work will come directly from schools and what they can provide through the curriculum.

We also need to think about how the issue of vicarious trauma might affect the workforce. Several people have already said that this subject is horrible and not easy to talk about, but we should put ourselves in the position of a class teacher who has to have these conversations and might have to recognise what they might be seeing, even though they do not want to believe it. The child in question might come from a nice family, say, so how do you have those conversations with the parents? That is very difficult, and we need to find ways of not just supporting the education workforce in having an open mind to recognise what is going on and to respond appropriately but supporting them thereafter.

My other point is that we need to think about how we take a strategic approach to support and recovery. What can we put in place for families and for individual children who are victims, including those who are behaving harmfully? How can we ensure that those children have access to support and recovery at the time when they need it in order to prevent their own behaviour from escalating?

Stuart Allardyce spoke in his paper about the devastation that such behaviour can cause across families, and for family members. His organisation provides support to family members as well. We need to think about how we continue that work, and how we support schools and youth workers in a way that not only opens up the conversation but opens people’s minds and allows them to think the unthinkable. We need to ensure that there is an infrastructure in place to support them to do that, because education will be the front-line response in many of those situations.

Thank you for that. The whole subject of children who display harmful behaviours is such an interesting one. It is an area in which society as a whole recognises that there is a victim and a perpetrator wrapped up in a serious situation. This comment would usually be for the committee’s discussion in private, but I want to put out there to the other committee members that I think that we would find that area of great interest if we were to take evidence on it.

My substantive question is about helping families to cope with the new age that we are living in. We are in it, and the internet is going to be here forever. Gina, are you able to talk about what you are doing on that?

Yes. Peer education is going to be hugely important to us in that area. One of our young advisers put it brilliantly—she told us:

“Adults have a lot of opinions about how the online world affects young people’s lives. But so do young people themselves, and it’s vital that they get their say.”

Part of the issue that Fulton MacGregor has addressed, which many of us will recognise, is that young people—children—are, in some cases, far more advanced than their parents in their knowledge, understanding and use of the internet, and they are able to do things that their family around them does not understand. Parents are not seeing everything that is happening or that young people are involved in.

It is therefore important that children and young people are involved in developing peer education programmes, and in helping adults to understand how they are using the internet and what needs to happen to keep them safe. At present, they are largely absent from those decision-making processes at a domestic level in the UK.

Interestingly, last year, the United Nations, in producing its new general comment on children’s rights in the digital environment, worked with hundreds of young people around the world to create international standards on what those rights should look like. Children of all ages were involved in that process, and they came up with fantastic ideas and suggestions about what they need Governments to do to keep them safe. We need the same kind of involvement in peer education programmes at a local level. I absolutely agree that education is going to be the front-line response to help children, parents and families—everybody—to understand how the internet is being used and how to keep children safe within it.

I know that we are short of time, but I think that Joanne Smith wants to come in.

Yes.

I completely agree with what has been said. The NSPCC has worked in collaboration with a range of tech companies to provide population-level parenting programmes in order to raise awareness and provide tools to help people navigate the online world. However, we know that that is helpful only where children have a responsible adult who is proactively seeking that information, so the importance of peer support cannot be underestimated.

The NSPCC built an innovative partnership with Dundee City Council called “Oor Fierce Girls”. It involved a group of self-identifying young women who had experienced peer-on-peer sexual abuse. They came together and created a movement for change that was really about recognising the discomfort that some professionals feel about having conversations in schools around harmful sexual behaviour and peer-on-peer sexual abuse. The group tried to facilitate some of those conversations, led by the young people themselves. That approach has been hugely successful in Dundee, and we are seeking to roll it out further; the Scottish Government is supporting that work.

More tools and grass-roots forums of that type locally could make a massive difference in helping children to feel able and supported to be better protected online.

Thank you for that. With regard to the work that you have described, I think that I speak for all members in saying that I would be interested to hear more about that as it comes in to other local areas.

We have about 20 minutes left. I will bring in Collette Stevenson, followed by Audrey Nicoll, who is online.

12:00  

Do we have a consistent and easily understood definition of what constitutes online sexual abuse and exploitation? Stuart Allardyce, I watched the video on your website. Is that used by multiple agencies? Is there a consistent approach or are we muddying the waters with what we are doing?

The question of definitions is interesting, and I would be particularly interested in Alison Penman’s view on it. We have a very broad definition of online abuse in the national child protection guidance, and that then needs to be linked to the definition of child sexual abuse in the guidance. The definition of child sexual abuse is very much about contact behaviour, which raises the question of whether the viewing and production of child sexual exploitation material—indecent images of children—is encompassed by the definition in the national child protection guidance.

Having said that, I am not aware of any operational issues that come up in relation to that. Speaking as a social worker, I think that practitioners in the field have a pretty good rule of thumb about what is abuse in the area, so there does not need to be much tightening of definitions.

Are we sending the right message here? It is a bit like buying nappies—I buy the nappies, but it is my child who uses them. Is the message that is being sent from parents to children consistent? Should it be different? Are we hitting the right spot?

Stuart Allardyce

I am sorry; could you clarify that?

For instance, the video on your website—the one in which the door is lying open and the girl is upstairs in her bedroom on her iPad—

Stuart Allardyce

That is an Internet Watch Foundation video.

Yes.

It is not a Lucy Faithfull Foundation video.

That is a good point. There is a risk in trying to motivate parents on safety by ramping up a discourse around fear. It is clear that that can be effective, and the Internet Watch Foundation video that we are talking about is a good example of that, but I wonder whether we need to be a bit more savvy than that.

That connects back to Fulton MacGregor’s question. In the conversations that we, as parents, need to have with our children about online safety, we need to, as a starting point, show an interest in children’s online lives. I ask my kids how their day was at school every day when they come home, although I have to say that they do not tell me very much when I ask them that. We know that our kids spend an incredible amount of time online. Do we know who they are spending time with? Do we sit down and play games with them? Are we curious about their online lives?

Unfortunately, the discourse around online safety has sometimes been defined by people with tech backgrounds. There is a discourse around how we make sure that we have the right restrictions on devices, which is important, but, picking up on Alison Penman’s point from the start of the evidence session, I think that the answers are, in part, relational. They are about how we make sure that parents actively think about gatekeeping, supervising and monitoring young people’s online worlds, as they do with their offline worlds.

I absolutely agree with Stuart Allardyce. Further to that, as Joanne Smith pointed out earlier, not all of our children have a reliable or trusted adult caring for them who will take that approach. In those circumstances, we look for help from youth workers and educational staff, who also need to apply that relationship-based approach.

On the issue of definitions, there are nuances, but I agree with Stuart Allardyce that they are broadly the same. Perhaps we are missing a trick by concentrating on the legislative definition of what would constitute a crime, rather than on the impact on children. We might need to come back to that when we think about definitions. Perhaps Gina Wilson would agree that children, rather than adults, might be the best people to tell us what a working definition of that would be.

As adults, we probably understand what we mean by online exploitation and abuse. Stuart Allardyce highlighted earlier that there are many different strands to the issue: child sexual exploitation, child sexual abuse, child criminal exploitation and online exploitation. How do we bring that all together in a way that makes sense? If we are going to start this discourse with children in relation to peer support and peer education, we need to know what that means to them and how they can help us to make sense of it in a way that allows us to have meaningful conversations.

Bex Smith, could you respond to the question, too?

I was listening intently to the discussion, because I am quite interested in the area.

With regard to my officers and staff dealing with children, looking at offences and working out which parts of the law fit with what is before us, I think that I can say, hand on heart, that that bit comes later.

I agree with Stuart Allardyce that, sometimes, the definition does not matter. If you have a young person or child in front of you and there has been a report of some sort of abuse and you know that something is not right, talking to that child and listening to their experience and their journey will enable you to understand what has happened, and you can use the legislation further down the line to understand what that looks like in a criminal context. However, the most important thing is listening to the child and making sure that they are safe. In the past few years, policing has changed quite a lot in that regard. Previously, our focus would have been much more on a criminal justice outcome, but I can absolutely say that, now, the voice of the child, the experience of the child and the safety of the child are key. If the process does not result in a criminal justice outcome, because it does not quite fit with the legislative definition, we would still view it as a success as long as the child is protected. That is a real culture change in law enforcement.

In a long-winded way, I am saying that I do not think that the definition is important. There are gaps and loopholes in the current definition, and they could be closed in order to make things easier, but most professionals take an approach that involves listening to and understanding the child, and the legislative side comes later on.

I would like to go back to the discussion at the start, when the convener picked up on the issue of resources. The committee considered that issue previously during a session with Police Scotland and the NCA—Miles Bonfield, you were involved in that—and we also considered it in our pre-budget scrutiny.

I recognise that part of the overall response to child sexual exploitation online involves enforcement, and that we need to have a skilled body that can undertake that investigative role, given the international and underground dimensions of the issue, but I am still not clear what the committee and the Scottish Government need to be thinking about in terms of resources. On recruitment, what skills do we need to bring in so that we can fill the skills gap and ensure that we have an adequate investigative capability? How do we make Police Scotland an employer of choice—rather than, say, Google—for the people with the skills that we need in the workplace?

Miles Bonfield, could you respond to that first?

We should be clear that our assessment is that the threat, complexity and severity of offending continue to grow. The challenge is really out there for us.

I am sure that Bex Smith will agree that it is really important for the National Crime Agency and Police Scotland to have good lines of demarcation around what the NCA can and should do, and what Police Scotland can and should do, so that we work efficiently and effectively as a law enforcement system to protect the people of Scotland.

Our agreement with Police Scotland is very clear that, in relation to what Police Scotland wants us to do and what we will do, we want to do those things only once. We want to have only one international liaison network and one set of strategic relationships with international partners in law enforcement. We want to have only one set of national security capabilities—technical things for doing stuff on the internet—and only one strategic assessment on which we work together. We are therefore very clear that we have a delineation of capabilities and capacity to do that, and Police Scotland has a direct call into that capability.

On the skills and capabilities that are required to do that, as a public service, we rely on the mission and the vocational pull of protecting the public in this space. That works very well for us. I will be absolutely transparent and frank about the difficulty of retaining colleagues, particularly colleagues who have social work experience, because of their retention allowances and the comparative pay.

However, we are keen to attract and retain colleagues with the right skills by giving them the opportunity to do things within the NCA and law enforcement that they cannot do elsewhere and by reinforcing the importance of the mission and how important it is to protect the public, so that it is clear that our investigative doctrine, particularly in this area, starts with protecting the child. The first thing that we want to do is to get to a position in which we have actionable intelligence in order to protect the child; we want to have that child-focused element to our work. It is really important to us that we focus on the mission, that we make that clear to our people and that we treat our people well by looking after their wellbeing. Bex Smith will reflect that in the work of Police Scotland.

It is a really interesting question from Police Scotland’s perspective. As the committee will understand, law enforcement has always struggled with recruiting experts in the digital space because of the pay structures within which we operate. Under the cyber strategy, Police Scotland has looked at whether we can bring people in who have the skillsets that we need and have them for a period of time while we train and invest in them, with the knowledge that they might leave the organisation to work in different places. We have to be realistic. We will not be able to retain talent in the digital space for more than a couple of years. People will go off to earn more money in different jobs. They will move around; we know that from the patterns of young people and the way in which they work these days. Policing is no longer a 30-year career, and that absolutely fits with cyber.

We are looking at utilising young people. We are looking at going to universities and using academia to bring people in on short-term contracts so that they can focus on specific pieces of work and we can use their talent and skills in cyber, although we understand that they will walk out the door and that we will have to bring in new people. It is a different way of recruiting and retaining staff in that area for us, but we are alive to that and we understand it.

We are also looking at using ethical hackers—people who have a strong moral sense of purpose about the issue. Miles Bonfield hit the nail on the head when he said that a lot of people work with us for a period of time because they want to make a difference. They want to feel that they are getting out of bed to do something that really means something to society. Ultimately, we capitalise on that. We bring people in and offer them different types of training. We can show them different skills that they will not get in the private sector, especially in working with the NCA. That is really important for us.

12:15  

That is the path that we are going down. It is a long road, and it will take a while before we are able to say that we are really happy with the number of people we have working in the area, but we are definitely on the right track, and I think that that will only improve as we move forward.

That is a really good question, and we are definitely considering it.

That is really helpful.

I want to ask Bex Smith a quick follow-up question on the welfare of not just officers but staff who are involved in investigations in often complex and quite harrowing inquiries. Just before this round-table discussion, we discussed policing and mental health in our first round-table discussion of the day. What provision are you able to make, or what provision do you have in place, to ensure that the welfare of officers and staff who are involved in investigating cases of child sexual exploitation is monitored and supported?

The wellbeing of staff generally is a massive issue for me and for Police Scotland. I have worked in the child protection arena for a number of years, and I know how difficult it sometimes is for people to switch off when they get home and the impact that that can have on friends and family. It can be really tough, especially when the person is trying to make a difference but, when they look at the volume of work that we deal with, is not sure that they are doing so.

Police Scotland and I have made a real commitment on that. For example, with the public protection review, we are coming up with a completely different way of looking at wellbeing across the department through a new strategic plan. Underneath that, we are looking at how we can support officers and staff in each area. We have psychological assessments and the TRiM process, which members are probably aware of—I think that ACC Hawkins gave evidence on that earlier.

It is also about looking at officers’ demand and their workload in relation to what we are currently asking them to do. The public protection review has a big strand on that. I want to be in a position in which I can say, hand on heart, that I have the right officers with the right skills and the right workload demand on them so that they are not under such significant pressure that they feel that working in the area could have consequences for their mental health and how they feel at home.

All that work is linked together under me by a strategic board so that I can understand the workloads, the pressure and the psychological support that we are putting in place. We are not there yet, but we are getting there. It is work in progress, and I am keen to push it forward.

I can reassure members that that is a real priority for me and that I am definitely looking at it. That is why it is a really important strand in our public protection review.

We have time for a brief question from Jamie Greene.

Okay. I had lots of questions, but everyone has used up my time.

I will ask a slightly left-field question. Has there been a rise in vigilante behaviour from members of the public to try to—through online or physical approaches—capture, tackle or deal with predators, for want of a better word? Has there been a rise in people self-policing, in effect? If so, what has been done to tackle or prevent such activity?

I do not have the statistics for that specific crime type to be able to sit here and tell members whether there has been an increase in that behaviour. My sense is that there probably has been. However, I can get that information to the committee, if it is interested in that.

We are absolutely alive to the fact that vigilante groups are operating, and I know that there is some on-going covert work in that space to understand what that looks like. It is important that we understand that, so that we can protect people.

Mistaken identity can sometimes be a real problem in that area, and we do not want to end up in a situation in which a member of the public gets harmed because they have been mistaken for someone who has a sexual interest in children. A few weeks ago, we faced a situation like that, and we reacted quickly by protecting the person concerned and putting out strong messaging to say, “This is not right—this person’s not done what you think they’ve done.” In order to provide reassurance to the public, we will absolutely do that.

Work is being done in the cyber area to look at the nature and scale of that problem in Scotland. I do not have information on that to hand, but I can definitely get that for you. We are aware of the issue and will continue to look at it.

Thank you. I realise that that is a slightly different area of questioning, but I wanted to raise the issue.

I have a final question for the NCA, which is about the complexity of the enforcement landscape. If an image is discovered on a site or through an app, whether we are talking about mobile or fixed-line internet service provider access, it is often not clear where responsibility lies with regard to escalation. Does it lie with the website operator or with the internet service provider? Is the process governed by Ofcom, the Internet Watch Foundation, ministers, the police or the NCA? That lack of clarity can be such that no action is taken. It is not always clear to the consumer how to escalate such a matter, other than by immediately reporting it in the first instance. If no action is taken thereafter, the path to escalation, whereby the ISP or the website can be held to account, is not obvious.

I appreciate that the issue crosses a range of policing and devolved and reserved matters, but could the pathway be tidied up a little more so that people know exactly who does what, who regulates what and what can and will be done if no one else takes action?

With an eye on the clock, I will keep my response, on what is a very complex cross-jurisdictional issue, very simple by just agreeing that we could do more in that space. I said earlier that we are clear in our belief that industry can do more.

We have an opportunity to see things very simply through the culture change across policing that Bex Smith referred to, which has been brought about by working together across the entire system. We are very clear that the key issue is protecting the child and putting their interests first. That makes things very simple for us. It does not really matter where an image is, where it has been reported or how it has been reported. What we look at is the severity, complexity and scale of the offending, and who, therefore, is best placed to protect the child and take action. From that point of view, the issue is quite clear for us.

The increase in the number of reports is partly a result of industry and law enforcement—the NCA and Police Scotland—working together more efficiently.

Thank you. Joanne Smith would like to come in.

Jamie Greene is absolutely right to say that there is a huge grey area here, and we need things to be tightened up. In our asks of the UK Government in relation to the Online Safety Bill, we are calling for amendments to be made to the bill to ensure that senior managers in the in-scope services hold liability for failures on the part of the companies concerned. Too often, there is sidestepping of responsibility. There needs to be a clear line of responsibility, and the Online Safety Bill might provide the vehicle for that.

Alongside that, there needs to be much more emphasis on the prevention of harm. There needs to be a concerted and coherent bringing together of all the agencies in Scotland to make sure that we identify risk early and prevent unnecessary harm.

Thank you very much. Unfortunately, we are out of time, but it is worth putting on the record that we expect to look at legislative consent issues in relation to the Online Safety Bill in the middle of next month. I thank everyone for their evidence. If there is anything that you feel that you did not touch on or that you would like to expand on in any way, please do so in writing to the committee.

12:24 Meeting suspended.  

12:33 On resuming—