Theme: Law & justice | Content Type: Blog

Tech Companies are Failing to Protect Children. Academics and Online Child Activist Groups Speak Out

Joanne Stubbs and Belinda Winder


Gaelle Marcel

10 min read

We are technological teenagers when it comes to social media. Facebook became publicly available just 14 years ago, and Twitter, Snapchat and Instagram even more recently. Since then, as a society, we have been on a fun-filled spree of communicating, connecting and satiating our appetites. We have been gluttonous in our demands, wanting more, faster, and then moving on to the next thing. We have seen things we do not want to see and cannot unsee. And we have become used to them.

But this is not the darkest aspect of these incredible technological advances. Just as social media and the internet have brought new opportunities into all our lives, they have also brought perils. The crimes that are most entrepreneurial in nature are theft, fraud and sexual offending. They adapt and thrive wherever they can find new opportunities, such as in cyberspace.

Now it is time to take steps to protect children – and indeed all social media users. We need to balance individual freedoms with the protection of individuals. But who should put those limits in place? Should it be the government? Do we need new laws? And should we then expect an over-burdened and under-funded police force (with a fifth fewer officers than eight years ago) both to spend more time investigating the burgeoning problem of internet-mediated child abuse and to enforce the new laws?

The co-author of this article, Joanne Stubbs, who runs an Online Child Abuse Activist Group (OCAG) in Nottingham, has outlined below the concerns the group have with several types of social media. These concerns need to be taken seriously by the tech giants, who should work harder alongside other agencies to protect society using evidence-based policies. 

What are Online Child Abuse Activist Groups?

Online Child Abuse Activist Groups are citizens working within the law to provide information to the police when a crime is in progress. A recent estimate put the number of such groups at 180 across the UK, and more are being created regularly.

OCAGs do not work directly with the police. Instead, they present evidence gathered on a suspect to the police. This evidence is gathered by a ‘decoy’: an adult posing as a minor. Decoys use photos donated by the public and make the ‘age’ of the decoy very clear from the start of any communication.

The police run operations that are similar in some respects, but as private citizens OCAGs have more freedom to investigate an individual. They are not bound by the Regulation of Investigatory Powers Act (RIPA) 2000, which would require them to obtain a warrant for every piece of information they gather, nor are they bound by the Covert Human Intelligence Sources (CHIS) Code of Practice.

Catching people committing sexual offences online

According to Jo, in a 12-month period (April 2017 to March 2018) in Nottinghamshire, 37 cases from Online Child Abuse Activist Groups (OCAGs) were handed over to the police: 23 led to a charge, 11 are ongoing and only three resulted in No Further Action or a refused charge. Jo’s OCAG always starts online, and 100% of its cases have started on apps designed specifically for children or on Facebook.

In a 2018 report on social media and online safety, the NSPCC stated: “[of] the 2,097 offences where police recorded the method used to communicate with a child, Facebook, Snapchat or Instagram were used in 70% of cases… Facebook, Snapchat and Instagram were the … three most-recorded sites.” The same source reported that police in England and Wales recorded 1,944 incidents of sexual communication with children in the six months to September 2018. Of the 1,317 cases where a method was recorded, Instagram was used in 32%, Facebook in 23% and Snapchat in 14%. This shows a fall in the use of Facebook, and a rise in Instagram, in a relatively short space of time.

Jo has insights into the use of social media by sexual predators that provide much-needed food for thought. On Facebook, there are thousands upon thousands of closed groups, secret groups, public pages and profiles set up specifically for the exploitation of children and for bestiality. ‘Suggested friends’ encourages complete strangers to speak to other users. Additionally, there is hardcore pornography across thousands of pages, and profiles are not removed even when members of the public alert Facebook to them. Tanya Marie, who works alongside Jo and founded the group Facebook Hunters two years ago, estimates that in that time she has found over 400 such pages – and those are just the public groups.

Jo recommends that Facebook revise their current Community Standards. The company does take down the individual posts eventually, but by the time a post is removed, several more have been uploaded. Perhaps there also needs to be a cap on the number of pages one person can create and administer.

Instagram is becoming one of the sites most used by groomers (according to NSPCC figures). Anyone can request to follow another user, and unless that user has set the correct privacy settings on their account, anyone can view all the images they post. It is very common to see children with dozens of men among their followers, and when Jo comes across a suspect, there are often hundreds of real children on their account. Once children allow such a person to ‘follow’ them, that person can view anything they post. Instagram now also allows live streams. Jo recommends that alerts be set up for certain phrases, although she recognises this is difficult to do and often not fruitful. She suggests that, at a minimum, the report button should be available on live streams, so that a real person working as a moderator at Instagram is alerted.

While using Snapchat, Jo has found that children who are unaware of the danger are most likely to send their stories out to the world, where strangers can see them. Software has also been developed to capture the photos or videos shared on the app so that they are saved separately. This leaves a lot of people vulnerable; that nude photo they thought would disappear forever can be captured by viewers.

Jo finds that it is difficult to decoy child abusers on Snapchat. When a screenshot is taken of part of a conversation, photo or video, a notification is sent back to the other person. The option exists to use the software described above, but while holding several conversations across various platforms this proves difficult and prone to human error.

The police have the facility to use computers that record every interaction, which can then be used as evidence, but unfortunately neither Jo’s group nor any other OCAG has the funds to do this. Snapchat also has a very precise geographic locator, but ‘ghost mode’ has been created so that no individual user can be tracked – both a blessing and a curse from a child protection perspective. Add in the algorithm protecting the data, and downloading evidence from Snapchat is all but impossible, even for the police. This is why, in our view, it is becoming one of the apps most used by people grooming children for possible future abuse. It leaves no evidence.

Solving the problem together 

Technology itself must help to solve this crisis – and it is a crisis. Social media companies do have human moderators, around 150,000 of them, but the number of posts and uploads to social media platforms each day is far beyond their current capacity. Tech companies need to be more open, and less defensive, about the problems that social media creates, and they need to accept help from those who offer it. We need to create specialist units that combine expertise from the police, the tech companies themselves, academics and child safety experts, those involved in the ‘frontline’ fight (OCAGs), teenagers who use these platforms and, indeed, individuals who have offended using them, in order to find effective ways to better protect us all online.

We also need to consider the impact of cuts to the police force, which have hit training budgets. How are police officers meant to stay on top of cutting-edge crime without the training and system improvements needed to cope with it?

It is time for different stakeholders to join forces and to put into place sophisticated and user-informed mechanisms to protect all social media users. This problem is everyone’s problem, and it is up to us all to work together to solve it, but we need the tech companies to get the ball rolling. 

  • Joanne Stubbs

    Joanne Stubbs has been running an Online Child Abuse Activist Group for 16 months. She has referred around 20-25 people to the police, both as part of various activist groups and on her own.

  • Belinda Winder

    Belinda Winder is a Professor in Forensic Psychology and heads the Sexual Offences, Crime and Misconduct Research Unit at Nottingham Trent University.
