“Parents think a child is safe because when they’re online they’re quiet, but they don’t know what they’re watching,” says Nic Wetton, the head teacher of JH Godwin Primary School in Chester.
She warns their silence is often misleading. “Children can be traumatised by horrific videos they see online,” says the head teacher who has 180 pupils aged from four-and-a-half to eleven in her care.
Ms Wetton says she sees children as young as six playing 12-rated computer games online. “We’ve had cases of children needing medication to sleep. This is immensely worrying.”
Some children coming into school are inattentive in class because they have been up all night playing on devices such as tablets or phones. One recent craze was to see who in a WhatsApp group could stay up the longest – the winner sent a message at 04:00.
As well as watching inappropriate content online or staying up too late, children who are online unsupervised can be vulnerable to paedophiles.
These issues are familiar to Rachel O’Connell. She has investigated online child abuse, working on statistical techniques to identify abusers.
In the course of her research she went online posing as an eight-year-old child who had not made friends at school. Her understanding of the mindset of predators is extensive and chilling. Friendless children, for example, are often a target: “They look for that,” she explains.
Ms O’Connell visits schools and finds many parents have no idea which apps their children can access. “Putting naked selfies online seems to be a rite of passage now,” she says. “Parents feel they don’t know how to ‘digitally’ parent, they can feel helpless. We need oversight.”
One significant problem is that children can be targeted while browsing sites that are theoretically off limits to young people. Preventing children from accessing those sites in the first place would therefore help tackle the problem.
The business founded by Ms O’Connell, TrustElevate, is based on the principle of Zero Data – establishing whether a child should be allowed to log on to a service, but without giving away any personal details about that child.
Ms O’Connell has been trialling Zero Data techniques with the mobile phone operator EE. She wants to create a family access app that will screen users for their age and seek parental approval.
TrustElevate software generates a token containing just the child’s age range and no personal information. This allows a service provider to vet a potential new user.
The service provider can block access if the details don’t tally with the permissions held on the system, but the token cannot be exploited to push other services or products to the child.
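In outline, that kind of age-range token could look something like the minimal sketch below. This is purely illustrative – the function names, the signing scheme and the age bands are assumptions for the sake of the example, not TrustElevate’s actual implementation. The key point it demonstrates is that the token carries only an age band, so a service can grant or block access without ever learning who the child is.

```python
import base64
import hashlib
import hmac
import json

# Illustrative shared key held by the age-verification service (assumption).
SECRET = b"shared-verification-key"

def issue_token(age_band: str) -> str:
    """Issue a signed token carrying only an age band, e.g. '13-15'.

    No name, no date of birth, no identifier - just the band.
    """
    payload = json.dumps({"age_band": age_band}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def check_access(token: str, min_age: int) -> bool:
    """Service-provider side: verify the signature, then compare the age band
    against the service's minimum age. Returns False (block) on any mismatch."""
    try:
        encoded, sig = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(encoded.encode())
    except (ValueError, Exception):
        return False  # malformed token
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered token
    lower_bound = int(json.loads(payload)["age_band"].split("-")[0])
    return lower_bound >= min_age

token = issue_token("13-15")
print(check_access(token, 13))  # a 13-plus service admits the user
print(check_access(token, 18))  # an 18-plus service blocks the user
```

Because the payload contains nothing but the age band, a service that receives the token learns only “old enough / not old enough” – there is no profile data it could reuse for marketing or recommendation.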
Technical tools like these help, but schools are fighting back as well.
At JH Godwin School, Ms Wetton runs online safety workshops, where parents are invited to bring a laptop along so they can download safety apps and parental controls.
To her frustration, engagement from parents is not a given. She has arranged workshops where just one parent turned up out of 150 who have children at the school.
Ms Wetton has even tried a tactical approach, scheduling online safety talks ahead of popular events such as the Christmas or Easter bingo sessions. However, she has been heckled by parents who felt they shouldn’t be lectured on their evening out.
So, the school is left to come up with practical measures to shield its children from malicious online contact.
For example, she suggests not wearing a school-branded jumper while on TikTok. “If children do that, then anyone watching knows where they are going to be at 08:00 and 16:00.”
She believes online safety apps should be frontloaded on to any device a child might use.
“Surely, that’s better than waiting for a mental health pandemic in the very young? Plus, computer games can be addictive. If we don’t protect them, we get exhausted children coming here, and it’s like trying to teach an empty vessel that just won’t fill.”
She wants to see tech companies made to feel responsible for safeguarding, through measures like age verification software.
And TrustElevate’s Ms O’Connell says the government should be doing a lot more to regulate children’s access to games and websites.
“There’s no oversight into that at the moment, no oversight into the impact of it.”
A UK government Online Harms White Paper published in 2019 reported that 12-15 year olds spend over 20 hours a week online. And the regulator Ofcom states that 79% of that group had experienced at least one potentially harmful experience online in the previous year.
The Online Safety Bill, currently before Parliament, will introduce a duty to protect children from harmful or inappropriate material.
The Bill does not stipulate which technology tools should be used to do this, but Ofcom may respond to failures to protect children by recommending the use of age verification systems.
Speaking to BBC News, Chris Philp, Minister for Tech and the Digital Economy, lays out what he believes will be a much stricter operating environment for online platforms in the future.
“If platforms want children to use their services, they will need to protect them from accessing content that is harmful or inappropriate. If their services are meant for adults, they will need to prevent underage access.”
He insists that schools and parents wrestling with online dangers will be assisted by rigorous government measures. “Those who fail to comply will face massive fines and risk their services being blocked from access in the UK.”
At JH Godwin School tougher protection measures would be warmly welcomed. Ms Wetton describes a gulf between big tech’s positive presentation of its role in society and the unintended consequences in the real world.
“Live streaming services are supposed to bring ‘like-minded people’ together, but in reality it means predators using search terms such as ‘girls dancing’.”
She knows the techniques paedophiles use, such as matching their “pace” to their intended victims. “These people are patient and work on a child, so we must open their eyes [to potential dangers].”