The number of children being admitted to hospital for self-harm has risen three-fold over the last decade, the latest NHS figures have revealed, prompting the father of Molly Russell to call for duty of care laws that force social media companies to purge online images glorifying the act.
The latest provisional statistics show children aged 17 and under were admitted 4,455 times in 2019-20 compared to 1,420 in 2009-10.
This means health workers are now seeing an average of 12 children a day arriving at hospital for self-inflicted injuries.
Following the release of the figures, Ian Russell, the father of 14-year-old schoolgirl Molly, who took her own life after viewing self-harm and suicide content on Instagram and other apps, warned that social media sites were still sucking children down dangerous “algorithmic whirlpools” that normalised depression and suicide.
He also expressed frustration that, more than a year after Instagram pledged to remove all “graphic” self-harm content, The Telegraph was able to find images on the site of children cutting themselves and users posting messages glorifying suicide.
One post read: “Suicide is the only thing we can control in our lives” and another said: “I am going to die anyway, so why not today.”
Mr Russell said: “Today, the sort of harmful material Molly viewed can still be found on the platform [Instagram]. Young people who are struggling with their mental health are still vulnerable when online.
“The inaction of the platforms is only matched by the inaction of our politicians. We are still allowing our children onto the information superhighway and have yet to set any form of effective speed limit for their safety.
“It’s time for companies, governments and individuals to do whatever possible to make the internet a safer place and prevent more wasted young lives.”
The Government is currently drawing up duty of care legislation that will impose a legal responsibility on tech companies to protect children on their services, a measure campaigned for by The Telegraph since 2018.
Under current proposals, companies found to breach their duty of care could face fines running into the billions, criminal prosecution or having their apps barred from the UK.
A Government spokesman said: “We are developing plans to put a new duty of care on online platforms towards their users, and will introduce legislation as soon as possible.”
Bill may not come into force until 2024
Ministers have only committed to bring a bill before MPs by the end of the current parliamentary session, leading children’s charities to warn that it may not come into force until 2024.
Instagram said it had removed all the posts found by the Sunday Telegraph and was developing new technology to help better scan and find such material on its site. The company said it had removed more than 1.3 million self-harm posts in the first three months of 2020 alone.
Tara Hopkins, the Instagram Head of Policy for Europe, the Middle East and Africa, said: “We have strict rules – developed with experts, including the Samaritans – that do not allow the promotion or glorification of suicide or self-harm, or content that is graphic.
“We remove it as soon as we find it. We use technology to detect this content and direct people to organisations that can help.”
As well as calls to clean up their sites, social media companies are also facing demands to open up their secretive algorithms to health workers and academics.
Last year, the Royal College of Psychiatrists said duty of care laws should also open up tech giants’ data, as the lack of access was frustrating attempts to understand the links between social media and children’s mental health.