Social media companies face a ‘daunting task’ to remove content promoting suicide after being warned they could be banned by the Government if they fail to take action.
Health Secretary Matt Hancock has written to a number of internet giants, telling them they have a duty to ‘purge’ the web of the type of content linked to anxiety, depression and self-harm.
It comes after the father of 14-year-old Molly Russell said the algorithms used by Instagram enabled her to view more harmful content, possibly contributing to her death.
Mr Hancock told the Andrew Marr show that the Government ‘can and must’ legislate against social media companies if they do not do enough to remove harmful content, which could mean imposing extra taxes or banning them.
Social media expert Stephen Parker says the Government is right to put pressure on the internet giants – but warns they face a huge job to stamp out harmful content.
Mr Parker, CEO of Biddulph-based live chat specialist Parkers Software, said: “Policing social media has been a pressing technical challenge for over a decade – and one that has proved difficult even for the biggest tech giants, such as Facebook.
“Unfortunately, a comprehensive ‘purge’ of harmful social media content is easier said than done. The scale of such a task is daunting. Every day, users collectively post 500 million tweets, 95 million photos and videos on Instagram and some 350 million photos on Facebook, and send three billion snaps – while 300 hours of video are uploaded to YouTube every single minute. This should give some idea of the enormity of the challenge.
“As clever as our technology has become, there is simply no single, catch-all, magical algorithm that can recognise and delete every single piece of harmful content the moment it appears.
“Artificial intelligence is an invaluable tool in the battle against dangerous content, as is traditional, stalwart human policing. But it’s still not enough to stamp out every harmful post, on every channel, on the spot.
“However, social media firms have vast resources at their disposal, as well as some of the brightest brains in the technical world. So, we can reasonably expect social media ‘purging’ to get smarter and quicker as these tech titans continue to apply themselves to the problem.
“Tragedy after tragedy has proved that lives are at stake when it comes to harmful social media content, and lawmakers are right to keep the pressure up on firms. We know that automatic social media policing won’t be perfect, and a far-reaching purge won’t be easy. But it can, will, and must get better for the sake of a safer future.”
Stoke-on-Trent North MP Ruth Smeeth – who has been subjected to anti-Semitic abuse online – believes changing algorithms on search engines, so people are not directed towards harmful or abusive content, would help.
She said: “Every time I speak to groups of young people, there is always a direct correlation between what goes on online and their mental health. We have an obligation to root out [abuse and harmful content] and protect more vulnerable people.”
In his letter, Mr Hancock wrote: “It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.
“It is time for internet and social media providers to step up and purge this content once and for all.”
He added that the Government is developing a white paper addressing ‘online harms’, and said it will look at content on suicide and self-harm.
Molly Russell was found dead in her bedroom in November 2017 after showing ‘no obvious signs’ of severe mental health issues. Her family later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.
Speaking to the BBC, Molly’s father Ian Russell said he believed Instagram ‘helped kill my daughter’.
In an interview with The Sunday Times, Mr Russell also criticised Pinterest, saying the online scrapbooking site has ‘a huge amount to answer for’.
Since Mr Russell spoke publicly about his daughter, Papyrus, a charity that works to prevent youth suicide, said it had been contacted by around 30 families who believe social media had a part to play in their children’s suicides.
An Instagram spokesman said: “We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders.
“As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people.
“While we undertake this review, we are taking measures aimed at preventing people from finding self-harm-related content through search and hashtags.”
A Pinterest spokesman told the BBC: “We have a policy against harmful content and take numerous proactive measures to try to prevent it from appearing and spreading on our platform.
“But we know we can do more, which is why we’ve been working to update our self-harm policy and enforcement guidelines over the last few months.”