The internet has undoubtedly changed the way we live and work. From connecting us with those we don’t see as often, to helping us work and communicate more efficiently and cost-effectively, using online channels can offer us some great benefits.
But as the internet has evolved, a darker side has inevitably emerged, bringing about harm and self-doubt that in some cases has proved devastating.
The inquest into the death of 14-year-old Molly Russell found that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”. Molly had interacted with harmful content online; the coroner commented that the material viewed by Molly on social media “shouldn’t have been available for a child to see”.
Commenting on Molly’s inquest, the NSPCC’s CEO Sir Peter Wanless has said that “Tech companies must expect to be held to account when they put the safety of children second to commercial decisions”.
What’s being done about harmful content online?
The government first published the Online Safety Bill in May 2021, which has since been developed, strengthened and clarified.
The Bill is still proceeding through Parliament, with the latest version published on 19th December 2022. At the time of writing, the Bill has completed its journey through the House of Commons after the government agreed to amend it to include possible custodial sentences for tech company bosses who knowingly break the law. The Bill now moves to the House of Lords for further readings prior to Royal Assent.
Although there’s still a way to go, it’s clear that the Bill is much needed. A January 2023 news report by Sky reveals that four in five adults want social media companies to be held responsible for harm caused by content on their channels.
Following an announcement in 2022 that a new offence of encouraging someone to self-harm would be introduced, the Samaritans’ Head of Policy, Public Affairs & Campaigns, Mubeen Bhutta, welcomed the move but commented:
“The Government’s commitment to take the action Samaritans have called for and help protect people from others maliciously encouraging or assisting them to self-harm is an important step in the right direction… It’s vital that these changes are implemented in time to be part of the new online safety laws. Every day that goes by is another day people remain at risk.”
How will the Online Safety Bill protect adults & children?
The Online Safety Bill makes social media companies accountable for user safety on their platforms.
The Bill’s measures to protect children include requirements for social media organisations to:
- remove illegal content quickly, or prevent it from appearing at all
- remove content promoting self-harm
- enforce age limits and age checks
- prevent children from accessing harmful content, and/or content inappropriate for their age
- publish risk assessments, making companies more transparent about the risks and dangers children may face when using their platforms
- make the reporting of problems online clear and accessible for both children and their parents
Three measures are included to protect adults. Social media companies must:
- remove illegal content
- remove content banned by social media companies’ own terms and conditions; and
- provide users with tools allowing them to tailor the content they see, including avoiding potentially harmful content if they don’t want to see it
This may be achieved through human moderation of content, removal of content flagged by users, or the use of warning screens.
Managing illegal & harmful content online
The Online Safety Bill requires social media companies to not only remove all existing illegal content, but to stop illegal content appearing in the first place.
Illegal content includes material promoting self-harm, sexual exploitation, controlling or coercive behaviour, child sexual abuse, fraud or hate crime.
Children will be prevented from viewing harmful or age-inappropriate content, including pornographic content, online abuse, bullying, harassment, or content which promotes or glorifies suicide, self-harm or eating disorders.
Tighter age restrictions for children
Social media companies must ensure that no children under their specified minimum user age use their platforms.
Organisations must state which age verification technologies they are using, and strictly enforce their age limits.
Making sure social media companies are protecting their users
The Online Safety Bill appoints Ofcom as regulator, responsible for ensuring that social media companies put effective processes and measures in place to protect users from harmful content.
The Bill gives Ofcom powers to take action against companies that fail to meet their obligations, allowing fines of up to £18m or 10% of annual global turnover, whichever is the greater.
Criminal action may also be taken against senior managers. In extreme cases, Ofcom has powers to require payment providers, advertisers and other providers to stop working with a platform, preventing it from generating money or from being accessed at all in the UK.
Why are these measures important?
Adroit Legal Services’ sister company, the National Bereavement Service, sits on the Support After Suicide Partnership (SASP), a network of over 80 members and supporters founded in 2013 which has contributed to the national suicide prevention strategy.
This important work has recently gained valuable recognition, with the partnership’s founder, Hamish Elvidge, awarded an MBE in the New Year’s Honours List 2023.
More here on our work with SASP & the national suicide prevention strategy
Adroit Legal Services director Lisa Lund is on the SASP steering committee, and comments:
“Working with grieving and bereaved people regularly, we welcome the measures provided by the Online Safety Bill and look forward to the legislation passing promptly through Parliament in 2023.
It is clear from cases such as Molly Russell’s that strictly enforced processes are needed to prevent access to harmful material, both for children and adults.
Those bereaved by suicide are almost twice as likely to attempt suicide themselves. The Online Safety Bill implements measures intended to help reduce the risk of suicide presented by material accessed online, whether bullying, abusive or glorifying suicide and self-harm.
It is critical that the Bill is passed and its new laws enforced against major social media companies, not only reducing the risk of suicide by users of those platforms but also then – according to the SASP research – reducing the risk of subsequent suicides.”
This article has been prepared by Adroit Legal Services and is not intended to constitute legal advice.
—
Trusted legal support for your people by Adroit Legal Services
Established in 2015, Adroit connects our clients’ people and customer base to quality-assured, affordable and accessible legal services from trusted and experienced experts.
Legal specialists throughout the UK are tried and tested, delivering honest advice at every stage of life to protect people’s interests, family and wellbeing.
Adroit’s panel of advisors offers a free initial consultation and discounted rates, making legal services accessible to everyone who is important to a client organisation.
Contact Adroit Legal Services to find out how we help our clients to protect their people’s wellbeing with access to quality-assured, great-value legal services through tailored employee benefits packages.