The Government has set out plans for tough new online safety laws to be overseen by an independent regulator, which it claims would make the UK “the safest place in the world to go online”.
The proposed regulatory framework, published in a White Paper today, will impose a statutory “duty of care” on companies that “allow users to share or discover user-generated content or interact with each other online”.
This covers social media platforms, such as Facebook and Twitter, but also extends to search engines, messaging services, online forums and file hosting sites.
Firms must take reasonable steps to keep their users safe online and tackle illegal and harmful activity on their platforms.
If they fail to do so they could face heavy fines, have access to their sites blocked and in some cases senior management could become liable for company failings – all powers still being consulted on.
Prime Minister Theresa May said: “The internet can be brilliant at connecting people across the world, but for too long these companies have not done enough to protect users, especially children and young people, from harmful content.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
The proposed new laws aim to tackle a range of online harms, including incitement to violence and violent content, encouraging suicide, disinformation (or so-called “fake news”), cyber bullying and children viewing and reading inappropriate material online.
They will target the spread of terrorist content and child sexual abuse and exploitation content in particular.
Culture and Digital Secretary Jeremy Wright said: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.”
Online harms have been making headlines ahead of the white paper.
The online tabloid press faced criticism recently for sharing edited clips of the mosque massacre in New Zealand last month, which the terror attacker streamed live on Facebook. YouTube also struggled to remove uploads of the video. Fifty people were killed in the attack.
The father of 14-year-old Molly Russell, who took her own life in 2017, said in January that social media had been partly to blame for her death after content about depression and suicide was found on her Instagram account.
Today’s Online Harms White Paper is a policy document that proposes future legislation. It has been jointly written by the Home Office and the Department for Digital, Culture, Media and Sport.
There were reports last month that its publication had been delayed due to concerns the proposed legislation could negatively impact press freedom.
The Society of Editors said it welcomed the new measures to crack down on internet harms, but warned of the need to be vigilant over press freedom.
It warned the proposed laws set out in the White Paper could “threaten basic rights to freedom of expression if not introduced with great emphasis on avoiding too draconian regulations”.
Press freedom group Article 19 shared the society’s concerns. Its executive director, Thomas Hughes, added: “The Government must not create an environment that encourages the censorship of legitimate expression.
“Article 19 strongly opposes any ‘duty of care’ being imposed on Internet platforms. We believe a duty of care would inevitably require them to proactively monitor their networks and take a restrictive approach to content removal.
“Such actions could violate individuals’ rights to freedom of expression and privacy.” It said regulation “can be best achieved through independent self-regulation” overseen by a new independent and multi-stakeholder body.
The Government is still consulting on whether to set up a new independent regulator – some have claimed it could be called Ofweb – or if the responsibility for oversight could pass to an existing watchdog such as Ofcom, which regulates broadcast media.
It will be funded by industry in the medium term while the Government explores options, such as an industry levy, to sustain it.
The new regulator will have a “legal duty to pay due regard to innovation, and to protect users’ rights online, taking particular care not to infringe privacy or freedom of expression”, the White Paper said.
It added: “We are clear that the regulator will not be responsible for policing truth and accuracy online.”
Among the regulator’s roles would be ensuring social media companies publish annual “transparency reports” on the amount of harmful content on their platforms and what they are doing to address it.
It would also make sure companies respond to users’ complaints and act quickly to address them, and would enforce new codes of practice, which could include action to minimise the spread of “fake news” during elections.
The regulator would take a “risk-based approach” to regulation, acting where there is the “greatest evidence or threat of harm” or where children or other vulnerable users are at risk, the paper said.
It will work to set out expectations for companies “to do what is reasonably practicable to counter harmful activity or content”.
Today’s paper also proposes creating a “safety by design” framework to help companies incorporate online safety features in new apps and platforms at the point of their creation.
A media literacy strategy to teach people to recognise deceptive and malicious behaviours online, including child grooming and extremism, is to be developed with news media organisations.
Wright said: “Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users, however, those that fail to do this will face tough action.
“We want the UK to be the safest place in the world to go online, and the best place to start and grow a digital business and our proposals for new laws will help make sure everyone in our country can enjoy the Internet safely.”
Home Secretary Sajid Javid added: “The tech giants and social media companies have a moral duty to protect the young people they profit from.
“Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.
“That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise.”
A 12-week consultation on the proposals launched today, after which the Government will set out what action it will take to develop its final proposals into legislation.