June 29, 2020 (updated 30 Sep 2022, 9:26am)

House of Lords committee: Google and Facebook must face UK regulation – and sanctions – to combat ‘pandemic of misinformation’

By William Turvill and Charlotte Tobitt

A House of Lords committee is demanding the UK Government introduce new legislation allowing regulators to sanction tech giants and tackle a “pandemic of misinformation” that is threatening lives and democracy.

The peers, led by Lord Puttnam, are urging ministers to push through Online Harms legislation “without delay” so that Google, Facebook and other firms can be held accountable for falsehoods spreading online.

A report published today also calls on the Government to “urgently” provide support for struggling news organisations and to implement the recommendations of the Cairncross Review.

In addition, the committee is calling on the Competition and Markets Authority (CMA) to “conduct a full market investigation into online platforms’ control over digital advertising”.

The House of Lords Select Committee on Democracy and Digital Technologies wants the proposed Online Harms Bill to give broadcast regulator Ofcom the power to sanction platforms.

The Lords suggest that sanctions should include fines of up to 4% of global turnover and powers to block “serially non-compliant platforms”.

Alphabet, the owner of Google and YouTube, reported total revenues of $162bn last year, meaning fines under the committee’s proposal could be worth billions.

In a briefing ahead of the report’s publication, Lord Puttnam added that the “single greatest sanction is reputational damage”.

“If it happens three or four times it will become an embarrassment,” he said. “Shareholders won’t want it.”

‘The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech’

The committee’s 151-page report, published today, calls for legislation to “make sure that online platforms bear ultimate responsibility for the content that their algorithms promote”.

And the report says that when “harmful content spreads virally on their service or where it is posted by users with a large audience, they should face sanctions over their output as other broadcasters do”.

The committee believes individual users need “greater protection”, and should be able to seek redress against tech firms through an ombudsman.

And they say Facebook, Google and others should show more transparency instead of hiding the way their algorithms work. They are therefore recommending that platforms be forced to conduct audits to “show how in creating these algorithms they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire the information from platforms they need to exercise those powers.”

The Lords also want electoral law to be “completely updated” for the digital age to deal with controversies surrounding online political advertising.

“The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation,” the report says.

“The digital and social media landscape is dominated by two behemoths – Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics.

“This has become acutely obvious in the current Covid-19 pandemic where online misinformation poses not only a real and present danger to our democracy but also to our lives.

“Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.

“Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society.”

Other recommendations in the report include:

  • The Online Harms Bill “should make clear that misinformation and disinformation are within its scope”
  • Ofcom should produce a code of practice on misinformation. Included in this should be a requirement that if something is identified as misinformation by an accredited fact-checker, it should be flagged across all platforms and not promoted
  • Ofcom and the platforms should develop a funding system to support fact-checkers independent of both the Government and the companies.

Google, Facebook and other tech giants are coming under increased pressure over their business practices in countries across the world.

Calls for greater regulation have intensified since the outbreak of the Covid-19 pandemic, which has led to large amounts of misinformation being spread via social media platforms including Google-owned YouTube, Facebook and Facebook-owned Instagram.

Press Gazette recently launched a campaign, Fight The Infodemic, calling on the tech platforms to do more to crack down on harmful misinformation. Several Press Gazette investigations have laid bare the prominence of conspiracy theories on platforms like YouTube and Facebook.

Broadcasters and newspapers criticised over misinformation

The House of Lords committee report noted this, but also shared concerns about misinformation in the traditional media.

It highlighted London Live’s interview with conspiracy theorist David Icke as a particular concern, saying the station appeared “well aware that it was not acting in a responsible manner”. The report said: “Whilst Ofcom have enforced a correction to be shown on the station, at time of writing it has kept its broadcasting licence.”

The committee also highlighted concerns about conspiracy theories aired by presenter Eamonn Holmes on ITV’s This Morning, and accused several newspapers of reporting misinformation about mass cremations occurring in China during the early days of Covid-19.

The Society of Editors raised concerns that the proposed new legislation could force tech giants to be too sweeping in their removal of content, putting legitimate news content at risk.

Executive director Ian Murray said: “No one wishes to see the spread of false information and fake news, especially where doing so puts people at risk or undermines democracy.

“But if the penalties are steep and the scope of the remit for any regulator wide, then at the very least there will be a chilling effect on journalism in this country.

“The digital platforms will be forced to use broad algorithms to avoid the risk of heavy sanctions, an approach that will remove legitimate journalistic content as well as harmful matter.”

Murray also raised concerns that the question of what amounts to disinformation could effectively lead to Government censorship.

