In tackling the threat of so-called “fake news” and disinformation, MPs recommend establishing a new code of ethics for social media companies – setting out what constitutes harmful content – to be overseen by a regulator with the ability to issue large fines for non-compliance.
This new regulator should be a public body, able to take up complaints from the public and launch legal proceedings against the likes of Facebook, while the tech giants themselves should assume legal liability for harmful content posted by users, MPs have proposed.
The Department for Digital, Culture, Media and Sport Select Committee’s final report into disinformation and “fake news”, published today, said the new regulator should also be able to obtain “any information from social media companies that are relevant to its inquiries”.
This includes the capability to check what data is being held on a user, as well as access to the tech companies’ algorithms and mechanisms to “ensure they are operating responsibly”.
It said the body should have statutory powers to monitor tech companies and that these companies should have “relevant systems in place to highlight and remove ‘types of harm’”.
The report said: “The code of ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media.
“This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.”
In its 108-page report, the committee also recommended extending privacy protection laws to include “models used to make inferences about an individual”, such as those seen during political campaigning, echoing a proposal from the Information Commissioner’s Office.
To fund its new system of regulation, the committee said it supports a levy on tech companies operating in the UK.
The report took aim at Facebook and its founder Mark Zuckerberg in particular, saying the tech billionaire had “shown contempt” towards the UK Parliament and the International Grand Committee on fake news by refusing invitations to appear before them.
The committee said the Cambridge Analytica scandal, in which millions of Facebook users’ personal data was harvested and allegedly used for political influence, had been “facilitated by Facebook’s policies”.
The report said: “Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.”
It said evidence from court documents acquired by the committee, in relation to a case brought against Facebook by developer Six4Three in the US, indicated that Facebook was “willing to override its users’ privacy settings in order to transfer data to some app developers, to charge high prices in advertising to some developers, for the exchange of that data, and to starve some developers – such as Six4Three – of that data, thereby causing them to lose their business”.
The committee went on: “It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws”.
As such, it said the ICO should carry out a detailed investigation into Facebook’s practices, its use of users’ and users’ friends’ data, and its use of “reciprocity” in the sharing of data.
It said Facebook was “unwilling to be accountable to regulators around the world” and said the UK Government should consider the impact of tech monopolies on the political world and on democracy.
Press Gazette understands that Facebook rejects any claim that it has breached data protection and competition laws.
In its recommendations, the committee said social media users needed online tools to “help them distinguish between quality journalism and stories coming from organisations that have been linked to disinformation or are regarded as being unreliable sources”.
It said social media companies should be required to either develop tools like this for themselves, or work with existing providers to make such services available for their users.
“The requirement for social media companies to introduce these measures could form part of a new system of content regulation, based on a statutory code, and overseen by an independent regulator,” the report said.
MPs said social media companies needed to be more transparent about their own sites, and how they work, “rather than hiding behind complex agreements”. They said the networks should be telling users how algorithms are used to prioritise certain stories, news and videos, depending on each user’s profile.
The committee repeated its suggestion that a new category of tech company be formulated that is neither a “platform” nor a “publisher”.
Since the inquiry began 18 months ago, Facebook has taken steps to defend against the spread of disinformation, including forcing political ad buyers in the UK to verify their identity and location.
The company has also donated £4.5m to fund 80 new community journalists who will be trained up by the National Council for the Training of Journalists and sent into local and regional newsrooms.
Among the DCMS Committee’s other recommendations were:
- The Competition and Markets Authority should conduct a comprehensive audit of the operation of the advertising market on social media
- The CMA should also investigate whether Facebook has been involved in anti-competitive practices
- Creating a public “searchable repository” for all political adverts, including details about who paid for the ad and who is being targeted by it, “so the public can understand the behaviour of individual advertisers”
- The Government should put pressure on social media companies to publicise any instances of disinformation
- The Government needs to ensure social media companies share information they have about foreign interference on their sites, including who has paid for political adverts, who has seen the adverts, and who has clicked on them, with the threat of financial liability if this is not forthcoming
- The Government should make a statement about how many investigations are currently being carried out into Russian interference in UK politics
- The Government should launch an independent investigation into past elections – including the 2017 general election, 2016 EU Referendum and the 2014 Scottish Independence Referendum – “to explore what actually happened with regard to foreign influence, disinformation, funding, voter manipulation, and the sharing of data, so that appropriate changes to the law can be made and lessons learnt for future elections and referenda”.
- Digital literacy should be made a “fourth pillar of education”, alongside reading, writing and maths
- Building more “obstacles”, or “friction”, into social media platforms and users’ own activities “to give people time to consider what they are writing and sharing”
- Techniques for slowing down interaction online should be taught, “so that people themselves question both what they write and what they read – and that they pause and think further, before they make a judgement online”.
Damian Collins MP, chairman of the DCMS Committee, said: “Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them – we cannot delay any longer.
“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.
“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.
“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.
“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.
“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.
“We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be done to require major donors to clearly establish the source of their funds.
“Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.
“We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.
“Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world.
“Evidence uncovered by my Committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information.
“Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.
“We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics.
“We want to find out what was the impact of disinformation and voter manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation.”
Karim Palant, UK public policy manager at Facebook, said: “We share the committee’s concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.
“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform. But we’re not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do.
“We also support effective privacy legislation that holds companies to high standards in their use of data and transparency for users.
“While we still have more to do, we are not the same company we were a year ago. We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”
In response to the DCMS Committee’s report, a Government spokesperson said: “The Government’s forthcoming White Paper on Online Harms will set out a new framework for ensuring disinformation is tackled effectively, while respecting freedom of expression and promoting innovation.
“This week the Culture Secretary will travel to the United States to meet with tech giants including Google, Facebook, Twitter and Apple to discuss many of these issues.
“We welcome this report’s contribution towards our work to tackle the increasing threat of disinformation and to make the UK the safest place to be online. We will respond in due course.”
Read the full DCMS Committee final report on Disinformation and “fake news”.
Picture: Pixabay