A group of MPs is calling on the UK Government to immediately launch a new body to regulate social media giants whose bosses have “failed to tackle” the Covid-19 infodemic.
Julian Knight MP, chair of Parliament’s DCMS Committee, said: “Evidence that tech companies were able to benefit from the monetisation of false information and allowed others to do so is shocking. We need robust regulation to hold these companies to account.
“The coronavirus crisis has demonstrated that without due weight of the law, social media companies have no incentive to consider a duty of care to those who use their services.”
The committee suggested delays to online harms legislation – “promised by the Government 15 months ago” – have helped coronavirus misinformation to “spread virulently”.
The MPs, who have been investigating the Covid-19 infodemic for several months, also said evidence suggests tech companies have been able to monetise misinformation for themselves and others.
They said that efforts by the companies to “tackle misinformation through warning labels or tools to correct the record have fallen short”.
At the height of the Covid-19 pandemic, Press Gazette launched a campaign, Fight The Infodemic, calling on tech giants to do more to tackle misinformation.
The committee’s full report also lays out evidence showing how misinformation is allowed to spread on social media.
It said: “We know that novelty and fear (along with anger and disgust) are factors which drive ‘engagement’ with social media posts; that in turn pushes posts with these features further up users’ newsfeeds—this is one reason why false news can travel so fast.
“This is opposite to the corporate social responsibility policies espoused by tech companies relying on this business model.
“The more people engage with conspiracy theories and false news online, the more platforms are incentivised to continue surfacing similar content, which theoretically encourages users to continue using the platform so that more data can be collected and more adverts can be displayed.”
The report recommended: “The current business model not only creates disincentives for tech companies to tackle misinformation, it also allows others to monetise misinformation too.
“To properly address these issues, the online harms regulator will need sight of comprehensive advertising libraries to see if and how advertisers are spreading misinformation through paid advertising or are exploiting misinformation or other online harms for financial gain.”
Another section of the report focused on the role “quality journalism” can play in fighting misinformation. The MPs expressed concerns about funding issues experienced by news organisations because of Google and Facebook’s domination of the online advertising market.
Earlier this month, the UK’s Competition and Markets Authority called on the Government to create new powers to regulate the digital ad market to create greater competition and “improve the quality and accuracy of journalism”.
The DCMS committee said: “Tech companies rely on quality journalism to provide authoritative information. They earn revenue both from users consuming this on their platforms as well as (in the case of Google) providing advertising on news websites, and news drives users to their services.
“We agree with the Competition and Markets Authority that features of the digital advertising market controlled by companies such as Facebook and Google must not undermine the ability of newspapers and others to produce quality content.
“Tech companies should be elevating authoritative journalistic sources to combat the spread of misinformation. This is an issue to which the Committee will no doubt return.”
A Facebook spokesperson said: “We don’t allow harmful misinformation and have removed hundreds of thousands of posts including false cures, claims that Coronavirus doesn’t exist, that it’s caused by 5G or that social distancing is ineffective. In addition to what we remove, we’ve placed warning labels on around 90 million pieces of content related to Covid-19 on Facebook during March and April, which prevented people viewing the original post 95% of the time.
“This month we also launched alerts at the top of people’s feeds encouraging users to wear face masks, a media literacy campaign and a special mythbusting section of our Covid Information Centre. And since February, we have directed more than 3.5 million visits to official Covid advice on the NHS and UK government websites.”