The best way to fight so-called fake news is by “inoculating” social media users against online propaganda, MPs have been told.
Regularly teaching social media users how to spot hoaxes, and alerting people to the sources of online lies, is far more effective than trying to correct falsehoods after they have been read, according to academic experts who study the issue.
“The most important problem we’re facing is that once people take in information, it’s extremely difficult to undo that,” Professor Stephan Lewandowsky, an expert in cognitive psychology at the University of Bristol, told the Parliamentary inquiry into fake news yesterday.
But giving people an “inoculation, like a vaccine” by showing them how lies spread online can be effective, said Professor Lewandowsky.
“If we can get to people before the misinformation does then there’s evidence to show they will be able to filter it out better,” he added.
In October, Facebook released figures showing that Russian operatives published 80,000 posts over a two-year period in an attempt to influence the 2016 US election, reaching 126m Americans in the process.
Twitter said last week that nearly 700,000 users had been “exposed” to Russian propaganda around the US election sent by more than 50,000 accounts.
Executives from Facebook, Twitter and Google are all due to give evidence to the parliamentary fake news inquiry in February about Russian meddling on their platforms during the Brexit vote.
Last year’s French election was also targeted by online propaganda attempting to sway the vote away from the centrist Emmanuel Macron and towards the far-right Marine Le Pen.
But France was better prepared, said Professor Lewandowsky, because the country “knew what was coming” after the lessons of the US election and the EU referendum in 2016.
“But we also need to change the whole ecosystem of online information to make it harder for misinformation to spread,” he added.
Professor Vian Bakir, from Bangor University, said Facebook promoted news and information that supported people’s existing beliefs, and that people were less likely to question ideas they already held, whether those ideas were true or not.
Dr Caroline Tagg, from the Open University, said that, on Facebook in particular, people tended to ignore or block out information they disagreed with rather than correct it.
“This increases the filter bubble effect which increases the polarisation of views which enables fake news to be circulated more easily,” said Dr Tagg.
Professor Bakir said that while there was “no silver bullet”, online information and misinformation should be addressed “at all levels” of education and that political leaders should take responsibility for any lies they spread.
MPs will hear from executives at Facebook, Twitter and Google in Washington on February 8.
Picture: Reuters/Dado Ruvic/Illustration