Facebook has launched a UK arm of its international fact-checking initiative following more than two years of criticism over how the social network has handled the spread of misinformation on its platform.
Full Fact, a fact-checking charity founded in 2010, will review stories, images and videos which have been flagged by users and rate them based on their accuracy.
The charity’s efforts will focus on the misinformation it judges most damaging, such as fake medical information, false stories about terror attacks and election hoaxes.
Facebook’s leadership has been repeatedly criticised by politicians in recent years as problems of misinformation and foreign interference have plagued elections around the world.
The Brexit referendum and 2017 general election were both found to have been tarnished by so-called fake news, while online mistruths have been blamed for stoking division in nations around the world.
Social media companies have faced the threat of regulation if they fail to act on false information on their platforms, and Facebook has been called to answer questions from lawmakers in numerous countries on the subject.
In a highly publicised evidence session before the US Congress in April, founder Mark Zuckerberg addressed the company’s failings on false information and the data scandal involving Cambridge Analytica.
However, he failed to appear when called before the UK Parliament’s inquiry into fake news, prompting MPs to leave an empty chair for him during a November session with Facebook vice-president Richard Allan.
Under the new measures, Facebook users will be able to report posts they fear may be inaccurate for Full Fact to review, while other suspicious posts will be identified by Facebook technology.
Posts will then be labelled as true, not true or a mixture when users share them.
If a piece of content is proven to be false, it will appear lower in Facebook’s News Feed but will not be deleted.
Claire Wardle, executive director of First Draft, which worked with Full Fact on the 2017 general election, said the biggest problem is that Facebook holds all the information about the project, making it almost impossible for independent auditors to see whether it is working.
“Facebook has this global database of online misinformation and that is something that should be available to researchers and the public,” said Wardle.
“The first concern is to protect free speech and people’s ability to say what they want,” said Will Moy, director of Full Fact, adding that the main problem on social media is often that “it is harder and harder to know what to trust”.
Rather than the “nuanced political fact-checking” on topics such as Brexit and immigration often found on Full Fact’s website, Moy predicted misinformation around health will be one of the biggest issues his team will be tackling.
Facebook first launched its fact-checking initiative in December 2016, after concerns were raised about hoaxes and propaganda spread around the election of Donald Trump.
The social network now works with fact-checkers in more than 20 countries to review content on its platform but studies disagree as to whether their efforts have been effective.
Full Fact will publish all its fact-checks on its website, Moy said, as well as quarterly reports reviewing the relationship with Facebook.
Sarah Brown, Facebook’s training and news literacy manager for EMEA, said in a statement: “People don’t want to see false news on Facebook, and nor do we.
“We’re delighted to be working with an organisation as reputable and respected as Full Fact to tackle this issue.
“By combining technology with the expertise of our fact-checking partners, we’re working continuously to reduce the spread of misinformation on our platform.”
Picture: Reuters/Dado Ruvic/Illustration/File Photo