Facebook has set out fresh commitments to protect elections from interference and misinformation should the UK go to the polls.
The social network said that if a general election is called it will set up a dedicated operations centre to serve as an added layer of defence, monitoring and quickly removing activity that breaks its rules.
From next week, adverts relating to social issues such as immigration, health and the environment will have to go through the same verification process as political adverts.
At present, Facebook requires political advertisers to disclose who they are and where they live.
Posts that Facebook's UK fact-checking partner Full Fact finds to contain misinformation will also feature more prominent labelling.
Damian Collins, chairman of the Digital, Culture, Media and Sport Committee, asked in a letter today whether Facebook has any plans to formalise a long-term working relationship with third-party fact-checkers given the “onus” put on them in tackling so-called “fake news”.
But writing in the Daily Telegraph, Facebook's vice president of policy solutions Richard Allan said new rules for the era of digital campaigning need to be decided by Parliament and regulators.
“While we are taking a number of steps, there are many areas where it’s simply not appropriate for a private company like Facebook to be setting the rules of the game or calling the shots,” he explained.
“For instance, we do not believe it should be our role to fact check or judge the veracity of what politicians say – not least since political speech is heavily scrutinised by the media and our democratic processes.”
Collins wrote to Facebook's vice president for global affairs and communications, Nick Clegg, questioning the decision to exempt political figures' pages from fact-checking.
He also demanded answers after Facebook dropped its ban on political advertising that contains "deceptive, false or misleading content".
Instead, the network now only bans adverts featuring claims that have been debunked by third-party fact-checkers or other organisations with relevant expertise.
Collins said these changes would place a "heavy constraint" on the social network's "ability to combat online disinformation in the run-up to elections around the world", including in the UK.
In his piece, Allan said Facebook was making its library of political adverts more transparent, adding that it is “already being used by journalists and researchers to analyse in real time what political parties and candidates are saying and doing”.
He went on: “UK electoral law needs to be brought into the 21st century to give clarity to everyone – political parties, candidates and the platforms they use to promote their campaigns.
“The law may not be changed before Britain goes to the polls again, but we are determined to play our part in protecting elections from interference by making our platform more secure and political advertising more transparent.”
The changes mark the latest balancing act for Facebook as it seeks to champion freedom of speech against a backdrop of misinformation campaigns.
On Monday, it removed further networks engaged in so-called co-ordinated inauthentic behaviour originating in Iran and Russia, which targeted the US, North Africa and Latin America.
Facebook is also attempting to bolster its efforts against interference ahead of US presidential elections in 2020, including a special security tool for elected officials and candidates that checks their accounts for hacking attempts, as well as a new US presidential candidate spend tracker.
The social network has pledged to add more information about who is behind a page, starting with large pages in the US.
From next month, Facebook said it will start labelling media outlets that are wholly or partially under the editorial control of their government as state-controlled media.
Picture: Reuters/Dado Ruvic/Illustration