Facebook is cracking down on so-called “deepfake” videos in the lead-up to the 2020 US presidential election, though humorous content will be exempt.
The social network has said it will remove misleading manipulated media that has been edited in ways that “aren’t apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say”.
Videos will also be banned if they are made using artificial intelligence or machine learning that “merges, replaces or superimposes content on to a video, making it appear to be authentic”.
Donald Trump is hoping to win a second term in office when US voters go to the polls on 3 November. His Democratic rival is yet to be named.
Facebook said its policy on deepfake videos “does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words”.
The move is the latest effort by the tech giant to rein in misinformation, after its decision not to fact-check political advertising emerged ahead of the UK general election.
Facebook came under the spotlight last year for allowing an altered video of US House Speaker Nancy Pelosi to remain on its platform.
Under the new rules, the firm says the Pelosi video can stay online because it does not meet the policy’s criteria: only videos generated by artificial intelligence to show people saying fictional things will be taken down.
Once the video of Pelosi was rated by a third-party fact-checker, its distribution was reduced and those who tried to share it – or already had done – received warnings that it was false, the social network added.
Twitter has banned all political advertising on its site.
Picture: Reuters/Dado Ruvic/Illustration