The Guardian has published details from 100 internal Facebook training manuals on how its moderators should deal with extremist content.
While remarks such as “someone shoot Trump” should be taken down, The Guardian reports that Facebook’s guidelines allow remarks such as “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” to remain.
The investigation comes as Facebook faces pressure in the UK over the impact of secret and unregulated political advertising which some fear could undermine democracy.
Press Gazette’s Duopoly campaign has warned in recent months that Facebook and Google’s dominance over the advertising market is bad news for journalism.
Last year Facebook is estimated to have made more than £1.5bn from advertising in the UK alone, while returning almost nothing to the news publishers whose content is widely shared on the network.
A dossier apparently containing dozens of training manuals and internal documents obtained by the Guardian newspaper claims to offer an insight into how content posted by Facebook’s users is moderated.
Staff are told videos of abortions are allowed to remain on Facebook as long as they do not contain nudity, while footage of violent deaths does not have to be deleted because it can help create awareness of issues such as mental illness, the Guardian said.
All “handmade” art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not, the newspaper claimed.
Facebook will also allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”, it added.
The leak is likely to reignite the debate over how to balance freedom of expression, safety and censorship on the internet.
Last week Theresa May outlined plans for widespread reform of cyberspace.
She said the internet had brought “a wealth of opportunity, but also significant new risks which have evolved faster than society’s response to them”.
Outlining plans under a future Tory government, she said: “We want social media companies to do more to help redress the balance and will take action to make sure they do.
“These measures will help make Britain the best place in the world to start and run a digital business, and the safest place in the world for people to be online.”
The document also said: “At a time when the internet is changing the way people obtain their news, we also need to take steps to protect the reliability and objectivity of information that is essential to our democracy and a free and independent press.
“We will ensure content creators are appropriately rewarded for the content they make available online.”
Monika Bickert, head of global policy management at Facebook, said: “Keeping people on Facebook safe is the most important thing we do.
“(Founder) Mark Zuckerberg recently announced that over the next year, we’ll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week, and improve the process for doing it quickly.
“In addition to investing in more people, we’re also building better tools to keep our community safe.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”
The contents of the dossier were described by children’s charity the NSPCC as “alarming to say the least”.
A spokesman said: “It (Facebook) needs to do more than hire an extra 3,000 moderators.
“Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe.”