May 15, 2019, updated 30 Sep 2022 7:47am

Facebook to block offending users from livestreaming in wake of New Zealand mosque massacre

By James Walker

Facebook is tightening its rules around livestreaming in the wake of the New Zealand mosque massacre as it seeks to prevent its platforms from “being used to cause harm or spread hate”.

Australian national Brenton Tarrant broadcast on Facebook Live as he fired on Muslim worshippers at two mosques in Christchurch on 15 March, killing 50 people in the country’s worst-ever terror attack.

Facebook battled to take down the video as it was continually republished.

Today the social media giant said in a blog post that it would adopt a new “one-strike” policy for “anyone who violates our most serious policies”, blocking them from using the live platform for set periods of time.

It gave the example of a 30-day ban from the date of a first offence, such as sharing a link to a statement from a terrorist group “with no context”. It also plans to stop banned users from creating adverts.

People restricted from using the Facebook Live tool will include those identified under the platform’s “dangerous organisations and individuals” policy, which covers terror and hate organisations.

Facebook vice president for integrity Guy Rosen said: “We recognize the tension between people who would prefer unfettered access to our services and the restrictions needed to keep people safe on Facebook.

“Our goal is to minimize risk of abuse on Live while enabling people to use Live in a positive way every day.”

He also revealed that Facebook plans to invest $7.5m (£5.8m) in academic research partnerships aimed at improving video and image analysis technology.

Rosen said variations of the Christchurch terror attack video, which were uploaded to the platform after the original was removed, presented a challenge for the tech company.

He said: “Although we deployed a number of techniques to eventually find these variants, including video and audio matching technology, we realized that this is an area where we need to invest in further research.”

Facebook is partnering with the University of Maryland, Cornell University and the University of California, Berkeley, to research new techniques to detect manipulated media across images, video and audio, and to distinguish between “unwitting posters” and those deliberately manipulating content.

The platform’s announcement came as international leaders and tech bosses led by New Zealand Prime Minister Jacinda Ardern met in Paris today to release a joint call to action against terrorist content online.

In a statement on Twitter, UK Prime Minister Theresa May said: “Today as I head to Paris, my message to Governments and internet companies is that we must work together to stop social media being used to promote terrorism or spread fear and hate.”

Picture: Pixabay
