
Facebook restricts Live feature, citing New Zealand shooting

Silhouettes of mobile users are seen next to a screen projection of the Facebook logo in this picture illustration taken on March 28, 2018. File photo: Reuters

Facebook Inc said yesterday it was tightening rules around its livestreaming feature ahead of a meeting of world leaders aimed at curbing online violence in the aftermath of a massacre in New Zealand.

A lone gunman killed 51 people at two mosques in the city of Christchurch on March 15 while livestreaming the attacks on Facebook. It was New Zealand's worst peacetime shooting and spurred calls for tech companies to do more to combat extremism on their services.

Facebook said in a statement it was introducing a "one-strike" policy for use of Facebook Live, temporarily restricting access for people who have faced disciplinary action for breaking the company's most serious rules anywhere on its site.

First-time offenders will be suspended from using Live for set periods of time, the company said. It is also broadening the range of offences that will qualify for one-strike suspensions.

New Zealand Prime Minister Jacinda Ardern said the change addressed a key component of an initiative she is spearheading, known as the "Christchurch Call", to halt the spread of violence online.

"Facebook's decision to put limits on live streaming is a good first step to restrict the application being used as a tool for terrorists, and shows the Christchurch Call is being acted on," she said in an email from her spokesman.

Facebook did not specify which offences were eligible for the one-strike policy or how long suspensions would last, but a spokeswoman said it would not have been possible for the shooter to use Live on his account under the new rules.

The company said it plans to extend the restrictions to other areas over coming weeks, beginning with preventing the same people from creating ads on Facebook.

It also said it would fund research at three universities on techniques to detect manipulated media, which Facebook's systems struggled to spot in the aftermath of the attack.

'WORK TO DO'

Ardern said the research was welcome and that edited and manipulated videos of the March 15 mosque shootings had been slow to be removed, with the result that many people, including herself, saw the footage played in their Facebook feeds.

Facebook has said it removed 1.5 million videos globally that contained footage of the attack in the first 24 hours after it occurred. It said in a blog post in late March that it had identified more than 900 different versions of the video.

Ardern is due to lead a meeting with French President Emmanuel Macron in Paris on Wednesday that seeks to have world leaders and chiefs of tech companies sign a pledge to eliminate violent content online.

"There is a lot more work to do, but I am pleased Facebook have taken additional steps today alongside the Call and look forward to a long term collaboration to make social media safer by removing terrorist content from it," she said.

In an opinion piece in the New York Times on Saturday, Ardern said the "Christchurch Call" would be a voluntary framework that commits signatories to put in place specific measures to prevent the uploading of terrorist content.

Ardern has not made specific demands of social media companies in connection with the pledge, but has called for them "to prevent the use of livestreaming as a tool for broadcasting terrorist attacks."

Representatives from Facebook, Alphabet Inc's Google, Twitter Inc and other tech companies are expected to take part in the meeting, although Facebook Chief Executive Mark Zuckerberg will not be in attendance.
