Published: Sun, March 17, 2019

Facebook, YouTube trying to rein in footage of New Zealand mosque shooting
"Police alerted us to a video on Facebook shortly after the live-stream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video", Facebook said on its Twitter account.

In a social media post shortly before the attack, an account believed to belong to one of the attackers posted a link to an 87-page manifesto filled with anti-immigrant, anti-Muslim ideas and justifications for an attack. Twitter and Google said they were working to stop the footage from being reshared.

Facebook and YouTube did not immediately respond to HuffPost's request for comment on the matter.

Social networks have been caught flat-footed in many cases by videos showing violent acts including suicides and assassinations.

Facebook, Twitter, Alphabet Inc and other social media companies have previously acknowledged the challenges they face policing content on their platforms.

Mia Garlick, of Facebook in New Zealand, said: "We will continue working directly with New Zealand Police as their response and investigation continues".

"Facebook is an unmoderated platform where you can stream anything you want", she said, arguing that no meaningful measures have been taken since a 2017 Facebook livestream of a murder in Cleveland, Ohio. "Platforms can't prevent that, but much more can be done by platforms to prevent such content from gaining a foothold and spreading".

"The responsibility for content of the stream lies completely and exclusively on the person who initiated the stream".

He said the company condemned "the actions of these terrible persons and their disgusting use of our app for these purposes". "That's unacceptable, it should have never happened, and it should have been taken down a lot more swiftly".

In footage that at times resembled scenes from a first-person shooter video game, the mosque shooter was seen spraying terrified worshippers with bullets, sometimes firing again at people he had already cut down.

Besides acting on user complaints about copies, YouTube said on Friday that it was trying to identify copies with an automated tool that flags videos likely to be violent based on a combination of the video's title and description, the characteristics of the uploading account, and objects detected in the footage.

According to The Verge, although exact re-uploads of the massacre video are taken down automatically, videos containing only part of the clip are sent to human moderators so that news reports featuring the footage aren't removed in the process. "Groups have deliberately spread it and those accounts should be closed down". New Zealand Prime Minister Jacinda Ardern also labeled it a "terrorist attack".

"We are very clear in our site terms of service that posting content that incites or glorifies violence will get users and communities banned from Reddit", it said in a statement.

Children's screams could be heard in the distance as he strode to his auto to get another rifle, then returned to the mosque, where at least two dozen people could be seen lying in pools of blood.