Facebook Inc. said it removed 1.5 million videos of the New Zealand mosque attack worldwide in the first 24 hours after the shooting. "In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…," Facebook said in a tweet late Saturday. The company added that it is also removing all edited versions of the video that do not show graphic content, out of respect for the people affected by the shooting and in response to the concerns of local authorities.
The death toll in the New Zealand mosque shootings rose to 50 on Sunday. The gunman who attacked two mosques on Friday live-streamed the assault on Facebook for more than 17 minutes using an app made for sports enthusiasts, and copies were still circulating on social media hours later. New Zealand Prime Minister Jacinda Ardern has said she wants to discuss live streaming with Facebook. The live stream appears to have been planned, and the video went viral as the attacker intended. It also spread on other platforms such as Twitter and YouTube, which worked to contain it. Moderating live streams in real time remains difficult; for now, platforms chiefly limit the spread of such material by applying multiple filters when a user tries to upload content.
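The "blocked at upload" figure in Facebook's statement points to one common filtering technique: fingerprinting known prohibited videos and rejecting uploads that match. The sketch below is a simplified illustration, not Facebook's actual system; the `BLOCKED_HASHES` set, `fingerprint`, and `allow_upload` names are hypothetical, and it uses a cryptographic hash, which only catches byte-identical copies. Platforms in practice rely on perceptual hashes that survive re-encoding and minor edits.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known prohibited videos.
# In practice this would be fed from a shared industry hash database.
BLOCKED_HASHES: set = set()

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 fingerprint of the uploaded file's bytes.

    Note: a cryptographic hash changes completely if even one byte
    differs, so this only blocks exact copies, unlike the perceptual
    hashing real platforms use.
    """
    return hashlib.sha256(data).hexdigest()

def allow_upload(data: bytes) -> bool:
    """Reject the upload if its fingerprint matches a known bad video."""
    return fingerprint(data) not in BLOCKED_HASHES

# Register a known bad clip, then screen incoming uploads against it.
BLOCKED_HASHES.add(fingerprint(b"known prohibited clip"))
print(allow_upload(b"known prohibited clip"))  # exact copy: blocked
print(allow_upload(b"harmless home video"))    # no match: allowed
```

This weakness of exact matching is why the modified versions mentioned above kept reappearing: each re-encode produces new bytes, hence a new hash, forcing platforms toward similarity-based fingerprints and manual review.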