In the first 24 hours after the deadly mass shooting in New Zealand, Facebook says that it removed 1.5 million uploaded videos of the attack, of which 1.2 million were blocked at upload.
The company made the announcement in a tweet, following up on a prior announcement that it had been alerted by authorities and had removed the alleged shooter’s Facebook and Instagram accounts. Facebook spokeswoman Mia Garlick says that the company is also “removing all edited versions of the video that do not show graphic content.”
We’ve reached out to Facebook for additional comment and will update this post if we hear back.
The terror attack appears to have been designed to go viral: the alleged shooter released a manifesto that referenced numerous individuals, including YouTuber Felix Kjellberg and Candace Owens, as well as white supremacist conspiracy theories. He also posted a 17-minute video to Facebook, Instagram, Twitter, and YouTube, which spread further still, even as all of those companies worked to prevent its circulation.
The attack has prompted social media sites to react to such content: Facebook, Twitter, and YouTube have been working to remove copies of the video, Reddit banned a subreddit called r/watchpeopledie, and Valve began removing tributes to the alleged shooter that were posted to user profiles.