Facebook Will Now Have 7,500 People Devoted to Finding Snuff Videos
In a major announcement, Facebook CEO Mark Zuckerberg pledged to hire an additional 3,000 employees to review videos reportedly featuring violence, “hate speech,” and “child exploitation.” Zuckerberg’s statement follows a string of murders and violent crimes broadcast over the site’s live streaming service, Facebook Live.
According to the Facebook post, the company already employs 4,500 people on its “community operations team” to review content that has been flagged as inappropriate. An additional 3,000 people would bring the total devoted to taking down offensive content to 7,500. (It’s unclear whether the 3,000 new hires will be full-time employees or contractors.) Facebook has a total workforce of about 17,000.
For all the promises from Silicon Valley that algorithms will solve society’s ills, Zuckerberg’s announcement is a clear sign that we’re definitely not there yet.
On top of the new hires, Zuckerberg also committed to building “better tools to keep our community safe” and to making it easier for users to report content that violates the site’s terms of service. The new initiative comes in the wake of several incidents of recorded violence on the site, with many calling for tougher restrictions on what content is allowed.
Last month, Steve Stephens broadcast a video on Facebook of himself shooting a 74-year-old stranger in Cleveland, Ohio. The gruesome footage quickly circulated across the Internet before being taken down. Stephens later killed himself after a manhunt and police chase.
On April 26, an Alabama man streamed his suicide on the service. Earlier that week, a man in Thailand streamed himself hanging his 11-month-old daughter before killing himself. Here is Zuckerberg's statement from May 10th:
Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community.
If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.
Over the next year, we'll be adding 3,000 people to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly.
These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else.
In addition to investing in more people, we're also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.
No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.