
Web’s Worst Job? Facebook Hires 3,000 to Watch for Murders So You Don’t See Them

‘Wellness Plan’

The company wouldn’t say whether the 3,000 new watchers it is hiring to weed out the worst videos will be employees or subcontracted workers.

Facebook Live broadcast a senseless murder on Cleveland’s streets. Showed the world a live feed of a Thai man killing his baby daughter before committing suicide. Aired the torture of a mentally ill man in Chicago.

Now Facebook is hiring 3,000 people to weed out the worst videos. But the job is one of the internet’s toughest, as those in similar roles at other companies have grappled with PTSD and other mental health issues. For all Silicon Valley’s talk of disruption, tech giants have yet to solve the problem of what may be the internet’s worst job: watching traumatizing content all day.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook—either live or in video posted later,” Facebook CEO Mark Zuckerberg wrote in a Wednesday post. “Over the next year, we’ll be adding 3,000 people to our community operations team around the world—on top of the 4,500 we have today—to review the millions of reports we get every week, and improve the process for doing it quickly.”


The jobs fall into a fast-growing field that UCLA assistant professor Sarah Roberts calls “commercial content moderation.” Roberts, one of the field’s early experts, has watched many tech companies hire content moderators out of the same necessity Facebook now faces. Companies need to keep offensive videos off the internet, but machines don’t yet have the discretion for the task and few people want to watch snuff films for a living.

“If the nature of the job is to remove content that other people are unwilling or unable to bear, you can imagine that makes it a pretty tough job, psychologically,” Roberts told The Daily Beast.

Turnover rates in the field are high. And for employees who remain, the work “leads inevitably to one of two cases,” Roberts said, “burnout or becoming desensitized, neither of which is a good outcome. It can manifest in a number of ways, going all the way to the case of the two Microsoft employees who are suing Microsoft for what they say are debilitating cases of PTSD that have rendered them completely disabled.”

In that suit, filed in December, both employees say they received no special training for working with highly disturbing material, and were severely traumatized after years of attempting to screen the worst videos from Microsoft’s cloud-based products. (Microsoft disputes the two former employees' claims.)

“Most people do not understand how horrible and inhumane the worst people in the world can be,” the former employees’ attorney wrote in their lawsuit. The employees claimed they had woefully inadequate access to mental health services and were denied workers’ compensation even after they became plagued with persistent anxiety, PTSD, and nightmares.

In an email, a Facebook spokesperson said the company already had a program in place to offer psychological support to its content screeners.

In their suit, the two Microsoft employees said their mental health support consisted of a “Wellness Plan,” which allegedly suggested that things like “limiting exposure to depictions, taking walks and smoking breaks, and redirection [of] his thoughts by playing video games would be sufficient to manage his symptoms.” One of the employees claimed to have been penalized on a performance review for playing too many video games.

Roberts said she had seen some similar programs, particularly in law enforcement offices, that advised content moderators to go outside, or even play a round of golf to unwind from their work. But many employees simply are not in a position to go to a golf course—many, in fact, do not work in the same country as the company that employs them.

“What’s not clear in [Facebook’s] statement and what’s not clear about this new batch of hiring, is what the nature of the relationship with these people will be to Facebook,” Roberts said. “It is likely that a very small portion, if any, of these new hires will be full-on Facebook employees. So what I suspect they will do is reach out to third-party contracting firms to fill this labor need.”

A Facebook spokesperson, speaking on background, wouldn’t say if any of the 3,000 new content screeners would enjoy full employee status, or if they would all be working on behalf of an outside contractor.

The distinction can be critical. Facebook offers generous benefit packages for full employees, and has recently extended some benefits to its “contractors and vendors who support Facebook in the US and do a substantial amount of work with us,” Facebook COO Sheryl Sandberg announced in 2015.

But large technology companies have increasingly outsourced their content moderation jobs to countries like the Philippines, where the minimum wage is lower and labor demands are easier to meet.

“As you can imagine, Facebook is typically not very forthcoming about that kind of information. I suspect that in order to practically double their extant workforce, they are going to have to scramble,” Roberts said. She noted that, despite the company’s futuristic goals, the tech giant is still relying on humans for its most unsavory jobs.

“We don’t see a solution being offered that has to do with automation, AI, machine learning, algorithms, or anything other than human intervention,” she said. “That flies in the face of a lot of the rhetoric we’ve been hearing, certainly over the past year or so.”
