Facebook Workers, Not an Algorithm, Will Look at Volunteered Nude Photos First to Stop Revenge Porn

Reports this week said a Facebook pilot program would let users volunteer nudes to an algorithm to stop revenge porn—but those nudes will be viewed by a human at the company first.

This week, multiple outlets reported on a Facebook pilot scheme that aims to combat revenge porn. In the program, users send themselves a message containing their nude images; Facebook then creates a fingerprint of each image and uses it to stop others from uploading identical or similar pictures.

The approach closely resembles how Silicon Valley companies tackle child abuse material, but with a key difference: there is no already-established database of non-consensual pornography to check against.

According to a Facebook spokesperson, Facebook workers will first have to review full, uncensored versions of the nude images volunteered by users, to confirm the images violate the company’s policies before malicious posts by other users can be blocked as revenge porn.

Tackling the pervasive problem of revenge porn without someone examining the images, while also making sure the system is not used to pull down legitimate images, is a difficult task. But the manual nature of Facebook’s new process is still something users of the world’s biggest social network may want to be aware of before sharing nude pictures in order to stop a revenge porn attack.

As The Guardian reported on Tuesday, the process for Facebook’s Australia-focused pilot starts when a user completes an online form with the country’s eSafety Commissioner. The user then sends the images they would like flagged to themselves on Facebook Messenger, and an analyst opens each image and creates a fingerprint of it. That fingerprint is what the company will use to track any identical or similar images in the future, as the sketch below illustrates.
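Facebook has not published the technical details of that matching step. As a rough illustration only, the following Python sketch shows the general pattern of checking a new upload against a database of image fingerprints, using the open-source Pillow and imagehash libraries; the file names and distance threshold here are invented, and Facebook’s actual system is unknown.

```python
# Hypothetical sketch: Facebook has not disclosed how its matching works.
# This illustrates comparing a new upload against stored image
# fingerprints, using the open-source Pillow and imagehash libraries.
# File names and the distance threshold are invented for illustration.
from PIL import Image
import imagehash

# Fingerprints of images a user has asked to have blocked.
blocked_fingerprints = [
    imagehash.average_hash(Image.open("volunteered_photo.jpg")),
]

MAX_DISTANCE = 5  # assumed tuning value, measured in differing bits


def should_block(upload_path: str) -> bool:
    """Return True if the upload's fingerprint is close to any blocked one."""
    candidate = imagehash.average_hash(Image.open(upload_path))
    return any(candidate - fp <= MAX_DISTANCE for fp in blocked_fingerprints)


print(should_block("new_upload.jpg"))
```

Comparing by distance rather than exact equality is what would let such a check catch copies that have been resized or lightly edited, a point the company’s security chief touches on below.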

What that and other explanations do not necessarily make clear, however, is that prior to making that fingerprint, a worker from Facebook’s community operations team will actually look at the uncensored image itself to make sure it really is violating Facebook’s policies. A Facebook spokesperson described the process on background, meaning The Daily Beast cannot name or directly quote them.

Facebook will retain these images for a period of time to make sure the company is enforcing those policies correctly. During that period, the images will be blurred and available only to a small number of people, according to the Facebook spokesperson. By that point, however, an individual Facebook employee will already have examined the unblurred versions.

Facebook already employs trained workers to address revenge porn complaints, as well as monitor for other extreme or illegal content.

Realistically, there is no immediately obvious way for Facebook to address this issue with an algorithm alone. If Facebook fingerprinted the images without a human reviewing them first, it’s easy to see how someone might exploit the service to censor images that have nothing to do with revenge porn.

The company already examines photos that users flag as revenge porn and deals with them on a case-by-case basis. These new measures would serve as a more preventive approach.

“It is absolutely necessary for images to be reviewed by a person when introduced into the checked-for dataset, otherwise it would be trivial for someone to abuse this process to censor images,” Nicholas Weaver, a senior researcher at the International Computer Science Institute in Berkeley, California, told The Daily Beast. Weaver pointed to the iconic photo of Tank Man in Tiananmen Square as an example of an image a censor might want scrubbed.

With PhotoDNA, a technology that various tech companies use to identify child abuse material, there already exists a massive database of offending images, all of which have been reviewed by a human at some point as well.

“The only difference is the human review is often based on the fruits of criminal search warrants or investigations rather than user-submitted content,” Weaver added.

Facebook is likely not simply creating a “hash,” a cryptographic digest that uniquely identifies a file’s exact contents, and using those hashes to check for new revenge porn uploads. With that approach, offenders could simply resize or otherwise alter the image, producing a new hash that would fly under Facebook’s radar.

“There are algorithms that can be used to create a fingerprint of a photo/video that is resilient to simple transforms like resizing,” Facebook’s Chief Security Officer Alex Stamos tweeted at this reporter on Tuesday.
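Stamos did not specify which algorithm Facebook uses. As a rough sketch of the general idea he describes, perceptual hashes like those in the open-source imagehash library behave this way, while an ordinary cryptographic hash does not; the file name below is a placeholder.

```python
# Illustrative only: Facebook has not said which fingerprinting algorithm
# it uses. This contrasts a cryptographic hash, which changes completely
# when an image is resized, with a perceptual hash, which barely moves.
# Requires the Pillow and imagehash packages; "photo.jpg" is a placeholder.
import hashlib

from PIL import Image
import imagehash

original = Image.open("photo.jpg")
resized = original.resize((original.width // 2, original.height // 2))

# A cryptographic digest fingerprints the exact bytes, so any edit
# produces an unrelated value.
print(hashlib.sha256(original.tobytes()).hexdigest())
print(hashlib.sha256(resized.tobytes()).hexdigest())  # completely different

# A perceptual hash is derived from the image's visual content, so a
# resized copy lands a small Hamming distance away from the original.
fp_original = imagehash.average_hash(original)
fp_resized = imagehash.average_hash(resized)
print(fp_original - fp_resized)  # typically 0-2 bits for a simple resize
```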

The service is a way for the company to tackle the inherent risk of revenge porn on a site used by billions of people. If there is a risk of a malicious ex-boyfriend or girlfriend dumping someone’s nudes online, that person may understandably have less of an issue with a Facebook worker seeing their image in an effort to stop it from spreading.

No data provided to a company is totally protected. As the recent deletion of President Trump’s Twitter account by a contractor proves, there are people behind the companies powering the internet and our connectivity.

Ultimately, it is up to the individual user to decide what they would rather do, and what they would like protection from.
