The Tortured Souls Cleansing the Internet of Illicit Sex and Violence

The new documentary ‘The Cleaners’ explores the shadowy profession of internet ‘cleaners’—outsourced employees hired by tech giants to sanitize the internet.


“One of the misconceptions is that human nature is human nature, and the technology is just this neutral tool,” says Tristan Harris, a former Google design ethicist, interviewed in The Cleaners, a new documentary from PBS’ Independent Lens. “But this is not true. Because technology does have a bias, and that bias has a goal.”

The exploration of that bias—inherent in the way social media giants like Facebook, Google, and Twitter moderate content—and of the often horrifying, at best inflammatory, content that makes it onto those platforms as a result forms the basis of The Cleaners, premiering November 12 on PBS.

In recent years, the power of social media to propagate hatred and spread graphic images of violent acts has reached an all-time high; clips from the first few minutes of the doc include Mark Zuckerberg extolling the ability of Facebook to “[give] people the power to share anything with anyone,” and, of course, Donald Trump gleefully admitting to followers at a rally in Phoenix, “If I didn’t have social media I probably wouldn’t be able to get the word out, I wouldn’t be standing here!”

Despite lofty goals at its inception, the internet has never been a peaceful utopia for users—instead, it’s become a bizarre hybrid of memes, porn and extremist ideologies (those three are not mutually exclusive, either), spanning everything from platforms helmed by tech giants like Facebook to the more insidious dark web. That’s where the titular “cleaners,” or content moderators, come in. Outsourced employees—in this case, in the Philippines—are hired by the likes of Google or Twitter to rid the platforms of offensive or exploitative images, and must comb through a minimum of 25,000 images or videos per day, deleting those that violate community guidelines and ignoring the ones that don’t.

The documentary relies heavily on email extracts from current and former content moderators, most of whom spoke to filmmakers Hans Block and Moritz Riesewieck on the condition of anonymity due to restrictive NDAs. In one, a moderator summarizes the job: “I stop the spreading of child exploitation. I have to identify terrorism. Have to stop the cyberbullying. Algorithms can’t do what we do.”

It’s a Sisyphean task, for sure—for every one video of an airstrike or sexually explicit image of a child that’s deleted, another five to ten are bound to replace it. But more important is the question of what’s allowed to remain on certain platforms—and what isn’t. Nicole Wong, a former policymaker for Google and Twitter, describes the two most prevalent methods of moderating content on digital platforms: a more severe approach, in which everything posted to the platform is screened, and a more “democratized” approach, in which only flagged items are examined—allowing a lot to slip through the cracks. The easiest way to approach content moderation, Wong explains, is by asking, “What’s the vision for what should be on your platform? What don’t you want in your community?”

The internet can be a great resource, of course, but more often than not, it’s a terrible place. A young content moderator interviewed by the filmmakers haltingly shares the most disturbing thing she’s ever seen while moderating: a video of a six-year-old girl engaging in sex acts with adult men. The video was “unforgivable for me to see,” she says. The mental wellbeing of these content moderators, mostly young men and women from countries like the Philippines, is largely ignored in favor of maintaining a “safer” internet.

Up until recently, the work of content moderators was largely unknown. “Commercial content moderators labor in the shadows of social-media platforms,” Sarah Roberts, a professor of information studies at UCLA, says in the documentary. But recent incidents involving Facebook or Twitter’s decisions to remove—or more controversially, leave up—potentially inflammatory or exploitative content have drawn attention to the way these platforms moderate content, and how this influences everything from politics to pop culture.

The documentary deftly highlights the apparent hypocrisy that governs much of Facebook, Twitter and Instagram’s content-moderation policies. Instagram in particular has gained notoriety in recent years for immediately taking down photos that feature nudity—especially exposed female nipples—while allowing expressions of hate speech and bomb threats to remain on its platform for days, if not indefinitely (most recently, right-wing troll Milo Yiannopoulos took to Instagram to share his thoughts on the recent pipe-bomb scare—and his dismay that The Daily Beast didn’t receive one in the mail).

Artist Illma Gore, who created the infamous portrait of Trump with an, uh, especially small penis, is portrayed as a victim of Facebook’s overzealous, at times puritanical, content-moderation policy: after her portrait went viral, the post was abruptly removed, and her Facebook page, which constituted an important part of her livelihood as an artist, was shut down. Gore considers her portrait satirical and not inherently pornographic, but for a young content moderator, it’s a different story. One such moderator tells the filmmakers that the “number one grave error” she could make while on the job would be approving a nude photo, regardless of context: “boobs or male genitals—totally unacceptable.”

Content-moderation policies are, of course, meant to weed out truly disturbing or controversial content from major digital platforms. But, as The Cleaners so skillfully points out, sometimes things aren’t so black and white. Gore’s satirical portrait was removed from Facebook for “violating community guidelines,” and footage of airstrikes and bombings in Syria is often removed for the same reason, despite the fact that NGOs like Airwars try to preserve it as documentation of the atrocities in order to help civilians on the ground.

Then, of course, there’s the political side of Facebook and Twitter. Executives from those companies and Google were brought before congressional committees in 2017 in the wake of alleged Russian interference in the 2016 election, and have continued to face questioning from Senate committees regarding privacy practices and content moderation.

While Facebook and Twitter (and sometimes Instagram) have reluctantly made efforts to clean up their act (including banning right-wing trolls and conspiracy theorists like Alex Jones from their platforms), the documentary reveals that the companies don’t practice what they preach outside of the U.S. In politically volatile areas like Turkey, for example, Facebook engages in “geo-blocking”—a controversial practice that involves restricting certain content to users within a specific geographic area. In Turkey, this means that dissenting political content created by Turkish citizens and posted to Facebook—including political cartoons satirizing President Erdogan, rallying cries, protest footage, etc.—cannot be viewed by Turkish citizens themselves. According to the documentary, Facebook instituted geo-blocking after being ousted from the Turkish market, as part of a deal with the government to be reinstated in the country.

The internet, especially the dark web, knows no bounds, and The Cleaners makes a valiant attempt to cover the most pressing issues surrounding content moderation. It is, however, only a 90-minute documentary, so explorations of issues like the popularity of anti-Rohingya propaganda on Facebook in Myanmar, or even the domestic threat of right-wing extremists sharing hate speech online, seem rushed.

Increasingly, social media has been co-opted by right-wing extremists, often after they’ve been booted off other websites like the alt-right favorite Gab. One such extremist, who goes by the name Sabo, has gained a significant following on social media. Ranting against “goat-fucking” refugees from “fifth-world shitholes,” Sabo doesn’t see Facebook’s efforts to crack down on hate speech as a problem. “When people say hate speech isn’t free speech, or free speech isn’t hate speech, they are completely wrong,” he claims, citing the First Amendment and Facebook’s accessibility as reasons for him to share his views online. And he’s not worried about fallout from leftists.

“Some leftist thinks you’re being politically incorrect,” Sabo says. “We’re the ones with the guns, the bullets and the training. And as soon as we can start shooting, believe me, all that shit’s gonna go away really fast.”

The Cleaners ends on a poignant note, exploring the effects the job has on the content moderators. Some don’t seem to be bothered by it: “I’ve seen hundreds of beheadings,” says a moderator whose specialty is identifying and deleting ISIS videos and propaganda from U.S. platforms. But others aren’t so lucky. One content moderator, after repeatedly requesting a transfer from his superiors, eventually hanged himself. His specialty? Taking down self-harm videos and suicide livestreams.
