Tech

Facebook Apologizes For, but Doesn’t Retract, Discriminatory ‘Real Name’ Policy

NOT BUDGING

Facebook sparked outrage with its ‘real name’ policy—which many felt unfairly targeted drag queens and the LGBTQ community. Now the company is apologizing—but not changing.

Jonathan Nackstrand/AFP/Getty

“Don’t fuck with drag queens,” San Francisco politico Tom Temprano wrote last month. “Drag queens start riots, drag queens publicly highlight the deficiencies of others for a living, and most importantly drag queens generate more compelling and interactive content than just about any other users on your platform.”

And apparently, drag queens can even make the 22nd-largest company in the world say uncle.

That’s the news from Silicon Valley today: Facebook has apologized for its “Real Name Policy,” which outraged not only drag queens and transgender folk who use their chosen, rather than given, names but also domestic abuse survivors, people with nosy bosses, college professors, and LGBT people in countries where the government uses social media to entrap and imprison them.

Apologized—but not retracted. Apparently, says Facebook, this was all just a big misunderstanding. Per Chris Cox, Facebook’s chief product officer, Facebook doesn’t have a drag problem—it has a dragnet problem.

That’s the gist of it, anyway, according to Cox’s 592-word post. Here are some of those words:

I want to apologize to the affected community of drag queens, drag kings, transgender, and extensive community of our friends, neighbors, and members of the LGBT community for the hardship that we’ve put you through in dealing with your Facebook accounts over the past few weeks…

We owe you a better service and a better experience using Facebook, and we’re going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were.

The way this happened took us off guard. An individual on Facebook decided to report several hundred of these accounts as fake. These reports were among the several hundred thousand fake name reports we process every single week, 99 percent of which are bad actors doing bad things: impersonation, bullying, trolling, domestic violence, scams, hate speech, and more—so we didn’t notice the pattern…

Our policy has never been to require everyone on Facebook to use their legal name. The spirit of our policy is that everyone on Facebook uses the authentic name they use in real life…

We believe this is the right policy for Facebook for two reasons. First, it’s part of what made Facebook special in the first place, by differentiating the service from the rest of the internet…  Second, it’s the primary mechanism we have to protect millions of people…  from real harm. The stories of mass impersonation, trolling, domestic abuse, and higher rates of bullying and intolerance are oftentimes the result of people hiding behind fake names, and it’s both terrifying and sad…

All that said, we see through this event that there’s lots of room for improvement in the reporting and enforcement mechanisms, tools for understanding who’s real and who’s not, and the customer service for anyone who’s affected. These have not worked flawlessly and we need to fix that…

Well, kudos to Cox and Facebook’s phalanx of lawyers and copywriters for squaring the circle and writing a perfect apology without admission.

Note the weasel words (a term they do actually teach in law school): “affected community.” This was not a targeted enforcement. “Fix the way this policy gets handled.” The policy stays the same, but the enforcement of it will change. “We didn’t notice the pattern.” Because they don’t read Valleywag, SFist, or the other Silicon Valley-focused news sites that covered this story.

The culprit, really, is the algorithm. As everyone who’s ever tried to actually contact Facebook Support knows, most of the site runs on automatic pilot. There’s no little elf sorting your news feed for you; there are complex algorithms run by machines. The same, according to Cox, goes for reports of fake names. Sister Rosa just got caught in the dragnet meant for child molesters and con men.

The other culprit, of course, is the one misanthropic jerk who reported “several hundred” names to the Facebook robot. Maybe it’s a homophobe. Maybe it’s a jealous drag queen. Maybe it’s the person in my Secret cohort who confessed to being the fink. (Unlikely—everyone lies on Secret.) Probably we’ll never know.

What this tempest in a teapot really points to is the vulnerability of relying on algorithms in massive operations like Facebook’s. I take Cox at his word that “99 percent” of reports really are of bad people. But even if that’s true, one percent of “several hundred thousand” reports a week still leaves several thousand innocent people every week caught in the Facebook dragnet.

As I wrote in these pages earlier, that includes many of my friends, whose dignity was compromised when Facebook denied them the choice to use their chosen names. Many have, indeed, migrated to Ello.

But more worryingly, those several thousand per week likely include the vulnerable groups mentioned above: people hiding from stalkers, jealous exes, intrusive bosses, or homophobic governments and families. Since publishing my earlier article, I’ve heard stories of people “outed” to their families and jobs, with serious consequences.

We’ve also been treated to Mark Zuckerberg’s privileged, insensitive, and ignorant statement, made back in 2010, that “having two identities for yourself is an example of a lack of integrity.” Easy for a billionaire to say.

So, there are reasons to be wary of Cox’s machina culpa. Apology to the “affected community” notwithstanding, there remains the core question of how Facebook, and other companies that aggregate massive amounts of personal data, can really protect all of that information from unwanted disclosure.

Surely, the recent leak of cloud-stored celebrity photos, dubbed “The Fappening,” indicates that it is virtually impossible to do so, at least on a mass scale. Of course, companies and governments can pay for advanced cybersecurity. But when a financial model depends on millions of users, with mere algorithms patrolling the gates, lapses are inevitable.

Solution? I don’t think there is one.

It’s easy to tell victims of the Fappening and of Facebook’s policy to simply stop using services like social media networks and cloud storage. Or at least, “If you don’t want the picture to leak, don’t post the picture.”

But what such finger-wagging is really saying is that behavior deemed deviant by society (or some segments of it) deserves to be exposed, judged, and punished. After all, most square, straight, white-bread people who enjoy monogamous vanilla sex aren’t in danger of being stigmatized. But if you like having dirty pictures taken, or you use a weird name, or you’re gay in a country like Egypt, then too bad for you.

In the 21st century, we get to know everything—but we only judge some things. As long as that double standard persists, no algorithm can fix its injustice.
