Facebook wants its own “Supreme Court” to review controversial material and restore trust in the site. But trust in Facebook is so shaky that users don’t even trust it to pick the people who will pick its new oversight body.
Since November, Facebook execs have flirted with the idea of establishing a special oversight board that would rule on the site’s most challenging content questions. The board would weigh in on issues like harassment, hate speech, and safety violations—all areas where Facebook has been accused of making questionable rulings without enough transparency.
The board won’t be a magic bullet for Facebook’s PR woes, though. On Thursday, the company released the findings of an international study on its potential Supreme Court. The upshot: many Facebook users don’t trust Facebook to handle its own internal issues.
Conducted with more than 650 people from 88 countries, the study sought an approach that would satisfy as many Facebook users as possible. But as Facebook queried participants on the best way to staff a 40-person content board, a recurring problem emerged.
“Many of those who engaged in consultations expressed a degree of concern over a Facebook-only selection process, but feedback was split on an alternative solution,” Facebook’s findings read. “An intermediate ‘selection committee’ to pick the Board could ensure external input, but would still leave Facebook with the task of ‘picking the pickers.’”
Many participants wanted Facebook removed from the selection process entirely, the study found. Even letting board members pick their own replacements struck some as a problem: “Others questioned the proposal to leave future selection up to the Board itself, as this could result in a ‘recursion problem’ and the ‘perpetuation of bias.’”
“At the same time,” the study found, “others recognized the efficiency of Facebook’s proposed approach, which would avoid ‘the Kafkaesque process of drafting a separate committee to pick the… committee.’”
But with more than 2 billion users spread across virtually every country, Facebook will be hard-pressed to select a 40-person board that adequately represents every user (87 percent of study participants said they wanted a board that was rich in cultural or linguistic knowledge).
Currently, Facebook employs an army of contractors who sift through the site’s worst content, often making hurried decisions based on dubious guidelines. These moderators often work in secret, forbidden from discussing the traumatic pictures and videos they screen every day. The gated community of low-wage workers doesn’t help Facebook’s reputation for limited transparency. The proposed oversight board would act as an appeals process for cases that remain controversial even after moderators review them.
Facebook has previously made noise about democratizing its policy changes. In 2012, the company invited all users to vote on proposed policy changes, with the results binding only if 30 percent of all users weighed in. Fewer than 1 percent voted, voiding the whole process.
The company also tried handing its “trending topics” section over to a dedicated news team, which was supposed to combat the site’s misinformation problem. After reports of poor management and sexism, Facebook canned the team, and the platform almost immediately resumed promoting hoaxes in trending topics.