
Facebook Execs Gave Up On Solutions After Finding Its Platform Was Fueling Division: WSJ

NOT A PRIORITY

The social-media giant is reported to have found its algorithms were aggravating polarization. Its top executives ended up ignoring the solutions.

Justin Sullivan/Getty

After an internal investigation found evidence that its platform was fueling divisiveness and polarization, Facebook abandoned efforts to fix the problem because of a lack of interest among the company’s top executives, The Wall Street Journal reports. The social-media giant, which boasts that its mission is to “connect the world,” reportedly launched a research project in 2017, led by then Chief Product Officer Chris Cox, to study how its algorithms amplify divisive and harmful content. The task force, named “Common Ground,” assigned employees to “Integrity Teams” throughout the company. The teams reportedly found that while some Facebook groups united people from various backgrounds, others only accelerated conflict and misinformation.

In alignment with CEO Mark Zuckerberg’s stance that Facebook stands for “free expression,” the company decided that it would not “build products that attempt to change people’s beliefs,” a 2018 document reportedly states. Facebook has faced a storm of criticism and scrutiny as its platform continues to accelerate the spread of conspiracy theories, partisan quarreling, and misinformation. According to the Journal report, a 2016 presentation told Facebook executives that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s Groups You Should Join and Discover algorithms: “Our recommendation systems grow the problem,” one slide is reported to have read.

Zuckerberg claimed this year that the company would push back against “those who say that new types of communities forming on social media are dividing us.” However, according to the Journal, he has also been heard arguing that online platforms have little influence over the matter. “Our algorithms exploit the human brain’s attraction to divisiveness,” one 2018 presentation read. “If left unchecked, more and more divisive content in an effort to gain user attention & increase time on the platform.”

Read it at The Wall Street Journal
