Facebook Demotes ‘Miracle Cures’ and Other Health Disinformation

BAND-AID

The world's biggest social platform continues to play whack-a-mole with posts and pages pushing bogus wellness claims.


It's the kind of clickbait that can kill you. And now, it's going to be a little harder to find on Facebook.

In a shift meant to minimize the spread of misleading health-related content, Facebook announced Tuesday that it had made an algorithm tweak to demote “sensational or misleading” posts around the topic.

In a Newsroom post, the company disclosed that the newly announced News Feed change had actually gone into effect quietly last month, targeting posts that make "exaggerated" health, nutrition, and fitness claims, as well as content promoting the sale of related products and services.

Facebook says that it now evaluates health-related posts to see if they overstate their claims or mislead users, as in the case of “making a sensational claim about a miracle cure.”

The company also said that it will examine promotional posts selling unproven health and wellness products, such as weight loss pills.

Facebook has come under fire recently for the viral spread of “anti-vaxx” content disseminating potentially deadly medical misinformation across its platform. In March, the company confronted concerns around misleading vaccine posts, announcing that it would downrank groups and posts sharing such content to make it more difficult to find, both on Facebook and Instagram.

In choosing to hide or minimize health misinformation rather than remove it outright, Facebook is opting to treat this kind of content like clickbait: an annoying phenomenon, but one that usually carries lower real-world stakes. The methodology is the same, too. Facebook identifies key phrases associated with bogus health claims and uses those phrases to identify and downrank future content.
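The phrase-matching approach described above can be sketched in a few lines. This is purely illustrative, not Facebook's actual system: the flagged phrases, the demotion multiplier, and the `rank_score` function are all hypothetical stand-ins for what a real ranking pipeline would do at far larger scale.

```python
# Illustrative sketch (NOT Facebook's real code) of key-phrase-based demotion:
# a post that matches a flagged phrase keeps circulating, but with a reduced
# ranking score, rather than being removed outright.

# Hypothetical phrases associated with exaggerated health claims.
HEALTH_CLAIM_PHRASES = {"miracle cure", "lose weight fast", "doctors hate"}

DEMOTION_FACTOR = 0.2  # assumed multiplier; demoted, not deleted


def rank_score(post_text: str, base_score: float) -> float:
    """Return a demoted score if the post contains a flagged phrase."""
    text = post_text.lower()
    if any(phrase in text for phrase in HEALTH_CLAIM_PHRASES):
        return base_score * DEMOTION_FACTOR
    return base_score


print(rank_score("This MIRACLE CURE melts fat overnight!", 1.0))  # 0.2
print(rank_score("CDC releases new vaccine guidance", 1.0))       # 1.0
```

The key design choice mirrored here is demotion versus deletion: the content still exists and can be found directly, it just surfaces less often in feeds.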

Without sufficient human moderators dedicated to the task, an over-reliance on algorithmic filtering will always have blind spots. For example, the search "Are vaccines safe?" yields medically accurate results, but a search for popular anti-vaxx figure Andrew Wakefield leads to multiple Facebook pages with tens of thousands of followers, one of which made almost 20 posts casting doubt on vaccine safety within the last 24 hours.

From that page, Facebook users are encouraged to like a host of other anti-vaccine pages, including “Vaccine facts” and the Alabama Coalition for Vaccine Rights, a group posting links denouncing public figures who “normalize vaccine injury.” For users who've already dipped their toes into conspiracy theories, finding more posts and pages that confirm their existing biases remains easy.

Facebook regularly updates its rules to reduce potentially dangerous content, but those policy changes can be a blunt instrument. As Vice reports, in Facebook's effort to eradicate opioid sales on its platform, the company has also removed groups that support drug users and blocked ads for potentially life-saving fentanyl test kits.

Health misinformation is one social media ill that isn't unique to Facebook. Between the health-and-wellness SEO land grab playing out in Google's search results and viral YouTube culture, Google has an equal if not greater problem on its hands.

Among prominent social platforms, Pinterest is the only company that's taken a hardline approach: It blocked search results related to vaccines altogether. 

“It’s better not to serve those results than to lead people down what is like a recommendation rabbit hole,” Pinterest Public Policy Manager Ifeoma Ozoma said of the decision.
