Russian-funded covert propaganda posts on Facebook were likely seen by a minimum of 23 million people and might have reached as many as 70 million, according to an analysis by an expert on the social-media giant’s complex advertising systems. That means up to 28 percent of American adults may have been swept up in the campaign.
On Wednesday, Facebook’s chief security officer, Alex Stamos, revealed that Russia had “likely” used 470 fake accounts to buy roughly 3,000 ads—about $100,000 worth of advertising—promoting “divisive social and political messages” to Americans. It was Facebook’s first public acknowledgment of the role it unwittingly played in the Kremlin’s “active measures” campaign. Stamos’ statement was also conspicuous for what it omitted: Facebook has refused to release the ads. More significantly, it hasn’t said what kind of reach Russia attained with its ad buy.
There may be a reason for that. On the surface, $100,000 is small change in contemporary national politics, and 3,000 ads sounds like a drop in the bucket when Facebook boasts 2 billion monthly users. But it turns out $100,000 on Facebook can go a surprisingly long way, if it’s used right. On average, Facebook ads run about $6 per 1,000 impressions. At that rate, the Kremlin’s $100,000 buy would get its ads seen nearly 17 million times.
But that average hides a lot of complexity, and the actual rate can range from $1 to $100 for 1,000 impressions on an ad with pinpoint targeting. Virality matters, too. Ads that get more shares, likes, and comments are far cheaper than boring ads that nobody likes, and ads that send users to Facebook posts instead of third-party websites enjoy an additional price break. Finally, there are network effects, which can vastly multiply the number of users who see a promoted Facebook post.
“If they got a super high engagement rate, they’re not only going to get the traffic that you get from paying, but you get this viral factor that can multiply it,” said Dennis Yu, CTO and co-founder of BlitzMetrics, an ad agency that deals exclusively with Facebook ads. “You buy one impression and you get 20 additional impressions.”
Politically charged posts—especially salacious ones—draw high engagement, said Yu. “With certain topics it’s kind of hard to lose. ‘Hillary Clinton got caught hiding something!’ Well, OK. I already believe she hides stuff.” Yu said he suspects Russia maximized its impact with a basic strategy practiced by Facebook marketers: Seed a new Facebook post with a tiny buy as low as $1 a day, then watch Facebook’s ad console and see if the post catches fire. If it doesn’t, write it off and start on the next post. But if people begin engaging with the post in a serious way, you go all in.
“One out of every 100 posts, you’re going to get that home run,” said Yu, whose clients include GoDaddy and the Golden State Warriors. “Then you’re going to boost the heck out of that sucker. You’re going to put $10,000 on it. And Facebook’s algorithm already knows who to show it to, like the friends of the people who already liked it… It’s a risk-free lottery. The minimum cost is $1 per day.”
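The economics of that test-and-boost loop are simple enough to sketch. The figures below come from Yu’s description—a $1-a-day seed budget, roughly one hit per 100 posts, a $10,000 boost on the winner—while the one-week test window for each post is an assumed value for illustration, not something Yu or Facebook specified.

```python
# Back-of-envelope sketch of the test-and-boost economics Yu describes.
# All parameters are illustrative assumptions, not figures from Facebook.

SEED_COST_PER_DAY = 1     # Yu: "The minimum cost is $1 per day"
TEST_DAYS = 7             # assumed length of each test run
POSTS_TESTED = 100        # Yu: roughly 1 in 100 posts is a "home run"
BOOST_BUDGET = 10_000     # Yu: "put $10,000 on it" once a post catches fire

testing_cost = SEED_COST_PER_DAY * TEST_DAYS * POSTS_TESTED
total_cost = testing_cost + BOOST_BUDGET

print(f"Seeding {POSTS_TESTED} test posts: ${testing_cost:,}")   # $700
print(f"Boosting the single winner: ${BOOST_BUDGET:,}")          # $10,000
print(f"Total spend: ${total_cost:,}")                           # $10,700
```

Under those assumptions, testing is a rounding error: almost the entire budget rides on the handful of posts that catch fire.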
One now-shuttered Facebook page provides evidence Russia was following this strategy. Called SecuredBorders, the page positioned itself as the work of a group of Americans concerned about U.S. border security. “America is at risk and we need to protect our country now more than ever, liberal hogwash aside,” read the tagline. But a March article by the respected Russian news outlet RBC revealed the page was created and run by the St. Petersburg-based Internet Research Agency, identified by a January U.S. intelligence report as a farm of “professional trolls” financed by a Vladimir Putin ally.
It’s unclear how many pages like SecuredBorders Russia ran, but that group alone had 133,000 followers before it disappeared last month, almost certainly as part of Facebook’s purge of 470 deceptive Russian accounts and, reportedly, 25 Facebook communities with a cumulative 3 million subscribers.
Though it’s gone from Facebook, web caches still provide a limited view of the page, and it’s clear there’s a lot of nasty dirt hiding behind Facebook’s sanitary “divisive social and political messages” talk. The page spewed a steady stream of alt-right political memes and fake news, nearly always accompanied by a GIF or a video and an explicit or implicit call for users to engage with the post.
“This woman is a crook. A sociopath. A heartless, cold b*tch,” reads one of the posts from election season, above a photoshopped image of Hillary Clinton standing for a mugshot. “We’ve had more than enough traitors and crooks in the White House already! Don’t you agree?”
Clinton, or “Killary,” as the page preferred to call her, was a frequent target, while Donald Trump drew consistent praise before the election and since. Other posts railed against U.S. Muslims, whom the page falsely accused of busily indoctrinating American grade-school students.
Some posts went after undocumented immigrants, implying they cross into the U.S. from Mexico to commit crimes and vote Democratic by the thousands. Syrian refugees were routinely painted as freeloaders who mostly “really hate America.” One refugee smear got 3,000 “likes,” more than 200 comments, and was shared 1,300 times, while videos tended to draw higher numbers, starting in the low thousands and climbing to 30,000.
According to RBC’s investigation, SecuredBorders had bigger hits, like a single post boosted through Facebook ads that was seen by 4 million people, shared 80,000 times, and accrued 300,000 likes. The torrent of posts with meager numbers was likely just a means to achieve the occasional jackpot post like this one, worth boosting with ad money, said Yu. “You can see the evidence of their testing. They’re putting out a lot of stuff.”
If Russia was following that strategy, the impact could be huge. Yu provided a possible range for the reach of Russia’s $100,000 spend. At the Facebook average rate of $6 per 1,000 ad impressions, Russia’s ads would have displayed 16.7 million times.
At the other end of the spectrum, if Russia took full advantage of the intricacies of Facebook’s ad-bidding system, it might have driven those costs as low as $2 per 1,000 impressions, Yu said, giving it 50 million impressions. Assuming the promoted posts enjoyed a 20x viral boost—less than a thrilling sports moment, more than a cosmetics ad—that would translate to between 333 million and 1 billion impressions for Russia’s propaganda, he said.
The pool of distinct people who saw those posts would be much smaller than the impression count, because of Facebook’s targeting algorithm: if someone engages with an anti-Muslim ad, Facebook is more likely to show that same person the anti-immigrant ad that follows, so impressions pile up on the same users. The collective reach of the ads would be about 7 percent of the impressions, Yu said, for a range of 23 million to 70 million people touched by Russia’s Facebook campaign.
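Yu’s back-of-envelope math can be reproduced from the figures cited above—a $100,000 budget, effective rates of $2 to $6 per 1,000 impressions, a 20x viral multiplier, and unique reach at roughly 7 percent of impressions. The sketch below simply chains those numbers together; the variable names are illustrative, and the 7 percent de-duplication factor is Yu’s rule of thumb, not a Facebook-published figure.

```python
# Reproducing Yu's reach estimate from the figures cited in the article.
# CPM = dollars per 1,000 paid impressions.

BUDGET = 100_000          # reported Russian ad spend, in dollars
VIRAL_FACTOR = 20         # assumed free impressions per paid impression
REACH_FRACTION = 0.07     # Yu: unique people are roughly 7% of impressions

def estimate(cpm):
    paid = BUDGET / cpm * 1_000        # impressions bought outright
    total = paid * VIRAL_FACTOR        # paid plus viral impressions
    people = total * REACH_FRACTION    # de-duplicated reach
    return paid, total, people

for label, cpm in [("average pricing ($6 CPM)", 6), ("optimized pricing ($2 CPM)", 2)]:
    paid, total, people = estimate(cpm)
    print(f"{label}: {paid/1e6:.1f}M paid impressions, "
          f"{total/1e6:.0f}M total impressions, ~{people/1e6:.0f}M people reached")
# average pricing ($6 CPM): 16.7M paid impressions, 333M total impressions, ~23M people reached
# optimized pricing ($2 CPM): 50.0M paid impressions, 1000M total impressions, ~70M people reached
```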
Those numbers are just an estimate. The exact reach of Russia’s campaign remains a mystery that only Facebook can solve. Then there’s the question of targeting. Stamos said a quarter of the 3,000 ads—approximately 750—were “geographically targeted” to audiences in specific places. Most of the geo-targeting occurred in 2015, rather than 2016, Stamos said. It’s worth pointing out that Facebook, an advertising giant, has evolved its ad systems considerably since June 2015, when the period covered by the company’s review began.
But Stamos was silent on which physical areas—and audiences—were targeted, particularly as the election loomed. Clinton lost Michigan by 11,612 votes and Wisconsin by just over 27,000.
Knowledgeable Facebook sources who would not speak on the record indicated that Facebook continues to review potential misuse of its platform. Part of that study includes examining the reach of the fraudulent accounts pushing inflammatory or propagandistic content. In a statement, a company representative told The Daily Beast: “We are continuing to cooperate with the relevant investigative authorities. We don’t have more to share about this now.”
It’s possible that Russia also used less overt tactics on Facebook that have yet to be discovered. It is also unclear which of Facebook’s demographic-targeting features were used to push ads to specific people and voting blocs.
Facebook lets advertisers limit a boosted post’s audience using basic signifiers like age and gender, along with more complex indicators like interests and probable income, which it can infer from collected marketing data and even a user’s internet speed. The social-media juggernaut’s algorithm can also home in on people in specific places—swing states, vulnerable districts, or particular parts of cities.
For his part, Yu doubts that Russia, or anyone else, did much data crunching. “Facebook’s AI is super intelligent now. You don’t have to import email addresses and phone numbers, ZIP codes and names,” he said. “Five years ago I was the Man. People used to come to me for targeting. Now, there’s nothing better than Facebook’s data.”