For David Sherratt, like so many teenagers, far-right radicalization began with video game tutorials on YouTube. He was 15 years old and loosely liberal, mostly interested in “Call of Duty” clips. Then YouTube’s recommendations led him elsewhere.
“As I kept watching, I started seeing things like the online atheist community,” Sherratt said, “which then became a gateway to the atheism community’s civil war over feminism.” Due to a large subculture of YouTube atheists who opposed feminism, “I think I fell down that rabbit hole a lot quicker,” he said.
During that four-year trip down the rabbit hole, the teenager made headlines for his involvement in the men’s rights movement, a fringe ideology whose adherents believe men are oppressed by women, and which he no longer supports. He made videos with a prominent YouTuber now beloved by the far right.
He attended a screening of a documentary on the “men’s rights” movement and hung out with other YouTubers afterward. There he met a young man who seemed “a bit off,” Sherratt said. Still, he didn’t think much of it, and ended up posing for a group picture with the man and other YouTubers. Some of Sherratt’s friends even struck up a rapport with the man online afterward, which prompted Sherratt to check out his YouTube channel.
What he found soured his outlook on the documentary screening. The young man’s channel was full of Holocaust denial content.
“I’d met a neo-Nazi and didn’t even know it,” Sherratt said.
The encounter was part of his disenchantment with the far-right political world he’d slowly entered during the last years of his childhood.
“I think one of the real things that made it so difficult to get out and realize how radicalized I’d become in certain areas was the fact that in a lot of ways, far-right people make themselves sound less far-right; more moderate or more left-wing,” Sherratt said.
Sherratt wasn’t alone. YouTube has become a quiet powerhouse of political radicalization in recent years, powered by an algorithm that a former employee says suggests increasingly fringe content. And far-right YouTubers have learned to exploit that algorithm and land their videos high in the recommendations alongside less extreme videos. The Daily Beast spoke to three men whose YouTube habits pushed them down a far-right path and who have since logged out of hate.
YouTube has a massive viewership, with nearly 2 billion daily users, many of them young. The site is more popular among teenagers than Facebook and Twitter. A 2018 Pew study found that 85 percent of U.S. teens used YouTube, making it by far the most popular online platform for the under-20 set. (Facebook and Twitter, which have faced regulatory ire for extremist content, are popular among a respective 51 and 32 percent of teens.)
Launched in 2005, YouTube was quickly acquired by Google. The tech giant set about trying to maximize profits by keeping users watching videos. The company hired engineers to craft an algorithm that would recommend new videos before a user had finished watching their current video.
Guillaume Chaslot, a former YouTube engineer, was hired onto a team that designed the algorithm in 2010.
“People think it’s suggesting the most relevant, this thing that’s very specialized for you. That’s not the case,” Chaslot told The Daily Beast, adding that the algorithm “optimizes for watch-time,” not for relevance.
“The goal of the algorithm is really to keep you online the longest,” he said.
That fixation on watch-time can be banal or dangerous, said Becca Lewis, a researcher with the technology research nonprofit Data & Society. “In terms of YouTube’s business model and attempts to keep users engaged on their content, it makes sense what we’re seeing the algorithms do,” Lewis said. “That algorithmic behavior is great if you’re looking for makeup artists and you watch one person’s content and want a bunch of other people’s advice on how to do your eye shadow. But it becomes a lot more problematic when you’re talking about political and extremist content.”
Chaslot said it was apparent to him then that the algorithm could help reinforce fringe beliefs.
“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said. “There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won’t even recommend them.”
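Neither Chaslot nor YouTube has published the production system, but the incentive he describes is easy to illustrate with a toy recommender. The sketch below is a minimal Python example with hypothetical titles and numbers: it ranks candidate videos purely by predicted watch time for a given viewer, and because topical relevance never enters the sort key, a debunking video the viewer tends to click away from can never reach the top of the list, no matter how relevant it is.

```python
# Toy illustration of watch-time-only ranking. This is not YouTube's actual
# system; the video titles and numbers below are hypothetical.

from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # model's estimate for this viewer
    relevance: float                # topical relevance, 0-1 (ignored by the ranker)

def rank_by_watch_time(candidates):
    """Order candidates purely by expected watch time for this viewer."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)

# A viewer deep in a Flat Earth bubble: confirming videos hold their attention,
# so a highly relevant debunking video still lands at the bottom of the list.
candidates = [
    Candidate("Flat Earth PROOF compilation", predicted_watch_minutes=14.0, relevance=0.6),
    Candidate("NASA is hiding the truth",     predicted_watch_minutes=11.5, relevance=0.5),
    Candidate("Why the Earth is round",       predicted_watch_minutes=1.2,  relevance=0.9),
]

for c in rank_by_watch_time(candidates):
    print(f"{c.predicted_watch_minutes:5.1f} min  {c.title}")
```

In this toy version there is simply no path by which the counter-content surfaces, which is the “no way out” dynamic Chaslot describes.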
Lewis and other researchers have noted that recommended videos often tend toward the fringes. Writing for The New York Times, sociologist Zeynep Tufekci observed that watching videos of Donald Trump led to recommendations for videos “that featured white supremacist rants, Holocaust denials and other disturbing content.”
Matt, a former right-winger who asked to withhold his name, was trapped in just such a filter bubble.
For instance, he described watching a video of Bill Maher and Ben Affleck discussing Islam, then being recommended a more extreme video about Islam by Infowars employee and conspiracy theorist Paul Joseph Watson. That video led to the next video, and the next.
“Delve into [Watson’s] channel and start finding his anti-immigration stuff which often in turn leads people to become more sympathetic to ethno-nationalist politics,” Matt said.
“This sort of indirectly sent me down a path to moving way more to the right politically as it led me to discover other people with similar far-right views.”
Now 20, Matt has since exited the ideology and built an anonymous internet presence where he argues with his ex-brethren on the right.
“I think YouTube certainly played a role in my shift to the right because through the recommendations I got,” he said, “it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”
That opposition to feminism and racial equality movements is part of a YouTube subculture that describes itself as “anti-social justice.”
Andrew, who also asked to withhold his last name, is a former white supremacist who has since renounced the movement. These days, he blogs about topics the far right views as anathema: racial justice, gender equality, and, one of his personal passions, the furry community. But an interest in video games and online culture was a constant over his past decade of ideological evolution. When Andrew was 20, he said, he became sympathetic to white nationalism after ingesting the movement’s talking points on an unrelated forum.
Gaming culture on YouTube turned him further down the far-right path. In 2014, a coalition of trolls and right-wingers launched Gamergate, a harassment campaign against people they viewed as trying to advance feminist or “social justice” causes in video games. The movement had a large presence on YouTube, where it convinced some gamers (particularly young men) that their video games were under attack.
“It manufactured a threat to something people put an inordinate amount of value on,” Andrew said. “‘SJWs’ [social justice warriors] were never a threat to video games. But if people could be made to believe they were,” then they were susceptible to further, wilder claims about these new enemies on the left.
Matt described the YouTube-fed feelings of loss as a means of radicalizing young men.
“I think the anti-SJW stuff appeals to young white guys who feel like they're losing their status for lack of a better term,” he said. “They see that minorities are advocating for their own rights and this makes them uncomfortable so they try and fight against it.”
While in the far-right community, Andrew saw anti-feminist content act as a gateway to more extreme videos.
“The false idea that social justice causes have some sort of nefarious ulterior motive, that they're distorting the truth somehow” can help open viewers to more extreme causes, he said. “Once you've gotten someone to believe that, you can actually go all the way to white supremacy fairly quickly.”
Lewis identified the community as one of several radicalization pathways “that can start from a mainstream conservative perspective: not overtly racist or sexist, but focused on criticizing feminism, focusing on criticizing Black Lives Matter. From there it’s really easy to access content that’s overtly racist and overtly sexist.”
Chaslot, the former YouTube engineer, said he suggested the company let users opt out of the recommendation algorithm, but claims Google was not interested.
Google’s chief executive officer, Sundar Pichai, paid lip service to the problem during a congressional hearing last week. When questioned about a particularly noxious conspiracy theory about Hillary Clinton that appears high in searches for unrelated videos, the CEO made no promise to act.
“It’s an area we acknowledge there’s more work to be done, and we’ll definitely continue doing that,” Pichai said. “But I want to acknowledge there is more work to be done. With our growth comes more responsibility. And we are committed to doing better as we invest more in this area.”
But while YouTube mulls a solution, people are getting hurt.
On Dec. 4, 2016, Edgar Welch fired an AR-15 rifle inside a popular Washington, D.C., pizza restaurant. Welch believed Democrats were running a child sex-trafficking ring out of the pizzeria’s basement, a conspiracy theory known as “Pizzagate.”
Like many modern conspiracy theories, Pizzagate proliferated on YouTube, and those videos appeared to influence Welch, who sent them to others. Three days before the shooting, Welch texted a friend about the conspiracy: “Watch ‘PIZZAGATE: The bigger Picture’ on YouTube,” he wrote.
Other YouTube-fed conspiracy theories have similarly resulted in threats of gun violence. A man who was heavily involved in conspiracy theory communities on YouTube allegedly threatened a massacre at YouTube headquarters this summer, after he came to believe a different conspiracy theory about video censorship. Another man who believed the YouTube-fueled QAnon theory led an armed standoff at the Hoover Dam in June. A neo-Nazi arrested with a trove of guns last week ran a YouTube channel where he talked about killing Jewish people.
Religious extremists have also found a home on YouTube. From March to June 2018, people uploaded 1,348 ISIS videos to the platform, according to a study by the Counter Extremism Project. YouTube deleted 76 percent of those videos within two hours of upload, but most of the accounts behind them remained online. The radical Muslim-American cleric Anwar al-Awlaki, whose sermons were popular on YouTube, radicalized multiple would-be terrorists.
Less explicitly violent actors can also radicalize viewers by exploiting YouTube’s algorithm.
“YouTubers are extremely savvy at informal SEO [search engine optimization],” Lewis of Data & Society said. “They’ll tag their content with certain keywords they suspect people may be searching for.”
Chaslot described a popular YouTube title format that plays to the algorithm as well as to viewers’ emotions. “Keywords like ‘A Destroys B’ or ‘A Humiliates B’” can “exploit the algorithm and human vulnerabilities.” Conservative videos, like those featuring right-wing personality Ben Shapiro or Proud Boys founder Gavin McInnes, often employ that format.
Some fringe users try to proliferate their views by making them appear in the search results for less-extreme videos.
“A moderate user will have certain talking points,” Sherratt said. “But the radical ones, because they’re always trying to infiltrate, and leech subscribers and viewers off those more moderate positions, they’ll put in all the exact same tags, but with a few more. So it won’t just be ‘migrant crisis’ and ‘Islam,’ it’ll be ‘migrant crisis,’ ‘Islam,’ and ‘death of the West.’”
“You could be watching the more moderate videos and the extreme videos will be in that [recommendation] box because there isn’t any concept within the anti-social justice sphere that the far right aren’t willing to use as a tool to co-opt that sphere.”
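The tag-stuffing Sherratt describes is easiest to see with a simplified “related videos” scorer. The sketch below is a hypothetical Python example rather than any platform’s real ranking code: it scores relatedness by shared tags, the kind of naive signal informal SEO targets. By copying a moderate video’s tags and adding a few of its own, an extreme upload scores as more related than a genuinely moderate one.

```python
# Simplified tag-overlap scoring for "related videos," purely illustrative.
# Real platforms combine many signals; tags are just one input informal SEO targets.

def related_score(video_tags, candidate_tags):
    """Jaccard similarity between two tag sets: shared tags / all tags."""
    a, b = set(video_tags), set(candidate_tags)
    return len(a & b) / len(a | b)

# Tags on the moderate video a viewer is currently watching (hypothetical).
moderate_video = {"migrant crisis", "islam", "europe", "news"}

# Candidate uploads competing for the recommendation box (hypothetical).
candidates = {
    "Moderate commentary":   {"migrant crisis", "islam", "debate"},
    "Copycat extreme video": {"migrant crisis", "islam", "europe", "news", "death of the west"},
    "Unrelated gaming clip": {"gaming", "call of duty", "montage"},
}

for title, tags in sorted(candidates.items(),
                          key=lambda item: related_score(moderate_video, item[1]),
                          reverse=True):
    print(f"{related_score(moderate_video, tags):.2f}  {title}")
```

Because the copycat upload deliberately maximizes tag overlap, it outranks the genuinely moderate commentary in this toy scorer, landing it in exactly the recommendation box Sherratt describes.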
Young people, particularly those without fully formed political beliefs, can be easily influenced by extreme videos that appear in their recommendations. “YouTube appeals to such a young demographic,” Lewis said. “Young people are more susceptible to having their political ideals shaped. That’s the time in your life when you’re figuring out who you are and what your politics are.”
But YouTube hasn’t received the same attention as Facebook and Twitter, which are more popular with adults. During Pichai’s Tuesday congressional testimony, Congress members found time to ask the Google CEO about iPhones (a product Google does not manufacture), but asked few questions about extremist content.
Pichai’s testimony came two days after PewDiePie, YouTube’s most popular user, recommended a channel that posts white nationalist and anti-Semitic videos. PewDiePie (real name Felix Kjellberg) has more than 75 million subscribers, many of whom are young people. Kjellberg has previously been accused of bigotry, after he posted at least nine videos featuring anti-Semitic or Nazi imagery. In a January 2017 stunt, he hired people to hold a “death to all Jews” sign on camera.
Some popular YouTubers in the less-extreme anti-social justice community became more overtly sexist and racist in late 2016 and early 2017, a shift some viewers might not have noticed.
“The rhetoric did start shifting way further right and the Overton Window was moving,” Sherratt said. “One minute it was ‘we’re liberals and we just think these social justice types are too extreme or going too far in their tactics’ and then six months later it turned into ‘progressivism is an evil ideology.’”
One of Matt’s favorite YouTube channels “started off as a tech channel that didn't like feminists and now he makes videos where almost everything is a Marxist conspiracy to him,” he said.
In some cases, YouTube videos can supplant a person’s previous information sources. Conspiracy YouTubers often discourage viewers from watching or reading other news sources, Chaslot has previously noted. The trend is good for conspiracy theorists and YouTube’s bottom line; viewers become more convinced of conspiracy theories and consume more advertisements on YouTube.
The problem extends to young YouTube viewers, who might follow their favorite channel religiously, but not read more conventional news outlets.
“It’s where people are getting their information about the world and about politics,” Lewis said. “Sometimes instead of going to traditional news sources, people are just watching the content of an influencer they like, who happens to have certain political opinions. Kids may be getting a very different experience from YouTube than their parents expect, whether it’s extremist or not. I think YouTube has the power to shape people’s ideologies more than people give it credit for.”
Some activists have called on YouTube to ban extreme videos. The company often counters that it is difficult to screen the reported 300 hours of video uploaded each minute. Even Chaslot said he’s skeptical of the efficacy of bans.
“You can ban again and again, but they’ll change the discourse. They’re very good at staying under the line of acceptable,” he said. He pointed to videos that call for Democratic donor George Soros and other prominent Democrats to be “the first lowered to hell.” “The video explained why they don’t deserve to live, and doesn’t explicitly say to kill them,” so it skirts the rules against violent content.
At the same time, “it leads to a kind of terrorist mentality” and shows up in recommendations.
“Wherever you put the line, people will find a way to be on the other side of it,” Chaslot said.
“It’s not a content moderation issue, it’s an algorithm issue.”