The Facebook group “Soldiers of the Righteous Caliphate” was a bubbling cauldron of noxious extremist support drawn from rival groups including the Islamic State and the Taliban. It was filled with beheading imagery, military exercises, hand-drawn faceless portraits of the late ISIS leader Abu Bakr al-Baghdadi, and—somehow—plenty of anime porn.
With just over 24,000 members, the Facebook group represented a perfect case study in the malicious mix of extremist support on the platform, which continues to evade detection and gain traction. So why was it full of pornography?
There has been plenty of analysis about extremist groups and how their content evades detection, but not many reports have focused on the antics of people trying to fight these extremists online. Some of these vigilantes are lone actors, others are part of “cyberarmies” aligned with other extremist groups fighting rival terrorist supporters online.
The “Soldiers of the Righteous Caliphate” group was a microcosm of vigilante digital detractors at play. It was the digital equivalent of a wrestling cage match between cyberarmies of extremist group supporters targeting the Islamic State, Hay’at Tahrir al-Sham and the Taliban.
The battlefields of these cyberarmies, and the conflict they spurred in weaponized group posts and comment sections, represent one of the lesser-known moderation challenges for Facebook.
Their weapons of choice: anime porn and numbered Kama Sutra positions. Put simply, when terrorist supporters go violent on Facebook, their enemies tend to go pornographic.
For instance, a 15-position numbered Kama Sutra menu of sex acts was recently shared with a post that read “the army of the Caliphate is on the path set by Hind, daughter of Abu Sufyan.” Some sex posters in the group used more straightforward messaging. Using a police line-up of naked female anime characters, each with progressively larger busts, one detractor derided the “dogs of hell,” a well-known Middle Eastern moniker for Islamic State supporters.
Other lone actor vigilantes targeting Taliban supporters in the group made sexual references about the Taliban, posting dancing videos of young Taliban fighters alongside emojis depicting an index finger poked into an OK sign, as a means to suggest the Taliban were homosexuals who “love dark holes.” When Hay’at Tahrir al-Sham accounts posted adoring photos of Abu Muhammad al-Joulani, the group’s leader, a non-extremist account posted a doctored photo of him having sex with himself, in reference to what some believe is his narcissistic nature.
The question, however, is just how networked are these posters? Does it indicate the presence of cyberarmies, or are these just average people with a taste for shitposting terrorist supporters?
The answer is both.
All of these explicit tactics are often used by the most fringe of the terrorist group detractors, as well as by networks of opposing extremist groups on Facebook. The answer to why these tactics are utilized is buried in a large tranche of internal Facebook documents released in late October to 17 media organizations, which The Daily Beast has reviewed. Prepared in December 2020, the documents record issues noted by Facebook employees that give researchers a better understanding of how networks of anti-extremist accounts, as well as the cyberarmies supportive of a range of extremist groups, operate on the platform to take on terrorist content when the platform does not.
In the papers, the Middle East and North Africa Integrity Team at Facebook reported that “Iraq is a proxy for cyberarmies working on reporting content in order to block certain pages or content.” The papers went on to mention that “reporters in Iraq understand the zero tolerance FB has for CN [child nudity] and this is used by cyberarmies to close certain pages.”
While none of the lone actors or the extremist cyberarmies fighting one another in the “Soldiers of the Righteous Caliphate” used child nudity or children as extremist repellents, a number did use pornographic material to troll extremists and attempt to get their accounts banned.
Averaging 259 posts a day, the “Soldiers of the Righteous Caliphate” was essentially the Golden Corral buffet for a range of dueling jihadist groups across the platform and the occasional, loosely networked band of do-gooders. Using the group as a case study, it’s clear that some of these accounts are simply friends who’ve had enough of extremist content on the platform and have banded together, while others are members of cyberarmies affiliated with extremist groups intent on dominating the platform and knocking out their competition. Many of them claim to be based in Iraq, Syria, and other locations across the Middle East and North Africa.
By following the accounts, and systematically going through their friends, followers, and likes, The Daily Beast was able to assess which accounts were linked to cyberarmies and which belonged to loosely organized vigilante Facebookers. What is clear is that many of those involved in posting the sexually charged material were members of the Iran-backed Popular Mobilization Forces’ political party pages in Iraq. Even more obvious are the networks of Taliban, Hay’at Tahrir al-Sham, and Islamic State supporters on the platform that are part of those cyberarmies.
Although they hate each other, the Taliban supporters—through large groups of their own—and Hay’at Tahrir al-Sham supporters, also linked to groups of their own, equally detest the Islamic State, and have formed virtual armies to both troll the Islamic State and drive it off the platform.
Meanwhile, individuals who seethe at the thought of extremists sharing content space on the world’s largest social media network take them all on by photo-bombing them in shared spaces with explicit posts.
There is actually a history to these antics. Back in 2016, members of Anonymous, the shadowy hacker collective, dumped pornography into the timelines of Islamic State supporters on Twitter. In September 2020, infiltrators spammed porn into Telegram channels manned by Islamic State supporters, who grew frustrated with their inability to get rid of the content.
All of this, of course, looks like childish antics to the casual observer, and isn’t necessarily effective, but it adds an element of circus theatrics to the challenge of blunting terrorist support on the platform.
Facebook took down the groups when reported by The Daily Beast. A spokesperson for Meta, Facebook’s parent company, said it doesn’t “allow terrorists on our platform,” and that it removes “content that praises, represents or supports them whenever we find it.”
Ultimately, scrolling through the Facebook group’s timeline reveals not just the lapses in content moderation on a regional level, but also gaps in moderating terrorist content that is explicitly forbidden by the platform’s own community standards.
What’s clear, for now, is that in the darkest corners of Facebook, one man’s porn is another man’s terrorist repellent—and there seems to be no end to this bizarre phenomenon in sight.