Although Austin police are still searching for the motive behind Mark Conditt's bombing spree, investigators have tentatively concluded that he mastered the art of bomb-making from "how to make a bomb" videos found mostly on YouTube.
On any given day there are almost 300,000 videos on YouTube providing step-by-step instructions on how to construct bombs: pipe bombs, pressure cooker bombs, you name the type. Some of the videos are the work of teenage backyard pranksters mixing up household chemicals for "Gatorade bottle bombs" (lethal in their own right); others are so-called "film prop" instructional videos showing how to construct bombs with more "boom" than bark for film-making purposes. But the clear majority are military-grade instructional videos painstakingly walking a would-be Mark Conditt or ISIS bomber through the construction of a lethal pipe or pressure cooker bomb.
In the past five years, ISIS-inspired bombers have relied heavily on such tactical YouTube videos to build their homemade bombs. According to Boston police and the FBI, the Tsarnaev brothers constructed their Boston Marathon bombs from YouTube videos and Inspire magazine, al Qaeda's "how to be a terrorist" handbook. So, too, did Syed Farook and Tashfeen Malik, the San Bernardino terrorists who built a bomb factory in their garage.
Throughout 2017, in the wake of rising public demand and a boycott by American advertisers, YouTube pledged to clean up its act and remove ads from extremist incitement and from dangerous videos enabling domestic violence and terrorism.
In response to a concerted campaign by the Counter Extremism Project (CEP), YouTube's management took an important step and removed the hate videos of radical Islamic cleric Anwar al-Awlaki. But for every Awlaki video removed, hundreds of new radical imam rants have sprung up on YouTube in its place.
Admittedly, it is a huge technological challenge to identify and segregate terrorist-grade bomb-making videos from benign backyard pranks, or to distinguish the inciting sermons of radical Islamic imams from mere propaganda. Meeting that challenge will require a social media "moon shot" from Silicon Valley; if the industry does not clean up its own act, it should expect growing public demand for regulation to protect public safety.
The Wall Street Journal reported on March 27 that YouTube announced it would ban videos relating to the sale of guns and bump stocks, evidently in response to the March for Our Lives campaign.
If YouTube's management is willing to impose restrictions on firearms-sales videos, why not also ban instructional bomb-making videos, which are every bit as dangerous?
If YouTube's management can unilaterally take down gun and bump stock ads and how-tos, it can hardly be far from having the software to segregate and remove bomb-making videos. But that is something YouTube's management refuses to disclose. In fact, YouTube's management has repeatedly claimed it cannot identify and remove such extremist content by itself, outsourcing the job to so-called third-party flaggers who use their eyes rather than software to flag content violating its Terms of Service. Moreover, it does not want to assume any liability for content uploaded to its platform.
Under the Communications Decency Act of 1996 (CDA), social media companies are immune from liability for uploaded content. Congress has carved out two exceptions to this blanket immunity: child pornography and, just recently, sex trafficking.
How can Congress continue to permit social media companies to decide arbitrarily when to remove content, and to do so merely in response to public pressure? A tortoise-paced, half-hearted pledge to act is glaringly insufficient, especially when American lives are at stake. It also strains credulity that one of the largest and wealthiest technology companies lacks the software wherewithal to meet this challenge.
When it came to pulling corporate ads off extremist content after losing nearly $1 billion in ad revenue, YouTube was able to bend its algorithms to protect its revenue stream. So how is it going to find and remove gun sale and assembly videos? Obviously, with new software it must have developed internally.
CEP offered YouTube its "eGLYPH" software free of charge to help management swiftly identify and segregate extremist content. But YouTube's management refuses even to road-test eGLYPH, which is already in successful use in Europe. Why? My guess is that YouTube's lawyers worry that accepting third-party technology could somehow breach the legal dam protecting the company's CDA content immunity. Apparently, YouTube's lawyers rule YouTube's roost.
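For readers wondering what such flagging software actually does, eGLYPH is reported to be a robust-hashing tool. The sketch below illustrates the general technique under that assumption: known extremist images or video frames are reduced to compact perceptual hashes, and new uploads are flagged when their hashes fall within a small bit distance of a hash already in the database. The average-hash function, threshold, and file names here are illustrative assumptions, not eGLYPH's actual algorithm.

```python
# A minimal sketch of hash-based content flagging, assuming eGLYPH works
# broadly like other robust-hashing tools (e.g., PhotoDNA). The specifics
# below are illustrative, not eGLYPH's real implementation.

from PIL import Image  # pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a simple perceptual 'average hash' of an image.

    The image is shrunk to hash_size x hash_size grayscale pixels; each
    bit of the hash records whether a pixel is brighter than the mean.
    Small edits (re-encoding, resizing, light cropping) leave most bits
    unchanged, which is what makes near-duplicate matching possible.
    """
    pixels = list(
        Image.open(path).convert("L").resize((hash_size, hash_size)).getdata()
    )
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")


def is_flagged(upload_hash: int, known_hashes: list[int], threshold: int = 6) -> bool:
    """Flag an upload if it lands within `threshold` bits of any known hash."""
    return any(hamming_distance(upload_hash, h) <= threshold for h in known_hashes)


# Hypothetical usage: hash frames sampled from a new upload and compare
# them against hashes of previously removed content.
# known_hashes = [average_hash(f) for f in previously_removed_frames]
# if is_flagged(average_hash("new_upload_frame.png"), known_hashes):
#     route_to_human_review()
```

The point of the sketch is that once a video has been identified and removed one time, re-uploads and lightly edited copies can be caught automatically; no new human judgment is required for each copy, which is precisely the capability YouTube claims to lack.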
As Congress considers what to do about the Cambridge Analytica/Facebook data breach, it should also take a long, hard look at the unacceptable presence of dangerous instructional bomb-making videos on YouTube and begin laying the groundwork for federal intervention unless YouTube's management acts quickly to remove them. We are one more bomb-making video away from another terror attack. Will it take a pipe bomb attack at a high school to force YouTube to act?