Congress

Senators Brand Instagram as the ‘New Tobacco’ for Teens

No Filter

Senators went after Instagram's CEO on Wednesday, saying that legislation would be coming to police the app's mental health effects on teens.


Senators were in no mood for nonsense on Wednesday.

What was supposed to be a wide-ranging hearing with Instagram CEO Adam Mosseri before a Senate subcommittee turned into a marathon of congressional complaints, with lawmakers specifically calling out Instagram’s failure to address how toxic the photo-sharing app can be for teen girls and promising legislation to rein in the tech company.

Sen. Marsha Blackburn (R-TN), the ranking member of the Senate Commerce Committee’s Subcommittee on Consumer Protection, Product Safety, and Data Security, called Instagram’s efforts to address mental health issues on the app the “bare minimum,” after Instagram released new features in the wee hours of Wednesday morning meant to nudge teens to “take a break” or switch to new topics if they’ve been stuck on one for a while.


Lawmakers were unimpressed.

“The time for self-policing and self-regulation is over,” the chairman of the subcommittee, Sen. Richard Blumenthal (D-CT), said. “Some of the big tech companies have said ‘trust us’… the trust is gone… You’re in the gutter.”


The criticism comes after months of revelations about Instagram’s harmful effects on children and teenagers. For years, teens themselves have linked their mental health issues to time spent on Instagram, and a recent series of investigations has shown that Instagram has known internally that the app’s addictiveness can cause mental health problems. Instagram use has been linked to worsening body image, increased rates of anxiety and depression, and even increased suicidal thoughts, according to The Wall Street Journal.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” an internal Instagram presentation from March 2020 said, according to The Wall Street Journal.

A presentation from 2019 went even further: “We make body image issues worse for one in three teen girls,” a slide reportedly said.

In several cases, parents have blamed Instagram for the suicides of their children. Molly Russell, a 14-year-old who died by suicide in 2017, had been scrolling through Instagram and being fed depressive content before her death, her father claims.

Head of Instagram Adam Mosseri testifies during a Senate Commerce, Science, and Transportation Committee hearing. (Photo: Drew Angerer)

Two weeks after Molly’s story went public, Mosseri vowed “to do everything we can to keep the most vulnerable people who use our platform safe” in an op-ed.

And yet, Instagram’s first set of parental controls is not slated to launch until March 2022, Mosseri said Wednesday.

Nonetheless, Mosseri opted Wednesday to push back on findings that Instagram is addictive for teens and ropes them into more and more harmful content.

“I don’t believe the research suggests that our products are addictive,” Mosseri said.

But Blumenthal was unconvinced, urging Mosseri to be more honest about Instagram’s harm to teens.

“We can debate the meaning of the word ‘addictive,’ but the fact is that teens who go to the platform find it difficult—maybe sometimes impossible—to stop. And part of the reason is that more content is driven to them to keep them on the site to aggravate the emotions that are so seductive,” Blumenthal said.

Mosseri also pushed back on the idea that Instagram is part of the problem, saying, for instance, that Instagram doesn’t recommend eating-disorder content to users.

Still, senators found plenty of issues with the Instagram algorithm and what it suggests to users.

Sen. Mike Lee (R-UT) and his staff created a dummy account for a 13-year-old girl to test what content Instagram automatically feeds to teen girls. They had the fake account follow just one female celebrity, and within minutes the “test” 13-year-old’s account was inundated with posts about plastic surgery, body dysmorphia, and other material harmful to mental health. (Blackburn and Blumenthal also pressed Mosseri on similar fake accounts they created to test Instagram.)

Mosseri said this kind of harmful content is viewed in only 5 of every 10,000 cases. But once again, lawmakers weren’t buying it.

“It was rampant… It was not 5 in… 10,000,” said Lee, who added that the harmful content was fed to the “teen” account only after Instagram recommended an account to follow. “What changed was following this female celebrity account and that female celebrity account was recommended to this 13-year-old girl. So why are you recommending that somebody follow this, with the understanding that by doing that you’re exposing that girl to all sorts of other things that are not suitable for a child?”

Mosseri didn’t have a clear answer, but he said he believes users look to Instagram for resources to help them deal with body image and mental health issues. He also said Congress should hold other platforms accountable as well, suggesting that Instagram supports industry-wide regulation.

American youth are in the midst of a mental health crisis. Feelings of helplessness, depression, thoughts of suicide, and suicide rates among teens are on the rise, according to the U.S. Surgeon General, who issued an alert on the youth mental health crisis this week.

In early 2021, emergency room visits for suicide attempts by teen girls were 51 percent higher than in the same period in 2019, the surgeon general noted in the alert. Visits by boys rose about 4 percent over the same period.

Social media has, in many cases, exacerbated existing mental health issues, a problem that has deepened during the pandemic as many interactions and social engagements have shifted online, according to the alert. The surgeon general said social media companies need to be more serious about accountability.

“Technology companies must step up and take responsibility for creating a safe digital environment for children and youth,” the alert warned. “Today, most companies are not transparent about the impact of their products, which prevents parents and young people from making informed decisions and researchers from identifying problems and solutions.”

But for Instagram, Sen. Ted Cruz (R-TX) said, it all comes down to whether the app is willing to give up advertising dollars to help protect young teen girls from harmful content that might drive them to suicide or influence them toward disordered eating.

“You make money the more people are on your product—the more people are engaged in [viewing] content that is harmful to them,” said Cruz, who has two daughters entering their teen years. “Every eyeball, you’re making money… If you change policies to reduce eyeballs, you make less money.”

If lawmakers get their way, the days of Instagram making its own rules may be coming to a screeching halt. Committee leadership indicated Wednesday that legislation meant to replace Instagram’s self-policing and hold the company accountable would be coming.

As Blumenthal said, there was one clear conclusion Instagram should walk away with today: “Legislation is coming. We can’t rely on trust anymore, we can’t rely on self-policing.”

“You’re the new tobacco, whether you like it or not,” Lee added. “And you’ve got to stop selling the tobacco.”
