Is Meta Doing Enough to Shield Teenagers From Inappropriate Content Online?



By Tom Linder

In an effort to protect teenagers from inappropriate posts, social media giant Meta announced in January 2024 that it is implementing new policies to block content related to suicide, self-harm, and eating disorders from appearing in teens’ feeds.

The announcement was similar to those made in 2019 and 2021, so Illinois Institute of Technology Associate Professor of Psychology Alissa Haedt-Matt remains unconvinced that the parent company of Facebook, Instagram, Threads, and WhatsApp is doing enough to protect teenagers.

“In order to really evaluate whether they’re doing enough, we need a lot more transparency on what exactly is happening,” says Haedt-Matt, who is the director of Illinois Tech’s Dysregulated Eating and Mood Team (formerly the Eating Behaviors Lab). “When I read the policy, I thought they were already doing this; it’s a little unclear to me what exactly has changed—if anything—with this new policy. Meta has long said they remove eating disorder content and will blur out self-harm or suicide-related content as well.”

The new policy announcement comes on the heels of Arturo Bejar—Meta’s former director of engineering—testifying before the Senate Judiciary Committee in November 2023. While Bejar’s testimony primarily addressed unwanted advances that teenagers receive on Meta’s apps, he expressed concern about the company’s lack of transparency as well.

So, what can Meta do to earn back some of the public’s trust?

“I think they could do a lot more, especially in terms of being transparent about their algorithms, about how they’re identifying eating disorder content,” says Haedt-Matt. “From my perspective, it’s rather insidious content for somebody who’s vulnerable to developing an eating disorder, often disguised as health promotion. You know, the ‘thinspiration,’ ‘fitspiration,’ some of that is disguised as health promotion, but really looks a lot like eating disorder content.

“There needs to be transparency about what they’re finding and how they’re using this information to protect vulnerable groups, especially teens.”