YouTube algorithms consistently push eating disorder and self-harm content to teen girls, new study finds
Anna Mockel was 14 when she suddenly became obsessed with losing weight. It was the spring of 2020, and she had just graduated from eighth grade remotely. She was housebound, nervous about the upcoming fall’s transition to high school, and spent countless hours shuttling between social media apps during the summer of COVID-19 lockdowns.
Anna spent a lot of time on YouTube, “not specifically searching for anything,” just watching whatever came up in her feed. She remembers her thoughts starting to spiral when she saw videos of older girls who were always skinny. The more Anna watched, the more those videos clogged her feed, and the more determined she became to look like the girls in them.
As she clicked and clicked, YouTube’s “Up Next” panel of recommended videos began shifting from content featuring thin girls to “methods” for losing weight. Diet and exercise videos started to take over Anna’s account. As she kept watching, the content escalated, she said, until her feed was filled with videos about making her bones more visible and tips for maintaining a 500-calorie-a-day diet. (The recommended daily caloric intake for adolescent girls is 2,200 calories.)
“I didn’t even know there was such a thing online,” Anna said of the eating disorder content recommended to her. “A lot of it showed up on my feed and then I was drawn to it because that’s what had happened to me.”
Anna copied what she saw, restricted her diet, and began losing weight at an alarming rate. At 14, she said she realized she had an eating disorder but “didn’t make the connections” until she was diagnosed with anorexia. Over the next few years, she endured two hospitalizations and spent three months in a residential treatment center before beginning recovery at age 16.
Now 18 and a high school student, she claims social media, particularly YouTube, contributed to her eating disorder.
“YouTube became a community of people struggling with eating disorders,” she said. “It convinced me that (anorexia) wasn’t a problem because there were so many other people online doing the same thing.”
Now, new research confirms that content like this was intentionally served to Anna. A report released Tuesday by the Center for Countering Digital Hate found that when YouTube users show signs of interest in diet and weight loss, nearly 70% of the videos pushed by the platform’s algorithm recommend content that could worsen or create anxieties about body image.
What’s more, these videos average 344,000 views each, nearly 60 times the views of the average YouTube video, and carry ads from major brands like Nike, T-Mobile and Grammarly. It’s unclear whether the companies were aware of the ad placements.
“We cannot continue to allow social media platforms to be tested on new generations,” said James P. Steyer, founder and chief executive officer of Common Sense Media, a nonprofit organization that provides families with education on media and online safety.
He said the purpose of these platforms is to capture the attention of viewers, even if that means amplifying content that is harmful to minors.
The report, titled “YouTube’s Anorexia Algorithm,” examined the 1,000 videos a teen girl would first receive in her “Up Next” panel after watching videos about weight loss, dieting or exercise.
To collect the data, CCDH researchers created a YouTube profile for a 13-year-old girl and performed 100 searches on the video-sharing platform using popular eating disorder keywords such as “ED WIEIAD” (eating disorder “what I eat in a day”), the “ABC diet” (anorexia boot camp diet) and “safe foods” (referring to foods with few or no calories). The research team then analyzed the top 10 recommendations that YouTube’s algorithm pushed to the “Up Next” panel.
The results showed that nearly two-thirds (638) of the recommended videos pushed the hypothetical 13-year-old user further toward eating disorder or problematic weight-loss content. One-third (344) of YouTube’s recommendations were deemed harmful by CCDH, meaning they promoted or glorified eating disorders, contained weight-based bullying or showed imitable behavior. Fifty of the videos involved self-harm or suicide.
“Social media platforms like YouTube have created this anti-human culture,” said Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. “Today’s children are essentially being re-educated through algorithms, taught and persuaded to starve by corporations.”
Ahmed said the study illustrates the systemic nature of the problem and that Google-owned YouTube violated its own policies by allowing the content to appear on the platform.
According to the Pew Research Center, YouTube is the most popular social media site among U.S. teens, ahead of TikTok and Instagram. Three-quarters of U.S. teens say they use the platform at least once a day. YouTube does not require users to create an account to view content.
The Social Media Victims Law Center, a Seattle-based law firm founded in response to the 2021 Facebook Papers, has filed thousands of lawsuits against social media companies, including YouTube. More than 20 of those lawsuits allege that YouTube is intentionally designed to create addiction and perpetuate eating disorders in users, particularly teenage girls.
The law firm connected “60 Minutes” with a 17-year-old client whose experience mirrors Anna’s.
“YouTube taught me how …,” said the 17-year-old, who in her lawsuit accuses YouTube of intentionally promoting anorexia. She said she created a YouTube account when she was 12 years old. She would log in to watch dog videos, gymnastics challenges and cooking. Then, she said, she started watching videos of girls dancing and exercising, and then videos of girls doing more extreme workouts. She kept clicking.
She said her feed became a funnel for eating disorder content, with a series of influencers promoting extreme diets and methods of “staying thin.” She spent five hours a day on YouTube, learning terms like “bulimia” and “ARFID” (avoidant/restrictive food intake disorder). She learned what it meant to “purge” and “restrict” foods; she became very concerned about caloric intake and body mass index (BMI).
When she was in seventh grade, she stopped eating. Soon after, she was diagnosed with anorexia, and over the next five years, she said she spent more time away from school than in school. Now a high school junior, she has been hospitalized five times and spent several months in three residential treatment centers trying to recover from her eating disorder.
“It almost took my life,” she recalled.
YouTube declined to comment when asked why its algorithm recommends eating disorder content to young users rather than protecting them from it.
The video-sharing site said it “continuously works with mental health experts to refine (its) approach to content recommendations for teens.” In April 2023, the platform expanded its policies on eating disorder and self-harm content to cover videos that contain “educational, documentary, scientific or artistic content” or discussions of eating disorders that “may trigger a risk for viewers.” Under these policies, such videos may be unavailable to viewers under 18 years of age.
YouTube has taken steps to block certain search terms, such as “thinspiration,” which is used to find footage of thin bodies. However, CCDH’s research found that such videos still appear in the “Up Next” panel, and users have learned that the terms can still be searched on YouTube by swapping the letter “o” for a zero or the letter “i” for an exclamation point. A video cited in the report that praised a skeletal physique had 1.1 million views at the time of the analysis; it now has 1.6 million.
As part of the study, CCDH flagged 100 YouTube videos that promoted eating disorders, contained weight-based bullying or showed imitable behavior. YouTube removed or age-restricted only 18 of the videos.