The Algorithm Knew: How Social Media Companies Chose Engagement Over Teen Safety
Expert testimony in the ongoing federal case reveals internal documents showing companies knowingly prioritized engagement over adolescent wellbeing.
The Trial
The federal case *In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation* has entered its fifth week in the Northern District of California, and the testimony so far has painted a damning picture of how major social media platforms handled internal warnings about the impact of their products on teenage users.
Over 4,200 pages of internal documents have been entered into evidence, including emails, Slack messages, and internal research presentations from three major platforms. Taken together, they reveal a consistent pattern: safety researchers raised alarms, proposed interventions, and were overruled by growth and product teams focused on engagement metrics.
The Internal Research
"We Know This Is Harmful"
Perhaps the most striking document is a 2021 internal presentation titled "Teen Mental Health Deep Dive" from one platform's research team. The 47-slide deck includes findings such as:
- 32% of teen girls reported that when they felt bad about their bodies, the platform's algorithm showed them more appearance-focused content
- Teens who used the platform for more than 3 hours daily showed 2.4x higher rates of anxiety and depression symptoms
- The platform's recommendation algorithm consistently promoted content that researchers classified as "social comparison" and "appearance-based" to teen users
> "We have evidence of a causal pathway between our recommendation system and negative mental health outcomes in adolescent users. The question is what we're willing to do about it." — Internal research memo, entered as Exhibit 47
The Safety Team's Proposals
Internal documents show that safety teams at all three platforms proposed specific interventions:
- Algorithmic circuit breakers — pausing recommendations after detecting negative engagement patterns
- Default time limits for users under 18
- Reducing recommendation amplification of content flagged as harmful by internal classifiers
- Chronological feed options as the default for teen accounts
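The proposed interventions are technically straightforward. As a purely illustrative sketch — none of the class names, signals, or thresholds below come from the trial exhibits; they are assumptions for the sake of the example — an "algorithmic circuit breaker" of the kind the safety teams described could be as simple as a sliding window over recent engagement events:

```python
from collections import deque

# Hypothetical labels a platform's internal classifiers might attach to
# engagement events. These names are illustrative assumptions, not taken
# from the court documents.
NEGATIVE_SIGNALS = {"appearance_comparison", "self_harm_adjacent", "doomscroll"}

class CircuitBreaker:
    """Pause ranked recommendations after a run of negative engagement."""

    def __init__(self, window: int = 20, threshold: float = 0.5):
        self.events = deque(maxlen=window)  # sliding window of recent events
        self.threshold = threshold          # fraction of negatives that trips it

    def record(self, signal: str) -> None:
        # Store True for a negative signal, False otherwise.
        self.events.append(signal in NEGATIVE_SIGNALS)

    def tripped(self) -> bool:
        # Only evaluate once the window is full; trip when negative
        # events reach the threshold fraction of recent activity.
        if len(self.events) < self.events.maxlen:
            return False
        return sum(self.events) / len(self.events) >= self.threshold

    def next_feed_mode(self) -> str:
        # Fall back to a chronological feed instead of ranked recommendations.
        return "chronological" if self.tripped() else "recommended"
```

A tripped breaker here simply switches the feed to chronological order — which is itself one of the other defaults the safety teams proposed for teen accounts.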
Why They Were Rejected
In each case, the proposals were rejected or significantly watered down. The reasons documented in internal communications include:
- "Unacceptable engagement impact" — one VP estimated that teen-specific safety features would reduce daily active usage by 8–12%
- Revenue concerns — teen users were identified as a key growth demographic with high lifetime value
- Competitive pressure — executives argued that if one platform added restrictions, teens would migrate to competitors
Expert Testimony
Dr. Rachel Chen, a developmental psychologist at Stanford, testified as an expert witness for the plaintiffs. Her testimony drew on both the internal documents and her own research:
> "These companies had better data on the mental health impact of their products than any academic researcher. They had the tools to mitigate the harm. And they made a conscious business decision not to use them."
The defense has argued that correlation does not establish causation, that parents bear primary responsibility for managing screen time, and that the platforms have since implemented voluntary safeguards.
The Scale of the Crisis
The trial is occurring against the backdrop of what the U.S. Surgeon General has called a "youth mental health crisis":
- Teen depression diagnoses have increased 60% since 2015
- Emergency room visits for self-harm among girls aged 10–14 have risen nearly threefold
- The average American teenager spends 4.8 hours per day on social media platforms
| Metric | 2015 | 2020 | 2025 |
|---|---|---|---|
| Teen depression rate | 12.8% | 17.0% | 20.5% |
| Daily social media use (hrs) | 2.1 | 3.8 | 4.8 |
| ER visits, self-harm (10–14 girls) | 4,200 | 9,800 | 12,100 |
What's at Stake
The case consolidates claims from over 300 school districts and thousands of individual plaintiffs. If the court finds the platforms liable, damages could reach into the tens of billions of dollars. More significantly, a finding of liability could establish a legal precedent treating algorithmic recommendation systems as products subject to design-defect claims.
A ruling is expected by mid-2026.
This article was collaboratively researched and written by 15 contributors using Kabooy's investigative deep-dive pipeline.