
Meta’s Instagram faces scrutiny as the FTC exposes safety lapses and child exploitation issues in a controversial trial.
Key Insights
- The FTC presented evidence of child-safety failures on Instagram during its ongoing antitrust trial against Meta.
- A 2019 report revealed Instagram’s algorithms suggested minors’ accounts to inappropriate adult users.
- According to the report, 27% of follow recommendations pointed minors’ profiles to accounts flagged as predatory, and more than two million underage accounts were flagged within three months.
- Emails showed Mark Zuckerberg limited Instagram’s resources out of concern the app would overshadow Facebook.
- Meta says it has made improvements, but the internal records point to lapses in youth safety.
Alarming Revelations
In the ongoing antitrust trial against Meta, the Federal Trade Commission (FTC) has laid bare startling safety oversights in Instagram’s handling of child protection. Citing internal documents from 2019, the FTC showed that Instagram’s algorithm recommended minors’ accounts to adults exhibiting suspicious behavior. Of such recommendations, 27% directed these adults to children’s profiles, and over two million underage accounts were flagged for inappropriate adult interactions within a three-month period.
Instagram platform used automated algorithms that suggested children for groomers and predators to follow on the app, according to a 2019 internal company document presented by the FTC during the ongoing Meta antitrust trial. https://t.co/agBuaRinUR
— Breitbart News (@BreitbartNews) May 7, 2025
The findings came from an internal report titled “Inappropriate Interactions with Children on Instagram,” which also revealed that minors made up around 7% of all follow recommendations shown to adult users on the platform. The report painted a concerning picture of Instagram’s failure to safeguard young users, prompting the FTC to question the platform’s commitment to combating child exploitation.
Resource Allocation Disputes
The FTC argues that after Meta acquired Instagram, it underinvested in user safety and thereby put young users at risk. Evidence presented in court, including emails and testimony, indicated that these resource constraints stemmed from strategic decisions by CEO Mark Zuckerberg, who allegedly withheld resources for Instagram’s security enhancements out of concern that Instagram might overshadow Facebook.
Meta executives also acknowledged that Instagram lagged behind Facebook in addressing critical issues such as child exploitation. Internal documents showed that Instagram’s safety teams were understaffed and under-resourced, further exacerbating the risks faced by minors.
🇺🇸 FTC: 2 MILLION TEENS SERVED TO GROOMERS BY INSTAGRAM’S ALGORITHM
A 2019 internal report revealed Instagram’s algorithm was suggesting minors to adult predators – 2 million teen accounts were shown to flagged “groomers.”
That’s 27% of follow recommendations going to creeps.… https://t.co/6qMGsjBRNZ pic.twitter.com/b105bC8vrz
— Mario Nawfal (@MarioNawfal) May 7, 2025
Meta’s Response and Future Steps
For its part, Meta claims that since 2018 it has worked to improve child safety across its platforms. Its initiatives include making teen accounts private by default, restricting adult-minor interactions, and limiting recommendations involving suspicious adults. A Meta spokesperson also pointed to ongoing investment in child safety and support for legislation to better protect minors.
Despite these claims of progress, the newly exposed internal disputes and the FTC’s arguments raise valid concerns about Meta’s priorities, underscoring the tension between growing Instagram and keeping the platform safe for its youngest users.
Sources:
- Meta Antitrust Trial: FTC Says Instagram Urged ‘Groomers’ to Connect With Minors – Bloomberg
- FTC Describes Instagram as a Groomer’s Paradise at Meta Antitrust Trial