Nearly 1 in 5 users ages 13 to 15 told Meta that they saw “nudity or sexual images on Instagram” that they did not want to see, according to a court filing.
The document, made public on Friday as part of a federal lawsuit in California and reviewed by Reuters, includes parts of a March 2025 statement from Instagram chief Adam Mosseri.
Mosseri said the company does not share survey results “in general,” adding that self-reported surveys are “notoriously problematic,” according to the statement.
Meta, which owns Facebook and Instagram, faces accusations from leaders around the world that its products harm young users.
In the United States, thousands of lawsuits in federal and state courts accuse the company of designing addictive products and fueling a mental health crisis for minors.
The statistic on explicit images came from a survey of Instagram users about their experiences on the platform, Meta spokesperson Andy Stone said, and not from a review of the posts themselves.
The company said that by the end of 2025 it would remove images and videos “that contain nudity or explicit sexual activity, even when generated by AI,” with exceptions considered for medical and educational content.
About 8% of users in the 13 to 15 age group also said they had “seen someone harm themselves or threaten to harm themselves on Instagram,” according to the statement.
Most of the sexually explicit images were sent through private messages between users, Mosseri said in his statement, and Meta must consider users’ privacy when reviewing them.
“A lot of people don’t want us to read their messages,” he said.