Since at least 2019, Meta has knowingly refused to close the majority of accounts of children under 13 while collecting their personal information without parental consent, a new public court document from an ongoing federal lawsuit against the social media giant alleges.
Attorneys general from 33 states have accused Meta of receiving more than one million reports of under-13 users on Instagram from parents, friends and members of the online community between early 2019 and mid-2023. “However, Meta only disabled a fraction of those accounts,” the complaint said.
The federal complaint calls for injunctions barring Meta from practices that attorneys general say violate the law. Civil penalties could reach hundreds of millions of dollars, given that Meta allegedly hosts millions of teenage and child users. Most of the states are seeking fines of anywhere from $1,000 to $50,000 per violation.
According to the 54-count lawsuit, Meta violated a range of state consumer protection laws, as well as the Children’s Online Privacy Protection Rule (COPPA), which prohibits companies from collecting personal information from children under 13 without parental consent. Meta is alleged to have failed to comply with COPPA with respect to both Facebook and Instagram, even though Meta’s own data shows that Instagram’s audience composition includes millions of children under the age of 13, and that “hundreds of thousands of teen users spend more than five hours a day on Instagram,” the court document said.
A Meta product designer wrote in an internal email that “the young are the best,” adding that “you want to bring people into your service young and early,” according to the lawsuit.
The unsealed complaint also alleges that Meta knew its algorithm could direct children to harmful content, harming their well-being. According to internal company communications cited in the document, employees wrote that they were concerned about “the content on IG triggering negative emotions in tweens and affecting their mental well-being (and) our ranking algorithms taking [them] into negative spirals and feedback loops that are difficult to get out of.”
For example, Meta researchers conducted a study in July 2021 that concluded that Instagram’s algorithm can amplify negative social comparison and “content that tends to make users feel worse about their bodies or appearance,” according to the complaint. In internal emails from February 2021 cited in the lawsuit, Meta employees allegedly acknowledged that social comparison was “associated with increased time spent” on Meta’s social media platforms, and discussed how this phenomenon is “valuable to Instagram’s business model while simultaneously harming teenage girls.”
In a March 2021 internal study on eating disorder content, Meta’s team tracked users whose account names referenced hunger, thinness, and disordered eating. Instagram’s algorithm then began generating a list of recommended accounts “including accounts associated with anorexia,” the lawsuit said.
However, Antigone Davis, Meta’s global head of safety, testified before Congress in September 2021 that Meta “does not direct people to content that promotes eating disorders. That is actually against our policy and we will remove that content as soon as we become aware of it. We actually use AI to find and remove such content.”
“We want teens to have safe, age-appropriate experiences online, and we have more than 30 resources to support them and their parents,” Meta told CNN in a statement. “We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work by using selective quotes and cherry-picked documents.”
Senior leadership at Instagram also knew that problematic content was a critical issue for the platform, the lawsuit said. Adam Mosseri, the head of Instagram, reportedly wrote in an internal email that “social comparison is to Instagram [what] election interference is to Facebook.” The lawsuit does not specify when that email was sent.
CNN reached out to Meta about Davis and Mosseri’s comments and did not immediately receive a response.
Despite the company’s internal research confirming concerns about social comparison on its platforms, the lawsuit claims Meta has refused to change its algorithm. One employee noted in internal communications cited in the lawsuit that content that encourages negative appearance comparisons is “some of the most engaging content (on the Explore page), so this idea actively goes against the key measures of many other teams.” Meanwhile, “Meta’s external communications denied or obscured the fact that its recommendation algorithms promote content with a high negative appearance comparison to young users,” the lawsuit said.
Meta was also aware that its recommendation algorithms “cause intermittent dopamine release in young users,” which can lead to addictive consumption cycles on its platforms, according to internal documents cited in the lawsuit.
“Meta has profited from children’s pain by deliberately designing its platforms with manipulative features that addict children to their platforms while lowering their self-esteem,” Letitia James, New York’s attorney general, said in a statement last month. New York is one of the states involved in the federal lawsuit. “Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable,” James said.
Eight additional attorneys general sued Meta in various state courts last month, making similar claims in connection with the massive federal lawsuit. Florida has sued Meta in its own separate federal lawsuit, alleging the company misled users about the potential health risks of its products.
The wave of lawsuits is the result of a bipartisan, multi-state investigation dating back to 2021, after Facebook whistleblower Frances Haugen came forward with tens of thousands of internal company documents that she said showed how the company knew its products could negatively impact young people’s mental health.
CNN’s Brian Fung contributed to this reporting