Meta was sued by more than three dozen states on Tuesday for knowingly using features on Instagram and Facebook to hook children to its platforms, even as the company said its social media sites were safe for young people.
Colorado, Tennessee and Massachusetts led a joint lawsuit filed by 33 states in the U.S. District Court for the Northern District of California, which said that Meta — which owns Facebook, Instagram, WhatsApp and Messenger — violated consumer protection laws by unfairly ensnaring children and deceiving users about the safety of its platforms. The District of Columbia and eight other states filed separate lawsuits on Tuesday against Meta with most of the same claims.
The states said Meta’s algorithms were designed to push children and teenagers into rabbit holes of toxic and harmful content. Features like “infinite scroll” and persistent alerts were used to hook young users, the states said. The attorneys general also charged the company with violating federal privacy laws for children.
“Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens,” the states said in their lawsuit. “Its motive is profit.”
Meta said it was working to provide a safer environment for teenagers on its apps and had introduced more than 30 tools to support teens and families.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said in a statement.
Why the Case Matters
It’s unusual for so many states to come together to sue a tech giant for consumer harms. The coordination shows states are prioritizing the issue of children and online safety and combining legal resources to fight Meta, just as states have previously done for cases against Big Tobacco and Big Pharma companies.
Lawmakers around the globe have been trying to rein in platforms like Instagram and TikTok on behalf of children. Over the last few years, Britain, followed by states like California and Utah, passed laws that would require social media platforms to boost privacy and safety protections for minors online. The Utah law, among other things, would require social media apps to turn off notifications by default for minors overnight to reduce interruptions to children’s sleep.
Regulators have also tried to hold social media companies accountable for possible harms to young people. Last year, a coroner in Britain ruled that Instagram had contributed to the death of a teenager who took her own life after seeing thousands of images of self-harm on the platform.
How the Investigation Started
States began investigating Instagram’s potentially harmful effects on young people several years ago as public concerns over cyberbullying and teen mental health mounted.
In early 2021, Facebook announced that it was planning to develop “Instagram Kids,” a version of its popular app that would be aimed at users under the age of 13. The news prompted a backlash among concerned lawmakers and children’s groups.
Soon after, a group of attorneys general from more than 40 states wrote a letter to Mark Zuckerberg, the company’s chief executive. In it, they said that Facebook had “historically failed to protect the welfare of children on its platforms” and urged the company to abandon its plans for Instagram Kids.
Concerns among the attorneys general intensified in September 2021 after Frances Haugen, a former Facebook employee, leaked company research indicating that the company knew its platforms posed mental health risks to young people. Facebook then announced it was pausing the development of Instagram Kids.
That November, a bipartisan group of attorneys general, including Colorado, Massachusetts and New Hampshire, announced a joint investigation into Instagram’s impact — and potential harmful effects — on young people.
Under local and state consumer protection laws, the attorneys general are seeking financial penalties for Meta. The District of Columbia and the states will also ask the court for injunctive relief to force Meta to stop using certain tech features that the states contend have harmed young users.