Meta must face Massachusetts lawsuit over youth social media addiction, court rules

State’s top court rules the company must face claims that it deliberately designed features of platforms like Instagram to addict young users

Internal data allegedly showed Instagram was harming children, yet executives blocked changes their own researchers recommended.

On a Friday in April, Massachusetts became the site of a significant legal milestone: the state's highest court ruled that Meta Platforms must stand trial over allegations that it deliberately engineered its social media products to hook children.

The Massachusetts Supreme Judicial Court's decision was the first time any state high court had weighed whether Section 230 of the Communications Decency Act — the 1996 federal law that broadly protects internet companies from liability for content their users post — could also be used to block claims that a company knowingly designed its platforms to exploit young people's psychological vulnerabilities. The court said it could not.

Justice Dalila Argaez Wendlandt, writing for a unanimous bench, drew a clear line between what Section 230 was meant to protect and what Attorney General Andrea Campbell's lawsuit actually targets. The law shields companies from being held responsible for what their users say and post. Campbell's case, Wendlandt wrote, is about something different: Meta's own choices. The allegations center on whether the company built a platform that capitalizes on the developmental fragility of children, and whether it actively misled consumers about how safe Instagram really was.

Campbell, a Democrat, called the ruling a major step toward holding tech companies accountable for practices she says have driven a youth mental health crisis while prioritizing profit. Meta denied the underlying allegations and maintained that it takes extensive measures to protect younger users on its platforms. The company did not immediately respond to requests for comment following the ruling.

The lawsuit has drawn attention since its filing in part because of what it revealed about internal conversations at Meta. According to the complaint, CEO Mark Zuckerberg was dismissive when concerns were raised about Instagram's potential to harm its users. The suit alleges that features like push notifications, the endless scroll, and the "likes" system were not incidental design choices but deliberate tools engineered to exploit teenagers' fear of missing out and their hunger for social validation. Internal data, the state claims, showed the platform was addicting and hurting children — and that executives blocked changes their own researchers said would improve teen wellbeing.

Meta had argued that Section 230 should shield it from the Massachusetts case entirely. A trial court judge rejected that argument, finding that the state was not trying to hold Meta responsible for third-party content but for its own business conduct. The Supreme Judicial Court agreed.

The ruling lands in the middle of a wave of legal pressure on Meta from multiple directions. On March 25, a Los Angeles jury found Meta and Google's parent company Alphabet negligent for building social media platforms harmful to young people, awarding a combined $6 million to a 20-year-old woman who said she became addicted to those platforms as a child. The day before that verdict, a separate jury found Meta owed $375 million in civil penalties in a New Mexico attorney general lawsuit accusing the company of deceiving users about safety and enabling child sexual exploitation on Facebook and Instagram.

Thirty-four other states are pursuing similar claims against Meta in federal court. At least nine state attorneys general, including a Republican in Iowa who filed suit just this week, have chosen to pursue their cases in state court since 2023. Massachusetts is now among those cleared to proceed.

What comes next is a trial — one that will force Meta to defend not just its policies but the engineering decisions baked into products used by hundreds of millions of people, including children who were never supposed to be there at all.

The claims allege harm stemming from Meta's own conduct — either by designing a platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about Instagram's safety.
— Justice Dalila Argaez Wendlandt, Massachusetts Supreme Judicial Court
A major step in holding these companies accountable for practices that have fueled the youth mental health crisis and put profits over kids.
— Massachusetts Attorney General Andrea Campbell
The Hearth Conversation
Another angle on the story
Inventor

What actually makes this ruling different from the dozens of other lawsuits piling up against Meta?

Model

Most of the legal action against Meta has been in federal court, where Section 230 is a powerful shield. This is the first time a state's highest court has said that shield doesn't cover what Meta is actually being accused of — designing addiction into the product itself.

Inventor

So Section 230 was always about content, not design?

Model

That's the argument the court accepted. The law was written to protect platforms from being sued over what users post. It wasn't written to protect a company from liability for its own engineering choices — the scroll that never ends, the notification that pulls you back in.

Inventor

Does the distinction between content and design hold up under scrutiny?

Model

That's exactly what Meta will contest. The company's position is that the design and the content are inseparable — that a feed is just a way of organizing what users post. The court didn't buy it, but Meta will keep pushing that argument.

Inventor

The Zuckerberg detail — him being dismissive of concerns — how much does that matter legally?

Model

It matters a lot for the deception claims. If executives were told internally that the platform was harming kids and chose to publicly say otherwise, that's not a design question, it's a fraud question. Section 230 has nothing to say about that.

Inventor

Thirty-four states in federal court, nine in state court — is this coordinated?

Model

It has the shape of coordination. Attorneys general share research, strategy, and filings. The New Mexico case, the Los Angeles trial, this Massachusetts ruling — they're building on each other, creating a legal record that makes the next case easier to bring.

Inventor

The $6 million verdict in Los Angeles — does that number feel proportionate to what's being alleged?

Model

It's a single plaintiff, a single story. The number that probably worries Meta more is the $375 million in civil penalties from New Mexico. Multiply that kind of figure across dozens of states and you're talking about something that actually changes the economics of the business.
