The architecture of addiction itself is now on trial.
On a Friday in April, the Massachusetts Supreme Judicial Court handed down a ruling that will be studied in law schools for years: Meta Platforms cannot hide behind a 30-year-old federal internet shield to escape a lawsuit accusing it of deliberately engineering its platforms to addict children.
The case was brought by Massachusetts Attorney General Andrea Joy Campbell, a Democrat, who alleged that Instagram's parent company knowingly built features — push notifications, the endless scroll, the dopamine loop of post "likes" — to exploit the psychological vulnerabilities of teenage users and profit from their compulsive engagement. The suit further alleged that Meta's own internal research showed the platform was harming young people, and that senior executives rejected proposed changes that the company's own scientists said would improve teen well-being.
Meta had argued that Section 230 of the Communications Decency Act of 1996 — the federal statute that has long protected internet platforms from liability over content their users post — should also protect it here. The Massachusetts high court disagreed, and in doing so became the first state supreme court in the country to weigh in on that precise question: whether Section 230 shields a company not just from liability over user content, but from claims that the company itself designed addictive systems. The answer, at least in Massachusetts, is no.
A trial court judge had already reached the same conclusion, reasoning that the state was not trying to hold Meta responsible for anything its users posted — it was targeting Meta's own business decisions, its own design choices, its own conduct. The Supreme Judicial Court's ruling affirms that framing and sends the case forward to trial.
The ruling lands in the middle of a legal avalanche. On March 25, a Los Angeles jury found Meta and Alphabet's Google negligent for building social media platforms harmful to young people, awarding a combined $6 million to a 20-year-old woman who said she became addicted to Instagram and YouTube as a child. The day before that verdict, a separate jury found Meta owed $375 million in civil penalties in a New Mexico attorney general lawsuit accusing the company of misleading users about platform safety and of enabling child sexual exploitation on Facebook and Instagram.
Thirty-four states are pursuing similar claims against Meta in federal court. At least nine state attorneys general have filed in state court since 2023, including a case filed just this week by Iowa Attorney General Brenna Bird, a Republican — a detail worth noting, since the political pressure on Meta is arriving from both sides of the aisle.
Campbell's Massachusetts case drew particular attention early on because of what it alleged about Mark Zuckerberg personally: that the CEO had been dismissive when concerns were raised internally about Instagram's effects on its users. That allegation, aired publicly in the lawsuit's filings, put a human face — and a name — on what might otherwise have seemed like abstract corporate negligence.
Meta has denied all of it. The company says it takes extensive measures to protect younger users on its platforms, and it has pointed to age-verification systems and parental controls as evidence of good faith. Critics, including the attorneys general pursuing these cases, say those measures are inadequate and that the company's own internal documents tell a different story.
What comes next is a trial in Massachusetts — and, almost certainly, more verdicts and rulings in the cases stacking up across the country. The legal question of whether Section 230 protects platform design decisions, as opposed to user content, is now squarely in play. The Massachusetts court has answered it one way. Federal courts will eventually have to answer it too.
Notable Quotes
"The state was principally seeking to hold Meta liable for its own business conduct, not content posted by third parties." — Massachusetts trial court judge, reasoning affirmed by the Supreme Judicial Court
"The company takes extensive steps to keep teens and young users safe on its platforms." — Meta Platforms, in response to the allegations
The Hearth Conversation
Another angle on the story
What makes this ruling different from the dozens of other lawsuits piling up against Meta?
It's the first time a state supreme court has weighed in on a very specific legal question — whether Section 230 protects a company's design choices, not just the content its users post. That distinction matters enormously.
Why does that distinction matter so much?
Because if Section 230 covers design decisions, Meta can argue it's immune from almost any lawsuit about how its platforms work. If it doesn't, the door opens to holding the company accountable for the architecture of addiction itself.
The lawsuit names specific features — notifications, likes, infinite scroll. Is that unusual?
It's deliberate. By naming the mechanics, the state is arguing these aren't neutral tools — they're engineered to keep kids on the platform past the point of healthy use. That's a design liability claim, not a content claim.
And Meta's own research allegedly showed the harm?
That's the sharpest allegation. The suit claims internal data confirmed the platform was hurting young users, and that executives passed on fixes their own scientists recommended. If that's proven at trial, it's damning.
The Los Angeles verdict came just weeks before this ruling. Is there a connection?
Not legally — different courts, different cases. But the timing shapes the atmosphere. Juries and judges are watching each other's work, and a $6 million verdict in LA signals that these claims can land with a fact-finder.
Thirty-four states in federal court, nine in state court. What does that coordination look like?
It's less a coordinated campaign than a convergence. Attorneys general from both parties, in states with very different politics, have looked at the same evidence and reached the same conclusion. That breadth is its own kind of pressure.
What does Meta do now?
It prepares for trial in Massachusetts while fighting on multiple other fronts. The company's best remaining argument is that federal law should preempt all of this — but that argument just lost in the highest court of one state.