The Massachusetts Supreme Judicial Court has decided that Meta Platforms cannot simply walk away from a lawsuit accusing it of deliberately engineering Instagram to hook children. The ruling, handed down unanimously, clears the way for the state's attorney general to press her case in court — a significant moment in what has become a widening legal reckoning for the social media industry.
At the center of the case is a question that has been building for years in courtrooms and legislative chambers across the country: when a technology company designs a product that exploits the psychological vulnerabilities of young people, does it bear legal responsibility for the harm that follows? Massachusetts says yes, and its highest court has now agreed that the question deserves a full hearing.
Justice Dalila Argaez Wendlandt, writing for the unanimous court, drew a careful and consequential distinction. The lawsuit brought by Attorney General Andrea Joy Campbell is not, the court explained, a challenge to anything users posted on Instagram. It is a challenge to the choices Meta made in building the platform itself — choices the state alleges were designed to maximize engagement among children in ways that exploited how young minds develop, and that misled consumers about how safe the app actually was.
That distinction matters enormously because of Section 230 of the Communications Decency Act, the federal provision that has long shielded internet companies from lawsuits over content their users generate. Tech companies have leaned on that protection heavily, and successfully, for decades. But the Massachusetts court found that the attorney general's claims fall outside that shield — they are about product design, not user content, and the law does not protect Meta from those allegations.
The case does not stand alone. It is part of a broader national push by state attorneys general and private plaintiffs to force social media companies to answer for what critics describe as the deliberate manipulation of young users. A recent high-profile trial resulted in penalties against both Meta and Google, signaling that courts and juries are increasingly willing to scrutinize the industry's practices. Massachusetts is one of several states pursuing its own litigation, and the Supreme Judicial Court's ruling here adds momentum to that effort.
What the attorney general is ultimately arguing is that Instagram was not simply a neutral platform that happened to attract young users — it was, she contends, a product engineered to keep them scrolling, to exploit the particular susceptibility of adolescent brains to social feedback and reward loops, and to do so while the company publicly minimized concerns about the app's effects on children's wellbeing.
Meta has not conceded any of this, and the case is far from over. The ruling means only that the lawsuit can proceed — that the claims are legally viable enough to be tested in court. The company will have every opportunity to contest the evidence and the legal theory as the litigation moves forward.
Still, the decision carries weight beyond its immediate legal effect. State supreme courts do not often weigh in on the boundaries of federal internet law, and a unanimous ruling from Massachusetts carries the kind of authority that other courts notice. As similar cases work their way through the system in other states, the reasoning Justice Wendlandt laid out may prove influential.
For now, the attorney general's case moves ahead. The harder questions — what Meta actually knew, what it actually designed, and what harm actually resulted — will be fought out in the proceedings to come.
Notable Quotes
The lawsuit does not challenge Meta for content posted by users, but for its own design decisions that allegedly exploited children's developmental vulnerabilities and misled consumers about Instagram's safety.
— Justice Dalila Argaez Wendlandt (paraphrased), Massachusetts Supreme Judicial Court
The Hearth Conversation
Another angle on the story
What makes this ruling different from the dozens of other lawsuits that have been filed against social media companies over the years?
Most of those cases ran into a wall called Section 230 — the federal law that protects platforms from liability for what users post. This court found a way around that wall by focusing on the design of the product itself, not the content on it.
So the argument is that Meta built something harmful, not just that harmful things happened on it?
Exactly. The attorney general isn't saying Instagram hosted bad content. She's saying Instagram was engineered to be addictive to children — that the architecture itself is the problem.
Why does that distinction matter legally?
Because Section 230 only shields companies from liability tied to user-generated content. If the harm comes from the company's own design choices, the shield doesn't apply. That's what the court confirmed here.
Is this the first time a state high court has made that call?
The ruling is described as unprecedented at the state supreme court level, which suggests other courts have been more cautious or haven't reached the question directly.
What does it mean for Meta practically — are they in serious trouble?
Not yet. The ruling just means the case can proceed. Meta still gets to contest everything — the facts, the science, the legal theory. But they can no longer argue the lawsuit should be thrown out before it starts.
And the broader picture — is Massachusetts an outlier here or part of something larger?
Part of something larger. Multiple states are running parallel cases, and there's been at least one major trial that resulted in penalties against Meta and Google. The legal landscape is shifting.
What should we be watching for next?
How the case develops in discovery — that's where we'd learn what Meta's internal research actually showed about how Instagram affected young users. That evidence, if it surfaces, could matter well beyond this one lawsuit.