Meta must face Massachusetts social media addiction lawsuit, judge rules

A Boston judge refused to dismiss Massachusetts claims that Meta engineered Instagram to hook teenagers, finding the state's theory — that designing for addiction is a legal wrong — plausible enough to go forward.

A Boston judge has cleared the way for Massachusetts to take Meta to court over allegations that the company deliberately engineered Instagram to hook teenagers — and then lied to the public about what it knew.

Suffolk County Superior Court Judge Peter Krupp issued the ruling publicly on Friday, rejecting Meta's motion to dismiss the case brought by Massachusetts Attorney General Andrea Joy Campbell. The suit, filed in October 2023, accuses Meta of violating the state's consumer protection law and of creating a public nuisance through the design choices it made on Instagram.

At the heart of the complaint is a specific and serious charge: that Meta did not stumble into a youth mental health crisis, but helped engineer one. The state argues the company knowingly built features into Instagram that were designed to maximize engagement among young users in ways that fostered compulsive, addictive behavior — and that while doing so, it misled the public about the risks its platform posed to the mental health of teenagers.

Massachusetts was among a small number of states that chose to file their cases in state court rather than joining the broader federal litigation that has accumulated against Meta over similar concerns. That strategic choice matters. State courts can apply their own consumer protection statutes, which sometimes offer different — and in some cases stronger — grounds for holding companies accountable than federal law alone.

By surviving the motion to dismiss, the case now moves toward the discovery phase, where the state will have the opportunity to compel Meta to produce internal documents, communications, and research that could illuminate what the company knew about Instagram's effects on young users and when it knew it. That process alone can be consequential — prior litigation against Meta has already surfaced internal research suggesting the company was aware of harms to teenage girls and chose not to act.

Neither Meta nor a spokesperson for Attorney General Campbell offered comment in the immediate aftermath of the ruling.

The case is one thread in a much larger national reckoning over the responsibilities social media companies bear toward the youngest members of their user base. Dozens of states, school districts, and individual families have pursued legal action against Meta and other platforms, arguing that the design of these products — notification systems, infinite scroll, algorithmic amplification — is not incidental to the harm but central to it.

What comes next in Massachusetts will be worth watching closely. If the state's claims survive further legal challenges and reach trial, it would mark one of the more significant courtroom tests yet of whether a social media company can be held liable not just for what its platform contains, but for how it was built.

— Associated Press
The Hearth Conversation
Another angle on the story
Inventor

What actually makes this ruling significant — courts reject motions to dismiss all the time, don't they?

Model

They do, but the threshold here matters. Meta argued the claims shouldn't even be heard. The judge disagreed, which means the state's theory of liability — that designing for addiction is a legal wrong — is at least plausible enough to go forward. Surviving dismissal doesn't decide the merits, but it opens the door to discovery.

Inventor

What is the state actually alleging Meta did?

Model

That Instagram's features were deliberately engineered to keep young users compulsively engaged, and that Meta knew this was damaging to teenage mental health while publicly saying otherwise.

Inventor

Why did Massachusetts file in state court instead of joining the federal cases?

Model

State consumer protection law can offer different tools. It's a strategic choice — some states have statutes that are harder for companies to deflect than federal claims.

Inventor

What does discovery look like from here?

Model

The state can now demand internal documents — research, memos, product decisions. That's often where these cases get their teeth. We've already seen what leaked Meta research looked like. Compelled discovery could go deeper.

Inventor

Is there a precedent for a social media company actually losing one of these cases at trial?

Model

Not yet at this scale. Most have settled or are still working through the courts. That's part of why a Massachusetts trial, if it gets there, would be genuinely new territory.

Inventor

What's the public nuisance claim about?

Model

It's an older legal theory — usually applied to things like pollution or unsafe buildings — being stretched to cover algorithmic harm. The argument is that Instagram's design damaged the broader community, not just individual users.

Inventor

And Meta had no comment?

Model

Nothing immediate. Which is itself a kind of answer — there's not much to say when a judge just told you the case is going forward.
