Meta pulls Facebook ads for social media addiction clients

"Blocking the ads doesn't make the underlying harms disappear."
— Emily Jeffcott, Morgan & Morgan attorney, responding to Meta's decision to pull social media addiction lawsuit ads

A woman in California recently won $6 million in damages after a jury found that her childhood addiction to social media had caused her real harm — with Meta on the hook for 70 percent of that figure and Google for the rest. It was a landmark verdict, and it opened a door that Meta's legal team can see a long line of plaintiffs already walking through.

Meta's response, it turns out, was not only to appeal or settle. It was also to go after the advertising pipeline that feeds those lawsuits in the first place.

The company has begun pulling ads placed by law firms on Facebook, Instagram, Threads, and its broader Audience Network — ads that sought out potential clients for social media addiction litigation. According to Axios, firms including Morgan & Morgan and Sokolove Law saw dozens of their ads deactivated across Meta's platforms. The move was quiet, but its implications were not.

Meta made its reasoning plain in a public statement: the company said it would not allow trial lawyers to profit from its platforms while simultaneously arguing those platforms cause harm. It's a pointed formulation — one that frames the issue as hypocrisy rather than suppression. If these firms believe Meta is dangerous, the argument goes, why are they paying Meta for ad space?

Emily Jeffcott, an attorney at Morgan & Morgan, rejected that framing entirely. She called the ad removals another attempt by Meta to control the story and sidestep accountability. The energy Meta is spending on blocking these ads, she said, would be better directed toward building tools that actually reduce harmful use and toward detecting and removing users under the age of 13 — a persistent problem the company has long struggled to address. Blocking the ads, she added, doesn't make the underlying harms disappear. It just makes it harder for victims to find representation.

Meta's advertising standards do give the company broad discretion here. The policy explicitly reserves the right to remove ads that negatively affect its relationship with users or that promote content contrary to its competitive interests. That language is wide enough to cover almost anything, and Meta appears to be using it.

As of last Friday, the crackdown was incomplete. Several ads from Morgan & Morgan remained visible in Meta's own Ad Library — including one that lists the potential harms of social media use and positions the firm as fighting on users' behalf. Whether those ads survive the coming days is an open question.

The California verdict was not the only recent blow. In March, a New Mexico court ordered Meta to pay $375 million after finding the company had misled users about how safe its platforms were for children. A jury determined that Meta's platforms had endangered minors, exposed them to sexually explicit material, and facilitated contact with sexual predators. That judgment, combined with the California addiction case, signals that the legal environment around social media and child safety has shifted in ways that are difficult to reverse.

Taken together, these cases suggest that what was once a theoretical liability — that a social media company could be held responsible for the psychological damage its products cause to young users — is now a demonstrated one. The law firms running these ads are not chasing a long shot. They are recruiting clients for a category of lawsuit that has already won in court, twice, in the span of a few months.

Meta's decision to cut off that recruitment pipeline may slow the flow of new cases. It will not stop it. The firms will find other channels, and the plaintiffs are already out there. What comes next is a question of scale: how many similar cases are waiting, how courts will handle them, and whether Meta's legal exposure is a contained problem or something much larger.

"We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."
— Meta, in a public statement

"The resources Meta is devoting to blocking these ads would be better spent improving user safety and detecting users under age 13."
— Emily Jeffcott, attorney at Morgan & Morgan
The Hearth Conversation: another angle on the story

Inventor

Why would Meta pull these ads now, after the verdicts rather than before?

Model

Because before the verdicts, the lawsuits were a theoretical threat. Now they're a proven one. Two courts have found Meta liable. The ad pipeline suddenly looks like a direct line to more of the same.

Inventor

Is there a legal basis for pulling the ads, or is this just Meta exercising raw platform power?

Model

Both, really. Meta's ad policies give it enormous discretion — it can remove ads that conflict with its competitive interests. That's a clause broad enough to cover almost anything. Whether it's legally defensible is a different question from whether Meta has the technical ability to do it.

Inventor

The attorney from Morgan & Morgan said blocking ads doesn't make the harms go away. Is that the core tension here?

Model

That's exactly it. Meta is treating this as a narrative and recruitment problem. The firms are saying it's a safety problem. Those are two very different diagnoses of the same situation.

Inventor

What does it mean that some ads were still active in Meta's own Ad Library after the crackdown?

Model

It suggests the enforcement is either incomplete or selective. Either Meta's systems haven't caught everything, or the company is being careful about how aggressive it appears. A total blackout would look worse than a partial one.

Inventor

The New Mexico judgment was $375 million. The California award was $6 million. Why the difference in scale?

Model

Different cases, different claims. New Mexico was about systemic deception regarding child safety — a broader, more institutional failure. California was about one person's specific harm from addiction. The individual damages are smaller, but the precedent may be more far-reaching.

Inventor

If you're a law firm running these ads, what do you do next?

Model

You move to other platforms, you run television spots, you rely on word of mouth and press coverage. Meta can control its own ad inventory. It can't control the story.

Read the original story at BBC.