According to the education site The 74, at least eleven school districts around the country have sued the owners of platforms such as Snap, Instagram, YouTube, and TikTok, seeking financial compensation for the “increased mental health services and training they’ve ‘been forced’ to establish” as a consequence of student use of social media.

Among school systems filing suits are those of Seattle, Mesa (AZ), Bucks County (PA), and San Mateo County (CA), as well as districts in New Jersey, Alabama, Kentucky, and elsewhere. Cash demands aside, the schools say they want to negotiate a settlement with the platforms to change how they operate.

These are bad lawsuits that courts should reject. The suits’ announced goal (regulating social media use by students) is in itself debatable, but the means employed (damage lawsuits to recoup public agency expenditures) vaults the whole thing into the realm of the absurd.

As you are probably aware if you follow these issues, lawmakers at both state and federal levels have been advancing proposals lately to regulate social media in the name of protecting minors from its bad influence. Among the goals are to require age verification and so-called age-appropriate design for social media and other sites; ban or severely restrict social media access below some threshold age; and impose age exclusions and controls on particular kinds of content, including content feared to have bad psychological effects. These include causing teens or tweens to develop symptoms of depression or anxiety, feel inadequate or excluded, or simply spend more time online than is good for them, all said to contribute to what is often characterized as a “mental health crisis” among students.

One reason all these proposals have been highly controversial is that each carries obvious big downsides, including (liberty aside) worries about overbreadth, workability, and the opportunity that discretion over what is deemed sensitive affords to censor content disfavored for other reasons.

A recent paper by my Cato colleague Jennifer Huddleston outlined some major objections to the proposals. To begin with, simply cutting off minors from social media use entirely would ban a great deal of communication almost everyone concedes to be wholesome, such as staying in touch with relatives and coordinating with peers on school events, sports, and classroom projects. Most regulatory schemes rely on age verification requirements, which, when not easily evaded, can result in providers amassing databases of sensitive and readily misused personal information, such as images of drivers’ licenses or student ID cards.

Meanwhile, the measures most closely tailored to the perceived problem, such as walling off younger users from discussions of sensitive topics like self-harm, substance abuse, violence, or eating disorders, can block the very kids who need them most from resources that could help them cope with or understand crises close to home.

Especially if inadvertent breaches are met with steep penalties, it’s inevitable that some content providers will decide to play it safe by not making certain kinds of sensitive content available at all, even to adults. For that reason and others, these measures would in practice curtail the speech of adults as well as minors. The Supreme Court has twice invoked the First Amendment to strike down Internet child safety laws on such grounds.

In short, even advocates should be willing to concede that there are complicated and not entirely predictable trade-offs here; it’s simply not plausible that social media executives could wish them all away just by vowing to be more public-spirited, to forgo ad revenue, or to put more effort into moderating content. The lawsuits, of course, present the platforms as cartoon villains knowingly inflicting harm, because that’s how lawsuits work.

Maybe you could make a case that elected legislatures have the kind of public legitimacy and wisdom to strike a proper balance between censorship and risk while using open public processes to draw on sophisticated input from many stakeholders. I don’t believe that, but you might.

But it’s thoroughly insane to think the proper way to resolve these issues is by way of a cash-driven litigation settlement negotiated behind closed doors to satisfy trial lawyers representing school districts, all propelled by the chance of billion-dollar awards should the cases someday get to a jury.

The irrationality doesn’t stop there. Even if you buy into the sweeping claims of psychological damage, it’s utterly bizarre to make the school districts here into the legal victims. Imagine the implications of giving public schools a go-ahead to recoup money from any outside actors they think made students more unmanageable and psychologically needy, from video games to so-called consumer culture to – let’s face it – parents.

If you peruse a typical one of these lawsuits, you find it has trouble coming up with any particular action platform operators took that in fact violated any law on the books, especially in light of Section 230, which generally bars holding platforms liable for user-created content. Instead it falls back on the notion, not a part of the historic civil law, that social media is a “public nuisance” – earlier lawsuits in the recoupment genre have tried out this idea on guns, fossil fuels, vaping, and opioids, you get the picture – as well as newly posited “duties of care” meant to conjure tort negligence liability out of thin air plus indignation.

Let’s be frank about the origins of lawsuit campaigns like these: they’re cooked up in the offices of lawyers and consulting experts and sold to public administrators. The article in The 74 makes it sound almost as if school districts came up with the idea of suing on their own. But in fact, the suits are pitched to them on a no-win, no-fee basis, often at conventions and the like. What have you got to lose? Just sign our client contract and we’ll handle the rest.

Once upon a time, legal ethics canons prohibited lawyers from representing government clients on a contingency fee basis. That was a good rule.