The European Union’s new consumer-privacy regime has gone into effect, a fact you’re probably at least vaguely aware of thanks to the mountain of “privacy-policy change” notices piling up in your inbox. The new rules have many American privacy advocates gazing enviously across the pond. Yet there is ample reason to doubt that rules in the GDPR mold will yield meaningful benefits that justify the costs they impose.

The best argument for data-protection regulation has been that the current dominant approach to protecting privacy is a sham: Online platforms have long given users notice of how their data will be used in vague, legalistic and lengthy terms-of-service agreements, which users almost universally “consent” to by clicking “agree” without reading a word. The GDPR “solution” in some ways assumes the problem is that we haven’t given users enough fine print to read or buttons to click.

We’ve already had a preview of how well that approach works: You’ve probably visited a website that, in response to existing EU rules, throws up a banner forcing you to agree to its data policies or click through pages of options before proceeding. And, if you’re like most people, you’ve honed your reflexes to click through these minor annoyances as quickly and automatically as possible.

Like antibiotics, such notices may work when used sparingly, but tend to become ineffective when deployed indiscriminately. To be sure, the GDPR has plenty of other restrictions on how data is used. But when the law demands ritual box-checking even for ubiquitous and, to most of us, unobjectionable uses of data, users are conditioned to speed through the nuisance by simply clicking “agree.”

That doesn’t mean it is impossible to give users more robust and meaningful control over the use of their data. But what’s the most effective way to make privacy choices salient and intelligible? Generic regulations aren’t just ill-suited to solving that problem; they may be counterproductive. As Berkeley professors Kenneth A. Bamberger and Deirdre K. Mulligan report in their book “Privacy on the Ground: Driving Corporate Behavior in the United States and Europe,” regulation that focuses on formalistic methods such as lengthy click-through “consent” mechanisms can diminish the attention and resources companies devote to privacy and foster a “compliance mentality.”

On the other side of the ledger are the compliance costs of such regimes, which aren’t borne only by behemoths like Facebook and Google. The GDPR defines “personal data” broadly to include, among other things, routinely logged information like internet-protocol addresses. Countless companies (and nonprofits) that few of us would consider privacy threats are saddled not only with ensuring their data-use notifications satisfy EU standards, but also with developing mechanisms to handle requests to purge or provide data.

Those costs are a rounding error for a Google or a Facebook, but less so for smaller companies. And that is hardly the only way the rules tend to favor the digital economy’s lumbering dinosaurs over its scrappy mammals.

We’ve become accustomed to a cornucopia of free online content and services underwritten by advertising, and increasingly by targeted advertising fueled by data. That already gives an advantage to the biggest players with the most data to mine. The most vocal proponents of privacy regulation are often equally concerned about the disproportionate power of those big players, yet introducing more regulatory friction into the process of monetizing data is virtually guaranteed to give them still more of it.

There is one element of the GDPR worth copying: the requirement that data custodians notify users promptly in the event of a breach. Companies are notoriously averse to publicizing the fact that they’ve been hacked. Users at minimum need basic information about which companies are fulfilling their obligation to safeguard data and which aren’t.

With that exception, Americans shouldn’t be too eager to emulate our European cousins’ approach to data protection. Much of the rigmarole around boarding a plane since 2001 is justly derided as security theater—an elaborate performance that has more to do with reassuring travelers than detecting real threats. The GDPR is a similar form of privacy theater.