Executive Summary: A growing set of loudly promoted “privacy” and “safety” policies quietly concentrates power in the hands of a few US big-tech platforms by turning the web into a maze of consent prompts and communication chokepoints, while increasingly binding identity to online activity.
These policies contrast with those that favor true open societies, democratic freedoms, and decentralized competition.
A New Federal Privacy Law
Congress is finally moving. After years of false starts, failed compromises, and trillion-dollar lobbying campaigns, the House Energy and Commerce Committee is preparing to release a national data privacy law that would finally harmonize the patchwork of state regulations by setting a unified federal standard for data protection. The stakes could not be higher. Done right, this law could be the most significant pro-consumer, pro-competition legislation in a generation. Done wrong, it will cement Big Tech’s dominance over online communication, commerce, and information.
Two Visions, One Moment
Two competing visions of the internet are colliding in this legislative moment. The first is the walled-garden model championed by a handful of trillion-dollar platforms (Google, Apple, Meta, et al.), which act as the gatekeepers of the digital public square.
These corporations harvest personal data at scale, unilaterally dictate the rules for online interaction, and increasingly abuse their dominant walled gardens (OS, browser, app stores and search) to unfairly make their adjacent businesses the default and sometimes exclusive options. According to these firms, “privacy” is achieved through enclosure: identity, consent, measurement, and messaging are routed through proprietary systems where the platform can see everything, approve everything, and charge everyone. “Privacy” for them is a technical design that strategically locks out rivals from competing on the merits within our digital economy. No one’s privacy is improved when a handful of essential platforms know everything about everyone and people have no choice.
The second vision is the one the internet was built on: an open, interoperable web where anyone can innovate, communicate, and compete without paying taxes to a trillion-dollar landlord. Privacy in this model is achieved through deidentified data, distinctions between the handling of sensitive and innocuous information, and enforceable rights, without turning the internet into a passport checkpoint.
The first vision entrenches power. The second unleashes it. The federal privacy law now being drafted will choose between these visions.
Regulators are at a Crossroads
Regulators are now at a crossroads. The US Congress is set to finally release the long-awaited federal privacy law, which will harmonize the handling of personal data across the fragmented patchwork of existing state regulations.
The state privacy laws that Congress is about to preempt were not written on a blank slate. They were written under intense lobbying pressure from the very platforms they purport to regulate. Look carefully at the exceptions endorsed by California’s Privacy Protection Agency (CalPrivacy).
CalPrivacy supports mandating Global Privacy Control (GPC), which contains a hidden “first-party” exemption, letting dominant platforms ignore consumers’ opt-out signals when processing data within their own businesses. The result? Under this logic, Google can harvest your data, and that of billions of other users, across its entire advertising empire (e.g., Search, YouTube, Maps, Shopping, Play Store, AI, Android, TVs, Chrome, and more), because it all happens as “first party.” The “consent” provided when setting up a Gmail account, a new TV, or a mobile phone, a setting in which no one really has a choice, becomes a get-out-of-jail-free card for Big Tech.
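To make the stakes concrete, here is a minimal sketch of what honoring GPC without a first-party exemption looks like. The GPC specification does define the HTTP request header `Sec-GPC: 1` (and the JavaScript property `navigator.globalPrivacyControl`); the function names and the decision logic below are illustrative, not drawn from any regulation.

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a GPC opt-out signal.

    Per the GPC spec, the request header field is "Sec-GPC" and the
    only defined truthy value is the string "1".
    """
    return headers.get("Sec-GPC", "").strip() == "1"


def may_share_or_sell(headers: dict, is_first_party: bool) -> bool:
    """Decide whether data from this request may be sold or shared.

    First-party status is deliberately ignored: the opt-out attaches
    to the processing itself, not to whose logo is on the page.
    """
    del is_first_party  # irrelevant under a risk-based reading
    return not gpc_opt_out(headers)
```

Under this reading, `may_share_or_sell({"Sec-GPC": "1"}, is_first_party=True)` returns `False`: the opt-out binds Google on google.com exactly as it binds a third party embedded there.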
This is not privacy protection. It is privacy theater. Whether your data is being used to manipulate your purchasing decisions or expose your political beliefs to the highest bidder, the harm does not diminish because the company doing it happens to also own the browser, the operating system, and the app store through which you accessed the content.
In contrast, the UK Information Commissioner’s Office (ICO) has published the unequivocal opposite view: first-party status is irrelevant to assessing data-handling risks. A privacy harm is a privacy harm, regardless of whose logo is on display.
The ICO’s approach is what all people should want, namely to assess risk based on the nature of processing, the sensitivity of data being handled, the safeguards in place, and the actual effects on people, rather than relying on a branding label like “first party.” Whether data handling happens inside one corporate boundary does not magically change the likelihood of discrimination, manipulation or warrantless disclosure to government agencies.
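The risk-based approach described above can be sketched in a few lines. The factor names, scales, and weights here are invented for illustration (they are not taken from ICO guidance); the point is structural: corporate ownership has no input into the score.

```python
from dataclasses import dataclass


@dataclass
class ProcessingActivity:
    """Illustrative risk factors for a data-processing activity."""
    data_sensitivity: int   # 0 (innocuous) .. 3 (special category)
    scale: int              # 0 (one-off) .. 3 (population-wide)
    safeguards: int         # 0 (none) .. 3 (strong, e.g. deidentification)
    adverse_effects: int    # 0 (none) .. 3 (discrimination, disclosure)
    is_first_party: bool = False  # recorded, but deliberately unused


def risk_score(p: ProcessingActivity) -> int:
    """Score risk from the processing itself.

    Note that is_first_party never appears: branding does not change
    the likelihood of discrimination, manipulation, or disclosure.
    """
    return p.data_sensitivity + p.scale + p.adverse_effects - p.safeguards
```

Identical processing scores identically whether labeled first- or third-party: `risk_score(ProcessingActivity(3, 3, 0, 2, is_first_party=True))` equals `risk_score(ProcessingActivity(3, 3, 0, 2, is_first_party=False))`.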
When Congress preempts these state laws, it has an extraordinary opportunity to strip out these Big Tech carve-outs and start fresh. The question is whether it will.
On-Device Distraction
The pattern repeats with on-device processing claims. Apple loudly champions “on-device” exemptions, arguing that keeping data local to the devices it sells, where only its own OS, browser, and apps can access it, magically solves privacy problems. But “on-device” is not a synonym for “non-personal” or “non-risky,” and it does not equate to deidentified or anonymized. Accordingly, the UK ICO rightly rejects these carve-outs, recognizing that they do not meaningfully reduce systemic risks when the same company still controls the software ecosystem and the profit incentives behind restricting rivals.
These regulatory splits are not accidents. They reflect raw economic incentives. Centralized platforms make their money by keeping consumers and data locked inside proprietary silos. Seamless, high-quality, real-time exchanges between decentralized services would let competitors talk to one another and would directly threaten these monopoly moats.
The EU’s Digital Markets Act (DMA) Article 6 was drafted to address exactly this problem. It demands that dominant players stop restricting interoperability, real-time communication, and data portability, so that both consumers and businesses can communicate directly without interference from a dominant gatekeeper. Real-time, high-quality communication between independent services reduces switching costs, prevents lock-in, and enables competition on the merits. Alive to the risk that a platform used by millions of people holds a major advantage over start-ups and rivals, the EU also imposes an anticircumvention obligation under Article 13 of the DMA, which obliges platforms to protect and anonymize data and to enable business-to-business communications.
Privacy Friction
Big Tech platforms hate these policies for the same reason they love mandatory, annoying consent pop-ups. These pop-ups impose real costs on everyone, but the costs are not evenly distributed. This engineered friction is not an unintended side effect; it is an intentional strategy. Increasing friction for all rivals is a hidden barrier that keeps consumers locked within a platform. When regulators unintentionally mandate friction at scale, they centralize attention and monetization in the large platforms, which are immune from this incessant annoyance. As a result, regulators increasingly recognize that consent pop-ups are privacy theater rather than true safeguards.
The U.S. Department of Justice (DOJ) has named this tactic: “pretextual privacy.” In its antitrust case against Apple, the DOJ described how the company “deploys privacy and security justifications as an elastic shield that can stretch or contract to serve Apple’s financial and business interests.” Pretextual privacy claims have become dominant platforms’ favorite cloak to obfuscate their anticompetitive conduct.
The Seven Most Common “Privacy-Washing” Policies
Regulators do not have to choose between privacy and an open web. But they do have to distinguish between privacy protections and privacy pretexts. The difference shows up in market structure. When a policy relies on constant prompts laced with dark patterns, forced logins, proprietary protocols, or “trust us, it’s on-device” claims, its exemptions must be questioned.
Nowhere is the above privacy washing clearer than in the top seven policies Big Tech supporters peddle to regulators and the public under the guise of being a noble defense against fraud or harm to children.
- “First-party” exemptions that weaken “universal” opt-out signals. Global Privacy Control (GPC) and other Opt-out Preference Signals (OOPS) frequently contain dark patterns in the disclosures they make, and exemptions they omit from disclosures, deceptively steering consumer choices. If the rule is “opt-out unless the company calls it first party,” the biggest companies will simply expand what counts as “first party” via single sign-on, acquisitions, embedded services, and cross-property integration. Any universal or global signal must focus on the consumer’s intent about the online experience they prefer, rather than hinging on artificial distinctions based on corporate ownership.
- “On-device” processing carve-outs. Local processing does not reduce the risks of handling data, but it does restrict which organizations, beyond OS and browser vendors, can compete. When “on-device” becomes a regulatory shortcut, it privileges those who can dictate default settings, regardless of the risk their own data handling poses.
- Mandatory consent pop-ups produce fatigue instead of true protection. When every site must interrupt every visit, “consent” becomes meaningless. The public pays in annoyance, small publishers pay in lost revenue, and platforms gain comparative advantage.
- Identity authentication for ordinary access to information. Policies framed as “fraud prevention” or “trust and safety” too often normalize “show me your papers to browse.” This undermines anonymous reading, chills free speech, and pushes users toward identity-gated platforms whose login-gated ecosystems too often aid warrantless government surveillance.
- Age-gating requirements that transform the entire web into a club with a bouncer. Protecting children online is an important goal. But blunt age-gating regimes can be a cloak for misuse: they force identity collection from everyone, expand data retention, and create new opportunities for privacy invasion and breach. Such blanket policies hide the platforms’ motivation to capture even more identity data and control over online activity.
- Restrictions on interoperability and real-time communications justified as “security.” When messaging, payments, or device capabilities cannot interoperate, consumers are locked into a single ecosystem. Security should be achieved through standards, audits, and liability, rather than through exclusivity.
- Denigrating and degrading digital object identifiers while expanding platform identity-based targeting. “We’re eliminating cookies” sounds at first like a privacy-forward message, until one realizes that its advocates rely on their own identifier-based software to monetize their services, while using consumers’ directly identifiable information as a common match key with their advertiser customers.
In practice, each of these policies forces consumers to disclose their identity, surrender anonymity, and pay higher prices for the online services they use. These policies do not protect privacy but do protect market power.
Google’s Privacy Sandbox offers the clearest case study. Marketed as a bold move to kill third-party cookies to somehow “improve privacy,” it was accompanied by Google quietly expanding its use of individuals’ real-world identities as common match keys through its Customer Match service under the secret code-name “Project Narnia.” The result? Google interfered only with rivals’ safer match keys while supercharging its own closed advertising ecosystem. This privacy theater is now being rerun under the guise of GPC.
The Regulatory Challenge
Regulators, lawmakers, and the public must stop falling for the misleading and deceptive practices of the platforms. Tactics such as “first-party” exemptions, “on-device” carve-outs, engineered privacy friction, and restricted interoperability are key tools in the platforms’ attempts to use spurious privacy claims to shore up their monopolies.
The open web is a proven engine of competition, innovation, and democratic discourse. Decentralized, open-standard protocols that support real interoperability are not radical experiments; they are the practical tools that let new voices enter the conversation without first paying rent to Big Tech. We should be writing laws to protect it, not laws that hand its keys to the platforms already charging tolls at the gate.
The choice before us is simple. We can keep polishing the elastic shield until the internet belongs to the firms with trillion-dollar market caps. Or we can demand policies that tear down unnecessary authentication walls, enforce genuine interoperability, and let the open web breathe again. The future is still writable. Let’s make sure it is written in open code, not proprietary fine print.
The Moment is Now
Congress’ Energy and Commerce Committee is preparing legislation that would finally create a national data privacy standard, expected to broadly preempt state laws. (https://www.politico.com/news/2026/04/16/gop-national-privacy-law-technology-00876794)
This is the moment to demand that Congress distinguish between privacy protection and privacy pretext. Between policies that protect personal data and policies that hand Big Tech a legally mandated moat. Between a law written for the open web and a law written for the platforms that are undermining it.
The internet was built as a commons. It has been slowly enclosed, one privacy exemption at a time, one consent pop-up at a time, one “on-device” carve-out at a time. The federal privacy law heading toward Congress is either the tool that reverses that enclosure, or the final nail that seals it.
As the world waits to see Congress’ national privacy law, we shall soon see which, if any, of the privacy-washing policies above are finally eliminated, to the benefit of all consumers and businesses.