Privacy

Does the US SECURE Data Act address Big Tech’s privacy-washing carve-outs hidden in state privacy laws?

On 22 April 2026, the House Energy and Commerce Committee published the national SECURE Data Act, H.R. 8413. (https://d1dth6e84htgma.cloudfront.net/SECURE_Data_Act_for_introduction_7c80a347ac.pdf). The bill establishes a federal privacy floor intended to replace the current patchwork of 20 state laws, extending consumer data protections to all Americans. Its stated aim is to end “the confusing and ineffective privacy patchwork currently in place.”

Regulators and advocates worldwide have grown increasingly alert to how large platforms exploit ill-defined notions of “privacy” and “security” to entrench their market positions, restrict real-time online communication, influence democratic elections, slow innovation, and suppress competition. 

The Movement for an Open Web (MOW) has documented for years how lobbied exemptions within state-level privacy legislation have become a vehicle for competitive distortion rather than genuine consumer protection. MOW has also written draft Data Governance and Accountability Principles to help guide policymakers in truly improving privacy online without centralizing greater control over the internet in Big Tech’s hands. (https://movementforanopenweb.com/mow-data-governance-and-accountability-principles) In our analysis of the internet at a crossroads, we identified seven recurring “privacy-washing” techniques embedded in US state laws (https://movementforanopenweb.com/the-internet-at-a-crossroads-will-americas-privacy-law-protect-you-or-big-tech). These anticompetitive definitions and carve-outs from compliance transfer control over the internet to trillion-dollar platforms while burdening independent publishers and smaller competitors with disproportionate compliance costs.

Now Congress has released the SECURE Data Act, which would preempt all of those state laws. The critical question is: how well does it actually fix these underlying issues?

On close reading, the answer is that only one of the seven privacy-washing measures we called out in that article would remain under the new law.

Where the SECURE Data Act Gets It Right

Meaningful clarity around personal and non-personal data. One of the most important advancements in the SECURE Data Act is its definition of deidentified data. The bill defines deidentified data as data that “cannot reasonably be linked to an identified or identifiable individual or a device linked to an individual.” It requires public commitments against re-identification and mandates contractual prohibitions on downstream partners attempting to re-identify. This is a welcome improvement over the ambiguity of most state equivalents. Crucially, it correctly recognizes that whether data is “personal” is relative, judged by reference to the organizational measures of the recipient rather than treated as some invisible property inherent in the data itself.
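
To make that relative-linkability point concrete, here is a minimal, hypothetical Python sketch. The field names, generalization rules, and thresholds are invented for illustration and appear nowhere in the bill.

# Hypothetical illustration of "reasonably linkable": whether a released record
# is deidentified depends on what a recipient could re-link it to, not on any
# property of the record alone. Field names and rules are invented for this sketch.

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers before release."""
    released = dict(record)
    for direct_id in ("name", "email", "device_id"):
        released.pop(direct_id, None)                         # remove direct identifiers
    if "zip" in released:
        released["zip"] = released["zip"][:3] + "XX"          # coarsen ZIP to a region
    if "age" in released:
        released["age"] = f"{(released['age'] // 10) * 10}s"  # bucket age, e.g. "40s"
    return released

row = {"name": "A. Reader", "email": "a@example.com", "zip": "94105", "age": 47}
print(deidentify(row))  # {'zip': '941XX', 'age': '40s'}

# A recipient holding an auxiliary dataset keyed on exact ZIP and age could still
# re-link records in sparse populations, which is why the Act pairs the technical
# test with public commitments and downstream contractual bans on re-identification.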

Proportionate consent architecture. The bill takes a structurally sensible approach, restricting uses of personal data to those that were disclosed, or reasonably necessary, at the time of collection.

Eliminating mandatory consent popups. The bill corrects California-inspired state laws that mandate interactive consent for routine online navigation. Such popups were never a genuine privacy protection; they were friction engineering. Mandatory consent banners fall most heavily on smaller, infrequently visited digital publishers, who must annoy visitors on every visit, while trillion-dollar platforms with recurring visitors benefit from asking once and never again, despite their ever-evolving service offerings.

The SECURE Data Act adopts an opt-out model, requiring disclosure before collection (via a privacy policy) but not interactive consent dialogs where the risks from data handling are reasonably remote. Sensitive data, on the other hand, requires genuine opt-in, which is a proportionate distinction. This approach substantially addresses the consent popups that increasingly plague the online experience and are widely recognized as engineered annoyance rather than an effective mechanism for consumer protection.
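
As a minimal sketch of that two-tier structure, assuming just two invented data categories and our own function names (none of this is the bill’s text), the logic reduces to roughly this:

# Minimal sketch of the opt-out/opt-in split described above. The category
# names, dataclass, and defaults are our own illustration, not the bill's text.

from dataclasses import dataclass

@dataclass
class Visitor:
    opted_in_sensitive: bool = False  # opt-in: off until the visitor grants it
    opted_out: bool = False           # opt-out: on only if the visitor objects

def may_process(category: str, disclosed_in_policy: bool, visitor: Visitor) -> bool:
    if category == "sensitive":
        return visitor.opted_in_sensitive                 # genuine opt-in required
    return disclosed_in_policy and not visitor.opted_out  # disclosure plus opt-out

print(may_process("sensitive", True, Visitor()))               # False: no opt-in yet
print(may_process("ordinary", True, Visitor()))                # True: disclosed, no objection
print(may_process("ordinary", True, Visitor(opted_out=True)))  # False: opt-out honored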

Protecting the Open Web from OOPS. The SECURE Data Act’s treatment of “universal” opt-out preference signaling (OOPS) mechanisms, like Global Privacy Control (GPC), is a welcome step forward.

Section 10 directs the Secretary of Commerce to publish a report analyzing commercially available technologies that consumers could use to send genuine universal opt-out signals. The report is instructed to take a balance-of-interests approach, ensuring such signals are consumer-friendly and communicate meaningfully informed consumer choice, while not unduly burdening responsible and beneficial data processing.

This protection cannot come fast enough. California’s CPPA rules and the new Opt Me Out Act (AB 566) require businesses that rely on third parties to operate to honor GPC, while exempting the identical data handling practices when performed by larger, vertically integrated businesses. Under California’s framework, a publisher must honor GPC signals while Apple and Google can ignore the identical signal when the data flows within their own ecosystems. The SECURE Data Act preempts such blatantly anticompetitive distortions embedded within current state privacy rules.
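
For context, GPC itself is technically simple: a participating browser attaches a Sec-GPC: 1 request header to outgoing requests and exposes the same choice to page scripts as navigator.globalPrivacyControl. A minimal server-side sketch, with an invented in-memory store standing in for real persistence, might look like:

# Minimal sketch of honoring a Global Privacy Control (GPC) signal server-side.
# Participating browsers send the "Sec-GPC: 1" request header; the same choice
# is exposed to page scripts as navigator.globalPrivacyControl. Everything else
# here (the store, the function names) is illustrative, not normative.

OPTED_OUT: set[str] = set()  # hypothetical stand-in for a persistent opt-out store

def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """True if the request carries a valid GPC signal (the spec allows only "1")."""
    # Real frameworks normalize header case; this plain dict lookup does not.
    return headers.get("Sec-GPC", "").strip() == "1"

def apply_privacy_signals(headers: dict[str, str], visitor_id: str) -> None:
    if gpc_opt_out_requested(headers):
        OPTED_OUT.add(visitor_id)  # treat as a do-not-sell/share opt-out

# Example: a request from a GPC-enabled browser
apply_privacy_signals({"Sec-GPC": "1"}, visitor_id="anon-123")
assert "anon-123" in OPTED_OUT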

Room for Improvement

The First-Party Exemption Problem Remains Unaddressed. We have argued at length that “first-party” exemptions are the most dangerous privacy-washing mechanism within current state laws. Under California’s framework, GPC opt-out signals come with an implied carve-out that lets a company ignore them when processing data “within its own business.” The result is that Google can honor your opt-out on third-party sites while continuing to harvest and profile you across Search, YouTube, Maps, Android, Chrome, and the Play Store, because all of that is classified as “first party.”

The SECURE Data Act does not fix this. Its definition of “sale” expressly exempts intra-company data transfers, meaning data can flow freely across an entire corporate family without triggering the consumer’s right to opt out. An “affiliate” is defined to include entities sharing common branding or common control. For a company like Apple or Google, whose empire spans dozens of products under one login identity, this carve-out is operationally vast and extraordinarily profitable. The Act would codify this anti-privacy, anticompetitive first-party exemption into federal law.
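
To see how operationally broad that carve-out is, here is a hypothetical sketch of a “sale” trigger under a common-branding/common-control affiliate test. The entities and helper functions are invented for illustration, not drawn from the bill’s text:

# Hypothetical sketch of the "sale" trigger under an affiliate carve-out like
# the one described above. Entity names and helpers are illustrative only.

from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    name: str
    brand: str   # shared consumer-facing branding
    parent: str  # ultimate controlling owner

def is_affiliate(a: Entity, b: Entity) -> bool:
    # "Affiliate" = common branding OR common control
    return a.brand == b.brand or a.parent == b.parent

def transfer_triggers_opt_out(sender: Entity, receiver: Entity) -> bool:
    # Intra-affiliate transfers fall outside the "sale" definition,
    # so the consumer's opt-out right never attaches.
    return not is_affiliate(sender, receiver)

search = Entity("Search", brand="BigCo", parent="BigCo Holdings")
video = Entity("Video", brand="BigCo", parent="BigCo Holdings")
indie = Entity("IndieNews", brand="Indie", parent="Indie Media")

print(transfer_triggers_opt_out(search, video))  # False: flows freely in-house
print(transfer_triggers_opt_out(search, indie))  # True: opt-out applies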

The correct framing is not who owns the organizations processing data, but what data is being processed and what risks that processing reasonably creates. Focusing solely on corporate ownership structure, rather than the nature of the processing and the reasonable risks to individuals, is not a pro-consumer distinction. It is a pro-platform one.

Age-related Rules Contain Some Ambiguity. For children under 13, the bill defers to COPPA, which carries an “actual knowledge” standard: a recipient of data has obligations only when it actually knows it is dealing with a child. This ensures that organizations need not impose still more popups or, worse, require all visitors to present government-issued identification merely to access online news, obtain basic information, or engage in standard commerce or communication services.

The SECURE Data Act adds a “teen” category covering ages 13-16 and prohibits targeting teens with advertising or selling their data, but it leaves open whether these obligations are triggered by an actual-knowledge standard or must be inferred constructively from content directed to teens. Congress should close this gap explicitly.

The Verdict

The preemption benefit. Despite its deficiencies in addressing all the anticompetitive and Big Tech exemptions within existing state data protection rules, the broad preemption of those laws in favor of a consistent set of rules is a net benefit.

The current state landscape has produced rules that are simultaneously burdensome to independent publishers and weak against dominant platforms. The SECURE Data Act is far better than the state alternatives, which mandate consent popups, unfairly impose obligations only on smaller businesses, and promote ambiguous definitions that do little to improve real data protection. This uniform federal law creates real obligations for the risky handling of personal data and establishes enforceable consumer rights. For this progress, Congress deserves much credit.

But on the provision that matters most for the open web, the first-party exemption, the bill preserves the problem. Intra-company data flows remain exempt from opt-out rights. Focusing solely on corporate ownership, rather than on what data is being handled and what reasonable risks arise from that processing, is not a pro-consumer distinction. Big Tech platforms profit from such exemptions, and they were not codified into state law by accident.

As MOW has consistently argued: more people surrendering more personal data to fewer companies does not improve privacy. It concentrates power.

Congress has a rare and historic opportunity to write a federal privacy law that genuinely benefits consumers and levels the competitive playing field. The SECURE Data Act is a framework on which that law could be built. But unless legislators remove the first-party exemption, they will have only partially improved the situation we face.

The internet’s future is still writable. Make sure your representatives know which parts of this script still need editing. 

Appendix: Comparative Framework — SECURE Data Act, CCPA, and GDPR

The comparison below summarizes the key differences among the federal SECURE Data Act, California’s CCPA, and the EEA’s GDPR.

Personal Data / Information
US – SECURE Data Act: §16. Reasonably linkable to an individual natural person.
California – CCPA: Reasonably linkable to an individual natural person or household.
EEA – GDPR: Reasonably linkable to an identified or identifiable natural person.

Non-Personal (Deidentified and Anonymous) Data
US – SECURE Data Act: §7. Defined as data that “cannot reasonably be linked to an identified or identifiable individual or a device linked to an individual.” Controllers must take reasonable measures to prevent re-identification, publicly commit not to re-identify, and contractually prohibit re-identification downstream.
California – CCPA: Defined as information that “cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked” to a particular consumer by the business possessing the data. Businesses must take reasonable measures to prevent re-identification, publicly commit not to re-identify, and contractually prohibit re-identification.
EEA – GDPR: Recital 26. GDPR uses “anonymous” rather than “deidentified,” emphasizing the data’s current and likely future state in the recipient’s hands rather than any previous state.

Anonymous Data
US – SECURE Data Act: §7. Not defined separately; functionally covered by the “deidentified data” definition.
California – CCPA: Not defined separately; functionally covered by the “deidentified data” definition.
EEA – GDPR: Recital 26 distinguishes anonymous information from personal data: if data is “rendered anonymous in such a manner that the data subject is not or no longer identifiable,” GDPR does not apply.

Sensitive Information
US – SECURE Data Act: §2. Sensitive information requires opt-in consent.
California – CCPA: Sensitive information requires opt-in consent.
EEA – GDPR: Art. 9. Both sensitive and non-sensitive personal data require opt-in consent or another legal basis.

Protected Classes: Children Under 13
US – SECURE Data Act: §2. “Child” is defined as under 13 and governed by COPPA, where an actual-knowledge standard applies.
California – CCPA: Children under 13 require parental consent.
EEA – GDPR: Art. 8 defines “child” as below age 16.

Protected Classes: Teenagers Under 16
US – SECURE Data Act: §2. “Teen” is defined as ages 13-16; sensitive data requires “verifiable parental consent,” and teens cannot be targeted with advertising or have their data sold. Whether an actual- or constructive-knowledge standard applies to teens is not yet defined.
California – CCPA: Teens (ages 13-15) are treated as adults. Reasonable constructive knowledge, such as content “directed to children,” is often used to determine applicability.
EEA – GDPR: Teens (ages 16-19) are treated as adults. Reasonable constructive knowledge, such as content “directed to children” or “likely to be accessed by children,” is usually applied.

First-Party Exemptions From Obligations
US – SECURE Data Act: §16. Explicit exemption for intra-affiliate transfers; “affiliate” includes entities sharing common branding or common control. Intra-company data flows are excluded from the “sale” definition.
California – CCPA: The CPRA added an explicit carve-out for corporate-family sharing (common branding or majority ownership). “Service providers” are also excluded.
EEA – GDPR: No blanket first-party exemption. All processing requires a lawful basis regardless of corporate structure.

Universal Opt-Out Preference Signals (OOPS, Like GPC)
US – SECURE Data Act: §10 requires a Secretary of Commerce study, within 3 years, of commercially available technologies, including browser settings, extensions, and device-level settings, that could enable universal opt-out signals. The report must consider feasibility and must not unduly burden lawful processing.
California – CCPA: Mandatory obligation on non-vertically-integrated businesses to honor valid opt-out preference signals, including GPC, when evidence exists that a consumer initiated the signal.
EEA – GDPR: No equivalent; unnecessary under GDPR, which starts from the assumption that opt-in consent or another legal basis is required to collect and process an individual’s personal data.

Obligation To Present Consent Popups
US – SECURE Data Act: Not required. Upfront consent is required only for sensitive data; disclosure, such as via a privacy policy, is sufficient notice.
California – CCPA: Consent popups required for any personal data exchanges, but not for the same processing within a vertically integrated business.
EEA – GDPR: Required under the ePrivacy Directive, yet the requirements for meaningfully informed consent cannot be practically applied to future B2B partners and most business purposes.

Legitimate Interest Exemptions
US – SECURE Data Act: §11 provides explicit exemptions from consent signals for:
1. fraud detection and prevention,
2. security incident response,
3. identity theft and harassment prevention,
4. internal research to develop, improve, or repair products, services, or technology,
5. performing contracts,
6. fulfilling warranties,
7. legal claims,
8. public or peer-reviewed scientific research (with IRB oversight),
9. law enforcement cooperation,
10. product recalls, and
11. fixing technical errors.
California – CCPA: Provides explicit exemptions from consent signals for:
1. fraud prevention and security,
2. debugging and error identification,
3. short-term transient use,
4. internal research and analytics, and
5. legal obligations.
Service providers may retain data for their own internal analytics.
EEA – GDPR: Art. 6 provides lawful bases other than consent:
1. legitimate interests,
2. legal obligation,
3. vital interests,
4. contract performance, and
5. tasks in the public interest or under official authority.
These are often used to cover equivalent use cases such as fraud prevention, security, and service improvement. Processors acting purely under controller instruction for these purposes are covered by data processing agreement (DPA) frameworks.

Financial and Health Care Exemptions
US – SECURE Data Act: §13(b) provides exemptions for both
1. financial institutions subject to Gramm-Leach-Bliley Act Title V and
2. HIPAA-covered entities and business associates.
Public data associated with motor vehicle records and substance use disorder records are also carved out.
California – CCPA: Provides exemptions for both
1. financial institutions subject to Gramm-Leach-Bliley Act Title V and
2. HIPAA-covered entities and business associates.
EEA – GDPR: Health and financial data remain covered; health data receives extra protection as a special category rather than an exemption from coverage.