By James Rosewell
Introduction
In today’s digital landscape, privacy is a fundamental human right. Yet consumer expectations vary widely, which requires solutions beyond one-size-fits-all concepts. Striking the right balance for responsible data processing always requires a proper assessment that weighs the potential risks of harm against societal benefits. The potential risk of harm from data processing depends on both the type and context of the data being processed. Mitigating privacy risks therefore demands a spectrum of options to meet diverse societal needs. While competition can enhance choices and privacy, Big Tech centralization stifles both. Improved transparency is vital for consumers to make informed choices, and clear guidelines for responsible data handling can simplify compliance for organizations. A new approach to user choice will be required. Below are MOW’s Data Governance and Accountability Principles to help guide policy makers in truly improving privacy online without centralizing greater control over the internet in Big Tech’s hands.
Core Tenets
- Privacy is a human right where consumer expectations exist on a spectrum.
- One-size-fits-all proposals do not adequately address consumer needs.
- The risk of harm depends on what data is being processed and its context, such as each organization’s security measures.
- The same data may be riskier in the hands of one party than in the hands of another.
- Varying consumer privacy expectations require a diversity of choices to adequately meet the needs of diverse segments in society.
- Privacy is improved by competition, rather than in conflict with it.
- There is a spectrum of consumer attitudes and preferences with regard to data collection and advertising practices.
- Big Tech’s technical interference with real-time, accurate interoperability restricts rivals from offering improved solutions to consumers and to businesses.
- Big Tech’s discriminatory definitions of “privacy” do not actually mitigate risk.
- On-device processing and server-side processing can equally harm individuals.
- First Parties can harm individuals just as much as Third Parties.
- Consumers and oversight agencies deserve improved transparency to exercise their choices regarding the collection, use, and disclosure of Personal Information.
- Providing simple, consistent and pragmatic guidelines for responsible data handling ensures easier compliance and establishes improved expectations for all.
- Applying a balance of interests to the collection and use of Personal Information is consistent with consumer expectations and is in consumers’ interests.
- This balance should consider whether the likelihood and severity of potential harm to individuals is outweighed by the broader positive benefits for society, e.g. economically efficient advertising practices.
- Businesses should be held accountable for breaking the law or their failure to use data consistently with the terms of their data handling practices.
- Monetization of consumer data for advertising purposes in exchange for content or services offered free to consumers is a legitimate business model that consumers should have a right to choose. When consumers are offered a charge to access digital properties, such charges should be reasonable and kept in check by competition.
- Individuals should have the right to a human review of all automated decisions that cause a substantive life impact (e.g., denying insurance coverage, rejecting a job application, or declining a loan application).
Data Governance & Accountability Principles
- MOW believes remedies must be tailored to each online harm.
- Harm 1 – Illegal use of Sensitive Data
- Remedy 1 – enhanced labelling of sensitive content to automate warnings to consumers, improving the transparency they need to exercise their choices regarding the collection, use, and disclosure of sensitive Personal Information.
- Consumers have a right to access, request correction and deletion, and limit the uses of sensitive Personal Information.
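The labelling-plus-automated-warning flow in Remedy 1 could be sketched as follows. The label taxonomy and the `warning_for` helper are illustrative assumptions, not a proposed standard:

```python
# Assumed sensitivity taxonomy; a real scheme would be set by regulation
# or an industry standard, not hard-coded like this.
SENSITIVE_LABELS = {"health", "finance", "biometric"}

def warning_for(content_labels):
    """Return an automated consumer warning when content carries any
    label from the assumed sensitive-data taxonomy, else None."""
    hits = sorted(SENSITIVE_LABELS & set(content_labels))
    if not hits:
        return None
    return ("This page processes sensitive Personal Information ("
            + ", ".join(hits) + "). Review your choices before continuing.")
```

Because the check runs on declared labels rather than the data itself, consumer software can surface the warning before any sensitive Personal Information is collected.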
- Harm 2 – Unwanted Reidentification
- Remedy 2 – enhanced labelling of local storage and support for transient, resettable auto-rotating interoperable random identifiers.
- Not all data is Personal Data. Privacy rules should be limited to use cases where data is or is likely to be linked to a specific, known individual.
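As an illustration of Remedy 2, a transient, resettable, auto-rotating random identifier might behave as in this minimal Python sketch. The class name, default TTL, and token size are assumptions for illustration only:

```python
import secrets
import time

class RotatingIdentifier:
    """Transient, auto-rotating random identifier (illustrative sketch).

    A fresh random value is generated whenever the current one is older
    than `ttl_seconds`, and the consumer can reset it at any time.
    """

    def __init__(self, ttl_seconds=24 * 60 * 60):
        self.ttl_seconds = ttl_seconds
        self.reset()

    def reset(self):
        # Cryptographically strong random token; no link to prior values.
        self._value = secrets.token_hex(16)
        self._issued_at = time.monotonic()

    def get(self):
        # Automatic rotation once the time-to-live has expired.
        if time.monotonic() - self._issued_at > self.ttl_seconds:
            self.reset()
        return self._value
```

Because each rotation draws an unrelated random value, the identifier stays interoperable across recipients within its lifetime but cannot be used to re-link the individual across resets.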
- Harm 3 – Unwanted Personalization
- Remedy 3 – consumers should be able to signal their preference regarding personalization to a group of recipients, without disclosing their identity, e.g. via consumer-software-based settings.
- Because consumers’ preferences may vary across groups, consumers should be able to override their preference signal on a group-specific or business-specific basis.
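A minimal sketch of how consumer software could evaluate such an overridable preference signal. The class, names, and precedence rules here are hypothetical, not a specification:

```python
from typing import Dict, Optional

class PersonalizationPreferences:
    """Global personalization preference with group- and
    business-specific overrides; the most specific setting wins."""

    def __init__(self, default_allow=False):
        self.default_allow = default_allow
        self.group_overrides: Dict[str, bool] = {}
        self.business_overrides: Dict[str, bool] = {}

    def set_group(self, group: str, allow: bool) -> None:
        self.group_overrides[group] = allow

    def set_business(self, business: str, allow: bool) -> None:
        self.business_overrides[business] = allow

    def allows(self, business: str, group: Optional[str] = None) -> bool:
        # Precedence: business override > group override > global default.
        if business in self.business_overrides:
            return self.business_overrides[business]
        if group is not None and group in self.group_overrides:
            return self.group_overrides[group]
        return self.default_allow
```

Only the resulting boolean signal needs to reach recipients, so the consumer’s identity and full preference table stay on their own device.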