As part of its Privacy Sandbox project, Google is attempting to force its partners to deploy what is known as a Trusted Execution Environment (TEE). Under Google’s proposals, the processing of deidentified data for ad targeting via Sandbox’s Protected Audiences API (PAAPI) will take place in a TEE running on Google-mandated infrastructure. Google advocates this approach as the ultimate answer to user privacy and data security. However, this is a false claim. TEEs don’t enhance privacy as Google claims; they merely add cost and complexity for Google’s competitors.
A TEE is essentially a secure, independently auditable computing environment. Provided by a cloud hosting business, a TEE hosts code (in this case the PAAPI) alongside a mechanism that attests to the integrity of that code. This hardware-backed mechanism validates that the code being run is as described by the business operating it, supposedly ensuring that no funny business is going on behind the scenes.
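At its core, the attestation mechanism described above amounts to comparing a cryptographic measurement of the loaded code against a value the operator has published. The following is a greatly simplified sketch of that idea only; it is not the API of any real TEE (systems such as Intel SGX or AMD SEV-SNP additionally use signed quotes chained to a hardware root of trust), and the function names are invented for illustration.

```python
# Simplified illustration of code attestation: the environment "measures"
# (hashes) the code it loads, and a verifier compares that measurement
# against the value the operator published. Real TEEs add a hardware
# root of trust and cryptographically signed quotes; none of that is
# modelled here.
import hashlib


def measure(code: bytes) -> str:
    """Produce a 'measurement' of the loaded code (here, a SHA-256 digest)."""
    return hashlib.sha256(code).hexdigest()


def verify_attestation(code: bytes, published_measurement: str) -> bool:
    """Accept the environment only if the running code matches the published value."""
    return measure(code) == published_measurement


# The operator publishes the measurement of the code it claims to run...
deployed = b"def run_auction(signals): ..."
published = measure(deployed)

# ...and a verifier checks that the code actually running matches it.
print(verify_attestation(deployed, published))        # unmodified code passes
print(verify_attestation(b"tampered code", published))  # modified code fails
```

Note what this does and does not establish: the check proves the code matches what was published, but says nothing about whether that code, or the onward use of the data it processes, is itself trustworthy.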
Magical TEEs
According to Google, by placing the processing of data within this magical environment, the very nature of data processing is changed. Simply by moving where the processing happens, all risk is removed and privacy is preserved. Would that it were so simple!
Whilst a TEE may place some of the processing within a new environment, it doesn’t change the fundamental fact that the processing is taking place and, as such, laws still apply! If it is legal to transfer data from an organization to the TEE, then surely it is legal to transfer that same information to another processing environment too.
Moreover, a TEE itself cannot generate trust. Consider that Google is adamant it must dictate which infrastructure providers may be used by rivals. That mandate is an acknowledgement that it is quite possible for a bad actor to create something which looks and acts like a TEE but is instead using the data sent to it for nefarious purposes. Just because something says it is ‘trusted’ on the label doesn’t mean that this is always the case.
At the end of the day, the only thing that can create ‘trust’ in this situation is a contract between the data controller (the media owner from which the data originates) and the data recipient (Google). Without a contract there is no legally enforceable guarantee that data is being handled properly, either from the point of view of the media owner or of the end consumer whose data is involved. The fact is that it is a contract, not a piece of technology, that creates enforceable ‘trust’, making the imposition of TEEs redundant.
TEEs and personal data
The contract issue becomes more significant when you look at data protection law. A number of legal precedents establish that deidentified data is not personal data as long as the recipient of that data doesn’t have the means to legally reidentify it. That ability is normally defined in a contract – one which Google has acknowledged it will not sign in the case of its Sandbox*. As such, Google will retain the technical ability to reidentify the data and is thus processing personal data on its TEEs in likely breach of applicable data protection laws.
Google’s loosely worded TEE attestation promises that it will ‘deter entities from…reidentifying users across sites’ but it doesn’t commit that it won’t do so itself or indeed use the data for other purposes, such as enhancing its own advertising offerings.
Cost and complexity
TEEs are also problematic in terms of competition. As a recognised monopoly, Google should not have the right to leverage its dominance in order to distort competition. An example of this distortion would be the enforced imposition of additional costs and complexity onto competitors. As recognised by the CMA and ICO, TEEs are significantly more expensive than traditional approaches and add complexity to the process. By mandating both the use of TEEs as part of its Sandbox and the infrastructure they run on, Google is effectively forcing its competitors to use a more expensive and complicated approach without any countervailing public benefit. Let’s also not forget the negative impact on the climate from mandating unnecessary processing.
TEEs don’t work
The fact is that TEEs are not a viable solution for real-time data-driven applications. They offer no meaningful privacy benefit for consumers, since they ignore both data transfer concerns and the sensitivity of the data being ingested. They do not enforce the requirements of data protection law, since they say nothing about the legal basis on which the data controller processes personal data. And they unfairly impose additional costs and complexity on Google’s competitors.
Google has turned to the TEE in order to try to draw a veil of privacy over its flawed Sandbox project, but the veil has slipped and the truth is ugly.
* “Privacy Sandbox APIs like Protected Audience and Attribution Reporting are not…made available subject to commercial guarantees.” Privacy Sandbox response to IAB Tech Lab’s Fit Gap Analysis for Digital Advertising, p27