I spent the past year asking one simple question to publishers, marketers and ad tech companies: “So what is your strategy for the cookieless future?” The replies broadly fell into two categories: “We’re not worrying about it yet” and “We’re using solution X and they told us everything’s going to be fine”. Now that time is running out and it is becoming clear that solution X is not going to work fine after all, it may be a good time to sum up what I’ve learnt and share some thoughts about the cookieless future.
Thought #1: there is a lack of understanding about privacy laws and confusion about the future of advertising in a cookieless world.
Thought #2: the lack of understanding of privacy is head-in-the-sand stuff: an entire industry ignoring a problem and wishing it will just go away.
Thought #3: the confusion is created by actors that cannot accept that they have no solution to the problem and are simply buying time by muddying the waters.
The rest of this (long) post tries to clarify some of the issues at play. It starts from a very basic introduction to the sector and the problem, then summarises some (not all) legal requirements around privacy, and then reviews cookieless solutions currently on the market.
Identity in advertising
Identity is the foundation of digital advertising. In order to deliver targeted messages to the right audiences, advertisers seek to identify individuals online, track their behaviour and infer their personal preferences. The better the ability to identify consumers as they interact with content, products and services, the better the ability to profile their preferences, monitor their behaviour and measure the effectiveness of marketing campaigns to influence purchase behaviour.
Consumer identities now combine data from multiple sites (domains), browsers, devices (computers, smartphones, smart televisions, tablets, fitness trackers and so forth), native apps and retailers both online and offline. This unparalleled ability to identify and track users has put digital advertising on a collision course with regulators and a section of the public. This post briefly explores the mismatch between the way digital advertising identifies consumers and the privacy laws recently approved in Europe and the US – and what this means for the industry.
Some of the biggest companies on earth like Google, Facebook and Amazon have been built on the ability to collect and exploit consumer data to optimise their services. These companies have the advantage that they require users to create an identity (log in) and can collect data about them as they use their own services. By contrast, the open internet was built as an anonymous space where identifying and tracking individual consumers is difficult. Bespoke solutions had to be found to de-anonymise users as they move from site to site in the open internet.
The most common approaches are third-party cookies, device fingerprinting and advertising identifiers. Cookies are little files dropped by a site on the browser to identify the user across sessions. Over time, companies started dropping their third-party cookies on domains they did not own so that they could identify consumers and monitor their activities across the internet. Device fingerprinting goes even further by giving the site owner or a third party detailed information about a device’s characteristics that make it unique and therefore identifiable. On smartphones, identity was baked into the operating system: Apple launched its phone-based Identifier for Advertisers (IDFA) in 2012, soon followed by the Google Advertising ID (GAID). These IDs were a massive improvement on the cookie because they exist at the device level (so they are interoperable between apps and browsers), they are persistent and cannot be deleted like the cookie, and they are readable by everyone who needs to serve content to the phone. To match these different identities, software known as data clean rooms was introduced to connect multiple datasets and achieve an ‘omni-channel view of the consumer’.
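To make the fingerprinting idea concrete, here is a minimal sketch of how a handful of device attributes can be combined into a stable identifier. The attribute names and values are invented for illustration; real fingerprinting scripts collect many more signals (canvas rendering, installed fonts, audio stack and so on).

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Combine device characteristics into a stable identifier.

    Illustrative sketch only, not any vendor's actual implementation.
    """
    # Sort the keys so the same device always produces the same hash
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# The more attributes collected, the more likely the combination is
# unique to one device - and therefore usable as an identity
fp = device_fingerprint({
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "Europe/Rome",
    "language": "en-GB",
    "platform": "Win32",
})
print(fp)
```

Note that no file is stored on the device at all, which is why fingerprinting survives cookie deletion and is so hard for consumers to detect or control.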
The combination of these technologies has enabled the growth of an industry worth hundreds of billions of dollars. This industry is now on a collision course with privacy regulators, concerned that we have entered an era of ‘surveillance capitalism’ that is bad for consumers. Privacy laws like the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) and the recently adopted CPRA set out a number of requirements that appear to be fundamentally incompatible with the way the industry works. This is especially the case for digital advertising in the open internet, where hundreds of companies trade consumer data to power live advertising placement via Real-Time Bidding (RTB). Under the banner of privacy, browsers are withdrawing support for third-party cookies, while Apple is on a path to make IDFA redundant – with Google likely to follow suit.
The rise of privacy
These privacy laws place obligations on organisations, and create rights for consumers, that are fundamentally incompatible with the open flow of personal data that takes place in RTB. The main areas of friction are listed below.
Free, informed and revocable consent. The GDPR requires that data subjects must consent to the processing of their personal data. Such consent must be freely given, based on a reasonable amount of information, specific (i.e. not bundled for different data processing operations), and easily revocable (article 7). The industry has paid a lot of attention to consent, but legal compliance is difficult when personal data is collected for multiple purposes and shared among hundreds of organisations. The Belgian Data Protection Authority (DPA) found that the Transparency and Consent Framework (TCF), the industry standard to regulate consent in Europe, breaches the GDPR because it allows companies to swap sensitive information about people when this has not been authorised, and provides inadequate controls for the processing of intimate personal data that occurs in the RTB system. RTB vendors would have to strip sensitive information out of their data taxonomies and ensure that personal data that can reveal sensitive information is not tracked at all – but given that almost all behavioural data can be processed to reveal sensitive information, the task is far from straightforward.
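The article 7 requirements can be pictured as a data structure. The sketch below (hypothetical, not modelled on any actual consent management platform) shows the two properties the paragraph above stresses: consent must be specific, i.e. recorded per processing purpose rather than bundled, and revocable at any time.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One record per processing purpose - never bundled (article 7)."""
    purpose: str                           # e.g. "personalised_ads"
    granted_at: datetime
    revoked_at: Optional[datetime] = None  # withdrawal is always possible

    @property
    def is_valid(self) -> bool:
        return self.revoked_at is None

    def revoke(self) -> None:
        # Withdrawing consent must be as easy as giving it (article 7(3))
        self.revoked_at = datetime.now(timezone.utc)

# Separate records for separate purposes, each independently revocable
ads = ConsentRecord("personalised_ads", datetime.now(timezone.utc))
measurement = ConsentRecord("ad_measurement", datetime.now(timezone.utc))
ads.revoke()
print(ads.is_valid, measurement.is_valid)
```

The hard part, as the paragraph notes, is not recording consent but propagating a revocation to the hundreds of organisations that may already hold the data.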
Rights to access, rectify and erase personal data. An even harder problem for the sector is how to uphold the consumer rights established under GDPR. The GDPR gives individuals the right to access their personal data being processed by the data controller (article 15), to obtain from the controller the rectification of inaccurate personal data (article 16) and its erasure without undue delay (article 17). But how can the consumer have access to personal data that is shared among hundreds of organisations, let alone rectify or delete it? When each organisation holds personal data resulting from the intersection of different databases from multiple sources, and the data is shared so widely, it is practically impossible for the consumer to exercise their rights. This reveals a fundamental flaw in the principle that personal data can be traded, exchanged or synced between different companies: it makes most data rights unenforceable.
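A toy model makes the erasure problem vivid. The organisation names below are invented; the point is that an article 17 request must fan out to every party reachable through the sharing graph, and in real RTB that graph has hundreds of nodes and is not even visible to the consumer.

```python
# Toy sharing graph: each organisation lists the parties it forwarded
# the personal data to. All names are hypothetical.
sharing_graph = {
    "publisher": ["ssp_1", "ssp_2"],
    "ssp_1": ["dsp_1", "dsp_2", "data_broker"],
    "ssp_2": ["dsp_2"],
    "dsp_1": [],
    "dsp_2": ["measurement_vendor"],
    "data_broker": ["dsp_1"],
    "measurement_vendor": [],
}

def organisations_to_notify(origin: str) -> set:
    """Every party reachable through sharing must act on the erasure."""
    seen, stack = set(), [origin]
    while stack:
        org = stack.pop()
        if org not in seen:
            seen.add(org)
            stack.extend(sharing_graph[org])
    return seen

print(sorted(organisations_to_notify("publisher")))
```

Even in this seven-node toy, erasure only works if every edge of the graph is known and every recipient cooperates; neither assumption holds in the open RTB ecosystem.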
Right to data portability. The GDPR also establishes that the consumer has the right to receive their personal data and to transmit it to another organisation without hindrance (article 20). It’s hard to see how this squares with the commercial needs of organisations whose unique selling point and IP lie in the personal data they hold and/or infer through syncing and processing.
Right not to be subjected to profiling. The GDPR gives individuals an absolute right to stop their data being used for direct marketing, including any related profiling (article 21). Moreover, under GDPR article 22, as explained in recital 71, the consumer has the right not to be subject to a decision based solely on automated processing (including ‘profiling’) which produces legal or similarly significant effects, “such as automatic refusal of an online credit application”. When data is freely shared for the purpose of profiling, it becomes almost impossible to ascertain whether or not its use drives a decision with a legal or similarly significant effect.
International transfers. The GDPR restricts transfers of personal data to a third country or an international organisation unless it ensures an ‘adequate level of protection’ (article 45). Not many countries outside the European Union offer the level of privacy protection that the EU considers adequate, and a recent judgment by the European Court of Justice has invalidated the European Commission’s adequacy decision for the United States. As things stand, personal data cannot be lawfully transferred to the US (due to the extensive powers granted to the US government by national security law). The UK will likely follow the same path after Brexit. Autocratic countries such as China and Russia are even less likely to provide adequate protection to personal data. The European Data Protection Board (EDPB) has also indicated that the transfer of encrypted personal data to non-EU entities can only take place if the decryption keys are retained solely by the data exporter.
Fundamental principles. More broadly, the way the RTB sector currently works seems to contradict many fundamental principles of the GDPR, such as data minimisation, integrity and confidentiality, storage limitation and accountability. None of these principles can be reasonably upheld in an ecosystem where personal data is freely traded and stored by multiple organisations.