How could the upcoming ePrivacy Regulation recognise enforceable privacy signals in the EU?

Cristiana Santos*, Harshvardhan J. Pandit*
🔓 copies: OSF
In this paper we discuss requirements that privacy signals must satisfy to be enforceable under the ePrivacy Regulation and enable its real-world application.

Definition of Privacy signals.

Privacy signals are digital representations that allow users to communicate their preferences [1][2][3][4] for how their personal data should be processed (e.g., Do Not Track [5][6] for opting out of tracking, Global Privacy Control (GPC) [7] for opting out of third-party sharing, or IAB’s Transparency and Consent Framework (TCF) signal [8] for communicating decisions), or to exercise rights (e.g., Art. 21(5) GDPR). Signals can be automated and customised for the websites users visit.
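On the wire, such a signal is often nothing more than an HTTP request header set by the user’s browser. The following is a minimal sketch (the helper name and the headers dictionary are our illustrative assumptions, not part of any specification) of a server-side check for the Do Not Track header:

```python
def dnt_opt_out(headers: dict) -> bool:
    """Return True when the request carries a Do Not Track opt-out.

    The DNT header carries "1" (opt out of tracking) or "0"
    (user grants permission); an absent header means no signal.
    """
    return headers.get("DNT") == "1"


# A browser with the setting enabled attaches the header to every request:
request_headers = {"DNT": "1", "User-Agent": "ExampleBrowser/1.0"}
print(dnt_opt_out(request_headers))  # True
```

The simplicity is the point: the user expresses the preference once, and the browser repeats it automatically on every request.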

Privacy signals must be adopted by both sides of the communication, i.e., controllers and data subjects, along with support from device or software providers. Signals have the advantage of creating a level playing field between all actors (users, websites, third parties) and of preventing user consent from being exploited through deceptive interfaces (dark patterns) [4][9] that nudge users to accept tracking through ubiquitous consent banners, where consent cannot be ‘informed’ or ‘freely given’.

The idea of automated signals is not new in the GDPR. Article 21(5) thereof already envisaged that users could exercise their “right to object to processing” by using automated signals; however, this provision was never implemented, nor was a process to designate such a signal included in the GDPR [10].

The idea of enforceability of signals has been posited by several stakeholders.

Several stakeholders confirm that the choices users make when establishing the privacy settings of a browser (or other application) should be binding on, and enforceable against, any and all parties. The Explanatory Memorandum of the ePrivacy Regulation proposal [11] stated that “centralising consent in software and prompting users with information about the privacy settings thereof” is effective in empowering users and avoiding an overload of consent requests. Thus, browser (and comparable software) settings have a role to play in avoiding consent fatigue. The ePrivacy Directive (ePD) already stated, in Recital 66, that the user's consent may be expressed by using the appropriate settings of a browser or other application. The ePrivacy Regulation proposal refers, in the versions of the Commission, Parliament and Council, to the possibility for signals to exist through the technical settings of software, though in its current stalemate it does not yet define signals as legally binding. The Parliament, in Article 10(1a), made signals legally binding [12]. The Council [13] (Article 4a(2aa) and Recital 20a) says that “consent directly expressed by an end-user” (which we understand as consent collected via a consent pop-up on each specific website) shall prevail over “software settings” (which we interpret as allowing the user to set consent in a web browser's preferences). This position is problematic [14]: i) it would disadvantage privacy-friendly browsers that do provide data protection by default; ii) it would not help reduce users' repeated and overloading exposure to consent banners [15], nor protect them against the manipulative practices deployed in such banners to deceive users into giving consent to be tracked and profiled [9][16][17].

Stakeholders such as BEUC [14][18], the EDPB [19], the former Article 29 Working Party [20], Access Now, NOYB and EDRi [21] strongly advocate for legally binding signals. California’s CCPA, in §999.315 of its implementing regulations, became the first law to require compliance with automated signals and to endorse a specific signal (GPC [7]); in 2022, its enforcement agency fined Sephora for not honouring GPC [22] - the first such action of its kind.

Who decides which signal is enforceable or must be mandatory?

Article 19(1)(ba) of the Parliament’s draft [12] allocated the technical rule-making procedure to an independent supervisory authority. The Open Rights Group [10] stated that the EDPB should take on the task of defining the requirements and technical specifications for signals to communicate and withdraw consent, and to object to processing based on legitimate interest. The proposal from Access Now, NOYB and EDRi [21] mentioned that these signals could be developed and made legally binding via delegated acts by the European Commission, after a positive binding opinion of the EDPB.

Any future enforceable legal provision on signals needs to bring clarity regarding the technical features of signals (identification of users, updating of signals, etc.) and possible conflicts [2] between signals: users may unintentionally express conflicting or ambiguous preferences and transmit more than one signal, at the individual-website or browser level, which creates uncertainty over how legal rules apply (e.g., a user specifying conflicting decisions through TCF, DNT, GPC, or ADPC signals).
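To make the conflict concrete, the sketch below flags a user who accepted a consent banner but browses with an opt-out header enabled. The interpretation of each signal as a boolean opt-out, and the idea that a stored TCF decision can be compared against request headers, are our illustrative assumptions; no legal rule prescribes this logic:

```python
def collect_signals(headers: dict, stored_tcf_consent) -> dict:
    """Gather a boolean opt-out interpretation of each signal present.

    `stored_tcf_consent` stands in for a TCF decision previously
    recorded by a consent banner (True = user accepted tracking);
    None means no banner decision was recorded.
    """
    signals = {}
    if "DNT" in headers:
        signals["DNT"] = headers["DNT"] == "1"      # True = opt out
    if "Sec-GPC" in headers:
        signals["GPC"] = headers["Sec-GPC"] == "1"  # True = opt out
    if stored_tcf_consent is not None:
        signals["TCF"] = not stored_tcf_consent     # True = opt out
    return signals


def has_conflict(signals: dict) -> bool:
    """A conflict: at least one signal opts out while another opts in."""
    return len(set(signals.values())) > 1


# A user who clicked "accept" on a banner but has GPC enabled:
sig = collect_signals({"Sec-GPC": "1"}, stored_tcf_consent=True)
print(has_conflict(sig))  # True
```

A legal provision would still have to say which signal prevails; the code can only detect the disagreement, not resolve it.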

Examples of privacy signals in the wild and their problems.

Despite their clear benefits, privacy signals have failed to be:

(1) adopted by stakeholders such as advertisers who have competing vested interests [23];

(2) standardised for commonality and interoperability (e.g., Platform for Privacy Preferences (P3P)[24] and DNT);

(3) made compatible with legal requirements, and

(4) enforced by legal authorities. 

Of the current efforts, IAB’s TCF signal [3][16] is adopted by hundreds of ad-tech vendors and thousands of websites in the EU, despite its lawfulness being questioned by the Belgian Data Protection Authority [25]. GPC, active within the USA, has been adopted by web browsers (e.g., Firefox, DuckDuckGo, Brave), consent management platforms, and service providers (e.g., The Washington Post), and includes applications for the GDPR to “restrict third-party data sharing” [7]. Advanced Data Protection Control (ADPC) [26], developed by NOYB, focuses specifically on GDPR requirements regarding consent and the right to object. The legality of both GPC and ADPC in the EU remains untested.
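GPC's mechanism is deliberately simple: per the GPC proposal, a participating browser attaches a `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to scripts). The following hedged sketch shows a server honouring it; the configuration keys are our invention for illustration, not part of the proposal:

```python
def honour_gpc(headers: dict) -> dict:
    """Derive a (hypothetical) processing configuration from GPC.

    Per the GPC proposal, a participating browser sends `Sec-GPC: 1`.
    The configuration keys below are illustrative only.
    """
    opted_out = headers.get("Sec-GPC") == "1"
    return {
        "share_with_third_parties": not opted_out,
        "sell_personal_data": not opted_out,
    }


print(honour_gpc({"Sec-GPC": "1"}))
# {'share_with_third_parties': False, 'sell_personal_data': False}
```

Whether such a header constitutes a valid GDPR objection or consent refusal in the EU is exactly the open legal question this article discusses.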

What could be automated?

Signals can be used to automate and communicate decisions (e.g., a user’s consent refusal) or the exercise of rights. Currently, only Art. 21(5) GDPR mentions automation, for the right to object. We discuss the potential for signals to automate other relevant events.

Granting informed and specific consent.

The TCF automates the communication of decisions (accept/refuse), though the signal can only be communicated by authorised agents such as CMPs. ADPC also provides automated communication of decisions and additionally proposes creating “automated means for users to give or refuse consent” through management interfaces, e.g., through a web browser. However, Recital 32 GDPR requires “acceptance of the proposed processing” by users. Accordingly, to automate decisions, signals must explicitly refer to the “proposed processing” and include the information of Arts. 13 and 14, e.g., processing purposes, the identity of controllers, etc. In such cases, signals can provide a priori information about possible choices, which must then be confirmed by the user. Practically, this could mean preconfiguring consent requests (under Article 6(1)(a) GDPR or Article 5(3) ePD, consent to terminal access) to match such preferences, making it convenient for users to express their decision(s); this can be done if the signal supports granularity for different purposes, data categories, etc.
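A signal supporting such granularity could, for instance, carry per-purpose defaults that prefill a consent request while still leaving confirmation of the concrete “proposed processing” to the user. The sketch below is ours: the purpose names, the preference store, and the `prefill` helper are illustrative assumptions, not part of TCF or ADPC:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ConsentRequest:
    """A site's 'proposed processing' (Arts. 13/14 information, simplified)."""
    controller: str
    purpose: str


# Preconfigured, granular user preferences; a signal could transmit these
# as defaults, but the user must still confirm the concrete proposed
# processing. All names here are illustrative.
preferences = {
    "analytics": False,        # refuse by default
    "personalisation": False,
    "strictly_necessary": True,
}


def prefill(request: ConsentRequest) -> Optional[bool]:
    """Suggest a decision for the banner; None means 'ask the user'."""
    return preferences.get(request.purpose)


print(prefill(ConsentRequest("example.com", "analytics")))  # False
```

The design choice worth noting: the signal supplies defaults per purpose (and could equally do so per controller or data category), which is what makes the final confirmation step lightweight rather than another full banner.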

Withdrawal of consent.

GDPR Art.7(3) specifies “It shall be as easy to withdraw as to give consent”. Signals can (be made to) automate withdrawals irrespective of whether granting can be automated. Signals can provide automation of withdrawal at varying levels of granularity, e.g., for all or only specific purposes or controllers. TCF does not support communication of withdrawal, while ADPC supports both communication and automation of exercising withdrawal.
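As we read the ADPC proposal [26], a withdrawal can be carried in an `ADPC` request header, e.g. `withdraw=*` for all purposes. The parser below is our sketch of that idea, not reference code from the specification:

```python
def parse_adpc(header_value: str) -> dict:
    """Parse an ADPC-style header value, e.g. "withdraw=*" or
    'consent="analytics personalisation"', into directive -> identifiers.

    This follows our reading of the ADPC proposal; it is a sketch,
    not a reference implementation.
    """
    directives = {}
    for part in header_value.split(","):
        if "=" not in part:
            continue
        name, _, value = part.strip().partition("=")
        directives.setdefault(name, set()).update(value.strip('"').split())
    return directives


signal = parse_adpc("withdraw=*")
print("*" in signal.get("withdraw", set()))  # True
```

Because withdrawal needs no prior information about "proposed processing", it is the easiest event to automate: the header can simply be repeated on every request until the controller acknowledges it.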

Right to object to legitimate interests.

Art. 21(5) GDPR acknowledges the use of “automated means using technical specifications” to object to “legitimate interests”. This is important given the current practice of misusing legitimate interests within consent interfaces and making it difficult to object to their use [27]. TCF supports communication of objections, while ADPC supports both communication and automation of exercising objections.
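Automating the effect of an objection could be as simple as subtracting the objected-to purposes from those a controller processes under legitimate interest. In this sketch the purpose names are illustrative, and the objection set is assumed to come from a signal such as ADPC's object directive:

```python
# Purposes a site processes under "legitimate interest" (illustrative).
LEGITIMATE_INTEREST_PURPOSES = {"ad_measurement", "audience_research"}


def apply_objection(objected: set) -> set:
    """Return the purposes that may continue after the objection.

    "*" objects to all legitimate-interest processing; the set would
    be populated from an incoming signal (per our reading of ADPC).
    """
    if "*" in objected:
        return set()
    return LEGITIMATE_INTEREST_PURPOSES - objected


print(apply_objection({"*"}))  # set()
```

Note that Art. 21(1) GDPR still allows a controller to demonstrate compelling legitimate grounds; the automation covers communicating the objection, not adjudicating it.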

Automating the exercise of other rights.

Signals can be used to automate several actions associated with exercising rights, as GDPR Rec.59 states “The controller should also provide means for requests to be made electronically, especially where personal data are processed by electronic means”. This can mean using signals to communicate information or a link to further information regarding GDPR Art.12-22 in order to facilitate easier access as well as exercising of these rights.

Concluding Remarks

The impetus for providing a legally binding and enforceable signal now lies in the hands of the legislators. Through this article, we have initiated an important conversation that points to the existence of signals, their basis in laws outside the EU as well as their successful legal enforcement, and the path forward to ensure such signals provide benefits through technological automation while upholding the rights and freedoms of users without detriment.


  1. Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)

  2. Hils, M., Woods, D.W., Böhme, R. (2021). Conflicting Privacy Preference Signals in the Wild. In Computers, Privacy and Data Protection. Brussels.

  3. Hils, M., Woods, D.W., Böhme, R. (2021). Privacy Preference Signals: Past, Present and Future. Proceedings on Privacy Enhancing Technologies, 4, 249–269.

  4. Human, S., Pandit, H. J., Morel, V., Santos, C., Degeling, M., Rossi, A., Botes, W., Jesus, V., & Kamara, I. (2022). Data Protection and Consenting Communication Mechanisms: Current Open Proposals and Challenges. International Workshop on Privacy Engineering co-located with IEEE European Security & Privacy (IWPE), Genoa, Italy.

  5. Mayer J., Narayanan A., (2011). Do not track-universal web tracking opt out. Center for Internet and Society.

  6. Kamara I., Kosta E. (2016). “Do Not Track initiatives: regaining the lost user control,” International Data Privacy Law, vol. 6, no. 4, pp. 276–290, doi: 10.1093/idpl/ipw019

  7. Global Privacy Control, GPC Privacy Browser Signal Now Used by Millions and Honored By Major Publishers, (2021),

  8. IAB Europe Transparency and Consent Framework,

  9. Gray C., Santos C., Bielova N., Toth M., and Clifford D., (2021). “Dark Patterns and the Legal Requirements of Consent Banners: An Interaction Criticism Perspective,” in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–18, doi: 10.1145/3411764.3445779.

  10. Open Rights Group, Open Letter “ePrivacy Regulation and Privacy Automation”, (2021),

  11. Explanatory Memorandum to COM(2017)10 - Regulation on Privacy and Electronic Communications, 2017,

  12. European Parliament, Report on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), A8-0324/2017, (2017),

  13. General Secretariat of the Council, (2021),

  14. BEUC Recommendations for the Trilogue Negotiations on the Proposed ePrivacy Regulation, BEUC-X-2021-106, (2021),

  15. Irish Council for Civil Liberties, Demand to global brand CEOs: stop unlawful consent spam and delete the data, (2022),

  16. Matte C., Bielova N., Santos C., (2020). Do Cookie Banners Respect my Choice? Measuring Legal Compliance of Banners from IAB Europe’s Transparency and Consent Framework. In IEEE Symposium on Security and Privacy. IEEE, 791–809.

  17. Nouwens M., Liccardi I., Veale M., Karger D., and Kagal L. (2020). “Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence”. In: Proceedings of CHI ’20 CHI Conference on Human Factors in Computing Systems, April 25–30, 2020, Honolulu, HI, USA.

  18. BEUC Position Paper on a Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy), BEUC-X-2017-059 – 09/06/2017 (2017)

  19. European Data Protection Board, Statement 03/2021 on the ePrivacy Regulation Adopted on 9 March 2021, (2021)

  20. Article 29 Working Party Opinion 01/2017 (WP 247) on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC), (2017),

  21. Access Now, NOYB and EDRI, Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), (2021),

  22. State of California Department of Justice, Attorney General Bonta Announces Settlement with Sephora as Part of Ongoing Enforcement of California Consumer Privacy Act, (2022),

  23. Future of Privacy Forum, Companies that have implemented Do Not Track,

  24. Electronic Privacy Information Center and Junkbusters. Pretty Poor Privacy: An Assessment of P3P and Internet Privacy (2000),

  25. Belgian DPA, Decision on the merits 21/2022 of 2 February 2022, Unofficial Translation from Dutch, Case number DOS-2019-01377,

  26. Human S., Schrems M., Toner A., Gerben, Wagner B., (2021). Advanced Data Protection Control (ADPC), (Sustainable Computing Reports and Specifications). WU Vienna University of Economics and Business.

  27. Matte C., Santos C., and Bielova N. (2020). “Purposes in IAB Europe’s TCF: Which Legal Basis and How Are They Used by Advertisers?”. In: Proceedings of the 8th Annual Privacy Forum, APF.