News
March 28, 2025

Data Protection in the era of AI-powered marketing (V)

By: Paola Cardozo Solano - PhD Researcher & Lecturer, Vrije Universiteit (VU) Amsterdam, The Netherlands.

Illustration by MOMO Studio on Unsplash.

Fifth edition. Data Protection by design and marketing

Data protection by design is a proactive approach to embedding data protection into systems, processes, and organizations from day one. Tracing its origins, we examine how privacy by design principles laid the groundwork for data protection by design under the GDPR. We also discuss the critical role of software vendors, particularly those developing AI-powered CRM and chatbot solutions, in ensuring compliance with GDPR principles.

The origins of data protection by design 

Privacy by design is an approach that aims to embed privacy considerations into systems, processes, and organizations from the outset.

The widely discussed Privacy-Enhancing Technologies (PETs) are not equivalent to privacy by design. Common examples of PETs include homomorphic encryption, synthetic data, and federated learning. While both PETs and privacy by design aspire to protect privacy, PETs focus on software and hardware solutions, including technical procedures, methods, or knowledge designed to mitigate privacy risks (ENISA, 2015, as cited by ICO, 2023). The UK data protection authority, the ICO, emphasizes that PETs are not a “silver bullet” for protecting personal data or privacy (2023, p. 5); PETs are means, not ends in themselves (Future of Privacy Forum, 2023).

PETs address specific privacy needs from a technical perspective, whereas privacy by design emerged as a more holistic approach (Tamò-Larrieux, 2019). Privacy by design is neither a strictly legal nor a strictly technical model; rather, it aims to overcome the apparent clash between the legal and the technical perspectives. Its implementation therefore requires combining both approaches (2019).

Privacy by design was proposed in the nineties by Ann Cavoukian (Information and Privacy Commissioner in Ontario, Canada). From that moment on, privacy by design and its constituent principles became an impactful reference point, adopted by several private and public organizations around the world.

The scope of the model goes beyond mere compliance with legislation, considering privacy as “an organization’s default mode of operation” covering IT systems, business practices, and design and networked infrastructure, with a focus on sensitive data (Cavoukian, 2006, p. 1). Privacy by design was originally structured around seven foundational principles proposed by Ann Cavoukian (2006, p. 2), which can be summarized as follows:

  1. Proactive not Reactive; Preventative not Remedial: Privacy should be considered before any violation occurs. This means that privacy should not only be relevant for organizations after an incident takes place.
  2. Privacy as the default setting: Privacy is built into the system and the business model; no action should be required from the data subject to ensure protection.
  3. Privacy embedded into design: Privacy is integral to the product or service design in a holistic and creative manner.
  4. Full Functionality – Positive-Sum, not Zero-Sum: The model rejects false dichotomies (e.g., privacy vs. security); instead, it aims to satisfy different organizational goals in a win-win manner. 
  5. End-to-End Security – Full Lifecycle Protection: Since security is fundamental to privacy, it must be ensured from start to finish. It also encompasses the timely deletion of information.
  6. Visibility and Transparency – Keep it Open: This principle involves keeping the data processing operations transparent to users and providers, ensuring accountability, openness, and compliance.
  7. Respect for User Privacy – Keep it User-Centric: Privacy should be structured around user needs and interests. Measures should empower users and be user-friendly. Consent, accuracy, access, and compliance are central features for respecting users.
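The second principle, privacy as the default setting, lends itself to a short code sketch. The following is a minimal, hypothetical illustration (the class and field names are ours, not drawn from any real CRM product): every optional marketing purpose starts disabled, so the data subject is protected without taking any action.

```python
from dataclasses import dataclass

@dataclass
class MarketingPreferences:
    """Hypothetical preference record for a CRM contact.

    Illustrates 'privacy as the default setting': every optional
    processing purpose starts disabled, so no action is required
    from the data subject to be protected.
    """
    email_marketing: bool = False   # opt-in required
    phone_marketing: bool = False   # opt-in required
    profiling: bool = False         # opt-in required

    def grant(self, purpose: str) -> None:
        # Consent must be an affirmative act by the data subject.
        if not hasattr(self, purpose):
            raise ValueError(f"Unknown purpose: {purpose}")
        setattr(self, purpose, True)

# A newly created contact is not marketable by default.
prefs = MarketingPreferences()
assert not prefs.email_marketing
prefs.grant("email_marketing")   # explicit opt-in
```

The design choice is that protection is the zero-configuration state: a bug that forgets to ask for consent leaves the contact unmarketable, not exposed.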

The core of Privacy by Design, as underscored by its principles, is to proactively embed user privacy into systems, balancing this focus with other values or goals, such as security or functionality, while also aligning with the organization’s internal goals.

Data protection by design

Although privacy by design and data protection by design are not entirely identical, the principles of the former model laid the foundation for its incorporation into the General Data Protection Regulation (GDPR) (ICO, n.d.).

We agree with Jarovsky and Zanfir-Fortuna (2023), who point out the unfortunate ‘mismatch’ between the model initially proposed by Ann Cavoukian (2010) and how its principles were translated into law. In our view, the GDPR overlooks important concepts, such as the ‘positive-sum, not zero-sum’ approach, which could have provided guidance on ensuring both system functionality and privacy protection.

The GDPR includes provisions for data protection by design, requiring data controllers to implement appropriate legal, technical, and organizational measures to protect data subjects’ rights. Data protection by design is critical for safeguarding personal data and ensuring trust in the digital age. However, the GDPR provides limited guidance on the practical application of this model.

Data protection by design and by default are outlined in Article 25 of the GDPR (2016). With regard to data protection by design, the legislation mandates that data controllers implement appropriate measures to ensure compliance with the GDPR. Although the GDPR does not provide a detailed conceptualization of data protection by design, it offers some guidelines to help organizations develop these measures. These guidelines include conducting a risk analysis related to the data subjects’ rights and considering factors such as “the state of the art, the cost of implementation, and the nature, scope, context, and purposes of processing” (2016, Article 25).

Recital 78 of the GDPR (2016) lists as possible measures to ensure compliance with the Regulation the implementation of pseudonymization, data minimization, transparency concerning data processing, and providing monitoring features to users. These mechanisms must be established from the design stage of the products, services, and applications throughout their lifecycle.
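Two of the measures Recital 78 names, pseudonymization and data minimization, can be sketched as follows. This is a minimal illustration under our own assumptions (the key handling, field names, and record shape are hypothetical; a real deployment would keep the key in a key-management service, separate from the pseudonymized dataset):

```python
import hashlib
import hmac

# Hypothetical secret key; must be stored separately from the data.
PSEUDONYMIZATION_KEY = b"replace-with-securely-stored-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Without the key, the pseudonym cannot be traced back to the
    person, yet the controller can still join records consistently.
    """
    return hmac.new(PSEUDONYMIZATION_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def minimize(record: dict, allowed_fields: set) -> dict:
    """Data minimization: keep only the fields needed for the purpose."""
    return {k: v for k, v in record.items() if k in allowed_fields}

raw = {"email": "ana@example.com", "name": "Ana",
       "birthdate": "1990-01-01", "campaign": "spring-sale"}
stored = minimize(raw, {"campaign"})
stored["contact_id"] = pseudonymize(raw["email"])
# 'stored' now carries no direct identifier and no unneeded attributes.
```

Applied from the design stage, such helpers mean the default storage path never sees more personal data than the stated purpose requires.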

Future of Privacy Forum (2023) acknowledges the critical commentary that highlights the vague and unclear wording of Article 25, which poses challenges for data controllers in implementing it. However, FPF asserts that data protection by design and by default have been widely enforced since the GDPR came into effect.

The most important point we draw attention to in Article 25 of the GDPR is that the technical, legal, and organizational measures are ultimately intended to effectively implement the data protection principles and protect the rights of data subjects. In other words, data protection by design and the GDPR principles have a means-ends relationship, with the former being instrumental in complying with the latter. According to the EDPB guidelines (2020), data protection by design is crucial to effectively implementing the GDPR principles.

Now, with regard to compliance, Article 25.3 of the GDPR states that compliance with data protection by design requirements can be demonstrated via certifications (2016). The most recent example is ISO 31700-1:2023, the new standard on Privacy by Design, which aims to create conditions for protecting data subjects before the product is placed on the market (International Organization for Standardization – ISO, 2023).

It is worth noting that conformity with that standard does not translate into compliance with Article 25 of the GDPR. As explained by the EDPB (2020), when a certification is awarded, the controller remains responsible for consistently monitoring and guaranteeing compliance with Article 25. Still, adherence to ISO 31700-1:2023 benefits organizations, as it allows them to comply with technical norms in data protection, aids them in demonstrating compliance, and helps them gain market trust and advantages over competitors (Future of Privacy Forum, 2023).

Therefore, for assessing Article 25 compliance, Future of Privacy Forum (2023) highlights the following key aspects of the model: (i) the controller is responsible for complying with data protection by design obligations; (ii) data protection by design measures are context-dependent, considering the balancing of interests; (iii) technical and organizational measures are equally important and should both be implemented; (iv) technical and organizational measures must be implemented before and during processing; and, (v) technical and organizational measures must be effective in mitigating the data processing risks. 

Implementing data protection by design

This section explores the perspectives of Aurelia Tamò-Larrieux and the European Data Protection Board (EDPB) on implementing data protection by design. Their conceptual frameworks, compliance categories, and key indicators serve as the foundation for the evaluation criteria that will be applied to selected CRM and chatbot applications in the next and final edition of this series.

Importantly, the EDPB’s guidelines begin with a key assertion for this series: while Article 25 of the GDPR places the responsibility for implementing data protection by design on controllers, other actors (such as processors and producers of products, services, and applications) should also play a role in its implementation. By developing data protection-aware software, they help controllers fulfill their data protection obligations (2020). As relevant here, CRM and chatbot developers should implement data protection by design when creating those solutions, as it is critical for ensuring both controllers’ compliance and the protection of data subjects.

Tamò-Larrieux (2019) explains that implementing data protection by design involves applying legal and technical tools to realize data protection principles and integrating safeguards for data subjects. The first set of tools includes legal instruments such as privacy and data protection legislation, industry standards, and ethical norms. The second set consists of technical measures designed to protect communication security, ensure operational anonymity, and enhance transparency in data processing.

The technical and organizational measures must be ‘appropriate’. According to the EDPB guidelines (2020), ‘appropriate’ means that the measures and safeguards implemented should be suited to achieve the desired purpose, meaning they must effectively implement data protection principles.

Article 25 of the GDPR defines the principle of technical data protection based on the criteria of time, scope, and subject matter. The GDPR envisions protecting personal data throughout its entire life cycle, extending beyond data security to include enabling data subjects to request technical measures for safeguarding their data. Additionally, incorporating this model into the GDPR shifts the focus toward establishing a proactive data protection framework (Tamò-Larrieux, 2019).

Tamò-Larrieux (2019) classifies the legal considerations for data protection by design into four categories: legality of processing (fair, transparent, and lawful data use), processing flow design (data minimization, security, anonymization), data subjects’ rights (access, objection, erasure, information), and compliance and enforcement (GDPR penalties, regulatory oversight, accountability for outsourcing).

Concerning technical and organizational tools, Tamò-Larrieux (2019) categorizes four areas. Security ensures confidentiality, integrity, and availability (CIA), with mechanisms such as end-to-end encryption. Article 32 of the GDPR highlights pseudonymization, encryption, and system resilience. Anonymization and pseudonymization protect identities through unlinkability and unobservability, though the GDPR provides little specific guidance.

Autonomy gives individuals control over their data through access restrictions, management systems, and deletion tools, with GDPR emphasizing consent and requiring data in a structured, machine-readable format. Finally, transparency ensures users understand data processing and access rights, supported by privacy icons and impact assessments, though GDPR guidance remains limited here as well.
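The autonomy tools described above (access, deletion, and data in a structured, machine-readable format) can be sketched with a toy in-memory store. The store, identifiers, and function names are invented for illustration and do not reflect any real CRM API:

```python
import json

# Hypothetical in-memory store standing in for a CRM database.
CRM = {
    "user-42": {"email": "ana@example.com",
                "segments": ["newsletter"],
                "consent": {"email_marketing": True}},
}

def export_user_data(user_id: str) -> str:
    """Access/portability: return the subject's data in a structured,
    commonly used, machine-readable format (JSON)."""
    return json.dumps(CRM[user_id], indent=2, sort_keys=True)

def erase_user(user_id: str) -> None:
    """Erasure: delete the subject's record on request."""
    CRM.pop(user_id, None)

payload = export_user_data("user-42")   # handed to the data subject
erase_user("user-42")
assert "user-42" not in CRM
```

Building these operations into the product from the start, rather than handling rights requests manually, is precisely what a data-protection-aware design looks like in practice.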

Tamò-Larrieux (2019) emphasizes that implementing technical and organizational measures is not an absolute obligation but requires balancing interests. Therefore, determining appropriate measures involves assessing state-of-the-art technologies, implementation costs, and the nature, scope, context, and purpose of data processing. Additionally, a risk-based approach is necessary when required by Articles 24, 32, and 35 of the GDPR.

These legal, technical, and organizational categories and tools should be applied in service of guaranteeing the data processing principles outlined in Article 5 of the GDPR. Throughout all stages of the product, from design to data deletion, the value chain must implement these principles.

Enforcement of data protection by design in marketing

The Future of Privacy Forum (2023) analyzed data protection by design in direct marketing, focusing on the enforcement of Articles 5 (principles), 21 (right to object to processing), and 25 of the GDPR by Data Protection Authorities (DPAs). Many of the cases in this area have centered around failures to respect individuals’ right to object to the use of their personal data for direct marketing. We agree with the Future of Privacy Forum (2023) that these enforcement actions highlight the importance of controllers implementing effective processes, procedures, and systems to safeguard data subjects’ rights. These considerations are equally relevant for software developers, as product design plays a critical role in ensuring GDPR compliance and protecting data subjects.

One case highlighted by the Future of Privacy Forum (2023) involved the CRM Siebel and the portal used by the Greek telecommunications provider OTE. The Hellenic DPA (HDPA) found violations of Articles 25 and 5(1)(c) of the GDPR and imposed a €200,000 fine due to a malfunction in the CRM platform that prevented the portal from updating as intended. As a result, more than 16,000 subscribers received unwanted advertising calls for over three years. The HDPA determined that OTE should have implemented appropriate measures, such as conducting accuracy checks.
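The safeguard the HDPA found missing, a reliable check of recorded objections before any marketing call, can be sketched as follows. The numbers, registry, and function names are invented for illustration; the point is that the dialer consults the up-to-date opt-out list on every batch rather than trusting a one-off sync:

```python
# Hypothetical opt-out (suppression) registry of recorded objections.
opt_out_registry = {"+30210000001", "+30210000002"}

def may_call(phone: str) -> bool:
    """Right to object (Art. 21 GDPR): never call a listed number."""
    return phone not in opt_out_registry

def dial_campaign(numbers: list[str]) -> list[str]:
    """Filter a campaign batch through the suppression check,
    returning only the numbers that may lawfully be called."""
    return [n for n in numbers if may_call(n)]

batch = ["+30210000001", "+30210000009"]
assert dial_campaign(batch) == ["+30210000009"]
```

A fail-safe variant would go further and refuse to dial at all when the registry cannot be reached, so a sync malfunction like OTE’s degrades into no calls rather than unlawful ones.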

Although Article 25 of the GDPR does not establish that processors are responsible for implementing technical and organizational measures to ensure compliance with data protection principles, the aforementioned case highlights the crucial role of software providers in supporting GDPR compliance. Both the compliance-enabling features they offer their clients and the integration of data protection considerations into their own operations and organizational DNA significantly affect whether their clients can comply with data protection principles when using the solutions they provide.

AI and data-intensive technologies pose significant challenges to data protection, as they involve the processing of personal data that can impact individuals’ private lives. Moreover, as discussed in previous editions, a concerning trend is that marketers often view data protection as conflicting with their marketing goals. Therefore, embedding data protection and privacy measures from the outset and integrating compliance features into the AI-based software used daily by marketers can have a significant positive impact on safeguarding these fundamental rights. This is especially important given the widespread use of CRM systems and chatbots in today’s marketing activities.

As previously noted, the final edition of this series will present the results of an evaluation assessing the implementation of data protection by design and compliance with GDPR principles in leading AI-CRM and chatbot solutions. The findings of this assessment will shed light on trends, characteristics, strengths, and weaknesses in the offerings of this type of software, which plays a strategic role in marketers’ work.

Bibliography:

Cavoukian, A. (2006). Privacy by Design: The 7 Foundational Principles.

Cavoukian, A. (2010). The 7 Foundational Principles. https://www.ipc.on.ca/wp-content/uploads/resources/pbd-implement-7found-principles.pdf

European Data Protection Board (EDPB). (2020). Guidelines 4/2019 on Article 25 Data Protection by Design and by Default. https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_201904_dataprotection_by_design_and_by_default_v2.0_en.pdf

European Parliament and of the Council. (2016). Regulation (EU) 2016/679 (General Data Protection Regulation). https://eur-lex.europa.eu/eli/reg/2016/679/oj

Future of Privacy Forum. (2023). Unlocking Data Protection by Design and by Default: Lessons from the Enforcement of Article 25 GDPR. https://fpf.org/blog/new-fpf-report-unlocking-data-protection-by-design-and-by-default-lessons-from-the-enforcement-of-article-25-gdpr/

ICO. (2023). What PETs are there? ICO. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/data-sharing/privacy-enhancing-technologies/what-pets-are-there/

ICO. (n.d.). Data protection by design and by default. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/guide-to-accountability-and-governance/data-protection-by-design-and-default/

Jarovsky, L., & Zanfir-Fortuna, G. (Directors). (2023, May 14). Regulating AI, EU & US Perspectives with. https://www.youtube.com/watch?v=w33vZlZTVDc

Tamò-Larrieux, A. (2018). Designing for Privacy and Its Legal Framework: Data Protection by Design and Default for the Internet of Things. Springer International Publishing AG. http://ebookcentral.proquest.com/lib/leidenuniv/detail.action?docID=5596908

Tamò-Larrieux, A. (2019). Excerpt of «Designing for Privacy and its Legal Framework». Sui Generis. https://doi.org/10.21257/sg.89