News
28 February 2025

Data Protection in the era of AI-powered marketing (IV)

By: Paola Cardozo Solano - PhD Researcher & Lecturer. Vrije Universiteit (VU) Amsterdam, The Netherlands

Fourth edition. AI Marketing & GDPR

The previous editions examined the data protection challenges posed by AI-driven CRM systems and chatbots in marketing. Building on those discussions, this edition explores the key principles of the General Data Protection Regulation (GDPR) and how they apply to AI marketing solutions. It provides an overview of the legal framework businesses must adhere to when processing personal data for marketing purposes and emphasizes the need to embed data protection into AI marketing solutions from the design phase onward, taking into account the perceptions of marketers and data subjects.

The GDPR at the crossroads of AI and Marketing

Before diving into the specifics of AI marketing, it is essential to introduce key elements of the foundations of data protection in Europe. The General Data Protection Regulation (GDPR) came into effect in 2018, creating specific rules for the fundamental right to data protection established under the EU Charter of Fundamental Rights. In broad terms, the GDPR establishes the principles for data processing, the grounds on which such processing can be carried out, the responsibilities of processors and controllers, the rights of data subjects, and the rules for data transfers. This means that when personal data is processed for marketing activities, the GDPR must also be followed.

The following overview examines some key GDPR principles in the context of the AI marketing applications introduced in previous editions, specifically AI-CRM systems and chatbots:

Lawfulness: in AI marketing, the processing of personal data must be based on one of the legal grounds outlined in Article 6 of the GDPR. Consent and legitimate interests are the most commonly used grounds in this context. Accordingly, AI-CRM systems and chatbots must ensure that they process personal data under one of those grounds and can demonstrate the legal basis for each instance of data processing (see the sketch following these principles for one illustrative way such documentation could look).

Fairness: this principle requires businesses to inform data subjects about the potential risks of data processing and about their rights. It also mandates that data subjects be protected from discrimination, a particularly relevant aspect when AI is employed to collect and use personal data.

Transparency: the data subjects must be informed clearly about how their data will be processed before any data collection occurs. This principle is especially important for AI marketing tools, where chatbots and CRM systems may collect significant amounts of personal data. Businesses must be transparent about the purpose of data processing, the types of data being collected, and the personalization of marketing communications.

Data minimization: businesses employing AI marketing tools must ensure that they only process data relevant to a legitimate purpose, for example, a certain marketing objective. This means that they should avoid gathering excessive personal data.

Integrity, security, and confidentiality: AI marketing solutions should use appropriate technical or organizational measures to protect against unlawful processing, unauthorized access, loss or destruction of personal data. Some examples of these measures include the use of encryption, secure access controls, and data security audits.
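To make these principles more tangible, here is a minimal, illustrative Python sketch of how an AI-CRM or chatbot backend might document the lawful basis for each processing activity and enforce data minimization against an agreed allow-list of attributes. All class, field, and function names (ProcessingRecord, ALLOWED_FIELDS, minimize) are hypothetical and do not correspond to any specific product.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class LawfulBasis(Enum):
    # Grounds from Article 6(1) GDPR most commonly relied on in marketing.
    CONSENT = "consent"                            # Art. 6(1)(a)
    LEGITIMATE_INTERESTS = "legitimate_interests"  # Art. 6(1)(f)


@dataclass
class ProcessingRecord:
    # One documented instance of personal-data processing (illustrative only).
    purpose: str               # e.g. "newsletter personalization"
    lawful_basis: LawfulBasis  # the demonstrable ground for this processing
    data_fields: list          # only the attributes actually needed (minimization)
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Hypothetical allow-list supporting data minimization: the attributes agreed
# in advance as necessary for a given marketing purpose.
ALLOWED_FIELDS = {"email", "first_name", "product_interests"}


def minimize(profile: dict) -> dict:
    # Strip any attribute not strictly needed for the stated purpose.
    return {key: value for key, value in profile.items() if key in ALLOWED_FIELDS}


record = ProcessingRecord(
    purpose="newsletter personalization",
    lawful_basis=LawfulBasis.CONSENT,
    data_fields=sorted(ALLOWED_FIELDS),
)
print(record.purpose, record.lawful_basis.value, record.data_fields)
print(minimize({"email": "a@example.com", "first_name": "Ana", "income": 50000}))
# The second print drops "income": it is not needed for the stated purpose.

The point of the allow-list in this sketch is that data minimization becomes a default of the system rather than a choice left to each individual campaign.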

Accountability and compliance in AI marketing

Accountability in data protection refers to the obligations organizations must meet in order to demonstrate compliance with data protection legislation. It is important to note that accountability and compliance go beyond simply checking boxes: data protection must permeate the DNA of organizations (Pothos, 2019). Moreover, the application of the GDPR should be tied to the goals of freedom, security, justice, economic and social progress, the well-being of human beings and mankind, and the strengthening of the European internal market (GDPR, 2016, Recitals 2 and 4).

Translating the GDPR into practice implies complying with (i) the legal principles of the GDPR; (ii) technical or organizational measures that can be implemented to achieve the GDPR’s goals; (iii) the guidelines contained in the Regulation; and (iv) balancing conflicting interests (Tamò-Larrieux, 2018).

Pothos (2019) provides a detailed list of the key practical elements for demonstrating compliance with data protection in an organization, namely: internal policies, employee responsibilities, management responsibilities, incident reporting, policy compliance, internal allocation of responsibilities, data protection training, documentation, cooperation with regulatory bodies, binding corporate rules, the implementation of data protection impact assessments (DPIA), and the appointment of a data protection officer (when required by the GDPR). For AI marketing solutions, this means implementing data protection measures throughout the product lifecycle, from design to deployment and beyond. These elements will be considered in the final edition of this series, which will present the results of a survey evaluating compliance with data protection-by-design in AI-CRM and chatbot solutions for marketing.

According to Articles 24 and 28 of the GDPR, the data controller determines the purposes and means by which personal data is processed, while processors carry out processing on behalf of a controller. Fundamental to this series is the assertion that software providers, such as the developers of CRM systems and chatbots, act as processors, while the firms that acquire their solutions are the data controllers.

Article 28 of the GDPR highlights that the controller must only engage processors who provide sufficient guarantees to implement appropriate technical and organizational measures to meet GDPR requirements and protect data subjects. This underscores the crucial role of software providers in developing data protection-compliant solutions, ensuring that when their products are brought to market, client firms, acting as controllers, have the tools necessary to ensure the rights established by the GDPR.

It is also important to note that the relationship between the processor and the controller is governed by a binding legal agreement. This agreement includes, among other aspects, instructions on data processing, confidentiality obligations, measures in line with Article 32 of the GDPR, and the responsibility to provide the information required to demonstrate compliance with the Regulation.

Translating these concepts into practice, an illustrative example would be that CRM developers and chatbot providers must ensure their solutions include mechanisms for obtaining and documenting consent or other lawful grounds for data processing, along with features that allow users to exercise their rights (e.g., the right to access, correct, or delete their data).
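As a hedged illustration of such mechanisms, the sketch below shows, in Python, one way a CRM or chatbot backend could keep evidence of consent and expose the rights of access, rectification, and erasure. It is a toy in-memory model under simplifying assumptions; the names (ConsentRecord, CustomerStore) are invented for this example, and a real product would persist these records, authenticate requests, and log them for accountability.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    # Documents when, how, and for what purpose consent was obtained.
    subject_id: str
    purpose: str
    granted: bool
    timestamp: datetime
    source: str  # e.g. "signup form" or "chatbot dialogue"


class CustomerStore:
    # Toy in-memory store exposing the data subject rights mentioned above.

    def __init__(self):
        self._profiles = {}   # subject_id -> attributes held about the person
        self._consents = []   # audit trail of ConsentRecord entries

    def record_consent(self, subject_id, purpose, granted, source):
        # Keep evidence of each consent decision so it can be demonstrated later.
        self._consents.append(
            ConsentRecord(subject_id, purpose, granted, datetime.now(timezone.utc), source)
        )

    def access(self, subject_id):
        # Right of access: return a copy of everything held about the data subject.
        return dict(self._profiles.get(subject_id, {}))

    def rectify(self, subject_id, updates):
        # Right to rectification: correct or complete inaccurate attributes.
        self._profiles.setdefault(subject_id, {}).update(updates)

    def erase(self, subject_id):
        # Right to erasure: delete the profile entirely.
        self._profiles.pop(subject_id, None)


store = CustomerStore()
store.record_consent("user-42", "newsletter personalization", True, "signup form")
store.rectify("user-42", {"email": "ana@example.com"})
print(store.access("user-42"))  # {'email': 'ana@example.com'}
store.erase("user-42")
print(store.access("user-42"))  # {}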

GDPR & Direct Marketing 

Direct marketing messages do not necessarily involve offering something for sale. Since marketing is a broad discipline, these communications can serve various purposes, such as promoting a company or inviting someone to take up a free offer. The key factor for applying the GDPR is that the message must be directed at a specific individual. This excludes untargeted communications, such as banner advertisements, as well as messages that are solely related to the provision of a service (e.g., order status updates) (Seinen, 2019).

Wouter Seinen (2019) argues that data protection in the context of direct marketing is particularly complex, given that these practices trigger not only data protection laws but also regulations on consumer protection, spam, and the use of cookies, all of which vary between countries. The ePrivacy Directive applies as lex specialis to electronic communications such as SMS and email, but its examination is beyond the scope of this section, which focuses solely on the GDPR.

Therefore, AI-CRM and chatbot processing activities may fall under the scope of direct marketing when messages are targeted at an individual and promote brands, products, or services; whether they do depends on the characteristics of the communication itself.

Article 21 of the GDPR addresses direct marketing in the context of the right to object to the processing of personal data, including profiling, based on the grounds specified in Articles 6(1)(e) and 6(1)(f). These grounds are: (e) processing is necessary for performing a task carried out in the public interest or in the exercise of official authority vested in the controller, and (f) processing is necessary for the legitimate interests pursued by the controller or a third party, except where those interests are overridden by the fundamental rights and freedoms of the data subject, particularly when the data subject is a child. In light of Recital 47, processing for direct marketing purposes may be considered as being based on legitimate interests.

The key point of Article 21, for the purposes of direct marketing, is that data subjects have the right to object to, or opt out of, the processing of their personal data, including profiling, at any time. This right must be presented clearly and separately to the data subject. Regarding the scope of the objection, Recital 70 specifies that both the initial and any further processing of personal data for those purposes must cease. Furthermore, under Article 21(5), the data subject can object to processing by automated means using technical specifications. Compliance with the opt-out feature, and how this right is presented to customers, will be evaluated in the last edition with respect to CRM systems and chatbots.
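As a simple, assumption-laden illustration of the opt-out obligation, the Python sketch below shows a hypothetical gatekeeper that records objections and refuses to send further direct-marketing messages once a data subject has objected; the function names are invented for this example and do not describe any particular product.

# Hypothetical gatekeeper for Article 21: once a data subject objects, no
# further direct-marketing processing (including profiling) takes place.

objections = set()  # identifiers of data subjects who have exercised the right to object


def object_to_marketing(subject_id):
    # Record an objection; the opt-out must be available at any time.
    objections.add(subject_id)


def send_marketing_message(subject_id, message):
    # Send only if the data subject has not objected; return whether it was sent.
    if subject_id in objections:
        return False  # processing for direct marketing purposes must cease
    print(f"to {subject_id}: {message}")  # stand-in for the real delivery channel
    return True


object_to_marketing("user-42")
assert send_marketing_message("user-42", "New offer!") is False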

The Role of AI providers and marketers in data protection

Given that AI is a data-intensive set of technologies, data protection and privacy are extremely relevant. First, it is clear that data subjects must have control over their personal data, especially when it is processed by corporations in often opaque ways. Second, such processing can significantly impact private life, including sensitive aspects such as financial, health, or even intimate information, especially considering that anonymization techniques, which are often used to protect privacy, are not fully reliable (OECD, 2023, p. 19).

Nissenbaum’s understanding of privacy in terms of contextual integrity is essential as it emphasizes the need to “prescribe specific restrictions on the collection, use, and dissemination of information about people” in particular contexts rather than in general terms (2004, p. 155). Hence, given the unique nature of consumer preferences in marketing, a field that aims to influence behavior, it is crucial to evaluate specific privacy and data protection measures for the industry.

However, marketers report that data protection “has become a difficult, even awkward topic” (Chiocchi, 2022), as they are increasingly concerned about the growing awareness of, and scrutiny over, information processing by both authorities and users. The same author explains that marketers perceive data protection and privacy norms as negatively impacting business performance. According to a survey, 64% of marketers report changing how they collect data, while only 36% of B2B companies feel prepared for the changes required by data protection regulations.

Along the same lines, Bleier et al. (2020) argue that privacy and data protection concerns impact companies, especially smaller firms, by causing direct revenue losses due to missed sales, litigation risks, data foreclosure, and limited strategic scope due to privacy legislation. The authors also note that data protection laws slow the pace of data-driven innovation in the marketing discipline. Therefore, considering the negative perception towards data protection regimes by marketers, it is critical to delve into how developers of marketing solutions are implementing privacy in their products, if they are doing it at all.

In conclusion, this edition has explored the intersection of AI marketing and the GDPR, focusing on the data protection principles that businesses must adhere to when using AI-driven marketing tools. Compliance with the GDPR principles must not be seen as a mere formality; it is integral to building consumer trust and to ensuring that businesses operate within the bounds of the law.

A central theme emphasized throughout this series is that the best time to safeguard data protection and privacy is by embedding these considerations from the outset. Once a solution reaches the market, is purchased by thousands of companies worldwide, and processes vast amounts of personal data, it may be too late to protect users’ rights. Considering the critical link between data protection and the product lifecycle, the next edition will focus on data protection-by-design in AI marketing.

Bibliography:

Bleier, A., Goldfarb, A., & Tucker, C. (2020). Consumer privacy and the future of data-based innovation and marketing. International Journal of Research in Marketing, 37(3), 466-480. https://doi.org/10.1016/j.ijresmar.2020.03.006

Chiocchi, P. (2022). Council Post: Marketers Don’t Talk About Data Privacy, But They Should. Forbes. https://www.forbes.com/sites/forbesagencycouncil/2022/12/08/marketers-dont-talk-about-dataprivacy-but-they-should/

Nissenbaum, H. (2004). Privacy as Contextual Integrity. Washington Law Review, 79.

Pothos, M. (2019). Accountability requirements. In European Data Protection, Third Edition. IAPP. https://iapp.org/resources/article/european-data-protection/

Seinen, W. (2019). Direct Marketing. In European Data Protection, Third Edition. IAPP. https://iapp.org/resources/article/european-data-protection/

Tamò-Larrieux, A. (2018). Designing for Privacy and Its Legal Framework: Data Protection by Design and Default for the Internet of Things. Springer International Publishing AG. http://ebookcentral.proquest.com/lib/leidenuniv/detail.action?docID=5596908