France – CNIL releases guidance on data protection aspects of chatbots

On 19 February 2021, the CNIL released guidance on the use of chatbots in compliance with data protection law (the Guidance). The CNIL notes that, in order to operate a chatbot, controllers will often need to process personal data, even where no user account needs to be created and individuals do not need to identify themselves.

The Guidance covers the following key issues:

  • Cookie requirements can be complied with either by obtaining GDPR-compliant consent from the website user (before cookies are placed on the user's terminal equipment) or by depositing the cookies only after the chatbot is activated by the user (for example, by clicking on the previously displayed chat window or on a button that explicitly triggers the chatbot functionality). In the second case, the cookies qualify as strictly necessary to provide an online communication service at the express request of the user and therefore do not require the user's consent (see the first sketch after this list). This exemption only applies to cookies whose purpose is strictly limited to the functioning of the chatbot; cookies serving any other purpose will require the user's consent;
  • Data retention requirements will depend on the purpose for which the chatbot processes personal data; for example, longer periods would apply to data needed to handle claims about a product purchased via the chatbot;
  • In relation to automated decision-making, a conversation between the user and the chatbot alone, without human intervention, should not lead to an automated decision with a significant impact on the data subject, such as refusing an online credit application or rejecting a job applicant. However, such chatbots may be used as part of a broader procedure that includes meaningful human intervention;
  • If special categories of data are expected to be processed in the context of operating the chatbot (e.g. chatbots offered by healthcare services), the controller must ensure that the processing falls under one of the exceptions in Article 9(2) GDPR and, in some circumstances, may be required to carry out a prior data protection impact assessment (DPIA). If such processing is not anticipated but sensitive data may nevertheless be volunteered by individuals in the chatbot's free-text fields, obtaining prior explicit consent is not required. However, organisations are recommended to implement mechanisms to minimise the risks to the rights and freedoms of individuals, e.g. displaying warnings against disclosing sensitive data via the chatbot or setting up a system to purge, immediately or at least on a regular basis, conversation data that are not relevant for the purpose of processing (see the second sketch after this list).
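
To illustrate the second, consent-exempt approach described in the first bullet above, the minimal browser-side sketch below writes a chatbot cookie only after the user has explicitly clicked the chat button. The names used (initChatbot, chatbot_session, open-chat-button) are illustrative assumptions and do not come from the Guidance or any particular chatbot product.

```typescript
// Hypothetical sketch: the chatbot cookie is written only after an explicit
// user action, so it stays within the "strictly necessary" exemption for
// cookies serving an online communication service requested by the user.
// Names used here (chatbot_session, open-chat-button, initChatbot) are
// assumptions for illustration only.

function setChatbotSessionCookie(sessionId: string): void {
  // Cookie scoped and time-limited to the chat session itself.
  document.cookie =
    `chatbot_session=${encodeURIComponent(sessionId)}; ` +
    `path=/; SameSite=Lax; Secure; max-age=1800`; // 30-minute lifetime
}

function initChatbot(): void {
  // Only now is any chatbot state persisted on the user's device.
  setChatbotSessionCookie(crypto.randomUUID());
  // ...load the chat UI and open the conversation channel here.
}

// Nothing is written on page load; the cookie is deposited only once the
// user clicks the chat button, i.e. expressly requests the service.
document
  .getElementById('open-chat-button')
  ?.addEventListener('click', () => initChatbot());
```

Any cookie used for additional purposes (analytics, advertising, cross-site tracking) would fall outside this exemption and would still require prior consent.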
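
The purge mechanism mentioned in the last bullet could look roughly like the sketch below, which discards conversation messages that are no longer relevant to the processing purpose or that have exceeded a retention period. The Message shape, the relevantToPurpose flag and the 30-day period are assumptions chosen for illustration, not values taken from the Guidance.

```typescript
// Hypothetical sketch of a regular purge of chatbot conversation data.
// The data model and retention period are illustrative assumptions.

interface Message {
  id: string;
  text: string;
  createdAt: Date;
  relevantToPurpose: boolean; // e.g. needed to handle a product claim
}

const RETENTION_MS = 30 * 24 * 60 * 60 * 1000; // assumed 30-day retention

// Keep only messages that are still needed for the stated purpose and
// are within the retention period; everything else is discarded.
function purgeConversation(messages: Message[], now = new Date()): Message[] {
  return messages.filter(
    (m) =>
      m.relevantToPurpose &&
      now.getTime() - m.createdAt.getTime() < RETENTION_MS,
  );
}

// A scheduler (e.g. a nightly job) would apply purgeConversation to each
// stored conversation; the persistence layer is omitted here.
```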

The Guidance is available on the CNIL website (in French only).