WP29 draft guidelines on profiling and automated decision-making under the GDPR

On 17 October 2017, the Article 29 Working Party (WP29) published new draft guidelines on profiling and automated decision-making under the GDPR (the Guidelines).

The Guidelines identify two benefits of profiling, namely increased efficiency and resource savings, and note that profiling and automated decision-making can be used to tailor services and products to individual needs.

However, the Guidelines take a conservative and restrictive interpretation of the underlying GDPR provisions. WP29 is clearly wary of the possible risks associated with these processing techniques and warns that, if sufficient safeguards are not put in place, their use may perpetuate existing stereotypes and social segregation.

The Guidelines were published as a draft and are subject to public consultation until 28 November 2017.

How are profiling and automated decision-making defined?

WP29 clarifies that the GDPR provisions on profiling address processing that is automated. Processing that combines some human involvement with automated processing remains within the scope of the profiling provisions, although the degree of human involvement is relevant to the GDPR restrictions on automated decision-making. The Guidelines emphasise that automated processing of personal data to analyse or make predictions about a data subject falls within the definition of profiling.

Automated decision-making is defined by WP29 as the ability to make decisions by technological means without human involvement. It is important to bear in mind that the concepts of automated decision-making and profiling are independent of each other. Automated decisions can be made without profiling (the Guidelines give the example of a speeding fine issued solely on the basis of a speed camera assessment), and profiling does not need to involve automated decision-making. The presence of certain types of automated decision-making in conjunction with profiling does, however, engage additional provisions of the GDPR.

Key issues to note from the Guidelines

Certain automated decision-making may only be made with human involvement

The Guidelines interpret Article 22 of the GDPR as a prohibition on fully automated individual decision-making that has a legal or similarly significant effect. There are some exceptions, which are explained below. However, the key point to note is that unless an exception applies, no decision may be made that produces a legal or similarly significant effect on the data subject without meaningful human oversight. This is comparable to how the similar requirement in Article 15 of Directive 95/46/EC was transposed into the local data protection laws of some Member States, such as Belgium, the Netherlands and the Czech Republic. The current UK regime set out in Section 12 of the Data Protection Act 1998 is, however, very different: the Act does not include a general prohibition on automated decision-making, but merely gives a data subject the right to object to such processing by a written notice requiring the controller not to make an automated decision about him or her.

Unsurprisingly, the Guidelines also rule out "token" or superficial human involvement: the human involvement must come from a person who has the authority and the competence to change the decision and who, as part of the analysis, considers all the available input and output data. Effectively, this means that where a decision that has legal or similarly significant effects is based on automated profiling (e.g. the use of algorithms), it has to be reviewed by a qualified individual to check the conclusion. This will add significant bureaucracy to the process; few individuals will be qualified to fully assess the workings of an algorithm.

As noted, this rule is subject to some caveats and exceptions:

  1. It only applies where the automated decision produces either a legal or other similarly significant effect. Examples of legal effects include a person being entitled to, or denied, a particular social benefit. Interestingly, the Guidelines also consider what might constitute a "similarly significant" effect. WP29 specifies that such effects must be more than trivial and must be sufficiently great or important to be worthy of attention; in other words, the decision must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned. In WP29's view, this extends to matters such as refusal of credit for a significant purchase, online advertising based solely on automated processing where the individual being targeted is vulnerable, and differential pricing which may have the effect of excluding the data subject from certain goods or services. What WP29 does not address is how controllers can determine which data subjects are so vulnerable that targeting certain advertisements at them could have this significant effect, or which data subjects would effectively be excluded from certain purchases as a result of differential pricing.
  2. Furthermore, the general prohibition on automated decision-making will not apply if the processing is necessary for entering into or performing a contract; is authorised by EU or Member State law containing appropriate safeguards; or is based on the data subject's explicit consent. The Guidelines impose further restrictions in this area. Taking the example of credit referencing prior to entry into a contract, they conclude that this will not always be sufficient to show that the processing is "necessary": a controller will need to be able to justify why the profiling was "necessary", and if there is a less privacy-intrusive means of achieving the same outcome, the profiling will not be necessary. Later in the Guidelines, there is some interesting commentary on consent, which provides that consent will not be an appropriate basis for processing where consent to profiling is a pre-condition of accessing the controller's services. A valid consent will also need to satisfy all the other GDPR requirements; for instance, it must be freely given, informed and specific, which presents a challenge when profiling is based on complex algorithms.

Where a controller wishes to rely on the exceptions relating to necessity for entering into or performing a contract, or the data subject's explicit consent, suitable safeguards must be in place and the data subject must be afforded the right to obtain human intervention, to express his or her point of view and to contest the decision.

The normal rules apply, notwithstanding the complexity of profiling

Articles 13(2)(f) and 14(2)(g) of the GDPR require controllers to provide data subjects with specific information about automated decision-making, including meaningful information about the logic involved. Naturally, this requirement has sparked a debate about how exactly a controller could explain to data subjects the functioning of the complex algorithms involved in automated decision-making.

WP29 provides useful guidance in this respect and clarifies that the controller should find simple ways to tell the data subject about the rationale behind the decisions, without necessarily attempting a complex explanation of the algorithms used. For example, where a controller applies a credit scoring method, it can explain to the data subject that this process helps it make fair and responsible lending decisions, and provide details of the main characteristics considered in reaching the decision, the source of this information and its relevance.

The Guidelines also address each of the (many) other provisions of the GDPR which may affect profiling, including purpose limitation, data minimisation, storage limitation, lawful basis for processing, special categories of data, data subject rights, children and DPIAs. Of these, it is interesting (and concerning) to note that, in the section on the right to object, the Guidelines state that to justify continued processing on the grounds of compelling legitimate interests, it will not be sufficient for the controller simply to show that the processing is important to society at large rather than merely to its own business interests; the controller will also need to "prove" that the objective is "critical for the organisation".

Good practice recommendations given by WP29

Finally, Annex 1 to the Guidelines includes a set of specific good practice recommendations by WP29 to assist controllers in meeting the requirements of the GDPR provisions on profiling and automated decision-making.