Data Subject Rights under GDPR – disclosure of recipients, proof of identity, relevance of purpose and breach of response times
We’re now approaching the five-year anniversary of the General Data Protection Regulation (GDPR) taking full effect.
In the run-up to 2018 and the period afterwards, there were many predictions about the likely direction of GDPR enforcement and areas where we could expect regulators and courts to clarify interpretations.
As with most new pieces of legislation, it has taken time for major precedent and case law to emerge. The wheels have turned slowly as major cases have had to proceed through national data protection authorities, and sometimes up to the European Data Protection Board (EDPB), national courts or the Court of Justice of the European Union (CJEU).
The last year has seen a significant uptick in precedent and case law, and the volume is likely to accelerate in 2023. This blog looks through some of the key cases from the last year and what is coming up in 2023, walking through the key cases by theme. Given it is a significant topic in its own right, the blog doesn’t cover trends related to cyber security cases; we’ll cover that in a future blog. The blog also doesn’t cover international transfers, as they have been extensively covered in previous Allen & Overy blogs.
Data Subject Rights under GDPR – disclosure of recipients, proof of identity, relevance of purpose and breach of response times
The most heavily used right under GDPR is the right of access under Article 15. It is often seen as a ‘cornerstone right’, as personal data gained from a data subject access request (DSAR) can enable a fuller understanding of how personal data is used by a controller and often enables the exercise of other rights, such as erasure. Organisations that regularly receive DSARs will have policies and procedures in place as part of their data protection governance. While the right of access under GDPR was not new, there was a limited amount of case law under the previous EU data protection regime, so organisations will want to track developments in this area closely and consider where they need to review and update their policies and processes.
CJEU judgment – disclosure of recipients
In early 2023 the CJEU issued one of its most significant judgments on the right of access under GDPR. The case involved Österreichische Post AG (the Austrian Postal Service) and had been referred to CJEU to address the scope of the right under Article 15(1)(c) of GDPR – the right to information on ‘the recipients or categories of recipient to whom the personal data have been or will be disclosed’. Österreichische Post had responded to the data subject in general terms and explained that the personal data was processed for marketing purposes and shared with customers, such as mail order companies, IT companies and political parties.
The CJEU clarified that ‘the data subject has the right to obtain from the controller information about the specific recipients to whom the personal data concerning him or her have been or will be disclosed’. The controller therefore does not have the option of disclosing only general information about the categories of recipients. The CJEU did qualify this slightly, recognising that ‘the right of access may be restricted to information about categories of recipient if it is impossible to disclose the identity of specific recipients, in particular where they are not yet known’, and that the controller may refuse on the basis that the request is manifestly unfounded or excessive under Article 12(5)(b).
The judgment is also in line with the position taken by the EDPB in their draft guidelines on the right of access (published January 2022). We can expect the final guidelines to be published soon.
The judgment indicates how the CJEU is likely to interpret the access right in its wider context and how it can act as a gateway to other rights. Organisations may need to adjust their procedures for handling DSARs to cover the situations in which this information should be supplied, noting that in many cases data subjects choose to narrow their requests or specify the personal data they are seeking.
It is also worth noting that the provisions under Articles 13 and 14 of GDPR that cover the right to information, essentially the information that controllers will often provide in the form of a privacy notice, also contain the requirement to proactively disclose information about recipients or categories of recipients, when the data is collected directly or indirectly. For indirect collection, Article 14(5)(b) contains an exception for when ‘the provision of such information proves impossible or would involve a disproportionate effort’.
German Court decision – relevance of purpose behind a DSAR
DSARs are often not made in isolation; they can be connected to another problem, such as a customer service dispute with the controller or an employment dispute. This can be a reasonable and legitimate use of the right and can often assist the data subject in gaining a fuller understanding of an issue. However, in some cases controllers have sought to argue abuse of GDPR as a reason to reject a request. This is a common issue across many sectors, especially financial services.
In March 2022, in Germany, the Higher Regional Court in Nuremberg (8 U 2907/21) found that a data subject had no right to access under Article 15, as the controller was entitled to reject the request in reliance on Article 12(5)(b) GDPR. The Court explained that the provision lists ‘repetitive character’ as an example of an excessive request, and the use of ‘in particular’ makes it clear that Article 12(5)(b) also intends to cover other forms of abusive requests. The Court referred to the purpose of Article 15 GDPR via Recital 63, which is to enable the data subject to be aware of, and verify, the lawfulness of the processing. The Court ruled that the data subject was clearly not interested in verifying the lawfulness of the processing but rather in checking whether the adjustments made to the insurance premiums were formally compliant with German insurance law. As a result, the Court concluded the request was abusive and Article 12(5)(b) could be relied upon. A separate case on this issue has also been referred to CJEU, which should provide the full precedent for controllers on this issue.
Irish Data Protection Commission (IDPC) – decisions on seeking identity documents
The level of identity verification needed before responding to a data subject’s request to exercise their rights under GDPR is an issue that controllers need to consider carefully and in the particular context of the processing. In two cases, involving major online platforms (AirBnB and Twitter), the IDPC ruled that the level of identity verification sought before acting on an erasure request was excessive and breached GDPR. In both cases the controllers requested that the data subject verify their identity by providing a photocopy of their identity document, which they had not previously provided to the controller. The IDPC found the controllers in breach of the data minimisation principle under Article 5(1)(c) of the GDPR, as there were less intrusive options available to the controllers. The IDPC also found them in breach of Article 6 GDPR, as they lacked a lawful basis for processing, and in breach of Article 12, as the request was not met within the statutory period of one month.
Reprimands were issued in both cases and the controllers were ordered to make changes to their policies and procedures related to identification and requests from individuals to exercise their GDPR rights. The decisions were agreed at the EDPB via the GDPR Article 60 process and can therefore be taken as precedent across the EU.
Companies operating under the GDPR should therefore consider whether they need to review or revise their rights handling policies and procedures to ensure their approach to identification is proportionate.
Pending CJEU case – relevance of purpose behind a data subject access request
The issue of purpose will also be considered in a pending case at CJEU, referred from a Court in Germany (DKV Deutsche Krankenversicherung AG Case C-672/22). The case also involves the insurance sector and is very similar to the Nuremberg judgment above. The CJEU will now need to consider whether a controller is obliged to respond to a GDPR access request with the aim of verifying the effectiveness of increases to a private health insurance premium.
Other Pending CJEU cases related to DSARs – definition of ‘copy’ and scope of access
It is also worth briefly noting two other DSAR cases that are pending at CJEU. The first is an Austrian case (Österreichische Datenschutzbehörde and CRIF C-487/21) relating to the definition of ‘copy’ and whether there is a right to entire documents. We await the final judgment, but the Advocate General’s Opinion, published in late 2022, found that the concept of ‘copy’ referred to in that provision ‘must be understood as a faithful reproduction in intelligible form’ and that ‘that provision does not confer on the data subject a general right to obtain a partial or full copy of the document that contains his or her personal data or, if the personal data are processed in a database, an extract from that database’.
The case from Finland (Pankki C-579/21) has also reached the Advocate General (AG) Opinion stage, with a conclusion about the scope of personal data accessible via a DSAR. The AG considered that there is no right of access to the identity of those who may have consulted the personal data.
Consent, adtech and cookies
The EDPB’s Cookie Banner Taskforce delivered its report in January 2023, reaffirming the position of DPAs on cookie consent. As the ePrivacy Directive is not covered by the GDPR one-stop shop, the EDPB had to take a more cautious approach to harmonisation, and the report seeks to summarise the positions of the majority of DPAs and explain the exceptions. (The proposed new ePrivacy Regulation would have addressed this issue, but the negotiations have been on hold for over five years now.) In summary, the EDPB report set out the following:
The majority of the DPAs take the position that the ‘absence of refuse/reject/not consent options on any layer with a consent button of the cookie consent banner is not in line with the requirements for a valid consent and thus constitutes an infringement.’
The task force agreed that pre-ticked boxes cannot constitute valid cookie consent.
The taskforce agreed that cookie banners must avoid deceptive practices when displaying links, colours or contrasts.
There was also agreement that subsequent processing following the processing using a cookie cannot be compliant if Article 5(3) of the ePrivacy Directive is infringed, including consent requirements. This is also the case if the controller then relies on legitimate interest for additional processing.
The taskforce set out concerns that some banners were indicating that certain cookies were ‘essential’ or ‘strictly necessary’ but these cookies would not meet the required standards of the ‘strictly necessary’ exemption in the ePrivacy Directive.
The overall direction on cookies in terms of DPA enforcement is clear, and organisations should ensure they have reviewed and updated their approach to cookie consent in light of the position set out above. While DPAs may not always issue fines to smaller organisations, they are still likely to require steps to be taken to resolve non-compliance. Reprimands may also be made public.
There are, of course, wider questions as to whether users are effectively engaging with consent choices even if these compliant steps are in place, given the speed at which users move between different websites and apps, and the evidence of consent fatigue. A February 2023 report from the University of Pennsylvania looked at US consumer understanding of consent online and argued that ‘informed consent at scale is a myth’. This is also an issue the UK is considering with the reforms in its Data Protection and Digital Information Bill (currently paused in the UK Parliament). See my previous blog on the Bill. However, these are questions for policy makers, and in the meantime the compliance steps are now at least becoming clearer.
Pending CJEU decision on IAB Europe relevant to the wider adtech ecosystem
Lastly, the wider adtech ecosystem continues to face uncertainty following the 2022 decision by the Belgian Data Protection Authority against the EU trade body, the Interactive Advertising Bureau (IAB). The decision found that the transparency and consent framework (TCF) issued by the IAB was in breach of GDPR. It also found that the IAB was acting as a joint controller (with the other companies implementing the framework) in its role related to the TCF. The decision found that the IAB did not have an appropriate Article 6 lawful basis for the processing of personal data through the TCF and breached transparency requirements, as well as the accountability, security, and data protection by design and by default principles. A fine of EUR 250,000 was imposed on the IAB, which was ordered to submit an action plan.
As part of the IAB appeal, the Belgian Court of Appeal has referred several questions about the application of GDPR to the CJEU. The questions concern IAB Europe's status as a (joint) controller, and whether the ‘TC String’ (a string of numeric characters reflecting users' preferences) can be considered personal data. These questions will have wider implications for a range of scenarios related to digital services online involving joint controllers, the role of trade bodies in setting standards, and the use of unique identifiers.
The approval and implementation of the IAB’s action plan was delayed pending an appeal but was approved by the Belgian DPA in January 2023. The IAB now has six months to implement it but has sought an injunction against this requirement.
Profiling and automated decision making (ADM)
As companies make greater use of profiling, algorithms, machine learning and artificial intelligence in their business operations, data protection authorities are focusing more time and resource into monitoring and investigating their use - proactively and in response to complaints. We can expect more regulatory action and also key questions about how the GDPR should be applied to these scenarios.
UK takes enforcement action on profiling involving special category data
In the UK the ICO issued a fine of GBP1.48 million and an enforcement notice against catalogue retailer Easylife for unfair processing and lack of transparency. Easylife undertook profiling that made assumptions about certain purchases and categorised customers against medical conditions. The 145,400 customers affected were not informed about the profiling and the ICO concluded that the profiling included use of special category data. Easylife also failed to get valid explicit consent for using this data.
It is also notable that the ICO cited the 2022 CJEU case from Lithuania, OT v Vyriausioji tarnybines etikos komisija (Case C-184/20), when considering the issue of inferring special categories of data. The ICO noted the judgment ‘confirms that the protections which the GDPR gives to data subjects' special category data, including health data, extend beyond inherently sensitive data to cover data revealing health data indirectly, following an intellectual operation involving deduction and cross-referencing’. The CJEU judgment is also an important indication of the broad and purposive approach the CJEU is likely to take to the interpretation of the special category data definition. EU DPAs will be bound to apply the precedent about inference, and it is interesting that the ICO has sought to cite it, even though it is not bound to follow new CJEU precedent post Brexit.
Pending CJEU cases related to ADM and profiling
A pending CJEU case referred from the Wiesbaden Court in Germany will create important precedent about the interpretation of Article 22 GDPR provisions on automated decision making. The case centres on a credit score provided by a German Credit Reference Agency that was then used by a third party to refuse credit. The question referred to CJEU is whether Article 22(1) GDPR can be interpreted as:
..meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined by means of personal data of the data subject, is transmitted by the controller to a third-party controller and the latter draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject?
The referring court assumed that, based – inter alia – on the importance of the credit score for the decision-making practice of third-party controllers, Article 22 GDPR also applies to the credit agency. In particular, the calculation of a credit score is not merely a preparatory processing for a third-party controller’s decision but rather the credit agency’s own automated individual decision-making.
The question will turn on a number of important issues - the use of probability values in ADM, whether the threshold of the ‘legal effects’ test is met and the role of third parties in ADM who then place significant reliance on the original automated processing.
A pending CJEU case referred from Austria will consider a set of questions related to profiling and data subject access rights. The data subject asked the credit reference agency, Dun & Bradstreet (D&B), for information on their profiling. D&B refused, stating that compliance would violate its trade and business secrets and would involve disclosing parts of its algorithm. The right to access information about ADM and profiling is set out in Article 15(1)(h) of GDPR, including at least meaningful information about the logic involved. (There is a corresponding requirement to proactively provide this information under Articles 13 and 14, but it was not cited in this case.)
The resulting case was referred to CJEU (in a very detailed referral), focused on the following questions:
Is the GDPR requirement to provide access to information only ‘meaningful’ if the citizen can ‘actually, profoundly and promisingly exercise’ the right to express their views (under Article 22) and address a decision based solely on automated processing, including profiling?
Is the GDPR obligation to provide ‘meaningful information’ about profiling only discharged when a data subject can use the information to assess if there is a relationship between the disclosed information and their credit score?
This profiling case could also have wider implications for access to information and transparency about the use of algorithms in a range of business situations, including the use of artificial intelligence and machine learning. It will also bring closer focus on the concept of explainability. The UK ICO has produced detailed guidance on AI and explainability in conjunction with the Alan Turing Institute (the UK’s national institute for data science and artificial intelligence). Transparency should be a key component of any wider AI governance or ethics programme.
Applying learnings about ADM beyond the EU
Looking beyond the EU it is also worth noting that the newly formed California Privacy Protection Agency has just issued a consultation on proposed rulemaking related to automated decision making under the California Consumer Privacy Act of 2018 (CCPA).
New precedent on lawful basis under Article 6 GDPR
Contract as lawful basis – IDPC decision rules against use for personalised advertising
One of the most significant GDPR precedents was the January 2023 decision from the IDPC, which found that Meta could not rely on the lawful basis under Article 6(1)(b), processing necessary for the performance of a contract. The IDPC’s decision was made following a dispute resolution decision by the EDPB under Article 65. These decisions were in response to changes Meta made to its lawful basis for personalised advertising when GDPR came into force in 2018 – the user accepted terms to access the service, forming the basis of a contract.
The EDPB overruled the IDPC to find that Meta could not rely on its contract with users as a lawful basis when processing personal data for the delivery of personalised adverts. The IDPC also found (and the EDPB agreed) that Meta had not provided sufficient transparency for the processing, and Meta has been ordered to remedy this, including by being more specific when explaining the processing related to each lawful basis. Meta is also required to bring its processing into compliance with Article 6(1)(b); this may include, but is not limited to, finding an alternative lawful basis under Article 6.
Meta has indicated that they will appeal.
EU DPAs are setting a high bar for processing necessary for the performance of a contract. Their interpretation places a significant focus on the wider context of individual rights, and less weight is afforded to the controller’s assertion as to what digital service features are provided under the contract. (The IDPC found that personalised ads were a part of Meta’s services and that people understood that the services are predicated on personalised ads.)
There are important learnings from the decisions related to transparency. There is a tension between the GDPR’s requirement for clear, simple and accessible language and the need to specify detail about the processing related to each lawful basis, as well as questions about what can be provided in different layers and how layers should be considered in the round.
It is important to note the scope of the decisions – they cover personalised advertising and don’t cover other forms of digital advertising, e.g. contextual advertising. They also don’t cover recommended or personalised content in other forms, e.g. news or user-generated content. The EDPB recognises the different context of other aspects of personalised content in its 2020 guidelines on targeting of social media users. The approach to other areas of personalisation therefore remains an open question.
At the current time companies will need to carefully consider and review any processing scenarios where they are relying on a contract as the lawful basis for personalised advertising. It is worth noting that personalised ads are not always delivered using cookies, and therefore consent is not always required; for instance, ads can be delivered when a user is logged into an internet retail service, based on previous purchases rather than cookies.
The overall direction is likely to push towards a greater use of consent (the challenges on this we note above in the section on Consent, adtech and cookies). Controllers may also wish to consider how they can innovate around accessible transparency, delivering through various channels, considering ‘gamification’ via videos and other mechanisms. There may also be a need for technology solutions from the market, or developed by controllers, to help map and record processing operations to lawful basis, and support updated and dynamic publication. Solutions will need to be proportionate and scalable for a range of businesses.
Pending CJEU case – can a purely commercial interest be regarded as a legitimate interest
A pending CJEU case (Koninklijke Nederlandse Lawn Tennisbond Case C-621/22), referred from a court in the Netherlands, will consider the question of what constitutes a legitimate interest. The CJEU will consider whether a commercial interest, such as the provision of personal data in return for payment without the consent of the data subject concerned, can be regarded as a legitimate interest under certain circumstances, and what circumstances will determine whether a purely commercial interest is a legitimate interest. The Dutch DPA’s position, as reiterated in its guidance, is to reject the use of legitimate interest in such circumstances.
This is again a case that could have wider implications for a range of processing scenarios for businesses. The European Commission has been critical of the stance taken by the Dutch DPA on the issue, and its letter to the DPA, setting out its counter legal view, is available online.
Pending CJEU case – validity of consent given to an undertaking in dominant position
The CJEU case, Facebook Inc. and Others v Bundeskartellamt case C-252/21, is particularly noteworthy because it will shed light on the overlap between antitrust law and data protection law.
The case concerns Meta’s processing of personal data from various sources. The German antitrust authority prohibited Meta’s processing because it found that user consent was not freely obtained. As noted by the authority, users had to agree to terms and conditions that allowed the social media service to process such data, but the authority held that this was not freely obtained because users could not register to use the social network without such agreement.
Following this, the Higher Regional Court of Düsseldorf referred some interesting questions to the CJEU, in particular regarding the competence of antitrust authorities with regard to data protection violations and obtaining “forced consent”. The CJEU’s decision is still pending. In light of the Digital Markets Act and its requirements for gatekeepers, the decision will be extremely relevant for both data protection and antitrust law.
GDPR and facial recognition
Another important area of activity for data protection enforcement in the last year has been centred on the use of facial recognition technologies. It is notable that the EU DPAs and the UK ICO have made this area a priority. The cost of the technology has fallen and many use cases and benefits have been presented, from authentication to crime prevention. The DPAs are particularly concerned about the use of live facial recognition in public places, given the risks of indiscriminate collection of data and the use of biometric data that facial recognition entails. There has been less focus on uses for authentication and access (for example for device logons).
The ICO has issued two formal opinions on the use of live facial recognition: one on use by law enforcement and one on other uses in public places. While the ICO has given strong messages about the legal tests that must be met to deploy facial recognition technology in public places, it does not seek to ban its use. The ICO has focused on the importance of conducting data protection impact assessments and effective consideration of necessity and proportionality tests, considering wider risks and harms.
In 2022, the key enforcement action in this area related to Clearview’s facial recognition service, which has billions of images that come from many sources, including scraping from social media. The service was then trialled or used by a number of police forces.
The ICO and the French, Italian, Greek, and Swedish Data Protection Authorities have all taken enforcement action, and, outside the EU, the Australian and Canadian Commissioners also took action. The enforcement decisions made findings about infringements of fairness, transparency, lawful basis and the use of special category data. The level of international co-ordination is also significant.
Because of Clearview’s location in the US, the EU and UK based enforcement actions may also pivot on the application of territorial scope under Article 3(2) GDPR. The DPAs found that Clearview met the test of monitoring the behaviour of data subjects in the UK/EU, despite the implementation of Clearview’s systems taking place through national police forces. Clearview’s pending appeals may therefore provide some wider case law on how this aspect of GDPR should be applied. This could create precedent for companies working through third parties in other circumstances. Any appeals may also be significant for wider scenarios involving image scraping from social media.
While recognising the different risk profile for authentication and access applications, the key message around use of facial recognition in public places is to focus on rigorous risk and impact assessment before deployment, considering the full range of compliance issues engaged.
Class actions and compensation under GDPR
Finally, it is worth noting that clarity is starting to emerge on the issue of class actions and compensation under GDPR, in the UK and EU. My A&O colleagues Nathan Charnock and Jason Rix published a blog about the opinion of the Advocate General (AG) of the CJEU in the case of Österreichische Post. We can also expect the full judgment of the CJEU later this year. As explained in more detail in the blog, the AG found that infringement of the GDPR alone does not give rise to the payment of compensation. In particular, he concluded that an individual must have suffered damage as a result of the infringement to trigger compensation.
In 2021, the UK Supreme Court, in the case of Lloyd v Google, rejected the proposition that compensation for a data breach was payable for infringement without proof of damage. The A&O team have also blogged on this case.
The cases illustrate that the Courts in both jurisdictions are likely to take a harm-based approach to such GDPR compensation claims, though we await the formal judgment in Österreichische Post. It will also be useful to consider how this compares with the approach taken to harm in the US Courts, where there is already a significant history of class actions related to data breaches. In the US, to establish standing, plaintiffs need to show that they have suffered an “injury in fact” that is concrete, particularised, and actual or imminent. That said, commentators have suggested that US Courts may be warming to future harm as satisfying injury-in-fact.
Stay connected with the A&O Digital Hub for more analysis
The Data Protection team at A&O will continue to analyse the trends and key issues from these cases throughout 2023 and will publish further blogs to explain key learnings from the decisions and judgments.