Freeths Data Protection Update: Winter 2022/23
Welcome to the Winter edition of the Freeths Data Protection Update.
In this edition we look at a recent ECJ decision regarding a data subject’s right to discover to whom their personal data has been disclosed, the use of AI by local authorities, the ICO’s recent “Tech Horizons” report, and the ICO’s new approach to reprimands.
- Right to know to whom personal data has been disclosed
(Case report: RW v Österreichische Post AG (Case C‑154/21) EU:C:2023:3)
- Concerns on the use of Artificial Intelligence by local authorities
- ICO – Tech Horizons Report
- Reprimands – The ICO’s new name and shame strategy
Right to know to whom personal data has been disclosed
Case report: RW v Österreichische Post AG (Case C‑154/21) EU:C:2023:3
by Olivia Hill
Transparency is one of the seven key principles of the GDPR. In essence, individuals have the right to know who is processing their personal data, how it is being processed and for what purpose. Organisations must therefore be clear, open and honest with data subjects about the processing activities they are carrying out.
Under Article 15 of the GDPR, which sets out data subjects’ rights of access to their data, individuals have the right to obtain from a controller (amongst other information) details of the recipients or categories of recipient to whom their personal data is or will be disclosed.
A recent ECJ case was centred around this particular “right of access” and is a clear demonstration of the importance of making such information available to data subjects.
In January 2019, RW (an individual) asked Österreichische Post for access to his personal data under Article 15. RW also requested information about: (a) whether his personal data had been disclosed to any third parties; and (b) if so, the identity of those third-party recipients.
In response, Österreichische Post confirmed that: (1) it used personal data, to the extent permissible by law, during its activities as a publisher of telephone directories; and (2) that it offered such personal data to trading partners for marketing purposes. Österreichische Post did not disclose the identity of the specific recipients of the data to RW.
RW brought proceedings against Österreichische Post before the Austrian courts, seeking an order that Österreichische Post provide the identity of the recipients. During proceedings, Österreichische Post informed RW that his personal data had been processed for marketing purposes and forwarded to customers, including advertisers trading via mail order and stationary outlets, IT companies, mailing list providers and associations such as charitable organisations, non-governmental organisations (NGOs) or political parties.
The Austrian courts dismissed RW’s action at first instance and his subsequent appeal, on the ground that Article 15(1)(c) GDPR, by referring to ‘recipients or categories of recipient’, gives the controller the option of informing the data subject only of the categories of recipient, and not their specific identity.
RW subsequently brought an appeal on a point of law before the Oberster Gerichtshof (Supreme Court). The Oberster Gerichtshof was uncertain as to whether Article 15(1)(c) GDPR should be interpreted to grant data subjects the right to access specific information about the recipients, or whether the controller has discretion in how they respond. The Oberster Gerichtshof therefore decided to stay proceedings and refer to the European Court of Justice for a preliminary ruling.
The ECJ held that Article 15(1)(c) should be interpreted to place an obligation on controllers to provide the data subject with the identity of recipients unless: (1) it is not possible to identify them; or (2) the controller can demonstrate that the request is manifestly unfounded or excessive (under Article 12(5)(b)).
The following reasoning was applied:
- Despite Article 15(1)(c) not giving an explicit order of priority between the terms ‘recipients’ and ‘categories of recipients’, the Court noted that Recital 63 of the GDPR did not allow for the right of access to be restricted solely to categories of recipient.
- Article 15 provides data subjects with a genuine right of access and, when exercised, it is the data subject that must have the option of choosing to obtain information about the specific recipients that have received the data.
- The right of access is necessary to enable the data subject to exercise other rights conferred by the GDPR, such as their right to rectification, right to erasure, right to restriction of processing, right to object to processing or right of action where they suffer damage.
Whilst ECJ rulings are no longer legally binding in the UK, the terms of Article 15(1)(c) of the GDPR mirror those in Article 15(1)(c) of the UK GDPR, so the ECJ’s interpretation is likely to be highly persuasive in the UK. The case should serve as a useful reminder that data subject access rights can only be circumvented in very limited circumstances.
Organisations should also note that ICO guidance on the application of the UK GDPR encourages organisations to be “as specific as possible” when providing data subjects with information about recipients of their personal data. As is always the case with breaches of the UK GDPR, falling short of this exposes organisations to significant fines and potential reputational damage.
Concerns on the use of Artificial Intelligence by local authorities
Much of Artificial Intelligence and Machine Learning focuses on the development and use of algorithms. That is, sets of instructions which tell a computer how to “learn” and therefore continuously develop how it performs its functions.
The Information Commissioner’s Office (ICO) recently conducted an inquiry into the use of algorithms in AI interfaces following concerns raised with regard to benefit entitlement and the welfare system. The ICO consulted with various IT suppliers, as well as a sample of local authorities and the Department for Work and Pensions, as part of this inquiry.
No discrimination was found in the use of AI in relation to the benefit system. However, as more and more public authorities use this technology for simple tasks, it is necessary to regularly review compliance with the fundamental principles of Data Protection.
Whilst the use of AI and other technological advancements continues to grow, so does the potential risk to compliance with Data Protection legislation and to the rights of individuals.
Discrimination/Unlawful Use: no discrimination or unlawful use was found in the use of simple AI algorithms relating to the benefit process. This was principally because the final decision was human led and the AI element only sought to reduce the administrative workload.
Practical Steps for Local Authorities:
- any AI led data processing should be proactively reviewed regularly to ensure compliance with Data Protection Laws;
- local authorities should regularly review their privacy policies to ensure they properly reflect current processes, such as the use of any AI; and
- data protection impact assessments should be used to identify and minimise the risk of using AI data or analytics.
The capabilities of new technologies, including AI, are always increasing. So too are the potential benefits they can bring, such as reductions in human error, administrative burdens, and customer disputes.
However, local authorities need to ensure that their collection and use of personal data in any AI algorithm complies with their duties as data controllers, keeping individuals’ rights to fairness and transparency at the forefront and maintaining public trust.
The ICO remains committed to working with and supporting the public sector to ensure the lawful use of AI and to striking a fair balance between the interests of local authorities and the rights of the public.
The ICO has developed a data protection impact assessment for public authority use which can be found here.
ICO – Tech Horizons Report
By Luke Dixon
Following the arrival of the new Information Commissioner, the ICO set out a new strategy for supporting business, fostering innovation, and reducing harms to data subjects.
As part of this strategy, the ICO has published its first “Tech Horizons Report” (the “Report”). The Report is intended to discuss technologies that are set to emerge over the next two to five years.
By publishing the Report, the ICO also hopes to encourage developers to adopt a “privacy by design” approach from the outset when implementing new technologies.
The Report focuses on the following emerging technologies:
- Consumer health tech, such as wearable devices and health and wellbeing apps;
- Next generation “Internet of Things”;
- Immersive technology, such as augmented and virtual reality; and
- Decentralised finance, such as software that employs blockchain technology to support peer-to-peer financial transactions.
The Report identifies a common set of challenges that these technologies present:
- Transparency. Some new technologies (such as augmented reality and smart home systems) may collect information about parties other than the intended user. This data processing might not be transparent to data subjects, who in turn might lack the means for meaningful control over their personal data when processed by such technologies.
- How data subjects may exercise their rights. Some new technologies (such as decentralised finance systems and the next generation “Internet of Things”) involve a complex data ecosystem. This makes it hard for individuals to understand how organisations process their data, and how to hold those organisations to account for their data processing.
- Collecting too much data. Some technologies may collect more data than they should for their primary purpose. The Report cites tracking individuals across consumer health tech or virtual reality devices as an example of this.
- The need to safeguard sensitive data. Given that some of the new technologies collect special category data (such as health data and biometric data), organisations need to understand how to identify such data as special category and put in place additional safeguards to protect the data.
The ICO anticipates that it will take the following steps in preparing for its regulation of these emerging technologies:
- Work with the public about the benefits and risks of these emerging technologies and how it will approach them as a regulator.
- Invite organisations to work with its Regulatory Sandbox to engineer data protection into these technologies.
- Develop guidance for organisations where needed, starting with guidance on data protection and IoT.
- Proactively monitor market developments so it can act on uses that cause concern.
The ICO produced this Report using “foresight methodology” to identify and discuss the data privacy issues relating to four of the most impactful emerging technologies. It does not cover an exhaustive list of new technologies, though it does helpfully “nod” towards those that did not make the cut, namely: behavioural analytics; quantum computing; digitised transport; generative AI; synthetic media and digital ID.
The Report continues the recent trend of the ICO producing clear and practical guidance that organisations should find helpful when implementing privacy by design into their emerging technological solutions. It also ties in with the ICO’s own strategy for the next two to three years as expressed in its ICO25 strategy document.
The Report does not discuss what it cites as potentially the most impactful new technology of all, namely neurotechnology. However, it promises a standalone “deep-dive” report into the data privacy implications of this area. We look forward to the publication of this report, which is expected during Spring 2023.
Reprimands – The ICO’s new name and shame strategy
The ICO has recently declared that it will publicise reprimands going back to January 2022 as a way to achieve greater transparency.
This somewhat coincides with the decision that public authorities will no longer generally be fined for breaches. It stands to reason that this is a trade-off for the disappearance of fines.
Unfortunately, there are serious drawbacks to the new policy. As controllers of personal data are encouraged to self-report incidents above a fairly low threshold, the threat of having a reprimand issued and published could deter organisations from being open and transparent at the early stages of comparatively minor incidents.
The reason for this is partially that there is no current way to appeal a reprimand once issued (other than via judicial review). In essence, any organisation receiving a reprimand has no ability to challenge the outcome prior to publication.
That is not to say that controllers as a whole will not benefit from being better able to understand which practices result in reprimands, and from learning from others’ mistakes. However, it remains to be seen whether the possibility of adverse publicity (which, of course, already existed in relation to more serious sanctions such as fines) will have a depressive effect on the number of notifications made.
One other consideration for those making notifications is the risk that publication of a reprimand may operate to encourage litigation by data subjects some time after the event that the reprimand relates to. Not all these events will have been notified to data subjects at the time that they occurred.
While we normally see claims being advanced quite quickly after an incident, the new regime may lead to some claims being made after the point that reprimands are issued and publicised. That may create problems for the controller on the receiving end of that claim, who may no longer have access to contemporaneous evidence of the steps taken in connection with the event, or any harms suffered.
We would always encourage controllers to seek: (a) immediate legal assistance on their communications with the ICO; and (b) advice on evidence preservation at the point that an incident is discovered. Our team has the expertise in both incident response and litigation to be able to steer our clients safely through the crucial early period between discovery and notification, as well as being happy to use our experience to help clients improve their safeguards and incident response processes before anything goes wrong.
If you have any queries on the topics in this update, please get in touch with our Data Protection team.
The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.
‘Doing the right thing’ is at the heart of Freeths. Find out more about our excellent client service and the strong set of values that guide the way we work.
Talk to us
Freeths are a leading national law firm with 13 offices across the UK. If you have a query about our services or just want to find out more, why not give us a call?
Contact: 03301 001 014