Freeths Data Protection Update: Summer 2022
Welcome to the Summer edition of the Freeths Data Protection Update.
In this edition we consider the potential impact of the new Data Reform Bill, the Information Commissioner’s new approach to enforcement in the Public Sector, the regulatory position for using personal data beyond the UK GDPR / DPA, and the recent multi-million pound fine for Clearview AI Inc’s use of personal data collated from social media in facial recognition software.
- Brave New World? UK Government responds to consultation on proposed Data Reform Bill
- Public sector enforcement – an open letter from the Information Commissioner
- Open Season? Sharing personal data outside of regulated sectors
- Clearview AI Inc fined millions for facial recognition breaches
Brave New World? UK Government responds to consultation on proposed Data Reform Bill
By Luke Dixon
The UK Government has published its response to a consultation on the proposed Data Reform Bill.
The Government launched its consultation (“Data: A New Direction”) in September 2021 to inform its development of proposals to reform the UK’s data protection laws.
In this article, we summarise some of the key take-aways from the Government’s response. We also look at whether this is a brave new dawn for UK data protection, or a case of evolution over revolution.
The Government has undertaken its reform and consultation programme with five core principles in mind:
- To ensure high standards of data protection, while providing greater flexibility to organisations.
- To future-proof the UK’s data protection regime.
- That almost all organisations which comply with the UK’s current regime will comply with the future regime.
- That the UK’s data protection regime will deliver concrete advantages for the UK while preserving data subjects’ rights and the independence of our regulator.
- That the reforms to the Information Commissioner’s Office (“ICO”) will ensure effective, risk-based and preventative supervision.
Legitimate Interests
What were the Proposals?
The Government proposed to create a limited, exhaustive list of legitimate interests for which organisations could use personal data without applying the balancing test and without unnecessary or inappropriate recourse to consent. It suggested several processing activities for consideration as part of such a list.
The Government intends to pursue the proposal in relation to an initially limited number of carefully defined processing activities. These are likely to include processing activities which are undertaken by data controllers to prevent crime or report safeguarding concerns, or which are necessary for other important reasons of public interest.
Accountability and Compliance Programmes
What were the Proposals?
The Government proposed measures under new “privacy management programmes” for organisations, such as:
- appointing a suitable senior individual to be responsible for the programme (such person would replace the DPO role);
- ensuring organisations implement risk assessment tools which help assess, identify and mitigate risks (such tools would replace the current requirements for data protection impact assessments); and
- introducing a more flexible record keeping requirement than the current Record of Processing requirements under Art 30 UK GDPR.
The Government plans to proceed with the requirement for organisations to implement privacy management programmes. In particular, it intends to:
- Replace DPOs with “senior individuals”.
- Scrap the prescribed requirement for organisations to undertake data protection impact assessments under the UK GDPR, while still requiring organisations to have risk assessment tools in place.
- Remove the current record keeping requirements under Art 30 UK GDPR. Privacy management programmes will still require organisations to document the purposes of processing, but in a way which is more tailored to the organisation.
Data Breach Reporting
What were the Proposals?
The Government sought views on the impact of changing the threshold for reporting a personal data breach, so that only breaches which posed “material” risks to individuals would have to be reported.
The Government will not pursue legislative change but will continue to work with the ICO to explore the feasibility of clearer guidance for organisations on breach reporting.
Subject Access Requests
What were the Proposals?
The Government recognised that subject access requests (“SARs”) are a critical mechanism to empower data subjects to have control over their data. However, it also acknowledged that dealing with requests can be time-consuming and resource intensive for organisations.
The Government invited views on whether the “manifestly unfounded” threshold for refusing to respond to, or charging a reasonable fee for, a subject access request was too high. Responses were mixed.
The Government also consulted on whether to align the SAR regime with that for Freedom of Information responses. It sought views on whether to: (i) introduce a cost ceiling for dealing with requests; and (ii) apply a “vexatious request” ground for refusing to deal with a SAR.
The Government plans to change the current threshold for refusing or charging a reasonable fee for a SAR from “manifestly unfounded or excessive” to “vexatious or excessive”, which will bring it in line with the Freedom of Information regime. The Government does not intend to introduce a cost ceiling for SARs, or to re-introduce a nominal fee for processing them.
Cookies and Similar Technologies
What were the Proposals?
The Government sought views on whether there were any other occasions when cookies (and similar technologies such as tracking pixels) should be permitted to be placed on a person’s device without their explicit consent.
The Government intends to legislate to remove the need for websites to display cookie banners to UK residents. In the immediate term, the Government will permit cookies (and similar technologies) to be placed on a user’s device without explicit consent for a small number of other non-intrusive purposes. These changes will apply not only to websites but also to connected technology, including apps on smartphones, tablets, smart TVs and other connected devices.
In the future, the Government intends to move to an opt-out model of consent for cookies placed by websites.
Quick Summary of other Reforms
The Government also intends to do the following:
- Press ahead with reforms to the fines regime under the Privacy and Electronic Communications Regulations (“PECR”), to align it with the Data Protection Act 2018/UK GDPR regime. This means a considerable increase in the quantum of a potential fine under PECR from the current £500,000 limit. The intention is to increase the dissuasive effect of fines for PECR breaches.
- Clarify the legislation in relation to anonymisation to address when a living individual is identifiable and therefore within scope of UK data protection legislation.
- Retain Article 22 UK GDPR (which covers automated decision-making, including profiling), but with amendment. Article 22 will become a right to specific safeguards, rather than a general prohibition on solely automated decision-making. Reforms will enable the deployment of AI-powered automated decision-making, providing scope for innovation with appropriate safeguards in place.
- Give the ICO discretion to decide when and how to investigate complaints. The ICO will not be required to investigate vexatious complaints, or complaints which the complainant has not first tried to resolve with the controller. Controllers will be required to consider and respond to data protection complaints lodged with them.
These reforms “tweak” UK GDPR rather than reinventing the wheel. This will be a relief to those who had feared for the UK’s hard-won adequacy status for receiving frictionless transfers from the EU. However, others might regard the Government’s programme for reform as a missed opportunity to radically liberalise the UK’s data protection laws, post-Brexit.
The Government assumes that organisations which currently comply with UK GDPR will also be compliant under the reformed law. Larger and more established organisations will be pleased to hear that the time and money they have invested in UK GDPR compliance to date will not therefore have been wasted.
For smaller businesses and start-ups undertaking UK GDPR compliance programmes for the first time, the reforms allow for a more proportionate approach that may lower the “barriers to entry” to data protection compliance in some cases.
For his part, John Edwards (the recently installed UK Information Commissioner) broadly welcomed the scope of the reforms in his published statement on 16 June 2022.
Public Sector Enforcement – an open letter from the Information Commissioner
By Olivia Hill
The Information Commissioner’s Office (“ICO”) has outlined a revised approach to working with public authorities. The approach is set out in an open letter from the UK Information Commissioner John Edwards, where he states that the approach is just one initiative forming part of ICO25 – the ICO’s new three-year strategic vision – to empower organisations to innovate while using people’s data responsibly.
The Commissioner states that the ICO will continue to call out non-compliance and take robust enforcement action where necessary, but its primary focus will be on raising data protection standards and preventing harms from occurring in the first place. The regulator will work proactively with senior leaders across the public sector to address underlying issues that result in avoidable data breaches.
The Commissioner states: “Whether it’s due to not following a data protection by design approach on the development of new services, or something as simple as not having processes in place to stop sensitive information being sent to the wrong recipient – many of these issues are all too common”.
The ICO is now having discussions with its colleagues across the UK, as well as the wider public sector, to determine the most effective way to deliver these improvements.
Over the next two years, the ICO will be trialling an approach that will see a greater use of the Commissioner’s discretion to reduce the impact of fines on the public. The Commissioner’s open letter acknowledges that fines in the public sector come directly from the budget for the provision of services and the impact of a fine is often felt by victims of a breach, in the form of reduced budgets for vital services.
The public sector can therefore expect to see an increased use of the Commissioner’s wider powers including warnings, reprimands and enforcement notices, with fines reserved for the most serious breaches.
The ICO will continue to investigate data breaches in the same way and will follow up with organisations to ensure the required improvements are made. The ICO also aims to increase publicity around enforcement action, including publicising the value of the fine levied, to encourage wider learning.
In light of the revised approach, the ICO recently issued a reduced fine of £78,400 to Tavistock and Portman NHS Foundation Trust for disclosing 1,781 email addresses belonging to adult gender identity patients as a result of failing to use the ‘Bcc’ field.
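The Tavistock and Portman breach turned on a simple operational failure: bulk recipients were visible to each other instead of being blind-copied. As a purely illustrative sketch (the organisation names and email addresses below are hypothetical, and this is not drawn from the case itself), the safer pattern is to keep bulk recipients out of the message headers entirely and supply them only to the mail transport, so no recipient list can leak:

```python
from email.message import EmailMessage

def build_bulk_message(sender, subject, body, hidden_recipients):
    """Build a message whose bulk recipients never appear in its headers.

    The hidden recipients are returned separately so they can be passed to
    the SMTP envelope (e.g. smtplib's send_message(..., to_addrs=...)).
    They are deliberately NOT added to a To/Cc/Bcc header, so they cannot
    be disclosed to other recipients even if a header were mishandled.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # visible recipient is the sender's own address
    msg["Subject"] = subject
    msg.set_content(body)
    return msg, list(hidden_recipients)

# Hypothetical addresses for illustration only.
msg, envelope = build_bulk_message(
    "clinic@example.org",
    "Service update",
    "Please see the attached update.",
    ["patient1@example.com", "patient2@example.com"],
)
```

The design point is that disclosure becomes structurally impossible rather than dependent on a member of staff picking the right field each time.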
Another example of recent enforcement action is a reprimand issued to NHS Blood and Transplant Service, after it inadvertently released untested development code into a live system for matching transplant list patients with donated organs in August 2019.
Addressing these cases, the Commissioner says: “I want to ensure my office remains a pragmatic, proportionate and effective regulator focused on making a difference to people’s lives. That means taking a more proactive and targeted approach with public authorities to ensure they are looking after people’s information while supporting their communities”.
This is an early indication of the pragmatic and proportionate regulatory culture that the new Information Commissioner has promised. Public sector organisations will welcome the ICO’s proposals for a nuanced approach to exercising its enforcement powers, and will no doubt agree with the ICO’s view that prevention is better than cure in relation to breaches of data protection law.
To read the Commissioner’s open letter in full, please visit the ICO website.
Open Season? Sharing personal data outside of regulated sectors
We spend a lot of time talking about compliance when it comes to personal data. That usually means ensuring that personal data is processed in line with the Data Protection Act and the UK GDPR, but there are other regulations that play an important part – especially when it comes to online data.
Not all use of personal data falls within the Data Protection Act, UK GDPR and regulation by the ICO. In brief, the distinction is between personal data being used by: (a) organisations; and (b) private individuals. Only use by organisations is subject to the regulations set out above.
The UK GDPR specifically says it does not apply to individuals using personal data “in the course of a purely personal or household activity”. This means private individuals can pass on or share personal data without being concerned that they are breaching regulations, but only where it relates to “purely” personal or domestic activity. For example, to set up a neighbourhood contact group or exchange information about tradespeople.
That said, it is important to bear in mind that if personal data is shared irresponsibly, individuals may well have other ways to object. One case in the Netherlands led to a grandmother being fined for posting photos of her grandchildren on social media. She refused to take them down at the parents’ request, and the Dutch court held that the nature of the platform (and the grandmother’s own privacy settings) meant that she had effectively put the images into the public domain. The court judged that this took the processing outside the scope of purely personal and domestic activity.
While not all processing of personal data is regulated, it always pays to be careful handling personal data to avoid a major fallout. We regularly act in relation to claims for harassment, misuse of private information and defamation which centre around personal data being shared / publicised, even when there have been no breaches of the Data Protection Act and/or the UK GDPR.
Clearview AI Inc fined millions for facial recognition breaches
The ICO has fined Clearview AI Inc £7,552,800 for collecting facial recognition data and images from publicly available information on the internet without the consent of the data subjects.
What did Clearview do?
Clearview AI Inc provides software services which allow its customers to upload an image to an app and check for a match against the images in the company’s database. The app provides matches based on characteristics of the image, together with links to where each image came from. The database is said to include over 20 billion images.
Given the high number of social media users in the UK, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, data which has been gathered without their knowledge.
How were the breaches discovered?
A joint investigation was conducted in accordance with the Australian Privacy Act and the UK Data Protection Act 2018. It was also conducted under the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement.
The ICO found that Clearview AI Inc was in breach of the UK Data Protection Act 2018 by:
- “failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
- failing to have a lawful reason for collecting people’s information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR); and
- asking for additional personal information, including photos, when asked by members of the public if they are on their database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used”
What has happened so far:
In addition to the fine outlined above, the ICO has issued Clearview AI Inc with an enforcement notice ordering it to:
- stop obtaining and using personal data regarding UK residents which is publicly available on the internet; and
- delete all such personal data already held (in accordance with UK Data Protection Laws).
Clearview AI Inc no longer offers its services to UK organisations. However, the company has customers in other countries, so could still be using the personal data of UK residents.
What happens next?
Similar, and substantial, fines have already been issued to Clearview AI Inc in other countries (including Australia, Italy and France). Further fines may follow.
This decision provides an interesting example of cross-border data protection enforcement cooperation within the context of a complex case with international dimensions.
If you have any queries on the topics discussed, please contact our expert Data Protection team.
The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.
‘Doing the right thing’ is at the heart of Freeths. Find out more about our excellent client service and the strong set of values that guide the way we work.
Talk to us
Freeths are a leading national law firm with 13 offices across the UK. If you have a query about our services or just want to find out more, why not give us a call?
Contact: 03301 001 014