1. Guidance on ChatGPT and other AI systems
Artificial intelligence, and the impact of large language models (LLMs) such as ChatGPT, has regularly been in the news recently. Concerns have been raised about the extent to which AI systems process personal data, and about the consequences for fairness and transparency stemming from the use of such systems.
Mindful of the current focus on these LLMs, the ICO has issued guidance to developers and users on using ChatGPT and other generative artificial intelligence (AI), which is available here: Guidance on AI and data protection | ICO. Data protection law still applies when the personal information being processed comes from publicly accessible sources, and the ICO has stressed that organisations developing or using AI should consider their data protection obligations from the outset.
In the guidance, the ICO set out eight questions that organisations developing or using AI that processes personal data should ask themselves:
- What is your lawful basis for processing personal data?
- Are you a controller, joint controller or a processor?
- Have you prepared a Data Protection Impact Assessment (DPIA)?
- How will you ensure transparency?
- How will you mitigate security risks?
- How will you limit unnecessary processing?
- How will you comply with individual rights requests?
- Will you use generative AI to make solely automated decisions?
To assist organisations with answering these questions and mitigating any risks associated with their AI platforms, the ICO has also published an AI risk toolkit to support organisations in ensuring that the level of data protection risk associated with generative AI is minimised as far as possible. The toolkit is available here: AI and data protection risk toolkit | ICO.
2. New Subject Access Guidance For Employers
The ICO has published a new Q&A for employers who receive Subject Access Requests (SARs), following the 15,000 complaints it received last year in relation to the handling of SARs. The ICO confirms “it’s important to not get caught out” and that it will take appropriate action where necessary, as evidenced by the recent enforcement action discussed below.
The new guidance can be found here: SARs Q&A for employers | ICO. We will be providing a separate update on this new guidance.
ICO Enforcement Action
3. Further action against public bodies for lack of FOI and SAR compliance
The ICO has been cracking down on public authorities for failing to respond to overdue requests made under the Freedom of Information Act 2000 (FOI requests) and SARs.
Freedom of Information Requests
In the last few months, the ICO has issued enforcement notices to Shropshire Council and Lewisham Council for failing to respond to hundreds of FOI requests.
Shropshire Council confirmed it had a weak FOI request handling system, as individual service areas are responsible for recording and collating their own FOI requests. At the end of April it still had 143 unanswered FOI requests, the oldest dating back to April 2021 and the remainder dating back to January 2022. The enforcement notice requires the Council to respond to all outstanding requests no later than six months from the date of the notice, and to publish its action plan to mitigate any future delays to FOI requests.
At the end of 2022, the London Borough of Lewisham had a total of 338 overdue requests for information, 221 of which were over 12 months old. The Council had been prioritising new requests to improve its compliance with the statutory time limit of 20 working days, but the ICO found it had no concrete plan to address its extensive backlog. The enforcement notice likewise requires the Council to respond to all outstanding requests no later than six months from the date of the notice, and to devise and publish an action plan to mitigate any future delays in responding to FOI requests. Non-compliance with an enforcement notice may lead to the Council being found in contempt of court.
Subject Access Requests
Plymouth City Council and Norfolk County Council have both recently been reprimanded by the ICO for repeatedly failing to respond to SARs within the statutory deadline of one month (extendable by up to two further months for complex requests). The ICO has instructed them to take steps to ensure that the public receive their personal information within the statutory period.
Following enquiries, the ICO found that Norfolk County Council had only responded to 51% of SARs on time between April 2021 and April 2022. At Plymouth City Council, 18 requests took up to two years to complete and a further 18 requests took between three months and one year. The highest compliance rate for SARs completed on time was 77% in 2022 to 2023.
The increasing use of enforcement notices and reprimands against public authorities demonstrates that, notwithstanding the ICO's decision not to impose monetary penalties on public bodies, any authority found to be failing in its response times will be subject to scrutiny and oversight with significant reputational consequences. The ICO said the public bodies need to ensure they have adequate staff resources in place to respond to SARs on time, and must continue to implement effective measures to address the outstanding requests.
4. Social media and enforcement – the fines TikTok upwards
At the start of April, the ICO announced a £12.7 million fine for TikTok regarding its use of children’s data; on 12 May, the Irish data protection authority (the ‘Data Protection Commission’ or ‘DPC’) fined Meta, Facebook's parent company, €1.2 billion (see Section 8 below). Both enforcement actions represent significant steps against big tech companies, and serve as salient reminders of how even large-scale, accepted processing can go wrong.
The ICO’s investigation into TikTok related to the period of May 2018 to July 2020 (i.e. dating back to the introduction of the GDPR) and specifically three aspects of TikTok’s activities:
- provision of services to UK children under 13 without parental consent/authorisation;
- failure to provide information to platform users about how their data was being used in a way that the users (and particularly under-13 users) could understand; and
- failure to ensure that the personal data processing was lawful, fair and transparent.
One of the core issues behind TikTok’s non-compliance was that while it had a system to screen out users under 13, that system was not fit for purpose during the May 2018 to July 2020 period; senior employees had apparently been notified of this issue, and were aware that those users were vulnerable to tracking/profiling or being presented with harmful content.
The £12.7 million fine represents a significant step by the ICO, reflecting the number of affected users (reported to be around 1.3 million underage users), the period of the breach (just over two years), and the fact that senior employees were aware of the issue.
However, the ICO was originally minded to levy a fine of £27 million; after taking into account representations from TikTok, the ICO dropped part of its investigation relating to special category data, resulting in the now lower fine.
5. Reprimand for Ministry of Justice after confidential personal information left in prison holding area
The ICO has issued a formal reprimand to the Ministry of Justice (MoJ) after confidential waste documents were left in an unsecured prison holding area. Prisoners and staff had access to the 14 bags of confidential documents, which included medical and security vetting details, for a period of 18 days. At least 44 people had access to the information, which had remained on site because a contracted confidential-waste removal company had not collected it as scheduled. The ICO investigation uncovered a lack of robust policies at the prison, including a general lack of staff understanding of the risks to personal data and the need to report data breaches.
The reprimand details a number of required or recommended actions including:
- a thorough review of all data protection policies, procedures and guidance to ensure they are adequate and up to date with legislation; and
- the creation of a separate data breach reporting policy for staff.
The ICO also recommends that the prison and the Trust put in place a data processing agreement to ensure they are clear as to each party’s data controller responsibilities. The MoJ is also required to provide the ICO with a progress report by the end of October 2023. Details of the reprimand are available here: Ministry of Justice | ICO.
6. Capita Cyber Attack
The outsourcing firm Capita recently confirmed that it was subject to a cyber-attack in which some customer and staff personal data was accessed and exfiltrated. The firm has announced that it expects the costs associated with responding to the breach to be in the region of £15 million to £20 million. The ICO has this week released a statement recommending that organisations who use Capita’s services should “check their own position” regarding the incident, to assess whether any personal data held on their behalf by Capita has been affected. The ICO’s statement is available here: ICO statement on Capita incident | ICO
UK Law Update
7. Update on DPDI Bill
The newly created Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill on 8 March 2023. The original Bill was paused in September 2022 and was re-introduced with minor amendments. It is currently going through the Committee stage within Parliament, with the Committee expected to report on the Bill by 13 June 2023. As with the original draft, the Bill is an amending document to existing data protection legislation – the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations 2003.
The Bill has undergone minimal change from the first version published in 2022, and continues to adopt its main proposals, including:
- the revised definition of personal data;
- the list of specific activities or interests which may be considered to be the controller’s legitimate interest to process personal data;
- amendment of the requirement to maintain a Record of Processing Activities, so that a record would only need to be kept where the processing is likely to result in a high risk to the rights and freedoms of individuals;
- removal of the need for companies outside the UK to have a UK representative; and
- the revised adequacy assessment process, incorporating a threshold for data protection rights that is “not materially lower” than provided for under UK law.
The proposals also maintain the change to the office of the Information Commissioner from a corporation sole to a body corporate known as the Information Commission, including the requirement for the Commission to have regard to a Statement of Strategic Priorities published by the Secretary of State.
It has also been agreed that if proceedings on the Bill have not been completed in this Parliament, they will be carried over to the next Session. We will keep readers updated on the Bill’s progress in subsequent editions of Data Matters.
EU Law Update
8. Meta DPC enforcement decision
Following on from the TikTok case above, the Irish DPC has often found itself at the centre of the continuing battles between US social media companies and EU regulators. The DPC’s investigation into Meta (Facebook’s parent company) has been ongoing for the past three years, since the “Schrems II” decision of the Court of Justice of the European Union (CJEU) concerning international data transfers, and has attracted substantial international attention.

Broadly, the investigation revolved around Facebook's use of the European Union’s Standard Contractual Clauses (SCCs), a transfer mechanism commonly used by organisations to transfer personal data to the US. The Schrems II decision, handed down in July 2020, confirmed that the US did not offer an essentially equivalent standard of data protection for foreign persons’ data transferred to the US, and that reliance on SCCs without supplemental measures protecting the data to the GDPR standard could require organisations to suspend or cease data transfers to the US.

Facebook continued to transfer data to the US in reliance on the SCCs and supplemental measures; however, the DPC considered this insufficient to meet the standard of “essential equivalence”. The DPC, as designated supervisory authority for Facebook’s activities in Europe, prepared an order requiring Facebook to suspend its affected data transfers, but did not otherwise apply a monetary penalty. This was received with concern by other European data protection authorities, and in January 2023 the matter was ultimately considered by the European Data Protection Board (‘EDPB’) under the GDPR’s consistency mechanism.
The EDPB’s decision held that the DPC was required to issue a financial penalty, leading to the very recent decision by the DPC to levy a record fine of €1.2 billion. This fine, in addition to the DPC’s requirements that Facebook (i) suspend any such future processing of personal data within five months, and (ii) delete all such data still held in the US within six months, is likely to have a substantive impact on Facebook’s EU activities, though it will no doubt be appealed by Meta.
More importantly, however, the ruling raises further questions about whether and how organisations can rely on SCCs to transfer data to the US – which most will be doing in some way through US-based cloud processors and service providers. This enforcement action also highlights the urgency of the need for an adequacy decision based on the planned new mechanism between the EU and the US to ease transatlantic personal data transfers – the EU-US Data Privacy Framework (see Section 11 below).
The DPC’s and EDPB’s decisions are available here: Binding Decision 1/2023 on the dispute submitted by the Irish SA on data transfers by Meta Platforms Ireland Limited for its Facebook service (Art. 65 GDPR) | European Data Protection Board (europa.eu)
9. Anonymisation and Pseudonymisation – when to throw away the key
Anonymisation and pseudonymisation are key concepts in data protection, and lie at the very heart of how personal data should be handled in order to comply with the UK GDPR. A recent EU case suggests the landscape around these concepts may be about to undergo a fundamental shift, with significant consequences for transfers of data between organisations.
On 26 April, the General Court of the European Union issued an interesting decision in Case T-557/20, SRB v EDPS. The Court held that pseudonymised data transferred to a recipient will not be considered personal data if the recipient does not also hold the key, in contrast to the previously accepted position. Note in particular that the ruling was against the European Data Protection Supervisor (EDPS), one of the EU’s supervisory bodies on data protection matters. The decision may yet be appealed to the CJEU.
The consequence of the approach in SRB v EDPS would be that: (i) it is necessary to consider the data recipient’s perspective when assessing whether data is personal data; (ii) pseudonymised data transmitted to a recipient will be anonymous data if the recipient does not have the means to re-identify the data subjects (i.e. does not have the “key”); and (iii) the fact that the transmitting party retains the means to re-identify the data subjects is irrelevant, and does not mean the transmitted data is automatically personal data in the recipient's hands.
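To make the "key" concept concrete, the following is a minimal illustrative sketch of pseudonymisation in Python (the field names and sample data are invented for illustration, not drawn from the case): the sender replaces direct identifiers with random tokens and retains the mapping, while the recipient receives only the tokenised records.

```python
import secrets

def pseudonymise(records, id_field="name"):
    """Replace direct identifiers with random tokens.

    Returns (pseudonymised_records, key), where `key` maps each
    token back to the original identifier. The sender retains the
    key; the recipient receives only the tokenised records.
    """
    key = {}
    out = []
    for record in records:
        token = secrets.token_hex(8)
        key[token] = record[id_field]
        pseudonymised = dict(record)  # copy so the original is untouched
        pseudonymised[id_field] = token
        out.append(pseudonymised)
    return out, key

# The sender holds both the data and the key ...
data = [{"name": "Alice", "diagnosis": "asthma"}]
shared, key = pseudonymise(data)

# ... but transmits only `shared`. Without `key`, the recipient has no
# means of mapping the token back to "Alice" - on the SRB v EDPS
# approach, the data is anonymous in that recipient's hands.
token = shared[0]["name"]
assert key[token] == "Alice"  # the sender can still re-identify
assert token != "Alice"       # the recipient sees only the token
```

Whether the recipient could obtain the key (or otherwise re-identify individuals) is, of course, a factual question in each case; the sketch simply shows why the same dataset can be personal data for the sender yet arguably anonymous for the recipient.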
The position in this case is reflected in the ICO’s more recent draft guidance on anonymisation, which appears to closely mirror the approach arrived at by the General Court. Practically, and assuming the ICO guidance is approved, UK organisations that receive pseudonymised personal data may in the near future be able to treat that data as anonymised, to the extent that they are unable to obtain the additional information (i.e. the “key”) that would allow them to re-identify someone, and so fall outside the data protection compliance regime under the DPA 2018 / UK GDPR. However, the position in SRB v EDPS has yet to be tested in the UK courts.
10. EDPB coordinated action for 2023 on the role of DPOs
In March 2023 the European Data Protection Board (EDPB) launched its 2023 “coordinated enforcement action” into the designation and position of data protection officers (DPOs) across Europe. The coordinated action is an initiative instigated by the EDPB on behalf of the European data protection supervisory authorities to examine the role of the DPO, and how effectively DPOs are able to carry out their function within organisations of protecting, and advising on the use of, individuals’ personal data. The European DPAs will issue questionnaires and take soundings from DPOs across Europe as to how effectively they are able to carry out their roles, and will also check the level of organisational compliance with the EU GDPR. The EDPB will collate the findings and publish its report towards the end of 2023.
The EDPB’s focus on the role of the DPO as the second initiative undertaken as part of its coordinated enforcement framework highlights the priority which the Board places on the role, and on the need to ensure its profile and independence within organisations. Comment from the EDPB on the UK's proposal to replace the DPO role with a “Senior Responsible Individual” has been somewhat scarce, although the initiative is an indication that the UK’s proposal could factor highly in the EU’s review of the UK’s adequacy decision in 2025.
International Data Transfers
11. European Parliament resolution of 11 May 2023 on the adequacy of the protection afforded by the EU-US Data Privacy Framework
MEPs recently voted to adopt a non-binding opinion rejecting the EU-US Data Privacy Framework (DPF) proposed by the European Commission and the US authorities. In a resolution adopted on 11 May, MEPs considered that, whilst the DPF was an improvement on previous frameworks, it was not sufficiently far-reaching to justify the granting of an adequacy decision for personal data transfers to the US. MEPs noted that the proposed framework still allows bulk collection of personal data in certain cases, does not make bulk data collection subject to prior independent authorisation, and does not provide clear rules on data retention. MEPs considered that the framework needed to be able to withstand future legal challenges, and were sceptical that the proposed changes went far enough to avoid being struck down by the CJEU. The members noted that, whilst the system for data subjects to obtain legal redress had been improved, the process was still insufficiently transparent and lacked real independence from the US executive.
The concerns raised in the Parliament echo the issues highlighted by the European Data Protection Board (EDPB) in their non-binding opinion issued in February 2023. It’s important to note that the Parliament’s resolution is non-binding on the European Commission, but it is still a significant intervention in the overall adequacy approval process. The Parliament’s resolution on the Data Privacy Framework is available here: Adequacy of the protection afforded by the EU-U.S. Data Privacy Framework - Thursday, 11 May 2023 (europa.eu)
This article was co-written by Laura Cook, Trainee Solicitor.
If you have any questions about the issues raised in this update, please contact a member of our Information Law team.