30/10/2025

Welcome to the winter edition of Data Matters. The big news is, of course, the passing of the Data (Use and Access) Act 2025 (DUAA), which will bring in changes to data protection law over the next 18 months. We have outlined some of the key developments below, as well as signposting you to the regulator’s helpful update page to keep on top of these changes.

We also look at the latest enforcement action from the ICO – understanding what is likely to elicit such action can help you to avoid similar issues, and the focus lately seems to be on retention.

As always, if you have any questions about anything raised here – or indeed, any other information law issue – please get in touch with one of the team directly.

Law and Policy Update

Data (Use and Access) Act 2025

The Data (Use and Access) Act 2025 (DUAA) brings changes to the UK GDPR which are likely to be relevant in some way to all organisations. Various provisions will come into force over the coming months with the intention of simplifying the UK’s data protection regime, such as introducing new smart data schemes, facilitating digital ID verification services, supporting advancements in technology and delivering other targeted efficiencies. The structure and extent of the Information Commissioner’s Office’s powers are also altered.

AI was clearly in focus during the passage of the DUA Bill: following debate, the DUAA requires the government to produce a report on copyright and AI within nine months of the Act coming into force. Much of the DUAA will be implemented through future regulations, so there should be some flexibility as AI develops.

Significant forthcoming changes which are due to commence in December include:

Recognised legitimate interests

This new lawful basis of ‘recognised legitimate interests’ consists of a list of activities set out in Annex 1 of the Act. These include, for example, processing for the purposes of direct marketing, certain intra-group data transfers, responding to emergencies and detecting or preventing crime. Where relevant, these exempt controllers of personal data from undertaking the usual balancing exercise required as part of a legitimate interests assessment, although of course the processing must still be “necessary”, controllers must remain accountable, and they must continue to comply with all UK GDPR principles.

International data transfers

Transfers were previously subject to an adequacy test; organisations now need to consider whether a proposed transfer is a “transfer approved by regulations”. The DUAA moves away from asking whether the recipient country provides rights and protections for individuals and their data which are “essentially equivalent” to those within the UK, towards safeguards which the controller reasonably considers provide a standard of protection “not materially lower” than that provided within the UK. Assessing the safeguards of a recipient country could still pose problems, so organisations with global reach may find it advisable to seek further legal advice. In any case, all transfer decisions should be considered and carefully documented.

Purpose limitation

The principle of purpose limitation has been enhanced with additions to the existing provisions, which operate slightly differently where consent has been obtained. Processing for a new purpose is likely to be compliant where:

  • The organisation has consent
  • The purpose of processing is for archiving / research purposes in the public interest
  • The processing is carried out to comply with a principle
  • The processing meets a condition in Annex 2 (e.g. archiving, vital interests, safeguarding)
  • The processing is necessary for certain public interest reasons set out in article 23(1) of the UK GDPR.

For further information on the DUAA, please read our recent article 'Data (Use and Access) Act 2025: key changes for local authorities'. 

Please also refer to the ICO’s Data (Use and Access) Act 2025 pages for further guidance.

Farley v Paymaster (1836) Ltd (trading as Equiniti) [2025] EWCA Civ 1117 (22 August 2025)

The Court of Appeal has allowed an appeal in a breach of privacy group action brought by police officers, clarifying the rules around certain compensation claims by individuals. It was held that a claimant can recover compensation where their fear of the consequences of a breach is well-founded.

A number of police officers had brought multiple claims, including for data misuse and breach of privacy, and claimed compensation for third party misuse of their personal data after annual pension statements of 432 individuals had been posted to the wrong address. Many of the claims had been struck out because the claimants could not provide sufficient evidence that their data had been read by anyone else.

However, the Court of Appeal held that the judge was wrong to strike out the claims. It was incorrect to deem that there had not been a breach of the UK GDPR simply because third parties had not accessed the police officers’ personal data. 

Raine v JD Wetherspoon plc [2025] EWHC 1593 (KB)

The High Court has dismissed an appeal submitted by JD Wetherspoon and in doing so has clarified the law on the misuse of private information and data protection in circumstances where personal data has been disclosed orally. It held that oral disclosure of personal data, where such data was held in a structured filing system, constitutes ‘processing’ under the UK GDPR.

In this case, an employee’s mother’s telephone number was obtained in contravention of JD Wetherspoon’s internal policies and used for harassment purposes. The telephone number had been kept in a privately marked file within a locked filing cabinet; it was then accessed and actively disclosed. The fact that the number had been filed, then accessed and further communicated, fell within the definition of ‘processing’, and the Court overturned the Recorder’s rejection of this element of the claim, as this was more than simply a failure by JD Wetherspoon to keep the personal data secure.

ICO Update – Enforcement Action / Reprimands

Fine issued for refusing to comply with a SAR

The director of a care home has been issued with a fine of £1,100 and ordered to pay costs of £5,440 after being found guilty of refusing to respond to a subject access request for a resident’s personal information. The individual was found to have blocked, erased, or concealed records to prevent information being disclosed. 

Organisations should be aware that they have a statutory obligation to provide information requested under a SAR, and that it is an offence under section 173 of the Data Protection Act 2018 to alter, deface, block, erase, destroy or conceal information with the intention of preventing disclosure.

Charity fined £18k for destroying personal records

A Scottish adoption charity has been issued with a penalty notice by the ICO for destroying approximately 4,800 personal records, an estimated 10% of which were not replaceable. In 2021 an employee was tasked with destroying manual records to free up storage space. The organisation had a limited understanding of the UK GDPR and had neither implemented any data protection policies or procedures nor provided staff with any data protection training; such training was not provided until March 2023.

Although the board had approved the deletion of records, it was not aware of the sensitive and irreplaceable nature of some of the material being deleted. As a result, the ICO found that informed board approval had not been obtained. The deletion constituted a data breach which was notifiable to the ICO, yet the ICO was not notified until 2023, some two years after the deletion occurred.

This serves as a timely reminder that organisations should have in place robust policies and procedures for the retention of records and data breach reporting, and that all employees should receive appropriate data protection training. Where records are being deleted, particularly where they are not replaceable, there should be a clear record of what has been deleted and the decision-making process that led to that deletion, as the sketch below illustrates.
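
By way of illustration only, the following is a minimal sketch of the kind of deletion record that reasoning points towards. The class and field names are our own assumptions, not a schema prescribed by the ICO:

```python
# Hypothetical sketch of a deletion audit record; field names are illustrative
# assumptions, not an ICO-prescribed schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DeletionRecord:
    description: str      # what was deleted
    justification: str    # why deletion was considered appropriate
    approved_by: str      # who signed off, aware of the material's sensitivity
    replaceable: bool     # whether the material could be recreated if needed
    deleted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example entry (hypothetical data):
log = [
    DeletionRecord(
        description="Paper case files, boxes 12-18",
        justification="Retention period expired under retention policy RP-04",
        approved_by="Board minute 2021/07, item 5",
        replaceable=False,
    )
]
```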

Police force reprimanded for accidentally deleting body-worn video footage

South Yorkshire Police force has been reprimanded for failing to put in place appropriate technical and organisational measures to ensure body-worn video (BWV) was not accidentally deleted. 

The force outsourced the supply and management of BWV to a third party. BWV was stored on a secure server, but after an upgrade the system struggled to process the data, so a workaround was put in place to store the BWV locally. In July 2023 the third party provider transferred the locally stored data back to the server. It was later found that, during this process, over 96,000 pieces of BWV had been deleted. It is unclear how the deletion occurred, but the lost data related to 126 criminal cases, of which three were impacted by the loss.

This case highlights the risks associated with not adequately defining third party roles and responsibilities for processing personal information, not determining security implications and control requirements prior to permitting third party access to data, and the importance of ensuring appropriate storage and backup systems are in place.

23andMe fined £2.31m for failing to protect users’ data

Genetic testing company 23andMe was fined £2.31m for failing to protect users’ data following a large scale cyber attack on the company in 2023. The attackers had used the technique of credential stuffing – where credentials obtained from earlier breaches are reused in a new attack – to access the personal data, including special category data, of over 155,000 UK users. Ordinarily the fine would have been much higher, but the ICO took into account the company’s deteriorating financial position when issuing the penalty notice.

This case highlights the need to have in place appropriate cyber security measures to prevent attacks, including simple measures such as regular scanning for vulnerabilities, ensuring users have strong passwords which are changed periodically and having multi-factor authentication in place, amongst a range of other measures.
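
As an illustration of one such measure (our own sketch, not drawn from the ICO’s findings), a sign-up or password-change flow can reject passwords already known to appear in breach corpora – the raw material of credential stuffing – for example via the Pwned Passwords range API, which only ever sees the first five characters of the password’s SHA-1 hash:

```python
# A minimal sketch (our own illustration, not from the ICO's findings) of one
# credential-stuffing mitigation: rejecting passwords already seen in breaches,
# via the Pwned Passwords k-anonymity range API. Only the first five characters
# of the password's SHA-1 hash ever leave the organisation.
import hashlib
import requests  # pip install requests

def password_is_breached(password: str) -> bool:
    """Return True if the password appears in the Pwned Passwords dataset."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=5)
    resp.raise_for_status()
    # Each response line is "HASH_SUFFIX:COUNT"; a matching suffix means the
    # full password hash is in the breach corpus.
    return any(line.split(":")[0] == suffix for line in resp.text.splitlines())

if __name__ == "__main__":
    print(password_is_breached("password123"))  # True - widely breached
```

Checks like this complement, rather than replace, multi-factor authentication and rate limiting on login endpoints.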

ICO continues crackdown on public authorities with FOI backlogs

Liverpool City Council had a compliance rate of 56% for answering Freedom of Information Act requests within the statutory 20 working days during the first half of 2025. As of August 2025, it had a backlog of 75 overdue requests. As a result it has been served with an enforcement notice requiring the Council to publish an action plan within 30 calendar days and to provide responses to all requests that are currently more than 20 working days old by 8 January 2026.

Following proactive sampling by the ICO of NHS Trusts’ FOI compliance, it was found that Nottingham University Hospitals NHS Trust had an average compliance rate of just 17% for answering requests within 20 working days during quarter four of 2024/25. The subsequent enforcement notice required the Trust to put in place an action plan and to clear the backlog of 536 open requests by 31 December 2025.

University College London Hospital NHS Foundation Trust had an average compliance rate of 83% for answering requests within 20 working days throughout 2024, but this figure had declined in the first quarter of 2025, and there was a significant backlog of older requests dating back to 2020. The enforcement notice requires UCLH to provide responses to all requests that are currently more than 20 working days old by 1 December 2025.

These cases highlight the ICO’s ongoing commitment to making examples of organisations that are unable to meet the statutory deadline for responding to FOI requests. Public authorities should ensure they have clear processes in place to gather the required information from the relevant departments, with all staff trained to recognise FOI requests and to understand the importance of responding in a timely manner.

Other ICO News

The ICO unveils AI and biometrics strategy

The UK Information Commissioner's Office has launched a new AI and biometrics strategy aimed at ensuring that organisations develop and deploy emerging technologies lawfully and responsibly. The published document is practical in nature and sets out where the ICO sees the greatest areas of risk and its regulatory priorities. This should help organisations to identify where their governance, mitigations and compliance activities should focus.

The strategy outlines the ICO's focus on scrutinising applications such as automated decision-making in recruitment, facial recognition technology in policing, and the use of personal data in training generative AI models. In particular, it targets GDPR compliance in the areas of transparency and explainability, bias and discrimination, and rights and redress. A key action identified in the strategy is to develop a statutory code of practice on AI and automated decision-making.

ICO publishes final guidance on encryption

On 2 September 2025 the ICO published its final guidance on encryption. This detailed guidance provides an in-depth look at encryption and explains how it can be used effectively to comply with UK GDPR requirements. It sets out a number of scenarios in which encryption can be used to protect personal information, together with the residual risks, to demonstrate practical uses of the technology. The guidance emphasises the need for organisations to maintain high technical standards and to be in a position to justify their security choices. Security practices should be reviewed in light of the guidance to ensure compliance with the expectations set out within it.
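
For a concrete and deliberately minimal sketch of the sort of control the guidance discusses – our own illustration, not an example taken from the guidance itself – authenticated symmetric encryption of a record at rest might look like this, using the widely used Python cryptography package:

```python
# Minimal illustrative sketch of encrypting personal data at rest using the
# `cryptography` package (pip install cryptography); not from the ICO guidance.
from cryptography.fernet import Fernet

# In production the key would come from a key management service or HSM, and
# would never be hard-coded or stored alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"Jane Doe, DOB 01/01/1990"   # hypothetical personal data
token = fernet.encrypt(record)          # authenticated encryption (AES-CBC + HMAC)
assert fernet.decrypt(token) == record  # round-trips only with the right key
```

The encryption call itself is the easy part; key management (generation, storage, rotation and access) is typically where organisations need to be able to justify their choices.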

ICO publishes draft guidance on Internet of Things products

The internet of things (IoT) refers to the network of interrelated devices and objects that connect and exchange data, allowing everyday objects to be connected to the internet to improve efficiency and convenience.

IoT products usually process personal data given they are designed for human interaction and are often linked to a user account. The ICO’s draft guidance explains how data protection law and the Privacy and Electronic Communications Regulations 2003 (as amended) apply to this processing. 

The guidance will be relevant to any organisation responsible for processing by IoT products, such as manufacturers, operating system developers, and mobile, web app and software developers.

The consultation on this guidance has now closed. We will report further once the final guidance is released.

Consultation on the ICO’s updated guidance for new exceptions to cookie consents

The ICO has launched a consultation on its draft updated storage and access technologies guidance, adding a new chapter called ‘What are the exceptions?’. This chapter updates the guidance following the introduction of the Data (Use and Access) Act 2025 to reflect the new exceptions under the Privacy and Electronic Communications Regulations (PECR) to the requirement for consent for certain low-risk cookies, such as those used for statistical analysis and website improvement. This follow-up consultation closed on 26 September 2025, and the revised guidance is now being finalised.

ICO publishes findings from audit of the use of facial recognition technology at South Wales Police and Gwent Police

As part of the ICO’s strategy to step up its supervision of AI and biometric technologies, it has been conducting a project on facial recognition technology in policing. In connection with this, the ICO audited how South Wales Police and Gwent Police govern their use of facial recognition technology.

The audit found a high level of assurance that processes and procedures are in place and delivering data protection compliance. The executive summary of the audit commented in particular on the following key areas of assurance:

  • a privacy management framework, embedded and endorsed by senior management, which supports the use of live facial recognition (LFR) technology, together with comprehensive mapping of data flows for the use of LFR;
  • the ability to demonstrate the lawful provenance of images used to generate biometric templates for LFR watchlists;
  • a DPIA procedure which involves consultation with internal and external stakeholders and which the DPO has oversight responsibility for; and
  • only collecting data that is adequate, relevant and necessary for its purpose. 

Key areas identified where further measures were required to comply with data protection law included accurately documenting retention periods, maintaining version control, reviewing policies at appropriate intervals and updating staff on changes.

Findings from the audit (although context specific) may provide useful insight to other forces implementing similar technology.

ICO publishes new guidance – disclosing documents to the public securely: hidden personal information and how to avoid an accidental breach

This guidance follows several serious data breaches linked to human error, where sensitive personal data was accidentally disclosed because of insufficient checks. The guidance includes helpful checklists, videos and recommendations for identifying and removing hidden personal information before documents are shared.

This guidance should be particularly helpful to those organisations frequently handling subject access requests or freedom of information requests, but the practical tips could also be used by organisations to help prevent a breach in any scenario where documents are being disclosed or shared. 

ICO reports on FOI compliance across NHS Trusts in England

The ICO has set out its findings following its review of how NHS trusts are performing in relation to Freedom of Information Act (FOIA) compliance. The ICO focused on NHS trusts because of the volume of complaints received in this sector.

In summary, the ICO concluded that most trusts are performing well; however, extensive backlogs were identified at a small number of trusts, which were then required to take immediate steps to improve their FOIA compliance.

The ICO highlighted a number of learning points, encouraging all trusts to:

  • ‘Put thorough processes in place for FOI request handling across service areas with a minimal impact on frontline staff where possible.
  • Establish network of contacts and dedicated FOI champions considering frequent movement of staff in healthcare environment.
  • Raise awareness to improve understanding of how FOI operates in relation to requests from companies seeking commercial information.
  • Publish commonly requested information such as that relating to agency staff, recruitment, prescribing, waiting lists and in-sourcing/out-sourcing.
  • Make sure senior leaders have regular sight of their organisation’s performance against its statutory duties so they can take early action when needed.’

The ICO makes clear in its report its intention to continue to focus on addressing systemic issues within public sector bodies.

ICO consults on draft changes to how it handles data protection complaints and issues complaints guidance for organisations

The ICO is currently consulting on changes to how it assesses and determines the extent to which it is appropriate to investigate each complaint it receives. These proposed changes follow the significant increase in data protection complaints over recent years and the finite resources the ICO has to respond to each of them.

The ICO is developing a framework which sets criteria to consider when determining whether a complaint requires further investigation, the aim being to focus its resources on the most significant issues as a priority.  

A triaging process will be introduced and thresholds will be set to trigger deeper reviews of an organisation’s practices, for example after a defined number of complaints have been received about an organisation. The consultation will close on Friday 31 October 2025. 

Linked to this, and to the intention to reduce the ICO’s workload, is the ICO’s draft Complaints Guidance for Organisations, which is under consultation until 19 October 2025. This guidance has been created to help organisations comply with the new right for data subjects to complain directly to controllers, introduced under the Data (Use and Access) Act 2025. The guidance sets out the legislative requirements, including:

  • giving people a way of making data protection complaints directly to the data controller;
  • acknowledging receipt of complaints within 30 days;
  • taking appropriate steps to respond to complaints, without undue delay, including making appropriate enquiries and keeping people informed; and
  • telling people the outcome of their complaint without undue delay.

The guidance also sets out good practice recommendations with suggested actions organisations may wish to take to assist with their compliance including:

  • putting in place a documented complaints procedure which is published or easily accessible and which clearly sets out the process and timelines;
  • implementing a complaint form and allowing complaints to be made by phone, live chat or an online portal;
  • providing appropriate training to staff to ensure they recognise and handle data protection complaints compliantly;
  • having in place effective record keeping to track actions and support investigations;
  • creating a system for asking for supporting information or evidence from complainants;
  • letting complainants know they have the right to complain to the ICO if they are unhappy with the outcome of their complaint;
  • reviewing lessons learned from complaints. 

The right to complain directly to the data controller becomes applicable in June 2026. In the meantime, organisations should take steps to prepare for a potentially increased number of complaints from data subjects and review and amend existing complaints procedures or draw up a new separate procedure setting out how data protection complaints will be handled in accordance with the statutory requirements. 
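
To make the 30-day acknowledgement requirement concrete, here is a minimal, purely illustrative sketch of tracking that deadline; the class and field names are our own assumptions rather than anything prescribed in the guidance:

```python
# Purely illustrative sketch; the class and field names are hypothetical,
# not drawn from the ICO's draft guidance.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataProtectionComplaint:
    received: date
    complainant: str
    summary: str
    acknowledged: bool = False

    @property
    def acknowledgement_due(self) -> date:
        # Receipt must be acknowledged within 30 days of the complaint arriving.
        return self.received + timedelta(days=30)

complaint = DataProtectionComplaint(
    received=date(2026, 6, 15),
    complainant="J. Smith",
    summary="Disputed consent for marketing emails",
)
print(complaint.acknowledgement_due)  # 2026-07-15
```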

Our team would be happy to assist with reviewing and updating existing complaints procedures or creating a new procedure which complies with the legislative requirements. Please contact us for further details of the cost-effective options we can offer in this area, ranging from template procedures that can be customised to more bespoke and in-depth policy reviews and drafting.

Information Commissioner’s Office had authority to fine TikTok

On 4 July 2025, the First-tier Tribunal (FTT) ruled in the ICO’s favour in relation to its action against TikTok. The ICO had imposed a £12.7 million fine on TikTok for the misuse of data belonging to children aged under 13, despite TikTok arguing that the processing related to journalistic purposes and was therefore exempt from the usual provisions. The FTT rejected TikTok’s argument and ordered a trial to determine the issue.

Other

GCHQ ex-intern jailed for data theft

On 13 June 2025, a former GCHQ intern was jailed over a massive data theft, having downloaded 61.8 GB of classified spy tools and the identities of 17 staff onto his personal computer. The breach prompted the scrapping of key intelligence systems and damaged national security operations. The intern knowingly breached protocol and mishandled sensitive data; he said he had no intention of sharing the data with anyone else, and had removed it out of curiosity and a desire to complete work from home.

This case is an example of how threats to security can come from within an organisation where staff are not sufficiently trained or monitored. It underlines the importance of undertaking appropriate testing and monitoring of relevant systems and ensuring that staff understand the risks and potential real-life consequences of mishandling data.  
