23/04/2026
A very warm welcome to the latest edition of Data Matters, our newsletter covering the latest developments in information law and privacy.
Since the last newsletter, most of the changes from the Data (Use and Access) Act 2025 have come into force, and the ICO continues to focus on both the protection of children’s data and AI, with relevant guidance now out for consultation. Enforcement action for personal data breaches continues, with a focus on children and direct marketing, and the ICO continues to issue practice recommendations for Freedom of Information Act breaches, with a focus on statistics this quarter.
As always, if you have any questions or ideas on what we could include in the next edition, please contact your usual BB contact, or one of the team.
Law and Policy Update
Cyber Security and Resilience (Network and Information Systems) Bill published
The Cyber Security and Resilience (Network and Information Systems) Bill (the Bill), published on 12 November 2025, proposes amendments to the UK’s Network and Information Systems (NIS) Regulations 2018. The aim of the Bill is to increase the robustness of the current NIS regime by expanding the types of entities falling within its scope and increasing powers of enforcement bodies.
The surge in cyber-attacks impacting critical public and private sector services is frequently caused by successful attacks on the third-party service providers which they rely upon. For example, it was a ransomware attack on the IT service provider, Advanced, that put NHS data at risk and a cyber-attack on the MOVEit file transfer system which impacted thousands of organisations globally. The Bill therefore focuses on supply chain security and broadens the scope of NIS to include a clearer definition of the cloud computing services captured and also to include data centre providers and managed service providers.
The Bill:
- puts in place a statutory obligation to implement appropriate technical and organisational measures;
- includes broader incident reporting requirements which may require entities caught by the Bill to revisit internal incident response policies and procedures;
- significantly increases the financial penalties for non-compliance and introduces a new cost recovery regime;
- gives more powers to government and the regulators to intervene where there is an emerging threat, provide directions, inspect and seize documents and to interview personnel.
Organisations should assess whether they will fall within the scope of the revised NIS regime (in particular those organisations which may be brought into scope for the first time such as managed service providers and data centres). Steps should be taken to review current strategies for responding to cyber-attacks including associated policies and procedures and levels of investment in this area. We will continue to track the progress of the Bill in this update and report on how any developments impact the practical steps organisations need to take to comply.
DSG Retail Ltd v Information Commissioner [2026] EWCA Civ 140 – Court upholds ICO’s appeal
In this case the Court of Appeal upheld an appeal by the ICO relating to a cyber attack on DSG Retail Ltd in which attackers scraped transaction details from card readers, affecting over 5.6 million cards, although the attackers were prevented from obtaining information identifying the card holders.
The ICO took enforcement action, issuing the maximum fine possible for a breach of the security principle. DSG appealed on the basis that the data was not personal data, because individuals could not be identified from it, and argued that the security obligations therefore did not apply. The appeal was dismissed by the First-tier Tribunal, and DSG appealed further to the Upper Tribunal, which concluded that the relevant data was not personal data.
The ICO appealed the decision of the Upper Tribunal, and the Court of Appeal has now overturned it, ruling that the assessment must be whether the person is identifiable by the data controller, not by the third-party attacker. The Court of Appeal was concerned that the Upper Tribunal’s decision would have created a significant gap, with no obligation to protect against ransomware attacks where the attacker cannot identify data subjects.
Data (Use and Access) Act 2025 (DUA) – key aspects take effect
Key provisions of the DUA took effect on 5 February 2026 under the Data (Use and Access) Act 2025 (Commencement No.6 and Transitional and Saving Provisions) Regulations 2026. The key changes were as follows:
- The general prohibition on using personal data for solely automated decision-making in significant decisions affecting data subjects has been narrowed so that it applies only to automated decision-making based entirely or partly on the processing of special category data. Where automated decision-making is not based on special category data, a wider range of legal bases can be relied upon, including legitimate interests;
- Cookie consent is no longer required for statistical cookies, appearance cookies or emergency assistance cookies, although statistical and appearance cookies still require provision of an informed and simple opt-out mechanism;
- Processing of personal data based on the new legal basis of ‘recognised legitimate interests’ is now effective (no balancing test is needed where this lawful basis is relied upon) and includes the following activities:
- processing necessary for national security, public security and defence purposes;
- processing necessary for the detection, investigation or prevention of crime;
- responding to requests made by bodies acting in the public interest, for processing by those bodies for purposes laid down in law (for example, to help a government agency discharge its duties and functions); or
- processing necessary for the safeguarding of vulnerable individuals.
- Introduction of a definition of ‘research and statistical purposes’ to encourage a broad interpretation of scientific research to include ‘any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity’.
- Clarification of the circumstances when personal data may be re-used for different purposes depending on the lawful basis relied on.
- Increase of maximum level of fines under PECR to UK GDPR levels (i.e. £17.5 million or 4% of global annual turnover) thereby raising the stakes for non-compliance;
- Extension of the ‘soft opt-in’ exemption to charities making it easier for charities to contact existing supporters with marketing e-mails and text messages in relation to fundraising or appeals and campaigns;
- Expansion of the ICO’s enforcement powers; and
- Reformulation of the international data transfer test for assessing a third country’s adequacy, changing the standard from ‘essentially equivalent’ to ‘not materially lower’.
Organisations should review data protection policies and practices to ensure they reflect the DUA changes. Organisations should also be mindful of the forthcoming right for data subjects to complain directly to controllers (taking effect on 19 June 2026). Steps should be taken to ensure that you are appropriately facilitating the making of such complaints.
Fines and Enforcement Action
Children’s Data
Reddit issued with £14.47m fine for children’s privacy failures
The ICO issued Reddit with a £14.47 million fine for unlawfully processing the personal information of children under the age of 13. The platform was found to have failed to properly check the age of its users, risking children being exposed to inappropriate content, and also failed to carry out a data protection impact assessment (DPIA) to assess and mitigate risks to children.
The ICO has made clear that relying on users to declare their own age is not sufficient where children may be at risk, and has stated that it is now focusing in particular on platforms that primarily rely on self-declaration for age verification.
Organisations should ensure that age assurance methods are sufficiently robust and proportionate to the level of risk posed by their platform.
Imgur owner MediaLab fined over children’s privacy failures
The ICO fined MediaLab, the owner of image sharing and hosting platform Imgur, £247,590 after:
- failing to have in place any measures to check the age of users;
- processing personal data of children under the age of 13 without parental consent or any other lawful basis; and
- failing to carry out a DPIA to identify and reduce privacy risks to children.
These failures risked children being exposed to harmful content relating to eating disorders, homophobia, antisemitism and images of a sexual or violent nature.
This action serves as a further warning to organisations processing children’s data to ensure robust age assurance measures are in place and necessary parental consents are obtained.
Direct Marketing
Allay Claims Ltd and ZMLUK Limited fined for nuisance marketing messages
Allay Claims Ltd and ZMLUK Limited have been fined £120,000 and £105,000 respectively for sending millions of unlawful marketing text messages and emails. These fines align with the ICO’s typically robust approach to marketing-related breaches.
The ICO’s Head of Investigations commented:
“The law here is clear: businesses must only send marketing messages to people who have freely and knowingly consented to receiving them. Relying on vague or third-party consent, or sending marketing messages under the guise of service updates, isn’t enough.”
TMAC Ltd fined £100,000 for unsolicited marketing calls
TMAC Ltd sells personal pendant alarms and security systems. The ICO found that over 260,000 predatory calls were made to vulnerable individuals registered with the Telephone Preference Service over an eight-month period in 2024, in an attempt to solicit sales. There was evidence that the calls were deliberately targeted at vulnerable individuals, and transcripts showed that employees did not always reveal their true identity, claiming to be from local fire and crime prevention initiatives to persuade individuals to purchase a product. It also came to light that the source of the telephone numbers was a second-hand list taken from a director’s previous company.
The ICO reminded organisations of the need to consult and comply with the Privacy and Electronic Communications Regulations, and encouraged individuals to report nuisance calls to the ICO through their website.
Data Breaches
Password manager provider fined £1.2m by ICO for data breach
Password manager provider LastPass UK Ltd has been fined £1.2 million by the ICO following a 2022 data breach compromising the personal information of up to 1.6 million of its users.
The breach occurred in 2022 when a hacker gained access to an employee’s corporate laptop and the company’s development environment, where encrypted credentials were obtained. The hacker then gained access to a senior employee’s personal laptop, on which the hacker implanted malware to capture the employee’s master password. The hacker combined details obtained from both incidents to access the employee’s business vault, which contained the access key and decryption key. These were used to access LastPass’ backup database and take personal information.
LastPass’ use of a ‘zero knowledge’ encryption system (where the master password to access the password vault is stored locally on the customer’s own device and never shared with LastPass) meant the threat actor was unable to decrypt the encrypted passwords and other credentials. This was a mitigating factor; however, the enforcement action taken by the ICO is a reminder to organisations to ensure that they are implementing appropriate technical and organisational security measures.
In particular, access to the senior employee’s personal device was gained through exploitation of a vulnerability in a third-party application. At the time, LastPass allowed employees to access their business accounts using personal devices on which unverified and potentially insecure third-party applications could be installed. Employees were also encouraged to link their business and personal accounts so that both could be accessed with a single master password.
Where possible, organisations should restrict business activities to business devices with restrictions on unapproved apps and software, and should not permit the linking of business credentials to personal accounts and devices.
If staff are allowed to use personal devices for work purposes, organisations should follow the ICO’s recommendations:
- Enforce multi-factor authentication for all remote access;
- Separate work from personal (on both work and personal devices) using managed profiles or containers;
- Keep operating systems and security software up to date and promptly block outdated devices;
- Limit admin privileges and review access rights regularly; and
- Consider virtual desktop or remote app solutions for high-risk roles.
GP surgery reprimanded after excessive medical history of terminally ill patient sent to insurer
Staines Health Group has been reprimanded after sending excessive medical details about a terminally ill patient to their insurer. The incident occurred after the patient made a claim to their insurer, and the insurer requested that five years of medical history be sent to the patient for review before being forwarded to the insurer to progress the claim.
Instead of sending the medical history to the patient as requested, Staines Health Group sent 23 years of medical records directly to the insurer, which the patient believed ultimately led to a reduction in the payout of their claim.
The incident highlighted the absence of any written process for staff to follow when handling insurance requests and a lack of refresher data protection training for staff.
The ICO recommended that organisations take note of the lessons learned from the mistakes of Staines Health Group and:
- Put in place written processes to support staff with handling personal data requests from third parties;
- Consider the need for a quality assurance process when sharing personal data externally; and
- Provide up-to-date and regular data protection training for staff.
Post Office reprimanded over Horizon IT scandal victims’ data breach
The Post Office has been reprimanded by the ICO following a data breach where its communications team mistakenly published the personal data of postmasters involved in group litigation relating to the Horizon IT scandal.
The ICO found the Post Office had failed to implement appropriate technical and organisational measures to protect personal data with a lack of documented policies or quality assurance processes for publishing documents on the corporate website, as well as insufficient staff training and no guidance on information sensitivity or publishing practices.
The ICO highlighted the reprimand as a reminder of the need to take a data protection by design approach to everyday operations. Key takeaway lessons for all organisations include:
- Put in place a formal review and approval process before publishing sensitive documents online, including considering a multi-step sign-off process.
- Train staff to recognise personal data and assess its sensitivity in context, including the reputational and emotional impact of disclosure.
- Implement clear access controls and classification labels for documents and avoid the use of personal storage systems such as OneDrive or GoogleDrive.
- Ensure those involved in publishing content understand their role and the checks required before publication.
- Tailor training to the task and don’t just provide general training. Provide specific guidance to relevant teams on publishing protocols, data classification and risk awareness.
Police Scotland fined and reprimanded following excessive extraction and unlawful disclosure of contents of a mobile phone
Police Scotland received a £66,000 fine and reprimand following the extraction of information from an individual's mobile phone, which was found to be excessive and contained sensitive information that was not relevant to the investigation being carried out. This information was then included in a misconduct disclosure bundle in unredacted form, and shared with a third party who should not have received it.
The ICO found that there was a failure to implement appropriate technical and organisational measures to ensure data security, and a failure to limit personal data sharing to what was necessary for the purpose. Police Scotland were unable to demonstrate that staff handling sensitive information were clear on guidance and procedures, and the breach was not reported within the statutory 72-hour time frame.
This case illustrates the importance of being clear on your purpose for using, holding and disclosing information. If you are clear on the purpose, then ensuring that only information relevant to the purpose is used is easier to demonstrate. Organisations also need to be clear on what they expect from employees, and give clear guidance and training to allow compliance with data protection law.
Convictions
Further convictions in UK’s largest nuisance call investigation
An investigation launched by the ICO in 2016 has recently resulted in a further two convictions, bringing the total to ten in one of the largest nuisance call investigations undertaken by the ICO.
The investigation began when the owner of a car repair garage in County Durham contacted the ICO, concerned that his customers held him responsible for the nuisance calls they were receiving about personal injury claims.
It was initially discovered that approximately one million records were accessed without the individuals’ consent. The data was then sold on in anticipation of generating potential personal injury claims.
It is now known that personal information was obtained from over 400 garages as well as claims management and insurance companies.
ICO Financial Investigators and external financial investigators are now pursuing the recovery of those financial benefits obtained by the convicted individuals via the Proceeds of Crime Act 2002.
ICO Investigations
ICO announces investigation into Grok
On 3 February, the ICO issued a further statement confirming the launch of formal investigations into X Internet Unlimited Company (XIUC) and X.AI LLC (X.AI) concerning their processing of personal data associated with the Grok AI system and the system’s capability to create harmful sexualised images and video content. The investigation was prompted by serious privacy concerns and the potential harm to members of the public, and follows the ICO’s initial contact with these companies to obtain urgent information in response to reports of such content being created using Grok. Those reports suggested that Grok has been used to generate non-consensual sexual images of individuals, including children.
Concerns identified by the ICO include whether personal data has been processed lawfully, fairly and transparently and whether the development and use of Grok incorporated appropriate safeguards to prevent the generation of such harmful content using personal data.
The ICO has confirmed that no further comment will be made whilst the investigation is ongoing.
Freedom of Information and Environmental Information Regulations
The ICO continues to crack down on public authorities which are struggling to meet their Freedom of Information Act obligations.
A number of public authorities have recently received practice recommendations in respect of their FOI performance. Authorities are criticised by the ICO where they consistently exceed the statutory timeframe for responding to requests. In many cases, delays in processing requests and responding to complaints submitted to the ICO are the result of understaffing and resourcing issues within the organisation. Resourcing is an issue for many public authorities, and it may be helpful for organisations to focus on improving system efficiency so that they can meet the deadlines without needing to take on additional staff. Where practice recommendations are put in place and are not met, organisations are likely to receive an enforcement notice. The Foreign, Commonwealth & Development Office (FCDO) received an enforcement notice after its performance had not improved sufficiently and any improvements made had not been sustained. Whilst the FCDO has improved its compliance level from 37% to 77%, it fell short of the 90% expected by the ICO.
Additionally, a number of Northern Ireland government departments and another public authority have received practice recommendations as a result of not publishing their compliance statistics, as is required by organisations of their size. As a reminder, part 8.5 of the section 45 Freedom of Information Code of Practice recommends that all public authorities with over 100 FTE employees should publish details of their performance on handling FOI requests.
Guidance Updates
Use of Automated Decision Making in recruitment – draft guidance for employers issued
The ICO has highlighted the increasing use of automated decision making (ADM) in recruitment decisions – with a particular increase in the use of AI for this process – and has produced a new report into this trend, along with draft guidance for employers.
ADM can be a useful tool for recruiters, and it is permissible under data protection law provided the legislation is complied with. The draft guidance explains what is needed to comply, and views can be given on it until 29 May 2026.
The ICO consults on data protection enforcement procedural guidance
The ICO has published draft procedural guidance relating to its enforcement powers and how it proposes to conduct data protection investigations. The guidance provides more in-depth insight than has been provided before, with details of the processes to be followed at each stage of an ICO investigation. It sets out the circumstances in which the ICO may issue a public announcement at stages within the investigation process, and also provides insight into when the ICO may consider settlement and how that process will work.
The sections of the guidance relating to the ICO’s new investigative and enforcement powers introduced under the Data (Use and Access) Act 2025 may be of particular interest, such as the powers to make individuals available for interview, to commission reports by ‘approved persons’ and to compel the production of documents.
The consultation has now closed and the guidance may be further refined before it is finalised. Organisations may however wish to review the guidance now to assess how the ICO’s use of its powers may impact them should action be taken for any alleged breaches.
ICO publishes new guidance for education sector on sharing data to safeguard children
The ICO has published guidance to give those working in education greater confidence in sharing information to safeguard children.
The guidance makes clear that data protection law should not act as a barrier to sharing safeguarding data and instead provides a framework for sharing the data in the right way.
The guidance includes a link to a 10 step guide to sharing safeguarding information. The 10 steps can be summarised as follows:
Step 1: Be clear about how data protection can help you share information to safeguard a child.
Step 2: Identify your objective for sharing information, and share the information you need to, in order to safeguard a child.
Step 3: Develop clear and secure policies and systems for sharing information.
Step 4: Be clear about transparency and individual rights.
Step 5: Assess the risks and share as needed.
Step 6: Enter into a data sharing agreement.
Step 7: Follow the data protection principles.
Step 8: Share information using the right lawful basis.
Step 9: Share information in an emergency.
Step 10: Read the ICO’s data sharing code of practice.
Staff in the education sector should be reassured by this guidance that data protection law compliance and safeguarding are not in conflict and that data protection law should not prevent safeguarding data from being shared where there is a concern that a child or young person is at risk of harm.
ICO publishes updated guidance on international data transfers
In January, the ICO published its updated guidance on international transfers of personal data. The changes are designed to make it quicker for businesses to understand and comply with the transfer rules. Organisations can now follow a simple three step test to identify if they are making a restricted transfer, with additional clarity being provided on common questions that arise.
ICO Campaigns and Focus
Children’s Data
The ICO launches “Switched on to Privacy” Campaign
Following some stark findings in relation to children’s privacy online, the ICO has launched a campaign entitled “Switched On to Privacy” to help parents of children aged between 4 and 11 speak to their children about online safety.
This follows the ICO producing the Children’s Code for those offering online services to children, and a clear message to all data controllers that the ICO can and will enforce standards where children’s data is at risk. Whilst the Children’s Code is aimed at online suppliers of games and social media type services, it is a useful guide to what should be taken into account when an organisation is collecting or using data about those under 18 – even where the intention is not to collect that data.
ICO launches compliance review of children’s data privacy in mobile gaming
Following research highlighting significant levels of parental concern regarding children’s possible exposure to strangers or harmful content through games, the ICO plans to scrutinise popular mobile games to assess their privacy compliance. This will include reviewing default privacy settings, geolocation controls and targeted advertising practices.
The review follows the significant progress made by the ICO to improve privacy standards across social media and video sharing platforms through the Children’s Code strategy.
Care Records
ICO launches Better Records Together campaign to support handling of care records
The ICO has called for urgent improvements across UK local authorities and Health and Social Care Trusts after warning that people trying to access their own care records are being let down.
This has led to the ICO’s launch of a ‘Better Records Together’ campaign which includes a suite of practical resources to tackle the current issues including:
- new standards providing clarity on how to handle requests with care and good practice measures to better support people from the point they enter the care system;
- clear advice for individuals requesting their records to assist them in navigating the process and accessing support; and
- a UK-wide supervision pilot running across 2025/26 to monitor the performance of 19 organisations to drive improvements.
Staff dealing with requests for care records in the care sector should be made aware of this guidance and ensure it is taken into account when dealing with requests.
AI Updates
ICO and other international data protection authorities issue joint statement on privacy risks of AI generated imagery
Last month, a joint statement was published reflecting the coordinated views of 61 data protection authorities on the use of AI-generated imagery and their commitment to addressing the perceived global risk that these systems represent. The statement was prompted by the serious privacy concerns raised across the world by the ability of AI systems to generate realistic images and videos of identifiable people without their consent.
Although the statement recognises the benefits of AI to both individuals and society, the integration of AI image and video generation into social media platforms has been recognised as facilitating the production of harmful content, for example the creation of non-consensual intimate imagery featuring real individuals. Particular concerns were also identified regarding the potential harm to children and vulnerable groups.
All organisations involved in the development and use of AI content generation systems are reminded that those systems must be developed and used in line with applicable legal frameworks, particularly data privacy, noting that specific legal requirements will vary by jurisdiction and, that within some jurisdictions, the generation of non-consensual imagery can be a criminal offence.
Fundamental principles are included as a guide to those organisations involved in the development and use of such systems, in summary:
- The implementation of robust safeguards to prevent the misuse of personal information and the generation of harmful content.
- Ensuring meaningful transparency as to the AI tool’s capabilities, safeguards, acceptable use and consequences of misuse.
- Providing effective and accessible ways for individuals to request removal of harmful content and to ensure a rapid response to those requests.
- Implementing enhanced safeguards and providing clear and age-appropriate information to children, parents, guardians and educators in order to address specific risks to children.
In conclusion, the signatories to the statement call on organisations to proactively engage with regulators and implement those robust safeguards to guard against the advancement of technology at the expense of privacy, safety, dignity and other fundamental rights, in particular, for those most vulnerable in society.
ICO publishes report on data protection implications of agentic AI
AI is an area of particular focus for the ICO at the moment. Whilst generative AI focuses on creating content with input from an operator, agentic AI operates independently, planning and making decisions to achieve certain outcomes. Whilst its report on the data protection implications of agentic AI is not formal guidance, the ICO does set out its current thinking on the use of the technology. Importantly, organisations remain responsible for the data protection compliance of any technology they utilise, so privacy by design and strong data protection foundations should be at the forefront of any considerations when adopting AI tools.
Other News
Direct marketing: Fundraising Regulator responds to ICO consultation on ‘soft opt-in’ for charities
The Data (Use and Access) Act 2025 amends the Privacy and Electronic Communications Regulations 2003 (PECR) to allow charities to benefit from the “soft opt-in” exemption for marketing. This will make it easier for charities to contact existing supporters, where certain conditions are met.
The ICO consulted on these changes, and the Fundraising Regulator published its response to that consultation at the end of November. Whilst the Regulator welcomes the changes, it urges charities to exercise caution when using the soft opt-in; where the Regulator receives complaints, it has indicated it will investigate them, even where the ICO has taken no action in relation to the same issues. Charities should therefore ensure they comply with the Regulator’s guidance, along with that of the ICO.
Home Office consultation on legal framework for using facial recognition in law enforcement
The Home Office plans to develop a framework which governs how law enforcement organisations use facial recognition, biometrics and similar technologies. A consultation on this topic was launched in December. The increasing use of these technologies has raised concerns around privacy, discrimination, oversight and public trust. Current rules are complex and do not provide the certainty needed. The consultation aims to design a clear statutory regime which provides explicit rules around when, where and how law enforcement can use these technologies, and the safeguards which must be put in place. The consultation closes on 12 February 2026.
The ICO, FCA, SRA and ASA launch joint task force to crack down on poor practice in motor finance claims
A new task force has been established to tackle poor handling of motor finance claims by claims management companies. The ICO’s focus will be on the way in which personal data is used and, in particular, compliance with the direct marketing rules in the Privacy and Electronic Communications Regulations. More detail can be found on the FCA website.
What matters to you, matters to us — follow our LinkedIn page for legal insights that matter.




