21/05/2025
Lessons Learned from this Edition
- In connection with Freedom of Information Requests, ensure staff have the necessary training before disclosures of information are made, and ensure that you have checks built into your processes to avoid inadvertent disclosures.
- Remember the basics of cybersecurity – even if you outsource. Check password policies, multi-factor authentication, training…
- Do due diligence on third parties where you are entrusting personal data to them – especially with marketing and cookies. You will be responsible for compliance, so consider undertaking a cookies audit, ensure that you are happy with the consents obtained, and that you have a robust contract in place.
- Monitor your compliance with DSARs and put processes in place where helpful to manage backlogs.
- The ICO is particularly focussed on “hidden” processing – such as by AI and cookies. Make sure you understand what you are doing when using these technologies, and be as transparent as possible with individuals about how their data is being used.
Need help with data privacy compliance?
Compliance with ongoing data privacy obligations is critical to protect your business, but it can be an onerous task. This is why our team of experienced and specialist data privacy lawyers are delighted to offer our new data privacy compliance audit service, a cost-effective, solution-based plan tailored to your organisation’s needs.
ICO Update – Enforcement Action/Reprimands
Action related to security breaches
ICO issues fine to the Police Service of Northern Ireland
The Police Service of Northern Ireland was fined £750,000 after a spreadsheet containing the sensitive personal data of 9,483 officers and staff was inadvertently disclosed in response to a freedom of information request published on the public-facing WhatDoTheyKnow website.
The disclosure occurred through the inclusion of a tab in the spreadsheet which was supposed to have been deleted. The information was only briefly available online; however, the Police Service of Northern Ireland was confident that the information was in the hands of ‘dissidents’ who would seek to use it to intimidate or target officers and staff.
The ICO was particularly concerned about the procedures the Police Service of Northern Ireland had in place to handle FOI requests, finding that they were not sufficient to prevent the data from being disclosed in this case and therefore did not amount to appropriate organisational measures to ensure the security of the personal data.
This decision highlights the importance of being aware of hidden data when responding to FOIA requests and making sure appropriate policies and procedures are in place to prevent unlawful or inadvertent disclosure of personal data. Implementing robust governance frameworks, undertaking sufficient staff training and having in place appropriate oversight are important in helping to prevent similar lapses.
ICO issues reprimand to Southend-on-Sea City Council
Southend-on-Sea City Council was issued with a reprimand after accidentally leaking sensitive personal details of employees in response to a Freedom of Information request. The breach occurred due to a failure to undertake proper checks for hidden data before a spreadsheet was released. The ICO specifically noted failures in training and awareness of the software packages used by the Council.
This is another incident which highlights the importance of providing staff with appropriate training and awareness of the software packages they use, to mitigate the risk of inadvertent disclosure of personal data. In particular, if any documents to be disclosed are Excel files, ensure staff are familiar with the ‘Inspect Document’ option to help identify hidden data.
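By way of illustration only (and not a substitute for Excel’s own ‘Inspect Document’ tool or manual review), a short script can flag hidden sheets, rows and columns in a workbook before release. A minimal sketch using the openpyxl package; the file name is hypothetical:

```python
# Minimal pre-disclosure check: flag hidden sheets, rows and columns
# in an Excel workbook. Requires the openpyxl package.
from openpyxl import load_workbook

wb = load_workbook("response.xlsx")  # hypothetical file name
for ws in wb.worksheets:
    if ws.sheet_state != "visible":  # "hidden" or "veryHidden"
        print(f"Hidden sheet found: {ws.title}")
    hidden_cols = [c for c, dim in ws.column_dimensions.items() if dim.hidden]
    hidden_rows = [r for r, dim in ws.row_dimensions.items() if dim.hidden]
    if hidden_cols:
        print(f"{ws.title}: hidden columns {hidden_cols}")
    if hidden_rows:
        print(f"{ws.title}: hidden rows {hidden_rows}")
```

A check like this only catches hidden structure; it will not surface data embedded in pivot table caches or document metadata, so it should supplement rather than replace Excel’s built-in inspection and human review.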
ICO issues reprimand to Levales Solicitors LLP
A reprimand was issued to Levales Solicitors LLP (Levales) after a threat actor accessed its cloud-based server using legitimate credentials and then published data on the dark web. The breach impacted 8,234 UK individuals, 863 of whom were deemed high risk because of the nature of the data involved.
The ICO found that Levales had failed to ensure the ongoing confidentiality of its processing systems and had not implemented appropriate technical and organisational measures. In particular, Levales did not have multi-factor authentication for the affected domain account and had no password policy in place (a minimal sketch of an automated password check follows the takeaways below). Levales had outsourced its IT management to a third party and was unaware of the security measures in place at the time of the incident; it had not reviewed whether the technical measures associated with the IT management contract were appropriate for the personal data being processed since the contract was first signed in 2012.
Key takeaway points from this breach are:
- the ICO’s clear expectation that multi-factor authentication is a basic measure all organisations processing personal data should be implementing; and
- when using a managed service provider for IT, the ICO expects contracts to be reviewed and responsibilities to be fully understood to ensure the security of data is upheld.
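On the password policy point above, even a basic automated check is straightforward to put in place. A purely illustrative sketch – the thresholds are assumptions, not regulatory requirements:

```python
import re

def meets_password_policy(password: str, min_length: int = 12) -> bool:
    """Illustrative policy: minimum length, mixed case, a digit and a
    symbol. The rules here are example values only."""
    return all([
        len(password) >= min_length,
        re.search(r"[a-z]", password),
        re.search(r"[A-Z]", password),
        re.search(r"\d", password),
        re.search(r"[^A-Za-z0-9\s]", password),
    ])

print(meets_password_policy("correct-Horse-7battery"))  # True
print(meets_password_policy("password"))                # False
```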
Action related to direct marketing
ESL Consultancy Services Ltd fined £200,000 for instigating unlawful loan promotion nuisance texts
ESL has been fined £200,000 by the ICO for knowingly sending unlawful loan promotion nuisance text messages to people who had not consented to receive them.
The ICO found ESL used a third party to send the marketing text messages without ensuring valid consent was in place to send them. ESL also took steps to try to conceal the identity of the sender of the messages by using unregistered SIM cards.
This decision serves as a reminder that whenever you instigate direct marketing by text (including via a third party) you must comply with the requirements of the Privacy and Electronic Communications Regulations by ensuring that valid consent to send the messages has been obtained.
Smart Home Ensured Limited fined £90,000 for unsolicited marketing calls
Smart Home Ensured Limited was fined £90,000 and issued with an enforcement notice after it made 14,508 unsolicited marketing calls between 4 July and 24 August 2023 to numbers registered with the Telephone Preference Service (TPS) without consent.
Smart Home had used a public electronic communications service to conduct telesales and failed to request suppression of TPS-listed numbers.
This decision demonstrates the ICO’s extremely low tolerance in this area – you must not make live marketing calls to numbers registered on the TPS unless the subscriber has specifically told you that they want your marketing calls. You should always screen numbers against the TPS register before conducting telephone marketing.
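As a minimal sketch of what TPS screening might look like in code (the TPS register is available as a suppression file from licensed providers; the data below is hypothetical):

```python
def screen_against_tps(call_list: set[str], tps_register: set[str],
                       opted_in: set[str]) -> set[str]:
    """Return numbers that may be called: TPS-registered numbers are
    suppressed unless the subscriber has specifically opted in.
    Numbers are assumed to be normalised strings."""
    return {n for n in call_list if n not in tps_register or n in opted_in}

safe_to_call = screen_against_tps(
    call_list={"07700900001", "07700900002"},
    tps_register={"07700900002"},
    opted_in=set(),
)
print(safe_to_call)  # {'07700900001'}
```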
Quick Tax Claims Limited fined £120,000 for sending unlawful text messages
Quick Tax Claims Limited sent 7,863,547 unlawful text messages over the course of one month, resulting in 66,793 complaints, 93% of which stated that there was no opt-out option. The personal data had been bought from third-party suppliers that had not obtained valid consent. This resulted in the ICO issuing a fine of £120,000.
The ICO comments in its decision notice that it is reasonable to expect organisations to be aware of their responsibilities in this area, given the detailed guidance on direct marketing and consent and the telephone helpline it offers to organisations. When buying in marketing lists from third parties, rigorous checks must be carried out to ensure that the personal data has been obtained fairly and lawfully and that the necessary consent has been obtained. Assurances from the third party alone are not sufficient and proper due diligence must be undertaken.
Action related to SARs
The ICO has recently taken robust action against a number of organisations who are consistently failing to meet their obligations in relation to DSARs. The organisations in question failed to provide a response within the required deadline (which is one month, but may be extended by a further two months in certain circumstances) between 30% and 50% of the time over a one-year period, with some requests still outstanding after over two years.
In the case of Glasgow City Council, its failure to respond to DSARs was investigated; this led to the ICO issuing an Assessment Notice, which required the Council to participate in a wider data protection audit to establish whether appropriate procedures were in place for dealing with individuals’ requests for access to their personal data. In that case one department was responsible for the majority of delayed responses; in the relevant period it had received much higher volumes of requests for very sensitive data, following the launch of the Scottish Government’s Redress Scheme, all of which required close review and redaction of material held in off-site archives. No additional funding was in place to support this increased demand, which led to the failures.
The key takeaways for organisations are:
- If you consistently fail to meet the statutory deadlines for response to DSARs, you are at risk of receiving a reprimand, or being subject to an involuntary audit by the ICO.
- The statutory period for response applies even in periods of high demand or lack of resources.
- To avoid this you can:
- Put processes in place to accurately track the progress of all DSARs received (a minimal deadline-tracking sketch follows this list);
- Think about whether it is appropriate to apply the additional two month extension in complex cases;
- Communicate regularly with the data subject where there are delays;
- Put in place additional resources to deal with periods of high demand; and
- Consider whether any of the DSARs should be outsourced to free up internal resources.
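As a minimal sketch of how the statutory timetable described above could be tracked (the one-month deadline, optionally extended by two further months; function and field names are illustrative):

```python
# Minimal DSAR deadline tracker: one calendar month to respond,
# extendable by a further two months for complex requests.
# Uses the python-dateutil package for calendar-month arithmetic.
from datetime import date
from dateutil.relativedelta import relativedelta

def dsar_deadline(received: date, extended: bool = False) -> date:
    return received + relativedelta(months=3 if extended else 1)

def is_overdue(received: date, today: date, extended: bool = False) -> bool:
    return today > dsar_deadline(received, extended)

received = date(2025, 1, 31)
print(dsar_deadline(received))                        # 2025-02-28
print(dsar_deadline(received, extended=True))         # 2025-04-30
print(is_overdue(received, today=date(2025, 3, 15)))  # True
```

Note that relativedelta clamps month-end dates (31 January plus one month gives 28 February), which reflects the ICO’s guidance that where there is no corresponding date in the following month, the deadline falls on the last day of that month.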
Read our Practical tips for DSARs here.
A useful case which considered DSARs, what constitutes a ‘reasonable search’, and the scope of personal data is Ashley v HMRC. Read Vicki Bowles’ analysis of the case here.
Action related to FOIA
The ICO has also publicised the enforcement action it has taken against four public authorities for continued failings to meet their obligations under the Freedom of Information Act (FOIA). City of London Police has been issued with an enforcement notice for its FOIA failings. Staffordshire Police, Dorset Police and Goldsmiths, University of London have been given practice recommendations setting out improvements they can make to better comply with their legal obligations under FOIA.
Similarly to the DSAR backlogs discussed above, public authorities found that a combination of a lack of resources, an increase in the number of FOIA requests and greater complexity of requests was to blame. Nevertheless, the statutory period for response applies. Public authorities should therefore seek to ensure their FOIA processes are robust, efficient and as streamlined as possible.
Other action
Prosecutions of employees
There have been two recent high profile cases of individual employees being prosecuted under the Computer Misuse Act 1990 for accessing customers’ personal data unlawfully.
- A motor insurance worker had been accessing customer personal data outside his working hours, which resulted in a higher-than-normal number of claims being processed.
- Two RAC customer service employees were found to have copied customer personal data, relating to their involvement in road traffic accidents, and sold it to a third party.
In both instances the employing organisation discovered the unlawful conduct and reported it to the ICO. All three individuals were handed suspended prison sentences and ordered to complete 150 hours of unpaid work.
Organisations should ensure they have adequate policies in place to deal with these types of events, that employees are given adequate training and that appropriate monitoring capabilities are in place to prevent this occurring, especially where the personal data they hold is of value to third parties.
Advertising Cookies
The ICO consistently takes a hard-line approach to organisations that do not obtain valid consent for the use of cookies. Sky Betting and Gaming was issued with a reprimand for unlawfully processing people’s data through advertising cookies without their consent. Whilst consent had been sought by the organisation, some third-party marketing cookies were being deployed before visitors had given their consent. This resulted in the processing of individuals’ personal data without their consent or any other valid lawful basis. The ICO considers the unlawful disclosure of personal data to third parties “a matter of significant public concern”, so expect to see continued enforcement action being taken in an advertising context.
Other ICO News
ICO and National Crime Agency (NCA) sign Memorandum of Understanding to boost UK’s cyber resilience
On 10 September 2024, a Memorandum of Understanding was signed between the NCA and ICO setting out their commitment to collaborating and sharing information to enhance cyber resilience within the UK.
The MoU outlines key areas in which the ICO and NCA will work together to improve cybersecurity: sharing information, managing incidents, handling data subject rights requests and promoting learning and guidance.
Perhaps one of the most reassuring parts of the MoU is the confirmation it provides that the NCA will never pass information it receives from an organisation in connection with a cyber-incident to the ICO without first obtaining the organisation’s consent.
ICO Statement on the public sector approach
On 9 December 2024 the Information Commissioner issued a statement confirming the outcome of the two-year trial approach to regulating public sector organisations. Amongst other things, this approach included the ICO exercising greater discretion when considering whether to issue fines to public sector organisations and instead increasing the use of wider powers such as warnings, reprimands and enforcement notices. The aim of this approach is to avoid punishing victims of data breaches twice by reducing budgets for public services as a consequence of fines.
In his statement, the Information Commissioner notes that public authorities have fed back that publication of reprimands has been an effective deterrent because of the reputational damage and the potential impact on public trust – reprimands do therefore capture senior leaders’ attention. Areas of the approach have however been identified for improvement, including making clear which organisations fall within the scope of the public sector approach and what types of infringement could lead to a fine.
The Information Commissioner commented ‘reflecting on the past two years and based on the evidence from the review, I have decided to continue with the public sector approach. But I also have listened to the feedback and will provide greater clarity on its parameters.’ The approach will be kept under review going forwards.
ICO responds to Meta’s announcement on using user data for AI training
The ICO has responded following Meta’s announcement that it intends to use publicly available data to train its AI, having paused its plans to do this in June 2024 at the ICO’s request. The announcement has given rise to data protection concerns, in particular that many individuals may not understand how their publicly available information is being utilised or the potential implications.
The ICO specifically clarified in a statement responding to Meta’s announcement that it had not granted regulatory approval for this processing by Meta, and that Meta remains responsible for ensuring and demonstrating ongoing compliance. In particular, the ICO expects Meta to comply with the principles of transparency and data minimisation and the rules on consent.
This statement is a reminder that it cannot be assumed that publicly available data is exempt from the regulations. Although AI has the potential to be of huge benefit to individuals, these benefits must be carefully balanced against the rights of individuals to have their data protected.
What is on the ICO’s regulatory radar?
ICO to investigate social media platforms’ use of children’s data
Amid growing concerns around the use of data generated by children’s activity online on social media platforms, the ICO has announced investigations into how TikTok, Reddit, and Imgur use the personal information of UK children.
The investigation into TikTok focuses on how the platform uses the personal data of children aged 13–17 to feed algorithms that make recommendations and deliver content, and whether this is serving up content that is inappropriate or harmful. The investigations into Reddit and Imgur examine their age assurance measures, such as how they estimate or verify a child’s age, which can then be used to tailor their experience on the platforms.
The ICO stated:
“If we find there is sufficient evidence that any of these companies have broken the law, we will put this to them and obtain their representations before reaching a final conclusion.”
These investigations are part of broader efforts to ensure social media platforms protect children’s privacy and comply with data protection laws. The ICO has also published a progress update on its Children’s code strategy, outlining its key interventions in this space, and including a comparison table of some of the privacy practices of 29 social media and video sharing platforms.
ICO announces plan to tackle cookie compliance across the UK’s top 1,000 websites
Organisations are expected to provide meaningful control to individuals over how they are tracked online. This includes providing people with the option to opt-out of non-essential data being processed.
In line with this expectation, the ICO’s online tracking strategy for 2025 aims to give greater control over personal information back to the public, and the ICO has expanded its review of cookie compliance online to bring the UK’s top 1,000 websites into compliance.
To the extent your organisation’s website uses cookies, now would be a good time to audit their use and take steps to ensure they are being set in compliance with data protection law.
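As a first pass at such an audit, it is possible to script a check of which cookies a site sets before any consent is given. A minimal sketch using the requests package (the URL is hypothetical, and cookies set by JavaScript will not be captured – a headless browser would be needed for those):

```python
# Minimal first-pass cookie audit: list cookies set via HTTP response
# headers before any consent interaction has taken place.
import requests

def cookies_before_consent(url: str) -> dict[str, str]:
    session = requests.Session()
    session.get(url, timeout=10)
    return dict(session.cookies)

for name in cookies_before_consent("https://example.com"):
    print(f"Cookie set pre-consent: {name}")
```

Any non-essential cookie appearing in that list before the visitor has consented is a red flag of the kind the ICO’s review is likely to pick up.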
Consultations
ICO data protection fees increased following DSIT consultation
The Department for Science, Innovation and Technology (DSIT) opened a consultation between 29 August to 3 October 2024 on its proposals to amend the data protection fees that are payable by data controllers to the ICO. On 27 January 2025 the Data Protection (Charges and Information) (Amendment) Regulations 2025 (SI 2025/63) (the Regulations) were laid before Parliament.
The Regulations came into force on 17 February 2025 and the new fees to be paid to the ICO are as follows:
- Tier 1 (micro-organisations): £52
- Tier 2 (small and medium organisations): £78
- Tier 3 (large organisations): £3,763
New Guidance/ Tools/ Codes of Practice
Consent or pay guidance
The ICO has published consent or pay guidance in relation to mechanisms which give customers the option of either (i) consenting to advertising cookies to push personalised advertising to them when accessing products or services, (ii) paying to not receive personalised advertising when accessing products or services or (iii) walking away.
The guidance focuses on consent and the extent to which it is ‘freely given’ when it is an alternative to payment. It sets out the criteria that organisations should consider to help assess this, including whether there is a power imbalance, what level the fee is set at, whether the standard or nature of the product or service offered is equivalent across both options and whether the principle of privacy by design has been complied with. Organisations implementing a consent or pay model should be mindful of these criteria to help ensure compliance.
ICO launches direct marketing advice generator
Direct marketing is governed by the Privacy and Electronic Communications Regulations and, where the marketing involves use of personal information, additionally the UK GDPR. To help organisations stay on the right side of the law, the ICO has launched a direct marketing advice generator.
The tool is targeted at small and medium organisations and is still in the process of being tested and updated, but could be useful to businesses trying to navigate what can sometimes feel like a complex and overwhelming area of law.
New data protection code of conduct launched for UK private investigators
Under article 40 of the UK GDPR, organisations can create codes of conduct that identify and address important data protection issues in their sector with assistance and input from the ICO.
The ICO has approved and published the first sector-owned code of conduct – the Association of British Investigators Limited (ABI) UK GDPR Code of Conduct for Investigative and Litigation Support Services.
Investigators and litigation support services can voluntarily sign up to the code and, in doing so, must satisfy the monitoring body that they are meeting the requirements set out within the code.
The ICO has commented:
“This code, which investigators in the private sector can sign up to, will provide certainty and reassurance to those using their services – ensuring investigators are compliant with the UK GDPR requirements. This will assist investigators to navigate the challenges between conducting investigations whilst respecting people’s privacy rights.”
ICO audits and makes recommendations on AI recruitment tools
The ICO has audited developers and providers of AI-powered recruitment tools to assess compliance with data protection laws. The outcome of the audits was summarised in a report published on 6 November 2024 which sets out key findings and recommendations, examples of good practice, case studies and lessons learned.
The ICO’s key recommendations related to:
- ensuring personal data is processed fairly by monitoring for fairness, accuracy and bias in AI (a minimal monitoring sketch follows this list);
- informing candidates of how their personal data is processed by AI to meet transparency requirements;
- ensuring compliance with the principles of data minimisation and the purpose limitation;
- putting in place a data protection impact assessment;
- defining and documenting the data protection relationship between relevant parties; and
- identifying the applicable lawful basis and condition for processing.
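As a minimal illustration of the first recommendation in the list above (the report does not prescribe a particular metric; the selection-rate comparison and data below are illustrative assumptions):

```python
# Illustrative bias monitoring: compare shortlisting rates across
# candidate groups from screening outcomes. Metric and data are
# examples only, not drawn from the ICO report.
from collections import defaultdict

def selection_rates(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
    totals, shortlisted = defaultdict(int), defaultdict(int)
    for group, selected in outcomes:
        totals[group] += 1
        shortlisted[group] += int(selected)
    return {g: shortlisted[g] / totals[g] for g in totals}

rates = selection_rates([
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
])
disparity = min(rates.values()) / max(rates.values())
print(rates)
print(f"Disparity ratio: {disparity:.2f}")  # large gaps warrant review
```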
The ICO also published key questions for organisations looking to procure AI tools for recruitment, so they can seek clear assurances from developers and providers.
Many organisations now use AI in their recruitment processes, and it is a potentially highly useful tool for driving efficiency and helping to reduce unconscious bias when screening applications. It is important, however, that organisations are aware of the data protection challenges that come with these tools; the content of this report should help employers and AI providers to identify these challenges and address them.
ICO issues draft guidance on how data protection law applies to storage and access technologies such as fingerprinting
The ICO has released draft guidance on storage and access technologies to address concerns in connection with the use of device fingerprinting. This is in response to Google’s announcement permitting device fingerprinting techniques (involving collecting and combining information about a device’s software and hardware, for the purpose of identifying the device) for organisations using its advertising products from February 2025.
The draft guidance sets out in particular the ICO’s expectations in respect of governance, transparency, consent and subsequent processing of personal data obtained via storage and access technologies. A copy of the guidance can be found here. Organisations using tracking technologies should consider reviewing this guidance and taking any necessary steps to ensure compliance, given the increasing regulatory focus in this area.
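To illustrate why fingerprinting attracts the same rules as cookies: in essence, it derives a stable identifier by combining quasi-stable device attributes, with nothing needing to be stored on the device at all. A toy sketch with hypothetical attributes:

```python
# Toy illustration of the fingerprinting mechanism: combining device
# attributes into a stable identifier. Attribute names and values are
# hypothetical.
import hashlib

def device_fingerprint(attributes: dict[str, str]) -> str:
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

print(device_fingerprint({
    "user_agent": "Mozilla/5.0 (...)",
    "screen": "2560x1440",
    "timezone": "Europe/London",
    "installed_fonts_hash": "ab12cd34",
}))
```

Because the identifier is derived rather than stored, users cannot simply clear it in the way they can delete cookies, which is part of the transparency concern the guidance addresses.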
Law and Policy Update
RTM v Bonne Terre Ltd [2025] EWHC 111 (KB) – Significant judgment on consent
The case of RTM v Bonne Terre Ltd [2025] EWHC 111 (KB) involved a former customer of the defendants, who operate the Sky Betting & Gaming brand. The former customer considered himself a recovering online gambling addict and his claim concerned the unlawful processing of his data for targeted marketing without his consent.
The Judge found for the former customer on the basis that the defendants did not obtain valid consent for the use of cookies or for sending direct email marketing to him. The Judge analysed the concept of consent in detail, identifying three key strands to help assess when consent will or will not be found to be legally effective:
- The subjective element of the individual’s state of mind (what did they actually understand and desire),
- The autonomous nature of the consent (whether the individual was in a position to autonomously choose to engage with the consent process and decide whether to consent), and
- The evidential standard (how consent is obtained so as to demonstrate consent and mitigate against ambiguity).
This case is very fact-specific, but it does suggest that organisations relying on consent should, to help ensure the consent process is as robust as possible, give consideration to the types of individuals who are providing the consent, and to whether any adjustments could be made to the process of obtaining consent to account for any identified concerns. This is particularly important if the data subjects are high-risk because of vulnerabilities and sophisticated profiling and marketing is being undertaken.
Remedies are yet to be determined in connection with the claim, but the former customer is claiming for harm, distress and loss. A copy of the full judgment can be found here.
Data (Use and Access) Bill (DUAB)
The DUAB was introduced to Parliament in the House of Lords on 23 October 2024 and reached Committee stage in the House of Commons on 4 March 2025.
The DUAB builds on its predecessor – the Data Protection and Digital Information Bill – whilst also introducing some important changes. The DUAB’s core objectives are to grow the economy, improve public services and make people’s lives easier.
The DUAB is expected to become law later in 2025. The most recent version of the DUAB can be found here.
Court of Appeal rejects appeal against UK Information Commissioner’s monetary penalty notice
In a judgment handed down in December 2024, the Court of Appeal rejected both grounds for appeal brought by Doorstep Dispensaree Limited against a monetary penalty notice issued by the Information Commissioner.
The Court found that the burden of proof in an appeal against a monetary penalty notice for a breach of data protection law lies with the appellant, and that tribunals and appeal courts are not required to approach an appeal with a ‘blank sheet of paper’, essentially ignoring the monetary penalty notice. Weight may be attached to the reasons given by the Information Commissioner where relevant to matters of discretion, such as the proportionality and effectiveness of a penalty or the gravity of a breach, although tribunals should make up their own minds and attach no particular weight to the Information Commissioner’s views on questions of fact and law.
The case raised issues of considerable importance for ongoing and future appeals of penalties issued, and the Commissioner commented “I welcome the Court of Appeal’s judgment in this case as it provides clarity for future appeals. We defended our position robustly and are pleased that the Court has agreed with our findings.”
Other
UK Government publishes Code of Practice for the cybersecurity of AI
What is the Code?
The Government has created a new voluntary Code, together with an implementation guide, to address the specific cyber risks arising out of AI and to set out the baseline security requirements to be implemented by stakeholders in order to protect AI systems; both are now available to organisations.
This is the Government’s initial step in countering such risks, which have been identified as including: data poisoning (the intentional corruption of AI training data); model obfuscation (hiding code, often to protect intellectual property rights); indirect prompt injection (embedding malicious information into AI systems); and operational differences in the management of data within AI systems.
Who does the Code benefit?
The Code is for use by stakeholders, being the groups identified within the Code as AI system developers, system operators, data custodians, end users (who can be, for example, employees or consumers) and any other affected entities (even though these individuals/entities might not be interacting directly with the relevant AI system).
How is the Code structured?
The Code contains 13 Principles, intended to support stakeholders in mitigating cyber risks which may arise during each stage of the AI system lifecycle. The principles are grouped into three areas – secure design, secure development and secure deployment – and the majority are intended to apply to developers and system operators. The full list of principles and related details can be found here.
Is there a data protection risk?
Yes. Whenever personal data (including pseudonymised data) is used in any way for the development and deployment of an AI system, stakeholders will need to seek guidance as to how they can meet the data protection obligations imposed on them under the UK GDPR and the Data Protection Act 2018.