Data Bytes 51 - Your UK and European Data Privacy update for September 2024
15 October 2024
Welcome back to Data Bytes. The last month has seen the ICO continue to focus its activity on two key enforcement areas: (i) unlawful use of cookies; and (ii) use of personal data to train AI models. See below for a summary of the reprimand against Sky Betting and Gaming for unlawfully processing people's data through advertising cookies without their consent. You will also find our summary of the ICO's announcement that it has intervened in relation to the use of UK user data to train LinkedIn's AI models, resulting in LinkedIn suspending the use of its UK user data for this purpose pending further engagement with the regulator.
September also saw the Ashurst data protection and cyber team join forces with our environmental and sustainability colleagues to host a roundtable on Green Cyber, following on from our webinar earlier this year. For a summary of our roundtable discussion, and our insights on forming a three-pronged attack to tackle data retention risk, keep scrolling to the Spotlight section.
In another example of the ICO's tough stance on cookies, on 2 September 2024 the ICO issued a reprimand to Bonne Terre Limited, trading as Sky Betting and Gaming, for unlawfully processing people's data through advertising cookies without their consent.
According to the reprimand, Sky Betting and Gaming processed people's personal information and shared it with adtech companies before they had a chance to accept or reject advertising cookies. Although the ICO did not find evidence of deliberate misuse, it concluded that processing personal data via cookies in this way was not lawful, transparent or fair.
In its announcement, the ICO said the enforcement action against Sky Betting and Gaming is a warning of the consequences organisations may face if they breach the law and deny people a choice over targeted advertising. In 2023, the ICO reviewed the top 100 UK websites and issued warnings to those which failed to comply with the cookie requirements of data protection law.
In light of the ICO's continued focus on cookies, organisations should review their use of tracking technologies and consent processes, and rectify any compliance gaps before regulators or the general public spot them. In its Sky Betting announcement, the ICO also said it will publish updated guidance for consultation later this year on the use of cookies and similar tracking technologies – organisations should continue to watch this space.
On 5 September 2024 the UK signed the first-ever international legally binding treaty (CETS No. 225) aimed at ensuring the use of AI systems is consistent with human rights, democracy and the rule of law. The legally binding treaty, officially called the Council of Europe's Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law ("FCAI"), covers the use of AI systems by public authorities and private actors. It establishes various obligations aimed at promoting the progress and innovation of AI while upholding fundamental principles such as transparency and oversight, accountability and responsibility, and respect for privacy and personal data protection.
You may recall from our previous edition of Data Bytes that the FCAI was adopted by the Council of Europe on 17 May 2024, and we noted that the FCAI is the first attempt to apply the basic ideas of the EU AI Act (albeit in less depth) beyond the boundaries of the EU via an international treaty.
As of 8 October 2024, the FCAI had been signed by 10 signatories, including the US, the EU and the UK; however, countries from all over the world will be able to join. Signatories have the flexibility to implement appropriate legislative, administrative or other measures to give effect to the FCAI – this could result in divergences in implementation and impact across jurisdictions over time. The FCAI will enter into force on the first day of the month following the expiration of a period of three months after the date on which five signatories, including at least three Council of Europe member states, have ratified it.
On 20 September, the ICO announced that it had intervened in relation to the use of UK user data to train LinkedIn's AI models, resulting in LinkedIn suspending the use of its UK user data for this purpose pending further engagement with the regulator.
The LinkedIn intervention follows a similar intervention in June 2024, when the ICO requested that Meta pause and review its plans to use Facebook and Instagram user data to train generative AI.
The LinkedIn and Meta interventions demonstrate that organisations seeking to leverage UK user data for AI model training should anticipate careful scrutiny by the ICO, particularly in relation to the safeguarding of users' information rights. Meta has since announced that it has decided to resume its AI model training plans in the UK; whilst the ICO has acknowledged this development, it noted that it has not provided regulatory approval for the processing to proceed.
The Department for Science, Innovation and Technology has launched a consultation on a proposed increase to the fees paid by data controllers to the ICO. The objective of the changes is to ensure that the ICO has adequate funding to perform its role effectively, including its new responsibilities under the forthcoming Digital Information and Smart Data Bill.
The government's proposal includes a 37.2% increase in fees distributed evenly across the existing tier fee structure. Under the proposal, large organisations in tier 3, defined as those with over 250 employees and an annual turnover exceeding £36 million, would see their annual fee increase from £2900 to £3979. Tier 2 and tier 1 organisations would also face fee increases from £60 to £82, and £40 to £55, respectively. The government has stated that it has no intention to otherwise amend the existing tier structure, exemptions, or the direct debit discount.
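For readers who want to sanity-check those figures, the short calculation below (a simple illustrative Python sketch we have added; it assumes the 37.2% uplift is applied uniformly to each tier and rounded to the nearest pound) reproduces the proposed fees from the current ones:

    # Rough check of the proposed 37.2% ICO fee uplift, rounded to the nearest pound
    current_fees = {"tier 1": 40, "tier 2": 60, "tier 3": 2900}
    for tier, fee in current_fees.items():
        print(f"{tier}: £{fee} -> £{round(fee * 1.372)}")
    # Prints: tier 1: £40 -> £55, tier 2: £60 -> £82, tier 3: £2900 -> £3979

The output matches the consultation figures quoted above, confirming that the per-tier amounts follow from the single 37.2% uplift.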
On 1 August 2024, the European Commission published a revised version of its Q&A document for the EU AI Act (Regulation (EU) 2024/1689).
The Q&A document gives providers and deployers an overview of their obligations and of potential enforcement under the AI Act, to enable companies to ask the right questions when introducing AI systems, in particular regarding the classification of AI systems and general-purpose AI models. Further, it provides information on the supervisory governance and enforcement mechanisms under the AI Act, including the two-tiered governance system under which national authorities supervise AI systems in general and the EU Commission has direct competence to deal with general-purpose AI models. The Q&A document underscores the importance of standardisation, transparency and accountability in developing AI systems, with specific requirements on labelling and watermarking AI-generated content to prevent manipulation and misinformation. It also draws attention to the significant penalties for non-compliance, reaching up to €35 million or 7% of total worldwide annual turnover for the most severe infringements.
On 13 September 2024, the European Commission published a comprehensive set of Frequently Asked Questions (FAQs) on technical aspects of implementing the Data Act (Regulation (EU) 2023/2854). The FAQs are designed to help data holders in particular to better understand the legal requirements and how to take forward the implementation of the Data Act at a technical level. The FAQs are the product of the Commission's extensive stakeholder engagement and provide detailed answers on various aspects, including direct and indirect data access, the role of data holders and the option to "outsource" this role, the protection of trade secrets and the requirement for user verification. The EU Commission has presented the FAQs as a 'living document' that it will update as and when necessary.
On the data protection side, the FAQs emphasise that the Data Act complements the GDPR by enhancing data sharing and ensuring a fair distribution of the value generated from data. They specify that the GDPR remains fully applicable to all personal data processing activities under the Data Act and that, in cases of conflict, the GDPR prevails. The document outlines the responsibilities of data holders, including the need to provide users with access to data generated by connected products and related services. Additionally, it introduces mechanisms to protect trade secrets (the "trade secret handbrake") and addresses the conditions under which data holders can withhold or refuse to share data. The FAQs further elaborate on the enforcement and cooperation mechanisms between data protection authorities and other competent authorities, ensuring a coherent approach across the EU.
On 26 September 2024, the CJEU ruled that a data protection authority (DPA) is not obliged to exercise certain corrective powers (e.g. impose a fine) against a data controller that has suffered a personal data breach, but can exercise its discretion when deciding whether to take action against a shortcoming or breach (C-768/21), even if the issue has been raised by a data subject. Under Article 58(2) GDPR, the DPA can choose from a variety of possible measures and reprimands, in accordance with the specific circumstances of the individual case. That applies in particular where an infringement has already been made good and the controller's further processing of personal data is ensured to comply with the GDPR.
This landmark decision is highly significant, in particular for data breaches deriving from a cyber-attack. According to the CJEU, supervisory authorities may choose to abstain from further action if a data controller has proactively taken the necessary steps to remedy the breach and ensures full compliance with the GDPR.
In September, the European Commission announced its intention to adopt new Standard Contractual Clauses for the transfer of data to third country controllers and processors subject to the GDPR. The aim is to complement the existing clauses, which are intended for data transfers to a data importer located in a third country and not subject to the GDPR.
Standard contractual clauses are model contractual clauses that can be used by data exporters to comply with the GDPR provisions when transferring personal data to third countries.
The European Commission intends to adopt these new clauses by the second quarter of 2025, following a public consultation planned for the fourth quarter of 2024.
A claimant, represented by the non-profit organisation NOYB, filed a complaint with the Belgian Data Protection Authority, arguing that the Google Analytics tool illegally transferred personal data to the United States when a data subject visited the website flair.be.
The Authority first found that the claimant did not have a sufficiently concrete interest: the alleged infringement had been orchestrated by the organisation, as the claimant, who resides in Austria, did not speak Dutch (the language of the website).
Furthermore, the Authority considered that the organisation was not properly mandated, as the mandate lacked detail and several elements were unclear and ambiguous. The complaint had been lodged on the basis of a pre-established "model case" prepared by the organisation.
The Authority found that the organisation sought to raise general issues rather than to defend the claimant's specific interests. It therefore dismissed the case, stating that the GDPR is designed to protect the rights of concrete individuals, not to create artificial disputes for the benefit of political organisations.
On 25 July 2024, the European Commission published its second report on the application of the GDPR. The report is required by Article 97 of the GDPR, under which the European Commission must evaluate the application of the GDPR every four years.
The report makes various recommendations, such as allocating sufficient resources to DPAs, intensifying efforts towards small and medium-sized enterprises, and ensuring that national and European guidelines, as well as CJEU case law, remain consistent.
In September, we had the pleasure of hosting Ashurst's annual data protection roundtable. This year our discussion was on Green Cyber, focusing on how a three-pronged attack from Data, ESG and IT security leads could secure buy-in to tackle data retention issues. This event brought together our market-leading legal and risk advisory teams to dive deep into two of the most transformative megatrends we're passionate about at Ashurst: digitisation and sustainability.
Our discussions were rich with insights and forward-thinking strategies, and we've captured the essence of our key takeaways below:
Please contact us if you wish to learn more or explore how we can help your organisation navigate these trends.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.