Data Bytes 51 - Your UK and European Data Privacy update for September 2024

    Welcome back to Data Bytes. The last month has seen the ICO continue to focus its activity on two key enforcement areas: (i) unlawful use of cookies; and (ii) use of personal data to train AI models. See below for a summary of the reprimand against Sky Betting and Gaming for unlawfully processing people's data through advertising cookies without their consent. You will also find our summary of the ICO's announcement that it has intervened in relation to the use of UK user data for the training of LinkedIn's AI models, resulting in LinkedIn suspending the use of its UK user data for this purpose pending further engagement with the regulator.

    September also saw the Ashurst data protection and cyber team join forces with our environmental and sustainability colleagues to host a roundtable on Green Cyber, following on from our webinar earlier this year. For a summary of our roundtable discussion, and our insights on forming a three-pronged attack to tackle data retention risk, keep scrolling to the Spotlight section.

    UK updates

    1. ICO issues reprimand to Sky Betting and Gaming for unlawful cookie processing

    In another example of the ICO's tough stance on cookies, on 2 September 2024 the ICO issued a reprimand to Bonne Terre Limited, trading as Sky Betting and Gaming, for unlawfully processing people's data through advertising cookies without their consent.

    According to the reprimand, Sky Betting and Gaming processed people's personal information and shared it with adtech companies before they had a chance to accept or reject advertising cookies. Although the ICO did not find evidence of deliberate misuse, it concluded that processing personal data through cookies in this way was not lawful, transparent or fair.

    In its announcement, the ICO said the enforcement action against Sky Betting and Gaming is a warning of the consequences organisations may face if they breach the law and deny people a choice over targeted advertising. In 2023, the ICO reviewed the top 100 UK websites and issued warnings to those which failed to comply with the cookie requirements of data protection law.

    In light of the ICO's continued focus on cookies, organisations should review their use of tracking technologies and consent processes, and rectify any compliance gaps before regulators or the general public spot them. In its Sky Betting and Gaming announcement, the ICO also said it will publish updated guidance on the use of cookies and similar tracking technologies for consultation later this year, so organisations should continue to watch this space.

    2. UK signs first-of-its-kind AI treaty

    On 5 September 2024 the UK signed the first-ever international legally binding treaty (CETS No. 225) aimed at ensuring the use of AI systems is consistent with human rights, democracy and the rule of law. The treaty, officially called the Council of Europe's Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law ("FCAI"), covers the use of AI systems by public authorities and private actors. It establishes various obligations aimed at promoting the progression and innovation of AI while upholding a set of fundamental principles such as transparency and oversight, accountability and responsibility, and respect for privacy and personal data protection.

    You may recall from our previous edition of Data Bytes that the FCAI was adopted by the Council of Europe on 17 May 2024, and we noted that the FCAI is the first attempt to apply the basic ideas of the EU AI Act (albeit in less depth) beyond the boundaries of the EU via an international treaty.

    As of 8 October 2024, the FCAI has been signed by 10 signatories, including the US, the EU and the UK, and countries from all over the world will be able to join. Signatories have the flexibility to implement appropriate legislative, administrative or other measures to give effect to the FCAI, which could result in divergences in implementation and impact across jurisdictions over time. The FCAI will enter into force on the first day of the month following the expiration of a period of three months after the date on which five signatories, including at least three Council of Europe member states, have ratified it.

    3. ICO intervenes again on use of user data for AI model training

    On 20 September 2024, the ICO announced that it had intervened in relation to the use of UK user data for the training of LinkedIn's AI models, resulting in LinkedIn suspending the use of its UK user data for this purpose pending further engagement with the regulator.

    This intervention with LinkedIn comes after a similar intervention in June 2024 where the ICO requested that Meta pause and review plans to use Facebook and Instagram user data to train generative AI.

    The LinkedIn and Meta interventions demonstrate that organisations seeking to leverage UK user data for AI model training should anticipate careful scrutiny by the ICO, particularly in relation to the safeguarding of users' information rights. Meta has subsequently announced that it will resume its AI model training plans in the UK; whilst the ICO has acknowledged this development, it noted that it has not provided regulatory approval for the processing to proceed.

    4. UK Government launches consultation on increasing fees payable to the ICO

    The Department for Science, Innovation and Technology has launched a consultation on a proposed increase to the fees paid by data controllers to the ICO. The objective of the changes is to ensure that the ICO has adequate funding to perform its role effectively, including its new responsibilities under the forthcoming Digital Information and Smart Data Bill.

    The government's proposal includes a 37.2% increase in fees, applied evenly across the existing tiered fee structure. Under the proposal, large organisations in tier 3, defined as those with over 250 employees and an annual turnover exceeding £36 million, would see their annual fee increase from £2,900 to £3,979. Tier 2 and tier 1 organisations would also face fee increases, from £60 to £82 and from £40 to £55 respectively. The government has stated that it has no intention to otherwise amend the existing tier structure, exemptions, or the direct debit discount.
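    By way of a rough check (our own calculation, assuming the 37.2% uplift is applied to each current fee and rounded to the nearest pound), the quoted figures follow directly from the existing amounts:

        £2,900 × 1.372 ≈ £3,979 (tier 3)
        £60 × 1.372 ≈ £82 (tier 2)
        £40 × 1.372 ≈ £55 (tier 1)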

    European updates

    1. EU Commission updates its Q&A for the EU AI Act

    On 1 August 2024, the European Commission published a revised version of its Q&A document for the EU AI Act (Regulation (EU) 2024/1689).

    The Q&A document gives providers and deployers an overview of their obligations and of potential enforcement under the AI Act, helping companies ask the right questions when introducing AI systems, in particular regarding the classification of AI systems and general-purpose AI models. It also provides information on the supervisory governance and enforcement mechanisms under the AI Act, including the two-tiered governance system in which national authorities supervise AI systems in general while the EU Commission has direct competence to deal with general-purpose AI models. The Q&A document underscores the importance of standardisation, transparency and accountability in developing AI systems, including specific requirements on labelling and watermarking AI-generated content to prevent manipulation and misinformation. It also draws attention to the significant penalties for non-compliance, which can reach up to €35 million or 7% of total worldwide annual turnover for the most severe infringements.

    2. EU Commission publishes its technical FAQ document for the Data Act

    On 13 September 2024, the European Commission published a comprehensive set of Frequently Asked Questions (FAQs) on the technical aspects of implementing the Data Act (Regulation (EU) 2023/2854). The FAQs are designed, in particular, to help data holders better understand the legal requirements and how to take forward the implementation of the Data Act at a technical level. The FAQs are the product of the Commission's extensive stakeholder engagement and provide detailed answers to questions on various aspects, including direct and indirect data access, the role of data holders and the option to "outsource" this role, the protection of trade secrets and the requirement for user verification. The EU Commission has presented the FAQs as a 'living document' that it will update as and when necessary.

    On the data protection side, the FAQs emphasise that the Data Act complements the GDPR by enhancing data sharing and ensuring a fair distribution of the value generated from data. They specify that the GDPR remains fully applicable to all personal data processing activities under the Data Act and that, in cases of conflict, the GDPR prevails. The document outlines the responsibilities of data holders, including the need to provide users with access to data generated by connected products and related services. Additionally, it introduces mechanisms to protect trade secrets (the "trade secret handbrake") and addresses the conditions under which data holders can withhold or refuse to share data. The FAQs further elaborate on the enforcement and cooperation mechanisms between data protection authorities and other competent authorities, ensuring a coherent approach across the EU.

    3. CJEU ruling – Data Protection Authorities are NOT obliged to impose corrective measures

    On 26 September 2024, the CJEU ruled that a data protection authority (DPA) is not obliged to exercise its corrective powers (e.g. to impose a fine) against a data controller that has suffered a personal data breach, but can exercise its discretion in deciding whether to take action against a shortcoming or breach (C-768/21), even if the issue has been raised by a data subject. Under Article 58(2) GDPR, the DPA can choose from a variety of possible measures and reprimands, in accordance with the specific circumstances of the individual case. That applies in particular where an infringement has already been made good and the further processing of personal data by the controller is ensured in compliance with the GDPR.

    This landmark decision is highly significant, in particular for data breaches arising from a cyber-attack. According to the CJEU, supervisory authorities may choose to abstain from further action if a data controller has proactively taken the necessary steps to remedy the breach and to ensure full compliance with the GDPR.

    4. Standard Contractual Clauses for the transfer of data

    In September 2024, the European Commission announced its intention to adopt new Standard Contractual Clauses for the transfer of data to third-country controllers and processors that are subject to the GDPR. The aim is to complement the existing clauses, which are meant for data transfers to a data importer located in a third country and not subject to the GDPR.

    Standard contractual clauses are model contractual clauses that can be used by data exporters to comply with the GDPR provisions when transferring personal data to third countries.

    The European Commission intends to adopt these new clauses by the second quarter of 2025, following a public consultation planned for the fourth quarter of 2024.

    5. The Belgian Data Protection Authority refuses to process a complaint alleging personal data infringements

    A claimant, represented by the non-profit organisation NOYB, filed a complaint with the Belgian Data Protection Authority, arguing that the Google Analytics tool illegally transferred personal data to the United States when a data subject visited the website flair.be.

    The Belgian Data Protection Authority considered the claim to be an abuse of law and dismissed the complaint.

    First, it found that the claimant did not have a sufficiently concrete interest: the alleged infringement was orchestrated by the organisation, as the claimant, who resides in Austria, did not speak Dutch (the language of the website).

    Furthermore, the Authority considered that the organisation was not properly mandated: the mandate lacked detail, several elements were unclear and ambiguous, and the complaint was lodged on the basis of a pre-established "model case" prepared by the organisation.

    The Authority found that the organisation sought to raise general issues rather than defend the claimant's specific interests. It therefore dismissed the case, stating that the GDPR is designed to protect the rights of concrete individuals, not to create artificial disputes for the benefit of political organisations.

    6. Second report on the application of the GDPR

    On 25 July 2024, the European Commission published its second report on the application of the GDPR. The report is required by Article 97 GDPR, under which the European Commission must evaluate the application of the GDPR every four years.

    The report makes various recommendations, such as allocating sufficient resources to DPAs, intensifying efforts directed at small and medium-sized enterprises, and ensuring that national and European guidelines, as well as CJEU case law, are consistent.

    France updates

    1. The French Data Protection Authority fines CEGEDIM Santé €800,000 for the unauthorised processing of health data

    2. The French Data Protection Authority publishes its recommendations for mobile applications

    Spain updates

    1. No infringement of data protection regulations by GLS

    2. The Catalan Data Protection Authority's opinion on the use by tenants of images obtained from video surveillance systems

    Spotlight on Green Cyber and Retention Risk

    In September, we had the pleasure of hosting Ashurst's annual data protection roundtable. This year our discussion was on Green Cyber, focusing on how a three-pronged attack from Data, ESG and IT security leads can secure buy-in to tackle data retention issues. This event brought together our market-leading legal and risk advisory teams to dive deep into two of the most transformative megatrends we're passionate about at Ashurst: digitisation and sustainability.

    Our discussions were rich with insights and forward-thinking strategies, and we've captured the essence of our key takeaways below:

    • There are many factors driving the growing volumes of data within organisations. These include the explosion in the use and development of AI (particularly generative AI), poorly managed data governance, and legacy IT systems which are not capable of automatically deleting data that is past its stated retention period.
    • Unnecessary data storage results in unnecessary storage costs and increased risk: cyber criminals have more data to find and exploit for extortion, and regulators treat unnecessarily long data retention periods as an aggravating factor when imposing fines.
    • ESG is increasingly a factor that should be taken into account when building momentum for tackling data retention risk. Robust data retention and deletion practices are a practical step to reduce an organisation's carbon footprint and the cost of data processing and storage.
    • There is often reluctance from management and the business to tackle the issue. To get buy-in, you need to show a tangible cost saving.
    • Tackling data retention and deletion should be a multi-department effort. A three-pronged approach from compliance, IT security and ESG colleagues, backed by evidence of cost savings, has a stronger chance of securing management buy-in.

    Please contact us if you wish to learn more or explore how we can help your organisation navigate these trends.

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.