Data Bytes 38: Your UK and European Data Privacy update for July 2023

    Welcome to our new-look Data Bytes, in which the Ashurst UK and European Data Privacy and Cyber Security team summarises the key privacy legal and policy developments of the previous month.

    There's no such thing as a summer break in terms of privacy developments, and this June and July have been as busy as ever. From an international agreement to secure the free flow of personal data from the UK and Europe to the US, to new legislation to reform UK data protection law progressing through Parliament, get your byte-sized digest here. Each month we will also turn the spotlight on a particular privacy or cyber risk in our "Have you thought about" section below. This month we are delighted to share the thoughts of John Macpherson, head of Ashurst's cyber response team, and Matt Worsfold, Data & Analytics practice lead at Ashurst Risk Advisory, both partners in our risk advisory practice, on the particular cyber risks that AI presents.

    UK Developments

    1. UK and US commit to Data Bridge

    On 8 June 2023, government officials from the UK and US reached a commitment in principle to establish a "Data Bridge" to facilitate the free flow of personal data between the two countries. The Data Bridge is a legal framework which will form an extension of the EU-US Data Privacy Framework ("DPF"), which was recently the subject of an adequacy decision from the EU Commission (see further details below). Once the Data Bridge is finalised, organisations subject to the UK GDPR will be able to transfer personal data to US companies certified under the DPF without having to implement safeguards such as standard contractual clauses or complete a transfer risk assessment. The UK government is still finalising its assessment of the US legal system. The UK also needs to be designated as a "qualifying state" under Executive Order 14086 (the "Executive Order"), which was introduced in the US in October 2022 to provide new privacy and civil liberties safeguards to individuals. Once the agreement is finalised, the story and the uncertainty are unlikely to end. As has happened a number of times before, we expect a challenge from Max Schrems as to whether the laws of the US provide sufficient protection to EU and UK citizens or whether more fundamental changes to US surveillance laws are required to provide such protections. For now, we have an imminent reprieve from undertaking transfer risk assessments where data is being transferred to members of the DPF. Even transfers to US companies not certified under the DPF will become easier in time once the UK is listed as a qualifying state under the Executive Order, as these additional rights will apply to UK residents whether the importing organisation is a member of the DPF or not.

    2. ICO Releases Guidance on Privacy Enhancing Technologies ("PETs")

    On 19 June 2023, the ICO released guidance on PETs aimed at data protection officers and others who use large personal data sets in finance, healthcare, research or government. The guidance explains the risks and benefits of eight types of PETs and endorses their use as a method of complying with data protection law when sharing people's personal data. The guidance is timely: companies struggling to comply with privacy laws whilst maximising the full potential of AI will find useful guidance on technologies that could be used to eliminate or minimise privacy risks. It explains the basic premise of a number of PETs, including synthetic data, homomorphic encryption and zero knowledge proofs. Although none of this is likely to be new to experienced data scientists, the guidance gives data protection officers and legal teams the vocabulary to stress test technical teams or third party suppliers as to whether it is possible to undertake data analysis in a more privacy-conscious way.
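    By way of illustration only, the short Python sketch below shows the basic idea behind one widely discussed PET, differential privacy: a count query over personal data is answered with a small amount of calibrated random noise, so that the presence or absence of any one individual cannot be inferred from the result. The dataset, epsilon value and function names are hypothetical and are not drawn from the ICO guidance itself.

```python
# Illustrative sketch only: a minimal Laplace-mechanism example of
# differential privacy applied to a simple count query. The records,
# epsilon value and function names below are hypothetical.
import math
import random


def laplace_noise(scale: float) -> float:
    # Inverse-transform sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def noisy_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1 (adding or removing one person
    # changes the true count by at most 1), so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy for this single query.
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical dataset: report roughly how many patients are over 60
# without revealing the exact figure.
patients = [{"age": 64}, {"age": 41}, {"age": 72}, {"age": 58}, {"age": 69}]
print(noisy_count(patients, lambda r: r["age"] > 60, epsilon=0.5))
```

    The smaller the epsilon value chosen, the more noise is added and the stronger the privacy guarantee, at the cost of less accurate results.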

    3. ICO Launches Innovation Advice Service

    The ICO also launched a new innovation advice service for organisations using new or innovative technologies, in particular where they involve AI, biometric data or PETs. Organisations can expect an answer from the ICO within 10 to 15 business days. Previously asked and answered questions are published and provide useful guidance as to the ICO's thinking in a number of different areas. Current published answers cover whether a number plate is personal data, whether providers of deep learning AI solutions are processors or controllers, the lawful basis for storing employee diversity data, the lawful basis for generative AI systems, restricted transfers to the US, considerations when engaging cloud service providers and the boundaries of what constitutes special category data.

    4. ICO Warns Organisations About "Reject All" Cookie Buttons

    In an interview with MLex in June 2023, the ICO's Stephen Bonner warned that organisations which fail to give website users a "reject all" option in their cookie banners are breaking the law. Bonner noted that the "reject all" button needs to appear in the initial cookie banner presented to website users and that the ICO is monitoring compliance on this issue. The ICO released a statement in 2022 welcoming Google's introduction of a "reject all" button, and Bonner's comments highlight that the ICO expects all organisations to adopt this approach to compliance with the cookie consent laws. Currently, many organisations in the UK display an "accept all" button but require users to go a level deeper into "manage settings" before they are able to toggle non-essential cookies on or off. This statement from the ICO makes it clear that it does not consider this a compliant practice and that a "reject all" button should appear in the initial cookie banner. Organisations across the UK should therefore review their cookie consent practices to ensure they are compliant and implement a "reject all" button if necessary.

    5. Data Protection and Digital Information Bill (No.2) (the "Bill") Continues Through Parliament

    Following the Bill's first reading in March 2023, it has progressed through a second reading and was considered by a Public Bill Committee over eight sessions in May 2023. The current version of the Bill, as amended by the Committee, is available here, and it will now progress to a third reading in Parliament at a date yet to be announced.

    EU Developments

    1. EU Commission Adopts New EU-US Adequacy Decision

    On 10 July 2023, the adequacy decision for the EU-US Data Privacy Framework ("DPF") was adopted by the EU Commission as a replacement for the invalidated Privacy Shield Framework. Any US organisation that is subject to the jurisdiction of the Federal Trade Commission ("FTC") or the Department of Transportation ("DOT") may certify under the DPF and receive personal data from organisations based in the European Economic Area without the need for additional safeguards. US organisations such as deposit-taking banks are not under the jurisdiction of the FTC and therefore cannot certify under the DPF; any transfers to these banks would still need to involve the EU Standard Contractual Clauses ("SCCs") or another transfer mechanism and the completion of a transfer impact assessment. The EDPB has released FAQs which clarify that the legal changes and safeguards implemented by the US government in connection with the DPF apply to all data transferred to the US regardless of the transfer tool used. This will also simplify the process for completing transfer impact assessments in connection with transfers to US organisations not able to certify under the DPF, since the rights granted to EU residents by Executive Order 14086 will apply whether the organisation is a member of the DPF or not. As stated in the UK update above, we expect a challenge from Max Schrems as to whether the laws of the US provide sufficient protection to EU and UK citizens or whether more fundamental changes to US surveillance laws are required to provide such protections. We expect organisations in Europe to take a cautious approach to this news: whilst removing the need for SCCs and scaling back on onerous transfer impact assessments is now permitted, we foresee many organisations taking a belt and braces approach and still requiring SCCs to be in place with members of the DPF, in case history repeats itself and the framework is declared insufficient in the future.

    2. Ad Tech Company Fined EUR 40,000,000 by CNIL

    On 15 June 2023, the French Data Protection Authority ("CNIL") fined Criteo EUR 40,000,000 for tracking internet users in order to display targeted advertisements without obtaining the users' consent, and for failing to comply with transparency requirements and a number of data subject rights under the GDPR. The investigation was initiated by the CNIL following complaints from Privacy International and the European Centre for Digital Rights ("noyb"), and specifically found that Criteo had failed to implement measures to demonstrate that consent had been obtained for the personal data it processed. This highlights the CNIL's enforcement focus on adtech and cookie practices, and sets high expectations for intermediaries in the adtech ecosystem across Europe. Key findings from the decision are: (i) adtech intermediaries are expected to proactively audit and verify consent collected by their partners on their behalf; (ii) privacy notices need to clearly identify the legal basis for each specific processing purpose; and (iii) the collection of pseudonymous browsing and technical data does not reduce the obligations which attach to such data under the GDPR. All of these present practical challenges for the adtech industry, and it will be interesting to see whether other regulators across Europe follow the CNIL's lead with this enforcement focus.

    3. EDPB Adopts Guidelines on GDPR Fine Calculations

    On 7 June 2023, the EDPB released guidelines setting out a five-step methodology to be used by EU data protection authorities when calculating administrative fines under the GDPR. The guidelines are intended to harmonise the approach to calculating fines across the EU and include examples of how the methodology should be applied in practice. The calculation of fines is largely at the discretion of the DPAs, which has led to materially different approaches from one EU Member State to the next. While the guidelines offer no way to predict fines accurately, they do assist businesses in understanding the principles that DPAs are likely to follow when calculating such fines. For example, they emphasise the importance for businesses of cooperating with DPAs and taking appropriate mitigation measures in order to reduce, to the extent possible, the likelihood of a high fine.

    Have you thought about the complexities of AI & cyber risk?

    With advancements in AI's ability to process substantial volumes of data, and an increased understanding of how to exploit both the technology and the data, we are seeing a rapid increase in the deployment of AI, which is dramatically changing the way in which businesses operate. But with the advantages that this technology brings come a number of risks and challenges, including in how AI changes the cyber risk landscape for an organisation.

    AI is a double-edged sword: it is an effective tool for detecting and mitigating cyber risk, but it also introduces complex cyber risk challenges of its own. Effectively mitigating those challenges can be difficult. Relying on an organisation's existing cyber strategy, without specifically considering its use of AI, is unlikely to be effective. This comes down to the unique nature of AI and how the technology works.

    AI is a complex technology which requires access to large and diverse datasets. These datasets are required to shape, train and direct the AI towards the right outcomes. Organisations should be aware of the potential complications arising from the use of AI, particularly if the AI deployed within the organisation is business critical. It is entirely possible that a vulnerability may not become obvious until the AI has been deployed, and in that case rectifying it may prove difficult. AI vulnerabilities may need more than a simple patch; they may require retraining the model on a revised underlying dataset. In that case tricky decisions need to be made: do you retrain in the live environment or, alternatively, take the model offline, which might impact the rest of the system? This all assumes that the vulnerability can be identified in the first place, which may not be possible given the complex inner workings of an AI system.

    Additionally, because the accuracy of an AI model and its outcomes depend on significant amounts of data, organisations are not only exposed to greater risks around effectively securing that data from theft, but also need to contend with the risk of data poisoning. Data poisoning is the risk of a threat actor or other malicious party intentionally modifying the underlying data feeding the AI model in order to produce unintended outcomes and harm to the organisation. That harm could be reputational damage or harm to individuals, be they consumers, employees or stakeholders. This changes the posture when thinking about the ramifications should someone gain unauthorised access to IT systems where AI models are in use and are being relied on for critical decision making.
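    To make this concrete, the deliberately simplified Python sketch below shows a hypothetical keyword-based spam filter whose training data is poisoned with a handful of mislabelled messages, after which an obvious spam message is waved through. The messages, labels and scoring rule are invented for illustration and are not drawn from any real model or incident; real-world poisoning attacks against production AI systems are far more sophisticated.

```python
# Illustrative sketch only: a toy label-poisoning attack on a very simple
# keyword-based spam filter. All messages, labels and the scoring rule are
# invented for illustration.
from collections import Counter


def train(messages):
    # Count how often each word appears in spam vs ham training messages.
    spam_counts, ham_counts = Counter(), Counter()
    for text, label in messages:
        target = spam_counts if label == "spam" else ham_counts
        target.update(text.lower().split())
    return spam_counts, ham_counts


def is_spam(text, spam_counts, ham_counts):
    # Flag a message if its words were seen more often in spam than in ham.
    words = text.lower().split()
    return sum(spam_counts[w] for w in words) > sum(ham_counts[w] for w in words)


clean_training = [
    ("win the lottery now", "spam"),
    ("claim your lottery prize", "spam"),
    ("meeting moved to tuesday", "ham"),
    ("minutes from the board meeting", "ham"),
]

# The attacker injects a few ham-labelled messages that repeat the word
# "lottery", eroding its value as a spam signal.
poison = [("lottery lottery lottery results attached", "ham")] * 3

test_message = "you won the lottery claim your prize now"

for name, training_data in [("clean", clean_training), ("poisoned", clean_training + poison)]:
    spam_counts, ham_counts = train(training_data)
    verdict = "spam" if is_spam(test_message, spam_counts, ham_counts) else "ham"
    print(f"{name} training data -> classified as {verdict}")
```

    The point is not the mechanics of this toy filter, but how little tampering with training data is needed to change a model's behaviour in ways that ordinary code-focused security testing would not detect.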

    AI cyber risk is a complex and rapidly evolving area, changing as the underlying technology develops. Organisations should consider how they will mitigate cyber risk arising from their use of AI within the context of their cyber security strategy. This requires careful thought about whether the unique risks posed by AI are adequately incorporated into cyber security and response plans and adequately covered by existing risk management and governance frameworks. As the use of AI proliferates, organisations should be thinking about getting their AI risk management and governance right from the outset.

    We recently launched our Future Forces 2023 Report, based on research by Economist Impact, which identifies the six key megatrends, including digitalisation and AI, shaping businesses over the next decade.

     

    Authors: Rhiannon Webster, Partner; Andreas Mauroschat, Partner; Alexander Duisberg, Partner; John Macpherson, Partner; Matthew Worsfold, Partner; Reneé Green, Expertise Counsel; Shehana Cameron-Perera, Senior Associate; Tom Brookes, Associate


    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.