We bring you regularly updated data privacy news, looking at the latest developments you need to know about in data protection and cybersecurity with a focus on the UK and EU.
Register to receive news updates directly in your inbox:
Sign up here

Commissioner McGrath, the EU Justice Commissioner, has said that the EU GDPR will be included in a future omnibus package which will focus on reducing the regulatory burden on smaller organisations. He also said the EU would need to continue to monitor developments with the EU-US DPF but noted that he had had a positive meeting with Andrew Ferguson, the Chair of the Federal Trade Commission.
On 10 March 2025, the ICO and FCA published an open letter to trade association chairs and CEOs of financial services firms saying the regulators understand the need for regulatory clarity on the use of AI in financial services. A recent FCA and Bank of England survey cited data protection and consumer protection as being among the top three regulatory constraints on AI deployment within financial services. The FCA and ICO propose a roundtable with industry leaders in May to discuss areas of uncertainty and challenge, and to look at how the regulators can work with industry to support growth.
As anticipated, the government has reversed Beeban Kidron's amendments to the Data (Use and Access) Bill. These had proposed that operators of web scrapers and GPAI models be required to comply with UK copyright law and be transparent about their identity as well as allow content creators to understand whether their content had been scraped to train AI. Chris Bryant has said the DUA Bill could pass as soon as late April.
The UK's Planning and Infrastructure Bill was published on 11 March 2025. Among other planning reforms, it re-classifies data centres as nationally significant infrastructure projects. This means the planning process to build them will be fast-tracked and streamlined, and the scope for challenging decisions in court will be limited.
On 6 March 2025, the EDPS published its 2020-24 mandate review setting out its enforcement actions and its enforcement priorities. It has said it will continue its enforcement efforts, particularly around online harms, child sexual abuse material, and cyber security.
On 13 March 2025, Privacy International, Liberty and two individuals announced they are challenging the Home Secretary's reported decision to serve Apple with an order under the Investigatory Powers Act requiring access to personal data held by Apple. They also made a separate urgent application to make hearings dealing with the Technical Capability Notice public.
Apple is reportedly appealing to the Investigatory Powers Tribunal against the order issued under the Investigatory Powers Act requiring it to give the government a backdoor into encrypted services globally in the event of a national security threat. As Apple is not able to provide access to data encrypted using its Advanced Data Protection tool, it chose to withdraw the tool in the UK. President Trump criticised the UK's actions, describing them as "something that you hear about in China".
On 5 March 2025, the government published a report on the results of its pilot of the proposed Cyber Governance Code of Practice. The draft Code sets out the expectations on company directors in relation to managing cyber risks. The report analyses the feedback from around 20 companies which had attempted to implement the Code and makes several recommendations for improvement, including mapping it onto other relevant guidance and publishing it on a government website.
On 5 March 2025, the EDPB launched its fourth co-ordinated enforcement action which will focus on the right to erasure. 32 EU DPAs are expected to participate and to look at how controllers deal with right to erasure requests, including the application of exemptions. They will carry out fact-finding exercises and may also open formal investigations.
On 7 March 2025, the EU Supervisory Authorities, the EBA, ESMA and EIOPA, published an Opinion in which they approved the European Commission's updated Draft Regulatory Technical Standards specifying the elements that a financial entity has to determine and assess when subcontracting ICT services supporting critical or important functions under Article 30(5) of the Digital Operational Resilience Act (DORA). The rules, drafted by the SAs, had been changed by the Commission, which said the original version had gone beyond the scope of what was required.
The European Parliamentary Research Service has published a memo raising some concerns about the UK's data protection regimes which it thinks could impact renewal of the EU-UK adequacy decision, due by 27 June 2025.
In particular, the Research Service is concerned that the Data (Use and Access) Bill would raise new adequacy concerns or deepen existing ones, including:
The memo also sets out concerns about the Investigatory Powers (Amendment) Act and specifically mentions the access order reportedly issued to Apple. It stops short of making any kind of recommendation to the Commission.
6 March 2025 is the date by which all of President Biden's national security decisions and other orders, memoranda and proclamations have to be reviewed. It is rumoured that the report to be published on that date will recommend that President Trump rescind the Executive Order which underpins the EU-US Data Privacy Framework (DPF). If the President acts on this and revokes the DPF EO, the Commission could revoke its adequacy decision immediately, without a transition period.
Even if the DPF EO survives, the European Commission could still act. The Commission has until 19 March to respond in writing to calls from 19 MEPs on 5 February for it to state whether the DPF still meets adequacy requirements, and to a request from the LIBE Committee on 6 February along similar lines. Both the MEPs and the LIBE Committee are concerned that the independent Privacy and Civil Liberties Oversight Board (PCLOB), which is responsible for ensuring transparency and accountability of US surveillance authorities, is no longer quorate. The PCLOB is responsible for monitoring whether US intelligence agency access to EU personal data is necessary and proportionate and also oversees the redress mechanism for EU citizens under the DPF, the Data Protection Review Court. It is therefore seen by many as crucial to the operation of the DPF.
On 3 March 2025, the UK's ICO announced three investigations as part of its wider interventions into how social media and video sharing platforms use children's data:
If the ICO finds sufficient evidence that any of these companies have breached UK data protection legislation, it will put its findings to the relevant company and obtain their representations before reaching a final conclusion.
On 27 February 2025, the European Parliament Research Service published a briefing on the interplay between the GDPR and the AI Act. In particular, the briefing looked at the requirement under the AI Act to mitigate discrimination and bias in high-risk AI systems, which may require special category data to be processed, subject to restrictions. The Research Service is concerned that the use of special category data for these purposes is more restricted under the GDPR, which creates uncertainty, and that this may need to be addressed through legislative reform or further guidance.
On 27 February 2025, the ECJ ruled in a reference from Austria about the use of automated decision making and credit scoring. A mobile phone company had relied on the credit rating assessment from Dun & Bradstreet Austria and turned down an application for a contract by an individual. The individual then asked for information about the logic involved in the automated decision-making under Article 15(1) GDPR. The referring court asked the ECJ to determine how detailed the response had to be and asked for clarification on how the balance between protecting trade secrets and the right of access under the GDPR should be assessed.
The ECJ ruled that information provided to data subjects had to be sufficiently clear that they could understand what personal data was used to obtain a specific result. It was not sufficient to provide complex information which the individual would be unable to understand. The data subject's rights cannot be overridden by the controller's desire to protect trade secrets. In the event of concern or doubt, the controller should apply to a court or the supervisory authority for clarification.
See our article by David Klein for more.
These Regulations were made on 24 February 2025 and came into force on 25 February 2025, passing without amendment. They amend the PSTIA Regulations 2023 to:
The ICO is reportedly joining others in investigating DeepSeek's data practices. South Korea's privacy regulator has suggested DeepSeek has engineered its system to cover up data flows, relying more heavily on hard coding than other GenAI models, making it harder to analyse the data being processed. Reports suggest DeepSeek is rapidly being embedded in China's government services, causing local concerns around privacy and cyber security.
The European Health Data Space Regulation (EHDS) was published in the Official Journal on 5 March 2025 and will come into force on 26 March 2025. General provisions will take effect from 26 March 2027, with key rules on primary and secondary use of health data following on 26 March 2029 and 26 March 2031 respectively. Certain technical, organisational, and regulatory measures will be implemented in stages by 26 March 2035.
The EHDS aims to give users control of their electronic personal health data, nationally and cross-border, and to create a single market for electronic health record systems, relevant medical devices and high-risk AI systems. The data will also be available for the benefit of research, innovation and policy-making. Individuals will have access to and control over their electronic health care records. There will also be mandatory interoperability and security requirements, and mandated common formats.
On 6 February 2025, the European Parliament's LIBE Committee wrote to the European Commission asking whether the changes made by President Trump to the US Privacy Civil Liberties Oversight Board which left it non-quorate, have an impact on the EU-US Data Privacy Framework. The DPF allows data from the EU and, by extension, the UK, to flow to US signatories without the need for additional transfer mechanisms. LIBE asks the Commission to assess whether the requirement for an essentially equivalent level of protection for EU data when transferred to the US is still being met.
On 20 February 2025, the rules on the notification and reporting of major ICT-related incidents and of significant cyber threats under the Digital Operational Resilience Act were published in the Official Journal. These comprise Implementing Regulation 2025/302, which sets out standard forms, templates and procedures for reporting, and Regulation 2025/301, which sets out technical standards specifying the content and time limits for reporting major ICT-related incidents as well as the voluntary notification of significant cyber threats.
On Friday 21 February, Apple announced it would no longer offer its highest level of data protection service in the UK. The optional service provided end-to-end encryption of data stored in iCloud. However, last month, Apple received a "technical capability notice" from the UK government under the Investigatory Powers Act, effectively requiring it to give law enforcement access to data worldwide in order to investigate terrorism and child sexual abuse material, subject to a court order. This order for a 'backdoor' to Apple security has been widely criticised by privacy advocates and technology companies. Although Apple has not explicitly linked its decision to the IPA notice, it appears that rather than give the UK government access, Apple has withdrawn the encryption service in the UK, although this does not address the fact that the notice purports to cover worldwide access.
On 19 February 2025, the ICO published the Tech Horizons Report 2025. The report looks at key technologies likely to be adopted in the next two to seven years with a significant impact on privacy. The 2025 report focuses on:
The ICO says that trends identified in previous reports will also apply to these technologies, in particular:
The ICO will proactively address these issues as the technologies develop.
On 16 February 2025, Google lifted its prohibition on advertisers using digital fingerprinting. Digital fingerprinting allows the collection of information from hardware and software which, when combined, can identify individual devices and users. This can be used to track users online and target ads at them. Google has been phasing out third party cookies and will give users the ability to opt out of cookie-based tracking; however, there will be no choice to opt out of the use of digital fingerprinting techniques.
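To illustrate the general technique (a hypothetical Python sketch, not Google's or any vendor's actual implementation), the snippet below shows how individually innocuous device and browser attributes can be combined and hashed into a stable identifier that persists even when cookies are cleared, which is why users cannot simply delete it:

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Combine individually low-value device/browser attributes into a
    single stable identifier by hashing a canonical serialisation."""
    canonical = json.dumps(attributes, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical attributes a tracker might observe
fp = device_fingerprint({
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "2560x1440",
    "timezone": "Europe/London",
    "language": "en-GB",
    "installed_fonts_hash": "a41c9d",  # placeholder value
    "canvas_hash": "77f0be",           # placeholder value
})
print(fp)  # same device + configuration -> same identifier across sites
```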
DeepSeek's data protection practices continue to attract regulator scrutiny, with a number of German DPAs looking at whether or not DeepSeek's parent companies have appointed a representative in the EU. Outside the EU, Taiwan, South Korea and Australia are among the countries to have expressed concern.
After completing the House of Lords stage, the Data (Use and Access) Bill moved to the House of Commons and had its first readings on 6 and 12 February.
The House of Lords made a number of amendments to the Bill, in particular around data scraping to train AI. It introduced new clauses 135-9 which would require the Secretary of State to introduce regulations to require operators of web crawlers and GPAI with a UK connection to:
The Secretary of State would also be required to review technical solutions to prevent and identify unauthorised scraping or use of text and data.
Parliament also introduced new offences in respect of sexually explicit images created without consent (deepfakes).
The ICO published an updated response to the DUA Bill on 10 February 2025. The ICO stresses that it does not want the proposed amendment around protection of children's data to suggest that the standard of data protection required for children's data is higher than that applying to the processing of adult personal data. The ICO has also requested clarity on a number of points, including the fact that the new duties around children's data appear to cover data protection by design but not by default. The ICO reiterates support for the ADM provisions (which were not amended in the Lords), saying they strike the right balance, but has resourcing concerns around the ICO's potential new responsibilities around data scraping.
The Bill now moves to Committee stage with a report expected by 18 March 2025. It is unclear the extent to which these amendments will survive.
The EDPB adopted a statement on age assurance on 12 February 2025. The statement aims to form a basis for consistent application of age assurance across the EU. It underlines that the best interests of the child are of primary importance across the spread of children's rights and sets out a number of other principles to design GDPR-compliant age assurance, based largely on the data protection principles in the GDPR.
As part of the same announcement, the EDPB also said it had adopted recommendations on the 2027 WADA World Anti-Doping Code and that it would extend the scope of the ChatGPT taskforce to AI enforcement.
On 11 February 2025, the EC published a Communication on a simpler and faster Europe, which sets out a five year plan for simplifying the way the EU works and reducing red tape and bureaucracy. The Omnibus packages announced in the Work Programme will be a key deliverable. These are intended to streamline regulatory compliance, including in the area of data breach and other required cyber notifications. The Digital package will entail a review of the Cybersecurity Act and simplification of cyber security legislation. This will form part of a broader review of the digital acquis and whether that accurately reflects the needs and constraints of businesses including SMEs. "Among other things a European Data Union Strategy will address existing data rules to ensure a simplified, clear and coherent legal framework for businesses and administrations to share data seamlessly and at scale, while respecting high privacy and security standards". There will also be a gradual stress-testing of the stock of EU legislation (fitness checks).
The Paris AI Action Summit was held on 10-11 February 2025. The agenda was heavily focused on AI opportunities, with AI safety issues taking up a minority of the time. Many initiatives, agreements and investment announcements were made during the Summit, and AI safety was certainly not forgotten, but perhaps the most notable outcome was the refusal of the USA and the UK to sign the declaration on open, inclusive, ethical and sustainable AI. At least 60 countries have signed it, including France, Germany, China, India, Japan and Canada.
The USA's snub might have been expected with Vice President Vance taking aim at what he described as the EU's overly restrictive regulatory framework on AI, data and online safety. The US reportedly objected to the declaration's focus on multilateralism, inclusion, the environment and the emphasis on safe and ethical AI development. The UK's decision not to sign was more of a surprise and has led to accusations that it is currying favour with the USA. A government spokesperson said the government would only ever sign up to "initiatives that are in UK national interests" and the government later commented that the declaration was insufficiently clear on global governance and did not address national security. The government did, however, sign other agreements including on sustainability and cyber security and, of course, it plans to legislate on AI safety later this year.
Reactions to the UK's decision have been mixed with some arguing that the government was right to say the declaration did not go far enough on safety, and others offering a range of views from believing that the decision is damaging to the UK's AI aspirations to saying it is helpful in promoting the UK as a liberal market for AI development.
On 14 February 2025, the government announced it was renaming the UK AI Safety Institute as the UK AI Security Institute to reflect a new focus on strengthening protections against the risks AI poses to national security and crime. A new criminal issue team will research a range of crime and security issues which could harm UK citizens (for example the use of AI to generate CSAM), acting in conjunction with the Home Office. The government says the Institute "will not focus on bias or freedom of speech but on the most serious risks posed by the technology to build up a scientific basis of evidence which will help policymakers to keep the country safe as AI develops".
The government also announced a new agreement with Anthropic on AI opportunities to grow the economy.
On 6 February 2025, the European Commission published draft guidelines on what meets the AI Act definition of an AI system. The guidelines divide the definition into seven elements, all of which must be present at some point during the system's development and use in order for the definition to apply. The seven elements are:
All these elements are further explained in the draft guidance which is (only) 13 pages long.
On 23 January 2025, the High Court handed down a ruling looking at the consent requirements under the GDPR and PECR in relation to a vulnerable recovering gambling addict. RTM, a self-described recovering gambling addict, consented to revised terms and conditions introduced in 2018 when the GDPR came into effect. This included consenting to the processing of personal data about RTM's activities on the sites owned by Bonne Terre Ltd and Hestview Ltd, operating as Sky Betting and Gaming (SBG), using cookies, and then to receiving direct marketing. SBG carried out detailed profiling analytics and algorithmic predictions to target RTM with a significant amount of direct marketing.
RTM claimed he had not given legally effective consent to the use of his personal data and that he had not consented to receiving direct marketing, nor had he been provided with a compliant way to opt-out of receipt.
The High Court upheld RTM's data protection claims (although was unpersuaded by an additional claim relating to misuse of private information). The Court held that consent had to be of a "relatively high" quality and that this was context-specific, taking into account:
The Court said that by ordinary standards, RTM was a highly vulnerable individual. He had not read the terms and conditions but had simply clicked through them in behaviour "which is too overborne, passive, unfocused and ambiguous and too bound up with the craving or compulsion to access gambling, to which the consenting is experienced as a condition to be overcome, to meet the necessary legal standard". The Court therefore found that the individual lacked subjective consent and that the autonomous quality of his consenting behaviour was impaired "to a real degree" so consent was insufficiently freely given.
While the Court agreed that the post-2018 engineering of consent mechanisms was sufficient for SBG to rely on it being probable that, where boxes had been ticked, a specific autonomous decision had been taken to consent, it said a data controller relying on consent cannot rely absolutely on generic probabilities. There was an ineradicable risk that the autonomy of the consenting behaviour of gamblers is vitiated to some degree by problem gambling. In this case, the Court accepted that it had been. Therefore, SBG's use of cookies and subsequent direct marketing did not amount to lawful processing.
The Court recognised direct marketing about gambling to online gamblers raises special issues and was at pains to stress both that its decision rested on the individual circumstances of the Claimant, and on the timing of the activities, recognising that the online gambling industry practice has moved on. However, the Court did say that "there is an obvious risk of defective consent" when sending direct marketing about gambling to gamblers.
While the scope of the judgment was limited to the facts of the case, it raises issues when relying on consent, particularly in the online gambling sector.
Bonne Terre Ltd, which trades as SBG, was reprimanded in September 2024 by the ICO for unlawfully processing personal data through cookies without consent. SBG said this had been due to a technical error which had since been fixed.
Separately, an Observer investigation found that 52 of 150 tested gambling companies were unlawfully tracking users without their consent by embedding the Meta Pixel on their websites. The Pixel sends information to Facebook, which is then used to profile individuals and send them gambling ads. On a number of sites, the transfer of personal data was happening as soon as the webpage loaded, before the person had agreed to accept or reject marketing.
On 5 February 2025, the ICO launched a direct marketing advice generator tool. It provides free advice about how to carry out direct marketing across a range of methods in compliance with UK law. It is aimed primarily at small organisations and is currently in beta phase.
On 5 February 2025, AG Spielman issued an Opinion in the case of EDPS v Single Resolution Board. The Single Resolution Board (SRB) appealed a decision of the EDPS. SRB had shared opinions of its shareholders with consulting firm Deloitte. The name of each shareholder was replaced with an alpha-numerical code before sharing. SRB is an EU institution and has parallel obligations in relation to personal data to those under the GDPR. The EDPS held that SRB had breached transparency requirements by failing to inform the individuals that their data was being shared with a third party. SRB had argued it did not need to do so as the transmitted data would be anonymous. The General Court annulled the EDPS decision, holding that it was incorrect to decide the data was personal data in the hands of Deloitte and that it had failed to consider that Deloitte had no means by which to reidentify the data subjects. The EDPS then appealed the General Court's decision to the CJEU.
In his Opinion, AG Spielman said the EDPS had failed to verify whether or not Deloitte had the means to re-identify the data. Pseudonymised data will not be personal data where the risk of reidentification by the recipient is non-existent or insignificant. The pseudonymised data is, however, personal data with respect to the re-identification 'key holder' and the AG said SRB had failed to meet transparency requirements as it should have informed the data subjects that data would be transferred to Deloitte.
The CJEU is expected to give final judgment in the summer.
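By way of illustration only (a minimal Python sketch with hypothetical names, not the SRB's actual process), the technique at issue can be thought of as keyed pseudonymisation: the controller keeps the code-to-identity table, while the recipient sees only coded records and, on the facts found by the General Court, has no means of re-identification:

```python
import secrets

class Pseudonymiser:
    """Minimal keyed pseudonymisation: the mapping table stays with the
    controller; only coded records are shared with the recipient."""

    def __init__(self):
        self._key_table: dict[str, str] = {}  # code -> original identity

    def pseudonymise(self, records: list[dict]) -> list[dict]:
        shared = []
        for record in records:
            code = secrets.token_hex(4)          # random alpha-numerical code
            self._key_table[code] = record["name"]
            shared.append({"code": code, "opinion": record["opinion"]})
        return shared

    def reidentify(self, code: str) -> str:
        return self._key_table[code]

controller = Pseudonymiser()
to_recipient = controller.pseudonymise([
    {"name": "Shareholder A", "opinion": "The valuation was too low."},  # made-up example
])
# The recipient sees only coded records; without the key table it cannot
# re-identify the individuals -- the crux of the dispute before the CJEU.
print(to_recipient)
print(controller.reidentify(to_recipient[0]["code"]))  # controller retains the key
```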
AG Szpunar has opined in the case of X v Russmedia Digital and Inform Media Press that, for GDPR purposes, the operator of an online marketplace acts as a processor of personal data contained in advertisements on the platform. As such, there is no obligation to check the content of such ads systematically before publication. The operator does, however, have to adopt organisational and technical measures to protect the data; it also acts as a controller of the personal data of user advertisers registered on the marketplace and must verify their identity.
After years of deadlock, the ePrivacy Regulation has been dropped from the European Commission's 2025 Work Programme, published on 11 February 2025. The Commission is expected to replace it with separate legislative proposals on data retention, cookies and digital advertising. The AI Liability Directive was another casualty, although the European Parliament is hoping to cover similar ground in a software liability law.
The ICO published guidance on how to keep employment records safe on 5 February 2025. It is intended to help employers understand how the law applies to employment records and how to balance the need to keep these records with the right to a private life.
Separately, the ICO also published a blog 'myth busting' popular misconceptions around personal data and AI.
The EDPB will be discussing DeepSeek during its plenary session this week as individual DPAs including from Luxembourg and the Netherlands, warn users not to share personal data with its chatbot.
On 31 January 2025, the government published a Code of Practice for the Cyber Security of AI (CoP) and an implementation guide. The CoP has been updated from the draft following a call for views, and an implementation guide has been published in response to stakeholder feedback.
The CoP is voluntary and focused on AI systems, setting out cyber security requirements for the AI lifecycle split into five phases: secure design, secure development, secure deployment, secure maintenance and secure end of life. Other standards and guidance are signposted where appropriate. The CoP is divided into standards for developers, system operators, data custodians, end-users and affected entities.
The intention is that the CoP will be used to help create a global standard through the European Telecommunications Standards Institute (ETSI).
On 27 January 2025, the Data Protection (Charges and Information) (Amendment) Regulations 2025 were laid before Parliament. As a result of a consultation, the planned data protection fees are slightly lower than originally proposed with the increase being 29.8% rather than 37.2% across the three tiers. The Regulations come into force on 17 February 2025, at which point the annual fees paid by controllers to the ICO will be:
There are no changes to the tier structure.
The previous government published a draft Code of Practice on Cyber Governance (CoP) as part of a call for views in January 2024. DSIT published the current government's response to the call for views on 31 January 2025.
The aim of the CoP is to support directors and board members to understand what they should be doing as a minimum to oversee cyber risk management and provide a clear set of actions. DSIT is also exploring how the code can be used to support regulators to assist with regulatory compliance.
The draft CoP comprises five principles, each underpinned by three to five actions. The principles are risk management, cyber strategy, people, incident planning and response, and assurance and oversight. It is designed to complement the NCSC Cyber Security Toolkit for Boards.
There was widespread approval of the CoP from respondents, although there were suggestions for additions and requests for clarity as to how it might work with other guidance and resources. Most respondents were in favour of an assurance scheme, subject to its design.
DSIT will now make minor edits to the draft Code and aims to publish it shortly. Together with the NCSC, DSIT will develop materials to support implementation and industry uptake.
The National Cyber Security Centre published introductory guidance on the use of Content Credential technology on 29 January 2025 – a co-publication with the USA, Australia and Canada. Content Credentials are cryptographically secured metadata to allow content creators to add information about themselves, about creative sources, and about editing history to content. They can help users to assess authenticity, understand intellectual property rights, and distinguish real from synthetic content. The guidance covers when to use Content Credentials, how to make it clear to users that they are there, and how to make them durable. The NCSC encourages adoption while recognising they do not necessarily provide a complete solution.
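As a rough illustration of the underlying idea (a simplified Python sketch, not the actual Content Credentials specification), provenance metadata can be bound to content by hashing the asset and signing the metadata with the creator's key, so that recipients can detect tampering:

```python
import hashlib
import json
# requires the third-party 'cryptography' package
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The creator holds a signing key (in practice tied to a verified identity)
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

content = b"...image bytes..."  # placeholder asset
credential = {
    "creator": "Example Photographer",          # hypothetical values
    "tool": "ExampleCam 1.0",
    "edits": ["crop", "exposure +0.3"],
    "content_sha256": hashlib.sha256(content).hexdigest(),
}
payload = json.dumps(credential, sort_keys=True).encode("utf-8")
signature = private_key.sign(payload)

# A consumer verifies the metadata has not been altered and that the
# content hash still matches the asset they received.
public_key.verify(signature, payload)  # raises InvalidSignature if tampered
assert credential["content_sha256"] == hashlib.sha256(content).hexdigest()
```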
A number of amendments to the Data (Use and Access) Bill were voted through by the House of Lords at Report Stage on 28 January 2025. This includes an amendment to require AI developers with a UK connection to comply with UK intellectual property law and disclose how they obtain training data. It remains to be seen whether this and other amendments will survive on the Bill's return to the Commons, particularly given the government's consultation on AI and copyright. The government also agreed to bring its own amendment to ban sexual deepfakes, rather than including provision on this in the Crime and Policing Bill as originally intended.
Amidst widespread concern about the processing of EU personal data by DeepSeek, the Italian DPA, the Garante, has issued an order restricting DeepSeek's parent companies from processing Italian personal data. The companies say they do not operate in Italy and are not subject to Italian data protection law. There are concerns that personal data is being transferred to China with no controls over its use and no transparency over what is happening to the data, as well as over the lawful basis for processing it. DeepSeek is also under scrutiny from the CNIL and the Irish DPC.
UK telecoms provider TalkTalk played down claims that the credentials of 18.8m users had been hacked at the end of January. TalkTalk said it was investigating the breach but that the figure of 18.8m was considerably exaggerated. It also underlined that no financial or billing information had been accessed. A hacker using the alias 'b0nd' took responsibility for the attack and is thought to have gained unauthorised access to the systems of a TalkTalk third party supplier. TalkTalk suffered a high profile breach in 2015 which resulted in a £400,000 (pre-GDPR) fine.
On 2 February 2025, Article 5 EU AI Act came into application. This prohibits certain types of AI including AI systems used for social scoring, emotional analysis in the workplace, and subliminal manipulation. On 4 February, the EC published guidelines on prohibited AI practices which provide explanations and practical examples to help stakeholders understand and comply with Article 5 requirements.
In addition, the EU AI Act AI literacy requirement under Article 4 came into force. Providers and deployers of AI systems must ensure their employees and contractors using AI have an adequate degree of AI literacy.
Please see the article by TWG, which includes tables with a suggested AI literacy plan and an overview of prohibited AI systems.
The Executive Order which underpins the EU-US Data Privacy Framework was not one of the many Executive Orders rescinded by President Trump when he came into office, however, the Trump administration has required the three Democrat members of the US Privacy and Civil Liberties Oversight Board to resign. The Board plays an important role in the legal redress complaints process under the DPF. The 'resignations' will leave the Board non-quorate unless successors are seamlessly appointed. This in turn raises the spectre of a challenge not only to the DPF, but also to the ability to use SCCs and BCRs without the need for supplementary measures in relation to EU-US transfers. If the DPF falls, it would also likely impact the UK-US Data Bridge, however, any challenge in the courts would take some time to progress.
The ICO has published a letter sent to the government setting out its plans to help stimulate the UK economy. It proposes:
The ICO published guidance on the 'consent or pay' advertising model on 23 January 2025. The guidance clarifies how these models can be deployed to give users meaningful control. The ICO says 'consent or pay' models can comply with data protection law as long as it can be demonstrated that consent is freely given and other requirements are complied with. Assessments must be documented taking into account the following four factors:
The ICO's approach appears to be slightly more pragmatic than that of the EDPB, which maintains that meeting the consent conditions when a binary choice of service is offered will be a high bar. The ICO seeks to maintain a balance between the needs of businesses and the privacy rights of users, providing case studies for tech businesses and news publishers, although it does not, of course, compromise on data subject rights. The power imbalance criterion may be hard to overcome for the big tech platforms where there are no or fewer viable alternative services. The 'consent or pay' model is also under scrutiny in the EU under the Digital Markets Act.
The 'consent or pay' guidance was published alongside the ICO's 2025 online tracking strategy. One of the main aims this year is to bring the top 1000 websites into compliance in relation to online tracking, following significant improvements by the top 200 as a result of ICO action.
On 20 January 2025, the EDPB published a report following its Coordinated Enforcement Framework investigation into the implementation of the right of access under GDPR. 30 DPAs launched investigations into compliance with the right of access. The EDPB found that more awareness raising about the EDPB guidelines on the right of access is needed and that there are seven key challenges to be tackled. These include lack of documented internal procedures to handle access requests, inconsistent and excessive interpretations of the limits of the right of access, and barriers encountered by individuals, including requirements to provide excessive information or documentation. The EDPB did, however, report positive findings, with two thirds of participating regulators finding that compliance with right of access requests ranged from average to high. The 2025 CEF action focuses on the right to erasure.
The Regulation on the European Health Data Space was adopted by the Council of the EU on 21 January 2025 following adoption by the European Parliament. It will now be formally signed and published in the Official Journal coming into force 20 days later. It will apply two years from entry into force.
At report stage in the House of Lords on 21 January 2025, the House of Lords passed some amendments to the Data (Use and Access) Bill including:
Following complaints by NOYB that personal data transferred from the EU to China is at risk of access by law enforcement authorities in China, a spokesperson for the Ministry of Foreign Affairs said China does not require companies or individuals abroad to collect or provide data or other information to the Chinese government in breach of local law.
On 15 January 2025, the government published a consultation on proposals to reduce threats associated with ransomware attacks. The Home Office proposes introducing legislation to meet three main objectives:
Among the main proposals are:
The consultation seeks views from a range of stakeholders including on whether essential suppliers to the public sector should be included in any ban on making ransomware payments, what sort of penalties should apply to non-compliance, and whether the incident reporting regime should be directed at certain sectors and be based on threshold requirements. Any legislation will complement the proposed Cyber Security and Resilience Bill.
The consultation closes on 8 April 2025.
On 17 January 2025, the EDPB adopted guidelines on pseudonymisation which are open for consultation until 28 February 2025. The EDPB stresses that the guidelines provide two important legal clarifications:
The guidelines analyse technical measures and safeguards when using pseudonymisation to ensure confidentiality and prevent unauthorised re-identification.
Separately, the EDPB also adopted a position paper on the interplay between data protection and competition law.
On 15 January 2025, the EDPS published a 'concept note' setting out proposals for a consistent, cooperative and coherent approach to enforcing the EU's laws on digital markets. The note looks at the incoming EU digital rulebook and notes the risks of inconsistent and contradictory approaches and requirements. The EDPS proposes creating a Digital Clearinghouse 2.0 – a forum for interested regulators to cooperate and exchange knowledge. The EDPS suggests a legislative proposal may be required to ensure effective cross-regulatory cooperation in the digital market, and says the Commission should monitor the new laws carefully and assess whether further changes may be required.
On 15 January 2025, two pieces of EU cyber security legislation approved at the end of 2024, were published in the Official Journal:
Both will come into force 20 days following publication in the Official Journal.
The European Commission published an action plan on improving cyber security in the health sector on 15 January 2025. The plan aims to bolster the cyber security of hospitals and healthcare providers and focuses on four priorities:
The Commission will launch a consultation on the plan which will feed into further recommendations by the end of the year. Specific actions will be rolled out through this year and 2026.
On 13 January 2025, the UK National Cyber Security Centre published guidance to help operational technology owners and operators choose products and manufacturers that follow security by design principles. The guidance contains 12 security considerations for procurement processes.
On 16 January 2025, IAB Europe published a feedback paper sent to the EDPB after the 8 November 2024 stakeholder event on the EDPB's pay or consent guidelines. IAB Europe is concerned about the narrow interpretation given to "freely given" consent. It argues that with the consent or pay model, users are given a clear choice between two options and are also able to choose to use alternative services. It adds that there is no obligation on businesses to provide free services or services at a loss which could well transpire if they are required to offer ad-free services for free.
On 16 January 2025, NOYB filed complaints in Greece, Netherlands, Belgium, Italy and Austria asking for suspension of data transfers to China by six Chinese companies. NOYB alleges the transfers are unlawful because the data cannot be adequately protected from access by the Chinese government. Read more about the complaints here.
Matt Clifford's AI Opportunities Action Plan and the government's response to it were published on 13 January 2025. The Action Plan makes 50 recommendations, all of which the government says it will take forward, although two of which it appears to have slight reservations about. As the government itself points out, the Plan focuses less on AI safety issues and more on leveraging AI to help with productivity and growth, and to deliver more efficient public services at lower cost. The government's response is focused around three pillars – laying the foundations for AI to flourish in the UK, boosting adoption across public and private sectors, and keeping the UK "ahead of the pack".
Highlighted ambitions include:
Read more here.
On 9 January 2025, the General Court of the EU ordered the EC to pay Thomas Bindl €400 in respect of the unlawful transfer of his personal data to the USA. Bindl alleged that while registering for an EC conference and choosing to log in with his Facebook credentials, his IP address and information about his browser and terminal were transferred to the USA unlawfully. Bindl claimed the data was transferred to the US via AWS as operator of Amazon CloudFront, and that his personal data was transferred to Meta Platforms Inc without the use of any safeguards to protect the data. The Court dismissed the claim in relation to transfers of data via Amazon CloudFront, finding that the data was transferred to a server in Munich, not to the USA, in accordance with the contract between AWS and the European Commission. The mere threat of a US subsidiary (in this case of Amazon Web Services) being required to transfer personal data to a third country by public law enforcement agencies did not constitute an actual transfer of the personal data. The Court did, however, agree that by displaying the 'sign in with Facebook' hyperlink, the EC created the conditions for the transfer of the individual's IP address to Facebook and to Meta Platforms in the USA and that the Commission was responsible for the transmission. Moreover, at the time there was no EU-US adequacy arrangement in place and no appropriate safeguards were used by the Commission with respect to the transfer. The Court found the complainant had suffered non-material damage. The ruling does not relate directly to the GDPR but to the equivalent legislation for EU institutions; it is, however, relevant to data transfers under the GDPR as the rules are substantially similar.
In a preliminary ruling in a reference from Austria, the ECJ said DPAs are not allowed to cap the number of complaints made by an individual but must review each complaint on its merits. The background to the ruling is the Austrian DPA's rejection of a complaint on the basis that the complainant had made 77 complaints between 2018 and 2022. The Austrian DPA wanted to allow a maximum of two complaints per data subject per month. The ECJ said that as long as complaints are not vexatious or abusive, frequency alone is not sufficient to classify them as "excessive".
On 9 January 2025, the ECJ held that it is not necessary to collect data on customers' titles, particularly where the purpose of the collection is to personalise commercial communications. The background to the ruling is a challenge brought by the association Mousse before the CNIL concerning SNCF's requirement for consumers to enter their title when purchasing transport tickets online. The CNIL held that this did not infringe the GDPR. The decision was appealed to the French Council of State, which asked the ECJ whether collecting title data was consistent with the data minimisation principle. The ECJ said that for the data processing to be necessary, it had to be objectively indispensable for the performance of a contract, or for the attainment of a communicated legitimate interest. In this instance, the ECJ found the data collection was not objectively indispensable.
On 18 December 2024, Google announced that from 16 February 2025, it will no longer prohibit organisations using its advertising products from employing device fingerprinting techniques – the collection of pieces of information about a device's software or hardware which can be combined to uniquely identify a particular device or user. The following day, the ICO published a response saying "businesses do not have free rein to use fingerprinting as they please. Like all advertising technology it must be lawfully and transparently deployed – and if it is not, the ICO will act". The response goes on to say that the ICO does not think fingerprinting is a fair way to track users online as it is likely to reduce people's choice and control over how their information is collected. It notes that the changes to the policy mean fingerprinting could now replace the functions of third party cookies.
The ICO has published draft guidance for consultation on how data protection law applies to storage and access technologies including fingerprinting. It warns that based on its understanding of the technology, complying with required rules on transparency, fairness and consent will be a "high bar to meet".
The EDPB published its Opinion on the use of personal data for the development and deployment of AI models on 18 December 2024 following a request made by the Irish DPC. The Opinion considers:
Throughout the Opinion, the discretion of DPAs and the need to conduct analysis on a case by case basis are emphasised.
On 17 December 2024, the Irish DPC confirmed it is fining Meta €251m in respect of a 2018 data breach which impacted approximately 3m EU/EEA users. This follows two draft decisions submitted to EU regulators under the cooperation mechanism in September 2024. Failings included not providing required information in the breach notification, failing to properly document the facts relating to each breach and steps taken to remedy them, and failing to ensure data protection by design and default.
On 20 December 2024, the Garante announced it was fining OpenAI €15m in respect of its ChatGPT model. The Garante found that OpenAI used personal data to train ChatGPT without a suitable lawful basis and that it was insufficiently transparent with individuals about use of their personal data. OpenAI is appealing the fine which it describes as "disproportionate".
On 2 December 2024, the Council of the EU adopted:
These have already been adopted by the European Parliament so following signature, they will be published in the Official Journal.
On 9 December 2024, the ICO published the results of its two-year trial of its public sector approach, which placed an emphasis on reprimands and collaboration rather than financial sanctions where public sector organisations breach data protection rules. The ICO considers the approach to have been largely successful. It proposes to continue it but acknowledges potential areas for improvement, in particular by making it clearer which organisations are covered by the public sector approach and what types of infringement could lead to a fine. As a result, the ICO is consulting on the latter issue. Responses are invited by the end of January 2025.
On 6 December 2024, a group of trade unionists, academics, CEOs, NGOs and campaign organisations including Open Rights Group, Privacy International and Amnesty International, published an Open Letter to Peter Kyle, Secretary of State for Science, Innovation and Technology. The letter urges amendments to the Data (Use and Access) Bill to remove the planned changes to current UK GDPR provisions on automated decision making.
On 3 December 2024, the EDPB published draft guidelines on Article 48 GDPR for consultation. Article 48 covers requests for data from third country public authorities. The draft guidelines provide recommendations for controllers and processors to help ensure they deal properly with such requests and comply with data transfer provisions if they transfer personal data in response to such requests. The EDPB says that an international agreement may provide for both a legal basis and a ground for transfer, but if there is no international agreement or there is no agreement providing for an appropriate legal basis or safeguards, other legal bases or grounds for transfer can be considered in exceptional circumstances and on a case by case basis. Responses to the consultation are requested by 27 January 2025.
On 4 December 2024, the EDPB adopted a statement on the second report of the European Commission on the application of the GDPR. The EDPB underlined the importance of coherence between the GDPR and other digital legislation and said it would focus on producing content for non-experts and SMEs. It also stressed the need for greater financial and human resources for the EDPB and Member State Data Protection Authorities.
On 28 November 2024, the ICO published a blog post to help local authorities facing financial restrictions deal with requests for information under the Freedom of Information Act 2000 and the Data Protection Act 2018.
On 3 December 2024, ENISA published the first of its biennial reports on the state of cyber security in the EU, as required by Article 18 of the NIS2 Directive. The report finds that while Member States share an overall alignment in strategy, there is a substantial threat to the EU, particularly as a result of the diverse approaches adopted in critical infrastructure sectors. The report identifies four priority areas for policy recommendations: policy implementation, cyber crisis management, supply chain and skills. It makes six policy recommendations covering the four priorities as well as the capabilities of critical sector operators and cyber security awareness and cyber hygiene. ENISA expects policy attention to focus on the impact of AI and post-quantum cryptography going forwards, as well as on R&D and innovation in the cyber security sector.
On 4 December 2024, the Council of the EU announced it had agreed its negotiating mandate on the proposed Financial Data Access (FIDA) Regulation which can now move to trilogues. The Council's position suggests a number of clarifications but is generally supportive of the Commission's approach.
Austria and Ireland have certified digital rights campaign group NOYB as a "qualified entity", entitling it to bring collective redress actions in courts throughout the EU. These actions can take the form of applications for injunctions or collective redress (ie class actions), provided they are brought on a non-profit basis.
The EDPB adopted an Opinion approving Brand Compliance certification criteria concerning processing activities by controllers or processors on 3 December 2024. The criteria, previously approved for use in the Netherlands, will now be applicable across the EU as a European Data Protection Seal.
On 5 December 2024, the government published the results of a survey of app developers. The aim of the survey was to understand the extent to which app developers are aware of and comply with the voluntary Code of Practice on app and app store security and privacy, and whether or not awareness affects the level of security. The survey of 600 developers found that just under one in six were aware of the Code of Practice, with larger developers more likely to be aware. There are varying degrees of alignment with the Code's principles, and those who are unaware of the Code are less likely to have organisational plans to implement practices which align with them.
On 22 November 2024, DHSC published its response to a consultation on information standards for health and adult social care in England. The consultation relates to changes made to s250 of the Health and Social Care Act 2012 (HSCA) which have not yet been brought into effect. The changes will:
These changes will be complemented by provisions in the Data (Use and Access) Bill which include standardisation for data sharing across health and social care as information standards covered under s250 of the HSCA.
DHSC is expected to begin preparations to implement mandatory information standards in early 2025, and to lay legislation before Parliament in the Spring.
So far, over 200 amendments have been tabled to the Data (Use and Access) Bill. Politico reports that many of these relate to protection of children and cover issues relating to AI training data, including copyright as well as consent and transparency issues. Other proposed amendments cover legitimate interests, scientific research and data transfers.
23 Member States failed to transpose the NIS2 Directive into their local law by the required deadline of 17 October 2024. As a result, the European Commission has given them two months to finalise their national laws or it will issue a final warning as a prelude to enforcement action.
On 28 November 2024, the government published the gamma (0.4) pre-release version of its updated identity and attributes trust framework, together with supporting documents which provide guidance on:
This version of the framework is not yet ready to be certified against. Certifications against the 0.3 framework will remain valid until the date published in the final gamma version.
On 22 November 2024, the ICO published new practical advice to provide clarity on data protection considerations and support organisations in sharing data responsibly to tackle scams and fraud. The advice is aimed at any organisation seeking to share personal information to identify, investigate and prevent fraud, especially banks, telecommunications providers and digital platforms. The advice runs through data protection essentials including carrying out a DPIA, having appropriate processes and data sharing agreements in place, identifying a lawful basis, and complying with data protection law including by giving effect to data subject rights.
On 13 November 2024, the European Parliament published the list of legislative files it will pick up following the EP elections. Over 120 files are listed, including the ePrivacy Regulation which has languished without much progress for a while. It is thought this will be officially dropped by the Commission once it publishes the Digital Fairness Act. The GDPR Enforcement Regulation will also continue to progress.
The European Data Protection Supervisor's 2025 TechSonar report was published on 15 November 2024 and focuses on:
The report provides fictional scenarios for each of these technologies and considers their impact on the fundamental right to privacy and protection of personal data as well as on society as a whole.
The EU's Cyber Resilience Act (CRA) was published in the Official Journal on 20 November 2024. It introduces mandatory cyber security requirements for the design, development, production and making available of products with digital elements and ancillary services. Essentially this covers the cyber security of IoT or connected products. Manufacturers will be required to embed security by design and provide security support and software updates. There are also information and incident reporting requirements. The CRA will apply generally from 11 December 2027. Article 14 (manufacturer reporting obligations) will apply from 11 September 2026, and Chapter IV (conformity assessment bodies) will apply from 11 June 2026. Products placed on the market before 11 December 2027 are only subject to the Regulation if substantially modified, except for Article 14, which applies to all relevant products.
On 7 November 2024, the ICO and DSIT's Responsible Technology Adoption Unit published the Privacy Enhancing Technologies (PETs) Cost-Benefit Awareness Tool alongside a checklist to support organisations. The tool focuses on emerging PETs and is structured around an example of using PETs to train a machine learning model without centralised data collection or processing. The tool includes information on compliance costs and benefits and features examples and use cases.
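As a simple illustration of the kind of PET the tool's example describes (a minimal federated-averaging sketch in Python with made-up data, not the ICO/DSIT tool itself), each organisation trains a model locally on data that never leaves its systems, and only the model parameters are aggregated centrally:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's training step on data that never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

# Two hypothetical organisations, each holding its own records locally
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for n in (50, 80):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

global_w = np.zeros(3)
for _ in range(20):
    # Each client trains locally; only model weights are shared
    local_weights = [local_update(global_w, X, y) for X, y in clients]
    # The coordinator averages the weights, never seeing the raw data
    global_w = np.mean(local_weights, axis=0)

print(np.round(global_w, 2))  # approaches the true coefficients
```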
On 6 November 2024, the National Cyber Security Centre (NCSC) published new guidance for brands to help advertising partners counter malvertising (malicious advertising which spreads malware and ransomware and which can lead to fraud). The guidance is aimed at brands which use in-house and third-party digital advertising services to help them choose partners who prioritise cyber security. Recommendations include:
In an effort to allay competition and data protection concerns around its current 'pay or OK' model in the EU relating to behavioural advertising, Meta announced a new version of its services on 12 November 2024. Meta will offer a version of Facebook and Instagram with "less personalised ads". Under the new model, those selecting a free service will still receive ads but the targeting will rely on less data ie it will be contextual, using a minimal set of data points including age, location, gender and how a person engages with ads. Meta will also reduce the cost of its ad-free subscription model.
On 13 November 2024, the ICO announced it had approved the first sector-owned code of conduct by the Association of British Investigators Limited. The Code applies to UK private investigators and is approved in accordance with Article 40 UK GDPR.
On 15 November 2024, the FCA, ICO and the Pensions Regulator (TPR) published a joint statement on data protection and effective communications to consumers in relation to retail investment and pensions. The statement has been made in response to requests from retail investment firms and pension providers for further clarity on how the TPR Code of Practice and guidance on communications requirements, and the FCA's Consumer Duty, interact with direct marketing rules under data protection law and regulations. The statement distinguishes regulatory communication and service messages from direct marketing messages and provides some guidance on relevant factors in determining whether a message is or contains direct marketing.
The FCA, jointly with the Prudential Regulation Authority (PRA) and the Bank of England, set out the final requirements and expectations for critical third parties (CTPs) to the financial sector and their operational resilience on 12 November 2024. The new rules will apply from 1 January 2025. They allow the FCA, PRA and Bank of England to monitor and manage systemic risks posed by certain third parties to the financial sector. The regulators will be able to identify potential CTPs and recommend them for designation to the Treasury. The intention is to improve resilience to cyberattacks and outages in the UK's financial sector. The rules will take effect on a CTP on the date the relevant designation order comes into force.
On 6 November 2024, the ICO published the results of its consensual audit engagements with developers and providers of AI tools used in recruitment together with a series of recommendations for developers and recruiters. The ICO recognises the value of using AI in recruitment but also made a series of nearly 300 recommendations following the voluntary audit which are summarised in the report alongside key questions for procurers to ask themselves during the procurement process.
The ICO found areas requiring improvement in relation to processing of personal data in AI recruitment tools, including around concerns that some AI recruitment tools can:
The ICO is running a webinar on 22 January 2025 for AI developers and recruiters.
Following the Commission's first annual review of the EU-US Data Privacy Framework, the EDPB adopted its own report on 4 November 2024. The EDPB recognises the steps taken by the US authorities and the EC to implement the DPF, both on the commercial side and in terms of complaints and redress. The EDPB does, however, suggest the US authorities develop guidance on data transfer requirements and human resources data. It also recommends the authorities monitor the compliance of DPF-certified organisations more closely. The EDPB underlines that safeguards around access to EU personal data by law enforcement authorities need to be implemented effectively, and says the Commission should monitor future developments. The EDPB recommends the next review of the EU-US adequacy decision take place within three years.
Germany is likely to have a new government soon. On 6 November 2024, the so-called Traffic Light Coalition collapsed, possibly leading to new elections in March 2025. While Chancellor Scholz announced that certain key legislative proposals will be seen through, he did not mention any from the digital sector. Without enactment, many of these initiatives will lapse. We summarise the likely consequences for upcoming initiatives. See here for more.
On 7 November 2024, the ICO published a report on genomics and data protection concerns. The ICO raises concerns around third party data, inappropriate bias, data security and data minimisation. It calls for developers to take a privacy by design approach and join the regulatory sandbox. The report is part of the ICO's Tech Futures series.
On 1 November 2024, the Court of Appeal refused an application by Meta to reverse the judgment of the Competition Appeal Tribunal which granted a revised application for a collective proceedings order submitted by Dr Lovdahl Gormsen. The CPO claims Meta abused its dominant position, impacting UK Facebook users, by using their personal data for commercial purposes without their consent. The Court of Appeal held that Meta's concerns could be heard at trial and did not mean the central case was not arguable.
The European Supervisory Authorities, the EBA, ESMA and EIOPA (the ESAs), published joint guidelines on oversight cooperation and information exchange between the ESAs and the competent authorities under DORA on 6 November 2024. The guidelines apply from 17 January 2025.
On 7 November 2024, ENISA published a consultation on technical guidance for the cybersecurity measures of the NIS2 Implementing Act. The guidance will support Member States and government entities with compliance and will provide advice as to what to consider when implementing a requirement and how to assess whether a requirement has been met. It also provides tables mapping security requirements in the Implementing Regulation to European and international standards as well as national frameworks. The consultation closes on 9 December 2024.
On 7 November 2024, the Association of British Insurers (ABI) and Lloyd's of London published a guide for insurers and reinsurers on how to define a "major cyber event". The guide will be useful for organisations with cyber insurance to help them understand the notification and mitigation process when experiencing a cyber event.
Consumer protection magazine Which? carried out tests on a number of connected devices and found that some, including air fryers and smart watches, processed excessive amounts of personal data. Which? particularly highlighted its finding that three air fryers, which asked for permission to record audio on the user's phone through a connected app, were transferring personal data to China and/or connecting to trackers. The ICO said the tests "show that many products not only fail to meet our expectations for data protection but also consumer expectations". The ICO is currently working on guidance for manufacturers of smart products.
On 1 November, the House of Lords Library published a research briefing on the Data (Use and Access) Bill (DUA). Shortly afterwards, the ICO published its response to the Bill. The ICO welcomes the Bill as "a positive package of reforms". With respect to data protection reform, the ICO welcomes the removal of provisions in the DPDI Bill requiring it to follow a statement of strategic priorities. The ICO concludes that the data protection changes are "pragmatic and proportionate" and welcomes the government's renewed commitment to maintaining adequacy. The ICO's own view is that "the proposed changes in the Bill strike a positive balance and should not present a risk to the UK's adequacy statement". The ICO does make a number of technical suggestions to improve clarity.
The ICO, together with 16 other DPAs working as part of the Global Privacy Assembly International Enforcement Cooperation Working Group, published a follow-up joint statement on protecting data from unlawful data scraping on 28 October 2024.
The initial joint statement, published in August 2023, set out key privacy risks associated with data scraping. The follow-up statement has been published as a result of engagement with the largest social media companies. It sets out further expectations including that organisations:
The ICO said social media companies had taken positive steps to implement many of the measures identified in the initial statement; additional measures suggested in the follow-up statement include:
On 29 October 2024, the ICO published a statement on its work to protect children online following research which showed that for many children, data is their only currency and they see giving it to apps and services to help them socialise as a necessary exchange. Children are generally unaware of how their data is used and the research found that platform design can exacerbate this and make it difficult for children to make informed privacy decisions. The ICO says it expects to provide further updates on its Children's Code Strategy once requested information from 11 companies has been submitted and analysed.
On 1 November 2024, the ICO published new guidance to help organisations communicate with empathy after a data breach. This reminds organisations that data breaches have an impact on real people. Research conducted by the ICO suggests nearly 30m people in the UK have experienced data breaches and 30% of those suffered emotional distress as a result. At the same time, 25% of affected individuals said they had not received support from the responsible organisations and 32% discovered the breach had occurred due to media reports.
On 25 October 2024, the ICO applied to the Upper Tribunal for permission to appeal a First-Tier Tribunal (FTT) decision which overturned the ICO's fine and data processing ban on Clearview AI in 2022. The ICO had fined Clearview £7.5m for scraping the internet for images to train its image recognition system without the consent of data subjects. The ICO also banned Clearview from processing UK personal data. The ICO's decision was overturned in October 2023 on the basis that it did not have jurisdiction to enforce against a US-based company supplying services to foreign law enforcement agencies, and the ICO was refused permission to appeal in December 2023. That decision was not, however, communicated to the ICO until September 2024. The ICO has now applied directly to the Upper Tribunal for permission to appeal the FTT decision. The Commissioner says he considers "the Tribunal incorrectly interpreted the law when finding Clearview's processing fell outside the reach of UK data protection law on the basis that it provided its services to foreign law enforcement agencies. The commissioner's view is that Clearview itself was not processing for foreign law enforcement purposes and should not be shielded from the scope of UK law on that basis".
Separately, the ICO announced on 4 November 2024, that it was seeking leave to appeal the judgment of the Upper Tribunal relating to DSG Retail Limited in the Court of Appeal. DSG was fined £500k (reduced on appeal to £250k) under the Data Protection Act 1998 in relation to a cyber breach which affected 14m people. DSG was given permission to appeal, first to the FTT and then to the Upper Tribunal, which allowed the appeal and remitted the case to the FTT to be re-decided in 2024. The ICO considers the Tribunal interpreted the law incorrectly in finding that an organisation is not required to take appropriate measures against unauthorised or unlawful processing of data by a third party where the data is personal data in the hands of the controller but not in the hands of the third party.
On 29 October 2024, US-based global digital advertising standards body, IAB Tech Lab, announced the release of its Global Privacy Platform (GPP) Implementation Guidelines for consultation. The comment period will end on 16 December 2024. IAB Tech Lab developed the GPP intending it to help participants in the digital advertising ecosystem comply with privacy regulations across multiple regimes. The implementation guidelines are resources to help product and engineering teams adopt the GPP and comply with GDPR, US state laws and other privacy regulations.
The Data (Use and Access) Bill was published and received its first reading in the House of Lords on 23 October 2024. In case you missed last week's update, read our initial reactions here.
Three years ago, Meta announced it would cease using facial recognition technology for tagging purposes on Facebook in light of privacy concerns. On 21 October 2024, however, it said it was planning to start using facial recognition again to verify user identity, help recover hacked accounts and detect and block some types of scam ads. Interestingly, Meta said it would not be testing facial recognition for identity verification purposes in the EU, UK and the US states of Texas and Illinois, jurisdictions in which it is continuing to have conversations with regulators.
On 24 October 2024, the Irish DPC announced its final decision to fine LinkedIn €310m for GDPR failings relating to the processing of personal data for behavioural analysis and targeted advertising. The DPC investigated the complaint as lead regulator following a referral from the CNIL and under the Article 60 GDPR procedure. The DPC found that LinkedIn had failed to process the personal data fairly and lawfully as it had not validly relied on any of consent, legitimate interests or contractual necessity. It also failed to comply with information provision requirements.
On 22 October 2024, NOYB filed a complaint with the French DPA, the CNIL, against Pinterest. NOYB alleges that Pinterest wrongly relies on legitimate interests to process personal data for tracking and behavioural advertising, despite the ECJ's judgment in Meta Platforms v Bundeskartellamt. It also says personalised advertising is turned on by default, consent is not obtained for tracking, and Pinterest does not provide sufficient information about the categories of personal data it shares with third parties. NOYB is asking that Pinterest erase data processed for personalised ads and asks the CNIL to impose a fine.
On 9 October 2024, the Coalition for Privacy Compliance in Advertising (CPCA) announced it is working with the ICO to try and develop an ICO-approved privacy certification for digital advertising technology which is compliant with ICO guidelines. Industry groups including the Incorporated Society of British Advertisers and the Association of Online Publishers are also set to get involved. CPCA hopes to launch the certification in 2025 and intends it will work with European initiatives.
The UK's Digital Information and Smart Data Bill, announced in the July 2024 King's Speech, is expected to be published later this week or early next week, possibly under a different name. In the background briefing notes to the King's Speech, the government said the Bill would aim to harness the power of data for economic growth. Among other things, the briefing notes suggested it would establish digital verification services, a national underground asset register, and smart data schemes which allow secure sharing of customer data with authorised third party providers. It would also preserve many of the reforms to the ICO's governance structure proposed under the Data Protection and Digital Information Bill (which did not pass before the general election) and would include “targeted reforms to some data laws…where there is currently a lack of clarity”. No specific mention of the GDPR was made but after recent talks between Technology Secretary Peter Kyle and EU Justice Commissioner Didier Reynders, Reynders tweeted that they had both acknowledged the importance of the UK maintaining EU adequacy. Rumours are that the focus of the Bill will shift from UK GDPR reform to data sharing and digital IDs and the government has already said it will cover patient accessibility to their medical records via the NHS app. Watch this space!
On 16 October 2024, the EDPB published its final guidelines on the Technical Scope of Article 5(3) ePrivacy Directive. The guidelines analyse what is covered (beyond cookies) by the ePrivacy Directive wording "to store information or to gain access to information stored in the terminal equipment of a subscriber or user". They state that there are three main criteria which, if satisfied, will mean the technology is in scope:
The guidelines do not cover exemptions to the consent requirement but they do look at specific use cases and technologies including URL and pixel tracking, local processing, tracking based solely on IP addresses and intermittent and mediated IoT reporting.
On 17 October 2024, the European Commission announced the adoption of an Implementing Act under the NIS2 Directive. The Act sets out cyber security risk management measures and the cases in which an incident should be considered significant and therefore reportable. The Regulation will apply to specific categories of companies providing digital services, covering: TLD name registries, cloud computing service providers, data centre service providers, content delivery network providers, managed service providers and managed security service providers, online marketplaces, online search engines, social networking platforms and trust service providers. The Regulation will be published in the Official Journal shortly and will come into force 20 days after that.
Member States were supposed to have introduced implementing legislation for the NIS2 Directive by 18 October, although many have not done so.
The UK's National Cyber Security Centre published updated guidelines on multi-factor authentication (MFA) on 9 October 2024. The guidance sets out the strengths and weaknesses of different ways of implementing MFA. While the NCSC says authenticating users to cloud-based corporate services using just a password is insufficient to protect sensitive data, it also suggests only prompting for authentication or MFA when it makes a difference.
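That suggestion amounts to risk-based (adaptive) authentication. A minimal sketch of the idea, with signals and thresholds that are assumptions rather than values from the NCSC guidance, might look like this:

```python
# Illustrative risk-based MFA decision: prompt for a second factor only when it
# "makes a difference". Signal names and thresholds are assumptions for the sake
# of the example, not values prescribed by the NCSC guidance.
from dataclasses import dataclass

@dataclass
class LoginContext:
    known_device: bool          # device previously registered to this user
    usual_location: bool        # coarse geolocation matches recent sign-ins
    sensitive_action: bool      # e.g. changing credentials or exporting data
    hours_since_last_mfa: float

def requires_mfa(ctx: LoginContext) -> bool:
    if ctx.sensitive_action:
        return True                       # always step up for high-impact actions
    if not ctx.known_device or not ctx.usual_location:
        return True                       # unfamiliar context -> challenge
    return ctx.hours_since_last_mfa > 24  # otherwise re-prompt periodically

print(requires_mfa(LoginContext(True, True, False, 2.0)))   # False
print(requires_mfa(LoginContext(False, True, False, 2.0)))  # True
```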
On 10 October 2024, the Council of the EU adopted the Cyber Resilience Act (CRA), a Regulation on cybersecurity requirements for products with digital elements. The CRA will sit alongside other EU cyber security legislation and will focus on filling in gaps relating to connected products. The CRA will now be published in the Official Journal and will apply three years after its entry into force (subject to some exceptions which will apply earlier).
The EDPB announced on 7 October 2024, that its fourth Coordinated Enforcement Action will focus on implementation of the right to erasure by data controllers. Data Protection Authorities can join the action on a voluntary basis and the action will be launched in the first half of 2025. In early 2025, the results of the current enforcement action on the right of access will be adopted.
The EDPB also adopted its 2024-25 Work Programme. This is based on the 2024-2027 strategy adopted in April 2024 and is the first of two work programmes which will implement the strategy. The Work Programme outlines an array of opinions and guidelines which the EDPB intends to publish, as well as various initiatives to enhance regulatory cooperation in the EU and international cooperation on data privacy issues.
On 9 October 2024 the European Commission published a report following its first review of the adequacy decision for the EU-US Data Privacy Framework (DPF). The Commission has concluded that the US authorities have put in place all the necessary structures and procedures to ensure the DPF functions effectively. This includes suitable redress mechanisms, and safeguards to ensure that access to EU personal data by intelligence authorities is limited to what is necessary and proportionate. The Commission did, however, recommend that in order to ensure continued and effective functioning of the DPF:
On the basis of this review, the Commission says its next review will be in three years' time.
On 9 October 2024, the European Data Protection Board (EDPB) adopted an Opinion on certain obligations arising from reliance on processor(s) and sub-processor(s), following an Article 64(2) GDPR request by the Danish SA.
The Opinion covers situations where controllers rely on one or more processors and sub-processors. It focuses on eight questions dealing with the interpretation of certain controller duties when they rely on processors and sub-processors, and looks at the wording of controller-processor contracts, particularly whether it fulfils Article 28 requirements.
The EDPB says controllers should have the identity information (i.e. name, address, contact person) of all processors and sub-processors readily available at all times so that they can best fulfil their obligations under Article 28. In addition, the controller’s obligation to verify whether the (sub-)processors present "sufficient guarantees" should apply regardless of the risk to the rights and freedoms of data subjects, although the extent of such verification may vary according to the level of risk associated with the processing.
The Opinion also makes clear that while the initial processor should ensure it proposes sub-processors with sufficient guarantees, the ultimate decision and responsibility for engaging a specific sub-processor rests with the controller.
The EDPB considers that under the GDPR the controller does not have a duty to systematically ask for the sub-processing contracts to check whether data protection obligations have been passed down the processing chain. The controller should assess whether requesting a copy of such contracts or reviewing them is necessary for it to be able to demonstrate compliance with the GDPR.
On 9 October 2024, the EDPB adopted a Statement on amendments made by the European Parliament and Council to the draft Regulation on additional procedural rules relating to GDPR enforcement. The Statement broadly welcomes the modifications but recommends addressing further elements. In particular, the EDPB reiterates the need for a legal basis and a harmonised procedure for amicable settlements. It also warns that the introduction of a joint case file, as proposed by the European Parliament, would require complex changes to the document management and communication systems used at European and national levels.
The EDPB is inviting stakeholders to an event on AI models on 5 November 2024. Individuals representing European sector associations, organisations, NGOs, companies, law firms and academics are invited to take part. One representative per organisation will be admitted on a first come, first served basis but the EDPB reserves the right to give precedence to specific stakeholders who have expressed an interest in participating to ensure relevant expertise and diversity of views.
Following a roundtable meeting on 10 and 11 October 2024 of the G7 data protection authorities, the Canadian DPA published a communiqué setting out three pillars of focus for cooperation – data transfers, emerging technologies such as AI, and bilateral and multilateral enforcement actions.
The Investigatory Powers (Amendment) Act 2024 (Commencement No 1 and Transitional Provisions) Regulations 2024 were made on 10 October 2024 and brought a number of provisions of the Investigatory Powers (Amendment) Act 2024 into force on 14 October. These include ss 1-7, 19-23 and 25-29. Regulation 3 sets out transitional provisions for communications data retention notices issued under s87 of the 2016 Act and takes effect immediately. Regulation 4 makes transitional provisions with respect to national security and technical capability notices.
On 9 October 2024, the ICO published a blog launching its report ICO tech futures: quantum technologies. This looks at emerging possibilities for quantum technologies involving personal data and at the implications for privacy. In particular, the ICO looks at the potential of quantum technology to break encryption. The main mitigation it points to is NCSC-endorsed post-quantum cryptography. The ICO recommends large organisations begin preparing for the transition now by identifying and reviewing at-risk information, systems and cryptography. Maintaining current cyber hygiene is considered a good start.
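In practice, that preparatory step usually starts with a cryptographic inventory. The sketch below, with hypothetical system names and a deliberately simplified classification, shows the kind of triage involved:

```python
# Rough sketch of a cryptographic inventory flagging algorithms whose security
# rests on problems a large quantum computer could break. System names and the
# simple two-bucket classification are assumptions for illustration only.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "ECDH-P256", "DH-2048"}
BROADLY_RESISTANT = {"AES-256", "SHA-256", "ML-KEM-768"}  # symmetric / hash / PQC

inventory = [
    {"system": "customer-portal-tls", "algo": "ECDH-P256"},
    {"system": "backup-encryption",   "algo": "AES-256"},
    {"system": "document-signing",    "algo": "RSA-2048"},
]

for item in inventory:
    if item["algo"] in QUANTUM_VULNERABLE:
        status = "REVIEW: plan migration to a post-quantum or hybrid scheme"
    elif item["algo"] in BROADLY_RESISTANT:
        status = "OK for now: keep under review"
    else:
        status = "UNKNOWN: classify manually"
    print(f'{item["system"]:<22} {item["algo"]:<12} {status}')
```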
Andi Terziu looks at the key themes discussed by our panel on AI and Cyber issues during our recent Cyber Strategy Seminar.
The ICO has launched a new audit framework to help organisations assess their compliance with data protection law. Using the framework will not guarantee that an organisation meets all data protection requirements as every organisation is different, but it covers the range of areas the ICO considers when conducting consensual and compulsory audits of an organisation's compliance. The framework is targeted at larger businesses and organisations in the public, private and third sectors. It is not aimed at SMEs, which should use the self-assessment toolkit and other resources. It is an extension of the Accountability Framework and contains nine toolkits covering different key areas including AI and age-appropriate design.
The ICO confirmed on 3 October 2024, that it is fining the Police Service of Northern Ireland £750,000 after it disclosed employee records including surnames, initials, ranks and roles of 9,483 employees in response to a freedom of information request. The data was uploaded to a website and made available for around three hours. It was presumed by the Police to have fallen into the hands of dissident republicans who would use the data to create fear and uncertainty and to intimidate. The ICO said that the fine would have been £5.6m had it not taken into account the fact that it was being imposed on a public organisation.
Many EU Member States look set to miss the deadline for passing NIS2 implementing legislation on 17 October 2024. As a result, the Commission may pass an Implementing Act and is, according to Euractiv, considering how it would define what constitutes a "significant incident" which would need to be quickly reported to authorities. The latest draft proposes that an incident which leads to harm to a person's health or causes financial losses of over €500,000 or 5% of a company's total annual turnover would be considered "significant". The draft also suggests that businesses suffering an incident suspected to be the result of malicious actions will need to report it where it affects 5% or more of the total EU users of cloud computing and content delivery network providers, online marketplaces, search engines, social networking platforms and managed security service providers.
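Purely to make the reported draft thresholds concrete, they could be expressed along the following lines (a sketch based on the figures above; the final Implementing Regulation is the authoritative text):

```python
# Sketch of the draft "significant incident" tests as reported: harm to health,
# financial loss over EUR 500,000 or 5% of annual turnover, and (for suspected
# malicious incidents at the listed service providers) impact on 5% or more of
# EU users. The final Implementing Regulation is the authoritative source.
def is_significant(
    caused_health_harm: bool,
    financial_loss_eur: float,
    annual_turnover_eur: float,
    suspected_malicious: bool = False,
    affected_eu_users: int = 0,
    total_eu_users: int = 0,
) -> bool:
    if caused_health_harm:
        return True
    if financial_loss_eur > 500_000 or financial_loss_eur > 0.05 * annual_turnover_eur:
        return True
    if suspected_malicious and total_eu_users:
        return affected_eu_users / total_eu_users >= 0.05
    return False

# A EUR 600k loss is significant under this draft test regardless of turnover.
print(is_significant(False, 600_000, 100_000_000))  # True
```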
On 4 October 2024, the ECJ handed down judgment in a reference from Austria in a case brought by Max Schrems who complained that personal data relating to his sexual orientation had been processed unlawfully by Meta to send him advertising targeted at homosexuals. He argued this was done without his consent or under any other lawful basis. Schrems claimed the advertisements were not based directly on his sexual orientation but on an analysis of his particular interests. He brought an action in the Austrian courts and subsequently referred publicly to his homosexuality during a panel discussion but did not publish this information on Facebook.
The Austrian Supreme Court asked the ECJ whether:
The ECJ said:
On 4 October 2024, the ECJ ruled in Lindenapotheke (Case C-21/23), a reference from Germany. The German court asked whether the GDPR precluded a competitor from bringing a claim against a controller for breach of the GDPR under unfair commercial practices legislation. In this case, a competitor of an online pharmacy selling pharmacy-only non-prescription products on Amazon Marketplace brought an action seeking an order that the pharmacy cease activity unless it could guarantee that consumers could give prior consent to the processing of health data. The ECJ was asked to decide whether the GDPR precluded this action and also whether the data in question was health data requiring an Article 9 exemption from the general prohibition on its processing. The ECJ held that the GDPR did not preclude this kind of action and indeed, such action strengthened data protection. It also said that even where products were sold without requiring a prescription, the data which users were required to enter before purchasing was health data.
On 4 October 2024, the ECJ ruled in a reference from the Netherlands regarding the Royal Lawn Tennis Association (KNLTB). The KNLTB had disclosed member personal data to be used for marketing purposes. The Dutch Supervisory Authority (AP) fined the KNLTB for breaching the GDPR and, in particular, for not having a lawful basis for the processing at issue. The AP said that for the purposes of Article 6(1), legitimate interests are only those enshrined in and determined by law. The referring court then asked the ECJ how to interpret what constitutes a legitimate interest. In short, the ECJ underlined that a legitimate interest can be purely commercial provided it is lawful, but the interest does not need to be determined by law. The processing needs to be necessary to fulfil the purpose and a legitimate interest assessment must be carried out to determine that the interests of the controller are not overridden by the rights and freedoms of the data subjects.
On 8 October 2024, the EDPB published draft guidelines on Article 6(1)(f) GDPR – the lawful basis of legitimate interests – for consultation. The draft guidelines:
In the July King's Speech, the government announced it would introduce a new Cyber Security and Resilience Bill. This is intended to strengthen defences and protect digital services, including by:
The government has now confirmed that the new legislation will be introduced to Parliament in 2025 rather than this year and is likely to be subject to prior consultation.
The ICO has published its response to Ofcom's consultation on illegal harms. Confining itself to areas with an impact on the ICO's remit, the ICO makes a number of points around alignment with UK data protection law and clarity including:
On 27 September 2024, the Irish Data Protection Commissioner announced it had issued a reprimand and a €91m fine to Meta for security failings relating to the storage of passwords in plain text. The Irish DPC found not only that Meta had failed to notify it of a data breach and to document data breaches concerning the storage of passwords in plain text, but also that it had not used appropriate technical or organisational measures to protect the passwords against unauthorised processing or ensured a level of security appropriate to the risk. The decision, which will be published in full shortly, was reached following an Article 60 procedure.
The ECJ has ruled in a reference from Germany, that Supervisory Authorities (SAs) have discretion as to whether or not to exercise corrective powers where a controller has already taken steps to remediate an issue. This particular case related to a minor data breach by an employee at a bank. The affected customer argued that the relevant SA should have fined the bank. The ECJ held that:
LinkedIn has said it will pause the use of UK personal data to train generative AI models as a result of concerns raised by the ICO and pending further engagement. The ICO has welcomed the announcement but says it will continue to monitor major developers of generative AI.
The Equality and Human Rights Commission published guidance on the Public Sector Equality Duty and data protection on 12 September 2024. The guidance analyses the relationship between s149 Equality Act 2010 and data protection legislation. It advises public authorities on how to collect and process equality information in a data protection compliant manner and sets out best practice.
On 17 September, Instagram announced the introduction of Instagram Teen Accounts to provide greater privacy by default. Teen accounts will limit who can contact teens and the content they see. Teens will automatically be given teen accounts and those under 16 will require parental permission to change default settings. Teens will also get access to a new feature which allows them to select types of content they want to see more of and receive notifications suggesting they leave their devices after 60 minutes each day. Sleep mode will be activated between 10pm and 7am which will automatically mute notifications. Online safety campaigners say the controls are easy to circumvent. The ICO has reminded platforms that under the Children's Code, settings must be high privacy by default. Ofcom has said that Instagram's measures are a "step in the right direction" in terms of online safety compliance but warns that it won't hesitate to take enforcement action against any company falling short of its obligations under the Online Safety Act once safety duties come into force early next year.
On 12 September 2024, the government announced that data centres will be designated as Critical National Infrastructure (CNI). This will allow the government to support the sector in the event of critical incidents, minimising their effect on the economy. There will be a dedicated CNI data infrastructure team which will monitor and anticipate potential threats and provide prioritised access to security agencies and emergency services. The government also announced potential investment by DC01UK of £3.75 billion in UK data centres which follows the announcement of an £8 billion investment by Amazon over five years in data centres in the UK.
In June 2024, Meta paused its use of EU and UK personal data on Facebook and Instagram to train generative AI. While it has undertaken to make this pause permanent in the EU, it will resume these activities in the UK after having made changes to the process including simplifying user opt out and giving users longer to do so. The ICO says it will monitor the situation as Meta begins informing users and is clear that it has not provided regulatory approval for the processing. It says it is now up to Meta to "ensure and demonstrate ongoing compliance".
Separately, in order to comply with its obligations under the Digital Markets Act, Meta has announced major changes to WhatsApp and Messenger to enable interoperability with third-party messaging services. As part of this it will notify EU users when a new third-party messaging service becomes interoperable and give users the choice between receiving all messages in a single inbox or keeping them separate.
The EC has announced plans to consult on draft legislation to update Standard Contractual Clauses. The consultation is likely to take place before the end of this year and changes are expected to be finalised in the second quarter of 2025.
The ICO and the National Crime Agency have signed a Memorandum of Understanding setting out how they will help UK organisations become more resilient to cybercrime and share information. They aim to improve standards in cyber security and avoid duplication of effort while sharing information and experience with each other and the public. The ICO has a similar MoU with the National Cyber Security Centre.
The EDPB announced on 10 September 2024, that it will work with the European Commission to produce guidance on how gatekeepers under the Digital Markets Act can comply with the GDPR and align their obligations.
On 12 September 2024, the Irish DPC announced it had begun a cross-border statutory inquiry into Google's AI practices under s110 of the Data Protection Act 2018. The inquiry looks at whether Google has complied with requirements to carry out a Data Protection Impact Assessment before using EU/EEA personal data to train its AI model Pathways Language Model 2 (PaLM2).
The European Commission published FAQs on the Data Act on 6 September 2024. These have been compiled with input from a variety of stakeholders and are intended to support implementation of the Act which will apply from 12 September 2025.
On 17 September 2024, the ICO issued a reprimand to Bonne Terre Limited trading as Sky Betting and Gaming (SkyBet), over its unlawful processing of personal data by using advertising cookies without user consent. The ICO said SkyBet was sharing personal data with adtech companies when users accessed the SkyBet website, before those users were given a chance to accept or reject advertising cookies. The ICO found no evidence of deliberate misuse but concluded the processing was not lawful, transparent or fair. SkyBet made changes to its cookie practices in March 2023 as a result of the ICO's investigation. The ICO will be publishing guidance on cookies and similar tracking technologies for consultation, alongside its position on the consent or pay model, later this year.
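The underlying compliance point, i.e. that advertising tags should not fire until the user has opted in, can be illustrated with a minimal server-side sketch; the route, cookie name and script URL are hypothetical, and real deployments typically also rely on a client-side consent management platform:

```python
# Minimal sketch of gating third-party advertising tags on prior consent. The
# route, cookie name and script URL are hypothetical; real sites usually enforce
# this client-side through a consent management platform as well.
from flask import Flask, request

app = Flask(__name__)

AD_TAG = '<script src="https://adtech.example.com/pixel.js"></script>'

@app.route("/")
def home():
    page = "<h1>Welcome</h1>"
    # Only include advertising scripts once the user has actively opted in.
    if request.cookies.get("ad_consent") == "granted":
        page += AD_TAG
    return page

if __name__ == "__main__":
    app.run()
```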
On 4 September, the Irish DPC announced that it had concluded the proceedings it had brought before the Irish High Court regarding X's use of EU/EEA data to train its AI tool Grok. The proceedings were struck out after X agreed to make permanent its commitment not to process EU/EEA personal data to train Grok.
The DPC has made a request to the EDPB for an Article 64(2) opinion on some of the core issues arising in the context of using personal data to develop and train AI.
On 6 September, the CMA published a statement of objections setting out how Google may have broken competition law by using its dominance to favour its own adtech services in open-display advertising. The CMA's provisional findings are that Google has abused its dominant position on both the ad server (DoubleClick for Publishers or DFP) and the buying tool (Google Ads and DV360) side to restrict competition in the UK. In particular, it has preferenced its own ad exchange (AdX), harming competition, advertisers and publishers. The CMA finds that since 2015, Google has used its buying tools and publisher ad server to strengthen AdX's position and restrict competitor opportunities. It has also prevented rival publisher ad servers from being able to compete effectively with DFP.
Practices singled out include:
The CMA will consider representations from Google before reaching a final decision. Google's adtech practices are also under scrutiny from competition regulators in the EU and USA.
The Dutch DPA (the AP) is the latest regulator to fine Clearview AI regarding its processing of personal data for its image recognition database. It has issued a fine of EUR 30.5m and additional penalty payments for ongoing breaches. Clearview AI scraped images from the internet without consent to create its database. The AP warned that Clearview also fails to inform people about its data processing activities and does not respond to subject access requests. The AP is looking into whether the company's directors can be held personally liable for infringements and warns that it will act against Dutch organisations if they use Clearview.
MLex reports that a compromise proposal for the text of the Financial Data Access Regulation suggests digital gatekeepers will be required to keep personal data obtained under FIDA separate from other personal data they hold and will be prevented from combining it. Member States are hoping to agree a compromise text in the next few months.
On 5 September 2024, the EDPB invited stakeholders to participate in a remote event on 18 November intended to gather information about 'consent or pay' models. This exercise will inform the upcoming guidelines on the issue. The guidelines will be an extension of EDPB Opinion 08/2024 which covers consent or pay in the context of large online platforms and will have broader application.
TfL limited access to some of its online services on becoming aware of a cyber attack on 8 September 2024. TfL said the attack mainly impacted its backroom systems at its corporate HQ but it had no reason to believe any personal data had been compromised. As a result of the attack, TfL limited live travel information services and restricted access to customer journey history for some customers. It has been working with the National Cyber Security Centre and the National Crime Agency in the course of investigating the incident.
On 9 August 2024, the European Commission launched a call for evidence on the functioning of the EU-US Data Privacy Framework. Feedback is invited by 9 September and will help the Commission assess whether all aspects of the DPF are in place and functioning as intended. The Commission has been collating information from a variety of sources and will produce a full report on the DPF.
Separately, Switzerland approved the Swiss-US Data Privacy Framework on 14 August 2024.
The CMA announced on 20 August 2024 its decision to accept a variation of Meta's binding commitments to address competition concerns relating to its use of digital display advertising service data. Initially the commitments involved Meta using technical systems to prevent competitors' advertising data from being used on Facebook Marketplace or to improve it. The technical solutions were applied to advertisers who had opted out of their data being used for such purposes or who had been opted out by Meta and had not objected. The variation allows the solution to cover all data from all advertising customers, not just Facebook Marketplace's competitors.
The ICO published its fifth and final call for evidence on generative AI and data protection on 22 August 2024. Responses are required by 5pm on 18 September. The ICO sets out its thinking on the allocation of controllership across the AI supply chain with a focus on AI as a Service (AIaaS). While not going into specific detail on obligations, the ICO provides some indicative scenarios of processing activities. The ICO seeks evidence on additional processing activities and actors not included in this call alongside the relevant allocation of accountability roles and asks for a range of inputs on issues relating to controllership allocation and data processing in generative AI.
On 20 August, the ICO launched a privacy notice generator tool for sole traders, start-ups, small organisations and charities to help them create bespoke privacy notices in a variety of sectors. These include the finance, insurance and legal sectors, education, health and social care, retail and manufacturing.
The ICO is partly funded by data protection fees which are payable by organisations which process personal data unless they do so for purposes which are exempt. The maximum fee is currently set at £2,900 and has not increased since it was introduced in 2018. The government is now consulting on raising the fee to £3,979 for tier 3 organisations with more than 250 staff or an annual turnover of over £36m. The tier 1 fee is proposed to rise from £40 to £55 and the tier 2 fee from £60 to £82. The consultation closes on 26 September 2024.
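For context, the proposed figures amount to an increase of roughly 37-38% per tier; a trivial worked comparison of the amounts stated in the consultation:

```python
# The current and proposed fee amounts stated in the consultation; the uplift
# works out at roughly 37-38% for each tier.
CURRENT_FEES = {1: 40, 2: 60, 3: 2_900}    # GBP
PROPOSED_FEES = {1: 55, 2: 82, 3: 3_979}   # GBP

for tier in (1, 2, 3):
    old, new = CURRENT_FEES[tier], PROPOSED_FEES[tier]
    print(f"Tier {tier}: GBP {old} -> GBP {new} (+{(new - old) / old:.0%})")
```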
NOYB had a busy August. It:
On 9 August 2024, the UN agreed the wording of a draft convention on cybercrime which is expected to be adopted by the General Assembly by the end of the year. It will be the first legally binding instrument on cybercrime and is intended to enhance international cooperation, law enforcement efforts, technical assistance and capacity building. Commentators, including human rights groups and tech companies, are concerned that the language is sufficiently broad to allow authoritarian governments to crack down on opposition and abuse human rights.
The Dutch Supervisory Authority has fined Uber EUR 290m in relation to a complaint passed on to it by the French SA and brought by the French Human Rights League representing over 170 Uber drivers. Uber was held to have failed to sufficiently protect driver privacy when transferring their personal data from the EU to the USA. Uber was found to have transferred the data, which included special category data, to the USA without using transfer tools for over two years, from August 2021 until the end of 2023 when Uber began making transfers under the EU-US Data Privacy Framework.
On 27 August, the EU and China launched their first set of discussions under the new Cross-Border Data Flow Communication Mechanism. The mechanism will focus on practical solutions to address problems EU companies face in China regarding cross-border data flows.
On 28 August 2024, the ICO issued a reprimand to the Labour Party for repeatedly failing to respond to subject access requests. The Labour Party received 352 SARs in November 2022, partly as a result of a 2021 cyber attack. 78% had not received a response within the required three months, and 56% of responses were delayed by over a year. The ICO said it had received over 150 complaints about Labour's handling of SARs between November 2021 and 2022.
In August, the ICO published a report on tackling barriers to privacy-enhancing technologies (PETs) adoption. The report covers an analysis of the barriers and recommendations on how to overcome them, based on input from an ICO workshop on PETs.
On 7 August 2024, the ICO announced its provisional decision to fine Advanced Computer Software Group Ltd £6.09m for failing to protect the information (which included sensitive data) of just over 80,000 people. Advanced acts as a data processor in providing IT and software services to organisations including the NHS and other healthcare providers. It suffered a ransomware attack in August 2022 via a customer account that did not have multi-factor authentication. This led to the exfiltration of the personal data of 82,946 people and to disruption to critical services such as NHS 111 and access by medical staff to patient records.
The ICO will consider representations by Advanced before making a final decision. If it does proceed with the fine, it will be the first major UK GDPR fine issued to a data processor.
On 2 August 2024, the ICO wrote to 11 social media and video sharing platforms, calling on them to improve their children's privacy practices. This follows on from the ICO's ongoing review of social media platforms and VSPs. The 11 platforms are being asked to address questions relating to default privacy settings, geolocation and age assurance and explain how their practices comply with the Children's Code.
The ICO has also launched a call for interested stakeholders to share views and evidence on:
The call closes on 11 October 2024.
The new Labour government has said it wishes to continue with consultations launched by the previous government on:
Feedback is requested by 9 August 2024.
On 31 July 2024, the EDPS published Model Administrative Arrangements for transfers of personal data from EU institutions to international organisations. These are designed to help the institutions comply with relevant data protection law and place an emphasis on core data protection principles and required safeguards.
On 8 August 2024, X announced it was suspending processing personal data in publicly available tweets to train its AI tool Grok. The decision was welcomed by the Irish DPC which had filed an application to suspend the service at the High Court following complaints by consumer protection groups.
The EU's Consumer Protection Cooperation Network (CPC) sent a letter to Meta on 22 July, with questions relating to a suspected breach of EU consumer law as a result of its pay or consent model in the EU for its Facebook and Instagram platforms. The action is led by the French competition regulator and coordinated by the Commission. The Commission is concerned that Meta uses misleading or aggressive practices and is insufficiently transparent when offering consumers the choice between paying for an ad free model and consenting to being tracked for advertising purposes. This action is distinct from other ongoing investigations relating to the DMA, the DSA and the GDPR which also focus on Meta's pay or consent model. Meta is required to respond and propose solutions by 1 September 2024.
Meanwhile, Meta was hit by a US$220m fine in Nigeria for breaches of competition, consumer and data protection laws.
The European Commission published its second report evaluating the GDPR on 25 July 2024. The report found that the GDPR was fit for purpose but improvements could be made around enforcement. In particular the GDPR Procedural Regulation would help but the report also emphasised the need for consistent approaches and cooperation between Member State regulators, both in relation to the GDPR itself, and in terms of its place in the wider digital regulatory framework.
The impact of the CrowdStrike incident is likely to result in large insurance claims with estimates that US Fortune 500 companies alone would be likely to claim USD 5.4bn in insured damages. CrowdStrike said at the weekend that 97% of affected systems are now back online.
On 24 July 2024, the ICO issued a reprimand to a school which introduced facial recognition technology in its canteen without first conducting a DPIA as required under the UK GDPR.
The EDPB adopted a statement on the role of DPAs in the AI Act framework on 17 July 2024. The EDPB stresses that DPAs already have experience when dealing with the impact of AI on fundamental rights, in particular, regarding protection of personal data. It therefore recommends that the DPAs be designated as the Market Surveillance Authorities, which must be appointed by Member States by 2 August 2025. The AI Act itself recommends DPAs as MSAs for high-risk AI systems used for law enforcement, border management, administration of justice and democratic processes. The EDPB recommends they should also be MSAs for other high-risk systems and be designated as single points of contact. Appropriate governance and cooperation policies would be needed.
EU DPAs and the ICO have a track record of activity on AI. Most recently, on 18 July, the CNIL published guidance on how it would enforce aspects of the EU AI Act and how to deploy generative AI, and on the same date, the Irish DPC published guidance on the relationship between large language models and data protection.
On 17 July 2024, the EDPB published FAQs on the EU-US Data Privacy Framework (DPF) for individuals and businesses. The FAQs for individuals provide information on the functioning of the DPF – how to benefit from it, how to lodge complaints, and how complaints are handled. The business FAQs focus on which companies are eligible to sign up to the DPF and what to do before transferring personal data to DPF-certified organisations.
Chaos ensued on 19 July after CrowdStrike issued a faulty software update causing many devices using Microsoft Windows to crash. CrowdStrike said around 8.5m devices were impacted and that it had been working on a technique to reboot systems more rapidly after warnings that full recovery could take weeks. CrowdStrike said the issue was not due to a cyber attack; however, in the aftermath, there are concerns that hackers may exploit the event by falsely claiming to represent CrowdStrike in order to get access to systems.
The UK privacy advocacy group Open Rights Group (ORG) submitted a complaint to the ICO about Meta's proposed changes to its privacy policy on 15 July 2024. Meta said it would update its privacy policy to rely on legitimate interests to use personal data it processes to develop AI systems from 26 June. ORG's complaint follows a complaint by NOYB regarding similar proposals in the EU. Meta said it would pause the changes in the EU and, at the ICO's request, in the UK; however, ORG says there has been no change to the policy itself. It calls on the ICO to issue a binding decision to prevent the processing of personal data to develop AI without consent, investigate the issue, and prohibit the use of personal data for undefined "artificial intelligence technology" without opt-in consent.
After concerns expressed by the CMA and the ICO, Google announced on 22 July 2024, that "instead of deprecating third-party cookies, we would introduce a new experience in Chrome that lets people make an informed choice that applies across their web browsing and they'd be able to adjust that choice at any time". It will continue to work on Privacy Sandbox APIs and to make them available in discussion with regulators and industry. Google has also said it will offer additional privacy controls by introducing IP Protection into Chrome's Incognito mode. The CMA has said it is "considering the impact of this announcement and welcomes views until 12 August". The ICO said: "We are disappointed that Google has changed its plans…we will reflect on this new course of action when more detail is available".
The new Labour government set out its legislative agenda in the King's Speech to Parliament (and in background briefing notes) on 17 July 2024. In the speech itself, it announced it would "seek to establish the most appropriate legislation to place requirements on those working to develop the most powerful AI models” although no AI-specific Bill was listed in the briefing notes themselves. They did, however, list two important pieces of legislation not mentioned in the speech:
Digital Information and Smart Data Bill - this aims to harness the power of data for economic growth. Among other things, it will establish digital verification services, a national underground asset register, and smart data schemes which allow secure sharing of customer data with authorised third party providers. It will also preserve many of the reforms to the ICO's governance structure proposed under the Data Protection and Digital Information Bill (which did not pass before the general election) and will include “targeted reforms to some data laws….where there is currently a lack of clarity”. No specific mention of the GDPR is made. Suggestions are that the new Bill will drop some of the more controversial elements of its predecessor and make fewer changes to the UK GDPR.
Cyber Security and Resilience Bill - to protect public services and infrastructure by expanding the remit of existing regulation, putting regulators on a stronger footing and increasing reporting requirements. This is widely expected to bring the current NIS Regulations more in line with the EU's NIS2 Directive.
The EU's AI Act was published in the Official Journal on 12 July 2024 and will come into force on 1 August 2024 with compliance phased in over a three-year period. Hailed by the EU as a global first, the AI Act is the most comprehensive attempt to regulate the use of AI to date. It takes a risk-based approach to AI, aiming to strike a balance between innovation and regulation while protecting fundamental rights. See here for more.
On 9 July 2024, the Global Privacy Enforcement Network (GPEN) comprising 26 international data protection authorities, and the International Consumer Protection and Enforcement Network (ICPEN) announced the results of a global privacy sweep of over 1000 websites and apps. They found that the vast majority used at least one deceptive design practice and that over 75% of subscription services across the 26 countries used dark patterns in their advertising.
The UK ICO held a roundtable on IoT products on 11 July focusing on products like smart meters and fitness devices. It plans to produce draft IoT guidance for consultation towards the end of this year or in early 2025.
NOYB filed a complaint with the Garante against Microsoft-owned ad broker Xandr on 9 July 2024. NOYB alleges Xandr does not comply with GDPR transparency requirements in its use of personal data for targeted advertising, particularly as regards special category data. Its website states that it did not fulfil any erasure or access requests in 2022, and it refused to comply with an access and erasure request made in February 2024, saying it was unable to identify the data subject, despite having set a cookie on the individual's device that was used to target advertising to them, the value of which had been provided to Xandr.
NOYB published a report on 11 July 2024, analysing decisions taken by national data protection regulators relating to cookie consent banners, and comparing them with the position taken by the EDPB's cookie banner taskforce. The report is intended as a useful resource for businesses implementing consent banners on their websites and is intended to spark further discussion about deceptive practices and how to avoid them.
The EU-Japan Economic Partnership Agreement (EPA) came into force on 1 February 2019. In October 2023, the EU and Japan negotiated an agreement on cross-border data flows. This was ratified and entered into force on 1 July, now forming part of the EPA. It provides for the free flow of data (not just personal data) between the EU and Japan and removes data localisation requirements among other things.
The European Commission launched a consultation on a draft Implementing Regulation (IR) under the NIS2 Directive on 27 June 2024. The IR will set out details relating to Article 23(3) of the NIS2 Directive including when an incident will be "significant" and the technical requirements of the Article 23(3) risk management measures. Relevant entities under the IR will include cloud computing service providers, managed service providers, providers of online marketplaces, online search engines and social network platforms. The consultation closes on 25 July 2024 and the Commission hopes to adopt the IR by the end of Q3 2024.
On 3 July 2024, the High Court ruled in Pacini and another v Dow Jones & Company Inc. [2024] EWHC 1709 (KB). The court rejected an application to strike out a claim for breach of data protection law brought by the two claimants – former senior executives of the XIO group. The claim related to two articles published in 2017 and 2018 and the right to be forgotten. Dow Jones sought to have the claim struck out, arguing that it was purely tactical and an abuse of process because it was in reality a statute-barred defamation claim, the nub of which concerned the claimants' reputation, or that it was an abuse in the sense of the Jameel v Dow Jones & Company 2005 case.
The court ruled against Dow Jones, saying the claimants' evidence suggested the nub of the claim was the exercise of the right to be forgotten. This seemed arguable and not plainly improper and, either way, Dow Jones's abuse argument would require a full trial. The Judge said it would be wrong to stigmatise as an abuse of process a data protection claim for damages for reputational harm caused by processing inaccurate data. The issue was not suitable for determination on a summary application and, as it is currently unsettled, would probably require an appeal court to resolve it. He also said it would be wrong to strike the claim out summarily on Jameel grounds as it could not be said, based on the evidence, that the claim had little prospect of success.
The case therefore continues.
On 1 July 2024, the European Commission informed Meta that its preliminary findings were that Meta's 'pay or consent' advertising model breaches the Digital Markets Act. Under Article 5(2) of the Digital Markets Act, gatekeepers must seek user consent to combine their personal data between designated core platform services and other services. If a user refuses consent they should be offered access to a less personalised but equivalent alternative service. Gatekeepers cannot make the provision of certain services or functionalities conditional on user consent.
The EC provisionally finds that Meta's 'pay or consent' advertising model breaches the DMA because:
Meta, which contends its model complies with EU law including ECJ case law, can reply to the Commission's preliminary findings. If the Commission does not change its preliminary views, it will adopt a non-compliance decision and can impose fines of up to 10% of annual global turnover, rising to 20% for repeated non-compliance.
The 'pay or consent' model has also been under the scrutiny of European Data Protection Regulators, with the EDPB issuing an opinion in April which also emphasised the importance of providing a genuinely equivalent free alternative to a service, rather than offering a binary choice between a paid for ad free service and a free service supplied conditionally on user consent to behavioural advertising.
MLex has reported that the Information Commissioner expects a new data protection bill to come "at an early stage of the next parliament" and that he expressed disappointment that the DPDI Bill failed to make it to enactment. This ties in with suggestions from other sources that a Labour government would put forward some sort of digital information bill (a rumoured Digital Bill of Rights), although it may be less focused on GDPR reform than the DPDI Bill was.
The European Data Protection Supervisor and the Spanish DPA, the AEPD, published a joint report on neurodata on 3 June 2024. Neurotechnology monitors human brain activity; for example, it can be used to monitor responses to ads and to predict behaviour based on previous ads. Combined with AI, neurodata could conceivably be used for law enforcement and screening as well as surveillance. As such, certain uses will be prohibited under the EU's AI Act. The report outlines different uses of neurotechnology and looks at the types of neurodata they may process and their applications. It goes on to look at challenges when processing such data, including threats to the rights and freedoms of individuals and data protection compliance.
On 27 June 2024, the European Systemic Risk Board published a compliance report on its recommendation on the establishment of a pan-European systemic cyber incident coordination reporting framework (EU-SCICF). Its recommendation, published in January 2022, set out issues that EU authorities should consider in preparation for DORA, including requirements around points of contact. The report found there has been good progress but reminded relevant authorities that the EU-SCICF must be ready to be fully operational by the end of 2024.
On 25 June 2024, the Council of the EU adopted a Recommendation on a Blueprint for protecting EU citizens and the internal market, essentially to coordinate a response at EU level to disruptions to critical infrastructure with significant cross-border relevance. The focus is on sharing experience and information about an incident to help coordinate public communications and an effective response. Testing of the Blueprint at national, regional and EU level is recommended with a test exercise at EU level to take place within 18 months.
The ICO launched a two year trial of its approach to public sector organisations in June 2022. The trial has now ended and the ICO will review its approach before making a final decision on it in the autumn. In the meantime, the current approach will apply.
After years of stalemate, there are media reports that the next European Commission will withdraw the ePrivacy Regulation. The expectation is that it will be replaced, potentially by three separate pieces of legislation covering:
On 25 April 2024, the European Parliament passed the Platform Work Directive, introducing new rules on algorithmic management and data protection measures for workers. The Directive prohibits digital platforms from processing data on workers' emotional or psychological state, or data that could infer sensitive information such as racial or ethnic origin, political or religious beliefs, migration status, health, or trade union activity. Biometric data processing is allowed only for authentication, and platforms cannot collect personal data when the person is not performing platform work. The new rules ensure that workers cannot be fired or dismissed based on algorithmic or automated decisions, requiring human oversight for important decisions affecting platform workers. The Directive will now be adopted by the Council and then signed and published in the Official Journal coming into force 20 days later. Member States will have two years from entry into force to transpose the Directive.
The EDPB has published the final version of its Guidelines on Article 37 of the Law Enforcement Directive which relates to data transfers. The EDPB says the recipient country or organisation must have an essentially equivalent level of data protection as the EU but the requirement relates to the specific data transfer or category of transfers so essential equivalence must be guaranteed for the particular case rather than the overall legislative position in the recipient country. The guidelines also detail the importance of legally binding instruments for data transfers, the requirements under the legal framework, and the role of DPAs in monitoring data transfers.
On 21 June, health data stolen during the 3 June ransomware attack on Synnovis was reportedly published online. The NCSC said it was investigating whether or not the published data was in fact data extracted from Synnovis and both it and NHS England have provided online support tools.
Norway's Personal Data Protection Board has held that the Norwegian DPA does not have the authority to impose a daily penalty on Meta Ireland for failing to comply in Norway with the ban on behavioural advertising on the basis of legitimate interest or contractual necessity. The DPA erred in imposing this type of penalty (of up to 1m NOK per day), which it can only do in domestic cases, not in cross-border ones. The behavioural advertising ban was, however, upheld in a separate decision.
All polls currently suggest the Labour Party will win the general election (other parties are available). The Labour Party manifesto was published on 13 June 2024. In terms of data, Labour commits to creating a National Data Library to provide access to public data sets, to removing planning barriers for data centres, and to supporting the conditions needed to facilitate open banking. While the Labour manifesto does not mention the Data Protection and Digital Information Bill, some of its proposals suggest it may introduce at least parts of the Bill, although probably not those dealing with reform of personal data protection law; the prevailing view is that it will not reform the EU GDPR or those parts of the DPA 18 dealing with personal data.
Arguably the most detailed proposals on data come from the Liberal Democrats – the party most likely to be involved in any coalition situation. Among the proposals in their manifesto, the Lib Dems say they would:
Following "intensive engagement" with Meta, the Irish DPC welcomed Meta's decision (in response to its request) to pause its plans to train its large language model using public content shared by adults on its Facebook and Instagram platforms across the EU/EEA. In a blogpost, Meta expressed disappointment and its belief that the changes made to its privacy policy to allow it to use the data do comply with EU data protection law. The Irish DPC acted following a number of complaints made to EU regulators by NOYB. Meta has also extended this pause to the UK at the request of the ICO.
On 10 June 2024, the CNIL began consulting on the development of AI systems and on how the EU GDPR applies to AI models. Feedback is sought on seven policies relating to the development of AI with the focus including open-source models, web scraping, lawful basis (particularly legitimate interest), data subject rights and security. The consultation closes on 1 September 2024 and the eventual policies will join the information sheets published in April 2024.
The CNIL also published 'principles' sheets on Open Data on 12 June to ensure permitted use of publicly available data complies with data protection law.
On 13 June 2024, NOYB lodged a complaint with the Austrian DPA alleging that Google is not being transparent about the fact that it continues to track Chrome browser users after they have turned on the 'ad privacy' feature. The privacy feature is founded on Google's browser-based privacy sandbox in a move away from third party cookies. NOYB says Google should still get user consent to this and that it does not provide enough information to allow users to provide informed consent.
The EU Council has agreed its negotiating position on the draft Regulation on rules to harmonise cross-border enforcement of the EU GDPR. This focuses on enhanced cooperation between EU data protection regulators and a revised complaints handling procedure. Trilogues can now begin.
On 13 June 2024, the ICO published the final version of its Enterprise Data Strategy following a public consultation. Part of the ICO's ICO25 strategic plan, the Strategy sets out how the ICO plans to use its data assets and exemplify responsible data use by adhering to the principles of:
On 10 June 2024, the UK and Canadian data protection regulators announced they would investigate the 23andMe data breach which last year saw a threat actor compromise over six million profiles in a credential stuffing attack. The ICO and Office of the Privacy Commissioner of Canada (OPC) will focus on the scope of information exposed and likely harms, whether or not 23andMe had appropriate safeguards in place and whether it provided adequate notification of the breach to those affected and to relevant regulators.
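Credential stuffing involves replaying large volumes of username and password pairs leaked from other breaches against a target's login page. As a purely illustrative sketch (not a description of 23andMe's systems or of the regulators' methodology), the snippet below flags sources that attempt logins against an unusually large number of distinct accounts within a short window, one of the signals a defender might monitor; the thresholds are assumptions chosen for the example.

```python
from collections import defaultdict, deque

# Illustrative thresholds (assumptions, not regulatory guidance):
WINDOW_SECONDS = 300         # look at the last five minutes of activity
DISTINCT_ACCOUNT_LIMIT = 20  # distinct usernames per source IP before flagging

class StuffingDetector:
    """Flags source IPs that try many different accounts in a short window,
    a pattern typical of credential stuffing rather than normal logins."""

    def __init__(self):
        self.events = defaultdict(deque)  # ip -> deque of (timestamp, username)

    def record_attempt(self, ip: str, username: str, ts: float) -> bool:
        window = self.events[ip]
        window.append((ts, username))
        # Drop events that have fallen outside the sliding window.
        while window and ts - window[0][0] > WINDOW_SECONDS:
            window.popleft()
        distinct_accounts = {user for _, user in window}
        return len(distinct_accounts) > DISTINCT_ACCOUNT_LIMIT

detector = StuffingDetector()
# Example: 25 different usernames from one IP within a few seconds is suspicious.
for i in range(25):
    suspicious = detector.record_attempt("203.0.113.7", f"user{i}@example.com", float(i))
print("flag source for review:", suspicious)
```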
The High Court held in Harrison v Cameron and another that a controller may rely on the 'rights of others' exemption under the DPA 18 and Article 15 UK GDPR and refuse to disclose identities of individuals in response to a SAR where doing so could reasonably be expected to create a significant risk that the individuals would face intimidation and threats from the maker of the SAR. A number of other issues were also considered including the scope of the personal/household exemption under Article 2(2)(a) of the UK GDPR.
On 6 June 2024, NOYB filed complaints with eleven European DPAs about Meta's planned changes to its privacy policy and is intending to add to those. Meta is proposing to start processing EU and UK user data on its platforms to train generative AI models. This will include posts and photos dating as far back as 2007 but will exclude private messages or messages with businesses. The data will be processed on the basis of Meta's legitimate interests although users will be able to opt out.
NOYB argues that the GDPR requires Meta to obtain user consent rather than providing an opt-out and says that the opt-out form is difficult to find and is misleading. The changes are set to take effect on 26 June 2024. NOYB is requesting a decision under the Article 66 urgency procedure particularly because it argues there is no option to opt out at a later point once the policy is in effect.
The Hamburg and Danish DPAs have reportedly already expressed concerns, however, the UK ICO's Executive Director for regulatory risk has reportedly said that it is up to users to "exercise their rights where they wish to do so", adding that AI developers must be transparent about their use of personal data.
LinkedIn has said it will stop targeting ads at EU users based on sensitive data following a complaint that doing so was in breach of the Digital Services Act. The DSA prohibits targeting online adverts based on profiling using special categories of personal data. The complaint, made in February 2024, led to the European Commission sending LinkedIn a formal request for information in March of this year.
ENISA and the European Securities and Markets Authority, the European Insurance and Occupational Pensions Authority and the European Banking Authority, published a multilateral memorandum of understanding on 4 June 2024. This covers cooperation and information exchange relating to DORA, NIS2 and other areas of common interest. In particular, common incident reporting and developing common technical standards will be prioritised.
On 7 June 2024, Microsoft announced it was making changes to its AI-powered 'Recall' feature for its new Copilot+ laptops following privacy concerns. The Recall feature is billed by Microsoft as like giving the computer a photographic memory, allowing it to recall anything which has ever appeared on screen by taking regular screenshots and making those searchable. Microsoft is now giving people a clearer choice to opt in and is turning the system off by default. Strong authentication and 'proof of presence' will be required for users wanting to view their timelines and saved activity. The UK's ICO has been making enquiries with Microsoft about the system.
The NHS suffered a ransomware attack last week, believed to have been carried out by the Russian Qilin criminal gang. The attack affected the IT system of Synnovis which analyses blood tests on behalf of the NHS. A number of non-urgent operations were cancelled and appeals for blood donors were issued. It has been reported that the attack could disrupt the NHS for months if Synnovis is unable to regain access to its systems and the records they contain.
NOYB filed complaints with the Austrian Data Protection Authority on 4 June 2024 in which it alleges Microsoft 365 Education unlawfully tracks children regardless of their age. NOYB alleges that while schools have no control over the systems and the way they process personal data, the Microsoft terms of service state that the schools using the software are the data controllers. NOYB also says there is a lack of transparency and, more seriously, that cookies analyse user behaviour, collect browser data and are used to deliver advertising without an appropriate lawful basis. NOYB is asking the Austrian DPA to investigate.
On 30 May, the Italian DPA, the Garante, announced guidance to help data controllers prevent online data being scraped to train AI. The guidance includes recommendations of concrete measures which can be adopted to protect data from indiscriminate scraping including creating reserved areas which require registration, anti-scraping clauses in terms of service, page traffic monitoring to identify abnormal data flows, and technical solutions to repel bots.
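One of the measures the Garante mentions, monitoring page traffic and throttling abnormal request volumes, is commonly implemented with a rate limiter. The sketch below shows a minimal token-bucket limiter, offered purely as an illustration of the kind of technical control a site operator might adopt to repel scraping bots; the capacity and refill parameters are assumptions for the example, not values recommended by the Garante.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: each client gets `capacity` tokens,
    refilled at `refill_rate` tokens per second; requests without a token are rejected."""

    def __init__(self, capacity: float = 10.0, refill_rate: float = 1.0):
        self.capacity = capacity
        self.refill_rate = refill_rate
        self.tokens = capacity
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last_refill) * self.refill_rate)
        self.last_refill = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Example: a client bursting 15 requests at once exceeds the default capacity of 10.
bucket = TokenBucket()
results = [bucket.allow() for _ in range(15)]
print(results.count(True), "allowed,", results.count(False), "throttled")
```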
On 27 May 2024, the EDPB published a statement on the financial data access and payments package. The package consists of three proposals aimed at improving consumer protection and competition in electronic payments and providing for consumers to elect to share their data in order to get access to a wider and cheaper range of financial products and services.
The EDPB makes various suggestions on how to align the proposals with EDPB Guidelines, in particular those covering the Second Payment Services Directive, and Opinions on the proposals. The EDPB points out areas where the European Parliament did not follow its recommendations and urges that they be adopted.
Live Nation has confirmed the theft of data belonging to 560m customers worldwide. Responsibility has been claimed by ShinyHunters which is demanding £400,000 for the return of the data comprising names, addresses, contact numbers and incomplete credit card details. The group also claimed responsibility for a cyber attack on Santander Bank in which they claim to have stolen the data of 30m customers including 6m account numbers and balances, and 28m credit card numbers as well as staff data. It has reportedly asked for £1.6m to release the data.
Separately, the BBC suffered a data breach which exposed the details of more than 25,000 members of its occupational pension schemes although the BBC says it has no evidence of a ransomware attack.
The European Data Protection Supervisor published guidelines on 3 June 2024 to help EU institutions, bodies, offices and agencies comply with data protection requirements (under Regulation 2018/1725, the EU institutions' equivalent of the GDPR) when using generative AI. The guidelines run through the data protection principles and provide practical examples to anticipate risks, challenges and opportunities. There is also a focus on when to carry out DPIAs and how to conduct them.
Reports last week suggested that the UK government would consult on new proposals around ransomware reporting. These would propose banning ransom payments by critical national infrastructure, and require others to report ransomware attacks and obtain permission before paying cyber attackers. With the announcement of the general election, however, it's unclear whether or not the consultation will happen and in what form.
On 23 May 2024, the European Commission opened infringement proceedings against 18 Member States requiring them to implement the Data Governance Act within two months. Non-compliance issues relate mainly to failure to designate responsible authorities or demonstrate that these are empowered to perform tasks allocated under the DGA.
The EDPB adopted an Opinion on the use of live facial recognition technology at airports on 23 May 2024. The Opinion looks at this type of processing in the context of the storage limitation principle, the integrity and confidentiality principle, data protection by design and default, and security of processing. The EDPB notes that there is no uniform requirement across the EU Member States for airport operators and airlines to verify that the name on a passenger's boarding pass matches the one in their identity document. Where no such verification is required, no such verification using biometrics should occur as this would result in excessive processing.
Where biometrics are used, only the data of passengers who actively enrol and consent to participate should be processed. The EDPB found that the only storage solutions compatible with the principles it considered are those where the biometric data is stored in the hands of the individual or in a central database with the encryption key held solely in the individual's hands. Such storage solutions must also be implemented with a list of minimum recommended safety measures.
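By way of illustration only, the sketch below (using the third-party Python cryptography package) shows the shape of a storage model in which the operator holds only ciphertext while the encryption key stays solely with the passenger. It is a simplified sketch of the principle the EDPB describes, not an implementation it has endorsed; the identifiers and placeholder data are assumptions for the example.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# The passenger generates and keeps the key (e.g. in a wallet app on their phone).
passenger_key = Fernet.generate_key()

# Enrolment: the biometric template is encrypted client-side before it reaches the operator.
template = b"<biometric template bytes>"  # placeholder, not a real template
ciphertext = Fernet(passenger_key).encrypt(template)

# The operator's central database stores only ciphertext; without the passenger's key
# it cannot recover the template.
operator_database = {"passenger-123": ciphertext}

# At the gate, the passenger presents their key and the template is decrypted transiently.
recovered = Fernet(passenger_key).decrypt(operator_database["passenger-123"])
assert recovered == template
```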
The EDPB's ChatGPT taskforce, set up to promote cooperation between DPAs investigating ChatGPT's data protection compliance, published an interim report on its work to date on 24 May 2024. The investigation is ongoing so the findings are not conclusive; however, the taskforce considers the application of GDPR provisions around lawfulness, fairness, transparency and accuracy, including the ability of individuals to exercise their GDPR rights. The taskforce also developed a common questionnaire for DPAs to use with ChatGPT.
The EDPB also announced plans to develop guidelines on generative AI which will focus initially on data scraping in the context of AI training data.
The Council of the EU adopted the AI Act on 21 May 2024. It now goes through the formal signature process after which it can be published in the Official Journal. This is expected to happen shortly. The press release linked to a version of the Act which is presumed to be near final.
The Data Protection and Digital Information Bill was not put on the 'wash up' list for legislation nearing enactment which is being rushed through before Parliament is dissolved for the 2024 General Election. While in its later stages, the Bill still had to go through amendments in the Lords and was probably either too controversial or too far from enactment to make the cut. With polls predicting a Labour victory, its future is uncertain. It is possible that the Labour manifesto will shed more light once it is published but many expect that it will not survive in its current form if at all.
The ICO intends to fine the Police Service of Northern Ireland £750k after personal information of its entire workforce was exposed. The ICO said that "the sensitivities in Northern Ireland and the unprecedented nature of this breach created a perfect storm of risk and harm". The ICO said that simple, practical-to-implement policies and procedures could have prevented the incident.
The Automated Vehicles Act received Royal Assent on 20 May 2024. Among other things, it introduces detailed technical and cyber security requirements, as well as new criminal offences in relation to certain types of non-compliance.
Almost the entire Act is to be brought in under secondary legislation.
Under the banner of the Digital Regulation Cooperation Forum, the ICO and FCA published a joint paper on 14 May 2024, looking at consumer attitudes towards digital assets and how consumers interact with them. The report is intended to inform the regulatory approach from a financial and data protection perspective. Transparency and trust are highlighted as key to consumer take up. The need to ensure that risks associated with the technology underpinning digital assets are addressed to protect consumers is also emphasised.
The ICO launched a fourth call for evidence on generative AI on 15 May 2024. This chapter focuses on data subject rights. It explains relevant rights and their application in the context of the training and use of generative AI. The ICO suggests a series of recommendations to ensure organisations developing and using generative AI keep users informed about their rights, enable their exercise, provide information about use of personal data, and clearly justify any exemptions. The consultation closes on 5 June 2024.
The House of Lords is expected to open the report stage of the Data Protection and Digital Information Bill on 10 June 2024. Any approved amendments will then go forward to the Bill's third reading in the Lords before returning to the Commons for consideration. These are the final stages of the Bill which the government will be keen to enact before calling a general election.
The UK government launched two calls for views on voluntary codes of practice for AI cyber security and cyber security issues for software vendors on 16 May 2024.
AI cyber security
The call for views focuses on the introduction of a voluntary code of practice on AI cyber security. It sets out specific interventions to help secure AI and hopes to contribute to developing a global standard. The intention is to set out baseline security requirements together with risk mitigation actions in twelve key principles. These are designed to cover the entire AI lifecycle across the supply chain. The call is published alongside a number of research reports and is part of the government's National Cyber Strategy.
Software vendors
This proposed code of practice sets out fundamental security and resilience measures to be reasonably expected of B2B software developers and vendors. The draft code sets out 21 provisions under four overarching principles. The government also intends to publish technical controls and implementation guidance to set out minimum actions software vendors need to demonstrate they have taken to establish confidence in the resilience of their product or service. Organisations procuring software should be able to refer to the code and supporting materials to inform their understanding of risks associated with the software and to request any additional measures.
The Regulation amending the Interim Regulation which applies a derogation to the ePrivacy Directive for the purpose of combatting online child sexual abuse was published in the Official Journal on 14 May 2024. The new Regulation extends the derogation period to 3 April 2026 pending the agreement of the currently draft CSEA Regulation.
On 10 May 2024, the ICO called on organisations to do more to boost cyber security and protect the personal data they hold in the context of a growing threat of cyber attacks. The call comes with publication of a new report analysing trends in cyber security breaches during 2023. Finance, retail and education were the sectors reporting the most incidents. The report focuses on five leading causes of breaches:
It also briefly covers malware and ransomware.
Ofcom has published its second major consultation on the Online Safety Act (OSA) focusing on protecting children from harms online. Under the OSA, user-to-user and search services likely to be accessed by children have safety duties relating to content that is harmful to them.
The consultation covers:
In another lengthy set of documents, Ofcom provides a summary of its consultation together with documents including:
Read more.
For many businesses, compliance will engage data protection issues and last week (as reported here) Ofcom and the ICO set out their vision for collaboration on areas of overlap.
The German joint body of federal Data Protection Authorities (DSK) published joint guidance on AI and data protection on 6 May 2024. The guidance focuses primarily on large language models and how organisations can deploy them in a data protection compliant manner, but the DSK says the guidance is relevant to other AI applications and for developers as well as deployers. The guidance covers selection and planning, implementation and active use. Unsurprisingly, it focuses on key data protection issues including transparency, lawful basis, data minimisation, use of special data, data protection by design and default, and governance.
Under the banner of the Digital Regulatory Cooperation Forum (DRCF), Ofcom and the ICO published a Joint Statement on Collaboration on the Regulation of Online Services on 1 May 2024. This sets out:
Collaboration themes highlighted in this statement will be particularly important to businesses that have safety duties under the OSA involving the processing of (children's) personal data, and whose services are also caught under the ICO's Children's Code. They will initially cover:
It's also worth noting that Ofcom published its consultation on protecting children from harmful content online on 8 May (read more). This places a heavy emphasis on age assurance and recommender systems so it's unsurprising that these will be areas of joint focus.
The Dutch Data Protection Authority published guidance explaining why scraping personal data breaches the GDPR on 1 May 2024. The DPA says scraping is almost always illegal and can only be lawful if it is carried out in a highly targeted manner. The DPA highlights that the fact that data is publicly available on the internet does not mean consent has necessarily been given to scrape it. There are instances in which legitimate interests might be available as a lawful basis but the standard will be difficult to meet. The DPA suggests scraping may be permitted for domestic use and where an organisation scrapes news and media sites to gain insight into news about itself.
In response to a request from the Department for Science, Innovation and Technology, the ICO published its strategic approach to regulating AI at the end of April 2024. This covers the opportunities and risks of AI, the role of data protection law, the ICO's work on AI to date and its plans for further work. This centres on guidance, advice, support and enforcement action as well as on collaboration with other relevant regulators.
The draft Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) (Amendment) Regulations 2024 have been laid before Parliament. They will amend the PSTIA Regulations 2023 to correct an error in Schedule 1 to clarify that when a manufacturer of a relevant connectable product extends the minimum length of time for which security updates must be supplied, this must be published as soon as practicable. In addition, the Amendment Regulations will introduce new Schedule 3 exemptions to being classed as relevant connectable products for motor vehicles, two and three wheel vehicles, and agriculture and forestry vehicles, all of which are covered in separate legislation.
On 29 April 2024, the Council of the European Union adopted a protocol to allow for free data flows between the EU and Japan. Japan already has an EU adequacy decision in relation to personal data. The protocol is intended to prevent data localisation and to provide a more predictable legal framework which allows businesses to handle data efficiently.
The Regulation to amend the eIDAS Regulation as regards establishing the European Digital Identity Framework (eID Regulation) was published in the Official Journal on 30 April 2024. It establishes a new digital identity framework for EU citizens and a European Digital Identity Wallet. Read more.
The Italian DPA, the Garante, lifted its ban on ChatGPT, satisfied that it had made sufficient progress on GDPR compliance issues. These include OpenAI committing to provide greater visibility of its privacy policy and user content opt-out forms, and to provide a new opt-out form allowing EU citizens to opt out of OpenAI using their personal data to train ChatGPT.
On 29 April, the EU Council adopted the Regulation amending the Interim Regulation which applies a derogation to the ePrivacy Directive for the purpose of combatting online child sexual abuse. The new Regulation will extend the derogation period to 3 April 2026 pending the agreement of the currently draft CSEA Regulation, and will now be published in the Official Journal.
The Short-term rental Regulation was published in the Official Journal on 29 April 2024. It sets out rules for data collection by competent authorities and providers of short-term online rental platforms and for sharing that data. It applies to short-term rental platforms which allow hosts to provide short-term rentals in the EU and to hosts providing those services. The Regulation will apply from 20 May 2026.
Advocate General Szpunar has delivered an Opinion in a reference from Germany. As part of the referred issues, the AG was asked whether data relating to the online purchase of non-prescription medicine constitutes health data for the purposes of the GDPR. The AG advised that it does not qualify as special data: the purchase does not allow sufficiently certain conclusions to be drawn about the purchaser's health status, because these sorts of medication are for general use, are often bought proactively or for preventative purposes, and may be taken by someone other than the purchaser.
The Upper Tribunal Administrative Appeals Chamber dismissed the ICO's appeal against a First Tier Tribunal decision to overturn an order it made against credit broker Experian. The order related to Experian's marketing services business and required Experian to make changes to its data processing practices, in particular in relation to lawful basis and transparency. On appeal by Experian, the First Tier Tribunal largely struck out the ICO's enforcement notice. The ICO appealed the decision but the Upper Tribunal concluded on 23 April that while there were minor flaws in some of the First Tier Tribunal's decision, they did not raise legal issues. It therefore upheld the decision. The ICO has said it will consider its next steps including whether or not to appeal.
On 23 April 2024, IAB Europe published a response to the EDPB's Opinion on pay or consent models for large online platforms. IAB Europe argues that the EDPB has mischaracterised both the 'consent or pay model' and personalised advertising and risks creating legal uncertainty for many businesses beyond large online platforms. The EDPB is accused of making overly abstract assumptions and failing to substantiate them. IAB Europe also suggests the EDPB has failed to balance individual rights with the freedom to conduct business and that it is attempting to require businesses to provide products and services at a loss or interfere with their chosen business models.
On 26 April 2024, AG Rantos published an Opinion on a referral from Austria in a case brought by Max Schrems. Schrems complained that personal data relating to his sexual orientation had been processed unlawfully by Meta to send him advertising targeted at homosexuals. He argued this was done without his consent or under any other lawful basis. Schrems claimed the advertisements were not based directly on his sexual orientation but on an analysis of his particular interests. He brought an action in the Austrian courts and subsequently referred publicly to his homosexuality during a panel discussion but did not publish this information on Facebook.
The Austrian Supreme Court asked the ECJ whether:
The AG said the ECJ should rule that:
On 24 April, the European Parliament approved:
Data privacy campaign group NOYB has filed a complaint with the Austrian DPA asking it to investigate OpenAI's data processing and the measures it takes to ensure accuracy of personal data processed in the context of its large language models. NOYB also asks the regulator to require OpenAI to comply with a subject access request made by Max Schrems. The background to the complaint is that ChatGPT gave an incorrect date when asked for Mr Schrems's date of birth – an error known as a 'hallucination'. NOYB alleges that this breaches accuracy principles and that there is no way for OpenAI to give effect to rectification and deletion requests. There are also transparency issues as ChatGPT does not always make sources clear. NOYB argues that it is able to bring the complaint in Austria as OpenAI's EU headquarters in Ireland does not control the data processing of ChatGPT and therefore the one stop shop does not apply.
The UK Competition Appeals Tribunal has rejected Meta's application for permission to appeal against a collective proceedings order brought by Dr Liza Lovdahl Gormsen. The opt-out CPO was certified in February 2024. It alleges that Meta abused its dominant position by combining personal data collected outside the Facebook Platform with data collected from Facebook to target advertising and generate additional revenue. Moreover, users had no option but to accept this. The CAT's ruling allows the CPO to proceed.
The Department for Business and Trade published a Smart Data Roadmap on 18 April 2024, setting out the government's plans in 2024-25 for using powers under the DPDI Bill to identify opportunities and challenges in implementing Smart Data schemes in seven sectors. The plan sets out priority sectors of banking, finance, energy and road fuels, telecommunications and transport, with retail and home buying identified as areas of further interest. Unsurprisingly, an initial priority is to complete the DPDI Bill itself. A further report is anticipated in a year's time although this will be after the General Election.
On 23 April 2024, the EDPB published its Annual Report for 2023 looking back at the work it did last year which includes producing guidelines and Article 65 decisions, the Article 66(2) Meta decision and enforcement co-operation, as well as its legislative consultation activities.
The UK's connectable products regime is now in effect as the Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) Regulations 2023 came into force on 29 April 2024. The government also updated its PSTIA Regs guidance to include additional guidance on the Statement of Compliance and automotive vehicles.
The ASA published a guide to privacy rules on 25 April 2024. This focuses on CAP Code rules for featuring an individual (whether a public person or a private individual) and private property in ads. It also makes reference to Section 10 of the CAP Code which deals with personal data issues.
The EDPB published its long-awaited Opinion on valid consent in the context of pay models implemented by large online platforms (LOPs) on 17 April 2024. The EDPB was asked to consider the 'pay or OK' model for behavioural advertising in the EU by the Dutch, Norwegian and Hamburg DPAs, particularly as used by LOPs, most notably Meta.
The EDPB has concluded that "in most cases it will not be possible for [LOPs] to comply with the requirements for valid consent, if they confront users only with a choice between consenting to processing of personal data for behavioural advertising purposes and paying a fee". The EDPB suggests that LOPs should not be offering a binary choice to consent to advertising or to pay for an ad-free service. Instead they should consider providing an equivalent alternative such as a free service with non-behavioural advertising, ie advertising which uses less personal data than behavioural advertising or none at all.
The reasoning behind this is the need for consent to be freely given. The EDPB stresses that consent cannot be freely given if users suffer detriment by not giving it or by withdrawing it. Detriment can include exclusion from major online services if data subjects choose neither to pay a fee nor to consent to use of their personal data and where an equivalent alternative is not offered. A genuine equivalent is likely to be available where:
The EDPB says personal data is not a tradeable commodity and that LOPs must not turn the fundamental right to data protection into something users have to pay for.
The EDPB underlines that when assessing whether or not consent has been freely given, conditionality, detriment, imbalance of powers and granularity should be taken into account. A fee cannot compel users to consent so controllers should assess on a case-by-case basis whether a fee is appropriate at all and at what amount the fee should be set in the circumstances. LOPs should also consider whether the data subject is likely to suffer detriment if they do not consent. Similarly, they should evaluate whether there is an imbalance of power between the controller and the individual – this should cover factors including the position of the LOP in the market, the extent to which an individual relies on the service, and its main audience. The EDPB is also concerned about transparency issues around behavioural advertising and warns LOPs that GDPR compliance goes further than complying with lawful basis requirements.
The Opinion includes elements to help assess consent against the GDPR standard in relation to 'pay or OK' but the EDPB also plans to develop guidelines on 'pay or OK' with broader scope.
Prior to the EDPB publishing its Opinion, IAB Europe issued a letter it had sent to the EDPB arguing that 'pay or OK' does not equate to making individuals pay for data protection rights. NOYB has welcomed the Opinion but stresses it only addresses LOPs.
Read more.
The European Parliament's Economic and Monetary Affairs Committee adopted its report on the proposed financial data space (FIDA). The Committee proposes a number of amendments to the Commission's draft including:
The EDPB published its Strategy 2024-2027 on 18 April 2024. It sets out priorities grouped around four pillars, with key actions per pillar:
The EDPB adopted Rules of Procedure, a public information note and template complaint forms to facilitate the implementation of the redress mechanisms under the EU-US Data Privacy Framework (DPF). The documents relate to two redress mechanisms for handling complaints by EU individuals concerning national security or commercial purposes, in relation to data transferred after 10 July 2023.
The European Parliament published the 'corrigendum' of the AI Act on 16 April 2024. This corrects errors and clarifies language. The text is unlikely to change further in any significant way although it was subject to review by Parliamentary committees at the end of last week.
On 15 April 2024, the ICO published guidance to improve transparency in health and social care. This is the final version of the guidance following a consultation earlier this year. The guidance covers:
The guidance includes a transparency checklist.
The ICO has published a third call for evidence on generative AI. This focuses on accuracy of training data and model outputs and closes at 5pm on 10 May 2024. The call looks at the meaning of accuracy in a generative AI and data protection context and the impact of accuracy as well as the link between purpose and accuracy. It also looks at the impact of training data on accuracy of output.
A discussion draft of a federal American Privacy Rights Act was published, although not formally introduced, on 7 April 2024. Previous attempts at a federal privacy law have failed but there are hopes this one, which has bipartisan support, may be more successful. Given the increasingly complex patchwork of privacy laws emerging across the US, there may be more political will to introduce a federal law although there is a long way to go.
The discussion draft proposes among other things:
The EDPS published his annual report for 2023 on 9 April 2024. It highlights key EDPS activities during 2023. The EDPS also gave a speech outlining three priority areas for this year:
The EDPS is also planning various activities to coincide with its 20th anniversary this year.
As reported last week, ENISA is preparing to vote on adoption of the EUCS cloud certification scheme. Under the latest draft from March, it appears that controversial EU localisation requirements have been dropped in favour of transparency provisions which include information on storage location and data processing methods. Once ENISA adopts the EUCS, the EC will issue implementing legislation under the Cybersecurity Act after which all Member States will be able to adopt the certification scheme should they wish to. They will be able to pass local laws making the scheme mandatory for specific types of data.
France has already started this process with its SREN Bill which concentrates on health and national security data. The Bill originally contained capital ownership requirements much like those originally in the EUCS. The SREN Bill no longer contains these on the face of the Bill but France is expected to introduce requirements by decree within six months of the SREN Bill's enactment.
The CNIL has published the final versions of its first AI data protection recommendations following a consultation. Seven "official recommendation sheets" cover how to:
A summary of the recommendations has also been published.
The CNIL will go on to publish further sheets on designing and training GDPR-compliant models, data recovery, the use of legitimate interests as a lawful basis, and data subject rights.
The European Parliament adopted its negotiating position on the draft Regulation laying down additional procedural rules relating to the enforcement of the GDPR on 10 April 2024. Among the changes to the European Commission's proposal are:
Once the Council adopts its own negotiating position, the draft Regulation will proceed to the trilogue stage.
On 10 April 2024, the European Parliament adopted the Regulation permitting continuation of certain derogations from the ePrivacy Directive under an Interim Regulation to combat child sexual abuse online. The original Interim Regulation is being extended to 3 April 2026 to allow time for the EU's CSAM Regulation to come into effect. The Council now needs to adopt the Regulation, which will then be published in the Official Journal.
The ICO has set out its priorities when it comes to protecting children online during 2024-25. The ICO's Children's Code strategy will focus on:
The ICO will work closely with Ofcom in its capacity as regulator of the Online Safety Act, and with international regulators. Its work will include information gathering, engagement and supervision as well as enforcement. It intends to publish a call for evidence in Summer 2024 to gather input from a wide range of stakeholders.
On 4 April 2024, the ICO signed up to the Global Cooperation Arrangement for Privacy Enforcement (Global CAPE), a new international multilateral agreement intended to facilitate cooperation in cross-border data protection and privacy enforcement. Other members include the USA, Australia, Canada, Mexico and Japan.
Amazon is appealing the €32m fine handed down to Amazon France Logistique by the CNIL relating to issues with a tracking system at one of its warehouses. The CNIL found the system was "excessively intrusive" – a finding with which Amazon strongly disagrees.
Google has reportedly settled a US class action lawsuit which alleged that Google continued to collect user data even when users were browsing in 'Incognito mode'. Google agreed to delete data collected from Incognito mode browsing and to enable Incognito users to block third-party cookies by default. Google reiterated its belief that the claim was "meritless" and said the data collected was technical data which did not identify individuals. The settlement must now be approved by the courts.
US retail wireless carrier AT&T has said personal data of 7.6m current users and 65.4m former users has been leaked on the dark web. The data is thought to be from 2019 or earlier. AT&T has not yet confirmed the source of the breach. It says that the leak has not materially affected users but has re-set millions of customer passwords.
The latest ENISA draft of the EU Cloud Service Scheme (EUCS) published in March 2024, which aims to set up a cyber security certification regime for cloud services, suggests the EU will drop the controversial sovereignty requirements that cloud service providers should only be operated by EU-based companies with no non-EU entity exerting effective control. Previous drafts had already watered these provisions down and they are absent from the March 2024 draft.
Meanwhile, the US Cyber Safety Review Board has recommended that all cloud service providers improve security and build resilience against attacks. It has also said it plans to develop guidance for major cloud service providers following a review of the 2023 Microsoft Exchange Online data breach which exposed the email addresses of US and EU government officials.
On 18 March 2024, the ICO published new fining guidance setting out how it decides to issue penalties and calculate fines. Publication follows consultation on a draft published in 2023. The guidance is intended to provide greater transparency for organisations. It sets out the legal framework which gives the ICO the power to impose fines, how the ICO will approach key questions including identifying a wider undertaking or economic entity, and the methodology of fine calculation using the following five steps:
The House of Lords European Affairs Committee issued a call for evidence on the existing arrangements for EU-UK data adequacy on 15 March 2024. It focuses on the existing arrangements and how effective they are, possible challenges to the regime, the implications of a disrupted or no-adequacy scenario taking into account proposals under the DPDI Bill, and lessons learned from other countries' adequacy systems.
The call closes at 12pm on 3 May 2024 and the Committee aims to report by July 2024. The European Commission is due to review the UK adequacy decision by June 2025.
The EDPB is expected to give its views on the 'pay or OK' subscription model used by Meta and others in mid-April. In the meantime, IAB Europe and others have written to the EDPB supporting the model and urging it to follow existing case law. Meta has said it plans to reduce its subscription fee from €9.99 to €5.99 per month for its ad-free version. One of the criticisms levelled at its model is that the cost of the ad-free version is arbitrary and too high so this looks like an attempt to allay those concerns.
The EU's Regulation on transparency and targeting of political advertising was published in the Official Journal on 20 March 2024. This applies to certain types of political advertising disseminated in the EU, brought into the public domain in one or more Member States, or directed at EU citizens, regardless of country of origin and the means used to publish. The Regulation will apply from 10 October 2024 with Articles 3 (definitions) and 5(1) (restrictions on providers of political advertising services) applying from the date of entry into force.
The US House of Representatives approved the Protecting Americans' Data from Foreign Adversaries Act. This prohibits data brokers from sharing sensitive personal data with an entity controlled by a designated 'country of concern'. This includes a company which is more than 20% owned by an entity in a designated jurisdiction.
On 2 February 2024, the ECJ ruled in a reference from Germany on issues around the processing of employee health data where the controller is a medical service that creates reports on insured persons' fitness for work.
The ECJ held that the exception to the prohibition on processing special data in Article 9(2)(h) (processing necessary for preventative or occupational medicine for the assessment of the working capacity of an employee) does apply where a medical examination body processes the data not as an employer but as a medical service, subject to other relevant conditions being met. Article 9(3) does not require a controller to ensure that no colleagues of the employee concerned are able to access the data. In addition, at least one of the Article 6(1) conditions (lawful basis) must be met.
The ECJ also said that Article 82(1) (compensation) must be interpreted to mean that financial compensation payable in accordance with that Article should be compensatory, not punitive or dissuasive, and that while fault on the part of the controller had to be established in order to award compensation, the seriousness of the fault need not be taken into account when awarding non-material damages.
The Interoperable Europe Act was published in the Official Journal on 22 March 2024. It introduces a co-operation framework for public administrations in the EU to facilitate cross-border exchanges of data. The intention is to ensure interoperable digital solutions and tools to help remove administrative burdens. The Act enters into force on 11 April 2024 and will largely apply from 12 July 2024, with limited provisions applying from 12 January 2025.
The UK government has attributed cyber attacks on the Electoral Commission and MPs to China-sponsored hackers. China has denied carrying out the attacks. Analysts have warned that cyber attacks linked to China are increasing and both the UK and US governments announced sanctions against Chinese companies and individuals they say are linked to the attacks.
The European Parliament approved the AI Act on 13 March 2024. The AI Act will now be formally adopted and published in the Official Journal coming into force 20 days later. Publication of the final text in the Official Journal is unlikely to take place before the end of April/early May. Prohibitions on unacceptable risk AI will apply six months after this. Rules on general purpose AI will apply after a year, and it will be two years before the full Act applies.
On 12 March 2024, the European Parliament adopted the Cyber Resilience Act. The CRA will set out provisions for the security of connected products. Obligations will apply across the supply chain but particularly to manufacturers. Products will be classified according to risk level with high-risk products having to go through independent conformity assessments. There are also requirements around software updates of relevant products.
The CRA will now be formally adopted before being published in the Official Journal and coming into force twenty days later. The majority of the provisions will apply three years after that with some coming in earlier.
The European Council and Parliament reached provisional political agreement on the Regulation to establish a common European Health Data Space on 15 March 2024. The agreed compromise text amends the initial proposal including by:
The Regulation is expected to be adopted later this year and will apply one year later.
On 11 March 2024, the House of Commons Joint Committee on National Security published the government's response to its consultation and subsequent recommendations on tackling ransomware. The Committee's press release was headed "Government's "ostrich strategy" in response to large and imminent national cyber-threat does not reassure". The government rejected key recommendations including:
The government has agreed to consider the Committee's proposals to reform the Computer Misuse Act but, rather than act "urgently" as suggested by the Committee, it has not stipulated a timeframe. It has, however, pointed to related actions it is taking including:
On 14 March 2024, the ECJ ruled in a reference from Hungary, that a Member State DPA can make an own-volition order to delete unlawfully processed personal data, even in the absence of a prior request made by the data subject. This is because the DPA has a duty to remedy GDPR infringement and a failure to act would mean the controller could retain the data and continue to process it unlawfully. This is true whether the data originated directly from the data subject, or from another source.
The Data Protection Act 2018 (Amendment of Schedule 2 Exemptions) Regulations 2024 were made on 7 March 2024. They amend the immigration exemption in paragraph 4 of Part 1 of Schedule 2 to the Data Protection Act 2018 following the Court of Appeal decision which held the previous version to be incompatible with Article 23(2) GDPR.
The Italian DPA, the Garante, gave OpenAI 20 days from 8 March to respond to a request for information on whether its AI video generator Sora will be made available to EU users, and to provide more information about it. The Garante wants to understand whether Sora complies with GDPR requirements.
Following ECJ rulings which effectively determined that the only lawful basis available to Meta for targeted advertising was consent, Meta and others switched to what is known as the 'pay or ok' model. This gives users the choice between an ad-free subscription model and agreeing to the use of their personal data for targeted advertising purposes in exchange for a free service.
The EDPB is already scrutinising the model following requests from Member State DPAs amid concerns that consent cannot be said to be freely given in this situation, and that the charging model is insufficiently transparent and potentially unfair. In the UK, the ICO has launched a call for views on the model. However, in the EU, challenges to the model directed at Meta come not only from data protection law but also competition and consumer protection law and the EU's Digital Services Act.
In the last couple of weeks alone:
In 2022, the Belgian Data Protection Authority held that the TC string used in IAB Europe's Transparency and Consent Framework constitutes personal data. It also found that IAB Europe was a joint controller of that personal data. IAB Europe appealed the decision before the Brussels Court of Appeal which made a reference to the ECJ. On 7 March 2024, the ECJ agreed with the Belgian DPA on these two issues. It also held that IAB Europe should not necessarily be viewed as a joint controller in relation to the subsequent data processing performed in pursuit of the TCF purposes. The case now returns to the Brussels Court of Appeal. IAB Europe acknowledged the decision. The Belgian DPA's initial decision has been suspended pending the outcome of the appeal. IAB Europe says it will publish a more detailed analysis of the ECJ ruling shortly. See our article for more.
On 11 March 2024, the European Data Protection Supervisor concluded that the European Commission had infringed key data protection rules when using Microsoft 365. It ordered the EC to suspend all data flows resulting from its use of Microsoft 365 to Microsoft, and to its affiliates and sub-processors located in countries outside the EU/EEA not covered by an adequacy decision. The EDPS also ordered the Commission to bring the processing operations resulting from its use of Microsoft 365 into compliance with Regulation 2018/1725 (which governs the processing of personal data by Union institutions and bodies). All remedial measures are to be applied by 9 December 2024.
In particular, the EDPS found that the EC had failed to introduce supplementary Schrems II measures in relation to its US data transfers. It had not carried out a transfer mapping exercise, had not ensured that destination countries had data protection measures equivalent to those in the EU, and had concluded SCCs without the appropriate transfer impact assessment or the authorisation from the EDPS required under Article 48(3) of Regulation 2018/1725. It was also found to have breached the purpose limitation principle and to have failed to include required contractual terms in its agreements.
On 7 March 2024, the ECJ handed down a preliminary ruling in a reference from Finland relating to oral disclosure of personal data. The ECJ was asked whether oral disclosure of personal data constitutes processing of personal data under the GDPR. The reference was made after Endemol Shine Finland's oral request to a Finnish court for information about criminal proceedings was denied on the basis that disclosure would involve processing personal data. The ECJ agreed that such disclosure would constitute processing of personal data. The ECJ was also asked about the balance between the interests of an individual subject to criminal proceedings and the public interest in having information about those proceedings. The court said that, given the sensitivity of data relating to criminal convictions, the fundamental rights of the data subject and the protection of personal data would be likely to prevail over the public interest in access to official documents, whether the requestor is a company or a private individual and whether the information is disclosed orally or in writing.
The Financial Conduct Authority has published a report on using synthetic data in financial services. The report is co-authored by the Synthetic Data Expert Group and the FCA. It is intended to act as a guide to creating and using synthetic data, focusing on different issues during the data lifecycle, including data augmentation and bias mitigation, system testing and model validation, and internal and external data sharing. Use cases cover fraud detection, credit scoring, open banking, anti-money laundering and authorised push payment fraud.
The ICO has published its response to Ofcom's first consultation on the Online Safety Act which focuses on protecting people from illegal harms online. The ICO has limited its comments to areas which interact with its data protection remit under current law, noting that the DPDI Bill may amend relevant law when it comes into force. The ICO emphasises that it is essential that users of online services have confidence that their privacy will be protected and while it is not opposed to the recommended content moderation measures in principle, it raises important points of alignment with data protection law.
It considers that the guidance does not currently provide sufficient regulatory certainty to enable services to make a confident assessment about whether content is communicated "publicly" or "privately", and it questions Ofcom's evidence base for concluding that factors such as encrypted messaging and anonymity/pseudonymity functionality are risk factors for illegal harm. It also raises concerns that the guidance could in practice deter services from deploying functionalities such as end-to-end encryption because they are deemed too risky under online safety law. Data minimisation is another area of concern: the ICO points to a potential lack of clarity about what personal data is needed to comply with Ofcom's guidance measures and recommends that the final guidance and measures ensure services are not incentivised to process more personal data than is needed.
Michael Tan and Thomas Kahl look at the evolving landscape of China's data export laws.
See full article here.
The ICO has published a second call for evidence on generative AI, focusing on purpose limitation in the AI lifecycle. The first consultation focused on lawful processing of scraped data to train generative AI.
The ICO highlights that the generative AI lifecycle is likely to involve processing different types of personal data for different purposes. Different organisations may have control at various points. Having a specified purpose at each stage allows an organisation to understand the scope of its activity, evaluate its compliance with data protection law, and evidence that compliance. Identifying the purpose will also support compliance with other data protection principles, including minimisation, lawfulness, fairness and transparency.
The UK Information Commissioner, John Edwards, speaking at the IAPP Data Protection Intensive, has outlined his priorities for 2024. Children's privacy and third-party advertising cookies are on the agenda, but he said "the biggest question on my desk" is AI. He highlighted the importance of complying with data protection law when developing, training and using AI, particularly generative AI, but also stressed the important role to be played by other regulators.
President Biden issued an Executive Order on 28 February 2024, directing the Department of Justice to draft regulations to restrict data brokers sending sensitive data of US citizens to foreign adversary countries. The Executive Order also takes steps to reduce data security risks with respect to telecommunications infrastructure, healthcare, and consumer protection among others. Countries of concern are expected to include Russia, China, Iran, North Korea, Cuba and Venezuela.
The EDPB announced on 28 February 2024 that it had begun its Coordinated Enforcement Action for 2024. This focuses on the right of access. 31 DPAs are taking part, including seven German state-level regulators. Participating DPAs can send questionnaires to targeted organisations, open formal investigations, and/or follow up on existing investigations. The results of the action will be analysed to help determine whether further action is required.
On 29 February 2024, the European Parliament adopted at first reading the Regulation on data collection and sharing relating to short-term accommodation rental services. The Council is expected to adopt the Regulation shortly, after which it will be published in the Official Journal.
The Regulation will require online platforms which facilitate short-term accommodation rentals to comply with registration and data sharing requirements in areas where a registration scheme exists. Member States will have to set up a single digital entry point for submission of data by the platforms about host activity on a monthly basis. This will include information about the number of nights of rental, number of guests, and information about the listing.
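Purely by way of illustration, and using hypothetical field names that are not taken from the Regulation itself, the kind of monthly activity record a platform might assemble for submission via a single digital entry point could look something like the following sketch:

```typescript
// Hypothetical sketch only: field names and structure are our own assumptions,
// not terms defined by the short-term rental Regulation.
interface MonthlyActivityReport {
  hostRegistrationNumber: string; // the host's number under the local registration scheme
  listingUrl: string;             // information identifying the listing
  reportingMonth: string;         // e.g. "2025-06"
  nightsRented: number;           // number of nights the unit was rented during the month
  guestsHosted: number;           // number of guests hosted during the month
}

// Example payload a platform might prepare before submitting it to the
// (hypothetical) Member State entry point.
const report: MonthlyActivityReport = {
  hostRegistrationNumber: "REG-12345",
  listingUrl: "https://example.com/listings/987",
  reportingMonth: "2025-06",
  nightsRented: 14,
  guestsHosted: 31,
};

console.log(JSON.stringify(report, null, 2));
```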
On 27 February 2024, the European Parliament adopted at first reading the Regulation on transparency and targeting of political advertising. The Regulation is expected to be adopted by the Council shortly and will then be published in the Official Journal. It applies to paid-for political adverts: explicit and separate consent will be needed before personal data collected from a data subject can be used to target them with such adverts, and special category data will not be able to be used for targeting. The rules will apply 18 months after the Regulation enters into force; however, the definitions and the measures on non-discriminatory provision of cross-border political advertising will apply 20 days after publication in the Official Journal.
The UK's ICO and the USA's Federal Communications Commission have signed a Memorandum of Understanding pledging to work together to protect people from unwanted marketing communications and the misuse of private and sensitive data. The MoU sets out how the regulators will share information and best practice including technical developments, to achieve these goals.
The ICO published guidance for employers on sharing staff personal data in a mental health emergency on 1 March 2024. The ICO stresses that employers can share necessary and proportionate information without delay with relevant and appropriate emergency services or health professionals but recommends planning procedures for doing so ahead of time. This guidance is part of the wider guidance being published by the ICO for employers.
On 1 March 2024, the ICO found that a Home Office pilot scheme to use GPS electronic monitoring of migrants breached data protection law because it:
The ICO said the Home Office failed, throughout the ICO's enquiries, to explain sufficiently why it was necessary or proportionate to collect, access and use people's information via electronic monitoring, including failing to evidence it had considered alternative measures.
The ICO's final Code of Practice on Data Protection and Journalism was published on 1 February 2024. It has been approved by the Secretary of State and came into force on 22 February. The Code provides guidance for journalists and media publications on data protection in the context of journalism. It looks at how to apply the journalism exemption, including by reference to the public interest and freedom of expression. The Code is a statutory code of practice: failure to comply is not in itself a breach of the law, but it can be taken into account by the ICO and the courts in relevant proceedings.
The ICO published new guidance on biometric recognition on 23 February 2024, which explains how data protection law applies when using biometric data in biometric recognition systems. It is aimed at organisations considering using such systems and at system providers, including vendors and developers. It covers the definition of biometric data, the uses of biometric recognition, how these involve processing special category biometric data, and the data protection requirements which will apply.
At the same time, the ICO announced it had issued an enforcement notice against Serco Leisure, requiring it to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO said Serco had been unlawfully processing the biometric data of more than 2,000 employees at 38 locations. Serco failed to demonstrate this was necessary and proportionate compared with less intrusive measures such as ID cards, and failed to offer employees a clear alternative to providing their biometric data. Due to the imbalance of power between the employees and Serco, there could be no valid consent, as the employees would have been unlikely to feel they could say no to the company's use of FRT.
The OECD has announced it has set up a new expert group on policy synergies in AI, data and privacy. The intention is to break down silos, in recognition of the rise of privacy enforcement actions focused on AI, to address concerns that data protection law may be hindering AI development, and to focus on synergies between these policy areas.
The McPartland Review of Cyber Security and Economic Growth is a government-commissioned independent review of cyber security as a driver of economic growth. Revenue generated by the sector has jumped from £5.7bn in 2018 to over £10.5bn in 2023, and the sector has created an estimated 20,000 jobs. The review has issued a call for views which focuses on how to facilitate further growth. The call closes on 28 March 2024.
AG Pikamäe has opined in a reference from Poland which relates to whether, and under what circumstances, a database containing personal data may be sold as part of enforcement proceedings (in this case a debt action) without the consent of the data subjects.
The AG opined that:
The ICO has approved a certification scheme for legal professionals who process personal data. Certification helps law firms and other legal professionals demonstrate compliance with data protection requirements and reassure those using their services. This is the fifth approved certification scheme. The others cover: offering secure re-use and disposal of IT assets, age assurance, children's online privacy, and training and qualification service providers.
The Irish High Court has allowed Max Schrems to join Meta's appeal against the Irish DPC's decision to prohibit Meta from transferring personal data to the USA under its Standard Contractual Clauses and supplementary measures, and the related €1.2bn fine for unlawful transfers. Both Meta and the Irish DPC opposed the application by Schrems but he was held to be uniquely and directly affected given he was the originator of the initial complaints against Meta.
The ICO has published guidance on content moderation and how organisations can respect information rights when moderating online content. The guidance is intended to help organisations caught by the Online Safety Act (OSA) comply with their data protection obligations when moderating content, although it also applies to organisations carrying out content moderation for other reasons.
The guidance defines content moderation as "the analysis of user-generated content to assess whether it meets certain standards; and any action a service takes as a result of this analysis". Definitions used in the OSA are followed but the guidance does not deal with OSA compliance. The guidance looks at how to assess and mitigate risks when carrying out content moderation, how to make sure use of personal information is fair and lawful, information requirements, data protection principles and data subject rights. It also covers issues including data transfers and automated processing, so it is wide-ranging.
The guidance may be updated once Ofcom completes relevant guidance and codes of practice.
An amended collective proceedings application brought by Dr Lovdahl Gormsen against Meta has been accepted by the Competition Appeal Tribunal. The original application was turned down when the CAT found the methodology for establishing loss on a class-wide basis was inadequate. The CAT found that the amended application articulates abuses by Meta which are arguable and triable. The proceedings relate to Meta's alleged abuse of dominance, which is said to have led it to impose terms and conditions on users unfairly requiring them to disclose unnecessary and disproportionate personal data in exchange for use of the service, as well as imposing unfair and anti-competitive terms and conditions.
The ECJ has given a ruling on a customer's claim for compensation for non-material damage after his personal data was disclosed to a third party due to employee error. The data was recovered and returned and not used by the third party. The ECJ held that the mere fact that a document is provided in error to an unauthorised third party is not sufficient to conclude that the controller failed to take appropriate technical and organisational measures to protect the data. Article 82 does not require the severity of the infringement to be taken into account when assessing damages, and Article 82(1) indicates damages should be compensatory, not punitive. In a situation where the unauthorised third party is not even aware of the wrongly disclosed personal data, there is no non-material damage simply because the data subject fears there may be disclosure or misuse of that personal data in future. As such, the ruling narrows the scope of the December 2023 judgment in VB v Natsionalna agentsia za prihodite, which found that fear of misuse of personal data by third parties following a GDPR infringement can give rise to a claim for non-material damages.
The EDPB has adopted an Opinion on the notion of main establishment and on the criteria for the application of the one-stop-shop mechanism. The Opinion clarifies the meaning of "main establishment". To qualify, it must be where the decisions on the purposes and means of data processing are taken and not just the place of central administration in the EU. If there is no decision-making function in the EU, there will be no main establishment for GDPR purposes and the controller will not be able to participate in the one-stop-shop mechanism.
The EDPB has published a statement on the legislative developments on the proposed Prevention of Child Sexual Abuse Material (CSAM) Regulation. The EDPB welcomes many of the amendments proposed by the European Parliament, including exempting end-to-end-encrypted communications from detection orders. However, the EDPB also says it regrets that some of the issues it and the EDPS flagged in their joint Opinion on the draft have not been fully resolved, in particular those relating to general and indiscriminate monitoring of private communications via detection orders. The EDPB is concerned that detection orders might be used against people unlikely to be involved in CSAM-related crimes and may not be limited to CSAM content.
DSIT has published an Introduction to AI Assurance. This is the first set of guidance in a series planned to help organisations and regulators with AI assurance and governance. The guidance includes a toolkit and sets out key actions to help organisations improve their understanding of AI assurance, implement effective internal governance processes, and engage with AI standardisation. Sector specific guidance on AI assurance will follow. This guidance sits within the DSIT AI Assurance portfolio.
DSIT and the AI Safety Institute (AISI) also published a notice (which includes case studies) setting out information about the safety evaluation techniques the AISI will use. The focus will initially be on AI misuse, societal impact and autonomous systems. The AISI is not considered to be a regulator but its work will inform UK policy making and provide technical tools to assist with governance and regulation.
Regulators in the frame on AI (including the ICO) have been sent letters from the Secretary of State and relevant Ministers asking them to set out their strategic approach to AI and the steps they are taking in line with expectations set out in the government's AI White Paper. They are asked to do this by 30 April 2024.
The Department for Science, Innovation and Technology (DSIT) has published a response to the consultation on the government's AI White Paper. The government's overall strategy has not changed as a result of the consultation. It confirms it will not be introducing AI-specific legislation, although it recognises this may be needed in future, particularly in relation to the most advanced (or highly capable) general-purpose AI. For now, the intention is to rely on sector-based regulation informed by the five (unchanged) AI principles: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.
Regulators including Ofcom, the ICO, the CMA, the FCA and the MHRA are to publish an outline of their regulatory approach by 30 April 2024. This should include:
The regulators will be supported by new guidelines and the AI Standards hub, and their obligations will be on a non-statutory basis. While the government noted there was support for a central AI regulatory function in the responses, it does not propose a single, central regulator but instead points to steps it is already taking to help regulator coordination.
One area where the government does propose to legislate is automated decision making. In order to simplify the current framework, the government says it will use the Data Protection and Digital Information Bill to expand the lawful bases on which solely automated decisions which have a legal or similarly significant effect on individuals can be made. It plans to replace the current Article 22 of the UK GDPR with new specific safeguards for automated decision making, including information requirements and redress and review rights for individuals. Automated decision making which produces a legal or similarly significant effect will be prohibited where special category data is being processed, unless the lawful bases of consent, contractual necessity or compliance with a legal obligation can be relied upon; in the latter two cases, the processing must also be in the substantial public interest.
The government also announced:
The response sets out a roadmap for 2024 actions although the government's ability to carry these out will be subject to the timing and outcome of the General Election.
Following its scrutiny of period and fertility tracker apps last year, the ICO has concluded there were no serious compliance issues or evidence of harms. It has, however, decided to remind all app developers to focus on privacy as its review showed a need for general improvements. The ICO stresses the need to be transparent, obtain valid consent, establish the correct lawful basis and be accountable. The ICO will shortly publish advice for app users on how to protect their privacy.
The ICO has published a Tech Horizons report which looks at the implications of eight technologies it believes will significantly impact our societies, economies and information rights in the next two to seven years. The report looks at the privacy implications of genomics, immersive virtual worlds, neurotechnologies, quantum computing, commercial drone use, personalised AI, next-generation search, and central bank digital currencies.
The CMA's latest report on Google's commitments relating to its Privacy Sandbox, covering Q4 of 2023, raises continuing competition concerns. The CMA says that Google has complied with the commitments, however, further progress is needed to resolve competition concerns before third party cookies can be phased out.
The Netherlands DPA has said that cookies will be a major focus for 2024. It will check websites more often to see whether they get appropriate consents to tracking cookies and ensure websites are not using misleading or confusing cookie notices.
Meanwhile, the CNIL has said its priorities for 2024 will include data collection in the context of the Olympics and Paralympics, the collection of children's personal data online, loyalty programs and digital receipts, and data subject access rights.
The government has tabled draft Regulations intended to bring the immigration exemption under the Data Protection Act 2018 into compliance with the law as required by the Court of Appeal. The Regulations will introduce a number of safeguards to ensure immigration exemption decisions are made on a case by case basis and in accordance with all applicable laws, and that data subjects are kept informed throughout. The ICO has welcomed the changes.
The Shanghai government has said it will introduce a new approval scheme to fast-track approval for the export of Chinese data by multinationals. The system will be exclusive to Shanghai, with the rest of the country continuing to follow existing transfer rules.
The Regulation for an EU common criteria-based cybersecurity certification scheme (EUCC) has been published in the Official Journal. This is the first cybersecurity certification scheme under the Cybersecurity Act and is intended to enhance the cybersecurity of ICT products, services and processes. Further schemes are planned on cloud services and 5G security. The voluntary scheme provides for two levels of assurance, substantial or high, depending on the risk associated with the intended use of the product, service or process being certified. It enters into force on 27 February 2024.
The ICO has called for organisations to take proactive action to make advertising cookies compliant with data protection law, notwithstanding a positive response to its warnings to top websites. In November 2023, the ICO wrote to 53 of the UK's top 100 websites, warning them that their use of advertising cookies was not compliant with data protection law and that they would face enforcement action if they failed to take action. The ICO says 38 of the organisations have since taken action to ensure compliance, a further four will reach compliance by the end of February 2024, and others are looking to develop alternative solutions including contextual advertising and subscription models. The ICO intends to provide further clarity on lawful implementation of these alternative models in the next month.
The ICO is preparing to write to the next 100 websites and is developing an AI tool to help identify websites using non-compliant cookie banners. However, the ICO urges non-compliant organisations to take proactive action rather than waiting for the ICO to contact them.
The ICO has launched a campaign to promote responsible sharing of children's data. The ICO is working with education, law enforcement and social services organisations to raise awareness about responsible data sharing in order to protect children from harm. The 'Think. Check. Share.' campaign demonstrates how data protection law can help organisations share children's information when required to safeguard children and young people. The ICO has also created a toolkit of resources, and partnered organisations can use the ICO logo to co-brand materials. The campaign follows September 2023's 10-step practical guide on sharing information to safeguard children.
The ICO is consulting on its draft Enterprise Data Strategy (EDS). Developed in line with the ICO25 strategic plan, the EDS outlines how the ICO will use its own data to inform and direct its corporate, regulatory and strategic priorities. Views are sought from business, the public sector, civil society and interested individuals. The ICO seeks to understand how data it holds should be made available in the public interest to help organisations innovate and mitigate risk. The deadline for responses is 12 March 2024 and the ICO plans to publish a final version of the EDS and a roadmap in May 2024.
The EDPB has published a website auditing tool to support controllers and processors, as well as DPAs, in conducting data audits. The free, open-source tool supports preparing, carrying out and evaluating audits and is compatible with other EDPB/EDPS tools.
ENISA has published a report which aims to contextualise the main design principles for engineering data protection in Common European Data Spaces. The report goes through two hypothetical use cases focused on pharmaceuticals and considers the role of data protection engineering, DPIAs and accountability.
The Dutch DPA, working with the French DPA, has fined Uber €10m for failing to disclose its data retention periods to EU drivers, and for making it unnecessarily complicated for drivers to request access to their personal data. The Dutch DPA noted that these "low impact" issues had since been fixed and dismissed the vast majority of the driver claims as unfounded.
The Italian DPA, the Garante, has sent OpenAI a notice informing it of alleged breaches of data protection law relating to ChatGPT. The Garante announced an immediate ban on ChatGPT and an investigation into OpenAI's GDPR compliance in March 2023. It said OpenAI did not have a lawful basis for processing such large amounts of personal data to train ChatGPT, and did not verify the age of users, thereby exposing minors to unsuitable answers. It also had concerns about transparency and data security following a data breach. While disagreeing with the Garante's findings, OpenAI temporarily disabled access to ChatGPT in Italy.
Other EU regulators also began to scrutinise OpenAI, and the EDPB set up a dedicated task force to foster cooperation and exchange information on possible enforcement actions by data protection authorities. The Garante subsequently lifted its ban subject to OpenAI making changes to its privacy practices, including around transparency, lawful basis and age verification for Italian users. The Garante now says it will take account of the work in progress by the EDPB task force; OpenAI can submit counterclaims within 30 days.
AG de la Tour has opined in a reference from Germany's Federal Court of Justice. The reference asked whether a failure to take appropriate measures under Article 12(1) GDPR to provide the information required by Article 13(1) is a breach of data subject rights and whether, consequently, a representative action brought by a consumer protection association based on such an infringement should be allowed to proceed in accordance with Article 80(2). The AG opined that such a breach could give rise to such proceedings. The underlying case is an action brought by a consumer protection association against Meta Platforms Ireland, alleging that it failed to provide fair processing information to users of free games available in its app centre.
The EU's AI Act has been approved by the Council of the European Union's Committee of Permanent Representatives (COREPER). The Act now moves on to adoption by the European Parliament with a committee vote scheduled for 13 February and the full plenary vote expected on 10 or 11 April. Following adoption by the Parliament, the AI Act will be formally adopted and published in the Official Journal, coming into force 20 days later.
The EU and Japan have signed a protocol to their Economic Partnership Agreement, agreed in October 2023. The protocol includes provisions on cross-border data flows with the aim of providing greater legal certainty that data flows between the two jurisdictions will not be hampered by unjustified data localisation measures.
The government has published a Call for evidence on the draft Cyber Governance Code of Practice. The Code sets out the critical governance areas directors need to tackle in order to protect their organisations. It is intended to help directors and senior leaders take appropriate steps to protect against cyber attacks. The Code has been developed together with the National Cyber Security Centre. It focuses on ensuring companies have detailed breach response plans which are robust and regularly tested, together with breach incident reporting processes. Organisations are also encouraged to prioritise cyber security and ensure employees are properly educated about risks and their mitigation.
Views are sought in particular about the design of the Code, how government can drive its use, and about the pros and cons of an independently assessed assurance process against the Code. The call closes on 19 March 2024.
The UK government has published its response to the call for views on software resilience and security for businesses and organisations. The government proposes a new voluntary code of practice for software vendors setting out the security obligations for organisations that sell software commercially. It will place responsibilities on organisations profiting from the sale of software (rather than on unpaid OSS developers or customers). Guidance will be provided to help suppliers, including for those supplying software used in high-risk situations eg by government or in critical infrastructure. Further resources will be provided to help with accountability and to help customers with software procurement and maintenance.
The Global Privacy Enforcement Network has announced it will carry out a privacy sweep between 20 January and 2 February, looking at deceptive design patterns (dark patterns) used by websites and apps to influence people into making potentially harmful choices relating to their privacy. The intention is that based on the outcome, participating regulators will organise awareness activities on the issue and, potentially, launch investigations and enforcement actions.
The EDPB has been asked by DPAs from Netherlands, Norway and Hamburg, to take a position on the 'pay or OK' model for targeted advertising – ie where users are given a choice between paying for an ad-free service, or allowing behavioural ads in return for a free service. Meta has announced a move to this structure following the EDPB's decision that it could not rely on legitimate interests or contractual necessity as a lawful basis for its targeted advertising.
The government is consulting on whether to amend the Licensing Act to allow digital identities and age assurance technology to play a role in age verification for alcohol sales in England and Wales, and when age verification should take place for distance sales. Part of this involves looking at whether measures to prevent the sale of other age-restricted products being sold to children online have been successful. The consultation closes on 30 March 2024.
The Digital Government (Disclosure of Information) (Identity Verification Service) Regulations 2024 have been made under the Digital Economy Act 2017, and will come into force on 8 February 2024. Their objective is to enable individuals to create a reusable digital identity in a convenient, secure and efficient way and to provide support for the administration and delivery of such services. Specified public authorities are permitted to check and share government-held personal data in order to make it easier for individuals to prove their identity when accessing digital services.
The European Commission launched an AI innovation package to support AI startups and SMEs. The package includes:
UK government publishes Generative AI Framework for government bodies
The government has published non-statutory guidance on using generative AI safely and securely, aimed at civil servants and others working in government organisations. The guidance looks particularly at compliance with data protection law and ethical considerations. It is based around ten principles and focuses on fairness, collaboration, education, the importance of human oversight, and safety and security.
The National Cyber Security Centre has published a report looking at the cyber threats posed by AI over the next two years. It concludes that AI will almost certainly increase the volume and impact of cyber attacks, but suggests the threat will be uneven. It notes that many threat actors already use AI and, as AI capability becomes more sophisticated, attacks will become more effective. The report analyses which threat actors are most likely to gain enhanced capability and the impact that is likely to have.
The ICO has updated its 2021 age assurance Opinion to:
The European Commission has published a report upholding the adequacy decisions for Andorra, Argentina, Canada, the Faroe Islands, Guernsey, Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay (11 of the current 16 adequacy decisions). In some cases, the Commission makes recommendations for further protections, but essentially finds these countries continue to provide an adequate level of protection to EU personal data.
The EDPB has published its report on its 2023 coordinated enforcement action which focused on the role of the DPO. 23 EEA Data Protection Authorities launched investigations and analysed over 17,000 responses. The EDPS also reported on the role of the DPO in internal EU institutions. The report concludes that the majority of DPOs said they have the necessary skills and knowledge to carry out their work, have suitable training and clearly defined tasks, adequate resources and the required independence. However, insufficient resources or knowledge and lack of independence were cited by some. As a result, the report makes recommendations for organisations, DPOs and DPAs. These centre on awareness-raising and incentivisation by DPAs and recommend organisations improve the safeguards to maintain independence, provide sufficient resources and training for DPOs, and institute clear reporting lines.
The EDPB has published a case digest of one stop shop decisions on data security and breach notification relating to Articles 32-34 GDPR. The case digest shows how DPAs have interpreted and applied these parts of the GDPR, offering summaries and analyses of the decisions.
New Jersey's Privacy Bill has become law and will take effect on 17 January 2025. The law will apply to organisations controlling or processing data of at least 100,000 individuals or holding data on at least 25,000 individuals while generating revenue from the information. Processing solely for the purpose of completing a transaction is explicitly excluded from scope.
The CNIL has fined Yahoo! €10m for consent failings in relation to cookies. The CNIL found that in 2020, around 20 advertising cookies were dropped on user devices whether or not consent had been given. The CNIL also found that users of Yahoo! Mail were informed they would lose access to the messaging service if they withdrew consent to cookies. As no alternative was given to accessing the service without consenting to cookies, the consent could not be said to be freely given.
The UK's ICO has launched a series of consultations looking at how aspects of data protection law should apply to the development and use of generative AI. The ICO is seeking views from a range of stakeholders including developers and users of generative AI, legal advisors and consultants, civil society groups and interested public bodies. The consultations will be on a series of chapters setting out the ICO's thinking on how to interpret specific requirements of the UK GDPR and Part 2 of the DPA 18 in relation to pressing questions including about:
The first chapter focuses on the lawful basis for web scraping to train generative AI models. Feedback is invited by 1 March 2024.
The EC has published a call for evidence on the application of the GDPR which closes on 8 February. As reported last week, the EDPB has completed its analysis, concluding the GDPR did not need to be revisited.
The ICO started 2024 as it no doubt means to continue. It fined HelloFresh £140,000 for sending over 79m spam emails and 1m spam texts in seven months. The ICO said the marketing messages were sent based on an insufficiently clear opt-in. The ICO also fined two home improvement companies a total of £250,000 for making illegal marketing calls including to people on the 'do not call' register.
The EU's Data Act was published in the Official Journal on 22 December 2023 and will come into force on 11 January 2024. It will apply broadly from 12 September 2025. The Data Act is intended to remove barriers to data sharing, give businesses access to data they contribute to creating, and individuals more control over all their data (not just personal data). It will empower users of connected devices to access and share data they generate with third parties as well as switch cloud and edge service providers. It also aims to protect SMEs by providing a harmonised framework in which data can be shared, equalising access to data across the single market.
While the Data Act will mostly apply from 12 September 2025, some elements will be introduced on other dates:
The UK's ICO has published guidance for organisations transferring personal data to the US under Article 46 transfer mechanisms (UK IDTA and UK BCRs). The guidance explains how the UK-US Data Bridge can help streamline the Transfer Risk Assessment process required when making transfers to the US under Article 46 (as opposed to under the Data Bridge itself).
Separately, on 8 January the CNIL published a consultation on its draft guidance on transfer impact assessments more generally.
The ICO is consulting on two further sets of draft guidance for employers and recruiters which will form part of the ICO's overhauled guidance on employment information. The draft guidance covers:
In mid-December, Google Maps announced that it will give users greater control over their location histories. Users will be able to review and more easily delete their location histories and searches via their location blue dot. Auto-delete will be set to three months by default rather than 18 months. New features will be rolled out over the course of 2024.
The Joint Committee on the National Security Strategy has published its findings and subsequent recommendations following its call for evidence on ransomware. The Committee's recommendations to the government include:
The government is required to respond to the report within two months.
The British Library fell victim to a ransomware attack by criminal group Rhysida at the end of October 2023. Not only did it result in personal data records and other files being placed for sale on the dark web, it also incapacitated the British Library's systems, with the digital catalogue and some other services remaining unavailable. While final costs are unconfirmed, reports suggest that the Library will use 40% of its cash reserves to resolve the issues.
On 11 December 2023, the Court of Appeal upheld the decision of the High Court that the immigration exemption in the Data Protection Act 2018 is unlawful. It found that the requirement to balance rights should be included on the face of the legislation rather than in non-binding policy documents and that there was a failure to comply with Article 23(2)(g) of the UK GDPR. The court has suspended the declaration of unlawfulness for three months from the date of the judgment to give the government time to amend the legislation.
The ICO has published an updated response to the DPDI Bill following its carry-over. The ICO continues to support the Bill but notes that some of the latest amendments amount to new policy which has not been subject to wider public consultation or line-by-line scrutiny in the House of Commons. While welcoming many of the changes, particularly regarding the ICO's own role and funding, the ICO does express reservations about the proportionality of new measures allowing the government to obtain certain information to help it detect benefit fraud.
The government launched a consultation in mid-December on protecting and enhancing the security and resilience of UK data infrastructure, particularly data centres and cloud service providers. There are concerns that the UK is not adequately protected against cyber attacks, or from risks presented by natural hazards such as flooding or other extreme weather.
The government proposes that organisations operating third party data centres, in particular those implemented to provide colocation and co-hosting data services, should be required to undertake or comply with the:
The consultation closes on 22 February 2024.
The Cybersecurity Regulation which sets out common cyber security standards at the institutions, bodies, offices and agencies of the EU, has passed its final hurdle following its adoption by the Council of the EU, and will now be published in the Official Journal. The Regulation puts in place a governance and risk management framework and extends the remit of the Computer Emergency Response Team for the EU institutions. It creates a minimum set of information security rules and standards.
The ECJ has ruled in a reference from Bulgaria on issues around controller liability for third-party cyber attacks and compensation for fear of misuse of personal data. The ECJ held that:
The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 came into force on 31 December 2023. They revoke and replace the definition of "Fundamental rights and freedoms" in the UK GDPR and DPA 18 to take account of the fact that the UK is no longer subject to the EU Charter on Fundamental Rights.
Google has reportedly agreed to pay USD 5bn to settle a class action lawsuit centred on the allegations that it misled users into believing they weren't being tracked online while using incognito mode. The settlement now has to be approved by a judge.
EasyPark Group, owner of RingGo and ParkMobile, reported to the ICO that it had suffered a data breach after hackers accessed its customer data. Around 950 UK users were involved but the majority of those affected are based in Europe.
The EDPB has published an opinion letter adopted in December 2023 in response to a request by the EC on the EC's cookie pledge voluntary initiative. The cookie pledge initiative was developed by the EC in response to concerns about 'cookie fatigue'. The EDPB was asked to give an opinion on whether the EC's proposals would be in line with the GDPR and the ePrivacy Directive.
The EDPB notes that the principles would ensure users receive concrete information on how their personal data is processed and about the consequences of accepting different types of cookies. Users would have greater control over their data as a result. The principles also state that consent should not be asked for within a year of it being refused which, the EDPB says, would be an important step to reducing cookie fatigue. It notes, however, that adherence to the cookie pledge principles would not necessarily mean an organisation had complied with the GDPR or the ePrivacy Directive and should not alter the competence of national data protection authorities.
The EDPB notes that it cannot give a blanket statement to the effect that offering a paid alternative to a service which involves tracking (as Meta is doing in the EU) would constitute valid consent. The EDPB suggests the principles make clear that a case by case assessment is required in such circumstances, in accordance with the ECJ judgment in Meta and others of 4 July 2023 (C-252/21, ECLI:EU:C:2023:537, paragraph 150).
References are made to various pieces of guidance around cookies and case law, including that legitimate interests is not a suitable lawful basis for cookies, and that an opportunity to 'reject all' non-necessary cookies should be given on the first level.
The EDPB takes issue with the suggestion in Principle F, that separate consent for cookies used to manage the advertising model selected by the consumer is not required. The EDPB underlines that consents must be requested for specific well-defined purposes and purposes should not be combined.
On the proposal that consumers should not be asked to accept cookies again for a period of one year, and the assertion that a cookie used to record negative consent is a "necessary" one, the EDPB says it understands this to relate only to the recording of a user's refusal or withdrawal of consent and recommends clarification. The EDPB also suggests a negative consent cookie should contain only generic information common to all users, such as a flag or code, and points out that gatekeepers subject to the Digital Markets Act already have to comply with rules on the frequency of asking for consent to cookies.
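As a purely illustrative sketch of the EDPB's point, and assuming a hypothetical cookie name and value which are not prescribed by the pledge principles or the EDPB, a refusal might be recorded in a cookie containing nothing more than a generic flag, retained so the user is not re-prompted for a year:

```typescript
// Hypothetical sketch only: cookie name, value and duration handling are assumptions.
function recordConsentRefusal(): void {
  const oneYearInSeconds = 60 * 60 * 24 * 365;
  // The value is a generic code common to all users: no identifier, no profile data.
  document.cookie = `consent-status=rejected; Max-Age=${oneYearInSeconds}; Path=/; SameSite=Lax`;
}

function hasRefusedConsent(): boolean {
  // Checks only for the generic refusal flag before deciding whether to re-prompt.
  return document.cookie.split("; ").includes("consent-status=rejected");
}
```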
The EDPB supports the use of software settings to enable users to express choice but says caution is necessary when aiming to use software settings to express affirmative consent as a default 'yes' would not satisfy GDPR requirements.
All in all, the EDPB recommends a series of clarifications be made to the draft principles.
Separately, the ICO published its November 2023 letter sent to 100 companies informing them that their cookie banners might not be compliant with UK data protection law. The letter has been published so other companies can understand how to address potential non-compliance.
The EDPB has adopted its contribution to the Article 97 report on the application of the GDPR. The EDPB considers that the GDPR has been successful and does not need revisiting. It highlights the need to adopt the currently draft Regulation dealing with cross-border enforcement of the GDPR and to reach more adequacy decisions. It warns that trade agreements must not be used to circumvent data protection law and also highlights the need for better resources for national regulators.
Look back on the main UK and EU-level developments in data and cyber security during 2023 in our 2023 roundup. Our update is packed with useful links and helpful summaries to remind you about the year that (almost) was, and help you start 2024 on top of things. You can access the full edition of Radar which also covers AI, tech, consumer protection, advertising, games and more here.
Provisional political agreement was reached on the AI Act on 8 December 2023 after lengthy and intense negotiations between the European Parliament, the Council of the EU and the European Commission. The legislation continues to take a tiered approach with a ban on certain types of 'unacceptable risk' AI, and the most onerous provisions relating to high-risk systems. Read more.
The ECJ has ruled on a reference from Germany that certain types of automated credit scoring and data retention practices breach GDPR rules on solely automated decision-making. The ECJ says credit scoring must be regarded as an automated individual decision prohibited under the GDPR unless subject to an exception – something for the national courts to determine. The ECJ also said it is unlawful for private credit rating agencies to keep data regarding discharge from debts for longer than the period permitted in relation to the relevant public insolvency record.
The EDPB has published its binding decision (taken earlier this autumn) which instructs the Irish Data Protection Commissioner to issue Meta with an EEA-wide ban on using tracking technologies for the purposes of behavioural advertising on the lawful basis of contractual necessity or legitimate interests. The Irish DPC adopted its final decision on 10 November 2023.
The European Commission has adopted a Communication on the creation of a common European mobility data space to facilitate access, pooling and sharing of data from existing and future transport and mobility data sources. The EMDS is also intended to help ensure a high level of cyber security in the sector. The space is intended to be fully operational by the end of Q1 2028.
The UK's Ministry of Justice has published its Cyber Security Strategy 2023-2028. This focuses on anticipated cyber threats relating to the MoJ and measures to combat those risks, including use of technology and skills training. The strategy is intended to improve the resilience of critical MoJ services and to embed security by design and instil a culture of cyber security.
The California Privacy Protection Agency is proposing updating the California Consumer Privacy Act regulations, potentially to increase fines and update provisions on dark patterns and data subject rights. It has also published proposals for its planned data broker registry under the Delete Act.
The European Parliament and Council reached agreement on the Cyber Resilience Act on 1 December 2023. This will introduce mandatory cyber security requirements for all hardware and software throughout the product lifecycle, taking a risk-based approach. Manufacturers will be required to implement security by design and provide support and updates to consumers for a period of time related to the anticipated lifespan of the product. They will also be subject to transparency and incident reporting requirements. The CRA will now be formally adopted and will enter into force 20 days after publication in the Official Journal. Manufacturers, importers and distributors of hardware and software products will then have 36 months to prepare for full implementation and 21 months in relation to incident and vulnerability reporting obligations.
Pending agreement of the CSAM Regulation, the EC is proposing to extend its 2021 Regulation on a temporary derogation from certain provisions of the ePrivacy Directive to combat child sexual abuse online to 4 August 2024. The Regulation enables the use by certain types of communications services of technology to process personal and other data to help detect, report and remove child sexual abuse material.
The ICO has published learnings from its investigation into 'text pests' – staff who use personal data of customers to contact them inappropriately. The ICO says while the problem does occur, it has also found examples of good practice by businesses and cites some examples. It has not found ongoing negligent behaviour from specific companies but has seen a good level of understanding on how to prevent the issue arising and what to do if it does.
The UK's National Cyber Security Centre (NCSC) has announced new voluntary global guidelines on secure AI system development. The guidelines were developed in association with the US and industry and have been endorsed by national agencies from 16 other countries, including G7 members. The guidelines are intended to help AI system developers embed cyber security by design into all stages of the development phase but extend across the product lifecycle to cover secure deployment, operation and maintenance. They are aimed primarily at providers of AI systems who are using models hosted by an organisation or are using APIs, but all stakeholders are urged to take them into account.
The data privacy campaign group NOYB has filed a complaint with the Austrian data protection regulator about Meta's new subscription model for ad-free services on its Facebook and Instagram platforms. Meta has started rolling out its ad-free service to EU users and argues that those who prefer to use the free version are consenting to their personal data being used for behavioural advertising. NOYB argues that this does not constitute valid GDPR consent and that the subscription model amounts to a "privacy fee".
The Bureau Européen des Unions des Consommateurs (BEUC) has filed a complaint with the network of consumer protection authorities (CPC) raising similar concerns from a consumer protection standpoint. It claims that Meta's new model engages unfair commercial practices, including due to allegations that:
The ECJ has ruled on two references from Germany and Lithuania, the German reference relating to the long-running Deutsche Wohnen case.
The ruling clarifies a number of issues relating to the imposition of GDPR fines including that:
The Data Protection and Digital Information Bill was reintroduced to Parliament on 8 November. During its passage through the Commons prior to its re-introduction, amendments were accepted in relation to clauses 1-7. The government has now tabled 124 pages of further amendments for consideration at report stage which is set to take place on 29 November. Among the amendments highlighted by the government are:
The government describes the changes as "common sense" but some are likely to prove politically contentious.
There are also proposals which depart further from the GDPR, including, for example, one tabled by Labour MP Chris Bryant to extend the right to object to solely automated processing which has a legal or similarly significant effect, so that it applies where a decision is "solely or partly" based on such processing. Non-government amendments are less likely to be accepted.
The EU's Data Act has cleared the European Parliament and Council. It is expected to be published in the Official Journal in early January 2024 and will apply 20 months after that. It aims to facilitate the sharing of data, in particular industrial and business data as well as personal data, in order to help individuals and businesses leverage the value of the data they help generate, and to level the data playing field.
DSIT is consulting on proposals to establish a smart data or open communications scheme in the UK telecoms market which would enable customers to request and obtain information relating to their services which could be shared with third parties with customer consent. This would cover information about usage statistics, price and speed. The CMA has published its response. It is in favour of this type of scheme and provides input and learnings from other smart schemes.
The NHS has announced it will launch a Federated Data Platform from spring 2024, to join up information currently held in separate NHS systems. The platform will bring together real time data such as the number of beds available, waiting lists, staff rosters and supply availability. Somewhat controversially in certain quarters, the NHS has awarded the software contract to a group led by Palantir Technologies UK, but the NHS has underlined that neither they nor the other companies involved in supplying the software will have access to patient data.
The EDPB has adopted provisional Guidelines on the technical scope of Article 5(3) of the e-Privacy Directive which will be finalised following a six-week consultation period. The guidelines are intended to clarify which technical operations, in particular new and emerging tracking techniques, are in scope of the Directive and provide greater legal certainty. The guidelines look at the key elements for the applicability of Article 5(3) and analyse the terminology used in more detail. They also include use cases to cover risk mitigation measures and solutions to ensure consent obligations are fulfilled.
Interestingly, the UK's ICO has also chosen this week to warn some of the UK's top websites that they face enforcement action if they do not make changes to their cookie notices and policies to bring them into compliance with the law. The ICO warns that websites must make it as easy to reject tracking technologies for behavioural advertising purposes as it is to accept them. This is best achieved by including a 'Reject all' button next to the 'Accept all' one. The ICO has written to 30 non-compliant leading websites giving them 30 days to make changes and will publish an update on this work in January 2024, which will include details of companies which have not addressed its concerns.
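By way of illustration only, and using element names and handlers of our own invention rather than anything prescribed by the ICO, a first-layer banner giving 'Reject all' the same prominence as 'Accept all' might be wired up along these lines:

```typescript
// Hypothetical sketch only: the ICO's point is parity of effort, not a specific implementation.
function renderCookieBanner(onChoice: (accepted: boolean) => void): void {
  const banner = document.createElement("div");
  banner.id = "cookie-banner";

  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all";
  acceptAll.addEventListener("click", () => { onChoice(true); banner.remove(); });

  const rejectAll = document.createElement("button");
  rejectAll.textContent = "Reject all";
  rejectAll.addEventListener("click", () => { onChoice(false); banner.remove(); });

  // Both buttons appear side by side at the first layer with identical styling,
  // so rejecting non-essential cookies takes no more clicks than accepting them.
  banner.append(acceptAll, rejectAll);
  document.body.appendChild(banner);
}
```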
The UK's ICO is consulting on a draft Addendum to approved EU Binding Corporate Rules (BCRs). The Addendum will comprise the EU BCRs, an addendum extending their scope to include UK Restricted Transfers and which forms the UK legally binding instrument, and a UK BCR Summary which provides information to Relevant Data Subjects (and for Processor BCR, Third Party Exporters).
The UK BCR Addendum can be used in two ways:
The Addendum is structured as an intra-group agreement but can be amended to be or become a joining agreement. ICO authorisation is required in all cases for the Addendum to create a set of UK GDPR binding BCRs.
The ICO has published draft guidance on transparency in the health and social care sector for consultation. The guidance is aimed at anyone in health and social care who is involved in delivering transparency information to the public and aims to help them understand what data protection transparency means for their organisations, how to develop effective transparency material, how to provide transparency and privacy information to patients, service users and the public, and the factors to consider when assessing levels of transparency. The consultation closes on 7 January 2024.
Provisional political agreement has been reached between the co-legislators on the EU's draft Regulation on data collection and sharing relating to short-term accommodation rental services and amending Regulation (EU) 2018/1724 establishing a single digital gateway to provide access to information, to procedures and to assistance and problem-solving services. The aim of the Regulation is to harmonise and improve the framework for data generation by short-term rentals (STRs) across the EU and enhance transparency. This is partly a result of the heavy burden placed on platforms for STRs and the multiple data requests they receive, especially where they operate cross-border.
The Regulation will now be formally adopted and will then be published in the Official Journal. There will be a two-year implementation period.
The UK's International Data Transfers Expert Council was set up in January 2022 to advise the government on its data transfer policy. It has produced a report submitted for consideration to DSIT setting out its findings based on its work since it was founded. The report contains the council's independent recommendations to the government and international community on facilitating multilateral solutions for sustainable and scalable data transfers. It promotes a risk-based approach based on accountability, flexibility and scalability. The council also makes a series of short-, medium- and long-term recommendations as to how to achieve global consensus on trusted transfers.
The ICO is seeking leave to appeal the judgment of the First Tier Tribunal that held the ICO had incorrectly applied the UK GDPR in relation to its findings that Clearview AI had breached the UK GDPR. The Tribunal agreed with Clearview that the activities in question did not fall within the scope of the legislation. The ICO's view is that Clearview itself was not, as it contended, processing for foreign law enforcement purposes and should not, therefore, be shielded from enforcement under the UK legislation.
The European Parliament has adopted its first reading position on the Data Act following political agreement on the text reached in June 2023. The Council will now adopt the Data Act after which it will be published in the Official Journal and come into force 20 days later. It will apply 20 months after that.
Euractiv has reported that negotiations on the AI Act have hit a potential roadblock as disagreements have arisen. While good progress had earlier been reported on classification of foundation model AI and biometric facial recognition technology, Euractiv suggested that France, Germany and Italy were moving to prevent tiered regulation (or potentially any regulation) of foundation models. There are even reports that some Member States are now opposing the AI Act in its entirety on the basis that it is overregulating the technology. The next trilogue in early December was intended to be the final one but if agreement is not reached, the Spanish presidency is unlikely to do anything further on the file, passing it to the incoming Belgian presidency. There will then be only a few weeks before the European Parliament is dissolved pending elections in June 2024.
The Data Protection and Digital Information (No. 2) Bill has been held over and will be re-introduced (without the 'No. 2').
More controversially, the government announced the Investigatory Powers (Amendment) Bill. The Bill will make a small number of targeted changes to the Investigatory Powers Act 2016 including:
The second trilogue on the Cyber Resilience Act reportedly resulted in progress but revealed two main outstanding areas of disagreement between the co-legislators. There is disagreement over whether security incidents should be reported to ENISA or to Member State CSIRTs, and in which order. The definition of what constitutes a critical product requiring conformity assessment by certified auditors is also being debated.
The ICO and the EDPS have signed an MoU establishing a framework for cooperation between them on the application of data protection law. The MoU sets out what information might be shared with the goal of improving best practice and supporting regulatory efforts as well as cooperating on projects of mutual interest.
The Norwegian data protection regulator has confirmed it is taking part in a European-level assessment of Meta's new subscription ad-free model. Some regulators have expressed doubts about the new model, concerned that the choice between payment and non-payment does not equate to GDPR-level consent to behavioural advertising.
These Regulations revoke and replace Article 4(28) of the UK GDPR and s205(1A) of the DPA and other provisions which relate to the meaning of references to fundamental rights and freedoms in data protection legislation. This is in order to make the definition of rights and freedoms relate to the European Convention on Human Rights within the meaning of the Human Rights Act 1998, rather than to the EU Charter of Fundamental Rights. References relating to the right to data protection are also being removed as this right is not expressly included in the Convention. The Regulations will come into force on 31 December 2023.
The AI Safety Summit hosted by the UK at Bletchley Park took place last week. Notable developments include:
We discuss the summit in more detail here.
US Executive Order and AI Safety Institute
The full text of President Biden's Executive Order on Safe, Secure and Trustworthy Development and Use of AI (see last week's news) has now been published. Vice President Harris also announced a range of commitments and policy developments at the summit, including the establishment of an AI Safety Institute intended to operationalise NIST's AI risk management framework by creating guidelines, tools, benchmarks and best practice recommendations to identify and mitigate AI risk. It will also enable information sharing and research, including with the UK's planned AI Safety Institute. The VP also announced draft policy guidance on US government use of AI, and the US made a political declaration on the responsible military use of AI and autonomy.
UN sets up global AI advisory board
The UN announced the launch of a high-level advisory body on AI. This is a multi-stakeholder body intended to undertake analysis and make recommendations for international governance of AI. The 38 participating experts are made up of government, private sector and civil society stakeholders. They will consult widely to "bridge perspectives across stakeholder groups and networks".
Other initiatives
These include:
Advocate General Collins has given an opinion in a joined reference from Germany on the relationship between the theft of personal data and identity theft or fraud. The AG opined that theft of personal data does not, in itself, constitute identity theft or fraud. The theft may give rise to a right to compensation for non-material damage, and that right does not depend on identity theft or fraud subsequently occurring as a result of the theft of the data. Compensation for non-material damage must be assessed on a case-by-case basis taking all relevant circumstances into account.
French MP Philippe Latombe is appealing the EU General Court's dismissal of his action to annul the EU-US Data Privacy Framework. Mr Latombe, who sits on the CNIL, is bringing the action in his personal capacity.
The ICO has published a toolkit on data sharing with law enforcement. This is intended to help SMEs and sits alongside existing more detailed guidance on the issue and the ICO's code of practice on data sharing.
The Online Safety Act (OSA) received Royal Assent on 26 October 2023. Ofcom's powers have come in immediately but most of the rest of the provisions will be brought into force in two months' time. The OSA focuses on user-generated content (subject to limited exceptions) and applies to user-to-user services and search services as well as pornographic content services. The OSA regulates illegal content and certain specified types of harmful content, in particular, for services likely to be accessed by children, content that is harmful to them. Terrorism and Child Sexual Exploitation and Abuse (CSEA) content are a particular focus, but a range of harmful content is also covered in specified circumstances. In relation to the most harmful type of content likely to be accessed by children, age verification/estimation must be used (subject to a limited exception).
The OSA applies to services which have links to the UK. Various safety duties apply to different categories of content. In order to establish what services need to do, they have to carry out a variety of risk assessments against Ofcom risk profiles. Service providers also have transparency requirements and obligations to provide redress. There are wider duties to protect freedom of expression and the right to privacy including personal data.
Category 1 services (to be determined, but likely to be the larger social media services) and Category 2A and 2B services have additional duties. In particular, Category 1 services have expanded duties to protect fundamental freedoms including content of democratic importance and news publisher content. They also have to comply with adult user empowerment provisions which require them to give adult users options to prevent them encountering certain types of harmful content.
Ofcom is the regulator of the OSA. It has extensive powers and duties. It is responsible for producing initial risk profiles and a raft of codes of practice and guidance which will inform how service providers are supposed to comply with the OSA, as well as a range of reports on the impact and operation of the OSA. The process of introducing these (as set out in Ofcom's revised approach to implementing the OSA) is likely to take at least three years, with everything subject to consultation and much of it dependent on the introduction of secondary legislation. Ultimately, Ofcom will have a range of enforcement powers, including the ability to fine organisations up to the higher of £18m or 10% of global annual turnover. The OSA is very wide-ranging and Ofcom estimates that around 100,000 online services could be in scope.
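By way of illustration only, the "higher of" cap can be expressed as a simple calculation; the turnover figures below are hypothetical and not drawn from any Ofcom decision:

```python
def osa_max_fine(global_annual_turnover_gbp: float) -> float:
    """Illustrative sketch of the Online Safety Act fining cap described
    above: the higher of £18m or 10% of global annual turnover."""
    return max(18_000_000, 0.10 * global_annual_turnover_gbp)

# Hypothetical examples: a smaller service is capped at the £18m floor,
# while a large platform's cap is driven by its turnover.
print(osa_max_fine(50_000_000))     # 18,000,000 (the £18m floor applies)
print(osa_max_fine(2_000_000_000))  # 200,000,000 (10% of turnover)
```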
UK
The Prime Minister announced the world's first AI Safety Institute to advance knowledge of AI safety, evaluate and test new AI and explore a range of risks. In his speech, the Prime Minister also re-iterated the UK's approach to regulating AI set out in its AI White Paper. DSIT published a discussion paper to support the Summit and published a report evaluating the six-month pilot of the UK's AI Standards Hub. In addition, leading frontier AI firms responded to the government's request to outline their safety policies.
USA
President Biden has issued an Executive Order on safe, secure and trustworthy AI (EO). The EO requires:
The EO also calls on Congress to pass bipartisan data privacy legislation and directs the following actions:
Further directions cover the areas of:
G7
The G7 leaders have agreed International Guiding Principles for all actors in the AI ecosystem and an International Code of Conduct for developers of advanced AI systems as part of the Hiroshima AI process.
The guiding principles document is intended to be a 'living document' building on the existing OECD AI principles. It currently sets out 11 non-exhaustive principles to help "seize the benefits and address the risks and challenges brought by AI". They are intended to apply to all AI actors when and as applicable, covering the design, development, deployment and use of advanced AI systems. They include commitments to mitigate risks and misuse and identify vulnerabilities, to encourage responsible information sharing, reporting of incidents and investment in security, and to create a labelling system to enable users to identify AI-generated content.
The G7 suggests organisations follow the voluntary Code of Conduct which sets out a list of actions to help maximise benefits and minimise risks of advanced AI systems with actions for all stages of the AI lifecycle.
The latest round of trilogues on the EU's draft AI Act was held on 24 October 2023. Agreement was reportedly reached on provisions for classifying high-risk AI applications and on general guidance for using enhanced foundation models. The next round of trilogues is planned for 6 December and may cover prohibitions and law enforcement use which were not resolved last week.
In the meantime, the EDPS has published an Opinion on the AI Act setting out its final recommendations as the Act nears completion. Much of the Opinion relates to the EDPS's role as notified body, market surveillance authority and competent authority for the supervision of the provision or use of AI systems, in respect of which it asks for a number of clarifications. The EDPS also calls for privacy protections to be at the forefront of the legislation and for individuals to have a right to lodge complaints about the impact of AI systems on them, with the EDPS explicitly recognised as competent to receive such complaints alongside DPAs. The EDPS recommends that DPAs be designated as the national supervisory authorities under the AI Act, cooperating with authorities that have specific expertise in deploying AI systems.
The Department for Science, Innovation and Technology has published an executive summary and its initial conclusions from the first phase of an evaluation of the International Data Transfer Agreement (IDTA) and the Addendum to the EU SCCs. These transfer mechanisms replaced the original GDPR Standard Contractual Clauses as a lawful mechanism under which to transfer personal data to third countries.
The evaluation concluded there was a considerable difference between awareness and implementation of the transfer mechanisms of larger versus smaller organisations. Smaller organisations tended to be less proactively engaged with data protection issues and unaware of the IDTA.
Possible action points identified include awareness raising, implementation monitoring and evaluation of the wider impacts of uptake.
The ICO has published a short blog on how data protection law can be used to share criminal offence data to prevent or detect crime (particularly shoplifting) while complying with principles of necessity and proportionality. The blog contains examples of what may or may not be appropriate and looks to be targeted at smaller retailers. It coincides with the publication of the government's Action Plan to tackle Shoplifting.
The Department for Science, Innovation and Technology has published an explanatory memorandum to the Data Protection and Journalism Code of Practice. It sets out the procedure under which the code was developed and the process for it to gain statutory force. The draft code was presented to the Secretary of State in July 2023. Once it has been laid and completed the parliamentary procedure (40 days for sifting), it will gain statutory status, at which point it can be relied upon in legal proceedings and will carry more weight than 'guidance'.
The Irish High Court has granted TikTok Technology Ltd leave to appeal the Irish Data Protection Commission's decision to fine it €345m for breaches of the GDPR. TikTok is also appealing the EDPB's decision which the Irish DPC was required to follow.
The EDPB has agreed with the Norwegian Data Protection Authority and directed the Irish Data Protection Commissioner to permanently ban Meta from carrying out behavioural advertising in the EU on its Facebook and Instagram platforms based on legitimate interests. In July 2023, Norway imposed a ban on Meta from carrying out behavioural advertising in Norway using tracking technology on the grounds of legitimate interests, following an ECJ ruling which said this was unlawful. The Norwegian DPA referred the matter to the EDPB, asking it to extend the ban to the rest of the EU until such time as Meta moves to a consent model for behavioural advertising.
Meta has already announced it will move to offering users a paid-for ad-free subscription model as an alternative, arguing that it will then be able to rely on consent as the lawful basis for ad targeting in its free model. The Norwegian DPA said "The Danish Data Protection Authority also strongly doubts whether Meta's proposed consent solution, which means that those who do not consent to behaviour-based marketing must pay a fee, will be legal".
The Court of Appeal has upheld the High Court's ruling that the UK's ICO is not required to reach a definitive decision on the merits of individual complaints but has broad discretion. The CA said the ICO is only required to handle and investigate complaints to an appropriate extent under Article 57(1)(f) UK GDPR. The ICO welcomed the CA's ruling that it had acted lawfully over a SAR complaint and its confirmation that the ICO has broad discretion in deciding the extent to which it investigates each complaint and is entitled to reach and express a view on it without necessarily determining whether or not there has been an infringement.
The EDPB has announced that its 2024 coordinated enforcement action will focus on the way controllers implement the right of access. Further work will now be carried out to specify the details and the action will be launched in 2024.
Clearview AI was fined £7.5m by the UK's ICO in May 2022 in relation to its scraping of the images of UK individuals to power its facial recognition database. The First Tier Tribunal has now found that the ICO had no jurisdiction to issue its enforcement and penalty notices on the basis that the UK GDPR (and GDPR) did not apply to the processing at issue. Clearview AI succeeded because it argued successfully that it is a foreign company providing its service to foreign clients, using foreign IP addresses, in support of the public interest activities of foreign governments and government agencies, in particular their national security and criminal law enforcement functions, which are targeted at behaviour within their own jurisdictions and outside the UK. It will be interesting to see whether Clearview AI has similar success in any appeals against fines imposed by EU supervisory authorities.
The EC has published a draft implementing Regulation setting out rules for the application of the Cybersecurity Act for the European Common Criteria-based cybersecurity certification scheme (EUCC). Once adopted, it will apply to all information and communications technologies which are submitted for certification under the scheme and is therefore relevant to ICT organisations operating in the EU.
The EDPB and EDPS have published a Joint Opinion on the proposed Regulation on the digital Euro. They are broadly supportive but make a number of recommendations to better ensure data protection standards. These include recommendations to:
The DHSC has published the final version of its data access policy update setting out its policy decisions regarding Secure Data Environments (SDEs) for secondary uses of NHS data. A consultation outcome document explains the changes made in response to its consultation. These include:
The DHSC has committed to providing more information in several areas including on what data will be made available.
The EU General Court has rejected an application by French MP Philippe Latombe for interim relief to suspend the execution of the EU adequacy decision in favour of the EU-US Data Privacy Framework. The Court said the applicant had failed to prove the required individual or collective harm caused by the decision.
23andMe has reported a leak which reportedly led to the data of 1m people of Ashkenazi Jewish descent being placed on the dark web. The data included first and last names, sex and a genetic evaluation of their origins. 23andMe says it does not believe it was hacked but that the attackers were able to obtain user logins by reusing already compromised credentials. It is thought other lists may also have been compiled, for example of users with Chinese origins.
The French data protection regulator, the CNIL, has published a series of guidelines for using AI in a data protection compliant way. They cover a range of topics including selection of lawful basis, data protection by design and when to carry out a DPIA.
The UK's Financial Conduct Authority (FCA) has fined Equifax Ltd. £11m for failing to manage and monitor the security of UK consumer data it had outsourced to its US parent company. The 2017 Equifax Inc. data breach compromised the personal data of approximately 13.9m UK individuals. The FCA said the data breach was "entirely preventable". Equifax Ltd. had failed to treat its parent company as an outsourcer and consequently failed to put protections in place for the data. There were known weaknesses in Equifax Inc.'s data security systems which Equifax Ltd. did not address. Equifax Ltd. did not find out about the breach until six weeks after it had happened and five minutes before it was publicly announced and consequently was not prepared to deal with customer complaints. The FCA also said that public statements were misleading, customer complaints were not properly dealt with and notifications were not given as required. The final fine was discounted by 30% after Equifax agreed to resolve the matter. Equifax also received a 15% mitigation credit. The ICO fined Equifax £500,000 in relation to the same data breach in 2018.
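Purely as an illustrative sketch of how two percentage reductions of this kind compose, and assuming a hypothetical base figure and that the mitigation credit is applied before the settlement discount (both assumptions, not details taken from the FCA's notice):

```python
def apply_reductions(base_penalty: float, mitigation_credit: float,
                     settlement_discount: float) -> float:
    """Apply a mitigation credit and then a settlement discount to an
    assumed base penalty. The base figure and the ordering are
    illustrative assumptions, not the FCA's published calculation."""
    after_mitigation = base_penalty * (1 - mitigation_credit)
    return after_mitigation * (1 - settlement_discount)

# Hypothetical base figure chosen purely for illustration.
print(round(apply_reductions(18_750_000, 0.15, 0.30)))  # 11156250, roughly £11.2m
```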
TikTok has filed an appeal against the EDPB Article 65 decision which the Irish Data Protection Commissioner was required to take into account when deciding to fine it for breach of data protection law. It is appealing the Irish DPC's decision separately in the Irish High Court.
OpenAI has reportedly introduced age verification measures for Italian users of ChatGPT in response to directions from the Italian data protection supervisory authority, the Garante. The Garante blocked ChatGPT in March and then agreed a number of required measures with OpenAI. These were originally required by the end of September, a deadline subsequently extended by 60 days. The new age verification tool allows users to verify their age either through AI-powered facial analysis, by uploading documents together with a photo, or by using a digital identity product. If the user is aged 13-17, a parent or guardian is supposed to go through the process. Full details are not yet available.
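As a minimal, purely hypothetical sketch of how a flow with those three routes and a parental step for 13-17 year olds might branch (the function and labels below are assumptions; OpenAI's actual implementation has not been published in full):

```python
VERIFICATION_METHODS = {"facial_analysis", "document_upload", "digital_identity"}

def route_age_verification(method: str, age: int) -> str:
    """Illustrative routing only: three verification routes, access refused
    under 13, and a parent or guardian step for users aged 13-17."""
    if method not in VERIFICATION_METHODS:
        raise ValueError(f"unsupported verification method: {method}")
    if age < 13:
        return "access refused"
    if age < 18:
        return "parent or guardian must complete verification"
    return "access granted"

print(route_age_verification("facial_analysis", 16))
# parent or guardian must complete verification
```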
The EDPS has published an Opinion on the European Commission's draft AI Liability Directive (AILD), ie the draft Directive on adapting non-contractual civil liability rules to AI. The EDPS recommends:
The ICO has put out a call for applications for its Regulatory Sandbox programme for 2024. Its current area of focus is on biometric processing, emerging technologies, and exceptional innovations. Expressions of interest will be assessed on the basis of whether the product or service being developed is innovative and could provide a demonstrable benefit to the public. Applications must be submitted by 31 December.
As part of the National Cyber Strategy and following a public consultation, the government has published an updated version of its code of practice which sets out the minimum security and privacy requirements for all app store operators and app developers. The original version of the code was published in December 2022 on a voluntary basis with a nine-month implementation period for operators and developers. In May 2023, DSIT consulted on progress and concluded that additional clarifications were needed for some provisions. As a result, the implementation period is being extended by six months to March 2024. DSIT will then review adherence levels and make recommendations to the Secretary of State on next steps. The voluntary code of practice is intended to supplement but not to replace pre-existing legal obligations and is tailored to data breaches in the context of app stores.
California's Governor has signed the so-called Delete Act into law. It requires data brokers to register with the California Privacy Protection Agency which will be required to develop a one-stop-shop mechanism to allow consumers to request deletion and tracking of their personal data. This will need to be set up by 1 January 2026 and from 1 August 2026, data brokers will be required to process deletion requests within 45 days. Some 500 data brokers are thought to be operating in California.
The ICO has issued Snap Inc and Snap Group Limited with a preliminary enforcement notice over a potential failure to properly assess the risks to privacy posed by the AI chatbot 'My AI' deployed on Snapchat. My AI is powered by ChatGPT. It was rolled out to Snapchat+ users in February 2023, and then to the wider Snapchat user base in April 2023.
The ICO provisionally finds that Snap failed to adequately identify and assess the risks to several million My AI users including children aged 13-17. It found that the risk assessment carried out before launch did not adequately assess the data protection risks posed by the generative AI technology, particularly to children. The preliminary notice sets out steps the ICO may require if a final enforcement notice is adopted. These measures are subject to Snap's representations in response to the preliminary notice. They may, however, require Snap to stop processing data in connection with My AI and to stop offering it to users pending an adequate DPIA.
The ICO will now consider representations by Snap pending a final decision and no conclusion can be drawn as to whether Snap has breached data protection law or whether a final enforcement notice will eventually be issued.
The ICO has published draft Data Protection Fining Guidance for consultation. The guidance is intended to replace parts of the ICO's Regulatory Action Policy (RAP) on its approach to fining. It sets out the legal framework underpinning the ICO's powers to impose fines, the circumstances in which the ICO would consider it appropriate to issue a penalty notice, as well as factors which will influence how the fine is calculated.
The RAP will remain applicable to:
What is an "undertaking"?
The ICO focuses on what constitutes an "undertaking" for the purposes of issuing fines and proposes that where a controller or processor forms part of an undertaking (for example, as a subsidiary), maximum fines will be based on turnover of the undertaking as a whole. While the UK GDPR and Data Protection Act 2018 (DPA) do not define what constitutes an "undertaking" in the context of imposing fines, the ICO says the recitals to the UK GDPR are clear that the term should be understood in accordance with UK competition law. As such, an undertaking does not, in this context, correspond with the commonly understood notion of a legal entity or company under eg English commercial or tax law, but may comprise one or more legal or natural persons forming a 'single economic unit' rather than a single entity characterised as having a legal personality.
Whether or not an individual controller or processor forms part of a wider undertaking depends on whether it can act autonomously or whether another legal or natural person, for example, a parent company, has decisive influence over it and therefore forms part of the same economic unit. The ICO will consider all relevant factors but there will be a rebuttable presumption of decisive influence where a parent company owns all or nearly all the voting shares in a subsidiary.
Linked processing operations
The ICO also explains the approach to fines where there is more than one infringement by a controller or processor, ie where the infringements arise from the same or linked processing operations. This will be decided on a case-by-case basis. Where processing operations or sets of operations form part of the same overall conduct, the controller or processor may infringe more than one provision of applicable law. The ICO will consider all relevant circumstances but relevant factors are likely to include the extent to which the processing operations or sets of operations are:
Where the ICO finds overall conduct has infringed more than one provision, the ICO will identify the statutory maximum applicable to the most serious individual infringement. The ICO may decide to impose a fine for each infringement arising from the same or linked processing operations provided the sum of those penalties does not exceed the statutory maximum for the gravest infringement.
Conversely, an investigation may identify that different forms of conduct by a controller or processor have infringed separate provisions of the UK GDPR or DPA without being sufficiently linked. In such cases, the ICO may decide to include the separate infringements in the same penalty notice; however, each infringement would be subject to its own statutory maximum amount, so the total could exceed the maximum amount for the gravest single infringement.
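A short sketch of the capping logic described above, using hypothetical fine amounts and statutory maximums (the draft guidance contains no worked example of this kind, and the "gravest infringement" is approximated here simply as the one carrying the highest applicable maximum):

```python
def total_fine_linked(fines: dict[str, float], maximums: dict[str, float]) -> float:
    """Same or linked processing operations: the sum of the individual fines
    is capped at the statutory maximum for the gravest infringement
    (approximated here as the highest applicable maximum)."""
    cap = max(maximums.values())
    return min(sum(fines.values()), cap)

def total_fine_separate(fines: dict[str, float], maximums: dict[str, float]) -> float:
    """Separate, unlinked infringements in one penalty notice: each fine is
    capped at its own statutory maximum, so the total can exceed the
    maximum for the gravest single infringement."""
    return sum(min(amount, maximums[name]) for name, amount in fines.items())

# Hypothetical figures for illustration only.
fines = {"transparency": 10_000_000, "security": 12_000_000}
maximums = {"transparency": 17_500_000, "security": 8_700_000}
print(total_fine_linked(fines, maximums))    # 17500000 (sum capped at the highest maximum)
print(total_fine_separate(fines, maximums))  # 18700000 (each fine capped individually)
```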
The consultation is open until 27 November 2023.
On 5 October, the German Federal Cartel Office (Bundeskartellamt) issued its decision on the antitrust investigation of Google for alleged anti-competitive data processing practices, accepting the commitments made by the company and closing the investigation. In particular, the commitments require Google to offer users the opportunity to give their voluntary, specific, informed and unambiguous consent to the processing of their data across different services or to combining their data with third party data. Google must provide users with appropriate choices when combining data. The design of the user choice architecture should avoid manipulative direction to accepting cross-service data processing and prevent the use of dark patterns. Where Google's data processing terms explicitly state that certain data will not be processed across services, there is no obligation to provide user options. The Bundeskartellamt stated that the commitments will remain in force until 30 September 2029.
The Norwegian DPA has requested a binding decision from the EDPB in the Meta case. Its decision banning Meta from carrying out behavioural advertising in Norway on the basis of legitimate interests expires on 3 November. It is asking the EDPB for a binding decision to make the ban permanent in Norway and to extend it to the whole of the EU/EEA. Meta does not believe the Norwegian DPA has a legal basis to request such a decision from the EDPB and also underlines that it is planning to move to using consent as the lawful basis for behavioural advertising. The Norwegian DPA says that as there is no clear date for when a consent mechanism may be in place, it needs to make a referral. The EDPB will now assess the completeness of the file. Assuming this is satisfied, the decision process will begin.
In the meantime, however, Meta is reportedly progressing in discussions with the Irish Data Protection Commissioner about its plans to move to a subscription model for ad-free services on its Instagram and Facebook platforms. It is proposing around €19 per month to access both services ad-free on smartphones and slightly less to access them on desktop. It may be that if it commits to this within a sufficiently quick fixed timeframe, the EDPB will hold fire on an EU-wide ban, even if Meta can't meet the November deadline.
The Irish Data Protection Commissioner is the lead EU data protection regulator for some of the largest online companies in the world. As such, it has experienced a heavy workload and been at the forefront of some of the most high profile GDPR enforcement actions. Lately, it has seen some of its most significant decisions referred to the EDPB under the consistency and cooperation procedure and has faced criticism from some quarters for not acting sufficiently quickly and forcefully. It has now published a series of case studies from 2018-2023. The 110 page document does not, however, focus exclusively, or even mainly, on the 'big tech' decisions but spans the whole of the GDPR and summarises a range of cases, often on a no-name basis.
The Data Governance Act (DGA) came into force in June 2022 with a 15-month grace period. Its application began on 24 September 2023. The DGA seeks to increase trust in data sharing, particularly in the public sector, to strengthen mechanisms to increase data availability and overcome technical obstacles to the reuse of data. The DGA will also support the set up and development of common European data spaces. The DGA sits alongside the Data Act and sets up frameworks for data sharing. Affected organisations must now comply.
Clearview AI, which has been the subject of regulatory enforcement action across the EU and UK in relation to its unlawful scraping of personal data to create an image database, has reportedly settled a class-action lawsuit in the US in relation to the same issues. The class action alleged breach of the Illinois Biometric Information Privacy Act and the privacy laws of California, New York and Virginia. The details of the settlement have not been made public.
The ICO has called on organisations to handle personal information properly to avoid putting victims of domestic abuse at further risk. The call comes as the ICO says it has reprimanded seven organisations over the last 14 months for data breaches affecting victims of domestic abuse. The ICO says organisations should train staff and put appropriate systems in place to avoid such breaches.
The EDPB has adopted Guidelines on data transfers subject to appropriate safeguards under the Law Enforcement Directive. The Guidelines relate to Article 37 of the Directive which deals with transfers of personal data by competent authorities or international organisations competent in the field of law enforcement. The Guidelines look, in particular, at the legal standard for appropriate safeguards to protect the data.
The government has updated PPN 09/14, which sets out actions for central government departments, their executive agencies, non-departmental public bodies and NHS bodies to take in relation to cyber security in certain procurement contracts; the updated note is PPN 09/23. Other public bodies are encouraged to follow the approach. In-scope organisations are required to implement PPN 09/23 within three months of its publication.
In October 2022, AG Szpunar delivered an Opinion in the case La Quadrature du Net and Others v Premier ministre and Ministère de la Culture. The AG said that Article 15(1) of the ePrivacy Directive, read in light of the Charter of Fundamental Rights, does not preclude national legislation that allows for the general and indiscriminate retention of IP addresses for a period limited to what is strictly necessary for preventing, investigating, detecting and prosecuting online criminal offences, provided that this data is the only means of identifying the person to whom the address was assigned at the time the infringement was committed. This can be done without prior review by a court or independent body if, as in this case, the linking is done at a given point in time and is limited to what is strictly necessary to achieve the objective.
The AG has now refreshed his Opinion following the reopening of the case. He expands on his original reasoning and, in addition to the above, confirms that:
The UK's Information Commissioner's Office has published its final guidance on worker monitoring to help employers comply with data protection law if they wish to monitor their workers. This is aimed at both public and private sector employers and sets out how to conduct monitoring fairly and lawfully. It also includes good practice recommendations to help build trust between employers and workers.
Unsurprisingly, the ICO highlights the importance of transparency and respect for privacy, underlining that any monitoring must be necessary, proportionate and respect the rights of workers.
The guidance focuses on different considerations for different types of monitoring and includes a section on using biometric data for time and attendance control and monitoring. Interestingly, this comes while the ICO's draft guidance on biometric data and data protection (part one of two on biometric data guidance) is still out for consultation. It's unlikely it will change significantly before it is published in final form, and, as most of the practical examples relate to using biometric data in an employment situation, employers considering monitoring workers should refer to this draft guidance as well as to the final worker monitoring guidance.
The UK government has laid the Data Protection (Adequacy) (United States of America) Regulations 2023 before Parliament. They will come into force on 12 October 2023. These Regulations are effectively an adequacy decision in favour of the US. They establish the UK-US Data Bridge (the government's preferred term for adequacy) which allows transfers of personal data to be made to US organisations signed up to the EU-US Data Privacy Framework (DPF) and participating in the UK Extension to it, without the need for additional transfer mechanisms like the Standard Contractual Clauses or Binding Corporate Rules. The US has already designated the UK as a qualifying state. This means UK individuals have the right to access the redress mechanism set out under Executive Order 14086 (EO).
The government has published supporting documents including:
Not all US organisations are entitled to sign up to the DPF and UK Extension. The scheme is regulated by the FTC and the Department of Transportation. Organisations regulated by other departments and outside FTC jurisdiction, for example those in banking, insurance and telecoms, are ineligible. In addition, journalistic data cannot be transferred under the UK-US Data Bridge.
Special category data can be shared but owing to a difference in definitions, it must be correctly identified by UK organisations as such when it is being shared in order to attract the relevant level of protection in the US.
US recipient organisations are required to indicate they are seeking to receive criminal offence data as part of a human resources data relationship where relevant. Where such data is being shared outside an HR relationship, it must be made clear the data is sensitive and requires additional protections.
According to the government factsheet, before sending personal data to the US, the exporting UK organisation must:
The ICO published an Opinion providing "qualified" support for the Regulations, concluding that "while it is reasonable for the Secretary of State to conclude that the UK Extension provides an adequate level of data protection and to lay regulations to that effect, there are four specific areas that could pose some risks to UK data subjects if the protections are not properly applied." These are:
The ICO underlines the need for the Secretary of State to monitor the level of data protection and review it every four years, as well as to monitor relevant developments in the destination country. In addition to the four areas of concern, the ICO also recommends the Secretary of State monitor:
These Regulations, made on 21 September 2023 (PSTI Regs), set out the security requirements for manufacturers (but not importers or distributors) of connectable products under Part 1 of the Product Security and Telecommunications Infrastructure Act (PSTIA).
Part 1 of the PSTIA deals with security of relevant consumer connectable products, potentially placing obligations on manufacturers, importers and distributors, and is set to come into force on 29 April 2024. Much of the detail on what security measures will be required from manufacturers is set out in the PSTI Regs, which will come into force on the same date. The PSTI Regs are based on the UK's Code of Practice for Consumer IoT security and ETSI EN 303 645, and advice from the National Cyber Security Centre.
The PSTI Regs cover:
Read more about the PSTI Regulations here.
The EDPB and EDPS have adopted a joint Opinion on the European Commission's Proposal for a Regulation on additional procedural rules for the enforcement of the GDPR. They broadly welcome the proposal but make a few recommendations for areas where further clarification is needed. They also stress that the Proposal should not unduly restrict the intervention by the Concerned Supervisory Authorities on draft decisions and urge the Commission not to change the current approach to the parties' right to be heard in any dispute resolution procedure where the SAs have not reached consensus.
The EDPB and EDPS have also adopted a joint contribution in response to the EC's public consultation on the template report for the description of consumer profiling techniques pursuant to Article 15 of the Digital Markets Act. They recommend that gatekeepers provide additional information concerning the categories of personal data they process and details about their data protection practices.
The House of Commons Science and Technology Committee has launched an inquiry and call for evidence on the cyber resilience of the UK's critical national infrastructure, measured against the resilience targets set for 2025. It will look at what the sector needs to achieve those targets and at how to make the computer hardware architecture which underpins the critical infrastructure more secure by design. Submissions are invited on a range of issues, including the strength of government programmes and support, by 10 November 2023.
On 18 September, the US Attorney General designated the UK as a "qualifying state" for the purpose of implementing the redress mechanism established in Executive Order 14086. The designation will become effective when the UK regulations implementing the Data Bridge for the UK Extension to the EU-US Data Privacy Framework (DPF) come into force. Once in force, organisations in the UK will be able to export personal data to US organisations which self-certify compliance pursuant to the UK Extension to the DPF, without the need for additional transfer tools.
The Irish Data Protection Commissioner has fined TikTok Technology Limited €345m under direction from the EDPB. The Irish DPC initiated an inquiry into the extent to which TikTok complied with its GDPR obligations in relation to its processing of children's personal data between 31 July 2020 and 31 December 2020 (the relevant period), looking at:
The DPC also examined transparency compliance in relation to the information provided to children about default settings.
Following an Article 65 resolution process and in accordance with the EDPB's decision, the Irish DPC has now adopted its final decision that, during the relevant period, TikTok's public-by-default settings and the way it communicated information led variously to breaches of GDPR provisions relating to fairness, transparency and privacy by design and default. The Irish DPC was also required to take into account the EDPB's finding that TikTok had implemented dark patterns in breach of the Article 5 fairness requirement.
The DPC has issued:
In a statement quoted in the media, TikTok said "We respectfully disagree with the decision, particularly the level of the fine imposed. The DPC's criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under-16 accounts to private by default".
The ICO has urged organisations to share information to protect children and young people at risk of serious harm. The ICO has published new guidance setting out ten steps to sharing information to safeguard children as part of the wider safeguarding process. The aim of the guidance is to reassure people involved in safeguarding children, that data protection law does not prevent information sharing but ensures it is shared in a fair and proportionate way. In particular, the ICO says there should be no hesitation in sharing information in an emergency. In such cases, there may not be time to follow all the usual processes. The ICO suggests planning ahead for emergency or urgent situations so that everyone is clear what should happen when time is of the essence.
The UK's Information Commissioner and the CEO of the National Cyber Security Centre have signed a joint Memorandum of Understanding setting out how the ICO and NCSC will co-operate. Their plans include focusing on:
The Delaware Personal Data Privacy Act has been signed and will come into effect on 1 January 2025. Delaware is the latest US State to enact a privacy law. The law will apply to entities conducting business in Delaware that control or process the personal data of over 35,000 consumers, or of over 10,000 consumers where the sale of personal data accounts for more than 20% of the entity's gross revenue. Delaware is so far the only US State to impose restrictions on selling the personal data of children aged between 16 and 18.
In line with a wider 2021 ruling, the European Court of Human Rights has ruled that UK surveillance activities infringed the privacy of two non-residents. The complainants had tried to bring their case before the UK's Investigatory Powers Tribunal but were unsuccessful and then sought redress before the ECtHR.
In the wake of the ECJ ruling which said contractual necessity is unlikely to be a valid lawful basis for processing personal data to target advertising, Meta switched to relying on legitimate interests for these purposes. This has already come under scrutiny in the ECJ Bundeskartellamt judgment and, notably, in Norway, which has imposed a ban on Meta using personal data processed on the basis of its legitimate interests for behavioural advertising. Meta is thought to be considering offering EU users an opt-in to receiving targeted ads but has now also been reported to be looking at offering a free service with ads while allowing users to pay for an ad-free service on Facebook and Instagram.
The ICO has announced it will review period and fertility tracking apps after a poll commissioned by the regulator showed that half of users have concerns over data sharing and transparency when choosing an app. Data security was another issue. The ICO is asking users to share their experiences through a survey in a call for evidence. It will also be commissioning focus groups and user testing.
On 6 September 2023, the European Commission published its first set of gatekeeper designations and core platform services under the Digital Markets Act as follows (gatekeeper in bold):
The gatekeepers must comply with DMA requirements in full by 6 March 2024 although some obligations (eg to inform the Commission of any intended concentration) apply from designation.
The Commission has also opened market investigations into submissions by Microsoft and Apple that some of their core platform services do not qualify as gateways despite meeting the threshold for core platform services. The relevant services are:
It is also looking at whether Apple's iPadOS should be designated as a gatekeeper despite not meeting the thresholds.
The Commission concluded that while Samsung Internet Browser meets thresholds to qualify as a gatekeeper, it does not qualify as a gateway to reach end users. It reached similar conclusions about Alphabet's Gmail, and Microsoft Outlook.
Google has announced that its Privacy Sandbox for the web is reaching "general availability" on the Chrome browser. The Privacy Sandbox is intended to enhance privacy and give users more control over what ads they see. Its relevance and measurement APIs pave the way for phasing out third party cookies by the end of 2024. Chrome has also released new controls which allow users to select options relating to Privacy Sandbox features.
French MP Philippe Latombe has said he is seeking to challenge the EU-US Data Privacy Framework before the EU General Court. There are doubts as to whether the challenge will be admissible owing to issues with legal standing.
UK consumer group Which? has published an article looking at the collection of personal data by a range of smart devices. It raises concerns that more data is being collected than is necessary and that there is a lack of transparency as to what collected data is used for. It calls on the ICO to "crack down on data collection by manufacturers and marketing firms that appears to go beyond "legitimate interests". A proper standard of practice should also be put in place to make the rules clearer". A number of leading smart device providers defended their data processing practices in the article.
The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations have been sent for sifting (the first stage in adoption). The Regulations will amend the definition of "fundamental rights and freedoms" in data protection legislation to refer to rights recognised under UK law, rather than retained EU law which will no longer be recognised after the end of 2023. The definition in the UK GDPR and Data Protection Act 2018 will refer to the UK's Human Rights Act rather than the European Convention on Human Rights. As the government sets out in the explanatory notes, the HRA, like the ECHR, does not explicitly protect personal data. However, the government says the protection of personal data falls within the ECHR Article 8 right to respect for private and family life, which is enshrined in the HRA. Together with the protections afforded under data protection law, the government says there will be no change to the level of protection for UK citizens.
On 9 August 2023, the ICO and CMA published a joint blog and position paper, which calls for organisations to stop using harmful Online Choice Architecture (OCA) to steer consumers into providing more personal data than they otherwise would like. Read our article on 'Why Online Choice Architecture is a data protection priority' for more on this.
The UK's ICO, together with regulators from Norway, Jersey, Switzerland, Canada, Hong Kong, Australia, New Zealand, Colombia, Morocco, Argentina and Mexico, has published a joint statement highlighting the data privacy issues caused by unlawful data scraping on social media sites. The regulators remind social media and website operators that mass data scraping can constitute a reportable data breach in many jurisdictions and that they have obligations to protect publicly accessible personal data where that data is protected by law. The regulators expect operators to implement multi-layered technical and organisational measures to mitigate scraping and data privacy risks, including rate limiting access where unusual activity is detected, introducing measures to detect bots, and blocking IP addresses where scraping has been identified. The regulators also call on individuals to take steps to protect their data and call on social media companies to enable users to engage with their services in a privacy protective manner.
Social media companies and other website operators hosting publicly accessible personal data are invited to submit feedback within a month and demonstrate how they protect individuals from unlawful scraping of their personal data.
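As a simple illustration of the kind of technical measures the regulators mention (rate limiting and IP blocking), the sketch below is a naive example with assumed thresholds, not anything drawn from the joint statement:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_REQUESTS_PER_WINDOW = 100  # assumed threshold, purely illustrative

blocked_ips: set[str] = set()
request_log = defaultdict(deque)  # ip -> deque of request timestamps

def allow_request(ip: str) -> bool:
    """Refuse requests from blocked IPs; block an IP once it exceeds the
    per-window request limit, as a crude signal of scraping-like activity."""
    now = time.time()
    if ip in blocked_ips:
        return False
    window = request_log[ip]
    window.append(now)
    # Drop timestamps that have fallen outside the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) > MAX_REQUESTS_PER_WINDOW:
        blocked_ips.add(ip)
        return False
    return True
```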
The European Commission has adopted an Implementing Regulation to introduce common logos to help easily identify trusted data intermediation service providers and data altruism organisations in the EU as provided for under the Data Governance Act. Logos will be registered as trade marks. They will need to be clearly displayed by data altruism organisations in all on- and offline publications, together with a QR code linking to the EU public register of recognised data altruism organisations. The register is due to launch on 24 September 2023.
The ICO has published guidance setting out data protection obligations on employers processing health data of the people who work for them. The guidance has been updated following a consultation on the draft. It explains the additional requirements when processing special category data and goes into detail on information provision, carrying out a DPIA prior to processing, data minimisation and security. The second part of the guidance looks at particular workplace scenarios and the guidance also includes a number of checklists.
The ICO has published guidance on how to send emails to multiple recipients in a secure manner. In a blog post, the ICO says that incorrect use of the 'bcc' field to send bulk emails is one of the top data breaches reported to the regulator. The ICO underlines that use of the bcc field is not, on its own, enough to protect personal information, and suggests that organisations sending sensitive information electronically should use alternatives such as bulk email services, mail merge or secure data transfer services. The ICO also points to recent enforcement actions against three organisations which used bcc in breach of data protection law. Key to proper use is assessing and implementing appropriate technical and organisational measures and training staff.
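A minimal sketch of the mail merge style approach the ICO suggests, sending a separate message to each recipient so that no one sees anyone else's address; the server details, credentials and addresses below are placeholders:

```python
import smtplib
from email.message import EmailMessage

def send_individually(subject: str, body_template: str, sender: str,
                      recipients: list[str]) -> None:
    """Send one email per recipient instead of putting everyone in bcc,
    so a mistake in the address fields cannot expose the whole list."""
    with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder server
        smtp.starttls()
        # smtp.login("username", "password")  # credentials omitted in this sketch
        for recipient in recipients:
            msg = EmailMessage()
            msg["From"] = sender
            msg["To"] = recipient  # each message is addressed to one person only
            msg["Subject"] = subject
            msg.set_content(body_template.format(recipient=recipient))
            smtp.send_message(msg)

# Placeholder usage:
# send_individually("Service update", "Dear {recipient},\n...",
#                   "news@example.com", ["alice@example.com", "bob@example.com"])
```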
Airbnb Ireland has received a reprimand from the Irish Data Protection Commissioner relating to the processing and retention of copies of documents used to verify identity. The DPC says the practice of retaining a copy of the relevant document after identity has been verified breaches the GDPR minimisation and storage limitation principles.
The EDPS has published opinions on the draft Financial Data Access Framework and Regulation and Directive on payment services within the EU.
The EDPS is supportive of the Framework but recommends tightening the definition of "customer data" and limiting the types of personal data that can be processed as well as explicitly excluding data obtained through profiling. Regarding the proposed Regulation and Directive on payment services, the EDPS makes recommendations to assist with fraud prevention, including defining what data is necessary for fraud prevention, and limiting data collection to data falling within the definition, as well as specifying who may collect special category personal data and under what circumstances.
NOYB has filed complaints against Alphabet-owned fitness tracker Fitbit, with data protection regulators in Austria, the Netherlands and Italy. NOYB alleges that Fitbit effectively requires users to consent to their personal data being exported outside the EU and does not allow them to withdraw that consent. It also claims that Fitbit fails to provide an adequate explanation of what it does with exported special category data.
The US National Institute of Standards and Technology (NIST) has published draft post-quantum cryptography standards. These are intended to help organisations defend themselves from quantum-enabled cyber attacks. Three Federal Information Processing Standards are included containing a number of lattice and hash-based algorithms. NIST has also issued a draft learning program covering cyber security and privacy. The documents are open for consultation.
The DCMS Committee has published a report on 'Connected tech: smart or sinister?'. Among its recommendations are that monitoring of employees should only be done in consultation with employees and with their consent. The report calls on the ICO to develop its draft guidance on monitoring at work into a principles-based code for designers and operators of workplace connected technology.
India's long-awaited Digital Personal Data Protection Act has been published in the Gazette of India. The Act does not specify a transitional period and different provisions are likely to be brought into force on different dates. Essentially, it provides conditions for processing digital data, places obligations on data fiduciaries (controllers) and gives rights to data principals (data subjects). Fines for non-compliance are up to approximately US$30m.
The European Commission has adopted a Communication setting out plans for a Common European Tourism Data Space. The EC intends that the data space will provide the European tourism ecosystem with the means for sharing data, in particular to foster trust, enhance interoperability, and support digitisation and sustainability of the industry. The system will be introduced over the next two and a half years, with full functionality expected by 2025.
The EDPB has published a statement following its participation in the EC's first annual review of the EU-Japan adequacy agreement. The EDPB agreed with the EC's assessment that changes to Japan's data protection laws had not impacted the adequacy decision and that reviews could now proceed on a four yearly basis. It did, however, identify a few issues to keep an eye on including the use of consent in potentially imbalanced relationships.
Meta is planning to offer EU users a simple opt-in (yes or no) to receiving targeted advertising across its platforms. This is in response to the €390m fine it received from the Irish Data Protection Commissioner for unlawfully relying on contractual necessity to serve behavioural ads. Meta's changes will not apply in the UK. The ICO has said it is assessing what this means for the information rights of people in the UK and is considering its response.
HM Treasury is consulting on a cold calling ban for consumer financial services and products. The ban was announced in May 2023 and the consultation and call for evidence look at how best to design and implement it. The government highlights that the ban will work alongside other measures to tackle fraudulent marketing, including the DPDI Bill and the online advertising programme, as well as a proposed online fraud charter. The consultation closes on 27 September 2023.
The EDPB has finalised an Article 65 decision relating to the Irish Data Protection Commissioner's decision on TikTok's use of children's personal data. The Irish DPC submitted its draft decision to other concerned authorities in September 2022, but referred the decision to the EDPB after failing to get agreement. The inquiry relates to TikTok's processing of children's personal data, in particular, historic public-by-default settings, age verification measures for under-13s, and compliance with transparency requirements. The EDPB's decision will be published following publication of the Irish DPC's final decision.
The ICO's Children's Code applies to online services "likely to be accessed by children". The ICO published guidance in the form of FAQs on what this means and has now updated it to add further clarification in response to a consultation which closed in May 2023. The clarifications include additional information on why there is no defined threshold for what constitutes a "significant number" of children accessing a service, and more detail on when the ICO might consider regulatory action.
In July, the Norwegian DPA imposed a temporary ban on Meta (to apply on its Facebook and Instagram platforms) from carrying out behavioural advertising based on the surveillance and profiling of users in Norway. The ban will initially apply until October. The DPA has now begun fining Meta nearly €100,000 a day for failing to comply with the ban. Meta's initial application for an injunction to suspend the ban has now been rejected and the Norwegian DPA is considering a referral to the EDPB.
The ICO is consulting on the first phase of guidance on biometric data and biometric technologies. Phase one covers draft biometric data guidance. The consultation is open until 20 October 2023.
The EDPB has published an information note on data transfers from the EU to the USA after the adoption of the EC's adequacy decision relating to the Data Privacy Framework (DPF). The note is broadly supportive although is a reminder of the fact that the adequacy decision is subject to annual review. The EDPB states that:
It is interesting that the EDPB stops short of saying that supplementary measures are not needed to protect personal data being transferred under Article 46 transfer tools, although the implication of its statement is that they are unlikely to be necessary.
On 17 July, the US International Trade Administration launched its DPF website which allows organisations to self-certify under the DPF, and provides a range of advice and information. US organisations can also now sign up to the UK extension although data cannot be transferred under it until the UK has legislated to adopt a UK-US data bridge, and the US has formally recognised the UK as a qualifying state.
In January 2023, the Irish DPC fined WhatsApp €5.5m. Among other things, the Irish DPC found that WhatsApp was not entitled to rely on the lawful basis of contractual necessity for the delivery of service and security (excluding IT security). As a result, WhatsApp has changed to legitimate interests as its lawful basis for these purposes.
The EC has published a call for evidence for an evaluation of the ENISA and ICT cybersecurity certification Regulation (the Cybersecurity Act). The call is open until 16 September 2023 and a report will be adopted in Q2 2024.
The draft Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) Regulations 2023 (Regulations) have been laid before Parliament under Part 1 of the Product Security and Telecommunications Infrastructure Act 2022 (PSTIA). Part 1 of the PSTIA deals with the security of consumer connectable products and is set to come into force on 29 April 2024, but much of the detail on what security measures will be required is to be set out in secondary legislation. The draft Regulations set out security obligations on manufacturers of relevant connectable products (but do not cover requirements for importers or distributors). They are based on the UK's Code of Practice for Consumer IoT Security, ETSI EN 303 645, and advice from the National Cyber Security Centre. They are being adopted under the affirmative procedure so must be approved by both Houses of Parliament.
The Regulations cover:
Read more.
The Norwegian DPA has imposed a temporary ban on Meta (to apply on its Facebook and Instagram platforms) from carrying out behavioural advertising based on the surveillance and profiling of users in Norway. The ban will initially apply until October. The Norwegian DPA cites the December 2022 Irish DPC decision, after which Meta made changes, and says that, in light of the recent ECJ decision on the role of data protection in competition cases, Meta's behavioural advertising still fails to comply with the law. While it is not Meta's lead regulator, the DPA is exercising its right to intervene directly and may refer the matter to the EDPB after the summer.
The DPA says it is not banning all behavioural advertising by Meta and says its decision does not prevent Meta from targeting advertising based on user bio information or interests that a user has provided themselves, nor where the user has given valid consent.
The ICO has published key learnings for organisations to improve their data protection practices based on reprimands issued by the ICO in Q1 2023/4:
The UK's ICO has published a report following the exit of the Betting and Gaming Council from the ICO's regulatory Sandbox. The Council entered the Sandbox to explore the gambling industry's development and trial of a Single Customer View (SCV) solution, developed with operators and intended to enable a more unified and proactive intervention by gambling operators to reduce incidents of gambling related harm. The data sharing project (known as GamProtect) will now be implemented across the gambling industry with support from the Betting and Gaming Council.
The ICO has also written to UK Finance, sharing its findings and responding to a request for clarification on questions concerning the sharing of consumer credit risk data by credit reference agencies with gambling operators:
The ICO has advised gambling operators that:
The ICO says lenders (and other parties sharing information with credit agencies) and the credit agencies themselves, will need to carry out a compatibility assessment to decide whether the new purpose for sharing the data is compatible with the original purpose for processing the data. The ICO notes the close link between the original purpose of enabling the assessment of consumer credit, and enabling online gambling operators to conduct financial risk checks. It also notes that data subjects have an expectation that their data will be shared by lenders and credit reference agencies for the purpose of credit checks. The Gambling Commission has already directed relevant stakeholders to make it clear to customers that their data may be shared in relation to financial vulnerability.
Given the sensitivity of the data being shared, the ICO expects that sharing of data will be kept to what is necessary and that the underlying purpose of safeguarding individuals at risk provides a compelling reason for sharing of the data. The ICO expects gambling operators to put robust safeguards in place and reminds them that any use of the data for commercial gain would be strictly outside the purpose for which the data is being shared.
The ICO points to clarifications being introduced under the Data Protection and Digital Information Bill which will make clear that a further processing purpose will be compatible with the original purpose where it is necessary for safeguarding vulnerable individuals.
The European Commission has announced the adoption of its adequacy decision for the EU-US Data Privacy Framework (DPF). It entered into force on 11 July. The Commission concludes that the US provides an adequate level of protection for personal data transferred from the EU to US companies under the new framework. Personal data can flow freely from the EU to US companies participating in the framework without the need for supplementary measures.
The Commission underlines that the DPF introduces new binding safeguards to address the concerns raised by the ECJ, including by limiting access to EU data by US intelligence authorities to what is necessary and proportionate to protect national security. The new framework also establishes a Data Protection Review Court to which EU individuals have access and which can order remedies relating to unlawfully transferred personal data, including data deletion.
In its accompanying FAQs, the Commission notes that all safeguards put in place by the US government in the area of national security (including the redress mechanism), apply to all GDPR data transfers to companies in the US, regardless of the transfer mechanism used. The safeguards "therefore also facilitate the use of other tools, such as standard contractual clauses and binding corporate rules". This suggests an end to the requirement for supplementary measures for all US data transfers, even where a Transfer Impact Assessment is required.
On 11 July 2023, the US International Trade Administration confirmed that from 17 July 2023, US organisations may self-certify compliance pursuant to the UK Extension to the EU-US DPF. They may not, however, begin relying on it to receive personal data from the UK and Gibraltar, before the UK's anticipated adequacy regulations (not yet published at time of writing) implementing the Data Bridge enter into force. Organisations participating in the UK Extension must also participate in the EU-US DPF. The Swiss-US DPF will also be available from 17 July 2023.
Read more about what this means, including from a UK perspective here, and find out more detail about the DPF and what compliance involves here.
The ICO has published its code of practice about using personal information for journalism. This will be a statutory code which will come into force once it has completed the s125 DPA process and has been laid before Parliament by the Secretary of State. It can, however, provide guidance immediately. Following a consultation process, the code has taken on board stakeholder feedback. It is intended to provide practical guidance on how to comply with data protection law and good practice when personal information is used for journalism. Reference notes support the code and contain more information about relevant legislation and case law examples. The code is strictly limited to data protection law and does not cover media standards in general.
The European Commission has adopted a proposal for a Regulation on additional procedural rules relating to the enforcement of the GDPR (GDPR Procedural Regulation). The Regulation is intended to set up procedural rules for cross-border enforcement actions. The Commission feels that further harmonisation is needed to support the consistency and cooperation procedure under the GDPR, owing to the fact that different Member States have different interpretations of the GDPR which can result in a lack of consensus and a lengthy dispute resolution process under the Article 65 procedure.
The new draft Regulation sets out additional rules dealing with:
These measures are intended to speed up completion of investigations and support consumers and businesses. The Commission says consumers and businesses will benefit from complaints being resolved more quickly and businesses will gain greater legal certainty.
The Council of Europe has published the first set of model contractual clauses, covering the transfer of personal data from controller to controller, developed under the Protocol amending the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108+). Convention 108+ requires signatories to take steps in their domestic legislation to apply its principles in order to protect fundamental human rights with regard to the processing of personal data. The model clauses can be adopted by signatories under domestic legislation and used to protect the rights of individuals where personal data is transferred from a controller to a non-signatory controller.
ENISA has published a threat landscapes report focused on the health sector. It analyses cyber incidents between January 2021 and March 2023 and identifies a list of prime cyber security threats to the EU's healthcare sector. Ransomware attacks continue to present the biggest risk, particularly to patient data. ENISA says only 27% of healthcare organisations were found to have a dedicated ransomware response programme. The report makes a number of recommendations including making offline encrypted backups of mission critical data containing confidential information, training and awareness raising for healthcare professionals, and regular vulnerability screening backed up by patches and software updates.
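One of ENISA's headline recommendations, offline encrypted backups of mission-critical data, can be illustrated with a short sketch. This is a hedged example using the widely used Python cryptography package, with placeholder file names and deliberately simplified key handling; it is not taken from the ENISA report.

```python
from pathlib import Path
from cryptography.fernet import Fernet   # pip install cryptography

def encrypt_backup(source: Path, destination: Path, key: bytes) -> None:
    """Write an encrypted copy of a backup file, suitable for storage on offline media."""
    destination.write_bytes(Fernet(key).encrypt(source.read_bytes()))

def restore_backup(encrypted: Path, destination: Path, key: bytes) -> None:
    """Decrypt an encrypted backup back to a usable file."""
    destination.write_bytes(Fernet(key).decrypt(encrypted.read_bytes()))

# Placeholder usage: create a stand-in backup file, then encrypt it.
Path("example-backup.db").write_bytes(b"mission-critical records")
key = Fernet.generate_key()   # in practice, store the key separately from the backup media
encrypt_backup(Path("example-backup.db"), Path("example-backup.db.enc"), key)
```

In a real deployment the key would be held in a key management system and the encrypted copies rotated and tested regularly, in line with the report's emphasis on tested, offline backups.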
Meta's appeal against the Irish DPC ruling which requires it to suspend its EU-US data transfers under standard contractual clauses and repatriate unlawfully transferred data, has been joined by the US government. The US government will be able to provide information to the court and details of the US rules on government access to data. The repatriation requirement now makes little sense given the adoption of the EU-US Data Privacy Framework adequacy decision, the safeguards under which can also be used to facilitate transfers under SCCs and BCRs.
The U.S. Department of Justice has announced the USA has fulfilled its commitments for implementing the EU-US Data Privacy Framework. The Attorney General has designated the EU Member States as well as Iceland, Liechtenstein and Norway as "qualifying states" whose citizens are able to file for redress through the proposed Data Protection Review Court and benefit from enhanced U.S. privacy protections. The designations will apply once the EU has adopted the EU-US adequacy decision. The ODNI has released the policies and procedures the U.S. intelligence community will follow as part of President Biden's executive order. The European Commission is expected to adopt an adequacy decision in the next few weeks.
The EC has published a proposal for a Regulation establishing a framework for financial data access, intended to set out processes for the management of customer data sharing in the financial sector. Once passed, it will amend the EBA Regulation, the EIOPA Regulation, the ESMA Regulation and DORA. This is the second of the common European data spaces to reach this stage (the first being the health data space), and is part of the EU's wider strategy to facilitate sharing of business data. The intention is that the financial data space will:
The ECJ has published its decision in a reference from Germany involving Meta Platforms and the impact of its data practices on competition. In the original case in Germany, the German competition authority, the Bundeskartellamt, found that the data collected by Meta Platforms Ireland about user activities on and off Facebook and across other Meta services and linked back to their Facebook account for targeted advertising purposes, was collected without valid consent and therefore in breach of the GDPR. It found this constituted an abuse of Meta's dominant position in online social networks.
The Higher Regional Court in Duesseldorf asked the ECJ whether it was open to a national competition authority to review whether a data processing operation is GDPR-compliant.
In line with the AG Opinion, the ECJ found that:
The ECJ also made a number of observations in relation to Meta's data processing operations including:
Read more.
The Commercial Court of Ireland has extended Meta's interim stay on the order from the Irish DPC to suspend its EU-US data transfers. The stay will now continue until at least the end of July.
The Irish Parliament has voted to allow the Irish Data Protection Commissioner to declare the majority of its procedures confidential during its investigations. The Irish DPC will have discretion to make certain information confidential but must specify the reasons for doing so. Privacy campaigners have complained that this will reduce scrutiny of the Irish DPC who is the lead EU regulator for a number of the 'big tech' companies.
The UK's National Cyber Security Centre has updated its risk management toolkit for practitioners with an eight-step risk management framework and a revised toolkit framework to cover a variety of sectors and organisation sizes.
Political agreement has been reached between the European Council and Parliament on the European Data Act. The Data Act was originally proposed in February 2022. It aims to facilitate data sharing, in particular, industrial and business data as well as personal data, in order to help individuals and businesses leverage the value of the data they help generate, and level the data playing field.
The Data Act will:
The Data Act is part of the European Strategy for Data. This also includes the Data Governance Act (which will apply from 24 September 2023 and which, among other things, sets out processes and structures to facilitate data sharing among EU public bodies and between sectors), and plans to create a series of common European data spaces.
The Data Act will now be formally adopted and published in the Official Journal. It will come into force twenty days after publication and will apply twenty months after that – so likely in early to mid 2025.
Read more about what this means for you.
The French DPA, the CNIL, has fined behavioural advertising company Criteo €40m for GDPR breaches. This represents a €20m reduction from the originally announced provisional amount. The CNIL investigated Criteo following complaints from Privacy International and NOYB. It found a number of GDPR infringements including:
In determining the amount of the fine, the CNIL took into account the fact that the processing in question concerned a large number of people across the EU and that the company had collected a large amount of personal data relating to the online preferences of individuals. While Criteo did not have names of individual users, the CNIL considered that Criteo held data which was sufficiently accurate to re-identify individuals in some cases. The CNIL also considered that Criteo's failure to evidence consent enabled it to increase the number of individuals accepting its cookies which allowed it to benefit financially.
The EDPB has adopted a template complaint form to facilitate the submission of complaints by individuals and their subsequent handling by DPAs in cross-border cases. The template takes into account differences between national laws and practices. DPAs will be able to use it on a voluntary basis and can adapt it to their national requirements. The expectation is that using the form will save DPAs time and help resolve cross-border cases more efficiently. The EDPB also adopted a template acknowledgment of receipt form.
The EDPB has adopted a final version of the Recommendations on the application for approval and on the elements and principles to be found in Controller Binding Corporate Rules (BCR-C). The recommendations update the existing BCR-C referential which contains criteria for BCR-C approval, and merge it with the standard application form for BCR-C.
The recommendations are intended to:
The recommendations provide additional guidance and aim to ensure a level playing field for all BCR-C applicants based on experience gained by DPAs. They also take account of the Schrems II judgment.
On publication (to follow shortly), the recommendations will apply to all BCR-C holders. In practice, existing holders as well as new and ongoing applicants will have to bring their BCR-C in line with the requirements set out in the recommendations, either during the application process or as part of their 2024 annual update.
The ECJ has ruled on interpretation of Article 15(1) GDPR which deals with subject access requests, in response to a reference from Finland. The ECJ said:
The UK's ICO has called on businesses to address the privacy risks of generative AI before adopting the technology and says it will carry out tougher checks on whether organisations have complied with data protection law before and when using generative AI. Businesses are cautioned to "spend time at the outset to understand how AI is using personal information, mitigate any risks….and then roll out your AI approach with confidence that it won't upset customers or regulators". The ICO is signalling that this will be a priority area, saying "businesses need to show us how they've addressed the risks that occur in their context – even if the underlying technology is the same. An AI-backed chat function helping customers at a cinema raises different question (sic) compared with one for a sexual health clinic, for instance".
The Centre for Data Ethics and Innovation has published a report on Enabling responsible access to demographic data to make AI systems fairer. It sets out approaches to accessing demographic data responsibly for bias detection and mitigation. The report suggests using data intermediaries and proxy data may help manage risks although they will not always be suitable. It underlines that in the short term, direct collection of demographic data is likely to be the best option in most circumstances, saying this can usually be done lawfully provided care is taken to comply with applicable data protection law. However, the CDEI also sees an opportunity for an ecosystem involving intermediaries and proxies to emerge which offer better options.
The ICO has published guidance on using privacy enhancing technologies (PETs). The guidance is divided into two parts which variously cover:
ENISA, the EU cybersecurity agency, has published four reports on AI and cybersecurity:
The framework consists of three layers – cybersecurity foundations, AI-specific cybersecurity, and sector-specific cybersecurity for AI. It aims to provide a step-by-step process for establishing good cybersecurity practices to build trust in AI.
The European Commission is poised to introduce Regulations:
According to leaked documents, these proposals are likely to be published on 28 June 2023.
The financial data space will complement the Data Act which is reportedly nearing the end of the trilogue process.
Politico reports that Google has postponed its EU release of its AI chatbot Bard, in response to requests from the Irish Data Protection Commissioner. The Irish DPC has reportedly said Google has not supplied it with a DPIA, nor has it provided any detailed information about the use of personal data by the tool, and how it will comply with the GDPR.
The CDEI has published a portfolio of AI assurance techniques in collaboration with techUK. It is intended to be used by anybody involved in designing, developing, deploying or procuring AI-enabled systems, and sets out examples of AI assurance techniques being used in the real world, to support the development of trustworthy AI. The techniques have been mapped on to the principles set out in the government's AI White Paper. They include:
These are applied across a number of sectors and will be added to over time.
Following Rishi Sunak's trip to Washington DC, the UK and US put out a joint statement committing in principle to establishing a UK-US data bridge to allow frictionless transfers of personal data between the UK and the USA. While this is being touted as 'news', it doesn't add a great deal to earlier statements of intention although it does indicate that progress has been made over the last two years. The UK is continuing to assess the US privacy framework and technical work is ongoing to establish the UK as a qualifying state under President Biden's Executive Order 14086. Under the 'Atlantic Declaration' the US and UK also agreed to cooperate on the development of AI and of Privacy Enhancing Technologies (PETs).
The EDPB has adopted final guidelines on administrative fines. They aim to harmonise the methodology used by DPAs to calculate fines, including the starting points: the categorisation of the infringement by nature, the seriousness of the infringement, and the turnover of the business. The guidelines set out a five-step methodology covering the number of instances of sanctionable conduct, the starting point for calculation of the fine, aggravating or mitigating factors, the legal maximums for fines, and the requirements of effectiveness, dissuasiveness and proportionality.
Following consultation on the draft guidelines, an annex has been added with a reference table summarising the methodology and providing illustrative examples. The guidelines have also been updated to ensure consistency with the Meta findings.
In addition, the EDPB adopted final guidelines on Article 65(1)(a) GDPR, which are intended to set out the main stages of the Article 65 procedure and clarify the competence of the EDPB when adopting a legally binding decision under Article 65(1)(a). The guidelines have been updated to ensure consistency with other guidelines and now also include a summary.
Meta has filed two actions at the Irish High Court in an attempt to overturn the Irish Data Protection Commissioner's decision fining it €1.2bn and ordering it to suspend data transfers to the US, and delete or repatriate unlawfully transferred data. It is seeking judicial review of the decision and has asked the court to suspend the deadlines for acting on the requirements under the decision. Meta has been granted a stay on the orders until the court's next sitting which is expected to take place on 19 June 2023.
The ICO has warned that neurotechnologies (which monitor the brain) pose major risks of bias if not developed and used correctly, particularly to neurodivergent people. The ICO will produce guidance for developers of neurotech. In a report, ICO tech futures: neurotechnology, the ICO predicts that neurotechnologies will become more widespread over the next decade but risk causing harm if they are not developed and tested using a wide enough range of people. The report details possible future avenues of development for neurotechnology, including the workplace, employee hiring, sports, personal health and wellbeing, marketing, and video games.
Governor DeSantis has signed Florida's Digital Bill of Rights into law. It includes privacy requirements for certain 'big tech' companies and introduces children's privacy protection requirements across a broader range of companies. Among other things, it gives consumers the right to know what information companies are collecting about them, as well as correction and limited deletion and disclosure rights. It applies only to companies with annual revenue of over $1bn. The law will take effect on 1 July 2024.
The AI Act heads into the final stages of the legislative process following the European Parliament's adoption of its negotiating position on 14 June. Attempts to introduce last-minute changes were voted down with the result that negotiations can now begin between the European Parliament and the European Council. See here for more on changes proposed by the Parliament to the Commission's proposal.
Third party payroll provider Zellis has reportedly suffered a hack of the MOVEit file transfer software it uses. The hack is thought to have been carried out by a criminal gang with links to Russia. MOVEit's developer says it has now corrected the exploited vulnerability. The hack potentially exposes names, addresses, NI numbers and banking details of tens of thousands of employees. BA, Boots and the BBC have confirmed they are among those affected.
The ICO has published guidance on SARs for businesses and employers in the form of a set of Q&As. The guidance covers common issues including how to respond to requests, procedural issues, the scope of requests, what information can be withheld, and how to deal with mixed data.
The Observer reported that 20 UK NHS trust websites are unlawfully sharing personal data with Meta through a Meta Pixel tool. The data in question includes browsing data and information about medical conditions, appointments and treatments. 17 of the 20 trusts have already confirmed they have taken the tool off their sites. The Observer alleges that data was transferred automatically before users had a chance to select cookies and without explicit consent. Much of the transferred data would be special category data. The transfer of personal data also allegedly breached some of the trusts' terms and conditions and privacy policies with only three mentioning Facebook at all.
The Open Rights Group has published a report accusing the ICO of failing to protect public privacy and data rights during the COVID-19 pandemic. The report says the ICO repeatedly failed to take action over clear breaches of data protection law by the government, and by choosing to act as a "critical friend", did not challenge the government on accountability and transparency, excessive data retention, late DPIAs, and the involvement of private companies without proper safeguards. The report expresses concerns that large amounts of personal data are being unnecessarily retained and potentially used by third parties.
The ICO has said it does not agree with the findings of the Open Rights Group report and stresses the support it gave to organisations during the pandemic.
The ICO has published its opinion on the draft Data Protection and Digital Information Bill (No. 2). The Information Commissioner has said the bill "has moved to a position where I can fully support it". He welcomes a "positive package of reforms" which provides greater regulatory certainty for organisations. The ICO is particularly supportive of changes since the first draft to provisions which, it argued, would reduce its independence. It does, however, set out a list of clarifications it thinks are needed in the annex to the opinion.
The ICO has updated its Children's Code guidance to cover edtech providers and services. The updates are intended to clarify when an edtech provider is covered by the Code. This includes services likely to be accessed directly by children, and those provided through schools. Schools are not in scope of the Children's Code as they are not Information Society Services providers.
Microsoft has issued an investor statement suggesting the Irish Data Protection Commissioner will be fining it $425 million for data protection failings in relation to LinkedIn targeted advertising. The DPC issued a provisional decision in April which has not yet been finally announced. Microsoft has said it will contest the DPC's finding.
Separately, the FTC has fined Microsoft $20 million for alleged violations of COPPA relating to its Xbox gaming console and consent to the processing of under-13 year olds' personal data. The proposed order is subject to federal court approval but reportedly requires corrective measures to ensure Microsoft gets parental consent to the processing of personal data of under-13s, and improves its data retention processes for children's data.
Ofcom has lowered incident reporting thresholds for Operators of Essential Services (OESs) in the digital infrastructure sector in its NIS guidance. The changes took effect on 31 May and impact top-level domain name registries, domain name system resolver services, DNS authoritative hosting services and internet exchange points (IXPs) within scope of the NIS Regulations.
The revised guidance requires incidents to be reported where there is a service degradation of 25% for 15 minutes or more (rather than the previous 50%). IXPs are also required to report incidents based on the loss of 50% of the total bandwidth capacity across all ports. A simple, purely illustrative check against these thresholds is sketched after this item.
Ofcom has also amended the section of its guidance which deals with enforcement, cross-referring instead to its Regulatory Enforcement Guidelines which were published in December 2022.
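As a rough illustration of how an operator's monitoring tooling might apply the revised thresholds described above (hypothetical code, not Ofcom guidance; function names and figures other than the published thresholds are assumptions), a check could look like this:

```python
def reportable_degradation(percent_degraded: float, minutes: float) -> bool:
    """Revised NIS threshold: 25% service degradation lasting 15 minutes or more."""
    return percent_degraded >= 25 and minutes >= 15

def reportable_ixp_bandwidth_loss(percent_lost: float) -> bool:
    """IXP threshold: loss of 50% of total bandwidth capacity across all ports."""
    return percent_lost >= 50

print(reportable_degradation(30, 20))        # True: 30% degradation for 20 minutes
print(reportable_ixp_bandwidth_loss(40))     # False: below the 50% bandwidth threshold
```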
The CMA is consulting on commitments offered by Meta to address the CMA's concerns that Meta might be abusing a dominant position in digital display advertising (DDA) services.
Under its agreements with its DDA customers, Meta can use the data it receives when providing DDA services for purposes beyond supplying those services. The CMA believes evidence points to Meta using such data, including competitor data, to develop and improve Facebook Marketplace.
Meta has offered commitments to address concerns which include:
The CMA invites comments by 26 June 2023.
The DHSC is consulting on the creation of Secure Data Environments (SDEs) for accessing NHS data for research purposes. The intention is that a research SDE will be the primary way to access NHS data for research and external uses. NHS organisations will continue to manage the data and decide who can access it. The consultation closes on 23 June.
Following the intervention of the EDPB under the Article 65 GDPR process, the decision of the Irish DPC about Meta Ireland's transfers of personal data to the USA using Standard Contractual Clauses and supplementary measures has now been published, bringing the decade-long Schrems litigation to a head. Its findings are that:
The DPC underlines that the decision only binds Meta but it "exposes a situation whereby any internet platform falling within the definition of an electronic communications provider subject to the FISA 702 PRISM programme may equally fall foul of the requirements of Chapter V GDPR and the EU Charter of Fundamental Rights regarding their transfers of personal data to the USA". Given how broadly the FISA 702 provision can be interpreted, this could capture a whole swathe of organisations. However, the DPC notes that the CJEU upheld the validity of SCCs as a legal instrument while emphasising the need to undertake a case-by-case analysis to determine whether or not transfers are lawful. As a result, "it is not open to the DPC to make an order suspending or prohibiting transfers to the United States generally".
In other words, while this decision theoretically has wider application, in practice it only applies to Meta and it took ten years of litigation to reach this stage. Meta and others will now have high hopes that signing up to the imminent EU-US Data Privacy Framework will resolve the problems around data transfers to the US. The DPF is likely to be agreed by the end of July, although NOYB (the organisation set up by Max Schrems) is hinting at further legal challenges. However, this decision resonates beyond the US. It will be relevant for any transfers of personal data caught by the GDPR to countries which are not considered essentially equivalent under EU law.
The Irish DPC's decision does not take immediate effect. Meta has up to 12 October 2023 before it needs to suspend transfers under the SCCs, and until 12 November to delete or move back to the EU the personal data unlawfully transferred to the USA. The Irish DPC has reiterated that it does not agree with the decision the EDPB had required it to take and Meta has already confirmed it will appeal, which may delay matters further and would likely pause the enforcement of the DPC's decision. Meta has previously reportedly threatened to suspend its services in the EU altogether (something it denies), but, as NOYB notes, it has servers in the EU, and may move to a position where the majority of its European data is hosted in the EEA. However, hosting data in the EEA will not necessarily overcome all the concerns raised by EU regulators if Meta in the US can still access data stored in the EEA.
The decision does not directly apply in the UK, although the UK rules on data transfers mirror those under the EU GDPR. In view of the tenor of the UK Government's proposed data protection reforms, it seems unlikely that the UK will follow the full approach of the Irish DPC. However, we will need to wait to see the ICO's response.
The Council of the EU has extended the framework for imposing sanctions on people or entities carrying out or threatening cyberattacks which threaten the EU or its Member States to 18 May 2025.
IAB Europe has published v2.2 of its Transparency and Consent Framework (TCF), intended to help the digital advertising industry comply with GDPR. Key changes include:
The EDPB has adopted the final version of its guidelines on facial recognition in the area of law enforcement. The guidelines stress that facial recognition tools must be used in strict compliance with the Law Enforcement Directive and only where necessary and proportionate under the EU Charter of Fundamental Rights. The EDPB repeats its call for a ban of FRT in certain situations as expressed in its joint opinion with the EDPS. Clarifications were added following consultation on the draft guidelines.
The CNIL has launched its action plan for addressing data protection in AI deployments. It sets out a four-phase plan covering understanding AI, supervising development, supporting innovation, and auditing systems. The CNIL will expand its work on augmented cameras and extend it to cover generative AI, large language models and derived applications, especially chatbots.
The European Parliament has reached political agreement on its negotiating position on the EC's draft AI Act. Formal adoption is expected on 14 June. The proposal will then move to trilogues given the Council agreed its position in December 2022. MEPs have suggested a number of potentially significant amendments to the Commission's proposal.
Unacceptable risk AI
An amended list of banned 'unacceptable risk' AI to include intrusive and discriminatory uses of AI systems such as:
High-risk AI
Suggested changes would expand the scope of the high-risk areas to include harm to people's health and safety, fundamental rights, or the environment. High-risk systems will include AI systems used to influence voters in political campaigns and in social media recommender platforms (with more than 45m users under the DSA). High-risk obligations are more prescriptive, with a new requirement to carry out a fundamental rights assessment before use. However, the European Parliament's proposal also provides that an AI system which ostensibly falls within the high-risk category but which does not pose a significant risk can be notified to the relevant authority as being low-risk. The authority will have three months to object, during which time the AI system can be launched. Misclassifications will be subject to fines.
Enhanced transparency measures
Providers of foundation model AIs would be required to guarantee protection of fundamental rights, health and safety and the environment, democracy and rule of law. They would be subject to risk assessment and mitigation requirements, data governance provisions, and to obligations to comply with design, information and environmental requirements, as well as to register in the EU database.
Generative AI model providers would be subject to additional transparency requirements, including to disclose that content is generated by AI. Models would have to be designed to prevent them from generating illegal content and providers will need to publish summaries of copyrighted data used for training. They will also be subject to assessment by independent third parties.
Additional rights
MEPs propose additional rights for citizens to file complaints about AI systems and receive explanations of decisions reached by high-risk AI systems that significantly impact them.
As widely reported in the media, the government has published amendments to the scope of the REUL Bill. The sunset clause which automatically revoked EU-derived direct and secondary legislation has been replaced with a clause revoking the legislation explicitly set out in Schedule 1. This covers around 600 pieces of legislation, much of which is concerned with fishing, food and the environment. In terms of data privacy, the following laws are listed in Schedule 1:
Even as amended, the Bill still stands to have a considerable impact on the UK's legal framework but changes beyond those laws being revoked under Schedule 1 will take more time to take effect and will evolve, potentially over many years.
The government has published Keeling Schedules to show proposed changes to the UK GDPR, Data Protection Act 2018, and PECR under the Data Protection and Digital Information Bill 2. The schedules show the planned changes under the DPDI Bill which is helpful given it operates largely by amending existing provisions rather than by restating the resulting new law.
The Irish DPC has reportedly sent Meta its decision on the issue of the validity of its data transfers to the USA under SCCs, and whether its supplementary measures are sufficient to afford the data adequate protection. Meta will be given some time to prepare for release of the decision which is likely to happen on 22 May.
The government has published guidance on the security of connected places, or smart cities. The collection of guidance applies to a number of stakeholders including senior leadership, decision makers and managers, as well as IT professionals, cybersecurity leads, information managers, processors and users. It covers a range of design and management issues including protection of a connected place's data, managing incidents, and response and recovery, and is intended to guide security decisions and processes on the design, implementation and management of a connected place for those buying or operating connected places technologies.
The ECJ has published a judgment on interpretation of the right of access under Article 15 GDPR and, specifically, the meaning of "copy" and "information" in Article 15(3). The reference was made from Austria and asked the court about the scope of the right and whether the obligation to provide a "copy" is fulfilled by the controller supplying a summary table or whether the right entails the transmission of document extracts, entire documents and extracts from databases in which the personal data is reproduced.
The ECJ held that:
The ECJ has ruled, in the Österreichische Post case, that mere infringement of the GDPR does not give rise to a right to compensation; however, there is no requirement for non-material damage suffered to reach a certain threshold of seriousness in order to confer a right to compensation.
The Austrian Supreme Court referred questions around the right to compensation following a claim for non-material damages made by an individual against the Austrian postal service after it collected information on the political affinities of Austrian citizens and used that to create targeted address lists without providing the data subjects with required information.
The ECJ held that the right to compensation under Article 82(1) GDPR is clearly subject to three cumulative conditions: an infringement of the GDPR; material or non-material damage resulting from the infringement; and a causal link between the two. Accordingly, not every infringement gives rise, by itself, to a right to compensation, as it may not result in damage. The ECJ went on to hold that there is no requirement for non-material damage to reach a certain level of seriousness in order to give rise to a right to compensation. It is for Member States to assess damages in accordance with applicable national rules.
The Advocate General of the ECJ has published an opinion relating to the long-running Deutsche Wohnen litigation.
The Berlin Court of Appeal asked the ECJ:
The AG opined that it is not necessary to establish the liability of a natural person, and that data protection regulators cannot issue administrative fines unless the relevant breach was intentional or negligent. In other words, there is no strict liability for breaches of the GDPR. If followed by the ECJ, this will have wider implications for German law, which goes further than EU law on this point.
The ICO has launched public beta testing of its free Innovation Advice service designed to answer specific data protection questions. The service is open to any size of organisation across all sectors, planning to use UK personal data to drive a new or innovative product, service or business model that is not yet live. Questions can be asked at any stage of development prior to launch. The ICO aims to respond to queries within 10-15 days.
The US Federal Trade Commission is proposing to expand its 2020 privacy order against Meta. Alleging that Facebook has not complied with elements of the order, which related to the Children's Online Privacy Protection Act Rule, the FTC is now suggesting Meta platforms be restricted from monetising personal data of under-18s or using it for commercial gain once they turn 18. The FTC is also suggesting Meta should have to get independent confirmation that its privacy practices comply with relevant law, and disclose use of and get consent to any future uses of facial recognition technology.
There is debate over the extent to which the original order can be amended and Meta will be fighting the FTC's latest complaint, accusing it of being a "political stunt".
ENISA (the European agency for cyber security) has published a report Cybersecurity of AI and standardisation. The report provides an overview of existing and proposed standards to prepare for the EU's AI Act. Key recommendations include:
The European Parliament continues to work on its compromise position for the draft AI Act and has reportedly reached provisional political agreement. According to Euractiv, the latest draft proposes distinguishing between foundation models – AI trained on broad data at scale, designed for general output but adaptable to a wide range of tasks (like ChatGPT) – and general purpose AI which can be used and adapted to purposes for which it was not intentionally or specifically designed. The proposal is that foundation models would be subject to stricter requirements, for example, testing, risk mitigation, data governance and maintenance requirements. Generative AI foundation models would also be subject to additional transparency requirements. There are also moves to share responsibility across the AI value chain, and a proposal for the European Commission to develop non-binding standard contractual clauses for particular sectors.
The Parliament has reportedly added a layer to the high-risk classification whereby a model which falls within Annex III would only be deemed to be high risk where it posed a significant risk of harm to health, safety or fundamental rights. AI used to manufacture critical infrastructure will be categorised as high risk if it entails a severe environmental risk. Recommender systems of VLOPs (under the DSA) will also be high risk.
Once the Parliament's compromise text is adopted, the AI Act will move to trilogues.
The Digital Regulatory Cooperation Forum has published its workplan for 2023/24. There will be a particular focus on online safety and data protection, promoting competition and data protection, illegal online financial promotions, supporting effective governance of AI and algorithmic systems, enabling innovation in relevant regulated industries, digital assets, and, more broadly, on joint horizon scanning and cooperation.
Meta has made provision for significant fines and a potential halt to its EU-US data transfers, according to filings with the US Securities and Exchange Commission. Meta has provided for the possibility that the Irish Data Protection Commissioner will rule that its use of Standard Contractual Clauses and supplementary measures does not sufficiently protect EU personal data transferred to the USA. A decision is expected by 12 May. There is likely to be a three-month compliance period but, if the EU-US Data Privacy Framework is not agreed by the end of that period, Meta has warned it may need to cease a number of its EU services including the Facebook and Instagram platforms.
The EDPB has published a Data Protection Guide for small businesses as part of its awareness-raising programme. It covers the basics of data protection including cyber security, data subject rights and data breaches. It also includes links to materials for SMEs developed by Member State data protection regulators.
AG Pitruzzella has delivered an opinion on a reference to the ECJ from Bulgaria. The questions arose following a breach of data held by a public body, which resulted in claims for non-material damage based on worry and fear that the data might be misused in future. The questions related to appropriate security measures, breach liability and compensation for non-material damage.
Points of interest include the AG's opinion that:
The Business, Energy and Industrial Strategy Committee has published a report on workers' rights and protection. Among the recommendations are that the government introduce a right for workers to be consulted and notified when technology will result in their surveillance, and to consult on an enforceable code of practice on the use of surveillance technology in the workplace.
The Data Protection and Digital Information Bill has had its second reading in the Commons. Parliament has issued a call for written evidence on the DPDI2 Bill, calling on those with relevant experience or a special interest to submit evidence as soon as possible for consideration by the House of Commons Public Bill Committee, which is expected to report by 13 June 2023. The House of Commons Library published a research briefing on the Bill at the end of March.
The EEA Data Protection Authorities (DPAs) have published a report on the outcome of the work carried out by the task force set up to look into the 101 complaints filed by NOYB following the Schrems II decision. The report sets out the common positions of the task force members on the first cases concerned. The task force was set up to promote a consistent approach to handling the identical complaints filed in relation to transfers of personal data to the USA via Google Analytics and Facebook Business Tools.
The views expressed in the report do not represent the position of the EDPB, nor do they prejudge the outcomes of remaining complaints. Key findings include:
The ICO has issued reprimands to Surrey and Sussex Police for using an app that recorded phone conversations and unlawfully processed personal data. In line with its policy of not issuing large fines to public organisations, the ICO chose to issue the reprimands in place of fines which could have been £1m for each force. The app was available to staff and recorded all incoming and outgoing phone calls. 1,015 members of staff downloaded the app and more than 200,000 phone conversations were subsequently recorded. Those downloading the app, as well as those making calls, were unaware that the calls would be recorded.
Following concern from multiple Member State regulators about ChatGPT and potential non-compliance with the GDPR, the EDPB has set up a dedicated task force to foster cooperation and exchange information on possible enforcement actions conducted by data protection authorities. The Italian regulator, the Garante, has lifted its ban on the chatbot subject to OpenAI making changes to its privacy practices, including around transparency and lawful basis.
The ICO has responded to the government's AI White Paper consultation. It broadly supports the government's aims, including the AI sandbox and the overall sector-based approach, however, it does raise a few issues including:
The ICO stresses the important role of the Digital Regulation Cooperation Forum and recommends that the government prioritise research into the type of guidance and the Sandbox activities that AI developers would most value.
The ICO has published a blog explaining that it is closing its investigation into the use of live facial recognition (LFR) by Facewatch. Facewatch's product uses LFR to help businesses (usually shops) protect staff, customers and stock by spotting 'subjects of interest'. The ICO had raised concerns about the product. It agreed that there was a legitimate interest in its use (to prevent crime) but had issues including in relation to the amount of personal data processed. The ICO said Facewatch had made improvements, including by reducing the personal data collected and appointing a DPO. As a result, the ICO is closing its investigation, but it warned that use of LFR would be assessed on a case-by-case basis.
The EDPB has adopted an Article 65 dispute resolution decision concerning a draft decision by the Irish Data Protection Commissioner on whether or not Meta is lawfully transferring personal data to the USA. The content of the decision has not been made public and the Irish DPC must now adopt its final decision within one month. The EDPB will publish its decision after the Irish DPC has notified its decision to Meta Ireland.
The European Parliament LIBE Committee has adopted its rejection of the proposed EU-US Data Privacy Framework, as expected. This is unlikely to derail the adoption process as the European Parliament does not have a binding vote in the approval process.
The EDPB adopted final Guidelines on data subject rights – Right of access. The guidelines analyse the right of access and set out clarification on its scope, the information the controller has to provide to the data subject, the format of the access request, the main modalities for providing access, and the notion of manifestly unfounded or excessive requests. Some minor clarifications were added during the consultation phase.
The UK government has published a White Paper which sets out a framework for the UK's approach to regulating AI as part of the National AI Strategy and following a 2022 consultation. The government has decided not to legislate to create a single function to govern the regulation of AI. It has elected to support existing regulators in developing a sector-focused, principles-based approach. Regulators including the ICO, the CMA, the FCA, Ofcom, the Health and Safety Executive and the Human Rights Commission will be required to consider the following five principles to build trust and provide clarity for innovation:
UK regulators will publish guidance over the next year which will also include practical tools like risk assessment templates, and standards. The guidance will need to be pro-innovation, proportionate, trustworthy, adaptable, clear and collaborative, underpinned by the following four core elements of the government's AI framework:
Further elements to be considered by regulators are set out in Annex A.
The government also supports the findings of the Vallance Review published earlier in March, which looked at the approach to regulating emerging and digital technologies. With regard to AI, Sir Patrick Vallance recommended:
Interestingly, while providing for a regulatory sandbox, the AI White Paper does not set out further policy on the relationship between IP and generative AI although the Intellectual Property Office is working on a code of practice which is expected to be ready by the Summer.
The government has also published:
The government will monitor the effectiveness of this policy and of the resulting guidance, and consider whether it is necessary to introduce legislation to support compliance with the guidance. It intends to publish an AI regulatory roadmap which will set out plans for establishing central government functions for the four elements of the AI framework. The government also plans to publish a draft AI risk register for consultation, an updated roadmap and a monitoring and evaluation report some time after March 2024.
The European Data Protection Supervisor will be joining the EDPB's coordinated enforcement action against DPOs. While the EDPB will focus on the role of the DPO at national level in the public and private sectors, the EDPS will concentrate on the role of DPOs in EU institutions.
Meta has announced that it is switching its lawful basis for processing data for behavioural adverts from contractual necessity to legitimate interests from 6 April. This is as a result of the Irish Data Protection Commissioner's compliance order following its conclusion that Meta could not rely on contractual necessity for this type of processing on its Facebook and Instagram platforms. Meta also said "Relevant users will also be notified of this change, which will give them additional options around how we process certain information to serve behavioural advertisements".
The Wall Street Journal reports that the "additional options" will allow EU users to submit an online opt-out form electing to receive targeted advertising based on broad categories of data. The opt-out request would then be reviewed before being actioned.
The ICO has published a list of eight questions for developers and users of generative AI to ask themselves. The list responds to the rise of generative AI and large language models (LLMs), the letter signed by academics calling for a six-month moratorium on AI development, and the recently reported ChatGPT data breach.
The ICO reminds organisations developing or using generative AI that they need to take a data protection by design and default approach and to be aware that using personal data to train or develop generative AI will engage data protection law, whether or not the data comes from publicly available sources. They should ask themselves the following questions:
Meanwhile in Italy, the Italian data protection regulator, the Garante, announced an immediate ban on ChatGPT and an investigation into the GDPR compliance of OpenAI, the company behind it. The Garante says OpenAI does not have a lawful basis for processing such large amounts of personal data to train ChatGPT, it does not verify the age of users and so exposes minors to unsuitable answers, there are transparency issues, and there are security concerns following a recent data breach. OpenAI has been given 20 days to respond with a compliance proposal or face a fine. While disagreeing with the Garante's findings, OpenAI has disabled access to ChatGPT in Italy. German regulators are also reportedly considering their response pending further information from the Italian regulator.
The ICO has set out a prioritisation framework for handling complaints made against public authorities under the Freedom of Information Act (FOIA) and the Environmental Information Regulations. It will prioritise complaints which have a significant public interest, assessing them with reference to whether:
The ICO has fined TikTok Technologies UK Limited and TikTok Inc (TikTok) £12.7m for a number of data protection breaches relating to children's data. These included allowing more than one million UK children under 13 to use its platform contrary to its terms of service, processing the personal data of under-13s without parental consent, and not taking adequate age verification measures, including action to remove underage children from the platform.
The ICO reduced the fine which was originally set at £27m after taking into account representations from TikTok and not pursuing the provisional finding relating to unlawful use of special category data.
The High Court has ruled that the immigration exemption in the Data Protection Act 2018 remains unlawful and must be made clearer, saying previous actions by the government to make the exemption comply with the UK GDPR have not resolved the issue. The court said the government needs to make it a legal requirement to comply with a code or policy setting out the safeguards and tests to be applied before using the immigration exemption. An obligation merely to "have regard to" it is insufficient.
The European Data Protection Board has adopted final revised Guidelines on Personal data breach notification under the GDPR. Non-EEA controllers will be dismayed to see that the guidelines advise notifiable breaches must be notified not just to the supervisory authority in the country of the controller's representative, but to the SA of each Member State in which affected individuals live. The EDPB points out that the one-stop-shop mechanism is not engaged by the presence of a representative in a Member State.
The EU Council has agreed its position on the Data Act. It has approved the overall aims of the Data Act but has suggested a number of amendments to the Commission's proposal. These include:
This paves the way for trilogues to begin after the European Parliament adopted its negotiating position earlier this year.
The ICO's Children's Code applies to information society services likely to be accessed by children, even where these services are intended to be adult-only. To support service providers in assessing whether or not their services are likely to be accessed by children, the ICO has published draft guidance for consultation. The guidance covers:
The consultation is open until 19 May 2023.
The Belgian DPA has chosen to suspend the six month implementation period for IAB Europe's TCF action plan. The action plan was presented by IAB after the Belgian regulator found its Transparency and Consent Framework, intended to tackle consent for advertising, breached the GDPR. IAB presented an action plan to remedy the issues but also appealed aspects of the DPA's decision before the CJEU. The DPA approved the action plan in January 2023, allowing for a six month implementation period, before the CJEU had considered the appeal. The suspension is a result of a second appeal by IAB Europe to the Belgian Market Court. If the Belgian Market Court upholds the DPA's approval, the implementation period will resume at that time with a deadline sometime towards the end of 2023 instead of in July.
Fifteen EU governments have provided formal comments on the draft Regulation on combatting child sex abuse materials (CSAM). Threats to end-to-end encryption and data retention were cited as particular concerns. Germany requested an addition to the draft text to the effect that technologies which disrupt, weaken, circumvent or modify encryption will not be used. Other countries requested that the legislation contain explicit data retention parameters.
Alongside a Communication on the Single Market at 30, the EC has outlined plans to build a European data space for public procurement data. The architecture and analytics toolkit should be put in place by mid-2023, and procurement data published at EU level will be made available in the system. By the end of 2024, all participating national publication portals should be connected, historic data published at EU level integrated, and the analytics toolkit expanded. The data space will pool data on the preparation for tenders, their calls and outcomes. The aim is to enable more targeted and transparent public spending and boost policy making.
ChatGPT users have been notified that a data breach exposed payment-related and other personal information of 1.2% of ChatGPT subscribers. Full credit card numbers were not exposed. The tool was taken offline while the vulnerability was fixed.
In line with its ICO25 strategy, the ICO has published updated guidance on AI and Data Protection, following requests for clarification on fairness requirements when using AI. The guidance has been restructured and a number of sections have been expanded or added. It covers:
What's new?
Similarly, the fairness section is now stand-alone and has been expanded to provide information on:
The ICO says the updated guidance is in line with the government's ambitions to adopt a pro-innovation approach to AI with embedded principles of fairness. The ICO expects further updates to the guidance will be required to keep up with technological and legislative developments.
Separately, it's been reported that the ICO will set up a hotline to provide guidance on emerging technologies, initially as a pilot project, in order to help companies apply legislation to their technologies.
As we discuss in more detail here, the CNIL has listed its top four priorities for 2023:
The Amsterdam District Court has held that Facebook Ireland breached data protection law when processing the personal data of Dutch Facebook users between 1 April 2010 and 1 January 2020. Personal data was processed for advertising purposes in breach of lawful basis, transparency and information requirements. Facebook was, however, permitted to have placed the obligation to inform users about third party cookies onto the third party website operators. The issue before the court was whether or not Facebook had acted lawfully – compensation was not available.
Separately, the Austrian DPA has published a decision in response to one of the NOYB complaints saying that Facebook's tracking pixel tools breach the GDPR and violate the Schrems II decision on data exports. The DPA said that the same principles which led to its declaration that the use of Google Analytics is unlawful, apply to the Facebook Login and Meta Pixel tools. These tools use data which is inevitably transferred to the USA. The DPA advises EU website operators not to include any tools from Meta on their websites. No fines were imposed.
The EDPB has announced its second annual co-ordinated enforcement action, which will look at the designation and position of Data Protection Officers (DPOs). 26 Data Protection Authorities are expected to take part in the process and will send questionnaires to DPOs in their respective jurisdictions. They will be focusing on issues including independence, the appointment process and other GDPR requirements. Findings will be analysed to decide whether there is a need for further action and the EDPB will publish a report.
The ICO has reduced the fine handed down to Easylife in October 2022, from £1.35m to £250,000 for using customer purchase data to profile and predict their medical conditions, and then target them with health-related products without consent. Easylife was separately fined £130,000 for making unsolicited marketing calls. The ICO found that Easylife profiled customers making purchases from its Health Club catalogue in order to target them with health-related items. This meant the company was processing 'invisible' health data – people were unaware that their personal data was being used for that purpose.
Easylife has since stopped the unlawful processing of special data. The ICO has reduced the monetary penalty following an appeal, saying "Having considered the amount of the penalty again during the course of the litigation, in light of issues raised by Easylife, I considered that a reduction was appropriate".
The newly created Department for Science, Innovation and Technology (DSIT), has published the Data Protection and Digital Information (No.2) Bill (DPDI2). The original Bill (DPDI1) was published in July 2022 and then put on hold in September under the Liz Truss government to allow for further consideration. DPDI2 is substantially similar to its predecessor with largely minimal changes and clarifications. Unfortunately, it still operates by amending existing legislation rather than producing a complete piece of draft new legislation which makes it hard to digest.
Notable amendments include:
The government has also published a summary of key ECHR issues under the Bill and information standards for health and social care. It remains confident that nothing in DPDI2 will jeopardise EU adequacy.
Euractiv has seen a new compromise text for the Cyber Resilience Act published by the Swedish presidency of the EU Council. It suggests that changes are not particularly significant and that some of the more controversial elements have not yet been amended. There is clarification on interaction with the AI Act and General Product Safety Regulation, a new article mandating Member States to put appeal procedures in place for product manufacturers to challenge the decision of accredited auditors, and clarification around categories of penalties.
The Norwegian regulator is the latest DPA to find (on a preliminary basis) that use of Google Analytics breaches GDPR rules on data transfers. A formal decision is likely at the end of April. The DPA recommends use of alternative tools.
The High Court awarded general damages of £60,000 plus special damages in respect of claims including infringement of privacy and misuse of private information. The claims were made after the defendant covertly recorded naked images of the claimant and then published them on a website alongside a photograph of her face. The court agreed that the knowledge these images had been published online led to the claimant suffering chronic PTSD and to a personality change.
The Irish DPC has published its 2022 annual report which sets out the significant role the Irish Data Protection Commission plays in enforcing the GDPR. Over the year, the DPC concluded 17 large-scale investigations and issued fines of more than €1bn, representing two thirds of the financial penalties imposed for GDPR breaches in the EU. This is in addition to dealing with breach notifications, cross-border and national complaints, and carrying out direct marketing investigations.
TikTok has announced its plans to host EU user data on servers in Ireland and Norway and to have any data transfers outside the EU vetted by an independent third party. The proposals are similar to those being introduced in the US under Project Texas. TikTok will also introduce security gateways to enhance data access control by determining employee access to EU user data, and work to incorporate privacy enhancing technologies.
The ICO has published guidance to help product and UX designers, product managers, QA testers and software engineers embed data protection into their products and services by design. The guidance sets out key considerations for each stage of product design up to post-launch.
The European Parliament has agreed its negotiating position on the EC's draft Data Act. Suggested amendments include:
Once the Council agrees its final position, trilogues will begin.
The EDPB has adopted its non-binding Opinion on the adequate protection of personal data under the EU-US Data Privacy Framework (DPF). In summary, the EDPB welcomes improvements as compared with the Privacy Shield, in particular, the recognition of the principles of necessity and proportionality, and the enhanced oversight and redress regime. However, it also expresses a number of concerns, recommending the Commission seek clarification, and underlines that the effectiveness of the framework will depend on the extent to which it is followed through in practice. In other words, the EDPB is not dismissing the DPF nor seeking to block the EU-US adequacy decision, but neither is it giving unqualified support.
General data protection aspects
The EDPB comments that the DPF Principles have not changed significantly from those under the Privacy Shield. As a result, some of the issues of concern under the Privacy Shield remain, including those relating to the rights of data subjects, the absence of key definitions, lack of clarity around application to processors and a broad exemption for publicly available information. The EDPB is also concerned about protections for onward transfers, and the fact that protections around automated decision-making, profiling and AI technologies tend to be sector specific. The EDPB says that in these areas, rules are needed to provide sufficient safeguards, including the right for the individual to know the logic involved, to challenge the decision, and to obtain human intervention where the decision significantly affects them.
The EDPB stresses the importance of effective oversight and enforcement and underlines the need for compliance checks. The EDPB says it will be monitoring these aspects, and the effectiveness of redress mechanisms (many of which are the same as those in the Privacy Shield) closely, including in the context of periodic reviews.
Access to EU personal data by US public authorities
The EDPB recommends that Executive Order 14086 (EO) be accompanied by updated policies and procedures across all US intelligence agencies. It recommends the Commission assess these and share its assessment with the EDPB. The EDPB says the EO represents a "significant improvement" by introducing additional safeguards and the concepts of necessity and proportionality into the US legal framework on signals intelligence. It also finds the proposed redress mechanism for EU citizens alleging unlawful use of their data by US public bodies to be "significantly improved" compared with the Ombudsperson mechanism under the Privacy Shield. However, the EDPB sees a need for further clarification on questions relating in particular to "temporary bulk collection" and to the further retention and dissemination of bulk collection data.
The EDPB's focus is on the holistic approach to the safeguards and it raises a number of concerns about particular aspects of the US bulk data collection regime under FISA and Executive Order 12333. It also raises concerns about the practical functioning of the Data Protection Review Court which, it says, will require monitoring by the Commission to ensure it is not routinely dismissing claims.
Review and monitoring
The EDPB concludes that the EO provides "substantial improvements" compared to the previous framework but asks for its concerns to be addressed and for the Commission to provide requested clarifications and ongoing monitoring of the implementation of the DPF and the safeguards it provides. It also says it expects the Commission to stick to its commitment to suspend, repeal or amend the adequacy decision on grounds of urgency if necessary.
Full steam ahead?
Despite expressing some concerns, the EDPB's Opinion does not contain anything likely to hold up a new EU-US adequacy decision and, notwithstanding the disapproval of the European Parliament's LIBE Committee, we should see the decision approved shortly.
The European Commission has published a call for evidence regarding its plans to introduce legislation to further harmonise GDPR enforcement by national regulators. The legislation is likely to harmonise administrative procedures and cooperation mechanisms for cross-border cases. The call closes on 24 March 2023.
The EDPB has adopted its work programme for 2023/24. The programme is based on the EDPB's strategy to 2023 and is grouped around four pillars. Broadly, we can expect guidelines on a wide range of issues including:
The EDPB will also focus on:
The EDPB has adopted final guidelines on:
The EDPB also published a one-stop-shop case digest which looks at thematic issues from one-stop-shop decisions relating to the Article 17 right to erasure and the Article 21 right to object. The report summarises the key issues and outcomes from the decisions at a very high level and contains links to the relevant EDPB decisions.
The ICO has called on UK accountants to help their SME clients establish compliant data protection practices. It sets out a list of seven questions which accountants can ask. These cover basic compliance issues including security measures, SARs, data breaches, privacy notices and data mapping.
China has published standard contractual clauses and SCC Regulations which will take effect on 1 June 2023. Businesses will be able to use SCCs in order to transfer personal data where:
The SCC Regulations prohibit breaking down the data volume to circumvent the CAC security assessment which is required under the Measures on Security Assessment for Outward Data Transfer.
The SCCs are not modular – there is one template for all transfers. Before carrying out a transfer, the exporter must carry out an impact assessment and prepare a report covering specified information and SCC agreements must be filed with the specified authorities.
Christopher Jeffery | Senior Counsel
The European Parliament's LIBE Committee has refused to back the draft EU-US adequacy decision, saying the Data Protection Framework does not provide EU citizens with a level of data protection equivalent to that in the EU. The Committee urges the Commission to renegotiate; however, its opinion is not binding on the Commission as its part in the adoption process is limited to a right of scrutiny. All eyes will now be on the much more important opinion of the EDPB, which is due in the next few weeks.
The ICO has published top tips for games designers on how to comply with the Children's Code. These include:
The Information Rights Tribunal has ruled on an appeal against the ICO's action to require credit reference agency Experian Limited to change its data protection practices. The ICO issued an enforcement notice to Experian in October 2020, following a two year investigation into how the company and two other major credit reference agencies were using the personal information of adults for direct marketing purposes.
The Tribunal agreed with the ICO's findings that Experian had not processed the personal data of over five million individuals transparently, fairly or lawfully because it failed to notify them it was processing the data for direct marketing purposes. The Tribunal did, however, disagree with the ICO's assessment that Experian's privacy notice was insufficiently transparent, that using credit reference data for direct marketing purposes was unfair, and that Experian failed to properly consider its lawful basis for that use.
The ICO has said it will consider the ruling and decide whether or not to appeal.
IAB Europe has asked the Belgian Market Court to grant it protection on an interim basis for its ongoing work on its Transparency and Consent Framework (TCF). IAB agreed a corrective measures plan with the Belgian DPA after it said that the TCF did not comply with the GDPR. While the DPA said that the corrective measures would bring the TCF into GDPR compliance, aspects of the DPA's initial ruling are now being reviewed by the CJEU. IAB is now asking the Belgian court to shield iterations of the TCF which are less directly impacted by the reference to the CJEU.
The EDPB's eagerly awaited ruling on the validity of data transfers by Meta's Facebook and Instagram services under Standard Contractual Clauses is set to be published by 14 April. The Irish DPC, acting as lead supervisory authority, provisionally ruled that the transfers were unlawful. If the EDPB backs this up before an EU-US adequacy decision is reached, Meta has said it may need to suspend its services in the EU.
A coalition of "leading privacy self-regulatory organisations" - regional Digital Advertising Alliances – has published tech specifications and user interface guidelines to help brands and publishers simplify and improve user choice experience. The aim is to allow integration of the AdChoices option with consent management platforms.
The CMA has said it will extend the period of its investigation into suspected breaches of the Competition Act 1998 by Meta to summer 2023. The investigation relates to Facebook's collection and use of data in the context of providing online advertising services, and its single sign-on function, and whether this results in an unfair competitive advantage over its competitors.
Meta has announced it is updating its 'Why am I seeing this ad' information to cover information about machine learning, and will provide summarised information about how different actions on its platforms influence machine learning models.
The EDPS has published an Opinion on two EC draft Regulations which deal with the collection and transfer of advance air passenger information (API) collected during the check-in process. They are intended to replace Directive 2004/82. The EDPS recommends:
The EDPS supports harmonised practices in relation to the collection, sharing and retention of API.
The Economic and Financial Affairs Council has adopted a Decision authorising EU Member States to ratify the Second Additional Protocol to the Convention on Cybercrime (the Budapest Convention) which provides common rules at international level to enhance cooperation on cybercrime and the collection of evidence in electronic form for criminal investigations or proceedings. The UK has already signed the protocol along with 24 other countries.
Christopher Jeffery | Senior Counsel
The CJEU has ruled in a reference from Germany on interpretation of Article 38 GDPR. The questions referred related to the role of national legislation in determining when a DPO can be dismissed, and interpretation of the Article 38(6) reference to 'conflict of interests'. It followed a claim for unfair dismissal after a German DPO who was also Chair of the Works Council was dismissed from the DPO role when the GDPR took effect on the basis that there was a conflict of interest.
The CJEU held that:
The UK government has published a call for views on software resilience and security for businesses and organisations. The UK is looking to strengthen the resilience of digital products and services throughout the business supply chain. It wants to better understand the nature of software risks to UK organisations as a whole and where the government should focus on mitigating them. A wide range of software used by organisations and in business is covered, including SaaS, open source software, embedded software and firmware. Input is sought across the digital supply chain including from developers, cybersecurity experts, users and maintenance service providers.
The Home Office is consulting on proposals to update the Computer Misuse Act, following a 2021 announcement that it would be reviewed. The consultation follows a call for information and is of particular interest to law enforcement agencies, domain name registrars and registries, and hosting providers. The review of the Computer Misuse Act is being carried out to ensure the UK's legislative framework continues to support action against the harm caused by online cybercrime. Views are sought on three proposals for legislation:
The consultation closes on 6 April 2023.
The Italian DPA, the Garante, has issued an order imposing a provisional limitation of processing on US AI chatbot Replika. The chatbot generates a 'virtual friend' which interacts with the user. The Garante said that the chatbot posed a particular risk to children and contained no age verification measures to protect them. Replies served by the chatbot were often inappropriate and there was no valid lawful basis for processing. Luka Inc., the company behind the product, was ordered to cease processing the data of Italian users immediately and to inform the Garante within 20 days of the measures taken to comply, or face a fine of up to €20m or 4% of annual global turnover.
The Biometrics and Surveillance Camera Commissioner has published a joint Annual Report 2021-22 covering biometrics and surveillance technology. In his introduction, he has warned that the current plan to replace the Surveillance Camera Code with the DPDI Bill effectively does away with current rules without providing a comprehensive replacement framework or suitable oversight. He also expresses concerns about 'mission creep' of ANPR cameras, the use of drones capturing footage of public spaces, and the use by law enforcement of citizen phone camera footage.
Jo Joyce | Senior Counsel
The EC has announced it will oversee the progress of every large-scale GDPR enforcement case. The Commission will require all national regulators to share an overview of large cross-border investigations every other month. The Commission will take into account the steps the DPAs are taking and how long they take. This move comes off the back of complaints made by the Irish Council for Civil Liberties, following which the EU Ombudsperson recommended the Commission monitor the progress of big tech cases under the Irish Data Protection Commissioner's jurisdiction. However, the Commission has decided to apply the same regime to all EU regulators and to all large-scale cross-border cases, although in practice this is most likely to impact the Irish DPC.
On 20 January 2023, the ICO said it had decided to stop enforcing failures to file personal data breach reports under Regulation 5A of PECR which requires a communications service provider to notify the ICO within 24 hours of becoming aware of a data breach. The ICO's decision was based on the fact that the incidents tend to be caused by human error, involving one individual, are quickly resolved and result in risk remediation measures being swiftly implemented. Following feedback, the ICO has updated its statement which now says that it will use its discretion not to take enforcement action provided breaches are still reported within 72 hours. The ICO will continue to take enforcement action in relation to the underlying breaches reported where warranted, and continues to expect breaches likely to adversely affect the personal data or privacy of subscribers or users to be reported within 24 hours.
The ICO has published a statement confirming that the use of Facial Recognition Technology (FRT) by North Ayrshire Council in nine schools was likely to have infringed the UK GDPR. The likely infringements included:
The ICO said FRT could be used lawfully in schools but in this case, it had most likely not been. In line with its policy on enforcement against public bodies, the ICO made a series of recommendations to be applied by the Council, rather than imposing a financial penalty.
DCMS has published a policy paper on how organisations can be certified under the UK digital identity and attributes trust framework (DIATF), together with consolidated guidance on the digital identity programme. The framework, currently under development, will enable digital identities to be reused in a secure manner. Organisations must be certified to participate and can already complete this process.
DHSC has published draft statutory guidance pursuant to s274A of the Health and Social Care Act 2012. The draft guidance sets out measures NHS England is required to take to protect the confidentiality of patient data as it has now taken over NHS Digital's statutory functions as of the end of January 2023. NHS England is required to adopt the same statutory protections implemented by NHS Digital together with additional measures to further enhance confidentiality and data protection.
The European Agency for Cybersecurity (ENISA) has published a report on Engineering personal data sharing. The report looks more closely at specific use cases relating to personal data, primarily in the health sector. It discusses how specific technologies and considerations of implementation can support meeting specific data protection requirements, including through cryptographic techniques. It also discusses data sharing in the health sector and using third party services, as well as considerations for giving effect to data subject rights.
Christopher Jeffery | Partner
The Irish Data Protection Commissioner's draft decision on the lawfulness of Meta's data transfers from the EU to the USA, is to go through the Article 65 dispute resolution procedure after concerned supervisory authorities could not resolve disagreements. The Irish Data Protection Commissioner had sent them a draft decision in July 2022, which, if approved, would require Meta to cease transferring personal data to the US under Standard Contractual Clauses. Meta has warned this would result in it ceasing to offer services in the EU, however, alongside this dispute, the EU is working towards finalising a US adequacy decision in the Spring.
The Swedish Presidency of the Council of the European Union has published its compromise text for the Data Act, as reported by Politico. Among the changes are provisions to ensure consistency with the GDPR, enhanced transparency obligations on cloud service providers around cross-border data transfers, and new provisions relating to governmental access to data. Provisions to protect trade secrets have been bolstered, as have cloud service switching requirement measures. The fairness requirement on data-sharing obligations has been broadened to apply to all contractual arrangements, regardless of the size of the contracting business.
JD Sports has said a cyberattack potentially accessed the personal data and financial information of 10m customers. The incident affected some online orders made by customers between November 2018 and October 2020 and targeted purchases of products of its JD, Size?, Millets, Blacks, Scotts and Millets Sport brands. The company has informed the ICO and will be contacting affected customers. The information which may have been accessed included names, billing and delivery addresses, phone numbers, order details and the last four digits of payment cards. In a press release, JD Sports said the affected data was "limited" as it did not hold full payment data and did not believe account passwords had been accessed.
Christopher Jeffery | Partner
The Irish Data Protection Commissioner has fined WhatsApp Ireland €5.5m. The decision follows on from the fines handed down to Instagram and Facebook earlier this year and relates to similar issues, i.e. transparency and reliance on contractual necessity for processing operations, and whether that reliance amounted to forced consent. In its draft decision, the DPC found there were breaches of transparency obligations.
It also found WhatsApp was not effectively relying on consent so the contention that it used 'forced consent' was not sustainable. The DPC said that WhatsApp was not required to rely on consent. None of the competent authorities disagreed with this part of the analysis and that element of the complaint was rejected. The German SA (with whom the complaint was originally filed) is now responsible for adopting a separate decision for the parts of the complaint which have been rejected under Article 60(9) GDPR.
The DPC then went on to consider whether, in principle, the GDPR precluded WhatsApp Ireland's reliance on contractual necessity and decided it did not. There were objections to this view from six competent supervisory authorities (CSAs) and the decision went through the Article 65 procedure under which the EDPB issued a binding decision. The EDPB upheld the DPC's views on breach of transparency provisions, adding an additional breach. It was in relation to this additional breach that the fine was imposed as the DPC had already imposed a €225m fine on WhatsApp Ireland for transparency breaches.
However, the DPC also found that WhatsApp Ireland is not entitled to rely on contractual necessity for the delivery of service and security (excluding IT security) for the WhatsApp service and that its processing of this data on the basis of contractual necessity to date is in breach of the Article 6(1) GDPR requirement to process personal data fairly and lawfully.
The DPC once again takes issue with the EDPB's direction to conduct a fresh investigation into WhatsApp Ireland's data processing operations to determine whether: it processes special category data; processes data for the purposes of behavioural advertising, marketing, data analytics, and the exchange of data with affiliated companies for the purpose of service improvements; and in order to determine whether it complies with the relevant obligations under the GDPR. As with the EDPB's directions to conduct investigations into Instagram and Facebook, the DPC says the EDPB does not have the authority to make such a direction and it will appeal to the CJEU to have it set aside.
The EDPB has adopted a report on work carried out by the Cookie Banner Taskforce which was established in September 2021 to coordinate the response to complaints about cookie banners filed across the EEA by NOYB. The taskforce was set up to share information and ensure a consistent approach to cookie banners in the EEA.
The report sets out the opinion of the taskforce on whether various types of commonly used cookie banners breach the ePrivacy Directive cookie consent requirements:
The EDPB has adopted a report on the findings of its first co-ordinated enforcement action which focused on the use of cloud-based services by the public sector. 22 DPAs across the EEA looked at 100 public bodies operating in a range of sectors. The report sets out the findings, lists actions already taken by DPAs in the field of cloud computing, and makes a series of recommendations to help with GDPR compliance when using cloud services.
DCMS has published an independent report it commissioned on data localisation requirements. It looks at the extent and impacts of potential data localisation measures and includes summary tables as well as 'deeper dives' on some jurisdictions.
The ICO's first Tech Horizons Report looks at technologies emerging over the next two to five years and analyses their impact on society in the context of personal data. The ICO says businesses must consider transparency, what control people have over their data, and how much data is gathered, in order to ensure services are data compliant and developed with consumer privacy at the forefront. The report covers:
Relevant businesses are encouraged to be part of the ICO's sandbox scheme and to consider privacy at an early stage in order to maintain public trust and confidence.
The UK and US governments published a summary of the first meeting of the US-UK Comprehensive Dialogue on Technology and Data. Among the issues discussed were the finalisation and implementation of a data bridge for UK-US data flows, collaboration on facilitating global trusted data flows, and development of AI standards and tools. The next formal meeting will be in January 2024 although quarterly progress reports will be published in the interim.
Victoria Hordern | Partner
The EDPB has published its Article 65 decisions on the Irish Data Protection Commissioner's investigations of Facebook and Instagram which led to significant fines last week. The Irish DPC is appealing the element of the decisions which requires it to conduct an 'own volition' investigation into the data processing practices of Facebook and Instagram and, in particular, their use of special data. The DPC claims the EDPB does not have the authority to do this and is asking the CJEU to annul that aspect of the EDPB's decisions.
In the meantime, the Guardian reported that Meta has said that Facebook and Instagram will restrict data available to advertisers which helps them target teens. From February, advertisers will not be able to access a user's gender or posts they have engaged with for targeted advertising purposes. They will only have access to the user's age and location. Teens will also be given extended user controls which will allow them to opt to see less of certain types of advertising.
IAB Europe says the Belgian Data Protection Authority (APD), has approved its plans to make changes to its Transparency and Consent Framework (an industry-led consent solution for adtech), to bring the TCF into GDPR compliance. The APD found that elements of the TCF were unlawful in February 2022, and IAB Europe submitted its rectification plan in April last year.
IAB Europe said the proposed measures were based on the assumptions that:
Both these assumptions have, however, been referred to the CJEU for a preliminary ruling. While IAB Europe says it is pleased with the APD's approval, it also expresses concern that it has pre-empted responses by the CJEU to the reference on the issues.
The CJEU has ruled on a reference from the Austrian Supreme Court in the Oesterreichische Post case relating to Article 15(1)(c) of the GDPR. This provides a right for data subjects to get information from a controller on request, about the recipients or categories of recipients to whom their data has been or will be disclosed. The referring court asked whether this meant that the controller had to disclose the specific identity of actual recipients.
The decision followed the Advocate General's Opinion and held that the controller is required to provide the data subject on request with the actual identity of recipients unless it is not possible to identify them, in which case it may indicate the categories of potential recipient. Where the request by the data subject is manifestly unfounded or excessive, the controller may also indicate categories rather than the actual identities of recipients. The CJEU said the right of access is necessary in order to enable the data subject to enforce other GDPR rights such as rectification and erasure. Without identifying specific recipients, this would not be possible.
The CJEU has ruled in a reference from Hungary, that remedies available to data subjects to enforce their rights through the courts under Articles 77 and 79 can be exercised concurrently. Article 77 provides a right to administrative appeal of a Supervisory Authority's decision. Article 79 allows for action before a civil court against a controller. The referring court asked whether in the context of reviewing the lawfulness of an SA decision, it is bound by the final judgment of the civil courts concerning the same facts and the same alleged GDPR infringement. The court was also concerned about a situation in which the decisions might conflict and asked whether one remedy should take priority over the other.
The CJEU held that actions can run in parallel and independently of each other. There is no priority or exclusive area of competence or jurisdiction, nor is there a rule of precedence. It is up to Member States to ensure that no conflict is created by procedural rules and that contradictory decisions in relation to the same processing of personal data do not exist within a single Member State. See our article for more.
The CNIL continues its scrutiny of cookie consent practices and has fined TikTok Ireland and TikTok UK €5m for breaches of the French Data Protection Act on the basis that:
The CNIL focused on the fact that while there was an 'accept all' button, there was no equivalent 'reject all' one but instead, several steps were required to reject cookies.
Digital Rights Ireland is appealing a decision by the Irish Data Protection Commissioner before the Irish Circuit Court. The DPC investigated a complaint by Digital Rights Ireland on behalf of 100m Facebook users whose data was left publicly accessible. The DPC agreed that Facebook had breached the GDPR by allowing the data to be scraped but found the breach did not warrant notification to individual users by Facebook. Digital Rights Ireland contends the event did constitute a notifiable breach to individuals.
The German competition regulator, the Bundeskartellamt, has sent Alphabet Inc., Google Ireland and Google Germany its preliminary assessment on Google's data processing terms. It says it assumes that s19a of the German Competition Act (GWB) applies, which means Google has to change its data processing terms and practices. The Bundeskartellamt says users are not given sufficient choice as to whether and to what extent they agree to the processing of their data across the Google ecosystem and that any choices offered are insufficiently transparent and too general. The way the choices are presented also makes it easier for users to consent than to refuse.
The Bundeskartellamt recognised that Google may face restrictions around combining personal data across its services under the Digital Markets Act and also that the GWB partially exceeds these future requirements. It said it is in close contact with the European Commission about that. A final decision is expected later this year.
Christopher Jeffery | Partner
The Irish Data Protection Commission has fined Meta Platforms Ireland Limited (formerly Facebook Ireland Limited) a total of €390m for breaches of the GDPR in relation to its Facebook and Instagram services following binding decisions by the EDPB under the Article 65 procedure. The DPC also requires Meta to bring its data processing operations in line with the GDPR within three months.
The DPC began two inquiries into Facebook and Instagram in 2018, following two complaints made on 25 May 2018 when the GDPR came into effect, which raised essentially the same issues. Ahead of the application of the GDPR, Meta had changed its terms of service for its Facebook and Instagram services. Having previously relied on consent to the processing of user personal data for the delivery of its services (including for behavioural advertising), Meta changed the lawful basis for most (but not all) of its processing operations to the 'necessary for performance of a contract' basis (Article 6(1)(b)). Users were required to accept the new terms of service in order to continue access.
Meta's position was that when the user accepted the terms of service, they entered into a contract with Meta. The relevant data processing operations were necessary in order to perform that contract and deliver Facebook and Instagram services (as the case may be), which included providing personalised services and behavioural advertising.
The complainants argued that Meta was effectively relying on consent rather than the contractual necessity lawful basis because by making accessibility to its services conditional on accepting the updated terms of service, it was forcing users to consent to the processing of their personal data for personalised services and behavioural advertising in breach of the GDPR.
The Irish DPC, as Meta's lead EU regulator, reached provisional findings that:
The DPC's draft decisions were then referred to other concerned supervisory authorities (CSAs) under the cooperation and consistency mechanism of the GDPR. In relation to the first finding about breach of transparency and information requirements, the CSAs agreed with the DPC's findings although argued the proposed fines were too low. However, ten of the CSAs took the view that Meta should not be permitted to rely on the contractual necessity lawful basis on the grounds that the delivery of personalised advertising could not be said to be necessary to perform the core elements of what they said was a more limited contract.
The DPC argued that the provision of Facebook and Instagram services include and appear to be premised on the provision of a personalised service which includes personalised or behavioural advertising. As such, this is central to the bargain struck between users and their chosen service provider and forms part of the contract concluded on acceptance of the terms of service.
As consensus could not be reached, the matter was referred to the European Data Protection Board under the Article 65 dispute resolution procedure. The EDPB reached its conclusions on 5 December. It upheld the DPC's position on transparency, added an additional breach of the fairness principle, and directed the DPC to increase the amount of the fines it was proposing to issue.
The EDPB also found that Meta was not entitled to rely on the contractual necessity lawful basis for the purpose of behavioural advertising.
The Irish DPC is bound to follow the EDPB's determinations. Accordingly, in its final decisions, it reflected the EDPB's findings and increased the fines imposed on Meta to €210m in respect of Facebook and €180m in respect of Instagram. It now requires Meta to bring its processing operations into GDPR compliance within three months.
A Meta spokesperson has said Meta will appeal the substance of the decision, reportedly adding "We strongly disagree with the DPC's final decision and believe we fully comply with the GDPR by relying on contractual necessity for behavioural ads given the nature of our services". Meta also published a blog defending and explaining its data protection practices in response to the decision.
The Irish DPC also noted that the EDPB had directed it to conduct a fresh investigation into Facebook and Instagram's processing operations and use of special data. The DPC said it does not believe it is "open to the EDPB to instruct and direct an authority to engage in open-ended and speculative investigation" and it will therefore bring an action for annulment of that element of the EDPB's decision before the CJEU.
While the EDPB's conclusion on the issue of contractual necessity was not altogether unexpected, it raises very real questions for Meta's business model in the EU and more widely. It suggests that for many controllers (for whom provision of personalised advertising is not core to their service from the data subject's perspective), legitimate interests or possibly consent are the only available lawful bases for such processing. However, being able to rely on legitimate interests is subject to the outcome of a legitimate interests assessment which will vary on a case by case basis, and consent is problematic as it is difficult to meet GDPR standards in this context. It now remains to be seen whether Meta will be successful in appealing aspects of the DPC's final decisions, and what, if any, changes it will make to its EU processing operations.
The NIS2 Directive was published in the Official Journal on 27 December 2022, entering into force 20 days later. Member States must adopt implementing legislation by 17 October 2024, to apply from 18 October 2024. The original NIS Directive (2016/1148) is repealed with effect from 18 October 2024. Regulation 910/2014 (eIDAS) and Directive 2018/1972 (establishing the European Electronic Communications Code) are amended from that date. For more on the NIS2 Directive, see here.
At the end of December 2022, France's data protection authority, the CNIL, fined Apple €8m over failure to get consent from French users to tracking for the purposes of serving personalised ads. It found that under iOS 14.6, when a user visited the App Store, identifiers used for several purposes, including personalisation of ads on the App Store, were automatically deposited without valid consent because consent was ticked by default and a number of steps were required for users to deactivate it.
The CNIL also fined Microsoft €60m for dropping third party cookies when users visited its search engine bing.com, without getting user consent or providing the ability for users to reject the cookies as easily as accepting them. Two clicks were needed to reject whereas only one was needed to accept. Microsoft was ordered to bring its practices into compliance within three months or face additional fines. Since 29 March 2022, there has been a 'refuse all' option on bing.com.
Christopher Jeffery | Partner
The European Commission published a draft adequacy decision for the EU-US Data Privacy Framework, following President Biden's Executive Order. The decision could be adopted as early as March 2023 although it is thought July is more likely. The draft decision is with the EDPB which must deliver an Opinion, following which it must be approved by a representative Committee of the Member States and then adopted by the Commission. The European Parliament also has a right of scrutiny.
US organisations will be able to join the Framework by committing to a series of privacy obligations. The US commits to limiting the access to EU data by intelligence agencies to what is necessary and proportionate, and there is provision for an independent and impartial redress mechanism, including through a Data Protection Review Court.
The safeguards provided under the adequacy decision will also be an indication of compliance with the Schrems II criteria for the purposes of other authorised transfer mechanisms. Read more.
The Data Protection (Adequacy) (Republic of Korea) Regulations 2022 are now in force. They provide for frictionless transfers of personal data between the UK and South Korea without the need for a transfer impact assessment or the use of an additional transfer mechanism. The Regulations also cover data transfers including personal data relating to credit information – data which is not covered by the EU's South Korea adequacy decision.
In 2021, the Irish Data Protection Commission fined WhatsApp €225m for breaches of the GDPR. The fine related to breaches of transparency requirements, particularly relating to the sharing of WhatsApp data with its parent company Facebook.
The Irish regulator, acting as Lead Supervisory Authority, had originally intended a lower fine of between €30-50m, however, its provisional decision was rejected by other regulators. The EDPB subsequently issued a binding decision under the Article 65 procedure requiring the fine to be increased. It also specified that WhatsApp be given a reduced time of three months to take required remedial actions in relation to its privacy practices.
WhatsApp asked the CJEU to annul the penalty and to allow it to recover costs but the fine was upheld in December.
On 9 December 2022, the Department for Digital, Culture, Media and Sport (DCMS) published a new voluntary code of practice for app store operators and app developers. The Code sets out minimum security and privacy requirements in the form of eight principles. Some of the principles within the Code are mandated through existing legislation, including data protection law and other principles will help stakeholders demonstrate steps towards adherence.
In summary the eight principles are as follows:
There will be a nine-month period for app store operators and app developers to adhere to the Code and the Code will be reviewed and, if necessary, updated no later than every two years in light of technological developments.
The government will start to consider how any of the Code's requirements which are not already mandated in law could be legislated for. It is also looking to identify recommendations for improving the security of apps available through non-mobile app stores intended for non-mobile devices such as games consoles and smart TVs.
The Product Security and Telecommunications Infrastructure Act received Royal Assent on 6 December. Part I creates a new security regime for connectable products while Part II deals with telecoms infrastructure reforms. Much of the substance of Part I is to be covered under secondary legislation. The PSTIA provides a framework but little detail. For more, see our article.
The UK and the Dubai International Financial Centre Authority have released a joint statement committing to increased facilitation of personal data flows. The statement says the UK and DIFC have made considerable progress towards creating a "data bridge" to allow frictionless flows of personal data between them and will continue to work towards that goal.
The CJEU has handed down a judgment on a reference from Germany regarding the interpretation of Article 17 GDPR and the right to be forgotten. The reference related to a case involving a request to delete links and thumbnails from search results on grounds of inaccuracy.
The CJEU commented on the Article 17(3) exemption where the processing of the data in question is necessary for exercising the right of freedom of expression and information. This requires a balancing exercise in accordance with proportionality principles. Generally, the starting point is that the data subject's rights will override the legitimate interest of internet users in accessing the information, however, this must be assessed on a case by case basis and in context.
The court also discussed what constituted inaccuracy for the purpose of giving effect to a right to be forgotten request. The court held that a search engine operator (SEO) must de-reference where the requestor proves that more than a part of minor importance is manifestly inaccurate. The requestor is only required to show evidence they can reasonably be expected to find and the SEO is not required to play an active role in trying to find facts which are not substantiated by the request. If the inaccuracy of the relevant information is not obvious in light of the evidence produced by the requestor, the SEO is not obliged to de-list. However, in such cases, the requestor must have the option to refer the matter to a supervisory or judicial authority and the SEO must warn internet users that proceedings contesting the accuracy of the information are in progress.
In relation to thumbnails published without the accompanying article, a separate balancing exercise is required and the informative value of the photos should be considered without taking into account the context of their publication. However, text information directly accompanying the photo should be considered.
The Rapporteur to the CNIL has recommended it fine Apple €6m for failing to adequately warn users they were being tracked by its own apps. The Rapporteur and the complainant, France Digitale, say that Apple did not apply the same standards of prior consent to its own app tracking transparency feature as it applied to its advertisers.
Following a high profile data breach in September 2022, in December, Uber said it had suffered a new data breach after a threat actor leaked employee email addresses, corporate reports and IT asset information stolen from a third-party vendor.
Meta has agreed to pay $725m to settle the US class action relating to the Cambridge Analytica data breach. This is the largest settlement in a US data privacy class action to date. Meta has not admitted wrongdoing and says it has overhauled its privacy practices since the time of the breach. It is unclear how plaintiffs will claim their share of the settlement which is likely to amount to not more than a few dollars per individual. A further hearing on the settlement is set to take place in early March. A multi-billion pound class action relating to the same issues has also been filed with the Competition Appeal Tribunal in the UK.
In a reference from Finland, AG Campos Sanchez-Bordona has delivered an Opinion relating to the right of access under Article 15(1) GDPR. An individual (JM) was seeking information about the identity and positions of the people who had accessed JM's personal data at the bank at which JM was both an employee and a customer. The bank, as controller, refused to provide the information, arguing that the Article 15(1) right does not apply to log data from the bank's data processing system which recorded which employees had accessed the customer data and at what times.
The AG opined that:
Epic Games, the maker of Fortnite, has agreed with the US Federal Trade Commission, that it will pay $520m to settle claims relating to alleged privacy violations and unwanted charges.
Epic will pay $275m for violating COPPA (the Children's Online Privacy Protection Act). The FTC alleged Epic had violated COPPA by collecting children's personal data without parental consent and making it difficult for parents to request deletion of their children's data. In addition, the FTC alleged Epic's practices of turning on text and voice communications by default harmed children and teens. Epic will also be required to adopt strong privacy by default settings for children and teens, and ensure voice and text communications are turned off by default.
A further $245m will be paid to refund customers for the use of dark patterns and deceptive interfaces, including for billing. The complaint against Epic alleged that Epic used dark patterns to trick users into making purchases, charged account holders without authorisation, and blocked access to purchased content when users questioned unauthorised charges.
The EDPB adopted dispute resolution decisions on the basis of Article 65 in respect of the Irish Data Protection Commissioner's decisions regarding Meta platforms Facebook, Instagram and WhatsApp. The decisions were made following complaint-based inquiries and focused on the lawfulness and transparency of processing for behavioural advertising. The WhatsApp decision looked at the lawfulness of processing for the purpose of improvement of services. The decisions had not been published at the time of writing pending notification of the controller by the Irish DPA who has one month to adopt final decisions following notification by the EDPB. Reports suggest Meta could face a fine of as much as €2bn. Meta says it is too early to speculate and it continues to engage with the Irish DPA.
The UK has unveiled its plans to update the NIS Regulations which implemented the NIS Directive. However, the proposals are narrower than those in the EU's recently approved NIS 2 Directive. This means in-scope organisations operating in the UK and the EU will have to comply with two different regimes going forward.
Christopher Jeffery | Partner
The government has published its response to its call for views on proposals to update the NIS Regulations which implement the EU NIS Directive. The government now proposes:
The UK's reforms will be less wide-ranging than those under the EU's NIS2 Directive and follow a more risk-based, flexible approach. This means the regimes will diverge and in-scope organisations which operate in both the UK and the EU will have to comply with both regimes. The new Regulations will be tabled when Parliamentary time allows.
The Council of the European Union has adopted its common position on the draft AI Act. Changes from the originally proposed text include:
The European Parliament is expected to agree its negotiating position on the AI Act in Q2 of 2023, at which point trilogues will begin.
The Irish DPC is reportedly looking into the data breach Twitter reported in August. The incident saw user emails and phone numbers published online after records were scraped from the Twitter system. An estimated 5.4 million accounts were affected although Twitter has not confirmed this number. The Irish regulator has written to Twitter asking for more information about the breach.
Meta has announced updated privacy practices to protect young people on Facebook and Instagram. These include privacy by default settings for new accounts, tools to limit unwanted interactions with adults, and tools to limit the spread of teen intimate images online. These settings will be automatically applied to new Facebook users under 16 (or under 18, depending on the country), and teens already on the platforms will be pointed towards higher privacy settings.
Speaking at a conference, TikTok CEO Shou Chew said 'Project Texas' will see data moved from Virginia and Singapore to a new Oracle cloud solution which only US residents will have access to. He added that no foreign government has asked the company for user data and said the response would be to refuse access. The comments were made in response to questions about the influence of the Chinese government on TikTok's owner Bytedance.
The Electronic Communications (Security Measures) Regulations 2022 and the Telecommunications Security Code are now in force. They were made under the Telecommunications (Security) Act 2021 which strengthens the security framework for technology used in 5G and full fibre networks to protect UK telecoms networks from hostile cyber activity.
The ICO has published new guidance and resources on direct marketing to assist organisations and businesses to conduct lawful direct marketing activities. The guidance covers essential issues to consider to achieve UK GDPR and PECR compliance, as well as ways to select the most suitable direct marketing product.
Accompanying resources include:
The ICO has emphasised a "new" approach to enforcement which includes a move away from imposing large fines on public sector organisations. The ICO will maintain a pragmatic, flexible, transparent approach. Businesses should take note that 'reprimands' will now routinely be made public on the ICO's website so reputational damage for data protection failings will become more likely.
Christopher Jeffery | Partner
The UK has concluded an adequacy decision in favour of South Korea which is expected to take effect on 19 December. The decision will allow the free flow of personal data between the UK and South Korea without the need for additional transfer mechanisms. This is the first adequacy decision the UK has concluded independently of the EU. The EU already has a South Korea adequacy decision but the government is keen to stress that the UK's decision is broader than the EU's, notably because UK organisations will be able to share personal data related to credit information to help identify customers and verify payments.
The Irish Data Protection Commissioner has announced it is fining Meta Platforms Ireland Limited €265 million and is imposing a range of corrective measures. This concludes an inquiry started in April 2021, following reports of a Facebook dataset available on the internet between May 2018 and September 2019. The main issue the DPC examined was compliance with the obligation for data protection by design and default which involved examining the implementation of technical and organisational measures to achieve this. The final decision, made in cooperation with other EU regulators, imposes a reprimand, and an order to bring processing into compliance by taking specified actions within a determined timeframe, in addition to the fine.
The European Commission has adopted a proposal for a Regulation to set out measures to create a high level of public sector interoperability across the Union (Interoperable Europe Act). The Act aims to promote the cross-border interoperability of network and information systems used to provide public services in the EU.
The ICO and Ofcom have published a joint statement setting out how they will work together to ensure coherence between the data protection and new online safety regimes. To achieve their mutual aims, the ICO and Ofcom will work together to achieve maximum alignment and consistency. They will:
The ICO set out a "new" approach to enforcement in a speech given at the National Association of Data Protection Officers annual conference. The ICO highlighted that while fines were useful, they were not the only tool available to it when enforcing the data protection regime, and the size of a fine was not necessarily the most significant indicator of its enforcement activity. In addition to highlighting the new approach of avoiding punitive fines for public sector organisations, the ICO said that going forward, all 'reprimands' would be published on the ICO's website unless there is a good reason not to publish. The Information Commissioner also said he would be promoting a predictable, consistent, transparent but flexible approach to enforcement. Sandboxes and advice services for innovators will assist with this.
The Council of the European Union has adopted the NIS2 Directive. This updates the current NIS Directive on security of network and information systems. The intention is to widen the scope of the rules and harmonise requirements across the EU. NIS2 is also aligned with sector-specific legislation, in particular the Digital Operational Resilience Act for the financial sector (DORA) which has also now been adopted by the Council. Both pieces of legislation will be published in the Official Journal shortly. Member States will have 21 months to introduce legislation implementing NIS2. Read more.
NIS2 will not apply in the UK, however, the UK government confirmed on 30 November, that it would be updating the NIS Regulations 2018 (which implemented the NIS Directive) following a public consultation.
The ICO's updated guidance on international data transfers frames a different approach to carrying out transfer risk assessments than the EDPB's and is arguably the more straightforward. While the ICO will recognise either the UK or the EU approach, it remains to be seen what the EU thinks about this development and whether it will raise concerns around onward transfers of EU data from the UK. Businesses operating across the EU and UK may prefer to stick with the EDPB's approach to transfer risk (or impact) assessments.
Victoria Hordern | Partner
The UK's ICO has published updated guidance on international data transfers. This includes a new section on transfer risk assessments (TRAs), known as Transfer Impact Assessments or TIAs in the EU, as well as a new TRA tool. These are required when using Article 46 transfer mechanisms, as a result of the CJEU judgment in Schrems II.
The ICO stresses that its guidance provides an alternative to the EDPB's guidance on supplementary measures for international transfers. The ICO's own approach to TRAs is different but it says that it is "happy for organisations exporting data from the UK to carry out an assessment" that meets either the UK or the EU approach, which it summarises as follows:
The ICO's new TRA tool is designed to help organisations assess the initial risk level of the relevant categories of data by asking a series of six questions, each of which is accompanied by supporting documentation tables guiding what to consider. The questions are:
The ICO's option 1 approach does put a different emphasis on this exercise from the EDPB's, and it should be easier for businesses to complete (despite the length of the TRA tool); however, it could create tension with the EU in relation to onward data transfers and the level of protection afforded to EU data which is ultimately exported to third countries from the UK. For this reason, cross-border businesses may prefer to stick with the EDPB's approach.
The European Data Protection Supervisor has published an Opinion on the EU's draft Cyber Resilience Act, which covers harmonised security requirements for IoT and other products with digital elements. The EDPS broadly welcomes the initiative while emphasising that the GDPR already contains cybersecurity requirements. He urges the Commission to include data protection by design and default as an essential element, in addition to security and data minimisation. As is often the way when commenting on proposed data-related legislation, the EDPS also stresses the need for any incoming legislation to work with the GDPR.
Meta is reportedly seeking to overturn the €405 million fine imposed on it by the Irish Data Protection Commissioner in September. The fine related to historic practices which resulted in default publication of children's personal data on the Instagram platform. Meta said it would appeal the fine and is now reportedly seeking a number of High Court declarations, including one that parts of the Irish Data Protection Act 2018 are invalid under the Irish constitution and are incompatible with the European Convention on Human Rights. In addition, Meta intends to apply to the CJEU to annul the EDPB's instruction to the Irish DPC on the level of the fine imposed.
The European Data Protection Board has adopted draft recommendations on applying for controller Binding Corporate Rules to underpin data transfers to third countries. The recommendations will update the current guidance and bring it in line with the Schrems II requirements, replacing the current Article 29 Working Party Recommendations. They also update the application form and set out what must be included on the form and with the application. The draft is open to comments until 10 January 2023. The EDPB is also working on guidelines for processor BCRs.
India has published a draft Digital Personal Data Protection Bill to update its data protection regime. A previous attempt to pass a new data protection act ended in failure. The Bill provides for protection of individual rights. It appears to have stepped back from data localisation requirements, allowing for cross-border transfers with notified countries and territories. It also provides for a Data Protection Board to oversee compliance and impose penalties up to a maximum of INR 5 billion.
The High Court has granted summary judgment in respect of a permanent injunction in a breach of confidence claim arising out of a ransomware attack. It also preserved the anonymity of the Claimant. The Claimant had previously obtained a without notice interim injunction restraining the unknown Defendants from using or distributing the Claimant's confidential information which was harvested in a ransomware attack. The Claimant then commenced proceedings for breach of confidence and the Court continued the injunction on expanded terms. The Court granted summary judgment because the large amount of data stolen fell into categories requiring extra protection (including security sensitive information), and because the information was obtained by hacking.
The Claimant's anonymity was preserved largely due to the nature of their work and the fact that much of it is covered by the Official Secrets Act.
The deadline for converting to the new 2021 EU Standard Contractual Clauses is 26 December 2022. Those still relying on the old SCCs for data transfers need to act quickly to review their transfer arrangements and enter into the new-form SCCs. You can find out more here.
Christopher Jeffery | Partner
The European Parliament has approved the NIS2 Directive. The legislation will now be approved by the Council of the European Union before publication in the Official Journal. This is expected to happen by the end of 2022. Member States will then have 21 months to implement the Directive. NIS2 updates the NIS Directive and deals with the cybersecurity of operators of essential services and digital services providers. For more information on NIS2, see our Global Data Hub article by Paul Voigt and Clare Reynolds.
The ICO has issued a reprimand to the Department for Education for allowing a database of 28 million pupils' learning records to be used by Trust Systems Software UK Ltd (trading as Trustopia) for online gambling age verification purposes. Only the fact that the ICO is trying to reduce the impact of public sector fines on the public prevented it from issuing a £10 million fine.
The DfE had continued to give access to Trustopia after the company advised the DfE that it was a new trading name for training provider Edududes Ltd. Due to a failure to carry out due diligence, the DfE was unaware that Trustopia was a screening company which used the DfE database to help online gambling companies with age verification. While Edududes had not accessed the database for training purposes, Trustopia accessed the records of 22,000 learners over a period of 16 months.
The ICO said the DfE had failed in its obligations to use and share children's data fairly, lawfully and transparently. It also failed to prevent unauthorised access to the data, have proper oversight of the data, or stop the data being used for reasons not compatible with the provision of educational services, and for purposes different to those for which it was originally processed. The DfE only became aware of the breach following a newspaper report.
The ICO acknowledged that the DfE had taken steps to review access to the database and strengthen registration processes which had resulted in 2,600 organisations having their access removed. Regulatory action against Trustopia was not available as the company has been dissolved.
The ICO is consulting on how to prioritise complaints relating to the response of public authorities to FOI requests. It is proposing to prioritise complaints where there is a clear public interest in the information that has been asked for, including by applying the following tests:
Responses are required by 19 December 2022.
Google has settled for a total of $391.5 million with 40 US states over allegations that it continued to collect user location data after they had opted out of tracking. As part of the settlement, Google also agreed to make its location tracking practices more transparent to users, including by showing them more information when they turn settings off and on, and setting out information about the data it collects on a web page. This is the largest privacy settlement in US history, although the issues were dealt with under consumer protection law.
Apple is facing a class action lawsuit filed in California, for allegedly using analytics on some of its apps regardless of whether or not users had turned them off in their settings.
As we know, the UK's Data Protection and Digital Information Bill was paused just before its second reading in the House of Commons, to allow incoming Ministers under the ill-fated Liz Truss government to consider it. DCMS Secretary of State Michelle Donelan then hinted, in a speech at the Tory Party conference, that the Bill might be changed. Since then, things have gone quiet.
Speaking at a Westminster Forum event on 31 October, Owen Rowland, deputy director for domestic data protection policy at the DCMS, said there would be further consultation on the Bill before it progresses, to allow ministers to complete final checks. He also suggested this could delay the Bill somewhat. However, he underlined that any organisation which needs to comply with GDPR would also find itself compliant with UK data protection law under the new regime.
It is unclear whether this means there will be changes to the Bill, although Rowland was also keen to stress that there would be nothing in the Bill to threaten the EU-UK adequacy agreement. A DCMS spokesman said there would not be another public consultation and the timeline of the Bill would not be affected.
At the 44th Global Privacy Assembly, data protection regulators from 120 countries reportedly agreed a resolution on a framework for the use of personal data in facial recognition, based on six core principles. The resolution says the use of facial recognition requires a clear legal basis. The deploying organisation must be able to establish reasonableness, necessity and proportionality as well as transparency. Human rights assessments should be carried out, data protection principles respected, and there must be effective accountability.
The UK's Parliamentary Joint Committee on the National Security Strategy has launched a call for evidence on ransomware. It is seeking views on the nature and extent of ransomware threats, how they are deployed and how they are likely to develop. Information is also sought on the level of vulnerability and preparedness of organisations, and on whether the responses of the government and other stakeholders such as the ICO are appropriate, or whether reforms are needed. Responses are required by 16 December.
The call for evidence also looks at the issue of international cooperation which was a focus of the recent second International Counter Ransomware Initiative Summit. 36 countries including the UK and US as well as the EU committed to developing coordinated guidelines on preventing and responding to ransomware attacks. There are plans to establish an International Counter Ransomware Taskforce to share knowledge and resources, and to coordinate on enforcement in line with national law and policy.
The CJEU has handed down a ruling on the required response when a subscriber asks for their personal data to be removed from publicly available telephone directories and directory enquiry services. The decision followed the Advocate General's Opinion in holding that subscriber consent is required to include their details.
Consent must be to GDPR standards but, in order to be valid, the subscriber does not, at the point of consent, need to be aware of the identity of all the directory providers which will process their data. Subscribers must have the opportunity to withdraw consent, at which point a controller must take appropriate technical and organisational measures to inform third-party controllers, ie the telecoms operator and other relevant directory providers. Where a number of controllers rely on a single consent, the data subject should be able to withdraw consent to any of them. In addition, the controller must then take reasonable steps to inform search engine providers of the erasure request.
Ofcom is consulting on lowering the incident reporting thresholds in its NIS guidance to operators of essential services (OESs) in the digital infrastructure sector (which includes top-level domain name registries and various types of domain name service providers). Incident reporting is required where the incident has a "significant impact" on the continuity of the relevant service. What constitutes "significant" is set out in Ofcom guidance and these are the thresholds Ofcom is intending to change. The consultation closes on 13 January.
The EDPB is consulting on draft updates (marked in yellow) to its Guidelines on identifying a controller or processor's lead supervisory authority. The update deals with paragraphs 29-34 and Annex 2d(i) and (ii). The changes largely clarify the issue of what happens in a joint controller situation, explaining that:
The consultation ends on 2 December 2022.
The ICO has published guidance on PECR in relation to direct marketing using email and live calls. The guidance summarises the applicable rules respectively for each medium, and looks at the relationship between PECR and data protection rules. The two new elements supplement the existing Guide to PECR.
The ICO has published a blog warning organisations to assess the public risks of using emotion analysis technologies before implementing them. Emotion analysis relies on collecting, storing and processing a range of behavioural data including special category data which the ICO sees as far more risky than traditional biometric technologies used to verify identity. The ICO says that "algorithms which are not sufficiently developed to detect emotional cues risk creating systemic bias, inaccuracy and even discrimination". The ICO will publish guidance on using biometric technologies in Spring 2023.
The ICO has published draft guidance for consultation on information about workers' health, alongside a blog. This is the second part of the ICO's draft guidance on employment practices. The guidance covers lawful bases for processing worker health information, security, how to handle sickness and injury records, dealing with occupational health schemes, rules on medical examinations and testing including genetic testing, and the use of health monitoring technologies, as well as issues around sharing worker health information.
The French data protection regulator, the CNIL, has fined Clearview AI €20 million in relation to its facial recognition database. This is the latest in a string of fines, including from the UK's ICO, which have been imposed on Clearview AI for breaches of the GDPR when compiling its facial recognition database which scraped images from the internet. The CNIL ordered Clearview AI to stop its unlawful processing activities and delete unlawfully processed data.
The OECD has published a report on global policies and initiatives around data flows with the aim of promoting a common understanding and dialogue between the G7 countries and beyond. The report looks at key policies and initiatives on cross-border data flows to inform and support G7 countries' engagement with the policy agenda.
A Bill has been introduced to Parliament in Australia which proposes significant increases to maximum fines for data breaches. The current maximum is AU$2.22 million. The proposal is to increase this to AU$50 million or penalties based on data monetisation and 30% of adjusted quarterly turnover.
AG Szpunar has delivered an Opinion in La Quadrature du Net and Others v Premier Ministre and Ministère de la Culture. The AG says that Article 15(1) of the ePrivacy Directive, read in light of the Charter of Fundamental Rights, does not preclude national legislation that allows for the general and indiscriminate retention of IP addresses for a period of time limited to what is strictly necessary for preventing, investigating, detecting and prosecuting online criminal offences, provided that this data is the only means of identifying the person to whom that address was assigned at the time of the commission of the infringement. This can be done without prior review by a court or independent body if, as in this case, the linking is at a given point in time and is limited to what is strictly necessary to achieve the objective.
The UK's ICO has fined construction company Interserve Group £4.4 million for failure to implement sufficient security measures to protect employees' personal data. The ICO said the company had failed to put appropriate security measures in place to prevent a cyberattack which enabled hackers to access the personal data of 113,000 employees through a phishing email. The compromised data included contact details, bank details and special category data. The UK Information Commissioner John Edwards warned that "the biggest cyber risk businesses face is not from hackers but from complacency within their company". He reminded businesses to regularly monitor for suspicious activity in their systems, act on warnings, update software and train staff.
The Digital Markets Act was published in the Official Journal on 12 October. It introduces rules for platforms that act as gatekeepers in the digital sector, to prevent unfair practices and the imposition of unfair terms and conditions on business and end users. It comes into force 20 days after publication and will, for the most part, start to apply six months after that on 2 May 2023. Certain provisions around the Commission and its designation of Gatekeepers will apply from 1 November 2022. Articles 42 and 43 apply from 25 June 2023. Gatekeepers will have six months to comply with their obligations following their designation and six months following designation of each core platform service in relation to that service. To find out more, including about provisions relating to data, see our content on Interface.
The EDPB is consulting on proposed changes to its guidelines on data breach notification. It is proposing adding a clarification stating that the mere presence of an EU representative does not trigger the one-stop-shop mechanism for a controller not established in the EU. It says for this reason, a notifiable breach will need to be notified to "every single authority for which affected data subjects reside in their Member State". This suggests that, if the changes are adopted, controllers may end up notifying all EU DPAs just in case it later transpires there are affected data subjects in their jurisdictions.
The European Commission has highlighted the following data-related priorities in its 2023 Work Programme:
The ICO has published its draft guidance on monitoring at work for consultation. Following a call for evidence, the ICO plans to release topic-specific guidance on employment practices and data protection. The monitoring guidance is the first in the series. It aims to provide practical guidance about monitoring workers in compliance with data protection legislation, and to promote good practice. The ICO also intends to produce practical tools including checklists to sit alongside the guidance, and has produced an impact scoping document.
The draft guidance covers what constitutes lawful monitoring and how to identify a lawful basis. It also looks at the application of other elements of data protection law (such as fairness, transparency, and accountability) and the processes involved, for example, policies and DPIAs. It addresses covert monitoring, the use of special category and biometric data, in particular for time and attendance and access control monitoring, as well as the use of automated processing in monitoring tools.
The consultation on the draft closes on 11 January 2023.
The National Cyber Security Centre has published new guidance to help organisations assess their suppliers' levels of cybersecurity. The guidance sets out practical steps to help medium to large organisations gain assurance about the cybersecurity of their organisation's supply chain. It describes typical supplier relationships and the way organisations are exposed to vulnerabilities and cyberattacks via the supply chain, defines expected outcomes and key steps to approaching cybersecurity, answers common questions, and supplements the 2020 NCSC Supply Chain principles.
The EDPB has sent the European Commission a wish list of aspects in national procedural law it would like to be harmonised to facilitate GDPR enforcement. These are issues which the EDPB has identified as potentially requiring legislative change. They include the status and rights of the parties to the administrative procedures, procedural deadlines, requirements for admissibility or dismissal of complaints, investigative powers of data protection authorities, and the practical implementation of the cooperation procedure.
The EDPB has also adopted an Opinion on the approval by the Board of the Europrivacy certification criteria submitted by the Luxembourg DPA. This is the first European Data Protection Seal adopted pursuant to Article 42(5) GDPR. Certification under the seal will be valid across the EU and allow for different controllers and processors to achieve and demonstrate the same level of compliance for similar processing operations regardless of the Member State in which they are located, or the location of their lead regulator.
Also at the plenary session, the EDPB adopted a statement on the digital euro, reiterating the importance of privacy by design and default to the project.
The ICO has confirmed it is now responsible for the UK Trusted List, which sets out details of ICO-approved trust services under eIDAS. eIDAS sets out rules for UK trust services and establishes a legal framework for electronic documents, electronic registered delivery services, and certificate services for website authentication.
President Biden last week signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (EO). The EO and related Department of Justice Rules published at the same time seek to deal with the issues raised by the CJEU in the Schrems II decision that invalidated the EU-US Privacy Shield, and pave the way for a new EU-US Data Privacy Framework (previously known as the Trans-Atlantic Data Privacy Framework). The EO is not EU-specific; its redress elements apply to any state designated by the US as a qualifying state and other elements apply regardless of the location or nationality of the data subject. This is good news for the UK which (at least for now) is bound by the Schrems II decision and has rules on data exports under the UK GDPR equivalent to those in the EU. On the same day the EO was published, the UK government published a US-UK Joint Statement on a New Comprehensive Dialogue on Technology and Data and Progress on Data Adequacy. The Statement announced "significant progress on UK-US data adequacy discussions". It set out the government's aim of "working expediently" to issue a US-UK adequacy decision and achieving recognition under the EO as a qualifying state.
See more about what this means for future EEA/UK data transfers to the USA here.
Advocate General Campos Sanchez-Bordona has delivered an Opinion in a reference from Austria on interpretation of Article 82(1) of the GDPR which sets out the right to compensation for breaches of the Regulation. The questions referred to the CJEU covered:
The AG opined that the CJEU should hold that Article 82 should be interpreted so that a mere infringement of the GDPR does not in itself give rise to a claim for compensation; it must be accompanied by relevant material or non-material damage. Compensation for non-material damage does not include mere upset felt as a result of the breach. It will be up to Member State courts to determine where the dividing line between a feeling of displeasure and non-material damage lies on a case by case basis.
The ICO has fined Easylife Ltd £1.35 million for using customer purchase data to profile and predict their medical conditions, and then target them with health-related products without consent. Easylife was also fined £130,000 for making unsolicited marketing calls.
The ICO found that Easylife profiled customers making purchases from its Health Club catalogue in order to target them with health-related items. This meant the company was processing 'invisible' health data – people were unaware that their personal data was being used for that purpose.
Facebook is reportedly notifying 1 million users that their login details may have been stolen by malicious apps on the Apple and Android platforms. The apps purport to require login via Facebook but then pass the user details directly back to their creators. Facebook says it has removed all the apps it identified.
Binance, the world's largest cryptocurrency exchange, temporarily suspended transactions and exchange of funds after its network was hacked, causing it to lose anywhere between $110 million and half a billion dollars. The hackers exploited a vulnerability between two blockchains, which is becoming a common approach. Binance says it has secured its funds.
Addressing the Conservative Party Conference, recently appointed Secretary of State at DCMS, Michelle Donelan, 'announced' the government will be "replacing the GDPR with our own business and consumer-friendly, British data protection system". The wording (and use of "today") suggested this was a new announcement but, of course, the Data Protection and Digital Information Bill was presented to Parliament at the end of Nadine Dorries's tenure at DCMS. The second reading of the Bill was then delayed so that "Ministers [could] consider this legislation". It is unclear from Donelan's speech whether she was referring to the already published Bill, or whether she now intends to revise it or present a different Bill.
The White House Office of Science and Technology Policy has published a 'Blueprint for an AI Bill of Rights'. This is intended to help guide the design, development and deployment of AI and other automated systems and protect the rights of individuals. It lays out five core protections to which it says everyone in America should be entitled:
In relation to data protection, the Blueprint says individuals should be protected from abusive data practices via built-in protections and should have agency over how data is used. Design choices should ensure protections are included by default and that only strictly necessary data for the specific context is collected. Consent should be sought about collection, use, access, transfer and deletion of data in appropriate ways and to the "greatest extent possible". Consent should only be used to justify collection of data "in cases where it can be appropriately and meaningfully given". Hard-to-understand consents and confusing user interfaces should be changed. Enhanced protections for sensitive areas including health and youth data should be put in place.
The Data Access Agreement between the USA and UK is now in force. It sets out conditions for cross-border access to electronic data for the purposes of combatting serious crime, with safeguards and procedures to protect fundamental rights.
According to Politico the White House will shortly publish an Executive Order on transatlantic data transfers, intended to address EU concerns over the protection of EU data transferred to the USA. It is thought that the Order will set out what is considered "necessary and proportionate" in terms of access to personal data by the US intelligence services, and that it will provide additional legal protections for individuals.
Publication of the Order will start a ratification process by the European Commission, expected to take up to six months. On this timeline, we might see a replacement for the EU-US Privacy Shield in Spring 2023.
IAB Tech Lab has announced that its Global Privacy Platform is ready for industry adoption. The GPP is part of a portfolio of solutions developed by IAB Tech Lab to address privacy compliance issues in adtech across jurisdictions with differing regimes. The GPP enables user consent signals to be communicated through the digital ad supply chain and provides the protocol to help consolidate and manage these. It currently supports the US Privacy and IAB Europe TCF consent strings and has plans to add US State-specific strings and one for Canada. Users are advised to begin transitioning to GPP from TCF v2.0.
It is worth remembering that there are question marks over whether the IAB Europe Transparency and Consent Framework satisfies GDPR requirements after IAB Europe was fined by the Belgian DPA and ordered to propose remedial measures.
Following the first anniversary of the implementation of the Children's Code, the ICO is consulting on its impact, gathering views from stakeholders and the public. It is also gathering information from sources including market research, and considering its experience of supervising the Code over the last year. Responses are required by 5pm on 11 November.
The Irish Data Protection Commissioner has submitted a draft decision following its inquiry into Meta Platforms Ireland Limited (Meta), to other concerned authorities under the Article 60 GDPR procedure. The inquiry began in April 2021 following reports of a data breach that reportedly exposed a dataset on the internet relating to approximately 533 million Facebook users worldwide. It focused on Meta's compliance with data protection by design and default obligations under Article 25 GDPR.
The concerned supervisory authorities now have a month to review the draft decision and raise any objections.
The ICO has issued official reprimands to seven organisations relating to their handling of subject access requests. Reprimands are one of the ICO's enforcement tools which can be used to warn organisations that their actions breach the UK GDPR, and to recommend steps to prevent ongoing non-compliance. The non-compliance identified varied between organisations but broadly related to failure to have the proper processes and resources in place to enable timely responses to SARs. The organisations in question have been given between three and six months to show they have made improvements, or face further action. They are mostly public sector organisations; however, Virgin Media Limited received a reprimand calling for improved performance on processing complaints and general SAR compliance. The ICO has published a blog on getting SARs right.
The CJEU has ruled in joined cases that Article 15(1) of the ePrivacy Directive (as amended) read in the light of the Charter of Fundamental Rights of the European Union, precludes national legislative measures which authorise the general and indiscriminate retention of traffic and location data for the purposes of combating serious crime and preventing serious threats to public security. It also reiterated that EU law does not preclude measures that enable the collection of such data where there is a serious threat to national security subject to specified checks and balances. There is also no prohibition on:
Advocate General Rantos has given a non-binding Opinion on a reference from Germany in a case involving the use of combined personal data by Meta. The German Federal Competition Authority banned Meta from collecting personal data and combining it from across its services and from third-party websites and applications. Meta appealed the decision, particularly focusing on whether competition authorities have jurisdiction to rule on an infringement of the GDPR.
The AG said that the court in question had not ruled on GDPR infringement so the question was irrelevant, however, he did respond to the other questions raised, holding (among other things) that:
The CJEU will make a preliminary ruling on the case in due course.
The ICO has issued TikTok Inc and TikTok Information Technologies UK Limited with a notice of intent, a legal document which precedes a potential fine. The ICO says TikTok faces a potential £27 million fine following its findings that the company may have:
These are provisional findings and the ICO stresses that no conclusion can yet be drawn that TikTok has breached UK data protection law, or that a fine will ultimately be imposed. The ICO will now consider representations from TikTok before reaching its final decision.
The ICO has launched a second consultation on its draft code of practice on using personal data for journalism following changes made as a result of its initial consultation. The ICO has taken on board feedback from the first consultation and says it has significantly reduced the length and complexity of the code by:
The consultation is open until 16 November and also covers associated documents.
The Danish data protection authority joins Austria, France and Italy in banning the use of Google Analytics where data is transferred to the US without supplementary measures additional to those provided by Google. The DPA says it is essential that EU Member States are harmonised on this point. It asks organisations using Google Analytics in Denmark to assess whether they can put appropriate measures in place to make the use of the tool compliant, and if not, to discontinue using it. It points to pseudonymisation as a potentially effective supplementary measure.
The European Parliament has published a briefing paper on the European Health Data Space (EHDS) proposal. It notes that this is in its early stages, covers the background to the proposal and also sets out its starting position, as well as summarising those of its co-legislators. It is broadly supportive but underlines the need to ensure GDPR compliance. It also advocates the development of European federated networks to contribute to R&I and the use of AI in healthcare.
The European Commission has published a proposal for a new Cyber Resilience Act to protect consumers and businesses from products with inadequate security features. It introduces mandatory cybersecurity requirements for products with digital elements throughout their lifecycle. Manufacturers will be required to embed security by design and provide security support and software updates to address vulnerabilities. There will be information requirements to inform consumers about the cybersecurity of products, and products will need to meet conformity assessments (subject to the type of product).
Products with digital elements are defined as: "any software or hardware product and its remote data processing solutions, including software or hardware components to be placed on the market separately". They are further sub-divided where they are considered as "critical products with digital elements" falling within listed categories and subject to conformity procedures. There are also provisions relating to high-risk AI.
In particular, the proposed legislation introduces:
Security obligations run through the supply chain, including for economic operators, manufacturers, importers and distributors.
The legislation is being introduced as part of the European Commission's Cybersecurity Strategy introduced in December 2020. It will complement NIS2 and the EU Cybersecurity Act.
The Czech Presidency of the EU Council has circulated a new partial compromise on the Data Act which covers cloud switching and interoperability among other issues. According to Euractiv, some of the proposed changes include:
It remains to be seen what the reaction to these proposals will be.
The Irish Data Protection Commission, acting as Lead Supervisory Authority (LSA), has fined Meta Platforms Ireland Limited €405 million following an inquiry into Instagram's privacy practices. The publication of the decision follows the EDPB's binding dispute resolution decision of 28 July 2022 under the Article 65 GDPR procedure. This is the second largest fine handed down under the GDPR to date.
The Irish DPC investigated Instagram's public disclosure of email addresses and phone numbers of children using Instagram's business account feature, and public-by-default settings for children's personal Instagram accounts for a period (the practice has since ended).
The EDPB's decision looked in detail at lawful processing, and at the lawful bases of performance of a contract and legitimate interests which were the lawful bases relied on by Instagram to process the children's personal data. The EDPB found that there were no grounds for the LSA to conclude that public processing of the children's email addresses/phone numbers was necessary for the performance of a contract. It also found that Instagram could not rely on legitimate interests as a lawful basis for publishing these details as the processing was either unnecessary, or if it was necessary, did not pass the legitimate interest balancing test. As a result, the EDPB found that Meta IE had processed children's personal data without a lawful basis, and amended the amount of the fine accordingly.
Separately, South Korea's Personal Information Protection Commission has reportedly fined Meta around £19m over its consent practices, with Google fined around twice that amount for similar reasons.
The Irish Data Protection Commission has submitted a draft decision following its inquiry into TikTok Technology Limited, to other Concerned Supervisory Authorities. The inquiry focuses on processing of children's personal data by TikTok, in particular, its public-by-default settings for under-18 accounts, and age verification measures for under-13s. It also considers whether TikTok complied with GDPR transparency requirements. The Concerned Supervisory Authorities now have one month to review the draft decision under the Article 60 procedure.
The EDPB has confirmed that its second coordinated enforcement action will concern the designation and position of the Data Protection Officer. Over the next few months, the EDPB will specify further details, following which, national Data Protection Authorities will work at national level to gather information and produce targeted follow-up.
A report on the use of cloud-based services by the public sector, the subject of the first coordinated action, will be adopted by the end of this year.
The Governor of California has signed the California Age Appropriate Design Code into law. Influenced by the ICO's Age Appropriate Design Code, the CAADC will apply from 1 July 2024. It sets out mandatory requirements and prohibitions on providers of online services likely to be accessed by children. These include requirements to carry out DPIAs, to configure high privacy by default settings, to communicate clearly with children, and to provide tools to enable children and/or their parents or guardians to exercise their privacy rights.
Euractiv reports that the EC will introduce its proposal for a Cyber Resilience Act this week. The Act will address cybersecurity issues with consumer connected devices. It will cover "products with digital elements" defined as "any software or hardware product and its remote data processing solutions, including software or hardware components to be placed on the market separately". Products covered by sector-specific rules including medical devices, are outside the scope of the legislation.
Manufacturers of IoT products will need to comply with security requirements and ensure confidentiality of data. They will be required to carry out regular stress testing and report issues to a nominated body. Critical products which represent greater risk will have additional obligations and are divided into two classes. Class I includes identity management systems, browsers, password managers, antiviruses and firewalls, VPNs, network management systems, physical network interfaces, routers and chips used for essential entities as defined in NIS2. Class II, considered as higher risk, includes desktop and mobile devices, virtual operating systems, digital certificate issuers, general purpose microprocessors, card readers, robotic sensors, smart meters and all IoT, routers and firewalls for industrial use in a sensitive environment.
Obligations will extend down the supply chain with penalties for non-compliance of up to €15m or 2.5% of annual turnover, whichever is higher.
The Belgian Market Court has reportedly suspended IAB Europe's appeal against the Belgian DPA's decision that its Transparency and Consent Framework is not GDPR-compliant, making a reference to the CJEU. The CJEU is being asked to look at the definition of 'joint controller' and whether user consent signals sent via the TCF are personal data.
The government has published a call for information on measures designed to enhance the security of online accounts, including those processing personal data. These are described as a "Cyber Duty to Protect", formulated as part of the National Cyber Strategy. Proposals include possibly supplementing the approach to cybersecurity under the UK GDPR. Responses are sought on risks associated with unauthorised access to online accounts, actions currently taken to address the problem, and actions that should be taken as well as allocation of responsibility. Responses will be used to develop proposals which will include appropriate security measures for account providers and organisations processing user account personal data.
DHSC has released a policy paper setting out Secure data environment policy guidelines for data platforms hosting NHS data accessed for research and analysis. The guidelines set out expectations for how secure data environments will be used for NHS and social care data, and the rules with which all platforms providing access to NHS data will need to comply. The policy sets out 12 secure data environment guidelines based on the Five Safes framework of principles. They set standards on cybersecurity, transparency, open-working and code sharing, anonymisation, data input, use of data, and patient and public involvement.
The ICO has published chapter 5 of its draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies (PETs). The first four chapters are under consultation which closes on 16 September. It is unclear whether that date will be extended in order to include a consultation on chapter 5 which focuses on PETs.
The Telecommunications (Security) Act 2021 (Commencement) Regulations 2022 have been made. They bring the Telecommunications Security Act 2021 (TSA) into force from 1 October 2022. The Electronic Communications (Security Measures) Regulations 2022 under the TSA will come into force on the same date.
The TSA strengthens the security framework for 5G technology and full fibre networks. The Regulations set out specific security requirements for providers. A code of practice provides further technical detail.
The second reading of the UK's Data Protection and Digital Information Bill did not take place on 5 September as planned. A government spokesperson said the decision not to move to second reading was in order to allow "ministers to further consider this legislation". This implies that there may be changes to the Bill which was tabled in July 2022 following a lengthy consultation process. The Prime Minister has expressed her intention to review and then repeal or replace all EU law by the end of 2023. It is possible that the pause in the legislative process will result in changes to the first version of the Bill although unlike the Bill of Rights, it has not (yet) been pulled.
Instagram (owned by Meta) is facing a €405 million fine for GDPR breaches from the Irish Data Protection Commissioner. While details of the fine had not been published at the time of writing, the Irish DPC has confirmed the amount, which is the second largest fine handed down under the GDPR and follows a two year investigation. The fine relates to Instagram allowing users aged 13-17 to run business accounts from the platform, which included their phone numbers and email addresses, and to the fact that accounts for 13-17 year olds were set to public by default.
Instagram has said it is reviewing the final decision and points to the fact that it has updated its settings to include more safety features, including setting children's accounts to private by default and ensuring adults can no longer message teens who don't follow them.
The California legislature has approved an Age Appropriate Design Code Act heavily influenced by the ICO's Age Appropriate Design Code. If, as expected, it is signed into law, it will impact businesses providing online services likely to be accessed by children. It is expected to apply from summer 2024.
The ICO published updated guidance and revised application forms and tables to simplify the UK BCR application and approval process. A key change is the revision of the referential table which the ICO says must demonstrate the applicant's "understanding of the spirit and intent behind Article 47". The ICO also calls for a BCR policy to be included in the UK BCR document. This policy should be published in full to give individuals information about transfers of their personal data under the BCRs. See our article for more.
The EDPB and EDPS have published a joint Opinion on the EC's draft Regulation to Prevent and Combat Child Sexual Abuse Online. The Opinion expresses concerns that in its current form, the proposal may present an increased risk to individuals and a threat to their privacy. It suggests the draft Regulation could provide scope for general and indiscriminate content monitoring. Concerns are also raised over the provision for the use of AI and other technologies to scan user communications which could result in errors as well as a high level of intrusion. The Opinion emphasises encryption and particularly end-to-end encryption as an effective privacy tool and cautions against any regulation which might curb its use.
The CJEU has placed a broad interpretation on the scope of what constitutes special data under Article 9 GDPR. In a reference from Lithuania, the CJEU held that publishing the name of a civil servant's spouse, partner or cohabitant makes it possible to determine their sexual orientation and can therefore constitute processing of special data. The data in question is published in Lithuania as part of a requirement on civil servants to disclose their interests and those of their spouse/partner. By extension, publication of interests could also disclose people's political views and union memberships. While this is not a surprising decision, it acts as a reminder to give proper consideration to the protection of this type of personal data.
China has finalised Security Assessment Measures for Cross-border data transfers. These came into effect on 1 September 2022, and clarify when and how to carry out a data export security assessment which is required before data is exported by critical information infrastructure operators. The Measures specify a six month grace period until 31 March 2023 for affected companies to complete their assessments. See here for more.
China has also presented draft SCCs for approval. The consultation period closed at the end of July and final versions are expected soon as we discuss here.
The Irish Data Protection Commissioner's draft decision to stop Meta transferring personal data from the EU to the USA is subject to the Article 65 procedure after concerns were raised during the Article 60 process. It is thought that at least some of the objections related to the decision not to fine Meta for previous unlawful data transfers. Meta continues to threaten withdrawal of its Facebook and Instagram services in the EU if the ban is agreed.
The CNIL reportedly issued a preliminary notice of a €60 million fine to adtech company Criteo. The fine follows a complaint by Privacy International. The CNIL reportedly intends to fine Criteo for GDPR violations related to unlawful targeted advertising and profiling. Criteo now has the right to make a written response before the CNIL issues a draft decision which will be subject to the Article 60 cooperation and consistency process.
In its response to a ten week consultation on drafts of the Electronic Communications (Security Measures) Regulations and Telecommunications Security Code of Practice, the government has set out its proposals for regulation to form part of the new security framework under the Telecommunications (Security) Act 2021. As a result of the consultation, the government is making changes to the drafts ahead of commencement of the new framework in October. The new regulations will help ensure providers protect data, software and equipment which have critical functionality, and help them identify and mitigate security risks.
The CMA has published an ICO report on data protection harms and the ICO's taxonomy, and an ICO-commissioned report which reviews literature relevant to data protection harms. The intention is to analyse different types of harms which arise as a result of data breaches or the inability to exercise data protection rights, and to look at their impact on individuals. This covers a range of impacts from financial loss to emotional distress. The report also looks at the wider impact of these kinds of harms on society.
The House of Commons Library has published a research briefing on the Data Protection and Digital Information Bill which looks at the proposed changes to the UK's data protection regime and sets them in context. It is a helpful resource in that, for each element, it sets out what the consultation proposed, what the government's response said, and what the draft Bill does as a result.
The government has introduced the Data Protection and Digital Information Bill to Parliament. It covers reforms to the UK GDPR, Data Protection Act 2018 and PECR, but also:
access to customer and business data
electronic signatures, seals and other trust services
disclosure of information to improve public service delivery
sharing of data for law enforcement
information standards for health and social care
biometric data
the role of the Information Commission.
The personal data provisions are not surprising as they are broadly as outlined in the government's response to its consultation published in June (see here for more). As expected, the government intends to clarify certain aspects of the law (eg a definition of scientific research, and further processing) as well as change the rules in other areas, including to provide more flexibility around notifying individuals when using their data for scientific research purposes, remove the requirement for a representative for a non-UK controller, and remove the requirement to appoint a DPO. There will be a new approach to the government's assessment of 'adequate' international data transfers – under Schedule 5 of the Bill the government will consider whether the standard of protection provided in a third country is materially lower than the standard of protection provided under UK law – and there are reforms to the Information Commissioner's Office.
Changes to PECR will impact the way cookies are used and extend the soft opt-in for marketing communications to non-commercial organisations. The Bill also covers access to data for law enforcement and national security, and information standards and interoperability for IT products and services supplied to the health and adult social care sector. It provides for sector-based smart data schemes, with supporting regulation, to secure trusted data sharing.
We'll be taking a look at the headline issues in the Bill at our webinar on 26 July. Click here to sign up.
The DCMS has announced its AI Action Plan, part of its National AI Strategy. An AI paper sets out proposed rules based on six rather vague principles, which regulators must apply with flexibility, to support innovation while ensuring use of AI is safe and avoids unfair bias. Rather than centralising AI regulation, the government is proposing to allow different regulators to take a tailored approach to the use of AI which is more contextual. Regulators including Ofcom, the CMA, the ICO, the FCA and the MHRA will be asked to implement the core principles which are:
Regulators will be encouraged to use 'light touch' options including guidance and sandboxes and to coordinate their approaches.
The paper is subject to a call for evidence. Responses will be considered alongside further development of the framework in an AI White Paper which will look at how to put the principles into practice. This should be published towards the end of 2022.
An AI Action Plan has also been published to show progress to date and identify priorities for the coming year.
The ICO has set out its strategic plan for 2022-25. The Information Commissioner, John Edwards, said the focus will be on issues disproportionately affecting already vulnerable or disadvantaged groups. Priority areas include children's privacy, AI and algorithmic bias, and malicious telemarketing. The ICO will, for example, investigate the use of AI and possible bias in recruitment and other areas. It will also issue further guidance, privacy by design templates, and set up a moderated discussion platform.
The EDPB has adopted a set of criteria to assess whether a case may be of "strategic importance" warranting closer cooperation between regulators. These cases are usually 'one stop shop' cases which relate to a potential high risk to the rights and freedoms of individuals in more than one Member State. The EDPB suggests at least one of the following criteria should be taken into account:
Any Member State SA can propose a case which meets at least one of the criteria for consideration for closer cooperation. The EDPB will then decide whether to identify it as a case of strategic importance, at which point it will be prioritised and supported by the EDPB. This is in addition to the criteria set out in Chapter VII GDPR.
The EDPB and EDPS have issued a joint Opinion on the proposal for a European Health Data Space. This sets out some overarching concerns, including that the EHDS will create add-on rights overlapping with those already provided under the GDPR. The regulators are concerned this might weaken privacy rights under the GDPR, particularly in relation to secondary uses of data. There are also concerns over the interaction with the EC Data Governance Act, Data Act and AI Act. Divergent regulation in this area across Member States is another issue.
In addition, the EDPB and EDPS recommend excluding wellness and other digital app data from the scope of the EHDS, and not extending the scope of GDPR exceptions regarding data subject rights.
The ICO has published an FOI and Transparency regulatory manual 2022. This sets out a pilot framework for the ICO's regulatory action under its statutory powers under the FOIA and the Environmental Information Regulations 2004. It explains the evidence the ICO will consider, levels of action the ICO will undertake, and aggravating and mitigating factors the ICO may consider.
The Association of the British Pharmaceutical Industry (ABPI) has released new guiding principles on the use of NHS data for research purposes. The aim is to ensure transparency and that the benefit of the data use is shared with patients. The guidelines are underpinned by five principles which supplement existing regulatory requirements and statutory safeguards applicable to the use of health data. These include transparency, fairness, and compliance with data protection law.
The European Commission has written to the Dutch DPA to set out the law on legitimate interests, as it says the Dutch DPA's assertion that purely commercial interests cannot qualify is incorrect. The Commission explains the three-part test for reliance on legitimate interests in the context of the GDPR, the Google Spain case and the FashionID case. It underlines that, while a purely commercial interest can satisfy the first limb of the test (establishment of a legitimate interest behind the processing), the questions as to whether the processing is necessary, and the balancing exercise to establish whether that interest is overridden by the rights and freedoms of the data subject, also have to be answered before using legitimate interests as a lawful basis. Neither of the latter two parts of the test precludes a purely commercial interest from satisfying all three.
The Irish Data Protection Commissioner has sent a draft decision to the other EU regulators which sets out its intention to ban Meta (Facebook) from transferring personal data to the USA. This has the potential to cut off EU access to services including Facebook and Instagram. The Irish DPC carried out an 'own volition' investigation in the wake of the Schrems II judgment. The decision is now subject to the Article 60 process and, potentially, to an EDPB decision under Article 65, should there be insufficient agreement with the ruling. NOYB has said it expects other EU regulators to object to the draft decision.
The government has published its second post-implementation review of the NIS Regulations. The review concluded that the Regulations work well on the whole, but identified areas for improvement including:
The government expects that the Cyber Resilience Proposals, currently under consultation, will address some of its concerns. Reviews will now take place every five years.
The UK and South Korean governments have reached a data adequacy decision in principle. The decision, once finalised, will allow for the free flow of personal data between the two nations without the need for additional transfer mechanisms (although technically this is already in place as the UK retained the effect of EU adequacy decisions made prior to Brexit on a provisional basis). The ICO has also signed a Memorandum of Understanding with the South Korean Personal Information Protection Commissioner. This provides for continuing collaboration but does not involve data sharing. The EC adopted a data adequacy decision in favour of Korea in December 2021.
The ICO is asking to hear from designers and product team members about how they consider privacy by design in practice when designing digital products. The ICO will use the feedback to provide more practical support on implementing privacy by design.
The ICO and NCSC have asked solicitors to ensure they don't advise clients to pay ransomware demands. They remind organisations that high risk data breaches should be reported to the ICO, and that the NCSC can provide support and incident response guidance to mitigate harm.
An independent legal review of the governance of biometric data in England and Wales has been published. It was commissioned by the Ada Lovelace Institute in 2020 following calls from the Commons Science and Technology Select Committee, and was led by Matthew Ryder QC. The Ryder Review is supplemented by a policy report from the Institute.
The Review makes ten recommendations to protect fundamental rights, particularly data and privacy rights including for:
The ICO has released a report from the Global Privacy Assembly's International Enforcement Working Group which highlights the growth of credential stuffing cyberattacks and provides guidance on how to prevent, detect and mitigate their risk. This type of attack takes compromised logins and uses bots to try and log in to other sites with them, on the assumption that many people use the same passwords across multiple sites and devices.
Recommendations made in the report include:
checking accounts for unusual activity or unauthorised transactions
ensuring device software is regularly updated and patched
Guidance on incident response plans is also provided.
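To make the attack pattern concrete, here is a minimal, purely illustrative sketch of one detection signal this kind of guidance points to: a single source IP cycling through many different usernames in a short window. The thresholds, function name and field names are assumptions made for the example, not anything prescribed by the report.

```python
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 300            # only look at the last five minutes of attempts
DISTINCT_USER_THRESHOLD = 20    # flag an IP that tries this many different accounts

# failed login attempts seen recently, keyed by source IP
attempts_by_ip = defaultdict(deque)

def record_failed_login(ip, username, now=None):
    """Record a failed login and return True if the IP looks like credential stuffing."""
    now = now if now is not None else time.time()
    window = attempts_by_ip[ip]
    window.append((now, username))
    # drop attempts that have fallen outside the sliding window
    while window and now - window[0][0] > WINDOW_SECONDS:
        window.popleft()
    distinct_users = {user for _, user in window}
    return len(distinct_users) >= DISTINCT_USER_THRESHOLD
```

A real deployment would combine several signals (failure rates, bot fingerprints, checks against breached-credential lists) rather than rely on a single threshold like this.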
The ICO has set out a revised approach to working more effectively with public authorities. Its intention is to use its discretion to reduce the impact of fines on the public sector, instead relying more on its wider powers including warnings, reprimands and enforcement notices. Fines will only be issued in the most serious cases. In addition, the ICO will work more closely with the public sector to encourage data protection compliance and prevent harms from occurring.
The ICO will include this initiative as part of its three year strategic vision, ICO25, to be further set out over the coming weeks.
The Conseil d'État has confirmed the CNIL's 2020 decision to fine Amazon Europe €35 million for dropping cookies without consent and for a lack of transparency in breach of the French Data Protection Act which implements the ePrivacy Directive. The Conseil confirmed that the CNIL is competent to impose sanctions outside the GDPR's one stop shop mechanism, and that it had jurisdiction to do so in the context of Amazon's activities in France, even though Amazon did not have its main establishment there. It also confirmed that the amount of the fine was not disproportionate relative to the seriousness of the breaches, the scope of the processing and the financial resources of the company.
The UK government has published its response to its consultation on standardising cybersecurity qualifications and certifications. Based on the feedback it received, the government has decided not to legislate at this point. It has, however, empowered the UK Cyber Security Council to award chartered status to suitably qualified cyber security professionals in 16 different specialism categories at a range of levels of expertise. The Council will run a pilot for the Risk Management and Security Architecture specialisms this summer and will formally launch those standards in 2023, with the remaining specialisms rolled out by 2025. There will also be a voluntary register of accredited individuals.
Euractiv reports that the draft Cybersecurity Certification Scheme for Cloud Services (EUCS) contains data localisation and sovereignty requirements which would impact cloud service providers in the EU market. The EUCS is secondary legislation being made under the EU Cybersecurity Act. It sets up a voluntary EU framework for cybersecurity certificates. Providers will be rated as providing basic, substantial or high assurance levels. The current draft reportedly places sovereignty requirements on high-level assurance products and services (which will likely cover cloud service providers), would ensure the primacy of EU law, and would require that maintenance, operations and data be located in the EU. Immunity from non-European access would be guaranteed by requiring cloud service providers to be headquartered in the EU and not controlled by non-EU entities.
If published as reported, these proposals are likely to prove highly controversial, particularly if, as expected, the certification scheme becomes mandatory.
The Italian DPA, the Garante, has effectively banned the use of Google Analytics and similar technologies which export personal data from Italy to the USA. The regulator said that GA data cannot be adequately protected on transfer to the USA as a truncated IP address would not qualify as anonymised given Google is able to re-identify it. It has asked all Italian website operators to verify that the use of cookies and other tracking tools on their website complies with data protection law, particularly in relation to GA and similar services. The Garante has set a 90 day compliance deadline and will begin enforcing after that. The Garante is following the decisions by the French and Austrian regulators.
In February 2022, the CNIL found that an unnamed French website manager's use of Google Analytics resulted in the unlawful transfer of personal data to the USA. Other French data controllers who were the subject of NOYB complaints have also received enforcement notices, and the Austrian regulator reached similar conclusions in responses to complaints they received.
The CNIL has now published a set of Q&As regarding its decisions. Clarifications made by the CNIL include:
In short, the CNIL is saying that use of GA (or any similar technology where the data is hosted on servers in third countries) will result in unlawful data transfers unless the data is encrypted before it leaves the EEA, by an EU (or adequate country) data controller with exclusive access to the encryption keys, or where a proxy server is used. Whether or not the data being transferred is likely to be accessed by third country government agencies is irrelevant.
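The proxy-server route referred to above essentially means that analytics hits are sent to an EU-hosted endpoint which strips identifying data before anything is relayed onwards. The following is a minimal sketch of that pattern only; the endpoint URL, field names and Flask-based setup are illustrative assumptions, not the CNIL's prescribed configuration or Google's actual API.

```python
from flask import Flask, request
import requests

app = Flask(__name__)

# Hypothetical third-country collection endpoint; purely illustrative.
FORWARD_URL = "https://analytics.example.com/collect"

@app.route("/collect", methods=["POST"])
def collect():
    payload = request.get_json(silent=True) or {}
    # Remove fields that could identify the visitor before data leaves the EEA.
    for field in ("client_ip", "user_id", "device_id"):
        payload.pop(field, None)
    # The request is forwarded from the proxy's own IP, so the visitor's
    # IP address is never transmitted to the third-country server.
    requests.post(FORWARD_URL, json=payload, timeout=5)
    return "", 204
```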
The European Economic Social Committee has published its Opinion on the draft Data Act. Among other things, it suggests amending the Act so that:
A bipartisan group of lawmakers has circulated a draft federal data protection law.
The proposal includes protections for children's data, limits on targeted advertising, a limited right of private action, data minimisation requirements, and various technical and organisational requirements.
While very much in its initial stages, the fact that the proposal is bipartisan has led to suggestions that this may be the bid for a Federal US privacy law that succeeds, particularly as it consolidates learning from previous failed privacy bills.
News as at 23 June 2022
The UK government has published its response to its September 2021 consultation 'Data: a new direction' which set out its proposals for reforming the UK's data protection regime. In general, the proposed changes are less controversial than those originally discussed but there will be significant divergence from the UK (and therefore the EU) GDPR, and from the Privacy and Electronic Communications Regulations (PECR).
See our article by Victoria Hordern and Debbie Heywood for more.
AG Pitruzzella has opined in a reference from Austria on interpretation of the Article 15(1)(c) access right under the GDPR. Article 15(1)(c) provides that individuals can obtain information from a controller about the recipients or categories of recipient to whom their personal data has been or will be disclosed. The AG was asked, in summary, whether it was at the controller's discretion to decide whether to disclose precise recipients, or categories of recipient.
The AG said that it is the data subject, not the controller, who has the choice. This is backed up by the wording of Recital 63 (which refers to the right to know the recipients of data), and due to the purpose of Article 15(1)(c). The right is intended to allow data subjects to verify the lawfulness of any disclosure and confirm recipients of their data are authorised. This implies that the information provided to them must be as precise as possible.
The EDPB has adopted guidelines on certification mechanisms as a tool for transfer. The guidelines are in four parts and look at aspects relevant to various actors from certification bodies, to users.
The ICO has announced that it will now be able to keep a portion of funds paid as civil and monetary penalties it imposes. Previously, the amounts went to the government's Consolidated Fund. They will now be used to cover pre-agreed, specific and externally audited litigation costs.
The Data Governance Act has cleared its final hurdle following formal adoption by the Council of the European Union without further changes. It will now be signed by the Presidents of the co-legislators and published in the Official Journal. It will enter into force 20 days after publication and will apply 15 months after it comes into force.
As mentioned last week, the EDPB has published new draft guidelines for DPAs on calculating GDPR fines. The EDPB says the starting points for assessing the amount of a fine are categorisation of infringement by nature, the seriousness of the infringement, and the turnover of the business at fault.
The EDPB outlined a 5-step approach:
Analyse whether the calculated final amount meets the requirement of effectiveness, dissuasiveness and proportionality or whether further adjustments to the amount are necessary.
Despite providing harmonisation and transparency, the guidelines are just that. Above all, the individual circumstances of the case must be a determining factor.
The draft guidelines are open for consultation until 27 June 2022.
The ICO has also ordered Clearview AI to stop obtaining personal data of UK residents which is publicly available on the internet, and to delete the data of UK residents from its systems.
The EDPS has published Opinions on EU proposals to set out a common high level of cybersecurity and information security in EU institutions. The EDPS welcomes the proposals but suggests the text could be improved by including greater assurances for the rights and freedoms of individuals where their data is processed for security operations. He recommends ensuring all proposed security measures have a valid legal basis and are necessary and proportionate. In particular, he says the Cybersecurity proposal needs to achieve better alignment with the recently approved NIS2 Directive.
The Spanish DPA (AEPD) has fined Google €10 million for breaches of the GDPR. It found that Google had passed personal data relating to right to be forgotten requests to a third party in the USA without having a lawful basis for doing so. This also had the effect of preventing individuals from having their data erased. The data was transferred under the Lumen Project, an academic project which involved creating a database of content takedown requests. The AEPD said that individuals were not given a chance to opt-out of their data being transferred and also criticised Google's process for submitting erasure requests. In addition to the fine, the AEPD ordered Google to delete all personal data shared with Lumen and ensure Lumen did the same.
Google has said it is considering its response but underlined that it has already started re-evaluating and redesigning its data sharing process with Lumen.
The French DPA (CNIL) has published guidance on criteria for assessing the legality of cookie walls. It stresses that the GDPR requirement that consent be freely given does not automatically mean cookie walls cannot be used at all. The key is whether users are given genuine alternatives to accepting cookies. For example, provided the cost of access is reasonable, it might be acceptable to offer users the chance to use a paid-for cookie-free service, as an alternative to a free service which requires consent to cookies.
The European Commission has adopted a new European strategy for a Better Internet for Kids (BIK+). The strategy aims to improve age-appropriate digital services and ensure children are better protected, empowered and respected online.
It has been adopted together with a draft Regulation to prevent and combat child sexual abuse online. The Regulation will oblige providers to detect, report and remove child sexual abuse (CSA) material on their services. Providers will need to assess and mitigate risk of misuse of their services and take proportionate measures to address issues.
An EU Centre on Child Sexual Abuse will be created to assist service providers and act as a hub of expertise for providers, assist law enforcement and provide victim support.
Key elements include:
There are concerns that, if the Regulation remains in its current form, it will allow for significant intrusion on privacy. The positive obligation on businesses to detect and remove child abuse images could provide scope for scrutiny of encrypted messages and potential profiling. The legislation may change significantly on its way to enactment if the current proposals prove sufficiently controversial.
Provisional political agreement has been reached by the European Parliament and Council on the NIS2 Directive. NIS2 builds on the NIS Directive. It covers medium and large entities across a wider range of sectors including public electronic communications services and digital services. It also covers the healthcare sector. It strengthens cybersecurity requirements and addresses security of supply chains and supplier relationships. It also streamlines reporting obligations and introduces stricter enforcement mechanisms. In addition, it will help increase information sharing and cooperation on cyber crisis management at EU and Member State level.
Member States will have 21 months to transpose the Directive after it comes into force so it is expected to apply some time in 2024.
President Biden is reportedly preparing to issue an executive order which would give the US Department of Justice power to stop certain foreign countries from accessing the personal data of Americans. Commercial transactions involving the transfer of data to third countries deemed to pose a risk to national security could be reviewed and stopped. The draft order is currently being reviewed by government agencies.
Google has announced additional user controls which will enable users to customise ads through an ad centre, and to request removal of search results giving contact details. This will not impact search results from news publications. Additional security features will also be introduced including virtual credit cards which mask actual credit card numbers. More controversially, Google is also reportedly planning to allow drivers' licences to be stored in wallets.
The EDPB has published its 2021 Annual Report. It highlights its work on Schrems II supplementary measures, and opinions on the UK adequacy decisions, new SCCs and guidelines on codes of conduct as a tool for transfer. Its digital work covered opinions on EC legislative proposals including the Data Governance Act, the Digital Services Package and the AI Act. Law enforcement was another priority and various guidance documents were published.
The EDPB's objectives for 2022 include guidance on legitimate interests and on the use of facial recognition by law enforcement, the latter published for consultation at the May plenary session. The EDPB also published draft Guidelines on the calculation of administrative fines under the GDPR for consultation.
The UK government has announced a Data Reform Bill to streamline the UK's data protection laws. The Bill is likely to move the UK away from the GDPR regime in some areas but more detail will emerge when the government publishes its response to its consultation on data protection reform. This is likely to be in a few weeks' time and the Bill is then expected to be published in the Summer. This suggests we may see a draft before Parliament begins its summer recess on 21 July. The new Bill of Rights which was also announced in the Queen's speech may also impact the data protection regime.
DCMS has published a call for views on improving the security and privacy of apps and app stores, together with supporting documents. Following a review of the app store ecosystem, the government finds that consumers continue to have access to poorly developed and malicious apps, with the most prevalent risk being in-app malware. App stores are not adequately signposting requirements to developers and users are failing to prioritise security and privacy when downloading apps.
The government seeks views on its plans:
The call closes on 29 June 2022. The government will publish a response and then, if the idea for a code of practice is adopted, it will be published later this year.
The EDPS and EDPB have published their Joint Opinion on the proposed Data Act. While they welcome the aim of the Data Act to work within the current data protection framework, they express "deep concerns" about the lawfulness, necessity and proportionality of the obligation to make data available to Member State and EU public sector bodies in case of "exceptional need". They ask the co-legislators to define more clearly when this might apply and which bodies might request data in those circumstances.
The EDPS and EDPB also advise provision for limitations or restrictions on the use of data generated by the use of a product or service by an entity other than data subjects, particularly where the data is likely to allow profiling or would otherwise entail high risks to the rights and freedoms of individuals. They also recommend clear limitations regarding the use of the relevant data for direct marketing or advertising, employee monitoring, calculating and modifying insurance premiums, and credit scoring. Limitations on the use of data are also advocated to protect vulnerable data subjects, particularly children.
The ICO has launched an updated AI and data protection risk toolkit following comments on its beta version. The toolkit is intended to provide practical support to organisations using AI systems and to help them assess and mitigate the risks those systems pose to individuals.
The AG has published an Opinion on the impact of a right to erasure request following withdrawal of consent to have personal data published in a public electronic telephone directory. The consent requirements and the right to erasure are governed by the GDPR. Under the ePrivacy Directive, however, Article 12(2) suggests that for the purpose of the publication of personal data in a public directory, consent need only be given once and not to a particular provider. It applies to any subsequent processing of personal data by third-party undertakings active in the market of publicly available directories provided the processing is for the same purpose.
The AG was asked to consider what should happen when a data subject asks for their data to be removed from directories. The AG said the CJEU should rule that:
The CJEU has ruled in a reference from Germany, that Member States can make provision for consumer protection associations to instigate actions for breach of data subject rights on the basis of infringement of rules on unfair commercial practices, consumer protection, or the use of invalid general terms and conditions. They can do this at their own instigation, without the mandate of a specific data subject or infringement of specific rights. Article 80(2) GDPR does not preclude this.
The reference was made in relation to Meta. The Bundesverband (Federal Union of Consumer Organisations and Associations) alleged that Facebook Ireland (now Meta) had failed to provide fair processing information to users of free games on its app in breach of data protection law, and had also breached consumer protection and competition rules. The German Federal Court of Justice asked the CJEU to decide whether or not the Bundesverband had standing under Article 80(2) GDPR.
Crucially, the CJEU held that in order to bring a representative action, an authorised organisation need not identify a specific data subject. Identifying an affected category or group may be sufficient. Moreover, there doesn't need to be a specific infringement of a data subject's rights, nor proof of actual harm in a given situation. This decision potentially opens the floodgates to actions brought by consumer protection organisations, for example, around cookie banners or marketing communications. The organisation will be able to bring proceedings in relation to a suspected breach of consumer protection or data protection rules without having to do so through or at the instigation of an individual. Read more.
In the UK, to which the CJEU decision does not apply (although the courts may take it into account), Meta is facing a representative action before the Competition Tribunal under the Consumer Rights Act 2015. It alleges Meta abused its market dominance by setting an unfair price for the free use of Facebook in the form of the personal data it collects from users. The class includes all people domiciled in the UK who used Facebook at least once between 1 October 2015 and 31 December 2019 unless they opt out. It will be interesting to see how it develops.
As part of the European Strategy for Data, the European Commission has published a Regulation to create a European Health Data Space. This is the first draft legislation on the proposed common European data spaces. The aim is to:
Individuals will have access to their electronic health care records and will be able to add information, rectify inaccurate data, restrict third-party access, and have oversight of how their data is used.
Member States will be required to ensure patient summaries, prescriptions, images, image reports, lab results and discharge reports are issued and accepted in a common European format. There will also be mandatory interoperability and security requirements.
Google has announced a new safety section for Google Play which will help people understand the data an app collects or shares, whether the data is secured and additional details which impact privacy and security. The safety section will be rolled out in stages ahead of a 20 July deadline for app developers. The safety section will detail:
Developers will also be able to show whether they share data with third parties. Permissions, for example to use location data, will be simplified.
Apple has a similar but not identical system but the use of privacy labels has been criticised as having potential to be misleading or inaccurate. Google says it checks each data safety section.
EDPB members have agreed to further enhance cooperation on strategic cases and to diversify the range of cooperation methods they use. Each year, the DPAs will identify a number of cross-border cases of strategic importance for which an action plan with a fixed timeline for cooperation will be set. Groups of DPAs may join together to work in groups or create an EDPB taskforce. The DPAs have also committed to sharing information about enforcement strategies and priorities.
The Children's Code requires online services to treat the best interests of the child as the primary consideration when designing and developing services likely to be accessed by children. This involves considering how the use of their data impacts the rights held under the United Nations Convention on the Rights of the Child. The ICO has created tools, templates and guidance to assist organisations in making assessments.
The Data Standards Authority has endorsed two new government data standards:
These standards are intended to increase transparency and data sharing.
A leaked copy of the negotiated version of the Digital Markets Act has been published. There have been some significant changes since the first draft. These include wide-ranging interoperability requirements on messaging services, and extended obligations on gatekeepers under Articles 5 and 6, including around use of data.
In particular:
Google has announced it will roll out a 'reject all' button allowing users in the EEA, UK and Switzerland, to reject or accept all non-essential cookies on search or YouTube provided they are signed out or are in incognito mode. Customised options will also be available. The roll-out has started in France (which has been particularly focused on cookie consent).
The news was welcomed by the ICO as a step in the right direction, although Stephen Bonner, Executive Director of Regulatory Futures and Investigations, commented that "there's still a long way to go to address concerns around consent across the whole advertising industry…current approaches to obtaining cookie consent need further revision in order to provide a smoother and increasingly privacy-friendly browsing experience".
The US Department of Commerce has announced the creation of a Global Cross-Border Privacy Rules Forum with Canada, Japan, Republic of Korea, Philippines, Singapore and Taiwan. The Forum aims to establish cross-border privacy rules and privacy recognition for processor systems, as well as privacy standards certification. It also considers interoperability with other data protection and privacy frameworks. A Declaration sets out the underlying principles of the forum. Members will exchange views and share research, analysis and policy to develop best practice. The Forum is theoretically open to all.
The European Parliament has adopted its position on the draft Data Governance Act. The European Council is expected to adopt the approved version without further amendments and it will then be published in the Official Journal. The Data Governance Act establishes:
See here for more about the TADPF.
The government-commissioned report into using health data for research and analysis has been published. Ben Goldacre was commissioned in February 2021, to look into how efficient and safe use of health data for research and analysis can benefit patients and the healthcare sector. The report, aimed at NHS policy makers, the government, and research funders, as well as those using health data for service planning, public health management and medical research, makes 185 recommendations over 112 pages. Helpfully, an executive summary and a slightly longer summary have also been published. The government is likely to include its response in its health and social care data strategy which is expected to be published later this year. The report recommends a move away from techniques like pseudonymisation and towards shared Trusted Research Environments (TREs). The emphasis is on shared resources and processes to enable data sharing in a secure environment. Investment in platforms and curation is seen as the key to resolving problems caused by the current fragmented approach, while preserving patient privacy.
CMA update on investigation into Meta's use of ad data
The CMA is investigating Facebook's collection and use of data in the context of providing online advertising services, including whether its single sign-on function gives it a competitive advantage. It has announced an extension to its information gathering period to summer 2022.
ICO latest guidance on data protection and COVID-19
The ICO has published guidance for businesses around use of personal data relating to the COVID-19 pandemic now that measures have been relaxed across the UK. The ICO says organisations should ask themselves:
Current approaches should be reviewed to ensure they are still reasonable, fair and proportionate to current circumstances taking the latest government guidance into account. Information collected during previous stages of the pandemic should be deleted if it is no longer required.
Organisations should consider whether they (still) need to collect vaccination information. There must be a compelling reason to do so and an appropriate lawful basis must be selected (in addition to any employment law requirements).
Data protection law does not prevent organisations from informing employees about positive cases, but individuals should not be named unless to do so is unavoidable, and only minimum information should be provided.
Advocate General Pitruzzella has opined on a reference from the German Federal Court on the issue of evidence to back up a request to have search results de-listed under Article 17 GDPR (and equivalent provisions under the Data Protection Directive).
The applicants brought a claim against Google asking it to de-reference certain search results of articles whose accuracy they disputed, and to stop displaying thumbnail photographs distinct from the articles. The referring court asked whether it was for the applicants to provide evidence that the information was untrue, or whether it was for Google to assume the claims were well founded and de-list, or to seek to clarify the facts.
The AG said it was for the data subjects to provide evidence of the falsity of relevant content where to do so is not manifestly impossible or excessively difficult. The search engine operator should carry out checks which fall within its specific capacities. Where possible, it should contact the publisher of the relevant web page. Where appropriate, it could suspend the referencing on a temporary basis pending further information, or mark the results as containing disputed information. There should be a balancing of the fundamental rights involved.
With regard to the thumbnails, the AG said there was no need to de-reference images from an image search on the basis of their connection to a name, as account should not be taken of the context of the publication on the internet in which the thumbnails originally appeared.
The FCA has published a call for input on the use of 'synthetic data' to support financial services innovation. The FCA is looking for solutions which will enable greater data sharing for competition purposes and facilitate innovation while preserving privacy. The FCA believes synthetic data could reduce GDPR-related compliance burdens while protecting the rights of individuals. It is looking for views from a range of stakeholders on the potential benefits of using synthetic data, as well as possible limitations and risks. The call closes on 22 June 2022.
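For readers unfamiliar with the concept, the sketch below shows, in very simplified terms, what synthetic data means in practice: artificial records that mimic the statistical shape of a real dataset without reproducing any individual's row. It is an illustrative, assumption-laden example only; production tools model joint distributions and add formal privacy guarantees that this naive resampling does not.

```python
import numpy as np
import pandas as pd

def naive_synthetic(df: pd.DataFrame, n: int, seed: int = 0) -> pd.DataFrame:
    """Resample each column independently to produce n synthetic rows.

    Column-wise resampling preserves each column's marginal distribution but
    deliberately breaks the links between columns, so no output row maps back
    to a real individual. Real synthetic-data generators are far more
    sophisticated; this only illustrates the basic idea.
    """
    rng = np.random.default_rng(seed)
    synthetic = {
        col: rng.choice(df[col].to_numpy(), size=n, replace=True)
        for col in df.columns
    }
    return pd.DataFrame(synthetic)
```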
TikTok has settled a class action which alleged that TikTok did not collect verifiable parental consent before processing personal information of under-13s in breach of the US Children's Online Privacy Protection Act. The $1.1 million settlement will be split among the class, which consists of US users aged under 13 who registered for or used TikTok or Musical.ly before the effective date of the settlement. TikTok and some of its group companies are also facing a representative action in the High Court around use of children's data.
The ICO has fined H&L Business Consulting Ltd £80,000 for sending nearly 400,000 unsolicited marketing texts which sought to capitalise on the pandemic by directly referencing lockdown when marketing a 'debt management' scheme. The messages falsely claimed the scheme was government-backed when it was not even authorised by the FCA. In addition, the company director had been uncooperative with the ICO.
Sri Lanka has passed its Personal Data Protection Act. Heavily influenced by the GDPR, it will come into effect in stages from next year.
The EC and US government have announced agreement "in principle" of a new Trans-Atlantic Data Privacy Framework to facilitate frictionless data flows between the EU and USA. Full details have not been released but in its press release, the White House said the USA had made "unprecedented commitments" to:
It goes on to say that the Framework ensures:
The scheme will operate as before, on a self-certification basis signalling that an organisation complies with a set of Principles.
The announcement has been greeted with enthusiasm by organisations and privacy professionals. Privacy campaigners including Max Schrems and NOYB have, however, been more cautious (to put it mildly). They warn that "we expect this to be back at the Court within months from (sic) a final decision". This is based on the expectation that the final text of the Privacy Shield will use GDPR-friendly language (such as 'redress' and 'proportionality') but will not be underpinned by any changes in US surveillance laws. A recent US Supreme Court decision (FBI v Fazaga) reinforced the rights of surveillance authorities to access personal data, so it remains to be seen whether a meaningful redress system will be set up for EU citizens concerned about access to their data.
A final version of the new Framework is not expected for the next few months, after which it will go through an approval process, so it is unlikely to be in place in the short term. Assuming it is adopted, even if there is an ensuing legal challenge, it should at the least hold for a year or two. The EC and the US, not to mention businesses, will be hoping it fares better than that and provides a lasting solution where Safe Harbor and the original Privacy Shield failed.
The EU Parliament and Council have reached provisional political agreement on the Digital Markets Act (final text not yet published although due imminently). The DMA is intended to regulate digital markets, effectively to help 'level the playing field', and focuses on those platforms (including social media) designated as "gatekeepers". This means the bulk of the Act is aimed squarely at the tech giants although it will impact the wider market. SMEs are specifically excluded from gatekeeper status although obligations may be placed on "emerging gatekeepers".
Recognising that data provides gatekeepers with a significant competitive advantage, many of the provisions relate to their use of data. Gatekeepers will be required to share certain data with sellers on their platforms, and will be subject to restrictions on their use of personal data across their services.
A platform will be a gatekeeper if:
If a gatekeeper does not agree with its designation as such, it can challenge the finding.
Gatekeepers will have to:
Gatekeepers will not be allowed to:
Compliance with the DMA will be overseen by the European Commission. An advisory committee and high-level group will be set up to assist the Commission. Member States will be able to empower their own competition authorities to start investigations into possible non-compliance and refer to the Commission.
Non-compliance will result in fines of up to 10% of annual global turnover, rising to up to 20% for repeat offences. Systematic non-compliance (at least three times in eight years) could lead to a market investigation and the imposition of structural or behavioural remedies.
The DSA, which is going through the legislative process running shortly behind the DMA, is another core pillar of EU digital regulation. As such, there are areas of overlap, including around digital advertising. At the press conference to announce political agreement on the DMA, Competition Commissioner Vestager said the DSA is likely to include a ban on targeted advertising without consent, as well as rules on advertising to minors. The Commission is hoping to pass the two Acts around the same time.
The DMA text will now be finalised and then needs to be approved by the Council and the European Parliament. It is expected to come into force in or around October 2022 and will come into effect six months later.
The Digital Regulation Cooperation Forum (DRCF) has launched a new digital regulation research portal. The DRCF includes the CMA, Ofcom, the ICO and the FCA. Its digital regulation research programme brings together over 80 pieces of recent research on emerging and future digital developments from eight regulatory bodies including the DRCF members, the IPO, the Bank of England, the ASA and the Gambling Commission.
The CMA has appointed ING Bank NV as Monitoring Trustee to monitor compliance with Google's commitments to the CMA regarding its Privacy Sandbox proposals. Google's plans to replace third party cookie functionality on its Chrome browser with Privacy Sandbox tools are subject to a number of commitments and the Trustee will produce quarterly reports on compliance. It will also inform Google if any further measures are required, and report to the CMA if it considers Google is not adhering to its commitments.
The National Data Guardian has given written evidence to the Science and Technology Committee's digital data inquiry. It details the advantages of facilitating the collection and sharing of health data while underlining that trust is essential to the process. It suggests the government's proposals on sharing health data made in its consultation on data protection reform ('Data: a new direction'), and the "Data Saves Lives" draft strategy, could lead to weaker rights for individuals. Not only would this be an issue for individuals, who may be less willing to share health data, but it could have a knock-on effect on the EU-UK adequacy decision for data transfers.
The UK's International Data Transfer Agreement and Addendum to the EC Standard Contractual Clauses came into force on 21 March 2022 without further changes. UK organisations should begin using these for new transfers. Contracts concluded on or before 21 September 2022 on the basis of the previous Standard Contractual Clauses will be treated as providing adequate safeguards for data exports until 21 March 2024, provided that the processing operations they cover remain unchanged. See here for more.
Google's most up-to-date analytics tool is Google Analytics 4 (GA4), launched two and a half years ago. Google has announced it will begin phasing out the previous version, Universal Analytics, next year. All standard Universal Analytics properties will stop processing new hits on 1 July 2023. The more recently introduced Universal Analytics 360 will cease hit processing on 1 October 2023.
GA4 operates across platforms and does not rely exclusively on cookies. It uses an event-based data model to deliver user-centric measurement. Going forward, GA4 will no longer store IP addresses. GA4 has always used anonymised IP addresses but Google says it no longer needs to log them, relying instead on data-driven attribution modelling and other upgrades including modular country settings.
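By way of illustration only, an 'event-based' model means each interaction is recorded as a named event with parameters rather than as a pageview-centric hit. The structure below is an assumed, simplified example to show the idea; it is not Google's actual schema or API.

```python
# A simplified, hypothetical event record in an event-based analytics model.
# Every interaction (a page view, a click, a purchase) is just an event name
# plus parameters, rather than a distinct "hit type".
purchase_event = {
    "name": "purchase",
    "params": {
        "currency": "GBP",
        "value": 29.99,
        "items": [{"item_name": "subscription", "quantity": 1}],
    },
}

# A page view is modelled the same way, as another named event.
page_view_event = {"name": "page_view", "params": {"page_title": "Pricing"}}
```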
Exports of Google Analytics data from the EEA to the USA have been under regulatory scrutiny since the Schrems II judgment and a number of SAs have found such exports to be unlawful. Google will be hoping that these changes will address concerns.
The EC has issued a call for feedback on the draft Data Act. It will summarise the responses which will then form part of the legislative debate on the proposed Regulation. The consultation closes on 11 May 2022.
The EC has issued a call for evidence for an impact assessment and public consultation on a new initiative (most likely a Regulation) on horizontal cybersecurity requirements for digital products and ancillary services (EU Cyber Resilience Act). The initiative was announced by Ursula Von der Leyen in her 2021 State of the Union Address. The Act will aim to make digital products and ancillary services more secure and ensure adequate cybersecurity is applied throughout a product's lifecycle, alongside provision of information.
Various policy options are under consideration, from a largely self-regulatory approach, to full-scale legislation, to a hybrid approach. The call closes on 25 May 2022 and the Commission expects to publish proposals in the third quarter of 2022.
The EDPB has adopted the following at its 62nd Plenary Session:
The ICO has fined five companies a total of £450,000 for making over 750,000 unwanted marketing calls targeted at older and vulnerable people. The ICO found that the companies, possibly working together or using the same marketing lists, were deliberately targeting older people by buying lists from third parties, specifically asking for information about people aged 60+, homeowners, and people with landline numbers.
The ICO has added to its GDPR guide to cover ransomware and data protection compliance. The guidance provides a compliance checklist and sets out eight common scenarios and situations experienced as a result of a ransomware attack, alongside compliance suggestions. The guidance also points to other resources, notably NCSC guidance.
Ofcom is consulting on updating its guidance for telecoms operators on complying with security requirements under the Telecommunications (Security) Act 2021, which amends ss105A-D of the Communications Act 2003. The guidance is likely to be superseded by a Code of Practice which is also the subject of consultation along with draft secondary legislation which will set out what telecoms providers need to do to comply with security requirements. The consultation closes on 17 May 2022.
SMO, a minor, acting through her litigation friend the former Children's Commissioner for England, is looking to bring a claim against six defendants from the ByteDance Group which owns TikTok, arguing that six group entities breached relevant data protection and privacy law. The second defendant is a UK company which owns the third defendant, an Irish company. The other defendants are located in the Cayman Islands, China and the USA. The claimant is seeking to bring a representative claim under CPR 19.6 (the same route chosen by Mr Lloyd in his case against Google which failed). She required permission to serve claims outside the jurisdiction against some of the defendants. Service must take place within six months of the claim being issued.
The proceedings in this case were stayed pending the outcome of Lloyd v Google. However, the claimant did not proceed in a timely manner and made last minute applications to extend time limits and to serve the claim against the Chinese entity on its London solicitors. The judge did not agree on either of these points.
The main interest though is in the judge's consideration of whether the claims related to a serious issue to be tried on the merits (a requirement for serving out). The issue was whether there was a realistic prospect of success given the outcome of Lloyd v Google. Nicklin J said that, having heard only the claimant's arguments, there is a serious issue to be tried and granted permission to serve out (albeit within the un-extended time limits).
The next step will be an application for summary judgment by the UK-based defendant. The represented class in the claim is all those who are, or were, account holders and users of TikTok from 25 May 2018, and were resident in the UK or EEA (with the exception of the Netherlands as there are separate proceedings there), and who were under the relevant age (either 13 or 16 depending on the relevant country and applicable legislation).
One of the problems with Mr Lloyd's representative action was that not everyone in the represented class had the same interest because they had different amounts of data processed unlawfully meaning they had suffered different levels of damage and any award would have to be individually assessed. The Supreme Court also said each individual in a representative claim had to establish damage had been suffered as a result of a contravention. In Lloyd v Google, the individuals could not demonstrably be shown to have suffered the same level of damage and could not claim damages on a representative basis. It will be interesting to see whether SMO's claim against the UK defendant fails on the same grounds at the hearing of the application for summary judgment.
Separately, TikTok is also under scrutiny in the USA after eight States announced their Attorneys General would lead a nationwide investigation into the impact of TikTok on children, teens and adults.
The Belgian data protection regulator (APD) has published an Opinion warning that a draft Bill intended to strengthen the APD risks making it less independent. In particular, it is concerned that guarantees of independence will be eroded and that the Belgian Parliament will have a greater level of control than that authorised under EU and Belgian law. The APD is also worried about a lack of funding and a weakening of legal procedures.
It will be interesting to see whether the EDPS or even the EC will get involved in this, not least because it may shed some light on how the EU might view some of the UK government's proposals in its consultation on the reform of UK data protection law. The UK government is looking at introducing greater oversight of the ICO, including through a government-appointed CEO, and requiring the ICO to consider the government's national and international priorities. In her response to the consultation, the then Information Commissioner, Elizabeth Denham, said "there are specific proposals where I have strong concerns because of their risk to regulatory independence".
The Danish DPA has published guidance on the use of cloud services aimed at data controllers. UK businesses may find it useful as the ICO's pre-GDPR 2015 guidance on the subject has not yet been updated. It covers issues for controllers using cloud services and how to assess and monitor processors. It also looks at the issue of data exports to third countries and, in particular, the US.
The Italian DPA has fined Clearview AI €20 million for data protection breaches relating to its facial recognition technology. This follows the UK ICO's November 2021 notice of intent to fine it over £17 million after a joint investigation with the Australian Information Commissioner. Clearview AI's facial recognition system uses over 10 billion images scraped from the internet. Regulators have found Clearview AI in breach of numerous GDPR requirements, including that it failed to carry out fair and lawful processing, did not have a lawful basis for collecting information, did not have suitable transparency and data retention policies, failed to give required information to individuals, did not satisfy processing conditions for special category data, and did not give effect to data subject rights.
Separately, according to Clearview, more than 2bn of its images are from the Russian social media network VKontakte. Clearview has reportedly offered Ukraine its services for free to help it identify Russian "infiltrators", identify the dead, fight misinformation and reunite families who do not have paperwork. According to unconfirmed reports, Ukraine began using the technology at the weekend.
The Irish Data Protection Commissioner, acting as lead regulator, has fined Meta €17 million in relation to a series of data breaches which took place between June and December 2018. Specifically, the DPC found that Meta had breached the Article 5(2) accountability requirement, and the Article 24 requirement to maintain appropriate technical and organisational measures. It was not able to readily demonstrate the security measures it had implemented in practice to protect user data in the context of the 12 identified breaches. The Irish DPC's decision was subject to the Article 60 decision making process as it related to cross-border processing.
Privacy campaign group NOYB has sent 270 draft complaints to website operators it believes are not complying with GDPR requirements in their cookie banners. It has included suggestions about how to make the banners compliant and says it will only file the complaints against operators who do not make changes within 60 days. This is the second wave of action NOYB has launched on cookie banners and it says it will extend its focus to pages using consent management platforms other than OneTrust, including TrustArc, Cookiebot, Usercentrics and Quantcast, hoping to review up to 10,000 websites.
Guidance for manufacturers of Video Surveillance Systems
The Biometrics and Surveillance Camera Commissioner has published guidance on security by design and default for manufacturers of Video Surveillance Systems (VSSs) and for those manufacturing or assembling components to be used in VSSs. The guidance sets out minimum requirements to ensure systems are designed and manufactured in accordance with the principles of security by design and default. It forms part of a wider suite of documents being developed as part of the Surveillance Camera Commissioner Strategy in support of the SCC Code of Practice.
ICO reprimands Scottish government over COVID Status app
The ICO has issued a reprimand to the Scottish Government and NHS National Services Scotland for GDPR failings in relation to sensitive health data used by the NHS Scotland COVID Status app.
The ICO raised a number of concerns about the app at the development stage and while some were resolved, the app went live without addressing some of the ICO's wider issues. In particular, the ICO says:
The ICO noted that the privacy notice had been redrafted a few times since launch but said that this had not addressed the issues. The ICO now requires the privacy notice to be re-drafted in order to present the information required by Article 13 in a manner which complies with Article 12. It will consider issuing an Enforcement Notice if this does not happen within 30 days of 28 February.
The ICO has published a call for views on chapter four of its draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies. The latest chapter focuses on accountability and governance. It deals with responsibility for the anonymisation process, how to carry out DPIAs, ensuring transparency, risk mitigation and staff training, among other issues. The call closes on 16 September 2022.
The EDPB has adopted the final version of its Guidelines on Codes of Conduct as a tool for transfers. The guidance provides a clarification of the application of Articles 40(3) and 46(2)(e) of the EU GDPR. It also covers the use of an Article 40 Code of Conduct as a mechanism to protect data transfers to third countries.
The revised Swiss Federal Act on Data Protection was due to come into force early this year. It has now been postponed to 1 September 2023 (subject to a confirmatory decision by the Federal Council). The revised Act was heavily criticised in some quarters for not meeting GDPR standards and, while the reason for the delay has not been made public, it is expected that further revisions may come as a result of responses to a June 2021 consultation.
The US Senate has passed the Strengthening American Cybersecurity Act, a package of bills intended to enhance US cybersecurity. The draft legislation is targeted at companies which provide critical infrastructure including energy and healthcare. If passed in its current form, these companies would be subject to 72 hour breach reporting requirements and mandatory disclosure of ransomware payments.
The package now moves to the House of Representatives for consideration. US Department of Justice officials are reportedly unhappy that the predominant reporting requirement is to the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency, although there are provisions requiring the Agency to share reports more widely with other relevant agencies such as the FBI.
The EC has published its draft Data Act. The draft Data Act (which takes the form of a Regulation) clarifies who can create value from data (personal and non-personal) and under what conditions. It is the second major legislative initiative of the European Strategy for Data and follows on from the Data Governance Act which creates the processes and structures to facilitate data sharing.
The Act is intended to unlock industrial data by giving business users access to data they contribute to creating, and by giving individuals more control over all their data, not just personal data. It focuses particularly on data created using connected devices and related services, for example voice assistants. It is partly aimed at large-scale manufacturers and service providers of IoT products, who are likely to lose their data advantage to a degree. Third party business users will not be able to use obtained data to develop directly competing products, but they will be able to use it to create other products and services.
Proposals include:
The proposal is intended to be consistent with the GDPR, the ePrivacy Directive, the Free Flow of Non-Personal Data Regulation and the Unfair Contract Terms Directive. "Data" is defined as "any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audio-visual recording". It therefore covers personal and non-personal data, although some provisions in the Act (such as those around international access and transfer) apply only to non-personal data.
Manufacturers and designers must design products and related services in such a way that by default and design, businesses and individuals involved in generating data through them are able to access, use and share their data free of charge. Data must also be made available to third parties by data holders at the request of the user (except by micro and small enterprises). These provisions do not prevent the manufacturer from accessing and using data from their products or related services where agreed with the user.
Data made available cannot be used to develop competing products and trade secrets are given protection. Users and third parties may not share data with organisations designated as gatekeepers under the Digital Markets Act. Third parties receiving data may only process it as agreed with the user (and in accordance with the GDPR where the data includes personal data).
Where a data holder is required to make data available, it must do so on fair and reasonable terms and in a transparent manner. Any compensation to a data holder for making data available to a data recipient must be fair and non-discriminatory.
Contractual terms unilaterally imposed by one party on a micro, small or medium sized business must be fair, reasonable and non-discriminatory. Unfair terms are defined in general terms but the Act also sets out a list of clauses which will always be or will be presumed to be unfair. The burden of demonstrating that terms are non-discriminatory is on the data holder.
Provisions are made for public bodies and agencies to be able to access private sector data where there is an exceptional need for it. For example, where the data is necessary to respond to public health emergencies or major natural or human-induced disasters, the data would be made available for free. Where made available in other cases of exceptional need, for example to prevent or recover from a public emergency, there are provisions to allow compensation for data holders. Rules are introduced to provide oversight and ensure the access right is not abused.
Data processing services must ensure their customers can switch to an equivalent service from another provider and may not create obstacles, for example by imposing termination periods longer than 30 days, restricting the porting of data, or preventing users from entering into new contracts with other providers. Contractual terms must allow for switching and include assistance and service continuity provisions during the transition period, which must not exceed 30 days unless technically unfeasible. There is also provision for the gradual withdrawal of switching charges. Specific technical standards or interfaces are not mandated, but services must be compatible with European standards or interoperability technical specifications where available.
Providers of data processing services are required to take all reasonable technical, legal and organisational measures, including contractual arrangements, to prevent international transfers or governmental access to non-personal data held in the EU where such transfer or access would create a conflict with EU or Member State law. This is subject to exceptions for legal data access requests under international agreements.
Operators of data spaces and data processing service providers must comply with requirements to facilitate the interoperability of data, data sharing mechanisms and services. These can be generic or sector-specific and the Commission may adopt further specifications and requirements, but the immediate focus is on cloud service providers and ensuring portability. The Act also sets out essential requirements for smart contracts.
Member States must designate supervisory authorities which will have powers to sanction non-compliance in line with GDPR-level fines for certain breaches.
The legislation now begins the path to approval and is expected to apply 12 months after it comes into force. See our article for more.
The CMA has published a blog on its decision to accept binding commitments on Google's proposals to replace third party cookies on its Chrome browser with a range of Privacy Sandbox tools. The CMA notes that Google plans to roll out a similar set of changes to app advertising within the Android ecosystem and that it will apply the same commitments on a voluntary basis. The CMA is looking for engagement with and input from other market players to help it inform its approach to monitoring and testing Google's proposals.
The Irish Times has reported that the Irish Data Protection Commissioner has issued a preliminary draft decision suspending data transfers by Meta (Facebook's parent company) from Ireland to the USA. Meta reportedly has 28 days to make submissions, after which the Irish DPC will prepare a draft Article 60 decision for consideration by other concerned Supervisory Authorities. This is likely to happen in April. The substance of the decision has not yet been made public.
This is the latest twist in the Schrems complaint and, if suspension of transfers to the US is ordered, there could be wide reaching implications for all data transfers to the USA and, potentially, to other third countries.
The ICO has published draft guidance on the research provisions within the UK GDPR and DPA 18. This is intended to highlight where in the legislation the various provisions relating to research are found, how they fit together, and their practical effect. It is intended to provide more detail and clarity to help organisations understand when they can rely on research provisions. It covers areas including:
As the ICO notes, the government is currently consulting on reforming research provisions as part of its review of the UK GDPR. The ICO believes guidance on the current regime is, however, required now in order to support stakeholders and it addresses many of the issues raised by the government. It is consulting on the draft guidance until 22 April 2022.
The Association of the British Pharmaceutical Industry (ABPI) is consulting on draft governance principles for the use of health data by the pharmaceutical industry. The principles are intended to help increase transparency around the use of health data by pharmaceutical industry researchers and cover:
IAB TechLab (which focuses on developing technology and standards for the digital media ecosystem and is related to IAB Europe) has set out its 2022 product roadmap and working groups. It will focus on:
Among the working groups planned is one on PETs. Developers working on advanced cryptography, data scientists, privacy and security systems engineers, as well as others in the digital advertising community, are invited to help develop privacy enhancing standards and software tools for the digital advertising industry.
The EDPB has begun its first coordinated enforcement action which will investigate the use of cloud-based services by public authorities. This follows the creation of a Coordinated Enforcement Framework (CEF) in October 2020. The action will involve 22 DPAs and the EDPB looking at over 80 public bodies.
The CEF will be implemented at national level in one or more of the following ways: fact finding to identify whether a formal investigation is warranted; formal investigation; and follow-up of ongoing formal investigations. SAs will focus on public bodies' challenges with GDPR compliance when using cloud services, including international transfers and the controller-processor relationship.
Results will be analysed in a coordinated manner and the EDPB will publish a report by the end of 2022.
The CNIL has set out its enforcement priorities in 2022:
Google has published a blog on its updated approach to the Privacy Sandbox. It announced a multi-year initiative to build the Privacy Sandbox on Android to introduce more private advertising solutions. These will limit sharing of user data with third parties and operate without cross-app identifiers, including advertising ID. Google is also exploring ways to reduce the potential for covert data collection, including safer ways for apps to be integrated with advertising SDKs.
Developers can review initial design proposals and share feedback on the Android developer site. A beta version will be released by the end of the year.
The announcement follows Google's commitments to the CMA about the development of its Privacy Sandbox.
Oman has approved its Law on the Protection of Personal Data which takes effect in February 2023. The law, which provides a range of data subject rights, is heavily focused on consent and is also subject to a wide range of exceptions. It will be enforced by the Ministry of Transport, Communications and Information Technology which will draw up further regulations.
The EDPS has published preliminary remarks on Modern Spyware, recommending a ban on the development and deployment of Pegasus-type spyware in order to protect fundamental rights and freedoms. The EDPS says it is unlikely that Pegasus could be used lawfully in the EU but recognises there may be exceptional circumstances, including serious imminent threat, that might justify using this type of spyware subject to robust oversight.
California legislators are proposing a new law to protect children's privacy online. It is heavily based on the ICO's Children's Code with a focus on privacy by design. As drafted, the law would require companies based in California (including Meta and Google) to limit the amount of data they collect from children. It would also limit location tracking and profiling for targeted advertising, ban behavioural nudges, and require age-appropriate content policies.
The French data protection authority, the CNIL, has found that an unnamed French website manager's use of Google Analytics breaches the GDPR as it results in the unlawful transfer of personal data to the USA. The CNIL found that the personal data transferred to the USA by Google Analytics was not adequately protected from potential access by US intelligence agencies. This was notwithstanding the additional protective measures put in place by Google. As a result, the CNIL ordered the controller to bring its processing into GDPR compliance within one month, if necessary by ceasing to use Google Analytics (under the current conditions) or by using a tool which does not involve a transfer outside the EU.
The CNIL noted that its decision is in response to one of the 101 complaints filed by NOYB in the EU relating to the use of Google Analytics, and that it was reached in cooperation with EU DPAs. The decision mirrors an earlier one by the Austrian DPA and follows warnings made by the Netherlands regulator.
The CNIL said that analytics tools for website and audience measurement and analysis services should only be used to produce anonymous statistical data which would therefore allow for an exemption from consent requirements if the controller ensures there are no illegal transfers. It has launched an evaluation programme to determine which solutions are exempt from consent. The CNIL also makes it clear that it is not just Google Analytics in the frame but that it, and its EU counterparts, are investigating other similar tools which also result in the transfer of personal data to the USA.
With other EU regulators likely to follow the general direction of travel, it may become difficult to use analytics tools which transfer personal data to the US, although controllers should still be able to use them to collect non-personal analytics.
The wider implications of these decisions are more far-reaching. They cast doubt over whether there are any realistically achievable additional measures sufficient to protect personal data transferred to the US from the EU. Google says its supplementary measures intended to give effect to the Schrems II ruling include anonymising IP addresses before data leaves the EU, using data encryption, and employing technical measures to prevent in-transit interception. These are all measures included in EDPB guidance as ways to reduce the risk attaching to US data transfers, and yet regulators are finding that they do not adequately protect the data.
Given that the personal data being transferred by analytics tools is not particularly sensitive, one might imagine the risk to individuals regarding access by intelligence authorities is relatively low. So how much more stringent do protections need to be for more sensitive data being transferred to the US? All eyes are now looking out for the rumoured Privacy Shield 2.0.
To hear more on these issues and to catch up with other developments on data transfers, join our webinar on 22 February.
As expected, IAB Europe has announced it is appealing the decision of the Belgian data protection authority that it breached the GDPR in relation to its Transparency and Consent Framework (TCF). IAB Europe has described the APD's finding that it was a data controller in respect of the TC String used in the TCF as "wrong as a matter of law". It now says "We believe the controversial ruling that IAB Europe is a data controller for information processed for TCF purposes is based on a misunderstanding of the facts and a misapplication of the law. This establishes an irrational legal precedent. It will have the perverse effect of discouraging other standard-setting organisations from investing in instruments that aim to protect users and facilitate the exercise of their rights under GDPR".
IAB Europe asks other EU DPAs to confirm they will hold off taking measures against IAB Europe or any TCF participants pending the outcome of the appeal. It also suggests advocacy organisations calling on advertisers to stop using the TCF and OpenRTB are reaching "unfounded" conclusions, because no advertisers are named parties in the Belgian ruling and because the APD has not ordered IAB Europe to discontinue the use of the TCF while it submits a plan to address the APD's concerns.
IAB Europe says it will continue to work with the APD towards approval of the TCF as a trans-national GDPR Code of Conduct.
The CMA has announced its acceptance of Google's final commitments to address competition and privacy concerns over Google's planned replacement of third party cookies on its Chrome browser with a range of 'Privacy Sandbox' tools.
The CMA has been working closely with the ICO to analyse and address potential risks associated with Google's proposals. In particular, the CMA was concerned that they would:
The CMA has now accepted final commitments from Google (following two consultations on proposals) which include:
The commitments are made for six years. The ICO has welcomed the outcome.
The Belgian Data Protection Authority (the APD) has fined IAB Europe EUR 250,000 for GDPR failings relating to its Transparency and Consent Framework (TCF). The TCF was designed to help the real-time bidding adtech ecosystem comply with the GDPR and, in particular, to address some of the difficulties in being transparent and obtaining valid consent. It is widely used by the adtech industry.
The APD found that IAB Europe is a data controller with respect to the registration of user consents, objections and preferences by means of a unique Transparency and Consent (TC) String which is linked to an identifiable user. It is therefore responsible for the following GDPR failings:
The APD fined IAB Europe EUR 250,000 and gave it two months to present an action plan for taking corrective measures which must be completed within six months. These must include:
Responding to the decision, IAB Europe rejected the APD's finding that it is a data controller in the context of the TCF, saying it was wrong "as a matter of law" and hinting strongly at an appeal. On a more positive note, it also hailed the decision as paving the way for the TCF to become a formal GDPR Code of Conduct. This was because the APD provided for issues to be remedied within six months.
Some of these issues are fairly easy for IAB Europe to fix – for example, appointing a DPO. Others go to the heart of GDPR compliance for the adtech sector by making it very difficult to establish a suitable lawful basis for processing. Capturing granular consent is notoriously difficult in the world of real-time bidding which is one of the reasons the TCF was developed in the first place.
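To illustrate the kind of mechanism at issue, the sketch below shows, in Python, how per-purpose consent flags might be packed into a compact, URL-safe string that can travel with a bid request. It is a simplified, assumption-based illustration only: the purpose names and encoding are invented for the example and this is not the actual IAB TC String specification, which also records, among other things, vendor-level consents, legitimate interest signals and policy versions.

```python
import base64

# Hypothetical consent purposes - illustrative only, not the IAB TCF purpose list.
PURPOSES = ["store_device_info", "basic_ads", "personalised_ads_profile",
            "personalised_ads", "measure_ad_performance"]

def encode_consent(consents: dict) -> str:
    """Pack per-purpose consent booleans into a compact URL-safe token."""
    bits = 0
    for i, purpose in enumerate(PURPOSES):
        if consents.get(purpose, False):
            bits |= 1 << i
    raw = bits.to_bytes((len(PURPOSES) + 7) // 8, "big")
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

def decode_consent(token: str) -> dict:
    """Recover the per-purpose consent flags from the encoded token."""
    padded = token + "=" * (-len(token) % 4)
    bits = int.from_bytes(base64.urlsafe_b64decode(padded), "big")
    return {p: bool(bits & (1 << i)) for i, p in enumerate(PURPOSES)}

# Example: a user consents to basic ads and ad measurement only.
token = encode_consent({"basic_ads": True, "measure_ad_performance": True})
print(token, decode_consent(token))
```

As the decision suggests, the difficulty is less the encoding itself than establishing a lawful basis for, and control over, the processing that such a signal then facilitates across the many participants in real-time bidding.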
IAB Europe said it would work with the APD towards getting the TCF to a point where it can be submitted for approval as a trans-national Code of Conduct. The question is whether it will be able to do that and what it means for the industry as a whole if it can't.
The ICO is calling for views on its updated draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies. This is being published in sections with the most recent focusing on pseudonymisation. Chapters introducing anonymisation and on identifiability were published last year. Further chapters covering research, PETs, data sharing and accountability (among other topics) are set to follow.
The latest chapter sets out key differences between anonymisation and pseudonymisation, guidance on pseudonymisation and on the DPA 18 re-identification offence.
The consultation on the guidance is open until September 2022. A consultation on the full draft will take place in the autumn.
The ICO is inviting health sector organisations to participate in workshops on Privacy Enhancing Technologies. The ICO wants to set out how PETs can facilitate safe, legal and valuable data sharing in health and understand what is needed to help organisations use them. Adoption is currently low and workshop attendance is particularly encouraged for health organisations and healthtech startups, both those using PETs and those not using them, as well as academic and legal experts, and suppliers of PETs. The workshops are part of a BEIS-funded project, 'PETs for Public Good'.
The ICO has announced a major listening exercise to hear from businesses, organisations and individuals about their experiences of engaging with the ICO. This will involve the ICO conducting a survey and hosting events for a variety of stakeholders. The survey opened on 28 January and will close on 1 May 2022.
The EC is expected to present its new Data Act on 23 February. The Act will cover non-personal data and apply to the manufacturers of connected products, digital service providers and users in the EU. The Act will focus on giving individuals and organisations access to the data they contribute to generating. It will place restrictions on the use of third party data by organisations designated as gatekeepers under the Digital Markets Act, and prevent unfair contractual terms and the use of dark patterns to manipulate user choices. The Act is also expected to contain mandatory switching and interoperability provisions for cloud service providers.
The EDPB has adopted its first opinion on certification criteria in response to the GDPR-CARPA scheme submitted by the Luxembourg DPA. Certification schemes are intended to help controllers and processors demonstrate GDPR compliance and give them greater visibility and credibility. The EDPB's role is to ensure consistency of certification criteria across the EU. It has asked for a number of changes to the proposed scheme, mostly requiring it to be more specific and add clarity. Once adopted by the Luxembourg DPA, it will be added to the register of certification mechanisms and data protection seals under Article 42 GDPR.
The ICO has laid the UK's new international data transfer agreement (IDTA), the international data transfer addendum to the EC's standard contractual clauses (Addendum) and a document setting out transitional provisions before Parliament. Providing no objections are raised, they will come into force on 21 March 2022. The IDTA and Addendum will replace the current UK SCCs for international transfers and take the Schrems II decision into account.
Under Paragraph 7, Part 3, Schedule 21 of the Data Protection Act 2018, transitional provisions allowed SCCs issued under the Data Protection Directive to be used as an appropriate safeguard for international data transfers. The ICO refers to these as Transitional Standard Clauses (TSCCs).
Provisions laid out for the transition to the new IDTA are as follows:
The ICO says the IDTA and Addendum are "immediately of use to organisations transferring personal data outside of the UK, subject to the caveat that they come into force on 21 March 2022 and are awaiting Parliamentary approval".
The ICO is developing additional tools to provide support and guidance which will be published soon. These will cover:
This consists of:
The Addendum is designed to be appended to the EU SCCs but states that in the event of any conflict between the Addendum and UK data protection law, UK data protection law will prevail. Despite wording to the contrary in the EU SCCs, the Addendum sets out the following hierarchy with some slightly confusing definitions:
Addendum – the International Data Transfer Addendum which is made up of the Addendum incorporating the Addendum EU SCCs
Addendum EU SCCs – the version of the Approved EU SCCs which the Addendum is appended to, as set out in Table 2 including the Appendix information
Approved Addendum – the template Addendum issued by the ICO and laid before Parliament on 28 January 2022, as it is revised under section 18 DPA 18.
The Addendum incorporates the EU SCCs subject to changes to take account of governing law, jurisdiction and to make them work in the UK.
Following the decision by the Austrian data protection regulator that the use of Google Analytics breaches rules on data exports, other regulators have added their comments. The Norwegian DPA, which is investigating two complaints, indicated it would follow the Austrian decision but also commented that other web tools send far more personal data to the US than Google Analytics does. The Guernsey DPA said it was removing Google Analytics from its website. On the one hand this represents implicit agreement with the Austrian DPA, but it also backs up Google's point that controllers have responsibility for the way in which they use Google Analytics.
There are reported to be close to 100 complaints about Google Analytics across the EU and the EDPB has set up a working group to look at the issue. In the meantime, it is likely that both Google and the Austrian controller in question will appeal the Austrian decision. It also remains to be seen whether the Austrian regulator goes on to issue a binding decision or impose sanctions.
Controllers who are responsible for their use of Google Analytics (and any similar tools) should, at this stage, make considered use of the privacy options available pending any EU-level decision.
DCMS has announced a 20-member International Data Transfer Expert Council. The Council is made up of data privacy experts from a variety of key organisations (including Google, Mastercard and Microsoft), industry sectors, legal practice and academia. The role of the Council is to provide independent advice to the government to help it unlock the benefits of secure cross-border data flows following Brexit. This will include advising on revised data transfer tools and new data adequacy arrangements.
ENISA has published a report on data protection engineering to help organisations with data protection by design and default. The report covers anonymisation, pseudonymisation, data masking, privacy-preserving computation, access, communications and storage, and transparency. It focuses on technical aspects, looking at the benefits of particular tools and techniques for meeting GDPR security requirements. It also makes recommendations to regulators on actions to promote good practice and provide guidance.
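By way of illustration of one technique in that list, the following sketch (an assumption-based Python example, not taken from the ENISA report itself) pseudonymises a direct identifier with a keyed hash so that records can still be linked for analysis by whoever holds the key, while the stored values no longer reveal the identifier itself.

```python
import hmac
import hashlib

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed-hash pseudonym.

    The same input always maps to the same pseudonym for a given key,
    so datasets can still be joined; without the key, linking the
    pseudonym back to the original value is not straightforward.
    """
    digest = hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# Illustrative use only; the key must be held separately from the pseudonymised data.
key = b"example-secret-key-held-by-the-controller"
print(pseudonymise("jane.doe@example.com", key))
```

Data pseudonymised in this way remains personal data under the GDPR because re-identification remains possible for the key holder; the technique reduces risk rather than taking the data out of scope.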
The Data Protection Act 2018 (Amendment of Schedule 2 Exemptions) Regulations 2022, which amend the immigration exemption, have been made. The exemption in the DPA 18 was declared unlawful by the Court of Appeal and the government was given until the end of January 2022 to make the required amendments. Privacy campaigners dispute whether the amendments resolve the issue of incompatibility with the UK GDPR.
The EDPB has adopted draft guidelines on the right of access. These are subject to a six week consultation period although they had not been published at the time of writing. The guidelines analyse the right of access and provide guidance on its implementation in a variety of specific situations. They also look at the scope of the right, the information the controller has to provide to the data subject, the format of requests, how to provide access, and at what constitutes "manifestly unfounded or excessive requests". The guidance will not apply under the UK GDPR but controllers may find it useful to read it in addition to ICO guidance on the issue.
The CJEU has ruled on a reference from Germany which asked whether restrictions on sending unsolicited direct marketing communications under Article 13 of the ePrivacy Directive apply to advertising messages that appear in email inboxes and look like normal emails. The practice is referred to as inbox advertising.
The advertising in this case was in a free email service funded by paid-for advertising. Users were also able to select a paid-for ad-free version of the service as an alternative. The adverts were sent at random and the content was not targeted. They were visually indistinguishable from other emails except for the fact that the date was replaced by the word 'Anzeige' (advert). No sender was mentioned and the text appeared against a grey background. The subject section contained text promoting price offers for electricity and gas.
The CJEU was asked to decide whether and under what conditions, the placement of advertising messages in an inbox of a free email service funded by paid-for advertising may be regarded as compatible with the ePrivacy Directive and the Unfair Commercial Practices Directive.
The court held that:
It will be interesting to see how the referring court rules on the issue of whether or not consent to the communications had been received.
Proposals for new laws to strengthen cyber resilience of UK businesses
The government is consulting on new laws to drive up security standards in outsourced IT services, and to make improvements in the way organisations report cybersecurity incidents. The consultation closes on 10 April 2022.
Central to the proposals are plans to expand and reform the NIS Regulations 2018, including by:
Embedding standards and pathways across the cyber profession by 2025
This consultation is open until 20 March 2022 and focuses on proposals to standardise qualifications and certifications across the cybersecurity profession. The consultation asks for views on how best to ensure that the UK Cyber Security Council, launched in May last year, fulfils its goals. The Council was set up to act as the professional body for cybersecurity workers. The consultation aims to engage on ways to allow the Council to define and recognise cybersecurity job titles and link them to existing qualifications and certifications. It also looks at ways to tackle the scale and diversity of the skills shortage.
The EDPS has published an Opinion on the proposed Regulation on the transparency and targeting of political advertising. The EDPS suggests legislators consider stricter rules in addition to the measures already proposed, including a full ban on microtargeting for political purposes and a ban on pervasive tracking for political purposes. This should involve further restrictions on the categories of data which can be processed for political advertising purposes, including for targeting and amplification.
A government-funded campaign to put pressure on social media companies to delay bringing in end-to-end encryption (E2EE) was launched last week. The government reportedly spent over £500,000 to promote the campaign which asks for a delay in introducing E2EE until safeguards to protect children from abusers are put in place. Meta recently announced a delay to its plans to extend E2EE across its platforms and the campaign asks for assurances on what safeguards will be put in place.
The ICO responded to this arguing that E2EE serves an important role in safeguarding privacy and online safety. The ICO suggested the debate about the merits or otherwise of E2EE has become too unbalanced with a focus on the costs at the expense of a consideration of the benefits. The ICO went on to say that law enforcers have a variety of methods to catch abusers and do not need to rely on access to encrypted content, concluding "it is hard to see any case for reconsidering the use of E2EE – delaying its use leaves everyone at risk including children".
The Austrian Data Protection Authority, the DSB, has reportedly said that using Google Analytics breaches EU law on data exports in a decision published by the claimant NOYB (the privacy group spearheaded by Max Schrems).
The complaint alleged that Google as data importer, and a website provider as exporter, breached Article 44 GDPR when moving Google Analytics data from the EU to the USA as it was vulnerable to access by US intelligence authorities. Google argued that the data was not personal data and that Chapter V of the GDPR was not applicable to data importers.
The DSB agreed that the export of the data did breach the GDPR. In particular, it found the supplementary measures used by Google in conjunction with Standard Contractual Clauses do not adequately address specific issues with data transfers to the USA and the potential access to data by US intelligence authorities. The DSB held that the transfers were not adequately protected. Moreover, the exporter could not rely on an Article 49 exemption. The transfers therefore took place in breach of Article 44 GDPR.
The DSB did, however, agree with Google that Chapter V of the GDPR applies only to exporters and that it was the exporter who was solely liable for the breach. No fine has been issued to date.
Earlier in the week, the EDPS reached a similar decision with regard to the use of Google Analytics and payment provider Stripe by the European Parliament in relation to its COVID testing website.
Google issued a robust response in a blog post, arguing that there was a fundamental misunderstanding of the way Google Analytics worked. It said that:
Google uses a number of supplementary measures in addition to SCCs for data transfers. These include anonymising IP addresses before data leaves Europe, using data encryption, technical measures to prevent interception in transit, and international security standards.
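As a purely illustrative sketch of the first of those measures (not Google's actual implementation, and using an assumed truncation length), IP anonymisation is commonly achieved by zeroing the host portion of the address before it is stored or transferred:

```python
import ipaddress

def anonymise_ip(ip: str) -> str:
    """Zero the host portion of an IP address before storage or transfer.

    Drops the last octet of an IPv4 address, or the final 80 bits of an
    IPv6 address; how much to remove is an assumption for illustration.
    """
    addr = ipaddress.ip_address(ip)
    prefix = 24 if isinstance(addr, ipaddress.IPv4Address) else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("203.0.113.97"))      # -> 203.0.113.0
print(anonymise_ip("2001:db8::abcd:1"))  # -> 2001:db8::
```

Whether truncation of this kind, combined with encryption, is enough to protect transferred data is precisely the question the regulators here have answered in the negative.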
The defendants have four weeks from the date of the decision to appeal it.
The CNIL has published guidance on when and for what purposes processors may use personal data supplied by controllers for their own purposes. The CNIL says data can only be reused by processors where strict conditions are met.
The guidance suggests that the general clauses around further processing currently used in the market, both in data processing agreements and in data subject-facing policies, will not be sufficient to permit further processing by processors. The guidance requires a high level of decision making by controllers and detailed information both from processor to controller and from controller to data subject. While this is French guidance, it is likely to be influential with other regulators.
A class action has been launched against Meta (Facebook's parent company) at the Competition Appeal Tribunal under the Consumer Rights Act 2015. It alleges that Meta abused its market dominance by setting an unfair price for the free use of Facebook in the form of the personal data it collects from users. The class includes all people domiciled in the UK who used Facebook at least once between 1 October 2015 and 31 December 2019 unless they opt out. The claim is said to be worth USD 2.3 billion and is being backed by litigation funder Innsworth.
The action is based on the assertion that at the relevant time, there was no other comparable social platform and that Facebook generated massive revenues by tracking users across the internet, getting a disproportionate value compared to the value received by the user of the Facebook platform. The allegation is that although there are privacy controls available to data subjects, they are difficult to use, and that Facebook is not sufficiently transparent about the way data is used. It therefore imposes unfair terms and conditions.
Facebook argues that users get an appropriate value and that they have meaningful control over their data. Meta is also facing FTC action in the US on anti-trust issues.
It remains to be seen whether this claim under the Consumer Rights Act is more successful than the representative class action claim made in Lloyd v Google under the old Data Protection Act 1998. The claim in Lloyd v Google failed, in part, because the claimant could not demonstrate a common issue among the class. There was no evidence to suggest members of the class had suffered a uniform impact from the data processing, nor that they had all suffered the same level of damage. This is also likely to be true of Facebook users. A one-time user of the platform with maximum privacy settings would have a different exposure from a constant user with open privacy settings. Moreover, the latter may have suffered less distress than the former.
The French data protection authority, the CNIL, has fined Google EUR 150 million and Facebook EUR 60 million for failing to meet consent requirements for cookies on the Facebook, Google and YouTube sites in France. The CNIL said that the failure to make it as easy to reject cookies as to accept them meant that consent to cookies was invalid, in breach of Article 82 of the French Data Protection Act. While the sites required only one click to accept cookies, several clicks were required to reject all of them.
In assessing the amount of the fines, the CNIL took into consideration the number of users, the revenue the companies raised from advertising indirectly generated by the data collected from the cookies, and the fact that it had repeatedly warned Google and Facebook that they were in breach of the requirements. The CNIL also requires Google and Facebook to make it as easy to reject as to accept cookies within three months, after which they will face penalties of EUR100,000 per day of non-compliance.
As the decision relates to requirements under the French implementation of the Privacy and Electronic Communications Directive, it is specific to France. In the UK, consent to cookies under PECR became GDPR-standard consent when the GDPR came into effect.
It will be interesting to see how this decision impacts other EU versions of the Google and Facebook sites, not least because Google and Facebook are far from alone in making it easier to accept than to reject cookies.
In 2021, the Irish Data Protection Commission fined WhatsApp EUR225 million for breaches of the GDPR. The fine related to breaches of transparency requirements, particularly relating to the sharing of WhatsApp data with its parent company Facebook.
The Irish regulator, acting as Lead Supervisory Authority, had originally intended a lower fine of between EUR 30-50 million, however, its provisional decision was rejected by other regulators. The EDPB subsequently issued a binding decision under the Article 65 procedure requiring the fine to be increased. It also specified that WhatsApp be given a reduced time of three months to take required remedial actions in relation to its privacy practices.
WhatsApp's appeal to the CJEU has now been published. It is asking the CJEU to annul the penalty and to allow it to recover costs, arguing that the EDPB:
It will be very interesting to see how this progresses.
The Administrative Court of Luxembourg has suspended the requirement for Amazon to make changes to its data processing in relation to targeted advertising by 15 January on pain of daily penalties. In July 2021, Amazon was fined a record EUR 746 million for transparency failings regarding targeted advertising. Amazon is appealing the fine but also appealed the requirement to make changes by 15 January, arguing that it could not comply due to a lack of clarity as to what it was being asked to do. The Court acknowledged that the requirements made by the Luxembourg DPA were insufficiently clear or precise and suspended the compliance deadline.
The EDPS has said Europol must delete data concerning individuals with no established link to a criminal activity after six months. This order concludes an investigation into Europol's data retention practices started in 2019. The EDPS said Europol had failed to comply with its request to introduce an appropriate data retention period to filter and extract the personal data it was allowed to retain, in breach of the principles of data minimisation and storage limitation under the Europol Regulation. Europol has 12 months to comply in relation to datasets received before the EDPS Decision.
According to the Guardian, Europol has four petabytes of data which has led to accusations of mass surveillance.
The government has published its National Cyber Strategy 2022, setting out a five year plan underpinned by £2.6 billion of investment. The Strategy is built around five pillars involving strengthening the UK's cyber ecosystem and associated technologies. Targeted, sector-focused legislation may follow where needed, particularly in relation to providers of essential and digital services, data protection in the wider economy, and large businesses.
As part of the strategy, the government has published the Product Security and Telecommunications Infrastructure Bill (see below).
The Product Security and Telecommunications Infrastructure Bill (PSTI) has been introduced to Parliament. It will govern the security of consumer connectable devices and speed up the roll out of faster and more reliable broadband and mobile networks by making it easier for operators to upgrade and share infrastructure.
The PSTI will apply to manufacturers and retailers (both online and offline) and will cover connectable products, which include all devices that can access the internet as well as products which can connect to multiple other devices but not directly to the internet (such as smart light bulbs, smart thermostats and wearable fitness trackers).
The PSTI will give the government the power to bring in tougher security standards for device makers including:
a ban on 'easy-to-guess' default passwords pre-loaded on new devices - all passwords will need to be unique and not resettable to a universal factory setting
a requirement for connectable product manufacturers to tell customers at point of sale, and keep them updated, about the minimum amount of time a product will receive security updates and patches. If none, then that must be clear
a requirement on manufacturers to provide a public point of contact for reporting of security flaws and bugs.
In-scope businesses will be required to produce statements of compliance, investigate compliance failures, and maintain related records.
The new regime will be overseen by a yet to be designated regulator who will have the power to fine companies up to £10 million or 4% of annual global turnover, as well as up to £20,000 per day for non-compliance.
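As a hypothetical illustration of the first requirement above (the ban on universal default passwords), a manufacturer might provision each unit with its own randomly generated credential at the point of manufacture. The sketch below is an assumption-based Python example, not a prescribed implementation or anything drawn from the Bill itself.

```python
import secrets
import string

def generate_device_password(length: int = 12) -> str:
    """Generate a unique default password for a single device.

    Each manufactured unit receives its own randomly generated
    credential rather than a shared factory default such as 'admin',
    and the value is not derived from anything guessable.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Illustrative use at the point a device is provisioned on the production line.
print(generate_device_password())
```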
The Network and Information Systems (EU Exit) (Amendment) Regulations 2021 have been made and will come into force on 12 January. They amend the incident reporting thresholds for relevant digital service providers, as the previous thresholds, established when the UK was in the EU, are no longer suitable. The Regulations amend Article 4 of the retained EU Implementing Regulation and provide for the ICO to set thresholds. Regulation 12 also requires relevant Digital Service Providers to have regard to ICO guidance when deciding whether or not they need to report an incident.
The ICO's consultation on replacing the outgoing thresholds concluded in October 2021. It focused on either replacing the numerical thresholds with UK-relevant ones, or on introducing risk-based indicative and relative thresholds. While the ICO has said final guidance will be in place by the time the new Regulations come into force, the legislation will continue to work without it as it already contains metrics for DSPs to take into account when assessing the impact of an incident and whether or not to report it.
The ICO is consulting on its Regulatory action policy across the laws it monitors and enforces, including the UK GDPR, the DPA18, PECR and the FOIA. The policy looks at the ICO's general approach to regulatory action and explains factors it takes into consideration when taking regulatory and enforcement action. The consultation also covers the ICO's statutory guidance on regulatory action and on PECR powers.
The consultation closes on 24 March 2022.
The European Commission has adopted an adequacy decision in favour of South Korea. Personal data can now be transferred to South Korea from the EU without the need for additional transfer mechanisms. It remains to be seen whether the ICO will now adopt an equivalent decision to enable the free flow of personal data to South Korea from the UK.
The Court of Appeal last year ruled the immigration exemption in paragraph 4 of Part 1 of Schedule 2 to the Data Protection Act 2018 unlawful. This exempts controllers from needing to comply with specified provisions of the DPA 18 and the UK GDPR where personal data is processed for the maintenance of effective immigration control, or the investigation or detection of activities which would undermine effective immigration control, to the extent that compliance would prejudice those purposes. The effect of the decision was stayed temporarily to allow the government to make changes to the law.
Regulations reforming the unlawful immigration exemption under the Data Protection Act 2018 have now been laid before Parliament. They must come into force by the end of January 2022.
Under the new Regulations, in order to rely on the exemption, the Secretary of State must put in place an immigration exemption policy document outlining the process for deciding whether or not the exemption applies, and ensuring that personal data is not processed contrary to the UK GDPR. The exemption must be applied on a case by case basis in accordance with the policy document. Records must be kept of decisions and the reasoning behind them, and data subjects must be informed of the decision unless that would be prejudicial to specified purposes.
The immigration exemption had long been cited as a potential EU adequacy issue so, given the EU adequacy decision can be reviewed and withdrawn at any time, the reform is welcome.
The ICO is consulting on its draft guidance on rights of access by individuals to their data held for law enforcement purposes under Part 3 of the Data Protection Act 2018. The guidance is aimed at DPOs and those with data protection responsibilities in the context of law enforcement processing and is intended to be read in conjunction with the ICO's guidance on subject access rights more generally. It explains the right of access and provides advice on how to recognise a SAR made under Part 3. It looks at how to respond, when exceptions might apply, the issue of acting as a joint controller, and at when the information requested includes information about third parties.
The EDPB has adopted a final version of its Guidelines on examples regarding data breach notifications. The guidelines are intended to complement the Article 29 Working Party guidance by providing more practical examples. They are intended to help controllers decide how to handle breaches and conduct risk assessments.
The Norwegian DPA has confirmed the final penalty imposed on Grindr for data protection failings. The fine was originally set at around £8.6 million after the Norwegian regulator found the dating app had failed to get valid consent to disclosure of user data to third parties for targeted advertising purposes. Data disclosed included special data. Grindr has since changed its consent collection mechanism.
A Joint Parliamentary Committee report on India's Personal Data Protection Bill has made 81 recommendations and 151 corrections and amendments to the current draft. The two chambers have until the end of the Winter 2023 session to consider and approve the report before acting on the Bill although it may be considered at any time. This suggests we are unlikely to see enactment any time soon.
The UAE's new federal data protection law came into force on 2 January 2022. It applies to controllers and processors located in the UAE and to those processing the personal data of individuals located in the UAE. There are many similarities with the (UK) GDPR, however, the main lawful processing basis is consent as there is no legitimate interests basis. Transparency requirements are more limited and there is no requirement to provide a privacy policy. There are, however, more onerous data processing record keeping requirements.
International transfers of personal data can take place to countries which have an agreement with the UAE, or under data transfer clauses, with the consent of the data subject, or where necessary to fulfil a contract with the data subject.
The UAE has also established a new UAE data office to regulate compliance.
The Irish data protection regulator has published final guidelines on processing children's personal data. The guidance sets out best practice and principles to follow to protect children when processing their data. It identifies 14 'Fundamentals' focusing on transparency, consent and empowerment.
The ICO has published an Opinion on data protection and privacy expectations for online advertising proposals. This comes out of the ICO's study into adtech and addresses developments since that report. The ICO considers these are not sufficiently mature to assess in detail (for example Google's plan to replace third party cookies). As such, the ICO considers there is an opportunity to ensure proposals currently in development progress in a privacy compliant way. The Opinion therefore sets out expectations for proposal developers. Any proposal should:
The ICO calls on Google and other participants to demonstrate that their proposals meet the expectations in the Opinion. New initiatives must address the risks that adtech poses and take account of data protection at the outset. Any proposal which has the effect of maintaining or replicating existing tracking practices (such as those described in the ICO's 2019 report) is not an acceptable response to data protection risks.
In separate but related news, the CMA is consulting on Google's amended commitments to address competition concerns arising from the replacement of third party cookies with 'privacy sandbox' tools. The CMA is recommending they are accepted. The consultation closes on 17 December.
The UK government has published its Product Security and Telecommunications Infrastructure Bill. Part 1 sets out a new regime to help ensure the security of consumer connectable products. It creates powers for ministers to introduce security requirements. Three existing requirements, set out in the 2018 Code of Practice for consumer IoT security, will be placed on a statutory footing. These will include:
Other security standards are yet to be published.
The requirements will apply to manufacturers, importers and distributors of consumer connectable products which will include smartphones, connected cameras, TVs and speakers, wearable fitness trackers, outdoor leisure products with GPS, connected home automation and alarm systems, connected appliances, smart home assistants, and connected safety products, as well as IoT base stations and hubs.
The Bill also contains stringent enforcement measures, including fines of up to the greater of £10 million or 4% of qualifying worldwide revenue. The Bill is currently progressing to its second reading.
The UK government has launched its Mission One Policy Framework as part of the National Data Strategy. It provides a framework for government action to set the correct conditions to make private and third sector data more usable, accessible and available across the UK economy. It includes six potential levers for government intervention and principles for their application as well as seven priority areas for action to address some key barriers to data sharing for public benefit. These include ensuring that data is used according to the FAIR (findable, accessible, interoperable and reusable) principles. The use of Privacy Enhancing Technologies will be encouraged.
The government has also rolled out a Government Data Quality Hub pilot programme as part of Mission 3 of the Strategy, to help government organisations measure their data maturity, and launched a Data Skills Portal to help organisations measure their data readiness and signpost them to resources and training.
The Department of Health and Social Care has said it plans to launch a review into potential bias in medical devices and its impact on patients from different ethnic groups. The intention is to identify systemic bias and risk and make recommendations to address any issues. This might well include changes to regulation as the DHSC acknowledges there is insufficient protection at the moment. Pulse oximeters and MRI scanners are highlighted although the review will consider a wide range of devices. Findings are expected in late January 2022.
The updated Surveillance Camera Code of Practice has been laid before Parliament and is due to come into effect on 12 January 2022. The Code has been shortened, legislative references updated, and it incorporates the impact of the judgment in R (Bridges) v Chief Constable of South Wales Police. It does not place any new obligations on users.
The European Council and Parliament have provisionally agreed the final version of the Data Governance Act with the EC. The DGA aims to facilitate public sector data sharing by creating a system for data intermediation services to foster data altruism. The new rules will apply 15 months after the legislation is published in the Official Journal. The Commission is also planning a Data Act which will fulfil a similar function to foster public/private sector data sharing.
The ICO has issued Clearview AI Inc. with a notice of intent to fine it just over £17 million following a joint investigation with the Australian regulator. The ICO's preliminary view is that the scraping of images of individuals from the internet by Clearview to build a facial recognition database, breached UK data protection law in a number of ways. These included:
Clearview may now make representations in response to the notice, and a final decision is likely in the middle of next year.
Advocate General de la Tour has opined that the CJEU should rule that the GDPR does not preclude national legislation which allows consumer protection associations to bring representative actions based on alleged breaches of data protection law. The action may be brought on the basis of the prohibition of unfair commercial practices, infringement of a law relating to consumer protection, or the prohibition of the use of invalid general terms and conditions, provided that the purpose of the action is to ensure observance of the rights derived by the relevant individuals from the GDPR.
The AG relied on the Fashion ID judgment which concerned the Data Protection Directive, saying that the GDPR does not change the effect of the ruling. His view is that Member States can provide for entities to bring representative actions designed to protect the collective interests of consumers without a mandate from the data subjects and without a need to claim the existence of actual cases affecting named individuals.
The AG went further and suggested that the defence of collective interests of consumers is particularly suited to the GDPR's objective to establish a high level of data protection.
The EDPB has published a statement on the EC Digital Services Package and its Data Strategy. It expresses concerns over a lack of protection of fundamental rights and freedoms, a risk of inconsistency, and issues with the oversight framework. In particular, it advocates stronger action on targeted advertising, recommending phasing it out altogether and ultimately prohibiting it where it is based on pervasive tracking. The EDPB also calls for a ban on profiling children.
The UK government has published a new transparency standard for AI algorithms which will be piloted before endorsement is sought from the Data Standards Authority. The standard is intended for use by government departments and public sector organisations. It is made up of an algorithmic transparency data standard (covering information about the tool, how it works, how it was trained, and the level of human oversight involved) together with an algorithmic transparency template and guidance to help public sector organisations provide information in line with the data standard.
The European Court of Human Rights has ruled on an application by an online newspaper editor. He claimed that his Article 10 right to freedom of expression had been infringed when his online newspaper was ordered to pay compensation to individuals for failing to de-index an article following a valid request under data protection law.
The ECtHR upheld the decision of the Italian court and said there had been no Article 10 infringement. The online newspaper had unlawfully processed personal data as the article in question remained easily and directly accessible on the internet following a valid request to have it de-listed. The court said that where the balancing exercise between Article 10 and Article 8 (right to respect for private life) rights had been carried out by the national authorities in accordance with the relevant case law, it would need strong reasons to overturn the decision.
The court also said that the unlawful processing was as a result of the failure to de-index the article rather than due to its content or the way in which it was published. The editor was only required to de-list the article, not to permanently delete or anonymise it. The newspaper's right to publish the article diminished with time compared to the rights of the relevant individuals to respect for their reputations.
Apple has launched legal action against Israeli spyware firm NSO Group. Apple says it wants to hold NSO to account for unlawfully surveilling and targeting Apple users, and to prevent it from doing so in future. NSO says its tools are intended to target terrorists and criminals; however, they have also allegedly been used by governments to target journalists, rival politicians, and activists.
Max Schrems has escalated his dispute with the Irish Data Protection Commissioner, who has been dealing with his various complaints against Facebook. He has complained to the Austrian Office for the Prosecution of Corruption, arguing the Irish regulator has breached Austrian criminal laws. NOYB argues that the DPC attempted to impose a non-disclosure agreement on it to prevent it from disclosing details about its complaint against Facebook. NOYB says this was an attempt to stifle it "akin to blackmail".
The UK and US governments have announced a commitment to deepening the UK-US "data partnership". This falls a considerable way short of an adequacy agreement and is more like a statement of intent to engage with the issues and facilitate cross-border flows by working towards interoperability of different data protection regimes.
The European Commission has published the responses to its consultation on the proposed Data Act. The Data Act is intended to promote business-to-government and business-to-business data sharing in the public interest. Businesses responding to the consultation are largely clear that there are both technical and legal barriers to this kind of data sharing at the moment.
The Data Act will sit alongside the Data Governance Act which promotes trust and facilitates data sharing across sectors and between Member States. The Data Governance Act has already reached political agreement and should be finalised shortly, and the EC is also planning to review the Database Directive and update it where necessary.
The ICO has published a paper setting out themes and issues which emerged during the ICO's response to the COVID-19 pandemic. The ICO highlights the importance of a flexible data protection framework to enable innovative use of data while protecting people, and of developing sufficient trust in the use of personal data. She also discusses the importance of privacy by design, and issues around data sharing in the public interest.
The UK and Singapore have agreed, in principle, a new Digital Economy Agreement. This includes commitments on a variety of digital trade issues including:
The legal text will now be finalised and then the agreement will be ratified.
The UK government has agreed with operators that they will switch off 2G and 3G networks by 2033. This will help free up spectrum for 5G and subsequent iterations, and make the communications network infrastructure more secure. As part of this, the government is planning to accelerate Open Radio Access Networks (Open RAN) to enable mobile networks to be built using a variety of equipment suppliers. The government also announced a £50 million package to boost innovation in mobile technology.
The EDPB has launched a proposal for its first coordinated action to focus on the use of cloud-based services by the public sector. Assuming the proposal moves forward, national regulators will take up action at a local level but the results will be analysed and addressed together in order to create a plan of follow-up actions, both at Member State and EU level if necessary.
The EDPB has adopted final Guidelines on the restrictions of data subject rights under Article 23 GDPR. The guidelines analyse the criteria which apply to any restrictions introduced by Member States, how assessments need to be carried out and how data subjects can exercise their rights after any restrictions are lifted. The guidelines also set out the criteria restrictions are required to meet and when they might be applied.
The ICO is consulting on the beta version of its AI and data protection risk toolkit. The toolkit was published last July and contains risk statements to help organisations that use AI to process personal data. It gives examples of organisational and technical measures which can help mitigate risk and demonstrate compliance with data protection law. The ICO is asking for views from organisations and anyone in compliance-focused and technical roles relating to the use of AI. The consultation closes on 1 December 2021.
Australia has published a Bill to amend its data privacy legislation. Proposed amendments include higher sanctions for violations, and the creation of a new Online Privacy Code to regulate social media businesses, data brokers, and large online platforms.
In July 2020, the ICO, together with the privacy authorities of Australia, Canada, Gibraltar, Hong Kong (China) and Switzerland, signed an open letter directed at all video teleconferencing companies and sent directly to the five biggest. The letter set out privacy concerns in light of the sudden uptake in these services resulting from people having to work from home during the pandemic.
The ICO has now published a joint statement and observations on global privacy expectations of VTCs. These set out good practice in relation to security, transparency, privacy by design and end-user control, as well as use of secondary data and data centres.
The Supreme Court has handed down the long-awaited judgment in the data privacy case of Lloyd (Respondent) v Google LLC (Appellant) [2021] UKSC 50. The court reversed the decision of the Court of Appeal, holding that Mr Lloyd could not proceed with a representative action to claim damages from Google over its alleged tracking, for a period of several months, of the activity of 4 million iPhone users without their knowledge or consent in breach of the Data Protection Act 1998 (DPA98, since replaced by the UK GDPR and the Data Protection Act 2018 (DPA18)).
The decision is significant because of the impact it has on the viability of litigation-funded opt-out class actions, in particular for breaches of data protection law. Read our full coverage of the decision here.
The EC has proposed a Delegated Regulation under the Radio Equipment Directive to strengthen cybersecurity of connected devices which use radio technology. EU manufacturers and those placing products on the market in the EU will be required to take steps to:
The Regulation will set out general objectives and manufacturers will be able to assess products themselves against standards set by the European Standardisation Organisations or have them independently assessed.
The Regulation will cover a wide range of connected devices including wearables, toys, phones, fitness trackers and telecoms equipment. Some devices are excluded from scope because they are covered by other legislation, for example, connected vehicles and medical devices.
The Regulation is expected to come into force in 2024 following a 30-month transition period. The Commission also plans a Cyber Resilience Act which will cover a wider range of products throughout their lifecycle.
IAB Europe has published a press release saying it expects the Belgian DPA to identify that it has infringed the GDPR when it publishes its draft ruling. The Belgian DPA has been investigating IAB Europe's role in the Transparency and Consent Framework (TCF). It is expected to find that IAB Europe is a data controller for TC Strings – the digital signals created on websites to capture data subjects' choices about the processing of their personal data for advertising – that TC Strings are personal data, and that IAB Europe is a joint controller for them in the specific context of real time bidding.
The IAB says that as it had not previously considered itself to be a data controller in this regard, it will work with the regulator to ensure it fulfils its obligations. It is likely to be given six months to do this. The IAB hopes it will then be possible for the EDPB to approve the TCF as a GDPR Code of Conduct.
While the Belgian ruling can be amended by other regulators under the cooperation and consistency mechanism, this looks to be good news for the adtech industry as the main issue now appears to be with the role of IAB Europe in the TCF rather than with the TCF itself which has been criticised by the regulator in the past. Those signed up to the framework may be required to make minor changes to their privacy policies as a result of the ruling but should wait until it is finalised.
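For illustration only, the sketch below is a simplified Python model of the kind of choices a TC String records (which purposes and which vendors a user has consented to). The class and field names are hypothetical; this is not the actual TC String format defined in IAB Europe's specification.

```python
from dataclasses import dataclass, field

# Illustrative model only - not the real TC String encoding, which is a
# compact binary structure defined by IAB Europe's TCF specification.

@dataclass
class ConsentRecord:
    purposes_consented: set = field(default_factory=set)  # purpose IDs the user accepted
    vendors_consented: set = field(default_factory=set)   # vendor IDs the user accepted

    def has_consent(self, vendor_id: int, purpose_id: int) -> bool:
        """A vendor may process for a purpose only if both choices were made."""
        return vendor_id in self.vendors_consented and purpose_id in self.purposes_consented

# A consent management platform on a website would populate this from the
# user's banner choices and pass the encoded signal downstream.
record = ConsentRecord(purposes_consented={1, 3}, vendors_consented={42})
print(record.has_consent(vendor_id=42, purpose_id=3))  # True
print(record.has_consent(vendor_id=42, purpose_id=4))  # False
```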
The Court of Appeal has confirmed that the effect of its ruling on the Data Protection Act's immigration exemption will be suspended until 31 January 2022. In an earlier ruling, the court held that the immigration exemption in the DPA 18 is incompatible with Article 23 GDPR. The court has allowed a six-month suspension to enable the government to amend the legislation. The latest ruling confirms that this benefits both the private and public sectors.
Elizabeth Denham will leave her post as the UK's ICO on 30 November. Paul Arnold, the current Deputy Chief Executive, will act as interim ICO until John Edwards takes up his post on 3 January 2022.
The ICO has confirmed its investigation into Clearview AI Inc.'s facial recognition app has concluded. The investigation was carried out jointly with the Australian Information Commissioner (OAIC). The OAIC has published its determination as to Clearview's breaches of Australian data protection law. The ICO is considering its next steps and any formal regulatory action which may be required.
Clearview's facial recognition app allows users to upload a photo of an individual which is then matched to any photos of them on the internet. The system reportedly relies on a database of more than 3 billion images scraped from the internet.
The All-party Parliamentary Group on the future of work has called for an 'Accountability for Algorithms' Act. This is partly in response to the growing use by employers of monitoring technologies and performance targets set by algorithms. The Group's report The New Frontier - Artificial Intelligence at Work suggests legislation could be used to counter negative impacts of these technologies by creating a new corporate and public duty to carry out an algorithmic impact assessment (similar to a DPIA). The duty would apply from the design stage to deployment and require assessment and evaluation of risks and other impacts on work and workers of the relevant AI.
The Act would also update digital protection for workers by requiring an easy-to-access right to a full explanation of the purpose, outcomes and significant impacts of AI systems at work and a means for redress, as well as a right for workers to be involved in shaping the design and use of the systems. To boost a partnership approach, additional collective rights for unions and specialist third sector organisations, as well as enforcement powers for the joint Digital Regulation Cooperation Forum, are also recommended.
It remains to be seen whether the UK government will follow any of these recommendations and how any legislation would sit with the UK GDPR which already includes transparency requirements and impact assessments regarding use of personal data by AI, and provides additional protections where automated profiling results in decisions with legal or similarly significant effect.
The European Parliament ITRE Committee has agreed its draft of the NIS2 Directive. NIS2 will update and broaden the scope of the original NIS Directive which deals with cybersecurity of essential services and critical infrastructure. The Committee's draft was presented in a plenary session on 10 November 2021, which paves the way for trilogues to begin with the European Council.
The Council of Europe has adopted a Recommendation on profiling. It aligns its provisions with Convention 108+ and provides that respect for fundamental rights and freedoms should be guaranteed during all profiling operations, whether public or private sector.
Facebook has announced it will be removing certain targeting categories on topics "people may perceive as sensitive" like race, ethnicity, sexual orientation, health causes, religious and political beliefs from 19 January 2022. It will also give users additional controls over ad content, allowing them to opt to see fewer ads for additional categories of content including gambling and weight loss.
Cybersecurity researchers at Forescout and Medigate were reported to have identified vulnerabilities in millions of connected devices used in hospital networks and systems. Patches have been issued, but patient care, building systems and patient data were all potentially at risk, although there is no suggestion that the devices were actually hacked.
Separately, the Lister Fertility Clinic announced that its data processor Stor-a-File, which processed patient medical records on its behalf, had suffered a ransomware attack. Stor-a-File said the attack affected 13 businesses, six of which are healthcare related, including Nuffield Health Leicester Hospital.
Robinhood, the app which facilitates low-volume share trading by individuals, suffered a ransomware attack which affected the records of 7 million individuals in the USA. Robinhood said the most sensitive information it collects, such as social security numbers and financial details, was not impacted. It also said that rather than give in to the ransom demand, it was working with relevant law enforcement authorities and an external cybersecurity firm.
DPOs have been in the spotlight in the last week. The Luxembourg DPA issued a number of fines to companies for failures around the appointment and role of DPOs. There are several points to emerge from the rulings which are helpful given the lack of EU-level guidance on the issue.
The Luxembourg regulator suggested a DPO needs to be a recognised privacy professional with at least three years' experience in data protection. A DPO role should equate to at least one full time role. If an external DPO is appointed, there needs to be a formal relationship with a control plan and monitoring procedures in place. Whether internal or external, the DPO should be allocated enough time and resources to enable them to fulfil their functions. The need for independence from the company was also stressed.
Separately, the CNIL has published guidance for DPOs which outlines the requirements of the role and best practice recommendations. The guidance covers the role of the DPO, the designation of the DPO, how they should carry out their role, and their relationship with the CNIL.
The EDPB has adopted Guidelines on the interplay between Article 3 (territorial scope) and Chapter V (data transfers) of the GDPR. They aim to assist controllers and processors in the EU in identifying whether a processing operation constitutes an international transfer and to provide a common understanding of the concept of international transfers. The Guidelines cite three cumulative criteria that qualify an operation as a transfer:
Clearing up some confusion which emerged from the guidelines on supplementary measures for data transfers, the EDPB says that processing will be considered a transfer regardless of whether the importer established in a third country is already subject to the GDPR under Article 3. However, where the importer is subject to GDPR by virtue of Article 3(2) for the given processing, less protection or fewer additional safeguards will be needed.
Transfer tools in that situation should take that into account and not duplicate the GDPR provisions but instead address the elements and principles that are missing – ie fill in the gaps relating to conflicting national laws and government access, enforcement and redress. They should deal with conflict of laws and measures to be taken in the event of third country legally binding requests for disclosure of data. The EDPB has said it will help develop a transfer tool that deals with these issues.
The guidelines also clarify that the collection of EU data by a third country organisation which is done directly from data subjects is not a Chapter V data transfer because the transfer is not from an EU establishment, and that a data transfer from an EU processor to a non-EU controller is a Chapter V data transfer.
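As a very rough illustration of the distinctions described above (and only those), the following Python sketch encodes the logic; the names are hypothetical and it is no substitute for the guidelines' full cumulative criteria.

```python
from dataclasses import dataclass

# Simplified illustration of the points summarised above - hypothetical names,
# not a restatement of the EDPB's full criteria.

@dataclass
class ProcessingOperation:
    exporter_in_eu: bool                        # an EU controller/processor discloses the data
    importer_in_third_country: bool             # the recipient is outside the EU/EEA
    collected_directly_from_data_subject: bool  # e.g. a non-EU site collecting data itself
    importer_subject_to_gdpr_art3_2: bool       # importer already caught by GDPR extraterritorially

def is_chapter_v_transfer(op: ProcessingOperation) -> bool:
    # Direct collection from data subjects by a third-country organisation is
    # not a Chapter V transfer: there is no EU exporter.
    if op.collected_directly_from_data_subject:
        return False
    # Disclosure from an EU exporter to an importer in a third country is a
    # transfer, even if the importer is itself subject to the GDPR.
    return op.exporter_in_eu and op.importer_in_third_country

def safeguards_needed(op: ProcessingOperation) -> str:
    if not is_chapter_v_transfer(op):
        return "none - not a Chapter V transfer"
    if op.importer_subject_to_gdpr_art3_2:
        # The transfer tool should fill the gaps (conflicting local laws,
        # government access, enforcement, redress) rather than duplicate the GDPR.
        return "gap-filling safeguards only"
    return "full transfer tool (e.g. SCCs) plus any supplementary measures"
```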
The European Commission has said it will launch legal action against Belgium in the Court of Justice if it does not take steps to make its data protection regulator more independent within two months. The Commission says some members of the Belgian DPA cannot be viewed as fully independent of the government, either because they have taken part in governmental projects on COVID-19 contact tracing, or because they are members of the Information Security Committee, or they report to a management company dependent on the Belgian government.
The UK government has published a response to its call for views on amending the incident reporting framework for digital service providers under the NIS Regulations. The government is proposing to move incident thresholds out of legislation and into the control of the ICO.
While some respondents disagreed with this approach, over 70% agreed and therefore the government continues to believe this is the best approach, stating that current reporting thresholds are not fit for purpose and result in too few incidents being reported. The ICO launched consultations on threshold models in September.
The UK government has published a response to its call for views on measures to enhance the security of digital supply chains and third-party IT services. The government's proposals received broad support including around certification of assurance marks and minimum requirements in public procurement. The majority of respondents agreed that new or updated legislation would be a sensible way to address issues. The government will set out further policy objectives, probably as part of its upcoming National Cyber Strategy.
The AG has published three Opinions in response to referrals from Germany, France and Ireland which were made following the October 2020 judgments in La Quadrature du Net and joined cases. The 2020 judgments confirmed case law in Tele2 Sverige, holding that general and indiscriminate retention of traffic and location data from electronic communications is only permitted under the ePrivacy Directive to address a serious threat to national security.
The AG considered that the references did not raise any new questions of EU law which had not already been answered or which could not be inferred from existing EU case law.
The ICO has written an open response to a letter from the 5Rights Foundation, which conducted research into breaches of the Children's Code. 5Rights raised three primary concerns:
In response, the ICO has set out context to the work being undertaken to enforce the Code. This includes writing to 40 organisations, with a further nine to follow. Based on responses which are expected by the end of December, the ICO will then decide whether to act formally. Any formal steps are expected to happen in spring 2022.
The ICO has also contacted Apple and Google in response to concerns about the extent to which the risks associated with the processing of personal data are a factor when determining the age rating for an app.
The ICO referred to the work on age verification and re-iterated the ICO's interest in security of connected devices and access to adult-only websites, even though these are not necessarily within the scope of the Children's Code or always within the ICO's remit to regulate.
The EDPB has set up a task force to coordinate the responses to a number of complaints about cookie banners which have been filed with various Member State regulators by NOYB. The taskforce aims to promote cooperation, information sharing and best practices between the SAs. It will:
The Council of the EU has agreed its negotiating position on the draft Data Governance Act, paving the way for trilogues to begin with the European Parliament. The Data Governance Act aims to facilitate the re-use of public-sector data for research and the benefit of society across sectors and borders. It establishes:
The Council has proposed a number of changes to the Commission's draft including:
Trilogues should begin shortly.
China has published guidance for organisations on how to comply with its incoming data protection law. The guidance provides additional definitions around types of data. "Important" data is data which could harm national security or cause major production issues across a number of sectors in the event of a breach. The definition also covers data relating to AI. China's new law comes into effect on 1 November. We'll be announcing a webinar on the new law shortly so watch this space.
The ICO has published its response to the UK government's consultation on changes to the UK's data protection regime. The ICO is broadly supportive of the review but says "the devil will be in the detail". The main message of the response seems to be that more information would be needed about the plans to enable an assessment. The ICO, unsurprisingly, emphasises maintaining current privacy standards and, on the issue of data transfers, underlines the importance of maintaining EU adequacy.
While the ICO is supportive, in principle, of measures which would increase flexibility and reduce administrative and regulatory obligations providing that does not result in a fall in standards, she does not appear to be in favour of doing away with DPOs and DPIAs, nor with re-introducing a fee for subject access requests. She does, however, recommend legislating against cookie walls, and is, as she has said previously, in favour of reforming cookie rules to reduce the need for pop-ups.
The strongest language is used in response to the government's proposals to reform the ICO. While the ICO supports a regulatory governance model involving a supervisory board with separate Chair and CEO, she says "there are specific proposals where I have strong concerns because of their risk to regulatory independence".
The ICO's new Data Sharing Code of Practice came into force on 5 October 2021. The Code provides guidance on sharing personal data under the UK GDPR and Data Protection Act 2018. It covers issues including transparency, lawful basis and accountability. The Code and other resources around data sharing can be found on the ICO's data sharing support hub. As a statutory Code of Practice, the Code is admissible in court as evidence.
Mishcon de Reya is bringing a representative action against Google's subsidiary DeepMind Technologies on behalf of 1.6 million people. DeepMind Technologies had a data sharing arrangement with the Royal Free NHS Trust. A data breach resulted in medical records being shared with tech companies without the knowledge or consent of the relevant individuals. The ICO found a breach had occurred but did not fine the NHS Trust. The action is brought against DeepMind only and does not name the Trust.
The IWGB and ADCU unions are backing two employment tribunal claims against Uber made by a total of three drivers alleging that Uber's facial recognition system, made by Microsoft, is racially biased and can lead to drivers being unfairly suspended from the platform. FRT systems have been shown by some studies to be less effective for non-white people, and TfL estimates 94% of PHV drivers are black, Asian and minority ethnic.

Uber uses the Real-Time ID Check system to stop drivers sharing accounts and bypassing requirements to be licensed and go through criminal record checks. In response to the allegations, Uber says "The system includes robust human review to make sure that this algorithm is not making decisions about someone's livelihood in a vacuum without oversight". Drivers can choose whether the selfie used for ID purposes is checked by the software or by humans. Even if they choose the software option, if a photo does not match automatically, it is then reviewed by a human. If the human concludes there is no match, the driver is waitlisted for 24 hours. The next time they log on, a further human review is conducted. If the second human review concludes there is no match, the driver's account is suspended, although the driver can appeal.
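Purely to make that sequence easier to follow, here is a minimal Python sketch of the review flow as reported above; the names are hypothetical and it makes no claim to reflect Uber's or Microsoft's actual systems.

```python
from enum import Enum, auto

# Hypothetical model of the review flow described above - not Uber's or
# Microsoft's actual implementation.

class Outcome(Enum):
    VERIFIED = auto()
    WAITLISTED_24H = auto()
    SUSPENDED_PENDING_APPEAL = auto()

def check_driver_id(automatic_match: bool,
                    first_human_confirms: bool,
                    second_human_confirms: bool,
                    already_waitlisted: bool = False) -> Outcome:
    if automatic_match:
        return Outcome.VERIFIED                # software match, nothing more to do
    if first_human_confirms:
        return Outcome.VERIFIED                # human reviewer accepts the photo
    if not already_waitlisted:
        return Outcome.WAITLISTED_24H          # first failed review: 24-hour waitlist
    if second_human_confirms:
        return Outcome.VERIFIED                # second review on next log-on succeeds
    return Outcome.SUSPENDED_PENDING_APPEAL    # two failed reviews: suspension, appeal possible
```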
The Irish Data Protection Commissioner has provisionally found that Facebook's terms and conditions (rather than its privacy policy) form a contract for the processing of personal data for targeted advertising purposes. As such, it may rely on Article 6(1)(b) (processing necessary for a contract) rather than on consent for the processing. The draft decision was published by NOYB which brought the complaint against Facebook.
The complaint focused on the terms and conditions that users were required to sign up to when the GDPR came into effect. It argued that the privacy and cookie policies were incorporated into these terms and conditions and that users were not able to give genuine consent to them as they had to accept the terms and conditions or stop using Facebook. The complainant also said it was unclear which lawful basis was being relied upon for which processing operation.
The DPC considered three issues:
The DPC held that Facebook had not sought to rely on consent with respect to its terms and conditions. Facebook was not required to rely on consent when offering a contract to a user and was permitted to rely on Article 6(1)(b) for the data processing required as a result of the user entering into the terms and conditions. Other data processing operations were covered in the privacy policy which, despite being linked to in the terms and conditions, was not incorporated into them; some of those operations relied on consent where there was a genuine element of choice.
Facebook had, however, failed to provide the necessary information regarding the lawful basis of processing in its terms and conditions and had not been sufficiently transparent. As a result, the DPC proposes fining Facebook up to €36 million.
The DPC essentially found that Facebook is offering a contract whereby the user provides personal data for targeted advertising purposes in exchange for the social networking facilities. The targeted advertising aspect is a core element of the contract. This means that as far as targeted advertising is concerned, Facebook does not need to rely on consent. It can process the personal data for targeted advertising purposes because it is necessary to do so to give effect to the contract.
In addition, although Facebook offered a 'take it or leave it' contract, this did not constitute forced consent to all the processing operations set out in the privacy policy. The DPC was not satisfied that the privacy policy was incorporated into the terms and conditions because the accept button clearly referred to acceptance of the terms as distinct from the privacy and cookies policies. Facebook relies on a variety of lawful bases for different operations, some of which are based on consent, and some of which are based on contractual necessity. Where contractual necessity is relied upon, the DPC said this covers only the terms of service and not the privacy policy which is only relevant "insofar as it sheds light on the processing operations carried out for which Facebook relies on Article 6(1)(b)." The Irish DPC declined to conduct a full assessment of all processing operations carried out by Facebook and said the complainant was required to be more specific in its requests.
NOYB argues that the decision effectively greenlights bypassing consent by allowing businesses "to just write the processing of data into a contract".
The draft decision will go through the Article 60 process of approval by other regulators. It will be interesting to see how they respond.
The ICO has published an Opinion on age assurance and issued a call for evidence on its use. The Opinion covers how the law applies and aims to facilitate clear and consistent regulation for those who want to use age assurance to help comply with the Children's Code. It sets out age assurance expectations for different risk criteria to help Information Society Service providers and age assurance providers in the context of the Code.
The call for evidence, open until 9 December 2021, seeks details on:
This is just in relation to the application of age assurance to compliance with the Children's Code rather than more widely.
The ICO has published its long-awaited Code of Practice on Journalism in draft for consultation. The code sets out the manner and extent to which data protection law applies to journalistic content, particularly in terms of the protections for journalism and freedom of expression. It builds on the 2014 version but, under the DPA 18, it will gain statutory force. The code is limited to data protection law rather than press conduct and standards. It is aimed at media organisations and journalists whose purpose is to publish journalistic material and who are controllers under the UK GDPR and DPA 18. Its target audience is those who have defined roles and responsibilities such as lawyers, DPOs, and senior editorial staff.
During the consultation process, the ICO wants to hear more from journalists, media organisations, civil society and campaign groups, academics, bloggers and individuals. The ICO will also be running online workshops to discuss the key themes in the code and the development of further tools and resources.
The Court of Appeal found the immigration exemption in the Data Protection Act 2018 to be unlawful in May 2021. The declaration has been suspended until 31 January 2022. This means the UK government now has until then to amend the exemption to bring it in line with the GDPR. If it fails to do so, the exemption will be disapplied at that point.
The CMA has updated the timetable for its investigation into Google's plans to remove third party cookies on its browsers and replace their functionality with a range of 'privacy sandbox' tools. It says it needs additional time to consider representations and consult with Google. It's unlikely the investigation will conclude this year.
China has published a set of draft regulations governing the use of algorithms which will operate alongside China's Cybersecurity and Data Security laws and its Information Services Management Rules. It is not yet known when they will come into effect and whether there will be further drafts or additional guidance.
The regulations are broad in scope and likely to apply to any organisation providing online services and using algorithms. There is an emphasis on preventing online harm and a ban on using algorithms which are likely to induce online addiction of children, or otherwise negatively influence their health. Other requirements relate to preserving China's values, but some measures are more familiar in terms of introducing privacy and consumer protection elements.
The UK government has published a National AI Strategy which sets out a 10-year plan to ensure the foundations are laid for sustainable and ethical AI governance and development. The focus is on building a sound governance system to encourage long term innovation and success. In the next six to 12 months, the government will publish a White Paper on a pro-innovation national position on governing and regulating AI and will pilot an AI standards hub. The hub will be used to coordinate UK engagement in global AI standardisation.
The US Congressional Research Service (CRS) has published a report on the Privacy Shield and Transatlantic data flows. The report sets out the main differences between the US and EU regimes, the history of EU-US data flows, and looks at the prospects of a solution. The report confirms that the Biden Administration is looking to conclude a new agreement on data flows with the EU, based on providing assurances in the form of executive orders and administrative actions, on US protections for personal data, and redress for EU citizens in the event of misuse of data.
The news coming out of the negotiations has not been entirely positive though and progress appears to be slow. There are no signs that an agreement will be reached in the near future.
The EDPB has adopted an opinion on the European Commission's draft adequacy decision in favour of the Republic of Korea. This is the latest step to pave the way for the European Commission to recognise South Korea as providing an adequate level of data protection for the purposes of EU-South Korea data exports.
The Department for International Trade (DIT) has published a five-point plan for digital trade which aims to deliver the UK as a global leader in digital trade with a network of international agreements to support it. The plan focuses on:
The Irish Data Protection Commissioner has launched two own-volition inquiries into TikTok's GDPR compliance. The first looks at its compliance with the principle of data protection by design and default, particularly in relation to the default settings for users under 18 and age verification measures for under 13s. It will also look at whether TikTok has complied with transparency requirements in the context of processing children's personal data. The second inquiry will look at TikTok's data transfer practices to third countries, and especially to China.
TikTok has said privacy is "our highest priority". It has made changes to its privacy practices including making all under 16 accounts private by default, deleting accounts of under 13s and suspending push notifications to children's accounts at certain times to protect their sleep patterns.
WhatsApp is appealing the fine imposed on it by the Irish Data Protection Commissioner. The Irish Times reported WhatsApp as having filed an appeal before the Irish High Court. It argues that the Irish DPC's decision is unconstitutional as it amounts to a criminal sanction and interferes with its property rights. It also argues the decision is incompatible with the European Convention on Human Rights, in that it breaches the right to fair procedure. WhatsApp also plans to challenge, before the CJEU, the EDPB instructions to the Irish DPC which caused a huge increase to the planned fine.
Security researcher Jeremiah Fowler discovered that a database containing around 60 million records of health tracker users around the world had been left unsecured online. It is not clear whether the records had been accessed by malicious actors.
The news coincides with the US Federal Trade Commission's move to require a wider range of companies handling health records to notify consumers if their data has been compromised by a security breach, including where it has been accessed without their consent. The rule has been extended to cover health apps and devices, including those which track fitness, fertility and blood glucose.
In her annual State of the Union speech, Ursula von der Leyen announced a planned Cyber Resilience Act to set up common security standards for connected devices. The Act will sit alongside the revised NIS Directive. No details have been published to date.
The appointment of John Edwards to replace Elizabeth Denham as UK ICO has been approved. Edwards is currently the New Zealand Privacy Commissioner. He will begin his new role on 1 November 2021.
Earlier this month we reported on the High Court ruling on causes of action for data breaches in which the court held that failure to provide adequate data security is not a positive act. This limits the causes of action in privacy claims against a data controller who has suffered a cyberattack. You can read more about the consequences of the ruling here.
Following the announcement that the UK intended to depart from the GDPR, the UK government has launched a consultation on the future of the UK's data protection regime, Data: A New Direction, which is open until 19 November.
The 146-page document asks for views on a range of issues to support the government's stated aims of fostering innovation, facilitating data exports, empowering the ICO, and fostering collaboration between the private and public sector (particularly around healthcare), all while protecting privacy. The government is confident this can be done without losing EU adequacy.
Many of the proposals aim to cut 'red tape' around current EU GDPR-derived rules and would certainly involve departures from the letter if not the spirit of the current regime. Proposals include:
It is likely to be some time before we see any actual changes to the law, but data-rich organisations may want to respond to the consultation.
Together with the consultation on the UK's data protection regime, the UK government also published a National Data Strategy monitoring and evaluation framework with a call for evidence on high-level indicators to assess progress with the Strategy. It has also set out next steps for the National Data Strategy Forum.
The ICO has called on the G7 to encourage businesses to replace cookie banners with browser and device settings and software applications. These are currently lawful mechanisms but inconsistent take up and lack of harmonised standards means that most businesses rely on more intrusive ways to collect cookie consent. The ICO suggests that user fatigue leads to people clicking on 'accept all' and giving away more data than they would like.
The discussions on cookies took place at a meeting of the G7 data protection regulators which the ICO summarised in a communique as focusing on issues including:
The ICO has issued First Choice Selection Services Ltd with an enforcement notice relating to its failure to comply with a Data Subject Access Request. First Choice received the request from a data subject who was also making an employment tribunal claim. It said it would only release information on instruction from the tribunal, at which point the data subject complained to the ICO.
The ICO found First Choice had misled it by falsely claiming the employment tribunal said it should not release the data until instructed to do so. First Choice must now respond to the SAR and make changes to its internal systems and procedures to ensure it deals properly with future SARs.
The EP published a study on biometric identification in early September. As a result, it has made recommendations for the draft AI Regulation. These include adding a title dealing with restricted AI practices to cover applications including 'real time' remote biometric identification without an exclusion for law enforcement purposes, and adding to the list of prohibited AI practices. In particular, it recommends the use of emotion recognition systems should be included as a restricted application and separated from the concept of biometric data. Definitions should be more granular and an additional definition of "biometric inferences" should be introduced. The lists should be amendable.
The NGO Global Witness has asked the Equality and Human Rights Commission to investigate whether Facebook's algorithms used to target job adverts breach the Equality Act 2010.
An investigation by Global Witness found that Facebook's algorithms may be excluding people from seeing certain adverts which have not been manually targeted, based on their gender and age. The study showed, for example:
Global Witness has also presented its findings to the ICO suggesting the algorithms breach the GDPR's Article 5(1) principle of fairness and asking it to investigate. It has called on the UK government to require technology companies to be transparent about targeting and carry out risk assessments and mitigation to reduce the impact of potentially discriminatory algorithms.
The ICO's Age Appropriate Design Code (or Children's Code) is now fully in force. A number of tech giants including Facebook, Google, Instagram and TikTok have already made changes to accommodate it. The ICO says it will be proactive in requiring social media platforms, video and music streaming sites, and the gaming industry, to tell it how their services are designed in line with the code. The ICO will identify areas where businesses need support and also reminds them it has the power to investigate or audit organisations.
The ICO has also said it will set out its position on age verification this autumn.
We are holding a webinar with the ICO's Acting Head of Regulatory Futures to discuss the impact of the Children's Code and how to comply on 14 September; register to attend here.
The Irish Data Protection Commission has fined WhatsApp €225 million for breaches of the GDPR. This is the second highest GDPR fine to date. The fine relates to breaches of transparency requirements, particularly relating to the sharing of WhatsApp data with its parent company Facebook.
The Irish regulator, acting as Lead Supervisory Authority, had originally intended a lower fine of between €30-50 million, however, its provisional decision was rejected by other regulators. The EDPB subsequently issued a binding decision under the Article 65 procedure requiring the fine to be increased. It also specified that WhatsApp be given a reduced time of three months to take required remedial actions to its privacy practices.
WhatsApp plans to appeal the decision. It said: "We have worked to ensure the information we provide is transparent and comprehensive and will continue to do so. We disagree with the decision today regarding the transparency we provided to people in 2018, and the penalties are entirely disproportionate".
Apple is pausing its plans to roll out technology to help detect child sexual abuse material. Apple had planned to introduce NeuralHash technology to scan images before they are uploaded to iCloud. The photos would be matched to a database of child sexual abuse material maintained by the National Center for Missing and Exploited Children. Matches would be reviewed by a human and steps taken to remove the material, suspend the user account and report the user to the authorities where appropriate.
Apple's plans have led to privacy concerns with campaigners worried the technology could be misused by authoritarian governments. Apple says it has listened to these concerns and is reconsidering its approach.
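For context on the general technique at issue (matching perceptual hashes of images against a database of known material), here is a deliberately generic Python sketch; it is not NeuralHash, and the hash values, threshold and database are invented for the example.

```python
# Generic perceptual-hash matching - not Apple's NeuralHash. All values below
# are made up for illustration.

KNOWN_HASHES = {0b1011_0110_1100_1010, 0b0001_1111_0000_1100}  # hashes of known material
MATCH_THRESHOLD = 2  # maximum number of differing bits still treated as a match

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hash values."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash: int) -> bool:
    # Perceptual hashes of near-identical images differ in only a few bits,
    # so a small Hamming distance counts as a match.
    return any(hamming_distance(image_hash, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

# In the system described above, a match would then go to human review before
# any account action or report is made.
print(matches_known_material(0b1011_0110_1100_1000))  # True - one bit differs
print(matches_known_material(0b0110_0000_0011_0101))  # False
```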
The UK government has set out its plans for data protection suggesting a move away from the EU in some areas. In a package of plans the government has announced:
At the time of the government's announcement, Culture Secretary Oliver Dowden, gave an interview to the Daily Telegraph suggesting the UK intended to depart from the EU regime. While he did not give too much detail, he did cite cutting red tape and "box ticking" and creating a "light touch" regime. He also mentioned the issue of repeated consent, particularly regarding cookies, and issues around purpose limitation and scientific research.
Needless to say, the EU is unimpressed and has warned that it will suspend the UK's adequacy agreement should the UK diverge significantly from EU data protection standards. It looks as though there are interesting times ahead.
The ICO has approved GMO GlobalSign as the UK's first qualified trust service provider under the UK eIDAS Regulation. GlobalSign can now issue qualified certificates for electronic signatures and seals.
The EDPB has adopted an Article 65 decision to resolve the lack of consensus on the Irish regulator's inquiry into WhatsApp as lead Supervisory Authority. The Irish Data Protection Commissioner issued a draft decision about whether and to what extent WhatsApp complied with the GDPR. A number of other SAs objected to elements of the decision and the EDPB was called in to make a determination under the Article 65 procedure. The EDPB concluded the objections raised were valid and the Irish regulator must now incorporate them into its final decision.
Amazon disclosed a pending fine from the Luxembourg data protection authority, acting as lead regulator, of a record €746 million for failure to obtain valid consent to targeted advertising. The fine has not yet been published. It is made in response to a complaint originally made to the CNIL by privacy campaigners La Quadrature du Net.
The CNIL has reportedly written to the group to confirm that in addition to the fine, Amazon is required to take steps to bring its targeted advertising practices in line with the GDPR, including by improving transparency, and to take steps to improve its response to data subject access requests. This must be done within six months, after which Amazon faces fines of €746,000 per day for further non-compliance.
The amount of the fine is unprecedented but Amazon plans to appeal. It says: "There has been no data breach, and no customer data has been exposed to any third party…These facts are undisputed… We strongly disagree with the CNPD's ruling".
The ICO has made mostly minor amendments to its statement on its regulatory approach during the COVID-19 pandemic. Most notable is an emphasis on the importance of complying with data subject rights under data protection and freedom of information law. The ICO expects organisations to begin reducing any backlogs and to implement action plans to deal with them.
The CMA has published an update on its investigation into Google's Privacy Sandbox which is intended to replace third party cookies on its browsers. In June 2021, the CMA said it would accept the commitments made by Google to address competition concerns. Comments were invited by 8 July, but the CMA has now said it will carry on considering the responses through September 2021.
The ICO has published a blog post looking in more detail at three of the standards under the Children's Code. It's a very high-level look at:
Reference is made to key documents and principles which controllers will need to take into account and which the ICO will refer to as part of its compliance assessment.
The Code will be enforced from 2 September.
The High Court has upheld an application to strike out claims for compensation for distress for misuse of private information (MPI), breach of confidence (BoC) and negligence, brought against DSG Retail Ltd regarding a cyberattack on a point of sale system at Currys PC World in 2018. A separate claim for breach of the Data Protection Act was stayed pending the outcome of an appeal by DSG before the First Tier Tribunal against an ICO fine of £500,000.
Saini J said that while data protection legislation imposed a positive duty around data security, BoC and MPI do not. The wrong was a failure which allowed the cyberattack rather than positive conduct by DSG which would be required for BoC or MPI.
Regarding the negligence claim, the judge said there was no need to impose a duty of care in negligence where statutory duties apply under data protection law. In addition, no loss had been suffered. Any distress suffered was not enough to constitute tortious damage, and there was no pecuniary loss.
The decision is significant as it clarifies which causes of action apply to cyberattacks and when.
IAB Europe has published a guide on contextual advertising. It looks at alternatives to third party cookies and sets out best practices to help ensure contextual advertising is used efficiently. Essentially the report examines the landscape as third party cookies are phased out.
The ICO has published guidance on direct marketing for the public sector. The guidance is intended to help public sector organisations understand when they are sending direct marketing communications and sets out compliance requirements.
The Garante, the Italian data protection regulator, has fined Deliveroo €2.5 million for its lack of transparency around algorithms it used to manage its workers when assigning orders and booking shifts. Deliveroo was also held to have breached the purpose limitation principle in its use of geolocation data used to track its riders. Deliveroo stopped using the shift booking system in 2020. Deliveroo now has six months to rectify issues and a further 90 days to make any required changes to its algorithms.
While the EU has updated its Standard Contractual Clauses, the UK is still using the previous iterations which pre-date the GDPR and cover limited transfer situations. The ICO has now launched a consultation on its draft International Data Transfer Agreement (IDTA) which will replace the current SCCs, and accompanying guidance.
The consultation covers:
Input is also sought from stakeholders on relevant privacy rights, legal, economic or policy considerations and implications. The consultation closes on 7 October 2021.
The ICO has issued a call for views to help it develop new data protection and employment practices guidance and products to help employers and staff comply with relevant data protection legislation. The existing guidance has not been updated since the Data Protection Act 2018 came into effect. The revised guidance will cover a range of topics from recruitment and selection to employment and workers' health records. Responses are requested by 21 October 2021.
Google has announced a range of measures to give minors more control over their digital footprint. These include blocking targeted advertising aimed at the under-18s and turning location history off for young users. It is also introducing a policy to allow users to request the removal of children's images from Google Images search results.
TikTok has also announced product changes for teenagers aged 13-17 to help protect their privacy. Upcoming changes will focus on the use of push notifications, in-app messaging, the audience of videos and default download settings for videos.
China has passed a new national Personal Information Protection Law which will take effect on 1 November 2021. Some aspects are similar to the GDPR, for example requirements around user consent and data minimisation, as well as fines for non-compliance. However, the new legislation will not prevent or restrict government access to personal data.
Signs of more serious data protection enforcement in China came from the recent finding by the Ministry of Industry and Information Technology that 43 apps including Tencent and WeChat transferred user data including contact lists and location data illegally. The app providers were told to bring their use of personal data within the law by 25 August.
The ICO has approved the criteria for three certification schemes under the UK GDPR:
These are the first schemes to gain ICO approval.
The UK government is consulting on proposed revisions to the Surveillance Camera Code of Practice. The proposed revisions update references to legislation and make changes to take into account the judgment in Bridges v Chief Constable of South Wales and others. There are no proposed policy changes. The consultation closes on 8 September 2021.
T-Mobile has suffered a data breach in the USA which has impacted more than 50 million customers including former and prospective customers. The data was stolen by a 21-year-old American who described T-Mobile's security as "awful". This is the fifth data breach T-Mobile has had since 2018. It has said it is partnering with KPMG and cybersecurity firm Mandiant to overhaul its systems.