Author

Debbie Heywood

Senior Counsel – Knowledge


13 December 2022

Radar - December 2022 – 2 of 2 Insights

Data privacy roundup 2022

It's been another busy year in data privacy. We revisit the highlights, focusing on key UK and EU developments. For a look at what to expect in 2023, see our Interface article here.

Legislation

Major new legislation was tabled, progressed and in some cases passed in the UK and EU in 2022. It addresses issues including facilitating data sharing, enhancing cybersecurity (particularly of connected devices) and, in the UK, overhauling the data protection regime.

UK

Product Security and Telecommunications Infrastructure Bill

The Product Security and Telecommunications Infrastructure Bill (PSTI) was introduced to Parliament in January and is likely to become law early next year if not by the end of 2022.  It will govern the security of consumer connectable devices and speed up the roll-out of faster and more reliable broadband and mobile networks by making it easier for operators to upgrade and share infrastructure.

The PSTI will apply to manufacturers and retailers (both online and offline) and will cover connectable products, which include all devices that can access the internet as well as products that can connect to multiple other devices but not directly to the internet.

The new regime will be overseen by a yet-to-be-designated regulator with the power to fine companies up to £10m or 4% of annual global turnover, as well as up to £20,000 per day for ongoing non-compliance.  See here for more.

NIS Regulations 2021

The Network and Information Systems (EU Exit) (Amendment) Regulations 2021 came into force on 12 January. They amended the incident reporting thresholds for relevant digital service providers as the previous thresholds established when the UK was in the EU were no longer suitable. In November, the government confirmed it would go ahead with revising the NIS Regulations 2018. Notably, the government is proposing to bring managed service providers in scope.

Regulations to amend DPA 18 immigration exemption

The Data Protection Act 2018 (Amendment of Schedule 2 Exemptions) Regulations 2022 amended the immigration exemption in the Data Protection Act 2018. The exemption in the DPA 18 had been declared unlawful by the Court of Appeal and the government was given until the end of January 2022 to make the required amendments. Privacy campaigners dispute whether the amendments resolve the issue of incompatibility with the UK GDPR.

Data Protection and Digital Information Bill

The government introduced the Data Protection and Digital Information Bill to Parliament after a lengthy consultation process which we discuss here.  It covers reforms to the UK GDPR, Data Protection Act 2018 and PECR, but also:

  • access to customer and business data
  • electronic signatures, seals and other trust services
  • disclosure of information to improve public service delivery
  • sharing of data for law enforcement
  • information standards for health and social care
  • biometric data
  • the role of the Information Commission.

The second reading of the Bill was postponed during the Truss government so that "Ministers [could] consider this legislation" after DCMS Secretary of State Michelle Donelan hinted it would be changed.

Speaking at a Westminster Forum event on 31 October, Owen Rowland, deputy director for domestic data protection policy at the DCMS said there would be further consultation with stakeholders (rather than a full public consultation) on the Bill before it progresses to allow ministers to complete final checks.  He also suggested this could delay the Bill somewhat. However, he underlined that any organisation which needs to comply with GDPR would also find itself compliant with UK data protection law under the new regime.  

It is unclear whether this means there will be changes to the Bill, although Rowland was also keen to stress that there would be nothing in the Bill to threaten the EU-UK adequacy agreement. Details of the further consultations are likely to be announced shortly.  In the meantime, see our analysis of the DPDI Bill as introduced to Parliament here.

UK government proposals on regulating AI

The DCMS announced its AI Action Plan, part of its National AI Strategy, in July. An AI policy paper set out proposed rules based on six rather vague principles, which regulators must apply flexibly.  Rather than centralising AI regulation, the government is proposing to allow different regulators to take a tailored, context-specific approach to the use of AI.

The paper is subject to a call for evidence. Responses will be considered alongside further development of the framework in an AI White Paper which will look at how to put the principles into practice.  This should be published towards the end of 2022 or early 2023.

This is a much less prescriptive and a more sector-based approach than the EU is currently proposing under its draft AI Act.

National Cyber Strategy

In October, the government published a call for information on measures designed to enhance the security of online accounts, including those processing personal data.  These are described as a "Cyber Duty to Protect", formulated as part of the National Cyber Strategy.  Responses will be used to develop proposals which will include appropriate security measures for account providers and organisations processing user account personal data.

Telecommunications (Security) Act 2021

The Telecommunications (Security) Act 2021 (Commencement) Regulations 2022 brought the Telecommunications (Security) Act 2021 (TSA) into force from 1 October 2022.  The Electronic Communications (Security Measures) Regulations 2022, made under the TSA, came into force on the same date. The TSA strengthens the security framework for 5G technology and full fibre networks.  The Regulations set out specific security requirements for providers.  A code of practice provides further technical detail.

EU

Draft EC Data Act

The EC published its draft Data Act in February 2022. It takes the form of a Regulation and clarifies who can create value from data (personal and non-personal) and under what conditions.  It is the second major legislative initiative of the European Strategy for Data and follows on from the Data Governance Act which creates the processes and structures to facilitate data sharing.

The Act is intended to unlock industrial data by giving business users access to data they contribute to creating, and giving individuals more control over all their data, not just personal data.  This is focused particularly on data created using connected devices and related services, for example voice assistants.  It is partly aimed at large-scale manufacturers and service providers of IoT products, which are likely to lose their data advantage to a degree.  Third party business users will not be able to use obtained data to develop directly competing products, but they will be able to use it to create other products and services.  The Data Act is expected to come into force by mid-2024.  See here for more.

EC draft Regulation to create European Health Data Space

As part of the European Strategy for Data, the European Commission published a draft Regulation to create a European Health Data Space in May. This is the first draft legislation on the proposed common European data spaces.  The aim is to give users control of their electronic personal health data, nationally and cross-border; to support the free movement of that data by creating a genuine single market for electronic health record systems, relevant medical devices and high-risk AI systems; and to provide a consistent, trustworthy and efficient set-up for the use of health data for research, innovation, policy-making and regulatory activities.

Individuals will have access to their electronic healthcare records and will be able to add information, rectify inaccurate data, restrict third-party access, and have oversight of how their data is used. Member States will be required to ensure patient summaries, prescriptions, images, image reports, lab results and discharge reports are issued and accepted in a common European format.  There will also be mandatory interoperability and security requirements.  Read more.

EC draft Regulation to prevent and combat child sexual abuse online

The EC adopted a draft Regulation to prevent and combat child sexual abuse online in May.  The Regulation will oblige providers to detect, report and remove child sexual abuse (CSA) material on their services.  Providers will need to assess and mitigate risk of misuse of their services and take proportionate measures to address issues.  There are concerns that, if the Regulation remains in its current form, it will allow for significant intrusion on privacy.  The positive obligation on businesses to detect and remove child abuse images could provide scope for scrutiny of encrypted messages and potential profiling.  The legislation may change significantly on its way to enactment if the current proposals prove sufficiently controversial.

Data Governance Act

The Data Governance Act came into force in June 2022 and will apply from 24 September 2023. It:

  • establishes conditions of re-use of certain categories of protected data held by public sector bodies, by a wide variety of stakeholders and for commercial or non-commercial purposes
  • provides for a notification and supervisory framework for the provision of data intermediation services
  • creates a framework for voluntary registration of organisations which collect and process data made available for altruistic purposes
  • establishes a Data Innovation Board.

Read more.

EC draft Cyber Resilience Act

In September, the European Commission published a proposal for a new Cyber Resilience Act to protect consumers and businesses from products with inadequate security features.  It introduces mandatory cybersecurity requirements for products with digital elements throughout their lifecycle.  Manufacturers will be required to embed security by design and provide security support and software updates to address vulnerabilities.  There will be information requirements to inform consumers about the cybersecurity of products, and products will need to meet conformity assessments (subject to the type of product). The legislation is being introduced as part of the European Commission's Cybersecurity Strategy introduced in December 2020.  It will complement NIS2 and the EU Cybersecurity Act.  Read more.

NIS2 Directive

The European Parliament and the Council of the EU approved the NIS2 Directive in November and it will be published in the Official Journal shortly. Member States will then have 21 months to implement the Directive.  NIS2 updates the NIS Directive and deals with the cybersecurity of operators of essential services and digital service providers.  Read more.

EC Digital Operational Resilience Act

The European Commission proposed the Digital Operational Resilience Act (DORA) in 2020.  DORA sets uniform requirements for the security of network and information systems of organisations operating in the financial sector and for critical third parties providing services to those organisations.  It was adopted by the Council and the European Parliament in November and will be published in the Official Journal shortly. It will have a 24 month implementation period.

EC proposes Interoperable Europe Act

The European Commission adopted a proposal for a Regulation to set out measures to create a high level of public sector interoperability across the Union (Interoperable Europe Act) in November. The Act aims to promote the cross-border interoperability of network and information systems used to provide public services in the EU. 

Worldwide

Of course it's not all about the UK and EU.  A number of other countries updated their data protection regimes in 2022, including the UAE, Oman, Sri Lanka, China and Switzerland.  India didn't manage to get its planned new legislation over the line and tabled a new Bill in November.

Meanwhile, the USA continues to develop a State-by-State privacy regime.  A bipartisan group of legislators introduced a Federal Bill in May 2022, but it is unclear whether it will be successful.  In the meantime, the following laws will take effect in 2023:

  • California Privacy Rights Act, effective 1 January 2023
  • Virginia Consumer Data Protection Act, effective 1 January 2023
  • Colorado Privacy Act, effective 1 July 2023
  • Connecticut Data Privacy Act, effective 1 July  2023
  • Utah Consumer Privacy Act, effective 31 December 2023.

Texas, Washington and Illinois already have biometric laws at State level.

The Governor of California also signed the California Age Appropriate Design Code (CAADC) into law in September.  Influenced by the ICO's Children's Code, it will apply from 1 July 2024.  The CAADC sets out mandatory requirements and prohibitions for providers of online services likely to be accessed by children.  These include requirements to carry out DPIAs, to configure high privacy settings by default, to communicate clearly with children, and to provide tools to enable children and/or their parents or guardians to exercise their privacy rights.

There are also global privacy initiatives. By way of example, at the 44th Global Privacy Assembly, data protection regulators from 120 countries reportedly agreed a resolution on a framework for using personal data based on six core principles.  The resolution says the use of facial recognition requires a clear legal basis.  The deploying organisation must be able to establish reasonableness, necessity and proportionality as well as transparency.  Human rights assessments should be carried out, data protection principles respected, and there must be effective accountability.

And at the second International Counter Ransomware Initiative Summit, 36 countries including the UK and US as well as the EU committed to developing coordinated guidelines on preventing and responding to ransomware attacks.  There are plans to establish an International Counter Ransomware Taskforce to share knowledge and resources, and to coordinate on enforcement in line with national law and policy.

Consultations, guidance and reports

The ICO, EDPB and EDPS have been busy as usual.  EU-level guidance no longer has effect in the UK and we are starting to see definite signs of divergence in style, with the ICO tending to take a more risk-based approach than the more prescriptive EDPB.  However, cross-border UK businesses still need to take account of EU regulators' views.

UK

UK Cyber Strategy published

The government published its National Cyber Strategy 2022 in January, setting out a five-year plan underpinned by £2.6bn of investment.  The Strategy is built around five pillars involving strengthening the UK's cyber ecosystem and associated technologies.  Targeted, sector-focused legislation may follow where needed, particularly in relation to providers of essential and digital services, data protection in the wider economy, and large businesses.

ICO consultation on Regulatory action policy and statutory guidance

The ICO consulted on its Regulatory action policy across the laws it monitors and enforces, including the UK GDPR, the DPA18, PECR and the FOIA in January.

ICO consultation on draft guidance on rights of access to law enforcement data

In January, the ICO launched a consultation on its draft guidance on rights of access by individuals to their data held for law enforcement purposes under Part 3 of the Data Protection Act 2018. The guidance is aimed at DPOs and those with data protection responsibilities in the context of law enforcement processing and is intended to be read in conjunction with the ICO's guidance on subject access rights more generally. 

ICO draft guidance on anonymisation, pseudonymisation and PETs guidance

In February, the ICO began consulting on its updated draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies, which is being published chapter by chapter.  Chapter 5, which focuses on PETs, was published in September.

ICO consultation on guidance on processing for research

In February, the ICO published draft guidance on the research provisions within the UK GDPR and DPA 18.  As the ICO noted, the government is hoping to reform the research provisions in its Data Protection and Digital Information Bill; however, the ICO felt there was a need for guidance at an earlier stage.  Read more.

Guidance for manufacturers of Video Surveillance Systems

In March, the Biometrics and Surveillance Camera Commissioner published guidance on security by design and default for manufacturers of Video Surveillance Systems (VSSs) or those manufacturing or assembling components to be used in VSSs.  The guidance sets out minimum requirements to ensure systems are designed and manufactured in accordance with security principles of design and default.  It forms part of a wider suite of documents being developed as part of the Surveillance Camera Commissioner Strategy in support of the SCC Code of Practice.

ICO guidance on ransomware and data protection compliance

In March, the ICO added to its GDPR guide to cover ransomware and data protection compliance.  The guidance provides a compliance checklist and sets out eight common scenarios and situations experienced as a result of a ransomware attack, alongside compliance suggestions.  The guidance also points to other resources, notably NCSC guidance.  Later in the year, the ICO and NCSC advised businesses against paying ransomware demands.

Digital Regulation Cooperation Forum portal launched

In March, the Digital Regulation Cooperation Forum (DRCF) launched a new digital regulation research portal.  The DRCF includes the CMA, Ofcom, the ICO and the FCA.  Its digital regulation research programme brings together over 80 pieces of recent research on emerging and future digital developments from eight regulatory bodies including the DRCF members, the IPO, the Bank of England, the ASA and the Gambling Commission.

Goldacre report on health data

The government-commissioned report into using health data for research and analysis was published in April. The report, aimed at NHS policy makers, the government, and research funders, as well as those using health data for service planning, public health management and medical research, makes 185 recommendations over 112 pages.  Helpfully, an executive summary and a slightly longer summary have also been published. 
The report recommends a move away from techniques like pseudonymisation and towards shared Trusted Research Environments (TREs).  The emphasis is on shared resources and processes to enable data sharing in a secure environment.  Investment in platforms and curation are seen as the key to resolving problems caused by the current fragmented approach, while preserving patient privacy.

ICO resources on acting in the best interests of children

The Children's Code requires online services to treat the best interests of the child as the primary consideration when designing and developing services likely to be accessed by children.  This involves considering how the use of their data impacts the rights held under the United Nations Convention on the Rights of the Child.  In May, the ICO published tools, templates and guidance to assist organisations in making these assessments.

Government call for views on improving security and privacy of apps and app stores

In May, DCMS published a call for views on improving the security and privacy of apps and app stores, together with supporting documents.  The call closed on 29 June 2022.  Plans for a response this year appear to have been pushed back.

ICO updated AI and data protection risk toolkit

In May, the ICO launched an updated AI and data protection risk toolkit following comments on its beta version.  The toolkit is intended to provide practical support to organisations using AI systems and to help them assess and mitigate the risks those systems pose to individuals.

Ryder review on governance of biometric data

An independent legal review of the governance of biometric data in England and Wales was published in July. It was commissioned by the Ada Lovelace Institute in 2020 following calls from the Commons Science and Technology Select Committee, and was led by Matthew Ryder QC. The Review makes ten recommendations to protect fundamental rights, particularly data and privacy rights.

DHSC policy guidelines on secure data environments for NHS data

DHSC released a policy paper setting out secure data environment policy guidelines for data platforms hosting NHS data accessed for research and analysis in September.  The guidelines set out expectations for how secure data environments will be used for NHS and social care data, and the rules with which all platforms providing access to NHS data will need to comply.

Call for evidence on ransomware

The UK's Parliamentary Joint Committee on the National Security Strategy launched a call for evidence on ransomware in the autumn.  It is seeking views on the nature and extent of ransomware threats, how they are deployed and how they are likely to develop.  Information is also sought on the level of vulnerability and preparedness of organisations, and on whether the responses of government and other stakeholders, such as the ICO, are appropriate or whether reforms are needed.  Responses are required by 16 December.

ICO second consultation on draft journalism code

The ICO launched a second consultation on its draft code of practice on using personal data for journalism following changes made as a result of its initial consultation. The ICO has taken on board feedback from the first consultation and says it has significantly reduced the length and complexity of the code.  The consultation closed on 16 November so we can expect to see a final version in the first half of 2023.

ICO draft guidance on monitoring in the workplace

In October, the ICO published its draft guidance on monitoring at work for consultation.  Following a call for evidence, the ICO is releasing topic-specific guidance on employment practices and data protection.  The monitoring guidance was the first in the series and was followed by draft guidance on workers' health data.  Further sections and practical tools will be coming in 2023.

NCSC guidance on gaining confidence in supply chain cybersecurity

The National Cyber Security Centre published new guidance to help organisations assess their suppliers' levels of cybersecurity in November.  The guidance sets out practical steps to help medium to large organisations gain assurance about the cybersecurity of their organisation's supply chain.

ICO guidance on direct marketing using email and live calls

The ICO published guidance on PECR in relation to direct marketing using email and live calls in October. The guidance summarises the applicable rules respectively for each medium, and looks at the relationship between PECR and data protection rules.  The two new elements supplement the overall Guide to PECR.

ICO concerns over emotion analysis technologies

The ICO published a blog in November warning organisations to assess the public risks of using emotion analysis technologies before implementing them.  The ICO says that "algorithms which are not sufficiently developed to detect emotional cues risk creating systemic bias, inaccuracy and even discrimination".  The ICO will publish guidance on using biometric technologies in Spring 2023.

ICO consults on how to prioritise public sector FOI complaints

The ICO is consulting on how to prioritise complaints relating to the response of public authorities to FOI requests. It is proposing to prioritise complaints where there is a clear public interest in the information that has been asked for. Responses are required by 19 December 2022.

New ICO guidance and materials on direct marketing

In December, the ICO published new guidance and resources on direct marketing to assist organisations and businesses to conduct lawful direct marketing activities.  The guidance covers essential issues to consider to achieve UK GDPR and PECR compliance, as well as ways to select the most suitable direct marketing product.

Accompanying resources include:

  • step-by-step direct marketing guidance
  • a guide to PECR
  • sector-specific guidance including for SMEs, B2B marketing and data broker services
  • guidance on selecting a lawful basis
  • a direct marketing checklist, FAQs and summary tables
  • training resources.

EU (EDPB and EDPS)

EDPB final guidelines on data breach notifications

The EDPB adopted a final version of its Guidelines on examples regarding data breach notifications in January. The guidelines complement the Article 29 Working Party guidance by providing more practical examples. They are intended to help controllers decide how to handle breaches and conduct risk assessments.

EDPB draft guidelines on the right of access

The EDPB adopted draft guidelines on the right of access in January. They were subject to a six-week consultation period, although they had not been published in final form at the time of writing. The guidelines analyse the right of access and provide guidance on its implementation in a variety of specific situations.

EDPS calls for stricter rules on online targeting for political advertising

The EDPS published an Opinion on the proposed Regulation on the transparency and targeting of political advertising in January.  The EDPS suggests legislators consider stricter rules in addition to the measures already proposed.  These should include a full ban on microtargeting for political purposes and a ban on pervasive tracking for political purposes, with further restrictions on which categories of data can be processed for political advertising purposes, including for targeting and amplification.

EDPB's first Opinion on national certification scheme

In February, the EDPB adopted its first Opinion on certification criteria in response to the GDPR-CARPA scheme submitted by the Luxembourg DPA.  Certification schemes are intended to help controllers and processors demonstrate GDPR compliance and give them greater visibility and credibility.  The EDPB's role is to ensure consistency of certification criteria across the EU. 

EDPB adopts final version of Guidelines on Codes of Conduct as a tool for transfers

The EDPB adopted the final version of its Guidelines on Codes of Conduct as a tool for transfers in March. The guidance provides a clarification of the application of Articles 40(3) and 46(2)(e) of the EU GDPR.  It also covers the use of an Article 40 Code of Conduct as a mechanism to protect data transfers to third countries.

EDPB Guidelines on Article 60 GDPR

In March, the EDPB adopted Guidelines on Article 60 GDPR.  These support effective enforcement and cooperation between national supervisory authorities.  They are intended to help SAs interpret their local procedures in a way which conforms to their obligations under the one-stop-shop mechanism.

EDPB Guidelines on dark patterns in social media platform interfaces

Guidelines on dark patterns in social media platform interfaces were adopted by the EDPB at the end of March.  These provide practical recommendations to designers and users of social media platforms on how to assess and avoid 'dark patterns' in social media interfaces which infringe GDPR requirements. 

Enforcement toolbox

The Toolbox on essential data protection safeguards for enforcement cooperation between EEA and third country SAs was adopted in March.  It can be used both for administrative arrangements developed within the EDPB by third country SAs and for international agreements negotiated by the EC.  It covers key topics such as data subject rights, the data protection principles and judicial redress.

EDPB guidelines on calculation of fines

In May, the EDPB published new draft guidelines for DPAs on calculating GDPR fines.  The EDPB says the starting points for assessing the amount of a fine are the categorisation of the infringement by nature, the seriousness of the infringement, and the turnover of the business at fault, and it sets out a five-step assessment process.  While providing harmonisation and transparency, the guidelines are just that: guidelines.  Above all, the individual circumstances of the case must be a determining factor.

EDPS Opinions on data security

The EDPS published Opinions on EU proposals to set out a common high level of cybersecurity and information security in EU institutions in May.  The EDPS welcomed the proposals but suggested the text could be improved by including greater assurances for the rights and freedoms of individuals where their data is processed for security operations.  He recommended ensuring all proposed security measures have a valid legal basis and are necessary and proportionate. In particular, he said the cybersecurity proposal needed to achieve better alignment with the now approved NIS2 Directive.

EDPB criteria on closer cooperation on strategic cases

In July, the EDPB adopted a set of criteria to assess whether a case may be of "strategic importance" warranting closer cooperation between regulators.  These cases are usually 'one stop shop' cases which relate to a potential high risk to the rights and freedoms of individuals in more than one Member State. 

EDPB updated guidelines on identifying a lead supervisory authority

The EDPB consulted on draft updates to its Guidelines on identifying a controller or processor's lead supervisory authority in August.  The update deals with paragraphs 29-34 and Annex 2d(i) and (ii).  The changes largely clarify the issue of what happens in a joint controller situation.  The consultation ended on 2 December 2022.

EDPB consultation on revision to data breach notification guidelines for non-EU establishments

The EDPB consulted on proposed changes to its guidelines on data breach notification in October.  It is proposing adding a clarification stating that the mere presence of an EU representative does not trigger the one-stop-shop mechanism for a controller not established in the EU.  It says for this reason, a notifiable breach will need to be notified to "every single authority for which affected data subjects reside in their Member State".   This suggests that, if the changes are adopted, controllers may end up notifying all EU DPAs just in case it later transpires there are affected data subjects in their jurisdictions.

Data transfers

The UK updated its approach to standard contractual clauses (SCCs) this year, introducing its International Data Transfer Agreement and Addendum. The US and EU moved closer to agreeing a replacement for the Privacy Shield, but questions remain over transfers to third countries under the GDPR and the UK GDPR, with Google Analytics data providing a particular flashpoint for EU regulators, alongside the ongoing focus on transfers of personal data by Meta to the USA.

New UK international data transfer agreement

The ICO's answer to the EU's updated SCCs came into force on 21 March 2022. It comprises a new international data transfer agreement (IDTA), an international data transfer addendum to the EC's standard contractual clauses (Addendum) and a document setting out transitional provisions. Contracts concluded on or before 21 September 2022 on the basis of the previous Standard Contractual Clauses will be treated as providing adequate safeguards for data exports until 21 March 2024, provided that the processing operations they cover remain unchanged.  Most organisations will find it easier to use the Addendum rather than the IDTA. See here for more.

EU updated Standard Contractual Clauses

The EU updated its Standard Contractual Clauses in 2021, to take account of the GDPR and the Schrems II guidance.  In June, as we discuss here, the Commission published a series of Q&As to help organisations navigate some of the newer elements.  Organisations relying on the old SCCs for data transfers have until 26 December 2022 to change over to the new ones.  Read more.

ICO and EDPB updated guidance on Binding Corporate Rules

In August, the ICO published updated guidance and revised application forms and tables to simplify the UK BCR application and approval process.  A key change is the revision of the referential table which the ICO says must demonstrate the applicant's "understanding of the spirit and intent behind Article 47".  The ICO also called for a BCR policy to be included in the UK BCR document.  Read more.

In November, the European Data Protection Board adopted draft recommendations on applying for controller Binding Corporate Rules to underpin data transfers to third countries.  The recommendations will update the current guidance and bring it in line with the Schrems II requirements, replacing the current Article 29 Working Party Recommendations.  They also update the application form and set out what must be included on the form and with the application.  The draft is open to comments until 10 January 2023.  The EDPB is also working on guidelines for processor BCRs.

President Biden Executive Order paves the way for new EU (and UK) – US adequacy agreement

President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (EO). The EO and related Department of Justice rules, published at the same time, seek to deal with the issues raised by the CJEU in the Schrems II decision that invalidated the EU-US Privacy Shield, and pave the way for a new EU-US Data Privacy Framework (previously known as the Trans-Atlantic Data Privacy Framework). The EC is expected to publish a draft EU-US adequacy decision in mid-December.

The EO is not EU-specific; its redress elements apply to any state designated by the US as a qualifying state and other elements apply regardless of the location or nationality of the data subject.  This is good news for the UK which (at least for now) is bound by the Schrems II decision and has rules on data exports under the UK GDPR equivalent to those in the EU. On the same day the EO was published, the UK government published a US-UK Joint Statement on a New Comprehensive Dialogue on Technology and Data and Progress on Data Adequacy. The Statement announced "significant progress on UK-US data adequacy discussions".  It set out the government's aim of "working expediently" to issue a UK-US adequacy decision and achieving recognition under the EO as a qualifying state.

See more about what this means for future EEA/UK data transfers to the USA here.

Separately, in October, the Data Access Agreement between the USA and UK came into force.  It sets out conditions for cross-border access to electronic data for the purposes of combatting serious crime, with safeguards and procedures to protect fundamental rights.

ICO updated guidance on international data transfers

The UK's ICO published updated guidance on international data transfers in November. This included a new section on transfer risk assessments (TRAs), known as Transfer Impact Assessments or TIAs in the EU, as well as a new TRA tool.  These are required when using Article 46 transfer mechanisms, as a result of the CJEU judgment in Schrems II.

The ICO takes a narrower, risk-based approach, focusing on the risk to the individual if the transfer goes ahead.  The ICO stressed that its guidance provides an alternative to the EDPB's guidance on supplementary measures for international transfers.  The ICO's own approach to TRAs is different but it says that it is "happy for organisations exporting data from the UK to carry out an assessment" that meets either the UK or the EU approach.

The ICO's new TRA tool is designed to help organisations assess the initial risk level of the relevant categories of data by asking a series of six questions, each of which is accompanied by supporting documentation tables guiding what to consider.  Businesses operating across the EU and UK may prefer to stick with the EDPB's approach.

South Korea gets EU and UK adequacy

The European Commission adopted an adequacy decision in favour of South Korea at the end of 2021.  The UK adopted an adequacy decision in favour of South Korea in December 2022 – the first independent adequacy decision issued by the UK following its departure from the EU.

Google Analytics under the regulatory spotlight

In January, the Austrian Data Protection Authority (DSB) said that using Google Analytics breaches EU law on data exports, following a complaint by NOYB (the privacy group spearheaded by Max Schrems).

The complaint alleged that Google as data importer, and a website provider as exporter, breached Article 44 GDPR when moving Google Analytics data from the EU to the USA as it was vulnerable to access by US intelligence authorities.  Google argued that the data was not personal data and that Chapter V of the GDPR was not applicable to data importers. 

The DSB agreed that the export of the data did breach the GDPR.  In particular, it found the supplementary measures used by Google in conjunction with Standard Contractual Clauses do not adequately address specific issues with data transfers to the USA and the potential access to data by US intelligence authorities.  The DSB held that the transfers were not adequately protected.  Moreover, the exporter could not rely on an Article 49 exemption.  The transfers therefore took place in breach of Article 44 GDPR. 

The DSB did, however, agree with Google that Chapter V of the GDPR applies only to exporters and that it was the exporter who was solely liable for the breach. Google issued a robust response in a blog post, arguing that there was a fundamental misunderstanding of the way Google Analytics worked.

Over the year, other European regulators including those from France, Denmark, Norway, Guernsey, the Netherlands and Italy followed the Austrian DSB's lead to varying degrees.  In a set of Q&As, the CNIL said that use of GA (or any similar technology where the data is hosted on servers in third countries) will result in unlawful data transfers unless the data is encrypted before it leaves the EEA by an EU (or adequate country) data controller with exclusive access to the encryption keys, or a proxy server is used.  Whether or not the data being transferred is likely to be accessed by third country government agencies is irrelevant.  The Danish DPA, however, focused more on pseudonymisation.

The wider implications of these decisions are far reaching. They cast doubt over whether there are any realistically achievable additional measures sufficient to protect personal data transferred to the US from the EU.  Google says its supplementary measures intended to give effect to the Schrems II rulings include anonymising IP addresses before data leaves the EU, using data encryption, and employing technical measures to prevent in-transit interception.  These are all measures included in EDPB Guidance as ways to reduce risk attaching to US data transfers and yet regulators are finding that they do not adequately protect the data.  Given that the personal data being transferred by analytics tools is not particularly sensitive, one might imagine the risk to individuals regarding access by intelligence authorities is relatively low. So how much more stringent do protections need to be for more sensitive data being transferred from the EU to the US? 

Google has announced it will introduce an updated set of analytics tools, among other announcements on enhanced privacy tools this year.  However, all eyes are now on the pending EU-US Data Privacy Framework.

Controllers who are responsible for their use of Google Analytics (and any similar tools) should, at this stage, make considered use of the privacy options available pending any EU-level decision.

Irish DPC intends to ban Meta's data transfers to the USA

The Irish Data Protection Commissioner sent a draft decision to the other EU regulators setting out its intention to ban Meta (Facebook) from transferring personal data to the USA.  This has the potential to cut off EU access to services including Facebook and Instagram.

The Irish DPC carried out an 'own volition' investigation in the wake of the Schrems II judgment.  The decision is now subject to the Article 60 process and, potentially, to an EDPB decision under Article 65, should there be insufficient agreement with the ruling.  NOYB has said it expects objections to the draft decision to be made by other EU regulators. This is the latest twist in the Schrems complaint and, if suspension of transfers to the US is ordered, there could be wide-reaching implications for all data transfers to the USA and, potentially, to other third countries, particularly if it happens prior to the agreement of a new EU-US adequacy arrangement.

Regulator enforcement and data breaches

We've seen record fines from regulators for GDPR breaches this year, and a stream of cybersecurity breaches.  While they are too numerous to mention individually, here are some of the UK and EU highlights (or lowlights depending on your point of view), chosen to represent issues and sectors.

UK

CMA accepts Google's commitments regarding its Privacy Sandbox

The CMA announced its acceptance of Google's final commitments to address competition and privacy concerns over Google's planned replacement of third party cookies on its Chrome browser with a range of 'Privacy Sandbox' tools. The CMA worked closely with the ICO to analyse and address potential risks associated with Google's proposals.  The commitments, made for six years, were welcomed by the ICO.

ICO issued fines of £405,000 for targeting older and vulnerable people

In March, the ICO fined five companies a total of £405,000 for making over 750,000 unwanted marketing calls targeted at older and vulnerable people. The ICO found that the companies, possibly working together or using the same marketing lists, were deliberately targeting older people by buying lists from third parties, specifically asking for information about people aged 60+, homeowners, and people who had landline numbers.

ICO fined business £80,000 for sending unsolicited marketing texts

The ICO fined H&L Business Consulting Ltd £80,000 for sending nearly 400,000 unsolicited marketing texts which sought to capitalise on the pandemic by directly referencing lockdown when marketing a 'debt management' scheme.  The messages falsely claimed the scheme was government-backed when it was not even authorised by the FCA.  In addition, the company director had been uncooperative with the ICO.

Clearview AI fined £7.5m

In May, the UK's ICO fined Clearview AI just over £7.5m for breaches of UK data protection law following a joint investigation with the Australian Information Commissioner.  Clearview AI has a global online facial recognition database which relies on images scraped from the internet.

The ICO found that Clearview AI had:

  • failed to give information to UK individuals in a fair and transparent manner
  • failed to have a lawful basis for collecting the information
  • failed to have suitable data retention policies
  • failed to meet the higher data protection standards required for biometric data
  • asked for additional information from people inquiring whether they were on its database.

The ICO also ordered Clearview AI to stop obtaining personal data of UK residents which is publicly available on the internet, and to delete the data of UK residents from its systems. Later in the year, Clearview AI was fined by a series of EU DPAs for similar reasons.

TikTok faces potential £27m fine from ICO

In September, the ICO issued TikTok Inc and TikTok Information Technologies UK Limited with a notice of intent, a legal document which precedes a potential fine. The ICO says TikTok faces a potential £27m fine following its findings that the company may have:

  • processed personal data of under-13s without appropriate parental consent
  • failed to provide proper information to users in a concise, transparent and easily understood way
  • processed special category data without legal grounds to do so.

These are provisional findings and the ICO stresses that no conclusion can yet be drawn that TikTok has breached UK data protection law, or that a fine will ultimately be imposed.  The ICO will now consider representations from TikTok before reaching its final decision.

ICO 'reprimands' for failures relating to SARs

The ICO issued official reprimands to seven organisations relating to their handling of subject access requests in September.  Reprimands are one of the ICO's enforcement tools which can be used to warn organisations that their actions breach the UK GDPR, and to recommend steps to prevent ongoing non-compliance.  The non-compliance identified varied between organisations but broadly related to failure to have the proper processes and resources in place to enable timely response to SARs.   The ICO has published a blog on getting SARs right.

£1.48m ICO fine for company profiling and targeting customers without consent

In September, the ICO fined Easylife Ltd £1.35m for using customer purchase data to profile and predict their medical conditions, and then target them with health-related products without consent. Easylife was also fined £130,000 for making unsolicited marketing calls.  The ICO found that Easylife profiled customers making purchases from its Health Club catalogue in order to target them with health-related items.  This meant the company was processing 'invisible' health data – people were unaware that their personal data was being used for that purpose.

DfE reprimanded for allowing access to children's data for gambling age verification

The ICO issued a reprimand to the Department for Education for allowing a database of 28m pupils' learning records to be used by Trust Systems Software UK Ltd (trading as Trustopia) for online gambling age verification purposes.  Only the ICO's current approach of reducing the impact of public sector fines on the public prevented it from issuing a £10m fine.

EU

Google and Facebook fined for breach of consent

Right at the end of last year, the French data protection authority, the CNIL, fined Google €150m and Facebook €60m for failing to meet consent requirements for cookies on Facebook, Google and YouTube in France.  The CNIL said the failure to make it as easy to reject cookies as to accept them meant that consent to cookies was invalid, in breach of Article 82 of the French Data Protection Act.  While the sites required only one click to accept cookies, several clicks were required to reject all of them. Read more.

Grindr fine reduced from £8.6m to £5.5m

In January, the Norwegian DPA confirmed the final penalty imposed on Grindr for data protection failings. The fine was originally set at around £8.6m after the Norwegian regulator found the dating app had failed to get valid consent to disclosure of user data to third parties for targeted advertising purposes.  The data disclosed included special category data.  Grindr has since changed its consent collection mechanism.

WhatsApp fine of €225m upheld by CJEU

In 2021, the Irish Data Protection Commission fined WhatsApp €225m for breaches of the GDPR. The fine related to breaches of transparency requirements, particularly relating to the sharing of WhatsApp data with its parent company Facebook. 

The Irish regulator, acting as Lead Supervisory Authority, had originally intended a lower fine of between €30-50m, however, its provisional decision was rejected by other regulators.  The EDPB subsequently issued a binding decision under the Article 65 procedure requiring the fine to be increased.  It also specified that WhatsApp be given a reduced time of three months to take required remedial actions in relation to its privacy practices.

WhatsApp has asked the CJEU to annul the penalty and to allow it to recover costs but the CJEU upheld the fine in December.

Additional penalties to €746m Amazon fine suspended

The Administrative Court of Luxembourg suspended the requirement for Amazon to make changes to its data processing in relation to targeted advertising by 15 January on pain of daily penalties.  In July 2021, Amazon was fined a record €746m for transparency failings regarding targeted advertising.  Amazon is appealing the fine but also appealed the requirement to make changes by 15 January, arguing that it could not comply due to a lack of clarity as to what it was being asked to do.  The Court acknowledged that the requirements made by the Luxembourg DPA were insufficiently clear or precise and suspended the compliance deadline.

Adtech and cookies

In February, the Belgian Data Protection Authority (the APD) fined IAB Europe €250,000 for GDPR failings relating to its Transparency and Consent Framework (TCF).  The TCF was designed to help the real-time bidding adtech ecosystem comply with the GDPR and, in particular, to address some of the difficulties in being transparent and obtaining valid consent. It is widely used by the adtech industry.

The APD found that IAB Europe is a data controller with respect to the registration of user consents, objections and preferences by means of a unique Transparency and Consent (TC) String which is linked to an identifiable user. As such, it was responsible for a number of GDPR failings including around lawful processing, transparency, accountability, and controller obligations.

Responding to the decision, IAB Europe rejected the APD's finding that it is a data controller in the context of the TCF, saying it was wrong "as a matter of law".  The appeal has been suspended pending a reference to the CJEU on the definition of 'joint controller' and whether user consent signals sent via the TCF are personal data.

Notwithstanding these issues, in October, IAB Tech Lab announced that its Global Privacy Platform is ready for industry adoption.  The GPP is part of a portfolio of solutions developed by IAB Tech Lab to address privacy compliance issues in adtech across jurisdictions with differing regimes. The GPP enables user consent signals to be communicated through the digital ad supply chain and provides the protocol to help consolidate and manage these.  It currently supports the US Privacy and IAB Europe TCF consent strings and has plans to add US State-specific strings and one for Canada.  Users are advised to begin transitioning to GPP from TCF v2.0.

Privacy campaign group NOYB sent 270 draft complaints to website operators it believes are not complying with GDPR requirements in their cookie banners.  It included suggestions about how to make the banners compliant and said it will only file the complaints against operators who do not make changes within 60 days.  This is the second wave of action NOYB has launched on cookie banners and it says it will extend its focus to pages using consent management platforms other than OneTrust, including TrustArc, Cookiebot, Usercentrics and Quantcast, hoping to review up to 10,000 websites.

In April, Google announced it will roll out a 'reject all' button allowing users in the EEA, UK and Switzerland, to reject or accept all non-essential cookies on search or YouTube provided they are signed out or are in incognito mode. Customised options will also be available.  The roll-out has started in France (which has been particularly focused on cookie consent).

The news was welcomed by the UK ICO as a step in the right direction, although Stephen Bonner, Executive Director of Regulatory Futures and Investigations, commented that "there's still a long way to go to address concerns around consent across the whole advertising industry…current approaches to obtaining cookie consent need further revision in order to provide a smoother and increasingly privacy-friendly browsing experience".

In September, the CNIL reportedly issued a preliminary notice of a €60m fine to adtech company Criteo.  The fine follows a complaint by Privacy International.  The CNIL intends to fine Criteo for GDPR violations related to unlawful targeted advertising and profiling.  Criteo now has the right to make a written response before the CNIL issues a draft decision which will be subject to the Article 60 cooperation and consistency process.

In December, the EDPB adopted dispute resolution decisions on the basis of Article 65 in respect of the Irish Data Protection Commissioner's decisions regarding Meta platforms Facebook, Instagram and WhatsApp.  The decisions were made following complaint-based inquiries and focused on the lawfulness and transparency of processing for behavioural advertising.  The WhatsApp decision looked at the lawfulness of processing for the purpose of improvement of services.  The decisions had not been published at the time of writing, pending notification of the controller by the Irish DPA, which has one month to adopt final decisions following notification by the EDPB.  Reports suggest Meta could face a fine of as much as €2bn.  Meta says it is too early to speculate and that it continues to engage with the Irish DPA.

If we see the ePrivacy Regulation next year, things may become clearer on the cookie/adtech front, but we've been waiting a long time for it so are not overly optimistic.

EDPB first coordinated enforcement action

The EDPB began its first coordinated enforcement action to investigate the use of cloud-based services by public authorities in February.  This follows the creation of a Coordinated Enforcement Framework (CEF) in October 2020.  A report was expected by the end of 2022.  EDPB members have agreed to further enhance cooperation on strategic cases and to diversify the range of cooperation methods they use. 

Record Instagram fine

In September, the Irish Data Protection Commission, acting as Lead Supervisory Authority (LSA), fined Meta Platforms Ireland Limited €405m following an inquiry into Instagram's privacy practices.  This is the second largest fine handed down under the GDPR to date and was preceded by an EDPB opinion under the Article 65 procedure.

The Irish DPC investigated Instagram's public disclosure of email addresses and phone numbers of children using Instagram's business account feature, and public-by-default settings for children's personal Instagram accounts for a period (the practice has since ended). 

Meta said it would appeal the fine and is now reportedly seeking a number of High Court declarations including one that parts of the Irish Data Protection Act 2018 are invalid under the Irish constitution, and are incompatible with the European Convention on Human Rights.  In addition, Meta intends to apply to the CJEU to annul an EDPB instruction to the Irish Courts on the level of the fine imposed.

Spanish DPA fines Google €10m

In May, the Spanish DPA (AEPD) fined Google €10m for breaches of the GDPR.  It found that Google had passed personal data relating to right to be forgotten requests to a third party in the USA without having a lawful basis for doing so.  This also had the effect of preventing individuals from having their data erased.  The data was transferred under the Lumen Project, an academic project which involved creating a database of content takedown requests.  Google has said it is considering its response but underlined that it has already started re-evaluating and redesigning its data sharing process with Lumen.

Conseil d'État confirms CNIL's Amazon fine

In July, the Conseil d'État confirmed the CNIL's 2020 decision to fine Amazon Europe €35 million for dropping cookies without consent and for a lack of transparency in breach of the French Data Protection Act which implements the ePrivacy Directive.  The Conseil confirmed that the CNIL is competent to impose sanctions outside the GDPR's one stop shop mechanism, and that it had jurisdiction to do so in the context of Amazon's activities in France, even though Amazon did not have its main establishment there.  It also confirmed that the amount of the fine was not disproportionate relative to the seriousness of the breaches, the scope of the processing and the financial resources of the company.

Irish Data Protection Commissioner submits draft TikTok decision

The Irish Data Protection Commission submitted a draft decision following its inquiry into TikTok Technology Limited, to other Concerned Supervisory Authorities.  The inquiry focused on processing of children's personal data by TikTok, in particular, its public-by-default settings for under-18 accounts, and age verification measures for under-13s.  It also considered whether TikTok complied with GDPR transparency requirements.  The Concerned Supervisory Authorities are reviewing the draft decision under the Article 60 procedure.

Accor fined €600,000 following EDPB decision under Article 65 procedure

The EDPB reached a binding decision under the Article 65 procedure, requiring the CNIL to fine Accor €600,000 for sending unsolicited marketing communications.  The fine was raised from €100,000 as a result of the procedure.

Data breaches

Data breaches now hit the headlines so regularly that it's hard to pick out the really big ones.  Ransomware continues to be an issue: in July, the ICO and NCSC asked solicitors to ensure they do not advise clients to pay ransom demands.  Phishing, however, remains the most popular form of cyberattack. 

Some of the companies to suffer high-profile data breaches this year include Microsoft, Twitter, Marriott (again), Ubisoft, Uber, Nvidia and News Corp.  Healthcare data remains a particularly frequent target, but a new trend has emerged around cryptocurrencies, with crypto exchanges proving a high-profile target for hackers this year.  By way of example, Binance, the world's largest cryptocurrency exchange, temporarily suspended transactions and the exchange of funds after its network was hacked, causing it to lose anywhere between $110m and half a billion dollars.  The hackers exploited a vulnerability between two blockchains, an increasingly common approach.

Meta has had a tough year in relation to data breaches.  In March, the Irish Data Protection Commissioner, acting as lead regulator, fined Meta €17 million in relation to a series of data breaches which took place between June and December 2018.  Specifically, the DPC found that Meta had breached the Article 5(2) accountability requirement and the Article 24 requirement to maintain appropriate technical and organisational measures: Meta was not able to readily demonstrate the security measures it had implemented in practice to protect user data in the context of the 12 identified breaches.  The Irish DPC's decision was subject to the Article 60 decision-making process as it related to cross-border processing.

In October, the Irish Data Protection Commissioner submitted a draft decision following its inquiry into Meta Platforms Ireland Limited (Meta) to other concerned authorities under the Article 60 GDPR procedure.  The inquiry began in April 2021 following reports that a dataset relating to approximately 533m Facebook users worldwide had been exposed on the internet, and focused on Meta's compliance with its data protection by design and default obligations under Article 25 GDPR.  The DPC subsequently announced a fine of €265m and a reprimand.

Also in October, Facebook reportedly began notifying 1m users that their login details may have been stolen by malicious apps on the Apple and Android platforms.  The apps purport to require login via Facebook but then pass the user details directly back to their creators.  Facebook says it has removed all the apps it identified.

It's worth noting that in December, Meta announced updated privacy practices to protect young people on Facebook and Instagram.  These include privacy by default settings for new accounts, tools to limit unwanted interactions with adults, and tools to limit the spread of teens' intimate images online.  These settings will be applied automatically to new users under 16 (or under 18, depending on the country), and teens already on the platforms will be pointed towards higher privacy settings.

The Irish regulator is now reportedly investigating the Twitter breach reported in August.

Meanwhile, human error continues to be a significant driver of data breaches.  In November, the UK's ICO fined construction company Interserve Group £4.4m for failing to implement sufficient security measures to protect its employees' personal data.  The ICO said the company had failed to put appropriate security measures in place to prevent a cyberattack, which enabled hackers to access the personal data of 113,000 employees through a phishing email.  The compromised data included contact details, bank details and special category data.  The UK Information Commissioner John Edwards warned that "the biggest cyber risk businesses face is not from hackers but from complacency within their company".  He reminded businesses to regularly monitor for suspicious activity in their systems, act on warnings, update software and train staff.

For more on cybersecurity this year, see here.

Case law

There are innumerable cases brought in the English courts which include data protection claims.  However, as discussed here, the trend for multiple causes of action is being heavily discouraged by the courts.  In addition to the cases discussed in our article, here are a couple of interesting developments:

Class action launched against Meta

In January, a class action was launched against Meta (Facebook's parent company) at the Competition Appeal Tribunal under the Consumer Rights Act 2015.  It alleged that Meta abused its market dominance by setting an unfair price for the free use of Facebook in the form of the personal data it collects from users.  The class includes all people domiciled in the UK who used Facebook at least once between 1 October 2015 and 31 December 2019, unless they opt out.  The claim is said to be worth £2.3bn and is being backed by litigation funder Innsworth.

It remains to be seen whether this claim under the Consumer Rights Act is more successful than the representative class action claim made in Lloyd v Google under the old Data Protection Act 1998. The claim in Lloyd v Google failed, in part, because the claimant could not demonstrate a common issue among the class. 

High Court grants permanent injunction on material harvested in ransomware attack

In November, the High Court granted summary judgment in respect of a permanent injunction in a breach of confidence claim arising out of a ransomware attack.  It also preserved the anonymity of the Claimant.  The Claimant had previously obtained a without notice interim injunction restraining the unknown Defendants from using or disclosing the Claimant's confidential information harvested in the ransomware attack.  The Claimant then commenced proceedings for breach of confidence and the Court continued the injunction on expanded terms.  The Court granted summary judgment because the large amount of stolen data fell into categories requiring extra protection (including security-sensitive information), and because the information was obtained by hacking.

The Claimant's anonymity was preserved largely due to the nature of their work and the fact that much of it is covered by the Official Secrets Act. 

Data breach litigation

For a more detailed update on litigation in the courts of England and Wales relating to data breaches, see here.

EU

CJEU says inbox advertising covered by ePrivacy Directive 

In January, the CJEU ruled on a reference from Germany which asked whether the restrictions on sending unsolicited direct marketing communications under Article 13 of the ePrivacy Directive apply to advertising messages that appear in email inboxes and look like normal emails, a practice referred to as inbox advertising.  In brief, the CJEU said that the ePrivacy Directive does apply to this type of advertising.

AG Opinion on right to be forgotten evidence

In April, Advocate General Pitruzzella opined on a reference from the German Federal Court on the issue of evidence to back up a request to have search results de-listed under Article 17 GDPR (and equivalent provisions under the Data Protection Directive).  

The AG said it was for the data subjects to provide evidence of the falsity of relevant content where to do so is not manifestly impossible or excessively difficult.  The search engine operator should carry out checks which fall within its specific capacities.  Where possible, it should contact the publisher of the relevant web page.  Where appropriate, it could suspend the referencing on a temporary basis pending further information, or mark the results as containing disputed information.  There should be a balancing of the fundamental rights involved.

The AG also said there was no need to de-reference thumbnail images from an image search on the basis of their connection to a name, because account should not be taken of the context of the original internet publication in which the thumbnails appeared.

CJEU says consumer protection organisations can bring actions for breach of GDPR

In May, the CJEU ruled in a reference from Germany, that Member States can make provision for consumer protection associations to instigate actions for breach of data subject rights on the basis of infringement of rules on unfair commercial practices, consumer protection, or the use of invalid general terms and conditions.  They can do this at their own instigation, without the mandate of a specific data subject or infringement of specific rights.  Article 80(2) GDPR does not preclude this.

Crucially, the CJEU held that in order to bring a representative action, an authorised organisation need not identify a specific data subject.  Identifying an affected category or group may be sufficient.  Moreover, there doesn't need to be a specific infringement of a data subject's rights, nor proof of actual harm in a given situation.  This decision potentially opens the floodgates to actions brought by consumer protection organisations, for example, around cookie banners or marketing communications.  The organisation will be able to bring proceedings in relation to a suspected breach of consumer protection or data protection rules without having to do so through or at the instigation of an individual.  Read more.

AG opinion on GDPR access right

In June, AG Pitruzzella opined in a reference from Austria on interpretation of the Article 15(1)(c) access right under the GDPR.  Article 15(1)(c) provides that individuals can obtain information from a controller about the recipients or categories of recipient to whom their personal data has been or will be disclosed.  The AG was asked, in summary, whether it was at the controller's discretion to decide whether to disclose precise recipients, or categories of recipient.  The AG said that it is the data subject, not the controller, who has the choice. 

Naming a spouse or partner can be processing special data, says CJEU

In September, the CJEU placed a broad interpretation on the scope of what constitutes special data under Article 9 GDPR.  In a reference from Lithuania, the CJEU held that publishing the name of a civil servant's spouse, partner or cohabitant makes it possible to determine their sexual orientation and can therefore constitute processing of special data.  The ruling suggests a more careful assessment may be required for personal data from which special data may be inferred or deduced.

AG Opinion on interaction between GDPR and unfair competition

In September, Advocate General Rantos gave a non-binding Opinion on a reference from Germany in a case involving the use of combined personal data by Meta.  The German Federal Competition Authority banned Meta from collecting personal data and combining it from across its services and from third-party websites and applications.  Meta appealed the decision, particularly focusing on whether competition authorities have jurisdiction to rule on an infringement of the GDPR.

The AG said that the authority in question had not ruled directly on a GDPR infringement, so the question was irrelevant.  However, he did respond to the other questions raised, holding (among other things) that a competition authority can consider GDPR compliance practices as an incidental factor in competition investigations, provided it takes the decisions of data protection regulators into account. 

The AG also said that the mere fact that an undertaking has a dominant position, does not, on its own, mean that consent cannot be freely given, but the dominance can play a role in whether or not consent is freely given and market power can lead to an imbalance in power between controllers and data subjects.  The validity of consent needs to be assessed on a case-by-case basis. The CJEU will make a preliminary ruling on the case in due course.

In August, the UK's CMA announced it would proceed with its investigation into the collection and use of advertising data by Meta.  The investigation opened in June 2021 and focuses on Facebook's collection and/or use of data in the context of providing online advertising services and its single sign-on function, and on whether this constitutes a competitive advantage.

CJEU says no indiscriminate retention of traffic and location data subject to limited exceptions

In October, the CJEU ruled in joined cases that Article 15(1) of the ePrivacy Directive (as amended) read in the light of the Charter of Fundamental Rights of the European Union, precludes national legislative measures which authorise the general and indiscriminate retention of traffic and location data for the purposes of combating serious crime and preventing serious threats to public security.  It also reiterated that EU law does not preclude measures that enable the collection of such data where there is a serious threat to national security subject to specified checks and balances.  

Breach of GDPR does not automatically give rise to a claim for non-material damage, says AG

In October, Advocate General Campos Sánchez-Bordona delivered an Opinion in a reference from Austria on the interpretation of Article 82(1) GDPR, which sets out the right to compensation for breaches of the Regulation.  The AG opined that the CJEU should hold that a mere infringement of the GDPR does not in itself give rise to a claim for compensation; it must be accompanied by relevant material or non-material damage.  Compensation for non-material damage does not cover mere upset felt as a result of the breach.  It will be up to Member State courts to determine where the dividing line between a feeling of displeasure and non-material damage lies on a case-by-case basis.

AG Opinion on retention of IP addresses to identify suspected copyright infringement

In November, AG Szpunar delivered an Opinion in La Quadrature du Net and others v Premier ministre and Ministère de la Culture.  The AG said that Article 15(1) of the ePrivacy Directive, read in light of the Charter of Fundamental Rights, does not preclude national legislation that allows for the general and indiscriminate retention of IP addresses for a period limited to what is strictly necessary for preventing, investigating, detecting and prosecuting online criminal offences, provided that the data is the only means of identifying the person to whom the IP address was assigned at the time the infringement was committed.  This can be done without prior review by a court or independent body if, as in this case, the linking is carried out at a given point in time and is limited to what is strictly necessary to achieve the objective.
