Author

Debbie Heywood

Senior Counsel – Knowledge


9 December 2019

Radar - December 2019 – 3 of 8 publications

Radar - December 2019: Data privacy


It's been another busy year in data protection, cybersecurity and wider privacy. We've seen a lot of guidance and a number of interesting cases, as well as a focus on sectors like Adtech and AI. With Brexit on the horizon, data exports have been high on the agenda and we have also seen enforcement under the GDPR take off. There has also been a real focus on ethical use of data and prevention of online harms involving personal data, underlined by the ICO's recent appointment of an ethics advisor.

Here are some of the highlights of 2019. More detailed news and articles can be found on our Global Data Hub and you can sign up for weekly news updates here.

New UK legislation

It wasn't all about preparing for Brexit.

In January, PECR was amended.

The UK and US agreed a new data access treaty under the US CLOUD Act, expected to come into force early next year subject to ratification on both sides of the Atlantic. The agreement replaces arrangements dating back to the 1980s for access to the electronic information of criminal suspects in each other's jurisdictions. Under it, UK authorities will be able to issue a request equivalent to a US court order (and vice versa), which should reduce the wait for access to a matter of weeks or even days, potentially increasing the number of requests tech companies receive. The agreement will not give either country additional rights, and each country's authorities will only be able to investigate cases in their own jurisdiction.

The agreement will require social media companies and other tech businesses to hand over private messages when required to do so by the relevant court order. It does not, however, require them to provide the data in a readable format, nor to break end-to-end encryption or build backdoors into software.

Data transfers

The EU Adequacy Decision on Japan's provision of adequate protection for EU personal data was finalised in January and then published in the Official Journal. See here for more.

In November, the EC published its report on the third annual review of the EU-US Privacy Shield. The EC found that the US continues to ensure an adequate level of protection for personal data transferred under the scheme. It noted that there are now around 5000 companies participating. During the past year, there have been a number of improvements, in particular the appointment of a permanent Ombudsperson and more effective enforcement and redress mechanisms.

On 12 December, we expect the Advocate General's Opinion in the case known as Schrems II which could decide the fate of Standard Contractual Clauses (SCCs) as a data transfer mechanism to the USA. It is very difficult to predict what the CJEU will decide although the relatively positive conclusions of the third annual review of the EU-US Privacy Shield offer some encouragement.

In its Safe Harbor decision, the CJEU arguably went further than was necessary to answer the questions raised in the reference, which is one of the reasons the decision took many by surprise. If the CJEU feels that the USA is not taking its concerns seriously, then we may see a similarly impactful ruling. Hopefully the CJEU will seek to keep any impact on EU-US data flows as narrow as possible.

What we should remember is that even during the aftermath of the drastic CJEU ruling on Safe Harbor, regulators did understand the impact on businesses and gave them time to adapt. It is almost inconceivable that there would be immediate sanctions against businesses using SCCs for data transfers, and there will certainly be time to make changes before the enforcement regime kicks in. Just how much time and just how much adaptation will be needed (if any) remains to be seen.

Brexit

2019 did not see Brexit happen but it did see preparations for a no-deal/no adequacy situation.

In January, the US Department of Commerce updated its FAQs on the EU-US Privacy Shield to answer questions about the impact of Brexit. A US organisation which wants to continue receiving UK personal data under the Privacy Shield scheme after Brexit must update the language used to explain its commitment to the Privacy Shield principles to explicitly include personal data received from the UK. Those receiving HR data from the UK must also update their HR privacy policies accordingly.

Keeling schedules showing changes to the GDPR and DPA18 after Brexit were published in February.

The ICO published and subsequently updated advice on preparing for a no-deal Brexit. The advice highlights methods of preserving data flows and looks at when businesses might need to appoint a representative in the EU.

The EDPB also produced an information note on the impact of a no-deal Brexit on BCRs which have the ICO as their Lead SA. As the ICO would no longer play a part in the BCR community in the event of a no-deal Brexit, organisations headquartered in the UK would need to identify the most appropriate SA for BCRs under the Article 29 Working Party's WP263. Groups which currently have an application for BCRs pending with the ICO will also need to go through the exercise, and the newly nominated SA will take over the application from the ICO. Where the ICO has approved an application which is before the EDPB for approval at the time of a no-deal Brexit, a new lead SA will have to be identified and will re-submit the application to the EDPB for approval.

The issue around personal data transfers after Brexit was also highlighted in the government's Operation Yellowhammer document. In the government's "reasonable worst case scenario", "The EU will not have made a data decision with regard to the UK before exit. This will disrupt the flow of personal data from the EU, where an alternative legal basis for transfer is not in place. In no-deal, an adequacy assessment could take years."

This wasn't particularly surprising or even new information but there was no mention of any solution nor indeed any effort to mitigate the problems a no-deal Brexit would create for personal data flows.

In its No-deal Readiness Report, published on 8 October 2019, the UK government confirmed that it has secured agreements with twelve of the thirteen EU-adequate countries to preserve the free flow of personal data from them to the UK in the event of a no-deal Brexit. This covers Argentina, Canada (commercial organisations), Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay, and the USA under the Privacy Shield. Negotiations with Andorra are ongoing.

The UK announced earlier in the year that it would continue to preserve the effect of the EU adequacy decisions with regard to data flows from the UK to those third countries.

For now, no-deal appears to be off the table but even if we go into transition, there is still the possibility that there will be no agreement on data transfers by the end of the transition period, effectively creating a no-deal situation.

In November, we reported on new UK legislation and guidelines to help EU and UK Digital Service Providers deal with compliance under the NIS Directive and the UK NIS Regulations after Brexit. The legislation, which amends the NIS Regulations, is stated to come into force on the twentieth day after exit day but would most likely take effect at the end of any transition period.

Guidance, opinions and consultations

ICO

In April, the ICO called for views on its plans for a journalism code of practice. The ICO plans to use its current guidance as a basis for the new code which is intended to provide guidance for journalists and media organisations on how to strike the right balance between protection of privacy and personal data on the one hand, and freedom of expression and the public interest on the other.

The ICO published a blog on using biometric data following its enforcement notice against HMRC on its use of voice authenticated passwords. The ICO reminded data controllers about:

  • the requirement to be transparent and accountable when processing personal data
  • the requirement to carry out DPIAs where processing is likely to result in a high risk to the rights and freedoms of individuals
  • the requirement to implement data protection by design and default and to act upon any identified risks, and
  • the requirement to comply with the conditions around the processing of special data, including that any consent must be explicit.

The ICO updated its guidance on cookies and similar technologies in July, and published a 'myth busting' blog. Read our article for more detail.

The ICO selected ten projects for participation in the initial beta phase of its regulatory sandbox. These include:

  • TrustElevate - provider of secure authentication and authorisation for under-16s
  • NHS Digital - creating a central mechanism for managing patient consents to data sharing, and
  • Heathrow Airport - use of biometrics.

The ICO consulted on its draft Code of Practice on the use of personal data in political campaigns. The draft Code summarises applicable data protection and marketing laws but does not introduce new requirements.

In November, the ICO published new guidance to help controllers process special personal data lawfully and take necessary steps to protect it. In a blog post, the ICO reminds controllers that they require a GDPR lawful basis and an Article 9 condition for processing as well as, potentially, an associated DPA18 schedule 1 condition. Many of the DPA18 conditions also require that an appropriate policy document be put in place. This is a short document outlining compliance measures and retention policies with respect to data being processed. The ICO has provided a template policy document in its guidance.

Also in November, the ICO presented its Code of Practice on Age Appropriate Design for Information Society Services to Parliament. Due to pre-election purdah rules, it won't be published until a new government is formed. The draft raised a number of issues which data controllers will be hoping have been clarified in the final version. You can read more about the draft here.

EDPB

In January, the EDPB published an Opinion on the interplay between the GDPR and the Clinical Trials Regulation (CTR) which enters into force in 2020. It looks at the primary use of clinical trial data under the clinical trial protocol (including for reliability and safety purposes and for research), and at the secondary use of that data for other scientific purposes.

Finalised guidelines on certification were also adopted in January. The guidelines identify overarching criteria relevant to all certification mechanisms issued under Articles 42 and 43 GDPR. A number of documents around certification and codes of conduct were adopted at the EDPB's June meeting. Member State regulators can now submit their additional requirements for accreditation to the EDPB for approval.

In October, the EDPB adopted a final version of its guidelines on the scope and application of Article 6(1)(b) in the context of information society services. The guidelines discuss the lawful basis that processing is necessary for the performance of a contract to which the data subject is a party, or to take steps at the data subject's request before entering into a contract, and apply it to specific sectors including online behavioural advertising.

Draft guidelines on the processing of personal data through video devices and on data protection by design and default were published for consultation towards the end of the year. In November, the EDPB adopted a final version of the Guidelines on Territorial Scope following public consultation. The guidelines aim to provide a common interpretation of the GDPR for EEA Data Protection Authorities when assessing whether a particular processing operation by a controller or a processor falls within the territorial scope of the legal framework under Article 3 GDPR.

The EDPB also, through the course of the year, adopted opinions on Member State lists of processing operations for which DPIAs are required under Article 35(4) GDPR. The ICO's list can be found here.

Other regulators (just a few examples)

In March, the Dutch regulator commented on the use of 'take it or leave it' cookie walls which require the user to consent to tracking if they want to access a website. The regulator points out that this cannot constitute proper consent under the GDPR and the ePrivacy Regulation as it is not freely given – there is no real choice involved. The Dutch regulator's comments are unsurprising but many businesses fail to collect freely given consent to tracking cookies (and similar technologies) before dropping them, or require the user to give consent in order to access the site. The Dutch regulator has promised greater scrutiny of these practices.

Hot on the heels of the ICO's cookie guidance came the CNIL's updated guidance on cookies and similar technologies. In light of the GDPR's enhanced definition of consent, which now applies under the ePrivacy Directive, the CNIL says that continued browsing can no longer constitute consent to cookies.

In November, German data protection regulators published guidelines for calculating administrative fines under Article 83 GDPR. The guidelines set out five steps that authorities are required to follow when assessing the amount of a fine. The model is not exhaustive and is subject to review by the EDPB, and the guidelines do not bind authorities in cross-border processing cases. They have been greeted with a degree of criticism.

Interesting cases

We've seen a number of interesting cases at UK and EU level and there's still the Advocate General's Opinion on Standard Contractual Clauses in Schrems II to come.

UK

In April, the High Court upheld a claim that information provided to the claimant in response to a subject access request under s7 Data Protection Act 1998 was inadequate, and ordered the defendant to respond to the SAR in more detail. In doing so, the Court provided guidance which remains relevant under the GDPR.

In September, we discussed the Divisional Court's judgment on an application for judicial review of the police's use of Automated Facial Recognition technology. While its use did engage the Article 8(1) ECHR right to privacy, in this case it was justified by the safeguards the police had in place.

In October, we covered the Court of Appeal decision which effectively paved the way for a class action in relation to Google's 'DoubleClick' cookie and the so-called Safari Workaround. Whether or not this will open the floodgates to data breach class actions remains to be seen, but it is clear that representative actions can be used in these situations to secure a compensation pot for an indeterminate number of affected individuals.

The UK's controversial 'immigration exemption' in the DPA18 was found to be lawful in October. The claimants are seeking leave to appeal and it will be interesting to see the reaction of the EU. The immigration exemption is often cited as a stumbling block to the UK benefitting from a post-Brexit Adequacy Agreement to preserve the free flow of personal data from the EEA to the UK.

CJEU

In July, the CJEU held that website operators are joint data controllers with respect to data collected and transmitted to Facebook through a 'like' plugin which allows Facebook to collect personal data from the operator's site. The case was brought by a German consumer protection association which claimed the plugin breached then current data protection legislation (the GDPR did not apply at the time). Read more.

We reported on the Fashion ID judgment in September. The CJEU found that a website operator was a joint controller with Facebook for a limited time at the specific stage of the data processing in which it was engaged in relation to data collected through 'like' plugins. It was not a joint controller in relation to further processing over which it had no control. This meant that any legitimate interests balancing exercise had to take account of the legitimate interests of both Facebook and the website operator and that the website operator had to comply with consent and transparency requirements relating to the plugin on its site.

At the end of September, two cases on the right to be forgotten established that:

  • The right to be forgotten online is far from global. Search engines which give effect to a right to be forgotten request under the GDPR are not required to remove links to results on all versions of their search engines, but only on EEA URLs. Read more.
  • The prohibition on processing sensitive personal data (subject to exceptions and derogations) applies to search engines.
  • Where a search engine declines to give effect to a de-referencing request involving earlier stages of legal proceedings, the operator is required to adjust the list of results so that the overall picture it gives internet users reflects the current legal position. Read more.

In the Planet49 case, the CJEU, ruling in a reference from Germany, held in October that:

  • Pre-ticked checkboxes cannot be used to collect consent to cookies.
  • This is true whether or not the information stored or accessed on the users' equipment is personal data.
  • Consent to cookies must specifically relate to those cookies and cannot be bundled. Consent to (in this case) participation in a lottery will not cover cookie consent.
  • The service provider must give the user information about the cookies, including the duration of their operation and whether or not they allow third party access.

The Court's conclusions did not come as a surprise but cookie policies should be reviewed if there is any doubt as to the kind of information provided to users.

Other

In August 2019, the Higher Regional Court in Düsseldorf granted Facebook's request to suspend the effect of the Bundeskartellamt's earlier decision against it for abuse of its dominant position as a supplier of advertising space in social networks, pending an appeal by Facebook to the German Federal Court of Justice. While the Regional Court did not question Facebook's dominant position in the relevant market, it did not find any basis for abuse of that position because the infringement of data protection law did not have an anti-competitive effect. See here for more.

Enforcement

While many of the regulator fines this year actually related to breaches of the Data Protection Directive, we are starting to see some significant GDPR fines. A few of the most notable sanctions this year include:

Data Protection Directive

The French data protection regulator, the CNIL, fined Uber EUR 400,000 for breaches of the Data Protection Directive in connection with its autumn 2016 data breach. Uber was fined £385,000 by the ICO and EUR 600,000 by the Dutch regulator last year in relation to the same breach.

Facebook withdrew its appeal against a £500,000 fine handed down by the ICO in relation to failings identified during the Cambridge Analytica scandal fallout. Facebook has not admitted liability but has agreed that it will pay the fine.

GDPR

In January, the CNIL issued Google with a EUR 50m fine for breaches of the GDPR, responding to complaints made on 25 and 28 May 2018 by privacy campaign groups about the use of targeted advertising by its Android operating system. Google is appealing.

In November, the Berlin Data Protection Authority (DPA) fined real estate company Deutsche Wohnen EUR 14.5m for breaches of the GDPR. See our article for more.

This follows a EUR 18m fine of the Austrian Postal Service in October for, among other issues, processing personal data about political affiliation without a lawful basis. The fine is subject to appeal and is not yet legally binding.

These fines show that regulators are not afraid to use the full force of the GDPR for egregious and persistent breaches. Any unofficial compliance holiday is well and truly over and businesses should not rely on 'not being found out'.

Cybersecurity

Legislation and guidance

The National Cyber Security Centre (NCSC) published guidance on cybersecurity design principles intended to help ensure that networks and technologies are designed and built incorporating cybersecurity.

The UK laid Regulations before Parliament designed to enforce the EU Council Regulation on restrictive measures against cyberattacks threatening the EU or its Member States, which came into force in June.

The Cybersecurity Regulation was published in the Official Journal in June. It aims to create an EU-wide certification framework for ICT products and services and to establish ENISA as a permanent EU cybersecurity agency. The Regulation will apply from 28 June 2021 but will not apply in the UK after the end of any transitional Brexit period.

The NCSC published guidance in June to help small and medium-sized companies prepare cybersecurity response plans. It covers all stages of the plan, from preparation, to identifying the issue, resolving the incident, breach reporting, and learning from the breach. The guidance stresses the importance of assessing critical systems and having a full breach response plan, and emphasises that cybersecurity should be treated as a board-level issue.

The NCSC intends to update its Cyber Essentials Scheme by:

  • Transitioning from using five different accreditation bodies to one delivery partner, who will operate the scheme from the end of March 2020.
  • Introducing new minimum criteria for certification bodies and assessors.
  • Introducing a 12-month expiry date on certificates awarded under the scheme.

The NCSC is also looking at whether additional levels of assessment both below and above current levels are required. It advises organisations looking to implement the current scheme not to wait for the updated version.

The NCSC also published a new version (3.0) of its Cyber Assessment Framework. The changes seek to widen the use of the framework beyond operators of essential services by using more general language, suitable for a wider group of users than those specifically covered by the NIS Regulations.

EU Member States, with support from ENISA, published a risk assessment of the cybersecurity of 5G networks in October. The report identifies several security challenges with 5G networks. The Cooperation Group (established under the NIS Directive) now has to agree a toolbox of mitigation measures by 31 December 2019.

Breaches

Here is a very small selection of some of the more notable breaches and related fines this year.

In May, WhatsApp confirmed that a vulnerability had allowed cyber attackers to install surveillance software remotely onto phones and other devices. The attacks targeted a 'select number' of users and were considered to be highly sophisticated. The breach was discovered and fixed, but WhatsApp advised all users to update their apps as a precaution.

BA informed the London Stock Exchange that the ICO intends to fine it £183.39m for infringements of the GDPR. These relate to the data breach reported to the ICO in September last year which resulted in the personal data of around 500,000 customers being compromised. The ICO found that the breach was facilitated by BA's "poor security arrangements". The fine amounts to around 1.5% of BA's annual global turnover. BA expressed itself "shocked and dismayed".

The ICO also issued a notice of its intention to fine Marriott International £99,200,396 for a breach which compromised approximately 339m customer records globally.

The ICO acted as lead SA under the GDPR one stop shop mechanism when carrying out its investigations. Both BA and Marriott have already made changes to their security in cooperation with the ICO and both are entitled to appeal.

The Bulgarian tax agency suffered a huge data breach which some estimate may have compromised the personal data of most adults in the country. The agency was hacked and data including names, addresses and income was reportedly stolen. It faces fines of up to £18m as a result.

As part of the Equifax data breach settlement, concerning one of the largest reported data breaches in history, affected individuals will be entitled to a range of benefits. Monetary relief for consumers is expected to total around USD 425m.

Capital One reported what is thought to be one of the largest personal data breaches in banking history. The names, addresses and phone numbers of 106 million customers were stolen by a hacker. Credit card and account numbers were not compromised. The alleged hacker is in custody.

Biometric data of 1 million people was discovered on a publicly accessible database. Biostar 2's database, operated by Suprema, was found to be unprotected and largely unencrypted, potentially exposing a huge amount of sensitive personal data. The database is used by organisations including the Metropolitan Police, a number of banks and defence contractors.

ePrivacy

While there may have been a lot of work on the ePrivacy Regulation this year, there has been little progress. Despite being picked up by the Finnish presidency, the latest draft was rejected by COREPER in November. This means there must either be a further draft or the Regulation faces being withdrawn by the Commission. There are few who think the Regulation can be agreed any time soon, if at all.

Surveillance and facial recognition technology

As facial recognition technology becomes ever more sophisticated, uptake is increasing, and with it regulator activity.

In July, the EDPB adopted Guidelines on Video Surveillance which focus particularly on special data and cover a wide range of devices. They also look at the lawful basis of processing video surveillance data and at the household exemption and sharing with third parties.

In August, the Swedish DPA issued a fine of around EUR 20,000 to a municipality for breaching the GDPR by using a facial recognition technology (FRT) pilot to track student attendance at a school. While the school had collected consent from the data subjects, the Swedish DPA found that this could not be relied upon as a lawful basis for processing the data due to the clear imbalance in power between the data controller and the data subjects.

The ICO announced an investigation into the use of live facial recognition technology in the Granary Square and Coal Drops Yard development in Kings Cross. The use of the technology has been widely covered in the media and a number of private firms are reported to be using the technology in new developments for security purposes.

The ICO published a blog in November summarising its first published Commissioner's Opinion on the use of FRT by law enforcement in public places. The ICO calls on the government to introduce a statutory and binding Code of Practice on deployment of Live Facial Recognition systems. It also recommends more work be done by a range of organisations from business to public authorities, to eliminate bias in the LFR algorithms, in particular, that associated with ethnicity.

The ICO is also investigating the use of LFR in the private sector (including where used in partnership with law enforcement) separately and this issue is an area of focus for the EDPS and other regulators including the CNIL.

In November, the Grand Chamber of the European Court of Human Rights held that use of covert CCTV to uncover theft by employees did not infringe the Article 8 right to privacy although the claimants may still have remedies under data protection law. See our article for more.

Adtech

Adtech has been under the regulatory spotlight this year, not only regarding data protection, but also from competition regulators. Read more about regulatory views here.

There are also a number of complaints pending before EU data protection regulators relating to GDPR compliance by data brokers and adtech companies, some of which also take aim at the IAB framework and have led to full blown investigations. Unsurprisingly, these focus largely on the selection of the lawful basis for the processing, namely consent and legitimate interests, and on compliance with the data protection principles, particularly transparency, purpose limitation and data retention. Read more in our article.

Following its fact-finding forum, the ICO published an update report into adtech and real time bidding (RTB), summarising its findings so far. Its view is that the adtech industry presents a number of challenges to good data protection practices. The report focuses on issues around transparency and consent (in relation to special data, as a lawful basis, and to cookies), as well as on the data supply chain. The ICO recognised there was more work to be done on resolving the tensions between adtech business models and data privacy law and said it would take an additional six months to gather information and review its position. We are likely to see a further report early next year.

This is going to be even more of a focus if the ePrivacy Regulation is finally passed in 2020 (although we're not going to put money on that).

AI

Another sector focus for regulators during 2019 was AI.

The Committee of the Council of Europe's Convention 108 (the convention on the protection of individuals with regard to the automatic processing of personal data) published guidelines on AI and data protection. The guidelines are intended to help policy makers, product developers and service providers ensure that AI applications incorporate data protection by design.

In April, the National Cyber Security Centre published guidance for those looking to use an off-the-shelf security tool that employs AI as a core component. It also provides information for those developing in-house AI cybersecurity tools and may have wider application for those using AI for non-security business functions.

The ICO's call for input into the ICO AI framework and guidance closed in November and publication of a consultation draft is expected by January 2020. The consultation was accompanied by a series of blog posts looking at different challenges around the use of personal data by AI.

The government asked the ICO and the Alan Turing Institute (The Turing) to produce practical guidance to help organisations explain AI decisions to affected individuals. As a result, the ICO and The Turing undertook Project ExplAIn, conducting research with stakeholders, and published an interim report in June.

Online harm and data ethics

2019 saw a lot of attention devoted to managing online harms. While this covers a wide range of issues, some of them are data-related as regulatory and government activity demonstrates.

The Centre for Data Ethics and Innovation (CDEI) published two interim reports on its reviews of online targeting and bias in algorithmic decision making in July. Another major objective of the CDEI is to identify current gaps in regulation and to produce a 'State of the Nation' report on the use and governance of data.

The ICO published its response to the government's White Paper on Online Harms in July. The Information Commissioner said she was "surprised and disappointed" at the lack of engagement with the harm of electoral interference and the need for greater transparency in online political advertising and micro-targeting. The ICO is not against the idea of an independent regulator but says that relevant regulators (including the ICO) need to work together, and all need effective and comparable enforcement powers. The ICO suggested a coordinated approach might be a more effective model than having a single or 'super' digital regulator.

The parliamentary Joint Committee on Human Rights published a report on the Right to Privacy (Article 8) and the Digital Revolution in November, suggesting the 'consent model' for data is "broken". It says that it's almost impossible for people to be fully informed about what is happening to their data and the onus shouldn't be on them to discover it. It suggests that this issue is magnified with regard to children and also criticises businesses which offer no alternatives to consent and no granularity. The Committee also criticises the legitimate interests lawful basis and says there is insufficient clarity around when the interests of an organisation override those of the individual. While recognising that the GDPR aims to protect privacy, the Committee questions whether there are sufficient resources to enforce it effectively. It recommends full adoption of the UN Guiding Principles on Business and Human Rights and also urges the government to include data protection within the scope of its online harms review and any subsequent legislation or regulatory measures.

The use of personal data to influence elections has been another area of focus. The EC adopted a Regulation aimed at preventing the misuse of personal data to influence the European Parliament elections, and political advertising on social media has received attention around Europe. In the UK, the ICO's consultation on its draft Code of Practice for the use of personal data in political advertising closed in October. A final version is expected next year. The ICO has also published blogs on the subject and, in the run-up to the UK general election, has written to the UK's political parties to remind them about the need to comply with data protection and electronic marketing laws. Twitter has banned political advertising and Google has brought in restrictions.

