Authors

Christopher Jeffery, Partner
Graham Hann, Partner
Vinod Bange, Partner
Siân Skelton, Partner
Paul Glass, Partner
Debbie Heywood, Senior professional support lawyer
Glyn Morgan, Partner
Angus Finnegan, Consulting partner
Martin Cotterill, Partner

9 December 2019

Radar - December 2019 – 2 of 8 Insights

Radar - December 2019: Tech

2019 saw a considerable focus on online harms, bringing emerging technologies such as AI and drones into the spotlight, along with the market practices of the tech giants, particularly around adtech and data dominance.

Online harms

There were a great many reports produced on online harms over the course of the year, particularly in relation to children and vulnerable people. It remains to be seen how many of the recommendations made are actually followed through.

Select Committee report on the impact of social media and screen-use on the health of young people

The House of Commons Select Committee on Science and Technology published a report on the impact of social media and screen-use on young people's health in January. Recommendations included:

  • A statutory duty of care for social media companies registered in the UK towards users under 18 to avoid identified harms.
  • A statutory code of practice to provide consistency on content-reporting practices and moderation mechanisms, backed up by a sanctions and enforcement regime to be implemented by Ofcom and the ICO.
  • Social media companies to publish transparency reports every six months.
  • Government to consider legislation similar to Germany's Network Enforcement Act (NetzDG), which requires social media companies to act on reported content within 24 hours.

DCMS Committee report on online disinformation and fake news

The DCMS Parliamentary Committee published a final report on disinformation and fake news in February. The report called for:

  • A compulsory code of ethics for tech companies overseen by an independent regulator.
  • The power for the regulator to launch legal action against companies breaching the code.
  • Government to reform current electoral communications laws and rules on overseas involvement in UK elections.
  • Social media companies to be required to take down known sources of harmful content including proven sources of disinformation.
  • Consideration of how UK law should define digital campaigning including having a definition of online political advertising.
  • A publicly accessible and searchable repository for political advertising items so the public can see who is funding or sponsoring them and the target audience.

The report also found that current electoral law is not fit for purpose and that Facebook had knowingly violated data protection and competition law and should be investigated by the ICO and the CMA. In addition, the Committee recommended establishing a new category of technology company which would not necessarily be either a platform or publisher.

The government responded underlining its support for a CMA investigation into digital advertising as recommended not only by the Select Committee report but also by the Cairncross Review on the future of journalism, and the Furman Review of digital competition, both of which were also published this year. The CMA has said it is actively considering further work in this area but that it is dependent on the progress of the Brexit process.

EC Expert Group on Safer Internet for Children

In February, the EC set up an Expert Group on Safer Internet for Children which includes representatives from all EU and EEA Member States. The Commission also published a report on the evaluation of the Alliance to Better Protect Minors Online in early February. The Alliance is a multi-stakeholder platform through which member companies make commitments to address emerging risks that minors face online.

House of Lords Communications Committee recommends digital super-regulator

In March, the House of Lords Communications Committee published a report following its inquiry into whether or not the internet should be regulated. The House of Lords said self-regulation is failing and called for the appointment of a digital super-regulator. The new Digital Authority should be guided by a charter of ten basic principles of online regulation. It should be relatively hands-off, leaving enforcement to the ICO, Ofcom and the ASA, while providing oversight of the full spectrum of regulation and recommending new legislation to fill any gaps or inconsistencies.

The Lords also recommended changes to existing legislation including:

  • A new public interest test for data-driven mergers and acquisitions to protect users from having their data sold without their consent.
  • Powers for the ICO to audit tech companies' use of algorithms in decision making.
  • A requirement for services to default to strictest privacy and safety settings.

Online Harms white paper

In April we reported on the government's long-awaited White Paper on Online Harms. The government is proposing a new regulatory framework which will increase the responsibilities of operators to tackle harmful content and activities online. The proposals, set out in the Online Harms White Paper, will apply to any operator which allows users to share or discover user-generated content (UGC) or interact with each other online. It therefore covers a broad range of operators including social media platforms, press publishers that host UGC, cloud hosting providers and retailers who allow users to review products online.

Under the proposals, a new statutory duty of care will be imposed on operators, overseen by an independent regulator. The regulator will set out how operators can comply with that duty of care in Codes of Practice, which will include obligations proactively to monitor or scan for certain tightly defined categories of illegal content. Failure to comply with the duty of care could lead to significant fines and individual liability for senior management. The net effect would be that online operators could no longer rely solely on the safe harbour provision in the eCommerce Directive (that they are merely acting as hosts) to avoid liability for certain types of harmful content.

Government Code of Practice for social media companies

April also saw the DCMS publish a statutory Code of Practice for providers of online social media as required under s103 of the Digital Economy Act 2017.

The Code does not deal with illegal or unlawful content but sets out actions social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviour on their sites. The Code is directed at providers of social media platforms but is also relevant to sites hosting UGC, including review websites, gaming platforms and online marketplaces.

Ofcom paper on Online market failures and harms

Ofcom published a paper on 'Online market failures and harms' in October. It sets out a broad overview of online policy issues, explaining how economic issues and market failures in online services may cause harm to individuals. Market power, barriers to switching, information asymmetry and behavioural biases are some of the market failures Ofcom looks at, suggesting these can lead to harms including competition issues, fraudulent or unfair business practices, and privacy issues. Ofcom highlights the importance of addressing all market failures that are the source of online harms, and the benefits of regulators working together to address issues.

Drones

UK

The House of Commons Science and Technology Committee published a report on commercial and recreational drone use in the UK in October. The Committee made a number of recommendations including clarification of 'no fly' zones and exemptions, looking into geo-fencing around sensitive areas like prisons, and considering making drones traceable. The Committee also looked at investment and research opportunities and at the proposed registration scheme.

The government launched its Counter-unmanned aircraft strategy a few weeks later. The strategy focuses on combatting the malicious use of drones for high levels of harm, for example terrorism, facilitating crime and disrupting critical infrastructure.

On 5 November, new rules came into force for UK drone users. Users of drones over 250g must now sit an online test and pay an annual registration fee, and owners of such drones must identify and label them with a unique licence number.

EU

In June, the EC published two new Regulations on drones in the Official Journal. They came into force on 1 July 2019 and will apply from 1 July 2020, although some provisions have a two-year implementation period. The Regulation on the rules and procedures for the operation of unmanned aircraft includes a registration requirement for operators of drones above a set weight. The Regulation on unmanned aircraft systems and on third-country operators of unmanned aircraft systems sets out rules on the use of drones and on third-country operators operating within the single European airspace.

AEVs

The BSI published a new cybersecurity standard for self-driving vehicles in December 2018, intended to set a marker for those developing the technology and following on from government recommendations and principles. Car manufacturers can use the standard to demonstrate they are following the government's principles but it is not mandatory.

The government has also consulted on an updated code of practice for the trialling of automated vehicles. The code proposes that businesses looking to trial driverless cars should provide a detailed safety case including a variety of relevant information such as the trial activity, vehicles involved and evidence that the trial can proceed safely. A summary of the case will need to be made publicly available.

AI

Governments and regulators are racing to come up with ethical principles for AI as the law struggles to keep up with the pace of development.

In July, the EC's expert group on AI published its policy recommendations: 33 high-level recommendations for action by policy makers at EU and Member State level. These include developing a ten-year action plan to review and enhance laws relating to AI and looking at needs in different contexts (eg B2B, B2C, public sector). The recommendations coincided with the Commission's launch of the pilot phase of its ethics guidelines for trustworthy AI, which were agreed in April. The pilot phase allows organisations to test the assessment list, which will be revised in early 2020 following feedback.

The Organisation for Economic Co-operation and Development (OECD) agreed principles and recommendations on AI which have been adopted by 42 countries. The principles complement existing OECD standards in areas like privacy, responsible business conduct and risk management.

In the UK, AI has been a focus for the Information Commissioner (see section on data).

Adtech

Adtech has been under the spotlight of both competition and data protection regulators this year, with studies, guidance and, in the case of Google, some significant fines (which are subject to appeal).

As part of its Digital Markets Strategy, the CMA carried out a market study into online platforms and digital advertising. The purpose of the study is to consider the extent to which online platforms which rely on digital advertising revenues may have an adverse effect on consumers through the use of digital advertising and assess the potential for remedying or mitigating any adverse effects. The CMA is assessing the nature of competition in the B2B and B2C markets.

The CMA envisages that any remedies needed to address competition issues are likely to require new legislation in line with recommendations in the Furman Report. Any new framework would require monitoring and need to be sufficiently flexible to cope with new technology. The CMA will conduct an evidence gathering exercise across a broad spectrum of stakeholders and issues over the next year.

The study is a central element of the CMA's Digital Markets Strategy, published on 3 July 2019, which also focuses on:

  • Consumer and antitrust enforcement and merger assessment.
  • The work of the CMA's Data, Technology and Analytics Unit.
  • How the CMA plans to adapt its mergers approach to digital markets.
  • Considerations around creating a Digital Markets Unit.
  • Making best use of the CMA's enforcement tools.

Social media influencing

In January, the CMA published guidance for social media influencers giving do's and don'ts for ensuring it is clear when they are being incentivised to endorse or review products.

The ASA published a report on its research into consumer understanding of social media influencing advertising in April. The ASA is not recommending regulatory changes. While it does suggest the minimum requirement for influencers is to use #ad in their posts, best practice will involve considering how placement, visibility and wording of labels will impact consumer understanding. The ASA will be focusing on ensuring influencers and brands are upfront and clear.

Content

ECtHR case on liability for hyperlinks

At the end of last year, the European Court of Human Rights considered whether someone linking to illegal or defamatory content is responsible for that third party content; in this case, the issue was linking to defamatory content. The Court held that there was no automatic liability for third party content when linking to it. An individual assessment is required in each case. Of particular relevance would be whether the material was linked to or accompanied by an endorsement or repetition of the offending content, as well as the intent and knowledge of the linker.

Posting hyperlinks to offensive content on YouTube breached Communications Act 2003

An application for judicial review concluded in November that the applicant had been rightfully convicted of three offences contrary to s127(1) Communications Act 2003, which makes it an offence to send a grossly offensive message via a public electronic communications network or cause any such message to be sent.

The applicant had written a blog in which she posted hyperlinks to YouTube videos of herself performing grossly offensive antisemitic songs. She argued it was wrong to convict her as posting a link was a neutral act which did not cause an offensive message to be sent, and that in uploading a YouTube video, she had only sent the video to a server in California, an inanimate object with which communication was impossible.

The court considered that the offence was complete at the time the message was sent because the actus reus of the offence was the sending of the message with an intention to insult – the offence did not require the message to be received, whether by a human or otherwise. It also held that the applicant had endorsed the content so her action was not neutral or passive as she had posted the hyperlink knowing its content and intending others to view it.

Courts can order platforms, websites and apps to take down identical and equivalent illegal content

In October, we reported on a CJEU case which held that the general prohibition on monitoring under Article 15 of the eCommerce Directive does not preclude a Member State court from ordering an ISP to remove identical or equivalent information, provided that:

  • the monitoring and search for the information concerned by the injunction are limited to information conveying a message whose content remains essentially unchanged from the original information
  • the information contains the elements specified in the injunction, and
  • the differences in wording do not require the ISP to carry out an independent assessment of that content.

Nor does Article 15 preclude a Member State court from granting an order worldwide within the framework of the relevant international law.


In this series

  • Radar - December 2019 (Technology, media & communications) – by multiple authors
  • Radar - December 2019: Tech (Technology, media & communications) – by multiple authors
  • Radar - December 2019: Data privacy (Data protection & cyber) – in-depth analysis by Debbie Heywood
  • Radar - December 2019: Consumer (Consumer & retail) – by multiple authors
  • Radar December 2019: Games and eGaming (Gaming, eGaming & gambling) – by multiple authors
  • Radar - December 2019: Communications (Telecommunications) – by multiple authors
  • Radar - December 2019: Digital Single Market (Technology, media & communications) – by multiple authors
  • Radar - December 2019: Other developments (Technology, media & communications) – by multiple authors
