9 December 2019
Radar - December 2019 – 2 of 8 Insights
2019 saw a considerable focus on online harms, bringing emerging technologies such as AI and drones into the spotlight, along with the market practices of the tech giants, particularly around adtech and data dominance.
There were a great many reports produced on online harms over the course of the year, particularly in relation to children and vulnerable people. It remains to be seen how many of the recommendations made are actually followed through.
In January, the House of Commons Select Committee on Science and Technology published a report on the impact of social media and screen-use on young people's health. Recommendations included:
In February, the DCMS Parliamentary Committee published its final report on disinformation and fake news. The report called for:
The report also found that current electoral law is not fit for purpose and that Facebook had knowingly violated data protection and competition law and should be investigated by the ICO and the CMA. In addition, the Committee recommended establishing a new category of technology company which would not necessarily be either a platform or publisher.
The government responded underlining its support for a CMA investigation into digital advertising as recommended not only by the Select Committee report but also by the Cairncross Review on the future of journalism, and the Furman Review of digital competition, both of which were also published this year. The CMA has said it is actively considering further work in this area but that it is dependent on the progress of the Brexit process.
In February, the EC set up an Expert Group on Safer Internet for Children which includes representatives from all EU and EEA Member States. The Commission also published a report on the evaluation of the Alliance to Better Protect Minors Online in early February. The Alliance is a multi-stakeholder platform through which member companies make commitments to address emerging risks that minors face online.
In March, the House of Lords Communications Committee published a report following its inquiry into whether the internet should be regulated. The House of Lords said self-regulation is failing and called for the appointment of a digital super-regulator. The new Digital Authority should be guided by a charter of ten basic principles of online regulation. It should be relatively hands off, leaving enforcement to the ICO, Ofcom and the ASA. It should provide oversight of the full spectrum of regulation and recommendations for new legislation to fill any gaps or inconsistencies.
The Lords also recommended changes to existing legislation including:
In April we reported on the government's long-awaited White Paper on Online Harms. The government is proposing a new regulatory framework which will increase the responsibilities of operators to tackle harmful content and activities online. The proposals, set out in the Online Harms White Paper, will apply to any operator which allows users to share or discover user-generated content (UGC) or interact with each other online. It therefore covers a broad range of operators including social media platforms, press publishers that host UGC, cloud hosting providers and retailers who allow users to review products online.
Under the proposals, a new statutory duty of care will be imposed on operators, which will be overseen by an independent regulator. The regulator will set out how operators can comply with that duty of care in Codes of Practice. It will include obligations proactively to monitor or scan for certain tightly defined categories of illegal content. Failure to comply with the duty of care could lead to significant fines and individual liability for senior management. The net effect would be that online operators could no longer rely solely on the safe harbour provision in the eCommerce Directive (that they are merely acting as hosts) to avoid liability for certain types of harmful content.
April also saw the DCMS publish a statutory Code of Practice for providers of online social media as required under s103 of the Digital Economy Act 2017.
The Code does not deal with illegal or unlawful content but sets out actions social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviour on their sites. The Code is directed at providers of social media platforms but is also relevant to sites hosting UGC, including review websites, gaming platforms and online marketplaces.
Ofcom published a paper on 'Online market failures and harms' in October. It sets out a broad overview of online policy issues, explaining how economic issues and market failures in online services may cause harm to individuals. Market power, barriers to switching, information asymmetry and behavioural biases are some of the market failures Ofcom looks at, suggesting these can lead to harms including competition issues, fraudulent or unfair business practices, and privacy issues. Ofcom highlights the importance of addressing all market failures that are the source of online harms, and the benefits of regulators working together to address issues.
The House of Commons Science and Technology Committee published a report on commercial and recreational drone use in the UK in October. The Committee made a number of recommendations including clarification of 'no fly' zones and exemptions, looking into geo-fencing around sensitive areas like prisons, and considering making drones traceable. The Committee also looked at investment and research opportunities and at the proposed registration scheme.
The government launched its Counter-unmanned aircraft strategy a few weeks later. The strategy focuses on combatting the malicious use of drones for high levels of harm, for example terrorism, facilitating crime and disrupting critical infrastructure.
On 5 November, new rules came into force for UK drone users. Users of drones over 250g must now sit an online test and pay an annual registration fee. Owners of drones over 250g must now identify and label them with a unique licence number.
In June, the EC published two new Regulations on drones in the Official Journal which came into force on 1 July 2019 and will apply from 1 July 2020, although some provisions have a two-year implementation period. The Regulation on the rules and procedures for the operation of unmanned aircraft includes a registration requirement for operators of drones above a set weight. The Regulation on unmanned aircraft systems and on third-country operators of unmanned aircraft sets out rules on the use of drones and for third-country operators operating within the single European airspace.
The BSI published a new cybersecurity standard for self-driving vehicles in December 2018, intended to set a marker for those developing the technology and following on from government recommendations and principles. Car manufacturers can use the standard to demonstrate they are following the government's principles but it is not mandatory.
The government has also consulted on an updated code of practice for the trialling of automated vehicles. The code proposes that businesses looking to trial driverless cars should provide a detailed safety case including a variety of relevant information such as the trial activity, vehicles involved and evidence that the trial can proceed safely. A summary of the case will need to be made publicly available.
Governments and regulators are racing to come up with ethical principles for AI as the law struggles to keep up with the pace of development.
In July, the EC's expert group on AI published its policy recommendations, consisting of 33 high-level recommendations for action by policy makers at EU and Member State level. These include developing a ten-year action plan to review and enhance laws relating to AI and looking at needs in different contexts (eg B2B, B2C, public sector). The recommendations coincided with the Commission's launch of the pilot phase of its ethics guidelines for trustworthy AI which were agreed in April. The pilot phase allows organisations to test the assessment list which will be revised in early 2020 following feedback.
The Organisation for Economic Co-operation and Development (OECD) agreed principles and recommendations on AI which have been adopted by 42 countries. The principles complement existing OECD standards in areas like privacy, responsible business conduct and risk management.
In the UK, AI has been a focus for the Information Commissioner (see section on data).
Adtech has been under the spotlight of both competition and data protection regulators this year with studies, guidance and, in the case of Google, some significant fines (which are subject to appeal). See here for more.
As part of its Digital Markets Strategy, the CMA carried out a market study into online platforms and digital advertising. The study considers the extent to which online platforms that rely on digital advertising revenues may have an adverse effect on consumers through the use of digital advertising, and assesses the potential for remedying or mitigating any adverse effects. The CMA is assessing the nature of competition in both the B2B and B2C markets.
The CMA envisages that any remedies needed to address competition issues are likely to require new legislation in line with recommendations in the Furman Report. Any new framework would require monitoring and need to be sufficiently flexible to cope with new technology. The CMA will conduct an evidence gathering exercise across a broad spectrum of stakeholders and issues over the next year.
The study is a central element of the CMA's Digital Markets Strategy, published on 3 July 2019, which also focuses on:
In January, the CMA published guidance for social media influencers, setting out dos and don'ts for ensuring it is clear when they are being incentivised to endorse or review products.
The ASA published a report on its research into consumer understanding of social media influencing advertising in April. The ASA is not recommending regulatory changes. While it does suggest the minimum requirement for influencers is to use #ad in their posts, best practice will involve considering how placement, visibility and wording of labels will impact consumer understanding. The ASA will be focusing on ensuring influencers and brands are upfront and clear.
At the end of last year, the European Court of Human Rights considered whether someone linking to illegal or defamatory content is responsible for that third party content; in this case, the issue was linking to defamatory content. The Court held that there was no automatic liability for third party content when linking to it. An individual assessment is required in each case. Of particular relevance would be whether the material was linked to or accompanied by an endorsement or repetition of the offending content, as well as the intent and knowledge of the linker.
In November, an application for judicial review concluded with the court finding that the applicant had been rightfully convicted of three offences contrary to s127(1) Communications Act 2003, which makes it an offence to send a grossly offensive message via a public electronic communications network or cause any such message to be sent.
The applicant had written a blog in which she posted hyperlinks to YouTube videos of herself performing grossly offensive antisemitic songs. The applicant argued it was wrong to convict her as posting a link was a neutral act which did not cause an offensive message to be sent and, in uploading a YouTube video, she had only sent the video to a server in California, an inanimate object with which communication was impossible.
The court considered that the offence was complete at the time the message was sent because the actus reus of the offence was the sending of the message with an intention to insult – the offence did not require the message to be received, whether by a human or otherwise. It also held that the applicant had endorsed the content so her action was not neutral or passive as she had posted the hyperlink knowing its content and intending others to view it.
In October, we reported on a CJEU case which held that the general prohibition on monitoring under Article 15 of the eCommerce Directive does not preclude a Member State court from ordering an ISP to remove identical or equivalent information, provided that:
Nor does Article 15 preclude a Member State court from granting an order worldwide within the framework of the relevant international law.
by Multiple authors