Services which provide user-generated videos to the public, without exercising editorial responsibility over them, and which have an establishment in the EU (including the UK) will shortly be subject to new notification and content obligations.
For the first time in the UK, services which provide access to user-generated videos will be subject to statutory regulation even where they do not have editorial responsibility over the videos. The 2018 Audiovisual Media Services Directive extends the EU's audiovisual regulatory framework to "video-sharing platforms" (VSPs). One of the main aims of the Directive is to protect users, especially minors, from certain forms of illegal and harmful audiovisual content online, and that protection now extends to content on VSPs.
VSPs must adopt appropriate measures to protect minors from harmful content, and all users from content containing incitement to violence or hatred, or criminal activity. Adverts on VSPs are also subject to new obligations.
This note explains which services are VSPs, when VSPs will be subject to the regulation, what the Directive requires of VSPs and how the regulation of VSPs compares with other regulations.
While the implementation deadline for the Directive was 19 September 2020, the UK has not, at the time of writing, passed the necessary statutory instrument to carry out the implementation. Implementation in the UK is likely to take place in the autumn.
VSPs are services where the principal purpose of the service, the principal purpose of a dissociable section of it, or an essential functionality of it is to provide programmes, user-generated videos, or both, to the general public in order to inform, entertain or educate.
Importantly, a service can be a VSP, and subject to the regulation, even if the provider doesn't have editorial responsibility over the choice or substance of the programmes or videos; it is enough that the provider determines their organisation. Organisation can be achieved, for example, by automatic means or algorithms, such as displaying, tagging and sequencing the content. Simply organising user-generated videos on a website which people watch to entertain themselves can therefore be sufficient.
The European Commission has published non-binding guidelines on the practical application of the essential functionality criterion. Particular attention should be given to:
The guidelines also identify four non-cumulative categories of relevant indicators which national regulators should consider when assessing whether the essential functionality of a service makes it a VSP:
This 2019 study carried out for the Department for Digital, Culture, Media and Sport (DCMS) shows the different nuances in the VSP definition. For example, in the view of its authors: Twitch's principal purpose is video sharing; Vimeo's principal purpose is not video sharing, but a dissociable section of its service is; and video sharing is an essential functionality of Snapchat, but not its principal purpose or the purpose of a dissociable section.
The UK's currently designated regulator, Ofcom, has called for evidence on how to apply the new VSP requirements. The closing date for responding is 24 September 2020. Ofcom recognises that VSP regulation is novel and untested (both nationally and internationally), so it seeks to work constructively with stakeholders to build sound online regulation.
A VSP will be subject to UK or EU jurisdiction if it is "established" there. Establishment is where the provider "effectively pursues an economic activity using a fixed establishment for an indefinite period". Ofcom suggests that this test will usually be determined by reference to where the service is headquartered; that country's regulator will have jurisdiction over the service. It suggests, for example, that TikTok and Twitch could likely be within UK jurisdiction, whereas YouTube, Facebook and Twitter would be expected to be within Irish jurisdiction if they meet the definition of a VSP. The authors of the DCMS study concluded that the following VSPs are under, or potentially under, UK jurisdiction: Twitch.tv, Vimeo, TikTok, LiveLeak and Snapchat.
Where VSP providers are not established anywhere in the EU, they shall be deemed established in the Member State where a parent or a subsidiary is established or where an undertaking in their corporate group is established. There is also further detail on how to work out jurisdiction where the parent/subsidiary/other group undertaking are each established in a different Member State.
This deemed establishment test seems to set a low bar for a Member State to claim jurisdiction over VSP providers and means that, even if the provider of the service is outside the EU or UK, the Directive could still apply to it if one of its group companies is within the EU or UK.
The UK government intends to publish further rules in the autumn to clarify the basis on which services will be deemed established in the UK following expiry of the Brexit transition period on 1 January 2021. This will be particularly important for services which may, during 2020, be deemed established in one of the 27 EU member states; from 1 January 2021 such services may, if they have a secondary establishment in the UK, need to comply with the UK regime as well as that of the EU member state of primary establishment.
VSPs established in the UK will have to notify Ofcom that they provide a VSP service under UK jurisdiction. VSPs established in EU states will have to notify their designated regulator. There is a grace period in the UK for existing services, however, which will have until 6 May 2021 to notify. Ofcom intends to publish guidance on scope and jurisdiction in advance to help services determine whether they need to notify. From April 2022, services will be required to pay a fee, the amount of which is yet to be determined.
Because of the limited way in which VSP providers are involved with the content on their platforms, the VSP obligations relate to how content is organised on the platform, rather than to the content "as such". In respect of the content itself, VSP providers can still avail themselves of the safe harbours in articles 12–15 of the E-Commerce Directive.
VSPs must take appropriate measures to protect minors from content that may impair their physical, mental or moral development, so that minors do not normally hear or see it. The appropriateness of the measures will be determined in light of the nature of the content, the harm it may cause and the persons to be protected, as well as the rights and legitimate interests at stake, including those of the VSP, the uploading users and the general public interest. These measures must be proportionate relative to how harmful the content is, with the most harmful content (like gratuitous violence or pornography) being subject to the strictest measures (like encryption).

VSPs must also protect the general public (and minors) from programmes, user-generated videos and commercial communications (with stricter obligations for adverts controlled by the VSPs) inciting violence or hatred, or which constitute a criminal offence under EU law, such as public provocation to commit a terrorist offence or offences concerning child sexual exploitation and abuse or racism and xenophobia.
Solutions may range from systematic content descriptors (or other means of description) and visual symbols to age verification tools, transparent and user-friendly content reporting mechanisms and parental control systems. More specifically, the Directive lists a suite of possible appropriate measures (a combination of which can ensure compliance), namely:
Ofcom will issue guidance on the application of these measures.
The measures should be practicable and proportionate, and what is appropriate will depend on the size of the VSP and the nature of its service. Member States may, if they wish, impose stricter or more detailed measures. VSP providers shall not be required to introduce ex ante measures or upload-filtering which does not comply with the E-Commerce Directive's prohibition on general monitoring.
There are specific requirements in relation to advertising on VSPs, including as to recognisability and prohibited or restricted products or practices. VSPs must make available a function for uploaders to declare the presence of advertising, and must communicate the presence of such advertising to users (measure 2 above). Where advertising is not within the control of the VSP, it will still be required to have appropriate measures in place.
Sanctions available to Ofcom for non-compliance will include the ability to issue legally binding decisions, to set out required remedial steps, to impose financial penalties of up to 5% of applicable qualifying revenue and to issue a direction to suspend or restrict the entitlement to provide a VSP. Ofcom has, nevertheless, said that it does not generally expect to take formal enforcement action before summer 2021, unless there are serious instances of egregious or illegal harm.
It is important to distinguish VSPs from another type of video on demand service, on-demand audiovisual media services (ODAMS), which are already regulated in the EU. The key differences for present purposes between the regulation of VSPs and ODAMS are:
The UK's proposed Online Harms legislation will introduce wide-reaching duties of care on services which contain user-generated content to protect their users from "harm", whether the activity causing such harm is lawful or unlawful. Draft legislation has been held up but is expected in early 2021, with a view to coming into force in late 2021. The UK government had initially planned to implement the Directive's VSP requirements through the Online Harms framework, and its intention is that the current regime will remain in place "until such time as the new Online Harms regulatory framework comes into force". So it is important to keep an eye on how the Online Harms legislation progresses, especially as the White Paper proposed that the harms in scope be much broader than those to which this Directive applies, and that the legislation extend to companies without a legal presence in the UK, given the "particularly serious nature of some of the harms in scope and the global nature of many online services".