
The changing role of internet intermediaries

Article 17 of the new Copyright Directive has been accused of censoring freedom of expression and "breaking the internet" by making platforms caught by the article directly liable for infringing content uploaded by users. But who is caught by Article 17 and how much does it really change the way internet platforms will need to operate in practice?

October 2019

The "value gap" proposal in Article 17 (previously Article 13) of the new Copyright Directive (2019/790) was one of the most controversial proposals of the legislation. It effectively makes platforms caught by the provision directly liable for infringing content uploaded by users and requires them to get authorisation from rightsholders from 7 June 2021.

Who does Article 17 apply to?

The short answer is that Article 17 applies to online content sharing platforms, but there are some exceptions and nuances.

Specifically, Article 17 applies to platforms that, as one of their main purposes, "store and enable users to upload and share a large amount of copyright-protected content with the purpose of obtaining profit therefrom, either directly or indirectly, by organising it and promoting it in order to attract a larger audience, including by categorising it and using targeted promotion within it".

The "main purposes" requirement means that electronic communication services, cloud services providers, and online marketplaces whose main activity is online retail (along with other platforms with a different "main purpose") fall outside the scope of Article 17. There are also specific exceptions for open source development and sharing platforms, non-profit scientific and educational repositories, and non-profit online encyclopaedias.

Start-up platforms are not completely exempted from Article 17, but those that have been available in the EU for less than three years and have an annual turnover under EUR 10 million have lesser obligations than other affected platforms.

What does Article 17 require and how will that change the way internet platforms operate?

Firstly, and importantly, Article 17 states that platforms caught by it are directly liable for copyright infringement in relation to content uploaded by users. They do not benefit from the hosting exemption provided in Article 14 of the E-Commerce Directive (2000/31). The liability comes about when the platform provides the public with access to content – this amounts to a "communication to the public" that requires authorisation from the rightsholder.

How does this change things?

The Directive itself purports to "clarify" the law, which suggests that Article 17 simply outlines the legal position that already applies to those platforms falling within its scope. That view draws some support from the CJEU's decision in Ziggo, which held that the operators of The Pirate Bay themselves communicated works to the public. However, we think this overstates the level of certainty in the pre-Directive legal position, where many issues remain open to interpretation.

On the other hand, The Pirate Bay was a rather extreme example and is arguably not comparable to other, more neutral, online content sharing platforms which contain a substantial proportion of non-infringing content. Questions regarding platforms' liability for copyright infringement and the scope of the hosting exemption are currently before the CJEU in the YouTube case, where the Copyright Directive's claim to merely "clarify" the law will be tested.

In practice, the clarification in Article 17 will benefit rightsholders. Platforms that fall within the scope of Article 17 will no longer be able to simply disclaim liability on the basis that content has been posted by users in automated circumstances in which the platform has no knowledge, or make blanket claims that the hosting exemption applies.

Instead, they will need to engage substantively with rightsholders' claims, for example by explaining why they do not "organise and promote" the content in question.

If platforms do not have authorisation, they may be able to take advantage of the additional safe harbour introduced by Article 17

Although Article 17 makes platforms directly liable for copyright infringement in user-uploaded content, it also provides an additional, narrower, safe harbour. This applies if the platform can demonstrate that it has:

  • used "best efforts" to obtain authorisation from the rightsholder
  • made "best efforts" in accordance with high industry standards of professional diligence to ensure specific works and subject matter are unavailable on the platform if the rightsholder has provided it with "the relevant and necessary information" (certain smaller start-up platforms are not subject to this obligation)
  • acted "expeditiously" on receiving a "sufficiently substantiated notice" from rightsholders to remove or disable access to the notified works or other subject matter, and made "best efforts" to prevent their future upload.

The Directive is, however, clear that these obligations should not result in platforms being required to take down or prevent access to non-infringing works, or having generally to monitor user content.

How does this change things?

There are clearly some unanswered questions of interpretation which will affect how these provisions apply. The first practical effect of the new safe harbour is that it will shift the argument between platform and rightsholder away from whether there is any liability to begin with and whether the hosting exemption applies, towards whether the parties have done what the new safe harbour provisions require of them.

"Best efforts" to obtain authorisation

Here, platforms may well fall down at the first hurdle – demonstrating that they have used "best efforts" to obtain authorisation. This is an entirely new concept that did not apply to liability assessments under the hosting exemption.

"Best efforts" in English law contracts is normally interpreted to mean that the party must do everything that it possibly can, at whatever cost, to achieve the promised aim. It would be surprising if EU lawmakers intended Article 17 to put all of the bargaining power in the rightsholder's hands in this manner, but even on a natural reading "best efforts" must mean something much greater than a superficial effort to identify and negotiate with the rightsholder (or, more likely, a large multitude of rightsholders).

The Directive suggests that "best efforts" are to be judged taking into account the principle of proportionality, including the platform's size and the costs of taking certain steps. Presumably then, in seeking to obtain authorisation, platforms cannot be expected to take steps that would render their business unviable, but they may be expected to deploy a much larger proportion of their revenues than they currently do in obtaining the necessary copyright authorisations.

"Best efforts" can apply in at least three stages of the "obtaining authorisation" process:

  • First, the platform would have to identify the works available and the relevant rightsholders.
  • Second, it would have to find those rightsholders.
  • Third, it would have to negotiate with the rightsholders it finds.

For some industries this may be easier than for others – one would assume that platforms should at least use works databases made available by collecting societies, check orphan works databases, and then be prepared to negotiate on the basis of generally available licences for their uses.

"Best efforts" to ensure certain content is unavailable

Assuming the platform is able to meet this first requirement, it must then use "best efforts" to ensure that it does not make content available against the rightsholder's express wishes. It is this "unavailability" obligation that has prompted lobbyists and commentators to claim that Article 17 requires platforms to implement upload filters and censor user content.

Practically speaking, it is likely that platforms will need to implement some sort of filtering mechanism, whether on upload of new content or promptly afterwards, in order to comply with Article 17, but that this would only apply when rightsholders have provided the relevant and necessary information in relation to specific works.
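Purely by way of illustration (the Directive does not prescribe any particular technology), the kind of matching such a filter performs can be sketched as a comparison of uploads against fingerprints of specifically notified works. The `fingerprint` function, the registry, and all names below are hypothetical assumptions, not a real platform's implementation:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Toy stand-in for a real perceptual fingerprint (commercial systems use
    audio/video hashing that also matches transformed copies, not exact bytes)."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical "relevant and necessary information" supplied by rightsholders:
# a registry mapping fingerprints of specific notified works to their owners.
notified_works = {
    fingerprint(b"example protected work"): "Example Rightsholder Ltd",
}

def check_upload(content: bytes) -> str:
    """Block only content matching a specifically notified work; everything
    else remains available (i.e. no general monitoring of user content)."""
    owner = notified_works.get(fingerprint(content))
    if owner:
        return f"blocked: matches work notified by {owner}"
    return "available"

print(check_upload(b"example protected work"))  # blocked
print(check_upload(b"original user content"))   # available
```

The design choice mirrors the structure of Article 17: the filter acts only on works for which the rightsholder has supplied identifying information, rather than scanning all uploads against the whole universe of protected works.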

"Best efforts" will be assessed in light of whether the platform has taken "all the steps that would be taken by a diligent operator", "best industry practice", the effectiveness of the steps the platform has taken, and the principle of proportionality, including the platform's size, the state of the art and the cost of taking various measures.

The Directive anticipates that the European Commission, Member States and industry stakeholders will establish "best practices" for these purposes and a stakeholder dialogue intended to discuss possible practical solutions starts on 15 October 2019.

As filtering technology becomes more effective and more widely and cheaply available, it seems likely that rightsholders and courts will increasingly expect platforms to use a filter, and indeed larger platforms already do use sophisticated filtering technology.

If allegedly infringing content is nevertheless made available, platforms do continue to have room for manoeuvre. Firstly, the "unavailability" obligation only kicks in once the rightsholder has provided the platform with "relevant and necessary information", and the onus is on the rightsholder to justify their requests.

Absent a detailed notification identifying at least a specific URL and digital fingerprints, platforms may well be able to argue that the information they have received is insufficient.

Platforms can also continue to argue that certain content is not infringing, e.g. because it falls under an exception, and that any best efforts they use need to be tempered accordingly. Many of the protracted exchanges between rightsholders and platforms that occur today on these subjects may therefore continue under Article 17.

Expeditious take-down

The take-down requirement in Article 17 is likely to operate in a similar manner to that under the hosting exemption, albeit that it is now clear that any content notified to the platform under the take-down requirement will also become subject to the unavailability requirement going forward.

Platforms must provide a complaint and redress mechanism for users

Platforms that err on the side of caution in favour of rightsholders may conversely find themselves in disputes with users complaining that their lawful content has been blocked or removed.

User pressure is likely to be a growing commercial consideration as increasing numbers of individuals make significant amounts of money through content sharing platforms and therefore have financial incentives to challenge platforms' decisions and allege monetary loss.

Article 17 requires platforms to put in place an "effective and expeditious complaint and redress mechanism" for users whose content is disabled or removed. Platforms' decisions must be subject to human review, which is likely to create a significant resource burden.

Although Article 17 requires rightsholders to justify their removal requests, platforms may nevertheless find themselves between a rock and a hard place where the assessment of whether content infringes a rightsholder's copyright is not straightforward.

What next?

Article 17 remains controversial and a number of EU Member States may not implement the Article in its entirety. Germany and Poland have voiced opposition to any filtering requirement, and the UK's implementation of the Directive as a whole will be impacted by the outcome of Brexit as well as continuing domestic political uncertainty.

However, non-implementation or lesser implementation of Article 17 by some Member States may have little practical impact for platforms, unless their sites are targeted only at users in those Member States, since platforms will otherwise still need to comply with the Article 17 requirements as they become law in the rest of the EU.

Platforms should therefore prepare to comply with Article 17 by the implementation deadline of 7 June 2021, and in the meantime look out for our updates as to the progress of the EU Commission's stakeholder dialogues to establish best practices for cooperation between platforms and rightsholders.

If you have any questions on this article please contact us.

User uploaded content
Adam Rendle


 
