The popularity of podcasts has grown rapidly over the last few years. In September 2019, Ofcom released statistics indicating that around 7.1 million people in the UK listened to at least one podcast every week. Over the last 12 months, the COVID-19 pandemic has led to most of us spending more time at home, and therefore seeking more home entertainment. This has further increased the demand for good podcast content.
Given the wide range of content that might be covered by podcasts, the providers of access to podcasts may start to face more legal challenges.
Podcast aggregator platforms often make it possible for third-party users to upload podcasts that they have independently created. Some of these podcasts will include content that is about or refers to individuals or companies. This may expose the platforms to various legal risks, including for defamation. Whether or not the platforms are aware of the potentially unlawful content, by making the podcasts available via their platforms, they could face liability themselves. Podcast aggregator platforms must therefore consider the extent to which they would be liable, and whether they can rely on some potential defences.
While we consider English defamation liability here, the range of potential legal claims a platform could face is wide. With defamation claims, providers can rely on specific defences provided by the English legislation. For other claims, such as those relating to privacy, data protection, copyright and hate speech, the provider would have to rely on the generally applicable hosting exemption we mention below, and/or demonstrate that the substantive elements of those claims are not made out (eg because the provider does not have the necessary mental element for the hate speech offences, because there was a public interest in any private information being published, or because the aggregator was not sufficiently involved in the commission of the tort or crime).
Under defamation law, a published statement (including a statement contained in a podcast) would be defamatory of an individual or company if it substantially affects, or tends substantially to affect, in an adverse manner, the attitude of other people towards that individual or company. In other words, the statement must cause other people to think worse of an individual or company.
Additionally, under section 1 of the Defamation Act 2013 (DA 2013), the claimant must also show that the publication has caused or is likely to cause "serious harm" to their reputation. If the claimant is a company trading for profit, it must also show that the serious harm to its reputation has caused or is likely to cause it "serious financial loss".
A "host" of user generated content has a defence under Regulation 19 of the e-Commerce Regulations 2002. This is a general defence to all types of unlawful content, but it will only apply in circumstances where podcasts are created by third-party creators who upload them onto the aggregator platform. In this context, the platform could be a provider of an "information society service" which merely stores content electronically on behalf of users.
The defence only applies provided that the podcast aggregator:

- does not have actual knowledge of unlawful activity or information and, as regards any claim for damages, is not aware of facts or circumstances from which the unlawful activity or information would have been apparent; and
- upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the content.
This is a 'notice and takedown' defence. Having knowledge of unlawful activity defeats the defence, but for the aggregator to have that knowledge, a notice of potentially unlawful content must be sufficient to bring the unlawful content to the attention of the podcast aggregator and to explain why there are no available defences which might make the content lawful. So, for example, simply being made aware that a podcast contains defamatory words is not sufficient, because defences may apply which make the words ultimately lawful.
Having said that, it could be a dangerous gamble for a podcast aggregator to take the position that content is not unlawful simply because it cannot determine if any substantive defences ultimately apply. Therefore, if, following a complaint or as a result of an internal investigation, a platform becomes aware of a reasonable argument or reason as to why a podcast's content is unlawful, it would be prudent to remove the offending podcast in order to rely on the hosting exemption.
A podcast aggregator platform will not be liable for a defamation claim under section 1 of the Defamation Act 1996 (DA 1996) if it can show that it:

- was not the author, editor or publisher of the statement complained of;
- took reasonable care in relation to its publication; and
- did not know, and had no reason to believe, that what it did caused or contributed to the publication of a defamatory statement.
In this context, a "publisher" is a commercial publisher whose business is issuing material to the public. To the extent that a podcast aggregator did not select the podcast in question for inclusion on its platform (ie where the podcast was uploaded by a third party), that aggregator would arguably not be deemed to be a publisher in this context.
This is again a 'notice and takedown' defence: it is lost when the podcast aggregator becomes aware that a podcast contains content that "has a defamatory meaning" (whether as a result of a complaint or through moderating the content on its platform) but does not then take reasonable care (eg by taking the content down). The section 1 DA 1996 defence turns on knowledge of whether the publication has a defamatory meaning.
In some contexts, words may have a defamatory meaning, but publishing them may not be unlawful (for example because the allegations may be true). Therefore, this defence is defeated more easily than the hosting exemption. In Tamiz v Google, the Court of Appeal ruled that a website operator may be regarded as a "publisher" after being put on notice of a complaint.
A podcast aggregator could, therefore, potentially be considered the publisher of a defamatory statement if it is aware of the complaint and decides not to remove the podcast containing the offending statement.
Under section 5 of the DA 2013, there is a defence for the operator of a website where a defamation action is brought in respect of a statement posted on that website, if it was not the operator who posted the statement. It has not yet been decided whether or not a podcast made available through a podcast aggregator would be a "statement posted" onto that aggregator's service. The term 'website' is not defined but is likely to cover any page which can be accessed via a web browser using a URL. It is likely that most podcast aggregator platforms will also be accessible via online URLs (as well as via applications for mobile devices etc).
This defence is defeated if the claimant was unable to identify the person who posted the statement sufficiently to bring proceedings against them, gave the operator of the website notice of the complaint, and the operator failed to take certain specified steps in accordance with section 5 and the Defamation (Operators of Websites) Regulations 2013. Therefore, to the extent that a claimant can identify the creator of a podcast (ie the person who posted the statement) so that the claimant can sue them, the podcast aggregator, in its role as (or analogous to) a website operator, does not need to take down the offending podcast in order to rely on this defence.
Section 10 of the DA 2013 provides that "[a] court does not have jurisdiction to hear and determine an action for defamation brought against a person who was not the author, editor or publisher of the statement complained of unless the court is satisfied that it is not reasonably practicable for an action to be brought against the author, editor or publisher."
The term "publisher" in this context has the same meaning as in section 1 of the DA 1996. Section 10 of the DA 2013 therefore introduces greater protection to secondary publishers such as podcast platforms on which third party content is made available.
This defence removes the possibility of an action for defamation being brought against such secondary publishers, except where it is not reasonably practicable for the claimant to bring the action against the author, editor or commercial publisher (eg because those parties cannot be identified).
The podcast aggregator may also seek to rely on substantive defences to avoid liability for defamatory content by showing that the statement complained of is substantially true, that it constituted an honest opinion, or that it is part of a publication on a matter of public interest.
All of these substantive defences pose their own difficulties. For example, proving the substantial truth of an allegation depends entirely on the facts, and the burden of proof would fall on the podcast aggregator, even though the aggregator will be far removed from those facts.
Moreover, the defence of publication on a matter of public interest will involve the podcast aggregator proving not only that the subject matter is in the public interest, but also that, before making the podcast available, the aggregator reasonably believed that publishing the statement complained of was in the public interest.
To be able to show this, the aggregator would, for example, need to demonstrate that it conducted such pre-publication enquiries and checks as it would be reasonable to expect of the aggregator in the circumstances. This would be difficult for an aggregator that makes podcasts available without reviewing their content, but reviewing content might lead to it losing its protection under the hosting and section 1 DA 1996 defences.
Podcast aggregators can take practical steps to ensure they can rely on defences to avoid defamation liability for content they have made available but have not produced. Having processes in place to act on notices of potentially unlawful content is a vital first step but having contractual protections in place with producers is also critical.
To discuss any of the issues raised in this article in more detail, please reach out to a member of our Technology, Media & Telecommunications team.