Who is liable for User Generated Content?
These days, whether it’s on a social media website or in relation to an online article, we all expect to have our say and post our own content. Who is liable for those posts when they contain illegal content?
The legal position of an individual who posts content onto a website is clear: he or she is responsible for it. The situation is more complicated when it comes to a business whose website publishes user generated content (UGC). While the point is as yet untested in an English court, a website operator should be able to rely on the "safe harbour" defences, and so avoid liability for UGC, provided that it:
- is unaware of the infringing or illegal content; and
- has an effective notice and take down procedure in relation to such content (under which it promptly removes infringing or illegal content from the site once on notice of it).
These assumptions will be tested later in 2011, when the European Court of Justice (ECJ) is expected to rule on whether a social networking site must monitor user generated content for copyright infringement or be liable for infringing content. The case is relevant to all those operating websites where users can post their own content.
If the ECJ decides that operators of these sorts of websites cannot rely on the safe harbour provisions, operators will have to start monitoring their users’ activity or be liable for infringement. This could have a significant effect on the business model of many website operators who invite their audience to post content.
The reference has been made in the Belgian case of SABAM v Netlog (C-360/10), in which copyright collecting society SABAM (Société Belge des Auteurs, Compositeurs et Editeurs) seeks a ruling that social networking site Netlog (a “social portal for more than 69 million young people in Europe”) must implement technical measures to monitor the content that its users upload, so as to ensure the removal of infringing content.
Incidentally, the UK’s defamation legislation contains similar safe harbour provisions. Again, this legislation is untested in relation to UGC. To rely on the defence, it is necessary to show that you were not the author, editor or publisher of the defamatory material. In the case of a newspaper, it remains to be seen whether the English courts would be willing to isolate a post from the underlying article and decide that the newspaper publisher was not the publisher of the defamatory post.
In the meantime, there have been conflicting decisions on the point across Europe. For example, in September 2010, Google successfully persuaded a Spanish court that it should not be liable for copyright infringement resulting from users’ uploads to YouTube. Later that same month, however, a French court found Google and its Chief Executive liable for defamation over autocomplete “Google Suggest” results. That liability arose from content created not by Google but by internet users.
While we wait for the ECJ’s ruling, this is an uncertain time for businesses that invite UGC. Beyond the commercial disadvantage of dampening the open spirit of inviting everyone to comment, the legal disadvantage of moderating is that the website operator then puts itself in the position of a conventional editor or publisher. If unlawful material is posted onto the website, the operator will then almost certainly be unable to rely on the safe harbour defences.
From a legal point of view, a decision to moderate “lightly” is the worst of all worlds. The safe harbour defences are unlikely to apply and yet the website operator may be held responsible for the UGC over which it has no real control. If effective procedures are put in place to remove illegal content as soon as it comes to the website owner’s attention, the decision not to moderate will often be the better option.
For more guidance, register here to receive our comprehensive guide: “Liability of online publishers for user-generated content – An overview of English, German and French law”.
If you have any questions on this article, please contact us.
² The relevant legal provision states that Member States shall not impose an obligation on an information society service to monitor the information which it transmits or stores, or actively to seek facts or circumstances indicating illegal activity (Article 15 of the E-Commerce Directive (2000/31/EC)).
Quick Poll Results
From the recent poll we can determine that:
- 45% of companies use social media as a marketing tool;
- 24% of companies have a social media policy in place; and
- 30% of companies use social media as a recruitment tool.
If you are interested in learning more, please contact us.