Platform immunity under the microscope: cooperation is the best way forward


Online operators such as eBay, Facebook, Instagram and Amazon find themselves under the microscope. They have become hugely successful organisations, but their platforms are often misused by those infringing rights or promoting illegitimate sites. Legislative protections and immunities have allowed the platforms to grow without being required to self-police or monitor for infringement. The imperative has been that technology must be allowed to develop – an agenda so pressing that legislators have preferred to give platforms immunity rather than require them to regulate what they allow to be said and sold.

On January 25 2018 the UK prime minister addressed the World Economic Forum on current platform immunity issues and stressed that “the same rights people have offline should be protected online… technology companies still need to do more in stepping up to their responsibilities for dealing with harmful and illegal online activity”. Ensuring that rights holders can secure the same IP protection online as they can offline is a key challenge for businesses, which must protect themselves to prosper in the digital economy.

The importance of this issue is not lost on legislators. In its communication of September 28 2017, the European Commission urged that online platforms “do their utmost to proactively detect, identify and remove illegal content online”.

This chapter looks at the latest developments in the law and examines some recent proactive and welcome steps taken by platforms to help rights holders to protect themselves. We are not yet at a point where platforms find themselves without the immunities and protections that they have enjoyed to date. However, there are clear signs that times are changing. Active cooperation from platforms can help to ensure that they are not misused, and may be the only way for them to avoid a complete shift in the burden of policing the Internet.

Legal framework and key case law

It is reasonably well established that online platforms that do not knowingly intervene in the misuse of their platforms are not typically liable for infringements by their users.

At times, this general point can seem to lead to some surprising results. In L’Oreal v eBay ([2009] EWHC 1094 (Ch)) the English High Court found that eBay was not liable for the sale of grey-market goods on its platform, despite the fact that it:

  • actively encouraged listings of sales from outside the European Economic Area to buyers in the United Kingdom;
  • provided specific facilities to assist this; and
  • did nothing to discourage such infringements.

The court held that:

  • eBay’s service did not inherently lead to infringement;
  • it had no duty to prevent infringement by third parties; and
  • the facilitation of infringement with knowledge and an intention to profit was insufficient to establish liability.

The court acknowledged that it was “in no doubt that it would be possible for eBay Europe to do more than they currently do” (eg, filtering, additional disclosure obligations on sellers, additional restrictions on high-risk products and applying penalties more rigorously). However, the court concluded that: “The fact that it would be possible for eBay Europe to do more does not necessarily mean that they are legally obliged to do more.” In essence, for infringement to be established, it would be necessary to show a greater degree of knowing collaboration between eBay and the sellers that were using its platform to infringe.

In Google France (Cases C-236/08 to C-238/08) the European Court of Justice (ECJ) held that Google, in its provision of the AdWords ad referencing service, was not liable for trademark infringement in respect of its storage of keywords that were identical to a trademark. Nor did its organisation of the display of advertisements on the basis of those keywords make it liable. The ECJ held: “The fact of creating the technical conditions necessary for the use of a sign and being paid for that service does not mean that the party offering the service itself uses the sign.”

The EU E-commerce Directive is the critical legislative context to these decisions. It provides for safe-harbour defences or immunities for online intermediaries that offer “information society services” – that is, those “normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”.

Recital 42 to the E-commerce Directive provides the foundation for the safe harbours as follows: “this activity is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of nor control over the information which is transmitted or stored.”

The common thread to each of the safe harbours is that the service provider is passive in its operation and does not knowingly and actively enable the misuse of its platform. The most relevant safe harbour in Europe for platforms is the Article 14 ‘hosting’ defence. This applies to “the storage of information… at the request of a recipient of the service”. This gives immunity where the platform is not fixed with knowledge. But the burden shifts when the platform is on notice of a specific infringement. Monitoring companies typically send thousands of notices to platforms on a daily basis to ensure that they are on notice of infringement and so must take steps to stop the infringement from continuing.

The ECJ decisions in Google France and L’Oreal v eBay (Case C-324/09) provide guidance on when a platform may rely on the Article 14 defence.

In Google France the ECJ held that “it is necessary to examine whether the role played by that service provider is neutral, in the sense that its conduct is merely technical, automatic and passive, pointing to a lack of knowledge or control of the data which it stores”.

In contrast, the English High Court imposed liability on Amazon in Cosmetic Warriors ([2014] EWHC 181 (Ch)), concerning Amazon’s alleged infringement of the claimants’ LUSH trademark in circumstances where Amazon was commercially involved with the sale of the goods. A drop-down menu on the Amazon platform suggested terms such as ‘lush bath bombs’ when a user started to search for ‘lush’. The links were to competing products sold on Amazon and made no overt reference to the genuine Lush item not being available. In this case, the court considered that Amazon had used the ‘Lush’ sign as part of a commercial communication and that it was selling competing goods on its website. Amazon was not passive or neutral, but actively engaged in the sale of competing products.

Case law has set the clear parameters for assessing whether a platform is passive and neutral or active and complicit. In all cases, actual knowledge of infringement will remove the hosting safe-harbour immunity. The question is how far this knowledge extends – is knowledge always relevant only to a particular infringement or can knowledge give rise to a continuing duty to act?

In L’Oreal the ECJ concluded that for eBay to be fixed with the requisite knowledge/awareness under Article 14, after which the platform must act expeditiously to remove or disable access to content, “it is sufficient… for it to have been aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality in question”. The ECJ held that situations in which a diligent economic operator should have identified the illegality include where the “operator of an online marketplace uncovers, as the result of an investigation undertaken on its own initiative, an illegal activity or illegal information, as well as a situation in which the operator is notified of the existence of such an activity or such information”.

This ‘diligent economic operator’ test imposes a higher duty on platforms: the threshold for awareness is lower than actual knowledge obtained on receipt of a notice.

Francesco Rizzuto has written (CTLR 2012, 18(1), 4-15) that “if the online service provider does not decide on its own initiative to suspend the perpetrator of an infringement of intellectual property rights in order to prevent further infringements, it may be ordered, by means of an injunction, to do so”. It is implicit “that online intermediary service providers should take pre-emptive action to prevent further infringements in order to avoid being ordered to do so by an injunction”. Further, “online service providers… are under a duty of care and should ensure that they are able to intervene, if they become aware of it, to prevent their services being used to infringe intellectual property rights”.

The requirement on platforms to prevent further infringements often provides the basis on which rights holders and monitoring companies can have an infringer’s entire social media or seller account removed, rather than a single post or listing.

Technological change – and more is coming

Technological advancements have allowed online platforms to increase their knowledge of whether their services are used to infringe trademarks and to respond to ensure that counterfeiting on platforms is kept to a minimum. As technology enables greater awareness, the question remains whether the platforms can continue to claim to be diligent operators when they fail to take steps to keep infringements off their platforms.

One technology, blockchain, can provide secure digital identification for products, allowing sellers and consumers to track the supply chain and purchasers to ascertain whether goods are genuine. A blockchain creates a highly secure digital record that is extremely difficult to tamper with, unlike traditional authenticators. Where platforms facilitate distribution, they may choose to use blockchain to verify product provenance before an item is listed, thereby increasing consumer trust. Of course, not all consumers care whether they are purchasing a genuine item and many actively seek out a cheaper version (especially as advances in factory processes allow fakes to be manufactured to a high standard). As a consequence, the technology has its limits in the prevention of counterfeiting. Further, not all platforms will adopt the technology and, if they do, the counterfeiting problem may well move to the dark web.
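Purely as an illustration of the tamper-evidence property described above (blockchain platforms differ widely, and this is not any operator's actual implementation), a provenance record can be sketched as a hash chain: each supply-chain event stores the hash of the previous record, so altering any earlier entry invalidates every later hash.

```python
import hashlib
import json

def record_hash(payload: dict) -> str:
    """Deterministically hash a provenance record."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_event(chain: list, event: dict) -> list:
    """Append a supply-chain event, linking it to the previous record's hash."""
    prev = chain[-1]["hash"] if chain else None
    record = {"event": event, "prev_hash": prev}
    record["hash"] = record_hash({"event": event, "prev_hash": prev})
    chain.append(record)
    return chain

def verify(chain: list) -> bool:
    """Re-derive every hash; any tampered record breaks the chain."""
    prev = None
    for rec in chain:
        if rec["prev_hash"] != prev:
            return False
        if rec["hash"] != record_hash({"event": rec["event"], "prev_hash": prev}):
            return False
        prev = rec["hash"]
    return True
```

Retrospectively editing a record (for example, swapping in a counterfeit's details) causes `verify` to fail, which is the property that makes such a ledger useful for checking provenance before listing.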

Some measures are already being adopted by e-commerce operators, aimed at limiting the number of counterfeits on platforms. For example, Amazon’s Brand Gating programme asks merchants of high-value items to submit a letter of authorisation from the brand owner or manufacturer or an invoice before the item can be listed.

Another Amazon programme is its Brand Registry Tool. This allows brands to gain more control over how products are advertised. A brand can use the tool to report violations and potentially detect similar infringements with keywords and image searches. Alibaba Taobao has adopted image searching, allowing rights holders to surface copyright infringements. More encouraging signs of platforms adopting new tech solutions include the Facebook Commerce and IP tool, which allows a brand user to sort and filter content and then report infringements in bulk.
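Neither Amazon nor Facebook discloses the matching logic behind these tools, but the basic keyword-detection idea can be sketched as follows. The watch list, listing fields and function name are invented for illustration only:

```python
import re

# Hypothetical watch list a brand might register; real tools keep
# their term lists and matching rules private.
BRAND_TERMS = ["lush bath bomb", "lush cosmetics"]

def flag_listings(listings: list, terms=BRAND_TERMS) -> list:
    """Return listings whose titles contain any watched brand term
    (case-insensitive substring match)."""
    patterns = [re.compile(re.escape(t), re.IGNORECASE) for t in terms]
    return [l for l in listings if any(p.search(l["title"]) for p in patterns)]
```

Flagged listings would then be reviewed and, where appropriate, reported in bulk; image searching extends the same idea from titles to product photographs.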

While these tools are actively promoted as efficient resources, they have yet to deliver results at scale. Amazon and Facebook remain vague about the full capabilities of their systems, disclosing very little openly. Amazon and Alibaba Group are often criticised for their lengthy registration processes, and reports of bugs, glitches and limitations in these tools are commonplace.

Pressure is mounting

Attitudes to online counterfeiting and the liability and responsibilities of online platforms have been shifting. As the prime minister said at the World Economic Forum: “The status quo is increasingly unsustainable as it becomes clear these platforms are no longer just passive hosts.”

In September 2015 the European Commission opened the Consultation on Online Platforms, Cloud and Data, Liability of Intermediaries, and the Collaborative Economy as part of its Digital Single Market Strategy. The commission gave examples of ‘online platforms’ including search tools, online marketplaces, social networks and app stores.

On May 25 2016 it published its communication on “Online Platforms and the Digital Single Market – Opportunities and Challenges for Europe”. The commission said that it “will assess the role intermediaries can play in the protection of intellectual property rights, including in relation to counterfeit goods, and will consider amending the specific legal framework for enforcement”.

Most recently, the commission published a communication on September 28 2017 entitled “Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms”, which sets out guidelines and principles for platforms to follow in increasing their efforts to prevent, detect and remove illegal content online. Although non-binding, the communication shows growing recognition that those who profit from platform operations should also shoulder responsibilities.

Brands are taking direct action

Brands are reacting to counterfeits of their products being sold on platforms. BIRKENSTOCK announced that from January 1 2018, its products would no longer be available on Amazon because “there have been a series of violations of the law on… Amazon which the platform operator has failed to prevent of its own accord. On a number of occasions, BIRKENSTOCK lodged a complaint that counterfeit products of poor quality which infringed BIRKENSTOCK’s trademark rights and misled the consumers regarding the origins of goods were being made available on the platform”.

Brands are increasingly willing to withdraw ad spend from platforms that fail to prevent unlawful activity. So far, direct action through sponsorship withdrawal has been largely limited to the response against brand association with terrorism and other harmful content. However, this may change over time.


Whether legislation will change the burden on platforms and require them to self-police remains to be seen (we await the finalised legislation under the EU Digital Single Market Strategy). In the meantime, the courts, the UK government, the European Commission and brand owners are increasing pressure on online platforms.

Cooperation with platforms can deliver results for brand owners, and is key to delivering the necessary scale and efficiency in enforcement. Such cooperation may be the best way for platforms to avoid a tectonic shift in their immunity status. Platforms are strongly urged to work even harder with brands and monitoring companies to make the Internet a good place for all businesses to prosper.


First Floor, New Penderel House

283-288 High Holborn

London WC1V 7HP

United Kingdom

Tel +44 20 3051 0494


Simon Baggs
Chief executive officer
[email protected]

Simon Baggs is a lawyer with over 20 years’ experience in the enforcement of intellectual property. He is the full-time CEO at INCOPRO, the company he co-founded in 2012.

He is well known as a leading expert on the enforcement of intellectual property online. He presents regularly at key industry summits on the convergence of IP law and technology, identifying the key legal and technology trends that will have application to online commerce in the future.

His background ensures that INCOPRO focuses on scalable enforcement alongside actionable intelligence. His team won awards for innovation in intellectual property from the Financial Times in 2015, from Legal Week in 2016 and from Managing Intellectual Property in March 2018.
