Copyright Platform Liability: Lessons Learned

From Passive Host to Active Player

In the first part of this two-part blog, we explained that, under the former E-commerce framework, online platforms were largely regarded as neutral intermediaries. A platform that merely provided neutral hosting services could in principle not be held liable. That position has changed significantly. Through algorithms, personalised feeds and ranking mechanisms, platforms now influence which video goes viral, which products appear at the top and which content users see. This places platforms at the heart of the distribution of infringing content, much to the frustration of rightholders.

This shift is reflected in the law. As outlined in part 1, Article 17 of the Copyright in the Digital Single Market Directive (C-DSM) provides that certain platforms may themselves be regarded as infringers. They must therefore demonstrate that they have made best efforts: actively seeking licences, preventing the availability of known infringing content and stopping repeated uploads. The Digital Services Act (DSA) adds further obligations and requires all platforms to take effective action against illegal content, including copyright infringements. Platforms may still rely on the traditional safe harbour exemptions, but only if they meet strict conditions.

In this second part, we turn to practice. Although the European legislator seeks to balance entrepreneurial freedom and responsibility, platforms find that balance difficult to achieve in reality. Organisations that fail to manage risks in a visible and structured way face reputational damage, regulatory measures or legal claims. The prohibition of general monitoring remains in place, but the room not to act is shrinking. What does this mean for organisations offering online services? The following sections set this out.

The Paradox of the Good Samaritan

As discussed in part 1, the DSA retains the classic safe harbour regime. Hosting platforms are in principle not liable for user uploads, provided they have no knowledge of illegal content and act effectively once they become aware of it. In practice, this means notice-and-action procedures: responding swiftly to notifications, removing or disabling access to the reported content and, where necessary, preventing recurrence. To avoid discouraging proactive moderation, the DSA introduces an important nuance: the so-called Good Samaritan provision in Article 7.

This provision ensures that a platform does not automatically lose its liability protection when it voluntarily takes measures to detect or remove illegal or infringing content, provided it acts in good faith and with due care. The message from the European legislator is clear: platforms should feel free to tackle abuse proactively, without fear that additional efforts will be used against them.

In practice, however, matters are less straightforward. Intensive moderation, such as the use of automated detection filters, may be interpreted as knowledge of infringing content. This creates a paradox: the more actively a platform monitors, the greater the risk that it loses its safe harbour and is held liable for content it fails to remove. A platform that acts diligently may therefore face greater exposure than one that waits.

Technology: Friend or Foe?

To comply with the DSA and the C-DSM, platforms increasingly rely on sophisticated detection tools. Techniques such as hash matching, fingerprinting and AI-based filters allow millions of uploads to be scanned and compared against reference files from rightholders within seconds. For platforms receiving vast volumes of video, music or images each day, these tools are now essential to demonstrate best efforts to prevent copyright infringements.
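
By way of illustration, the sketch below shows the simplest form of such matching: comparing a hash of an uploaded file against a reference list supplied by rightholders. It is a deliberately simplified example with hypothetical names and data, not a description of any particular vendor's system; production tools rely on perceptual fingerprints that survive re-encoding or cropping, which an exact cryptographic hash does not.

    # Deliberately simplified illustration of hash matching (hypothetical data).
    # Real detection systems use perceptual fingerprints that survive re-encoding,
    # cropping or pitch-shifting; an exact cryptographic hash does not.
    import hashlib
    from pathlib import Path

    # Hypothetical reference set supplied by rightholders: hash -> work identifier.
    REFERENCE_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855": "work-0001",
    }

    def sha256_of_file(path: Path) -> str:
        """Compute the SHA-256 digest of an uploaded file in streaming fashion."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def check_upload(path: Path) -> str | None:
        """Return the matched work identifier, or None if no reference hash matches."""
        return REFERENCE_HASHES.get(sha256_of_file(path))

Exact hashing catches only identical files, which is precisely why platforms invest in fingerprinting and AI-based classifiers, with the over- and underblocking risks described below.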

Yet these technologies also present clear risks. Detection systems are never flawless. They may fail to identify obvious infringements (underblocking) or remove lawful content such as parody, criticism or quotation (overblocking). This can lead to complaints, reputational harm and tensions with freedom of expression. Moreover, the use of automated detection systems may itself jeopardise the safe harbour under the DSA more quickly: the more advanced and proactive the system, the greater the likelihood that the platform is considered to have knowledge of infringing content.

Reliance on such technology creates not only legal risks but also significant operational ones. Platforms require specialised expertise, review capacity and resources to carry out moderation carefully. The rise of generative AI adds further complexity, for example where AI-generated works closely resemble existing creations. That topic warrants separate analysis and is therefore not addressed further here. All additional moderation burdens ultimately affect business operations, as the next section explains.

Economic Pressure on Platforms

The legal and technological challenges are substantial, and the economic impact intensifies them. Developing, implementing and maintaining detection systems involves significant costs. Both the DSA and the C-DSM also require structural investment in staff, technical infrastructure and compliance processes. Large players can usually absorb these costs. For smaller platforms or start-ups, however, they may directly threaten business continuity.

This debate also touches on a fundamental right: the freedom to conduct a business under Article 16 of the EU Charter. A general filtering obligation would require platforms to deploy costly and complex control systems, thereby restricting their freedom to organise their business as they see fit. In a recent judgment, the Court of Justice emphasised that this freedom is not absolute.[1] Where necessary to protect other fundamental rights, such as intellectual property rights, it may be restricted. This confirms that platforms have less room to design their services entirely at their own discretion.

The result is a form of indirect compliance pressure. Formally, there is no general filtering obligation. In practice, however, legal risks, reputational concerns and costs lead platforms to feel compelled to implement filtering measures. For some, this affects the core of their business model, particularly where low moderation burdens or maximum user freedom formed part of their proposition. For others, the pressure manifests mainly in additional financial and operational strain.

Practical Lessons for Platform Providers

For Dutch platforms, this evolving landscape means that due care, transparency and documentation have become essential. Several concrete lessons can be drawn.

  • Know your position under the DSA and the C-DSM.

First determine whether your service qualifies as a platform under Article 17 C-DSM and/or the DSA. This qualification determines your obligations and the extent of your liability exposure.

  • Establish transparent and accessible procedures.

Implement an accessible reporting mechanism, clear notice-and-takedown procedures, transparent communication with users and a robust complaints process. Reflect these processes consistently in your terms and conditions.

  • Use technology proportionately and with human oversight.

Automate where appropriate, but avoid blind reliance on filters. Document technical choices and ensure that lawful content, such as parody or quotation, is not removed unjustifiably. Provide content creators with an accessible appeal procedure and ensure that every appeal is reviewed by a human.

  • Avoid general monitoring and focus on risk areas.

Excessive monitoring may work against you legally. Adopt targeted and specific measures that reflect the nature and risks of your platform.

  • Document, evaluate and stay up to date.

Documented procedures, balancing exercises and periodic audits demonstrate good faith. Monitor new case law and European guidance, as these may directly affect your liability position.

At its core, the solution lies in proportionality. Limit automated detection to risk areas and combine it with human review. Only through careful alignment of technology, policy and human assessment can enforcement remain both effective and legally sustainable.
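
To make this concrete, the sketch below shows one possible way, with entirely hypothetical names and fields, to route an automated match to a human reviewer and document the outcome, rather than removing the upload automatically. It is not a prescribed design, merely an illustration of how human review and the reasoning behind each decision could be captured in an auditable record.

    # Minimal sketch, with hypothetical names, of routing an automated match to
    # human review and documenting the outcome instead of removing it outright.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from enum import Enum

    class Decision(Enum):
        REMOVE = "remove"        # confirmed infringement
        KEEP = "keep"            # lawful use, e.g. parody or quotation
        ESCALATE = "escalate"    # unclear case, refer to legal review

    @dataclass
    class ModerationRecord:
        upload_id: str
        matched_work: str                 # identifier returned by the detection step
        flagged_at: datetime
        reviewer: str | None = None       # filled in by a human moderator
        decision: Decision | None = None
        rationale: str = ""               # why the decision was taken
        appeal_filed: bool = False

    def flag_for_review(upload_id: str, matched_work: str) -> ModerationRecord:
        """Create an auditable record instead of removing the upload outright."""
        return ModerationRecord(upload_id, matched_work, datetime.now(timezone.utc))

    def record_human_decision(record: ModerationRecord, reviewer: str,
                              decision: Decision, rationale: str) -> None:
        """Attach the human reviewer's decision and reasoning to the record."""
        record.reviewer = reviewer
        record.decision = decision
        record.rationale = rationale

Kept alongside notice-and-takedown logs, records of this kind help demonstrate the good faith and due care that the DSA and the C-DSM expect.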

Conclusion

The European legislator’s message is clear: online platforms are no longer neutral hosts. They bear an active responsibility to prevent copyright infringements. Those who benefit from user-generated content must also ensure careful enforcement.

The challenge lies in striking the right balance under the DSA and the C-DSM. Platforms must do enough to limit abuse, but not so much that they jeopardise their safe harbour or freedom to conduct a business. Too little action creates risks. Too much action may do so as well.

For businesses, compliance is not a one-off exercise, but an ongoing process. Transparency, documentation and demonstrable good faith are essential.

Does this sound complex? Our advisers are happy to support you in developing a suitable and workable approach. Please feel free to contact us.



[1] Court of Justice of the European Union, 30 January 2025, ECLI:EU:C:2025:935, paras 42-48 and 55-58 (Russmedia Digital & Inform Media Press).
