Platform liability for copyright infringement: what do you need to know?

Online platforms must now do far more than simply provide users with a space to share content. Anyone operating a platform on which videos, images, music or texts are uploaded will almost automatically encounter copyright law. In recent years, the European legislator has fundamentally reshaped the landscape. On the one hand, there is the copyright framework introduced by the Copyright in the Digital Single Market Directive (C-DSM Directive, hereafter: C-DSM), with Article 17 as the turning point: certain platforms are no longer merely ‘hosts’, but become directly responsible for copyright infringements committed by users. On the other hand, there is the Digital Services Act (hereafter: DSA), which imposes stricter rules on all online intermediaries regarding how they must deal with illegal content.

In this two-part series, we explain what this new landscape means for organisations that manage online platforms. In this first part, we focus on the foundations: what exactly do the C-DSM and the DSA require, how do they interact, and when is a platform liable for copyright infringement? In Part 2, we turn to practice: how the role of platforms is changing under social pressure, technological developments and enforcement trends, and why voluntary control increasingly resembles a de facto obligation.

Part 1: Copyright on online platforms: what do the C-DSM Directive and the Digital Services Act regulate?

Online platforms are the primary space where creative content is shared. Think of videos, music, images, podcasts, livestreams and texts. This creates tremendous opportunities for reach and innovation, but it also creates friction with copyright law. Users do not always upload their own work. Sometimes it is a (part of a) film, a song playing in the background, a photo taken from Google, or an extract from an article. Platform operators then face a key question: who is liable if copyright is infringed: the user, the platform, or both?

In recent years, the European Union has introduced two important legal instruments that address this issue: the C-DSM and the DSA. These regimes partly overlap. In this blog, we explain in clear language what each instrument does and how they come together in cases of copyright infringement on platforms.

Why new rules?

The old system was straightforward: the user who uploaded illegal content committed the infringement. The platform acted primarily as a conduit. Only once the platform received a clear notice did it have to intervene by taking the content down. That approach worked reasonably well for incidental infringements, but not at the scale of modern platforms. Rightsholders, such as film studios, publishers and photographers, saw their works appear online on a massive scale without consent. In practice, rightsholders often have limited options against individual infringers. Uploaders are frequently anonymous, difficult to trace or located abroad. Legal proceedings against private individuals are costly, time-consuming and rarely produce meaningful results. As a result, rightsholders increasingly shifted their focus to the platforms that facilitate the infringement.

The EU therefore opted for a dual approach. The C-DSM clearly shifts responsibility to the platform for a specific category of platforms. The DSA, in addition, modernises the general rules on how platforms must deal with illegal content, including copyright infringement.

Which platforms fall within the scope of the C-DSM Directive?

The C-DSM contains one provision that is decisive for platforms: Article 17. This article does not apply to every platform. It targets so-called online content-sharing service providers. Think of large social media and video-sharing platforms. These are platforms:

  • that allow users to upload large amounts of content,

  • that make that content available to the public, and

  • that usually generate revenue from that activity, for example through advertising or subscriptions.

Non-commercial online encyclopaedias, cloud services for private storage or online marketplaces generally fall outside this definition.

Why is this distinction so important? Because Article 17 introduces a different liability model from the traditional notice-and-takedown approach.

Article 17 C-DSM in one sentence: the platform is itself responsible

The core of Article 17 is that a content-sharing platform is no longer regarded as a purely neutral host. As a matter of law, the platform is deemed to perform an act of communication to the public when it makes user uploads available. Anyone who communicates a work to the public must have authorisation.

The starting point is simple but far-reaching: platforms must obtain prior authorisation from rightsholders for making their works available. In practice, this often means entering into agreements with collecting societies or individual rightsholders.

If users themselves already have authorisation, for example because they own the copyright or hold a licence, the platform does not need to obtain additional permission. In most cases, however, such authorisation will be absent, which means the platform bears responsibility.

This marks a significant shift. A platform can no longer routinely rely on the defence that it did not know. It must arrange in advance that works may appear on the platform, for example through licensing arrangements.

However, platforms have a way out: best efforts

The EU also recognised that it is impossible to conclude a licence in advance for every individual video, song or image. Article 17 therefore provides an exemption. A platform will not be liable if it can demonstrate that it has made best efforts. In practice, this comes down to three elements:

  • Genuine efforts to obtain licences.

    The platform must actively seek to secure authorisation from rightsholders or collecting societies.

  • Making best efforts to prevent the availability of specific infringing content.

    If rightsholders provide relevant information about specific works that are frequently uploaded without permission, the platform must attempt to detect and prevent the availability of that content. This may involve technical measures, such as content recognition tools.

  • Acting expeditiously upon notice and preventing re-uploads.

    If the platform receives a sufficiently substantiated notice, it must remove the content promptly and prevent the same content from being uploaded again.

This is not a mere box-ticking exercise. Platforms must demonstrably and structurally invest in prevention and follow-up.
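The three best-efforts elements above can be read as a screening workflow. The sketch below is purely illustrative: the class, method names and fingerprint-matching logic are assumptions for explanation, not a real platform API or a legally sufficient implementation of Article 17.

```python
from dataclasses import dataclass


@dataclass
class UploadDecision:
    allowed: bool
    reason: str


class BestEffortsScreener:
    """Hypothetical sketch of an Article 17 'best efforts' workflow."""

    def __init__(self):
        self.licensed_works = set()          # works covered by platform licences (element 1)
        self.reference_fingerprints = set()  # rightsholder-supplied reference data (element 2)
        self.blocked_fingerprints = set()    # content removed after a notice (element 3)

    def screen_upload(self, work_id, fingerprint, user_has_licence=False):
        # 1. Authorisation: a platform licence or the user's own rights suffice.
        if user_has_licence or work_id in self.licensed_works:
            return UploadDecision(True, "authorised")
        # 2. Prevent availability of specific works flagged by rightsholders.
        if fingerprint in self.reference_fingerprints:
            return UploadDecision(False, "matches rightsholder reference data")
        # 3. Stay-down: prevent re-uploads of content removed after a notice.
        if fingerprint in self.blocked_fingerprints:
            return UploadDecision(False, "previously removed after notice")
        return UploadDecision(True, "no match; notice-and-takedown applies")

    def handle_notice(self, fingerprint):
        # Expeditious removal plus future stay-down for the same content.
        self.blocked_fingerprints.add(fingerprint)
```

Note that the third branch only blocks exact matches of previously noticed content; real content-recognition systems work on perceptual fingerprints and similarity thresholds, which is precisely where the risk of over-blocking lawful uses (parody, quotation) arises.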

What does this mean for filters?

Many platforms will use some form of automated content recognition. Such technology is not explicitly mandated, but in practice it is often the only realistic way to meet the best efforts standard. At the same time, the EU has warned against overly strict filtering. There must remain room for lawful uses, such as parody, quotation or review. Platforms must therefore also provide an effective complaints mechanism where content has been wrongly blocked.

The Digital Services Act: general rules for illegal content

Alongside this sits the DSA. It applies to all online intermediaries, including platforms that do not fall within Article 17. The DSA performs two functions at once.

On the one hand, it preserves the traditional liability exemptions. Hosting providers are not automatically liable for user content, provided they lack actual knowledge and act expeditiously once they become aware of illegality.

On the other hand, the DSA introduces more extensive due diligence and procedural obligations. Platforms must, among other things:

  • provide an easily accessible notice mechanism for illegal content;

  • process notices diligently and in a timely manner;

  • inform users of the reasons for content removal; and

  • offer an internal complaints procedure.

Very large platforms are subject to additional requirements, such as risk assessments and independent audits of their systems.

Importantly, the DSA confirms that there is no general obligation to monitor all content in advance. Platforms are not required under the DSA to screen every upload proactively. They may choose to do so, but it is not a general statutory obligation.
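The DSA's procedural obligations listed above (notice mechanism, diligent processing, statement of reasons, internal complaints) can be sketched as a simple workflow. All names below are hypothetical and the logic is a bare illustration of the sequence of steps, not the DSA's actual legal requirements in full.

```python
from dataclasses import dataclass


@dataclass
class StatementOfReasons:
    content_id: str
    decision: str
    grounds: str


class NoticeAndActionDesk:
    """Hypothetical sketch of a DSA-style notice-and-action workflow."""

    def __init__(self):
        self.removed = {}     # content_id -> StatementOfReasons given to the uploader
        self.complaints = []  # internal complaint-handling queue

    def submit_notice(self, content_id, grounds):
        # Process the notice diligently; the affected user must be
        # informed of the reasons for removal.
        statement = StatementOfReasons(content_id, "removed", grounds)
        self.removed[content_id] = statement
        return statement

    def file_complaint(self, content_id, argument):
        # Internal complaints procedure against a removal decision.
        self.complaints.append((content_id, argument))
        return len(self.complaints)

    def uphold_complaint(self, content_id):
        # Reinstate content if the removal turns out to be wrongful.
        self.removed.pop(content_id, None)
```

The point of the sketch is the audit trail: every removal produces a documented statement of reasons, and every objection enters a traceable queue, which is exactly the kind of record a platform needs to demonstrate DSA compliance.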

How do the C-DSM and the DSA interact?

Both instruments address copyright infringement, but in different ways.

Article 17 C-DSM is the specific copyright rule for content-sharing platforms.

It determines who is liable and under which conditions a platform can avoid liability.

The DSA is the general horizontal framework for dealing with illegal content.

It primarily regulates how notices, removals and complaints must be organised.

The DSA explicitly states that it does not displace copyright law. In other words, where Article 17 applies, it takes precedence. The DSA cannot reclassify a platform as a mere host that is only liable after notice. Article 17 prevails as lex specialis.

This does not mean that the DSA is irrelevant for Article 17 platforms. On the contrary, if you fall within Article 17, you will almost always also qualify as a platform under the DSA. As a result, the DSA’s procedural obligations apply as well. You must therefore comply with Article 17 and ensure that your notice-and-action and complaints procedures are DSA-compliant.

A practical way to remember this is: Article 17 tells you what you must do to avoid liability; the DSA tells you how to do it in a transparent and orderly manner.

What does this mean in practice for platform providers?

For platforms within the scope of Article 17, the era in which a simple takedown form sufficed has passed. You must now be able to demonstrate, on an ongoing basis, that you actively seek licences, take measures to prevent infringements and respond swiftly to notices. This requires both technical and organisational investment.

At the same time, your processes must become increasingly transparent. If a user’s upload is blocked, that user must understand why. If the user objects, the objection must be assessed seriously. If a rightsholder submits a notice, you must clearly document what action you have taken. This is not only a legal obligation, but also a matter of reputation and user trust. Platforms that moderate in a fair and transparent manner face fewer escalations and conflicts.

For platforms outside Article 17, the DSA remains the primary framework. The notice-and-action model continues to apply, but with stricter procedural requirements than before. A poorly designed notice mechanism or inadequately reasoned removals will create greater legal risk.

Taken together, the C-DSM and the DSA create a layered system. Article 17 places substantive responsibility on certain content-sharing platforms: in principle, you are liable if works are uploaded without authorisation, unless you can demonstrate that you have done everything that can reasonably be expected to prevent this. The DSA complements this with clear rules on notices, removals and complaints, and ensures that enforcement is transparent and proportionate.

Yet this is not the end of the story. What appears on paper as a best efforts obligation or a voluntary choice often becomes far more demanding in practice. Platforms face pressure from rightsholders, users, regulators, advertisers and their own technological capabilities. As a result, they frequently go beyond what the law strictly requires. In Part 2, we examine that reality: how platforms are shifting from passive intermediary to active gatekeeper, the paradoxes this creates, and the lessons for entrepreneurs.
