Once again, 2025 was marked by significant developments in the field of data protection and privacy. European case law added further nuance to core concepts such as personal data and international data transfers, while robust enforcement by our national supervisory authority made clear that transparency is not a mere formality.
At the same time, new legislation entered into force, including the Data Act, and the European Commission announced the Digital Omnibus, signalling a recalibration of the digital regulatory framework. In this blog, we set out the key events of the year.
On 3 September 2025, the General Court of the European Union (the General Court) delivered its judgment in the Latombe case, in which the EU–US Data Privacy Framework (DPF) was challenged.
The core of the DPF is well known. US organisations that certify under the framework are deemed to provide a level of protection that is essentially equivalent to that within the EU. Critics, including Latombe, argued that US surveillance practices and the position of the Data Protection Review Court (DPRC) undermine this equivalence.
The General Court did not agree. According to the General Court:
From a legal perspective, the DPF therefore remains in force. At the same time, as with earlier adequacy decisions (Safe Harbor and Privacy Shield), the underlying criticism has not disappeared.
Another important European judgment in 2025 concerned the SRB/EDPS case. In this ruling, the Court of Justice of the European Union (the Court) introduced important nuance into the concept of personal data.
The case concerned comments from shareholders and creditors that were pseudonymised by the Single Resolution Board (SRB) and forwarded to Deloitte. Data subjects complained to the European Data Protection Supervisor (EDPS) that they had not been informed of this disclosure. The case ultimately reached the Court.
The Court introduced a key distinction. For the sender, the data remain personal data. For the recipient, this is not necessarily the case, provided that the recipient cannot reasonably identify the individuals concerned.
This may seem like a relaxation of the rules, but the Court emphasises context. The assessment depends on:
Crucially, this nuance does not affect the information obligations of the sender. Those obligations already arise at the point of collection of the personal data.
Note that the judgment formally concerns Regulation (EU) 2018/1725, the data protection regulation applicable to EU institutions and bodies, rather than the General Data Protection Regulation (GDPR). It is nevertheless equally relevant as an interpretative framework for the GDPR.
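To make the sender/recipient distinction more tangible, the sketch below illustrates, purely hypothetically, how pseudonymisation along these lines might look in code: the sender replaces direct identifiers with a keyed hash and keeps the key to itself, so the sender can still re-identify individuals while a recipient without the key may not reasonably be able to. The key, data and field names are our own assumptions for illustration only.

```python
import hmac
import hashlib

# Hypothetical illustration: the sender pseudonymises comments before
# forwarding them. The secret key never leaves the sender, so only the
# sender can map a pseudonym back to a person.
SENDER_SECRET_KEY = b"kept-strictly-by-the-sender"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonym)."""
    return hmac.new(SENDER_SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

comments = [
    {"author": "shareholder-001@example.com", "text": "Objection to the valuation."},
    {"author": "creditor-042@example.com", "text": "Request for clarification."},
]

# What the recipient receives: free-text comments plus an opaque pseudonym.
# Without the key (or other reasonable means of identification), the
# recipient may not be able to link a pseudonym to a natural person.
forwarded = [
    {"author_pseudonym": pseudonymise(c["author"]), "text": c["text"]}
    for c in comments
]

for record in forwarded:
    print(record)
```

Whether a recipient can in fact "reasonably" identify individuals remains a context-specific legal assessment; the free text of a comment may itself make someone identifiable, so a sketch like this illustrates the mechanism rather than settling the legal question.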
In 2025, the long-running case between the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) and the Koninklijke Nederlandse Lawn Tennis Bond (KNLTB) came to an end. The case originated in 2018, when the KNLTB sold its members’ personal data to sponsors to generate additional income. Following complaints from members, the AP intervened and launched an investigation.
The AP imposed an administrative fine of EUR 525,000. In its view, the disclosure of the personal data was not compatible with the original purpose of collection, and it did not accept the KNLTB’s commercial interest as a valid legal basis.
In 2024, however, the Court of Justice disagreed with the AP and clarified that a purely commercial interest is not automatically excluded as a legitimate interest under the GDPR. This line was followed in 2025 in the settlement between the AP and the KNLTB.
The KNLTB acknowledged that it had acted unlawfully when sharing member data. The AP subsequently reduced the fine from EUR 525,000 to EUR 250,000, partly in view of the awareness-raising measures taken by the KNLTB.
One of the most striking enforcement actions of the AP in 2025 was the EUR 2.7 million fine imposed on Experian. The company compiled creditworthiness reports for its clients, using large volumes of both public and non-public personal data, without informing individuals that they were being assessed.
Because credit scores can have significant consequences for individuals, the AP applies a high standard. It held that Experian processed more personal data than necessary, and did so without a valid legal basis. In addition, Experian failed to adequately inform data subjects about the existence and consequences of the collection of their personal data.
The European Data Act has applied since 12 September 2025. The Regulation governs who may access data, under what conditions and for what purpose. Its scope is broad. It covers not only parties that generate data, but also data holders, data recipients and, in exceptional cases, public authorities.
One of the main objectives of the Data Act is to break the concentration of data in the hands of dominant players. Users, both consumers and businesses, are granted a right of access to data generated through the use of connected products. These include cars, smartwatches and smart refrigerators, as well as the apps used with those products.
Users must be informed in advance which data are collected and how they can access them. The Act is based on access by design. This means either direct access for the user, or otherwise active provision. Users also have the right to have their data forwarded to a third party.
Responsibility for this lies with the data holder, for example the manufacturer or service provider. The data holder must ensure that users can effectively exercise their rights. In practice, there are often multiple data holders.
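By way of illustration only, the hypothetical sketch below shows what access by design could look like at the implementation level: a data holder exposes device-generated data directly to the user and, at the user’s request, forwards it to a third party. The class names, method names and data structures are our own assumptions and are not prescribed by the Data Act.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ConnectedProductData:
    """Data generated by the use of a connected product (e.g. a smartwatch)."""
    device_id: str
    records: list[dict] = field(default_factory=list)

@dataclass
class DataHolder:
    """Hypothetical data holder (e.g. a manufacturer) offering access by design."""
    datasets: dict[str, ConnectedProductData] = field(default_factory=dict)

    def access_for_user(self, user_id: str, device_id: str) -> list[dict]:
        """Direct access: the user retrieves the data generated by their own device.
        (In a real system, the user's entitlement to this device's data would be checked here.)"""
        return self.datasets[device_id].records

    def forward_to_third_party(
        self, user_id: str, device_id: str, deliver: Callable[[list[dict]], None]
    ) -> None:
        """Active provision: at the user's request, pass the data on to a chosen third party."""
        deliver(self.datasets[device_id].records)

# Example usage with made-up data
holder = DataHolder(
    datasets={"watch-123": ConnectedProductData("watch-123", [{"steps": 9500}])}
)
print(holder.access_for_user("user-1", "watch-123"))
holder.forward_to_third_party("user-1", "watch-123", deliver=lambda data: print("forwarded:", data))
```

The point of the sketch is simply that both routes, direct access and active provision to a third party, are designed in from the start rather than handled as ad hoc requests.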
The Data Act also introduces rules for mandatory data sharing between businesses (B2B), for example at the request of a user. Such sharing must take place under fixed conditions:
As a result, even parties that do not themselves generate data may fall within the scope of the Act.
Finally, public authorities may only request data where there is an exceptional need and must provide proper justification. In emergency situations, personal data may also be requested. Outside such situations, the focus is in principle on non-personal data.
Note that the current rules on business-to-government (B2G) data sharing under the Data Act may still be amended under the Digital Omnibus proposal. We return to this later in the blog.
In addition, 2025 brought clarity on a far-reaching amendment to the Dutch Telecommunications Act, which will enter into force on 1 July 2026. At present, companies may still contact existing customers by telephone to offer their own similar products or services, without prior consent.
This will change. From 1 July 2026, telemarketing based on an existing customer relationship will, in principle, no longer be permitted: companies may only call customers with commercial offers if the customer has explicitly consented.
For email and SMS, the change is less drastic. Companies may continue to approach existing customers via these channels without prior consent, provided that the communication relates to their own similar products or services.
The law also provides for clear exceptions to the telemarketing ban. Telephone contact based on an existing customer relationship will remain permitted for:
With the Digital Omnibus, presented in November 2025, the European Commission took a clear step towards simplifying the digital regulatory landscape. The Omnibus should not be seen as a major reset, but rather as a technical and substantive recalibration. Existing legislation remains in force, but is better aligned, clarified and, in some areas, made more flexible.
The impact of the Omnibus is most visible across three pillars: the GDPR, the AI Act and the Data Act.
A notable change is the explicit codification of the so-called relative approach to personal data. In short, data qualify as personal data for an organisation only if that organisation can reasonably identify an individual. The fact that another party could theoretically do so is no longer decisive. This approach aligns with the Court’s case law discussed above (the SRB/EDPS case).
To avoid divergent application in practice, the European Data Protection Board (EDPB) is tasked with developing further guidance.
The European Commission explicitly acknowledges that the GDPR can, in practice, constrain AI development. It therefore clarifies that legitimate interest may in principle serve as a legal basis for training AI models, provided that:
There is also more flexibility for special categories of personal data. Incidental use of such data for AI development becomes possible, for example to detect bias and discrimination, provided that appropriate technical and organisational measures are implemented.
In addition, the processing of biometric data (such as fingerprints, facial scans or voice recognition) is permitted where necessary to verify an individual’s identity, provided that the means of verification remain fully under the control of the data subject.
Finally, the Digital Omnibus introduces practical relief measures:
The most visible effect of the Digital Omnibus on the AI Act is the postponement of obligations for high-risk AI systems. The application of those obligations is made conditional on a formal decision by the European Commission confirming that the market is ready.
Once such a decision is adopted:
If no decision is adopted, the rules will automatically apply on 2 December 2027 for Annex III systems and on 2 August 2028 for Annex I systems. Importantly, the substantive requirements remain largely unchanged. This is about timing, not lowering standards.
Another important change concerns AI literacy. Where organisations initially faced a general training obligation, this shifts to a best-efforts obligation for Member States and the Commission.
In addition to these major changes, the Digital Omnibus introduces several targeted adjustments relevant in practice.
It is now explicitly permitted to combine conformity assessments under the AI Act with other mandatory product assessments, which may reduce administrative burdens, particularly for AI systems listed in Annex I. Regulatory sandboxes may also be established at Union level or through cooperation between multiple Member States.
Finally, the Digital Omnibus clarifies the division of supervisory responsibilities for general-purpose AI between the AI Office, the European Commission and national supervisory authorities.
The Omnibus positions the Data Act as the central framework for non-personal data. Other instruments, such as the Data Governance Act, the Regulation on the free flow of non-personal data in the European Union and the Open Data Directive, will cease to exist as standalone regimes and will be integrated into the Data Act.
The result is two clear pillars:
The Digital Omnibus tightens the Data Act on several points to address practical issues. Organisations receive greater protection for trade secrets. Data holders may refuse to share data where there is a real risk that confidential information would be unlawfully used or disclosed in third countries.
In addition, data sharing with public authorities (B2G) is significantly restricted. In contrast to the earlier B2G approach discussed above, public authorities will only be permitted to request data in clearly defined public emergency situations. SMEs are entitled to cost compensation, while larger enterprises must continue to provide data free of charge in such cases.
Finally, the rules on cloud switching are refined. The objective remains to enable customers to switch cloud services without additional costs and to prevent vendor lock-in. However, exceptions are introduced for bespoke services and certain SME providers.
The year 2025 once again shows that the law on data protection and privacy continues to evolve and become more refined. There is room for innovation and commercial interests, but only within clear legal boundaries. Transparency, context and due care remain central. With the Digital Omnibus, the focus shifts from isolated rules to coherence and practical applicability. For organisations, this means that an integrated approach to privacy, data and AI remains essential.
We will continue to monitor these developments closely and keep you informed. Subscribe to our newsletters to stay up to date.