On 19 November, the European Commission (hereafter: the Commission) presented its proposal for the Digital Omnibus package. This package contains proposals to relax or consolidate several directives and regulations in the digital domain, including the General Data Protection Regulation (hereafter: GDPR), the ePrivacy Directive, the Data Act and the AI Act. According to the Commission, the aim of these changes is to ensure that regulation supports innovation and economic growth rather than hindering them.
For some time now, the market has criticised European Union regulation as overly strict and as hampering innovation. This is not entirely surprising, as various reports show that Europe is indeed lagging behind when it comes to innovation. On the other hand, many people agree that the existing rules are necessary, not least to protect fundamental rights. Moreover, it is not certain that current regulation is the main cause of Europe’s limited innovative capacity. And if regulation does form a bottleneck, the question is whether this is due to the time and effort required to comply with regulation, or due to a lack of understanding of it. Critics fear that the Commission’s proposal will lead to more uncertainty, forcing organisations once again to invest time, resources and money to adapt.
The Commission emphasises that the proposed amendments are intended to streamline the regulatory framework, while the underlying objective, the protection of fundamental rights, remains unchanged.
In this blog, we discuss the proposed amendments relating to the GDPR and what they could mean for your organisation. Worth noting: the complete Digital Omnibus package contains further amendments, such as the relaxation of rules in the AI Act. You can read more about the changes to the AI Act here.
Perhaps the most interesting proposal is the Commission's intention to amend the definition of 'personal data'. This is the core concept around which the GDPR revolves, and its scope is widely debated. The Commission seeks to address that debate by giving controllers guidance to determine, in specific cases, whether pseudonymised data should or should not be regarded as personal data.
The proposal supplements the definition of personal data. The Commission introduces the ‘relative approach’, meaning that data may be personal data for one party, but not for another. The definition of personal data therefore depends on the party processing the data. The Commission explicitly adds that the fact that a potential recipient could identify individuals does not mean that the data are also personal data for the sender. The mere existence of additional information that could be used to identify a data subject therefore does not automatically mean that pseudonymised data must be regarded as personal data[1].
Last year, the Court of Justice of the European Union (hereafter: the Court) clarified the scope of the concept of personal data in the context of sharing pseudonymised personal data with third parties in the GAR case. With this proposal, the Commission appears to want to codify this interpretation of pseudonymised personal data in legislation. It is, however, questionable why the Commission considers it necessary to amend the law in this respect. After all, case law already serves to further interpret the law.
In addition to the above, the Commission proposes to develop tools and clear criteria enabling controllers to assess whether pseudonymised data should no longer be regarded as personal data and what the risk of re-identification is. These criteria can then be used to demonstrate that the data cannot be traced back to individual persons. The European Data Protection Board (EDPB) will be closely involved in this process and will issue an opinion on the draft criteria.
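The 'relative approach' described above can be made concrete with a small sketch. Assuming, purely for illustration, that a sender pseudonymises identifiers with a keyed hash (HMAC-SHA256) and keeps the key to itself: the sender retains the means to re-link the pseudonyms, while a recipient without the key has no reasonable means to do so. The function name and data below are hypothetical, not taken from the proposal.

```python
import hmac
import hashlib


def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Whoever holds secret_key can recompute the mapping and link the
    pseudonym back to the original identifier; a recipient without the
    key cannot reasonably reverse it.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()


# Hypothetical scenario: the sender keeps the key; the recipient only
# ever sees the record below, containing the pseudonym.
key = b"kept-by-the-sender-only"
record = {"customer": pseudonymise("alice@example.com", key), "spend_eur": 120}

# The sender can re-link a known identifier by recomputing its pseudonym.
assert record["customer"] == pseudonymise("alice@example.com", key)
```

Under the relative approach, the same record could count as personal data in the sender's hands (the key is a means reasonably likely to be used) but not in the recipient's, which is precisely the distinction the proposed criteria are meant to capture.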
In addition to amending the definition of ‘personal data’, the approach to ‘special categories of personal data’ is also adjusted. In principle, it is prohibited to process special categories of personal data, unless one of the exceptions in Article 9 GDPR applies. The proposal adds two additional exceptions under which organisations may process special categories of personal data. These exceptions are as follows:
The processing of special categories of personal data in the development and operation of AI systems (see the AI Act) is permitted under certain conditions. For example, the organisation must take organisational and technical measures to prevent the collection of special categories of personal data. If the controller nevertheless accidentally processes special categories of personal data, it must delete them. If deletion requires unreasonable effort, the controller must ensure that those personal data are not inadvertently exposed or generated as output.
The proposal gives controllers scope to refuse a request from a data subject where that data subject clearly abuses their rights, and amends the right of access accordingly. If a data subject submits a request that is manifestly unfounded or excessive, or clearly uses their privacy rights for purposes other than data protection, the controller may refuse the request or charge a reasonable fee. In such cases, it is for the controller to demonstrate the abuse.
The proposal relaxes the information obligation under Article 13 in two respects.
Where a controller collects personal data directly from the data subject, it may be exempt from providing the information required under Article 13, provided it can be assumed that the data subject is already aware of the controller's identity, the purposes of the processing and how to contact the controller to exercise their rights. This exception applies only where there is a clear and limited relationship between the data subject and the controller and where the processing of personal data is non-intensive. It does not apply if the data are disclosed to other recipients or categories of recipients, transferred to a third country, subject to automated decision-making, or where the processing poses a high risk to the data subject.
In addition, an exception is introduced for processing carried out in the context of scientific research. Under this exception, the controller does not have to provide information to data subjects where this proves impossible or would require disproportionate effort. The controller nevertheless remains obliged to protect the data subject’s rights and freedoms. This includes making information about the processing publicly available.
Currently, Article 22 provides that a data subject has the right not to be subject to automated decision-making without human intervention where this produces legal effects or similarly significantly affects them. A number of exceptions apply. The proposal recasts this 'negative' formulation (not permitted, unless) as a 'positive' one: automated decision-making is permitted only if certain conditions are met. The proposal also clarifies the requirement of 'necessity' where automated decision-making is based on the performance of a contract. The amended article explicitly states that the requirement of necessity may also be satisfied where the decision could have been taken in a way other than exclusively by automated means.
Several changes are also proposed in relation to personal data breaches and their notification. First, the Commission aims to streamline breach notification by introducing a single entry point. This means that a single authority will handle notifications under the various regulatory frameworks. Notifications under the GDPR, the Network and Information Security Directive 2 (NIS2), the Electronic Identification, Authentication and Trust Services Regulation (eIDAS Regulation), the Digital Operational Resilience Act (DORA), and the Critical Entities Resilience Directive (CER) will all need to be submitted to this single entry point. The EDPB will develop a template for notifications, together with a list of situations in which a high risk to data subjects must be assumed.
In addition, the obligation for a controller to notify a personal data breach to the supervisory authority is aligned with the obligation to inform data subjects. Both notifications are required only where the breach poses a high risk to the rights of data subjects. The deadline for notification to the authority is extended to 96 hours. The obligation and threshold for processors to notify controllers of personal data breaches, as set out in the current Article 33(2) GDPR, remain unchanged.
The proposal includes a mechanism requiring the EDPB to draw up lists of situations in which a Data Protection Impact Assessment (DPIA) is or is not required, supplemented by a fixed template and methodology. With this approach, the Commission seeks to increase legal certainty and create a harmonised interpretation of when processing qualifies as high risk. At present, there are no European-level guidelines on when a DPIA is required, although several national supervisory authorities have published their own lists.
By introducing a new article, the Commission aims to establish that the lawful basis of 'legitimate interest' (Article 6(1)(f) GDPR) may be used to process personal data for the purpose of training AI models. This aligns with the prevailing view on the use of personal data for training AI models, as reflected in, among others, Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models and the position of the Dutch Data Protection Authority on the training of AI models.
A number of exceptions apply. This lawful basis does not apply where other Union or national laws explicitly require consent, or where the fundamental rights and interests of data subjects override the controller’s legitimate interest. This latter requirement carries even greater weight where the data subject is a child.
The Digital Omnibus package contains a substantial number of changes that could significantly alter current practice. Whether these relaxations will achieve the Commission’s objective of stimulating innovation, or instead undermine the digital rights of data subjects, remains to be seen.
Leaving this speculation aside, it is important to note that the proposal is currently exactly that: a proposal. It therefore has no immediate effect on practice. Under the ordinary legislative procedure, these changes could be adopted at the earliest by the middle of next year. Before then, much still needs to happen and many elements may change, particularly given the ongoing debate.
Do you have questions about the Digital Omnibus package? Please feel free to contact us. We would be happy to help.
[1] "Information relating to a natural person is not necessarily personal data for every other person or entity, merely because another entity can identify that natural person. Information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity. Such information does not become personal for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the natural person to whom the information relates."