the GDPR is under attack
A while ago, the European Commission launched a Digital Package on Simplification. The goal was to focus on regulatory and administrative issues in the digital law space (the 'digital acquis'): reducing burdens on businesses and citizens, updating parts to keep pace with technological advancement, and clarifying some of the overlap between laws like the GDPR, the AI Act, etc. This included:

- a digital fitness check based on the criteria of effectiveness, efficiency, relevance, coherence and EU added-value, as well as
- a digital 'omnibus', which means large-scale amendments across many existing acts and regulations, streamlining them to be more cohesive and easier to apply.

There was a call for evidence sent out by the Commission in September to get input from stakeholders and experts on how to simplify the rules around data protection, cybersecurity, AI and more, and on any pressing issues and concerns 1 . The deadline passed in the middle of October, and an official proposal developed from these insights is supposed to be released on the 19th of November. The hopes for this proposal were high: So-called small mid-cap enterprises ('SMCs', between 250 and 750 employees, annual turnover between 50 and 150 million) 2 were introduced as a category earlier this year, and the companies falling into it expected to be exempt from the more time- and cost-intensive regulatory parts of the GDPR specifically. EU citizens all over wanted a better way to handle cookie consent that does not result in 'cookie fatigue' and still respects their choice. Noyb.eu and others suggested an automated, browser-integrated setting tool for cookie objections called the Advanced Data Protection Control (ADPC) 3 .

A version of the proposal document leaked (undated), and I'm writing this to give an overview of the 156-page document, focusing on the bits relevant to end-users, and some juicy stuff. The proposal, as it stands pre-release, would suggest to:

- amend the GDPR, the Data Act, the AI Act and the ePrivacy Directive
- amend the NIS2 Directive (<- cybersecurity)
- repeal the Data Governance Act
- repeal the Free Flow of Non-personal Data Regulation (FFDR), and Regulation 2019/1150, which promotes fairness and transparency for business users of online intermediation services (also called the P2B Regulation)
- repeal the Open Data Directive

There's a separate proposal to amend the AI Act aside from this omnibus, too. None of this has to be bad per se, as this is kind of what the whole thing set out to do.
We have to look closer at what the actual amendments are, and whether the repealed acts really are superfluous and get merged into the appropriate law for clarity. The most important scoop, specifically the Article 2 amendments that change other articles:

It changes the definition of personal data under Article 4, stating that information is not considered personal data for a controller (= a company, etc.) if that controller has no means that could reliably identify the person. This would enable companies to avoid the obligations of the GDPR for quite a lot of what was previously thought of as personal data. It is particularly problematic because it introduces ambiguity: what is or isn't personal data would depend on what each controller is subjectively capable of, given its own unique capabilities. In consequence, it would remove pseudonymous or indirectly identifiable data from the GDPR's scope, where it used to be covered, significantly lowering protections. (Page 17)

Processing of special categories of personal data (Article 9) is changed. For one, it is no longer strictly forbidden unless a specific exemption is met; instead, it is only prohibited if the processing directly reveals (instead of infers) a person's sensitive personal data (health, ethnic origin, sexual orientation, religion etc.). That would ironically mean that people who don't want to disclose their sensitive information lose all protections, while those who share it outright remain protected. Another exemption would cover the residual processing of special categories of personal data for the development and operation of an AI system/model, and for verification via biometric data (think: verifying someone's identity via a fingerprint). (Page 18)

Article 12 is clarified to specify that the right of access may only be used for the purpose of protecting personal data, nothing else; if it is used for other matters, the company can refuse the request or charge a fee.
In practice, it indeed was used a lot to gather additional ammunition in court cases, especially between employers and employees. There are plenty of cases where employees and citizens used their right to know what data is processed, and to get a copy of it, to obtain internal communication that helped their case against that entity. It's going to be interesting to see courts decide whether a data subject's motivation was truly about data protection or not; simply having a court case against an entity should not bar you from exercising your rights. (Page 18)

Article 13 (the obligation of a company to inform you when it processes data about you that it gets from you directly) is changed so that there is no obligation to inform you if there are reasonable grounds to expect that you already know this, unless the company also transmits the data to other recipients or third countries, there is automated decision-making, or there is a high risk to your rights. In practice, this probably won't change much, as almost all data processing where the data comes from you directly falls under one of these exceptions; just think of creating a user account on Facebook or opening a bank account at your bank. But I still wanted to include it, as it fits into the overall image of an omnibus attempting to create loopholes and fewer restrictions for companies. (Page 18)

Requirements for automated decision-making in Article 22 are clarified. Article 22 previously said that people have the right not to be subject to a decision based solely on automated processing and profiling that has legal or similarly significant effects on them, unless certain conditions are met: the decision is necessary for entering into or performing a contract, it's authorized by law, or there is explicit consent. The proposed change concerns what "necessary for a contract" means, clarifying that "necessity" does not depend on whether a human could make the decision instead. That's... not really helpful?
Then what else could necessity possibly be? This is clearly just meant to enable more AI decision-making, when we all know that almost all AI decisions (like in hiring) could be made by a human instead. (Page 18)

Articles 33 and 34 deal with reporting requirements when a data breach happens. Previously, controllers always had to notify the authority unless the risk was very low, and had to notify individuals only if the risk was high. The proposal suggests significantly loosening these requirements: both notifications (to the authority and to individuals) would only be required if the breach is likely to result in a high risk to people's rights and freedoms. The notification deadline is also extended from 72 to 96 hours. That means significantly less oversight of and knowledge about data breaches, and more time for a breach to do damage before the relevant parties can start to act and protect themselves. (Page 18)

Processing of personal data on and from 'terminal equipment' (= phones and computers) is supposed to be handled solely by the GDPR now instead of the ePrivacy Directive, and cookie regulation is intended to be aligned with its principles. (Page 6)

The proposal intends to pave the way for automated, machine-readable indications of individual choices in browser or device settings, and calls upon the standardization bodies to develop a standard. Once that standard is set and implemented across browsers and devices, there would be a six-month grace period before website controllers are obliged to respect these settings. This sounds very close to, or exactly like, the Advanced Data Protection Control (ADPC) I mentioned above, which means we could one day have just a setting in the browser instead of banners and checkmarks. (Page 6)
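To make the idea concrete, here is a minimal sketch of how a website backend might honor such a standardized, machine-readable signal once it exists. The header name and its values are my own assumptions for illustration, loosely modeled on signals like ADPC and Global Privacy Control; the actual standard has yet to be written:

```python
# Hypothetical sketch: honoring an automated privacy signal server-side.
# "Privacy-Signal" and its values are invented placeholders; the real
# standardized header would be defined by the standardization bodies.

def tracking_allowed(headers: dict[str, str]) -> bool:
    """Return False when the visitor's browser sends an opt-out signal."""
    signal = headers.get("Privacy-Signal", "").strip().lower()
    if signal in ("opt-out", "withdraw=*"):
        # Respect the automated objection: set no tracking cookies,
        # load no third-party trackers, show no consent banner.
        return False
    # No signal present: fall back to a regular consent flow.
    return True
```

The point of such a design is that the objection travels with every request automatically, so the user states their choice once in the browser instead of clicking through a banner on every site.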
Unfortunately, they write: "Considering the importance of advertising revenue for independent journalism as an indispensable pillar of a democratic society, media service providers as defined in Regulation (EU) 2024/1083 (European Media Freedom Act) should not be obliged to respect such signals." Which means you can get ready for every news site to track you to hell and sell your data under the guise of democracy. This also shines a new, disappointing light on how the EU is willing to handle "Pay or Okay" cases going forward.

As for the ePrivacy Directive: tracking would be allowed without consent if it poses a low risk to rights and if it is needed to fulfill a contract. That could significantly lessen our protections and rights against ubiquitous tracking online... (Page 5-6)

Unfortunately, the main omnibus proposal document only includes contextual references to the AI Act changes, not the specifics; those are in a separate document here. In short (Page 3, 11-12, 27): more simplifications, flexibility in post-market monitoring, exemptions for R&D scenarios, and delayed obligations. No obligation for AI literacy of staff anymore, only encouragement to foster it? High-risk AI systems (especially those used by public authorities!) get an extension of the compliance deadline until August 2030 (for comparison: general-purpose AI systems need to comply by 2027). Isn't that insane? The high-risk systems should be the first to have to comply; they are, by name, high-risk.

They expect possible savings of at least 1 billion EUR annually, with an additional 1 billion in one-off savings, amounting to a total of 4 billion over three years by 2029 (Page 12-13).

It seems like what companies and the average citizen hoped for was made true: bureaucratic relief for companies and some of the overseeing bodies alike, by loosening requirements, shortening them or making them voluntary, as well as paving the way to tackle the cookie banner problem via an automated setting across websites. But: even though the document repeatedly says they aim to keep the high protection standard and the ethical core of the regulations intact, I can't agree that they succeeded. Changing the definition of personal data in a way that would significantly shift the application of the GDPR messes directly with the level of protection and raises several concerns.
Most of the changes really are not in favor of the data subjects; instead, they make it easier for companies not to comply, not to tell, not to record or report, while making it easier to track and collect data without consent, further bolstering "legitimate interest", supposed contractual obligations and the controller's subjective capability of identifying an individual as valid reasons. It seems to me that the Commission is fully willing to go the 'all or nothing' path: if you sign up for certain services, you fully consent and have no granular control over what happens with your data, because it is all lumped under the doubtful reasoning of "providing a service for you". We all know Meta doesn't need half the data it collects to provide that service to you, but it seems like the times of holding companies accountable for this have passed, as this business model is legitimized and the EU is scared of being left behind in innovation if it sticks to the rule of law and democratic values. It doesn't take into account how many people feel forced (whether by their environment, their employer, their industry peers, their job chances, or their own emotional reliance) to use these services, even though they'd prefer to use them in the most data-restrictive way possible. Citizens should not feel the need to sell themselves out fully just to access a digital job marketplace or a digital flea market.

In practice, it's looking bleak for what once made the GDPR special. We'll have to see the final version on the 19th, and then watch whether the proposal is accepted and how it is put into practice. If you want to read the doc yourself, here is the link.

Published 13 Nov, 2025
The proposal aims to create a single entry point via ENISA through which reporting obligations under multiple legal acts can be fulfilled, saving some administrative burden and removing the need for double reporting. But it also centralizes power at the EU level and reduces national control and transparency. (Page 7)

The P2B Regulation is suggested to be repealed because the Digital Markets Act (DMA) and the Digital Services Act (DSA) are considered to have largely overtaken it. (Page 7)

The Open Data Directive's rules are absorbed into the Data Act. (Page 8)

The Data Governance Act (DGA), which specifically handles how data can be shared and reused across the EU by 'data intermediaries' (companies that help others share data, like AWS Data Exchange or Microsoft Azure Data Share) and 'data altruism organizations' (entities that collect and share data voluntarily for the public good, like the European Brain Data Hub in Belgium), is supposed to be amended so that it's easier for data-sharing services to grow. Unfortunately, that means compliance with certain legal requirements under the DGA is turned into a voluntary framework rather than an obligation. Currently, the DGA also requires that a company offering data intermediation services run those services through a legally separate company, to avoid conflicts of interest, for example. Under the new plan, that strict requirement would be relaxed: companies wouldn't need a separate legal entity, only functional separation (for example, separate departments or IT systems), as long as they meet certain other conditions ensuring independence and trust. The overall compliance requirements for data intermediaries are supposed to be drastically shortened, with fewer administrative and reporting duties, and the reporting and transparency obligations for data altruism organizations are removed. (Page 15)
The requirements for when to do a data protection impact assessment (DPIA) are harmonized in terms of how, when, and what 'high risk' means. There will also be harmonized templates; the European Data Protection Board would be obliged to prepare a proposal for a common template. (Page 18)

There was a lot more than that, but if you want the details, read this, page 11. And: The European Data Protection Board (EDPB) also asked for feedback on the Guidelines 03/2025 on the interplay between the DSA and the GDPR, and you can find that here. Interesting to note: You can publicly see the feedback that was given by Google Ireland, Meta Ireland, and Amazon EU. ↩

Not to be mixed up with SMEs, which stands for small and medium-sized enterprises. According to the proposal, SMEs now include SMCs, when they previously didn't. ↩

Here is the initial news of the tool from 2021. ↩