Chapter 10 of 10
Module 10 – Current Trends and Future Directions: Simplification and New Proposals
Concludes with an overview of ongoing reforms and debates, including efforts to simplify overlapping tech regulations, the proposed ‘Digital Omnibus’ adjustments, and the emerging Digital Fairness Act.
1. Where We Are Now: A Dense EU Digital Rulebook
Over roughly the last decade, the EU has built a dense web of digital laws. By late 2025, many of these are already in force or in phased implementation:
- GDPR (since 2018): core data protection and privacy rules.
- Digital Services Act (DSA): content moderation, platform transparency, user protection.
- Digital Markets Act (DMA): gatekeepers and competition in digital markets.
- AI Act (adopted 2024, phased application from 2025): risk-based rules for AI systems.
- Data Act (adopted 2023, in force since January 2024, applies from 2025–2026): access to and sharing of data generated by connected products and related services.
- Plus sectoral rules: NIS2, Data Governance Act, ePrivacy rules, etc.
From the perspective of a company or regulator, this creates:
- Overlap (e.g., transparency duties under DSA vs. AI Act vs. GDPR).
- Inconsistencies (different definitions of “online platform”, “profiling”, “high‑risk”).
- Complex compliance journeys (one product can trigger 4–5 different regimes).
This module focuses on current reform debates (as of late 2025):
- Regulatory simplification and overlap review across the big digital laws.
- The idea of a “Digital Omnibus” package to tweak and align existing acts.
- The emerging Digital Fairness Act (DFA) and related work on dark patterns and consumer protection.
You should keep in mind the enforcement picture from Module 9: the combination of more rules and more enforcers has made simplification a political priority.
2. Why Simplification? Main Drivers Behind the Current Push
Several practical and political drivers explain why the EU is now talking about simplification rather than just more rules:
- Regulatory fatigue
- Businesses (especially SMEs) report that the digital rulebook is hard to navigate.
- Compliance teams must interpret multiple regulations at once (e.g., DSA + GDPR + AI Act for a recommender system).
- Overlapping obligations
- Example: algorithmic transparency appears in:
- DSA (for recommender systems, especially on very large online platforms (VLOPs) and very large online search engines (VLOSEs))
- AI Act (for high‑risk AI and certain general‑purpose AI)
- GDPR (for automated decision‑making and profiling)
- This can mean duplicated documentation and audits for the same system.
- Enforcement coordination problems
- Different authorities: data protection authorities, Digital Services Coordinators, competition authorities, consumer protection authorities, market surveillance bodies.
- Without alignment, platforms and AI providers may face conflicting guidance or parallel investigations.
- Innovation and competitiveness concerns
- Some policymakers worry that a fragmented, heavy rulebook could discourage EU startups and slow AI deployment.
- This is a central argument in discussions about delaying or phasing in some high‑risk AI obligations.
- Political messaging
- After years of “more and stricter rules”, there is pressure to show that the EU can also streamline and clarify.
By the end of this module, you should be able to identify which simplification proposals respond to which of these drivers.
3. Mapping Overlaps: Quick Thought Exercise
Imagine a large online marketplace operating in the EU (think of a platform where users can buy and sell products, with recommendation algorithms and some AI‑based fraud detection).
Task (3–4 minutes):
- List at least three EU digital laws likely to apply to this marketplace.
- For each law, note one type of obligation it imposes.
- Then, mark where you see potential overlap.
Use this simple table format in your notes:
```text
Law | Example obligation | Possible overlap with
-------------|----------------------------------------|----------------------
DSA | e.g. notice-and-action for illegal | e.g. content rules under ...
| content, recommender transparency |
GDPR | e.g. lawful basis for profiling | e.g. AI Act transparency
AI Act | e.g. risk management for high-risk AI | e.g. GDPR, DSA
DMA | e.g. self-preferencing ban | e.g. competition law
Data Act | e.g. B2B data sharing duties | e.g. trade secrets rules
```
Reflect:
- Which overlaps feel manageable (just extra paperwork)?
- Which could lead to conflicting expectations (e.g., transparency vs. trade secrets, or user control vs. safety)?
You do not need to submit answers, but you will use this mental map in later steps when we discuss the Digital Omnibus idea.
4. The ‘Digital Omnibus’ Idea: Tweaking Multiple Laws at Once
In EU law, an “Omnibus” instrument is a legislative act that amends several existing acts at once. In the digital field, policymakers and commentators have been discussing a potential “Digital Omnibus” package to:
- Align definitions across DSA, DMA, AI Act, GDPR, Data Act (e.g., online platform, profiling, high‑risk AI system).
- Streamline overlapping transparency and reporting duties (e.g., one consolidated annual report could satisfy several acts, with common templates).
- Clarify hierarchies and conflict rules:
- Example: if both the AI Act and the DSA apply to a recommender system, which transparency rule takes precedence, or can both be satisfied through a single integrated disclosure?
- Adjust timelines for implementation where there is clear evidence of regulatory overload, especially for smaller providers.
As of late 2025:
- The exact content and timing of a Digital Omnibus are still under political discussion.
- The concept is often mentioned together with “regulatory fitness checks” and REFIT (the EU’s existing program for simplifying and improving existing legislation).
You should think of the Digital Omnibus as a toolbox for:
- Technical fixes (typos, cross‑references, inconsistent wording).
- Substantive alignment (same concept → same definition across laws).
- Procedural simplification (reduced duplication of reports, audits, and risk assessments).
5. Concrete Example: One Product, Many Laws (and How an Omnibus Could Help)
Consider a job‑matching platform that:
- Uses an AI model to rank candidates for employers.
- Shows candidates personalized job recommendations.
- Operates as an online platform hosting user profiles and job ads.
Today, this platform may face:
- GDPR
- Must have a lawful basis for processing personal data.
- Must explain automated decision‑making and profiling to users.
- AI Act
- The ranking system could be a high‑risk AI system (e.g., affecting access to employment).
- Requires risk management, data governance, transparency, and human oversight.
- DSA (if it meets the thresholds for an online platform, possibly a VLOP)
- Must provide recommender system transparency.
- Users should be able to modify the main parameters of recommendations (and VLOPs must offer at least one recommendation option not based on profiling).
- Data Act (depending on data sharing features)
- May need to allow certain access to non‑personal data or clarify conditions for B2B data sharing.
Where a Digital Omnibus could simplify:
- Single transparency layer: Instead of three partially overlapping explanations (GDPR, DSA, AI Act), the Omnibus could:
- Define a common core of information users must receive about automated decisions and recommendations.
- Allow one combined notice that satisfies all three laws.
- Aligned risk assessments:
- A single risk assessment template could be recognized under both the AI Act and DSA systemic risk rules (for very large platforms).
- Common definitions:
- Use the same definition of “profiling” and “high‑risk decision” across GDPR and AI Act, reducing legal uncertainty.
This example shows how technical alignment can significantly reduce compliance complexity without necessarily weakening protections.
6. High-Risk AI Obligations: Debates on Delays and Phasing
The AI Act introduces strict obligations for providers and deployers of high‑risk AI systems (e.g., in employment, credit scoring, critical infrastructure, healthcare).
By late 2025, there are ongoing debates about:
- Timing and phasing
- Some stakeholders argue for longer transition periods for certain high‑risk sectors.
- Others push for faster application in sensitive areas (e.g., biometric identification, law enforcement) to prevent abuses.
- Capacity of regulators and companies
- National market surveillance authorities and notified bodies need expertise and resources to assess AI systems.
- SMEs and public authorities (e.g., hospitals, schools) may struggle to implement all obligations quickly.
- Interaction with other laws
- If a high‑risk AI system is used on a major platform, DSA and AI Act obligations can stack up.
- Simplification proposals include:
- Shared documentation: one technical file serving multiple regulatory purposes.
- Proportionality: more tailored obligations for smaller deployers.
- Proposals in the simplification debate
- Targeted delays for specific high‑risk categories where compliance is particularly complex.
- Priority focus on the most harmful uses (e.g., certain biometric systems), while giving more time for others.
For your exams or essays, be precise: the AI Act itself is adopted and binding, but how strictly and how fast some obligations are applied is a live policy discussion connected to the broader simplification agenda.
7. The Emerging Digital Fairness Act (DFA): Focus on Dark Patterns
The Digital Fairness Act (DFA) is a proposed EU initiative aimed at modernizing consumer protection in the digital environment.
As of late 2025, it is still under discussion, but key themes from Commission communications and public consultations include:
- Dark patterns
- Dark patterns are interface designs that manipulate or mislead users into making choices they might not otherwise make.
- Examples:
- Making the “accept all tracking” button bright and large while hiding “reject” in small text.
- Repeated pop‑ups pressuring users to stay subscribed (“Are you sure you want to lose these amazing benefits?”).
- Digital consumer protection gaps
- Existing rules (e.g., Unfair Commercial Practices Directive, Consumer Rights Directive) were not written with apps, platforms, and AI‑driven personalization in mind.
- The DFA discussions explore whether to:
- Explicitly ban certain dark patterns.
- Clarify what counts as “misleading” or “aggressive” in algorithmic interfaces.
- Relationship to other laws
- The DSA already restricts certain dark patterns on online platforms (Article 25 prohibits interface designs that deceive or manipulate users or materially distort their decisions).
- GDPR addresses manipulative consent flows for data processing.
- The DFA aims to fill remaining gaps from a consumer contract and fairness perspective.
- Public consultation
- The Commission has used public consultations to gather input from consumers, businesses, NGOs, and academics on:
- Real‑world examples of manipulative design.
- Whether current consumer law is sufficient.
For your purposes, treat the DFA as a consumer‑law complement to DSA/GDPR, focusing on fairness of digital transactions and interfaces, not just data or content moderation.
8. Spot the Dark Pattern: Short Activity
Consider the following signup screen for a streaming service (imaginary example):
> Screen description: The page has a huge green button saying “Start Free Trial”. Under it, in tiny grey text, it says: “Free trial converts automatically into a €29.99/month subscription after 7 days unless cancelled at least 48 hours in advance.” There is no visible “More options” or “Continue without trial” button; the only way to proceed is to click “Start Free Trial”.
Questions to think about (2–3 minutes):
- Which elements of this design could be considered a dark pattern from a consumer protection perspective?
- How might GDPR (if personal data is collected), DSA (if this is part of an online platform), and a future Digital Fairness Act each approach this situation?
- If you were drafting rules under the DFA, would you:
- (a) create a general fairness standard (e.g., “traders must not design interfaces that materially distort consumer decisions”), or
- (b) list specific banned practices (e.g., “hiding material information in small print”, “pre‑selected paid options”)?
Write down a one‑sentence rule you would include in a Digital Fairness Act to address this scenario.
9. Quick Check: Simplification and New Proposals
Test your understanding of the key ideas discussed so far.
Which of the following best captures the **main purpose** of a potential “Digital Omnibus” package in the EU?
- A) To replace the GDPR, DSA, DMA, AI Act, and Data Act with a single new regulation.
- B) To amend and align multiple existing digital laws at once, reducing overlaps and inconsistencies.
- C) To create a new enforcement agency at EU level that centralizes all digital supervision.
Answer: B) To amend and align multiple existing digital laws at once, reducing overlaps and inconsistencies.
A “Digital Omnibus” package is discussed as a way to **amend and align several existing acts at once**, especially to fix overlaps and inconsistencies. It is not intended to fully replace the existing framework (so option A is incorrect), nor is it primarily about creating a new enforcement agency (option C).
10. Key Terms Review
Review the central concepts from this module.
- Digital Omnibus (in EU digital policy)
- A proposed approach where the EU would adopt a single legislative act that **amends several existing digital regulations at once** (e.g., DSA, DMA, AI Act, GDPR, Data Act) to align definitions, streamline obligations, and fix overlaps.
- Regulatory simplification
- Efforts to **reduce complexity and duplication** in the legal framework, for example by harmonizing definitions, consolidating reporting duties, or clarifying how different acts interact.
- High-risk AI (AI Act context)
- AI systems that pose significant risks to health, safety, or fundamental rights (e.g., in employment, credit scoring, critical infrastructure) and are therefore subject to **stricter obligations** under the AI Act.
- Dark patterns
- User interface designs that **manipulate or mislead** users into choices they might not otherwise make, often by hiding information, nudging toward more profitable options, or making refusal difficult.
- Digital Fairness Act (DFA)
- An emerging EU legislative proposal focusing on **modernizing consumer protection** in the digital environment, particularly targeting dark patterns and unfair digital practices, and complementing DSA and GDPR.
- Overlap review
- A systematic assessment of how different laws apply to the same services or technologies, identifying **conflicts, duplications, and gaps** to inform simplification or amendment proposals.
- REFIT
- The European Commission’s Regulatory Fitness and Performance program, which reviews existing EU legislation to ensure it is effective, efficient, and as simple as possible.