Chapter 8 of 14
Module 8: Information and Intelligence Sharing Under DORA
This module covers DORA’s provisions on voluntary information sharing about cyber threats and vulnerabilities, and how these mechanisms support sector‑wide resilience.
Module 8 Overview: Why Information Sharing Matters Under DORA
Where we are in the course
- Module 6: You studied digital operational resilience testing and TLPT.
- Module 7: You analyzed ICT third‑party risk and oversight of critical providers.
- Module 8 (this module): You now focus on information and intelligence sharing as a system‑level resilience tool.
Legal context (as of December 2025)
- DORA: Regulation (EU) 2022/2554 on digital operational resilience for the financial sector.
- Adopted: December 2022.
- Entered into application: 17 January 2025.
- DORA interacts with:
- GDPR (Regulation (EU) 2016/679) – personal data protection.
- EU competition law (Articles 101–102 TFEU) – antitrust.
- NIS2 Directive (EU) 2022/2555 – broader cybersecurity obligations (transposed in Member States from 2024 onwards).
High‑level idea
DORA encourages, but does not mandate, the voluntary sharing of cyber threat and vulnerability information among financial entities and with authorities. The goals are to:
- Shorten detection and response times to new threats.
- Prevent cascade effects across highly interconnected financial infrastructures.
- Raise the baseline of cyber maturity across the sector, including smaller entities.
You will learn to:
- Explain DORA’s objectives and legal basis for cyber threat information sharing.
- Identify constraints: data protection, confidentiality, and antitrust.
- Design governance mechanisms for participation in cyber information‑sharing initiatives that are compliant with DORA and GDPR.
Keep in mind: DORA’s information‑sharing provisions are principles‑based. The real complexity lies in implementation choices you make inside a financial entity.
DORA’s Legal Basis and Objectives for Information Sharing
1. Legal anchors in DORA
Key provisions (paraphrased for clarity):
- Recitals of DORA (especially those on information‑sharing arrangements) – explain the EU’s policy intent to foster trusted cyber‑threat communities.
- Operational risk & ICT risk management chapters – allow and encourage participation in information‑sharing arrangements as part of sound ICT risk management.
- Incident reporting & threat notification sections – distinguish mandatory incident reporting from voluntary threat/vulnerability sharing.
> Conceptual distinction:
> - Incident reporting: mandatory, structured, to competent authorities (e.g., major ICT‑related incidents).
> - Threat/vulnerability information sharing: typically voluntary, often peer‑to‑peer or via communities, and more forward‑looking.
2. Policy objectives
DORA’s information‑sharing framework aims to:
- Improve situational awareness: Entities gain visibility into:
- New malware campaigns targeting financial institutions.
- Exploited vulnerabilities in widely used ICT products.
- Tactics, techniques, and procedures (TTPs) used by threat actors.
- Support collective defense:
- If one bank sees a new phishing kit targeting its customers, sharing IOCs (indicators of compromise) helps others block it.
- Shared playbooks and mitigation strategies reduce duplication of effort.
- Promote proportionality:
- Smaller entities often lack sophisticated threat intel capabilities.
- Information sharing lets them leverage the capabilities of larger players and sectoral CSIRTs.
- Complement TLPT and third‑party risk (Modules 6–7):
- TLPT results feed into community knowledge of realistic attack paths.
- Shared intelligence on critical providers’ vulnerabilities supports better third‑party risk management.
Challenge question (for yourself):
> How does voluntary threat information sharing differ in purpose and timing from the mandatory incident reporting regime under DORA and NIS2?
Example: Voluntary Threat Notification vs Mandatory Incident Reporting
Scenario
A mid‑size EU bank detects:
- A new phishing campaign targeting its retail customers.
- The phishing emails imitate the bank’s MFA prompts and redirect to a fake login page.
- The bank blocks the domain before any known financial loss occurs.
How this plays out under DORA
- Mandatory incident reporting?
- So far, no major service disruption and no confirmed financial loss.
- This may not yet qualify as a major ICT‑related incident under DORA’s thresholds (which are further specified in RTS/ITS drafted by the ESAs and adopted by the European Commission).
- Voluntary threat notification to authorities
- The bank can issue a voluntary threat notification to its competent authority (or to a sectoral CSIRT if aligned with national practice), sharing:
- Email subject lines and body text patterns.
- Sender IPs and domains.
- Screenshots of the phishing site.
- Observed TTPs (e.g., use of particular crimeware kits).
- This helps authorities warn other entities before the campaign scales.
- Peer‑to‑peer sharing in a trusted community
- The bank also posts anonymized IOCs and TTPs to a financial ISAC‑style community (ISAC: Information Sharing and Analysis Center), possibly:
- Via a secure platform using STIX/TAXII.
- Under a community charter that defines confidentiality and use restrictions.
- Difference in legal framing
- Mandatory incident reporting: failure to report when thresholds are met can lead to supervisory measures and sanctions under DORA.
- Voluntary threat notification and sharing: encouraged, but not sanctioned if omitted; however, supervisors may expect mature entities to participate as part of good ICT risk management.
Key takeaway: Voluntary sharing is about anticipatory defense and sector‑wide resilience, not just compliance.
Trusted Communities and Information‑Sharing Arrangements
1. What is an information‑sharing arrangement under DORA?
Under DORA, financial entities may participate in formal or informal arrangements for exchanging:
- Technical data: IOCs, malware hashes, C2 IPs, YARA rules.
- Tactical intel: attack paths, exploited misconfigurations.
- Strategic intel: threat actor trends, sector‑specific risk patterns.
These arrangements can be:
- Sectoral ISACs (e.g., national or EU‑level financial ISACs).
- Public–private partnerships (PPP) with authorities and CSIRTs.
- Vendor‑driven communities (e.g., sharing through a major SOC provider).
2. Requirements for a “trusted community”
DORA expects that participation is structured and governed, not random ad‑hoc emailing. Typical features:
- Clear membership criteria
- Only vetted entities (licensed financial institutions, critical ICT providers, sectoral CSIRTs).
- Governance documents
- Charter or MoU describing:
- Purpose (e.g., cyber defense only).
- Roles (who can contribute, who can consume).
- Decision‑making and escalation.
- Confidentiality and use limitations
- NDA‑like commitments.
- Restrictions on re‑sharing outside the community.
- Security of the sharing platform
- Strong authentication, encryption, logging.
- Support for standard formats (e.g., STIX 2.x) to enable automation.
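As a concrete illustration of the STIX 2.x format mentioned above, here is a minimal sketch of a STIX 2.1 Indicator object built as a plain Python dict. The field names follow the STIX 2.1 specification; the helper function and the domain are illustrative assumptions, not anything defined by DORA:

```python
import json
import uuid
from datetime import datetime, timezone

def make_phishing_indicator(domain: str) -> dict:
    """Build a minimal STIX 2.1 Indicator object for a phishing domain.

    Structure follows the STIX 2.1 specification; the domain and the
    identifiers generated here are purely illustrative.
    """
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": "Phishing domain imitating bank login page",
        "indicator_types": ["malicious-activity"],
        # STIX patterning expression matching the observed domain
        "pattern": f"[domain-name:value = '{domain}']",
        "pattern_type": "stix",
        "valid_from": now,
    }

indicator = make_phishing_indicator("secure-login-examplebank.com")
print(json.dumps(indicator, indent=2))
```

Emitting indicators in this machine‑readable shape is what allows recipients to feed them straight into SIEM rules or blocklists over a TAXII channel, rather than re‑typing details from an email.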
3. Relationship with DORA’s third‑party and TLPT regimes
- Third‑party risk (Module 7):
- Communities may include critical ICT providers; intel about their vulnerabilities or service degradations is highly valuable but also sensitive.
- TLPT (Module 6):
- Aggregated, anonymized insights from TLPT exercises can be shared to highlight systemic weaknesses (e.g., common misconfigurations in SWIFT connectivity) without exposing individual entities.
Analytical angle:
> When does including a critical ICT provider in an intel‑sharing community increase sector resilience, and when might it increase systemic risk (e.g., by concentrating knowledge about vulnerabilities in one place)?
Design a Minimal Governance Framework for an Intel‑Sharing Group
Imagine you are the CISO of an EU payment institution joining a regional financial cyber‑threat sharing community.
Task: Draft, in bullet points, a minimal governance framework that would satisfy DORA‑style expectations. Focus on 4 headings:
- Purpose and Scope
- Membership and Access Control
- Confidentiality and Use of Shared Information
- Compliance (GDPR, DORA, Antitrust)
Write your answers, then compare with the model solution below.
---
Model solution (one reasonable version)
- Purpose and Scope
- Enhance members’ cyber defense capabilities and operational resilience.
- Scope limited to cybersecurity‑relevant information (threats, vulnerabilities, incidents, mitigations, playbooks).
- Explicit prohibition on using the community for commercial coordination (e.g., pricing, market strategies).
- Membership and Access Control
- Membership restricted to licensed financial entities in the region, selected ICT providers, and designated public authorities/CSIRTs.
- Each member designates named users (e.g., SOC analysts) with individual accounts and strong authentication.
- Regular review of membership; immediate revocation of access upon role change or termination.
- Confidentiality and Use of Shared Information
- Members sign a confidentiality agreement; information classified at least as confidential by default.
- No onward disclosure outside the group without originator’s consent, except where legally required.
- Shared information to be used only for cybersecurity and risk‑management purposes, not for competitive advantage against other members.
- Compliance (GDPR, DORA, Antitrust)
- Personal data minimized; where unavoidable, processed under a clear legal basis (e.g., legitimate interests in ensuring network and information security).
- Alignment with DORA: participation documented in ICT risk management framework and policies; integration with incident handling and testing.
- Antitrust safeguards: agenda and meeting minutes to demonstrate focus on security; prohibition on discussing prices, customer allocation, or strategic commercial plans.
Reflect: What would you add or change if your community included non‑EU entities or global cloud providers?
Data Protection and Confidentiality: Sharing Without Over‑Sharing
1. GDPR constraints
Threat information often contains (or can be linked to) personal data, for example:
- IP addresses, email addresses, usernames.
- Logs that include customer identifiers.
- Employee names in incident timelines.
Under GDPR, you must:
- Identify a legal basis for processing and sharing:
- Commonly: legitimate interests in ensuring network and information security (Recital 49 GDPR).
- Sometimes: legal obligation if sharing is required by law (e.g., specific sectoral rules).
- Apply data minimization:
- Share only what is necessary to enable defense.
- Replace personal identifiers with pseudonyms or aggregates where feasible.
- Ensure transparency and accountability:
- Document sharing in your records of processing activities (RoPA).
- Update privacy notices if sharing practices materially affect data subjects.
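Pseudonymization, one of the minimization techniques mentioned above, can be sketched with a keyed hash: the same identifier always maps to the same token, so analysts can still correlate events across reports, while recipients cannot reverse the mapping without the key. The function name and the pepper value below are illustrative assumptions:

```python
import hashlib
import hmac

# Secret "pepper" kept inside the sharing entity and never distributed
# with the data (illustrative value only).
PEPPER = b"local-secret-value"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (email, customer ID) with a keyed hash.

    Deterministic: the same input yields the same token, preserving
    correlation; without the key, recipients cannot recover the input.
    """
    digest = hmac.new(PEPPER, identifier.encode("utf-8"), hashlib.sha256)
    return "pseud-" + digest.hexdigest()[:16]

token = pseudonymize("john.doe@examplebank.eu")
```

Note that under GDPR, pseudonymized data generally remains personal data for the entity that holds the key; pseudonymization reduces risk but does not take the processing outside the Regulation.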
2. Confidentiality and trade secrets
Threat intelligence may reveal:
- Internal network architecture.
- Vendor dependencies and vulnerabilities.
- Incident response capabilities and gaps.
These can be trade secrets or sensitive security information. DORA expects:
- Internal classification of threat intel (e.g., internal, community‑only, public).
- Secure channels for distribution (end‑to‑end encryption, access control, logging).
- Need‑to‑know principles even within the community.
3. Balancing speed vs. accuracy
Operational tension:
- Fast sharing aids defense but risks:
- Sharing inaccurate or unverified data.
- Over‑disclosure of personal or confidential data.
- Slow, fully sanitized sharing is compliant but may be operationally useless.
Practical compromise often used by mature entities:
- Phase 1 (rapid): quickly share high‑level IOCs with minimal context.
- Phase 2 (refined): follow up with more detailed, sanitized reports once validated by the SOC/CSIRT and DPO where needed.
Thought question: How would you formalize this two‑phase process in your incident response playbook so that it is repeatable and auditable under DORA and GDPR?
Antitrust and Competition Law Risks in Information Sharing
1. Why antitrust matters here
Even if DORA encourages information sharing, EU competition law still applies. Sector‑wide communities of competitors can become a venue for collusion if not carefully scoped.
Key legal reference: Articles 101 and 102 TFEU (anti‑competitive agreements and abuse of dominance).
2. Risky behaviors to avoid
Within threat‑sharing communities, avoid:
- Discussing prices, fees, or margins (e.g., how much you spend on security services, pricing of cyber insurance to customers).
- Exchanging strategic business plans (e.g., market exit/entry decisions justified by cyber risk).
- Agreeing on collective boycotts of specific ICT providers or security vendors.
3. Safer practices
To stay within DORA’s spirit and competition law:
- Limit scope to security‑relevant information
- Threats, vulnerabilities, incidents, mitigations, best practices.
- Use structured agendas and minutes
- Meetings should have pre‑defined security topics and documented outcomes.
- Involve legal/antitrust counsel for governance design
- Especially for large cross‑border or cross‑sector communities.
- Anonymize and aggregate where possible
- Example: share that “several entities observed increased DDoS attempts via protocol X” without stating which bank or what exact volumes, if not necessary.
4. Edge case: Coordinated response to a systemic threat
Suppose multiple large banks agree simultaneously to:
- Block traffic from a specific major cloud provider region believed to be compromised.
Questions to analyze:
- Is this a legitimate, proportionate security measure or a concerted practice that could distort competition or unfairly harm that provider?
- How should the decision be documented to evidence security‑driven rationale rather than commercial motives?
In such edge cases, early involvement of competition law experts and transparent documentation are critical.
Thought Exercise: Sanitizing Threat Intel for Sharing
You have the following raw incident log entry:
```text
2025-11-02 13:45:12 CET
User: john.doe@examplebank.eu (Customer ID: 998234)
IP: 198.51.100.23
Device: iPhone 14, iOS 18.1
Action: Failed login attempt (2FA push rejected)
Referrer: https://secure-login-examplebank.com
Notes: User contacted support, reported receiving 6 unexpected 2FA prompts.
Analyst: Maria Rossi (maria.rossi@examplebank.eu)
Ticket ID: IR-2025-441
```
Task 1: Identify personal data
List all elements that qualify as personal data under GDPR.
Task 2: Propose a shared version
Rewrite the log entry so that it remains operationally useful for a financial ISAC, while:
- Minimizing personal data.
- Preserving key threat indicators.
---
Model solution (one possible answer)
Task 1 – Personal data present
- `john.doe@examplebank.eu` (email).
- `Customer ID: 998234` (unique identifier).
- `IP: 198.51.100.23` (can be personal data if linked to an individual).
- Potentially the combination of device type, timestamp, and behavior.
- `Maria Rossi` and `maria.rossi@examplebank.eu` (employee personal data).
Task 2 – Sanitized, shareable version
```text
2025-11-02 13:45:12 CET
Customer account (retail, EU) – multiple unsolicited 2FA push notifications observed.
Source IP: 198.51.100.23 (residential ISP in EU Member State A).
Device: iOS mobile device (recent OS version).
Action: Repeated failed login attempts, user rejected 2FA prompts and contacted support.
Indicators:
- Pattern consistent with "MFA fatigue" attack.
- Referrer domain similar to legitimate bank login page.
Ticket reference: IR-2025-441 (internal).
```
Why this is better
- Customer and employee identifiers removed.
- IP retained because it may be an IOC; you could still decide to generalize it (e.g., /24 subnet) depending on risk.
- Context preserved so other entities can detect similar MFA‑fatigue attacks.
Reflect: Would you keep the exact timestamp? Does it add value for others, or could a date‑only field be enough?
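A first automated pass at this kind of sanitization might look like the sketch below. The regexes are rough illustrations only; automated redaction should always be followed by human review before anything leaves the entity:

```python
import re

def sanitize_log(entry: str) -> str:
    """Redact common direct identifiers before sharing a log excerpt.

    Illustrative first pass: catches email addresses, customer IDs, and
    analyst names as they appear in this example's log format.
    """
    entry = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "<redacted-email>", entry)
    entry = re.sub(r"Customer ID:\s*\d+", "Customer ID: <redacted>", entry)
    entry = re.sub(r"Analyst:\s*.+", "Analyst: <internal>", entry)
    return entry

raw = "User: john.doe@examplebank.eu (Customer ID: 998234)"
print(sanitize_log(raw))
# -> User: <redacted-email> (Customer ID: <redacted>)
```

Pattern‑based redaction is brittle by design choice: it is cheap and fast for Phase 1 sharing, but it only knows the identifier formats you taught it, which is exactly why the Phase 2 human review step remains necessary.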
Integrating Shared Intelligence into Risk Management and SOC Operations
DORA does not want information sharing to be a dead‑end mailbox. It must feed into risk management, controls, and testing.
1. Governance integration
Your ICT risk management framework (required by DORA) should define:
- Roles: Who receives external intel (e.g., Threat Intel Lead, SOC Manager)?
- Triage process: How incoming intel is prioritized and validated.
- Decision rights: Who can authorize urgent mitigations (e.g., blocking IP ranges, patching out‑of‑hours)?
2. SOC / operational integration
Typical pipeline:
- Ingest: Automated feed from ISAC or authority in STIX/TAXII format.
- Enrich: Correlate with internal telemetry (SIEM, EDR, NDR).
- Act:
- Update detection rules (SIEM correlation rules, IDS signatures, EDR policies).
- Adjust firewall/IPS blocklists.
- Feedback loop:
- Report back to the community whether the shared indicators were observed or not.
- Provide false‑positive/false‑negative feedback.
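The enrich/act/feedback steps above reduce to set matching between community IOCs and internal sightings. The sketch below assumes indicators have already been ingested as plain strings; a real pipeline would pull them over TAXII and query a SIEM, but the triage logic is the same idea:

```python
def triage_indicators(external_iocs: set[str],
                      internal_sightings: set[str]) -> dict:
    """Correlate community IOCs with internal telemetry and build feedback.

    Illustrative only: field names and structure are assumptions, not a
    standard API.
    """
    observed = external_iocs & internal_sightings
    not_observed = external_iocs - internal_sightings
    return {
        # Push all shared IOCs to blocklists regardless of local sightings.
        "block": sorted(external_iocs),
        # Local matches need incident triage, not just blocking.
        "investigate": sorted(observed),
        # Report sightings back to the community (the feedback loop).
        "feedback": {
            "observed": sorted(observed),
            "not_observed": sorted(not_observed),
        },
    }

result = triage_indicators(
    {"198.51.100.23", "203.0.113.7"},   # from the ISAC feed
    {"198.51.100.23", "192.0.2.1"},     # seen in internal telemetry
)
```

Even this trivial version makes the feedback loop explicit: the `not_observed` list is what tells the community an indicator may be stale or targeted elsewhere.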
3. Risk management and testing
Shared intel should influence:
- Risk registers: Emerging threats may change likelihood and impact assessments.
- Control design: If many peers see attacks abusing a certain protocol, you may redesign network segmentation or authentication.
- Testing (Module 6):
- Use community intel to design realistic TLPT scenarios.
- Re‑test controls after applying community‑recommended mitigations.
4. Metrics and evidence for supervisors
To demonstrate DORA compliance, entities can track:
- Number of relevant intel items received vs. acted upon.
- Time from intel receipt to control update.
- Instances where shared intel prevented or reduced impact of incidents.
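These metrics are straightforward to compute from an intel‑handling log; the field names below are illustrative assumptions about what such a log might record:

```python
from datetime import datetime, timedelta

def intel_metrics(items: list[dict]) -> dict:
    """Compute simple supervisory metrics from an intel-handling log.

    Each item records when intel was received and, if acted upon, when
    the corresponding control was updated (field names are illustrative).
    """
    acted = [i for i in items if i.get("control_updated_at")]
    lags = [i["control_updated_at"] - i["received_at"] for i in acted]
    return {
        "received": len(items),
        "acted_upon": len(acted),
        "action_rate": len(acted) / len(items) if items else 0.0,
        "mean_time_to_control_update": (
            sum(lags, timedelta()) / len(lags) if lags else None
        ),
    }

log = [
    {"received_at": datetime(2025, 11, 2, 9, 0),
     "control_updated_at": datetime(2025, 11, 2, 13, 0)},
    {"received_at": datetime(2025, 11, 3, 9, 0)},  # triaged, no action needed
]
m = intel_metrics(log)
```

A low action rate is not automatically a problem (much intel is legitimately not relevant), which is why the "time to control update" figure for the items you do act on is usually the more persuasive number for supervisors.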
Thought question: How would you avoid alert fatigue from external intel feeds, while still showing supervisors that you take information sharing seriously?
Check Understanding: Legal and Practical Constraints
Answer the question below, then check the explanation.
Which of the following **best** describes a DORA‑compliant approach to cyber threat information sharing?
- A) Sharing detailed incident logs including customer identifiers with all members of a financial ISAC, because security interests override GDPR.
- B) Participating in a vetted financial sector information‑sharing community, sharing minimized and relevant threat indicators under a clear governance framework that addresses confidentiality, GDPR, and antitrust.
- C) Avoiding any sharing of threat information with competitors to eliminate antitrust risks, and relying exclusively on internal telemetry and vendor feeds.
Show Answer
Answer: B) Participating in a vetted financial sector information‑sharing community, sharing minimized and relevant threat indicators under a clear governance framework that addresses confidentiality, GDPR, and antitrust.
Option B aligns with DORA: it encourages participation in trusted communities with **structured governance**, **data minimization**, and **compliance with GDPR and competition law**. Option A conflicts with GDPR’s data minimization and lawfulness principles. Option C ignores DORA’s policy goal of sector‑wide resilience and is overly risk‑averse regarding antitrust.
Review Key Terms and Concepts
Flip the cards (mentally) to test your recall of core concepts from this module.
- Voluntary threat notification (under DORA)
- An optional communication from a financial entity to competent authorities or relevant bodies about **significant cyber threats or vulnerabilities**, even when they do not yet qualify as reportable major incidents. Intended to support **early warning and sector‑wide resilience**.
- Information‑sharing arrangement / trusted community
- A structured mechanism (e.g., ISAC, PPP) where vetted members exchange **cyber threat and vulnerability information** under **clear governance rules** on membership, confidentiality, data protection, and permitted uses.
- Data minimization in threat intel sharing
- The GDPR principle that only the **personal data strictly necessary** for the security purpose should be shared, often achieved by **anonymization, pseudonymization, or aggregation** of incident data.
- Antitrust risk in cyber information sharing
- The possibility that a security‑focused community of competitors may **inadvertently facilitate anti‑competitive behavior** (e.g., price‑fixing, boycotts). Managed by limiting discussions to security topics, using governance rules, and avoiding commercial coordination.
- Operational integration of shared intelligence
- The process of feeding external threat intel into **SOC workflows, detection rules, risk registers, and testing programs**, ensuring that shared information leads to **concrete control changes** and improved resilience.
Key Terms
- DORA
- Regulation (EU) 2022/2554 on digital operational resilience for the financial sector, applicable since January 2025, setting harmonized requirements for ICT risk management, incident reporting, testing, third‑party risk, and information sharing.
- CSIRT
- Computer Security Incident Response Team responsible for receiving, analyzing, and responding to cybersecurity incidents, sometimes operating at national or sectoral level.
- NIS2 Directive
- Directive (EU) 2022/2555 on measures for a high common level of cybersecurity across the Union, which updates the original NIS Directive and imposes cybersecurity and incident reporting obligations on a broad set of essential and important entities.
- Data minimization
- A GDPR principle requiring that personal data processed be **adequate, relevant, and limited** to what is necessary for the stated purpose.
- Competition law / antitrust
- The body of EU and national rules (notably Articles 101–102 TFEU) aimed at preventing anti‑competitive agreements and abuse of dominance, which continue to apply to information‑sharing communities even when focused on cybersecurity.
- IOC (Indicator of Compromise)
- Technical artifacts (e.g., IP addresses, domains, file hashes, registry keys) that indicate, with some confidence, that a system may have been compromised.
- Voluntary threat information sharing
- The non‑mandatory exchange of information about cyber threats, vulnerabilities, and TTPs between financial entities and/or authorities, encouraged by DORA to enhance sector‑wide resilience.
- TLPT (Threat‑Led Penetration Testing)
- Advanced, intelligence‑driven penetration testing required under DORA for certain significant entities, simulating realistic threat actor behavior to test critical functions and controls.
- TTPs (Tactics, Techniques, and Procedures)
- Patterns of behavior used by threat actors, describing *how* they plan, execute, and maintain attacks, often modeled using frameworks like MITRE ATT&CK.
- ISAC (Information Sharing and Analysis Center)
- A member‑driven organization, often sector‑specific, that enables sharing of cyber threat information and best practices among participants under defined governance rules.