Blend
Compliance Training · 28 March 2026

Interactive DORA Training vs. Slide Decks: What Auditors Accept

What ECB and EBA examiners look for in DORA training evidence, why slide-based certificates fall short, and how to build an audit-ready evidence portfolio.

By Tom Payani

Financial supervisors have been examining operational resilience programmes for decades. They know what a substantive programme looks like, and they know what a decorative one looks like. The distinction usually comes down to a simple question: can the people in this organisation actually do the things the programme says they can do?

DORA has been enforceable since 17 January 2025. The regulation's training requirements — spread across Articles 5, 13, and the broader ICT risk-management framework — are now subject to supervisory scrutiny. And the organisations discovering the gap between "we ran training" and "we can demonstrate training worked" are, predictably, the ones that reached for the familiar tools: a compliance slide deck, a multiple-choice quiz, and a completion certificate.

This article examines what financial supervisors actually look for when they examine DORA training evidence, why traditional formats produce insufficient evidence, and how to build a training evidence portfolio that holds up under examination.


What Supervisors Are Actually Looking For

The European Central Bank, the European Banking Authority, and national competent authorities across the EU have a shared supervisory culture around operational resilience. It is not new. The expectation that regulated entities must demonstrate — not merely assert — the effectiveness of their risk-management practices has been established through decades of prudential supervision.

DORA formalises this expectation for digital operational resilience specifically. But the supervisory mindset predates the regulation. When an examiner reviews your ICT risk-management programme under DORA, they are applying the same evaluative framework they have always applied: does this programme function, or does it merely exist on paper?

For training specifically, examiners are looking for evidence across three dimensions.

Coverage. Have the right people been trained? DORA Article 13(6) requires ICT security awareness programmes and digital operational resilience training for all staff, proportionate to their function. Article 5(4) requires the management body to maintain sufficient knowledge to understand and assess ICT risk. The first thing a supervisor checks is whether training reached everyone it was supposed to reach — and whether the content was calibrated to role. A generic module delivered identically to the board, the operations team, and the IT department does not demonstrate proportionate training.

Substance. Did the training address the right competencies? DORA's ICT risk-management framework spans risk identification, protection and prevention, detection, response and recovery, backup, and third-party risk management. Training that covers "cybersecurity awareness" in general terms without mapping to these specific operational domains will appear thin under examination. Supervisors expect to see a clear line between the regulation's requirements and the training content.

Effectiveness. Can you demonstrate that training produced the intended competence? This is where most organisations fall short. A completion certificate demonstrates that someone sat through a course. A quiz score demonstrates short-term recall of information. Neither demonstrates that a staff member can identify an ICT-related incident, follow the correct escalation procedure, or assess a third-party provider's resilience posture — which are the actual capabilities DORA is trying to build.

The supervisory expectation is not that every organisation must use a specific training format. It is that whatever format you choose, you must be able to demonstrate it worked. The burden of proof sits with the institution.


Why Completion Certificates Are Not Enough

To be clear: completion certificates are not worthless. They document that training occurred, that staff engaged with specific content, and that the organisation has a functioning training programme. This is necessary evidence. It is not sufficient evidence.

The gap between necessary and sufficient is where DORA's training requirements diverge from older, less demanding compliance regimes. In some regulatory contexts, documenting that training happened is genuinely all that is required. Annual anti-money laundering refreshers, for example, have historically been assessed primarily on completion rates. The question was "did everyone complete it?" — and the answer was a percentage.

DORA asks a different question. The regulation is built around operational resilience — the ability of a financial entity to withstand, respond to, and recover from ICT-related disruptions. Training under DORA is not a standalone compliance activity. It is a component of the ICT risk-management framework, and it is evaluated in that context.

When a supervisor examines your incident response capabilities, they will look at your incident response plan, your escalation procedures, your communication protocols, and your testing records. Training fits within that picture. The question is not "did your staff complete a course on incident response?" It is "when your staff face an ICT incident, can they execute the response procedure effectively?"

A completion certificate does not answer that question. It answers a different, easier question — one that supervisors stopped finding adequate some time ago.

The same logic applies to quiz-based assessments. Selecting the correct answer to "what is the DORA incident reporting timeline?" from four options demonstrates that the learner read the material recently enough to recall the answer. It does not demonstrate that, in the middle of an actual incident with incomplete information and competing priorities, they would correctly classify the event, initiate the reporting process within the required timeframe, and communicate accurately with the competent authority.

This distinction — between recalling information and applying it under realistic conditions — is the core of the evidence problem that slide-based training creates.


The Shift Toward Demonstrable Competence

The expectation that training must produce demonstrable competence is not unique to DORA. It reflects a broader shift across financial regulation and beyond.

The ECB's Guide to Internal Models, updated through successive iterations, has progressively tightened expectations around staff competence in risk functions — moving from documentation requirements toward evidence of applied capability. The EBA's Guidelines on Internal Governance (EBA/GL/2021/05) explicitly require institutions to ensure that management body members possess "adequate collective knowledge, skills and experience" and to assess this on an ongoing basis — not just at appointment.

Outside financial regulation, the same trajectory is visible. The US Department of Justice's Evaluation of Corporate Compliance Programs asks whether compliance training is "provided in a manner and with a frequency that's adequate for the needs of the target audience" and whether the company has "measured the effectiveness of training." The UK Financial Conduct Authority's approach to individual accountability under the Senior Managers and Certification Regime requires firms to assess competence, not just deliver training.

DORA sits within this trend. The regulation does not use the phrase "demonstrable competence." But its structure — embedding training within a risk-management framework that is subject to resilience testing and supervisory examination — creates an environment where competence must be demonstrated, not merely claimed.

For compliance teams, this means the training evidence question has shifted. The old question was: "Can we show that training was delivered?" The new question is: "Can we show that training changed what people do?"


How Scenario-Based Training Produces Better Audit Evidence

Scenario-based training — sometimes called simulation-based learning — places participants in realistic operational situations and requires them to make sequenced decisions with consequences. In a DORA context, this might mean working through an ICT incident from initial detection through classification, escalation, containment, recovery, and regulatory reporting. Each decision point is logged, scored, and mapped to specific competence dimensions.

The evidence this produces is qualitatively different from a completion certificate or a quiz score. Consider what each format generates when a supervisor requests training records.

Slide-based course with quiz: Completion timestamp. Quiz score (e.g., 80%). Certificate PDF. The supervisor knows the learner finished the course and answered most questions correctly at the time. They do not know whether the learner can apply any of this knowledge in practice.

Scenario-based training: Decision audit trail showing, for example, that the learner correctly identified an ICT incident as major under DORA's classification criteria, initiated the reporting process within the correct timeframe, escalated to the management body appropriately, and coordinated recovery actions in the right sequence. The trail includes where the learner made errors, what feedback they received, and whether they corrected their approach. Competence scores are mapped to specific DORA domains — incident management, third-party risk, business continuity, governance.

The second set of evidence directly addresses the "can they do it?" question. It does not prove that a staff member will perform flawlessly in a real incident — no training can guarantee that. But it demonstrates that they have practised the relevant decision-making process under realistic conditions and achieved a measurable level of competence. That is substantially closer to what supervisors are looking for.
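To make the contrast concrete, here is a minimal sketch of what a decision audit trail might look like as data. The schema is entirely illustrative — field names, domain labels, and the scoring rule are assumptions for this example, not taken from DORA or from any specific training platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from collections import defaultdict

# Illustrative schema only: field names and domain labels are hypothetical,
# not drawn from DORA's text or any particular training product.
@dataclass
class DecisionRecord:
    learner_id: str
    scenario: str
    domain: str          # e.g. "incident_management", "third_party_risk"
    decision: str        # the choice the learner made at this step
    correct: bool        # whether it matched the expected procedure
    feedback: str        # remediation shown when the learner erred
    timestamp: datetime

def competence_by_domain(records):
    """Aggregate a decision trail into per-domain competence scores (0 to 1)."""
    totals, correct = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r.domain] += 1
        if r.correct:
            correct[r.domain] += 1
    return {d: correct[d] / totals[d] for d in totals}

trail = [
    DecisionRecord("emp-042", "major-incident-01", "incident_management",
                   "classified incident as major", True, "",
                   datetime.now(timezone.utc)),
    DecisionRecord("emp-042", "major-incident-01", "incident_management",
                   "escalated to management body after 48h", False,
                   "Escalation to the management body was due earlier.",
                   datetime.now(timezone.utc)),
    DecisionRecord("emp-042", "major-incident-01", "third_party_risk",
                   "notified affected ICT provider", True, "",
                   datetime.now(timezone.utc)),
]

print(competence_by_domain(trail))
# {'incident_management': 0.5, 'third_party_risk': 1.0}
```

Even this toy version shows the difference in kind: the record preserves what was decided, where the error occurred, and what feedback followed — the material a supervisor can actually examine, rather than a single pass/fail score.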

This is consistent with how financial institutions already approach competence in other operational domains. Trading desks run simulations. Treasury functions conduct stress tests. Business continuity plans are rehearsed through tabletop exercises. The principle that competence must be practised and evidenced, not just taught and certified, is well established in financial services. DORA extends that principle to digital operational resilience training.

Our analysis of the same dynamic in the NIS2 context — where the evidence question is structurally identical — is covered in detail in NIS2 compliance training: slides vs. simulations.


Building an Evidence Portfolio That Holds Up

No single piece of training evidence is sufficient on its own. Supervisors assess training as part of the broader ICT risk-management framework, and they expect to see a coherent picture rather than isolated documents. Building an effective evidence portfolio means assembling multiple types of evidence that together demonstrate coverage, substance, and effectiveness.

Layer 1: Programme documentation. Your training programme should be documented as a formal component of your ICT risk-management framework. This includes the training policy, the curriculum mapped to DORA's requirements, the target audience for each module (demonstrating proportionality by role), the delivery schedule, and the review cycle. This layer proves that training is planned, structured, and aligned with regulatory requirements.

Layer 2: Completion and participation records. LMS records showing who completed what, when, and with what assessment results. This is the layer that slide-based courses produce well. It demonstrates organisational reach and procedural compliance. Export these records in a format that can be produced quickly — supervisors do not want to wait while someone navigates an LMS dashboard.

Layer 3: Competence evidence. This is the layer that distinguishes a substantive programme from a decorative one. Decision logs from scenario-based training, competence scores mapped to DORA domains, evidence of how staff performed under simulated operational conditions. This layer demonstrates that training produced applied capability, not just information exposure.

Layer 4: Reinforcement and currency. Evidence that training is not a one-off event. Records of refresher activities, updated scenarios reflecting emerging threats, management body briefings on evolving ICT risks, participation in resilience testing exercises. DORA Article 13(6) requires training to be part of an ongoing programme, and supervisors will look for evidence of continuity.

Layer 5: Integration with operations. Evidence that training connects to actual operational practice. This might include incident response records that reference training scenarios, risk committee minutes that discuss training outcomes, or post-incident reviews that identify training gaps. This layer demonstrates that training is embedded in the organisation's operational culture, not running as a separate compliance activity.

The organisations best prepared for supervisory examination will have evidence across all five layers. The ones most likely to face challenge will be those with strong Layer 2 evidence (lots of completion certificates) but little in Layers 3 through 5.
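A simple way to operationalise this is a gap check over the five layers. The sketch below uses the layer names from this article; the evidence inventory itself is a hypothetical example of a certificate-heavy programme.

```python
# Sketch of a portfolio gap check. The five layers follow this article;
# what counts as evidence for each layer is an illustrative assumption.
LAYERS = {
    1: "Programme documentation",
    2: "Completion and participation records",
    3: "Competence evidence",
    4: "Reinforcement and currency",
    5: "Integration with operations",
}

def evidence_gaps(inventory):
    """Return the layers with no evidence items, i.e. exposure under examination."""
    return {n: name for n, name in LAYERS.items() if not inventory.get(n)}

# A certificate-heavy programme: strong on Layer 2, thin everywhere after it.
inventory = {
    1: ["training policy", "curriculum mapped to DORA requirements"],
    2: ["LMS completion export, 2025-Q4"],
    3: [],   # no decision-level competence evidence
    4: [],   # no refresher or currency records
    5: [],   # no link to operational practice
}

print(evidence_gaps(inventory))
# {3: 'Competence evidence', 4: 'Reinforcement and currency',
#  5: 'Integration with operations'}
```

Run against a real inventory, the empty layers are exactly where a supervisory challenge is most likely to land.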


Practical Steps for Compliance Teams

If you are responsible for DORA training compliance and want to build an evidence base that satisfies supervisory expectations, here is a practical sequence.

Audit your current evidence. Look at what your existing training programme produces. If the answer is completion certificates and quiz scores, you have Layers 1 and 2 covered but are exposed on Layers 3 through 5. Knowing the gap is the starting point.

Map training content to DORA's framework. Review whether your current training addresses the specific operational domains in DORA's ICT risk-management requirements — not cybersecurity in general, but the specific competencies the regulation requires. If your training was designed for a different regulation or a generic audience, it may need restructuring.

Introduce scenario-based elements. You do not necessarily need to replace your entire training programme. Adding scenario-based modules that test applied decision-making in DORA-relevant situations — incident classification, escalation, third-party risk assessment, recovery prioritisation — can generate Layer 3 evidence while complementing your existing foundational training.

Differentiate by role. DORA requires training to be proportionate to function. Your management body needs governance-level scenarios. Your ICT team needs technical operational scenarios. Your broader staff need awareness-level training calibrated to their role in the resilience framework. A single course for everyone does not demonstrate proportionality.

Build the review cycle. Document how you will update training content, reassess competence, and incorporate lessons from incidents and resilience testing. Supervisors look for living programmes, not static ones.

For a detailed breakdown of what DORA Article 13 requires in terms of training scope and proportionality, a companion article covers the regulatory text in depth.

Our DORA training programme is built around this evidence portfolio model — combining foundational regulatory knowledge with scenario-based modules that produce the decision-level evidence financial supervisors expect. Each module maps to specific DORA ICT risk-management domains and generates exportable competence records suitable for supervisory examination.

If you are unsure where your current training programme stands relative to what DORA requires, our compliance training diagnostic provides a structured assessment of your evidence gaps and a clear picture of what needs to change.


The Audit Is Not the Point

It would be easy to read this article as purely tactical — build the right evidence portfolio, pass the audit, move on. That framing misses something important.

DORA exists because the financial sector's dependence on digital infrastructure creates systemic risk. A major ICT disruption at a significant financial entity does not just affect that entity — it can cascade through interconnected systems and affect markets, payments, and public confidence. The regulation's training requirements exist because people make the decisions that determine whether disruptions are contained or amplified.

Training that produces genuine operational competence — staff who can actually identify, classify, escalate, and respond to ICT incidents — reduces real risk. Training that produces certificates without competence does not.

The audit question and the operational question have the same answer. Build training that works, document it properly, and the evidence takes care of itself. The organisations that will struggle are those that optimise for the appearance of compliance without investing in the substance of it. Financial supervisors have been distinguishing between the two for a long time. They are good at it.

Tags: DORA, compliance training, audit evidence, scenario-based learning, financial services, ECB, EBA

