NIS2 Directive — Article 20
Meridian Energy Group — Monday, 6:00 AM
An interactive scenario about ransomware, personal liability, and what happens when the 24-hour reporting clock starts ticking.
340,000 records at risk. 24 hours to report. Personal liability on the table.
You are the CISO at Meridian Energy Group — a mid-size energy company operating critical infrastructure across Germany and the Netherlands. 2,000 employees. NIS2 essential entity.
It’s 6:00 AM on a Monday. Your phone just woke you up. The SOC has detected ransomware on three production servers.
This is a choose-your-own-adventure scenario. You’ll face three real decisions that a CISO encounters during a live ransomware incident — and your choices shape how the story unfolds.
The 4 Stakeholder Bars (top right)
Each bar starts at 50%. Your decisions shift them. There’s no perfect answer — only trade-offs.
24-Hour Deadline
NIS2 requires an early warning within 24 hours of becoming aware of a significant incident. A countdown timer runs throughout — your decisions affect how much time remains.
Legal References
Article references appear throughout — click them to read the relevant NIS2 article.
Article 23 — Three-Stage Reporting
Article 20 — Board Accountability
The management body (board of directors) must approve and oversee cybersecurity risk-management measures, and its members can be held personally liable for infringements. Individual board members can face sanctions. They must be informed — not shielded.
Enforcement
Essential entities (like energy companies) face fines up to €10M or 2% of global annual turnover, whichever is higher. The early warning doesn’t require certainty — it requires speed. File early, update later.
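As a rough illustration of the Article 34 cap (a sketch, not legal advice — the function name and sample turnover figures are invented for this example), the applicable maximum for an essential entity is the higher of €10 million and 2% of total worldwide annual turnover:

```python
def max_fine_eur(annual_turnover_eur: float) -> float:
    """Article 34 cap for essential entities: the higher of
    EUR 10 million or 2% of total worldwide annual turnover."""
    return float(max(10_000_000, 0.02 * annual_turnover_eur))

# A mid-size utility with EUR 300M turnover: 2% is EUR 6M,
# so the EUR 10M floor applies.
print(max_fine_eur(300_000_000))    # 10000000.0
# At EUR 1B turnover, 2% (EUR 20M) exceeds the floor.
print(max_fine_eur(1_000_000_000))  # 20000000.0
```

The point of the "whichever is higher" rule: for larger entities the 2% figure dominates, so the exposure scales with revenue rather than stopping at a fixed ceiling.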
Four alerts in two minutes. Three production servers encrypted. Lateral movement toward the OT network — that’s the gas distribution control system.
Your SOC analyst is on the line. Your phone buzzes with a Signal message from an unknown number.
You’re in the car. 22 minutes to the office. 340,000 households depend on your gas distribution network. The OT alert means they’re probing the boundary to the control systems.
Tomás Vidal, Head of Operations, calls. He’s already at the plant.

Tomás “Alex, billing is down. Completely. We can’t process payments, we can’t issue invoices, and the operations planning system is frozen. I’ve got 14 field engineers sitting in their vans with no job orders.”
Tomás “We’re losing €50,000 an hour. How fast can you get the systems back?”
You “Tomás, if I rush to restore, I might destroy the forensic evidence we need to understand how they got in.”
Tomás “Evidence? I’ve got 340,000 households who can’t see their bills, and 14 engineers who can’t do their jobs. I need systems, not evidence.”
Tomás “Stefan — Stefan Brandt, he’s been with us nine years — is sitting in his van in Groningen with no job orders. He’s called me twice. His crew does emergency repairs. If a gas line leaks today and we can’t dispatch, that’s not a billing problem. That’s a safety problem.”

Three servers encrypted. 340,000 customer records. OT boundary probed. €50K/hour losses. Your SOC team can preserve forensic evidence OR start restoring — doing both simultaneously risks contaminating the trail.

“Nobody touches those servers until forensics has clean images. Tomás, I know it hurts.”
Tomás “Six hours, Alex. You’re asking me to lose €300,000 so your team can take photos of a crime scene.”
You “I’m asking you to let me find the door they walked in through. Otherwise we’ll be having this conversation again next month.”
The forensics team finds the entry point within 3 hours: a compromised VPN credential from a third-party maintenance contractor. The first exploit ran at 02:17 AM — four hours before the SOC alert. Without the evidence, you’d never have known.
At 9:47 AM, customer services gets a call from Elke Jansen in Groningen. Her direct debit bounced because the billing system was down. She’s 74. The overdraft fee is €35. She wants to know who’s going to pay it.
Article 23 requires the final report to include “the root cause.” Without forensic images, your one-month report would have a blank where the root cause should be. The €300K cost of delayed restoration is a fraction of the €10M fine for incomplete reporting.

“Wipe and restore. Get operations running.”
Tomás “Now you’re talking.”
Billing back online by 9:30 AM. Elke Jansen in Groningen still had her direct debit bounce — the system was down during the processing window. But at least Stefan’s crew got their job orders.
Two weeks later, the regulator asks how the attacker gained access. You can’t answer. The evidence was destroyed during restoration.
Article 23 requires the final report to include “the root cause.” Without forensics, your report will have a blank where the root cause should be.
You split the team. Forensics gets partial images — enough for the malware variant, not the entry point. Restoration gets two servers back. The third is corrupted.
Tomás has 60% of operations. You have 40% of the evidence. Nobody is satisfied.
Splitting resources between forensics and restoration is operationally tempting but delivers the worst of both worlds. Article 23 requires your final report to include the root cause — partial evidence means partial answers. Meanwhile, partial restoration means ongoing service degradation. In practice, the regulator assesses whether you had a documented incident response plan that prioritised evidence preservation. An ad-hoc compromise suggests you didn’t.
The initial response is underway, but there are gaps in what you know. You have limited time before the board briefing.
Choose 3 of 6 lines of investigation. You won’t have time for the rest.

Helen Marsh is in London. She checks email at 8 AM. She has no idea her company is under attack. Last October, you recommended a security audit. Helen deferred it — “Q1 budget is tight, revisit in March.” It’s March.
Under Article 23, you must submit an early warning to the CSIRT within 24 hours. The clock started at 6:04 AM.
Under Article 20, Helen bears personal liability. She needs to know. The question is when.

Incomplete information. Unclear scope. But Helen has personal liability. If she finds out you waited — the trust is gone.

“Helen, Meridian is experiencing a ransomware attack. Three production servers encrypted. The attackers claim to have 340,000 customer records. We have a 24-hour notification deadline under NIS2.”
Helen Long silence. “How bad?”
You “I don’t know yet. That’s the honest answer.”
Helen “Thank you for calling immediately. What do you need from me?”
Helen activates the crisis committee. By 10 AM, Legal, Comms, and the CEO are briefed. You have air cover.
Article 20 makes the management body personally responsible for approving and overseeing cybersecurity risk-management measures. Board members can face personal sanctions for non-compliance. Immediate, honest notification gives the board the information it needs to fulfil this obligation. Waiting “until you know more” denies the board the opportunity to act — and creates personal liability for every hour of delay.
You email: “The board is informed that Meridian is managing a cyber incident affecting billing systems.”
Helen reads it at 8:15. Calls at 8:16.
Helen “‘Cyber incident’ covers everything from a phishing email to a shutdown. Which is this?”
You “It’s... ransomware. Three servers. They claim to have customer data.”
Helen “And you sent me ‘cyber incident’? Alex, I’m personally liable under NIS2. I needed the truth, not a press release.”
Euphemistic briefings create worse liability than bad news delivered honestly. Under Article 20, the management body must “approve and oversee” cybersecurity measures — which requires accurate information. “Cyber incident affecting billing systems” describes a password reset, not a ransomware attack with data exfiltration. If the board later claims it wasn’t adequately informed, your email becomes evidence that you withheld the severity.

Helen “You’ve known since 6 AM. It is now 5 PM. I am personally liable under Article 20. And you waited eleven hours to tell me.”
Helen “I deferred the security audit you recommended in October. If the regulator asks why, and I say I didn’t know we were under attack until 5 PM — what does that look like?”
Every hour between your awareness and the board’s is an hour where the management body couldn’t fulfil its Article 20 obligation to oversee cybersecurity measures. Helen deferred a security audit in October — a decision she’s personally accountable for. Without knowing the company was under active attack, she had no opportunity to activate the crisis plan, engage counsel, or escalate the deferred audit. The 11-hour delay doesn’t just damage trust — it creates a documented gap in the governance timeline that regulators will examine.
SHADOWVAULT’s instructions arrive:
You verify the sample. The records are real. Elke Jansen is on page 3. Name, address, bank details, gas consumption history.
This decision has a time limit.
€2 million. 340,000 records. 500 verified as real.
Auto-selects “negotiate” if timer expires — indecision is a decision

“We don’t pay. We report.”
Dr. Petra Lindström — NCSC-NL “Mr. Reeves, thank you for the early warning. You’re within the 24-hour window. When can you provide the full notification?”
You “Within 72 hours.”
Petra “Good. And Mr. Reeves — you made the right call. Paying doesn’t guarantee deletion.”
Article 23 requires three reports: (1) an early warning within 24 hours of becoming aware, (2) an incident notification within 72 hours with an initial assessment, and (3) a final report within one month including root cause analysis and remediation measures. Reporting immediately and refusing to pay is both compliant and operationally sound — paying a ransom does not guarantee data deletion, and it funds further attacks. The NCSC-NL treats prompt self-reporting as a mitigating factor in any supervisory action.
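The three Article 23 deadlines can all be computed from the moment of awareness. A minimal sketch — the date is illustrative (a Monday matching the scenario’s 6:04 AM clock), and the directive’s “one month” is approximated here as 30 days:

```python
from datetime import datetime, timedelta

def nis2_deadlines(aware_at: datetime) -> dict:
    """Article 23 reporting deadlines, counted from the moment
    the entity becomes aware of the significant incident."""
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
        # The directive says "one month"; 30 days is an approximation.
        "final_report": aware_at + timedelta(days=30),
    }

# The scenario's clock: SOC alert at 06:04 on a Monday morning.
aware = datetime(2025, 3, 3, 6, 4)
for name, due in nis2_deadlines(aware).items():
    print(f"{name}: due {due:%a %d %b %H:%M}")
```

Note that all three clocks start at awareness, not at containment or restoration — time spent negotiating, restoring, or “waiting until you know more” is time consumed from the same window.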
You engage SHADOWVAULT. You learn they got in via a compromised contractor VPN and have been inside for 11 days.
Useful intelligence. But you’ve spent 2 hours communicating with a criminal organisation without reporting to the CSIRT.
When you file: “When did you first become aware?” — “6:04 AM.” — “And you’re reporting at 2 PM. What were you doing for eight hours?”
Gathering intelligence from the attacker can be valuable, but not at the cost of regulatory compliance. Under Article 23, the early warning must be filed within 24 hours of becoming aware of the significant incident. Spending 2 hours communicating with the attacker before reporting consumes time from a window that’s already tight. The regulator’s question — “What were you doing for eight hours?” — highlights the gap between awareness (6:04 AM) and reporting (2 PM). Any intelligence gained doesn’t offset the optics of delayed notification.
You pay. €2 million. SHADOWVAULT sends a decryption key.
Three weeks later, the database appears on a dark web leak site. They sold it to a second group. The payment bought you nothing.
Dr. Lindström “You funded a criminal organisation. You did not prevent the data from being published. And you delayed your regulatory notification to arrange the payment.”
Paying a ransom creates three separate compliance failures: (1) the payment funded a criminal organisation, which may breach sanctions or anti-money-laundering laws, (2) the delay caused by arranging payment consumed hours from your Article 23 notification window, and (3) the data was published anyway — payment provided no security benefit. NIS2 Recital 101 emphasises that entities should “not be encouraged to pay ransoms” and should instead focus on detection, response, and notification. The €10M maximum fine under Article 34 makes the €2M ransom look small — and you now face both.
The five steps below are shuffled. Click them in the correct chronological order — Step 1 first, Step 5 last.
Click a card to assign it the next step number. Click Reset to start over.
Submit Full Incident Notification
Initial assessment of severity, impact, and indicators of compromise
Submit Early Warning to CSIRT
Initial notification — no root cause required, just flag the incident
Submit Final Report
Root cause analysis, remediation measures taken, and lessons learned
Provide Progress Updates
Ongoing status as requested by the competent authority
Detect and Assess the Incident
SOC alert received — classify severity and determine if NIS2 thresholds are met
Article 20 — Management body personal liability
Article 21 — Cybersecurity risk management
Article 23 — 24h early warning, 72h notification, 1-month final report
Article 34 — Fines up to €10M or 2% of global turnover, whichever is higher
You scored [score]. Every hour mattered. Try a different path?