NIS2 Directive — Article 20
Meridian Energy Group — Monday, 6:00 AM
You are the CISO. The ransomware hit three hours ago. The 24-hour reporting clock is already running. Every decision you make in the next hour will be scrutinised by a regulator, a board chair who faces personal liability, and a criminal group who know exactly what your data is worth.
340,000 records. 24 hours. Your name on the notification.
CISO at Meridian Energy Group — a mid-size energy company running critical infrastructure across Germany and the Netherlands. 2,000 employees. 340,000 residential gas customers. NIS2 essential entity. Your name is on every cybersecurity policy the board has approved.
Monday, 6:00 AM. Your phone pulled you out of sleep. Three words from the SOC analyst: “It’s ransomware.” Three production servers. You’re still in your kitchen. The office is 22 minutes away.
This is a choose-your-own-adventure scenario. You face three decisions that real CISOs face in the first hours of a live ransomware incident. There is no safe option. Every choice closes another door. Your decisions shape what happens to your company, your board chair, a 74-year-old customer in Groningen, and you.
The 4 Stakeholder Bars (top right)
Each bar starts at 50%. Your decisions move them. There is no outcome where everyone is happy. That’s the point.
24-Hour Deadline
NIS2 requires an early warning to the CSIRT within 24 hours of becoming aware of an incident. The clock started at 6:04 AM. It does not stop while you’re thinking.
Legal References
Article references appear in the text — click them to read the exact NIS2 wording. You’ll need to understand these to know whether your choices are defensible.
Article 23 — Three-Stage Reporting
Article 20 — Board Accountability
The management body is personally liable for approving and overseeing cybersecurity risk-management measures. Individual board members can face sanctions. They must be informed immediately — not protected from bad news until you have a clean story.
Enforcement
Essential entities face fines up to €10M or 2% of annual global turnover, whichever is higher. The early warning does not require certainty. It requires speed. File what you know. Update as you learn more. The 24-hour clock does not care about your investigation timeline.
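The "€10M or 2% of turnover" cap is simply the larger of the two figures. A minimal sketch (the function name and the turnover figures are illustrative, not taken from the scenario):

```python
def nis2_max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Maximum fine for an essential entity under NIS2 Article 34:
    EUR 10M or 2% of annual global turnover, whichever is higher."""
    return max(10_000_000.0, 0.02 * annual_global_turnover_eur)

# Mid-size entity, EUR 300M turnover: 2% is EUR 6M, so the EUR 10M floor applies.
print(nis2_max_fine_eur(300_000_000))    # 10000000.0
# Large entity, EUR 2B turnover: 2% is EUR 40M, which exceeds the floor.
print(nis2_max_fine_eur(2_000_000_000))  # 40000000.0
```

The practical point: for a company of Meridian's size, the €10M floor is the binding number.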
Four alerts in under two minutes. Three production servers encrypted. The lateral movement probe is heading toward OT-GATEWAY-01 — the boundary to the gas distribution control system. If that falls, this stops being a billing problem.
Your SOC analyst is talking fast. Then your phone buzzes. Signal message. Unknown number.
You’re still in the car. 19 minutes to the office. You read it twice. They’re not bluffing about the OT probe — that detail isn’t in any public filing. They have been inside the network long enough to map it.
Your phone rings. Tomás Vidal, Head of Operations. He’s at the plant. He already sounds like he hasn’t slept.

“Billing is dead. Totally down. No payments, no invoices. The operations planning system is locked. I have 14 field engineers sitting in their vans right now with no job orders.”
“€50,000 an hour. That’s what this is costing. So tell me — when can you get the systems back?”
You “If I wipe those servers before forensics gets in there, we lose the evidence. We’d never know how they got in.”
Tomás “I don’t care how they got in. I care about my engineers. I care about 340,000 customers who can’t see their accounts. Give me systems.”
Tomás “Stefan Brandt — nine years with us — he’s parked in Groningen with a full crew and no job orders. He called me at 6:15. His team handles gas line emergency repairs. If there’s a leak today and we can’t dispatch him — that’s not an IT problem. That’s a safety incident.”

Three servers encrypted. 340,000 records on a system you no longer fully control. OT boundary being probed. €50,000 bleeding out every hour. Your SOC team has one set of hands. They can preserve forensic evidence or they can start the restore. Running both at once risks contaminating the evidence trail. You have to choose.

“Nobody touches those servers until forensics has clean images. I know what that costs, Tomás.”
Tomás “Six hours. You’re asking me to lose €300,000 so your team can photograph a crime scene.”
You “I’m asking you to help me find the door they used. Because if we don’t, they’ll use it again.”
Forensics finds the entry point in under three hours. A compromised VPN credential from a third-party maintenance contractor — a firm called Grenzmann IT Services. Their access token was valid. The first exploit ran at 2:17 AM, four hours before the SOC caught it. Without the server image, that trail is gone.
At 9:47 AM, customer services takes a call from Elke Jansen in Groningen. Her direct debit bounced. The billing system was still down during the processing window. She is 74 years old. The overdraft fee is €35. She wants to know who is paying it.
Article 23 requires the final report to include the root cause. Without forensic images, you cannot answer that question. The regulator will ask. The €300,000 cost of delayed restoration is real — but it is a fraction of the €10 million maximum fine for an incomplete report, and nothing compared to leaving the entry point open for the next attack.

“Wipe and restore. Get operations back.”
Tomás “Finally.”
Billing comes back online at 9:30 AM. Stefan’s crew get their job orders. Elke Jansen’s direct debit still bounces — the system was down when the bank processed her payment. She gets an overdraft fee. Nobody calls her to explain.
Two weeks later, Dr. Petra Lindström at the NCSC-NL asks a single question: “How did the attacker gain initial access to your network?” You have no answer. The evidence was on those servers. You wiped them at 7:12 AM.
Article 23 requires the final report to include the root cause. Without the forensic trail, that section of your report is blank. The regulator cannot close the investigation without it. And without knowing the entry point, there is nothing stopping the attackers from coming back through the same door.
You split the team. Forensics gets partial images — enough to identify the malware family, not the entry point. Two servers come back online. The third is too corrupted to restore cleanly. It gets quarantined.
Tomás has 60% of operations. You have 40% of the evidence. Both teams are stretched. Stefan Brandt’s crew get some job orders back at 10:15 AM, but not the emergency repair queue. Nobody is satisfied, and the attacker may still have a foothold somewhere you haven’t looked.
Splitting resources feels like a balanced call. It isn’t. Article 23 requires the final report to include the root cause — and partial forensic images cannot answer that question. Meanwhile, partial restoration means ongoing operational damage and continued regulatory exposure. The regulator will also assess whether you had a documented incident response plan that defined priorities. An ad-hoc split under pressure suggests you didn’t have one, or didn’t follow it.
The response is running. But you still don’t know how they got in, whether the data left the network, or what you’re legally required to tell the board. The briefing is in two hours.
Pick 3 of 6 lines of investigation. The other three stay dark.

Helen Marsh is in London. Right now she is reading email and drinking coffee and has no idea her company is under ransomware attack. Last October, you submitted a recommendation for a full security audit. Helen deferred it — “Q1 budget is tight, let’s revisit in March.” It is March. The audit never happened.
Under Article 23, you must submit an early warning to the CSIRT within 24 hours of becoming aware of the incident. The clock has been running since 6:04 AM.
Under Article 20, Helen carries personal liability for cybersecurity governance. She needs to know. The question is not whether — it is when, and how much you tell her.

You don’t have the full picture yet. You don’t know how bad the data exfiltration is. You don’t know the entry point. What you know is this: Helen is personally liable under Article 20, and every hour you don’t tell her is an hour she spends in violation of her own governance obligations — without knowing it.

“Helen. It’s ransomware. Three production servers encrypted. The attackers claim to have 340,000 customer records. We have a 24-hour early warning deadline under NIS2. I am calling you now because you need to know now.”
Helen Silence. Four seconds of it. “What do we know? Not what do we think — what do we know?”
You “Three servers confirmed encrypted. Data exfiltration is probable but not yet confirmed. OT boundary probed but not breached. Forensics is working the entry point. I’ll have more in three hours.”
Helen “Call me in three hours. What do you need from me right now?”
Helen activates the crisis protocol. By 10 AM, Legal, Comms, and the CEO are in the loop. You have support from the top. Helen is not happy — but she is informed, and under Article 20, that is what matters.
Article 20 requires the management body to personally approve and oversee cybersecurity risk-management measures. Board members can face individual sanctions for non-compliance. Calling Helen immediately — even with incomplete information — gives her the ability to act. Every hour of delay is an hour where she carries personal liability without the information to discharge it.
You send the email at 7:58 AM: “The board is informed that Meridian is managing a cyber incident affecting billing systems. The situation is under active investigation. A further update will follow.”
Helen reads it at 8:15 AM. She calls at 8:16.
Helen “‘Cyber incident’ could mean a phishing email. It could mean a full system failure. I carry personal liability under this law and you sent me two sentences. What is actually happening?”
You “It’s ransomware. Three production servers. They claim to have 340,000 customer records.”
Helen “Ransomware. 340,000 records. And you wrote ‘billing systems.’ Alex, I need facts, not communications management. If the regulator sees that email, it looks like you were hiding the severity from me.”
Softening the language feels like crisis management. Under Article 20, the management body must be able to “approve and oversee” cybersecurity risk-management — which requires accurate information. “Cyber incident affecting billing systems” describes a password lockout, not a ransomware attack with 340,000 records at risk. If the regulator later examines your board communications, that email is evidence that you controlled what the board knew. That is not oversight. That is the opposite of it.

“The attack was detected at 6:04 AM. It is now 5 PM. You have known about this for eleven hours. I am personally liable under Article 20 and you did not call me. You did not email me. You sent nothing.”
“I deferred the security audit you submitted in October. Budget reasons. If the regulator opens an investigation — and they will — and asks why I deferred it, and I have to say I didn’t even know we were under active attack until 5 PM on the same day — what does that look like for me?”
Every hour between your awareness and Helen’s is an hour where the management body is carrying personal liability under Article 20 without the information to act on it. Helen deferred a security audit in October. She is personally accountable for that decision. Without knowing the company was under active attack, she could not activate the crisis plan, engage legal counsel, or escalate the deferred audit. The 11-hour delay does not just damage your relationship — it creates a documented governance gap that the regulator will examine, and which Helen cannot explain.
SHADOWVAULT’s full instructions arrive at 12:09 PM:
You open the attachment. You verify three records at random against the live database. All three match. Elke Jansen is on page 3. Full name, home address, IBAN, monthly gas consumption since 2019.
The next decision has a time limit. Indecision is a decision.
€2 million. 340,000 records. 500 verified as real. 60 hours before they publish. The CSIRT early warning deadline is still running.
Auto-selects “negotiate” if timer expires — indecision is a decision

“We don’t pay. File the early warning. Get law enforcement on the line.”
Dr. Petra Lindström — NCSC-NL “Mr. Reeves. Thank you for the early warning. You are within the 24-hour window. When can you provide the full incident notification?”
You “Within 72 hours. I can confirm root cause, attack vector, and remediation timeline.”
Petra “Good. And Mr. Reeves — not paying was the right call. We have seen SHADOWVAULT twice before. They sell the data regardless.”
Article 23 sets three hard deadlines: an early warning within 24 hours, a full incident notification within 72 hours including an initial severity assessment, and a final report within one month with root cause analysis and remediation measures. Refusing to pay and filing immediately is sound both legally and operationally. Paying a ransom does not guarantee deletion — it funds the next attack on someone else. The NCSC-NL treats prompt self-reporting as a significant mitigating factor in any subsequent supervisory action.
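The three deadlines are fixed offsets from the moment of awareness, which is why the 6:04 AM timestamp matters so much. A hedged sketch (the function name is illustrative, and "one month" is approximated as 30 days here):

```python
from datetime import datetime, timedelta

def article23_deadlines(aware_at: datetime) -> dict:
    """Article 23 reporting deadlines, counted from the moment the
    entity becomes aware of the significant incident."""
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
        # "Within one month" approximated as 30 calendar days for illustration.
        "final_report": aware_at + timedelta(days=30),
    }

# Awareness at Monday 6:04 AM, as in the scenario (date is illustrative).
d = article23_deadlines(datetime(2025, 3, 10, 6, 4))
print(d["early_warning"])          # 2025-03-11 06:04:00
print(d["incident_notification"])  # 2025-03-13 06:04:00
```

Note that the clock runs from awareness, not from containment or from the end of the investigation — filing at 2 PM after an 8-hour delay is still inside the 24-hour window, but the gap itself is visible to the regulator.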
You open a channel. Over two hours, SHADOWVAULT confirms they accessed the network via a compromised contractor VPN credential and have been inside for 11 days. They know which backups are clean and which are not.
Genuinely useful intelligence. It shapes your remediation plan. But you spent two hours talking to a criminal organisation instead of filing with the CSIRT.
When you file at 2 PM, Dr. Lindström asks one question: “You became aware at 6:04 AM. It is now 2 PM. What were you doing for eight hours?” The intelligence you gained does not answer that.
Engaging the attacker can yield useful information. But under Article 23, the early warning must be filed within 24 hours of becoming aware of the significant incident — not after you have resolved your questions about scope. The window is already tight. Spending two hours in dialogue with SHADOWVAULT before reporting consumes time you cannot recover. The regulator will note the gap between 6:04 AM and 2 PM. Any intelligence gained does not offset the documented delay in notification.
You pay. €2 million transferred. SHADOWVAULT sends the decryption key at 4:47 PM. The servers come back.
Three weeks later, the full database appears on a dark web leak forum. SHADOWVAULT sold it to a second group before taking your money. Elke Jansen’s name, address, and bank details are publicly searchable. The payment bought you nothing except the illusion of resolution.
Dr. Lindström “You transferred funds to a criminal organisation. The data was published regardless. Your notification to us was delayed by six hours while you arranged the payment. Three separate problems. We are opening a formal investigation.”
Paying a ransom creates three distinct compliance failures. First, the transfer funds a criminal organisation and may breach sanctions or anti-money-laundering law. Second, the time spent arranging payment consumes hours from your Article 23 notification window. Third, the data is published anyway — payment provided no security benefit. NIS2 Recital 101 is explicit: entities should not be encouraged to pay ransoms. The €10 million maximum fine under Article 34 is larger than the ransom you paid. You may now face both.
The five stages of NIS2 incident notification are listed below — in the wrong order. Put them in the correct sequence. Step 1 first. Step 5 last.
Click a card to assign it the next step number. Click Reset to start again.
Submit Full Incident Notification
Initial assessment of severity, impact, and indicators of compromise
Submit Early Warning to CSIRT
Initial notification — no root cause required, just flag the incident
Submit Final Report
Root cause analysis, remediation measures taken, and lessons learned
Provide Progress Updates
Ongoing status as requested by the competent authority
Detect and Assess the Incident
SOC alert received — classify severity and determine if NIS2 thresholds are met
Article 20 — Management body personal liability
Article 21 — Cybersecurity risk management
Article 23 — 24h early warning, 72h notification
Article 34 — Fines up to €10M or 2% turnover
You scored . Three decisions. Every one had a cost. The paths you didn’t take are still there.