
Programme Effectiveness Dashboard: Martyn's Law

Twelve Weeks Post-Rollout (Mock Dashboard)


Brief

Build the dashboard a Head of Compliance receives twelve weeks after the Martyn's Law SCORM package goes live across a 12-venue estate. Mock data, real architecture. The piece must (a) read in 60 seconds without methodology, with the iteration narrative obvious from the chart annotations alone; (b) stand up to a "where would this number come from in a real deployment" question via a methodology footer for every metric; (c) re-skin trivially for other products (AI Act, GDPR, DORA) so it reads as architecture, not a one-off case study; (d) demonstrate the seniority signal a Lead-LXD hiring panel screens for: the candidate who thinks past the ship date.

Discovery & Analysis

The training-vendor market is full of "we measure ROI" claims and short on artefacts that show what the measurement actually looks like. Blend's USP is "we stay and measure", and the dashboard that backs that promise had to exist as a clickable proof, not a paragraph. The discovery question: build the post-rollout view a programme owner would receive twelve weeks after launching one of our shipped products, with mock data rich enough that the iteration story tells itself.

Design & Development

The data narrative drives the design. Every UI choice is in service of one 60-second story.

• **Time slider as the load-bearing control.** Weeks 1–12 across a horizontal track, with intervention weeks marked in amber. Drag, click, or arrow-key scrub; every panel re-renders against the selected week. The whole experience is one variable away from being a different product's dashboard.

• **Five KPIs at the top, three cards in the middle, one recommendations card at the bottom.** The information architecture mirrors the cognitive sequence a programme owner runs: how big is the problem, where is it, what changed, what do I do next.

• **Cohort drill-down with strengths / watchlist / next actions per cohort.** Translates raw scores into language a Head of Compliance can act on. Strengths in green, watchlist in rose, next actions in amber, deliberately the same action colour that marks intervention weeks on the slider.

• **Methodology disclosure at the bottom.** Plain English. Names the data source per metric and calls out self-report as a known weakness. Designed for a hiring panel asking "where would this number come from in a real deployment?"

• **Mock data inspectable.** data.js is hard-coded and readable, so anyone reviewing the piece can verify the iteration narrative against the data, not against marketing copy.

Stack: a single self-contained HTML file with embedded CSS and JS. SVG line chart, CSS-grid heatmap, CSS-flex bar charts. No chart library, no framework, no dependency surface. WCAG 2.0 AA, keyboard navigable, mobile-responsive at 375px. ~1,500 lines, ~22 KB gzipped.
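The slider-driven re-render described above reduces to plain data plus one selector function. A minimal sketch, assuming a weekly series keyed by week number; the names (`WEEKS`, `metricsForWeek`) and the figures are illustrative assumptions, not the shipped data.js. Only the week-5 patch event mirrors the case-study narrative.

```javascript
// Illustrative sketch of a data.js-style weekly series. Field names and
// numbers are assumptions for this example, not the shipped mock data.
const WEEKS = [
  { week: 4,  adoption: 0.71, module4Managers: 48, tabletopPass: 0.64, intervention: null },
  { week: 5,  adoption: 0.74, module4Managers: 51, tabletopPass: 0.66, intervention: 'Module 4 content patch' },
  { week: 12, adoption: 0.92, module4Managers: 76, tabletopPass: 0.87, intervention: null },
];

// Every panel re-renders against the selected week: take the latest
// entry at or before the scrubbed week (falling back to the first).
function metricsForWeek(week) {
  const match = WEEKS.filter((w) => w.week <= week).at(-1);
  return match ?? WEEKS[0];
}
```

A drag or arrow-key event on the slider would call `metricsForWeek(selectedWeek)` and hand the result to each panel's render function, which is what makes the whole dashboard one variable away from a different product.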

Evaluation

The shipped piece does four things most ID portfolios do not.

• **It shows the post-rollout view.** Adoption, comprehension by module and cohort, behaviour-change signals, a recommendations card, and cohort drill-down. Six metrics across the three cohorts the Martyn's Law course already serves (Door Staff Guided / Managers Base / Security Leads Expert track).

• **It tells one story across twelve weeks.** The weeks 1–4 baseline shows a weak Module 4 Manager-track score (48/100). Week 5 logs a content patch. Weeks 6–12 show the recovery curve to 76/100, paired with a tabletop pass-rate climb from 64% to 87% across the same window. Hover for 60 seconds and the diagnose-iterate-recover loop is visible without reading the methodology.

• **It has a methodology footer that holds up.** Each of the six metrics names where the data would come from in a real deployment: SCORM cmi.core.lesson_status fields for adoption, in-module decision scores for comprehension, ServiceNow / iAuditor near-miss extracts for behaviour signals, and a quarterly tabletop scoring rubric for the live-pressure proxy. Self-report is named as a known weakness and paired with the quarterly tabletop for triangulation.

• **It re-skins.** Swap data.js and the framing copy and the same architecture renders for AI Act adoption, GDPR refresh, DORA / NIS2 cyber-resilience, or onboarding ramp curves. The architecture is the artefact; the numbers are the variable.
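The no-chart-library claim is easy to make concrete: an SVG line chart reduces to a points string on a `<polyline>` element. A minimal sketch, assuming a 600x200 viewBox and a 0–100 score scale; the function name and the intermediate weekly figures are illustrative, and only the 48 and 76 endpoints come from the case study.

```javascript
// Map weekly 0-100 scores to an SVG <polyline> points string.
// viewBox dimensions are assumed (600x200); the shipped chart may differ.
function polylinePoints(scores, width = 600, height = 200) {
  const stepX = width / (scores.length - 1);
  return scores
    .map((score, i) => {
      const x = Math.round(i * stepX);
      // SVG's origin is top-left, so higher scores map to smaller y.
      const y = Math.round(height - (score / 100) * height);
      return `${x},${y}`;
    })
    .join(' ');
}

// Module 4 Manager-track recovery curve; intermediate weeks are
// illustrative interpolation between the published 48 and 76 endpoints.
const module4 = [48, 48, 47, 48, 52, 58, 63, 67, 70, 73, 75, 76];
const points = polylinePoints(module4);
// Rendered as: <polyline points="..." fill="none" stroke="currentColor"/>
```

Everything else in the chart (axis ticks, the amber week-5 marker, hover targets) is plain markup around this one computed attribute, which is why the dependency surface stays at zero.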

What this means for your organisation

Most instructional-design portfolios stop at the course. They show the learner experience and the visual style. They do not show what happens after the SCORM package is uploaded. This piece is the other side. It is the dashboard a Head of Compliance reads in the first twelve weeks of the rollout. Adoption, comprehension, behaviour-change signals, and the iteration log that records what was changed and why. Same world as the Martyn's Law training case study, different lens.
