Kirkpatrick's Four Levels of Training Evaluation

Most L&D teams only measure Level 1 (satisfaction surveys). The real value — and the evidence leadership cares about — lives at Levels 3 and 4. This framework shows you how to measure all four.

Level 1: Reaction

Did participants find the training valuable and engaging? This is the post-training survey most teams already do.

Example: "89% of participants rated the programme 4 or 5 out of 5 for relevance to their role."

Level 2: Learning

Did participants actually acquire the intended knowledge, skills, or attitudes? Measured through assessments, demonstrations, or simulations.

Example: "Average assessment score improved from 52% (pre) to 84% (post), a 32-point lift."

Level 3: Behaviour

Are participants applying what they learned on the job? This is where most L&D measurement breaks down — and where the real value starts.

Example: "Manager observations show 72% of participants consistently using the new coaching framework at 60 days post-training."

Level 4: Results

Did the training drive measurable business outcomes? Revenue, productivity, retention, quality, compliance incidents — the metrics leadership cares about.

Example: "Customer satisfaction scores increased 18% and first-call resolution improved 23% within 90 days of the service training rollout."

The measurement gap

According to ATD research, 90% of organisations measure Level 1, but only 37% measure Level 3 and just 18% measure Level 4. Closing this gap is the single most impactful thing you can do for your L&D function's credibility.

How to measure each level

| Level | Timing | Method | Who collects |
|---|---|---|---|
| 1. Reaction | Immediately after | Survey (5-8 questions max), net promoter score | L&D team |
| 2. Learning | End of training + 2 weeks | Pre/post assessment, skills demonstration, scenario simulation | L&D team / facilitator |
| 3. Behaviour | 30, 60, 90 days after | Manager observation checklist, 360 feedback, on-the-job assessment, work output review | Managers + L&D |
| 4. Results | 90 days – 12 months | Business metrics comparison (before vs after), control group comparison, correlation analysis | L&D + Business leaders |

Training ROI Calculator

Enter your training costs and expected benefits to calculate return on investment, payback period, and generate an executive summary.

The calculator takes eight inputs:

- Total training cost: include design, delivery, materials, facilitator time, and technology
- Participants: total number of employees trained
- Average salary: used to calculate productivity gains in monetary terms
- Expected productivity improvement: conservative 3-5%, moderate 5-10%, optimistic 10-20%
- Retention improvement: how many fewer people will leave because of improved development
- Replacement cost per leaver: typically 50-200% of annual salary (recruitment, onboarding, lost productivity)
- Expected quality improvement: reduction in quality issues, compliance incidents, or rework
- Annual cost of quality issues: estimate the total cost of quality issues, compliance fines, or rework
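Under one plausible benefit model (productivity gains plus retention savings plus quality savings; the calculator's actual formula is not shown on this page), the arithmetic behind those outputs can be sketched in a few lines of Python. All figures in the example call are hypothetical:

```python
def training_roi(total_cost, participants, avg_salary,
                 productivity_gain, leavers_avoided, replacement_cost,
                 quality_cost, quality_reduction):
    """Sketch of a training ROI calculation.

    Assumed benefit model (an illustration, not the tool's confirmed formula):
    annual benefit = productivity gains + retention savings + quality savings.
    Rates (productivity_gain, quality_reduction) are fractions, e.g. 0.05 = 5%.
    """
    productivity_benefit = participants * avg_salary * productivity_gain
    retention_benefit = leavers_avoided * replacement_cost
    quality_benefit = quality_cost * quality_reduction
    annual_benefit = productivity_benefit + retention_benefit + quality_benefit

    net_benefit = annual_benefit - total_cost
    return {
        "roi_pct": round(net_benefit / total_cost * 100, 1),
        "payback_months": round(total_cost / (annual_benefit / 12), 1),
        "total_annual_benefit": round(annual_benefit),
        "cost_per_participant": round(total_cost / participants),
        "net_benefit_year1": round(net_benefit),
        "benefit_cost_ratio": round(annual_benefit / total_cost, 2),
    }

# Hypothetical programme: 100 people at £45k average salary, 5% productivity
# gain, 3 departures avoided at £30k each, £50k of quality costs cut by 30%.
print(training_roi(
    total_cost=60_000, participants=100, avg_salary=45_000,
    productivity_gain=0.05, leavers_avoided=3, replacement_cost=30_000,
    quality_cost=50_000, quality_reduction=0.30,
))
```

Note that the productivity line dominates in most realistic scenarios, which is why the conservative/moderate/optimistic guidance on that one input matters so much.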

ROI Analysis Results

The calculator reports six figures: return on investment, payback period, total annual benefit, cost per participant, net benefit (year 1), and benefit-cost ratio, plus a copy-paste-ready executive summary.

Pre/Post KPIs by Training Type

Not all training should be measured the same way. Here are the recommended KPIs for each common training type, with suggested pre and post measurement points.

Leadership Development

| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| 360 leadership score | Avg 3.2/5 | Avg 3.8/5 at 90 days | 360 feedback survey |
| Direct report engagement | Team eNPS baseline | +10 points at 6 months | Pulse survey |
| Coaching conversation frequency | 1x/month avg | 2x/month at 60 days | Manager self-report + HR system |
| Team turnover rate | Annual baseline | 15% reduction at 12 months | HR data |

Technical / Digital Skills

| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Skills assessment score | Avg score on assessment | +30% improvement | Pre/post test |
| Tool adoption rate | % using new tools | 80%+ active use at 30 days | System usage data |
| Task completion time | Avg time per task | 20% reduction at 60 days | Process tracking |
| Error rate | Errors per 100 tasks | 40% reduction at 90 days | Quality data |

Sales Training

| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Win rate | Current % | +5-10 percentage points | CRM data |
| Average deal size | Current avg | +10-15% increase | CRM data |
| Sales cycle length | Current avg days | 10-20% reduction | CRM data |
| Pipeline generation | Monthly pipeline value | +15% increase | CRM data |

Compliance Training

| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Incident rate | Incidents per quarter | 30-50% reduction | Incident reports |
| Near-miss reporting | Reports per month | +50% increase (good — means awareness is up) | Reporting system |
| Scenario decision quality | Pre-assessment score | 85%+ correct decisions | Scenario-based assessment |
| Audit findings | Findings per audit | 50% reduction | Audit reports |

Onboarding

| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Time to productivity | Current avg (weeks) | 25-40% reduction | Manager assessment + output data |
| 90-day retention | Current % | 90%+ retention | HR data |
| New hire satisfaction | Survey baseline | 4.2+/5 at 30 days | Onboarding survey |
| Manager satisfaction | Survey baseline | 4.0+/5 at 60 days | Manager survey |

Business Case Builder

Fill in the fields below to auto-generate an executive summary you can include in budget proposals, board presentations, or stakeholder communications.
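A sketch of what the builder assembles: a template that turns a handful of fields into a summary paragraph. The field names below are assumptions for illustration, not the builder's actual inputs, and the figures in the example call are hypothetical:

```python
def executive_summary(programme, participants, total_cost,
                      behaviour_pct, metric_name, metric_change, roi_pct):
    """Assemble a copy-paste-ready executive summary paragraph.

    Field names are illustrative assumptions; a real builder would map
    its own form fields onto a template like this.
    """
    return (
        f"The {programme} programme trained {participants} employees at a "
        f"total cost of £{total_cost:,}. At 60 days, {behaviour_pct}% of "
        f"participants are applying the new skills on the job (Level 3), "
        f"and {metric_name} improved by {metric_change}% (Level 4), "
        f"an estimated ROI of {roi_pct}%."
    )

# Hypothetical figures for a customer service programme.
print(executive_summary("customer service", 120, 48_000,
                        72, "first-call resolution", 23, 180))
```

Leading with the Level 3 and Level 4 numbers, as this template does, mirrors the reporting principles later in this framework.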

What Good L&D Reporting Looks Like

This is the kind of dashboard your leadership team should be seeing. Not a list of completion rates — a clear picture of business impact.

L&D Impact Dashboard — Q1 2026

Last updated: March 20, 2026

| Metric | Value | Trend |
|---|---|---|
| Employees trained | 847 | +23% vs Q4 |
| Completion rate | 94% | +6pts |
| Satisfaction (L1) | 4.4/5 | +0.3 |
| Knowledge gain (L2) | 78% | +12pts |
| Behaviour change (L3) | 68% | +15pts |
| Training ROI (L4) | 312% | New metric |

[Chart: behaviour change by programme (Level 3 — % applying on the job at 60 days), covering AI Skills, Leadership, Sales, Compliance, Onboarding, and Tech Skills]

Key principles of effective L&D reporting

  1. Lead with business outcomes, not activity metrics. "Training reduced customer complaints by 23%" beats "We delivered 47 training sessions."
  2. Show trends, not snapshots. One quarter means nothing. Three quarters in a row tells a story.
  3. Include behaviour change data. This is what bridges the gap between "they learned it" and "they do it."
  4. Tie every programme to a business KPI. If you cannot connect a training programme to a business metric, question whether it should exist.
  5. Report cost per behaviour change, not cost per head. A programme that costs more but changes more behaviour is a better investment.
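Principle 5 is a one-line calculation: divide programme cost by the number of participants observed applying the skills (Level 3), not by headcount. A minimal sketch with hypothetical figures:

```python
def cost_per_behaviour_change(total_cost, participants, applying_rate):
    """Cost per participant who actually applies the training (Level 3),
    rather than cost per head. applying_rate is a fraction, e.g. 0.30 = 30%.
    """
    return total_cost / (participants * applying_rate)

# Hypothetical comparison: a dearer programme that shifts more behaviour
# can still be the better investment on a per-behaviour-change basis.
cheap = cost_per_behaviour_change(20_000, 100, 0.30)  # cheaper per head
dear = cost_per_behaviour_change(35_000, 100, 0.70)   # dearer per head
print(round(cheap), round(dear))
```

Here the cheaper programme costs around £667 per behaviour change while the dearer one costs £500, which is exactly the comparison cost-per-head hides.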

Make measurement automatic

This framework is Stage 4 of The Blend Method. Explore the full toolkit to diagnose needs, design programmes, and adapt based on real data.

Explore all resources