Kirkpatrick's Four Levels of Training Evaluation
Most L&D teams only measure Level 1 (satisfaction surveys). The real value — and the evidence leadership cares about — lives at Levels 3 and 4. This framework shows you how to measure all four.
Level 1: Reaction
Did participants find the training valuable and engaging? This is the post-training survey most teams already do.
Level 2: Learning
Did participants actually acquire the intended knowledge, skills, or attitudes? Measured through assessments, demonstrations, or simulations.
Level 3: Behaviour
Are participants applying what they learned on the job? This is where most L&D measurement breaks down — and where the real value starts.
Level 4: Results
Did the training drive measurable business outcomes? Revenue, productivity, retention, quality, compliance incidents — the metrics leadership cares about.
According to ATD research, 90% of organisations measure Level 1, but only 37% measure Level 3 and just 18% measure Level 4. Closing this gap is the single most impactful thing you can do for your L&D function's credibility.
How to measure each level
| Level | Timing | Method | Who collects |
|---|---|---|---|
| 1. Reaction | Immediately after | Survey (5-8 questions max), Net Promoter Score | L&D team |
| 2. Learning | End of training + 2 weeks | Pre/post assessment, skills demonstration, scenario simulation | L&D team / facilitator |
| 3. Behaviour | 30, 60, 90 days after | Manager observation checklist, 360 feedback, on-the-job assessment, work output review | Managers + L&D |
| 4. Results | 90 days – 12 months | Business metrics comparison (before vs after), control group comparison, correlation analysis | L&D + Business leaders |
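The timing column above translates directly into a follow-up calendar for each cohort. A minimal sketch in Python (the day offsets mirror the table; the `measurement_schedule` helper and the choice to check Level 4 at 90 days and 12 months are illustrative assumptions):

```python
from datetime import date, timedelta

# Follow-up offsets in days for each Kirkpatrick level, taken from
# the timing column above. Level 4 is checked at the start and end
# of its 90-day-to-12-month window (an illustrative choice).
FOLLOW_UPS = {
    "1. Reaction": [0],
    "2. Learning": [0, 14],
    "3. Behaviour": [30, 60, 90],
    "4. Results": [90, 365],
}

def measurement_schedule(training_end: date) -> dict:
    """Return the calendar dates on which each level should be measured."""
    return {
        level: [training_end + timedelta(days=d) for d in offsets]
        for level, offsets in FOLLOW_UPS.items()
    }

schedule = measurement_schedule(date(2026, 1, 15))
for level, dates in schedule.items():
    print(level, [d.isoformat() for d in dates])
```

Putting the dates in managers' calendars at enrolment time, rather than after delivery, is what makes the Level 3 and 4 check-ins actually happen.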
Training ROI Calculator
Enter your training costs and expected benefits to calculate return on investment and payback period, and to generate an executive summary.
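For reference, the arithmetic behind such a calculator is simple: ROI is net benefit over cost, and payback is cost divided by monthly benefit. A minimal sketch (the £40,000 cost and £100,000 benefit figures are illustrative inputs, not benchmarks):

```python
def training_roi(total_cost: float, annual_benefit: float) -> dict:
    """Standard training ROI arithmetic: net benefit over cost,
    plus payback period in months."""
    net_benefit = annual_benefit - total_cost
    roi_pct = net_benefit / total_cost * 100
    payback_months = total_cost / (annual_benefit / 12)
    return {
        "net_benefit": round(net_benefit, 2),
        "roi_pct": round(roi_pct, 1),
        "payback_months": round(payback_months, 1),
    }

# Example: a £40,000 programme expected to return £100,000 in year one.
result = training_roi(40_000, 100_000)
print(result)  # ROI 150.0%, payback about 4.8 months
```

The hard part is never the formula; it is defending the benefit estimate, which is why the behaviour-change data at Level 3 matters so much.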
Pre/Post KPIs by Training Type
Not all training should be measured the same way. Here are the recommended KPIs for each common training type, with suggested pre and post measurement points.
Leadership Development
| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| 360 leadership score | Avg 3.2/5 | Avg 3.8/5 at 90 days | 360 feedback survey |
| Direct report engagement | Team eNPS baseline | +10 points at 6 months | Pulse survey |
| Coaching conversation frequency | 1x/month avg | 2x/month at 60 days | Manager self-report + HR system |
| Team turnover rate | Annual baseline | 15% reduction at 12 months | HR data |
Technical / Digital Skills
| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Skills assessment score | Avg score on assessment | +30% improvement | Pre/post test |
| Tool adoption rate | % using new tools | 80%+ active use at 30 days | System usage data |
| Task completion time | Avg time per task | 20% reduction at 60 days | Process tracking |
| Error rate | Errors per 100 tasks | 40% reduction at 90 days | Quality data |
Sales Training
| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Win rate | Current % | +5-10 percentage points | CRM data |
| Average deal size | Current avg | +10-15% increase | CRM data |
| Sales cycle length | Current avg days | 10-20% reduction | CRM data |
| Pipeline generation | Monthly pipeline value | +15% increase | CRM data |
Compliance Training
| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Incident rate | Incidents per quarter | 30-50% reduction | Incident reports |
| Near-miss reporting | Reports per month | +50% increase (good — means awareness is up) | Reporting system |
| Scenario decision quality | Pre-assessment score | 85%+ correct decisions | Scenario-based assessment |
| Audit findings | Findings per audit | 50% reduction | Audit reports |
Onboarding
| KPI | Pre-training baseline | Post-training target | Measurement method |
|---|---|---|---|
| Time to productivity | Current avg (weeks) | 25-40% reduction | Manager assessment + output data |
| 90-day retention | Current % | 90%+ retention | HR data |
| New hire satisfaction | Survey baseline | 4.2+/5 at 30 days | Onboarding survey |
| Manager satisfaction | Survey baseline | 4.0+/5 at 60 days | Manager survey |
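Most of the targets in the tables above are percentage changes against a baseline, so hitting them can be checked mechanically. A minimal sketch (the function names and the example figures are illustrative, not from any real programme):

```python
def pct_change(baseline: float, actual: float) -> float:
    """Percentage change from baseline; negative means a reduction."""
    return (actual - baseline) / baseline * 100

def target_met(baseline: float, actual: float, target_pct: float,
               lower_is_better: bool = False) -> bool:
    """True if the observed change meets the target. For KPIs where a
    reduction is the goal (error rate, cycle length), set lower_is_better."""
    change = pct_change(baseline, actual)
    return (-change >= target_pct) if lower_is_better else (change >= target_pct)

# Illustrative technical-skills checks, in the spirit of the table above:
print(target_met(60, 80, 30))                        # assessment score up 33% vs +30% target
print(target_met(12, 9, 20, lower_is_better=True))   # task time down 25% vs 20% reduction target
```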
Business Case Builder
Fill in the fields below to auto-generate an executive summary you can include in budget proposals, board presentations, or stakeholder communications.
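A sketch of how such a summary can be assembled from a template; every field name and figure here is a hypothetical placeholder, not output from the actual builder:

```python
# Template-based executive summary; field names are illustrative.
SUMMARY_TEMPLATE = (
    "{programme} trained {participants} employees at a total cost of "
    "£{cost:,.0f}. At {window} days, {kpi} moved from {baseline} to {actual}, "
    "delivering an estimated £{benefit:,.0f} in annual value ({roi:.0f}% ROI)."
)

def executive_summary(**fields) -> str:
    """Fill the template, deriving ROI from cost and benefit."""
    fields["roi"] = (fields["benefit"] - fields["cost"]) / fields["cost"] * 100
    return SUMMARY_TEMPLATE.format(**fields)

summary = executive_summary(
    programme="Leadership Essentials",   # all values below are invented
    participants=45,
    cost=40_000,
    window=90,
    kpi="the average 360 leadership score",
    baseline=3.2,
    actual=3.8,
    benefit=100_000,
)
print(summary)
```

One sentence that names the programme, the behaviour change, and the money is almost always more persuasive than a slide of activity metrics.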
What Good L&D Reporting Looks Like
This is the kind of dashboard your leadership team should be seeing. Not a list of completion rates — a clear picture of business impact.
L&D Impact Dashboard — Q1 2026 (last updated March 20, 2026)
Key principles of effective L&D reporting
- Lead with business outcomes, not activity metrics. "Training reduced customer complaints by 23%" beats "We delivered 47 training sessions."
- Show trends, not snapshots. One quarter means nothing. Three quarters in a row tells a story.
- Include behaviour change data. This is what bridges the gap between "they learned it" and "they do it."
- Tie every programme to a business KPI. If you cannot connect a training programme to a business metric, question whether it should exist.
- Report cost per behaviour change, not cost per head. A programme that costs more but changes more behaviour is a better investment.
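The last principle, cost per behaviour change, is a one-line calculation; a sketch comparing two hypothetical programmes (all figures invented for illustration):

```python
def cost_per_behaviour_change(total_cost: float, participants: int,
                              pct_changed: float) -> float:
    """Cost divided by the number of participants who demonstrably
    changed behaviour (Level 3), rather than by headcount."""
    changed = participants * pct_changed
    return total_cost / changed

# Programme A: cheaper per head, but behaviour change observed in only 25%.
print(cost_per_behaviour_change(20_000, 100, 0.25))  # 800.0 per behaviour change
# Programme B: twice the cost, but behaviour change observed in 70%.
print(cost_per_behaviour_change(40_000, 100, 0.70))  # about 571 per behaviour change
```

On a per-head basis Programme A looks cheaper (£200 vs £400), but per behaviour actually changed, Programme B is the better investment.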