Key Takeaways
- Completion rates measure compliance, not security improvement — track behaviour change instead
- Seven specific metrics prove training effectiveness: reporting rates, repeat-clicker reduction, dwell time, real-world threat reporting, knowledge retention, incident frequency, and miss rates
- Expect a 40% reduction in click rates by 90 days and an 86% reduction by 12 months with consistent training
- Academic research shows contradictory results — training works when it’s continuous and behaviour-focused, not annual and passive
- Individual employee tracking may have legal implications under employment law and data protection regulations
You’ve been running your security awareness programme for six months. The completion dashboard is green. The last phishing simulation showed a drop in click rates. And leadership has just asked: “Is this actually working — and should we continue funding it?” That question deserves a better answer than a screenshot of your completion certificate report.
The effectiveness of security awareness training hinges on measuring the right metrics at the right time. According to industry benchmarking data (2025), organisations see an average 40% reduction in phish-prone percentage within 90 days and 86% reduction after 12 months of consistent training. But those figures only tell part of the story.
This article gives you seven metrics that will hold up under scrutiny — and explains precisely what results you should be seeing at your current programme stage, based on data from over 67 million simulated phishing tests.
Why Your Current Metrics Are Lying to You
Most security awareness programmes measure activity, not outcomes. Completion rates hit 100% while employees click phishing links the morning after finishing their training module. A study published in academic research (2025) examined 19,500 healthcare employees and found no significant relationship between recent training completion and phishing resistance.
The problem runs deeper than individual metrics. Government guidance explicitly separates “awareness” (broad culture shift) from “training” (role-based skill development). Completion logs satisfy compliance requirements but do not prove risk reduction. Yet most organisations still report training success using completion percentages.
Click rates present their own distortion. Send an easier phishing simulation this quarter and your numbers magically improve. A controlled study (2025) found that click rates increased from 7.0% for easy lures to 15.0% for hard lures, regardless of training history. Without calibrating for simulation difficulty, click-rate comparisons between campaigns become meaningless.
What Should Your Results Look Like at 6-12 Months?
Benchmarking data from 62,400 organisations provides clear staging expectations. The global baseline phish-prone percentage before any training averages 33.1%. After 90 days of consistent training and simulated phishing, this drops to approximately 20% — a 40% reduction. By 12 months, the average falls to 4.1%.
These figures vary significantly by organisation size and sector. Smaller organisations (1-250 employees) start with a baseline of 24.6%, while larger organisations (10,000+ employees) begin at 40.5%. Healthcare and pharmaceuticals represent the highest-risk sector at 41.9% baseline.
If your organisation sits between months six and twelve, reasonable expectations include:
- Phishing click rate reduction of 50-70% from your original baseline
- Reporting rate of suspicious emails reaching 15-25% of staff
- Repeat-clicker rate (same person failing multiple simulations) below 5%
- Average time to report suspicious emails under 10 minutes
However, recent academic research presents a contradictory picture. A large-scale controlled study (2025) found training interventions showed no statistically significant effect on click rates or reporting rates in an operational fintech environment. This contradiction suggests the effectiveness of security awareness training depends heavily on implementation approach, not just time invested.
The 7 Metrics That Actually Prove Effectiveness
1. Phishing Reporting Rate (Simulation)
Track the percentage of employees who identify and report phishing simulations without clicking first. Industry data (2024) shows 20% of users report phishing in simulation exercises. This metric measures detection behaviour, not just avoidance.
Target benchmark: 20-30% reporting rate within six months. Calculate monthly to track improvement trends. This metric proves employees are actively engaging with security rather than passively avoiding obvious threats.
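As a rough illustration, the reporting rate can be computed directly from per-recipient simulation logs. The record structure and field names below are hypothetical, not from any specific platform:

```python
# Hypothetical simulation log: one record per recipient per campaign.
records = [
    {"user": "a", "clicked": False, "reported": True},
    {"user": "b", "clicked": True,  "reported": False},
    {"user": "c", "clicked": False, "reported": False},
    {"user": "d", "clicked": False, "reported": True},
]

# Count only users who reported without clicking first,
# matching the metric definition above.
reporters = sum(1 for r in records if r["reported"] and not r["clicked"])
reporting_rate = reporters / len(records) * 100
print(f"Reporting rate: {reporting_rate:.1f}%")  # Reporting rate: 50.0%
```

Excluding users who clicked before reporting keeps the metric honest: a click-then-report still represents a compromise, even if the follow-up report is valuable.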
2. Real-World Threat Reporting Rate
Monitor how often employees report actual suspicious emails versus simulation exercises. This metric proves training transfers to genuine threat scenarios. Many organisations track simulation performance but ignore real-world behaviour transfer.
Collect data from your email security provider on user-reported threats. Compare this against the volume of actual malicious emails detected by automated systems. A healthy programme shows increasing user reporting alongside decreasing successful attacks.
3. Repeat-Clicker Reduction Rate
Define repeat clickers as employees who fail three or more simulations within a 90-day period. Track how this population shrinks over time with targeted intervention. Research shows personalised training reduced repeat phishing victims by 63% within six months.
Target: Repeat-clicker rate below 5% by month six. This metric identifies whether your programme helps the most vulnerable users or simply improves already-capable ones.
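A minimal sketch of the repeat-clicker definition above (three or more failures inside a rolling 90-day window), assuming a simple list of per-user failure dates:

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical failure log: (user, date the user clicked a simulation).
failures = [
    ("alice", date(2025, 1, 5)),
    ("alice", date(2025, 2, 10)),
    ("alice", date(2025, 3, 20)),
    ("bob",   date(2025, 1, 8)),
]

def repeat_clickers(failures, window_days=90, threshold=3):
    """Return users with `threshold`+ failures inside any rolling window."""
    by_user = defaultdict(list)
    for user, day in failures:
        by_user[user].append(day)
    flagged = set()
    window = timedelta(days=window_days)
    for user, days in by_user.items():
        days.sort()
        # Slide over sorted dates; check each run of `threshold` failures.
        for i in range(len(days) - threshold + 1):
            if days[i + threshold - 1] - days[i] <= window:
                flagged.add(user)
                break
    return flagged

print(repeat_clickers(failures))  # {'alice'}
```

Using a rolling window rather than calendar quarters avoids an edge case where two failures in late March and two in early April never land in the same measurement period.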
4. Dwell Time (Time to Report)
Measure the average time between a suspicious email arriving and an employee reporting it. Fast reporting limits the damage from successful attacks: industry data shows the median time to click a malicious link is just 21 seconds, so every minute of reporting delay gives a live campaign more time to find victims.
Effective programmes achieve average reporting times under 10 minutes. This metric demonstrates whether training creates reflexive security behaviour or merely theoretical knowledge.
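Given delivery and report timestamps (the pairs below are invented for illustration), the calculation is straightforward. The sketch uses the median rather than the mean, since one employee reporting hours later would otherwise dominate the figure:

```python
from datetime import datetime
from statistics import median

# Hypothetical (delivered, reported) timestamp pairs for one campaign.
events = [
    (datetime(2025, 6, 2, 9, 0), datetime(2025, 6, 2, 9, 4)),
    (datetime(2025, 6, 2, 9, 0), datetime(2025, 6, 2, 9, 12)),
    (datetime(2025, 6, 2, 9, 0), datetime(2025, 6, 2, 10, 30)),
]

# Dwell time per report, in minutes.
dwell_minutes = [(reported - delivered).total_seconds() / 60
                 for delivered, reported in events]
print(f"Median time to report: {median(dwell_minutes):.0f} minutes")
```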
5. Knowledge Assessment Score Trend
Track learning retention through brief monthly assessments, not completion certificates. Focus on practical scenario-based questions rather than policy memorisation. Measure improvement over time, not absolute scores.
This metric separates genuine understanding from compliance box-ticking. Rising knowledge scores combined with improving behavioural metrics indicate effective training design.
6. Security Incident Frequency Trend
Connect training metrics to broader security incident data. Track monthly frequency of successful phishing attacks, malware infections, and social engineering incidents. Effective training should correlate with declining incident rates.
This lagging indicator validates whether simulation improvements translate to real-world protection. According to industry research, employee training reduces average breach cost by $232,867.
7. Miss Rate (Neither Click nor Report)
Track employees who neither click suspicious emails nor report them. This “miss rate” represents hidden risk — staff who ignore potential threats entirely. High miss rates suggest training teaches avoidance without building reporting culture.
Target: Miss rate below 40% by month six. This metric reveals whether training creates engaged security participants or passive observers.
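One way to keep the miss rate honest is to give every simulation recipient exactly one label, so that clicked, reported, and missed always account for the whole population. A small sketch with invented outcomes:

```python
from collections import Counter

def label(clicked, reported):
    """Assign each recipient exactly one outcome category."""
    if clicked:
        return "clicked"    # a click counts as a failure even if reported later
    if reported:
        return "reported"
    return "missed"         # neither clicked nor reported: hidden risk

# Hypothetical (clicked, reported) outcomes for one simulation.
outcomes = [(False, True), (True, False), (False, False), (False, False)]

counts = Counter(label(c, r) for c, r in outcomes)
total = len(outcomes)
for name in ("clicked", "reported", "missed"):
    print(f"{name}: {counts[name] / total:.0%}")
```

Because the three categories partition the recipients, a falling click rate with a flat reporting rate shows up immediately as a rising miss rate, which is exactly the failure mode this metric exists to catch.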
How to Handle Multi-Channel Threats
Email phishing represents only part of the modern threat landscape. Voice phishing increased 442% between H1 and H2 2024. AI-supported phishing represents more than 80% of observed social engineering worldwide, according to European threat landscape reports (2025).
Expand measurement frameworks beyond email to include:
- SMS phishing (smishing) simulation and reporting rates
- Voice phishing (vishing) awareness and response protocols
- QR code phishing recognition and reporting
- Multi-factor authentication fatigue attack resistance
Organisations measuring only email phishing click rates have systematically incomplete pictures of their human risk exposure. Platforms like Complorer integrate multi-channel threat simulation with unified reporting dashboards, giving security teams comprehensive visibility across attack vectors.
What to Do About Repeat Clickers
Traditional approaches mandate additional training for repeat clickers. Research suggests this approach fails. A long-term study of 14,000+ employees found that for the most susceptible participants, mandatory training provided no additional benefit.
Alternative intervention strategies include:
- One-to-one coaching sessions focused on specific vulnerability patterns
- Environmental controls (additional email filtering, browser restrictions)
- Peer mentoring programmes pairing repeat clickers with security-conscious colleagues
- Job role evaluation — some positions may require enhanced technical controls rather than training
Track intervention success by measuring whether repeat clickers show improved reporting rates in subsequent simulations, not just reduced click rates.
How to Present These Metrics to Leadership
Leadership presentations require business language, not technical metrics. Structure your quarterly report around risk reduction and cost avoidance rather than simulation statistics.
Essential presentation framework:
- Open with context: “68% of data breaches involve human factors, with average breach cost of $4.44 million”
- Present trend, not snapshot: “Phishing susceptibility reduced from X% to Y% over six months”
- Connect to real incidents: “Employee reporting identified X actual threats this quarter”
- Benchmark against industry: “Our results exceed sector average by X%”
- Quantify value: “Training investment of £X potentially avoided £Y in breach costs”
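The "potentially avoided" figure can be produced with a deliberately simple expected-loss model. Everything below is a placeholder assumption except the $4.44 million average breach cost quoted earlier in this article; present it to leadership as a rough estimate, not an audited number:

```python
# Illustrative back-of-envelope model; all probabilities are assumptions.
avg_breach_cost = 4_440_000      # average breach cost quoted above ($)
baseline_breach_prob = 0.10      # assumed annual likelihood before training
current_breach_prob = 0.06       # assumed likelihood after training
training_cost = 50_000           # assumed annual programme cost

# Expected avoided loss = reduction in likelihood x average cost.
expected_avoided = (baseline_breach_prob - current_breach_prob) * avg_breach_cost
roi = (expected_avoided - training_cost) / training_cost

print(f"Expected avoided loss: ${expected_avoided:,.0f}")
print(f"Simple ROI: {roi:.0%}")
```

The value of this model is less the exact number than the conversation it forces: leadership can challenge the assumed probabilities, which is a far more productive discussion than debating completion percentages.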
Avoid technical jargon. Never present completion rates as success metrics. Focus on behaviour change and risk reduction outcomes that map directly to business impact.
Legal Considerations for Individual Tracking
Tracking individual employee behaviour in phishing simulations raises data protection considerations. Individual click tracking, risk scoring, and detailed behavioural logging constitute employee monitoring under data protection regulations.
Organisations should review their Data Protection Impact Assessment obligations and relevant employment law before implementing individual-level tracking at scale. Consider consulting legal and HR teams to ensure monitoring approaches align with local requirements and employment contracts.
Aggregate reporting often provides sufficient programme insight without individual privacy implications. Focus metrics on team or department trends rather than personal performance tracking where possible.
What the Research Really Says About Training Effectiveness
Academic research presents conflicting evidence about training effectiveness. Industry benchmarking data consistently shows dramatic improvements in click rates and reporting behaviour. However, controlled academic studies sometimes find no significant training effects.
The contradiction likely reflects implementation differences. Sustained, frequent, behaviour-focused programmes show measurable results. Annual, passive, compliance-driven training largely doesn’t. Research on sustained simulation programmes found they halved successful compromise rates within six months across 20 organisations.
This nuance matters for measurement strategy. Programmes designed around behaviour change require different metrics than compliance-focused approaches. The seven metrics outlined above align with behaviour-focused training that research suggests actually works.
Frequently Asked Questions
What’s a good phishing click rate after 6 months of training?
Expect 50-70% reduction from your original baseline by six months. If you started at 30% click rate, aim for 9-15% by month six. Industry data shows average reduction to 4.1% after 12 months of consistent training.
How often should I run phishing simulations?
Monthly simulations provide optimal results according to research data. Quarterly simulations maintain awareness but show slower improvement rates. Annual simulations are insufficient for sustained behaviour change and create unreliable measurement data.
Should I track individual employee performance?
Individual tracking may have legal implications under employment law and data protection regulations. Aggregate department or team-level metrics often provide sufficient programme insight without privacy concerns. Consult legal and HR teams before implementing detailed individual monitoring.
What if my click rates aren’t improving despite training?
Review simulation difficulty using standardised scales. Ensure training is behaviour-focused rather than passive information delivery. Consider multi-channel threats beyond email. Academic research shows some training approaches fail — continuous, practical training outperforms annual, theoretical approaches.
How do I measure real-world training effectiveness vs simulation performance?
Track actual threat reporting rates from your email security systems. Monitor security incident frequency trends. Compare user-reported threats against automated detections. Real-world effectiveness shows employees reporting actual threats, not just performing well in controlled simulations.
What should I do about employees who keep failing simulations?
Avoid punitive additional training, which research shows doesn’t help the most susceptible users. Try one-to-one coaching, environmental controls, peer mentoring, or role evaluation. Track intervention success through improved reporting rates, not just reduced clicks.
What Should You Do Next?
The seven metrics outlined above provide a measurement framework that proves training effectiveness rather than training activity. Focus on behaviour change, real-world transfer, and risk reduction outcomes that leadership can connect to business impact.
Start by auditing your current measurement approach against these seven areas. Identify which metrics you’re already collecting and which gaps need addressing. Platforms like Complorer provide integrated dashboards that track behavioural metrics across multiple attack vectors, making it easier to build comprehensive effectiveness reports.
Remember that measurement affects culture. Frame metrics as improvement opportunities rather than performance surveillance. The goal is building security-conscious teams, not catching people out.
Begin implementing these metrics systematically over the next quarter to build the evidence base your next leadership presentation will need.
References
- Verizon Business. (2024). 2024 Data Breach Investigations Report.
- Rozema, A., et al. (2025). Anti-Phishing Training (Still) Does Not Work: A Large-Scale Reproduction of Phishing Training Inefficacy Grounded in the NIST Phish Scale.
- Merritt, M., et al. (2024). Building a Cybersecurity and Privacy Learning Program. NIST Special Publication 800-50 Rev.1.
- Anonymous/Composite. (2025). Sustaining Cyber Awareness: The Long-Term Impact of Continuous Phishing Training and Emotional Triggers.
- IBM Security. (2024). Cost of a Data Breach Report 2024.

