Strategic Insights on Managed Security Awareness

How Do You Measure the Effectiveness of Security Awareness Training?


10 min read / May 8, 2026


Key Takeaways

  • Traditional metrics like completion rates don’t prove your training reduces risk — they only show activity happened
  • Seven specific metrics provide evidence of actual behavioural change and risk reduction at the 6-12 month programme stage
  • Academic research shows mixed results for training effectiveness — the key is measuring the right behaviours, not just simulation performance
  • A structured leadership reporting framework turns your metrics into compelling business language for budget discussions
  • Organisations can expect roughly a 40% reduction in simulation failure rates by 90 days, with up to 86% reduction possible after 12 months of consistent training

Your security awareness programme has been running for six months. Completion rates look good, recent phishing simulations show improved click rates, and leadership is asking the critical question: “Is this actually working — and should we continue funding it?” That question deserves a better answer than a screenshot of completion certificates. This article provides seven evidence-based metrics that will survive leadership scrutiny, calibrated specifically for programmes at the 6-12 month stage.

Why Your Current Metrics Are Probably Lying to You

Most security teams measure training activity, not training effectiveness. A 100% completion rate tells you people clicked through modules — it doesn’t prove they can spot a real phishing email. A 2025 academic study found that training interventions showed no statistically significant effect on click rates in controlled environments, despite widespread industry claims of success.

The problem lies in what we choose to measure. Completion rates satisfy compliance requirements but don’t demonstrate risk reduction. A healthcare study referenced in industry research found no significant relationship between recent training completion and actual phishing resistance among 19,500 employees.

This doesn’t mean security awareness training fails — it means most programmes measure the wrong things. The difference between effective and ineffective training often comes down to frequency, relevance, and measuring behaviours rather than activities.

What Should Your Numbers Look Like at 6-12 Months?

According to a 2025 industry benchmarking report covering 67.7 million simulated phishing tests across 62,400 organisations, programmes show predictable improvement curves when implemented consistently:

  • Baseline: 33.1% of employees typically fail initial phishing simulations
  • 90 days: Expect approximately 40% reduction in failure rates
  • 12 months: Well-run programmes achieve 86% reduction, bringing failure rates down to 4.1%

If your metrics aren’t tracking toward these benchmarks, the issue likely isn’t your employees — it’s programme design or measurement approach. Sustained, behaviour-focused programmes consistently outperform annual compliance training.
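The benchmark figures above are all relative reductions against your own baseline, so it’s worth being precise about the arithmetic. A minimal sketch (the 90-day figure below is illustrative, not taken from any specific report):

```python
def relative_reduction(baseline_rate: float, current_rate: float) -> float:
    """Percentage reduction from a baseline failure rate to the current one."""
    if baseline_rate <= 0:
        raise ValueError("baseline rate must be positive")
    return (baseline_rate - current_rate) / baseline_rate * 100

# A programme that starts at a 33.1% simulation failure rate and measures
# 19.9% at the 90-day mark has achieved roughly a 40% reduction.
print(round(relative_reduction(33.1, 19.9), 1))  # ≈ 39.9
```

The point of using relative reduction is that a healthcare organisation starting at 41.9% and a small firm starting at 24.6% can be judged against the same improvement curve.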

Industry and Size Variations Matter

Your organisation’s starting point affects realistic expectations. Healthcare and pharmaceutical organisations face higher baseline risk, with initial failure rates of 41.9%. Smaller organisations (1-250 employees) typically start at 24.6% baseline failure rates, while larger organisations (10,000+ employees) often begin at 40.5%.

The 7 Metrics That Actually Prove Effectiveness

These seven metrics provide evidence of genuine behavioural change and risk reduction, not just training completion:

1. Phishing Reporting Rate (Simulation-Based)

This measures the percentage of employees who correctly identify and report phishing simulations without clicking. According to the 2024 Data Breach Investigations Report, 20% of users identified and reported phishing in simulation exercises — but only 11% of those who initially clicked also reported the suspicious email.

What to track: Monthly percentage of employees who report simulated phishing emails within 24 hours, without clicking first.

Target benchmark: 25-30% reporting rate by month 6, 40%+ by month 12.

2. Real-World Threat Reporting Rate

The ultimate test: do employees report actual malicious emails they receive? This metric proves training transfers from simulations to real-world behaviour.

What to track: Number of legitimate threat reports per month divided by total employees. Include reports that turn out to be false positives — erring on the side of caution shows good security instincts.

Target benchmark: 2-3 legitimate threat reports per 100 employees per month indicates strong reporting culture.
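Expressed per 100 employees, this metric normalises across organisation sizes. A minimal sketch, with hypothetical numbers:

```python
def reports_per_100(report_count: int, headcount: int) -> float:
    """Legitimate threat reports per 100 employees for the month.

    Per the guidance above, report_count includes well-intentioned
    false positives, since cautious reporting is the behaviour we want.
    """
    return report_count / headcount * 100

# 18 legitimate reports in a month from a 750-person organisation:
rate = reports_per_100(18, 750)  # ≈ 2.4, within the 2-3 target band
```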

3. Repeat-Clicker Reduction Rate

This tracks improvement among your highest-risk users. Research shows targeted, personalised training can reduce repeat phishing victims by 63% within six months.

What to track: Define repeat clickers as employees who fail three consecutive simulations. Track how many move out of this category each quarter through targeted intervention.

Target benchmark: 50% reduction in repeat-clicker population every six months.

4. Average Time to Report (Dwell Time)

Speed matters in real attacks. The 2024 Data Breach Investigations Report found median time to click on malicious links was just 21 seconds. Fast reporting reduces potential damage.

What to track: Average time between simulation delivery and employee report submission.

Target benchmark: Under 2 hours for 75% of reports by month 6, under 30 minutes by month 12.
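Because the benchmark is framed as “under 2 hours for 75% of reports”, the 75th percentile of report times is the number to track, not the mean (one very slow reporter would skew an average badly). A sketch using Python’s standard library, with a hypothetical sample:

```python
import statistics

# Minutes between simulation delivery and each employee's report
# (hypothetical sample for one simulation round).
report_times_min = [4, 7, 12, 18, 25, 31, 45, 58, 90, 140, 200, 310]

# quantiles(n=4) returns the quartiles; index 2 is the 75th percentile,
# i.e. the time within which three quarters of reports arrived.
p75 = statistics.quantiles(report_times_min, n=4)[2]
# 127.5 minutes for this sample: just over two hours, so slightly
# above the month-6 target.
```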

5. Knowledge Retention Score Trends

Unlike completion rates, knowledge assessments measure actual learning retention over time.

What to track: Quarterly knowledge assessments covering key concepts: identifying suspicious links, recognising social engineering tactics, proper reporting procedures.

Target benchmark: 80% average score with improving trend quarter-over-quarter.

6. Security Incident Frequency (Lagging Indicator)

The business metric that matters most: are you experiencing fewer successful attacks?

What to track: Human-factor security incidents per quarter: successful phishing, credential compromise, malware installation via email.

Target benchmark: 25-50% reduction in human-factor incidents by month 12.

7. Miss Rate (The Hidden Risk Indicator)

Employees who neither click nor report suspicious emails represent hidden risk — they’re not engaging with security processes at all.

What to track: Percentage of employees who receive simulations but show no response (no click, no report).

Target benchmark: Under 15% miss rate indicates good programme engagement.
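Reporting rate, click rate, and miss rate all come from the same per-employee simulation outcomes, so they can be computed together. A minimal sketch with hypothetical data (the outcome labels are assumptions, not a specific platform’s schema):

```python
from collections import Counter

# Outcome per employee for one simulation round: "reported" (no click),
# "clicked", or "none" (no click, no report). Hypothetical data.
outcomes = ["reported"] * 30 + ["clicked"] * 8 + ["none"] * 12

counts = Counter(outcomes)
total = len(outcomes)

reporting_rate = counts["reported"] / total * 100  # 60% reported
miss_rate = counts["none"] / total * 100           # 24% silent

# A 24% miss rate is above the 15% threshold, flagging an
# engagement problem even though the reporting rate looks healthy.
```

The three rates always sum to 100%, which is a useful sanity check on your data export.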

How Do You Handle Employees Who Keep Failing Tests?

Academic research suggests punitive mandatory training doesn’t help the most susceptible users. A 15-month study of 14,000+ employees found mandatory training provided no additional benefit for repeat clickers.

Instead, try a supportive intervention protocol:

  1. One-to-one coaching session focusing on real examples they’ve encountered
  2. Personalised simulations based on their role and common attack vectors they face
  3. Peer buddy system pairing them with security-conscious colleagues
  4. Monthly check-ins to discuss suspicious emails they’ve received

Measure intervention success by tracking whether reporting rates improve in the two simulations following personal coaching. This approach addresses individual needs rather than adding generic training volume.
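The success criterion above is concrete enough to automate. A sketch of one possible check (the outcome labels and function name are illustrative):

```python
# Simulation outcomes for one employee, oldest first:
# "clicked", "reported", or "none". Hypothetical records.
post_coaching = ["none", "reported"]  # the two simulations after coaching

def intervention_succeeded(post_outcomes: list) -> bool:
    """Success per the criterion above: the employee reports at least one
    of the two simulations that follow their personal coaching session."""
    return "reported" in post_outcomes[:2]

print(intervention_succeeded(post_coaching))  # True
```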

What About Multi-Channel Attacks Beyond Email?

Email-only measurement frameworks miss significant attack vectors. Security research shows voice phishing increased by 442% in 2024, while AI-supported phishing represents over 80% of observed social engineering according to European threat landscape analysis.

Expand your measurement to include:

  • Voice phishing (vishing) simulations: Test employee response to suspicious phone calls requesting credentials or information
  • SMS phishing (smishing) awareness: Measure recognition of malicious text messages and QR code attacks
  • MFA fatigue resistance: Track whether employees report repeated authentication requests they didn’t initiate

Platforms like Complorer integrate multi-channel simulation capabilities, enabling comprehensive measurement across the full spectrum of social engineering attacks your employees actually face.

How Do You Turn These Metrics Into a Leadership Presentation?

Leadership needs business language, not security jargon. Structure your quarterly report around three key themes:

Risk Reduction (Lead with This)

  • “Human-factor security incidents decreased by X% this quarter”
  • “Employee threat detection improved by X%, with staff now reporting suspicious emails within 30 minutes”
  • “Our highest-risk users showed X% improvement following targeted intervention”

Financial Impact

According to the 2025 Cost of a Data Breach Report, employee training reduces average breach costs by $232,867. With global average breach costs at $4.44 million, effective training provides measurable return on investment.

Benchmark Comparison

“We’re tracking [above/below/in line with] industry benchmarks for organisations our size. Based on current trajectory, we expect to reach [specific target] by [specific date].”

This framework transforms technical metrics into strategic business discussion. Keep the presentation to 5 minutes, lead with risk reduction, and end with clear next steps.

A Word of Caution: What the Research Actually Says

Academic research presents a more nuanced picture than vendor marketing materials suggest. A large-scale 2025 study of 12,511 employees found training interventions showed no statistically significant main effects on click rates in controlled conditions. However, the same research confirmed that simulation difficulty significantly predicted user behaviour — click rates increased from 7.0% for obvious scams to 15.0% for sophisticated attacks.

This apparent tension highlights two important points:

  • Programme design matters more than programme existence. Continuous, behaviour-focused training works. Annual compliance training largely doesn’t.
  • Simulation difficulty affects your metrics. A 15% click rate on a sophisticated simulation indicates better security awareness than 5% clicks on an obvious scam.

The honest assessment: well-designed, sustained security awareness programmes reduce human risk. Poorly designed programmes create compliance theatre without meaningful protection. Your metrics should help you determine which category your programme falls into.

Legal and Privacy Considerations for Individual Tracking

Individual employee tracking raises data protection considerations, particularly under GDPR and UK GDPR. Recording who clicks what, assigning individual risk scores, and maintaining detailed behavioural logs constitutes employee monitoring.

Before implementing individual-level tracking:

  • Review your Data Protection Impact Assessment obligations
  • Consult your legal and HR teams about employment law implications
  • Consider whether aggregated metrics meet your needs without individual identification
  • Ensure transparent communication about what data you collect and why

Many effective measurement frameworks focus on group trends and voluntary reporting rather than individual surveillance. This approach builds trust while still providing actionable data for programme improvement.

What Should You Do Next?

Start with three metrics from the seven-metric framework: phishing reporting rate, real-world threat reporting rate, and repeat-clicker reduction rate. These provide immediate insight into whether your programme creates genuine behavioural change. Establish baseline measurements this month, then track monthly progress against the 6-12 month benchmarks outlined above.

The key is consistent measurement over time rather than perfect data from day one. Complorer’s integrated analytics dashboard makes this tracking straightforward, combining simulation results with real-world reporting metrics in a single view designed specifically for leadership communication.

Focus on building a measurement habit that serves your programme improvement needs while providing the evidence base to justify continued investment in human risk reduction.

Frequently Asked Questions

How often should I run phishing simulations to get accurate metrics?

Monthly simulations provide the most reliable data for measuring improvement trends. Quarterly simulations work for compliance but don’t offer enough data points to identify problems early. Weekly simulations risk creating fatigue and resentment among staff.

What’s a realistic timeline to see improvement in these metrics?

Expect meaningful improvement within 90 days for well-designed programmes. Reporting rates typically improve faster than click rates. Real-world threat reporting often shows improvement within 6 weeks as employees become more security-conscious in their daily work.

Should I track metrics for individual employees or just aggregate data?

Aggregate metrics provide valuable insights with fewer privacy concerns. Individual tracking helps identify specific intervention needs but requires careful consideration of GDPR compliance and employment law. Many successful programmes use individual data for coaching purposes but report aggregate trends to leadership.

How do I know if my phishing simulations are too easy or too difficult?

Use frameworks like the NIST Phish Scale to rate simulation difficulty consistently. Aim for a mix: 70% moderate difficulty, 20% easy (to build confidence), 10% advanced (to challenge strong performers). If everyone passes or everyone fails consistently, adjust the difficulty distribution.

What should I do if my metrics aren’t improving after 6 months?

Review programme frequency, relevance, and measurement approach. Common issues include: simulations that don’t reflect real threats your organisation faces, training content that’s too generic, insufficient feedback after simulations, or measuring the wrong behaviours. Consider switching from punitive to supportive intervention methods for repeat clickers.

References

[1] Verizon Business. (2024). 2024 Data Breach Investigations Report.

[2] Rozema, A., et al. (2025). Anti-Phishing Training (Still) Does Not Work: A Large-Scale Reproduction of Phishing Training Inefficacy Grounded in the NIST Phish Scale.

[3] Merritt, M., Hansche, S., Ellis, B., et al. (2024). Building a Cybersecurity and Privacy Learning Program. NIST Special Publication 800-50 Rev.1.

[4] Anonymous. (2025). Sustaining Cyber Awareness: The Long-Term Impact of Continuous Phishing Training and Emotional Triggers.

[5] IBM Security. (2025). Cost of a Data Breach Report 2025.

Make Security Awareness Actually Work

Training alone doesn’t change behaviour.
See how modern programmes turn awareness into real-world action.

Explore How It Works
Author

Complorer
