Security Metrics & KPIs
A practical guide to measuring and reporting security performance for boards, executives, and operational teams.
Purpose
Effective security metrics enable organisations to:
- Measure the effectiveness of security controls
- Demonstrate ROI on security investments
- Track progress towards security objectives
- Communicate risk to non-technical stakeholders
- Drive continuous improvement
Leading vs Lagging Indicators
Lagging Indicators
What: Measure outcomes of past events (reactive).
Characteristics:
- Historical data
- Easy to measure
- Show results of security failures
Examples:
- Number of security incidents
- Number of systems compromised
- Mean time to detect (MTTD)
- Mean time to respond (MTTR)
Value: Useful for post-incident analysis and trend identification.
Leading Indicators
What: Measure proactive activities that prevent future incidents (predictive).
Characteristics:
- Forward-looking
- Preventative focus
- Harder to correlate directly to risk reduction
Examples:
- Percentage of systems patched within SLA
- MFA adoption rate
- Security awareness training completion
- Vulnerability scan coverage
Value: Allow corrective action before incidents occur.
Security Metrics by Domain
1. Vulnerability Management
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Mean Time to Patch (MTTP) | Average time from patch release to deployment | (Sum of days to patch) / (Number of patches) | Critical: <7 days; High: <30 days |
| Patch Compliance Rate | % of systems with current patches | (Patched systems / Total systems) × 100 | >95% |
| Critical Vulnerability Age | Days since critical vulnerability identified | Current date − Discovery date | 0 vulnerabilities >30 days |
| Vulnerability Density | Vulnerabilities per asset | Total vulnerabilities / Total assets | Decreasing trend |
| Scan Coverage | % of assets scanned regularly | (Scanned assets / Total assets) × 100 | 100% |
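As a worked sketch of the MTTP and compliance formulas above (the sample dates and counts are illustrative, not real data):

```python
from datetime import date

def mean_time_to_patch(patch_records):
    """Average days from patch release to deployment.

    patch_records: list of (release_date, deployed_date) tuples.
    """
    days = [(deployed - released).days for released, deployed in patch_records]
    return sum(days) / len(days)

def patch_compliance_rate(patched_systems, total_systems):
    """% of systems with current patches."""
    return patched_systems * 100 / total_systems

# Hypothetical sample data: two patches, deployed 4 and 8 days after release
records = [
    (date(2025, 1, 1), date(2025, 1, 5)),
    (date(2025, 1, 1), date(2025, 1, 9)),
]
print(f"MTTP: {mean_time_to_patch(records):.1f} days")         # 6.0 days
print(f"Compliance: {patch_compliance_rate(960, 1000):.1f}%")  # 96.0%
```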
2. Incident Response
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Mean Time to Detect (MTTD) | Average time from incident start to detection | (Sum of detection times) / (Number of incidents) | <15 minutes (critical systems) |
| Mean Time to Respond (MTTR) | Average time from detection to containment | (Sum of response times) / (Number of incidents) | <1 hour (critical incidents) |
| Mean Time to Recover | Average time from incident to full recovery | (Sum of recovery times) / (Number of incidents) | <4 hours (critical systems) |
| Incident Volume | Number of confirmed incidents | Count by severity | Decreasing trend |
| False Positive Rate | % of alerts that are false positives | (False positives / Total alerts) × 100 | <10% |
| Incidents by Category | Breakdown by incident type | Count by category (malware, phishing, etc.) | Trend analysis |
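The MTTD and MTTR averages above can be sketched from per-incident timestamps like so (the incident records are illustrative):

```python
from datetime import datetime, timedelta

def mean_minutes(deltas):
    """Average of a list of timedeltas, in minutes."""
    total = sum(deltas, timedelta())
    return total.total_seconds() / 60 / len(deltas)

# Each incident: (started, detected, contained) — hypothetical timestamps
incidents = [
    (datetime(2025, 6, 1, 9, 0), datetime(2025, 6, 1, 9, 10), datetime(2025, 6, 1, 9, 50)),
    (datetime(2025, 6, 2, 14, 0), datetime(2025, 6, 2, 14, 14), datetime(2025, 6, 2, 14, 40)),
]

# MTTD: incident start → detection; MTTR: detection → containment
mttd = mean_minutes([detected - started for started, detected, _ in incidents])
mttr = mean_minutes([contained - detected for _, detected, contained in incidents])
print(f"MTTD: {mttd:.0f} min, MTTR: {mttr:.0f} min")  # MTTD: 12 min, MTTR: 33 min
```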
3. Identity & Access Management (IAM)
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| MFA Adoption Rate | % of users with MFA enabled | (MFA users / Total users) × 100 | 100% |
| Privileged Account Coverage | % of privileged accounts in PAM | (PAM accounts / Total privileged accounts) × 100 | 100% |
| Access Review Completion | % of scheduled reviews completed on time | (Completed reviews / Total reviews) × 100 | 100% |
| Orphaned Accounts | Accounts inactive >90 days | Count | 0 |
| Deprovisioning Timeliness | Avg time to disable account after separation | (Sum of hours) / (Number of leavers) | <24 hours |
| Password Reset Volume | Number of password reset requests | Count per month | Decreasing (with SSO) |
| Failed Login Attempts | Rate of failed authentication attempts | Count per day/week | Monitor trend |
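Two of the IAM metrics above — orphaned accounts and deprovisioning timeliness — lend themselves to a simple sketch (account names, dates, and the 90-day threshold are illustrative):

```python
from datetime import datetime, timedelta

def orphaned_accounts(accounts, now, threshold_days=90):
    """Accounts whose last login is more than threshold_days ago."""
    cutoff = now - timedelta(days=threshold_days)
    return [name for name, last_login in accounts if last_login < cutoff]

def deprovisioning_hours(leavers):
    """Average hours from separation to account disablement.

    leavers: list of (separated_at, disabled_at) tuples.
    """
    hours = [(disabled - separated).total_seconds() / 3600
             for separated, disabled in leavers]
    return sum(hours) / len(hours)

now = datetime(2025, 12, 1)
accounts = [("alice", datetime(2025, 11, 20)), ("bob", datetime(2025, 7, 1))]
leavers = [(datetime(2025, 11, 3, 9, 0), datetime(2025, 11, 3, 17, 0)),
           (datetime(2025, 11, 10, 9, 0), datetime(2025, 11, 11, 9, 0))]
print(orphaned_accounts(accounts, now))          # ['bob']
print(f"{deprovisioning_hours(leavers):.1f} h")  # 16.0 h — within the <24 h target
```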
4. Security Awareness & Training
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Training Completion Rate | % of staff who completed annual training | (Completed / Total staff) × 100 | 100% |
| Phishing Simulation Click Rate | % of users who clicked phishing simulation | (Clicks / Emails sent) × 100 | <5% |
| Phishing Simulation Report Rate | % of users who reported phishing simulation | (Reports / Emails sent) × 100 | >60% |
| Repeat Offenders | Users who repeatedly fail simulations | Count | <2% |
| Security Incident Reporting | Number of user-reported security concerns | Count per month | Increasing (shows awareness) |
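The phishing simulation rates above follow directly from campaign counts; a minimal sketch with hypothetical figures:

```python
def phishing_rates(emails_sent, clicks, reports):
    """Click and report rates for a phishing simulation campaign, as %."""
    click_rate = clicks * 100 / emails_sent
    report_rate = reports * 100 / emails_sent
    return click_rate, report_rate

# Hypothetical campaign: 500 emails, 20 clicks, 320 reports
click, report = phishing_rates(emails_sent=500, clicks=20, reports=320)
print(f"Click rate: {click:.1f}% (target <5%)")     # 4.0%
print(f"Report rate: {report:.1f}% (target >60%)")  # 64.0%
```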
5. Endpoint Security
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Antivirus/EDR Coverage | % of endpoints with active AV/EDR | (Protected endpoints / Total endpoints) × 100 | 100% |
| Endpoint Compliance | % of endpoints meeting security baseline | (Compliant endpoints / Total endpoints) × 100 | >95% |
| Malware Detection Rate | Malware detections per 1000 endpoints | (Detections / Endpoints) × 1000 | Stable or decreasing |
| Unpatched Endpoint Age | Days since endpoint last patched | Average age of unpatched systems | <30 days |
| Encryption Coverage | % of laptops with full disk encryption | (Encrypted laptops / Total laptops) × 100 | 100% |
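The per-1,000-endpoints normalisation in the malware detection rate makes fleets of different sizes comparable; a one-line sketch with hypothetical counts:

```python
def detections_per_thousand(detections, endpoints):
    """Malware detections normalised per 1,000 endpoints —
    equivalent to (Detections / Endpoints) × 1000."""
    return detections * 1000 / endpoints

# Hypothetical month: 42 detections across a 3,500-endpoint fleet
print(detections_per_thousand(detections=42, endpoints=3500))  # 12.0
```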
6. Network Security
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Firewall Rule Age | Average age of firewall rules | Average days since rule created | Review rules >1 year old |
| Intrusion Detection Alerts | IDS/IPS alerts per day | Count | Monitor trend |
| Network Segmentation Coverage | % of critical systems in isolated segments | (Segmented systems / Critical systems) × 100 | 100% |
| DDoS Mitigation Success | % of DDoS attacks successfully mitigated | (Mitigated / Total attacks) × 100 | 100% |
7. Third-Party Risk
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Vendor Assessment Coverage | % of vendors assessed per tier requirements | (Assessed / Total vendors) × 100 | 100% |
| Overdue Vendor Reviews | Number of vendors past review date | Count | 0 |
| Critical Vendor Findings | Open critical findings from assessments | Count | <5 |
| Vendor Certification Currency | % of critical vendors with current certs | (Current certs / Critical vendors) × 100 | >95% |
| Vendor Incidents | Security incidents involving third parties | Count per quarter | 0 |
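The overdue-reviews count above is a date comparison against each vendor's scheduled review; a minimal sketch (vendor names and dates are illustrative):

```python
from datetime import date

def overdue_reviews(vendors, today):
    """Vendors whose next scheduled review date has passed."""
    return [name for name, next_review in vendors if next_review < today]

# Hypothetical vendor register
vendors = [("Vendor X", date(2025, 10, 1)), ("Vendor Y", date(2026, 3, 1))]
print(overdue_reviews(vendors, today=date(2025, 12, 1)))  # ['Vendor X']
```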
8. Data Protection & Privacy
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Data Breaches | Number of data breach incidents | Count per year | 0 |
| Records Compromised | Number of records involved in breaches | Count per incident | 0 |
| Data Subject Requests | GDPR/privacy rights requests | Count per month | Track trend |
| DSR Response Time | Avg time to respond to data subject requests | (Sum of days) / (Number of requests) | <30 days |
| Encryption Coverage | % of sensitive data encrypted at rest | (Encrypted data / Sensitive data) × 100 | 100% |
| DLP Incidents | Data loss prevention policy violations | Count per month | Decreasing trend |
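The DSR response-time average above can be checked against the 30-day target with a short sketch (the per-request day counts are illustrative):

```python
def dsr_mean_response_days(request_durations):
    """Average days to respond to data subject requests:
    (Sum of days) / (Number of requests)."""
    return sum(request_durations) / len(request_durations)

durations = [12, 25, 8, 19]  # hypothetical days-to-respond per request
avg = dsr_mean_response_days(durations)
print(f"Avg: {avg} days — {'within' if avg < 30 else 'exceeds'} 30-day target")
```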
9. Cloud Security
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Cloud Security Posture Score | Cloud Security Posture Management (CSPM) score | Vendor-specific score | >90% |
| Misconfiguration Rate | Number of critical cloud misconfigurations | Count | <5 |
| Public Exposure Incidents | Publicly exposed cloud resources | Count | 0 |
| Cloud Asset Inventory Accuracy | % of cloud assets in inventory | (Inventoried / Total discovered) × 100 | 100% |
| IAM Policy Violations | Overly permissive IAM policies | Count | <10 |
10. Application Security
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| SAST/DAST Coverage | % of applications with security testing | (Tested apps / Total apps) × 100 | 100% |
| Critical Vulnerabilities in Production | Critical app vulns in production | Count | 0 |
| Security Debt | Open security findings in backlog | Count (weighted by severity) | Decreasing trend |
| Secure Code Training Completion | % of developers with secure coding training | (Trained / Total devs) × 100 | 100% |
| Dependency Vulnerabilities | Known vulnerabilities in libraries/dependencies | Count (critical/high) | <10 |
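The severity-weighted security debt metric above needs a weighting scheme; one possible sketch (the weights and finding counts are illustrative, not a standard):

```python
# Illustrative severity weights — tune these to your own risk model.
WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def security_debt(findings):
    """Severity-weighted sum of open security findings."""
    return sum(WEIGHTS[severity] * count for severity, count in findings.items())

# Hypothetical backlog
open_findings = {"critical": 1, "high": 4, "medium": 10, "low": 25}
print(security_debt(open_findings))  # 10 + 20 + 20 + 25 = 75
```

Tracking the weighted total over time (rather than raw counts) keeps one critical finding from being masked by the closure of many low-severity ones.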
11. Compliance & Governance
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| Control Effectiveness | % of controls meeting effectiveness criteria | (Effective controls / Total controls) × 100 | >90% |
| Audit Findings | Number of audit findings by severity | Count | Decreasing trend |
| Policy Compliance Rate | % of policies reviewed within schedule | (Current / Total policies) × 100 | 100% |
| Risk Register Currency | % of risks reviewed within 90 days | (Current / Total risks) × 100 | 100% |
| Security Exceptions | Number of active policy exceptions | Count | <10 |
12. Business Continuity & Resilience
| Metric | Description | Calculation | Target |
| --- | --- | --- | --- |
| RTO Achievement | % of DR tests meeting RTO targets | (Met RTO / Total tests) × 100 | >95% |
| RPO Achievement | % of recoveries meeting RPO targets | (Met RPO / Total recoveries) × 100 | >95% |
| BC/DR Test Coverage | % of critical systems tested annually | (Tested / Critical systems) × 100 | 100% |
| Plan Currency | % of BC/DR plans reviewed within 12 months | (Current / Total plans) × 100 | 100% |
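The RTO achievement percentage above compares each test's actual recovery time against its target; a minimal sketch (the test results are hypothetical):

```python
def rto_achievement(tests):
    """% of DR tests in which actual recovery time met the RTO target.

    tests: list of (rto_target_minutes, actual_minutes) tuples.
    """
    met = sum(1 for target, actual in tests if actual <= target)
    return met * 100 / len(tests)

# Hypothetical DR test log: one test missed its 240-minute RTO
results = [(240, 180), (240, 250), (60, 45), (60, 50)]
print(f"{rto_achievement(results):.0f}%")  # 75%
```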
Board-Level Security Reporting
Key Principles for Executive Reporting
- Focus on business impact, not technical details
- Use visual dashboards, not tables of numbers
- Show trends over time, not just snapshots
- Benchmark against industry peers where possible
- Link to business objectives and risk appetite
- Highlight key risks and mitigation actions
Recommended Board Metrics (Dashboard)
Risk Overview
- Cyber Risk Rating: Overall risk score (e.g., High/Medium/Low or 1-10 scale)
- Critical Risks: Top 3-5 risks with mitigation status
Incident Summary
- Security Incidents: Count by severity (Critical/High/Medium/Low)
- MTTD/MTTR: Trend over last 12 months
- Material Incidents: Any incidents requiring board awareness
Control Effectiveness
- Security Control Maturity: % of controls at target maturity level
- Audit Findings: Open findings by severity, trend
- Compliance Status: Compliance with key regulations (GDPR, PCI, etc.)
Security Posture
- Vulnerability Management: % of critical vulns outstanding >30 days
- Patch Compliance: % of systems with current patches
- MFA Adoption: % of users with MFA enabled
Third-Party Risk
- Vendor Risk: % of critical vendors with current assessments
- Vendor Incidents: Third-party security incidents
Resilience
- BC/DR Test Results: Pass/fail status, RTO/RPO achievement
- System Availability: Uptime % for critical systems
Investment & Resources
- Security Budget Utilization: % of budget spent vs allocated
- Security Staffing: Vacancies in security team
- Training Completion: % of staff with annual security training
Sample Board Dashboard
Cyber Risk Rating: MEDIUM (Improving from HIGH in Q3)
Security Incidents (Q4 2025)
- Critical: 0
- High: 2 (both contained within SLA)
- Medium: 15
- Low: 47
MTTD: 12 minutes (Target: <15 min) ✓
MTTR: 45 minutes (Target: <60 min) ✓
Top Risks:
1. Third-party data breach risk (Vendor X - assessment overdue)
→ Mitigation: Assessment scheduled for Jan 2026
2. Phishing susceptibility (click rate 8%, target <5%)
→ Mitigation: Enhanced training programme launched
3. Legacy system end-of-life (System Y unsupported from Mar 2026)
→ Mitigation: Replacement project approved, on track
Compliance Status:
- GDPR: Compliant ✓
- PCI DSS: Compliant (Annual AOC submitted) ✓
- Cyber Essentials Plus: Certified (Valid until Aug 2026) ✓
Security Posture:
- Patch Compliance: 97% (Target: >95%) ✓
- MFA Adoption: 100% (Target: 100%) ✓
- Critical Vulnerabilities >30 days: 2 (Target: 0) ✗
→ Both in remediation, ETA: Feb 2026
Investments:
- Security Budget: 85% utilized YTD (on track)
- Security Team: 1 vacancy (SOC Analyst, interviews in progress)
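The pass/fail marks in a dashboard like the one above reduce to comparing each metric against its target with the right comparator. A minimal sketch, using hypothetical figures mirroring the sample:

```python
def status(value, target, op):
    """Evaluate a metric against its target; op selects the comparator."""
    ops = {"<=": value <= target, ">=": value >= target, "==": value == target}
    return "PASS" if ops[op] else "FAIL"

# Hypothetical Q4 figures and targets
checks = [
    ("MTTD (min)", 12, 15, "<="),
    ("MTTR (min)", 45, 60, "<="),
    ("Patch compliance (%)", 97, 95, ">="),
    ("MFA adoption (%)", 100, 100, ">="),
    ("Critical vulns >30 days", 2, 0, "<="),
]
for name, value, target, op in checks:
    print(f"{name}: {value} -> {status(value, target, op)}")
```

Encoding the comparator alongside each target avoids the common mistake of treating every metric as "higher is better".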
Metric Maturity Levels
Level 1: Ad-Hoc
- No formal metrics programme
- Reactive reporting (incidents only)
- Manual data collection
Level 2: Defined
- Basic metrics defined and collected
- Quarterly reporting
- Spreadsheet-based tracking
Level 3: Managed
- Comprehensive metrics across domains
- Automated data collection (SIEM, GRC tools)
- Monthly reporting with trend analysis
- KPIs linked to security objectives
Level 4: Optimised
- Real-time dashboards and alerting
- Predictive analytics and benchmarking
- Continuous improvement driven by metrics
- Board-level reporting integrated with business metrics
Common Pitfalls
- Too many metrics: Overwhelming stakeholders with data
- Vanity metrics: Measuring activity, not outcomes (e.g., "number of vulnerability scans run")
- No context: Reporting numbers without trends or targets
- Lack of actionability: Metrics that don't drive decisions
- Inconsistent measurement: Changing definitions over time
- No benchmarking: No comparison to industry or past performance
- Manual collection: Too time-consuming to sustain
- Wrong audience: Technical metrics to executives, strategic metrics to engineers
SIEM & Security Analytics
- Splunk: Security dashboards and metrics
- Elastic Security: Metrics and KPI dashboards
- Microsoft Sentinel: Built-in workbooks and metrics
GRC Platforms
- ServiceNow GRC: Risk, compliance, audit metrics
- Archer (RSA): Risk and compliance dashboards
- LogicManager: Risk metrics and reporting
Dashboards & Visualisation
- Power BI: Custom security dashboards
- Tableau: Visual analytics
- Grafana: Real-time monitoring dashboards
CSPM & Cloud Security
- Prisma Cloud (Palo Alto): Cloud security posture metrics
- Wiz: Cloud security risk scores
- Orca Security: Cloud security metrics
Quick Selection Guide
| Organisation Profile | Recommended Metrics Focus |
| --- | --- |
| Small business (<50) | 10-15 key metrics: Patch compliance, MFA adoption, incident volume, training completion |
| Medium (50-500) | 20-30 metrics across vulnerability, IAM, incidents, training, vendor risk |
| Large enterprise (500+) | Comprehensive metrics across all domains, automated dashboards, board reporting |
| Financial services | Compliance-heavy: Control effectiveness, audit findings, vendor risk, resilience |
| Healthcare | Privacy-focused: Data breaches, DSR response times, access reviews, training |
| Technology/SaaS | DevSecOps metrics: SAST/DAST coverage, vulnerability density, cloud posture |