Stop Measuring Activity. Measure Actual Security.
Written By: AKATI Sekurity Insights Team | Cybersecurity Consulting & MSSP Experts
Reading Time: 4 minutes
The Scene: A quarterly board meeting. The CISO presents a dashboard filled with green indicators. "Patching compliance: 98%. Antivirus detection rate: 99.7%. Security awareness training completion: 100%. Firewall uptime: 99.99%." The board smiles. Security looks great. Three weeks later, ransomware locks every system. Customer data appears for sale on the dark web. How did this happen when all the metrics were green? Welcome to the dangerous world of vanity metrics—numbers that look impressive but measure nothing that actually matters. This is the story of how organizations create elaborate systems to measure security theater while real threats walk past undetected.
The Dashboard That Lied
Let us tell you about a company we'll call "TechCorp." Mid-sized. Growing fast. Taking security seriously—or so they thought. Their quarterly security reports were works of art. Beautiful charts. Impressive percentages. Everything trending positively. The board loved these presentations. Security spending approved without question because the numbers proved it was working.
Then one Monday morning, their payment processing system stopped responding. Then their customer database. Then email. Then everything. Ransomware. The attackers had been inside the network for 47 days. They'd stolen 3TB of customer data. Encrypted every critical system. Left a ransom note demanding $4.5 million in Bitcoin. The company scrambled. Called in incident responders. Tried to understand how this happened despite all their security investments and those reassuring green metrics.
Here's what the forensics team found: The attackers entered through a VPN account belonging to a contractor who'd left three months earlier. Nobody disabled the account. The metrics didn't track this because "account deactivation" wasn't measured—only "password complexity compliance" was, which showed 100%. The attackers moved laterally through the network using legitimate credentials they'd harvested. The metrics showed "zero malware detections" because the attackers didn't use malware—they used admin tools. The backup systems that should have enabled recovery? Configured incorrectly and never tested. The metrics showed "backup completion: 100%" but nobody checked if restores actually worked.
Every metric was green. The security program was failing catastrophically. The numbers told a story of perfect security. The reality was a security program measuring irrelevant activities while ignoring critical gaps. This isn't unusual. This is the norm.
What Organizations Measure vs. What Actually Matters
Walk into any corporate security operations center and look at the dashboards. You'll see metrics like: number of phishing emails blocked, number of malware samples detected, percentage of systems patched within SLA, percentage of employees completing security awareness training, number of security alerts generated, percentage of alerts investigated, firewall rule changes processed, vulnerability scan completion rates, antivirus signature update compliance, and password expiration policy compliance.
These aren't useless metrics. They measure activity. But activity isn't outcomes. Let us show you the problem. A company blocks 10,000 phishing emails monthly. That sounds impressive. The metric is presented to the board: "Phishing protection: 99.2% effective." What this doesn't tell you is whether the 80 phishing emails that got through were clicked. Whether any led to credential compromise. Whether those compromises led to lateral movement. Whether attackers are currently in your environment because of phishing that succeeded. The metric measures filtering effectiveness, not actual risk.
Another example: "98% of critical vulnerabilities patched within 30 days." Sounds great. What this doesn't tell you is whether the 2% that weren't patched include the vulnerability attackers are actively exploiting. Whether those unpatched systems are internet-facing. Whether compensating controls exist. Whether the patching metric counts systems that nobody knows exist—shadow IT that never gets scanned.
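The distinction above can be made concrete. Here is a minimal sketch, assuming a hypothetical flat list of vulnerability records (the `Vuln` type and field names are illustrative, not from any specific scanner's API): the raw count is the vanity metric, while the filtered list, critical or high severity, internet-facing, and aging past a threshold, is the number worth reporting.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Vuln:
    host: str
    severity: str         # "critical", "high", "medium", "low"
    internet_facing: bool
    discovered: date

def security_debt(vulns, today, min_age_days=30):
    """Return unresolved critical/high issues on internet-facing systems
    older than min_age_days -- the debt that matters, not the raw total."""
    return [
        v for v in vulns
        if v.severity in ("critical", "high")
        and v.internet_facing
        and (today - v.discovered).days >= min_age_days
    ]

# Illustrative data: three findings, only one of which is real debt.
vulns = [
    Vuln("web-01", "critical", True,  date(2024, 1, 5)),
    Vuln("db-01",  "high",     False, date(2024, 1, 5)),
    Vuln("web-02", "low",      True,  date(2024, 3, 1)),
]
debt = security_debt(vulns, today=date(2024, 3, 15))
print(len(vulns), "total findings,", len(debt), "that matter")
# -> 3 total findings, 1 that matter
```

A dashboard built on `len(vulns)` trends in whatever direction the scanner's signature count moves; one built on the filtered list, with the age of each item, shows whether the issues that could actually hurt you are being fixed.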
Here's the pattern: traditional security metrics measure inputs and activities. How many things we did. How busy the security team was. What they don't measure is outcomes and risk reduction. Are we actually safer? Are attackers less likely to succeed? Would we detect and stop an attack faster than last quarter? These are harder questions. They can't be answered with simple percentages. So organizations default to measuring what's easy rather than what's important.
The Metrics Executives Should Actually Demand
If you're an executive or board member responsible for oversight of cybersecurity, here are the questions that actually matter—and the metrics that answer them:
| Critical Question | Metric to Measure | Benchmark / Standard | Action Required |
|---|---|---|---|
| 01. Can you detect a breach in progress? | Mean Time to Detect (MTTD): how long from initial compromise to detection | Industry average: ~200 days. Best-in-class: hours or days | If your CISO can't tell you your organization's MTTD, no detection capability is being measured. Demand breach and attack simulations to test and measure detection. |
| 02. Can you respond effectively once detected? | Mean Time to Respond (MTTR) and Mean Time to Contain (MTTC): from detection to stopping the attack | Organizations measuring and improving these metrics demonstrate actual incident response capability | Those that don't measure response times are guessing about their capability. Establish baseline measurements and track improvement. |
| 03. Are your critical systems actually protected? | Crown jewel protection coverage: protection status of your top 10 most critical systems | Requirements: enhanced monitoring (tested), MFA with logged access, backups restore-tested within 90 days, IR procedures practiced | List your 10 most critical systems. If you can't confirm every requirement for each, your security program is failing where it matters most. |
| 04. How much security debt are you carrying? | Known vulnerabilities and misconfigurations, by severity and age, in systems that matter | Focus areas: critical/high severity, internet-facing systems, systems holding sensitive data; track the age of each issue | Not total vulnerabilities; that number is useless. Focus on critical issues in important systems. Tracking age reveals whether security is improving or degrading. |
| 05. Would your backups actually work? | Successful restore-test completion rate: not backup completion, actual restore success | Critical distinction: backup completion measures whether backups ran; restore tests measure whether you could actually recover | Many organizations discover during ransomware incidents that backups are corrupted or locked. Test restores regularly. Measure and track success rates. |
| 06. Are you improving? | Security maturity trending: capability progression over time | Assess against the NIST Cybersecurity Framework or CIS Controls, quarter over quarter | Use consistent frameworks to assess security program maturity. This shows whether investments translate into capability improvement versus just spending money on unused tools. |
Why Organizations Resist Measuring What Matters
You might wonder why, if better metrics exist, organizations don't use them. The answer is uncomfortable: better metrics reveal problems that demand action. It's easier to measure things that look good than things that expose gaps requiring expensive fixes or uncomfortable conversations. Measuring "mean time to detect" requires admitting breaches happen and testing detection, which might reveal you'd never detect sophisticated attackers. Better to measure "antivirus detection rate" and maintain the comforting illusion of perfect prevention.
Measuring "critical asset protection" requires identifying critical assets, which creates accountability for protecting them specifically. If the organization's crown jewels are identified and one gets breached, someone is accountable. Better to spread protection thin across everything and claim "we protect all assets equally." Measuring "security debt" requires acknowledging all the things you should fix but haven't. That creates pressure to allocate budget and resources to unglamorous remediation work instead of exciting new security tools. Better to measure "new security controls deployed" and ignore the mountain of unfixed problems.
The shift from activity metrics to outcome metrics requires organizational maturity and courage. It requires executives comfortable with risk-based conversations. It requires security leaders willing to show vulnerability and gaps rather than projecting invulnerability. It requires board members asking hard questions about effectiveness rather than accepting reassuring dashboards. Most organizations aren't there yet.
AKATI Sekurity: Measuring Security Effectiveness, Not Security Theater
Developing meaningful security metrics requires expertise in both security operations and business risk management—understanding what actually predicts security outcomes versus what just looks good on dashboards. AKATI Sekurity's Cybersecurity Consulting services include security metrics and KPI development—working with executive teams and boards to define meaningful measures of security effectiveness, implementing measurement frameworks aligned with business risk priorities, and establishing reporting that enables informed decision-making rather than false reassurance.
Our Security Posture Assessments evaluate not just your security controls but your ability to measure their effectiveness. We help organizations move from activity-based metrics to outcome-based metrics that actually predict security performance. For organizations that want validation of their security measurement approach, our Red Team Services test whether your metrics would actually detect sophisticated attacks—revealing whether your dashboard shows real security or security theater.
For ASEAN organizations facing regulatory expectations around security measurement and reporting (Bank Negara Malaysia RMiT, Monetary Authority of Singapore Technology Risk Management Guidelines), we help develop metrics frameworks satisfying regulatory requirements while providing genuine insight into security effectiveness. For US organizations navigating board-level security reporting expectations and SEC cybersecurity disclosure rules, we help establish metrics and reporting approaches that meet regulatory expectations while driving security improvement.
Stop measuring activity. Start measuring outcomes. Contact AKATI Sekurity at hello@akati.com for more information.
Key Terms Explained:
Mean Time to Detect (MTTD): Average time between initial system compromise and detection of the security breach
Mean Time to Respond (MTTR): Average time from detecting a security incident to completing response actions
Mean Time to Contain (MTTC): Average time from detecting an incident to successfully stopping the attack from spreading
Security Debt: Accumulated known vulnerabilities, misconfigurations, and security gaps accepted as risk due to resource constraints
Crown Jewels: Organization's most critical and sensitive systems, data, and assets requiring highest level of protection
Vanity Metrics: Measurements that look impressive but don't correlate with actual security outcomes or risk reduction
Red Team Exercise: Simulated attack by security professionals testing an organization's detection and response capabilities
Purple Team Operation: Collaborative exercise where offensive security testers (red team) and defenders (blue team) work together to improve detection
Security Maturity: Overall capability level of an organization's security program measured against established frameworks
Compensating Controls: Alternative security measures implemented when primary controls cannot be applied
Lateral Movement: Technique where attackers move from one compromised system to others within a network
Privilege Escalation: Attack method gaining higher-level permissions than initially obtained during compromise
Shadow IT: Technology systems, software, or services used within organizations without explicit IT department approval or knowledge
References:
IBM Cost of a Data Breach Report 2024
Verizon Data Breach Investigations Report 2024
Ponemon Institute: Cyber Resilient Organization Report
NIST Cybersecurity Framework: Measurement and Metrics
About the Author: This article was written by AKATI Sekurity's security governance and risk management specialists who help organizations develop meaningful security metrics, conduct security program maturity assessments, and establish board-level security reporting frameworks across financial services, healthcare, technology, and manufacturing sectors in ASEAN and the Americas.
Related Services: Cybersecurity Consulting | Security Posture Assessment | Red Team Services | 24/7 Managed Security (MSSP)