This assessment evaluates an organization's overall security and compliance posture across six critical dimensions: SOC 2 readiness, vulnerability management, penetration testing, data privacy compliance, incident response, and access control. It provides a structured diagnostic that identifies the weakest links in the security chain, quantifies maturity gaps against industry benchmarks, and routes teams to specific remediation actions. Use this when preparing for compliance audits, evaluating acquisition targets, onboarding a new CISO, or responding to board-level security posture questions. [src1]
What this measures: Organizational preparedness for a SOC 2 Type II audit, covering trust services criteria, policy documentation, control implementation, and evidence collection.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No formal security policies; no understanding of SOC 2 requirements; no designated compliance owner | Zero documented policies; no control framework mapped; no prior audit experience; no governance, risk, and compliance (GRC) tool |
| 2 | Emerging | Some policies drafted but incomplete; basic awareness of SOC 2 trust services criteria; no evidence collection process | 30-50% of required policies documented; no continuous monitoring; evidence gathered manually and ad hoc |
| 3 | Defined | All five trust services criteria mapped to controls; policies documented and reviewed annually; evidence collection process established | Policy library covers 80%+ of controls; GRC platform in use; gap assessment completed; remediation plan exists |
| 4 | Managed | SOC 2 Type II achieved and maintained; continuous monitoring automated; exceptions tracked and remediated within SLAs | Type II report current; automated evidence collection; control exceptions under 5%; annual re-certification on schedule |
| 5 | Optimized | Multi-framework compliance (SOC 2 + ISO 27001 + SOC 3); automated control testing; compliance-as-code integrated into CI/CD | Cross-mapped controls across frameworks; real-time compliance dashboards; automated policy enforcement; zero audit exceptions |
Red flags: No designated compliance owner; customer security questionnaires take weeks to complete; unable to produce a current SOC 2 report when requested. [src3]
Quick diagnostic question: "If a prospect asked for your SOC 2 report tomorrow, could you provide a current one within 24 hours?"
What this measures: Maturity of the program for identifying, prioritizing, remediating, and tracking vulnerabilities across infrastructure, applications, and cloud environments.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No regular vulnerability scanning; patching is reactive to incidents only; no asset inventory | No scanning tools deployed; patch cycles exceed 90 days; no CVSS-based prioritization; unknown asset count |
| 2 | Emerging | Periodic vulnerability scans (monthly or quarterly); basic patching process exists; partial asset inventory | Scanner deployed but coverage under 60%; patch cycle 30-60 days for critical; no SLA tracking |
| 3 | Defined | Weekly automated scans across infrastructure and applications; risk-based prioritization; SLA-driven remediation | 90%+ asset coverage; critical vulns patched within 14 days; SBOM maintained; vulnerability trends tracked monthly |
| 4 | Managed | Continuous scanning with contextual prioritization; integrated into CI/CD | Mean time to remediate critical vulns under 7 days; EPSS-based prioritization; automated scanning in build pipelines; exception management |
| 5 | Optimized | Attack surface management integrated; predictive vulnerability intelligence; near-zero exploitable exposure window | Continuous attack surface discovery; mean time to remediate critical under 48 hours; automated remediation for known patterns |
Red flags: No asset inventory; scanning less often than monthly; critical vulnerabilities left unpatched beyond 30 days; average remediation time for critical application vulnerabilities near 74 days. [src2]
Quick diagnostic question: "How many critical or high-severity vulnerabilities are currently open, and what is your average time to remediate them?"
What this measures: Maturity and coverage of offensive security testing, including scope, frequency, methodology, and remediation follow-through.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No penetration testing performed; security relies solely on defensive tools | No pen test reports; no budget allocated; unknown external attack surface |
| 2 | Emerging | Annual third-party pen test of limited scope (external network only); findings documented but remediation inconsistent | One pen test report per year; scope excludes cloud, APIs, and internal; less than 50% of findings remediated before retest |
| 3 | Defined | Annual pen tests covering external, internal, and web applications; methodology aligned with OWASP/PTES; remediation tracked to closure | Comprehensive annual report; external + internal + web app scope; 80%+ critical/high findings remediated within 60 days |
| 4 | Managed | Semi-annual or continuous pen testing; red team exercises; cloud and API testing included | Multiple test cycles per year; red team/purple team exercises; cloud-native testing; pen test findings integrated into sprint backlog |
| 5 | Optimized | Continuous automated + manual testing aligned with CI/CD; bug bounty program active; adversary simulation with threat-informed scenarios | Continuous pen testing in release cycles; active bug bounty; MITRE ATT&CK-aligned adversary simulations; breach and attack simulation (BAS) tools |
Red flags: No pen test in the last 12 months; scope excludes cloud infrastructure or APIs; 45%+ of findings unresolved after 12 months; testing performed by junior-only teams without senior oversight. [src4]
Quick diagnostic question: "When was your last penetration test, what was in scope, and what percentage of critical findings were remediated?"
What this measures: Compliance posture across data privacy regulations (GDPR, CCPA/CPRA, and emerging state/national laws), including data mapping, consent management, rights fulfillment, and breach notification readiness.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No awareness of data privacy obligations; no data inventory; no privacy policy beyond boilerplate | No data processing records; no consent mechanism; no appointed DPO/privacy lead; cookie banner absent |
| 2 | Emerging | Basic privacy policy published; cookie consent banner deployed; awareness of GDPR/CCPA but incomplete compliance | Privacy policy exists but is generic; basic cookie consent; no data mapping; no data subject access request (DSAR) process |
| 3 | Defined | Data inventory and processing records maintained; consent management platform deployed; DSAR fulfillment process operational | Records of processing activities documented; CMP with granular consent; DSARs fulfilled within regulatory windows; privacy impact assessments for new features |
| 4 | Managed | Multi-jurisdiction compliance (GDPR + CCPA + state laws); automated data discovery and classification; privacy-by-design embedded in SDLC | Automated PII discovery across data stores; privacy reviews integrated into product development; cross-border transfer mechanisms documented |
| 5 | Optimized | Privacy engineering as a discipline; real-time consent signal propagation; automated regulatory change tracking | Global Privacy Control honored; real-time consent enforcement across all systems; zero regulatory enforcement actions; privacy KPIs tracked |
Red flags: No records of processing activities; DSAR response time exceeds regulatory deadlines; no consent mechanism beyond a checkbox; storing data without documented legal basis. [src6]
Quick diagnostic question: "If a user submitted a data deletion request today, how long would it take to fulfill completely across all systems?"
What this measures: Organizational readiness to detect, contain, respond to, and recover from security incidents, aligned with NIST CSF 2.0 Detect/Respond/Recover functions.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No incident response plan; no defined roles or escalation paths; incidents handled reactively | No IRP document; no SIEM or log aggregation; no on-call rotation for security; no post-incident reviews |
| 2 | Emerging | Basic incident response plan exists; some log aggregation; ad hoc on-call rotation | Written IRP but untested; basic logging; no tabletop exercises; inconsistent incident classification |
| 3 | Defined | Documented IRP with defined roles, severity levels, and escalation procedures; SIEM deployed; regular tabletop exercises | IRP tested via tabletop exercises annually; SIEM with detection rules; on-call rotation defined; post-mortems documented |
| 4 | Managed | Automated detection and alerting; playbook-driven response for common incident types; mean time to detect (MTTD) under 24 hours; mean time to contain (MTTC) under 4 hours | SOAR platform for automated response; detection coverage across MITRE ATT&CK; established SLAs; quarterly tabletop exercises |
| 5 | Optimized | AI-augmented threat detection; automated containment for known attack patterns; integrated threat intelligence; continuous improvement | MTTD under 1 hour; automated containment and quarantine; threat hunting program active; incident data feeds back into architecture decisions |
Red flags: IRP has never been tested through tabletop exercise; no centralized logging or SIEM; no defined communication plan for breach notification. [src1]
Quick diagnostic question: "Walk me through what happens in the first 60 minutes after your SOC detects a potential breach — who is notified and what actions are taken?"
What this measures: Maturity of identity and access management (IAM) practices, including authentication, authorization, least privilege enforcement, and zero trust implementation.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Shared accounts common; no MFA; passwords stored in spreadsheets or shared docs; no access reviews | Shared admin credentials; no SSO; no MFA on production systems; no access provisioning/deprovisioning process |
| 2 | Emerging | Individual accounts for most systems; MFA on some critical systems; basic role definitions exist | MFA on primary identity provider only; manual access provisioning; role definitions exist but not enforced; quarterly access reviews (manual) |
| 3 | Defined | SSO/SAML across business applications; MFA enforced organization-wide; RBAC implemented; automated provisioning/deprovisioning | SSO coverage 80%+ of applications; MFA on all production and admin access; automated onboarding/offboarding; access reviews quarterly |
| 4 | Managed | Zero Trust architecture in progress; context-aware access policies; privileged access management (PAM) deployed; just-in-time access for elevated permissions | PAM for all privileged accounts; conditional access policies; JIT access for production; service account inventory and rotation |
| 5 | Optimized | Full Zero Trust implementation; continuous authentication and authorization; automated least privilege enforcement; identity threat detection and response (ITDR) | Zero standing privileges; continuous risk-based authentication; automated access right-sizing; passwordless authentication adopted |
Red flags: Shared admin accounts for production systems; no MFA on source code repositories or cloud consoles; former employees still have active access; service accounts with static credentials and no rotation. [src5]
Quick diagnostic question: "How quickly are access rights revoked when an employee leaves, and when was the last time you audited who has production access?"
Overall Score = (SOC 2 Readiness + Vulnerability Management + Penetration Testing + Data Privacy + Incident Response + Access Control) / 6
| Overall Score | Maturity Level | Interpretation | Recommended Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | Fundamental security controls absent; organization is exposed to significant regulatory and breach risk; not ready for enterprise customers | Prioritize access control (MFA/SSO) and basic vulnerability scanning immediately; engage a fractional CISO |
| 2.0 - 2.9 | Developing | Basic controls exist but gaps are exploitable; compliance certifications not achievable; enterprise sales blocked | Close highest-risk gaps first; begin SOC 2 readiness program; implement SIEM and IRP |
| 3.0 - 3.9 | Competent | Solid security foundation; SOC 2 achievable; most enterprise requirements met; room for automation | Pursue SOC 2 Type II; automate vulnerability management; advance toward continuous pen testing |
| 4.0 - 4.5 | Advanced | Strong security program; multiple compliance frameworks; automated detection and response; competitive advantage | Advance Zero Trust implementation; add adversary simulation; pursue ISO 27001 |
| 4.6 - 5.0 | Best-in-class | Industry-leading security posture; security as a business enabler; proactive threat management | Maintain and innovate; contribute to industry standards; advanced threat hunting |
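A minimal sketch of the scoring arithmetic and band lookup above; the dimension scores are illustrative.

```python
# Dimension scores (1-5) from the rubric tables; values here are illustrative.
scores = {"soc2_readiness": 3, "vulnerability_mgmt": 2, "pen_testing": 3,
          "data_privacy": 2, "incident_response": 3, "access_control": 4}

overall = sum(scores.values()) / len(scores)

# Bands copied from the interpretation table above.
BANDS = [(1.0, 1.9, "Critical"), (2.0, 2.9, "Developing"), (3.0, 3.9, "Competent"),
         (4.0, 4.5, "Advanced"), (4.6, 5.0, "Best-in-class")]

maturity = next(label for low, high, label in BANDS
                if low <= round(overall, 1) <= high)
weakest = min(scores, key=scores.get)
print(f"Overall {overall:.1f} ({maturity}); weakest dimension: {weakest}")
# -> Overall 2.8 (Developing); weakest dimension: vulnerability_mgmt
```

Surfacing the weakest dimension alongside the average matters because it drives the routing table below.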
| Weak Dimension (Score < 3) | Fetch This Card |
|---|---|
| SOC 2 Readiness | Business Continuity Planning — build foundational governance before audit |
| Vulnerability Management | Cyber Risk Quantification — quantify exposure to justify remediation investment |
| Penetration Testing | Cyber Risk Quantification — risk-based testing scope prioritization |
| Data Privacy Compliance | ESG Reporting — privacy obligations often overlap with ESG data governance |
| Incident Response | Business Continuity Planning — IR and BCP are interdependent |
| Access Control & Identity | Internal Audit — access control failures surface through audit findings |
| Segment | Expected Average Score | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Seed/Series A (<$5M ARR) | 1.8 | 2.5 | 1.2 |
| Series B ($5M-$30M ARR) | 2.8 | 3.3 | 2.0 |
| Growth ($30M-$100M ARR) | 3.5 | 4.0 | 2.8 |
| Scale/Public ($100M+ ARR) | 4.2 | 4.5 | 3.5 |
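A minimal sketch of the benchmark lookup, assuming the thresholds above and a simple three-way classification; treating scores at or below the alarm threshold as urgent is an interpretive choice, not part of the table.

```python
# Thresholds from the benchmark table above, keyed by ARR segment.
BENCHMARKS = {
    "seed_series_a": {"good": 2.5, "alarm": 1.2},
    "series_b":      {"good": 3.3, "alarm": 2.0},
    "growth":        {"good": 4.0, "alarm": 2.8},
    "scale_public":  {"good": 4.5, "alarm": 3.5},
}

def classify(segment: str, overall_score: float) -> str:
    b = BENCHMARKS[segment]
    if overall_score <= b["alarm"]:
        return "alarm: well below segment peers"
    if overall_score >= b["good"]:
        return "good: at or above the peer threshold"
    return "typical: between alarm and good for the segment"

print(classify("series_b", 2.8))  # -> typical: between alarm and good for the segment
```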
Fetch when a user asks to evaluate their security posture, is preparing for SOC 2 or ISO 27001 certification, needs to respond to board-level security questions, is conducting due diligence on an acquisition target, is onboarding a new CISO, or is diagnosing why the organization keeps failing customer security questionnaires.