This assessment evaluates how effectively an organization segments its market and defines its ideal customer profile (ICP) for sales prioritization. Companies with well-defined ICPs achieve 68% higher account engagement and 33% higher conversion rates, yet most organizations operate with incomplete or outdated segmentation. [src2]
What this measures (ICP Definition Rigor): The depth and quality of the ideal customer profile definition.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No formal ICP; "we sell to everyone" or vague descriptions | Ask 5 people — get 5 different answers; no written definition |
| 2 | Emerging | Basic ICP with 2-3 firmographic criteria but not validated | Written ICP exists but criteria are assumptions, not data-derived |
| 3 | Defined | ICP with 6-10 attributes validated against best-customer analysis | ICP based on top-decile customer LTV analysis; team can articulate criteria |
| 4 | Managed | Multi-segment ICP with distinct value propositions per segment | Segment-specific playbooks; win rates tracked by segment |
| 5 | Optimized | AI-driven ICP discovery with dynamic adjustment | ML models identify micro-segments; ICP updates automatically |
Red flags: ICP created 3+ years ago and never updated; ICP is a marketing persona sales doesn't use; no data validation. [src1]
Quick diagnostic question: "Show me your ICP document — when was it last updated, and what data validated it?"
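The Level 3 evidence above refers to basing the ICP on a top-decile customer LTV analysis. A minimal sketch of that analysis, assuming a simple list-of-dicts customer export (the field names, values, and the small dataset are illustrative, not a standard CRM schema):

```python
from collections import Counter

# Illustrative customer export; fields and values are assumptions for the sketch.
customers = [
    {"industry": "SaaS", "employees": 450, "ltv": 180_000},
    {"industry": "SaaS", "employees": 300, "ltv": 150_000},
    {"industry": "SaaS", "employees": 520, "ltv": 95_000},
    {"industry": "Logistics", "employees": 200, "ltv": 30_000},
    {"industry": "Retail", "employees": 80, "ltv": 12_000},
    {"industry": "Retail", "employees": 60, "ltv": 9_000},
]

def top_decile(customers):
    """Return the top 10% of customers by lifetime value (at least one)."""
    ranked = sorted(customers, key=lambda c: c["ltv"], reverse=True)
    return ranked[: max(1, len(ranked) // 10)]

def dominant_traits(best):
    """Surface the firmographic traits the best customers share."""
    industries = Counter(c["industry"] for c in best)
    sizes = sorted(c["employees"] for c in best)
    return {
        "top_industry": industries.most_common(1)[0][0],
        "employee_range": (sizes[0], sizes[-1]),
    }
```

With a real book of business the decile contains many customers, and the shared traits that emerge become candidate ICP criteria to validate, rather than assumptions.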
What this measures (Data Layer Sophistication): Types of data used to define and operationalize segmentation.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Only basic firmographics; no enrichment | CRM has industry and employee count; many fields empty |
| 2 | Emerging | Firmographic enrichment from third-party; some technographic data | Enrichment tool deployed but completeness < 60% |
| 3 | Defined | Firmographic + technographic + behavioral data systematically maintained; > 80% completeness | Enrichment runs automatically; website engagement tracked |
| 4 | Managed | Intent data integrated; product usage data informs expansion segmentation | Intent signals surface accounts with buying behavior |
| 5 | Optimized | Multi-signal fusion with predictive models scoring propensity | AI synthesizes all data layers into unified account score |
Red flags: Fewer than 5 attributes per account; no third-party enrichment; intent data purchased but not integrated. [src4]
Quick diagnostic question: "Beyond company name and industry, what data do you have on target accounts — and how complete is it?"
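The 60% and 80% completeness thresholds above can be measured directly. A minimal sketch, assuming accounts export as a list of dicts with the tracked enrichment fields (field names are illustrative):

```python
# Illustrative enrichment fields; substitute the fields your ICP actually tracks.
TRACKED_FIELDS = ["industry", "employees", "tech_stack", "revenue_band", "region"]

accounts = [
    {"industry": "SaaS", "employees": 120, "tech_stack": None,
     "revenue_band": "10-50M", "region": "EMEA"},
    {"industry": None, "employees": None, "tech_stack": None,
     "revenue_band": None, "region": "NA"},
]

def completeness(accounts, fields=TRACKED_FIELDS):
    """Share of (account, field) cells that are populated."""
    filled = sum(1 for a in accounts for f in fields if a.get(f) not in (None, ""))
    return filled / (len(accounts) * len(fields))
```

Running this against the full account base and comparing the result to the 0.60 and 0.80 thresholds gives a defensible answer to the diagnostic question above.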
What this measures (Segmentation Operationalization): Whether segmentation drives actual sales behavior and resource allocation.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Segmentation doesn't influence sales prioritization | Reps choose accounts based on preference, not segment priority |
| 2 | Emerging | Used for territory planning but not daily prioritization | Territories consider segment but daily prospecting is ad hoc |
| 3 | Defined | Integrated into CRM with account tiers; lead scoring reflects ICP fit | Accounts tagged Tier 1/2/3; segment-specific messaging templates |
| 4 | Managed | Drives resource allocation (specialists, pricing); segment win rates tracked | Enterprise gets dedicated SE; segment performance informs coaching |
| 5 | Optimized | Real-time signals drive dynamic prioritization and next-best-action | AI recommends which accounts to engage based on real-time signals |
Red flags: Sales can't explain Tier 1 vs Tier 3 distinction; no segment tags in CRM; same outreach for all accounts. [src3]
Quick diagnostic question: "Would a rep's daily activity show evidence of segment-based prioritization — or does every account get the same treatment?"
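The Level 3 evidence above pairs lead scoring that reflects ICP fit with Tier 1/2/3 account tags. A minimal rule-based sketch of that pairing; the criteria, weights, and tier cutoffs are illustrative assumptions, not a standard model:

```python
# Illustrative criteria and weights; tune against your own win-rate data.
ICP_WEIGHTS = {
    "target_industry": 3,    # firmographic fit
    "size_in_range": 2,      # firmographic fit
    "uses_target_stack": 2,  # technographic fit
    "showed_intent": 3,      # behavioral signal
}

def icp_fit(account):
    """Sum the weights of the ICP criteria this account satisfies."""
    return sum(w for crit, w in ICP_WEIGHTS.items() if account.get(crit))

def tier(account):
    """Map fit score to the Tier 1/2/3 tags used for daily prioritization."""
    score = icp_fit(account)
    if score >= 8:
        return "Tier 1"
    if score >= 4:
        return "Tier 2"
    return "Tier 3"
```

Writing the resulting tier back to a CRM field is what turns the segmentation into the daily-prioritization behavior the diagnostic question probes for.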
What this measures (Validation and Feedback Loops): Whether segmentation is validated against outcomes and updated based on performance.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No validation against outcomes; ICP based on untested assumptions | Nobody has compared win rates across segments |
| 2 | Emerging | Occasional analysis but not systematic; ICP drift not monitored | Annual analysis that doesn't change ICP or allocation |
| 3 | Defined | Quarterly validation of win rate, ACV, LTV by segment; ICP updated on drift | Quarterly report shows segment performance; revision process documented |
| 4 | Managed | Continuous monitoring with drift alerts; A/B testing of segment hypotheses | Dashboard monitors segment health; drift investigation triggered automatically |
| 5 | Optimized | ML models continuously validate and refine; segments emerge/retire automatically | Model accuracy tracked; segments evolve dynamically |
Red flags: ICP unchanged 18+ months despite market shifts; no win/loss analysis by segment; team believes "we know our customers" without data. [src5]
Quick diagnostic question: "When did your ICP last change based on win/loss data — and what changed?"
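The Level 3 and 4 rows above describe quarterly win-rate validation by segment and automatic drift alerts. A minimal sketch, assuming opportunity records carry a segment tag and a won/lost outcome (the 10-point drift tolerance is an illustrative assumption):

```python
from collections import defaultdict

def win_rates(opportunities):
    """Win rate per segment from closed opportunities."""
    won, total = defaultdict(int), defaultdict(int)
    for opp in opportunities:
        total[opp["segment"]] += 1
        won[opp["segment"]] += opp["won"]
    return {seg: won[seg] / total[seg] for seg in total}

def drift_alerts(current, baseline, tolerance=0.10):
    """Flag segments whose win rate moved more than `tolerance` vs. baseline."""
    return [seg for seg, rate in current.items()
            if seg in baseline and abs(rate - baseline[seg]) > tolerance]
```

Running this each quarter against the prior quarter's rates is the smallest version of the drift monitoring the Level 4 evidence describes; a flagged segment triggers the ICP revision process rather than an automatic change.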
What this measures (Cross-Functional Alignment): Whether all GTM functions share the same segmentation.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Each function has its own customer view; marketing targets different segments than sales | Marketing campaigns target one segment; sales prospects another |
| 2 | Emerging | Sales and marketing share some segmentation; product and CS disconnected | Sales and marketing agree on ICP but personas don't map to segments |
| 3 | Defined | Unified segmentation across sales, marketing, CS; segment strategies per function | All functions reference same definitions; content mapped to segments |
| 4 | Managed | Cross-functional segment reviews; resource allocation coordinated by segment | Quarterly reviews include all functions; budget follows segment strategy |
| 5 | Optimized | Segments orchestrated across entire lifecycle; segment P&L tracked | Product roadmap prioritized by segment ROI; entire GTM coordinated per segment |
Red flags: Marketing generates leads that don't match sales ICP; product launches target segments sales isn't pursuing. [src6]
Quick diagnostic question: "Would marketing, sales, and product give the same top 3 customer segments in priority order?"
Overall Score = (ICP Definition Rigor + Data Layer Sophistication + Segmentation Operationalization + Validation and Feedback Loops + Cross-Functional Alignment) / 5
| Overall Score | Maturity Level | Interpretation | Recommended Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | No effective segmentation; sales wasting effort on poor-fit accounts | Build foundational ICP from best-customer analysis; implement enrichment |
| 2.0 - 2.9 | Developing | Basic segmentation but not validated or operationalized | Validate ICP against win/loss data; integrate into CRM and lead scoring |
| 3.0 - 3.9 | Competent | Solid ICP with CRM integration; typical for well-run B2B companies | Add behavioral and intent data; build segment-specific sales motions |
| 4.0 - 4.5 | Advanced | Data-rich segmentation driving resource allocation; competitive advantage | Deploy AI-driven segment discovery; build cross-functional orchestration |
| 4.6 - 5.0 | Best-in-class | Dynamic, AI-driven segmentation continuously refined | Maintain through model improvement; explore micro-segmentation |
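The overall score is a plain average of the five dimension scores, and the band lookup mirrors the interpretation table above. A minimal sketch:

```python
# Bands copied from the interpretation table; averages of integer 1-5 scores
# always land on a multiple of 0.2, so the 1.9/2.0-style gaps are never hit.
BANDS = [
    (1.0, 1.9, "Critical"),
    (2.0, 2.9, "Developing"),
    (3.0, 3.9, "Competent"),
    (4.0, 4.5, "Advanced"),
    (4.6, 5.0, "Best-in-class"),
]

def overall_score(icp, data, ops, validation, alignment):
    """Unweighted mean of the five dimension scores."""
    return (icp + data + ops + validation + alignment) / 5

def maturity_level(score):
    """Map an overall score to its maturity band."""
    for low, high, label in BANDS:
        if low <= score <= high:
            return label
    return "Out of range"
```

The unweighted mean reflects the formula as given; an assessor who believes one dimension dominates (e.g. Operationalization) could swap in a weighted mean, but that is a departure from this rubric.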
| Weak Dimension (Score < 3) | Fetch This Card |
|---|---|
| ICP Definition Rigor | ICP Building Playbook |
| Data Layer Sophistication | Sales Data Enrichment Strategy |
| Segmentation Operationalization | Segmentation-Driven Sales Playbook |
| Validation and Feedback Loops | ICP Validation Framework |
| Cross-Functional Alignment | GTM Segmentation Alignment Playbook |
| Company Stage | Expected Average Score | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Series A-B ($2-20M ARR) | 2.0 | 2.8 | 1.3 |
| Series C-D ($20-100M ARR) | 2.8 | 3.5 | 2.0 |
| Growth/Scale ($100M+ ARR) | 3.5 | 4.0 | 2.8 |
| Enterprise/Public | 3.8 | 4.3 | 3.2 |
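To interpret an overall score against the stage benchmarks above, a minimal sketch (thresholds copied from the table; treating a score at or below the alarm threshold as an alarm is an assumption, since the table does not specify boundary handling):

```python
# Thresholds copied from the benchmark table above.
BENCHMARKS = {
    "Series A-B": {"good": 2.8, "alarm": 1.3},
    "Series C-D": {"good": 3.5, "alarm": 2.0},
    "Growth/Scale": {"good": 4.0, "alarm": 2.8},
    "Enterprise/Public": {"good": 4.3, "alarm": 3.2},
}

def benchmark_verdict(stage, score):
    """Classify an overall score as good / typical / alarm for a company stage."""
    b = BENCHMARKS[stage]
    if score <= b["alarm"]:
        return "alarm"
    if score >= b["good"]:
        return "good"
    return "typical"
```

For example, a 2.5 is typical for a Series C-D company but already past the alarm threshold for a Growth/Scale company, which is why the same score warrants different urgency at different stages.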
Fetch when a user asks to evaluate segmentation effectiveness, diagnose declining win rates despite steady pipeline, assess ICP validity after market changes, or optimize resource allocation across segments. Also relevant when CAC is rising without LTV improvement.