Organizational Design Assessment
Purpose
This assessment evaluates the effectiveness of an organization's structural design across five critical dimensions: span of control optimization, management layer efficiency, role clarity and job architecture, decision rights allocation, and structural agility. The output identifies where structural dysfunction is slowing execution, creating employee confusion, or generating excessive management overhead. [src4]
Constraints
- Requires access to org chart and headcount data with reporting relationships
- Not meaningful for companies under 25 employees
- Optimal spans vary by function — do not apply uniform targets
- Assessment evaluates structure, not culture
- Re-run after major reorgs, M&A, or 50%+ headcount change
Assessment Dimensions
Dimension 1: Span of Control
What this measures: Whether managers have appropriate direct report counts for their function and level.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No awareness of span metrics; wild variation from 1-2 to 20+ | No correlation between spans and role complexity |
| 2 | Emerging | Spans tracked but not managed; generic targets across functions | Same target for engineering and call center managers |
| 3 | Defined | Function-specific targets; 70%+ within target; exceptions documented | Written span policy by function; exception reviews |
| 4 | Managed | Actively optimized during reorgs; data-driven analysis | Span analysis in reorg cases; quarterly reporting |
| 5 | Optimized | Dynamic management with real-time monitoring; AI workload analysis | Predictive overload alerts; complexity-based spans |
Red flags: Any manager with <3 direct reports; a manager of individual contributors with >15 reports; an org-wide average span below 4. [src1]
Quick diagnostic question: "What is the average number of direct reports per manager, and how much does it vary?"
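As a rough illustration, the span red flags above can be computed directly from reporting data. A minimal sketch, assuming a `(employee_id, manager_id)` record shape for the headcount export; the shape and thresholds are assumptions, not part of the assessment:

```python
from collections import Counter

# Each record is (employee_id, manager_id); manager_id is None for the CEO.
# The thresholds (<3, >15, average <4) mirror the red flags above; the
# record shape is an assumption about your headcount export.
def span_report(reporting_pairs):
    spans = Counter(mgr for _, mgr in reporting_pairs if mgr is not None)
    avg = sum(spans.values()) / len(spans) if spans else 0.0
    flagged = {m: n for m, n in spans.items() if n < 3 or n > 15}
    return {"average_span": round(avg, 1),
            "org_average_below_4": avg < 4,
            "flagged_managers": flagged}

pairs = [(1, None), (2, 1), (3, 1), (4, 2), (5, 2), (6, 2), (7, 2), (8, 3)]
print(span_report(pairs))
# {'average_span': 2.3, 'org_average_below_4': True, 'flagged_managers': {1: 2, 3: 1}}
```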
Dimension 2: Management Layer Efficiency
What this measures: Whether the organization has the right number of layers between CEO and frontline.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No awareness of layer count; layers added organically via promotions | Cannot answer how many layers exist; 10+ in some paths |
| 2 | Emerging | Layers known but not managed; acknowledged as too many | Layer count documented but static; no delayering |
| 3 | Defined | Target layer count by function; recent delayering; M:IC ratio tracked | Written layer policy; recent restructuring alignment |
| 4 | Managed | New layers require justification; regular audits; cost calculated | Layer addition requires VP approval; annual overhead audit |
| 5 | Optimized | Minimal viable layers; project-based overlays; real-time impact analysis | Flat structure with matrix; team leads vs. permanent managers |
Red flags: >8 layers in a company under 5,000 employees; M:IC ratio richer than 1:3 (more than one manager per three ICs); "manager of one" roles. [src2]
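The same reporting pairs support a layer-count and M:IC check. A minimal sketch, assuming the CEO counts as layer 1 and that anyone who appears as someone's manager is a manager:

```python
def layer_and_ratio(reporting_pairs):
    parent = {emp: mgr for emp, mgr in reporting_pairs}
    managers = {m for m in parent.values() if m is not None}

    def depth(emp):
        d = 1  # count the CEO as layer 1
        while parent[emp] is not None:
            emp = parent[emp]
            d += 1
        return d

    ics = [e for e in parent if e not in managers]
    ratio = len(ics) / len(managers)
    return {"layers": max(depth(e) for e in parent),
            "m_ic_ratio": f"1:{ratio:.1f}",
            "ratio_red_flag": ratio < 3}  # richer than 1:3

pairs = [(1, None), (2, 1), (3, 1), (4, 2), (5, 2), (6, 4), (7, 4)]
print(layer_and_ratio(pairs))
# {'layers': 4, 'm_ic_ratio': '1:1.3', 'ratio_red_flag': True}
```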
Dimension 3: Role Clarity & Job Architecture
What this measures: Whether roles are clearly defined with documented responsibilities, career levels, and competency expectations.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No job descriptions; arbitrary titles; unclear responsibilities | Frequent scope conflicts; same title means different things |
| 2 | Emerging | Some JDs exist but outdated; informal career levels; role overlap | Partial coverage; employees cannot articulate next step |
| 3 | Defined | Comprehensive job architecture; 80%+ current descriptions; career ladders | Formal job families and levels; career paths visible |
| 4 | Managed | Actively maintained; role clarity surveyed; scope boundaries enforced | Annual JD review; clarity scores >80%; RACI documented |
| 5 | Optimized | Dynamic role architecture; skills-based design; talent marketplace | Skills taxonomy drives roles; internal mobility enabled |
Red flags: >30% of employees cannot describe their top 3 responsibilities; Director-level roles with no direct reports in a 100+ person company. [src4]
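Two of these signals are measurable from HRIS data: job-description currency and director-level titles with no reports. A hedged sketch, assuming a hypothetical `(employee_id, title, jd_last_updated, manager_id)` record shape and an 18-month window for a "current" JD:

```python
from datetime import date

# Hypothetical record shape: (employee_id, title, jd_last_updated, manager_id).
# The field layout and the 18-month (~548-day) window are assumptions.
def role_clarity_checks(records, today):
    managers = {r[3] for r in records if r[3] is not None}
    with_current_jd = sum(1 for r in records
                          if r[2] is not None and (today - r[2]).days <= 548)
    no_report_directors = [r[0] for r in records
                           if "Director" in r[1] and r[0] not in managers]
    return {"jd_coverage": f"{100 * with_current_jd / len(records):.0f}%",
            "directors_without_reports": no_report_directors}

records = [
    (1, "CEO", date(2024, 6, 1), None),
    (2, "Director, Ops", date(2022, 1, 1), 1),
    (3, "Director, Strategy", None, 1),   # no JD, no reports below
    (4, "Analyst", date(2024, 9, 1), 2),
]
print(role_clarity_checks(records, today=date(2025, 3, 1)))
# {'jd_coverage': '50%', 'directors_without_reports': [3]}
```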
Dimension 4: Decision Rights Allocation
What this measures: Whether it is clear who decides what, at what level, and with what authority.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | All decisions escalated to CEO; employees afraid to decide | Weeks-long decision cycles; learned helplessness |
| 2 | Emerging | Informal delegation; inconsistent authority; shadow decision-makers | Some empowerment but inconsistent; no RACI framework |
| 3 | Defined | RACI/DACI documented; spending authority defined; 70%+ without CEO | Written decision frameworks; authority tiers established |
| 4 | Managed | Decision rights reviewed; speed tracked; empowerment surveyed | Annual rights audit; decision velocity metrics |
| 5 | Optimized | Distributed decision-making; AI-assisted routing; near-zero bottlenecks | Decisions logged and analyzed; context-adaptive routing |
Red flags: CEO approving expenses under $5K in a 100+ person company; decision cycle time at 2x the industry norm. [src4]
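Level 3 expects defined spending authority. One way to encode authority tiers is a simple lookup that routes each approval to the lowest level whose limit covers it, keeping sub-$5K expenses away from the CEO. The tier names and dollar limits below are illustrative assumptions, not recommendations:

```python
# Illustrative authority tiers; the limits are assumptions. A request
# routes to the lowest level whose limit covers it, so the CEO only
# sees approvals above the VP limit.
AUTHORITY_TIERS = [
    ("manager", 5_000),
    ("director", 25_000),
    ("vp", 100_000),
    ("ceo", float("inf")),
]

def approval_level(amount):
    for level, limit in AUTHORITY_TIERS:
        if amount <= limit:
            return level

print(approval_level(3_200))   # manager
print(approval_level(40_000))  # vp
```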
Dimension 5: Structural Agility
What this measures: How quickly the org can restructure and adapt its design to changing needs.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Reorgs traumatic and infrequent; no change management; static design | Last reorg was chaotic; employees fear change |
| 2 | Emerging | Awareness that design needs evolution; reorgs still disruptive | Reorgs happen but poorly managed; political staffing |
| 3 | Defined | Annual org reviews; change management process; cross-functional teams form easily | Annual review cycle; reorg playbook; teams form in 2 weeks |
| 4 | Managed | Continuous evolution; incremental changes; matrix/pod structures | Quarterly micro-adjustments; teams form in days |
| 5 | Optimized | Dynamic and fluid; real-time adaptation; internal talent marketplace | Continuous team evolution; near-zero restructuring disruption |
Red flags: No design changes in 3+ years despite growth; every cross-functional initiative requires a VP sponsor. [src5]
Scoring & Interpretation
Overall Score Calculation
Overall Score = (Span of Control + Layer Efficiency + Role Clarity + Decision Rights + Structural Agility) / 5
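For example, dimension scores of 3, 2, 4, 3, and 3 yield (3 + 2 + 4 + 3 + 3) / 5 = 3.0, which lands at the bottom of the Competent band below.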
Score Interpretation
| Overall Score | Maturity Level | Interpretation | Recommended Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | Structural dysfunction actively impeding execution | Engage org design consultant; conduct span-and-layer analysis |
| 2.0 - 2.9 | Developing | Organic, unmanaged structure with visible growing pains | Define span targets; document decision rights; create job architecture |
| 3.0 - 3.9 | Competent | Sound foundation with optimization opportunities | Optimize spans; implement skills-based roles; accelerate delegation |
| 4.0 - 4.5 | Advanced | Structure enables the business with minor agility gaps | Focus on dynamic teaming and talent marketplace |
| 4.6 - 5.0 | Best-in-class | Org design is a competitive advantage | Maintain and innovate; pilot AI-assisted org design |
Benchmarks by Segment
| Segment | Expected Average Score | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Startup (25-100) | 2.0 | 2.5 | 1.5 |
| Growth (101-500) | 2.8 | 3.2 | 2.0 |
| Scale-up (501-2000) | 3.3 | 3.7 | 2.5 |
| Enterprise (2000+) | 3.8 | 4.2 | 3.0 |
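To apply the benchmarks, compare the overall score against the segment's thresholds. A minimal sketch; the segment keys and verdict labels are illustrative assumptions:

```python
# Thresholds taken from the benchmark table above; segment keys and
# verdict labels are illustrative assumptions.
BENCHMARKS = {
    "startup":    {"good": 2.5, "alarm": 1.5},
    "growth":     {"good": 3.2, "alarm": 2.0},
    "scaleup":    {"good": 3.7, "alarm": 2.5},
    "enterprise": {"good": 4.2, "alarm": 3.0},
}

def benchmark_verdict(score, segment):
    b = BENCHMARKS[segment]
    if score <= b["alarm"]:
        return "alarm"
    return "good" if score >= b["good"] else "typical"

print(benchmark_verdict(3.0, "growth"))  # typical: above alarm, below good
```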
Common Pitfalls in Assessment
- Confusing org chart accuracy with design quality: Having an up-to-date org chart does not mean the design is effective.
- Uniform span targets: A 1:10 span may be excellent for a sales team but dangerous for an engineering team staffed with junior developers. Function-specific targets are essential.
- Delayering without delegating: Removing layers without delegating authority creates bottlenecks at fewer levels.
- Reorg as universal solution: Restructuring around bad managers does not fix management quality problems.
When This Matters
Fetch when a user asks about organizational effectiveness, evaluates org structure appropriateness, prepares for a reorganization, diagnoses slow decision-making, or investigates unclear roles and responsibilities.