This assessment evaluates the maturity of an organization's learning and development function across six critical dimensions: strategic alignment, skills architecture, learning delivery infrastructure, leadership development, measurement and ROI, and learning culture. The output is a composite maturity score (1-5) that identifies where L&D is adding value and where it is operating as a cost center without measurable impact. [src1]
Strategic Alignment. What this measures: How tightly L&D priorities connect to business strategy and measurable outcomes.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | L&D is a training order-taker; no connection to business strategy | No documented L&D strategy; catalog unchanged 2+ years |
| 2 | Emerging | Budget and annual plan exist but not derived from strategy | Training plan not linked to goals; L&D leader absent from leadership |
| 3 | Defined | L&D strategy derived from business priorities; skills gaps identified | Strategy references company OKRs; annual gap analysis; budget tied to initiatives |
| 4 | Managed | L&D shapes strategy through workforce capability insights | Capability data informs hiring vs building; ROI per initiative measured |
| 5 | Optimized | L&D is strategic differentiator; real-time skills intelligence | AI predicts skill obsolescence; L&D cited in employer brand |
Red flags: L&D leader cannot connect portfolio to business priorities; flat budget during growth; no skills gap analysis. [src1]
Quick diagnostic question: "How does your L&D team decide what programs to invest in — business strategy or manager requests?"
Skills Architecture. What this measures: Whether the organization has a structured skills taxonomy and systematically identifies gaps.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No skills taxonomy; capabilities unknown; training by role title only | No competency framework; skills tracked informally |
| 2 | Emerging | Basic competency framework for some roles; self-assessment with poor data | Frameworks for leadership only; completion below 50% |
| 3 | Defined | Comprehensive taxonomy; regular assessments with manager validation | 80%+ participation; gap reports produced and acted upon |
| 4 | Managed | Dynamic architecture with market intelligence; AI-assisted skills inference | Taxonomy updated quarterly; AI infers skills from projects |
| 5 | Optimized | Real-time skills intelligence platform; predictive gap modeling | Skills graph with adjacency mapping; 12-24 month predictions |
Red flags: No skills taxonomy; cannot report top 5 skill gaps; skills data in disconnected spreadsheets. [src2]
Quick diagnostic question: "Can you show me your skills taxonomy and the top 3 skill gaps right now?"
Learning Delivery Infrastructure. What this measures: The technology, modalities, and accessibility of learning programs.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Classroom-only or ad hoc; no LMS; no tracking | No learning platform; content scattered across drives |
| 2 | Emerging | Basic LMS as content repository; limited e-learning | Low adoption; compliance-focused; no mobile access |
| 3 | Defined | Modern LMS with blended learning; curated content; mobile access | 70%+ monthly active users; blended paths; annual refresh |
| 4 | Managed | LXP with personalized recommendations; micro-learning; social learning | AI-powered recommendations; multi-provider marketplace |
| 5 | Optimized | AI-driven adaptive learning; VR/AR; learning in flow of work | AI personalizes paths; learning embedded in daily tools |
Red flags: LMS login rate below 30%; all training compliance-focused; no mobile access; content 2+ years old. [src3]
Quick diagnostic question: "What percentage of employees used your learning platform last month?"
Leadership Development. What this measures: How systematically the organization develops future leaders at all levels.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No formal leadership development; best ICs promoted without training | No leadership competency model; no succession planning |
| 2 | Emerging | Some external programs; ad hoc mentoring; C-suite succession only | Select high-potentials sent to programs; top-10 role succession |
| 3 | Defined | Structured programs at multiple levels; high-potential process; succession for top 2 levels | Internal curriculum; 9-box reviews; succession for director+ roles |
| 4 | Managed | Programs tied to business strategy; coaching available; pipeline metrics tracked | Programs aligned to strategic capabilities; diversity of pipeline tracked |
| 5 | Optimized | AI-assisted potential identification; personalized journeys; simulation centers | AI identifies potential; succession depth 2+ for all critical roles |
Red flags: No first-time manager training; succession covers CEO only; leadership budget first to be cut. [src5]
Quick diagnostic question: "What happens when someone is promoted to manager for the first time?"
Measurement & ROI. What this measures: How rigorously L&D effectiveness is measured and connected to business outcomes.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Only completion rates tracked; no effectiveness measurement | Sole metric is courses completed; budget defended by headcount |
| 2 | Emerging | Satisfaction surveys (Kirkpatrick L1); completion rates; some quizzes | Happy sheets collected; quiz scores for compliance |
| 3 | Defined | Effectiveness measured (Kirkpatrick L2-3); behavior change assessed | Pre/post assessments; manager surveys 90 days post-training |
| 4 | Managed | Business impact (Kirkpatrick L4); ROI calculated; A/B testing | Revenue impact measured; ROI reported; skill velocity tracked |
| 5 | Optimized | Real-time impact dashboard; causal inference; AI optimizes investment | Causal models isolate L&D impact; AI recommends budget reallocation |
Red flags: Only completion rates measured; no behavior change assessment; L&D budget not tied to outcomes. [src6]
Quick diagnostic question: "Beyond completion rates, how do you measure whether training changed behavior?"
Learning Culture. What this measures: Whether the organization values continuous learning and embeds it in daily work.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Learning is compliance-only; no development time; not valued by leadership | Mandatory compliance only; managers discourage learning time |
| 2 | Emerging | Some voluntary learning; tuition reimbursement underutilized | Low reimbursement usage; learning happens outside work hours |
| 3 | Defined | Dedicated learning time; development goals in reviews; knowledge sharing | 4-8 hours/month allocated; goals in reviews; ambassador program |
| 4 | Managed | Learning in workflows; managers evaluated on development; internal mobility | Managers accountable for growth; mentoring programs; communities of practice |
| 5 | Optimized | Learning organization culture; AI-driven suggestions; talent attraction differentiator | Learning in daily tools; failure-as-learning mindset; employer brand asset |
Red flags: No time for learning; no development goals in reviews; tuition reimbursement below 10% utilization. [src4]
Quick diagnostic question: "How much dedicated learning time do employees have each month?"
Overall Score = (Strategic Alignment + Skills Architecture + Delivery Infrastructure + Leadership Development + Measurement & ROI + Learning Culture) / 6
| Overall Score | Maturity Level | Interpretation | Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | L&D is compliance-only; skills gaps widening; retention at risk | Build skills taxonomy; establish LMS; create manager training |
| 2.0 - 2.9 | Developing | Foundation exists but reactive; disconnected from business | Connect to business priorities; skills gap analysis; leadership pipeline |
| 3.0 - 3.9 | Competent | Solid function with strategic connection; measurement improving | Business impact measurement; LXP deployment; predictive skills modeling |
| 4.0 - 4.5 | Advanced | High-performing with measurable impact; optimization focus | AI-driven personalization; causal impact modeling |
| 4.6 - 5.0 | Best-in-class | Industry-leading learning organization; competitive differentiator | Maintain innovation; pioneer emerging learning tech |
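The formula and interpretation table above can be sketched as a small scoring helper. This is an illustrative implementation only: the dimension keys, function names, and rounding to one decimal are my assumptions; the band cut-offs come directly from the table.

```python
# Hypothetical sketch of the overall-score calculation; dimension keys and
# function names are assumptions, the band boundaries are from the table above.

DIMENSIONS = (
    "strategic_alignment", "skills_architecture", "delivery",
    "leadership_dev", "measurement", "learning_culture",
)

# (lower bound, label) pairs from the interpretation table, checked high to low.
BANDS = [
    (4.6, "Best-in-class"),
    (4.0, "Advanced"),
    (3.0, "Competent"),
    (2.0, "Developing"),
    (1.0, "Critical"),
]

def overall_score(scores: dict[str, int]) -> float:
    """Unweighted mean of the six dimension scores (1-5 each)."""
    missing = set(DIMENSIONS) - scores.keys()
    if missing:
        raise ValueError(f"missing dimension scores: {missing}")
    return round(sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS), 1)

def maturity_level(score: float) -> str:
    """Map an overall score onto the maturity bands above."""
    for lower, label in BANDS:
        if score >= lower:
            return label
    raise ValueError("score must be between 1.0 and 5.0")

example = {
    "strategic_alignment": 3, "skills_architecture": 2, "delivery": 3,
    "leadership_dev": 2, "measurement": 1, "learning_culture": 3,
}
score = overall_score(example)  # (3+2+3+2+1+3)/6 = 2.333... -> 2.3
print(score, maturity_level(score))  # prints: 2.3 Developing
```

Note the equal weighting: a 5 in delivery infrastructure fully offsets a 1 in measurement, so the per-dimension scores should always be reported alongside the composite.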
| Weak Dimension (Score < 3) | Fetch This Card |
|---|---|
| Strategic Alignment | Performance Management Assessment |
| Skills Architecture | People Analytics Maturity Assessment |
| Leadership Development | Performance Management Assessment |
| Measurement & ROI | People Analytics Maturity Assessment |
| Learning Culture | DEI Program Assessment |
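The routing table above is a straightforward lookup from weak dimension to follow-up card. A minimal sketch, assuming per-dimension scores keyed by the dimension names used here (the card names are from the table; the helper itself is illustrative):

```python
# Illustrative lookup for the follow-up routing table above.
# Card names come from the table; the helper function is an assumption.

FOLLOW_UP_CARDS = {
    "Strategic Alignment": "Performance Management Assessment",
    "Skills Architecture": "People Analytics Maturity Assessment",
    "Leadership Development": "Performance Management Assessment",
    "Measurement & ROI": "People Analytics Maturity Assessment",
    "Learning Culture": "DEI Program Assessment",
}

def cards_to_fetch(dimension_scores: dict[str, float]) -> set[str]:
    """Return the follow-up cards for every dimension scoring below 3."""
    return {
        FOLLOW_UP_CARDS[dim]
        for dim, score in dimension_scores.items()
        if score < 3 and dim in FOLLOW_UP_CARDS
    }

print(cards_to_fetch({"Strategic Alignment": 2, "Learning Culture": 4}))
# prints: {'Performance Management Assessment'}
```

Because two weak dimensions can map to the same card, a set avoids fetching duplicates.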
| Segment | Expected Average | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Startup (50-200 employees) | 1.6 | 2.3 | 1.0 |
| Growth (200-1,000 employees) | 2.5 | 3.2 | 1.8 |
| Enterprise (1,000-10,000) | 3.2 | 3.8 | 2.5 |
| Large enterprise (10,000+) | 3.7 | 4.2 | 3.0 |
[src6]
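The segment benchmarks above can be applied mechanically. A sketch, assuming "alarm" means at or below the alarm threshold and "good" means at or above the good threshold (the segment keys and verdict labels are my own; the numbers are from the table):

```python
# Illustrative comparison against the segment benchmarks above.
# Thresholds are from the table; segment keys and verdict labels are assumptions.

BENCHMARKS = {  # segment: (expected average, "good" threshold, "alarm" threshold)
    "startup": (1.6, 2.3, 1.0),
    "growth": (2.5, 3.2, 1.8),
    "enterprise": (3.2, 3.8, 2.5),
    "large_enterprise": (3.7, 4.2, 3.0),
}

def benchmark_verdict(segment: str, score: float) -> str:
    """Place an overall score relative to its segment's benchmarks."""
    expected, good, alarm = BENCHMARKS[segment]
    if score <= alarm:
        return "alarm"
    if score >= good:
        return "good"
    return "above average" if score > expected else "below average"

print(benchmark_verdict("growth", 2.9))  # prints: above average
```

A 2.9 that reads as "Developing" on the absolute scale is still above average for a growth-stage company, which is why the segment context matters when presenting results.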
Fetch when a user asks to evaluate their L&D function, diagnose why upskilling programs fail, justify L&D investment to the CFO, or assess whether the organization can build capabilities internally versus hiring.