Learning & Development Maturity Assessment

Type: Assessment | Confidence: 0.84 | Sources: 6 | Verified: 2026-03-10

Purpose

This assessment evaluates the maturity of an organization's learning and development function across six critical dimensions: strategic alignment, skills architecture, learning delivery infrastructure, leadership development, measurement and ROI, and learning culture. The output is a composite maturity score (1-5) that identifies where L&D is adding value and where it is operating as a cost center without measurable impact. [src1]

Constraints

Assessment Dimensions

Dimension 1: Strategic Alignment

What this measures: How tightly L&D priorities connect to business strategy and measurable outcomes.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | L&D is a training order-taker; no connection to business strategy | No documented L&D strategy; catalog unchanged 2+ years |
| 2 | Emerging | Budget and annual plan exist but not derived from strategy | Training plan not linked to goals; L&D leader absent from leadership |
| 3 | Defined | L&D strategy derived from business priorities; skills gaps identified | Strategy references company OKRs; annual gap analysis; budget tied to initiatives |
| 4 | Managed | L&D shapes strategy through workforce capability insights | Capability data informs hiring vs building; ROI per initiative measured |
| 5 | Optimized | L&D is strategic differentiator; real-time skills intelligence | AI predicts skill obsolescence; L&D cited in employer brand |

Red flags: L&D leader cannot connect portfolio to business priorities; flat budget during growth; no skills gap analysis. [src1]

Quick diagnostic question: "How does your L&D team decide what programs to invest in — business strategy or manager requests?"

Dimension 2: Skills Architecture & Gap Analysis

What this measures: Whether the organization has a structured skills taxonomy and systematically identifies gaps.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | No skills taxonomy; capabilities unknown; training by role title only | No competency framework; skills tracked informally |
| 2 | Emerging | Basic competency framework for some roles; self-assessment with poor data | Frameworks for leadership only; completion below 50% |
| 3 | Defined | Comprehensive taxonomy; regular assessments with manager validation | 80%+ participation; gap reports produced and acted upon |
| 4 | Managed | Dynamic architecture with market intelligence; AI-assisted skills inference | Taxonomy updated quarterly; AI infers skills from projects |
| 5 | Optimized | Real-time skills intelligence platform; predictive gap modeling | Skills graph with adjacency mapping; 12-24 month predictions |

Red flags: No skills taxonomy; cannot report top 5 skill gaps; skills data in disconnected spreadsheets. [src2]

Quick diagnostic question: "Can you show me your skills taxonomy and the top 3 skill gaps right now?"

Dimension 3: Learning Delivery Infrastructure

What this measures: Technology, modalities, and accessibility of learning programs.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | Classroom-only or ad hoc; no LMS; no tracking | No learning platform; content scattered across drives |
| 2 | Emerging | Basic LMS as content repository; limited e-learning | Low adoption; compliance-focused; no mobile access |
| 3 | Defined | Modern LMS with blended learning; curated content; mobile access | 70%+ monthly active users; blended paths; annual refresh |
| 4 | Managed | LXP with personalized recommendations; micro-learning; social learning | AI-powered recommendations; multi-provider marketplace |
| 5 | Optimized | AI-driven adaptive learning; VR/AR; learning in flow of work | AI personalizes paths; learning embedded in daily tools |

Red flags: LMS login rate below 30%; all training compliance-focused; no mobile access; content 2+ years old. [src3]

Quick diagnostic question: "What percentage of employees used your learning platform last month?"

Dimension 4: Leadership Development

What this measures: How systematically the organization develops future leaders at all levels.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | No formal leadership development; best ICs promoted without training | No leadership competency model; no succession planning |
| 2 | Emerging | Some external programs; ad hoc mentoring; C-suite succession only | Select high-potentials sent to programs; top-10 role succession |
| 3 | Defined | Structured programs at multiple levels; high-potential process; succession for top 2 levels | Internal curriculum; 9-box reviews; succession for director+ roles |
| 4 | Managed | Programs tied to business strategy; coaching available; pipeline metrics tracked | Programs aligned to strategic capabilities; diversity of pipeline tracked |
| 5 | Optimized | AI-assisted potential identification; personalized journeys; simulation centers | AI identifies potential; succession depth 2+ for all critical roles |

Red flags: No first-time manager training; succession covers CEO only; leadership budget first to be cut. [src5]

Quick diagnostic question: "What happens when someone is promoted to manager for the first time?"

Dimension 5: Measurement & ROI

What this measures: How rigorously L&D effectiveness is measured and connected to business outcomes.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | Only completion rates tracked; no effectiveness measurement | Sole metric is courses completed; budget defended by headcount |
| 2 | Emerging | Satisfaction surveys (Kirkpatrick L1); completion rates; some quizzes | Happy sheets collected; quiz scores for compliance |
| 3 | Defined | Effectiveness measured (Kirkpatrick L2-3); behavior change assessed | Pre/post assessments; manager surveys 90 days post-training |
| 4 | Managed | Business impact (Kirkpatrick L4); ROI calculated; A/B testing | Revenue impact measured; ROI reported; skill velocity tracked |
| 5 | Optimized | Real-time impact dashboard; causal inference; AI optimizes investment | Causal models isolate L&D impact; AI recommends budget reallocation |

Red flags: Only completion rates measured; no behavior change assessment; L&D budget not tied to outcomes. [src6]

Quick diagnostic question: "Beyond completion rates, how do you measure whether training changed behavior?"

Dimension 6: Learning Culture

What this measures: Whether the organization values continuous learning and embeds it in daily work.

| Score | Level | Description | Evidence |
| --- | --- | --- | --- |
| 1 | Ad hoc | Learning is compliance-only; no development time; not valued by leadership | Mandatory compliance only; managers discourage learning time |
| 2 | Emerging | Some voluntary learning; tuition reimbursement underutilized | Low reimbursement usage; learning happens outside work hours |
| 3 | Defined | Dedicated learning time; development goals in reviews; knowledge sharing | 4-8 hours/month allocated; goals in reviews; ambassador program |
| 4 | Managed | Learning in workflows; managers evaluated on development; internal mobility | Managers accountable for growth; mentoring programs; communities of practice |
| 5 | Optimized | Learning organization culture; AI-driven suggestions; talent attraction differentiator | Learning in daily tools; failure-as-learning mindset; employer brand asset |

Red flags: No time for learning; no development goals in reviews; tuition reimbursement below 10% utilization. [src4]

Quick diagnostic question: "How much dedicated learning time do employees have each month?"

Scoring & Interpretation

Overall Score Calculation

Overall Score = (Strategic Alignment + Skills Architecture + Delivery + Leadership Dev + Measurement + Learning Culture) / 6

Score Interpretation

| Overall Score | Maturity Level | Interpretation | Next Step |
| --- | --- | --- | --- |
| 1.0 - 1.9 | Critical | L&D is compliance-only; skills gaps widening; retention at risk | Build skills taxonomy; establish LMS; create manager training |
| 2.0 - 2.9 | Developing | Foundation exists but reactive; disconnected from business | Connect to business priorities; skills gap analysis; leadership pipeline |
| 3.0 - 3.9 | Competent | Solid function with strategic connection; measurement improving | Business impact measurement; LXP deployment; predictive skills modeling |
| 4.0 - 4.5 | Advanced | High-performing with measurable impact; optimization focus | AI-driven personalization; causal impact modeling |
| 4.6 - 5.0 | Best-in-class | Industry-leading learning organization; competitive differentiator | Maintain innovation; pioneer emerging learning tech |
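The scoring rule above is a plain unweighted average mapped onto the interpretation bands. A minimal Python sketch follows; the dimension keys and function names are illustrative, not part of the assessment itself:

```python
# Composite maturity score: unweighted mean of six dimension scores (1-5),
# then a lookup against the interpretation bands from the table above.
# Dimension key names are illustrative placeholders.
DIMENSIONS = ["strategic_alignment", "skills_architecture", "delivery",
              "leadership_dev", "measurement", "learning_culture"]

BANDS = [  # (low, high, label) — inclusive bounds, as in the table
    (1.0, 1.9, "Critical"),
    (2.0, 2.9, "Developing"),
    (3.0, 3.9, "Competent"),
    (4.0, 4.5, "Advanced"),
    (4.6, 5.0, "Best-in-class"),
]

def overall_score(scores: dict) -> float:
    """Unweighted mean of the six dimension scores."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"missing dimensions: {missing}")
    return round(sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS), 2)

def maturity_level(score: float) -> str:
    """Map a score onto its band, rounding to one decimal to close band gaps."""
    s = round(score, 1)
    for low, high, label in BANDS:
        if low <= s <= high:
            return label
    raise ValueError(f"score out of range: {score}")
```

For example, an organization scoring 3 on every dimension lands at 3.0, in the "Competent" band.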

Dimension-Level Action Routing

| Weak Dimension (Score < 3) | Fetch This Card |
| --- | --- |
| Strategic Alignment | Performance Management Assessment |
| Skills Architecture | People Analytics Maturity Assessment |
| Leadership Development | Performance Management Assessment |
| Measurement & ROI | People Analytics Maturity Assessment |
| Learning Culture | DEI Program Assessment |
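The routing rule is a straightforward lookup: any routed dimension scoring below 3 maps to its follow-up card. A sketch, using the dimension and card names from the routing table (the function name is illustrative):

```python
# Route weak dimensions (score < 3) to follow-up cards, per the table above.
ACTION_ROUTING = {
    "Strategic Alignment": "Performance Management Assessment",
    "Skills Architecture": "People Analytics Maturity Assessment",
    "Leadership Development": "Performance Management Assessment",
    "Measurement & ROI": "People Analytics Maturity Assessment",
    "Learning Culture": "DEI Program Assessment",
}

def cards_to_fetch(dimension_scores: dict) -> list:
    """Return the deduplicated follow-up cards for every routed dimension
    scoring below 3; unscored dimensions are treated as not weak."""
    return sorted({card for dim, card in ACTION_ROUTING.items()
                   if dimension_scores.get(dim, 5) < 3})
```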

Benchmarks by Segment

| Segment | Expected Average | "Good" Threshold | "Alarm" Threshold |
| --- | --- | --- | --- |
| Startup (50-200 employees) | 1.6 | 2.3 | 1.0 |
| Growth (200-1,000 employees) | 2.5 | 3.2 | 1.8 |
| Enterprise (1,000-10,000) | 3.2 | 3.8 | 2.5 |
| Large enterprise (10,000+) | 3.7 | 4.2 | 3.0 |

[src6]
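The benchmark comparison can also be sketched in Python. The segment keys and threshold semantics (at-or-below alarm triggers "alarm", at-or-above good is "good") are assumptions for illustration; the numbers come from the table above:

```python
# Classify an overall score against segment benchmarks.
# Assumed semantics: score <= alarm threshold -> "alarm";
# score >= good threshold -> "good"; otherwise compared to expected average.
BENCHMARKS = {  # segment: (expected average, good threshold, alarm threshold)
    "startup": (1.6, 2.3, 1.0),
    "growth": (2.5, 3.2, 1.8),
    "enterprise": (3.2, 3.8, 2.5),
    "large_enterprise": (3.7, 4.2, 3.0),
}

def benchmark_verdict(segment: str, score: float) -> str:
    expected, good, alarm = BENCHMARKS[segment]
    if score <= alarm:
        return "alarm"
    if score >= good:
        return "good"
    return "above expected" if score >= expected else "below expected"
```

For instance, a growth-stage company at 2.8 is above its expected average (2.5) but short of the "good" threshold (3.2).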

Common Pitfalls in Assessment

When This Matters

Fetch when a user asks to evaluate their L&D function, diagnose why upskilling programs fail, justify L&D investment to the CFO, or assess whether the organization can build capabilities internally versus hiring.

Related Units