This assessment evaluates the maturity of a company's Financial Planning & Analysis function across five critical dimensions: budgeting process, forecasting accuracy, scenario modeling, management reporting, and decision support integration. The output is a composite maturity score (1-5) that identifies the weakest capabilities and routes to specific improvement actions. [src1]
Dimension 1: Budgeting Process. What this measures: How effectively the organization creates, manages, and iterates on its annual budget and periodic reforecasts.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Budget is a spreadsheet exercise done annually with no process ownership; takes 3+ months | Disconnected spreadsheets; no version control |
| 2 | Emerging | Structured annual budget with department input; spreadsheet-based; 8-12 weeks | Templates exist; manual consolidation; one reforecast/year |
| 3 | Defined | Formalized driver-based budgeting; 2-3 reforecasts/year; 4-8 weeks | Budget calendar; planning tool in use; quarterly reforecasts |
| 4 | Managed | Rolling budget with monthly reforecasts; under 4 weeks; automated variance | Rolling 12-18 month budget; automated reporting |
| 5 | Optimized | Continuous planning with real-time updates; AI-assisted; under 2 weeks | Continuous planning platform; predictive recommendations |
Red flags: Budget takes >12 weeks; no reforecasts during the year; budget never referenced after approval.
Quick diagnostic question: "How many times per year do you reforecast, and how long does the process take?"
Dimension 2: Forecasting Accuracy. What this measures: The accuracy, speed, and methodology of financial forecasts across revenue, expenses, and cash flow.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No formal forecasting; static annual budgets; variance measured at year-end | No forecast documents; annual-only variance analysis |
| 2 | Emerging | Quarterly forecasts using historical run rates; MAPE >20% | Quarterly forecast; formula-based; no rolling horizon |
| 3 | Defined | Monthly/rolling forecasts; driver-based; MAPE 10-20%; 6+ month horizon | Driver-based models; accuracy reports distributed |
| 4 | Managed | Rolling 12-18 month forecasts; multiple scenarios; MAPE 5-10% | Multi-scenario models; probabilistic forecasts; automated feeds |
| 5 | Optimized | Continuous forecasting with AI/ML; MAPE <5%; finalization under 3 days | AI-augmented models; real-time integration; <72hr cycle |
Red flags: Forecasts never compared to actuals; single point estimate with no range; revenue forecast misses by >15%.
Quick diagnostic question: "What is your forecast accuracy (MAPE) over the last 4 quarters?"
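MAPE (Mean Absolute Percentage Error), the accuracy metric used in the rubric above, can be computed from matched forecast/actual pairs. A minimal sketch (the revenue figures are illustrative, not benchmarks):

```python
def mape(actuals, forecasts):
    """Mean Absolute Percentage Error across matched actual/forecast pairs."""
    if len(actuals) != len(forecasts) or not actuals:
        raise ValueError("need equal-length, non-empty series")
    # Skip zero actuals to avoid division by zero
    errors = [abs(a - f) / abs(a) for a, f in zip(actuals, forecasts) if a != 0]
    return 100 * sum(errors) / len(errors)

# Four quarters of revenue ($M): actuals vs. forecasts made at quarter start
actuals = [10.0, 11.5, 12.0, 13.0]
forecasts = [9.0, 12.0, 11.0, 14.0]
print(f"MAPE: {mape(actuals, forecasts):.1f}%")  # → MAPE: 7.6% (score-4 band, 5-10%)
```

A team at level 3 or above would run this comparison every cycle and distribute the result, per the Evidence column.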
Dimension 3: Scenario Modeling. What this measures: The organization's ability to model different business scenarios, stress tests, and what-if analyses.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No scenario modeling; only a single plan exists | No scenario documents; ad hoc spreadsheet changes |
| 2 | Emerging | Basic best/worst/base case; maintained manually; annual only | Three-scenario model; static; limited variables |
| 3 | Defined | Multiple scenarios with linked assumptions; quarterly updates | Linked models; 5-10 key variables; used in board decks |
| 4 | Managed | Dynamic models with real-time updates; Monte Carlo approaches | Planning tool models; probability-weighted; used for M&A decisions |
| 5 | Optimized | Real-time simulation with AI sensitivity analysis; <24hr turnaround | Real-time dashboards; AI insights; decision workflow integration |
Red flags: Only one version of the plan; leadership cannot model a 20% revenue drop within 24 hours. [src4]
Quick diagnostic question: "How many active scenarios do you maintain, and how quickly can you model a major assumption change?"
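The "Monte Carlo approaches" referenced at level 4 can be sketched in a few lines. This is an illustrative toy, not a prescribed model; the growth parameters are assumptions:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_revenue(base=100.0, growth_mu=0.10, growth_sigma=0.08, runs=10_000):
    """Probability-weighted next-year revenue under normally distributed growth.

    base, growth_mu, growth_sigma are illustrative assumptions.
    Returns the 10th/50th/90th percentile outcomes.
    """
    outcomes = sorted(base * (1 + random.gauss(growth_mu, growth_sigma))
                      for _ in range(runs))
    return {
        "p10": outcomes[int(0.10 * runs)],
        "p50": outcomes[int(0.50 * runs)],
        "p90": outcomes[int(0.90 * runs)],
    }

bands = simulate_revenue()
print({k: round(v, 1) for k, v in bands.items()})
```

The point of the exercise is the spread between p10 and p90: a single-point plan (level 1-2) hides exactly that range.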
Dimension 4: Management Reporting. What this measures: The quality, timeliness, and actionability of financial reports delivered to management and the board.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Reports produced manually, often late, inconsistent format | Ad hoc spreadsheets; format changes monthly; 3+ weeks after close |
| 2 | Emerging | Standardized monthly package; 10-15 business days; backward-looking | Monthly template; delivery 10-15 days; mostly P&L |
| 3 | Defined | Comprehensive reporting within 10 days; KPIs, variance, commentary | Board deck with KPIs; 7-10 day cycle; variance commentary |
| 4 | Managed | Automated dashboards; real-time data; self-service; 5-day cycle | BI dashboards; automated pipelines; self-service analytics |
| 5 | Optimized | Predictive analytics; AI narrative generation; continuous monitoring | AI commentary; predictive metrics; automated alerts; 2-3 day cycle |
Red flags: Monthly close >15 business days; board receives only P&L with no operational KPIs. [src6]
Quick diagnostic question: "How many business days after month-end do stakeholders receive the reporting package?"
Dimension 5: Decision Support Integration. What this measures: How effectively FP&A partners with business units to drive data-informed decisions.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | FP&A is purely a reporting function; no involvement in decisions | Only produces reports; not invited to strategy meetings |
| 2 | Emerging | Ad hoc analyses when requested; beginning to attend reviews | Occasional analyses; reactive insights |
| 3 | Defined | FP&A business partners assigned to BUs; regular review cadence | Named partners; monthly BU reviews; proactive commentary |
| 4 | Managed | FP&A drives strategic planning; ROI analysis; capital allocation | Leads planning cycle; investment committee involvement |
| 5 | Optimized | Strategic partner embedded in all major decisions; predictive insights | Executive committee presence; strategy co-creation |
Red flags: FP&A not invited to business reviews; no analysis influenced a major decision in 6 months; >70% time on report production. [src2, src5]
Quick diagnostic question: "When was the last time an FP&A analysis directly influenced a significant business decision?"
Overall Score = (Budgeting + Forecasting + Scenario Modeling + Reporting + Decision Support) / 5
| Overall Score | Maturity Level | Interpretation | Recommended Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | FP&A function barely exists — flying blind on financial planning | Build foundational budgeting and hire FP&A resource |
| 2.0 - 2.9 | Developing | Basic processes exist but are manual, slow, and backward-looking | Implement planning tool; establish rolling forecasts |
| 3.0 - 3.9 | Competent | Solid FP&A foundation with automation and partnership gaps | Automate reporting; develop scenario capabilities |
| 4.0 - 4.5 | Advanced | Strong FP&A function driving strategic value | Deploy AI augmentation; reduce cycle times |
| 4.6 - 5.0 | Best-in-class | FP&A is a strategic differentiator | Maintain edge through innovation and external benchmarking |
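The composite score and its mapping to the interpretation table above can be expressed as a small sketch (dimension keys are illustrative names, not a fixed schema):

```python
def fpa_maturity(scores):
    """Average the five dimension scores (1-5 each) and map to a maturity level.

    scores: dict keyed by the five dimensions; key names are illustrative.
    """
    dims = ["budgeting", "forecasting", "scenario_modeling",
            "reporting", "decision_support"]
    overall = sum(scores[d] for d in dims) / len(dims)
    # Bands follow the interpretation table: 1.0-1.9, 2.0-2.9, 3.0-3.9, 4.0-4.5, 4.6-5.0
    if overall < 2.0:
        level = "Critical"
    elif overall < 3.0:
        level = "Developing"
    elif overall < 4.0:
        level = "Competent"
    elif overall <= 4.5:
        level = "Advanced"
    else:
        level = "Best-in-class"
    return overall, level

score, level = fpa_maturity({
    "budgeting": 3, "forecasting": 2, "scenario_modeling": 2,
    "reporting": 3, "decision_support": 2,
})
print(score, level)  # → 2.4 Developing
```

Note the unweighted average deliberately surfaces the weakest dimension only via the routing table below it, not in the headline number.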
| Weak Dimension (Score < 3) | Fetch This Card |
|---|---|
| Budgeting Process | Financial Metrics Benchmarks for target-setting |
| Forecasting Accuracy | Cash Flow Assessment for forecast-dependent planning |
| Scenario Modeling | Financial Metrics Benchmarks for baseline assumptions |
| Management Reporting | Financial Controls Assessment for reporting controls |
| Decision Support | Revenue Operations Assessment for cross-functional alignment |
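The routing rule above (any dimension below 3 triggers a follow-up card) could be implemented as a simple lookup; duplicates collapse when two weak dimensions route to the same card:

```python
# Card routing per the table above; dimension keys are illustrative names
ROUTING = {
    "budgeting": "Financial Metrics Benchmarks",
    "forecasting": "Cash Flow Assessment",
    "scenario_modeling": "Financial Metrics Benchmarks",
    "reporting": "Financial Controls Assessment",
    "decision_support": "Revenue Operations Assessment",
}

def cards_to_fetch(scores, threshold=3):
    """Return the deduplicated follow-up cards for dimensions scoring below threshold."""
    return sorted({ROUTING[d] for d, s in scores.items() if s < threshold})

print(cards_to_fetch({"budgeting": 2, "forecasting": 4, "scenario_modeling": 2,
                      "reporting": 3, "decision_support": 3}))
# → ['Financial Metrics Benchmarks']
```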
| Segment | Expected Average Score | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Seed/Series A (<$5M ARR) | 1.5 - 2.0 | > 2.5 | < 1.5 |
| Series B-C ($5M-$50M ARR) | 2.5 - 3.0 | > 3.5 | < 2.0 |
| Growth ($50M-$250M ARR) | 3.0 - 3.5 | > 4.0 | < 2.5 |
| Scale/Public ($250M+ ARR) | 3.5 - 4.0 | > 4.5 | < 3.0 |
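Interpreting an overall score against the segment benchmarks above can be sketched as a threshold check (segment labels copied from the table; the three-way labels are illustrative):

```python
# (segment, expected_lo, expected_hi, good_above, alarm_below) per the table above
BENCHMARKS = [
    ("Seed/Series A (<$5M ARR)",  1.5, 2.0, 2.5, 1.5),
    ("Series B-C ($5M-$50M ARR)", 2.5, 3.0, 3.5, 2.0),
    ("Growth ($50M-$250M ARR)",   3.0, 3.5, 4.0, 2.5),
    ("Scale/Public ($250M+ ARR)", 3.5, 4.0, 4.5, 3.0),
]

def classify(segment, overall):
    """Place an overall maturity score relative to its segment's thresholds."""
    for name, _lo, _hi, good, alarm in BENCHMARKS:
        if name == segment:
            if overall > good:
                return "above 'good' threshold"
            if overall < alarm:
                return "below 'alarm' threshold"
            return "between thresholds"
    raise KeyError(segment)

print(classify("Series B-C ($5M-$50M ARR)", 2.4))  # → between thresholds
```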
Fetch when a user asks to evaluate their FP&A function, diagnose why financial planning is reactive, prepare for a CFO hire or transition, or benchmark FP&A capabilities against industry standards before investing in planning tools.