This assessment evaluates how effectively product usage data informs marketing decisions and how mature the product-led growth (PLG) motion is across five dimensions: data integration, activation and onboarding, usage-driven campaigns, product-qualified leads (PQLs), and feedback loop closure. [src1]
What this measures (data integration): How well product usage data flows to and from marketing systems.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | Product and marketing data in separate systems; no sharing | Marketing cannot see signups or usage; no integration |
| 2 | Emerging | Signups flow to CRM; some events sent to marketing; 24+ hour delay | Signup events sync; few key events shared; most data stays in product analytics |
| 3 | Defined | Real-time product events to marketing; 10+ key events synced; unified profiles | Customer profiles enriched with usage; product and marketing data joined in warehouse |
| 4 | Managed | CDP unifies all data; event-driven architecture; identity resolution | Anonymous-to-known stitching; real-time streaming; sub-hour latency |
| 5 | Optimized | Bi-directional data flow; ML models predict across both systems | Product personalizes from marketing data; marketing from product data |
Red flags: Marketing cannot see product usage; no integration between analytics and automation; manual data syncs. [src2]
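As an illustration of the product-to-marketing join that levels 3-4 describe, here is a minimal Python sketch that enriches marketing profiles with product events. The event shape and field names are hypothetical; real payloads depend on your analytics vendor and CDP.

```python
from dataclasses import dataclass

# Hypothetical event shape; real payloads depend on your analytics vendor.
@dataclass
class ProductEvent:
    user_id: str
    name: str          # e.g. "signup", "feature_used"
    timestamp: float   # unix seconds

def enrich_profiles(profiles: dict, events: list) -> dict:
    """Join product events onto marketing profiles, warehouse-merge style.

    `profiles` maps user_id -> profile dict; unknown users get a fresh
    profile, mimicking anonymous-to-known stitching at its simplest.
    """
    for ev in events:
        profile = profiles.setdefault(ev.user_id, {"events": []})
        profile["events"].append(ev.name)
        profile["last_seen"] = max(profile.get("last_seen", 0.0), ev.timestamp)
    return profiles
```

In a real stack this join typically happens in the warehouse or CDP rather than application code, but the shape of the merge is the same: usage events keyed by user identity, appended to a unified customer profile.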
What this measures (activation and onboarding): How well marketing and product collaborate to drive new user activation.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No activation metric; onboarding is product-only; marketing stops at signup | Marketing measures signups only; time-to-value not measured |
| 2 | Emerging | Basic onboarding emails; activation loosely defined; not tracked by marketing | Welcome email series; in-app tooltips; marketing and product not coordinated |
| 3 | Defined | Activation metric jointly owned; multi-channel onboarding; rate tracked weekly | Clear activation metric; onboarding tied to behavior; A/B testing on flows |
| 4 | Managed | Personalized onboarding by use case; activation segmented per cohort; >30% rate | Segmented onboarding paths; time-to-value by cohort; in-app + email orchestrated |
| 5 | Optimized | AI-powered adaptive onboarding; sub-hour time-to-value; >50% activation | AI adapts in real-time; predictive at-risk identification; hours to value |
Red flags: No defined activation metric; marketing stops at signup; time-to-value unmeasured; activation below 15%. [src4]
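The level-3 practice of tracking a jointly owned activation rate reduces to a few lines once the activation event is agreed; the cohort framing here is illustrative:

```python
def activation_rate(cohort_signups: set, activated_users: set) -> float:
    """Share of a signup cohort that reached the agreed activation event."""
    if not cohort_signups:
        return 0.0
    return len(cohort_signups & activated_users) / len(cohort_signups)
```

Computed per weekly cohort, the result maps directly onto the bands above: >30% supports a Managed rating, >50% an Optimized one, and anything below 15% is a red flag.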
What this measures (usage-driven campaigns): How product usage data triggers and personalizes marketing campaigns.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No usage-based campaigns; all demographic or time-based | Same email to all; no behavioral triggers; calendar-driven only |
| 2 | Emerging | A few basic triggers (e.g., inactive-user email); not personalized | 1-3 behavioral triggers; generic re-engagement; no usage segmentation |
| 3 | Defined | Usage-based segments drive targeting; feature adoption campaigns; 5-10 flows | Active/at-risk/dormant segments; feature campaigns; 5-10 behavioral flows |
| 4 | Managed | Personalized by individual usage; dynamic content; upgrade triggers | Usage-based personalization; upgrade at plan limits; cross-sell by feature patterns |
| 5 | Optimized | AI determines optimal message, channel, timing per user from usage | Predictive campaigns anticipate needs; fully autonomous optimization |
Red flags: Zero behavioral triggers; all blast sends; usage data not used for segmentation; no feature adoption campaigns. [src3]
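A minimal sketch of the active/at-risk/dormant segmentation named at level 3. The inactivity windows are assumptions chosen for illustration; tune them to your product's usage cadence:

```python
def usage_segment(days_inactive: int, at_risk_after: int = 7,
                  dormant_after: int = 30) -> str:
    """Classify a user by days since last product activity.

    Window defaults (7 and 30 days) are illustrative, not prescriptive.
    """
    if days_inactive < at_risk_after:
        return "active"
    if days_inactive < dormant_after:
        return "at_risk"
    return "dormant"
```

Each segment can then drive its own behavioral flow: feature-adoption campaigns for active users, re-engagement for at-risk, and winback for dormant.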
What this measures (PQLs): Whether the company identifies and operationalizes product-qualified leads.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No PQL concept; leads are MQLs from content; usage not in scoring | Leads from form fills only; free users never flagged to sales |
| 2 | Emerging | PQL concept discussed; some manual identification; no scoring | CS occasionally flags active free users; no automated handoff |
| 3 | Defined | PQL scoring model based on usage thresholds; automated alerts; conversion tracked | PQL defined and automated; conversion rate tracked; alerts to sales |
| 4 | Managed | Multi-signal PQL model; CRM integrated; PQL tiers with different routing | ML-enhanced scoring; enterprise PQLs get high-touch; SMB get automated upgrade |
| 5 | Optimized | Predictive PQL model; PQL drives 50%+ of pipeline; continuously refined | Predictive scoring; PQL is primary pipeline source; model trained on conversion data |
Red flags: No PQL definition; free users never surfaced to sales; usage not in lead scoring; all leads from content engagement. [src5]
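The level-3 threshold-based PQL model can be sketched as a simple multi-signal score. The signal names and thresholds below are assumptions for illustration, not a recommended definition:

```python
# Hypothetical usage thresholds that mark a product-qualified lead.
PQL_THRESHOLDS = {
    "active_days_14d": 5,    # days active in the last 14
    "key_features_used": 3,  # distinct core features adopted
    "seats_invited": 2,      # teammates invited
}

def pql_score(usage: dict) -> int:
    """Count how many qualifying signals a free user has crossed."""
    return sum(1 for signal, floor in PQL_THRESHOLDS.items()
               if usage.get(signal, 0) >= floor)

def is_pql(usage: dict, min_signals: int = 2) -> bool:
    """Flag the user to sales once enough signals fire."""
    return pql_score(usage) >= min_signals
```

A level-4 version would replace the static thresholds with an ML model trained on historical conversion data and route PQL tiers differently (high-touch for enterprise, automated upgrade prompts for SMB), but the contract stays the same: usage in, sales-ready flag out.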
What this measures (feedback loop closure): Whether insights flow bidirectionally between product and marketing teams.
| Score | Level | Description | Evidence |
|---|---|---|---|
| 1 | Ad hoc | No structured feedback; teams operate independently; no shared metrics | Separate OKRs; no joint meetings; last-minute launch communication |
| 2 | Emerging | Occasional sharing; marketing learns post-launch; some shared metrics | Monthly updates; marketing creates launch content but is not involved in planning |
| 3 | Defined | Marketing in launch planning; product gets customer insights; shared dashboard | 4+ week launch planning; customer feedback shared; quarterly strategy alignment |
| 4 | Managed | Continuous collaboration; shared experimentation; joint growth team | Growth pod; experiments designed jointly; A/B results inform both teams |
| 5 | Optimized | Fully integrated growth engine; shared P&L; AI optimizes across both | Unified growth function; end-to-end journey orchestration |
Red flags: Teams never meet; launches surprise marketing; no shared KPIs; customer feedback from marketing does not reach product. [src6]
Formula: Overall Score = (Data Integration + Activation + Usage Campaigns + PQLs + Feedback Loop) / 5
| Overall Score | Maturity Level | Interpretation | Next Step |
|---|---|---|---|
| 1.0 - 1.9 | Critical | Product and marketing completely siloed; PLG failing or nonexistent | Integrate signup data; define activation metric; monthly sync |
| 2.0 - 2.9 | Developing | Basic data connections; usage does not meaningfully inform marketing | Define PQL model; build behavioral triggers; establish shared KPIs |
| 3.0 - 3.9 | Competent | Functional feedback loop with usage campaigns and PQLs | Scale behavioral campaigns; ML-enhanced PQLs; form growth team |
| 4.0 - 4.5 | Advanced | Sophisticated PLG with predictive models and cross-functional collaboration | AI personalization; predictive PQLs; unified experimentation |
| 4.6 - 5.0 | Best-in-class | Fully integrated product-marketing growth engine | Maintain; evaluate emerging PLG-AI patterns |
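The formula and interpretation bands translate directly to code; this sketch assumes each dimension is scored as an integer from 1 to 5:

```python
DIMENSIONS = ("data_integration", "activation", "usage_campaigns",
              "pqls", "feedback_loop")

def overall_score(scores: dict) -> float:
    """Average the five dimension scores, per the formula above."""
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

def maturity_level(score: float) -> str:
    """Map an overall score onto the interpretation bands above."""
    if score < 2.0:
        return "Critical"
    if score < 3.0:
        return "Developing"
    if score < 4.0:
        return "Competent"
    if score <= 4.5:
        return "Advanced"
    return "Best-in-class"
```

Because each input is an integer 1-5, the average always lands on a multiple of 0.2, so the 4.5/4.6 band boundary is never ambiguous.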
| Segment | Expected Average | "Good" Threshold | "Alarm" Threshold |
|---|---|---|---|
| Pure PLG | 2.8 | 3.5 | 2.0 |
| PLG + Sales Assist | 2.4 | 3.2 | 1.8 |
| Sales-led with freemium | 1.8 | 2.5 | 1.2 |
| Developer tools (PLG) | 3.0 | 3.8 | 2.2 |
[src1]
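To read a score against the segment benchmarks above, a small lookup suffices. The thresholds are copied from the table; treating a score at or below the alarm threshold as an alarm is an interpretation choice:

```python
# (expected average, "good" threshold, "alarm" threshold) per segment,
# copied from the benchmark table.
BENCHMARKS = {
    "Pure PLG": (2.8, 3.5, 2.0),
    "PLG + Sales Assist": (2.4, 3.2, 1.8),
    "Sales-led with freemium": (1.8, 2.5, 1.2),
    "Developer tools (PLG)": (3.0, 3.8, 2.2),
}

def benchmark_verdict(segment: str, score: float) -> str:
    """Compare an overall score to its segment's benchmark bands."""
    average, good, alarm = BENCHMARKS[segment]
    if score <= alarm:
        return "alarm"
    if score >= good:
        return "good"
    return "above average" if score > average else "at or below average"
```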
Fetch when a user asks how to connect product usage data with marketing, wants to build or improve a PLG motion, needs to define PQLs, has low free-to-paid conversion, or wants to assess product-marketing collaboration.