Organizational Network Analysis (ONA) is a structured diagnostic methodology that maps the invisible communication flows, influence relationships, and trust patterns that exist beneath a company's formal org chart. By treating a company as a measurable traffic network of messages, decisions, and handoffs rather than a static hierarchy, ONA reveals that formal authority accounts for only 20-30% of how work actually gets done — the remaining 70-80% flows through informal networks of influence and trust. [src1, src2]
START — User needs to diagnose organizational dysfunction
├── What's the primary symptom?
│   ├── "Teams don't collaborate despite being told to"
│   │   └── ONA — Communication Network Mapping ← YOU ARE HERE
│   ├── "Employees are disengaged or unhappy"
│   │   └── Employee Engagement Survey (measures sentiment, not structure)
│   ├── "Reporting lines are unclear or overlapping"
│   │   └── Organizational Design / RACI Matrix
│   ├── "Specific individuals are in personal conflict"
│   │   └── Conflict Resolution / Mediation Framework
│   └── "Decision-making is slow but nobody knows why"
│       └── ONA — Decision Flow Mapping ← YOU ARE HERE
├── Is the organization large enough (50+ people)?
│   ├── YES → Proceed with ONA
│   └── NO → Use direct observation and team retrospectives instead
└── Are there privacy/legal constraints on data collection?
    ├── YES (EU/GDPR) → Use survey-based ONA only (no passive digital trace)
    └── NO → Choose between survey-based and passive digital-trace methods based on budget
Anti-pattern: Leadership uses network analysis to find individuals who are "bottlenecks" or "resisters" and puts them on performance improvement plans. This weaponizes a structural diagnostic as a surveillance tool, destroys psychological safety, and guarantees that future ONA participation rates collapse. [src4]
Fix: Reframe structural findings as workflow problems, not personality problems. If someone is a bottleneck, the diagnosis is "this role carries too many cross-team dependencies" — the intervention is redistributing those dependencies, not blaming the individual. [src2]
Anti-pattern: A single ONA snapshot is taken, presented to the board as "the real org chart," and filed away. Six months later, a reorganization has completely changed communication patterns, but leadership still references the old map. [src3]
Fix: Schedule ONA assessments at regular intervals (every 6-12 months) and after major structural events (mergers, reorganizations, leadership changes). Compare network maps over time to distinguish stable structural patterns from transient fluctuations. [src2]
Anti-pattern: Analyzing every email, every Slack message, every calendar invite across the entire organization simultaneously. This creates an unmanageable dataset, triggers privacy concerns, and produces a network graph so dense it communicates nothing actionable. [src1]
Fix: Start with a targeted scope — the specific teams, boundaries, or decision flows where symptoms are visible. Expand the analysis only into areas where the initial map reveals unexpected patterns, mirroring how cybersecurity SIEM tools escalate attention based on threat indicators. [src1]
Misconception: ONA is a technology product — you need expensive software to do it.
Reality: ONA is a methodology, not a tool. The core technique (asking "who do you go to for information/advice/trust?") can be executed with a spreadsheet and a network visualization library. Software platforms automate scale, but the diagnostic logic is method-driven. [src2]
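The spreadsheet-level version of the core technique can be sketched in a few lines of plain Python. This is a minimal illustration, not any vendor's implementation: the names and survey responses are hypothetical, and in-degree is used as a simple proxy for informal influence (real analyses typically add measures like betweenness centrality).

```python
from collections import Counter

# Hypothetical survey data: each person lists who they go to for advice.
# All names and responses are illustrative, not from a real assessment.
advice_responses = {
    "ana":   ["dev", "priya"],
    "ben":   ["priya"],
    "carla": ["priya", "dev"],
    "dev":   ["priya"],
    "erin":  ["dev"],
    "priya": ["dev"],
}

# Build the directed edge list: (advice seeker -> advisor).
edges = [(seeker, advisor)
         for seeker, advisors in advice_responses.items()
         for advisor in advisors]

# In-degree = how many colleagues name this person as an advice source.
# High in-degree relative to team size flags an informal hub the
# formal org chart may not show.
in_degree = Counter(advisor for _, advisor in edges)
print(in_degree.most_common())  # dev and priya are each named by 4 colleagues
```

The same edge list feeds directly into any network visualization library for the map itself; the diagnostic logic above is the method.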
Misconception: More connections are always better — the goal is to maximize network density.
Reality: Collaboration overload is as destructive as isolation. Individuals with too many network ties become bottlenecks and burn out. The goal is efficient network structure — the right connections in the right places — not maximum density. [src2]
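The density-versus-overload distinction is easy to make concrete. In this hedged sketch (hypothetical ties, illustrative threshold), a star-shaped network has modest overall density yet still contains a severe bottleneck, which is exactly why maximizing density is the wrong target.

```python
# Hypothetical undirected collaboration ties; names are illustrative.
ties = {
    "hub": {"a", "b", "c", "d", "e", "f"},
    "a": {"hub"}, "b": {"hub"}, "c": {"hub"},
    "d": {"hub"}, "e": {"hub"}, "f": {"hub"},
}

n = len(ties)
edge_count = sum(len(nbrs) for nbrs in ties.values()) // 2
# Density = share of all possible ties that actually exist.
density = edge_count / (n * (n - 1) / 2)

# Flag people whose tie count far exceeds the median degree --
# overload candidates regardless of how sparse the network is overall.
# The 3x-median cutoff is an illustrative assumption, not a standard.
degrees = sorted(len(nbrs) for nbrs in ties.values())
median_degree = degrees[n // 2]
overloaded = [p for p, nbrs in ties.items()
              if len(nbrs) > 3 * median_degree]

print(round(density, 2), overloaded)  # low density, yet 'hub' is overloaded
```

Note that density here is about 0.29, far from "maximal," while one person carries every cross-team tie.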
Misconception: ONA replaces the org chart.
Reality: ONA supplements the org chart by revealing what it cannot see. Formal hierarchy serves legitimate purposes (accountability, legal authority, resource allocation). ONA identifies where informal reality diverges from formal design so that both can be aligned. [src1]
Misconception: Passive digital-trace analysis (email/Slack metadata) is always more accurate than surveys.
Reality: Digital traces capture frequency but not quality, trust, or influence direction. A survey asking "who do you go to for critical decisions?" captures a dimension that email volume cannot. The strongest ONA designs combine both methods. [src1]
| Concept | Key Difference | When to Use |
|---|---|---|
| ONA (Organizational Network Analysis) | Maps actual communication and influence flows — structural diagnostic | Diagnosing hidden bottlenecks, key-person risk, siloed teams, post-merger integration |
| Employee Engagement Survey | Measures individual sentiment and satisfaction — perceptual diagnostic | Diagnosing morale and motivation without needing structural visibility |
| RACI Matrix | Assigns formal responsibility for decisions — governance tool | Clarifying who is accountable for what, not who actually communicates with whom |
| Stakeholder Mapping | Identifies key stakeholders and interests for a specific initiative — project tool | Planning a change initiative, not diagnosing systemic communication patterns |
Fetch this when a user asks about mapping informal influence networks, diagnosing hidden communication bottlenecks, identifying key-person dependencies, assessing post-merger integration risks, or understanding why teams fail to collaborate despite formal alignment. Also fetch when a user references the gap between "how work should flow" (org chart) and "how work actually flows" (reality).