teleo-codex/inbox/queue/2026-02-00-choudary-hbr-ai-coordination-not-automation.md

type: source
title: AI's Big Payoff Is Coordination, Not Automation
author: Sangeet Paul Choudary (@sanguit)
url: https://hbr.org/2026/02/ais-big-payoff-is-coordination-not-automation
date: 2026-02-01
domain: ai-alignment
secondary_domains:
format: article
status: unprocessed
priority: high
tags: coordination, automation, translation-costs, AI-value, misallocation, platform-strategy, economic-payoff

Content

Main argument: AI's most significant economic value comes from reducing "translation costs" — friction in coordinating disparate teams, tools, and data — rather than automating individual tasks. AI enables coordination without requiring consensus on standards or platforms.

Key concept — Translation costs: the friction involved in coordinating disparate teams, tools, and systems. Historically, reducing this friction required standardization (everyone adopting the same platform); AI removes that requirement by performing the translation dynamically.

Evidence:

  • Construction (Trunk Tools): Integrates BIM software, spreadsheets, photos, emails, PDFs into unified project view. Teams maintain specialized tools. Coordination cost drops without standardization.
  • Auto insurance (Tractable): Disrupted market leader CCC Intelligent Solutions by training AI to interpret smartphone photos of vehicle damage — sidestepping standardization requirements. Processed ~$7B in claims by 2023.
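The "translation layer" pattern in both examples can be sketched in code. This is a hypothetical illustration, not Trunk Tools' or Tractable's actual architecture: the `translate` function stands in for an AI model, and the source names and field names are invented. The point it demonstrates is that each upstream team keeps its own schema and vocabulary, and the layer absorbs the differences instead of forcing standardization.

```python
from dataclasses import dataclass

@dataclass
class ProjectEvent:
    source: str   # which tool the record came from
    task: str     # task identifier, normalized across sources
    status: str   # status in a shared vocabulary ("open"/"done")

def translate(record: dict) -> ProjectEvent:
    """Stand-in for an AI translation layer: maps each source's native
    fields and vocabulary into one shared schema. In the article's
    framing, a model would infer these mappings dynamically; here they
    are hard-coded per source purely for illustration."""
    if record["source"] == "spreadsheet":
        return ProjectEvent(
            source="spreadsheet",
            task=record["Task ID"],
            status="done" if record["Complete?"] == "Y" else "open",
        )
    if record["source"] == "bim":
        return ProjectEvent(
            source="bim",
            task=record["element"],
            status=record["state"],
        )
    raise ValueError(f"unknown source: {record['source']}")

# Two teams report the same task in incompatible formats; the layer
# yields a unified view without either team changing its tooling.
records = [
    {"source": "spreadsheet", "Task ID": "T-104", "Complete?": "Y"},
    {"source": "bim", "element": "T-104", "state": "open"},
]
unified = [translate(r) for r in records]
```

The coordination gain is that adding a new tool means extending the translation layer, not renegotiating a shared standard across every team.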

Author's three strategies for incumbents:

  1. Become the translation layer (example: project44 in logistics — ecosystem-wide coordination)
  2. Double down on accountability (Maersk's integrated logistics model — responsible for outcomes despite fragmentation)
  3. Fragment and tax (FedEx — maintains privileged internal unified view, rations external access)

Author: Sangeet Paul Choudary — C-level AI and platform strategy advisor, UC Berkeley senior fellow, Thinkers50 Strategy Award 2025.

Agent Notes

Why this matters: This is the most important reframe I've encountered for the automation overshoot problem. If AI's ACTUAL value is in coordination reduction (not automation), then organizations that are automating tasks (the dominant deployment pattern) are SYSTEMATICALLY MISALLOCATING. They're pursuing the wrong value. This is a new mechanism for misallocation that's distinct from the four overshoot mechanisms identified last session — it's not that firms overshoot the optimal automation level, it's that they're optimizing for the wrong thing entirely.

What surprised me: The argument that AI eliminates the standardization requirement for coordination is genuinely novel to me. This matches the mathematical argument in our KB — distributed architectures don't require consensus (like monolithic alignment trying to aggregate all preferences). If AI can coordinate without consensus, this is a practical instantiation of what our collective architecture thesis requires theoretically.

What I expected but didn't find: Evidence that the coordination payoff is LARGER than automation in magnitude. The article makes the qualitative argument but doesn't provide comparative ROI data. Also missing: whether coordination applications of AI are being deployed at scale yet, or whether this remains largely untapped.

KB connections:

Extraction hints:

  • High-priority claim candidate: "AI's primary economic value is in reducing translation costs between specialized teams and tools rather than automating individual tasks, which means most AI deployment is systematically misallocated toward lower-value automation applications"
  • The "coordination without consensus" principle deserves extraction — it operationalizes the distributed architecture thesis at the firm level
  • The three incumbent strategies are less extractable (prescriptive rather than empirical)

Context: HBR February 2026 publication by credible platform strategy thinker. Highly visible to business audience. This is the kind of mainstream articulation that could shift how organizations think about AI deployment.

Curator Notes

PRIMARY CONNECTION: the claim that coordination protocol design produces larger capability gains than model scaling (in the connected KB item, the same AI model performed 6x better with structured exploration than with human coaching on the same problem).

WHY ARCHIVED: Provides the economic theory for WHY automation-focused AI deployment is suboptimal — the real value is in coordination. This reframes the overshoot problem as misallocation not just excess.

EXTRACTION HINT: Extract the "translation costs" concept and the coordination-vs-automation value claim. Scope carefully: Choudary's argument is about where economic value is largest, not about alignment implications — Theseus should make the alignment connection explicit in extraction.