---
type: claim
domain: critical-systems
description: "Control theory's foundational distinction: negative feedback creates stability and self-correction while positive feedback creates exponential growth, lock-in, and cascading failure — most complex systems exhibit both simultaneously"
confidence: proven
source: "Wiener, Cybernetics (1948); Meadows, Thinking in Systems (2008); Arthur, Increasing Returns and Path Dependence (1994)"
created: 2026-03-07
---
# positive feedback loops amplify deviations from equilibrium while negative feedback loops dampen them and the balance between the two determines whether systems stabilize self-correct or run away
Wiener's cybernetics (1948) formalized what engineers had known for centuries: systems are governed by feedback. Negative feedback loops (thermostats, homeostasis, market price corrections) push systems toward equilibrium by counteracting deviations. Positive feedback loops (compound interest, viral spread, arms races) amplify deviations, driving systems away from their starting state.
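The distinction can be reduced to a one-line model (an illustrative sketch of my own, not from Wiener): a deviation `x` from equilibrium under a net feedback gain `g = g_pos - g_neg` either decays or compounds each step.

```python
# Toy model (illustrative, not from the cited sources): a single
# deviation x under net feedback gain g = g_pos - g_neg per step.
# g < 0 -> negative feedback dominates: the deviation decays.
# g > 0 -> positive feedback dominates: the deviation compounds.

def deviation(g, steps, x0=1.0):
    x = x0
    for _ in range(steps):
        x += g * x          # each step damps or amplifies the deviation
    return x

print(deviation(g=-0.2, steps=30))  # thermostat-like: decays toward 0
print(deviation(g=+0.2, steps=30))  # compound-interest-like: grows as 1.2^30
```

Everything hinges on the sign of the net gain; the magnitude only sets how fast the system stabilizes or runs away.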
The interaction between the two determines system behavior:
**Dominated by negative feedback:** The system is self-correcting. Perturbations decay. Examples: body temperature regulation, competitive market pricing, ecosystem population dynamics. These systems are stable but can be slow to adapt.
**Dominated by positive feedback:** The system runs away. Small advantages compound into large ones. Examples: nuclear chain reactions, bank runs, network effects in technology adoption. Arthur (1994) demonstrated that positive feedback in technology markets produces lock-in — the winning technology need not be the best, only the first to cross a tipping point.
**Both operating simultaneously:** Most real complex systems. Meadows (2008) showed that the most dangerous systems are those where positive feedback loops operate on short timescales (quarterly profits, capability advances) while negative feedback loops operate on long timescales (regulation, social learning, institutional adaptation). The system appears stable until the positive loop overwhelms the negative one — then the transition is sudden and often irreversible.
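Meadows's timescale argument can be made concrete with a toy delayed-feedback model (my own sketch; the gains and lag values are illustrative, not from the source): amplification acts on the current deviation every step, while the corrective term responds to the deviation as it stood some number of steps ago.

```python
# Toy model (illustrative assumptions, not from Meadows): deviation x is
# amplified by gain a every step, while a negative-feedback correction of
# gain b acts on the deviation as it stood `lag` steps ago — fast positive
# feedback vs. slow negative feedback.

def simulate(a, b, lag, steps=60, x0=1.0):
    history = [x0]
    for _ in range(steps):
        x = history[-1]
        delayed = history[max(0, len(history) - 1 - lag)]
        history.append(x + a * x - b * delayed)
    return history

# Identical gains; only the response lag of the negative loop differs.
fast_oversight = simulate(a=0.10, b=0.15, lag=1)
slow_oversight = simulate(a=0.10, b=0.15, lag=25)
print(abs(fast_oversight[-1]))  # small: the perturbation decayed
print(abs(slow_oversight[-1]))  # large: the same system ran away
```

With a short lag the corrective gain (0.15) outweighs the amplifying gain (0.10) and perturbations die out; with a long lag the same gains produce runaway, because the correction is always answering an old, smaller deviation.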
This framework applies directly to coordination design: designed systems need negative feedback (error correction, oversight, accountability) that operates at least as fast as the positive feedback (capability growth, competitive pressure, accumulation of power). When negative feedback is slower, the system is structurally unstable regardless of initial conditions.
---
Relevant Notes:
- [[recursive self-improvement creates explosive intelligence gains because the system that improves is itself improving]] — the intelligence explosion as a positive feedback loop without a governing negative feedback mechanism
- [[the alignment tax creates a structural race to the bottom because safety training costs capability and rational competitors skip it]] — positive feedback (competitive advantage from skipping safety) dominating negative feedback (reputational or regulatory cost)
- [[minsky's financial instability hypothesis shows that stability breeds instability as good times incentivize leverage and risk-taking that fragilize the system until shocks trigger cascades]] — Minsky's insight as positive feedback in financial systems: stability itself is the input that drives the destabilizing loop
- [[complex systems drive themselves to the critical state without external tuning because energy input and dissipation naturally select for the critical slope]] — SOC as a system where positive and negative feedback balance at the critical point
- [[optimization for efficiency without regard for resilience creates systemic fragility because interconnected systems transmit and amplify local failures into cascading breakdowns]] — efficiency optimization as positive feedback that weakens the negative feedback of resilience
Topics:
- [[_map]]