| type | domain | description | confidence | source | created | secondary_domains | related_claims |
|---|---|---|---|---|---|---|---|
| claim | mechanisms | Simulated annealing, built on the Metropolis acceptance rule, shows that accepting worse solutions with decreasing probability provably converges to the global optimum -- the mathematical case for tolerating short-term loss | proven | Kirkpatrick, Gelatt, Vecchi (1983); Metropolis algorithm (1953); Boltzmann distribution | 2026-04-21 | | |
Simulated annealing maps the physics of cooling onto optimization by starting with high randomness and gradually reducing it.
A metal cools slowly from high temperature. At high temperature, atoms jump freely between configurations, exploring widely. As temperature drops, atoms settle into low-energy configurations. If cooling is slow enough, the metal reaches its global energy minimum -- a perfect crystal. If cooled too fast, atoms freeze in a disordered, suboptimal state (glass).
Kirkpatrick, Gelatt, and Vecchi (1983) proved this physical process is isomorphic to combinatorial optimization. Replace "energy" with "cost function" and "temperature" with "willingness to accept worse solutions." At high temperature, the algorithm accepts moves to worse states frequently, enabling broad exploration. As temperature decreases, acceptance of worse states drops exponentially, and the algorithm converges toward the global optimum. The cooling schedule is everything: too fast and you freeze in a local minimum, too slow and you waste computation exploring already-mapped territory.
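The acceptance rule and the cooling loop are compact enough to sketch directly. A minimal Python sketch, assuming a geometric cooling schedule and user-supplied cost and neighbor functions; the schedule constants and the toy cost in the demo are illustrative choices, not part of the 1983 formulation:

```python
import math
import random

def metropolis_accept(delta_e, temperature, rng=random):
    """Metropolis criterion: always accept improvements; accept a worse
    move with probability exp(-delta_e / T), which shrinks as T drops."""
    if delta_e <= 0:
        return True
    return rng.random() < math.exp(-delta_e / temperature)

def simulated_annealing(cost, neighbor, x0, t0=10.0, alpha=0.99, steps=5000):
    """Generic SA loop with geometric cooling T <- alpha * T.
    t0, alpha, and steps are illustrative defaults."""
    x, t = x0, t0
    best_x, best_cost = x0, cost(x0)
    for _ in range(steps):
        candidate = neighbor(x)
        delta_e = cost(candidate) - cost(x)
        if metropolis_accept(delta_e, t):
            x = candidate
            if cost(x) < best_cost:
                best_x, best_cost = x, cost(x)
        t *= alpha  # cooling: worse states are accepted less and less often
    return best_x, best_cost

# Demo on a toy convex cost; any cost/neighbor pair works.
random.seed(0)
best_x, best_cost = simulated_annealing(
    cost=lambda x: (x - 2.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    x0=10.0,
)
```

At high temperature nearly every proposal is accepted, so the walk explores; once the temperature collapses, only improving moves pass and the loop behaves like greedy local search.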
The insight that transfers beyond computation: any system that wants to find globally good solutions must tolerate periods of locally worse performance. Markets that never allow failure (bailouts, zombie firms) are cooling too fast -- they freeze in suboptimal configurations. Societies that never tolerate disorder (authoritarian stability) are doing the same. The mathematical proof says you MUST pass through worse states to reach better ones when the landscape is rugged.
The cooling schedule implies a lifecycle. Young systems should explore widely (high temperature). Mature systems should exploit locally (low temperature). The transition between exploration and exploitation is itself the critical design choice -- and there is no universal optimal schedule. It depends on the landscape's ruggedness, which you generally don't know in advance.
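The freeze-versus-explore tradeoff can be shown numerically. In this sketch the landscape (a parabola with deep sinusoidal ripples), the cooling rates, the step size, and the starting point are all illustrative choices; the pattern they exhibit -- fast cooling traps the walker in a nearby ripple, slow cooling lets it escape toward the global basin -- is the general one:

```python
import math
import random

def anneal(alpha, steps=20000, seed=0):
    """Anneal on a rugged 1D landscape whose ripples are deep enough
    that greedy descent stalls in a local minimum. `alpha` is the
    geometric cooling rate; values near 1 cool slowly.
    All constants here are illustrative, not canonical."""
    rng = random.Random(seed)
    cost = lambda x: 0.1 * x * x + 4 * math.sin(5 * x) + 4
    x, t = 8.0, 10.0
    for _ in range(steps):
        cand = x + rng.uniform(-0.3, 0.3)
        delta = cost(cand) - cost(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = cand
        t = max(t * alpha, 1e-12)  # floor avoids division by zero
    return cost(x)

# Same step budget, different schedules: averaged over a few seeds,
# the fast schedule freezes at a higher-cost local minimum.
fast = sum(anneal(alpha=0.9, seed=s) for s in range(5)) / 5
slow = sum(anneal(alpha=0.9995, seed=s) for s in range(5)) / 5
```

With alpha = 0.9 the temperature is effectively zero within about a hundred steps, long before the walker has left its starting region; with alpha = 0.9995 it stays hot for thousands of steps and can climb over the ripples first.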
Evidence
- Metropolis algorithm (1953) -- the acceptance probability function exp(-deltaE/kT) provably samples the Boltzmann distribution
- Kirkpatrick et al. (1983) -- demonstrated convergence on VLSI circuit layout, traveling salesman, graph partitioning
- Convergence proof -- Hajek (1988) proved simulated annealing converges to global optimum if cooling is logarithmic: T(t) >= d/ln(t)
- Physical metallurgy -- the glass transition is literally the consequence of insufficient annealing
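Hajek's logarithmic bound can be made concrete by inverting it: solving T(t) = d/ln(t) for t shows that reaching temperature T under the guaranteed schedule takes about exp(d/T) steps, where d is the depth of the deepest non-global local minimum. A small sketch (the depth d = 1 is an illustrative value):

```python
import math

# Hajek's sufficient condition: T(t) >= d / ln(t). Inverting gives the
# step count needed to reach a target temperature -- it grows
# exponentially as the target shrinks. d = 1 is an assumed toy depth.
d = 1.0
for target_t in (0.5, 0.1, 0.05):
    steps = math.exp(d / target_t)  # t such that d / ln(t) = target_t
    print(f"T = {target_t}: about {steps:.3g} steps")
# e.g. reaching T = 0.05 already needs roughly 5e8 steps
```

This is why the schedule with a convergence guarantee is almost never the schedule anyone runs, as the first challenge below notes.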
Challenges
- Logarithmic cooling is impractically slow for most real problems -- practitioners use heuristic schedules that sacrifice convergence guarantees for speed
- Modern methods (genetic algorithms, reinforcement learning) often outperform SA on specific problem classes