diff --git a/inbox/archive/2025-12-00-fullstack-alignment-thick-models-value.md b/inbox/archive/2025-12-00-fullstack-alignment-thick-models-value.md
index 400e0293..7d72f7be 100644
--- a/inbox/archive/2025-12-00-fullstack-alignment-thick-models-value.md
+++ b/inbox/archive/2025-12-00-fullstack-alignment-thick-models-value.md
@@ -45,7 +45,7 @@ Published December 2025. Argues that "beneficial societal outcomes cannot be gua
 **KB connections:**
 
 - [[AI alignment is a coordination problem not a technical problem]] — this paper extends our thesis to institutions
-- [[AI development is a critical juncture in institutional history]] — directly relevant
+- AI development is a critical juncture in institutional history — directly relevant
 - [[the alignment problem dissolves when human values are continuously woven into the system rather than specified in advance]] — "thick values" is a formalization of continuous value integration
 
 **Extraction hints:** Claims about (1) alignment requiring institutional co-alignment, (2) thick vs thin models of value, (3) five implementation mechanisms.