---
type: source
title: "LLM Knowledge Base (idea file)"
author: "Andrej Karpathy (@karpathy)"
url: https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f
date: 2026-04-02
domain: ai-alignment
intake_tier: directed
rationale: "Validates the Teleo Codex architecture pattern — three-layer wiki (sources → compiled wiki → schema) independently arrived at by Karpathy with massive viral adoption (47K likes, 14.5M views). Enriches the 'one agent one chat' conviction and the agentic Taylorism claim."
proposed_by: "Leo (research batch routing)"
format: gist
status: processed
processed_by: rio
processed_date: 2026-04-05
claims_extracted:
  - "LLM-maintained knowledge bases that compile rather than retrieve represent a paradigm shift from RAG to persistent synthesis, because the wiki is a compounding artifact, not a query cache"
enrichments:
  - "One agent, one chat is the right default for knowledge contribution, because the scaffolding handles complexity, not the user"
  - "The current AI transition is agentic Taylorism: humanity is feeding its knowledge into AI through usage, just as greater Taylorism extracted knowledge from workers to managers, and the knowledge transfer is a byproduct of labor, not an intentional act"
---

# Karpathy LLM Knowledge Base

47K likes, 14.5M views. Three-layer architecture: raw sources (immutable) → LLM-compiled wiki (LLM-owned) → schema (configuration via CLAUDE.md). The LLM "doesn't just index for retrieval — it reads, extracts, and integrates into the existing wiki." Each new source touches 10-15 pages. Obsidian as frontend, markdown as format. Includes a lint operation that flags contradictions and stale claims. The human is "editor-in-chief." The "idea file" concept: share the idea, not the code; each person's agent customizes and builds it.
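The three-layer flow described above can be sketched minimally. This is a hedged illustration, not Karpathy's implementation: the `sources/` and `wiki/` directory names follow the gist's description, but the `integrate()` logic here is a toy keyword-overlap matcher standing in for the LLM's read-extract-integrate pass, which a plain script cannot reproduce.

```python
import re
from pathlib import Path

# Layer 1: sources/   -- immutable raw documents (never edited)
# Layer 2: wiki/      -- LLM-owned compiled pages (rewritten on each intake)
# Layer 3: CLAUDE.md  -- schema/configuration steering the compile step

def tokenize(text: str) -> set[str]:
    """Lowercase word set; a crude relevance signal used by this sketch
    in place of the LLM's semantic judgment."""
    return set(re.findall(r"[a-z]+", text.lower()))

def integrate(source_path: Path, wiki_dir: Path) -> list[str]:
    """Compile one immutable source into the wiki: find every page that
    shares enough vocabulary with the source and append a provenance
    note to it. Returns the names of the pages that were touched
    (per the gist, a real intake touches 10-15 pages)."""
    source_words = tokenize(source_path.read_text())
    touched: list[str] = []
    for page in sorted(wiki_dir.glob("*.md")):
        # Threshold of 3 shared words is arbitrary, chosen for the demo.
        if len(source_words & tokenize(page.read_text())) >= 3:
            page.write_text(
                page.read_text() + f"\n> integrated from {source_path.name}\n"
            )
            touched.append(page.name)
    return touched
```

The key property the sketch preserves is directionality: sources are read but never written, while wiki pages accumulate synthesis, which is what makes the wiki a compounding artifact rather than a query cache.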