---
type: source
title: "LLM Knowledge Base (idea file)"
author: "Andrej Karpathy (@karpathy)"
url: https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f
date: 2026-04-02
domain: ai-alignment
intake_tier: directed
rationale: "Validates the Teleo Codex architecture pattern — three-layer wiki (sources → compiled wiki → schema) independently arrived at by Karpathy with massive viral adoption (47K likes, 14.5M views). Enriches 'one agent one chat' conviction and agentic taylorism claim."
proposed_by: "Leo (research batch routing)"
format: gist
status: processed
processed_by: rio
processed_date: 2026-04-05
claims_extracted:
  - "LLM-maintained knowledge bases that compile rather than retrieve represent a paradigm shift from RAG to persistent synthesis because the wiki is a compounding artifact not a query cache"
enrichments:
  - "one agent one chat is the right default for knowledge contribution because the scaffolding handles complexity not the user"
  - "The current AI transition is agentic Taylorism — humanity is feeding its knowledge into AI through usage just as greater Taylorism extracted knowledge from workers to managers and the knowledge transfer is a byproduct of labor not an intentional act"
---

# Karpathy LLM Knowledge Base
47K likes, 14.5M views. Three-layer architecture: raw sources (immutable) → LLM-compiled wiki (LLM-owned) → schema (configuration via CLAUDE.md). The LLM "doesn't just index for retrieval — it reads, extracts, and integrates into the existing wiki." Each new source touches 10-15 pages. Obsidian as frontend, markdown as format. Includes lint operation for contradictions and stale claims. Human is "editor-in-chief." The "idea file" concept: share the idea not the code, each person's agent customizes and builds it.
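
The three layers above map naturally onto a repo layout. A minimal sketch, assuming a generic vault — all file and directory names here are hypothetical except `CLAUDE.md`, which the source names as the schema/configuration layer:

```text
kb/
├── CLAUDE.md                          # schema layer: rules the LLM follows when compiling
├── sources/                           # layer 1: raw sources, immutable once added
│   └── 2026-04-02-karpathy-gist.md
└── wiki/                              # layer 2: LLM-compiled pages, LLM-owned
    ├── rag-vs-compilation.md          # each new source may touch 10-15 such pages
    └── agent-harnesses.md
```

Because the wiki layer is plain markdown, Obsidian can serve as the human-facing frontend while the LLM owns the writes, and the lint operation is just another pass over `wiki/` checking for contradictions and stale claims.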