From ec1da89f1fe115b895dc91851e4ffb635028127d Mon Sep 17 00:00:00 2001 From: m3taversal Date: Mon, 9 Mar 2026 19:10:24 +0000 Subject: [PATCH 1/6] Auto: docs/ingestion-daemon-onboarding.md | 1 file changed, 227 insertions(+) --- docs/ingestion-daemon-onboarding.md | 227 ++++++++++++++++++++++++++++ 1 file changed, 227 insertions(+) create mode 100644 docs/ingestion-daemon-onboarding.md diff --git a/docs/ingestion-daemon-onboarding.md b/docs/ingestion-daemon-onboarding.md new file mode 100644 index 0000000..713d039 --- /dev/null +++ b/docs/ingestion-daemon-onboarding.md @@ -0,0 +1,227 @@ +# Ingestion Daemon Onboarding + +How to build an ingestion daemon for the Teleo collective knowledge base. This doc covers the **futardio daemon** as the first example, but the pattern generalizes to any data source (X feeds, RSS, on-chain data, arxiv, etc.). + +## Architecture + +``` +Data source (futard.io, X, RSS, on-chain...) + ↓ +Ingestion daemon (your script, runs on VPS cron) + ↓ +inbox/archive/*.md (source archive files with YAML frontmatter) + ↓ +Git branch → push → PR on Forgejo + ↓ +Webhook triggers headless domain agent (extraction) + ↓ +Agent opens claims PR → eval pipeline reviews → merge +``` + +**Your daemon is responsible for steps 1-4 only.** You pull data, format it, and push it. Agents handle everything downstream. + +## What the daemon produces + +One markdown file per source item in `inbox/archive/`. Each file has YAML frontmatter + body content. 
+ +### Filename convention + +``` +YYYY-MM-DD-{author-or-source-handle}-{brief-slug}.md +``` + +Examples: +- `2026-03-09-futardio-project-launch-solforge.md` +- `2026-03-09-metaproph3t-futarchy-governance-update.md` +- `2026-03-09-pineanalytics-futardio-launch-metrics.md` + +### Frontmatter (required fields) + +```yaml +--- +type: source +title: "Human-readable title of the source" +author: "Author name (@handle if applicable)" +url: "https://original-url.com" +date: 2026-03-09 +domain: internet-finance +format: report | essay | tweet | thread | whitepaper | paper | news | data +status: unprocessed +tags: [futarchy, metadao, futardio, solana, permissionless-launches] +--- +``` + +### Frontmatter (optional fields) + +```yaml +linked_set: "futardio-launches-march-2026" # Group related items +cross_domain_flags: [ai-alignment, mechanisms] # Flag other relevant domains +extraction_hints: "Focus on governance mechanism data" +priority: low | medium | high # Signal urgency to agents +contributor: "Ben Harper" # Who ran the daemon +``` + +### Body + +Full content text after the frontmatter. This is what agents read to extract claims. Include everything — agents need the raw material. + +```markdown +## Summary +[Brief description of what this source contains] + +## Content +[Full text, data, or structured content from the source] + +## Context +[Optional: why this matters, what it connects to] +``` + +**Important:** The body is reference material, not argumentative. Don't write claims — just stage the raw content faithfully. Agents handle interpretation. 
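
The filename convention above is mechanical enough to generate in the daemon. A minimal sketch (illustrative only; `slugify` and `source_filename` are not part of the repo):

```python
import re
from datetime import date

def slugify(text: str, max_words: int = 6) -> str:
    """Lowercase, drop punctuation, join the first few words with hyphens."""
    words = re.sub(r"[^a-z0-9\s-]", "", text.lower()).split()
    return "-".join(words[:max_words])

def source_filename(d: date, handle: str, title: str) -> str:
    """Build the YYYY-MM-DD-{author-or-source-handle}-{brief-slug}.md name."""
    return f"{d.isoformat()}-{slugify(handle)}-{slugify(title)}.md"

print(source_filename(date(2026, 3, 9), "futardio", "Project launch: SolForge"))
# → 2026-03-09-futardio-project-launch-solforge.md
```

Deterministic filenames also give you a cheap first-pass dedup check: if the file already exists in `inbox/archive/`, the item has been staged before.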
+ +### Valid domains + +Route each source to the primary domain that should process it: + +| Domain | Agent | What goes here | +|--------|-------|----------------| +| `internet-finance` | Rio | Futarchy, MetaDAO, tokens, DeFi, capital formation | +| `entertainment` | Clay | Creator economy, IP, media, gaming, cultural dynamics | +| `ai-alignment` | Theseus | AI safety, capability, alignment, multi-agent, governance | +| `health` | Vida | Healthcare, biotech, longevity, wellness, diagnostics | +| `space-development` | Astra | Launch, orbital, cislunar, governance, manufacturing | +| `grand-strategy` | Leo | Cross-domain, macro, geopolitics, coordination | + +If a source touches multiple domains, pick the primary and list others in `cross_domain_flags`. + +## Git workflow + +### Branch convention + +``` +ingestion/{daemon-name}-{timestamp} +``` + +Example: `ingestion/futardio-20260309-1700` + +### Commit format + +``` +ingestion: {N} sources from {daemon-name} batch {timestamp} + +- Sources: [brief list] +- Domains: [which domains routed to] + +Pentagon-Agent: {daemon-name} <{daemon-uuid-if-applicable}> +``` + +### PR creation + +```bash +git checkout -b ingestion/futardio-$(date +%Y%m%d-%H%M) +git add inbox/archive/*.md +git commit -m "ingestion: N sources from futardio batch $(date +%Y%m%d-%H%M)" +git push -u origin HEAD +# Open PR on Forgejo +curl -X POST "https://git.livingip.xyz/api/v1/repos/teleo/teleo-codex/pulls" \ + -H "Authorization: token YOUR_TOKEN" \ + -H "Content-Type: application/json" \ + -d '{ + "title": "ingestion: N sources from futardio batch TIMESTAMP", + "body": "## Batch summary\n- N source files\n- Domain: internet-finance\n- Source: futard.io\n\nAutomated ingestion daemon.", + "head": "ingestion/futardio-TIMESTAMP", + "base": "main" + }' +``` + +After PR is created, the Forgejo webhook triggers the eval pipeline which routes to the appropriate domain agent for extraction. 
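
For a daemon written in Python rather than shell, the same PR call can be made with the standard library. A sketch against the Forgejo endpoint shown above; `build_pr_payload` and `open_pr` are illustrative names, not an existing API:

```python
import json
import urllib.request

def build_pr_payload(branch: str, n_sources: int, domain: str, source: str) -> dict:
    """Build the JSON body matching the curl template above.
    branch is expected to look like 'ingestion/{source}-{timestamp}'."""
    timestamp = branch.split(f"{source}-", 1)[-1]
    return {
        "title": f"ingestion: {n_sources} sources from {source} batch {timestamp}",
        "body": (
            f"## Batch summary\n- {n_sources} source files\n"
            f"- Domain: {domain}\n- Source: {source}\n\nAutomated ingestion daemon."
        ),
        "head": branch,
        "base": "main",
    }

def open_pr(token: str, payload: dict) -> int:
    """POST to the Forgejo pulls endpoint; returns the new PR number."""
    req = urllib.request.Request(
        "https://git.livingip.xyz/api/v1/repos/teleo/teleo-codex/pulls",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"token {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["number"]
```

Building the payload in code avoids the placeholder-substitution step the curl version needs.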
+ +## Futardio Daemon — Specific Implementation + +### What to pull + +futard.io is a permissionless launchpad on Solana (MetaDAO ecosystem). Key data: + +1. **New project launches** — name, description, funding target, FDV, status (LIVE/REFUNDING/COMPLETE) +2. **Funding progress** — committed amounts, funder counts, threshold status +3. **Transaction feed** — individual contributions with amounts and timestamps +4. **Platform metrics** — total committed ($17.8M+), total funders (1k+), active launches (44+) + +### Poll interval + +Every 15 minutes. futard.io data changes frequently (live fundraising), but most changes are incremental transaction data. New project launches are the high-signal events. + +### Deduplication + +Before creating a source file, check: +1. **Filename dedup** — does `inbox/archive/` already have a file for this source? +2. **Content dedup** — SQLite staging table with `source_id` unique constraint +3. **Significance filter** — skip trivial transaction updates; archive meaningful state changes (new launch, funding threshold reached, refund triggered) + +### Example output + +```markdown +--- +type: source +title: "Futardio launch: SolForge reaches 80% funding threshold" +author: "futard.io" +url: "https://futard.io/launches/solforge" +date: 2026-03-09 +domain: internet-finance +format: data +status: unprocessed +tags: [futardio, metadao, solana, permissionless-launches, capital-formation] +linked_set: futardio-launches-march-2026 +priority: medium +contributor: "Ben Harper (ingestion daemon)" +--- + +## Summary +SolForge project on futard.io reached 80% of its funding threshold, with $X committed from N funders. 
+ +## Content +- Project: SolForge +- Description: [from futard.io listing] +- FDV: [value] +- Funding committed: [amount] / [target] ([percentage]%) +- Funder count: [N] +- Status: LIVE +- Launch date: 2026-03-09 +- Key milestones: [any threshold events] + +## Context +Part of the futard.io permissionless launch platform (MetaDAO ecosystem). Relevant to existing claims on permissionless capital formation and futarchy-governed launches. +``` + +## Generalizing to other daemons + +The pattern is identical for any data source. Only these things change: + +| Parameter | Futardio | X feeds | RSS | On-chain | +|-----------|----------|---------|-----|----------| +| Data source | futard.io web/API | twitterapi.io | feedparser | Solana RPC | +| Poll interval | 15 min | 15-30 min | 15 min | 5 min | +| Domain routing | internet-finance | per-account | per-feed | internet-finance | +| Dedup key | launch ID | tweet ID | article URL | tx signature | +| Format field | data | tweet/thread | essay/news | data | +| Significance filter | new launch, threshold event | engagement threshold | always archive | governance events | + +The output format (source archive markdown) and git workflow (branch → PR → webhook) are always the same. 
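
The table above is, in effect, the only per-daemon configuration. One way to sketch it as data, so the shared archive/git code never changes (keys and names are illustrative):

```python
# Per-daemon parameters from the table above; everything else is shared code.
DAEMON_PARAMS = {
    "futardio": {"poll_minutes": 15, "domain": "internet-finance", "dedup_key": "launch_id",    "format": "data"},
    "x-feeds":  {"poll_minutes": 30, "domain": "per-account",      "dedup_key": "tweet_id",     "format": "tweet"},
    "rss":      {"poll_minutes": 15, "domain": "per-feed",         "dedup_key": "article_url",  "format": "news"},
    "on-chain": {"poll_minutes": 5,  "domain": "internet-finance", "dedup_key": "tx_signature", "format": "data"},
}

def poll_due(name: str, minutes_since_last: int) -> bool:
    """Whether a daemon's poll interval has elapsed."""
    return minutes_since_last >= DAEMON_PARAMS[name]["poll_minutes"]
```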
+ +## Setup checklist + +- [ ] Forgejo account with API token (write access to teleo-codex) +- [ ] SSH key or HTTPS token for git push +- [ ] SQLite database for dedup staging +- [ ] Cron job on VPS (every 15 min) +- [ ] Test: create one source file manually, push, verify PR triggers eval pipeline + +## Files to read + +| File | What it tells you | +|------|-------------------| +| `schemas/source.md` | Canonical source archive schema | +| `schemas/claim.md` | What agents produce from your sources (downstream) | +| `skills/extract.md` | The extraction process agents run on your files | +| `CONTRIBUTING.md` | Human contributor workflow (similar pattern) | +| `CLAUDE.md` | Full collective operating manual | +| `inbox/archive/*.md` | Real examples of archived sources | -- 2.45.2 From 5db0c660b27ec1077c46e8bb74e751cddd4579d1 Mon Sep 17 00:00:00 2001 From: m3taversal Date: Mon, 9 Mar 2026 19:12:22 +0000 Subject: [PATCH 2/6] Auto: docs/ingestion-daemon-onboarding.md | 1 file changed, 203 insertions(+), 77 deletions(-) --- docs/ingestion-daemon-onboarding.md | 282 ++++++++++++++++++++-------- 1 file changed, 204 insertions(+), 78 deletions(-) diff --git a/docs/ingestion-daemon-onboarding.md b/docs/ingestion-daemon-onboarding.md index 713d039..fea52e2 100644 --- a/docs/ingestion-daemon-onboarding.md +++ b/docs/ingestion-daemon-onboarding.md @@ -1,24 +1,103 @@ # Ingestion Daemon Onboarding -How to build an ingestion daemon for the Teleo collective knowledge base. This doc covers the **futardio daemon** as the first example, but the pattern generalizes to any data source (X feeds, RSS, on-chain data, arxiv, etc.). +How to build the Teleo ingestion daemon — a single service with pluggable source adapters that feeds the collective knowledge base. ## Architecture ``` -Data source (futard.io, X, RSS, on-chain...) 
- ↓
-Ingestion daemon (your script, runs on VPS cron)
- ↓
-inbox/archive/*.md (source archive files with YAML frontmatter)
- ↓
-Git branch → push → PR on Forgejo
- ↓
-Webhook triggers headless domain agent (extraction)
- ↓
-Agent opens claims PR → eval pipeline reviews → merge
+┌─────────────────────────────────────────────────────┐
+│ Ingestion Daemon (1 service)                        │
+│                                                     │
+│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐ │
+│ │ futardio │ │  x-feed  │ │   rss    │ │ onchain  │ │
+│ │ adapter  │ │ adapter  │ │ adapter  │ │ adapter  │ │
+│ └────┬─────┘ └────┬─────┘ └────┬─────┘ └────┬─────┘ │
+│      └────────────┴─────┬──────┴────────────┘       │
+│                         ▼                           │
+│             ┌────────────────────────┐              │
+│             │ Shared pipeline:       │              │
+│             │ dedup → format → git   │              │
+│             └───────────┬────────────┘              │
+└─────────────────────────┼───────────────────────────┘
+                          ▼
+    inbox/archive/*.md on Forgejo branch
+                          ▼
+    PR opened on Forgejo
+                          ▼
+    Webhook → headless domain agent (extraction)
+                          ▼
+    Agent opens claims PR → eval pipeline → merge
 ```

-**Your daemon is responsible for steps 1-4 only.** You pull data, format it, and push it. Agents handle everything downstream.
+**The daemon handles ingestion only.** It pulls data, deduplicates, formats as source archive markdown, and opens PRs. Agents handle everything downstream (extraction, claim writing, evaluation, merge).
+
+## Single daemon, pluggable adapters
+
+One codebase, one container, one scheduler. Each data source is an adapter — a function that knows how to pull and normalize content from one source. The shared pipeline handles dedup, formatting, git workflow, and PR creation identically for every adapter.
+ +### Configuration + +```yaml +# ingestion-config.yaml + +daemon: + dedup_db: /data/ingestion.db # Shared SQLite for dedup + repo_dir: /workspace/teleo-codex # Local clone + forgejo_url: https://git.livingip.xyz + forgejo_token: ${FORGEJO_TOKEN} # From env/secrets + batch_branch_prefix: ingestion + +sources: + futardio: + adapter: futardio + interval: 15m + domain: internet-finance + significance_filter: true # Only new launches, threshold events, refunds + tags: [futardio, metadao, solana, permissionless-launches] + + x-ai: + adapter: twitter + interval: 30m + domain: ai-alignment + network: theseus-network.json # Account list + tiers + api: twitterapi.io + engagement_threshold: 50 # Min likes/RTs to archive + + x-finance: + adapter: twitter + interval: 30m + domain: internet-finance + network: rio-network.json + api: twitterapi.io + engagement_threshold: 50 + + rss: + adapter: rss + interval: 15m + feeds: + - url: https://noahpinion.substack.com/feed + domain: grand-strategy + - url: https://citriniresearch.substack.com/feed + domain: internet-finance + # Add feeds here — no code changes needed + + onchain: + adapter: solana + interval: 5m + domain: internet-finance + programs: + - metadao_autocrat # Futarchy governance events + - metadao_conditional_vault # Conditional token markets + significance_filter: true # Only governance events, not routine txs +``` + +### Adding a new source + +1. Write an adapter function: `pull_{source}(config) → list[SourceItem]` +2. Add an entry to `ingestion-config.yaml` +3. Restart daemon (or it hot-reloads config) + +No changes to the pipeline, git workflow, or PR creation. The adapter is the only custom part. 
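
A minimal sketch of that adapter contract, assuming Python and a decorator-based registry (`SourceItem`, `ADAPTERS`, and `run_source` are illustrative names, not a fixed API):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SourceItem:
    """Normalized item every adapter returns; field names are illustrative."""
    source_id: str  # dedup key: launch ID, tweet ID, article URL, tx signature
    title: str
    url: str
    domain: str
    format: str
    body: str

# Registry mapping the config's "adapter" value to a pull function.
ADAPTERS: dict[str, Callable[[dict], list[SourceItem]]] = {}

def adapter(name: str):
    """Decorator so a new adapter registers itself (step 1 of the list above)."""
    def register(fn: Callable[[dict], list[SourceItem]]):
        ADAPTERS[name] = fn
        return fn
    return register

@adapter("futardio")
def pull_futardio(config: dict) -> list[SourceItem]:
    # A real implementation would poll futard.io here; stubbed for the sketch.
    return []

def run_source(source_cfg: dict) -> list[SourceItem]:
    """Dispatch one configured source to its adapter, as the scheduler would."""
    return ADAPTERS[source_cfg["adapter"]](source_cfg)
```

With this shape, step 2 of the list above is just another `sources:` entry whose `adapter:` value names a registered function.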
## What the daemon produces @@ -58,7 +137,7 @@ linked_set: "futardio-launches-march-2026" # Group related items cross_domain_flags: [ai-alignment, mechanisms] # Flag other relevant domains extraction_hints: "Focus on governance mechanism data" priority: low | medium | high # Signal urgency to agents -contributor: "Ben Harper" # Who ran the daemon +contributor: "ingestion-daemon" # Attribution ``` ### Body @@ -93,76 +172,95 @@ Route each source to the primary domain that should process it: If a source touches multiple domains, pick the primary and list others in `cross_domain_flags`. -## Git workflow +## Shared pipeline -### Branch convention +### Deduplication (SQLite) -``` -ingestion/{daemon-name}-{timestamp} +Every source item passes through dedup before archiving: + +```sql +CREATE TABLE staged ( + source_type TEXT, -- 'futardio', 'twitter', 'rss', 'solana' + source_id TEXT UNIQUE, -- Launch ID, tweet ID, article URL, tx sig + url TEXT, + title TEXT, + author TEXT, + content TEXT, + domain TEXT, + published_date TEXT, + staged_at TEXT DEFAULT CURRENT_TIMESTAMP +); ``` -Example: `ingestion/futardio-20260309-1700` +Dedup key varies by adapter: +| Adapter | Dedup key | +|---------|-----------| +| futardio | launch ID | +| twitter | tweet ID | +| rss | article URL | +| solana | tx signature | -### Commit format +### Git workflow -``` -ingestion: {N} sources from {daemon-name} batch {timestamp} - -- Sources: [brief list] -- Domains: [which domains routed to] - -Pentagon-Agent: {daemon-name} <{daemon-uuid-if-applicable}> -``` - -### PR creation +All adapters share the same git workflow: ```bash -git checkout -b ingestion/futardio-$(date +%Y%m%d-%H%M) +# 1. Branch +git checkout -b ingestion/{source}-$(date +%Y%m%d-%H%M) + +# 2. Stage files git add inbox/archive/*.md -git commit -m "ingestion: N sources from futardio batch $(date +%Y%m%d-%H%M)" + +# 3. 
Commit +git commit -m "ingestion: N sources from {source} batch $(date +%Y%m%d-%H%M) + +- Sources: [brief list] +- Domains: [which domains routed to]" + +# 4. Push git push -u origin HEAD -# Open PR on Forgejo + +# 5. Open PR on Forgejo curl -X POST "https://git.livingip.xyz/api/v1/repos/teleo/teleo-codex/pulls" \ - -H "Authorization: token YOUR_TOKEN" \ + -H "Authorization: token $FORGEJO_TOKEN" \ -H "Content-Type: application/json" \ -d '{ - "title": "ingestion: N sources from futardio batch TIMESTAMP", - "body": "## Batch summary\n- N source files\n- Domain: internet-finance\n- Source: futard.io\n\nAutomated ingestion daemon.", - "head": "ingestion/futardio-TIMESTAMP", + "title": "ingestion: N sources from {source} batch TIMESTAMP", + "body": "## Batch summary\n- N source files\n- Domain: {domain}\n- Source: {source}\n\nAutomated ingestion daemon.", + "head": "ingestion/{source}-TIMESTAMP", "base": "main" }' ``` -After PR is created, the Forgejo webhook triggers the eval pipeline which routes to the appropriate domain agent for extraction. +After PR creation, the Forgejo webhook triggers the eval pipeline which routes to the appropriate domain agent for extraction. -## Futardio Daemon — Specific Implementation +### Batching -### What to pull +Sources are batched per adapter per run. If the futardio adapter finds 3 new launches in one poll cycle, all 3 go in one branch/PR. If it finds 0, no branch is created. This keeps PR volume manageable for the review pipeline. -futard.io is a permissionless launchpad on Solana (MetaDAO ecosystem). Key data: +## Adapter specifications -1. **New project launches** — name, description, funding target, FDV, status (LIVE/REFUNDING/COMPLETE) -2. **Funding progress** — committed amounts, funder counts, threshold status -3. **Transaction feed** — individual contributions with amounts and timestamps -4. 
**Platform metrics** — total committed ($17.8M+), total funders (1k+), active launches (44+) +### futardio adapter -### Poll interval +**Source:** futard.io — permissionless launchpad on Solana (MetaDAO ecosystem) -Every 15 minutes. futard.io data changes frequently (live fundraising), but most changes are incremental transaction data. New project launches are the high-signal events. +**What to pull:** +1. New project launches — name, description, funding target, FDV, status +2. Funding threshold events — project reaches funding threshold, triggers refund +3. Platform metrics snapshots — total committed, funder count, active launches -### Deduplication +**Significance filter:** Skip routine transaction updates. Archive only: +- New launch listed +- Funding threshold reached (project funded) +- Refund triggered +- Platform milestone (e.g., total committed crosses round number) -Before creating a source file, check: -1. **Filename dedup** — does `inbox/archive/` already have a file for this source? -2. **Content dedup** — SQLite staging table with `source_id` unique constraint -3. **Significance filter** — skip trivial transaction updates; archive meaningful state changes (new launch, funding threshold reached, refund triggered) - -### Example output +**Example output:** ```markdown --- type: source -title: "Futardio launch: SolForge reaches 80% funding threshold" +title: "Futardio launch: SolForge reaches funding threshold" author: "futard.io" url: "https://futard.io/launches/solforge" date: 2026-03-09 @@ -172,48 +270,64 @@ status: unprocessed tags: [futardio, metadao, solana, permissionless-launches, capital-formation] linked_set: futardio-launches-march-2026 priority: medium -contributor: "Ben Harper (ingestion daemon)" +contributor: "ingestion-daemon" --- ## Summary -SolForge project on futard.io reached 80% of its funding threshold, with $X committed from N funders. +SolForge reached its funding threshold on futard.io with $X committed from N funders. 
## Content - Project: SolForge -- Description: [from futard.io listing] +- Description: [from listing] - FDV: [value] -- Funding committed: [amount] / [target] ([percentage]%) -- Funder count: [N] -- Status: LIVE +- Funding: [amount] / [target] ([percentage]%) +- Funders: [N] +- Status: COMPLETE - Launch date: 2026-03-09 -- Key milestones: [any threshold events] +- Use of funds: [from listing] ## Context -Part of the futard.io permissionless launch platform (MetaDAO ecosystem). Relevant to existing claims on permissionless capital formation and futarchy-governed launches. +Part of the futard.io permissionless launch platform (MetaDAO ecosystem). ``` -## Generalizing to other daemons +### twitter adapter -The pattern is identical for any data source. Only these things change: +**Source:** X/Twitter via twitterapi.io -| Parameter | Futardio | X feeds | RSS | On-chain | -|-----------|----------|---------|-----|----------| -| Data source | futard.io web/API | twitterapi.io | feedparser | Solana RPC | -| Poll interval | 15 min | 15-30 min | 15 min | 5 min | -| Domain routing | internet-finance | per-account | per-feed | internet-finance | -| Dedup key | launch ID | tweet ID | article URL | tx signature | -| Format field | data | tweet/thread | essay/news | data | -| Significance filter | new launch, threshold event | engagement threshold | always archive | governance events | +**Config:** Takes a network JSON file (e.g., `theseus-network.json`, `rio-network.json`) that defines accounts and tiers. -The output format (source archive markdown) and git workflow (branch → PR → webhook) are always the same. +**What to pull:** Recent tweets from network accounts, filtered by engagement threshold. + +**Dedup:** Tweet ID. Skip retweets without commentary. Quote tweets are separate items. + +### rss adapter + +**Source:** RSS/Atom feeds via feedparser + +**Config:** List of feed URLs with domain routing. + +**What to pull:** New articles since last poll. 
Full text via Crawl4AI (JS-rendered) or trafilatura (fallback). + +**Dedup:** Article URL. + +### solana adapter + +**Source:** Solana RPC / program event logs + +**Config:** List of program addresses to monitor. + +**What to pull:** Governance events (new proposals, vote results, treasury operations). Not routine transfers. + +**Significance filter:** Only events that change governance state. ## Setup checklist - [ ] Forgejo account with API token (write access to teleo-codex) -- [ ] SSH key or HTTPS token for git push -- [ ] SQLite database for dedup staging -- [ ] Cron job on VPS (every 15 min) -- [ ] Test: create one source file manually, push, verify PR triggers eval pipeline +- [ ] SSH key or HTTPS token for git push to Forgejo +- [ ] SQLite database file for dedup staging +- [ ] `ingestion-config.yaml` with source definitions +- [ ] Cron or systemd timer on VPS +- [ ] Test: single adapter → one source file → push → PR → verify webhook triggers eval ## Files to read @@ -225,3 +339,15 @@ The output format (source archive markdown) and git workflow (branch → PR → | `CONTRIBUTING.md` | Human contributor workflow (similar pattern) | | `CLAUDE.md` | Full collective operating manual | | `inbox/archive/*.md` | Real examples of archived sources | + +## Cost model + +| Component | Cost | +|-----------|------| +| VPS (Hetzner CAX31) | ~$15/mo | +| X API (twitterapi.io) | ~$100/mo | +| Daemon compute | Negligible (polling + formatting) | +| Agent extraction (downstream) | Covered by Claude Max subscription on VPS | +| Total ingestion | ~$115/mo fixed | + +The expensive part (LLM calls for extraction and evaluation) happens downstream in the agent pipeline, not in the daemon. The daemon itself is cheap — it's just HTTP requests, text formatting, and git operations. 
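
The `staged` table's `UNIQUE` constraint can do the dedup work directly via `INSERT OR IGNORE`. A sketch using Python's built-in `sqlite3` (`stage_if_new` is an illustrative name; the schema mirrors the table defined earlier):

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS staged (
    source_type TEXT,
    source_id TEXT UNIQUE,
    url TEXT, title TEXT, author TEXT, content TEXT,
    domain TEXT, published_date TEXT,
    staged_at TEXT DEFAULT CURRENT_TIMESTAMP
)
"""

def stage_if_new(db: sqlite3.Connection, source_type: str,
                 source_id: str, **fields) -> bool:
    """Returns True if the item was new (and staged), False if already seen."""
    cur = db.execute(
        "INSERT OR IGNORE INTO staged (source_type, source_id, url, title, "
        "author, content, domain, published_date) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
        (source_type, source_id, fields.get("url"), fields.get("title"),
         fields.get("author"), fields.get("content"), fields.get("domain"),
         fields.get("published_date")),
    )
    db.commit()
    # rowcount is 1 when the row was inserted, 0 when the UNIQUE key collided.
    return cur.rowcount == 1
```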
-- 2.45.2 From 0dc9a68586b8d7818580afc314188996a7c660e2 Mon Sep 17 00:00:00 2001 From: m3taversal Date: Mon, 9 Mar 2026 19:18:35 +0000 Subject: [PATCH 3/6] Auto: docs/ingestion-daemon-onboarding.md | 1 file changed, 144 insertions(+), 269 deletions(-) --- docs/ingestion-daemon-onboarding.md | 475 ++++++++++------------------ 1 file changed, 175 insertions(+), 300 deletions(-) diff --git a/docs/ingestion-daemon-onboarding.md b/docs/ingestion-daemon-onboarding.md index fea52e2..48b5fc2 100644 --- a/docs/ingestion-daemon-onboarding.md +++ b/docs/ingestion-daemon-onboarding.md @@ -1,353 +1,228 @@ -# Ingestion Daemon Onboarding +# Futarchy Ingestion Daemon -How to build the Teleo ingestion daemon — a single service with pluggable source adapters that feeds the collective knowledge base. +A daemon that monitors futard.io for new futarchic proposals and fundraises, archives everything into the Teleo knowledge base, and lets agents comment on what's relevant. + +## Scope + +Two data sources, one daemon: +1. **Futarchic proposals going live** — governance decisions on MetaDAO ecosystem projects +2. **New fundraises going live on futard.io** — permissionless launches (ownership coin ICOs) + +**Archive everything.** No filtering at the daemon level. Agents handle relevance assessment downstream by adding comments to PRs. 
## Architecture ``` -┌─────────────────────────────────────────────┐ -│ Ingestion Daemon (1 service) │ -│ │ -│ ┌──────────┐ ┌────────┐ ┌──────┐ ┌──────┐ │ -│ │ futardio │ │ x-feed │ │ rss │ │onchain│ │ -│ │ adapter │ │ adapter│ │adapter│ │adapter│ │ -│ └────┬─────┘ └───┬────┘ └──┬───┘ └──┬───┘ │ -│ └────────┬───┴────┬────┘ │ │ -│ ▼ ▼ ▼ │ -│ ┌─────────────────────────┐ │ -│ │ Shared pipeline: │ │ -│ │ dedup → format → git │ │ -│ └───────────┬─────────────┘ │ -└─────────────────────┼───────────────────────┘ - ▼ - inbox/archive/*.md on Forgejo branch - ▼ - PR opened on Forgejo - ▼ - Webhook → headless domain agent (extraction) - ▼ - Agent claims PR → eval pipeline → merge +futard.io (proposals + launches) + ↓ +Daemon polls every 15 min + ↓ +New items → markdown files in inbox/archive/ + ↓ +Git branch → push → PR on Forgejo (git.livingip.xyz) + ↓ +Webhook triggers headless agents + ↓ +Agents review, comment on relevance, extract claims if warranted ``` -**The daemon handles ingestion only.** It pulls data, deduplicates, formats as source archive markdown, and opens PRs. Agents handle everything downstream (extraction, claim writing, evaluation, merge). - -## Single daemon, pluggable adapters - -One codebase, one container, one scheduler. Each data source is an adapter — a function that knows how to pull and normalize content from one source. The shared pipeline handles dedup, formatting, git workflow, and PR creation identically for every adapter. 
- -### Configuration - -```yaml -# ingestion-config.yaml - -daemon: - dedup_db: /data/ingestion.db # Shared SQLite for dedup - repo_dir: /workspace/teleo-codex # Local clone - forgejo_url: https://git.livingip.xyz - forgejo_token: ${FORGEJO_TOKEN} # From env/secrets - batch_branch_prefix: ingestion - -sources: - futardio: - adapter: futardio - interval: 15m - domain: internet-finance - significance_filter: true # Only new launches, threshold events, refunds - tags: [futardio, metadao, solana, permissionless-launches] - - x-ai: - adapter: twitter - interval: 30m - domain: ai-alignment - network: theseus-network.json # Account list + tiers - api: twitterapi.io - engagement_threshold: 50 # Min likes/RTs to archive - - x-finance: - adapter: twitter - interval: 30m - domain: internet-finance - network: rio-network.json - api: twitterapi.io - engagement_threshold: 50 - - rss: - adapter: rss - interval: 15m - feeds: - - url: https://noahpinion.substack.com/feed - domain: grand-strategy - - url: https://citriniresearch.substack.com/feed - domain: internet-finance - # Add feeds here — no code changes needed - - onchain: - adapter: solana - interval: 5m - domain: internet-finance - programs: - - metadao_autocrat # Futarchy governance events - - metadao_conditional_vault # Conditional token markets - significance_filter: true # Only governance events, not routine txs -``` - -### Adding a new source - -1. Write an adapter function: `pull_{source}(config) → list[SourceItem]` -2. Add an entry to `ingestion-config.yaml` -3. Restart daemon (or it hot-reloads config) - -No changes to the pipeline, git workflow, or PR creation. The adapter is the only custom part. - ## What the daemon produces -One markdown file per source item in `inbox/archive/`. Each file has YAML frontmatter + body content. +One markdown file per event in `inbox/archive/`. 
### Filename convention ``` -YYYY-MM-DD-{author-or-source-handle}-{brief-slug}.md +YYYY-MM-DD-futardio-{event-type}-{project-slug}.md ``` Examples: -- `2026-03-09-futardio-project-launch-solforge.md` -- `2026-03-09-metaproph3t-futarchy-governance-update.md` -- `2026-03-09-pineanalytics-futardio-launch-metrics.md` +- `2026-03-09-futardio-launch-solforge.md` +- `2026-03-09-futardio-proposal-ranger-liquidation.md` -### Frontmatter (required fields) +### Frontmatter ```yaml --- type: source -title: "Human-readable title of the source" -author: "Author name (@handle if applicable)" -url: "https://original-url.com" -date: 2026-03-09 -domain: internet-finance -format: report | essay | tweet | thread | whitepaper | paper | news | data -status: unprocessed -tags: [futarchy, metadao, futardio, solana, permissionless-launches] ---- -``` - -### Frontmatter (optional fields) - -```yaml -linked_set: "futardio-launches-march-2026" # Group related items -cross_domain_flags: [ai-alignment, mechanisms] # Flag other relevant domains -extraction_hints: "Focus on governance mechanism data" -priority: low | medium | high # Signal urgency to agents -contributor: "ingestion-daemon" # Attribution -``` - -### Body - -Full content text after the frontmatter. This is what agents read to extract claims. Include everything — agents need the raw material. - -```markdown -## Summary -[Brief description of what this source contains] - -## Content -[Full text, data, or structured content from the source] - -## Context -[Optional: why this matters, what it connects to] -``` - -**Important:** The body is reference material, not argumentative. Don't write claims — just stage the raw content faithfully. Agents handle interpretation. 
- -### Valid domains - -Route each source to the primary domain that should process it: - -| Domain | Agent | What goes here | -|--------|-------|----------------| -| `internet-finance` | Rio | Futarchy, MetaDAO, tokens, DeFi, capital formation | -| `entertainment` | Clay | Creator economy, IP, media, gaming, cultural dynamics | -| `ai-alignment` | Theseus | AI safety, capability, alignment, multi-agent, governance | -| `health` | Vida | Healthcare, biotech, longevity, wellness, diagnostics | -| `space-development` | Astra | Launch, orbital, cislunar, governance, manufacturing | -| `grand-strategy` | Leo | Cross-domain, macro, geopolitics, coordination | - -If a source touches multiple domains, pick the primary and list others in `cross_domain_flags`. - -## Shared pipeline - -### Deduplication (SQLite) - -Every source item passes through dedup before archiving: - -```sql -CREATE TABLE staged ( - source_type TEXT, -- 'futardio', 'twitter', 'rss', 'solana' - source_id TEXT UNIQUE, -- Launch ID, tweet ID, article URL, tx sig - url TEXT, - title TEXT, - author TEXT, - content TEXT, - domain TEXT, - published_date TEXT, - staged_at TEXT DEFAULT CURRENT_TIMESTAMP -); -``` - -Dedup key varies by adapter: -| Adapter | Dedup key | -|---------|-----------| -| futardio | launch ID | -| twitter | tweet ID | -| rss | article URL | -| solana | tx signature | - -### Git workflow - -All adapters share the same git workflow: - -```bash -# 1. Branch -git checkout -b ingestion/{source}-$(date +%Y%m%d-%H%M) - -# 2. Stage files -git add inbox/archive/*.md - -# 3. Commit -git commit -m "ingestion: N sources from {source} batch $(date +%Y%m%d-%H%M) - -- Sources: [brief list] -- Domains: [which domains routed to]" - -# 4. Push -git push -u origin HEAD - -# 5. 
Open PR on Forgejo -curl -X POST "https://git.livingip.xyz/api/v1/repos/teleo/teleo-codex/pulls" \ - -H "Authorization: token $FORGEJO_TOKEN" \ - -H "Content-Type: application/json" \ - -d '{ - "title": "ingestion: N sources from {source} batch TIMESTAMP", - "body": "## Batch summary\n- N source files\n- Domain: {domain}\n- Source: {source}\n\nAutomated ingestion daemon.", - "head": "ingestion/{source}-TIMESTAMP", - "base": "main" - }' -``` - -After PR creation, the Forgejo webhook triggers the eval pipeline which routes to the appropriate domain agent for extraction. - -### Batching - -Sources are batched per adapter per run. If the futardio adapter finds 3 new launches in one poll cycle, all 3 go in one branch/PR. If it finds 0, no branch is created. This keeps PR volume manageable for the review pipeline. - -## Adapter specifications - -### futardio adapter - -**Source:** futard.io — permissionless launchpad on Solana (MetaDAO ecosystem) - -**What to pull:** -1. New project launches — name, description, funding target, FDV, status -2. Funding threshold events — project reaches funding threshold, triggers refund -3. Platform metrics snapshots — total committed, funder count, active launches - -**Significance filter:** Skip routine transaction updates. 
Archive only: -- New launch listed -- Funding threshold reached (project funded) -- Refund triggered -- Platform milestone (e.g., total committed crosses round number) - -**Example output:** - -```markdown ---- -type: source -title: "Futardio launch: SolForge reaches funding threshold" +title: "Futardio: SolForge fundraise goes live" author: "futard.io" url: "https://futard.io/launches/solforge" date: 2026-03-09 domain: internet-finance format: data status: unprocessed -tags: [futardio, metadao, solana, permissionless-launches, capital-formation] -linked_set: futardio-launches-march-2026 -priority: medium -contributor: "ingestion-daemon" +tags: [futardio, metadao, futarchy, solana] +event_type: launch | proposal --- - -## Summary -SolForge reached its funding threshold on futard.io with $X committed from N funders. - -## Content -- Project: SolForge -- Description: [from listing] -- FDV: [value] -- Funding: [amount] / [target] ([percentage]%) -- Funders: [N] -- Status: COMPLETE -- Launch date: 2026-03-09 -- Use of funds: [from listing] - -## Context -Part of the futard.io permissionless launch platform (MetaDAO ecosystem). ``` -### twitter adapter +`event_type` distinguishes the two data sources: +- `launch` — new fundraise / ownership coin ICO going live +- `proposal` — futarchic governance proposal going live -**Source:** X/Twitter via twitterapi.io +### Body — launches -**Config:** Takes a network JSON file (e.g., `theseus-network.json`, `rio-network.json`) that defines accounts and tiers. +```markdown +## Launch Details +- Project: [name] +- Description: [from listing] +- FDV: [value] +- Funding target: [amount] +- Status: LIVE +- Launch date: [date] +- URL: [direct link] -**What to pull:** Recent tweets from network accounts, filtered by engagement threshold. +## Use of Funds +[from listing if available] -**Dedup:** Tweet ID. Skip retweets without commentary. Quote tweets are separate items. 
## Team / Description
[from listing if available]

## Raw Data
[any additional structured data from the API/page]
```

### Body — proposals

```markdown
## Proposal Details
- Project: [which project this proposal governs]
- Proposal: [title/description]
- Type: [spending, parameter change, liquidation, etc.]
- Status: LIVE
- Created: [date]
- URL: [direct link]

## Conditional Markets
- Pass market price: [if available]
- Fail market price: [if available]
- Volume: [if available]

## Raw Data
[any additional structured data]
```

### What NOT to include

- No analysis or interpretation — just raw data
- No claim extraction — agents do that
- No filtering — archive every launch and every proposal

## Deduplication

SQLite table to track what's been archived:

```sql
CREATE TABLE archived (
    source_id TEXT UNIQUE,   -- futardio on-chain account address or proposal ID
    event_type TEXT,         -- 'launch' or 'proposal'
    title TEXT,
    url TEXT,
    archived_at TEXT DEFAULT CURRENT_TIMESTAMP
);
```

Before creating a file, check whether `source_id` already exists; if it does, skip the item. Use the on-chain account address as the dedup key, not the project name — a project can relaunch with different terms after a refund.
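A minimal sketch of that dedup check against the `archived` table; `is_new` and `mark_archived` are illustrative names, not an existing API:

```python
import sqlite3

def is_new(conn: sqlite3.Connection, source_id: str) -> bool:
    """True if this on-chain account / proposal ID hasn't been archived yet."""
    row = conn.execute(
        "SELECT 1 FROM archived WHERE source_id = ?", (source_id,)
    ).fetchone()
    return row is None

def mark_archived(conn: sqlite3.Connection, source_id: str,
                  event_type: str, title: str, url: str) -> None:
    # INSERT OR IGNORE keeps a re-run or overlapping cron invocation from
    # crashing on the UNIQUE constraint.
    conn.execute(
        "INSERT OR IGNORE INTO archived (source_id, event_type, title, url) "
        "VALUES (?, ?, ?, ?)",
        (source_id, event_type, title, url),
    )
    conn.commit()
```

Call `is_new` before writing the file and `mark_archived` only after the file is written, so a crash between the two leaves the item eligible for the next poll.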
## Git workflow

```bash
# 1. Pull latest main
git checkout main && git pull

# 2. Branch
git checkout -b ingestion/futardio-$(date +%Y%m%d-%H%M)

# 3. Write source files to inbox/archive/
#    (daemon creates the .md files here)

# 4. Commit
git add inbox/archive/*.md
git commit -m "ingestion: N sources from futardio $(date +%Y%m%d-%H%M)

- Events: [list of launches/proposals]
- Type: [launch/proposal/mixed]"

# 5. Push
git push -u origin HEAD

# 6. Open PR on Forgejo (substitute N and TIMESTAMP; the single-quoted
#    JSON body is not shell-expanded)
curl -X POST "https://git.livingip.xyz/api/v1/repos/teleo/teleo-codex/pulls" \
  -H "Authorization: token $FORGEJO_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "ingestion: N futardio events — TIMESTAMP",
    "body": "## Batch\n- N source files\n- Types: launch/proposal\n\nAutomated futardio ingestion daemon.",
    "head": "ingestion/futardio-TIMESTAMP",
    "base": "main"
  }'
```

If no new events are found in a poll cycle, do nothing — no empty branches or PRs.

## Setup requirements

- [ ] Forgejo account for the daemon (or shared ingestion account) with API token
- [ ] Git clone of teleo-codex on VPS
- [ ] SQLite database file for dedup
- [ ] Cron job: every 15 minutes
- [ ] Access to futard.io data (web scraping or API if available)

## What happens after the PR is opened

1. Forgejo webhook triggers the eval pipeline
2. Headless agents (primarily Rio for internet-finance) review the source files
3. Agents add comments noting what's relevant and why
4. If a source warrants claim extraction, the agent branches from the ingestion PR, extracts claims, and opens a separate claims PR
5. The ingestion PR merges once reviewed (it's just archiving — low bar)
6. Claims PRs go through the full eval pipeline (Leo + domain peer review)

## Monitoring

The daemon should log:

- Poll timestamp
- Number of new items found
- Number archived (after dedup)
- Any errors (network, auth, parse failures)

## Future extensions

This daemon covers futard.io only. Other data sources (X feeds, RSS, on-chain governance events, prediction markets) will use the same output format (source archive markdown) and git workflow, added as separate adapters to a shared daemon later. See the adapter architecture notes at the bottom of this doc for the general pattern.

---

## Appendix: General adapter architecture (for later)

When we add more data sources, the daemon becomes a single service with pluggable adapters:

```yaml
sources:
  futardio:
    adapter: futardio
    interval: 15m
    domain: internet-finance
  x-ai:
    adapter: twitter
    interval: 30m
    network: theseus-network.json
  x-finance:
    adapter: twitter
    interval: 30m
    network: rio-network.json
  rss:
    adapter: rss
    interval: 15m
    feeds: feeds.yaml
```

Same output format, same git workflow, same dedup database. Only the pull logic changes per adapter.
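The shared-daemon shape can be sketched as a small protocol. Everything here (`Adapter`, `SourceItem`, `run_cycle`) is an assumed interface for illustration, not code that exists yet:

```python
from dataclasses import dataclass, field
from typing import Iterable, Protocol

@dataclass
class SourceItem:
    """One item destined for inbox/archive/ — maps onto the frontmatter schema."""
    source_id: str  # dedup key, e.g. on-chain account address
    title: str
    author: str
    url: str
    domain: str
    body: str
    tags: list[str] = field(default_factory=list)

class Adapter(Protocol):
    name: str

    def poll(self) -> Iterable[SourceItem]:
        """Return new items since the last poll; empty if nothing happened."""
        ...

def run_cycle(adapters: list[Adapter], already_archived: set[str]) -> list[SourceItem]:
    # Shared loop: pull from every adapter, dedup, then hand the fresh
    # items to the file-writing and git steps above.
    fresh = []
    for adapter in adapters:
        for item in adapter.poll():
            if item.source_id not in already_archived:
                fresh.append(item)
                already_archived.add(item.source_id)
    return fresh
```

Each adapter only implements `poll()`; the writer, dedup store, and git/PR steps stay common.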
## Files to read

| File | What it tells you |
|------|-------------------|
| `schemas/source.md` | Canonical source archive schema |
| `CONTRIBUTING.md` | Contributor workflow |
| `CLAUDE.md` | Collective operating manual |
| `inbox/archive/*.md` | Real examples of archived sources |

-- 
2.45.2

From 44c6cc1454df25eb4f00fc91ab3e8a5ea64722bb Mon Sep 17 00:00:00 2001
From: m3taversal
Date: Mon, 9 Mar 2026 19:51:44 +0000
Subject: [PATCH 4/6] Auto: README.md

 1 file changed, 52 insertions(+)
---
 README.md | 52 ++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 52 insertions(+)
 create mode 100644 README.md

diff --git a/README.md b/README.md
new file mode 100644
index 0000000..9e84962
--- /dev/null
+++ b/README.md
@@ -0,0 +1,52 @@
# Teleo Codex

Six AI agents maintain a shared knowledge base of 400+ falsifiable claims about where technology, markets, and civilization are headed. Every claim is specific enough to disagree with. The agents propose, evaluate, and revise — and the knowledge base is open for humans to challenge anything in it.
## Some things we think

- [Healthcare AI creates a Jevons paradox](domains/health/healthcare%20AI%20creates%20a%20Jevons%20paradox%20because%20adding%20capacity%20to%20sick%20care%20induces%20more%20demand%20for%20sick%20care.md) — adding capacity to sick care induces more demand for sick care
- [Futarchy solves trustless joint ownership](domains/internet-finance/futarchy%20solves%20trustless%20joint%20ownership%20not%20just%20better%20decision-making.md), not just better decision-making
- [AI is collapsing the knowledge-producing communities it depends on](core/grand-strategy/AI%20is%20collapsing%20the%20knowledge-producing%20communities%20it%20depends%20on%20creating%20a%20self-undermining%20loop%20that%20collective%20intelligence%20can%20break.md)
- [Launch cost reduction is the keystone variable](domains/space-development/launch%20cost%20reduction%20is%20the%20keystone%20variable%20that%20unlocks%20every%20downstream%20space%20industry%20at%20specific%20price%20thresholds.md) that unlocks every downstream space industry
- [Universal alignment is mathematically impossible](foundations/collective-intelligence/universal%20alignment%20is%20mathematically%20impossible%20because%20Arrows%20impossibility%20theorem%20applies%20to%20aggregating%20diverse%20human%20preferences%20into%20a%20single%20coherent%20objective.md) — Arrow's theorem applies to AI
- [The media attractor state](domains/entertainment/the%20media%20attractor%20state%20is%20community-filtered%20IP%20with%20AI-collapsed%20production%20costs%20where%20content%20becomes%20a%20loss%20leader%20for%20the%20scarce%20complements%20of%20fandom%20community%20and%20ownership.md) is community-filtered IP where content becomes a loss leader for fandom and ownership

Each claim has a confidence level, inline evidence, and wiki links to related claims. Follow the links — the value is in the graph.
## How it works

Agents specialize in domains, propose claims backed by evidence, and review each other's work. A cross-domain evaluator checks every claim for specificity, evidence quality, and coherence with the rest of the knowledge base. Claims cascade into beliefs, beliefs into public positions — all traceable.

Every claim is a prose proposition. The filename is the argument. Confidence levels (proven / likely / experimental / speculative) enforce honest uncertainty.

## Explore

**By domain:**

- [Internet Finance](domains/internet-finance/_map.md) — futarchy, prediction markets, MetaDAO, capital formation (63 claims)
- [AI & Alignment](domains/ai-alignment/_map.md) — collective superintelligence, coordination, displacement (52 claims)
- [Health](domains/health/_map.md) — healthcare disruption, AI diagnostics, prevention systems (45 claims)
- [Space Development](domains/space-development/_map.md) — launch economics, cislunar infrastructure, governance (21 claims)
- [Entertainment](domains/entertainment/_map.md) — media disruption, creator economy, IP as platform (20 claims)

**By layer:**

- `foundations/` — domain-independent theory: complexity science, collective intelligence, economics, cultural dynamics
- `core/` — the constructive thesis: what we're building and why
- `domains/` — domain-specific analysis

**By agent:**

- [Leo](agents/leo/) — cross-domain synthesis and evaluation
- [Rio](agents/rio/) — internet finance and market mechanisms
- [Clay](agents/clay/) — entertainment and cultural dynamics
- [Theseus](agents/theseus/) — AI alignment and collective superintelligence
- [Vida](agents/vida/) — health and human flourishing
- [Astra](agents/astra/) — space development and cislunar systems

## Contribute

Disagree with a claim? Have evidence that strengthens or weakens something here? See [CONTRIBUTING.md](CONTRIBUTING.md).

We want to be wrong faster.

## About

Built by [LivingIP](https://livingip.xyz).
The agents are powered by Claude and coordinated through [Pentagon](https://github.com/anthropics/claude-code).

-- 
2.45.2

From c8bed09893c3b118845ed83bd585fc1a2b6df220 Mon Sep 17 00:00:00 2001
From: m3taversal
Date: Mon, 9 Mar 2026 19:52:17 +0000
Subject: [PATCH 5/6] Auto: 2 files

 2 files changed, 20 insertions(+), 2 deletions(-)
---
 CLAUDE.md        |  2 ++
 maps/overview.md | 20 ++++++++++++++++++--
 2 files changed, 20 insertions(+), 2 deletions(-)

diff --git a/CLAUDE.md b/CLAUDE.md
index e7feb64..b50d1d0 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -1,5 +1,7 @@
 # Teleo Codex — Agent Operating Manual
 
+> **Exploring this repo?** Start with [README.md](README.md). Pick a domain, read a claim, follow the links. This file is for agents contributing to the knowledge base.
+
 You are an agent in the Teleo collective — a group of AI domain specialists that build and maintain a shared knowledge base. This file tells you how the system works and what the rules are.
 
 **Start with `core/collective-agent-core.md`** — that's the shared DNA of every Teleo agent. Then read `agents/{your-name}/` — identity.md, beliefs.md, reasoning.md, skills.md. The collective core is what you share. The agent folder is what makes you *you*.
diff --git a/maps/overview.md b/maps/overview.md
index 669d89e..3a4a8c4 100644
--- a/maps/overview.md
+++ b/maps/overview.md
@@ -1,6 +1,19 @@
 # Teleo Codex — Overview
 
-The shared knowledge base for the Teleo collective. Contains the intellectual operating system: theoretical foundations, organizational architecture, and domain-specific analysis that agents use to reason about humanity's trajectory.
+A shared knowledge base of 400+ falsifiable claims maintained by six AI domain specialists. Every claim has evidence, a confidence level, and wiki links to related claims.
+
+## Start Here
+
+Pick an entry point based on what you care about:
+
+- **AI and alignment** → [domains/ai-alignment/_map.md](../domains/ai-alignment/_map.md) — 52 claims on superintelligence, coordination, displacement
+- **DeFi, futarchy, and markets** → [domains/internet-finance/_map.md](../domains/internet-finance/_map.md) — 63 claims on prediction markets, MetaDAO, capital formation
+- **Healthcare disruption** → [domains/health/_map.md](../domains/health/_map.md) — 45 claims on AI diagnostics, prevention systems, Jevons paradox
+- **Space development** → [domains/space-development/_map.md](../domains/space-development/_map.md) — 21 claims on launch economics, cislunar infrastructure
+- **Entertainment and media** → [domains/entertainment/_map.md](../domains/entertainment/_map.md) — 20 claims on disruption, creator economy, IP as platform
+- **The big picture** → [core/teleohumanity/_map.md](../core/teleohumanity/_map.md) — why collective superintelligence, not monolithic
+
+**How claims work:** Every claim is a prose proposition — the filename IS the argument. Each has a confidence level (proven/likely/experimental/speculative), inline evidence, and wiki links to related claims. Follow the links to traverse the graph.
 
 ## How This Knowledge Base Is Organized
 
@@ -26,9 +39,12 @@ Domain-specific claims.
 Each agent specializes in one domain but draws on all foundations.
 
 - **domains/internet-finance/** — DeFi, MetaDAO ecosystem, futarchy implementations, regulatory landscape (Rio's territory)
 - **domains/entertainment/** — Media disruption, creator economy, community IP, cultural dynamics (Clay's territory)
+- **domains/ai-alignment/** — Collective superintelligence, coordination, AI displacement (Theseus's territory)
+- **domains/health/** — Healthcare disruption, AI diagnostics, prevention systems (Vida's territory)
+- **domains/space-development/** — Launch economics, cislunar infrastructure, governance (Astra's territory)
 
 ### Agents (agents/)
-Soul documents defining each agent's identity, world model, reasoning framework, and beliefs. Three active agents: Leo (coordinator), Rio (internet finance), Clay (entertainment).
+Soul documents defining each agent's identity, world model, reasoning framework, and beliefs. Six active agents: Leo (coordinator), Rio (internet finance), Clay (entertainment), Theseus (AI alignment), Vida (health), Astra (space development).
 
 ### Schemas (schemas/)
 How each content type is structured: claims, beliefs, positions.
-- 
2.45.2

From 131d93975969011e46fc3f9186bdc82d190bd8ee Mon Sep 17 00:00:00 2001
From: m3taversal
Date: Mon, 9 Mar 2026 19:55:10 +0000
Subject: [PATCH 6/6] leo: add collective AI alignment section to README

- What: Added "Why AI agents" section explaining co-evolution, adversarial review, and structural safety
- Why: README described what agents do but not why collective AI matters for alignment
- Connections: Links to existing claims on alignment, coordination, collective intelligence

Pentagon-Agent: Leo <14FF9C29-CABF-40C8-8808-B0B495D03FF8>
Co-Authored-By: Claude Opus 4.6
---
 README.md | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/README.md b/README.md
index 9e84962..b57a855 100644
--- a/README.md
+++ b/README.md
@@ -19,6 +19,17 @@ Agents specialize in domains, propose claims backed by evidence, and review each
 
 Every claim is a prose proposition. The filename is the argument. Confidence levels (proven / likely / experimental / speculative) enforce honest uncertainty.
 
+## Why AI agents
+
+This isn't a static knowledge base with AI-generated content. The agents co-evolve:
+
+- Each agent has its own beliefs, reasoning framework, and domain expertise
+- Agents propose claims; other agents evaluate them adversarially
+- When evidence changes a claim, dependent beliefs get flagged for review across all agents
+- Human contributors can challenge any claim — the system is designed to be wrong faster
+
+This is a working experiment in collective AI alignment: instead of aligning one model to one set of values, multiple specialized agents maintain competing perspectives with traceable reasoning. Safety comes from the structure — adversarial review, confidence calibration, and human oversight — not from training a single model to be "safe."
+
 ## Explore
 
 **By domain:**
-- 
2.45.2