theseus: extract claims from 2026-04-01-stopkillerrobots-hrw-alternative-treaty-process-analysis

- Source: inbox/queue/2026-04-01-stopkillerrobots-hrw-alternative-treaty-process-analysis.md
- Domain: ai-alignment
- Claims: 2, Entities: 1
- Enrichments: 1
- Extracted by: pipeline ingest (OpenRouter anthropic/claude-sonnet-4.5)

Pentagon-Agent: Theseus <PIPELINE>
Teleo Agents 2026-04-04 15:00:05 +00:00
parent 7e96d63019
commit be1dca31b7
3 changed files with 67 additions and 0 deletions


@@ -0,0 +1,17 @@
---
type: claim
domain: ai-alignment
description: The 270+ NGO coalition for autonomous weapons governance with UNGA majority support has failed to produce binding instruments after 10+ years because multilateral forums give major powers veto capacity
confidence: experimental
source: "Human Rights Watch / Stop Killer Robots, 10-year campaign history, UNGA Resolution A/RES/80/57 (164:6 vote)"
created: 2026-04-04
title: Civil society coordination infrastructure fails to produce binding governance when the structural obstacle is great-power veto capacity not absence of political will
agent: theseus
scope: structural
sourcer: Human Rights Watch / Stop Killer Robots
related_claims: ["[[AI alignment is a coordination problem not a technical problem]]", "[[voluntary safety pledges cannot survive competitive pressure because unilateral commitments are structurally punished when competitors advance without equivalent constraints]]"]
---
# Civil society coordination infrastructure fails to produce binding governance when the structural obstacle is great-power veto capacity not absence of political will
Stop Killer Robots represents 270+ NGOs in a decade-long campaign for autonomous weapons governance. In November 2025, UNGA Resolution A/RES/80/57 passed 164:6, demonstrating overwhelming international support. In May 2025, officials from 96 countries attended a UNGA meeting on autonomous weapons, the most inclusive discussion to date. Despite this organized civil society infrastructure and broad political will, no binding governance instrument exists. The CCW process remains blocked by consensus requirements that give the US, Russia, and China veto power. Alternative treaty processes (the Ottawa model for landmines, the Oslo model for cluster munitions) succeeded for verifiable physical weapons without major-power participation, but HRW acknowledges that autonomous weapons are fundamentally different: they are dual-use AI systems where verification is technically harder and capability cannot be isolated from civilian applications. The structural obstacle is not coordination failure among the broader international community, which has been achieved, but the inability of international law to bind major powers that refuse consent. For technologies controlled by great powers, civil society coordination is therefore necessary but insufficient: the bottleneck is structural veto capacity in multilateral governance, not the absence of organized advocacy or political will.


@@ -0,0 +1,17 @@
---
type: claim
domain: ai-alignment
description: The Mine Ban Treaty and Cluster Munitions Convention succeeded through production/export controls and physical verification, but autonomous weapons are AI capabilities that cannot be isolated from civilian dual-use applications
confidence: likely
source: Human Rights Watch analysis comparing landmine/cluster munition treaties to autonomous weapons governance requirements
created: 2026-04-04
title: Ottawa model treaty process cannot replicate for dual-use AI systems because verification architecture requires technical capability inspection not production records
agent: theseus
scope: structural
sourcer: Human Rights Watch
related_claims: ["[[AI alignment is a coordination problem not a technical problem]]"]
---
# Ottawa model treaty process cannot replicate for dual-use AI systems because verification architecture requires technical capability inspection not production records
The 1997 Mine Ban Treaty (Ottawa Process) and the 2008 Convention on Cluster Munitions (Oslo Process) both produced binding treaties without major military power participation, including the US, through a specific mechanism: norm creation, stigmatization, and compliance pressure via reputational and market-access channels. However, HRW explicitly acknowledges these models face fundamental limits for autonomous weapons. Landmines and cluster munitions are 'dumb weapons': their treaties are verifiable through production records, export controls, and physical mine-clearing operations, because the technology is single-purpose and physically observable. Autonomous weapons are AI systems where: (1) verification is technically far harder because capability resides in software and algorithms, not physical artifacts; (2) the technology is dual-use, since the same AI that controls an autonomous weapon is used in civilian applications, making capability isolation impossible; (3) no verification architecture currently exists that can distinguish autonomous weapons capability from general AI capability without inspecting the full technical stack. The Ottawa model's success depended on clear physical boundaries and single-purpose technology. For dual-use AI systems these preconditions do not exist, making the historical precedent structurally inapplicable even where political will exists.


@@ -0,0 +1,33 @@
# Stop Killer Robots
**Type:** International NGO coalition
**Founded:** ~2013
**Focus:** Campaign to ban fully autonomous weapons
**Scale:** 270+ member NGOs
**Key Partners:** Human Rights Watch, International Committee for Robot Arms Control
## Overview
Stop Killer Robots is an international coalition of 270+ NGOs campaigning for a binding international treaty to prohibit fully autonomous weapons systems. The coalition advocates for meaningful human control over the use of force and has been active in UN forums including the Convention on Certain Conventional Weapons (CCW) and UN General Assembly.
## Timeline
- **2013** — Coalition founded to campaign against autonomous weapons
- **2022-11** — Published analysis of alternative treaty processes outside CCW framework
- **2025-05** — Participated in UNGA meeting with officials from 96 countries on autonomous weapons
- **2025-11** — UNGA Resolution A/RES/80/57 passed 164:6, creating political momentum for governance
- **2026-11** — Preparing to trigger an alternative treaty process if the CCW Review Conference fails
## Governance Strategy
The coalition pursues two parallel tracks:
1. **CCW Process:** Engagement with Convention on Certain Conventional Weapons, blocked by major power consensus requirements
2. **Alternative Process:** Preparing Ottawa/Oslo-style independent state-led process or UNGA-initiated process if CCW fails
## Challenges
- Major military powers (US, Russia, China) block consensus in CCW
- Verification architecture for autonomous weapons remains technically unsolved
- Dual-use nature of AI makes capability isolation impossible
- Ottawa model (successful for landmines) not directly applicable to AI systems