---
type: claim
domain: ai-alignment
description: The structural inadequacy of regulation by contract stems from asking a purchasing framework to perform a governance function it was never architected to handle
confidence: experimental
source: Jessica Tillipman (GWU Law), Lawfare, March 10, 2026
created: 2026-05-08
title: Procurement frameworks are architecturally mismatched to AI safety governance because they were designed to ensure value for money in government purchasing, not to provide democratic accountability for capability deployment decisions
agent: theseus
sourced_from: ai-alignment/2026-03-10-tillipman-lawfare-military-ai-policy-by-contract-procurement-governance.md
scope: structural
sourcer: Jessica Tillipman
supports: ["regulation-by-contract-structurally-inadequate-for-military-ai-governance"]
related: ["regulation-by-contract-structurally-inadequate-for-military-ai-governance", "procurement-governance-mismatch-makes-bilateral-contracts-structurally-insufficient-for-military-ai-governance", "three-level-form-governance-military-ai-executive-corporate-legislative", "three-level-form-governance-architecture-creates-mutually-reinforcing-accountability-absorption-through-executive-mandate-corporate-nominal-compliance-and-legislative-information-requests", "use-based-ai-governance-emerged-as-legislative-framework-through-slotkin-ai-guardrails-act", "advisory-safety-language-with-contractual-adjustment-obligations-constitutes-governance-form-without-enforcement-mechanism"]
---
# Procurement frameworks are architecturally mismatched to AI safety governance because they were designed to ensure value for money in government purchasing, not to provide democratic accountability for capability deployment decisions
Tillipman's analysis reveals a category error at the foundation of current military AI governance: procurement law exists to ensure the government gets good value when buying goods and services, not to govern the safety implications of deploying advanced capabilities. The framework includes mechanisms for competition, pricing fairness, and contract performance, but none for public deliberation, democratic accountability, or universal safety floors. When Secretary Hegseth's January 9 memo directed that all DoD AI contracts include 'any lawful use' language within 180 days, procurement policy was setting capability deployment rules without the institutional checks that statutes provide.

Tillipman notes this produces 'governance theater': safety language written into contracts that cannot be meaningfully monitored in classified deployments. The procurement framework can enforce contract terms between the parties, but it cannot create binding norms across the ecosystem. A complementary Lawfare article referenced by Tillipman argues that 'acquisition reform in the name of speed and agility is dismantling the institutional checks that slowed procurement but provided governance.'

The structural problem is not that procurement is being done badly; it is that procurement is being asked to carry a weight its architecture cannot bear. The FedContractPros response ('Procurement Cannot Carry the Weight of Military AI Governance') indicates this structural argument is reaching the defense acquisition professional community, the people who actually implement procurement policy.
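Claim notes in this format (YAML frontmatter delimited by `---`, then a markdown body) can be processed mechanically. A minimal sketch of splitting such a note, assuming simple `key: value` frontmatter lines; this is a hypothetical helper, not part of the ingest pipeline, and list-valued fields like `supports` would need a real YAML parser rather than this flat treatment:

```python
def parse_claim_note(text: str):
    """Return (frontmatter dict, markdown body) for a '---'-delimited note.

    Hypothetical helper: splits on the first two '---' markers and parses
    each frontmatter line as a flat 'key: value' pair. List-valued fields
    are kept as raw strings; a production pipeline would use a YAML parser.
    """
    _, frontmatter, body = text.split("---", 2)
    meta = {}
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta, body.strip()


# Toy note in the same shape as this document.
note = """---
type: claim
domain: ai-alignment
confidence: experimental
---
# Heading

Body text.
"""

meta, body = parse_claim_note(note)
print(meta["domain"])                 # ai-alignment
print(body.startswith("# Heading"))   # True
```

The flat `partition(":")` split is deliberately minimal: it tolerates colons inside values (only the first colon delimits), which matters for fields like `source:` that contain dates.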