All corrections
X · March 2, 2026 at 11:36 PM

x.com/allTheYud/status/2028351784251511062

1 correction found

1
Claim
The contract says they do whatever the hell they declare legal.
Correction

The publicly released contract language does not say the government can do whatever it “declares” legal; it limits use to “lawful purposes” consistent with applicable law and includes specific prohibitions and legal constraints.

Full reasoning

OpenAI publicly released the “relevant language” from its agreement with the (renamed) Department of Defense (“Department of War”) on February 28, 2026.

That released language does not say the government can do whatever it declares legal. Instead, it explicitly limits use to “all lawful purposes, consistent with applicable law” and then adds further constraints, including limits on autonomous weapons use and restrictions on surveillance and the handling of private information (e.g., compliance with the Fourth Amendment, FISA, and EO 12333, and a provision that the system “shall not be used for unconstrained monitoring of U.S. persons’ private information”).

Because the contract language (as released by OpenAI) is framed around external legal standards (“applicable law”) and includes specific prohibitions, the claim that the contract says they can do “whatever the hell they declare legal” is not an accurate description of the released language.

1 source
  • Our agreement with the Department of War | OpenAI (Feb 28, 2026)

OpenAI’s post quotes the contract: “The Department of War may use the AI System for all lawful purposes, consistent with applicable law…” and further specifies constraints on autonomous weapons and on surveillance and the handling of private information (Fourth Amendment, FISA, EO 12333), including that the system “shall not be used for unconstrained monitoring of U.S. persons’ private information…”

Model: OPENAI_GPT_5 Prompt: v1.6.0