Anthropic sues US government over supply chain risk label

Rachel Metz and Jimmy Jenkins, Bloomberg News

Published in Political News

Anthropic PBC sued the Defense Department for declaring that the artificial intelligence giant posed a risk to the U.S. supply chain, further ramping up a high-stakes dispute with the Pentagon over safeguards on the company’s technology.

San Francisco-based Anthropic is challenging a decision by the department and other federal agencies to shift their AI work to other providers, based on a risk designation typically reserved for companies from countries the U.S. views as adversaries.

Anthropic wants a judge to remove the supply-chain risk designation and require U.S. agencies to withdraw directives related to it. The company claims it is being shut out for disagreeing with the administration and argues the legal principles at stake affect every federal contractor whose views the government dislikes.

“These actions are unprecedented and unlawful,” Anthropic said in a complaint filed Monday in San Francisco federal court, adding that the company’s business is being threatened. “The Constitution does not allow the government to wield its enormous power to punish a company for its protected speech.”

Last week, the Pentagon formally notified Anthropic of its determination. Chief Executive Officer Dario Amodei then issued a statement saying the government’s actions were not “legally sound” and had left the company with “no choice but to challenge it in court.”

According to the complaint, the government’s actions “are harming Anthropic irreparably,” putting the company’s contracts with private firms “in doubt” and potentially “jeopardizing hundreds of millions of dollars in the near-term.”

There are likely to be “enormous” consequences for others, including on those “whose speech will be chilled; on those benefiting from the economic value the company can continue to create; and on a global public that deserves robust dialogue and debate on what AI means for warfare and surveillance,” Anthropic said.

The dispute erupted last month, after the Pentagon sought to use Claude for any purpose within legal limits, free of any usage restrictions from Anthropic. The firm had insisted that the chatbot not be used for mass surveillance against Americans or in fully autonomous weapons operations.

In response, Defense Secretary Pete Hegseth on Feb. 27 ordered the Pentagon to bar its contractors and their partners from any commercial activity with Anthropic. In a post on X, Hegseth set a six-month period for Anthropic to hand over AI services to another provider.

Trump blasted Anthropic the same day on his Truth Social network, saying the company “made a DISASTROUS MISTAKE trying to STRONG-ARM the Department of War.” In his post, the president directed U.S. government agencies to stop using Claude.

The company’s lawsuit names as defendants the Department of War — which the Trump administration uses to describe the Department of Defense — as well as more than a dozen other federal agencies.

The White House defended the administration’s actions.

“President Trump will never allow a radical left, woke company to jeopardize our national security by dictating how the greatest and most powerful military in the world operates,” spokesperson Liz Huston said. Trump and Hegseth “are ensuring America’s courageous warfighters have the appropriate tools they need to be successful and will guarantee that they are never held hostage by the ideological whims of any Big Tech leaders.”

The Department of Defense didn’t respond to a request for comment on the lawsuit.

 

Anthropic said in the complaint that it imposed “usage restrictions” based on the company’s “unique understanding of Claude’s risks and limitations — including Claude’s capacity to make mistakes and its unprecedented ability to accelerate and automate analysis of massive amounts of data, including data about American citizens.”

As part of its challenge to the U.S. government, Anthropic also filed a complaint in an appellate court in Washington, D.C., focusing on a law governing procedures for mitigating supply-chain risks in procurement. In that suit, the company claimed the Defense Department exceeded its authority with actions that were “arbitrary, capricious and an abuse of discretion.”

AI technology

In the days after the department first announced its risk designation, consumers drove “unprecedented demand” for Anthropic’s chatbot Claude in a show of support for the company’s resistance to the government push for unfettered use of its technology.

Meanwhile, rival OpenAI announced it had struck an agreement to let the Pentagon deploy its artificial intelligence models in its classified network. OpenAI chief Sam Altman later said he was working with the Defense Department to add more guardrails around surveillance.

Founded in 2021 by former OpenAI employees, Anthropic quickly cemented itself as a rival to the ChatGPT maker with Claude, which it billed as more safety- and business-focused. Today, the San Francisco-based company has more than 300,000 business customers who use its models to streamline workplace responsibilities, particularly in the field of computer programming where it has emerged as a market leader with its AI coding assistant, Claude Code.

Anthropic started the year on a winning streak, with surging sales, multiple viral products and a large funding round — all giving the startup a big advantage in the costly global AI race.

But its future is uncertain since its relationship with the Pentagon imploded in late February — just before the U.S. attacked Iran in a major Middle East military operation.

Some legal and policy experts warned that the fallout from the government’s declaration would be dire.

Jennifer Huddleston, a senior fellow at the Cato Institute, said the case goes well beyond a contracting dispute and poses a risk to freedom of speech.

“The designation and attempts to blacklist the company from other aspects of the government go far beyond the scope of what would be considered least restrictive means even if there are security concerns about the further use of the product,” Huddleston said in a statement.

The case is Anthropic v. U.S. Department of War, 26-cv-01996, U.S. District Court, Northern District of California (San Francisco).

(With assistance from Hadriana Lowenkron.)


©2026 Bloomberg L.P. Visit bloomberg.com. Distributed by Tribune Content Agency, LLC.

 
