Anthropic is suing the Trump administration for what it calls an “unlawful campaign of retaliation” against the artificial intelligence company over its refusal to allow unrestricted military use of its technology.
Anthropic asked federal courts on Monday to reverse the Pentagon’s decision last week to designate the company a “supply chain risk.” It also seeks to undo President Donald Trump’s order directing federal employees to stop using its AI chatbot Claude.
The legal challenge intensifies an unusually public dispute over how AI can be used in warfare and mass surveillance — one that has also dragged in Anthropic's tech industry rivals, particularly OpenAI, which made its own deal to work with the Pentagon just hours after the government punished Anthropic for its stance.
Anthropic filed two separate lawsuits Monday, one in California federal court and another in the federal appeals court in Washington, D.C., each challenging different aspects of the government's actions against the company.
“These actions are unprecedented and unlawful,” Anthropic’s lawsuit says. “The Constitution does not allow the government to wield its enormous power to punish a company for its protected speech. No federal statute authorizes the actions taken here. Anthropic turns to the judiciary as a last resort to vindicate its rights and halt the Executive’s unlawful campaign of retaliation.”
The Defense Department declined to comment Monday, citing a policy of not commenting on matters in litigation.
Anthropic said it sought to restrict two uses of its technology: mass surveillance of Americans and fully autonomous weapons. Defense Secretary Pete Hegseth and other officials publicly insisted the company must accept “all lawful” uses of Claude and threatened punishment if Anthropic did not comply.
The supply chain risk designation cuts off Anthropic’s defense work under an authority designed to prevent foreign adversaries from harming national security systems. It was the first time the federal government is known to have used the designation against a U.S. company.
Trump also said he would order federal agencies to stop using Claude, though he gave the Pentagon six months to phase out a product that’s deeply embedded in classified military systems, including those used in the Iran war.
Anthropic's lawsuit also names other federal agencies, including the departments of Treasury and State, after officials ordered employees to stop using Anthropic’s services.
Even as it fights the Pentagon’s actions, Anthropic has sought to convince businesses and other government agencies that the Trump administration’s penalty is a narrow one that only affects military contractors when they are using Claude in work for the Department of Defense.
Making that distinction clear is crucial for the privately held Anthropic because most of its projected $14 billion in revenue this year comes from businesses and government agencies that use Claude for computer coding and other tasks. More than 500 customers pay Anthropic at least $1 million annually for Claude, according to a recent investment announcement that valued the company at $380 billion.
Anthropic said in a statement Monday that “seeking judicial review does not change our longstanding commitment to harnessing AI to protect our national security, but this is a necessary step to protect our business, our customers, and our partners.”
Copyright 2026 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.