Note: Anthropic CEO Dario Amodei delivered his answer late this afternoon, Feb. 26. See update at the end of this column.
Here at the AI Humanist we endeavor to keep calm and resist the temptation to join the Doomsayers Club.
But some days it’s hard.
This week there’s a showdown happening in the AI world that’s legitimately terrifying.
On Tuesday, Defense Secretary and walking COMSEC risk Pete Hegseth met with Dario Amodei, CEO of Anthropic, the company that developed the powerful AI model Claude. The topic: Limitations on the military use of Claude.
Currently, those limitations are:
No using Anthropic's AI for fully autonomous weapons.
No using their AI for mass surveillance of Americans.
Amodei thinks these are reasonable and sane precautions to take when combining a wobbly new decision-making machine with weapons of mass destruction. Anthropic stakes its entire brand on safety. Go to the company’s home page. This is what you’ll see:

Hegseth, you will not be surprised to learn, wants zero limitations on his ability to spy on civilians and kill anyone his boss designates an enemy.
This isn’t an academic debate. It never is with this crew. Hegseth demanded an answer from Dario Amodei by Friday at 5:01pm. That’s tomorrow.
If Anthropic doesn’t drop its safeguards, Hegseth warned Amodei, the Pentagon will designate the company a supply chain risk. That would mean the immediate cessation of all current and future military contracts.
A little context, please.
Back in July the Pentagon signed $200 million AI development deals with four tech companies: Anthropic, Google, OpenAI, and xAI. It was an open tryout: Show us what you’ve got, may the best model win.
In recent months Anthropic has begun to pull away from the pack in product quality, and that trend seems to have taken hold within the military as well. Ten days ago The Wall Street Journal reported that Claude was used during the Jan. 3 operation to kidnap Nicolás Maduro from Venezuela. Which is...good or bad, I guess, depending on your view of mechanized death. That raid involved the killing of 83 people.
Anthropic leadership grew queasy at the news, according to one report, and began internal discussions about safeguards, red lines, and the ethics of working with the Pentagon.

Nicolás Maduro: Kidnapped with the help of Anthropic’s Claude.
Who could have seen this coming? Besides everyone.
The dark implications of Hegseth’s ultimatum are horrifying and gobsmackingly obvious.
We are living in a moment in which Hegseth, the Attorney General, and the President are gleefully designating any and all persons who dare to disagree with them as domestic terrorists and enemies of the state. The citizens of Minneapolis have been under siege by armed thugs for two months now. In Texas right now, several protesters are on trial facing charges that could land them decades in federal prison, labeled as “domestic terrorists” for acts as simple as handing out flyers. (Look up the Prairieland 19.) Even Robert De Niro can’t open his mouth without getting threatened with deportation.
It’s 2026. We’ve been dealing with Trump for a decade now. We know a few things. Hard experience has taught us to listen to what he says, because he says the quiet things out loud. The things we were condescendingly told to calm down about because they were “just talk,” he's actually done. Again and again and again.
During the 2024 campaign Trump spoke openly about using military force against “the enemy within.”
"I think the bigger problem is the enemy from within," he told Fox News three weeks before the election. "We have some very bad people. We have some sick people, radical left lunatics. And I think they're the big—and it should be very easily handled by, if necessary, by National Guard, or if really necessary, by the military, because they can't let that happen."
Trump has already deployed National Guard troops against U.S. citizens in Los Angeles, Washington, D.C., Memphis, Chicago, and Portland. When they’re not executing poets and nurses, ICE thugs are using facial recognition technology and Palantir tracking systems to identify, mark, and intimidate Minnesotans lawfully acting to keep their neighbors safe.
Now the Defense Secretary is demanding that Anthropic allow the military to use its powerful AI technology to conduct widespread surveillance on Americans and develop autonomous murder robots. With a national election that will determine the control of Congress, and the future of Trump, less than nine months away.
Your answer is due at 5:01pm tomorrow, Mr. Amodei.
Update
Late Thursday afternoon, Feb. 26, Anthropic CEO Dario Amodei delivered his answer in a public Substack post.
He said no to the Pentagon: “We cannot in good conscience accede to [Hegseth’s] request.”
Read the full Anthropic statement here.
Coda
Late Tuesday, a Pentagon source told Politico that Hegseth and the White House are considering invoking the Defense Production Act on Anthropic. That would force the company to allow the Pentagon to use its technology in whatever way Hegseth and the President desire.
As Politico’s Brendan Bordelon wryly noted: “It was not immediately clear how the Pentagon intends to label Anthropic a supply chain risk—which typically requires the government and its contractors to cut ties with that company—while simultaneously invoking the Defense Production Act to compel the company to cooperate with the Pentagon.”
Joseph Heller would be proud.
MEET THE HUMANIST
Bruce Barcott, founding editor of The AI Humanist, is a writer known for his award-winning work on environmental issues and drug policy for The New York Times Magazine, National Geographic, Outside, Rolling Stone, and other publications.
A former Guggenheim Fellow in nonfiction, his books include The Measure of a Mountain, The Last Flight of the Scarlet Macaw, and Weed the People.
Bruce currently serves as Editorial Lead for the Transparency Coalition, a nonprofit group that advocates for safe and sensible AI policy. Opinions expressed in The AI Humanist are those of the author alone and do not reflect the position of the Transparency Coalition.

Portrait created with the use of Sora, OpenAI’s imaging tool.
