
Pentagon Formally Designates Anthropic a Supply-Chain Risk

The Pentagon has formally designated Anthropic as a "supply chain risk," ordering federal agencies and defense contractors to stop using its AI tools after the company sought limits on the military's use of its models. In a written statement, the department said it has "officially informed Anthropic leadership the company and its products are deemed a supply chain risk, effective immediately." Politico reports: The designation, historically reserved for foreign firms with ties to U.S. adversaries, will likely require companies that do business with the U.S. military -- or even the federal government in general -- to cut ties with Anthropic.

"From the very beginning, this has been about one fundamental principle: the military being able to use technology for all lawful purposes," the Pentagon said in the statement. "The military will not allow a vendor to insert itself into the chain of command by restricting the lawful use of a critical capability and put our warfighters at risk."

A spokesperson for Anthropic did not immediately respond to a request for comment. But the company said last week it would fight a supply-chain risk label in court.
  • by Anonymous Coward

    Now declare the rest a supply chain risk.

    • Re: Good (Score:5, Insightful)

      by Mr. Dollar Ton ( 5495648 ) on Thursday March 05, 2026 @10:11PM (#66025556)

      The designation is not based on some objective feature or lack thereof, it is just a revenge of your convicted felon president and war criminal in chief and his warfighters who want control over what they see is a useful tool to beat the rest of the world, including y'all, into submission.

      Beating this one supplier, whose only sin here is not relinquishing control, is not an upside.

      • by keltor ( 99721 ) *
They have increased subscriptions like 15% since the announcement, so I'd say it's 100% a positive so far.
        • by caseih ( 160668 )

          15% jump in private subscriptions probably doesn't come close to replacing the number of subscriptions that DoD contractors have that they must now cancel.

          • No, but it's a nice bit of additional ARR that tides you over until you win your breach-of-contract lawsuit against the US Government.

        • Re: Good (Score:5, Insightful)

          by Mr. Dollar Ton ( 5495648 ) on Friday March 06, 2026 @01:27AM (#66025734)

          What is "positive", the practice of the trumpistani government to blackmail private businesses for personal gain of the president?

          Methinks your judgement is a bit short-sighted here.

      • Frankly, it's not even as deep as that.
        They didn't do anything that OpenAI has also done.
        Trump doesn't like Anthropic's CEO. It's that simple.

        Just another day of Don Donald's mafi^H^H^H^Hadministration.
        • Trump doesn't like Anthropic's CEO. It's that simple.

          Could very well be so, I'm not following the drama that closely. Doesn't mean it isn't a harmful precedent, but with trump what isn't?

      • The designation is not based on some objective feature or lack thereof, it is just a revenge of your convicted felon president and war criminal in chief and his warfighters who want control over what they see is a useful tool to beat the rest of the world, including y'all, into submission.

        Here is the actual story, transcription repeated from here [x.com]:

        @USWREMichael
        says the Maduro raid was the trigger point for the DoW’s conflict with Anthropic:

        “Palantir’s the prime contractor. [Anthropic] is the sub.”

        “One of [Anthropic’s] execs called Palantir and asked, ‘Was our software used in that raid?’”

        “So— they’re trying to get classified information. And implying— if they were used in that raid, that it might violate their terms of service.”

        “It raised enough alarm with Palantir, who has a trusted relationship with the Department, to tell me, and I’m like, ‘Holy shit— what if this software went down? Some guardrail kicked up? Some refusal happened for the next fight like this one and we left our people at risk?”

        “I went to Secretary Pete Hegseth and told him what happened.”

        “That was like a ‘Woah’ moment for the whole leadership at the Pentagon that we’re potentially so dependent on a software provider without another alternative that has the right or ability to not only shut it off— maybe it’s a rogue developer who could poison the model to make it not do what you want, or trick you, or hallucinate purposefully.”

        “That culminated in the Tuesday dramatic meeting with Secretary Hegseth and me and Dario with the Friday deadline that got blown.”

        “I never really thought they wanted to make it.”

        So, no, contrary to your unsourced claim, it was based on (a) a specific incident (b) material concern about the implications of the specific incident (c) escalation through the chain of involved parties (d) without apparent direction by the "convicted felon president and war criminal in chief".

        Also pretty clear from the anecdote how it is that DoD has sincere concerns with actual grounding. Debate over the legitimacy of those concerns, the ethical

  • by machineghost ( 622031 ) on Thursday March 05, 2026 @09:08PM (#66025466)

    Trump has done a lot of stuff that is awful, but this is taking it to a completely different level. It's like we've gone back in time 70 years to the Red Scare, when enemies of the government (i.e., enemies of specific powerful people in government) exploited their power to end the careers of individuals, and the very existence of entire companies.

    I don't care if you are a Democrat or a Republican (or anything else): this is very, very wrong. You should be afraid, and you should be contacting your elected representatives and demand they take action!

    • by ArchieBunker ( 132337 ) on Thursday March 05, 2026 @09:10PM (#66025470)

      Hold on, the news channel that runs 24x7 on my tv is telling me this is the fault of illegal immigrants and liberals.

    • Re: (Score:2, Insightful)

      by Petersko ( 564140 )

      "this" is taking it to a completely different level?

      The US is so far gone that I've abandoned feeling anxiety about it. I'm just enjoying watching them dig the hole deeper and deeper.

      • by caseih ( 160668 ) on Friday March 06, 2026 @01:15AM (#66025724)

        I take no joy in it. It's amazing to me how quickly constitutional democracy is dying in the US, and how few of my American friends actually care (they honestly believe that the next guy will be a sane Republican and everything will go back to normal). If it was only their own lives they were ruining I might not care so much, but this is extending now far beyond the United States. The same forces that are tearing down the institutions of democracy and making a mockery of everything the founding fathers stood for are also marshaling in other countries including mine.

  • by dgatwood ( 11270 ) on Thursday March 05, 2026 @09:24PM (#66025488) Homepage Journal

    "The military will not allow a vendor to insert itself into the chain of command by restricting the lawful use of a critical capability and put our warfighters at risk."

    With that much spin, I'm surprised the Pentagon hasn't started emitting radio signals.

    There's a word for this: extortion. The military has decided that if they cannot use Anthropic's technology in any way they please, that they will just ban all government use, in an attempt to force the company to violate their principles. Here's hoping Anthropic shows them that the real world doesn't work that way by spanking them with a volley of lawsuits that will keep government lawyers employed for the next decade.

    From the very beginning, this has been about one fundamental principle: the military being able to use technology for all lawful purposes

    That's not a principle. That's a functional requirement for the software that you're buying. If it is spelled out in the requirements document when a company bid on the contract, then they are obligated to comply. If you failed to specify the requirements in such a way that you could do whatever you want, tough s**t. That's your mistake. Do better next time. It's no different than any other situation where the government buys something that isn't suitable for what they are trying to use it for. They should have understood what they were buying before they bought it. Period.

    I think it is safe to say that if those requirements were in the contract, the military would have sued for breach by now. So what we have are a bunch of embarrassed generals who failed to do due diligence in their procurement contracts, are realizing how much money they wasted, and are acting in ways that likely violate any number of federal securities laws, among others, to try to force the company into accepting an amended contract or else watch their dreams of an IPO (rumored to happen soon) turn into a nightmare.

    I believe that the maximum penalty for securities fraud is 25 years. Just saying.

    • I'm wondering if perhaps another country would be interested in acquiring Anthropic.

      I'm assuming that would be disallowed, but perhaps an offer could be made to Anthropic's employees for a safe life in a free country, where they can re-establish an AI company.

    • by Bumbul ( 7920730 ) on Friday March 06, 2026 @01:36AM (#66025746)

      There's a word for this: extortion. The military has decided that if they cannot use Anthropic's technology in any way they please, that they will just ban all government use, in an attempt to force the company to violate their principles. Here's hoping Anthropic shows them that the real world doesn't work that way by spanking them with a volley of lawsuits that will keep government lawyers employed for the next decade.

      Yes, this is extortion, and I think the Pentagon has shot itself in the foot with this. Other companies will now think twice before doing any business with the Pentagon. Sure, they might get a few extra bucks from a DoD deal, but they also risk losing many times that if/when the Pentagon uses this extortion tactic again.

    • the real world doesn't work that way

      I'm sure the supreme court Trump cronies will disagree.

      • by dgatwood ( 11270 )

        the real world doesn't work that way

        I'm sure the supreme court Trump cronies will disagree.

        By the time it gets to them, it will have been an expensive, annoying court case for two or three years.

  • The government will lose in court... eventually. But the punishment will have already taken place.

    Capitulate or suffer the consequences. Resistance is painful.

    • by caseih ( 160668 ) on Thursday March 05, 2026 @09:45PM (#66025522)

      It's the trump MO for the last year. Do whatever he and his cronies want using tortured legal reasoning, and then when the courts eventually tell him no, the damage is already done and it becomes part of the message: "The activist judges won't let me do what you asked me to do."

      Crazy to see the US becoming more like Putin's Russia every day. I'm not sure anyone is falling out of balconies yet, but there have definitely been suicides.

      • I'm not sure anyone is falling out of balconies yet

        Maybe that's the reason he's making the White House taller and hardscaping the ground next to it.

        • It's ok, the VP's office is across the alley in the EEOB which is several stories higher than the White House. He can have anyone who isn't cooperating defenestrated over there. Hopefully starting with the guy who currently has his shit parked in that office.

    • by swillden ( 191260 ) <shawn-ds@willden.org> on Thursday March 05, 2026 @11:46PM (#66025652) Journal

      The government will lose in court... eventually. But the punishment will have already taken place.

      Capitulate or suffer the consequences. Resistance is painful.

      Painful, but how painful, really? The publicity has boosted Anthropic's subscriptions significantly, and the summary overstates the impact of the label. It is not true that all companies that do business with the DoD will have to cut ties with Anthropic, the label just means that companies that do business with the DoD can't use Anthropic's AI on their DoD contracts. They're still free to use it for any non-DoD work they do, or to run their own business operations.

      So, yes, it'll inflict some pain, but I think they can handle it. Anthropic is far healthier than OpenAI, financially.

      • by caseih ( 160668 )

        15% growth overnight is impressive, but a) how much of that is really going to stay long term, and how much is just individuals' curiosity buying a few months worth and b) what percent of their business is suddenly lost to all the DoD contractors and subcontractors? I suspect they've lost far more than they've gained. I commend them for sticking by their principles, if only temporarily, and even if I might disagree with their reasoning.

      • It's also absolutely ridiculous. The idea that an engineer asks Claude to help debug why he has bad type assignments in some React frontend, blindly accepts the code without review, and thereby somehow slips some critical security flaw into an actively-used weapons system in a theater of war is absurd.

        This is extortion. Period.

    • The government will lose in court... eventually. But the punishment will have already taken place.

      Capitulate or suffer the consequences. Resistance is painful.

      One thing arrogance always overlooks when getting rich in the United States of Capitalism: you are still a subject in the country.

      This is the equivalent of eminent domain. Anthropic can either work with the government, or learn how they'll take it from them anyway, under the guise of National Security.

      Don't like it? Find another more-profitable and less-corrupt Capitalist market. Good luck.

    • Right now most AIs do a shitty job picking which stocks to buy - you want one that also understands horoscope reading and faith healing. Ronald Reagan did do horoscope consultations, after all. Obviously if you ask about Iran outcomes, not knowing the phases of Jupiter or Mars is critical. The US military does not employ enough fortune tellers either. I think you will find all the other AI (or PI - Pretend Intelligence) engines out there lack sufficient gypsy training.
  • This is the same military that just sits and frets because it has all kinds of widgets that need a vendor tech called out to service them; but suddenly they've never heard of being restricted from a lawful use before... I don't expect honesty from these guys, but the transparency of their lies is pretty pathetic.
    • by znrt ( 2424692 )

      apparently they used palantir maven to collect intel data, which claude then used to produce target lists with coordinates, ordnance recommendations and legal justifications. this process would usually take weeks, but it allowed them to list and strike over 1000 targets in iran in a single day. supposedly the targets are verified by a human official, although verifying 1000 targets with any rigour in 24h seems a bit much.

      incidentally, it has now transpired that the strike on a school that killed over 100 gi

      • Why do you bother writing this hateful drivel? You go from saying, "Oh, it's not Israel's fault, I guess" to "But they're evil and do that on purpose anyway". Honestly, sick antisemitic fucks like you should kill yourselves and save the rest of us from your disgusting presences. You're not smart or edgy. You're just a dumb cunt.
        • by znrt ( 2424692 )

          Why do you bother writing this hateful drivel?

          sick antisemitic fucks like you should kill yourselves and ...

          strong the cognitive dissonance is!

          save the rest of us from your disgusting presences.

          know what ... i think i'll keep expressing my opinions freely on the open internet, thank you. if that bothers you, you're welcome to ignore it ... or to vomit your bombastic deathwishes to clearly show where you morally stand, that's fine! note i'll also keep using the word "antisemitism" in its universally accepted meaning according to which absolutely nothing i said can even be remotely related to it and cheerfully ignore this pathetic attempt at subverting language in

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        It's actually funny that in bombing 1000 targets, America killed a couple of orders of magnitude fewer people than the mad mullahs themselves have killed of their own civilians just recently: 100 or so vs 30,000.

        Where were your crocodile tears for the innocent civilians killed by Iran's own government?

        It's completely unsurprising, since Iran doesn't care how many of its own people it kills. (They're even worse than Palestinians in this regard...)

        Iran's Islamic regime officials are threatening to kill Iranian children who show even slight opposition

        A member of the National Security and Foreign Policy Commission of Iran’s Islamic Consultative Assembly stated that death sentences will be issued for anyone who does not obey them.

        “If your children speak in support of the enemy, the order to shoot them has already been given,” he said.

        He added that any young person who is even slightly pro–U.S. or pro–Israel is treated like Netanyahu:

        “We’ll shoot and kill them without hesitation. We do not want to kill your youth, but we must, because they are ignorant.”

        But don't accidentally kill a handful of civilians saving the entire country for tho

        • by MachineShedFred ( 621896 ) on Friday March 06, 2026 @05:27PM (#66026938) Journal

          Why does any criticism of the US injecting itself into the internal affairs of another country militaristically without legal justification result in "BUT BUT BUT THEY'RE EVIL SHITBAGS THAT DESERVE KILLIN"?

          I don't give a shit if they're evil shitbags that deserve killing. If they deserve it so bad, then JUSTIFY THE ACTION WITH THE LAW.

          An illegal war to kill evil shitbags is still an illegal war, you fucking dumbass.

          • What is a legal war? Can you explain with some examples where both parties agreed it's legal and fair?

            It's a serious question.

            • by znrt ( 2424692 )

              What is a legal war ?

              the legal framework for war is quite clear. respecting and enforcing it is another matter. the mechanisms to get authorization are often subverted (which means those actions may be technically legal but still fraudulent) and even more often are ignored altogether.

              the us war against iraq in 1991 had a mandate from the un security council, which is the sole international authority that can authorize military action, and thus was a legal action according to international law. the us tried to reuse this same mandate to invad

              • The Vietnam War was authorized by Congress under the Gulf of Tonkin Resolution.

            • Read the United States Constitution for what it takes to have a legal war in US law - i.e. the approval of the Congress of the United States, which absolutely was not gained here.

              Or, read the UN Charter which spells out what is necessary for military action without it being defense against an imminent threat (which did not exist).

              Nice try at equivocation, but there are rules in international law about this kind of thing, and they absolutely were not followed. This is a war of aggression with no justificati

              • I asked for some examples of legal war where both countries agreed it is legal.

                Simple question really. :)

                • Oh, so you're asking a loaded question with no answer, unless you want to go back to World War II where everyone still formally declared war on each other.

                  The game isn't played that way any more. You should know that from any of the Soviet invasions of Eastern Bloc countries that never had "war declared"; or the Vietnam War which was undeclared but for a "Gulf of Tonkin Resolution" that "authorized the use of military force."

                  War declarations don't happen any more. The good news is that they don't really h

  • by abulafia ( 7826 ) on Thursday March 05, 2026 @09:51PM (#66025528)
    Since no government contractors can do business with Anthropic, that means Anthropic can't sell access to chat content to them or directly to the federal LEOs.

    So Anthropic should be less surveilled by the US government than OAI and the other weasels sucking on Kegseth's nipples.

    • weasels sucking on Kegseth's nipples.

      The next time I need to pass a polygraph, this phrase will come in handy. Thanks, Slashdot!

    • Since no government contractors can do business with Anthropic, that means Anthropic can't sell access to chat content to them or directly to the federal LEOs.

      LOL, seriously dude? NOTHING can stop surveillance, not even laws. Stop trying to be logical, as those who pretend to be your rulers do not care for logic at all. They only care about what they want in the near future.

  • The Mafia, you decide.

  • by Tony Isaac ( 1301187 ) on Thursday March 05, 2026 @10:11PM (#66025554) Homepage

    From the very beginning, this has been about one fundamental principle: the military being able to use technology for all lawful purposes.

    There is no such principle, this is completely made up. To make it worse, there are basically no laws restricting AI use, which means the Pentagon is asserting that they must be able to use it for literally anything they want to.

    • We'll sell you this missile, but you have to click through ten different license screens and attest to not shooting it at people we, the vendor, not you, the military customer, decide are off limits.

      We'll sell you this office furniture, but not if you plan to put it in an immigration enforcement facility. https://www.cbsnews.com/news/w... [cbsnews.com]

      We'll sell you this software, but we get to decide how you use it, not you.

      Fucking joke.

      • You have some valid points, but they don't make the original statement about "all lawful purposes" true.

      • You mean like the jets the US has been selling to their allies, the ones with the kill switch, yeah?

        Apparently it is a thing.

      • We'll sell you this software, but we get to decide how you use it, not you.

        Are you too stupid to have ever read an EULA?

        That is exactly how *all* software is sold.

        • Government and military software is not consumer software. The ToS are part of the negotiated contract, as with any procurement of bespoke anything.

          The closest I've seen from my little corner of it is a vendor insisting on "best effort" language to let them off the hook if the part they delivered turned out to have been specified a little too ambitiously for their manufacturing process.

          This was for an optical assembly. And it's still different from the vendor insisting that their optics only be used to l

          • If Anthropic violated the terms of any contract that has already been agreed to, the gov't would have said that.

            The government is obviously trying to void the terms of a contract they've already agreed to, however. Otherwise, they wouldn't have their panties in a wad about Anthropic not "letting them do stuff".

            Consumer or not, there is a contract. The biggest difference is that the consumer doesn't get to negotiate and has to scroll it in a tiny text box. The EULA for the software in a system I just bought

            • Pretty sure that EULA said the vendor is not responsible for damage if you use it to control nuclear power plants. Just like the GPL and BSD licenses say the software is provided "as is" and the authors are not liable for property damage and loss of life.

              • They might be responsible for the user using their tool to cause a meltdown under some legal theories, so they explicitly put that in the contract to make sure.

                Likewise, I'm sure Anthropic doesn't want to get hauled in front of a Nuremberg-style tribunal one day for facilitating crimes against humanity. So they avoid going into that zone. This has never been a secret, so I'm not sure why the Pentagon is having a panic attack about it all of a sudden.

                • And which legal theories are those? The sad toddler school of jurisprudence, where it's totally your fault for getting bitten, since not letting me play with your toy is the unstoppable force that compelled my teeth to close around your hand?

                  • The legal theory of whoever has the highest powered lawyers.

                    You are hopelessly naive. You think that the world actually works according to the preconceived principles in your nutjob head. Well, it doesn't.

      • If you don't like the terms of the contract, don't sign it and find another supplier that doesn't insist on those terms.

        Isn't that the "free hand of the market" slapping your hypocrisy across the face again?

    • by swillden ( 191260 ) <shawn-ds@willden.org> on Friday March 06, 2026 @12:15AM (#66025692) Journal

      From the very beginning, this has been about one fundamental principle: the military being able to use technology for all lawful purposes.

      There is no such principle, this is completely made up. To make it worse, there are basically no laws restricting AI use, which means the Pentagon is asserting that they must be able to use it for literally anything they want to.

      Well, to be fair, there are legal restrictions on what the DoD can do. For example, they can't blow up random boats off of the South American coast, they can't occupy American cities, they can't unilaterally invade a foreign country and kidnap its head of state, and they can't just start bombing shit without any congressional authorization.

      So, you know, they're restricted to using the AI only for things that they are legally allowed to do.

  • Blackmail (Score:5, Insightful)

    by Tony Isaac ( 1301187 ) on Thursday March 05, 2026 @10:13PM (#66025558) Homepage

    By using this designation, not only is the Pentagon saying they won't use Anthropic's products, they are saying that no other military contractors can use Anthropic's products. This is much, much more serious than just the loss of the contract, they are basically shutting out Anthropic from a very wide swath of the market. Microsoft wants to sell Office to the Pentagon? It can't use Anthropic's products. Just about all large companies will be affected. This is nothing short of blackmail.

  • I asked Claude what to make of this. Claude responded by auditing Pete Hegseth's finances and recommending an extended stay at the Betty Ford Clinic.

  • How dare they? Waaah!

    This makes them our enemies. If you play with them, we won't play with you.

  • Yeah, people with morals are a supply chain risk when certain people are in power.
  • "From the very beginning, this has been about one fundamental principle: the military being able to use technology for all lawful purposes," the Pentagon said in the statement.

    and

    OpenAI CEO Sam Altman wrote in a memo to staff that he will draw the same red lines that sparked a high-stakes fight between rival Anthropic and the Pentagon: no AI for mass surveillance or autonomous lethal weapons.

    https://slashdot.org/story/26/... [slashdot.org]

  • Threaten flaming nuclear death and settle for a 15% reduction in contract price, which he can loudly brag about. Not mentioned is the reduction in service depth, or the verbal agreement to expand and relocate Anthropic's offices in NYC & San Fran into properties owned or managed by Trump or Kushner. Everybody wins! Making Amerika great.
  • by BlueKitties ( 1541613 ) on Friday March 06, 2026 @12:00AM (#66025670)

    Full disclaimer: my preferred AI agent is Claude 4.6 Opus running in Claude Code. I'm a paid subscriber and have no plans to go anywhere.

    However, I have never once heard of a weapons company stipulating in a contract that the Pentagon shall not bomb a school (which just happened in Iran, intentionally or otherwise). Fact of the matter is, anyone and everyone who does business with the Pentagon shares just as much culpability as Anthropic or anyone else. PRISM ran on silicon supplied by private companies, and they never took heat for selling computers to the Pentagon. I will admit, though, that AI is a unique technology in the grand scheme of history, so I can understand why people would feel an extra layer of caution is warranted. But OpenAI is catching flak for selling AI to the Pentagon, so what about Amazon or Microsoft cloud servers that could be running PRISM 2.0? Or weapons companies supplying bombs that might hit a school? There's a huge double standard here because AI is the new shiny scary thing. Northrop Grumman is currently developing actual ICBMs, and I don't see anyone insisting that NG must contractually restrict how the Air Force fields the nukes it creates.

    The restriction on Anthropic is fairly narrow, too. It only applies to the use of Claude directly to complete contracts with the Pentagon. The reality is, they ARE a legitimate supply chain risk. If Claude has guardrails so intense that it could refuse to fire or stop cooperating, then it can't be allowed to end up in the military tech supply chain inadvertently via a contractor. What happens when an agent involved in weapons manufacturing systems suddenly stops working because it determines the weapons are being misused? Even if Claude isn't built directly into a weapon, it could decide to stop cooperating if it believes it's being forced to participate in something unethical indirectly. I think some people are rushing to see this as a punitive move, when in reality there is an actual supply chain issue with an AI bot that might suddenly refuse to cooperate.

    • I'm also a paid subscriber to Claude. It's very powerful, and very helpful.

      Is anyone seriously putting an LLM in the position of making autonomous firing decisions? I exercise sensible caution in trusting any LLM to make serious decisions about coding, or about extracting data from documents. It's good, but if you're putting it in the position of making life-and-death decisions without human oversight or control, functionally we should also be letting it make judgements about surgery, medical imaging, court

    • I think some people are just rushing to see this as a punitive move, when in reality there is an actual supply chain issue with an AI bot that might suddenly refuse to cooperate.

      They came out and said it: "It will be an enormous pain in the ass to disentangle, and we are going to make sure they pay a price for forcing our hand like this."

      https://www.axios.com/2026/02/... [axios.com]

      However, I have never once heard of a weapons company stipulating in a contract that the Pentagon shall not bomb a school (which just happened in Iran, intentionally or otherwise). Fact of the matter is, anyone and everyone who does business with the Pentagon shares just as much culpability as Anthropic or anyone else.

      If the pentagon doesn't like the terms they don't sign the contract. The reason why is irrelevant. Could be the pentagon does not like the vendors terms when it comes to supply of parts and maintenance for a particular system. Maybe they would prefer to have flexibility to have themselves or a third party handl

  • I mean it's obvious that Whiskey Pete has been letting Claude write his speeches and make his decisions this whole time, right?

    It's even obvious that Claude is sourcing Colin Jost's "Weekend Update" performances for some of it.

    It's probably why he wanted Anthropic to take off all the guard rails so that he can use it to commit more and better war crimes.

  • That tells you all you need to know about this administration and Kegseth running it
  • From the article.. "Anthropic CEO Dario Amodei told Defense Secretary Pete Hegseth last week that he would not allow Claude to be used to surveil American citizens or empower autonomous weapons."

    That one sentence is what this is all about. Hegseth wants the ability to spy on us without restriction and the ability to kill without human intervention. Anthropic won't agree to those terms, and so as punishment they are being labeled a supply chain risk by the government.
  • Which is more concerning: that they can't understand that it cannot be a supply chain risk if they want it so badly, or that they are misusing this designation as a punishment because they can't get what they want?

    These people have weapons more dangerous than a breadstick - definitely concerning.

  • It's Trump and Hegseth's Pentagon that constitutes the risk to national security these days.

    Am I right?
    Check back in a couple of months after the US government-initiated Iranian asymmetric attack on the US homeland or ordinary citizens.
  • Kegsbreath wants to have chatbots killing people, and he's all upset that Anthropic said no. Tell me again what's lawful about his desires.

    And was it a chatbot that targeted the girls' school in Iran?


  • They wanted Anthropic to help with their "Department of War" AI. So obviously they had grave supply chain concerns.

    You want your new war AI to be developed by a supplier you have security concerns about... oh, wait, no.

    Ohhh right, it's bully logic. They didn't get what they wanted, so they gotta put the smackdown on. This is so surprising!

    Trump and Hegseth are a safe and reasonable duo. Not at all a risk to national security with their bribes, security leaks etc.

    Welcome to soviet America where government black
  • As determined by the beliefs of the person giving the order, prior to validation by a court of law some half decade down the road. Meanwhile, contractors have no right to set terms on how their services are used or abused, essentially removing their First Amendment rights. However the Pentagon chooses to use your tool is valid, and you're obligated to support it.
