US judge to weigh Anthropic’s bid to undo Pentagon blacklisting

World
ARY News EN
2026/03/24 - 11:42 | 502 views

A U.S. judge is set to hear oral arguments on Tuesday in a lawsuit by Anthropic seeking to block the Pentagon’s blacklisting of the artificial intelligence lab over its refusal to lift certain restrictions on its Claude AI model.

Anthropic’s lawsuit in California federal court alleges that Defense Secretary Pete Hegseth overstepped his authority when he designated Anthropic a national security supply chain risk. The government can apply that label to companies that expose military systems to potential infiltration or sabotage by adversaries.

Hegseth’s unprecedented move, which followed Anthropic’s refusal to allow the military to use Claude for U.S. surveillance or autonomous weapons, blocks Anthropic from certain military contracts. It could cost the company billions of dollars this year in lost business and reputational harm, Anthropic executives said on March 9.

The company says AI models are not reliable enough to be safely used in autonomous weapons and that it opposes domestic surveillance as a violation of rights.

ANTHROPIC DESIGNATION FIRST FOR U.S. COMPANY

U.S. District Judge Rita Lin in San Francisco, an appointee of former Democratic President Joe Biden, is set to hold a hearing at 1:30 p.m. PT (2030 GMT) over Anthropic’s request for an initial order blocking the designation while the case plays out.

Anthropic’s designation was the first time a U.S. company has been publicly designated a supply chain risk under an obscure government-procurement statute aimed at protecting military systems from foreign sabotage.

In its March 9 lawsuit, Anthropic alleged the government violated its right to free speech under the First Amendment of the Constitution by retaliating against its views on AI safety. The company said it was not given a chance to dispute the designation, in violation of its Fifth Amendment right to due process.

The lawsuit says the decision was unlawful, unsupported by facts and inconsistent with the military’s past praise of Claude.

The Justice Department countered that Anthropic’s refusal to lift the restrictions could cause uncertainty in the Pentagon over how it could use Claude and risk disabling military systems during operations, according to a court filing.

The government said the designation stemmed from Anthropic’s refusal to accept contractual terms, not its views on AI safety.

Anthropic has a second lawsuit pending in Washington, D.C., over a separate Pentagon supply chain risk designation that could lead to its exclusion from civilian government contracts.
