
AI’s Memory Crisis Is Here: Don’t Hoard, Optimize

Technology
Forbes
2026/05/13 - 10:00 · 503 views
By Liran Zvibel, Cofounder & CEO, WEKA; Forbes Councils Member, for the Forbes Technology Council. May 13, 2026, 06:00am EDT.

Training AI demands raw GPU compute. Inference demands something else entirely: memory. The GPUs powering today's models carry limited high-bandwidth memory (HBM) before external memory is required. That's the memory wall, and at inference scale, every model hits it. As the industry shifts from training to inference, memory has become the defining constraint in AI infrastructure. DRAM supply remains tight amid strong demand, and I'm seeing enterprises pay 300-1100% more for memory chips than they did last year, a number expected to climb fast. Procurement timelines that once took days now stretch to months. Yet the most common response I'm seeing across the industry is the wrong one: hoard more hardware.

I've spent the last year talking with customers across every segment of the AI market: neoclouds, large enterprises and AI model builders. The pattern is consistent. Organizations compensating for architectural inefficiency by buying more capacity are now exposed. With memory stockpiles constrained through late 2027, that strategy no longer works.

The shortage didn't create the problem. It's the forcing function that revealed what was always broken, and made it impossible to ignore.

AI-Optimized Chips, Crippling Memory Scarcity

When chip manufacturers in 2024 shifted wafer capacity away from dynamic random access memory (DRAM) and toward AI-friendly HBM, global DRAM supply contracted sharply. The inventory situation has deteriorated to two to four weeks of product on hand, and the shortfall is expected to persist through late 2027.

This crisis exposes a fundamental flaw that has been percolating for years: the AI industry has been...
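To see why inference hits the memory wall where training does not, a rough back-of-the-envelope sketch helps: serving a large model requires holding a KV cache for every token of every concurrent request, on top of the weights themselves. The configuration numbers below (80 layers, 8 grouped-query KV heads, head dimension 128, fp16) are assumptions, roughly in line with a Llama-2-70B-class model, not figures from the article:

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, batch: int, bytes_per_elem: int = 2) -> int:
    """Estimate KV-cache size for a transformer decoder.

    The factor of 2 accounts for storing both keys and values
    at every layer for every cached token.
    """
    return 2 * layers * kv_heads * head_dim * bytes_per_elem * seq_len * batch


# Assumed 70B-class config: 80 layers, 8 KV heads, head_dim 128, fp16 (2 bytes)
cache = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128,
                       seq_len=4096, batch=32)
print(f"{cache / 2**30:.0f} GiB")  # → 40 GiB
```

Forty gibibytes of cache for just 32 concurrent 4K-token requests, before counting the weights, is why HBM capacity, not raw FLOPS, becomes the binding constraint at inference scale.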
