The Register on MSN
SK Hynix's $13B packaging facility promises more HBM for the AI bubble
Great news for AMD and Nvidia, less so for cash-strapped consumers. Memory makers just can't churn out their DRAM fast enough.
New AI memory method lets models think harder while avoiding costly high-bandwidth memory, which is the major driver for DRAM ...
SK Hynix, Samsung, TSMC, and Micron are investing billions to meet demand for high-bandwidth memory and advanced chips for the AI boom.
The next generation of high-bandwidth memory, HBM4, was widely expected to require hybrid bonding to unlock a 16-high memory ...
A new technical paper titled “On the Thermal Vulnerability of 3D-Stacked High-Bandwidth Memory Architectures” was published by researchers at North Carolina A&T State University and New Mexico State ...
The global AI boom is driving up smartphone and tablet production costs, as memory suppliers divert more capacity toward high-bandwidth memory (HBM) for AI accelerators. Samsung Electronics and SK ...
As artificial intelligence and high-performance computing continue to advance rapidly, high-bandwidth memory (HBM) paired with GPUs has become a critical battleground in the semiconductor industry.
AMD gains AI chip momentum with high-bandwidth memory, accelerating revenue, 30%+ data center growth, and a 1.2 PEG for ...
AI is driving demand and higher prices for DRAM and NAND into 2026. Products using non-volatile memories to replace NOR and SRAM in embedded applications are growing.