
On Tuesday, Micron Technology, Inc. (NASDAQ:MU) disclosed that it is the first and only memory company to ship both HBM3E and SOCAMM (small outline compression attached memory module) products for AI servers in data centers.
Micron’s SOCAMM, a modular LPDDR5X memory solution developed in collaboration with NVIDIA, is designed to support the NVIDIA GB300 Grace Blackwell Ultra Superchip.
Additionally, Micron’s HBM3E 12H 36GB is integrated into the NVIDIA HGX B300 NVL16 and GB300 NVL72 platforms, while the HBM3E 8H 24GB is available for the NVIDIA HGX B200 and GB200 NVL72 platforms.
The deployment of Micron’s HBM3E products in NVIDIA Hopper and NVIDIA Blackwell systems highlights Micron’s crucial role in enhancing AI workload performance.
These high-performance memory solutions are key to unlocking the full potential of GPUs and processors, supporting the secular growth of AI.
Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit, said, “AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron’s contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications.”
Last month, the company announced it had shipped samples of its 1γ (1-gamma) sixth-generation (10nm-class) DRAM node-based DDR5 memory, designed for next-generation CPUs, to ecosystem partners and select customers.
Investors can gain exposure to the stock via Direxion Daily MU Bull 2X Shares (NASDAQ:MUU) and SPDR Galaxy Digital Asset Ecosystem ETF (NASDAQ:DECO).
Price Action: MU shares are up 0.18% at $101.90 in premarket trading at last check Wednesday.
Read Next:
- Micron Technology Q2 Earnings Preview: Results In-Line, Soft Guide Likely
Photo via Shutterstock.