
Micron Sells Out Entire HBM3E Supply for 2024, Most of 2025



Being the first company to ship HBM3E memory has its perks for Micron, as the company has revealed that it has managed to sell out its entire supply of the advanced high-bandwidth memory for 2024, while most of its 2025 production has been allocated as well. Micron's HBM3E memory (or, as Micron alternatively calls it, HBM3 Gen2) was among the first to be qualified for NVIDIA's updated H200/GH200 accelerators, so it looks like the DRAM maker will be a key supplier to the green company.

“Our HBM is sold out for calendar 2024, and the overwhelming majority of our 2025 supply has already been allocated,” said Sanjay Mehrotra, chief executive of Micron, in prepared remarks for the company’s earnings call this week. “We continue to expect HBM bit share equivalent to our overall DRAM bit share sometime in calendar 2025.”

Micron’s first HBM3E product is an 8-Hi 24 GB stack with a 1024-bit interface, a 9.2 GT/s data transfer rate, and a total bandwidth of 1.2 TB/s. NVIDIA’s H200 accelerator for artificial intelligence and high-performance computing will use six of these cubes, providing a total of 141 GB of accessible high-bandwidth memory.
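Those figures check out with a quick back-of-the-envelope calculation; the sketch below (variable names are ours, not Micron's) verifies the per-stack bandwidth and the total capacity:

```python
# Back-of-the-envelope check of Micron's published HBM3E figures.

INTERFACE_WIDTH_BITS = 1024   # bits per stack interface
TRANSFER_RATE_GTS = 9.2       # giga-transfers per second
STACK_CAPACITY_GB = 24        # capacity of one 8-Hi stack
STACKS_PER_H200 = 6           # cubes per NVIDIA H200

# Each transfer moves INTERFACE_WIDTH_BITS bits, so bandwidth in GB/s
# is width * rate / 8 bits-per-byte.
bandwidth_gb_s = INTERFACE_WIDTH_BITS * TRANSFER_RATE_GTS / 8
print(f"Per-stack bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~1177.6 GB/s, i.e. ~1.2 TB/s

raw_capacity_gb = STACK_CAPACITY_GB * STACKS_PER_H200
print(f"Raw capacity of six stacks: {raw_capacity_gb} GB")  # 144 GB physically present
```

Note that six 24 GB stacks add up to 144 GB of raw capacity, while the H200 exposes 141 GB; the remainder is evidently held back, though NVIDIA has not detailed why.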

“We are on track to generate several hundred million dollars of revenue from HBM in fiscal 2024 and expect HBM revenues to be accretive to our DRAM and overall gross margins starting in the fiscal third quarter,” said Mehrotra.

The company has also begun sampling its 12-Hi 36 GB stacks, which offer 50% more capacity. These KGSDs will ramp in 2025 and will be used for subsequent generations of AI products. Meanwhile, it does not look like NVIDIA’s B100 and B200 are going to use 36 GB HBM3E stacks, at least initially.

Demand for artificial intelligence servers set records last year, and it looks set to remain high this year as well. Some analysts believe that NVIDIA’s A100 and H100 processors (as well as their various derivatives) commanded as much as 80% of the entire AI processor market in 2023. And while NVIDIA will face tougher competition this year from AMD, AWS, D-Matrix, Intel, Tenstorrent, and other companies on the inference front, it looks like NVIDIA’s H200 will still be the processor of choice for AI training, especially for big players like Meta and Microsoft, which already run fleets consisting of hundreds of thousands of NVIDIA accelerators. With that in mind, being a primary supplier of HBM3E for NVIDIA’s H200 is a big deal for Micron, as it lets the company finally capture a sizeable chunk of the HBM market, which is currently dominated by SK Hynix and Samsung, and where Micron held only about a 10% share as of last year.

Meanwhile, since every DRAM device within an HBM stack has a wide interface, it is physically larger than regular DDR4 or DDR5 ICs. As a result, the ramp of HBM3E memory will affect the bit supply of commodity DRAM from Micron, the company said.

“The ramp of HBM production will constrain supply growth in non-HBM products,” Mehrotra said. “Industrywide, HBM3E consumes approximately three times the wafer supply as DDR5 to produce a given number of bits in the same technology node.”
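Taken at face value, that roughly 3:1 trade ratio means every HBM3E bit Micron ships displaces about three DDR5 bits it could have built on the same node. A toy illustration of the arithmetic (the wafer count and per-wafer yield are invented placeholders, not Micron figures):

```python
# Toy illustration of the ~3:1 HBM3E-to-DDR5 wafer trade-off Mehrotra cites.
# WAFERS and BITS_DDR5_PER_WAFER are made-up placeholders, not Micron data.

HBM_TO_DDR5_WAFER_RATIO = 3.0  # HBM3E needs ~3x the wafers per bit (per Micron)
WAFERS = 1_000                 # hypothetical wafer allotment
BITS_DDR5_PER_WAFER = 1.0      # normalized DDR5 bit output per wafer

ddr5_bits = WAFERS * BITS_DDR5_PER_WAFER
hbm3e_bits = WAFERS * BITS_DDR5_PER_WAFER / HBM_TO_DDR5_WAFER_RATIO

print(f"The same {WAFERS} wafers yield {ddr5_bits:.0f} DDR5 bit-units "
      f"but only {hbm3e_bits:.0f} HBM3E bit-units")
```

In other words, the faster the HBM3E ramp, the tighter commodity DRAM supply gets, which is exactly the constraint Micron is flagging.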



