Micron ships HBM4 to key customers; to open India plant in Sanand

Micron HBM4 36GB 12-high products lead the industry in power efficiency for data center and cloud AI acceleration. The company will also invest Rs. 13,000 crore in Sanand, India

DQI Bureau

The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology Inc. has announced the shipment of HBM4 36GB 12-high samples to multiple key customers.


This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.

Leap forward
As the use of generative AI continues to grow, the ability to manage inference efficiently becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance than the previous generation. This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively.
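The quoted figures are consistent with simple bandwidth arithmetic: a wide interface multiplied by a per-pin data rate gives the peak throughput per stack. The per-pin rate below is an assumption for illustration (the article states only the 2048-bit width and the >2.0 TB/s result), but a rate of roughly 8 Gb/s per pin would produce the stated number:

```python
# Back-of-envelope peak bandwidth for an HBM stack.
# Assumption (not stated in the article): ~8 Gb/s per pin, a plausible
# per-pin data rate that, combined with the 2048-bit interface the
# article cites, yields the >2.0 TB/s per-stack figure.

def stack_bandwidth_tbps(interface_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in TB/s: pin count x per-pin rate, bits -> bytes -> TB."""
    return interface_bits * pin_rate_gbps / 8 / 1000  # Gb/s -> GB/s -> TB/s

print(stack_bandwidth_tbps(2048, 8.0))  # -> 2.048 (TB/s per stack)
```

For comparison, HBM3E's 1024-bit interface at the same assumed pin rate would deliver half that, which is in line with the ">60% better performance over the previous generation" claim once higher pin speeds are factored in.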

Additionally, Micron HBM4 delivers over 20% better power efficiency than Micron’s previous-generation HBM3E products, which themselves set industry benchmarks for HBM power efficiency. This improvement provides maximum throughput with the lowest power consumption to maximize data center efficiency.


Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.

"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."

Intelligence accelerated: Micron’s role in AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers’ most demanding solutions.


Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers’ next-generation AI platforms. 

Micron will also invest Rs. 13,000 crore to set up an SEZ facility across 37.6 hectares in Gujarat's Sanand, India, for manufacturing semiconductors.
