Gen AI, HPC to fuel HBM market growth

The market for high bandwidth memory (HBM) is poised for rapid growth over the next five years, according to Yole Group's latest analyses. The analysts attribute this growth to the continuous expansion of data-intensive artificial intelligence (AI) and high-performance computing (HPC) applications. As a result, the HBM sector will far outgrow the overall DRAM market and remain undersupplied throughout 2024.

What are the latest innovations? What impact will they have on the ecosystem? Simone Bertolazzi, Principal Analyst, and Emilie Jolivet, Director of More Moore activities at Yole Group, offer a snapshot of this industry.

This article is based on the key results of the Next-Generation DRAM 2024 – Focus on HBM and 3D DRAM report published this month. More information about Yole Group’s memory products, including specific teardowns of the Nvidia H100 Tensor Core GPU and the AMD 3D V-Cache with TSMC SoIC 3D Packaging, is available from Yole Group.

The rapid rise of generative AI has boosted demand for high-speed DDR5 DRAM and HBM technologies in the data center market. AI workloads are driving the need for higher bandwidth to increase data transfer rates between devices and processing units.

Hyperscalers and original equipment manufacturers (OEMs) are increasing their server capacity to support model training and inference, requiring more AI accelerators. This in turn is driving strong growth in the HBM attached to these accelerators. Demand for data center accelerators exceeded four million units in 2023 and is poised to nearly double in 2024.

At Yole Group, Simone Bertolazzi and Emilie Jolivet estimate that, following an impressive 93% year-on-year increase in bit shipments, data center DRAM bit demand could grow at a compound annual growth rate (CAGR) of 25% over 2023-2029, driven by 39% growth in DRAM for AI servers over the same period.
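
For context, a 25% CAGR compounded over the six annual steps from the 2023 base to 2029 corresponds to roughly a 3.8-fold increase in bit demand. This is a back-of-the-envelope illustration of the CAGR definition, not an additional figure from the report:

\[
\text{CAGR} = \left(\frac{V_{2029}}{V_{2023}}\right)^{1/6} - 1
\quad\Longrightarrow\quad
\frac{V_{2029}}{V_{2023}} = (1 + 0.25)^{6} \approx 3.8
\]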

The share of HBM in overall DRAM bit shipments is forecast to rise from approximately 2% in 2023 to 6% by 2029, as AI server demand outpaces other applications. But because HBM is priced significantly higher than DDR5, in revenue terms the HBM market is anticipated to climb from $14 billion in 2024 to $38 billion in 2029, after soaring by more than 150% year on year from around $5.5 billion in 2023.
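
For readers who want to check the arithmetic, the quoted figures are consistent with the standard growth formulas. The back-of-the-envelope below uses the rounded revenue points above; the implied 2024-2029 growth rate is a derived value, not a figure quoted in the report:

\[
\frac{14}{5.5} - 1 \approx 1.55 \;\;(\text{about }{+}155\%\ \text{year on year, 2023 to 2024}),
\qquad
\left(\frac{38}{14}\right)^{1/5} - 1 \approx 0.22 \;\;(\text{about }22\%\ \text{CAGR, 2024 to 2029})
\]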

Competition intensifies as suppliers vie for leadership

Simone Bertolazzi, PhD, Principal Analyst, Memory at Yole Group
To take advantage of the new generative AI wave and to speed up the market recovery process, Samsung, SK Hynix and Micron have started diverting more of their wafer capacity to address HBM opportunities, leading to an overall bit production slowdown and accelerating the shift to undersupply for non-HBM products.

Memory suppliers have ramped up their HBM wafer production, which Yole Group estimates grew from 44,000 wafers per month (WPM) in 2022 to 74,000 WPM in 2023 and could reach 151,000 WPM in 2024.

SK Hynix is leading the development and commercialization of HBM, but its competition with Samsung is becoming more intense. Micron has a relatively small market share compared with the South Korean companies but is ramping up its production to capitalize on the market opportunity.

AI demand accelerates rollout of new HBM generations

The need to increase bandwidth for HPC applications is accelerating suppliers’ roadmaps for the development and commercialization of new HBM generations. Companies are striving to hit their launch targets for each generation in order to be well positioned for the strong demand to come.

SK Hynix gained a significant advantage with the introduction of HBM3 in the second half of 2022, and while this generation and its extended version HBM3E are still in the early stages of deployment, all three key players are planning to introduce HBM4 in 2026.

In addition to the anticipated growth in orders from customers such as NVIDIA and AMD, the likes of Google and Amazon plan to start manufacturing their own AI accelerators to power their AI-based applications.

Emilie Jolivet, Business Line Director, More Moore activities at Yole Group
All the buyers are competing to be AI-ready, to make sure they have the infrastructure to serve the needs of AI. That is why we see them spending a lot of money and buying servers equipped with AI chips and HBM at prices that may overestimate its value; they are strong enough to deal with the supply chain on their own.

Rapid growth clouds market visibility

With existing suppliers expanding capacity and multiple new entrants buying HBM to leverage generative AI for new solutions, the market is changing rapidly – making it challenging to quantify.

The next several quarters will be a time for suppliers to recover from the extended market downturn. Yole Group expects full-year 2024 and part of 2025 to be marked by undersupply and rising prices, and will continue monitoring the market as it evolves.

Stay tuned!

Related article

AI requirements to hasten small NAND phase-out. Artificial intelligence (AI) has been incorporated into smartphone features for several years, but the implementation of adapted Large Language Models (LLM) in high-end devices could result in a pull forward in demand for DRAM and an acceleration in the phase-out of the smallest NAND storage capacity…

About the authors

Simone Bertolazzi, PhD, is Principal Analyst, Memory at Yole Group.

As a member of Yole Group’s Memory team, he contributes on a day-to-day basis to the analysis of markets and technologies, their related materials, device architectures and fabrication processes.

Previously, Simone carried out experimental research in the field of nanoscience and nanotechnology, focusing on emerging semiconducting materials and their applications in optoelectronic devices. He co-authored more than 20 papers in scientific journals and was awarded the prestigious Marie Curie Intra-European Fellowship.

Simone obtained a PhD in physics in 2015 from École Polytechnique Fédérale de Lausanne (Switzerland), where he developed novel flash memory cells based on heterostructures of two-dimensional materials and high-κ dielectrics. Simone earned a double M.A.Sc. degree from Polytechnique de Montréal (Canada) and Politecnico di Milano (Italy), graduating cum laude.

Emilie Jolivet is Business Line Director of the More Moore activities at Yole Group.

Based on her valuable experience in the semiconductor industry, Emilie manages the expansion of the technical and market expertise of the memory and computing team.

In addition, Emilie’s mission focuses on the management of business relationships with semiconductor leaders and the development of market research and strategy consulting activities inside Yole Group.

Prior to joining Yole Group, after an internship in failure analysis at Freescale (France), Emilie spent seven years as an R&D engineer in the photovoltaic business, where she co-authored several scientific articles. She then worked at EV Group (Austria) as a business development manager in 3D & Advanced Packaging.

Emilie Jolivet holds a Master’s degree in Applied Physics specializing in Microelectronics from INSA (Toulouse, France).  She also graduated with an MBA from IAE Lyon.
