SK hynix pumps billions into HBM chips to meet AI demand

It’s already sold out of stock for this year and next. An article by Dan Robinson for The Register.

High bandwidth memory (HBM) is becoming a key technology in the continued AI investment race as SK hynix plans to spend billions on memory chip production and China’s Huawei looks to develop its own in partnership with a local foundry.

SK hynix, the world’s second-biggest memory chipmaker, is set to invest ₩103 trillion ($74.5 billion) in boosting its semiconductor division between now and 2028, the company announced after a management strategy meeting at the end of June.

According to the investment plan, 80 percent of that total (₩82 trillion, or about $60 billion) will be directed toward AI-related business areas such as HBM, increasing production capacity to meet growing demand, BusinessKorea reported.

As The Register reported a while back, SK hynix has already sold all the HBM chips it will manufacture this year, as well as most of its expected 2025 production, owing to demand driven by the AI craze. That demand stems partly from the fact that its HBM chips are optimized for use with Nvidia’s top-end GPU accelerators, and the company was an early mover in the HBM market.

HBM was developed as a way of boosting memory bandwidth for key applications by placing the memory dies inside the same package as the CPU or GPU, sometimes stacked directly on top of those chips, so the connections are much shorter. Our colleagues over at Blocks & Files have an explainer on HBM.
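The bandwidth advantage comes from the very wide interface those short, in-package connections make practical. As a rough illustration (not from the article), per-stack bandwidth is simply bus width times per-pin data rate; the 1024-bit bus and 6.4 Gb/s pin speed used below are the figures from the public JEDEC HBM3 specification:

```python
def hbm_stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s.

    bus_width_bits: interface width of the stack (HBM3 uses 1024 bits)
    pin_rate_gbps:  data rate per pin in Gb/s
    """
    # bits/s across the whole bus, divided by 8 to convert to bytes
    return bus_width_bits * pin_rate_gbps / 8

# An HBM3 stack: 1024-bit bus at 6.4 Gb/s per pin -> 819.2 GB/s
print(hbm_stack_bandwidth_gbs(1024, 6.4))
```

By comparison, a conventional 64-bit DDR channel at the same per-pin rate would deliver one-sixteenth of that, which is why accelerators package several HBM stacks alongside the GPU die.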

There have been warnings that industry enthusiasm for HBM could cause a DRAM supply shortage unless more manufacturing lines are brought into play, as demand for this memory is expected to grow 200 percent this year and double again in 2025.
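Those growth figures compound: 200 percent growth means tripling, not doubling, so the projection implies roughly six times current demand by the end of 2025. A quick sketch of the arithmetic (the baseline of 1.0 is an arbitrary unit, not a figure from the article):

```python
baseline = 1.0                       # current demand, arbitrary units
after_this_year = baseline * (1 + 2.00)  # "grow 200 percent" = triple
after_2025 = after_this_year * 2         # "double again in 2025"
print(after_2025)                        # multiple of today's demand
```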