
SK hynix develops world’s first 24GB HBM3 with 12 stacked DRAM chips

SK hynix has developed a new HBM3 product with the world's highest capacity of 24 gigabytes (GB) by vertically stacking 12 individual DRAM chips for the first time in the world. HBM stands for high-bandwidth memory. Demand for this product is expected to rise in the rapidly growing generative artificial intelligence (AI) market, as its bandwidth is high enough to transmit 163 full-HD movies in one second.

SK hynix has succeeded in developing a 24 GB package product with 50 percent more capacity than its predecessor, following its world-first volume production of HBM3 in June 2022, according to industry sources on April 20. The previous HBM3 reached a maximum capacity of 16 GB by vertically stacking eight DRAM chips. HBM's key innovation over conventional DRAM is this vertical stacking of multiple chips, which increases data-processing speed.
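The capacity figures above are internally consistent, which a quick calculation shows. The per-die capacity of 2 GB is not quoted directly by SK hynix here; it is derived from the stated 16 GB / 8-die configuration.

```python
# Capacity arithmetic implied by the article. The per-die capacity is
# derived from the previous product (16 GB across 8 dies), not quoted
# directly by SK hynix.
DIE_CAPACITY_GB = 16 / 8                      # -> 2 GB per DRAM die

new_capacity = 12 * DIE_CAPACITY_GB           # 12-high stack
increase_pct = (new_capacity - 16) / 16 * 100

print(new_capacity)   # 24.0 GB
print(increase_pct)   # 50.0 percent more than the previous 16 GB
```

Twelve dies at the same density as before yield exactly the 24 GB and 50 percent increase the article reports.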

SK hynix engineers applied advanced MR-MUF and TSV technologies to this product. MR-MUF (Mass Reflow Molded Underfill) is a process that, after the semiconductor chips are stacked, injects a liquid protective material into the spaces between them and hardens it to protect the circuits between chips. Compared with laying down a film-type material each time a chip is stacked, this process is more efficient and also dissipates heat more effectively. The advanced MR-MUF technology enhances process efficiency and the stability of product performance.

TSV (Through-Silicon Via) is an advanced packaging technology that drills thousands of fine holes in the DRAM chips and connects the upper and lower chips with electrodes that run vertically through them. With this technology, SK hynix's HBM3 achieves a bandwidth of up to 819 GB per second, enough to transfer 163 full-HD (FHD) movies in one second.
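The 163-movies-per-second claim can be checked against the 819 GB/s bandwidth figure; the calculation below recovers the FHD movie size the comparison assumes (roughly 5 GB per movie, an implied value, not one the article states).

```python
# Sanity check of the bandwidth comparison: how large is one "FHD movie"
# if 163 of them fit through 819 GB/s in one second?
BANDWIDTH_GB_PER_S = 819   # stated HBM3 bandwidth
MOVIES_PER_SECOND = 163    # stated transfer figure

gb_per_movie = BANDWIDTH_GB_PER_S / MOVIES_PER_SECOND
print(round(gb_per_movie, 2))   # ~5.02 -> about 5 GB per FHD movie
```

A ~5 GB file is a plausible size for a full-HD movie, so the two figures in the article agree with each other.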

Using TSV technology, SK hynix kept the new product at the same height as the 16 GB version by vertically stacking 12 DRAM chips that are each 40 percent thinner than before. HBM, which SK hynix first developed in 2013, is regarded as an essential memory semiconductor for generative artificial intelligence (AI), which requires high-performance computing. HBM3, the latest standard, is the optimal memory for quickly processing large volumes of data, and demand from big tech companies is steadily rising.
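The same-height claim also follows from the stated 40 percent thickness reduction. Actual die thicknesses are not disclosed, so the sketch below uses relative units, with the previous die thickness set to 1.

```python
# Stack-height arithmetic implied by the article, in relative units
# (the previous die thickness is normalized to 1; real values are not
# disclosed in the article).
PREV_DIE = 1.0
THIN_DIE = PREV_DIE * (1 - 0.40)   # new dies are 40 percent thinner

prev_stack = 8 * PREV_DIE          # 16 GB product: 8 dies
new_stack = 12 * THIN_DIE          # 24 GB product: 12 thinner dies

print(new_stack)                   # 7.2 vs. 8.0
print(new_stack <= prev_stack)     # True -> 12-high stack fits the same height
```

The 12-high stack of thinned dies is in fact slightly shorter than the old 8-high stack, which is what allows the package height to stay unchanged.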

SK hynix aims to supply the product in the second half of the year, in line with growing demand for premium memory driven by the recent growth of the AI chatbot (conversational robot) industry. HBM3 24 GB samples are currently being provided to global customers for performance verification.
