Market and Technology Trends
Generative AI 2024 - Impact on Processors, Memory, Advanced Packaging, and Substrates
By Yole Intelligence
A deep dive into the supply chain, its bottlenecks, and its resilience. Which companies are benefiting from generative AI, and who is missing out?
YINTR24454
Three-page summary
Executive Summary
Introduction
- Why generative AI?
- What makes generative AI so appealing to investors?
Generative AI and processors
- GPU and AI ASIC – product and related ecosystem
- Datacenter GPU and AI ASIC – market forecast
Supply chain overview
Memory
- Memory market trends – overview by end-market
- High Bandwidth Memory – a market driven by generative AI
- 2023 HBM – market share estimates
Foundry
Packaging
- High-end performance packaging – all platforms
- GPU and AI ASIC – advanced packaging forecast
- Supply chain for high-end packaging
AI processor Substrates
- Advanced IC substrates for AI accelerators
- IC Substrate – market forecast
- FCBGA – market forecast
Outlook
Related product
Corporate presentation
Datacenter GPU and AI ASIC revenue could reach $156B by 2025, and $233B by 2029.
In 2023, datacenter processor shipments for AI acceleration experienced strong growth, a trend expected to continue through 2024 and 2025. Both flagship GPUs and AI ASICs are expected to see strong growth, and the associated scenarios are discussed in the report. Combined datacenter GPU and AI ASIC revenue is expected to grow from $50 billion in 2023 to more than $200 billion in 2029.
Moreover, the entire supply chain is anticipated to experience the repercussions of this expanding market, encompassing wafer and memory production, as well as substrate and 2.5D/3D packaging. Notably, the HBM market for AI accelerators is projected to experience significant growth, expanding by a factor of 8 between 2023 and 2029. Similarly, IC substrate revenues are anticipated to increase by a factor of 10 during the same period.
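As a rough sanity check, the growth multiples quoted above imply the following compound annual growth rates. This is an illustrative calculation only, using the round figures from this summary rather than the report's detailed forecasts:

```python
# Illustrative CAGR arithmetic based on the round figures quoted above.
# CAGR = (end / start) ** (1 / years) - 1

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# Datacenter GPU + AI ASIC revenue: ~$50B (2023) -> ~$200B (2029)
print(f"Processor revenue CAGR: {cagr(50, 200, 6):.1%}")   # roughly 26%

# HBM for AI accelerators: x8 over 2023-2029
print(f"HBM revenue CAGR: {cagr(1, 8, 6):.1%}")            # roughly 41%

# IC substrate revenue: x10 over 2023-2029
print(f"IC substrate CAGR: {cagr(1, 10, 6):.1%}")          # roughly 47%
```

In other words, the memory and substrate segments are forecast to outgrow the processors they serve, which is why the report devotes dedicated chapters to HBM and IC substrates.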
Nvidia, SK hynix, and TSMC were leading in 2023 – but what about the coming years?
Nvidia is the frontrunner in the generative AI market with its flagship GPUs, witnessing substantial growth in its datacenter business line, while AMD's MI300 is gaining momentum. Hyperscalers like Google, Amazon, and Chinese BATX are developing AI ASICs as custom chips for internal use and cloud services, aiming to reduce reliance on datacenter GPUs from fabless companies, lower costs, and utilize processors tailored to their needs. These AI ASICs present the primary competition for Nvidia. Intel's Gaudi and several startups with diverse approaches are also entering the market.
The foundry market is dominated by TSMC, with Samsung, Intel Foundry Services, and SMIC striving to capture a share of this vast opportunity. These foundries cater to various clients and offer different technologies to meet the growing demand for AI accelerators. Samsung, SK hynix, and Micron are expanding their wafer capacity for HBM production to seize the opportunities in the AI market. SK hynix is currently at the forefront of the HBM market, but competition is heating up with Samsung.
In the realm of advanced packaging, Intel, Samsung, and TSMC are prominent leaders, providing distinctive 2.5D and 3D technologies for high-performance applications. These companies are driving innovation in the high-end packaging market. Despite a challenging year for IC substrate makers, the AI hype is expected to positively impact the industry in the long term, driven by recent investments, capacity expansions, and glass core substrate developments.
Technology innovation at every level of the supply chain to support the growing need for computing.
The launch of ChatGPT in November 2022 sparked significant interest in AI accelerators, which are specialized chips designed for highly parallelizable calculations. As AI models require extensive vector and matrix calculations, GPUs and AI ASICs have become increasingly important. AI accelerators have since diverged into two categories: those specialized for training and those for inference.
AI models in data centers are becoming more complex, with growing parameter counts and sample data, driving the evolution of chip architecture. Training chips require higher computing power, memory, and bandwidth, while inference chips prioritize high throughput, HBM I/O bandwidth, and sufficient memory.
Memory technology such as HBM is essential for rapid data transfer in AI accelerators. HBM3 was introduced in 2022, with HBM3E and HBM4 expected to follow in 2024 and 2026, respectively. On the substrate side, AI accelerators are trending toward larger and more diverse form factors with minimal layer counts, enabling the use of chiplets for custom AI accelerators. Glass core substrates are expected to become the preferred choice thanks to their flexibility, cost-effectiveness, and mechanical stability. Finally, advanced packaging technologies such as 2.5D and 3D platforms are crucial for meeting the performance and efficiency requirements of AI accelerators in datacenter applications: they ensure low latency, high speed, and low power consumption, and will evolve toward denser integration in future generations.
Achronix, Advanced Micro Devices (AMD), Alibaba Group, Amkor, Apple, Arm, ASE Group, Atos, Amazon Web Services (AWS), Baidu, Biren Technology, Broadcom, Cambricon Technologies, Cerebras, Cisco, Corerain Technologies, Cray, Dell, Denglin, Enflame, Global Foxcom, GlobalFoundries (GF), Google, Graphcore, Groq, HiSilicon, Hewlett Packard Enterprise (HPE), Huawei, Ibiden, IBM, Intel, Inspur, JCET, Kalray, Kyocera, Lenovo, Lightelligence, Lightmatter, Luminous computing, Lynxi, Marvell Technology, Meta, Microchip, Micron, Microsoft Azure, Nan Ya Plastics Corporation, Nvidia, Powertech Technology Inc, Quanta Cloud Technology (QCT), Quanta Computer, SambaNova, Samsung, SAP, Semiconductor Manufacturing International Corporation (SMIC), Shinko, SiPearl, SK hynix, SPIL, Socionext, Sony, Sugon, SuperMicro, Tachyum, T-Head, Taiwan Semiconductor Manufacturing Company (TSMC), Tencent, Tenstorrent, Tesla, Unimicron, United Microelectronics Corporation (UMC), Untether AI, YMTC, and more.
Report's objectives:
Comprehensive Analysis of Generative AI's Impact on the Semiconductor Industry:
- Examine the AI Processor market, including GPU and AI ASIC - revenue forecast, units forecast, ASP evolution, product breakdown
- Evaluate the HBM market in terms of bits evolution and revenue
- Assess the foundry market dynamics and trends
- Investigate the packaging market, focusing on interposer and HBM stack technologies
- Analyze the substrate market and its role in the supply chain
Global Understanding of the Ecosystem and Key Players:
- Identify major competitors at each level of the supply chain
- Examine the relationships and interdependencies within the ecosystem
- Determine the key beneficiaries of the Generative AI momentum
- Anticipate changes and shifts in the industry for the upcoming years
Key Technical Insights and Future Technology Trends:
- Discuss processor key technology choices and dynamics shaping the industry
- Review the main players' roadmaps and technology development plans
- Compare and contrast GPU and AI ASIC performance, highlighting strengths and weaknesses
- Identify potential challenges and opportunities in the development and adoption of Generative AI technologies in the semiconductor industry
Key Features:
- 2019-2029 processor shipment and revenue
- 2019-2025 High-end GPU breakdown by product
- 2019-2029 wafers for processor die and for 2.5D/3D packaging
- 2019-2029 2.5D/3D units and associated revenue for 2023, 2026, and 2029
- 2019-2029 memory bit-DRAM demand
- 2019-2029 HBM bit shipment and revenue
- 2023 HBM market share
- 2019-2029 IC substrate units and revenue for datacenter AI accelerators
- 2019-2029 IC Substrate Panel Units for AI datacenter processors
- 2019-2029 FCBGA revenue for AI datacenter processors
- Processor, memory, foundry, packaging, and substrate ecosystem, and supply chain analysis
- Processor, memory, foundry, packaging, and substrate technology trends for datacenter GPU and AI ASICs