Smartphone memory: Gen AI upgrades to drive spike in DRAM demand

AI requirements to hasten small NAND phase-out.

Artificial intelligence (AI) has been incorporated into smartphone features for several years, but the implementation of adapted large language models (LLMs) in high-end devices could pull forward demand for DRAM and accelerate the phase-out of the smallest NAND storage capacities.

Smartphones are quickly moving beyond the use of a specific AI for targeted applications — like face recognition, photo editing and audio filtering — to embedded LLM-based multimodal AI that can assist the user in a range of tasks, including picture creation, co-piloting, live translation, predictive user interfaces and more.

At Yole Group, analysts John Lorenz and Thibault Grossi estimate that the rise of AI-enabled smartphones could trigger additional demand for DRAM and NAND above the current forecast. New generative AI functionality requires more DRAM content in each phone, as older devices cannot provide the processing power required. In addition, as the average lifespan of a smartphone is around three years, the availability of these new features may inspire some consumers to upgrade their devices sooner.

The combined impact of this content increase and the rise in unit sales could lift smartphone DRAM bit demand 16% above Yole Group’s base forecast model in 2024, and by up to 48% above in 2026. For the DRAM market as a whole, this translates into demand 3% above the model in 2024, rising to as much as 8% in 2026. The current forecast has smartphone DRAM demand growing at around 20% annually.

The average high-end smartphone contained 9GB of DRAM in 2023, and content is expected to move closer to 10GB in 2024 as manufacturers begin to incorporate generative AI-based functionality.

This data comes from Yole Group’s collection of memory analyses, including NAND Market Monitor, DRAM Market Monitor, Next-Generation DRAM 2024 – Focus on HBM and 3D DRAM, Status of the Memory Industry, Status of the Processor Industry, and more.

Shrinking AI models for mobile devices

LLMs require increasingly vast amounts of memory as the number of parameters grows: the memory needed to store model parameters in FP32 has grown from approximately 400MB for early models to over 4TB for the latest iterations. Memory footprint reduction techniques, such as quantization of parameters, are necessary to shrink these large models trained in data centers into something practical for small devices. Moving to INT8 for inference reduces precision but makes LLMs small enough to run on mobile phones without coming up against hardware limitations.
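The arithmetic behind this is straightforward: weight footprint scales with parameter count times bytes per parameter. The following Python sketch illustrates the effect of quantization on a hypothetical 7-billion-parameter model; the figures are illustrative arithmetic, not vendor specifications, and exclude activations and runtime overhead:

```python
# Rough memory footprint of LLM weights at different precisions.
# Illustrative arithmetic only -- not vendor specifications.

BYTES_PER_PARAM = {"FP32": 4.0, "FP16": 2.0, "INT8": 1.0, "INT4": 0.5}

def weights_gb(num_params: float, precision: str) -> float:
    """Approximate size in GB of a model's weights alone
    (excluding activations, KV cache and runtime overhead)."""
    return num_params * BYTES_PER_PARAM[precision] / 1e9

# A hypothetical 7-billion-parameter model:
for precision in ("FP32", "INT8", "INT4"):
    print(f"{precision}: {weights_gb(7e9, precision):.1f} GB")
```

At FP32, such a model would need 28GB for its weights alone, far beyond any phone's DRAM, while INT8 brings it down to 7GB and INT4 to 3.5GB, which is why quantization is the enabling step for on-device LLMs.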

While current basic AI features use around 100MB of memory on mobile devices, LLM-based features could require up to 7GB of additional RAM.

Thibault Grossi, Senior Technology & Market Analyst, Memory at Yole Group
Suppliers will need to provide manufacturers with larger memory sizes. Qualcomm’s Snapdragon 8 Gen 3 processor is designed for generative AI and can support up to 24GB of DRAM — with native support for up to 10 billion parameters on-device.
Smartphone manufacturers are finding ways to implement LLMs that limit how much additional memory they require. High-end Android models and iPhones are likely to be the first to add DRAM for AI functionality in 2024 and 2025, followed by low-end phones starting in 2026.

For instance, the AI features in Samsung’s upcoming S24 smartphone are supported by Google’s Gemini Nano model running on INT4, which is estimated to use around 2GB.

The share of high-end smartphones with 16GB — which is sufficient for a 7B parameter LLM — is expected to grow to 11% in 2024, up from 8% in 2023. Smaller LLMs may be able to run on a 12GB device.
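Whether a given model fits on a device can be estimated with the same parameter arithmetic, once some DRAM is set aside for the operating system and foreground apps. The sketch below is a back-of-the-envelope check; the 8GB reservation is an assumption chosen for illustration, not a measured figure:

```python
def fits_on_device(num_params: float, bytes_per_param: float,
                   device_dram_gb: float, reserved_gb: float = 8.0) -> bool:
    """Rough check: do the model weights fit in the DRAM left over
    after the OS and foreground apps? reserved_gb is an assumed budget."""
    model_gb = num_params * bytes_per_param / 1e9
    return model_gb <= device_dram_gb - reserved_gb

# 7B parameters at INT8 (1 byte each) on a 16GB phone: fits.
print(fits_on_device(7e9, 1.0, 16))   # True
# The same model on a 12GB phone: does not fit.
print(fits_on_device(7e9, 1.0, 12))   # False
# A smaller 3B model at INT4 (0.5 bytes each) on 12GB: fits.
print(fits_on_device(3e9, 0.5, 12))   # True
```

This is consistent with the figures above: a 7B-parameter model at INT8 needs a 16GB device, while only smaller or more aggressively quantized models squeeze into 12GB.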

While Apple tends to manage with less DRAM in iPhones — which is likely to remain the case for its models with generative AI — 8GB will likely not be sufficient for a sophisticated LLM. Yole Group’s base assumption is for memory to increase to 12GB starting in 2026.

However, at this early stage in the AI adoption cycle, the extent to which LLMs will increase memory demand is yet to be determined. If manufacturers can keep down the size of their on-board models, then 12-16GB will be sufficient and only a small percentage of phones will be affected. Yole Group is also analyzing other functions, including RF-based technologies.

Gen AI to have a smaller impact on NAND

John Lorenz, Senior Technology & Market Analyst, Memory at Yole Group
Unlike DRAM, the impact of generative AI on NAND memory is expected to be relatively limited. AI uses NAND memory to store the model parameters and the metadata associated with all the files on a smartphone. Yole Group does not anticipate a major impact overall but expects an effect on smaller storage configurations.

Indeed, Yole Group’s analysts have modelled two scenarios: one in which generative AI accelerates the extinction of lower storage configurations and a second in which AI accelerates the migration to higher NAND content per phone. Scenario 1 would lift the average content per phone in 2024 and 2025 by about 1% above the current forecast model, while Scenario 2 would increase the average content by around 10%. A hybrid scenario would see average content increase by about 5% per unit.

There are questions as to the amount of additional storage required for the content generated by AI-based apps. Gen AI will likely accelerate the extinction of the lowest-capacity configurations, which would be 128GB in high-end devices.

Latency is key to the performance of AI applications, with larger models requiring faster interfaces for storage. Gen AI functionality is likely to accelerate the adoption of the most advanced storage interfaces.

While the launch of gen AI features will likely prompt consumers to upgrade their phones earlier, Yole Group does not expect this to permanently change the average lifespan of smartphones, so the acceleration of phone purchases could result in lower smartphone shipments later, relative to the base forecast.

With the first smartphone models featuring gen AI functionality yet to hit the market, it is yet to be determined how consumer demand will unfold — and the ultimate impact on memory demand. Yole Group’s scenarios consider several possibilities and analyze the impact of numerous technologies, including RF. Keep following Yole Group’s comprehensive insights as we track the emerging trends in the sector and their potential implications.

Stay tuned!

About the authors

Thibault Grossi is Senior Technology & Market Analyst, Memory at Yole Group.

Thibault is engaged in in-depth analysis of the dynamic memory market and is in charge of Yole Group’s NAND market research.

Prior to Yole Group, he worked at Schneider Electric as a resident engineer at an EMS in Asia and then held different positions within the electronics procurement organization (PCBA, software and semiconductors). The semiconductor business has been his focus for the last 10 years, first in projects procurement and then as Category Manager for Semiconductors and Display, where he was in charge of procurement strategy for memory and analog. He led global negotiations and interfaced with European design projects.

Thibault obtained a master’s degree in electronics and computer science from Pierre and Marie Curie University in Paris in 2006.

John Lorenz is Senior Technology & Market Analyst, Memory at Yole Group.  

John is engaged in in-depth analysis of the dynamic memory market and is in charge of Yole Group’s DRAM research.

Prior to joining Yole Group, John spent 15 years with Micron Technology, where he played pivotal roles in process engineering and in market intelligence and strategy.

John has a Bachelor of Science degree in Mechanical Engineering from the University of Illinois Urbana-Champaign (USA), with a focus on MEMS devices.
