ADAS Sensors and Computing: A 22 Billion Dollar Market in 2025

An article written for Photonics Views – Radar and camera are expected to lead the market, but more computing power and a new E/E architecture will be required.

Advanced driver assistance systems (ADAS), based on a combination of sensors and electronic control units (ECUs), have been developed to support the driver. These systems have helped to reduce road fatalities, alert drivers to potential problems, and avoid collisions.

ADAS relies mostly on radars and cameras, together with the corresponding ADAS computing to process the data these sensors generate. Recently, more powerful computing chips have enabled more advanced functionalities.
ADAS functionalities were initially developed for safety but are now also used to enable some automated driving features. Implementing such features requires more sensors, more computing power, and a more complex electrical/electronic (E/E) architecture.

Greater ADAS functionality with a new E/E architecture

The top priority for OEMs is the development of ADAS for safety and automated driving features. Advanced emergency braking (AEB) systems are a great step toward avoiding forward collisions but still need perfecting, as demonstrated by the American Automobile Association (AAA) in October 2019. OEMs will also develop automated driving features for traffic jams and highways, as consumers are looking for these to make driving easier. The development of such features will be a way for OEMs to differentiate themselves.

To do so, more sensors, more computing power, and a new E/E architecture will be required. The E/E architecture is expected to evolve from the current distributed architecture to a domain-centralized architecture, and then further to a vehicle-centralized architecture in the long term. However, this evolution has two major drawbacks: the communication bandwidth needed to transport all of the 'raw' data to the centralized node grows to the order of gigabits per second, and the computational expense of associating all of the raw data with possible targets rises significantly.
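A back-of-the-envelope calculation illustrates why the raw data rate reaches gigabits per second. The sensor counts, resolutions, and radar output figures below are illustrative assumptions for the sketch, not measurements from any specific vehicle:

```python
# Rough estimate of the raw data rate a centralized ADAS node would ingest.
# All sensor figures are illustrative assumptions, not vehicle measurements.

def camera_rate_bps(width, height, bits_per_pixel, fps):
    """Uncompressed video data rate in bits per second."""
    return width * height * bits_per_pixel * fps

# Assumed sensor set: three 1080p cameras at 30 fps with 12-bit raw output.
cameras_bps = 3 * camera_rate_bps(1920, 1080, 12, 30)

# Assumed raw (pre-processing) radar output: five radars at ~0.5 Gbit/s each.
radars_bps = 5 * int(0.5e9)

total_gbps = (cameras_bps + radars_bps) / 1e9
print(f"Estimated raw sensor traffic: {total_gbps:.1f} Gbit/s")
```

Even with this modest sensor set, the aggregate lands in the multi-gigabit range, which is why raw-data fusion at a central node strains both the in-vehicle network and the compute budget.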

Traditionally, cars have been built using a distributed architecture with one ECU for each function. To enable automated driving features, however, OEMs will have to develop smarter ECUs, or domain controllers, to process the data generated simultaneously by multiple sensors.
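The contrast between the two architectures can be sketched schematically. The function and sensor names below are hypothetical examples chosen for illustration:

```python
# Schematic contrast between a distributed E/E architecture (one ECU per
# function) and a domain-centralized one (a single controller fusing
# several sensors). Names are illustrative assumptions.

# Distributed: each function has a dedicated ECU and its own sensor(s).
distributed = {
    "adaptive_cruise_control": {"ecu": "ACC_ECU", "sensors": ["front_radar"]},
    "lane_keeping":            {"ecu": "LKA_ECU", "sensors": ["front_camera"]},
    "blind_spot_warning":      {"ecu": "BSW_ECU", "sensors": ["corner_radar"]},
}

# Domain-centralized: one ADAS domain controller consumes all sensor
# streams and hosts every function as software.
domain_centralized = {
    "adas_domain_controller": {
        "sensors": ["front_radar", "front_camera", "corner_radar"],
        "functions": ["adaptive_cruise_control", "lane_keeping",
                      "blind_spot_warning"],
    }
}

# Same three functions, but the ECU count drops from three to one.
ecu_count_distributed = len(distributed)
ecu_count_domain = len(domain_centralized)
```

Concentrating functions in one controller is also what makes cross-sensor fusion and fleet-wide software updates practical, since the logic lives in one updatable node rather than being scattered across single-purpose ECUs.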

Audi and Tesla have initiated this trend using a combination of radars, cameras, and, in Audi's case, a lidar. Audi and Aptiv developed a domain controller, the zFAS, to merge the generated data. Tesla goes one step further in the development of domain controllers with its Autopilot hardware. Autopilot is much more complex and has more functionality, with the ability to perform frequent over-the-air (OTA) software updates. Innovation brought by such features will be a key differentiator for OEMs looking to relaunch the market.

About the authors

As part of the Photonics, Sensing & Display division at Yole Développement (Yole), Pierrick Boulay works as Market and Technology Analyst in the fields of Solid-State Lighting and Lighting Systems, carrying out technical, economic, and marketing analyses. Pierrick has authored several reports and custom analyses on topics such as general lighting, automotive lighting, LiDAR, IR LEDs, UV LEDs, and VCSELs. Prior to Yole, Pierrick worked for several companies, where he developed his knowledge of general lighting and automotive lighting, mostly in R&D departments on LED lighting applications. Pierrick holds a Master's in Electronics (ESEO – Angers, France).

Additional author: Cédric Malaquin, technology and market analyst, RF devices and technology, Yole Développement.