Artificial intelligence for automotive, stakes and challenges – Interview of NVIDIA

Artificial intelligence (AI) is slowly but surely invading more and more markets, and thus everyone's daily life. While it is already widely deployed in consumer systems such as smartphones, smart homes and virtual personal assistants, it is taking longer to get into the automotive segment. The reasons are multiple, but the main one is safety. To reach a high level of safety, perception and planning software must be incredibly efficient, and that requires a huge amount of computing power.

In the report Artificial Intelligence Computing for Automotive 2019, Yole Développement (Yole) describes the autonomous car market as divided into two segments: cars as we know them, gradually integrating features that make them more and more autonomous; and robotic cars, already fully autonomous at low speed and in designated areas, agnostic in terms of both sensors and computing hardware, choosing the best of what exists.
In 2018, only robotic vehicles could claim to possess in-car AI. Driven by computers equivalent to those found in datacenters, and despite rather low volumes, this segment brought the associated computing market to $156M in 2018. Over the next 10 years, with the development of robot taxis and shuttles, it will remain the main revenue generator for AI in automotive, with $9B in total computing revenue expected in 2028. In 2019, the first cars qualified as “ADAS level 3” will hit the road, and AI will enhance ADAS level 2+ cars, replacing conventional computer-vision algorithms. Yole expects a $63M computing market for ADAS in 2019, reaching almost $3.7B in 2028. Finally, infotainment will include AI computing hardware to manage technologies such as gesture recognition and speech recognition. All AI-related computing is literally exploding, with total expected 2028 revenue increasing to almost $14B, a 50% CAGR over 2018-2028.
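For readers unfamiliar with the metric, the CAGR quoted above is the constant annual growth rate that carries a start-year value to an end-year value. A minimal sketch, using only the robotic-segment endpoints stated in the report ($156M in 2018, $9B in 2028):

```python
def cagr(start, end, years):
    """Compound annual growth rate: the constant yearly rate that
    grows `start` into `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# Robotic-vehicle computing: $156M in 2018 growing to $9B in 2028
print(f"Robotic segment CAGR: {cagr(156e6, 9e9, 10):.0%}")  # ~50%
```

The same formula applies to the overall $14B figure; the exact rate depends on the report's full 2018 base across all segments.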
NVIDIA uses the power of AI and deep learning to deliver a breakthrough end-to-end solution for autonomous driving—from data collection, model training, and testing in simulation to the deployment of smart, safe, self-driving cars. In this report, we see NVIDIA as the undisputed leader in the robotic vehicles segment and a new entrant in the ADAS segment. Danny Shapiro, Senior Director of the Automotive Division, explains a bit more about NVIDIA’s vision of the future of autonomous driving and its strategy to Yohann Tschudi, PhD., Technology & Market Analyst at Yole Développement (Yole).

Yohann Tschudi (YT): Please introduce yourself and your division to our readers.
Danny Shapiro (DS): My name is Danny Shapiro, I am the senior director of automotive at NVIDIA and we are focused on delivering technologies from the datacenter to the car to enable autonomous transportation. We believe that every type of vehicle will become autonomous: passenger cars, trucks, taxis, delivery vehicles, and even industrial, commercial and agricultural vehicles.

YT: NVIDIA is known to develop HPC (high performance computing) products, particularly for AI applications.
Why did you make the choice to target this high-end market?

DS: NVIDIA is all about solving the world’s most challenging problems to help our customers. If you look at the history of the company, we are always pushing the boundaries, doing things that have never been done before, and creating products and solutions that customers perhaps did not know at the time that they needed. When we created 3D graphics, there was not really a market for 3D graphics; when we started creating products in high-performance computing, there were no solutions there before. For AI, there had been many years of development before we entered the market, but NVIDIA enabled its massive growth because AI is so computationally intensive. NVIDIA’s products are able to support the computational load, and our customers can solve problems that were previously thought to be unsolvable in their lifetime. We are continually looking at market opportunities and challenging issues, figuring out how we can solve our customers’ problems.

YT: Why did you choose to enter the automotive market?
DS: We chose it because it is a very complicated and very important problem. There are far too many people injured or killed on the world’s roads. This is a great opportunity to make transportation safer, to give people back the time they spend travelling in cars, stuck in traffic or looking for parking, and also to give the freedom of mobility to those who cannot drive.

YT: At Yole Développement, we are seeing AI everywhere nowadays and it is rapidly entering the consumers’ daily life.
In your opinion, why and how will AI change the game in the automotive market?

DS: We are already seeing that happen. Look at AI assistants in the car: MBUX, the new Mercedes-Benz User Experience, is powered by NVIDIA AI and graphics technology. There is also NIO in China, which is now bringing AI into the car powered by NVIDIA. Many next-generation cars are bridging that consumer trend, but bringing AI into the car requires powerful computing that is also energy efficient and automotive grade. NVIDIA is the only company that started with datacenter and high-performance computing and brought it down into the car. Conversely, the traditional automotive supply chain has produced low-performance, low-cost, low-energy ECUs, but those are incapable of the types of computation required for AI.

YT: Yole Développement’s analysts distinguished two different automotive markets: ADAS vehicles, with a slow adoption of fusion sensor computing platforms, and robotic vehicles, using already complete computer systems equivalent to datacenter platforms.

In your opinion, what trend will be most followed by the OEMs in the ADAS field?
DS: What we are seeing now is a different mindset and a need for centralized computing instead of hundreds of individual ECUs and smart cameras with limited functionality. Moving to a centralized architecture means an AI supercomputer in the car like the DRIVE AGX Xavier (Figure 1), a system-on-a-chip (SoC) capable of 30 trillion operations per second that consumes only 30 watts. We have also introduced a level 2+ system, DRIVE AP2X (Figure 2), with autopilot capabilities. It is far superior to any ADAS solution on the market today. It of course handles the same emergency braking and lane keeping, but on top of that it can perform on-ramp to off-ramp maneuvers, lane merges, lane splits and lane changes, and incorporates driver monitoring. It is still not a level 3 system, so the driver still needs to be in control, but what we are doing is taking technology from robot-taxi development and making it available sooner. It will increase the safety of any vehicle even if it is not fully self-driving.

Figure 1: NVIDIA DRIVE AGX Xavier delivers 30 TOPS of performance while consuming 30 watts of power. It can fuse data from a variety of sensors, enabling Level 2+ and higher automated driving systems.

Figure 2: DRIVE AP2X is a complete Level 2+ automated driving solution encompassing DRIVE AutoPilot software, DRIVE AGX and DRIVE validation tools.
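The Xavier figures quoted above (30 TOPS at 30 watts) reduce to a single figure of merit, performance per watt, which is how power-constrained in-vehicle compute is commonly compared. A minimal sketch; the 100 W vehicle power budget below is an assumed, purely illustrative number:

```python
def tops_per_watt(tops, watts):
    """Performance-per-watt figure of merit for in-vehicle AI compute."""
    return tops / watts

# DRIVE AGX Xavier, as quoted above: 30 TOPS in a 30 W envelope
xavier = tops_per_watt(30, 30)
print(f"Xavier efficiency: {xavier:.1f} TOPS/W")  # 1.0 TOPS/W

# Hypothetical sizing: compute available within an assumed 100 W budget
budget_watts = 100
print(f"Compute within budget: {xavier * budget_watts:.0f} TOPS")  # 100 TOPS
```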

YT: Are you in a position to gain a share of this market, considering it could be a high-volume market with low prices?
DS: Absolutely. Look at the deals that have been announced. Volvo’s car computer will be NVIDIA DRIVE level 2+ and will be standard in every Volvo starting with the next generation. We made an announcement with Mercedes-Benz for a whole new centralized vehicle computing architecture that will extend through their product line. We are working with Chinese companies like Xpeng and SF Motors, and we think that a lot of these cars, the NEVs (New Energy Vehicles), will become much more mainstream. All these companies are looking at how to: A) create a safer vehicle, B) bring various levels of autonomy to market, and C) make a vehicle that can get better over time through software updates. The answer is centralized computing, a software-defined vehicle architecture and AI: that is the solution. Any company that is not thinking about these three principles is going to be passed by in the market. Would you buy a phone that cannot get an over-the-air update? Of course not. The automotive industry moves a little slower, but it will be that way soon. If your car does not have the latest and greatest, it will not be attractive to consumers. For example, Tesla is taking over not just the premium segments of traditional automakers but many segments. Automotive companies all over the world are being affected by this disruptive technology, Autopilot, with a supercomputer in the car, which happens to be an NVIDIA product, and with over-the-air updates. Once people experience that, it is impossible to go back.

YT: If we look at the other segment, NVIDIA is already very well positioned for robotic vehicles.
What do you think are the reasons for this success?

DS: I think there are two vectors. A) We have an open platform, which makes it easy for different partners to add to it and builds up the ecosystem for others to create their own applications. With a closed system tied to a particular product, automakers would be forced to rely on something they do not control, and it would become hard for them to differentiate. B) We uniquely have a unified architecture that goes end-to-end, from training, testing, validating and simulating in the datacenter to AI processing in the car. Because it is the same architecture, it is easier for the industry to develop on NVIDIA and improve their time to market.

YT: Today, some robot taxis and shuttles are in use, though they are still at the experimentation stage and not yet a common way to travel.
How quickly do you expect robotic vehicles to become wide-spread on our roads? What are the obstacles to this expansion? Regulation? Accidents?

DS: You are right, trials and pilot projects are already happening. I think we will see them in confined environments: shuttles at airports and college campuses. There might even be special lanes on open roads for AVs. I think we will also see more and more industrial applications: agricultural, construction and other industrial use cases. Many municipalities are very eager to have their communities adopt AVs. In China, for example, they are building regions for autonomous vehicles. There may be some areas where pedestrians are completely separated from vehicle traffic, with walkways that go over the roads. That would mitigate potential hazards and accelerate adoption. Things are moving forward in many different areas, but I still think widespread autonomous vehicles are several years away. That is why we focused on level 2+ systems. These systems will bring the advantages of robot taxis down to passenger cars sooner and help prevent accidents and fatalities. I do not think there are any insurmountable obstacles, even if one challenge today is that regulations differ between regions. That makes it challenging for automakers and developers, especially in the US, where we hope for a unified set of standards so you do not have to run different software just because you cross a state line.

YT: Considering reliability, some players argue about the number of miles run by their system, some others use simulations, where NVIDIA’s solutions also have a huge impact.
How do we balance these two measures of mileage?

DS: I do not have a number right now, but I think there will be many more simulated miles than actual road-test miles. The value of simulation, with the ability to control the weather, the traffic and the time of day, is huge. Over time, we will be able to analyze what is required in terms of simulation and road miles. But for simulation itself, the key point is to get a standardized set of tests agreed between industry, regulators and other entities. The innovation NVIDIA develops is open and benefits the entire industry.

YT: To end off, let’s talk a bit about the future.
Do you agree with us that computing technologies for robotic vehicles and ADAS vehicles will merge in the future?

DS: Yes indeed. We have a highly scalable approach. When you have a human behind the wheel, you do not necessarily need all the same sensors and the same level of computation as a robot taxi, but we can leverage the same software to keep human-driven vehicles safer on the road to full autonomy.

YT: What are your thoughts on the vehicle of 2030-2035?
DS: I will be able to get in the car and go to sleep while the car is driving me where I want to go!

YT: Do you think our kids will still need their driver’s licenses?
DS: My kids are teenagers, so they will. But their kids will not. Having the right to drive a car on our roads will remain an option for a long time; it is part of our culture. But with the proliferation of AVs, it will not be a requirement. People still love to ride horses, but they do not ride them to work every day. In the future, I think there will be places you go if you want to drive, some kind of race circuits. And I could see that soon there will be places where only autonomous vehicles are allowed. If we could flip a switch right now and say all cars are autonomous, that would be much easier. Our Safety Force Field and DRIVE AV software will not crash into another car, but they cannot prevent other cars from behaving badly and causing crashes. If every car had Safety Force Field, the number of crashes would go to zero. If governments and regulators said that only autonomous vehicles were allowed, it would instantly make our roads the safest form of transportation.

YT: Is there anything else that NVIDIA would like to add?
DS: I think that if you look at the world’s largest automaker, Toyota, and the announcement we made with them, it is pretty telling. Developing and deploying AVs is extremely complex. Toyota, TRI and TRI-AD’s absolute focus on safety, and their recognition of the importance of AI and simulation, will put them at the forefront of AV development. It is likely a wake-up call for the rest of the auto industry.


Danny Shapiro is Senior Director of Automotive at NVIDIA, focusing on artificial intelligence (AI) solutions for self-driving cars, trucks and shuttles. The NVIDIA automotive team is engaged with hundreds of car and truck makers, tier 1 suppliers, HD mapping companies, sensor companies and startups that are all using the company’s DRIVE hardware and software platform for autonomous vehicle development and deployment. Danny serves on the advisory boards of the Los Angeles Auto Show, the Connected Car Council and Udacity. He holds a Bachelor of Science in electrical engineering and computer science from Princeton University and an MBA from the Haas School of Business at UC Berkeley. Danny lives in Northern California, where his home solar system charges his electric, AI self-driving car.


As a Technology & Market Analyst, Yohann Tschudi, PhD, is a member of the Semiconductor & Software division at Yole Développement (Yole). Yohann works daily with Yole’s analysts to identify, understand and analyze the role of software within semiconductor products, from machine code to the highest level of algorithms. Market segments analyzed by Yohann include big-data analysis algorithms, deep/machine learning and genetic algorithms, all stemming from artificial intelligence (AI) technologies.
After his thesis at CERN (Geneva, Switzerland) in particle physics, Yohann developed dedicated software for fluid mechanics and thermodynamics applications. He then spent two years at the University of Miami (FL, United States) as a research scientist in the radiation oncology department, where he was involved in cancer auto-detection and characterization projects using AI methods based on Magnetic Resonance Imaging (MRI) images. During his research career, Yohann has authored and co-authored more than 10 papers.
Yohann holds a PhD in High Energy Physics and a master’s degree in Physical Sciences from Claude Bernard University (Lyon, France).

Related report

Artificial Intelligence Computing for Automotive report, Yole Développement, 2019



Related webcast

Join our webcast on May 9th on Artificial Intelligence for Automotive.

Go deeper with Yole Développement’s analysis of artificial intelligence and its impact on the automotive industry: trends, market, players and future.

More information will be available very soon. Please contact Fanny Vitrey.