AVs see the future with perception sensors

An article by Anne-Françoise Pelé for EE Times – part of EE Times's Turning Point for AVs special report.

Autonomous driving has yet to move from hype to reality, and it may be some time before autonomous vehicles (AVs) let their passengers watch a movie or admire the scenery while being driven safely to their destination. In this blurry picture, camera, radar and LiDAR units are the "eyes" of the vehicle, mapping the road to full autonomy.

To gain an objective view of the present situation and prospects, EE Times consulted Pierrick Boulay, senior technology and market analyst in the Photonics and Sensing Division at Yole Intelligence, part of Yole Group. Pierre Cambou, principal analyst in the Photonics and Sensing Division at Yole Intelligence, also contributed to the analysis.

"It is clear that the automotive industry has underestimated how difficult it would be to develop autonomous driving features," Boulay said. "Ten years ago, the industry expected that autonomous driving would be more common by now. It was one of Tesla's promises, and if we look at where we are today, Tesla has still not achieved fully autonomous driving."

So far, automated driving features have been implemented only by European and Japanese OEMs, and they remain limited to highways at driving speeds of up to 60 km/h, Boulay said. "It is almost a useless feature and quite far from what people expected 10 years ago."

During Tesla’s Autonomy Day in April 2019, CEO Elon Musk made a bold prediction: “By the middle of next year, we’ll have over a million Tesla cars on the road with full self-driving hardware, feature complete, at a reliability level that we would consider that no one needs to pay attention [to the road].”

A wave of euphoria swept through the automotive industry. Tesla's stock price rocketed, and investors poured jaw-dropping sums into startups as optimists claimed AVs were just around the corner…