
Want better autonomous navigation? Start with LiDAR

By Eric Aguilar, CEO, Omnitron Sensors – Where is LiDAR? Why hasn’t it become as ubiquitous as cameras? The tech sector invested an astonishing $2.6 billion in LiDAR in 2021 and billions of dollars in prior years, according to the research firm Yole Intelligence. But investment plunged to $184 million as of July 2022. By 2032, just 10% of all cars are expected to include LiDAR. Why such a small percentage?

It may seem incongruous, but LiDAR—which stands for light detection and ranging—is still growing at a healthy pace. For the automotive market alone, Yole predicts it will grow from $128 million in 2022 to $2 billion in 2027, representing a 73% CAGR. The overall market should reach $6.3 billion that year.
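As a quick sanity check on that growth rate, taking Yole’s 2022 and 2027 automotive figures at face value and compounding over the five intervening years:

\[
\left(\frac{\$2{,}000\ \text{M}}{\$128\ \text{M}}\right)^{1/5} - 1 \approx 0.73,
\]

which lines up with the roughly 73% CAGR cited.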

What’s amazing is that LiDAR’s growing despite all of its warts. If it were properly realized, LiDAR would be everywhere by now, providing the real-time 3D vision that’s essential to wide-ranging applications, from automotive advanced driver assistance systems (ADAS), robotic cars, and drones to industrial robotics and augmented reality/virtual reality (AR/VR) applications.

Except for the most expensive applications, including satellites, aircraft/spacecraft and topographical exploration, LiDAR has come up short.

LiDAR’s implementation lags far behind more mature vision technologies, such as cameras and radar. Yet LiDAR is worth our attention.

Cameras lack depth perception, which is essential for robots to understand their environment. They also can’t see at night, which slashes their efficacy. Radar also has its limits. First and foremost, it lacks the resolution to differentiate a car from a person. LiDAR, on the other hand, provides depth and functions seamlessly at all levels of light. It also delivers phenomenal resolution, so it can perceive both moving and stationary objects—another critical advantage.

In a vastly different market space, we’re beginning to see rudimentary LiDAR used in some smartphones. A more fully functional LiDAR might let you use your smartphone to look around a building or virtually climb a tree to see a bird’s nest at the top.

Focus on automotive

But before we embrace LiDAR for price- and power-sensitive consumer devices, we need to satisfy the requirements of automotive. While it could be years until we have a new car that drives itself while we read email or play Wordle, the automotive industry is already seeing major demand for ADAS—which encompasses collision avoidance, pedestrian crash avoidance, lane-departure warning, self-parking, and other features that promote safety and convenience while driving. ADAS has made significant gains in the last five years, but with improvements to LiDAR, it could be so much better—as could other types of autonomous navigation systems.

Like most transformative tech advancements, the road to LiDAR has been paved with potholes. Some of the most promising autonomous navigation companies have gone away, while others have reinvented themselves or have emerged anew.

But industry-wise, the signs point to a rosy future.

Ford’s new Latitude AI subsidiary will focus on automated driving technology, including hands-free, eyes-off driver assist for Ford vehicles. Mercedes-Benz—which already sports Level 3 autonomous features in some of its top-line cars—has announced plans to integrate Luminar’s LiDAR subsystem in even more advanced vehicles by 2025. Mercedes’ investment of multiple billions in Luminar demonstrates its seriousness.

We’re also seeing some mergers as independent LiDAR companies, such as Ouster and Velodyne LiDAR, pool hefty patent portfolios and scale resources to capture even more of the market. Surely, others will follow.

Despite this flurry of industry activity, there’s a well-kept secret about LiDAR that’s preventing us from realizing its full potential: the LiDAR sensors used in cars and drones are fragile, expensive and unreliable, and may last only a few months.

I first discovered this when I made the jump from core sensor development to integrator. Initially, I worked with LiDAR at Wing, a Google X program for autonomous delivery drones. At Tesla, I led the firmware integration team that took Model 3 from prototype to production. As I moved on to Argo AI, Ford and Volkswagen’s former robotaxi business, I continued to grapple with LiDAR-related issues.

Through it all, I learned first-hand that perfecting LiDAR is really hard. Unlike radar and cameras, the already mature sensing technologies used in autonomous navigation systems, LiDAR is still evolving. It’s also highly complex.

First, LiDAR requires sub-micron tolerances in an active alignment process, one that’s akin to a watchmaker making minuscule adjustments to get the laser alignment right. This is extremely expensive at the micro-scale. Second, LiDAR demands reliability, yet it functions in high-vibration environments with wide-ranging temperature cycles. This fatigues the parts, resulting in tiny wobbles that corrupt accuracy. Since we’re talking about safety-critical navigation systems, that’s a problem we have to fix.
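To see why tiny wobbles matter, consider a rough, purely illustrative estimate (the numbers below are mine, not from any particular sensor): at long range, the lateral error is roughly the range multiplied by the angular error. A wobble of just 0.1 milliradian at 200 meters works out to

\[
\Delta x \approx R\,\theta = 200\ \text{m} \times 1\times10^{-4}\ \text{rad} = 2\ \text{cm},
\]

which is already at the edge of the centimeter-level accuracy long-range LiDAR is expected to deliver.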

So what’s a manufacturer to do if its LiDAR sensors break every six months?

It may sound unbelievable, but when I was on the integrator side of the business, we used to replace them. With LiDAR sensors now costing $500 to $5,000 apiece, that’s not financially sustainable. It’s also a pain in the neck for the customer, who faces excessive trips to the dealer for service.
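A rough illustration of the economics, assuming (my assumption, not an industry figure) that a vehicle stays in service for about ten years: replacing a sensor every six months means roughly 20 units over the vehicle’s life, or

\[
20 \times \$500 = \$10{,}000 \quad\text{to}\quad 20 \times \$5{,}000 = \$100{,}000
\]

in sensor hardware alone, before counting labor or those trips to the dealer.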

Still, LiDAR is a key enabling technology for robotics that we can’t just abandon. That’s why it’s time to fix the optical subsystem in the LiDAR sensor. To get started, we need to understand its requirements.

The LiDAR wish list

The most critical element in the long-range LiDAR needed for the automotive sector is the mirror. Here, autonomous navigation requires a mirror large enough to handle a big beam that can hit a 200+ meter range with centimeter-level accuracy. And unlike the rotating mirrors commonly used today, such as spinning polygons, the mirror in long-range LiDAR can’t be allowed to wobble. Instead, it should stop and scan: because a step-scanning mirror doesn’t wobble, it isn’t prone to the inaccuracies that wobble introduces.

The mirror must also handle high vibration—such as hitting bumps in the road—as well as wide temperature variation. The mirror must be affordable to manufacturers, making new LiDAR systems cost-competitive for the first time. Plus, the mirror must perform both horizontal and vertical scanning, supporting customers that want single- or dual-axis scanning.

During LiDAR’s evolution, some optical subsystems have hit a few of these marks, but none has hit them all.

Currently used in satellite systems priced from $1 million to more than $10 million, voice coils are high-accuracy and robust, but they cost about $5,000 each. While not as expensive as voice coils, scala mirrors are still costly; they’re also bulky and prone to wobbling. The spinning polygon, as we know, is expensive and prone to wobbling. The galvo has a huge motor behind it to do the step-scanning, but it’s very expensive at about $2,500 apiece. It’s also power-hungry, which is a problem for drones.

Enter MEMS step-scanning mirrors.

The deal with MEMS

Historically speaking, we’ve only seen MEMS optical subsystems that are resonating mirrors, so they’re as prone to vibration as other rotational mirror components. In the past, MEMS mirrors have also been too small for the large beams required for long-range LiDAR. What’s more, MEMS mirrors have also been gated by the same manufacturing issues that have limited the growth of MEMS for decades: complex process technology that makes MEMS sensors difficult to mass produce at commodity prices.

This doesn’t mean we’re throwing out MEMS. Far from it. Automotive manufacturers need small, sleek, affordable step-scanning mirrors that fit in a car’s roofline—and that are robust enough to handle high vibration, as well as temperature variation.

A new generation of MEMS mirrors, about the size of a dime at 15 mm in diameter, is large enough for the big beams of long-range LiDAR and can move through the tens of degrees needed for a wide field of view. This MEMS mirror also performs step-scanning, so it meets the performance demands of the automotive environment. Plus, the new device has a very powerful silicon motor behind it, so it’s fast enough for long-range LiDAR.
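A back-of-the-envelope diffraction argument shows why that 15-mm aperture matters at long range. Assuming a 905-nm laser, a wavelength commonly used in automotive LiDAR but not specified here, the divergence of a well-collimated beam scales roughly as the wavelength divided by the mirror diameter:

\[
\theta \approx \frac{\lambda}{D} = \frac{905\ \text{nm}}{15\ \text{mm}} \approx 6\times10^{-5}\ \text{rad},
\]

which adds only about 1.2 cm of beam spread at 200 m. A much smaller mirror would spread the beam proportionally more, eroding the fine resolution that sets LiDAR apart.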

Achieving a step-scanning MEMS mirror that meets all the requirements of LiDAR has been no mean feat. We know because we’re doing it. To make it work, we’ve developed a new topology for MEMS, which features the rearrangement of silicon process steps and a new packaging method. The result of our foundational work is a MEMS step-scanning mirror that’s robust and reliable, and low-cost at mass-production volumes. In short, it’s a MEMS mirror that will solve the most serious problems with LiDAR, driving autonomous navigation systems to a whole new level.
