Sensor Fusion for Autonomous Vehicles – Webinar
Advanced Driver Assistance Systems (ADAS) combine sensors and electronic control units (ECUs). These systems have been shown to reduce road fatalities by alerting the driver to potential hazards and helping to avoid collisions. ADAS relies primarily on radar and cameras, along with the computing resources needed to process the data these sensors generate.
The recent availability of more powerful computing chips and sensors has enabled the development of more advanced functions, expanding beyond safety assistance toward increasingly automated driving capabilities. At the sensor level, for example, some OEMs are adding LiDAR alongside radar and cameras. Implementing these autonomous features requires more sensors, more computing power and a more complex electrical/electronic (E/E) architecture.
Traditional vehicles (cars, trucks, etc.) are no longer the only platforms becoming autonomous; a growing number of autonomous industrial devices are also in development. These include guided vehicles in warehouses and ports, forklifts, trucks, cranes, ships, last-mile delivery robots and delivery drones.
In this presentation, we will describe the growing need for sensors, and for their coordinated "fusion", in autonomous devices for both automotive and industrial applications. The presentation will cover topics such as cameras, radar, LiDAR, E/E architectures and domain controllers. A question-and-answer session will follow.
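To give a flavor of what "fusion" means at its simplest, the sketch below combines two noisy range estimates of the same object into one more confident estimate using inverse-variance weighting. This is a minimal, hypothetical illustration (the sensor names, values and variances are assumptions, not from the webinar); production systems use far richer methods such as Kalman filtering across many sensors.

```python
# Minimal sensor-fusion sketch: fuse two independent Gaussian estimates
# of the same quantity (e.g., distance to an object) by weighting each
# estimate by the inverse of its variance. The fused variance is always
# smaller than either input variance.

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused_est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_est, fused_var

# Hypothetical readings: radar says 25.0 m (variance 0.5),
# camera says 24.4 m (variance 2.0).
dist, var = fuse(25.0, 0.5, 24.4, 2.0)
```

Note how the fused estimate lands closer to the radar reading, because the radar's lower variance gives it more weight; the fused variance (0.4) is lower than either sensor's alone, which is the basic payoff of combining sensors.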