
What does sensor fusion mean for the future of radar?

Gideon Kedem argues that cameras, radars and LiDAR are all vital to autonomous driving

There has been much debate about sensor technology within the automotive industry and the role that cameras, radars and LiDARs will play in the future of autonomous driving. The discussion has centred around the cost and viability of each sensor type, prompting some automakers to move away from radar and LiDAR solutions in favour of camera-based autonomous technology.

Automotive World Magazine – December 2022

The reality is that each sensor type has its own strengths and weaknesses, and all three technologies are vital to the advancement of autonomous technology. Cameras excel in reading street signs and classifying objects but are limited in poor weather conditions. Radars provide accurate measurements of speed and distance and excel in poor weather but cannot read signs and struggle with traffic lights. LiDARs are highly accurate at measuring objects but are expensive.

Although radar has been used in the automotive industry since the 1980s, the technology has been the subject of significant debate for years. That's because legacy vehicle architectures limit the amount of data that radars can transmit to the central processor.

Does this mean the end for radar within automotive design? Not necessarily.

Transitioning processing power from the radar units to a central ECU will lead to cheaper systems and will facilitate the process of sensor fusion

Radars generate massive amounts of data, but existing links simply do not offer sufficient bandwidth to support the transmission of raw data from radars to ECUs. This has forced radar manufacturers to build local processing capabilities into the radars themselves. From a system perspective, this is sub-optimal, as no single ECU can receive all of the raw data and base a robust decision on it.
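
As a rough illustration of the scale involved, the short Python sketch below estimates the raw data rate of a hypothetical imaging radar. Every figure in it is an assumption chosen for illustration, not a specification of any particular sensor, but it shows why legacy in-vehicle links fall short.

# Back-of-envelope estimate of raw radar output (illustrative; all parameters
# below are assumed figures, not taken from any specific radar).
virtual_channels  = 64     # assumed virtual antenna channels
samples_per_chirp = 512    # assumed ADC samples per chirp
chirps_per_frame  = 128    # assumed chirps per frame
bits_per_sample   = 32     # assumed 16-bit I + 16-bit Q per sample
frames_per_second = 20     # assumed frame rate

bits_per_frame = virtual_channels * samples_per_chirp * chirps_per_frame * bits_per_sample
raw_rate_gbps  = bits_per_frame * frames_per_second / 1e9

print(f"Raw radar data rate: {raw_rate_gbps:.1f} Gbps")
# ~2.7 Gbps with these assumptions, well beyond CAN-FD or 100 Mbps automotive Ethernet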

For radar manufacturers to make inroads in autonomous systems, they should transition from this architecture—where the radar and ECU are coupled at the edge—to a satellite architecture. This would move the data processing to a more centralised location in the vehicle, enabling an ECU to access raw data from multiple sensors. The more raw data processed by an ECU, the more accurate the decision can be.

This follows the trend of sensor fusion taking place in the automotive industry, which enables a vehicle's central computing unit to account for various radars, cameras and LiDARs, each of which has its own strengths and weaknesses. Sensor fusion has the potential to significantly lower complexity and costs in this era of software-defined vehicles, while helping cars make safer decisions. Centralised processors in the car can extract much more information from raw data using AI and machine learning techniques, which are harder to implement when processing the data from each radar sensor locally. With a significant number of software companies developing such algorithms, centralised processing can deliver even higher accuracy and performance.
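
To make the idea concrete, the Python sketch below shows, in highly simplified form, how a central ECU might fuse object detections from the three sensor types, with each contributing the attribute it measures best: radar supplies velocity, the camera supplies the object class and LiDAR refines position. The data structures and the nearest-neighbour matching rule are illustrative assumptions, not a production fusion algorithm.

# Minimal sketch of object-level sensor fusion in a central ECU (illustrative
# only; the matching rule and data model are assumptions, not a real design).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    sensor: str                        # "camera", "radar" or "lidar"
    x: float                           # lateral position, metres
    y: float                           # longitudinal position, metres
    velocity: Optional[float] = None   # radial velocity, m/s (radar)
    label: Optional[str] = None        # object class (camera)

@dataclass
class FusedObject:
    x: float
    y: float
    velocity: Optional[float]
    label: Optional[str]

def fuse(detections: list[Detection], gate: float = 2.0) -> list[FusedObject]:
    """Greedy nearest-neighbour association: detections within `gate` metres
    of an existing object are merged into it."""
    objects: list[FusedObject] = []
    for det in detections:
        match = next((o for o in objects
                      if (o.x - det.x) ** 2 + (o.y - det.y) ** 2 < gate ** 2), None)
        if match is None:
            objects.append(FusedObject(det.x, det.y, det.velocity, det.label))
        else:
            if det.sensor == "lidar":         # LiDAR refines position
                match.x, match.y = det.x, det.y
            if det.velocity is not None:      # radar supplies velocity
                match.velocity = det.velocity
            if det.label is not None:         # camera supplies the class
                match.label = det.label
    return objects

# Example: three sensors observing the same vehicle ahead of the ego car.
print(fuse([
    Detection("camera", 0.4, 30.2, label="car"),
    Detection("radar", 0.5, 30.0, velocity=-3.1),
    Detection("lidar", 0.45, 30.1),
]))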

Sensor fusion allows central ECUs to compensate for the drawbacks of each sensor type

The satellite radar architecture has other advantages. By removing heavy local processing from the sensor unit, radars will become much more power efficient, as well as less expensive. Power efficiency is a top priority for OEMs: with the electrification of the powertrain, the power consumed by the various devices in the vehicle has a direct impact on driving range. Reducing costs is also important because radars are usually located in exposed positions on the vehicle that are susceptible to impact and damage.

The satellite architecture requires the tunnelling of raw data at high bandwidth and low latency between the host processor and the remote sensors. This is the type of high-speed connectivity that the MIPI A-PHY standard aims to provide. For the first time, the industry has a streamlined, standardised solution for sending high-bandwidth raw data from radars to a central ECU.
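
As a rough sanity check, the sketch below compares the radar frame size estimated earlier against an assumed multi-gigabit serial link. The 16 Gbps link rate and 20 fps frame rate are illustrative assumptions for this calculation, not figures quoted from the A-PHY specification.

# Rough link-budget check (illustrative; link rate and frame size are the
# assumed figures used in the earlier estimate, not quoted specifications).
link_rate_gbps = 16.0            # assumed serial-link rate, Gbps
frame_bits     = 134_217_728     # one raw radar frame from the earlier sketch
frames_per_s   = 20              # assumed frame rate

transfer_ms = frame_bits / (link_rate_gbps * 1e9) * 1e3
utilisation = frame_bits * frames_per_s / (link_rate_gbps * 1e9)

print(f"Per-frame transfer time: {transfer_ms:.1f} ms")              # ~8.4 ms
print(f"Link utilisation at {frames_per_s} fps: {utilisation:.0%}")  # ~17%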

All sensor types will play a vital role in enabling the next stages of autonomous driving, reducing uncertainty about the navigation environment. Centralised processing has the potential to vastly improve the current implementation of the various sensor solutions, and this is where radar will continue to play an important role moving forward.


The opinions expressed here are those of the author and do not necessarily reflect the positions of Automotive World Ltd.

Gideon Kedem is Senior Vice President and Head of Automotive at Valens Semiconductor

The Automotive World Comment column is open to automotive industry decision makers and influencers. If you would like to contribute a Comment article, please contact editorial@automotiveworld.com

