Manufacturers are outfitting modern
cars with a wide array of advanced control and sensing functions. Collision warning
and avoidance systems, blind-spot monitors, lane-keeping assistance, lane-departure
warning and adaptive cruise control are established features that assist drivers and
automate certain driving tasks, making driving a safer and easier experience.
Lidar, radar, ultrasonic sensors and
cameras each offer their own distinct benefits and drawbacks. Highly or fully
autonomous vehicles typically combine several of these technologies to build an accurate
long- and short-range map of the vehicle’s surroundings under a range of weather and
lighting conditions. Beyond complementing one another, the sensors must also overlap
sufficiently to provide redundancy and improve safety. This practice of combining
multiple sensor technologies to generate an accurate and reliable map of the
environment around a vehicle is known as sensor fusion.
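As a rough illustration of how fusion can work, the short Python sketch below combines range estimates from several sensors using inverse-variance weighting; the sensor readings and noise figures are hypothetical, and production systems use far more sophisticated tracking and filtering.

```python
# Minimal sketch: inverse-variance fusion of range estimates from three sensors.
# The sensor readings and noise figures below are hypothetical illustrations.

def fuse_ranges(measurements):
    """Combine (range_m, std_dev_m) pairs into a single estimate.

    Each sensor's reading is weighted by 1/variance, so more precise
    sensors dominate the fused result.
    """
    weights = [1.0 / (std ** 2) for _, std in measurements]
    total = sum(weights)
    fused = sum(w * rng for w, (rng, _) in zip(weights, measurements)) / total
    fused_std = (1.0 / total) ** 0.5
    return fused, fused_std

if __name__ == "__main__":
    # Hypothetical range-to-object readings: lidar is precise, radar a bit
    # noisier, and the camera's monocular depth estimate noisier still.
    readings = [
        (42.3, 0.1),   # lidar: 42.3 m +/- 0.1 m
        (42.6, 0.5),   # radar: 42.6 m +/- 0.5 m
        (41.0, 2.0),   # camera depth estimate: 41.0 m +/- 2.0 m
    ]
    rng, std = fuse_ranges(readings)
    print(f"fused range: {rng:.2f} m +/- {std:.2f} m")
```

With this weighting, the most precise sensor dominates the estimate, but the noisier modalities still contribute and provide the overlap and redundancy described above.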
Ultrasonic waves suffer from strong
attenuation in air beyond a few meters; therefore, ultrasonic sensors are primarily
used for short-range object detection.
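A back-of-the-envelope calculation makes the limitation concrete. The sketch below assumes an atmospheric absorption on the order of 1.3 dB/m for a 40-kHz ultrasonic signal (an illustrative figure; the real value varies with temperature and humidity) and tallies the round-trip absorption loss at several distances.

```python
# Back-of-the-envelope: round-trip absorption loss for an ultrasonic pulse.
# The 1.3 dB/m coefficient for 40 kHz in air is an assumed, illustrative value;
# real figures vary with temperature and humidity, and spreading loss adds more.

ALPHA_DB_PER_M = 1.3  # assumed atmospheric absorption at 40 kHz

for distance_m in (1, 3, 5, 10, 30):
    round_trip_loss_db = 2 * distance_m * ALPHA_DB_PER_M  # out and back
    print(f"{distance_m:>3} m target -> {round_trip_loss_db:5.1f} dB absorption loss")
```

Even at 10 m the pulse loses tens of decibels to absorption alone, which is why ultrasonic sensors are confined to parking-distance ranges.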
Cameras are cost-efficient, widely
available sensors; however, they require significant processing to extract
useful information and depend strongly on ambient light conditions. Cameras are
also unique in that they are the only one of these technologies that can “see color,”
and vehicles with lane-keeping assistance rely on cameras to detect lane markings.
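To give a sense of the processing a camera feed requires, the sketch below outlines one classical lane-marking pipeline (grayscale conversion, edge detection and a Hough transform) using OpenCV; the image path and all thresholds are placeholders, and production lane-keeping systems are considerably more elaborate.

```python
# Minimal sketch of a classical lane-marking pipeline with OpenCV.
# The image path and thresholds are illustrative placeholders; production
# lane-keeping stacks add calibration, perspective transforms, tracking, etc.
import cv2
import numpy as np

frame = cv2.imread("road_frame.jpg")            # hypothetical dashcam frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # drop color for edge detection
blurred = cv2.GaussianBlur(gray, (5, 5), 0)     # suppress sensor noise
edges = cv2.Canny(blurred, 50, 150)             # find strong intensity edges

# Keep only the lower half of the image, where lane markings appear.
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
roi = cv2.bitwise_and(edges, mask)

# Fit line segments to the remaining edges with a probabilistic Hough transform.
lines = cv2.HoughLinesP(roi, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)

for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)  # overlay detections

cv2.imwrite("road_frame_lanes.jpg", frame)
```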
Lidar and imaging radar share a broad
array of common and complementary capabilities: both can map a vehicle’s surroundings
and measure object velocity. Let’s compare the two technologies in several
categories:
- Range. Both lidar and imaging radar can detect objects at distances ranging from a few meters to more than 200 m, although lidar has difficulty detecting objects at very close range. Radar can detect objects from less than a meter to more than 200 m, with its reach depending on the type of system: short-, medium- or long-range radar.
- Spatial resolution. This is where lidar truly shines. Because lidar collimates laser light and operates at short infrared (IR) wavelengths of 905 to 1,550 nm, it can achieve an angular resolution of approximately 0.1 degrees. This resolution enables high-resolution 3D characterization of objects in a scene without significant back-end processing. Radar, on the other hand, with its much longer 4-mm wavelength at 77 GHz, struggles to resolve small features at long distances (a worked comparison appears after this list).
- Field of view (FOV). Solid-state lidar and radar both offer excellent horizontal FOV (azimuth), while mechanical lidar systems, with their 360-degree rotation, possess the widest FOV of all advanced driver assistance system (ADAS) technologies. Historically, lidar has offered better vertical FOV (elevation) than radar. Lidar also provides angular resolution in both azimuth and elevation, a primary requirement for improved object classification.
- Weather conditions. One of
the biggest benefits of radar systems is their reliability in rain, fog and
snow. The performance of lidar generally degrades under such weather conditions.
Using IR wavelengths of 1,550 nm helps lidar achieve improved performance under
adverse weather conditions.
- Ambient light. Lidar and
cameras are both susceptible to ambient light conditions. At night, however,
lidar and imaging radar systems offer very high performance because they provide
their own illumination. Radar and modulated lidar techniques are resistant to
interference from other sensors.
- Cost and size. Radar systems have become mainstream in recent years; increased integration has made them highly compact and affordable, enabling their widespread use in modern vehicles. As lidar has gained popularity, its cost has dropped precipitously, with prices falling from approximately US$50,000 to below US$10,000. The mechanical scanning lidar systems of a few years ago, commonly seen mounted atop self-driving robotaxis, were bulky, but advances in technology have steadily shrunk lidar, and the industry shift to solid-state designs will further reduce system size and cost.
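As a worked comparison for the spatial-resolution point above, the sketch below derives radar’s wavelength from its 77-GHz carrier and converts angular resolution into cross-range resolution at distance; the 0.1-degree lidar figure comes from the text, while the roughly 1-degree radar beamwidth used for contrast is an assumed, order-of-magnitude value.

```python
# Worked numbers for the spatial-resolution comparison.
# The 0.1-degree lidar figure and 77 GHz radar carrier come from the text;
# the 1-degree radar beamwidth used for contrast is an assumed, typical-order value.
import math

C = 299_792_458.0  # speed of light, m/s

radar_wavelength_m = C / 77e9
print(f"77 GHz radar wavelength: {radar_wavelength_m * 1e3:.1f} mm")

for label, beamwidth_deg in (("lidar, 0.1 deg", 0.1), ("radar, ~1 deg (assumed)", 1.0)):
    for range_m in (50, 100, 200):
        cross_range_m = range_m * math.radians(beamwidth_deg)
        print(f"{label}: at {range_m:>3} m -> {cross_range_m:.2f} m cross-range resolution")
```

At 200 m, a 0.1-degree beam spans roughly 0.35 m, while a 1-degree beam spans about 3.5 m, which illustrates why lidar can resolve small features that radar blurs together at long range.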