
The role of ADAS sensors in automotive design

Advanced Driver Assistance Systems (ADAS) are intended to prevent deaths and injuries by reducing accidents. Exemplary ADAS applications include pedestrian detection/avoidance, lane departure warning/correction, traffic sign recognition, automatic emergency braking, and blind-spot detection. This FAQ starts with an overview of the “levels of driving automation” and their relation to ADAS. It then reviews the role of key sensor technologies and how they are applied in each application area, and closes with a look into the future of ADAS and the connected car.

ADAS supports improved driver performance and increased vehicle and road safety. It combines human-machine interfaces (HMIs) and sensors to detect driver errors and nearby obstacles and to respond as needed. Human error is the cause of most vehicle accidents. ADAS enhances, automates, and adapts vehicle technology to improve safety through functions that can alert the driver to safety issues, implement safeguards, and, in some instances, take limited control of the vehicle. Automated lighting, adaptive cruise control, satellite navigation, traffic warnings, parking assistance, lane centering, and lane departure alerts are common ADAS functions.

The use of sensor data to analyze the vehicle’s operating environment and compare it to safety parameters is the key to ADAS. The enabling technologies include automotive imaging, LIDAR, radar, image processing, computer vision, multi-axis inertial motion sensing, and in-car networking. The Society of Automotive Engineers (SAE) categorizes driving automation in a series of levels defined by the amount of automation involved:

  • In level 0, ADAS only provides information for the driver to interpret on their own. Parking sensors, surround-view, traffic sign recognition, lane departure warning, night vision, blind spot information system, rear-cross traffic alert, and forward-collision warning are examples of level 0 ADAS functions.
  • Levels 1 and 2 both leave most of the decision-making to the driver. The difference is scope: a level 1 system can take control of a single function, while a level 2 system can take control of multiple functions simultaneously.
  • ADAS functions considered level 1 include adaptive cruise control, emergency brake assist, automatic emergency braking, lane keeping, and lane centering.
  • ADAS functions considered level 2 include highway assist, autonomous obstacle avoidance, and autonomous parking.
  • Level 3 vehicles have “conditional driving automation” and can make informed decisions for themselves based on sensor information about the local environment, such as accelerating past a slow-moving vehicle. They still require human oversight, however: the driver must remain alert and ready to take control if the system cannot perform the task.
  • From level 3 to level 5, the amount of control the vehicle exercises increases. Level 4 vehicles can intervene on their own if things go wrong or there is a system failure, and in this sense do not require human interaction in most circumstances. Level 5 is a fully autonomous vehicle that does not need a driver at all.
The SAE International J3016 “Levels of Driving Automation” standard includes ADAS capabilities up to level 3 and automated driving capabilities in levels 4 and 5. (Table: SAE International)
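
To make the taxonomy concrete, here is a minimal Python sketch, purely illustrative and not part of J3016 itself, that maps each level to the division of responsibility described above:

    from enum import IntEnum

    class SAELevel(IntEnum):
        """SAE J3016 driving automation levels (illustrative summary)."""
        NO_AUTOMATION = 0           # ADAS only informs; the driver does everything
        DRIVER_ASSISTANCE = 1       # system controls a single function (e.g., ACC)
        PARTIAL_AUTOMATION = 2      # system controls multiple functions
        CONDITIONAL_AUTOMATION = 3  # system drives; driver must stay ready to take over
        HIGH_AUTOMATION = 4         # no human needed in most circumstances
        FULL_AUTOMATION = 5         # no driver required

    def driver_must_stay_engaged(level: SAELevel) -> bool:
        # Levels 0-2 require constant supervision; at level 3 the driver
        # must still be ready to take control when the system requests it.
        return level <= SAELevel.CONDITIONAL_AUTOMATION

    print(driver_must_stay_engaged(SAELevel.PARTIAL_AUTOMATION))  # True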

Sensors drive ADAS

ADAS requires a continuous stream of information about the surrounding environment. Sensors provide the information needed for ADAS to work. The sensors used for ADAS can be classified in several ways, for example:

  • Cameras provide image data
  • LIDAR (Light Detection and Ranging), millimeter-wave radar, and ultrasonic sensors can provide distance data
  • GPS and odometers provide position data
  • Inertial measurement units (IMUs) can provide speed, acceleration, and attitude data
  • Night vision systems can be active or passive. Active night vision systems project infrared light, and passive systems rely on the thermal energy that comes from cars, animals, and other objects.

ADAS sensors are needed to detect everything the driver can see, as well as things the driver hasn’t noticed or can’t see at all. A variety of sensor technologies is already in use, each with its own capabilities. Increasingly, several sensors with complementary capabilities are used together in what’s called “sensor fusion.” For example, LIDAR or radar can be used in combination with vision systems to implement adaptive cruise control (also called inter-vehicle distance control), front collision warnings, autonomous emergency braking (also called collision damage mitigation braking control), blind-spot monitoring, and parking assist.
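
As a simple illustration of sensor fusion, the following Python sketch combines a radar range/closing-speed track with a camera classification to decide whether to issue a forward collision warning. The data structures, field names, and the 2.5-second time-to-collision threshold are illustrative assumptions, not a production algorithm:

    from dataclasses import dataclass

    @dataclass
    class RadarTrack:
        range_m: float             # measured distance to the object
        closing_speed_mps: float   # positive when the gap is shrinking

    @dataclass
    class CameraDetection:
        label: str                 # e.g., "vehicle" or "pedestrian"
        confidence: float          # classifier confidence, 0..1

    def forward_collision_warning(radar: RadarTrack,
                                  camera: CameraDetection,
                                  ttc_threshold_s: float = 2.5) -> bool:
        # Fuse the two modalities: radar supplies accurate range and
        # closing speed, while the camera confirms what the object is.
        if camera.label not in ("vehicle", "pedestrian") or camera.confidence < 0.5:
            return False
        if radar.closing_speed_mps <= 0:
            return False  # not closing on the object, no threat
        time_to_collision_s = radar.range_m / radar.closing_speed_mps
        return time_to_collision_s < ttc_threshold_s

    print(forward_collision_warning(RadarTrack(30.0, 15.0),
                                    CameraDetection("vehicle", 0.9)))  # True (TTC = 2 s)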

Implementation of ADAS requires a variety of sensor technologies. (Image: Synopsys)

ADAS vision systems can be classified by the type of image sensor used and by the imaging system architecture. Image sensors are either charge-coupled device (CCD) sensors or CMOS sensors. CCDs offer relatively high sensitivity and low noise compared with other image pickup devices. Until recently, vehicles often used lower-cost CMOS image sensors, for example in rear-looking camera systems, while more advanced systems that benefit from higher image resolutions have used CCD sensors.

The imaging system architecture can be monocular or stereoscopic. In a single-lens monocular system, distance is estimated from the vertical pixel position of the recognized object in the image, but the error tends to be large. These systems are low-cost and relatively insensitive to installation position but are limited in their ability to recognize objects. Uses of single-lens systems are typically limited to warning functions, crosswalk recognition, white-line recognition, and lane keeping, and they are often used in combination with other sensors.
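
One common way a single-lens system estimates distance is flat-road geometry: the farther below the horizon line an object’s road contact point appears, the closer the object. The sketch below is illustrative (the focal length, camera height, and pixel values are assumptions) and also shows why the error tends to be large, since a few pixels of horizon error shift the estimate substantially:

    def monocular_distance(f_px: float, cam_height_m: float,
                           v_object_px: float, v_horizon_px: float) -> float:
        """Flat-road distance estimate: d = f * h / (v_object - v_horizon),
        where v is the vertical pixel coordinate of the object's contact
        point with the road. Small horizon-estimate errors produce large
        range errors, which is why monocular ranging tends to be imprecise."""
        dv = v_object_px - v_horizon_px
        if dv <= 0:
            raise ValueError("object must appear below the horizon line")
        return f_px * cam_height_m / dv

    # Example: 1200 px focal length, camera 1.3 m above the road,
    # object 40 px below the horizon -> roughly 39 m away.
    print(monocular_distance(1200, 1.3, 540, 500))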

A stereo camera uses parallax to measure distance with reasonable accuracy. Three-dimensional objects can be detected, enabling the distance and lateral position of pedestrians, bicycles, and vehicles to be measured with high accuracy. Calibration, however, is more difficult than for single-lens systems.
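
The underlying relationship is the standard stereo triangulation formula, depth Z = f * B / d for focal length f (in pixels), camera baseline B (in meters), and disparity d (in pixels). A minimal sketch, with assumed parameter values:

    def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
        """Depth from stereo parallax: Z = f * B / d.
        Larger disparity means a closer object; accuracy degrades with
        range because disparity shrinks in proportion to 1/Z."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return f_px * baseline_m / disparity_px

    # Example: 1200 px focal length, 0.30 m baseline, 9 px disparity -> 40 m.
    print(stereo_depth(1200, 0.30, 9))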

Distance is also measured using LIDAR, millimeter-wave radar, and ultrasonic sensors, all of which estimate range by timing a reflected signal (see the sketch after the list below). Ultrasonic sensors are used for short-range applications and slower-speed use cases. The choice between LIDAR and radar is more complicated. LIDAR is good at detecting small objects thanks to its very short wavelength; it has great precision and can build an exact 3D monochromatic image of an object, but its usefulness is limited in rain, fog, snow, and other bad weather. Radar has another set of tradeoffs:

  • Long operating distance.
  • Usable in more varied conditions and environments, since radar is not as sensitive to, for example, dirt, and has no mechanical moving parts.
  • Due to reflection and/or disturbance, radar can sometimes report a false size for an object; for example, a soda can on the road may be identified as a much larger object.
  • Radar does not have the fidelity of some other sensors, so it is not as accurate as some other options.
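
Conceptually, radar and LIDAR both compute range from the round-trip time of a reflected signal, d = c * t / 2 (ultrasonic sensors do the same using the speed of sound instead of light). A minimal sketch:

    SPEED_OF_LIGHT_MPS = 299_792_458.0

    def time_of_flight_range(round_trip_s: float) -> float:
        """Range from round-trip echo time: d = c * t / 2
        (divided by two because the pulse travels out and back)."""
        return SPEED_OF_LIGHT_MPS * round_trip_s / 2.0

    # A return after ~1.67 microseconds corresponds to ~250 m,
    # the forward sensing range cited later in this article.
    print(time_of_flight_range(1.67e-6))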

ADAS sensor performance comparison. (Image: Analog Devices)

Six-axis MEMS-based inertial measurement units (IMUs) combine a gyroscope and accelerometer in a small plastic package. They can provide high-resolution, stable, and reliable 16-bit acceleration and angular velocity measurements along three orthogonal axes. This enables parking-assistance systems to determine the vehicle’s motion efficiently, lets vision systems employ stabilization techniques to improve image quality, and enhances the absolute position accuracy of positioning systems. Designs that integrate six separate proof masses into the IMU can offer small cross-axis sensitivity and allow linear acceleration and angular velocity to be reported accurately in the vehicle’s reference frame, regardless of the mounting orientation, improving design flexibility and ADAS performance.
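
The mounting-orientation independence mentioned above amounts to rotating raw IMU readings from the sensor’s mounting frame into the vehicle’s reference frame. A minimal Python sketch, assuming the rotation matrix is known from installation calibration:

    import numpy as np

    def to_vehicle_frame(r_mount_to_vehicle: np.ndarray,
                         accel_sensor: np.ndarray,
                         gyro_sensor: np.ndarray):
        # Rotate raw IMU readings from the sensor's mounting frame into
        # the vehicle's reference frame so downstream ADAS functions see
        # consistent axes regardless of how the part was installed.
        return r_mount_to_vehicle @ accel_sensor, r_mount_to_vehicle @ gyro_sensor

    # Example: IMU mounted rotated 90 degrees about the vehicle's z (yaw) axis.
    theta = np.pi / 2
    r = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    accel, gyro = to_vehicle_frame(r, np.array([0.0, 1.2, 9.81]),
                                   np.array([0.0, 0.0, 0.05]))
    print(accel)  # ~[-1.2, 0.0, 9.81]: the lateral reading maps back onto vehicle axes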

ADAS sensors are typically integrated with one or more central controllers. These ADAS controllers are capable of controlling several vehicle systems through connections to the in-vehicle network.

ADAS controllers

The use of ADAS controllers supports adding new functions without affecting the overall system architecture; functions can be easily moved between vehicle system controllers, and the use of controllers can accelerate the development of ADAS applications. A common ADAS controller also makes it easier to expand ADAS functions across multiple vehicle models.

Flexibility is further enhanced by the use of a layered software structure. Separating application logic from actuator control logic supports ongoing optimization and the addition of new ADAS applications, and it gives designers more latitude when selecting ADAS sensors and control actuators.
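
As a rough sketch of that layered separation, the hypothetical classes below keep the decision of when to brake (application logic) independent of how a particular actuator is commanded (actuator control logic); all names and thresholds are illustrative:

    from abc import ABC, abstractmethod

    class BrakeActuator(ABC):
        """Actuator-control layer: hides the vendor-specific brake interface."""
        @abstractmethod
        def apply(self, deceleration_mps2: float) -> None: ...

    class VendorXBrake(BrakeActuator):
        def apply(self, deceleration_mps2: float) -> None:
            # In a real system this would issue bus commands; stubbed here.
            print(f"brake request: {deceleration_mps2:.1f} m/s^2")

    class EmergencyBrakingApp:
        """Application layer: decides WHEN to brake, never HOW.
        Swapping brake hardware only means supplying a new BrakeActuator."""
        def __init__(self, brake: BrakeActuator):
            self.brake = brake

        def update(self, time_to_collision_s: float) -> None:
            if time_to_collision_s < 1.5:  # illustrative threshold
                self.brake.apply(6.0)      # illustrative deceleration request

    EmergencyBrakingApp(VendorXBrake()).update(time_to_collision_s=1.0)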

ADAS systems are implemented through a combination of hardware and software. (Image: Hitachi Automotive)

One standard ADAS controller design collects environmental data from various sensors and can implement functions such as lane departure warning, front collision warning, pedestrian collision warning, and automated emergency braking. Standard features include computer vision and image processing algorithms and a 77 GHz forward-looking millimeter-wave radar, with inputs for additional sensors as needed for specific functions. Future ADAS controllers are expected to include artificial intelligence and machine learning capabilities.

The future of ADAS

Today, ADAS is contained within the vehicle, with the possible exception of GPS. The next step in ADAS evolution will be integrating ADAS with the connected car using vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-everything (V2X) communications. Communications with the external environment will also support an expanded definition of “sensor fusion.”

ADAS is constrained by what the sensors on the vehicle can detect, including a forward range of about 250 meters. The integration of V2V communication is expected to dramatically extend that effective sensing range by enabling vehicles to communicate directly with each other and share information on relative speeds, lane positions, directions of travel, and even control actions such as sudden braking, acceleration, or changes in direction. Sensor fusion will merge this external data stream with the vehicle’s own sensor data to create a more expansive and detailed picture of the environment, providing more accurate and earlier information with which to implement corrective actions and avoid collisions.
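
A hedged sketch of how such V2V-extended awareness might look in code: the message fields below are loosely inspired by a V2V basic safety message but are illustrative assumptions, as is the 250 m onboard sensing range:

    from dataclasses import dataclass

    @dataclass
    class V2VMessage:
        """Illustrative fields, loosely modeled on a basic safety message."""
        sender_id: str
        distance_ahead_m: float   # along our lane, from map matching (assumed)
        speed_mps: float
        hard_braking: bool

    ONBOARD_SENSOR_RANGE_M = 250.0  # approximate forward range cited above

    def early_warning(messages: list[V2VMessage]) -> list[str]:
        """Flag hard-braking vehicles that onboard sensors cannot see yet."""
        return [m.sender_id for m in messages
                if m.hard_braking and m.distance_ahead_m > ONBOARD_SENSOR_RANGE_M]

    msgs = [V2VMessage("car-42", 400.0, 3.0, True),
            V2VMessage("car-07", 120.0, 30.0, False)]
    print(early_warning(msgs))  # ['car-42']: braking event beyond sensor range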

Similar to V2V, V2I will provide vehicles with access to external information, in this case from infrastructure elements such as traffic lights and signals, variable speed limits, and congestion reporting. In addition to its use by ADAS, V2I information will be necessary for fully automated vehicles. V2X will add further data streams, including access to machine learning resources from beyond the immediate area, and will be important when ADAS transitions to full automation.

Summary

ADAS is designed to prevent deaths and injuries by reducing the number of accidents. It is seen as a stepping stone toward fully automated vehicles. As such, the sensor technologies being developed and optimized for ADAS will also play an important role in the roll-out of automated vehicles. And the performance of those increasingly capable sensors will be amplified by advanced sensor fusion architectures, artificial intelligence, and machine learning.