What Is ADAS?
3 things you need to know
Advanced driver-assistance systems (ADAS) are the hardware and software components that automate or assist with a driver’s responsibilities. Examples of ADAS in vehicles today include adaptive cruise control, blind spot detection, lane departure warning, automatic lane following, and automatic emergency braking.
ADAS can make roads safer by minimizing human error. Some ADAS features encourage safe driving habits by alerting drivers to unsafe road scenarios, such as when a car in the driver’s blind spot would make changing lanes dangerous. Other ADAS features automate driving behaviors, such as collision avoidance with automatic emergency braking.
According to a study by Boston Consulting Group, ADAS could prevent 28% of all crashes and about 9,900 fatalities per year in the U.S.
Levels of ADAS
The Society of Automotive Engineers defines six levels of driving automation, from Level 0 to Level 5. Most cars on the road today have ADAS features between Level 0 and Level 3. Companies at the forefront of automated driving are pursuing Levels 4 and 5.
Fully autonomous vehicles may become a reality once the outstanding safety, cybersecurity, and policy issues are worked out.
To understand how ADAS features are designed, let’s use adaptive cruise control as an example. When using this ADAS feature, the car slows down as it approaches a vehicle in front and accelerates to cruising speed if the vehicle in front moves a safe distance away.
The first step in designing adaptive cruise control (ACC) is to collect data from sensors mounted on the car. For adaptive cruise control, we need a camera and a radar sensor. The camera detects the other objects in the frame (vehicle, pedestrian, tree, etc.), and the radar calculates the distance from our car to the object.
After collecting data from our sensors, we turn our focus to ADAS algorithm development. Adaptive cruise control can be broken down into three steps:
- A perception algorithm to detect whether there is a vehicle in front of us
- A radar algorithm to calculate our distance from that vehicle
- A controls algorithm to adjust the speed of our car based on the distance measurement (a minimal sketch of this step appears below)
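To make the controls step concrete, here is a minimal MATLAB sketch of a proportional speed command driven by the radar distance measurement. The function name, gain, and time-gap logic are illustrative assumptions rather than a production controller; real ACC designs typically use PID or model predictive control with acceleration and jerk limits.

```matlab
function vCmd = accSpeedCommand(vSet, vEgo, gap, tGap)
% Toy adaptive cruise control logic (illustration only).
%   vSet - driver-set cruising speed (m/s)
%   vEgo - current ego vehicle speed (m/s)
%   gap  - measured distance to the lead vehicle (m), Inf if none detected
%   tGap - desired time gap to the lead vehicle (s)

safeGap = vEgo * tGap;                         % gap we want at the current speed
if gap > safeGap
    vCmd = vSet;                               % road ahead is clear: resume set speed
else
    kP   = 0.5;                                % proportional gain (illustrative value)
    vCmd = max(0, vEgo - kP*(safeGap - gap));  % slow down as the gap error grows
end
end
```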
We used ACC as an ADAS example, but the general methodology of choosing the right sensors and designing algorithms based on the sensor data applies to all ADAS features.
The Importance of Sensors
The three most popular sensor types used for ADAS features are camera, radar, and lidar.
Cameras
Cameras are used for detection-related ADAS tasks. Cameras on the side of a vehicle can detect vehicles in the driver’s blind spots. Cameras in the front can detect lanes, vehicles, signs, pedestrians, and cyclists. The associated detection algorithms are generally built with conventional computer vision techniques and deep learning. Cameras have several advantages:
- They provide excellent data for object detection
- They are relatively inexpensive – their low price makes it affordable for manufacturers to test many cameras
- They come in many varieties – manufacturers can test and select from camera types such as fisheye, monocular, and pinhole
- They are the most extensively researched – the camera is the oldest of the three sensor types and has been studied the most
The downside of camera data is that they are less suited for estimating the distance to an object than data from other sensor types. For this reason, ADAS developers often use cameras in conjunction with radar.
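As a rough illustration of camera-based detection, the sketch below runs a pretrained vehicle detector on a single frame. It assumes Automated Driving Toolbox and Computer Vision Toolbox are available, and the image file name is a placeholder.

```matlab
% Detect vehicles in one camera frame with a pretrained ACF detector
% (assumes Automated Driving Toolbox and Computer Vision Toolbox).
I = imread("highway.png");                % placeholder image file

detector = vehicleDetectorACF();          % pretrained vehicle detector
[bboxes, scores] = detect(detector, I);   % bounding boxes and confidences

% Overlay the detections on the frame for inspection.
annotated = insertObjectAnnotation(I, "rectangle", bboxes, scores);
imshow(annotated)
```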
Radar
Radar sensors emit high-frequency radio waves and record when these waves bounce back from objects in the environment. The elapsed time can be used to calculate the distance to an object. In ADAS, radar sensors are usually mounted on the front of the vehicle.
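The range calculation itself is simple: the emitted wave travels to the object and back at the speed of light, so the distance is half the round-trip time multiplied by c. A minimal sketch, with a made-up round-trip time for illustration:

```matlab
% Range from a radar round-trip (time-of-flight) measurement.
c          = 299792458;       % speed of light (m/s)
tRoundTrip = 4.2e-7;          % example echo round-trip time (s)

range = c * tRoundTrip / 2;   % divide by 2: the wave travels out and back
fprintf("Distance to object: %.1f m\n", range);
```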
Radar works in varying weather conditions, which makes it a practical sensor choice for ADAS features like automatic emergency braking and adaptive cruise control.
Although radar sensor data are well-suited for distance detection algorithms, these data are less useful in algorithms for classifying the detected objects. For this reason, ADAS developers often use radar in conjunction with cameras.
Lidar
Lidar (light detection and ranging) sensors emit a laser into the environment and record when the signal returns. The returned signals are reconstructed to create a 3D point cloud that shows the lidar’s surrounding environment. Lidar data can be used to calculate the sensor’s distance from the objects in the 3D point cloud.
There are two types of lidar sensors used for ADAS applications:
- Electromechanical (spinning) lidar – This type is mounted on top of a car and rotates while collecting data to produce a 3D point cloud map of the environment.
- Solid-state lidar – This is a newer type of lidar that has no moving parts. In the long term, solid-state lidar promises to be faster, cheaper, and more accurate than electromechanical lidar. However, designing a commercially viable sensor poses engineering problems related to the safety and range of the sensor.
You can use lidar data to perform both the distance detection and object classification functions in ADAS. However, processing lidar data requires more computational power than processing camera or radar data, and it poses some challenging problems for ADAS algorithm developers.
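To illustrate the distance calculation on a point cloud, the sketch below computes the range from the sensor to each point of a tiny synthetic cloud. The coordinates are made-up values; a real sweep contains tens or hundreds of thousands of points.

```matlab
% Distance from the lidar origin to each point in a 3D point cloud.
% Each row is an [x y z] point in meters (synthetic example values).
points = [ 12.0  -1.5  0.2;
           35.4   3.1  0.6;
            8.7   0.0  0.1];

ranges  = vecnorm(points, 2, 2);   % Euclidean norm of each row
nearest = min(ranges);             % closest return, e.g., a lead vehicle

fprintf("Nearest object is %.1f m away\n", nearest);
```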
Developing ADAS Algorithms with Simulation
Testing on hardware is expensive, so engineers first test their ADAS solutions using virtual simulation. Simulation environments can be 2D or 3D.
You can use 2D simulation to develop and test ADAS algorithms for camera and radar. We start by creating virtual scenes with roads, pedestrians, cyclists, and other vehicles. Then we place our vehicle in the scene and mount virtual cameras and radar sensors onto it. We can then program the movement of the car to generate synthetic sensor data for ADAS algorithm development and testing.
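As a toy example of generating synthetic sensor data, the sketch below simulates a lead vehicle pulling away from the ego vehicle and produces noisy range "measurements" that could feed the adaptive cruise control logic sketched earlier. The motion model and noise level are arbitrary assumptions, not a realistic radar model.

```matlab
% Synthetic range measurements of a lead vehicle (toy 1D scenario).
dt    = 0.1;                             % time step (s)
t     = 0:dt:10;                         % 10 s of simulation
vEgo  = 25;                              % ego vehicle speed (m/s), constant
vLead = 28;                              % lead vehicle speed (m/s), constant
gap0  = 30;                              % initial gap (m)

trueGap = gap0 + (vLead - vEgo)*t;       % gap grows as the lead pulls away
measGap = trueGap + 0.5*randn(size(t));  % add assumed radar range noise (m)

plot(t, trueGap, t, measGap, ".")
xlabel("Time (s)"), ylabel("Gap to lead vehicle (m)")
legend("True gap", "Measured gap")
```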
3D simulation builds on 2D simulation and allows us to test lidar in addition to cameras and radar. 3D environments require more computational power because of their relative complexity.
Once you have developed ADAS algorithms in simulation environments, the next development stage is hardware-in-the-loop (HIL) testing. This involves testing ADAS algorithms with real hardware from cars, such as a real braking system, by connecting them to a simulation environment. HIL testing provides a good sense of how an ADAS component of a car will operate in the real world.
There are other ADAS tests such as driver-in-the-loop, but they all lead to in-vehicle tests to understand how the vehicle will perform when all the parts come together. This is the most expensive type of ADAS testing but also the most accurate and is required before moving a vehicle to production.
MATLAB® and Simulink® support ADAS development for each stage of the workflow:
- Analyzing data
- Synthesizing driving scenarios
- Designing ADAS planning and control algorithms
- Designing perception algorithms
- Deploying algorithms
- Integrating and testing
Analyzing Data
MATLAB enables you to access, visualize, and label live and recorded driving data for ADAS development. MATLAB also supports geographic map data via HERE HD Live Maps, OpenStreetMap, and Zenrin Japan Maps. These data are often used for ADAS algorithm development and verification.
Synthesizing Driving Scenarios
MATLAB lets you develop and test ADAS algorithms in virtual scenarios using the cuboid simulation environment for controls, sensor fusion, and motion planning, as well as the Unreal Engine environment for perception. You can also design realistic 3D scenes with RoadRunner.
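For example, a few lines of Automated Driving Toolbox code are enough to stand up a cuboid scenario with a straight road and an ego vehicle; the road coordinates and speed below are arbitrary example values.

```matlab
% Minimal cuboid driving scenario (assumes Automated Driving Toolbox).
scenario = drivingScenario;

% Straight 80 m road segment defined by its center line.
roadCenters = [0 0 0; 80 0 0];
road(scenario, roadCenters);

% Ego vehicle that follows the road at 20 m/s.
egoVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(egoVehicle, roadCenters, 20);

% Step the simulation and update the plot at each time step.
plot(scenario)
while advance(scenario)
    pause(0.01)
end
```

From here, you could attach sensor models to the ego vehicle to generate synthetic detections for algorithm testing.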
Designing ADAS Planning and Control Algorithms
MATLAB contains many automated driving reference applications, which can serve as starting points for designing your own ADAS planning and controls algorithms.
Designing Perception Algorithms
MATLAB provides tools for developing perception algorithms from camera, radar, and lidar data. You can develop algorithms using computer vision, deep learning, radar and lidar processing, and sensor fusion.
Deploying ADAS Algorithms
Products such as MATLAB Coder™, Embedded Coder®, and GPU Coder™ allow you to automatically generate code to deploy your ADAS algorithms onto embedded devices and platforms such as ROS and AUTOSAR.
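As a sketch of what this might look like for the controller from earlier, the command below asks MATLAB Coder to generate C code for the hypothetical accSpeedCommand function with four scalar double inputs; the function name mirrors the illustrative example above and is not a shipped API.

```matlab
% Generate a standalone C library for the illustrative ACC function
% (assumes MATLAB Coder and that accSpeedCommand.m is on the path).
codegen accSpeedCommand -args {0, 0, 0, 0} -config:lib -report
```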
Integrating and Testing
You can integrate and test your perception, planning, and control systems with Simulink tools. Using Requirements Toolbox™, you can capture and manage your ADAS requirements. You can also use Simulink Test™ to run and automate test cases in parallel.
Related Products: Automated Driving Toolbox™, Computer Vision Toolbox™, Lidar Toolbox™, Radar Toolbox, RoadRunner, RoadRunner Asset Library, RoadRunner Scene Builder