Radars, cameras, lidars, and ultrasonic sensors help an autonomous vehicle make decisions, and their accuracy is paramount to safe autonomous driving.
Technology creates new solutions, but with increasing complexity. In the automotive industry, for instance, technology has come a long way from simple speed-control features in the early 1900s to fully autonomous vehicles (AVs) today. Vehicle manufacturers (OEMs) constantly aim to make driving as comfortable as possible for their customers.
Autonomous driving (AD), once science fiction, is made possible by multiple sensors such as radars, cameras, and lidars, which continuously feed information to complex software, hardware, and control systems that make critical driving decisions.
AVs can depend on more than 20 sensors mounted at various locations. Each sensor provides a different kind of data: cameras provide images, radars and lidars provide point clouds, and so on. The autonomous driving software uses data from each sensor individually, as well as from combinations of sensors (e.g., radar + camera), to make precise driving decisions.
Figure 1: The autonomous vehicle depends on multiple sensors to make a driving decision.
Since the AV depends on such a wide range of sensors, calibration of these sensors becomes very important. Miscalibrated sensors can have a catastrophic impact on the safety and functionality of the AV. Calibration of individual sensors is a long-drawn-out process, and different sensors (LiDAR, radar, cameras, etc.) require different calibration methodologies. This calibration is typically carried out at the end of the production line.
The general understanding in the industry is that intrinsic calibrations, i.e., geometric distortion, focal length, etc., do not change over time; they are typically done only once. However, extrinsic calibration, i.e., position and orientation, can change for various reasons, from poor road conditions to sensors being mounted on movable car parts such as the side-view mirror, and is therefore a challenge. For example, consider a rear-facing camera mounted on an auto-folding side-view mirror and used for lane-change functionality or to detect a vehicle in the adjacent lane. A yaw-angle miscalibration of 0.5 degrees can cause a lateral error of almost one meter, depending on the distance to the vehicle, and a lateral error of one meter could result in a vehicle in the adjacent lane going undetected.
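The size of that lateral error can be checked with a short calculation. Assuming a simple flat-ground model in which the lateral offset grows as distance × tan(yaw error), a minimal sketch in Python (the distances are illustrative, not from any specific vehicle):

```python
import math

def lateral_error(distance_m: float, yaw_error_deg: float) -> float:
    """Lateral offset (m) at a given range caused by a yaw-angle
    miscalibration, using the flat-ground relation d * tan(err)."""
    return distance_m * math.tan(math.radians(yaw_error_deg))

# A 0.5 degree yaw error grows with range and approaches a full meter
# at highway following distances (illustrative values):
for d in (25, 50, 100):
    print(f"{d:>4} m -> {lateral_error(d, 0.5):.2f} m lateral error")
```

At 100 m the offset is roughly 0.87 m, which matches the "almost one meter" figure quoted above.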
Challenges in Sensor Calibration
The typical challenges or concerns about sensor calibration include:
Figure 2: A typical rig to calibrate sensors
Sensor Calibration Solutions
The challenges of sensor calibration for AVs can be resolved as follows.
Requirements of a Multi-modality Sensor Calibration System
A multi-modality sensor calibration system requires the following features:
Additionally, it is important to calibrate not only individual sensors but also multiple sensors with respect to each other. Multi-modality sensor calibration refers to the calibration of different types of sensors relative to one another, such as a camera with respect to LiDAR, or RADAR with respect to LiDAR.
Let us look at an example to understand multi-modality sensor calibration.
In an AV, features such as Lane Departure Warning (LDW), Forward Collision Warning (FCW), and Traffic Sign Recognition (TSR) rely on the front-facing camera. A well-calibrated camera can position the vehicle in the center of the lane and maintain an appropriate distance from the vehicle ahead. But a minor error in the pitch angle of the camera mounting can make the vehicle ahead appear farther away than it actually is, delaying the system's response when it needs to avoid a collision.
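To make the pitch sensitivity concrete, a flat-ground camera model can be used: the distance to the lead vehicle's road-contact point is h / tan(angle below horizon) for a camera at height h. The camera height, range, and pitch error below are assumed purely for illustration:

```python
import math

def ground_distance(height_m: float, angle_below_horizon_rad: float) -> float:
    """Flat-ground pinhole model: range to a road point seen at a given
    angle below the horizon from a camera mounted at height_m."""
    return height_m / math.tan(angle_below_horizon_rad)

# Hypothetical numbers: camera 1.3 m above the road, lead vehicle
# truly 30 m ahead, pitch miscalibration of 0.3 degrees.
h = 1.3
true_dist = 30.0
pitch_err = math.radians(0.3)

angle_true = math.atan(h / true_dist)                  # angle the contact point really subtends
est_dist = ground_distance(h, angle_true - pitch_err)  # what a miscalibrated camera reports

print(f"estimated distance: {est_dist:.1f} m")  # several meters beyond the true 30 m
```

Even a 0.3-degree pitch error inflates the estimated range by roughly four meters in this toy setup, which is why a small mounting error can delay a collision warning.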
Figure 3: An autonomous vehicle uses the front camera to assess the road for various features like FCW, TSR and LDW.
In contrast, a LiDAR or RADAR sensor can reliably estimate the distance to vehicles even if there is a variation in its pitch angle. Either of these sensors can therefore be used to determine and compensate for a camera's pitch-angle mounting error.
To estimate the yaw-angle mounting error of a camera or LiDAR, an Inertial Measurement Unit (IMU) in the vehicle can be used: the vehicle's measured motion is compared with the tracked movement of stationary objects such as road barriers, traffic signs, or lane markings.
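One way to sketch this idea: if a tracked landmark is known to be stationary, the bearing the sensor measures should equal the landmark's world bearing minus the vehicle heading reported by the IMU, so any consistent residual is the sensor's yaw mounting offset. A toy estimator with synthetic data (all names and values are hypothetical, not a production algorithm):

```python
import math

def estimate_yaw_offset(ego_headings, world_bearings, measured_bearings):
    """Average the residual between each measured bearing and the bearing
    predicted from the IMU heading for a stationary landmark.
    All angles in radians; a toy estimator for illustration only."""
    residuals = [m - (w - h)
                 for h, w, m in zip(ego_headings, world_bearings, measured_bearings)]
    return sum(residuals) / len(residuals)

# Synthetic frames with a constant 0.5 degree mounting offset baked in:
offset = math.radians(0.5)
headings = [0.00, 0.05, 0.10, 0.15]   # IMU headings over four frames
world = [1.0] * 4                     # fixed landmark bearing in the world frame
measured = [w - h + offset for h, w in zip(headings, world)]

print(f"{math.degrees(estimate_yaw_offset(headings, world, measured)):.3f} deg")
```

In practice the residuals would be noisy and the estimate would be filtered over many frames and many landmarks, but the averaging principle is the same.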
Multiple such solutions can be created, in which a single source of truth is used to calibrate all sensors. That source of truth can be another sensor, such as the LiDAR or a camera. It is important to note, however, that any calibration error in this reference sensor will magnify the errors of all the other sensors, which makes an auto-calibrating system essential.
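The magnification effect is easy to see with yaw angles alone: when every sensor is calibrated against one reference, the reference's own error adds to each derived calibration. The error values below are assumed purely for illustration:

```python
# Toy yaw-only illustration of error propagation through a calibration
# chain; both error values are assumed for illustration.
ref_error_deg = 0.4         # error in the reference (LiDAR-to-vehicle) extrinsics
cam_to_ref_error_deg = 0.3  # error in the camera-to-LiDAR calibration

# Chaining camera -> LiDAR -> vehicle accumulates both errors,
# since planar rotations compose additively:
cam_to_vehicle_error_deg = ref_error_deg + cam_to_ref_error_deg
print(f"{cam_to_vehicle_error_deg:.1f} deg")  # worse than either link alone
```

Every sensor calibrated through the reference inherits at least the reference's error, which is why the reference sensor's own calibration must be monitored continuously.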
With increasing competition among OEMs to develop faster, more reliable, and more advanced autonomous driving features, the effective use of perfectly calibrated sensors is key to their success.
KPIT has worked on developing multiple models for sensor misalignment. These models could be extended to every sensor and help in auto-calibration. We have the knowledge, skills, and experience to develop state-of-the-art technologies and features; to apply them, we require access to millions of kilometers of vehicle data, a test-vehicle setup, and customer-specific requirements from our customers.
At KPIT, over the past decade, we have developed a strong Autonomous Driving Practice, and we are strategic software development and integration partners to global OEMs and Tier 1s for autonomous driving.
Our teams help clients across all domains with AD development and deploy key technologies such as AI, ML, DL, sensor fusion, computer vision, virtual simulation, and more.
About the Author
Rajat Jayanth Shetty is a solutions architect at KPIT Technologies.