SELF-CALIBRATING FUSION OF MULTI-SENSOR SYSTEM FOR AUTONOMOUS VEHICLE OBSTACLE DETECTION AND TRACKING

Date

2023-01-01

Abstract

Mobile robots have gained significant attention due to their ability to undertake complex tasks in applications ranging from autonomous vehicles to robotics and augmented reality. To navigate safely and efficiently, these robots rely on sensor data from RADAR, LiDAR, and cameras to understand their surroundings. However, integrating data from these sensors presents challenges, including data inconsistencies and sensor limitations. This thesis proposes a novel LiDAR and camera sensor fusion algorithm that addresses these challenges, enabling more accurate and reliable perception for mobile robots and autonomous vehicles. The proposed algorithm leverages the unique strengths of both sensors to create a holistic representation of the environment. It adopts a multi-sensor data fusion (MSDF) approach, combining the complementary characteristics of LiDAR's precise 3D point cloud data and the rich visual information provided by cameras. The fusion process involves sensor data registration, calibration, and synchronization, ensuring accurate spatial alignment and temporal coherence. The algorithm introduces a robust data association technique that matches LiDAR points with visual features extracted from camera images. By fusing these data, the algorithm enhances object detection and recognition, enabling the robot to perceive the environment with higher accuracy and efficiency. Additionally, the fusion technique compensates for sensor-specific limitations, such as LiDAR's susceptibility to adverse weather and the camera's vulnerability to lighting changes, resulting in a more reliable perception system. The thesis advances mobile robot perception by providing a comprehensive and practical LiDAR and camera sensor fusion algorithm. This approach has significant implications for autonomous vehicles, robotics, and augmented reality, where accurate and reliable perception is vital for successful navigation and task execution. By addressing the limitations of individual sensors and offering a more unified and coherent perception system, the proposed algorithm paves the way for safer, more efficient, and more intelligent mobile robot solutions in real-world settings.
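
To make the pipeline described in the abstract concrete, the sketch below illustrates the two core steps it names: projecting LiDAR points into the camera image using an extrinsic calibration (R, t) and intrinsic matrix K, and associating the projected points with 2D visual feature locations by a gated nearest-neighbour search in pixel space. This is a minimal Python/NumPy illustration under a pinhole camera assumption, not the thesis's implementation; the function names, calibration values, and distance gate are all hypothetical.

import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project 3D LiDAR points into the camera image plane.

    points_lidar : (N, 3) points in the LiDAR frame.
    R, t         : extrinsic rotation (3x3) and translation (3,)
                   mapping LiDAR coordinates into the camera frame.
    K            : (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates and the indices of the points
    that lie in front of the camera.
    """
    # Transform points into the camera frame.
    points_cam = points_lidar @ R.T + t
    # Keep only points with positive depth (in front of the camera).
    in_front = points_cam[:, 2] > 0.0
    points_cam = points_cam[in_front]
    # Pinhole perspective projection: divide by depth.
    pixels_h = points_cam @ K.T
    pixels = pixels_h[:, :2] / pixels_h[:, 2:3]
    return pixels, np.nonzero(in_front)[0]

def associate(pixels, features, max_dist=4.0):
    """Gated nearest-neighbour association between projected LiDAR
    points and 2D image feature locations (threshold in pixels)."""
    matches = []
    for i, px in enumerate(pixels):
        d = np.linalg.norm(features - px, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            matches.append((i, j))
    return matches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical calibration: identity rotation, small translation,
    # and a generic intrinsic matrix.
    R = np.eye(3)
    t = np.array([0.1, 0.0, 0.0])
    K = np.array([[700.0, 0.0, 640.0],
                  [0.0, 700.0, 360.0],
                  [0.0, 0.0, 1.0]])
    points = rng.uniform([-5.0, -2.0, 2.0], [5.0, 2.0, 30.0], size=(200, 3))
    pixels, idx = project_lidar_to_image(points, R, t, K)
    # Stand-in feature detections: the projections plus pixel noise.
    features = pixels + rng.normal(scale=1.0, size=pixels.shape)
    print(f"{len(associate(pixels, features))} point-feature matches")

In a real system the feature locations would come from a detector run on the camera image, and the calibration (R, t, K) would be estimated or refined online, which is where the self-calibration component of the thesis fits in.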

Keywords

Autonomous vehicles, LiDAR and Camera sensor fusion, Multi-sensor data fusion, Self-calibration algorithm, Sensor data misalignment, Tracking algorithm
