SELF-CALIBRATING FUSION OF MULTI-SENSOR SYSTEM FOR AUTONOMOUS VEHICLE OBSTACLE DETECTION AND TRACKING

dc.contributor.advisor: Cheok, Ka C.
dc.contributor.author: Tian, Kaiqiao
dc.contributor.other: Mirza, Khalid
dc.contributor.other: Radovnikovich, Micho Tomislav
dc.contributor.other: Louie, Wing-Yue Geoffrey
dc.contributor.other: Cesmelioglu, Aycil
dc.date.accessioned: 2024-10-02T13:31:38Z
dc.date.available: 2024-10-02T13:31:38Z
dc.date.issued: 2023-01-01
dc.description.abstract: Mobile robots have gained significant attention due to their ability to undertake complex tasks in applications ranging from autonomous vehicles to robotics and augmented reality. To achieve safe and efficient navigation, these robots rely on sensor data from radar, LiDAR, and cameras to understand their surroundings. However, integrating data from these sensors presents challenges, including data inconsistencies and sensor limitations. This thesis proposes a novel LiDAR and camera sensor fusion algorithm that addresses these challenges, enabling more accurate and reliable perception for mobile robots and autonomous vehicles. The proposed algorithm leverages the unique strengths of both LiDAR and camera sensors to create a holistic representation of the environment. It adopts a multi-sensor data fusion (MSDF) approach, combining the complementary characteristics of LiDAR's precise 3D point cloud data and the rich visual information provided by cameras. The fusion process involves sensor data registration, calibration, and synchronization, ensuring accurate alignment and temporal coherence. The algorithm introduces a robust data association technique that matches LiDAR points with visual features extracted from camera images. By fusing these data, the algorithm enhances object detection and recognition capabilities, enabling the robot to perceive the environment with higher accuracy and efficiency. Additionally, the fusion technique compensates for sensor-specific limitations, such as LiDAR's susceptibility to adverse weather conditions and the camera's vulnerability to lighting changes, resulting in a more reliable perception system. The thesis contributes to advancing mobile robot perception by providing a comprehensive and practical LiDAR and camera sensor fusion algorithm. This approach has significant implications for autonomous vehicles, robotics, and augmented reality, where accurate and reliable perception is vital for successful navigation and task execution. By addressing the limitations of individual sensors and offering a more unified and coherent perception system, the proposed algorithm paves the way for safer, more efficient, and more intelligent mobile robot solutions in real-world settings.
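The data association step described in the abstract depends on projecting calibrated LiDAR points into the camera image plane so they can be matched against visual features. The thesis record does not include code, so the sketch below is only a generic illustration of that standard projection step, assuming a NumPy environment, a pinhole intrinsic matrix K, and a LiDAR-to-camera extrinsic (R, t); the function and parameter names are hypothetical, not the author's implementation.

    import numpy as np

    def project_lidar_to_image(points_lidar, K, R, t, image_shape):
        """Project 3D LiDAR points into a camera image.

        points_lidar : (N, 3) array of points in the LiDAR frame.
        K            : (3, 3) camera intrinsic matrix.
        R, t         : extrinsic rotation (3, 3) and translation (3,)
                       mapping LiDAR coordinates into the camera frame.
        image_shape  : (height, width) of the camera image.

        Returns pixel coordinates (M, 2) and the indices of the LiDAR
        points that land inside the image with positive depth.
        """
        # Transform points from the LiDAR frame into the camera frame.
        points_cam = points_lidar @ R.T + t

        # Keep only points in front of the camera (positive depth).
        in_front = points_cam[:, 2] > 0
        points_cam = points_cam[in_front]

        # Perspective projection: apply intrinsics, then divide by depth.
        pixels_h = points_cam @ K.T                # homogeneous pixel coords
        pixels = pixels_h[:, :2] / pixels_h[:, 2:3]

        # Discard projections that fall outside the image bounds.
        h, w = image_shape
        in_bounds = (
            (pixels[:, 0] >= 0) & (pixels[:, 0] < w) &
            (pixels[:, 1] >= 0) & (pixels[:, 1] < h)
        )
        kept_indices = np.flatnonzero(in_front)[in_bounds]
        return pixels[in_bounds], kept_indices

In a fusion pipeline of the kind the abstract outlines, the returned pixel coordinates would then be matched against detected image features or bounding boxes, giving each matched detection a 3D position from the corresponding LiDAR point; the self-calibration contribution of the thesis would refine (R, t) online rather than assume them fixed.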
dc.identifier.uri: https://hdl.handle.net/10323/18237
dc.relation.department: Electrical and Computer Engineering
dc.subject: Autonomous vehicles
dc.subject: LiDAR and Camera sensor fusion
dc.subject: Multi-sensor data fusion
dc.subject: Self-calibration algorithm
dc.subject: Sensor data mis-alignment
dc.subject: Tracking algorithm
dc.title: SELF-CALIBRATING FUSION OF MULTI-SENSOR SYSTEM FOR AUTONOMOUS VEHICLE OBSTACLE DETECTION AND TRACKING

Files

Original bundle
Name: Tian_oakland_0446E_10366.pdf
Size: 8.31 MB
Format: Adobe Portable Document Format