Research on the Calibration Method of Laser Radar and Camera Extrinsic Fusion based on Checkerboard Features
DOI: https://doi.org/10.54097/jb15bs61
Keywords: Checkerboard Pattern Features, Multi-sensor Fusion, LiDAR-Camera Calibration, Extrinsic Parameter Calibration Method
Abstract
Extrinsic calibration of LiDAR and camera is a core component of multimodal perception fusion, and its accuracy directly determines the effectiveness of cross-modal information fusion. This article proposes a high-precision method for calibrating the extrinsic parameters between a LiDAR and a camera using a checkerboard calibration board, based on spatial registration theory. To address the difficulty of feature extraction caused by the sparsity of LiDAR point clouds, the method enhances feature information by accumulating point clouds from multiple views and applies filtering algorithms to improve feature saliency. A standardized experimental platform is built on ROS and Matlab, and the reflection intensity distribution of the checkerboard point cloud is exploited to accurately extract the three-dimensional corner coordinates. The corresponding two-dimensional checkerboard corners are detected in the camera image, and the relative pose transformation matrix between the LiDAR and the camera is solved by establishing an accurate matching relationship between the two-dimensional and three-dimensional corner points. Experimental results show that the method yields high-precision calibration with good repeatability, providing reliable extrinsic parameters for multi-sensor fusion perception systems.
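As a rough illustration of the registration step described above, the sketch below solves for a rigid transform between matched 3D corner sets in closed form with an SVD (the Kabsch method). This is an assumption-laden simplification: the paper matches 2D image corners to 3D LiDAR corners (a PnP problem), whereas this sketch assumes corner coordinates are already available in both the LiDAR frame and the camera frame (e.g., after recovering camera-frame corners from the checkerboard's known geometry). All names here are illustrative, not from the paper.

```python
import numpy as np

def rigid_transform(src, dst):
    """Closed-form least-squares rigid transform (Kabsch):
    find R, t minimizing ||dst - (R @ src + t)|| over matched points.
    src, dst: (N, 3) arrays of corresponding 3D corner coordinates."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if det < 0
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Synthetic check: rotate and translate hypothetical checkerboard
# corners, then recover the transform from the correspondences.
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
corners_lidar = np.random.default_rng(0).random((8, 3))
corners_camera = corners_lidar @ R_true.T + t_true
R_est, t_est = rigid_transform(corners_lidar, corners_camera)
```

In the actual 2D-3D setting, the same correspondence idea is typically handled by a PnP solver after undistorting the image corners with the camera intrinsics.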
License
Copyright (c) 2026 Frontiers in Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

