Visual SLAM Methods for Autonomous Driving Vehicles

Authors

  • Rui Xu

DOI:

https://doi.org/10.54097/vs34j409

Keywords:

Autonomous vehicle; visual SLAM method; localization; mapping.

Abstract

Autonomous driving has recently become a burgeoning field poised to revolutionize transportation, and its widespread adoption is widely anticipated. As vehicles equipped with autonomous capabilities become increasingly prevalent, robust navigation systems become vital. Simultaneous Localization and Mapping (SLAM) methods have emerged as a critical solution to the challenges inherent in autonomous driving. By concurrently creating maps of the environment and accurately localizing vehicles within them, SLAM algorithms enable autonomous vehicles to navigate safely and efficiently in diverse, dynamic, and even GPS-denied environments. This paper aims to elucidate the functionality and principles underpinning SLAM methods, with a particular focus on their application in autonomous driving vehicles. By examining traditional localization methods and their limitations, this paper underscores the pivotal role of SLAM in overcoming these challenges. Furthermore, this paper delves into advancements in visual SLAM technology and its effectiveness in resolving contemporary issues encountered by autonomous vehicles, such as uncertainties in urban environments. The integration of Convolutional Neural Networks (CNNs) with visual SLAM systems is discussed, showcasing the potential to enhance depth estimation, optical flow, feature correspondence, and camera pose estimation. Despite these advancements, persistent challenges remain, including map robustness, computational requirements, and security considerations. Nevertheless, by leveraging visual SLAM technology, autonomous driving vehicles are poised to navigate complex environments with unprecedented precision, paving the way for a future where transportation is safer, more efficient, and more accessible than ever before.
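The localization half of the SLAM loop described in the abstract can be illustrated with a minimal sketch. This toy example (not drawn from the paper itself) estimates the 2D rigid transform between matched feature points in two consecutive frames using the Kabsch/Procrustes method, which is the geometric core of frame-to-frame alignment in a feature-based visual odometry front end; the function name and setup are illustrative assumptions.

```python
import numpy as np

def estimate_pose_2d(prev_pts, curr_pts):
    """Estimate the rigid transform (R, t) with curr ~= R @ prev + t
    from matched 2D feature points, via the Kabsch method."""
    prev_pts = np.asarray(prev_pts, dtype=float)
    curr_pts = np.asarray(curr_pts, dtype=float)
    # Center both point sets on their centroids.
    mu_p, mu_c = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    P, C = prev_pts - mu_p, curr_pts - mu_c
    # SVD of the cross-covariance yields the optimal rotation;
    # the sign correction guards against a reflection solution.
    U, _, Vt = np.linalg.svd(P.T @ C)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_c - R @ mu_p
    return R, t

# Noise-free sanity check: rotate by 30 degrees and translate.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([2.0, -1.0])
prev = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 1.0]])
curr = prev @ R_true.T + t_true
R_est, t_est = estimate_pose_2d(prev, curr)
```

A full visual SLAM system wraps this alignment step in feature detection, outlier rejection, and global map optimization, but the frame-to-frame pose update above is the point where localization and mapping interlock.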


References

Taheri H., Zhao C. X., et al. SLAM; definition and evolution. Engineering Applications of Artificial Intelligence, 2021, 97: 104032. DOI: https://doi.org/10.1016/j.engappai.2020.104032

Barros M. A., Michel M., Moline Y., Corre G., Carrel F., et al. A Comprehensive Survey of Visual SLAM Algorithms. Robotics, 2022, 11(1): 24. DOI: https://doi.org/10.3390/robotics11010024

Cheng J., Zhang L., Chen Q., Hu X., Cai J., et al. A review of visual SLAM methods for autonomous driving vehicles. Engineering Applications of Artificial Intelligence, 2022, 114: 104992. DOI: https://doi.org/10.1016/j.engappai.2022.104992

Milz S., Arbeiter G., Witt C., Abdallah B., Yogamani S., et al. Visual SLAM for Automated Driving: Exploring the Applications of Deep Learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2018: 247-257. DOI: https://doi.org/10.1109/CVPRW.2018.00062

Singandhupe A., La H. M. A Review of SLAM Techniques and Security in Autonomous Driving. In Proceedings of the Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 2019: 602-607. DOI: https://doi.org/10.1109/IRC.2019.00122

Lategahn H., Geiger A., Kitt B. Visual SLAM for autonomous ground vehicles. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 2011: 1732-1737. DOI: https://doi.org/10.1109/ICRA.2011.5979711

Wong K., Gu Y., Kamijo S., et al. Mapping for Autonomous Driving: Opportunities and Challenges. IEEE Intelligent Transportation Systems Magazine, 2021, 13(1): 91-106. DOI: https://doi.org/10.1109/MITS.2020.3014152

Bresson G., Alsayed Z., Yu L., et al. Simultaneous localization and mapping: A survey of current trends in autonomous driving. IEEE Transactions on Intelligent Vehicles, 2017, 2(3): 194-220. DOI: https://doi.org/10.1109/TIV.2017.2749181


Published

19-08-2024

How to Cite

Xu, R. (2024). Visual SLAM Methods for Autonomous Driving Vehicles. Highlights in Science, Engineering and Technology, 111, 138-143. https://doi.org/10.54097/vs34j409