Active Perception in SLAM Exploration: Problem Formulation and Methods Review

Authors

  • Wenlong Du

DOI:

https://doi.org/10.54097/d8hcv003

Keywords:

Active perception, Simultaneous localization and mapping, Robotic exploration.

Abstract

Active SLAM (A-SLAM) enables an agent to explore and map its environment autonomously, provides robustness and accuracy in both localization and mapping, and enhances environmental adaptivity in applications. Although formulated holistically as a probabilistic problem, it is typically decomposed into three decoupled stages that are solved in turn: identification of candidate actions, computation of cost and utility, and action execution. The identification stage must reason about what to look at, and where, in order to reduce uncertainty efficiently; it employs sensing strategies and performs search or optimization over appropriate environmental representations so as to feed the subsequent stages with a discretized set of planning goals. This is, by definition, the active perception problem. The activeness of active SLAM is therefore largely decided at the perception stage through appropriate quantifying measures. This paper revisits the formulation of the A-SLAM problem with emphasis on the active perception stage, reviews the environmental representations (maps) and the associated measures of information and uncertainty, underlines a sample-based objective of maximizing the information gain of sensors with respect to the corresponding maps, and comments on some novel approaches.
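As a concrete illustration of the perception stage described above, the following is a minimal, self-contained sketch (not the paper's own method) of frontier-based goal identification over an occupancy grid, with each candidate goal scored by the Shannon entropy of the unknown cells a sensor would reveal, a common proxy for expected information gain. All names, thresholds, and the square sensor footprint are illustrative assumptions.

```python
import numpy as np

# Assumed conventions: an occupancy grid stores P(occupied) per cell, with
# unknown cells at 0.5; frontiers (free cells bordering unknown space) form
# the discretized set of candidate planning goals.

FREE, UNKNOWN = 0.2, 0.5   # assumed free threshold and unknown prior


def cell_entropy(p):
    """Shannon entropy (bits) of a Bernoulli occupancy probability."""
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))


def find_frontiers(grid):
    """Return (row, col) of free cells with at least one unknown 4-neighbor."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] > FREE:          # skip cells not confidently free
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers


def information_gain(grid, goal, sensor_radius=3):
    """Sum the entropy of unknown cells within a square sensor window."""
    r0, c0 = goal
    rows, cols = grid.shape
    gain = 0.0
    for r in range(max(0, r0 - sensor_radius), min(rows, r0 + sensor_radius + 1)):
        for c in range(max(0, c0 - sensor_radius), min(cols, c0 + sensor_radius + 1)):
            if grid[r, c] == UNKNOWN:
                gain += cell_entropy(grid[r, c])
    return gain


# Toy map: left half explored and free, right half still unknown.
grid = np.full((10, 10), UNKNOWN)
grid[:, :5] = 0.1

candidates = find_frontiers(grid)
best = max(candidates, key=lambda g: information_gain(grid, g))
print(f"{len(candidates)} frontier goals; most informative: {best}")
```

Real systems replace the square window with a ray-cast sensor model and trade the gain off against travel cost, but the structure (enumerate candidate goals, score each by an uncertainty measure, pick the maximizer) is the decoupled pipeline the abstract refers to.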

Published

11-12-2024

How to Cite

Du, W. (2024). Active Perception in SLAM Exploration: Problem Formulation and Methods Review. Highlights in Science, Engineering and Technology, 119, 85-91. https://doi.org/10.54097/d8hcv003