Influence of Spatial Perception Ability on Virtual Annotation Response in Teleoperation of Excavator

Authors

  • Xiaomeng Li
  • Xing Su

DOI:

https://doi.org/10.54097/fcis.v3i3.8564

Keywords:

Teleoperation, Virtual Annotations, Spatial Perception Ability

Abstract

Virtual annotation (VA), which compensates for visual limitations during teleoperation, can provide critical instructional information, but whether an operator can respond to VA in a timely and accurate manner while operating remains to be verified. Studies have demonstrated a positive relationship between spatial perception ability (SPA) and teleoperation performance, and many have argued that operators with greater SPA experience a lower mental workload and can devote more attention to other tasks. This study therefore examined whether an operator with greater SPA responds better to VA. Quantitative results from the SPA test questionnaire and the teleoperated excavator experiment showed a significant inverse relationship between SPA and VA response time. These results are expected to improve the effectiveness of teleoperator selection and to optimize prior management.

References

Ha Q, Yen L, Balaguer C. Robotic autonomous systems for earthmoving in military applications [J]. Automation in Construction, 2019, 107: 102934.

Zhou T, Zhu Q, Du J. Intuitive robot teleoperation for civil engineering operations with virtual reality and deep learning scene reconstruction [J]. Advanced Engineering Informatics, 2020, 46: 101170.

Nagano H, Takenouchi H, Cao N, et al. Tactile feedback system of high-frequency vibration signals for supporting delicate teleoperation of construction robots [J]. Advanced Robotics, 2020, 34(11): 730-743.

Hong Z, Zhang Q, Su X, et al. Effect of virtual annotation on performance of construction equipment teleoperation under adverse visual conditions [J]. Automation in Construction, 2020, 118: 103296.

Su X, Talmaki S, Cai H, et al. Uncertainty-aware visualization and proximity monitoring in urban excavation: a geospatial augmented reality approach [J]. Visualization in Engineering, 2013, 1(1): 1-13.

Liu A M, Oman C M, Galvan R, et al. Predicting space telerobotic operator training performance from human spatial ability assessment [J]. Acta Astronautica, 2013, 92(1): 38-47.

Zetzsche C, Galbraith C, Wolter J, et al. Navigation based on a sensorimotor representation: a virtual reality study[C]. Human Vision and Electronic Imaging XII. SPIE, 2007, 6492: 526-539.

Bolton A, Burnett G, Large D R. An investigation of augmented reality presentations of landmark-based navigation using a head-up display[C]. Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. 2015: 56-63.

Guven S, Feiner S. Visualizing and navigating complex situated hypermedia in augmented and virtual reality[C]. 2006 IEEE/ACM International Symposium on Mixed and Augmented Reality. IEEE, 2006: 155-158.

Andersen D, Popescu V, Cabrera M E, et al. Virtual annotations of the surgical field through an augmented reality transparent display [J]. The Visual Computer, 2016, 32: 1481-1498.

Leo J, Zhou Z, Yang H, et al. Interactive Cardiovascular Surgical Planning via Augmented Reality[C]. Asian CHI Symposium 2021. 2021: 132-135.

Gauglitz S, Nuernberger B, Turk M, et al. World-stabilized annotations and virtual scene navigation for remote collaboration [C]. Proceedings of the 27th annual ACM symposium on User interface software and technology. 2014: 449-459.

Leutert F, Schilling K. Projector-based augmented reality for telemaintenance support [J]. IFAC-PapersOnLine, 2018, 51(11): 502-507.

Fang Y, Cho Y K, Chen J. A framework for real-time pro-active safety assistance for mobile crane lifting operations [J]. Automation in Construction, 2016, 72: 367-379.

Cheng T, Teizer J. Real-time resource location data collection and visualization technology for construction safety and activity monitoring applications [J]. Automation in Construction, 2013, 34: 3-15.

Menchaca-Brandan M A, Liu A M, Oman C M, et al. Influence of perspective-taking and mental rotation abilities in space teleoperation[C]. Proceedings of the ACM/IEEE International Conference on Human-robot interaction. 2007: 271-278.

Hegarty M. Components of spatial intelligence [M]. Psychology of Learning and Motivation. Elsevier, 2010: 265-297.

Ekstrom R, French J, Harman H, et al. Manual for kit of factor referenced cognitive tests [M]. Princeton, New Jersey: Educational Testing Service, 1976.

Long L O, Gomer J A, Wong J T, et al. Visual spatial abilities in uninhabited ground vehicle task performance during teleoperation and direct line of sight [J]. Presence: Teleoperators and Virtual Environments, 2011, 20(5): 466-479.

Pellegrino J W, Alderton D L, Shute V J. Understanding spatial ability [J]. Educational Psychologist, 1984, 19(4): 239-253.

Kozhevnikov M, Hegarty M. A dissociation between object manipulation spatial ability and spatial orientation ability [J]. Memory & Cognition, 2001, 29: 745-756.

Hegarty M, Waller D. A dissociation between mental rotation and perspective-taking spatial abilities [J]. Intelligence, 2004, 32(2): 175-191.

Dror I E, Kosslyn S M, Waag W L. Visual-spatial abilities of pilots [J]. Journal of Applied Psychology, 1993, 78(5): 763.

Published

17-05-2023

Issue

Vol. 3 No. 3 (2023)

Section

Articles

How to Cite

Li, X., & Su, X. (2023). Influence of Spatial Perception Ability on Virtual Annotation Response in Teleoperation of Excavator. Frontiers in Computing and Intelligent Systems, 3(3), 43-50. https://doi.org/10.54097/fcis.v3i3.8564