Analysis of Interaction Methods in Virtual Reality (VR)
DOI: https://doi.org/10.54097/hset.v39i.6559
Keywords: Virtual Reality; Interactive Experience; Motion Capture; Eye Tracking; Haptic Feedback
Abstract
Virtual Reality is a new revolution in interaction. As a technology capable of tricking the human brain, Virtual Reality is widely used in healthcare, education, business, entertainment and industry. Developers in the VR field are committed to giving users a complete and realistic interactive experience, for example by conveying the senses of touch, smell, sight and hearing to the user's brain and reproducing first-hand the user's perception of the real world. This requires not only powerful technology but also sophisticated interaction design. Many major VR companies are currently working on technologies and systems to enhance the user's interaction experience in the virtual world. This article analyses three of the main interaction methods (motion capture, eye tracking and haptic feedback) and their associated devices, and provides a brief comparison and summary of them. Although interaction methods in VR are not yet unified and the virtual reality devices currently on the market each have their own flaws, with more and more players entering the field and the rapid development of the underlying technology, it is reasonable to expect explosive growth in virtual reality interaction devices in the near future.
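As a rough illustration of the kind of interaction loop the article surveys (eye tracking driving object selection, confirmed by haptic feedback), the minimal Python sketch below shows gaze-ray dwell selection against spherical targets. It is a sketch under assumptions only: the synthetic gaze stream, the target list and the stubbed haptic cue are hypothetical and do not come from the article or from any real VR SDK.

```python
"""Minimal sketch of gaze-based (dwell) selection with a stubbed haptic cue.

All inputs here are hypothetical stand-ins, not a real eye-tracking SDK;
they only illustrate the flow: eye tracking -> ray cast -> select -> haptics.
"""
from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def sub(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

    def dot(self, o: "Vec3") -> float:
        return self.x * o.x + self.y * o.y + self.z * o.z


@dataclass
class Target:
    name: str
    center: Vec3
    radius: float


def gaze_hits(origin: Vec3, direction: Vec3, target: Target) -> bool:
    """Ray-sphere test: does the gaze ray pass within the target's radius?"""
    to_center = target.center.sub(origin)
    t = to_center.dot(direction)            # projection onto the (unit) gaze ray
    if t < 0:
        return False                        # target lies behind the viewer
    closest_sq = to_center.dot(to_center) - t * t
    return closest_sq <= target.radius ** 2


def dwell_select(samples, targets, dwell_s=0.5):
    """Return the first target fixated continuously for at least `dwell_s` seconds.

    `samples` is an iterable of (timestamp, origin, unit_direction) gaze samples,
    in the form a hypothetical eye tracker might stream them.
    """
    fixated, since = None, None
    for ts, origin, direction in samples:
        hit = next((t for t in targets if gaze_hits(origin, direction, t)), None)
        if hit is not fixated:
            fixated, since = hit, ts        # gaze moved to a new target (or to none)
        elif fixated is not None and ts - since >= dwell_s:
            return fixated                  # dwell threshold reached: select it
    return None


if __name__ == "__main__":
    button = Target("menu_button", Vec3(0.0, 0.0, 2.0), 0.15)
    # Synthetic gaze stream: ~0.6 s of samples looking straight ahead at the button.
    stream = [(0.1 * i, Vec3(0, 0, 0), Vec3(0, 0, 1)) for i in range(7)]
    chosen = dwell_select(stream, [button])
    if chosen:
        # A real system would fire a short controller vibration here as haptic feedback.
        print(f"Selected {chosen.name}; firing a haptic pulse (stub).")
```

The dwell timer is the usual remedy for the "Midas touch" problem in gaze interaction: a target is only selected after the gaze stays on it for a set time, and the haptic pulse stands in for the confirmation cue a production system would deliver through a controller or glove.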
License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.