A Blind Assistance System Based on Real-Time Object Detection, Timing Analysis and Angle Navigation
DOI: https://doi.org/10.54097/kx353b79
Keywords: Assistive System, YOLOv5, Timing Analysis, Multi-target Tracking, Angle Computation, TensorFlow Lite
Abstract
In this paper, we propose an assistive system for the blind based on real-time object detection, temporal analysis, and angle computation. We run the YOLOv5 model for object detection on a mobile device and combine it with a lightweight multi-target tracking algorithm (DeepSORT) to recognise objects continuously across the video stream, which yields more stable detection results. The system provides directional cues to the blind user by calculating the angle of each detected object relative to the centre of the image, and delivers intuitive environmental descriptions through Text-to-Speech (TTS) output. Tests on Android devices show that, with appropriate model quantisation and optimisation, YOLOv5 achieves real-time detection speeds of around 15-20 FPS, and that multi-target tracking does not add significant latency. The angle computation has an error of about ±5°, which is acceptable and consistent with the accuracy of the phone's sensors and with human alignment ability. The system provides an accurate and portable visual perception assistance solution for blind people.
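The abstract does not spell out how the object's angle is obtained from the image; the Kotlin sketch below shows one plausible way to compute a horizontal angle and turn it into a spoken cue, assuming a pinhole-camera model and a horizontal field of view of roughly 66° (an assumed value, typical for a phone's main camera). The function names and cue thresholds are hypothetical and not taken from the paper.

```kotlin
import kotlin.math.atan
import kotlin.math.tan

/**
 * Estimates the horizontal angle (in degrees) of a detected object relative to the
 * image centre, assuming a pinhole camera with a known horizontal field of view.
 * Positive angles mean the object lies to the right of centre, negative to the left.
 */
fun horizontalAngleDeg(boxCenterX: Float, imageWidth: Int, horizontalFovDeg: Float = 66f): Float {
    // Normalised offset of the box centre from the image centre, in [-1, 1].
    val offset = (boxCenterX - imageWidth / 2f) / (imageWidth / 2f)
    // Map the offset onto the camera's viewing angle via the pinhole model.
    val halfFovRad = Math.toRadians(horizontalFovDeg / 2.0)
    val angleRad = atan(offset * tan(halfFovRad))
    return Math.toDegrees(angleRad).toFloat()
}

/** Converts an angle into a coarse spoken cue suitable for TTS output. */
fun directionCue(angleDeg: Float): String = when {
    angleDeg < -20f -> "to your left"
    angleDeg > 20f  -> "to your right"
    angleDeg < -5f  -> "slightly left"
    angleDeg > 5f   -> "slightly right"
    else            -> "straight ahead"
}

fun main() {
    // Example: a detection centred at x = 960 in a 1280-pixel-wide frame.
    val angle = horizontalAngleDeg(boxCenterX = 960f, imageWidth = 1280)
    println("Object at %.1f degrees, %s".format(angle, directionCue(angle)))
}
```

The ±5° error budget quoted in the abstract would then cover both the field-of-view approximation above and the phone compass drift when the cue is expressed relative to the user's heading.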
License
Copyright (c) 2025 Frontiers in Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.