Integrating Eyeglass Camera And Ultrasonic Smart Cane With Enhanced Navigation For Blind

Authors

  • Lakshmi Priya S
  • Mohanram M
  • Sivaraj R
  • Vishnu Rajaperumal M

DOI:

https://doi.org/10.53555/jaz.v45i4.4698

Keywords:

Assistive technology, Object detection, Visually impaired, Obstacle detection, Ultrasonic sensor

Abstract

The integration of intelligent wearable aids is transforming accessibility solutions for people with visual and hearing impairments. Advanced CNN algorithms allow these devices to interpret the environment accurately in real time. The eyeglass camera and ultrasonic sensors improve mobility and safety by detecting obstacles and conveying essential visual information to the user. For people with visual impairments, the system announces the presence and location of obstacles, allowing them to navigate safely. At the same time, a vibration feedback mechanism provides tactile cues for people with hearing impairments, ensuring inclusive accessibility. These devices analyze visual input and provide intuitive feedback, enabling users to navigate their surroundings independently and safely. Continuous learning allows adaptation to new environments and ensures effectiveness across different scenarios. This approach represents a significant advance in improving the quality of life of people with disabilities. The seamless integration of AI, CNN algorithms, and sensory hardware highlights the transformative potential of assistive technology in promoting independence and inclusion.
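The abstract's obstacle-alert pipeline (ultrasonic distance reading mapped to audio or vibration feedback) can be sketched as follows. This is a minimal illustrative sketch, not code from the paper: the threshold values and the function name `obstacle_alert` are assumptions, and sensor and motor I/O are stubbed out.

```python
def obstacle_alert(distance_cm: float,
                   warn_cm: float = 150.0,
                   danger_cm: float = 50.0) -> str:
    """Map an ultrasonic distance reading to a feedback level.

    Thresholds are illustrative assumptions, not values from the paper.
    """
    if distance_cm <= danger_cm:
        return "vibrate_strong"   # imminent obstacle: strong tactile pulse
    if distance_cm <= warn_cm:
        return "vibrate_soft"     # obstacle ahead: gentle warning pulse
    return "idle"                 # path clear: no feedback


if __name__ == "__main__":
    # Simulated readings in place of a live HC-SR04-style sensor
    for d in (30.0, 100.0, 400.0):
        print(f"{d:.0f} cm -> {obstacle_alert(d)}")
```

On the actual device, the returned level would drive a vibration motor (for users with hearing impairments) or a voice prompt (for users with visual impairments), keeping the feedback path identical regardless of which output modality is active.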


Author Biographies

Lakshmi Priya S

Assistant Professor, Department of Biomedical Engineering, V.S.B. Engineering College, India

Mohanram M

UG Student, Department of Biomedical Engineering, V.S.B. Engineering College, India

Sivaraj R

UG Student, Department of Biomedical Engineering, V.S.B. Engineering College, India

Vishnu Rajaperumal M

UG Student, Department of Biomedical Engineering, V.S.B. Engineering College, India

References

V. V. Meshram, K. Patil, V. A. Meshram, and F. C. Shu, "An Astute Assistive Device for Mobility and Object Recognition for Visually Impaired People," IEEE Transactions on Human-Machine Systems, vol. 49, no. 5, pp. 449-460, 2019.

R. Jafri, R. L. Campos, S. A. Ali, and H. R. Arabnia, "Visual and Infrared Sensor Data-Based Obstacle Detection for the Visually Impaired Using the Google Project Tango Tablet Development Kit and the Unity Engine," IEEE Access, vol. 6, pp. 443-454, 2018.

R. R. Bourne et al., "Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: a systematic review and meta-analysis," The Lancet Global Health, vol. 5, no. 9, pp. e888-e897, 2017.

E. Cardillo et al., "An Electromagnetic Sensor Prototype to Assist Visually Impaired and Blind People in Autonomous Walking," in IEEE Sensors Journal, vol. 18, no. 6, pp. 2568-2576, 2018.

A. Riazi, F. Riazi, R. Yoosfi, and F. Bahmaei, "Outdoor difficulties experienced by a group of visually impaired Iranian people," Journal of current ophthalmology, vol. 28, no. 2, pp. 85-90, 2016.

M. M. Islam, M. S. Sadi, K. Z. Zamli, and M. M. Ahmed, "Developing Walking Assistants for Visually Impaired People: A Review," IEEE Sensors Journal, vol. 19, no. 8, pp. 2814-2828, 2019.

S. Mahmud, X. Lin and J. Kim, "Interface for Human Machine Interaction for assistant devices: A Review," in 2020 10th Annual Computing and Communication Workshop and Conference (CCWC), 2020, pp. 0768-0773.

R. Tapu, B. Mocanu, and T. Zaharia, "Wearable assistive devices for visually impaired: A state of the art survey," Pattern Recognition Letters, 2018.

N. Sahoo, H.-W. Lin, and Y.-H. Chang, "Design and Implementation of a Walking Stick Aid for Visually Challenged People," Sensors, vol. 19, no. 1, 2019.

H. Zhang and C. Ye, "An Indoor Wayfinding System Based on Geometric Features Aided Graph SLAM for the Visually Impaired."

M. Mahendru and S. K. Dubey, "Real Time Object Detection with Audio Feedback using Yolo vs. Yolo_v3," in 2021 11th International Conference on Cloud Computing, Data Science & Engineering (Confluence), IEEE, 2021.

M. S. Samhita et al., "A critical investigation on blind guiding devices using CNN algorithm based on motion stereo tomography images," Materials Today: Proceedings, 2021.

F. Ashiq et al., "CNN-based object recognition and tracking system to assist visually impaired people," IEEE Access, vol. 10, pp. 14819-14834, 2022.

S. Zheng, Y. Wu, S. Jiang, C. Lu, and G. Gupta, "Deblur-YOLO: Real-time object detection with efficient blind motion deblurring," in 2021 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, IEEE, 2021.

M. Konaite et al., "Smart Hat for the blind with Real-Time Object Detection using Raspberry Pi and TensorFlow Lite," in Proceedings of the International Conference on Artificial Intelligence and its Applications, 2021.

Published

2024-03-16

Issue

Section

Articles
