An automated navigation system for blind people

Md. Atiqur Rahman, Sadia Siddika, Md. Abdullah Al-Baky, Md. Jueal Mia

Abstract


Proper navigation and detailed perception of familiar and unfamiliar environments are essential to human life. The sense of sight helps humans avoid danger and navigate indoor and outdoor environments; for blind people, these activities are challenging in any environment. Thanks to technology, many assistive tools have been developed, such as braille compasses and white canes, that help them navigate their surroundings. In this work, a vision- and cloud-based navigation system for visually impaired or blind people was developed. Our aim was not only to guide their navigation but also to enable them to perceive the environment in as much detail as a sighted person. The proposed system uses ultrasonic sensors to detect obstacles and a stereo camera to capture video, which is analyzed with deep learning algorithms to perceive the environment. A face recognition module identifies known faces in front of the user. Blind users interact with the whole system through a speech recognition module, and all information is stored in the cloud. Web and Android applications were developed to track blind users so that guardians can monitor them while they are out and reach them in an emergency. The experimental results showed that the proposed system provides richer information and user-friendly interaction.
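The abstract does not give implementation details, but the described pipeline (ultrasonic obstacle warnings, YOLO-based scene description, and spoken feedback) could be wired together roughly as in the following sketch. This is not the authors' published code: the GPIO pin numbers, the warning distance, the "yolov8n.pt" model file (ultralytics), and the use of pyttsx3 for speech output are all illustrative assumptions.

```python
import time
import cv2
import pyttsx3
import RPi.GPIO as GPIO
from ultralytics import YOLO

TRIG, ECHO = 23, 24          # assumed wiring of an HC-SR04-style sensor
OBSTACLE_CM = 100            # assumed warning distance

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

speaker = pyttsx3.init()
detector = YOLO("yolov8n.pt")     # assumed pretrained object detector
camera = cv2.VideoCapture(0)      # assumed single camera stream

def distance_cm():
    """Measure distance with a 10 us trigger pulse and echo timing."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        stop = time.time()
    return (stop - start) * 34300 / 2   # speed of sound: 343 m/s

def announce(text):
    speaker.say(text)
    speaker.runAndWait()

try:
    while True:
        # 1) Ultrasonic obstacle warning
        d = distance_cm()
        if d < OBSTACLE_CM:
            announce(f"Obstacle ahead, about {int(d)} centimeters")

        # 2) Visual scene description with a YOLO detector
        ok, frame = camera.read()
        if ok:
            result = detector(frame, verbose=False)[0]
            labels = {detector.names[int(b.cls)] for b in result.boxes}
            if labels:
                announce("I can see " + ", ".join(sorted(labels)))
        time.sleep(1.0)
finally:
    camera.release()
    GPIO.cleanup()
```

In the paper's system, this loop would additionally feed the camera stream to face recognition and optical character recognition modules and push location data to the cloud for the guardian-facing web and Android applications.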

Keywords


Blind navigation; Deep learning; Face recognition; Optical character recognition; Speech processing; YOLO



DOI: https://doi.org/10.11591/eei.v11i1.3452



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
