An Adam based CNN and LSTM approach for sign language recognition in real time for deaf people

Subrata Kumer Paul, Md. Abul Ala Walid, Rakhi Rani Paul, Md. Jamal Uddin, Md. Sohel Rana, Maloy Kumar Devnath, Ishaat Rahman Dipu, Md. Momenul Haque

Abstract


Hand gestures and sign language are crucial modes of communication for deaf individuals. Because most people do not understand sign language, communication between a mute person and a hearing person is difficult. Owing to technological progress, computer vision and deep learning can now be brought to bear on this problem. This paper presents two deep learning approaches to sign language recognition that help non-signers understand sign language and thereby improve communication. Two separate datasets based on American sign language (ASL) have been constructed: the first contains 26 signs, and the second contains three significant signs used in everyday communication, captured as sequences of frames (videos). The study examines three models: an improved ResNet-based convolutional neural network (CNN), a long short-term memory (LSTM) network, and a gated recurrent unit (GRU) network. The CNN is trained and evaluated on the first dataset and, with the adaptive moment estimation (Adam) optimizer, achieves an accuracy of 89.07%. The LSTM and GRU are trained on the second dataset and compared; the LSTM outperforms the GRU in every class, reaching 94.3% accuracy against 79.3% for the GRU. The real-time performance of these preliminary models is also highlighted.
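
The following Keras sketch illustrates the general setup the abstract describes: a small CNN for the 26 static signs compiled with the Adam optimizer, and an LSTM classifier for the three sequence-based signs. It is not the authors' exact architecture; the input shapes (64x64 grayscale frames, 30-step feature sequences of width 1662), layer sizes, and learning rate are assumptions for illustration only.

import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative CNN for static signs (26 ASL letters), compiled with Adam.
cnn = models.Sequential([
    layers.Input(shape=(64, 64, 1)),          # assumed frame size
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(26, activation="softmax"),   # one output per sign
])
cnn.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])

# Illustrative LSTM for dynamic signs (3 classes) from frame sequences.
lstm = models.Sequential([
    layers.Input(shape=(30, 1662)),           # assumed: 30 timesteps of per-frame features
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    layers.Dense(3, activation="softmax"),
])
lstm.compile(optimizer="adam",
             loss="categorical_crossentropy",
             metrics=["accuracy"])

A GRU baseline, as compared in the paper, would replace the LSTM layers with layers.GRU of the same widths.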

Keywords


Convolutional neural network; Deep learning; Gated recurrent unit; Long short-term memory; Recognition; Sign language



DOI: https://doi.org/10.11591/eei.v13i1.6059



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).