Real-time recognition of American Sign Language using long short-term memory neural network and hand detection

Reham Mohamed Abdulhamied, Mona M. Nasr, Sarah N. Abdul Kader

Abstract


Sign language recognition is important for deaf and mute people because it converts hand gestures into text or speech, helping them communicate and express themselves. This paper's goal is to recognize sign language through action detection, predicting which sign is being demonstrated at any given time without requiring the user to wear any external devices. We captured user signs with a webcam; for example, if the user signs "thank you", the system takes the entire set of frames for that action to determine which sign is being demonstrated. A long short-term memory (LSTM) model is used to produce a real-time sign language detection and prediction flow. We also applied dropout layers during both training and testing to handle overfitting in deep learning models, which noticeably improved the final accuracy. After training and implementing the model, we achieved 99.35% accuracy, allowing deaf and mute people to communicate more easily with society.
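To make the described pipeline concrete, the sketch below shows one plausible Keras realization of a stacked LSTM with dropout operating on per-frame MediaPipe keypoints. The abstract confirms only the ingredients (webcam frames, LSTM, dropout, softmax over signs); the sequence length of 30 frames, the 1662 keypoint values per frame (MediaPipe Holistic: pose, face, and both hands), the layer sizes, and the vocabulary size NUM_SIGNS are all assumptions for illustration, not the authors' published configuration.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

# Hypothetical shapes: 30 frames per action clip and 1662 MediaPipe
# Holistic keypoint values per frame (33 pose landmarks x 4 values,
# 468 face landmarks x 3, and 21 landmarks x 3 per hand).
SEQUENCE_LENGTH = 30
NUM_KEYPOINTS = 1662
NUM_SIGNS = 10  # assumed vocabulary size, e.g. "thank you", "hello", ...

model = Sequential([
    # Stacked LSTMs read the keypoint sequence frame by frame, so the
    # prediction depends on the whole action, not a single frame.
    LSTM(64, return_sequences=True,
         input_shape=(SEQUENCE_LENGTH, NUM_KEYPOINTS)),
    Dropout(0.2),  # dropout to curb overfitting, as the abstract describes
    LSTM(128, return_sequences=False),
    Dropout(0.2),
    Dense(64, activation="relu"),
    Dense(NUM_SIGNS, activation="softmax"),  # one probability per sign
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["categorical_accuracy"])

# At inference time, a sliding window of the last 30 frames of keypoints
# (shape: 1 x 30 x 1662) yields a probability distribution over the signs.
window = np.zeros((1, SEQUENCE_LENGTH, NUM_KEYPOINTS), dtype=np.float32)
probs = model.predict(window)
print(probs.argmax())  # index of the most likely sign
```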

Keywords


Action detection; Hand gesture; LSTM model; MediaPipe; Sign language



DOI: http://doi.org/10.11591/ijeecs.v30.i1.pp545-556



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
