Implementation of wheelchair controller using mouth and tongue gesture
Abstract
In this paper, a simple human-machine interface that allows people with severe disabilities to control a motorized wheelchair using mouth and tongue gestures is presented. The proposed system consists of three principal phases. The first phase is mouth detection, which is performed using a Haar cascade to detect the face area and template matching to detect mouth and tongue gestures in the lower face region. The second phase is command extraction, carried out by determining the mouth and tongue gesture commands according to the detected gesture, the time taken to execute the gesture, and the previous command, which is stored at each frame. Finally, the gesture commands are sent to the wheelchair as instructions over a Bluetooth serial port. The hardware used for this project comprised a laptop with a universal serial bus (USB) webcam as a vision-based control unit, a Bluetooth module to receive instructions from the vision-based control unit, a standard joystick for use in emergencies, a joystick emulator that delivers signals to the control board similar to those normally generated by the standard joystick, and ultrasonic sensors to provide safe navigation. The experimental results demonstrate the success of the proposed control system based on mouth and tongue gestures.
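For concreteness, a minimal Python sketch of the first phase (Haar-cascade face detection followed by template matching on the lower face region) is given below, using OpenCV. The gesture names, template file names, and the 0.7 match threshold are illustrative assumptions, not the paper's actual parameters.

```python
# Phase 1 sketch: face detection + template matching on the lower face region.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Hypothetical grayscale templates, one per mouth/tongue gesture.
templates = {
    "open_mouth":   cv2.imread("open_mouth.png", cv2.IMREAD_GRAYSCALE),
    "tongue_left":  cv2.imread("tongue_left.png", cv2.IMREAD_GRAYSCALE),
    "tongue_right": cv2.imread("tongue_right.png", cv2.IMREAD_GRAYSCALE),
}

def detect_gesture(frame):
    """Return the best-matching gesture name in the lower face region, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    lower_face = gray[y + h // 2 : y + h, x : x + w]  # bottom half of the face box
    best_name, best_score = None, 0.0
    for name, tmpl in templates.items():
        # Skip missing templates or templates larger than the search region.
        if tmpl is None or tmpl.shape[0] > lower_face.shape[0] \
                or tmpl.shape[1] > lower_face.shape[1]:
            continue
        result = cv2.matchTemplate(lower_face, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score > 0.7 else None  # assumed match threshold
```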
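The second phase combines the detected gesture, how long it has been held, and the previous command. The sketch below shows one plausible way to realize that logic; the one-second hold time, the gesture-to-command mapping, and the rule that repeating the active gesture means "stop" are assumptions, since the abstract does not specify them.

```python
# Phase 2 sketch: turn per-frame gesture labels into wheelchair commands.
import time

HOLD_TIME = 1.0  # seconds a gesture must persist before it fires (assumed)

# Hypothetical mapping from gestures to commands.
GESTURE_TO_COMMAND = {
    "open_mouth":   "FORWARD",
    "tongue_left":  "LEFT",
    "tongue_right": "RIGHT",
}

class CommandExtractor:
    def __init__(self):
        self.prev_command = "STOP"    # previous command, stored across frames
        self.current_gesture = None
        self.gesture_start = 0.0
        self.fired = False            # prevents re-firing while a gesture is held

    def update(self, gesture):
        """Call once per frame; returns the command that should be active."""
        now = time.monotonic()
        if gesture != self.current_gesture:
            # Gesture changed: restart the hold timer.
            self.current_gesture = gesture
            self.gesture_start = now
            self.fired = False
        elif (gesture is not None and not self.fired
              and now - self.gesture_start >= HOLD_TIME):
            new_command = GESTURE_TO_COMMAND.get(gesture, self.prev_command)
            # Assumed rule: repeating the gesture that started the current
            # motion is interpreted as a request to stop.
            if new_command == self.prev_command:
                new_command = "STOP"
            self.prev_command = new_command
            self.fired = True
        return self.prev_command
```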
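The third phase sends each command to the wheelchair over the Bluetooth serial port. A minimal sketch using pyserial follows; the port name, baud rate, and one-byte command encoding are assumptions, as the abstract does not give the wire protocol.

```python
# Phase 3 sketch: transmit commands over the Bluetooth serial port (pyserial).
import serial

# Hypothetical one-byte encoding of each command.
COMMAND_BYTES = {"FORWARD": b"F", "LEFT": b"L", "RIGHT": b"R", "STOP": b"S"}

# Assumed RFCOMM serial binding of the Bluetooth module and baud rate.
link = serial.Serial("/dev/rfcomm0", baudrate=9600, timeout=1)

def send_command(command):
    """Write the encoded command; fall back to STOP for unknown commands."""
    link.write(COMMAND_BYTES.get(command, b"S"))
```

In a control loop, each webcam frame would pass through detect_gesture, then CommandExtractor.update, and the resulting command would go to send_command.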
Keywords
Face detection; Human-machine interface; Mouth gesture; Powered wheelchair; Tongue gesture
DOI: http://doi.org/10.11591/ijeecs.v24.i3.pp1663-1671
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).