Facial emotion recognition based on upper features and transfer learning
Abstract
Facial expression recognition (FER) in the upper face focuses on the analysis and recognition of emotions based on features extracted from the upper region of the face. This region typically comprises the eyes, eyebrows, forehead, and sometimes the upper cheeks. Since these areas are often less affected by face masks or other facial coverings, FER algorithms can concentrate on capturing and interpreting the relevant facial cues, such as eye movements, eyebrow positions, and forehead wrinkles, to accurately recognize and classify different emotions. By focusing on the upper face, FER systems can mitigate the impact of occlusions caused by masks while still providing meaningful insights into the emotional states of individuals. In this work, a FER approach focusing on the upper face region is proposed. Several experiments were conducted on the CK+ dataset, including a comparison of the emotion recognition scores obtained from the upper face and from the entire face, in order to determine whether this region alone can reflect the genuinely expressed emotion. The results of our approach are promising compared to previous studies, with an accuracy of up to 96%.
Keywords
CK+; Face mesh; Facial emotion recognition; Transfer learning; Upper face region
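The abstract and keywords suggest a pipeline in which a face mesh is used to isolate the upper-face region (eyes, eyebrows, forehead) and a pretrained network is fine-tuned on the resulting crops. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's exact method: the landmark indices, the crop heuristic (everything above the nose tip), the 96x96 input size, the MobileNetV2 backbone, and the 7-class CK+ label set are all assumptions made for the example.

```python
# Hypothetical sketch: upper-face cropping with MediaPipe Face Mesh followed by
# a transfer-learning classifier. Crop heuristic and model head are illustrative
# assumptions, not the authors' exact pipeline.

import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

NUM_CLASSES = 7  # CK+ is commonly used with 7 emotion labels (assumption)


def crop_upper_face(bgr_image):
    """Return the upper-face region (eyebrows, eyes, forehead) as a 96x96 RGB crop."""
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True,
                                         max_num_faces=1) as mesh:
        result = mesh.process(rgb)
    if not result.multi_face_landmarks:
        return None
    h, w = rgb.shape[:2]
    pts = np.array([(lm.x * w, lm.y * h)
                    for lm in result.multi_face_landmarks[0].landmark])
    # Heuristic: keep everything above the nose tip (landmark index 1),
    # spanning the full detected face width.
    nose_y = int(pts[1, 1])
    x_min, x_max = int(pts[:, 0].min()), int(pts[:, 0].max())
    y_min = max(int(pts[:, 1].min()), 0)
    crop = rgb[y_min:max(nose_y, y_min + 1), max(x_min, 0):x_max]
    return cv2.resize(crop, (96, 96))


def build_model():
    """Frozen ImageNet backbone plus a small classification head (transfer learning)."""
    base = tf.keras.applications.MobileNetV2(include_top=False,
                                             weights="imagenet",
                                             input_shape=(96, 96, 3))
    base.trainable = False  # train only the new head first, then optionally fine-tune
    inputs = tf.keras.Input(shape=(96, 96, 3))
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Under these assumptions, the same cropping function would be applied to every CK+ image so that both the upper-face and full-face experiments can be trained and evaluated with an identical backbone, making the two accuracy scores directly comparable.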
DOI: http://doi.org/10.11591/ijeecs.v37.i1.pp530-539
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.