Linear fusion approach to convolutional neural networks for facial emotion recognition
Abstract
Facial expression recognition is a challenging problem in the field of computer vision. Several facial expression recognition (FER) algorithms have been proposed in machine learning and deep learning to extract expression knowledge from facial representations. Although numerous algorithms have been examined, several issues remain, such as lighting changes, rotations, and occlusions. In this study, we present an efficient approach to enhance recognition accuracy that uses transfer learning to fine-tune the parameters of a pre-trained model (VGG19) alongside a non-pre-trained convolutional neural network (CNN) for the task of image classification. The VGG19 network and the convolutional network derive two channels of expression-related characteristics from facial grayscale images. A linear fusion algorithm determines the class by averaging the classification decisions of the two channels on the training samples. The final recognition is computed by a convolutional neural network architecture followed by a softmax classifier. The proposed algorithm can recognize seven basic facial emotions (BEs): happiness, surprise, anger, sadness, fear, disgust, and neutral. The average accuracies on the standard datasets CK+ and JAFFE are 98.3% and 92.4%, respectively. Compared with a single-channel deep network, the proposed algorithm achieves well comparable performance.
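The decision-level fusion the abstract describes can be sketched as follows. This is a minimal illustration assuming PyTorch, equal fusion weights, 224x224 inputs, and a hypothetical small CNN for the non-pre-trained channel; the layer sizes and fusion weights are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: two-channel linear fusion of softmax decisions (assumed details, not the paper's exact setup).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_EMOTIONS = 7  # happiness, surprise, anger, sadness, fear, disgust, neutral

class SmallCNN(nn.Module):
    """Hypothetical non-pre-trained channel: a small CNN trained from scratch."""
    def __init__(self, num_classes=NUM_EMOTIONS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 56 * 56, num_classes)  # assumes 224x224 input

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Pre-trained channel: VGG19 with its final layer replaced by a 7-class head,
# so it can be fine-tuned for emotion recognition (transfer learning).
vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
vgg.classifier[6] = nn.Linear(4096, NUM_EMOTIONS)

cnn = SmallCNN()
vgg.eval()
cnn.eval()

def fused_prediction(x):
    """Linear fusion: average the two channels' class probabilities,
    then take the argmax as the final emotion label."""
    with torch.no_grad():
        p_vgg = F.softmax(vgg(x), dim=1)
        p_cnn = F.softmax(cnn(x), dim=1)
    p = (p_vgg + p_cnn) / 2.0  # equal-weight average of the two decisions (assumed)
    return p.argmax(dim=1)

# Grayscale face images would be replicated across 3 channels to match VGG19's input.
x = torch.randn(1, 3, 224, 224)  # dummy batch standing in for a preprocessed face
print(fused_prediction(x))
```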
Keywords
Convolutional neural networks; Deep learning; Machine learning; Transfer learning; Visual geometry group
DOI: http://doi.org/10.11591/ijeecs.v25.i3.pp1489-1500
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760