A lightweight method for face expression recognition based on improved MobileNetV3

Bibliographic Details
Main Authors: Xunru Liang, Jianfeng Liang, Tao Yin, Xiaoyu Tang
Format: Article
Language: English
Published: Wiley, 2023-06-01
Series: IET Image Processing
Online Access: https://doi.org/10.1049/ipr2.12798
Description
Summary: Facial expression recognition plays a significant role in human–machine interaction. However, existing models typically suffer from numerous parameters, large model sizes, and high computational costs, making them difficult to deploy on resource-constrained devices. This paper proposes a lightweight network based on an improved MobileNetV3 to mitigate these drawbacks. First, the channels in the high-level layers are adjusted to reduce the parameter count and model size; then, a coordinate attention mechanism is introduced, which strengthens the network's attention at the cost of few parameters and little computation. Furthermore, a complementary pooling structure is designed to improve the coordinate attention mechanism, enabling it to help the network extract salient features more fully. In addition, the network is trained with a joint loss combining the softmax loss and the centre loss, which minimizes the intra-class gap and improves classification performance. Finally, the network is trained and tested on the public datasets FERPlus and RAF-DB, achieving best accuracies of 87.5% and 86.6%, respectively. The FLOPs, parameter count, and memory footprint are only 0.19 GMac, 1.3 M, and 15.9 MB, respectively, lighter than most state-of-the-art networks. Code is available at https://github.com/RIS-LAB1/FER-mobilenet.
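
The abstract names two well-documented building blocks: the coordinate attention mechanism (Hou et al., 2021) and the centre loss (Wen et al., 2016). The PyTorch sketch below illustrates the attention module under stated assumptions; it is a minimal rendering of the published technique, not the authors' implementation (see the repository linked above). It uses plain average pooling along each spatial axis, as in the original coordinate attention paper; the complementary pooling structure described in the abstract is this paper's own refinement and is not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoordinateAttention(nn.Module):
    """Coordinate attention sketch (Hou et al., 2021): pooling along height
    and width separately lets the attention map keep positional information."""

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Average-pool along width and height separately (the paper improves
        # this step with its complementary pooling structure).
        x_h = x.mean(dim=3, keepdim=True)                  # (n, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).transpose(2, 3)  # (n, c, w, 1)
        # Shared 1x1 transform over the concatenated directional descriptors.
        y = F.hardswish(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        # Per-direction attention weights, broadcast back over the input.
        a_h = torch.sigmoid(self.conv_h(y_h))                  # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.transpose(2, 3)))  # (n, c, 1, w)
        return x * a_h * a_w

The joint objective simply adds a weighted centre-loss term to the standard cross-entropy (softmax) loss. In the sketch below, the 8-class setting matches FERPlus, while the feature dimension and the weight lambda_c are placeholder assumptions, not the paper's reported settings.

class CenterLoss(nn.Module):
    """Centre loss sketch (Wen et al., 2016): pulls each feature towards a
    learnable centre of its ground-truth class, shrinking the intra-class gap."""

    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Mean squared distance between each feature and its class centre.
        return ((features - self.centers[labels]) ** 2).sum(dim=1).mean() / 2

# Joint loss: cross-entropy (softmax loss) plus a weighted centre-loss term.
# num_classes=8 matches FERPlus; feat_dim and lambda_c are assumed values.
center_loss = CenterLoss(num_classes=8, feat_dim=128)

def joint_loss(logits, features, labels, lambda_c=0.01):
    return F.cross_entropy(logits, labels) + lambda_c * center_loss(features, labels)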
ISSN: 1751-9659, 1751-9667