Deep3DCANN: A Deep 3DCNN-ANN framework for spontaneous micro-expression recognition

Selvarajah Thuseethan, Sutharshan Rajasegarar, John Yearwood

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

Facial micro-expressions play a significant role in revealing concealed emotions. However, the recognition of micro-expressions is challenging due to their fleeting nature. Moreover, the visual features of the face and the visual relationships between the facial sub-regions have a strong influence on the presence of micro-expressions. In this work, a novel end-to-end facial micro-expression detection framework, called Deep3DCANN, is proposed that integrates these components for effective detection. The first component of our framework is a deep 3D convolutional neural network that learns useful spatiotemporal features from a sequence of facial images. In the second component, a deep artificial neural network is utilized to trace the useful visual associations between different sub-regions of the face. Furthermore, a carefully crafted fusion mechanism is built to combine the learned facial features and the semantic relationships between the regions to predict the micro-expressions. We also construct a new loss function to jointly optimize both modules of our proposed architecture. Our proposed method performs favourably on five benchmark spontaneous micro-expression databases compared to existing video-based micro-expression recognition baselines. In addition, through an extended experiment, we show that our proposed approach can effectively recognize frame-wise micro-expression changes in a sequence of video frames.
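The abstract describes a two-stream design: a 3D CNN extracts spatiotemporal features from the frame sequence, an ANN scores relationships between facial sub-regions, and a fusion step combines both for classification. The following is only a minimal NumPy sketch of that idea, not the authors' implementation; all layer sizes, the pairwise-product relation encoding, the random weights, and the function name `deep3dcann_forward` are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv3d_valid(clip, kernel):
    """Naive 'valid' 3D convolution over a single-channel clip of shape (T, H, W)."""
    kt, kh, kw = kernel.shape
    T, H, W = clip.shape
    out = np.empty((T - kt + 1, H - kh + 1, W - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[t, i, j] = np.sum(clip[t:t + kt, i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def deep3dcann_forward(clip, regions, n_classes=5):
    """Toy forward pass: 3D-conv stream + region-relationship stream, fused.

    clip    : (T, H, W) grayscale frame sequence
    regions : (R, D) one D-dim descriptor per facial sub-region (assumed given)
    """
    # Stream 1: spatiotemporal features via one 3D conv + global average pooling
    # (a stand-in for the paper's deep 3D CNN).
    kernel = rng.standard_normal((3, 3, 3)) * 0.1
    feat_visual = np.array([relu(conv3d_valid(clip, kernel)).mean()])

    # Stream 2: pairwise relations between sub-region descriptors, scored by a
    # small dense layer (a stand-in for the paper's deep ANN).
    R, D = regions.shape
    pairs = np.stack([regions[i] * regions[j]
                      for i in range(R) for j in range(i + 1, R)])
    W1 = rng.standard_normal((D, 4)) * 0.1
    feat_rel = relu(pairs @ W1).mean(axis=0)

    # Fusion: concatenate both feature vectors and classify.
    fused = np.concatenate([feat_visual, feat_rel])
    Wc = rng.standard_normal((fused.size, n_classes)) * 0.1
    return softmax(fused @ Wc)

# Usage with random stand-in data: an 8-frame 16x16 clip, 4 regions of 6 dims.
probs = deep3dcann_forward(rng.random((8, 16, 16)), rng.random((4, 6)))
```

In the paper both modules are trained jointly under a single loss; here the weights are random purely to show the data flow from the two streams into the fused classifier.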

Original language: English
Pages (from-to): 341-355
Number of pages: 15
Journal: Information Sciences
Volume: 630
DOIs
Publication status: Published - Jun 2023
Externally published: Yes

