A. Lynn Abbott


This paper describes the use of multi-stage progressive transfer learning (MSPTL) to improve the performance of automated facial emotion recognition (FER). Our proposed FER solution is designed to work with 2D images and classifies facial emotions with high accuracy into six basic categories (happiness, sadness, fear, anger, surprise, and disgust) for both frontal and (more challenging) non-frontal poses. We perform supervised fine-tuning on an AlexNet deep convolutional neural network in a three-stage process, using three FER datasets in succession. The first two training stages are based on FER datasets containing frontal images only. The final training stage uses a third FER dataset that includes non-frontal poses in images that are relatively low in resolution and/or partially occluded. Experimental results demonstrate that our proposed MSPTL approach outperforms conventional transfer learning (TL) and other progressive transfer learning (PTL) systems for FER on both frontal and non-frontal face poses. These results are demonstrated using two different testing datasets (VT-KFER and 300W), which corroborates the generality of the proposed solution and its robustness in handling a wide range of poses, occlusions, and expression intensities.
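The three-stage process described in the abstract can be sketched at a high level: a pretrained network is fine-tuned on each dataset in succession, with every stage starting from the weights produced by the previous one. The toy sketch below illustrates only this staged-training structure; the stand-in `fine_tune` step, the stage labels, and the learning rates are illustrative assumptions, not the authors' implementation (which fine-tunes an actual AlexNet).

```python
# Toy sketch of multi-stage progressive transfer learning (MSPTL).
# The "model" here is a plain dict and fine_tune is a stand-in for one
# supervised fine-tuning stage; in the paper each stage fine-tunes an
# AlexNet CNN on a real FER dataset.

def fine_tune(model, dataset, lr):
    """Stand-in for one fine-tuning stage: record which dataset the
    model has been adapted to, and at what (assumed) learning rate."""
    model = dict(model)  # copy, so each stage builds on the previous weights
    model["history"] = model["history"] + [(dataset, lr)]
    return model

def msptl(pretrained_model, stages):
    """Apply each fine-tuning stage in order; each stage starts from the
    result of the previous one (progressive transfer)."""
    model = pretrained_model
    for dataset, lr in stages:
        model = fine_tune(model, dataset, lr)
    return model

# Pretrained starting point (e.g. AlexNet trained on ImageNet).
alexnet = {"init": "ImageNet", "history": []}

# Three stages: two frontal-only FER datasets, then a harder dataset with
# non-frontal poses and partial occlusion. Labels and rates are assumptions.
stages = [
    ("frontal FER dataset A", 1e-3),
    ("frontal FER dataset B", 1e-4),
    ("non-frontal / occluded FER dataset", 1e-5),
]

final_model = msptl(alexnet, stages)
print([d for d, _ in final_model["history"]])
```

The point of the sketch is the chaining: unlike single-stage TL, each fine-tuning stage inherits the adapted weights of the stage before it, moving from easier (frontal) to harder (non-frontal, occluded) data.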

Sherin F. Aly, A. Lynn Abbott: Facial Emotion Recognition with Varying Poses and/or Partial Occlusion Using Multi-stage Progressive Transfer Learning. SCIA 2019: 101-112

Publication Details

Date of publication:
May 12, 2019
Scandinavian Conference on Image Analysis