iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis

07/01/2021
by Xin Liu, et al.

We introduce a new dataset for emotional artificial intelligence research: the identity-free video dataset for Micro-Gesture Understanding and Emotion analysis (iMiGUE). Unlike existing public datasets, iMiGUE focuses on nonverbal body gestures and uses no identity information, whereas most emotion analysis research relies on sensitive biometric data such as the face and speech. Most importantly, iMiGUE targets micro-gestures, i.e., unintentional behaviors driven by inner feelings, which differ from the gestures in other gesture datasets, most of which are performed intentionally for illustrative purposes. Furthermore, iMiGUE is designed to evaluate whether models can analyze emotional states by integrating the recognized micro-gestures, rather than merely recognizing prototypes in the sequences in isolation, because the real need of emotion AI is to understand, in a holistic way, the emotional states behind gestures. Moreover, to counter the challenge of this dataset's imbalanced sample distribution, an unsupervised learning method is proposed to capture latent representations from the micro-gesture sequences themselves. We systematically investigate representative methods on this dataset, and comprehensive experimental results reveal several interesting insights, e.g., that micro-gesture-based analysis can promote emotion understanding. We are confident that the new iMiGUE dataset can advance studies of micro-gestures and emotion AI.
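As a rough illustration of what "capturing latent representations from the micro-gesture sequences themselves" without labels can look like, the sketch below trains a plain GRU sequence autoencoder on flattened skeleton clips with a reconstruction loss. The joint count, clip length, architecture, and all hyperparameters are assumptions chosen for illustration; this is not the unsupervised method proposed in the paper.

```python
# Illustrative sketch only: a minimal sequence autoencoder that learns a latent
# vector per micro-gesture clip from unlabeled skeleton sequences.
# Shapes (T frames, 25 joints, 2-D coordinates) are assumptions, not iMiGUE's spec.
import torch
import torch.nn as nn


class GestureSeqAutoencoder(nn.Module):
    def __init__(self, num_joints=25, coord_dim=2, hidden_dim=128, latent_dim=64):
        super().__init__()
        in_dim = num_joints * coord_dim
        self.encoder = nn.GRU(in_dim, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)
        self.from_latent = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(in_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, in_dim)

    def forward(self, x):
        # x: (batch, T, num_joints * coord_dim) flattened skeleton sequence
        _, h = self.encoder(x)                 # h: (1, batch, hidden_dim)
        z = self.to_latent(h[-1])              # one latent code per clip
        h0 = self.from_latent(z).unsqueeze(0)  # initialize decoder from latent
        # teacher forcing: feed the input shifted right by one frame
        shifted = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        dec, _ = self.decoder(shifted, h0)
        return self.out(dec), z


if __name__ == "__main__":
    model = GestureSeqAutoencoder()
    clips = torch.randn(8, 60, 25 * 2)           # 8 clips, 60 frames, 25 joints (x, y)
    recon, latent = model(clips)
    loss = nn.functional.mse_loss(recon, clips)  # reconstruction objective, no labels
    loss.backward()
    print(recon.shape, latent.shape)
```

The learned latent codes could then feed a lightweight classifier, which is one common way an unsupervised representation helps when labeled samples per micro-gesture class are scarce or imbalanced.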

Related research

01/23/2018 · Survey on Emotional Body Gesture Recognition
Automatic emotion recognition has become a trending research topic in th...

10/13/2020 · A Generalized Zero-Shot Framework for Emotion Recognition from Body Gestures
Although automatic emotion recognition from facial expressions and speec...

07/20/2023 · Joint Skeletal and Semantic Embedding Loss for Micro-gesture Classification
In this paper, we briefly introduce the solution of our team HFUT-VUT fo...

09/14/2018 · Mugeetion: Musical Interface Using Facial Gesture and Emotion
People feel emotions when listening to music. However, emotions are not ...

09/18/2020 · Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders
We present a novel generalized zero-shot algorithm to recognize perceive...

08/21/2023 · Co-Speech Gesture Detection through Multi-phase Sequence Labeling
Gestures are integral components of face-to-face communication. They unf...

12/01/2020 · Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices
We present and evaluate a novel interface for tracking ensemble performa...
