An EEG-Based Multi-Modal Emotion Database with Both Posed and Authentic Facial Actions for Emotion Analysis

by Xiaotian Li, et al.

Emotion is an experience associated with a particular pattern of physiological activity, accompanied by behavioral and cognitive changes. One behavioral change is facial expression, which has been studied extensively over the past few decades. Facial behavior varies with a person's emotion and differs across culture, personality, age, context, and environment. In recent years, physiological signals have also been used to study emotional responses. A typical signal is the electroencephalogram (EEG), which measures brain activity. Most existing EEG-based emotion analysis overlooks the role of facial expression changes. Little research exists on the relationship between facial behavior and brain signals, owing to the lack of datasets that measure EEG and facial action signals simultaneously. To address this problem, we propose to develop a new database by collecting facial expressions, action units, and EEG simultaneously. We recorded EEG and face videos of both posed facial actions and spontaneous expressions from 29 participants of varying age, gender, and ethnic background. Differing from existing approaches, we designed a protocol that captures EEG signals by explicitly evoking participants' individual action units. We also investigated the relationship between EEG signals and facial action units. As a baseline, the database has been evaluated through experiments on both posed and spontaneous emotion recognition using images alone, EEG alone, and EEG fused with images. The database will be released to the research community to advance the state of the art in automatic emotion recognition.
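The abstract's baseline evaluates images alone, EEG alone, and the two fused. The paper does not specify the fusion scheme here, so the sketch below shows one common choice, decision-level (late) fusion: each modality's classifier produces logits, which are converted to probabilities and combined with a weighted average. All function names and the weighting scheme are illustrative assumptions, not the authors' method.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def late_fusion(image_logits, eeg_logits, w_image=0.5):
    """Weighted average of per-modality class probabilities.

    image_logits, eeg_logits: arrays of shape (n_samples, n_classes)
    produced by separate image and EEG classifiers (hypothetical here).
    """
    p_img = softmax(image_logits)
    p_eeg = softmax(eeg_logits)
    return w_image * p_img + (1.0 - w_image) * p_eeg

# Toy example: 2 samples, 3 emotion classes.
img = np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
eeg = np.array([[1.0, 0.8, 0.2], [0.1, 0.4, 2.0]])
fused = late_fusion(img, eeg)
pred = fused.argmax(axis=-1)  # predicted class per sample
```

The weight `w_image` would typically be tuned on a validation split; feature-level (early) fusion, where EEG and image embeddings are concatenated before a joint classifier, is the other standard alternative.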



