Attentive Cross-modal Connections for Deep Multimodal Wearable-based Emotion Recognition

08/04/2021
by   Anubhav Bhatti, et al.

Classification of human emotions can play an essential role in the design and improvement of human-machine systems. While individual biological signals such as the electrocardiogram (ECG) and electrodermal activity (EDA) have been widely used for emotion recognition with machine learning methods, multimodal approaches generally fuse only extracted features or final classification/regression results to boost performance. To enhance multimodal learning, we present a novel attentive cross-modal connection for sharing information between the convolutional neural networks responsible for learning the individual modalities. Specifically, these connections improve emotion classification by sharing intermediate representations between the EDA and ECG branches and applying attention weights to the shared information, thus learning more effective multimodal embeddings. We perform experiments on the WESAD dataset to identify the best configuration of the proposed method for emotion classification. Our experiments show that the proposed approach is capable of learning strong multimodal representations and outperforms a number of baseline methods.
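The core idea — exchanging attention-weighted intermediate representations between two modality-specific network branches — can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the gating form (a sigmoid attention over the other branch's features), the residual-style combination, and all names (`attentive_cross_connection`, `w_a`, `w_b`) are assumptions introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def attentive_cross_connection(feat_a, feat_b, w_a, w_b):
    """Exchange attention-weighted intermediate features between two
    modality branches (illustrative sketch; the exact gating and
    placement used in the paper are assumptions here)."""
    # Sigmoid attention weights computed from the *other* branch's features.
    att_a = 1.0 / (1.0 + np.exp(-(feat_b @ w_a)))  # gate for what branch A receives
    att_b = 1.0 / (1.0 + np.exp(-(feat_a @ w_b)))  # gate for what branch B receives
    # Each branch keeps its own features and adds the other's, scaled by attention.
    new_a = feat_a + att_a * feat_b
    new_b = feat_b + att_b * feat_a
    return new_a, new_b

# Toy intermediate representations for the two branches: (batch, channels).
ecg = rng.standard_normal((4, 16))
eda = rng.standard_normal((4, 16))
# Learnable projections producing the attention gates (randomly initialised here).
w_a = rng.standard_normal((16, 16)) * 0.1
w_b = rng.standard_normal((16, 16)) * 0.1

fused_ecg, fused_eda = attentive_cross_connection(ecg, eda, w_a, w_b)
print(fused_ecg.shape, fused_eda.shape)  # (4, 16) (4, 16)
```

In a full model such a connection would sit between convolutional blocks of the ECG and EDA branches, with the gate projections trained jointly with the rest of the network.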

Related research

06/09/2022 - AttX: Attentive Cross-Connections for Fusion of Wearable Signals in Emotion Recognition
08/24/2020 - Unsupervised Multi-Modal Representation Learning for Affective Computing with Multi-Corpus Wearable Data
11/03/2021 - A Cross-modal Fusion Network Based on Self-attention and Residual Structure for Multimodal Emotion Recognition
07/06/2022 - GraphCFC: A Directed Graph based Cross-modal Feature Complementation Approach for Multimodal Conversational Emotion Recognition
02/26/2016 - Multimodal Emotion Recognition Using Multimodal Deep Learning
01/08/2023 - Seamless Multimodal Biometrics for Continuous Personalised Wellbeing Monitoring
08/27/2023 - A Unified Transformer-based Network for Multimodal Emotion Recognition
