Multilabel Automated Recognition of Emotions Induced Through Music

05/29/2019
by Fabio Paolizzo, et al.

Advancing the automatic recognition of emotions that music can induce requires accounting for the multiplicity and simultaneity of emotions. The core of our work is a comparison of different machine learning algorithms performing multilabel and multiclass classification. The study analyzes the implementation of the Geneva Emotional Music Scale 9 in the Emotify music dataset and the resulting data distribution. The research goal is to identify the best methods towards defining the audio component of a new multimodal dataset for music emotion recognition.
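
To make the multilabel setup concrete, the following is a minimal sketch in Python, not the authors' implementation: it compares two scikit-learn classifiers on a task shaped like GEMS-9 annotation, where each excerpt can carry several of nine emotion labels. Synthetic data stands in for audio features extracted from the Emotify excerpts, and the classifier choices and evaluation metrics are illustrative assumptions.

import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import hamming_loss, f1_score

# Placeholder for per-excerpt audio features (X) and 9 binary GEMS labels (Y).
# In practice X would come from feature extraction over the Emotify audio.
X, Y = make_multilabel_classification(n_samples=400, n_features=40,
                                      n_classes=9, n_labels=3, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# Hypothetical candidate algorithms; any multilabel-capable model could be swapped in.
candidates = {
    "logistic_regression": MultiOutputClassifier(LogisticRegression(max_iter=1000)),
    "random_forest": MultiOutputClassifier(
        RandomForestClassifier(n_estimators=200, random_state=0)),
}

for name, model in candidates.items():
    model.fit(X_tr, Y_tr)
    Y_pred = model.predict(X_te)
    # Multilabel metrics: Hamming loss counts per-label errors, micro-F1 pools all labels.
    print(f"{name}: hamming loss = {hamming_loss(Y_te, Y_pred):.3f}, "
          f"micro-F1 = {f1_score(Y_te, Y_pred, average='micro'):.3f}")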

Related research

12/09/2021
Personalized musically induced emotions of not-so-popular Colombian music
This work presents an initial proof of concept of how Music Emotion Reco...

02/01/2021
Neural Network architectures to classify emotions in Indian Classical Music
Music is often considered as the language of emotions. It has long been ...

07/27/2023
Emotion4MIDI: a Lyrics-based Emotion-Labeled Symbolic Music Dataset
We present a new large-scale emotion-labeled symbolic music dataset cons...

09/12/2019
The emotions that we perceive in music: the influence of language and lyrics comprehension on agreement
In the present study, we address the relationship between the emotions p...

11/30/2017
Enabling Embodied Analogies in Intelligent Music Systems
The present methodology is aimed at cross-modal machine learning and use...

07/19/2023
Mood Classification of Bangla Songs Based on Lyrics
Music can evoke various emotions, and with the advancement of technology...

06/04/2021
Musical Prosody-Driven Emotion Classification: Interpreting Vocalists Portrayal of Emotions Through Machine Learning
The task of classifying emotions within a musical track has received wid...
