EDOSE: Emotion Datasets from Open Source EEG with a Real-Time Bracelet Sensor
This is the first concrete investigation of emotion recognition, or affective computing, capability using a low-cost, open-source electroencephalography (EEG) amplifier called OpenBCI. The most important aspect of this kind of study is effective emotional elicitation. According to state-of-the-art related work, movie clips are widely used for audio-visual emotional elicitation. Here, two hundred healthy people of various ages participated in an experiment for effective clip selection. Typical self-assessment, affective tags and unsupervised learning (k-means clustering) were combined to select the 60 most effective clips from 120 candidates. A further 43 participants were then recruited to record emotional EEG with OpenBCI and peripheral physiological signals with a real-time bracelet sensor while watching the selected clips. To evaluate the performance of OpenBCI for emotion-related applications, the data were analyzed on binary classification tasks predicting whether the elicited EEG corresponds to a high or low level of valence/arousal. As in previous studies on emotional EEG datasets, power spectral densities were extracted as input features for a basic machine learning classifier, the support vector machine. The experimental results on the proposed datasets, or EDOSE, outperformed those reported for the state-of-the-art EEG datasets in affective computing, namely DEAP, MAHNOB-HCI, DECAF and DREAMER. Finally, the EDOSE dataset is available to researchers upon request for work in various fields such as affective computing, human neuroscience, neuromarketing, mental health, etc.
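To make the evaluation pipeline described above concrete, the sketch below shows a minimal power-spectral-density plus SVM baseline for binary valence/arousal classification. It is illustrative only: the sampling rate, channel count, frequency bands, trial length, SVM kernel and the synthetic stand-in data are assumptions, not the authors' exact preprocessing or hyperparameters.

```python
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250                     # assumed OpenBCI sampling rate (Hz)
BANDS = {                    # assumed frequency bands for PSD features
    "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45),
}

def band_power_features(trials):
    """Mean band power per channel for each trial.

    trials: array of shape (n_trials, n_channels, n_samples).
    Returns an array of shape (n_trials, n_channels * n_bands).
    """
    feats = []
    for trial in trials:
        freqs, psd = welch(trial, fs=FS, nperseg=FS * 2, axis=-1)
        row = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            row.append(psd[:, mask].mean(axis=-1))   # mean power per channel
        feats.append(np.concatenate(row))
    return np.array(feats)

# Synthetic stand-in data: 40 trials, 8 channels, 60 s each,
# with binary high/low valence (or arousal) labels.
rng = np.random.default_rng(0)
eeg_trials = rng.standard_normal((40, 8, FS * 60))
labels = rng.integers(0, 2, size=40)

X = band_power_features(eeg_trials)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

In practice one would replace the synthetic arrays with the recorded EDOSE trials and the participants' self-assessment labels, and tune the band definitions and SVM parameters accordingly.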