EmoNeuroDB

Mapping Human Emotions through EEG Signals


Competition Details

The study immerses participants in a carefully regulated VR setting, where they are exposed to an avatar exhibiting six primary human emotions: fear, joy, anger, sadness, disgust, and surprise. Participants are instructed to imitate the avatar's facial expressions while their brain activity is recorded using a multi-channel EEG sensor. EEG data acquisition relies on a system of dry sensors, which minimizes disruptions caused by facial movements and muscle activity. The study comprised a cohort of 40 participants, each undertaking the task independently, generating a total of 720 samples across the six emotional conditions. The objective of this challenge is to analyze the EEG signals in order to accurately identify and classify the different emotional states. Participants will be evaluated on emotion recognition accuracy, with processing time used as the determining factor in case of a tie. The competition employs the CodaLab platform for code submissions, data sharing, and communication, and is held as part of the 18th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2024).

VR application

Highly immersive systems are an effective method for inducing emotions in participants engaged in crafted experiences. Our application maximally isolates users from their natural environment, assuming complete control over visual perception and, within certain limits, auditory sensation. The natural way the human brain processes information from the environment further aids this endeavor: focusing on the user's actions activates highly effective mechanisms for filtering out "noise" information. Real sounds unrelated to the ongoing experience become muted, and the participant immerses themselves in the virtual scenario. In our application, participants encounter an avatar that prompts them to mimic its expressive facial expressions, representing six fundamental human emotions (fear, joy, anger, sadness, disgust, surprise). A clear prompt precedes each emotional exposure so that participants know which emotion is coming, giving them time to prepare and mentally empathize with the experience. A visual cue, such as a label, eliminates ambiguity about the portrayed emotion. This approach lets users step into the shoes of someone undergoing these emotions; the user's task is to mimic the avatar's facial expressions conveying the specific emotion.

Database Descriptions
EEG

For data acquisition, we used the DSI-24, a research-grade wireless dry-electrode EEG headset designed for rapid application of 21 sensors at locations corresponding to the 10-20 International System. The device offers several benefits:
  • Signal quality: it captures precise, reliable EEG data with minimal artifacts, and its dry electrodes eliminate the need for messy gels while ensuring consistent recordings.
  • Wireless operation: experiments and studies can be conducted without the constraints of tethered cables.
  • Comfort: subjects can wear it for extended periods during experiments or real-world applications, and its ergonomic design minimizes discomfort and distraction.
  • Usability: the software is user-friendly, setup is quick, and calibration is efficient.
  • VR compatibility: the DSI-24 can be worn together with a VR headset thanks to a dedicated head strap.
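
For reference, the 21 sensor sites can be encoded directly from the 10-20 layout. The montage below is only a sketch based on the standard DSI-24 configuration; confirm it against the headset documentation and the released data files:

    # Hypothetical DSI-24 montage: 21 dry sensors named per the 10-20
    # International System (an assumption; verify against the real data).
    DSI24_CHANNELS = [
        "Fp1", "Fp2", "Fz", "F3", "F4", "F7", "F8",
        "Cz", "C3", "C4", "T3", "T4", "T5", "T6",
        "Pz", "P3", "P4", "O1", "O2",
        "A1", "A2",  # earlobe sites
    ]
    assert len(DSI24_CHANNELS) == 21

    # Example: pick out the frontal sites for a region-of-interest analysis.
    frontal = [ch for ch in DSI24_CHANNELS if ch.startswith("F")]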

Experiment procedure

A total of 40 people took part in the study: 20 male and 20 female, aged 19 to 57. Each person performed the task separately. At the beginning of the experiment, the person is asked to open their eyes (15 s) and then close their eyes (15 s). After a short pause, the user is asked to show the emotion they see on the avatar's face. Before the avatar adopts a given emotional state, the screen displays which emotion the user will be dealing with. This procedure is repeated three times for each emotion. The session ends with eyes closed (15 s) and eyes open (15 s), exactly as at the beginning.
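
To make the protocol concrete, the Python sketch below enumerates one session's timeline. Only the 15 s eyes-open/closed bookends, the on-screen cue before each exposure, and the three repetitions per emotion are stated above; the emotion order and pause handling are illustrative assumptions:

    EMOTIONS = ["fear", "joy", "anger", "sadness", "disgust", "surprise"]

    def session_timeline(repetitions=3):
        """Yield (phase, detail) tuples for one recording session."""
        yield ("baseline", "eyes open, 15 s")
        yield ("baseline", "eyes closed, 15 s")
        for emotion in EMOTIONS:  # presentation order is an assumption
            for rep in range(1, repetitions + 1):
                yield ("cue", f"upcoming emotion: {emotion}")
                yield ("mimic", f"{emotion}, repetition {rep}")
        yield ("baseline", "eyes closed, 15 s")
        yield ("baseline", "eyes open, 15 s")

    for phase, detail in session_timeline():
        print(f"{phase:>8}: {detail}")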

Data

The database includes 720 raw EEG signal recordings from 21 sensors located according to the 10-20 International System. The participants express six basic emotions, each preceded by a neutral state. Each subject repeated each emotion three times, and each recording lasted about 15 seconds at a sampling rate of ~300 Hz. Data is provided in the form of two CSV files: one with the raw data and one with filtered data (high-pass: 1 Hz, low-pass: 50 Hz, filter delay 173.3 ms).
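
A minimal loading sketch, assuming pandas/SciPy and a hypothetical file name (the actual naming scheme ships with the data). The organizers' exact filter, and the source of its 173.3 ms delay, is not specified, so the zero-phase Butterworth band-pass below only approximates the provided filtered version:

    import pandas as pd
    from scipy.signal import butter, filtfilt

    FS = 300.0  # approximate sampling rate stated above

    def load_recording(path):
        """Read one recording: rows are samples, columns are the 21 channels."""
        return pd.read_csv(path)

    def bandpass(data, low=1.0, high=50.0, fs=FS, order=4):
        """Zero-phase Butterworth band-pass (HP 1 Hz, LP 50 Hz)."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, data, axis=0)

    raw = load_recording("sample_raw.csv")  # hypothetical file name
    filtered = bandpass(raw.to_numpy(dtype=float))
    print(raw.shape, filtered.shape)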

Important Dates
  • December 15th, 2023 Beginning of the quantitative competition; release of the development data.
  • February 10th, 2024 Deadline for code submission.
  • February 17th, 2024 Release of final evaluation data decryption key. Participants start predicting the results on the final evaluation data.
  • February 27th, 2024 End of the quantitative competition. Deadline for submitting the predictions over the final evaluation data. The organizers start the code verification by running it on the final evaluation data.
  • March 22nd, 2024 Deadline for submitting the fact sheets.
  • March 30th, 2024 Release of the verification results to the participants for review. Participants are invited to follow the paper submission guide for submitting contest papers.
  • April 7th, 2024 Contest paper submission deadline.
  • April 15th, 2024 Notification of paper acceptance.
  • April 22nd, 2024 Camera ready of contest papers.
  • May 27th-31st, 2024 FG 2024 - Brain Responses to Emotional Avatars Challenge results and presentations.
Organizers
  • Agnieszka Dubiel, Lodz University of Technology, Poland.
  • Dorota Kamińska, Lodz University of Technology, Poland.
  • Grzegorz Zwoliński, Lodz University of Technology, Poland.
  • Egils Avots, iCV Lab, University of Tartu, Estonia.
  • Akbar Anbar Jafari, iVCV, Tartu, Estonia; iCV Lab, University of Tartu, Estonia.
  • Cagri Ozcinar, iCV Lab, University of Tartu, Estonia.
  • Gholamreza Anbarjafari, iVCV, Tartu, Estonia; iCV Lab, University of Tartu, Estonia; PwC Finland, Helsinki, Finland.
  • Sergio Escalera, University of Barcelona, Spain.
  • Julio C. S. Jacques Junior, University of Barcelona, Spain.
Competition Rules

The aim of this competition is to analyze EEG signals and perform emotion recognition. Participants must submit their code and all dependencies via CodaLab (the link will be provided on December 1st, 2023). Evaluation will be based on the average correct emotion recognition; in case of equal performance, processing time will be used to determine the ranking. The training data will be provided first, followed by the validation dataset. The test data will be released last, without labels, and will be used for the evaluation of participants.
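
As a sketch of the stated ranking rule (average correct recognition first, processing time as the tie-breaker; the official scoring script is the organizers', and the numbers below are made up for illustration):

    def accuracy(y_true, y_pred):
        """Average correct emotion recognition over all test samples."""
        return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

    def rank_key(entry):
        """Higher accuracy first; on ties, lower processing time first."""
        return (-entry["accuracy"], entry["processing_time_s"])

    entries = [
        {"team": "A", "accuracy": 0.61, "processing_time_s": 42.0},
        {"team": "B", "accuracy": 0.61, "processing_time_s": 17.5},
    ]
    for e in sorted(entries, key=rank_key):  # team B wins the tie-break
        print(e["team"], e["accuracy"], e["processing_time_s"])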
