EmoNeuroDB

Mapping Human Emotions through EEG Signals


Competition Details

The study involves participants being fully immersed in a carefully regulated VR setting, where they are exposed to an avatar that exhibits six primary human emotions: fear, joy, anger, sadness, disgust, and surprise. Participants are instructed to imitate the avatar's facial expressions while their brain activity is recorded with a multi-channel EEG sensor. EEG data acquisition relies on a system of dry sensors, which minimises potential disruptions caused by facial movements and muscle activity. The research encompassed a cohort of 40 participants, each undertaking the task independently, generating 720 samples in total across the emotional conditions. The objective of this challenge is to analyse the EEG signals in order to accurately identify and classify the different emotional states. Submissions will be evaluated on emotion recognition accuracy, with processing time as a tie-breaker. The competition employs the CodaLab platform for code submissions, data sharing, and communication, and is held as part of the 18th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2024).

VR application

Highly immersive systems are an effective method for inducing emotions in participants engaged in crafted experiences. Our application maximally isolates users from their natural environment, assuming complete control over visual perception and, within certain limits, auditory sensations. The natural way the human brain processes information from the environment further aids this endeavour: focusing on the user's actions activates highly effective mechanisms for filtering out "noise" information. Real sounds unrelated to the ongoing experience become muted, and the participant immerses themselves in the virtual scenario. In our application, participants encounter an avatar that prompts them to mimic its expressive facial expressions, representing the six fundamental human emotions (fear, joy, anger, sadness, disgust, surprise). A clear prompt precedes each emotional exposure so that participants know which emotion is coming, giving them time to prepare and to empathize mentally with the experience. A visual cue, such as a label, eliminates ambiguity about the portrayed emotion. This approach allows users to step into the shoes of someone undergoing these emotions; their task is to mimic the avatar's facial expressions, conveying the specific emotion.

Database Descriptions
EEG

For data acquisition, we used the DSI-24, a research-grade wireless dry-electrode EEG headset designed for rapid application of 21 sensors at locations corresponding to the 10-20 International System. The device offers several benefits:

  • Signal quality: it captures precise and reliable EEG data with minimal artifacts, and its dry electrodes eliminate the need for messy gels while ensuring consistent recordings.
  • Wireless operation: experiments and studies can be conducted without the constraints of tethered cables.
  • Comfort: subjects can wear it for extended periods during experiments or real-world applications; its ergonomic design minimizes discomfort and distractions.
  • Ease of use: user-friendly software, a quick setup process, and efficient calibration.
  • VR compatibility: the DSI-24 can be worn together with a VR headset thanks to a dedicated head strap.
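Since analysis code typically needs the montage pinned down, the list below gives the standard 10-20 labels for a 21-sensor DSI-24 cap. Treat it as an assumption to verify against the released data: the exact channel set and any match with the CSV column names are not specified above.

```python
# Standard 10-20 labels for a 21-sensor DSI-24 cap. A1/A2 are the earlobe
# sites and Pz is the reference in this dataset. This montage (and any match
# with the CSV column names) is an assumption to verify against the data.
DSI24_CHANNELS = [
    "Fp1", "Fp2", "Fz", "F3", "F4", "F7", "F8",
    "Cz", "C3", "C4", "T3", "T4", "T5", "T6",
    "Pz", "P3", "P4", "O1", "O2", "A1", "A2",
]

assert len(DSI24_CHANNELS) == 21
```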

Experiment procedure

A total of 40 people took part in the study: 24 males and 16 females, aged 19 to 57. Each person performed the task separately. At the beginning of the experiment, the participant is asked to keep their eyes open (30 s) and then closed (30 s). After a short pause for preparation (5 s), the user is asked to show the emotion they see on the avatar's face (5 s). After showing each emotion, the user has time to relax while the data is saved on the server (5 s). Before the avatar starts showing a particular emotion, the name of that emotion is displayed on the screen. This procedure is repeated three times for each emotion. The session ends with eyes closed (30 s) and open (30 s), exactly as at the beginning.
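For orientation, the timeline above can be written down as a simple schedule. This is a minimal sketch: the presentation order of emotions, and whether the three repetitions are blocked per emotion or interleaved across emotions, are assumptions, as the procedure text does not specify them.

```python
# Minimal sketch of the session timeline (durations in seconds). The emotion
# order and the interleaving of the three repetitions are assumptions.
EMOTIONS = ["fear", "joy", "anger", "sadness", "disgust", "surprise"]
REPETITIONS = 3

def build_schedule() -> list[tuple[str, int]]:
    events = [("eyes_open", 30), ("eyes_closed", 30)]
    for _ in range(REPETITIONS):
        for emotion in EMOTIONS:
            events.append((f"prepare_{emotion}", 5))  # emotion name shown on screen
            events.append((f"express_{emotion}", 5))  # participant mimics the avatar
            events.append((f"rest_{emotion}", 5))     # relax; data saved to server
    events += [("eyes_closed", 30), ("eyes_open", 30)]
    return events

schedule = build_schedule()
print(f"{len(schedule)} events, ~{sum(d for _, d in schedule) / 60:.1f} minutes per session")
```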

Data

The database includes 720 raw EEG signal recordings from 21 sensors located according to the 10-20 International System. The participants express six basic emotions, each preceded by the neutral state. Each subject repeated each emotion three times, and each recording lasts about 15 seconds at a sampling frequency of ~300 Hz. Data is provided in the form of two CSV files (filters HP: 1 Hz and LP: 50 Hz, filter delay: 40 ms, mains frequency: 50 Hz, sampling frequency: 300 Hz, sensor data units: μV, reference location: Pz).
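As a starting point, each recording can be loaded into a channels × samples array. The file name and column layout below (one CSV column per sensor) are assumptions; verify them against the released files.

```python
import numpy as np
import pandas as pd

FS = 300          # sampling frequency (Hz), per the description above
N_CHANNELS = 21   # 10-20 montage, Pz reference, values in microvolts

def load_recording(path: str) -> np.ndarray:
    """Load one EEG recording; assumes one CSV column per sensor."""
    df = pd.read_csv(path)                 # hypothetical file layout
    data = df.to_numpy(dtype=np.float64)   # shape: (n_samples, n_channels)
    assert data.shape[1] == N_CHANNELS, f"unexpected shape {data.shape}"
    return data.T                          # shape: (n_channels, n_samples)

eeg = load_recording("sample_001.csv")     # hypothetical file name
print(eeg.shape, f"~{eeg.shape[1] / FS:.1f} s of signal")
```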

Important Dates
  • December 20th, 2023 Beginning of the development phase. Release of development data with labels. Evaluated on validation data.
  • February 20th, 2024 End of development phase and start of the test (final) phase. Release of test data (with no labels) and validation labels. Evaluated on test data.
  • February 27th, 2024 End of the test phase.
  • March 2nd, 2024 Deadline for sharing the code and fact sheets.
  • March 3rd, 2024 Start of the code verification phase.
  • March 13th, 2024 Release of the verification results.
  • April 7th, 2024 Contest paper submission deadline.
  • April 15th, 2024 Notification of paper acceptance.
  • April 22nd, 2024 Camera ready of contest papers.
  • May 27th-31st, 2024 FG 2024 - Brain Responses to Emotional Avatars Challenge results and presentations.
Organizers
  • Agnieszka Dubiel, Lodz University of Technology, Poland.
  • Dorota Kamińska, Lodz University of Technology, Poland.
  • Grzegorz Zwoliński, Lodz University of Technology, Poland.
  • Egils Avots, iCV Lab, University of Tartu, Estonia.
  • Akbar Anbar Jafari, iVCV, Tartu, Estonia; iCV Lab, University of Tartu, Estonia.
  • Cagri Ozcinar, iCV Lab, University of Tartu, Estonia.
  • Gholamreza Anbarjafari, iVCV, Tartu, Estonia; iCV Lab, University of Tartu, Estonia; PwC Finland, Helsinki, Finland.
  • Sergio Escalera, University of Barcelona, Spain.
  • Julio C. S. Jacques Junior, University of Barcelona, Spain.
Competition Rules

The aim of this competition is to analyze EEG signals and perform emotion recognition. Participants must submit their code and all dependencies via CodaLab (the link will be provided on December 20th, 2023). The evaluation will be based on average correct emotion recognition; in case of equal performance, processing time will be used to determine the ranking. The training data will be provided first, followed by the validation dataset. Finally, the test data will be released without labels and used for the evaluation of participants.
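The following is a minimal sketch of one reading of the stated ranking rule: accuracy first, processing time as tie-breaker. The entry names and numbers are illustrative only, not challenge data, and the sketch assigns strictly increasing places (the published leaderboard keeps tied accuracies at the same place).

```python
from dataclasses import dataclass

@dataclass
class Entry:
    nickname: str
    accuracy: float   # average correct emotion recognition (higher is better)
    seconds: float    # processing time, used only to break ties (lower is better)

def rank(entries: list[Entry]) -> list[Entry]:
    # Sort by accuracy descending, then by processing time ascending.
    return sorted(entries, key=lambda e: (-e.accuracy, e.seconds))

# Illustrative entries only -- not challenge data.
for place, e in enumerate(rank([
    Entry("alice", 0.25, 120.0),
    Entry("bob", 0.25, 60.0),    # ties with alice on accuracy, wins on time
    Entry("carol", 0.30, 300.0),
]), start=1):
    print(place, e.nickname, e.accuracy, e.seconds)
```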

Final Evaluation and Ranking

Important dates regarding code submission and fact sheets are defined in the schedule.

Code verification: After the end of the test phase, participants are required to share with the organizers the source code used to generate the submitted results, with detailed and complete instructions (and requirements) so that the results can be reproduced locally (preferably using Docker). Note that only solutions that pass the code verification stage are eligible to be announced in the final list of winning solutions. Participants must share both training and prediction code together with pre-trained models, by sending the organizers a link to a code repository containing the required instructions. This information must be detailed in the fact sheets (described next).

Ideally, the instructions to reproduce the code should contain:

  1. how to structure the data (at the train and test stages),
  2. how to run any preprocessing script, if needed,
  3. how to extract or load the input features, if needed,
  4. how to run the Docker image used to run the code and how to install any required libraries, if possible/needed,
  5. how to run the training script,
  6. how to run the prediction script, which generates output in the challenge format (a minimal sketch follows this list).
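As an illustration of step 6, here is a skeletal prediction script. The command-line interface, the toy feature extraction, the output format, and the assumption of a pickled scikit-learn-style classifier are all hypothetical; adapt them to the challenge's actual submission format.

```python
import argparse
import pathlib
import pickle

import numpy as np
import pandas as pd

def extract_features(csv_path: pathlib.Path) -> np.ndarray:
    """Toy features: per-channel mean and standard deviation."""
    data = pd.read_csv(csv_path).to_numpy(dtype=np.float64)
    return np.concatenate([data.mean(axis=0), data.std(axis=0)])

def main() -> None:
    parser = argparse.ArgumentParser(description="Generate challenge predictions")
    parser.add_argument("--model", required=True, help="pickled pre-trained classifier")
    parser.add_argument("--test-dir", required=True, help="directory of test CSV recordings")
    parser.add_argument("--output", required=True, help="output file of <recording,label> rows")
    args = parser.parse_args()

    with open(args.model, "rb") as f:
        model = pickle.load(f)  # any object with a scikit-learn-style .predict()

    with open(args.output, "w") as out:
        for path in sorted(pathlib.Path(args.test_dir).glob("*.csv")):
            label = model.predict([extract_features(path)])[0]
            out.write(f"{path.stem},{label}\n")

if __name__ == "__main__":
    main()
```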

Fact sheets: In addition to the source code, participants are required to share with the organizers a detailed scientific and technical description of the proposed approach, using the fact-sheet template provided by the organizers. The LaTeX template of the fact sheets can be downloaded here.

Sharing the requested information with the organizers: Send the compressed project of your fact sheet (in .zip format), i.e., the generated PDF, .tex, .bib, and any additional files, to agnieszka.dubiel@dokt.p.lodz.pl and egils.avots2@gmail.com, with the email subject "FG 2024 EmoNeuroDB Challenge / Fact Sheets and Code repository".

IMPORTANT NOTE: we encourage participants to provide detailed and complete instructions so that the organizers can easily reproduce the results. If we encounter any problem during code verification, we may need to contact the authors, which can take time and delay the release of the list of winners.

Ranking

Below is the final ranking, created after careful analysis of the code received from participants. As intended, average accuracy was considered first and computation time second (when average accuracy was the same for several participants).

Place  Nickname     Avg. Accuracy  Computation time
1      lownish      0.2444         111.50 minutes
2      syntax       0.2278         58.85 seconds
2      defreire     0.2278         269.56 seconds
3      hidenn       0.2167         233.13 seconds
4      SCaLAR NITK  0.2056         20.76 seconds
Paper submission

All participants who sent code to the organizers by the deadline were invited to submit articles summarizing their work. According to the schedule, the organizers expect participants' articles by the end of the day on April 7th. Articles should be submitted as PDF files and should be at most 6 pages long including references (we recommend preparing 4 pages including references). Please send the prepared material to agnieszka.dubiel@dokt.p.lodz.pl with the email subject "FG 2024 EmoNeuroDB Challenge / Report". The organizers will respond to participants with feedback regarding acceptance or rejection of the article by April 15th (EOD).
