DEAP: A Database for Emotion Analysis Using Physiological Signals
Citations
A Multimodal Database for Affect Recognition and Implicit Tagging
Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks
A review of affective computing
AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild
Deep learning for electroencephalogram (EEG) classification tasks: a review
References
Measuring emotion: The self-assessment manikin and the semantic differential
Affective Computing
Related Papers (5)
Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks
Frequently Asked Questions (11)
Q2. What are the contributions mentioned in the paper "DEAP: A Database for Emotion Analysis Using Physiological Signals"?
The authors present a multimodal data set for the analysis of human affective states. An extensive analysis of the participants' ratings during the experiment is presented.
Q3. What test was used to test for significance?
To test for significance, an independent one-sample t-test was performed, comparing the F1-distribution over participants to the 0.5 baseline.
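As a rough illustration of this test, the sketch below computes the one-sample t statistic against the 0.5 baseline using only the standard library; the F1 scores are hypothetical, not values from the paper, and a real analysis would use something like `scipy.stats.ttest_1samp` to also obtain the p-value.

```python
import math

def one_sample_t(scores, baseline=0.5):
    """Return the t statistic for H0: mean(scores) == baseline."""
    n = len(scores)
    mean = sum(scores) / n
    # Sample variance with Bessel's correction (n - 1 denominator).
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)
    return (mean - baseline) / math.sqrt(var / n)

# Hypothetical per-participant F1 scores (illustrative only).
f1_scores = [0.55, 0.62, 0.58, 0.49, 0.60, 0.57]
t = one_sample_t(f1_scores)  # compared against the 0.5 chance level
```

A large positive t (relative to the t distribution with n - 1 degrees of freedom) would indicate classification above chance.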
Q4. What other features have been shown to be correlated with valence?
There are other content features such as color variance and key lighting that have been shown to be correlated with valence [30].
Q5. What are the four quadrants of the valence-arousal space?
The valence-arousal space can be subdivided into 4 quadrants, namely low arousal/low valence (LALV), low arousal/high valence (LAHV), high arousal/low valence (HALV) and high arousal/high valence (HAHV).
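A minimal sketch of this subdivision, assuming the 9-point SAM-style rating scale with 5 as the midpoint (the midpoint value is an assumption, not stated in this excerpt):

```python
def quadrant(valence, arousal, midpoint=5.0):
    """Map a (valence, arousal) rating pair to one of the four
    quadrant labels: LALV, LAHV, HALV, HAHV (arousal named first).
    The 5.0 midpoint assumes a 1-9 rating scale."""
    a = "HA" if arousal >= midpoint else "LA"
    v = "HV" if valence >= midpoint else "LV"
    return a + v

label = quadrant(valence=7, arousal=8)  # high arousal / high valence
```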
Q6. How many videos were selected via Last.fm affective tags?
Of the 40 selected videos, 17 were selected via Last.fm affective tags, indicating that useful stimuli can be selected via this method.
Q7. What are the common types of emotional information used for emotion assessment?
Physiological signals are also known to include emotional information that can be used for emotion assessment but they have received less attention.
Q8. What are the two widely available databases for emotion assessment?
To the best of their knowledge, the only publicly available multimodal emotional databases that include both physiological responses and facial expressions are the eNTERFACE 2005 emotional database and MAHNOB-HCI [4], [5].
Q9. What was the emotional highlight score of the i-th segment ei?
The emotional highlight score of the i-th segment, e_i, was computed using the following equation: e_i = √(a_i² + v_i²) (1), where the arousal a_i and valence v_i were centered.
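The computation in Eq. (1) can be sketched as follows; the per-segment ratings here are toy values for illustration, not data from the paper:

```python
import math

def highlight_scores(arousal, valence):
    """Compute e_i = sqrt(a_i^2 + v_i^2) per Eq. (1), after
    centering the arousal and valence sequences on their means."""
    ma = sum(arousal) / len(arousal)
    mv = sum(valence) / len(valence)
    return [math.sqrt((a - ma) ** 2 + (v - mv) ** 2)
            for a, v in zip(arousal, valence)]

# Toy per-segment ratings (illustrative only).
scores = highlight_scores([2, 4, 6], [3, 5, 7])
```

Centering means a segment scores highly only when its ratings deviate strongly from the average, in either direction of the valence-arousal plane.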
Q10. How did the participants rate their familiarity with the songs?
After the experiment, participants were asked to rate their familiarity with each of the songs on a scale of 1 ("Never heard it before the experiment") to 5 ("Knew the song very well").
Q11. What was the arousal and valence level of each video?
The participants rated arousal and valence levels and the EEG and physiological signals for each video were classified into low/high arousal/valence classes.