Y. Baveye, E. Dellandrea, C. Chamaret, and L. Chen, LIRIS-ACCEDE: A Video Database for Affective Content Analysis, IEEE Transactions on Affective Computing, vol.6, issue.1, pp.43-55, 2015.
DOI : 10.1109/TAFFC.2015.2396531

URL : https://hal.archives-ouvertes.fr/hal-01375518

R. Calvo and S. D'Mello, Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications, IEEE Transactions on Affective Computing, vol.1, issue.1, pp.18-37, 2010.
DOI : 10.1109/T-AFFC.2010.1

L. Canini, S. Gilroy, M. Cavazza, R. Leonardi, and S. Benini, Users' response to affective film content: A narrative perspective, 2010 International Workshop on Content Based Multimedia Indexing (CBMI), pp.1-6, 2010.
DOI : 10.1109/CBMI.2010.5529892

G. Chanel, K. Ansari-Asl, and T. Pun, Valence-arousal evaluation using physiological signals in an emotion recall paradigm, 2007 IEEE International Conference on Systems, Man and Cybernetics, pp.2662-2667, 2007.
DOI : 10.1109/ICSMC.2007.4413638

G. Chanel, J. Kronegg, D. Grandjean, and T. Pun, Emotion assessment: Arousal evaluation using EEGs and peripheral physiological signals, Multimedia Content Representation, Classification and Security, pp.530-537, 2006.

E. Coutinho and A. Cangelosi, Musical emotions: Predicting second-by-second subjective feelings of emotion from low-level psychoacoustic features and physiological measurements, Emotion, vol.11, issue.4, p.921, 2011.
DOI : 10.1037/a0024700

E. Coutinho and N. Dibben, Emotions perceived in music and speech: relationships between psychoacoustic features, second-by-second subjective feelings of emotion and physiological responses, 3rd International Conference on Music & Emotion, 2013.

R. Cowie, M. Sawey, C. Doherty, J. Jaimovich, C. Fyans et al., GTrace: General Trace Program Compatible with EmotionML, 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, 2013.
DOI : 10.1109/ACII.2013.126

J. Fleureau, P. Guillotel, and Q. Huynh-Thu, Physiological-Based Affect Event Detector for Entertainment Video Applications, IEEE Transactions on Affective Computing, vol.3, issue.3, pp.379-385, 2012.
DOI : 10.1109/T-AFFC.2012.2

J. Fleureau, P. Guillotel, and I. Orlac, Affective Benchmarking of Movies Based on the Physiological Responses of a Real Audience, 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, pp.73-78, 2013.
DOI : 10.1109/ACII.2013.19

M. Grimm and K. Kroschel, Evaluation of natural emotions using self assessment manikins, IEEE Workshop on Automatic Speech Recognition and Understanding, pp.381-385, 2005.
DOI : 10.1109/ASRU.2005.1566530

A. Haag, S. Goronzy, P. Schaich, and J. Williams, Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System, ADS, pp.36-48, 2004.
DOI : 10.1007/978-3-540-24842-2_4

J. Kim and E. André, Emotion recognition based on physiological changes in music listening, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.30, issue.12, pp.2067-2083, 2008.

S. Koelstra, C. Muhl, M. Soleymani, J. Lee, A. Yazdani et al., DEAP: A Database for Emotion Analysis; Using Physiological Signals, IEEE Transactions on Affective Computing, vol.3, issue.1, pp.18-31, 2012.
DOI : 10.1109/T-AFFC.2011.15

P. J. Lang, M. K. Greenwald, M. M. Bradley, and A. O. Hamm, Looking at pictures: Affective, facial, visceral, and behavioral reactions, Psychophysiology, vol.30, issue.3, pp.261-273, 1993.

R. W. Picard, E. Vyzas, and J. Healey, Toward machine emotional intelligence: analysis of affective physiological state, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.23, issue.10, pp.1175-1191, 2001.
DOI : 10.1109/34.954607

M. Soleymani, G. Chanel, J. Kierkels, and T. Pun, Affective Characterization of Movie Scenes Based on Multimedia Content Analysis and User's Physiological Emotional Responses, 2008 Tenth IEEE International Symposium on Multimedia, pp.228-235, 2008.
DOI : 10.1109/ISM.2008.14

F. Zhou, X. Qu, J. R. Jiao, and M. G. Helander, Emotion Prediction from Physiological Signals: A Comparison Study Between Visual and Auditory Elicitors, Interacting with Computers, vol.26, issue.3, pp.285-302, 2014.
DOI : 10.1093/iwc/iwt039