A. Hanjalic, Extracting moods from pictures and sounds: towards truly personalized TV, IEEE Signal Processing Magazine, vol.23, issue.2, 2006.
DOI : 10.1109/MSP.2006.1621452

S. Arifin and P. Y. Cheung, Affective Level Video Segmentation by Utilizing the Pleasure-Arousal-Dominance Information, IEEE Transactions on Multimedia, vol.10, issue.7, pp.1325-1341, 2008.
DOI : 10.1109/TMM.2008.2004911

S. Zhao, H. Yao, X. Sun, P. Xu, X. Liu et al., Video indexing and recommendation based on affective analysis of viewers, Proceedings of the 19th ACM international conference on Multimedia, MM '11, pp.1473-1476, 2011.
DOI : 10.1145/2072298.2072043

M. Horvat, S. Popovic, and K. Cosic, Multimedia stimuli databases usage patterns: a survey report, Proceedings of the 36th International ICT Convention MIPRO, pp.993-997, 2013.

M. Soleymani, M. Larson, T. Pun, and A. Hanjalic, Corpus Development for Affective Video Indexing, IEEE Transactions on Multimedia, vol.16, issue.4, pp.1075-1089, 2014.
DOI : 10.1109/TMM.2014.2305573

URL : http://arxiv.org/abs/1211.5492

Y. Baveye, J. Bettinelli, E. Dellandréa, L. Chen, and C. Chamaret, A large video data base for computational models of induced emotion, Affective Computing and Intelligent Interaction (ACII), pp.13-18, 2013.

J. A. Russell, A circumplex model of affect., Journal of Personality and Social Psychology, vol.39, issue.6, pp.1161-1178, 1980.
DOI : 10.1037/h0077714

P. Philippot, Inducing and assessing differentiated emotion-feeling states in the laboratory, Cognition and Emotion, vol.1, issue.2, pp.171-193, 1993.

J. J. Gross and R. W. Levenson, Emotion elicitation using films, Cognition & Emotion, vol.1, issue.1, pp.87-108, 1995.

E. Douglas-Cowie, R. Cowie, I. Sneddon, C. Cox, O. Lowry et al., The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data, Affective Computing and Intelligent Interaction, pp.488-500, 2007.
DOI : 10.1007/978-3-540-74889-2_43

A. Schaefer, F. Nils, X. Sanchez, and P. Philippot, Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers, Cognition & Emotion, vol.15, issue.7, pp.1153-1172, 2010.

J. Rottenberg, R. D. Ray, and J. J. Gross, Emotion elicitation using films, Handbook of emotion elicitation and assessment, p.9, 2007.

S. Koelstra, C. Muhl, M. Soleymani, J. Lee, A. Yazdani et al., DEAP: A Database for Emotion Analysis Using Physiological Signals, IEEE Transactions on Affective Computing, vol.3, issue.1, pp.18-31, 2012.
DOI : 10.1109/T-AFFC.2011.15

M. Soleymani, J. Lichtenauer, T. Pun, and M. Pantic, A Multimodal Database for Affect Recognition and Implicit Tagging, IEEE Transactions on Affective Computing, vol.3, issue.1, pp.42-55, 2012.
DOI : 10.1109/T-AFFC.2011.25

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.371.1612

S. Carvalho, J. Leite, S. Galdo-Álvarez, and O. Gonçalves, The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study, Applied Psychophysiology and Biofeedback, vol.20, issue.6, pp.279-294, 2012.
DOI : 10.1007/s10484-012-9201-6

C. Demarty, C. Penet, G. Gravier, and M. Soleymani, A Benchmarking Campaign for the Multimodal Detection of Violent Scenes in Movies, Proceedings of the 12th European Conference on Computer Vision Workshops and Demonstrations, ser. ECCV'12, pp.416-425, 2012.
DOI : 10.1007/978-3-642-33885-4_42

URL : https://hal.archives-ouvertes.fr/hal-00767036

B. Jou, S. Bhattacharya, and S. Chang, Predicting Viewer Perceived Emotions in Animated GIFs, Proceedings of the ACM International Conference on Multimedia, MM '14, pp.213-216, 2014.
DOI : 10.1145/2647868.2656408

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.572.4539

A. Hanjalic and L. Xu, Affective video content representation and modeling, IEEE Transactions on Multimedia, vol.7, issue.1, pp.143-154, 2005.
DOI : 10.1109/TMM.2004.840618

N. Malandrakis, A. Potamianos, G. Evangelopoulos, and A. Zlatintsi, A supervised approach to movie emotion tracking, 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp.2376-2379, 2011.
DOI : 10.1109/ICASSP.2011.5946961

H. Kang, Affective content detection using HMMs, Proceedings of the eleventh ACM international conference on Multimedia, MULTIMEDIA '03, pp.259-262, 2003.
DOI : 10.1145/957013.957066

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.335.1830

H. L. Wang and L. Cheong, Affective understanding in film, IEEE Transactions on Circuits and Systems for Video Technology, pp.689-704, 2006.

K. Sun and J. Yu, Video Affective Content Representation and Recognition Using Video Affective Tree and Hidden Markov Models, Affective Computing and Intelligent Interaction, pp.594-605, 2007.
DOI : 10.1007/978-3-540-74889-2_52

M. Xu, J. S. Jin, S. Luo, and L. Duan, Hierarchical movie affective content analysis based on arousal and valence features, Proceedings of the 16th ACM international conference on Multimedia, MM '08, pp.677-680, 2008.
DOI : 10.1145/1459359.1459457

M. Soleymani, G. Chanel, J. Kierkels, and T. Pun, Affective Characterization of Movie Scenes Based on Multimedia Content Analysis and User's Physiological Emotional Responses, 2008 Tenth IEEE International Symposium on Multimedia, pp.228-235, 2008.
DOI : 10.1109/ISM.2008.14

M. Soleymani, J. Kierkels, G. Chanel, and T. Pun, A Bayesian framework for video affective representation, 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, pp.1-7, 2009.
DOI : 10.1109/ACII.2009.5349563

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.155.3074

G. Irie, T. Satou, A. Kojima, T. Yamasaki, and K. Aizawa, Affective Audio-Visual Words and Latent Topic Driving Model for Realizing Movie Affective Scene Classification, IEEE Transactions on Multimedia, vol.12, issue.6, pp.523-535, 2010.
DOI : 10.1109/TMM.2010.2051871

S. Zhang, Q. Huang, S. Jiang, W. Gao, and Q. Tian, Affective Visualization and Retrieval for Music Video, IEEE Transactions on Multimedia, vol.12, issue.6, pp.510-522, 2010.
DOI : 10.1109/TMM.2010.2059634

A. Yazdani, E. Skodras, N. Fakotakis, and T. Ebrahimi, Multimedia content analysis for emotional characterization of music video clips, EURASIP Journal on Image and Video Processing, vol.20, issue.3, p.26, 2013.

K. Sun, J. Yu, Y. Huang, and X. Hu, An improved valence-arousal emotion space for video affective content representation and recognition, IEEE International Conference on Multimedia and Expo, pp.566-569, 2009.

R. W. Lienhart, Comparison of automatic shot boundary detection algorithms, Electronic Imaging, pp.290-301, 1998.

H. Leventhal and K. Scherer, The Relationship of Emotion to Cognition: A Functional Approach to a Semantic Controversy, Cognition & Emotion, vol.54, issue.1, pp.3-28, 1987.

A. Metallinou and S. S. Narayanan, Annotation and processing of continuous emotional attributes: Challenges and opportunities, 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), pp.1-8, 2013.
DOI : 10.1109/FG.2013.6553804

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.672.4412

C. Colombo, A. Del Bimbo, and P. Pala, Semantics in visual information retrieval, IEEE Multimedia, vol.6, issue.3, pp.38-53, 1999.
DOI : 10.1109/93.790610

J. Xiao, J. Hays, K. Ehinger, A. Oliva, and A. Torralba, SUN database: Large-scale scene recognition from abbey to zoo, 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp.3485-3492, 2010.
DOI : 10.1109/CVPR.2010.5539970

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.469.2228

M. Soleymani and M. Larson, Crowdsourcing for affective annotation of video: Development of a viewer-reported boredom corpus, Workshop on Crowdsourcing for Search Evaluation, SIGIR 2010, 2010.

S. M. Mohammad and P. D. Turney, Crowdsourcing a Word-Emotion Association Lexicon, Computational Intelligence, vol.39, issue.2, pp.436-465, 2013.
DOI : 10.1111/j.1467-8640.2012.00460.x

Y. Yang and H. Chen, Ranking-Based Emotion Recognition for Music Organization and Retrieval, IEEE Transactions on Audio, Speech, and Language Processing, vol.19, issue.4, pp.762-774, 2011.
DOI : 10.1109/TASL.2010.2064164

P. A. Russell and C. D. Gray, Ranking or rating? Some data and their implications for the measurement of evaluative response, British Journal of Psychology, vol.85, issue.1, pp.79-92, 1994.
DOI : 10.1111/j.2044-8295.1994.tb02509.x

S. Ovadia, Ratings and rankings: reconsidering the structure of values and their measurement, International Journal of Social Research Methodology, vol.23, issue.5, pp.403-414, 2004.

G. N. Yannakakis and J. Hallam, Ranking vs. Preference: A Comparative Study of Self-reporting, Affective Computing and Intelligent Interaction, pp.437-446, 2011.
DOI : 10.1007/s11257-010-9078-0

R. Mantiuk, A. M. Tomaszewska, and R. Mantiuk, Comparison of Four Subjective Methods for Image Quality Assessment, Computer Graphics Forum, vol.2011, issue.3, pp.2478-2491, 2012.
DOI : 10.1111/j.1467-8659.2012.03188.x

J. D. Smyth, D. A. Dillman, L. M. Christian, and M. J. Stern, Comparing Check-All and Forced-Choice Question Formats in Web Surveys, Public Opinion Quarterly, vol.70, issue.1, pp.66-77, 2006.
DOI : 10.1093/poq/nfj007

S. S. Skiena, The algorithm design manual, 2008.
DOI : 10.1007/978-1-84800-070-4

M. M. Bradley and P. J. Lang, Measuring emotion: The self-assessment manikin and the semantic differential, Journal of Behavior Therapy and Experimental Psychiatry, vol.25, issue.1, pp.49-59, 1994.
DOI : 10.1016/0005-7916(94)90063-9

URL : http://ufdc.ufl.edu/LS00000045/00009

R. B. Dietz and A. Lang, Affective agents: Effects of agent affect on arousal, attention, liking and learning, Cognitive Technology Conference, 1999.

J. L. Fleiss, Measuring nominal scale agreement among many raters., Psychological Bulletin, vol.76, issue.5, pp.378-382, 1971.
DOI : 10.1037/h0031619

K. Krippendorff, Estimating the Reliability, Systematic Error and Random Error of Interval Data, Educational and Psychological Measurement, vol.54, issue.1, pp.61-70, 1970.
DOI : 10.1177/001316447003000105

J. J. Randolph, Free-marginal multirater kappa (multirater κfree): An alternative to Fleiss' fixed-marginal multirater kappa, Paper presented at the Joensuu University Learning and Instruction Symposium, 2005.

J. R. Landis and G. G. Koch, The Measurement of Observer Agreement for Categorical Data, Biometrics, vol.33, issue.1, pp.159-174, 1977.
DOI : 10.2307/2529310

S. Zhang, Q. Tian, Q. Huang, W. Gao, and S. Li, Utilizing affective analysis for efficient movie browsing, 2009 16th IEEE International Conference on Image Processing (ICIP), pp.1853-1856, 2009.
DOI : 10.1109/ICIP.2009.5413590

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.613.8695

L. Canini, S. Benini, and R. Leonardi, Affective Recommendation of Movies Based on Selected Connotative Features, IEEE Transactions on Circuits and Systems for Video Technology, pp.636-647, 2013.
DOI : 10.1109/TCSVT.2012.2211935

S. Mallat and S. Zhong, Characterization of signals from multiscale edges, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.14, issue.7, pp.710-732, 1992.
DOI : 10.1109/34.142909

Y. Baveye, F. Urban, C. Chamaret, V. Demoulin, and P. Hellier, Saliency-Guided Consistent Color Harmonization, Computational Color Imaging Workshop, pp.105-118, 2013.
DOI : 10.1007/978-3-642-36700-7_9

D. Hasler and S. Suesstrunk, Measuring colourfulness in natural images, Proc. SPIE Electronic Imaging 2003: Human Vision and Electronic Imaging VIII, pp.87-95, 2003.
DOI : 10.1117/12.477378

Y. Ke, X. Tang, and F. Jing, The design of high-level features for photo quality assessment, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp.419-426, 2006.

O. Le Meur, T. Baccino, and A. Roumy, Prediction of the interobserver visual congruency (IOVC) and application to image ranking, Proceedings of the 19th ACM International Conference on Multimedia, pp.373-382, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00628077

Y. Luo and X. Tang, Photo and Video Quality Evaluation: Focusing on the Subject, Proceedings of the 10th European Conference on Computer Vision, ser. ECCV'08, pp.386-399, 2008.
DOI : 10.1007/978-3-540-88690-7_29

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.458.1974

Y. Baveye, F. Urban, and C. Chamaret, Image and Video Saliency Models Improvement by Blur Identification, Computer Vision and Graphics, pp.280-287, 2012.
DOI : 10.1007/978-3-642-33564-8_34

Y. Baveye, C. Chamaret, E. Dellandréa, and L. Chen, A Protocol for Cross-Validating Large Crowdsourced Data, Proceedings of the 2014 International ACM Workshop on Crowdsourcing for Multimedia, CrowdMM '14, pp.3-8, 2014.
DOI : 10.1145/2660114.2660115

URL : https://hal.archives-ouvertes.fr/hal-01313188

J. A. Sloboda, Empirical studies of emotional response to music., Cognitive bases of musical communication, pp.33-46, 1992.
DOI : 10.1037/10104-003

M. Zentner, D. Grandjean, and K. R. Scherer, Emotions evoked by the sound of music: Characterization, classification, and measurement., Emotion, vol.8, issue.4, pp.494-521, 2008.
DOI : 10.1037/1528-3542.8.4.494

K. R. Scherer and M. R. Zentner, Emotional effects of music: Production rules, Music and emotion: Theory and research, pp.361-392, 2001.

Y. Baveye, E. Dellandréa, C. Chamaret, and L. Chen, From crowdsourced rankings to affective ratings, 2014 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), pp.1-6, 2014.
DOI : 10.1109/ICMEW.2014.6890568

URL : https://hal.archives-ouvertes.fr/hal-01313166