Carney, D. R., Hall, J. A., and LeBeau, L. S., Beliefs about the nonverbal expression of social power, Journal of Nonverbal Behavior, vol. 29, issue 2, pp. 105-123, 2005.

Cassell, J., Pelachaud, C., Badler, N., Steedman, M., et al., Animated conversation: Rule-based generation of facial expression, gesture and spoken intonation for multiple conversational agents, Computer Graphics, pp. 413-420, 1994.

Chindamo, M., Allwood, J., and Ahlsén, E., Some suggestions for the study of stance in communication, Proc. SocialCom Conference, pp. 617-622, 2012.

Chollet, M., Ochs, M., and Pelachaud, C., Mining a multimodal corpus for non-verbal behavior sequences conveying attitudes, Proceedings of the 9th Language Resources and Evaluation Conference (LREC), pp. 3417-3424, 2014.

Clavel, C., Plessier, J., Martin, J.-C., Ach, L., et al., Combining facial and postural expressions of emotions in a virtual character, Proc. IVA Conference, pp. 287-300, 2009.

Cowie, R., Cox, C., Martin, J.-C., et al., Issues in data labelling, in Emotion-Oriented Systems: The Humaine Handbook, pp. 215-244, 2011.

Dael, N., Mortillaro, M., and Scherer, K. R., The Body Action and Posture Coding System (BAP): Development and Reliability, Journal of Nonverbal Behavior, vol. 36, pp. 97-121, 2012.

De Rosis, F., Pelachaud, C., Poggi, I., Carofiglio, V., et al., From Greta's mind to her face: modelling the dynamics of affective states in a conversational embodied agent, International Journal of Human-Computer Studies, vol. 59, issue 1-2, pp. 81-118, 2003. DOI: 10.1016/S1071-5819(03)00020-X

Dias, J., and Paiva, A., I Want to Be Your Friend: Establishing Relations with Emotionally Intelligent Agents, Proc. of AAMAS, pp. 777-784, 2013. URL: https://hal.archives-ouvertes.fr/hal-01011936

Ding, Y., Pelachaud, C., and Artières, T., Modeling Multimodal Behaviors From Speech Prosody, Proc. of IVA, pp. 217-228, 2013.

Ekman, P., and Friesen, W. V., Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues, 1975.

Fanelli, G., Gall, J., Romsdorfer, H., Weise, T., et al., A 3-D Audio-Visual Corpus of Affective Communication, IEEE Transactions on Multimedia, vol. 12, issue 6, pp. 591-598, 2010.

Gillies, M., Crabtree, I., and Ballin, D., Customisation and context for expressive behaviour in the broadband world, BT Technology Journal, vol. 22, issue 2, pp. 16-17, 2004.

Isen, A. M., Niedenthal, P. M., and Cantor, N., An influence of positive affect on social categorization, Motivation and Emotion, vol. 16, issue 1, pp. 65-78, 1992.

Kopp, S., Krenn, B., Marsella, S., Marshall, A. N., Pelachaud, C., et al., Towards a common framework for multimodal generation: The behavior markup language, Proc. IVA Conference, pp. 205-217, 2006.

Louwerse, M. M., Dale, R., Bard, E. G., and Jeuniaux, P., Behavior matching in multimodal communication is synchronized, Cognitive Science, vol. 36, issue 8, pp. 1404-1426, 2012.

McRorie, M., Sneddon, I., McKeown, G., Bevacqua, E., de Sevin, E., and Pelachaud, C., Evaluation of Four Designed Virtual Agent Personalities, IEEE Transactions on Affective Computing, vol. 3, issue 3, pp. 311-322, 2012.

Ochs, M., Niewiadomski, R., Brunet, P., and Pelachaud, C., Smiling Virtual Agent in Social Context, Cognitive Processing, Special Issue on "Social Agents: From Theory to Applications", vol. 13, pp. 519-532, 2012.

Ochs, M., Prepin, K., and Pelachaud, C., From emotions to interpersonal stances: Multi-level analysis of smiling virtual characters, Proc. of ACII, pp. 258-263, 2013.

Osgood, C. E., and Tannenbaum, P. H., The Principle of Congruity in the Prediction of Attitude Change, Psychological Review, vol. 62, issue 1, pp. 42-55, 1955.

Pandzic, I. S., and Forchheimer, R., MPEG-4 Facial Animation: The Standard, Implementation and Applications, 2002.

Pecune, F., Ochs, M., and Pelachaud, C., A formal model of social relations for artificial companions, Proc. of European Workshop on Multi-Agent Systems (EUMAS), 2013.

Prendinger, H., Descamps, S., and Ishizuka, M., Scripting affective communication with life-like characters in web-based interaction systems, Applied Artificial Intelligence, vol. 16, issue 7-8, 2002.

Prepin, K., Ochs, M., and Pelachaud, C., Beyond backchannels: co-construction of dyadic stance by reciprocal reinforcement of smiles between virtual agents, 2013.

Ravenet, B., Ochs, M., and Pelachaud, C., From a User-Created Corpus of Virtual Agent's Non-Verbal Behavior to a Computational Model of Interpersonal Attitudes, Proc. of IVA, pp. 263-274, 2013.

Rich, C., and Sidner, C. L., Using collaborative discourse theory to partially automate dialogue tree authoring, Intelligent Virtual Agents, 2012.

Roether, C. L., Omlor, L., Christensen, A., and Giese, M. A., Critical features for the perception of emotion from gait, Journal of Vision, vol. 9, issue 6, pp. 1-32, 2009.

Sadler, P., Ethier, N., and Woody, E., Interpersonal Complementarity, in Handbook of Interpersonal Psychology: Theory, Research, Assessment, and Therapeutic Interventions, pp. 123-142, 2011.

Salovey, P., Bedell, B. T., Detweiler, J. B., and Mayer, J. D., Current Directions in Emotional Intelligence Research, in Handbook of Emotions, pp. 504-520, 2000.

Schutz, W. C., FIRO: A Three-Dimensional Theory of Interpersonal Behavior, 1958.

Srikant, R., and Agrawal, R., Mining sequential patterns: Generalizations and performance improvements, Advances in Database Technology (EDBT), LNCS 1057, pp. 1-17, 1996.