J. Armitage, Revealing timelines: Live coding and its gestures, Proceedings of ICLC, 2016.

J. Barbosa, F. Calegario, V. Teichrieb, G. Ramalho, and P. McGlynn, Considering audience's view towards an evaluation methodology for digital musical instruments, Proceedings of NIME, 2012.

M. A. Baytas, T. Göksun, and O. Özcan, The perception of live-sequenced electronic music via hearing and sight, Proceedings of NIME, 2016.

F. Berthaut, D. Coyle, J. Moore, and H. Limerick, Liveness Through the Lens of Agency and Causality, Proceedings of NIME, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01170032

F. Berthaut, M. T. Marshall, S. Subramanian, and M. Hachet, Rouages: Revealing the Mechanisms of Digital Musical Instruments to the Audience, Proceedings of NIME, 2013.
URL : https://hal.archives-ouvertes.fr/hal-00807049

F. Berthaut, D. Martinez-Plasencia, M. Hachet, and S. Subramanian, Reflets: Combining and Revealing Spaces for Musical Performances, Proceedings of NIME, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01136857

S. A. Bin, N. Bryan-Kinns, and A. P. McPherson, Skip the pre-concert demo: How technical familiarity and musical style affect audience response, Proceedings of NIME, 2016.

B. Calvo-Merino, D. E. Glaser, J. Grèzes, R. E. Passingham, and P. Haggard, Action Observation and Acquired Motor Skills: An fMRI Study with Expert Dancers, Cerebral Cortex, vol.15, issue.8, pp.1243-1249, 2005.
DOI : 10.1093/cercor/bhi007

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.329.6583

S. Fels, A. Gadd, and A. Mulder, Mapping transparency through metaphor: towards more expressive musical instruments, Organised Sound, vol.7, issue.02, pp.109-126, 2002.
DOI : 10.1017/S1355771802002042

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.3.8689

R. I. Godøy, A. R. Jensenius, A. Voldsund, K. H. Glette, M. E. Høvin et al., Classifying music-related actions, 2012.

J. Kim and E. André, Emotion recognition based on physiological changes in music listening, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.30, issue.12, pp.2067-2083, 2008.

S. Koelstra, C. Muhl, M. Soleymani, J. Lee, A. Yazdani et al., DEAP: A Database for Emotion Analysis Using Physiological Signals, IEEE Transactions on Affective Computing, vol.3, issue.1, pp.18-31, 2012.
DOI : 10.1109/T-AFFC.2011.15

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.593.8470

E. Kohler, C. Keysers, M. A. Umiltà, L. Fogassi, V. Gallese et al., Hearing Sounds, Understanding Actions: Action Representation in Mirror Neurons, Science, vol.297, issue.5582, pp.846-848, 2002.
DOI : 10.1126/science.1070311

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.177.3161

C. Lai and T. Bovermann, Audience experience in sound performance, Proceedings of NIME, 2013.

M. Leman and P. Maes, The Role of Embodiment in the Perception of Music, Empirical Musicology Review, vol.9, issue.3-4, pp.236-246, 2014.
DOI : 10.18061/emr.v9i3-4.4498

I. Molnar-Szakacs and K. Overy, Music and mirror neurons: from motion to 'e'motion, Social Cognitive and Affective Neuroscience, vol.1, issue.3, pp.235-241, 2006.
DOI : 10.1093/scan/nsl029

URL : https://academic.oup.com/scan/article-pdf/1/3/235/8560793/nsl029.pdf

J. Paulus, M. Müller, and A. Klapuri, State of the art report: Audio-based music structure analysis, Proceedings of ISMIR, 2010.

S. S. Rautaray and A. Agrawal, Vision based hand gesture recognition for human computer interaction: a survey, Artificial Intelligence Review, vol.43, issue.1, pp.1-54, 2015.

F. Ringeval, A. Sonderegger, J. Sauer, and D. Lalanne, Introducing the RECOLA multimodal corpus of remote collaborative and affective interactions, 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), pp.1-8, 2013.
DOI : 10.1109/FG.2013.6553805

D. Sammler, M. Grigutsch, T. Fritz, and S. Koelsch, Music and emotion: Electrophysiological correlates of the processing of pleasant and unpleasant music, Psychophysiology, vol.44, issue.2, pp.293-304, 2007.

J. C. Schacher and P. Neff, Skill Development and Stabilisation of Expertise for Electronic Music Performance, Proceedings of CMMR, 2015.

E. Schubert, S. Ferguson, N. Farrar, D. Taylor, and G. E. McPherson, The Six Emotion-Face Clock as a Tool for Continuously Rating Discrete Emotional Responses to Music, Proceedings of CMMR, 2012.
DOI : 10.1007/978-3-642-41248-6_1

H. Sequeira, P. Hot, L. Silvert, and S. Delplanque, Electrical autonomic correlates of emotion, International Journal of Psychophysiology, vol.71, issue.1, pp.50-56, 2009.
DOI : 10.1016/j.ijpsycho.2008.07.009

URL : https://hal.archives-ouvertes.fr/hal-00332658

M. Swan, The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery, Big Data, vol.1, issue.2, pp.85-99, 2013.
DOI : 10.1089/big.2012.0002

B. W. Vines, C. L. Krumhansl, M. M. Wanderley, I. M. Dalca, and D. J. Levitin, Music to my eyes: Cross-modal interactions in the perception of emotions in musical performance, Cognition, vol.118, issue.2, pp.157-170, 2011.
DOI : 10.1016/j.cognition.2010.11.010

D. M. Wegner and T. Wheatley, Apparent mental causation: Sources of the experience of will., American Psychologist, vol.54, issue.7, p.480, 1999.
DOI : 10.1037/0003-066X.54.7.480

J. C. Wu, M. Huberth, Y. H. Yeh, and M. Wright, Evaluating the audience's perception of real-time gestural control and mapping mechanisms in electroacoustic vocal performance, Proceedings of NIME, 2016.