A. Blake and M. Isard, 3D position, attitude and shape input using video tracking of hands and lips, Proceedings of the 21st annual conference on Computer graphics and interactive techniques, pp.185-192, 1994.

M. D. Bondy, N. D. Georganas, E. M. Petriu, D. C. Petriu, M. D. Cordea et al., Model-based face and lip animation for interactive virtual reality applications, Proceedings of the ninth ACM international conference on Multimedia, pp.559-563, 2001.

D. A. Bowman, E. Kruijff, J. J. LaViola, and I. Poupyrev, 3D user interfaces: theory and practice, 2005.

C. Bregler, M. Covell, and M. Slaney, Video rewrite: Driving visual speech with audio, Proceedings of the 24th annual conference on Computer graphics and interactive techniques, pp.353-360, 1997.

C. Busso and S. S. Narayanan, Interrelation between speech and facial gestures in emotional utterances: a single subject study, IEEE Transactions on Audio, Speech, and Language Processing, vol.15, issue.8, pp.2331-2347, 2007.

S. Carbini, S. L. Picard, A. Bouguet, and J. E. Viallet, Interaction collaborative sur des objets virtuels 3D à l'aide d'une interface multimodale oro-gestuelle, 2006.

Q. Chen, N. D. Georganas, and E. M. Petriu, Hand gesture recognition using Haar-like features and a stochastic context-free grammar, IEEE Transactions on Instrumentation and Measurement, vol.57, issue.8, pp.1562-1571, 2008.

P. R. Cohen, M. Johnston, D. McGee, S. Oviatt, J. Pittman et al., QuickSet: Multimodal interaction for distributed applications, Proceedings of the fifth ACM international conference on Multimedia, pp.31-40, 1997.

A. Corradini and H. Gross, A hybrid stochastic-connectionist architecture for gesture recognition, Proceedings of the 1999 International Conference, pp.336-341, 1999.

W. Ding, P. Chen, H. , and M. Pomplun, A gaze-controlled interface to virtual reality applications for motor-and speech-impaired users, 2008.

Z. Feng, B. Yang, Y. Li, Y. Zheng, X. Zhao et al., Real-time oriented behavior-driven 3D freehand tracking for direct interaction, Pattern Recognition, 2012.

S. Fothergill, H. Mentis, P. Kohli, and S. Nowozin, Instructing people for training gestural interactive systems, Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems, pp.1737-1746, 2012.

D. Friedman, R. Leeb, A. Antley, M. Garau, C. Guger et al., Navigating virtual reality by thought: First steps, Proceedings of the 7th Annual International Workshop on Presence, vol.160, p.167, 2004.

P. Garg, N. Aggarwal, and S. Sofat, Vision based hand gesture recognition, World Academy of Science, Engineering and Technology, vol.49, issue.1, pp.972-977, 2009.

D. M. Gavrila, The visual analysis of human movement: A survey, Computer Vision and Image Understanding, vol.73, pp.82-98, 1999.

E. Ghomi, O. Bau, W. Mackay, and S. Huot, Conception et apprentissage des interactions tactiles : le cas des postures multi-doigts, FITG '10 : French workshop on tactile and gestural interaction, 2010.

M. Haller, J. Leitner, T. Seifried, J. R. Wallace, S. D. Scott et al., The NiCE discussion room: Integrating paper and digital media to support co-located group meetings, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp.609-618, 2010.

J. Hasenfratz, M. Lapierre, and F. Sillion, A real-time system for full body interaction with virtual worlds, Proceedings of the Tenth Eurographics conference on Virtual Environments, pp.147-156, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00281383

Z. Jian, A. Cheok, and K. C. Chung, Real-time lip tracking for virtual lip implementation in virtual environments and computer games, The 10th IEEE International Conference on, vol.3, pp.1359-1362, 2001.

F. Klompmaker, V. Paelke, and H. Fischer, A taxonomy-based approach towards NUI interaction design, Distributed, Ambient, and Pervasive Interactions, pp.32-41, 2013.

W. A. König, R. Rädle, and H. Reiterer, Interactive design of multimodal user interfaces, Journal on Multimodal User Interfaces, vol.3, issue.3, pp.197-213, 2010.

H. Lahamy and D. Lichti, Real-time hand gesture recognition using range cameras, Proceedings of the Canadian Geomatics Conference, 2010.

W. Lai and H. Huosheng, Towards multimodal human-machine interface for hands-free control: A survey, 2011.

A. Lécuyer, F. Lotte, R. B. Reilly, R. Leeb, M. Hirose et al., Brain-computer interfaces, virtual reality, and videogames, Computer, vol.41, issue.10, pp.66-72, 2008.

J. Y. Lee, M. S. Kim, J. S. Kim, and S. M. Lee, Tangible user interface of digital products in multi-displays, The International Journal of Advanced Manufacturing Technology, vol.59, issue.9, pp.1245-1259, 2012.

R. Leeb, D. Friedman, M. Slater, and G. Pfurtscheller, A tetraplegic patient controls a wheelchair in virtual reality, BRAINPLAY 07 Brain-Computer Interfaces and Games Workshop at ACE (Advances in Computer Entertainment), p.37, 2007.

C. Lim and D. Kim, Development of gaze tracking interface for controlling 3D contents, Sensors and Actuators A: Physical, 2012.

F. Lotte, A. Lécuyer, Y. Renard, F. Lamarche, and B. Arnaldi, Classification de données cérébrales par système d'inférence flou pour l'utilisation d'interfaces cerveau-ordinateur en réalité virtuelle, 2006.

A. M. MacEachren, G. Cai, R. Sharma, I. Rauschert, I. Brewer et al., Enabling collaborative geoinformation access and decision-making through a natural, multimodal interface, International Journal of Geographical Information Science, vol.19, issue.3, pp.293-317, 2005.

F. Marton, M. Agus, E. Gobbetti, G. Pintore, and M. Rodriguez, Natural exploration of 3D massive models on large-scale light field displays using the FOX proximal navigation technique, Computers & Graphics, 2012.

S. McGlashan, Speech interfaces to virtual reality, Proceedings of 2nd International Workshop on Military Applications of Synthetic Environments and Virtual Reality, 1995.

S. McGlashan and T. Axling, A speech interface to virtual environments, 1996.

M. R. Morris, D. Morris, and T. Winograd, Individual audio channels with single display groupware : effects on communication and task strategy, Proceedings of the 2004 ACM conference on Computer supported cooperative work, pp.242-251, 2004.

S. Murphy, Design considerations for a natural user interface (NUI), White paper, 2012.

P. Nugues, C. Godéreaux, P. E. Guedj, and F. Revolta, A conversational agent to navigate in virtual worlds, 1996.

K. O'Hara, R. Harper, H. Mentis, A. Sellen, and A. Taylor, On the naturalness of touchless: putting the "interaction" back into NUI, ACM Transactions on Computer-Human Interaction (TOCHI), vol.20, issue.1, p.5, 2012.

K. O'Hara, A. Sellen, and R. Harper, Embodiment in brain-computer interaction, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp.353-362, 2011.

S. Otmane, Modèles et techniques logicielles pour l'assistance à l'interaction et à la collaboration en réalité mixte, Habilitation à diriger des recherches, 2010.

S. Oviatt and P. Cohen, Perceptual user interfaces: multimodal interfaces that process what comes naturally, Communications of the ACM, vol.43, issue.3, pp.45-53, 2000.

T. Pfeiffer, Towards gaze interaction in immersive virtual reality : Evaluation of a monocular eye tracking set-up, Virtuelle und Erweiterte Realität-Fünfter Workshop der GI-Fachgruppe VR/AR, pp.81-92, 2008.

P. Premaratne and Q. Nguyen, Consumer electronics control system based on hand gesture moment invariants, IET Computer Vision, vol.1, issue.1, pp.35-41, 2007.

D. W. Seo and J. Y. Lee, Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences, Expert Systems with Applications, vol.40, issue.9, pp.3784-3793, 2013.

M. Siggelkow, Importance of gaze awareness in augmented reality teleconferencing, 2005.

P. Song, H. Yu, and S. Winkler, Vision-based 3D finger interactions for mixed reality games with physics simulation, Proceedings of The 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, 2008.
URL : https://hal.archives-ouvertes.fr/hal-01530684

N. A. Streitz, J. Geißler, T. Holmer, S. Konomi, C. Müller-Tomfelde et al., i-LAND: an interactive landscape for creativity and innovation, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, CHI '99, pp.120-127, 1999.

R. Vertegaal, The GAZE groupware system: mediating joint attention in multiparty communication and collaboration, Proceedings of the SIGCHI conference on Human factors in computing systems, pp.294-301, 1999.

M. Weiss, J. D. Hollan, and J. Borchers, Augmenting interactive tabletops with translucent tangible controls, Tabletops-Horizontal Interactive Displays, pp.149-170, 2010.

D. Wigdor and D. Wixon, Brave NUI world : designing natural user interfaces for touch and gesture, 2011.

C. R. Wren, A. Azarbayejani, T. Darrell, and A. P. Pentland, Pfinder: Real-time tracking of the human body, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.19, issue.7, pp.780-785, 1997.

C. R. Wren, F. Sparacino, A. J. Azarbayejani, T. J. Darrell, T. E. Starner et al., Perceptive spaces for performance and entertainment: untethered interaction using computer vision and audition, Applied Artificial Intelligence, vol.11, pp.267-284, 1997.

M. Wu and R. Balakrishnan, Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays, Proceedings of the 16th annual ACM symposium on User interface software and technology, pp.193-202, 2003.

R. C. Zeleznik, A. S. Forsberg, and J. P. Schulze, Look-that-there : Exploiting gaze in virtual reality interactions, 2005.

S. Zhai, C. Morimoto, and S. Ihde, Manual and gaze input cascaded (MAGIC) pointing, Proceedings of the SIGCHI conference on Human factors in computing systems, pp.246-253, 1999.