Sensor fusion for interactive real-scale modeling and simulation systems

Abstract: This paper proposes an accurate sensor fusion scheme for navigation inside a real-scale 3D model that combines audio and video signals. The audio signals from a microphone array are merged by the Minimum Variance Distortionless Response (MVDR) beamforming algorithm and processed in real time by a Hidden Markov Model (HMM) to generate translation commands through the word-to-action module of the speech processing system. The output of an optical head tracker (four IR cameras) is then analyzed by a non-linear/non-Gaussian Bayesian algorithm to estimate the orientation of the user's head. This orientation is used to redirect the user toward a new direction by applying a quaternion rotation. The outputs of the two sensors (video and audio) are combined under the sensor fusion scheme to perform continuous traveling inside the model, and the highest precision for the traveling task is achieved under this scheme. Practical experiments show promising results for the implementation.
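The paper itself is the reference for the exact fusion pipeline; as a rough illustration of the redirection step mentioned in the abstract, the sketch below rotates a speech-commanded translation direction by the tracked head orientation using a quaternion-derived rotation matrix. The function names, the (w, x, y, z) quaternion convention, and the coordinate frame are illustrative assumptions, not code from the paper.

```python
import numpy as np

def quat_to_rotation_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def redirect_translation(command_dir, head_quat, speed=1.0):
    """Rotate the speech-commanded translation direction by the tracked
    head orientation, so the user travels where they are looking."""
    q = np.asarray(head_quat, dtype=float)
    R = quat_to_rotation_matrix(q / np.linalg.norm(q))
    return speed * (R @ np.asarray(command_dir, dtype=float))

# Example (assumed frame): a "move forward" command redirected by a
# 90-degree head yaw about the vertical (y) axis.
forward = [0.0, 0.0, -1.0]
yaw_90 = [np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0]
print(redirect_translation(forward, yaw_90))  # -> approx. [-1, 0, 0]
```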

https://hal.archives-ouvertes.fr/hal-00879739
Contributor : Compte de Service Administrateur Ensam
Submitted on : Monday, November 4, 2013 - 4:55:34 PM
Last modification on : Thursday, March 26, 2020 - 8:54:39 AM
Document(s) archived on : Friday, April 7, 2017 - 8:46:15 PM

File

LE2I_CGAMES_2013_MIRZAEI.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-00879739, version 1
  • ENSAM : http://hdl.handle.net/10985/7455

Citation

Mohammad Ali Mirzaei, Jean-Rémy Chardonnet, Christian Pere, Frédéric Merienne. Sensor fusion for interactive real-scale modeling and simulation systems. 18th International Conference on Computer Games (CGAMES USA), Jul 2013, United States. pp. 149-153. ⟨hal-00879739⟩
