Conference paper, Year: 2020

Co-Sound: An interactive medium with WebAR and spatial synchronization

Kazuma Inokuchi
Manabu Tsukada
Hiroshi Esaki
  • Role: Author
  • PersonId : 1077457

Abstract

An Internet-based media service platform can control recording processes and manage video and audio data. Furthermore, the design and implementation of an object-based recording system enable flexible playback of the viewed content. Augmented Reality (AR) is a three-dimensional video projection technology; however, there are few examples of its use in audiovisual media platforms. In this study, we propose Co-Sound, a multimodal interface that renders object-based AR dynamically in response to viewer actions in a web browser by sharing AR objects among multiple devices in real time. We confirmed that the system works as an object-based interactive medium with AR, that its general acceptance was very high in a questionnaire survey, and that it achieves low-latency synchronization when accepting operations from multiple users in real time.
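The abstract describes sharing AR objects among multiple web browsers in real time. As a purely illustrative sketch of how such browser-side state sharing can be structured, and not the authors' actual protocol or code, the TypeScript snippet below broadcasts object poses over a WebSocket relay and keeps the newest update per object; the server URL, message format, and object name ("speaker-1") are assumptions made for this example.

// Minimal sketch (not the paper's implementation): sharing the pose of an
// AR object among browsers over a WebSocket relay, so that an action on one
// device is reflected on the others in near real time.
// The server URL, message shape, and object IDs below are assumptions.

interface ObjectPose {
  id: string;                                 // which AR object is being updated
  position: [number, number, number];         // metres in the shared AR coordinate frame
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
  timestamp: number;                          // sender clock, used to drop stale updates
}

const latestPose = new Map<string, ObjectPose>();
const socket = new WebSocket("wss://example.invalid/co-sound"); // placeholder URL

// Apply remote updates, ignoring anything older than what we already have.
socket.onmessage = (event: MessageEvent<string>) => {
  const pose: ObjectPose = JSON.parse(event.data);
  const current = latestPose.get(pose.id);
  if (!current || pose.timestamp > current.timestamp) {
    latestPose.set(pose.id, pose);
    // A real client would now update the corresponding WebAR scene node.
  }
};

// Broadcast a local manipulation (e.g. the viewer dragging a sound object).
function publishPose(pose: Omit<ObjectPose, "timestamp">): void {
  const stamped: ObjectPose = { ...pose, timestamp: Date.now() };
  latestPose.set(stamped.id, stamped);
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(stamped));
  }
}

// Example: move a hypothetical object "speaker-1" one metre forward in the shared frame.
publishPose({ id: "speaker-1", position: [0, 0, -1], rotation: [0, 0, 0, 1] });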
Main file
paper_36.pdf (5.88 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02942505 , version 1 (18-09-2020)

License

Attribution (CC BY)

Identifiers

  • HAL Id : hal-02942505
  • DOI : 10.1007/978-3-030-65736-9_22

Cite

Kazuma Inokuchi, Manabu Tsukada, Hiroshi Esaki. Co-Sound: An interactive medium with WebAR and spatial synchronization. 19th International Conference on Entertainment Computing (ICEC), Nov 2020, Xi'an, China. pp.255-263, ⟨10.1007/978-3-030-65736-9_22⟩. ⟨hal-02942505⟩
123 views
91 downloads
