Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception - Archive ouverte HAL
Journal article, Journal of Neurophysiology, 2012

Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception

Authors: Paula Sanz Leon, Ivo Vanzetta, Guillaume S. Masson, Laurent Perrinet

Abstract

Choosing an appropriate set of stimuli is essential to characterize the response of a sensory system along a particular functional dimension, such as the eye movements that follow the motion of a visual scene. Here, we describe a framework to generate random texture movies with controlled information content, which we call Motion Clouds. These stimuli are defined by a generative model based on a controlled experimental parametrization. We show that Motion Clouds correspond to a dense mixing of localized moving gratings with random positions. Their global envelope is similar to natural-like stimulation with an approximate full-field translation corresponding to a retinal slip. We describe the construction of these stimuli mathematically and propose an open-source Python-based implementation. Examples of the use of this framework are shown. We also propose extensions to other modalities such as color vision, touch, and audition.
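As a rough illustration of the synthesis principle summarized above (an amplitude envelope in the spatiotemporal Fourier domain, centered on a mean spatial frequency and on the plane corresponding to a given translation speed, combined with random phases), a minimal NumPy sketch is given below. It is not the authors' reference implementation (the open-source MotionClouds Python package); the function name, parameter names, and default values are illustrative assumptions.

# Minimal sketch of a Motion-Cloud-like movie: a smooth amplitude envelope in
# the 3D Fourier domain (fx, fy, ft) is given random phases and transformed
# back to space-time. Names and defaults are illustrative, not the package API.
import numpy as np

def motion_cloud(N_X=128, N_Y=128, N_T=64, sf_0=0.125, B_sf=0.05,
                 V_X=1.0, V_Y=0.0, B_V=0.2, seed=42):
    # Frequency grids in cycles per sample along x, y, and t
    fx, fy, ft = np.meshgrid(np.fft.fftfreq(N_X), np.fft.fftfreq(N_Y),
                             np.fft.fftfreq(N_T), indexing='ij')
    f_r = np.sqrt(fx**2 + fy**2)          # radial spatial frequency
    f_r[0, 0, :] = np.inf                 # avoid division by zero at zero spatial frequency
    # Gaussian band around the mean spatial frequency sf_0
    env_sf = np.exp(-0.5 * ((f_r - sf_0) / B_sf)**2)
    # Gaussian band around the speed plane ft = -(V_X*fx + V_Y*fy)
    env_v = np.exp(-0.5 * ((ft + V_X * fx + V_Y * fy) / (B_V * f_r))**2)
    envelope = env_sf * env_v
    # Random phases turn the envelope into a dense mixture of drifting gratings
    rng = np.random.default_rng(seed)
    phase = np.exp(2j * np.pi * rng.random(envelope.shape))
    movie = np.fft.ifftn(envelope * phase).real
    return movie / np.abs(movie).max()    # normalize contrast to [-1, 1]

movie = motion_cloud()                    # array of shape (128, 128, 64): x, y, time

The returned array can be displayed frame by frame; narrowing B_sf or B_V concentrates the energy closer to a single drifting grating, whereas widening them yields a more broadband, noise-like moving texture.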
Main file: MotionClouds.pdf (5.16 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02387898, version 1 (26-01-2021)

Identifiers

HAL Id: hal-02387898
DOI: 10.1152/jn.00737.2011

Cite

Paula Sanz Leon, Ivo Vanzetta, Guillaume S. Masson, Laurent Perrinet. Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception. Journal of Neurophysiology, 2012, 107 (11), pp.3217-3226. ⟨10.1152/jn.00737.2011⟩. ⟨hal-02387898⟩