Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications

Abstract: To make sense of their environment, both humans and robots need to construct a consistent percept from many sources of information (including visual and auditory stimulation). Multimodal merging thus plays a key role in human perception, for instance by lowering reaction times and detection thresholds. Psychophysics experiments have shown that humans are able to fuse information in a Bayes-optimal way (Ernst & Banks, 2002), weighting each modality by its precision (i.e., the inverse of its perceived variance). These weights are usually estimated a posteriori from experimental data, but the mechanisms by which agents may estimate such precisions online remain poorly studied. Candidate mechanisms may stem from sensorimotor accounts of perception and the predictive coding framework, with actions (e.g., saccades) being used to simultaneously estimate stimulus location and sensory precision (Friston et al., 2011). In the context of the AMPLIFIER (Active Multisensory Perception and LearnIng For InteractivE Robots) project (2018-2022), we study the mutual influence of multisensory fusion and active perception. The project combines three complementary components. First, psychophysics experiments help confirm and refine hypotheses, by manipulating stimuli and task constraints (e.g., audio-visual discrepancy, stimulus presentation time, number of fixations or saccades during presentation) and estimating their effect on saccadic eye movements, as well as the effects of eye movements on the localization of the target. Second, neurocomputational models based on the dynamic neural field framework provide distributed representations of stimuli, make it possible to replicate experimental data, and generate predictions. Finally, such models will be coupled with active decision-making and developmental learning of sensorimotor contingencies, to be embedded in social robotic platforms, improving human-robot interactions through more natural (gaze) interactions and more appropriate reactions in complex environments.
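The Bayes-optimal fusion scheme mentioned in the abstract (Ernst & Banks, 2002) can be illustrated with a minimal sketch: for two independent Gaussian estimates of a stimulus position (e.g., visual and auditory), the maximum-likelihood fused estimate weights each cue by its relative precision. Function and variable names below are illustrative, not taken from the project's codebase.

```python
def fuse(x_v, var_v, x_a, var_a):
    """Precision-weighted (maximum-likelihood) fusion of two independent
    Gaussian position estimates, e.g. a visual and an auditory cue."""
    precision_v = 1.0 / var_v
    precision_a = 1.0 / var_a
    w_v = precision_v / (precision_v + precision_a)  # weight = relative precision
    w_a = 1.0 - w_v
    x_hat = w_v * x_v + w_a * x_a                    # fused location estimate
    var_hat = 1.0 / (precision_v + precision_a)      # fused variance (lower than either cue)
    return x_hat, var_hat

# Example: a precise visual cue (variance 1) vs a noisier auditory cue (variance 4)
x_hat, var_hat = fuse(0.0, 1.0, 3.0, 4.0)
# The fused estimate is pulled toward the more precise (visual) cue,
# and the fused variance is smaller than that of either single cue.
```

The online-precision question raised in the abstract is exactly about where `var_v` and `var_a` come from: in the lab they are fitted a posteriori, whereas an active agent would have to estimate them on the fly, e.g. across saccades.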

https://hal.archives-ouvertes.fr/hal-01894621
Contributor: Jean-Charles Quinton
Submitted on: Saturday 14 July 2018 - 17:35:41
Last modified on: Wednesday 23 January 2019 - 16:30:40
Long-term archiving on: Tuesday 16 October 2018 - 01:28:54

File

poster_WEM_2018_AMPLIFIER_v2_l...
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01894621, version 2

Citation

Mathieu Lefort, Jean-Charles Quinton, Simon Forest, Adrien Techer, Alan Chauvin, et al. Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications. Grenoble Workshop on Models and Analysis of Eye Movements, Jun 2018, Grenoble, France. pp.1, https://eyemovements.sciencesconf.org/. hal-01894621v2
