
Environment Exploration for Object-Based Visual Saliency Learning

Céline Craye (1, 2, 3), David Filliat (2, 1), Jean-François Goudou (3)
2. Flowers - Flowing Epigenetic Robots and Systems (Inria Bordeaux - Sud-Ouest, U2IS - Unité d'Informatique et d'Ingénierie des Systèmes)
Abstract: Searching for objects in an indoor environment can be drastically improved if a task-specific visual saliency is available. We describe a method to incrementally learn such an object-based visual saliency directly on a robot, using an environment exploration mechanism. We first define saliency based on a geometrical criterion and use this definition to segment salient elements given an attentive but costly and restrictive observation of the environment. These elements are used to train a fast classifier that predicts salient objects from large-scale visual features. To make learning faster and more accurate, we use an exploration strategy based on intrinsic motivation to drive our attentive observation. Our approach has been tested on a robot in our lab as well as on publicly available RGB-D image sequences. We demonstrate that the approach outperforms several state-of-the-art methods for indoor object detection and that the exploration strategy can drastically decrease the time required for learning saliency.
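The pipeline the abstract describes — a costly attentive process provides labeled salient regions, a fast classifier is trained incrementally on their features, and an intrinsic-motivation signal (here, learning progress) picks which region to attend to next — can be sketched as a toy loop. This is an illustrative sketch, not the authors' implementation: the feature generator, the logistic model, and the learning-progress window are all assumptions.

```python
# Toy sketch: incremental saliency learning driven by learning progress.
# Not the paper's implementation; all components here are simplified stand-ins.
import math
import random

random.seed(0)

class OnlineSaliencyClassifier:
    """Tiny logistic model updated one region sample at a time."""
    def __init__(self, dim, lr=0.5):
        self.w = [0.0] * dim
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        z = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, x, y):
        """One SGD step on (features, salient-label); returns pre-update error."""
        p = self.predict(x)
        g = p - y
        self.w = [wi - self.lr * g * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * g
        return abs(y - p)

def learning_progress(errs, window=5):
    """Intrinsic reward: recent drop in prediction error for a region."""
    if len(errs) < 2 * window:
        return float("inf")  # barely explored: maximally interesting
    older = sum(errs[-2 * window:-window]) / window
    recent = sum(errs[-window:]) / window
    return older - recent

def observe(region_id):
    """Stand-in for the costly attentive observation of one region.
    Odd regions are salient (features near (1,1)), even ones are not."""
    salient = region_id % 2 == 1
    base = 1.0 if salient else 0.0
    x = [base + random.gauss(0, 0.1), base + random.gauss(0, 0.1)]
    return x, 1 if salient else 0

clf = OnlineSaliencyClassifier(dim=2)
errors = {r: [] for r in range(4)}  # per-region error history
for _ in range(200):
    # Curiosity-driven choice: attend where learning progress is highest.
    region = max(errors, key=lambda r: learning_progress(errors[r]))
    x, y = observe(region)
    errors[region].append(clf.update(x, y))
```

After the loop, `clf` separates salient from non-salient feature vectors, and the exploration schedule has concentrated observations where prediction error was still dropping — the mechanism the paper credits for faster learning.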

Cited literature: 23 references
Contributor: Céline Craye
Submitted on: Thursday, March 17, 2016 - 10:06:19 AM
Last modification on: Thursday, January 21, 2021 - 9:26:01 AM
Long-term archiving on: Saturday, June 18, 2016 - 5:53:11 PM




  • HAL Id: hal-01289159, version 1



Céline Craye, David Filliat, Jean-François Goudou. Environment Exploration for Object-Based Visual Saliency Learning. IEEE International Conference on Robotics and Automation (ICRA), May 2016, Stockholm, Sweden. ⟨hal-01289159⟩


