Conference papers

Integrating user preference to similarity queries over medical images datasets

Abstract: Large amounts of images from medical exams are being stored in databases, so developing retrieval techniques is an important research problem. Retrieval based on the image visual content is usually better than using textual descriptions, as descriptions seldom capture every nuance the user may be interested in. Content-based image retrieval employs the similarity among images for retrieval. However, similarity is evaluated using numeric methods, which often order the images in a way rather distinct from the user's intention. In this paper, we propose a technique that allows expressing the user's preference over attributes associated with the images, so similarity queries can be refined by preference rules. Experiments performed over a dataset of computed tomography lung images show that, by correctly expressing the user's preferences, the similarity query precision can increase from an average of 60% to close to 100%, when enough interesting images exist in the database.
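The abstract describes refining a similarity (k-nearest-neighbour) query with preference rules over attributes associated with the images. Below is a minimal sketch of that idea; the feature vectors, the `finding` attribute, and the "preferred-first, then by distance" combination rule are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch: a k-NN similarity query refined by a preference
# rule over an image attribute. Dataset, attribute names, and the
# combination strategy are assumptions for illustration only.
import math

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn(images, query_features, k):
    """Plain similarity query: k nearest images by feature distance."""
    ranked = sorted(images,
                    key=lambda img: euclidean(img["features"], query_features))
    return ranked[:k]

def preferred_knn(images, query_features, k, prefer):
    """Preference-refined query: images satisfying the preference rule
    rank first; ties are broken by distance to the query image."""
    ranked = sorted(
        images,
        key=lambda img: (not prefer(img),
                         euclidean(img["features"], query_features)),
    )
    return ranked[:k]

# Toy dataset: a feature vector plus one associated attribute per image.
images = [
    {"id": 1, "features": [0.10, 0.20], "finding": "normal"},
    {"id": 2, "features": [0.20, 0.10], "finding": "emphysema"},
    {"id": 3, "features": [0.90, 0.80], "finding": "emphysema"},
    {"id": 4, "features": [0.15, 0.18], "finding": "normal"},
]

query = [0.12, 0.19]
plain = [img["id"] for img in knn(images, query, 3)]
refined = [img["id"] for img in
           preferred_knn(images, query, 3,
                         lambda i: i["finding"] == "emphysema")]
```

With these toy values, the plain query returns the three visually closest images, while the preference-refined query promotes the images whose attribute matches the rule even when they are farther from the query, which is the kind of reordering the paper exploits to raise precision.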
Contributor: Richard Chbeir
Submitted on: Wednesday, December 10, 2014 - 3:30:55 PM
Last modification on: Wednesday, November 16, 2022 - 4:02:03 AM


  • HAL Id: hal-01093356, version 1


Monica Ribeiro Porto Ferreira, Marcelo Ponciano-Silva, Agma Traina, Caetano Traina, Sandra de Amo, et al.. Integrating user preference to similarity queries over medical images datasets. CBMS '10 Proceedings of the 2010 IEEE 23rd International Symposium on Computer-Based Medical Systems, Oct 2010, Perth, Australia. pp.486-491. ⟨hal-01093356⟩