Journal article in IEEE Transactions on Medical Imaging, 2015

Detecting Surgical Tools by Modelling Local Appearance and Global Shape

Abstract

Detecting tools in surgical videos is an important ingredient for context-aware computer-assisted surgical systems. To this end, we present a new surgical tool detection dataset and a method for joint tool detection and pose estimation in 2D images. Our two-stage pipeline is data-driven and relaxes strong assumptions made by previous works regarding the geometry, number, and position of tools in the image. The first stage classifies each pixel based on local appearance only, while the second stage evaluates a tool-specific shape template to enforce global shape. Both local appearance and global shape are learned from training data. Our method is validated on a new surgical tool dataset of 2476 images from neurosurgical microscopes, which is made freely available. It improves over existing datasets in size, diversity and detail of annotation. We show that our method significantly improves over competitive baselines from the computer vision field. We achieve 15% detection miss-rate at 10⁻¹ false positives per image (for the suction tube) over our surgical tool dataset. Results indicate that performing semantic labelling as an intermediate task is key for high quality detection.
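The abstract only outlines the two-stage design, so the short Python sketch below illustrates how such a pipeline could be wired together. It is not the authors' implementation: the use of raw patch intensities as features, a scikit-learn-style binary classifier `clf` (tool = class 1), and cross-correlation of the semantic score map against a learned binary template are all illustrative assumptions.

import numpy as np
from scipy.signal import correlate2d  # cross-correlation for template scoring

def stage1_pixel_scores(image, clf, patch=7):
    """Stage 1 (illustrative): score every pixel with a local-appearance
    classifier trained offline (any scikit-learn-style `clf` exposing
    predict_proba). Features are the raw intensities of a small patch."""
    h, w = image.shape
    r = patch // 2
    padded = np.pad(image, r, mode="reflect")
    feats = np.stack(
        [padded[y:y + patch, x:x + patch].ravel()
         for y in range(h) for x in range(w)]
    )
    # Probability that each pixel belongs to the tool class (assumed class 1).
    return clf.predict_proba(feats)[:, 1].reshape(h, w)

def stage2_template_detection(score_map, template):
    """Stage 2 (illustrative): slide a tool-specific shape template over the
    per-pixel score map and return the best-scoring location."""
    response = correlate2d(score_map, template, mode="valid")
    y, x = np.unravel_index(np.argmax(response), response.shape)
    return (y, x), response[y, x]

In the paper both the per-pixel classifier and the shape template are learned from annotated training data; the sketch leaves that training step out of scope.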
No file deposited

Dates and versions

hal-01239862 , version 1 (08-12-2015)

Identifiers

Cite

David Bouget, Rodrigo Benenson, Mohamed Omran, Laurent Riffaud, Bernt Schiele, et al.. Detecting Surgical Tools by Modelling Local Appearance and Global Shape. IEEE Transactions on Medical Imaging, 2015, 34 (12), pp. 2603-2617. ⟨10.1109/TMI.2015.2450831⟩. ⟨hal-01239862⟩