%0 Generic
%A García-Peraza-Herrera, LC
%A Li, W
%A Fidon, L
%A Gruijthuijsen, C
%A Devreker, A
%A Attilakos, G
%A Deprest, J
%A Poorten, EBV
%A Stoyanov, D
%A Vercauteren, T
%A Ourselin, S
%C Ithaca, NY, USA
%D 2017
%F discovery:10122155
%I arXiv
%T ToolNet: Holistically-Nested Real-Time Segmentation of Robotic Surgical Tools
%U https://discovery.ucl.ac.uk/id/eprint/10122155/
%X Real-time tool segmentation from endoscopic videos is an essential part of many computer-assisted robotic surgical systems and of critical importance in robotic surgical data science. We propose two novel deep learning architectures for automatic segmentation of non-rigid surgical instruments. Both methods take advantage of automated deep-learning-based multi-scale feature extraction while trying to maintain an accurate segmentation quality at all resolutions. The two proposed methods encode the multi-scale constraint inside the network architecture. The first proposed architecture enforces it by cascaded aggregation of predictions, and the second proposed network does it by means of a holistically-nested architecture where the loss at each scale is taken into account in the optimization process. As the proposed methods are intended for real-time semantic labeling, both present a reduced number of parameters. We propose the use of parametric rectified linear units for semantic labeling in these small architectures to increase the regularization of the network while maintaining the segmentation accuracy. We compare the proposed architectures against state-of-the-art fully convolutional networks. We validate our methods using existing benchmark datasets, including ex vivo cases with phantom tissue and different robotic surgical instruments present in the scene. Our results show a statistically significant improvement in Dice Similarity Coefficient over previous instrument segmentation methods. We analyze our design choices and discuss the key drivers for improving accuracy.
%Z This version is the version of record. For information on re-use, please refer to the publisher’s terms and conditions.