Paper Title
Real-Time Segmentation of Non-Rigid Surgical Tools based on Deep Learning and Tracking
Paper Authors
Paper Abstract
Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small set of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases in which different surgical instruments are employed. Two versions of the method are presented, non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6% on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8 percentage points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2% across all the validated datasets.
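The real-time variant alternates slow, accurate FCN segmentations with fast optical-flow tracking of the resulting tool mask between keyframes. Below is a minimal sketch of the tracking half of such a pipeline: warping a binary mask forward with a dense flow field. The function name, the nearest-neighbour warping scheme, and the constant toy flow field are illustrative assumptions, not the paper's actual implementation (which would obtain the flow from an optical-flow estimator on consecutive video frames).

```python
import numpy as np

def propagate_mask(mask, flow):
    """Warp a binary segmentation mask forward using a dense optical-flow
    field, flow[y, x] = (dy, dx). Nearest-neighbour splatting: each
    foreground pixel is moved along its flow vector and rounded to the
    closest grid position, clipped to the image bounds."""
    h, w = mask.shape
    out = np.zeros_like(mask)
    ys, xs = np.nonzero(mask)                                  # foreground pixels
    ny = np.clip(np.round(ys + flow[ys, xs, 0]).astype(int), 0, h - 1)
    nx = np.clip(np.round(xs + flow[ys, xs, 1]).astype(int), 0, w - 1)
    out[ny, nx] = 1
    return out

# Toy example: a single foreground pixel shifted one pixel right
# by a constant flow field (dx = +1 everywhere).
mask = np.zeros((5, 5), dtype=np.uint8)
mask[2, 1] = 1
flow = np.zeros((5, 5, 2), dtype=np.float32)
flow[..., 1] = 1.0
warped = propagate_mask(mask, flow)  # foreground pixel moves from (2, 1) to (2, 2)
```

In a full system, a flow estimator (e.g. a dense method such as Farnebäck's) would supply `flow` for each frame pair, and the FCN would periodically re-segment the frame to correct drift accumulated by the tracker.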