Visual tracking of surgical tools for proximity detection in retinal surgery

Abstract

In retinal surgery, surgeons face difficulties such as indirect visualization of surgical targets, physiological tremor, and a lack of tactile feedback. These difficulties increase the risk of incorrect surgical gestures that may cause retinal damage. In this context, robotic assistance has the potential to overcome current technical limitations and increase surgical safety. In this paper, we present a method for robustly tracking surgical tools in retinal surgery in order to detect proximity between the surgical tools and the retinal surface. An image similarity function based on weighted mutual information is specially tailored for tracking under critical illumination variations, lens distortions, and rapid motion. The proposed method was tested under challenging conditions using a phantom eye and recorded in vivo human data acquired with an ophthalmic stereo microscope. © 2011 Springer-Verlag.
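
The abstract refers to an image similarity function based on weighted mutual information. As a rough illustration only, the sketch below computes plain (unweighted) mutual information between two grayscale patches from a joint intensity histogram; the paper's weighting scheme and tool-tracking pipeline are not reproduced here, and the function and parameter names (mutual_information, bins) are illustrative.

```python
import numpy as np

def mutual_information(patch_a, patch_b, bins=32):
    """Mutual information between two grayscale patches via a joint histogram.

    Standard (unweighted) MI; the weighted variant described in the paper
    is not reproduced here.
    """
    hist_2d, _, _ = np.histogram2d(patch_a.ravel(), patch_b.ravel(), bins=bins)
    pxy = hist_2d / hist_2d.sum()      # joint distribution p(x, y)
    px = pxy.sum(axis=1)               # marginal p(x)
    py = pxy.sum(axis=0)               # marginal p(y)
    px_py = np.outer(px, py)           # product of marginals
    nz = pxy > 0                       # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / px_py[nz])))

# Example: a patch scores higher against itself than against unrelated noise
rng = np.random.default_rng(0)
template = rng.random((32, 32))
noise = rng.random((32, 32))
print(mutual_information(template, template), mutual_information(template, noise))
```

In a similarity-based tracker, a score of this kind would be maximized over candidate tool positions in each frame; MI-style measures are commonly chosen because they tolerate the illumination changes mentioned in the abstract better than direct intensity differences.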

Citation (APA)

Richa, R., Balicki, M., Meisner, E., Sznitman, R., Taylor, R., & Hager, G. (2011). Visual tracking of surgical tools for proximity detection in retinal surgery. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6689 LNCS, pp. 55–66). https://doi.org/10.1007/978-3-642-21504-9_6
