Date Added: Oct 2010
The authors present an online algorithm that robustly tracks surgical tools in dynamic environments, with the goal of assisting the surgeon during in-vivo robotic surgery. The next generation of in-vivo robotic surgical devices integrates imaging and effector platforms that must be controlled through real-time visual feedback. The tracker learns the tool's appearance online to handle appearance and perspective changes, and it combines multiple features to model the tool and discover new regions of it as the tool moves quickly, exits and re-enters the scene, or becomes occluded and must be recovered.
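The core idea of online appearance learning can be illustrated with a toy sketch: track by template matching, blend each confident match back into the appearance model, and flag low-confidence frames so a re-detection step could recover the target. This is a hypothetical minimal example (the class name, normalized cross-correlation matcher, and exponential blending are my assumptions), not the paper's actual multi-feature algorithm.

```python
import numpy as np


class OnlineAppearanceTracker:
    """Toy single-template tracker with an online appearance update.

    Hypothetical sketch: NCC matching over a local search window,
    plus exponential blending of the template on confident matches.
    """

    def __init__(self, frame, box, alpha=0.1, conf_thresh=0.5):
        x, y, w, h = box
        self.template = frame[y:y + h, x:x + w].astype(float)
        self.alpha = alpha              # learning rate for the appearance model
        self.conf_thresh = conf_thresh  # below this, declare the target lost
        self.pos = (x, y)
        self.size = (w, h)

    @staticmethod
    def _ncc(a, b):
        # Normalized cross-correlation between two equal-sized patches.
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def update(self, frame, search=8):
        w, h = self.size
        x0, y0 = self.pos
        H, W = frame.shape
        best_score, best_pos = -1.0, (x0, y0)
        # Exhaustive search in a small window around the previous position.
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                x, y = x0 + dx, y0 + dy
                if 0 <= x and 0 <= y and x + w <= W and y + h <= H:
                    patch = frame[y:y + h, x:x + w].astype(float)
                    score = self._ncc(patch, self.template)
                    if score > best_score:
                        best_score, best_pos = score, (x, y)
        if best_score >= self.conf_thresh:
            x, y = best_pos
            patch = frame[y:y + h, x:x + w].astype(float)
            # Online learning step: blend the new observation into the model
            # so gradual appearance and perspective changes are absorbed.
            self.template = (1 - self.alpha) * self.template + self.alpha * patch
            self.pos = best_pos
            return best_pos, best_score, True
        # Low confidence: keep the old position and signal that a
        # re-detection / recovery stage should take over.
        return self.pos, best_score, False
```

A real system along the paper's lines would replace the single template with multiple cooperating features and add a global re-detection stage for fast motion and scene re-entry; this sketch only shows the online update loop.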