Abstract:
Telerobotic, telesurgical, and/or surgical robotic devices, systems, and methods employ surgical robotic linkages that may have more degrees of freedom than an associated surgical end effector in space. A processor can calculate a tool motion that includes pivoting of the tool about an aperture site. Linkages movable along a range of configurations for a given end effector position may be driven toward configurations which inhibit collisions. Refined robotic linkages and methods for their use are also provided.
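The abstract above describes driving a kinematically redundant linkage through configurations that hold the end effector fixed while steering the joints away from collisions. A standard way to express this idea (a minimal illustrative sketch, not the patented method; all names and the objective gradient are assumptions) is resolved-rate control with a null-space projection:

```python
import numpy as np

def nullspace_step(J, dx, grad_h, alpha=0.1):
    """One resolved-rate step for a redundant arm (illustrative only).

    J:      end-effector Jacobian, m x n with n > m (more joints than
            task degrees of freedom).
    dx:     desired task-space motion of the end effector.
    grad_h: joint-space gradient of a collision-avoidance objective
            (e.g., distance from obstacles or joint-limit margin).

    The returned joint motion is a particular solution that achieves dx,
    plus a null-space term that reconfigures the linkage without moving
    the end effector.
    """
    J_pinv = np.linalg.pinv(J)
    # Projector onto the null space of J: motions in this subspace
    # produce zero end-effector velocity.
    N = np.eye(J.shape[1]) - J_pinv @ J
    return J_pinv @ dx + alpha * N @ grad_h
```

Because `N @ grad_h` lies in the null space of `J`, the collision-avoidance term changes the linkage configuration but leaves the commanded end-effector motion unaffected.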
Abstract:
A surgical instrument manipulator comprises a manipulator arm and an instrument mounting structure rotatably mounted to the manipulator arm. The instrument mounting structure comprises: an attachment interface configured to removably couple to a surgical instrument; and a passage within the instrument mounting structure. When the surgical instrument is coupled to the attachment interface, the surgical instrument is fixed to the instrument mounting structure. An elongate body of the surgical instrument extends through the passage when the surgical instrument is coupled to the attachment interface. The instrument mounting structure is rotatable relative to the manipulator arm, and when the surgical instrument is coupled to the attachment interface, rotation of the instrument mounting structure with respect to the manipulator arm causes rotation of the surgical instrument.
Abstract:
A system comprises a first robotic arm adapted to support and move a tool and a second robotic arm adapted to support and move a camera configured to capture an image of a camera field of view. The system further comprises an input device, a display, and a processor. The processor is configured to display a first synthetic image including a first synthetic image of the tool. The first synthetic image of the tool includes a portion of the tool outside of the camera field of view. The processor is also configured to receive a user input at the input device and responsive to the user input, change the display of the first synthetic image to a display of a second synthetic image including a second synthetic image of the tool that is different from the first synthetic image of the tool.
Abstract:
A method for determining a shape of a lumen in an anatomical structure comprises reading information from a plurality of strain sensors disposed substantially along a length of a flexible medical device when the flexible medical device is positioned in the lumen. When the flexible medical device is positioned in the lumen, the flexible medical device conforms to the shape of the lumen. The method further comprises computationally determining, by a processing system, the shape of the lumen based on the information from the plurality of strain sensors.
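The abstract above describes computing a lumen's shape from strain sensors distributed along a flexible device that conforms to the lumen. A minimal planar sketch of that computation (illustrative only; the sensor model, names, and planar simplification are assumptions, not details from the patent) converts each bending strain to a local curvature and integrates curvature along the device length to recover the shape:

```python
import numpy as np

def lumen_shape_2d(strains, spacing, gauge_offset):
    """Planar sketch: reconstruct a centerline from bending strains.

    strains:      bending strain at each sensor station.
    spacing:      arc length between adjacent stations.
    gauge_offset: distance of each strain gauge from the neutral axis,
                  so local curvature kappa = strain / gauge_offset.

    Integrating curvature along arc length gives the heading angle;
    integrating the heading gives the 2D centerline points.
    """
    kappa = np.asarray(strains, dtype=float) / gauge_offset
    # Heading at each station: running integral of curvature.
    theta = np.concatenate([[0.0], np.cumsum(kappa * spacing)])[:-1]
    # Position: running integral of the unit heading vector.
    x = np.cumsum(np.cos(theta) * spacing)
    y = np.cumsum(np.sin(theta) * spacing)
    return np.column_stack([x, y])
```

With zero strain everywhere the device is unbent, so the reconstructed centerline is a straight segment; nonzero strains bend it proportionally.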
Abstract:
A medical system may comprise a display system and a processor. The processor may be configured to generate a computer model of a plurality of instruments. The plurality of instruments may extend through and out of a distal end of an entry guide and may include an image capturing instrument. The processor may be further configured to cause an image of the computer model to be displayed on the display system and determine from a configuration of the plurality of instruments in the computer model if an event alert threshold associated with an event has been reached. If a determination is made that the event alert threshold has been reached, an event indicator may be displayed on the display system at a portion of the image of the computer model associated with the event.
Abstract:
A medical system may comprise a display and a processor configured to determine an optimal position for an image capturing instrument to view working ends of a plurality of medical instruments when the plurality of medical instruments and the image capturing instrument are each extending out of a distal end of an entry guide. The processor may also be configured to cause the optimal position to be displayed on the display along with an image captured by the image capturing instrument of the working ends of the plurality of medical instruments.
Abstract:
Methods and systems perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
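The abstract above describes combining non-endoscopically derived and endoscopically derived tool state information with a Bayesian filter. A one-dimensional scalar fusion (a minimal illustrative sketch, not the patented filter; the variance values and names are assumptions) shows the core idea, weighting each source by its inverse variance as a scalar Kalman update would:

```python
def fuse_estimates(x_kin, var_kin, x_img, var_img):
    """Minimal 1D illustration of Bayesian fusion of tool-state estimates.

    x_kin, var_kin: position estimate and variance derived from the
                    manipulator kinematics (non-endoscopic source).
    x_img, var_img: position estimate and variance derived from
                    endoscope image tracking (endoscopic source).

    Returns the fused estimate and its variance. Each source is weighted
    by its inverse variance, so the more certain source dominates.
    """
    w = var_img / (var_kin + var_img)
    x_fused = w * x_kin + (1.0 - w) * x_img
    var_fused = var_kin * var_img / (var_kin + var_img)
    return x_fused, var_fused
```

Note that the fused variance is always smaller than either input variance, which is why combining the two information streams yields a better tool-state estimate than either alone.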
Abstract:
In one embodiment of the invention, a minimally invasive surgical system is disclosed. The system is configured to capture and display camera images of a surgical site on at least one display device at a surgeon console; switch out of a following mode and into a masters-as-mice (MaM) mode; overlay a graphical user interface (GUI) including an interactive graphical object onto the camera images; and render a pointer within the camera images for user interactive control. In the following mode, the input devices of the surgeon console may couple motion into surgical instruments. In the MaM mode, the input devices interact with the GUI and interactive graphical objects. The pointer is manipulated in three dimensions by input devices having at least three degrees of freedom. Interactive graphical objects are related to physical objects in the surgical site or a function thereof and are manipulatable by the input devices.