Abstract:
A multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures includes first and second master input devices, a first slave robotic mechanism, and at least one processor configured to generate a first slave command for the first slave robotic mechanism by switchably using one or both of a first command indicative of manipulation of the first master input device by a first user and a second command indicative of manipulation of the second master input device by a second user. To facilitate the collaboration or training, both first and second users communicate with each other through an audio system and see the minimally invasive surgery site on first and second displays respectively viewable by the first and second users.
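The switchable use of one or both master commands can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the 6-DOF vector representation, the mode names, and the weighted blend used for shared control (the abstract says only that one or both commands are switchably used).

```python
import numpy as np

def slave_command(cmd_first, cmd_second, mode, blend_weight=0.5):
    """Generate a slave command from one or both master commands.

    cmd_first, cmd_second: 6-DOF command vectors (x, y, z, roll, pitch, yaw).
    mode: 'first', 'second', or 'shared' -- which master(s) drive the slave.
    blend_weight: weight on the first user's command in 'shared' mode.
    (The weighted blend is an illustrative choice, not from the abstract.)
    """
    if mode == "first":
        return np.asarray(cmd_first, dtype=float)
    if mode == "second":
        return np.asarray(cmd_second, dtype=float)
    if mode == "shared":
        a = np.asarray(cmd_first, dtype=float)
        b = np.asarray(cmd_second, dtype=float)
        return blend_weight * a + (1.0 - blend_weight) * b
    raise ValueError(f"unknown mode: {mode}")

# Example: a trainer (second user) gradually hands control to a trainee.
trainee = [1.0, 0.0, 0.0, 0.0, 0.0, 0.1]
trainer = [0.8, 0.1, 0.0, 0.0, 0.0, 0.0]
print(slave_command(trainee, trainer, "shared", blend_weight=0.7))
```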
Abstract:
Non-white light from an endoscope (201) of a teleoperated surgical system (200) is used to illuminate a surgical site (203). A camera (220L) captures an image of the surgical site, and the image is displayed on a monitor (251). The non-white light illumination reduces noise in the images of the surgical site presented on the monitor relative to images captured using white light illumination and displayed on the monitor.
Abstract:
A robotic system is provided. The robotic system includes a publishing node including at least one first synchronization database that includes a plurality of attributes, each attribute including a tag identifying the attribute and data; a flag associated with each of the attributes; and a subscriber list. The system also includes a subscriber node including at least one second synchronization database. The publishing node is configured to set the flag associated with an attribute when the attribute is written in the at least one first synchronization database or when the data included in the attribute are modified, and to publish the flagged attributes to the subscriber node.
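The flag-and-publish cycle can be sketched concretely. The class and field names below are assumptions for illustration; the abstract specifies only tagged attributes, per-attribute flags, a subscriber list, and publication of flagged attributes.

```python
class SyncDatabase:
    """Synchronization database keyed by attribute tag (illustrative)."""
    def __init__(self):
        self.attributes = {}   # tag -> data
        self.flags = {}        # tag -> dirty flag

    def write(self, tag, data):
        # Set the flag whenever an attribute is written or its data modified.
        self.attributes[tag] = data
        self.flags[tag] = True


class PublishingNode:
    def __init__(self):
        self.db = SyncDatabase()          # the first synchronization database
        self.subscribers = []             # the subscriber list

    def publish(self):
        # Push every flagged attribute to each subscriber, then clear flags.
        for tag, dirty in list(self.db.flags.items()):
            if dirty:
                for sub in self.subscribers:
                    sub.db.attributes[tag] = self.db.attributes[tag]
                self.db.flags[tag] = False


class SubscriberNode:
    def __init__(self):
        self.db = SyncDatabase()          # the second synchronization database


pub, sub = PublishingNode(), SubscriberNode()
pub.subscribers.append(sub)
pub.db.write("joint_angles", [0.1, 0.4, -0.2])
pub.publish()
print(sub.db.attributes["joint_angles"])  # [0.1, 0.4, -0.2]
```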
Abstract:
An endoscope with a stereoscopic optical channel is held and positioned by a robotic surgical system. From the light received through the stereoscopic optical channel, a capture unit captures (1) a visible first image and (2) a visible second image combined with a fluorescence second image. An intelligent image processing system receives (1) the visible first image and (2) the visible second image combined with the fluorescence second image, and generates at least one fluorescence image of a stereoscopic pair of fluorescence images and a visible second image. An augmented stereoscopic display system outputs a real-time stereoscopic image including a three-dimensional presentation including, in one eye, a blend of the at least one fluorescence image of the stereoscopic pair of fluorescence images and one of the visible first and second images and, in the other eye, the other of the visible first and second images.
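The per-eye composition can be sketched as follows. The linear alpha blend and the function name are assumptions; the abstract specifies only that one eye receives a blend of a fluorescence image with one visible image while the other eye receives the other visible image.

```python
import numpy as np

def compose_stereo(visible_left, visible_right, fluorescence, alpha=0.4,
                   fluoresced_eye="left"):
    """Compose the per-eye images of the augmented stereoscopic display.

    One eye receives a blend of a fluorescence image with one visible image;
    the other eye receives the other visible image unchanged. The linear
    alpha blend is an assumed choice of blending operator.
    """
    base = visible_left if fluoresced_eye == "left" else visible_right
    blended = (1.0 - alpha) * base + alpha * fluorescence
    if fluoresced_eye == "left":
        return blended, visible_right
    return visible_left, blended

# Toy 2x2 grayscale "images" just to show the data flowing through.
vl = np.full((2, 2), 0.5)
vr = np.full((2, 2), 0.6)
fl = np.full((2, 2), 1.0)  # bright fluorescence response
left_eye, right_eye = compose_stereo(vl, vr, fl)
print(left_eye, right_eye, sep="\n")
```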
Abstract:
An operator telerobotically controls tools to perform a procedure on an object at a work site while viewing real-time images of the object, tools and work site on a display. Tool information is provided by filtering a part of the real-time images for enhancement or degradation to indicate a state of a tool and displaying the filtered images on the display.
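One way to realize the filtering step is to enhance or degrade only the pixels in the tool's region of the image. The mask-based approach, the state names, and the gain values below are all illustrative assumptions.

```python
import numpy as np

def indicate_tool_state(image, tool_mask, state):
    """Filter only part of the real-time image to indicate a tool state.

    image: float array in [0, 1]; tool_mask: boolean array, True where the
    tool appears. Brightening for 'active' and dimming for 'fault' are
    assumed examples of enhancement and degradation.
    """
    out = image.copy()
    if state == "active":
        out[tool_mask] = np.clip(out[tool_mask] * 1.5, 0.0, 1.0)  # enhance
    elif state == "fault":
        out[tool_mask] = out[tool_mask] * 0.3                     # degrade
    return out

frame = np.random.default_rng(0).random((4, 4))
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
print(indicate_tool_state(frame, mask, "fault"))
```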
Abstract:
An exemplary surgical instrument tracking system includes at least one physical computing device that determines, based on endoscopic imagery of a surgical area and using a trained neural network, an observation for an object of interest depicted in the endoscopic imagery; associates, based on a probabilistic framework and kinematics of a robotically-manipulated surgical instrument located at the surgical area, the observation for the object of interest with the robotically-manipulated surgical instrument; and determines a physical position of the robotically-manipulated surgical instrument at the surgical area based on the kinematics of the robotically-manipulated surgical instrument and the observation associated with the robotically-manipulated surgical instrument.
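The association step can be sketched with a simple gated nearest-neighbor rule. The squared-distance gate with unit covariance below is an assumed stand-in for the patent's probabilistic framework; the data and identifiers are illustrative.

```python
import numpy as np

def associate(observations, kinematic_estimates, gate=9.21):
    """Associate each detected observation with the closest instrument.

    observations: list of 3-D positions from an (assumed) neural-network
    detector; kinematic_estimates: dict instrument_id -> position predicted
    from the robot's kinematics. Observations farther than the gate from
    every predicted position are left unassociated.
    """
    pairs = {}
    for i, obs in enumerate(observations):
        best_id, best_d2 = None, gate
        for inst_id, pred in kinematic_estimates.items():
            d2 = float(np.sum((np.asarray(obs) - np.asarray(pred)) ** 2))
            if d2 < best_d2:
                best_id, best_d2 = inst_id, d2
        if best_id is not None:
            pairs[i] = best_id
    return pairs

obs = [[0.02, 0.01, 0.10], [0.25, 0.24, 0.09]]
kin = {"tool_A": [0.0, 0.0, 0.1], "tool_B": [0.25, 0.25, 0.1]}
print(associate(obs, kin))  # {0: 'tool_A', 1: 'tool_B'}
```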
Abstract:
A teleoperated surgical system is provided comprising: a first robotic surgical instrument; an image capture device; a user display; a user input command device coupled to receive user input commands to control movement of the first robotic surgical instrument; and a movement controller coupled to scale a rate of movement of the first robotic surgical instrument, based at least in part upon the surgical skill level of the user providing the received user input commands at using the first robotic surgical instrument, from the rate of movement indicated by the user input commands received at the user input command device.
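A minimal sketch of the movement controller's scaling follows. The linear mapping from skill level to a scale factor in [0.2, 1.0], the 1-to-5 skill scale, and the units are all assumptions; the abstract says only that the rate is scaled based at least in part on skill level.

```python
def scaled_rate(commanded_rate, skill_level, max_skill=5):
    """Scale the commanded instrument rate by the user's skill level.

    commanded_rate: rate of movement indicated by the user input commands.
    skill_level: integer 1..max_skill. The linear skill-to-factor mapping
    is an illustrative assumption.
    """
    factor = 0.2 + 0.8 * (skill_level - 1) / (max_skill - 1)
    return commanded_rate * factor

print(scaled_rate(10.0, 1))  # novice: 2.0 for a commanded rate of 10.0
print(scaled_rate(10.0, 5))  # expert: 10.0, passed through unscaled
```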
Abstract:
An apparatus may configure an illuminator to illuminate a scene with non-white light and control a camera to capture, in a plurality of color channels of the camera, a frame of the scene illuminated with the non-white light. The apparatus may adjust a signal of a color channel of the camera in the frame of the scene based on the non-white light.
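The per-channel adjustment can be sketched as a gain correction derived from the illuminant. The von Kries-style division by the illuminant's relative channel power is an assumed choice; the abstract says only that a channel's signal is adjusted based on the non-white light.

```python
import numpy as np

def adjust_channels(frame, illuminant_rgb):
    """Normalize each color channel of a frame for a non-white illuminant.

    frame: H x W x 3 array of raw channel signals in [0, 1].
    illuminant_rgb: relative per-channel power of the non-white light.
    Dividing each channel by its relative illuminant power (an assumed,
    von Kries-style gain) compensates the color cast of the illumination.
    """
    gains = np.max(illuminant_rgb) / np.asarray(illuminant_rgb, dtype=float)
    return np.clip(frame * gains, 0.0, 1.0)

# Example: illumination weak in blue -> blue channel gets the largest gain.
raw = np.full((2, 2, 3), 0.4)
print(adjust_channels(raw, illuminant_rgb=[1.0, 0.9, 0.5]))
```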
Abstract:
Methods and systems perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, by sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or by external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
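The fusion of the two information sources can be sketched with a one-dimensional Kalman update, a simple instance of a Bayesian filter. The scalar state, the variance values, and the measurement ordering are illustrative assumptions.

```python
def kalman_fuse(prior_mean, prior_var, measurements):
    """Fuse tool-state measurements from several sources (1-D sketch).

    measurements: list of (value, variance) pairs, e.g. one value derived
    from manipulator sensor data and one derived from endoscope images.
    A scalar Kalman update stands in for the more general Bayesian filter.
    """
    mean, var = prior_mean, prior_var
    for z, r in measurements:
        k = var / (var + r)           # Kalman gain
        mean = mean + k * (z - mean)  # corrected state estimate
        var = (1.0 - k) * var         # reduced uncertainty after the update
    return mean, var

# Kinematics puts the tool tip near 10.0 (noisy); vision reports 10.6.
print(kalman_fuse(prior_mean=9.5, prior_var=4.0,
                  measurements=[(10.0, 1.0), (10.6, 0.25)]))
```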