Abstract:
An operator telerobotically controls tools to perform a procedure on an object at a work site while viewing real-time images of the work site on a display. Tool information is provided in the operator's current gaze area on the display by rendering the tool information over the tool so that it neither obscures objects currently being worked on by the tool nor requires the user's eyes to refocus when looking between the tool information and the image of the tool on a stereo viewer.
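One way to render the tool information at the tool's apparent depth on a stereo viewer, so the eyes need not refocus, is to draw the overlay at the stereo disparity that corresponds to the tool's depth. The abstract does not specify an implementation; the following is a minimal sketch using the standard pinhole-stereo relation, with all parameter values as assumptions:

```python
def overlay_disparity(tool_depth_m, baseline_m, focal_length_px):
    """Horizontal stereo disparity (pixels) at which to render an
    overlay so it appears at the same depth as the tool. Standard
    pinhole-stereo relation d = f * B / Z; units and values are
    illustrative assumptions, not from the patent."""
    return focal_length_px * baseline_m / tool_depth_m

# A tool 2 m away, 6 cm stereo baseline, 500 px focal length:
d = overlay_disparity(2.0, 0.06, 500.0)
```

Rendering the left- and right-eye copies of the overlay offset by this disparity places it at the tool's perceived depth; a nearer tool yields a larger disparity.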
Abstract:
A minimally-invasive surgical system includes a slave surgical instrument having a slave surgical instrument tip and a master grip. The slave surgical instrument tip has an alignment in a common frame of reference, and the master grip, which is coupled to the slave surgical instrument, has an alignment in the common frame of reference. An alignment error, in the common frame of reference, is the difference between the alignment of the slave surgical instrument tip and the alignment of the master grip. A ratcheting system is (i) coupled to the master grip to receive the alignment of the master grip and (ii) coupled to the slave surgical instrument to control motion of the slave surgical instrument by continuously reducing the alignment error as the master grip moves, without autonomous motion of the slave surgical instrument tip and without autonomous motion of the master grip.
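The key property of the ratcheting described above is that the alignment error shrinks only in proportion to how far the master grip moves: if the master is still, neither side moves on its own. A minimal one-dimensional sketch of that update rule (the single-angle model and the gain are assumptions, not the patent's implementation):

```python
def ratchet_step(slave_angle, master_angle, master_motion, gain=0.5):
    """One ratcheting update on a single alignment angle. The error
    between master and slave shrinks by an amount proportional to the
    master's motion this step; with zero motion, nothing changes
    (no autonomous motion on either side)."""
    error = master_angle - slave_angle
    step = min(gain * abs(master_motion), abs(error))
    correction = step if error >= 0 else -step
    new_master = master_angle + master_motion
    # Slave follows the master's motion plus the ratcheted correction.
    new_slave = slave_angle + master_motion + correction
    return new_slave, new_master
```

Iterating this as the operator moves gradually drives the error to zero without the slave tip ever moving while the master is at rest.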
Abstract:
A system comprises a processor and a memory having computer readable instructions stored thereon, which, when executed by the processor, cause the system to display a surgical environment image, which includes an image from an imaging system and an interaction image. The interaction image displays a body part of a user and an input control device. The instructions further cause the system to display, in the interaction image, a movement of the body part as the body part interacts with the input control device. The movement causes the body part to actuate the input control device. The instructions further cause the system to receive an input from the input control device in response to the actuation of the input control device. The instructions further cause the system to adjust a setting or a position of a component of a surgical system based on the received input.
Abstract:
A method and apparatus for manipulating tissue. A tissue control point is displayed over an image of the tissue in a user interface. An input is received that moves the tissue control point within the user interface. A first instrument that is physically associated with the tissue is operated based on the received input to thereby manipulate the tissue.
Abstract:
Techniques for registering a computer-assisted device to a table include a computer-assisted device having an articulated arm and a control unit. The control unit is configured to receive information of a first motion of a table, the first motion comprising a first rotation about a first axis and causing a second motion of a point associated with the articulated arm; receive information of the second motion; receive information of a third motion of the table, the third motion of the table comprising a second rotation about a second axis different from the first axis and causing a fourth motion of the point; receive information of the fourth motion; determine a relationship between the computer-assisted device and the table based on a position of the point, the first rotation, the second motion, the second rotation, and the fourth motion; and control motion of the articulated arm based on the relationship.
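A geometric intuition for the registration above: when the table rotates about an axis, every displacement of the tracked point is perpendicular to that axis, so each of the two rotations reveals one axis direction of the table in the device frame. A hedged sketch of that axis estimation (the SVD null-space approach is an assumption, not the patent's method):

```python
import numpy as np

def estimate_axis(displacements):
    """Estimate a rotation-axis direction from displacements of a
    point observed during a rotation: each displacement is
    perpendicular to the axis, so the axis is the direction least
    excited by the displacement set (smallest right singular
    vector). Sign is ambiguous."""
    D = np.asarray(displacements, float)
    _, _, vt = np.linalg.svd(D)
    return vt[-1]
```

Running this on the point motions from the two rotations gives two independent table-axis directions in the device frame, which, together with the point's position, constrain the device-to-table relationship.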
Abstract:
A system and method of view restoration include a computer-assisted device having an imaging device and a controller coupled to the imaging device. The controller is configured to record kinematic information, imaging information, or both the kinematic information and the imaging information before movement of the imaging device from a first repositionable arm to a second repositionable arm or from a first workspace port to a second workspace port; detect the movement of the imaging device from the first repositionable arm to the second repositionable arm or from the first workspace port to the second workspace port; determine, in response to the detection, a desired position and orientation of the imaging device based on the recorded kinematic information, the recorded imaging information, or both the recorded kinematic information and the recorded imaging information; and move the imaging device based on the desired position and orientation.
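The pose bookkeeping behind the view restoration above can be sketched as follows: record the imaging device's pose in a common world frame before the move, then re-express that pose relative to the new arm's (or port's) base to obtain the target for the repositioning. This is only the transform arithmetic, not the full kinematic solve; the 4x4 homogeneous-transform convention is an assumption:

```python
import numpy as np

def restore_view_pose(recorded_camera_world, new_base_world):
    """Desired camera pose in the new mount's base frame: the recorded
    world-frame camera pose re-expressed relative to the new base.
    Both arguments are 4x4 homogeneous transforms in the world frame."""
    return np.linalg.inv(new_base_world) @ recorded_camera_world
```

Commanding the second arm to this pose reproduces the view the operator had before the imaging device was moved.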
Abstract:
A system comprises a display configured to present a graphical user interface including a viewing area and one or more user interface elements. The system also comprises a gaze tracking device configured to detect a change in a gaze of a user while the user views the graphical user interface presented on the display. The system also comprises an audio input device configured to receive audio information. The system also comprises one or more processors configured to process the audio information received at the audio input device in accordance with a first mode to direct the received audio information as audio output to one or more audio output devices; and in response to the gaze tracking device detecting the change in the gaze of the user, process the received audio information in accordance with a second mode, the second mode being distinct from the first mode.
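The mode switch described above amounts to a small router: audio is passed through to the output devices until the gaze tracker reports a change, after which chunks are handled by a distinct second mode (for example, treated as voice input). A minimal sketch; the handler names and the voice-command interpretation are assumptions:

```python
class GazeAudioRouter:
    """Routes audio per the current mode. First mode: direct the audio
    to the output devices. After a gaze change is reported, switch to
    a distinct second mode (handlers supplied by the caller)."""

    def __init__(self, passthrough, second_mode_handler):
        self.passthrough = passthrough
        self.second_mode_handler = second_mode_handler
        self.second_mode = False

    def on_gaze_change(self):
        # The gaze tracking device calls this when the gaze changes.
        self.second_mode = True

    def process(self, chunk):
        handler = self.second_mode_handler if self.second_mode else self.passthrough
        return handler(chunk)
```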
Abstract:
A system comprises a medical tool including a shaft having proximal and distal ends and an articulatable distal portion coupled to the distal end of the shaft. The system also comprises a processing unit including one or more processors. The processing unit is configured to determine a target in a medical environment. The articulatable distal portion is directed toward the target. The processing unit is also configured to determine a motion of at least a portion of the shaft, and in response to the determined motion, control a pose of the articulatable distal portion so that the articulatable distal portion remains directed toward the target.
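Keeping the articulatable distal portion directed toward the target reduces to recomputing, after each detected shaft motion, the direction from the tip to the target and realizing it with the distal joints. A hedged sketch; the pan/tilt joint model and the z-forward convention are assumptions:

```python
import numpy as np

def aim_direction(tip_position, target):
    """Unit vector from the (possibly moved) distal tip to the target;
    re-evaluated after each detected shaft motion."""
    v = np.asarray(target, float) - np.asarray(tip_position, float)
    return v / np.linalg.norm(v)

def pan_tilt(d):
    """Hypothetical pan/tilt joint angles realizing pointing
    direction d, assuming z-forward for the distal portion."""
    pan = np.arctan2(d[0], d[2])
    tilt = np.arctan2(d[1], np.hypot(d[0], d[2]))
    return pan, tilt
```

If the shaft translates sideways, the recomputed pan angle swings the distal portion back toward the target, so it stays directed at it.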
Abstract:
A teleoperational system comprises a teleoperational assembly configured to support an instrument and an imaging device. The instrument has an instrument tip. The system also comprises a processing unit including one or more processors. The processing unit is configured to determine an instrument position of the instrument, determine an instrument position error relative to the imaging device, and determine, based on the instrument position and the instrument position error, that at least a portion of the instrument is outside a field of view of the imaging device. The processing unit is further configured to, in response to determining that at least a portion of the instrument is outside the field of view of the imaging device and based on a mode of operation of the teleoperational system, cause an out-of-view indication for the instrument to be presented.
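Because the determination above uses both the instrument position and its position error, a natural test is conservative: inflate the estimated tip position by the error before checking it against the field of view. A planar sketch under that assumption (rectangular image-plane bounds; the real geometry is a camera frustum):

```python
def maybe_out_of_view(tip_xy, error_radius, half_width, half_height):
    """Conservative in-image test: the estimated tip position,
    inflated by its position-error radius, must lie fully inside the
    rectangular field of view; otherwise part of the instrument may
    be out of view and an indication should be presented."""
    x, y = tip_xy
    return (abs(x) + error_radius > half_width or
            abs(y) + error_radius > half_height)
```

Whether the indication is actually presented would then additionally depend on the system's mode of operation, as the abstract states.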
Abstract:
A computer-assisted device includes an articulated arm configured to support an end effector and a control unit. When coupled to the articulated arm and a table, the control unit is configured to detect movement of the articulated arm caused by movement of the table, determine a movement of the table based on motion data received from the table, and drive one or more first joints of the articulated arm based on the movement of the articulated arm and the determined movement of the table.
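Driving the arm joints from the determined table movement can be posed as a differential-kinematics problem: solve for the joint update that reproduces the table's Cartesian displacement at the end effector. A hedged sketch of that step (a rigid-following policy via least squares; real systems also blend in the sensed arm motion):

```python
import numpy as np

def follow_table(jacobian, table_delta):
    """Least-squares joint update dq solving J @ dq = dx, where dx is
    the table's Cartesian displacement, so the end effector moves
    with the table. jacobian maps joint motion to end-effector
    motion; this is an illustrative policy, not the patent's."""
    dq, *_ = np.linalg.lstsq(jacobian, np.asarray(table_delta, float),
                             rcond=None)
    return dq
```

For example, if the table rises 5 cm and the Jacobian is identity, the commanded joint update raises the end effector by the same 5 cm.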