Abstract:
The invention relates to an assisting apparatus (2) for assisting a user in moving an insertion element (11), such as a catheter, to a target element within, for instance, a person (8). A target element representation, representing the target element's three-dimensional position, three-dimensional orientation and size within the object, is generated based on a provided target element image. Moreover, a three-dimensional position of the insertion element is tracked while the insertion element is moved to the target element, and the target element representation and the tracked position of the insertion element are displayed. The three-dimensional position and orientation of the target element relative to the actual position of the insertion element can therefore be shown to the user while the insertion element is moved to the target element, which allows the user to move the insertion element to the target element more accurately and more quickly.
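The guidance described above amounts to continuously relating the tracked tip position to the target's pose. As an illustrative sketch only (the function name and overlay quantities are not from the patent), one might compute the distance to the target and the angular deviation from the target's orientation axis for display:

```python
import math

def guidance_readout(tip_pos, target_pos, target_axis):
    """Distance from a tracked insertion-element tip to the target, and the
    angle (degrees) between the tip-to-target direction and the target's
    orientation axis. Purely illustrative; names are hypothetical."""
    d = [t - p for t, p in zip(target_pos, tip_pos)]
    dist = math.sqrt(sum(c * c for c in d))
    if dist == 0.0:
        return 0.0, 0.0
    unit = [c / dist for c in d]
    # Clamp to guard against floating-point overshoot before acos.
    cosang = max(-1.0, min(1.0, sum(u * a for u, a in zip(unit, target_axis))))
    return dist, math.degrees(math.acos(cosang))
```

A display unit could refresh such a readout on every tracking update, letting the user correct course as the distance and angle shrink.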
Abstract:
A volume mapping instrument (20), deployable within a partially or completely enclosed anatomical volume, employs one or more medical tools (40), each medical tool (40) being transitionable between a deployable structural configuration, for orderly positioning of the medical tool (40) within the anatomical volume, and a mapping structural configuration, for anchoring the medical tool (40) against the boundary of the anatomical volume. The volume mapping instrument (20) further employs an optical shape sensor (30) to generate one or more encoded optical signals indicative of a shape of the boundary of the anatomical volume in response to each medical tool (40) being transitioned from the deployable structural configuration to the mapping structural configuration within the anatomical volume. Based on the encoded optical signal(s), a volume mapping module (51) is utilized to map a portion or an entirety of the boundary of the anatomical volume.
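Conceptually, once the tools are anchored against the boundary, the volume mapping module has a set of boundary contact points to work from. As a hypothetical sketch (the patent does not specify the mapping math), a coarse map could be the centroid of the contact points together with their radial distances:

```python
import math

def map_boundary(contact_points):
    """Coarse boundary map from tool-anchor contact points: the centroid of
    the points and each point's radial distance from it. An illustrative
    stand-in for the volume mapping module's computation."""
    n = len(contact_points)
    centroid = [sum(p[i] for p in contact_points) / n for i in range(3)]
    radii = [math.dist(p, centroid) for p in contact_points]
    return centroid, radii
```

More anchored tools (or repeated anchor-and-release cycles) would densify the point set and sharpen the boundary estimate.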
Abstract:
An interactive holographic display system includes a holographic generation module configured to display a holographically rendered anatomical image. A localization system is configured to define a monitored space on or around the holographically rendered anatomical image. One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.
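The trigger condition described above reduces to testing whether any point of a monitored object falls inside the monitored space. As a minimal sketch, assuming an axis-aligned box as the monitored space (the patent does not prescribe its geometry):

```python
def coincides(obj_points, space_min, space_max):
    """True if any monitored-object point lies inside the axis-aligned
    monitored space [space_min, space_max]; an illustrative stand-in
    for the localization system's coincidence test."""
    return any(
        all(lo <= c <= hi for c, lo, hi in zip(p, space_min, space_max))
        for p in obj_points
    )
```

When this returns True, the system could fire the corresponding response in the holographically rendered anatomical image, e.g. highlighting or deforming the touched region.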
Abstract:
A medical system for shape sensing by interacting with a shape sensing element (12) configured to perform a shape sensing measurement of an interventional device (14) and by interacting with an image data generation unit (28) for generating image data (21) is provided. The medical system includes a shape sensing console (16) in communication with the shape sensing element (12) for generating measurement signals of the shape sensing measurement; a shape reconstruction unit (18) in communication with the shape sensing console for reconstructing a shape (19) of the shape sensing element based on the generated measurement signals; a receiving unit (22) for receiving a position and/or orientation of an apparatus (26, 26′) including the image data generation unit (28); a coordinate transformation unit (20) for registering the reconstructed shape (19) by representing the reconstructed shape (19) in a coordinate system based on the received orientation of the apparatus (26, 26′); and a connector unit (32) for connecting the shape sensing console (16) and/or said shape reconstruction unit (18) to the shape sensing element (12), the connector unit (32) being detachably connectable to a housing (30) of the apparatus (26, 26′).
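The coordinate transformation unit's role above is to express the reconstructed shape in the apparatus's coordinate system. A minimal sketch, assuming the apparatus pose is reported as a 3×3 rotation matrix R and a translation t (the patent does not specify the representation):

```python
def register_shape(shape_points, R, t):
    """Express reconstructed shape points in the apparatus coordinate
    system: p' = R @ p + t for each point. Illustrative only; the pose
    parameterization is an assumption."""
    return [
        [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
        for p in shape_points
    ]
```

With the shape registered this way, it can be overlaid directly on the image data (21) generated by the same apparatus.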
Abstract:
The invention relates to a registration system (13) for registering an imaging device (2), such as an x-ray C-arm device, with a tracking device (4), such as an optical shape sensing tracking device. An object detection unit (5) is adapted to a) allow a user to add markers to an image of an object for indicating arbitrary positions on the object and b) provide a representation of the object based on the added markers, wherein a registration unit (6) determines registration parameters based on a location of the object tracked by the tracking device and on the location of the representation. This kind of registration can be performed with any object visible to the user in the image, thereby permitting a wide variety of objects to be used for the registration.
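Determining registration parameters from marker positions and tracked positions is, at its simplest, an alignment of two point sets. As an illustrative sketch handling only the translational part (a full registration would also solve for rotation, e.g. via the Kabsch algorithm; the patent does not specify the method):

```python
def estimate_translation(marker_pts_image, tracked_pts):
    """Translation-only registration: the offset between the centroid of
    the tracked point set and the centroid of the image-marker point set.
    A hypothetical sketch of one registration parameter."""
    n = len(marker_pts_image)
    return [
        sum(t[i] for t in tracked_pts) / n
        - sum(m[i] for m in marker_pts_image) / n
        for i in range(3)
    ]
```

Because the markers can be placed at arbitrary user-chosen positions, any object visible in the image can supply the correspondences this estimate needs.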