Abstract:
An ultrasound imaging system according to the present disclosure may include an ultrasound transducer assembly comprising a plurality of apertures that are configured to transmit signals toward and receive signals from a region of interest (ROI) of a subject, a tracking sensor disposed within the subject and configured to move within the ROI, the sensor being responsive to signals transmitted by the apertures, and at least one processor in communication with the ultrasound transducer assembly and the tracking sensor. The at least one processor may be configured to generate a first image of a first portion of the ROI from signals received from at least one activated aperture, identify a position of the tracking sensor using signal data from the tracking sensor that corresponds to at least one signal transmitted by the apertures, and generate a second image of a second portion of the ROI from signals received from at least one other aperture activated based on the identified position, wherein the second portion of the ROI is different from the first portion of the ROI.
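A minimal sketch of the aperture-selection idea described above, not the disclosed implementation: given the identified position of the tracking sensor, the aperture nearest to it is chosen for the next acquisition. The aperture coordinates, the sensor position, and the nearest-aperture rule are illustrative assumptions.

```python
# Sketch only: choose which aperture to activate next from the tracked sensor position.
# Aperture centers and the selection rule are illustrative assumptions, not the disclosure.
import numpy as np

def select_aperture(aperture_centers: np.ndarray, sensor_position: np.ndarray) -> int:
    """Return the index of the aperture closest to the tracked sensor."""
    distances = np.linalg.norm(aperture_centers - sensor_position, axis=1)
    return int(np.argmin(distances))

# Example: four apertures along the transducer face (mm), sensor tracked at (12, 0, 30) mm.
apertures = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [20.0, 0.0, 0.0], [30.0, 0.0, 0.0]])
sensor = np.array([12.0, 0.0, 30.0])
print(select_aperture(apertures, sensor))  # -> 1, i.e. activate the second aperture
```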
Abstract:
An instrument driving mechanism includes an instrument drive assembly (140) including a first set of wheels (136) coupled to a first end portion and a second set of wheels (138) coupled to a second end portion opposite the first end portion. The first set of wheels is configured to engage an elongated instrument (104) therein such that a rotation plane of the first set of wheels is coplanar with a longitudinal axis of the instrument. The second set of wheels is configured to engage the elongated instrument therein such that a rotation plane of the second set of wheels is obliquely oriented with respect to the longitudinal axis of the instrument, wherein motion of the instrument is controlled by controlling rotations of the wheels. The instrument drive assembly mounts to a mounting position (149) of a medical device that permits the instrument to pass therethrough and is configured to fix a position of the instrument drive assembly to enable positioning of the instrument.
Abstract:
A method and system track a location of an object while the object is disposed within a region of interest within biological tissue, the location of the object being determined with respect to a tracking coordinate frame; generate acoustic images of the region of interest, the acoustic images being generated with respect to an acoustic image coordinate frame which is different from the tracking coordinate frame; transform the location of the object from the tracking coordinate frame to the acoustic image coordinate frame; and automatically adjust at least one image resolution parameter of the acoustic images in response to the location of the object with respect to the acoustic image coordinate frame.
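A hedged sketch of the coordinate transform and resolution adjustment described above, assuming a standard 4x4 homogeneous transform between the tracking frame and the acoustic image frame; the distance-based resolution rule and its threshold are illustrative assumptions, not taken from the disclosure.

```python
# Sketch only: map the tracked object location into the acoustic image frame and pick
# a resolution parameter from its distance to an ROI center (illustrative rule).
import numpy as np

def to_image_frame(p_tracking: np.ndarray, T_image_from_tracking: np.ndarray) -> np.ndarray:
    """Map a 3-D point from the tracking frame into the acoustic image frame."""
    p_h = np.append(p_tracking, 1.0)            # homogeneous coordinates
    return (T_image_from_tracking @ p_h)[:3]

def resolution_parameter(p_image: np.ndarray, roi_center: np.ndarray,
                         fine: float = 0.2, coarse: float = 0.8) -> float:
    """Illustrative rule: use finer resolution when the object nears the ROI center."""
    return fine if np.linalg.norm(p_image - roi_center) < 10.0 else coarse
```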
Abstract:
A classification-based medical image segmentation apparatus includes an ultrasound image acquisition device configured for acquiring, from ultrasound, an image depicting a medical instrument such as a needle; and machine-learning-based classification circuitry configured to use machine-learning-based classification to, dynamically responsive to the acquiring, segment the instrument by operating on information (212) derived from the image. The segmenting can be accomplished via statistical boosting (220) of parameters of wavelet features. Each pixel (216) of the image is identified as “needle” or “background.” The whole process of acquiring an image, segmenting the needle, and displaying an image with a visually enhanced and artifact-free needle-only overlay may be performed automatically and without the need for user intervention.
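A minimal per-pixel classification sketch of the general technique, not the patented pipeline: a boosted classifier labels every pixel as needle (1) or background (0). For brevity, raw intensity patches stand in for the wavelet features named in the abstract, and the toy image, patch size, and classifier settings are assumptions.

```python
# Sketch only: boosted per-pixel needle/background classification on local patch features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def patch_features(image: np.ndarray, half: int = 2) -> np.ndarray:
    """Stack a (2*half+1)^2 intensity patch around every interior pixel."""
    h, w = image.shape
    feats = []
    for y in range(half, h - half):
        for x in range(half, w - half):
            feats.append(image[y - half:y + half + 1, x - half:x + half + 1].ravel())
    return np.asarray(feats)

# Toy training data: a bright diagonal "needle" on a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.05, (32, 32))
mask = np.zeros((32, 32), dtype=int)
for i in range(32):
    img[i, i] += 0.8
    mask[i, i] = 1

X = patch_features(img)
y = mask[2:-2, 2:-2].ravel()
clf = AdaBoostClassifier(n_estimators=50).fit(X, y)   # statistical boosting step
pred = clf.predict(X).reshape(28, 28)                 # per-pixel needle/background map
```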
Abstract:
An apparatus (10) provides remote assistance to a local operator of a medical imaging device (2) disposed in a medical imaging device bay (3) via a communication link (14) from a remote location (4) that is remote from the medical imaging device bay to the medical imaging device bay. The apparatus includes a workstation (12) disposed at the remote location including at least one workstation display (24). At least one electronic processor (20) is programmed to, over the course of a medical imaging examination performed using the medical imaging device: extract successive image frames from video (17) or screen sharing (18) of a controller display (24′) of the medical imaging device; screen-scrape information related to the medical imaging examination from the successive image frames over the course of the medical imaging examination; maintain status information on the medical imaging examination at least in part using the screen-scraped information; and output an alert (30) perceptible at the remote location when the status information on the medical imaging examination satisfies an alert criterion.
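An illustrative sketch of the frame-extraction and screen-scraping loop only, assuming OpenCV for pulling frames and OCR for scraping text from the controller display; the video source, the use of OCR, and the alert phrase are assumptions rather than the apparatus itself.

```python
# Sketch only: extract frames from a shared screen recording, scrape text, and flag an
# alert when a status phrase appears. Path, OCR step, and criterion are assumptions.
import cv2                     # frame extraction
import pytesseract             # OCR-based screen scraping

def monitor_examination(video_path: str, alert_phrase: str = "EXAM COMPLETE") -> bool:
    capture = cv2.VideoCapture(video_path)
    status_log = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        text = pytesseract.image_to_string(frame)   # screen-scrape the controller display
        status_log.append(text)                     # maintain status information
        if alert_phrase in text:                    # alert criterion satisfied
            capture.release()
            return True
    capture.release()
    return False
```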
Abstract:
A system and medical device for the electromagnetic tracking of a medical instrument transported through the medical device. The medical device has a central axis and a channel that receives and transports a medical instrument through the medical device. The channel extends to a distal portion of the medical device and connects with an opening in the medical device that is not aligned with the central axis. The medical device includes a tracking component that is a plurality of coordinated electromagnetic sensors for generating a virtual axis of travel for the medical instrument, with the virtual axis passing through the opening of the device and being aligned with a tool insertion axis.
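A minimal sketch under the assumption that the virtual axis of travel can be approximated by a least-squares line fitted through the positions reported by the coordinated electromagnetic sensors; the sensor coordinates below are illustrative.

```python
# Sketch only: fit a virtual axis (point + unit direction) to reported sensor positions.
import numpy as np

def fit_virtual_axis(sensor_positions: np.ndarray):
    """Return a point on the axis and its unit direction from N x 3 sensor positions."""
    centroid = sensor_positions.mean(axis=0)
    _, _, vt = np.linalg.svd(sensor_positions - centroid)
    return centroid, vt[0]   # principal direction approximates the axis of travel

points = np.array([[0.0, 0.0, 0.0], [1.0, 0.1, 2.0], [2.0, 0.2, 4.1], [3.0, 0.3, 6.0]])
origin, direction = fit_virtual_axis(points)
```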
Abstract:
Systems, devices, and associated methods including: emitting ultrasound to the subject and receiving, in response, a current ultrasound view; matching the received image to a pre-existing image, such as a three-dimensional reference image; and, for user assistance, generating, based on the matching, feedback for guidance. The reference image may be a statistical atlas or it may be derived from patient-specific CT or MR scans. The pre-existing image may instead be a database image corresponding to a state in a state space. The feedback can be an image derived from the reference image; a graphic indication of a plane of the target view; the received view fused to an image derived from the reference image; or the received view and an image derived from the reference image, the derived image appearing concurrently and enhanced to spatially indicate where the received view registers to the reference image.
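A hedged sketch of the matching step only, assuming the pre-existing reference can be sampled into candidate 2-D slices and scored against the received view with normalized cross-correlation; the slicing, the similarity metric, and the feedback derived from the best match are assumptions.

```python
# Sketch only: score the current 2-D view against candidate reference slices and
# return the best match, from which guidance feedback could be derived.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two same-sized images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def match_view(current_view: np.ndarray, reference_slices: list[np.ndarray]) -> int:
    """Return the index of the reference slice most similar to the received view."""
    scores = [ncc(current_view, s) for s in reference_slices]
    return int(np.argmax(scores))
```

Feedback could then be built from the matched slice, for example by reporting its plane or by overlaying the matched reference slice on the live view.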
Abstract:
An interventional system employing an interventional tool (20) having a tracking point, and an imaging system (30) operable for generating at least one image of at least a portion of the interventional tool (20) relative to an anatomical region of a body. The system further employs a tracking system (40) operable for tracking any movements of the interventional tool (20) and the imaging system (30) within a spatial reference frame relative to the anatomical region of the body, wherein the tracking system (40) is calibrated to the interventional tool (20) and the imaging system (30), and a tracking quality monitor (52) operable for monitoring a tracking quality of the tracking system (40) as a function of a calibrated location error, for each image, between a calibrated tracking location of the tracking point within the spatial reference frame and an image coordinate location of the tracking point in the image.
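A minimal sketch of the quality metric suggested above, assuming the calibrated location error for an image is the Euclidean distance between the calibrated tracking location mapped into the image frame and the tracking point's image coordinate location; the per-image threshold is an illustrative assumption.

```python
# Sketch only: per-image calibrated location error and a simple quality check.
import numpy as np

def calibrated_location_error(p_tracked_image_frame: np.ndarray,
                              p_detected_in_image: np.ndarray) -> float:
    """Distance between the mapped tracking location and the image coordinate location."""
    return float(np.linalg.norm(p_tracked_image_frame - p_detected_in_image))

def tracking_quality_ok(errors_per_image: list[float], threshold_mm: float = 2.0) -> bool:
    """Flag degraded tracking when any per-image error exceeds the threshold."""
    return max(errors_per_image) <= threshold_mm
```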
Abstract:
The present invention relates to an ultrasound elastography system (10) for providing an elastography measurement result of an anatomical site (32) and to a corresponding method. The system (10) is configured to visualize to the user, within the ultrasound image (52), the suitability of the region of interest (33) for shear wave elastography and/or to recommend to the user an elastography acquisition plane (48, 50) for conducting shear wave elastography. In this way, proper selection of a location for an elastography measurement may be supported.
Abstract:
An ultrasound system includes a 3D imaging probe and a needle guide which attaches to the probe for guidance of the insertion of multiple needles into a volumetric region which can be scanned by the 3D imaging probe. The needle guide responds to the insertion of a needle through the guide by identifying a plane for scanning by the probe, which is the insertion plane through which the needle will pass during insertion. The orientation of the insertion plane is communicated to the probe to cause the probe to scan the identified plane and produce images of the needle as it travels through the insertion plane.