Abstract:
A system for boundary identification includes a memory (42) to store shear wave displacements through a medium as a displacement field including a spatial component and a temporal component. A directional filter (206, 208) filters the displacement field to provide a directional displacement field. A signal processing device (26) is coupled to the memory to execute a boundary estimator (214) to estimate a tissue boundary in a displayed image based upon a history of the directional displacement field accumulated over time.
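The directional filtering step described above lends itself to a frequency-domain treatment. The following is a minimal NumPy sketch of one common way such a filter can be realized, by masking the frequency-wavenumber spectrum of the space-time displacement data so that only waves travelling in a chosen lateral direction remain; the specific design of the filters (206, 208) is not detailed in the abstract, so this is illustrative only.

```python
import numpy as np

def directional_filter(disp, keep="left_to_right"):
    """Split a space-time displacement field u(x, t) into waves travelling
    in a single direction by masking its 2D Fourier spectrum.

    disp : 2D array, axis 0 = lateral position, axis 1 = time.
    """
    spec = np.fft.fftshift(np.fft.fft2(disp))
    nk, nf = spec.shape
    k = np.fft.fftshift(np.fft.fftfreq(nk))   # spatial frequency axis
    f = np.fft.fftshift(np.fft.fftfreq(nf))   # temporal frequency axis
    K, F = np.meshgrid(k, f, indexing="ij")

    # Under NumPy's FFT convention, a wave moving toward +x carries energy
    # where the spatial and temporal frequencies have opposite signs.
    if keep == "left_to_right":
        mask = (K * F) <= 0
    else:
        mask = (K * F) >= 0

    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))
```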
Abstract:
Systems and methods for image registration include an image feature detection module (116) configured to identify internal landmarks of a first image (110). An image registration and transformation module (118) is configured to compute a registration transformation, using a processor, to register a second image (112) with the first image based on surface landmarks, resulting in a registered image. A landmark identification module (120) is configured to overlay the internal landmarks onto the second image using the registration transformation, encompass each of the overlaid landmarks within a virtual object to identify corresponding landmark pairs in the registered image, and register the second image with the first image using the registered image together with the identified landmark pairs.
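As a rough illustration of the registration pipeline, the sketch below assumes a rigid (rotation plus translation) transform estimated by a least-squares fit over matched landmark pairs, and a simple "virtual sphere" rule for pairing overlaid internal landmarks with points detected in the second image. The transform model, the sphere-based pairing, and all function names are assumptions for illustration; the abstract does not specify them.

```python
import numpy as np

def rigid_registration(fixed_pts, moving_pts):
    """Least-squares rigid transform (R, t) mapping moving_pts onto fixed_pts,
    given N corresponding landmark pairs as (N, 3) arrays.
    Returns R (3x3) and t (3,) such that R @ moving + t ~= fixed."""
    mu_f = fixed_pts.mean(axis=0)
    mu_m = moving_pts.mean(axis=0)
    H = (moving_pts - mu_m).T @ (fixed_pts - mu_f)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_f - R @ mu_m
    return R, t

def pair_within_sphere(overlaid, detected, radius):
    """Pair each overlaid landmark with the nearest detected point that falls
    inside a virtual sphere of the given radius (otherwise leave it unpaired)."""
    pairs = []
    for i, p in enumerate(overlaid):
        d = np.linalg.norm(detected - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= radius:
            pairs.append((i, j))
    return pairs
```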
Abstract:
The invention relates to a calibration apparatus for calibrating a system for introducing an influencing element, such as a radiation source, into an object, particularly for calibrating a brachytherapy system. A first image shows an elongated introduction device (12), such as a catheter, and a tracking device (16), such as an electromagnetically trackable guidewire, inserted into the introduction device as far as possible; a second image shows the introduction device and a calibration element (46), which has the same dimensions as the influencing element and is likewise inserted into the introduction device as far as possible. A spatial relation between the tracking device and the calibration element is determined based on the images and is used for calibrating the system. Knowing this spatial relation allows an influencing plan, such as a brachytherapy treatment plan, to be determined accurately and the influencing element to be positioned accurately in accordance with that plan, which in turn allows the object to be influenced more accurately.
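In code, the calibration result can be thought of as an offset between the tracked guidewire tip and the tip of the calibration element, both imaged fully inserted in the same catheter. The sketch below is a hypothetical, simplified version of that idea; the tip coordinates, pixel spacing and function names are assumptions, since the abstract does not describe the image analysis itself.

```python
import numpy as np

def tip_offset(tracker_tip_px, calib_tip_px, mm_per_pixel):
    """Offset (mm) between the tracked guidewire tip and the calibration
    element tip, picked from the first and second image respectively and
    assumed to be expressed in a common catheter-aligned image frame."""
    return (np.asarray(calib_tip_px, float) - np.asarray(tracker_tip_px, float)) * mm_per_pixel

def source_position(tracked_pos_mm, offset_mm):
    """Apply the calibration: shift a tracked sensor position by the measured
    offset to estimate where the influencing element (source) will sit."""
    return np.asarray(tracked_pos_mm, float) + np.asarray(offset_mm, float)
```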
Abstract:
A system (200) and method (900): access (910) a 3D reference image of a region of interest (ROI) (10) in a subject which was obtained using a first 3D imaging modality; segment (920) the ROI in the 3D reference image to define (930) a reference 3D coordinate system; acquire (940) 2D acoustic images of the ROI without absolute spatial tracking; generate (950) standardized predicted poses for the 2D acoustic images with respect to a standardized 3D coordinate system (400) which was defined for a plurality of previously-obtained 3D images of corresponding ROIs in a plurality of other subjects, obtained using the first 3D imaging modality; convert (960) the standardized predicted poses for the 2D acoustic images from the standardized 3D coordinate system to reference predicted poses in the reference 3D coordinate system; and use (970) the reference predicted poses for the 2D acoustic images to register the 2D acoustic images to the 3D reference image.
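One way to express the pose conversion in step (960) is with homogeneous transforms: if each predicted pose is a 4x4 matrix in the standardized coordinate system, and the segmentation step yields the mapping from the standardized system to the subject-specific reference system, the conversion is a single matrix product. The sketch below makes those representational assumptions; the names T_std_from_img and T_ref_from_std are illustrative.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 pose."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def convert_pose(T_std_from_img, T_ref_from_std):
    """Re-express a 2D-image pose, predicted in the standardized coordinate
    system, in the subject's reference coordinate system."""
    return T_ref_from_std @ T_std_from_img
```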
Abstract:
Ultrasound imaging devices, systems, and methods are provided. An ultrasound imaging system includes a processor circuit in communication with an ultrasound probe comprising a transducer array. The processor circuit is configured to: receive, from the ultrasound probe, a first image of a patient's anatomy; detect, from the first image, a first anatomical landmark at a first location along a scanning trajectory of the patient's anatomy; determine, based on the first anatomical landmark, a steering configuration for steering the ultrasound probe towards a second anatomical landmark at a second location along the scanning trajectory; and output, to a display in communication with the processor circuit, an instruction based on the steering configuration to steer the ultrasound probe towards the second anatomical landmark at the second location.
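The steering configuration could be as simple as a direction and distance from the detected landmark to the expected location of the next landmark on the scanning trajectory. The helper below is a hypothetical sketch of that computation; the coordinate convention and the returned fields are assumptions, not the patent's actual output format.

```python
import numpy as np

def steering_instruction(first_lm_xy, second_lm_xy, step_mm=10.0):
    """Derive a probe-steering hint from the detected landmark position and
    the expected position of the next landmark (both in scan coordinates, mm)."""
    v = np.asarray(second_lm_xy, float) - np.asarray(first_lm_xy, float)
    dist = float(np.linalg.norm(v))
    angle_deg = float(np.degrees(np.arctan2(v[1], v[0])))
    return {
        "direction_deg": angle_deg,              # heading toward the next landmark
        "distance_mm": dist,                     # how far the probe should travel
        "suggested_steps": int(np.ceil(dist / step_mm)),
    }
```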
Abstract:
A system, method, and computer-readable medium are disclosed for extracting breathing pattern data from ultrasound images to aid downstream clinical patient management, including detecting a trigger event indicative of a breathing pattern from at least one of an audio-based trigger and an image-based trigger derived from an ultrasound video stream. The presently disclosed technique may further include identifying a breathing pattern in the ultrasound video stream responsive to detection of the trigger event. The presently disclosed technique may also include extracting at least one breathing-related parameter and generating a record containing the at least one breathing-related parameter.
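As an illustration of the parameter-extraction step, the sketch below assumes a per-frame displacement trace (for example of the diaphragm) has already been extracted from the ultrasound video, and estimates a breathing rate and mean amplitude from its peaks. The use of SciPy's peak detector and the 1.5 s minimum breath spacing are assumptions made for the example.

```python
import numpy as np
from scipy.signal import find_peaks

def breathing_parameters(displacement, fps):
    """Estimate simple breathing parameters from a per-frame displacement
    trace extracted from an ultrasound video stream.

    displacement : 1D array, one sample per video frame (arbitrary units).
    fps          : frame rate of the ultrasound stream in Hz.
    """
    x = displacement - np.mean(displacement)
    # Require successive breaths to be at least 1.5 s apart.
    peaks, _ = find_peaks(x, distance=max(1, int(1.5 * fps)))
    if len(peaks) < 2:
        return {"rate_bpm": None, "mean_amplitude": None}
    period_s = np.mean(np.diff(peaks)) / fps
    return {
        "rate_bpm": 60.0 / period_s,                 # breaths per minute
        "mean_amplitude": float(np.mean(x[peaks])),  # average peak excursion
    }
```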
Abstract:
Systems and methods for ultrasound image acquisition, tracking and review are disclosed. The systems can include an ultrasound probe coupled with at least one tracking device configured to determine a position of the probe based on a combination of ultrasound image data and probe orientation data. The image data can be used to determine a physical reference point and superior-inferior probe coordinates within a patient being imaged, which can be supplemented with the probe orientation data to determine lateral coordinates of the probe. A graphical user interface can display imaging zones corresponding to a scan protocol, along with an imaging status of each zone based at least in part on the probe position. Ultrasound images acquired by the systems can be tagged with spatial indicators and severity indicators, after which the images can be stored for later retrieval and expert review.
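The tagging and zone-tracking described above can be modelled with small data structures such as the hypothetical ones below; the field names, zone labels and severity scale are illustrative assumptions rather than the system's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class TaggedImage:
    """An acquired frame tagged with spatial and severity indicators."""
    image_id: str
    zone: str               # imaging zone from the scan protocol, e.g. "R1"
    probe_position: tuple   # (superior_inferior_mm, lateral_mm)
    severity: int           # reviewer or algorithm grade, e.g. 0-3

@dataclass
class ScanProtocol:
    """Tracks which protocol zones have been imaged so a GUI can shade them."""
    zones: list
    completed: set = field(default_factory=set)

    def record(self, img: TaggedImage):
        self.completed.add(img.zone)

    def status(self):
        return {z: ("done" if z in self.completed else "pending") for z in self.zones}
```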
Abstract:
The present disclosure describes ultrasound systems and methods configured to determine the elasticity of a target tissue. Systems can include an ultrasound transducer configured to acquire echoes responsive to ultrasound pulses transmitted toward the tissue, which may include a region of increased stiffness. Systems can also include a beamformer configured to control the transducer to transmit a push pulse into the tissue, thereby generating a shear wave in the region of increased stiffness. The beamformer can be configured to control the transducer to emit tracking pulses adjacent to the push pulse. Systems can further include a processor configured to determine a displacement amplitude of the shear wave and based on the amplitude, generate a qualitative tissue elasticity map of the tissue. The processor can combine the qualitative map with a quantitative map of the same tissue, and based on the combination, determine a boundary of the region of increased stiffness.
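A minimal sketch of the map combination follows, under the assumption (one plausible reading of the abstract) that the qualitative map is the normalized peak-to-peak displacement amplitude and that a stiff inclusion appears as a region of reduced amplitude. The threshold value, array layout and boundary rule are illustrative choices.

```python
import numpy as np

def elasticity_maps(displacement, quantitative_map, amp_threshold=0.5):
    """Combine a qualitative (amplitude-based) map with a quantitative map.

    displacement     : 3D array (x, z, t) of tracked shear-wave displacement.
    quantitative_map : 2D array (x, z), e.g. shear-wave speed per pixel.
    """
    # Qualitative map: normalized peak-to-peak displacement amplitude.
    amplitude = displacement.max(axis=-1) - displacement.min(axis=-1)
    qualitative = amplitude / amplitude.max()

    # Assumption for this sketch: the stiff region moves less, so low
    # normalized amplitude is taken to mark the region of increased stiffness.
    stiff_mask = qualitative < amp_threshold

    # Keep quantitative values only where the qualitative map agrees, and
    # take the edge of the mask as the estimated boundary.
    combined = np.where(stiff_mask, quantitative_map, np.nan)
    boundary = (stiff_mask ^ np.roll(stiff_mask, 1, axis=0)) | \
               (stiff_mask ^ np.roll(stiff_mask, 1, axis=1))
    return qualitative, combined, boundary
```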
Abstract:
A system and method for planning and performing an interventional procedure based on the spatial relationships between identified points. The system includes a storage device (102) having an image (104) which includes a plurality of targets (107). A spatial determination device (114) is configured to determine distances and/or orientation between each of the targets. The system is configured to compare the distances and generate a warning signal if at least one of the distances is less than a minimum threshold (128). An image generation device (116) is configured to generate a graphical representation for display to the user which shows the spatial information between a selected target with respect to the other targets. A planning device (126) is configured to modify or consolidate targets automatically or based on a user's input in order to more effectively plan or execute an interventional procedure.
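The distance check is straightforward to sketch: compute all pairwise target distances and flag any pair closer than the minimum threshold (128) as a candidate for the warning signal and for consolidation. The function below is a minimal NumPy version under those assumptions.

```python
import numpy as np

def check_target_spacing(targets_mm, min_spacing_mm):
    """Pairwise distances between planned targets, flagging pairs closer
    than the minimum threshold (candidates for a warning or consolidation).

    targets_mm : (N, 3) array of target coordinates in the planning image.
    """
    diff = targets_mm[:, None, :] - targets_mm[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    warnings = [
        (i, j, float(dist[i, j]))
        for i in range(len(targets_mm))
        for j in range(i + 1, len(targets_mm))
        if dist[i, j] < min_spacing_mm
    ]
    return dist, warnings
```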