Abstract:
A system and method is provided for operating an orthopedic system. The system includes a load sensor for measuring an applied pressure associated with a force load on an anatomical joint, and an ultrasonic device for creating a low-power, short-range ultrasonic sensing field in proximity to the load sensor for assessing alignment. The system can adjust the strength and range of the ultrasonic sensing field according to position, and it can report audible and visual information associated with the force load and alignment. Other embodiments are disclosed.
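As a rough illustration of the field adjustment, a minimal Python sketch follows. It assumes the transmit amplitude simply scales with the tracked distance to the load sensor; the function and parameter names (adjust_field, max_range_m, and so on) are invented for illustration and do not come from the abstract.

    def adjust_field(distance_m, min_power=0.1, max_power=1.0, max_range_m=0.5):
        """Scale ultrasonic transmit power with distance to the load sensor.

        Hypothetical sketch: power grows quadratically with distance to offset
        spherical spreading loss, clamped to the device's operating range.
        """
        if distance_m >= max_range_m:
            return max_power
        return min_power + (max_power - min_power) * (distance_m / max_range_m) ** 2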
Abstract:
A portable measurement system is provided including a probe, a user interface control, and a receiver. The probe includes a plurality of ultrasonic transducers that emit ultrasonic waveforms to create a three-dimensional sensing space. The user interface control captures a location and position of the probe in the three-dimensional sensing space. The receiver includes a plurality of microphones that capture the ultrasonic waveforms transmitted from the probe, and a digital signal processor that digitally samples the captured waveforms and tracks a relative location and movement of the probe with respect to the receiver in the three-dimensional ultrasonic sensing space from time-of-flight waveform analysis. Embodiments are demonstrated with respect to hip replacement surgery, but other embodiments are contemplated.
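One plausible core of the time-of-flight analysis is trilateration: each microphone's time of flight gives a range, and the ranges are intersected to recover the probe position. The sketch below is a standard linearized least-squares solution, not necessarily the patent's method; the microphone layout and names are assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

    def trilaterate(mic_positions, tofs):
        """Estimate the emitter position from times of flight at >= 4 microphones.

        Subtracting the first range equation from the others turns the
        quadratic system ||x - p_i||^2 = r_i^2 into a linear one in x.
        mic_positions: (N, 3) array of microphone coordinates; tofs in seconds.
        """
        p = np.asarray(mic_positions, dtype=float)
        r = SPEED_OF_SOUND * np.asarray(tofs, dtype=float)
        A = 2.0 * (p[1:] - p[0])
        b = (r[0] ** 2 - r[1:] ** 2) + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x

    # Round-trip check with four microphones and a simulated probe position.
    mics = [(0.0, 0.0, 0.0), (0.2, 0.0, 0.0), (0.0, 0.2, 0.0), (0.0, 0.0, 0.2)]
    true_pos = np.array([0.05, 0.10, 0.30])
    tofs = [np.linalg.norm(true_pos - np.array(m)) / SPEED_OF_SOUND for m in mics]
    print(trilaterate(mics, tofs))  # ~ [0.05, 0.10, 0.30]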
Abstract:
A portable measurement system is provided comprising a probe, two trackers, a receiver, and a pod. A user interface control captures a location and position of the probe in a three-dimensional sensing space with respect to a coordinate system of the receiver from time-of-flight waveform analysis. The system suppresses a ringing portion of the received ultrasonic waveform and minimizes distortion associated with ultrasonic transducer ring-down during high-resolution position tracking of the probe and the two trackers. Media is presented according to a customized use of the probe and two trackers during an operation workflow.
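The ring-down suppression could take many forms; one simple guess is a blanking window followed by a rising weight so that late, clean samples dominate. Everything below (sample-rate handling, decay constant, names) is an assumption, not the claimed method.

    import numpy as np

    def suppress_ringdown(rx, fs, blank_ms=1.0, decay_tau_ms=0.3):
        """Attenuate transducer ring-down at the start of a received frame.

        Zeroes a short blanking window right after transmit, then applies an
        exponentially rising weight. Real systems might instead subtract a
        calibrated ring-down template; this is only an illustrative sketch.
        """
        rx = np.asarray(rx, dtype=float).copy()
        n_blank = int(fs * blank_ms / 1000.0)
        rx[:n_blank] = 0.0
        n = np.arange(len(rx) - n_blank)
        rx[n_blank:] *= 1.0 - np.exp(-n / (fs * decay_tau_ms / 1000.0))
        return rx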
Abstract:
A system and method is provided for resolving a pivot point via touchless interaction. It applies to situations where one end of a rigid object is inaccessible but remains stationary at a pivot point, while the other end is free to move and is accessible to an input pointing device. As an example, the rigid object can be a leg bone where the proximal end is at the hip joint and the distal end is at the knee. The system comprises a wand and a receiver that are spatially configurable to locate the pivot point without contact. The receiver tracks a relative displacement of the wand and geometrically resolves the location of the pivot point by a spherical mapping. The system can use ultrasonic sensing, accelerometer measurements, or a combination of both. Other embodiments are disclosed.
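Geometrically, the tracked end of a rigid object rotating about a fixed pivot sweeps a sphere centered at that pivot, so one natural realization of the spherical mapping is an algebraic least-squares sphere fit. The sketch below uses that standard technique; it is not necessarily the patent's exact formulation.

    import numpy as np

    def fit_pivot(points):
        """Fit a sphere to tracked wand positions; its center is the pivot.

        Rewrites ||x - c||^2 = r^2 as the linear system
        2 c . x + (r^2 - |c|^2) = |x|^2 and solves for c and r.
        points: (N, 3) array of tracked positions of the wand tip.
        """
        x = np.asarray(points, dtype=float)
        A = np.hstack([2.0 * x, np.ones((len(x), 1))])
        b = np.sum(x ** 2, axis=1)
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        center = sol[:3]
        radius = np.sqrt(sol[3] + center @ center)
        return center, radius

For the leg-bone example, points would be knee-end positions collected while the leg is rotated about the hip, and center lands at the hip joint.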
Abstract:
A system (111) and method (200) are provided for sensory feedback (210) in touchless control. The system can include a touchless sensing unit (110) that detects at least one position (304) of an object in a touchless sensing space (300), and an indicator (166) communicatively coupled to the touchless sensing unit that provides sensory feedback associated with the at least one position. The indicator can change in accordance with a location, a recognized movement, or a strength of the touchless sensing space. The sensory feedback can be associated with acquiring or releasing a touchless control, and can be visual, auditory, or haptic.
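As a toy illustration of tying feedback to position, the sketch below varies an indicator's intensity with depth in the sensing space and emits acquire/release events on boundary crossings. The class and thresholds are hypothetical, not drawn from the abstract.

    class TouchlessIndicator:
        """Hypothetical indicator state machine for touchless feedback."""

        def __init__(self, boundary=1.0):
            self.boundary = boundary   # depth of the sensing space
            self.acquired = False

        def update(self, z):
            """z: object distance into the space. Returns (intensity, event)."""
            inside = 0.0 <= z <= self.boundary
            intensity = 1.0 - z / self.boundary if inside else 0.0
            if inside and not self.acquired:
                self.acquired = True
                return intensity, "acquired"   # e.g. rising tone, LED on
            if not inside and self.acquired:
                self.acquired = False
                return 0.0, "released"         # e.g. falling tone, LED off
            return intensity, "tracking"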
Abstract:
A method for determining position and alignment is provided. The method includes monitoring a first and a second sequence of ultrasonic signals transmitted from a first device to a second device, estimating a location of the first device from time-of-flight measurements of the ultrasonic signals at respective microphones on the second device, calculating a set of phase differences between the two sequences, weighting a difference between an expected location and the estimated location of the first device with the set of phase differences to produce a relative displacement, and reporting a position of the first device based on the relative displacement.
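A guess at how the weighting step might look: treat per-microphone phase agreement as a confidence on the coarse time-of-flight correction. The form of the weight below is assumed purely for illustration.

    import numpy as np

    def weighted_displacement(expected, estimated, phase_diffs, gain=0.5):
        """Weight the TOF position error by phase-difference coherence (sketch).

        phase_diffs: per-microphone phase differences in radians between the
        first and second signal sequences; small values mean the captures
        agree, so the correction is trusted more.
        """
        coherence = np.mean(np.cos(np.asarray(phase_diffs, dtype=float)))
        weight = gain * max(0.0, coherence)   # 1.0 when phases fully align
        return weight * (np.asarray(expected, float) - np.asarray(estimated, float))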
Abstract:
At least one exemplary embodiment is directed to a website configured to collect sound signatures from around the world and beyond. A communication device automatically stores acoustic information received by a microphone of the communication device and analyzes it for a trigger event. Upon a trigger event, the device stores the acoustic information, attaches metadata, creates a Gaussian Mixture Model, and measures the sound pressure level. The communication device automatically sends the sound signature to a database when a communication path to the communication device is opened. Each sound signature has associated metadata including a time stamp and geocode. Automatically collecting sounds using a communication device adapted for the process enables a database that captures sounds globally on a continuous basis.
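A rough sense of the per-clip processing, with scikit-learn's GaussianMixture standing in for the model described; the feature choice, frame size, and calibration assumption are all mine, not the abstract's.

    import time
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def process_clip(samples, geocode=None):
        """Hypothetical pipeline: SPL measurement plus a GMM sound signature.

        samples: mono audio assumed calibrated to pascals, so SPL can be
        computed against the 20 micropascal reference.
        """
        rms = np.sqrt(np.mean(np.square(samples)))
        spl_db = 20.0 * np.log10(max(rms, 1e-12) / 20e-6)
        # Frame the clip and model per-frame log energy with a small GMM.
        frames = np.asarray(samples[: len(samples) // 256 * 256]).reshape(-1, 256)
        feats = np.log(np.mean(frames ** 2, axis=1, keepdims=True) + 1e-12)
        gmm = GaussianMixture(n_components=3).fit(feats)
        meta = {"timestamp": time.time(), "geocode": geocode}
        return {"spl_db": spl_db, "gmm": gmm, "metadata": meta}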
Abstract:
An apparatus for sensory-based media control is provided. A system that incorporates teachings of the present disclosure may include, for example, a media device having a controller element to receive from a media controller a first instruction to select an object in accordance with a physical handling of the media controller, and a second instruction to control the selected object or perform a search on the object in accordance with touchless finger movements. Additional embodiments are disclosed.
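One way to picture the two-instruction flow is a small dispatcher on the media device; the message fields below are invented purely for illustration.

    def handle_instruction(msg, state):
        """Hypothetical dispatcher for the two controller instructions.

        A 'select' message arrives on physical handling of the controller;
        'control' and 'search' messages arrive from touchless finger
        movements and act on the previously selected object.
        """
        if msg["type"] == "select":
            state["selected"] = msg["object_id"]
        elif msg["type"] == "control":
            print(f"apply gesture {msg['gesture']} to {state['selected']}")
        elif msg["type"] == "search":
            print(f"search on {state['selected']}")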
Abstract:
A touchless sensing unit (110) and method (210) for calibrating a mobile device for touchless sensing are provided. The method can include evaluating (214) a finger movement within a touchless sensory space (101), estimating (216) a virtual coordinate system (320) from a range of finger movement, and mapping (218) the virtual coordinate system to a device coordinate system (330).
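The mapping step might reduce to a per-axis affine transform from the observed range of finger movement to the device's display range; the sketch below assumes exactly that simplification.

    import numpy as np

    def calibrate(finger_points, device_size):
        """Map a virtual coordinate system onto device coordinates (sketch).

        finger_points: (N, 2) finger positions gathered while the user sweeps
        the touchless space; device_size: (width, height) in pixels.
        Returns a function taking virtual (x, y) to device pixels.
        """
        pts = np.asarray(finger_points, dtype=float)
        lo, hi = pts.min(axis=0), pts.max(axis=0)
        span = np.maximum(hi - lo, 1e-9)          # avoid divide-by-zero
        size = np.asarray(device_size, dtype=float)

        def to_device(p):
            return (np.asarray(p, dtype=float) - lo) / span * size

        return to_device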
Abstract:
An Integrated Development Environment (IDE) (100) for creating a touchless Virtual User Interface (VUI) (120) is provided. The IDE can include a development window (152) for graphically presenting a visual layout of user interface (UI) components (161) that respond to touchless sensory events in a virtual layout of virtual components (261), and at least one descriptor (121) for modifying a touchless sensory attribute of a user component. The touchless sensory attribute describes how a user component responds to a touchless sensory event on a virtual component.
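In code, a descriptor of this kind might be a small object binding touchless sensory events on a virtual component to handlers on the matching UI component; the class and event names below are invented for illustration.

    from dataclasses import dataclass, field
    from typing import Callable, Dict

    @dataclass
    class TouchlessDescriptor:
        """Hypothetical descriptor: how a UI component reacts to touchless events."""
        component_id: str
        handlers: Dict[str, Callable[[tuple], None]] = field(default_factory=dict)

        def on(self, event_name, handler):
            """Register a handler, e.g. on('push', button_click)."""
            self.handlers[event_name] = handler

        def dispatch(self, event_name, position):
            """Fire the handler for a sensory event detected at position."""
            if event_name in self.handlers:
                self.handlers[event_name](position)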