Abstract:
Systems and methods for machine vision are presented. Such machine vision includes ego-motion, as well as the segmentation and/or classification of image data of one or more targets of interest. Scanning light beams that generate a pattern are projected and detected, achieving real-time, continuous, and accurate spatiotemporal 3D sensing. The relative motion between an observer and a projection surface is determined. A combination of visible and non-visible patterns, together with a combination of visible and non-visible sensor arrays, is employed to sense the 3D coordinates of target features and to acquire color image data, generating 3D color images of targets. Stereoscopic pairs of cameras are employed to generate 3D image data. Such cameras are dynamically aligned and calibrated. Information may be encoded in the transmitted patterns; upon detection of the pattern, the information is decoded and employed to determine features of the reflecting surface.
Abstract:
A system to determine a position of one or more objects includes a transmitter to emit a beam of photons to sequentially illuminate regions of one or more objects; multiple spaced-apart cameras, each having an array of pixels to detect photons; and one or more processor devices that execute stored instructions to perform actions of a method, including: directing the transmitter to sequentially illuminate regions of one or more objects with the beam of photons; for each of the regions, receiving, from the cameras, an array position of each pixel that detected photons of the beam reflected or scattered by the region of the one or more objects; and, for each of the regions detected by the cameras, determining a position of that region using the received array positions of the pixels that detected the photons of the beam reflected or scattered by that region.
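The position determination described above can be illustrated with a minimal triangulation sketch for two spaced-apart cameras. The pinhole camera model, baseline, focal length, and pixel values below are illustrative assumptions, not parameters from the disclosure.

```python
# Minimal stereo triangulation sketch: recover the depth of an
# illuminated region from the pixel columns at which two spaced-apart
# cameras detected the reflected beam. All parameters are illustrative.

def triangulate_depth(col_left, col_right, baseline_m, focal_px):
    """Depth from the horizontal disparity of a matched detection.

    col_left, col_right: pixel column of the detection in each camera
    baseline_m: distance between the two camera centers (meters)
    focal_px: focal length expressed in pixels
    """
    disparity = col_left - col_right
    if disparity <= 0:
        raise ValueError("detection must have positive disparity")
    return baseline_m * focal_px / disparity

# Example: cameras 0.1 m apart, 1000 px focal length, 20 px disparity
depth = triangulate_depth(520, 500, baseline_m=0.1, focal_px=1000)
print(depth)  # 5.0 (meters)
```

Because the transmitter illuminates one region at a time, the pixel correspondence between the two cameras is unambiguous, which is what makes this simple disparity computation sufficient.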
Abstract:
An image projection device for displaying an image onto a remote surface. The image projection device employs a scanner to project image beams of visible light and tracer beams of light onto a remote surface to form a display of the image. The device also employs a light detector to sense at least the reflections of light from the tracer beam pulses incident on the remote surface. The device employs the sensed tracer beam light pulses to predict the trajectory of subsequent image beam light pulses and tracer beam light pulses that form a display of the image on the remote surface in a pseudorandom pattern. The trajectory of the projected image beam light pulses can be predicted so that the image is displayed from a point of view that can be selected by, or automatically adjusted for, a viewer of the displayed image.
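The trajectory prediction from sensed tracer pulses can be sketched with a simple linear extrapolation; the linear model and the sample surface coordinates below are illustrative assumptions, not the prediction scheme claimed in the disclosure.

```python
# Sketch of trajectory prediction from tracer-beam detections: given
# the last sensed tracer pulse positions on the remote surface,
# linearly extrapolate where the next beam pulse will land.
# Linear extrapolation and sample points are illustrative assumptions.

def predict_next(points):
    """Extrapolate the next (x, y) from the last two observed points."""
    (x0, y0), (x1, y1) = points[-2], points[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

# Three sensed tracer pulse positions along a scan (surface units)
trace = [(10.0, 5.0), (12.0, 5.5), (14.0, 6.0)]
print(predict_next(trace))  # (16.0, 6.5)
```

A real device would likely use a higher-order motion model over many pulses, but the principle is the same: observed tracer reflections anchor the predicted positions of subsequent image-beam pulses.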
Abstract:
Embodiments are directed toward measuring a three-dimensional range to a target. A transmitter emits light toward the target. An aperture may receive light reflections from the target. The aperture may direct the reflections toward a sensor comprising an array of pixels arranged in rows and columns. The sensor is offset a predetermined distance from the transmitter. Anticipated arrival times of the reflections on the sensor are based on the departure times and the predetermined offset distance. A portion of the pixels is sequentially activated based on the anticipated arrival times. The target's three-dimensional range measurement is based on the reflections detected by that portion of the pixels.
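The anticipated-arrival-time gating can be sketched as follows: for a hypothesized target range, compute when the reflection should return and which sensor column it should strike given the transmitter/sensor offset. The simplified geometry and all numeric values below are illustrative assumptions.

```python
# Sketch of anticipated-arrival-time gating. For a hypothesized target
# range, the reflection should arrive after the round-trip light time
# and should land at a column fixed by triangulation disparity; only
# pixels near that column need be active at that time.
# Geometry is simplified; all values are illustrative assumptions.

C = 299_792_458.0  # speed of light, m/s

def anticipated_arrival(t_depart_s, range_m):
    """Round-trip arrival time for a reflection from range_m."""
    return t_depart_s + 2.0 * range_m / C

def anticipated_column(range_m, offset_m, focal_px):
    """Column where the reflection lands, via triangulation disparity."""
    return focal_px * offset_m / range_m

t = anticipated_arrival(0.0, 15.0)   # ~1.0e-7 s for a 15 m target
col = anticipated_column(15.0, offset_m=0.05, focal_px=3000)
print(t, col)
```

Activating only the pixels around the anticipated column at the anticipated time rejects ambient light and reflections from other ranges, which is the benefit the abstract describes.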
Abstract:
A method, apparatus, and manufacture for writing and annotation are provided. An image is provided on a surface. In one embodiment, each time invisible ink is deposited on the surface, the location of the deposited invisible ink is detected before the invisible ink vanishes from the surface. In another embodiment, when a tip of a stylus is in contact with a location of an image on the surface, three or more light detectors are employed to detect light at the location. The detected light is employed to determine a position and an orientation of the tip of the stylus and the location on the surface, and the image is modified based on the determined position, orientation, and location.
Abstract:
A three-dimensional position tracking system is presented. The system includes transmitters and receivers. A transmitter scans continuous or pulsed coherent light beams across a target. The receiver detects the reflected beams. The system recursively determines the location of the target, as a function of time, via triangulation and observation of the time-of-flight of the incoming and outgoing beams. The transmitter includes ultra-fast scanning optics to scan the receiver's field-of-view. The receiver includes arrays of ultra-fast photosensitive pixels. The system determines the angles of the incoming beams based on the line-of-sight of the triggered pixels. By observing the incoming angles and correlating timestamps associated with the outgoing and incoming beams, the system accurately, and in near real-time, determines the location of the target. By combining the geometry of the scattered beams with the beams' time-of-flight, ambiguities inherent to triangulation and ambiguities inherent to time-of-flight location methods are resolved.
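The ambiguity resolution described above can be sketched by checking triangulation candidates against the time-of-flight range derived from the timestamp correlation. The tolerance and candidate values below are illustrative assumptions.

```python
# Sketch of fusing triangulation with time-of-flight: triangulation
# yields candidate ranges from beam geometry; time-of-flight yields an
# independent range from outgoing/incoming timestamps; agreement within
# a tolerance selects the true solution. All values are illustrative.

C = 299_792_458.0  # speed of light, m/s

def tof_range(t_out_s, t_in_s):
    """One-way range from outgoing/incoming beam timestamps."""
    return C * (t_in_s - t_out_s) / 2.0

def resolve(candidate_ranges_m, t_out_s, t_in_s, tol_m=0.5):
    """Pick the triangulation candidate consistent with time-of-flight."""
    r_tof = tof_range(t_out_s, t_in_s)
    matches = [r for r in candidate_ranges_m if abs(r - r_tof) < tol_m]
    return matches[0] if matches else None

# Two geometrically plausible triangulation solutions; an 80 ns
# round trip (ToF range ~11.99 m) is consistent with only one of them.
picked = resolve([3.2, 12.1], t_out_s=0.0, t_in_s=80.0e-9)
print(picked)  # 12.1
```

The same check works in the other direction: when the time-of-flight measurement alone is ambiguous (e.g., pulse-wraparound at long range), the triangulated geometry selects the correct range bin.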
Abstract:
A folded optical element waveguide that allows a minimum-width bezel to be used around the perimeter of a light-based touch screen display. The apparatus and method include a touch screen and a waveguide substrate provided adjacent to the touch screen. The waveguide substrate includes a plurality of waveguides and a plurality of optical elements provided adjacent to the touch screen. The waveguides include an internally reflective surface to reflect light perpendicular to the surface of the touch screen. The emitting and detecting waveguides are thus folded and provided around the side edges of the display. As a result, the width of the bezel around the display can be minimized.
Abstract:
A fingerprint sensor uses beams of light to detect a fingerprint as the finger is swiped over a ridged surface. The beams of light are directed toward individual regions of the ridged surface so that the light beams will generally be totally internally reflected when a finger is not touching the ridged surface. The total internal reflection characteristics of the ridged surface are altered at the regions touched by the ridges of the finger as the finger is swiped over the sensor, which alters the amount of light reflected by the ridged surface. These changes in light reflection as the finger is swiped over the ridged surface can be observed simultaneously over multiple channels, preferably disposed laterally with respect to each other, to provide a fingerprint.
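The total-internal-reflection condition underlying this sensor follows from Snell's law: light inside the ridged surface is totally reflected when its incidence angle exceeds the critical angle, and a touching finger ridge raises the refractive index outside the interface and frustrates the reflection. The refractive indices below are typical illustrative values, not parameters from the disclosure.

```python
# Sketch of the TIR condition for the fingerprint sensor. The critical
# angle is arcsin(n_outside / n_inside); beams incident beyond it are
# totally reflected unless a higher-index medium (a skin ridge)
# touches the surface. Indices are illustrative typical values.
import math

def critical_angle_deg(n_inside, n_outside):
    """Critical angle for total internal reflection, in degrees."""
    if n_outside >= n_inside:
        return None  # no TIR possible at this interface
    return math.degrees(math.asin(n_outside / n_inside))

glass_air = critical_angle_deg(1.5, 1.0)    # ~41.8 deg: TIR beyond this
glass_skin = critical_angle_deg(1.5, 1.37)  # ~66 deg: ridge contact
print(round(glass_air, 1), round(glass_skin, 1))  # 41.8 66.0
```

A beam aimed at, say, 50 degrees is totally reflected at an untouched (glass-to-air) region but escapes where a ridge touches, so the detected intensity drop across the laterally disposed channels traces the fingerprint.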