Abstract:
An apparatus for increasing the speed at which an object can be optically scanned. First and second optical signals from first and second areas of an object are focused respectively onto first and second optical sensor arrays. Image data signals from the first and second optical sensor arrays are interlaced in a detection circuit to form a composite image of the object. By exposing the first area during the time that the optical signal from exposure of the second area is being clocked out and by exposing the second area during the time that the optical signal from exposure of the first area is being clocked out, the object can be scanned at a faster rate. Specialized image sensors do not need to be developed; commercially available image sensors can be used. In particular, increased scan speed could result from placing two relatively inexpensive, commercially available Charge-Coupled Device (CCD) or Complementary Metal-Oxide-Semiconductor (CMOS) image sensors parallel to each other and alternately exposing nearby areas of the object.
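The timing described above can be illustrated with a short Python sketch; the sensor interface below is hypothetical and stands in for whatever commercial CCD or CMOS parts are used, and the sketch is a conceptual illustration rather than the disclosed circuitry. Each sensor begins an exposure while the other sensor's previous exposure is clocked out, and the resulting scan lines are interlaced into the composite image.

class LineSensor:
    """Stand-in for a commercial CCD or CMOS line sensor (hypothetical interface)."""
    def __init__(self, name):
        self.name = name

    def start_exposure(self):
        pass  # begin integrating light from this sensor's area of the object

    def clock_out(self):
        return [0] * 1024  # placeholder for the scan line read out of the sensor


def scan(num_line_pairs):
    first, second = LineSensor("first area"), LineSensor("second area")
    composite = []
    first.start_exposure()
    for _ in range(num_line_pairs):
        second.start_exposure()                 # second area exposes while first is clocked out
        composite.append(first.clock_out())
        first.start_exposure()                  # first area exposes while second is clocked out
        composite.append(second.clock_out())
    return composite                            # interlaced rows form the composite image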
Abstract:
A position sensing device having a single photosensing element is disclosed herein. The position sensing device determines the location, relative to a surface, of an object to which the position sensing device is attached. The position sensing device has a plurality of light paths that direct light from different area portions of the surface to the single photosensing element. Each area portion of the surface is associated with a light source, wherein each light source may be activated individually. A processor illuminates these area portions individually. As the area portions are illuminated, the photosensing element creates image data representative of the image of the area portion being illuminated. The processor analyzes the image data and identifies distinct features in the area portions. As the object is moved relative to the surface, the locations of these distinct features relative to the photosensing element move. By measuring this movement, the processor is able to determine the velocity, direction of movement, and position of the object relative to the surface.
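The tracking loop can be sketched in Python as follows; the light-source and sensor objects and the brute-force shift search are assumptions made for illustration, not the disclosed processor logic. Each light source is strobed in turn, the single photosensing element images the lit area portion, and the frame-to-frame shift of its features is accumulated into a position estimate, with velocity obtained by dividing each cycle's displacement by the cycle time.

import numpy as np

def best_shift(prev, curr, max_shift=4):
    """Return the (dy, dx) roll that best aligns the current frame with the previous one."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(curr, dy, axis=0), dx, axis=1)
            err = np.sum((shifted - prev) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def track(light_sources, sensor, cycle_time_s):
    """light_sources: objects with .on()/.off(); sensor.capture() returns a 2-D array."""
    position = np.zeros(2)
    prev_frames = {}
    while True:
        cycle_displacement = np.zeros(2)
        for i, src in enumerate(light_sources):
            src.on()
            frame = sensor.capture()            # image of the illuminated area portion
            src.off()
            if i in prev_frames:
                cycle_displacement += best_shift(prev_frames[i], frame)
            prev_frames[i] = frame
        position += cycle_displacement
        # position in pixels and velocity in pixels per second; sign and scale
        # depend on the optics and the orientation of the light paths
        yield position, cycle_displacement / cycle_time_s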
Abstract:
A position sensing device is disclosed wherein the position sensing device may determine the velocity and position of an object relative to a surface as the object is moved relative to the surface. The position sensing device may comprise two depth measurement devices that are mounted to the object on an axis that is substantially parallel to the direction of movement between the object and the surface. The depth measurement devices are spaced a predetermined distance from each other. The depth measurement devices measure the contours of the surface as the object is moved relative to the surface and may output data representative of the surface contour to a processor. Accordingly, the processor receives two data signals that are out of phase, wherein the phase shift is proportional to the relative velocity between the object and the surface. The processor may then perform an analysis on the data signals to determine the velocity of the object relative to the surface. Likewise, the processor may also determine the displacement of the object relative to the surface during a time interval.
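One way the phase shift might be recovered, shown below as a Python sketch and purely as an assumption since the abstract does not prescribe a particular analysis, is to cross-correlate the two contour signals, take the lag at the correlation peak, and convert that lag to a velocity using the known sensor spacing and the sample period; displacement over an interval then follows by integrating the velocity.

import numpy as np

def relative_velocity(front_depth, rear_depth, spacing_m, sample_period_s):
    """front_depth, rear_depth: equal-length contour samples from the two devices."""
    a = np.asarray(front_depth, dtype=float)
    b = np.asarray(rear_depth, dtype=float)
    a = a - a.mean()                            # remove DC offset before correlating
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")      # peak location gives the delay of b behind a
    lag = int(np.argmax(corr)) - (len(a) - 1)   # delay in samples
    if lag == 0:
        return 0.0                              # no measurable phase shift in this window
    return spacing_m / (lag * sample_period_s)  # velocity in metres per second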
Abstract:
An illumination system for illuminating a scan region on an object may comprise a hollow reflector having an interior reflective surface, an entrance aperture, and an exit aperture. A light source is positioned adjacent the entrance aperture of the hollow reflector so that some of the light rays produced by the light source pass through the entrance aperture and are reflected by the interior reflective surface of the hollow reflector before passing through the exit aperture.
Abstract:
A hand-held scanner apparatus may comprise a body and an image head mounted to the body for rotational movement. Rotation of the image head maintains the face of the image head in contact with the document during a scanning operation even though the user may rock or tilt the body of the hand-held scanner during the scanning operation.
Abstract:
A sensor assembly and an optical image color scanner using the sensor assembly. The sensor assembly is of the type having three separate rows of optical sensors. Two of the three sensor rows have color filters and one is unfiltered (receptive to white light). For gray scale scanning, only the unfiltered (white) sensor row is used, thereby maximizing the speed of gray scale scanning. For color scanning, three color values are computed as a linear transformation of the two filtered values and the one unfiltered value. The linear transformation may be as simple as subtracting the signals from the two filtered sensor rows from the signal from the unfiltered (white) sensor row. For color scanning, memory buffers are required for two of the three sensor output signals. For highest accuracy in color scanning, the exposure time for the white sensor row is reduced relative to the exposure time for the two filtered sensor rows. An optional white-channel bandpass filter (passing all wavelengths within the human visual range and rejecting wavelengths outside it) provides improved accuracy if the light source has significant output at wavelengths outside the human visual range.
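The simple case described above can be written as a small linear transformation; in the Python sketch below the two filters are assumed to be red and blue, and the matrix is illustrative rather than a calibrated transformation taken from the disclosure.

import numpy as np

# Rows map (white, red-filtered, blue-filtered) samples to (R, G, B).
TRANSFORM = np.array([
    [0.0,  1.0,  0.0],   # R taken directly from the red-filtered row
    [1.0, -1.0, -1.0],   # G = white - red - blue (the simple subtraction case)
    [0.0,  0.0,  1.0],   # B taken directly from the blue-filtered row
])

def to_rgb(white, red_filtered, blue_filtered):
    """Each argument is a 1-D array of samples from one sensor row."""
    stacked = np.vstack([white, red_filtered, blue_filtered])
    return TRANSFORM @ stacked   # 3 x N array of R, G, B values for the scan line

In practice the matrix entries would presumably depend on the particular filters and light source; the values above only reproduce the simple subtraction described in the abstract.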
Abstract:
An optical system comprises a linear illumination source configured to emit light, a first scanning stage configured to receive and scan the light, and a second scanning stage. The linear illumination source generates light forming a vertical field of view based on one or more output signals received from a controller, the one or more output signals comprising image data that defines content. The first scanning stage redirects portions of the light to generate an output defining a horizontal field of view based on the one or more output signals of the controller. The first scanning stage combines the vertical field of view and the horizontal field of view in the output light to create a two-dimensional light image of the content. The second scanning stage receives the output of the first scanning stage and directs it toward a projected exit pupil.
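Conceptually, the combination of the two fields of view can be pictured with the short Python sketch below; the driver interface is hypothetical and is offered only as an illustration of driving the linear source with one column of image data per horizontal scan position.

def project_frame(image_columns, linear_source, first_stage):
    """image_columns: iterable of per-column intensity lists (the content)."""
    for x, column in enumerate(image_columns):
        first_stage.set_horizontal_position(x)   # sweep the line of light horizontally
        linear_source.modulate(column)           # one vertical line of the image (vertical field of view)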
Abstract:
A parallel beam flexure mechanism (“PBFM”) for adjusting an interpupillary distance (“IPD”) of an optical device is disclosed. The PBFM includes a plurality of flexures, a mounting platen, an optical payload, and a horizontal translation mechanism. The mounting platen has a first end and a second end, where the mounting platen is attached to a first set of flexures that are in a parallel arrangement to a second set of flexures attached to a frame of an optical device, such as a head mounted display. The optical payload and horizontal translation mechanism are attached to the mounting platen, where the horizontal translation mechanism is configured to translate the mounting platen in a horizontal direction by bending the flexures, thereby adjusting the IPD of the optical device.
Abstract:
In embodiments of an imaging structure with embedded light sources, an imaging structure includes a silicon backplane with a driver pad array. The embedded light sources are formed on the driver pad array in an emitter material layer, and the embedded light sources can be individually controlled at the driver pad array to generate and emit light. A conductive material layer over the embedded light sources forms a p-n junction between the emitter material layer and the conductive material layer. Micro lens optics can be positioned over the conductive material layer to direct the light that is emitted from the embedded light sources. Further, the micro lens optics may be implemented as parabolic optics to concentrate the light that is emitted from the embedded light sources.
Abstract:
A near-eye display system includes an image former and a waveguide. The image former is configured to form a display image and to release the display image through a first exit pupil. The waveguide presents a back surface that faces the wearer's eye, and a front surface opposite the back surface. The waveguide is substantially transparent to external imagery received normal to the front surface, and is configured to receive the display image from the image former and to release the display image through a second exit pupil, which is larger than the first exit pupil.