Abstract:
Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display are disclosed. In one embodiment, the method includes displaying a user-interface, including a view region, on a substantially transparent display of a wearable computing device. The method further includes displaying a virtual object in the view region at a focal length along a first line of sight and detecting a physical object at a physical distance along a second line of sight. The method still further includes determining that a relationship between the focal length and the physical distance is such that the virtual object and the physical object appear substantially co-located in a user-view through the view region and, responsive to the determination, initiating a collision action between the virtual object and the physical object.
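The co-location determination above can be sketched as a simple distance comparison. This is a minimal illustration, not the patented implementation: the function names and the tolerance parameter are hypothetical, and it assumes the two lines of sight have already been found to overlap.

```python
def appear_colocated(focal_length_m: float, physical_distance_m: float,
                     tolerance_m: float = 0.05) -> bool:
    """Return True when the virtual object's focal length and the physical
    object's distance are close enough to appear substantially co-located.
    tolerance_m is an illustrative threshold, not from the patent."""
    return abs(focal_length_m - physical_distance_m) <= tolerance_m


def update_frame(virtual_focal_m: float, physical_dist_m: float, on_collision) -> None:
    # Responsive to the determination, initiate the collision action.
    if appear_colocated(virtual_focal_m, physical_dist_m):
        on_collision()
```
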
Abstract:
A method of dynamically increasing an effective size of an eyebox of a head mounted display includes displaying a computer generated image (“CGI”) to an eye of a user wearing the head mounted display. The CGI is perceivable by the eye within an eyebox. An eye image of the eye is captured while the eye is viewing the CGI. A location of the eye is determined based upon the eye image. A lateral position of the eyebox is dynamically adjusted based upon the determined location of the eye thereby extending the effective size of the eyebox from which the eye can view the CGI.
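The dynamic eyebox adjustment described above amounts to a tracking loop: determine the eye's lateral location from the eye image, then shift the eyebox toward it. The sketch below is illustrative only; the function and parameter names are assumptions, and a simple proportional update stands in for whatever control law the actual display uses.

```python
def adjust_eyebox(eyebox_center_mm: float, eye_location_mm: float,
                  gain: float = 1.0) -> float:
    """Shift the eyebox laterally toward the detected eye location,
    extending the effective region from which the eye can view the CGI.
    gain is a hypothetical tracking gain (1.0 = jump straight to the eye)."""
    error = eye_location_mm - eyebox_center_mm
    return eyebox_center_mm + gain * error
```

With gain 1.0 the eyebox re-centers on the eye each update; a smaller gain smooths the motion over several frames.
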
Abstract:
A camera system includes an array of image pixels disposed in or on a substrate and laid out in a multi-ring pattern. The array of image pixels is coupled to acquire image data of a color image in response to light incident on the array of image pixels. A color filter array (“CFA”) is positioned to color filter the light incident on the array of image pixels and includes at least two different color filter types that filter different color bands of the light. An actuator is coupled to the CFA to adjust the CFA in a sequence and a controller is coupled to the actuator to control the sequence such that each image pixel in the array of image pixels is temporarily optically subtended by each of the at least two different color filter types in the CFA while acquiring the image data associated with the color image.
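The controller-driven sequence above can be sketched as stepping the CFA by one filter pitch per exposure, so that over a full sequence every pixel is covered by each filter type. This is a hypothetical sketch assuming a periodic CFA layout; the function names and the one-offset-per-step scheme are illustrative, not from the patent.

```python
def cfa_sequence(num_filter_types: int):
    """Yield one actuator offset per filter type; with a periodic CFA,
    each pixel sits under a different color filter at each offset."""
    for step in range(num_filter_types):
        yield step


def acquire_color_image(filter_types, move_cfa, read_exposure):
    """Run the sequence: the controller drives the actuator, then an
    exposure is read while each pixel sees the next color band."""
    frames = []
    for offset in cfa_sequence(len(filter_types)):
        move_cfa(offset)
        frames.append(read_exposure())
    return frames
```
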
Abstract:
A camera system includes a single pixel photo-sensor disposed in or on a substrate to acquire image data. A micro-lens is adjustably positioned above the single pixel photo-sensor to focus external scene light onto the single pixel photo-sensor. An actuator is coupled to the micro-lens to adjust a position of the micro-lens relative to the single pixel photo-sensor to reposition the micro-lens to focus the external scene light incident from different angles onto the single pixel photo-sensor. Readout circuitry is coupled to readout the image data associated with each of the different angles from the single pixel photo-sensor.
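The scanning behavior above can be sketched as a loop: for each incidence angle, the actuator repositions the micro-lens so light from that angle focuses on the single photo-sensor, and the readout circuitry samples one value. The sketch is illustrative; the callback names are assumptions, and the real system's angle set and lens kinematics are unspecified here.

```python
def scan_image(angles, position_lens, read_pixel):
    """Build image data one sample at a time from a single pixel photo-sensor:
    reposition the micro-lens for each angle, then read out the sensor."""
    image = []
    for angle in angles:
        position_lens(angle)        # actuator adjusts the micro-lens position
        image.append(read_pixel())  # readout circuitry samples the photo-sensor
    return image
```
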
Abstract:
An adjustable lens includes a first lens member having a first corrugated surface, a second lens member having a second corrugated surface, and blackout regions disposed between the first and second corrugated surfaces. The first corrugated surface includes a periodic structure of alternating ridge and groove sections. The second corrugated surface includes a periodic structure of alternating ridge and groove sections. The blackout regions are positioned to block image light passing through either the ridge sections of the first lens member, or alternatively, the groove sections of the first lens member.
Abstract:
A camera system includes an image sensor, an aperture, and an adjustable lens. The adjustable lens is disposed in an optical path of the image sensor to focus image light received through the aperture onto a pixel array of the image sensor. The adjustable lens includes first and second lens members and blackout regions. The first lens member includes a first corrugated surface and a first flat surface opposite the first corrugated surface. The first corrugated surface includes a periodic structure of alternating ridge and groove sections. The second lens member includes a second corrugated surface and a second flat surface opposite the second corrugated surface. The second flat surface faces the first flat surface. The blackout regions are disposed between the first and second corrugated surfaces and positioned to block the image light passing through either the ridge or the groove sections of the first lens member.
Abstract:
An imaging device includes a first pixel array arranged to capture a first image and a second pixel array arranged to capture a second image. The imaging device also includes shutter control circuitry which is coupled to the first pixel array to initiate a first exposure period of the first pixel array to capture the first image. The shutter control circuitry is also coupled to the second pixel array to initiate a second exposure period of the second pixel array to capture the second image. The imaging device also includes processing logic coupled to receive first pixel data of the first image and coupled to receive second pixel data of the second image. The processing logic is configured to generate at least one image using the first pixel data and the second pixel data.
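One way the processing logic might combine the two exposures is an HDR-style merge: prefer the longer exposure where it is unsaturated, and fall back to a scaled-up short exposure elsewhere. The abstract does not specify the combination, so the sketch below is a hypothetical example; the function name, the saturation threshold, and the exposure-ratio scaling are all assumptions.

```python
def combine_exposures(short_px, long_px, exposure_ratio, saturation=255):
    """Merge pixel data from two arrays with different exposure periods:
    use the long-exposure value where it is below saturation, otherwise
    scale the short-exposure value by the exposure ratio."""
    merged = []
    for s, l in zip(short_px, long_px):
        merged.append(l if l < saturation else s * exposure_ratio)
    return merged
```
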
Abstract:
A lens includes a depth of field (“DOF”) range and a macro range. The macro range is distinct and separate from the DOF range. The macro range is a near field relative to the DOF range. The DOF range provides a first field of view (“FOV”) while the macro range provides a second FOV that is smaller than the first FOV within the DOF range. The lens transfers sharpness from a peripheral viewing region within the macro range into a central viewing region within the macro range.
Abstract:
An optical system includes a display panel, an image former, a viewing window, a proximal beam splitter, and a distal beam splitter. The display panel is configured to generate a light pattern. The image former is configured to form a virtual image from the light pattern generated by the display panel. The viewing window is configured to allow outside light in from outside of the optical system. The virtual image and the outside light are viewable along a viewing axis extending through the proximal beam splitter. The distal beam splitter is optically coupled to the display panel and the proximal beam splitter and has a beam-splitting interface in a plane that is parallel to the viewing axis. A camera may also be optically coupled to the distal beam splitter so as to be able to receive a portion of the outside light that is viewable along the viewing axis.