Abstract:
A camera system includes an image sensor, a stop aperture, an infrared cut filter disposed between the image sensor and the stop aperture, and a lens assembly. The lens assembly has a field of view ranging between 80 degrees and 110 degrees and is disposed between the infrared cut filter on an image side of the lens assembly and the stop aperture on an object side of the lens assembly. The lens assembly includes six lenses. Four of the six lenses have positive optical power and two of the six lenses have negative optical power. The six lenses include first, second, third, fourth, fifth, and sixth lenses having first inline, second inline, third inline, fourth inline, fifth inline, and sixth inline relative positions, respectively, along an optical path through the lens assembly.
Abstract:
An electronic device can include a first image sensor configured to capture a first image of a field of view and a second image sensor configured to capture a second image of the field of view. The electronic device can include a color filter adjacent to the second image sensor such that the field of view is viewable by the second image sensor through the color filter. The first image can have a first pixel resolution. The second image can have a second pixel resolution. The electronic device can include a controller configured to determine a third image based on luminance content of the first image and color content of the second image. The third image can have a third pixel resolution indicative of a spatial resolution of the first image and a spectral resolution of the second image.
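The fusion this abstract describes — spatial detail from a high-resolution luminance image, color from a lower-resolution filtered image — can be sketched as follows. This is an illustrative sketch only, not the patented algorithm; the nearest-neighbor upsampling, the Rec. 601 luma weights, and the function name are all assumptions.

```python
import numpy as np

def fuse_luma_chroma(mono: np.ndarray, color: np.ndarray) -> np.ndarray:
    """Combine luminance from a high-res mono image with color content
    from a low-res RGB image. `mono` is HxW float in [0, 1]; `color`
    is hxwx3. Illustrative sketch, not the patent's method."""
    H, W = mono.shape
    h, w, _ = color.shape
    # Upsample the color image to the mono resolution (nearest neighbor
    # for simplicity; a real pipeline would use a better interpolator).
    ys = np.arange(H) * h // H
    xs = np.arange(W) * w // W
    up = color[ys][:, xs]                      # HxWx3
    # Per-pixel luminance of the upsampled color image (Rec. 601 weights).
    luma_up = up @ np.array([0.299, 0.587, 0.114])
    # Rescale RGB so its luminance matches the mono image, keeping the
    # chroma ratios while taking spatial detail from `mono`.
    gain = mono / np.maximum(luma_up, 1e-6)
    return np.clip(up * gain[..., None], 0.0, 1.0)
```

The resulting image has the first sensor's pixel grid (spatial resolution) while its color ratios come from the second sensor (spectral resolution), matching the "third image" the abstract describes.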
Abstract:
Embodiments describe an eye sensing module to detect eye movements and gestures. The eye sensing module is included in a head wearable display, and may be placed anywhere within the head wearable display so that the image sensor device has a line-of-sight to the user's eye. The eye sensing module comprises an image sensor, an array of focusing lenses disposed over the image sensor and placed side-by-side (i.e., not on top of one another), and a corresponding array of directional prisms disposed over the focusing lenses. The directional prisms and the focusing lenses increase the field of view of the image sensor to enable the image sensor to capture eye gestures and eye movements for different user eye sizes, eye locations, and other varying eye characteristics. These eye sensing modules increase the field of view of the image sensor without increasing the size of the image sensor or the focusing lenses.
Abstract:
A color sensitive image sensor includes first, second, and third image sensor layers vertically aligned in an image sensor stack. Each of the image sensor layers includes a pixel array oriented to generate image data in response to light incident on the image sensor stack and readout circuitry coupled to the pixel array to readout the image data. A first optical filter layer is disposed between the first image sensor layer and the second image sensor layer and has a first edge pass filter characteristic with a first cutoff wavelength. A second optical filter layer is disposed between the second image sensor layer and the third image sensor layer and has a second edge pass filter characteristic with a second cutoff wavelength offset from the first cutoff wavelength.
Abstract:
An imaging device includes a first pixel array arranged to capture a first image and a second pixel array arranged to capture a second image. The first pixel array and the second pixel array face substantially a same direction. The imaging device also includes shutter control circuitry which is coupled to the first pixel array to initiate a first exposure period of the first pixel array to capture the first image. The shutter control circuitry is also coupled to the second pixel array to initiate a second exposure period of the second pixel array to capture the second image. The imaging device also includes processing logic coupled to receive first pixel data of the first image and coupled to receive second pixel data of the second image. The processing logic is configured to generate at least one image using the first pixel data and the second pixel data.
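One way such processing logic could combine pixel data from two arrays with different exposure periods is an HDR-style merge. The exposure ratio, the saturation knee at 0.9, and the weighting scheme below are illustrative assumptions, not details from the abstract.

```python
import numpy as np

def merge_exposures(short: np.ndarray, long: np.ndarray,
                    ratio: float) -> np.ndarray:
    """Merge pixel data from two arrays exposed for different durations.
    `short` and `long` are HxW floats in [0, 1]; `ratio` is
    long_exposure / short_exposure. Illustrative sketch only."""
    # Bring the short exposure to the long exposure's radiometric scale.
    short_scaled = short * ratio
    # Weight the long exposure down as it approaches saturation, so
    # blown highlights are recovered from the shorter exposure.
    w_long = np.clip((0.9 - long) / 0.9, 0.0, 1.0)
    return w_long * long + (1.0 - w_long) * short_scaled
```

Because the two arrays face substantially the same direction, the merge can operate pixel-for-pixel after registration, which is why the abstract pairs the shutter control circuitry with shared processing logic.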
Abstract:
A camera module includes a lens assembly, an image sensor, and a hybrid lens holder. The image sensor is aligned with the lens assembly to capture images of light incident through the lens assembly on a light sensitive surface of the image sensor. The hybrid lens holder holds the lens assembly at a fixed offset from the image sensor. The hybrid lens holder includes a barrel section in which the lens assembly is held as a vertical stack rising above the image sensor and a flange section that rests on the image sensor to maintain the fixed offset from the image sensor. The lens assembly's discrete lens elements are held in place by direct contact with an inner side of the barrel section. The barrel section and the flange section are a single, contiguous housing structure.
Abstract:
A camera apparatus includes an image sensor to output an image signal, a stop aperture, a lens assembly, and a controller. The lens assembly is disposed between the image sensor on an image side of the lens assembly and the stop aperture on an object side of the lens assembly. The lens assembly includes a plurality of lens elements that collectively induce axial chromatic aberration between red, green, and blue light. The controller is coupled to receive red, green, and blue channels of the image signal. The controller includes logic that causes the controller to use the blue channel without the red or green channels of the image signal to perform image recognition on objects captured in a near-field of the lens assembly and to use the blue, red, and green channels collectively when capturing images in a far-field of the lens assembly.
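The controller logic described reads as a conditional on where the object sits relative to the lens assembly's near field: blue alone for near-field recognition (axial chromatic aberration focuses blue closest), all three channels for far-field capture. The distance cutoff and function name below are illustrative assumptions.

```python
import numpy as np

NEAR_FIELD_MAX_M = 0.3  # assumed near-field cutoff; not from the abstract

def channels_for_capture(rgb: np.ndarray, distance_m: float) -> np.ndarray:
    """Select image-signal channels per the described scheme.
    `rgb` is an HxWx3 image signal. Illustrative sketch only."""
    if distance_m <= NEAR_FIELD_MAX_M:
        # Near field: use the blue channel without red or green,
        # since axial chromatic aberration leaves blue sharpest here.
        return rgb[..., 2]
    # Far field: use the blue, red, and green channels collectively.
    return rgb
```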
Abstract:
A method of reducing light damage in a shutterless imaging device includes receiving a signal from a hardware device. The signal from the hardware device is analyzed. In response to the analysis of the signal from the hardware device, a lens of the shutterless imaging device is adjusted. Adjusting the lens spreads out energy of far-field image light incident on an image sensor of the shutterless imaging device.
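The method's three steps (receive a signal, analyze it, adjust the lens) map naturally onto a simple handler. The signal threshold and the defocus step size below are illustrative assumptions; the abstract does not specify how the analysis or adjustment is performed.

```python
def handle_hardware_signal(signal_level: float,
                           threshold: float = 0.8,
                           defocus_step: float = 0.05) -> float:
    """Return a lens defocus offset in response to a hardware signal.
    If analysis of the signal indicates intense far-field light (e.g.,
    the sun entering the frame), the lens is adjusted away from best
    focus so the incident energy is spread over more of the sensor,
    reducing the risk of damage. Illustrative sketch only."""
    # Analyze the signal received from the hardware device.
    if signal_level > threshold:
        # Adjust the lens: a small defocus spreads far-field energy.
        return defocus_step
    # Otherwise leave the lens at best focus.
    return 0.0
```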