Abstract:
An example method involves: (a) determining, by a computing device, an indication of distance to an object in a scene, wherein the computing device comprises three or more image-capture devices that provide two or more baselines for stereoscopic imaging, wherein at least two pairs of image-capture devices from the three or more image-capture devices are operable for stereoscopic imaging, wherein each pair provides one of the baselines, and wherein a first of the baselines is non-parallel to a second of the baselines; (b) selecting, by the computing device, a first pair from the at least two pairs of image-capture devices, wherein the first pair is selected based on the indication of distance and the baseline provided by the first pair; and (c) operating the first pair of image-capture devices to capture stereoscopic image data.
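The pair-selection step can be sketched as follows. The sketch assumes the standard stereo relation d = f·B/Z (disparity in pixels from focal length in pixels, baseline, and distance) and picks the pair whose baseline yields a disparity nearest a workable target; all names, calibration values, and the target-disparity criterion are illustrative assumptions, not the claimed method.

```python
def select_pair(pairs, distance_m, target_disparity_px=20.0,
                focal_length_px=1000.0):
    """Return the (pair_id, baseline_m) whose expected disparity is
    closest to the target. For a fronto-parallel point, disparity is
    approximately d = f * B / Z (pixels)."""
    def disparity(baseline_m):
        return focal_length_px * baseline_m / distance_m

    return min(pairs, key=lambda p: abs(disparity(p[1]) - target_disparity_px))

# Three candidate pairs with non-identical baselines (meters).
pairs = [("narrow", 0.01), ("medium", 0.04), ("wide", 0.10)]

near = select_pair(pairs, 0.5)   # a nearby object favors a narrow baseline
far = select_pair(pairs, 10.0)   # a distant object favors a wide baseline
```

Intuitively, a wide baseline preserves depth resolution for distant objects, while a narrow baseline keeps disparity (and occlusion) manageable for near ones.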
Abstract:
A device may operate a first image-capture system, in accordance with a first image setting, to capture first image data of a scene. While the first image-capture system is capturing the first image data, the device may operate a second image-capture system to determine an updated value for the first image setting, and may send an instruction to the first image-capture system indicating to use the updated value for the first image setting to continue capturing the first image data.
Abstract:
The present disclosure relates to staggered arrays of individually addressable light-emitting elements for sweeping out angular ranges. One example device includes an astigmatic optical element. The device may also include an array of individually addressable light-emitting elements arranged to emit light towards the astigmatic optical element. The astigmatic optical element may be arranged to focus light emitted from each individually addressable light-emitting element to produce a substantially linear illumination pattern at a different corresponding scan angle within an angular range. The example device may further include a control system operable to sequentially activate the individually addressable light-emitting elements such that the substantially linear illumination pattern sweeps out the angular range. The individually addressable light-emitting elements may be staggered with respect to one another in the array such that the substantially linear illumination pattern sweeps out the angular range continuously.
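The sequential-activation scheme above can be sketched as a simple control loop. The emitter count, the angular range, and the uniform emitter-to-angle mapping below are illustrative assumptions, not the disclosed design.

```python
def sweep(num_emitters, angle_min_deg=-15.0, angle_max_deg=15.0):
    """Yield (emitter_index, scan_angle_deg) in activation order.

    Each emitter is assumed to map to one scan angle; activating them
    in sequence steps the linear illumination pattern across the range.
    Requires num_emitters >= 2 so the angular step is well defined.
    """
    step = (angle_max_deg - angle_min_deg) / (num_emitters - 1)
    for i in range(num_emitters):
        yield i, angle_min_deg + i * step

# Seven emitters sweep -15 deg to +15 deg in 5-degree steps.
schedule = list(sweep(7))
```

Staggering the emitters so adjacent scan angles overlap slightly is what lets the pattern cover the range continuously rather than in disjoint stripes.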
Abstract:
The present disclosure relates to methods and systems that may reduce pixel noise due to defective sensor elements in optical imaging systems. Namely, a camera may capture a burst of images with an image sensor while adjusting a focus distance setting of an optical element. For example, the image burst may be captured during an autofocus process. The plurality of images may be averaged or otherwise merged to provide a single, aggregate image frame. Such an aggregate image frame may appear blurry. In such a scenario, “hot” pixels, “dead” pixels, or otherwise defective pixels may be more easily recognized and/or corrected. As an example, a defective pixel may be removed from a target image or otherwise corrected by replacing a value of the defective pixel with an average value of neighboring pixels.
Abstract:
Example embodiments may help multi-camera devices determine disparity information for a scene and use the disparity information in an autofocus process. An example method involves: (a) receiving image data of a scene that comprises at least one image of the scene captured by each of two or more image-capture systems of a computing device that includes a plurality of image-capture systems; (b) using the image data captured by the two or more image-capture systems as a basis for determining disparity information for the scene; and (c) performing, by the computing device, an autofocus process based at least in part on the disparity information, wherein the autofocus process provides a focus setting for at least one of the image-capture systems of the computing device.
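A disparity-driven autofocus step can be sketched as: estimate depth from a matched feature's pixel disparity via Z = f·B/d, then map that depth to a lens position. The calibration values and the linear 1/Z focus mapping below are illustrative assumptions, not the claimed autofocus process.

```python
def depth_from_disparity(disparity_px, baseline_m=0.02, focal_length_px=1000.0):
    """Estimate object distance (meters) from stereo disparity (pixels)."""
    if disparity_px <= 0:
        return float("inf")  # no measurable disparity: treat as infinity
    return focal_length_px * baseline_m / disparity_px

def focus_setting(depth_m, min_depth_m=0.1):
    """Map depth to a normalized 0..1 lens position (1 = closest focus,
    0 = infinity). Uses 1/Z, which is roughly linear in lens travel."""
    return min(1.0, min_depth_m / max(depth_m, min_depth_m))

d = depth_from_disparity(40.0)  # 1000 * 0.02 / 40 = 0.5 m
pos = focus_setting(d)          # 0.1 / 0.5 = 0.2
```

Seeding the focus position from disparity this way can shrink or skip the contrast-based focus sweep, since the lens starts near the correct position.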
Abstract:
An apparatus is described that includes an image sensor having a first output port and a second output port. The first output port is to transmit a first image stream concurrently with a second image stream transmitted from the second output port.
Abstract:
Within examples, devices and methods are described for providing field-of-view functionality by moving an optical element into and out of an optical receiving path. In one example, a device is provided that comprises an imager die having an optical receiving path, and an actuator coupled to an optical element and configured to cause a change in a position of the optical element into and out of the optical receiving path of the imager die. The actuator is configured to cause the change in the position of the optical element to change a given field of view setting of the device. In some examples, a device may be configured to include dual-setting field of view functionality.
Abstract:
This document describes curved image sensors capable of sensing light from a monocentric lens. Such a curved image sensor receives light focused at a curved focal surface and then provides electric signals to a planar computing chip, such as a CMOS chip. By so doing, the higher image quality, smaller size, and often lower weight of monocentric lenses can be gained while still using generally high-quality, low-cost planar chips.