Abstract:
Systems and methods are described that relate to optical image stabilization in mobile computing devices that include multiple optical element sets. In an example embodiment, a mobile computing device may include a plurality of optical element assemblies, which may be coupled to a shared frame. The shared frame may be configured to maintain a fixed spatial arrangement between the optical element assemblies. A controller may receive a signal indicative of a movement of the mobile computing device. Based at least on the signal, the controller may determine a stabilization movement of the shared frame. The controller may cause an actuator to move the shared frame according to the stabilization movement. Optionally, the shared frame may also be configured to provide focus adjustments. For example, the controller may be additionally configured to cause the shared frame to move to a focus position based on a focus signal.
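The stabilization step described above can be sketched in a few lines. The function name, the single-gyro-sample model, and the small-angle shift formula below are illustrative assumptions, not the described controller's actual implementation.

```python
import math

def stabilization_shift_mm(angular_velocity_rad_s, dt_s, focal_length_mm):
    """Convert one gyro sample into a compensating shift of the shared
    frame (illustrative small-angle model)."""
    # Rotation accumulated over the sample interval.
    angle_rad = angular_velocity_rad_s * dt_s
    # Apparent image displacement for a small rotation is ~ f * tan(theta);
    # move the frame by the negative of that to cancel it.
    return -focal_length_mm * math.tan(angle_rad)
```

Because the assemblies share one frame, a single computed shift stabilizes all of them at once.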
Abstract:
An example method involves: (a) determining, by a computing device, an indication of distance to an object in a scene, wherein the computing device comprises three or more image-capture devices that provide two or more baselines for stereoscopic imaging, wherein at least two pairs of image-capture devices from the three or more image-capture devices are operable for stereoscopic imaging, wherein each pair provides one of the baselines, and wherein a first of the baselines is non-parallel to a second of the baselines, (b) selecting, by the computing device, a first pair from the at least two pairs of image-capture devices, wherein the first pair is selected based on the indication of distance and the baseline provided by the first pair, and (c) operating the first pair of image-capture devices to capture stereoscopic image data.
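The distance-based pair selection in step (b) might look like the following sketch, which scores each pair only by its baseline length (ignoring baseline orientation) against a target disparity. All names and the disparity model f * B / Z are assumptions for illustration, not the claimed selection criterion.

```python
def select_pair(pairs, distance_m, focal_length_px, target_disparity_px):
    """pairs: list of (pair_id, baseline_m) tuples.

    Pick the pair whose expected disparity f * B / Z at the indicated
    distance Z is closest to a target disparity (hypothetical criterion:
    longer baselines suit farther objects)."""
    def expected_disparity(baseline_m):
        return focal_length_px * baseline_m / distance_m
    return min(pairs,
               key=lambda pair: abs(expected_disparity(pair[1])
                                    - target_disparity_px))
```

For example, with a 2 cm and a 10 cm baseline, a distant object favors the long baseline while a nearby object favors the short one.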
Abstract:
Devices and methods for providing multi-aperture lens functionality are provided. In one example, a device is provided that comprises a plurality of optical element assemblies configured to focus light. The device also includes a shared frame coupled to each of the plurality of optical element assemblies and configured to maintain the plurality of optical element assemblies in a fixed spatial arrangement. A position of the shared frame corresponds to a focus setting of the device. The device also comprises an actuator coupled to the shared frame and configured to cause a change in the position of the shared frame corresponding to a change in the focus setting of the device. The change in the position of the shared frame causes the plurality of optical element assemblies to be configured in the changed focus setting of the device.
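The correspondence between shared-frame position and focus setting can be illustrated with the thin-lens relation 1/f = 1/d_o + 1/d_i. The helper below is a sketch, not the device's calibration: it solves for the image-side distance d_i at which the frame would place the assemblies to focus on an object at distance d_o.

```python
def frame_position_for_focus(focal_length_mm, object_distance_mm):
    """Thin-lens sketch: image-side distance d_i = f * d_o / (d_o - f)
    corresponding to a shared-frame position that focuses every
    assembly on an object at distance d_o."""
    return (focal_length_mm * object_distance_mm
            / (object_distance_mm - focal_length_mm))
```

Because the frame holds all assemblies in a fixed spatial arrangement, one position change refocuses every aperture together.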
Abstract:
An apparatus is described that includes an image sensor having a first output port and a second output port. The first output port is to transmit a first image stream concurrently with a second image stream transmitted from the second output port.
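One way to picture the two concurrent streams is a single capture feeding both ports, for example full-resolution frames on the first port and a reduced version on the second. The split below is a toy model with assumed names, not the sensor's actual readout path.

```python
def split_streams(frames, downscale):
    """Toy model of the dual-port sensor: each captured frame goes out
    full-resolution on port 1 and, concurrently, in reduced form on
    port 2 (downscale is a caller-supplied reduction function)."""
    port1, port2 = [], []
    for frame in frames:
        port1.append(frame)
        port2.append(downscale(frame))
    return port1, port2
```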
Abstract:
The present disclosure relates to curved arrays of individually addressable light-emitting elements for sweeping out angular ranges. One example device includes a curved optical element. The device may also include a curved array of individually addressable light-emitting elements arranged to emit light towards the curved optical element. A curvature of the curved array is substantially concentric to at least a portion of the circumference of the curved optical element. The curved optical element is arranged to focus light emitted from each individually addressable light-emitting element to produce a substantially linear illumination pattern at a different corresponding scan angle within an angular range. The device may further include a control system operable to sequentially activate the individually addressable light-emitting elements such that the substantially linear illumination pattern sweeps out the angular range.
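The sequential activation could be modeled as a simple schedule mapping each element index to a scan angle within the angular range. The even angular spacing and the function name below are assumptions for illustration.

```python
def sweep_schedule(num_elements, angle_min_deg, angle_max_deg):
    """Activation order for the curved array: activating element i
    produces the linear illumination pattern at scan angle i, so
    walking the list sweeps out the angular range (even spacing
    between elements is an assumed simplification)."""
    if num_elements == 1:
        return [(0, angle_min_deg)]
    step = (angle_max_deg - angle_min_deg) / (num_elements - 1)
    return [(i, angle_min_deg + i * step) for i in range(num_elements)]
```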
Abstract:
Imaging systems can often gather higher quality information about a field of view than the unaided human eye. For example, telescopes may magnify very distant objects, microscopes may magnify very small objects, and high frame-rate cameras may capture fast motion. The present disclosure includes devices and methods that provide real-time vision enhancement without the delay of replaying from storage media. The disclosed devices and methods may include a live view user interface with two or more interactive features or effects that may be controllable in real-time. Specifically, the disclosed devices and methods may include a live view display and enhancements of images and other information, which utilize in-line computation and constant control.
Abstract:
A computing device is described that includes a display unit. The display unit includes an array of pixels defined by N rows and M columns and at least one row select unit configured to select one or more of the N rows. The display unit further includes a first column control unit configured to drive a first contiguous group of pixels from the array of pixels, and a second column control unit configured to drive a second contiguous group of pixels from the array of pixels.
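The two column control units might partition the M columns into contiguous halves, each unit driving one group. The even split below is a hypothetical assignment, not the claimed layout.

```python
def column_assignment(m_columns):
    """Hypothetical even split of M columns between the first and
    second column control units; each unit drives one contiguous
    group of pixel columns."""
    split = m_columns // 2
    return list(range(split)), list(range(split, m_columns))
```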
Abstract:
A wearable computing device may receive a data transmission schedule from a wirelessly tethered camera device. The wearable computing device may include a data receiver, and the data transmission schedule may be based on a frame rate and a resolution of the camera device. Possibly in response to receiving the data transmission schedule, the wearable computing device may be readied to receive a data transmission of the data transmission schedule. With the data receiver, the wearable computing device may receive the data transmission. In response to completing the reception of the data transmission, the wearable computing device may power down the data receiver.
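The receive-then-power-down behavior can be sketched as a small state machine. The class and method names below are hypothetical, not the device's firmware API; the point is that the receiver is powered only around each scheduled transmission.

```python
class DataReceiver:
    """Minimal stand-in for the wearable device's data receiver."""
    def __init__(self):
        self.powered = False
        self.received = []

    def power_up(self):
        self.powered = True

    def power_down(self):
        self.powered = False

    def receive(self, data):
        # The receiver must be readied before a scheduled transmission.
        assert self.powered
        self.received.append(data)

def run_schedule(receiver, transmissions):
    """For each scheduled transmission: ready the receiver, receive
    the data, then power the receiver down on completion to save
    energy between transmissions."""
    for data in transmissions:
        receiver.power_up()
        receiver.receive(data)
        receiver.power_down()
```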
Abstract:
Devices and methods for providing low power lens focus functionality with position retention are provided. In one example, a device is provided that comprises an optical element assembly configured to provide a plurality of focus settings based on a change in a position of the optical element assembly. The device also includes a first actuator configured to cause the change in the position. The device also includes a second actuator configured to retain the optical element assembly in the position. The change in the position causes the optical element assembly to be configured in a given focus setting of the plurality of focus settings.
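The interplay of the two actuators (one to move, one to retain) might follow a release, move, engage sequence like the sketch below. The step names and their ordering are assumptions for illustration; the low-power property comes from de-energizing the moving actuator once the retention actuator holds the position.

```python
def focus_with_retention(target_position, log):
    """Hypothetical sequence for the two-actuator design: the second
    (retention) actuator latches the optics so the first (focus)
    actuator can then be powered off."""
    log.append("hold:release")             # free the optics to move
    log.append(f"move:{target_position}")  # focus actuator drives to target
    log.append("hold:engage")              # retention actuator latches position
    log.append("move_actuator:off")        # de-energize the mover to save power
    return log
```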
Abstract:
An example method involves: (a) receiving, at a computing system, image data that is generated by each of a plurality of image-capture systems, wherein the plurality of image-capture systems are all arranged on a given device and are all oriented in substantially the same direction, (b) analyzing, by the computing system, image data that is generated by one or more of the image-capture systems to select image data from at least one of the image-capture systems having a field-of-view that is not substantially occluded by an unintended element, and (c) storing the selected image data.
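The occlusion check in step (b) could be approximated by a crude dark-pixel heuristic, since a lens blocked by a finger or other unintended element tends to produce large dark regions. The threshold, scoring, and names below are illustrative assumptions, not the claimed analysis.

```python
DARK_THRESHOLD = 16  # assumed 8-bit luma cutoff for "blocked" pixels

def occlusion_score(pixels):
    """Fraction of very dark pixels in a flat list of 8-bit luma
    values; higher means more of the view appears blocked."""
    return sum(1 for p in pixels if p < DARK_THRESHOLD) / len(pixels)

def select_unoccluded(images):
    """images: dict mapping capture-system name to its luma values.
    Return the name of the system whose field of view appears least
    occluded, so its image data can be selected and stored."""
    return min(images, key=lambda name: occlusion_score(images[name]))
```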