Abstract:
The same image sensor is used to capture both a two-dimensional (2D) image of a three-dimensional (3D) object and 3D depth measurements of the object. A laser point-scans the surface of the object with light spots, which are detected by a pixel array in the image sensor to generate the 3D depth profile of the object using triangulation. Each row of pixels in the pixel array forms an epipolar line of the corresponding laser scan line. Timestamping provides a correspondence between the pixel location of a captured light spot and the respective scan angle of the laser to remove any ambiguity in triangulation. An Analog-to-Digital Converter (ADC) in the image sensor operates as a Time-to-Digital Converter (TDC) to generate timestamps. A timestamp calibration circuit is provided on-board to record the propagation delay of each column of pixels in the pixel array and to provide necessary corrections to the timestamp values generated during 3D depth measurements.
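The triangulation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the pinhole-camera geometry, the assumption of a constant angular scan speed (so the timestamp maps linearly to the scan angle), and all parameter names (`baseline`, `focal_px`, `theta0`, `omega`) are assumptions for the example.

```python
import math

def depth_from_timestamp(x_pixel, t_stamp, baseline, focal_px, theta0, omega):
    """Recover depth by triangulation from a timestamped spot detection.

    x_pixel  : horizontal pixel offset of the detected light spot (pixels)
    t_stamp  : corrected timestamp of the detection (s)
    baseline : laser-to-sensor separation (m)            [assumed geometry]
    focal_px : focal length expressed in pixels          [assumed geometry]
    theta0   : laser scan angle at t = 0 (rad)           [assumed]
    omega    : angular scan speed (rad/s)                [assumed constant]
    """
    # The timestamp disambiguates which scan angle produced this spot.
    theta = theta0 + omega * t_stamp
    # Similar triangles (pinhole model): Z = f * B / (x + f * tan(theta))
    return focal_px * baseline / (x_pixel + focal_px * math.tan(theta))
```

A quick sanity check is to project a known depth to a pixel position and verify that the same depth is recovered from that position and timestamp.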
Abstract:
A data compressor includes a zero-value remover, a zero bit mask generator and a non-zero values packer. The zero-value remover receives 2N bit streams of values and outputs 2N non-zero-value bit streams having zero values removed from each respective bit stream based on a selected granularity of compression for values contained in the bit streams. The zero bit mask generator receives the 2N bit streams of values and generates a zero bit mask corresponding to the selected granularity of compression. Each zero bit mask indicates a location of a zero value based on the selected granularity of compression. The non-zero values packer receives the 2N non-zero-value bit streams and forms at least one first group of packed non-zero values.
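The zero-removal scheme described above can be sketched in a few lines. This is an illustrative model only: the mask polarity (1 = all-zero group) and the group-wise granularity are assumptions, and real hardware would operate on parallel bit streams rather than Python lists.

```python
def compress(values, granularity):
    """Split `values` into groups of `granularity`; emit a zero bit mask
    (one bit per group, 1 = group is all zeros) plus the packed
    non-zero groups with the zero groups removed."""
    mask, packed = [], []
    for i in range(0, len(values), granularity):
        group = values[i:i + granularity]
        if any(group):
            mask.append(0)
            packed.extend(group)
        else:
            mask.append(1)
    return mask, packed

def decompress(mask, packed, granularity):
    """Rebuild the original stream by re-inserting zero groups where
    the mask indicates them."""
    out, pos = [], 0
    for bit in mask:
        if bit:
            out.extend([0] * granularity)
        else:
            out.extend(packed[pos:pos + granularity])
            pos += granularity
    return out
```

The mask records where zeros were removed, so the original stream is exactly recoverable from the mask plus the packed non-zero values.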
Abstract:
A processor. In some embodiments, the processor includes: a first tile, the first tile being configured: to feed a first nibble from a third queue, through a first shuffler, to a first multiplier, and to multiply, in the first multiplier, the first nibble from the third queue by a first nibble of a third weight; to feed a second nibble from the third queue, through the first shuffler, to a second multiplier, and to multiply, in the second multiplier, the second nibble from the third queue by a second nibble of the third weight; to feed a first nibble from a fourth queue, through the first shuffler, to a third multiplier, and to multiply, in the third multiplier, the first nibble from the fourth queue by a first nibble of a fourth weight.
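The nibble-wise multiplication the tile performs can be illustrated arithmetically: an 8-bit product is assembled from four 4x4-bit partial products, each weighted by the positions of the nibbles involved. This sketch only demonstrates the arithmetic identity; the queues, shufflers, and weight routing of the claimed tile are not modeled.

```python
def mul8_via_nibbles(a, b):
    """Multiply two unsigned 8-bit values using four 4x4-bit partial
    products, as an array of nibble multipliers might."""
    a_lo, a_hi = a & 0xF, (a >> 4) & 0xF
    b_lo, b_hi = b & 0xF, (b >> 4) & 0xF
    # Four partial products, shifted by the combined weight of each
    # nibble pair: (hi*hi) << 8, the two cross terms << 4, (lo*lo) << 0.
    return ((a_hi * b_hi) << 8) + ((a_hi * b_lo + a_lo * b_hi) << 4) + (a_lo * b_lo)
```

Splitting operands into nibbles lets narrow multipliers be reused for wider precisions, which is one motivation for this style of datapath.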
Abstract:
A Time-of-Flight (TOF) technique is combined with analog amplitude modulation within each pixel in a pixel array using multiple Single Photon Avalanche Diodes (SPADs) in conjunction with a single Pinned Photo Diode (PPD) in each pixel. A SPAD may be shared among multiple neighboring pixels. The TOF information is added to the received light signal by the analog domain-based single-ended to differential converter inside the pixel itself. The spatial-temporal correlation among outputs of multiple, adjacent SPADs in a pixel is used to control the operation of the PPD to facilitate recording of TOF values and range of an object. Erroneous range measurements due to ambient light are prevented by stopping the charge transfer from the PPD—and, hence, recording a TOF value—only when two or more SPADs in the pixel are triggered within a pre-defined time interval. An autonomous navigation system with multi-SPAD pixels provides improved vision for drivers under difficult driving conditions.
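The coincidence condition that gates the PPD charge transfer can be sketched as a simple check on SPAD trigger times. The window length and the trigger representation are assumptions for illustration; the actual correlation is performed in-pixel, not in software.

```python
def coincidence(trigger_times, window):
    """Return True when two or more SPAD triggers fall within `window`
    seconds of each other -- the condition for accepting a TOF event
    and allowing charge transfer from the PPD.  Isolated triggers
    (e.g., from ambient light) are rejected."""
    times = sorted(trigger_times)
    return any(t2 - t1 <= window for t1, t2 in zip(times, times[1:]))
```

A returned spot of laser light tends to fire adjacent SPADs nearly simultaneously, whereas ambient photons arrive uncorrelated, so this test suppresses erroneous range measurements.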
Abstract:
A Dynamic Vision Sensor (DVS) pose-estimation system includes a DVS, a transformation estimator, an inertial measurement unit (IMU) and a camera-pose estimator based on sensor fusion. The DVS detects DVS events and forms frames based on a number of accumulated DVS events. The transformation estimator estimates a 3D transformation of the DVS camera based on an estimated depth, and matches confidence-level values within a camera-projection model such that at least one of a plurality of DVS events detected during a first frame corresponds to a DVS event detected during a second, subsequent frame. The IMU detects inertial movements of the DVS with respect to world coordinates between the first and second frames. The camera-pose estimator estimates a change in the pose of the camera-projection model between the first frame and the second frame by combining the estimated transformation and the detected inertial movements of the DVS.
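The event-count-based framing step can be sketched simply. The event tuple layout `(x, y, t, polarity)` and the fixed-count framing policy are assumptions for illustration; the claimed system may frame events differently.

```python
def frames_from_events(events, events_per_frame):
    """Group a stream of DVS events (x, y, t, polarity) into frames,
    closing each frame once a fixed number of events has accumulated."""
    frames, current = [], []
    for ev in events:
        current.append(ev)
        if len(current) == events_per_frame:
            frames.append(current)
            current = []
    if current:  # flush the last, partially filled frame
        frames.append(current)
    return frames
```

Framing by event count rather than by fixed time means frames adapt to scene dynamics: fast motion produces frames more often.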
Abstract:
An imaging device includes at least a first group of pixels. A driver block is configured to generate at least two shutter signals, each having on-phases periodically alternating with off-phases. The shutter signals might not be in phase. The imaging device may have an optical shutter that is partitioned in two or more parts, or a set of two or more optical shutters. The shutter parts, or the shutters, may receive the shutter signals, and accordingly open and close. The driver block is designed for reduced power consumption, which is highly desirable for mobile applications. Moreover, 3D imaging may be implemented that uses various time-of-flight configurations.
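The periodically alternating, phase-offset shutter signals can be modeled as square waves. The period/duty/phase parameterization here is an assumption for illustration, not the claimed driver design.

```python
def shutter_state(t, period, duty, phase):
    """Return True (shutter open) when the signal is in its on-phase.
    On-phases alternate periodically with off-phases; `phase` (as a
    fraction of the period) offsets one signal relative to another."""
    return ((t / period + phase) % 1.0) < duty
```

Two signals with the same period but different `phase` values model shutters, or shutter parts, that open and close out of phase with each other.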
Abstract:
An imaging apparatus may include an image sensor that includes a Bayer color filter array (CFA) of 2-by-2 cell groups, each of which includes a red pixel, a blue pixel, and two green pixels, and an image signal processor for processing raw data from the Bayer CFA. The image signal processor corrects the raw data to reduce artifacts, such as image contrast degradation, caused by using green pixels with a wide transmission spectrum.
Abstract:
A method and a system are disclosed that use a double-readout technique to generate timestamps and grayscale values for pixel-specific outputs of a pixel row in a pixel array, in which the pixel array forms an image plane and the row of pixels forms an epipolar line of a scanning line on the image plane. For a pixel-specific output that exceeds a threshold, a timestamp value and a grayscale value are generated and associated with the pixel-specific output. If a pixel-specific output does not exceed the threshold and no timestamp is generated (i.e., a missing timestamp), a grayscale value for the pixel-specific output is generated and associated with the pixel-specific output. Timestamp errors may be corrected, including those caused by missing timestamps, timestamps associated with pixel clusters, and outlier timestamps, i.e., pixel-specific outputs that are not consistent with the normally monotonic relationship of timestamp values.
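One simple way to repair missing timestamps, exploiting the expected monotonic variation along a row, is linear interpolation between the nearest valid neighbors. This is a sketch of that idea only; the patented correction for cluster and outlier timestamps is not reproduced here, and the `None`-for-missing convention is an assumption.

```python
def repair_timestamps(stamps):
    """Repair a row of pixel timestamps (None = missing) by linear
    interpolation between the nearest valid neighbors, assuming the
    timestamps normally vary monotonically along the row."""
    out = list(stamps)
    known = [i for i, t in enumerate(out) if t is not None]
    for i, t in enumerate(out):
        if t is not None:
            continue
        left = max((k for k in known if k < i), default=None)
        right = min((k for k in known if k > i), default=None)
        if left is not None and right is not None:
            frac = (i - left) / (right - left)
            out[i] = out[left] + frac * (out[right] - out[left])
        elif left is not None:   # missing at the row's end
            out[i] = out[left]
        elif right is not None:  # missing at the row's start
            out[i] = out[right]
    return out
```

An outlier filter could reuse the same neighbor logic, replacing any timestamp that breaks monotonicity with its interpolated estimate.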
Abstract:
An embodiment includes a system, comprising: a buffer configured to store a plurality of events, each event including a plurality of bits; an output circuit configured to output events from the buffer; and a controller coupled to the buffer and the output circuit and configured to: cause the output circuit to output a first event from the buffer; and select a second event from the buffer to be output by the output circuit after the first event based on bits associated with the first event.
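One plausible reading of the bit-based selection is choosing, as the next output, the buffered event that shares the most leading bits with the event just output, so consecutive outputs differ in fewer bits. This selection criterion, the integer event encoding, and the `bits` width are all assumptions for illustration.

```python
def next_event(buffer, previous, bits=16):
    """Pick the buffered event sharing the most leading bits with the
    previously output event (e.g., a common row/group address), so
    that consecutive outputs differ in fewer bits."""
    def shared_leading_bits(a, b):
        diff = (a ^ b) & ((1 << bits) - 1)
        return bits - diff.bit_length()
    return max(buffer, key=lambda ev: shared_leading_bits(ev, previous))
```

Grouping events that share address bits allows the shared bits to be transmitted once, reducing output bandwidth.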
Abstract:
An image sensor operates in a base mode according to a first sampling rate and a first pixel exposure time and includes a light condition detector that extracts illuminance information from a pixel signal received from pixels in a pixel array, and generates a low-illuminance information signal upon detecting that a value of the illuminance information falls within a first range, a sampling controller that changes the first sampling rate to a second sampling rate in response to the low-illuminance information signal, and an exposure time controller that changes the first pixel exposure time to a second pixel exposure time in response to the low-illuminance information signal.
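The mode switch described above can be sketched as a simple policy function. The specific illuminance range, rates, and exposure times below are placeholder values, not figures from the claim.

```python
def select_mode(illuminance, low_range=(0.0, 10.0),
                base=(60.0, 1 / 120), low_light=(30.0, 1 / 30)):
    """Return (sampling_rate_hz, exposure_time_s).  Switch from the
    base mode to a lower sampling rate with a longer pixel exposure
    when the measured illuminance falls inside the low range."""
    lo, hi = low_range
    if lo <= illuminance <= hi:  # low-illuminance information signal asserted
        return low_light
    return base
```

Lowering the sampling rate while lengthening the exposure lets each pixel integrate more light in dim scenes at the cost of temporal resolution.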