Abstract:
Embodiments relate to two-stage multi-scale processing of an image. First-stage processing circuitry generates an unscaled single-color version of the image, which undergoes noise reduction before a high-frequency component of the unscaled single-color version is generated. A scaler generates a first downscaled version of the image comprising a plurality of color components. Second-stage processing circuitry generates a plurality of sequentially downscaled images based on the first downscaled version, and processes the first downscaled version and the downscaled images to generate a processed version of the first downscaled version. The high-frequency component of the unscaled single-color version and the processed version of the first downscaled version of the image are merged to generate a processed version of the image.
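One possible reading of this two-stage flow is sketched below in Python/NumPy. The function names (box_blur, two_stage_process), the choice of a box filter for noise reduction and low-pass separation, and the fixed 2x scaling factors are illustrative assumptions, not the claimed circuitry.

    import numpy as np

    def box_blur(x, r):
        # Naive box filter used as a stand-in low-pass / noise-reduction step.
        p = np.pad(x, r, mode='edge')
        acc = np.zeros(x.shape, dtype=float)
        for dy in range(2 * r + 1):
            for dx in range(2 * r + 1):
                acc += p[dy:dy + x.shape[0], dx:dx + x.shape[1]]
        return acc / (2 * r + 1) ** 2

    def two_stage_process(image_rgb):
        # Stage 1: unscaled single-color (luma) version, noise-reduced,
        # then split into its high-frequency component.
        luma = image_rgb.mean(axis=2)
        denoised = box_blur(luma, 1)
        high_freq = denoised - box_blur(denoised, 2)

        # Scaler: first downscaled version with all color components.
        down1 = image_rgb[::2, ::2, :].astype(float)

        # Stage 2: sequentially downscaled images (a small pyramid); the
        # multi-scale processing itself is elided and treated as identity.
        pyramid = [down1]
        for _ in range(2):
            pyramid.append(pyramid[-1][::2, ::2, :])
        processed_down1 = pyramid[0]

        # Merge: upscale the processed color image and add back the
        # unscaled high-frequency detail.
        up = np.repeat(np.repeat(processed_down1, 2, axis=0), 2, axis=1)
        up = up[:image_rgb.shape[0], :image_rgb.shape[1], :]
        return up + high_freq[..., None]

The key property the sketch preserves is that fine detail is recovered from the full-resolution single-color path, while the heavier multi-color processing runs only on downscaled data.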
Abstract:
An image processing pipeline may process image data at multiple rates. A stream of raw pixel data collected from an image sensor for an image frame may be processed through one or more pipeline stages of an image signal processor. The stream of raw pixel data may then be converted into a full-color domain and scaled to a data size that is less than an initial data size for the image frame. The converted pixel data may be processed through one or more other pipeline stages and output for storage, further processing, or display. In some embodiments, a back-end interface may be implemented as part of the image signal processor, via which image data collected from sources other than the image sensor may be received and processed through various pipeline stages at the image signal processor.
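A minimal Python/NumPy sketch of this multi-rate arrangement follows. The stage functions (raw_stage, demosaic, backend_stage, run_pipeline) are hypothetical placeholders, and the naive 2x2 RGGB demosaic (even frame dimensions assumed) stands in for the raw-to-full-color conversion that also reduces the data rate.

    import numpy as np

    def raw_stage(bayer):
        # Placeholder raw-domain stage running at the initial (sensor) rate,
        # e.g. defective pixel correction; identity here.
        return bayer.astype(float)

    def demosaic(bayer):
        # Naive 2x2-block demosaic (RGGB layout, even dimensions assumed):
        # converts the raw stream into the full-color domain at half
        # resolution, so later stages see less data than the initial frame.
        r = bayer[0::2, 0::2]
        g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
        b = bayer[1::2, 1::2]
        return np.dstack([r, g, b])

    def backend_stage(rgb):
        # Placeholder full-color stage (e.g. color correction, tone mapping).
        return np.clip(rgb, 0.0, 1023.0)

    def run_pipeline(raw_bayer=None, backend_input=None):
        if raw_bayer is not None:
            # Sensor path: raw-rate stages, then conversion and scaling.
            rgb = demosaic(raw_stage(raw_bayer))
        else:
            # Back-end interface: image data from a non-sensor source
            # enters here and reuses the later pipeline stages.
            rgb = backend_input.astype(float)
        return backend_stage(rgb)  # to storage, further processing, or display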
Abstract:
Embodiments of the present disclosure generally relate to image signal processing logic, and in particular to: separating undecimated image signal data into two components, one at lower resolution and one at full resolution; generating interpolation guidance information based on the two components created by the separation; forming difference image data representing the difference between the chroma and luma values of each pixel and those of its neighboring pixels; and merging the processed image data from the processing pipelines with the unprocessed image data using the generated interpolation guidance information. The interpolation guidance information is generated by determining distances between pixel values from a group comprising pixels at interpolation nodes, pixels diagonally adjacent to the interpolation nodes, pixels horizontally adjacent to the interpolation nodes, and pixels vertically adjacent to the interpolation nodes.
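The following Python/NumPy sketch illustrates the guidance, difference, and merge computations under stated assumptions: interpolation_guidance, difference_image, and merge are hypothetical names, the L1 distance is an illustrative metric, and the guidance-gated blend is only one possible merging rule.

    import numpy as np

    def interpolation_guidance(full_res):
        # For each interpolation node, accumulate value distances to its
        # horizontal, vertical, and diagonal neighbors; the direction with
        # the smallest distance guides the later merge. The 0.5 diagonal
        # weighting (4 neighbors vs. 2) is an illustrative normalization.
        p = np.pad(full_res, 1, mode='edge')
        c = p[1:-1, 1:-1]
        horiz = np.abs(c - p[1:-1, :-2]) + np.abs(c - p[1:-1, 2:])
        vert = np.abs(c - p[:-2, 1:-1]) + np.abs(c - p[2:, 1:-1])
        diag = 0.5 * (np.abs(c - p[:-2, :-2]) + np.abs(c - p[:-2, 2:])
                      + np.abs(c - p[2:, :-2]) + np.abs(c - p[2:, 2:]))
        return np.stack([horiz, vert, diag], axis=-1).argmin(axis=-1)

    def difference_image(channel):
        # Difference of each pixel against the mean of its 4-neighbors,
        # computed per chroma/luma plane.
        p = np.pad(channel, 1, mode='edge')
        nbrs = (p[1:-1, :-2] + p[1:-1, 2:] + p[:-2, 1:-1] + p[2:, 1:-1]) / 4.0
        return channel - nbrs

    def merge(processed_low, unprocessed_full, guidance_map, weight=0.5):
        # Blend upsampled processed data with unprocessed full-resolution
        # data; here the guidance simply gates where blending applies.
        up = np.repeat(np.repeat(processed_low, 2, axis=0), 2, axis=1)
        up = up[:unprocessed_full.shape[0], :unprocessed_full.shape[1]]
        out = unprocessed_full.astype(float)
        mask = guidance_map != 2  # illustrative use of the guidance
        out[mask] = (1 - weight) * out[mask] + weight * up[mask]
        return out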
Abstract:
An image processing pipeline may perform noise filtering and image sharpening utilizing common spatial support. A noise filter may perform a spatial noise filtering technique to determine a filtered value of a given pixel based on spatial support obtained from line buffers. Sharpening may also be performed to generate a sharpened value of the given pixel based on spatial support obtained from the same line buffers. A filtered and sharpened version of the pixel may be generated by combining the filtered value of the given pixel with the sharpened value of the given pixel. In at least some embodiments, the noise filter performs spatial noise filtering and image sharpening on a luminance value of the given pixel when the given pixel is received in a luminance-chrominance encoding.
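A rough Python/NumPy sketch of the shared-support idea follows, assuming a 5x5 neighborhood (five "line buffers"), a range-weighted noise filter, and unsharp-mask sharpening; the sigma and strength values and the equal-weight combination are illustrative choices, not the claimed design.

    import numpy as np

    def filter_and_sharpen_luma(luma, strength=0.5, sigma=10.0):
        h, w = luma.shape
        p = np.pad(luma.astype(float), 2, mode='edge')
        out = np.empty((h, w), dtype=float)
        for y in range(h):
            support = p[y:y + 5, :]  # same five line buffers feed both paths
            for x in range(w):
                window = support[:, x:x + 5]
                center = window[2, 2]
                # Noise filter: range-weighted average over the support.
                wgt = np.exp(-((window - center) ** 2) / (2 * sigma ** 2))
                filtered = (wgt * window).sum() / wgt.sum()
                # Sharpening: high-pass computed from the same support.
                sharp = center + strength * (center - window.mean())
                # Combine the filtered and sharpened values of the pixel.
                out[y, x] = 0.5 * filtered + 0.5 * sharp
        return out

Because both paths read from one neighborhood, only one set of line buffers is needed, which is the hardware motivation for the common spatial support.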
Abstract:
An image processing pipeline may account for clipped pixels in auto focus statistics. Generating auto focus statistics may include evaluating a neighborhood of pixels with respect to a given pixel in a stream of pixels for an image. If a clipped pixel is identified within the neighborhood of pixels, then the evaluation of the given pixel may be excluded from an auto focus statistic. The image processing pipeline may also provide auto focus statistics that do not exclude clipped pixels. A luminance edge detection value may, in some embodiments, be generated by applying an IIR filter to band-pass filter the given pixel before including the band-pass filtered pixel in the generation of the luminance edge detection value.
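A Python/NumPy sketch of both statistic variants is shown below; the 3x3 neighborhood, the squared-gradient edge measure, the 10-bit clip level, and the difference-of-low-passes band-pass are all illustrative assumptions rather than the claimed logic.

    import numpy as np

    def af_statistics(pixels, clip_level=1023):
        p = np.pad(pixels.astype(float), 1, mode='edge')
        # Simple horizontal squared gradient as the focus/edge measure.
        energy = (p[1:-1, 2:] - p[1:-1, 1:-1]) ** 2
        # Does any pixel in the 3x3 neighborhood reach the clip level?
        clipped = p >= clip_level
        has_clip = np.zeros(pixels.shape, dtype=bool)
        for dy in range(3):
            for dx in range(3):
                has_clip |= clipped[dy:dy + pixels.shape[0],
                                    dx:dx + pixels.shape[1]]
        stat_with_clipped = energy.sum()            # keeps every evaluation
        stat_excluding = energy[~has_clip].sum()    # clipped neighborhoods out
        return stat_with_clipped, stat_excluding

    def iir_bandpass(row, a_fast=0.5, a_slow=0.125):
        # Difference of two first-order IIR low-passes as a crude band-pass,
        # standing in for the luminance edge detection pre-filter.
        out = np.empty(len(row))
        s_fast = s_slow = float(row[0])
        for i, v in enumerate(row):
            s_fast += a_fast * (v - s_fast)
            s_slow += a_slow * (v - s_slow)
            out[i] = s_fast - s_slow
        return out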
Abstract:
An image signal processor of a device, apparatus, or computing system that includes a camera capable of capturing image data may apply piecewise perspective transformations to image data received from the camera's image sensor. A scaling unit of an Image Signal Processor (ISP) may perform piecewise perspective transformations on a captured image to correct for rolling shutter artifacts and to provide video image stabilization. Image data may be divided into a series of horizontal slices and perspective transformations may be applied to each slice. The transformations may be based on motion data determined in any of various manners, such as by using gyroscopic data and/or optical-flow calculations. The piecewise perspective transforms may be encoded as Digital Difference Analyzer (DDA) steppers and may be implemented using separable scalar operations. The image signal processor may not write the received image data to system memory until after the transformations have been performed.
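One way to picture the slice-wise warping is the Python/NumPy sketch below, written for a single-channel frame: each horizontal slice gets its own 3x3 perspective matrix, applied by inverse mapping with nearest-neighbor sampling. The hardware's fixed-point DDA stepping and separable operation are approximated here with floating-point matrix math, and piecewise_perspective and its parameters are hypothetical.

    import numpy as np

    def piecewise_perspective(image, homographies, slice_height):
        # Warp each horizontal slice with its own 3x3 perspective matrix
        # (derived, e.g., from gyroscope or optical-flow motion data).
        h, w = image.shape
        out = np.zeros_like(image)
        for i, H in enumerate(homographies):
            y0 = i * slice_height
            y1 = min(y0 + slice_height, h)
            ys, xs = np.mgrid[y0:y1, 0:w]
            dst = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
            src = np.linalg.inv(H) @ dst   # map output pixels back to input
            sx = np.clip(np.round(src[0] / src[2]), 0, w - 1).astype(int)
            sy = np.clip(np.round(src[1] / src[2]), 0, h - 1).astype(int)
            out[y0:y1] = image[sy, sx].reshape(y1 - y0, w)
        return out

The homographies list would hold ceil(h / slice_height) matrices, for example interpolated per slice from motion samples spanning the frame's rolling-shutter readout interval, which is what lets the transform differ from the top of the frame to the bottom.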
Abstract:
A pedestal level for an image sensor can be dynamically adjusted based on one or more parameters. The parameters include one or more operating conditions associated with the image sensor, pre-determined image sensor characterization data, the number of unused digital codes, and/or the number of clipped pixel signals. The operating conditions can include the temperature of the image sensor, the gain of at least one amplifier included in processing circuitry operably connected to at least one pixel, and/or the length of the integration period for at least one pixel in the image sensor. Based on one or more of these parameters, the pedestal level is adjusted to reduce the number of unused digital codes in a distribution of dark current. Additionally or alternatively, the variance of the pixel signals can be reduced to permit the use of a lower pedestal level.
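A simple Python/NumPy sketch of the code-usage-driven part of such an adjustment follows; adjust_pedestal, the step size, and the thresholds are assumptions, and a real implementation could also seed the pedestal from characterization data indexed by temperature, amplifier gain, and integration time.

    import numpy as np

    def adjust_pedestal(dark_frame, pedestal, step=4):
        # dark_frame: dark-current readings (noise may drive some negative).
        # Adding the pedestal offsets them into the output code range.
        codes = dark_frame.astype(int) + pedestal
        clipped = int(np.count_nonzero(codes <= 0))
        if clipped > 0:
            # Too many dark signals lost below code 0: raise the pedestal.
            return pedestal + step
        lowest_used = int(codes.min())
        if lowest_used > step:
            # Codes 0..lowest_used-1 go unused: lower the pedestal to
            # reclaim them for signal.
            return pedestal - step
        return pedestal

Reducing pixel-signal variance (for example through noise reduction upstream) tightens the dark-current distribution, which is why it permits a lower pedestal in the "additionally or alternatively" case above.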