Abstract:
Disparity is determined for images from an array of disparate image sensors. In one example, a method includes processing images from multiple cameras of a camera array to reduce variations caused by differences in the image sensors of the respective cameras, the multiple cameras including a reference camera and at least one secondary camera, and determining multiple baseline disparities from the image from the reference camera to the images from the secondary cameras.
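The pre-processing step above, reducing variation caused by differences among the image sensors, could take many forms; one simple possibility is a per-image linear photometric normalization that matches each secondary image's brightness statistics to those of the reference image. The sketch below is a hypothetical illustration of that idea, not the method of the abstract; the function name and the gain/offset model are assumptions.

```python
def normalize_to_reference(ref, img):
    """Linearly rescale `img` so its mean and spread match `ref`.

    This compensates for a per-sensor gain and offset difference
    (an assumed model) before disparity is computed. `ref` and
    `img` are flat sequences of pixel intensities.
    """
    mean = lambda xs: sum(xs) / len(xs)
    m_r, m_i = mean(ref), mean(img)
    # Standard deviation of each image's intensities.
    s_r = mean([(v - m_r) ** 2 for v in ref]) ** 0.5
    s_i = mean([(v - m_i) ** 2 for v in img]) ** 0.5
    scale = s_r / s_i if s_i else 1.0
    # Shift and scale the secondary image onto the reference statistics.
    return [(v - m_i) * scale + m_r for v in img]
```

For example, a secondary image whose sensor applies twice the gain and a constant offset relative to the reference would be mapped back onto the reference intensity range, so that a subsequent pixel-matching cost compares like with like.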
Abstract:
A range is determined for a disparity search for images from an image sensor array. In one example, a method includes receiving a reference image and a second image of a scene from multiple cameras of a camera array, detecting feature points of the reference image, matching points of the detected features to points of the second image, determining a maximum disparity between the reference image and the second image, and determining disparities between the reference image and the second image by comparing points of the reference image to points of the second image wherein the points of the second image are limited to points within the maximum disparity.
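The steps above, estimating a maximum disparity from matched feature points and then restricting the per-pixel disparity search to that range, can be sketched as follows. This is a minimal, hypothetical illustration along one scanline using a sum-of-absolute-differences cost; the function names, the cost function, and the window size are assumptions, not details from the abstract.

```python
def max_disparity(matches):
    """Largest horizontal offset among matched feature-point pairs.

    `matches` is a list of ((x_ref, y_ref), (x_sec, y_sec)) tuples
    pairing a detected feature in the reference image with its
    match in the second image.
    """
    return max(abs(xr - xs) for (xr, _), (xs, _) in matches)

def bounded_disparity(ref_row, sec_row, d_max, window=1):
    """Per-pixel disparity along one scanline.

    For each reference pixel, candidate matches in the second image
    are limited to offsets 0..d_max, so the search range determined
    from the feature matches caps the cost of the dense search.
    Cost is the sum of absolute differences over a small window.
    """
    n = len(ref_row)
    disparities = []
    for x in range(n):
        best_d, best_cost = 0, float("inf")
        for d in range(min(d_max, x) + 1):
            cost = sum(
                abs(ref_row[x + k] - sec_row[x - d + k])
                for k in range(-window, window + 1)
                if 0 <= x + k < n and 0 <= x - d + k < n
            )
            if cost < best_cost:
                best_d, best_cost = d, cost
        disparities.append(best_d)
    return disparities
```

Bounding the search at the feature-derived maximum disparity shrinks the candidate set per pixel from the full image width to `d_max + 1` offsets, which is the practical benefit of determining the range first.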