Abstract:
An image processing apparatus processes an image obtained by an image capturing apparatus that includes a plurality of first photoelectric converters and a plurality of second photoelectric converters, each first photoelectric converter forming a pair with one of the second photoelectric converters. Each second photoelectric converter provides a signal having a parallax with respect to the signal from the corresponding first photoelectric converter. The image capturing apparatus also includes a signal reading unit configured to read a first signal as an output signal from the first photoelectric converter and to read a second signal as a combination signal of the first signal and an output signal from the paired second photoelectric converter during image capturing in which the first and second photoelectric converters receive light from an object. The image processing apparatus includes a defocus amount calculating unit configured to calculate a defocus amount using the first signal and the second signal.
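As a rough illustration of the described read-out and defocus calculation, the following NumPy sketch recovers the paired signal from the combination signal and estimates a defocus amount from a simple SAD correlation; the shift range, the conversion coefficient, and the function name are assumptions for illustration, not values from the abstract.

```python
import numpy as np

def defocus_from_signals(first, combined, conversion_coeff=1.0, max_shift=16):
    """Estimate a defocus amount from a first (A) signal and a combined (A+B) signal.

    `conversion_coeff` (image shift -> defocus) and `max_shift` are placeholders;
    in practice they depend on the optical system.
    """
    first = np.asarray(first, dtype=np.float64)
    combined = np.asarray(combined, dtype=np.float64)

    # Recover the paired (B) signal: the second signal is read out as A + B.
    second = combined - first

    # Correlate the A and B signals over a range of shifts (SAD criterion).
    shifts = list(range(-max_shift, max_shift + 1))
    sad = [np.abs(np.roll(second, s) - first).sum() for s in shifts]

    # The shift minimizing the SAD approximates the image displacement (parallax).
    image_shift = shifts[int(np.argmin(sad))]

    # Convert the image shift into a defocus amount.
    return conversion_coeff * image_shift
```

A real implementation would typically add sub-pixel interpolation around the SAD minimum and obtain the conversion coefficient from lens data.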
Abstract:
An image captured with flash light and an image captured without flash light are each divided into a plurality of regions, and a luminance value is calculated for each region. For each region, a degree of influence of light, which represents the influence of the light emitted by the light emitting device on the luminance value of that region in the image captured with flash light, is calculated as the ratio of the difference between the region's luminance value in the image captured with flash light and its luminance value in the image captured without flash light to its luminance value in the image captured without flash light. Regions in which the degree of influence of light is larger than a predetermined threshold are determined to be illuminated regions.
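The per-region computation follows directly from the stated ratio. In the Python/NumPy sketch below, the block size and threshold are illustrative assumptions, as is the function name.

```python
import numpy as np

def illuminated_regions(lum_flash, lum_no_flash, block=16, threshold=0.2):
    """Mark regions where the flash dominates the luminance.

    `lum_flash` and `lum_no_flash` are 2-D luminance images of equal size;
    `block` and `threshold` are illustrative values, not from the abstract.
    """
    h, w = lum_no_flash.shape
    rows, cols = h // block, w // block
    influence = np.zeros((rows, cols))

    for r in range(rows):
        for c in range(cols):
            ys = slice(r * block, (r + 1) * block)
            xs = slice(c * block, (c + 1) * block)
            y_flash = lum_flash[ys, xs].mean()       # region luminance with flash
            y_ambient = lum_no_flash[ys, xs].mean()  # region luminance without flash
            # Degree of influence of light: (flash - ambient) / ambient.
            influence[r, c] = (y_flash - y_ambient) / max(y_ambient, 1e-6)

    return influence > threshold  # boolean map of illuminated regions
```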
Abstract:
A display apparatus includes a display device capable of changing at least one of a plurality of parameters including a luminescence intensity, a color gamut, and an electro-optical transfer function (EOTF), and at least one processor or circuit configured to function as a display control unit. The display control unit controls the display device so that, when the display device is caused to display a plurality of images, it displays the plurality of images and a background image around each of the plurality of images such that the appearance of the background image is visually identical among the plurality of images.
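One way to read the background requirement is that the background should be driven to the same absolute luminance regardless of which EOTF is applied to each image. The sketch below uses the standard gamma-2.2 and SMPTE ST 2084 (PQ) inverse EOTFs; the function name and the two-option interface are illustrative assumptions, not from the abstract.

```python
def background_code_value(target_nits, eotf, peak_nits):
    """Return the drive value (0..1) that reproduces `target_nits` under a given EOTF.

    `eotf` is 'gamma22' or 'pq'; the PQ constants follow SMPTE ST 2084.
    """
    if eotf == 'gamma22':
        # Inverse of L = peak * v**2.2 (relative-luminance display).
        return (target_nits / peak_nits) ** (1.0 / 2.2)
    if eotf == 'pq':
        # Inverse PQ EOTF for absolute luminance up to 10000 nits.
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        y = target_nits / 10000.0
        return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
    raise ValueError(eotf)

# Example: drive each image's background to the same 5-nit gray,
# whether the image itself is rendered with gamma 2.2 or PQ.
bg_sdr = background_code_value(5.0, 'gamma22', peak_nits=100)
bg_hdr = background_code_value(5.0, 'pq', peak_nits=1000)
```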
Abstract:
An imaging apparatus receives light fluxes passing through and incident on different pupil regions of an imaging optical system and generates image data of at least one pair of subject images. Imaging surface correction information regarding a manufacturing error or design optical characteristics of an imaging lens is transmitted from a lens control unit of the imaging lens to a system control unit of a camera body, and the system control unit receives the imaging surface correction information. The system control unit generates a defocus map based on a parallax between the at least one pair of images and performs imaging surface correction by correcting the calculated defocus amount. In the imaging surface correction process, the influences of the optical characteristics of the imaging lens and an inclination of an imaging surface of an image sensor are corrected, and thus a more accurate distance map is generated.
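A hedged sketch of the imaging surface correction step, assuming the defocus map and the lens correction information are per-block arrays and modeling the sensor inclination as a plane; the names and the planar tilt model are assumptions for illustration.

```python
import numpy as np

def correct_defocus_map(defocus_map, lens_correction, tilt_x=0.0, tilt_y=0.0):
    """Apply imaging-surface correction to a per-block defocus map.

    `lens_correction` is a map (same shape as `defocus_map`) of the lens's
    design/manufacturing field curvature received from the lens side;
    `tilt_x`/`tilt_y` model the image sensor inclination as a plane.
    """
    rows, cols = defocus_map.shape
    yy, xx = np.mgrid[0:rows, 0:cols]

    # Sensor tilt contributes a planar defocus offset across the frame.
    tilt_plane = tilt_x * (xx - cols / 2.0) + tilt_y * (yy - rows / 2.0)

    # Subtract lens field-curvature and sensor-tilt contributions so the
    # remaining defocus reflects subject distance only.
    return defocus_map - lens_correction - tilt_plane
```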
Abstract:
An image-pickup apparatus includes a synthesizer configured to synthesize a plurality of images, a detector configured to detect positional information, a first instructor configured to instruct a start of a photography preparing operation, a second instructor configured to instruct a start of a photographic operation, and a controller. The controller determines first and second capturing conditions in accordance with an instruction from the first instructor and holds reference position information based upon the positional information obtained when the first instructor provides the instruction. In accordance with an instruction from the second instructor, the controller makes an image-pickup unit start consecutive shooting under the first capturing condition, and makes the image-pickup unit capture an image under the second capturing condition when the detector detects that the positional information of the image-pickup apparatus corresponds to the reference position information.
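One possible reading of this control flow, sketched in Python with illustrative class and method names (the camera and detector interfaces are stand-ins, not from the abstract):

```python
class CompositeShootingController:
    """Hedged sketch: fix conditions and a reference position on the preparation
    instruction, then during consecutive shooting switch to the second condition
    whenever the detected position matches the reference."""

    def __init__(self, camera, detector, tolerance=0.5):
        self.camera = camera        # assumed to expose decide_conditions() and capture(condition)
        self.detector = detector    # assumed to expose position() -> float (e.g. an angle)
        self.tolerance = tolerance  # match tolerance for the reference position
        self.reference = None
        self.cond_first = None
        self.cond_second = None

    def on_prepare(self):
        # First instruction: decide both capturing conditions and hold the reference position.
        self.cond_first, self.cond_second = self.camera.decide_conditions()
        self.reference = self.detector.position()

    def on_shoot(self, frames):
        # Second instruction: consecutive shooting under the first condition,
        # capturing under the second condition when the position matches the reference.
        images = []
        for _ in range(frames):
            pos = self.detector.position()
            if abs(pos - self.reference) <= self.tolerance:
                images.append(self.camera.capture(self.cond_second))
            else:
                images.append(self.camera.capture(self.cond_first))
        return images  # later synthesized into one image by the synthesizer
```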
Abstract:
This invention encodes, using less memory, a wide-angle image obtained by performing image capturing a plurality of times. An apparatus includes a compositing unit that, each time an image capturing unit captures an image, crops a partial image of a predetermined region from the captured image and composites the partial image with a composed image obtained from previously captured images; an encoding unit that, when the composed image updated by the compositing unit reaches a pre-set size, encodes a tile of the composed image; a releasing unit that releases the area of memory used for the encoded tile; and a control unit that controls the compositing unit, the encoding unit, and the releasing unit so as to repeatedly perform these operations until a pre-set condition is satisfied.
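The repeat-until-done loop can be sketched as follows; the callables and the column-wise representation of the composed image are illustrative stand-ins for the compositing, encoding, and releasing units.

```python
def stitch_and_encode(capture, crop_region, tile_size, encode_tile, done):
    """Hedged sketch of the described loop (all callables are stand-ins).

    Each captured frame contributes a cropped strip to the composed image;
    whenever the composed strip reaches one tile, that tile is encoded and its
    memory released, so the full wide-angle image never has to be held at once.
    """
    composed = []  # pending, not-yet-encoded columns of the composed image
    while not done():
        frame = capture()                    # grab the next captured image
        composed.extend(crop_region(frame))  # crop a partial image and composite it

        # Encode and discard full tiles as soon as they are available.
        while len(composed) >= tile_size:
            encode_tile(composed[:tile_size])  # encode one tile of the composed image
            del composed[:tile_size]           # release the memory used for that tile

    if composed:              # flush any remaining partial tile at the end
        encode_tile(composed)
```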
Abstract:
An image processing apparatus properly sets a moving image area according to an image. An image processor generates a combined image by setting one of a plurality of images as a basic image and disposing a moving image, generated from the plurality of images, in a moving image area designated in part of the basic image. The image processor also sets a still area in part of the basic image. A system controller causes the basic image to be displayed on a display section, and a user designates the moving image area by performing a touch operation on the display section displaying the basic image. When the designated moving image area overlaps the still area, the image processor performs image combining after deleting the overlapping area from the moving image area.
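A minimal sketch of resolving the overlap, assuming the moving image area and still area are boolean masks over the basic image; the names and the mask-based interface are illustrative assumptions.

```python
import numpy as np

def resolve_moving_area(moving_mask, still_mask):
    """Delete the overlap with the still area from a user-designated moving image area."""
    return np.logical_and(moving_mask, np.logical_not(still_mask))

def combine(basic_image, moving_frames, moving_mask, still_mask):
    """Place moving-image pixels only where the effective moving area applies."""
    effective = resolve_moving_area(moving_mask, still_mask)
    combined = [basic_image.copy() for _ in moving_frames]
    for out, frame in zip(combined, moving_frames):
        out[effective] = frame[effective]  # still area (and its overlap) stays from the basic image
    return combined  # frames of the combined (cinemagraph-like) image
```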