Abstract:
An image joining apparatus includes: a motion vector detection unit that detects a motion vector between images; a motion compensating unit that compensates for a positional difference between the images by correcting one of the images using the motion vector, thereby generating a motion-compensated image; a difference determination unit that selects one of the images and determines whether a difference value between adjacent pixels in the selected image is equal to or larger than a predetermined value; an overwriting unit that overwrites another image with the selected image in an area of the selected image in which the difference value between the adjacent pixels is equal to or larger than the predetermined value; and a mixing unit that locally mixes the selected image and the other image in an area of the selected image in which the difference value between the adjacent pixels is smaller than the predetermined value.
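The overwrite-or-mix decision can be pictured with a minimal sketch, assuming grayscale numpy arrays, a whole-pixel translational motion vector, and an illustrative threshold and 50/50 blend that are not values taken from the embodiment:

```python
import numpy as np

def join_images(base, new, motion_vec, diff_thresh=16):
    """Align `new` to `base`, then overwrite or blend it into `base`."""
    dy, dx = motion_vec
    # Motion compensation by a whole-pixel translation (wrap-around via
    # np.roll is a simplification used only for this sketch).
    compensated = np.roll(new, shift=(dy, dx), axis=(0, 1))

    # Difference between horizontally adjacent pixels of the selected image.
    adj_diff = np.abs(np.diff(compensated.astype(np.int32), axis=1))
    adj_diff = np.pad(adj_diff, ((0, 0), (0, 1)), mode="edge")

    out = base.astype(np.float32).copy()
    strong = adj_diff >= diff_thresh          # high-detail area: overwrite
    out[strong] = compensated[strong]
    weak = ~strong                            # flat area: mix locally
    out[weak] = 0.5 * out[weak] + 0.5 * compensated[weak]
    return out.astype(base.dtype)
```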
Abstract:
A saturation ratio calculation unit calculates a saturation ratio of the saturation at the outer circumference of a color gamut of an input image to the saturation at the outer circumference of a color gamut of an output image. A histogram generation unit generates a histogram in which the pixels selected as being outside the color gamut are counted for each pair of lightness and hue. A magnification determination unit determines, based on the histogram and within a range of not less than 1 and not more than the saturation ratio, a magnification by which the chromaticity of the input image to be subjected to the color gamut conversion is to be multiplied for each pair of lightness and hue.
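A minimal sketch of the histogram and magnification steps, assuming an LCh-like representation, a single scalar saturation ratio, and an illustrative weighting rule in which bins holding more out-of-gamut pixels receive a magnification closer to the saturation ratio:

```python
import numpy as np

def determine_magnification(lightness, hue, out_of_gamut, sat_ratio,
                            l_bins=16, h_bins=36):
    """Return a (l_bins, h_bins) array of chroma magnifications in [1, sat_ratio]."""
    li = np.clip((lightness / 100.0 * l_bins).astype(int), 0, l_bins - 1)
    hi = np.clip((hue / 360.0 * h_bins).astype(int), 0, h_bins - 1)

    # Histogram: pixels selected as outside the color gamut, counted for
    # each pair of lightness and hue.
    hist = np.zeros((l_bins, h_bins))
    np.add.at(hist, (li[out_of_gamut], hi[out_of_gamut]), 1)

    # Illustrative rule: the more out-of-gamut pixels a bin holds, the closer
    # its magnification is pushed toward the saturation ratio (never below 1).
    weight = hist / (hist.max() + 1e-9)
    return 1.0 + weight * (sat_ratio - 1.0)
```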
Abstract:
A peripheral blown-out pixel counter counts the number of blown-out pixels in a block including a target pixel and peripheral pixels in image data. A G threshold setting unit sets a G threshold such that the G threshold becomes greater as the number of blown-out pixels becomes larger. A (B−G) threshold setting unit sets a (B−G) threshold such that the (B−G) threshold becomes smaller as the number of blown-out pixels becomes larger. A correction necessity determination unit determines that the color of the target pixel should be corrected when the G signal is determined to be smaller than the G threshold and the (B−G) value is determined to be greater than the (B−G) threshold. A color correction unit corrects at least the G signal and the B signal when the color of the target pixel should be corrected.
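The threshold adjustment and correction decision might look roughly like the following sketch, assuming 8-bit R/G/B planes; the block size, the linear threshold ramps, and the averaging correction are illustrative assumptions rather than the described units' exact behavior:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def correct_fringe(r, g, b, blown_level=250, block=5):
    """Correct target pixels whose color suggests fringing near blown highlights."""
    r = r.astype(np.float32); g = g.astype(np.float32); b = b.astype(np.float32)

    # Count blown-out pixels in the block containing the target pixel
    # and its peripheral pixels.
    blown = ((r >= blown_level) | (g >= blown_level) | (b >= blown_level)).astype(np.float32)
    blown_count = uniform_filter(blown, size=block) * block * block

    # G threshold grows, (B-G) threshold shrinks, as the count grows
    # (the linear ramps and constants here are assumptions).
    g_thresh = 80.0 + 4.0 * blown_count
    bg_thresh = 60.0 - 2.0 * blown_count

    # Correction is needed where G is small and (B-G) is large.
    need = (g < g_thresh) & ((b - g) > bg_thresh)

    # Illustrative correction: pull G and B toward their mean.
    mean_gb = (g + b) / 2.0
    g_corr = np.where(need, mean_gb, g)
    b_corr = np.where(need, mean_gb, b)
    return g_corr, b_corr
```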
Abstract:
An image processing method is provided. The method includes: calculating sums of differences between a pixel value of a pixel of interest in an input image and pixel values of three pixels surrounding the pixel of interest; calculating an average value of the four pixel difference sums calculated by the pixel difference sum calculators; calculating deviations between the average value and the four pixel difference sums; deriving a minimum coefficient from the coefficient candidates calculated by a candidate coefficient calculator, using adjusted deviations obtained by multiplying the deviations by a constant; and subtracting values obtained by multiplying the adjusted deviations by the minimum coefficient from the pixel value of the pixel of interest in the input image, and outputting the values of four pixels in an enlarged image that is twice the original size of the input image in the horizontal and vertical directions.
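A minimal sketch of the enlargement step, assuming a grayscale numpy image; the neighbor layout per output sub-pixel, the constant, and the stand-in value used in place of the minimum coefficient derived from the candidate coefficients are illustrative assumptions:

```python
import numpy as np

def enlarge_2x(img, constant=0.25):
    """Produce a 2x enlarged image from four corrected sub-pixel values."""
    h, w = img.shape
    pad = np.pad(img.astype(np.float32), 1, mode="edge")
    center = pad[1:-1, 1:-1]
    out = np.zeros((2 * h, 2 * w), dtype=np.float32)

    # Three surrounding pixels used for each of the four output sub-pixels
    # (offsets relative to the pixel of interest; an assumed layout).
    neighbor_sets = {
        (0, 0): [(-1, -1), (-1, 0), (0, -1)],   # top-left
        (0, 1): [(-1, 0), (-1, 1), (0, 1)],     # top-right
        (1, 0): [(0, -1), (1, -1), (1, 0)],     # bottom-left
        (1, 1): [(0, 1), (1, 0), (1, 1)],       # bottom-right
    }

    # Sum of absolute differences to the three neighbors, one sum per sub-pixel.
    sums = {}
    for key, offs in neighbor_sets.items():
        s = np.zeros_like(center)
        for dy, dx in offs:
            s += np.abs(center - pad[1 + dy:1 + dy + h, 1 + dx:1 + dx + w])
        sums[key] = s

    avg = sum(sums.values()) / 4.0
    for (sy, sx), s in sums.items():
        deviation = (avg - s) * constant   # adjusted deviation
        coeff = 1.0                        # stand-in for the minimum coefficient
        out[sy::2, sx::2] = center - deviation * coeff
    return np.clip(out, 0, 255).astype(img.dtype)
```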
Abstract:
Provided is an image joining apparatus that creates a joined image by joining multiple still images that are consecutive on a time axis, the image joining apparatus including: a past image confirmation unit that confirms whether a past still image, located earlier in time than the previous still image, is present in at least a part of a portion to be joined to a target still image; an angle calculation unit that calculates an angle formed between a motion vector associated with the target still image and a motion vector associated with the past still image when the past image confirmation unit has confirmed that the past still image is present; and a stop processing unit that stops joining of the target still image when the angle calculated by the angle calculation unit is equal to or greater than a predetermined angle.
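The stop decision reduces to comparing the angle between two motion vectors against a limit, as in this sketch; the 90-degree limit and the handling of zero-length vectors are illustrative assumptions:

```python
import numpy as np

def should_stop_joining(vec_target, vec_past, limit_deg=90.0):
    """Stop joining when the two motion vectors diverge by a large angle."""
    a = np.asarray(vec_target, dtype=np.float64)
    b = np.asarray(vec_past, dtype=np.float64)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False  # angle undefined for a zero vector: do not stop (assumption)
    cos_angle = np.clip(np.dot(a, b) / denom, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    return angle_deg >= limit_deg
```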
Abstract:
Provided are an image processing apparatus, an image processing method, a program, and a camera which are capable of generating a joined image in which misalignment is less likely to occur. An image processing apparatus according to an exemplary embodiment includes: a motion vector derivation unit that derives a motion vector between images of a plurality of images; a correction unit that corrects a motion vector between images included in a target section from a first image to a second image, based on cumulative errors in the section from the first image to the second image; and a texture writing unit that writes, into a frame memory, a plurality of textures that form the joined image, based on the motion vector corrected by the correction unit.
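One way to picture the correction is to spread the accumulated error evenly over the motion vectors in the section, as in this sketch; the even distribution is an illustrative assumption, not necessarily the correction described:

```python
import numpy as np

def correct_section(vectors, cumulative_error):
    """Subtract an equal share of the accumulated error from every motion
    vector in the section from the first image to the second image."""
    vectors = np.asarray(vectors, dtype=np.float64)   # shape (n, 2): per-frame vectors
    share = np.asarray(cumulative_error, dtype=np.float64) / len(vectors)
    return vectors - share
```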
Abstract:
Provided are an image processing apparatus, an image processing method, a program, and a camera which are capable of appropriately generating a joined image. An image processing apparatus according to an exemplary embodiment generates a joined image by joining a plurality of textures based on a plurality of images. The image processing apparatus includes: a motion vector derivation unit that derives a motion between images of the plurality of images; a frame memory in which a joined image frame is set and into which the plurality of textures that form the joined image are written; and a texture writing unit that writes the plurality of textures into the frame memory based on the motion between the images. The texture writing unit writes a first texture among the plurality of textures into an area that is not in contact with an edge of the joined image frame.
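Placing the first texture so that it touches no edge of the joined image frame can be sketched as follows, assuming numpy arrays for the frame and the texture; centering the texture is an illustrative choice:

```python
import numpy as np

def write_first_texture(frame, texture):
    """Write the first texture into an area not in contact with the frame edge."""
    fh, fw = frame.shape[:2]
    th, tw = texture.shape[:2]
    assert th < fh and tw < fw, "the frame must leave a margin on every side"
    top = (fh - th) // 2
    left = (fw - tw) // 2
    frame[top:top + th, left:left + tw] = texture
    return (top, left)  # origin reused when writing later textures
```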
Abstract:
Provided are an image processing apparatus, an image processing method, a program, and a camera which are capable of generating a joined image in which distortions and seams are less likely to occur. An image processing apparatus according to an exemplary embodiment generates a joined image by joining a plurality of textures based on a plurality of images, and includes a second derivation unit that derives a motion between images of the plurality of images, and a texture writing unit that writes, into a frame memory, the plurality of textures that form the joined image, based on the motion between the images. The plurality of textures include a texture having such a shape that at least a part of an outline thereof is curved.
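A texture whose outline is partly curved can be written through a mask with a curved boundary, as in this sketch; the sinusoidal seam shape and the grayscale images are illustrative assumptions used only to show how a curved-outline writing area can be realized:

```python
import numpy as np

def write_curved_texture(frame, texture, top, left, amplitude=8, period=64):
    """Write a texture whose left boundary is curved instead of straight."""
    th, tw = texture.shape
    ys = np.arange(th)[:, None]
    xs = np.arange(tw)[None, :]
    # Curved boundary: pixels to the right of a sinusoid belong to the texture.
    boundary = amplitude * (1 + np.sin(2 * np.pi * ys / period))
    mask = xs >= boundary
    region = frame[top:top + th, left:left + tw]
    region[mask] = texture[mask]
    return frame
```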