Abstract:
An image generation apparatus includes an objective optical system, an optical-path splitter, an image sensor, and an image processor. A first imaging area and a second imaging area are both images of the field of view of the objective optical system. The outer edge of the first imaging area and the outer edge of the second imaging area are located on the inner side of a predetermined image pickup area, and an image of the outer edges located in the predetermined image pickup area is captured by the image sensor. The image processor holds reference data and extracts a feature point on the basis of the captured image of the outer edges. An amount of shift between the predetermined image pickup area and the image sensor is calculated from the feature point and the reference data.
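A minimal sketch of the shift estimation described above, assuming hypothetical names and a simple scheme in which outer-edge feature points are taken as boundary pixels of the bright imaging areas and the shift is the offset between their centroid and the centroid of the stored reference points:

```python
import numpy as np

def estimate_shift(captured, reference_points, intensity_threshold=16):
    """Estimate the (dx, dy) shift of the image pickup area relative to the sensor.

    captured         : 2-D grayscale frame containing both imaging areas.
    reference_points : Nx2 array of outer-edge feature points stored as
                       reference data (hypothetical format).
    """
    # Pixels brighter than the threshold are assumed to belong to the imaging
    # areas; everything else is the dark surround on the sensor.
    mask = captured > intensity_threshold

    # Feature points: outer-edge pixels of the imaging areas, found as the
    # boundary of the bright region (pixels whose 4-neighbourhood is not all bright).
    interior = (mask[1:-1, 1:-1] & mask[:-2, 1:-1] & mask[2:, 1:-1]
                & mask[1:-1, :-2] & mask[1:-1, 2:])
    edge = mask.copy()
    edge[1:-1, 1:-1] &= ~interior
    ys, xs = np.nonzero(edge)
    feature_points = np.column_stack([xs, ys])

    # Shift = offset between the centroids of the detected outer-edge points
    # and of the reference points.
    dx, dy = feature_points.mean(axis=0) - reference_points.mean(axis=0)
    return dx, dy
```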
Abstract:
An optical device in which an image of an object is formed on an image pickup element and an image restoration process is performed on the image obtained by the image pickup element, wherein an MTF satisfies the following conditional expression (1): 0.001
Abstract:
An image processing device includes a processor including hardware, the processor being configured to implement an image acquisition process and a synthesis process. The image acquisition process acquires captured images from an imaging section that performs a frame-sequential imaging process in which one cycle includes first to N-th frames, and acquires, in an i-th frame, a plurality of captured images that differ from each other as to an in-focus object plane position. The synthesis process calculates a second synthesis map based on a first synthesis map calculated with respect to the i-th frame and a first synthesis map calculated with respect to a k-th frame, and synthesizes the plurality of captured images captured in the i-th frame based on the second synthesis map.
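A sketch of one way this synthesis could be realized, under assumptions not stated in the abstract: the first synthesis map is taken to be a per-pixel weight derived from local contrast of the two differently focused images, and the second synthesis map is a fixed-weight blend of the first maps from the i-th and k-th frames:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def first_synthesis_map(img_near, img_far, win=7):
    """Per-pixel weight of img_near, from local contrast (an assumed measure)."""
    def local_contrast(img):
        mean = uniform_filter(img.astype(np.float64), win)
        return uniform_filter((img - mean) ** 2, win)
    c_near = local_contrast(img_near)
    c_far = local_contrast(img_far)
    return c_near / (c_near + c_far + 1e-12)

def second_synthesis_map(map_i, map_k, alpha=0.7):
    """Blend the first maps of the i-th and k-th frames (alpha is an assumption)."""
    return alpha * map_i + (1.0 - alpha) * map_k

def synthesize(img_near, img_far, map_i, map_k):
    """Combine the two images captured in the i-th frame using the second map."""
    m = second_synthesis_map(map_i, map_k)
    return m * img_near + (1.0 - m) * img_far
```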
Abstract:
An endoscope system includes: a generating means that generates a compositing mask giving the compositing ratios of corresponding pixels between a pair of images, acquired by simultaneously imaging two optical images with different focus positions into which a subject image is divided, on the basis of the ratios of their contrasts; a correcting means that subjects the compositing masks generated for pairs of images acquired in time series to weighted averaging for each pixel, thus generating a corrected mask; and a compositing means that composites the two images according to the corrected mask. The correcting means performs the weighted averaging with weights set such that the proportion of the past compositing masks is higher at pixels constituting a static area and an area having contrast lower than a threshold than at pixels constituting a moving-object area or an area having contrast equal to or higher than the threshold.
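A sketch of the mask generation and correction described above, assuming a simple frame-difference motion test and illustrative weights and thresholds that the abstract does not specify:

```python
import numpy as np

def compositing_mask(contrast_a, contrast_b, eps=1e-12):
    """Per-pixel compositing ratio of image A, from the ratio of contrasts."""
    return contrast_a / (contrast_a + contrast_b + eps)

def corrected_mask(mask_now, mask_past, img_now, img_prev, contrast_now,
                   contrast_thr, motion_thr=8.0,
                   w_past_static=0.8, w_past_moving=0.2):
    """Weighted average of the current and past compositing masks.

    Pixels that are static and have contrast below contrast_thr give a high
    weight to the past masks; moving or high-contrast pixels give a low one.
    (The weights and thresholds here are illustrative assumptions.)
    """
    moving = np.abs(img_now.astype(np.float64) - img_prev) > motion_thr
    high_contrast = contrast_now >= contrast_thr
    w_past = np.where(moving | high_contrast, w_past_moving, w_past_static)
    return w_past * mask_past + (1.0 - w_past) * mask_now

def composite(img_a, img_b, mask):
    """Composite the pair of differently focused images with the corrected mask."""
    return mask * img_a + (1.0 - mask) * img_b
```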
Abstract:
An image processing device according to the present invention includes: a color space conversion unit that calculates brightness signals and color difference signals from RGB signals of two images of a single subject acquired under different image acquisition conditions; a color difference determination unit that determines whether or not the absolute value of the difference between the color difference signals of the two images calculated by the color space conversion unit is smaller than a predetermined threshold value; and a color averaging unit that, if the color difference determination unit determines that the absolute value of the difference between the color difference signals of the two images is smaller than the threshold value, averages the color difference signals of the two images and calculates average color difference signals.
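A minimal sketch of the color-difference averaging, assuming a BT.601 YCbCr conversion as the color space conversion (the abstract does not name the brightness / color-difference space) and a hypothetical threshold value:

```python
import numpy as np

# BT.601 RGB -> YCbCr matrix (an assumption; any brightness/color-difference
# conversion could stand in here).
_M = np.array([[ 0.299,     0.587,     0.114    ],
               [-0.168736, -0.331264,  0.5      ],
               [ 0.5,      -0.418688, -0.081312]])

def rgb_to_ycbcr(rgb):
    """Brightness signal (Y) and color difference signals (Cb, Cr) from RGB."""
    return rgb.astype(np.float64) @ _M.T

def average_color_difference(rgb1, rgb2, threshold=10.0):
    """Average the color difference signals of two images of the same subject
    wherever their absolute difference is below the threshold."""
    ycc1, ycc2 = rgb_to_ycbcr(rgb1), rgb_to_ycbcr(rgb2)
    cd1, cd2 = ycc1[..., 1:], ycc2[..., 1:]      # color-difference signals
    close = np.abs(cd1 - cd2) < threshold        # per-pixel, per-channel test
    return np.where(close, 0.5 * (cd1 + cd2), cd1)
```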
Abstract:
An imaging device includes a first image sensor that is placed to intersect the optical axis of an imaging lens, a second image sensor that is placed to intersect the optical axis of the imaging lens at a given distance from the first image sensor and receives light that has passed through the first image sensor, and a processor including hardware. The processor is configured to implement a brightness correction process that amplifies the pixel value of the second image captured by the second image sensor using a gain set based on the light transmittance of a light-receiving section of the first image sensor, and an object distance calculation process that performs a depth-from-defocus (DFD) process based on the pixel value of the first image captured by the first image sensor and the pixel value of the second image subjected to the brightness correction process to calculate an object distance.
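A sketch of the brightness correction and a crude defocus cue, under assumptions the abstract does not state: the gain is taken as the reciprocal of the first sensor's transmittance, and the DFD step is reduced to a ratio of local high-frequency energy; mapping that ratio to an object distance would require lens calibration data:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def brightness_correction(img2, transmittance):
    """Amplify the second image with a gain set from the first sensor's light
    transmittance (gain = 1 / transmittance is an assumption)."""
    gain = 1.0 / transmittance
    return img2.astype(np.float64) * gain

def dfd_blur_ratio(img1, img2_corrected, sigma=1.0, eps=1e-6):
    """Depth-from-defocus cue: ratio of local high-frequency energy in the
    first image and the brightness-corrected second image."""
    h1 = np.abs(gaussian_laplace(img1.astype(np.float64), sigma))
    h2 = np.abs(gaussian_laplace(img2_corrected, sigma))
    return h1 / (h2 + eps)
```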