Abstract:
An image processing method includes adjusting a brightness of each of a first image and a second image based on a first exposure time when the first image is captured and a second exposure time when the second image is captured, respectively, the first and second images being generated by capturing a same object under different light conditions; estimating an intensity of light reaching the object when the second image is captured, based on the adjusted brightness of each of the first and second images; separating the second image into two or more regions according to the estimated intensity of light; determining a target brightness that a final result image is to have; adjusting the brightness of the second image, by different amounts for each of the separated regions, based on the target brightness; and generating the final result image based on the adjusted brightness of the second image.
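The abstract does not specify how each step is computed. The following minimal Python sketch shows one plausible reading; the per-pixel averaging used for light estimation, the quantile-based region split, and the per-region gain toward the target brightness are all assumptions made only for illustration.

```python
import numpy as np

def adjust_to_target(img1, img2, t1, t2, target_brightness, n_regions=3):
    """Sketch: exposure-normalized light estimation and region-wise brightness adjustment."""
    # Normalize each image's brightness by its exposure time so the two
    # captures of the same object become comparable.
    norm1 = img1.astype(np.float64) / t1
    norm2 = img2.astype(np.float64) / t2

    # Estimate the intensity of light reaching the object when the second
    # image was captured (assumption: a simple per-pixel average of the
    # two exposure-normalized captures).
    light = (norm1 + norm2) / 2.0

    # Separate the second image into regions according to the estimated
    # light intensity (assumption: equal-population quantile bins).
    edges = np.quantile(light, np.linspace(0, 1, n_regions + 1))
    result = img2.astype(np.float64)
    for i in range(n_regions):
        mask = (light >= edges[i]) & (light <= edges[i + 1])
        if not mask.any():
            continue
        # Adjust brightness by a different amount per region so that the
        # region mean approaches the target brightness.
        gain = target_brightness / max(result[mask].mean(), 1e-6)
        result[mask] *= gain
    return np.clip(result, 0, 255).astype(np.uint8)
```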
Abstract:
A method of generating an image using an image capturing device includes generating an imperfect image excluding data corresponding to one or more pixel sensors by capturing an object while the object is focused. A defocused image is generated to include the data corresponding to the one or more pixel sensors generated using neighboring pixel sensors around the one or more pixel sensors while the object is defocused. The data corresponding to the one or more pixel sensors is extracted based on data in the generated imperfect image and data in the generated defocused image. A final image is then generated by reflecting the extracted data to at least one of the generated imperfect image or the generated defocused image.
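As a rough illustration of the last two steps (extracting the missing data and reflecting it into an image), the Python sketch below fills masked pixels of a focused capture using a defocused capture. The `missing_mask` input and the local-ratio heuristic are assumptions, not the method claimed in the abstract.

```python
import numpy as np

def fill_missing_pixels(imperfect, defocused, missing_mask):
    """Sketch: recover pixel values missing from the focused capture using a defocused capture."""
    result = imperfect.astype(np.float64).copy()
    rows, cols = np.nonzero(missing_mask)
    for r, c in zip(rows, cols):
        # Ratio between the focused and defocused values of the valid
        # neighboring pixels approximates how to scale the defocused
        # value at the missing position (heuristic assumption).
        r0, r1 = max(r - 1, 0), min(r + 2, imperfect.shape[0])
        c0, c1 = max(c - 1, 0), min(c + 2, imperfect.shape[1])
        valid = ~missing_mask[r0:r1, c0:c1]
        if not valid.any():
            continue
        ratio = imperfect[r0:r1, c0:c1][valid].mean() / max(
            defocused[r0:r1, c0:c1][valid].mean(), 1e-6)
        # Reflect the extracted data into the final image.
        result[r, c] = defocused[r, c] * ratio
    return result
```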
Abstract:
A lens shading correction method includes providing lens shading correction profile data; calculating intensity values of light passing through each of one or more visible light pass filters and each of one or more infrared light pass filters; calculating an average of the intensity values of the light passing through the one or more visible light pass filters; calculating an average of the intensity values of the light passing through the one or more infrared light pass filters; calculating a normalized intensity value of the light passing through the one or more infrared light pass filters, based on the calculated averages; adjusting one or more lens shading correction coefficients included in the lens shading correction profile data and each having a value varying depending on a frequency element of light, based on the calculated normalized intensity value; and correcting lens shading by using the adjusted one or more lens shading correction coefficients.
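A minimal Python sketch of the described flow follows. The two-map profile layout (hypothetical "visible" and "ir" keys) and the linear blend by the normalized infrared value are assumptions used only to illustrate how a light-source-dependent coefficient adjustment could work.

```python
import numpy as np

def correct_lens_shading(raw, lsc_profile, visible_vals, ir_vals):
    """Sketch: adjust lens shading correction coefficients by the normalized IR intensity."""
    # Averages of the intensities behind the visible and infrared pass filters.
    avg_visible = float(np.mean(visible_vals))
    avg_ir = float(np.mean(ir_vals))

    # Normalized infrared intensity: strength of the IR component relative
    # to the total measured light (assumed normalization).
    norm_ir = avg_ir / max(avg_visible + avg_ir, 1e-6)

    # Blend two per-pixel coefficient maps from the profile (assumed layout:
    # one map tuned for visible-dominant light, one for IR-dominant light).
    coeff = (1.0 - norm_ir) * lsc_profile["visible"] + norm_ir * lsc_profile["ir"]

    # Correct lens shading with the adjusted coefficients.
    return raw.astype(np.float64) * coeff
```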
Abstract:
An image device includes a pixel array and a controller. The pixel array has first pixels and second pixels and corresponding channel drivers. The controller may perform operations of a dynamic vision sensor (DVS), an ambient light sensor (ALS), and a proximity sensor (PS).
Abstract:
An image processing device capable of generating foreground object data by using a captured image includes a depth data generator configured to generate depth data of the captured image; an amplitude data generator configured to generate amplitude data of the captured image; a foreground object detector configured to perform a first detection operation for detecting the foreground object based on the generated depth data and first reference background data, and to perform a second detection operation for detecting the foreground object based on the generated amplitude data and second reference background data; and a foreground object data generator configured to generate the foreground object data based on a result of the first detection operation and a result of the second detection operation.
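A short Python sketch of the two detection operations follows. The fixed thresholds and the union used to combine the two results are assumptions, since the abstract does not state how the results are merged.

```python
import numpy as np

def detect_foreground(depth, amplitude, ref_depth, ref_amplitude,
                      depth_thresh=50.0, amp_thresh=20.0):
    """Sketch: combine depth-based and amplitude-based background subtraction."""
    # First detection operation: pixels whose depth differs enough from the
    # first (depth) reference background are foreground candidates.
    fg_depth = np.abs(depth.astype(np.float64) - ref_depth) > depth_thresh

    # Second detection operation: pixels whose amplitude differs enough from
    # the second (amplitude) reference background are foreground candidates.
    fg_amp = np.abs(amplitude.astype(np.float64) - ref_amplitude) > amp_thresh

    # Foreground object data is generated from both results; a union is an
    # assumed combination rule.
    return fg_depth | fg_amp
```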
Abstract:
CMOS image sensors are provided. A CMOS image sensor may include a semiconductor substrate including a light-receiving region and a logic region adjacent the light-receiving region. The CMOS image sensor may include a photoelectric conversion region in the light-receiving region. Moreover, the CMOS image sensor may include an isolation region including an interface with a sidewall of the photoelectric conversion region. The isolation region may include a first refractive index that is smaller than a second refractive index of the semiconductor substrate, and the isolation region may be between the logic region and the sidewall of the photoelectric conversion region.
Abstract:
An image sensor is provided which includes a plurality of unit pixels, ones of which are configured to convert an input light signal into at least four frame signals. The image sensor also includes a signal processor that is configured to measure a distance from an object based on the at least four frame signals from one of the plurality of unit pixels.
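The abstract does not give the distance computation itself, but a common way to use four phase-shifted frame signals in indirect time-of-flight sensing is sketched below; the 0/90/180/270 degree phase assignment and the modulation-frequency parameter are assumptions, not details from the abstract.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_distance(a0, a1, a2, a3, mod_freq_hz):
    """Sketch: distance from four frame signals sampled at 0, 90, 180, 270 degree phases."""
    # Phase delay between the emitted and received modulated light.
    phase = np.mod(np.arctan2(a3 - a1, a0 - a2), 2 * np.pi)
    # Convert the phase delay into distance via the round-trip relation
    # d = c * phase / (4 * pi * f_mod).
    return C * phase / (4 * np.pi * mod_freq_hz)
```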
Abstract:
A binary image sensor includes a plurality of unit pixels on a substrate having a surface on which light is incident. At least one quantum dot is disposed on the surface of the substrate. A column sense amplifier circuit is configured to detect binary information of a selected unit pixel among the plurality of unit pixels from a voltage or a current detected from the selected unit pixel, and a processing unit is configured to process binary information of the respective unit pixels to generate pixel image information. Related devices and methods of operation are also discussed.
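One common way such binary readouts become multi-bit pixel image information is by accumulating many binary frames per pixel; the Python sketch below assumes that oversampling scheme, which the abstract does not spell out.

```python
import numpy as np

def binary_frames_to_image(binary_frames):
    """Sketch: turn per-pixel binary detections into graded pixel values."""
    # binary_frames: array of shape (num_frames, height, width) holding the
    # 0/1 readouts produced by the column sense amplifier circuit.
    # Counting how often each unit pixel fired across the frames yields a
    # graded intensity estimate (assumed processing, for illustration only).
    counts = np.sum(binary_frames, axis=0, dtype=np.float64)
    return counts / binary_frames.shape[0]  # normalized pixel image information
```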
Abstract:
In some embodiments, an imaging device includes a pixel array. At least one of the pixels includes a photodiode that can generate charges, and a select transistor that receives the charges in its bulk. When the select transistor is selected, a pixel current through it may depend on the number of received charges, thus indicating how much light was detected. A reset transistor may reset the voltage of the bulk.