Abstract:
An image sensor chip includes a first wafer and a second wafer. The first wafer includes an image sensor having a plurality of sub-pixels, each of which is configured to detect at least one photon and output a sub-pixel signal according to a result of the detection. The second wafer includes an image processor configured to process the sub-pixel signals from each sub-pixel and generate image data. The first wafer and the second wafer are formed in a wafer stack structure.
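Hypothetically, the per-pixel processing in such a stacked chip could amount to combining the photon-detection counts of each pixel's sub-pixels into a single intensity value. The following Python sketch illustrates that idea under the assumption of photon-counting sub-pixels and an illustrative array shape; none of the names or values come from the abstract.

import numpy as np

def process_subpixel_signals(subpixel_counts: np.ndarray) -> np.ndarray:
    # subpixel_counts has shape (rows, cols, subpixels_per_pixel); each
    # entry is the number of photons detected by one sub-pixel.
    pixel_sums = subpixel_counts.sum(axis=-1).astype(np.float64)
    # Normalize the per-pixel sums to 8-bit image data.
    peak = pixel_sums.max() if pixel_sums.max() > 0 else 1.0
    return np.round(255.0 * pixel_sums / peak).astype(np.uint8)

# Example: a 4x4 pixel array with 4 sub-pixels per pixel.
rng = np.random.default_rng(0)
counts = rng.integers(0, 3, size=(4, 4, 4))
print(process_subpixel_signals(counts))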
Abstract:
A unit pixel includes a first photoelectric converter, a first transfer transistor disposed between and connected to the first photoelectric converter and a first node, a connection transistor disposed between and connected to a second node and the first node, a second transfer transistor disposed between and connected to a third node and the second node, a second photoelectric converter connected to the third node, and a storage metal-oxide semiconductor (MOS) capacitor connected to the third node. The storage MOS capacitor stores charges from the second photoelectric converter. During a first time period, first charges accumulated in the first photoelectric converter are transferred to the first node; during a second time period, second charges accumulated in the first photoelectric converter are transferred to the first node and the second node; and during a third time period, third charges accumulated in the second photoelectric converter are transferred to the first to third nodes.
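Hypothetically, the three transfer periods amount to reading charge onto progressively larger combined node capacitance, i.e. at progressively lower conversion gain. A minimal Python sketch of that relationship follows; the capacitance and charge values are purely illustrative assumptions and do not appear in the abstract.

# Hypothetical node capacitances in fF (illustrative only).
C_NODE1 = 2.0    # first node (e.g. floating diffusion)
C_NODE2 = 4.0    # second node
C_NODE3 = 40.0   # third node including the storage MOS capacitor

def node_voltage(charge_fc: float, capacitance_ff: float) -> float:
    # V = Q / C; charge in fC over capacitance in fF yields volts.
    return charge_fc / capacitance_ff

# First period: first charges are transferred to the first node only.
v1 = node_voltage(1.5, C_NODE1)
# Second period: second charges are shared by the first and second nodes.
v2 = node_voltage(8.0, C_NODE1 + C_NODE2)
# Third period: third charges from the second photoelectric converter
# are shared by the first to third nodes.
v3 = node_voltage(60.0, C_NODE1 + C_NODE2 + C_NODE3)
print(v1, v2, v3)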
Abstract:
A three-dimensional (3D) image sensor includes a first substrate having an upper pixel. The upper pixel includes a photoelectric element and first and second photogates connected to the photoelectric element. A second substrate, spaced apart from the first substrate in a vertical direction, includes a lower pixel corresponding to the upper pixel. The lower pixel includes a first transfer transistor that transmits a first signal provided by the first photogate. A first source follower generates a first output signal in accordance with the first signal. A second transfer transistor transmits a second signal provided by the second photogate. A second source follower generates a second output signal in accordance with the second signal. First and second bonding conductors are disposed between the first and second substrates and electrically connect the upper and lower pixels.
Abstract:
A semiconductor device may include a first sensor configured to sense light having a wavelength within a first wavelength range from incident light and to generate a first electrical signal based on the sensed light, and a second sensor configured to sense light having a wavelength within a second, different wavelength range from the incident light and to generate a second electrical signal based on the sensed light. The first and second sensors may be electrically connected to each other via an intermediate connector, and the first sensor and the second sensor may share a pixel circuit that is electrically connected thereto via the intermediate connector. The first and second wavelength ranges may include infrared and visible wavelength ranges, respectively; alternatively, they may include different visible wavelength ranges.
Abstract:
An image capture method performed by a depth sensor includes: emitting a first source signal having a first amplitude towards a scene and thereafter emitting a second source signal having a second amplitude, different from the first amplitude, towards the scene; capturing a first image in response to the first source signal and a second image in response to the second source signal; and interpolating the first and second images to generate a final image.
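One way to read "interpolating the first and second images" is as a per-pixel blend in which regions that saturate under the higher-amplitude source fall back to the lower-amplitude capture. The Python sketch below follows that assumption; the function name, blending rule, and threshold are illustrative and not taken from the abstract.

import numpy as np

def blend_captures(img_low, img_high, saturation=0.95):
    # Pixels that saturate in the high-amplitude capture use the
    # low-amplitude capture; other pixels use the average of the two.
    saturated = img_high >= saturation
    blended = 0.5 * (img_low + img_high)
    return np.where(saturated, img_low, blended)

rng = np.random.default_rng(1)
low = rng.uniform(0.0, 0.6, size=(4, 4))    # capture at the first amplitude
high = np.clip(low * 2.0, 0.0, 1.0)         # capture at the second amplitude
print(blend_captures(low, high))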
Abstract:
An image sensor according to an example embodiment includes a first pixel and a second pixel in a first row. The first pixel includes a first photoelectric conversion element at a first depth in a semiconductor substrate, and the first photoelectric conversion element is configured to convert a first visible light spectrum into a first photo charge. The second pixel includes a second photoelectric conversion element at a second depth, different from the first depth, in the semiconductor substrate; the second photoelectric conversion element is at least partially overlapped by the first photoelectric conversion element in a vertical direction and is configured to convert a second visible light spectrum into a second photo charge.
Abstract:
An operation method of an image sensor includes determining a distance between the image sensor and an object, and activating at least one of a color pixel, a depth pixel, and a thermal pixel included in a pixel array of the image sensor based on the determined distance and a reference distance.
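The activation step reduces to comparing the determined distance with the reference distance and enabling the corresponding pixel types. The Python sketch below shows one such policy; the specific mapping from distance to pixel types is an assumption, since the abstract only states that the choice depends on the two distances.

from enum import Enum

class PixelType(Enum):
    COLOR = "color"
    DEPTH = "depth"
    THERMAL = "thermal"

def select_active_pixels(distance_m: float, reference_m: float) -> set:
    # Illustrative policy: near objects use depth and thermal pixels,
    # far objects use color pixels.
    if distance_m <= reference_m:
        return {PixelType.DEPTH, PixelType.THERMAL}
    return {PixelType.COLOR}

print(select_active_pixels(0.8, reference_m=1.0))   # near: depth + thermal
print(select_active_pixels(3.0, reference_m=1.0))   # far: color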