Abstract:
A color image sensor including an array of pixels is formed in a semiconductor layer having a back side that receives illumination. Insulated conductive walls penetrate into the semiconductor layer from the back side and separate the pixels from one another. For each pixel, a color filter penetrates into from 5% to 30% of the thickness of the semiconductor layer from the back side and occupies at least 90% of the surface area delimited by the walls. An electrically-conductive layer extends from the lateral wall of the filter all the way to the walls.
Abstract:
An image sensor includes a plurality of pixels. Each of the pixels includes a silicon photoconversion region and a material that at least partially surrounds the photoconversion region. The material has a refraction index smaller than the refraction index of silicon, and the interface between the photoconversion region of the pixel and the material is configured so that at least one ray reaching the photoconversion region of the pixel undergoes a total reflection or a plurality of successive total reflections at the interface.
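The total-reflection condition above follows from Snell's law: a ray inside the high-index silicon is totally reflected when its angle of incidence exceeds the critical angle arcsin(n_material/n_silicon). A minimal sketch, with illustrative index values assumed here (n ≈ 3.5 for silicon near 940 nm, n ≈ 1.45 for a silicon-oxide surround; neither value is taken from the abstract):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Angle of incidence (degrees from the normal) above which a ray
    inside the high-index core undergoes total internal reflection."""
    if n_clad >= n_core:
        raise ValueError("total reflection requires n_clad < n_core")
    return math.degrees(math.asin(n_clad / n_core))

# Illustrative values: silicon core, oxide cladding.
theta_c = critical_angle_deg(3.5, 1.45)
print(f"critical angle ~ {theta_c:.1f} deg")  # rays steeper than this stay trapped
```

With these assumed indices the critical angle is roughly 24°, so most rays arriving obliquely at the silicon/low-index interface are confined to the photoconversion region.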
Abstract:
A photosensitive semiconductor region is configured to be illuminated through a rear face. A periodic array of pads formed of a first material is provided at the front face. The periodic array has an outline with a periodic pattern parameterized by characteristic dimensions. The outline forms an interface between the first material and a second material, where the first and second materials have different optical indices. The characteristic dimensions of the periodic pattern are less than a wavelength of interest and are configured to produce at the interface a reflection of light at the wavelength of interest towards the photosensitive semiconductor region.
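The subwavelength requirement stated above ("characteristic dimensions ... less than a wavelength of interest") is the standard zeroth-order grating condition: when the pattern period is smaller than the wavelength divided by the larger refractive index, all non-zero diffraction orders are evanescent and the array acts as a reflective effective medium rather than a diffraction grating. A minimal sketch of that check, with assumed illustrative values (silicon/oxide indices, 940 nm wavelength; none of these numbers come from the abstract):

```python
def is_subwavelength(period_nm: float, wavelength_nm: float,
                     n1: float, n2: float) -> bool:
    """True when a periodic pattern of the given period supports no
    propagating non-zero diffraction orders in either medium, i.e. when
    period < wavelength / max(n1, n2)."""
    return period_nm < wavelength_nm / max(n1, n2)

# Illustrative: 200 nm period is subwavelength for 940 nm light in silicon
# (940 / 3.5 ~ 269 nm), while a 400 nm period is not.
print(is_subwavelength(200.0, 940.0, 3.5, 1.45))  # True
print(is_subwavelength(400.0, 940.0, 3.5, 1.45))  # False
```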
Abstract:
The present description concerns a polarimetric image sensor formed inside and on top of a semiconductor substrate, the sensor comprising a plurality of pixels, each comprising: —a photosensitive region formed in the semiconductor substrate; —a diffraction structure formed on the side of an illumination surface of the photosensitive region; and —a polarization structure formed on the side of the diffraction structure opposite to the photosensitive region.
Abstract:
An optical sensor includes pixels, with each pixel formed by a photodetector and a telecentric system topping the photodetector. Each telecentric system includes: an opaque layer with openings facing the photodetector, and a microlens facing each opening, arranged between the opaque layer and the photodetector. Each pixel further includes an optical filter between the microlenses and the photodetector. The optical filter may, for example, be an interference filter, a diffraction grating-based filter or a metasurface-based filter.
Abstract:
The present disclosure relates to an image sensor comprising a first layer of photoelectric material and a diffraction grating located between said first layer and the face of the sensor configured to receive light rays.
Abstract:
The present description concerns an image sensor formed inside and on top of a semiconductor substrate, the sensor comprising a plurality of pixels, each comprising a photodetector formed in the substrate, the sensor comprising at least first and second bidimensional metasurfaces stacked, in this order, in front of said plurality of pixels, each metasurface being formed of a bidimensional array of pads, the first metasurface having a first optical function and the second metasurface having a second optical function different from the first optical function.
Abstract:
An integrated image sensor with backside illumination includes a pixel. The pixel is formed by a photodiode within an active semiconductor region having a first face and a second face. A converging lens, lying in front of the first face of the active region, directs received light rays towards a central zone of the active region. At least one diffracting element, having a refractive index different from a refractive index of the active region, is provided at least partly aligned with the central zone at one of the first and second faces.
Abstract:
An integrated image sensor may include adjacent pixels, with each pixel including an active semiconductor region including a photodiode, an antireflection layer above the photodiode, a dielectric region above the antireflection layer and an optical filter to pass incident luminous radiation having a given wavelength. The antireflection layer may include an array of pads mutually separated by a dielectric material of the dielectric region. The array may be configured to allow simultaneous transmission of the incident luminous radiation and a diffraction of the incident luminous radiation producing diffracted radiations which have wavelengths below that of the incident radiation, and are attenuated with respect to the incident radiation.