Abstract:
An imaging element includes: a processing circuit that performs analog/digital conversion on captured image data; a memory that is capable of storing the captured image data obtained as a result of performing the analog/digital conversion by the processing circuit; and an output circuit that outputs output image data based on the captured image data to an exterior of the imaging element, wherein the output circuit includes a first output line and a second output line, the first output line is connected to a first signal processing circuit disposed at the exterior, the second output line is connected to a second signal processing circuit disposed at the exterior, and at least one of an output frame rate or an output data amount of the output image data is different between the first output line and the second output line.
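As an illustration of the two output lines differing in output frame rate and output data amount, the sketch below fans one captured stream out to a full-rate line and a decimated, binned line. This is a minimal sketch under assumptions: the class name, the 4:1 rate divisor, and the 2x2 binning are illustrative, not the element's actual interface.

```python
# Illustrative model of one captured stream feeding two output lines with a
# different output frame rate and output data amount (assumed divisor/binning).
import numpy as np

class DualLineOutput:
    def __init__(self, rate_divisor: int = 4):
        self.rate_divisor = rate_divisor          # second line outputs 1 of every N frames
        self.frame_index = 0

    def output(self, frame: np.ndarray):
        """Return (first_line_frame, second_line_frame_or_None) for one capture."""
        first = frame                              # first output line: full rate, full data amount
        second = None
        if self.frame_index % self.rate_divisor == 0:
            h, w = frame.shape
            # Reduced data amount on the second output line: 2x2 average binning.
            second = frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        self.frame_index += 1
        return first, second

if __name__ == "__main__":
    out = DualLineOutput()
    for i in range(8):
        first, second = out.output(np.full((4, 4), float(i)))
        print(i, first.shape, None if second is None else second.shape)
```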
Abstract:
An imaging element includes a memory that stores first image data obtained by imaging performed by the imaging element and is incorporated in the imaging element, and a first processor that is configured to perform image data processing on the first image data and is incorporated in the imaging element. The first processor is configured to receive vibration information related to a vibration exerted on the imaging element within a frame output period defined by a first frame rate, and output second image data obtained by assigning the vibration information to a specific position set in the first image data within the frame output period.
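The assignment of vibration information to a specific position in the image data can be pictured as metadata embedded into the frame within the frame output period. The sketch below is illustrative only: the abstract does not disclose a format, so the choice of an appended metadata row, the 16-bit quantization, and the name embed_vibration_info are assumptions.

```python
# Illustrative sketch: write vibration (e.g. gyro) samples into a reserved
# metadata row at a fixed "specific position" of the output frame.
import numpy as np

def embed_vibration_info(frame: np.ndarray, gyro_xyz: tuple[float, float, float]) -> np.ndarray:
    """Return second image data: the frame with vibration info written into
    a metadata row appended at an assumed specific position (last row)."""
    meta_row = np.zeros((1, frame.shape[1]), dtype=frame.dtype)
    # Quantize the angular-rate samples into the pixel container (assumed 16-bit).
    scaled = [np.uint16(np.clip((g + 32.0) * 1000.0, 0, 65535)) for g in gyro_xyz]
    meta_row[0, :3] = scaled
    return np.vstack([frame, meta_row])

if __name__ == "__main__":
    first_image = np.zeros((8, 8), dtype=np.uint16)      # stand-in for stored first image data
    second_image = embed_vibration_info(first_image, (0.01, -0.02, 0.0))
    print(second_image.shape)                            # (9, 8): frame plus one metadata row
```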
Abstract:
An imaging apparatus includes an imaging lens including a focus lens, and an imaging element that incorporates a memory which stores image data obtained by imaging an imaging region at a first frame rate and a first processor configured to output the image data at a second frame rate less than or equal to the first frame rate. The first processor is configured to generate combined image data based on the image data of a number of frames decided in accordance with a depth of field with respect to the imaging region, out of the image data of a plurality of frames obtained by imaging the imaging region at different positions of the focus lens.
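As an illustration of deciding the number of frames in accordance with the depth of field, the sketch below uses a standard thin-lens depth-of-field approximation and assumes that the frame count is the focus sweep range divided by the per-frame depth of field; the formula choice and all parameter names are assumptions, not the claimed method.

```python
# Minimal sketch: thin-lens DOF estimate, then frames needed to cover a focus sweep.
import math

def depth_of_field(f_number: float, focal_len_mm: float, subject_dist_mm: float,
                   coc_mm: float = 0.03) -> float:
    """Approximate total depth of field (mm) from the thin-lens DOF formula."""
    hyperfocal = focal_len_mm ** 2 / (f_number * coc_mm) + focal_len_mm
    near = (hyperfocal * subject_dist_mm) / (hyperfocal + (subject_dist_mm - focal_len_mm))
    far = (hyperfocal * subject_dist_mm) / (hyperfocal - (subject_dist_mm - focal_len_mm))
    return max(far - near, 0.0) if far > 0 else float("inf")

def frames_for_sweep(sweep_range_mm: float, dof_mm: float) -> int:
    """Decide how many focus positions (frames) are needed to cover the sweep."""
    return max(1, math.ceil(sweep_range_mm / dof_mm))

if __name__ == "__main__":
    dof = depth_of_field(f_number=2.8, focal_len_mm=50.0, subject_dist_mm=1000.0)
    print(frames_for_sweep(sweep_range_mm=200.0, dof_mm=dof))   # e.g. 4 frames to combine
```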
Abstract:
An imaging element comprises a first communication interface that is incorporated in the imaging element and outputs first image data based on image data obtained by imaging a subject to an external processor, a memory that is incorporated in the imaging element and stores the image data, and a second communication interface that is incorporated in the imaging element and outputs second image data based on the image data stored in the memory to the external processor, in which an output method of the first communication interface and an output method of the second communication interface are different.
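The abstract states only that the two communication interfaces use different output methods; as one heavily hedged reading, the sketch below models the first interface as a row-by-row streaming output of live data and the second as a whole-frame output read back from the incorporated memory. All names are illustrative.

```python
# Assumed model of two output methods: streaming (first interface) vs. block
# readout from the incorporated memory (second interface).
import numpy as np
from typing import Iterator, Optional

class ImagingElementModel:
    def __init__(self):
        self.memory: Optional[np.ndarray] = None   # memory incorporated in the element

    def first_interface(self, image: np.ndarray) -> Iterator[np.ndarray]:
        """First output method: stream first image data row by row as it is produced."""
        self.memory = image.copy()                 # the same image data is also stored
        for row in image:
            yield row

    def second_interface(self) -> np.ndarray:
        """Second output method: output second image data as one block from memory."""
        assert self.memory is not None
        return self.memory

if __name__ == "__main__":
    element = ImagingElementModel()
    rows = list(element.first_interface(np.arange(12).reshape(3, 4)))
    print(len(rows), element.second_interface().shape)
```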
Abstract:
Provided is an imaging element in which a first frame rate is a frame rate higher than a second frame rate, and a first processor is configured to read out image data of a plurality of frames in parallel within an output period of image data of one frame defined by the second frame rate, acquire a focus driving speed and a rolling shift amount, decide a combining condition for the image data of the plurality of frames stored in a memory based on the acquired focus driving speed and the rolling shift amount, perform combining processing on the image data of the plurality of frames in accordance with the decided combining condition, and output the image data after combining obtained by the combining processing.
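How the combining condition might depend on the focus driving speed and the rolling shift amount is not disclosed; the sketch below assumes one possible rule, namely blending fewer of the parallel-read frames when the focus travels farther during one rolling readout. All names, units, and thresholds are illustrative.

```python
# Assumed rule: combine fewer frames as focus travel per rolling readout grows.
from dataclasses import dataclass

@dataclass
class CombiningCondition:
    num_frames: int          # how many of the parallel-read frames to combine
    weights: list            # per-frame weights used for the combining processing

def decide_combining_condition(focus_speed_um_per_ms: float,
                               rolling_shift_ms: float,
                               max_focus_travel_um: float = 20.0,
                               frames_available: int = 4) -> CombiningCondition:
    travel = focus_speed_um_per_ms * rolling_shift_ms      # focus travel within one readout
    usable = max(1, min(frames_available, int(max_focus_travel_um // max(travel, 1e-6))))
    weights = [1.0 / usable] * usable                      # simple uniform average
    return CombiningCondition(num_frames=usable, weights=weights)

if __name__ == "__main__":
    print(decide_combining_condition(focus_speed_um_per_ms=2.0, rolling_shift_ms=5.0))
```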
Abstract:
An imaging element incorporates a processing circuit and a memory. The memory stores captured image data obtained by imaging a subject at a first frame rate. The processing circuit performs processing based on the captured image data stored in the memory. An output circuit outputs output image data based on the captured image data to an outside of the imaging element at a second frame rate. The first frame rate is a frame rate higher than the second frame rate and is determined in accordance with an occurrence cycle of a flicker, and the processing circuit detects a flicker effect avoidance timing at which an effect of the flicker on imaging by the imaging element is avoided, based on the captured image data of a plurality of frames.
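One way to picture the detection of the flicker-effect avoidance timing is to sample the mean brightness of frames captured at the first frame rate (tied to the flicker cycle) and estimate the phase at which brightness peaks. The sinusoidal model, the 100 Hz flicker assumption, and the function names below are illustrative assumptions, not the element's actual processing.

```python
# Illustrative phase estimate of a 100 Hz flicker from per-frame mean brightness.
import numpy as np

def flicker_avoidance_phase(frame_means: np.ndarray, first_frame_rate_hz: float,
                            flicker_hz: float = 100.0) -> float:
    """Return the time offset (s) within one flicker period at which the measured
    frame brightness peaks, i.e. where the effect of the flicker is smallest."""
    t = np.arange(len(frame_means)) / first_frame_rate_hz
    phase = 2.0 * np.pi * flicker_hz * t
    # Least-squares fit of a*cos(phase) + b*sin(phase) + c to the brightness samples.
    A = np.column_stack([np.cos(phase), np.sin(phase), np.ones_like(phase)])
    a, b, _ = np.linalg.lstsq(A, frame_means, rcond=None)[0]
    peak_phase = np.arctan2(b, a)                      # phase of maximum brightness
    return (peak_phase % (2.0 * np.pi)) / (2.0 * np.pi * flicker_hz)

if __name__ == "__main__":
    fr = 1000.0                                        # first frame rate (Hz), above the output rate
    t = np.arange(32) / fr
    means = 100.0 + 10.0 * np.cos(2 * np.pi * 100.0 * t - 0.7)
    print(flicker_avoidance_phase(means, fr))          # about 0.7 / (2*pi*100) seconds
```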
Abstract:
A system control unit consecutively images a subject by driving based on a global shutter method, divides captured image data obtained by the imaging into a plurality of areas, and, each time an area is generated, compares the generated area with the corresponding area in the captured image data generated by the preceding imaging and detects a moving object from the area based on a result of the comparison. Based on a change in position of the detected moving object, the system control unit predicts a timing at which a trigger range TR set in the captured image data overlaps with the moving object, and performs automatic imaging in a frame period that includes the timing.
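A rough sketch of the described flow: per-area frame differencing against the same area of the preceding capture, followed by linear extrapolation of the moving object's position to predict the frame in which it overlaps the trigger range TR. The grid size, threshold, and helper names are assumptions for illustration.

```python
# Illustrative per-area motion detection plus linear prediction of trigger overlap.
import numpy as np

def detect_motion_areas(prev: np.ndarray, curr: np.ndarray, grid: int = 4,
                        thresh: float = 10.0) -> list[tuple[int, int]]:
    """Return (row, col) indices of grid areas whose mean absolute difference
    against the same area of the previous frame exceeds a threshold."""
    h, w = curr.shape
    hits = []
    for r in range(grid):
        for c in range(grid):
            a = slice(r * h // grid, (r + 1) * h // grid)
            b = slice(c * w // grid, (c + 1) * w // grid)
            if np.abs(curr[a, b].astype(float) - prev[a, b].astype(float)).mean() > thresh:
                hits.append((r, c))
    return hits

def predict_overlap_frame(x_positions: list, trigger_x: float) -> int:
    """Predict, by linear extrapolation, how many frames until the moving object
    reaches the trigger position; automatic imaging is scheduled for that frame."""
    vx = x_positions[-1] - x_positions[-2]            # per-frame displacement
    remaining = trigger_x - x_positions[-1]
    if vx == 0 or remaining * vx < 0:
        return -1                                     # not moving toward the trigger range
    return round(remaining / vx)

if __name__ == "__main__":
    prev = np.zeros((8, 8), dtype=np.uint8)
    curr = prev.copy(); curr[0:2, 2:4] = 200          # object appears in one area
    print(detect_motion_areas(prev, curr))            # [(0, 1)]
    print(predict_overlap_frame([10.0, 14.0, 18.0], trigger_x=30.0))   # 3 frames ahead
```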
Abstract:
A digital camera includes an imaging sensor having a stop arranged in front of an imaging surface, an imaging control unit that starts exposure of the pixels on the entire imaging surface at the same time and then ends the exposure of the pixels at the same time, in a state where light from a subject is incident on the imaging surface, by controlling an imaging sensor drive circuit which drives the imaging sensor, and a processor that sequentially changes an F number of the stop to a plurality of values during a period from the start of the exposure until the end of the exposure. The processor controls the time in which the F number is maintained at each of the plurality of values to be a time based on a function indicating light transmittance characteristics of an APD filter.
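The dwell time per F number can be derived, for example, by requiring that the exposure accumulated by each pupil ring over the whole exposure reproduce the APD filter's transmittance profile. The sketch below assumes a Gaussian transmittance model and relative aperture radii of 1/F; it is an illustration of the idea, not the camera's actual control.

```python
# Illustrative split of one global-shutter exposure across F numbers so that the
# accumulated exposure over the pupil approximates an APD transmittance profile.
import numpy as np

def dwell_times(f_numbers, total_exposure_s: float, sigma: float = 0.5) -> np.ndarray:
    """Return the time to hold each F number (same order as sorted f_numbers)."""
    f = np.sort(np.asarray(f_numbers, dtype=float))              # small F number = large aperture
    radii = 1.0 / f                                              # relative aperture radius
    transmittance = np.exp(-(radii ** 2) / (2.0 * sigma ** 2))   # target T(r), highest near centre
    # A ring at radius r is exposed whenever the aperture radius >= r, so the dwell
    # time of each successively smaller aperture is the increment of T toward the centre.
    t = np.diff(np.concatenate([[0.0], transmittance]))
    t = np.clip(t, 0.0, None)
    return total_exposure_s * t / t.sum()

if __name__ == "__main__":
    print(dwell_times([1.4, 2.0, 2.8, 4.0, 5.6], total_exposure_s=0.01))
```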
Abstract:
A digital camera includes an imaging control unit that images a subject by performing driving based on a rolling shutter method on an imaging sensor, and a light emission control unit that emits auxiliary light a plurality of times from a light emission device while the imaging is performed. A difference Δt between start timings of exposure started by the driving in two adjacent pixel rows is a value other than 1/n (n is an integer greater than or equal to 1) of an exposure time in which the exposure is performed. The light emission control unit controls a light emission cycle of the auxiliary light to be a time that is m (m is an integer greater than or equal to 1) times the difference Δt.
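A minimal check of the stated relationship between the row start-time difference Δt, the exposure time, and the light emission cycle, assuming Δt must not equal exposure/n and the emission cycle is m·Δt. The helper names and microsecond units are illustrative.

```python
# Check that Δt is not exposure/n, then set the emission cycle to m * Δt.
from fractions import Fraction

def is_valid_row_delta(delta_t_us: int, exposure_us: int) -> bool:
    """Δt must not equal exposure/n for any integer n >= 1."""
    if delta_t_us <= 0 or delta_t_us > exposure_us:
        return delta_t_us > exposure_us           # Δt longer than the exposure is trivially valid
    return Fraction(exposure_us, delta_t_us).denominator != 1

def emission_cycle_us(delta_t_us: int, m: int) -> int:
    """Light emission cycle is m times the row-to-row start-time difference Δt."""
    return m * delta_t_us

if __name__ == "__main__":
    delta_t, exposure = 37, 1000                  # microseconds; 1000/37 is not an integer
    assert is_valid_row_delta(delta_t, exposure)
    print(emission_cycle_us(delta_t, m=4))        # 148 us between auxiliary-light pulses
```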
Abstract:
Provided is a technique capable of correcting unique shading characteristics of a single-eye stereoscopic imaging device. A focal length is acquired. A one-dimensional correction table corresponding to the focal length is acquired from a plurality of stored one-dimensional correction tables. Shading correction is performed using the acquired one-dimensional correction table. An arbitrary pixel is selected from the main pixels, a correction value corresponding to the position of the selected pixel is read from the acquired one-dimensional correction table for the main pixels, and shading correction is performed based on the correction value and the value of the arbitrary pixel. This process is performed for all of the main pixels and the sub-pixels. A two-dimensional SD correction unit performs normal shading correction for the data subjected to the shading correction, using a two-dimensional correction table.
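The described correction flow (select a one-dimensional table by focal length, correct each pixel from its position, then apply the normal two-dimensional shading correction) might be sketched as follows; the multiplicative gain model, the table contents, and the function name are assumptions.

```python
# Illustrative two-stage shading correction: 1-D table by focal length, then 2-D table.
import numpy as np

def shading_correct(pixels: np.ndarray, focal_length: float,
                    tables_1d: dict, table_2d: np.ndarray) -> np.ndarray:
    """pixels: H x W plane of main (or sub) pixels; tables_1d maps focal length
    to a one-dimensional gain table indexed by horizontal pixel position."""
    # 1) Acquire the 1-D correction table for the current focal length.
    key = min(tables_1d, key=lambda fl: abs(fl - focal_length))
    table_1d = tables_1d[key]                               # one correction value per position
    # 2) Read the correction value for each pixel position and correct the pixel value.
    corrected = pixels * table_1d[np.newaxis, :]
    # 3) Normal (two-dimensional) shading correction on the already-corrected data.
    return corrected * table_2d

if __name__ == "__main__":
    h, w = 4, 6
    tables = {35.0: np.linspace(1.0, 1.2, w), 50.0: np.linspace(1.0, 1.1, w)}
    flat = np.full((h, w), 100.0)
    print(shading_correct(flat, focal_length=48.0, tables_1d=tables,
                          table_2d=np.ones((h, w))))
```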