Abstract:
Optical apparatus includes a scanning line projector, which is configured to scan a line of radiation across a scene. A receiver includes an array of sensing elements, which are configured to output signals in response to the radiation that is incident thereon, and collection optics configured to image the scene onto the array, such that each sensing element receives the radiation reflected from a corresponding point in the scene. A processor is coupled to receive the signals output by the sensing elements, to identify respective times of passage of the scanned line across the points in the scene by comparing a time-dependent waveform of the signals from the corresponding sensing elements to an expected waveform, and to derive depth coordinates of the points in the scene from the respective times of passage.
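The core signal-processing step described above (matching each pixel's time-dependent waveform against an expected waveform to find when the scanned line passed) can be sketched as a cross-correlation peak search. This is an illustrative reading, not the patent's actual implementation; the sample period, waveform shape, and the calibration mapping times of passage to depth are all assumptions.

```python
import numpy as np

def time_of_passage(signal, expected, dt):
    """Estimate when the scanned line crossed a pixel by finding the lag
    at which the pixel's waveform best matches the expected waveform."""
    # Full cross-correlation; the peak index gives the best-aligned lag.
    corr = np.correlate(signal, expected, mode="full")
    lag = np.argmax(corr) - (len(expected) - 1)
    return lag * dt

# Toy example: the expected pulse appears 30 samples into the signal.
dt = 1e-6                                  # sample period in seconds (assumed)
expected = np.exp(-0.5 * ((np.arange(11) - 5) / 2.0) ** 2)
signal = np.zeros(100)
signal[30:41] = expected                   # line passes this pixel at sample 30
t = time_of_passage(signal, expected, dt)
```

Converting each time of passage into a depth coordinate would then depend on the scan geometry and a per-system calibration, which the abstract does not specify.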
Abstract:
An optical sensing device includes a light source, which is configured to emit one or more beams of light pulses at respective angles toward a target scene. An array of sensing elements is configured to output signals in response to incidence of photons on the sensing elements. Light collection optics are configured to image the target scene onto the array. Control circuitry is coupled to actuate the sensing elements only in one or more selected regions of the array, each selected region containing a respective set of the sensing elements in a part of the array onto which the light collection optics image a corresponding area of the target scene that is illuminated by one of the beams, and to adjust the membership of the respective set responsively to the distance of the corresponding area from the device.
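One way to read the distance-dependent region adjustment is as parallax compensation: because the emitter and the collection optics are offset, the imaged spot shifts across the array as the target distance changes. The sketch below illustrates that idea under assumed numbers (baseline, focal length in pixels, region half-width); none of these parameters come from the patent.

```python
def active_region(beam_col, distance_m, baseline_m=0.02,
                  focal_px=500.0, halfwidth=2, n_cols=128):
    """Columns of the sensor array to actuate for one beam.

    The imaged spot shifts with parallax: disparity in pixels is roughly
    focal_px * baseline_m / distance_m, so nearby targets land farther
    from the beam's nominal column. All parameters are illustrative.
    """
    disparity = focal_px * baseline_m / distance_m
    center = int(round(beam_col + disparity))
    lo = max(0, center - halfwidth)
    hi = min(n_cols - 1, center + halfwidth)
    return list(range(lo, hi + 1))
```

With these assumed values, a beam nominally at column 60 activates columns around 61 for a 10 m target but around 80 for a 0.5 m target, so the set's membership changes with distance as the abstract describes.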
Abstract:
Optical apparatus includes a projector, which is configured to direct a pattern of one or more stripes, extending along a longitudinal dimension across a target. A receiver includes an array of optical sensors, and objective optics, which are configured to image the target onto the array, and which have a non-circular aperture, which is elongated in a direction dependent upon the longitudinal dimension of the stripes.
Abstract:
Optical apparatus includes a plurality of emitters arranged in a row and configured to emit respective beams of optical radiation. Projection optics, which are configured to project the beams toward a target, include first cylindrical lenses, which have respective, mutually-parallel first cylinder axes and are aligned respectively with the emitters in the row so as to receive and focus the respective beams in a first dimension, and a second cylindrical lens, which has a second cylinder axis perpendicular to the first cylinder axes and is positioned to receive and focus all of the beams in a second dimension, perpendicular to the first dimension. A scan driver is configured to shift the second cylindrical lens in a direction perpendicular to the second cylinder axis so as to scan the beams across the target.
Abstract:
Imaging apparatus includes an image sensor, which acquires an image of a scene, and a scanner, which includes an optical transmitter, which emits a sequence of optical pulses toward the scene, and an optical receiver, which receives the optical pulses reflected from the scene and generates an output indicative of respective times of flight of the pulses. Scanning optics are configured to scan the optical pulses over the scene in a scan pattern that covers and is contained within a non-rectangular area within the scene. A processor identifies an object in the image of the scene, defines the non-rectangular area so as to contain the identified object, and processes the output of the optical receiver so as to extract a three-dimensional (3D) map of the object.
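Restricting the scan pattern so that it covers, and stays within, a non-rectangular area can be sketched as a raster visit order filtered by a region mask. This is only an abstraction of the control logic; the patent's scanning optics and pattern generation are not specified at this level.

```python
def scan_points(mask, step=1):
    """Raster-order scan positions confined to a non-rectangular area.

    `mask` is a 2D list of booleans marking the area that contains the
    identified object; only positions inside it are visited. A sketch of
    confining the scan, not of the actual scanning optics.
    """
    return [(r, c)
            for r in range(0, len(mask), step)
            for c in range(0, len(mask[0]), step)
            if mask[r][c]]

# Tiny L-shaped region: the top-right cell is outside the scan area.
mask = [[True, False],
        [True, True]]
pts = scan_points(mask)
```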
Abstract:
A light detection and ranging (LIDAR) system has an emitter which produces a sequence of outgoing pulses of coherent collimated light that is transmitted in a given direction, a mirror system having a scanning mirror that is positioned to deflect the outgoing pulse sequence towards a scene, and a detector collocated with the emitter and aimed to detect a sequence of incoming pulses, which are reflections of the outgoing pulses returning from said given direction after being deflected by the scanning mirror. An electronic controller communicates with the emitter and the detector and controls the scanning mirror so that the outgoing pulses scan the scene. The controller computes a radial distance or depth for each pair of outgoing and incoming pulses and uses the computed radial distances to provide a scanned 3D depth map of objects in the scene. Other embodiments are also described.
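The per-pulse-pair distance computation is the standard time-of-flight relation: the pulse travels to the target and back, so range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def radial_distance(t_out, t_in):
    """Depth from one outgoing/incoming pulse pair: range is half the
    round-trip distance c * (t_in - t_out)."""
    return C * (t_in - t_out) / 2.0
```

A 100 ns round trip, for example, corresponds to a target roughly 15 m away.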
Abstract:
A camera includes a pulse transmitter for transmitting at a transmit time, through an aperture and along an optical path to a target, a coherent electromagnetic ranging pulse at a first wavelength range outside the visible spectrum. In some embodiments, the camera includes a reflected pulse detector for receiving a reflected electromagnetic pulse reflected by the target back along the optical path and through the aperture at a detect time subsequent to the transmit time. In some embodiments, the camera includes a shutter positioned for shielding the pulse detector from at least the transmit time to an intermediate time between the transmit time and the detect time. In some embodiments, the shutter includes a layer of semiconductor material placed in the optical path at a point between the target and the detector.
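The shutter timing can be read as a range gate: the detector stays shielded from the transmit time until the earliest possible return from some minimum range, which defines the intermediate time. The policy below is an assumed illustration of that timing logic; the patent's shutter is a physical semiconductor layer, not software.

```python
def shutter_open(t, t_transmit, min_range_m, c=299_792_458.0):
    """Gate: keep the detector shielded from the transmit time until the
    earliest possible return from min_range_m away (an assumed policy)."""
    t_intermediate = t_transmit + 2.0 * min_range_m / c
    return t >= t_intermediate
```

For a 1 m minimum range the gate opens about 6.7 ns after transmission, blocking stray light from the outgoing pulse itself.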
Abstract:
Systems and methods for providing spatially dynamic illumination in camera systems. A spatially dynamic illumination source enables the illumination of only desired objects in the field of view of the camera, thereby reducing the amount of light required from the illumination source. The spatially dynamic illumination source may include an array of illumination elements and a control component. Each illumination element in the illumination array may include a light-emitting element combined with an optical element. A camera and the spatially dynamic illumination source may be combined in a camera and illumination system. The camera and illumination system may dynamically detect, track, and selectively illuminate only desired objects in the camera field of view.
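The selective-illumination control can be sketched as mapping detected objects' bounding boxes onto a grid of illumination elements, each covering one cell of the field of view, and driving only the overlapping cells. The grid size, field-of-view resolution, and bounding-box convention below are assumptions for illustration.

```python
def elements_to_enable(bboxes, grid=(8, 8), fov=(640, 480)):
    """Pick which illumination elements to drive, given object bounding
    boxes (x0, y0, x1, y1) in image pixels. Each element illuminates one
    cell of a grid tiling the field of view (grid and FOV are assumed)."""
    cols, rows = grid
    w, h = fov
    on = set()
    for (x0, y0, x1, y1) in bboxes:
        c0, c1 = int(x0 * cols / w), min(cols - 1, int(x1 * cols / w))
        r0, r1 = int(y0 * rows / h), min(rows - 1, int(y1 * rows / h))
        for r in range(r0, r1 + 1):
            for c in range(c0, c1 + 1):
                on.add((r, c))
    return sorted(on)
```

An object confined to the top-left grid cell lights a single element, while larger objects light only the cells they span, reducing the total light the source must emit.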