Abstract:
Example control methods and apparatuses, example LiDARs, and example terminal devices are provided. One example method includes controlling a transmitter to transmit a first pulse train, where the first pulse train includes M1 first-type pulses and M2 second-type pulses, M1 is an integer greater than 1, and M2 is a positive integer. The transmitter is controlled to transmit a second pulse train, where the second pulse train includes at least one of M3 first-type pulses or M4 second-type pulses, where a power of a first-type pulse is greater than a power of a second-type pulse, and where the second pulse train and the first pulse train have different transmission time periods, or correspond to different sub-emitters, or correspond to different pixels in a detection field of view, or correspond to different detection fields of view, or correspond to different sub-receivers.
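The two-power pulse-train scheme above can be sketched as follows; the relative power levels and the specific counts chosen for M1 through M4 are illustrative assumptions, not values from the disclosure.

```python
from dataclasses import dataclass
from typing import List

P_HIGH = 1.0   # relative power of a first-type pulse (assumed)
P_LOW = 0.25   # relative power of a second-type pulse; must be < P_HIGH

@dataclass
class PulseTrain:
    powers: List[float]  # per-pulse relative power, in transmission order

def make_train(n_high: int, n_low: int) -> PulseTrain:
    """Build a train of n_high first-type and n_low second-type pulses."""
    return PulseTrain(powers=[P_HIGH] * n_high + [P_LOW] * n_low)

# First train: M1 > 1 first-type pulses and M2 >= 1 second-type pulses.
first = make_train(n_high=3, n_low=2)
# Second train: at least one of M3 first-type or M4 second-type pulses;
# here it contains second-type pulses only.
second = make_train(n_high=0, n_low=4)
```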
Abstract:
A novel neural modeling framework, the Neural Transient Field (NeTF), is provided for non-line-of-sight (NLOS) imaging. NeTF recovers the 5D transient function in both spatial location and direction, and the training data input is parameterized on spherical wavefronts. A Markov chain Monte Carlo (MCMC) algorithm is used to account for sparse and unbalanced sampling in NeTF.
Abstract:
This application discloses a time-of-flight measurement method, apparatus, and system. The method includes obtaining histogram data of a target object. The histogram data includes m counts, m is an integer greater than 1, and each of the m counts is associated with a time. The method also includes performing digital filtering on the m counts to obtain m filtered values respectively corresponding to the m counts, and determining a time of flight of the target object based on a time corresponding to a peak value in the m filtered values.
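The filtering-and-peak step above can be sketched as follows: smooth the m per-bin counts with a small FIR kernel, then report the time associated with the filtered peak as the time of flight. The kernel shape and the bin width are illustrative assumptions, not values from the disclosure.

```python
def tof_from_histogram(counts, bin_width_s, kernel=(1, 2, 3, 2, 1)):
    """Digitally filter m histogram counts and return the time of flight."""
    m = len(counts)
    half = len(kernel) // 2
    norm = sum(kernel)
    filtered = []
    for i in range(m):
        acc = 0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < m:          # zero-pad at the histogram edges
                acc += w * counts[j]
        filtered.append(acc / norm)  # one filtered value per count
    peak_bin = max(range(m), key=lambda i: filtered[i])
    return peak_bin * bin_width_s    # time corresponding to the peak value

# Example: a noisy histogram whose true return sits in bin 7 (1 ns bins).
hist = [1, 0, 2, 1, 0, 3, 9, 20, 8, 2, 1, 0]
tof = tof_from_histogram(hist, bin_width_s=1e-9)
```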
Abstract:
A distance camera includes at least one photo element, a trigger generator activating the photo element during a temporal integration gate, a light source illuminating an object with light pulses having a predetermined temporal intensity profile with a duration Tp, and an intensity sensor determining the intensity Ip of the light pulses arriving on the photo element. The integration gate has a predetermined delay relative to the light pulse emission start point in order to capture the light pulses back-reflected from the object. The photo element outputs a signal value U at an integration end point in time T1e, in accordance with the intensity Ip and the duration of the light pulse arriving on the photo element during its activation.
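The gated-integration ranging principle above can be sketched as follows: a pulse that arrives at time t accumulates charge only until the integration end point T1e, so the output U falls linearly with delay, and inverting that relation recovers the round-trip time. The rectangular pulse profile and the responsivity constant k are illustrative assumptions, not part of the disclosure.

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_gate(U, Ip, T1e, k=1.0):
    """Invert U = k * Ip * (T1e - t_arrival) to recover range."""
    t_arrival = T1e - U / (k * Ip)   # round-trip arrival time, s
    return C * t_arrival / 2.0       # one-way distance, m

# Example: pulse intensity 2.0, gate ends 100 ns after emission, and the
# integrated signal corresponds to 30 ns of in-gate exposure.
d = distance_from_gate(U=2.0 * 30e-9, Ip=2.0, T1e=100e-9)
```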
Abstract:
Depth information about a scene of interest is acquired by illuminating the scene, capturing reflected light energy from the scene with one or more photodetectors, and processing the resulting signals. In at least one embodiment, a pseudo-randomly generated series of spatial light modulation patterns is used to modulate the light pulses either before or after reflection.
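The pseudo-random spatial-modulation idea above can be sketched as follows: each measurement applies one pseudo-random binary mask to the scene and sums the masked light at a single photodetector. The resolution, mask values, and seed are illustrative assumptions.

```python
import random

def make_patterns(n_patterns, h, w, seed=0):
    """Generate a reproducible pseudo-random series of binary h x w masks."""
    rng = random.Random(seed)  # seeded, hence pseudo-random and repeatable
    return [[[rng.randint(0, 1) for _ in range(w)] for _ in range(h)]
            for _ in range(n_patterns)]

def measure(scene, pattern):
    """Photodetector reading: scene light energy summed under one mask."""
    return sum(s * p for srow, prow in zip(scene, pattern)
               for s, p in zip(srow, prow))

# A toy 2 x 2 "scene" measured under four pseudo-random patterns.
scene = [[0, 5], [3, 0]]
patterns = make_patterns(4, 2, 2)
readings = [measure(scene, p) for p in patterns]
```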
Abstract:
A time-of-flight depth calculation method includes: obtaining a phase image, and obtaining, based on the phase image, a differential ratio of charge signals corresponding to reflected signals acquired by an image sensor at different times; in response to that the differential ratio of the charge signals is greater than or equal to a threshold, obtaining a first phase based on a phase conversion model and the differential ratio of the charge signals; and calculating a depth value of a target region based on the first phase.
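The differential-ratio-to-depth path above can be sketched as follows. The disclosure does not specify its phase conversion model; the four-sample arctangent model below is a common indirect-ToF stand-in, used here only for illustration, and the charge values and modulation frequency are assumed.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_charges(q0, q1, q2, q3, f_mod):
    """Sketch: differential charge signals -> first phase -> depth value."""
    # Differential signals from charges acquired at different times.
    num, den = q3 - q1, q0 - q2
    # Assumed phase conversion model: wrapped arctangent of the ratio.
    phase = math.atan2(num, den) % (2 * math.pi)   # first phase in [0, 2*pi)
    # One full phase wrap corresponds to half the modulation wavelength.
    return C * phase / (4 * math.pi * f_mod)

# Example: 20 MHz modulation, charges implying a quarter-cycle phase delay.
d = depth_from_charges(q0=100, q1=50, q2=50, q3=100, f_mod=20e6)
```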
Abstract:
A time delay of arrival (TDOA) between a time that a light pulse was emitted to a time that a pulse reflected off an object was received at a light sensor may be determined for saturated signals by using an edge of the saturated signal, rather than a peak of the signal, for the TDOA calculation. The edge of the saturated signal may be accurately estimated by fitting a first polynomial curve to data points of the saturated signal, defining an intermediate magnitude threshold based on the polynomial curve, fitting a second polynomial curve to data points near an intersection of the first polynomial curve and the intermediate threshold, and identifying an intersection of the second polynomial curve and the intermediate threshold as the rising edge of the saturated signal.
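The edge-estimation procedure above can be sketched as follows: fit one polynomial to the whole saturated return, derive an intermediate threshold from that fit, refit locally near the threshold crossing, and report that crossing as the rising edge. The polynomial orders, window size, and 50% threshold fraction are illustrative assumptions.

```python
import numpy as np

def rising_edge_time(t, v, frac=0.5, order1=3, order2=1):
    """Estimate the rising-edge time of a saturated (clipped) pulse."""
    p1 = np.polyfit(t, v, order1)                # coarse fit over all samples
    fit1 = np.polyval(p1, t)
    thresh = frac * fit1.max()                   # intermediate magnitude threshold
    # Samples bracketing the coarse fit's first crossing of the threshold.
    i = np.nonzero(fit1 >= thresh)[0][0]
    lo, hi = max(i - 2, 0), min(i + 3, len(t))
    p2 = np.polyfit(t[lo:hi], v[lo:hi], order2)  # local fit near the crossing
    # For a linear local fit, solve p2[0] * t + p2[1] = thresh exactly.
    return (thresh - p2[1]) / p2[0]

# Example: a clipped ramp-and-plateau return sampled at 1 ns; the true 50%
# crossing of the unsaturated ramp lies near t = 7 ns.
t = np.arange(0, 20.0)                       # ns
v = np.clip((t - 5.0) * 25.0, 0.0, 100.0)    # saturates (clips) at 100
edge = rising_edge_time(t, v)
```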
Abstract:
Doppler correction of broadband LIDAR includes mixing, during a first time interval, a returned optical signal with an in-phase version of the transmitted signal to produce a first mixed optical signal that is detected during the first time interval to produce a first electrical signal. During a non-overlapping second time interval the returned optical signal is mixed with a quadrature version of the transmitted signal to produce a second mixed optical signal that is detected during the second time interval to produce a second electrical signal. A complex digital signal uses one of the digitized electrical signals as the real part and a different one as the imaginary part. A signed Doppler frequency shift of the returned optical signal is determined based, at least in part, on a Fourier transform of the complex digital signal. A device is operated based on the Doppler frequency shift.
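The signed-Doppler recovery above can be sketched as follows: the digitized in-phase and quadrature electrical signals form the real and imaginary parts of one complex digital signal, and the peak of its Fourier transform gives a signed frequency shift. The sample rate, record length, and test shift are assumptions for illustration.

```python
import numpy as np

fs = 1e6        # sample rate, Hz (assumed)
n = 1024        # record length (assumed)
f_dopp = -50e3  # true signed Doppler shift for this example, Hz

t = np.arange(n) / fs
i_sig = np.cos(2 * np.pi * f_dopp * t)  # from mixing with the in-phase copy
q_sig = np.sin(2 * np.pi * f_dopp * t)  # from mixing with the quadrature copy
z = i_sig + 1j * q_sig                  # complex digital signal

# A real-only FFT cannot distinguish +f from -f; the complex signal can.
spectrum = np.fft.fft(z)
freqs = np.fft.fftfreq(n, d=1 / fs)
estimate = freqs[np.argmax(np.abs(spectrum))]  # signed peak frequency, Hz
```

The estimate is quantized to the FFT bin spacing fs / n, so it lands within one bin of the true shift.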
Abstract:
A method and device for selecting echoes from an echo list are provided. The method includes receiving an echo list, the echo list including a plurality of current echoes, at least two current echoes of the plurality of current echoes including a relationship correlation. The method also includes weighting at least one assignment of at least one current echo of the plurality of current echoes to at least one past echo of at least one past echo function. The method also includes selecting an assignment of at least one current echo to the at least one past echo of the at least one past echo function such that a predeterminable selection criterion is fulfilled. The selection of the assignment takes into account the relationship correlation between the at least two current echoes of the plurality of current echoes.
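The selection step above can be sketched as follows: every assignment of current echoes to past-echo tracks is weighted by range distance, a correlated pair of current echoes incurs a penalty if split across non-adjacent tracks, and the lowest total weight is selected. The distance weighting and penalty value are illustrative assumptions, not the disclosure's criterion.

```python
from itertools import permutations

def select_assignment(current, past, correlated_pair, penalty=10.0):
    """current/past: echo ranges in metres; correlated_pair: indices of two
    current echoes that share a relationship correlation."""
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(past)), len(current)):
        # Weight of this assignment: summed range mismatch per echo.
        cost = sum(abs(current[i] - past[j]) for i, j in enumerate(perm))
        a, b = correlated_pair
        # Selection criterion also honours the relationship correlation.
        if abs(perm[a] - perm[b]) > 1:
            cost += penalty
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Two correlated current echoes, three past-echo tracks.
assignment, cost = select_assignment(
    current=[10.1, 12.0], past=[10.0, 12.2, 30.0], correlated_pair=(0, 1))
```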
Abstract:
A motion recognition device capable of recognizing the motion of an object without contact with the object is provided. Further, a motion recognition device that has a simple structure and can recognize the motion of an object regardless of the state of the object is provided. By using a 3D TOF range image sensor in the motion recognition device, information on changes in position and shape is detected easily, even for a fast-moving object. Motion recognition is performed on the basis of pattern matching. Imaging data used for pattern matching is acquired from a 3D range measuring sensor. Object data is selected from imaging data on an object that changes over time, and motion data is estimated from a time change in the selected object data. The motion recognition device performs operation defined by output data generated from the motion data.