-
Publication No.: US12125269B2
Publication Date: 2024-10-22
Application No.: US17584449
Filing Date: 2022-01-26
Applicant: Ford Global Technologies, LLC
Inventor: Gaurab Banerjee , Vijay Nagasamy
IPC: G06V10/00 , G01S13/86 , G01S13/89 , G01S15/86 , G01S15/89 , G01S17/86 , G01S17/89 , G06V10/80 , G06V10/82 , G06V20/58 , G01S13/931 , G01S15/931 , G01S17/931
CPC classification number: G06V10/803 , G01S13/862 , G01S13/865 , G01S13/867 , G01S13/89 , G01S15/86 , G01S15/89 , G01S17/86 , G01S17/89 , G06V10/82 , G06V20/58 , G01S13/931 , G01S15/931 , G01S17/931
Abstract: A plurality of images can be acquired from a plurality of sensors and a plurality of flattened patches can be extracted from the plurality of images. An image location and a sensor type token identifying the type of sensor used to acquire the image from which the respective flattened patch was extracted can be added to each of the plurality of flattened patches. The flattened patches can be concatenated into a flat tensor, and a task token indicating a processing task can be added to the flat tensor, wherein the flat tensor is a one-dimensional array that includes two or more types of data. The flat tensor can be input to a first deep neural network that includes a plurality of encoder layers and a plurality of decoder layers and outputs transformer output. The transformer output can be input to a second deep neural network that determines an object prediction indicated by the task token, and the object predictions can be output.
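The tokenization the abstract describes — flatten patches, tag each with an image location and a sensor-type token, concatenate everything into one flat tensor, and prepend a task token — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the patented implementation; the patch size, token values, and helper names are assumptions.

```python
import numpy as np

PATCH = 4  # assumed patch edge length, for illustration only

def flatten_patches(image, sensor_token):
    """Split a 2-D image into PATCH x PATCH patches, flatten each patch,
    and append (row, col, sensor_token) metadata to every patch."""
    h, w = image.shape
    patches = []
    for r in range(0, h, PATCH):
        for c in range(0, w, PATCH):
            flat = image[r:r + PATCH, c:c + PATCH].ravel()
            patches.append(np.concatenate([flat, [r, c, sensor_token]]))
    return patches

def build_flat_tensor(images, sensor_tokens, task_token):
    """Concatenate all tagged patches into a one-dimensional array and
    prepend the task token, so pixel data and metadata share one tensor."""
    tagged = []
    for img, tok in zip(images, sensor_tokens):
        tagged.extend(flatten_patches(img, tok))
    return np.concatenate([[task_token]] + tagged)

# Two 8x8 "images" from two sensor types (e.g. camera = 0, radar = 1).
camera = np.zeros((8, 8))
radar = np.ones((8, 8))
flat = build_flat_tensor([camera, radar], sensor_tokens=[0, 1], task_token=99)
```

The resulting `flat` array is one-dimensional, mixing patch pixels, location metadata, sensor-type tokens, and the leading task token, as the abstract requires of the transformer input.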
-
Publication No.: US12117560B2
Publication Date: 2024-10-15
Application No.: US17394241
Filing Date: 2021-08-04
Applicant: Google LLC
Inventor: Nicholas Edward Gillian , Carsten C. Schwesig , Jaime Lien , Patrick M. Amihood , Ivan Poupyrev
IPC: G01S7/41 , A63F13/21 , A63F13/24 , G01S7/40 , G01S13/56 , G01S13/66 , G01S13/86 , G01S13/88 , G01S13/90 , G06F3/01 , G06F3/04815 , G06F16/245 , G06F18/21 , G06F18/25 , G06F18/28 , G06F18/40 , G06F21/32 , G06F21/62 , G06N20/00 , G06V10/80 , G06V20/64 , G06V40/20 , H04Q9/00 , H04W4/80 , H04W16/28 , G01S13/931 , G01S19/42 , G06F1/16 , G06F3/0346 , G06F3/0484 , G06F3/16 , G06T7/73 , G08C17/02
CPC classification number: G01S7/415 , A63F13/21 , A63F13/24 , G01S7/4004 , G01S7/41 , G01S7/412 , G01S13/56 , G01S13/66 , G01S13/86 , G01S13/867 , G01S13/88 , G01S13/888 , G01S13/90 , G01S13/904 , G06F3/011 , G06F3/017 , G06F3/04815 , G06F16/245 , G06F18/217 , G06F18/25 , G06F18/253 , G06F18/28 , G06F18/41 , G06F21/32 , G06F21/6245 , G06N20/00 , G06V10/806 , G06V20/64 , G06V40/28 , H04Q9/00 , H04W4/80 , H04W16/28 , A63F2300/8082 , G01S13/865 , G01S13/931 , G01S2013/9322 , G01S19/42 , G06F1/163 , G06F3/0346 , G06F3/0484 , G06F3/165 , G06F2203/0384 , G06F2221/2105 , G06T7/75 , G08C17/02 , G08C2201/93 , H04Q2209/883
Abstract: This document describes apparatuses and techniques for radar-enabled sensor fusion. In some aspects, a radar field is provided and reflection signals that correspond to a target in the radar field are received. The reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted. Based on the radar features, a sensor is activated to provide supplemental sensor data associated with the physical characteristic. The radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature. By so doing, performance of sensor-based applications, which rely on the enhanced radar features, can be improved.
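The core control flow in this abstract — extract a radar feature, activate a supplemental sensor when the feature needs refinement, then augment the feature with the supplemental data — can be sketched as a simple gating function. This is a hypothetical sketch under assumed names and a made-up confidence threshold, not the patented technique.

```python
def fuse_radar_feature(radar_feature, activate_sensor, threshold=0.9):
    """Radar-gated sensor fusion: when a coarse radar feature lacks
    confidence, activate a supplemental sensor on demand and average the
    two estimates of the same physical characteristic to refine it."""
    if radar_feature["confidence"] >= threshold:
        return radar_feature  # radar alone is good enough; no activation
    supplemental = activate_sensor()  # e.g. wake a camera only when needed
    refined = 0.5 * (radar_feature["value"] + supplemental)
    return {"value": refined, "confidence": threshold}

# A low-confidence radar range estimate triggers the supplemental sensor.
coarse = {"value": 2.0, "confidence": 0.4}
enhanced = fuse_radar_feature(coarse, activate_sensor=lambda: 2.4)
```

Gating the supplemental sensor on the radar feature, rather than running it continuously, is what lets the described system spend power only when augmentation actually improves accuracy.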
-
Publication No.: US20240300540A1
Publication Date: 2024-09-12
Application No.: US18179939
Filing Date: 2023-03-07
Applicant: GM Cruise Holdings LLC
Inventor: Burkay Donderici
CPC classification number: B60W60/0027 , G01S13/862 , G01S13/865 , G01S13/867 , G01S15/86 , G01S17/86 , B60W2420/403 , B60W2420/408 , B60W2420/54 , B60W2556/35
Abstract: Systems and techniques are provided for fusing sensor data from multiple sensors. An example method can include obtaining a first set of sensor data from a first sensor and a second set of sensor data from a second sensor; detecting an object in the first set of sensor data and the object in the second set of sensor data; aligning the object in the first set of sensor data and the object in the second set of sensor data to a common time; and fusing the aligned object from the first set of sensor data with the aligned object from the second set of sensor data.
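The two steps the method claims — aligning detections of the same object to a common time, then fusing them — can be sketched with linear extrapolation and a weighted average. This is a minimal illustration under assumed track fields (`pos`, `vel`, `t`) and a constant-velocity assumption; the patent does not specify these details.

```python
def align_to_time(track, t_common):
    """Extrapolate an object's position to a common timestamp, assuming
    constant velocity between the measurement time and t_common."""
    pos, vel, t = track["pos"], track["vel"], track["t"]
    dt = t_common - t
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def fuse(track_a, track_b, t_common, w_a=0.5):
    """Fuse two time-aligned detections of the same object by a weighted
    average of their extrapolated positions."""
    pa = align_to_time(track_a, t_common)
    pb = align_to_time(track_b, t_common)
    return tuple(w_a * a + (1 - w_a) * b for a, b in zip(pa, pb))

# Two sensors observed the same object at different times; align both
# detections to t = 1.0 before fusing them.
lidar = {"pos": (10.0, 0.0), "vel": (1.0, 0.0), "t": 0.0}
radar = {"pos": (10.5, 0.0), "vel": (1.0, 0.0), "t": 0.5}
fused = fuse(lidar, radar, t_common=1.0)
```

Without the alignment step, the two position measurements would disagree simply because they were taken at different moments; aligning first makes the residual disagreement attributable to sensor error alone.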
-
Publication No.: US12079004B2
Publication Date: 2024-09-03
Application No.: US17405715
Filing Date: 2021-08-18
Applicant: Waymo LLC
Inventor: Mingcheng Chen , Christian Lauterbach
CPC classification number: G05D1/0248 , G01S13/865 , G01S17/89 , G05D1/0088 , G05D1/0221
Abstract: Aspects of the disclosure relate to training and using a model for verifying the accuracy of ICP alignments, or alignments between data points determined using an iterative closest point algorithm. For instance, a model may be trained using ICP alignment data, including alignments between an object appearing in LIDAR sensor frames. The training may also include setting a definition for a trusted ICP alignment. In this regard, the model may be trained such that, in response to receiving additional LIDAR sensor frames and corresponding additional ICP alignment data, it outputs a value indicative of whether the additional ICP alignment data is trusted according to the definition. The model may then be used to control a vehicle in an autonomous driving mode by determining whether alignment data for an object determined using the ICP algorithm should be trusted.
-
Publication No.: US12066524B2
Publication Date: 2024-08-20
Application No.: US17893110
Filing Date: 2022-08-22
Applicant: SUTENG INNOVATION TECHNOLOGY CO., LTD.
Inventor: Changsheng Gong
IPC: G01S13/08 , G01S7/4863 , G01S13/86 , G01S17/08
CPC classification number: G01S13/865 , G01S13/08 , G01S17/08 , G01S7/4863
Abstract: This application is applicable to the field of radar technologies, and provides a radar data processing method, a terminal device, and a computer-readable storage medium. The method includes: obtaining radar data collected by a receiving area array; if the radar data is saturated, performing data fusion processing based on a floodlight distance value to obtain a fusion result; and determining a distance of a target object based on the fusion result. The method can accurately obtain an actual distance of the target object, effectively reduce measurement error, and improve calculation accuracy. It resolves the existing problem that a measurement result deviates significantly when the actual echo waveform cannot be effectively restored because the signal received by the radar is over-saturated, as occurs when the laser directly irradiates a target object with high reflectivity.
-
Publication No.: US12032056B2
Publication Date: 2024-07-09
Application No.: US17014928
Filing Date: 2020-09-08
Applicant: Lyft, Inc.
Inventor: Alfred Charles Jones, II , Marco Antonio Marroquín , Kevin Danford
IPC: G01S13/931 , B60R11/04 , G01S7/02 , G01S13/86
CPC classification number: G01S13/931 , B60R11/04 , G01S13/865 , G01S13/867 , G01S7/027 , G01S2013/93273
Abstract: A universal sensor assembly for mounting on a vehicle is provided. The universal sensor assembly includes a sensor suite. The sensor suite includes a baseplate and a sensor supported by the baseplate. The sensor includes a field of view (FOV) associated with detecting objects within an environment surrounding the vehicle. The universal sensor assembly further includes a support structure. The support structure includes a set of detachable attachment mechanisms supporting the baseplate. The set of detachable attachment mechanisms is included on a rooftop of the vehicle at positions that are based on surface parameters associated with the rooftop, along with a support component supporting the baseplate. The support component is disposed at a position on the rooftop that is based on the surface parameters so that the FOV of the sensor is unoccluded by any portion of the vehicle and the support structure.
-
Publication No.: US20240199241A1
Publication Date: 2024-06-20
Application No.: US18426390
Filing Date: 2024-01-30
IPC: G01S17/66 , B64G3/00 , G01S13/86 , G01S13/933 , G01S17/87 , G01S17/933
CPC classification number: G01S17/66 , B64G3/00 , G01S13/865 , G01S13/933 , G01S17/87 , G01S17/933
Abstract: In a search and tracking method for full time-domain laser detection of space debris, a set of latest precision orbital parameters of a debris object and start and end moments of a current transit of the object are first obtained. Search-specific guidance data is generated based on the above information and in combination with estimation of a maximum along-track error of the orbital parameters of the object during the current transit. A DLR system performs multi-elevation search on the object based on the search-specific guidance data, obtains a plurality of pieces of detection data of the object after detecting the object during the search, determines an along-track error of the orbital parameters of the object based on the detection data, and corrects the orbital parameters of the object in real time based on the along-track error, so as to guide the DLR system to subsequently track and detect the object.
-
Publication No.: US12013457B2
Publication Date: 2024-06-18
Application No.: US17150590
Filing Date: 2021-01-15
Applicant: UATC, LLC
Inventor: Raquel Urtasun , Bin Yang , Ming Liang , Sergio Casas , Runsheng Benson Guo
IPC: G01S13/86 , G01S7/41 , G01S13/58 , G01S17/89 , G01S17/931 , G06F18/22 , G06F18/2433 , G06F18/25 , G06N3/044 , G06N3/045 , G06N20/00 , G06V10/74 , G06V10/80 , G06V10/84 , G06V20/58
CPC classification number: G01S13/865 , G01S7/417 , G01S13/589 , G01S17/89 , G01S17/931 , G06F18/22 , G06F18/2433 , G06F18/251 , G06N3/045 , G06N20/00 , G06V10/761 , G06V10/803 , G06V10/84 , G06V20/58 , G06N3/044
Abstract: Systems and methods for integrating radar and LIDAR data are disclosed. In particular, a computing system can access radar sensor data and LIDAR data for the area around the autonomous vehicle. The computing system can determine, using one or more machine-learned models, one or more objects in the area of the autonomous vehicle. The computing system can, for a respective object, select a plurality of radar points from the radar sensor data. The computing system can generate a similarity score for each selected radar point. The computing system can generate a weight associated with each radar point based on the similarity score. The computing system can calculate a predicted velocity for the respective object based on a weighted average of a plurality of velocities associated with the plurality of radar points. The computing system can generate a proposed motion plan based on the predicted velocity for the respective object.
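The score-to-weight-to-velocity pipeline in this abstract can be sketched in a few lines: convert per-point similarity scores to normalized weights, then take the weighted average of the point velocities. The softmax weighting and the `temperature` parameter here are assumptions for illustration; the patent only states that weights are derived from similarity scores.

```python
import math

def similarity_weights(scores, temperature=1.0):
    """Turn per-point similarity scores into normalized weights that sum
    to one (a softmax, assumed here as one plausible weighting scheme)."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predicted_velocity(velocities, scores):
    """Weighted average of radar-point velocities, with each point
    weighted by how similar it is to the object being tracked."""
    weights = similarity_weights(scores)
    return sum(w * v for w, v in zip(weights, velocities))
```

With equal scores every radar point contributes equally; as one point's similarity score grows, the prediction moves toward that point's velocity, which is the behavior the weighting is meant to provide.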
-
Publication No.: US20240190466A1
Publication Date: 2024-06-13
Application No.: US18065419
Filing Date: 2022-12-13
Applicant: Kodiak Robotics, Inc.
Inventor: Derek J. Phillips , Collin C. Otis , Andreas Wendel , Jackson P. Rusch
CPC classification number: B60W60/0011 , G01S13/862 , G01S13/865 , B60W2420/42 , B60W2420/52 , B60W2420/54 , B60W2420/62
Abstract: This disclosure provides systems and methods for controlling an autonomous vehicle. The method comprises receiving data from a set of sensors, wherein the data represents objects or obstacles in an environment of the autonomous vehicle; identifying objects or obstacles from the received data; determining multiple sets of attributes of the objects or obstacles, wherein each set of attributes is determined based on data received by an individual sensor; determining a candidate trajectory for the autonomous vehicle based on the multiple sets of attributes of the objects or obstacles; and controlling the autonomous vehicle according to the candidate trajectory.
-
Publication No.: US12000958B2
Publication Date: 2024-06-04
Application No.: US17741374
Filing Date: 2022-05-10
Applicant: BDCM A2 LLC
Inventor: Matthew Paul Harrison
CPC classification number: G01S7/417 , G01S13/726 , G01S13/865 , G01S13/867 , G01S13/931 , G06F18/217 , G06F18/254 , G06N3/04 , G06N3/08
Abstract: Examples disclosed herein relate to an autonomous driving system in a vehicle, including a radar system with a reinforcement learning engine to control a beam steering antenna and identify targets in a path and a surrounding environment of the vehicle, and a sensor fusion module to receive information from the radar system on the identified targets and compare the information received from the radar system to information received from at least one sensor in the vehicle.
-