SYSTEM AND METHOD FOR PERFORMING A CAMERA TO GROUND ALIGNMENT FOR A VEHICLE

    Publication Number: US20250148639A1

    Publication Date: 2025-05-08

    Application Number: US18500395

    Filing Date: 2023-11-02

    Abstract: A method of performing a camera to ground alignment for a camera system on a vehicle. The method includes determining if enabling conditions have occurred and estimating a location of a vanishing point in a source image. Ground lines are selected based on the source image. A lane line detection is performed based on clustering of the ground lines to determine lane lines in the source image. At least one of pitch, yaw, or roll of the vehicle is estimated from the source image. A cost function based on estimates of pitch, yaw, and roll is minimized to obtain an optimal pitch value, an optimal yaw value, an optimal roll value, and lane lines from the source image. A sliding window-based refinement is performed on source images. Alignment results are broadcast to a downstream application, or it is determined whether the camera system on the vehicle is misaligned.
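
    The steps in this abstract (vanishing-point estimation, lane-line clustering, and minimization of a pitch/yaw/roll cost) follow a common pattern for camera-to-ground alignment. The sketch below is a minimal illustration of that pattern only, not the patented method; the helper callback residual_fn, the pinhole intrinsics K, and the choice of optimizer are assumptions introduced for the example.

```python
# Minimal sketch of a vanishing-point-based pitch/yaw estimate with a final
# cost minimization over (pitch, yaw, roll). Illustrative only; helpers and
# the optimizer choice are assumptions, not the patented implementation.
import numpy as np
from scipy.optimize import minimize

def vanishing_point(lines):
    """Least-squares intersection of 2D lines given as rows (a, b, c) with ax + by + c = 0."""
    A = lines[:, :2]
    b = -lines[:, 2]
    vp, *_ = np.linalg.lstsq(A, b, rcond=None)
    return vp  # (u, v) in pixels

def pitch_yaw_from_vp(vp, K):
    """Back-project the vanishing point through intrinsics K to a forward
    direction and read off pitch and yaw (roll is not observable from the VP alone)."""
    d = np.linalg.inv(K) @ np.array([vp[0], vp[1], 1.0])
    d /= np.linalg.norm(d)
    pitch = np.arcsin(-d[1])       # camera tilt up/down
    yaw = np.arctan2(d[0], d[2])   # camera heading left/right
    return pitch, yaw

def refine_angles(init_angles, lane_lines, K, residual_fn):
    """Minimize a cost over (pitch, yaw, roll); residual_fn (an assumed callback)
    scores how well detected lane lines map to parallel ground-plane lines."""
    cost = lambda x: float(np.sum(residual_fn(x, lane_lines, K) ** 2))
    result = minimize(cost, init_angles, method="Nelder-Mead")
    return result.x  # refined pitch, yaw, roll
```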

    Aggregation-based LIDAR data alignment

    Publication Number: US12130390B2

    Publication Date: 2024-10-29

    Application Number: US17569948

    Filing Date: 2022-01-06

    Abstract: A LIDAR-to-vehicle alignment system includes a memory and alignment and autonomous driving modules. The memory stores points of data provided based on an output of one or more LIDAR sensors and localization data. The alignment module performs an alignment process including: based on the localization data, determining whether a host vehicle is turning; in response to the host vehicle turning, selecting a portion of the points of data; aggregating the selected portion to provide aggregated data; selecting targets based on the aggregated data; and, based on the selected targets, iteratively reducing a loss value of a loss function to provide a resultant LIDAR-to-vehicle transformation matrix. The autonomous driving module: based on the resultant LIDAR-to-vehicle transformation matrix, converts at least the selected portion to at least one of vehicle coordinates or world coordinates to provide resultant data; and performs one or more autonomous driving operations based on the resultant data.
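
    The abstract describes aggregating LIDAR sweeps collected while the vehicle is turning and then iteratively reducing a loss to recover the LIDAR-to-vehicle transform. The sketch below shows one generic way such a loop could be wired up; the turning threshold, the target_residuals callback, and the optimizer are assumptions for illustration, not the claimed method.

```python
# Illustrative sketch only, not the patented implementation. Assumes each
# sweep is an Nx3 point array and that target_residuals() (a hypothetical
# callback) scores how well aggregated targets line up across sweeps under
# a candidate LIDAR-to-vehicle transform.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation as R

def is_turning(yaw_rates, threshold=0.1):
    """Flag sweeps recorded while the host vehicle is turning (rad/s threshold is an assumption)."""
    return np.abs(yaw_rates) > threshold

def aggregate(points_per_sweep, vehicle_poses, T_lidar_to_vehicle):
    """Transform each selected sweep into world coordinates and stack the results."""
    clouds = []
    for pts, T_vehicle_to_world in zip(points_per_sweep, vehicle_poses):
        homog = np.hstack([pts, np.ones((len(pts), 1))])
        clouds.append((T_vehicle_to_world @ T_lidar_to_vehicle @ homog.T).T[:, :3])
    return np.vstack(clouds)

def solve_alignment(points_per_sweep, vehicle_poses, target_residuals, x0=np.zeros(6)):
    """Iteratively reduce a loss over a 6-DoF vector x = (rx, ry, rz, tx, ty, tz)."""
    def to_matrix(x):
        T = np.eye(4)
        T[:3, :3] = R.from_rotvec(x[:3]).as_matrix()
        T[:3, 3] = x[3:]
        return T
    def loss(x):
        agg = aggregate(points_per_sweep, vehicle_poses, to_matrix(x))
        return float(np.sum(target_residuals(agg) ** 2))
    result = minimize(loss, x0, method="Powell")
    return to_matrix(result.x)  # resultant LIDAR-to-vehicle transformation matrix
```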

    Dynamic lidar to camera alignment
    Invention Grant

    Publication Number: US11892574B2

    Publication Date: 2024-02-06

    Application Number: US16944253

    Filing Date: 2020-07-31

    CPC classification number: G01S7/4972 G01S17/86 G01S17/931 G06T7/70

    Abstract: Systems and methods are provided for controlling a vehicle. In one embodiment, a method includes: receiving, by a controller onboard the vehicle, lidar data from the lidar device; receiving, by the controller, image data from the camera device; computing, by the controller, a lidar edge map based on the lidar data; computing, by the controller, an inverse distance transformation (IDT) edge map based on the image data; aligning, by the controller, points of the IDT edge map with points of the lidar edge map to determine extrinsic parameters; storing, by the controller, the extrinsic parameters as calibrations in a data storage device; and controlling, by the controller, the vehicle based on the stored calibrations.
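
    The inverse distance transform (IDT) idea scores a candidate lidar-to-camera extrinsic by projecting lidar edge points into the image and reading off how far each lands from the nearest image edge. The sketch below illustrates that idea with OpenCV; the Canny thresholds, the depth-jump edge heuristic, and the cost form are assumptions, not the patented calibration.

```python
# Illustrative sketch only; helpers and parameter choices are assumptions.
# Builds an image edge-distance map and sums it at projected lidar edge points.
import cv2
import numpy as np

def camera_idt_edge_map(image_gray):
    """Canny edges, then distance-to-nearest-edge; small values mean close to an edge."""
    edges = cv2.Canny(image_gray, 50, 150)
    return cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)

def lidar_edge_points(points, ranges, jump=0.5):
    """Crude depth-discontinuity edge extraction on a range-ordered scan."""
    diffs = np.abs(np.diff(ranges, prepend=ranges[0]))
    return points[diffs > jump]

def alignment_cost(extrinsics_T, lidar_edges_xyz, idt_map, K):
    """Project lidar edge points with candidate extrinsics and sum the IDT values;
    a good alignment places projected points on or near image edges."""
    homog = np.hstack([lidar_edges_xyz, np.ones((len(lidar_edges_xyz), 1))])
    cam = (extrinsics_T @ homog.T).T[:, :3]
    cam = cam[cam[:, 2] > 0.1]                      # keep points in front of the camera
    uv = (K @ cam.T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
    h, w = idt_map.shape
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return float(idt_map[uv[valid, 1], uv[valid, 0]].sum())
```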

    CALIBRATION PIPELINE FOR ESTIMATING SIX DEGREES OF FREEDOM (6DOF) ALIGNMENT PARAMETERS FOR AN AUTONOMOUS VEHICLE

    Publication Number: US20230126100A1

    Publication Date: 2023-04-27

    Application Number: US17511959

    Filing Date: 2021-10-27

    Abstract: A calibration pipeline for 6DoF alignment parameters for an autonomous vehicle includes an automated driving controller instructed to receive inertial measurement unit (IMU) poses and final radar poses and determine smoothened IMU poses from the IMU poses and smoothened final radar poses from the final radar poses. The automated driving controller aligns the smoothened IMU poses and the smoothened final radar poses with one another to create a plurality of radar-IMU A, B relative pose pairs. The automated driving controller determines a solution yielding a threshold number of inliers of further filtered radar-IMU A, B relative pose pairs, randomly samples the further filtered radar-IMU A, B relative pose pairs with replacement several times to determine a stream of filtered radar-IMU A, B relative pose pairs, and solves for a solution X for the stream of filtered radar-IMU A, B relative pose pairs.
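
    Forming relative pose pairs (A from the IMU, B from the radar) and solving for a fixed transform X is the classic hand-eye formulation AX = XB. The sketch below shows a generic Tsai-Lenz-style solver combined with sampling with replacement as a stand-in for the robust filtering step; averaging the per-sample solutions is a simplification, and none of this is the patented pipeline.

```python
# Illustrative hand-eye (AX = XB) sketch under stated assumptions. A_i are
# relative IMU poses, B_i relative radar poses, each a 4x4 homogeneous matrix.
import numpy as np
from scipy.spatial.transform import Rotation as R

def solve_rotation(A_list, B_list):
    """Find R_X aligning the rotation-vector sets of A and B via an SVD (Kabsch-style)."""
    a = np.array([R.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    b = np.array([R.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    U, _, Vt = np.linalg.svd(a.T @ b)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt  # rotation part of X

def solve_translation(A_list, B_list, Rx):
    """Stack (R_A - I) t_X = R_X t_B - t_A over all pairs and solve least squares."""
    M, v = [], []
    for A, B in zip(A_list, B_list):
        M.append(A[:3, :3] - np.eye(3))
        v.append(Rx @ B[:3, 3] - A[:3, 3])
    t, *_ = np.linalg.lstsq(np.vstack(M), np.hstack(v), rcond=None)
    return t

def solve_X(A_list, B_list, n_samples=100, rng=np.random.default_rng(0)):
    """Sample pose pairs with replacement, solve each subset, and average the
    results (rotations via rotation vectors -- a simplification for the sketch)."""
    rotvecs, trans = [], []
    for _ in range(n_samples):
        idx = rng.choice(len(A_list), size=len(A_list), replace=True)
        A_s = [A_list[i] for i in idx]
        B_s = [B_list[i] for i in idx]
        Rx = solve_rotation(A_s, B_s)
        rotvecs.append(R.from_matrix(Rx).as_rotvec())
        trans.append(solve_translation(A_s, B_s, Rx))
    X = np.eye(4)
    X[:3, :3] = R.from_rotvec(np.mean(rotvecs, axis=0)).as_matrix()
    X[:3, 3] = np.mean(trans, axis=0)
    return X
```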
