COMBINING RULE-BASED AND LEARNED SENSOR FUSION FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

    Publication Number: US20250123605A1

    Publication Date: 2025-04-17

    Application Number: US18989849

    Filing Date: 2024-12-20

    Abstract: In various examples, systems and methods are disclosed that perform sensor fusion using rule-based and learned processing methods to take advantage of the accuracy of learned approaches and the decomposition benefits of rule-based approaches for satisfying higher levels of safety requirements. For example, in-parallel and/or in-serial combinations of early rule-based sensor fusion, late rule-based sensor fusion, early learned sensor fusion, or late learned sensor fusion may be used to satisfy various safety goals associated with various required safety levels at a high level of accuracy and precision. In embodiments, learned sensor fusion may be used to make more conservative decisions than the rule-based sensor fusion (as determined using, e.g., the severity (S), exposure (E), and controllability (C) (SEC) values associated with a current safety goal), but the rule-based sensor fusion may be relied upon where the learned sensor fusion decision is less conservative than the corresponding rule-based decision.
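
    The arbitration logic described above — prefer the learned decision only when it is at least as conservative as the rule-based one, otherwise fall back to the rule-based decision — can be sketched as follows. This is a minimal illustration, not the patented implementation; the `Action` ordering and function names are assumptions introduced here for clarity.

```python
from enum import IntEnum

class Action(IntEnum):
    # Hypothetical action set; higher value = more conservative intervention.
    PROCEED = 0
    SLOW = 1
    BRAKE = 2

def fuse_decisions(rule_based: Action, learned: Action) -> Action:
    """Use the learned decision only if it is at least as conservative as
    the rule-based decision; otherwise rely on the rule-based fallback."""
    return learned if learned >= rule_based else rule_based
```

    For example, a learned BRAKE overrides a rule-based SLOW, but a learned PROCEED never overrides a rule-based BRAKE, preserving the rule-based safety floor.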

    Particle-based hazard detection for autonomous machines

    Publication Number: US12235353B2

    Publication Date: 2025-02-25

    Application Number: US17454389

    Filing Date: 2021-11-10

    Abstract: In various examples, a hazard detection system fuses outputs from multiple sensors over time to determine a probability that a stationary object or hazard exists at a location. The system may then use sensor data to calculate a detection bounding shape for detected objects and, using the bounding shape, may generate a set of particles, each including a confidence value that an object exists at a corresponding location. The system may then capture additional sensor data using one or more sensors of the ego-machine that differ from those used to capture the first sensor data. To improve the accuracy of the particle confidences, the system may determine a correspondence between the first sensor data and the additional sensor data (e.g., depth sensor data), which may be used to filter out a portion of the particles and improve the depth predictions corresponding to the object.
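
    The particle workflow described above — scatter particles over a detection bounding shape, then use a second sensor modality (e.g., depth) to prune inconsistent particles and boost the confidence of consistent ones — can be sketched as below. All names, the confidence increment, and the tolerance are assumptions for illustration only, not the patented method.

```python
import random

def spawn_particles(bbox, n=100):
    """Scatter n particles uniformly inside a detection bounding box
    (x0, y0, x1, y1), each with an initial existence confidence."""
    x0, y0, x1, y1 = bbox
    return [{"x": random.uniform(x0, x1),
             "y": random.uniform(y0, y1),
             "conf": 0.5} for _ in range(n)]

def update_with_depth(particles, depth_lookup, expected_depth, tol=0.5):
    """Keep particles whose depth reading agrees with the expected depth
    (raising their confidence) and drop those that clearly disagree."""
    kept = []
    for p in particles:
        d = depth_lookup(p["x"], p["y"])
        if abs(d - expected_depth) <= tol:
            p["conf"] = min(1.0, p["conf"] + 0.1)
            kept.append(p)
    return kept
```

    Repeating the update over successive frames concentrates surviving particles (and confidence mass) at locations consistently supported by both sensor modalities.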

    JOINT 2D AND 3D OBJECT TRACKING FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

    Publication Number: US20230360231A1

    Publication Date: 2023-11-09

    Application Number: US17955814

    Filing Date: 2022-09-29

    CPC classification number: G06T7/246 G06T2207/30252

    Abstract: In various examples, techniques for multi-dimensional tracking of objects using two-dimensional (2D) sensor data are described. Systems and methods may use first image data to determine a first 2D detected location and a first three-dimensional (3D) detected location of an object. The systems and methods may then determine a 2D estimated location using the first 2D detected location and a 3D estimated location using the first 3D detected location. The systems and methods may use second image data to determine a second 2D detected location and a second 3D detected location of a detected object, and may then determine that the object corresponds to the detected object using the 2D estimated location, the 3D estimated location, the second 2D detected location, and the second 3D detected location. The systems and methods then generate, modify, delete, or otherwise update an object track that includes 2D state information and 3D state information.
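
    One common way to realize the association step described above is to combine the 2D image-plane distance and the 3D world-space distance between a track's estimated locations and a new detection into a single cost. The sketch below illustrates that idea under assumed names, weights, and threshold; it is not the specific matching criterion claimed in the application.

```python
import math

def association_cost(track, detection, w2d=1.0, w3d=1.0):
    """Weighted sum of the 2D and 3D Euclidean distances between a
    track's estimated locations and a detection's detected locations."""
    d2 = math.dist(track["est_2d"], detection["det_2d"])
    d3 = math.dist(track["est_3d"], detection["det_3d"])
    return w2d * d2 + w3d * d3

def matches(track, detection, threshold=5.0):
    """Associate the detection with the track when the joint cost is
    below a gating threshold (hypothetical value)."""
    return association_cost(track, detection, 1.0, 1.0) < threshold
```

    Using both spaces jointly is more robust than either alone: two objects that overlap in the image plane may still be separable in 3D, and vice versa.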
