-
Publication Number: US20180067492A1
Publication Date: 2018-03-08
Application Number: US15420994
Application Date: 2017-01-31
Applicant: Mentor Graphics Corporation
Inventor: Rainer Oder , Andreas Erich Geiger , Ljubo Mercep , Matthias Pollach
CPC classification number: G05B17/02 , G01C21/28 , G01C21/3407 , G01C21/3415 , G01C21/3667 , G01S7/4808 , G01S13/723 , G01S13/862 , G01S13/865 , G01S13/867 , G01S13/931 , G01S15/931 , G01S17/936 , G01S2013/9375 , G01S2013/9378 , G01S2013/9385 , G05D1/0088 , G05D1/021 , G05D1/024 , G05D1/0246 , G05D1/0255 , G05D1/0257 , G05D1/0259 , G05D1/0268 , G05D1/0276 , G05D1/0278 , G05D2201/0213 , G06F16/29 , G06F16/5854 , G06F17/5009 , G06K9/00791 , G08G1/161 , G08G1/163 , G08G1/164
Abstract: This application discloses a computing system to implement multi-level sensor fusion in an assisted or automated driving system of a vehicle. The computing system can populate an environmental model with raw measurement data captured by sensors mounted in the vehicle and with data corresponding to a possible object in a coordinate field associated with the environmental model. The computing system can combine at least a portion of the raw measurement data with the data corresponding to the possible object, and detect an object proximate to the vehicle based, at least in part, on that combination. A control system for the vehicle can control operation of the vehicle based, at least in part, on the detection of the object.
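The combination of raw, sensor-level data with object-level data described in the abstract can be illustrated with a short sketch. The Python example below is a minimal, hypothetical rendering of that idea, not the patented implementation: the EnvironmentalModel and ObjectHypothesis names, the 2-D point representation, and the support-count detection rule are all illustrative assumptions.

```python
# Minimal sketch of the multi-level fusion idea: confirm an object-level
# hypothesis when enough raw measurements fall inside its assumed extent.
# All names and thresholds are illustrative assumptions, not the patented design.
from dataclasses import dataclass, field


@dataclass
class ObjectHypothesis:
    """Object-level data placed into the environmental coordinate field."""
    x: float          # position in the environmental coordinate field (m)
    y: float
    radius: float     # assumed spatial extent of the possible object (m)


@dataclass
class EnvironmentalModel:
    """Holds raw measurements and object hypotheses in one coordinate field."""
    raw_points: list = field(default_factory=list)   # [(x, y), ...]
    hypotheses: list = field(default_factory=list)   # [ObjectHypothesis, ...]

    def add_raw_measurement(self, x: float, y: float) -> None:
        self.raw_points.append((x, y))

    def add_hypothesis(self, hyp: ObjectHypothesis) -> None:
        self.hypotheses.append(hyp)

    def detect_objects(self, min_support: int = 3) -> list:
        """Combine raw data with object-level data: a hypothesis is confirmed
        when at least min_support raw points lie within its radius."""
        detections = []
        for hyp in self.hypotheses:
            support = sum(
                1 for (px, py) in self.raw_points
                if (px - hyp.x) ** 2 + (py - hyp.y) ** 2 <= hyp.radius ** 2
            )
            if support >= min_support:
                detections.append((hyp, support))
        return detections


if __name__ == "__main__":
    model = EnvironmentalModel()
    # Raw radar/lidar returns already mapped into the environmental coordinate field.
    for point in [(10.1, 2.0), (10.3, 2.2), (9.9, 1.8), (30.0, -5.0)]:
        model.add_raw_measurement(*point)
    # A possible object reported by an object-level (e.g., camera) pipeline.
    model.add_hypothesis(ObjectHypothesis(x=10.0, y=2.0, radius=1.0))
    for hyp, support in model.detect_objects():
        print(f"Detected object near ({hyp.x}, {hyp.y}) with {support} raw hits")
```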
-
Publication Number: US20180067489A1
Publication Date: 2018-03-08
Application Number: US15287537
Application Date: 2016-10-06
Applicant: Mentor Graphics Corporation
Inventor: Rainer Oder , Andreas Erich Geiger , Ljubo Mercep , Matthias Pollach
CPC classification number: G05D1/0088 , G05D1/0231 , G05D1/0255 , G05D1/0257 , G06K9/00791
Abstract: This application discloses a computing system to implement low-level sensor fusion in an assisted or automated driving system of a vehicle. The low-level sensor fusion can include receiving raw measurement data from sensors in the vehicle and temporally aligning the raw measurement data based on a time of capture. The low-level sensor fusion can include spatially aligning measurement coordinate fields of the sensors into an environmental coordinate field based, at least in part, on where the sensors are mounted in the vehicle, and then populating the environmental coordinate field with raw measurement data captured by the sensors based on the spatial alignment of the measurement coordinate fields to the environmental coordinate field. The low-level sensor fusion can detect at least one detection event or object based, at least in part, on the raw measurement data from multiple sensors as populated in the environmental coordinate field.
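The temporal and spatial alignment steps described in this abstract lend themselves to a brief illustration. The Python sketch below is a hypothetical rendering under simplifying assumptions (2-D points, a fixed mounting pose per sensor, a grid-cell voting rule for detection events); the sensor names, poses, and thresholds are invented for illustration and do not come from the application.

```python
# Minimal sketch of low-level sensor fusion: align raw measurements in time
# (by capture time) and in space (by sensor mounting pose), populate a shared
# environmental coordinate field, and flag detection events where multiple
# sensors agree. All poses, names, and tolerances are illustrative assumptions.
import math
from collections import defaultdict

# Assumed mounting poses: (x offset, y offset, yaw in radians) per sensor,
# relative to an environmental coordinate field centered on the vehicle.
SENSOR_POSES = {
    "front_radar": (3.5, 0.0, 0.0),
    "left_lidar": (0.0, 1.0, math.pi / 2),
}


def to_environmental(sensor: str, x: float, y: float) -> tuple:
    """Spatially align a point from a sensor's measurement coordinate field."""
    dx, dy, yaw = SENSOR_POSES[sensor]
    xe = dx + x * math.cos(yaw) - y * math.sin(yaw)
    ye = dy + x * math.sin(yaw) + y * math.cos(yaw)
    return xe, ye


def fuse(measurements, time_slot=0.05, cell_size=1.0, min_sensors=2):
    """Group raw measurements by capture-time slot, map them into the
    environmental coordinate field, and report a detection event in any grid
    cell that multiple sensors populate within the same time slot."""
    cells = defaultdict(set)  # (time slot, cell x, cell y) -> contributing sensors
    for sensor, t, x, y in measurements:
        xe, ye = to_environmental(sensor, x, y)
        key = (round(t / time_slot), int(xe // cell_size), int(ye // cell_size))
        cells[key].add(sensor)
    return [key for key, sensors in cells.items() if len(sensors) >= min_sensors]


if __name__ == "__main__":
    # (sensor, capture time in seconds, x, y) in each sensor's own coordinates.
    raw = [
        ("front_radar", 0.01, 6.6, 0.4),
        ("left_lidar", 0.02, -0.5, -10.1),  # maps near the radar return above
        ("front_radar", 0.30, 25.0, 1.0),   # unsupported by any other sensor
    ]
    print("Detection events in (time slot, cell x, cell y):", fuse(raw))
```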
-