Abstract:
A system and method for creating an enhanced perspective view of an area in front of a vehicle, using images from left-front and right-front cameras. The enhanced perspective view removes the distortion and exaggerated perspective effects that are inherent in wide-angle lens images. The enhanced perspective view uses a camera model that includes a virtual image surface, along with other processing techniques, to correct two problems typically present in de-warped perspective images: the stretching effect at the periphery of a wide-angle image de-warped by rectilinear projection, and the double imaging of objects in the area where the left-front and right-front camera images overlap.
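As a rough illustration of the kind of processing described above, the following Python sketch de-warps a wide-angle image onto a virtual image surface and feather-blends the strip where the two de-warped views overlap. The equidistant fisheye model, the cylindrical choice of virtual surface, and all names (dewarp_to_virtual_surface, blend_overlap, f_fish, f_virt, overlap_px) are assumptions made for illustration, not the patented method itself.

```python
import numpy as np

def dewarp_to_virtual_surface(fisheye_img, f_fish, f_virt, out_w, out_h,
                              surface="cylinder"):
    """Map a fisheye image onto a virtual image surface (sketch).

    A planar surface reproduces the classic rectilinear (pinhole) view but
    stretches objects near the periphery; a cylindrical surface keeps the
    horizontal angular sampling uniform and suppresses that stretching.
    Assumes an equidistant fisheye model, r = f_fish * theta.
    """
    h_in, w_in = fisheye_img.shape[:2]
    cx_in, cy_in = w_in / 2.0, h_in / 2.0
    cx_out, cy_out = out_w / 2.0, out_h / 2.0

    # Pixel grid of the virtual (output) image, centered on the optical axis.
    u, v = np.meshgrid(np.arange(out_w), np.arange(out_h))
    x = u - cx_out
    y = v - cy_out

    if surface == "plane":                      # rectilinear projection
        ray_x, ray_y, ray_z = x, y, np.full_like(x, float(f_virt))
    else:                                       # cylindrical virtual surface
        theta_h = x / f_virt                    # horizontal angle linear in u
        ray_x = f_virt * np.sin(theta_h)
        ray_y = y
        ray_z = f_virt * np.cos(theta_h)

    # Angle of each ray from the optical axis, then the fisheye image radius.
    norm = np.sqrt(ray_x**2 + ray_y**2 + ray_z**2)
    theta = np.arccos(ray_z / norm)
    r_fish = f_fish * theta
    phi = np.arctan2(ray_y, ray_x)

    # Nearest-neighbor lookup into the source fisheye image.
    src_u = np.rint(cx_in + r_fish * np.cos(phi)).astype(int)
    src_v = np.rint(cy_in + r_fish * np.sin(phi)).astype(int)
    valid = (src_u >= 0) & (src_u < w_in) & (src_v >= 0) & (src_v < h_in)

    out = np.zeros((out_h, out_w) + fisheye_img.shape[2:], fisheye_img.dtype)
    out[valid] = fisheye_img[src_v[valid], src_u[valid]]
    return out

def blend_overlap(left_view, right_view, overlap_px):
    """Feather-blend two de-warped views rendered onto the same canvas so
    objects seen by both cameras appear once instead of as a double image."""
    w = left_view.shape[1]
    weight = np.ones(w)
    start = (w - overlap_px) // 2
    weight[start:start + overlap_px] = np.linspace(1.0, 0.0, overlap_px)
    weight[start + overlap_px:] = 0.0
    weight = weight[None, :, None] if left_view.ndim == 3 else weight[None, :]
    composite = left_view * weight + right_view * (1.0 - weight)
    return composite.astype(left_view.dtype)
```

The linear feathering in blend_overlap is only one possible de-duplication strategy; the key point it illustrates is that each object in the overlap region contributes to the composite exactly once.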
Abstract:
A method and system for estimating the state of health of an object sensing fusion system. Target data from a vision system and a radar system, which are used by the object sensing fusion system, are also stored in a context queue. The context queue maintains the vision and radar target data for a sequence of many frames covering a sliding window of time. The target data from the context queue are used to compute matching scores, which indicate how well vision targets correlate with radar targets, and vice versa. The matching scores are computed within individual frames of vision and radar data, and across a sequence of multiple frames. The matching scores are used to assess the state of health of the object sensing fusion system. If the fusion system's state of health falls below a threshold, one or more faulty sensors are identified.
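The following Python sketch shows one way the described pipeline could be organized: a fixed-length context queue holds vision and radar targets for a sliding window of frames, per-frame matching scores are averaged over the window, and a low windowed score flags a suspect sensor. The class name FusionHealthMonitor, the nearest-neighbor gating used for the matching score, and the parameters (window_frames, gate, health_threshold) are assumptions made for illustration, not the claimed method.

```python
import numpy as np
from collections import deque

class FusionHealthMonitor:
    """Minimal sketch of a fusion state-of-health monitor (assumed design)."""

    def __init__(self, window_frames=50, gate=2.0, health_threshold=0.6):
        self.queue = deque(maxlen=window_frames)   # the context queue
        self.gate = gate                           # max distance for a match
        self.health_threshold = health_threshold

    @staticmethod
    def _match_fraction(sources, candidates, gate):
        """Fraction of `sources` that have a `candidate` within `gate`."""
        if len(sources) == 0:
            return 1.0                             # nothing left unexplained
        if len(candidates) == 0:
            return 0.0
        d = np.linalg.norm(sources[:, None, :] - candidates[None, :, :], axis=2)
        return float(np.mean(d.min(axis=1) <= gate))

    def update(self, vision_targets, radar_targets):
        """Store one frame of targets and return (health, suspect_sensor)."""
        self.queue.append((np.asarray(vision_targets, float),
                           np.asarray(radar_targets, float)))

        # Per-frame matching scores, averaged across the sliding window.
        v2r = [self._match_fraction(v, r, self.gate) for v, r in self.queue]
        r2v = [self._match_fraction(r, v, self.gate) for v, r in self.queue]
        vision_score, radar_score = float(np.mean(v2r)), float(np.mean(r2v))

        health = min(vision_score, radar_score)
        suspect = None
        if health < self.health_threshold:
            # Heuristic: the sensor whose targets are less often confirmed
            # by the other sensor is flagged as the likely faulty one.
            suspect = "vision" if vision_score < radar_score else "radar"
        return health, suspect
```

A caller would feed one frame of target positions per cycle, for example monitor.update(vision_targets=[[10.0, 1.5]], radar_targets=[[10.2, 1.4]]), and act on the returned health value and suspect-sensor label.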