Abstract:
A vision system of a vehicle includes a camera and a non-imaging sensor. With the camera and the non-imaging sensor disposed at the vehicle, the field of view of the camera at least partially overlaps the field of sensing of the non-imaging sensor at an overlapping region. A processor is operable to process image data captured by the camera and sensor data captured by the non-imaging sensor to determine a driving situation of the vehicle. Responsive to determination of the driving situation, Kalman Filter parameters associated with the determined driving situation are determined and, using the determined Kalman Filter parameters, a Kalman Filter fusion may be determined. The determined Kalman Filter fusion may be applied to captured image data and captured sensor data to determine an object present in the overlapping region.
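The situation-dependent fusion described above can be sketched as follows. The driving-situation names, noise values, and one-dimensional range state are illustrative assumptions, not details taken from the abstract, which leaves the parameters and state model unspecified.

```python
# Hypothetical sketch of situation-dependent Kalman filter fusion of a
# camera and a non-imaging sensor (e.g. radar) observing the same object.
# Situation names and noise parameters below are assumptions.

# Per-situation parameters: (process noise Q, camera noise R, radar noise R)
KF_PARAMS = {
    "highway": (0.01, 4.0, 0.25),
    "urban":   (0.10, 1.0, 0.50),
}

def kalman_fuse(situation, x, p, cam_range, radar_range):
    """One predict/update cycle fusing a camera range estimate and a
    radar range estimate for an object in the overlapping region,
    using a one-dimensional range state x with variance p."""
    q, r_cam, r_radar = KF_PARAMS[situation]
    # Predict: constant-range model; process noise grows the uncertainty.
    p = p + q
    # Sequential measurement updates, one per sensor.
    for z, r in ((cam_range, r_cam), (radar_range, r_radar)):
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # pull the state toward the measurement
        p = (1.0 - k) * p        # reduce the covariance
    return x, p

x, p = kalman_fuse("highway", x=20.0, p=2.0, cam_range=21.0, radar_range=20.4)
```

The fused estimate lands between the two measurements, weighted toward the lower-noise sensor, and the variance shrinks with each update.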
Abstract:
A vision system for a vehicle includes a camera disposed at or proximate to an in-cabin portion of a windshield of the vehicle. The camera has a forward field of view to the exterior of the vehicle through the windshield of the vehicle. The camera is operable to capture image data. A control includes an image processor that is operable to process captured image data to determine lane delimiters present in the field of view of the camera. The control connects to a vehicle communication bus of the vehicle and receives vehicle data via the vehicle communication bus. Responsive at least in part to processing of captured image data by the image processor and to vehicle data received via the vehicle communication bus, the control determines a yaw rate. The control provides the determined yaw rate to a driver assistance system of the vehicle.
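One plausible way to derive a yaw rate from detected lane delimiters together with vehicle data from the bus (speed) is to combine the road curvature seen by the camera with the vehicle's heading change relative to the lane. This decomposition, and the function below, are assumptions for illustration; the abstract does not state the actual computation.

```python
def camera_yaw_rate(speed, lane_curvature, heading_change, dt):
    """Vision-derived yaw rate (rad/s): vehicle speed (m/s, from the
    vehicle communication bus) times the lane curvature (1/m, from the
    detected lane delimiters), plus the rate of change of the vehicle's
    heading relative to the lane over the frame interval dt.
    The decomposition is an illustrative assumption."""
    return speed * lane_curvature + heading_change / dt
```

On a straight lane with no heading drift this yields zero, matching a vehicle driving straight ahead.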
Abstract:
A control system or method for a vehicle references a camera and sensors to determine when an offset of a yaw rate sensor may be updated. The sensors may include a longitudinal accelerometer, a transmission sensor, a vehicle speed sensor, and a steering angle sensor. The offset of the yaw rate sensor may be updated when the vehicle is determined to be stationary by referencing at least a derivative of an acceleration from the longitudinal accelerometer. The offset of the yaw rate sensor may be updated when the vehicle is determined to be moving straight by referencing at least image data captured by the camera. Lane delimiters may be detected in the captured image data and evaluated to determine a level of confidence in the straight movement. When the offset of the yaw rate sensor is to be updated, a ratio of new offset to old offset may be used.
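The stationarity check and the offset update can be sketched as below. The jerk threshold and the blending formula are illustrative assumptions; the abstract states only that the derivative of longitudinal acceleration is referenced and that a ratio of new offset to old offset may be used, without giving values.

```python
def is_stationary(accel_samples, dt, jerk_thresh=0.05):
    """Crude stationarity check: the derivative of longitudinal
    acceleration (jerk, m/s^3) stays below a small threshold across
    the sample window. The threshold value is an assumption."""
    jerks = [(b - a) / dt for a, b in zip(accel_samples, accel_samples[1:])]
    return all(abs(j) < jerk_thresh for j in jerks)

def update_offset(old_offset, measured_offset, blend=0.1):
    """Blend the newly measured offset into the stored one. The
    abstract's 'ratio of new offset to old offset' is modeled here as a
    simple weighted blend -- an assumption about the unstated formula."""
    return (1.0 - blend) * old_offset + blend * measured_offset
```

A noisy but flat acceleration trace passes the check; a ramping one (vehicle accelerating) fails it, so the offset is left untouched.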
Abstract:
A vehicular cabin monitoring system includes an interior-viewing camera disposed at a vehicle and viewing a driver sitting at a driver seat of the vehicle and viewing at least one other seat of the vehicle. The interior-viewing camera is operable to capture image data. The vehicular cabin monitoring system, via processing of image data captured by the interior-viewing camera, detects an occupant present in the vehicle. The vehicular cabin monitoring system, via processing of image data captured by the interior-viewing camera, monitors the driver sitting at the driver seat of the vehicle. Control of the vehicle is adjusted based at least in part on the monitoring of the driver and the detected occupant.
Abstract:
A vehicular vision system includes a forward-viewing camera that views through a windshield forward of a vehicle. Responsive at least in part to processing by an image processor of image data captured by the forward-viewing camera while the equipped vehicle is traveling along a road, another vehicle on the road ahead of the equipped vehicle is detected, and the vehicular vision system may determine lateral acceleration of the detected other vehicle on the road ahead of the equipped vehicle. The vehicular vision system may generate an output based at least in part on the determined lateral acceleration of the detected other vehicle on the road ahead of the equipped vehicle. Responsive to determination that the equipped vehicle is approaching a school zone, pedestrian detection via image processing of image data captured by the forward-viewing camera may be enhanced.
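The lateral acceleration of the detected other vehicle could be estimated from successive camera-derived lateral positions, for example with a second difference. The abstract does not specify the computation; the finite-difference scheme below is an illustrative assumption.

```python
def lateral_acceleration(lateral_positions, dt):
    """Estimate the lead vehicle's lateral acceleration (m/s^2) from a
    series of camera-derived lateral positions (m) sampled every dt
    seconds, using a central second difference over the last three
    samples. Illustrative assumption about the unstated method."""
    y0, y1, y2 = lateral_positions[-3:]
    return (y2 - 2.0 * y1 + y0) / (dt * dt)
```

For positions generated by a constant 2 m/s^2 lateral acceleration, the estimate recovers that value.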
Abstract:
A vehicular vision system includes a camera disposed at an in-cabin side of a windshield of a vehicle. Responsive at least in part to processing of captured image data, the system determines a camera-derived path of travel of the vehicle along a road. Responsive at least in part to a geographic location of the vehicle, the system determines a geographic-derived path of travel of the vehicle along the road. Control of the vehicle along the road is based on diminished reliance on the determined geographic-derived path of travel when a geographic location reliability level of the determined geographic-derived path of travel is below a threshold geographic location reliability level. Control of the vehicle along the road is based on diminished reliance on the determined camera-derived path of travel of the vehicle when a camera reliability level of the determined camera-derived path of travel is below a threshold camera reliability level.
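The reliability-gated blending of the two path estimates can be sketched as a weighted fusion in which a source falling below its threshold has its weight sharply diminished. The thresholds, the 0.1 penalty factor, and the representation of a path as a list of lateral offsets are all illustrative assumptions.

```python
def fuse_paths(cam_path, geo_path, cam_rel, geo_rel,
               cam_thresh=0.6, geo_thresh=0.6):
    """Blend the camera-derived and geographic-derived paths, diminishing
    reliance on whichever source's reliability level falls below its
    threshold. Paths are equal-length lists of lateral offsets (m);
    thresholds and the weighting scheme are assumptions."""
    w_cam = cam_rel if cam_rel >= cam_thresh else cam_rel * 0.1
    w_geo = geo_rel if geo_rel >= geo_thresh else geo_rel * 0.1
    total = w_cam + w_geo
    return [(w_cam * c + w_geo * g) / total
            for c, g in zip(cam_path, geo_path)]
```

With both sources reliable the result is the midpoint; with the geographic source degraded, the fused path tracks the camera-derived path almost exclusively.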
Abstract:
A vehicular driving assist system includes an electronic control unit (ECU). When the equipped vehicle is driving in the autonomous mode, driving of the vehicle is controlled by the ECU without human intervention. When the vehicle is being driven in a non-autonomous mode, a particular driver at least partially drives the vehicle. The vehicular driving assist system learns a driving style of the particular driver and, when the vehicle is being driven in the non-autonomous mode by the particular driver, the system restricts learning of the driving style of the particular driver responsive to determination by the system that the particular driver is (i) driving the vehicle erratically and/or (ii) performing an illegal driving maneuver. When the vehicle is driving in the autonomous mode, and responsive to the particular driver being present in the vehicle, the vehicle drives in accordance with the learned driving style of the particular driver.
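The learning restriction could be implemented as a gate on the style update, as in this minimal sketch. The parameter name, the exponential-averaging scheme, and the learning rate are hypothetical; the abstract states only that learning is restricted under erratic or illegal driving.

```python
def update_driving_style(style, observation, erratic, illegal):
    """Exponentially average observed driving parameters into the
    learned style of the particular driver, but restrict learning when
    the driver is driving erratically or performing an illegal maneuver.
    Parameter names and the averaging scheme are assumptions."""
    if erratic or illegal:
        return style  # learning restricted: learned style unchanged
    alpha = 0.1  # assumed learning rate
    return {k: (1 - alpha) * style[k] + alpha * observation[k] for k in style}
```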
Abstract:
A method for vehicular control includes providing a forward-viewing camera, a yaw rate sensor, a longitudinal accelerometer, a speed sensor and a control system at the vehicle. While the vehicle is moving, an angular rotational velocity of the vehicle about a local vertical axis is determined, a yaw rate offset is determined, and a longitudinal acceleration is determined. A corrected yaw rate is determined responsive to the determined yaw rate offset of the yaw rate sensor and the determined longitudinal acceleration of the vehicle. The control system determines a projected driving path of the vehicle based at least in part on the determined corrected yaw rate. A hazard condition ahead of the vehicle in the projected driving path is determined at least in part responsive to detecting an object and to the projected driving path. The system automatically applies the brakes of the vehicle responsive to the determined hazard condition.
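The yaw rate correction and path projection steps can be sketched as follows. The acceleration-sensitivity coefficient and the constant-turn-rate integration are illustrative assumptions; the abstract does not give the correction formula or the projection model.

```python
import math

def corrected_yaw_rate(raw_yaw_rate, yaw_offset, longitudinal_accel,
                       accel_sensitivity=0.001):
    """Correct the measured yaw rate (rad/s) by subtracting the
    determined offset and an acceleration-dependent term. The
    sensitivity coefficient is an assumption about how longitudinal
    acceleration enters the correction."""
    return raw_yaw_rate - yaw_offset - accel_sensitivity * longitudinal_accel

def projected_path(yaw_rate, speed, horizon=2.0, steps=20):
    """Project the driving path over the horizon (s) by dead-reckoning a
    constant-turn-rate, constant-speed model; returns (x, y) waypoints
    in meters ahead of the vehicle."""
    x = y = heading = 0.0
    dt = horizon / steps
    path = []
    for _ in range(steps):
        heading += yaw_rate * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        path.append((x, y))
    return path
```

A zero corrected yaw rate projects a straight path along the vehicle's longitudinal axis, against which a detected object's position can be checked for a hazard condition.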
Abstract:
A method for determining potential collision with another vehicle by a vehicle equipped with a vision system includes providing, at the equipped vehicle, a vision sensor including a camera and at least one non-vision sensor, and determining presence of a leading vehicle ahead of the equipped vehicle. A time to collision to the leading vehicle is determined based at least in part on a determined distance to the leading vehicle and a determined relative velocity between the equipped vehicle and the leading vehicle. A braking level of the equipped vehicle is determined to mitigate collision of the equipped vehicle with the leading vehicle. A weighting factor is employed when determining the braking level, and the weighting factor is adjusted responsive at least in part to determining, via processing of image data captured by the camera, that a brake light of the leading vehicle ceases to be illuminated.
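The time-to-collision and weighted braking computation can be sketched as below. The TTC formula follows directly from the abstract; the braking-level mapping, the critical TTC, and the direction and magnitude of the weighting adjustment when the lead vehicle's brake light ceases to be illuminated are illustrative assumptions.

```python
def time_to_collision(distance, relative_velocity):
    """TTC (s) = distance to the leading vehicle divided by the closing
    speed; infinite when the gap is not closing."""
    return distance / relative_velocity if relative_velocity > 0 else float("inf")

def braking_level(ttc, weight, ttc_critical=2.0):
    """Map TTC to a 0..1 braking command scaled by the weighting factor.
    The linear mapping and the critical TTC value are assumptions."""
    if ttc >= ttc_critical:
        return 0.0
    return min(1.0, weight * (ttc_critical - ttc) / ttc_critical)

def adjust_weight(weight, lead_brake_light_off, step=0.2):
    """Adjust the weighting factor when image processing determines that
    the leading vehicle's brake light ceases to be illuminated. The
    abstract does not state the direction of the adjustment; reducing
    the weight (softer braking) is an assumption."""
    return max(0.0, weight - step) if lead_brake_light_off else weight
```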
Abstract:
A method for determining a corrected yaw rate for a vehicle includes receiving a first yaw rate input from a yaw rate sensor of the vehicle and determining if the vehicle is moving or stationary. If the vehicle is determined to be moving, the method includes determining a steering angle of the vehicle, and determining an offset correction value based at least in part on a determined speed of the vehicle and the determined steering angle. A yaw rate offset is determined based at least in part on the determined offset correction value and the received first yaw rate input. A second yaw rate input is received from the yaw rate sensor of the vehicle, and a corrected yaw rate value is determined based at least in part on the received second yaw rate input and the determined yaw rate offset.
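One way to compute an offset correction from speed and steering angle is a kinematic bicycle model: the model predicts the yaw rate the vehicle should exhibit, and the offset is the discrepancy between the sensor reading and that prediction. The bicycle model and the wheelbase value are illustrative assumptions about the unstated computation.

```python
import math

def offset_correction(speed, steering_angle, wheelbase=2.8):
    """Expected yaw rate (rad/s) for the determined speed (m/s) and
    steering angle (rad) under a kinematic bicycle model. The wheelbase
    value and the model itself are assumptions."""
    return speed * math.tan(steering_angle) / wheelbase

def yaw_rate_offset(first_yaw_rate, correction):
    """Offset estimate: the first yaw rate input minus the
    model-expected yaw rate (the offset correction value)."""
    return first_yaw_rate - correction

def corrected_yaw_rate(second_yaw_rate, offset):
    """The second yaw rate input corrected by the determined offset."""
    return second_yaw_rate - offset
```

Driving straight (zero steering angle), the model expects zero yaw rate, so any residual sensor reading is attributed entirely to the offset.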