Abstract:
A navigation route cooperation navigation system and a method of controlling the same are provided. The system can secure a visible distance or a communicable range, allow a member vehicle that cannot perform cluster driving to drive according to driving information of a leader vehicle, allow a member vehicle to drive along a recommended route that is better than the leader vehicle's navigation route, and set a member vehicle that passes the leader vehicle as a new leader vehicle.
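The leader-handover rule above (a member that passes the leader becomes the new leader) can be sketched as follows. All names (`Vehicle`, `Platoon`, `position`) are hypothetical illustrations, not terms from the patent, and real systems would decide "passing" from richer driving information than a single scalar position.

```python
# Minimal sketch, assuming each vehicle's progress along a shared route
# can be summarized as a scalar position (meters). Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vid: str
    position: float  # distance along the shared route, in meters

@dataclass
class Platoon:
    leader: Vehicle
    members: list = field(default_factory=list)

    def update(self):
        # If any member has passed the leader, promote the frontmost
        # member to leader; the old leader becomes an ordinary member.
        front = max(self.members, key=lambda v: v.position, default=None)
        if front is not None and front.position > self.leader.position:
            self.members.remove(front)
            self.members.append(self.leader)
            self.leader = front
        return self.leader

p = Platoon(leader=Vehicle("A", 100.0),
            members=[Vehicle("B", 90.0), Vehicle("C", 120.0)])
print(p.update().vid)  # → C (C has passed A, so C becomes the leader)
```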
Abstract:
A method of estimating a self-location of a vehicle includes: obtaining an omnidirectional image by using an omnidirectional camera that photographs the ground around a driving vehicle at its current location; estimating the current location of the driving vehicle on the basis of a global positioning system (GPS) satellite signal received from a satellite; searching a satellite image database, which stores satellite images of the ground photographed by the satellite, to determine candidate satellite images corresponding to the estimated current location; comparing the omnidirectional image with each of the determined candidate satellite images to determine the candidate satellite image having the highest similarity to the omnidirectional image; and finally estimating the current location of the driving vehicle by using a location measurement value mapped to the determined candidate satellite image.
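The matching step above can be sketched as a search for the best-scoring candidate. The images here are toy 2x2 grayscale grids and the similarity measure is a simple negative L1 distance; a real system would use a correlation-based or learned similarity, and all names (`estimate_location`, `similarity`) are hypothetical.

```python
# Illustrative sketch: candidate satellite patches near the GPS estimate
# are compared against the vehicle's omnidirectional image, and the best
# match supplies the final location fix.

def similarity(a, b):
    # Higher is more similar; a toy stand-in for a real image metric.
    return -sum(abs(x - y) for row_a, row_b in zip(a, b)
                for x, y in zip(row_a, row_b))

def estimate_location(omni_image, candidates):
    # candidates: list of (satellite_patch, (lat, lon)) pairs drawn from
    # the satellite image database around the GPS-estimated position.
    best_patch, best_fix = max(candidates,
                               key=lambda c: similarity(omni_image, c[0]))
    return best_fix

omni = [[10, 20], [30, 40]]
candidates = [
    ([[0, 0], [0, 0]],     (37.000, 127.000)),
    ([[12, 19], [29, 41]], (37.001, 127.001)),  # closest match
    ([[90, 90], [90, 90]], (37.002, 127.002)),
]
print(estimate_location(omni, candidates))  # → (37.001, 127.001)
```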
Abstract:
A system for integrating gestures and sounds including: a gesture recognition unit that extracts gesture feature information corresponding to user commands from image information and acquires gesture recognition information from the gesture feature information; a background recognition unit that acquires background sound information from sound information using a predetermined background sound model; a sound recognition unit that extracts sound feature information corresponding to user commands from the sound information based on the background sound information and acquires sound recognition information from the sound feature information; and an integration unit that generates integration information by integrating the gesture recognition information and the sound recognition information.
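The integration unit above can be sketched as a simple score fusion, assuming each recognizer emits scored command hypotheses. The fusion rule (summing per-label scores) and all names are illustrative assumptions, not details from the abstract.

```python
# Minimal sketch of integrating gesture and sound recognition results.
# gesture_hyps / sound_hyps: dicts mapping command label -> score.

def integrate(gesture_hyps, sound_hyps):
    combined = {}
    for hyps in (gesture_hyps, sound_hyps):
        for label, score in hyps.items():
            combined[label] = combined.get(label, 0.0) + score
    # The integration information here is the top-scoring command.
    return max(combined, key=combined.get)

gesture = {"volume_up": 0.7, "volume_down": 0.2}
sound   = {"volume_up": 0.6, "mute": 0.5}
print(integrate(gesture, sound))  # → volume_up
```

In practice the two modalities would be weighted by their reliability (e.g. discounting sound scores when the background sound model reports heavy noise), rather than summed with equal weight.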
Abstract:
Provided are an apparatus and method for sharing and learning driving environment data to improve the decision intelligence of an autonomous vehicle. The apparatus includes: a sensing section that senses surrounding vehicles traveling within a preset distance of the autonomous vehicle; a communicator that transmits and receives data between the autonomous vehicle and another vehicle or a cloud server; a storage that stores precise lane-level map data; and a learning section that generates mapping data centered on the autonomous vehicle by mapping driving environment data sensed by the sensing section onto the precise map data, transmits the mapping data to the other vehicle or the cloud server through the communicator, and performs learning for autonomous driving using the mapping data and data received from the other vehicle or the cloud server.
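The mapping step above can be sketched as projecting ego-relative detections onto map coordinates and packaging them for transmission. Coordinate handling is simplified to a 2-D rigid transform, and all names (`to_map_frame`, `ego_pose`) are hypothetical, not from the patent.

```python
# Illustrative sketch: sensed surrounding vehicles (ego-relative offsets)
# are mapped onto precise-map coordinates centered on the autonomous
# vehicle, then serialized for sharing with another vehicle or a cloud
# server.
import json
import math

def to_map_frame(ego_pose, detections):
    # ego_pose: (x, y, heading_rad) of the autonomous vehicle on the map.
    # detections: list of (forward, left) offsets from the ego vehicle.
    x, y, h = ego_pose
    mapped = []
    for fwd, left in detections:
        mx = x + fwd * math.cos(h) - left * math.sin(h)
        my = y + fwd * math.sin(h) + left * math.cos(h)
        mapped.append({"x": round(mx, 2), "y": round(my, 2)})
    return mapped

ego = (100.0, 50.0, 0.0)            # heading along +x
sensed = [(10.0, 0.0), (0.0, 3.5)]  # one car ahead, one in the next lane
payload = json.dumps({"ego": ego, "vehicles": to_map_frame(ego, sensed)})
print(payload)
```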