Abstract:
A global navigation satellite system (GNSS) and gyroscope control system for vehicle steering control comprising a GNSS receiver and antennas at a fixed spacing to determine a vehicle position, velocity and at least one of a heading angle, a pitch angle and a roll angle based on carrier phase position differences. The system also includes a control system configured to receive the vehicle position, heading, and at least one of roll and pitch, and configured to generate a steering command to a vehicle steering system. The system includes gyroscopes for determining system attitude change with respect to multiple axes for integrating with GNSS-derived positioning information to determine vehicle position, velocity, rate-of-turn, attitude and other operating characteristics. Relative orientations and attitudes between motive and working components can be determined using optical sensors and cameras. The system can also be used to guide multiple vehicles in relation to each other.
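The abstract above describes integrating a low-rate GNSS-derived heading with high-rate gyroscope rate-of-turn measurements. As a minimal illustrative sketch of that kind of fusion (a generic complementary filter, not the patented method; the gains and names are assumptions):

```python
import math

def fuse_heading(gnss_heading_rad, gyro_yaw_rate_rad_s, prev_heading_rad, dt, alpha=0.98):
    """Complementary filter: propagate heading with the gyro, correct slowly toward GNSS.

    alpha close to 1.0 trusts the gyro for short-term changes; (1 - alpha)
    pulls the estimate toward the GNSS-derived heading to cancel gyro drift.
    """
    predicted = prev_heading_rad + gyro_yaw_rate_rad_s * dt          # gyro propagation
    error = math.atan2(math.sin(gnss_heading_rad - predicted),       # wrap to [-pi, pi]
                       math.cos(gnss_heading_rad - predicted))
    return predicted + (1.0 - alpha) * error                          # blended estimate

# Example: 10 Hz loop, vehicle turning at 0.05 rad/s
heading = 0.0
for _ in range(10):
    heading = fuse_heading(gnss_heading_rad=0.06, gyro_yaw_rate_rad_s=0.05,
                           prev_heading_rad=heading, dt=0.1)
print(round(heading, 4))
```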
Abstract:
The present invention provides a work vehicle including a control system that can switch between a first driving mode, in which the work vehicle travels in a manned state, and a second driving mode, in which it travels in an unmanned state. During execution of the second driving mode, the control system reduces the number of types of information exchanged by communication within the control system below that of the first driving mode, or lengthens the communication interval of that information beyond that of the first driving mode.
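To illustrate the mode-dependent communication behavior described above, the sketch below selects fewer message types and a longer exchange interval while the unmanned (second) mode is active. The message names, intervals, and policy table are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

MANNED, UNMANNED = "manned", "unmanned"

@dataclass
class CommPolicy:
    message_types: tuple   # which kinds of information are exchanged
    interval_s: float      # how often they are exchanged

# Hypothetical policy table: fewer message types and a longer interval
# while the work vehicle runs unmanned.
POLICIES = {
    MANNED:   CommPolicy(("position", "speed", "implement_state", "operator_display"), 0.1),
    UNMANNED: CommPolicy(("position", "speed"), 0.5),
}

def next_messages(mode, state):
    """Return only the message types allowed in the current driving mode, plus the interval."""
    policy = POLICIES[mode]
    return {k: v for k, v in state.items() if k in policy.message_types}, policy.interval_s

state = {"position": (12.3, 45.6), "speed": 1.8,
         "implement_state": "lowered", "operator_display": "OK"}
msgs, interval = next_messages(UNMANNED, state)
print(msgs, interval)   # only position and speed, exchanged every 0.5 s
```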
Abstract:
An aircraft is provided and includes a frame, drive elements configured to drive movements of the frame, and a computer configured to receive mission planning and manual commands and to control operations of the drive elements. The aircraft operates in a safe mode, in which mission commands are accepted but manual commands are refused; a manual mode, in which mission commands are refused but manual commands are accepted; and an enroute mode. The computer is further configured to allow mode transitions only between the safe and manual modes and between the safe and enroute modes.
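The allowed transitions (safe ↔ manual and safe ↔ enroute, never manual ↔ enroute directly) can be captured in a small state machine. The sketch below is an assumption-laden illustration of that rule, not the aircraft computer's actual logic; the enroute command handling in particular is an assumption, since the abstract does not specify it.

```python
# Allowed transitions: manual <-> safe <-> enroute; manual <-> enroute is forbidden.
ALLOWED = {
    "safe":    {"manual", "enroute"},
    "manual":  {"safe"},
    "enroute": {"safe"},
}

class ModeController:
    def __init__(self):
        self.mode = "safe"

    def request_transition(self, target):
        if target in ALLOWED[self.mode]:
            self.mode = target
            return True
        return False            # e.g. manual -> enroute must pass through safe

    def accept_command(self, kind):
        """Mission commands are refused in manual mode; manual commands in safe mode."""
        if self.mode == "safe":
            return kind == "mission"
        if self.mode == "manual":
            return kind == "manual"
        return True             # enroute behavior: an assumption, not stated in the abstract

ctrl = ModeController()
print(ctrl.request_transition("enroute"))   # True  (safe -> enroute)
print(ctrl.request_transition("manual"))    # False (enroute -> manual not allowed)
```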
Abstract:
Robotic devices may be operated by users remotely. A learning controller apparatus may detect remote transmissions comprising user control instructions. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The learning apparatus may monitor one or more wavelengths (e.g., infrared light, a radio channel) and detect transmissions from the user's remote control device to the robot during its operation by the user. The learning apparatus may be configured to develop associations between the detected user remote control instructions and actions of the robot for a given context. When a given sensory context occurs, the learning controller may automatically provide the robot with the control instructions associated with that context. The provision of control instructions to the robot by the learning controller may obviate the need for user remote control of the robot, thereby enabling autonomous operation by the robot.
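A minimal sketch of the kind of context-to-command association such a learning controller might build: it records which user command was observed in a given sensory context and replays the most frequent one when that context recurs. The class, method names, and contexts are hypothetical.

```python
from collections import Counter, defaultdict

class LearningController:
    """Associates observed user remote-control commands with sensory context."""

    def __init__(self):
        self.associations = defaultdict(Counter)   # context -> command frequencies

    def observe(self, context, user_command):
        """Record a detected remote transmission issued while `context` was sensed."""
        self.associations[context][user_command] += 1

    def suggest(self, context):
        """Return the command most often associated with this context, or None if unseen."""
        if context not in self.associations:
            return None
        return self.associations[context].most_common(1)[0][0]

ctrl = LearningController()
ctrl.observe("obstacle_ahead", "turn_left")
ctrl.observe("obstacle_ahead", "turn_left")
ctrl.observe("clear_path", "forward")
print(ctrl.suggest("obstacle_ahead"))   # turn_left
print(ctrl.suggest("unknown"))          # None -> fall back to user remote control
```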
Abstract:
Robots have the capacity to perform a broad range of useful tasks, such as factory automation, cleaning, delivery, assistive care, environmental monitoring and entertainment. Enabling a robot to perform a new task in a new environment typically requires a large amount of new software to be written, often by a team of experts. It would be valuable if future technology could empower people, who may have limited or no understanding of software coding, to train robots to perform custom tasks. Some implementations of the present invention provide methods and systems that respond to users' corrective commands to generate and refine a policy for determining appropriate actions based on sensor-data input. Upon completion of learning, the system can derive control commands directly from the sensory data. Using the learned control policy, the robot can behave autonomously.
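As an illustration of refining a policy from corrective commands, the sketch below uses a simple least-mean-squares style update that nudges a linear policy toward the user's correction whenever one is issued. This is a generic stand-in technique under stated assumptions, not the patented training procedure; the features, learning rate, and target behavior are all hypothetical.

```python
import random

def refine_policy(weights, features, corrective_action, lr=0.1):
    """Move the policy's predicted action toward the user's corrective action."""
    predicted = sum(w * f for w, f in zip(weights, features))
    error = corrective_action - predicted
    return [w + lr * error * f for w, f in zip(weights, features)]

# Hypothetical: a steering command computed from two sensor features
# (e.g. lateral offset and heading error).
weights = [0.0, 0.0]
random.seed(0)
for _ in range(200):
    features = [random.uniform(-1, 1), random.uniform(-1, 1)]
    corrective = 0.8 * features[0] + 0.3 * features[1]   # what the user would command
    weights = refine_policy(weights, features, corrective)
print([round(w, 2) for w in weights])   # approaches [0.8, 0.3]
```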
Abstract:
A computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to detection of a hazard in the path of the vehicle.
Abstract:
A global navigation satellite system (GNSS) and gyroscope control system for vehicle steering control comprising a GNSS receiver and antennas at a fixed spacing to determine a vehicle position, velocity and at least one of a heading angle, a pitch angle and a roll angle based on carrier phase position differences. The system also includes a control system configured to receive the vehicle position, heading, and at least one of roll and pitch, and configured to generate a steering command to a vehicle steering system. A vehicle control method includes the steps of computing a position and a heading for the vehicle using GNSS positioning and a rate gyro for determining vehicle attitude, which is used for generating a steering command. Relative orientations and attitudes between tractors and implements can be determined using optical sensors and cameras. Laser detectors and rangefinders can also be used.
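To complement the method described above, a common way (not necessarily the patented one) to turn a GNSS/gyro position-and-heading estimate into a steering command is a proportional law on cross-track and heading error. The gains, limits, and sign conventions below are placeholders.

```python
import math

def steering_command(cross_track_m, heading_error_rad,
                     k_xt=0.5, k_hdg=1.2, max_steer_rad=0.6):
    """Proportional steering law on cross-track and heading error (illustrative gains)."""
    steer = k_xt * cross_track_m + k_hdg * heading_error_rad
    return max(-max_steer_rad, min(max_steer_rad, steer))  # clamp to steering limits

# Vehicle 0.4 m off the desired line with a 5-degree heading error
print(round(steering_command(0.4, math.radians(5.0)), 3))
```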
Abstract:
Methods and apparatus that provide a hardware abstraction layer (HAL) for a robot are disclosed. A HAL can reside as a software layer or as a firmware layer residing between robot control software and underlying robot hardware and/or an operating system for the hardware. The HAL provides a relatively uniform abstraction for aggregates of underlying hardware such that the underlying robotic hardware is transparent to perception and control software, i.e., robot control software. This advantageously permits robot control software to be written in a robot-independent manner. Developers of robot control software are then freed from tedious lower-level tasks. Portability is another advantage. For example, the HAL efficiently permits robot control software developed for one robot to be ported to another. In one example, the HAL permits the same navigation algorithm to be ported from a wheeled robot and used on a humanoid legged robot.
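The hardware-abstraction idea can be illustrated with a small interface that lets the same navigation routine drive either a wheeled or a legged base. This is a hedged sketch of the concept only; the class names and methods are assumptions, not the disclosed HAL API.

```python
from abc import ABC, abstractmethod

class LocomotionHAL(ABC):
    """Uniform interface the navigation software targets, hiding the concrete hardware."""

    @abstractmethod
    def move(self, forward_m_s, turn_rad_s): ...

class WheeledBase(LocomotionHAL):
    def move(self, forward_m_s, turn_rad_s):
        print(f"wheels: v={forward_m_s} m/s, w={turn_rad_s} rad/s")

class LeggedBase(LocomotionHAL):
    def move(self, forward_m_s, turn_rad_s):
        # Translate the same abstract command into gait parameters.
        print(f"gait: stride for {forward_m_s} m/s, body yaw {turn_rad_s} rad/s")

def navigate(hal: LocomotionHAL):
    """Robot-independent navigation code: it never sees the concrete hardware."""
    hal.move(0.5, 0.0)
    hal.move(0.2, 0.3)

navigate(WheeledBase())   # same algorithm...
navigate(LeggedBase())    # ...ported unchanged to a legged robot
```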