Abstract:
An autonomous travel vehicle includes a platform, a traveler, a storage, an arrival position predictor, a corrected speed calculator, and a reproduction travel command calculator. The traveler controls the platform to travel in accordance with a travel control command. The storage stores travel route data that associates subgoal points, arrival times, and traveling speeds with one another. In the reproduction travel mode, the arrival position predictor predicts a predicted arrival position, and the corrected speed calculator calculates a corrected traveling speed based on a predicted traveling distance and a required traveling distance. The reproduction travel command calculator calculates a reproduction travel control command, as the travel control command, based on the corrected traveling speed.
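One plausible reading of the corrected speed calculator is that it rescales the recorded traveling speed by the ratio of the required traveling distance to the predicted one, so the vehicle reaches each subgoal point at the stored arrival time. This is a hedged sketch; the function name and the proportional rule are assumptions, not taken from the abstract:

```python
def corrected_speed(recorded_speed, predicted_distance, required_distance):
    """Scale the recorded traveling speed so the vehicle covers the required
    distance, rather than the predicted one, over the same interval.
    Falls back to the recorded speed when no distance is predicted."""
    if predicted_distance <= 0:
        return recorded_speed
    return recorded_speed * (required_distance / predicted_distance)
```

For example, if the vehicle is predicted to travel 4 m but only 2 m remain to the subgoal point, a recorded speed of 2.0 m/s would be halved to 1.0 m/s.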
Abstract:
A mobile navigation system includes a directive beamforming antenna carried by the vehicle, emitting first and second sensing beams in first and second directions at first and second time points, respectively; an electromagnetic wave reflector installed in the target zone, receiving the first and second sensing beams, and transmitting first and second retro waves back; and a processor electrically coupled to the directive beamforming antenna, receiving the first and second retro waves, and determining a direction where the vehicle will be guided to move according to information of the first and second retro waves. A coverage area of the first sensing beam and a coverage area of the second sensing beam partially overlap with each other, and the direction where the vehicle will be guided to move lies between the first direction and the second direction.
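Because the two beam coverage areas overlap and the guided direction lies between the two beam directions, one simple way to realize the determination is a power-weighted interpolation of the two beam headings from the retro wave strengths. This sketch is an assumption about the processing; the abstract does not specify the weighting rule:

```python
def guided_direction(dir1_deg, dir2_deg, power1, power2):
    """Interpolate a heading between the two sensing-beam directions,
    weighting each direction by the received power of its retro wave.
    The result always lies between dir1_deg and dir2_deg."""
    total = power1 + power2
    if total == 0:
        # no retro waves received: fall back to the bisector
        return (dir1_deg + dir2_deg) / 2
    return (dir1_deg * power1 + dir2_deg * power2) / total
```

With equal retro wave powers the vehicle is steered along the bisector of the two beams; a stronger first retro wave pulls the heading toward the first direction.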
Abstract:
Robotic devices may be trained by a trainer guiding the robot along a target trajectory using physical contact with the robot. The robot may comprise an adaptive controller configured to generate control commands based on one or more of trainer input, sensory input, and a performance measure. The trainer may observe task execution by the robot. Responsive to observing a discrepancy between the target behavior and the actual behavior, the trainer may provide a teaching input via a haptic action. The robot may execute the action based on a combination of the internal control signal produced by a learning process of the robot and the training input. The robot may infer the teaching input based on a comparison of a predicted state and the actual state of the robot. The robot's learning process may be adjusted in accordance with the teaching input so as to reduce the discrepancy during a subsequent trial.
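The two key operations described above can be sketched as follows: inferring the haptic teaching input as the gap between the robot's predicted and actual state, and blending that input with the learning process's internal control signal. The additive combination and the gain parameter are assumptions for illustration only:

```python
def infer_teaching_input(predicted_state, actual_state):
    """Infer the trainer's haptic correction as the discrepancy between
    the state the robot predicted and the state actually observed."""
    return actual_state - predicted_state

def combine_commands(internal_cmd, teaching_input, teaching_gain=1.0):
    """Blend the learning process's internal control signal with the
    trainer's teaching input; teaching_gain weights the trainer's term."""
    return internal_cmd + teaching_gain * teaching_input
```

A learning rule would then adjust the controller so that, on a subsequent trial, the inferred teaching input (and hence the discrepancy) shrinks.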
Abstract:
Robotic devices may be operated by users remotely. A learning controller apparatus may detect remote transmissions comprising user control instructions. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The learning apparatus may monitor one or more wavelengths (e.g., infrared light, a radio channel) and detect transmissions from the user's remote control device to the robot during its operation by the user. The learning apparatus may be configured to develop associations between the detected user remote control instructions and actions of the robot for a given context. When a given sensory context occurs, the learning controller may automatically provide control instructions to the robot that may be associated with the given context. The provision of control instructions to the robot by the learning controller may obviate the need for user remote control of the robot, thereby enabling autonomous operation by the robot.
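At its simplest, the association step described above is a lookup table from observed sensory contexts to the control instructions the user issued in them. This is a minimal sketch under that assumption; the class and method names are illustrative, and a real learning apparatus would generalize across similar contexts rather than match them exactly:

```python
class LearningController:
    """Associate observed sensory contexts with detected remote-control
    instructions, then replay them without user involvement."""

    def __init__(self):
        self._associations = {}

    def observe(self, context, instruction):
        # record the instruction the user issued in this context
        self._associations[context] = instruction

    def act(self, context):
        # return the learned instruction for this context, or None
        # if no association has been developed yet
        return self._associations.get(context)
```

Once `act` returns instructions for the contexts the robot encounters, the user's remote control becomes unnecessary for those situations.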
Abstract:
An apparatus and methods for training and/or operating a robotic device to follow a trajectory. A robotic vehicle may utilize a camera and store, in an ordered buffer, the sequence of images of the visual scene observed while following a trajectory during training. Motor commands associated with a given image may be stored. During autonomous operation, an acquired image may be compared with one or more images from the training buffer in order to determine the most likely match. An evaluation may be performed in order to determine whether the image corresponds to a shifted (e.g., left/right) version of a stored image as previously observed. If the new image is shifted left, a right-turn command may be issued; if the new image is shifted right, a left-turn command may be issued.
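A toy version of the shift evaluation can be sketched on one-dimensional image rows: search over small horizontal shifts for the alignment that minimizes the pixel difference, then map the sign of the best shift to a steering command. The error metric (mean absolute difference) and the sign convention are assumptions for illustration:

```python
def best_shift(stored_row, new_row, max_shift=2):
    """Return the horizontal shift of new_row that best matches stored_row,
    scored by mean absolute difference over the overlapping pixels.
    Negative means the new image content moved left."""
    best, best_err = 0, float("inf")
    n = len(stored_row)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(stored_row[i], new_row[i + s])
                 for i in range(n) if 0 <= i + s < n]
        err = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = s, err
    return best

def steering_command(shift):
    # scene shifted left -> vehicle drifted right -> turn right to correct,
    # and vice versa (sign convention is an assumption)
    if shift < 0:
        return "turn_right"
    if shift > 0:
        return "turn_left"
    return "straight"
```

A real system would run this comparison against the best-matching buffered image (or a 2-D patch of it) rather than a single row.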
Abstract:
An automatic driving vehicle system includes: a surrounding information recognition unit that recognizes surrounding information of a vehicle; a vehicle state recognition unit that recognizes a vehicle state of the vehicle; a running plan generation unit that generates a running plan based on the surrounding information of the vehicle and that generates a control band of a target control value for the vehicle in the running plan, based on at least one of the vehicle state and the surrounding information; a first computation unit that computes a command control value such that the vehicle state becomes a target vehicle state corresponding to the target control value, based on the running plan, the vehicle state, and the control band; and an actuator that controls the running of the vehicle based on the command control value. The system may also include an actuator control unit.
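One natural interpretation of the control band is a pair of bounds within which the command control value must stay: the first computation unit drives the state toward the target but clamps the result to the band generated with the running plan. The additive feedback term and the clamping rule below are assumptions, not details from the abstract:

```python
def command_control_value(target, feedback_correction, band_lower, band_upper):
    """Compute a command control value as the target control value plus a
    feedback correction, clamped to the control band [band_lower, band_upper]
    generated by the running plan generation unit."""
    raw = target + feedback_correction
    return max(band_lower, min(band_upper, raw))
```

For instance, a target of 10.0 with a correction of +3.0 would be clipped to an upper band limit of 12.0, keeping the actuator within the planned envelope.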
Abstract:
A method and apparatus for ergonomically supporting a worker while performing a work operation includes an automatic guided vehicle capable of moving a seat carrying the worker between a first non-work position relative to a work piece and a second work position at which the worker performs the work operation. The automatic guided vehicle includes controls to enable the worker to ergonomically position the angle of a seatback relative to a seat bottom and the height of the seat relative to the automatic guided vehicle for ergonomic support of the worker while performing the work operation. Under program control, the automatic guided vehicle is movable from one work position to another work position and/or back to the non-work position.
Abstract:
Methods and apparatus that provide a hardware abstraction layer (HAL) for a robot are disclosed. A HAL can reside as a software layer or as a firmware layer between robot control software and the underlying robot hardware and/or an operating system for the hardware. The HAL provides a relatively uniform abstraction for aggregates of underlying hardware such that the underlying robotic hardware is transparent to perception and control software, i.e., robot control software. This advantageously permits robot control software to be written in a robot-independent manner. Developers of robot control software are then freed from tedious lower-level tasks. Portability is another advantage. For example, the HAL efficiently permits robot control software developed for one robot to be ported to another. In one example, the HAL permits the same navigation algorithm to be ported from a wheeled robot and used on a humanoid legged robot.
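The HAL idea can be illustrated with an abstract locomotion interface: navigation code calls a uniform `move()` method, while wheeled and legged implementations translate it to their own hardware. The interface and class names below are invented for illustration and are not from the patent:

```python
from abc import ABC, abstractmethod

class DriveHAL(ABC):
    """Hardware abstraction for locomotion: control software calls move(),
    never the underlying motor or joint drivers directly."""

    @abstractmethod
    def move(self, linear, angular):
        """Command a linear and angular velocity in robot-independent units."""

class WheeledDrive(DriveHAL):
    def move(self, linear, angular):
        # on real hardware this would be translated to wheel velocities
        return f"wheels: v={linear}, w={angular}"

class LeggedDrive(DriveHAL):
    def move(self, linear, angular):
        # on real hardware this would be translated to a gait pattern
        return f"gait: v={linear}, w={angular}"

def navigate(drive: DriveHAL):
    # robot-independent navigation code: works with any DriveHAL backend
    return drive.move(0.5, 0.1)
```

The same `navigate` routine runs unchanged on both drive types, which is exactly the portability claim: porting the algorithm from a wheeled robot to a humanoid legged one means swapping the HAL implementation, not the control software.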