Abstract:
Methods and systems for collecting camera calibration data using wearable devices are described. An augmented reality interface may be provided at a wearable device. Directions for a user to present a calibration target to a camera may be presented at the augmented reality interface. Calibration data collected by the camera viewing the calibration target may be received. Existing calibration data for the camera may be validated based at least in part on the collected calibration data.
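As a rough illustration of the validation step only (not the claimed method), existing intrinsics could be checked by measuring the reprojection error of the newly collected target observations. The sketch below assumes OpenCV-style calibration data (camera matrix and distortion coefficients) and a hypothetical pixel-error threshold.

    import numpy as np
    import cv2

    def validate_calibration(object_points, image_points, camera_matrix, dist_coeffs, max_error_px=1.0):
        """Check stored intrinsics against newly collected target observations.

        object_points: list of (N, 3) float32 arrays of 3D target corner positions.
        image_points:  list of (N, 2) float32 arrays of detected 2D corners.
        max_error_px is a hypothetical acceptance threshold.
        """
        errors = []
        for obj, img in zip(object_points, image_points):
            # Estimate the target pose for this view using the stored intrinsics.
            ok, rvec, tvec = cv2.solvePnP(obj, img, camera_matrix, dist_coeffs)
            if not ok:
                continue
            projected, _ = cv2.projectPoints(obj, rvec, tvec, camera_matrix, dist_coeffs)
            errors.append(np.linalg.norm(projected.reshape(-1, 2) - img, axis=1).mean())
        mean_error = float(np.mean(errors)) if errors else float("inf")
        # Existing calibration is treated as valid if reprojection error stays small.
        return mean_error <= max_error_px, mean_error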
Abstract:
Systems and methods for determining strategy modes for autonomous vehicles are described. An autonomous vehicle may detect aspects of other vehicles and aspects of the environment using one or more sensors. The autonomous vehicle may then determine strategy modes of the other vehicles, and select a strategy mode for its own operation based on the determined strategy modes and an operational goal for the autonomous vehicle. The strategy modes may include an uncoupled strategy mode, a permissive strategy mode, an assistive strategy mode, and a preventative strategy mode. The autonomous vehicle may further determine elements in the environment and topological constraints associated with the environment, and select the strategy mode for its own operation based thereon.
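A minimal sketch of such a selection rule is shown below, using the four named strategy modes. The decision logic, goal names, and constraint fields are hypothetical placeholders for whatever policy an implementation would actually use.

    from enum import Enum, auto

    class StrategyMode(Enum):
        UNCOUPLED = auto()
        PERMISSIVE = auto()
        ASSISTIVE = auto()
        PREVENTATIVE = auto()

    def select_strategy_mode(other_vehicle_modes, operational_goal, topological_constraints):
        """Illustrative selection rule only.

        other_vehicle_modes: StrategyMode values observed for nearby vehicles.
        operational_goal: e.g. "minimize_travel_time" or "maximize_safety_margin" (hypothetical).
        topological_constraints: e.g. {"merge_ahead": True, "lanes": 2} (hypothetical).
        """
        # Tight topology (e.g. an upcoming merge) biases toward cooperative modes.
        if topological_constraints.get("merge_ahead"):
            if StrategyMode.PREVENTATIVE in other_vehicle_modes:
                return StrategyMode.ASSISTIVE
            return StrategyMode.PERMISSIVE
        # With no nearby interaction, operate independently of other vehicles.
        if not other_vehicle_modes:
            return StrategyMode.UNCOUPLED
        # Otherwise fall back on the operational goal.
        if operational_goal == "maximize_safety_margin":
            return StrategyMode.PERMISSIVE
        return StrategyMode.PREVENTATIVE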
Abstract:
Systems and methods for operating autonomous vehicles are described. An autonomous vehicle may detect strategy modes and/or actions of other vehicles in a local environment. The autonomous vehicle may then select a strategy mode for its operations based on the detected strategy modes and/or actions of other vehicles, and based on an operational goal for the autonomous vehicle. The strategy modes may include an uncoupled strategy mode, a permissive strategy mode, an assistive strategy mode, and a preventative strategy mode. The autonomous vehicle may further select an action for its operations based on the selected strategy mode.
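The final step, choosing an action from the selected strategy mode, might look like the sketch below, which reuses the StrategyMode enum from the previous sketch. The action names and the mapping itself are hypothetical, illustrative stand-ins rather than the disclosed behavior.

    def select_action(own_mode, other_vehicle_action):
        """Map a selected strategy mode to a candidate action (illustrative mapping only).

        own_mode: a StrategyMode value (enum defined in the earlier sketch).
        other_vehicle_action: e.g. "merge" or "maintain_speed" (hypothetical labels).
        """
        if own_mode is StrategyMode.UNCOUPLED:
            return "maintain_current_plan"
        if own_mode is StrategyMode.PERMISSIVE:
            # Yield so the other vehicle can complete its detected action.
            return "yield" if other_vehicle_action == "merge" else "maintain_speed"
        if own_mode is StrategyMode.ASSISTIVE:
            # Actively open a gap for the other vehicle.
            return "slow_and_open_gap"
        # Preventative: act so a conflicting action by the other vehicle cannot complete.
        return "close_gap"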
Abstract:
Disclosed are various embodiments for obtaining environmental data and operational data from autonomous vehicles on a roadway. A roadway management system can update a vehicle state of the autonomous vehicles using data obtained from nearby vehicles or other vehicles on the roadway. The roadway management system can also generate updates to the vehicle state based upon data obtained from sources external to the autonomous vehicles.
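A minimal sketch of the state-update step is shown below, folding reports from nearby vehicles and external sources into one vehicle state. The state fields and report keys are hypothetical and stand in for whatever schema an actual roadway management system would use.

    from dataclasses import dataclass, field

    @dataclass
    class VehicleState:
        vehicle_id: str
        position: tuple            # (latitude, longitude); hypothetical representation
        speed_mps: float
        hazards: set = field(default_factory=set)

    def update_vehicle_state(state, nearby_reports, external_reports):
        """Fold observations from nearby vehicles and external sources into one state.

        Reports are dicts such as {"vehicle_id": "...", "hazard": "debris",
        "speed_mps": 12.0}; the field names are hypothetical.
        """
        for report in list(nearby_reports) + list(external_reports):
            if report.get("vehicle_id") == state.vehicle_id and "speed_mps" in report:
                # Prefer an externally observed speed when one is reported.
                state.speed_mps = report["speed_mps"]
            if "hazard" in report:
                state.hazards.add(report["hazard"])
        return state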
Abstract:
Visual task feedback for workstations in a materials handling facility may be implemented. Image data of a workstation surface may be obtained from image sensors. The image data may be evaluated with regard to the performance of an item-handling task at the workstation. The evaluation of the image data may identify items located on the workstation surface, determine a current state of the item-handling task, or recognize an agent gesture at the workstation. Based, at least in part, on the evaluation, one or more visual task cues may be selected to project onto the workstation surface. The projection of the selected visual task cues onto the workstation surface may then be directed.
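The cue-selection step might resemble the sketch below, which maps identified items, a task state, and an optional gesture to projector cues. All item fields, task-state labels, and cue names are hypothetical illustrations.

    def select_visual_cues(identified_items, task_state, gesture=None):
        """Choose visual cues to project onto the workstation surface (illustrative only).

        identified_items: list of dicts like {"sku": "A123", "expected": True}.
        task_state: e.g. "awaiting_item", "item_misplaced", "complete" (hypothetical labels).
        Returns a list of (cue_type, target) tuples for a projector controller.
        """
        cues = []
        for item in identified_items:
            # Highlight unexpected items so the agent can remove them.
            if not item.get("expected", True):
                cues.append(("highlight_red", item["sku"]))
        if task_state == "awaiting_item":
            cues.append(("outline_target_region", "placement_area"))
        if gesture == "task_complete":
            cues.append(("confirmation_checkmark", "center"))
        return cues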
Abstract:
An inventory system has inventory pods that are freely and independently moved about a facility and include inventory holders having dynamically reconfigurable storage bins. For various operating scenarios, the components of the inventory system are directed to dynamically reconfigure the storage bins, thereby maintaining efficient product density amongst the inventory holders and permitting the use of automated equipment to manipulate inventory items stored in the inventory pods. One or more inventory pods may be used with lifting modules and picking modules to dynamically reconfigure the storage bins as inventory items are added to and removed from the inventory holders.
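As a toy illustration of dynamically reconfigurable bins (not the disclosed mechanism), a holder could be modeled as a set of bins over contiguous slots that are split to raise product density for small items or merged to accommodate larger ones.

    class InventoryHolder:
        """Toy model of a holder whose bins can be merged or split (illustrative only)."""

        def __init__(self, total_slots):
            # Each bin is described by the contiguous slots it occupies.
            self.bins = {"bin_0": list(range(total_slots))}

        def split_bin(self, bin_id, at_slot):
            """Split one bin into two to raise product density for small items."""
            slots = self.bins.pop(bin_id)
            self.bins[bin_id + "_a"] = [s for s in slots if s < at_slot]
            self.bins[bin_id + "_b"] = [s for s in slots if s >= at_slot]

        def merge_bins(self, bin_a, bin_b):
            """Merge two bins to accommodate a larger inventory item."""
            self.bins[bin_a] = sorted(self.bins.pop(bin_a) + self.bins.pop(bin_b))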
Abstract:
This disclosure describes a device and system for verifying the content of items in a bin of an inventory holder within a materials handling facility. In some implementations, a bin content verification apparatus may be positioned within the materials handling facility and configured to capture images of inventory holders that include bins as the inventory holders are moved past the apparatus by mobile drive units. The images may be processed to determine whether the content included in the bins has changed since the last time images of the bins were captured. A determination may also be made as to whether a change to the bin content was expected and, if so, if the determined change corresponds with the expected change.
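The comparison and reconciliation step might look like the sketch below. A mean pixel difference is used only as a stand-in for the unspecified image processing, and the threshold and return labels are hypothetical.

    import numpy as np

    def verify_bin_content(prev_image, curr_image, change_expected, diff_threshold=0.05):
        """Compare bin images and reconcile with the expected-change flag (illustrative only).

        prev_image / curr_image: grayscale arrays of identical shape; the mean
        absolute pixel difference stands in for the real image processing.
        diff_threshold is a hypothetical sensitivity setting.
        """
        diff = np.mean(np.abs(curr_image.astype(float) - prev_image.astype(float))) / 255.0
        change_detected = diff > diff_threshold
        if change_detected == change_expected:
            return "consistent"              # observed content matches expectations
        if change_detected and not change_expected:
            return "unexpected_change"       # flag the bin for review
        return "expected_change_missing"     # an anticipated change did not appear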
Abstract:
Systems and methods for coordinating operations of a plurality of autonomous vehicles are described. An autonomous vehicle management system may receive information from the plurality of autonomous vehicles related to strategy modes, actions, and their environments. The strategy modes may include an uncoupled strategy mode, a permissive strategy mode, an assistive strategy mode, and a preventative strategy mode. The autonomous vehicle management system may then process the received information based on an operational goal for the system as a whole. The autonomous vehicle management system may instruct modifications to the operations of one or more of the plurality of autonomous vehicles to achieve that goal.
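A fleet-level coordination step might resemble the sketch below, which reuses the StrategyMode enum from the earlier sketch. The report fields, the goal name, and the reassignment heuristic are hypothetical illustrations of processing toward a system-wide goal.

    def coordinate_fleet(vehicle_reports, system_goal):
        """Issue modification instructions to individual vehicles (illustrative only).

        vehicle_reports: dict of vehicle_id -> {"mode": StrategyMode, "delay_s": float};
        system_goal: e.g. "minimize_total_delay". Field names are hypothetical.
        StrategyMode is the enum defined in the earlier sketch.
        """
        instructions = {}
        if system_goal == "minimize_total_delay":
            # Ask the most delayed vehicle to operate assertively and the rest to assist.
            worst = sorted(vehicle_reports, key=lambda v: vehicle_reports[v]["delay_s"], reverse=True)
            for rank, vehicle_id in enumerate(worst):
                target = StrategyMode.PREVENTATIVE if rank == 0 else StrategyMode.ASSISTIVE
                if vehicle_reports[vehicle_id]["mode"] is not target:
                    instructions[vehicle_id] = {"set_mode": target}
        return instructions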