Abstract:
A propeller provided on an aerial vehicle may include a digital camera or other imaging device embedded into a surface of one of the blades of the propeller. The digital camera may capture images while the propeller is rotating at an operational speed. Images captured by the digital camera may be processed to recognize one or more objects therein, and to determine ranges to such objects by stereo triangulation techniques. Using such ranges, a depth map or other model of the surface features in an environment in which the aerial vehicle is operating may be defined and stored or used for any purpose. A propeller may include digital cameras or other imaging devices embedded into two or more blades, and may also use such images to determine ranges to objects by stereo triangulation techniques.
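The range determination by stereo triangulation described above can be illustrated with a minimal sketch using the classic pinhole-stereo relation Z = f·B/d. The function name and the focal length, baseline, and disparity values below are illustrative assumptions, not details taken from the disclosure.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate the range to an object from the pixel disparity between
    two images of the same object captured at two different positions.

    focal_length_px: camera focal length expressed in pixels
    baseline_m:      distance between the two capture positions, in meters
    disparity_px:    horizontal shift of the object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    # Classic pinhole stereo relation: Z = f * B / d
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 0.3 m baseline, 10 px disparity -> ~21 m range
range_m = depth_from_disparity(700.0, 0.3, 10.0)
```

Ranges computed this way for many recognized objects could then be assembled into the depth map the abstract describes.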
Abstract:
This disclosure is directed to monitoring exposure levels for individual cameras of a stereo camera system to reduce unwanted lens flaring, which may disrupt stereo vision capabilities of an unmanned aerial vehicle (UAV). The UAV may identify lens flaring by monitoring histogram data received from the cameras and then mitigate the lens flaring by performing aerial maneuvers with respect to the light source and/or moving camera componentry, such as by deploying a lens hood. The UAV may also identify lens flaring patterns that appear on a camera's sensor and determine a location of a peripheral light source based on these patterns. The UAV may then deploy a visor between the location of the peripheral light source and the cameras.
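The histogram-monitoring idea can be sketched as a simple saturation check, assuming an 8-bit luminance histogram of 256 bins; the saturation band and the fraction threshold are illustrative assumptions, not values from the disclosure.

```python
def flare_suspected(histogram, saturated_fraction_threshold=0.05):
    """Flag possible lens flaring when an unusually large fraction of
    pixels falls into the brightest histogram bins.

    histogram: list of 256 pixel counts for an 8-bit luminance image
    """
    total = sum(histogram)
    if total == 0:
        return False
    # Count pixels in the near-saturated tail (luminance values 250-255)
    saturated = sum(histogram[250:])
    return saturated / total > saturated_fraction_threshold

# A frame with 10% fully saturated pixels would trip the check
hist = [0] * 256
hist[128] = 900   # mid-tone pixels
hist[255] = 100   # saturated pixels
```

A positive result from such a check could then trigger the mitigation steps the abstract describes, such as an aerial maneuver or deployment of a lens hood.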
Abstract:
Methods and systems for collecting camera calibration data using wearable devices are described. An augmented reality interface may be provided at a wearable device. Directions for a user to present a calibration target to a camera may be presented at the augmented reality interface. Calibration data collected by the camera viewing the calibration target may be received. Existing calibration data for the camera may be validated based at least in part on the collected calibration data.
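One way to validate existing calibration data, as the abstract suggests, is to project the known 3D points of the presented calibration target through the stored camera model and measure the pixel error against the detections collected by the camera. The sketch below assumes a simple pinhole model with stored intrinsics fx, fy, cx, cy; the function name and parameters are illustrative, not from the disclosure.

```python
def mean_reprojection_error(points_3d, observed_px, fx, fy, cx, cy):
    """Project known 3D calibration-target points through a pinhole
    camera model and return the mean pixel distance to the observed
    detections; a large error suggests the stored calibration is stale.

    points_3d:   list of (X, Y, Z) target points in the camera frame
    observed_px: list of (u, v) detected pixel coordinates, index-aligned
    """
    total = 0.0
    for (X, Y, Z), (u_obs, v_obs) in zip(points_3d, observed_px):
        # Pinhole projection with stored intrinsics
        u = fx * X / Z + cx
        v = fy * Y / Z + cy
        total += ((u - u_obs) ** 2 + (v - v_obs) ** 2) ** 0.5
    return total / len(points_3d)
```

If the mean error exceeds some tolerance, the system could prompt the user, via the augmented reality interface, to present the target again and recalibrate.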
Abstract:
Images captured by a camera system can be processed to detect precipitation in one or more of the images, and to generate a reconstructed image(s) without the precipitation, or with a reduced amount of the precipitation. Detection of precipitation can be based on a difference between a first feature in a first image and a second feature in a second image that corresponds to the first feature, where the first and second images were captured by different cameras at different times. A determination as to whether precipitation is present in the first image and the second image can be based at least in part on a disparity between the first feature and the second feature.
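The disparity-based determination can be sketched as follows: precipitation close to the cameras tends to produce a much larger apparent shift between corresponding features than distant scene content does. The feature pairing, the distance metric, and the threshold below are illustrative assumptions; the disclosure does not specify them.

```python
def precipitation_candidates(features_a, features_b, disparity_threshold_px=40.0):
    """Pair corresponding features from two images (captured by different
    cameras at different times) and flag pairs whose disparity is large
    enough to suggest precipitation close to the cameras.

    features_a, features_b: lists of (x, y) pixel coordinates,
                            index-aligned so features_a[i] corresponds
                            to features_b[i]
    """
    flagged = []
    for (xa, ya), (xb, yb) in zip(features_a, features_b):
        disparity = ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
        if disparity > disparity_threshold_px:
            flagged.append(((xa, ya), (xb, yb), disparity))
    return flagged
```

Regions flagged this way could then be excluded or inpainted when generating the reconstructed image with reduced precipitation.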
Abstract:
Techniques for verifying a flight path or landing zone may be provided. For example, during delivery, an unmanned aerial vehicle (UAV) may capture one or more images of a plurality of delivery locations within an area. A computer system may generate one or more image templates or filters using the one or more images and subsequently use the image filters to verify a flight path or landing zone for a delivery by the UAV during flight.
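Verification against a stored image template can be sketched as a simple sliding-window template match; the sum-of-absolute-differences score below is one common matching measure, chosen here as an illustrative assumption rather than the method specified by the disclosure.

```python
def best_match_score(image, template):
    """Slide a template over a grayscale image (both given as lists of
    lists of pixel values) and return the minimum sum-of-absolute-
    differences score; a low score means the stored template closely
    matches some region of the current view."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best = float("inf")
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            # Score the template at offset (x, y)
            sad = sum(
                abs(image[y + j][x + i] - template[j][i])
                for j in range(th) for i in range(tw)
            )
            best = min(best, sad)
    return best
```

A score below some tolerance could confirm that the UAV's current view matches the stored template for the expected flight path or landing zone.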
Abstract:
An unmanned aerial vehicle (UAV) may provide an approach notification to enable people to understand and interpret actions by the UAV, such as an intention to land or deposit a package at a particular location. The UAV may communicate a specific intention of the UAV and/or communicate a request to a person. The UAV may monitor the person or data signals for a response from the person, such as movement of the person that indicates a response. The UAV may be equipped with hardware and/or software configured to provide notifications and/or exchange information with a person at or near a destination. The UAV may include lights, a speaker, and possibly a projector to enable the UAV to project information and/or text on a surface. The UAV may control a moveable mechanism to “point” toward the person, at an object, or in another direction.
Abstract:
This disclosure describes an aerial vehicle that includes a light alteration assembly that may be used to alter light entering a lens of a camera of the aerial vehicle. The light alteration assembly may include an adjustable visor and/or filters that may be selectively positioned over the lens of the camera. By altering light entering the lens of a camera of the aerial vehicle, the camera is able to obtain higher quality images of the area surrounding the aerial vehicle. The higher quality images may then be processed to accurately detect objects within a vicinity of the aerial vehicle.
Abstract:
An inventory system has inventory pods that are freely and independently moved about a facility and include inventory holders having dynamically reconfigurable storage bins. For various operating scenarios, the components of the inventory system are directed to dynamically reconfigure the storage bins, thereby maintaining efficient product density amongst the inventory holders and permitting the use of automated equipment to manipulate inventory items stored in the inventory pods. One or more inventory pods may be used with lifting modules and picking modules to dynamically reconfigure the storage bins as inventory items are added to and removed from the inventory holders.
Abstract:
This disclosure describes a device and system for verifying the content of items in a bin of an inventory holder within a materials handling facility. In some implementations, a bin content verification apparatus may be positioned within the materials handling facility and configured to capture images of inventory holders that include bins as the inventory holders are moved past the apparatus by mobile drive units. The images may be processed to determine whether the content included in the bins has changed since the last time images of the bins were captured. A determination may also be made as to whether a change to the bin content was expected and, if so, whether the determined change corresponds with the expected change.
Abstract:
This disclosure describes a device and system for verifying the content of items in a bin within a materials handling facility. In some implementations, a bin content verification apparatus may pass by one or more bins and capture images of those bins. The images may be processed to determine whether the content included in the bins has changed since the last time images of the bins were captured. A determination may also be made as to whether a change to the bin content was expected and, if so, whether the determined change corresponds with the expected change.
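The change-detection step described in the two abstracts above can be sketched as a per-pixel comparison between the current capture of a bin and the prior capture; the per-pixel delta and the changed-fraction threshold below are illustrative assumptions, not values from either disclosure.

```python
def bin_content_changed(prev_pixels, curr_pixels,
                        changed_fraction_threshold=0.02, pixel_delta=25):
    """Compare two grayscale captures of the same bin (flat pixel lists
    of equal length) and report whether enough pixels changed to suggest
    the bin's content differs from the prior capture."""
    if len(prev_pixels) != len(curr_pixels):
        raise ValueError("captures must have the same number of pixels")
    # Count pixels whose value moved by more than the per-pixel delta
    changed = sum(
        1 for p, c in zip(prev_pixels, curr_pixels) if abs(p - c) > pixel_delta
    )
    return changed / len(prev_pixels) > changed_fraction_threshold
```

A detected change could then be compared against the expected change (for example, an item pick or stow recorded for that bin) to decide whether the bin content corresponds with expectations.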