Abstract:
Embodiments are described for an unmanned aerial vehicle (UAV) configured for autonomous flight using visual navigation. The UAV includes a perimeter structure surrounding one or more powered rotors, the perimeter structure including image capture devices arranged to provide an unobstructed view around the UAV.
Abstract:
In some examples, one or more processors of an aerial vehicle access a scan plan including a sequence of poses for the aerial vehicle to assume to capture, using one or more image sensors of the aerial vehicle, images of a scan target. A next pose of the scan plan is checked for obstructions, and based at least on detection of an obstruction, the one or more processors determine whether a backup pose is available for capturing an image of a targeted point of the scan target orthogonally along a normal of the targeted point. Responsive to determining that no backup pose is available for capturing an image of the targeted point orthogonally along the normal of the targeted point, image capture of the targeted point is performed at an oblique angle to the normal of the targeted point.
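The fallback logic in this abstract can be sketched as a small selection routine. This is an illustrative sketch only; the function and parameter names (`select_capture_pose`, `is_obstructed`, `oblique_pose`) are hypothetical and do not come from the source.

```python
def select_capture_pose(next_pose, backup_poses, is_obstructed, oblique_pose):
    """Prefer the planned pose (orthogonal to the target surface along its
    normal); fall back to an orthogonal backup pose if the planned pose is
    obstructed; otherwise accept an oblique capture of the targeted point."""
    if not is_obstructed(next_pose):
        return next_pose, "orthogonal"
    for backup in backup_poses:  # candidate poses still along the surface normal
        if not is_obstructed(backup):
            return backup, "orthogonal"
    # No unobstructed pose along the normal is available: capture obliquely.
    return oblique_pose, "oblique"
```

For example, if the planned pose is blocked but a backup pose is clear, the routine returns the backup pose and still captures orthogonally; only when every pose along the normal is obstructed does it fall back to the oblique capture.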
Abstract:
Techniques are described for controlling an autonomous vehicle such as an unmanned aerial vehicle (UAV) using objective-based inputs. In an embodiment, the underlying functionality of an autonomous navigation system is exposed via an application programming interface (API) allowing the UAV to be controlled through specifying a behavioral objective, for example, using a call to the API to set parameters for the behavioral objective. The autonomous navigation system can then incorporate perception inputs such as sensor data from sensors mounted to the UAV and the set parameters using a multi-objective motion planning process to generate a proposed trajectory that most closely satisfies the behavioral objective in view of certain constraints. In some embodiments, developers can utilize the API to build customized applications for the UAV. Such applications, also referred to as “skills,” can be developed, shared, and executed to control behavior of an autonomous UAV and aid in overall system improvement.
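The multi-objective motion planning described above can be illustrated as a weighted scoring of candidate trajectories against behavioral objectives. This is a minimal sketch under that assumption; the function name and the representation of trajectories and objectives are hypothetical, not taken from the described API.

```python
def best_trajectory(candidates, objectives):
    """Score each candidate trajectory against a set of weighted objective
    cost functions and return the candidate that most closely satisfies
    them, i.e., the one with the lowest combined cost."""
    def total_cost(traj):
        # Each objective contributes (weight * cost) to the combined score.
        return sum(weight * cost_fn(traj) for cost_fn, weight in objectives)
    return min(candidates, key=total_cost)
```

In a real system the candidates would be generated subject to perception-derived constraints (e.g., obstacle clearance), and the objective weights would come from the parameters set through the API call.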
Abstract:
In some examples, an unmanned aerial vehicle (UAV) may identify a scan target. The UAV may navigate to two or more positions in relation to the scan target. The UAV may capture, using one or more image sensors of the UAV, two or more images of the scan target from different respective positions in relation to the scan target. For instance, the two or more positions may be selected by controlling a spacing between them to enable determination of parallax disparity between a first image captured at a first position and a second image captured at a second position of the two or more positions. The UAV may determine a three-dimensional model corresponding to the scan target based in part on the determined parallax disparity between the two or more images, including the first image and the second image.
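The role of the controlled spacing can be seen in the standard stereo relation, where depth is recovered from parallax disparity given the baseline between the two capture positions. The sketch below shows only that textbook relation (depth = f · B / d), not any implementation from the source.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard stereo relation: depth = focal_length * baseline / disparity.
    Controlling the spacing (baseline) between the two capture positions
    keeps the disparity large enough to measure, so depth is recoverable."""
    if disparity_px <= 0:
        raise ValueError("point must exhibit positive parallax disparity")
    return focal_px * baseline_m / disparity_px
```

For example, with a focal length of 1000 pixels, a 0.5 m baseline, and a measured disparity of 25 pixels, the point lies 20 m away; repeating this per matched point yields the depth data behind a three-dimensional model.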
Abstract:
Methods and systems are described for new paradigms for user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant or FDA) using a portable multifunction device (PMD) such as a smartphone. In some embodiments, a magic wand user interaction paradigm is described for intuitive control of an FDA using a PMD. In other embodiments, methods for scripting a shot are described.
Abstract:
Methods and systems are disclosed for an unmanned aerial vehicle (UAV) configured to autonomously navigate a physical environment while capturing images of the physical environment. In some embodiments, the motion of the UAV and a subject in the physical environment may be estimated based in part on images of the physical environment captured by the UAV. In response to estimating the motions, image capture by the UAV may be dynamically adjusted to satisfy a specified criterion related to a quality of the image capture.
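One way the described dynamic adjustment could work is to bound motion blur: given the estimated relative motion between the UAV and the subject, shorten the exposure so the subject moves no more than a blur budget during the capture. This is a hypothetical sketch of that idea; the function and its parameters are illustrative, not from the source.

```python
def max_exposure_for_blur(relative_speed_px_s, max_blur_px, current_exposure_s):
    """Cap exposure time so estimated subject motion during the exposure
    stays under a blur budget (a quality criterion, in pixels)."""
    if relative_speed_px_s <= 0:
        # No estimated relative motion: the current exposure already satisfies
        # the blur criterion.
        return current_exposure_s
    limit = max_blur_px / relative_speed_px_s
    return min(current_exposure_s, limit)
```

For instance, at an estimated relative motion of 200 px/s and a 2 px blur budget, an exposure of 1/60 s would be shortened to 10 ms.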
Abstract:
Methods and systems are described for new paradigms for user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant or FDA) using a portable multifunction device (PMD) such as a smartphone. In some embodiments, a user may control image capture from an FDA by adjusting the position and orientation of a PMD. In other embodiments, a user may input a touch gesture via a touch display of a PMD that corresponds to a flight path to be autonomously flown by the FDA.
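One simple way a touch gesture could correspond to a flight path is to project the gesture's screen points onto a horizontal plane at a fixed altitude, scaled to meters. The sketch below is purely illustrative; the mapping, names, and fixed-altitude assumption are not drawn from the source.

```python
def gesture_to_waypoints(screen_points, origin, scale_m_per_px, altitude_m):
    """Map a touch gesture (a sequence of (x, y) screen pixels) to world
    waypoints on a horizontal plane at a fixed altitude, relative to an
    origin position, with a fixed meters-per-pixel scale."""
    ox, oy = origin
    return [(ox + x * scale_m_per_px, oy + y * scale_m_per_px, altitude_m)
            for x, y in screen_points]
```

A real system would also need to account for camera perspective, map orientation, and obstacle avoidance along the resulting path; this sketch shows only the basic screen-to-world correspondence.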