Abstract:
A method for presenting operational information of a mobile platform includes collecting diagnostic data and travel route data associated with an operation of the mobile platform, and integrating the travel route data with the diagnostic data for presentation. The diagnostic data includes at least one of platform diagnostic data or remote control data. The remote control data is generated by a remote controller associated with the mobile platform or generated by a computer device.
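A minimal sketch of how such an integration could look in practice, assuming the travel route and diagnostic data are timestamped records and pairing them by nearest timestamp; the field names and the pairing rule are illustrative assumptions, not the disclosed implementation:

```python
# Minimal sketch (illustrative only): merge travel route samples with diagnostic
# records by timestamp so both can be presented together on one timeline/map.
from bisect import bisect_left

def integrate(route_samples, diagnostic_records):
    """route_samples: list of (timestamp, lat, lon); diagnostic_records: list of (timestamp, message)."""
    times = [t for t, _, _ in route_samples]
    merged = []
    for t, message in diagnostic_records:
        i = min(bisect_left(times, t), len(route_samples) - 1)
        _, lat, lon = route_samples[i]          # nearest-following route point (assumed pairing rule)
        merged.append({"time": t, "lat": lat, "lon": lon, "diagnostic": message})
    return merged

if __name__ == "__main__":
    route = [(0.0, 22.54, 113.95), (1.0, 22.55, 113.96), (2.0, 22.56, 113.97)]
    diags = [(0.4, "low battery warning"), (1.7, "remote control: throttle 60%")]
    for row in integrate(route, diags):
        print(row)
```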
Abstract:
A propulsion system comprises a propeller and a driving device coupled with the propeller. The propeller comprises a hub having a receiving hole, a plurality of blades connected to the hub, and a connecting surface. The driving device comprises a body portion, an elastic abutting member disposed on the body portion and configured to abut against the connecting surface, and a lock portion disposed on the body portion and including a first snap-fitting portion. A second snap-fitting portion is arranged on an inner wall of the receiving hole and is configured to snap with the first snap-fitting portion.
Abstract:
A system and method can support photography. A controller can configure a carrier to move an imaging device along a moving path. Furthermore, the controller can apply a time-dependent configuration, defined by one or more time-dependent parameters, to the imaging device, and use the imaging device to capture a set of image frames along the moving path based on the one or more time-dependent parameters.
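A minimal sketch of the control flow, under the assumption that the time-dependent parameter is shutter time and that the carrier and camera expose simple move/configure/capture calls; the Carrier and Camera classes here are stand-in stubs, not a real device API:

```python
# Minimal sketch (illustrative only): a controller steps an imaging device along a
# moving path via a carrier and applies a time-dependent parameter before each capture.
class Carrier:
    def move_to(self, position):
        print(f"carrier -> {position}")

class Camera:
    def set_shutter(self, seconds):
        self.shutter = seconds
    def capture(self):
        return {"shutter": self.shutter}

def capture_along_path(carrier, camera, path, shutter_of_t, frame_interval=0.5):
    frames = []
    for i, position in enumerate(path):
        t = i * frame_interval                   # time at which this frame is taken
        carrier.move_to(position)                # step the imaging device along the path
        camera.set_shutter(shutter_of_t(t))      # time-dependent configuration
        frames.append(camera.capture())
    return frames

if __name__ == "__main__":
    path = [(0, 0), (1, 0), (2, 0)]
    frames = capture_along_path(Carrier(), Camera(), path,
                                shutter_of_t=lambda t: 1 / 100 + 0.002 * t)
    print(frames)
```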
Abstract:
The present disclosure provides a multi-rotor aerial vehicle. The multi-rotor aerial vehicle includes a body, the body including a first side and a second side opposite to each other; a first rotor connected to the first side of the body; and a second rotor connected to the second side of the body, a torque coefficient of the second rotor being different from a torque coefficient of the first rotor. When the multi-rotor aerial vehicle flies in a direction from the second side toward the first side or from the first side toward the second side, the first rotor rotates at a first rotational speed, the second rotor rotates at a second rotational speed, and an absolute value of a difference between the first rotational speed and the second rotational speed is less than a predetermined value.
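A small numeric sketch of the stated flight condition, checking that the two rotors' speeds stay within the predetermined bound even though their torque coefficients differ; the speed and bound values are illustrative assumptions, not values from the disclosure:

```python
# Minimal sketch (illustrative only): verify the claimed condition that, when flying
# along the first-side/second-side axis, |first speed - second speed| < predetermined value.
def speeds_within_bound(speed_1_rpm, speed_2_rpm, predetermined_rpm):
    return abs(speed_1_rpm - speed_2_rpm) < predetermined_rpm

if __name__ == "__main__":
    first_rotor_rpm = 7800.0    # first rotor, torque coefficient k1 (k1 != k2, per the abstract)
    second_rotor_rpm = 7750.0   # second rotor, torque coefficient k2
    print(speeds_within_bound(first_rotor_rpm, second_rotor_rpm, predetermined_rpm=100.0))
```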
Abstract:
An image processing method includes, when an edit triggering event for a target image is detected, acquiring description information associated with the target image. The description information includes interference information regarding interference that affected image quality during a shooting process of the target image. The description information includes at least one of motion data of a carrying member that carries a shooting module configured to acquire the target image or motion data of a moving object on which the carrying member is mounted. The method further includes editing image clips in the target image that are associated with respective interference information of the description information to obtain a processed target image.
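A minimal sketch under stated assumptions: the description information is modeled as time intervals during which the carrying member or the moving object experienced disturbing motion, and the "editing" is modeled as dropping the frames inside those intervals; both the data model and the drop-based edit are illustrative, not the disclosed editing method:

```python
# Minimal sketch (illustrative only): remove image clips whose timestamps fall inside
# the interference intervals recorded in the description information.
def edit_target_image(frames, interference_intervals):
    """frames: list of (timestamp, frame_data); interference_intervals: list of (start, end)."""
    def disturbed(t):
        return any(start <= t <= end for start, end in interference_intervals)
    return [(t, data) for t, data in frames if not disturbed(t)]

if __name__ == "__main__":
    frames = [(0.0, "f0"), (0.5, "f1"), (1.0, "f2"), (1.5, "f3")]
    shaky = [(0.4, 0.6)]                       # e.g. carrier motion exceeded a threshold
    print(edit_target_image(frames, shaky))    # keeps f0, f2, f3
```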
Abstract:
A method and a device for setting a flight route are provided. The method comprises acquiring route data of an aerial vehicle, determining waypoint coordinates in the route data, configuring a route display interface according to maximum distances between the determined waypoint coordinates, displaying a route of the aerial vehicle in the configured route display interface according to waypoint coordinates in the route data, and resetting the route displayed in the route display interface according to edit information corresponding to a received edit operation to obtain updated route data of the aerial vehicle.
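A minimal sketch of the display configuration step, assuming planar waypoint coordinates and a pixel viewport: the interface is scaled to the maximum distances spanned by the waypoints, and the waypoints are then projected into it for display. The pixel dimensions, margin, and scaling rule are illustrative assumptions:

```python
# Minimal sketch (illustrative only): configure a route display interface from the
# maximum coordinate spans, then map the waypoints into it for display.
def configure_and_project(waypoints, width_px=800, height_px=600, margin_px=20):
    """waypoints: list of (x, y) coordinates in a common planar frame."""
    xs = [x for x, _ in waypoints]
    ys = [y for _, y in waypoints]
    span_x = max(xs) - min(xs) or 1.0          # maximum distance along each axis
    span_y = max(ys) - min(ys) or 1.0
    scale = min((width_px - 2 * margin_px) / span_x,
                (height_px - 2 * margin_px) / span_y)
    return [((x - min(xs)) * scale + margin_px,
             (y - min(ys)) * scale + margin_px) for x, y in waypoints]

if __name__ == "__main__":
    print(configure_and_project([(0, 0), (120, 40), (60, 200)]))
```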
Abstract:
A propulsion system includes a driving device and a propeller coupled with the driving device. The driving device includes a body portion and a lock portion disposed on the body portion. The lock portion includes a first snap-fitting portion. The propeller includes a hub having a receiving hole, a plurality of blades connected to the hub, and a second snap-fitting portion arranged on an inner wall of the receiving hole and configured to snap with the first snap-fitting portion.
Abstract:
An image processing method includes, when an edit triggering event for a target image is detected, acquiring description information associated with the target image. The description information includes interference information regarding interference that affected image quality during a shooting process of the target image. The method further includes editing image clips in the target image that are associated with respective interference information of the description information to obtain a processed target image.
Abstract:
A method for adjusting image exposure includes, while continuously tracking one or more targets using an imaging device mounted on a movable object, receiving a user indication of a target for imaging, determining a first representation of the target, capturing a first image that includes the first representation of the target, determining an exposure parameter for the imaging device using data in the first image that corresponds to the first representation of the target, determining a second representation of the target, capturing a second image including the second representation of the target, and adjusting the exposure parameter for the imaging device using data in the second image that corresponds to the second representation of the target.
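A minimal sketch under stated assumptions: the "representation of the target" is modeled as a bounding box in the captured image, and the exposure parameter is adjusted from the mean luminance inside that box using a simple proportional rule; the update rule and constants are illustrative, not the disclosed algorithm:

```python
# Minimal sketch (illustrative only): meter exposure on the pixels belonging to the
# tracked target's representation, then adjust the exposure parameter toward a
# desired target luminance.
def exposure_from_target(image, box, current_ev, target_luma=0.5, gain=2.0):
    """image: 2D list of luminance in [0, 1]; box: (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = box
    region = [v for row in image[r0:r1] for v in row[c0:c1]]
    mean_luma = sum(region) / len(region)
    # Raise EV if the target's pixels are darker than desired, lower it if brighter.
    return current_ev + gain * (target_luma - mean_luma)

if __name__ == "__main__":
    first_image = [[0.2] * 8 for _ in range(8)]        # first representation appears under-exposed
    ev = exposure_from_target(first_image, (2, 6, 2, 6), current_ev=0.0)
    second_image = [[0.45] * 8 for _ in range(8)]      # second representation after the target moved
    ev = exposure_from_target(second_image, (3, 7, 3, 7), current_ev=ev)
    print(round(ev, 3))
```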
Abstract:
A method includes receiving state information of a virtual movable object in a simulated movement from a movement simulator associated with a movable object and determining movement information for the simulated movement by associating the state information with context information. The state information includes information identifying a location of the virtual movable object in a virtual space. The context information includes information identifying a location of a user terminal, which is at a different location from the movable object in a real space. The method further includes displaying the simulated movement on a display associated with the user terminal based on the movement information, and receiving control data to control the simulated movement in the virtual space using the user terminal when the movable object is in simulation and to control movement of the movable object in the real space when the movable object is in real operation.
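A minimal sketch of how the state and context information could be associated into movement information for display; the dictionary fields and the offset-based association are an assumed data model, not the disclosed protocol:

```python
# Minimal sketch (illustrative only): associate simulator state information
# (virtual object location) with context information (user terminal location)
# to build the movement information used for display at the terminal.
def build_movement_info(state_info, context_info):
    virtual = state_info["virtual_location"]          # location of the virtual object in virtual space
    terminal = context_info["terminal_location"]      # location of the user terminal in real space
    return {
        "display_origin": terminal,                   # anchor the rendering at the terminal
        "offset": tuple(v - t for v, t in zip(virtual, terminal)),
        "simulated": True,
    }

if __name__ == "__main__":
    state = {"virtual_location": (10.0, 4.0, 30.0)}
    context = {"terminal_location": (0.0, 0.0, 0.0)}   # terminal is not where the real vehicle is
    print(build_movement_info(state, context))
```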