Abstract:
Various techniques are described to facilitate controlling an unmanned aerial vehicle (UAV) and viewing feedback received from the UAV. A graphical user interface (GUI) is provided that allows a user to view a display window. The display window may indicate structures, or portions of structures, for which additional image data is desired by highlighting those portions within the display window. Static imagery may be leveraged to provide smooth and consistent feedback transitions: when a delay exists between the time the UAV sends live video data and the time that data can be displayed in the GUI, the static images may be shown in the display window until the live video can be displayed. Structures included in an initial display window may also transition to a greater opacity over time, with the live video eventually being displayed.
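As a rough illustration of the static-to-live hand-off and opacity ramp described above, the following Python sketch shows one way a GUI loop might decide what to draw each frame; the class and constant names (FeedbackView, LIVE_DELAY_S, OPACITY_RAMP_S) are assumptions for illustration, not terms from the abstract.

```python
# A minimal sketch, assuming a GUI loop that asks each frame what to draw.
import time

LIVE_DELAY_S = 2.0     # assumed latency before live video is usable
OPACITY_RAMP_S = 1.5   # assumed time to fade structure overlays to full opacity


class FeedbackView:
    """Chooses between cached static imagery and live video, and ramps
    the opacity of highlighted structures over time."""

    def __init__(self, static_image):
        self.static_image = static_image
        self.start = time.monotonic()
        self.live_frame = None

    def on_live_frame(self, frame):
        # Called whenever a live video frame arrives from the UAV.
        self.live_frame = frame

    def frame_to_display(self):
        # Show the static image until live video is available and the
        # expected transmission delay has elapsed.
        elapsed = time.monotonic() - self.start
        if self.live_frame is not None and elapsed >= LIVE_DELAY_S:
            return self.live_frame
        return self.static_image

    def structure_opacity(self):
        # Linearly increase overlay opacity from 0 to 1 over the ramp period.
        elapsed = time.monotonic() - self.start
        return min(1.0, elapsed / OPACITY_RAMP_S)
```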
Abstract:
A tethering system for a remote-controlled device includes a tether line having a first end adapted to be connected to a ground support and a second end adapted to be connected to the remote-controlled device. The system further includes an anchor point disposed between the first and second ends of the tether line, the anchor point having an eyelet that secures the tether line while allowing it to slide through the eyelet during use. The anchor point and eyelet enable the tether line to flex or bend, and the remote-controlled device to maneuver over and/or around the target area, without interfering with nearby obstructions.
Abstract:
The method and system may be used to control the movement of a remote aerial device in an incremental, step-wise manner during a close inspection of an object or other subject matter. At the inspection location, a control module “stabilizes” the remote aerial device in a consistent hover while maintaining a close distance to the desired object. The control module may retrieve proximal sensor data that indicates possible nearby obstructions to the remote aerial device and may transmit the data to a remote control client. The remote control client may determine and display the one or more non-obstructed directions in which the remote aerial device is capable of moving by an incremental distance. In response to receiving a selection of one of the directions, the remote control client may transmit the selection to the remote aerial device to indicate its next movement.
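A minimal sketch of how non-obstructed incremental moves might be derived from proximal sensor data, assuming range readings keyed by direction; the step size, clearance margin, and direction names below are illustrative assumptions, not values from the abstract.

```python
# A minimal sketch, assuming proximity sensors report a distance per direction.
STEP_M = 0.5        # incremental step distance (assumed)
CLEARANCE_M = 0.3   # extra clearance required beyond the step (assumed)


def non_obstructed_directions(proximal_ranges_m):
    """Return the directions in which the device can move by one step.

    proximal_ranges_m: dict mapping direction name -> distance (meters)
    to the nearest obstruction reported by the proximity sensors.
    """
    required = STEP_M + CLEARANCE_M
    return [d for d, dist in proximal_ranges_m.items() if dist >= required]


if __name__ == "__main__":
    readings = {"forward": 2.1, "back": 0.4, "left": 1.0,
                "right": 0.6, "up": 3.0, "down": 0.2}
    # The remote control client could display these and await the user's selection.
    print(non_obstructed_directions(readings))  # ['forward', 'left', 'up']
```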
Abstract:
A method and system may assess damage to infrastructure using aerial images captured from an unmanned aerial vehicle (UAV), a manned aerial vehicle (MAV), or a satellite device. Specifically, an item of infrastructure may be identified for damage assessment. The UAV, MAV, or satellite device may then capture aerial images within an area surrounding the identified infrastructure item. Subsequently, the aerial images may be analyzed to determine the condition of the infrastructure item and the extent and/or severity of the damage to it. Furthermore, the aerial images, along with indications of the extent of the damage, may be displayed on a computing device.
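One way the image-analysis step might be organized is sketched below, assuming a per-image damage classifier is available; classify_damage, the severity thresholds, and the DamageAssessment structure are hypothetical placeholders, not components named in the abstract.

```python
# A minimal sketch of aggregating per-image damage scores into an assessment.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class DamageAssessment:
    infrastructure_id: str
    condition: str
    severity: float  # 0.0 (none) to 1.0 (destroyed)


def assess_infrastructure(infrastructure_id: str,
                          aerial_images: List[bytes],
                          classify_damage: Callable[[bytes], float]) -> DamageAssessment:
    # Score each aerial image and aggregate into an overall severity.
    scores = [classify_damage(img) for img in aerial_images]
    severity = max(scores) if scores else 0.0
    condition = "severe" if severity > 0.7 else "moderate" if severity > 0.3 else "minor"
    return DamageAssessment(infrastructure_id, condition, severity)
```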
Abstract:
A vehicle chatbot of a smart vehicle assistant engages in a conversation with a user associated with a vehicle, for instance to provide insurance information and range extension tips associated with vehicle operations. The vehicle chatbot may also engage in a conversation with an external entity in the event of a collision, to provide information to the external entity on behalf of vehicle occupants. The smart vehicle assistant may also cause the vehicle to autonomously drive to a location following a collision. A responder dispatched to respond to the collision may use a smart responder assistant that includes a responder chatbot. The responder chatbot may engage in a conversation with the responder to obtain information identifying the vehicle and/or damage to the vehicle, and may provide the responder with information about the vehicle and recommendations regarding how to extract occupants of the vehicle.
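A minimal sketch of how such a chatbot might route conversation turns, assuming intents are extracted upstream; the intent names, context keys, and replies are illustrative assumptions only.

```python
# A minimal sketch of intent routing for a vehicle chatbot (names assumed).
def handle_turn(intent: str, context: dict) -> str:
    if intent == "insurance_info":
        return f"Your policy {context.get('policy_id', 'on file')} covers this vehicle."
    if intent == "range_tips":
        return "Reducing speed and preconditioning the cabin while plugged in can extend range."
    if intent == "collision_report":
        # On behalf of occupants, share vehicle details with an external entity or responder.
        return (f"Vehicle {context.get('vin', 'unknown VIN')} reports a collision; "
                f"occupants: {context.get('occupants', 'unknown')}.")
    return "I'm sorry, I didn't understand that."
```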
Abstract:
The following relates generally to providing virtual reality (VR) alerts to a driver of an autonomous vehicle. For example, a vehicle may be driving autonomously while the driver is watching a VR movie (e.g., on a pair of VR goggles); the driver may then receive a VR alert recommending that the driver take control of the vehicle (e.g., switch the vehicle from autonomous to manual mode). The following also relates to generating a VR feed for presenting real-time road conditions so that a user may preview a road segment. The following also relates to generating a VR feed corresponding to an event (e.g., a vehicle collision, a crime, a weather event, and/or a natural disaster).
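A minimal sketch of the take-over alert logic, assuming the autonomy stack exposes a confidence score and the VR headset accepts overlay messages; the threshold, parameter names, and overlay callback are assumptions for illustration.

```python
# A minimal sketch of triggering a take-over alert inside a VR session.
TAKEOVER_CONFIDENCE_THRESHOLD = 0.6  # assumed


def maybe_alert_driver(autonomy_confidence: float, show_vr_overlay) -> bool:
    # Interrupt the VR content only when autonomous operation becomes uncertain.
    if autonomy_confidence < TAKEOVER_CONFIDENCE_THRESHOLD:
        show_vr_overlay("Please take manual control of the vehicle.")
        return True
    return False


if __name__ == "__main__":
    maybe_alert_driver(0.4, show_vr_overlay=print)  # prints the alert text
```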
Abstract:
Embodiments described herein, inter alia, receive telematics data collected over a period of time, wherein the telematics data is indicative of operation of a vehicle by a potential renter during the period of time; identify, upon analyzing the telematics data, driving behavior(s) of the renter during the period of time; determine, for each driving behavior, a corresponding state of the environment of the vehicle when the driving behavior occurred; determine renter eligibility value(s) for the renter based on the driving behavior(s) and the corresponding state(s) of the environment of the vehicle; compare the renter eligibility value(s) to user preference value(s) of a profile of a vehicle owner, wherein the user preference value(s) define one or more criteria for vehicle renters with whom the vehicle can be shared; and cause an indication of the vehicle associated with the profile to be displayed only if the renter satisfies the criteria.
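A minimal sketch of the final comparison step, assuming eligibility values and owner preference values share named dimensions; the dimension names and thresholds are illustrative assumptions, not terms from the abstract.

```python
# A minimal sketch of comparing renter eligibility values to owner preferences.
def renter_satisfies_preferences(eligibility: dict, owner_preferences: dict) -> bool:
    """Return True only if every owner-defined criterion is met.

    eligibility: dimension -> score derived from telematics and the
        environment in which each driving behavior occurred.
    owner_preferences: dimension -> minimum acceptable score.
    """
    return all(eligibility.get(dim, 0.0) >= minimum
               for dim, minimum in owner_preferences.items())


if __name__ == "__main__":
    eligibility = {"smooth_braking": 0.9, "speed_compliance": 0.7, "night_driving": 0.5}
    preferences = {"smooth_braking": 0.8, "speed_compliance": 0.6}
    # The vehicle listing is displayed to the renter only on a True result.
    print(renter_satisfies_preferences(eligibility, preferences))  # True
```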
Abstract:
Systems and methods are described for using data collected by unmanned aerial vehicles (UAVs) to generate insurance claim estimates that an insured individual may quickly review, approve, or modify. When an insurance-related event occurs, such as a vehicle collision, crash, or disaster, one or more UAVs are dispatched to the scene of the event to collect various data, including data related to vehicle or real property (insured asset) damage. With the insured's permission or consent, the data collected by the UAVs may then be analyzed to generate an estimated insurance claim for the insured. The estimated insurance claim may be sent to the insured individual, such as to their mobile device via wireless communication or data transmission, for subsequent review and approval. As a result, insurance claim handling and/or the online customer experience may be enhanced.
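A minimal sketch of turning detected damage into an estimate for the insured to review, assuming per-item repair costs can be looked up from a table; the cost table, item names, and deductible are illustrative assumptions.

```python
# A minimal sketch of generating an estimated claim from UAV-detected damage.
REPAIR_COSTS = {            # assumed lookup of damaged component -> cost (USD)
    "roof": 4200.0,
    "windshield": 650.0,
    "rear_bumper": 900.0,
}


def estimate_claim(damaged_items, deductible=500.0):
    # Sum known repair costs and subtract the policy deductible.
    gross = sum(REPAIR_COSTS.get(item, 0.0) for item in damaged_items)
    return max(0.0, gross - deductible)


if __name__ == "__main__":
    # Items detected from UAV imagery at the scene (with the insured's consent).
    estimate = estimate_claim(["windshield", "rear_bumper"])
    print(f"Estimated claim: ${estimate:,.2f}")  # Estimated claim: $1,050.00
```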
Abstract:
Methods and systems for charging an electric vehicle (EV) are described herein. An EV may require additional battery power to reach a charging station. A remote server in communication with the EV or an on-board computer or mobile device in the EV may obtain data to determine a location for the EV to meet a charging vehicle. The charging vehicle may be dispatched to meet the EV and deliver power to it, enabling the EV to reach a charging station or other destination. In some examples, the charging vehicle may deliver power to the EV while both vehicles are stationary. In other examples, the charging vehicle may couple to the EV while both vehicles are in motion.
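A minimal sketch of choosing where the charging vehicle might meet the EV, assuming straight-line distances and a known remaining range; the waypoints, range figures, and function names are illustrative assumptions.

```python
# A minimal sketch of picking a meeting point the EV can still reach.
import math


def choose_meeting_point(ev_position, ev_range_km, waypoints):
    """Pick the reachable waypoint farthest from the EV's current position
    (a simple proxy for progress toward the charging station)."""
    reachable = [(p, math.dist(ev_position, p)) for p in waypoints
                 if math.dist(ev_position, p) <= ev_range_km]
    if not reachable:
        return ev_position  # EV cannot move far; dispatch the charging vehicle to it
    return max(reachable, key=lambda pd: pd[1])[0]


if __name__ == "__main__":
    ev = (0.0, 0.0)
    candidates = [(3.0, 4.0), (6.0, 8.0), (12.0, 5.0)]  # km coordinates (assumed)
    print(choose_meeting_point(ev, ev_range_km=11.0, waypoints=candidates))  # (6.0, 8.0)
```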