Abstract:
Some embodiments provide a mapping application for generating views of a three-dimensional (3D) map. The mapping application includes a geographic data module for identifying a set of geographic data that represents a portion of the 3D map. The set of geographic data includes a set of camera-captured images that correspond to the portion of the 3D map. The mapping application also includes an image processing module for rendering a view of the 3D map based on the geographic data by animating a type of map element in the view.
Abstract:
Some embodiments of the invention provide a navigation application that allows a user to peek ahead or behind during a turn-by-turn navigation presentation that the application provides while tracking a device's (e.g., a mobile device's, a vehicle's, etc.) traversal of a physical route. As the device traverses along the physical route, the navigation application generates a navigation presentation that shows a representation of the device on a map traversing along a virtual route that represents the physical route on the map. While providing the navigation presentation, the navigation application can receive user input to look ahead or behind along the virtual route. Based on the user input, the navigation application moves the navigation presentation to show locations on the virtual route that are ahead or behind the displayed current location of the device on the virtual route. This movement can cause the device representation to no longer be visible in the navigation presentation. Also, the virtual route often includes several turns, and the peek ahead or behind movement of the navigation presentation passes the presentation through one or more of these turns. In some embodiments, the map can be presented as a two-dimensional (2D) or a three-dimensional (3D) scene.
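The peek-ahead movement described above amounts to sliding the presentation's focal point forward or backward along the virtual route's polyline. A minimal sketch of that interpolation, assuming the route is given as a list of 2D map points (the function name and route representation are illustrative, not taken from the patent):

```python
import math

def point_along_route(route, offset):
    """Return the map point that lies `offset` units along `route`.

    `route` is a list of (x, y) map coordinates defining a polyline.
    `offset` is clamped to the polyline's length, so peeking past either
    end simply stops at the route's endpoint.
    """
    if offset <= 0:
        return route[0]
    traveled = 0.0
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if seg == 0.0:
            continue  # skip degenerate (zero-length) segments
        if traveled + seg >= offset:
            t = (offset - traveled) / seg  # fraction into this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        traveled += seg
    return route[-1]
```

Peeking ahead then reduces to animating `offset` from the device's current along-route distance toward a larger value, and peeking behind toward a smaller one; because the interpolation follows the polyline itself, the presentation naturally passes through any intervening turns.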
Abstract:
A mobile computing device can be used to locate a vehicle parking location in weak location signal scenarios (e.g., weak, unreliable, or unavailable GPS or other location technology). In particular, the mobile device can determine when a vehicle in which the mobile device is located has entered into a parked state. GPS or other primary location technology may be unavailable at the time the mobile device entered into a parked state (e.g., inside a parking structure). The location of the mobile device at a time corresponding to when the vehicle is identified as being parked can be determined using the primary location technology as supplemented with sensor data of the mobile device. Once determined, this location can be associated with an identifier for the current parking location.
Abstract:
Some embodiments provide a device that stores a novel navigation application. The application in some embodiments includes a user interface (UI) that has a display area for displaying a two-dimensional (2D) navigation presentation or a three-dimensional (3D) navigation presentation. The UI includes a selectable 3D control for directing the program to transition between the 2D and 3D presentations.
Abstract:
A mobile computing device can be used to locate a vehicle parking location. In particular, the mobile device can automatically identify when a vehicle in which the mobile device is located has entered into a parked state. The mobile device can determine that the vehicle is in a parked state by analyzing one or more parameters that indicate a parked state or a transit state. The location of the mobile device at a time corresponding to when the vehicle is identified as being parked can be associated with an identifier for the current parking location.
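A toy illustration of the kind of parameter analysis the abstract describes: combining a few signals (a speed estimate, a motion-activity label, and the phone's connection to the vehicle's Bluetooth) into a parked-versus-transit decision. The parameter names and thresholds here are assumptions for illustration, not details from the patent:

```python
def classify_vehicle_state(speed_mps, motion_activity, vehicle_bt_connected):
    """Heuristically classify the vehicle as 'parked' or 'in_transit'.

    speed_mps            -- latest speed estimate in meters/second
    motion_activity      -- e.g. 'automotive', 'walking', 'stationary'
    vehicle_bt_connected -- whether the phone is still paired to the car

    A disconnect from the car's Bluetooth while the user is walking, or a
    sustained near-zero speed while stationary, both suggest a parked state.
    """
    if not vehicle_bt_connected and motion_activity == "walking":
        return "parked"
    if speed_mps < 0.5 and motion_activity == "stationary":
        return "parked"
    return "in_transit"
```

A production implementation would weigh many more parameters over a time window rather than a single sample, but the shape of the decision is the same.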
Abstract:
A mobile device including a touchscreen display can detect multiple points of fingertip contact being made against the touchscreen concurrently. The device can distinguish this multi-touch gesture from other gestures based on the duration, immobility, and concurrency of the contacts. In response to detecting such a multi-touch gesture, the device can send a multi-touch event to an application executing on the device. The application can respond to the multi-touch event in a variety of ways. For example, the application can determine a distance of a path between points on a map that a user has concurrently touched with their fingertips. The application can display this distance to the user.
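For the distance-measurement example, the application would map each touch point to a geographic coordinate and sum great-circle legs between consecutive points. A sketch using the standard haversine formula (the helper names and path representation are illustrative assumptions):

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def path_distance_km(points):
    """Total distance along a path of (lat, lon) points, e.g. the map
    coordinates under each fingertip of a multi-touch gesture."""
    return sum(haversine_km(a[0], a[1], b[0], b[1])
               for a, b in zip(points, points[1:]))
```

With two fingertips the path has a single leg; with more touch points the same sum covers the whole polyline.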
Abstract:
A device that includes at least one processing unit and stores a multi-mode mapping program for execution by the at least one processing unit is described. The program includes a user interface (UI). The UI includes a display area for displaying a two-dimensional (2D) presentation of a map or a three-dimensional (3D) presentation of the map. The UI includes a selectable 3D control for directing the program to transition between the 2D and 3D presentations.
Abstract:
At least certain embodiments of the present disclosure include an environment with a framework of software code interacting with a plurality of applications to provide gesture operations in response to user inputs detected on a display of a device. A method for operating through an application programming interface (API) in this environment includes displaying a user interface that includes a respective view that is associated with a respective application of the plurality of applications. The method includes, while displaying the respective view, detecting, via the software code, a user input within a region of a touch-sensitive surface that corresponds to the respective view, and, in response, in accordance with a determination that the user input is an inadvertent user input, ignoring the user input. The determination that the user input is an inadvertent user input is made based on an inadvertent user input call transferred through the API.
Abstract:
Devices, methods, and machine-readable media to facilitate intuitive comparison and selection of calculated navigation routes are disclosed. An electronic device for navigation includes a touch-sensitive screen and a processing module for displaying a map, calculating a number of navigation routes simultaneously on the touch-sensitive screen, and receiving a selection of a route. Callouts, or markers for presenting key information about each route, are also displayed discretely on the map. Navigation tiles including key route information and route pictorials can also be created and displayed for each calculated route.
Abstract:
A multi-step animation sequence for smoothly transitioning from a map view to a panorama view of a specified location is disclosed. An orientation overlay can be displayed on the panorama, showing a direction and angular extent of the field of view of the panorama. An initial specified location and a current location of the panorama can also be displayed on the orientation overlay. A navigable placeholder panorama to be displayed in place of a panorama at the specified location when panorama data is not available is disclosed. A perspective view of a street name annotation can be overlaid on the surface of a street in the panorama.