Abstract:
Systems and methods for time-proximity-based map user interactions with a user interface are provided. In one example implementation, a method includes providing for display a user interface on a display device. The user interface can display imagery of a geographic area. The method can include obtaining data indicative of a relevant time for contextual information. The method can include obtaining contextual information associated with the geographic area. The method can include obtaining a configuration for a user interface element associated with the time-based contextual information based at least in part on the time proximity of the contextual information to the relevant time. The method can include providing for display the user interface element based at least in part on the configuration.
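For illustration only, the following TypeScript sketch shows one way a user interface element's configuration could be chosen based on the time proximity of contextual information to a relevant time; the interfaces, field names, and thresholds (ContextualInfo, ElementConfig, configureElement) are hypothetical assumptions, not details taken from the abstract.

```typescript
// Hypothetical sketch: pick a display configuration for a map UI element
// based on how close the contextual information's time is to the relevant time.

interface ContextualInfo {
  label: string;     // e.g. "Farmers market" (illustrative)
  eventTime: Date;   // time the contextual information pertains to
}

interface ElementConfig {
  opacity: number;        // more prominent when the event is temporally near
  sizePx: number;
  showCountdown: boolean;
}

function configureElement(info: ContextualInfo, relevantTime: Date): ElementConfig {
  const hoursAway =
    Math.abs(info.eventTime.getTime() - relevantTime.getTime()) / 3.6e6;

  if (hoursAway <= 1) {
    return { opacity: 1.0, sizePx: 48, showCountdown: true };   // imminent: emphasize
  } else if (hoursAway <= 24) {
    return { opacity: 0.8, sizePx: 32, showCountdown: false };  // same day: normal
  }
  return { opacity: 0.4, sizePx: 24, showCountdown: false };    // distant: de-emphasize
}

// Example: an event two hours from the relevant time gets the mid-prominence configuration.
const config = configureElement(
  { label: "Concert", eventTime: new Date("2024-06-01T20:00:00Z") },
  new Date("2024-06-01T18:00:00Z"),
);
```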
Abstract:
In general, the subject matter described in this disclosure can be embodied in methods, systems, and computer-readable devices. A computing device presents a user interface that includes first and second images. The computing device receives first user input that moves the user interface in a first direction. The computing device moves the user interface in the first direction by moving the first and second images in the first direction and modifying a presentation of the second image, at the same time that the second image is moved in the first direction, to indicate that the second image is pannable. The computing device receives second user input that interacts with the second image to pan the second image. The computing device pans the second image by replacing a first portion of the second image with a second portion of the second image.
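As an illustrative sketch only, the following TypeScript shows one plausible way a pannable image could be signaled while the surrounding interface moves and then panned by replacing its visible portion; the names (PannableImage, onInterfaceScroll, onImageDrag) and the parallax factor are assumptions, not the claimed implementation.

```typescript
// Hypothetical sketch: while the interface scrolls, shift the pannable image's
// visible window slightly (a parallax cue); a drag on the image itself then pans
// within the full image by changing which portion is shown.

interface PannableImage {
  fullWidthPx: number;     // width of the underlying (e.g. panoramic) image
  viewportWidthPx: number; // width of the visible frame
  offsetPx: number;        // left edge of the currently visible portion
}

function clamp(v: number, lo: number, hi: number): number {
  return Math.min(hi, Math.max(lo, v));
}

// Called while the user moves the surrounding interface in the first direction.
function onInterfaceScroll(img: PannableImage, scrollDeltaPx: number): void {
  // Move the visible window a fraction of the scroll distance so the image
  // visibly shifts inside its frame, hinting that it can be panned.
  img.offsetPx = clamp(
    img.offsetPx + scrollDeltaPx * 0.2,
    0,
    img.fullWidthPx - img.viewportWidthPx,
  );
}

// Called when the user drags directly on the image to pan it.
function onImageDrag(img: PannableImage, dragDeltaPx: number): void {
  // Replace the visible portion with an adjacent portion of the same image.
  img.offsetPx = clamp(
    img.offsetPx - dragDeltaPx,
    0,
    img.fullWidthPx - img.viewportWidthPx,
  );
}
```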
Abstract:
The technology relates to selecting and displaying images captured at different points in time. As an example, a user of a computing device may view a first street level image as seen from a particular location and oriented in a particular direction. The user may select other time periods for which similar images are available. Upon selecting a particular time period, a second street level image, captured on or around the selected time period, may be displayed concurrently with the first street level image. If the user changes the perspective of the first image, the perspective of the second image may change automatically.
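For illustration, a minimal TypeScript sketch of how two time-separated street level views might be kept in the same perspective and how imagery for a selected time period might be chosen; the data model (Perspective, StreetLevelView, TimeComparisonViewer) is an assumption and is not drawn from the abstract.

```typescript
// Hypothetical sketch: keep a primary street level view and a comparison view
// from another time period in lock-step perspective.

interface Perspective {
  lat: number;
  lng: number;
  headingDeg: number;  // viewing direction
  pitchDeg: number;
}

interface StreetLevelView {
  capturedAt: Date;
  perspective: Perspective;
}

class TimeComparisonViewer {
  constructor(private primary: StreetLevelView, private secondary: StreetLevelView) {}

  // When the user changes the perspective of the first image, apply the same
  // perspective to the second image automatically.
  setPerspective(p: Perspective): void {
    this.primary.perspective = { ...p };
    this.secondary.perspective = { ...p };
  }

  // Swap in imagery captured on or around a newly selected time period.
  // Assumes candidates is non-empty.
  selectTimePeriod(candidates: StreetLevelView[], target: Date): void {
    const nearest = candidates.reduce((best, v) =>
      Math.abs(v.capturedAt.getTime() - target.getTime()) <
      Math.abs(best.capturedAt.getTime() - target.getTime()) ? v : best);
    this.secondary = { ...nearest, perspective: { ...this.primary.perspective } };
  }
}
```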
Abstract:
Aspects of the disclosure relate generally to providing a user with an image navigation experience. For instance, a first image of a multidimensional space is provided with an overlay line indicating a direction in which the space extends into the first image, such that a second image is connected to the first image along a direction of the overlay line. User input indicating a swipe across a portion of the display is received. When the swipe occurs at least partially within an interaction zone, defining an area around the overlay line at which the user can interact with the space, the swipe indicates a request to display an image different from the first image. The second image is selected and provided for display based on the swipe and a connection graph connecting the first image and the second image along the direction of the overlay line.
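As an illustrative sketch only, the following TypeScript shows one way a swipe could be tested against an interaction zone around an overlay line and resolved through a connection graph; the geometry helper, zone width, and graph representation are hypothetical assumptions rather than the disclosed method.

```typescript
// Hypothetical sketch: accept a swipe only if it starts inside the interaction
// zone around the overlay line, then follow the connection graph to the image
// connected along that line's direction.

interface Point { x: number; y: number; }

interface OverlayLine {
  start: Point;
  end: Point;
  zoneWidthPx: number;  // half-width of the interaction zone around the line
}

interface ConnectionGraph {
  // For a given image id, the id of the image connected along the overlay line.
  nextAlongLine: Map<string, string>;
}

// Distance from a point to the overlay line segment.
function distanceToSegment(p: Point, a: Point, b: Point): number {
  const dx = b.x - a.x, dy = b.y - a.y;
  const len2 = dx * dx + dy * dy || 1;
  const t = Math.max(0, Math.min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2));
  return Math.hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy));
}

function handleSwipe(
  swipeStart: Point,
  line: OverlayLine,
  currentImageId: string,
  graph: ConnectionGraph,
): string | null {
  // Only swipes that begin inside the interaction zone request a new image.
  if (distanceToSegment(swipeStart, line.start, line.end) > line.zoneWidthPx) {
    return null;
  }
  // Follow the connection graph along the direction of the overlay line.
  return graph.nextAlongLine.get(currentImageId) ?? null;
}
```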