Abstract:
A wearable device receives user configuration data that provides user-specific information for stored use cases and generates a personalized use case database based on the stored use cases and the user-specific information. The wearable device obtains location, sensor, time, and peripheral state data and detects a meaningful change in a most recent data instance of the location, sensor, time, or peripheral state data relative to an earlier data instance of the location, sensor, time, and peripheral state data. The wearable device determines whether the most recent data instance corresponds to a particular use case in the personalized use case database; presents on a display a new home-screen experience, including multiple application user interfaces for the particular use case, when the most recent data instance corresponds to the particular use case; and changes a hardware state to enable use of the multiple application user interfaces in the new home-screen experience.
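A minimal Python sketch of the flow this abstract describes (detect a meaningful change in context data, match it against the personalized use case database, then surface a tailored home-screen experience) follows. Every name, threshold, and data field below is an illustrative assumption, not the claimed implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class ContextSnapshot:
    """One data instance of location, sensor, time, and peripheral state."""
    location: tuple  # (latitude, longitude)
    heart_rate: int  # example sensor reading
    hour_of_day: int
    headphones_connected: bool  # example peripheral state


@dataclass(frozen=True)
class UseCase:
    name: str
    location: tuple
    hour_range: tuple  # (start_hour, end_hour)
    apps: tuple  # application UIs composing the home-screen experience


def meaningful_change(earlier: ContextSnapshot, latest: ContextSnapshot) -> bool:
    """Detect a meaningful change between two data instances."""
    moved = (abs(earlier.location[0] - latest.location[0]) > 0.01
             or abs(earlier.location[1] - latest.location[1]) > 0.01)
    return moved or earlier.headphones_connected != latest.headphones_connected


def match_use_case(latest: ContextSnapshot, database) -> Optional[UseCase]:
    """Return the use case the latest data instance corresponds to, if any."""
    for case in database:
        near = (abs(case.location[0] - latest.location[0]) < 0.01
                and abs(case.location[1] - latest.location[1]) < 0.01)
        in_window = case.hour_range[0] <= latest.hour_of_day < case.hour_range[1]
        if near and in_window:
            return case
    return None


# Personalized database: stored use cases plus user-specific information.
database = [UseCase("gym", (40.71, -74.00), (6, 9), ("workout", "music", "timer"))]

earlier = ContextSnapshot((40.75, -73.99), 62, 7, False)
latest = ContextSnapshot((40.71, -74.00), 95, 7, True)

if meaningful_change(earlier, latest):
    case = match_use_case(latest, database)
    if case is not None:
        # Present the new home-screen experience and, per the abstract,
        # change hardware state (e.g. power up sensors) to support it.
        print(f"Home screen for '{case.name}': {case.apps}")
```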
Abstract:
A user device may include a display that includes a first area and a second area. The first area may be located around a periphery of the second area, and the first area may include a touch-sensitive surface. The user device may detect a user interaction with the first area and may determine an action associated with the user interaction. The user device may perform the action in association with information displayed in the second area.
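As a rough illustration of routing touches in the peripheral (first) display area to actions on content shown in the central (second) area, here is a hedged Python sketch. The region geometry, gesture names, and action mapping are assumptions made for illustration only:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Display:
    width: int
    height: int
    border: int  # thickness of the touch-sensitive first area

    def in_first_area(self, x: int, y: int) -> bool:
        """True when a touch lands in the periphery around the second area."""
        return (x < self.border or x >= self.width - self.border
                or y < self.border or y >= self.height - self.border)


def scroll_content(direction: str) -> None:
    print(f"scrolling second-area content {direction}")


# Illustrative mapping from peripheral gestures to actions performed in
# association with information displayed in the second area.
ACTIONS: dict = {
    "swipe_right_edge_up": lambda: scroll_content("up"),
    "swipe_right_edge_down": lambda: scroll_content("down"),
}


def on_touch(display: Display, x: int, y: int, gesture: str) -> None:
    """Detect a first-area interaction and perform its associated action."""
    if display.in_first_area(x, y) and gesture in ACTIONS:
        ACTIONS[gesture]()


on_touch(Display(400, 400, border=40), x=395, y=200, gesture="swipe_right_edge_up")
```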
Abstract:
A virtual reality (VR) device presents a virtual environment and detects an input that relates to a contact by a user. The VR device identifies attributes of the input, and the attributes include, for example, a portion of the VR device associated with the contact, a duration of the contact, and a direction of motion of the contact. The VR device modifies the presented virtual environment based on the input attributes, such as presenting a default view of the virtual environment when a first type of input is detected; moving a virtual horizon when a second type of input is detected; enlarging a depicted virtual object when a third type of input is detected; presenting a menu of options when a fourth type of input is detected; and capturing and presenting an image or video of the user's actual surroundings when a fifth type of input is detected.
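A short Python sketch of dispatching on the input attributes named in this abstract (contact location, duration, and direction of motion) to select one of the five described modifications follows. The specific regions, thresholds, and handler strings are hypothetical; the abstract specifies only the behaviors:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ContactInput:
    region: str        # portion of the VR device touched, e.g. "left_side"
    duration_s: float  # duration of the contact
    motion: str        # direction of motion, e.g. "up", "down", "none"


def handle_input(inp: ContactInput) -> str:
    """Map input attributes to one of the five described scene modifications."""
    if inp.region == "front" and inp.motion == "none" and inp.duration_s < 0.3:
        return "present default view of the virtual environment"
    if inp.region == "top" and inp.motion in ("up", "down"):
        return f"move virtual horizon {inp.motion}"
    if inp.region == "right_side" and inp.motion == "none" and inp.duration_s >= 1.0:
        return "enlarge depicted virtual object"
    if inp.region == "left_side" and inp.duration_s >= 1.0:
        return "present menu of options"
    if inp.region == "bottom":
        return "capture and present image/video of actual surroundings"
    return "no change"


print(handle_input(ContactInput("top", 0.5, "up")))       # moves the horizon
print(handle_input(ContactInput("bottom", 0.2, "none")))  # pass-through view
```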
Abstract:
A device may provide information identifying a user interface layout, associated with a smart watch, for display. The device may receive, based on the displayed layout information, a user configuration of the user interface layout. The device may provide, to the smart watch, configuration information associated with the user configuration to permit the smart watch to update a user interface based on the configuration information.
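A minimal Python sketch of this configuration flow (display a layout, collect the user's configuration, push configuration information to the watch) follows. The JSON schema and the transport are assumptions; the abstract specifies neither:

```python
import json


def show_layout(layout: dict) -> None:
    """Provide information identifying the UI layout for display."""
    print("Available layout:", json.dumps(layout))


def collect_user_configuration() -> dict:
    """Stand-in for the user's edits to the displayed layout."""
    return {"complications": ["weather", "heart_rate"], "theme": "dark"}


def send_to_watch(config: dict) -> bytes:
    """Serialize configuration information for the smart watch to apply."""
    payload = json.dumps({"type": "ui_config", "config": config}).encode()
    # ...transmit payload (e.g. over Bluetooth); the watch updates its UI...
    return payload


layout = {"watch_face": "circular", "slots": 4}
show_layout(layout)
config = collect_user_configuration()
print("Sent", len(send_to_watch(config)), "bytes to the smart watch")
```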