Abstract:
An adaptive navigation system anticipates a user's interactions with a device, modifying the device's future behavior based on past user movements and interactions. The system records the user's movement patterns and correlates those patterns with how the user interacts with the device. When a movement pattern recurs, the system modifies at least one behavior of the device based on the user's past interactions with the device.
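A minimal sketch of the pattern-to-interaction correlation described above, in Python; the names (AdaptiveNavigator, record, on_pattern_recurrence) and the recurrence threshold are illustrative assumptions, not terms from the abstract. The idea is to count which interactions historically co-occur with a movement pattern and, when the pattern recurs often enough, surface the most frequent one as the behavior to adapt.

from collections import Counter, defaultdict

class AdaptiveNavigator:
    def __init__(self):
        # movement-pattern signature -> counts of interactions seen with that pattern
        self._history = defaultdict(Counter)

    def record(self, pattern: str, interaction: str) -> None:
        # Record that the interaction occurred while this movement pattern was observed.
        self._history[pattern][interaction] += 1

    def on_pattern_recurrence(self, pattern: str, min_occurrences: int = 3):
        # When a pattern recurs, return the interaction most often associated with it,
        # or None if there is not yet enough history to adapt the device.
        counts = self._history.get(pattern)
        if not counts or sum(counts.values()) < min_occurrences:
            return None
        return counts.most_common(1)[0][0]

nav = AdaptiveNavigator()
for _ in range(3):
    nav.record("evening_commute", "open_podcast_app")
print(nav.on_pattern_recurrence("evening_commute"))  # -> open_podcast_app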
Abstract:
A computing device can obtain usage data associated with the device. The usage data indicate how the computing device operates or how the device is used. The device can analyze the usage data to recognize usage patterns. The usage patterns can correspond to recurring actions or tasks initiated by the user using the device, such as actions or tasks initiated due to the user's habits and/or routines. Based on the usage patterns, the device can determine a task that has a sufficient likelihood of being performed using the device within a specified or determined time (e.g., 5 minutes from now, one year from now, etc.). The device can provide information (e.g., recommendations) associated with the task, and likely relevant to the user. The user can use the provided information to perform the task, thereby increasing the ease of access or efficiency associated with performing the task.
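One way such a likelihood estimate could be computed is sketched below, assuming usage events are reduced to simple (task, hour-of-day) records; the function name, the one-hour window, and the 0.3 threshold are illustrative assumptions, not values from the abstract.

from collections import Counter

def likely_tasks(events, current_hour, window_hours=1, threshold=0.3):
    # events: list of (task_name, hour_of_day) tuples from past usage.
    # Returns tasks whose share of historical activity in the upcoming window
    # meets the likelihood threshold, most frequent first.
    window = {(current_hour + h) % 24 for h in range(window_hours + 1)}
    in_window = [task for task, hour in events if hour in window]
    if not in_window:
        return []
    counts = Counter(in_window)
    total = len(in_window)
    return [task for task, n in counts.most_common() if n / total >= threshold]

history = [("order_coffee", 8), ("order_coffee", 8), ("check_news", 8), ("order_coffee", 9)]
print(likely_tasks(history, current_hour=8))  # -> ['order_coffee']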
Abstract:
Users can switch between applications using contextual interface elements. These elements can include icons for applications determined likely to be accessed by the user in the current context. Information is gathered to determine the current context, and information such as patterns of historical usage is then used to determine and rank the applications by likelihood of use. Different contexts can include different icons, and a given context can include different icons for different points in time or locations. A user can access a contextual interface element by performing a swipe motion, for example. The user can continue the motion to an area associated with an icon of interest, and perform an action such as a tap or release to cause the associated application to be launched. Such an approach enables a user to quickly and easily launch another application independent of the application currently active on the device.
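A minimal sketch of the ranking step, assuming usage history has already been bucketed by a context key (for example, a combined time-of-day and location string); the names ContextualRanker, record_launch, and rank are illustrative only.

from collections import Counter, defaultdict

class ContextualRanker:
    def __init__(self):
        # context key (e.g. "morning@car") -> launch counts per application
        self._launches = defaultdict(Counter)

    def record_launch(self, context: str, app: str) -> None:
        # Record one historical launch of an application in a given context.
        self._launches[context][app] += 1

    def rank(self, context: str, top_n: int = 4):
        # Return up to top_n application identifiers, most likely first,
        # suitable for populating a contextual interface element with icons.
        return [app for app, _ in self._launches[context].most_common(top_n)]

ranker = ContextualRanker()
for app in ["maps", "music", "maps", "podcasts", "maps", "music"]:
    ranker.record_launch("morning@car", app)
print(ranker.rank("morning@car"))  # -> ['maps', 'music', 'podcasts']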
Abstract:
Techniques for determining positions of devices within an environment are described herein. In some instances, an environment, such as a home or office of a user, may include an array of devices, some or all of which may couple to a network or to other devices via short-range wireless connections (e.g., Bluetooth®, Zigbee®, etc.). These devices may capture an array of data to provide to a central service, which is configured to analyze the data and, based on this analysis, determine a location of the devices relative to one another. That is, the central service may analyze the data to determine relative distances and orientations between the identified devices within the environment.
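As an illustration of the kind of analysis a central service might perform, the sketch below converts a received signal strength (RSSI) reading from a short-range wireless link into an estimated distance using a log-distance path-loss model; the constants are typical assumptions rather than values from the abstract, and a real system would combine many such pairwise estimates (e.g., via multilateration) to recover relative positions and orientations.

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0, path_loss_exp: float = 2.0) -> float:
    # Estimate distance in meters from received signal strength.
    # tx_power_dbm is the expected RSSI at 1 m; path_loss_exp models the environment.
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

# A central service could collect pairwise estimates like this from many devices
# and solve for a consistent relative layout of the environment.
print(round(rssi_to_distance(-65.0), 2))  # roughly 2.0 meters under these assumptions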
Abstract:
Techniques are described for determining height, weight, or other characteristics of a user based on processed sensor data. The sensor data may include data collected by sensors on the user's computing device or other computing devices, or data collected by stationary, external sensors. Different types of sensor data may be processed to estimate at least one physical characteristic of the user, such as the user's height, weight, apparel size, age, and so forth. The estimated characteristics may be employed to perform actions based on the user's identity or category, to customize content delivery for the user, or for other purposes.
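One simple instance of such an estimate is sketched below: inferring height from two sensor streams, distance traveled (e.g., from GPS) and step count (e.g., from a pedometer). The 0.415 step-length-to-height ratio is a common heuristic assumed here for illustration, not a value taken from the abstract.

def estimate_height_m(distance_m: float, step_count: int, step_ratio: float = 0.415) -> float:
    # Estimate user height in meters from average step length, using the
    # heuristic that step length is roughly step_ratio times height.
    if step_count <= 0:
        raise ValueError("step_count must be positive")
    avg_step_m = distance_m / step_count
    return avg_step_m / step_ratio

print(round(estimate_height_m(distance_m=140.0, step_count=200), 2))  # ~1.69 m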
Abstract:
In some examples, an electronic device may include one or more recognition devices that may be used to recognize the current user. If the electronic device recognizes that a primary user, such as an owner, is currently using the electronic device, the electronic device may allow access to all of the primary user's private information and all of the features of the electronic device. On the other hand, when the electronic device determines that the current user is an unknown user, or that the current user is a known user who is authorized to access only limited information or features of the electronic device, the electronic device may send a communication to restrict the current user from accessing private information of the primary user. In some cases, the electronic device may enable the primary user to designate which items known users and/or unknown users may access.
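A minimal sketch of the kind of per-user policy described above, assuming the primary user designates permitted items for known users and for a generic "guest" category covering unknown users; the class, method, and user names are illustrative.

class AccessPolicy:
    def __init__(self, primary_user: str):
        self.primary_user = primary_user
        self._allowed = {}  # user name or "guest" -> set of permitted items

    def designate(self, user: str, items: set) -> None:
        # Primary user designates which items a known user (or "guest") may access.
        self._allowed[user] = set(items)

    def can_access(self, current_user: str, item: str) -> bool:
        # The primary user sees everything; others see only designated items,
        # with unknown users falling back to the "guest" designation.
        if current_user == self.primary_user:
            return True
        permitted = self._allowed.get(current_user, self._allowed.get("guest", set()))
        return item in permitted

policy = AccessPolicy(primary_user="alice")
policy.designate("bob", {"photos"})
policy.designate("guest", {"browser"})
print(policy.can_access("alice", "email"))       # True
print(policy.can_access("bob", "email"))         # False
print(policy.can_access("stranger", "browser"))  # True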
Abstract:
A system and method for identifying a user of a device include comparing audio received by the device with acoustic fingerprint information to identify the user. Image data, video data, and other data may also be used in the identification of the user. Once the user is identified, operation of the device may be customized based on the user. Further, once the user is identified, data can be associated with the user, for example, usage data, location data, gender data, age data, dominant hand data, and other data. This data can then be used to further customize the operation of the device for the specific user.
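A minimal sketch of matching a received audio sample against stored acoustic fingerprints, assuming the fingerprints are represented as feature vectors compared by cosine similarity; real systems would use richer speaker features, and the threshold here is illustrative.

import math

def cosine(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def identify_user(sample_features, fingerprints, threshold=0.9):
    # fingerprints: dict of user -> stored acoustic feature vector.
    # Returns the best-matching user above the threshold, else None.
    best_user, best_score = None, threshold
    for user, stored in fingerprints.items():
        score = cosine(sample_features, stored)
        if score > best_score:
            best_user, best_score = user, score
    return best_user

stored = {"alice": [0.9, 0.1, 0.3], "bob": [0.2, 0.8, 0.5]}
print(identify_user([0.88, 0.12, 0.31], stored))  # -> 'alice'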