Abstract:
In various example embodiments, a system and method are presented for causing a secondary user device to perform an activity that is complementary to a device activity performed on a primary user device. In an example embodiment, a device activity being performed by a primary user device is detected. A secondary user device capable of performing a complementary activity corresponding to the device activity is identified. Instructions for the secondary user device to perform the complementary activity are generated based on the complementary activity including an activity component that utilizes a functionality of the secondary user device that is not available on the primary user device, wherein the functionality includes capturing data of a particular data type from a sensor. The instructions to perform the complementary activity are transmitted to the secondary user device.
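As an informal illustration of the flow described above, the following Python sketch detects an activity on a primary device, identifies a secondary device whose sensors can capture the required data type, and generates instructions for it. The class names, sensor labels, and the workout example are assumptions made for illustration and are not taken from the embodiment itself.

```python
# Hypothetical sketch of the described flow; names such as Device,
# identify_secondary_device, and the heart_rate sensor are illustrative
# assumptions, not taken from the source.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    sensors: set          # data types this device can capture

def identify_secondary_device(devices, required_sensor, primary):
    """Return a device (other than the primary) that can capture the
    required data type from one of its sensors."""
    for device in devices:
        if device is not primary and required_sensor in device.sensors:
            return device
    return None

def generate_instructions(activity, sensor):
    # The instructions ask the secondary device to run the activity
    # component that needs the sensor the primary device lacks.
    return {"activity": activity, "capture": sensor}

# Example: a phone plays a workout video; a watch captures heart rate.
phone = Device("phone", {"gps", "camera"})
watch = Device("watch", {"heart_rate", "accelerometer"})

secondary = identify_secondary_device([phone, watch], "heart_rate", phone)
if secondary:
    instructions = generate_instructions("show_heart_rate_overlay", "heart_rate")
    print(f"send to {secondary.name}: {instructions}")   # stands in for transmission
```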
Abstract:
A software widget running on a user device may be designed to operate in a locked or an unlocked mode. In unlocked mode, the user has full interactivity with the widget. In locked mode, however, at least some interactivity with the widget is restricted, even though the widget otherwise continues to operate normally. While in locked mode, first user input and second user input may be compared against a predefined unlocking sequence to determine whether the widget should be unlocked.
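The following minimal sketch shows one way the locked-mode comparison might work, assuming the predefined unlocking sequence is an ordered pair of gesture identifiers; the names and gestures used here are illustrative and are not drawn from the description above.

```python
# Minimal sketch, assuming the unlocking sequence is an ordered pair of
# gesture identifiers; UNLOCK_SEQUENCE and the gesture names are assumptions.

UNLOCK_SEQUENCE = ("swipe_left", "double_tap")   # predefined unlocking sequence

class Widget:
    def __init__(self):
        self.locked = True

    def handle_inputs(self, first_input, second_input):
        if self.locked:
            # In locked mode, interactivity is restricted: inputs are only
            # checked against the predefined unlocking sequence.
            if (first_input, second_input) == UNLOCK_SEQUENCE:
                self.locked = False
            return self.locked
        # In unlocked mode, inputs would be passed to normal widget handling.
        return self.locked

widget = Widget()
widget.handle_inputs("tap", "swipe_left")          # stays locked
widget.handle_inputs("swipe_left", "double_tap")   # unlocks
print("locked" if widget.locked else "unlocked")
```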
Abstract:
In an example embodiment, a first aspect of the physical environment of the electronic device, other than location or current time, is determined. A mode of a notification function within the electronic device is then dynamically modified, based on the determined first aspect of the physical environment, such that the mode changes from a first mode in which a notification does not activate a vibration motor in the electronic device to a second mode in which the notification does activate the vibration motor.
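As a hedged illustration, the sketch below uses ambient noise level as an example of an environmental aspect other than location or time; the description does not specify which aspect is used, so the chosen aspect, the threshold, and the mode names are assumptions.

```python
# Hedged sketch: ambient noise level stands in for "a first aspect of the
# physical environment"; the 70 dB threshold and mode names are assumptions.

SILENT = "silent"        # notifications do not activate the vibration motor
VIBRATE = "vibrate"      # notifications activate the vibration motor

def select_notification_mode(ambient_noise_db, current_mode):
    """Switch from the first mode to the second mode when the environment
    becomes loud enough that an audible alert may be missed."""
    if current_mode == SILENT and ambient_noise_db > 70:
        return VIBRATE
    return current_mode

mode = SILENT
mode = select_notification_mode(ambient_noise_db=82, current_mode=mode)
print(mode)   # vibrate: notifications now activate the vibration motor
```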
Abstract:
In various example embodiments, a system and method for data mesh-based wearable device ancillary activity are presented. A device activity being performed in real-time by a user device of a user is detected. Attribute data associated with a plurality of attribute sources is accessed. A user preference for performing, on a secondary user device, a complementary activity corresponding to the device activity is inferred. Based on the inferred user preference, the secondary user device is identified according to a device status of the secondary user device, the device status indicating a device capability to perform the complementary activity in real-time. The complementary activity to be performed in real-time on the secondary user device is generated by analyzing at least one of the device activity, a device functionality of the secondary user device, and the user preference. Instructions to perform the complementary activity in real-time are transmitted to the secondary user device.
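A rough sketch of the inference and device-selection steps is given below; the attribute sources, the scoring rule, and the device status fields are placeholders standing in for the data mesh described above, not details from the embodiment.

```python
# Illustrative sketch only: the attribute data, the "used it before at
# least 3 times" rule, and the status fields are assumptions.

def infer_preference(attribute_data, activity):
    """Infer whether the user prefers a complementary activity on a
    secondary device, e.g. from how often one was used before."""
    past_uses = attribute_data.get(("secondary_used_during", activity), 0)
    return past_uses >= 3

def pick_secondary_device(devices, needed_capability):
    # The device status must show real-time availability and the
    # capability needed for the complementary activity.
    for device in devices:
        status = device["status"]
        if status["online"] and needed_capability in status["capabilities"]:
            return device
    return None

attribute_data = {("secondary_used_during", "browsing_listings"): 5}
devices = [{"name": "watch",
            "status": {"online": True, "capabilities": {"haptic_alert"}}}]

if infer_preference(attribute_data, "browsing_listings"):
    device = pick_secondary_device(devices, "haptic_alert")
    if device:
        print(f"send complementary activity to {device['name']}")
```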
Abstract:
Systems and methods are presented for providing notifications based on user activity data. In some embodiments, a method is presented. The method may include accessing first sensor data associated with a first activity of a user. The method may also include determining that the user is engaged in the first activity based at least in part on the first sensor data, receiving a notification while the user is engaged in the first activity, and determining to hold (that is, not present) the notification while the user is engaged in the first activity. In some embodiments, the method may also include accessing second sensor data associated with the user, determining that the user is no longer engaged in the first activity based at least in part on the second sensor data, and presenting the notification while the user is no longer engaged in the first activity.
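The following minimal sketch illustrates the hold-and-release behavior, assuming engagement is inferred from an accelerometer magnitude threshold; the threshold, the queue, and the choice of sensor are assumptions rather than details from the description.

```python
# Minimal sketch, assuming engagement is inferred from accelerometer
# magnitude; the 1.5 threshold and the held-notification queue are assumptions.

from collections import deque

HELD = deque()

def is_engaged(accel_magnitude):
    # Treat sustained motion above a threshold as "engaged in the activity".
    return accel_magnitude > 1.5

def present(notification):
    print("presenting:", notification)

def on_notification(notification, accel_magnitude):
    if is_engaged(accel_magnitude):
        HELD.append(notification)        # hold while the user is engaged
    else:
        present(notification)

def on_sensor_update(accel_magnitude):
    # When later sensor data shows the user is no longer engaged,
    # present everything that was held.
    if not is_engaged(accel_magnitude):
        while HELD:
            present(HELD.popleft())

on_notification("new message", accel_magnitude=2.3)   # held: user is active
on_sensor_update(accel_magnitude=0.2)                 # user stopped: presented
```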
Abstract:
In an example embodiment, an active pixel sensor on a user device is utilized to capture graphical user interface navigation-related movements by a user. Areas of low luminance can be identified, and movements or alterations in those areas can be translated into navigation commands fed to an application running on the user device.
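As an illustration, the sketch below locates the low-luminance area in two successive grayscale frames and maps its movement to a navigation command; the frame representation, the luminance threshold, and the command names are assumptions for demonstration purposes.

```python
# Illustrative sketch: small grayscale grids stand in for active pixel
# sensor output; the threshold and command names are assumptions.

def dark_centroid(frame, threshold=50):
    """Centroid (row, col) of the low-luminance area in a grayscale frame."""
    points = [(r, c) for r, row in enumerate(frame)
                     for c, v in enumerate(row) if v < threshold]
    if not points:
        return None
    rows, cols = zip(*points)
    return sum(rows) / len(rows), sum(cols) / len(cols)

def to_navigation_command(prev_frame, next_frame):
    """Translate movement of the low-luminance area into a navigation command."""
    a, b = dark_centroid(prev_frame), dark_centroid(next_frame)
    if a is None or b is None:
        return None
    dr, dc = b[0] - a[0], b[1] - a[1]
    if abs(dc) > abs(dr):
        return "scroll_right" if dc > 0 else "scroll_left"
    return "scroll_down" if dr > 0 else "scroll_up"

bright, dark = 200, 10
frame1 = [[bright] * 4 for _ in range(4)]; frame1[1][0] = dark
frame2 = [[bright] * 4 for _ in range(4)]; frame2[1][2] = dark
print(to_navigation_command(frame1, frame2))   # scroll_right
```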