Abstract:
Briefly, a feature-rich touch subsystem is disclosed. The feature-rich touch subsystem includes one or more novel user input capabilities that enhance the user experience.
Abstract:
An embodiment of the present invention provides a method of using probabilistic techniques in trending and profiling of user behavior in order to offer recommendations, comprising detecting patterns in user behavior over time, thereby enabling a personal device associated with said user to predict what the user is likely to do on a given day, or what the user intends to accomplish in an action that has begun.
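The trending-and-profiling approach described above could be realized, in its simplest form, by counting how often each activity occurs on each day of the week and predicting the most frequent one. The following is a minimal illustrative sketch, not the disclosed implementation; the class and method names (`BehaviorProfiler`, `observe`, `predict`) are hypothetical:

```python
from collections import Counter, defaultdict

class BehaviorProfiler:
    """Hypothetical sketch: learn activity frequencies per day of week and
    predict the most probable activity for a given day."""

    def __init__(self):
        # day_of_week -> Counter of observed activities
        self.history = defaultdict(Counter)

    def observe(self, day_of_week, activity):
        """Record one observed user activity."""
        self.history[day_of_week][activity] += 1

    def predict(self, day_of_week):
        """Return (most likely activity, its empirical probability)."""
        counts = self.history[day_of_week]
        if not counts:
            return None, 0.0
        total = sum(counts.values())
        activity, n = counts.most_common(1)[0]
        return activity, n / total
```

For example, after observing "gym" twice and "cafe" once on Mondays, the profiler would predict "gym" with probability 2/3. A real system would condition on richer context (time of day, location, in-progress actions) rather than day of week alone.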
Abstract:
Briefly, a method and apparatus for recognizing movement of one or more touches across a location on a keyboard grid on a touch panel interface is disclosed. The method may include receiving user input with a touch panel interface, recognizing movement of one or more touches across a location on a keyboard grid on the touch panel interface, and performing an action associated with the movement of one or more touches across the location on the keyboard grid on the touch panel interface.
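One plausible reading of "movement across a location on a keyboard grid" is a touch trace that enters a key's grid cell and then leaves it. The sketch below illustrates that interpretation only; the 40x40-pixel cell size and the function names are assumptions, not taken from the disclosure:

```python
def cell_of(x, y, cell_w=40, cell_h=40):
    """Map a touch coordinate to a (row, col) cell on an assumed keyboard grid."""
    return (int(y // cell_h), int(x // cell_w))

def crosses_cell(trace, target_cell, cell_w=40, cell_h=40):
    """Return True if a touch trace (list of (x, y) samples) moves across the
    target cell, i.e. it enters the cell and subsequently exits it."""
    entered = exited = False
    for x, y in trace:
        inside = cell_of(x, y, cell_w, cell_h) == target_cell
        if inside:
            entered = True
        elif entered:
            exited = True
    return entered and exited
```

A recognizer built this way would trigger the associated action whenever `crosses_cell` fires for a key bound to that action.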
Abstract:
An embodiment of the present invention provides a method of template-based prediction and recommendation, comprising utilizing templates that consist of a sequence of activities or locations to characterize a user's day by a personal device, wherein as the user goes about a day, the personal device attempts to match pre-existing templates to the user's location and activities, assigning a probability to each template; and using the matching templates to predict what the user will do next and thus narrow down a set of logical recommendations.
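The template-matching scheme above can be sketched as prefix matching: each template is a sequence of locations or activities, each is scored by how much of the observed day it matches, and scores are normalized into probabilities. This is a minimal illustration under those assumptions; the template contents and function names are hypothetical:

```python
def score_templates(templates, observed):
    """Assign each template a probability proportional to the length of its
    prefix that matches the activities observed so far today."""
    scores = {}
    for name, seq in templates.items():
        match = 0
        for a, b in zip(seq, observed):
            if a != b:
                break
            match += 1
        scores[name] = match
    total = sum(scores.values())
    if total == 0:
        # nothing matches yet: fall back to a uniform distribution
        return {name: 1 / len(templates) for name in templates}
    return {name: s / total for name, s in scores.items()}

def predict_next(templates, observed):
    """Use the best-matching template to predict the user's next step."""
    probs = score_templates(templates, observed)
    best = max(probs, key=probs.get)
    seq = templates[best]
    i = len(observed)
    return seq[i] if i < len(seq) else None
```

For instance, with a "workday" template of home, commute, office, gym and an observed morning of home then commute, the workday template scores highest and "office" is predicted next, narrowing the set of logical recommendations accordingly.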
Abstract:
Systems and methods may provide for displaying a three-dimensional (3D) environment on a screen of a mobile device, and identifying a user interaction with an area behind the mobile device. In addition, the 3D environment can be modified based at least in part on the user interaction. Moreover, the 3D environment may be modified based on movements of the mobile device as well as user interactions with the mobile device, allowing the user to navigate through the virtual 3D environment by moving the mobile/handheld device.
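Navigating a virtual 3D environment by moving the handheld device amounts, at its simplest, to mapping physical device displacement onto virtual camera displacement. The sketch below illustrates that mapping only, assuming displacement deltas are already available from the device's motion sensors; the class name and scale factor are hypothetical:

```python
class Camera3D:
    """Hypothetical sketch: translate physical device motion into
    virtual-camera motion within a 3D environment."""

    def __init__(self):
        self.position = [0.0, 0.0, 0.0]  # camera position in world units

    def apply_device_motion(self, dx, dy, dz, scale=1.0):
        """Move the camera by the device's displacement; a scale factor lets
        small physical motions produce larger virtual moves."""
        self.position[0] += dx * scale
        self.position[1] += dy * scale
        self.position[2] += dz * scale
        return tuple(self.position)
```

A complete system would also fold in device orientation and the behind-device interactions described above; this sketch covers only the translational part.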
Abstract:
Mobile device rejection of unintentional sensor contact. An embodiment of a mobile device includes a first touch sensor to detect contact by a user of the mobile device for input of gestures by the user, a memory to store indicators of unintentional contact to the first touch sensor, and a processor to evaluate contact to the first touch sensor. The processor compares a contact with the first touch sensor to the indicators of unintentional contact to determine if the contact is unintentional, and the mobile device rejects the contact as an input to the mobile device if the contact is determined to be unintentional and accepts the contact as an input to the mobile device if the contact is determined to be intentional.
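The rejection logic above, comparing a contact against stored indicators of unintentional input, can be sketched as a set of threshold checks. The particular indicators below (contact area, duration, distance from the screen edge) and all field names are illustrative assumptions, not taken from the disclosure:

```python
def is_unintentional(contact, indicators):
    """Compare one contact against stored indicators of unintentional input.
    `contact` and `indicators` are dicts with assumed, hypothetical fields."""
    if contact["area_mm2"] > indicators["max_area_mm2"]:
        return True  # palm-sized blob, likely not a deliberate touch
    if contact["duration_ms"] < indicators["min_duration_ms"]:
        return True  # too brief to be an intended tap
    if contact["edge_distance_mm"] < indicators["min_edge_distance_mm"]:
        return True  # grip contact along the bezel
    return False

def handle_contact(contact, indicators):
    """Reject contacts judged unintentional; accept the rest as input."""
    return "rejected" if is_unintentional(contact, indicators) else "accepted"
```

A large, off-center palm contact would be rejected while a brief fingertip tap near the screen center would be accepted, matching the accept/reject behavior the abstract describes.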
Abstract:
Techniques for gesture-based device connections are described. For example, a method may comprise receiving video data corresponding to motion of a first computing device, receiving sensor data corresponding to motion of the first computing device, comparing, by a processor, the video data and the sensor data to one or more gesture models, and initiating establishment of a wireless connection between the first computing device and a second computing device if the video data and sensor data correspond to gesture models for the same gesture. Other embodiments are described and claimed.
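The agreement check above, pairing two devices only when camera-derived and sensor-derived motion both match the same gesture model, can be sketched with a simple nearest-model comparison. The fixed-length motion signatures and L1 distance below are illustrative simplifications, not the disclosed matching method:

```python
def closest_gesture(trace, models):
    """Return the name of the gesture model whose signature is nearest
    (by L1 distance) to the observed motion trace."""
    def dist(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(models, key=lambda name: dist(trace, models[name]))

def should_connect(video_trace, sensor_trace, models):
    """Initiate pairing only when the video data and the sensor data
    independently match the same gesture model."""
    return closest_gesture(video_trace, models) == closest_gesture(sensor_trace, models)
```

Requiring both independent data streams to agree before establishing the wireless connection is what makes the scheme robust: a gesture seen on camera but not felt by the device's own sensors (or vice versa) does not trigger pairing.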
Abstract:
Briefly, a method and apparatus for recognizing multi-shape, multi-touch gestures including finger and non-finger touches input to a touch panel interface is disclosed. The method may include receiving user input with a touch panel interface, recognizing a multi-shape, multi-touch gesture including finger and non-finger touches in the user input, and performing an action associated with the multi-touch gesture including finger and non-finger touches.
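Distinguishing finger from non-finger touches is commonly approached by classifying each contact's shape or area. The sketch below illustrates one such scheme for a gesture that mixes the two; the area thresholds, the "palm anchor + finger swipe" gesture, and all names are hypothetical assumptions, not from the disclosure:

```python
def classify_touch(area_mm2):
    """Crude shape classifier by contact area (assumed thresholds):
    small blob = stylus, medium = finger, large = palm/edge of hand."""
    if area_mm2 < 20:
        return "stylus"
    if area_mm2 < 150:
        return "finger"
    return "palm"

def recognize_gesture(touches):
    """Recognize a hypothetical multi-shape gesture that requires one palm
    contact and one finger contact at the same time."""
    shapes = sorted(classify_touch(t["area_mm2"]) for t in touches)
    if shapes == ["finger", "palm"]:
        return "palm_anchor_swipe"
    return None
```

Because the classifier labels each touch by shape before gesture matching, the same two-contact geometry produces different gestures depending on whether the contacts are fingers, a stylus, or a palm.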
Abstract:
Briefly, a method and apparatus for recognizing temporal tapping patterns input to a touch panel interface is disclosed. The method may include receiving user input with a touch panel interface, recognizing a temporal tapping pattern in the user input, and performing an action associated with the temporal tapping pattern.
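A temporal tapping pattern can be sketched as the sequence of gaps between tap timestamps, quantized into short and long intervals and looked up in a binding table. The 400 ms threshold and the pattern/action bindings below are illustrative assumptions only:

```python
def tap_pattern(timestamps_ms, long_gap_ms=400):
    """Quantize inter-tap gaps into 'S' (short) / 'L' (long) symbols,
    e.g. three quick taps -> 'SS'."""
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return "".join("L" if g >= long_gap_ms else "S" for g in gaps)

def action_for_taps(timestamps_ms, bindings):
    """Return the action bound to the recognized tapping pattern, if any."""
    return bindings.get(tap_pattern(timestamps_ms))
```

With bindings such as `{"SS": "open_camera", "SL": "mute"}`, two quick taps followed by a delayed third would trigger "mute" while three quick taps would trigger "open_camera".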
Abstract:
Transforming mobile device sensor interaction to represent user intent and perception. An embodiment of a mobile device includes a display screen for the display of data and images and a touch sensor to detect a motion of a gesture made by a thumb or other finger of a user of the device. The mobile device further includes a module to transform the motion detected by the touch sensor to generate a modified motion to reflect a perception of the user, where the modified motion is to be applied as an input relating to the display screen.
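One concrete instance of transforming detected motion to match user perception is axis snapping: a thumb dragging "straight down" actually arcs, so when one motion component clearly dominates, the minor component is zeroed out. The dominance threshold and function name below are assumptions for illustration, not the disclosed transformation:

```python
def transform_motion(dx, dy, dominance=2.0):
    """Transform a raw touch-sensor delta into the motion the user likely
    perceives: if one axis dominates the other by an assumed 2x factor,
    suppress the minor axis so an arcing thumb still scrolls straight."""
    if abs(dy) >= dominance * abs(dx):
        return (0.0, dy)   # perceived as vertical
    if abs(dx) >= dominance * abs(dy):
        return (dx, 0.0)   # perceived as horizontal
    return (dx, dy)        # genuinely diagonal; pass through unchanged
```

The modified motion, rather than the raw sensor motion, is then applied as the input to the display, which is the substitution the abstract describes.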