Abstract:
Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
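The abstract above describes presenting a control interface either when a finger hovers near the screen or when a hovering finger stops moving. The following is a minimal Python sketch of that behavior under stated assumptions: the CapacitiveSample type, the HoverGestureDetector class, and all numeric thresholds (touch/hover magnitudes, dwell time, motion tolerance) are illustrative and are not taken from the abstract.

```python
from dataclasses import dataclass
from typing import Callable, Optional
import time

# Hypothetical reading from a capacitive touch controller: a signal
# magnitude plus the (x, y) position of the strongest response.
@dataclass
class CapacitiveSample:
    magnitude: float  # sensed capacitance (arbitrary units)
    x: float
    y: float

class HoverGestureDetector:
    """Presents a control interface when a finger hovers above the
    screen (proximate but not touching), or when a hovering finger
    stays still (absence of motion) for a dwell period."""

    TOUCH_THRESHOLD = 0.80   # assumed: magnitude at/above this means contact
    HOVER_THRESHOLD = 0.25   # assumed: magnitude above this means proximity
    DWELL_SECONDS = 0.5      # assumed: stillness needed for "absence of motion"
    MOTION_EPSILON = 5.0     # assumed: max movement (px) still counted as "still"

    def __init__(self, present_control_interface: Callable[[float, float], None]):
        self.present = present_control_interface
        self._last: Optional[CapacitiveSample] = None
        self._still_since: Optional[float] = None

    def on_sample(self, s: CapacitiveSample, now: Optional[float] = None) -> None:
        now = time.monotonic() if now is None else now
        hovering = self.HOVER_THRESHOLD < s.magnitude < self.TOUCH_THRESHOLD
        if not hovering:  # finger is touching the screen or out of range
            self._last, self._still_since = None, None
            return
        if self._last is None:  # first hovering sample; start tracking
            self._last, self._still_since = s, now
            return
        moved = abs(s.x - self._last.x) + abs(s.y - self._last.y)
        if moved > self.MOTION_EPSILON:
            # Finger moved while hovering: treat as a hover gesture.
            self.present(s.x, s.y)
            self._still_since = now
        elif now - self._still_since >= self.DWELL_SECONDS:
            # No motion while hovering: present after the dwell period.
            self.present(s.x, s.y)
            self._still_since = now
        self._last = s
```

In this sketch the same callback is fired in both branches; a real implementation could route the two cases (hover gesture versus absence of motion) to different control interfaces.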
Abstract:
In non-limiting examples of the present disclosure, systems, methods, and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may differ between execution of the touchscreen operations and execution of the touchpad operations.
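As a rough illustration of how a shared set of gesture conditions could drive two input-specific managers, consider the Python sketch below. The Operation names, the (finger count, direction) conditions, and the GestureManager structure are hypothetical; the abstract does not specify concrete gestures or data structures.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict, Tuple

class Operation(Enum):
    MAXIMIZE_WINDOW = auto()         # application window operation
    MINIMIZE_WINDOW = auto()         # application window operation
    SWITCH_VIRTUAL_DESKTOP = auto()  # virtual desktop transition

# A gesture "condition" here is the finger count plus swipe direction.
# These particular mappings are illustrative only.
Condition = Tuple[int, str]

SHARED_CONDITIONS: Dict[Condition, Operation] = {
    (3, "up"): Operation.MAXIMIZE_WINDOW,
    (3, "down"): Operation.MINIMIZE_WINDOW,
    (4, "left"): Operation.SWITCH_VIRTUAL_DESKTOP,
}

@dataclass
class GestureManager:
    """Maps gesture conditions to operation identities and pairs each
    operation with an input-source-specific presentation (animation)."""
    name: str
    conditions: Dict[Condition, Operation]
    animations: Dict[Operation, Callable[[], None]]

    def execute(self, fingers: int, direction: str) -> None:
        op = self.conditions.get((fingers, direction))
        if op is None:
            return
        self.animations[op]()  # presentation may differ per input source

# Both managers share the same conditions; only the animations differ.
touchpad_manager = GestureManager(
    "touchpad", SHARED_CONDITIONS,
    {op: (lambda op=op: print(f"touchpad animation for {op.name}"))
     for op in SHARED_CONDITIONS.values()},
)
touchscreen_manager = GestureManager(
    "touchscreen", SHARED_CONDITIONS,
    {op: (lambda op=op: print(f"touchscreen animation for {op.name}"))
     for op in SHARED_CONDITIONS.values()},
)

touchscreen_manager.execute(3, "up")  # -> touchscreen animation for MAXIMIZE_WINDOW
```

Keeping a single shared condition table while swapping only the presentation table is one plausible way to satisfy the requirement that conditions match across input sources while display elements and animations differ.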
Abstract:
Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
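A short Python sketch of the two ideas above follows: deriving a finger-to-screen distance from the magnitude of a sensed interaction, and mapping virtual elements to areas in a plane parallel to the touch screen so that a touch or hover command within an area selects the corresponding element. The inverse-square falloff model, the rectangular element areas, and all names and values are assumptions made for illustration and are not drawn from the abstract.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VirtualElement:
    name: str
    # Rectangular area in a plane parallel to (here, coincident with)
    # the touch screen, in screen coordinates.
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Assumed, illustrative model: sensed capacitance falls off roughly with
# the inverse square of finger height, so a height value can be derived
# from the magnitude of the measured interaction.
def estimate_height(magnitude: float, k: float = 1.0) -> float:
    return (k / magnitude) ** 0.5 if magnitude > 0 else float("inf")

def resolve_selection(elements: List[VirtualElement],
                      x: float, y: float,
                      magnitude: float,
                      max_height: float = 2.0) -> Optional[VirtualElement]:
    """Returns the element whose mapped area contains the touch/hover
    point, provided the derived finger height is close enough."""
    if estimate_height(magnitude) > max_height:
        return None
    for element in elements:
        if element.contains(x, y):
            return element
    return None

keys = [VirtualElement("OK", 0, 0, 100, 40),
        VirtualElement("Cancel", 110, 0, 210, 40)]
print(resolve_selection(keys, 150, 20, magnitude=0.9))  # -> the "Cancel" element
```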