Abstract:
Apparatus and methods are disclosed for simultaneously tracking multiple finger and palm contacts as hands approach, touch, and slide across a proximity-sensing, multi-touch surface. Identification and classification of intuitive hand configurations and motions enables unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting into a versatile, ergonomic computer input device.
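As a rough illustration of the kind of tracking and classification this abstract describes (not the patented algorithm itself), the sketch below carries contact identities between frames by greedy nearest-neighbour matching and maps the number of moving contacts to a coarse input mode. All function names, distances, and thresholds are invented for the example.

    # Illustrative sketch only: persistent contact ids via nearest-neighbour
    # matching between frames, plus a crude chord/mode classifier.
    import math

    def match_contacts(prev, curr, max_jump=40.0):
        """prev: {contact_id: (x, y)} from the last frame.
           curr: [(x, y), ...] detected this frame.
           Returns {contact_id: (x, y)} with ids carried over where possible."""
        matched = {}
        unused = dict(prev)
        next_id = max(prev, default=-1) + 1
        for (x, y) in curr:
            best_id, best_d = None, max_jump
            for cid, (px, py) in unused.items():
                d = math.hypot(x - px, y - py)
                if d < best_d:
                    best_id, best_d = cid, d
            if best_id is None:              # no prior contact close enough: new touch
                best_id, next_id = next_id, next_id + 1
            else:
                del unused[best_id]
            matched[best_id] = (x, y)
        return matched

    def classify_chord(prev, curr, move_thresh=2.0):
        """Very rough mode guess from how many tracked contacts are moving."""
        if not curr:
            return "no hands"
        moving = sum(
            1 for cid, (x, y) in curr.items()
            if cid in prev and math.hypot(x - prev[cid][0], y - prev[cid][1]) > move_thresh
        )
        if moving == 0:
            return "resting"
        if moving == 1:
            return "pointing"
        if moving == 2:
            return "scrolling / 2D gesture"
        return "multi-finger manipulation"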
Abstract:
The detection of a palm touch on a touch surface, such as a mouse surface, is disclosed. A palm touch can be determined as a touch on the touch surface whose radius exceeds a predetermined palm touch radius. Alternatively, a palm touch can be determined as a touch located farther from the other touches than the expected distance between finger touches.
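A minimal sketch of the two palm tests described above, assuming each touch is reported with a centroid and a radius; the thresholds and field names are illustrative assumptions, not values from the disclosure.

    # Illustrative palm test: large radius, or isolation from all other touches.
    import math

    PALM_RADIUS = 12.0          # mm; anything wider is treated as a palm (assumed)
    MAX_FINGER_SPACING = 80.0   # mm; fingers of one hand stay within this span (assumed)

    def is_palm(touch, other_touches):
        """touch: dict with 'x', 'y', 'radius'. other_touches: list of the same."""
        # Test 1: a contact patch larger than any plausible fingertip.
        if touch["radius"] > PALM_RADIUS:
            return True
        # Test 2: a contact lying farther from every other touch than fingers
        # on one hand could plausibly be spread.
        if other_touches and all(
            math.hypot(touch["x"] - t["x"], touch["y"] - t["y"]) > MAX_FINGER_SPACING
            for t in other_touches
        ):
            return True
        return False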
Abstract:
Pre-stored no-touch or no-hover (no-event) sensor output values can initially be used when a sensor panel subsystem is first booted up to establish an initial baseline of sensor output values unaffected by fingers or other objects touching or hovering over the sensor panel during boot-up. This initial baseline can then be normalized so that each sensor generates the same output value for a given amount of touch or hover, providing a uniform response across the sensor panel and enabling subsequent touch or hover events to be more easily detected. After the initial normalization process is complete, the pre-stored baseline can be discarded in favor of a newly captured no-event baseline that may be more accurate than the pre-stored baseline due to temperature or other variations.
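The boot-time baselining could look roughly like the sketch below, which assumes the panel is read as a 2D array of sensor values. The gain model (one multiplicative factor per sensor) and the placeholder scan data are assumptions made for illustration only.

    # Illustrative boot sequence: pre-stored baseline -> per-sensor gains ->
    # later swap in a freshly captured no-event baseline.
    import numpy as np

    def normalization_gains(prestored_baseline, target=1.0):
        """Per-sensor gain so every sensor reports `target` for a no-event scan."""
        return target / prestored_baseline

    def normalize(raw_image, gains, baseline):
        """Subtract the current baseline and flatten per-sensor sensitivity."""
        return (raw_image - baseline) * gains

    # Placeholder data standing in for real panel scans:
    prestored = np.array([[0.98, 1.05], [1.02, 0.96]])   # stored before boot
    gains = normalization_gains(prestored)

    first_scan = np.array([[0.99, 1.04], [1.03, 0.97]])  # panel may be touched here
    touch_image = normalize(first_scan, gains, prestored)

    # Later, once the panel is known to be untouched, capture a fresh baseline
    # reflecting current temperature and drift, and discard the pre-stored one.
    fresh_baseline = np.array([[1.00, 1.06], [1.01, 0.95]])
    next_scan = np.array([[1.00, 1.07], [1.02, 0.96]])
    touch_image = normalize(next_scan, gains, fresh_baseline)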
Abstract:
Techniques for identifying and discriminating between different types of contacts to a multi-touch touch-screen device are described. Illustrative contact types include fingertips, thumbs, palms and cheeks. By way of example, thumb contacts may be distinguished from fingertip contacts using a patch eccentricity parameter. In addition, by non-linearly deemphasizing pixels in a touch-surface image, a reliable means of distinguishing large objects (e.g., palms) from smaller objects (e.g., fingertips, thumbs and a stylus) is described.
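The sketch below illustrates both ideas under simple assumptions: eccentricity is taken from intensity-weighted second moments of a contact patch, and the non-linear deemphasis is stood in for by a square-root compression of each pixel before summing. The thresholds and the exact non-linearity are invented for the example, not taken from the disclosure.

    # Illustrative contact-type discrimination on a per-patch pixel image.
    import numpy as np

    def patch_eccentricity(patch):
        """Ratio of major to minor axis from intensity-weighted second moments."""
        ys, xs = np.nonzero(patch)
        w = patch[ys, xs].astype(float)
        cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
        cov = np.cov(np.vstack([xs - cx, ys - cy]), aweights=w)
        eigvals = np.linalg.eigvalsh(cov)          # ascending order
        return np.sqrt(eigvals[1] / max(eigvals[0], 1e-9))

    def deemphasized_mass(patch):
        """Sum of non-linearly compressed pixels: large patches (palms) still
        accumulate far more 'mass' than fingertips, but saturated peaks matter less."""
        return float(np.sqrt(np.clip(patch, 0, None)).sum())

    def classify_patch(patch, thumb_ecc=1.4, palm_mass=150.0):
        # Thresholds are invented for illustration.
        if deemphasized_mass(patch) > palm_mass:
            return "palm"
        return "thumb" if patch_eccentricity(patch) > thumb_ecc else "fingertip"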
Abstract:
Disclosed herein is a capacitive touch sensitive device. One aspect of the touch sensitive device described herein is a reduction in the number of sensor circuits needed for circular or linear capacitive touch sensitive devices while maintaining the same resolution and absolute position determination for a single object. A related aspect of the touch sensitive device described herein is a coding pattern that allows each sensor circuit of a capacitive touch sensitive device to share multiple electrodes at specially chosen locations in a sensor array such that the ability to determine the absolute position of a single object over the array is not compromised.
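One way to picture such a coding pattern (an invented example, not the disclosed design) is a linear array in which each electrode is wired to one of a small number of sensor circuits and the wiring sequence never repeats an adjacent channel pair; a finger bridging two neighbouring electrodes then activates a pair of channels that uniquely identifies its absolute position. The channel sequence, channel count, and decoding below are illustrative assumptions.

    # Illustrative electrode-sharing code: 6 electrodes served by 4 sensor channels.
    SEQUENCE = "ABCADB"   # channel wired to electrodes 0..5 (hypothetical pattern)

    # Precompute unordered pair -> position; the sequence must not repeat a pair.
    PAIR_TO_POS = {}
    for i in range(len(SEQUENCE) - 1):
        pair = frozenset(SEQUENCE[i:i + 2])
        assert pair not in PAIR_TO_POS, "adjacent channel pairs must be unique"
        PAIR_TO_POS[pair] = i

    def decode_position(signals):
        """signals: {'A': value, ...} from the four sensor circuits. A finger
        bridging electrodes i and i+1 drives exactly two channels; the unique
        pair identifies i, and the signal ratio interpolates between them."""
        top = sorted(signals, key=signals.get, reverse=True)[:2]
        if len(top) < 2 or signals[top[1]] <= 0:
            return None
        i = PAIR_TO_POS.get(frozenset(top))
        if i is None:
            return None
        a, b = SEQUENCE[i], SEQUENCE[i + 1]
        frac = signals[b] / (signals[a] + signals[b])   # 0 at electrode i, 1 at i+1
        return i + frac

    # e.g. decode_position({'A': 0.7, 'B': 0.3, 'C': 0.0, 'D': 0.0}) -> 0.3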
Abstract:
The identification of low noise stimulation frequencies for detecting and localizing touch events on a touch sensor panel is disclosed. Each of a plurality of sense channels can be coupled to a separate sense line in a touch sensor panel and can have multiple mixers, each mixer using a demodulation frequency of a particular frequency, phase and delay. With no stimulation signal applied to any drive lines in the touch sensor panel, pairs of mixers can demodulate the sum of the output of all sense channels using the in-phase (I) and quadrature (Q) signals of a particular frequency. The demodulated outputs of each mixer pair can be used to calculate the magnitude of the noise at that particular frequency, wherein the lower the magnitude, the lower the noise at that frequency. Several low noise frequencies can be selected for use in a subsequent touch sensor panel scan function.
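A software sketch of that spectral survey might look as follows, assuming the summed sense-channel output has already been sampled into an array with no stimulation applied; the sample rate, candidate frequencies, and stand-in noise capture are illustrative assumptions.

    # Illustrative low-noise frequency survey via I/Q demodulation.
    import numpy as np

    def noise_magnitude(summed_sense, fs, freq):
        """I/Q demodulate at `freq` and return the noise magnitude sqrt(I^2 + Q^2)."""
        t = np.arange(len(summed_sense)) / fs
        i = np.mean(summed_sense * np.cos(2 * np.pi * freq * t))   # in-phase mixer
        q = np.mean(summed_sense * np.sin(2 * np.pi * freq * t))   # quadrature mixer
        return np.hypot(i, q)

    def pick_low_noise_frequencies(summed_sense, fs, candidates, count=3):
        mags = {f: noise_magnitude(summed_sense, fs, f) for f in candidates}
        return sorted(mags, key=mags.get)[:count]   # lowest magnitude = lowest noise

    # Example with placeholder data and candidate frequencies of a few hundred kHz:
    fs = 4_000_000
    noise_capture = np.random.default_rng(0).normal(size=4096)
    candidates = [200_000, 250_000, 300_000, 350_000, 400_000]
    low_noise = pick_low_noise_frequencies(noise_capture, fs, candidates)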
Abstract:
Multi-event input systems, methods, and devices for use in connection with touch-sensitive electronic devices are disclosed. In accordance with certain embodiments of the present disclosure, a third state called “hover” can be achieved on a touch-based user interface device in addition to the states of pointer down and pointer up. In an embodiment involving a capacitive touch-sensing surface, one way to achieve the third state is for the user to contact the touchpad or touch screen with a non-flesh part of a finger, such as a fingernail, rather than the fleshy part of a finger. In other embodiments, the non-flesh part may comprise an electrically insulative layer covering a portion of a finger. The third state enables an adjunct system's user interface to achieve active navigation around the screen in a pointer-up (or left-up) input tool condition. One result is that mouseover pop-ups and tooltips can be used, and text can be selected, on touch screen devices.
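A minimal sketch of how the three states might be separated, assuming the controller reports a normalized signal strength in which flesh couples much more strongly than a fingernail or an insulated fingertip; both thresholds are invented for illustration.

    # Illustrative three-state classification from a normalized touch signal.
    FLESH_THRESHOLD = 0.60    # assumed strength for a full "pointer down"
    NAIL_THRESHOLD = 0.15     # assumed weaker coupling: contact without "pointer down"

    def pointer_state(signal):
        """Map a normalized touch signal to one of three input states."""
        if signal >= FLESH_THRESHOLD:
            return "pointer down"          # fleshy fingertip contact
        if signal >= NAIL_THRESHOLD:
            return "hover"                 # fingernail or insulated contact
        return "pointer up"                # nothing on the surface

    # A hover state lets the UI show a tooltip or mouseover pop-up, or drive
    # text selection, without committing to a click:
    for s in (0.05, 0.30, 0.85):
        print(s, "->", pointer_state(s))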