Abstract:
Systems and processes for scanning a user interface are disclosed. One process can include scanning multiple elements within a user interface by highlighting the elements. The process can further include receiving a selection while one of the elements is highlighted and performing an action on the element that was highlighted when the selection was received. The action can include scanning the contents of the selected element or performing an action associated with the selected element. The process can be used to navigate an array of application icons, a menu of options, a standard desktop or laptop operating system interface, or the like. The process can also be used to perform gestures (e.g., flick, tap, or freehand gestures) on a touch-sensitive device, mouse, or track pad.
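As a rough illustration of the scanning loop described in this abstract, the sketch below steps a highlight through a list of elements and, when a selection arrives, either activates the highlighted element or begins scanning its contents. The `ScannableElement` protocol, the `ElementScanner` class, and the timer-driven `advanceHighlight()` entry point are all assumptions made for illustration, not the patented implementation.

```swift
import Foundation

// Hypothetical model of a scannable on-screen element; a real
// implementation would wrap actual UI objects.
protocol ScannableElement {
    var label: String { get }
    /// Child elements, if this element is a container (e.g., a menu).
    var children: [ScannableElement] { get }
    /// The action associated with the element (e.g., launching an app).
    func activate()
}

final class ElementScanner {
    private var elements: [ScannableElement]
    private var highlightedIndex = 0

    init(elements: [ScannableElement]) {
        self.elements = elements
    }

    /// Advance the highlight to the next element; in a real scanning
    /// interface this would be driven by a timer.
    func advanceHighlight() {
        guard !elements.isEmpty else { return }
        highlightedIndex = (highlightedIndex + 1) % elements.count
        print("Highlighting: \(elements[highlightedIndex].label)")
    }

    /// Handle a selection (e.g., a switch press) received while an
    /// element is highlighted.
    func select() {
        guard !elements.isEmpty else { return }
        let element = elements[highlightedIndex]
        if element.children.isEmpty {
            // Leaf element: perform its associated action.
            element.activate()
        } else {
            // Container element: begin scanning its contents.
            elements = element.children
            highlightedIndex = 0
            print("Scanning contents of: \(element.label)")
        }
    }
}

struct AppIcon: ScannableElement {
    let label: String
    var children: [ScannableElement] { [] }
    func activate() { print("Launching \(label)") }
}

let scanner = ElementScanner(elements: [AppIcon(label: "Mail"), AppIcon(label: "Photos")])
scanner.advanceHighlight()   // Highlighting: Photos
scanner.select()             // Launching Photos
```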
Abstract:
The present disclosure relates to screenreader techniques and volume control techniques for electronic devices. In some embodiments, a device displays a plurality of user interface objects in an ordered progression. A rotation of a rotary input mechanism is detected. In response to the rotation of the rotary input mechanism, if a rotary screenreader navigation mode is activated, a visual highlight is displayed and an auditory output is produced. In some embodiments, a device has a volume setting. A gesture is detected, and a volume adjustment mode is activated. The gesture ends with a contact being maintained, and the volume setting is adjusted in accordance with detected movement of the contact.
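A minimal sketch of the rotary screenreader navigation described above, assuming the ordered progression of user interface objects is reduced to a list of labels and that a `speak` helper stands in for speech synthesis; the volume adjustment mode is not modeled, and none of the names here reflect the device's actual APIs.

```swift
import Foundation

final class RotaryScreenreader {
    var navigationModeActive = false       // rotary screenreader navigation mode
    private let objects: [String]          // ordered progression of UI objects
    private var highlightedIndex = 0

    init(objects: [String]) {
        self.objects = objects
    }

    /// Called when the rotary input mechanism rotates.
    /// `steps` is positive for one direction, negative for the other.
    func handleRotation(steps: Int) {
        guard navigationModeActive, !objects.isEmpty else { return }
        let count = objects.count
        // Move the visual highlight through the ordered progression.
        highlightedIndex = ((highlightedIndex + steps) % count + count) % count
        // Produce the corresponding auditory output.
        speak(objects[highlightedIndex])
    }

    private func speak(_ text: String) {
        print("Speaking: \(text)")   // placeholder for speech synthesis
    }
}

let reader = RotaryScreenreader(objects: ["Clock", "Weather", "Activity"])
reader.navigationModeActive = true
reader.handleRotation(steps: 1)    // Speaking: Weather
reader.handleRotation(steps: -1)   // Speaking: Clock
```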
Abstract:
The present disclosure generally relates to methods and devices for providing touch accommodations to users with tremors or other fine motor impairments to improve the accuracy of such users' touch inputs on touch-sensitive surfaces. Such methods and devices include various approaches for compensating for brief, inadvertent touch inputs; touch inputs with inadvertent motion across the touch-sensitive surface; and/or touch inputs with inadvertent recoil contacts. In some embodiments, the touch accommodations are implemented in a software layer separate from the application layer, such as the operating system.
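The filtering idea behind two of these accommodations can be sketched as below, assuming a completed touch is summarized by its hold duration and the time since the previous accepted touch. The threshold values are illustrative only, and compensation for inadvertent motion (e.g., reporting the initial contact location rather than rejecting the touch) is omitted for brevity.

```swift
import Foundation

struct CompletedTouch {
    let duration: TimeInterval              // how long the contact was held
    let timeSincePreviousTouch: TimeInterval
}

struct TouchAccommodationSettings {
    var holdDuration: TimeInterval = 0.5    // minimum hold to count as deliberate
    var ignoreRepeat: TimeInterval = 0.3    // window for discarding recoil contacts
}

/// Decides, below the application layer, whether a completed touch should
/// be delivered to the app or discarded as inadvertent.
func shouldDeliver(_ touch: CompletedTouch,
                   settings: TouchAccommodationSettings) -> Bool {
    // Brief, inadvertent contacts are dropped.
    guard touch.duration >= settings.holdDuration else { return false }
    // Contacts that immediately follow a previous touch are treated as recoil.
    guard touch.timeSincePreviousTouch >= settings.ignoreRepeat else { return false }
    return true
}

// A quick 0.1 s tap that lands right after a previous touch is ignored.
let tap = CompletedTouch(duration: 0.1, timeSincePreviousTouch: 0.2)
print(shouldDeliver(tap, settings: TouchAccommodationSettings()))   // false
```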
Abstract:
An electronic device, while in an interaction configuration mode: displays a first user interface that includes a plurality of user interface objects; and, while displaying the first user interface, detects one or more gesture inputs on a touch-sensitive surface. For a respective gesture input, the device determines whether one or more user interface objects of the plurality of user interface objects correspond to the respective gesture input. The device visually distinguishes a first set of user interface objects in the plurality of user interface objects that correspond to the detected one or more gesture inputs from a second set of user interface objects in the plurality of user interface objects that do not correspond to the detected one or more gesture inputs. The device detects an input; and, in response to detecting the input, exits the interaction configuration mode and enters a restricted interaction mode.
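One way to picture the configuration flow in this abstract is the sketch below, which assumes rectangular user interface objects and gesture inputs reduced to the rectangular areas they enclose; the type and method names are hypothetical and stand in for whatever the device actually uses to visually distinguish and disable objects.

```swift
import Foundation

struct Rect {
    var x, y, width, height: Double
    func intersects(_ other: Rect) -> Bool {
        x < other.x + other.width && other.x < x + width &&
            y < other.y + other.height && other.y < y + height
    }
}

struct UIObject {
    let name: String
    let frame: Rect
}

enum Mode { case interactionConfiguration, restrictedInteraction }

final class RestrictedInteractionController {
    private(set) var mode: Mode = .interactionConfiguration
    private(set) var disabledObjects: [UIObject] = []
    let objects: [UIObject]

    init(objects: [UIObject]) { self.objects = objects }

    /// A gesture input is reduced to the area it encloses; objects whose
    /// frames fall in that area correspond to the gesture and would be
    /// visually distinguished (e.g., dimmed) in the first user interface.
    func applyGesture(enclosing area: Rect) {
        let matching = objects.filter { $0.frame.intersects(area) }
        disabledObjects.append(contentsOf: matching)
        matching.forEach { print("Visually distinguishing: \($0.name)") }
    }

    /// Exits the interaction configuration mode and enters the restricted
    /// interaction mode.
    func finishConfiguration() { mode = .restrictedInteraction }

    /// In restricted interaction mode, touches on disabled objects are ignored.
    func handleTap(on object: UIObject) {
        guard mode == .restrictedInteraction else { return }
        if disabledObjects.contains(where: { $0.name == object.name }) {
            print("Ignoring tap on \(object.name)")
        } else {
            print("Activating \(object.name)")
        }
    }
}

let home = UIObject(name: "Home Button Area", frame: Rect(x: 0, y: 90, width: 100, height: 10))
let controller = RestrictedInteractionController(objects: [home])
controller.applyGesture(enclosing: Rect(x: 0, y: 85, width: 100, height: 15))
controller.finishConfiguration()
controller.handleTap(on: home)    // Ignoring tap on Home Button Area
```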
Abstract:
Systems and processes for activating a screen reading program are disclosed. One process can include receiving a request to activate the screen reading program and prompting the user to perform an action to confirm the request. The action can include making a swiping gesture, shaking the device, covering a proximity sensor, tapping a display, or the like. In some examples, the confirming action must be received within a time limit; otherwise, the input can be ignored. In response to receipt of the confirmation (e.g., within the time limit), the screen reading program can be activated. The start and end of the time limit can be indicated with audible notifications. In another example, a device can detect an event associated with a request to activate a screen reading program. The event can be detected at any time to cause the device to activate the screen reading program.
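The confirmation-within-a-time-limit flow can be sketched as follows, using plain timestamps in place of real sensor or gesture APIs; the five-second window, the `ConfirmationAction` cases, and the method names are assumptions for illustration only.

```swift
import Foundation

enum ConfirmationAction { case swipe, shake, coverProximitySensor, tapDisplay }

final class ScreenReaderActivator {
    /// How long the user has to confirm after being prompted.
    let timeLimit: TimeInterval = 5.0
    private var promptTime: TimeInterval?
    private(set) var screenReaderActive = false

    /// Called when a request to activate the screen reading program is received.
    func requestActivation(at time: TimeInterval) {
        promptTime = time
        // An audible notification could mark the start of the time limit here.
        print("Prompting user to confirm (e.g., swipe or shake the device)")
    }

    /// Called when a candidate confirming action is detected.
    func handleAction(_ action: ConfirmationAction, at time: TimeInterval) {
        guard let prompt = promptTime else { return }   // no pending request
        if time - prompt <= timeLimit {
            screenReaderActive = true
            print("Screen reading program activated by \(action)")
        } else {
            // Confirmation arrived after the time limit; ignore the input.
            print("Confirmation too late; input ignored")
        }
        promptTime = nil
    }
}

let activator = ScreenReaderActivator()
activator.requestActivation(at: 0.0)
activator.handleAction(.shake, at: 2.5)   // within the limit: activates
```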