Abstract:
A computer system displays a first view of a three-dimensional environment including a first representation of a first portion of a physical environment. While displaying the first view, the computer system detects movement of a first user from a first location to a second location of the physical environment, and in response: if the second location corresponds to a first type of exercise, the computer system replaces at least a portion of a second representation of a second portion of the physical environment that includes the second location with virtual content corresponding to the first type of exercise; and if the second location corresponds to a second type of exercise, the computer system replaces at least a portion of a third representation of a third portion of the physical environment that includes the second location with virtual content corresponding to the second type of exercise.
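As a rough illustration (not the claimed implementation), the conditional replacement described above can be sketched as a lookup from the user's new location to an exercise type, which then selects the virtual content substituted for that portion of the passthrough view. All location names, exercise types, and scene names below are invented for the example.

```python
# Hypothetical mapping from a physical location to the exercise type
# associated with that location (assumed, for illustration only).
EXERCISE_AT_LOCATION = {
    "rowing_corner": "rowing",
    "mat_area": "meditation",
}

# Hypothetical virtual content corresponding to each exercise type.
VIRTUAL_CONTENT = {
    "rowing": "lake_scene",
    "meditation": "forest_scene",
}

def view_after_move(second_location, passthrough_regions):
    """Return the displayed content per region after the user moves to
    `second_location`: the region containing the new location is replaced
    with virtual content for the exercise type found there, if any."""
    view = dict(passthrough_regions)
    exercise = EXERCISE_AT_LOCATION.get(second_location)
    if exercise is not None:
        # Replace the representation of the portion of the physical
        # environment that includes the second location.
        view[second_location] = VIRTUAL_CONTENT[exercise]
    return view
```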
Abstract:
A configuration for a breathing sequence may be defined using a user interface of a user device. The user interface may also be used to begin the breathing sequence. Prior to beginning the breathing sequence, the user device can determine a breath ratio for the breathing sequence based on a breathing profile. The breath ratio is based at least in part on inhale time and exhale time of a breath of the breathing sequence. During the breathing sequence, a fluctuating user interface element may fluctuate at a cyclic rate corresponding to the breath ratio. Such fluctuation may include repeated growing and repeated shrinking of the fluctuating user interface element.
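A minimal sketch of the relationship described above: one grow-and-shrink cycle of the fluctuating element spans one full breath (inhale time plus exhale time), with the element growing during the inhale portion and shrinking during the exhale portion. The linear ramps and scale bounds are simplifying assumptions, not details from the abstract.

```python
def cycle_seconds(inhale_time, exhale_time):
    """One grow+shrink cycle lasts one full breath: inhale plus exhale."""
    return inhale_time + exhale_time

def element_scale(t, inhale_time, exhale_time, min_scale=0.5, max_scale=1.0):
    """Scale of the fluctuating element at time t: grows during the inhale
    portion of the cycle, shrinks during the exhale portion (linear ramps
    and scale bounds are assumed for illustration)."""
    period = cycle_seconds(inhale_time, exhale_time)
    phase = t % period
    if phase < inhale_time:
        frac = phase / inhale_time                       # growing
    else:
        frac = 1.0 - (phase - inhale_time) / exhale_time  # shrinking
    return min_scale + (max_scale - min_scale) * frac
```

With a 4 s inhale and 6 s exhale (a 4:6 breath ratio), the element peaks 4 seconds into each 10-second cycle.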
Abstract:
A computer system displays a first view of a three-dimensional environment, including a first user interface object having a first surface at a first position in the three-dimensional environment. While displaying the first view of the three-dimensional environment, the computer system detects a change in biometric data of a first user, and in response, changes an appearance of the first surface in the first user interface object in accordance with the change in biometric data of the first user. While displaying the first user interface object with the appearance that has been changed based on the change in the biometric data of the first user, the computer system detects first movement of the first user, and in response, changes the appearance of the first user interface object in accordance with the first movement of the first user.
Abstract:
A configuration for a breathing sequence may be defined using a user interface of a user device. The user interface may also be used to begin the breathing sequence. During the breathing sequence, a fluctuating user interface element may fluctuate at a cyclic rate. Such fluctuation may include repeated growing and repeated shrinking of the fluctuating user interface element. During the breathing sequence, heart rate data may be collected and used to present heart rate information at a conclusion of the breathing sequence.
Abstract:
A computer system, while displaying a first computer-generated experience with a first level of immersion, receives biometric data corresponding to a first user. In response to receiving the biometric data: in accordance with a determination that the biometric data corresponding to the first user meets first criteria, the computer system displays the first computer-generated experience with a second level of immersion, wherein the first computer-generated experience displayed with the second level of immersion occupies a larger portion of a field of view of the first user than the first computer-generated experience displayed with the first level of immersion; and in accordance with a determination that the biometric data corresponding to the first user does not meet the first criteria, the computer system continues to display the first computer-generated experience with the first level of immersion.
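The branching above can be sketched as a simple guard: deepen immersion only when the biometric data meets the first criteria, otherwise keep the current level. The specific criterion (a breathing-rate threshold) and the numeric levels are invented for illustration; the abstract does not specify them.

```python
def next_immersion(current_level, breaths_per_minute,
                   max_level=3, threshold=10.0):
    """Return the immersion level after receiving biometric data.

    A higher level occupies a larger portion of the user's field of view.
    The slow-breathing criterion and threshold are assumptions for this
    sketch, not details from the abstract."""
    meets_first_criteria = breaths_per_minute < threshold
    if meets_first_criteria:
        return min(current_level + 1, max_level)
    # Criteria not met: continue displaying at the first level of immersion.
    return current_level
```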
Abstract:
An electronic device with a touch-sensitive surface, a display, and one or more tactile output generators displays a user interface including a moveable component representing a plurality of selectable options. The device detects a scroll input directed to the moveable component, including movement of a contact followed by lift-off of the contact. In response to detecting the scroll input, the device moves the moveable component through a first selectable option and a second selectable option, where the movement of the moveable component gradually slows down after the lift-off of the contact. As the moveable component moves through the first selectable option at a first speed, the device generates a first tactile output and a first audio output. As the moveable component moves through the second selectable option at a slower second speed, the device generates a second tactile output that differs from the first tactile output in a first property but is the same in a second property, and generates a second audio output that differs from the first audio output.
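One way to picture the speed-dependent outputs described above: a per-detent tactile output whose amplitude (the varying first property) scales with traversal speed while its waveform (the shared second property) stays fixed. The property names and values are hypothetical, chosen only to make the shared-versus-varying distinction concrete.

```python
def tactile_output_for(speed, base_amplitude=1.0, waveform="minitap"):
    """Tactile output for one selectable-option crossing: amplitude varies
    with speed (first property), waveform is held constant (second
    property). Names and values are assumptions for this sketch."""
    return {"amplitude": base_amplitude * min(speed, 1.0),
            "waveform": waveform}

def outputs_during_deceleration(speeds):
    """One tactile output per selectable option the component moves through
    as it gradually slows down after lift-off."""
    return [tactile_output_for(s) for s in speeds]
```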
Abstract:
Methods and apparatus organize a plurality of haptic output variations into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of haptic outputs that share characteristics between related events. In some embodiments, an event class or application class provides the basis for a corresponding haptic output. In some embodiments, whether an alert-salience setting is on provides the basis for adding an increased salience haptic output to the standard haptic output for the alert. In some embodiments, consistent haptics provide for branding of the associated application class, application, and/or context.
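The framework described above can be sketched as a table from event or application class to a standard haptic output, with an increased-salience component prepended when the alert-salience setting is on. The class names and output names below are invented placeholders, not the patented vocabulary.

```python
# Assumed per-event-class standard haptic outputs (hypothetical names);
# sharing these across related events is what provides the consistent
# "branding" for a class of alerts.
STANDARD_HAPTIC = {
    "message": ("tap",),
    "payment": ("double_tap",),
}

def haptic_for(event_class, salience_setting_on):
    """Return the haptic output sequence for an alert of `event_class`."""
    output = STANDARD_HAPTIC.get(event_class, ("default_tap",))
    if salience_setting_on:
        # Add an increased-salience component to the standard output.
        output = ("salience_buzz",) + output
    return output
```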
Abstract:
An electronic device provides, to a display, data to present a user interface with a plurality of user interface objects that includes a first user interface object and a second user interface object. A current focus is on the first user interface object. The device receives an input that corresponds to a request to move the current focus; and, in response, provides, to the display, data to: move the first user interface object from a first position towards the second user interface object and/or tilt the first user interface object from a first orientation towards the second user interface object; and, after moving and/or tilting the first user interface object, move the current focus from the first user interface object to the second user interface object, and move the first user interface object back towards the first position and/or tilt the first user interface object back towards the first orientation.
Abstract:
An electronic device displays a user interface of a first software application that includes one or more draggable objects and one or more control objects; and, detects a contact on a touch-sensitive surface at a first location while a focus selector is displayed over a first draggable object and a movement of the contact across the touch-sensitive surface to a second location that corresponds to a first control object. In accordance with a determination that the contact at the first location satisfies object selection criteria, the device moves the first draggable object to the first control object in accordance with the movement of the contact across the touch-sensitive surface to the first control object. In accordance with a determination that the contact at the second location satisfies first intensity criteria, the device performs a first predetermined operation that corresponds to activation of the first control object.
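A compact sketch of the two determinations above: the drag proceeds only once the contact satisfies the object-selection criteria, and the control object's operation fires only when the contact's intensity at the second location satisfies the first intensity criteria. The threshold value and return labels are illustrative assumptions.

```python
def handle_contact(selection_criteria_met, intensity_at_second_location,
                   intensity_threshold=0.8):
    """Outcome after the contact moves to the location of the control
    object. The 0.8 intensity threshold is an assumed value for this
    sketch; the abstract does not specify one."""
    if not selection_criteria_met:
        # Object-selection criteria not satisfied: nothing is dragged.
        return "no_op"
    if intensity_at_second_location >= intensity_threshold:
        # First intensity criteria satisfied: activate the control object.
        return "perform_first_predetermined_operation"
    # Dragged onto the control, but intensity criteria not met.
    return "hover_over_control"
```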