Abstract:
Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein.
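The following is a minimal sketch of the kind of behavior this abstract describes: an interface that always provides an indication of time and optionally attaches additional information. All type and property names (Complication, ContextSpecificFace, render) are illustrative assumptions, not terms from the disclosure.

```swift
import Foundation

// Sketch of a context-specific interface model that always shows the time
// and optionally attaches additional information (names are illustrative).
struct Complication {
    let label: String
    let value: String
}

struct ContextSpecificFace {
    var complications: [Complication] = []

    // Returns the strings the face would render: the time indication first,
    // followed by any optional additional information.
    func render(at date: Date = Date()) -> [String] {
        let formatter = DateFormatter()
        formatter.timeStyle = .short
        var lines = ["Time: \(formatter.string(from: date))"]
        lines += complications.map { "\($0.label): \($0.value)" }
        return lines
    }
}

// Usage: a face showing the time plus two optional pieces of information.
let face = ContextSpecificFace(complications: [
    Complication(label: "Weather", value: "72°F"),
    Complication(label: "Next event", value: "Standup 9:30"),
])
print(face.render().joined(separator: "\n"))
```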
Abstract:
A method performs a series of interactive operations to calibrate a compass in a mobile device. The method requires a user to move the device through a variety of different orientations. To ensure that the device moves through a sufficient number and variety of orientations, the method instructs the user to rotate the device in a series of interactive operations. The interactive operations provide feedback to inform the user how well the user is performing them. In some embodiments, the feedback is tactile (e.g., a vibration). In some embodiments, the feedback is audible (e.g., a beep or buzz). In some embodiments, the feedback is visual (e.g., an image or images on a video display of the device). In some embodiments the feedback is continuous (e.g., a changing visual display), and in some embodiments it is discrete (e.g., the device beeps after taking a good reading).
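A short sketch of one way the orientation coverage and feedback described above could be tracked; the 12-bin granularity, the Feedback cases, and the CompassCalibrator type are assumptions for illustration only, not the disclosed implementation.

```swift
import Foundation

// Illustrative tracking of how many distinct device orientations have been
// sampled during calibration, with feedback as the user rotates the device.
enum Feedback {
    case continuous(progress: Double)   // e.g., drives a changing visual display
    case discrete(message: String)      // e.g., a beep after a good reading
}

final class CompassCalibrator {
    private var sampledBins = Set<Int>()
    private let binCount = 12           // assumed granularity: 30 degrees per bin

    // Record one reading taken at the given heading (0-360 degrees) and
    // return feedback describing how well calibration is progressing.
    func record(headingDegrees: Double) -> Feedback {
        let bin = Int(headingDegrees.truncatingRemainder(dividingBy: 360) / 30)
        let isNewOrientation = sampledBins.insert(bin).inserted
        let progress = Double(sampledBins.count) / Double(binCount)
        if isNewOrientation && progress >= 1.0 {
            return .discrete(message: "Calibration complete")
        }
        return .continuous(progress: progress)
    }
}
```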
Abstract:
An electronic device detects a gesture on a touch-sensitive surface. In response to detecting the gesture on the touch-sensitive surface, when the gesture is a first swipe gesture in a first direction, the device displays at least a list of recent electronic notifications. When the gesture is a second swipe gesture in a second direction distinct from the first direction, the device displays one or more settings icons in a settings panel, wherein the settings panel includes a respective settings icon that, when selected, causes a partially transparent interface to be displayed over the settings panel, the partially transparent interface being at least partially transparent so that at least a portion of the settings panel can be seen through it.
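A minimal sketch of the dispatch logic this abstract describes: one swipe direction reveals recent notifications, a distinct direction reveals a settings panel whose icons can open a partially transparent interface over it. The concrete directions, icon names, and types are assumptions.

```swift
// Sketch of the described gesture dispatch (directions and names assumed).
enum SwipeDirection { case down, up }

enum Panel {
    case notifications(recent: [String])
    case settings(icons: [String], overlayIsPartiallyTransparent: Bool)
}

func handleSwipe(_ direction: SwipeDirection, recentNotifications: [String]) -> Panel {
    switch direction {
    case .down:
        // First swipe direction: show the list of recent notifications.
        return .notifications(recent: recentNotifications)
    case .up:
        // Second, distinct direction: show settings icons; selecting the
        // respective icon would present an overlay that stays partially
        // transparent so part of the settings panel remains visible beneath it.
        return .settings(icons: ["Wi-Fi", "Bluetooth", "Brightness"],
                         overlayIsPartiallyTransparent: true)
    }
}
```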
Abstract:
An example method is performed at a device with a touch-sensitive display. The method includes that while displaying a first user interface of a first application on the touch-sensitive display, the device receives a gesture at the touch-sensitive display. The method includes that in response to receiving the gesture on the touch-sensitive display and in accordance with a determination that the gesture begins at an edge region of the touch-sensitive display, the device ceases to display the first user interface of the first application and displays a second user interface of a second application that is different from the first application. The method also includes that in response to receiving the gesture on the touch-sensitive display and in accordance with a determination that the gesture begins at a non-edge region of the touch-sensitive display, the device performs an operation within the first application while continuing to display the first user interface.
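A sketch of the branching described above: a gesture starting in an edge region switches to a different application's interface, while a gesture starting elsewhere is handled inside the current application. The edge width, type names, and outcome names are assumptions for illustration.

```swift
import CoreGraphics

// Sketch of edge-region vs. non-edge-region gesture handling (edge width assumed).
struct Gesture { let startPoint: CGPoint }

enum GestureOutcome {
    case switchToSecondApplication
    case performOperationInFirstApplication
}

func resolve(_ gesture: Gesture, displayBounds: CGRect, edgeWidth: CGFloat = 20) -> GestureOutcome {
    // Points outside the inset rectangle lie within `edgeWidth` of the display edge.
    let nonEdgeRegion = displayBounds.insetBy(dx: edgeWidth, dy: edgeWidth)
    if !nonEdgeRegion.contains(gesture.startPoint) {
        return .switchToSecondApplication
    }
    return .performOperationInFirstApplication
}
```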
Abstract:
The present disclosure relates to devices and processes for monitoring attributes of a user's physical activity (e.g., workout) or inactivity, and to user interfaces (e.g., an activity indicator) for displaying the same. In some examples, a device determines whether physical activity corresponds to a first type based on a first set of criteria, and whether physical activity corresponds to a second type based on a second set of criteria. In some examples, the device controls an inactivity timer that measures the user's inactivity. In some examples, the device displays a first visual representation of an attribute or amount of a first type of physical activity, and a second visual representation of an attribute or amount of a second type. In some examples, the device displays a third visual representation of an attribute or amount of a third type of activity. In some examples, the third visual representation corresponds to the user's inactivity.
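A short sketch of classifying activity samples into two types using separate criteria sets, plus an inactivity timer of the kind described; the thresholds, sample attributes, and type names are assumptions, not values from the disclosure.

```swift
import Foundation

// Illustrative classification of an activity sample against two criteria sets.
struct ActivitySample {
    let heartRate: Double        // beats per minute
    let stepsPerMinute: Double
}

enum ActivityType { case first, second, none }

func classify(_ sample: ActivitySample) -> ActivityType {
    // First set of criteria (e.g., a sustained, workout-like effort).
    if sample.heartRate > 120 && sample.stepsPerMinute > 100 { return .first }
    // Second set of criteria (e.g., general movement).
    if sample.stepsPerMinute > 30 { return .second }
    return .none
}

// Accumulates time only while the user performs neither type of activity.
final class InactivityTimer {
    private(set) var elapsed: TimeInterval = 0

    func update(with type: ActivityType, interval: TimeInterval) {
        if type == .none { elapsed += interval } else { elapsed = 0 }
    }
}
```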