Abstract:
Methods and systems for unlocking a screen using eye tracking information are described. A computing system may include a display screen. The computing system may be in a locked mode of operation after a period of inactivity by a user. The locked mode of operation may include a locked screen and reduced functionality of the computing system. The user may attempt to unlock the screen. The computing system may generate a display of a moving object on the display screen of the computing system. An eye tracking system may be coupled to the computing system. The eye tracking system may track eye movement of the user. The computing system may determine that a path associated with the eye movement of the user substantially matches a path associated with the moving object on the display and, in response, switch to an unlocked mode of operation, including unlocking the screen.
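The path-comparison step could be sketched as below. This is a minimal illustration, assuming the eye tracker and the display both yield 2-D screen positions sampled at matching times, and using a mean point-to-point distance with an illustrative pixel tolerance as the notion of a "substantial" match; the abstract does not prescribe a particular metric, and the function names are not from the source.

```python
import math

def paths_substantially_match(eye_path, object_path, tolerance_px=50.0):
    """Compare a sampled eye-gaze path to the path of the moving object.

    eye_path, object_path: lists of (x, y) screen coordinates sampled at
    the same times. tolerance_px is an illustrative mean-deviation limit.
    """
    if not eye_path or len(eye_path) != len(object_path):
        return False
    total = sum(math.dist(e, o) for e, o in zip(eye_path, object_path))
    return total / len(eye_path) <= tolerance_px

def try_unlock(eye_path, object_path, unlock_screen):
    # Switch to the unlocked mode only when the tracked gaze has followed
    # the displayed moving object closely enough.
    if paths_substantially_match(eye_path, object_path):
        unlock_screen()
        return True
    return False
```

In practice such a comparison would likely run continuously while the object is animated, unlocking once the gaze has followed the object for a sufficient duration.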
Abstract:
Example methods and systems for displaying one or more indications that indicate (i) the direction of a source of sound and (ii) the intensity level of the sound are disclosed. A method may involve receiving audio data corresponding to sound detected by a wearable computing system. Further, the method may involve analyzing the audio data to determine both (i) a direction from the wearable computing system of a source of the sound and (ii) an intensity level of the sound. Still further, the method may involve causing the wearable computing system to display one or more indications that indicate (i) the direction of the source of the sound and (ii) the intensity level of the sound.
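A rough sketch of the analysis step is given below, assuming a two-microphone wearable: the intensity level is estimated as an RMS level in decibels, and the direction is estimated from the time difference of arrival between the two channels. The microphone spacing, sample rate handling, and function names are illustrative, not taken from the source.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air

def sound_intensity_db(samples):
    """Estimate the intensity level of the detected sound (dB re full scale)."""
    rms = np.sqrt(np.mean(np.square(samples)))
    return 20.0 * np.log10(max(rms, 1e-12))

def sound_direction_deg(left, right, mic_spacing_m, sample_rate):
    """Estimate the direction of the source from the inter-microphone delay."""
    # Cross-correlate the two channels to find the time difference of arrival.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    tdoa = lag / sample_rate
    # Convert the delay to a bearing angle, clamping so arcsin stays defined.
    ratio = np.clip(tdoa * SPEED_OF_SOUND / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))
```

The estimated angle and level would then drive the displayed indications, e.g., an arrow whose placement reflects the direction and whose size reflects the intensity.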
Abstract:
A method includes determining, at a first time, a representation of a first head rotation of a head mounted display (HMD) using a first inertial sensor sample stream and rendering, at an application processor, a texture based on the first head rotation. The method further includes determining, at a second time subsequent to the first time, a representation of a second head rotation of the HMD using a second inertial sensor sample stream having a higher sampling rate than the first inertial sensor sample stream, and generating, at a compositor, a rotated representation of the texture based on a difference between the first head rotation and the second head rotation.
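One way to read the reprojection step: the compositor needs the rotation that carries the rendered texture from the head pose used at render time to the newer pose from the higher-rate sensor stream. A minimal quaternion sketch, assuming both head rotations are available as unit quaternions in (w, x, y, z) order, is shown below; the rotated representation itself (e.g., a homography applied to the texture) is not shown.

```python
import numpy as np

def quat_conjugate(q):
    """Conjugate of a unit quaternion, i.e., its inverse rotation."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def reprojection_rotation(first_head_rotation, second_head_rotation):
    """Delta rotation between the pose used for rendering and the newer pose."""
    return quat_multiply(second_head_rotation, quat_conjugate(first_head_rotation))
```

The compositor would apply this delta rotation to the texture shortly before scan-out, so the displayed image corresponds to the newer, higher-rate pose estimate rather than the pose used when rendering began.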
Abstract:
Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display are disclosed. In one embodiment, the method includes displaying a user-interface, which includes a view region, on a substantially transparent display of a wearable computing device. The method further includes displaying a virtual object in the view region at a focal length along a first line of sight and detecting a physical object at a physical distance along a second line of sight. The method still further includes determining that a relationship between the focal length and the physical distance is such that the virtual object and the physical object appear substantially co-located in a user-view through the view region and, responsive to the determination, initiating a collision action between the virtual object and the physical object.
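The co-location test could be sketched as follows, assuming the focal length and physical distance are expressed in the same units and that the two lines of sight are compared by the angle between them; the tolerances are illustrative, since the abstract only requires that the objects appear substantially co-located.

```python
def appears_co_located(focal_length_m, physical_distance_m,
                       angle_between_sights_deg,
                       depth_tolerance_m=0.2, angle_tolerance_deg=2.0):
    """Decide whether the virtual and physical objects appear co-located.

    Thresholds are illustrative assumptions, not values from the source.
    """
    depth_match = abs(focal_length_m - physical_distance_m) <= depth_tolerance_m
    sight_match = angle_between_sights_deg <= angle_tolerance_deg
    return depth_match and sight_match

def maybe_collide(virtual_obj, physical_obj, focal_length_m,
                  physical_distance_m, angle_deg, start_collision_action):
    # Initiate the collision action only when the objects appear co-located
    # in the user's view through the view region.
    if appears_co_located(focal_length_m, physical_distance_m, angle_deg):
        start_collision_action(virtual_obj, physical_obj)
```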
Abstract:
An apparatus is described herein that includes an imaging device and a multi-pixel display disposed in a head-mountable device (HMD). The apparatus includes an optical system configured to optically couple the multi-pixel display and the imaging device to a retina of a wearer of the head-mountable device, such that the retina is at a focal plane that is conjugate to both a first focal plane at the multi-pixel display and a second focal plane at the imaging device. The direction of gaze of a user of the HMD could be determined from a retinal image captured by the imaging device. The determined gaze direction could be used to operate the HMD. The pattern of vasculature in the retinal image could be used to identify the user of the HMD. Information in the retinal image could be used to determine the medical state of the user and to diagnose disease states.
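As a rough illustration of the identification step only, a vasculature feature vector extracted from the retinal image could be matched against enrolled templates; the feature extraction itself, the template store, and the acceptance threshold below are assumptions, not details from the source.

```python
import numpy as np

def identify_wearer(retinal_features, enrolled_templates, max_distance=0.3):
    """Match a vasculature feature vector against enrolled templates.

    retinal_features: 1-D feature vector extracted from the retinal image
    (the extraction step is outside this sketch). enrolled_templates:
    dict mapping user id -> stored feature vector. max_distance is an
    illustrative acceptance threshold.
    """
    best_user, best_dist = None, float("inf")
    for user_id, template in enrolled_templates.items():
        dist = float(np.linalg.norm(retinal_features - template))
        if dist < best_dist:
            best_user, best_dist = user_id, dist
    return best_user if best_dist <= max_distance else None
```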
Abstract:
The present disclosure describes example systems and methods for identifying an indication of an injury of a user of a wearable computing device. The systems and methods may be directed to determining that an acceleration experienced by the wearable computing device exceeds a threshold value. In response, the wearable computing device may perform a diagnostic procedure in order to identify an indication of an injury experienced by the user of the wearable computing device. The diagnostic procedure may include one or more of an eye response test, a verbal response test, a motor response test, and a visual diagnostic test.
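The trigger condition and the diagnostic procedure could be sketched as follows; the threshold value and the representation of the tests as named pass/fail callables are illustrative assumptions.

```python
import math

ACCEL_THRESHOLD_G = 8.0  # illustrative threshold, not taken from the source

def acceleration_exceeds_threshold(ax, ay, az):
    """Check whether the measured acceleration magnitude (in g) exceeds the
    threshold that triggers the diagnostic procedure."""
    return math.sqrt(ax * ax + ay * ay + az * az) > ACCEL_THRESHOLD_G

def run_diagnostic_procedure(tests):
    """Run the diagnostic tests (e.g., eye response, verbal response,
    motor response, visual diagnostic), given as (name, passed) pairs where
    passed is a callable returning True if the test shows no impairment.
    Returns the names of tests indicating a possible injury."""
    return [name for name, passed in tests if not passed()]
```

For example, after a detected impact the device might call run_diagnostic_procedure with the four tests named in the abstract and report any that fail.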
Abstract:
Exemplary methods and systems relate to detecting physical objects near a substantially transparent head-mounted display (HMD) system and activating a collision-avoidance action to alert a user of the detected objects. Detection techniques may include receiving data from distance and/or relative movement sensors and using this data as a basis for determining an appropriate collision-avoidance action. Exemplary collision-avoidance actions may include de-emphasizing virtual objects displayed on the HMD to provide a less cluttered view of the physical objects through the substantially transparent display and/or presenting new virtual objects.
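The selection of a collision-avoidance action could be sketched as below, assuming the distance and relative-movement sensors yield a range and a closing speed for the detected object; the distance thresholds and action names are illustrative.

```python
def choose_collision_avoidance_action(distance_m, closing_speed_mps,
                                      warn_distance_m=3.0,
                                      urgent_distance_m=1.0):
    """Pick a collision-avoidance action from distance and relative movement.

    Thresholds are illustrative assumptions, not values from the source.
    """
    approaching = closing_speed_mps > 0.0
    if approaching and distance_m <= urgent_distance_m:
        return "present_warning_object"       # add a new virtual alert object
    if approaching and distance_m <= warn_distance_m:
        return "deemphasize_virtual_objects"  # declutter the see-through view
    return "no_action"
```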
Abstract:
An apparatus includes a light source, a display array, a light relay, a photodetector, and control circuitry. The light source is for providing lamp light during an ON-time of the light source. The display array is positioned to receive the lamp light and selectively manipulate it into image light. The light relay is positioned to receive the image light from the display array. The control circuitry is coupled to the light source for adjusting the light source and is coupled to receive an output of the photodetector.
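The role of the control circuitry could be illustrated as a simple feedback step that nudges the light-source drive level toward a target based on the photodetector output; the control law, gain, and normalized units below are assumptions, not details from the source.

```python
def adjust_light_source(photodetector_reading, target_level,
                        current_drive, gain=0.1):
    """One step of a simple feedback loop: move the light-source drive level
    so the photodetector reading approaches the target level.

    All quantities are in arbitrary normalized units (0.0 to 1.0); the
    proportional control law is illustrative.
    """
    error = target_level - photodetector_reading
    new_drive = current_drive + gain * error
    return min(max(new_drive, 0.0), 1.0)  # clamp to the valid drive range
```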
Abstract:
A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control interface is to be provided. The wearable computing device controls the HMD to display the virtual control interface as an image superimposed over the defined area of the target device in the field of view.
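The overall flow could be sketched as follows, with target_device_registry.lookup and hmd.display_overlay as hypothetical interfaces standing in for obtaining the target device information and superimposing the virtual control interface over the defined area.

```python
def show_virtual_control_interface(hmd, target_device_registry, detected_device_id):
    """Fetch target device information and overlay its control interface.

    target_device_registry.lookup and hmd.display_overlay are hypothetical
    interfaces used only to illustrate the flow described in the abstract.
    """
    info = target_device_registry.lookup(detected_device_id)
    if info is None:
        return False
    # info is assumed to carry the control-interface definition and the
    # defined area of the target device over which it should appear.
    hmd.display_overlay(image=info["control_interface"],
                        anchor_region=info["defined_area"])
    return True
```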