Abstract:
In a system for moving and scaling in a virtual reality environment, a user may move from a first virtual position in the virtual environment toward a selected feature at a second virtual position in the virtual environment. While moving from the first position toward the second position, the user's scale, or perspective, relative to the user's surroundings in the virtual environment may be adjusted via manipulation of a user interface provided on a handheld electronic device.
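For illustration only (the abstract does not specify an implementation), the sketch below shows one way such a move could be applied per frame: the user's virtual position and scale are interpolated toward the selected target, with the target scale assumed to come from manipulation of the handheld device's user interface. All names and values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_move(start: Vec3, target: Vec3, progress: float,
                     start_scale: float, target_scale: float):
    """Blend the user's virtual position and scale along the move.

    progress:     0.0 at the first virtual position, 1.0 at the second.
    target_scale: assumed to be set via the handheld device's user
                  interface (e.g. a touchpad drag) during the move.
    """
    position = Vec3(lerp(start.x, target.x, progress),
                    lerp(start.y, target.y, progress),
                    lerp(start.z, target.z, progress))
    scale = lerp(start_scale, target_scale, progress)
    return position, scale

# Example: halfway through the move, with a selected target scale of 0.5x
# (values illustrative only).
pos, scale = interpolate_move(Vec3(0, 0, 0), Vec3(10, 0, -5), 0.5, 1.0, 0.5)
```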
Abstract:
A system for detecting and tracking a hover position of a manual pointing device, such as finger(s), on a handheld electronic device may include overlaying a rendered monochromatic keying screen, or green screen, on a user interface, such as a keyboard, of the handheld electronic device. A position of the finger(s) relative to the keyboard may be determined based on detection of the finger(s) against the green screen and a known arrangement of the keyboard. An image of the keyboard and the position of the finger(s) may be rendered and displayed, for example, on a head mounted display, to facilitate user interaction via the keyboard with a virtual immersive experience generated by the head mounted display.
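As a rough, hypothetical sketch of the keying step (the abstract does not give an algorithm), the code below treats any pixel that deviates strongly from the rendered keying color as part of the finger and maps the centroid of that region onto a simplified rows-by-columns keyboard grid; the keying color, threshold, and grid model are assumptions, not claimed details.

```python
import numpy as np

GREEN = np.array([0, 255, 0], dtype=np.float32)  # assumed rendered keying color

def finger_mask(frame_rgb: np.ndarray, threshold: float = 80.0) -> np.ndarray:
    """Pixels that differ strongly from the keying color are treated as finger."""
    diff = np.linalg.norm(frame_rgb.astype(np.float32) - GREEN, axis=-1)
    return diff > threshold

def hover_key(frame_rgb: np.ndarray, key_rows: int, key_cols: int):
    """Map the centroid of the detected finger region to a key cell in a
    known rows x cols keyboard grid (a simplification of an arbitrary
    keyboard arrangement). Returns None if no finger is detected."""
    mask = finger_mask(frame_rgb)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    cy, cx = ys.mean(), xs.mean()
    h, w = mask.shape
    row = min(int(cy / h * key_rows), key_rows - 1)
    col = min(int(cx / w * key_cols), key_cols - 1)
    return row, col
```

The detected (row, col) cell, together with the known keyboard arrangement, could then drive the rendered image of the keyboard and finger position shown on the head mounted display.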
Abstract:
In a virtual reality system, a user may travel from a first virtual location to a second virtual location. During travel, a dynamic virtual animation may be displayed within a portal in the user's field of view, allowing the user to experience a sensation of traveling from the first virtual location to the second virtual location. A fixed feature may be displayed in the field of view, surrounding the portal. The arrangement and position of the fixed feature may remain fixed while the dynamic virtual animation is displayed within the portal, to provide a stable frame of reference while the user experiences the sensation of traveling. The stable frame of reference provided by the fixed feature may mitigate a feeling of disorientation and/or motion sickness during travel caused by a mismatch between the dynamic visual experience and the stationary physical experience.
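A minimal, hypothetical sketch of the render loop this implies: on each frame, the fixed surround is drawn unchanged while the animated content is drawn only inside the portal region. The callbacks `draw_fixed_surround` and `draw_portal_animation` stand in for whatever rendering the system actually performs.

```python
def render_travel_frame(elapsed: float, travel_duration: float,
                        draw_fixed_surround, draw_portal_animation) -> bool:
    """Render one frame of virtual travel.

    draw_fixed_surround:   draws the fixed feature surrounding the portal;
                           it does not depend on travel progress, so it
                           gives the user a stable frame of reference.
    draw_portal_animation: draws the dynamic animation, clipped to the
                           portal, as a function of progress in [0, 1].
    Returns True once the second virtual location has been reached.
    """
    progress = min(elapsed / travel_duration, 1.0)
    draw_fixed_surround()
    draw_portal_animation(progress)
    return progress >= 1.0
```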
Abstract:
A system for tracking a first electronic device, such as a handheld electronic device, in a virtual reality environment generated by a second electronic device, such as a head mounted display, may include the fusion of data collected by sensors of the handheld electronic device with data collected by sensors of the head mounted display, together with data, related to the front face of the head mounted display, collected by a front facing camera of the handheld electronic device.
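One common way to combine such data streams is a complementary filter, shown below purely as an assumed example (the abstract does not name a fusion method): a fast but drifting estimate integrated from the handheld device's IMU is corrected by a slower estimate derived from the front facing camera's view of the head mounted display.

```python
def fuse_yaw(imu_yaw: float, camera_yaw: float, alpha: float = 0.98) -> float:
    """Complementary-filter blend of two yaw estimates, in degrees.

    imu_yaw:    integrated from the handheld device's IMU (fast, drifts).
    camera_yaw: derived from the camera's view of the HMD front face
                (slower, but anchored to the head mounted display).
    alpha:      weight on the IMU estimate; values near 1.0 mean the camera
                only gently corrects drift. Angles are assumed unwrapped.
    """
    return alpha * imu_yaw + (1.0 - alpha) * camera_yaw
```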
Abstract:
A virtual eyeglass set may include a frame, a first virtual lens and second virtual lens, and a processor. The frame may mount onto a user's head and hold the first virtual lens in front of the user's left eye and the second virtual lens in front of the user's right eye. A first side of each lens may face the user and a second side of each lens may face away from the user. Each of the first virtual lens and the second virtual lens may include a light field display on the first side, and a light field camera on the second side. The processor may construct, for display on each of the light field displays based on image data received via each of the light field cameras, an image from a perspective of the user's respective eye.
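As a simplified, assumed illustration of the reconstruction step (nearest-ray resampling rather than whatever method the processor actually uses), the sketch below picks, for each ray from the eye through a display pixel, the most similar ray captured by the outward facing light field camera on that lens.

```python
import numpy as np

def synthesize_eye_view(sample_dirs: np.ndarray, sample_colors: np.ndarray,
                        eye_dirs: np.ndarray) -> np.ndarray:
    """Nearest-ray light field resampling.

    sample_dirs:   (N, 3) unit directions of rays captured on the second
                   (outward facing) side of the lens.
    sample_colors: (N, 3) RGB colors of those captured rays.
    eye_dirs:      (P, 3) unit directions of rays from the user's eye
                   through each pixel of the light field display.
    Returns a (P, 3) array of colors approximating the scene from the
    perspective of that eye.
    """
    similarity = eye_dirs @ sample_dirs.T    # cosine similarity, shape (P, N)
    nearest = np.argmax(similarity, axis=1)  # best-matching captured ray per pixel
    return sample_colors[nearest]
```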