Abstract:
Examples of light projector systems for directing input light from a light source to a spatial light modulator are provided. For example, an optical device is disclosed which includes a first surface having a diffractive optical element, a second surface normal to the first surface, and a third surface arranged at an angle to the second surface. The third surface may be a beam splitting surface that is reflective to light of a first state and transmissive to light of a second state. The diffractive optical element may receive an input beam made up of light having the first state, and convert the input beam into at least a first diffracted beam at a first diffraction angle such that the first diffracted beam is directed toward the third surface and is reflected by the third surface in a direction substantially parallel to the first surface.
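The abstract does not give the grating parameters, but the diffraction angle it refers to follows the standard grating equation. A minimal sketch, with hypothetical wavelength, pitch, and substrate index (none of these values come from the patent):

```python
import math

def diffraction_angle_deg(wavelength_nm, pitch_nm, order=1, n_out=1.0):
    """First-order diffraction angle from the grating equation
    m * wavelength = n_out * pitch * sin(theta), normal incidence assumed."""
    s = order * wavelength_nm / (n_out * pitch_nm)
    if abs(s) > 1:
        return None  # evanescent: this order does not propagate
    return math.degrees(math.asin(s))

# Hypothetical values: 532 nm green light on a 400 nm pitch grating,
# diffracting into a substrate of refractive index 1.5.
theta = diffraction_angle_deg(532, 400, n_out=1.5)
```

A steep angle like this (here roughly 62 degrees) is what allows the diffracted beam to be steered toward an angled reflective surface rather than passing straight through.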
Abstract:
An eyepiece waveguide for an augmented reality display is described. The eyepiece waveguide can include a transparent substrate with an input coupler region, first and second orthogonal pupil expander (OPE) regions, and an exit pupil expander (EPE) region. The input coupler region can be positioned between the first and second OPE regions and can divide and re-direct an input light beam that is externally incident on the input coupler region into first and second guided light beams that propagate inside the substrate, with the first guided beam being directed toward the first OPE region and the second guided beam being directed toward the second OPE region. The first and second OPE regions can respectively divide the first and second guided beams into a plurality of replicated, spaced-apart beams. The EPE region can re-direct the replicated beams from both the first and second OPE regions such that they exit the substrate.
Abstract:
Systems and methods for eye pose identification using features of an eye are described. Embodiments of the systems and methods can include segmenting an iris of an eye in an eye image to obtain pupillary and limbic boundaries of the eye, determining two angular coordinates (e.g., pitch and yaw) of an eye pose using the pupillary and limbic boundaries of the eye, identifying an eye feature of the eye (e.g., an iris feature or a scleral feature), determining a third angular coordinate (e.g., roll) of the eye pose using the identified eye feature, and utilizing the eye pose measurement for display of an image or a biometric application. In some implementations, iris segmentation may not be performed, and the two angular coordinates are determined from eye features.
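The abstract does not specify the pose math, but a common geometric reading is: a circular limbus viewed off-axis projects to an ellipse whose axis ratio encodes gaze tilt, and a tracked iris feature supplies the roll. A minimal sketch under that weak-perspective assumption (function names and the measurement values are hypothetical):

```python
import math

def gaze_from_limbus(major_px, minor_px, ellipse_tilt_deg):
    """Approximate pitch/yaw from a fitted limbic ellipse: the
    minor/major axis ratio encodes total gaze tilt and the ellipse
    orientation encodes its direction (weak-perspective model)."""
    tilt = math.degrees(math.acos(minor_px / major_px))
    yaw = tilt * math.cos(math.radians(ellipse_tilt_deg))
    pitch = tilt * math.sin(math.radians(ellipse_tilt_deg))
    return pitch, yaw

def roll_from_feature(feature_angle_deg, reference_angle_deg):
    """Third angular coordinate (roll/torsion): angular offset of a
    tracked iris feature from its position in a reference image,
    wrapped to [-180, 180) degrees."""
    return (feature_angle_deg - reference_angle_deg + 180.0) % 360.0 - 180.0

# Hypothetical measurements from one eye image:
pitch, yaw = gaze_from_limbus(major_px=120.0, minor_px=110.0, ellipse_tilt_deg=0.0)
roll = roll_from_feature(feature_angle_deg=47.0, reference_angle_deg=42.0)
```

Note the roll cannot be recovered from the boundary ellipse alone (a circle is rotationally symmetric), which is why the abstract introduces a distinct iris or scleral feature for the third coordinate.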
Abstract:
Examples of eye-imaging apparatus using diffractive optical elements are provided. For example, an optical device comprises a substrate having a proximal surface and a distal surface, a first coupling optical element disposed on one of the proximal and distal surfaces of the substrate, and a second coupling optical element disposed on one of the proximal and distal surfaces of the substrate and offset from the first coupling optical element. The first coupling optical element can be configured to deflect light at an angle to totally internally reflect (TIR) the light between the proximal and distal surfaces and toward the second coupling optical element, and the second coupling optical element can be configured to deflect the light at an angle out of the substrate. The eye-imaging apparatus can be used in a head-mounted display such as an augmented or virtual reality display.
Abstract:
An augmented reality (AR) device is described with a display system configured to adjust an apparent distance between a user of the AR device and virtual content presented by the AR device. The AR device includes a first tunable lens that changes shape in order to affect the position of the virtual content. Distortion of real-world content on account of the changes made to the first tunable lens is prevented by a second tunable lens that changes shape to stay substantially complementary to the optical configuration of the first tunable lens. In this way, the virtual content can be positioned at almost any distance relative to the user without degrading the view of the outside world or adding extensive bulk to the AR device. The augmented reality device can also include tunable lenses for expanding a field of view of the augmented reality device.
Abstract:
An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity. Advantageously, the wavefront divergence, and the accommodation cue provided to the eye of the user, may be varied by appropriate selection of parallax disparity, which may be set by selecting the amount of spatial separation between the locations of light output.
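The link between intra-pupil parallax disparity and the accommodation cue can be sketched with small-angle geometry: two beams entering the pupil at points separated by s approximate a wavefront diverging from distance d when their angular disparity is about s / d. This is a standard geometric approximation, not a formula taken from the patent:

```python
def parallax_disparity_rad(pupil_separation_m, virtual_distance_m):
    """Small-angle approximation: angular disparity between two
    intra-pupil beams that mimics a wavefront diverging from
    `virtual_distance_m`."""
    return pupil_separation_m / virtual_distance_m

def cued_distance_m(pupil_separation_m, disparity_rad):
    """Inverse relation: the accommodation distance cued by a given
    disparity at a given intra-pupil beam separation."""
    return pupil_separation_m / disparity_rad

# Hypothetical: beams entering 2 mm apart in the pupil, virtual
# object intended at 0.5 m -> 4 mrad of parallax disparity.
d_theta = parallax_disparity_rad(0.002, 0.5)
```

Under this model, moving the light-output locations farther apart (larger separation for the same disparity) or shrinking the disparity pushes the cued distance outward, which matches the abstract's claim that spatial separation of the light output selects the accommodation cue.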
Abstract:
Architectures are provided for selectively incoupling one or more streams of light from a multiplexed light stream into a waveguide. The multiplexed light stream can have light with different characteristics (e.g., different wavelengths and/or different polarizations). The waveguide can comprise in-coupling elements that can selectively couple one or more streams of light from the multiplexed light stream into the waveguide while transmitting one or more other streams of light from the multiplexed light stream.
Abstract:
In some embodiments, an augmented reality system includes at least one waveguide that is configured to receive and redirect light toward a user, and is further configured to allow ambient light from an environment of the user to pass therethrough toward the user. The augmented reality system also includes a first adaptive lens assembly positioned between the at least one waveguide and the environment, a second adaptive lens assembly positioned between the at least one waveguide and the user, and at least one processor operatively coupled to the first and second adaptive lens assemblies. Each lens assembly of the augmented reality system is selectively switchable between at least two different states in which the respective lens assembly is configured to impart at least two different optical powers to light passing therethrough, respectively. The at least one processor is configured to cause the first and second adaptive lens assemblies to synchronously switch between different states in a manner such that the first and second adaptive lens assemblies impart a substantially constant net optical power to ambient light from the environment passing therethrough.
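The constant-net-power condition can be illustrated with a thin-lens model: two closely stacked thin lenses combine approximately as P_net = P1 + P2, so the world-side assembly can be driven to cancel whatever power the eye-side assembly applies to virtual content. This is an assumed simplification for illustration, not the patent's stated implementation:

```python
# Thin-lens sketch (assumed model): stacked thin lenses in close
# contact combine approximately additively in optical power (diopters).

def complementary_power(virtual_power_d, target_net_d=0.0):
    """Power for the world-side adaptive lens so that ambient light
    sees a constant net power regardless of the eye-side setting."""
    return target_net_d - virtual_power_d

# Eye-side lens set to -2 D, so collimated image light diverges as if
# from 0.5 m; the world-side lens switches to +2 D in sync.
eye_side = -2.0
world_side = complementary_power(eye_side)
net = eye_side + world_side  # ambient light sees ~0 D net
```

Switching both assemblies synchronously, as the abstract describes, keeps this sum constant at every state transition so the real-world view is never momentarily defocused.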
Abstract:
Illumination systems that separate different colors into laterally displaced beams may be used to direct different color image content into an eyepiece for displaying images in the eye. Such an eyepiece may be used, for example, for an augmented reality head mounted display. Illumination systems may be provided that utilize one or more waveguides to direct light from a light source towards a spatial light modulator. Light from the spatial light modulator may be directed towards an eyepiece. Some aspects of the invention provide for light of different colors to be outcoupled at different angles from the one or more waveguides and directed along different beam paths.
Abstract:
A display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity may be selected using an array of shutters that selectively regulate the entry of image light into an eye. Each opened shutter in the array provides a different intra-pupil image, and the locations of the open shutters provide the desired amount of parallax disparity between the images. In some other embodiments, the images may be formed by an emissive micro-display. Each pixel formed by the micro-display may be formed by one of a group of light emitters, which are at different locations such that the emitted light takes different paths to the eye, the different paths providing different amounts of parallax disparity.