VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

    Publication Number: US20230251710A1

    Publication Date: 2023-08-10

    Application Number: US18304311

    Application Date: 2023-04-20

    CPC classification number: G06F3/013 G06T19/006 G06T7/50

    Abstract: A method for determining a focal point depth of a user of a three-dimensional (“3D”) display device includes tracking a first gaze path of the user. The method also includes analyzing 3D data to identify one or more virtual objects along the first gaze path of the user. The method further includes, when only one virtual object intersects the first gaze path of the user, identifying a depth of that virtual object as the focal point depth of the user.
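
    The abstract describes a depth-selection rule. Below is a minimal Python sketch of that rule, assuming a per-object gaze-path intersection test has already been run; the VirtualObject fields and the focal_point_depth helper are illustrative names, not identifiers from the patent.

        from dataclasses import dataclass
        from typing import Optional, Sequence

        @dataclass
        class VirtualObject:
            name: str
            depth: float        # distance from the viewer along the gaze direction
            on_gaze_path: bool  # result of the gaze-path intersection test (assumed precomputed)

        def focal_point_depth(objects: Sequence[VirtualObject]) -> Optional[float]:
            """Return the user's focal depth when exactly one object lies on the gaze path."""
            hits = [obj for obj in objects if obj.on_gaze_path]
            if len(hits) == 1:
                return hits[0].depth  # unambiguous case covered by the abstract
            return None               # zero or several candidates: left undetermined here

        # Example: a single cube on the gaze path at 2.5 m yields a focal depth of 2.5 m.
        scene = [VirtualObject("cube", 2.5, True), VirtualObject("sphere", 4.0, False)]
        assert focal_point_depth(scene) == 2.5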

    MIXED REALITY SYSTEM WITH MULTI-SOURCE VIRTUAL CONTENT COMPOSITING AND METHOD OF GENERATING VIRTUAL CONTENT USING SAME

    Publication Number: US20210174598A1

    Publication Date: 2021-06-10

    Application Number: US17178524

    Application Date: 2021-02-18

    Abstract: A computer-implemented method for warping virtual content from two sources includes a first source generating first virtual content based on a first pose. The method also includes a second source generating second virtual content based on a second pose. The method further includes a compositor processing the first and second virtual content in a single pass. Processing the first and second virtual content includes generating warped first virtual content by warping the first virtual content based on a third pose, generating warped second virtual content by warping the second virtual content based on the third pose, and generating output content by compositing the warped first and second virtual content.
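
    As a rough illustration of the single-pass warp-and-composite flow in this abstract, the sketch below treats a pose as a 2D pixel offset and blends the two warped layers with a simple "first source wins" rule; the warp and composite_single_pass functions and the pose model are assumptions for the example, not the patent's actual transform.

        import numpy as np

        def warp(layer: np.ndarray, src_pose: np.ndarray, dst_pose: np.ndarray) -> np.ndarray:
            """Shift an (H, W) layer by the pose difference; uncovered pixels stay zero."""
            dy, dx = (dst_pose - src_pose).astype(int)
            h, w = layer.shape
            out = np.zeros_like(layer)
            out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
                layer[max(-dy, 0):h - max(dy, 0), max(-dx, 0):w - max(dx, 0)]
            return out

        def composite_single_pass(first, pose1, second, pose2, pose3):
            """Warp both sources to the third pose, then blend them in one pass."""
            warped_first = warp(first, pose1, pose3)
            warped_second = warp(second, pose2, pose3)
            return np.where(warped_first > 0, warped_first, warped_second)

        layer_a = np.zeros((4, 4)); layer_a[1, 1] = 1.0  # first source, rendered at pose1
        layer_b = np.zeros((4, 4)); layer_b[2, 2] = 2.0  # second source, rendered at pose2
        output = composite_single_pass(layer_a, np.array([0, 0]),
                                       layer_b, np.array([0, 0]),
                                       np.array([1, 1]))  # both warped to the output pose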

    BLENDED MODE THREE DIMENSIONAL DISPLAY SYSTEMS AND METHODS

    Publication Number: US20210337176A1

    Publication Date: 2021-10-28

    Application Number: US17368096

    Application Date: 2021-07-06

    Abstract: A method for displaying a three-dimensional (“3D”) image includes rendering a frame of 3D image data. The method also includes analyzing the frame of 3D image data to generate depth data. The method further includes using the depth data to segment the 3D image data into i) at least one near frame of two-dimensional (“2D”) image data corresponding to a near depth, and ii) at least one far frame of 2D image data corresponding to a far depth that is farther from a point of view than the near depth. Moreover, the method includes displaying the near and far frames at the near and far depths, respectively. The near and far frames are displayed simultaneously.
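
    A minimal sketch of the depth-based segmentation step described in this abstract, assuming a per-pixel depth map and a single fixed split depth; segment_by_depth and split_depth are illustrative names, and a real system would derive the split from the display's focal planes.

        import numpy as np

        def segment_by_depth(color: np.ndarray, depth: np.ndarray, split_depth: float):
            """Split an (H, W, 3) color frame into near/far 2D layers using per-pixel depth."""
            near_mask = depth <= split_depth                   # pixels at or nearer than the split
            near = np.where(near_mask[..., None], color, 0.0)  # shown on the near depth plane
            far = np.where(~near_mask[..., None], color, 0.0)  # shown on the far depth plane
            return near, far

        color = np.random.rand(2, 2, 3).astype(np.float32)
        depth = np.array([[0.5, 3.0], [1.0, 5.0]], dtype=np.float32)
        near_frame, far_frame = segment_by_depth(color, depth, split_depth=2.0)
        # The two frames would then be presented simultaneously at the near and far depths.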

    VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

    Publication Number: US20210174472A1

    Publication Date: 2021-06-10

    Application Number: US17110638

    Application Date: 2020-12-03

    Abstract: A method for displaying a three-dimensional (“3D”) image includes rendering a frame of 3D image data. The method also includes analyzing the frame of 3D image data to generate best known depth data. The method further includes using the best known depth data to segment the 3D image data into near and far frames of two-dimensional (“2D”) image data corresponding to near and far depths, respectively. Moreover, the method includes displaying, to a user, near and far 2D image frames corresponding to the near and far frames of 2D image data at the near and far depths, respectively.

    BLENDED MODE THREE DIMENSIONAL DISPLAY SYSTEMS AND METHODS

    Publication Number: US20200374504A1

    Publication Date: 2020-11-26

    Application Number: US16872792

    Application Date: 2020-05-12

    Abstract: A method for displaying a three-dimensional (“3D”) image includes rendering a frame of 3D image data. The method also includes analyzing the frame of 3D image data to generate depth data. The method further includes using the depth data to segment the 3D image data into i) at least one near frame of two-dimensional (“2D”) image data corresponding to a near depth, and ii) at least one far frame of 2D image data corresponding to a far depth that is farther from a point of view than the near depth. Moreover, the method includes displaying the near and far frames at the near and far depths, respectively. The near and far frames are displayed simultaneously.

    VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

    Publication Number: US20250044865A1

    Publication Date: 2025-02-06

    Application Number: US18923373

    Application Date: 2024-10-22

    Abstract: A method for determining a focal point depth of a user of a three-dimensional (“3D”) display device includes tracking a first gaze path of the user. The method also includes analyzing 3D data to identify one or more virtual objects along the first gaze path of the user. The method further includes, when only one virtual object intersects the first gaze path of the user, identifying a depth of that virtual object as the focal point depth of the user.
