Blending virtual environments with situated physical reality

    Publication number: US11580704B2

    Publication date: 2023-02-14

    Application number: US17227078

    Filing date: 2021-04-09

    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
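
The select-and-composite step described above can be sketched briefly. The following is a minimal illustration, assuming a row-major depth map and per-pixel alpha blending; all names and the 1.5 m threshold are illustrative assumptions, not details from the patent:

```python
def select_nearby(depth_map, max_depth=1.5):
    """Return (row, col, depth) samples closer than max_depth metres.

    These are the depth-map portions that would be lifted into the
    virtual scene as pass-through geometry.
    """
    selected = []
    for r, row in enumerate(depth_map):
        for c, d in enumerate(row):
            if 0.0 < d < max_depth:
                selected.append((r, c, d))
    return selected

def blend(virtual_pixel, physical_pixel, alpha):
    """Alpha-blend a physical-camera pixel over the virtual render."""
    return tuple(alpha * p + (1.0 - alpha) * v
                 for v, p in zip(virtual_pixel, physical_pixel))

# Toy 2x2 depth map: two samples fall within the pass-through range.
depth_map = [[3.0, 1.2], [0.9, 4.0]]
near = select_nearby(depth_map)
composited = blend((0, 0, 255), (255, 255, 255), 0.25)
```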

    Reality-guided roaming in virtual reality

    Publication number: US10885710B2

    Publication date: 2021-01-05

    Application number: US16353912

    Filing date: 2019-03-14

    Abstract: In various embodiments, computerized methods and systems are provided for dynamically updating a fully-immersive virtual environment based on tracked physical environment data. A computing device coupled to an HMD receives sensor data from a variety of sensors. The computing device can generate a virtual scene based on the received sensor data, whereby the virtual scene includes at least a portion of a virtual path that corresponds to at least a portion of a navigable path determined based on the received sensor data. The computing device can modify the virtual scene to include a virtual obstruction that corresponds to a physical object detected based on additional sensor data received from the sensors. The modified virtual scene is presented to the user for display, so that the user can safely traverse the physical environment while staying fully-immersed in the virtual environment.
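
The obstacle-mirroring step can be sketched as follows; the scene dictionary and function names are assumptions made for illustration only:

```python
def update_scene(virtual_scene, detected_obstacles):
    """Mirror each newly detected physical object into the virtual
    scene as a virtual obstruction, keeping the virtual world aligned
    with the walkable physical space."""
    for pos in detected_obstacles:
        if pos not in virtual_scene["obstructions"]:
            virtual_scene["obstructions"].append(pos)
    return virtual_scene

# A virtual path derived from a navigable physical path, initially clear.
scene = {"path": [(0, 0), (0, 1), (0, 2)], "obstructions": []}
scene = update_scene(scene, [(0, 1)])  # e.g. a chair appears on the path
```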

    Dynamic haptic retargeting
    Granted patent

    Publication number: US10290153B2

    Publication date: 2019-05-14

    Application number: US15703093

    Filing date: 2017-09-13

    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
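
The body-warping half of the technique can be sketched as a progress-weighted offset, with a clamp playing the role of the threshold. All names and the 0.2 m cap below are illustrative assumptions, not values from the patent:

```python
def warped_hand(real_hand, start, physical_target, virtual_target,
                max_offset=0.2):
    """Shift the rendered hand toward the virtual target in proportion
    to the real hand's progress, so both 'arrive' at the same time."""
    # Progress t in [0, 1] of the real hand along the physical reach.
    total = sum((p - s) ** 2 for s, p in zip(start, physical_target)) ** 0.5
    moved = sum((h - s) ** 2 for s, h in zip(start, real_hand)) ** 0.5
    t = min(moved / total, 1.0) if total else 0.0
    # Offset grows from zero to the full physical-to-virtual gap.
    offset = tuple((v - p) * t
                   for p, v in zip(physical_target, virtual_target))
    # Threshold: cap the offset magnitude, since excessive warping
    # becomes noticeable to the user.
    mag = sum(o * o for o in offset) ** 0.5
    if mag > max_offset:
        offset = tuple(o * max_offset / mag for o in offset)
    return tuple(h + o for h, o in zip(real_hand, offset))
```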

    MULTI-ITEM SELECTION USING EYE GAZE
    Patent application

    Publication number: US20190094958A1

    Publication date: 2019-03-28

    Application number: US15718995

    Filing date: 2017-09-28

    Abstract: Representative embodiments disclose mechanisms for selection of items using eye tracking. One or more primary selection targets are presented to the user. When the user selects a primary selection target, secondary selection targets are presented in close proximity to the primary selection target, either before or after selection of the primary selection target. The secondary selection targets are animated in a way that moves them away from the primary selection target. The user's eye naturally follows the secondary selection target of interest, if any, producing a vector having a magnitude and a direction. The magnitude and direction of the vector are used to identify which, if any, of the secondary selection targets are intended by the user.
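
The direction-matching step can be sketched as a cosine-similarity test on the gaze displacement vector. The target names, unit directions, and magnitude threshold below are illustrative assumptions:

```python
import math

def pick_target(gaze_start, gaze_end, directions, min_magnitude=0.05):
    """Match the gaze pursuit vector against the animation direction of
    each secondary target; return the best match, or None if the gaze
    barely moved (the user followed no target)."""
    dx = gaze_end[0] - gaze_start[0]
    dy = gaze_end[1] - gaze_start[1]
    magnitude = math.hypot(dx, dy)
    if magnitude < min_magnitude:
        return None
    best, best_dot = None, -1.0
    for name, (ux, uy) in directions.items():
        # Cosine similarity between gaze direction and target direction.
        dot = (dx * ux + dy * uy) / magnitude
        if dot > best_dot:
            best, best_dot = name, dot
    return best

# Secondary targets animate outward along distinct unit directions.
targets = {"copy": (1.0, 0.0), "paste": (0.0, 1.0), "cut": (-1.0, 0.0)}
```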

    Interactive and shared surfaces
    Granted patent

    Publication number: US11509861B2

    Publication date: 2022-11-22

    Application number: US15380690

    Filing date: 2016-12-15

    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.

    Real time styling of motion for virtual environments

    Publication number: US11055891B1

    Publication date: 2021-07-06

    Application number: US16814130

    Filing date: 2020-03-10

    Abstract: Examples of the present disclosure describe systems and methods for providing real-time motion styling in virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) environments. In aspects, input data corresponding to user interaction with a VR, an AR, or an MR environment may be received. The input data may be featurized to generate a feature set. The feature set may be compared to a set of stored motion data comprising motion capture data representing one or more motion styles for executing an action or activity. Based on the comparison, the feature set may be matched to feature data for one or more motion styles in the stored motion data. The one or more motion styles may then be executed by a virtual avatar or a virtual object in the VR/AR/MR environment.
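
The featurize-and-match loop might look like the following sketch, which uses frame-to-frame displacement magnitudes as an assumed feature set and Euclidean distance as an assumed matcher; the real system's features and comparison method are not specified in the abstract:

```python
def featurize(positions):
    """Assumed feature set: frame-to-frame displacement magnitudes
    of a tracked point (e.g. a hand or head position over time)."""
    feats = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        feats.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    return feats

def match_style(features, style_db):
    """Return the stored motion style whose feature data is closest
    (Euclidean distance) to the observed feature set."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return min(style_db, key=lambda name: dist(features, style_db[name]))

# Toy database: per-style feature data from motion capture.
style_db = {"walk": [0.1, 0.1, 0.1], "run": [0.5, 0.5, 0.5]}
observed = featurize([(0, 0), (0.4, 0.3), (0.8, 0.6), (1.2, 0.9)])
```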

    EYE GAZE CORRECTION USING PURSUIT VECTOR
    Patent application

    Publication number: US20190107884A1

    Publication date: 2019-04-11

    Application number: US15726282

    Filing date: 2017-10-05

    Abstract: Representative embodiments disclose mechanisms for calibrating an eye gaze selection system. When the calibration is triggered, a snapshot of an area around the current user's gaze point is taken. The snapshot area is then animated to cause motion of the snapshot area. As the snapshot is animated, the user's gaze will naturally track the thing the user was focusing on. This creates an eye tracking vector with a magnitude and direction. The magnitude and direction of the eye tracking vector can then be used to calculate a correction factor for the current user's gaze point. Calibration can be triggered manually by the user or based on some criteria such as error rates in item selection by the user.
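
One plausible concrete animation (not necessarily the one used in the patent) is a zoom about the reported gaze point: an item the user is actually fixating at offset e from that point moves by (s − 1)·e under a zoom of factor s, so the measured pursuit vector recovers the correction e directly. The names and numbers below are illustrative:

```python
def correction_from_pursuit(pursuit_vector, scale):
    """Recover the gaze offset e from the measured pursuit vector,
    assuming the snapshot was magnified about the reported gaze point
    by the given factor (scale > 1). The fixated item moves by
    (scale - 1) * e, so e = pursuit_vector / (scale - 1)."""
    s = scale - 1.0
    return tuple(v / s for v in pursuit_vector)

def corrected_gaze(reported_gaze, correction):
    """Apply the correction factor: true fixation = reported + offset."""
    return tuple(g + c for g, c in zip(reported_gaze, correction))
```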

    PRECISE SELECTION TECHNIQUES FOR MULTI-TOUCH SCREENS

    Publication number: US20180074678A1

    Publication date: 2018-03-15

    Application number: US15818452

    Filing date: 2017-11-20

    CPC classification number: G06F3/04812 G06F3/0488 G06F2203/04808

    Abstract: A unique system and method is provided that facilitates pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
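
The control-display (CD) ratio mechanism can be sketched as a gain applied to the primary finger's displacement, with the secondary finger selecting the assistance mode. The mode names and gain values below are illustrative assumptions:

```python
# Assistance modes the secondary finger might select: normal tracking,
# a slowed cursor for pixel-accurate targeting, or a frozen cursor.
CD_RATIOS = {"normal": 1.0, "slow": 0.25, "frozen": 0.0}

def move_cursor(cursor, primary_delta, assistance_mode="normal"):
    """Move the cursor by the primary finger's displacement scaled by
    the control-display ratio chosen via the secondary finger."""
    gain = CD_RATIOS[assistance_mode]
    return (cursor[0] + primary_delta[0] * gain,
            cursor[1] + primary_delta[1] * gain)
```

A lower gain means a large finger movement produces a small cursor movement, trading speed for precision during the final approach to a small target.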
