LATE WARPING TO MINIMIZE LATENCY OF MOVING OBJECTS

    Publication Number: US20220375026A1

    Publication Date: 2022-11-24

    Application Number: US17518828

    Filing Date: 2021-11-04

    Applicant: Snap Inc.

    Abstract: A method for minimizing latency of moving objects in an augmented reality (AR) display device is described. In one aspect, the method includes determining an initial pose of a visual tracking device, identifying an initial location of an object in an image that is generated by an optical sensor of the visual tracking device, the image corresponding to the initial pose of the visual tracking device, rendering virtual content based on the initial pose and the initial location of the object, retrieving an updated pose of the visual tracking device, tracking an updated location of the object in an updated image that corresponds to the updated pose, and applying a time warp transformation to the rendered virtual content based on the updated pose and the updated location of the object to generate transformed virtual content.
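The time warp transformation described in the abstract can be approximated, for illustration, as a 2-D shift of the already-rendered content toward the object's freshest tracked location. The sketch below is hypothetical (translation-only; the function name and parameters are not from the patent, and a full implementation would also reproject for the updated device pose):

```python
import numpy as np

def late_warp(rendered, initial_obj_xy, updated_obj_xy):
    """Shift rendered virtual content so it lines up with the object's
    most recently tracked location (a translation-only approximation
    of the late time-warp step)."""
    dx = int(round(updated_obj_xy[0] - initial_obj_xy[0]))
    dy = int(round(updated_obj_xy[1] - initial_obj_xy[1]))
    h, w = rendered.shape[:2]
    warped = np.zeros_like(rendered)
    # Source/destination row and column ranges, clipped to the image bounds.
    src_x, dst_x = slice(max(0, -dx), min(w, w - dx)), slice(max(0, dx), min(w, w + dx))
    src_y, dst_y = slice(max(0, -dy), min(h, h - dy)), slice(max(0, dy), min(h, h + dy))
    warped[dst_y, dst_x] = rendered[src_y, src_x]
    return warped
```

Because the shift happens after rendering, the displayed content can follow the object at tracking rate even when the renderer itself lags behind.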

    VARIED DEPTH DETERMINATION USING STEREO VISION AND PHASE DETECTION AUTO FOCUS (PDAF)

    Publication Number: US20240223715A1

    Publication Date: 2024-07-04

    Application Number: US18606150

    Filing Date: 2024-03-15

    Applicant: Snap Inc.

    CPC classification number: H04N5/2226 H04N23/45 H04N23/672

    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
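For a rectified horizontal stereo pair, the triangulation described in the abstract reduces to the classic pinhole relation depth = focal length × baseline / disparity. A minimal sketch, with illustrative names and parameters not taken from the patent:

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Estimate depth of a matched feature from a rectified stereo pair.

    focal_px   -- focal length in pixels
    baseline_m -- horizontal distance between the two optical sensors, meters
    x_left_px, x_right_px -- the feature's column in each image
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("a matched feature in front of the cameras "
                         "must have positive disparity")
    # Nearer objects shift more between the two vantage points,
    # so depth is inversely proportional to disparity.
    return focal_px * baseline_m / disparity
```

For example, with a 700 px focal length, a 10 cm baseline, and a 10 px disparity, the feature is estimated at 7 m.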

    Varied depth determination using stereo vision and phase detection auto focus (PDAF)

    Publication Number: US11722630B2

    Publication Date: 2023-08-08

    Application Number: US17746292

    Filing Date: 2022-05-17

    Applicant: Snap Inc.

    CPC classification number: H04N5/2226 H04N23/45 H04N23/672

    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.

    VARIED DEPTH DETERMINATION USING STEREO VISION AND PHASE DETECTION AUTO FOCUS (PDAF)

    Publication Number: US20230239423A1

    Publication Date: 2023-07-27

    Application Number: US18129009

    Filing Date: 2023-03-30

    Applicant: Snap Inc.

    CPC classification number: H04N5/2226 H04N23/45 H04N23/672

    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.

    Late warping to minimize latency of moving objects

    Publication Number: US12067693B2

    Publication Date: 2024-08-20

    Application Number: US17518828

    Filing Date: 2021-11-04

    Applicant: Snap Inc.

    CPC classification number: G06T3/18 G06T13/40 G06T13/80 G06T15/205 G06T2210/44

    Abstract: A method for minimizing latency of moving objects in an augmented reality (AR) display device is described. In one aspect, the method includes determining an initial pose of a visual tracking device, identifying an initial location of an object in an image that is generated by an optical sensor of the visual tracking device, the image corresponding to the initial pose of the visual tracking device, rendering virtual content based on the initial pose and the initial location of the object, retrieving an updated pose of the visual tracking device, tracking an updated location of the object in an updated image that corresponds to the updated pose, and applying a time warp transformation to the rendered virtual content based on the updated pose and the updated location of the object to generate transformed virtual content.

    LOW LATENCY HAND-TRACKING IN AUGMENTED REALITY SYSTEMS

    Publication Number: US20240273843A1

    Publication Date: 2024-08-15

    Application Number: US18645277

    Filing Date: 2024-04-24

    Applicant: Snap Inc.

    Abstract: A method for reducing motion-to-photon latency for hand tracking is described. In one aspect, a method includes accessing a first frame from a camera of an Augmented Reality (AR) device, tracking a first image of a hand in the first frame, rendering virtual content based on the tracking of the first image of the hand in the first frame, accessing a second frame from the camera before the rendering of the virtual content is completed, the second frame immediately following the first frame, tracking, using the computer vision engine of the AR device, a second image of the hand in the second frame, generating an annotation based on tracking the second image of the hand in the second frame, forming an annotated virtual content based on the annotation and the virtual content, and displaying the annotated virtual content in a display of the AR device.
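The latency saving in this abstract comes from overlapping work: frame N+1 is tracked while the render of frame N is still in flight, and the fresher tracking result is applied as a late annotation. A sketch of that overlap, assuming a two-thread pipeline (the function names are hypothetical stand-ins, not the patent's actual tracking or rendering engine):

```python
from concurrent.futures import ThreadPoolExecutor

def track(frame):
    # Stand-in for the computer-vision hand tracker: returns a hand position.
    return frame["hand_xy"]

def render(hand_xy):
    # Stand-in for the (slow) renderer: virtual content anchored at hand_xy.
    return {"anchor": hand_xy}

def annotated_render(frame_n, frame_n_plus_1):
    """Render from frame N while tracking frame N+1 concurrently, then
    annotate the rendered content with the fresher hand position."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        render_future = pool.submit(render, track(frame_n))
        fresh_xy = track(frame_n_plus_1)   # runs while rendering is in flight
        content = render_future.result()
    content["annotation"] = fresh_xy       # late annotation from frame N+1
    return content
```

The displayed content is thus at most one frame behind the tracker even when rendering takes longer than one frame interval.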

    Visual-inertial tracking using rolling shutter cameras

    Publication Number: US12028626B2

    Publication Date: 2024-07-02

    Application Number: US18098939

    Filing Date: 2023-01-19

    Applicant: Snap Inc.

    Abstract: Visual-inertial tracking of an eyewear device using a rolling shutter camera(s). The eyewear device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
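The per-feature pose selection described above can be illustrated as a nearest-timestamp lookup: a rolling shutter sensor exposes rows sequentially, so each feature point is matched to whichever computed pose is closest in time to its row's capture time. A minimal sketch with illustrative names:

```python
def select_pose(poses, pose_times, feature_capture_time):
    """Return the computed pose whose timestamp is closest to the
    capture time of a feature's image row (rolling shutter sensors
    expose each row at a slightly different instant)."""
    return min(zip(pose_times, poses),
               key=lambda tp: abs(tp[0] - feature_capture_time))[1]
```

Under fast motion more intermediate poses would be computed (the abstract makes the pose count responsive to sensed movement), so each row's selected pose stays close to the device's true pose at that row's exposure time.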
