-
Publication No.: US12190540B2
Publication Date: 2025-01-07
Application No.: US18142879
Filing Date: 2023-05-03
Applicant: Snap Inc.
Inventor: Kai Zhou
Abstract: A method for transferring a gait pattern of a first user to a second user to simulate augmented reality content in a virtual simulation environment is described. In one aspect, the method includes identifying a gait pattern of a first user operating a first visual tracking system in a first physical environment, identifying a trajectory from a second visual tracking system operated by a second user in a second physical environment, the trajectory based on poses of the second visual tracking system over time, modifying the trajectory from the second visual tracking system based on the gait pattern of the first user, applying the modified trajectory in a plurality of virtual environments, and generating simulated ground truth data based on the modified trajectory in the plurality of virtual environments.
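The abstract stops at the method steps; the sketch below shows one plausible, simplified reading of the trajectory-modification step, in which a dominant vertical-bounce frequency and amplitude (standing in for a gait pattern) are extracted from the first user's recorded positions and overlaid on the second user's trajectory. Sample rates, array shapes, and function names are illustrative assumptions, not the patented method.

```python
import numpy as np

def extract_gait(positions: np.ndarray, fs: float) -> tuple[float, float]:
    """Estimate a dominant vertical-bounce frequency and amplitude from device positions.

    positions: (N, 3) array of x, y, z poses from the first visual tracking system;
    fs: sample rate in Hz. This is a stand-in for a richer gait model.
    """
    z = positions[:, 2] - positions[:, 2].mean()
    spectrum = np.abs(np.fft.rfft(z))
    freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
    k = np.argmax(spectrum[1:]) + 1              # skip the DC bin
    return freqs[k], 2.0 * spectrum[k] / len(z)

def apply_gait(trajectory: np.ndarray, fs: float, freq: float, amp: float) -> np.ndarray:
    """Overlay the first user's vertical bounce onto the second user's trajectory."""
    t = np.arange(len(trajectory)) / fs
    modified = trajectory.copy()
    modified[:, 2] += amp * np.sin(2.0 * np.pi * freq * t)
    return modified

# Usage: gait from user A's recording, applied to user B's trajectory before simulation.
fs = 100.0
user_a = np.cumsum(np.random.randn(1000, 3) * 0.001, axis=0)
user_a[:, 2] += 0.02 * np.sin(2 * np.pi * 1.8 * np.arange(1000) / fs)  # ~1.8 Hz bounce
user_b = np.cumsum(np.random.randn(1000, 3) * 0.001, axis=0)
freq, amp = extract_gait(user_a, fs)
user_b_modified = apply_gait(user_b, fs, freq, amp)
```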
-
Publication No.: US20240341591A1
Publication Date: 2024-10-17
Application No.: US18133870
Filing Date: 2023-04-12
Applicant: Snap Inc.
Inventor: Clemens Birklbauer , Matthias Kalkgruber , Tiago Miguel Pereira Torres , Yubin Xi , Kai Zhou
IPC: A61B3/11
CPC classification number: A61B3/111
Abstract: An eyewear device includes a sensor that measures deformation of the eyewear frame to estimate an inter-pupillary distance (IPD) of the user. The sensor is used to determine the head breadth (HB) of the user, from which the IPD is estimated. A processor displays an image on a display of the eyewear as a function of the estimated IPD to improve virtual object rendering and the augmented reality (AR) viewing experience while reducing vergence accommodation mismatch (VAM). User profile data, such as age and gender, can be used to generate a more accurate estimated IPD of the user.
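As a rough illustration of the estimation chain described above (frame deformation → head breadth → IPD, refined by profile data), here is a minimal sketch; the linear coefficients and demographic offsets are placeholders, not values from the patent.

```python
def estimate_ipd_mm(strain_reading: float, age_years: float, gender: str) -> float:
    """Illustrative IPD estimate from a frame-deformation sensor.

    strain_reading: normalized deformation of the eyewear frame (0 = unflexed).
    The coefficients below are placeholders, not values from the patent.
    """
    # Step 1: map frame deformation to head breadth (HB) in millimetres.
    head_breadth_mm = 140.0 + 45.0 * strain_reading

    # Step 2: map head breadth to IPD, with small demographic corrections.
    ipd_mm = 0.38 * head_breadth_mm + 8.0
    ipd_mm += 1.5 if gender.lower() == "male" else -1.5
    if age_years < 16:              # younger users tend to have smaller IPDs
        ipd_mm -= 2.0
    return ipd_mm

# Example: a moderately flexed frame worn by a 30-year-old male user.
print(f"estimated IPD: {estimate_ipd_mm(0.4, 30, 'male'):.1f} mm")
```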
-
Publication No.: US20240221222A1
Publication Date: 2024-07-04
Application No.: US18609845
Filing Date: 2024-03-19
Applicant: Snap Inc.
Inventor: Kai Zhou
CPC classification number: G06T7/80 , G06T7/73 , G06T2200/04 , G06T2200/24
Abstract: A method for adjusting camera intrinsic parameters of a single-camera visual tracking device is described. In one aspect, the method includes accessing a temperature of a camera of the visual tracking system, detecting that the temperature of the camera exceeds a threshold, in response to the detecting, identifying one or more feature points located in a central region of an initial image, generating a graphical user interface element that instructs a user of the visual tracking system to move the visual tracking system towards a border region of the initial image, and determining intrinsic parameters of the camera based on matching pairs of the one or more detected feature points in the border region and one or more projected feature points in the border region.
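A minimal sketch of the final step, refining the camera intrinsics by minimizing reprojection error between detected and projected feature points in the border region; it assumes known camera-frame 3D points, uses SciPy's generic least-squares solver, and the numbers are illustrative, not the patented procedure.

```python
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, fx, fy, cx, cy):
    """Pinhole projection of camera-frame 3D points (no distortion term)."""
    x = points_3d[:, 0] / points_3d[:, 2]
    y = points_3d[:, 1] / points_3d[:, 2]
    return np.stack([fx * x + cx, fy * y + cy], axis=1)

def refine_intrinsics(points_3d, detected_2d, initial):
    """Refine fx, fy, cx, cy so projected features match detections in the border region."""
    def residuals(params):
        return (project(points_3d, *params) - detected_2d).ravel()
    return least_squares(residuals, initial).x

# Illustrative data: features tracked from the image centre into the border region.
rng = np.random.default_rng(0)
pts_3d = rng.uniform([-1, -1, 2], [1, 1, 4], size=(40, 3))
true_params = (610.0, 612.0, 320.0, 240.0)        # intrinsics drifted by heat
detections = project(pts_3d, *true_params) + rng.normal(0, 0.3, (40, 2))
factory_params = (600.0, 600.0, 320.0, 240.0)     # factory calibration as initial guess
print(refine_intrinsics(pts_3d, detections, factory_params))
```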
-
Publication No.: US20240135633A1
Publication Date: 2024-04-25
Application No.: US18400289
Filing Date: 2023-12-29
Applicant: Snap Inc.
Inventor: Kai Zhou , Qi Qi , Jeroen Hol
CPC classification number: G06T15/20 , G06T7/20 , G06T7/50 , G06T7/70 , G06T2207/30241 , G06T2207/30244
Abstract: Systems and methods of generating ground truth datasets for producing virtual reality (VR) experiences, for testing simulated sensor configurations, and for training machine-learning algorithms. In one example, a recording device with one or more cameras and one or more inertial measurement units captures images and motion data along a real path through a physical environment. A SLAM application uses the captured data to calculate the trajectory of the recording device. A polynomial interpolation module uses Chebyshev polynomials to generate a continuous time trajectory (CTT) function. The method includes identifying a virtual environment and assembling a simulated sensor configuration, such as a VR headset. Using the CTT function, the method includes generating a ground truth output dataset that represents the simulated sensor configuration in motion along a virtual path through the virtual environment. The virtual path is closely correlated with the motion along the real path as captured by the recording device. Accordingly, the output dataset produces a realistic and life-like VR experience. In addition, the methods described can be used to generate multiple output datasets, at various sample rates, which are useful for training the machine-learning algorithms which are part of many VR systems.
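The Chebyshev interpolation step maps naturally onto NumPy's polynomial module; the sketch below fits a continuous-time trajectory (CTT) to discrete SLAM poses and resamples it at a higher simulated sensor rate. The degree, sample rates, and synthetic trajectory are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Discrete SLAM output: timestamps (s) and x, y, z positions from the recording device.
t = np.linspace(0.0, 10.0, 300)                   # 30 Hz trajectory samples
xyz = np.stack([np.sin(t), np.cos(t), 0.05 * t], axis=1)

# Fit one Chebyshev series per axis; Chebyshev.fit maps the time span onto [-1, 1].
degree = 20
fits = [Chebyshev.fit(t, xyz[:, k], deg=degree) for k in range(3)]

def ctt(query_times: np.ndarray) -> np.ndarray:
    """Continuous-time trajectory: evaluate the fitted series at arbitrary timestamps."""
    return np.stack([f(query_times) for f in fits], axis=1)

# Resample the same motion at a simulated sensor rate (e.g., a 500 Hz virtual IMU)
# to generate ground truth positions along the virtual path.
imu_times = np.arange(0.0, 10.0, 1.0 / 500.0)
ground_truth_positions = ctt(imu_times)
print(ground_truth_positions.shape)               # (5000, 3)
```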
-
Publication No.: US20240127006A1
Publication Date: 2024-04-18
Application No.: US17967209
Filing Date: 2022-10-17
Applicant: Snap Inc.
Inventor: Kai Zhou , Jennica Pounds , Zsolt Robotka , Márton Gergely Kajtár
CPC classification number: G06F40/58 , G06F3/014 , G06F3/017 , G06F3/0346 , G06V10/26 , G06V20/20 , G06V40/113 , G06V40/28
Abstract: A method for recognizing sign language using collaborative augmented reality devices is described. In one aspect, a method includes accessing a first image generated by a first augmented reality device and a second image generated by a second augmented reality device, the first image and the second image depicting a hand gesture of a user of the first augmented reality device, synchronizing the first augmented reality device with the second augmented reality device, in response to the synchronizing, distributing one or more processes of a sign language recognition system between the first and second augmented reality devices, collecting results from the one or more processes from the first and second augmented reality devices, and displaying, in near real-time in a first display of the first augmented reality device, text indicating a sign language translation of the hand gesture based on the results.
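A structural sketch of the work split after synchronization: each device runs a per-view hand-keypoint stage, and the first device fuses the results and classifies the sign. Threads stand in for the two devices, and every function body is a hypothetical placeholder, not the patented pipeline.

```python
from concurrent.futures import ThreadPoolExecutor

def detect_hand_keypoints(image):
    """Per-view stage (runs on each AR device): locate hand landmarks in its camera image."""
    return [(0.41, 0.52), (0.44, 0.55)]           # placeholder landmark list

def classify_gesture(keypoints_a, keypoints_b):
    """Fusion stage (runs on the first AR device): combine both views and classify the sign."""
    return "HELLO"                                # placeholder translation

def recognize(frame_device_1, frame_device_2):
    # After clock synchronization, the per-view stages run in parallel, one on each
    # device; here two threads stand in for the two collaborating devices.
    with ThreadPoolExecutor(max_workers=2) as pool:
        kp1 = pool.submit(detect_hand_keypoints, frame_device_1)
        kp2 = pool.submit(detect_hand_keypoints, frame_device_2)
        text = classify_gesture(kp1.result(), kp2.result())
    return text                                   # shown on the first device's display

print(recognize(frame_device_1=b"...", frame_device_2=b"..."))
```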
-
Publication No.: US12190438B2
Publication Date: 2025-01-07
Application No.: US18400289
Filing Date: 2023-12-29
Applicant: Snap Inc.
Inventor: Kai Zhou , Qi Qi , Jeroen Hol
Abstract: Systems and methods of generating ground truth datasets for producing virtual reality (VR) experiences, for testing simulated sensor configurations, and for training machine-learning algorithms. In one example, a recording device with one or more cameras and one or more inertial measurement units captures images and motion data along a real path through a physical environment. A SLAM application uses the captured data to calculate the trajectory of the recording device. A polynomial interpolation module uses Chebyshev polynomials to generate a continuous time trajectory (CTT) function. The method includes identifying a virtual environment and assembling a simulated sensor configuration, such as a VR headset. Using the CTT function, the method includes generating a ground truth output dataset that represents the simulated sensor configuration in motion along a virtual path through the virtual environment. The virtual path is closely correlated with the motion along the real path as captured by the recording device. Accordingly, the output dataset produces a realistic and life-like VR experience. In addition, the methods described can be used to generate multiple output datasets, at various sample rates, which are useful for training the machine-learning algorithms which are part of many VR systems.
-
Publication No.: US20240303934A1
Publication Date: 2024-09-12
Application No.: US18118906
Filing Date: 2023-03-08
Applicant: Snap Inc.
Inventor: Thomas Muttenthaler , Kai Zhou
CPC classification number: G06T19/006 , G02B27/0172 , G06T3/40 , G06V10/25 , G02B2027/0178 , G06V2201/07
Abstract: Examples describe adaptive image processing for an augmented reality (AR) device. An input image is captured by a camera of the AR device, and a region of interest of the input image is determined. The region of interest is associated with an object that is being tracked using an object tracking system. A crop-and-scale order of an image processing operation directed at the region of interest is determined for the input image. One or more object tracking parameters may be used to determine the crop-and-scale order. The crop-and-scale order is dynamically adjustable between a first order and a second order. An output image is generated from the input image by performing the image processing operation according to the determined crop-and-scale order for the particular input image. The output image can be accessed by the object tracking system to track the object.
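A sketch of the dynamically adjustable crop-and-scale order using OpenCV; the heuristic that selects between the two orders (ROI size relative to the output resolution) is an assumption for illustration, not the parameter logic claimed in the application.

```python
import cv2
import numpy as np

def process_roi(image, roi, out_size=(256, 256)):
    """Crop/scale a tracked region of interest, choosing the operation order per frame.

    roi: (x, y, w, h). Assumed heuristic: large ROIs are cropped first (cheaper to
    downscale a small patch); small ROIs are scaled first so the crop keeps more
    context around the tracked object.
    """
    x, y, w, h = roi
    if w >= out_size[0] and h >= out_size[1]:
        # Order 1: crop, then scale down to the output resolution.
        patch = image[y:y + h, x:x + w]
        return cv2.resize(patch, out_size, interpolation=cv2.INTER_AREA)
    # Order 2: scale the full frame up, then crop around the ROI centre.
    scale = max(out_size[0] / max(w, 1), out_size[1] / max(h, 1))
    resized = cv2.resize(image, None, fx=scale, fy=scale, interpolation=cv2.INTER_LINEAR)
    cx, cy = int((x + w / 2) * scale), int((y + h / 2) * scale)
    x0 = np.clip(cx - out_size[0] // 2, 0, resized.shape[1] - out_size[0])
    y0 = np.clip(cy - out_size[1] // 2, 0, resized.shape[0] - out_size[1])
    return resized[y0:y0 + out_size[1], x0:x0 + out_size[0]]

frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(process_roi(frame, roi=(600, 300, 80, 60)).shape)   # small ROI -> scale then crop
```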
-
Publication No.: US20240193870A1
Publication Date: 2024-06-13
Application No.: US18065180
Filing Date: 2022-12-13
Applicant: Snap Inc.
Inventor: Kai Zhou , Dunxu Hu , Dominik Schnitzer
CPC classification number: G06T19/003 , G06F3/017 , G06F30/20 , G06T15/50
Abstract: A content creation system for extended reality (XR) systems is described. The content creation system receives motion data of an XR device and generates trajectory data of a trajectory within a 3D environment model of a real-world scene based on the motion data, where the trajectory simulates the motion of the XR device within the real-world scene. The content creation system receives user interaction event data and generates simulated sensor data based on the trajectory data, the 3D environment model, and the user interaction event data. The content creation system generates simulated tracking data from the simulated sensor data using a computer vision component and determines simulated power consumption data and thermal condition data based on operation of the computer vision component while generating the simulated tracking data. The content creation system generates a display, from a user's perspective, of the 3D environment model along with the simulated power consumption and thermal data.
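A purely structural sketch of the simulation pipeline described above; the trajectory, sensor, power, and thermal models are placeholder formulas chosen only to show how the pieces fit together, not the patented models.

```python
from dataclasses import dataclass

@dataclass
class SimulationReport:
    tracked_poses: list
    power_mw: float
    peak_temp_c: float

def simulate_session(motion_data, environment_model, interaction_events):
    # 1. Turn recorded device motion into a trajectory through the 3D scene model.
    trajectory = [(t, pose) for t, pose in motion_data]

    # 2. Render simulated sensor samples along that trajectory (placeholder).
    sensor_frames = [(t, environment_model.get("name"), pose) for t, pose in trajectory]

    # 3. Run a tracking stack on the simulated sensors, and account for extra work
    #    triggered by user interaction events (illustrative cost model only).
    tracked = [pose for _, _, pose in sensor_frames]
    power = 350.0 + 4.0 * len(sensor_frames) + 25.0 * len(interaction_events)
    temperature = 30.0 + 0.005 * power

    return SimulationReport(tracked, power_mw=power, peak_temp_c=temperature)

report = simulate_session(
    motion_data=[(0.0, (0, 0, 0)), (0.1, (0.01, 0, 0))],
    environment_model={"name": "living_room"},
    interaction_events=["tap"],
)
print(report.power_mw, report.peak_temp_c)
```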
-
Publication No.: US20240078733A1
Publication Date: 2024-03-07
Application No.: US18388987
Filing Date: 2023-11-13
Applicant: Snap Inc.
Inventor: Kai Zhou , Kenneth Au
CPC classification number: G06T13/40 , G06F16/90335 , G06F18/21 , G06T19/006 , G06V20/30 , H04L67/535
Abstract: A system and a method for automated GIF file generation are described. In one aspect, the method includes accessing an animated GIF file, identifying a plurality of elements displayed in the animated GIF file, and generating a variant animated GIF file by applying a variation of one or more of the elements to the animated GIF file. The system measures a trending metric of the variant animated GIF file based on the number of times the variant animated GIF file is shared on a communication platform and uses the trending metric as feedback for generating variant animated GIF files.
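A toy sketch of the feedback loop: element-level variations are applied, a shares-per-hour trending metric is measured, and the score steers which variation is tried next. Element extraction and GIF re-rendering are placeholders, not the patented implementation.

```python
import random

VARIATIONS = ["swap_caption", "recolor_background", "replace_sticker", "change_speed"]

def generate_variant(gif_elements: dict, variation: str) -> dict:
    """Apply one element-level variation; a stand-in for real GIF re-rendering."""
    variant = dict(gif_elements)
    variant["applied_variation"] = variation
    return variant

def trending_metric(share_count: int, hours_live: float) -> float:
    """Shares per hour on the communication platform."""
    return share_count / max(hours_live, 1.0)

# Feedback loop: prefer variations whose earlier variants trended best.
scores = {v: 1.0 for v in VARIATIONS}
for _ in range(3):
    variation = max(scores, key=lambda v: scores[v] + random.random())  # explore a little
    variant = generate_variant({"caption": "hi", "background": "blue"}, variation)
    shares = random.randint(0, 500)               # stand-in for observed platform shares
    scores[variation] = 0.5 * scores[variation] + 0.5 * trending_metric(shares, 24.0)
print(scores)
```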
-
Publication No.: US11688101B2
Publication Date: 2023-06-27
Application No.: US17448655
Filing Date: 2021-09-23
Applicant: Snap Inc.
Inventor: Clemens Birklbauer , Georg Halmetschlager-Funek , Matthias Kalkgruber , Kai Zhou
CPC classification number: G06T7/80 , G06V10/443 , H04N17/002 , H04N23/61 , H04N23/651 , H04N23/6812
Abstract: A method for adjusting camera intrinsic parameters of a multi-camera visual tracking device is described. In one aspect, a method for calibrating the multi-camera visual tracking system includes disabling a first camera of the multi-camera visual tracking system while a second camera of the multi-camera visual tracking system is enabled, detecting a first set of features in a first image generated by the first camera after detecting that the temperature of the first camera is within the threshold of the factory calibration temperature of the first camera, and accessing and correcting intrinsic parameters of the second camera based on the projection of the first set of features in the second image and a second set of features in the second image.
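A minimal sketch of the cross-camera correction step, assuming features from the trusted first camera carry depth, the first-to-second extrinsics are known from factory data, and only the second camera's pinhole intrinsics are refined; all values are illustrative, not the patented procedure.

```python
import numpy as np
from scipy.optimize import least_squares

R = np.eye(3)
t = np.array([0.06, 0.0, 0.0])                    # ~6 cm stereo baseline (assumed)

def to_second_camera(points_cam1):
    """Transform camera-one 3D points into the second camera's frame."""
    return points_cam1 @ R.T + t

def project(points, fx, fy, cx, cy):
    """Pinhole projection (no distortion term)."""
    return np.stack([fx * points[:, 0] / points[:, 2] + cx,
                     fy * points[:, 1] / points[:, 2] + cy], axis=1)

def correct_second_camera(features_cam1_3d, detections_cam2, intrinsics_cam2):
    """Refine the second camera's intrinsics against features projected from camera one."""
    pts = to_second_camera(features_cam1_3d)
    def residuals(params):
        return (project(pts, *params) - detections_cam2).ravel()
    return least_squares(residuals, intrinsics_cam2).x

rng = np.random.default_rng(1)
feats = rng.uniform([-0.5, -0.5, 1.5], [0.5, 0.5, 3.0], size=(50, 3))
observed = project(to_second_camera(feats), 505.0, 507.0, 321.0, 238.0)  # drifted camera
print(correct_second_camera(feats, observed, (500.0, 500.0, 320.0, 240.0)))
```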