-
Publication No.: US20250123679A1
Publication Date: 2025-04-17
Application No.: US18920449
Application Date: 2024-10-18
Applicant: Meta Platforms Technologies, LLC
Inventor: Tsz Ho Yu , Chengyuan Yan , Christian Forster
Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment, identifying one or more features from the first image to estimate a pose of the hand of the user, estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller, receiving IMU data of the controller, and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment to provide consistently accurate controller tracking.
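The claimed pipeline can be illustrated with a minimal Python/NumPy sketch. All names here (estimate_controller_pose, grip_transform, update_with_imu) are hypothetical placeholders for the claimed steps, not the patent's actual implementation.

    import numpy as np

    def compose(pose_a, pose_b):
        # Compose two 4x4 homogeneous transforms.
        return pose_a @ pose_b

    def estimate_controller_pose(hand_pose, grip_transform):
        # First controller pose = hand pose composed with the estimated grip,
        # i.e. the relative pose between the hand and the controller.
        return compose(hand_pose, grip_transform)

    def update_with_imu(first_pose, imu_delta):
        # Second controller pose = first pose refined by the IMU-propagated motion.
        return compose(first_pose, imu_delta)

    # Hypothetical inputs: identity hand pose, a fixed grip offset, and an
    # identity IMU increment (in practice integrated from gyro/accelerometer samples).
    hand_pose = np.eye(4)
    grip_transform = np.eye(4)
    grip_transform[:3, 3] = [0.0, -0.05, 0.10]
    imu_delta = np.eye(4)

    first_pose = estimate_controller_pose(hand_pose, grip_transform)
    second_pose = update_with_imu(first_pose, imu_delta)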
-
Publication No.: US20240353920A1
Publication Date: 2024-10-24
Application No.: US18649918
Application Date: 2024-04-29
Applicant: Meta Platforms Technologies, LLC
Inventor: Christian Forster , Andrew Melim
CPC classification number: G06F3/012 , A61B5/6803 , G06F1/163 , G06F3/0308
Abstract: In one embodiment, a method for tracking includes receiving motion data captured by a motion sensor of a wearable device, generating a pose of the wearable device based on the motion data, capturing a first frame of the wearable device by a camera using a first exposure time, identifying, in the first frame, a pattern of lights disposed on the wearable device, capturing a second frame of the wearable device by the camera using a second exposure time, identifying, in the second frame, predetermined features of the wearable device, and adjusting the pose of the wearable device in the environment based on the identified pattern of lights in the first frame or the identified predetermined features in the second frame. The method utilizes the predetermined features to track the wearable device in a visible-light frame under specific lighting conditions, improving the accuracy of the estimated pose of the wearable device.
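A rough Python sketch of the claimed frame-selection logic, assuming hypothetical detector and refinement helpers: the short-exposure frame isolates the light pattern, while the second exposure exposes the predetermined visible-light features used as a fallback.

    def detect_led_pattern(frame_short_exposure):
        # Placeholder: locate the constellation of lights on the wearable device
        # in the short-exposure frame (LEDs stay bright, background is suppressed).
        return None  # e.g. a list of 2D blob centers, or None if not found

    def detect_visible_features(frame_long_exposure):
        # Placeholder: locate predetermined visible-light features (corners,
        # markers) in the longer-exposure frame.
        return None

    def refine(pose, observation):
        # Placeholder refinement: a real tracker would minimize the reprojection
        # error of the observed pattern/features against the device model.
        return pose

    def adjust_pose(imu_pose, led_pattern, visible_features):
        # Prefer the light pattern when detected; otherwise fall back to the
        # predetermined features; otherwise keep the motion-only pose.
        if led_pattern is not None:
            return refine(imu_pose, led_pattern)
        if visible_features is not None:
            return refine(imu_pose, visible_features)
        return imu_pose

    pose = adjust_pose(
        imu_pose={"position": (0.0, 0.0, 0.0), "rotation": (1.0, 0.0, 0.0, 0.0)},
        led_pattern=detect_led_pattern(frame_short_exposure=None),
        visible_features=detect_visible_features(frame_long_exposure=None),
    )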
-
Publication No.: US20240119568A1
Publication Date: 2024-04-11
Application No.: US18484193
Application Date: 2023-10-10
Applicant: Meta Platforms Technologies, LLC
Inventor: Andrey Tovchigrechko , Fabian Langguth , Alexander Sorkine Hornung , Oskar Linde , Christian Forster
CPC classification number: G06T5/002 , G06T5/005 , G06T7/10 , G06T7/285 , G06T17/20 , G06T19/20 , G06V20/20 , G06T2207/10012 , G06T2207/10028 , G06T2207/20021 , G06T2210/44 , G06T2219/2021
Abstract: A processor accesses a depth map and a first image of a scene generated using one or more sensors of an artificial reality device. The processor generates, based on the first image, segmentation masks respectively associated with a plurality of object types. The segmentation masks segment the depth map into a plurality of segmented depth maps respectively associated with the object types. The processor generates meshes using, respectively, the segmented depth maps. For each eye of the user, the processor captures a second image and generates, based on the second image, segmentation information. The processor warps the plurality of meshes to generate warped meshes for the eye, and then generates an eye-specific mesh for the eye by compositing the warped meshes according to the segmentation information. The processor renders an output image for the eye using the second image and the eye-specific mesh.
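An illustrative NumPy sketch of the claimed segmentation step, assuming boolean per-object-type masks; the meshing, warping, and per-eye compositing steps are only indicated in comments and are not part of the code.

    import numpy as np

    def segment_depth_map(depth_map, masks):
        # Each boolean mask selects the depth samples belonging to one object type;
        # everything else is marked invalid (NaN) in that segmented depth map.
        return [np.where(mask, depth_map, np.nan) for mask in masks]

    # Hypothetical inputs: a 4x4 depth map and two object-type masks
    # (e.g. a foreground object vs. the background).
    depth_map = np.random.rand(4, 4).astype(np.float32)
    foreground_mask = np.zeros((4, 4), dtype=bool)
    foreground_mask[1:3, 1:3] = True
    background_mask = ~foreground_mask

    segmented_maps = segment_depth_map(depth_map, [foreground_mask, background_mask])
    # Each segmented depth map would then be triangulated into its own mesh,
    # warped into the eye's viewpoint, and composited per eye according to the
    # segmentation information derived from that eye's second image.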
-
Publication No.: US12153724B2
Publication Date: 2024-11-26
Application No.: US17719630
Application Date: 2022-04-13
Applicant: META PLATFORMS TECHNOLOGIES, LLC
Inventor: Tsz Ho Yu , Chengyuan Yan , Christian Forster
Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment, identifying one or more features from the first image to estimate a pose of the hand of the user, estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller, receiving IMU data of the controller, and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment to provide consistently accurate controller tracking.
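Because this granted patent shares its abstract with the published application listed above, the sketch here illustrates only the complementary IMU-update step (first pose to second pose), using a first-order integration of hypothetical gyroscope samples; the sample format and integration scheme are assumptions for illustration, not the patent's method.

    import numpy as np

    def skew(w):
        # Skew-symmetric matrix of a 3-vector, used to build small rotation increments.
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    def propagate_with_imu(first_pose, gyro_samples, dt):
        # Apply each small rotation increment (omega * dt) to the controller pose.
        pose = first_pose.copy()
        for omega in gyro_samples:
            d_rot = np.eye(3) + skew(np.asarray(omega) * dt)  # first-order approximation
            pose[:3, :3] = pose[:3, :3] @ d_rot
        return pose

    first_pose = np.eye(4)
    second_pose = propagate_with_imu(first_pose, gyro_samples=[(0.0, 0.01, 0.0)], dt=0.005)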