-
Publication number: US12092826B2
Publication date: 2024-09-17
Application number: US18128905
Filing date: 2023-03-30
Applicant: Snap Inc.
CPC classification: G02B27/0172, G09G3/32, G02B2027/014, G02B2027/0178, G09G2360/04
Abstract: An eXtended Reality (XR) display system includes a Light Emitting Diode (LED) display driver and an LED near-eye display element operatively coupled to the LED display driver. The LED near-eye display element includes one or more motors and an LED array operably connected to the one or more motors. During operation, the LED display driver receives video data including a rendered virtual object of an XR experience and generates LED array control signals based on the video data, the LED array control signals causing one or more LEDs of the LED array to be energized in a sequence. The LED display driver also generates synchronized motor control signals and simultaneously communicates the LED array control signals to the LED array and the synchronized motor control signals to the one or more motors, causing the LED near-eye display element to display the rendered virtual object.
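The synchronization described above resembles a persistence-of-vision scheme in which the lit LEDs must follow the motor's angular position. The Python sketch below illustrates that idea in outline only; the column-per-angle mapping, the LED count, the angular resolution, and the timing constants are assumptions for illustration and are not taken from the patent.

```python
# Minimal sketch (not the patent's implementation): synchronizing LED column
# updates with motor rotation for a persistence-of-vision style near-eye display.
import math

NUM_LEDS = 64          # LEDs in the rotating array (assumed)
COLUMNS_PER_REV = 360  # angular resolution per revolution (assumed)

def led_signals_for_angle(frame, angle_rad):
    """Pick the frame column matching the current motor angle and return
    per-LED on/off levels (the 'LED array control signals')."""
    col = int((angle_rad % (2 * math.pi)) / (2 * math.pi) * COLUMNS_PER_REV)
    column = frame[col]                      # frame: COLUMNS_PER_REV x NUM_LEDS
    return [1 if px > 127 else 0 for px in column]

def drive_display(frame, motor_rpm, duration_s, step_s=0.0005):
    """Simulate the display driver: at each time step, compute the motor angle
    (the 'synchronized motor control signal') and the matching LED sequence."""
    omega = motor_rpm * 2 * math.pi / 60.0   # rad/s
    t = 0.0
    while t < duration_s:
        angle = omega * t
        leds = led_signals_for_angle(frame, angle)
        yield t, angle, leds                 # would be sent to motor + LED array
        t += step_s

# Usage: a synthetic frame with a single bright column at angle index 0.
frame = [[255 if c == 0 else 0 for _ in range(NUM_LEDS)] for c in range(COLUMNS_PER_REV)]
for t, angle, leds in drive_display(frame, motor_rpm=3000, duration_s=0.001):
    print(f"t={t:.4f}s angle={angle:.2f}rad lit={sum(leds)}")
```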
-
Publication number: US11915453B2
Publication date: 2024-02-27
Application number: US18106655
Filing date: 2023-02-07
Applicant: Snap Inc.
CPC classification: G06T7/80, G02B27/017, G06T7/70, G06T19/006, H04N23/54, G05G9/04737, G06T2207/30241, G06T2207/30244
Abstract: Eyewear that provides an interactive augmented reality (AR) experience between two eyewear devices by using alignment between their respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B each track the eyewear device of the other user, or an object of the other user such as the other user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with a reduced computational burden on the processor. An inertial measurement unit may also be used to align the eyewear devices.
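As a rough illustration of aligning two devices by their motion rather than by a shared marker, the sketch below fits a rigid transform between two time-synchronized position trajectories using a Kabsch-style least-squares fit. This is a generic technique standing in for the patented ego-motion alignment; the function names and synthetic data are illustrative, not taken from the patent.

```python
# Minimal sketch: align two devices' trajectories with a rigid transform.
import numpy as np

def align_trajectories(traj_a, traj_b):
    """traj_a, traj_b: (N, 3) arrays of time-synchronized device positions.
    Returns rotation R and translation t such that R @ a + t ~= b."""
    a = np.asarray(traj_a, dtype=float)
    b = np.asarray(traj_b, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)          # centroids
    H = (a - ca).T @ (b - cb)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Usage with synthetic data: traj_b is traj_a rotated 30 degrees and shifted.
rng = np.random.default_rng(0)
traj_a = rng.uniform(-1, 1, size=(100, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
traj_b = traj_a @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = align_trajectories(traj_a, traj_b)
print(np.allclose(R, R_true, atol=1e-6))             # True
```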
-
Publication number: US20240221220A1
Publication date: 2024-07-04
Application number: US18442772
Filing date: 2024-02-15
Applicant: Snap Inc.
CPC classification: G06T7/80, G02B27/017, G06T7/70, G06T19/006, H04N23/54, G05G9/04737, G06T2207/30241, G06T2207/30244
Abstract: Eyewear that provides an interactive augmented reality (AR) experience between two eyewear devices by using alignment between their respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B each track the eyewear device of the other user, or an object of the other user such as the other user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with a reduced computational burden on the processor. An inertial measurement unit may also be used to align the eyewear devices.
-
Publication number: US20230267691A1
Publication date: 2023-08-24
Application number: US17677821
Filing date: 2022-02-22
Applicant: Snap Inc.
Inventor: Branislav Micusik
CPC classification: G06T19/006, G06F3/017, G06F3/0346, G06V10/76, G06V10/82, G06V20/46
Abstract: A method for detecting changes in a scene includes accessing a first set of images and corresponding pose data in a first coordinate system associated with a first user session of an augmented reality (AR) device, and accessing a second set of images and corresponding pose data in a second coordinate system associated with a second user session. After aligning the first coordinate system with the second coordinate system, the method identifies the images in the first set that correspond to a second image from the second set based on the pose data of the first set of images being spatially closest to the pose data of the second image. A trained neural network generates a synthesized image from the first set of images. Features of the second image are subtracted from features of the synthesized image, and areas of change are identified based on the subtracted features.
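The retrieval-and-compare structure of the abstract can be sketched briefly: after mapping session-A poses into session-B coordinates, pick the spatially closest session-A images for a given session-B image, then compare them. In this sketch the trained view-synthesis network and the feature extractor are replaced by trivial placeholders (an average image and raw pixel differences), so it shows only the data flow, not the patented method; all names and thresholds are assumptions.

```python
# Minimal sketch of pose-based retrieval followed by placeholder change detection.
import numpy as np

def closest_by_pose(poses_a, pose_b, align_R, align_t, k=3):
    """poses_a: (N, 3) session-A camera positions; pose_b: (3,) session-B position.
    align_R/align_t map session-A coordinates into session-B coordinates."""
    aligned = poses_a @ align_R.T + align_t
    dists = np.linalg.norm(aligned - pose_b, axis=1)
    return np.argsort(dists)[:k]

def change_mask(images_a, image_b, threshold=0.2):
    """Stand-in for 'synthesize, extract features, subtract': average the
    retrieved images as a fake synthesized view and threshold the difference."""
    synthesized = np.mean(images_a, axis=0)          # placeholder for the network
    diff = np.abs(synthesized - image_b)             # placeholder feature subtraction
    return diff > threshold

# Usage with synthetic data.
rng = np.random.default_rng(1)
poses_a = rng.uniform(0, 5, size=(50, 3))
pose_b = np.array([2.0, 2.0, 2.0])
idx = closest_by_pose(poses_a, pose_b, np.eye(3), np.zeros(3))
images_a = rng.uniform(0, 1, size=(50, 16, 16))
image_b = images_a[idx[0]].copy()
image_b[4:8, 4:8] += 0.9                             # simulate a scene change
mask = change_mask(images_a[idx], image_b)
print(mask[4:8, 4:8].mean(), mask.mean())            # fraction flagged inside the changed patch vs. overall
```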
-
Publication number: US20240233144A9
Publication date: 2024-07-11
Application number: US17973167
Filing date: 2022-10-25
Applicant: Snap Inc.
CPC classification: G06T7/292, G06F3/011, G06T7/564, G06T19/006, G06V20/64, G06T2207/10012, G06T2207/10028, G06T2210/56
Abstract: A method for carving a 3D space using hand tracking is described. In one aspect, the method includes accessing a first frame from a camera of a display device; tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame; detecting, using a sensor of the display device, depths of the hand pixels; identifying a 3D region based on the depths of the hand pixels; and applying a 3D reconstruction engine to the 3D region.
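A minimal sketch of the carving idea: unproject the tracked hand pixels with their sensed depths into camera space and take a padded bounding box as the 3D region handed to a reconstruction engine. The pinhole intrinsics, the margin, and the choice of an axis-aligned box are assumptions for illustration, not details from the patent.

```python
# Minimal sketch: bound a 3D region of interest from tracked hand pixels + depths.
import numpy as np

def unproject(pixels, depths, fx, fy, cx, cy):
    """pixels: (N, 2) array of (u, v) hand-pixel coordinates; depths: (N,) metric
    depths from the device's depth sensor. Returns (N, 3) camera-space points."""
    u, v = pixels[:, 0], pixels[:, 1]
    x = (u - cx) / fx * depths
    y = (v - cy) / fy * depths
    return np.stack([x, y, depths], axis=1)

def carve_region(pixels, depths, fx=500.0, fy=500.0, cx=320.0, cy=240.0, margin=0.05):
    """Return (min_corner, max_corner) of the 3D box spanned by the hand points,
    padded by a small margin; a reconstruction engine would be run inside it."""
    points = unproject(pixels, depths, fx, fy, cx, cy)
    return points.min(axis=0) - margin, points.max(axis=0) + margin

# Usage with synthetic hand pixels at roughly 0.4 m from the camera.
rng = np.random.default_rng(2)
hand_pixels = rng.uniform([250, 180], [400, 300], size=(200, 2))
hand_depths = rng.uniform(0.35, 0.45, size=200)
lo, hi = carve_region(hand_pixels, hand_depths)
print("carve box min:", np.round(lo, 3), "max:", np.round(hi, 3))
```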
-
Publication number: US20240176144A1
Publication date: 2024-05-30
Application number: US18128905
Filing date: 2023-03-30
Applicant: Snap Inc.
CPC classification: G02B27/0172, G09G3/32, G02B2027/014, G02B2027/0178, G09G2360/04
Abstract: An eXtended Reality (XR) display system includes a Light Emitting Diode (LED) display driver and an LED near-eye display element operatively coupled to the LED display driver. The LED near-eye display element includes one or more motors and an LED array operably connected to the one or more motors. During operation, the LED display driver receives video data including a rendered virtual object of an XR experience and generates LED array control signals based on the video data, the LED array control signals causing one or more LEDs of the LED array to be energized in a sequence. The LED display driver also generates synchronized motor control signals and simultaneously communicates the LED array control signals to the LED array and the synchronized motor control signals to the one or more motors, causing the LED near-eye display element to display the rendered virtual object.
-
Publication number: US20230401796A1
Publication date: 2023-12-14
Application number: US17893723
Filing date: 2022-08-23
Applicant: Snap Inc.
CPC classification: G06T19/006, G06T19/20, G06V20/20, G06T7/74, G06T2207/10028, G06T2207/30244, G06T2219/024
Abstract: A method for aligning coordinate systems of separate augmented reality (AR) devices is described. In one aspect, the method includes generating predicted depths of a first point cloud by applying a pre-trained model to a first single image generated by a first monocular camera of a first AR device and to first sparse 3D points generated by a first SLAM system at the first AR device; generating predicted depths of a second point cloud by applying the pre-trained model to a second single image generated by a second monocular camera of a second AR device and to second sparse 3D points generated by a second SLAM system at the second AR device; and determining a relative pose between the first AR device and the second AR device by registering the first point cloud with the second point cloud.
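One step of the described pipeline, putting a monocular depth prediction onto metric scale with the help of the sparse SLAM points so that the two devices' point clouds can later be registered (for example with ICP), can be sketched as follows. The median-ratio scale fit, the pinhole projection, and all constants are assumptions; the pre-trained depth network is replaced by an input array.

```python
# Minimal sketch: recover metric scale for a relative monocular depth map
# using sparse SLAM points, prior to point-cloud registration.
import numpy as np

def project_sparse(points_cam, fx, fy, cx, cy):
    """Project camera-space SLAM points (N, 3) to pixel coords and depths."""
    z = points_cam[:, 2]
    u = points_cam[:, 0] / z * fx + cx
    v = points_cam[:, 1] / z * fy + cy
    return np.rint(np.stack([u, v], axis=1)).astype(int), z

def scale_predicted_depth(pred_depth, sparse_points, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """pred_depth: (H, W) relative depth from the network. Returns a metrically
    scaled depth map using the median ratio against sparse SLAM depths."""
    px, z = project_sparse(sparse_points, fx, fy, cx, cy)
    h, w = pred_depth.shape
    keep = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
    ratios = z[keep] / pred_depth[px[keep, 1], px[keep, 0]]
    return pred_depth * np.median(ratios)

# Usage: fake a relative-depth prediction that is off from metric by 2.5x and
# sparse SLAM points sampled consistently with the true metric depth.
rng = np.random.default_rng(3)
true_depth = rng.uniform(1.0, 3.0, size=(480, 640))
pred_depth = true_depth / 2.5
us = rng.integers(0, 640, size=100)
vs = rng.integers(0, 480, size=100)
zs = true_depth[vs, us]
sparse = np.stack([(us - 320.0) / 500.0 * zs, (vs - 240.0) / 500.0 * zs, zs], axis=1)
scaled = scale_predicted_depth(pred_depth, sparse)
print(np.allclose(scaled, true_depth))   # True: metric scale recovered
```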
-
Publication number: US12125150B2
Publication date: 2024-10-22
Application number: US17677821
Filing date: 2022-02-22
Applicant: Snap Inc.
Inventor: Branislav Micusik
CPC classification: G06T19/006, G06F3/017, G06F3/0346, G06V10/76, G06V10/82, G06V20/46
Abstract: A method for detecting changes in a scene includes accessing a first set of images and corresponding pose data in a first coordinate system associated with a first user session of an augmented reality (AR) device, and accessing a second set of images and corresponding pose data in a second coordinate system associated with a second user session. After aligning the first coordinate system with the second coordinate system, the method identifies the images in the first set that correspond to a second image from the second set based on the pose data of the first set of images being spatially closest to the pose data of the second image. A trained neural network generates a synthesized image from the first set of images. Features of the second image are subtracted from features of the synthesized image, and areas of change are identified based on the subtracted features.
-
Publication number: US20240288946A1
Publication date: 2024-08-29
Application number: US18341558
Filing date: 2023-06-26
Applicant: Snap Inc.
Inventors: Georgios Evangelidis, Bernhard Jung, Ilteris Kaan Canberk, Daniel Wolf, Balázs Töth, Márton Gergely Kajtár, Branislav Micusik
CPC classification: G06F3/017, G06F3/012, G06T7/248, G06T7/74, G06T2207/30196, G06T2207/30204, G06T2207/30241
Abstract: A method aligns the coordinate systems of user devices in an augmented reality system using somatic points of a user's hand as alignment markers. Images captured by multiple user devices are used to align the reference coordinate systems of the user devices to a common reference coordinate system. In some examples, user devices capture images of a hand of a user and use object recognition to identify somatic points as alignment markers. The somatic points identified by one user device are translated to a common reference coordinate system determined by another user device.
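Because hand landmarks are named, two devices observing the same hand obtain labeled 3D correspondences from which a frame-to-frame transform can be fitted. The sketch below uses illustrative landmark names and, for brevity, a plain least-squares affine fit; a constrained rigid fit (rotation plus translation only) would normally be preferred. None of the identifiers come from the patent.

```python
# Minimal sketch: map one device's frame into another's using shared, named
# hand-landmark ("somatic point") correspondences and a least-squares fit.
import numpy as np

def fit_frame_b_to_a(landmarks_a, landmarks_b):
    """landmarks_*: dict of landmark name -> (3,) point in that device's frame.
    Returns a 4x3 matrix X such that [p_b, 1] @ X ~= p_a for shared landmarks."""
    names = sorted(set(landmarks_a) & set(landmarks_b))
    P_b = np.array([landmarks_b[n] for n in names])
    P_a = np.array([landmarks_a[n] for n in names])
    A = np.hstack([P_b, np.ones((len(names), 1))])   # homogeneous coordinates
    X, *_ = np.linalg.lstsq(A, P_a, rcond=None)
    return X

def to_frame_a(points_b, X):
    return np.hstack([points_b, np.ones((len(points_b), 1))]) @ X

# Usage: device B's frame is device A's frame rotated 90 degrees about Z and shifted.
names = ["wrist", "thumb_tip", "index_tip", "pinky_tip", "palm_center"]
pts_a = np.array([[0, 0, 0], [0.05, 0.02, 0.01], [0.09, 0.00, 0.02],
                  [0.07, -0.04, 0.01], [0.04, -0.01, 0.00]], dtype=float)
R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
pts_b = pts_a @ R.T + np.array([0.3, -0.1, 0.2])
landmarks_a = dict(zip(names, pts_a))
landmarks_b = dict(zip(names, pts_b))
X = fit_frame_b_to_a(landmarks_a, landmarks_b)
print(np.allclose(to_frame_a(pts_b, X), pts_a, atol=1e-9))   # True
```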
-
Publication number: US20240135555A1
Publication date: 2024-04-25
Application number: US17973167
Filing date: 2022-10-24
Applicant: Snap Inc.
CPC classification: G06T7/292, G06F3/011, G06T7/564, G06T19/006, G06V20/64, G06T2207/10012, G06T2207/10028, G06T2210/56
Abstract: A method for carving a 3D space using hand tracking is described. In one aspect, the method includes accessing a first frame from a camera of a display device; tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame; detecting, using a sensor of the display device, depths of the hand pixels; identifying a 3D region based on the depths of the hand pixels; and applying a 3D reconstruction engine to the 3D region.