-
Publication No.: US20240402800A1
Publication Date: 2024-12-05
Application No.: US18676786
Filing Date: 2024-05-29
Applicant: Apple Inc.
Inventor: Julian K. Shutzberg , David J. Meyer , David M. Teitelbaum , Mehmet N. Agaoglu , Ian R. Fasel , Chase B. Lortie , Daniel J. Brewer , Tim H. Cornelissen , Leah M. Gum , Alexander G. Berardino , Lorenzo Soto Doblado , Vinay Chawda , Itay Bar Yosef , Dror Irony , Eslam A. Mostafa , Guy Engelhard , Paul A. Lacey , Ashwin Kumar Asoka Kumar Shenoi , Bhavin Vinodkumar Nayak , Liuhao Ge , Lucas Soffer , Victor Belyaev , Bharat C. Dandu , Matthias M. Schroeder , Yirong Tang
IPC: G06F3/01 , G06F3/04815
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
-
Publication No.: US20240103613A1
Publication Date: 2024-03-28
Application No.: US18244570
Filing Date: 2023-09-11
Applicant: Apple Inc.
Inventor: Vinay Chawda , Mehmet N. Agaoglu , Leah M. Gum , Paul A. Lacey , Julian K. Shutzberg , Tim H. Cornelissen , Alexander G. Berardino
IPC: G06F3/01 , G06F3/04842
CPC classification number: G06F3/013 , G06F3/017 , G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
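The gaze-to-gesture association described in this abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the `GazeSample` fields, the 150 ms window, and the fixation-only gate are assumptions standing in for "at around the same time" and "eye-based activity likely to correspond to a user perceiving what they are seeing."

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    timestamp: float           # seconds
    target_id: Optional[str]   # UI element the gaze ray hit, if any
    is_fixation: bool          # stable fixation, i.e. likely intentional looking

def associate_pinch_with_gaze(pinch_time: float,
                              gaze_samples: list[GazeSample],
                              window: float = 0.15) -> Optional[str]:
    """Return the UI element a pinch at pinch_time should act upon.

    Only gaze samples within +/- window seconds of the pinch are
    considered, and only fixations on a UI element are eligible, so the
    pinch is paired with gaze behavior likely to reflect intent.
    """
    candidates = [s for s in gaze_samples
                  if abs(s.timestamp - pinch_time) <= window
                  and s.is_fixation and s.target_id is not None]
    if not candidates:
        return None
    # Prefer the gaze sample closest in time to the pinch.
    best = min(candidates, key=lambda s: abs(s.timestamp - pinch_time))
    return best.target_id
```

The returned element identifier would then drive the input event, e.g. a button selection.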
-
Publication No.: US12265665B1
Publication Date: 2025-04-01
Application No.: US18371098
Filing Date: 2023-09-21
Applicant: Apple Inc.
Inventor: Emmanuel Piuze-Phaneuf , Ali Ercan , Julian K. Shutzberg , Paul A. Lacey
IPC: G06F3/01
Abstract: In some implementations, a method includes: determining a location for virtual content; detecting a user interaction with the virtual content; in response to detecting the user interaction with the virtual content, determining a position of a hand gesture during the user interaction with the virtual content; in accordance with a determination that the position of the hand gesture is within a threshold distance relative to the location of the virtual content, generating corrected hand tracking data associated with the user interaction with the virtual content; and in accordance with a determination that the position of the hand gesture is outside of the threshold distance relative to the location of the virtual content, generating uncorrected hand tracking data associated with the user interaction with the virtual content.
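The threshold-distance gating in this abstract can be sketched as below. The dictionary sample format and the fixed `pov_offset` are hypothetical simplifications; the abstract does not specify how the corrected data is computed, only that correction is applied when the gesture falls within the threshold distance of the virtual content.

```python
import math

def choose_hand_tracking_data(gesture_pos: tuple[float, float, float],
                              content_pos: tuple[float, float, float],
                              threshold: float,
                              raw_sample: dict,
                              pov_offset: tuple[float, float, float]) -> dict:
    """Gate hand-tracking correction on gesture-to-content distance.

    When the hand gesture lands within `threshold` of the virtual
    content, the interaction is assumed to target that content, so a
    corrected sample (raw position shifted by pov_offset) is emitted;
    otherwise the uncorrected sample passes through unchanged.
    """
    if math.dist(gesture_pos, content_pos) <= threshold:
        corrected = dict(raw_sample)
        corrected["position"] = tuple(
            p + o for p, o in zip(raw_sample["position"], pov_offset))
        corrected["corrected"] = True
        return corrected
    return {**raw_sample, "corrected": False}
```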
-
Publication No.: US12223117B1
Publication Date: 2025-02-11
Application No.: US18371111
Filing Date: 2023-09-21
Applicant: Apple Inc.
Inventor: Emmanuel Piuze-Phaneuf , Ali Ercan , Julian K. Shutzberg , Paul A. Lacey
Abstract: In some implementations, a method includes: obtaining uncorrected hand tracking data; obtaining a depth map associated with a physical environment; identifying a position of a portion of the finger within the physical environment based on the depth map and the uncorrected hand tracking data; performing spatial depth smoothing on a region of the depth map adjacent to the position of the portion of the finger; and generating corrected hand tracking data by performing point of view (POV) correction on the uncorrected hand tracking data based on the spatially depth smoothed region of the depth map adjacent to the portion of the finger.
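The localized spatial depth smoothing step can be sketched as follows. A median over a small window is one plausible smoother; the patent does not name a specific filter, and the window radius and the choice of a single median fill for the region are assumptions made for brevity.

```python
import numpy as np

def smooth_depth_around_finger(depth_map: np.ndarray,
                               finger_px: tuple[int, int],
                               radius: int = 3) -> np.ndarray:
    """Smooth only the depth-map region adjacent to the fingertip.

    Filling the neighborhood around the finger with its median
    suppresses sensor noise in exactly the depth values that the
    subsequent point-of-view (POV) correction will read, while leaving
    the rest of the depth map untouched.
    """
    out = depth_map.copy()
    r, c = finger_px
    h, w = depth_map.shape
    r0, r1 = max(0, r - radius), min(h, r + radius + 1)
    c0, c1 = max(0, c - radius), min(w, c + radius + 1)
    out[r0:r1, c0:c1] = np.median(depth_map[r0:r1, c0:c1])
    return out
```

POV correction would then reproject the uncorrected hand position using the smoothed depth at the fingertip rather than the raw, noisy value.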
-
Publication No.: US20240393876A1
Publication Date: 2024-11-28
Application No.: US18790194
Filing Date: 2024-07-31
Applicant: Apple Inc.
Inventor: Vinay Chawda , Mehmet N. Agaoglu , Leah M. Gum , Paul A. Lacey , Julian K. Shutzberg , Tim H. Cornelissen , Alexander G. Berardino
IPC: G06F3/01 , G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
-
Publication No.: US12099653B2
Publication Date: 2024-09-24
Application No.: US18244570
Filing Date: 2023-09-11
Applicant: Apple Inc.
Inventor: Vinay Chawda , Mehmet N. Agaoglu , Leah M. Gum , Paul A. Lacey , Julian K. Shutzberg , Tim H. Cornelissen , Alexander G. Berardino
IPC: G06F3/01 , G06F3/04842
CPC classification number: G06F3/013 , G06F3/017 , G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
-
Publication No.: US20240103618A1
Publication Date: 2024-03-28
Application No.: US18470359
Filing Date: 2023-09-19
Applicant: Apple Inc.
Inventor: Julia Benndorf , Qichao Fan , Julian K. Shutzberg , Paul A. Lacey , Hua Gao
IPC: G06F3/01 , H04N13/344
CPC classification number: G06F3/013 , H04N13/344
Abstract: Methods and apparatus for correcting the gaze direction and the origin (entrance pupil) in gaze tracking systems. During enrollment after an eye model is obtained, the pose of the eye when looking at a target prompt is determined. This information is used to estimate the true visual axis of the eye. The visual axis may then be used to correct the point of view (PoV) with respect to the display during use. If a clip-on lens is present, a corrected gaze axis may be calculated based on the known optical characteristics and pose of the clip-on lens. A clip-on corrected entrance pupil may then be estimated by firing two or more virtual rays through the clip-on lens to determine the intersection between the rays and the corrected gaze axis.
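The enrollment step that estimates the true visual axis can be sketched as below: while the user fixates a known target prompt, the visual axis is the ray from the entrance pupil to the target, and the offset from the eye model's optical axis is captured as a rotation (standard Rodrigues construction) that is later applied to live gaze estimates. The function names and the single-target calibration are illustrative assumptions; the clip-on-lens ray-casting step is not shown.

```python
import numpy as np

def unit(v: np.ndarray) -> np.ndarray:
    return v / np.linalg.norm(v)

def enrollment_rotation(optical_axis: np.ndarray,
                        pupil_center: np.ndarray,
                        target: np.ndarray) -> np.ndarray:
    """Rotation mapping the model's optical axis onto the visual axis.

    The true visual axis is taken as the ray from the entrance pupil to
    the fixated enrollment target; the returned 3x3 matrix rotates the
    optical axis onto it (Rodrigues formula between two unit vectors).
    """
    a = unit(optical_axis)
    b = unit(target - pupil_center)  # true visual axis at enrollment
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

def corrected_gaze(optical_axis: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Apply the stored enrollment rotation to a live optical-axis estimate."""
    return unit(R @ unit(optical_axis))
```

The degenerate case of exactly opposite axes (where `1 + c` vanishes) cannot occur for the small optical-to-visual-axis offsets involved here.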
-