User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US20240103613A1

    Publication Date: 2024-03-28

    Application Number: US18244570

    Filing Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
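
    The gaze/gesture association this abstract describes lends itself to a short sketch. The following Python is a minimal, hypothetical illustration; the event types, the 150 ms pairing window, and the 100 ms fixation threshold are assumptions chosen for illustration, not values taken from the patent:

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class GazeHold:
            target_id: str     # UI component the gaze is resting on
            start: float       # seconds
            end: float         # seconds
            fixation_s: float  # dwell time on the target

        @dataclass
        class Pinch:
            time: float        # detection timestamp, seconds

        PAIR_WINDOW_S = 0.15   # assumed max pinch-to-gaze gap
        MIN_FIXATION_S = 0.10  # assumed "intentional gaze" threshold

        def associate(pinch: Pinch, holds: list[GazeHold]) -> Optional[str]:
            """Return the id of the UI component the pinch acts on, if any."""
            candidates = [
                h for h in holds
                # keep only gaze events likely to reflect intentional looking
                if h.fixation_s >= MIN_FIXATION_S
                # the pinch must fall within (or just outside) the gaze hold
                and h.start - PAIR_WINDOW_S <= pinch.time <= h.end + PAIR_WINDOW_S
            ]
            if not candidates:
                return None  # a pinch with no plausible gaze target is ignored
            # prefer the gaze hold temporally closest to the pinch
            best = min(candidates,
                       key=lambda h: abs(pinch.time - (h.start + h.end) / 2))
            return best.target_id

    For example, associate(Pinch(time=2.31), [GazeHold("save_button", 2.0, 2.4, 0.25)]) returns "save_button", interpreting the pinch-plus-gaze pair as a selection of that component.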

    Spatiotemporal smoothing for improved hand tracking

    Publication Number: US12223117B1

    Publication Date: 2025-02-11

    Application Number: US18371111

    Filing Date: 2023-09-21

    Applicant: Apple Inc.

    Abstract: In some implementations, a method includes: obtaining uncorrected hand tracking data; obtaining a depth map associated with a physical environment; identifying a position of a portion of the finger within the physical environment based on the depth map and the uncorrected hand tracking data; performing spatial depth smoothing on a region of the depth map adjacent to the position of the portion of the finger; and generating corrected hand tracking data by performing point of view (POV) correction on the uncorrected hand tracking data based on the spatially depth smoothed region of the depth map adjacent to the portion of the finger.
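
    A rough Python/NumPy sketch of the pipeline this abstract describes follows. The median filter, the 5-pixel radius, and the 4x4 pose-matrix form of the POV correction are illustrative assumptions; the patent does not specify these choices:

        import numpy as np

        def smooth_fingertip_depth(depth: np.ndarray, u: int, v: int,
                                   radius: int = 5) -> float:
            """Spatially smooth the depth-map region around the fingertip
            pixel (u, v); the median is robust to holes at finger edges."""
            y0, y1 = max(0, v - radius), min(depth.shape[0], v + radius + 1)
            x0, x1 = max(0, u - radius), min(depth.shape[1], u + radius + 1)
            patch = depth[y0:y1, x0:x1]
            valid = patch[np.isfinite(patch) & (patch > 0)]  # drop invalid samples
            return float(np.median(valid)) if valid.size else float("nan")

        def pov_correct(raw_xyz: np.ndarray, smoothed_z: float,
                        display_from_sensor: np.ndarray) -> np.ndarray:
            """Snap the tracked fingertip to the smoothed depth, then move it
            from the sensor's viewpoint into the display camera's viewpoint
            (display_from_sensor is an assumed 4x4 rigid transform)."""
            p = raw_xyz * (smoothed_z / raw_xyz[2])  # rescale along the view ray
            return (display_from_sensor @ np.append(p, 1.0))[:3]

    Smoothing only the patch adjacent to the fingertip, rather than the whole depth map, keeps the correction cheap enough to run per frame while still suppressing sensor noise exactly where the POV re-projection samples it.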

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US20240393876A1

    Publication Date: 2024-11-28

    Application Number: US18790194

    Filing Date: 2024-07-31

    Applicant: Apple Inc.

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    User interface response based on gaze-holding event assessment

    Publication Number: US12099653B2

    Publication Date: 2024-09-24

    Application Number: US18244570

    Filing Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Corrected Gaze Direction and Origin

    Publication Number: US20240103618A1

    Publication Date: 2024-03-28

    Application Number: US18470359

    Filing Date: 2023-09-19

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 H04N13/344

    Abstract: Methods and apparatus for correcting the gaze direction and the origin (entrance pupil) in gaze tracking systems. During enrollment, after an eye model is obtained, the pose of the eye while looking at a target prompt is determined. This information is used to estimate the true visual axis of the eye. The visual axis may then be used to correct the point of view (PoV) with respect to the display during use. If a clip-on lens is present, a corrected gaze axis may be calculated based on the known optical characteristics and pose of the clip-on lens. A clip-on-corrected entrance pupil may then be estimated by firing two or more virtual rays through the clip-on lens and determining the intersection between the rays and the corrected gaze axis.
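
    The final step, estimating the corrected entrance pupil from ray/axis intersections, reduces to a closest-approach computation between lines. Below is a hedged Python/NumPy sketch; the least-squares averaging of per-ray intersections is an assumption, and the rays are assumed to have already been refracted through the clip-on lens:

        import numpy as np

        def point_on_axis_nearest_ray(axis_o, axis_d, ray_o, ray_d):
            """Point on the corrected gaze axis nearest to one virtual ray
            (closest approach of two skew lines stands in for 'intersection',
            since real rays rarely meet the axis exactly)."""
            u = np.asarray(axis_d, float) / np.linalg.norm(axis_d)
            v = np.asarray(ray_d, float) / np.linalg.norm(ray_d)
            w0 = np.asarray(axis_o, float) - np.asarray(ray_o, float)
            a, b, c = u @ u, u @ v, v @ v
            d, e = u @ w0, v @ w0
            denom = a * c - b * b  # ~0 when the axis and ray are parallel
            s = (b * e - c * d) / denom if abs(denom) > 1e-12 else 0.0
            return np.asarray(axis_o, float) + s * u

        def corrected_entrance_pupil(axis_o, axis_d, refracted_rays):
            """Estimate the entrance pupil as the mean of the per-ray
            intersection points with the corrected gaze axis."""
            pts = [point_on_axis_nearest_ray(axis_o, axis_d, o, d)
                   for o, d in refracted_rays]
            return np.mean(pts, axis=0)

    Using two or more rays and averaging makes the estimate tolerant of small errors in the modeled lens pose, since no single ray has to hit the axis exactly.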
