GAZE BEHAVIOR DETECTION
    Publication type: Invention Publication

    Publication number: US20230418372A1

    Publication date: 2023-12-28

    Application number: US18211712

    Application date: 2023-06-20

    Applicant: Apple Inc.

    CPC classification number: G06F3/013

    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine a gaze behavior state to identify gaze shifting events, gaze holding events, and loss events of a user based on physiological data. For example, an example process may include obtaining eye data associated with a gaze during a first period of time (e.g., eye position and velocity, interpupillary distance, pupil diameters, etc.). The process may further include obtaining head data associated with the gaze during the first period of time (e.g., head position and velocity). The process may further include determining a first gaze behavior state during the first period of time to identify gaze shifting events, gaze holding events, and loss events (e.g., one or more gaze and head pose characteristics may be determined, aggregated, and used to classify the user's eye movement state using machine learning techniques).
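The abstract describes aggregating eye and head pose characteristics to classify gaze behavior into shifting, holding, and loss events. A minimal sketch of that classification is below; the thresholds, field names, and fixed-cutoff logic are hypothetical simplifications, since the patent describes machine-learning classification over aggregated features rather than hard thresholds.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical cutoff: eye-relative velocity above this (deg/s) is treated
# as a saccade-like gaze shift. The patent instead feeds aggregated gaze
# and head features to a learned classifier.
SACCADE_VELOCITY_DEG_S = 100.0

@dataclass
class GazeSample:
    eye_velocity: Optional[float]  # deg/s; None when eye tracking is lost
    head_velocity: float           # deg/s

def classify_gaze_state(sample: GazeSample) -> str:
    """Label one sample as 'shifting', 'holding', or 'loss'."""
    if sample.eye_velocity is None:
        return "loss"  # e.g., blink or tracking dropout
    # Compare eye motion against head motion so a gaze held steady during
    # head rotation (vestibulo-ocular reflex) still counts as holding.
    relative_velocity = abs(sample.eye_velocity - sample.head_velocity)
    if relative_velocity > SACCADE_VELOCITY_DEG_S:
        return "shifting"
    return "holding"
```

In practice such per-sample labels would be smoothed over a time window before being reported as events.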

    Electronic device with foveated display and gaze prediction

    Publication number: US10890968B2

    Publication date: 2021-01-12

    Application number: US16376329

    Application date: 2019-04-05

    Applicant: Apple Inc.

    Abstract: An electronic device may have a foveated display, an eye-tracking system, and a head movement detection system. The eye-tracking system may gather information on a user's point of regard on the display, while the head movement detection system may capture information regarding the rotation of the user's head. Based on the point-of-regard information, head rotation information, image data, the type of eye/head movement that is underway, and/or tiredness information, control circuitry in the electronic device may produce image data for the display with areas of different resolutions and/or visual quality. A full-resolution and/or full-quality portion of the image may overlap the point of regard. One or more lower-resolution portions of the image may surround the full-resolution portion. The control circuitry may include a gaze prediction system for predicting the movement of the user's gaze during a saccade.
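The foveation scheme above renders full resolution at the point of regard and lower resolution away from it, using a predicted saccade landing point so the high-resolution region is ready when the eye arrives. A rough illustration follows; the pixel radii, resolution tiers, and linear extrapolation are hypothetical stand-ins for the patent's gaze prediction system.

```python
import math

def tile_resolution(tile_center, gaze_point, full_res=1.0):
    """Pick a relative render resolution for a screen tile based on its
    distance (in pixels) from the current or predicted point of regard.
    The 100/300-pixel radii and the 1/2 and 1/4 tiers are illustrative."""
    dist = math.dist(tile_center, gaze_point)
    if dist < 100:
        return full_res        # foveal region: full resolution/quality
    if dist < 300:
        return full_res / 2    # parafoveal ring: reduced resolution
    return full_res / 4        # periphery: lowest resolution

def predict_saccade_landing(start, velocity, remaining_time_s):
    """Crude linear extrapolation of gaze position during a saccade,
    standing in for the patent's gaze prediction system."""
    return (start[0] + velocity[0] * remaining_time_s,
            start[1] + velocity[1] * remaining_time_s)
```

During a detected saccade, the display controller would foveate around `predict_saccade_landing(...)` rather than the instantaneous gaze sample.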

    USER INTERFACE RESPONSE BASED ON GAZE-HOLDING EVENT ASSESSMENT

    Publication number: US20240393876A1

    Publication date: 2024-11-28

    Application number: US18790194

    Application date: 2024-07-31

    Applicant: Apple Inc.

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
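The association the abstract describes, pairing a pinch gesture with the gaze-holding event occurring at around the same time, can be sketched as a temporal-window match. The event structure, tolerance window, and tie-breaking below are hypothetical; the patent further restricts matching to gaze events likely to reflect intentional looking.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GazeHoldingEvent:
    target_id: str   # UI component under gaze, e.g. a button
    start_s: float   # event start time (seconds)
    end_s: float     # event end time (seconds)

def associate_pinch(pinch_time_s: float,
                    gaze_events: List[GazeHoldingEvent],
                    tolerance_s: float = 0.2) -> Optional[str]:
    """Return the UI target of the gaze-holding event closest in time to
    the pinch, or None if no event falls within the tolerance window."""
    best, best_gap = None, tolerance_s
    for ev in gaze_events:
        # Accept pinches that land inside the event or within the tolerance
        # of either end, then prefer the smallest time gap.
        if ev.start_s - tolerance_s <= pinch_time_s <= ev.end_s + tolerance_s:
            if ev.start_s <= pinch_time_s <= ev.end_s:
                gap = 0.0
            else:
                gap = min(abs(pinch_time_s - ev.start_s),
                          abs(pinch_time_s - ev.end_s))
            if gap <= best_gap:
                best, best_gap = ev.target_id, gap
    return best
```

A matched pair would then be dispatched as a selection of that UI component; an unmatched pinch would be ignored.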

    User interface response based on gaze-holding event assessment

    Publication number: US12099653B2

    Publication date: 2024-09-24

    Application number: US18244570

    Application date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Electronic Device With Foveated Display and Gaze Prediction

    Publication number: US20190339770A1

    Publication date: 2019-11-07

    Application number: US16376329

    Application date: 2019-04-05

    Applicant: Apple Inc.

    Abstract: An electronic device may have a foveated display, an eye-tracking system, and a head movement detection system. The eye-tracking system may gather information on a user's point of regard on the display, while the head movement detection system may capture information regarding the rotation of the user's head. Based on the point-of-regard information, head rotation information, image data, the type of eye/head movement that is underway, and/or tiredness information, control circuitry in the electronic device may produce image data for the display with areas of different resolutions and/or visual quality. A full-resolution and/or full-quality portion of the image may overlap the point of regard. One or more lower-resolution portions of the image may surround the full-resolution portion. The control circuitry may include a gaze prediction system for predicting the movement of the user's gaze during a saccade.
