Electronic devices with display operation based on eye activity

    Publication Number: US12197643B2

    Publication Date: 2025-01-14

    Application Number: US18456307

    Filing Date: 2023-08-25

    Applicant: Apple Inc.

    Abstract: An electronic device may have a display for displaying image content. Head-mounted support structures in the device may be used to support the display. The electronic device may have an eye monitoring system that detects eye saccades and eye blinks. Control circuitry in the electronic device may coordinate operation of the display with periods of suppressed visual sensitivity that are associated with the saccades and blinks. By making adjustments to display circuitry and image content during periods of suppressed visual sensitivity, potentially visually obtrusive changes to displayed images can be hidden from a user of the electronic device. Adjustments to display operation may help reduce burn-in effects, may help reduce power consumption, and may otherwise improve device performance.
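
    As a rough illustration of the mechanism this abstract describes, the plain-Swift sketch below gates deferred display adjustments on detected blinks and saccades. The types (EyeSample, DisplayAdjustment, DisplayController), the 180 deg/s saccade threshold, and the queue-then-apply policy are illustrative assumptions, not the patented implementation.

        // Minimal sketch: defer visually obtrusive display changes until a blink or
        // saccade suppresses visual sensitivity. Types, thresholds, and the queueing
        // policy are illustrative assumptions.

        struct EyeSample {
            let gazeVelocityDegPerSec: Double   // angular gaze speed from the eye tracker
            let eyelidClosed: Bool              // true while a blink is in progress
        }

        enum SuppressionEvent {
            case saccade, blink
        }

        struct EyeMonitor {
            // Hypothetical threshold: treat fast gaze motion as a saccade.
            let saccadeVelocityThreshold = 180.0

            func suppressionEvent(for sample: EyeSample) -> SuppressionEvent? {
                if sample.eyelidClosed { return .blink }
                if sample.gazeVelocityDegPerSec > saccadeVelocityThreshold { return .saccade }
                return nil
            }
        }

        struct DisplayAdjustment {
            let description: String
            let apply: () -> Void
        }

        final class DisplayController {
            private var pendingAdjustments: [DisplayAdjustment] = []
            private let monitor = EyeMonitor()

            // Queue a change (e.g., a pixel shift to reduce burn-in or a refresh-rate drop).
            func schedule(_ adjustment: DisplayAdjustment) {
                pendingAdjustments.append(adjustment)
            }

            // Called once per eye-tracker sample; applies queued changes only while
            // the user's visual sensitivity is suppressed.
            func process(_ sample: EyeSample) {
                guard let event = monitor.suppressionEvent(for: sample) else { return }
                for adjustment in pendingAdjustments {
                    print("Applying '\(adjustment.description)' during \(event)")
                    adjustment.apply()
                }
                pendingAdjustments.removeAll()
            }
        }

        // Usage: a burn-in-mitigation pixel shift is held until a blink is detected.
        let controller = DisplayController()
        controller.schedule(DisplayAdjustment(description: "shift image by 1 px", apply: {}))
        controller.process(EyeSample(gazeVelocityDegPerSec: 20, eyelidClosed: false))  // no effect
        controller.process(EyeSample(gazeVelocityDegPerSec: 15, eyelidClosed: true))   // applied

    In a fuller system the adjustment would also be sized to finish within the suppression window; the sketch simply applies everything queued at the moment suppression is detected.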

    Electronic Devices With Display Operation Based on Eye Activity

    Publication Number: US20200019238A1

    Publication Date: 2020-01-16

    Application Number: US16443214

    Filing Date: 2019-06-17

    Applicant: Apple Inc.

    Abstract: An electronic device may have a display for displaying image content. Head-mounted support structures in the device may be used to support the display. The electronic device may have an eye monitoring system that detects eye saccades and eye blinks. Control circuitry in the electronic device may coordinate operation of the display with periods of suppressed visual sensitivity that are associated with the saccades and blinks. By making adjustments to display circuitry and image content during periods of suppressed visual sensitivity, potentially visually obtrusive changes to displayed images can be hidden from a user of the electronic device. Adjustments to display operation may help reduce burn-in effects, may help reduce power consumption, and may otherwise improve device performance.

    USER INTERFACE RESPONSE BASED ON GAZE-HOLDING EVENT ASSESSMENT

    Publication Number: US20240393876A1

    Publication Date: 2024-11-28

    Application Number: US18790194

    Filing Date: 2024-07-31

    Applicant: APPLE INC.

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
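
    To make the gaze-and-pinch association concrete, the plain-Swift sketch below pairs a pinch with the most recent gaze-holding event that occurred within a short time window and treats the pair as a selection. The event types, the 0.3-second window, and the most-recent-match policy are illustrative assumptions rather than the claimed method.

        // Minimal sketch of associating a non-eye input (a pinch) with a gaze-holding
        // event that occurred at around the same time. Types, the association window,
        // and the matching policy are illustrative assumptions.

        struct GazeHoldingEvent {
            let componentID: String   // UI component the gaze was resting on
            let timestamp: Double     // seconds
        }

        struct PinchEvent {
            let timestamp: Double
        }

        struct UISelection {
            let componentID: String
        }

        // Associate a pinch with the most recent gaze-holding event within a short
        // window, and interpret the pair as a selection of that component.
        func selection(for pinch: PinchEvent,
                       gazeHolds: [GazeHoldingEvent],
                       window: Double = 0.3) -> UISelection? {
            let candidate = gazeHolds
                .filter { abs(pinch.timestamp - $0.timestamp) <= window }
                .max(by: { $0.timestamp < $1.timestamp })
            return candidate.map { UISelection(componentID: $0.componentID) }
        }

        // Usage: a pinch at t = 10.05 s selects the button gazed at t = 10.0 s,
        // while a gaze hold that ended long before the pinch is ignored.
        let holds = [
            GazeHoldingEvent(componentID: "scroll-view", timestamp: 8.2),
            GazeHoldingEvent(componentID: "ok-button", timestamp: 10.0),
        ]
        if let selected = selection(for: PinchEvent(timestamp: 10.05), gazeHolds: holds) {
            print("Selected component: \(selected.componentID)")   // "ok-button"
        }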

    User interface response based on gaze-holding event assessment

    Publication Number: US12099653B2

    Publication Date: 2024-09-24

    Application Number: US18244570

    Filing Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    PERIPHERAL LUMINANCE OR COLOR REMAPPING FOR POWER SAVING

    Publication Number: US20220011858A1

    Publication Date: 2022-01-13

    Application Number: US17316460

    Filing Date: 2021-05-10

    Applicant: Apple Inc.

    Abstract: In an embodiment, an electronic device includes a display and an eye tracker. The display includes one or more foveated areas. In the embodiment, the eye tracker is configured to collect eye tracking data regarding a gaze of one or more eyes of a user on the display. The electronic device also includes processing circuitry operatively coupled to the display. In the embodiment, the processing circuitry is configured to receive an indication of a motion associated with the gaze from the eye tracker. The processing circuitry is also configured to determine a previous location associated with the gaze during a previous frame and a target position associated with the gaze during a target frame. In the embodiment, the processing circuitry is configured to expand one or more foveated areas of the display adjacent a previous position of the gaze of the user.
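
    A rough sketch of the expansion step described here, in plain Swift: when gaze motion is reported, the full-resolution region is grown to span both the previous gaze location and the target position instead of jumping to the new position in a single frame. The rectangle geometry, base radius, and margin are illustrative assumptions.

        // Minimal sketch: grow the foveated (full-resolution) region from the previous
        // gaze position toward the target position so the moving gaze stays inside it.
        // Geometry and margins are illustrative assumptions.

        struct Point {
            let x: Double
            let y: Double
        }

        struct Rect {
            var minX: Double, minY: Double, maxX: Double, maxY: Double
            var width: Double { maxX - minX }
            var height: Double { maxY - minY }
        }

        // Foveated region centered on a gaze point, with a fixed base radius.
        func fovealRect(around p: Point, radius: Double) -> Rect {
            Rect(minX: p.x - radius, minY: p.y - radius,
                 maxX: p.x + radius, maxY: p.y + radius)
        }

        // Expand the region so it spans both the previous and target gaze positions,
        // plus a margin, rather than snapping to the new position in one frame.
        func expandedFovealRect(previous: Point, target: Point,
                                baseRadius: Double, margin: Double = 16) -> Rect {
            let a = fovealRect(around: previous, radius: baseRadius)
            let b = fovealRect(around: target, radius: baseRadius)
            return Rect(minX: min(a.minX, b.minX) - margin,
                        minY: min(a.minY, b.minY) - margin,
                        maxX: max(a.maxX, b.maxX) + margin,
                        maxY: max(a.maxY, b.maxY) + margin)
        }

        // Usage: gaze is moving from (400, 300) toward (900, 320); the expanded
        // full-resolution area covers both positions while the motion completes.
        let region = expandedFovealRect(previous: Point(x: 400, y: 300),
                                        target: Point(x: 900, y: 320),
                                        baseRadius: 120)
        print("Foveated area: \(Int(region.width)) x \(Int(region.height)) px")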

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US20240103613A1

    Publication Date: 2024-03-28

    Application Number: US18244570

    Filing Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Electronic Device With Foveated Display and Gaze Prediction

    Publication Number: US20190339770A1

    Publication Date: 2019-11-07

    Application Number: US16376329

    Filing Date: 2019-04-05

    Applicant: Apple Inc.

    Abstract: An electronic device may have a foveated display, an eye-tracking system, and a head movement detection system. The eye-tracking system may gather information on a user's point of regard on the display, while the head movement detection system may capture information regarding the rotation of the user's head. Based on the point-of-regard information, head rotation information, image data, the type of eye/head movement that is underway, and/or tiredness information, control circuitry in the electronic device may produce image data for the display with areas of different resolution and/or visual quality. A full-resolution and/or full-quality portion of the image may overlap the point of regard. One or more lower-resolution portions of the image may surround the full-resolution portion. The control circuitry may include a gaze prediction system for predicting the movement of the user's gaze during a saccade.
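
    As a simplified stand-in for the gaze prediction described in this abstract, the plain-Swift sketch below extrapolates the point of regard from the two most recent gaze samples plus a head-rotation term. The sample format, the pixels-per-degree conversion, and the purely linear extrapolation are illustrative assumptions.

        // Minimal sketch of gaze-landing prediction during a saccade: extrapolate the
        // current point of regard using recent gaze velocity plus a head-rotation term.
        // Gains, units, and sample format are illustrative assumptions.

        struct GazeSample {
            let x: Double           // point of regard on the display, in pixels
            let y: Double
            let timestamp: Double   // seconds
        }

        // Predict where the gaze will be `lookahead` seconds from the latest sample,
        // so the full-resolution region can be moved there before the saccade lands.
        func predictedPointOfRegard(samples: [GazeSample],
                                    headYawVelocityDegPerSec: Double,
                                    lookahead: Double,
                                    pixelsPerDegree: Double = 30) -> (x: Double, y: Double)? {
            guard samples.count >= 2 else { return nil }
            let last = samples[samples.count - 1]
            let prev = samples[samples.count - 2]
            let dt = last.timestamp - prev.timestamp
            guard dt > 0 else { return nil }

            // Gaze velocity from the two most recent eye-tracker samples.
            let vx = (last.x - prev.x) / dt
            let vy = (last.y - prev.y) / dt

            // Head rotation also shifts the on-display point of regard (yaw only here).
            let headVx = headYawVelocityDegPerSec * pixelsPerDegree

            return (x: last.x + (vx + headVx) * lookahead,
                    y: last.y + vy * lookahead)
        }

        // Usage: two samples 10 ms apart during a rightward saccade, with a mild head turn.
        let samples = [
            GazeSample(x: 500, y: 400, timestamp: 0.000),
            GazeSample(x: 560, y: 402, timestamp: 0.010),
        ]
        if let p = predictedPointOfRegard(samples: samples,
                                          headYawVelocityDegPerSec: 5,
                                          lookahead: 0.02) {
            print("Predicted point of regard: (\(Int(p.x)), \(Int(p.y)))")
        }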
