USER INTERFACE RESPONSE BASED ON GAZE-HOLDING EVENT ASSESSMENT

    Publication number: US20240393876A1

    Publication date: 2024-11-28

    Application number: US18790194

    Application date: 2024-07-31

    Applicant: APPLE INC.

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
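The association described in the abstract can be illustrated with a minimal sketch. This is not code from the patent: the class, function, field names, and the 300 ms window are all illustrative assumptions about how a pinch gesture might be matched to a near-simultaneous gaze-holding event on a UI component.

```python
# Hypothetical sketch: associate a pinch gesture with a gaze-holding
# event that occurred at around the same time on a UI component.
# Names and the time window are illustrative assumptions.

ASSOCIATION_WINDOW_S = 0.3  # assumed: pinch within 300 ms of gaze hold

class GazeHoldEvent:
    def __init__(self, component_id, timestamp, intentional=True):
        self.component_id = component_id
        self.timestamp = timestamp
        # Per the abstract, only gaze activity likely to reflect
        # perception or intentional looking should qualify.
        self.intentional = intentional

def associate_pinch_with_gaze(pinch_time, gaze_events):
    """Return the UI component selected by the pinch, or None."""
    candidates = [
        g for g in gaze_events
        if g.intentional and abs(g.timestamp - pinch_time) <= ASSOCIATION_WINDOW_S
    ]
    if not candidates:
        return None
    # Prefer the gaze hold closest in time to the pinch.
    best = min(candidates, key=lambda g: abs(g.timestamp - pinch_time))
    return best.component_id

events = [GazeHoldEvent("ok_button", 10.00),
          GazeHoldEvent("cancel_button", 9.50, intentional=False)]
print(associate_pinch_with_gaze(10.1, events))  # -> ok_button
```

Filtering on `intentional` mirrors the abstract's point that only eye-based activity likely to correspond to deliberate looking is associated with the gesture.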

    User interface response based on gaze-holding event assessment

    Publication number: US12099653B2

    Publication date: 2024-09-24

    Application number: US18244570

    Application date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Input Sensor with Acceleration Correction (invention application, pending — published)

    Publication number: US20170017346A1

    Publication date: 2017-01-19

    Application number: US15212119

    Application date: 2016-07-15

    Applicant: Apple Inc.

    CPC classification number: G06F3/0418 G01L1/144 G06F3/0414 G06F3/044

    Abstract: Systems and methods for detecting user input to an electronic device are disclosed. The electronic device can include an input sensor system that itself includes an input-sensitive structure that compresses or expands in response to user input. The input sensor system measures an electrical property of the input-sensitive structure for changes. The input sensor system is coupled to an accelerometer to receive acceleration data to modify the detected changes to the input-sensitive structure.
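The correction the abstract describes can be sketched as subtracting the motion-induced part of the measured signal. The gain constant and function names below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of acceleration correction for an input sensor:
# an accelerometer reading is used to remove the portion of the measured
# electrical-property change caused by device motion rather than touch.
# The gain constant is an illustrative assumption.

ACCEL_GAIN = 0.8  # assumed sensitivity of the sensor signal to acceleration

def corrected_input(raw_measurement, baseline, accel):
    """Subtract the acceleration-induced change from the measured
    electrical property, leaving the change due to user input."""
    measured_change = raw_measurement - baseline
    motion_component = ACCEL_GAIN * accel
    return measured_change - motion_component

# Device jolted during a light touch: most of the measured change
# comes from motion, and the correction isolates the touch signal.
print(corrected_input(105.0, 100.0, 5.0))  # 5.0 - 4.0 = 1.0
```

In practice the gain would be calibrated per device, since how strongly acceleration couples into the input-sensitive structure depends on its mechanical design.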

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication number: US20240103613A1

    Publication date: 2024-03-28

    Application number: US18244570

    Application date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Force sensor-based motion or orientation determination in a device

    Publication number: US10254870B2

    Publication date: 2019-04-09

    Application number: US15089400

    Application date: 2016-04-01

    Applicant: Apple Inc.

    Abstract: An electronic device is disclosed. The electronic device comprises a touch sensor panel configured to detect an object touching the touch sensor panel and a plurality of force sensors coupled to the touch sensor panel and configured to detect an amount of force with which the object touches the touch sensor panel. A processor is coupled to the plurality of force sensors, the processor configured to: measure a first value from a first force sensor of the plurality of force sensors; measure a second value from a second force sensor of the plurality of force sensors, different from the first force sensor; and determine a motion characteristic of the electronic device based on the first value and the second value.
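The comparison of two force-sensor values described in the abstract can be sketched as follows. The threshold and return labels are illustrative assumptions: a touch tends to load sensors at different panel locations similarly, while device motion or tilt loads them unevenly.

```python
# Hypothetical sketch: infer a motion/orientation characteristic from
# two force sensors coupled to a touch sensor panel, per the abstract's
# "first value" and "second value" comparison. Threshold and labels
# are illustrative assumptions.

IMBALANCE_THRESHOLD = 2.0  # assumed imbalance (arbitrary units)

def motion_characteristic(first_value, second_value):
    """Classify a motion characteristic from the two sensor readings."""
    imbalance = first_value - second_value
    if imbalance > IMBALANCE_THRESHOLD:
        return "tilt_toward_first"
    if imbalance < -IMBALANCE_THRESHOLD:
        return "tilt_toward_second"
    return "level"

print(motion_characteristic(7.0, 3.0))  # -> tilt_toward_first
print(motion_characteristic(4.0, 4.5))  # -> level
```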
