1. GAZE-CONTROLLED USER INTERFACE WITH MULTIMODAL INPUT
    Invention Application
    Status: Pending (Published)

    Publication No.: US20140354539A1

    Publication Date: 2014-12-04

    Application No.: US14290634

    Filing Date: 2014-05-29

    IPC Class: G06F3/01

    CPC Class: G06F3/013 G06F3/0304

    Abstract: A personal computer system provides a gaze-controlled graphical user interface having a bidirectional and a unidirectional interaction mode. In the bidirectional interaction mode, a display shows one or more graphical controls in motion, each being associated with an input operation to an operating system. A gaze tracking system provides gaze point data of a viewer, and a matching module attempts to match a relative gaze movement against a relative movement of one of the graphical controls. The system includes a selector which is preferably controllable by a modality other than gaze. The system initiates a transition from the unidirectional interaction mode to the bidirectional interaction mode in response to an input received at the selector. The display then shows graphical controls in motion in a neighbourhood of the current gaze point, as determined based on current gaze data.

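    The matching step in this abstract can be pictured as correlating the viewer's relative gaze movement with the relative movement of each animated control over a short window. The Python sketch below is only an illustration of that idea; the correlation measure, the 0.8 threshold and all function and parameter names are assumptions rather than details taken from the application.

        import numpy as np

        def match_gaze_to_controls(gaze_points, control_tracks, threshold=0.8):
            """Match relative gaze movement against each control's relative movement.

            gaze_points: (N, 2) array of recent gaze coordinates.
            control_tracks: dict mapping a control id to an (N, 2) array of that
                control's on-screen positions over the same time window.
            Returns the id of the best-matching control, or None if no moving
            control correlates strongly enough with the gaze trajectory.
            """
            gaze_delta = np.diff(gaze_points, axis=0)      # relative gaze movement
            best_id, best_score = None, threshold
            for cid, track in control_tracks.items():
                ctrl_delta = np.diff(track, axis=0)        # relative control movement
                if np.allclose(ctrl_delta, 0) or np.allclose(gaze_delta, 0):
                    continue                               # skip static controls / static gaze
                rx = np.corrcoef(gaze_delta[:, 0], ctrl_delta[:, 0])[0, 1]
                ry = np.corrcoef(gaze_delta[:, 1], ctrl_delta[:, 1])[0, 1]
                score = np.nanmean([rx, ry])
                if score > best_score:
                    best_id, best_score = cid, score
            return best_id

    A caller would feed this the most recent gaze samples and control trajectories; the input operation associated with the matched control would then be issued to the operating system.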

2. Eye Gaze Determination
    Invention Application
    Status: Granted (In Force)

    Publication No.: US20150177833A1

    Publication Date: 2015-06-25

    Application No.: US14580286

    Filing Date: 2014-12-23

    IPC Class: G06F3/01 G06T7/00

    Abstract: Methods, systems and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.

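    One plain reading of the likelihood determination is a probabilistic model of gaze error around each interaction element, with the most likely element identified. The sketch below assumes an isotropic Gaussian error model and a 40-pixel standard deviation; both, along with the data layout, are illustrative assumptions and not taken from the application.

        import math

        def identify_element(gaze_x, gaze_y, elements, sigma=40.0):
            """Identify the interaction element most likely to be the subject of the gaze.

            elements: list of dicts with 'id', 'cx', 'cy' (element centre in pixels).
            sigma: assumed standard deviation of the gaze error in pixels.
            Returns (element id, likelihood) for the most likely element, or None
            if the list of elements is empty.
            """
            best = None
            for el in elements:
                d2 = (gaze_x - el['cx']) ** 2 + (gaze_y - el['cy']) ** 2
                # Isotropic Gaussian model of gaze error around the element centre.
                likelihood = math.exp(-d2 / (2 * sigma ** 2))
                if best is None or likelihood > best[1]:
                    best = (el['id'], likelihood)
            return best

    A fuller model could weight the likelihood by element size or by a prior over which elements are commonly used, which is consistent with, but not required by, the abstract.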

3. FAST WAKE-UP IN A GAZE TRACKING SYSTEM
    Invention Application
    Status: Granted (In Force)

    Publication No.: US20140043227A1

    Publication Date: 2014-02-13

    Application No.: US13962151

    Filing Date: 2013-08-08

    IPC Class: G06F3/01

    CPC Class: G06F3/013 G06F1/325

    Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at the normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.

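    The wake-up sequence can be read as three stages: an overview frame guides the restriction of the sensor's image area, a short high-frame-rate burst gives a memoryless first gaze estimate, and later estimates are updated recursively at the normal frame rate. The sketch below assumes sensor and tracker objects with grab, set_roi, locate_eyes and estimate methods, plus a smoothing constant; all of these are placeholders, not the application's interfaces.

        def wake_up(sensor, tracker, burst_frames=8, alpha=0.3):
            """Generator yielding gaze point values after leaving low power mode (sketch)."""
            # A preliminary overview picture guides the restriction of the image area.
            overview = sensor.grab(full_frame=True)
            sensor.set_roi(tracker.locate_eyes(overview))

            # Restricted image area -> faster read-out -> initial burst at an increased frame rate.
            burst = [sensor.grab() for _ in range(burst_frames)]
            gx, gy = tracker.estimate(burst)      # first value: memoryless, burst imagery only
            yield gx, gy

            while True:
                frame = sensor.grab()             # subsequent pictures at the normal frame rate
                mx, my = tracker.estimate([frame])
                # Recursive update that takes the previous gaze point value into account.
                gx, gy = (1 - alpha) * mx + alpha * gx, (1 - alpha) * my + alpha * gy
                yield gx, gy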

4. ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING
    Invention Application
    Status: Granted (In Force)

    Publication No.: US20140009390A1

    Publication Date: 2014-01-09

    Application No.: US13960476

    Filing Date: 2013-08-06

    IPC Class: G06F3/01

    CPC Class: G06F3/017 G06F3/013

    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.

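    The registration-and-delivery behaviour described here maps naturally onto a small publish/subscribe engine: each GUI component files a control signal request naming the non-cursor event signals it needs, and the engine routes only those signals to it. The class below is a minimal sketch; the callback name, the example signal names and the stubbed signal derivation are assumptions, not the application's actual interfaces.

        from collections import defaultdict

        class EventEngine:
            """Minimal sketch of the described event engine (assumed interfaces)."""

            def __init__(self):
                self._requests = defaultdict(set)   # component -> requested signal names

            def register(self, component, signal_names):
                """Record the component's control signal request (the subset it needs)."""
                self._requests[component].update(signal_names)

            def process(self, sample):
                """Turn one eye-tracking data sample into named event output signals
                and deliver to each component only the subset it asked for."""
                signals = self._derive_signals(sample)
                for component, wanted in self._requests.items():
                    delivery = {name: signals[name] for name in wanted if name in signals}
                    component.on_gaze_events(delivery)   # assumed callback on the GUI component

            def _derive_signals(self, sample):
                # Placeholder derivation: a real engine would classify fixations, dwell
                # times, saccades, gaze-enter/gaze-leave and so on from the sample stream.
                return {
                    "gaze_point": (sample["x"], sample["y"]),
                    "dwell_time": sample.get("dwell", 0.0),
                }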

5. SYSTEM FOR GAZE INTERACTION
    Invention Application
    Status: Granted (In Force)

    Publication No.: US20150130740A1

    Publication Date: 2015-05-14

    Application No.: US14601007

    Filing Date: 2015-01-20

    Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on the user removing contact of a finger with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.

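    The control flow can be summarised as: when the finger leaves the touchpad the completed trace is classified as a gesture, the recent gaze samples define a gaze point area, and the user action is executed at that area. In the sketch below, classify_gesture and execute_action are hypothetical callables standing in for the gesture recogniser and the action dispatcher, and the 80-pixel radius is an assumption.

        def on_touch_release(touch_trace, gaze_samples, classify_gesture, execute_action, radius=80):
            """Handle a finger lift-off from the touchpad (sketch of the described flow).

            touch_trace: list of (x, y) touchpad points recorded for the lifted finger.
            gaze_samples: recent (x, y) gaze points on the information presentation area.
            """
            # The gesture-based control command is determined when contact is removed.
            gesture = classify_gesture(touch_trace)

            # Gaze point area: a box of the assumed radius around the averaged gaze point.
            gx = sum(x for x, _ in gaze_samples) / len(gaze_samples)
            gy = sum(y for _, y in gaze_samples) / len(gaze_samples)
            gaze_area = (gx - radius, gy - radius, gx + radius, gy + radius)

            # The user action (scroll, zoom, select, ...) is executed at the gaze point area.
            execute_action(gesture, gaze_area)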

6. ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING

    Publication No.: US20130326431A1

    Publication Date: 2013-12-05

    Application No.: US13960530

    Filing Date: 2013-08-06

    IPC Class: G06F3/01

    CPC Class: G06F3/017 G06F3/013

    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.

7. ARRANGEMENT, METHOD AND COMPUTER PROGRAM FOR CONTROLLING A COMPUTER APPARATUS BASED ON EYE-TRACKING

    Publication No.: US20130321270A1

    Publication Date: 2013-12-05

    Application No.: US13960432

    Filing Date: 2013-08-06

    IPC Class: G06F3/01

    CPC Class: G06F3/017 G06F3/013

    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.