-
Publication Number: US20210109591A1
Publication Date: 2021-04-15
Application Number: US16970549
Application Date: 2019-02-15
Applicant: Tobii AB
Inventor: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
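The distance-from-marker calibration described in this abstract can be illustrated with the standard pinhole-camera relation: a marker of known physical size appears in the sensor image, and its apparent pixel size yields the sensor-to-marker distance. This is a minimal sketch under assumed names and parameters; the patent does not disclose the internals of its eye-tracking algorithm.

```python
# Sketch of marker-based distance estimation via the pinhole model:
# distance = focal_length * real_size / apparent_size.
# "sensor_to_lens_mm" is an illustrative calibration parameter name,
# not one taken from the patent.

def marker_distance_mm(focal_length_px: float,
                       marker_size_mm: float,
                       marker_size_px: float) -> float:
    """Estimate distance from the image sensor to the marker."""
    return focal_length_px * marker_size_mm / marker_size_px


def update_calibration(params: dict, distance_mm: float) -> dict:
    """Change a calibration parameter based on the estimated distance."""
    updated = dict(params)
    updated["sensor_to_lens_mm"] = distance_mm
    return updated


# A 5 mm marker imaged at 160 px with an 800 px focal length sits 25 mm away.
params = update_calibration({"sensor_to_lens_mm": 20.0},
                            marker_distance_mm(800.0, 5.0, 160.0))
print(params["sensor_to_lens_mm"])  # 25.0
```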
-
Publication Number: US10114459B2
Publication Date: 2018-10-30
Application Number: US15809138
Application Date: 2017-11-10
Applicant: Tobii AB
Inventor: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
IPC: G06F3/01, G06F17/30, G06F3/00, G06F3/0481, G02B27/01
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
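The core of this method, projecting a gaze direction onto the scene image and testing it against a detected region, can be sketched with a pinhole projection. All function names, angle conventions, and the bounding-box representation are assumptions for illustration only.

```python
import math

def gaze_to_image_point(yaw_deg: float, pitch_deg: float,
                        focal_px: float, cx: float, cy: float) -> tuple:
    """Project a gaze direction (angles relative to the outward-facing
    camera's optical axis) onto scene-image pixel coordinates."""
    x = cx + focal_px * math.tan(math.radians(yaw_deg))
    y = cy - focal_px * math.tan(math.radians(pitch_deg))
    return (x, y)


def gaze_hits_region(point: tuple, region: tuple) -> bool:
    """region = (x0, y0, x1, y1): a bounding box where the predefined
    image was detected in this scene frame."""
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1


# A straight-ahead gaze lands at the principal point (320, 240), which
# falls inside a detected region centered there.
p = gaze_to_image_point(0.0, 0.0, 600.0, 320.0, 240.0)
print(gaze_hits_region(p, (300, 220, 340, 260)))  # True
```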
-
Publication Number: US20180307905A1
Publication Date: 2018-10-25
Application Number: US15926722
Application Date: 2018-03-20
Applicant: Tobii AB
Inventor: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
IPC: G06K9/00, G06F3/01, G02B27/00, G02B27/01, G06F3/03, G06F3/0481, G09G5/34, G06F3/0485, G06F3/0484, A61B3/113
CPC classification number: G06K9/00604, A61B3/113, G02B27/0093, G02B27/017, G02B27/0179, G02B2027/0138, G02B2027/014, G02B2027/0187, G06F3/012, G06F3/013, G06F3/0304, G06F3/04815, G06F3/04845, G06F3/0485, G06F2203/0381, G09G5/34, G09G2340/0464, G09G2354/00
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
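The gaze-dependent remapping this abstract describes, where an input means one thing when the gaze rests on a virtual object and its default otherwise, can be sketched as a lookup keyed on (object, event). The object names, events, and actions below are illustrative only.

```python
# Hypothetical gaze-contingent input mapping: an event is resolved to an
# object-specific action only when the gaze selects that object.

OBJECT_ACTIONS = {
    ("slider", "scroll_up"): "increase_slider",
    ("slider", "scroll_down"): "decrease_slider",
}

def resolve_action(gazed_object, event: str) -> str:
    """Return the object-specific action if the gaze direction selects a
    virtual object, otherwise fall back to the event's default meaning."""
    if gazed_object is not None:
        return OBJECT_ACTIONS.get((gazed_object, event), event)
    return event


print(resolve_action("slider", "scroll_up"))  # increase_slider
print(resolve_action(None, "scroll_up"))      # scroll_up
```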
-
Publication Number: US20210112107A1
Publication Date: 2021-04-15
Application Number: US17007397
Application Date: 2020-08-31
Applicant: Tobii AB
Inventor: Anders Clausen, Daniel Tornéus
Abstract: Data packets containing gaze data are streamed from an eyetracker to a client via a driver unit by receiving, repeatedly, gaze data packets in a first interface; and, providing, repeatedly, via a second interface, gaze data packets. The client sends a request message to the driver unit. The request message defines a delivery point in time in a first time frame structure at which delivery point in time in each frame of the first time frame structure the gaze data packet shall be provided to the client via the second interface. An offset is calculated between a reception point in time and the delivery point in time. The reception point in time indicates when a gaze data packet is received from the eyetracker relative to the first time structure. An adjusted data acquisition instance is assigned based on the offset. The adjusted data acquisition instance represents a modified point in time in a second time frame structure when at least one future gaze data packet shall be produced by the eyetracker. The driver unit sends a control message to the eyetracker. The control message is adapted to cause the eyetracker to produce the at least one future gaze data packet at such an adjusted acquisition instance in the second time structure that the reception point in time for the at least one future gaze data packet is expected to lie within a margin prior to the delivery point in time.
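The timing adjustment in this abstract reduces to phase arithmetic within a repeating frame: compute the offset between reception and delivery, then shift the eyetracker's acquisition phase so future packets arrive just inside the margin before the delivery point. This is a rough sketch under assumed names; the patent does not specify the computation.

```python
# Hypothetical phase adjustment for gaze-packet delivery. All times are
# phases (ms) within a repeating frame of length frame_ms.

def adjusted_acquisition(acq_phase_ms: float, reception_ms: float,
                         delivery_ms: float, frame_ms: float,
                         margin_ms: float) -> float:
    """Shift the eyetracker's acquisition phase so that future packets
    are received margin_ms before the client's delivery point."""
    # Offset between when a packet arrives and when it must be delivered.
    offset = (delivery_ms - reception_ms) % frame_ms
    # Advance acquisition by that offset, minus the safety margin.
    return (acq_phase_ms + offset - margin_ms) % frame_ms


# Frame of 10 ms, delivery at phase 8, packets currently received at
# phase 3: acquisition shifts from phase 1 to phase 5, so future packets
# arrive at phase 7, one margin before delivery.
print(adjusted_acquisition(1.0, 3.0, 8.0, 10.0, 1.0))  # 5.0
```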
-
Publication Number: US10228763B2
Publication Date: 2019-03-12
Application Number: US16129585
Application Date: 2018-09-12
Applicant: Tobii AB
Inventor: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
IPC: G06F3/01, G02B27/01, G06F3/0481, G06F3/00, G06F17/30
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
-
Publication Number: US20190011986A1
Publication Date: 2019-01-10
Application Number: US16129585
Application Date: 2018-09-12
Applicant: Tobii AB
Inventor: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
IPC: G06F3/01, G06F17/30, G06F3/0481, G02B27/01, G06F3/00
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
-
Publication Number: US09958941B2
Publication Date: 2018-05-01
Application Number: US15276592
Application Date: 2016-09-26
Applicant: Tobii AB
Inventor: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
IPC: G06F3/01, G02B27/01, G06F3/0485, G06F3/0481, G06F3/0484, G06K9/00, G06F3/03, G02B27/00
CPC classification number: G06K9/00604, A61B3/113, G02B27/0093, G02B27/017, G02B27/0179, G02B2027/0138, G02B2027/014, G02B2027/0187, G06F3/012, G06F3/013, G06F3/0304, G06F3/04815, G06F3/04845, G06F3/0485, G06F2203/0381, G09G5/34, G09G2340/0464, G09G2354/00
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
-
Publication Number: US09829976B2
Publication Date: 2017-11-28
Application Number: US14954026
Application Date: 2015-11-30
Applicant: Tobii AB
Inventor: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
CPC classification number: G06F3/013, G02B27/017, G02B2027/0138, G02B2027/0178, G02B2027/0187, G06F3/005, G06F3/017, G06F3/04815, G06F17/30047
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
-
Publication Number: US11622103B2
Publication Date: 2023-04-04
Application Number: US16970549
Application Date: 2019-02-15
Applicant: Tobii AB
Inventor: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
-
Publication Number: US10607075B2
Publication Date: 2020-03-31
Application Number: US15926722
Application Date: 2018-03-20
Applicant: Tobii AB
Inventor: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
IPC: G06K9/00, G02B27/00, G02B27/01, G06F3/01, G06F3/0485, G06F3/03, G06F3/0481, G06F3/0484, G09G5/34, A61B3/113
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
-