-
Publication No.: US20210181920A1
Publication Date: 2021-06-17
Application No.: US17188915
Application Date: 2021-03-01
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
IPC: G06F3/0481
Abstract: Aspects of systems and methods are described that provide for modifying a presented interactive element or object, such as a cursor, based on user-input gestures, the presented environment of the cursor, or any combination thereof. The color, size, shape, transparency, and/or responsiveness of the cursor may change based on the gesture velocity, acceleration, or path. In one implementation, the cursor “stretches” to graphically indicate the velocity and/or acceleration of the gesture. The display properties of the cursor may also change if, for example, the area of the screen occupied by the cursor is dark, bright, textured, or otherwise complicated. In another implementation, the cursor is drawn using sub-pixel smoothing to improve its visual quality.
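A minimal sketch of the velocity-driven cursor “stretching” described in this abstract; the function name, gain, and limits below are illustrative assumptions, not the patent's implementation. The gesture velocity sets how far a cursor ellipse elongates along the direction of motion:

import math

def stretch_cursor(velocity_xy, base_radius=8.0, max_stretch=3.0, gain=0.01):
    """Map a 2D gesture velocity (px/s) to an elongated cursor shape.

    Returns (major_axis, minor_axis, angle_rad); the cursor is drawn as an
    ellipse stretched along the direction of motion, up to max_stretch times
    its resting radius.
    """
    vx, vy = velocity_xy
    speed = math.hypot(vx, vy)
    stretch = 1.0 + min(gain * speed, max_stretch - 1.0)  # clamp the elongation
    major = base_radius * stretch
    minor = base_radius / math.sqrt(stretch)              # roughly preserve area
    angle = math.atan2(vy, vx)                            # align with the motion path
    return major, minor, angle

# Example: a fast horizontal swipe yields a cursor stretched along x.
print(stretch_cursor((400.0, 0.0)))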
-
Publication No.: US20210173490A1
Publication Date: 2021-06-10
Application No.: US17155019
Application Date: 2021-01-21
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining such an intent, subsequently interpreting user gestures in accordance with a second mode of operation.
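A minimal sketch of the mode-switching idea in this abstract, assuming a simple “draw a closed loop to change modes” intent gesture and hypothetical mode names; the patent does not prescribe these specifics:

import math

def path_is_closed_loop(points, closure_tol=20.0, min_travel=200.0):
    """Heuristic intent detector: the path travels far yet ends near its start."""
    if len(points) < 3:
        return False
    travel = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return travel > min_travel and math.dist(points[0], points[-1]) < closure_tol

class GestureInterpreter:
    def __init__(self):
        self.mode = "pointing"                  # first mode of operation

    def interpret(self, gesture, path):
        if path_is_closed_loop(path):           # intent to change modes detected
            self.mode = "drawing" if self.mode == "pointing" else "pointing"
        if self.mode == "pointing":             # interpret per the current mode
            return ("move_cursor", gesture)
        return ("draw_stroke", gesture)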
-
Publication No.: US20200286295A1
Publication Date: 2020-09-10
Application No.: US16823294
Application Date: 2020-03-18
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
-
Publication No.: US20200159334A1
Publication Date: 2020-05-21
Application No.: US16660528
Application Date: 2019-10-22
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining such an intent, subsequently interpreting user gestures in accordance with a second mode of operation.
-
Publication No.: US20240393862A1
Publication Date: 2024-11-28
Application No.: US18796192
Application Date: 2024-08-06
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. HOLZ
Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
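One way to read “a combination of the motion of the portable device and the detected motion of the user” is to re-express hand motion, observed in the device's frame, in the world frame by folding in the device's own rotation and translation. The sketch below assumes that interpretation; the names are hypothetical:

import numpy as np

def hand_motion_in_world(hand_delta_device, device_rotation, device_translation):
    """Combine hand displacement seen in the device frame with the portable
    device's own motion over the same interval to get world-frame hand motion.

    hand_delta_device  : (3,) hand displacement measured by the device's sensors
    device_rotation    : (3, 3) rotation of the device over the interval
    device_translation : (3,) translation of the device over the interval
    """
    return device_rotation @ np.asarray(hand_delta_device, float) + np.asarray(device_translation, float)

# Example: the device translated 5 cm along +x while the hand looked stationary
# to its sensors, so in the world frame the hand effectively moved with it.
print(hand_motion_in_world([0.0, 0.0, 0.0], np.eye(3), [0.05, 0.0, 0.0]))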
-
Publication No.: US20240370091A1
Publication Date: 2024-11-07
Application No.: US18769353
Application Date: 2024-07-10
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: Pohung CHEN , David S. HOLZ
IPC: G06F3/01 , G06F3/03 , G06F3/0346 , G06F3/04815 , G06V40/20
Abstract: The technology disclosed relates to determining the intent of an interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contact and of the center of effort is then monitored to determine the gesture type intended for the interaction. The number of points of virtual contact of the feeler zones, and the proximities between those points, are used to determine the degree of precision of a control-object gesture.
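The center of effort lends itself to a compact formulation: a force-weighted centroid of the points of virtual contact. The sketch below, with assumed thresholds and units, also illustrates using the count and clustering of contacts as a crude precision measure; it is an illustration, not the disclosed algorithm:

import numpy as np

def center_of_effort(contact_points, forces):
    """Force-weighted centroid of the points of virtual contact."""
    p = np.asarray(contact_points, dtype=float)   # (N, 3) contact positions, metres
    f = np.asarray(forces, dtype=float)           # (N,) applied force magnitudes
    return (p * f[:, None]).sum(axis=0) / f.sum()

def gesture_precision(contact_points, fine_radius=0.02):
    """Several tightly clustered contacts suggest a fine gesture (e.g. a pinch);
    few or widely spread contacts suggest a coarse one (e.g. a palm push)."""
    p = np.asarray(contact_points, dtype=float)
    spread = np.linalg.norm(p - p.mean(axis=0), axis=1).mean()
    return "fine" if len(p) >= 2 and spread < fine_radius else "coarse"

points = [[0.00, 0.00, 0.30], [0.01, 0.00, 0.30]]   # two fingertip feeler zones
forces = [1.0, 0.5]
print(center_of_effort(points, forces), gesture_precision(points))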
-
Publication No.: US20240296588A1
Publication Date: 2024-09-05
Application No.: US18662932
Application Date: 2024-05-13
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
IPC: G06T7/80 , G06T7/30 , G06V10/147 , G06V40/20
CPC classification number: G06T7/85 , G06T7/30 , G06V10/147 , G06V40/28 , G06T2207/10021 , G06T2207/30196
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through the overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture sensors, and using the pairs of hand images captured by the synchronized sensors to automatically calibrate the motion-capture sensors to the master frame of reference.
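Calibrating a sensor to the master frame of reference can be sketched as a rigid-body alignment of synchronized hand keypoints seen by both sensors; the Kabsch/SVD method below is one standard way to do that and is offered as an illustration under that assumption, not as the patent's specific procedure:

import numpy as np

def align_to_master(points_sensor, points_master):
    """Estimate the rotation R and translation t mapping a sensor's frame onto
    the master frame from synchronized hand keypoints observed by both."""
    A = np.asarray(points_sensor, float)    # (N, 3) keypoints in the sensor frame
    B = np.asarray(points_master, float)    # (N, 3) the same keypoints in the master frame
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

# Usage: recover a known 90-degree rotation about z plus an offset.
P = np.random.default_rng(0).normal(size=(10, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
R_est, t_est = align_to_master(P, P @ R_true.T + [0.1, 0.2, 0.3])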
-
Publication No.: US20230325005A1
Publication Date: 2023-10-12
Application No.: US18209259
Application Date: 2023-06-13
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
IPC: G06F3/01 , G06F3/03 , G06V10/145 , G06V40/10
CPC classification number: G06F3/017 , G06F3/0304 , G06V10/145 , G06V40/113 , G06F2218/08
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
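As a rough sketch of combining scanning with image-based sensing (all names and thresholds here are assumptions for illustration), a cheap scanning pass can gate the more expensive image analysis, and changes in the detected object's attributes become control events:

def monitor_region(scan_hit, image_frame, previous_attrs, detect_attrs):
    """scan_hit: bool from a coarse scanning pass over the monitored region.
    detect_attrs: callable running image-based sensing on image_frame and
    returning a dict of object attributes (e.g. {'x': ...}) or None if absent.
    Returns (control_event, latest_attrs)."""
    if not scan_hit:                              # region empty: skip image sensing
        return None, None
    attrs = detect_attrs(image_frame)
    if attrs is None or previous_attrs is None:   # nothing to compare against yet
        return None, attrs
    dx = attrs["x"] - previous_attrs["x"]         # attribute change -> control info
    event = "swipe_right" if dx > 0.05 else "swipe_left" if dx < -0.05 else None
    return event, attrs

# Example with a stubbed detector that always reports the object at x = 0.2.
print(monitor_region(True, None, {"x": 0.1}, lambda frame: {"x": 0.2}))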
-
Publication No.: US20230094182A1
Publication Date: 2023-03-30
Application No.: US17958089
Application Date: 2022-09-30
Applicant: Ultrahaptics IP Two Limited
Inventor: Kevin A. HOROWITZ , David S. HOLZ
IPC: G06T17/10 , G06T17/20 , G06T7/593 , G06T7/11 , G06T7/246 , G06T7/194 , G06T7/136 , G06F3/01 , G06T19/00 , G06T19/20 , G06V40/20
Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
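A toy version of a predictive entity, assuming a simple constant-velocity prediction blended with each observation; the blend weight and class shape are illustrative, not the disclosed method:

import numpy as np

class PredictiveEntity:
    """Predicts the position of one articulating member (e.g. a finger segment)
    and corrects the prediction by correlating it with the observed position
    from image-based motion capture."""

    def __init__(self, position, velocity, blend=0.6):
        self.position = np.asarray(position, float)
        self.velocity = np.asarray(velocity, float)
        self.blend = blend                          # weight given to observations

    def predict(self, dt):
        self.position = self.position + self.velocity * dt
        return self.position

    def correct(self, observed, dt):
        residual = np.asarray(observed, float) - self.position
        self.position = self.position + self.blend * residual
        self.velocity = self.velocity + (self.blend / dt) * residual * 0.5
        return self.position

entity = PredictiveEntity(position=[0.0, 0.0, 0.3], velocity=[0.1, 0.0, 0.0])
entity.predict(dt=0.02)
print(entity.correct(observed=[0.003, 0.0, 0.3], dt=0.02))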
-
Publication No.: US20220413566A1
Publication Date: 2022-12-29
Application No.: US17901542
Application Date: 2022-09-01
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ , Paul DURDIK
Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system to augment the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
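The geometric claim is easy to check with a small calculation: if a prismatic element tilts a camera's optical axis away from the display's tangential plane, points in the former blind spots fall inside the camera's field of view. The specific angles and the helper below are assumptions for illustration:

def covers_blind_spot(fov_deg, axis_tilt_deg, blind_spot_deg):
    """True if a camera whose optical axis has been redirected to axis_tilt_deg
    (measured from the display's tangential plane) sees a point lying at
    blind_spot_deg from that same plane, given a full field of view fov_deg."""
    half = fov_deg / 2.0
    return (axis_tilt_deg - half) <= blind_spot_deg <= (axis_tilt_deg + half)

# With a 90-degree field of view redirected to about 62 degrees above tangential,
# the 20-85 degree blind-spot band cited above is fully covered.
print(all(covers_blind_spot(90.0, 62.0, a) for a in range(20, 86)))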