-
Publication No.: US11099653B2
Publication Date: 2021-08-24
Application No.: US16659468
Filing Date: 2019-10-21
Applicant: Ultrahaptics IP Two Limited
Inventor: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape, or motion of the portion of the hand or other detectable object.
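A minimal sketch of the zone-dependent interpretation the abstract describes, in Python: a detected 3D position is assigned to a zone, and the same gesture is interpreted differently depending on that zone. The zone names, boundaries, and interpretation rules below are illustrative assumptions, not taken from the patent.

# Hypothetical sketch of zone-dependent gesture interpretation.
# Zone boundaries, names, and interpretation rules are illustrative
# assumptions, not taken from the patent itself.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float        # metres, relative to the sensor
    y: float
    z: float        # distance from the sensor
    gesture: str    # e.g. "point", "swipe"

def zone_of(d: Detection) -> str:
    """Assign the detection to a zone based on its distance from the sensor."""
    if d.z < 0.15:
        return "touch"      # very close: treat like direct touch input
    if d.z < 0.50:
        return "hover"      # mid-range: cursor / hover interactions
    return "command"        # far field: coarse command gestures only

def interpret(d: Detection) -> str:
    """Choose how to interpret the same gesture depending on its zone."""
    zone = zone_of(d)
    if zone == "touch" and d.gesture == "point":
        return f"click at ({d.x:.2f}, {d.y:.2f})"
    if zone == "hover" and d.gesture == "point":
        return f"move cursor to ({d.x:.2f}, {d.y:.2f})"
    if zone == "command" and d.gesture == "swipe":
        return "switch application"
    return "ignore"

if __name__ == "__main__":
    print(interpret(Detection(0.1, 0.2, 0.10, "point")))   # click
    print(interpret(Detection(0.1, 0.2, 0.40, "point")))   # move cursor
    print(interpret(Detection(0.1, 0.2, 0.90, "swipe")))   # switch application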
-
Publication No.: US10936022B2
Publication Date: 2021-03-02
Application No.: US16283693
Filing Date: 2019-02-22
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. Holz, Paul Durdik
Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system to augment the system's three-dimensional (3D) sensory space. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points from which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
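A small worked example of the geometry the abstract invokes (illustrative Python, not from the patent): measuring angles from the screen's tangential axis, a camera whose optical axis is perpendicular to the screen leaves the region near the screen surface uncovered, while deviating that axis toward the screen with a prism shifts the covered angular range closer to the surface. The field-of-view and deviation values below are assumed for illustration.

# Hypothetical sketch: how redirecting a camera's optical axis with a prism
# changes the angular range it covers, measured from the screen's tangential
# (surface) axis. All numeric values are illustrative assumptions.

def coverage_from_tangential(optical_axis_deg: float, half_fov_deg: float):
    """Return (low, high) angles covered, measured from the tangential axis.

    optical_axis_deg: angle of the optical axis above the screen surface
                      (90 means pointing straight out of the screen).
    half_fov_deg:     half of the camera's field of view.
    """
    return (optical_axis_deg - half_fov_deg, optical_axis_deg + half_fov_deg)

if __name__ == "__main__":
    half_fov = 47.5  # assumed ~95 degree field-of-view camera

    # Without a prism: axis perpendicular to the screen (90 degrees from the
    # tangential axis), so the region near the screen surface is a blind spot.
    print(coverage_from_tangential(90.0, half_fov))   # (42.5, 137.5)

    # With a prism deviating the axis toward the screen by an assumed 30
    # degrees, coverage reaches much closer to the screen surface.
    print(coverage_from_tangential(60.0, half_fov))   # (12.5, 107.5)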
-
Publication No.: US10768708B1
Publication Date: 2020-09-08
Application No.: US14833016
Filing Date: 2015-08-21
Applicant: Ultrahaptics IP Two Limited
Inventor: Maxwell Sills, Robert S. Gordon, Paul Durdik
Abstract: The technology disclosed relates to motion capture and gesture recognition. In particular, it calculates the force implied by a human hand motion and applies the equivalent force to a target object through a robotic arm. In one implementation, this is achieved by tracking the motion and contact of the human hand and generating corresponding robotic commands that replicate that motion and contact on a workpiece through a robotic tool.
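A minimal sketch, in Python, of the force-replication idea the abstract describes: estimate the force implied by a tracked hand motion via finite differences (F = m·a) and package it as a command for a robotic arm. The effective mass, tracking rate, and command format are illustrative assumptions, not taken from the patent.

# Hypothetical sketch of estimating the force implied by a tracked hand
# motion and packaging it as a robot command. Mass, sample rate, and the
# command format are illustrative assumptions.
import numpy as np

def implied_force(positions: np.ndarray, dt: float, effective_mass_kg: float) -> np.ndarray:
    """Estimate force over time from sampled 3D positions (N x 3, metres)."""
    velocity = np.gradient(positions, dt, axis=0)        # m/s
    acceleration = np.gradient(velocity, dt, axis=0)     # m/s^2
    return effective_mass_kg * acceleration              # newtons, per sample

def to_robot_command(force: np.ndarray) -> dict:
    """Package the latest force estimate as a simple (assumed) command dict."""
    fx, fy, fz = force[-1]
    return {"type": "apply_force", "fx_n": float(fx), "fy_n": float(fy), "fz_n": float(fz)}

if __name__ == "__main__":
    dt = 1.0 / 120.0                      # assumed 120 Hz tracking rate
    t = np.arange(0, 0.5, dt)
    # A hand accelerating forward along z at 2 m/s^2 while level in x and y.
    positions = np.stack([np.zeros_like(t), np.zeros_like(t), 0.5 * 2.0 * t**2], axis=1)
    forces = implied_force(positions, dt, effective_mass_kg=0.4)
    print(to_robot_command(forces))       # ~0.8 N along z (0.4 kg * 2 m/s^2)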
-