-
Publication No.: US20180143693A1
Publication Date: 2018-05-24
Application No.: US15358022
Filing Date: 2016-11-21
IPC Classes: G06F3/01 , G06F3/0484 , G06K9/00 , G06K9/66
CPC Classes: G06F3/017 , G06F1/163 , G06F3/011 , G06F3/0304 , G06F3/04845 , G06K9/00355 , G06K9/0061 , G06K9/00671
Abstract: A method for moving a virtual object includes detecting a position of two input objects. A position of a centroid that is equidistant from the two input objects and located between the two input objects is dynamically calculated, such that a reference line running between the two input objects intersects the centroid. Upon detecting a movement of the two input objects, the movement is translated into a change in one or both of a position and an orientation of the virtual object. Movement of the centroid caused by movement of the two input objects causes movement of the virtual object in a direction corresponding to the movement of the centroid. Rotation of the reference line about the centroid caused by the movement of the two input objects causes rotation of the virtual object about its center in a direction corresponding to the rotation of the reference line.
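The centroid-and-reference-line mapping described in this abstract can be sketched in a few lines. This is an illustrative 2-D reduction, not code from the patent; the function name `manipulation_delta` and the tuple-based point representation are assumptions for the sake of the example.

```python
import math

def manipulation_delta(prev_a, prev_b, cur_a, cur_b):
    """Map movement of two tracked input objects (points a and b) to a
    translation and rotation for a virtual object. Illustrative 2-D sketch."""
    # Centroid: the point equidistant from, and between, the two inputs.
    prev_c = ((prev_a[0] + prev_b[0]) / 2, (prev_a[1] + prev_b[1]) / 2)
    cur_c = ((cur_a[0] + cur_b[0]) / 2, (cur_a[1] + cur_b[1]) / 2)
    # Movement of the centroid drives movement of the virtual object.
    translation = (cur_c[0] - prev_c[0], cur_c[1] - prev_c[1])
    # The reference line runs between the two inputs through the centroid;
    # its rotation about the centroid drives rotation of the object about its center.
    prev_angle = math.atan2(prev_b[1] - prev_a[1], prev_b[0] - prev_a[0])
    cur_angle = math.atan2(cur_b[1] - cur_a[1], cur_b[0] - cur_a[0])
    rotation = cur_angle - prev_angle
    return translation, rotation
```

For example, two hands at (0, 0) and (2, 0) that move to (1, 1) and (1, 3) shift the centroid from (1, 0) to (1, 2) and rotate the reference line by 90 degrees, so the object translates by (0, 2) and rotates a quarter turn.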
-
Publication No.: US20180120944A1
Publication Date: 2018-05-03
Application No.: US15341957
Filing Date: 2016-11-02
CPC Classes: G06F3/017 , G06F1/163 , G06F3/013 , G06F3/0304
Abstract: Methods and devices for displaying a virtual affordance with a virtual target are disclosed. In one example, the virtual target is displayed to a user via a display device. The user's point of gaze is determined to be at a gaze location within a target zone including the virtual target. The user's hand is determined to be at a hand location within a designated tracking volume. Based on at least determining that the user's gaze is at the gaze location and the user's hand is at the hand location, the virtual affordance is displayed at a landing location corresponding to the virtual target, where the landing location is independent of both the gaze location and the user's hand location. Movement of the user's hand is tracked and the virtual affordance is modified in response to the movement.
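The gating condition in this abstract, showing the affordance only when the gaze falls in the target zone and the hand falls in the tracking volume, can be sketched as a simple predicate. The axis-aligned `Box` class and the function name `should_show_affordance` are hypothetical; the patent does not specify the shape of the zone or volume.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned region given by minimum and maximum (x, y, z) corners."""
    lo: tuple
    hi: tuple

    def contains(self, p):
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

def should_show_affordance(gaze_point, hand_point, target_zone, tracking_volume):
    """Display the virtual affordance only when the gaze lies within the
    target zone AND the hand lies within the designated tracking volume.
    The landing location itself is tied to the virtual target, not to
    either of these measured locations."""
    return target_zone.contains(gaze_point) and tracking_volume.contains(hand_point)
```

Note the design point the abstract emphasizes: the two measurements only gate *whether* the affordance appears; *where* it appears (the landing location) is determined by the target alone, which keeps the affordance stable against gaze and hand jitter.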
-
Publication No.: US10140776B2
Publication Date: 2018-11-27
Application No.: US15181250
Filing Date: 2016-06-13
Applicants: Julia Schwarz , Bharat Ahluwalia , David Calabrese , Robert C J Pengelly , Yasaman Sheri , James Tichenor
Inventors: Julia Schwarz , Bharat Ahluwalia , David Calabrese , Robert C J Pengelly , Yasaman Sheri , James Tichenor
IPC Classes: G06T19/20 , G06F3/01 , G06T19/00 , G06F3/0484 , G06T15/60
Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
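The control-point tracking loop in this abstract can be sketched for one concrete property. The patent covers altering properties generally; picking uniform scale as the altered property, and the names `ControlPoint` and `update_scale`, are assumptions for illustration only.

```python
import math
from dataclasses import dataclass

@dataclass
class ControlPoint:
    """A target control point attached to a rendered object (e.g. a corner
    of its bounding box), identified when the grab gesture is detected."""
    position: tuple

def update_scale(obj_center, control_point, control_object_pos):
    """While the gesture is held, the target control point tracks the
    control object (e.g. the user's hand). The ratio of the control
    point's distance from the object's center after vs. before the move
    gives a uniform scale factor to apply to the rendered object."""
    before = math.dist(obj_center, control_point.position)
    control_point.position = control_object_pos  # track the control object
    after = math.dist(obj_center, control_point.position)
    return after / before
```

In use, the renderer would apply the returned factor each frame and redraw, so the rendering reflects the altered property as the abstract's final sentence describes.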
-
Publication No.: US20170358144A1
Publication Date: 2017-12-14
Application No.: US15181250
Filing Date: 2016-06-13
Applicants: Julia Schwarz , Bharat Ahluwalia , David Calabrese , Robert CJ Pengelly , Yasaman Sheri , James Tichenor
Inventors: Julia Schwarz , Bharat Ahluwalia , David Calabrese , Robert CJ Pengelly , Yasaman Sheri , James Tichenor
CPC Classes: G06T19/20 , G06F3/011 , G06F3/013 , G06F3/017 , G06F3/04842 , G06F3/04845 , G06T15/60 , G06T19/006 , G06T2219/20 , G06T2219/2004 , G06T2219/2016
Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
-