-
Publication number: US20240201793A1
Publication date: 2024-06-20
Application number: US18542263
Filing date: 2023-12-15
Applicant: TDK CORPORATION
Inventors: Rémi Louis Clément PONÇOT, Juan S Mejia SANTAMARIA, Abbas ATAYA, Etienne De FORAS, Bruno FLAMENT
Abstract: In a method for training a gesture recognition model, gesture data is collected from an inertial measurement unit (IMU) positioned on one side of a user, wherein the IMU is capable of collecting data when positioned on either side of the user. A transformation is applied to the gesture data, wherein the transformation generates transformed gesture data that is independent of the side of the user on which the IMU is positioned. A gesture recognition model is trained using the transformed gesture data.
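The abstract does not say what transformation makes the IMU data side-independent. Below is a minimal sketch of one plausible choice, mirroring samples recorded on one wrist into the frame of the other; the 6-axis layout, the choice of lateral axis, and the function names are assumptions for illustration only.

```python
import numpy as np

def mirror_imu(sample: np.ndarray) -> np.ndarray:
    """Mirror one 6-axis IMU sample (ax, ay, az, gx, gy, gz) across the
    body's left/right plane. Assumes x is the lateral axis: mirroring
    negates the lateral acceleration and the angular rates about the
    other two axes (angular velocity is a pseudovector)."""
    ax, ay, az, gx, gy, gz = sample
    return np.array([-ax, ay, az, gx, -gy, -gz])

def canonicalize(samples: np.ndarray, worn_on_left: bool) -> np.ndarray:
    """Map gesture data recorded on either wrist into one canonical
    frame, so a single model can be trained on side-independent data."""
    return np.apply_along_axis(mirror_imu, 1, samples) if worn_on_left else samples
```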
-
Publication number: US20240201792A1
Publication date: 2024-06-20
Application number: US18537224
Filing date: 2023-12-12
Applicant: GOOGLE LLC
Inventors: Stiven Guillaume Francois Morvan, Dongeek Shin, Andrea Colaco, Sambuddha Basu, Sean Kyungmok Bae, Junyi Zhu
IPC classification: G06F3/01, A61B5/0536, A61B5/263, G06V10/12, G06V10/764, G06V40/20
CPC classification: G06F3/017, A61B5/0536, A61B5/263, G06V10/12, G06V10/764, G06V40/28
Abstract: Techniques include determining hand gestures formed by a user based on an electrical impedance tomograph of the wrist. For example, a user may be outfitted with a flexible wristband that fits snugly around the wrist and contains a plurality of electrodes, e.g., 32 electrodes. When a current is applied to a first subset of the electrodes, e.g., two of 32 electrodes, the electric field induced through at least one cross-section of the wrist will in turn induce a voltage across adjacent pairs of a second subset of the electrodes (e.g., the other 30 of 32 electrodes). From this current and induced voltage, one may use techniques of electrical impedance tomography (EIT) to determine the electrical impedance throughout the at least one cross-section of the wrist, e.g., in an electrical impedance tomograph. One may use a neural network to map the electrical impedance tomograph to a hand gesture.
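A minimal sketch of the drive-and-measure loop the abstract describes: drive a current through one pair of electrodes and read the voltages across adjacent pairs of the remaining electrodes. The adjacent-pair drive pattern and the two hardware callbacks are assumptions; the EIT reconstruction and the neural-network classifier are outside this sketch.

```python
import numpy as np

N_ELECTRODES = 32  # electrode count used as an example in the abstract

def adjacent_pair_frame(drive_current, measure_voltage):
    """Collect one EIT frame: for every adjacent drive pair, record the
    voltage across each adjacent pair of the remaining electrodes.
    `drive_current(i, j)` and `measure_voltage(i, j)` are hypothetical
    hardware callbacks."""
    frame = []
    for d in range(N_ELECTRODES):
        drive = (d, (d + 1) % N_ELECTRODES)
        drive_current(*drive)
        for m in range(N_ELECTRODES):
            sense = (m, (m + 1) % N_ELECTRODES)
            if set(sense) & set(drive):
                continue  # skip pairs that share a driven electrode
            frame.append(measure_voltage(*sense))
    return np.asarray(frame)
```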
-
Publication number: US20240201789A1
Publication date: 2024-06-20
Application number: US18084885
Filing date: 2022-12-20
Inventors: Keyu Qi, Hailing Zhou, Nan Ke, David Nguyen, Binghao Tang
CPC classification: G06F3/017, G06V10/761, G06V10/82, G06V20/52
Abstract: Implementations are directed to receiving a first set of images included in a first video captured by a camera that monitors a human performing a task; processing the first set of images using a first machine learning (ML) model to determine whether the first set of images depicts a gesture that is included in a predefined set of gestures; in response to determining that the first set of images depicts a gesture included in the predefined set of gestures, processing a second set of images included in the first video using a second ML model to determine a first gesture type of the gesture; comparing the first gesture type with a first expected gesture type to determine whether performance of the task conforms to a standard operating procedure (SOP) for the task; and providing feedback representative of a comparison result in a user interface.
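A minimal sketch of the two-stage control flow described above, assuming placeholder `detector` and `classifier` model objects with `contains_gesture` and `classify` methods (names invented here for illustration):

```python
def check_sop_step(first_frames, second_frames, detector, classifier, expected_type):
    """Run the gesture-type classifier only when the first model detects
    one of the predefined gestures, then compare the predicted type
    against the type the SOP expects for this step."""
    if not detector.contains_gesture(first_frames):
        return {"gesture_found": False, "conforms": False}
    gesture_type = classifier.classify(second_frames)
    return {
        "gesture_found": True,
        "gesture_type": gesture_type,
        "conforms": gesture_type == expected_type,
    }
```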
-
Publication number: US20240201788A1
Publication date: 2024-06-20
Application number: US18068118
Filing date: 2022-12-19
Applicant: Robert Bosch GmbH
Inventors: Sharath Gopal, Shubhang Bhatnagar, Liu Ren
IPC classification: G06F3/01, G06V10/764, G06V10/82
CPC classification: G06F3/017, G06V10/764, G06V10/82
Abstract: A computer-implemented system and method relate to gesture recognition. A machine learning model includes a first subnetwork, a second subnetwork, and a third subnetwork. The first subnetwork generates feature data based on sensor data, which includes a gesture. The feature data is divided into a set of patches. The second subnetwork selects a target patch of feature data from among the set of patches. The third subnetwork generates gesture data based on the target patch of feature data. The gesture data identifies the gesture of the sensor data. Command data is generated based on the gesture data. A device is controlled based on the command data.
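A rough PyTorch-style sketch of the three-subnetwork layout the abstract describes (feature extraction, patch selection, gesture classification). The layer sizes, patch count, and the soft rather than hard patch selection are illustrative assumptions, not the patent's architecture.

```python
import torch
import torch.nn as nn

class PatchGestureNet(nn.Module):
    def __init__(self, in_dim=64, feat_dim=128, n_patches=8, n_gestures=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())  # first subnetwork
        self.n_patches = n_patches
        self.patch_dim = feat_dim // n_patches
        self.scorer = nn.Linear(self.patch_dim, 1)               # second subnetwork
        self.classifier = nn.Linear(self.patch_dim, n_gestures)  # third subnetwork

    def forward(self, x):
        feats = self.backbone(x)                                  # feature data from sensor data
        patches = feats.view(-1, self.n_patches, self.patch_dim)  # divide features into patches
        weights = torch.softmax(self.scorer(patches).squeeze(-1), dim=-1)
        target = (weights.unsqueeze(-1) * patches).sum(dim=1)     # soft "target patch" selection
        return self.classifier(target)                            # gesture logits
```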
-
Publication number: US20240201787A1
Publication date: 2024-06-20
Application number: US18067956
Filing date: 2022-12-19
Applicant: T-Mobile USA, Inc.
IPC classification: G06F3/01, G06V20/20, G06V40/10, G06V40/20, H04M1/72454
Abstract: This document discloses techniques, apparatuses, and systems for hand-movement based interactions with augmented reality (AR) objects. A set of hand gestures is determined, where each hand gesture is associated with a respective interaction with a virtual object in an AR environment. An environment is captured within the field of view of a viewfinder and included within an AR environment having a virtual object. A hand gesture performed in the field of view of the viewfinder is captured. The captured hand gesture is determined to correspond to a particular hand gesture from the set of hand gestures. As a result, a particular interaction associated with the particular hand gesture is performed on an object within the augmented reality environment. In doing so, an AR engine can effectively interact with objects in an AR environment using hand gestures.
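A minimal sketch of the gesture-to-interaction dispatch implied by the abstract; the gesture names and object methods are hypothetical.

```python
# Hypothetical mapping from recognized gestures to AR interactions.
GESTURE_INTERACTIONS = {
    "pinch": lambda obj: obj.grab(),
    "swipe_left": lambda obj: obj.rotate(-90),
    "open_palm": lambda obj: obj.release(),
}

def apply_gesture(recognized_gesture, virtual_object):
    """Perform the interaction associated with a recognized gesture,
    if it belongs to the predefined set; otherwise do nothing."""
    action = GESTURE_INTERACTIONS.get(recognized_gesture)
    if action is None:
        return False
    action(virtual_object)
    return True
```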
-
Publication number: US20240201736A1
Publication date: 2024-06-20
Application number: US18396534
Filing date: 2023-12-26
Applicant: Ouraring Inc.
IPC classification: G06F1/16, A61B5/00, A61B5/01, A61B5/0205, A61B5/021, A61B5/024, A61B5/11, A61B5/145, A61B5/1455, A61B5/332, A61B5/349, G01P15/00, G02B19/00, G04G21/02, G06F3/01, G06F3/14, G06F21/32, G06V10/75, G06V40/10, G06V40/70, G08B5/36, G08B21/02, G08C17/02, H02J7/00, H02J7/35, H02S40/22, H02S99/00
CPC classification: G06F1/163, A61B5/01, A61B5/0205, A61B5/021, A61B5/02416, A61B5/1118, A61B5/14532, A61B5/1455, A61B5/332, A61B5/349, A61B5/681, A61B5/6826, G01P15/00, G02B19/0052, G02B19/0061, G04G21/02, G04G21/025, G06F1/1635, G06F3/014, G06F3/017, G06F3/14, G06F21/32, G06V10/751, G06V40/10, G06V40/70, G08B5/36, G08B21/02, G08C17/02, H02J7/0044, H02J7/35, H02S40/22, H02S99/00, A61B2560/0214, A61B2560/0412, A61B2562/146, A61B2562/164, A61B2562/166, G02B19/0042, G06V40/15, G08C2201/30, Y02E10/52
Abstract: A finger-worn wearable ring device may include a ring-shaped housing, a printed circuit board, and a sensor module that includes one or more light-emitting components and one or more light-receiving components. The wearable ring device may further include a communication module configured to wirelessly communicate with an application executable on a user device.
-
Publication number: US12014464B2
Publication date: 2024-06-18
Application number: US17714496
Filing date: 2022-04-06
Applicant: Magic Leap, Inc.
IPC classification: G06T19/00, G02B27/01, G06F3/01, G06F3/04815, G06F3/0482, G06K7/14, G06K19/06, G06V40/10
CPC classification: G06T19/006, G02B27/017, G06F3/011, G06F3/012, G06F3/013, G06F3/015, G06F3/04815, G06F3/0482, G06K7/1408, G06K19/06009, G06V40/107, G06F3/017
Abstract: Examples of systems and methods for a wearable system to automatically select or filter available user interface interactions or virtual objects are disclosed. The wearable system can select a group of virtual objects for user interaction based on contextual information associated with the user, the user's environment, physical or virtual objects in the user's environment, or the user's physiological or psychological state.
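A minimal sketch of contextual filtering of virtual objects, under the assumption that each object carries simple location/activity tags; this schema is invented here for illustration and is not the patent's.

```python
def select_virtual_objects(virtual_objects, context):
    """Keep only the virtual objects whose tags match the current
    context (e.g., the user's location or activity). The `tags` field
    and the context keys are illustrative assumptions."""
    relevant = {context.get("location"), context.get("activity")}
    return [obj for obj in virtual_objects if relevant & set(obj.get("tags", []))]
```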
-
Publication number: US12013986B2
Publication date: 2024-06-18
Application number: US17838887
Filing date: 2022-06-13
Inventors: Ayan Sinha, Chiho Choi, Joon Hee Choi, Karthik Ramani
IPC classification: G06F3/01, G06N20/00, G06V10/44, G06V20/64, G06V30/194, G06V40/10, H04N13/271
CPC classification: G06F3/017, G06N20/00, G06V10/454, G06V20/653, G06V30/194, G06V40/11, G06V40/113, H04N13/271
Abstract: A method for hand pose identification in an automated system includes providing map data of a hand of a user to a first neural network trained to classify features corresponding to a joint angle of a wrist in the hand to generate a first plurality of activation features and performing a first search in a predetermined plurality of activation features stored in a database in the memory to identify a first plurality of hand pose parameters for the wrist associated with predetermined activation features in the database that are nearest neighbors to the first plurality of activation features. The method further includes generating a hand pose model corresponding to the hand of the user based on the first plurality of hand pose parameters and performing an operation in the automated system in response to input from the user based on the hand pose model.
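A minimal sketch of the nearest-neighbor step: given activation features from the first network, look up the stored activation vectors closest to them and return the associated hand pose parameters. A brute-force L2 search stands in for whatever index the described system actually uses.

```python
import numpy as np

def nearest_pose_parameters(activation, db_activations, db_pose_params, k=1):
    """Return the hand pose parameters associated with the k stored
    activation-feature vectors nearest to `activation` (L2 distance).
    `db_activations` has shape (N, D); `db_pose_params` has shape (N, P)."""
    dists = np.linalg.norm(db_activations - activation, axis=1)
    nearest = np.argsort(dists)[:k]
    return db_pose_params[nearest]
```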
-
Publication number: US12013725B2
Publication date: 2024-06-18
Application number: US18448595
Filing date: 2023-08-11
Applicant: Ouraring Inc.
IPC classification: G06F1/16, A61B5/00, A61B5/01, A61B5/0205, A61B5/021, A61B5/024, A61B5/11, A61B5/145, A61B5/1455, A61B5/332, A61B5/349, G01P15/00, G02B19/00, G04G21/02, G06F3/01, G06F3/14, G06F21/32, G06V10/75, G06V40/10, G06V40/70, G08B5/36, G08B21/02, G08C17/02, H02J7/00, H02J7/35, H02S40/22, H02S99/00
CPC classification: G06F1/163, A61B5/01, A61B5/0205, A61B5/021, A61B5/02416, A61B5/1118, A61B5/14532, A61B5/1455, A61B5/332, A61B5/349, A61B5/681, A61B5/6826, G01P15/00, G02B19/0052, G02B19/0061, G04G21/02, G04G21/025, G06F1/1635, G06F3/014, G06F3/017, G06F3/14, G06F21/32, G06V10/751, G06V40/10, G06V40/70, G08B5/36, G08B21/02, G08C17/02, H02J7/0044, H02J7/35, H02S40/22, H02S99/00, A61B2560/0214, A61B2560/0412, A61B2562/146, A61B2562/164, A61B2562/166, G02B19/0042, G06V40/15, G08C2201/30, Y02E10/52
Abstract: A finger-worn wearable ring device may include a ring-shaped housing, a printed circuit board, and a sensor module that includes one or more light-emitting components and one or more light-receiving components. The wearable ring device may further include a communication module configured to wirelessly communicate with an application executable on a user device.
-
Publication number: US20240193866A1
Publication date: 2024-06-13
Application number: US18078832
Filing date: 2022-12-09
Applicants: Yannick VERDIE, Zihao YANG, Deepak SRIDHAR, Steven George MCDONAGH, Juwei LU
Inventors: Yannick VERDIE, Zihao YANG, Deepak SRIDHAR, Steven George MCDONAGH, Juwei LU
Abstract: Methods and systems for estimation of a 3D hand pose are disclosed. A 2D image containing a detected hand is processed using a U-net network to obtain a global feature vector and a heatmap for the keypoints of the hand. Information from the global feature vector and the heatmap are concatenated to obtain a set of input tokens that are processed using a transformer encoder to obtain a first set of 2D keypoints representing estimated 2D locations of the keypoints in a first view. The first set of 2D keypoints are inputted as a query to a transformer decoder, to obtain a second set of 2D keypoints representing estimated 2D locations of the keypoints in a second view. The first and second sets of 2D keypoints are aggregated to output the set of estimated 3D keypoints.
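A data-flow sketch of the pipeline the abstract describes, with the U-net, transformer encoder, transformer decoder, and aggregation step left as placeholder callables; the tokenization detail (pairing each keypoint heatmap with the global feature vector) is an assumption.

```python
import torch

def estimate_3d_keypoints(image, unet, encoder, decoder, aggregate):
    """Sketch of the two-view keypoint pipeline. `unet`, `encoder`,
    `decoder`, and `aggregate` are placeholder callables."""
    global_feat, heatmaps = unet(image)                # (B, D) and (B, K, H, W)
    n_kp = heatmaps.shape[1]
    tokens = torch.cat(                                # one input token per keypoint
        [heatmaps.flatten(2), global_feat.unsqueeze(1).expand(-1, n_kp, -1)],
        dim=-1,
    )
    kp_view1 = encoder(tokens)                         # 2D keypoints, first view
    kp_view2 = decoder(query=kp_view1, memory=tokens)  # 2D keypoints, second view
    return aggregate(kp_view1, kp_view2)               # estimated 3D keypoints
```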