-
Publication No.: US20240020853A1
Publication Date: 2024-01-18
Application No.: US17273148
Filing Date: 2019-08-28
Applicant: THE UNIVERSITY OF TOKYO
Inventor: Yoshihiko NAKAMURA , Wataru TAKANO , Yosuke IKEGAMI
CPC classification number: G06T7/215 , G06T7/143 , G06T2207/10016 , G06T2207/10024
Abstract: The invention relates to automatic discovery and evaluation of a motion, treating motion recognition as an optimization problem over a series of basic motions obtained by segmenting a subject's motion. The method comprises segmenting time-series data defining a motion of a subject into a plurality of segments, classifying each segment into a class for a basic motion by using the time-series data of the segment, and converting the motion of the subject into a sequence of high-rank symbols, each formed from a series of the basic motions. A function that calculates a score for a pair of a high-rank symbol and a sequence of basic motions is provided, and the motion of the subject is converted into the sequence of high-rank symbols by an optimization calculation using dynamic programming.
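The dynamic-programming step in the abstract above can be sketched as follows. This is an illustrative toy, not the patent's actual model: the symbol vocabulary, the grammar, and the score function are all assumptions made for the example, and the recursion simply searches all split points of the basic-motion sequence for the highest-scoring partition into high-rank symbols.

```python
def best_symbol_sequence(basic_motions, symbols, score):
    """DP over split points: best[i] is the best score for basic_motions[:i]."""
    n = len(basic_motions)
    best = [float("-inf")] * (n + 1)
    best[0] = 0.0
    back = [None] * (n + 1)  # (segment start, symbol) chosen for prefix of length i
    for i in range(1, n + 1):
        for j in range(i):
            seg = tuple(basic_motions[j:i])
            for sym in symbols:
                s = best[j] + score(sym, seg)
                if s > best[i]:
                    best[i] = s
                    back[i] = (j, sym)
    # Recover the high-rank symbol sequence by walking the backpointers.
    seq, i = [], n
    while i > 0:
        j, sym = back[i]
        seq.append(sym)
        i = j
    return list(reversed(seq)), best[n]

# Toy grammar and score (hypothetical): a symbol scores the segment length if
# the segment is one of its known basic-motion expansions, else -infinity.
grammar = {"walk": {("step_l", "step_r")}, "wave": {("raise", "lower")}}

def score(sym, seg):
    return float(len(seg)) if seg in grammar.get(sym, ()) else float("-inf")

seq, total = best_symbol_sequence(
    ["step_l", "step_r", "raise", "lower"], ["walk", "wave"], score)
```

With this toy input the optimizer recovers `["walk", "wave"]`; in the patented method the score function would instead come from the learned relation between high-rank symbols and basic-motion sequences.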
-
Publication No.: US20230031291A1
Publication Date: 2023-02-02
Application No.: US17786773
Filing Date: 2020-12-24
Applicant: THE UNIVERSITY OF TOKYO , OHTAKE ROOT KOGYO CO., LTD.
Inventor: Yoshihiko NAKAMURA , Yosuke IKEGAMI , Yoshitake OTA , Yuji KANZAKI , Hirotaka KOGA
Abstract: A treadmill according to the present invention includes: a frame; an endless belt; an endless-belt drive unit; a plurality of cameras provided on the frame such that the orientations and/or locations thereof can be adjusted; a motion-data acquisition unit that markerlessly acquires motion data of a subject by using image information obtained with camera images; a motion analysis unit that analyzes the motion of the subject by using the motion data; and a control unit that controls the endless-belt drive unit based on the motion data.
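One way such a control unit could use the motion data is a simple feedback rule on belt speed. The sketch below is only an illustration of the idea, not the patent's controller: the proportional gain, the tracked pelvis coordinate, and the speed limits are all assumptions for the example.

```python
def belt_speed_command(current_speed, pelvis_x, center_x=0.0, gain=0.8,
                       min_speed=0.0, max_speed=5.0):
    """Speed the belt up when the subject drifts forward of center (pelvis_x >
    center_x), slow it down when the subject falls behind, clamped to limits."""
    command = current_speed + gain * (pelvis_x - center_x)
    return max(min_speed, min(max_speed, command))
```

For example, a subject 0.3 m forward of center at a 1.5 m/s belt speed would yield a 1.74 m/s command under these assumed gains.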
-
Publication No.: US20240303859A1
Publication Date: 2024-09-12
Application No.: US18549351
Filing Date: 2022-03-07
Applicant: The University of Tokyo
Inventor: Yoshihiko NAKAMURA , Yosuke IKEGAMI , Takuya OHASHI
CPC classification number: G06T7/74 , G06T17/00 , G06T2207/20081 , G06T2207/20084 , G06T2207/30196 , G06T2210/12
Abstract: The target includes a plurality of keypoints of a body including a plurality of joints, and the 3D position of the target is identified by the positions of the plurality of keypoints. A bounding box and reference 2D joint position determining unit determines a bounding box surrounding the target in a camera image at a target frame to be predicted subsequent to at least one frame at which imaging is performed by the plurality of cameras, using the 3D positions of the keypoints of the target at the at least one frame, and acquires reference 2D positions of the keypoints projected from the 3D positions of the keypoints of the target onto a predetermined plane. A 3D pose acquiring unit acquires the 3D positions of the keypoints of the target at the target frame using image information in the bounding box and the reference 2D positions.
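The bounding-box step described above can be sketched as: project the keypoints' 3D positions from an earlier frame through the camera model, then take the 2D extent of the projections (plus a margin) as the search box for the target frame. The pinhole intrinsics and margin below are illustrative assumptions, not values from the patent.

```python
def project(point3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point (x, y, z) to pixel (u, v)."""
    x, y, z = point3d
    return (fx * x / z + cx, fy * y / z + cy)

def bounding_box(keypoints3d, fx=500.0, fy=500.0, cx=320.0, cy=240.0, margin=10.0):
    """Axis-aligned box (u_min, v_min, u_max, v_max) around projected keypoints."""
    pts = [project(p, fx, fy, cx, cy) for p in keypoints3d]
    us = [u for u, _ in pts]
    vs = [v for _, v in pts]
    return (min(us) - margin, min(vs) - margin, max(us) + margin, max(vs) + margin)
```

The projected 2D points themselves would serve as the "reference 2D positions" handed to the pose-acquiring stage.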
-
Publication No.: US20220108468A1
Publication Date: 2022-04-07
Application No.: US17274744
Filing Date: 2019-08-29
Applicant: THE UNIVERSITY OF TOKYO
Inventor: Yoshihiko NAKAMURA , Wataru TAKANO , Yosuke IKEGAMI , Takuya OHASHI , Kazuki YAMAMOTO , Kentaro TAKEMOTO
Abstract: The present invention provides a high-accuracy motion capture technology that can replace optical motion capture, without attaching optical markers or sensors to a subject. A subject with an articulated structure has a plurality of feature points in the body, including a plurality of joints, wherein the distance between adjacent feature points is obtained as a constant. A spatial distribution of the likelihood of the position of each feature point is obtained based on a single input image or a plurality of input images taken at the same time. One or more position candidates corresponding to each feature point are obtained based on this spatial distribution. Each joint angle is obtained by performing an optimization calculation based on inverse kinematics using the candidates and the articulated structure. Positions of the feature points, including the joints, are obtained by performing a forward kinematics calculation using the joint angles.
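The inverse-then-forward kinematics pipeline in the abstract can be illustrated on a toy 2-link planar arm. This is a deliberately minimal stand-in, not the patented method: the link lengths are assumed constants (mirroring the fixed inter-feature-point distances), the "optimization" is a coarse grid search toward a single candidate position, and the real system operates on a full articulated body with likelihood distributions per feature point.

```python
import math

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """Return base, elbow, and end-effector positions of a planar 2-link arm."""
    elbow = (l1 * math.cos(theta1), l1 * math.sin(theta1))
    end = (elbow[0] + l2 * math.cos(theta1 + theta2),
           elbow[1] + l2 * math.sin(theta1 + theta2))
    return [(0.0, 0.0), elbow, end]

def inverse_kinematics(target, steps=360, l1=1.0, l2=1.0):
    """Grid search for the joint angles whose end effector best matches the
    candidate feature-point position (a crude stand-in for the optimization)."""
    best = None
    for i in range(steps):
        for j in range(steps):
            t1 = 2 * math.pi * i / steps
            t2 = 2 * math.pi * j / steps
            ex, ey = forward_kinematics(t1, t2, l1, l2)[-1]
            err = (ex - target[0]) ** 2 + (ey - target[1]) ** 2
            if best is None or err < best[0]:
                best = (err, t1, t2)
    return best[1], best[2]
```

As in the abstract, the joint angles come out of the inverse-kinematics optimization, and the positions of all feature points (base, elbow, end effector) are then recovered by the forward-kinematics pass.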
-