-
Publication No.: US20200156721A1
Publication Date: 2020-05-21
Application No.: US16452532
Application Date: 2019-06-26
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Youjun Xiong, Chunyu Chen, Yizhang Liu, Ligang Ge, Jianxin Pang
IPC: B62D57/032
Abstract: The present disclosure provides a robot gait planning method and a robot with the same. The method includes: obtaining, through the sensor set, force information of the feet of the robot under a force applied by a target object; calculating coordinates of zero moment points of the feet of the robot with respect to a centroid of a body of the robot based on the force information; and determining a gait planning result for the robot based on the coordinates of the zero moment points with respect to the centroid of the body. The present disclosure converts the force applied by the target object into zero moment points and uses the zero moment points to perform the gait planning, so that the robot follows the target object when it is subjected to a force from the target object.
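The abstract does not spell out the formulas, but a per-foot zero moment point can be estimated from a six-axis force/torque reading in the usual way. The sketch below illustrates only that step; every function name, frame convention, and offset in it is an assumption rather than the patent's implementation.

```python
# A minimal, illustrative sketch (not the patent's implementation): estimate a
# per-foot zero moment point (ZMP) from a 6-axis force/torque reading and
# express it relative to the body centroid. Names and frames are assumptions.
import numpy as np

def foot_zmp(force, torque, sensor_height=0.0):
    """ZMP of one foot in the sensor frame.

    force = (fx, fy, fz), torque = (tx, ty, tz); sensor_height is the
    distance from the sensor to the sole (0 if the sensor sits at the sole).
    """
    fx, fy, fz = force
    tx, ty, tz = torque
    if abs(fz) < 1e-6:            # foot not loaded: ZMP undefined
        return None
    px = (-ty - fx * sensor_height) / fz
    py = ( tx - fy * sensor_height) / fz
    return np.array([px, py, 0.0])

def zmp_relative_to_centroid(zmp_in_foot, foot_pos, centroid_pos):
    """Shift a foot-frame ZMP into coordinates relative to the body centroid,
    assuming foot_pos and centroid_pos are expressed in the same world frame."""
    return (foot_pos + zmp_in_foot) - centroid_pos

# Example: left foot pushed forward by an external object.
left_zmp = foot_zmp(force=(5.0, 0.0, 200.0), torque=(0.0, -8.0, 0.0))
rel = zmp_relative_to_centroid(left_zmp, foot_pos=np.array([0.0, 0.1, 0.0]),
                               centroid_pos=np.array([0.0, 0.0, 0.8]))
```

In the patent's framing, the resulting ZMP coordinates relative to the body centroid would then drive the gait planner so that the robot steps along the direction of the applied force; the sketch covers only the estimation step.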
-
Publication No.: US12243242B2
Publication Date: 2025-03-04
Application No.: US17866574
Application Date: 2022-07-18
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Zaiwang Gu, Jianxin Pang
Abstract: A method includes: performing target detection on a current image to obtain position detection information of a plurality of detected targets; obtaining, from tracking information of each of a plurality of tracked targets, position prediction information and a number of tracking losses, and determining a first matching threshold for each tracked target according to its number of tracking losses; calculating a motion matching degree between each tracked target and each detected target according to the position detection information and the position prediction information; for each tracked target, obtaining a motion matching result according to the motion matching degree and the first matching threshold corresponding to that tracked target; and matching the detected targets and the tracked targets according to the motion matching results to obtain a tracking result.
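As a rough illustration of the matching logic described above (not the patented algorithm), the sketch below uses IoU between predicted and detected boxes as the motion matching degree, lowers a track's matching threshold as its loss count grows, and matches greedily; all of these concrete choices are assumptions.

```python
# Illustrative sketch only: match tracked targets to detections using a motion
# matching degree (here IoU between predicted and detected boxes, an assumption)
# and a per-track threshold that loosens as the track accumulates losses.
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def first_matching_threshold(loss_count, base=0.5, step=0.05, floor=0.2):
    """Hypothetical rule: the more frames a track has been lost, the lower
    (more permissive) its matching threshold, down to a floor."""
    return max(floor, base - step * loss_count)

def match(tracks, detections):
    """tracks: list of dicts with 'pred_box' and 'loss_count';
    detections: list of boxes. Greedy matching, one detection per track."""
    results, used = [], set()
    for ti, trk in enumerate(tracks):
        thr = first_matching_threshold(trk["loss_count"])
        scores = [(iou(trk["pred_box"], det), di)
                  for di, det in enumerate(detections) if di not in used]
        if not scores:
            results.append((ti, None))
            continue
        best_score, best_di = max(scores)
        if best_score >= thr:
            used.add(best_di)
            results.append((ti, best_di))
        else:
            results.append((ti, None))
    return results
```

A real tracker would more likely solve the assignment jointly (for example with Hungarian matching) rather than greedily, but the per-track threshold idea is the same.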
-
Publication No.: US12080098B2
Publication Date: 2024-09-03
Application No.: US17562963
Application Date: 2021-12-27
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Yusheng Zeng, Jun Cheng, Jianxin Pang
CPC classification number: G06V40/171, G06T7/73, G06V40/166
Abstract: A method for training a multi-task recognition model includes: obtaining a number of sample images, wherein some of the sample images are to provide feature-independent facial attributes, some of the sample images are to provide feature-coupled facial attributes, and some of the sample images are to provide facial attributes of face poses; training an initial feature-sharing model based on a first set of sample images to obtain a first feature-sharing model; training the first feature-sharing model based on the first set of sample images and a second set of sample images to obtain a second feature-sharing model with a loss value less than a preset second threshold; obtaining an initial multi-task recognition model by adding a feature decoupling model to the second feature-sharing model; and training the initial multi-task recognition model based on the sample images to obtain a trained multi-task recognition model.
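The staged procedure can be pictured as successive passes over a shared backbone; the sketch below is a schematic stand-in with made-up layer sizes, losses, and synthetic data, not the patented recipe.

```python
# A schematic sketch (not the patented training recipe): a shared backbone is
# first trained on one set of samples, then fine-tuned on both sets, and finally
# extended with a separate "feature decoupling" head before joint training.
# The architecture, losses, and data below are all placeholder assumptions.
import torch
import torch.nn as nn

class FeatureSharing(nn.Module):
    def __init__(self, dim=64, n_attrs=4):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(128, dim), nn.ReLU())
        self.attr_head = nn.Linear(dim, n_attrs)   # feature-independent attributes

    def forward(self, x):
        feat = self.backbone(x)
        return feat, self.attr_head(feat)

def train_stage(model, data, epochs=2, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for x, y in data:
            _, logits = model(x)
            loss = loss_fn(logits, y)
            opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Synthetic stand-ins for the two sample sets.
set1 = [(torch.randn(16, 128), torch.rand(16, 4).round()) for _ in range(5)]
set2 = [(torch.randn(16, 128), torch.rand(16, 4).round()) for _ in range(5)]

model = FeatureSharing()
train_stage(model, set1)                 # first feature-sharing model
train_stage(model, set1 + set2)          # second feature-sharing model

# Add a decoupling head (e.g. for pose, an assumption) and run a joint forward
# pass; the final joint training stage would proceed analogously.
decoupling_head = nn.Linear(64, 3)
feat, _ = model(torch.randn(2, 128))
pose = decoupling_head(feat)
```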
-
Publication No.: US11941844B2
Publication Date: 2024-03-26
Application No.: US17403902
Application Date: 2021-08-17
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Yepeng Liu, Yusheng Zeng, Jun Cheng, Jing Gu, Yue Wang, Jianxin Pang
CPC classification number: G06T7/74, G06N3/04, G06T7/75, G06V40/164, G06T2207/20081, G06T2207/20084, G06T2207/30201
Abstract: An object detection model generation method as well as an electronic device and a computer readable storage medium using the same are provided. The method includes: during the iterative training of the to-be-trained object detection model, sequentially determining the detection accuracy of the iteration nodes of the object detection model according to the node order, and enhancing the mis-detected negative samples of the object detection model at the iteration nodes whose detection accuracy is less than or equal to a preset threshold. The object detection model is then trained at the iteration node based on the enhanced negative samples and a first amount of preset training samples. After the training at the iteration node is completed, the process returns to the step of sequentially determining the detection accuracy of the iteration nodes until the training of the object detection model is completed.
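A control-flow sketch of the idea, with a deliberately trivial stand-in detector: after each "iteration node" the model is checked, and when its accuracy on negatives is at or below the threshold, the mis-detected negatives are enhanced (here merely duplicated) and mixed with a first amount of ordinary samples. Everything concrete below is an assumption, not the patent's procedure.

```python
# Sketch of the hard-negative-enhancement loop only; the toy "detector" is a
# 1-D threshold classifier standing in for a real object detection model.
import random

class ToyDetector:
    """Stand-in detector: predicts 'object' when the feature exceeds a threshold."""
    def __init__(self, threshold=0.5):
        self.threshold = threshold

    def predict(self, x):
        return x > self.threshold

    def fit(self, samples):
        # Trivial "training": move the threshold toward the mean of the batch.
        feats = [x for x, _ in samples]
        self.threshold = 0.5 * self.threshold + 0.5 * (sum(feats) / len(feats))

def accuracy(model, samples):
    return sum(model.predict(x) == y for x, y in samples) / len(samples)

def train_with_negative_enhancement(model, train_samples, neg_samples,
                                    nodes=5, acc_threshold=0.9, first_amount=8):
    for _ in range(nodes):
        if accuracy(model, neg_samples) <= acc_threshold:
            # Negatives (label False) that the model currently flags as positives.
            hard = [s for s in neg_samples if model.predict(s[0]) != s[1]]
            enhanced = hard * 2                  # naive "enhancement": duplicate
            batch = enhanced + random.sample(
                train_samples, k=min(first_amount, len(train_samples)))
        else:
            batch = train_samples
        model.fit(batch)
    return model

random.seed(0)
positives = [(random.uniform(0.6, 1.0), True) for _ in range(20)]
negatives = [(random.uniform(0.0, 0.7), False) for _ in range(20)]
model = train_with_negative_enhancement(ToyDetector(), positives + negatives, negatives)
```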
-
Publication No.: US11833692B2
Publication Date: 2023-12-05
Application No.: US17115712
Application Date: 2020-12-08
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Dake Zheng, Yizhang Liu, Zheng Xie, Jianxin Pang, Youjun Xiong
IPC: B25J9/16, G05B19/4155, B25J13/08
CPC classification number: B25J9/1666, B25J9/1605, B25J9/1643, G05B19/4155, B25J13/08, G05B2219/40269
Abstract: The present disclosure provides a method for controlling an arm of a robot, including: obtaining obstacle information relating to the arm of the robot by at least one sensor; obtaining current posture information of the arm of the robot by at least one detector and obtaining expected posture information of an end-portion of the arm of the robot; determining an expected trajectory of the end-portion of the arm of the robot; determining an expected speed of the end-portion of the arm of the robot in accordance with the expected trajectory of the end-portion; determining a virtual speed of a target point on the arm of the robot; and configuring a target joint speed corresponding to a joint of the arm of the robot. In this way, the redundant arm of the robot can avoid contacting obstacles in a complex environment while performing corresponding tasks.
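One common way to realize this kind of scheme on a redundant arm is to treat the end-portion velocity as the primary task and the virtual "escape" velocity of the near-obstacle point as a secondary task resolved in the null space of the primary Jacobian. The sketch below shows that construction with random stand-in Jacobians; it is an assumption, not the patent's exact mapping.

```python
# Illustrative sketch only: map a desired end-effector velocity and a virtual
# "escape" velocity at the arm point closest to an obstacle into joint speeds,
# using Jacobian pseudo-inverses with a null-space projection. The Jacobians
# here are random stand-ins; a real robot would supply its own kinematics.
import numpy as np

def joint_speeds(J_end, v_end, J_point, v_virtual):
    """J_end: 3xN Jacobian of the end-effector, v_end: desired 3D velocity,
    J_point: 3xN Jacobian of the target point near the obstacle,
    v_virtual: 3D virtual (repulsive) velocity for that point."""
    J_end_pinv = np.linalg.pinv(J_end)
    # Primary task: follow the expected end-effector trajectory.
    q_dot = J_end_pinv @ v_end
    # Secondary task: push the near-obstacle point away, only in the null space
    # of the primary task so the end-effector motion is not disturbed.
    null_proj = np.eye(J_end.shape[1]) - J_end_pinv @ J_end
    q_dot = q_dot + np.linalg.pinv(J_point @ null_proj) @ (v_virtual - J_point @ q_dot)
    return q_dot

# Toy example with a 7-joint (redundant) arm.
rng = np.random.default_rng(0)
J_end, J_point = rng.standard_normal((3, 7)), rng.standard_normal((3, 7))
q_dot = joint_speeds(J_end, np.array([0.1, 0.0, 0.0]),
                     J_point, np.array([0.0, 0.05, 0.0]))
```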
-
Publication No.: US20230386241A1
Publication Date: 2023-11-30
Application No.: US18088800
Application Date: 2022-12-27
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Kan Wang, Jianxin Pang
CPC classification number: G06V40/10, G06V10/761
Abstract: A person re-identification method, a storage medium, and a terminal device are provided. In the method, the loss function used during model training is a preset distribution-based triplet loss function that constrains the difference between the mean of the negative sample feature distances and the mean of the positive sample feature distances to be larger than a preset difference threshold, where the positive sample feature distance is the distance between a feature of a reference image and a feature of a positive sample image, and the negative sample feature distance is the distance between the feature of the reference image and a feature of a negative sample image. In this manner, the method constrains the means of the positive and negative sample feature distances, thereby improving the accuracy of person re-identification results.
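A minimal sketch of such a distribution-level constraint, assuming a hinge on the gap between the two distance means (the exact patented loss may differ):

```python
# Assumed sketch, not the exact patented loss: constrain the mean of
# negative-pair feature distances to exceed the mean of positive-pair
# distances by at least a margin.
import torch
import torch.nn.functional as F

def distribution_triplet_loss(anchor, positives, negatives, margin=0.3):
    """anchor: (D,) feature of the reference image; positives/negatives:
    (P, D) / (N, D) features of positive / negative sample images."""
    d_pos = F.pairwise_distance(anchor.expand_as(positives), positives)
    d_neg = F.pairwise_distance(anchor.expand_as(negatives), negatives)
    # Penalise batches where mean(d_neg) - mean(d_pos) falls below the margin.
    return F.relu(margin - (d_neg.mean() - d_pos.mean()))

# Toy usage with random features.
anchor = torch.randn(128)
loss = distribution_triplet_loss(anchor, torch.randn(4, 128), torch.randn(8, 128))
```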
-
Publication No.: US11776288B2
Publication Date: 2023-10-03
Application No.: US17389380
Application Date: 2021-07-30
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Yonghui Cai, Jun Cheng, Jianxin Pang, Youjun Xiong
IPC: G06N3/04, G06V30/24, G06V10/94, G06T3/40, G06F18/214
CPC classification number: G06V30/2504, G06T3/4046, G06V10/95, G06F18/2148, G06N3/04, G06V2201/07
Abstract: A target object detection model is provided. The target object detection model includes a YOLOv3-Tiny model. Through the target object detection model, low-level information in the YOLOv3-Tiny model can be merged with high-level information therein, so that the two are fused. Since the low-level information can be further used, the comprehensiveness of target detection is effectively improved, and the detection of small targets is improved.
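The fusion can be pictured as upsampling a deep feature map and concatenating it with a shallower one before the detection head; the sketch below uses YOLOv3-Tiny-like feature sizes, but the layer choices and channel counts are assumptions rather than the patent's design.

```python
# Sketch of the fusion idea under assumptions: a shallow (low-level) feature map
# is combined with an upsampled deep (high-level) feature map by concatenation,
# so that fine spatial detail reaches the detection head for small targets.
import torch
import torch.nn as nn

class FusionHead(nn.Module):
    def __init__(self, low_ch=128, high_ch=256, out_ch=256):
        super().__init__()
        self.reduce = nn.Conv2d(high_ch, high_ch // 2, kernel_size=1)
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        self.fuse = nn.Conv2d(low_ch + high_ch // 2, out_ch, kernel_size=3, padding=1)

    def forward(self, low_feat, high_feat):
        high = self.up(self.reduce(high_feat))   # match spatial size of low_feat
        return self.fuse(torch.cat([low_feat, high], dim=1))

# Toy feature maps: low-level at 26x26, high-level at 13x13 (YOLOv3-Tiny-like sizes).
low = torch.randn(1, 128, 26, 26)
high = torch.randn(1, 256, 13, 13)
fused = FusionHead()(low, high)   # -> (1, 256, 26, 26)
```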
-
Publication No.: US11691284B2
Publication Date: 2023-07-04
Application No.: US17120225
Application Date: 2020-12-13
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Chunyu Chen, Yizhang Liu, Ligang Ge, Zheng Xie, Hongge Wang, Youjun Xiong, Jianxin Pang
IPC: B25J9/16, B25J13/08, B62D57/032
CPC classification number: B25J9/1664, B25J9/1651, B25J9/1694, B25J13/085, B25J13/088, B62D57/032
Abstract: A robot control method includes: obtaining force information associated with a left foot and a right foot of the robot; calculating a zero moment point of a center of mass (COM) of a body of the robot based on the force information; updating a motion trajectory of the robot according to the zero moment point of the COM of the body to obtain an updated position of the COM of the body; performing inverse kinematics analysis on the updated position of the COM of the body to obtain joint angles of a left leg and a right leg of the robot; and controlling the robot to move according to the joint angles.
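The trajectory-update step can be illustrated with a simple proportional correction of the planned COM position by the ZMP tracking error; the gain, sign convention, and the inverse-kinematics stub below are assumptions, not the patent's control law.

```python
# Minimal sketch under assumptions: nudge the planned COM trajectory by a
# fraction of the ZMP tracking error, then hand the corrected COM position to
# inverse kinematics (represented here only by a stub).
import numpy as np

def update_com_position(planned_com, measured_zmp, desired_zmp, gain=0.05):
    """Shift the planned COM (x, y) by a fraction of the ZMP tracking error.
    The proportional law is an illustrative stand-in for the patent's update."""
    error = np.asarray(measured_zmp) - np.asarray(desired_zmp)
    return np.asarray(planned_com) + gain * error

def leg_inverse_kinematics(com_position):
    """Placeholder: a real robot would solve for hip/knee/ankle angles here."""
    raise NotImplementedError

com = update_com_position(planned_com=[0.00, 0.00],
                          measured_zmp=[0.03, -0.01],
                          desired_zmp=[0.00, 0.00])
```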
-
Publication No.: US11618173B2
Publication Date: 2023-04-04
Application No.: US16854856
Application Date: 2020-04-21
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Jian Li, Hongyu Ding, Youpeng Li, Jianxin Pang, Youjun Xiong
Abstract: A robot joint includes a casing, a motor assembly including a stator and a rotor that are arranged within the casing, and a harmonic drive received, at least in part, in the rotor. The harmonic drive includes a circular spline, a wave generator fixed to the rotor, and a flex spline. The circular spline is arranged around and engaged with the flex spline. The wave generator is received in the flex spline and configured to drive the flex spline to rotate with respect to the circular spline. The robot joint further includes an output shaft fixed to the flex spline.
-
Publication No.: US11602843B2
Publication Date: 2023-03-14
Application No.: US16932872
Application Date: 2020-07-20
Applicant: UBTECH ROBOTICS CORP LTD
Inventor: Jie Bai, Ligang Ge, Yizhang Liu, Hongge Wang, Jianxin Pang, Youjun Xiong
Abstract: The present disclosure provides a foot-waist coordinated gait planning method and an apparatus and a robot using the same. The method includes: obtaining an orientation of each foot of the legged robot, and calculating a positional compensation amount of each ankle of the legged robot based on the orientation of the foot; obtaining an orientation of a waist of the legged robot, and calculating a positional compensation amount of each hip of the legged robot based on the orientation of the waist; calculating a hip-ankle positional vector of the legged robot; compensating the hip-ankle positional vector based on the positional compensation amount of the ankle and the positional compensation amount of the hip to obtain the compensated hip-ankle positional vector; and performing an inverse kinematics analysis on the compensated hip-ankle positional vector to obtain joint angles of the legged robot.
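A minimal sketch of the compensation pipeline, with made-up compensation rules and geometry (the abstract does not disclose the actual formulas):

```python
# Illustrative sketch only: compensate the hip-to-ankle position vector using
# small corrections derived from the foot and waist orientations, then pass the
# result to leg inverse kinematics. The compensation rules below are assumed.
import numpy as np

def ankle_compensation(foot_rpy, ankle_height=0.04):
    """Assumed rule: a tilted foot shifts the effective ankle point."""
    roll, pitch, _ = foot_rpy
    return np.array([ankle_height * np.sin(pitch),
                     -ankle_height * np.sin(roll), 0.0])

def hip_compensation(waist_rpy, hip_offset=0.09):
    """Assumed rule: waist roll/pitch shifts the hip joint forward/laterally."""
    roll, pitch, _ = waist_rpy
    return np.array([hip_offset * np.sin(pitch),
                     hip_offset * np.sin(roll), 0.0])

def compensated_hip_ankle_vector(hip_pos, ankle_pos, waist_rpy, foot_rpy):
    vec = np.asarray(ankle_pos) - np.asarray(hip_pos)
    return vec + ankle_compensation(foot_rpy) - hip_compensation(waist_rpy)

vec = compensated_hip_ankle_vector(hip_pos=[0.0, 0.09, 0.6],
                                   ankle_pos=[0.0, 0.09, 0.04],
                                   waist_rpy=[0.02, 0.0, 0.0],
                                   foot_rpy=[0.0, 0.05, 0.0])
```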
-