1.
Publication number: US20230162005A1
Publication date: 2023-05-25
Application number: US18157277
Filing date: 2023-01-20
Applicant: HUAWEI TECHNOLOGIES CO., LTD.
Inventor: Pengxiang CHENG , Zhenhua DONG , Xiuqiang HE , Xiaolian ZHANG , Shi YIN , Yuelin HU
IPC: G06N3/045
CPC classification number: G06N3/045
Abstract: This application provides a neural network distillation method and apparatus in the field of artificial intelligence. The method includes: obtaining a sample set, where the sample set includes a biased data set and an unbiased data set, the biased data set includes biased samples, and the unbiased data set includes unbiased samples; determining a first distillation manner based on data features of the sample set, where, in the first distillation manner, a teacher model is trained by using the unbiased data set and a student model is trained by using the biased data set; and training a first neural network based on the biased data set and the unbiased data set in the first distillation manner, to obtain an updated first neural network.
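The "first distillation manner" in the abstract can be illustrated with a toy sketch: a teacher model is trained on the unbiased data set, and a student model is then trained on the biased data set using the teacher's soft predictions as part of its target. The logistic-regression models, the mixing weight `alpha`, and the synthetic data below are illustrative assumptions, not the patented method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, soft_targets=None, alpha=0.5, lr=0.1, steps=500):
    """Train logistic regression by gradient descent.

    If soft_targets is given, the regression target is a mixture of the
    hard labels y and the teacher's soft predictions (the distillation term).
    """
    w = np.zeros(X.shape[1])
    target = y if soft_targets is None else (1 - alpha) * y + alpha * soft_targets
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - target) / len(y)
    return w

# Synthetic data: the unbiased set is small but representative; the biased
# set is large but systematically shifted (a common recommendation bias).
true_w = np.array([1.0, -2.0, 0.5])
X_unbiased = rng.normal(size=(200, 3))
y_unbiased = (sigmoid(X_unbiased @ true_w) > 0.5).astype(float)
X_biased = rng.normal(loc=0.3, size=(2000, 3))
y_biased = (sigmoid(X_biased @ true_w + 0.8) > 0.5).astype(float)

# Step 1: teacher model trained on the unbiased data set.
w_teacher = train_logreg(X_unbiased, y_unbiased)

# Step 2: student model trained on the biased data set, guided by the
# teacher's soft predictions on those same biased samples.
teacher_soft = sigmoid(X_biased @ w_teacher)
w_student = train_logreg(X_biased, y_biased, soft_targets=teacher_soft)
```

The student thus sees the abundant biased samples while the teacher, trained only on unbiased data, steers it away from the bias.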
2.
Publication number: US20230153857A1
Publication date: 2023-05-18
Application number: US18156512
Filing date: 2023-01-19
Applicant: HUAWEI TECHNOLOGIES CO., LTD.
Inventor: Jingjie LI , Hong ZHU , Zhenhua DONG , Xiaolian ZHANG , Shi YIN , Xinhua FENG , Xiuqiang HE
IPC: G06Q30/0251 , G06Q30/0202
CPC classification number: G06Q30/0251 , G06Q30/0202
Abstract: A training method includes: obtaining a first recommendation model, where a model parameter of the first recommendation model is obtained through training based on n first training samples; determining an impact function value of each first training sample with respect to a verification loss of m second training samples in the first recommendation model; determining, based on the impact function value of each first training sample with respect to the verification loss, a weight corresponding to each first training sample; and training the first recommendation model based on the n first training samples and the weights corresponding to the n first training samples, to obtain a target recommendation model.
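The pipeline in the abstract can be sketched end to end: train a first model on the n training samples, compute each sample's influence-function value with respect to the validation loss on the m second samples, map influences to per-sample weights, and retrain with those weights. The logistic-regression model, the sigmoid weighting rule, and the damping term are illustrative assumptions, not the claimed method.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, weights=None, lr=0.1, steps=500, l2=1e-2):
    """L2-regularized logistic regression with optional per-sample weights."""
    w = np.zeros(X.shape[1])
    if weights is None:
        weights = np.ones(len(y))
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * (X.T @ (weights * (p - y)) / len(y) + l2 * w)
    return w

true_w = np.array([1.5, -1.0, 0.5, 0.0])
X_train = rng.normal(size=(300, 4))                       # n first training samples
y_train = (sigmoid(X_train @ true_w) > rng.random(300)).astype(float)
X_val = rng.normal(size=(100, 4))                         # m second training samples
y_val = (sigmoid(X_val @ true_w) > rng.random(100)).astype(float)

# Step 1: first model, trained on the n training samples.
w0 = train(X_train, y_train)

# Step 2: influence of each training sample on the validation loss,
# I(z_i) = -g_val^T H^{-1} g_i, using the damped empirical Hessian.
p_tr = sigmoid(X_train @ w0)
H = (X_train * (p_tr * (1 - p_tr))[:, None]).T @ X_train / len(y_train) \
    + 1e-2 * np.eye(4)
g_val = X_val.T @ (sigmoid(X_val @ w0) - y_val) / len(y_val)
g_tr = X_train * (p_tr - y_train)[:, None]                # per-sample gradients
influence = -g_tr @ np.linalg.solve(H, g_val)

# Step 3: a negative influence value means up-weighting the sample lowers the
# validation loss, so such samples receive larger weights (sigmoid mapping is
# one illustrative choice).
weights = sigmoid(-influence)

# Step 4: retrain with the per-sample weights to obtain the target model.
w_target = train(X_train, y_train, weights=weights)
```

Solving the linear system `H x = g_val` once, rather than inverting `H` and multiplying per sample, keeps the influence computation to a single factorization.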