Publication Number: US20230289600A1
Publication Date: 2023-09-14
Application Number: US18318616
Application Date: 2023-05-16
Applicant: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Wen Yan, Yijun Yu, Dongrun Qin, Yang Xin
IPC: G06N3/08
CPC classification number: G06N3/08
Abstract: A model distillation training method establishes a distillation training communication connection between a first device and a second device before distillation training of a neural network model is performed. Based on an exchange of distillation training information between the two devices, the second device configures a first reference neural network model by using first configuration information sent by the first device. After configuring the first reference neural network model, the second device performs operation processing on first sample data in first data information by using the configured first reference neural network model to obtain first indication information, and sends the first indication information to the first device. The first device then trains, by using the first indication information, a first neural network model that the first device has designed.
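The workflow in the abstract resembles standard knowledge distillation: a configured reference (teacher) model produces outputs on sample data, and those outputs guide the training of a separate (student) model. The following is a minimal, self-contained sketch of that idea only, not the patented method itself. All names (`W_teacher`, `W_student`, the linear-softmax models, the temperature value) are illustrative assumptions; the "first indication information" is modeled here as the teacher's softened output probabilities.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T gives softer probabilities."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))             # stand-in for "first sample data"

# Second device: configured reference (teacher) model produces
# softened outputs ("first indication information") on the sample data.
W_teacher = rng.normal(size=(8, 4))
teacher_probs = softmax(X @ W_teacher, T=2.0)

# First device: train its own (student) model to match the teacher's
# soft outputs via gradient descent on the cross-entropy loss.
W_student = np.zeros((8, 4))
err0 = np.abs(softmax(X @ W_student, T=2.0) - teacher_probs).mean()
lr = 0.1
for _ in range(1000):
    student_probs = softmax(X @ W_student, T=2.0)
    grad = X.T @ (student_probs - teacher_probs) / len(X)
    W_student -= lr * grad

# The student's soft predictions move toward the teacher's.
err = np.abs(softmax(X @ W_student, T=2.0) - teacher_probs).mean()
```

In this toy setting the student can match the teacher exactly (same architecture), so the mean absolute gap `err` shrinks well below its initial value `err0`; in the patented scheme, only the indication information needs to cross the device boundary, not the reference model itself.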