Invention Grant
- Patent Title: Meta-knowledge fine tuning method and platform for multi-task language model
- Application No.: US17531813
- Application Date: 2021-11-22
- Publication No.: US11354499B2
- Publication Date: 2022-06-07
- Inventor: Hongsheng Wang, Haijun Shan, Shengjian Hu
- Applicant: ZHEJIANG LAB
- Applicant Address: CN Hangzhou
- Assignee: ZHEJIANG LAB
- Current Assignee: ZHEJIANG LAB
- Current Assignee Address: CN Hangzhou
- Agency: W&G Law Group
- Priority: CN202011202867.7 20201102
- Main IPC: G06F40/20
- IPC: G06F40/20 ; G06K9/62 ; G06N20/00 ; G06N5/04 ; G06N5/02

Abstract:
Disclosed are a meta-knowledge fine-tuning method and platform for a multi-task language model. The method learns highly transferable shared knowledge, termed meta-knowledge, across different datasets of tasks in the same category, so that the learning processes for same-category tasks drawn from different datasets and different domains interrelate and mutually reinforce one another. This improves the fine-tuning performance of same-category downstream tasks on datasets from different domains when the language model is applied, and improves the parameter-initialization and generalization ability of a general language model for tasks of that category.
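The core idea in the abstract can be illustrated with a minimal toy sketch: a single shared ("meta") parameter set is updated with gradients pooled across several datasets of the same task category but from different domains, so that learning in one domain reinforces the others. All names and the toy linear-regression task below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of meta-knowledge fine-tuning: shared parameters
# (w, b) are updated with gradients averaged over multiple same-category
# datasets ("domains"), rather than fine-tuned per domain in isolation.
import random

random.seed(0)

def make_domain(bias, n=50):
    # Same underlying task (y = 2x + domain-specific bias) in each domain.
    xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
    return [(x, 2.0 * x + bias) for x in xs]

domains = [make_domain(b) for b in (0.0, 0.5, -0.5)]

w, b = 0.0, 0.0   # shared meta-parameters
lr = 0.1

def mse(data, w, b):
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

initial_loss = sum(mse(d, w, b) for d in domains) / len(domains)

for _ in range(200):
    # Pool per-domain gradients: the cross-domain "meta-knowledge" update.
    gw = gb = 0.0
    for data in domains:
        for x, y in data:
            err = w * x + b - y
            gw += 2 * err * x / len(data)
            gb += 2 * err / len(data)
    w -= lr * gw / len(domains)
    b -= lr * gb / len(domains)

final_loss = sum(mse(d, w, b) for d in domains) / len(domains)
```

In this sketch the shared slope converges toward the slope common to all domains, while the irreducible residual reflects domain-specific shifts; the resulting parameters are a better initialization for any single domain than a model fine-tuned on one domain alone.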
Public/Granted literature
- US20220138414A1 META-KNOWLEDGE FINE TUNING METHOD AND PLATFORM FOR MULTI-TASK LANGUAGE MODEL, Public/Granted day: 2022-05-05