Prompt Tuning Using One or More Machine-Learned Models

    Publication No.: US20240378196A1

    Publication Date: 2024-11-14

    Application No.: US18684518

    Application Date: 2021-08-20

    Applicant: Google LLC

    Abstract: Systems and methods for prompt tuning can leverage semantic searching to determine similar prompts to use for retraining. A prompt can be generated and then used as a search query to find the similar prompts. Data related to the similar prompts can then be utilized for prompt tuning. Moreover, systems and methods for prompt tuning can generate and utilize a meta-prompt to reduce the computational cost of generating prompts. The prompt tuning techniques can be implemented as part of a prompt tuning application programming interface (API).
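The semantic-search step described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: `embed` is a hypothetical placeholder (a deterministic pseudo-random unit vector keyed on the text) standing in for a real text encoder, and the prompt store contents are invented for illustration.

```python
import zlib
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Placeholder encoder: deterministic pseudo-random unit vector per text.
    # A real system would use a learned text-embedding model here.
    rng = np.random.default_rng(zlib.crc32(text.encode("utf-8")))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def find_similar_prompts(query: str, prompt_store: dict, k: int = 2):
    # Rank stored prompts by cosine similarity to the query prompt
    # (vectors are unit-normalized, so a dot product suffices).
    q = embed(query)
    scored = sorted(
        prompt_store.items(),
        key=lambda item: float(q @ embed(item[0])),
        reverse=True,
    )
    return scored[:k]

# Stored prompts mapped to the tuning data associated with each of them;
# data from the retrieved neighbors would then feed prompt retraining.
store = {
    "summarize this article": ["doc1", "doc2"],
    "translate to French": ["pair1"],
    "write a short summary": ["doc3"],
}
top = find_similar_prompts("summarize the report", store, k=2)
```

With a real encoder, semantically related prompts (and their associated tuning data) would rank highest; the toy `embed` here only demonstrates the retrieval mechanics.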

    Parameter Efficient Prompt Tuning for Efficient Models at Scale

    Publication No.: US20230325725A1

    Publication Date: 2023-10-12

    Application No.: US17718738

    Application Date: 2022-04-12

    Applicant: Google LLC

    CPC classification number: G06N20/20 G06V10/764 G06V10/7747

    Abstract: Systems and methods for natural language processing can leverage trained prompts to condition a large pre-trained machine-learned model to generate an output for a specific task. For example, a subset of parameters may be trained for the particular task to then be input with a set of input data into the pre-trained machine-learned model to generate the task-specific output. During the training of the prompt, the parameters of the pre-trained machine-learned model can be frozen, which can reduce the computational resources used during training while still leveraging the previously learned data from the pre-trained machine-learned model.
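The frozen-model prompt tuning described above can be illustrated with a toy NumPy sketch. The "pre-trained model" here is just a fixed linear projection over mean-pooled token vectors (an assumption for illustration, not the patented system); the only trained parameters are the soft-prompt rows prepended to the input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a large pre-trained model: a frozen linear projection.
W = rng.normal(size=(4, 1))  # frozen parameters, never updated below

def model(tokens):
    # Frozen forward pass: mean-pool the (prompt + input) token vectors,
    # then project with the frozen weights.
    return tokens.mean(axis=0) @ W

# Trainable soft prompt: two prompt "tokens" of dimension 4 -- the only
# parameters updated during prompt tuning.
prompt = np.zeros((2, 4))

x = rng.normal(size=(3, 4))   # fixed embedded input "tokens"
target = np.array([1.0])      # toy regression target for the task

lr = 1.0
for _ in range(500):
    inp = np.vstack([prompt, x])   # prepend the prompt to the input
    err = model(inp) - target      # gradient of 0.5 * squared error
    # Backprop through the mean-pooling: gradient w.r.t. each prompt row.
    grad_row = (err @ W.T) / inp.shape[0]
    prompt -= lr * grad_row        # W stays frozen; only the prompt moves
```

Because only the small prompt tensor is updated, training touches a tiny fraction of the parameters a full fine-tune would, which mirrors the computational saving the abstract describes.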
