1. Instruction Fine-Tuning Machine-Learned Models Using Intermediate Reasoning Steps

    Publication Number: US20240256965A1

    Publication Date: 2024-08-01

    Application Number: US18424624

    Filing Date: 2024-01-26

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: An example method for training a machine-learned sequence processing model includes obtaining a plurality of training examples for training the machine-learned sequence processing model. For each respective training example of the plurality of training examples, the example method includes: obtaining a respective query associated with the respective training example; inputting the respective query to the machine-learned sequence processing model; obtaining, from the machine-learned sequence processing model, a response to the respective query and a trace of intermediate states from the respective query to the response; evaluating the response using a ground truth response associated with the respective training example; evaluating the trace using a ground truth trace associated with the respective training example; and updating one or more parameters of the machine-learned sequence processing model based on the evaluation of the response and based on the evaluation of the trace.
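The training scheme the abstract describes, scoring both the final response and the intermediate trace against ground truth and combining the two evaluations into one update, can be sketched as follows. This is an illustrative toy, not the patented implementation: `toy_model`, `token_overlap_loss`, and the `trace_weight` mixing factor are all assumptions introduced here.

```python
def token_overlap_loss(predicted, target):
    """Toy loss: fraction of ground-truth tokens missing from the prediction."""
    predicted_set, target_set = set(predicted.split()), set(target.split())
    if not target_set:
        return 0.0
    return 1.0 - len(predicted_set & target_set) / len(target_set)

def training_step(model, example, trace_weight=0.5):
    """One step: evaluate the response AND the trace, then combine the losses."""
    response, trace = model(example["query"])
    response_loss = token_overlap_loss(response, example["gt_response"])
    # Score each intermediate state against the ground-truth trace.
    trace_loss = sum(
        token_overlap_loss(step, gt_step)
        for step, gt_step in zip(trace, example["gt_trace"])
    ) / max(len(example["gt_trace"]), 1)
    # A real implementation would backpropagate this combined loss
    # to update the model's parameters.
    return response_loss + trace_weight * trace_loss

def toy_model(query):
    # Stand-in "model": returns a canned response and reasoning trace.
    return "answer is 4", ["2 plus 2", "equals 4"]

example = {
    "query": "what is 2 + 2?",
    "gt_response": "answer is 4",
    "gt_trace": ["2 plus 2", "equals 4"],
}
loss = training_step(toy_model, example)
```

Because the toy model reproduces both the ground-truth response and the ground-truth trace exactly, the combined loss here is zero; a trace that reached the right answer through wrong steps would still be penalized, which is the point of supervising the trace separately.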

2. CHARACTER-LEVEL ATTENTION NEURAL NETWORKS (Invention Publication)

    Publication Number: US20240289552A1

    Publication Date: 2024-08-29

    Application Number: US18564859

    Filing Date: 2022-05-27

    Applicant: Google LLC

    CPC classification number: G06F40/284

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing a machine learning task on an input sequence of characters that has a respective character at each of a plurality of character positions to generate a network output. One of the systems includes a neural network configured to perform the machine learning task, the neural network comprising a gradient-based sub-word tokenizer and an output neural network. The gradient-based sub-word tokenizer is configured to apply a learned, i.e., flexible, sub-word tokenization strategy to the input sequence of characters to generate a sequence of latent sub-word representations. The output neural network is configured to process the latent sub-word representations to generate the network output for the task.
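The key idea in the abstract, a tokenizer that is differentiable rather than a fixed lookup, can be sketched as soft attention pooling: character embeddings are aggregated into a fixed number of latent sub-word slots using learned weights, so the "tokenization" is trainable end to end by gradient descent. This minimal NumPy sketch is an assumption-laden illustration of that idea, not the architecture claimed in the patent; the slot-query mechanism and all names are invented here.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_subword_pool(char_ids, embed, slot_queries):
    """Softly pool character embeddings into latent sub-word representations."""
    chars = embed[char_ids]                          # (n_chars, dim)
    scores = slot_queries @ chars.T                  # (n_slots, n_chars)
    scores -= scores.max(axis=1, keepdims=True)      # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)    # each slot attends over chars
    return weights @ chars                           # (n_slots, dim) latent sub-words

vocab, dim, n_slots = 128, 16, 4
embed = rng.normal(size=(vocab, dim))                # character embedding table (learned)
slot_queries = rng.normal(size=(n_slots, dim))       # latent sub-word slot queries (learned)

char_ids = np.frombuffer(b"tokenize", dtype=np.uint8)  # bytes as character ids
latents = soft_subword_pool(char_ids, embed, slot_queries)
```

Because every operation here (embedding lookup, matrix products, softmax) is differentiable, gradients from a downstream task loss can flow back into `embed` and `slot_queries`, which is what makes the sub-word segmentation "flexible" rather than fixed in advance.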
