Attention-based decoder-only sequence transduction neural networks

    Publication No.: US11556786B2

    Publication Date: 2023-01-17

    Application No.: US16759690

    Filing Date: 2018-10-29

    Applicant: GOOGLE LLC

    IPC Classification: G06N3/08 G06N3/04

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.
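
The generation loop the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the patented implementation: `score_fn` is a hypothetical stand-in for the self-attention decoder network, and greedy argmax is one simple way to select from the score distribution (sampling is another).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a vector of scores."""
    e = np.exp(x - x.max())
    return e / e.sum()

def generate(input_tokens, score_fn, eos=0, max_steps=10):
    """Greedy decoding: at each generation time step, form the combined
    sequence (the input followed by the outputs generated so far), score it,
    and select the highest-scoring token as the next output token."""
    output = []
    for _ in range(max_steps):
        combined = input_tokens + output        # combined sequence for this step
        scores = score_fn(combined)             # decoder output: scores over vocab
        token = int(np.argmax(softmax(scores)))
        if token == eos:                        # stop on the end-of-sequence token
            break
        output.append(token)
    return output
```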

    Attention-based sequence transduction neural networks

    Publication No.: US11113602B2

    Publication Date: 2021-09-07

    Application No.: US16932422

    Filing Date: 2020-07-17

    Applicant: Google LLC

    IPC Classification: G06N3/04 G06N3/08

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
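
The attention mechanism in the claim, applying queries derived from each position over all positions' inputs, can be sketched as single-head scaled dot-product self-attention. This is a minimal sketch, not the full multi-head sub-layer; the weight matrices here are illustrative.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over subnetwork inputs X
    (shape [positions, d]): each position's query attends over all
    positions' keys, producing a weighted sum of the values."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # query-key similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # softmax per query position
    return w @ V                                      # attended outputs
```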

    Depthwise separable convolutions for neural machine translation

    Publication No.: US10853590B2

    Publication Date: 2020-12-01

    Application No.: US16688958

    Filing Date: 2019-11-19

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for performing machine translation tasks. One method includes receiving an input text segment in an input language; processing the input text segment using an encoder neural network to generate an encoder neural network output, the encoder neural network comprising multiple depthwise separable convolutional neural network layers; processing the encoder neural network output using an autoregressive decoder neural network to generate a decoder neural network output; and processing the decoder neural network output to generate a predicted output text segment in a target natural language.
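
A depthwise separable convolution factors a standard convolution into a per-channel (depthwise) step and a 1x1 (pointwise) channel-mixing step, reducing parameters and compute. A minimal 1-D NumPy sketch, with shapes and names that are illustrative rather than taken from the patent:

```python
import numpy as np

def depthwise_separable_conv1d(x, depth_filters, point_weights):
    """Depthwise separable 1-D convolution.
    x: [length, channels]; depth_filters: [k, channels];
    point_weights: [channels, out_channels]."""
    k, c = depth_filters.shape
    L = x.shape[0] - k + 1                    # valid (no-padding) output length
    depth = np.empty((L, c))
    for i in range(L):
        # depthwise: each channel convolved with its own filter,
        # with no cross-channel mixing at this stage
        depth[i] = (x[i:i + k] * depth_filters).sum(axis=0)
    return depth @ point_weights              # pointwise 1x1 conv mixes channels
```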

    Generating parse trees of text segments using neural networks

    Publication No.: US10409908B2

    Publication Date: 2019-09-10

    Application No.: US14976121

    Filing Date: 2015-12-21

    Applicant: Google LLC

    IPC Classification: G06F17/27 G06N3/04

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating parse trees for input text segments. One of the methods includes obtaining an input text segment, processing the input text segment using a first long short term memory (LSTM) neural network to convert the input text segment into an alternative representation for the input text segment, and processing the alternative representation for the input text segment using a second LSTM neural network to generate a linearized representation of a parse tree for the input text segment.
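
The "linearized representation of a parse tree" is a flat token sequence the second LSTM can emit left to right. One common bracketed linearization looks like the following; the exact token format is illustrative, not fixed by the patent.

```python
def linearize(tree):
    """Depth-first linearization of a parse tree (label, children) into a
    flat bracketed token sequence that a sequence decoder can generate."""
    label, children = tree
    if not children:
        return [label]                  # leaf: emit the terminal itself
    tokens = ['(' + label]              # open the constituent
    for child in children:
        tokens += linearize(child)
    tokens.append(')')                  # close the constituent
    return tokens
```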

    ATTENTION NEURAL NETWORKS WITH LOCALITY-SENSITIVE HASHING

    Publication No.: US20210350244A1

    Publication Date: 2021-11-11

    Application No.: US17164691

    Filing Date: 2021-02-01

    Applicant: Google LLC

    IPC Classification: G06N3/08 G06N3/04

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes an attention neural network configured to perform the machine learning task, the attention neural network including one or more LSH attention layers, each LSH attention layer comprising one or more LSH attention sub-layers, each LSH sub-layer configured to: receive a sequence of queries derived from an input sequence to the LSH attention layer, the sequence of queries having a respective query at each of a plurality of input positions; determine one or more respective hash values for each of the respective queries at each of the plurality of input positions; generate a plurality of LSH groupings; and generate an attended input sequence.
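
Locality-sensitive hashing assigns similar queries to the same bucket so that attention can be restricted to positions within a bucket. A random-projection sketch of the hashing and grouping steps described in the claim; the projection scheme and sizes are illustrative assumptions.

```python
import numpy as np

def lsh_hash(queries, projections):
    """Random-projection LSH: each query's hash is the index of the
    projection direction it aligns with most, so nearby queries tend
    to receive the same hash value."""
    # append negated projections so the buckets cover the full sphere
    rotated = queries @ np.concatenate([projections, -projections], axis=1)
    return rotated.argmax(axis=-1)

def lsh_groupings(queries, projections):
    """Group input positions whose queries share a hash value; attention
    is then computed only within each grouping."""
    groups = {}
    for pos, h in enumerate(lsh_hash(queries, projections)):
        groups.setdefault(int(h), []).append(pos)
    return groups
```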

    ATTENTION-BASED IMAGE GENERATION NEURAL NETWORKS

    Publication No.: US20210064924A1

    Publication Date: 2021-03-04

    Application No.: US17098271

    Filing Date: 2020-11-13

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output image. In one aspect, one of the methods includes generating the output image intensity value by intensity value according to a generation order of pixel-color channel pairs from the output image, comprising, for each particular generation order position in the generation order: generating a current output image representation of a current output image, processing the current output image representation using a decoder neural network to generate a probability distribution over possible intensity values for the pixel-color channel pair at the particular generation order position, wherein the decoder neural network includes one or more local masked self-attention sub-layers; and selecting an intensity value for the pixel-color channel pair at the particular generation order position using the probability distribution.
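
The intensity-by-intensity generation loop can be sketched as follows, with a hypothetical `predict` function standing in for the decoder network with local masked self-attention sub-layers, and a raster scan as an assumed generation order over pixel-color channel pairs:

```python
import numpy as np

def generate_image(height, width, channels, predict, rng):
    """Generate an image one intensity value at a time. `predict` takes
    the partially generated image and the current (row, col, channel)
    position and returns a probability distribution over intensity values;
    a value is then sampled from that distribution."""
    img = np.zeros((height, width, channels), dtype=int)
    for r in range(height):
        for c in range(width):
            for ch in range(channels):        # pixel-color channel pairs in order
                probs = predict(img, (r, c, ch))
                img[r, c, ch] = rng.choice(len(probs), p=probs)
    return img
```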

    ATTENTION-BASED DECODER-ONLY SEQUENCE TRANSDUCTION NEURAL NETWORKS

    Publication No.: US20200342316A1

    Publication Date: 2020-10-29

    Application No.: US16759690

    Filing Date: 2018-10-29

    Applicant: GOOGLE LLC

    IPC Classification: G06N3/08 G06N3/04

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. One of the methods includes, at each of a plurality of generation time steps: generating a combined sequence for the generation time step that includes the input sequence followed by the output tokens that have already been generated as of the generation time step; processing the combined sequence using a self-attention decoder neural network to generate a time step output that defines a score distribution over a set of possible output tokens; and selecting, using the time step output, an output token from the set of possible output tokens as the next output token in the output sequence.

    ATTENTION-BASED IMAGE GENERATION NEURAL NETWORKS

    Publication No.: US20190130213A1

    Publication Date: 2019-05-02

    Application No.: US16174074

    Filing Date: 2018-10-29

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output image. In one aspect, one of the methods includes generating the output image intensity value by intensity value according to a generation order of pixel-color channel pairs from the output image, comprising, for each particular generation order position in the generation order: generating a current output image representation of a current output image, processing the current output image representation using a decoder neural network to generate a probability distribution over possible intensity values for the pixel-color channel pair at the particular generation order position, wherein the decoder neural network includes one or more local masked self-attention sub-layers; and selecting an intensity value for the pixel-color channel pair at the particular generation order position using the probability distribution.

    Generating parse trees of text segments using neural networks

    Publication No.: US10268671B2

    Publication Date: 2019-04-23

    Application No.: US15396091

    Filing Date: 2016-12-30

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating parse trees for input text segments. One of the methods includes obtaining an input text segment comprising a plurality of inputs arranged according to an input order; processing the inputs in the input text segment using an encoder long short term memory (LSTM) neural network to generate a respective encoder hidden state for each input in the input text segment; and processing the respective encoder hidden states for the inputs in the input text segment using an attention-based decoder LSTM neural network to generate a linearized representation of a parse tree for the input text segment.
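
The decoder's attention step, which scores each encoder hidden state against the current decoder state and forms a weighted context vector, can be sketched as follows. Dot-product scoring is an illustrative choice; the patent's exact scoring function may differ.

```python
import numpy as np

def attend(decoder_state, encoder_states):
    """Content-based attention over encoder hidden states: score each
    encoder state against the decoder state, softmax the scores, and
    return the attention-weighted context vector."""
    scores = encoder_states @ decoder_state      # one score per input position
    w = np.exp(scores - scores.max())
    w /= w.sum()                                 # attention weights sum to 1
    return w @ encoder_states                    # context vector
```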