-
Publication No.: US20230222318A1
Publication Date: 2023-07-13
Application No.: US18009841
Filing Date: 2021-06-30
Applicant: Google LLC
Inventor: Dmitry Lepikhin , Yanping Huang , Orhan Firat , Maxim Krikun , Dehao Chen , Noam M. Shazeer , HyoukJoong Lee , Yuanzhong Xu , Zhifeng Chen
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes an attention neural network configured to perform the machine learning task, the attention neural network including one or more attention layers, each attention layer comprising an attention sub-layer and a feed-forward sub-layer. Some or all of the attention layers have a feed-forward sub-layer that applies conditional computation to the inputs to the sub-layer.
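The conditional-computation feed-forward sub-layer described above can be sketched as follows. This is a minimal illustrative stand-in, not the patented implementation: it assumes a learned linear gate that routes each token to exactly one of several expert feed-forward networks, with hypothetical shapes and a ReLU activation.

```python
import numpy as np

def conditional_ffn(x, gate_w, expert_ws):
    """Apply conditional computation: each token is processed only by
    the expert feed-forward network its gate score selects.

    x:         (tokens, d_model) sub-layer inputs
    gate_w:    (d_model, n_experts) learned gating weights
    expert_ws: list of (d_model, d_model) expert weight matrices
    """
    logits = x @ gate_w                 # (tokens, n_experts) gate scores
    choice = logits.argmax(axis=-1)     # one expert per token
    out = np.empty_like(x)
    for e, w in enumerate(expert_ws):
        mask = choice == e              # tokens routed to expert e
        out[mask] = np.maximum(x[mask] @ w, 0.0)  # expert FFN with ReLU
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
gate_w = rng.normal(size=(8, 2))
experts = [rng.normal(size=(8, 8)) for _ in range(2)]
y = conditional_ffn(x, gate_w, experts)
```

Because each token touches only one expert's weights, compute per token stays constant even as the number of experts (and hence total parameters) grows.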
-
Publication No.: US20230124572A1
Publication Date: 2023-04-20
Application No.: US17791409
Filing Date: 2020-01-08
Applicant: GOOGLE LLC
Inventor: Puneet Jain , Orhan Firat , Sihang Liang
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, that translate text depicted in images from a source language into a target language. Methods can include obtaining a first image that depicts first text written in a source language. The first image is input into an image translation model, which includes a feature extractor and a decoder. The feature extractor accepts the first image as input and, in response, generates a first set of image features that describe the portion of the first image in which the first text is depicted. The first set of image features is input into the decoder. In response, the decoder outputs a second text that is a predicted translation of the source-language text represented by the first set of image features.
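The feature-extractor/decoder pipeline above can be sketched end to end. This is a toy stand-in, assuming a patch-based extractor and a greedy decoder over a hypothetical target-language embedding table; the real model's architectures are not specified here.

```python
import numpy as np

def extract_features(image, proj):
    # Feature extractor: split the image into non-overlapping 4x4 patches
    # and project each patch into a feature vector describing that region.
    h, w = image.shape
    patches = image.reshape(h // 4, 4, w // 4, 4).transpose(0, 2, 1, 3)
    patches = patches.reshape(-1, 16)      # (n_patches, 16)
    return patches @ proj                  # (n_patches, d)

def decode(features, embed, max_len=5):
    # Decoder: at each step, summarize the image features (mean here as a
    # stand-in for attention) and emit the closest target-language token.
    context = features.mean(axis=0)
    out = []
    for _ in range(max_len):
        scores = embed @ context           # similarity to each token
        tok = int(scores.argmax())
        out.append(tok)
        context = context + embed[tok]     # feed the prediction back
    return out

rng = np.random.default_rng(1)
image = rng.normal(size=(8, 8))            # grayscale image with text
proj = rng.normal(size=(16, 6))
embed = rng.normal(size=(10, 6))           # 10 target-language tokens
features = extract_features(image, proj)
tokens = decode(features, embed)
```

Note that the decoder never sees source-language tokens: translation is predicted directly from image features, skipping a separate OCR step.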
-
Publication No.: US20250131215A1
Publication Date: 2025-04-24
Application No.: US19000935
Filing Date: 2024-12-24
Applicant: Google LLC
Inventor: Puneet Jain , Orhan Firat , Sihang Liang
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, that translate text depicted in images from a source language into a target language. Methods can include obtaining a first image that depicts first text written in a source language. The first image is input into an image translation model, which includes a feature extractor and a decoder. The feature extractor accepts the first image as input and, in response, generates a first set of image features that describe the portion of the first image in which the first text is depicted. The first set of image features is input into the decoder. In response, the decoder outputs a second text that is a predicted translation of the source-language text represented by the first set of image features.
-
Publication No.: US20230274100A1
Publication Date: 2023-08-31
Application No.: US17682282
Filing Date: 2022-02-28
Applicant: Google LLC
Inventor: Xavier Eduardo Garcia , Orhan Firat , Noah Constant , Xiaoyue Guo
IPC: G06F40/58 , G06F40/197 , G06F40/166 , G06F40/253 , G06N3/08
CPC classification number: G06F40/58 , G06F40/197 , G06F40/166 , G06F40/253 , G06N3/08
Abstract: The technology provides a model-based approach for multilingual text rewriting that is applicable across many languages and across different styles including formality levels or other textual attributes. The model is configured to manipulate both language and textual attributes jointly. This approach supports zero-shot formality-sensitive translation, with no labeled data in the target language. An encoder-decoder architectural approach with attribute extraction is used to train rewriter models that can thus be used in “universal” textual rewriting across many different languages. A cross-lingual learning signal can be incorporated into the training approach. Certain training processes do not employ any exemplars. This approach enables not just straight translation, but also the ability to create new sentences with different attributes.
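One common way to let a single model manipulate language and textual attributes jointly is to prepend control tokens to the input; the sketch below illustrates that interface. The tag format is hypothetical, chosen only to make the idea concrete; the patent does not specify this encoding.

```python
def build_rewriter_input(text, target_lang, formality):
    # Control tokens steer both the output language and a textual
    # attribute (here, formality) at the same time, which is what
    # enables zero-shot formality-sensitive translation.
    return f"<2{target_lang}> <{formality}> {text}"

prompt = build_rewriter_input("Where is the library?", "fr", "formal")
```

With this interface, requesting a new language and a new attribute together requires no labeled (language, attribute) pairs at training time, only the cross-lingual learning signal mentioned above.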
-
Publication No.: US20230196105A1
Publication Date: 2023-06-22
Application No.: US18082934
Filing Date: 2022-12-16
Applicant: Google LLC
Inventor: Zirui Wang , Wei Yu , Orhan Firat , Yuan Cao
IPC: G06N3/08
CPC classification number: G06N3/08
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating labeled training data using a pre-trained language model neural network. In particular, the language model neural network can generate the text input in a new labeled training example from an input sequence that includes (i) one or more context inputs and (ii) a text label that identifies the ground truth category for the new labeled training example.
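The input sequence described above, (i) context inputs plus (ii) a text label, can be assembled as a prompt for the language model to continue. The exact prompt format is an assumption for illustration; the field names below are hypothetical.

```python
def make_generation_prompt(contexts, label):
    # The pre-trained language model continues this sequence with a new
    # text input whose ground-truth category is the given label, yielding
    # a new labeled training example (generated_text, label).
    parts = [f"Context: {c}" for c in contexts]
    parts.append(f"Label: {label}")
    parts.append("Text:")
    return "\n".join(parts)

prompt = make_generation_prompt(
    ["Great acting and a moving story.", "A total waste of two hours."],
    "positive",
)
```

Because the label is part of the input rather than the target, the same frozen model can synthesize examples for every category without fine-tuning.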
-
Publication No.: US20200342182A1
Publication Date: 2020-10-29
Application No.: US16610233
Filing Date: 2019-08-26
Applicant: GOOGLE LLC
Inventor: Melvin Jose Johnson Premkumar , Akiko Eriguchi , Orhan Firat
IPC: G06F40/58
Abstract: Training and/or using a multilingual classification neural network model to perform a natural language processing classification task, where the model reuses an encoder portion of a multilingual neural machine translation model. In a variety of implementations, a client device can generate a natural language data stream from a spoken input from a user. The natural language data stream can be applied as input to an encoder portion of the multilingual classification model. The output generated by the encoder portion can be applied as input to a classifier portion of the multilingual classification model. The classifier portion can generate a predicted classification label of the natural language data stream. In many implementations, an output can be generated based on the predicted classification label, and a client device can present the output.
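The encoder-reuse idea above can be sketched as two stages: a stand-in for the reused NMT encoder, and a classifier head on its output. Everything below is a minimal illustration under assumed shapes, not the patented architecture; the real encoder is a trained multilingual NMT encoder, not a mean-pooled embedding.

```python
import numpy as np

def encode(token_ids, embed):
    # Stand-in for the reused multilingual NMT encoder portion:
    # embed the natural language data stream and mean-pool it.
    return embed[token_ids].mean(axis=0)

def classify(token_ids, embed, clf_w):
    # Classifier portion: a softmax head over the encoder output
    # produces the predicted classification label.
    h = encode(token_ids, embed)
    logits = h @ clf_w
    exp = np.exp(logits - logits.max())
    probs = exp / exp.sum()
    return int(probs.argmax()), probs

rng = np.random.default_rng(2)
embed = rng.normal(size=(20, 6))     # shared multilingual vocabulary
clf_w = rng.normal(size=(6, 3))      # 3 classification labels
pred, probs = classify([1, 4, 7], embed, clf_w)
```

Because the encoder was trained on many languages for translation, the same classifier head can serve inputs in languages it never saw labeled classification data for.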
-
Publication No.: US20250148224A1
Publication Date: 2025-05-08
Application No.: US19015153
Filing Date: 2025-01-09
Applicant: Google LLC
Inventor: Xavier Eduardo Garcia , Orhan Firat , Noah Constant , Xiaoyue Guo , Parker Riley
IPC: G06F40/58 , G06F40/166 , G06F40/197 , G06F40/253 , G06F40/56 , G06N3/045 , G06N3/047 , G06N3/08 , G06N3/084
Abstract: The technology provides a model-based approach for multilingual text rewriting that is applicable across many languages and across different styles including formality levels or other textual attributes. The model is configured to manipulate both language and textual attributes jointly. This approach supports zero-shot formality-sensitive translation, with no labeled data in the target language. An encoder-decoder architectural approach with attribute extraction is used to train rewriter models that can thus be used in “universal” textual rewriting across many different languages. A cross-lingual learning signal can be incorporated into the training approach. Certain training processes do not employ any exemplars. This approach enables not just straight translation, but also the ability to create new sentences with different attributes.
-
Publication No.: US12242948B2
Publication Date: 2025-03-04
Application No.: US17159437
Filing Date: 2021-01-27
Applicant: Google LLC
Inventor: Yanping Huang , Dmitry Lepikhin , Maxim Krikun , Orhan Firat , Ankur Bapna , Thang Luong , Sneha Kudugunta
Abstract: Systems and methods for routing in mixture-of-expert models. In some aspects of the technology, a transformer may have at least one Mixture-of-Experts (“MoE”) layer in each of its encoder and decoder, with the at least one MoE layer of the encoder having a learned gating function configured to route each token of a task to two or more selected expert feed-forward networks, and the at least one MoE layer of the decoder having a learned gating function configured to route each task to two or more selected expert feed-forward networks.
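The learned gating function that routes each token to two selected experts can be sketched as a top-2 gate. This is an illustrative sketch under assumed shapes, without the load-balancing and capacity machinery a production MoE layer would need.

```python
import numpy as np

def top2_gate(x, gate_w):
    """Learned gating: route each token to its two highest-scoring
    experts, with softmax weights for combining their outputs.

    x:      (tokens, d_model) token representations
    gate_w: (d_model, n_experts) learned gating weights
    """
    logits = x @ gate_w                            # (tokens, n_experts)
    top2 = np.argsort(logits, axis=-1)[:, -2:]     # two experts per token
    picked = np.take_along_axis(logits, top2, axis=-1)
    exp = np.exp(picked - picked.max(axis=-1, keepdims=True))
    weights = exp / exp.sum(axis=-1, keepdims=True)
    return top2, weights

rng = np.random.default_rng(3)
x = rng.normal(size=(4, 8))
gate_w = rng.normal(size=(8, 4))
top2, weights = top2_gate(x, gate_w)
```

Each token's output is then the weighted sum of its two experts' feed-forward outputs, so capacity scales with the expert count while per-token compute stays near-constant.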
-
Publication No.: US20240378441A1
Publication Date: 2024-11-14
Application No.: US18661447
Filing Date: 2024-05-10
Applicant: Google LLC
Inventor: Slav Petrov , Yonghui Wu , Andrew M. Dai , David Richard So , Dmitry Lepikhin , Erica Ann Moreira , Gaurav Mishra , Jonathan Hudson Clark , Maxim Krikun , Melvin Jose Johnson Premkumar , Nan Du , Orhan Firat , Rohan Anil , Siamak Shakeri , Xavier Garcia , Yanping Huang , Yong Cheng , Yuanzhong Xu , Yujing Zhang , Zachary Alexander Nado , Eric Jun Jie Ni , Kefan Xiao , Vladimir Feinberg , Jin Young Sohn , Aurko Roy
IPC: G06N3/08
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network to perform any one or more of a variety of machine learning tasks. For example, the neural network can be configured as a generative neural network, e.g., an autoregressive generative neural network.
-
Publication No.: US11138392B2
Publication Date: 2021-10-05
Application No.: US16521780
Filing Date: 2019-07-25
Applicant: Google LLC
Inventor: Zhifeng Chen , Macduff Richard Hughes , Yonghui Wu , Michael Schuster , Xu Chen , Llion Owen Jones , Niki J. Parmar , George Foster , Orhan Firat , Ankur Bapna , Wolfgang Macherey , Melvin Jose Johnson Premkumar
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
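The multi-headed attention module above, which generates multiple attention context vectors per decoder step, can be sketched directly. Shapes and the per-head projection matrices are assumptions for illustration; the recurrent encoder and decoder around it are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_context(query, encodings, w_q, w_k, w_v):
    """Produce one attention context vector per head for the current
    decoder query, then concatenate them.

    query:     (d,) current decoder state
    encodings: (src_len, d) encoder output vectors
    w_q/w_k/w_v: per-head projection matrices, each (d, d_head)
    """
    contexts = []
    for wq, wk, wv in zip(w_q, w_k, w_v):
        q = query @ wq                 # project the query for this head
        k = encodings @ wk             # project encoder outputs to keys
        v = encodings @ wv             # ...and to values
        attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))
        contexts.append(attn @ v)      # this head's context vector
    return np.concatenate(contexts)

rng = np.random.default_rng(4)
query = rng.normal(size=(6,))
encodings = rng.normal(size=(3, 6))    # 3 source positions
heads = [[rng.normal(size=(6, 4)) for _ in range(3)] for _ in range(2)]
w_q, w_k, w_v = (list(h) for h in zip(*heads))
ctx = multi_head_context(query, encodings, w_q, w_k, w_v)
```

The concatenated context then feeds the decoder, whose output vectors represent distributions over the second language's elements at each step.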