Implicit bridging of machine learning tasks

    Publication No.: US10713593B2

    Publication Date: 2020-07-14

    Application No.: US15394708

    Application Date: 2016-12-29

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for performing machine learning tasks. One method includes receiving (i) a model input, and (ii) data identifying a first machine learning task to be performed on the model input to generate a first type of model output for the model input; augmenting the model input with an identifier for the first machine learning task to generate an augmented model input; and processing the augmented model input using a machine learning model, wherein the machine learning model has been trained on training data to perform a plurality of machine learning tasks including the first machine learning task, and wherein the machine learning model has been configured through training to process the augmented model input to generate a machine learning model output of the first type for the model input.
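The augmentation step described in the abstract can be sketched as prepending a task-identifier token to the model input, so a single multi-task model knows which type of output to produce. This is a minimal illustration; the token format and function names below are assumptions, not the patent's specification.

```python
def augment_model_input(model_input_tokens, task_id):
    """Prepend a task-identifier token to the input sequence.

    The shared multi-task model, having seen such identifiers during
    training, produces the output type associated with task_id.
    The "<task:...>" token format is hypothetical.
    """
    return [f"<task:{task_id}>"] + list(model_input_tokens)

# Example: ask the shared model to treat this input as a translation task.
augmented = augment_model_input(["Hello", "world"], "translate_en_de")
```

A single model trained this way can route many tasks through one set of parameters, with the identifier token doing the work that separate task-specific models would otherwise do.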

    Implicit bridging of machine learning tasks

    Publication No.: US10679148B2

    Publication Date: 2020-06-09

    Application No.: US16402787

    Application Date: 2019-05-03

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for performing machine learning tasks. One method includes receiving (i) a model input, and (ii) data identifying a first machine learning task to be performed on the model input to generate a first type of model output for the model input; augmenting the model input with an identifier for the first machine learning task to generate an augmented model input; and processing the augmented model input using a machine learning model. An exemplary system applying implicit bridging for machine learning tasks, as described in this specification, trains a machine learning model to perform certain types of machine learning tasks without requiring explicit training data for the certain types of machine learning tasks to be used during training.

    CROSS-LINGUAL CLASSIFICATION USING MULTILINGUAL NEURAL MACHINE TRANSLATION

    Publication No.: US20200342182A1

    Publication Date: 2020-10-29

    Application No.: US16610233

    Application Date: 2019-08-26

    Applicant: Google LLC

    Abstract: Training and/or using a multilingual classification neural network model to perform a natural language processing classification task, where the model reuses an encoder portion of a multilingual neural machine translation model. In a variety of implementations, a client device can generate a natural language data stream from a spoken input from a user. The natural language data stream can be applied as input to an encoder portion of the multilingual classification model. The output generated by the encoder portion can be applied as input to a classifier portion of the multilingual classification model. The classifier portion can generate a predicted classification label of the natural language data stream. In many implementations, an output can be generated based on the predicted classification label, and a client device can present the output.
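The architecture in this abstract composes a pretrained encoder with a new classifier head. A minimal sketch of that composition, assuming simple callable components (the class and method names are illustrative, not the patent's API):

```python
class MultilingualClassifier:
    """Reuse a pretrained multilingual NMT encoder for classification.

    The encoder is taken from a multilingual translation model (and is
    typically kept frozen); only the classifier head is new.
    """

    def __init__(self, encoder, classifier_head):
        self.encoder = encoder            # pretrained NMT encoder portion
        self.classifier = classifier_head # task-specific classifier portion

    def predict(self, natural_language_stream):
        # Encoder output becomes the classifier's input, as in the abstract.
        encoded = self.encoder(natural_language_stream)
        return self.classifier(encoded)
```

Because the encoder was trained on many languages, the attached classifier can often handle inputs in languages it never saw labeled examples for.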

    Cross-lingual classification using multilingual neural machine translation

    Publication No.: US11373049B2

    Publication Date: 2022-06-28

    Application No.: US16610233

    Application Date: 2019-08-26

    Applicant: Google LLC

    Abstract: Training and/or using a multilingual classification neural network model to perform a natural language processing classification task, where the model reuses an encoder portion of a multilingual neural machine translation model. In a variety of implementations, a client device can generate a natural language data stream from a spoken input from a user. The natural language data stream can be applied as input to an encoder portion of the multilingual classification model. The output generated by the encoder portion can be applied as input to a classifier portion of the multilingual classification model. The classifier portion can generate a predicted classification label of the natural language data stream. In many implementations, an output can be generated based on the predicted classification label, and a client device can present the output.

    Adapting automated assistants for use with multiple languages

    Publication No.: US11113481B2

    Publication Date: 2021-09-07

    Application No.: US16621578

    Application Date: 2019-05-02

    Applicant: Google LLC

    Abstract: Techniques described herein may increase the language coverage of an automated assistant system; that is, they may increase the number of queries in one or more non-native languages for which the automated assistant is able to deliver reasonable responses. For example, techniques are described herein for training and utilizing a machine translation model to map a plurality of semantically-related natural language inputs in one language to one or more canonical translations in another language. In various implementations, the canonical translations may be selected and/or optimized for determining an intent of the speaker by the automated assistant, so that one or more responsive actions can be performed based on the speaker's intent. Put another way, the canonical translations may be specifically formatted for indicating the intent of the speaker to the automated assistant.
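The many-to-one mapping described above can be illustrated with a toy sketch in which a lookup table stands in for the learned translation model: several paraphrases collapse to one canonical translation, which then matches a known intent. The phrases, table, and function names are hypothetical examples, not data from the patent.

```python
# Stand-in for the learned machine translation model: several German
# paraphrases map to one canonical English translation (hypothetical data).
CANONICAL = {
    "mach das licht an": "turn on the light",
    "licht an bitte": "turn on the light",
}

# The canonical form is chosen so the assistant can resolve intent directly.
INTENTS = {"turn on the light": "lights_on"}

def resolve_intent(utterance):
    """Map an utterance to a canonical translation, then to an intent.

    Returns None when no canonical translation is known.
    """
    canonical = CANONICAL.get(utterance.lower())
    return INTENTS.get(canonical) if canonical else None
```

The point of the canonical form is that the intent table needs only one entry per intent, however many surface paraphrases exist in the source language.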

    MULTI-TASK LEARNING USING KNOWLEDGE DISTILLATION

    Publication No.: US20190325308A1

    Publication Date: 2019-10-24

    Application No.: US16458506

    Application Date: 2019-07-01

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for performing multi-task learning. In one method a system obtains a respective set of training data for each of multiple machine learning tasks. For each of the machine learning tasks, the system configures a respective teacher machine learning model to perform the machine learning task by training the teacher machine learning model on the training data. The system trains a single student machine learning model to perform the multiple machine learning tasks using (i) the configured teacher machine learning models, and (ii) the obtained training data.
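Training a student against configured teachers is commonly done with a distillation loss: the student matches each teacher's softened output distribution. The sketch below shows one standard form of that objective (cross-entropy against a temperature-softened teacher); the abstract does not fix the exact loss, so this is an assumption.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher temperature softens the distribution."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's softened outputs.

    One common distillation objective (an assumption here); for multi-task
    learning, a loss like this would be summed over the per-task teachers.
    """
    teacher_probs = softmax(teacher_logits, temperature)
    student_log_probs = [math.log(p) for p in softmax(student_logits, temperature)]
    return -sum(t * s for t, s in zip(teacher_probs, student_log_probs))
```

A student whose logits agree with the teacher's incurs a lower loss than one that disagrees, which is what drives the single student model toward all of the teachers' behaviors at once.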
