END-TO-END TEXT-TO-SPEECH CONVERSION
    2.
    Invention Application

    Publication No.: WO2018183650A2

    Publication Date: 2018-10-04

    Application No.: PCT/US2018/025101

    Filing Date: 2018-03-29

    Applicant: GOOGLE LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating speech from text. One of the systems includes one or more computers and one or more storage devices storing instructions that when executed by one or more computers cause the one or more computers to implement: a sequence-to-sequence recurrent neural network configured to: receive a sequence of characters in a particular natural language, and process the sequence of characters to generate a spectrogram of a verbal utterance of the sequence of characters in the particular natural language; and a subsystem configured to: receive the sequence of characters in the particular natural language, and provide the sequence of characters as input to the sequence-to-sequence recurrent neural network to obtain as output the spectrogram of the verbal utterance of the sequence of characters in the particular natural language.
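
    The architecture the abstract outlines (a recurrent encoder over the character sequence driving a recurrent decoder that emits spectrogram frames) can be illustrated with a minimal PyTorch sketch. This is not the claimed model: the class name CharToSpectrogram, the layer sizes, the single-layer LSTMs, and the 80-bin mel output are illustrative assumptions.

        # Minimal sketch (not the patented architecture): a character-level
        # encoder-decoder RNN that emits one spectrogram frame per decoder step.
        import torch
        import torch.nn as nn

        class CharToSpectrogram(nn.Module):  # hypothetical name
            def __init__(self, vocab_size=128, emb=256, hidden=512, n_mels=80):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, emb)
                self.encoder = nn.LSTM(emb, hidden, batch_first=True)
                self.decoder = nn.LSTM(n_mels, hidden, batch_first=True)
                self.frame_out = nn.Linear(hidden, n_mels)

            def forward(self, char_ids, n_frames):
                # char_ids: (batch, chars) -> spectrogram of shape (batch, n_frames, n_mels)
                _, state = self.encoder(self.embed(char_ids))
                frame = torch.zeros(char_ids.size(0), 1, self.frame_out.out_features)
                frames = []
                for _ in range(n_frames):
                    out, state = self.decoder(frame, state)  # conditioned on the encoder state
                    frame = self.frame_out(out)              # predict the next spectrogram frame
                    frames.append(frame)
                return torch.cat(frames, dim=1)

        spec = CharToSpectrogram()(torch.randint(0, 128, (1, 16)), n_frames=50)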

    IMPLICIT BRIDGING OF MACHINE LEARNING TASKS
    4.
    Invention Application (Pending - Published)

    Publication No.: WO2018085577A1

    Publication Date: 2018-05-11

    Application No.: PCT/US2017/059776

    Filing Date: 2017-11-02

    Applicant: GOOGLE LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing machine learning tasks. One method includes receiving (i) a model input, and (ii) data identifying a first machine learning task to be performed on the model input to generate a first type of model output for the model input; augmenting the model input with an identifier for the first machine learning task to generate an augmented model input; and processing the augmented model input using a machine learning model. An exemplary system applying implicit bridging for machine learning tasks, as described in this specification, trains a machine learning model to perform certain types of machine learning tasks without requiring explicit training data for those types of machine learning tasks to be used during training.
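
    The central step in the abstract is augmenting the model input with an identifier for the task before a single shared model processes it. A minimal sketch of that augmentation step follows; the "<task>" token format and the example task identifiers are illustrative assumptions, not taken from the application.

        # Sketch of the input-augmentation step described in the abstract.
        def augment_input(model_input: str, task_id: str) -> str:
            """Prepend a task identifier so one shared model can route the example."""
            return f"<{task_id}> {model_input}"

        # The same model is trained on several augmented tasks; a pairing never
        # observed directly (e.g. pt->es) can then be requested with only its
        # task identifier, which is the "implicit bridging" idea.
        training_examples = [
            (augment_input("How are you?", "en2es"), "¿Cómo estás?"),
            (augment_input("Como vai você?", "pt2en"), "How are you?"),
        ]
        unseen_query = augment_input("Como vai você?", "pt2es")  # task id only, no direct data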

    REWARD AUGMENTED MODEL TRAINING
    5.
    Invention Application (Pending - Published)

    Publication No.: WO2018039510A1

    Publication Date: 2018-03-01

    Application No.: PCT/US2017/048529

    Filing Date: 2017-08-25

    Applicant: GOOGLE LLC

    CPC classification number: G06N3/08 G06N20/00

    Abstract: A method includes obtaining data identifying a machine learning model to be trained to perform a machine learning task, the machine learning model being configured to receive an input example and to process the input example in accordance with current values of a plurality of model parameters to generate a model output for the input example; obtaining initial training data for training the machine learning model, the initial training data comprising a plurality of training examples and, for each training example, a ground truth output that should be generated by the machine learning model by processing the training example; generating modified training data from the initial training data; and training the machine learning model on the modified training data.
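
    The abstract leaves open how the modified training data is produced. One common reward-augmented scheme, assumed here purely for illustration and not quoted from the application, replaces each ground-truth output with an alternative sampled in proportion to its exponentiated task reward; the reward function, temperature, and candidate set below are illustrative assumptions.

        # Sketch of one way "modified training data" could be generated:
        # sample replacement outputs with probability proportional to exp(reward / tau).
        import math
        import random

        def sample_modified_output(truth, candidates, reward_fn, tau=0.8):
            weights = [math.exp(reward_fn(c, truth) / tau) for c in candidates]
            return random.choices(candidates, weights=weights, k=1)[0]

        def build_modified_data(initial_data, candidate_fn, reward_fn):
            return [(x, sample_modified_output(y, candidate_fn(y), reward_fn))
                    for x, y in initial_data]

        # Toy usage: reward = negative token-level mismatch against the ground truth.
        def neg_token_errors(cand, truth):
            return -sum(a != b for a, b in zip(cand.split(), truth.split()))

        initial = [("translate: bonjour", "hello there")]
        modified = build_modified_data(
            initial,
            candidate_fn=lambda y: [y, "hello here", "hi there"],  # assumed candidate set
            reward_fn=neg_token_errors,
        )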

    NEURAL MACHINE TRANSLATION SYSTEMS
    9.
    Invention Application (Pending - Published)

    Publication No.: WO2018058046A1

    Publication Date: 2018-03-29

    Application No.: PCT/US2017/053267

    Filing Date: 2017-09-25

    Applicant: GOOGLE LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for neural machine translation. One of the systems includes an encoder neural network comprising: an input forward long short-term memory (LSTM) layer configured to process each input token in the input sequence in a forward order to generate a respective forward representation of each input token, an input backward LSTM layer configured to process each input token in a backward order to generate a respective backward representation of each input token and a plurality of hidden LSTM layers configured to process a respective combined representation of each of the input tokens in the forward order to generate a respective encoded representation of each of the input tokens; and a decoder subsystem configured to receive the respective encoded representations and to process the encoded representations to generate an output sequence.
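
    The encoder the abstract describes (a forward LSTM and a backward LSTM over the input tokens, whose per-token representations are combined and fed through a stack of forward LSTM layers) can be sketched in PyTorch as follows. The class name BiThenStackedEncoder, the use of concatenation as the combination step, and all sizes are illustrative assumptions.

        # Minimal sketch of the described encoder: forward and backward input
        # LSTMs, combined per token, then a stack of forward LSTM layers.
        import torch
        import torch.nn as nn

        class BiThenStackedEncoder(nn.Module):  # hypothetical name
            def __init__(self, vocab=32000, emb=256, hidden=256, n_hidden_layers=3):
                super().__init__()
                self.embed = nn.Embedding(vocab, emb)
                self.fwd = nn.LSTM(emb, hidden, batch_first=True)
                self.bwd = nn.LSTM(emb, hidden, batch_first=True)
                self.stack = nn.LSTM(2 * hidden, hidden,
                                     num_layers=n_hidden_layers, batch_first=True)

            def forward(self, token_ids):
                x = self.embed(token_ids)                        # (batch, seq, emb)
                fwd_repr, _ = self.fwd(x)                        # forward order
                bwd_repr, _ = self.bwd(torch.flip(x, dims=[1]))  # backward order
                bwd_repr = torch.flip(bwd_repr, dims=[1])        # realign to token positions
                combined = torch.cat([fwd_repr, bwd_repr], dim=-1)
                encoded, _ = self.stack(combined)                # hidden layers, forward order
                return encoded                                   # one encoded representation per token

        encoded = BiThenStackedEncoder()(torch.randint(0, 32000, (2, 7)))  # shape (2, 7, 256)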
