Multitask Learning As Question Answering
    Invention Application

    Publication Number: US20190355270A1

    Publication Date: 2019-11-21

    Application Number: US16006691

    Filing Date: 2018-06-12

    Abstract: Approaches for natural language processing include a multi-layer encoder for encoding words from a context and words from a question in parallel, a multi-layer decoder for decoding the encoded context and the encoded question, a pointer generator for generating distributions over the words from the context, the words from the question, and words in a vocabulary based on an output from the decoder, and a switch. The switch generates a weighting of the distributions over the words from the context, the words from the question, and the words in the vocabulary; generates a composite distribution based on that weighting of the distribution over the words from the context, the distribution over the words from the question, and the distribution over the words in the vocabulary; and selects words for inclusion in an answer using the composite distribution.
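
    As a rough illustration of the switch described in the abstract, the sketch below is a minimal, hypothetical PyTorch example, not the patented implementation: the class name DistributionSwitch, its projection layers, and the tensor shapes are assumptions chosen only to show how per-step weights can combine a context distribution, a question distribution, and a vocabulary distribution into one composite distribution.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class DistributionSwitch(nn.Module):
            # Hypothetical sketch: mixes three word distributions with learned weights.
            def __init__(self, decoder_dim, vocab_size):
                super().__init__()
                self.weight_proj = nn.Linear(decoder_dim, 3)          # weights over context/question/vocab
                self.vocab_proj = nn.Linear(decoder_dim, vocab_size)  # distribution over the vocabulary

            def forward(self, decoder_state, context_attn, question_attn, context_ids, question_ids):
                # decoder_state: (batch, decoder_dim); *_attn and *_ids: (batch, seq_len)
                gamma = F.softmax(self.weight_proj(decoder_state), dim=-1)   # (batch, 3) mixing weights
                p_vocab = F.softmax(self.vocab_proj(decoder_state), dim=-1)  # (batch, vocab_size)
                # Map attention over context/question tokens into vocabulary space.
                p_context = torch.zeros_like(p_vocab).scatter_add(1, context_ids, context_attn)
                p_question = torch.zeros_like(p_vocab).scatter_add(1, question_ids, question_attn)
                # Composite distribution used to select the next answer word.
                return (gamma[:, 0:1] * p_context
                        + gamma[:, 1:2] * p_question
                        + gamma[:, 2:3] * p_vocab)

    At each decoding step, the word with the highest composite probability can then be appended to the answer.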

    SYSTEMS AND METHODS FOR LEARNING FOR DOMAIN ADAPTATION

    Publication Number: US20190286073A1

    Publication Date: 2019-09-19

    Application Number: US16054935

    Filing Date: 2018-08-03

    Abstract: A method for training parameters of a first domain adaptation model includes evaluating a cycle consistency objective using a first task-specific model associated with a first domain and a second task-specific model associated with a second domain. The cycle consistency objective is evaluated based on one or more first training representations adapted from the first domain to the second domain by the first domain adaptation model and then back from the second domain to the first domain by a second domain adaptation model, and on one or more second training representations adapted from the second domain to the first domain by the second domain adaptation model and then back from the first domain to the second domain by the first domain adaptation model. The method further includes evaluating a learning objective based on the cycle consistency objective and updating the parameters of the first domain adaptation model based on the learning objective.
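
    As a sketch of how the cycle consistency objective might be evaluated, the hypothetical PyTorch snippet below round-trips representations through both adaptation models and uses each domain's task-specific model to compare the round-tripped representation with the original; the function name, argument names, and the KL-divergence comparison are assumptions, since the abstract does not specify how the task-specific models score consistency.

        import torch.nn.functional as F

        def cycle_consistency_loss(x_first, y_second, adapt_1to2, adapt_2to1,
                                   task_model_first, task_model_second):
            # x_first / y_second: training representations from the first / second domain.
            # adapt_1to2 / adapt_2to1: the first and second domain adaptation models.
            # task_model_first / task_model_second: task-specific models for each domain.

            # First domain -> second domain -> back to first, scored by the first task model.
            x_cycled = adapt_2to1(adapt_1to2(x_first))
            loss_first = F.kl_div(task_model_first(x_cycled).log_softmax(dim=-1),
                                  task_model_first(x_first).softmax(dim=-1).detach(),
                                  reduction="batchmean")

            # Second domain -> first domain -> back to second, scored by the second task model.
            y_cycled = adapt_1to2(adapt_2to1(y_second))
            loss_second = F.kl_div(task_model_second(y_cycled).log_softmax(dim=-1),
                                   task_model_second(y_second).softmax(dim=-1).detach(),
                                   reduction="batchmean")

            return loss_first + loss_second

    A learning objective that includes this term can then be backpropagated to update the parameters of the first domain adaptation model.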

    Dynamic Memory Network
    Invention Application (Pending - Published)

    Publication Number: US20170024645A1

    Publication Date: 2017-01-26

    Application Number: US15221532

    Filing Date: 2016-07-27

    Abstract: A novel unified neural network framework, the dynamic memory network, is disclosed. This unified framework reduces every task in natural language processing to a question answering problem over an input sequence. Inputs and questions are used to create and connect deep memory sequences. Answers are then generated based on dynamically retrieved memories.

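    A single memory hop of such a network can be sketched as follows. This is a hypothetical PyTorch example in the spirit of the abstract, not the patented architecture: the class EpisodicMemory, the attention scoring layer, and the GRU-cell update are assumptions used to show how a question and the current memory select facts from the input sequence and produce an updated memory, from which an answer can later be generated.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class EpisodicMemory(nn.Module):
            # Hypothetical sketch of one memory hop: attend over input facts, then update the memory.
            def __init__(self, hidden_dim):
                super().__init__()
                self.attn_score = nn.Linear(3 * hidden_dim, 1)           # scores each fact
                self.memory_update = nn.GRUCell(hidden_dim, hidden_dim)  # connects episodes into a memory sequence

            def forward(self, facts, question, memory):
                # facts: (batch, num_facts, hidden); question and memory: (batch, hidden)
                q = question.unsqueeze(1).expand_as(facts)
                m = memory.unsqueeze(1).expand_as(facts)
                # Attention over facts conditioned on the question and the current memory.
                scores = self.attn_score(torch.cat([facts, q, m], dim=-1)).squeeze(-1)
                weights = F.softmax(scores, dim=1)
                # Episode: weighted summary of the dynamically retrieved facts.
                episode = torch.bmm(weights.unsqueeze(1), facts).squeeze(1)
                # Connect the new episode to the previous memory.
                return self.memory_update(episode, memory)

    Running several such hops and decoding an answer from the final memory and the question mirrors the abstract's description of creating and connecting deep memory sequences and generating answers from dynamically retrieved memories.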

    Systems and methods for reading comprehension for a question answering task

    Publication Number: US11775775B2

    Publication Date: 2023-10-03

    Application Number: US16695494

    Filing Date: 2019-11-26

    CPC classification numbers: G06F40/40; G06F40/30

    Abstract: Embodiments described herein provide a pipelined natural language question answering system that improves upon a BERT-based system. Specifically, the natural language question answering system uses a pipeline of neural networks, each trained to perform a particular task. The context selection network identifies the premium (most relevant) context for the question from the full context. The question type network classifies the natural language question as a yes, no, or span question and, when it is a yes or no question, also produces the yes or no answer. The span extraction model determines the answer span for the natural language question when the question is a span question.
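
    The pipeline can be summarized with a short Python sketch. The three callables below stand in for the trained context selection, question type, and span extraction networks; their names and signatures are assumptions, not the patent's API.

        def answer_question(question, context, select_context, classify_question, extract_span):
            # Hypothetical pipeline wiring; each argument after `context` is a trained network.
            # 1. Context selection: keep only the premium (most relevant) context.
            premium_context = select_context(question, context)

            # 2. Question type: "yes", "no", or "span".
            question_type = classify_question(question, premium_context)
            if question_type in ("yes", "no"):
                # For yes/no questions, the question type network already supplies the answer.
                return question_type

            # 3. Span extraction for span questions.
            start, end = extract_span(question, premium_context)
            return premium_context[start:end]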
