-
Publication No.: EP3844536A1
Publication date: 2021-07-07
Application No.: EP19801845.9
Filing date: 2019-11-11
Applicant(s): Koninklijke Philips N.V. , Nederlandse Organisatie voor Toegepast-Natuurwetenschappelijk Onderzoek TNO
-
Publication No.: EP2384603B1
Publication date: 2015-03-04
Application No.: EP09801280.0
Filing date: 2009-12-22
Inventor(s): WEI, Gongming , LIU, Yadong , HOUTEN, Henk van , LIU, Bo , CORNELISSEN, Hugo Johan , ZHU, Xiaoyan , RONDA, Cornelis Reinder , RUIJTER, Hendrikus Albertus Adrianus Maria de
CPC classifications: C09K11/06 , F21W2131/3005 , F21Y2101/00 , F21Y2115/10 , G02B6/0035 , G02B6/0068
-
Publication No.: EP3408755A1
Publication date: 2018-12-05
Application No.: EP17701752.2
Filing date: 2017-01-23
CPC classifications: G06F17/2795 , G06F17/2785 , G06F17/2881 , G06N3/0445 , G06N3/0454 , G06N3/08
Abstract: The present disclosure pertains to a paraphrase generation system. The system comprises one or more hardware processors and/or other components. The system is configured to obtain a training corpus comprising language and known paraphrases of that language. Based on the training corpus, the system generates a word-level attention-based model and a character-level attention-based model, and provides one or more candidate paraphrases of a natural language input based on both models. The word-level attention-based model is a word-level bidirectional long short-term memory (LSTM) network, and the character-level attention-based model is a character-level bidirectional LSTM network; both networks are generated from the words and characters in the training corpus. In some embodiments, the LSTM networks are stacked residual LSTM networks comprising residual connections between stacked layers of a given LSTM network.
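The "stacked residual LSTM" arrangement in the abstract — stacked LSTM layers with residual (identity) connections between them — can be sketched as follows. This is a minimal NumPy illustration of the general technique only, not the patented implementation: all class and function names are hypothetical, and attention, bidirectionality, and training are omitted for brevity. The residual connection simply adds each layer's input to its output, which requires the hidden size to match the input size.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with one combined weight matrix for all four gates."""
    def __init__(self, input_size, hidden_size, rng):
        self.hidden_size = hidden_size
        scale = 1.0 / np.sqrt(hidden_size)
        # Rows: input gate, forget gate, candidate cell, output gate.
        self.W = rng.uniform(-scale, scale,
                             (4 * hidden_size, input_size + hidden_size))
        self.b = np.zeros(4 * hidden_size)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_size
        i = sigmoid(z[:H])          # input gate
        f = sigmoid(z[H:2 * H])     # forget gate
        g = np.tanh(z[2 * H:3 * H]) # candidate cell state
        o = sigmoid(z[3 * H:])      # output gate
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

def run_layer(cell, xs):
    """Run one LSTM layer over a sequence of input vectors."""
    h = np.zeros(cell.hidden_size)
    c = np.zeros(cell.hidden_size)
    return_hs = []
    for x in xs:
        h, c = cell.step(x, h, c)
        return_hs.append(h)
    return return_hs

def stacked_residual_lstm(xs, cells):
    """Stack LSTM layers with a residual connection bridging each layer:
    the next layer's input is the current layer's output plus its input."""
    for cell in cells:
        ys = run_layer(cell, xs)
        xs = [y + x for y, x in zip(ys, xs)]  # residual (identity) connection
    return xs

# Example: a 3-layer stacked residual LSTM over a 5-step sequence.
rng = np.random.default_rng(0)
H = 8  # hidden size == input size so residuals add cleanly
cells = [LSTMCell(H, H, rng) for _ in range(3)]
sequence = [rng.standard_normal(H) for _ in range(5)]
outputs = stacked_residual_lstm(sequence, cells)
```

In a bidirectional variant, each layer would also process the sequence in reverse and concatenate (or sum) the two directions' hidden states per time step; the residual addition between stacked layers works the same way.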