-
Publication Number: US12045727B2
Publication Date: 2024-07-23
Application Number: US17115464
Filing Date: 2020-12-08
Applicant: NEC Laboratories America, Inc.
Inventor: Renqiang Min , Christopher Malon , Pengyu Cheng
IPC: G06N3/088 , G06F40/20 , G06N3/02 , G06N3/0442 , G06N3/08 , G06N3/082 , G06N3/086 , G10L15/06 , G10L15/16 , G10L15/22
CPC classification number: G06N3/088 , G06F40/20 , G06N3/0442 , G06N3/08 , G06N3/086 , G10L15/063 , G10L15/16 , G10L15/22 , G06N3/02 , G06N3/082
Abstract: A computer-implemented method is provided for disentangled data generation. The method includes accessing, by a bidirectional Long Short-Term Memory (LSTM) with a multi-head attention mechanism, a dataset including a plurality of pairs, each formed from a given one of a plurality of input text structures and a given one of a plurality of style labels for the plurality of input text structures. The method further includes training the bidirectional LSTM as an encoder to disentangle a sequential text input into disentangled representations comprising a content embedding and a style embedding, based on a subset of the dataset. The method also includes training a unidirectional LSTM as a decoder to generate a next text structure prediction for the sequential text input based on previously generated text structure information and a current word, from a disentangled representation with the content embedding and the style embedding.
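The architecture the abstract describes can be sketched as follows. This is a hypothetical illustration only, not the patented implementation: all module names, dimensions, and the mean-pooling step are assumptions. It shows a bidirectional LSTM encoder with multi-head attention whose pooled state is projected into separate content and style embeddings, and a unidirectional LSTM decoder that predicts next-token logits conditioned on the current word plus both embeddings.

```python
# Hypothetical sketch of the disentangling encoder-decoder in the abstract.
# Sizes, pooling, and conditioning scheme are illustrative assumptions.
import torch
import torch.nn as nn


class DisentanglingEncoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=64,
                 content_dim=32, style_dim=16, heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM reads the token sequence in both directions.
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True,
                            batch_first=True)
        # Multi-head self-attention over the LSTM hidden states.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        # Separate projections disentangle the pooled state into
        # a content embedding and a style embedding.
        self.to_content = nn.Linear(2 * hidden, content_dim)
        self.to_style = nn.Linear(2 * hidden, style_dim)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))   # (B, T, 2*hidden)
        h, _ = self.attn(h, h, h)              # self-attention
        pooled = h.mean(dim=1)                 # (B, 2*hidden)
        return self.to_content(pooled), self.to_style(pooled)


class Decoder(nn.Module):
    """Unidirectional LSTM predicting the next token from the current
    word concatenated with the content and style embeddings."""
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=64,
                 content_dim=32, style_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim + content_dim + style_dim, hidden,
                            batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens, content, style):
        cond = torch.cat([content, style], dim=-1)            # (B, C+S)
        cond = cond.unsqueeze(1).expand(-1, tokens.size(1), -1)
        x = torch.cat([self.embed(tokens), cond], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h)                                    # (B, T, vocab)


tokens = torch.randint(0, 1000, (2, 7))        # batch of 2, length-7 sequences
enc, dec = DisentanglingEncoder(), Decoder()
content, style = enc(tokens)
logits = dec(tokens, content, style)
```

In training, the decoder's logits would be scored against the shifted input sequence (next-token prediction), while the style labels in the dataset supervise the style embedding so that content and style factors separate.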
-
Publication Number: US20210174213A1
Publication Date: 2021-06-10
Application Number: US17115464
Filing Date: 2020-12-08
Applicant: NEC Laboratories America, Inc.
Inventor: Renqiang Min , Christopher Malon , Pengyu Cheng
Abstract: A computer-implemented method is provided for disentangled data generation. The method includes accessing, by a bidirectional Long Short-Term Memory (LSTM) with a multi-head attention mechanism, a dataset including a plurality of pairs, each formed from a given one of a plurality of input text structures and a given one of a plurality of style labels for the plurality of input text structures. The method further includes training the bidirectional LSTM as an encoder to disentangle a sequential text input into disentangled representations comprising a content embedding and a style embedding, based on a subset of the dataset. The method also includes training a unidirectional LSTM as a decoder to generate a next text structure prediction for the sequential text input based on previously generated text structure information and a current word, from a disentangled representation with the content embedding and the style embedding.
-