Learning longer-term dependencies in neural network using auxiliary losses

    Publication number: US11501168B2

    Publication date: 2022-11-15

    Application number: US16273041

    Application date: 2019-02-11

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for structuring and training a recurrent neural network. This specification describes a technique that improves the ability of recurrent neural networks to capture long-term dependencies by adding an unsupervised auxiliary loss at one or more anchor points to the original objective. This auxiliary loss forces the network to either reconstruct previous events or predict next events in a sequence, making truncated backpropagation feasible for long sequences and also improving full backpropagation through time.
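    The sketch below illustrates the idea described in the abstract, not the patented implementation: a recurrent network trained with its original supervised objective plus an unsupervised auxiliary loss at a randomly chosen anchor point, here the "reconstruct previous events" variant. All names (e.g. AuxiliaryReconstructionRNN, aux_weight), hyperparameters, and the choice of PyTorch are illustrative assumptions.

```python
# Minimal sketch of an auxiliary reconstruction loss for an RNN (assumed details).
import torch
import torch.nn as nn

class AuxiliaryReconstructionRNN(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.rnn = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)
        # Decoder used only for the auxiliary loss: it reconstructs past inputs
        # from the hidden state at an anchor point.
        self.decoder = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.reconstruct = nn.Linear(hidden_size, input_size)

    def forward(self, x):
        outputs, _ = self.rnn(x)                  # (batch, seq, hidden)
        logits = self.classifier(outputs[:, -1])  # main (supervised) prediction
        return logits, outputs

    def auxiliary_loss(self, x, outputs, anchor, segment_len):
        # Take the hidden state at the anchor point and try to reconstruct the
        # preceding `segment_len` inputs ("reconstruct previous events").
        h = outputs[:, anchor]                                  # (batch, hidden)
        target = x[:, anchor - segment_len:anchor]              # (batch, seg, input)
        h0 = (h.unsqueeze(0), torch.zeros_like(h).unsqueeze(0))
        dec_out, _ = self.decoder(torch.zeros_like(target), h0)
        return nn.functional.mse_loss(self.reconstruct(dec_out), target)

# Illustrative training step with assumed dimensions and loss weighting.
batch, seq_len, input_size, hidden, num_classes = 8, 200, 16, 64, 10
model = AuxiliaryReconstructionRNN(input_size, hidden, num_classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(batch, seq_len, input_size)
y = torch.randint(0, num_classes, (batch,))

logits, outputs = model(x)
main_loss = nn.functional.cross_entropy(logits, y)

segment_len = 50
anchor = torch.randint(segment_len, seq_len, (1,)).item()  # random anchor point
aux_loss = model.auxiliary_loss(x, outputs, anchor, segment_len)

aux_weight = 0.5  # assumed weight of the auxiliary term in the combined objective
loss = main_loss + aux_weight * aux_loss
opt.zero_grad()
loss.backward()
opt.step()
```

    In this sketch the auxiliary term is backpropagated through the full sequence for simplicity; the abstract's point is that the same anchored loss also keeps gradients informative when backpropagation is truncated to a window around each anchor.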
