SYSTEM AND METHOD FOR COMPACT, FAST, AND ACCURATE LSTMS

    Publication No.: US20210133540A1

    Publication Date: 2021-05-06

    Application No.: US17058428

    Filing Date: 2019-03-14

    Abstract: According to various embodiments, a method for generating an optimal hidden-layer long short-term memory (H-LSTM) architecture is disclosed. The H-LSTM architecture includes a memory cell and a plurality of deep neural network (DNN) control gates enhanced with hidden layers. The method includes providing an initial seed H-LSTM architecture, training the initial seed H-LSTM architecture by growing one or more connections based on gradient information and iteratively pruning one or more connections based on magnitude information, and terminating the iterative pruning when training cannot achieve a predefined accuracy threshold.
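The abstract describes two complementary primitives: growing connections based on gradient information and pruning connections based on magnitude information. A minimal NumPy sketch of those two operations on a single weight matrix with a boolean connectivity mask; the function names, the mask representation, and the zero-initialization of newly grown weights are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def grow_connections(weights, grads, mask, grow_frac=0.1):
    """Activate the inactive connections whose loss gradient has the
    largest magnitude (gradient-based growth)."""
    inactive = ~mask
    k = int(grow_frac * inactive.sum())
    if k > 0:
        # Rank inactive connections by |gradient|; active ones score -inf
        # so they are never re-selected.
        scores = np.where(inactive, np.abs(grads), -np.inf)
        idx = np.argpartition(scores, -k, axis=None)[-k:]
        mask.flat[idx] = True
        weights.flat[idx] = 0.0  # assumption: grown weights start at zero
    return weights, mask

def prune_connections(weights, mask, prune_frac=0.1):
    """Deactivate the active connections with the smallest weight
    magnitude (magnitude-based pruning)."""
    k = int(prune_frac * mask.sum())
    if k > 0:
        # Rank active connections by |weight|; inactive ones score +inf
        # so they are never selected.
        scores = np.where(mask, np.abs(weights), np.inf)
        idx = np.argpartition(scores, k - 1, axis=None)[:k]
        mask.flat[idx] = False
        weights.flat[idx] = 0.0
    return weights, mask
```

In the H-LSTM setting these operations would be applied to the weight matrices of each hidden-layer-enhanced DNN control gate, interleaved with retraining; here they act on one generic matrix for illustration.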

    SYSTEM AND METHOD FOR INCREMENTAL LEARNING USING A GROW-AND-PRUNE PARADIGM WITH NEURAL NETWORKS

    Publication No.: US20220222534A1

    Publication Date: 2022-07-14

    Application No.: US17613284

    Filing Date: 2020-03-20

    Abstract: According to various embodiments, a method for generating a compact and accurate neural network for a dataset that has initial data and is updated with new data is disclosed. The method includes performing a first training on the initial neural network architecture to create a first trained neural network architecture. The method additionally includes performing a second training on the first trained neural network architecture when the dataset is updated with new data to create a second trained neural network architecture. The second training includes growing one or more connections for the new data based on a gradient of each connection, growing one or more connections for the new data and the initial data based on a gradient of each connection, and iteratively pruning one or more connections based on a magnitude of each connection until a desired neural network architecture is achieved.
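The second training phase ends by iteratively pruning connections by magnitude "until a desired neural network architecture is achieved." That termination criterion can be sketched as a prune-and-check loop that rolls back to the last architecture still meeting an accuracy threshold. A self-contained NumPy sketch; `accuracy_fn` stands in for retraining and evaluating the pruned network, and the rollback-on-failure policy is an illustrative assumption:

```python
import numpy as np

def iterative_prune(weights, mask, accuracy_fn, threshold,
                    prune_frac=0.1, max_rounds=50):
    """Repeatedly prune the smallest-magnitude active connections; stop
    and roll back once accuracy falls below the threshold."""
    for _ in range(max_rounds):
        prev_w, prev_m = weights.copy(), mask.copy()
        # Rank active connections by |weight|; inactive ones score +inf.
        scores = np.where(mask, np.abs(weights), np.inf)
        k = max(1, int(prune_frac * mask.sum()))
        idx = np.argpartition(scores, k - 1, axis=None)[:k]
        mask.flat[idx] = False
        weights.flat[idx] = 0.0
        if accuracy_fn(weights, mask) < threshold:
            # Last architecture that still met the threshold.
            return prev_w, prev_m
        if mask.sum() == 0:
            break
    return weights, mask
```

With a toy `accuracy_fn` that simply returns the fraction of active connections, the loop prunes one connection per round and returns the sparsest mask still meeting the threshold, illustrating the stopping behavior the abstract describes.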
