DISTRIBUTED TRAINING OF NEURAL NETWORK MODELS

    Publication Number: US20210182660A1

    Publication Date: 2021-06-17

    Application Number: US16716461

    Filing Date: 2019-12-16

    Abstract: Systems and methods for distributed training of a neural network model are described. Various embodiments include a master device and a slave device. The master device has a first version of the neural network model. The slave device is remote from the master device and is communicatively coupled to a first data source and to the master device; in one embodiment, the first data source is inaccessible to the master device. The master device is configured to output first configuration data for the neural network model based on the first version of the model. The slave device is configured to use the first configuration data to instantiate a second version of the neural network model, to train the second version using data from the first data source, and to output second configuration data for the neural network model. The master device is configured to use the second configuration data to update parameters of the first version of the neural network model.
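    The exchange described in the abstract can be sketched in a few lines. This is a minimal illustration only, not the patented implementation: the class names, the dict-of-parameters model, and the toy "training" rule are all assumptions made for the example.

    ```python
    # Hypothetical sketch of the master/slave exchange described above.
    # All names and the toy training rule are illustrative assumptions.

    class MasterDevice:
        def __init__(self, params):
            # First version of the model: a dict of named parameters.
            self.params = dict(params)

        def export_config(self):
            # First configuration data, derived from the current model version.
            return dict(self.params)

        def apply_update(self, second_config):
            # Use the slave's second configuration data to update the master model.
            self.params.update(second_config)


    class SlaveDevice:
        def __init__(self, data_source):
            # Private data source, accessible only to the slave.
            self.data_source = data_source
            self.params = None

        def instantiate(self, first_config):
            # Second version of the model, built from the master's config.
            self.params = dict(first_config)

        def train(self, lr=0.1):
            # Toy stand-in for training: nudge each parameter toward
            # the mean of the local (private) data.
            target = sum(self.data_source) / len(self.data_source)
            for name in self.params:
                self.params[name] += lr * (target - self.params[name])
            return dict(self.params)  # second configuration data


    master = MasterDevice({"w": 0.0})
    slave = SlaveDevice(data_source=[1.0, 2.0, 3.0])  # never seen by the master
    slave.instantiate(master.export_config())
    master.apply_update(slave.train())
    ```

    Note that the master only ever sees configuration data (parameters), never the slave's raw data, which is the point of the arrangement.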

    NEURAL NETWORK TRAINING FROM PRIVATE DATA

    Publication Number: US11551083B2

    Publication Date: 2023-01-10

    Application Number: US16716497

    Filing Date: 2019-12-17

    Abstract: Training and enhancement of neural network models, such as from private data, are described. A slave device receives a version of a neural network model from a master device. The slave accesses a local and/or private data source and uses that data to optimize the neural network model, for example by computing gradients or by performing knowledge distillation to locally train an enhanced second version of the model. The slave sends the gradients or the enhanced neural network model to the master. The master may use the gradients or the second version of the model to improve the master model.
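    The gradient path mentioned in this abstract can be illustrated with a one-parameter least-squares objective. This is a sketch under assumed names and an assumed loss; the patent does not prescribe this particular objective or update rule.

    ```python
    # Hypothetical sketch: the slave computes a gradient on private data and
    # the master applies it. The loss and learning rate are assumptions.

    def local_gradient(w, private_data):
        # Gradient of 0.5 * mean((w - x)^2) over the slave's private data.
        return sum(w - x for x in private_data) / len(private_data)

    def master_step(w, gradient, lr=0.5):
        # Master improves its model using the gradient received from the slave.
        return w - lr * gradient

    w_master = 0.0
    private = [2.0, 4.0]                    # visible only to the slave
    g = local_gradient(w_master, private)   # computed on the slave
    w_master = master_step(w_master, g)     # applied on the master
    ```

    Only the scalar gradient crosses the link; the private samples stay on the slave, which is the property both patents emphasize.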
