-
Publication No.: US20230118025A1
Publication Date: 2023-04-20
Application No.: US17914297
Filing Date: 2021-06-03
Applicant: QUALCOMM Technologies, Inc.
Inventor: Matthias REISSER , Max WELLING , Efstratios GAVVES , Christos LOUIZOS
Abstract: A method of collaboratively training a neural network model includes receiving local updates from a subset of multiple users. Each local update relates to one or more subsets of a dataset of the neural network model. A local component of the neural network model identifies the subset to which a data point belongs. A global update for the neural network model is computed based on the local updates from the subset of users, and the global updates for each portion of the network are aggregated to train the neural network model.
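The aggregation step described in this abstract can be illustrated with a minimal server-side sketch, assuming each participating user sends per-portion parameter deltas as NumPy arrays; the names global_params, client_updates, and aggregate_global_update are illustrative assumptions and are not taken from the patent.

    # Hedged sketch: server-side aggregation of local updates, where only the
    # users whose data subset touches a given portion of the network contribute
    # to that portion's update. Names are illustrative, not from the patent.
    from typing import Dict, List
    import numpy as np

    def aggregate_global_update(
        global_params: Dict[str, np.ndarray],
        client_updates: List[Dict[str, np.ndarray]],
    ) -> Dict[str, np.ndarray]:
        """Average the received local updates per network portion and apply them."""
        new_params = dict(global_params)
        for name, value in global_params.items():
            # Collect deltas only from users that sent an update for this portion.
            deltas = [update[name] for update in client_updates if name in update]
            if deltas:
                new_params[name] = value + np.mean(deltas, axis=0)
        return new_params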
-
Publication No.: US20230036702A1
Publication Date: 2023-02-02
Application No.: US17756957
Filing Date: 2020-12-14
Applicant: Qualcomm Technologies, Inc.
Inventor: Matthias REISSER , Max WELLING , Efstratios GAVVES , Christos LOUIZOS
IPC: G06N3/02
Abstract: Aspects described herein provide a method of processing data, including: receiving a set of global parameters for a plurality of machine learning models; processing data stored locally on a processing device with the plurality of machine learning models according to the set of global parameters to generate a machine learning model output; receiving, at the processing device, user feedback regarding the machine learning model output for the plurality of machine learning models; performing an optimization of the plurality of machine learning models based on the machine learning model output and the user feedback to generate locally updated machine learning model parameters; sending the locally updated machine learning model parameters to a remote processing device; and receiving a set of globally updated machine learning model parameters for the plurality of machine learning models.
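The client-side round described in this abstract can be sketched as follows, assuming PyTorch for the local optimization and treating the user feedback as a supervised training signal; the function name local_round and the feedback-derived targets are illustrative assumptions, and the communication with the remote processing device is omitted.

    # Hedged client-side sketch of one round: load the received global
    # parameters, generate outputs on locally stored data, optimize against
    # feedback-derived targets, and return the locally updated parameters.
    # Names are illustrative; sending/receiving parameters is omitted.
    from typing import Dict, List
    import torch
    import torch.nn as nn

    def local_round(
        models: List[nn.Module],
        global_params: List[Dict[str, torch.Tensor]],
        local_data: torch.Tensor,
        feedback_targets: torch.Tensor,
    ) -> List[Dict[str, torch.Tensor]]:
        locally_updated = []
        for model, params in zip(models, global_params):
            model.load_state_dict(params)               # apply received global parameters
            optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
            output = model(local_data)                  # model output presented to the user
            loss = nn.functional.mse_loss(output, feedback_targets)  # feedback as a training signal
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            locally_updated.append(model.state_dict())  # parameters to send to the remote device
        return locally_updated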
-