ASYNCHRONOUSLY TRAINING MACHINE LEARNING MODELS ACROSS CLIENT DEVICES FOR ADAPTIVE INTELLIGENCE

    Publication No.: US20190385043A1

    Publication Date: 2019-12-19

    Application No.: US16012356

    Filing Date: 2018-06-19

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that asynchronously train a machine learning model across client devices that implement local versions of the model while preserving client data privacy. To train the model across devices, in some embodiments, the disclosed systems send global parameters for a global machine learning model from a server device to client devices. A subset of the client devices uses local machine learning models corresponding to the global model and client training data to modify the global parameters. Based on those modifications, the subset of client devices sends modified parameter indicators to the server device for the server device to use in adjusting the global parameters. By utilizing the modified parameter indicators (and not client training data), in certain implementations, the disclosed systems accurately train a machine learning model without exposing training data from the client device.
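The abstract describes a federated-learning-style protocol: a server distributes global parameters, a subset of client devices trains local copies of the model on private data, and only modified parameter indicators (for example, parameter deltas) travel back to the server for aggregation. The sketch below illustrates that flow under simplifying assumptions and is not the patented method; the linear model, the local_sgd_step/client_update/server_round names, the delta-averaging rule, and the synchronous rounds (the patent emphasizes asynchronous training) are all illustrative choices.

# Minimal sketch (assumptions noted above) of server/client parameter exchange
# in which raw client data never leaves the client.
import numpy as np

rng = np.random.default_rng(0)


def local_sgd_step(params, features, labels, lr=0.1):
    """One gradient step of linear regression on a client's private data."""
    preds = features @ params
    grad = features.T @ (preds - labels) / len(labels)
    return params - lr * grad


def client_update(global_params, features, labels, steps=5):
    """Train a local copy of the global model and return only the parameter delta
    (the 'modified parameter indicator'); the training data stays on the client."""
    local = global_params.copy()
    for _ in range(steps):
        local = local_sgd_step(local, features, labels)
    return local - global_params


def server_round(global_params, client_datasets, subset_size=3, mix=0.5):
    """Sample a subset of clients and fold their averaged deltas into the global parameters."""
    chosen = rng.choice(len(client_datasets), size=subset_size, replace=False)
    deltas = [client_update(global_params, *client_datasets[i]) for i in chosen]
    return global_params + mix * np.mean(deltas, axis=0)


# Synthetic stand-in for private per-client data (never sent to the server).
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(10):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = server_round(w, clients)
print("estimated global weights:", w)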

    Asynchronously training machine learning models across client devices for adaptive intelligence

    Publication No.: US11593634B2

    Publication Date: 2023-02-28

    Application No.: US16012356

    Filing Date: 2018-06-19

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that asynchronously train a machine learning model across client devices that implement local versions of the model while preserving client data privacy. To train the model across devices, in some embodiments, the disclosed systems send global parameters for a global machine learning model from a server device to client devices. A subset of the client devices uses local machine learning models corresponding to the global model and client training data to modify the global parameters. Based on those modifications, the subset of client devices sends modified parameter indicators to the server device for the server device to use in adjusting the global parameters. By utilizing the modified parameter indicators (and not client training data), in certain implementations, the disclosed systems accurately train a machine learning model without exposing training data from the client device.
