FILTERING FOR MIXING SERVER-BASED AND FEDERATED LEARNING

    Publication No.: US20240330766A1

    Publication Date: 2024-10-03

    Application No.: US18609704

    Filing Date: 2024-03-19

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: A method includes receiving, from a client device, a client machine learning (ML) model and obtaining a set of training data including a plurality of training samples. The client ML model is trained locally on the client device. For each respective training sample in the plurality of training samples, the method also includes determining, using the respective training sample, a first loss of the client ML model; determining, using the respective training sample, a second loss of a server machine learning (ML) model; and determining a respective score based on the first loss and the second loss. The method also includes selecting, based on each respective score of each respective training sample in the plurality of training samples, a subset of training samples from the plurality of training samples and training the server ML model using the subset of training samples.
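    The per-sample selection described above can be sketched as follows. The abstract does not fix how the two losses combine into a score, so a simple loss gap (server loss minus client loss) and top-k selection are assumed here for illustration.

```python
import numpy as np

def score_samples(client_losses, server_losses):
    """Score each training sample by the gap between the server model's
    loss and the client model's loss on it. The loss-gap scoring rule is
    an assumption; the abstract only says the score is based on both losses."""
    return np.asarray(server_losses) - np.asarray(client_losses)

def select_subset(samples, client_losses, server_losses, k):
    """Keep the k highest-scoring samples for server-side training."""
    scores = score_samples(client_losses, server_losses)
    top_k = np.argsort(scores)[-k:]          # indices of the k largest scores
    return [samples[i] for i in sorted(top_k)]

# Toy example: four samples with losses from the client and server models.
samples = ["s0", "s1", "s2", "s3"]
client_losses = [0.1, 0.9, 0.2, 0.8]
server_losses = [0.5, 1.0, 0.9, 0.2]
subset = select_subset(samples, client_losses, server_losses, k=2)
```

    The samples where the server model's loss most exceeds the client model's loss are the ones the server model can still learn from, so only those are used to train it.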

    DECENTRALIZED LEARNING OF MACHINE LEARNING MODEL(S) THROUGH UTILIZATION OF STALE UPDATE(S) RECEIVED FROM STRAGGLER COMPUTING DEVICE(S)

    Publication No.: US20240095582A1

    Publication Date: 2024-03-21

    Application No.: US18075757

    Filing Date: 2022-12-06

    Applicant: GOOGLE LLC

    CPC classification number: G06N20/00

    Abstract: During a round of decentralized learning for updating of a global machine learning (ML) model, remote processor(s) of a remote system may transmit, to a population of computing devices, primary weights for a primary version of the global ML model, and cause each of the computing devices to generate a corresponding update for the primary version of the global ML model. Further, the remote processor(s) may cause the primary version of the global ML model to be updated based on the corresponding updates that are received during the round of decentralized learning. However, the remote processor(s) may receive other corresponding updates subsequent to the round of decentralized learning. Accordingly, various techniques described herein (e.g., FARe-DUST, FeAST on MSG, and/or other techniques) enable the other corresponding updates to be utilized in achieving a final version of the global ML model.
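    A minimal sketch of folding straggler updates into the global weights, assuming a staleness-discounted weighting; the named techniques (FARe-DUST, FeAST on MSG) define their own schemes, which the abstract does not detail.

```python
import numpy as np

def apply_updates(weights, updates, staleness, base_lr=1.0):
    """Fold client updates into the global weights, discounting each
    update by its staleness (rounds elapsed since it was issued).
    The 1/(1 + staleness) discount is an illustrative assumption."""
    weights = np.asarray(weights, dtype=float).copy()
    for update, s in zip(updates, staleness):
        weights += base_lr / (1.0 + s) * np.asarray(update, dtype=float)
    return weights

# One fresh update (staleness 0) and one straggler update (staleness 3):
# the straggler still contributes, just with a quarter of the weight.
w = apply_updates([0.0, 0.0], updates=[[1.0, 2.0], [4.0, -4.0]], staleness=[0, 3])
```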

    Mixed client-server federated learning of machine learning model(s)

    Publication No.: US11749261B2

    Publication Date: 2023-09-05

    Application No.: US17197954

    Filing Date: 2021-03-10

    Applicant: Google LLC

    CPC classification number: G10L15/065 G10L13/04 G10L15/26 G10L15/30

    Abstract: Implementations disclosed herein are directed to federated learning of machine learning (“ML”) model(s) based on gradient(s) generated at corresponding client devices and a remote system. Processor(s) of the corresponding client devices can process client data generated locally at the corresponding client devices using corresponding on-device ML model(s) to generate corresponding predicted outputs, generate corresponding client gradients based on the corresponding predicted outputs, and transmit the corresponding client gradients to the remote system. Processor(s) of the remote system can process remote data obtained from remote database(s) using global ML model(s) to generate additional corresponding predicted outputs, and generate corresponding remote gradients based on the additional corresponding predicted outputs. Further, the remote system can utilize the corresponding client gradients and the corresponding remote gradients to update the global ML model(s) or weights thereof. The updated global ML model(s) and/or the updated weights thereof can be transmitted back to the corresponding client devices.
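    The round described above, with gradients arriving from both client devices and server-side processing of remote data, can be sketched as a single aggregation step; plain gradient averaging and a vanilla SGD update are assumptions for illustration.

```python
import numpy as np

def federated_round(global_weights, client_grads, remote_grads, lr=0.1):
    """Pool gradients from client devices and from server-side processing
    of remote data, then apply one SGD step to the global model.
    Uniform averaging is assumed; other combinations are possible."""
    all_grads = [np.asarray(g, dtype=float) for g in client_grads + remote_grads]
    mean_grad = np.mean(all_grads, axis=0)
    return np.asarray(global_weights, dtype=float) - lr * mean_grad

updated = federated_round(
    global_weights=[1.0, 1.0],
    client_grads=[[2.0, 0.0], [0.0, 2.0]],   # computed on-device from client data
    remote_grads=[[4.0, 4.0]],               # computed server-side from remote data
)
```

    The updated weights (or the updated model itself) would then be transmitted back to the client devices for the next round.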

    Using corrections, of automated assistant functions, for training of on-device machine learning models

    Publication No.: US12272360B2

    Publication Date: 2025-04-08

    Application No.: US18657405

    Filing Date: 2024-05-07

    Applicant: GOOGLE LLC

    Abstract: Processor(s) of a client device can: receive sensor data that captures environmental attributes of an environment of the client device; process the sensor data using a machine learning model to generate a predicted output that dictates whether one or more currently dormant automated assistant functions are activated; make a decision as to whether to trigger the one or more currently dormant automated assistant functions; subsequent to making the decision, determine that the decision was incorrect; and, in response to determining that the decision was incorrect, generate a gradient based on comparing the predicted output to ground truth output. In some implementations, the generated gradient is used, by processor(s) of the client device, to update weights of the on-device speech recognition model. In some implementations, the generated gradient is additionally or alternatively transmitted to a remote system for use in remote updating of global weights of a global speech recognition model.
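    A minimal sketch of turning a user correction into a training gradient, as described above; the linear scorer and squared-error loss are assumptions, since the patent applies the idea to arbitrary on-device models.

```python
def correction_gradient(predicted, features, threshold=0.5):
    """If the assistant's trigger decision is later corrected, treat the
    opposite label as ground truth and compute the gradient of
    0.5 * (predicted - ground_truth)**2 w.r.t. the weights of an assumed
    linear scorer predicted = w . x."""
    triggered = predicted >= threshold
    ground_truth = 0.0 if triggered else 1.0  # the correction flips the decision
    error = predicted - ground_truth
    return [error * x for x in features]      # d(loss)/dw for a linear model

# The model triggered (0.8 >= 0.5) but the user's correction says it
# should not have, so ground truth is 0.0 and the gradient is nonzero.
grad = correction_gradient(predicted=0.8, features=[1.0, 2.0])
```

    The resulting gradient can update the on-device weights directly, be sent to a remote system for updating the global model, or both.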

    CO-DISTILLATION FOR MIXING SERVER-BASED AND FEDERATED LEARNING

    Publication No.: US20240330767A1

    Publication Date: 2024-10-03

    Application No.: US18611628

    Filing Date: 2024-03-20

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: A method includes training a client machine learning (ML) model on client training data at a client device. While training the client ML model, the method also includes obtaining, from a server, server model weights of a server ML model trained on server training data, the server training data being different from the client training data. While training the client ML model, the method also includes: transmitting, to the server, client model weights of the client ML model; updating the client ML model using the server model weights; obtaining, from the server, updated server model weights of the server ML model, the updated server model weights being updated based on the transmitted client model weights; and further updating the client ML model using the updated server model weights.
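    The interleaved exchange above can be sketched as one client-side step: apply a local gradient, then mix in the server weights received mid-training. The convex-combination mixing rule is an assumption; the abstract specifies the exchange pattern, not an update formula.

```python
import numpy as np

def co_training_step(client_w, server_w, client_grad, mix=0.1, lr=0.5):
    """One interleaved step of client training: take a local gradient
    step, then pull the client weights toward the server weights
    obtained mid-training. The (1 - mix)/mix convex combination is an
    illustrative assumption."""
    client_w = np.asarray(client_w, dtype=float) - lr * np.asarray(client_grad, dtype=float)
    return (1 - mix) * client_w + mix * np.asarray(server_w, dtype=float)

# Local gradient step followed by an even blend with the server weights.
w = co_training_step(client_w=[1.0, 1.0], server_w=[0.0, 2.0],
                     client_grad=[2.0, 0.0], mix=0.5)
```

    Symmetrically, the client transmits its own weights so the server can fold them into the server ML model, and the loop repeats with the updated server weights.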

    SYSTEM(S) AND METHOD(S) FOR JOINTLY LEARNING MACHINE LEARNING MODEL(S) BASED ON SERVER DATA AND CLIENT DATA

    Publication No.: US20230359907A1

    Publication Date: 2023-11-09

    Application No.: US17848947

    Filing Date: 2022-07-01

    Applicant: GOOGLE LLC

    CPC classification number: G06N5/022

    Abstract: Implementations disclosed herein are directed to various techniques for mitigating and/or preventing catastrophic forgetting in federated learning of global machine learning (ML) models. Implementations may identify a global ML model that is initially trained at a remote server based on a server data set, determine server-based data for global weight(s) of the global ML model, and transmit the global ML model and the server-based data to a plurality of client devices. The server-based data may include, for example, EWC loss term(s), client augmenting gradients, server augmenting gradients, and/or other server-based data. Further, the plurality of client devices may generate, based on processing corresponding predicted output and using the global ML model, and based on the server-based data, a corresponding client gradient, and transmit the corresponding client gradient to the remote server. Implementations may further generate an updated global ML model based on at least the corresponding client gradients.

    UTILIZING ELASTIC WEIGHT CONSOLIDATION (EWC) LOSS TERM(S) TO MITIGATE CATASTROPHIC FORGETTING IN FEDERATED LEARNING OF MACHINE LEARNING MODEL(S)

    Publication No.: US20230351246A1

    Publication Date: 2023-11-02

    Application No.: US17734766

    Filing Date: 2022-05-02

    Applicant: GOOGLE LLC

    CPC classification number: G06N20/00 H04L67/10

    Abstract: Implementations disclosed herein are directed to utilizing elastic weight consolidation (EWC) loss term(s) in federated learning of global machine learning (ML) models. Implementations may identify a global ML model that is initially trained at a remote server based on a server data set, determine the EWC loss term(s) for global weight(s) of the global ML model, and transmit the global ML model and the EWC loss term(s) to a plurality of client devices. The EWC loss term(s) may be determined based on a Fisher information matrix for the server data set. Further, the plurality of client devices may generate, based on processing corresponding predicted output and using the global ML model, and based on the EWC loss term(s), a corresponding client gradient, and transmit the corresponding client gradient to the remote server. Implementations may further generate an updated global ML model based on at least the corresponding client gradients.
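    The EWC penalty referenced above has a standard form: parameters with high Fisher information for the server data are anchored to their server-trained values, which mitigates catastrophic forgetting during client-side training. A diagonal Fisher approximation is assumed, as is conventional for EWC.

```python
import numpy as np

def ewc_loss(task_loss, weights, server_weights, fisher, lam=1.0):
    """Client-side training objective with an EWC penalty:
    task_loss + (lam / 2) * sum_i F_i * (w_i - w*_i)^2,
    where F is the (diagonal) Fisher information computed on the server
    data set and w* are the server-trained weights."""
    w = np.asarray(weights, dtype=float)
    w_star = np.asarray(server_weights, dtype=float)
    f = np.asarray(fisher, dtype=float)
    penalty = 0.5 * lam * np.sum(f * (w - w_star) ** 2)
    return task_loss + penalty

# The first weight matches its server value, so only the second one
# (which drifted by 2.0 with Fisher information 0.5) is penalized.
loss = ewc_loss(task_loss=1.0, weights=[1.0, 3.0],
                server_weights=[1.0, 1.0], fisher=[10.0, 0.5], lam=1.0)
```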

    MIXED CLIENT-SERVER FEDERATED LEARNING OF MACHINE LEARNING MODEL(S)

    Publication No.: US20220293093A1

    Publication Date: 2022-09-15

    Application No.: US17197954

    Filing Date: 2021-03-10

    Applicant: Google LLC

    Abstract: Implementations disclosed herein are directed to federated learning of machine learning (“ML”) model(s) based on gradient(s) generated at corresponding client devices and a remote system. Processor(s) of the corresponding client devices can process client data generated locally at the corresponding client devices using corresponding on-device ML model(s) to generate corresponding predicted outputs, generate corresponding client gradients based on the corresponding predicted outputs, and transmit the corresponding client gradients to the remote system. Processor(s) of the remote system can process remote data obtained from remote database(s) using global ML model(s) to generate additional corresponding predicted outputs, and generate corresponding remote gradients based on the additional corresponding predicted outputs. Further, the remote system can utilize the corresponding client gradients and the corresponding remote gradients to update the global ML model(s) or weights thereof. The updated global ML model(s) and/or the updated weights thereof can be transmitted back to the corresponding client devices.
