Heterogeneous Federated Learning Via Multi-Directional Knowledge Distillation

    Publication number: US20240249193A1

    Publication date: 2024-07-25

    Application number: US18417947

    Filing date: 2024-01-19

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: Generally, the present disclosure is directed to enhanced federated learning (FL) across a set of clients with varying amounts of computational resources (e.g., system memory, storage, and processing bandwidth). To overcome the limitations of conventional FL methods on such heterogeneous clients, the embodiments run multi-directional knowledge distillation between the server models produced by each federated averaging (FedAvg) pool, using unlabeled server data as the distillation dataset. By co-distilling the two (or more) models frequently over the course of FedAvg rounds, information is shared between the pools without sharing model parameters. This leads to increased performance and faster convergence (in fewer federated rounds).
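The co-distillation step the abstract describes can be illustrated with a minimal numpy sketch. This is not the patented implementation: the linear models, the function names (`codistill_step`, `softmax`), the temperature, and the learning rate are all illustrative assumptions. It shows the core idea only: two server models, each aggregated from a different FedAvg pool, take gradient steps toward each other's softened predictions on a shared unlabeled dataset, so knowledge flows in both directions without either pool seeing the other's parameters.

```python
import numpy as np

def softmax(logits, temp=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temp
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def codistill_step(w_a, w_b, x_unlabeled, lr=0.5, temp=2.0):
    """One mutual-distillation step between two linear server models
    (an illustrative stand-in for the pool models in the abstract).

    Each model moves toward the other's softened predictions on the
    unlabeled distillation set, so information is exchanged without
    sharing model parameters."""
    p_a = softmax(x_unlabeled @ w_a, temp)
    p_b = softmax(x_unlabeled @ w_b, temp)
    n = x_unlabeled.shape[0]
    # Gradient of the soft cross-entropy CE(p_b, p_a) w.r.t. w_a,
    # and symmetrically for w_b; the 1/temp comes from the chain rule.
    g_a = x_unlabeled.T @ (p_a - p_b) / (n * temp)
    g_b = x_unlabeled.T @ (p_b - p_a) / (n * temp)
    return w_a - lr * g_a, w_b - lr * g_b

# Demo: two randomly initialised "pool" models co-distilled for 100 rounds.
rng = np.random.default_rng(0)
x_pub = rng.normal(size=(64, 8))          # unlabeled server-side data
w_small = rng.normal(size=(8, 4)) * 0.5   # model from the low-resource pool
w_large = rng.normal(size=(8, 4)) * 0.5   # model from the high-resource pool

disagreement_before = np.abs(
    softmax(x_pub @ w_small) - softmax(x_pub @ w_large)).mean()
for _ in range(100):
    w_small, w_large = codistill_step(w_small, w_large, x_pub)
disagreement_after = np.abs(
    softmax(x_pub @ w_small) - softmax(x_pub @ w_large)).mean()
```

In a full FL loop, each `codistill_step` would be interleaved with several FedAvg rounds per pool; here the demo only checks that mutual distillation drives the two models' predictions together on the distillation set.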
