Decentralized Deep Learning under Distributed Concept Drift: A Novel Approach to Dealing with Changes in Data Distributions Over Clients and Over Time
Overview
- Date: 31 May 2023, 10:00–11:00
- Location: MV:L15, Chalmers tvärgata 3
- Language: English
In decentralized deep learning, clients train local models by sharing model parameters, rather than data, in a peer-to-peer fashion. In this setting, variations in data distributions across clients have been studied extensively, but variations over time have received no attention. This project addresses decentralized learning where the data distributions vary both across clients and over time. We propose a novel algorithm that adapts to the evolving concepts in the network without any prior knowledge or estimation of the number of concepts, together with a novel aggregation method based on client similarity. The algorithm is evaluated on standard benchmarks adapted to the temporal setting, where it outperforms previous methods for decentralized learning.
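As a rough illustration of the similarity-based aggregation idea, the sketch below weights each peer's parameters by their similarity to the local model before averaging. This is a minimal sketch under assumptions, not the announced algorithm: the function names, the use of cosine similarity over parameter vectors, and the softmax weighting are all illustrative choices (the actual method could, for instance, measure similarity via peers' performance on local data).

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two flattened parameter vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def similarity_weighted_aggregate(local_params: np.ndarray,
                                  peer_params: list[np.ndarray],
                                  temperature: float = 1.0) -> np.ndarray:
    """Aggregate peer parameters, weighting each peer by a softmax over
    its cosine similarity to the local model (hypothetical scheme)."""
    sims = np.array([cosine_similarity(local_params, p) for p in peer_params])
    weights = np.exp(sims / temperature)
    weights /= weights.sum()
    stacked = np.stack(peer_params)   # shape: (num_peers, num_params)
    return weights @ stacked          # similarity-weighted average

# Toy usage: peers whose parameters stay close to the local model
# (and so, presumably, share its current concept) receive more weight.
rng = np.random.default_rng(0)
local = rng.normal(size=10)
peers = [local + rng.normal(scale=s, size=10) for s in (0.1, 0.5, 2.0)]
aggregated = similarity_weighted_aggregate(local, peers)
```

The softmax temperature controls how sharply the aggregation concentrates on the most similar peers, which is one plausible way to keep clients holding different concepts from averaging each other's models away.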