Martin Gullbrandson and Oscar Johansson, Electrical Engineering

Title: Posterior Estimation in Federated Learning
Examiner: Lennart Svensson
Federated learning is an emerging field within machine learning that aims to distribute the learning task. The idea is to train models locally on edge devices and aggregate these local models into a global one across a network of devices. Keeping data on-device is thought to both increase end-user privacy and decrease communication costs, since raw data is never communicated. With the data inaccessible, the learning task is considerably harder than in the centralized case, and it is further complicated because data is usually not independent and identically distributed (non-IID) across edge devices.

This thesis explores how a Bayesian perspective can account for the inherent model variance when aggregating different locally trained models, specifically in non-IID environments. This is done by evaluating algorithms that take advantage of a probabilistic perspective, such as FedPA [1], alongside a standard algorithm, FedAvg [2]. Further, we propose a novel lightweight approach that uses kernel density estimation to estimate a posterior distribution and mean-shift for model inference. We call the algorithm federated kernel posterior (FedKP) and show that it outperforms the other algorithms on benchmark datasets while requiring no extra information and no hyperparameter optimization.
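To illustrate the core idea behind the approach described above, the following is a minimal conceptual sketch (not the thesis's actual implementation): locally trained model parameter vectors are treated as samples from a posterior, a Gaussian kernel density estimate is placed over them, and mean-shift iterations move toward a mode of that density, which serves as the aggregated global model. The function name, bandwidth value, and toy data are illustrative assumptions.

```python
import numpy as np

def mean_shift_mode(samples, bandwidth=1.0, iters=100, tol=1e-8):
    """Find a mode of a Gaussian KDE over `samples` via mean-shift.

    samples: (n, d) array, each row a locally trained parameter vector.
    Returns a d-vector estimate of the posterior mode.
    """
    x = samples.mean(axis=0)  # start from the average (FedAvg-style init)
    for _ in range(iters):
        # Gaussian kernel weight of each sample w.r.t. the current point
        d2 = np.sum((samples - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        # Mean-shift update: kernel-weighted average of the samples
        x_new = (w[:, None] * samples).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy example: three "client" models clustered near [1, 1] and one outlier.
# Mean-shift pulls the aggregate toward the dense cluster, whereas a plain
# average (FedAvg) would be dragged toward the outlier.
clients = np.array([[0.9, 1.1], [1.1, 0.9], [1.0, 1.0], [3.0, 3.0]])
mode = mean_shift_mode(clients, bandwidth=0.5)
```

The contrast with the plain average is the point of the sketch: `clients.mean(axis=0)` is `[1.5, 1.5]`, while the mean-shift mode lands near the `[1, 1]` cluster.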

Martin, Oscar and Lennart
Join the seminar via Zoom
Category: Student project
Location: Analysen, meeting room, Rännvägen 6B, EDIT building, stairwells D, E and F
Time: 2022-06-03 10:00
End time: 2022-06-03 11:00

Published: Fri 03 Jun 2022.