1. Flash talk – Presentation of the MACBES Team
2. Scientific talk by Riccardo Taiello (EPIONE team)
Let Them Drop: Scalable and Efficient Federated Learning Solutions Agnostic to Client Failures
Abstract: Secure model aggregation is now recognized as a key component of Federated Learning (FL). It enables the collaborative training of a global machine learning model without leaking any information about FL clients’ local models. Clients who fail to complete the protocol, referred to as dropped clients, can seriously affect the correct computation of the global model. While the literature counts numerous fault-tolerant secure aggregation protocols that use secret sharing to reconstruct the inputs of dropped clients, the performance of these solutions degrades as the dropout rate increases. In this paper, we propose Eagle, a fault-tolerant secure aggregation solution that is agnostic to client failures and therefore outperforms existing solutions. Eagle is inherently compatible with realistic FL schemes that implement client selection. Furthermore, existing state-of-the-art solutions usually apply to basic FL settings in which all clients are synchronized; we therefore also propose Owl, a secure aggregation solution suited to the asynchronous setting. We have implemented both solutions and show that: (i) in a cross-device scenario with realistic dropout rates, Eagle outperforms the best secure aggregation solution for synchronous FL, namely Flamingo, by approximately ×4; (ii) in the asynchronous setting, Owl exhibits the best performance in all scenarios compared to the state-of-the-art solution LightSecAgg (by at least ×10). During the final five minutes of the presentation, we will showcase a brief demonstration of a practical implementation of secure aggregation for federated learning within the Fed-BioMed framework.
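To give a flavor of the problem the talk addresses, the following is a minimal, hypothetical sketch (not the Eagle or Owl protocol) of pairwise-mask secure aggregation: each client hides its update behind masks derived from seeds shared with every other client, and the masks cancel when the server sums all contributions. If a client drops out, its masks no longer cancel, which is why fault-tolerant protocols must reconstruct or remove them. All names and values here are illustrative.

```python
import random

def masked_updates(updates, seeds):
    """Mask each client's update with pairwise cancelling masks.

    `updates` maps client id -> integer model update (toy scalar).
    `seeds[i][j]` (with i < j) is a seed shared between clients i and j.
    """
    masked = {}
    for i, x in updates.items():
        m = x
        for j in updates:
            if j == i:
                continue
            rng = random.Random(seeds[min(i, j)][max(i, j)])
            mask = rng.randrange(1 << 16)
            # The client with the smaller id adds the mask, the other
            # subtracts it, so every pairwise mask cancels in the sum.
            m += mask if i < j else -mask
        masked[i] = m
    return masked

updates = {0: 5, 1: 7, 2: 3}
seeds = {0: {1: 42, 2: 43}, 1: {2: 44}}
aggregate = sum(masked_updates(updates, seeds).values())
# The server learns only the sum of the updates, not any individual one.
assert aggregate == sum(updates.values())
```

Note that if client 2 dropped after masking, the server's sum would still contain client 2's uncancelled masks toward clients 0 and 1; secret-sharing-based protocols exist precisely to recover those masks, at a cost that grows with the dropout rate.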