Initialisation and network effects in decentralised federated learning
Title / Series / Name
Applied Network Science
Publication Volume
10
Publication Issue
1
Keywords
Complex networks
Federated learning
Gossip protocols
Random walks
Multidisciplinary
Computer Networks and Communications
Computational Mathematics
URI
https://hdl.handle.net/20.500.14018/28671
Abstract
Fully decentralised federated learning enables collaborative training of individual machine learning models on a distributed network of communicating devices while keeping the training data localised on each node. This approach avoids central coordination, enhances data privacy and eliminates the risk of a single point of failure. Our research highlights that the effectiveness of decentralised federated learning is significantly influenced by the network topology of connected devices and the initial conditions of the learning models. We propose a strategy for uncoordinated initialisation of the artificial neural networks based on the distribution of eigenvector centralities of the underlying communication network, leading to a radically improved training efficiency. Additionally, our study explores the scaling behaviour and the choice of environmental parameters under our proposed initialisation strategy. This work paves the way for more efficient and scalable artificial neural network training in a distributed and uncoordinated environment, offering a deeper understanding of the intertwining roles of network structure and learning dynamics.
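The abstract proposes initialising each node's model based on the eigenvector centralities of the communication network. The paper's exact mapping from centrality to initial weights is not given here, so the following is only a minimal sketch: it computes eigenvector centralities by power iteration (pure Python, no external libraries) and derives a hypothetical per-node weight scale from them.

```python
# Sketch: eigenvector centrality via power iteration, then a
# HYPOTHETICAL per-node initialisation scale derived from it.
# The centrality-to-scale mapping below is an illustrative assumption,
# not the scheme from the paper.

def eigenvector_centrality(adj, iters=200, tol=1e-10):
    """Power iteration on an undirected graph given as {node: [neighbours]}."""
    nodes = list(adj)
    x = {v: 1.0 / len(nodes) for v in nodes}
    for _ in range(iters):
        # Multiply by the adjacency matrix: sum neighbour scores.
        nxt = {v: sum(x[u] for u in adj[v]) for v in nodes}
        norm = sum(val * val for val in nxt.values()) ** 0.5
        nxt = {v: val / norm for v, val in nxt.items()}
        if max(abs(nxt[v] - x[v]) for v in nodes) < tol:
            x = nxt
            break
        x = nxt
    return x

# Toy communication network: node 0 is a hub on a small ring.
adj = {
    0: [1, 2, 3, 4],
    1: [0, 2],
    2: [0, 1, 3],
    3: [0, 2, 4],
    4: [0, 3],
}
c = eigenvector_centrality(adj)

# Hypothetical rule: more central nodes get a smaller initial weight
# standard deviation, so hubs start closer to zero.
scales = {v: 0.1 / (1.0 + c[v]) for v in c}
```

In an uncoordinated setting each node would only need its own centrality estimate (obtainable by gossip) rather than the full graph; the global computation above is used purely for illustration.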
Type
Journal article
Date
2025-10-30
Identifiers
10.1007/s41109-025-00737-4