Coordination-free decentralised federated learning in pervasive networks: Overcoming heterogeneity
Valerio, Lorenzo ; Boldrini, Chiara ; Passarella, Andrea ; Kertész, János ; Karsai, Márton ; Iñiguez, Gerardo
Title / Series / Name
Pervasive and Mobile Computing
Publication Volume
118
Keywords
Data heterogeneity
Decentralised Federated Learning
Deep neural networks
Heterogeneous model initialization
Pervasive Networks
Software
Information Systems
Hardware and Architecture
Computer Science Applications
Computer Networks and Communications
Files
Kertesz-Janos_2026.pdf
Adobe PDF, 11.22 MB
URI
https://hdl.handle.net/20.500.14018/28830
Abstract
Fully decentralised federated learning enables collaborative model training among edge devices without relying on a central coordinator, thereby avoiding single points of failure and supporting spontaneous collaboration in pervasive environments. However, the absence of coordination introduces challenges that go beyond data heterogeneity alone. In realistic decentralised settings, devices often start from different model initializations, possess limited and non-IID local data, and interact over unstructured communication graphs, making naive parameter averaging ineffective and potentially destructive. In this paper, we address decentralised learning under combined data and initial model heterogeneity by proposing DecDiff+VT, a coordination-free decentralised learning algorithm specifically designed for such environments. DecDiff+VT integrates two complementary mechanisms: DecDiff, a disruption-aware aggregation strategy that updates local models towards their neighborhood average with a magnitude inversely proportional to model disagreement, and a lightweight virtual teacher (VT) mechanism based on soft-label regularization to improve local generalization in the absence of strong or centralized teacher models. Extensive experiments on image classification and activity recognition benchmarks (MNIST, Fashion-MNIST, EMNIST, CIFAR-10, and UCI-HAR) show that DecDiff+VT consistently outperforms or matches state-of-the-art decentralised baselines, achieving faster convergence, improved generalization, and greater robustness to overfitting, without incurring additional communication or memory overhead compared to standard decentralised averaging.
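The abstract describes two mechanisms without giving formulas: a disruption-aware aggregation step whose magnitude is inversely proportional to model disagreement, and a virtual-teacher soft-label regularizer. A minimal sketch of one plausible reading of these two ideas follows; the exact step-size schedule, the disagreement metric, and the teacher distribution (here: probability `tau` on the true class, remainder spread uniformly) are assumptions, not the authors' actual formulation (see the paper at the DOI below for that).

```python
import numpy as np

def decdiff_aggregate(local, neighbors):
    """Disruption-aware aggregation (assumed form): move the local
    parameter vector towards the neighbourhood average, with a step
    size that shrinks as disagreement with the neighbours grows."""
    avg = np.mean(neighbors, axis=0)
    # Disagreement: mean Euclidean distance to the neighbour models.
    disagreement = np.mean([np.linalg.norm(local - n) for n in neighbors])
    # Step size inversely related to disagreement, capped at 1.
    alpha = 1.0 / (1.0 + disagreement)
    return local + alpha * (avg - local)

def virtual_teacher_loss(probs, labels, num_classes, tau=0.9):
    """Soft-label regularisation against a 'virtual teacher' (assumed
    form): the teacher puts probability tau on the true class and
    spreads the rest uniformly; the student's predicted distribution
    is matched to it via cross-entropy."""
    n = len(labels)
    soft = np.full((n, num_classes), (1.0 - tau) / (num_classes - 1))
    soft[np.arange(n), labels] = tau
    return -np.mean(np.sum(soft * np.log(probs + 1e-12), axis=1))
```

With identical local and neighbour models the disagreement is zero, so the update reduces to plain averaging (a full step to the average, which is itself); as models drift apart the step shrinks, which is one way to make naive averaging less destructive under heterogeneous initialisations.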
Type
Journal article
Date
2026-02-12
Identifiers
DOI: 10.1016/j.pmcj.2026.102184