Energy-Aware Decentralized Learning with Intermittent Model Training

May 1, 2024
Martijn De Vos
,
Akash Dhasade
,
Paolo Dini
,
Elia Guerra
,
Anne-Marie Kermarrec
,
Marco Miozzo
,
Rafael Pires
,
Rishi Sharma
Abstract
SkipTrain is a novel Decentralized Learning (DL) algorithm that minimizes energy consumption by strategically skipping some training rounds and substituting them with synchronization rounds. These training-silent periods not only save energy but also allow models to mix better, yielding higher accuracy than typical DL algorithms. Our empirical evaluations with 256 nodes demonstrate that SkipTrain reduces energy consumption by 50% and increases model accuracy by up to 12% compared to D-PSGD, the conventional DL algorithm.
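
Below is a minimal Python sketch of the idea described in the abstract: every round performs model mixing (synchronization), while only some rounds additionally perform local training. The function names, the uniform mixing matrix, and the alternating schedule are illustrative assumptions and not the paper's actual implementation.

```python
import numpy as np

def local_sgd_step(model, lr=0.01):
    # Placeholder for one local training step on a node's private data;
    # a random vector stands in for a real gradient.
    grad = 0.01 * np.random.randn(*model.shape)
    return model - lr * grad

def gossip_mix(models, mixing_matrix):
    # Synchronization round: each node averages its model with neighbors
    # according to a (doubly stochastic) mixing matrix.
    return mixing_matrix @ models

def skiptrain_round(models, mixing_matrix, is_training_round):
    # Training rounds do local SGD followed by mixing;
    # "skipped" (training-silent) rounds only mix, saving compute energy.
    if is_training_round:
        models = np.stack([local_sgd_step(m) for m in models])
    return gossip_mix(models, mixing_matrix)

# Toy run: 4 nodes, 8-dimensional models, alternating schedule (hypothetical).
n_nodes, dim = 4, 8
models = np.random.randn(n_nodes, dim)
W = np.full((n_nodes, n_nodes), 1.0 / n_nodes)  # fully connected uniform mixing
for is_training in [True, False, True, False]:
    models = skiptrain_round(models, W, is_training)
```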
Type
Publication
2024 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW)