Multi-layer Coordination for High-Performance Energy-Efficient Federated Learning
Li, Li (1); Wang, Jun (2); Chen, Xu (3); Xu, Cheng Zhong (4)
Conference Name: IEEE/ACM 28th International Symposium on Quality of Service (IWQoS)
Source Publication: 2020 IEEE/ACM 28th International Symposium on Quality of Service, IWQoS 2020
Conference Date: June 15-17, 2020
Conference Place: Hangzhou, China

Federated Learning is designed for multiple mobile devices to collaboratively train an artificial intelligence model while preserving data privacy. Instead of collecting the raw training data from mobile devices to the cloud, Federated Learning coordinates a group of devices to train a shared model in a distributed manner, with the training data remaining on the devices. However, to deploy Federated Learning effectively on resource-constrained mobile devices, several critical issues, including convergence rate, scalability, and energy efficiency, must be addressed. In this paper, we propose MCFL, a multi-layer online coordination framework for high-performance, energy-efficient federated learning. MCFL consists of two layers: a macro-layer on the central server and a micro-layer on each participating device. In each training round, the macro coordinator performs two tasks: selecting the right devices to participate and estimating a time limit, such that the overall training time is significantly reduced while the model accuracy is still guaranteed. Unlike existing systems, MCFL removes the restriction that participating devices must be connected to power sources, allowing more timely and ubiquitous training. This clearly requires on-device training to be highly energy-efficient. To this end, the micro coordinator determines optimal schedules for hardware resources in order to meet the time limit set by the macro coordinator with the least amount of energy consumption. Tested on real devices as well as a simulation testbed, MCFL is shown to effectively balance the convergence rate, model accuracy, and energy efficiency. Compared with existing systems, MCFL achieves a speedup of up to 8.66× and reduces energy consumption by up to 76.5% during training.
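The abstract describes a two-layer loop: a macro coordinator on the server selects devices and sets a per-round time limit, and a micro coordinator on each device schedules hardware resources to meet that limit with minimal energy. The sketch below is purely illustrative and not the paper's algorithm: the selection rule (pick the fastest devices), the deadline estimate, and the DVFS-style energy model (power proportional to frequency squared) are all simplified placeholder assumptions.

```python
# Illustrative sketch only: the abstract does not give MCFL's actual
# algorithms, so the selection rule, deadline estimate, and frequency/energy
# model below are simplified placeholder assumptions.

def macro_coordinate(devices, k):
    """Macro layer (server): pick the k fastest devices, set a round deadline."""
    chosen = sorted(devices, key=lambda d: d["est_train_time"])[:k]
    deadline = max(d["est_train_time"] for d in chosen)  # slowest selected device
    return chosen, deadline

def micro_coordinate(device, deadline, freq_levels):
    """Micro layer (device): lowest CPU frequency that still meets the deadline.

    Assumes training time scales inversely with (normalized) frequency and
    power grows with the square of frequency, a common DVFS approximation.
    """
    base_time = device["est_train_time"]           # time at the highest frequency
    feasible = [f for f in freq_levels if base_time / f <= deadline]
    f = min(feasible) if feasible else max(freq_levels)
    energy = (base_time / f) * f**2                # energy = time * power, power ~ f^2
    return f, energy

# Hypothetical per-round profile: estimated training times at full frequency.
devices = [
    {"id": 0, "est_train_time": 4.0},
    {"id": 1, "est_train_time": 6.0},
    {"id": 2, "est_train_time": 9.0},
]
chosen, deadline = macro_coordinate(devices, k=2)
for d in chosen:
    f, e = micro_coordinate(d, deadline, freq_levels=[0.5, 0.75, 1.0])
    print(f"device {d['id']}: freq {f}, energy {e:.2f}")
```

With these toy numbers, the faster device can downclock to save energy while still finishing within the deadline set by the slower one, which is the intuition behind coordinating the two layers online each round.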

Indexed By: CPCI-S
WOS Research Area: Computer Science; Engineering
WOS ID: WOS:000629047500013
Scopus ID: 2-s2.0-85094808900
Citation Statistics
Cited Times [WOS]: 0
Document Type: Conference paper
Faculty of Science and Technology
Affiliations:
1. Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, China
2. Futurewei Technology, United States
3. Sun Yat-sen University, China
4. State Key Lab of IoTSC, University of Macau, Macao
Recommended Citation
GB/T 7714:
Li, Li, Wang, Jun, Chen, Xu, et al. Multi-layer Coordination for High-Performance Energy-Efficient Federated Learning[C], 2020.
APA:
Li, Li, Wang, Jun, Chen, Xu, & Xu, Cheng Zhong. (2020). Multi-layer Coordination for High-Performance Energy-Efficient Federated Learning. 2020 IEEE/ACM 28th International Symposium on Quality of Service, IWQoS 2020.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.