Top Research Papers on Federated Learning
Dive into the world of Federated Learning with our curated list of top research papers. These studies offer insights into the latest advancements, challenges, and solutions in decentralized machine learning. Stay ahead in the field by exploring these pioneering works.
A survey on federated learning
1583 Citations · 2021 · Chen Zhang, Yu Xie, Hang Bai + 3 more
Knowledge-Based Systems
Federated learning is a setting in which multiple clients collaborate to solve machine learning problems under the coordination of a central aggregator. This setting also keeps the training data decentralized, ensuring the data privacy of each device. Federated learning rests on two major ideas: local computing and model transmission, which reduce some of the systematic privacy risks and costs of traditional centralized machine learning methods. The original data of each client is stored locally and cannot be exchanged or migrated. With the application of federated learning, each ...
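The local-computing-plus-model-transmission loop described above can be sketched in a few lines. This is an illustrative toy, not any paper's reference implementation: the model is a single scalar weight, "training" is one gradient step on synthetic least-squares data, and all function names are hypothetical.

```python
# Minimal federated-averaging sketch: each client trains on its private
# data; only model parameters travel to the central aggregator.

def local_update(weights, data, lr=0.1):
    """One gradient step of 0.5*(w*x - y)^2 on the client's private data."""
    w = weights[:]
    for x, y in data:
        grad = (w[0] * x - y) * x
        w[0] -= lr * grad
    return w

def server_aggregate(client_weights, client_sizes):
    """Weighted average of client models (the central aggregator's job)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(cw[i] * n for cw, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients whose private data never leaves the device; both datasets
# are consistent with the true weight w = 2.
global_model = [0.0]
clients = [[(1.0, 2.0)], [(2.0, 4.0)]]
for _ in range(50):
    updates = [local_update(global_model, d) for d in clients]
    global_model = server_aggregate(updates, [len(d) for d in clients])
```

After 50 rounds the shared model converges toward the common optimum even though the server never sees either client's raw data.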
Vulnerabilities in Federated Learning
214 Citations · 2021 · Nader Bouacida, Prasant Mohapatra
IEEE Access
A comprehensive survey of the unique security vulnerabilities exposed by the FL ecosystem is provided, highlighting the vulnerabilities sources, key attacks on FL, defenses, as well as their unique challenges, and discussing promising future research directions towards more robust FL.
Recent Advances on Federated Learning for Cybersecurity and Cybersecurity for Federated Learning for Internet of Things
422 Citations · 2022 · Bimal Ghimire, Danda B. Rawat
IEEE Internet of Things Journal
A background and comparison of centralized learning, distributed on-site learning, and FL, which is then followed by a survey of the application of FL to cybersecurity for IoT, so readers can have a more thorough understanding of FL for cybersecurity as well as cybersecurity for FL, different security attacks, and countermeasures.
On Demand Fog Federations for Horizontal Federated Learning in IoV
100 Citations · 2022 · Ahmad Hammoud, Hadi Otrok, Azzam Mourad + 1 more
IEEE Transactions on Network and Service Management
A horizontal-based federated learning architecture, empowered by fog federations, devised for the mobile environment is proposed and results show that the proposed model can achieve better accuracy and quality of service than other models presented in the literature.
Federated Class-Incremental Learning
180 Citations · 2022 · Jiahua Dong, Lixu Wang, Zhen Fang + 4 more
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
A novel Global-Local Forgetting Compensation (GLFC) model is developed, to learn a global class-incremental model for alleviating the catastrophic forgetting from both local and global perspectives, and a prototype gradient-based communication mechanism is developed to protect the privacy.
Personalized Federated Learning: A Meta-Learning Approach
361 Citations · 2020 · Alireza Fallah, Aryan Mokhtari, Asuman Ozdaglar
arXiv (Cornell University)
A personalized variant of the well-known Federated Averaging algorithm is studied, and its performance is characterized by the closeness of the underlying distributions of user data, measured in terms of distribution distances such as the Total Variation and 1-Wasserstein metrics.
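The meta-learning flavor of personalized Federated Averaging can be illustrated on scalar quadratic losses: the server seeks an initialization that performs well *after* each client takes one local adaptation step, and each client then personalizes from that initialization. The toy losses f_k(w) = 0.5*(w - c_k)^2 and all names below are assumptions for illustration, not the paper's algorithm verbatim.

```python
# MAML-style personalized FL sketch on scalar quadratic client losses.

def grad(w, c):
    """Gradient of f(w) = 0.5*(w - c)^2."""
    return w - c

def meta_gradient(w, c, alpha=0.1):
    """Gradient of f(w - alpha*f'(w)): differentiate through the
    client's one-step local adaptation."""
    adapted = w - alpha * grad(w, c)
    return (1 - alpha) * grad(adapted, c)

centers = [-1.0, 3.0]          # two clients with different local optima
w = 0.0
for _ in range(200):           # server-side meta-updates
    g = sum(meta_gradient(w, c) for c in centers) / len(centers)
    w -= 0.05 * g

# Each client personalizes with one adaptation step from the shared w.
personalized = [w - 0.1 * grad(w, c) for c in centers]
```

The shared initialization lands between the clients' optima, and each personalized model moves toward its own client's optimum.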
A Learning-Based Incentive Mechanism for Federated Learning
555 Citations · 2020 · Yufeng Zhan, Peng Li, Zhihao Qu + 2 more
IEEE Internet of Things Journal
The incentive mechanism for federated learning to motivate edge nodes to contribute to model training is studied, and a deep reinforcement learning (DRL)-based incentive mechanism is designed to determine the optimal pricing strategy for the parameter server and the optimal training strategies for edge nodes.
Federated Learning: Opportunities and Challenges
144 Citations · 2021 · Priyanka Mary Mammen
arXiv (Cornell University)
While FL appears to be a promising Machine Learning technique to keep the local data private, it is also vulnerable to attacks like other ML models.
Federated learning for drone authentication
109 Citations · 2021 · Abbas Yazdinejad, Reza M. Parizi, Ali Dehghantanha + 1 more
Ad Hoc Networks
Experimental results show that the federated drone authentication model gains a high true positive rate (TPR) during drone authentication and better performance compared to other ML-based models.
Federated Learning with Fair Averaging
116 Citations · 2021 · Zheng Wang, Xiaoliang Fan, Jianzhong Qi + 3 more
journal unavailable
This work proposes the federated fair averaging (FedFV) algorithm to mitigate potential conflicts among clients before averaging their gradients, and establishes the theoretical foundation of FedFV to mitigate the issue of conflicting gradients and converge to Pareto stationary solutions.
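The conflict-mitigation step can be sketched with a generic gradient-surgery rule: when two client gradients have a negative inner product, project one onto the normal plane of the other before averaging. FedFV's exact client ordering and fairness weighting differ from this minimal sketch; the helper names are illustrative.

```python
# Resolve pairwise gradient conflicts before server-side averaging.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_out_conflict(g, h):
    """If g conflicts with h (negative inner product), remove from g
    its component along h; otherwise return g unchanged."""
    if dot(g, h) >= 0:
        return g
    coef = dot(g, h) / dot(h, h)
    return [gi - coef * hi for gi, hi in zip(g, h)]

g1 = [1.0, 0.0]
g2 = [-1.0, 1.0]                       # conflicts with g1 (dot = -1)
g2_fixed = project_out_conflict(g2, g1)
avg = [(a + b) / 2 for a, b in zip(g1, g2_fixed)]
```

After projection the two gradients are orthogonal rather than opposed, so neither client's update cancels the other's.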
A review of applications in federated learning
1331 Citations · 2020 · Li Li, Yuxi Fan, Ying Kei Tse + 1 more
Computers & Industrial Engineering
This study reviews FL, explores the main evolution path of issues that arise in the FL development process, and identifies six research fronts in the FL literature to help advance the understanding of FL for future optimization.
Federated Learning with Quantization Constraints
106 Citations · 2020 · Nir Shlezinger, Mingzhe Chen, Yonina C. Eldar + 2 more
journal unavailable
This work identifies the unique characteristics associated with conveying trained models over rate-constrained channels, characterizes a suitable quantization scheme for such setups, and shows that combining universal vector quantization methods with FL yields a decentralized training system that is both efficient and feasible.
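The rate-constrained idea can be illustrated with simple uniform scalar quantization standing in for the universal vector quantizers analyzed in the paper: clients send low-bit codes for their weight deltas, and the server dequantizes before averaging. All names and the 4-bit choice below are illustrative assumptions.

```python
# Quantize a client's model delta to 2^bits levels for transmission.

def quantize(delta, bits=4):
    """Map each float to an integer code over the delta's value range."""
    levels = 2 ** bits - 1
    lo, hi = min(delta), max(delta)
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = [round((d - lo) / scale) for d in delta]
    return codes, lo, scale          # codes + tiny header go on the wire

def dequantize(codes, lo, scale):
    return [lo + c * scale for c in codes]

delta = [0.03, -0.12, 0.50, 0.0]
codes, lo, scale = quantize(delta)
recovered = dequantize(codes, lo, scale)
max_err = max(abs(a - b) for a, b in zip(delta, recovered))
```

Each coordinate now costs 4 bits instead of 32 or 64, at the price of a bounded per-coordinate error of at most half a quantization step.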
Federated Learning for Internet of Things
110 Citations · 2021 · Tuo Zhang, Chaoyang He, Tianhao Ma + 3 more
journal unavailable
The proposed FedDetect learning framework improves the performance by utilizing a local adaptive optimizer and a cross-round learning rate scheduler, and the system efficiency analysis indicates that both end-to-end training time and memory cost are affordable and promising for resource-constrained IoT devices.
Towards Personalized Federated Learning
913 Citations · 2022 · Alysa Ziying Tan, Han Yu, Lizhen Cui + 1 more
IEEE Transactions on Neural Networks and Learning Systems
This survey explores the domain of personalized FL (PFL) to address the fundamental challenges of FL on heterogeneous data, a universal characteristic inherent in all real-world datasets.
A Taxonomy of Attacks on Federated Learning
197 Citations · 2020 · Malhar Jere, Tyler Farnan, Farinaz Koushanfar
IEEE Security & Privacy
A taxonomy of recent attacks on federated learning systems is provided and the need for more robust threat modeling in Federated learning environments is detailed.
Adaptive Personalized Federated Learning
337 Citations · 2020 · Yuyang Deng, Mohammad Mahdi Kamani, Mehrdad Mahdavi
arXiv (Cornell University)
Information-theoretically, it is proved that a mixture of local and global models can reduce the generalization error, and a communication-reduced bilevel optimization method is proposed, which reduces the communication rounds to $O(\sqrt{T})$ and can achieve a convergence rate of $O(1/T)$ with some residual error.
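The local/global mixture mentioned above can be shown in one line: each client serves a personalized model interpolating its local model with the shared global one. How the mixing weight is chosen and adapted is the paper's contribution; the fixed alpha and the names below are purely illustrative.

```python
# Personalized model as a convex combination of local and global weights:
# v = alpha * w_local + (1 - alpha) * w_global.

def mix(local, global_, alpha):
    return [alpha * l + (1 - alpha) * g for l, g in zip(local, global_)]

w_global = [1.0, 1.0]     # shared federated model
w_local = [3.0, -1.0]     # this client's locally trained model
personalized = mix(w_local, w_global, alpha=0.25)
```

With alpha = 0 the client falls back to the pure global model; with alpha = 1 it ignores the federation entirely.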
Federated Learning for Healthcare Informatics
1312 Citations · 2020 · Jie Xu, Benjamin S. Glicksberg, Chang Su + 3 more
Journal of Healthcare Informatics Research
The goal of this survey is to provide a review for federated learning technologies, particularly within the biomedical space, and summarize the general solutions to the statistical challenges, system challenges, and privacy issues in federation, and point out the implications and potentials in healthcare.
Federated Learning and Wireless Communications
113 Citations · 2021 · Zhijin Qin, Geoffrey Ye Li, Hao Ye
IEEE Wireless Communications
This article provides a comprehensive overview of the relationship between federated learning and wireless communications, including basic principles of federated learning, efficient communications for training a federated learning model, and federated learning for intelligent wireless applications.
Distributed training across several quantum computers could significantly improve the training time and if we could share the learned model, not the data, it could potentially improve the data privacy as the training would happen where the data is located. One of the potential schemes to achieve this property is the federated learning (FL), which consists of several clients or local nodes learning on their own data and a central node to aggregate the models collected from those local nodes. However, to the best of our knowledge, no work has been done in quantum machine learning (QML) in federa...
Communication-efficient federated learning
323 Citations · 2021 · Mingzhe Chen, Nir Shlezinger, H. Vincent Poor + 2 more
Proceedings of the National Academy of Sciences
A communication-efficient FL framework is proposed to jointly improve the FL convergence time and the training loss, and a probabilistic device selection scheme is designed such that the devices that can significantly improve the convergence speed and training loss have higher probabilities of being selected for ML model transmission.
Threats to Federated Learning: A Survey
234 Citations · 2020 · Lingjuan Lyu, Han Yu, Qiang Yang
arXiv (Cornell University)
This paper provides a concise introduction to the concept of FL, and a unique taxonomy covering threat models and two major attacks on FL: 1) poisoning attacks and 2) inference attacks, and provides an accessible review of this important topic.
Federated Learning with Matched Averaging
102 Citations · 2020 · Hongyi Wang, Mikhail Yurochkin, Yuekai Sun + 2 more
arXiv (Cornell University)
This work proposes the Federated Matched Averaging (FedMA) algorithm, designed for federated learning of modern neural network architectures, e.g., convolutional neural networks (CNNs) and LSTMs, and indicates that FedMA outperforms popular state-of-the-art federated learning algorithms on deep CNN and LSTM architectures trained on real-world datasets, while improving communication efficiency.
Model-Contrastive Federated Learning
1237 Citations · 2021 · Qinbin Li, Bingsheng He, Dawn Song
journal unavailable
MOON is a simple and effective federated learning framework that utilizes the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning in model-level.
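The model-level contrastive term can be sketched as follows: for a given input, the representation from the current local model is pulled toward the global model's representation and pushed away from the previous local model's, via an InfoNCE-style loss. The toy representation vectors and temperature below are illustrative assumptions, not values from the paper.

```python
# Model-contrastive loss sketch: compare one input's representation under
# three models (current local, global, previous local).
import math

def cos(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def model_contrastive_loss(z, z_glob, z_prev, tau=0.5):
    """Pull z toward z_glob (positive pair), push away from z_prev."""
    pos = math.exp(cos(z, z_glob) / tau)
    neg = math.exp(cos(z, z_prev) / tau)
    return -math.log(pos / (pos + neg))

z      = [1.0, 0.1]   # current local model's representation
z_glob = [1.0, 0.0]   # global model's representation (close to z)
z_prev = [0.0, 1.0]   # previous local model's representation (far from z)
loss_aligned = model_contrastive_loss(z, z_glob, z_prev)
loss_flipped = model_contrastive_loss(z, z_prev, z_glob)  # roles swapped
```

A representation aligned with the global model incurs a small loss; one drifting back toward the stale local model incurs a large one, which is the correction the framework applies during local training.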
From federated learning to federated neural architecture search: a survey
168 Citations · 2021 · Hangyu Zhu, Haoyu Zhang, Yaochu Jin
Complex & Intelligent Systems
A description of the recently proposed federated neural architecture search is provided, categorized into online and offline implementations, and single- and multi-objective search approaches.
SplitFed: When Federated Learning Meets Split Learning
536 Citations · 2022 · Chandra Thapa, M.A.P. Chamikara, Seyit Camtepe + 1 more
Proceedings of the AAAI Conference on Artificial Intelligence
A novel approach is presented, named splitfed learning (SFL), that amalgamates the two approaches, eliminating their inherent drawbacks, along with a refined architectural configuration incorporating differential privacy and PixelDP to enhance data privacy and model robustness.
Privacy preserving distributed machine learning with federated learning
130 Citations · 2021 · M.A.P. Chamikara, Péter Bertók, Ibrahim Khalil + 2 more
Computer Communications
DISTPAB alleviates computational bottlenecks by distributing the task of privacy preservation utilizing the asymmetry of resources of a distributed environment, which can have resource-constrained devices as well as high-performance computers.
Learning to Detect Malicious Clients for Robust Federated Learning
185 Citations · 2020 · Suyi Li, Yong Cheng, Wei Wang + 2 more
arXiv (Cornell University)
This work proposes a new framework for robust federated learning where the central server learns to detect and remove the malicious model updates using a powerful detection model, leading to targeted defense.
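The "detect then remove" flow can be illustrated with a much simpler statistical filter than the paper's learned detection model: score each client update by cosine similarity to a robust reference (here, the coordinate-wise median update) and drop the outliers. Every name, threshold, and the toy updates below are illustrative assumptions.

```python
# Server-side screening of client updates before aggregation.
import math
import statistics

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def filter_updates(updates, threshold=0.0):
    """Keep updates pointing roughly the same way as the median update."""
    reference = [statistics.median(col) for col in zip(*updates)]
    return [u for u in updates if cosine(u, reference) > threshold]

benign = [[1.0, 1.0], [0.9, 1.1], [1.1, 0.9]]
malicious = [[-5.0, -5.0]]       # pushes the model the opposite way
kept = filter_updates(benign + malicious)
```

The malicious update, anti-correlated with the median direction, is discarded; the benign ones proceed to aggregation.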
Learn from Others and Be Yourself in Heterogeneous Federated Learning
238 Citations · 2022 · Wenke Huang, Mang Ye, Bo Du
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
This work proposes FCCL (Federated Cross-Correlation and Continual Learning), which leverages unlabeled public data for communication and constructs a cross-correlation matrix to learn a generalizable representation under domain shift, addressing both the heterogeneity problem and catastrophic forgetting.
Decentralized learning works: An empirical comparison of gossip learning and federated learning
146 Citations · 2020 · István Hegedűs, Gábor Danner, Márk Jelasity
Journal of Parallel and Distributed Computing
Surprisingly, the best gossip variants perform comparably to the best federated learning variants overall, thus providing a fully decentralized alternative to federated learning.
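The gossip alternative compared above needs no server at all: in each round, a random pair of peers exchanges models and averages them, and with enough rounds every node converges to the network-wide mean. The scalar "models" and helper names below are an illustrative toy.

```python
# Serverless gossip averaging among peers.
import random

def gossip_round(models, rng):
    """A random pair of nodes exchanges models and both keep the average."""
    i, j = rng.sample(range(len(models)), 2)
    avg = (models[i] + models[j]) / 2
    models[i] = models[j] = avg

rng = random.Random(0)               # fixed seed for reproducibility
models = [0.0, 4.0, 8.0]             # three nodes; network mean is 4.0
for _ in range(200):
    gossip_round(models, rng)
spread = max(models) - min(models)
```

Pairwise averaging preserves the network-wide mean while the spread between nodes shrinks geometrically, so all nodes end up near the same model without any central aggregator.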
SecureBoost: A Lossless Federated Learning Framework
463 Citations · 2021 · Kewei Cheng, Tao Fan, Yilun Jin + 4 more
IEEE Intelligent Systems
The SecureBoost framework is shown to be as accurate as other nonfederated gradient tree-boosting algorithms that require centralized data, and thus, it is highly scalable and practical for industrial applications such as credit risk analysis.
A Sustainable Incentive Scheme for Federated Learning
115 Citations · 2020 · Han Yu, Zelei Liu, Yang Liu + 5 more
IEEE Intelligent Systems
The FL incentivizer (FLI) dynamically divides a given budget in a context-aware manner among data owners in a federation by jointly maximizing the collective utility while minimizing the inequality among the data owners, in terms of the payoff received and the waiting time for receiving payoffs.
A survey on security and privacy of federated learning
1231 Citations · 2020 · Viraaji Mothukuri, Reza M. Parizi, Seyedamin Pouriyeh + 3 more
Future Generation Computer Systems
This paper aims to provide a comprehensive study concerning FL’s security and privacy aspects that can help bridge the gap between the current state of federated AI and a future in which mass adoption is possible.
Robust Federated Learning With Noisy Communication
139 Citations · 2020 · Fan Ang, Li Chen, Nan Zhao + 3 more
IEEE Transactions on Communications
This paper proposes a robust design for federated learning that reduces the effect of noise, and utilizes a sampling-based successive convex approximation algorithm to develop a feasible training scheme that tackles the unavailable maxima or minima of the noise condition and the non-convexity of the objective function.
FLAME: Taming Backdoors in Federated Learning
198 Citations · 2022 · Thien Duc Nguyen, Phillip Rieger, Huili Chen + 10 more
Aaltodoc (Aalto University)
Federated Learning (FL) is a collaborative machine learning approach allowing participants to jointly train a model without having to share their private, potentially sensitive local datasets with others. Despite its benefits, FL is vulnerable to so-called backdoor attacks, in which an adversary injects manipulated model updates into the federated model aggregation process so that the resulting model will provide targeted false predictions for specific adversary-chosen inputs. Proposed defenses against backdoor attacks based on detecting and filtering out malicious model updates consider only ...
The future of digital health with federated learning
2154 Citations · 2020 · Nicola Rieke, Jonny Hancox, Wenqi Li + 14 more
npj Digital Medicine
This paper considers key factors contributing to this issue, explores how federated learning (FL) may provide a solution for the future of digital health and highlights the challenges and considerations that need to be addressed.
OpenFL: the open federated learning library
107 Citations · 2022 · Patrick Foley, Micah Sheller, Brandon Edwards + 9 more
Physics in Medicine and Biology
This manuscript presents OpenFL and summarizes its motivation and development characteristics, with the intention of facilitating its application to existing ML/DL model training in a production environment, and describes the first real-world healthcare federations that use the OpenFL library.
Fair Resource Allocation in Federated Learning
133 Citations · 2020 · Tian Li, Maziar Sanjabi, Ahmad Beirami + 1 more
International Conference on Learning Representations
Federated learning involves training statistical models in massive, heterogeneous networks. Naively minimizing an aggregate loss function in such a network may disproportionately advantage or disadvantage some of the devices. In this work, we propose q-Fair Federated Learning (q-FFL), a novel optimization objective inspired by fair resource allocation in wireless networks that encourages a more fair (i.e., more uniform) accuracy distribution across devices in federated networks. To solve q-FFL, we devise a communication-efficient method, q-FedAvg, that is suited to federated networks. We valid...
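The reweighting behind q-FFL can be made concrete on scalar quadratic client losses: instead of minimizing the plain average of client losses F_k, minimize (1/(q+1)) * sum_k F_k^{q+1}, whose gradient weights each client's gradient by F_k^q, so high-loss clients count more as q grows (q = 0 recovers the usual objective). The learning rate, step count, and toy losses below are illustrative assumptions.

```python
# q-FFL objective sketch with F_k(w) = 0.5*(w - c_k)^2.

def qffl_grad(w, centers, q):
    """Gradient of mean_k F_k(w)^q * F_k'(w)."""
    g = 0.0
    for c in centers:
        F = 0.5 * (w - c) ** 2
        g += (F ** q) * (w - c)
    return g / len(centers)

def solve(q, centers, lr=0.005, steps=3000):
    w = 0.0
    for _ in range(steps):
        w -= lr * qffl_grad(w, centers, q)
    return w

# Two clients at 0 and one at 10: the plain average favors the majority.
centers = [0.0, 0.0, 10.0]
w_avg = solve(q=0, centers=centers)    # ordinary objective -> mean 10/3
w_fair = solve(q=1, centers=centers)   # q-FFL shifts toward the
                                       # high-loss minority client
```

Raising q pulls the solution toward the worst-off client, trading a little average accuracy for a more uniform accuracy distribution across devices.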
UAV-Enabled Covert Federated Learning
103 Citations · 2023 · Xiangwang Hou, Jingjing Wang, Chunxiao Jiang + 3 more
IEEE Transactions on Wireless Communications
A UAV-enabled covert federated learning architecture is conceived, in which the UAV is not only responsible for orchestrating the operation of FL but also emits artificial noise to interfere with eavesdropping by unintended users.
Federated learning for privacy-preserving AI
154 Citations · 2020 · Yong Cheng, Yang Liu, Tianjian Chen + 1 more
Communications of the ACM
This research presents an engineering and algorithmic framework to ensure data privacy and user confidentiality in the rapidly changing environment.
Deep Federated Learning for Autonomous Driving
115 Citations · 2022 · Anh Nguyen, Tuong Do, Minh Quan Tran + 5 more
2022 IEEE Intelligent Vehicles Symposium (IV)
A peer-to-peer Deep Federated Learning (DFL) approach is proposed to train deep architectures in a fully decentralized manner, removing the need for central orchestration, and achieves superior accuracy compared with other recent methods.