FedGR: A Lossless-Obfuscation Approach for Secure Federated Learning

Wenjing Qin, Li Yang, Jianfeng Ma
2021 IEEE Global Communications Conference (GLOBECOM)

TL;DR: FedGR uses Paillier homomorphic encryption to design a new gradient security replacement algorithm, which eliminates the connection between gradient parameters and users' sensitive data and greatly reduces users' local computing overhead.

Abstract

Federated learning is a promising new technology in the field of artificial intelligence. However, the unprotected model gradient parameters exchanged in federated learning may reveal participants' sensitive information. To address this problem, we present a secure federated learning framework called FedGR. We use Paillier homomorphic encryption to design a new gradient security replacement algorithm, which eliminates the connection between gradient parameters and users' sensitive data. In addition, we revisit the previous work by Aono and Hayashi (IEEE TIFS 2017) and show that their method places too heavy a computing burden on the user's local device. We then prove that FedGR has the following characteristics: 1) The system does not leak any information to the server. 2) The accuracy of the federated training results produced by our system matches that of ordinary deep learning systems. 3) The proposed approach greatly reduces the user's local computing overhead.
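The abstract's central tool is the additive homomorphism of the Paillier cryptosystem: the product of two ciphertexts decrypts to the sum of the underlying plaintexts, so a server can aggregate users' encrypted gradients without ever seeing them. The toy sketch below illustrates that property only; it is not the FedGR replacement algorithm itself, and it uses insecurely small demo primes (a real deployment needs large primes and a vetted library):

```python
import math
import random

# Toy Paillier cryptosystem (illustrative only; the primes below are
# far too small for real security).
P, Q = 293, 433                      # small demo primes (NOT secure)
N = P * Q                            # public modulus n
N2 = N * N
G = N + 1                            # standard generator choice g = n + 1
LAM = math.lcm(P - 1, Q - 1)         # Carmichael function lambda(n)
MU = pow(LAM, -1, N)                 # mu = lambda^-1 mod n (valid for g = n+1)

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, with fresh random r coprime to n."""
    while True:
        r = random.randrange(1, N)
        if math.gcd(r, N) == 1:
            break
    return (pow(G, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> int:
    """m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    x = pow(c, LAM, N2)
    return ((x - 1) // N) * MU % N

# Additive homomorphism: multiplying ciphertexts adds plaintexts,
# so the server can sum integer-quantized gradients while encrypted.
grads = [17, 25, 8]                  # hypothetical per-user gradient values
agg = 1
for g_i in grads:
    agg = (agg * encrypt(g_i)) % N2  # server multiplies ciphertexts
assert decrypt(agg) == sum(grads)    # holder of the key recovers the sum
```

In a federated setting, only the aggregate sum is ever decrypted, which is the sense in which the server learns nothing about any individual user's gradient.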