Encrypted weight sharing using federated learning

17th September 2020

Proposed by Cagri Ozcinar – cagriozcinar at gmail.com

Abstract: The federated learning approach was designed so that a machine learning model can be trained on data distributed across many user devices without that data ever being transferred off the device. This facilitates learning while preserving user privacy, since the raw data never leaves the local device. However, recent work has shown that private user data can still be recovered from the gradients or weights that the edge models share with the central server.
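For concreteness, below is a minimal sketch of one plain federated-averaging round, assuming a toy linear model and numpy; all names here are illustrative, not part of the proposal. It makes the leakage channel explicit: each client's locally trained weights reach the server in the clear.

```python
# A minimal sketch of one plain federated-averaging (FedAvg) round,
# assuming a toy linear model; names are illustrative only.
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient step on a client's private data (squared-error loss)."""
    grad = 2.0 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

rng = np.random.default_rng(0)
global_w = np.zeros(3)

# Each client trains locally, but the resulting weights are sent to the
# server in plaintext -- this is the leakage channel described above.
client_weights = []
for _ in range(4):
    X, y = rng.normal(size=(20, 3)), rng.normal(size=20)
    client_weights.append(local_update(global_w, X, y))

global_w = np.mean(client_weights, axis=0)  # server-side aggregation
print(global_w)
```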

In this dissertation, we would like to allow the user devices to share their weights in encrypted form; the server would perform its update on these encrypted weights and then return the updated weights to the users.
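One possible instantiation of this idea is sketched below, assuming an additively homomorphic scheme (Paillier, via the third-party python-paillier package, imported as phe): the server averages ciphertexts without ever decrypting them. The single shared keypair and element-wise encryption are simplifications for illustration, not the proposal's final design.

```python
# A minimal sketch of homomorphic weight aggregation, assuming the
# python-paillier package (`pip install phe`); the shared keypair and
# element-wise encryption are simplifications for illustration only.
from phe import paillier

pub, priv = paillier.generate_paillier_keypair(n_length=1024)

# Three clients encrypt their (toy, 2-dimensional) weight vectors.
client_weights = [[0.5, -1.0], [0.7, -0.8], [0.6, -1.2]]
encrypted = [[pub.encrypt(w) for w in vec] for vec in client_weights]

# The server averages the ciphertexts directly: addition of encrypted
# numbers and multiplication by a plaintext scalar are supported by the
# Paillier scheme, so the server never sees any individual weight vector.
n = len(encrypted)
enc_avg = [sum(col) * (1.0 / n) for col in zip(*encrypted)]

# Each client decrypts the returned aggregate locally.
print([priv.decrypt(c) for c in enc_avg])
```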

Requirements:

  • Basic understanding of deep learning.
  • Strong Python programming skills with knowledge of PyTorch/TensorFlow.
  • Ideal candidates should have an interest in deep learning in general, and must be able to follow new research trends and learn new tools.