
TensorFlow Federated: Machine Learning on Decentralized Data - gavinuhma
https://github.com/tensorflow/federated
======
gavinuhma
“TensorFlow Federated (TFF) is an open-source framework for machine learning
and other computations on decentralized data. TFF has been developed to
facilitate open research and experimentation with Federated Learning (FL), an
approach to machine learning where a shared global model is trained across
many participating clients that keep their training data locally. For example,
FL has been used to train prediction models for mobile keyboards without
uploading sensitive typing data to servers.”

From
[https://www.tensorflow.org/federated](https://www.tensorflow.org/federated)
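The core idea in the quote — clients train on data that never leaves the device, and only model updates reach the server — is essentially federated averaging (FedAvg). A minimal sketch of one possible simulation, with hypothetical clients fitting a shared linear model (not the actual TFF API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three clients, each holding private samples of the
# same underlying relationship y = 2*x + 1. Raw data never leaves a client.
def make_client(n):
    x = rng.uniform(-1, 1, size=n)
    y = 2 * x + 1 + 0.01 * rng.normal(size=n)
    return x, y

clients = [make_client(n) for n in (20, 50, 30)]

def local_update(w, b, x, y, lr=0.5, epochs=20):
    """One client's local training: plain gradient descent on its own data."""
    for _ in range(epochs):
        err = (w * x + b) - y
        w -= lr * np.mean(err * x)
        b -= lr * np.mean(err)
    return w, b

# Server state: the shared global model.
w_g, b_g = 0.0, 0.0
for _ in range(10):  # federated rounds
    results, sizes = [], []
    for x, y in clients:
        results.append(local_update(w_g, b_g, x, y))  # only weights go back
        sizes.append(len(x))
    # Federated averaging: weight each client's model by its sample count.
    total = sum(sizes)
    w_g = sum(n * w for (w, _), n in zip(results, sizes)) / total
    b_g = sum(n * b for (_, b), n in zip(results, sizes)) / total

print(w_g, b_g)  # converges close to the true (2, 1)
```

The server only ever handles `(w, b)` pairs, never the `(x, y)` data — that separation is what makes the keyboard-prediction use case in the quote possible.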

This is a very exciting project that complements other privacy-preserving
machine learning (PPML) techniques in TensorFlow:

- Differential Privacy:
[https://github.com/tensorflow/privacy](https://github.com/tensorflow/privacy)

- Secure Multi-Party Computation:
[https://github.com/mortendahl/tf-encrypted](https://github.com/mortendahl/tf-encrypted)

- Confidential Computing (Trusted Execution Environments, Asylo, Intel SGX):
[https://github.com/dropoutlabs/tf-trusted](https://github.com/dropoutlabs/tf-trusted)

------
grantlmiller
This has super broad applicability beyond mobile. Federated learning could
become the answer to invasive, data-hungry centralized applications that "need
the data to make the system better"... no, you just need to learn locally and
send a learning summary to a central system, not the data.

~~~
gavinuhma
100!

This recent paper from Google combines FL with secure multi-party computation
for "secure aggregation," so that the aggregation server never even sees
individual clients' gradients:
[https://export.arxiv.org/abs/1902.01046](https://export.arxiv.org/abs/1902.01046)
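The trick behind secure aggregation is pairwise masking: every pair of clients agrees on a random mask, one adds it and the other subtracts it, so the masks cancel in the server's sum but hide each individual update. A toy sketch of that cancellation with made-up integer updates (the real protocol derives masks from key exchange and handles dropouts, which this omits):

```python
import random

random.seed(42)

M = 2**31 - 1  # work modulo a large number so masks look uniform

# Hypothetical per-client model updates (e.g. quantized gradient values).
updates = {"a": 5, "b": 17, "c": -3}
clients = sorted(updates)

# Each pair of clients shares a random mask (in the real protocol this
# comes from a Diffie-Hellman key agreement, not a shared RNG).
pair_mask = {(i, j): random.randrange(M)
             for i in clients for j in clients if i < j}

def masked_update(c):
    """Client c adds masks shared with later peers, subtracts earlier ones."""
    x = updates[c] % M
    for (i, j), r in pair_mask.items():
        if i == c:
            x = (x + r) % M
        elif j == c:
            x = (x - r) % M
    return x

# The server only ever sees these masked values, which reveal nothing
# individually...
masked = [masked_update(c) for c in clients]

# ...but the pairwise masks cancel in the sum, exposing only the aggregate.
total = sum(masked) % M
total = total - M if total > M // 2 else total  # map back to signed range
print(total)  # 5 + 17 - 3 = 19
```

Combined with FL, this means the server learns the averaged model update without ever seeing any one client's gradients in the clear.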

