

Is Differential Privacy practical? (2013) - luu
http://blog.mrtz.org/2013/08/21/dp-practical.html

======
ahelwer
The Ironclad project[1] claims to have created a formally verified
differential-privacy database. I recommend watching the demo video; the
development efficiency they saw from formal verification is jaw-dropping.

[1] [https://www.usenix.org/conference/osdi14/technical-sessions/...](https://www.usenix.org/conference/osdi14/technical-sessions/presentation/hawblitzel)

------
noisydonut
Google recently put a tool called RAPPOR into production. It's partly based
on differentially private randomized response, and it collects usage
statistics from Chrome users.

Code is here:
[https://github.com/google/rappor](https://github.com/google/rappor)

Paper is here:
[http://static.googleusercontent.com/media/research.google.co...](http://static.googleusercontent.com/media/research.google.com/en/us/pubs/archive/42852.pdf)
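The randomized-response idea RAPPOR builds on is simple enough to sketch in a few lines. This is a minimal illustration of classic randomized response, not RAPPOR's actual Bloom-filter pipeline; the retention probability `p` and the simulated population are assumptions for the demo:

```python
import random

def randomized_response(true_bit, p=0.75):
    """Report the true bit with probability p, otherwise a fair coin flip.
    Any single report is deniable, but aggregates remain estimable."""
    if random.random() < p:
        return true_bit
    return random.randint(0, 1)

def estimate_true_fraction(reports, p=0.75):
    """Invert the noise: E[observed] = p*f + (1-p)*0.5, solve for f."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100k users, ~30% of whom have the sensitive attribute.
random.seed(0)
truth = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomized_response(b) for b in truth]
print(estimate_true_fraction(reports))  # recovers roughly 0.30
```

The privacy/utility trade-off lives in `p`: lower `p` gives each user stronger deniability but widens the error bars on the aggregate estimate.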

------
tokenrove
I certainly hope Betteridge's law of headlines does not apply here.
Differential privacy is an amazing criterion, and I'd love to hear about
anyone who has put it into practice on data they're releasing.

------
hackuser
It would be much appreciated if someone could explain, succinctly, how
differential privacy works and what its strengths and weaknesses are. Thanks!

