A brief tour of differential privacy [pdf] (cmu.edu)
35 points by zuhayeer 14 days ago | 5 comments



Differential Privacy is definitely an intriguing security concept. One interesting recent negative result, though, is that the performance gap between Differential Privacy and the stronger security notion of Obliviousness (https://en.wikipedia.org/wiki/Oblivious_data_structure) may be smaller than previously thought.

Specifically, it was recently shown that the fastest implementations of ORAM (https://en.wikipedia.org/wiki/Oblivious_RAM) already match the theoretical lower bound for Differentially Private RAM - meaning you gain zero performance advantage from adopting the weaker security model. Of course, it's unclear whether this holds for other data structures and workloads, but ORAM is a significant research area.

See https://eprint.iacr.org/2018/1051 for more details.
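
To make "in line with the lower bounds" concrete - this is my reading of the abstract, so treat the exact statement as an approximation:

    ORAM lower bound (Larsen-Nielsen):  Omega(log n) overhead per access
    DP-RAM lower bound (this paper):    also Omega(log n), for reasonable
                                        choices of epsilon and delta
    Best known ORAM constructions:      roughly O(log n) overhead

So relaxing obliviousness to differential privacy doesn't let you duck under the Omega(log n) barrier that the best ORAMs already (nearly) achieve.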


Differential privacy is almost entirely unrelated to obliviousness; the paper you're talking about is specifically about hiding access patterns, and yes, in that setting relaxing to differential privacy doesn't buy you any efficiency benefits.

However, DP is useful for much more than hiding access patterns; it was originally developed for hiding individuals' contributions in statistical analyses, and obliviousness cannot help you in those use cases.
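
For a sense of that original use case, here's a minimal Python sketch of the Laplace mechanism for a counting query (the epsilon value and function names are mine, not from the article):

    import numpy as np

    def private_count(records, predicate, epsilon):
        # A counting query has sensitivity 1: adding or removing one
        # person's record changes the true count by at most 1, so
        # Laplace(1/epsilon) noise suffices for epsilon-DP on this query.
        true_count = sum(1 for r in records if predicate(r))
        return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

    # e.g. a noisy count of smokers in a medical dataset:
    #   private_count(patients, lambda p: p.smoker, epsilon=0.1)

Note that obliviousness says nothing here: the analyst sees the query answer itself, and it's the answer that has to be noised.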


My reading of the ORAM paper is that it is trying to solve a different problem - namely the situation where the threat you’re protecting against is an observer in the middle.

The purpose of differential privacy is to allow a processor to compute an approximately accurate model without ever knowing the exact details of any individual source.

E.g. the canonical example for ORAM is a secure element talking to main memory, where you want to prevent an observer from learning the actual memory access pattern of the SE.

Compare this to the use case of differential privacy: an endpoint that wants to process data without ever knowing what the actual source data was. For example, as deployed by Apple: many metrics are bludgeoned with noise on the customer's device before being sent to Apple's servers. That way (theoretically) Apple never knows the exact details of any one user's behaviour, but can accumulate enough information to make approximations at scale.
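
A toy sketch of that local model, using classic randomized response rather than Apple's actual mechanism (Apple uses count-mean-sketch style algorithms; this is just the simplest instance of the idea):

    import math
    import random

    def randomize_bit(true_bit, epsilon):
        # Runs on the device: report the truth with probability
        # e^eps / (e^eps + 1), otherwise report the opposite bit.
        # The ratio of the two report probabilities is e^eps, which
        # is exactly epsilon-local-DP for this one bit.
        p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
        return true_bit if random.random() < p_truth else 1 - true_bit

    def estimate_rate(reports, epsilon):
        # Runs on the server: de-bias the aggregate to recover an
        # unbiased estimate of the true population rate.
        p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
        observed = sum(reports) / len(reports)
        return (observed + p - 1.0) / (2.0 * p - 1.0)

The server never learns any individual's true bit with confidence, but the aggregate estimate converges at scale, which is exactly the trade-off described above.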


I think you've misread the paper, which is about differentially private RAM. Differential privacy (w/o "RAM") is neither weaker nor stronger than ORAM.


Differential Privacy is not new. It's better than nothing, but it does NOT fix PII leaking to third parties, and for first parties it's a matter of trust.



