
"Each file and each folder node uses its own randomly generated 128 bit key. File nodes use the same key for the attribute block and the file data, plus a 64 bit random counter start value and a 64 bit meta MAC to verify the file's integrity."

Not sure I understand this: how can you deduplicate if every file is encrypted with its own randomly generated key?




Maybe on file upload, they encrypt it with a key derived from the file's hash (convergent encryption; sketched below), then chunk those encrypted files and store the chunks with dedup.
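
If that guess is right, the scheme is what's usually called convergent encryption. A minimal Python sketch of the upload side (assumes the third-party `cryptography` package; every name here is mine, not anything from MEGA's docs):

    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def convergent_encrypt(plaintext: bytes):
        # Key is derived from the content itself, so identical plaintexts
        # always produce identical ciphertext, which is what lets the
        # server dedup blobs it can't read.
        key = hashlib.sha256(plaintext).digest()
        # A deterministic nonce is safe here because the key is
        # per-content: the (key, nonce) pair never repeats across
        # distinct files.
        nonce = hashlib.sha256(b"nonce" + key).digest()[:16]
        enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        ciphertext = enc.update(plaintext) + enc.finalize()
        # Dedup handle: hash of the ciphertext, the only thing the
        # server needs to notice that two uploads are the same file.
        chunk_id = hashlib.sha256(ciphertext).hexdigest()
        return key, ciphertext, chunk_id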

Then, on the user side, they store a per-user encrypted index (random counter start, MAC) pointing to those individual chunks to represent the file.

That way, the service only ever sees giant encrypted blocks of data and per-user encrypted indexes into that data; all of it is opaque to them.
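
Concretely, that index could just be a small record mapping filenames to chunk handles and per-file keys, sealed under the user's master key. Same caveats as above; the record layout is invented, and a real design would also MAC the index, as the quoted spec does for files:

    import json, os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_index(master_key: bytes, index: dict) -> bytes:
        # The server stores this blob but can't read it; a random nonce
        # is prepended in the clear so the client can decrypt later.
        nonce = os.urandom(16)
        enc = Cipher(algorithms.AES(master_key), modes.CTR(nonce)).encryptor()
        return nonce + enc.update(json.dumps(index).encode()) + enc.finalize()

    # e.g. index = {"report.pdf": {"chunks": [chunk_id], "key": key.hex()}}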

They would need to hack into accounts (e.g. by keylogging passwords) to decrypt the indexes and see which files users can actually access.

Public links could be shared by putting a key in the URL that decrypts a file containing indexes to other blocks. So whoever knows the URL knows the index and can get the data.
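
A toy version of such a link; the URL format here is made up, though MEGA's real links reportedly look similar (#!handle!key):

    import base64

    def make_public_link(file_handle: str, key: bytes) -> str:
        # Everything after '#' is a fragment the browser never sends to
        # the server, so the host serves ciphertext without ever
        # learning the key.
        k = base64.urlsafe_b64encode(key).rstrip(b"=").decode()
        return f"https://example.invalid/#!{file_handle}!{k}"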

That is the way I'd design it, at least... :)

edit: typo

-----


Not sure; it doesn't seem to say anything about encrypting with the file hash, and it implies that the metadata and the actual data use the same key.

-----



