

Mega, the new MegaUpload, to launch on January 19 - sgarbi
http://kim.com/mega/

======
narag
_In the past, securely storing and transferring confidential information
required the installation of dedicated software. The new Mega encrypts and
decrypts your data transparently in your browser, on the fly. You hold the
keys to what you store in the cloud, not us._

It seems like a good idea to me, maybe because I thought of it before :-) I'm
not sure how well they will monetize it. I've read that the economies come from
detecting duplicates and archiving content just once. But from the customer's
POV, the encryption would be a strong feature.

~~~
BrokenPipe
Well, encryption changes this. Assuming the key is in the hands of the user
(and if it isn't, that rather defeats the whole point of encryption here),
each file will have a different digest and won't be detected as a duplicate.

Is there a way to store files encrypted with different keys and detect that
they are the same file? If there were, wouldn't that allow LEOs to verify this
and issue DMCA notices?

~~~
thefreeman
I believe if you could verify two separately encrypted files were the same it
would defeat the purpose of encryption completely. By definition a secure
encryption system cannot allow this.

~~~
cmurphycode
Actually, there are encryption schemes that allow deduplication. They leak
information (that the file you have already exists), but the encrypted bits
themselves are secure.

The keyword is "convergent" encryption. We used something like this at Iron
Mountain Digital many years ago (they still do, AFAIK), and it is used in
Bitcasa today.

You should read the papers, but essentially the concept can be boiled down to
encrypting the plaintext with a hash of the plaintext.

Since there is no way to derive the hash of a plaintext from an encrypted
block, there is no way to recover the key other than regular old brute force.
But if the same data is uploaded twice, the same hash is computed, thus the
same encryption key is used, and thus the ciphertext is identical.

The encryption keys can be stored separately from the cipher text. In
particular, the user who uploaded the data would store the hashes (this would
already happen in most backup applications anyway). Then, for retrieval, they
give the hash and the block location to the server, who is now able to decrypt
it. By stealing the server, you gain zero access to plaintext data.

Very cool stuff :)
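A minimal sketch of convergent encryption in Python (the SHA-256-based
keystream below is a toy cipher purely for illustration; a real system would
use a proper cipher such as AES):

```python
import hashlib

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data with a SHA-256 counter-mode keystream.
    # Illustrative only -- a real system would use a proper cipher like AES.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # The key is simply the hash of the plaintext itself.
    key = hashlib.sha256(plaintext).digest()
    return key, keystream_encrypt(key, plaintext)

# Identical plaintexts always produce identical ciphertexts, so the
# server can deduplicate them without ever learning the contents.
k1, c1 = convergent_encrypt(b"some large backup block")
k2, c2 = convergent_encrypt(b"some large backup block")
assert c1 == c2
# The XOR keystream is its own inverse, so re-applying it decrypts.
assert keystream_encrypt(k1, c1) == b"some large backup block"
```

The server only ever sees `c1`; the uploader keeps `k1` and hands it back at
retrieval time.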

~~~
vy8vWJlco
Knowing the mapping between a hash of some plaintext and its de-duplicated
ciphertext means a person can just provide a list of hashes and ask Mega to
delete the corresponding ciphertexts, even if they can't break the
encryption. At least if they maintain their ignorance, they can truthfully say
they don't have the power to track down a ciphertext for any given plaintext
hash. Hopefully they will, and will just provide bulk cloud storage, with
people holding onto their little key files. It's much easier to back up a 1KB
key file (or whatever form it comes in) than the encrypted 250GB blob it
protects.

~~~
mindslight
Derive your encryption key from the contents of the file _and_ a "convergence
key". The "convergence key" can then be null for global convergence, a shared
secret for a privately shared convergence, or a random nonce for no
convergence. The derived encryption key is stored the same in every case. When
encrypting a file, clients trade off using space versus a file getting deleted
if the server is required to remove the ciphertext. The server never knows the
difference.
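A sketch of that key-derivation step in Python (the function name and the
three modes are just illustrative, not any particular service's API):

```python
import hashlib, os

def derive_key(plaintext: bytes, convergence_key: bytes = b"") -> bytes:
    # The encryption key depends on the file contents *and* the convergence key.
    return hashlib.sha256(convergence_key + plaintext).digest()

data = b"shared file contents"

global_key  = derive_key(data)                   # null key: global convergence
group_key   = derive_key(data, b"group secret")  # shared secret: group-only dedupe
private_key = derive_key(data, os.urandom(32))   # random nonce: no convergence

# Everyone holding the same shared secret converges on the same key...
assert derive_key(data, b"group secret") == group_key
# ...while other choices yield unrelated keys the server can't correlate.
assert global_key != group_key != private_key
```

In every case the server just stores a key-and-ciphertext pair, so it cannot
tell which mode the client chose.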

~~~
vy8vWJlco
This could actually be done by the user _before_ storing it on the cloud
service, and finding duplicates would be trivial server-side. (Though I don't
see much incentive for a person to do this, since it only benefits the
hoster.) For example, in the Mega interface, a user could specify the length
of the convergence key (a random salt that inversely affects the likelihood of
de-duplication on the host) with a default length of 0. This would then be
part of the "key" proper, as those bits are required to access the original
file.

~~~
mindslight
And it _should_ be done such that the server treats everything the same. The
incentive comes from deduped files counting less against storage quotas, and
no time spent uploading the file. I'm just commenting on the general approach
here, not the applicability to any particular type of service.

But your 'random salt' idea suffers from an attacker just generating all
possible encryptions of the plaintext, due to the small number of
possibilities. The "convergence key" is simply a security-parameter-length key
that you can pass around to your friends so that your files will dedupe with
theirs while not being susceptible to confirmation attacks by others.
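That confirmation attack is cheap to demonstrate: with a short salt, anyone
who already holds a suspected plaintext can simply enumerate every salt value
(a hypothetical 1-byte salt here):

```python
import hashlib

def derive_key(plaintext: bytes, salt: bytes) -> bytes:
    return hashlib.sha256(salt + plaintext).digest()

suspected = b"a file the attacker already has"
# The victim "protects" the key with a 1-byte random salt.
victim_key = derive_key(suspected, bytes([137]))

# The attacker tries all 256 salts -- confirming the file takes microseconds.
hits = [s for s in range(256) if derive_key(suspected, bytes([s])) == victim_key]
assert hits == [137]
```

With a full-length (e.g. 256-bit) convergence key, this enumeration becomes
infeasible, which is the whole point of the scheme.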

------
ma2rten
"Before, we operated only a handful of storage nodes located in expensive
premium data centers. Now, thanks to encryption, we can connect a large number
of hosting partners around the world without worrying about privacy breaches."

Was privacy really ever their main concern?

~~~
grecy
> Was privacy really ever their main concern ?

I expect it will be with the authorities breathing so hard down their necks,
and the questionable legality of the files users will likely upload.

~~~
axusgrad
This is on their "How to become a hosting partner" page
(<http://kim.com/mega/#/hosting>):

>Unfortunately, we can't work with hosting companies based in the United
States. Safe harbour for service providers via the Digital Millennium
Copyright Act has been undermined by the Department of Justice with its novel
criminal prosecution of Megaupload. It is not safe for cloud storage sites or
any business allowing user-generated content to be hosted on servers in the
United States or on domains like .com / .net. The US government is frequently
seizing domains without offering service providers a hearing or due process.

------
dutchbrit
What's new about this exactly? This was posted a few months back.

PS: You can access it here too: <http://mega.co.nz/>

Their me.ga domain was taken away from them.

~~~
mikegioia
There's something unsettling about "Mega Conz"

~~~
dutchbrit
No pun intended... :-)

------
BayAreaDev
The whole Mega.com thing is Kim being in-your-face with the authorities.

The real bone of contention with Megaupload was that the authorities held
Megaupload responsible for what it hosted, and Megaupload could not deny what
was on its servers (copyrighted stuff), so the responsibility and liability
lay with Megaupload and caused its downfall.

With mega.com the game is: mega.com will claim "we don't know what is on our
servers" (since it's encrypted in the browser), so it can't be held liable for
it, and this will stick!

I read a comment somewhere that it doesn't matter how strong the encryption
is for mega.com's users; all mega.com needs is a shield from legal trouble.

Clever!

~~~
sgarbi
Wasn't the content encrypted on MegaUpload, then?

~~~
Sunlis
It may have been, but even if it was, the encryption was likely done server-
side. If I'm understanding this correctly, they will now be encrypting data
before it leaves your browser. That way their servers never see the true data,
so they can't be held accountable for what users are uploading.

In the previous model of receiving data and then possibly encrypting it, they
had full access to the raw data uploaded and were responsible for policing the
legality of the files being uploaded.

~~~
malandrew
TBH, I can't wait until this becomes the standard way for lots of services to
do business. This is going to be better for everyone's privacy and it's
ultimately going to be cheaper and less complex for everyone to do this by
default, because the alternative is being forced to fund the cost of policing,
something that Hollywood is already trying to force ISPs to do. Section 230 of
the Communications Decency Act was supposed to provide this protection, but
it's become clear it simply doesn't provide enough protection when highly
motivated politically-connected actors get involved. Encryption from end-point
to end-point creates a situation where providers don't even need to worry
about needing the protections from Section 230.

------
nextstep
This is exciting. Kim also tweeted that each user gets 50GB of storage for
free.

<https://twitter.com/kimdotcom/status/291936750580953088>

~~~
joeblau
I'm looking forward to this more than I was to the Facebook announcement. I'm
also interested in seeing what Kimble will roll out.

------
level09
>"Powered by Instra"

What does Instra provide, exactly? I used them for domains, but I don't really
know if they also provide good hosting infrastructure.

~~~
mwilcox
According to NBR they provide "expert product, billing and technical support
services to Mega."

[http://www.nbr.co.nz/article/nz-company-named-key-mega-
partn...](http://www.nbr.co.nz/article/nz-company-named-key-mega-partner-mega-
ceo-named-ck-134668)

------
bdcs
Does anyone care to speculate on how payments will be processed? Payment
networks seem to be the weak link...

------
GotAnyMegadeth
Bit annoying that that button resizes...

~~~
sgarbi
I didn't even notice there was a hidden message triggered by the mouseover...
I remember that this practice was once discouraged by many usability
recommendations!

~~~
manojlds
I thought it was pretty clear with the hand mouse cursor to the bottom right.

~~~
sgarbi
it is the kind of spammy detail you're likely to find in popup windows, so I'm
probably blind to it.

------
jakozaur
False start, <http://mega.co.nz/> will launch on January 19.

------
EGreg
Now this is how you do a "Mega launch". From the man who "can't do small".

