Introducing Transfer Appliance: Sneakernet for the cloud era (googleblog.com)
95 points by nealmueller on July 18, 2017 | 47 comments



To be fair, AWS launched a much, much larger analogue to AWS Snowball. https://aws.amazon.com/snowmobile/


"Snowmobile uses multiple layers of security designed to protect your data including dedicated security personnel, GPS tracking, alarm monitoring, 24/7 video surveillance, and an optional escort security vehicle"

The data is encrypted, so that part is really just to impress the commoner.


I assume these services were originally created to meet government contracting requirements (transporting highly classified information).


That would require a whole other set of security requirements than this has.


Hahaha, you got me there. Although, to be fair, Snowmobile is an entirely different product category. Same problem, different scale of solution. It's like Starbucks coming out with a truck solution, when all you wanted was a Venti.


Espresso trucks sound awesome.


Interestingly, Starbucks does have both a trailer solution and a shipping-container solution. The former is parked out front when the actual store is undergoing refurbishment. The latter is seemingly air-dropped into neighborhoods like Northglenn, CO and Ballard, WA.

Coincidentally, Starbucks' first shipping-container store is just a mile or two from Amazon's first use of the Snowmobile.


Plus they did a really cute lego blogpost. https://aws.amazon.com/blogs/aws/aws-snowmobile-move-exabyte...


This is a bit tangential, but I like the graphs/visualizations they used in this ad. Easy to read and interpret.


Agreed! The diagram [1] is super handy, and lets me reiterate why we waited so long to do this: 10 Gbps of peering is just not that rare (and you probably want it for updates, etc. anyway). Even as you get to the petabyte range, being able to just hit "Go" and then do differential updates is so valuable that you really have to be talking about lots of petabytes in a location where (or a reason why) you can't get 10 Gbps plus of peering.
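For scale, here's the back-of-envelope math (a quick sketch; it assumes you sustain the full line rate, which real links rarely do):

    # Back-of-envelope: how long does it take to just push the data
    # over the wire? Assumes you sustain the full line rate.
    def transfer_days(data_bytes, link_bits_per_sec):
        return data_bytes * 8 / link_bits_per_sec / 86400

    PB = 10 ** 15
    print(transfer_days(PB, 10 * 10 ** 9))  # 1 PB over 10 Gbps: ~9.3 days
    print(transfer_days(PB, 10 ** 9))       # 1 PB over 1 Gbps: ~93 days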

Disclosure: I work on Google Cloud (but didn't work on this)

[1] https://3.bp.blogspot.com/-SnFabcStXhM/WW4SEhj6adI/AAAAAAAAE...


To add to Boulos' point - Google Cloud has the "Cloud Transfer Service" [0] that's hugely popular. It's easy to get connectivity to Google's vast network of POPs, and this service helps take advantage of that connectivity. It can also pull from S3.

(work at G but not in any of these areas)

https://cloud.google.com/storage/transfer/
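For the curious, here's a minimal sketch of creating a one-shot S3-to-GCS job via the Storage Transfer Service v1 API with the Python API client (project ID, bucket names, and keys below are placeholders, not real values):

    # Minimal sketch: create a one-time S3 -> GCS transfer job.
    # Project, bucket names, and AWS keys are placeholders.
    from googleapiclient import discovery

    client = discovery.build('storagetransfer', 'v1')
    job = client.transferJobs().create(body={
        'description': 's3-to-gcs-one-shot',
        'status': 'ENABLED',
        'projectId': 'my-project',
        'transferSpec': {
            'awsS3DataSource': {
                'bucketName': 'my-s3-bucket',
                'awsAccessKey': {'accessKeyId': '...',
                                 'secretAccessKey': '...'},
            },
            'gcsDataSink': {'bucketName': 'my-gcs-bucket'},
        },
        # Same start and end date => the job runs exactly once.
        'schedule': {
            'scheduleStartDate': {'year': 2017, 'month': 7, 'day': 19},
            'scheduleEndDate': {'year': 2017, 'month': 7, 'day': 19},
        },
    }).execute()
    print(job['name'])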


The ability to export data from GCP, say to your own datacenter, is really missing.


Good idea, thanks for the suggestion. It does seem more open for the appliance to ship data both to the cloud and back out again. There's clearly interest here, and I'll definitely feed that back.


Same issue.

Another idea would be for GCP to handle export to a physical device that is then shipped to a location known to both GCP and the client, accessible by either party but offline and cold-stored. Similar to how paper records are stored, you could put in a request to access the records and thereby verify their existence (which you can't do with cloud data, other than the copy you're able to access).

Fwiw, Iron Mountain lost about 60 file boxes of physical records that we had ..


Shipping petabytes to a cloud from which you can practically never retrieve all your data again is ... risky.

The AWS way of doing this is nice: you rent the appliance, just like when importing data to the cloud, and pay a (reduced) "bandwidth" fee for exporting the data to the appliance, which is then shipped to you.


Hotel California model.


One of my favorite songs. Doesn't apply here though. Iron Mountain has a service today to move data out of GCP.


I don't know if I'd be comfortable relying on a third party for data export.


On the same theme: moving very large amounts of data between GCP and AWS, in either direction. I wonder what it would take for them to make this cost zero. Legislation?

It likely wouldn't be very expensive to get a very high-capacity fiber connection going between GC/AWS datacenters in relative physical proximity.

Not expressing a particular/immediate personal need here, just noting that this could help keep the lock-in factor down and level of competition up.


"Like many organizations we talk to, you probably have large amounts of data that you want to use to train machine learning models."

I understand Google's bias here, but doesn't it usually make more sense to bring the programs/models to where the data already is?


GCP has proprietary machine learning accelerator hardware that you can't buy.


The problem isn't the programs, it's the hardware. GCP has specialized / scalable hardware to build models on.


It is extremely expensive to set up and maintain infrastructure to process big data, especially when you get into the petabytes.


$300 for a 100TB rackable drive? Can I just keep it instead?


$30 per day after 10 day grace period: https://cloud.google.com/data-transfer/pricing

It would be funnier if you could keep it indefinitely, but reads were $0.12/GB. ;)

I wonder if that's a viable business model...
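Back of the envelope, the joke model isn't even crazy (quick sketch; the $0.12/GB figure is the joke rate above, not a real price):

    # Joke-model arithmetic: suppose you kept the 100 TB appliance and
    # paid the hypothetical $0.12/GB read fee from the comment above.
    drive_gb = 100 * 1000          # 100 TB, decimal units
    read_fee_per_gb = 0.12         # the joke rate, not a real price

    print(drive_gb * read_fee_per_gb)  # $12,000 to read the drive once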


Nimble offers great local storage on a pay-per-use, per-month model. I'm sure HP is working hard to ruin this.


Obviously not. Renting a car for a week seems cheap. Can I keep it forever?


The other comments here are correct. The standard model for these appliances is a rental model. Please talk to a GCP seller.


Good god... It was a joke...


It seems to pretty clearly be a rental, so no...


Anyone know how to back up 1 TB worth of images to Google or AWS? My ISP throttles my upload speed. I'm interested in doing it at a lower price point than $500. Is that possible?


I've happily used Arq (https://www.arqbackup.com/) to get a lot of things into Drive and GCP. You can cap your upload rate to stay below your ISP's throttling threshold and still get the files up, just over a longer time span.
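The rate-limiting trick is basically just pacing your reads. A minimal sketch of the same idea against GCS, assuming the google-cloud-storage Python library (the bucket name and the 5 Mbps cap are made up, and Arq's internals may well differ):

    # Minimal sketch of a bandwidth-capped upload to GCS. The bucket
    # name and the 5 Mbps cap are made up; Arq's internals may differ.
    import os
    import time
    from google.cloud import storage

    class ThrottledReader:
        """Wraps a file so read() calls never exceed a target byte rate."""
        def __init__(self, path, bytes_per_sec):
            self._f = open(path, 'rb')
            self._rate = bytes_per_sec
            self._start = time.time()
            self._sent = 0

        def read(self, size=-1):
            data = self._f.read(size)
            self._sent += len(data)
            ahead = self._sent / self._rate - (time.time() - self._start)
            if ahead > 0:
                time.sleep(ahead)  # sleep until we're back under the cap
            return data

        def tell(self):
            return self._f.tell()

    path = 'photos.tar'
    blob = storage.Client().bucket('my-backup-bucket').blob('photos.tar')
    blob.chunk_size = 1024 * 1024  # resumable upload, 1 MiB per request
    blob.upload_from_file(ThrottledReader(path, 5 * 1000 * 1000 // 8),
                          size=os.path.getsize(path))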

(Disclosure: I work on GCP but this is a personal, not professional, endorsement)


It's multi-cloud, so I think you're safe from showing a bias.



Thanks, I hadn't seen that price; looks handy.


I was thinking of buying a NAS just for this. It would take a while, but it would be running all night, which is handy. The cheapest Synology is $110 new, plus disks.

https://www.synology.com/en-uk/knowledgebase/DSM/help/CloudS...


Try this too: S3 Transfer Acceleration. It might get around ISP throttling (I haven't tried it yet): https://aws.amazon.com/s3/faqs/#s3ta


Become a visitor to one of the tech companies, and use their wifi.


So a Hooli Box?


Hooli Box compresses and stores data. This is just used to transfer data to GCP.


This compresses.


1 petabyte compressed! Woot!


Awwwww, it's too bad this can't be used to get stuff directly into Drive.


Good idea, thanks for the suggestion. There's clearly interest, and I'll definitely feed that back. I felt the pain recently myself when I moved all my photos and videos to Google Photos and wished I had one of these.


Anyone know what's inside the box?


Article title is actually "Introducing Transfer Appliance: Sneakernet for the cloud era", can someone change how it's presented here? This was not a good improvised title.


Yes, thank you. We've updated the title from “Google launches a larger analogue to AWS Snowball”.



