
I hope whoever that is has a really fast Internet connection, because 400TB is too much to download in 30 days even with a 1Gbps connection.



Generally, services like Google Drive can't even max out a 1Gbps connection.


And then you'll get into trouble with your provider. While they promised you unlimited, it's only unlimited at full speed for the first x GB, after which they throttle you down to dial-up speeds. Or you can't exceed 3x the average in your neighborhood.


Hello, I am "whoever that is".

I've got a couple of gigabit lines at my disposal, and for this one I'll be using our local hackerspace's gigabit line, which goes through freifunk[1]. I'm also only planning on saving 20TB or so at most, which is definitely more reasonable over a gigabit line with a month to spare. (And so as not to bother everyone else, I'll limit my bandwidth use.)

[1]: https://en.wikipedia.org/wiki/Freifunk


I wonder if they'll give them the option to move to paid storage, at least temporarily, so they have enough time to download everything.


I have the option of using the Google Workspace data export feature to export the data to a Google-owned gcloud storage bucket for 30 days. I'm sure that, given sufficient bandwidth on your end, you could pull 400TB in a month from gcloud.

This does mean all the data is zipped, though, so you can't be selective about what gets saved and what doesn't; I'll have to use conventional gdrive methods (rclone etc.) to download selectively instead.
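For the bucket route, pulling the exported archives down could look roughly like this. It's just a sketch using the google-cloud-storage Python client; the bucket name and destination directory are placeholders, and in practice something like gsutil or rclone with parallel transfers would probably be the saner choice:

    # Rough sketch: download every object from the export bucket.
    # Bucket name is a placeholder; assumes application-default credentials.
    import os
    from google.cloud import storage

    client = storage.Client()
    bucket_name = "workspace-export-bucket"   # hypothetical

    for blob in client.list_blobs(bucket_name):
        dest = os.path.join("export", blob.name)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        blob.download_to_filename(dest)       # one export zip at a time
        print(f"downloaded {blob.name} ({blob.size / 1e9:.1f} GB)")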


That's just over 88 hours.


~400,000 GB * 8 bits/byte = 3,200,000 Gb. At 1 Gbps, that's 3,200,000 seconds, or about 888 hours, or about 37 days. If that's 400TB * 1024 instead of 1000, then it's a bit longer, pushing 38 days.
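For anyone who wants to sanity-check the arithmetic, a quick sketch (decimal units, 1 TB = 10^12 bytes, no protocol overhead):

    # Back-of-the-envelope transfer time: 400 TB over a 1 Gbps link.
    data_bytes = 400e12            # 400 TB, decimal
    link_bps   = 1e9               # 1 Gbps

    seconds = data_bytes * 8 / link_bps
    hours   = seconds / 3600
    days    = hours / 24

    print(f"{seconds:,.0f} s = {hours:,.0f} h = {days:.1f} days")
    # -> 3,200,000 s = 889 h = 37.0 days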


Sheesh, the TCP and whatever-else overhead is probably more than the 24-byte difference of 1024 vs 1000...


Did you ever try and calculate the difference?

1 TiB = 1024 GiB = 1024*1024 MiB = 1024*1024*1024 KiB = 1024*1024*1024*1024 bytes...

1024^4 is 1,099,511,627,776 bytes...

So it's 1099 GB vs 1000 GB, which is a solid ~10% difference. Your TCP overhead is nowhere near 10% unless you're sending with an MTU of 240...
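Rough numbers behind both halves of that, assuming 40 bytes of TCP + IPv4 headers per packet and ignoring Ethernet framing (with bigger headers, e.g. IPv6 or TCP options, the break-even MTU shifts):

    # TiB vs TB: the binary/decimal gap.
    tib = 1024**4                  # 1,099,511,627,776 bytes
    tb  = 1000**4                  # 1,000,000,000,000 bytes
    print(f"gap: {tib / tb - 1:.2%}")          # -> gap: 9.95%

    # Header overhead as a fraction of payload, assuming 40 bytes of
    # TCP + IPv4 headers per packet.
    headers = 40
    for mtu in (1500, 440, 240):
        payload = mtu - headers
        print(f"MTU {mtu}: {headers / payload:.1%} overhead")
    # -> MTU 1500: 2.7%, MTU 440: 10.0%, MTU 240: 20.0%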


88 hours would be at 1 GB/sec = 8 Gbps, not 1 Gbps.

400 TB over 30 days is about 1.23 Gbps.
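Working it the other way, the sustained rate needed to move 400 TB inside the 30-day window (decimal units, no overhead):

    # Sustained link rate needed for 400 TB in 30 days.
    data_bits = 400e12 * 8                 # 400 TB in bits
    window_s  = 30 * 24 * 3600             # 30 days in seconds

    print(f"{data_bits / window_s / 1e9:.2f} Gbps")   # -> 1.23 Gbps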


It’s ~888 hours, or 37 days.

You dropped a 0 somewhere.



ChatGPT and Wolfram Alpha confirmed to me that it's about 37 days, which is 888 hours, not 88 :)

Wolfram Alpha: https://www.wolframalpha.com/input?i=400tb+at+1gbps
ChatGPT: https://chatgpt.com/share/4ddd464a-dabd-4c12-9cc4-e8271a51a6...

(ChatGPT did a great job at breaking down the problem)


So did your sibling comments, and I imagine anyone with very basic maths skills could, too.

And most likely burning a lot less carbon than ChatGPT.


[flagged]


> the AI you used to write this

I don't follow. Are you suggesting I'm an AI? Or that my answer was copy-pasted from an AI?

> HN is kinda asynchronous

Or that HN runs an AI to serve the website?


Except everyone is forgetting the TCP/IP overhead. I also asked ChatGPT, and 400TB of data creates a 10.7TB overhead, which adds ~5 days...
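A back-of-the-envelope version of that overhead calculation, assuming ~40 bytes of TCP + IPv4 headers per 1500-byte packet (the extra volume lands in the same ballpark, but at 1 Gbps the extra time works out to roughly a day):

    # Redo the overhead estimate by hand: 40 bytes of TCP + IPv4 headers
    # per 1500-byte packet, no TCP options, Ethernet framing ignored.
    data_bytes = 400e12
    mtu, headers = 1500, 40
    payload = mtu - headers

    packets        = data_bytes / payload
    overhead_bytes = packets * headers
    extra_seconds  = overhead_bytes * 8 / 1e9        # extra time at 1 Gbps

    print(f"overhead: {overhead_bytes / 1e12:.1f} TB")       # -> overhead: 11.0 TB
    print(f"extra time: {extra_seconds / 86400:.1f} days")   # -> extra time: 1.0 days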


If you had done the math yourself, you would have arrived at a different number...


How so, care to educate the lazy?

It's all theoretical anyway; if it's 400TB split into individual files and I'm using millions of HTTPS GETs, that's also overhead...

Gotta love nerds spending their Saturday (nights) arguing about some simple maths...



