And then you'll get into trouble with your provider. While they promised you unlimited, it's only unlimited at full speed for the first x GB, after which they throttle you to dial-up speeds. Or you cannot exceed 3× the average in your neighborhood.
I have a couple of gigabit lines at my disposal, and for this one I'll be using our local hackerspace's gigabit line, which routes through freifunk[1]. I'm also only planning on saving 20TB or so at most, which is definitely more manageable over a gigabit line with a month to spare. (And to not bother everyone else, I'll limit my bandwidth use.)
I wonder if they'll offer the option to move to paid storage, at least temporarily, so people have enough time to download everything.
I have the option to use the Google Workspace data export feature to export the data to a Google-owned Cloud Storage bucket for 30 days. I'm sure that, given sufficient bandwidth on your end, you could pull 400TB in a month from Cloud Storage.
This does mean all the data is zipped, though, so you cannot be selective about what gets saved and what doesn't; I'll have to use conventional Google Drive methods (rclone etc.) to download selectively instead.
~400,000 GB × 8 bits/byte = 3,200,000 Gb; at 1 Gbps that's 3,200,000 seconds, or about 889 hours, or about 37 days. If that's 400 TB at 1024 GB each instead of 1000, then it's a bit longer, pushing 38 days.
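As a quick sanity check of that arithmetic, here's a small sketch (assuming an ideal, fully saturated 1 Gbps link and ignoring protocol overhead):

```python
# Back-of-the-envelope transfer time for a bulk download over a 1 Gbps link.
# Assumes an ideal, fully saturated link with no protocol overhead.

LINK_GBPS = 1  # link rate in gigabits per second

def transfer_days(total_gb: float, link_gbps: float = LINK_GBPS) -> float:
    """Days needed to move total_gb gigabytes at link_gbps gigabits/second."""
    seconds = total_gb * 8 / link_gbps  # GB -> Gb, then divide by the rate
    return seconds / 86_400             # 86,400 seconds per day

print(transfer_days(400 * 1000))  # 400 TB, decimal (1000 GB/TB) -> ~37.0 days
print(transfer_days(400 * 1024))  # 400 TB at 1024 GB each      -> ~37.9 days
```

The same function also shows why my 20TB target is comfortable: it comes out to under two days at full rate, leaving plenty of headroom for throttling the bandwidth.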
So it's 1099 GB vs 1000 GB, which is a solid 10% difference. Your TCP overhead is nowhere near 10% unless you are sending with an MTU of around 400...
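Both numbers in that comment are easy to check (assuming the usual 40 bytes of TCP+IPv4 headers with no options):

```python
# TiB vs TB: the binary/decimal gap the comment refers to.
TIB = 2**40     # bytes in a tebibyte
TB = 10**12     # bytes in a terabyte
print(TIB / TB) # ~1.0995, i.e. the binary unit is ~10% larger

# TCP/IPv4 header overhead (40 bytes, no options) as a share of each packet.
HEADERS = 40

def overhead(mtu: int) -> float:
    """Fraction of each packet consumed by TCP+IP headers at a given MTU."""
    return HEADERS / mtu

print(overhead(1500))  # ~2.7% with a standard Ethernet MTU
print(overhead(400))   # 10% -- the MTU would have to be this small
```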