
Ask HN: Where to run a large data project - swsieber
I've been asked to do a data processing job by a friend. As part of the job, I'm supposed to download and process roughly 1.5TB of data. That's a problem because my local ISP data cap is 300GB a month.

So I'd like to ask: what recommendations does HN have for me? I was thinking of getting a private server and trying to get some cheap storage, but I'm not sure where to go for that.

Side note: not only do I have to download a huge data set, but I have to make a bunch of whois calls based on some IPs in the data set.
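For the whois side of the job, a minimal sketch of what bulk lookups could look like in Python, using the raw WHOIS protocol (TCP port 43, RFC 3912). The server name (whois.arin.net) and the `OrgName` field are assumptions; other regional registries use different field names, and a real bulk run would need rate limiting and referral handling:

```python
# Hypothetical sketch: raw WHOIS lookups over TCP (RFC 3912).
# whois.arin.net and the OrgName field are assumptions; registries differ.
import socket

def query_whois(ip, server="whois.arin.net", port=43, timeout=10):
    """Send a WHOIS query for `ip` and return the raw text response."""
    with socket.create_connection((server, port), timeout=timeout) as sock:
        sock.sendall((ip + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

def extract_field(response, field="OrgName"):
    """Pull the first `field: value` line out of a WHOIS response."""
    for line in response.splitlines():
        if line.strip().startswith(field + ":"):
            return line.split(":", 1)[1].strip()
    return None
```

Parsing is the fragile part: WHOIS output is free-form text, so `extract_field` just scans for a `Field: value` line and returns the first match.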
======
mtmail
The usual cloud offerings, e.g. Amazon AWS, Azure, Digital Ocean, are relatively
expensive when you need large hard drives, at least when you rent for several
weeks. My typical projects need 1TB of fast SSD.

If drive speed is not an issue (and it sounds like download speed is the real
constraint), have a look at the server auction
[https://www.hetzner.com/sb](https://www.hetzner.com/sb).
Those are servers other companies custom-ordered and no longer use. "2x 3TB"
gives you about 2.8TB of usable space (drives are mirrored for redundancy by
default, but you can change that and combine them).

~~~
swsieber
Wow! That's a great site. Thanks for the tip.

