

Ask HN: If a Hadoop cluster were free, what would you do? - tonydiv


======
lrosiak
I'd use it for running optical character recognition on millions of pages of
government documents to make public information truly free, at
[http://citizenaudit.org/](http://citizenaudit.org/). But I really just need
the 26 cores, with a master node feeding documents to workers; not actual
MapReduce.
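The master-node pattern above (one feeder, many independent OCR workers, no
shuffle or reduce step) could be sketched roughly like this. `run_ocr` is a
placeholder, not citizenaudit's actual stack; a real version would shell out
to an OCR tool such as Tesseract:

```python
# Sketch of a master/worker OCR pipeline: a single feeder hands
# document paths to a pool of workers. No MapReduce involved.
from concurrent.futures import ThreadPoolExecutor

def run_ocr(path):
    # Stand-in for a real OCR call (e.g. running `tesseract` on the file).
    return "text-of-" + path

def ocr_all(paths, workers=26):
    # Threads suffice here: real workers would spend their time inside
    # an external OCR process, not executing Python bytecode.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(paths, pool.map(run_ocr, paths)))

print(ocr_all(["a.pdf", "b.pdf"], workers=2))
```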

~~~
tonydiv
Hmm, if there's no MR involved, I'm not sure our GPUs would be very helpful
for this type of job. What do you think?

------
sk2code
I've recently become interested in Hadoop, MapReduce, and HDFS. I'd like to
use it for learning purposes.

~~~
tonydiv
What types of jobs are you trying to implement?

~~~
sk2code
I'd like to use it primarily for configuration work: admin/devops kinds of
jobs.

~~~
tonydiv
Could you provide me your preferred email? Is it the one in your HN profile?

~~~
sk2code
The one in my HN profile should be good. Thanks.

------
sbenfsck
I have a terabyte-ton of security data that I could use Hadoop to analyze.
Do you have a free cluster?

~~~
tonydiv
We've developed a parallel architecture using GPUs that reduces the cost by a
factor of a few hundred. So yes, we will have a freemium plan and an
affordable pricing model.

Email me at tony@parallelx.com if you're interested.

------
manidoraisamy
I might use it to analyze email conversations related to customer service.

~~~
tonydiv
What types of algorithms would you implement? Would they be compute bound?

------
staunch
How many cores, how much RAM, how much HDFS storage?

~~~
tonydiv
Our free tier will be:

26 CPU cores, 15 GB RAM, and 60 GB of SSD storage, plus 1,536 GPU cores with
2 GB of RAM,

for a certain amount of time. If you'd like a reserved instance, or more
power/storage/throughput, we'll charge accordingly.

~~~
staunch
Most typical usage of Hadoop I've seen (so far) is people dumping many
terabytes of data into HDFS and then analyzing it. But maybe you're more
focused on distributed computing than the "big data" side of it.
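For readers new to the model, that dump-into-HDFS-then-analyze workflow boils
down to three steps: map, shuffle, reduce. A toy in-process sketch of word
count, the canonical example (plain Python standing in for Hadoop, which would
run each phase distributed across the cluster):

```python
# Minimal in-process sketch of the map/shuffle/reduce flow that Hadoop
# applies to data sitting in HDFS -- word count, the canonical example.
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between the phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big cluster"])))
print(counts)  # {'big': 2, 'data': 1, 'cluster': 1}
```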

