

Learning from Big Data: 40 million entities in context - disgruntledphd2
http://googleresearch.blogspot.ie/2013/03/learning-from-big-data-40-million.html

======
chippy
Could 40 million be Big Data, as in too big for my desktop, and requiring
Hadoop or something - or would this be manageable on an RDBMS server?

~~~
saosebastiao
Big Data is whatever you want it to be. For the VC blowhards around here, it
is happiness, wealth, a solution to world hunger, and a bikini model with DDD
boobs who loves old nerds and doesn't get them thrown in jail for drug
trafficking.

But for people who actually work with it regularly, it is typically a
constraint that manifests itself in the form of impatience. I could process
petabytes of data using AWK, but if I can't wait that long, I will end up
using Cascalog on a cluster.
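The AWK point can be made concrete with a tiny sketch (hypothetical file and column layout assumed): a plain awk one-liner does a group-by aggregation at any scale; the tool never stops working, the run just takes longer as the input grows, which is exactly the impatience constraint.

```shell
# Sketch: a key/value aggregation on a single machine with awk.
# /tmp/events.txt is a made-up sample file: "key value" per line.
printf 'a 1\nb 2\na 3\n' > /tmp/events.txt

# Sum the second column grouped by the first; sort for stable output.
awk '{sum[$1] += $2} END {for (k in sum) print k, sum[k]}' /tmp/events.txt | sort
```

The same aggregation on petabytes would still be correct - a distributed tool like Cascalog just gets the answer back before you lose patience.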

~~~
chippy
That's a great answer - the impatience part!

