Note: I don't mean to ask this sarcastically. I'm actually asking.
From my experience talking to users (and potential users) of OpenStack (http://openstack.org), there is again a lot of variance. Most deployments are relatively small (a few hundred GB to a few hundred TB), while some are much bigger (several PB). The most exciting thing I heard was that CERN is evaluating OpenStack Swift (http://swift.openstack.org) for their storage needs; a researcher from CERN gave a keynote at the last OpenStack design summit. CERN generates 25 PB of data per year and has a 20-year retention policy, so their storage needs are vast. In short, storage needs vary greatly.
I've seen that outsourcing infrastructure works well up to a point, but the largest users can generally realize substantial cost savings by bringing their infrastructure back in house.