
Aren't the bulk of customers who turn to the cloud rather small operations, who are trying to "outsource" as much of their infrastructure issues as possible? And aren't these customers much more concerned about pricing, rather than possible future growth?

Note: I don't mean to ask this sarcastically. I'm actually asking.




From my experience working with Rackspace Cloud Files, customer sizes are all over the map. Some customers are very small. Some are very large. I know that S3 has a similar variance in customer size.

From my experience talking to users (and potential users) of Openstack (http://openstack.org), there is again variance. Most people are relatively small (a few hundred GB to a few hundred TB); some are much bigger (several PB). The most exciting thing I heard was that CERN is evaluating Openstack swift (http://swift.openstack.org) for their storage needs. A researcher from CERN gave a keynote at the last Openstack design summit. CERN generates 25 PB of data per year and has a 20-year retention policy, so their storage needs are vast. In short, storage needs vary greatly from one user to the next.
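To get a feel for the CERN numbers above, here is a back-of-the-envelope sketch. It assumes a constant generation rate of 25 PB/year and a strict 20-year retention window, which is certainly a simplification of CERN's real (and likely uneven) data growth:

```python
# Rough estimate of petabytes held at any point in time, given the
# figures cited above: 25 PB generated per year, 20-year retention.
# Assumes a constant generation rate (an illustration, not CERN's
# actual growth curve).

PB_PER_YEAR = 25
RETENTION_YEARS = 20

def retained_storage(years_elapsed, rate=PB_PER_YEAR, retention=RETENTION_YEARS):
    """PB on disk after a given number of years: data accumulates until
    the oldest year ages out of the retention window, then plateaus."""
    return rate * min(years_elapsed, retention)

print(retained_storage(5))   # 125 PB after 5 years
print(retained_storage(30))  # plateaus at 25 * 20 = 500 PB
```

So at steady state, that single workload alone sits around half an exabyte, which puts "several PB" customers in perspective.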

I've seen that outsourcing infrastructure is great to a point, but the largest users can generally get substantial cost savings by bringing their infrastructure back in house.
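The crossover point I'm describing can be sketched with a toy cost model: cloud storage is billed per TB-month, while in-house capacity has a large fixed monthly cost (staff, facilities, amortized hardware) plus a small marginal cost per TB. All prices below are made-up illustration values, not real Rackspace or S3 pricing:

```python
# Hypothetical cost crossover: below some capacity the cloud is cheaper,
# above it the fixed in-house costs amortize out. All dollar figures are
# invented for illustration.

CLOUD_PER_TB_MONTH = 100.0       # assumed cloud price, $/TB-month
INHOUSE_FIXED_MONTH = 50_000.0   # assumed staff/facility/amortization, $/month
INHOUSE_PER_TB_MONTH = 20.0      # assumed marginal in-house cost, $/TB-month

def monthly_cost_cloud(tb):
    return CLOUD_PER_TB_MONTH * tb

def monthly_cost_inhouse(tb):
    return INHOUSE_FIXED_MONTH + INHOUSE_PER_TB_MONTH * tb

def crossover_tb():
    # Capacity at which the two cost curves intersect.
    return INHOUSE_FIXED_MONTH / (CLOUD_PER_TB_MONTH - INHOUSE_PER_TB_MONTH)

print(crossover_tb())  # 625.0 TB under these assumed prices
```

Under these assumed numbers, anyone storing well past a few hundred TB starts to see the in-house curve win, which matches my experience that the largest users bring infrastructure back in house.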



