
Hard to say. In 1960 classical computing would have seemed totally centralized and unattainable in distributed form for the masses. Who knows what the future may bring for quantum computing, assuming there is actually a conventional use case for the masses (most people probably don't have a burning urge to factor large numbers at home the way they do to, say, play video games).

I agree. I'm trying to see this from the perspective of that earlier point in the cycle, when everything was centralized. I'm wondering if there will ever be a time (or even a reason) for things to go back the other way again.

Also, agreed about the use case - sometimes I get the feeling that quantum computing is a solution looking for a problem (though I'm sure that can't really be the case). That said, I think things are partly that way because quantum computing is such a different paradigm: truly taking advantage of it takes a pivot in thinking, but great dividends may be possible as a result.

My thought is, it's kind of like how we learned what FPGAs could do. Different paradigm, incredible opportunity.


Privacy, offline access, low latency - these are all excellent use cases for edge computing. Once it's time to do some heavy lifting, though, it makes a lot more sense to centralize. Decentralization gives you control along with responsibility, so the cycle goes something like this:

* Decentralized as a part of early development

* Centralized for ease of early deployment

* Decentralized once it becomes simple / commodity enough that everyone can just have one

* Recentralized once it's cheaper to run them all centrally again

And then you only break back out once the thing you're doing fundamentally changes for some reason.
