
> Would you say GPU compute was envisioned in the 80's?

Sure, why not? What is a modern GPU if not a bunch (well, a very large bunch!) of vector processors? Vector supercomputers were a thing in the 1970s, and parallel vector supercomputers in the early 1980s. The 1980s also saw the beginnings of massively parallel systems (e.g. the Connection Machine and the Transputer). So take the state of supercomputing in the 1980s, extrapolate using Moore's law up to today, and you might arrive at something quite close to a GPU chip.
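
As a rough back-of-the-envelope sketch of that extrapolation (the specific figures are my ballpark assumptions: ~800 MFLOPS peak for an early-80s parallel vector machine of the Cray X-MP class, and a doubling every ~2 years):

    # Illustrative Moore's-law extrapolation; all figures are rough assumptions.
    base_flops = 0.8e9             # ~800 MFLOPS, early-80s parallel vector supercomputer
    doublings = (2020 - 1982) / 2  # assume throughput doubles every ~2 years
    print(base_flops * 2**doublings / 1e12, "TFLOPS")  # ~420 TFLOPS

That lands within an order of magnitude of a current high-end GPU's peak throughput, which is the point: the raw numbers were foreseeable, even if the form factor wasn't.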

Now, that this chip full of vector processors would emerge as a side effect of the $$$ available for graphics rendering, largely for entertainment purposes, is probably a historical fluke few would have predicted.

But my point was that Quantum Computing really is different. It's not a general-purpose computing method (quantum algorithms offer speedups only for specific problem classes, not for computation at large), and AFAIK all attempts at it so far require very low temperatures to suppress thermal fluctuations. Sub-kelvin refrigerators have advanced considerably over the past few decades, but they are still far from anything portable.

> I don't think we understand the quantum world and quantum computers well enough today to define 100% of the things we'll be using quantum computers for 50 years from now. We'll start with cryptography, molecule interaction simulations, and some optimization problems, but I think we'll think of new ways to use them in the future.

Oh, absolutely. I'm just not convinced it'll be something every Tom, Dick, and Harry will use to go about their daily lives, even 50 years from now.





