
And even then, I don't see quantum computing going the same way we went with our machines - QC looks very batch-oriented, with timeshared front-end processors, a bit like some ancient supercomputers.





I think this would enable new usages we can't even imagine yet. This is not a question of whether it is useful now.

That's like saying in the 90s, "why do we need 1Gbps internet, 56k is completely enough to check your emails or browse newsgroups."

And the ancient supercomputers are now in our pockets


> That's like saying in the 90s, "why do we need 1Gbps internet, 56k is completely enough to check your emails or browse newsgroups."

Not really, since QC is not today's computing made a lot faster. It's a very different way of solving certain problems. They are not general-purpose computing devices.

> And the ancient supercomputers are now in our pockets

Yeah, about that pocket-sized dilution refrigerator...


Would you say GPU compute was envisioned in the 80's?

I don't think we understand the quantum world and quantum computers well enough today to define 100% of the things we'll be using quantum computers for 50 years from now. We'll start with cryptography, molecule interaction simulations, and some optimization problems, but I think we'll think of new ways to use them in the future.


> Would you say GPU compute was envisioned in the 80's?

Sure, why not? What is a modern GPU if not a bunch (well, a very large bunch!) of vector processors? Vector supercomputers were a thing in the 1970s, parallel vector supercomputers in the early 1980s. The 1980s also saw the beginnings of massively parallel systems (e.g. Connection Machine, Transputer). So take the state of supercomputing in the 1980s, extrapolate using Moore's law up to today, and you might arrive at something quite close to a GPU chip.
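
Back-of-the-envelope, with assumed round numbers (call an early-80s parallel vector machine ~800 MFLOPS peak and assume a doubling every two years; both figures are illustrative, not measurements):

    #include <cstdio>
    #include <cmath>

    int main() {
        // Assumption: an early-1980s parallel vector machine in the
        // ballpark of ~800 MFLOPS peak.
        const double flops_1982 = 800e6;
        // Assumption: performance doubles every ~2 years.
        const double years = 2023 - 1982;
        const double extrapolated = flops_1982 * std::pow(2.0, years / 2.0);
        std::printf("extrapolated peak: %.2e FLOPS\n", extrapolated);  // ~1.2e15
        return 0;
    }

That lands around a petaflop, which is at least the right neighborhood for a single modern GPU's reduced-precision peak.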

Now, the fact that this chip full of vector processors would be a side effect of the $$$ available for graphics rendering, largely for entertainment purposes, is probably a historical fluke few would have predicted.

But my point was that Quantum Computing really is different. It's not a general-purpose computing method (in the sense of Turing completeness etc.), and AFAIK all attempts at it so far require very low temperatures to drive down thermal fluctuations. Sub-K refrigerators have advanced considerably in the past few decades, but they're still far from anything portable.

> I don't think we understand the quantum world and quantum computers well enough today to define 100% of the things we'll be using quantum computers for 50 years from now. We'll start with cryptography, molecule interaction simulations, and some optimization problems, but I think we'll think of new ways to use them in the future.

Oh, absolutely. I'm just not convinced it'll be something every Tom, Dick, and Harry will use to go about their daily lives, even 50 years from now.


Actually, 1975.

https://en.wikipedia.org/wiki/Cray-1

"The Cray-1 was the first supercomputer to successfully implement the vector processor design. These systems improve the performance of math operations by arranging memory and registers to quickly perform a single operation on a large set of data."


I would say the ILLIAC IV would be a better approximation, but the point is the same: massively parallel computers have been with us for a very long time before the first GPU.

What nobody anticipated is that we'd be using stuff that was originally designed for games.


> Would you say GPU compute was envisioned in the 80's?

Mostly yes: https://en.wikipedia.org/wiki/Connection_Machine

From the 60's: https://en.wikipedia.org/wiki/ILLIAC_IV

The problem with timesharing and QC is that you need to preserve and restore the quantum state of the computer when switching tasks. I am not sure how you would go about that, or if it's even possible.


GPGPU is basically "single instruction, multiple data" (SIMD) at thousands of cores. Supercomputers have been doing this for decades.

Quantum computing architectures are a whole different beast.


Can you please stop with the thousands-of-cores bullshit? Even in the Vega 64 there are only 64 cores, with 64 ALUs per core.

Each core can still only process one instruction stream, but if multiple threads share the same instruction stream, they all get executed at once.
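
For a concrete picture, here's a minimal CUDA-style sketch (illustrative only, not tied to any particular GPU): every thread runs the exact same instruction stream, and the hardware slices each block into warps/wavefronts that a core's SIMD ALUs execute in lockstep, one lane per thread.

    #include <cstdio>

    // One instruction stream, many data elements: every thread executes
    // the same code, differing only in the index it computes.
    __global__ void scale(const float* in, float* out, float k, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = k * in[i];  // same instruction, different lane
    }

    int main() {
        const int n = 1 << 16;
        float *in, *out;
        cudaMallocManaged(&in, n * sizeof(float));
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) in[i] = float(i);

        // 64 threads per block is arbitrary here; the hardware groups them
        // into warps/wavefronts that run on the SIMD units in lockstep.
        scale<<<(n + 63) / 64, 64>>>(in, out, 2.0f, n);
        cudaDeviceSynchronize();

        std::printf("out[42] = %.1f\n", out[42]);  // 84.0
        cudaFree(in);
        cudaFree(out);
        return 0;
    }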


> Not really, since QC is not today's computing made a lot faster. It's a very different way of solving certain problems. They are not general-purpose computing devices.

> Would you say GPU compute was envisioned in the 80's?

Well, GPUs aren't used for general computation either. There are specific sets of problems that they're good at, but we don't run the OS on them.


> There are specific sets of problems that they're good at, but we don't run the OS on them.

Kind of.

Xeon Phi's predecessor, the Larrabee, was designed as a GPU. Intel now has a Phi that can be the only CPU in your computer.

It all depends on the code that the GPU cores run.

Some time ago, while musing about what a modern-day Amiga would be, I imagined it'd have a GPU and run all (or at least most) of its software on it.


>And the ancient supercomputers are now in our pockets

Actually, my cheap smartphone is orders of magnitude faster than the ancient supercomputers.


And yet the UI is still laggy :-\

They have to do a lot more work to render that UI. The framebuffer on an iPhone X is larger than the entire main memory of the Cray 1!
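
Rough numbers, assuming a single 32-bit-per-pixel buffer (wide color and double/triple buffering only make the gap bigger):

    #include <cstdio>

    int main() {
        // iPhone X display: 2436 x 1125 pixels, assume 4 bytes per pixel.
        const long long framebuffer = 2436LL * 1125 * 4;  // ~10.9 MB
        // Cray-1 main memory: up to 1,048,576 words of 64 bits each.
        const long long cray1 = 1048576LL * 8;            // 8 MiB
        std::printf("iPhone X framebuffer: %lld bytes\n", framebuffer);
        std::printf("Cray-1 main memory:   %lld bytes\n", cray1);
        return 0;
    }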

Mostly a software issue.

To me, this is reminiscent of a time when people never thought we'd own personal computers because "what's the use?" I'm putting my bets on the technology making its way into consumer products as it matures.

Not if the device has to sit near absolute zero, it's not.

I don't see that requirement relaxing in the next ten years.



