
Qudits: The Real Future of Quantum Computing? - jonbaer
http://spectrum.ieee.org/tech-talk/computing/hardware/qudits-the-real-future-of-quantum-computing
======
adamnemecek
You don't actually want qubits, you want an analog computer with
differentiable signals, most likely photonic. Qubits are a dead evolutionary
branch.

I've been recently exploring computational metamaterials for photonic
computation.

[http://users.ece.utexas.edu/~aalu/research%20-%20page%203.ht...](http://users.ece.utexas.edu/~aalu/research%20-%20page%203.htm)
(there are quite a few papers on this, but unfortunately they are all paywalled.
Spoiler alert: they seem to be based entirely on the Fourier transform).

These computational metamaterials don't need electricity to be powered (though
you do need something to shoot photons at them and to read the values back off).

Machine learning would be much, much faster on these, as you get O(1)
differential calculus.
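To make the "O(1) differential calculus" idea concrete (this is my own toy illustration, not from the paywalled papers): in the Fourier domain, differentiation becomes elementwise multiplication by i*omega, one cheap operation per frequency. Here numpy's FFT stands in for the transform the metamaterial would perform optically:

```python
# Sketch: spectral differentiation. d/dx in real space becomes
# multiplication by i*omega in frequency space, an elementwise operation.
import numpy as np

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)

# angular frequencies matching numpy's FFT layout
omega = np.fft.fftfreq(n, d=x[1] - x[0]) * 2 * np.pi

# differentiate by a single elementwise multiply in the frequency domain
df = np.fft.ifft(1j * omega * np.fft.fft(f)).real

# d/dx sin(3x) = 3 cos(3x), recovered to machine precision
assert np.allclose(df, 3 * np.cos(3 * x), atol=1e-8)
```

The FFT itself costs O(n log n) digitally; the pitch for an optical device is that the transform happens "for free" as light propagates, leaving only the cheap elementwise part.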

They don't heat up. You could possibly build a house-sized CPU out of these. I
can see it: a city-block-sized CPU with a nuclear reactor next to it.

Did you know that on an analog machine, you can sort in O(n)?

[https://en.wikipedia.org/wiki/Spaghetti_sort](https://en.wikipedia.org/wiki/Spaghetti_sort)
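A toy digital simulation of the physical trick (my own sketch): each value is a rod of spaghetti, and lowering a flat "hand" onto the bundle touches the tallest rod first. Simulated in software the "hand" is a max() scan, which is exactly where the digital version loses the O(n) bound:

```python
def spaghetti_sort(values):
    # Each value is a "rod"; the physical hand finds the tallest in O(1).
    # Digitally, max() is O(n) per pick, making this O(n^2) overall --
    # the O(n) claim relies on the hardware doing that comparison for free.
    rods = list(values)
    out = []
    while rods:
        tallest = max(rods)       # the hand touches this rod first
        out.append(tallest)
        rods.remove(tallest)      # pull that rod out of the bundle
    out.reverse()                 # rods were picked tallest-first
    return out

print(spaghetti_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```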

Hit me up if you wanna chat about this. I've seen the "light" (xdddd) now and
can't go back to stupid bits.

I'm not, like, super married to the metamaterials, but analog photonics trumps
quantum for just about every task I can think of.

~~~
jd007
> You don't actually want qubits, you want an analog computer with
> differentiable signals. Most likely photonic. Qubits are a dead evolution
> branch.

that's quite a bold claim. i'm pretty intrigued, but it seems like there isn't
a lot of easily accessible information on this topic. the wikipedia article on
analog computers mostly focuses on mechanical ones from the past. do you have
any recommended starter readings for analog computation? is it a model that
has the computational time complexity of a non-deterministic turing machine?
thanks.

~~~
adamnemecek
There really isn't much. Some of the old stuff still applies. But my approach
to understanding this is all over the place and I wouldn't recommend it for
anyone besides myself. That said, I've found trying to understand
electromagnetism and basic quantum theory (photon-matter interactions) to be
quite helpful.

As for classification, there's some work on this, e.g.
[http://www.cs.princeton.edu/~ken/MCS86.pdf](http://www.cs.princeton.edu/~ken/MCS86.pdf)

I believe it's not non-deterministic; in some sense, if you can pose the
problem in a differentiable way, that's better than non-determinism.

It's like being in a maze. An analog-computer-like robot can perceive the
slight tilt of the floor running from the entrance to the exit and simply
follow that tilt. It takes the robot an optimal number of steps, too.

A non-deterministic Turing machine would, in a sense, split itself into N
different robots and start exploring all the turns. It might find the answer
eventually, but it will consume that much more energy.

This analogy might be silly but I hope it conveys how very different analog
machines are.

~~~
jd007
thanks for the info. for the maze example, an ND robot would be able to find
the exit in the optimal number of steps, since one of the branches will in
fact be the optimal path and will terminate first. so i don't see how analog
computing can be more efficient.

unless you are saying that in the analog case we can ignore the rules of the
maze and go through walls. but then are we still solving the same problem? it
seems like the time complexity comparison no longer makes sense if you
redefine the constraints of the problem.

~~~
adamnemecek
> unless you are saying that in an analog case we can ignore the rules of the
> maze and can go through walls.

Not ignore, but you have a lot more information about the problem and you have
a general direction of the solution.

An ND Turing machine can split and explore multiple corridors at once. But in
the end, all the robots that don't end up finding the exit have just wasted
energy. I guess you don't care about this in complexity theory, but it's a
huge gain in practice.

~~~
techdragon
The description makes me think of a loud sound echoing through the maze: each
branch further divides the volume of the sound, but a loud enough sound will
"exit" in the optimal "O(1)" time, because the sound wave "explores" every
branch of the maze in "parallel".

Is this a reasonable mental model of what you're describing?

~~~
adamnemecek
In your description you are still expending extra energy to explore the
branches. Also, it's not O(1); it's just that every step you take is bound to
be optimal. So with each O(1) step, you are literally one step closer to the
solution.

------
petersjt014
I know these guys understand quantum anything better than I ever will, but is
having a larger radix specifically a practical way of increasing performance?

If a binary device has a hardware fault, bits that are in error still have a
50% chance of being correct. It seems like a single component failing here
would make this worse (five times worse if it weren't quantum).

I don't know whether or not the pros outweigh the cons here, or at what radix
they do (there's probably some statistical analysis that can answer that), but
ten just _seems_ like too much.

~~~
gizmo686
I am not an expert.

The issue that quantum computers have is not so much that individual qubits
get errors, but rather that qubits become entangled with the rest of the
universe, erasing our ability to take advantage of any quantum effects. To
prevent this, you need to very carefully contain the qubits so they don't
interact with anything apart from highly controlled manipulations. The more
qubits you have, the harder this becomes. If you can replace them with qudits
that have more states, you can do the same computation while needing fewer
particles, making containment easier.

