Stan Williams of HP Labs gave a talk recently at Rice, his alma mater. He was describing the "HP Machine" they are working on, with memristors at the bottom of a hardware and software stack rearchitected from scratch. Don't look to be able to buy any memristors from HP anytime soon. HP predicts that their own internal demand, in producing the HP Machine, will consume all their memristor production for years.
Let's see if one of the four fundamental electronic components will be independently productized and sold by other vendors or countries to everyone-but-HP. Anyone remember IBM hardware monopolies?
Would we like a world where HP makes memristors, Google makes resistors, Amazon makes capacitors, and everyone else waits for leftover inventory?
Oh, that is pretty clever. I wonder what the write time is. Even a reliable 6-level memory would be useful, giving you four data states (2 bits), plus a non-initialized state and an initialized undef state.
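As a rough sketch of the state layout this comment describes (the level assignments and function names here are my own invention, not anything HP has specified):

```python
# Hypothetical mapping for a 6-level cell: levels 0-3 carry 2 data
# bits, level 4 marks "non-initialized", level 5 "initialized undef".
UNINITIALIZED = 4
UNDEF = 5

def encode(bits):
    """Map a 2-bit value (0..3) onto cell levels 0..3."""
    if not 0 <= bits <= 3:
        raise ValueError("only 2 bits fit in the data levels")
    return bits

def decode(level):
    """Return the stored 2-bit value, or a sentinel name."""
    if level == UNINITIALIZED:
        return "uninitialized"
    if level == UNDEF:
        return "undef"
    return level

level = encode(0b10)          # store the bit pair 10
assert decode(level) == 2
assert decode(UNDEF) == "undef"
```

The point of the two sentinel levels is that "never written" and "deliberately cleared" become distinguishable in hardware, without stealing any of the four data states.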
This is an interesting implementation of the memristor, but I still think the real breakthrough will be in the two-state memristor [1]. Once memristor fabrication technology is mature, we are going to see a massive paradigm shift away from the von Neumann architecture.
All the details are in the linked video by HP's Stan Williams. Definitely worth a watch if you're interested in the future of computing.
Another video [2], but a bit more in depth. Recommended if the material in the first was a bit confusing as Stan was short on time to explain some of the details.
yeah, still don't see where he talks about architecture. He explains how memristors can be used for logic, but von Neumann is not intrinsically wedded to transistors.
I found it interesting that the world's first design for a mechanical computer, Charles Babbage's "Analytical Engine", used base-10 fixed-point arithmetic [1], and we could soon have base-10 RRAM...
There's nothing in the article that would make me think that this is the case. A qubit is a fundamentally different concept, in which a computation can be performed on all possible states simultaneously. This seems to just be getting more states out of a single classical element.
No, you can think of a qubit as a unit vector in a two-dimensional complex vector space. It can sit at either of the two real basis states, but also at any complex superposition of them. Computing with a qubit means rotating the vector around; measuring it collapses the superposition to one of the basis states. The probability of which state you get depends on the rotation at the time of the measurement.
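A toy numerical version of that picture (my own sketch: the state is a pair of complex amplitudes, the rotation is a real Y-axis rotation, and measurement probabilities come from the Born rule):

```python
import math

# A single qubit as a unit vector (a, b) in C^2:
# |a|^2 is the probability of measuring 0, |b|^2 of measuring 1.

def rotate(state, theta):
    """Rotate the state vector by angle theta (a real Y-rotation)."""
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - s * b, s * a + c * b)

def probabilities(state):
    """Born rule: squared amplitude magnitudes give outcome odds."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

q = (1 + 0j, 0 + 0j)          # start in the |0> basis state
q = rotate(q, math.pi / 2)    # rotate a quarter turn
p0, p1 = probabilities(q)     # both outcomes now equally likely, ~0.5 each
```

Rotating all the way by pi takes |0> to |1> (probability 1 of measuring 1), which is the "computation is rotation, measurement is collapse" idea in miniature.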