

DARPA's new memristor-based approach to AI - boh
http://spectrum.ieee.org/robotics/artificial-intelligence/moneta-a-mind-made-from-memristors

======
Eliezer
Yeah, that's like saying that you've got a new approach to writing the world's
greatest novel based on using the letter 'y' in place of 'i'. AI is a software
problem and if you don't know how general intelligence works, having a new
kind of transistor really, really, really ain't gonna help you much. Wrong
level of organization.

~~~
nerme
Have you looked at how we write code? It is directly related to the hardware
architecture. Operators over here, memory over there.

Why wouldn't there be a benefit in having the hardware mimic the type of
software they plan on using, namely, neural networks? That way, from the
ground up, you've got these "neurons" that can compute and store information
locally, which is pretty much what you've got in a neural network.

I'm really excited to see what kind of ground-up software architecture is
going to come from using memristors as opposed to the combination of logical
operators and separate memory. It sure as hell isn't going to look anything
like Assembly. :)
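
To make that concrete (a toy sketch of the general idea, not MoNETA's actual design): a memristor crossbar stores a weight matrix as conductances, and applying input voltages to the rows yields output currents that are already the weighted sums a neural-net layer needs, so storage and compute happen in the same place. A plain-software analogue of that in-memory multiply-accumulate:

```python
def crossbar_layer(conductances, voltages):
    """Current out of each column: I_j = sum_i G[i][j] * V[i]
    (Ohm's law per device, Kirchhoff's current law per column)."""
    n_rows = len(conductances)
    n_cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(n_rows))
            for j in range(n_cols)]

# 2x3 crossbar: each column acts as one "neuron" summing its weighted inputs.
G = [[0.5, 1.0, 0.0],
     [0.25, 0.0, 2.0]]
V = [1.0, 2.0]
print(crossbar_layer(G, V))  # [1.0, 1.0, 4.0]
```

In software we loop over the multiplies; in the crossbar, physics does the whole sum in one step, which is exactly the "memory and operators in the same place" point.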

~~~
naveensundar
There is really no mathematical advantage that NNs have over other approaches,
namely statistical, optimization, and formal symbolic approaches/algorithms,
which are (imho) better engineering tools as they are more amenable to
analysis. At the same time, NNs are not mathematically inferior to other
approaches. The only advantages of NNs are sounding cool and being
superficially similar to our brains. Consider this simple and often repeated
analogy: sticking a beak and feathers on your toy airplane won't make it go
faster. In real AI research, NNs are just a bullet point in the huge field of
Machine Learning (<http://ijcai-11.iiia.csic.es/calls/call_for_papers>).

AI people should stop throwing around cool names and instead build things
that are real (please do not start another AI winter). Watson is a refreshing
step in the right direction.

~~~
pjscott
In this case, neural networks have the advantage of looking similar enough to
networks of memristors that you may be able to cram a big neural net into a
small, low-power chip. It's not a huge breakthrough in AI, but it could prove
very handy for some applications.

------
pnathan
This is so _bollocks_. And from the IEEE no less. Shame on them.

A Turing-complete architecture is a Turing-complete architecture. There is
_no_ difference in computational ability between Turing-complete
architectures.

The advance here is in hardware architecture, not in AI. Make no mistake, it's
an interesting approach - possibly the memristor will allow a neural network
to be a practical chip, which has not proven feasible to date. But it's not
going to open the magic gates to Big AI.

The mathematical, philosophical, and algorithmic aspects of AI are simply not
understood in any great depth. Say, what is intelligence anyway
(rhetorically)?

I look forward to seeing the memristor research results, and I hope some
useful hardware neural network advances come from it.

------
lars512
The article comes across as either boastful or gushing about memristors, and
hand-waves over as many hard unsolved problems as old AI predictions ever did.
This makes it quite hard to take seriously. Prefix all these bold statements
with: "If MoNETA is successful in its goals..."

I have no doubt that trying to simulate mammalian brains with memristors will
lead to some great new AI advances, but I won't hold my breath that it will
solve all AI problems simultaneously, and neither should you.

------
izendejas
I'm staying healthy as long as possible. An exciting future awaits.

Forget going to Mars... the next frontier is understanding and emulating a
mammalian brain, preferably the human brain with its amazing cerebral cortex.
And boy will the memristor and related technology come at a great time when
we're bombarded with an exponential increase in data that we don't yet fully
exploit.

~~~
arctangent
+1 for staying as healthy as possible for as long as possible - not that I'm
actually all that healthy.

There's a good chance that human lifespans will increase dramatically over the
next few decades.

