
Here Comes The Wetware - davidedicillo
http://techcrunch.com/2010/12/04/wetware/
======
tibbon
A major problem here isn't just the interface technology, but that we have
such a poor understanding of how the brain works (my girlfriend is working on
her PhD in Neuroscience and I hear about this almost daily). Consequently, our
interfaces will be crude and clunky at best for a long time.

We've barely got the faintest idea of how to measure some things that would
seem really basic. Memory? Perception? Forget it. When using one of these
you're basically training your brain to stimulate huge areas in order to get
some readout. As it turns out, we aren't great at this. Even playing a game
as basic as Pong is very hard with this type of interface, and it isn't as if
we're going to get significantly better at it anytime soon.

It's about as graceful as trying to make your entire leg smash forward in order
to press a small delicate button. We can hit things, but when you have lots of
small delicate buttons, the resolution isn't going to get much better since
we're still smashing blindly.

~~~
kermit_de_fro
To be fair, we can make huge steps forward once the interface technology has
improved resolution. I will trust your girlfriend's expertise, but I do know
from mine in Machine Learning that we can extract a surprising amount of
signal without knowing the neurophysiological processes which generate the data.

A huge problem right now with the products coming out of Emotiv and similar
companies is that the level of noise is enormous and the resolution of the
sensors is very poor. As we continue to remove the noise from the data (and
come up with better sensors!), we can train better and better models to
differentiate these various activities.

One PI working on this project whose work I am familiar with is Mike D'Zmura
at UC Irvine. <http://cnslab.ss.uci.edu/muri/research.html>

~~~
tibbon
From a non-scientist's point of view, one issue I can see humans having with
these devices is a lack of feedback. I've tried a few of these and it takes a
few seconds to know if it's working or not. That's a pretty unnatural feeling
for a human, since we normally know quickly whether an action of ours had an
effect. Like you said, there is a huge amount of noise that makes it very
difficult, and much of the lag we have with feedback is due to us trying to
smooth out this noise and find a pattern.
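That smoothing-vs-lag tradeoff is easy to see in a toy sketch (synthetic data, nothing from an actual headset): the harder you smooth the noise, the longer it takes a real change of "intent" to show through.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical noisy sensor stream: the user's "intent" switches on at t=100.
intent = np.concatenate([np.zeros(100), np.ones(100)])
raw = intent + rng.normal(scale=1.0, size=200)

def ema(x, alpha):
    """Exponential moving average; smaller alpha = more smoothing, more lag."""
    out = np.empty_like(x)
    acc = 0.0
    for i, v in enumerate(x):
        acc = alpha * v + (1 - alpha) * acc
        out[i] = acc
    return out

for alpha in (0.5, 0.05):
    smoothed = ema(raw, alpha)
    # first time after the switch that the estimate crosses 0.5
    detect = 100 + int(np.argmax(smoothed[100:] > 0.5))
    print(f"alpha={alpha}: crossed 0.5 at t={detect} (lag {detect - 100})")
```

The heavily smoothed stream is much less jittery but takes noticeably longer to register the switch — which is exactly the seconds-long "is it working?" feeling.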

Hopefully one or two smart discoveries can give us a nice little leap forward
until we have a better brain model to work with, bringing the technology to at
least a semi-usable level, since as it stands it is pretty much a novelty at
best.

------
frisco
God, not another EEG-based BCI article (especially of Emotiv, whose core
competency as a company is mostly being a hype machine). This stuff is trash
from a neural interfacing perspective; your keyboard is much higher throughput
and better fidelity, and as much a "brain interface" as this is (it translates
brain activity into computer signals!).
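The throughput gap is easy to ballpark (all figures here are my own rough assumptions, not measurements — ~60 WPM typing, ~5 bits per character, and an optimistic tens-of-bits-per-minute EEG speller rate):

```python
# Back-of-the-envelope comparison; every number is a rough assumption.
wpm = 60                      # assumed moderate typing speed
chars_per_word = 5
bits_per_char = 5.0           # ~log2(32) for a practical character set

keyboard_bps = wpm * chars_per_word * bits_per_char / 60
eeg_bpm = 25                  # optimistic EEG BCI rate, bits per MINUTE
eeg_bps = eeg_bpm / 60

print(f"keyboard: ~{keyboard_bps:.0f} bits/s, EEG BCI: ~{eeg_bps:.2f} bits/s")
print(f"ratio: ~{keyboard_bps / eeg_bps:.0f}x")   # roughly 60x
```

Even with generous numbers for the EEG side, the keyboard wins by well over an order of magnitude.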

EEG mostly detects scalp, eye, and face muscle activity, plus some very gross
brain state (such as awake/alert versus drowsy). Subjects can get _ok_
performance in demos because this muscle activity gets passed off as "brain
activity", but it's absolutely not a "neural interface".

~~~
Goladus
A keyboard requires fingers or at least some sort of functional limb, however.
This doesn't. It doesn't really have to be called a neural interface, if that
offends you.

For anyone else hung up on this, I recommend mentally substituting the term
"scalp-eye interface" or "hands-free interface" and focusing on the
possibilities instead of the limitations.

------
waterlesscloud
When I was a wee lad, probably 1980 or so, I was a regular at a market testing
group for video games. It was usually just whatever new game was coming soon,
but once it was a headband that you wore, with electrodes. It would control
your Atari 2600 "by thought". I remember playing River Raid with it at the
testing, you could go left and right and fire.

I don't think it actually read your brain activity, I think it was more muscle
control dressed up, but it sorta worked.

Edit- Link to some info on the device. It did in fact work by muscle activity.
[http://www.atarimuseum.com/videogames/consoles/2600/mindlink...](http://www.atarimuseum.com/videogames/consoles/2600/mindlink.html)

------
sp332
With "wetware" on the rise, and with the recent passing of Leslie Nielsen, I
think this is an appropriate time to remind people of the movie _Forbidden
Planet_. Besides being one of Nielsen's few (old!) bits of "serious" acting,
the movie has an interesting point: computer interfaces must distinguish
between unconscious thoughts and purposeful instructions. Otherwise the
machine will start executing instructions directly from the subconscious
("id") whether you really wanted to do those things or not.

<http://en.wikipedia.org/wiki/Forbidden_Planet>

~~~
Eliezer
[http://wiki.lesswrong.com/wiki/Generalization_from_fictional...](http://wiki.lesswrong.com/wiki/Generalization_from_fictional_evidence)

~~~
sp332
Well of course I expect people to examine the idea critically instead of
lifting it straight from a 1956 movie and applying it to next-gen interaction
design problems. It's just an idea I think people should consider.

------
peregrine
I'd love to play with one of these and just watch the inputs during various
mind states: resting, relaxing, watching TV, playing music, listening to
music, drunk, high, bored, focused, reading, panicked, etc. I don't know what
I would do with the data; I'd probably end up posting it online to see if
someone can do something with it.

Idk, I feel like with practice you could get pretty good at using it. But at
a $300 price tag, it looks like I'll be waiting for someone else to get after
those ideas.

------
Twisol
> _"It can be a transformational experience," Garten says, of the moment users
> first don a headset. "For the first time, you’re consciously interacting
> with your own brain."_

So I was at WonderWorks < <http://wonderworkspcb.com/index.php> > in early
November, and they have a little game that works based on your brain waves. You
put a headband with metal nodes on your head, and it moves a ball forward
based on how calm you are. You sit down across from someone and relax as much
as possible, and whoever is the calmest gets the ball closer to the goal.

(edit: I found the page for the "mindball" game here:
[http://wonderworkspcb.com/Content/37/Mindball_Mental_Challen...](http://wonderworkspcb.com/Content/37/Mindball_Mental_Challenge_Wonders_Panama_City_Beach_FL__))

I have to say, it was very cool. There wasn't as much shock and awe as I'm
sure the headsets in this article must inspire, but you can put that down to
the primitive interface I was using. It was still _extremely_ interesting,
because I really felt like I was "consciously interacting with my own brain".
Making something happen just based on how calm you are... I couldn't get the
smile off my face afterwards.

(If you're curious, I won every time.)

------
rgrieselhuber
Maybe I've been living in a cave but I had no idea that tech like NeuroSky and
Emotiv was so cheaply available. Anyone with experience on which one is
better?

~~~
daeken
I can't speak for the NeuroSky, but I reverse-engineered the protocol and
crypto used for the Emotiv to create the open source Emokit (
<https://github.com/daeken/Emokit> ) a couple months back. While no one has
really done anything with it yet, it opens up some fun stuff. The Emotiv has
the benefit of more sensors, and it isn't much more expensive. I'd love to see
someone do something really cool with it.

~~~
toby
I just found that 20 minutes ago and was so excited that I ordered an Emotiv
right away. Great to see that you're an HN user, I'm looking forward to
playing with the Python library.

~~~
daeken
Awesome. Feel free to hit me up on #emokit on irc.freenode.net. One note: the
new headsets _may_ have a different key than the older ones. This has not yet
been confirmed, but if it is the case, I'll get it working again regardless.

------
mortenjorck
> _We’re still a long way from real wetware (direct brain-computer
> connections) . . . but last week an NYU professor had a digital camera
> implanted in his head. It’ll be many years (if ever) before that goes
> mainstream, but the line between the mind and its tech is growing finer._

So, really, here doesn't come the wetware. But the author is correct to
disclaim "if ever." The simple reality is that medicine isn't anywhere _near_
the point yet where sticking something in transcranially is anything apart
from a major, expensive, and risky operation (the NYU prof's camera implant is
transdermal, but stops well short of the cranium). The popularity of breast
augmentations and the like may prove the societal acceptance of elective
surgery, but until medicine has advanced to the point where you can do a brain
job as routinely and safely as a boob job, it's not going to happen.

------
rcfox
I don't know about the author of the article, but I've had wetware for the
past 25 years.

(The wetware is your brain.)

------
drallison
Video of Emotiv demo in Stanford EE Computer Systems Colloquium, April 9,
2008. <http://news.ycombinator.com/item?id=1970408>

------
dhughes
Years ago Mind Bowling was a commercial game that used biofeedback (not
brainwaves) to control the movement of the ball.

