
What If the Future of Technology Is in Your Ear? - caseyf7
https://backchannel.com/the-future-of-technology-is-in-your-ear-8c276b810413
======
TomFrost
The problem I've always had with audio interfaces is that input is not
private. Requests on public transportation are heard by many. Requests at home
turn into a conversation with the roommate, spouse, or children. Requests
walking down the street make others question the mental wellbeing of someone
apparently talking to themselves.

I'm reminded of the Ender's Game sequels in which the protagonist wears a
small earpiece with an AI named Jane. He communicates with Jane by
"subvocalizing" -- mentally saying the words, physically barely uttering a
sound. The AI understands.

A few years ago there was a TED talk (forgive me; I'm unable to find the link)
in which a similar technology was demoed. Sensors placed around
the throat, combined with EEG sensors around the temple, allowed a man to
transmit text to a computer by following all the mental and muscle processes
of speaking, stopping short of moving his lips in an obvious fashion or making
sounds. The sensors allowed the computer to translate their input to actual
words.

Perfecting and miniaturizing that technology, then combining it with an in-ear
AI, would be a game changer.

~~~
rmac
The future is brain-to-text, where our thoughts get piped directly to
Alexa/Siri/Cortana/Now

[http://www.kurzweilai.net/brain-to-text-system-converts-speech-brainwave-patterns-to-text](http://www.kurzweilai.net/brain-to-text-system-converts-speech-brainwave-patterns-to-text)

No need to utter words in public.

~~~
erikpukinskis
Maybe. I support brain-computer interface research, but I don't think you
should be so certain that it's the future.

Imagine you could stimulate the brain directly: what would that look like? If
it's visual input, it's going to look a lot like stimulating the visual
cortex, right? So you're suggesting we will invent some sort of biomedical
device that can project an image onto some region of cortex. That would be
neat!

But isn't there already a device that can project an image onto a region of
cortex? There is! It's called the eye! What is really gained by stimulating
the visual cortex directly, versus just having, say, an eyebrow piercing that
does the same thing by coming in through the eyeball?

Well, ok, maybe you'd like to get a little more bandwidth than the eye can
provide? But I'm not sure that really makes sense. The eye takes in a LOT of
bandwidth, and most of the time, unless we are fighting a bear or trying to
run a football through a bunch of bodies, we're not even using all of the
bandwidth.

Maybe you want to be able to look at the world and get input subconsciously
without messing up your beautiful view. But you can just dedicate some small
number of "pixels" of your visual field to non-visual data. If your brain
can't integrate that data with the visual field, it will just become a blind
spot, and it will be as invisible to you as your current blind spots are.

Really, our perceptual system _could_ be much more powerful, but beyond a
certain point there's no reason to have more bandwidth because the existing
stuff is all a brain of our size can really synthesize in realtime. You could
add additional inputs to your brain, but your brain will just grow around
them, dulling your connection with other parts of your body. That's not
necessarily bad, but, again, why not just use those parts of your body for the
input?

The planet Earth has spent about a billion years trying to design the perfect
input/output interface for the human brain and it's done a pretty good job.
I'm not saying we can't improve on it, I'm just saying it's going to be hard
to do any optimizations at the hardware level. And we don't really need new
hardware to plug in to computers, because our brain is already happy to remap
itself to virtual inputs! Pretty much all of the things people imagine doing
with a BCI can be done more efficiently and less obtrusively with augmented
reality.

~~~
chongli
_Imagine you could stimulate the brain directly, what would that look like?_

It needn't 'look' like any of our senses. Direct brain-computer interaction
could take the form of an internal dialogue, similar to how a person might
mentally rehearse both sides of an anticipated conversation in their head.

~~~
erikpukinskis
OK! So you're talking about sound data then. We also have a really great input
device for that.

How would the direct neural input be better than hearing?

~~~
XorNot
Infinite volume with no hearing damage?

I've never been able to find a satisfactory answer on whether the cochlear
implant behaves like that. I somewhat suspect nobody with hearing who developed
actually knows.

~~~
erikpukinskis
You mean like this?
[https://en.wikipedia.org/wiki/Up_to_eleven](https://en.wikipedia.org/wiki/Up_to_eleven)

------
combatentropy
What I would like is less complex but still very hard: wireless earphones that
somehow self-charge, from radio signals, the sun, body movement, or whatever,
so you could leave them on all day and never have to worry about charging
them.

Couple that with a simple and easy way to switch between your phone and
laptop. I mean these mainly for listening: to music, a video, a game, etc. But
when a phone call comes in, you could switch to your phone and use them as a
wireless headset.

Finally, make them stylish. Not ornate and gaudy, but sleek and thin, with
good metal, like jewelry, and everyone will just wear them all day long
without self-consciousness.

We have to figure out how to make them self-charging though. That's the hard
part.

~~~
schmappel
How about equipping them with wireless charging? Its charging station could be
a little bowl that you'd drop them in at night before you go to bed. Easy as
pie.

~~~
combatentropy
Yes, that would work when the battery life of wireless earbuds gets to about
18 hours. Big wireless headphones can last that long, even much longer, but
most small wireless earbuds die after eight.

------
sna1l
[http://www.bragi.com/](http://www.bragi.com/) -- I think this company is
trying to do something like this

------
Mindless2112
Jarvis [1] seemed pretty neat in 2014 when Intel announced it, but I've heard
nothing of it since. It seems like a much better bet than Google's Glass.

[1] [http://www.theverge.com/2014/1/6/5282416/intel-unveils-jarvis-its-smart-headset](http://www.theverge.com/2014/1/6/5282416/intel-unveils-jarvis-its-smart-headset)

------
dmd
I wonder how much this earbud costs. Maybe he should mention it's $13 eleven
times, instead of just ten.

------
sklogic
I could never stomach any podcasts (same thing with videos, of course). It is
a slow and boring way of communicating. You can read a text much faster, you
can skim, and you can easily go back to any point in the text.

Audio is so inferior that I cannot comprehend why all those podcasts even
exist.

~~~
noir_lord
I agree entirely, which is why I read the articles instead of using a podcast.
It was working great until I hit that tree...

The utility is that you can listen to them while doing other things. Also, if
the speed bothers you, speed it up.

I listen to most podcasts at 120-130%.
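For downloaded episodes you can also bake the speedup in ahead of time. A
rough sketch using ffmpeg's `atempo` filter (assuming ffmpeg is installed;
`episode.mp3` is a stand-in filename, and the first command just synthesizes a
test tone so the example is self-contained):

```shell
# Make a 10-second 440 Hz test tone; in practice, start from your
# downloaded episode.mp3 instead of generating one.
ffmpeg -y -f lavfi -i "sine=frequency=440:duration=10" episode.mp3

# Re-encode at 1.25x speed. atempo preserves pitch; it accepts
# factors from 0.5 to 2.0 per filter instance (chain it for more).
ffmpeg -y -i episode.mp3 -filter:a "atempo=1.25" episode_fast.mp3

# Check the result: the sped-up file should be roughly 8 seconds long.
ffprobe -v quiet -show_entries format=duration -of csv=p=0 episode_fast.mp3
```

Most podcast apps do this on the fly anyway, but pre-processing is handy for
players that don't offer a speed control.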

------
rpgmaker
This was one of my biggest takeaways from _Her_ (the movie). Maybe the audio
interface as represented in the movie _is_ the future.

------
mgberlin
Maybe I'm missing something but I can't seem to find the product being
advertised in this article for sale anywhere.

