
What I’ve learned from 35 years of wearing computerized eyewear - twentysix
http://spectrum.ieee.org/geek-life/profiles/steve-mann-my-augmediated-life?
======
sudhirj
Steve Mann: Old-fashioned welding helmets use darkened glass for this. More
modern ones use electronic shutters. Either way, the person welding merely
gets a uniformly filtered view. The arc still looks uncomfortably bright, and
the surrounding areas remain frustratingly dim.

Me: Hasn't this guy ever heard of HDR? He could have just used a couple of
video cameras with some processing.

SM: A few years before this, I had returned to my original inspiration—better
welding helmets—and built some that incorporated vision-enhancing technology.
[...] These helmets exploit an image-processing technique I invented that is
now commonly used to produce HDR (high-dynamic-range) photos.

Me: Oh. Right.
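(For context: the HDR technique combines several differently exposed frames of the same scene so that neither the bright arc nor the dim surroundings are lost. A minimal exposure-fusion sketch in Python, illustrating the general idea rather than Mann's actual comparametric algorithm, assuming the frames arrive as float arrays in [0, 1]:)

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Naive HDR-style fusion: weight each pixel by how well exposed it
    is (close to mid-gray), scale each frame by its exposure time to get
    a rough radiance estimate, then average across frames."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))  # favor mid-tones
        acc += w * (img / t)        # longer exposures imply dimmer radiance
        weight_sum += w
    radiance = acc / np.maximum(weight_sum, 1e-8)
    return radiance / radiance.max()  # squash back into [0, 1] for display
```

Real HDR pipelines first estimate the camera's response curve before merging; this sketch just trusts the mid-tones of each exposure.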

~~~
jacquesm
Steve is the kind of person that makes you question your assumptions about
just about anything.

Doing a 'why doesn't he just' on him means you're going to have to do the
equivalent of 6 months of continuous reading first if you want to avoid making
a fool out of yourself, so I wouldn't worry about not knowing about his
connection to HDR.

Steve and I had some interesting exchanges back in '95 or so when video on the
web was still a novelty. Steve went on to make history with his series of
inventions.

What's extremely impressive to me is Steve's incredible faith in his own
inventions: no clinical trials on others, but dog-fooding in the extreme,
bolting things onto (and into) his body to augment his world. He's a real
pioneer in every sense of the word.

~~~
omegant
Is this guy receiving any industry support? Is anything he has invented
available to the public?

~~~
jacquesm
<http://en.wikipedia.org/wiki/Steve_Mann>

------
jwr
I don't understand why Google hasn't hired Steve Mann yet, at least as a
consultant on the project. Seems like hubris to me: this guy has been testing
wearable systems for 20 years or so and knows more about the experience than
anyone on the planet.

~~~
nirvana
Because Steve Mann is into wearable computers and google glass is not a
wearable computer project. Your wondering this is understandable, because
google has misled you. But glass has no CPU; there's not enough power or
space for one. The intelligence lives on google servers, where the speech
recognition and everything else is done, and glass is useless without a net
connection. (So you have to be in an area with wifi or have a smartphone
handy to tether to.)

Steve Mann has been working on head mounted displays, true, but he's been
focusing on local horsepower wearable computers.

Ultimately, I don't think google glass is anything more than a PR project to
remove the stigma of google as ripoff artists and to make it look like they're
innovative.

Given current technology, glass on wifi should have about 20 minutes of
battery life, maybe an hour. Which makes them pretty useless.

There's really quite a difference between a wearable computer (what Mann works
on) and a bluetooth headset with integrated display (what glass is.)

~~~
buro9
I agree with your post in its entirety, except for the bit on battery life:
I spoke to an engineer using Glass a few weeks ago, and 5-6 hours with real-
world use was the touted number (for a mix of Wi-Fi and Bluetooth use).

I still feel that is very poor though, for Glass to really be useful it should
last a long working day, and ideally all of your average waking hours.

It remains a bluetooth headset with integrated display and camera. The phone
and some remote servers do the real work.

~~~
enraged_camel
If 5-6 hours is the touted number, the real number is most likely closer to 4
hours.

~~~
buro9
7 hours was the touted number; 5-6 hours was what he said he got in real-world
usage.

------
kiba
To me, Google Glass is worth a buy for just one reason:
sousveillance (or inverse surveillance).

When somebody crashes into your house, or you witness a crime in progress, or
need to remember an important detail related to a business dealing, it just
may be worth the 1500 dollars you spent to get a Google Glass.

Other applications of Google Glass may provide utility on a daily basis. I can
imagine getting 10 dollars' worth of useful service from Google Glass every
day, and 5 dollars in security benefit for the surrounding society. Multiply
that by 365 days, which is 5475 USD in economic benefit every year. Not to
mention high-value recordings such as records of criminal activity or abuse
of authority by cops.

(Of course, if you're too poor, then Google Glass isn't worth 1500 USD even
if it may someday be worth 1500 USD of value to you.)

~~~
greenyoda
Recording in a public place is generally legal, but recording a private
conversation, like a business meeting, is more complex. The laws vary
depending on state (in the U.S.), and some states require the consent of all
participants. I'm not sure that I'd want to wear a device that would expose me
to felony charges if I inadvertently recorded some sensitive conversation that
I didn't have permission to record.

Also, private property owners (like store owners) can eject you for trying to
record video on their premises. There are some places where recording devices
will never be welcome, such as movie theaters, sporting events and workplaces
that deal with confidential information (e.g., a doctor's or lawyer's office,
or even a start-up company whose product wasn't yet announced).

And I'm pretty sure that even Google wouldn't be happy if all their employees
wore these to work every day. Would your work colleagues or managers speak
candidly with you if they knew that their every word might be getting
recorded?

~~~
Karunamon
>Would your work colleagues or managers speak candidly with you if they knew
that their every word might be getting recorded?

If I recall correctly, there's a light visible on the outside of the eyepiece
(the lens part) that's on whenever recording is happening. There will be no
question if recording is going on or not.

------
dbbolton
>The impact and fall injured my leg and also broke my wearable computing
system, which normally overwrites its memory buffers and doesn’t permanently
record images. But as a result of the damage, it retained pictures of the
car’s license plate and driver, who was later identified and arrested thanks
to this record of the incident.

This image retention also happened when he was attacked at a McDonald's in
Paris last summer: [http://www.huffingtonpost.com/2012/07/17/steve-mann-
attacked...](http://www.huffingtonpost.com/2012/07/17/steve-mann-attacked-
paris-mcdonalds-digital-eye-glass-photos_n_1680263.html)

Here's his account: [http://eyetap.blogspot.com/2012/07/physical-assault-by-
mcdon...](http://eyetap.blogspot.com/2012/07/physical-assault-by-mcdonalds-
for.html)

As far as I know, nothing ever came of it (i.e. there were no charges or
settlements).

~~~
staz
Yes, it seemed to me like he has already used this excuse.

Yeah, you don't have to worry, my glass never records anything... well,
except when it's super convenient for me, then it magically breaks just at
the right moment.

------
jaggederest
I really wish someone would make an early-adopter version of his EyeTap v4/v5
and sell it.

Would I pay cost plus 50% for one? Absolutely. Up to and including car-level
prices. This is HN, a startup, anyone?

~~~
rasur
I've been a follower of Steve's work since meeting him in '96 at work, and I
can only echo your sentiments here. The EyeTap would be my preferred solution
(compared to Glass, for example).

~~~
rdl
I've been planning to make one myself once the Microvision laser-scanning
displays got reasonable, but I've been waiting about 15 years for that, so
it's probably a lost cause.

The area which I think would be super-interesting and easy would be pure audio
mediated reality. Vision is hard, but I could do audio for $500. I have
shooting earmuffs which essentially do this already -- they have microphones
and speakers, and amplify soft sounds while attenuating loud sounds.
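The earmuff behavior described above, boost the quiet and clamp the loud, is essentially block-wise automatic gain control. A rough sketch of what the $500 audio version might do (all parameter values here are made up for illustration):

```python
import numpy as np

def electronic_earmuff(samples, rate=16000, block_ms=10,
                       target_rms=0.1, ceiling=0.3, max_gain=8.0):
    """Block-wise automatic gain: boost blocks whose RMS falls below the
    target, attenuate blocks above it, and hard-limit everything at the
    ceiling -- roughly what electronic shooting earmuffs do when they
    pass speech through but clamp gunshots."""
    block = max(1, int(rate * block_ms / 1000))
    out = np.asarray(samples, dtype=np.float64).copy()
    for i in range(0, len(out), block):
        seg = out[i:i + block]
        rms = np.sqrt(np.mean(seg ** 2)) + 1e-12
        gain = min(target_rms / rms, max_gain)  # quiet blocks get boosted
        out[i:i + block] = np.clip(seg * gain, -ceiling, ceiling)
    return out
```

A real mediated-audio device would add attack/release smoothing so the gain doesn't pump between blocks, but the core idea is this simple.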

------
codeulike
_The second issue, the eyestrain from trying to focus both eyes at different
distances, is also one I overcame—more than 20 years ago! The trick is to
arrange things so that the eye behind the mirror can focus at any distance and
still see the display clearly._

Sounds like he has some great insights here. He's also known as 'the world's
first cyborg' (<http://en.wikipedia.org/wiki/Steve_Mann>), and the lonely
trail he seemed to be on is now shifting to the mainstream.

~~~
frozenport
Certainly not lonely; it's a gimmick that gives him legitimacy in his
professional field.

~~~
enraged_camel
Uh, please. He has very real and legitimate contributions to his field.

------
cromwellian
I think eyestrain would be a factor if you're using these systems as an
augmented display that's constantly on, but I don't think the point of these
devices is to be constantly looked at; rather, you engage them as needed and
otherwise let them get out of the way.

His devices in the pictures are shown to get in between the eye and the
external world, whereas, if you look at glass, the screen is up and out of
your line of sight.

I think if your wearable tech display is always on and continuously visible,
it'll be a problem, battery life will be negatively impacted, and the device
will distract you constantly.

~~~
dbaupp
I think that the problems you mentioned are either obvious (clearly being
always on will reduce battery life) or just require better design: if the
device is distracting, it needs to be adjusted so that it isn't. (As an
example, it could usually be a pass-through, displaying only a tiny
information panel out of the line of sight like Glass, and only expanding
over the whole field when some special feature is activated. Being
artificially limited to never being an overlay isn't better than being able
to be both an overlay and just an info panel.)

The author posits that Google placed the screen out of the line of sight to
avoid vision misalignment and misadjustment problems.

(Also, the eyestrain due to focus distances was mentioned, and apparently
solved by using an "aremac": a pinhole camera in reverse which means the video
is focused at every distance.)

~~~
gammarator
The author actually says that Google's design _creates_ eyestrain by requiring
viewing at a fixed focal distance.

------
splicer
_As I went to speak with the driver, he threw the car into reverse and sped
off, striking me and running over my right foot as I fell to the ground._

Probably because the driver was like "AAHHHHH!!!! A FREAKIN' CYBORG!!!!"

~~~
pwelch
Thought the exact same thing.

------
Swizec
All I really want from computerized eyewear is telemetry for sports.

Current time/pace/distance/route/whatever when I'm running.

Current speed, next corner severity/distance when I'm longboarding.
([http://swizec.com/blog/ifihadglass-the-app-i-want-to-
build/s...](http://swizec.com/blog/ifihadglass-the-app-i-want-to-
build/swizec/6035))

That alone would be worth the money to me. Such things already exist for
skiing goggles, but those aren't extensible and only really fit one sport. So
that's no good.

~~~
berntb
All I really want is a network connection, an 80-character-wide Emacs
terminal in the window, and a chording keyboard strapped to my hand. This
assumes a small Linux (BSD?) distro installed. The rest is trivial.

(Iirc, this is a setup one of Mann's students had.)

Edit: OK, I do want a camera too. And video log. And... But 80+% of usability
would come from Emacs lisp (or short scripts run from shell)

Edit 2: Love HN. I comment about a setup I read about years ago and have been
waiting to buy the hardware for -- and of course get answers. (I assume that
without employment contracts, they would have been more detailed.) Thanks.

~~~
ronyeh
Remembrance Agent <http://www.remem.org/> was created by Bradley Rhodes, one
of the group of "cyborgs" doing wearable computing research at MIT at that
time.

... and after a quick search, it looks like he works at Google, most likely
on Glass.

~~~
saulrh
I've actually worked with Thad Starner and the Remembrance Agent before. Let
me tell you, it's even cooler than it looks.

Of note, there's a version of the RA that has additions that make it more
suitable for use on wearable computers. The first item in the papers section
(Using Physical Context..., 2003) describes all the extra stuff that the
wearable version does.

------
rogueSkib
Visual perception is truly an amazing thing! The author's anecdotes about
vision alteration and the brain's ability to adapt were very interesting to
me. I have nystagmus: my eyes move back and forth quickly all the time. I've
often wondered what it looks like to see without the movement; however, that
_is_ how I see: I don't notice the movement at all. My vision with contacts
doesn't get much better than 20/40, so I do experience the effects of the
movement. I tend to think of my vision as if it's an example of two-point wave
interference: <http://en.wikipedia.org/wiki/Interference_(wave_propagation)>
The further away an image is from my focal points, the more the interference
from movement affects my brain's ability to piece it all together; it's
similar to tunnel vision, but instead of darkness on the periphery, it's
progressively more blur. To see most clearly, I have to tilt my head to the
side, to my "null point" where my eyes move the least. Not to mention my head
moves often in some sort of sync with my eyes, especially while reading; once
in school, a substitute teacher raised his voice angrily, thinking I was
shaking my head at his work on the board!

I'm curious how Google and other developers of high-tech eyewear will account
for us with out-of-the-ordinary eye conditions. If the glasses or certain apps
rely on eye movements for communication, we probably couldn't use them.

------
woodchuck64
> But as a result of the damage, it retained pictures of the car’s license
> plate and driver, who was later identified and arrested thanks to this
> record of the incident.

[McDonald's assault]:

> when the computer is damaged, e.g. by falling and hitting the ground (or by
a physical assault), buffered pictures for processing remain in its memory,
and are not overwritten with new ones by the then non-functioning computer
vision system.

Fragile design or, say, accelerometer-controlled backup memory?
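The behavior described is a ring buffer that simply stops overwriting when the system dies. The accelerometer-triggered variant could look like this sketch (class name and threshold are illustrative, not Mann's actual design):

```python
from collections import deque

class ImpactBuffer:
    """Ring buffer of recent frames, continuously overwritten in normal
    operation, that can be frozen (e.g. by an accelerometer spike) so the
    frames leading up to an impact are retained rather than discarded."""

    def __init__(self, capacity=30):
        self.frames = deque(maxlen=capacity)  # oldest frame drops out
        self.frozen = False

    def push(self, frame):
        if not self.frozen:          # normal operation: keep overwriting
            self.frames.append(frame)

    def on_accelerometer(self, g_force, threshold=3.0):
        if g_force > threshold:      # crash detected: stop overwriting
            self.frozen = True

    def dump(self):
        return list(self.frames)
```

The point of the design is privacy by default: nothing survives unless something abnormal happens.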

------
WestCoastJustin
Relevant: Steve Mann wrote "Physical assault by McDonald's for wearing Digital
Eye Glass" [1] back in July 2012. It speaks to the stigma Google Glass is
likely to face.

[1] [http://eyetap.blogspot.ca/2012/07/physical-assault-by-
mcdona...](http://eyetap.blogspot.ca/2012/07/physical-assault-by-mcdonalds-
for.html)

------
hnriot
I wonder how much wearing a contraption on his head contributed to his getting
hit by the car. Even state of the art high-end viewfinders with millions of
pixels have frustratingly long lag, enough to easily take away the reaction
speed edge gained from millions of years of evolution.

~~~
wes-exp
Besides lag, computerized vision could be seriously distracting. People
already nearly get hit by cars staring into their smartphones.

------
coldtea
> _What I’ve learned from 35 years of wearing computerized eyewear_

Apparently nothing about fashion?

~~~
lilsunnybee
i don't know why Google doesn't team up with some lead designers in the
eyeglasses and sunglasses business, and come up with some actual _stylish_
shades for Google Glass. Maybe they just aren't at that stage yet.

~~~
antiterra
They apparently are: [http://www.fastcompany.com/3006116/where-are-they-
now/google...](http://www.fastcompany.com/3006116/where-are-they-now/google-
glass-warby-parker-frame-help-design)

------
ronyeh
(In the far future) if Google Glass could automatically upload video of what
you're seeing to your cloud storage, you could have a searchable log of your
entire life.

Reminds me of these projects:

<http://en.wikipedia.org/wiki/MyLifeBits>

<http://en.wikipedia.org/wiki/Microsoft_SenseCam>

Maybe a V1 of this could have Google Glass take a photo every minute. You
could upload it automatically to Evernote or your private G+ photo feed. Then,
you could occasionally review and "star" the important moments of your life
(and maybe even delete/summarize chunks that are less important).
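That V1 could be little more than a timed capture-and-upload loop. A sketch, where `capture` and `upload` are caller-supplied stand-ins for the device camera and the cloud API (neither of which is assumed to exist in this form on Glass):

```python
import time

def lifelog_loop(capture, upload, interval_s=60, max_shots=None):
    """Take a snapshot every `interval_s` seconds and hand it to an
    upload callback (Evernote, a private photo feed, ...). `max_shots`
    bounds the loop; leave it as None to run indefinitely."""
    shots = 0
    while max_shots is None or shots < max_shots:
        upload(capture())
        shots += 1
        time.sleep(interval_s)
```

The interesting part would be layered on top: the review/"star"/delete pass that turns a raw photo stream into a summarized life log.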

~~~
seabee
> you could have a searchable log of your entire life.

Obligatory sci-fi cautionary tale: [http://www.channel4.com/programmes/black-
mirror/episode-guid...](http://www.channel4.com/programmes/black-
mirror/episode-guide/series-1/episode-3)

A circular buffer and explicit save is the furthest I'd want to take it.

~~~
networked
As much as I enjoyed watching Black Mirror, bringing a piece of fiction into
this kind of discussion early on is potentially highly problematic [1].

Consider what mentioning The Matrix or Terminator does to a discussion about
AI. What Black Mirror's "The Entire History of You" [2] does to lifelogging
resembles what those films do to artificial intelligence for dramatic
purposes. I highly recommend Less Wrong's article [1] for an in-depth
discussion of this issue.

[1]
[http://lesswrong.com/lw/k9/the_logical_fallacy_of_generaliza...](http://lesswrong.com/lw/k9/the_logical_fallacy_of_generalization_from/)

[2] Trailer: <https://www.youtube.com/watch?v=3bFCqK81s7Y>, plot summary
(spoilers):
[https://en.wikipedia.org/wiki/Black_Mirror_%28TV_series%29#S...](https://en.wikipedia.org/wiki/Black_Mirror_%28TV_series%29#Series_1).

------
Tichy
Couldn't the fashion problem be solved by making hats cool again? Already some
hipsters are wearing them, and they'd provide plenty of space for hiding
computing gear. Perhaps even the projector into the eye could be hidden in the
brim of the hat?

------
shocks
EyeTap, and this: [http://bits.blogs.nytimes.com/2013/02/27/scientists-
uncover-...](http://bits.blogs.nytimes.com/2013/02/27/scientists-uncover-
invisible-motion-in-video/)

I would pay for this. I would pay a lot.

------
ericbb
The pinhole aremac idea is so elegant! Infinite depth of focus and no need to
measure the eye's lens. I wonder if video games and head-mounted displays
designed for gaming will one day take advantage of that.

~~~
Shorel
I hope so.

It seems the Oculus Rift does not have anything similar.

------
drsim
Apart from the potential physiological issues, the author briefly touches on
the sociological impact this may have. In my mind that will be even more
profound.

If you haven't seen it, 'Black Mirror' on TV here in the UK has an excellent
episode where nearly everyone (voluntarily) has an implant which records
everything they see.

Well worth a watch*:
<http://www.channel4.com/programmes/black-mirror/4od#3327868>

*Not sure how available this is outside the UK. It's called 'The Entire
History of You'.

------
aneth4
This guy looks amazing, though he can hardly lament that lessons were not
learned if he did not participate in the commercialization of the technology.

Why is this guy not consulting for Google? And I'm not sure if I'm more
astounded or thankful that he has not patented his research.

------
paulftw
35 years? Does that mean we won't hear about any entertainingly silly patent
claims?

------
kayoone
Computerized eyewear like Google Glass will kill the jobs of Rally Co-Drivers!

~~~
nicholassmith
Interesting idea, but rally drivers probably wouldn't want a level of
distraction in their eye line. Vehicular HUDs have been around for a while
now, so they could have adopted it a while back.

------
maeon3
I spent several thousand dollars testing a few "eyesight for the blind"
products by taking video on a head-mounted camera and encoding the image as a
one-image-per-second audio file transmitted to the ears. I was actually able
to get it to work as advertised, and I believe that given 10 hours a day of
practice for a month, you could develop a sense of depth perception and make
out attributes in your environment through your audio cortex, enough to walk
around slowly without bumping into things. I had a blind friend test out the
best I could do, and although it was a technological marvel, he actually
didn't like it, because it made people ostracise him even MORE than being
blind did. He can move around slowly without bumping into things much more
fashionably with the system he already had: a stick, good hearing, touch, and
memory.
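The kind of encoding described, one image swept into one second of audio, as in vision-to-audio systems like The vOICe, can be sketched as a column-by-column frequency sweep. This is an illustration of the general approach, not the commenter's actual product:

```python
import numpy as np

def image_to_audio(img, duration=1.0, rate=8000, f_lo=200.0, f_hi=2000.0):
    """Sweep a 2-D image (float values in [0, 1]) into `duration` seconds
    of sound: columns are scanned left to right over time, each row gets
    its own frequency (top row = highest pitch), and pixel brightness
    sets that frequency's loudness at that moment."""
    rows, cols = img.shape
    n = int(duration * rate)
    t = np.arange(n) / rate
    freqs = np.linspace(f_hi, f_lo, rows)               # one tone per row
    col_idx = np.minimum((t / duration * cols).astype(int), cols - 1)
    out = np.zeros(n)
    for r in range(rows):
        out += img[r, col_idx] * np.sin(2 * np.pi * freqs[r] * t)
    return out / max(rows, 1)                           # keep amplitude sane
```

This is also why a misplaced brick "leaps out": any anomaly in one row shows up as a blip in that row's tone at a specific moment in the sweep.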

So a few insights:

1\. If you are putting something in front of your eyes, or on your hat brim,
that looks like a hacked-together bunch of cameras and wires, and you wear it
in public, there are millions of years of evolution causing people to
ostracise you. It's so bad that a blind person told me: "The ostracism from
wearing it is worse than the ostracism from them realizing you're blind."

2\. You think you're confident and can handle it? You aren't; inside you are
fighting millions of years of evolution pushing you to remove whatever is
causing the ostracism. If you are the kind of person who can choose to remain
single and lonely for life while you burn with passion for the opposite sex,
then you have the kind of mettle it takes to wear cameras and wires on your
head in public.

3\. The experience I had with converting visual to audio and using my audio
cortex was tremendous. For example, objects that "popped out" at me during
audio-vision were completely different than in normal vision. Take a brick
wall, for instance: I could hear that the distance between the bricks (the
cement) was smaller in one spot and larger in another because of an anomalous
blip in the audio file. When looking at it visually, you think "meh, it's
just a brick wall." With the audio file, the different brick leaps out at you
as an anomaly, exposing the data-structure/algorithmic differences between
the visual cortex and the audio cortex.

Doing visual as audio makes you an infant again, the tiniest changes in things
leap out as fascinating. This experience I had could probably be sold to
people bored to tears with life. A billion dollar idea! Be an infant again.

~~~
Cushman
My intuition is that this is only true insofar as sensory augmentation up to
this point has been mostly useless. The reason you don't see delivery people
or mechanics or whoever with headgear isn't because it's dorky (although of
course it is), but because it just doesn't help that much. We live in a world
that has been very well designed for people with the usual basic senses, so
adding more on doesn't give you that much more information.[0]

So miniaturization is important, but I think the real improvements to be made
are in software. In the coming era, these devices are going to start offering
real-world superpowers. People who never forget a face, or where they put
their keys, or _anything_ really. People who can have a quiet conversation in
a noisy room. People who can do basic computing tasks _subconsciously_ while
having a conversation. People with "spider-sense" who never seem surprised by
anything.

These tools will still _look_ dorky, but the advantages they offer will be so
great that the people who do use them will be very cool regardless.

[0] And I'd augment this claim by noting two examples off the top of my head
of people who _do_ use these technologies professionally: surgeons and fighter
pilots. Both of those jobs involve doing unfathomably difficult things human
beings are incredibly unsuited to do, so they will take every advantage they
can get, hang the cost and the aesthetic.

~~~
fudged71
Brainport seems to be the most useful kind of sensory technology available.
Giving you the ability to 'see' all kinds of different inputs, such as radar,
sonar, UV, vision, balance, etc.

The only problem is that it requires your tongue. And your tongue is where you
talk and eat. Once we can overcome this problem, there are huge implications.

------
nirvana
For the record, this guy has been wearing actual computers.

Google glass is an accessory- essentially a bluetooth headset, display and
camera built into glasses. The intelligence lives on the servers, and glass
needs a bluetooth or wifi connection to talk to the net.

I think google's engaging in a bit of a PR swindle by making people think
google glass is like an iPhone. It isn't; it _needs_ an iPhone or Android
phone to connect to the net.

Consequently it can't replace a smartphone.

I'm also pretty dubious about the battery time it will get, even without
having to run a local CPU.

~~~
icebraining
When did Google try to make people think that?

~~~
DanBC
When reading comments by nirvana you need to realise that anything done by
Apple is good, and anything done by anyone not Apple (but especially MS,
Google, and Samsung) is stupid, or evil, or crooked, or dumb.

"Google is trying to swindle people with dishonest PR stunts" translates into
"Google is doing the normal PR stunts that every company attempts; there are
problems with most PR."

In 1998, researchers with 1000 subjects found a 93% confidence of predicting
whether a comment was made by nirvana or not, just from reading the post,
based on phrasing such as "the real reality distortion field".

~~~
emiliobumachar
I could get a higher confidence just by guessing "not nirvana" all the time ;)

------
seatac
I don't think the article said what the man's purpose was though (the one who
crashed into his home and hurt him badly).

------
rikacomet
this has to go front page!

