
Google Glasses are real, will use two 0.52-inch micro displays - ukdm
http://www.geek.com/articles/mobile/google-glasses-real-micro-displays-20120223/
======
kylemaxwell
The article concludes by asking what we'd do with a wearable computer, but
that's not the right question to ask. We already _have_ wearable computers,
but we call them "smartphones".

Instead, we can think about what uses we have for new _interfaces_ on these
computers. Gesture-based interfaces have existed in browsers for a long time.
And my friends from the subcontinent would say that they've had precursors for
many centuries[1].

HUDs do represent something "new" (in terms of availability, not technology).
So perhaps, every time I look at an ad, I get a red/yellow/green indicator of
their rating by the BBB. Or I look at a printed URL, make a gesture, and it's
bookmarked for me. Think of the data overlays that you could actually use
_and_ could likely be monetized quickly.

What I fear, of course, is selling ads into my eyes based on what I'm looking
at, because everybody (particularly Google) seems to think that advertising is
the primary way to fund the web and mobile apps. I already hate billboards
around town: this seems like an opportunity for a sea change in how we
monetize our work.

[1]: <https://en.wikipedia.org/wiki/Mudra>

~~~
nodata
> We already have wearable computers, but we call them "smartphones".

Smartphones are not wearable computers. I don't wear mine, but I always have it
with me in my pocket. I don't wear my wallet either - it's in my pocket.
(Laptops are also not wearable computers, even though I always have mine with
me.)

~~~
pndmnm
Do you wear your belt, or is it just always with you, hooked through some
loops on your pants?

This isn't meant sarcastically -- the idea of "wearing" something is pretty
tied up with the context of the object (e.g. is it "clothing") and less to do
with how you're actually carrying it around, which can make "wearable"
computing something of an oxymoron depending on how it's interpreted.

~~~
nodata
Whether or not the two things are equivalent in one way ("you have it with
you") isn't what we're discussing. Am I wearing this book in my hand? No I am
not. The ring on my finger? Yes. "Wearable" is a word that means a specific
thing. It doesn't mean "something you have with you".

~~~
pndmnm
Right. So why are you wearing your belt (hooked through some fabric loops on
your pants) and you're not wearing your phone (resting in a fabric pouch on
your pants)?

------
ericHosick
I have a friend who is almost blind. Nothing wrong with his eyes. It is his
visual cortex that has the problem. Can't be corrected with lenses.

I think this type of technology will be of great use for people that have this
kind of genetic deficiency.

As an example: currently my friend uses an iPhone to take a picture of a menu,
holds the display a few inches from his eyes, and zooms in on the picture to
read the menu.

I think he could use these glasses to stream zoomed in video and actually see
what is going on around him.

------
rdl
I wish someone would make glasses using the Microvision laser retina displays.
I worked in the MIT Media Lab wearables group as an undergrad a decade or so
ago, and they were pretty awesome then, but aside from a couple of defense
applications, I've never seen them ship multiple units.
(<http://www.microvision.com/wearable_displays/index.html>)

~~~
randallsquared
They've had these _almost_ available for so long that I now assume there's
some major flaw that they're not talking about.

~~~
ChuckMcM
I've tried to get my hands on their stuff in the past, and one of the issues
appears to be that it's considered 'national strategic', which is code for
"this stuff gives our military an advantage over the other guys". That was as
close as I got to getting an answer on general availability.

That being said you can sign up as a developer and get a 'kit' which has like
two eye pieces and various support stuff for like $20,000 but that was not
something I could invest in to satisfy my curiosity :-)

------
daniel_reetz
I don't mean to be rude, but the author just made this up. The glasses will
not be using "two .52 inch micro displays", unless Google happens to be
working at the height of early '90s tech.

Current video glasses employ Holographic Optical Elements, which are pretty
neat. Turns out that you can make a (limited) holographic representation of
almost any lens. Since holographic films are flat, you can collapse big,
difficult parts of an optical system into nothing more than a flat sheet of
plastic. HOEs, though perhaps not in everyone's thoughts day-to-day, are
definitely present in your day-to-day. There's one behind your LCD right now.
It's a holographic representation of a diffuser, with a carefully designed
(but somewhat limited) diffusion angle.

So the Google Glasses will almost certainly use holographic optical elements.
That doesn't mean that Princess Leia is going to jump out of them, and it
doesn't mean that they're going to display anything in 3D. It means that the
technology used to bend and squish the light into your eye is diffractive, not
refractive (the difference between a pinhole and a glass lens).

They will almost certainly take the form of the extant Vuzix goggles or the
Lumus glasses, both of which are almost market-ready. In both cases, a
micro/pico/nano-projector is fired into a holographic element that compresses
the projection strongly in one axis. So you go from a rectangular projection,
which doesn't fit in the temple of your glasses, to a line projection, which
does, traveling along the temple of the glasses. As the beam travels, it is
bounced into a lens with a mirror, or on the inside of a plastic conduit or
holographic waveguide. The Vuzix lens and the Lumus lens take slightly
different approaches, but essentially they have multipart holographic lenses
embedded in the lens in front of your eye that uncompress the compressed axis
section by section and split the projection out into your eye.

What you see, in the end, is a large image floating out in space. Note that
without tracking hardware, this image does not track your eye, it tracks your
head. So your floating screen will move with your head and not your saccades.
Please note that, optically speaking, the language I am using here is very
coarse.
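
The diffractive-versus-refractive distinction above can be made concrete with
the grating equation, which governs how a diffractive element (like an HOE)
redirects light. A minimal sketch with illustrative numbers - the pitch and
wavelength here are made up for demonstration, not the specs of any actual
product:

```python
import math

def diffraction_angle_deg(wavelength_nm, pitch_nm, order=1, incident_deg=0.0):
    """Grating equation: d * (sin(theta_m) - sin(theta_i)) = m * lambda.
    Returns the diffracted angle (degrees) for order m, or None when
    the order is evanescent (no propagating beam at that geometry)."""
    s = math.sin(math.radians(incident_deg)) + order * wavelength_nm / pitch_nm
    if abs(s) > 1.0:
        return None  # |sin| > 1: no real diffracted beam exists
    return math.degrees(math.asin(s))

# A 1000 nm pitch grating bends green (532 nm) light by roughly 32 degrees
# at normal incidence - the kind of steep redirection from a flat sheet
# that a refractive lens of the same thickness cannot achieve.
print(diffraction_angle_deg(532, 1000))
```

The point of the sketch is only that a flat, thin structure can steer light
through large angles - which is why the bulky relay optics can collapse into
a sheet embedded in the lens.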

The holographic optical elements, and their particular design, are the key
technology here. They can be flat, so the glasses can be made reasonably thin
and small. Micro projectors are almost small enough to fit in the temples of
glasses already (no need for ".52 inch" micro displays, wtf). You can see
images of the Vuzix and Lumus lenses on Google Images and in videos from CES
2012.

Keep your eyes open for the rainbowy rectangles visible in the lens. That's
the HOE. Lumus' HOE is in sections, bouncing the screen out in columns; Vuzix
is either doing much finer columns or has some other approach. Lumus claim
some HD resolution; I think the Vuzix rez is still unclear or varies.

[http://www.kguttag.com/wp-content/uploads/2012/01/Vuzix-Holo...](http://www.kguttag.com/wp-content/uploads/2012/01/Vuzix-Holographic-CES-010.jpg)

[http://www.instablogsimages.com/1/2011/12/15/lumus_see_throu...](http://www.instablogsimages.com/1/2011/12/15/lumus_see_through_hd_video_glasses_8dwmj.jpg)

There are two manufacturers producing this technology - Vuzix and Lumus.

Personally, I've never been so excited about a technology (and I've been
watching this space for years) and I'm so totally jazzed that Google is doing
this. I hope they're using the Lumus stuff because Lumus has been around
forever but won't sell samples to the public. I'll buy these things the moment
they come out and hack the hell out of them.

~~~
Rhapso
"unless Google happens to be working at the height of early 90's tech"

We are working at the height of '90s tech. For a consumer product cheap enough
to mass-produce at a price people are willing to pay ($200-$600), you have to
step backwards in time a bit for embedded-system design.

I've built a wearable using this type of display (the half-inch micro
display), and the biggest issue is that you cannot resolve an image well that
close to your eye without getting an instant eyestrain headache. You need a
lens system to create virtual distance between your eye and the display. This
is also the big issue with the "project the display on/in the lens" approach.
You need the display's focal depth to be similar to the depth at which the
user spends most of their time looking; otherwise switching from the real
world to the display and back is rather headache-inducing.

I'm really excited about this too, but more for main-stream attention on a
field that has been marginalized for over 10 years now.

~~~
daniel_reetz
Wait, sorry, do you mean you (personally) or you (Google)?

I'm aware of other microdisplay-based goggles like the Recon Instruments stuff
- where a microdisplay approach makes more sense, but it's not covering your
FOV.

Have you worked with HOEs at all? Or worn these glasses? It's true that you
need to place the image carefully, but, for example, in the case of the Lumus
display it appears about 10 feet out from you, which was alright for me on the
showroom floor.

At CES 2012 Vuzix claimed they'd have $600 goggles in Q2 2013. This fall they
are doing a monocle for $2500. No pricing at all from Lumus. Of course, I am
appropriately skeptical about these claims, but HOEs are eminently
manufacturable, pico projectors are cheap, and as you say, the renewed
attention on this space is pretty rad. I'm optimistic. If the Google device
doesn't use the future-looking stuff, I'll just buy the Vuzix set and develop
on that.

~~~
Rhapso
OK, pronoun breakdown, because apparently I did that poorly.

We=Engineers in general

I = Me personally

you = potential designer of a wearable display, could be replaced with "one"

If the displays work as well as you describe (I hope they do!) then they would
be great for mass production in a product like this (a glasses-based wearable)
in 5-10 years or so, when they are inexpensive.

~~~
daniel_reetz
Yeah, especially since you are working in this space, you really, really need
to see these. They're not 5-10 years off like they were 5-10 years ago.

------
peterclary
Let me adjust my tinfoil hat here.

Let's assume the following technologies would all make sense in something like
this:

- Forward-facing camera.
- Eye tracking.
- Object recognition (similar to Google Goggles).

So Google can tell what you're looking at, and for how long. Nope. Can't see
any possible risks there from the company which tracks you across the web...

~~~
jerf
I'm still in the early phases of this and not that dogmatic about it, but I'm
noticing in my life I'm starting to trend towards a "fight the cloud; own your
personal computers" party line. Augmented reality can be a greatly empowering
technology, but you should own it. And you don't own the cloud. He who owns
the CPU calls the cycles; be someone who owns, not someone who is owned.

------
54mf
I'd be more excited if these were being built by a company that didn't have a
stake in the smartphone wars, but hopefully they'll be "open" enough to work
with any OS via Bluetooth. Either way, this sounds kind of amazing. I look
like a nerd already, I'll take the risk of looking like more of one to have a
_tiny screen in my goddamn glasses_. This is the future.

------
coreycollins
There has been a lot of talk that these glasses will connect via 3g/4g and
have different sensors to improve the augmented reality experience. Why not
just have a display, power source, and bluetooth module? Then use your phone
for the difficult computing.

~~~
nl
Because the primary reason your phone is as big as it is, is the screen.

A big, bright screen forces a big battery.

I don't know what the battery usage on a HUD screen is, but I suspect a lot
less - they don't require backlighting for a start.

That cuts down on the battery requirements, which cuts down on the size &
weight. (Obviously this is speculation: it wouldn't be a huge surprise if you
had to carry a battery pack wired to the glasses somehow.)

Secondly, the glasses themselves need to do a lot of fast, low-latency image
processing. It isn't at all clear that Bluetooth is an appropriate
low-latency, high-bandwidth connection mechanism for this.
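
A back-of-envelope calculation makes the bandwidth concern concrete. The
resolution, frame rate, and the ~3 Mbps Bluetooth 2.1 + EDR nominal peak used
here are illustrative assumptions, not figures from the article:

```python
def video_mbps(width, height, fps, bits_per_pixel=24):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

# Even a modest 640x480 stream at 30 fps, uncompressed:
raw = video_mbps(640, 480, 30)   # ~221 Mbps
bt_edr_mbps = 3.0                # assumed Bluetooth 2.1 + EDR nominal peak

print(raw, raw / bt_edr_mbps)    # raw video is ~70x over that link
```

Compression narrows the gap, of course, but encoding and decoding add exactly
the latency that real-time camera-to-display processing can't afford - which
is the argument for doing that work on the glasses themselves.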

Having said that, I suspect you'll be able to connect to your phone via
bluetooth if you want - but the primary processing will be done on the
glasses.

------
weirdkid
Can't wait to see what happens to people wearing these while they drive. And
the laws that follow.

~~~
sukuriant
It will be even more fascinating when applications that could be used to
assist in driving become commonplace, too. Potential-hazard detection, in-eye
GPS, etc., could be incredibly useful. ...Reading the latest tweets from your
friends... not so much.

------
gojomo
Can I root my Google Glasses and add software that simulates the sunglasses in
the John Carpenter classic 'They Live'?

(For example: <http://peteashton.com/images/they-live-20100615-034749.jpg> )

------
knieveltech
The grid. A digital frontier. I tried to picture clusters of information as
they moved through the computer. What did they look like? Ships? Motorcycles?
Were the circuits like freeways?

I kept dreaming of a world I thought I'd never see. And then... one day... I
got in.

------
daliusd
OK. It will be possible to see quite realistic 3D, but how will it affect my
eyes?

~~~
jrockway
Goggles will do nothing.

------
k33n
I can't imagine that this will do anything but make people look really nerdy.

~~~
efields
Yeah, and even if it's a 1.0 thing that they expect to perfect over a couple
of versions, wearing something on your face does a lot to your sense of
identity. It needs to look right, which pretty much means it needs to be
well-designed - not Google's strong point.

~~~
gilini
Well, if you think that this will be the first attempt ever to produce a
portable HUD, I believe the impact this will have won't depend that much on
its commercial success.

~~~
vidarh
It's not in any way the first attempt to produce a portable HUD.

It might be the first attempt to produce a "low cost" _mass market_ portable
HUD, though.

~~~
gilini
Yes, I actually meant a marketable product.

