
Bosch Smart glasses: A tiny laser array paints images directly onto your retina - deniscepko2
https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/bosch-ar-smartglasses-tiny-eyeball-lasers
======
database_lost
Start of the article: "My priority at CES every year is to find futuristic new
technology that I can get excited about. But even at a tech show as large as
CES, this can be surprisingly difficult. If I’m very lucky [...]"

End of the article: "Bosch covered our costs for attending CES 2020."

~~~
quelltext
Also: "After making a minor nuisance of myself, Bosch agreed to give me a
private demo"

I mean it's possible they paid for him (and others) and still didn't allow him
to get a demo... but it's unlikely. Also, I feel like IEEE can afford to pay
the attendance costs for a reporter.

So, this is really just promoted content/an ad. Which is fine, and it's even
fine to wait until the end to tell us, I think, but not if the article body
tries to paint a different, organic picture. This wasn't at all necessary for
the content but was simply there to distract from the paid-promo nature of the
piece.

~~~
keanzu
I wondered this myself, but I searched the IEEE website and it appears that
there are several stories about various Bosch technologies, so it is _possible_
that Bosch was expecting this writer to cover their other stuff and wasn't
planning to demo this particular item. Given that Bosch is looking for a
commercial partner and not to sell direct to the public, they _might_ have had
the glasses to show to possible partners and not reporters.

~~~
shkkmo
I don't know if the article was edited, but this is what the article now shows
at the end:

> Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch
> Sensortec, responsible for the Smartglasses Light Drive, was not aware of
> nor involved with the travel support.

So it seems like you are correct. The article could have been much clearer
about this, though.

~~~
keanzu
I can't prove it but at the time I checked and the OP of this thread had an
exact and complete quote of what was on the page.

It has been edited at some point within the last ~2 hours.

------
nradov
As an amateur runner and triathlete I really hope someone will integrate this
technology into suitable smart sunglasses. I would pay $1000 today for such a
product if it actually works. My GPS fitness tracker is a great tool but I
hate having to constantly glance down at my wrist to check pace, distance, and
heart rate. This is particularly annoying when executing a structured
intervals workout based on specific target metrics.

There are existing heads-up display products targeted at cyclists such as the
Everysight Raptor and Garmin Varia Vision. However they aren't practical or
comfortable for runners.

Ideally I'd like the smart glasses to have the following features: ANT+
Extended Display profile. 6 hour battery life. Lightweight with even weight
distribution (not all on one side). Prescription lens compatible.

~~~
tmpz22
I don't trust any of the major tech companies to do a good job with this
product. How long until your weather app needs access to that always on camera
in your glasses?

Practically I also can't imagine how they'll solve weight/battery/networking
issues with the device particularly in athletic settings where you're
literally putting the device through a constant earthquake.

Maybe we should stop wishing for some ridiculous convenience layer to be added
to our lives and just look around during our runs like we've done for
thousands of years.

~~~
nradov
Don't worry about it. None of the major tech companies understand the
endurance sports market and it's too small of a niche to even interest them.

~~~
rtkwe
All it needs to do is define an interface and the sports watch/app companies
will do it themselves. Google doesn't need to understand every use case and
make an app for each one; they made Android Wear, and anyone can make an
appropriate app.

~~~
nradov
Having an app doesn't help if the hardware sucks. Android Wear has been a
failure in the sports market so far due to short battery life, awkward touch
interface, and limited sensor support.

------
sebringj
Social tics, such as frantically tapping the side of your head when someone
tries to "remember" your name, will become normal. I thought it was weird when
people talked to themselves with Bluetooth on, or walked into a street sign
while looking down at their phone... but it's just going to get stranger. I
could easily imagine eye flickering or eye rolling when your brain OS is
rebooting. These glasses, and the contact lenses in the works, will make us
forget having to worry about things we take for granted today, just as our
phones made us forget people's phone numbers or stop caring about knowing how
to get somewhere. The new devices will be contextually aware of the facts we
need, which will just appear as an overlay... no more "let me look it up on
Google". If they're coupled with audio, GPS, and other inputs, they could be
even more proactive, finding things before you even knew you needed them.

~~~
cgriswald
I'm not sure how it can get stranger. I've had guys come up to the urinal next
to me still talking into their ear pieces. It's incredibly rude, bizarre
behavior. Their conversation is taking place in an entirely different context,
and I think that's where their minds usually are too. So they might not think
anything of it when they approach from behind saying something like, "We need
to take care of this right now" too loud and too close to your ear.

~~~
reaperducer
When that happens to me, I make sure to let the person on the other side of
that conversation know where it's being held.

I flush repeatedly, turn on the water, and take out my phone and have very
loud imaginary phone calls, or just cycle through the available ringtones at
high volume.

Some people are still so low class that it doesn't even faze them.

~~~
sebringj
Hmm, you could also pretend you have one yourself and laugh Joker-style,
looking up and talking to the ceiling, making sure the volume of your voice is
uncomfortably loud and inappropriate for the setting you are in. It's not like
that person can call you rude, and it will give you some satisfaction that
you've interrupted their call. Maybe they'll self-reflect at some point and
realize how rude they are themselves. IDK.

------
magduf
This reminds me of a technology from a couple of decades ago that seemed to
disappear: wearable computer monitors. What ever happened to those?

Basically, you wore this thing like eyeglasses on your head, and it had a
small arm that extended in front of your eye (but a little below it). When you
looked down, it appeared like a computer monitor was hovering in front of you.
The very early models were 320x200 resolution in monochrome (red on black),
but I tried one at a trade show in 2000 that I think was 800x600 in VGA color,
which at the time was pretty decent. I'm surprised these never got more
popular; they would be great for laptop computers: you could have total
privacy in your viewing (unlike a normal screen), and with improvements in the
technology you could potentially "see" a much larger screen than a normal
laptop has.

Does anyone else remember these?

~~~
ctdonath
Yes. They were terrible. Best case was Google Glass, which sounded
entrancingly innovative until you tried one and immediately lost all interest.

Resolution was very low, battery life very short, UI very annoying, appearance
very embarrassing.

~~~
csallen
I actually enjoyed my Google Glass, primarily its always-on easy-to-reach
nature. Pulling your phone out of your pocket, turning it on, and unlocking it
is a trivial action, sure. But putting your finger up to your face is still an
order of magnitude faster and easier. As a result, I found myself taking way
more photos and having literally dozens more phone calls with friends and
family. It was pretty interesting.

------
RcouF1uZ4gsC
What are the safety implications of this? If something goes wrong, will people
be blinded? What effect do these lasers have long term on the retina, the
lens, and the vitreous humor of the eye?

I did not see any part of the article address these issues.

~~~
andorov
What if someone hacks it to intentionally blind you?

~~~
Accujack
Just look away. It's just a display device, so at most it's going to display
confusing pictures.

It's not possible to "hack" more output power into lasers with software
changes. Would that it were. You can change the duration of the beam, but
without a Q switch you can't pulse the beam in a way that changes the
instantaneous power.

~~~
robryk
In normal operation, the laser will scan across the retina without long dwell
times at any single spot. Software could likely cause the laser to track a
single point on the retina (I expect the device needs some sort of eye
tracker, and thus a camera aimed at the eye). I don't know if that could
produce a harmful power density.

~~~
Accujack
Given that it doesn't harm the retina when scanning, I'd say "full white" is
the worst it could do, which could be surprising and a bit uncomfortable, but
wouldn't actually damage the eye.

Picture your monitor going all white... bright for a sec, but that's about it.

~~~
robryk
Given that it's scanning, the duty cycle as seen by any part of the eye is
very small. If perceived brightness were a function of power density averaged
over time, then the laser would very obviously have to be capable of being
much brighter than "full white" to create the full-white experience[^].

[^] In reality the perceived brightness is somewhat higher than the mean (i.e.
a light source that's twice as bright with half the duty cycle appears
brighter). I'm not sure how large an effect that is, or whether it has
anything to do with pupil size adjustment.
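The "much brighter than full white" point can be made concrete with some back-of-the-envelope arithmetic. Note the resolution figures below are illustrative assumptions, not Bosch specs:

```python
# Back-of-the-envelope duty-cycle estimate for a scanning retinal display.
# All numbers here are assumed for illustration only.
lines_per_frame = 300        # e.g. 150 line pairs of vertical resolution
spots_per_line = 400         # assumed horizontal spot count

total_spots = lines_per_frame * spots_per_line

# Fraction of each frame during which any single retinal spot is lit:
duty_cycle = 1 / total_spots

# If perceived brightness simply tracked time-averaged power, the peak
# (instantaneous) power at each spot would have to exceed the steady
# "full white" level by the inverse of the duty cycle:
peak_over_average = 1 / duty_cycle

print(f"duty cycle per spot: {duty_cycle:.2e}")          # ~8.33e-06
print(f"peak/average power ratio: {peak_over_average:,.0f}x")  # 120,000x
```

Even with the caveat in the footnote (flicker perception isn't a pure time average), the ratio is large enough to make the point.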

------
thdrdt
This could become big in logistics. For example for order picking.

But also for maintenance crews. Want to know which machine broke down? The
glasses will give you directions and will even give you an overview of the
maintenance history.

I believe this is not a consumer product. Bosch has some consumer products but
they are way bigger in the business market.

And about the laser: it's just light. A laser doesn't mean 'cut through
everything'. It all depends on the power. I'm sure Bosch doesn't want to melt
your retina.

~~~
pawelk
> And about the laser: it's just light. A laser doesn't mean 'cut through
> everything'

Yeah, in this context laser means a light source with extremely tight cone,
meaning it can render a very tiny point on the retina. Not a ray of death that
will penetrate your brain and come out the other side of the skull ;)

------
andrewla
Maybe this will be the exception, but in general these seem so consumer
focused as to be useless. I just want something that can accept some
standardized or ad hoc well-documented protocol to do basic raster images or
text or something as a baseline. I want something that application developers
(and people like myself) can start to hack on and explore where it can go.

~~~
ehnto
I think AR is more suited to commercial users than consumers anyway. Google
Glass and Microsoft's AR solution seem to be playing out that way.

Anecdotally, the only context I would want information beamed right into my
line of sight is in work scenarios. All other scenarios I want technology to
be in the background as much as possible.

~~~
andrewla
I don't see the AR applications even being the most interesting part of this.
A private facial recognition database coupled with "this person's name and a
note to self" would be immensely helpful in a lot of situations -- I have bad
facial recognition (not full on face-blindness, but inconvenient) and it would
be neat to be able to hack on this a little bit. This would also require a
camera of some sort, but I'd rather have that not be integrated.

Possibly even "real-world closed captions".

A tap for clock/calendar function would be handy.

Morse code (or other silent, maybe subvocal?) "telepathy" would be interesting
as well.

I don't know how convenient or awful these would be in reality, but if the
cost were not exorbitant and you weren't locked into a proprietary app
ecosphere then it definitely seems like it would be worth a shot.

~~~
germinalphrase
Naturally, I worry about all of this facial recognition and data gathering;
however, as a k12 teacher, the utility of displaying names/data about my
students is immediately apparent. A great deal of time/effort is expended on
assessing my students and modifying my interactions with them based on that
information.

~~~
andrewla
My main fear here is using any sort of centralized/cloud system for any
portion of this. Facial recognition against a small corpus can be done fairly
easily off the shelf.

Everyone wants to sell me something at a huge discount because they know that
the enhancements to their own database will pay for the difference. I just
want the basic version of this for my own personal use, preferably with no
online use at all short of maybe encrypted backups (maybe).

------
gshdg
I'm very curious what the effect would be over the course of decades of having
even very weak lasers hitting your retina directly.

~~~
jerf
Photons do not get a special "laser" tag added to them by physics. If it's all
in the visual wavelengths and at intensities below what we experience every
day (sunlight is _really_ bright, our eyes hide from us the number of orders
of magnitude difference between even normal night-time artificial light and
sunlight), there's no issue.

(However... since I often see this sorta misinterpreted in the wild on the
internet, note that is an _if-then_ statement. If the antecedent is false, I
make no claim.)

The primary safety concern I have would be met by designing the lasers such
that if they are overdriven for any reason, they will _physically burn out_
before outputting enough light to be dangerous. Per the classic Therac-25 [1]
case study though, that is one safety feature I absolutely want in _hardware_.
There is no amount of software I would accept to implement that.

I would also _additionally_ stick some fuses into the system, tuned below the
threshold where the power will burn out the laser, along with of course
building the whole battery system to not be able to deliver enough power to
power the lasers to a dangerous level. However, I really want excess power to
physically burn out the lasers. (I wouldn't want to find out the hard way that
an EMP of some sort can overdrive the lasers.)

For all that I'm laying out safety systems here, I am quite confident that it
could be done safely. We trust our lives to much more dangerous systems all
the time. I will say that I can't explain to you how you'd _audit_ that
safety, though.

[1]:
[https://www.bowdoin.edu/~allen/courses/cs260/readings/therac...](https://www.bowdoin.edu/~allen/courses/cs260/readings/therac.pdf)

~~~
pavel_lishin
> _Photons do not get a special "laser" tag added to them by physics._

True, but we usually get a pretty wide spread of light energies. These are
likely going to be very specific frequencies, hitting similar areas over and
over. I wonder if the retina can get fatigued by specific frequencies.

~~~
jerf
A reasonable question, but I suspect we'd already have some idea if this was
the case. There's a lot of artificial light with relatively few, narrow
frequencies already in use. I imagine someone, somewhere in some bizarre
application would have discovered this as a problem.

Since we only see in three color dimensions, it's hard for us to notice day-
to-day, but for instance, some fluorescent bulbs are just 5 spikes in
particular frequencies. It looks fairly "white" to us, but it's far from
normal light.

(I am interpreting your comment as being fatigued of/damaged by very specific
frequencies in a way that it would not be fatigued/damaged for the same amount
of energy spread out over a wider range still within the given cone's
sensitivity range.)

~~~
pavel_lishin
> _(I am interpreting your comment as being fatigued of /damaged by very
> specific frequencies in a way that it would not be fatigued/damaged for the
> same amount of energy spread out over a wider range still within the given
> cone's sensitivity range.)_

Yep, that's my concern. I don't know if anyone's done long-term studies about
low-level light of identical wavelength.

I mean, on the one hand, people used to be afraid of fast-moving vehicles,
convinced that it was impossible for a human body to survive going faster than
40 miles per hour.

But on the other hand, people used to strap radium to their faces because they
thought it was a cure-all, too.

------
JackRabbitSlim
[https://spectrum.ieee.org/biomedical/imaging/in-the-eye-
of-t...](https://spectrum.ieee.org/biomedical/imaging/in-the-eye-of-the-
beholder)

IEEE's coverage way back in 2003, and even by then a prototype existed as far
back as '91.

I've been waiting for people to circle back around to this type of display for
about two decades now. I suspect the biggest barrier/reason is aversion to
liability and the safety of beaming lasers directly into customers' eyes.

~~~
unishark
Lasers have been used routinely to measure the eye for the last 20 years. And
everyone making that technology probably toyed with the idea of making a
display rather than a sensor because you deal with the same issues. My guess
is it has been prophetic patents holding progress back, so hopefully they are
starting to expire now.

------
Darkphibre
OMG. I've been waiting for this tech ever since reading about this in high
school during the early 90's, coming out of the HIT Lab research (Human
Interfaces Technology Laboratory at University of Washington).
[http://www.hitl.washington.edu/projects/wlva/](http://www.hitl.washington.edu/projects/wlva/)

I was so enamored, I actually talked my mom into swinging by when we did a
Pacific Northwest drive during summer break. Unfortunately, the lab was closed
to tours at the time. And the lab's name also tickled me. It was my aspiration
to go work there on VR technologies... and then the media chewed up the tech
and spit it out, causing the long winter. (I _also_ wanted to get into Neural
Networks and AI...)

~~~
lanewinfield
Ha! I went to college my freshman year at University of Puget Sound, and my
philosophy professor who taught our class "Posthuman Future" took us on a tour
of that very lab at UW. It's been many years, but I still think back on that
experience as a revelation.

~~~
Darkphibre
That's so cool! And sounds like an amazing class. More and more, I'm kinda
wishing I had a chance to get to college. Maybe some day I'll return in my
retirement. :)

------
DanBC
This is a great article and it covers a lot of stuff that I'd wondered about
before.

Here's someone doing a (somewhat terrifying) DIY version:
[https://eclecti.cc/hardware/blinded-by-the-light-diy-
retinal...](https://eclecti.cc/hardware/blinded-by-the-light-diy-retinal-
projection)

Virtual Retinal Displays are a pretty old idea, first being demonstrated in
1991. They've been stuck in development for years, so it's good to see them
getting some more attention.

[http://www.hitl.washington.edu/research/vrd/](http://www.hitl.washington.edu/research/vrd/)

~~~
nrp
That was me. It’s great to see that the light engine and electronics have been
miniaturized to be almost glasses sized. It’s annoying that they are still
stuck with the tiny exit pupil I saw in 2012 and HITLab had in the 90’s. I
hoped that mass producible pupil replicators would have been ubiquitous by
now.

~~~
DanBC
When I say "somewhat terrifying" I do genuinely mean "also awesome"!

------
giancarlostoro
I think this will be the next frontier for wearable tech. Smart watches are
part of it, maybe even a smartwatch with some buttons that you can feel so you
can control what you see in your AR glasses. At least until we see wearable
contact lenses.

The phone becomes the main device, but these peripherals will enhance your use
of your phone. Imagine being able to look up words you might not know when
having a conversation in live time. Or seeing where the bathroom is in any
building.

Edit: If you disagree at least tell me why.

~~~
magduf
I'd like to know why you're being downvoted too; I've thought about this very
thing for a while now. Everyone right now runs around with their heads tilted
down so they can look at their phones (which is supposedly causing us to grow
bone projections on the rear of our skulls). It would be much better to have
eye implants so we can see information in real-time from our mobile devices,
even as we walk around. Like you said, it would be really handy to see a map
while you walk in a building, showing you where the bathroom is, or to help
you navigate while you walk in a dense city, instead of having to pull out
your phone and look at it.

~~~
Darkphibre
The bone projection study has been pretty much debunked:

[https://arstechnica.com/science/2019/06/debunked-the-
absurd-...](https://arstechnica.com/science/2019/06/debunked-the-absurd-story-
about-smartphones-causing-kids-to-sprout-horns/)

That said, the current mobile situation is _not_ great on posture.

------
maxehmookau
This sounds fascinating. I can't even load the page properly in Safari due to
the stupid scroll hijacking around the ad.

------
baybal2
Very impressive how far they went with MEMS scanners. That thing is a
continuation of this: [https://ae-
bst.resource.bosch.com/media/_tech/media/product_...](https://ae-
bst.resource.bosch.com/media/_tech/media/product_flyer/BST-BML050-FL000.pdf)

------
NikolaeVarius
On a somewhat related note, Google Glass enterprise edition 2 is now
"generally" available [https://developers.googleblog.com/2020/02/glass-
enterprise-e...](https://developers.googleblog.com/2020/02/glass-enterprise-
edition-2-now.html)

~~~
felipemnoa
If I were google I would buy this company and replace google glasses with
this.

~~~
shdh
Google revenue 2018: $136B

Bosch revenue 2018: $78B

------
jp555
Here's a concise 3min overview of all current AR tech and the continuum of
product groups in the AR-VR (mixed reality) space.

[https://youtu.be/U1BVI2JcNPc](https://youtu.be/U1BVI2JcNPc)

This product would sit squarely in the "Smart Glasses" group.

------
ortusdux
Just in time for the Snow Crash TV show

~~~
folago
Is there a TV show?! I loved that book! Please don't be like Altered Carbon.

~~~
tzumby
I had the exact same reaction as I read the comment :D

------
castis
Seems like this technology, when developed further, could be the sort of thing
that [https://www.skully.com/](https://www.skully.com/) was trying with their
original prototype, which was really exciting to me at the time.

------
sschueller
What does 150 Line pairs [1] for resolution mean?

[1] [https://www.bosch-sensortec.com/products/optical-
microsystem...](https://www.bosch-sensortec.com/products/optical-
microsystems/smartglasses-light-drive/)

~~~
blattimwind
I assume they mean line pairs per mm, not 300 lines total.

Edit: Nope, the vertical resolution really is 150 line pairs.
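A line pair is one light line plus one dark line, so a resolution quoted in line pairs doubles into resolvable rows. A minimal sketch of that conversion (assuming, as the edit above concludes, the spec means total vertical line pairs rather than line pairs per mm):

```python
def line_pairs_to_rows(line_pairs: int) -> int:
    """One line pair = one light row + one dark row, so the number of
    resolvable rows is twice the line-pair count."""
    return 2 * line_pairs

# The Bosch spec quotes 150 line pairs of vertical resolution:
print(line_pairs_to_rows(150))  # 300 resolvable rows
```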

------
dr_dshiv
So, with eye tracking, we could put these little laser things on a watch to
give it a 20in screen?

------
felipemnoa
Reminds me of the anime called "Dennō_Coil" [1],[2]. The protagonists use
glasses to join an AR world. Never thought I would actually see something like
this in real life within my lifetime.

If these glasses actually work we could be entering an AR revolution in the
near future. Even if that future is still 20 years away that is still pretty
close.

[1]
[https://www.youtube.com/watch?v=ODTvrQFtEOM](https://www.youtube.com/watch?v=ODTvrQFtEOM)

[2]
[https://en.wikipedia.org/wiki/Denn%C5%8D_Coil](https://en.wikipedia.org/wiki/Denn%C5%8D_Coil)

------
euske
This isn't new. QD Laser
[https://www.qdlaser.com/en/](https://www.qdlaser.com/en/) has been making a
similar technology for a while. I tried their glasses at the CSUN assistive
technology conference in 2014 or so. Their target users were mostly low-vision
people, and the image was very clear. They also claimed that the technology is
pretty safe for long-term use. One problem I found is that I have long
eyelashes, and they block part of the image, casting a shadow. But other than
that, it looks like solid stuff.

------
cbsks
I was working with similar glasses at Microvision 10 years ago!
[https://web.archive.org/web/20100114084506/http://microvisio...](https://web.archive.org/web/20100114084506/http://microvision.com/pdfs/program_brief.pdf)

I don't know why they didn't take off. I was an intern there for a few months.
Every few weeks another engineer would quit, so maybe that had something to do
with it. It was good to experience working for a failing company. Now I know
some warning signs!

------
evan_
This sounds really cool and promising but this made me laugh:

> The concept video doesn’t really do it justice—it looks great.

(two paragraphs later)

> The concept video is a quite accurate representation of how the glasses look
> when you’re using them.

------
irjustin
Hrmmmm would have been good to know about the cost covering bit in the
beginning of the article - but I digress.

I am extremely excited by the prospect of these, or any glasses-like system
that works well in a package that doesn't look like a computer bolted to my
head.

That laser warning sticker feels weird seeing as how that is literally the
whole point of the glasses, so I'll probably wait for a 2nd iteration just to
be sure. But for me, these would win over a decent watch any day.

~~~
rtkwe
IIRC the laser warning sticker is just required by law for any laser device of
any appreciable power so there's no "well it's the whole point of the product"
exemption to having the warning.

------
eximius
Very cool. I was super interested in Intel Vaunt and this is just an iterative
improvement. I would buy a pair of these if they had a reasonably open
API/firmware.

------
jpm_sd
This is a really odd article.

Using this device sounds awful!

> What I do want to talk about is how this entire system fundamentally screws
> with your brain in a way that I can barely understand, illustrated by
> seemingly straightforward questions of “how do I adjust the focus of the
> image” and “what if I want the image to seem closer or farther away from me?”
>
> [...]
>
> because the Smartglasses are using lasers to paint an AR image directly onto
> your retina, that image is always in focus. There are tiny muscles in our eyes
> that we use to focus on things, and no matter what those muscles are doing
> (whether they’re focused on something near or far), the AR image doesn’t get
> sharper or blurrier. It doesn’t change at all.
>
> Furthermore, since only one eye is seeing the image, there’s no way for your
> eyes to converge on that image to estimate how far in front of you it is.
> Being able to see something that appears to be out there in the world but that
> has zero depth cues isn’t a situation that our brains are good at dealing
> with, which causes some weird effects.

Oh wait, maybe it's not awful?

> for the first 10 or so minutes of wearing the glasses, your brain will be
> spending a lot of time trying to figure out just what the heck is going on.
> But after that, it just works, and you stop thinking about it (or that’s how
> it went for me, anyway.) This is just an experience that you and your brain
> need to have together, and it’ll all make sense.

~~~
endorphone
The article sounds like the device is unnatural and disorienting at first, but
then you get used to it and it isn't anymore. It sounds like a promising
technology.

------
contingencies
This demo is so weak I think we need a new noun for this category of failed
wearable tech / lifestyle technology demos. Can anyone think of a good
candidate?

 _wightbosch_ , n. failed futuristic product demonstration. Etymology from
_wight_ (poetic) ghost/deity + _bosh_ (British) nonsense + _Bosch_ (German
technology firm)

------
bobloblaw45
It would be pretty cool, but then again Magic Leap looked pretty cool too.
I'll get hyped after they start shipping.

------
rhacker
Not for me. I'd rather have a camera for input and an audible bot in my ear,
with no visual change to my face.

------
mrexroad
Reminds me of the “Virtual Retinal Display“ that Microvision was trying to
create back in the late 90s.

------
kick
Is this a vector display? The way it's worded seems like it, but I can't tell.

~~~
Rochus
It works like a CRT monitor where the beam scans line by line, see
[https://en.wikipedia.org/wiki/Virtual_retinal_display](https://en.wikipedia.org/wiki/Virtual_retinal_display)

------
ape4
Even in the demo video, some of the stuff, like recipes or a shopping list, is
just as easy with a phone.

------
gumby
Despite this being essentially a sponsored post, I’m glad someone has finally
got this kind of thing working.

This is the path that should ultimately bring us infinite depth of field and
true image injection. Super exciting!

------
riazrizvi
In one of the photos, the dude is wearing the glasses and smiling at the
camera, it looks like there is a green reflection. Is that just a reflection
on the glasses from ambient light, or can we see the text he is seeing?

~~~
lesquivemeau
Probably just a reflection: there is no way you could see what is displayed if
you aren't wearing the glasses with this system.

------
momirlan
Laser writing to the retina sounds like an awful idea. At least change the
marketing.

------
kleiba
Laser into the eye sounds like a great idea. What could possibly go wrong?

------
JRKrause
I don't see how they can hope to overcome the alignment issue. This idea has
been prototyped before; IIRC Carmack built a simple one during very early
development of the Oculus.

------
anfractuosity
Out of interest, what's the holographic film for? Is it for focussing somehow?

(I'm kind of curious as to why the lasers can't directly hit the retina from
the mirror array)

~~~
JRKrause
From my understanding it's just for reflecting the laser light. The lasers
initially shine parallel to the lens surface and thus need to be redirected
towards the pupil; this is done with the film.

------
zmix
Oh yeah! That's exactly what I need after looking into a lightbulb (my
computer monitor) for many hours a day! Laser, right into the eye! ;-)

------
jcims
With how well they design user interfaces on their appliances, it’s little
surprise they put battery life right in the middle of the field of view.

------
LOL_Arch_Linux
I'm not sure how much I'd trust a laser directly on my retina--at least not
until it's been very well tested (on other people).

------
jbattle
Seems like very useful technology for fighter pilots or even race car drivers
(maybe they already have something functionally similar)

~~~
keanzu
Fighter pilots already have helmet mounted displays to project information to
the pilot's eyes. These systems can also cue weapons systems to the direction
the pilot is looking.

[https://en.wikipedia.org/wiki/Helmet-
mounted_display](https://en.wikipedia.org/wiki/Helmet-mounted_display)

------
rdl
I remember using MicroVision laser headmounts back in 1998. It's kind of
shocking how slow commercialization of this has been.

------
johnchristopher
I really like that this system relies on one eye only. I have very slight
strabismus, and some VR/3D stuff is out of reach.

------
kragen
How many pixels do they have, and what's the refresh rate? Also the one he's
wearing still looks kinda dorky.

~~~
Franciscouzo
It doesn't work with pixels; it looks like it uses a galvanometer to guide a
laser. It also says here [0] that it has a 60 Hz frame rate.

[0] [https://www.bosch-sensortec.com/products/optical-
microsystem...](https://www.bosch-sensortec.com/products/optical-
microsystems/smartglasses-light-drive/)

~~~
kragen
The relevant figures for a galvo would be kpps and angular precision.

~~~
rtkwe
There's the more generic line pairs per millimeter, which is basically how many
light and dark line pairs it can resolve per mm.
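As a back-of-envelope sketch of why kpps matters here (assuming a raster-style scan; the resolution figures below are illustrative, not Bosch specs):

```python
# Estimate the points-per-second a scanning mirror must sustain for a
# raster-scanned laser display at a given resolution and frame rate.
# All numbers here are illustrative assumptions, not published Bosch specs.

def required_kpps(lines: int, points_per_line: int, frame_rate_hz: float) -> float:
    """Scan rate in kilopoints per second (kpps) for a full raster refresh."""
    return lines * points_per_line * frame_rate_hz / 1000.0

# A modest hypothetical 300x300-point image at the 60 Hz frame rate
# quoted on Bosch's product page:
print(required_kpps(lines=300, points_per_line=300, frame_rate_hz=60))  # 5400.0
```

5400 kpps is orders of magnitude beyond a typical closed-loop galvo (tens of kpps), which is one reason displays like this use resonant MEMS micromirrors rather than laser-show-style galvanometers.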

------
adultSwim
Great to see development is continuing after Intel shut down the division that
was developing this technology.

------
m3kw9
Next: how to prevent screen burn in.

------
max_
Could an interesting application of this be assisting the partially blind?

------
weld
The next evolution will be implanted direct optical nerve stimulation. Should
I hold out?

------
Simulacra
Daemon operatives, take notice!

------
rusk
Finally!

We've been talking about these in Sci-Fi for years.

Surely the Jet Pack can't be far off now ...

~~~
BuildTheRobots
Star Trek TNG S05E06: The Game:
[https://en.wikipedia.org/wiki/The_Game_(Star_Trek:_The_Next_...](https://en.wikipedia.org/wiki/The_Game_\(Star_Trek:_The_Next_Generation\))

Though the issue in the episode was the addictiveness of the game (which is
already an issue) rather than the fact it painted directly on your eye.

~~~
rusk
Yes, this.

------
ranie93
Can't wait to get burned in "pixels" in my retina

------
ocdtrekkie
I am confused that it "paints images directly onto your retina" (presumably,
shooting lasers at your eyeball, which sounds unhealthy) and yet the writer
was able to take a picture of the image on the glass.

~~~
ThrowawayR2
The image is being painted onto the camera sensor instead of the retina.

Regarding "shooting lasers at your eyeball, which sounds unhealthy", this
technology has existed since at least the early '00s. Microvision was showing
off their Nomad laser HMD
([https://www.google.com/search?q=microvision+nomad](https://www.google.com/search?q=microvision+nomad))
around 2002 IIRC. It's reasonably well established as safe.

------
gimmeThaBeet
Oh boy, I love that headline: Bosch Gets Smartglasses Right With Tiny Eyeball
Lasers. You know you're in the future when you start with the tiny eyeball
lasers.

------
jotm
I really hope they're just connected to a smartphone as a display.

Who thought it was a good idea to put a whole computer in glasses or
smartwatches?

~~~
rtkwe
Having the logic to interpret and respond to input on the device reduces the
latency between interaction and effect. Also, having at least some power on the
device means it's useful by itself. It also reduces battery drain on both
devices to only send small BT packets instead of essentially streaming video
constantly during use.

------
stazz1
Hope those lenses are not reflective

------
EGreg
These lasers... are they dangerous, can they leave lasting scars on the retina
through prolonged use?

I would like some info on similar things.

------
ampdepolymerase
Isn't this the same technology as Focals by North?

~~~
kick
It says that in the article, yes.

------
0xff00ffee
TBH, I've wanted this tech for DECADES ever since I saw Scientific American TV
cover the "Cyborgs" at MIT in the early 90's, and then Google Glass (remember
when everyone was selected based on their open idea submissions and thought
they were getting FREE Google Glass, only to get the $1500 invoice?)

Let's start the list of all the things that can go wrong with this!

On the plus side, I wonder if they could measure the chromatic aberration and
distortion of the reflected laser light and compute a correct lens
prescription.

------
benignslime
This is similar to what Focals by North have been doing for a few years.

[https://www.bynorth.com/](https://www.bynorth.com/)

Turns out the article mentions them.

~~~
aantix
Nice to see there can be fashion-conscious smart eyewear.

Apple has been rumored to have such a product in the works. If theirs were
this cool, I could see them really taking off.

------
davidhyde
I stopped reading at "paints an image directly onto your retina". This device
is basically a miniature DLP laser projector that projects light onto a fancy
light gradient stuck to the glass. You are looking at a reflection, nothing is
painting your eyeball.

~~~
excalibur
I think it's the "directly" part that's incorrect, not the "paints an image
onto your retina" part.

~~~
_carl_jung
But then, everything we look at paints an image onto our retina, so using it
as a hook for a headline feels disingenuous.

~~~
rtkwe
If the article is correct it is an important difference, because the light is
directly projected such that it's always in focus and not affected by the
cornea, so it's just always there and you don't have to change your focus to
read it.

