End of the article: "Bosch covered our costs for attending CES 2020."
I mean it's possible they paid for him (and others) and still didn't allow him to get a demo... but it's unlikely. Also, I feel like IEEE can afford to pay the attendance costs for a reporter.
So, this is really just promoted content/an ad.
Which is fine, and I think it's even fine to wait until the end to tell us, but not if the article body tries to paint a different, organic picture. This framing wasn't at all necessary for the content; it was simply there to distract from the paid-promo nature of the piece.
"We don't want this getting out just yet, but... oh okay, you can come behind the curtain just this once."
<end of private demo>
"So, what'dya think? By the way, here's a fresh marketing video we happen to have lying around. Tell everyone. No no, actually, let us pay you to tell everyone. We insist."
> Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch Sensortec, responsible for the Smartglasses Light Drive, was not aware of nor involved with the travel support.
So it seems like you are correct. The article could have been much clearer about this, though.
It has been edited at some point within the last ~2 hours.
I don't think the new text changes anything, though. a) If that new text is trying to paint the picture that this isn't sponsored content or biased, then why mention the disclaimer at all? b) Which exact legal entity or subsidiary "officially" sponsored the trip isn't really that relevant when they're so tightly connected, both in brand and organizationally.
They could have simply started the article with: "This year Bosch invited us to CES fully covering our attendance cost."
or alternatively later on "while Bosch sponsored our trip we still had a really hard time getting a demo for ... because of ..." (e.g. too many people)
Sneaky disclosures kill any interest I have in a product/company.
Case in point, I'm not even going to read this article after reading this comment.
>Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch Sensortec, responsible for the Smartglasses Light Drive, was not aware of nor involved with the travel support.
This IMHO makes it even worse. The article should clarify that they updated this description.
There are existing heads-up display products targeted at cyclists, such as the Everysight Raptor and Garmin Varia Vision. However, they aren't practical or comfortable for runners.
Ideally I'd like the smart glasses to have the following features:
- ANT+ Extended Display profile
- 6-hour battery life
- Lightweight, with even weight distribution (not all on one side)
- Prescription-lens compatible
Practically I also can't imagine how they'll solve weight/battery/networking issues with the device particularly in athletic settings where you're literally putting the device through a constant earthquake.
Maybe we should stop wishing for some ridiculous convenience layer to be added to our lives and just look around during our runs like we've done for thousands of years.
Trouble is GPS isn't always reliable in heavily wooded areas, so there might need to be improvements in GPS before that is really viable.
GNSS accuracy is poor in forests, canyons, and dense urban environments due to line-of-sight obstructions and multipath reflections. The latest Navstar Block III satellites should improve that a little, but any real solution would require deploying additional satellites in geosynchronous orbit like the Japanese QZSS.
There's nothing about geosync orbit that would help with the tree/canyon effect. Satellites directly overhead contribute the least to horizontal positioning - essentially only helping to establish the "true" clock value but not 2D position.
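To illustrate the geometry argument, here's a toy dilution-of-precision calculation (hypothetical satellite positions, not real almanac data): satellites clustered near zenith inflate horizontal DOP compared to the same number spread toward the horizon.

```python
import numpy as np

def hdop(sat_dirs):
    """Horizontal dilution of precision from unit line-of-sight
    vectors (east, north, up) to each visible satellite."""
    # Geometry matrix: LOS components plus a clock-bias column.
    G = np.hstack([np.asarray(sat_dirs), np.ones((len(sat_dirs), 1))])
    Q = np.linalg.inv(G.T @ G)           # covariance shape factor
    return np.sqrt(Q[0, 0] + Q[1, 1])    # east + north variance terms

def los(az_deg, el_deg):
    """Unit line-of-sight vector for a satellite at azimuth/elevation."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return [np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)]

# Four satellites spread around the sky at moderate elevations...
spread = [los(0, 15), los(90, 40), los(180, 60), los(270, 25)]
# ...versus four crowded near zenith (the "forest canyon" sky view).
zenith = [los(0, 75), los(90, 80), los(180, 85), los(270, 78)]

print(hdop(spread))  # small: good horizontal fix
print(hdop(zenith))  # large: overhead sats barely constrain 2D position
```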
Multi-constellation tracking can be helpful, simply because there are more satellites to use.
I feel like this should be solvable on the software side, especially for medium or long distance runs. If I look at the traces, I can see that the recorded zig-zag is obviously wrong, and a bit of smoothing should be able to fix it.
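As a minimal sketch of that kind of smoothing (hypothetical coordinates, and a plain centered moving average rather than anything a real app necessarily ships):

```python
def smooth(points, window=5):
    """Centered moving average over (lat, lon) fixes to damp GPS zig-zag."""
    half = window // 2
    out = []
    for i in range(len(points)):
        chunk = points[max(0, i - half):i + half + 1]
        lat = sum(p[0] for p in chunk) / len(chunk)
        lon = sum(p[1] for p in chunk) / len(chunk)
        out.append((lat, lon))
    return out

# A straight east-west run with alternating multipath error on latitude.
trace = [(47.60 + (0.0002 if i % 2 else -0.0002), -122.30 + 0.0005 * i)
         for i in range(20)]
smoothed = smooth(trace)
# The smoothed latitudes hug 47.60 instead of bouncing +/-0.0002.
```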
Are there any running apps that are accurate?
Accelerometer, magnetometer, and gyroscope sensor data can help a little to sanity check GNSS inputs and fill in brief gaps. But those tiny sensors have terrible drift which makes them nearly useless for sustained position tracking.
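A back-of-the-envelope sketch of why the drift is so bad: even a small constant accelerometer bias, double-integrated into position, grows quadratically with time (the 0.02 m/s² bias here is an assumption, and a charitable one for tiny consumer sensors):

```python
def dead_reckon(accels, dt=0.01):
    """Double-integrate 1D acceleration samples into a position estimate."""
    vel = pos = 0.0
    for a in accels:
        vel += a * dt
        pos += vel * dt
    return pos

# Standing perfectly still, but the sensor reports a 0.02 m/s^2 bias.
bias = 0.02
for seconds in (10, 60, 300):
    n = int(seconds / 0.01)
    drift = dead_reckon([bias] * n)
    print(f"{seconds:>4} s -> {drift:.1f} m of phantom drift")
```

The phantom displacement follows roughly 0.5 · bias · t², so a stationary runner appears meters away within a minute.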
This frustrated me enough that I eventually got a Stryd footpod. The pace/distance tracking is extremely accurate and I use it to override what the watch records. So the GPS track still bounces all over the place but the recorded pace/distance data is correct.
Unfortunately the state of localization isn't ready for passive (i.e., non-radiative), reliable sub-meter localization over anything more than a few square meters in a well-lit indoor space.
It's likely going to be a while before we get "wear everyday" AR glasses with sub-meter-accurate camera position/localization.
Also I hate to be "that guy" whose stupid wristwatch is constantly beeping during group workouts and races.
(I am in no way affiliated with this company)
I'd love to have it overlayed over the corner of my eye while I'm working out.
When Bluetooth headsets became small enough to be inconspicuous I lived in a marginal neighborhood. When my wife and I would go out, we'd play a guessing game called "Bluetooth or mental illness."
I think a lot of people don't realize that many consider it a luxury to be out of contact for periods of time.
I flush repeatedly, turn on the water, and take out my phone and have very loud imaginary phone calls, or just cycle through the available ringtones at high volume.
Some people are still so low class that it doesn't even faze them.
To that end, my experience of using transparency mode on AirPod Pro earbuds is that they very much do “become invisible” while allowing me to overlay (auditory) information on the world around me. If they were built to be as inconspicuous as my father’s hearing aids, nobody would know the difference and no overt social stigma would persist. The AR future of today is auditory.
EDIT: where "visibility" also includes haptic feedback and all of that other good stuff. See http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...
I'll make more devices soon, and will sell them for $25. Send me an email, and I'll message when they are ready (mail in profile).
The good news is that AR can add visibility to hidden interfaces too. The bad news is that nobody will bother.
That's bad interface design IMO. I guess the headphones look "cleaner" but it's really annoying in practice and not worth it.
That image reminds me of the transhumanist YouTube series H+. In one of the first episodes one of the characters keeps trying to “reboot” after his implant succumbs to a computer virus capable of killing people.
Basically, you wore this thing like eyeglasses on your head, and it had a small arm that extended in front of your eye (but a little below it). When you looked down, it appeared like a computer monitor was hovering in front of you. The very early models were 320x200 resolution in monochrome (red on black), but I tried one at a trade show in 2000 that I think was 800x600 in VGA color, which at the time was pretty decent. I'm surprised these never got more popular; they would be great for laptop computers: you could have total privacy in your viewing (unlike a normal screen), and with improvements in the technology you could potentially "see" a much larger screen than a normal laptop has.
Does anyone else remember these?
Resolution was very low, battery life very short, UI very annoying, appearance very embarrassing.
I did not see any part of the article address these issues.
Our eyes are designed to let light in. The power level of lasers that can run all day on a 350 mAh battery they share with some electronics is going to be minuscule, maybe a fraction of a milliwatt. A common laser diode available right now that outputs 1 mW continuously consumes about 36 mA to do that. Three lasers outputting 1 mW would consume roughly 36 × 3, or 108 mA, so a 350 mAh battery would only power those three lasers for about 3.2 hours, with nothing left over for the electronics.
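Spelling out that arithmetic (using the figures assumed above, which are estimates rather than Bosch specs):

```python
# Assumed figures from the comment above, not Bosch specifications.
current_per_laser_mA = 36     # draw to emit ~1 mW of continuous laser light
lasers = 3
battery_mAh = 350

total_mA = current_per_laser_mA * lasers   # combined draw of all three diodes
hours = battery_mAh / total_mA             # runtime, electronics excluded
print(f"{total_mA} mA draw -> {hours:.1f} hours")
```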
Less than 1mW is not much, even for a laser, and at visible wavelengths it's not going to transfer enough heat to even measurably change the temperature of the human eye, much less damage it.
Lots of interesting safety questions unique to this particular display technology.
That means one of two things must be true. Either a) the beam is way brighter than other light entering your eye, such that it delivers in a single focused beam the same amount of energy that would otherwise be distributed evenly across the surface of your retina if you were viewing natural light. (Thus the questions I posed in my previous comment.) Or b) the beam is the same brightness as other light entering your eye, meaning the amount of energy in any individual pixel is far _less_ than what your eye receives from natural light. (I imagine this would make the image appear very dim.)
It's not possible to "hack" more output power into lasers with software changes. Would that it were. You can change the duration of the beam, but without a Q-switch you can't pulse the beam in a way that changes the instantaneous power.
Picture your monitor going all white... bright for a sec, but that's about it.
[^] In reality the perceived brightness is somewhat higher than the mean (i.e., a light source that's twice as bright at half the duty cycle appears brighter). I'm not sure how large an effect that is, or whether it has anything to do with pupil-size adjustment.
I remember when a CRT monitor would mess up and stop scanning correctly: the white point in the middle was much brighter than normal. I was always worried it would burn in, so I'd shut off the monitor right away.
I'm wondering how the laser is scanned over the retina (probably some MEMS chip). What happens if the beam scanning suddenly stops in place? Wouldn't you get the whole laser power concentrated in a small dot? And it would happen so quickly that you wouldn't have time to react.
There may also be some phantom-image effects, like the burn-in we used to get on those old TV monitors, or the afterimages you get when fixating on a picture.
What's probably more insidious would be projecting some very faint superimposed structure. You could probably induce unconscious cognitive load or nausea, and by decreasing the discomfort when the user is looking at an ad, for example, you would increase the ad's effectiveness.
But also for maintenance crews. Want to know which machine broke down? The glasses will give you directions and will even give you an overview of the maintenance history.
I believe this is not a consumer product. Bosch has some consumer products but they are way bigger in the business market.
And about the laser: it's just light. A laser doesn't mean 'cut through everything'. It all depends on the power. I'm sure Bosch doesn't want to melt your retina.
Yeah, in this context laser means a light source with extremely tight cone, meaning it can render a very tiny point on the retina. Not a ray of death that will penetrate your brain and come out the other side of the skull ;)
Please. I'll take 3.
Anecdotally, the only context I would want information beamed right into my line of sight is in work scenarios. All other scenarios I want technology to be in the background as much as possible.
Possibly even "real-world closed captions".
A tap for clock/calendar function would be handy.
Morse code (or other silent, maybe subvocal?) "telepathy" would be interesting as well.
I don't know how convenient or awful these would be in reality, but if the cost weren't exorbitant and you weren't locked into a proprietary app ecosystem, it definitely seems like it would be worth a shot.
Everyone wants to sell me something at a huge discount because they know that the enhancements to their own database will pay for the difference. I just want the basic version of this for my own personal use, preferably with no online use at all short of maybe encrypted backups (maybe).
(However... since I often see this sort of thing misinterpreted in the wild on the internet, note that it is an if-then statement. If the antecedent is false, I make no claim.)
The primary safety concern I have would be met by designing the lasers such that if they are overdriven for any reason, they physically burn out before outputting enough light to be dangerous. Per the classic Therac-25 case study, though, that is one safety feature I absolutely want in hardware. There is no amount of software I would accept to implement it.
I would also stick some fuses into the system, tuned below the threshold at which the power would burn out the lasers, and of course build the whole battery system so it can't deliver enough power to drive the lasers to a dangerous level. But I really do want excess power to physically burn out the lasers. (I wouldn't want to find out the hard way that some sort of EMP can overdrive them.)
For all that I'm laying out safety systems here, I am quite confident that it could be done safely. We trust our lives to much more dangerous systems all the time. I will say that I can't explain to you how you'd audit that safety, though.
True, but we usually get a pretty wide spread of light energies. These are likely going to be very specific frequencies, hitting similar areas over and over. I wonder if the retina can get fatigued by specific frequencies.
Since we only see in three color dimensions, it's hard for us to notice day-to-day, but for instance, some fluorescent bulbs are just 5 spikes in particular frequencies. It looks fairly "white" to us, but it's far from normal light.
(I am interpreting your comment as asking whether the retina would be fatigued by, or damaged by, very specific frequencies in a way it would not be for the same amount of energy spread over a wider range still within the given cone's sensitivity range.)
Yep, that's my concern. I don't know if anyone's done long-term studies about low-level light of identical wavelength.
I mean, on the one hand, people used to be afraid of fast-moving vehicles, convinced that it was impossible for a human body to survive going faster than 40 miles per hour.
But on the other hand, people used to strap radium to their faces because they thought it was a cure-all, too.
You can use the effect to generate colors that are actually impossible in the real world (and are only perceived):
That part is not likely. If you concentrate the light over just a few retina cells, the threshold for danger is very low.
I thought they kinda did. The light is very similar in wavelength and power, compared to normal light, which has all sorts of wavelengths and power levels. That difference can mean certain rods/cones are stressed more than average without triggering the sort of fatigue response that normal light would cause.
Perhaps the best example of a similar concept, though not with lasers, is looking at a total solar eclipse right before or after the sun is fully eclipsed. There is a small window where extremely bright light makes it into our eyes, but not the frequencies that normally cause the pain associated with looking at the sun. That means our default defenses against staring at the sun don't kick in, and it's extremely easy to do permanent eye damage without feeling any pain as the damage is done.
The consequence of being a point light source is that lenses can refocus the beam, parallel or not, back into a point. Doing so concentrates a lot of energy on a tiny surface, and if that surface is your retina, that's when it becomes dangerous: it can literally burn a tiny hole into it.
This can never happen with, say, a regular light bulb, or even the sun. If the light source is spread out, its image on your retina, or anywhere else, will be spread out too, limiting the energy density. This is an indirect consequence of the second law of thermodynamics called the conservation of étendue.
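A rough order-of-magnitude sketch of the difference (illustrative numbers only; the ~20 µm focused-spot size is an assumption for a collimated beam through the eye, not a measured value):

```python
import math

def irradiance_W_per_cm2(power_W, spot_diameter_um):
    """Power density over a circular spot of the given diameter."""
    r_cm = spot_diameter_um * 1e-4 / 2
    return power_W / (math.pi * r_cm ** 2)

# 1 mW of collimated laser light, focused by the eye to a ~20 um spot:
laser = irradiance_W_per_cm2(1e-3, 20)
# The same 1 mW from an extended source, imaged over a ~1 mm patch of retina:
extended = irradiance_W_per_cm2(1e-3, 1000)

# Same total power, but the focused spot is (1000/20)^2 = 2500x more intense.
print(f"laser spot: {laser:.0f} W/cm^2, extended: {extended:.3f} W/cm^2")
```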
IEEE covered this way back in 2003, and even then a prototype existed as far back as '91.
I've been waiting for people to circle back around to this type of display for about two decades now. I suspect the biggest barrier is aversion to the liability and safety risks of beaming lasers directly into customers' eyes.
I was so enamored that I actually talked my mom into swinging by when we did a Pacific Northwest drive during summer break. Unfortunately, the lab was closed to tours at the time. The lab's name also tickled me. It was my aspiration to go work there on VR technologies... and then the media chewed up the tech and spat it out, causing the long winter. (I also wanted to get into neural networks and AI...)
I think the first end users were supposed to be fighter pilots.
Here's someone doing a (somewhat terrifying) DIY version: https://eclecti.cc/hardware/blinded-by-the-light-diy-retinal...
Virtual retinal displays are a pretty old idea, first demonstrated in 1991. They've been stuck in development for years, so it's good to see them getting some more attention.
Let's start the list of all the things that can go wrong with this!
On the plus side, I wonder if they could measure the chromatic aberration and distortion of the reflected laser light and compute a correct lens prescription.
The phone becomes the main device, but these peripherals will enhance how you use it. Imagine being able to look up words you don't know in real time during a conversation. Or seeing where the bathroom is in any building.
Edit: If you disagree at least tell me why.
That said, the current mobile situation is not great on posture.
Bosch revenue 2018: $78B
This product would sit squarely in the "Smart Glasses" group.
The glasses in Virtual Light "got these little EMP drivers around the lenses, work your optic nerves direct".
EDIT: Actually he wears goggles that the laser shines onto, so a little different
Edit: Nope, the vertical resolution is really 150 line pairs.
If these glasses actually work we could be entering an AR revolution in the near future. Even if that future is still 20 years away that is still pretty close.
I don't know why they didn't take off. I was an intern there for a few months. Every few weeks another engineer would quit, so maybe that had something to do with it. It was good to experience working for a failing company, though; now I know some warning signs!
> The concept video doesn’t really do it justice—it looks great.
(two paragraphs later)
> The concept video is a quite accurate representation of how the glasses look when you’re using them.
Turns out the article mentions them.
Apple has been rumored to have such a product in the works. If theirs were this cool, I could see them really taking off.
I am extremely excited by the prospect of these, or any glasses-like system, that can work well in a package that doesn't look like someone bolted a computer to my head.
That laser warning sticker feels weird, seeing as the laser is literally the whole point of the glasses, so I'll probably wait for a second iteration just to be sure. But for me, these would win over a decent watch any day.
Using this device sounds awful!
What I do want to talk about is how this entire system fundamentally screws with your brain in a way that I can barely understand, illustrated by seemingly straightforward questions of “how do I adjust the focus of the image” and “what if I want the image to seem closer or farther away from me?”
> because the Smartglasses are using lasers to paint an AR image directly onto your retina, that image is always in focus. There are tiny muscles in our eyes that we use to focus on things, and no matter what those muscles are doing (whether they’re focused on something near or far), the AR image doesn’t get sharper or blurrier. It doesn’t change at all.
> Furthermore, since only one eye is seeing the image, there’s no way for your eyes to converge on that image to estimate how far in front of you it is. Being able to see something that appears to be out there in the world but that has zero depth cues isn’t a situation that our brains are good at dealing with, which causes some weird effects.
Oh wait, maybe it's not awful?
> for the first 10 or so minutes of wearing the glasses, your brain will be spending a lot of time trying to figure out just what the heck is going on. But after that, it just works, and you stop thinking about it (or that’s how it went for me, anyway.) This is just an experience that you and your brain need to have together, and it’ll all make sense.
wightbosch, n. failed futuristic product demonstration. Etymology from wight (poetic) ghost/deity + bosh (British) nonsense + Bosch (German technology firm)
This is the path that should ultimately give us infinite depth of field and true image injection. Super exciting!
(I'm kind of curious as to why the lasers can't directly hit the retina from the mirror array)
We've been talking about these in Sci-Fi for years.
Surely the Jet Pack can't be far off now ...
Though the issue in the episode was the addictiveness of the game (which is already an issue) rather than the fact it painted directly on your eye.
Regarding "shooting lasers at your eyeball, which sounds unhealthy", this technology has existed since at least the early '00s. Microvision was showing off their Nomad laser HMD (https://www.google.com/search?q=microvision+nomad) around 2002 IIRC. It's reasonably well established as safe.
It's a lot less confusing if you read the article lol.