
How Samsung's VR Headset Convinced John Carmack to Join Oculus VR - ismavis
http://www.engadget.com/2014/09/04/samsung-gear-vr-john-carmack/
======
ajmurmann
"And I had argued for it a long time. I wrote so many of these multi-page
emails about how important it was. But once we had it so people could see and
see that it really does work, then Samsung went and wrote a proper interface
for it."

Wow, I really have to admire his patience. While reading this I was even
thinking "They are arguing with John Carmack about what he needs to get the
graphics right? Who do they think they are?!" and I am not him. I would never
have the patience for this. That must be like my grandma arguing with me about
programming.

~~~
2muchcoffeeman
It's good to know that even John Carmack has to deal with this sort of thing
though. It's not just us lowly programmers, even the legends get hassled.

~~~
Joeri
He is also by his own admission still learning about programming:

 _On the software development side, you know there was an interesting
thing at E3, one of the interviews I gave, I had mentioned something about
how I’ve been learning a whole lot, and I’m a better programmer now than I
was a year ago and the interviewer expressed a lot of surprise at that, you
know after 20 years and going through all of this that you’d have it all
figured out by now, but I actually have been learning quite a bit about
software development, both on the personal craftsman level but also
paying more attention to what it means on the team dynamics side of
things._

[https://blogs.uw.edu/ajko/2012/08/22/john-carmack-discusses-the-art-and-science-of-software-engineering/](https://blogs.uw.edu/ajko/2012/08/22/john-carmack-discusses-the-art-and-science-of-software-engineering/)

------
greyskull
> Gear VR's been in development for around 1.5 years now

Wow. That puts its start not long after the Oculus Rift Kickstarter, which was
in October 2012. I wonder how the Gear project came to start. Were there some
graphics enthusiasts with decision-making power at Samsung who went "this. we
need to get on this"?

------
pimlottc
I'm confused. I know Carmack joined Oculus, but what's this about Samsung? I
take it they're working together now? It's still two different pieces of
hardware, right? This article is leaving out some crucial context here.

~~~
Htsthbjig
"but what's this about Samsung?"

It is called "the screen". Samsung manufactures (mass-produces) the best
screens in the world. Their OLED screens are the best quality: simple,
lightweight, and with an extremely fast response.

Oculus needs Samsung's screens. Samsung needs Oculus's talent, so they are
partners.

~~~
pimlottc
Sure, I'm not saying it didn't make sense, just that the article didn't bother
to explain it at all. If you start off by saying someone works for one
company, then start going on about him doing stuff for another company, you
ought to explain what the two of them have to do with each other.

------
moron4hire
So, I've been working on making it easy to make WebGL VR demos:
[https://github.com/capnmidnight/VR](https://github.com/capnmidnight/VR)

------
iandanforth
I truly do not understand this. Why would you want to dilute presence? Why
would you want to create an expensive phone accessory when the hardware
changes constantly and the software is extremely fragmented? There are plenty
of scenarios where I can see using a Rift, but very few where I am
'mobile' and want to experience VR. Of those cases I can't see any where a
dedicated, occasionally updated set of hardware isn't a better development
experience than trying to deal with VR hassles on top of all the standard
Android ones. Please, someone point out what Carmack gets here that eludes me.

~~~
readerrrr
It is simple. The target Carmack is aiming for is mobile VR. The tethered
experience is a necessary middle step. Oculus will release their own version
in the future, and Gear allows him to develop the tech.

~~~
mietek
You could say mobile VR is a necessary step towards AR (augmented reality).

[http://blogs.valvesoftware.com/abrash/why-you-wont-see-hard-ar-anytime-soon/](http://blogs.valvesoftware.com/abrash/why-you-wont-see-hard-ar-anytime-soon/)

------
skizm
It's 2014 and there are still websites that auto-start videos? At the bottom
of the page, no less? I pretty much Ctrl+Q out of fear at this point.
Autoplaying videos = virus/phishing website in my head.

~~~
runeks
You should enable click-to-play for Flash content in your browser. It both
improves security and prevents auto starting videos.

~~~
skizm
Nice, didn't know that existed. Thanks.

------
melling
Why do we need to tether to a desktop for the extra power? Can't we create a
wireless video standard to drive the VR headset?

~~~
TheCoreh
I believe latency would not be acceptable with current wireless technology.

~~~
jacquesm
Wireless video can have approximately one millisecond of latency today. I'm
not sure how bad that would be with regard to things like nausea, but it seems
pretty short to me.

~~~
dubcanada
Your eyes are only made for a certain frame rate. Any more and your eyes will
ignore it, but your brain will need to piece it back together (i.e. nausea).
Any less and it will seem choppy. It's a very thin line at which it goes from
nausea to choppy/laggy.

Not that a few milliseconds of latency really matter, but if you don't care
about 2 milliseconds here and another 2 there, they start adding up. It's much
better to try your best to have the lowest possible latency.
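As a toy illustration of how a few milliseconds here and there accumulate into
the total motion-to-photon delay (the stage names and numbers below are
hypothetical assumptions for the sketch, not figures from the article):

```python
# Hypothetical motion-to-photon latency budget for a wireless VR setup.
# Every stage and number here is an illustrative assumption, not a measurement.
budget_ms = {
    "IMU sensor sampling": 2.0,
    "wireless video link": 1.0,   # the ~1 ms figure mentioned above
    "game + render loop":  8.0,
    "display scan-out":    5.0,
}

total_ms = sum(budget_ms.values())
print(f"total motion-to-photon latency: {total_ms:.1f} ms")  # 16.0 ms
```

Each stage looks harmless on its own, but the sum is what the vestibular
system actually experiences, which is why shaving every stage matters.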

~~~
SixSigma
Eyes don't have a framerate. Fighter pilots have been tested and can identify
the type of plane in an image from a single frame shown at 255 fps.

Noticing a flash of light can go into 1000 fps territory.

[http://www.100fps.com/how_many_frames_can_humans_see.htm](http://www.100fps.com/how_many_frames_can_humans_see.htm)

~~~
apl
While it's true that a biological eye doesn't really have a defined
"framerate" the way video sensors do, the examples you're providing don't
really show that. Sensitivity to very short stimuli doesn't prove temporal
resolution in the sense that's applied here; what you want to be looking at is
flicker fusion instead. As a cheeky example -- still pictures have
approximately zero frames per second of temporal resolution but would be able
to pick up that 2.5ms plane if their sensitivity were sufficiently high...

Measuring ERGs (electroretinograms) indicates that our photoreceptors can't
really resolve luminance fluctuations above, at the very limit, 70-80Hz. Even
Drosophila's high-speed cascade doesn't really modulate beyond 200-250Hz. I'm
really doubtful about those claims.

~~~
ghusbands
Indeed, flicker fusion for humans pretty much tops out at 80Hz, but you're
underestimating the effects of smooth pursuit eye movement. For example, if
you animate a fast-moving object at 80Hz on a 160Hz display, and follow it
with your eyes, you will see two copies of the object. There's really no upper
limit for this effect.
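A quick back-of-the-envelope sketch of that double-image effect (the object
speed is a hypothetical number chosen for the arithmetic):

```python
# Smooth-pursuit ghosting: content animated at 80 Hz on a 160 Hz panel shows
# each frame twice. An eye tracking the object keeps moving during the
# repeated refresh, so the repeat lands offset on the retina -- a second copy.
display_hz = 160        # panel refresh rate
content_hz = 80         # animation update rate
speed_px_s = 1600       # hypothetical object speed across the screen

repeats = display_hz // content_hz          # refreshes per animation frame
eye_travel_px = speed_px_s / display_hz     # gaze motion per refresh
ghost_offset_px = eye_travel_px * (repeats - 1)
print(f"perceived copies: {repeats}, offset: {ghost_offset_px:.1f} px")
# perceived copies: 2, offset: 10.0 px
```

The offset scales linearly with object speed, which is why there is no upper
display rate beyond which the effect vanishes for fast enough motion.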

------
yummybear
Is there, currently, a way to put positional headtracking into a wireless
device (without using a fixed camera)?

~~~
madgeeklabs
Using the device camera together with Microsoft's Photosynth technology would
be viable. Maybe not fast enough to be comfortable for the wearer of the VR
set, though.

------
dharma1
So how do I create content for this thing? Unity3D -> export to Android, or
something else?

------
omglol
Engadget is not a credible source for any news...

~~~
declan
I have no affiliation with Engadget (in fact, I've spent over a decade working
at _competitors_ to Engadget). But I've found that it makes sense to evaluate
news organizations' output on an article-by-article basis. Even the best news
organizations can screw things up on occasion, and even excellent reporters
can be assigned derivative stories by their editors.

The article-by-article approach is what [http://recent.io/](http://recent.io/)
uses in its pipeline, which is working very well in testing so far.

More to the point, this looks like a perfectly decent story and an engaging
read. The writer even managed to avoid being snarky about Samsung trying
(sigh) to second-guess John Carmack!

