
Mozilla, OTOY, Autodesk Work to Deliver High Performance Games and Apps on Web
https://blog.mozilla.org/blog/2013/11/05/mozilla-otoy-and-autodesk-work-to-deliver-high-performance-games-and-applications-on-the-web/
======
cromwellian
It's an idea tried and failed many times over (GPU hosted games in the cloud,
OnLive, GaiKai, etc). The round-trip lag from the time you press a button
until the time you see a change on the screen is just unacceptable for most
twitch oriented games. The JS/Web implementation doesn't change anything about
the fundamentals. I don't know why Mozilla is bothering with this approach vs
Emscripten/asm.js, which would fare much better (but is still not likely to
succeed for AAA games).

You have people complaining about touch lag on Android devices, which is on
the order of 100 ms, and you're telling me you're going to send a packet with
a controller movement, render a frame, compress a frame, ship it back, and
display it in under 100 ms? The demo shown is a non-interactive cut-scene, so
no one's going to notice the input lag. OnLive, when latency was actually
measured, came out around 150 ms - totally unacceptable.

The true test would be running something like CoD, BF4, or an SF2 tournament
and getting pro gamers to evaluate the system.

AAA titles are never going to run like this economically. I'm playing
Battlefield 4 right now, a huge huge game that brings my current PC rig to its
knees. Why would anyone want to suffer this with 150ms lag and compression
artifacts?

~~~
codeboost
Most multiplayer games are quite playable over the network. The difference is
that the graphics are rendered locally and the world state is streamed over
the network.

A fast enough pipe (100Mbps fiber is quite common and cheap here in Europe)
could deliver good results, even if the whole world is rendered and then
streamed as video.

~~~
cromwellian
Multiplayer games run with a synchronization/prediction model. Most are not
run with a purely server-driven world state. The entire game simulation runs
locally and player inputs are sent to the server; the server then collects all
the inputs, calculates differences to the world state, and sends back the
diffs with predictions based on latency. These diffs may differ from the local
state, and so corrections are applied. In the majority of cases, the
differences are minor enough that the player doesn't notice. When they are
severe, the player notices hitbox inaccuracy or, worse, rubberbanding as his
actions are "snapped back".

So when you're playing an FPS and you press a button to shoot, the local game
logic and physics computes and displays the result immediately (the firing
animation and sound start playing). A short delay later, the server confirms a
kill.

Even with this fairly sophisticated model, you still want <50ms pings. Video
streaming and running the entire game in the cloud just won't work for games
that require rapid hand-eye coordination.
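
The predict-then-reconcile loop above, as a toy sketch (the 1-D movement and
all the names are mine for illustration, not from any real engine):

```javascript
// Toy client-side prediction with server reconciliation. The client applies
// inputs immediately, remembers them, and when an authoritative state comes
// back it rewinds to that state and replays any unacknowledged inputs.

function makeClient() {
  return { x: 0, pending: [], seq: 0 };
}

// Apply an input locally right away and queue it for the server.
function applyInput(client, dx) {
  client.seq += 1;
  client.x += dx;
  client.pending.push({ seq: client.seq, dx });
}

// Server sends back an authoritative position plus the last input sequence
// number it processed. Rewind, drop acknowledged inputs, replay the rest.
function reconcile(client, serverX, ackSeq) {
  client.x = serverX;
  client.pending = client.pending.filter(i => i.seq > ackSeq);
  for (const i of client.pending) client.x += i.dx; // replay
}
```

When the server's correction is small, the replayed position lands close to
where the client already drew the player; when it's large, you get the
"snap back" the parent describes.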

------
est
So, instead of demo videos on youtube, where can I try this hype in _my_
browser?

~~~
shrike
Here are the servers, spin one up and give it a try!

[https://aws.amazon.com/marketplace/search/results/ref=gtw_na...](https://aws.amazon.com/marketplace/search/results/ref=gtw_navgno_search_box?searchTerms=OTOY&search=)

~~~
guiomie
Can't I download some binaries, and have my own local instance running
instead?

~~~
lelandbatey
Nope. Welcome to the future. Here is your lifeViewer (tm); please don't forget
to make your monthly payment or your view into life will be terminated.

------
ryanackley
Don't get me wrong - this is a cool technological achievement if it can
deliver as promised.

However, when they originally announced this in May, my impression was that
this was going to be open sourced and a part of Firefox. I found that to be
really exciting.

It turns out, it's just another business trying to show us the future. Nothing
wrong with that but it doesn't excite me as much.

Also, Brendan Eich is an advisor to Otoy, the company behind ORBX.js. Is it
just me or does anyone else think that it's a conflict of interest for him to
market a for-profit service/product via Mozilla? It's not clear what exactly
Mozilla's involvement is in this project and what they get out of it besides a
broader web ecosystem.

~~~
sandyman
If it weren't for Mozilla creating broadway.js, ORBX.js would never have
happened. Andreas' work on that library was the key inspiration for ORBX.js.
Since May, Mozilla has helped us optimize the JS code (which was key post
FF22, when the JS VM changed), and is helping us move the decoder entirely to
the GPU in WebGL 2. I think at some point we would like to open source the
older ORBX.js codecs as we iterate on this first version, but even that
doesn't make much sense until we get a stable file format for video. Right now
ORBX.js is tuned for live streaming. That will change with v2, which we're
targeting for early next year with compression close to HEVC - see
[http://aws.otoy.com/docs/ORBX2_Whitepaper.pdf](http://aws.otoy.com/docs/ORBX2_Whitepaper.pdf)

------
mirsadm
This sort of stuff is just not very exciting to me. Latency is very important
in gaming. Unless the servers are in the next room, the latency will probably
be crappy. Companies have enough issues launching AAA titles these days
without streaming them. It'll be like the SimCity launch every time.

The only really exciting thing going on in gaming (for me) is the Oculus Rift.
Outside of gaming maybe there are other use cases. Given how powerful and
cheap hardware is, I just have a hard time believing it'll take off.

~~~
nl
[https://brendaneich.com/2013/05/today-i-saw-the-future/](https://brendaneich.com/2013/05/today-i-saw-the-future/)
is a much better link. To quote:

 _OTOY’s CEO Jules Urbach demo’ed an entire Mac OS X desktop running in a
cloud VM sandbox, rendering via ORBX.js to Firefox, but also showed a Windows
homescreen running on his Mac — and the system tray, start menu, and app icons
were all local HTML5 /JS (apps were a mix ranging from mostly local to fully
remoted, each in its own cloud sandbox)._

Personally I find _that_ much more interesting than anything to do with
gaming.

~~~
sliverstorm
Why is that interesting? Has VNC been EOL'd?

~~~
nl
_apps were a mix ranging from mostly local to fully remoted, each in its own
cloud sandbox_

I'm not aware of a way to do that easily using VNC.

Also, is there a good API for programming VNC? So I can inspect frames and act
upon them?

------
tlack
Has anyone actually had any luck getting this working? Seems like such a
revolutionary idea that I'm dying to try it, but no luck so far.

It took me a few tries (and a couple of hours) to get one of their
preconfigured AMIs provisioned and to excruciatingly extract the GUID from
Win2k8 (people actually work this way?), but now their weird HTTP bouncer
endpoint doesn't seem to be connecting at all.

~~~
jack57
I'm in the same boat. This is pretty frustrating considering I'm paying for
this special GPU instance by the hour.

~~~
tlack
Are you aware of any support boards or anything like that on which we could
get resolution?

~~~
sandyman
[http://render.otoy.com/forum/viewforum.php?f=70](http://render.otoy.com/forum/viewforum.php?f=70)

------
RexRollman
Video via javascript? Advertisers must love the idea of this. I hope that
there will be a way to turn this kind of thing off.

~~~
ssafejava
I believe sound is disabled by default - makes it honestly not very different
than an animated GIF from an advertising perspective.

~~~
RexRollman
The nice thing about Firefox is you can stop animated GIFs by hitting ESC.
There is also an about:config setting that will turn them off. I personally
find animated/video graphics annoying when trying to read.

------
randyrand
So VNC...? But in your browser? I don't see how making VNC work in the
browser is 'revolutionary.' What does it bring to the table beyond a
standalone VNC client?

~~~
nl
Yep, and the web is exactly the same as green screen terminals.

Instead, imagine VNC, but _programmable with Javascript_ across applications
in a standard way.

Imagine using a remote app, but when a particular trigger occurs (think a
special button on the client) it opens up a new connection to another server,
and splices the connection into the view, so you can use apps on the two
servers at the same time.

But there is plenty more - Javascript can inspect individual frames, and use
them as triggers for other actions.
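
That part, at least, is easy once a frame lands in a canvas - getImageData
hands you the raw RGBA bytes. A toy sketch (the trigger-color idea and the
function name are made up, not part of ORBX.js):

```javascript
// Sketch: scan a decoded frame's RGBA pixel buffer for a trigger color.
// In a browser you'd get `pixels` from
// ctx.getImageData(0, 0, width, height).data after drawing the remote frame.

function containsTriggerColor(pixels, r, g, b) {
  // RGBA layout: 4 bytes per pixel.
  for (let i = 0; i < pixels.length; i += 4) {
    if (pixels[i] === r && pixels[i + 1] === g && pixels[i + 2] === b) {
      return true; // e.g. splice a second remote connection into the view here
    }
  }
  return false;
}
```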

~~~
jameshart
So, like X Windows in the browser, then?

~~~
nl
Yeah, except video acceleration has more of a chance of working reliably in
this than in X Windows (it's a JOKE people! I'm not really trying to have a
discussion about how video acceleration in Linux really does work now,
finally, this time for good. Really.)

Anyway, I'm not sure what point you are trying to make.

Yes, all remote display protocols are similar at some level.

The fact that this is programmable via Javascript and runs in the most widely
deployed client app ever made (ie, a browser) is a fairly significant
difference though.

------
riskable
From the TechCrunch link (which was in the article):

>A single GPU, Amazon argues, can support up to eight real-time 720p video
streams at 30fps (or four 1080p streams).

Seriously? I have been working on X11 support in Gate One for a while now, and
my laptop, with absolutely ZERO GPU acceleration, can deliver/encode 720p,
30fps video to a browser at around 5% CPU utilization.

Proof: [http://youtu.be/6zJ8TNcWTyo](http://youtu.be/6zJ8TNcWTyo)

------
abhiv
It's not clear to me from the announcement which parts are executed server-
side and which client-side.

Companies like iSwifter have tried to do server-side-rendered, streamed Flash
games for years with limited success. The local machine in that case simply
transmits input and displays video.

I think the difference here is that there will be some client side computation
as well, but I'm not sure how much. If the GPU is in the cloud, that seems to
indicate that they are bypassing WebGL, which would provide access to the
local GPU. So my guess is that the JS does the typical setup work of a CPU in
a rendering pipeline (setting up the scene, constructing draw batches),
transmits it to the cloud GPU, and then transfers the rendered frame to the
local GPU for display.

For interactive applications, it seems like the CPU-cloud GPU latency could be
a deal breaker, though John Carmack famously said that it was faster to ping
Europe from the US than to draw a pixel to the screen.

~~~
sandyman
It's rendered server side. The decoding is done entirely in JS, so no plug-
ins. Not even the <video> tag (try doing streaming with that; the latency is
+100 ms). So, when you boot an AMI up on Amazon, the web page streams back the
host FB as a 60 Hz HD stream. Here in Los Angeles, my ping time to EC2 West
Coast (NorCal) is 13-16 ms. Encode is 4-6 ms, decode in JS is 4-8 ms - I don't
notice any latency. I am very curious to see the experiences others have. BTW,
I work at OTOY, and my username on the OTOY forums is "Goldorak". I can help
anyone out there if they need assistance.

~~~
abhiv
Thanks for the reply. What about the additional latency of sending the
rendered framebuffer to the local graphics card after it's received from the
cloud? It seems like you're right on the edge of being able to do one-frame
latency between input and render at 30 fps (33 ms per frame), but would have
at least two-frame latency at 60 fps?
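
Back of the envelope, using the worst-case figures from the parent comment
(the labels are mine, not OTOY's):

```javascript
// Rough latency budget for the stream, in milliseconds.
const ping = 16;   // LA -> EC2 us-west, worst of the quoted 13-16 ms
const encode = 6;  // server-side encode, worst of the quoted 4-6 ms
const decode = 8;  // ORBX.js decode, worst of the quoted 4-8 ms
const total = ping + encode + decode;  // 30 ms

const frame30 = 1000 / 30;  // ~33.3 ms per frame @ 30 fps
const frame60 = 1000 / 60;  // ~16.7 ms per frame @ 60 fps

// The budget fits inside one 30 fps frame but spans two 60 fps frames.
console.log(total, total <= frame30, total <= frame60);
```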

~~~
sandyman
Well, if you run Aero in Windows 7, that adds 3 frames of latency, for
example. In my testing, I don't think decode->canvas->present is much of an
issue, but others might be more sensitive. Of course, that last step is also
browser specific. Firefox 19+ seems the smoothest to me, but Chrome 26+ and
Opera 16+ give very good results too.

------
wmf
So the encoding is done by Nvidia in hardware; I guess ORBX is doing the
decoding. Is there some reason why asm.js + WebGL is better than <video> +
DASH/HLS?

When it comes to watermarking, I'd like to see a cost comparison between
watermarking and a static CDN (e.g. Netflix can serve 15+ Gbps per server).

~~~
w-ll
[http://phoboslab.org/log/2013/05/mpeg1-video-decoder-in-java...](http://phoboslab.org/log/2013/05/mpeg1-video-decoder-in-javascript)

Was posted here a few months back. It's a hand conversion of an MPEG decoder
to JavaScript, streaming ffmpeg output over WebSockets to a canvas.
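
The plumbing for that kind of pipeline is pretty small. A hedged sketch of the
receiving side (the 8-byte header layout here is invented for illustration,
not phoboslab's actual wire format):

```javascript
// Sketch: parse one binary frame message as it might arrive over a
// WebSocket (ws.binaryType = 'arraybuffer') before handing the pixels to a
// canvas or WebGL texture. Assumed layout: uint32 width, uint32 height
// (little-endian), followed by raw RGBA pixel data.

function parseFrame(buffer) {
  const view = new DataView(buffer);
  const width = view.getUint32(0, true);
  const height = view.getUint32(4, true);
  const pixels = new Uint8Array(buffer, 8, width * height * 4);
  // In the browser the next step would be something like:
  //   gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
  //                 gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return { width, height, pixels };
}
```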

------
angularly
Quite amazing. I predict this will have a big impact on gaming. It basically
means that games can be delivered from the cloud, and it seems quite plausible
that this is how games will be delivered in the near future.

For the consumer, it means you don't have to buy games; you can subscribe to a
service and rent them. For the game developers, it means they don't have to
worry about illegal copies.

~~~
sillysaurus2
_I predict this will have a big impact on gaming. It basically means that
games can be delivered from the cloud, and it seems quite plausible that this
is how games will be delivered in the near future. For the consumer, it means
you don't have to buy games; you can subscribe to a service and rent them. For
the game developers, it means they don't have to worry about illegal copies._

This idea has been floating around for years. A company went bankrupt trying
to do it.

I predict it won't happen unless/until Valve makes it happen. No one else
would be able to convince enough gamers to switch. And Valve has no incentive
to do it, because there's really no incentive for anyone to do it. At least,
not for the gaming audience at large. This tech has the potential to be a
godsend for 3D artists wishing for realtime previews of their work. But
gamers? Not so much.

Remember the other day an article was floating around like "Desktop PCs aren't
dead, we just don't need new ones"? That's finally starting to become true for
gaming PCs as well. Gamers are quite content on their current gen boxes.

You may argue that this tech will enable higher graphics fidelity, and will
blow people away with how real it looks. But considering nobody knows how to
make games look any more real than they look now, I wouldn't hold my breath.
Gaming graphics has plateaued.

In summary, "games rendered via the cloud" is the pets.com of the gaming
industry.

EDIT: Oy. If you're going to try to refute me, then put some effort into it.

~~~
KaoruAoiShiho
Are you sure you're well versed in this subject? I detect more than a few
dubious statements in your post.

~~~
vinkelhake
Then do us a favor and point them out so that there can be a real discussion.

------
bdegman
I'm interested in something like this for high-end CGI and graphics work.
Being able to use a macbook air with a big workstation back end in the cloud
would be a dream. I know there are render farm solutions that big studios use
but realtime access to lots of computing power from anywhere would be amazing
for freelancers and smaller studios.

~~~
thenomad
That's exactly what OTOY are aiming at with their cloud rendering. See
[http://www.tomshardware.com/news/Otoy-OctaneRender-Cloud-Edi...](http://www.tomshardware.com/news/Otoy-OctaneRender-Cloud-Edition-gpu,21770.html)

------
ibash
What do you mean by rendered with pure JS? It's rendered server side, isn't
it?

~~~
asdfs
Apparently the video decoding and other client-side stuff is done purely in
JavaScript (and in particular using WebGL), no <video> tag or plugins or
anything like that. I presume that all the server-side stuff is still native
code.

I'd still like to see a demo page with their JavaScript decoder.

~~~
tonyplee
Are you sure it's not just as simple as opening up a WebSocket and piping the
frame data as texture data directly into the WebGL instance?

~~~
sandyman
No, there is a full DCT decoder in ORBX.js. The encoder is built into the
Amazon AMI and can use the CPU or GPU for encoding (the GPU encoder is pure
OpenCL - so one day ORBX.js could support encoding in the browser through
WebCL). If you sent raw data down to the browser, you would need a ~1 Gbps
connection. We are targeting 4G/LTE speeds at 8-12 Mbps for HD @ 60 Hz, with
support going down to 1-3 Mbps for 1024x768 @ 30-60 Hz.
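
The ~1 Gbps figure for raw data checks out as rough arithmetic (assuming
24-bit RGB at 720p; the exact resolution is my assumption):

```javascript
// Uncompressed 720p @ 60 Hz, 24-bit RGB: bits per second on the wire.
const rawBps = 1280 * 720 * 3 * 8 * 60;  // ~1.33 Gbps
const rawGbps = rawBps / 1e9;

// Compression ratio the codec needs to hit the 12 Mbps end of the target.
const ratioAt12Mbps = rawBps / 12e6;     // roughly 110:1

console.log(rawGbps.toFixed(2), Math.round(ratioAt12Mbps));
```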

------
gcb1
What's with the SGI name reuse?

SGI IRIX had something similar - think it was called Inventor. Basically a
VRML viewer plugin. Also, their last workstation was called Octane.

------
rhelmer


