
Cloudy Gamer: Playing Overwatch on Azure's New Monster GPU Instances - detaro
http://lg.io/2016/10/12/cloudy-gamer-playing-overwatch-on-azures-new-monster-gpu-instances.html
======
Neeek
This is pretty cool to see! I personally don't think it will catch on too
widely in my circles; 40-80ms of input delay is not a trivial number, and you
can see it clearly in the video footage. It definitely found a niche with this
guy, though, and probably many more people as well. Even if I share some of
the sentiments from the other comment, it's pretty amazing what we're able to
do with the tech.

On a side note, is that laptop connected over wifi? I've found that even
in-home streaming from my computer to a laptop can get pretty choppy if I'm
not sitting in the living room in front of the router (and sometimes even that
isn't enough).

------
walrus01
ISP network engineer here: I really don't see this ever being practical in
terms of latency, even if the client-to-GPU latency is only 5ms or less. That
is still an eternity compared to the latency and performance of a GPU on a
local PCI Express 3.0 x16 motherboard bus sitting adjacent to the CPU, RAM,
and M.2 SSD. Even a distance like Seattle to Portland would be a latency
killer, and even the routing/fiber path from Issaquah to Seattle would be too
much, at 2.5ms.
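For scale, here's a rough back-of-the-envelope sketch in Python. The distances
are approximate assumptions, and it only models propagation delay, so a real
routed path (like the ~2.5ms Issaquah-Seattle one) sits well above this floor:

    # Propagation-only floor for one-way latency over fiber.
    # Real routed paths add switching/queuing delay on top, which is
    # why the ~2.5ms Issaquah-Seattle figure above is much higher.
    C_KM_PER_S = 299_792          # speed of light in vacuum, km/s
    FIBER_FACTOR = 0.67           # light in glass travels at ~2/3 c

    def one_way_ms(distance_km):
        return distance_km / (C_KM_PER_S * FIBER_FACTOR) * 1000

    # Approximate straight-line distances (assumptions, not routes).
    for route, km in [("Issaquah -> Seattle", 25),
                      ("Seattle -> Portland", 280)]:
        print("%s: ~%.2f ms one way" % (route, one_way_ms(km)))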

~~~
jordanthoms
It's not uncommon for screens to have input latencies in the 50-100ms range -
John Carmack tweeted "I can send an IP packet to Europe faster than I can send
a pixel to the screen. How f’d up is that?". [0]

Given that people are generally happy to accept that kind of latency for a
frame getting from their GPU to photons, I don't see why an extra ~10ms of
latency getting a frame back and forth from a local datacenter would be a
dealbreaker except for the most demanding scenarios. It does require that the
datacenter is close to the user though.

[0] -
[https://twitter.com/id_aa_carmack/status/193480622533120001](https://twitter.com/id_aa_carmack/status/193480622533120001)
& [http://superuser.com/a/419167](http://superuser.com/a/419167)

~~~
user5994461
The latency to the screen is fixed, with no jitter. The human mind can adapt.

Network latency is erratic and variable, and sometimes packets are simply
lost.
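To make the fixed-vs-variable distinction concrete, a toy sketch with made-up
numbers (hypothetical samples, not measurements):

    import statistics

    # Hypothetical samples: a display adds a constant delay, while
    # network RTTs vary, spike, or drop packets entirely.
    display_ms = [60.0] * 8
    network_ms = [12.1, 11.8, 25.4, 12.3, 11.9, 48.0, 12.2, 12.0]

    for name, samples in [("display", display_ms),
                          ("network", network_ms)]:
        print("%s: mean=%.1f ms, jitter (stdev)=%.1f ms"
              % (name, statistics.mean(samples),
                 statistics.stdev(samples)))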

~~~
throwaway7767
> The latency to the screen is fixed with no jitter. The human mind can adapt.

In addition, latency is cumulative and after a certain threshold becomes
extremely annoying. Adding network delays to the already-present latency
elsewhere in the system just makes you more likely to hit that threshold.
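As a sketch of that budget idea (every stage value and the threshold below are
illustrative assumptions, not measurements):

    # Hypothetical end-to-end latency budget for remote play.
    stages_ms = {
        "input device + OS":      5,
        "network round trip":     10,
        "server render + encode": 15,
        "client decode":          5,
        "display pipeline":       60,
    }
    THRESHOLD_MS = 100   # assumed annoyance threshold; varies by player
    total = sum(stages_ms.values())
    print("total: %d ms (%s the %d ms threshold)"
          % (total, "over" if total > THRESHOLD_MS else "under",
             THRESHOLD_MS))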

------
gravypod
I see this as the sad, yet inevitable, future of gaming. First we didn't own
the game; we owned a licensed copy. Next we won't own a copy or even a
license, but will instead rent time to play a game at exorbitant rates, and it
will be heralded as "amazing" because it's "cheaper for you" since you don't
need to buy your own hardware.

~~~
ManlyBread
This has been tried before (it was called OnLive, AFAIR) and it didn't catch
on. The prospect isn't great for publishers either, since the cost of
investing in and maintaining such a solution would be enormous for what would
essentially be a subpar experience compared to what we have right now.

~~~
rebuilder
I'd expect it to be a distributor providing this service. Think Steam as a
subscription service. Play any game in the catalog, as much as you want, for a
monthly fee, without any installations or driver hunts. Seems like a good
proposition for the customer, provided it works and is cheap enough. Don't
know how publishers and game devs would take it.

------
shermanyo
Can we acknowledge that there's a huge set of games that don't rely on
low-latency input or feedback?

Games like Civilization or X-Com could increase their level of detail by an
order of magnitude while still being playable on a notebook.

~~~
Neeek
Absolutely, even third person RPGs probably wouldn't suffer too badly.

