When it works, it's great. I can absolutely see this being the future of gaming as home bandwidth continues to improve, and more datacenters get built near major population centers.
It is _very_ sensitive to the connection, though. This summer, I was about 11 milliseconds away from an AWS datacenter with a 500 megabit connection. Buttery smooth perfection. No video compression artifacts. Now, I am about 60 milliseconds away with a 50 megabit connection and it is unplayable. 50 megabits/60 milliseconds is a good connection by US broadband standards these days!
It's not a 'normal' scale of quality like having different levels of PC hardware.
It either works perfectly, or is completely unplayable.
There's a low cap on the maximum market size based on people's internet quality.
Anyone who's written game netcode (either as a hobby or professionally) knows that you build the game design and game engine from the ground up to tolerate latency.
For action games, most of the time it's all about building a game design where you're predicting (either physical location or other players' actions), except in the few rare cases that do time-rewinding (most fighting games, and some FPSes that combine both approaches, most notably Counter-Strike). The prediction is usually handled by dead reckoning.
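To make the dead-reckoning part concrete, here's a minimal sketch (my own names and a toy constant-velocity model, not any particular engine's code):

    // Dead reckoning: keep drawing a remote entity where its last known
    // velocity says it should be, until the next authoritative snapshot
    // arrives and you correct (ideally by blending, not snapping).
    struct EntityState {
        float  x, y;       // last position received from the server
        float  vx, vy;     // last velocity received from the server
        double timestamp;  // server time of that snapshot, in seconds
    };

    EntityState extrapolate(const EntityState& s, double now) {
        const double dt = now - s.timestamp;  // time since the snapshot
        EntityState out = s;
        out.x += static_cast<float>(s.vx * dt);
        out.y += static_cast<float>(s.vy * dt);
        out.timestamp = now;
        return out;
    }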
For large scale, low bandwidth games (AKA RTSes and the like, where gamestate is deterministic), that's handled via lockstep: the gamestate is 100% deterministic and all clients advance together on a shared set of inputs, in "lock step".
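A toy version of the lockstep loop, just to show its shape (hypothetical names; real implementations add an input-delay window, timeouts, and desync detection):

    #include <cstddef>
    #include <map>
    #include <vector>

    using Tick = unsigned;
    struct Input { int playerId; int command; };

    // Must be 100% deterministic: same inputs, same state, on every client.
    void simulate(const std::vector<Input>& inputs) { (void)inputs; /* ... */ }

    std::map<Tick, std::vector<Input>> pending;  // inputs received, per tick
    Tick currentTick = 0;
    const std::size_t kNumPlayers = 4;

    // Advance only once *everyone's* input for the current tick is in.
    // This is why lockstep games stall with "waiting for players..."
    // instead of desyncing when someone's connection hiccups.
    void tryAdvance() {
        while (pending[currentTick].size() == kNumPlayers) {
            simulate(pending[currentTick]);
            pending.erase(currentTick);
            ++currentTick;
        }
    }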
Both of these approaches can tolerate latencies up to ~600ms (back in the ole 28.8/56k days of '97, SubSpace was doing ~300 players in one zone with a high skill curve and robust netcode). They usually mask the latency with client-side reactions that are then reconciled with the server in a robust way. If you're just pushing video frames over a network stack, none of that is available to you, no matter how fancy your FEC or other tricks are.
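To be concrete about what "client-side reactions reconciled with the server" means, a sketch (again my names; real netcode does this per tick and under packet loss):

    #include <deque>

    struct PlayerInput { unsigned seq; float dx, dy; };  // seq = input number
    struct PlayerPos   { float x = 0, y = 0; };

    std::deque<PlayerInput> unacked;  // inputs the server hasn't confirmed yet
    PlayerPos predicted;              // what we actually show the player

    void applyInput(PlayerPos& p, const PlayerInput& in) {
        p.x += in.dx;                 // stand-in for real movement code, which
        p.y += in.dy;                 // must match the server's exactly
    }

    // When an authoritative snapshot arrives: accept the server's position,
    // drop the inputs it has already seen, and replay the rest so the local
    // player never feels the correction as a snap.
    void reconcile(PlayerPos serverPos, unsigned lastAckedSeq) {
        predicted = serverPos;
        while (!unacked.empty() && unacked.front().seq <= lastAckedSeq)
            unacked.pop_front();
        for (const PlayerInput& in : unacked)
            applyInput(predicted, in);
    }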
Somehow I've now got the urge to go dust off the Continuum client again and boot up SubSpace.
They will be playing with Bluetooth gamepads (5ms latency) on their TV (10-20ms latency) over Wi-Fi (5-10ms latency), so the internet streaming delay of <10ms from an edge server will be barely noticeable.
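Adding those up (taking the midpoints of the ranges): 5 ms + 15 ms + 7.5 ms ≈ 28 ms of lag before a single packet leaves the house, so another <10 ms of streaming delay grows the total by only about a third.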
For example, Stadia is featuring "Lara Croft and the Temple of Osiris" which is a perfect game for high-latency unskilled casual play.
(FWIW I heavily use Steam's streaming client, so I'm pretty familiar with most of the failure modes; it doesn't work great for everything but is convenient when the game style and network performance overlap.)
I can deal with 1080p when playing old PS2/PS3 games through an online service, or modern games at the same resolution with high settings and snappy controls.
I cannot play a game with noticeable input lag whatsoever, even if it was at 4k HDR 144hz.
You'd think that "serious gamers" would obsess over the merits of games (mechanics, plot, level design, gameplay, etc), not the technical minutiae of framerates and input lag. That term always seemed odd to me.
Generally, I agree. But to me, input lag and performance do have an impact.
Like watching a movie cam rip with choppy audio. You can still admire the film, but it's not going to be a pleasant experience compared to watching it in the cinema.
But the prediction is exactly what makes it feel unresponsive. So the human perception is input lag, regardless of what causes that feeling.
> I cannot play a game with noticeable input lag whatsoever, even if it was at 4k HDR 144hz.
People are surprisingly tolerant of different latencies.
Many Vsync implementations end up with 4-5 frames of latency, which ends up around 75ms, and consoles that run at 30FPS (or 20 FPS like some N64 classics!) can be in excess of 100ms. If the game is designed around it, people will adapt (just like people adapt to 24fps movies).
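(For reference, the arithmetic: one frame at 60 fps is 1000/60 ≈ 16.7 ms, so 4-5 frames of Vsync buffering is roughly 67-83 ms; hence the ~75 ms figure.)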
I agree though; if you want games to feel like your controller is physically connected, 10-20ms is what most folks in the VR industry are targeting.
At first I was surprised by the playability of Stadia on such a slow connection myself, but even though I can't play shooters, I wouldn't go so far as to call the experience I get on a 10 Mbps connection 'unplayable'.
a) in a metro area that has a major internet exchange. Hurricane Electric's list of PoPs is a decent start: http://he.net/ip_transit.html Some of those locations are better connected than others, though. In the US, the prime exchanges are really LA, SF/SJ, Seattle, Reston, New York, Miami, Dallas, Chicago. If you settle near a lesser exchange like, say, Denver, you'll probably have good connectivity to many networks, but many services don't have a datacenter in Denver. Also, some companies are trying to put their big datacenters farther north to save on cooling and energy costs, so bias north if possible; however, if you want excellent connectivity to South America, much of that goes through Miami.
If you're in Canada, Vancouver or Toronto are your best bets. In South America, Sao Paulo or Bogota (but note, connectivity is sparse between countries; it historically all went through Miami, but that's changed a bit). In Europe, Amsterdam, London or Frankfurt? In Asia, Singapore or Taipei. In Africa, Johannesburg. In AU/NZ, reportedly it kind of all sucks, but Melbourne/Sydney/Auckland are OK-ish.
b) not on DSL; it's usually run with settings that guarantee you a 20 ms ping to the first hop. Be careful, because some DSL providers imply fiber to the home but really mean fiber to the DSLAM (or whatever it's called today).
c) fiber is better than cable, but cable is ok as long as the company isn't incompetent. Comcast is much maligned, but their network is actually pretty good; local areas could be mismanaged though.
d) you need to actually put potential addresses into the sign-up-for-service pages. Put in the one you're looking at, as well as the neighbors'; if you get prompts like 'you've already got service, would you like to manage it', that's a good sign that they probably actually service the address. Listed available speeds for addresses that are current customers are more likely to be accurate as well. If there's a public utility fiber network, check its fiber map, but don't expect to hear back from them with a firm yes or no on availability in time to make a decision on your prospective living arrangement.
I think bitrates need to be considerably higher. 4K Blu-ray uses close to 100 Mbit/s, but movies have the advantage of not requiring really low latency encoding, which makes them far more efficient with bitrate. You're probably looking at well over 100 Mbit/s to get good PQ on these services.
With racing games I learned to compensate and anticipate. And there are pleeeeenty of slower games to play.
I was an early adopter of GeForce Now and I really enjoyed it while it was in beta. As soon as it was released to the public, all the publishers pulled their games.
Average US internet speed (the speed paid for, not just the speed available) is >120 megabits, so I wouldn't exactly call that good, considering gamers are likely to have faster-than-average connections.
In terms of latency, having your game in the cloud is worse. In terms of graphics, it's going to be worse compared to having high-end local hardware.
Despite all that, the killer app that will tip the scale is cheat prevention. Cloud gaming annihilates nearly every category of cheat. Wallhacks? Impossible. Infinite health? Impossible. Flyhacks and speedhacks? Impossible. The only category that isn't completely eviscerated is aimhacking, and even then it will be dramatically harder and less beneficial to aimhack, since you will need local realtime image analysis to do it properly and you won't even be able to aim at targets you can't currently see.
Once multiplayer gamers get a taste of that, there's no going back.
Sadly, over enough time I fear this effect will start destroying the market for high-end PC hardware, which will be both sad and a detriment to general computing.
In the long term, I believe running your own servers is going to be the future, because otherwise there'll always be license issues, e.g. when you want to play your gog.com games on GeForce Now. Or indie games or business apps in general.
Also, all those services prevent you from sharing with friends for their business reasons, meaning absolutely no coop or splitscreen.
Anyway, the key to making such services work is custom ultra-low-latency UDP protocols. I'm going with NVENC hardware encoding, CUDA for data wrangling, and a boost::asio-based C++ core for the network layer. That, and controlling packet loss, for example by self-throttling and spacing to avoid overflows at intermediate relays.
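To make the "spacing" part concrete, here's a minimal boost::asio pacing sketch; one possible shape with made-up names, not the actual implementation:

    #include <boost/asio.hpp>
    #include <chrono>
    #include <deque>
    #include <vector>

    namespace asio = boost::asio;
    using asio::ip::udp;

    // Hypothetical pacer: instead of blasting a whole encoded frame into
    // the socket at once, space the packets so the instantaneous rate
    // never exceeds the target, which keeps shallow buffers at
    // intermediate relays from overflowing.
    class PacedSender {
    public:
        PacedSender(asio::io_context& io, udp::socket& sock,
                    udp::endpoint peer, double bitsPerSecond)
            : timer_(io), sock_(sock), peer_(peer), bps_(bitsPerSecond) {}

        void enqueue(std::vector<char> packet) {
            const bool wasIdle = queue_.empty();
            queue_.push_back(std::move(packet));
            if (wasIdle) sendNext();  // kick off the pacing loop
        }

    private:
        void sendNext() {
            if (queue_.empty()) return;
            const auto& pkt = queue_.front();
            sock_.send_to(asio::buffer(pkt), peer_);
            // Wait this packet's "wire time" at the target rate before
            // sending the next one. Error handling omitted for brevity.
            const auto gap = std::chrono::nanoseconds(
                static_cast<long long>(pkt.size() * 8 * 1e9 / bps_));
            queue_.pop_front();
            timer_.expires_after(gap);
            timer_.async_wait([this](boost::system::error_code) { sendNext(); });
        }

        asio::steady_timer timer_;
        udp::socket&       sock_;
        udp::endpoint      peer_;
        double             bps_;
        std::deque<std::vector<char>> queue_;
    };

The point is just that the inter-packet gap is derived from packet size and target rate, so a 100 KB keyframe leaves as a smooth stream instead of a burst that overflows some shallow buffer along the path.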
BTW, I'm surprised by the bandwidth numbers in the article. Even fast, explosion-heavy games like BroForce work reasonably well at 5 Mbit/s if you use H.265.
Microsoft and Sony have a large catalog to offer, Amazon and Google have large data centers, a lot of money to throw at publishers, and a good path to integration with streaming (Youtube and Twitch).
Geforce Now has shown the hostility independent efforts will have to face.
I also believe the target audience is mostly users who can't or don't want to afford dedicated hardware and will want a good collection of games with a subscription.
Instead, I'm building a platform as a service. My ideal customers will be the software publishers that want to run their own streaming cloud. There's a lot of demand for enterprise solutions that allow you to demo things online. With a regular unlimited demo version, you have the risk of people trying to crack it. If the demo is a streaming service, then you can use everything, but copy nothing.
In essence, a high performance streaming solution is the perfect copy protection for pricey enterprise apps.
After trying out Stadia and GeForce Now, what I personally want is a high performance data center like what they have, but with much more control for the user. There's no reason why I should be limited to playing only games in their catalogue, because that denies me pretty much all indie games.
I think Nvidia built NVENC with an eye towards their own streaming service (GeForce Now), with an earlier version (GameStream, which lets you stream locally to a Shield device) as a test-bed.
For enterprise presentations, that's negligible compared to salaries.
Is this relevant without comparing the picture quality?
After all, you don't need a DRM module for this - It _is_ DRM.
The other thing that no one is talking about is that it removes 99% of cheats, because the game is not running locally and you just use a "dumb" terminal.
I have a feeling these streaming services will capture a huge portion of the 'casual' market, and there will be some hardcore people who refuse it or consider it a toy.
Kinda like cell phones and mobile gaming. Very popular, not going away, but didn't kill off PCs and probably can't.
"Gamers" won't like it, but AAA publishers always get what they want if it impacts profits - this is the future and there's no way around it. I wouldn't be surprised seeing the first "cloud-exclusive" release opping up soon that you can play on Stadia or equivalent only for half a year or so. Valve is working on something like this as well afaik.
I recently purchased Cyberpunk 2077 and was stunned at the amount of complaining about poor graphics, glitches, gameplay etc. as I have experienced none of those.
As for latency and lag, my most-played game is PUBG, and if any game should suffer from those issues, you'd think it would be a multiplayer battle royale game like PUBG. But no, other than being very bad at it, I've never experienced what feels like input lag or latency issues.
Sales pitch over!
I’m curious how much momentum these services will pick up by sheer virtue of it being so challenging to find competitive hardware for sale.
Online multiplayer is the only category where cloud gaming has the potential to be better than traditional hardware, but they have to build the game with cloud gaming in mind, not just port it.
I've played around with Stadia a bit. Even on "good" internet in the US (20ms ping, 250mbps), Stadia was only really enjoyable for single-player games. Even playing a relatively slow multiplayer game like Dota 2 would have been frustrating fairly often.
However, having just one server render basically 10 viewports really is the future of cloud gaming for me, once the infrastructure (stable, fast internet) is there.
First of all, Stadia streams from edge nodes - it's not about living next to data centers. I'm just giving another data point. There are numerous places for latency to be introduced in game streaming: encoding, decoding, local network, WAN, etc. An extra 15 ms on your ping is a HUGE deal for playability in game streaming. 20 ms might be an awesome ping for CS:GO, but for something like Stadia you're going to feel that negatively.
They (Gaikai) - even earlier than 2012 - had a very smooth gaming experience, considering the general state of connectivity and GPU/CPU hardware then compared to today.
I'm not surprised to see Google pushing out fat streams like that, since they have POPs in every city and YouTube-battle-tested video bandwidth congestion detection, but GeForce Now... that doesn't sound like a recipe for happy streaming.
Also echoing everyone else's praise for Parsec; it is a highly underrated (free...) piece of software.
The fine folks at Google know their computer science well, so this isn't news to them, but it kinda lays bare that eventually they will be selling consoles to execute games locally, since it's clear they do not have the appetite to get 10Gbps+ fiber to the country. Streaming-only Stadia made sense paired with a mature Google Fiber deployment, but alas, that is not the universe we reside in.
I live in London, so probably not far from the GCP data centre. I have a 50 Mbit connection (pretty average and affordable for London) and I don't see myself ever investing in a console or gaming PC again.
My gaming experience with Stadia is better than "good enough", and that's all that matters. If they can get the good titles, they have me as a customer.
I would settle for a lower quality service than I'm currently getting before I think about buying my own hardware again.
Shooters are out, skill-based games (e.g. Rocket League) are out, racing games are much harder due to latency as well, and quite a few platformers become an infuriating experience.
Slower paced games work just fine, but until the latency gap is closed, or until games are designed with it in mind, Stadia is dead in the water as far as I'm concerned. 5+ frames of latency at 60fps isn't a good experience.
At the risk of being called a Luddite or a peasant, may I suggest most people don't actually care that much about reaching the height of visual fidelity, at least in this aspect?
The same argument could be made against any kind of video streaming. But the reality is that people are pretty satisfied with the fidelity that video encoders can produce. Yes, it will always be inferior in several respects to the uncompressed ~33Gbps 3840x2160@60hz your local rig can push out, but will it be a deal breaker for everyone, or even most people? I don't think so, personally.
I believe what you’re describing is a feature of Thunderbolt, not DP.
I use MST with a Mini DisplayPort hub to get more outputs on my laptop.
DisplayPort 2.0 isn't even available on displays or GPUs yet.
These services use hardware acceleration for encoding and decoding the video to H.264 (perhaps HEVC as well).
Clearly the bandwidth requirement is lower.
3840 × 2160 × 120 Hz × 10 bits × 3 channels = 29,859,840,000 bits/s ≈ 29.9 Gbps.
That's not even 40 Gbps.
But my point still stands: streaming doesn't need such a high bitrate.
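Rough numbers to back that up, using the figures above: ~30 Gbps raw versus the tens of Mbps these services actually stream at is a compression ratio on the order of 1000:1, which is squarely the regime H.264/HEVC are designed for.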
How many people play Minecraft?
edit: There's a population that cares about graphics, but the population that doesn't is bigger and collectively has more money. Just look at the mobile gaming market.
Also DisplayPort 2.0 bandwidth is mostly for supporting VR where visual artifacts and low fidelity lead to motion sickness for a huge demographic.
I only fear latency, which is what makes games unplayable to me... I live in Argentina and the best ping I can get on Fortnite is 50ms, using servers in Brazil. There are few territories with low ping in the world.
Sure, VR might one day be a big part of the gaming market. It won't be soon, and it will probably never be the entire market.
They notice with their wallets. A battlestation capable of 10-bit color at 240Hz and 4K costs more over a single graphics generation than a monthly subscription service does.
Google generally operates with 5+ years of foresight. If you think at all that cloud gaming will become viable in the next 5-ish years then you should not ignore or rule out any cloud gaming platforms as unreasonable.
Am I missing something?
“However, these companies released so far little information about their cloud gaming operation and how they utilize the network. In this work, we study these new cloud gaming services from the network point of view.”
They’re early too, so this paper will get a lot of citations in the future.
It’s a big bait and switch as far as I’m concerned. Those triple-A titles might be available some day, but who knows when that day will be. I left very unsatisfied and didn’t even try streaming anything.
You pay the subscription to stream in 4k, and to get a small selection of games included (yes, not the AAA games you want).
It's free hardware if you play in HD.
Overall very disappointed in Stadia.
FWIW I consider Stadia and its ilk oversold to the mass market, akin to VR. There may be a market of people who want to play AAA games and will pay several hundred dollars per year, yet have never bothered to pay ~$300 for a game console (tail end of last gen). I'm just skeptical it's actually as huge as it's hyped to be.
Well, I can't even go back and look at the original marketing now because when I do I get a warning that Stadia must be played on Chrome and to download that. Obviously I could go in private viewing or something but why would I even bother with that? It's so user-hostile.
But the marketing that I do remember, the marketing that enticed me to check it out, was that I thought it was for streaming games. When I checked out the site they had this big list of games that it looked like I could stream. If I can only stream, say, 15 games, as appears to be the case, why not just show those? Hmm.
I do think there's a market for those (like myself) who have a few games they'd like to play. Cyberpunk 2077 looks cool - I'll pay $10 or $15 or even $20/month to stream that (among other games) because my alternative is buying it for $50 and not having it run at ultra-max settings on my MacBook Air. Not desirable.
To your point that this was a knock against it at launch, well, yeah. They have all this marketing copy that makes it look so cool, and then I have to pay retail price for a new game. Why wouldn't I just buy a PlayStation, or use Xbox's or Sony's online services? Like, what value is $9.99/month for some lame games that I don't care to play? Pass.
This service doesn't address any market that I can see.
Streaming doesn't mean it's free, or even all included in a subscription. It means it's running on a server and you receive the video.
The market for this is basically the same as consoles, except you don't pay for the hardware initially.
If you want 4k and a bunch of free games every month, you can buy Stadia Pro. But you don't have to.
But the need for lower latency is increasing as well (a la VR)
That’s exactly the point though, tech requirements don’t stand still
Oculus has been working on desktop-to-Quest Wi-Fi streaming and says the experience isn't yet good enough for release - and that's all local.
If you want to play MOBAs or competitive FPSes, the actual amount of grunt required is going to stay flat.
If you want to play the latest Assassin's Creed, then it's harder to do, but still, Assassin's Creed Odyssey on a 1070 is stunningly good and gets decent framerates. The requirements will always increase, but the amount of power required to hit a given fidelity shouldn't be too bad.
On the surface, cloud gaming is absolutely a fantastic idea, i.e., it removes a series of barriers to playing high-end games: the cost of hardware, inconvenience, and (via subscription-based consumption) the upfront cost.
However, on closer examination, it becomes clear that high-end gaming is costly not because of these barriers. On the contrary, these barriers are part of the structure that supports high-end games.
Additionally, the modern high-end gaming experience depends entirely on exclusive ownership of costly computing power. The hidden foundation supporting that computing power is low-cost local communication: wide-area communication is always more expensive than computation. So for any large-scale cloud gaming, the underlying economics say it's always more cost-effective to have local hardware.
However, this price won't last; the web page itself says that one day there will be an increase for everybody.
Can they even get close to DisplayPort 2.0 performance in the next couple years? I don’t see how this is ever going to scale as bit depth, resolution and frame rates continue to increase. This is a losing strategy because a lossy, slow connection between your display and your controller will only get worse (quadratically so) as those three factors (bit depth, resolution and frame rate) increase.
They are going to have to beef up the client hardware and do more of the game engine and rendering compute locally, which will undermine the cost savings they offered being centralized.
And if it works well for games, it'll surely work well enough for regular software.
I think the premise is either a "floating workstation", or that, if you please, you can use a laptop instead of being tied to a workstation.