A network analysis on cloud gaming: Stadia, GeForce Now and PSNow (arxiv.org)
123 points by jsnell 34 days ago | 145 comments

I did a good amount of gaming via Parsec and AWS this summer.

When it works, it's great. I can absolutely see this being the future of gaming as home bandwidth continues to improve, and more datacenters get built near major population centers.

It is _very_ sensitive to the connection, though. This summer, I was about 11 milliseconds away from an AWS datacenter with a 500 megabit connection. Buttery smooth perfection. No video compression artifacts. Now, I am about 60 milliseconds away with a 50 megabit connection and it is unplayable. 50 megabits/60 milliseconds is a good connection by US broadband standards these days!

That's the thing with these services.

It's not a 'normal' scale of quality like having different levels of PC hardware.

It either works perfectly, or is completely unplayable.

There's a low cap on the maximum market size based on people's internet quality.

Pretty much.

Anyone who's written game netcode (either as a hobby or professionally) knows that you build the game design and game engine from the ground up to tolerate latency.

For action games, most of the time it's about building a game design where you're predicting (either physical location or other players' actions), except in the few rare cases that do time-rewinding (most fighting games, and some FPSes that combine both, most notably Counter-Strike). The prediction side is usually handled by dead reckoning [1].
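A minimal sketch of the dead-reckoning idea (simplified, hypothetical 1D code; real engines track full position/velocity vectors and blend corrections smoothly rather than snapping):

```python
# Each client keeps the last authoritative (position, velocity, timestamp)
# it received for a remote entity, and extrapolates between server updates.
from dataclasses import dataclass

@dataclass
class RemoteEntity:
    pos: float          # last known position (1D for simplicity)
    vel: float          # last known velocity, units/second
    last_update: float  # timestamp of the last server update, seconds

    def predict(self, now: float) -> float:
        """Extrapolate position assuming velocity stayed constant."""
        return self.pos + self.vel * (now - self.last_update)

    def on_server_update(self, pos: float, vel: float, now: float) -> None:
        """Adopt the authoritative state (real engines blend toward it)."""
        self.pos, self.vel, self.last_update = pos, vel, now

# Server said the entity was at x=10 moving at 5 units/s at t=0;
# 200 ms later, with no newer packet, we render it at x=11.0.
e = RemoteEntity(pos=10.0, vel=5.0, last_update=0.0)
print(e.predict(0.2))  # 11.0
```

When an authoritative update does arrive, production netcode interpolates toward it over a few frames instead of snapping, which is what hides most of the latency from the player.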

For large-scale, low-bandwidth games (i.e. RTSes and the like, where the game state is deterministic), it's handled via lockstep [2]. The game state is 100% deterministic and all clients advance together with a shared set of inputs, in "lock step".
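A toy illustration of lockstep (hypothetical code, not any real engine): every client runs the same deterministic simulation and only advances tick N once it holds all players' inputs for tick N, so only inputs ever cross the wire.

```python
# Deterministic lockstep loop: given identical inputs, every machine
# computes an identical state with no game state ever sent over the network.
def lockstep_run(inputs_per_tick, num_players, state=0):
    """inputs_per_tick: list of {player_id: input_int} dicts, one per tick."""
    for tick, inputs in enumerate(inputs_per_tick):
        if len(inputs) != num_players:
            # A real client stalls here (the classic lockstep freeze)
            # until the missing input arrives.
            raise RuntimeError(f"tick {tick}: waiting on inputs")
        # Deterministic update, identical on every client.
        state += sum(inputs[p] for p in sorted(inputs))
    return state

# Two players, three ticks of inputs; all clients agree on the result.
final = lockstep_run([{0: 1, 1: 2}, {0: 0, 1: 1}, {0: 3, 1: 0}], num_players=2)
print(final)  # 7
```

The bandwidth win is why thousand-unit RTS battles worked over dial-up: the per-tick payload is a handful of inputs, regardless of how many units exist.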

Both of these approaches can tolerate latencies up to ~600 ms (back in the ole 28.8/56k days of '97, SubSpace [3] was doing ~300 players in one zone with a high skill curve and robust netcode). They usually mask latency with client-side reactions that are then reconciled with the server in a robust way. If you're just pushing video frames over a network stack, none of that is available to you, no matter how fancy your FEC or other tricks are.

Somehow I've now got the urge to go dust off the Continuum client again and boot up SubSpace.

[1] https://www.gamasutra.com/view/feature/131638/dead_reckoning...

[2] https://meseta.medium.com/netcode-concepts-part-3-lockstep-a...

[3] https://store.steampowered.com/app/352700/Subspace_Continuum...

While you are absolutely correct, I believe the target market for Stadia is more people like my parents, who used to play casual games 10 years ago and then got too busy. They cannot justify owning a console and purchasing $60 games. But they'd be easy to sell on a $5 monthly games on demand subscription.

They will be playing with bluetooth gamepads (5ms latency) on their TV (10-20ms latency) using Wifi (5-10ms latency), so the internet streaming delay of <10ms from an edge server will be barely noticeable.
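Summing the numbers quoted above (taking midpoints of the quoted ranges; these are the parent comment's estimates, not measurements):

```python
# Rough end-to-end latency budget for the casual living-room scenario.
budget_ms = {
    "bluetooth_gamepad": 5,    # quoted 5 ms
    "tv_display": 15,          # midpoint of the quoted 10-20 ms
    "wifi": 8,                 # midpoint of the quoted 5-10 ms (rounded)
    "internet_streaming": 10,  # quoted <10 ms from an edge server
}
total = sum(budget_ms.values())
print(f"~{total} ms input-to-photon")  # ~38 ms
```

The point being that the streaming leg is only about a quarter of the total chain, so for this audience it disappears into latency they already live with.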

For example, Stadia is featuring "Lara Croft and the Temple of Osiris" which is a perfect game for high-latency unskilled casual play.

I thought one still has to buy/rent games on top of the $5/month. The $5 is only the fee for renting "cloud hardware". Maybe some games are included, but it's definitely not comparable to a Netflix for games. I guess it's more like a Disney+?

Stadia is free if you're happy with 1080p, and you purchase games on top. The pro subscription is around $10/month, but adds 4k res and a few new games every month.

Oh yeah, I don't doubt there's a market for this, but I don't think you'll see it take over the same way that, say, Netflix did for VoD.

(FWIW I heavily use Steam's streaming client, so I'm pretty familiar with most of the failure modes; it doesn't work great for everything, but it's convenient when the game style and network performance overlap.)

But what benefit is there compared to playing a game on a phone or PC?

Look at how people use JavaScript for client-side coding but Go or Rust for the back end - the service provider pays for back-end resources, and they are always going to be stingy when it comes to CPU, GPU, etc.

OT: thanks for the link to Subspace; played that a bunch in the old times; gonna try it again soon.

Latency is the most important thing. There's a reason high refresh monitors are loved by serious gamers.

I can deal with 1080p when playing old PS2/PS3 games through an online service, or modern games at the same resolution but high settings, as long as the controls are snappy.

I cannot play a game with noticeable input lag whatsoever, even if it was at 4k HDR 144hz.

by serious gamers

You'd think that "serious gamers" would obsess over the merits of games (mechanics, plot, level design, gameplay, etc), not the technical minutiae of framerates and input lag. That term always seemed odd to me.

If you're, say, playing a competitive FPS with a team (e.g. Counter-Strike), you will obsess about game mechanics and latency and input lag, because after 10 hours of play you'll be able to notice if you're getting packet loss, or if your spouse is using all the bandwidth and your crappy router with bufferbloat is adding extra latency.

Hardcore fans of media have always fawned over the technical aspects. We've seen it in photography, film-making, music, and now gaming.

I used that for lack of a better term.

Generally, I agree. But to me, input lag and performance do have an impact.

Like watching a movie cam rip with choppy audio. You can still admire the film, but it's not going to be a pleasant experience compared to watching it in the cinema.

If the game is unpleasant to play because of input lag, and the player cannot react to what is happening on screen because their response arrives half a second or more too late, they'll have trouble appreciating all the other stuff.

We're not talking about input or rendering latency here. Networked games are designed so that even when you have 100-300 ms of round-trip latency, either the objective of the game is set up so that your success is based on "predicting" events, or the server keeps all the disparate time domains in memory and can time-rewind to resolve authoritative game state.

I know that technically it's a different problem.

But the prediction is exactly what makes it feel unresponsive. So the human perception is input lag, regardless of what causes that feeling.

The input lag causes both unresponsiveness and prediction. The former is obvious; a simple example of the latter: in an RTS/MOBA (whether rollback or lockstep netcode), the whole region the opponent could reach by the next tick becomes your target. Its size depends on the latency, and if the RTT is big enough, the opponent's possible reactions become an additional factor.
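The geometry here is simple: if an opponent can move at most v units/s, then after a round trip of RTT seconds they can be anywhere within a disc of radius v x RTT of their last confirmed position, and that whole disc is the effective target. A quick sketch (illustrative numbers only):

```python
# Area of the uncertainty disc around an opponent as a function of RTT.
import math

def target_region_area(max_speed: float, rtt_s: float) -> float:
    radius = max_speed * rtt_s  # farthest the opponent can be from last known pos
    return math.pi * radius ** 2

# A unit moving at 300 units/s, at 50 ms vs 200 ms RTT:
print(target_region_area(300, 0.05))  # small disc
print(target_region_area(300, 0.20))  # 4x the latency -> 16x the area
```

Because area grows with the square of the radius, doubling the latency quadruples the region you have to cover, which is why high ping degrades these games faster than linearly.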

> Latency is the most important thing. There's a reason high refresh monitors are loved by serious gamers.

The biggest improvement with 144 Hz is motion clarity, but you can cheese it with tricks like strobing or Black Frame Insertion (BFI). I keep a CRT around for this reason.

> I cannot play a game with noticeable input lag whatsoever, even if it was at 4k HDR 144hz.

People are surprisingly tolerant of different latencies. Many Vsync implementations end up with 4-5 frames of latency, which at 60 Hz comes out around 75 ms, and consoles that run at 30 FPS (or 20 FPS, like some N64 classics!) can be in excess of 100 ms. If the game is designed around it, people will adapt (just like people adapt to 24 fps movies).
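The frame math above, for reference:

```python
# Convert buffered frames of latency into milliseconds at a given refresh rate.
def frames_to_ms(frames: float, hz: float) -> float:
    return frames * 1000.0 / hz

print(round(frames_to_ms(4.5, 60)))  # 4-5 frames of Vsync at 60 Hz -> ~75 ms
print(round(frames_to_ms(3, 30)))    # 3 frames at 30 FPS -> 100 ms
```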

I agree though: if you want games to feel like your input is physically connected, 10-20 ms is what most folks in the VR industry are targeting.

As a nearly daily user of Google Stadia, I have to disagree. I use Stadia in two different households, one with a 50 Mbps connection (where Stadia works perfectly) and one with a connection capped at 10 Mbps down and 1 Mbps up. The 10 Mbps connection only allows for 720p gameplay, and I get frequent stuttering (every 5 minutes or so), but this doesn't really ruin the experience with most story-driven games like Red Dead Redemption 2 or AC Odyssey.

At first I was surprised by the playability of Stadia on such a slow connection myself, but even though I cannot play shooters, I wouldn't go so far as to call the experience I get on the 10 Mbps connection 'unplayable'.

I honestly want a map that overlays homes and apartments with internet connection speed and the main internet nodes/routers/datacenters. If remote work is a big part of the future, then prioritizing optimal internet connectivity as part of a moving decision seems rational.

I suspect such a map doesn't exist, but if you want optimal Internet connection, what you're looking for is:

a) in a metro area that has a major internet exchange. Hurricane Electric's list of PoPs is a decent start: http://he.net/ip_transit.html Some of those locations are better connected than others, though. In the US, the prime exchanges are really LA, SF/SJ, Seattle, Reston, New York, Miami, Dallas, Chicago. If you settle near a lesser exchange like, say, Denver, you'll probably have good connectivity to many networks, but many services don't have a datacenter in Denver. Also, some companies are trying to put their big datacenters farther north to save on cooling and energy costs, so bias north if possible; however, if you want excellent connectivity to South America, much of that goes through Miami.

If you're in Canada, Vancouver or Toronto are your best bets. In South America, Sao Paulo or Bogota (but note, connectivity is sparse between countries; it historically all went through Miami, though that's changed a bit). In Europe, Amsterdam, London or Frankfurt. In Asia, Singapore or Taipei. In Africa, Johannesburg. In AU/NZ, reportedly it all kind of sucks, but Melbourne/Sydney/Auckland are OK-ish.

b) not on DSL; it's usually run with settings that guarantee you 20 ms of ping just to the first hop. Be careful, because some DSL providers imply fiber to the home but really mean fiber to the DSLAM (or whatever it's called today).

c) fiber is better than cable, but cable is OK as long as the company isn't incompetent. Comcast is much maligned, but their network is actually pretty good; local areas can be mismanaged, though.

d) you need to actually put potential addresses into the providers' sign-up-for-service pages. Put in the one you're looking at, as well as the neighbors'; if you get prompts like 'you've already got service, would you like to manage it', that's a good sign that they probably actually service the address. Speeds listed for addresses that are current customers are more likely to be accurate as well. If there's a public utility fiber network, check its fiber map, but don't expect to hear back with a firm yes or no on availability in time to make a decision on your prospective living arrangement.

If streaming takes a large chunk of the market that has good internet quality, that's a significant revenue gone for consoles and gaming PC hardware (even if it's not 100% of the market).

I used Stadia for a while, but after a couple of months, the compression started to bum me out. It really worked great, and playing the latest Assassin's Creed on a $200 Chromebook is really something else. But when I would switch to my main TV, I started thinking about how I had given up the last uncompressed input to my TV: my game console. Dark-area blocking, color banding - it all started driving me nuts.

I agree. I actually find watching game trailers on YouTube pretty pointless, as everything turns into mush and it totally loses the "precision" of the graphics. I found these cloud streaming services similar, though not quite as extreme.

I think bitrates need to be considerably higher. 4K Blu-ray uses close to 100 Mbit/s, but movies have the advantage of not requiring really low-latency encoding, which makes them far more efficient with bitrate. I think you're probably looking at well over 100 Mbit/s to get good PQ on these services.

Yeah, same here. For me it's a bit more obvious: basically, scenes that have heavy foliage + motion turn into a smeary mess.

50 Mbit should be enough for streaming games like this. However, it's very latency-specific: considering the video encoding and decoding will add a few tens of ms, you are probably closer to 100 ms overall, which definitely feels very laggy. That is considerably more latency than I get on an FTTH connection from London to New York, for example.

That is surprising for me to hear. I did cloud gaming a decade ago with OnLive, and some PlayStation Now a little later, and it was never as bad as the gaming gatekeepers suggested. 60 ms latency on a 50 Mbit connection should be good; sad that it isn't.

With racing games I learned to compensate and anticipate. And there are pleeeeenty of slower games to play.

I used OnLive till the bitter end. It was awesome. A little sad the way it ended, also some tears in the rain for the game library I had purchased and the friends I had made in Space Marine multiplayer.

I was an early adopter of GeForce Now and I really enjoyed it while it was in beta. As soon as it was released to the public, all the publishers pulled their games.

60ms is considered good? Is there some data or graphs on connection speed and latency in the US? Would be interesting to see.

> 50 megabits/60 milliseconds is a good connection by US broadband standards these days!

Average US internet speed (speed paid for, not just speed available) is >120 megabits, so I wouldn't exactly call that good, considering gamers are likely to have faster-than-average connections.

Estimates I could find put it around 30-170 Mbps, but the top end (the speedtest.net average) - Canada's is >150 Mbps - isn't actually common, even living in or near a large city. The real average is likely around 30-60 Mbps, from experience.

Speedtest shows the average at 170 Mbps for US broadband.


The tech and infrastructure aren't there yet, and the Stadia rollout has been a laughingstock, but cloud gaming is still the future of multiplayer video games. And since games-as-a-service is (sadly) the future of AAA business models, and operating a game-as-a-service essentially requires your game to be multiplayer in order to entice people to keep playing (via community-based social reinforcement and the status-seeking that drives the purchase of cosmetics), that's going to be an increasingly large proportion of all AAA games.

In terms of latency, having your game in the cloud is worse. In terms of graphics, it's going to be worse compared to having high-end local hardware.

Despite all that, the killer app that will tip the scale is cheat prevention. Cloud gaming annihilates nearly every category of cheat. Wallhacks? Impossible. Infinite health? Impossible. Flyhacks and speedhacks? Impossible. The only category that isn't completely eviscerated is aimhacking, and even then it will be dramatically harder and less beneficial to aimhack, since you will need local realtime image analysis to do it properly and you won't even be able to aim at targets you can't currently see.

Once multiplayer gamers get a taste of that, there's no going back.

Sadly, over enough time I fear this effect will start destroying the market for high-end PC hardware, which will be both sad and a detriment to general computing.

This is an interesting analysis, I'm wondering where the disagreements are coming from.

I don’t disagree with the points made here, and I really appreciate the perspective. However, I do question the magnitude of importance for cheating. It kinda seems like a solved problem, and by solved I mean “good enough to have fun”. Are we really going to reinvent an entire industry to solve a relatively non-existent problem?

I'm working on building a game streaming cloud service at the moment.

In the long term, I believe running your own servers is going to be the future, because otherwise there'll always be license issues, e.g. when you want to play your gog.com games on GeForce Now. Or indie games or business apps in general.

Also, all those services prevent you from sharing with friends, for their own business reasons - meaning absolutely no co-op or splitscreen.

Anyway, the key to making such services work is custom ultra-low-latency UDP protocols. I'm going with NVENC hardware encoding, CUDA for data wrangling, and a boost::asio-based C++ core for the network layer. That, and controlling packet loss, for example by self-throttling and spacing packets to avoid overflows at intermediate relays.
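For anyone wondering what "spacing" means concretely, here's a minimal sketch of paced UDP sending (illustrative Python, not the commenter's actual C++ implementation; the MTU, rate, and names are made up):

```python
# Pace UDP packets at a target bitrate so a large encoded video frame
# doesn't burst out all at once and overflow a shallow queue at an
# intermediate relay.
import socket
import time

def paced_send(sock, addr, frame: bytes, mtu=1200, rate_bps=20_000_000):
    """Split `frame` into MTU-sized chunks and space the sends out."""
    gap = mtu * 8 / rate_bps  # seconds between packets at the target rate
    for off in range(0, len(frame), mtu):
        sock.sendto(frame[off:off + mtu], addr)
        time.sleep(gap)  # real code would use a token bucket, not sleep()

# Demo against a local socket: pace a 6 KB "frame" to ourselves.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
rx.settimeout(2.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
paced_send(tx, rx.getsockname(), b"x" * 6000)
chunks = [rx.recv(2000) for _ in range(5)]  # 6000 / 1200 = 5 packets
print(sum(len(c) for c in chunks))
```

The pacing trades a tiny bit of added latency per frame for far fewer tail-drop losses in middleboxes, which is usually a good deal for video.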

BTW, I'm surprised by the bandwidth numbers in the article. Even fast, explosion-heavy games like BroForce work reasonably well at 5 Mbit/s if you use H.265.

I'm curious how you plan to compete.

Microsoft and Sony have a large catalog to offer; Amazon and Google have large data centers, a lot of money to throw at publishers, and a good path to integration with streaming (YouTube and Twitch).

GeForce Now has shown the hostility independent efforts will have to face.

I also believe the target audience is mostly users who can't or don't want to afford dedicated hardware and will want a good collection of games with a subscription.

I don't plan to compete with those huge services and their catalogues.

Instead, I'm building a platform as a service. My ideal customers are software publishers that want to run their own streaming cloud. There's a lot of demand for enterprise solutions that let you demo things online. With a regular unlimited demo version, you run the risk of people trying to crack it. If the demo is a streaming service, you can use everything but copy nothing.

In essence, a high performance streaming solution is the perfect copy protection for pricey enterprise apps.

Sure, for gaming - but why would I run my enterprise software on your PC, where you can see my proprietary data? It doesn't make sense except for gaming and a few other smaller things.

I think it makes a ton of sense for things like EDA software, they have to run on beefy machines that are shared between users and the user experience over thinlinc / X11 forwarding is typically pretty atrocious. If you sell that solution to Cadence they would be super happy probably.

It makes sense if I sell you the software and the rack space but you can use it on your own hardware which you fully control.

There is a lot of appeal in "play anywhere" alone. Such software gives you the ability to better utilize your hardware by playing from your laptop, friend's house or phone. Also, imagine only needing a weak device as thin client and doing 3D or photo editing through your main computer.

This sounds great; I would love to talk with you more about this if possible. Stadia streams games rendered with the open-source Vulkan API on a Linux base, which gives you a lot of leverage since it's open source.

Just wanted to wish you luck here. There is a huge need for competition in this space. Moonlight and Parsec are both good, but seem to be stagnant performance-wise.

I find Moonlight unbearable to set up, and Parsec tends to be limited by my low upload speed, so it looks like a blurry mess for my friends. In one top-down racing game, it got so bad that my brother literally couldn't find his car onscreen.

After trying out Stadia and GeForce Now, what I personally want is a high performance data center like what they have, but with much more control for the user. There's no reason why I should be limited to only play games in their catalogue. Because that denies me pretty much all indie games.

I seriously hope you consult an IP lawyer on this. There was a lawyer on YouTube a few months ago who pointed out that that excuse doesn't necessarily work as a defense when it comes to copyright infringement. https://youtu.be/JKjxfDJXV1E?t=1084

Thanks for the link :) And yes, I'll need a watertight ToS at least.

Except that H.265 is terrible to encode: you'll burn a lot of CPU just to encode the video. ffmpeg on a Zen 3 5900X encodes a couple of frames per second (and that CPU doesn't have any hardware encoding).

NVIDIA's cards ship with dedicated hardware encoders; anything from the last half-decade can encode H.264, and the 2000- and 3000-series cards can do H.265 fast enough that a 4K stream renders, encodes and transmits across my local network (over Ethernet) in less than 16 ms. I'm not sure if AMD is shipping similar hardware.

But how many streams per card? It must be pricey to keep a high-end GPU tied up for one user.

One stream per card, as far as I'm aware; however, given that we're talking about streaming more demanding games, you'd need dedicated hardware for running the game as well.

I think nVidia built NVENC with an eye towards their own streaming service (GeForceNow), with an earlier version (GameStream, which lets you stream locally to a Shield device) as a test-bed.

You’re going to need that anyway to render the game in the first place.

Roughly $1.5 per GPU per hour.

For enterprise presentations, that's negligible compared to salaries.

> We find that GeForce Now and Stadia use the RTP protocol to stream the multimedia content, with the latter relying on the standard WebRTC APIs. They result in bandwidth-hungry and consume up to 45 Mbit/s, depending on the network and video quality. PS Now instead uses only undocumented protocols and never exceeds 13 Mbit/s.

Is this relevant without comparing the picture quality?

I'm surprised Stadia uses standard WebRTC. It doesn't support Firefox, so I assumed maybe it was some specialized version of QUIC that was tightly integrated into Chrome.

After all, you don't need a DRM module for this - It _is_ DRM.

The Chrome stack has been highly tuned to make a Stadia use case work, but Firefox hasn't, making highly latency sensitive usage difficult for now.

No - you can use RTP on, like, a 50 Kbps link (audio only). That statement makes no sense.

There is a big difference between running a PC in the cloud that runs your game and what Stadia does; that's why Stadia is so much better than the competition - the game is compiled for that purpose.

The other thing that no one is talking about is that it removes 99% of cheats, because the game is not running locally and you just use a "dumb" terminal.

It would also make mods impossible.

I have a feeling these streaming services will capture a huge portion of the 'casual' market, and there will be some hardcore people who refuse it or consider it a toy.

Kinda like cell phones and mobile gaming. Very popular, not going away, but didn't kill off PCs and probably can't.

I doubt it. Publishers will be pushing for this heavily since it makes piracy absolutely impossible as well.

"Gamers" won't like it, but AAA publishers always get what they want if it impacts profits - this is the future and there's no way around it. I wouldn't be surprised to see the first "cloud-exclusive" release popping up soon, playable only on Stadia or an equivalent for half a year or so. Valve is working on something like this as well, AFAIK.

I purchased an Nvidia Shield Pro before Christmas (an unbelievably slick piece of tech, btw) and signed up for GeForce Now. I'm based in Ireland, as are the local GeForce Now servers (I presume it's running on Azure). A surprisingly decent gaming experience - The Witcher 3 looks amazing, and performance is extremely decent with only slight delays.

Yeah, the Shield can also receive from your gaming PC, provided it has an Nvidia card. It's a really nice device in general.

Steam Link can also be installed on the Shield and as long as your PC has steam running, you can use its Big Picture mode to stream games.

So can any device with Moonlight, I haven’t seen performance differences as far as local streaming goes between my Shield (2018 model) and my A1E (which has a fairly shitty SoC).

I've been playing Stadia since it launched. Now, I'm a casual gamer - I haven't bought hardware since the PS2 and only play games on my mobile occasionally - but I have been blown away by Stadia. I simply cannot get my head around how it can stream a game, in 4K, to my TV.

I recently purchased Cyberpunk 2077 and was stunned at the amount of complaining about poor graphics, glitches, gameplay etc. as I have experienced none of those.

As for latency and lag, my most-played game is PUBG - and if any game should suffer from those issues, you'd think it would be a multiplayer battle royale like PUBG. But no: other than being very bad at it, I've never experienced what feels like input lag or latency issues.

Sales pitch over!

I've been using GeForce Now for a few years now, originally because I wanted a nice way to use Steam on an older Mac, then as a holdover until I could pick up a latest-gen GPU - which has proved utterly impossible.

I’m curious how much momentum these services will pick up by sheer virtue of it being so challenging to find competitive hardware for sale.

I don't know why these cloud gaming services didn't create at least one game tailored to the platform. Single-player games are good and all, but they're arguably the worst fit for cloud gaming, because they're created with local hardware in mind. Why don't online multiplayer games get more focus on these platforms - like the battle royale games? If built with cloud gaming in mind, a match could be just like a local co-op game, except you have a very long controller and HDMI cable (a.k.a. your internet).

Online multiplayer is the only category where cloud gaming has the potential to be better than traditional hardware, but you have to build the game with cloud gaming in mind, not just port it.

Because single-player games are more accommodating of network issues, skipped frames, adjustments in resolution, etc. You will notice an input-latency spike if you're playing a multiplayer game that requires precise and quick input.

I've played around with Stadia a bit; even on "good" internet in the US (20 ms ping, 250 Mbps), Stadia was only really enjoyable for single-player games. Even a relatively slow multiplayer game like Dota 2 would have been frustrating fairly often.

Maybe you're correct; I don't know much about how multiplayer games work.

However, having just one server render and basically ten viewports really is the future of cloud gaming for me, once the infrastructure (stable, fast internet) is there.

20 ms ping to the Stadia servers? That's actually pretty high. I usually get 5-8 ms

It's kind of silly to assume everyone lives next to the data centre.

I think it's silly for you to put those words into my mouth.

First of all, Stadia streams from edge nodes - it's not about living next to data centers; I'm just giving another data point. There are numerous places for latency to be introduced in game streaming: encoding, decoding, local network, WAN, etc. An extra 15 ms on your ping is a HUGE deal for playability in game streaming. 20 ms might be an awesome ping for CS:GO, but for something like Stadia you're going to feel it negatively.

Stadia did this with Outcasters. It's a small game but shows the potential.

The only one of these services I like is the Xbox Game Pass hybrid approach. While you can stream games to your phone, you can also just install them on your PC or your Xbox. I find phone streaming to be a novelty; it breaks so often that I'll find myself running to my computer to finish up a gameplay session.

PS Now is the continuation of Gaikai (purchased by Sony in 2012) - at the time, the lead developer of x264 was working there.

They (Gaikai) had - even earlier than 2012 - a very smooth gaming experience, considering the general state of connectivity and GPUs/CPUs then compared to today.

I'm not surprised to see Google pushing out fat streams like that, since they have PoPs in every city and YouTube-battle-tested video bandwidth congestion detection, but GeForce Now... that doesn't sound like a recipe for happy streaming.

Also echoing everyone else's praise for parsec, it is a highly underrated (free...) software.


I've been playing Watch Dogs: Legion on GeForce Now for over a month and I'm extremely happy with it. I play from my iPad with a wireless Xbox One controller. Quality is really good and I only see stutters very rarely. I've played other games with the same experience. I would love to see SFV, but I imagine the lag might make it a bad experience. All the games I tried were single-player, first/third-person games, and lag was not noticeable - I'm assuming there's a lot of input prediction happening, but even in menus things were instantaneous.

You don't need a long document for me to tell you Stadia sucks. I tried it out over the holiday season; games play fine, but I could never get a clear 1080p picture out of it. I'm on symmetrical gigabit fiber in Colorado, which I guess could be my downfall.

All of you people that are bullish on Stadia: explain to me why DisplayPort 2.0 provides up to 77 Gbps of bandwidth. Google cannot possibly encode/decode a stream of that fidelity over an internet connection available from any residential ISP in the next couple of years. Maybe within this decade you could accomplish that, but by that point local compute will have already exceeded this standard.

The fine folks at Google know their computer science well, so this isn't news to them, but it kind of lays bare that eventually they will be selling consoles to execute games locally, since it's clear they do not have the appetite to get 10 Gbps+ fiber to the country. Streaming-only Stadia made sense paired with a mature Google Fiber deployment, but alas, that is not the universe we reside in.

I honestly just don't care, and I suspect most don't either.

I live in London, so probably not far from the GCP data centre, I have a 50mb connection (pretty average and affordable for London) and I don't see myself ever investing in a console or gaming PC again.

My gaming experience with Stadia is better than "good enough", and that's all that matters. If they can get the good titles, they have me as a customer.

I would settle for a lower quality service than I'm currently getting before I think about buying my own hardware again.

I live in London with 1gig connection (G.Network) and I find Stadia unplayable for so many titles.

Shooters are out, skill based games (e.g. rocket league) are out, racing games are much harder due to latency as well and quite a few platformers become an infuriating experience.

Slower-paced games work just fine, but until the latency gap is closed, or until games are designed with it in mind, Stadia is dead in the water as far as I'm concerned; 5+ frames of latency at 60 fps isn't a good experience.

> explain to me why DisplayPort 2.0 provides up to 77Gbps of bandwidth?

At the risk of being called a Luddite or a peasant, may I suggest most people don't actually care that much about reaching the heights of visual fidelity, at least in this respect?

The same argument could be made against any kind of video streaming, and yet people are pretty satisfied with the fidelity that video encoders can produce. Yes, it will always be inferior in several respects to the uncompressed ~33 Gbps of 3840x2160@60hz your local rig can push out, but will it be a deal breaker for everyone, or even most people? I don't think so, personally.

Part of the DisplayPort spec (that I love) is Multi-Stream Transport, which allows you to daisy chain monitors off a single connection. I need that much bandwidth if I'm going to be running, say 3x 4K monitors with a single cable. HDMI doesn't support this and has less bandwidth.

What devices have you seen that actually support MST? I haven't seen a single monitor, even among those that come with USB-C, that supports it...

I use MST to run my dual HP Z27Q monitors (5K, 27-inch). Their only connections are dual DP 1.2 inputs - so I have four hefty cables coming off my GPU.

I believe what you’re describing is a feature of Thunderbolt, not DP.

I'm not sure about monitor support. I've seen some higher end Dell monitors have it.

I use MST with a mini display port hub for more outputs on my laptop.

Just bought one that will: the Samsung F32TU87, albeit via the Thunderbolt 3/USB-C interface.

I don't care much in either direction about Stadia, but I think there is certainly a big market of people who just want to play some games now and then without having to own a gaming rig or even a console. High fidelity will not matter so much. Simply being able to play some AAA games that you otherwise might not have been able to play will matter.

40Gbps is the bitrate for 4K at 120Hz at 10-bit colour. This is uncompressed video.

DisplayPort 2.0 isn't even available on displays or GPUs yet.

These services use hardware acceleration to encode and decode the video as H.264 (perhaps HEVC as well).

Clearly the bandwidth requirement is lower.

It is..?

3840 x 2160 x 120 x 10 x 3 = 29,859,840,000

That's not even 40Gbps
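The arithmetic above checks out. A quick sketch (10 bits per channel, 3 channels, ignoring blanking intervals and any other link overhead):

```python
def uncompressed_bps(width: int, height: int, fps: int,
                     bits_per_channel: int, channels: int = 3) -> int:
    """Raw video bandwidth in bits per second, ignoring blanking/overhead."""
    return width * height * fps * bits_per_channel * channels

bps = uncompressed_bps(3840, 2160, 120, 10)
print(bps)        # 29859840000
print(bps / 1e9)  # ~29.86 Gbps, well under 40Gbps
```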



But my point still stands: streaming doesn't need such a high bitrate.

They don't need to compete with a state-of-the-art local setup, they just need to sell something better than what the user happens to have. There's a large delta between those two things that Stadia can live in.

I still have fun playing N64.

How many people play Minecraft?

edit: There's a population that cares about graphics, but the population that doesn't is bigger and collectively has more money. Just look at the mobile gaming market.

I agree that people will settle for less, but I am arguing Google has always been aiming for AAA titles and markets it that way.

Also DisplayPort 2.0 bandwidth is mostly for supporting VR where visual artifacts and low fidelity lead to motion sickness for a huge demographic.

I have Stadia, love it, and don't care that much about quality. I care that it's console-free and on the two TVs I bounce between, or wherever else.

Marketing AAA titles does not mean catering to people who are into high end graphics... e.g. I have a MBP, a PS4 and a Switch. None of these can run Cyberpunk 2077 (properly) or MS Flight Sim (at all). I have no plans or wishes to buy a high end Windows rig... BUT I would play via the cloud, just to experience these games, and would not care if there are some compression artifacts (currently unnoticeable when I stream 4k movies) or if they are capped at 30fps.

I only fear latency, which is what makes games unplayable to me... I live in Argentina and the best ping I can get on Fortnite is 50ms, using servers in Brazil. There are few territories with low ping in the world.

What's VR got to do with it? Sure, Stadia might never work for VR.

Sure, VR might one day be a big part of the gaming market. It won't be soon, and it will probably never be the entire market.

I dare you to see the difference between a properly encoded movie in H.264/H.265 and the raw version of a Blu-ray in 1080p. The same could be said for audio and MP3.

The simple answer is that you don't need the full bitrate video to have a decent experience. I use Nvidia gamestream locally at 30 to 50 Mbps and it's basically indistinguishable from the real thing. For most people, it doesn't matter if they can't tell the difference.
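To put the gap in perspective: a rough sketch of the compression ratio between raw video and a 50 Mbps game stream (assuming an 8-bit 4K60 signal; the 50 Mbps figure is the one quoted above):

```python
# Uncompressed 3840x2160 @ 60fps, 8 bits per channel, 3 channels.
raw = 3840 * 2160 * 60 * 8 * 3   # ~11.94 Gbps
stream = 50_000_000              # 50 Mbps game-stream bitrate

print(round(raw / stream))       # ~239x compression
```

A modern hardware encoder throwing away ~99.6% of the bits and still looking "basically indistinguishable" is exactly why streaming doesn't need DisplayPort-class bandwidth.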

I'm sure you or I can tell a huge difference between 240Hz and 60Hz. How can other people not notice? Or 10-bit or higher colour? Maybe people don't yet know what they are missing, but when they do, Stadia won't be competitive.

> How can other people not notice?

They notice with their wallets. A battlestation capable of 10-bit color, 240Hz, at 4K, for a single graphics generation, costs more than a monthly subscription service.

Do you need 240hz for a story-driven game? Will Final Fantasy 15 be better if it's that smooth?

Stadia sucks, don't put me on the pro-stadia side of the debate, but:

Google generally operates with 5+ years of foresight. If you think at all that cloud gaming will become viable in the next 5-ish years then you should not ignore or rule out any cloud gaming platforms as unreasonable.

Why all the graphs and details about bandwidth when latency from the controller to the screen is really the thing that matters. They mention it several times and never show any measurements.

What is the complexity of building something like Stadia? What technology is required to assemble a competing product? So far they are using open-source tools like Linux and Vulkan.

The best of these is shadow.tech, they literally give you a Windows Desktop to remote into and play any game you want. The rest of these services all have limited game availability.

They wrote a whole paper to say that PSNow doesn’t use WebRTC or RDP and is (currently) capped at 13Mbit/s while the others use up to 45Mbit/s with WebRTC/RDP?

Am I missing something?

RTP, and yea.

“However, these companies released so far little information about their cloud gaming operation and how they utilize the network. In this work, we study these new cloud gaming services from the network point of view.”

They’re early too, so this paper will get a lot of citations in the future.

Tried Stadia today and closed my account in less than 15 minutes. Those triple-A games they advertise on the front? Well, they aren't available unless you want to purchase them. You get access to, I shit you not, like 15 games that are not titles I care to play.

It’s a big bait and switch as far as I’m concerned. Those triple-A titles might be available some day, but who knows when that day will be. I left very unsatisfied and didn’t even try streaming anything.

I think you're expecting too much. Is Google supposed to give away triple-A titles along with free rackspace just so they can let you play games?

Well what else am I paying $9.99/month for? The opportunity to buy games from them?

For hardware, basically.

You pay the subscription to stream in 4k, and to get a small selection of games included (yes, not the AAA games you want).

It's free hardware if you play in HD.

Yea... but then I could just use Nvidia's service which is much cheaper and has a much larger library.

Overall very disappointed in Stadia.

You can buy all the games you want without the Pro subscription.

Yea maybe. I don’t see a reason to switch from Steam or not use Nvidia’s streaming service in that case.

I don't know why you'd call it bait and switch. Stadia has always been "you have to buy the games separately" for pretty much all titles. That was the big knock against it at launch, and has been an impediment to its growth.

FWIW I consider Stadia and the ilk oversold to the mass market akin to VR. There may be a market for people who want to play AAA games, will pay several hundreds per year and yet have never bothered to pay ~$300 for a game console (tail end of last gen). I'm just skeptical it's actually as huge as it's hyped to be.

> I don't know why you'd call it bait and switch.

Well, I can't even go back and look at the original marketing now, because when I do I get a warning that Stadia must be played on Chrome, with a prompt to download it. Obviously I could open it in a private window or something, but why would I even bother with that? It's so user-hostile.

But the marketing that I do remember that enticed me to check it out was that I thought that it was for streaming games. When I checked out the site they had this big list of games that it looked like I could stream. If I could only stream, say, 15 games as it appears to be why not just show those? Hmm.

I do think there's a market for those (like myself) who have a few games they'd like to play. Cyberpunk 2077 looks cool - I'll pay $10 or $15 or even $20/month to stream that (among other games) because my alternative is buying it for $50 and not having it run at ultra-max settings on my MacBook Air. Not desirable.

To your point that this was a knock against it at launch, well, yea. They have all this marketing copy that makes it look so cool and then I have to pay retail price for a new game. Why wouldn't I just buy a Playstation or use Xbox or Sony's online services? Like what value is $9.99/month for some lame games that I don't care to play? Pass.

This service doesn't address any market that I can see.

You can stream any games in their library. You just have to buy them. They never said it was free.

Streaming doesn't mean it's free, or even all included in a subscription. It means it's running on a server and you receive the video.

The market for this is basically the same as consoles, except you don't pay for the hardware initially.

I find the marketing for the platform to be inadequate at best. It’s very much “stream all these games” - not as much on the side of “buy this game to stream it now”.

Just to clarify, are you aware that you don't have to pay the pro subscription to buy games right?

If you want 4k and a bunch of free games every month, you can buy Stadia Pro. But you don't have to.

Yes. But having a streaming subscription was what was interesting for me. If I have to buy the game I think it’s better to use a different service.

My bet is that the improvement in integrated graphics (think Apple M1 or Vega 11) will solve the "gaming on a low end laptop" problem far before game streaming.

Games will simply use more power as the average user's rig becomes more powerful. Do you really think we'll solve low power ray tracing before we solve streaming? Streaming also scales better. Rendering double the frames requires double the power but streaming double the frames does not because frames are mostly similar and compression helps a great deal.

> Games will simply use more power as the average user's rig becomes more powerful.

But the need for lower latency is increasing as well (a la VR)

This is certainly moving the goal posts... is everything in the future going to be VR? I doubt it. There will be a market for streaming 4K@60 for a long time.

> This is certainly moving the goal posts...

That’s exactly the point though, tech requirements don’t stand still

Over on /r/oculusquest, Shadow's streaming PC is somewhat popular for running full PC VR games on the portable Quest (as not all the owners have full gaming desktops). So it's not like it's impossible.

I've actually experimented with this a bit, and while it can be done, it isn't at all playable for 90% of games (anything requiring even moderate response times).

Oculus has been working on desktop-to-Quest WiFi streaming and says the experience isn't optimal yet for release, and that is all local.

There are diminishing returns here.

If you want to play mobas or competitive FPS the actual amount of grunt required is going to be flat.

If you want to play the latest Assassin's Creed then it's harder to do, but still, Assassin's Creed Odyssey on a 1070 is stunningly good and gets decent frame rates. The requirements will always increase, but the amount of power required to reach a given fidelity shouldn't be too bad.

Streaming also does away with the hassle of downloading updates and managing storage space. I would happily avoid loading up my macbook with games even if it could run them.

I wouldn't bet on Apple to become big in gaming (apart from mobile games, which is a very different market), regardless of their hardware.

You might find https://doi.org/10.1145/3401335.3401366 (https://eprints.lancs.ac.uk/id/eprint/144135/1/FromOneEdge_M...) interesting, it looks into the increase in carbon emissions from streaming games instead of downloading the game.

Cloud gaming is actually far worse than the idea sounds.

On the surface, cloud gaming is a fantastic idea: it removes a series of barriers to playing high-end games, namely the cost of hardware, the inconvenience, and the upfront cost (replaced by subscription-based consumption).

On closer examination, however, it becomes clear that high-end gaming is costly not because of these barriers. On the contrary, these barriers are part of the structure that supports high-end games.

Additionally, the modern high-end gaming experience depends entirely on exclusive ownership of costly computing power. The hidden foundation supporting that computing power is the low cost of local communication, and wide-area communication is always more expensive than computation. So for any large-scale cloud gaming, the underlying economics mean it's always more cost-effective to have local hardware.

I don't see it. I got GeForce Now, and that's because it's cheap, much cheaper than buying a new gaming PC or console. If you just want to play occasionally, it's a great option.

For 5.50€/month, it is an extremely cheap way to get to play games with all graphical settings at max with 1080p resolution at 60 fps.

However, this price won't last; the web page says that one day there will be an increase for everybody.

I'll lead with my peak cynical take, apologies in advance: I think Google just wanted to entice people with cheap centralized game streaming, while it was viable, to get them married to a platform for which they will ultimately have to buy a local console to render games anyway.

Can they even get close to DisplayPort 2.0 performance in the next couple years? I don’t see how this is ever going to scale as bit depth, resolution and frame rates continue to increase. This is a losing strategy because a lossy, slow connection between your display and your controller will only get worse (quadratically so) as those three factors (bit depth, resolution and frame rate) increase.

They are going to have to beef up the client hardware and do more of the game engine and rendering compute locally, which will undermine the cost savings they offered being centralized.

My guess is that they're testing the waters for enterprise software. Think Google Docs but for CAD. There's a lot of money in that market and people are used to paying $1000+ monthly subscriptions anyway. So if you can move that onto a remote desktop server, you'll have amazing performance and healthy profit margins.

And if it works well for games, it'll surely work well enough for regular software.

Why would you get better performance from CAD in the cloud? Doesn't a cloud server have essentially the opposite performance profile from a CAD workstation, with many more cores but much slower clock rates? CAD vendors want your workstation to have a few fast cores and several GPUs.

I imagine in the enterprise space there could be a desire for cheap, simple-to-manage machines to give to employees (or BYOD), with a subscription service managing cloud compute resources for things like CAD. Lots of enterprises don't want to be in the internal IT management and compute-hardware-hosting business; they're in the business of making whatever they're designing in the CAD software. We've already seen this kind of migration to cloud services for other kinds of business processes.

> CAD vendors want your workstation to have a few fast cores and several GPUs.


I think the premise is either a "floating workstation", or that, if you please, you can use a laptop instead of being tied to a workstation.

I wouldn't expect better performance, but you usually license software by node. Having it run on a shared server is an easy way for employees to share and work together.

Clouds are starting to focus on HPC machine types which can crank out CAD renders.

I see. I guess it's technically possible for a cloud vendor to temporarily attach several GPUs to your CAD render, and afterward reassign them to someone else, therefore bringing the utilization up and the price down, it just doesn't seem like such a big market opportunity when a $20k workstation isn't that important next to the cost of the software and its operator.

Don't forget that you're not tied to one workstation to do the job. It makes WFH (or working from anywhere, really) much easier, with much higher potential for easy collaboration.

I agree that is a good fit for a use case, but I’m speaking about Stadia in particular. They are going to disappoint a lot of consumers.
