
Honestly, none of this even matters until they can offer a service with input latency comparable to local play, and so far every indication is that they can't. The whole service falls apart if it feels like shit to play.



So you've played the service? It surely sucks when no reviews are even out yet? Look for the people who have dogfooded it and raved about it.


Unless they've solved the speed-of-light issue, we don't need to wait for reviews. Even using a remote desktop on the other side of the same city isn't pleasant to do full time, and that's not remotely as twitch-based as gaming.


I don't think that'll be a huge issue.

Google Cloud has regional US DCs in western Iowa and central South Carolina. A midpoint between those two locations lands roughly on Nashville, TN, which is ~600 miles from either. Light could make a round trip of that distance in about 6ms. Of course, the internet doesn't deliver latency at the speed of light, but that's the physical limit, and it leaves plenty of headroom: a typical browser alone has ~10ms of input lag [1], and at 60fps each frame has ~16ms to be rendered.
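Quick back-of-the-envelope in Python, if you want to check the numbers (the 600-mile figure comes from the midpoint estimate above; the ~1.47 fiber refractive index is my assumption):

    # Round-trip light latency over ~600 miles, in vacuum and in fiber.
    C_MILES_PER_MS = 186.282   # speed of light in vacuum, miles per millisecond
    FIBER_INDEX = 1.47         # typical refractive index of optical fiber (assumed)

    round_trip_miles = 2 * 600
    vacuum_ms = round_trip_miles / C_MILES_PER_MS
    fiber_ms = vacuum_ms * FIBER_INDEX

    print(f"vacuum round trip: {vacuum_ms:.1f} ms")  # ~6.4 ms
    print(f"fiber round trip:  {fiber_ms:.1f} ms")   # ~9.5 ms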

But the regional DC is only the worst case, because they've said they're deploying these things in 7,500 locations around the world. That's unprecedented edge scale for a tier-1 cloud provider. They know they have to be close to consumer populations.

Also consider this: once cloud streaming takes off, we're going to see deeper integration into the frameworks and game engines themselves. Imagine a game engine built for streaming. It could do input prediction: a "light rendering pass" over the N possible inputs the input buffer might receive on the next frame, before it actually receives them (sketched below). These custom chips have plenty of headroom to do that at 1080p, and most controllers have, what, 12 buttons plus the joystick states? Depending on the game this might be feasible (hard in multiplayer, for example). Combine that with the natural networking advantage a cloud-hosted multiplayer game has in resolving game state with other clients, and it's not just a strict downgrade; we might see improvements in game performance beyond the typical "new year, better graphics" cycle.
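A minimal sketch of that speculative-rendering idea (everything here is hypothetical and illustrative, not any real engine's API; a real engine would have to bound N and share rendering work between the branches):

    # Speculatively render one frame per plausible next input, before it arrives.
    # advance() and render_frame() are illustrative stand-ins.

    def advance(state, inp):
        """Pure game-state update for one simulated input."""
        return {**state, "last_input": inp, "tick": state["tick"] + 1}

    def render_frame(state):
        """Stand-in for the expensive rendering pass."""
        return f"frame(tick={state['tick']}, input={state['last_input']})"

    def speculate(state, likely_inputs):
        """Light rendering pass for each input we might receive next frame."""
        return {inp: render_frame(advance(state, inp)) for inp in likely_inputs}

    state = {"tick": 0, "last_input": None}
    cache = speculate(state, ["up", "up+jump", "idle"])  # predicted from input history

    actual = "up+jump"         # the real input finally arrives over the network
    frame = cache.get(actual)  # cache hit: ship the frame with no render delay
    if frame is None:
        frame = render_frame(advance(state, actual))  # miss: render normally
    print(frame)

The win is that on a cache hit, the round trip for the input overlaps with rendering instead of adding to it.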

[1] https://www.vsynctester.com/testing/mouse.html

[2] https://shadow.tech/


I watch WebRTC streams from California while in Germany, with ~100ms network latency. The speed of light has never been the issue. Latency within the same continent, or within the same city, is much lower.
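If you want to measure your own numbers: a TCP connect takes roughly one round trip, so this stdlib-only sketch gives a usable RTT estimate (example.com is a placeholder; substitute the endpoint you actually stream from):

    # Rough RTT estimate via TCP connect time (one handshake ~ one round trip).
    import socket, time

    host, port = "example.com", 443  # placeholder host
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    print(f"TCP connect RTT: {(time.perf_counter() - start) * 1000:.0f} ms")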


People said the same about streaming video services.


No, they didn't; that was simply a bandwidth issue. We've been streaming video since TV was invented, and with the right equipment you could stream video over the internet before the web even existed. No equipment exists that can alter the fundamental speed limit of the universe.


Video isn't interactive.


Yep. This is the kicker. For streaming video, only bandwidth matters, but for video games, both bandwidth and latency matter.

Reminds me a bit of the 30/60 fps fights a few years ago. Sure, 30fps games look more "cinematic" and 60fps movies look uncanny, but 30fps games feel less responsive.
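The arithmetic makes the responsiveness gap concrete: every frame of buffering adds one full frame time to input-to-photon latency.

    # Frame time per refresh rate; each buffered frame adds this much input lag.
    for fps in (30, 60, 120):
        print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
    # 30 fps -> 33.3 ms; 60 fps -> 16.7 ms; 120 fps -> 8.3 ms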



