Google Cloud has regional US DCs in western Iowa and central South Carolina. The midpoint between those two locations lands roughly on Nashville, TN, which is ~600 miles from either. Light could make a round trip over that distance in about 6ms. Of course, the internet doesn't deliver latency at the speed of light, but that's the physical limit, and it leaves plenty of headroom; a typical browser alone has ~10ms of input lag. To hit 60fps, each frame has roughly 16ms to be rendered.
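A quick back-of-the-envelope check of those numbers (assuming straight-line distance; real fiber paths are longer and light in fiber travels at roughly 2/3 c):

```cpp
#include <cstdio>

int main() {
    // Rough latency budget for a user ~600 miles from the nearest DC.
    const double distance_km  = 600 * 1.609;            // ~600 miles, one way
    const double c_vacuum_kms = 299'792.0;              // speed of light in vacuum, km/s
    const double c_fiber_kms  = c_vacuum_kms * 2.0 / 3.0; // ~2/3 c in fiber

    const double rtt_vacuum_ms = 2 * distance_km / c_vacuum_kms * 1000.0;
    const double rtt_fiber_ms  = 2 * distance_km / c_fiber_kms  * 1000.0;

    std::printf("round trip at c:       %.1f ms\n", rtt_vacuum_ms); // ~6.4 ms
    std::printf("round trip in fiber:   %.1f ms\n", rtt_fiber_ms);  // ~9.7 ms
    std::printf("frame budget at 60fps: %.1f ms\n", 1000.0 / 60.0); // ~16.7 ms
    return 0;
}
```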
But the regional DC is only the worst case, because they've said they're deploying these things in 7,500 locations around the world. That's unprecedented edge scale for a tier 1 cloud provider. They know they have to be close to consumer populations.
Also consider this: once cloud streaming takes off, we're going to see deeper integration into the frameworks and game engines themselves. Imagine a game engine built for streaming. It could do input prediction: a "light rendering pass" of frames for each of the N possible inputs the input buffer might receive on the next frame, before they actually arrive (rough sketch below). The custom chips they use have plenty of headroom to do this at 1080p, and most controllers have, what, 12 buttons plus the joystick states? Depending on the game this might be feasible (it would be hard to do in multiplayer, for example). Combine that with the natural advantage a cloud-hosted multiplayer game has in networking with other clients to resolve game state, and you can see that it's not just a strict downgrade; we might see improvements in game performance beyond the typical "new year, better graphics" cycle.
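A rough sketch of what that speculative rendering loop could look like. Everything here is hypothetical for illustration; the engine hooks (simulate, renderLightPass) and the tiny input set are stand-ins, not any real streaming API:

```cpp
#include <array>
#include <cstdio>
#include <string>
#include <utility>
#include <vector>

// Hypothetical engine types, for illustration only.
struct GameState { int playerX = 0; };
struct Frame { std::string pixels; };  // stand-in for a rendered image

enum class Input { None, Left, Right };

// Advance the simulation by one tick under a given input.
GameState simulate(GameState s, Input in) {
    if (in == Input::Left)  s.playerX -= 1;
    if (in == Input::Right) s.playerX += 1;
    return s;
}

// Cheap "light rendering pass": lower quality, just enough to ship early.
Frame renderLightPass(const GameState& s) {
    return Frame{"player at x=" + std::to_string(s.playerX)};
}

int main() {
    GameState state;
    const std::array<Input, 3> candidates{Input::None, Input::Left, Input::Right};

    // Before the client's next input arrives, speculatively simulate and
    // render a frame for every plausible input.
    std::vector<std::pair<Input, Frame>> speculative;
    for (Input in : candidates) {
        speculative.emplace_back(in, renderLightPass(simulate(state, in)));
    }

    // When the real input shows up (hard-coded here), the matching frame can
    // be streamed immediately instead of waiting for a full render pass.
    const Input actual = Input::Right;
    for (const auto& [in, frame] : speculative) {
        if (in == actual) {
            std::printf("streaming speculative frame: %s\n", frame.pixels.c_str());
            state = simulate(state, actual);  // commit the real state
        }
    }
    return 0;
}
```

In practice the candidate set explodes combinatorially with buttons and analog sticks, which is why it only works if the engine can prune to a handful of likely inputs per frame.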
Reminds me a bit of the 30/60 fps fights a few years ago. Sure, 30fps games look more "cinematic" and 60fps movies look uncanny, but 30fps games feel less responsive.