Google unveils Stadia cloud gaming service (theverge.com)
538 points by valgaze 66 days ago | 558 comments

I participated in the beta and was pleasantly surprised by how the entire experience came together. I only ran into one small technical issue. The game runs in real time on Google's servers, so if your network lags, the client drops all the frames during that window. Instead, I had expected YouTube-like behavior, where you see a small loader and playback resumes from where the lag began. This was specifically a problem when playing Assassin's Creed: Odyssey, because in a cutscene the characters would say something like "Go to the * lag * and meet * lag *" and then I had no idea what I had to do next. Lag was still rare during my experience; it's just that when it occurred, the timing was unfortunate.

How do you picture your hypothetical vision of things working on the server side? Naively using buffering would make the input stream desync from the output stream. "Input prediction" works well enough when a game supports it, but lag would reappear for titles where the game doesn't. A general solution (for games with no input prediction) would require you to run the game in an x86 emulator with save-state + rewind support, such that network stutter could be translated to micro-rewinds of the game's VM; and even that wouldn't work if the game was multiplayer.
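To make the micro-rewind idea concrete, here is a rough sketch of the save-state ring buffer such a scheme would need. All names are invented for illustration; a real implementation would snapshot an emulator or VM image, not a Python dict:

```python
import collections

SNAPSHOT_CAPACITY = 120  # e.g. two seconds of state at 60 ticks/sec


class RewindBuffer:
    def __init__(self, capacity=SNAPSHOT_CAPACITY):
        # Old snapshots fall off the left end automatically.
        self.snapshots = collections.deque(maxlen=capacity)

    def record(self, tick, vm_state):
        # Snapshot the VM every simulation tick.
        self.snapshots.append((tick, vm_state))

    def rewind(self, gap_ticks):
        # Roll back `gap_ticks` ticks of network stutter, if we
        # still hold a snapshot that old; otherwise give up.
        if gap_ticks == 0 or gap_ticks > len(self.snapshots):
            return None
        tick, state = self.snapshots[-gap_ticks]
        return tick, state


buf = RewindBuffer()
for t in range(10):
    buf.record(t, {"tick": t})  # stand-in for a real save-state blob

# A 3-tick stutter: resume the simulation from 3 ticks back and
# replay the inputs the server buffered in the meantime.
restored = buf.rewind(3)
```

As the parent notes, even this sketch breaks down for multiplayer, where other players' state cannot be rewound.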

I could see games being able to signal "there is no input during this cutscene, so it can be buffered at the client." But for everything besides cutscenes, game streaming essentially has to work like a VoIP call (hard-realtime), doesn't it?

For a tangential example, Zoom conferencing increases the replay rate after a cutout. I can tell when someone's audio goes to, say, 1.25x, and they still sound OK; you usually don't lose much this way.

It happens occasionally when I drive (bad LTE zones). On bad days (i.e., if I'm tethering at a hotel to join a conference, or using airport wifi), I sometimes hear "robot voice" as the tool attempts to deal with signal attenuation.

This may not work as well in video games, but I would like the option.
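The catch-up trade-off is simple arithmetic. A hypothetical sketch (the 1.25x figure comes from the comment above; the backlog length is an invented example):

```python
# After a cutout, the client has a backlog of buffered audio/video.
# Playing at 1.25x drains 0.25s of backlog per second of playback.
backlog_seconds = 2.0   # assumed length of the cutout
speedup = 1.25          # replay rate during catch-up

# Seconds of sped-up playback needed to get back to real time.
catch_up_time = backlog_seconds / (speedup - 1.0)
```

So a 2-second cutout costs 8 seconds of slightly fast playback, which is why this works well for voice but would feel strange for interactive gameplay.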

But for everything besides cutscenes, game streaming essentially has to work like a VoIP call (hard-realtime), doesn't it?

I imagine most music, and some during-gameplay dialogue, could be buffered at the client. I'd expect sound effects to be the only audio that needs to be super low latency. Maybe certain dialogue too, if the speaker is visible on screen.

This alone might alleviate a lot of the GP's problems, since most audio wouldn't cut out. That can make a huge perceptual difference.

Ugh, I'm past the edit window and forgot to put a '>' before that first sentence. I was quoting, not plagiarizing, promise!

I think you pretty much explained the solution: you need buffering plus real-time VM suspend/resume. I was thinking the client could keep reporting the last frame it displayed back to the server, and if a hiccup is detected, the server stops the clock cycles on the VM and resumes the clock as soon as the buffer is empty. This may lead the user to see a loader for a short time.

When you're talking to someone on Skype and you notice the other person has stopped moving, you stop talking.

They may already be doing something similar (halt rendering on congestion) but when this kind of lag happened to me it felt like the rendering had continued without pause on their end.

I would imagine it being implemented with SIGSTOP/SIGCONT
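As a rough illustration of that idea (using an ordinary `sleep` process as a stand-in for the game VM; this is a POSIX-only sketch, not how Stadia is known to work):

```python
import os
import signal
import subprocess
import time

# Stand-in for the game server process.
proc = subprocess.Popen(["sleep", "30"])

# Client reports a hiccup: freeze the process so the simulation
# clock stops advancing while the client's buffer drains.
os.kill(proc.pid, signal.SIGSTOP)
time.sleep(0.1)  # the network hiccup

# Buffer is empty again: resume exactly where it left off.
os.kill(proc.pid, signal.SIGCONT)

proc.terminate()
proc.wait()
```

SIGSTOP/SIGCONT freezes every thread of the process, though a real game VM would also need its wall-clock sources virtualized so the game doesn't observe the missing time.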

It’s the same experience with all the other forerunners in the field.

In essence, you need a fat cable to a datacenter very nearby for shooters/fast action games.

Regular wi-fi is more or less sufficient for everything less demanding (you might get video degradation from time to time).

OnLive was the first, and it was just insanely awesome. Too bad it was shut down.

I also tried LiquidSky, and it was insanely awesome. At one point you ended up having a virtual PC where you could install and play any game from your Steam library. Too bad they’ve changed the service and monetisation a few times, and it’s unclear if they are still alive.

Haven’t tried others.

Thanks for explaining your experience!

There are a lot of us out here who are curious how this thing works and what the design tradeoffs are.

I was also a participant. It was pretty cool. I ran a Wireshark capture, and they are using Google's QUIC protocol.

In my experience the game would drop graphics before I would experience input lag, though there were a handful of times that I did. This was on a wired connection, 100 Mbps down / 20 Mbps up, through Xfinity.

I did notice in the Google demo, that when the person was using the game pad, he experienced input lag when he was trying to jump up on top of that steeple on top of the building.

Hmm, I think I like those design choices. A dropped frame or two isn't a big deal in my experience, while a delayed input is a far bigger detriment to the gaming experience. QUIC (and UDP in general) seems about right for the technology backend.

I'm still of the opinion that Google shot themselves in the foot here by having a bunch of wireless controllers in one room. It's like they've never talked to a Super Smash Bros. Brawl tournament organizer before: Wii Nunchuks over 2.4 GHz Bluetooth have dropped-packet / dropped-input issues once you get to ~20+ participants.

Don't do mass wireless in one room. It always ends poorly. I'd expect that local wireless in a typical living-room setting would be a better experience actually. After all: the major issue is whether or not the wired-connection / fiber backbone of the typical city is up to spec for this kind of thing. (A typical living room user probably doesn't have to worry about clogged 2.4GHz connections unless they're in an apartment I guess...)

> tradeoffs

Lag is the least of your worries. Encoding artifacts will get to you and completely ruin the experience in anything that is not a simple puzzle game.

So basically the same issues as OnLive, then? I remember thinking that it was pretty good for a while. Then I tried the same game on a decent PC immediately after using the service.

It was night and day, both in terms of latency and graphics quality. I don't have high hopes for this service.

I feel similarly. I've tried in-home streaming from PS4 to a MacBook— obviously the encoding of the video signal isn't going to be as optimized or speedy as Google's backend, but still, the overall ping is sub-10ms and it's still a way worse experience than playing the same game on the TV, even for relatively slow paced action-adventure titles like HZD or RDR2.

I can't imagine trying to play something like a fighting game this way.

It worked great, in my experience. The most awesome thing about it was playing Dirt Rally 3 I believe, on a Motorola Droid/Milestone. A full PC game on a smartphone, pretty mind blowing at the time, and it was quite good even over 3G.

> Lag was still a rare thing during my experience

Anything more specific than that, i.e. lag would occur every 10 mins on avg?

Yes, that seems about right. I would say lag occurred once every 5 minutes and lasted roughly a second, not longer than that. But since gaming is such a visual medium, that one second feels longer than it actually is. One thing I forgot to mention is that I was on wifi, so performance may have been better had I plugged in an ethernet cable. Although I would add that wifi is what the typical user is going to opt for.

I used to get these, every 5 minutes, right on the clock. Turns out, OS X location services rescan wifi every 5 minutes, God knows why. Turn it off, lag gone.

From Apple's info page on Location Services:

> Your approximate location is determined using information from local Wi-Fi networks, and is collected by Location Services in a manner that doesn’t personally identify you.


Excessive channel scans have been a problem on Linux, too: http://blog.cerowrt.org/post/disabling_channel_scans/

It's about time WiFi chipsets had a separate radio just for scanning.

The idea that one can retune the radio into a different channel for 100 milliseconds or so and not impact user experience is absurd.

Fascinating. I tested this on a Windows 10 PC, though, so it may not be the same issue.

> I would say lag occurs once every 5 mins and lasts for roughly a second.

As someone who plays through DOOM on Ultra Nightmare, this kills the game. There are times when, if I were to miss even 10 or 15 frames, there would be a good chance that will be the end of my run.

Hello, fellow Doom Nightmare player. I think Stadia is not for folks like us who have tasted the full experience on high-end rigs.

It will instead, expand the audience to a lot of casuals who just want a little demon-slaying power-fantasy on a low difficulty. Nothing wrong with that, and it is a damn big addressable audience. Just like how mobile didn't "kill" any other platform, it just added a whole lot of candy crushers.

In my experience, 2.4 GHz is congested. An entire room of wireless controllers at 2.4 GHz (ex: Super Smash Bros. setups with Wiimotes) would cause these "dropped packet" issues, even with local WiFi. The 2.4 GHz band is shared by Bluetooth, 802.11n, and many other protocols.

If this were an entire room of WiFi controllers hitting the same WiFi frequencies all at the same time, it could have been a local problem.

All in all, we can't read too much into the demo-conditions. There are too many variables at play here.


No lag at all on my end. I am in the USA, so I have a solid but not amazing connection.

I should note that I have only played for half an hour because AssCreed was extremely boring. The tech itself was very solid in my experience.

I tried to catch some streaming artifacts, like parts of the screen not updating or a visible lag, but I could not see anything. In a blind test, I doubt I would have been able to differentiate this from running the game on my home console.

I'll second that. Standard Xfinity cable connection, decent PC, and wired ethernet to the router. Occasionally in the evening during prime time, when I assume everybody in my building was watching Netflix, the stream would kick into lower-quality graphics, but it would recover quickly. No control input lag that I could detect. I put about 30 hours into the game and enjoyed myself. (Awesome environment design and a vast, vast map to explore. All that fighting is repetitive and tedious, but story mode and running around exploring, taking in the views and the history, is quite fun.)

In any event I think that if you are a casual gamer and want to get off the graphics card upgrade train every couple of years, then this is a no brainer. It probably would not be a good match for multiplayer shooters, but for solo games, even AAA titles, it's a great option.

My feedback was that if they paired up with Steam they'd have a killer product. They might still have a killer product if they convince all those developers on Steam to also put their games on this service.

The artifacts were very visible to me when rotating the camera quickly or during fast movement. If the camera isn't moving very much I think it is easier to get a high quality picture at a standard bit rate.

> This was specifically a problem when playing Assassin's Creed: Odyssey cause in the cutscene the characters would say something like "Go to the * lag * and meet * lag *" and then I had no idea what I had to do next.

That's annoying.

Now, as a technicality specific to ACO, the goals should still appear in the mission's description, no matter whether you hear the NPCs stating those goals or not.

What were the frame rates like? I tried cloud gaming once but the frames being transmitted were low quality at best on a 70 mbps download.

Odyssey was streamed at 60fps, but the game itself ran at 30.

Google could show ads during the lags - that might help increase revenue numbers.

Unrealistic for what seems like obvious reasons; no one would use the service if it was lagging enough to make that time profitable by way of ads. If your comment was tongue-in-cheek, which I hope it was, I'd urge you to contribute more substance and originality in these comments. After all, this isn't Reddit.

Do you think these things are not considered? There is an army of people at Google trying to figure out how to show us more relevant ads. It's the perfect scenario for upselling.

Slow connection speed? Run these tests, etc. Or, switch to a faster more secure browser. Sound familiar?

HN is particularly Google friendly and criticizing usually gets downvotes.

Google was just today fined 1.49 billion Euros for their ad strategy - this is reality: https://www.bbc.com/news/business-47639228

Yeah, that's not unusual.

That's interesting. I remember now that I signed up for the beta. However I never received an invite :(

Rewrote this a few times. Overall I think this would be really bad for the gaming world. Not buying consoles would be nice, but the natural business model will be ads, pay-per-hour, or pay-per-MB. I think this is going to further drive the industry toward mass-multiplayer and grindy games. More accessible, but worse content.

Indie shops would probably struggle even more under this model.

The cost per user will vary drastically by the amount they play and how computationally demanding it is. I just don’t see it being feasible without personalized cost. Looking forward to paying minimum cost for minimum graphics too.

I disagree. The natural business model will be a mix of subscription services and free-to-play games with microtransactions. Microsoft and others like EA are already moving in that direction with Game Pass (a monthly subscription gets you all first-party games plus partner games).

The biggest concern is going to be internet availability and data caps. I tried the beta for Project Stream (now Stadia) this past winter. It worked well and was impressive, but I have a good internet connection with a 1TB cap. I have friends with much worse speeds and harsher caps and I am not sure if this would be viable for them.

I am more concerned about 'physical' gaming being phased out. I doubt Stadia will do this soon, but I like building a new computer every 4 years for playing games. Maybe it's something I won't actually miss (like how I don't miss CDs for music), but it remains to be seen.

F2P games are the incarnation of grindy.. and more of them is the worst thing that could happen to gaming.

For some people, the grind is the main attraction, especially if it's well balanced. Look at Path of Exile, that's one hell of a grindy game and people absolutely adore it.

Is the grind an integral part of the gameplay, flow and pacing of the game? Fine.

Is the grind a way to slow progression in order to make micro-payments more desirable? Fuck. That.

Or worse, a hybrid like Assassin's Creed Odyssey: a $60+ game that gets super grindy, but reminds you that you can opt into micro-payments to get a little experience or coin boost...

I really enjoyed it for the first 20 hours or so.

Morrowind was way more grindy than Odyssey (take 50 steps, a monster attacks you, repeat until the end of the game). Odyssey just kept all enemies within ±2 levels of your own character, contrary to Origins, where levels were preset for each area.

I downloaded two mods pretty much instantly for Morrowind: no cliff racers and being able to run without losing stamina. Saves uncountable hours.

People love Path of Exile because it's essentially a slot machine.

Also, it doesn't have that high of a playerbase.

While I am not a fan of many F2P games, their popularity and impact are undeniable.

The trap that you should avoid falling into is assuming that's the only way games can thrive. F2P games are huge and dominate the conversation, but I think indie games are the best they have ever been. My favorite game last year was Into The Breach, and by all accounts it sold well in the same market that Fortnite dominated in.

Games are evolving in weird ways, but it's in a multifaceted and diverse way.

Into the Breach was a great game, but it's unlikely it would have sold as well if it weren't for FTL's success.

Eh, probably to some degree, but at least anecdotally for me, "new game from FTL dev" got me to watch the trailer, but the concept is 100% what hooked me.

Oh, for sure, I feel exactly the same way. ITB is one of my favorite games in my steam library, and that's the same way I discovered it. The argument I was trying to make is that a lot of people wouldn't have had the opportunity to be hooked had it not been for the "new game from FTL dev"

FTL was a Kickstarted game from devs with zero pedigree (to my knowledge), and it was also an indie hit. While I agree Into The Breach owes some of its success to the devs now being established, I don't think that negates my point about this being a time when games of all types can thrive.

Game Pass works because you're essentially temporarily unlocking a library of games to be downloaded to your device. It really isn't an indicator they're moving in the same direction as Stadia.

It is kind of another beast altogether for Google, Shadow, Microsoft (xCloud), etc. to dedicate actual hardware for your usage. Shadow and others like it are essentially remote VM w/ GPU that you rent by the hour. Stadia, xCloud, etc. we don't know what the business model is going to look like. The only positive here is that Microsoft, Google, and Amazon are all cloud infrastructure companies so in theory, they could price their offerings cheaper than a company that is a tenant on their systems.

It's not just game pass though. Microsoft has flat out said they are moving towards streaming. They are also doing things like making Xbox live a service on the Nintendo Switch, iOS and Android. Game Pass the way it is today is just an indicator into how things will be.

Microsoft has made clear that they want Xbox to be a platform independent of the physical box they sell to people. They are 100% going in the direction of Stadia. The big question is how they are going to do it in regards to their next console.

They have announced xCloud, yes, but have they announced the business model/pricing? So far, pretty sure the answer is no. So it may be a subscription, but it may also end up being akin to renting a VM with GPU in Azure. Time will tell.

> The only positive here is that Microsoft, Google, and Amazon are all cloud infrastructure companies so in theory, they could price their offerings cheaper than a company that is a tenant on their systems.

That would be considered unfair competition and they would probably be fined by EU, I guess

I agree with your comments. I'd also add Stadia could operate a traditional platform/store model as well. It'd be a big ask for big publishers to launch their huge-budget flagship content straight into a content bundle. But you could have them sell access to people with Stadia accounts same as they sell physical/digital copies to Xbox/PS/PC owners.

Imagine simply moving all hardware support issues "in-house" or at least trading in all of the pimply teenagers with hardware issues for a single team of engineers working for one of the smartest companies in the world.

The vast majority of your support department can be let go.

Fortunately PC Building Simulator is now a game

I think, undoubtedly, I would bet on the longevity of a streaming service from Valve or Amazon more than one from Google. But Google's past history of killing free side services doesn't offer the most relevant precedent. This is a service entering a mainstream (and growing) industry that gels with Google's strengths, particularly scalability and AI. YouTube Premium/TV would be better precedents, though the jury is still out on those. I think there's still a difference, though, in that YT+/TV entered a market with already dominant leaders (traditional cable, Netflix, free YouTube). If Stadia can satisfactorily mitigate the problems inherent to streaming, and get out before Microsoft's rumored Xbox offering, it will be in a prominent position in a new field for gaming.

Why on earth would it be pay per hour or per MB? Does Netflix charge that way? How about Spotify?

Time has shown that the best revenue model for these kinds of services is finding a monthly price, or multiple tiers of a monthly price, that adequately cover the costs incurred from heavy users and light users alike.

For end users, I agree. It’s also interesting to consider how content providers will be paid: per user who plays, per user-hour, a flat catalog fee, etc., and what downstream effects this will have on games.

Netflix and Spotify’s cost structures are different. Streaming content is cheap; rendering a high-end video game can get pricey because you need GPU time. There are a lot of other considerations too. For example, a lot of people will pause a stream and let it sit for a few hours. That's fine. What if people want to leave a game running for a few hours? The computational burden is still high.
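Back-of-envelope arithmetic shows why the GPU-time cost structure matters for a flat subscription. Every number here is an invented placeholder, not a real Stadia or cloud price:

```python
# Assumed amortized cost of one GPU-hour (hardware + power + hosting).
gpu_cost_per_hour = 0.50      # USD, illustrative only

hours_light_user = 10         # hours streamed per month
hours_heavy_user = 100

cost_light = hours_light_user * gpu_cost_per_hour   # $5
cost_heavy = hours_heavy_user * gpu_cost_per_hour   # $50

flat_subscription = 15.00     # hypothetical monthly price

margin_light = flat_subscription - cost_light       # profitable
margin_heavy = flat_subscription - cost_heavy       # deeply negative
```

Unlike a video CDN, where the heavy user costs only marginal bandwidth, here the heavy user consumes real GPU-hours, so the light users have to subsidize them or pricing has to become usage-based.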

I participated in the beta and I think they resolved this by setting a time out, and you get disconnected after some time.

I'm not so sure. To play devil's advocate, this may make games cheaper even without ads. Developers want to not worry about disparate hardware, Google wants you in their ecosystem. Buying google devices, using google services, etc. Content creators want this for better interaction with fans.

> Developers want to not worry about disparate hardware

Most games would be developed for Playstation + XBox + PC + Cloud service to maximise audience (unless one cloud provider gains a monopoly, something so far nobody managed to do in either game platforms or cloud computing).

Of course you can decide to develop only for the cloud platform of your choice, but that's nothing new. Microsoft and Sony already pay good money to make your game exclusive to their platform (if you're lucky, even if you're indie).

I would not be surprised if Google injected some ads into the games as soon as their service reached critical mass and thus killing immersion.

Maybe I'm super negative here but I don't see any pros to this development. Today's games are already dumbed down and steered towards profits only. The only games I play these days are indie games made by 1 person up to a handful of people.

We really don't need more but we need better. (This applies to many other areas too)

I don't understand the arguments being made above.

Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.

Similarly, monthly subscription where you get access to a list of games actually helps indie games. Being able to jump into an indie game you already own within seconds, compared to having to buy and install a small game you've never heard of before.

> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.

Pay-per-hour means developers have an incentive to build addictive games that keep you just engaged enough to keep playing for an extended time.

> Similarly, monthly subscription where you get access to a list of games actually helps indie games. Being able to jump into an indie game you already own within seconds, compared to having to buy and install a small game you've never heard of before.

Monthly subscriptions mean that the platform has to choose what games to include and promote based on what is most likely to make users find value in the platform—which means focussing on those with widest appeal, unless their recommender engine can get enough signal to reliably predict niche interest.

I don’t think this is true. It assumes people are economically rational and trying to optimize fun per dollar. But people already should be valuing their time and yet they don’t. If developers must now guarantee players not just try their game but play it for x hours to make a profit, I don’t think they will be pushing for shorter better games.

> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.

I'm not sure about that. Currently tons of people pay monthly for grindy MMO games. Expanding that sort of revenue scheme to single player games would encourage devs to create more Skinner boxes to keep players "engaged" over the long run.

MMOs are pay-monthly for unlimited hours, not pay-per-hour; that's a big difference. The marginal cost of an additional hour of play is zero.

Players aren't going to spend hours grinding if they know they're paying extra money for each of those hours.

I don't think it's that much of a difference to pay monthly to grind for a month.

Shorter games don't happen. We saw this with Steam when they added a no-questions-asked refund policy for short games. Devs get absolutely punished by making short games now so we're going to see them get longer, and the same will hold if Stadia pays by the hour. If Stadia pays a fixed amount per game we will probably see a proliferation of short games designed to bait people into playing.

> Pay-per-hour would lead to shorter, more thoughtful games, rather than long grindy ones.

Or it ties money directly to length, so you need your game to be 60+ hours to justify its existence. Or people won't pay that much, so we don't get any new Dark Souls or RDR2-length games.

So um, what do the customers get out of this?

Um, playing games at 4k without buying any hardware. And all the other stuff they mentioned in the presentation.

I'm struggling to understand how fewer casual gamers buying consoles would impact indie developers. Could you expand on your point?

Indie developers don't get a cut of console sales, but they do care about how many people buy/play their games, and if this widens the market it seems like it'd be a win for them to me.

Most indie games are short. If money is allocated to content producers based on play time, I would bet that most would earn less than a typical indie game purchase price per user.

That’s just speculation mind you. Things like Stardew Valley would probably do even better.

Nowhere did it say it's per MB or hour or whatever. Yet it's the top comment here. It's just fear mongering and hearsay on your part when most likely it will be a fixed monthly price.

It’s Google, which is notorious for smothering virtually everything unrelated to search or ads in the crib, and especially when it’s something they made in-house. Add to that streaming games; problematic for the (by far) most popular two genres: huge multiplayer shooters, and sports. I see almost no chance of this succeeding long-term, and I’d put the odds of it going EOL within 5 years at over 80%.

These huge companies will keep throwing money at streaming because they see gold at the end of the rainbow, but I've seen no indication that they have a plan without giant question marks before the "profit" step. Look at YouTube: Google acquired it in 2006, it is dominant in its space, and it still doesn't make them money. Yet somehow games, which are more finicky and more reliant on universally good internet connections, are going to work for them?

This is trend-chasing, once again without any real new ideas to overcome existing challenges.

> It’s Google, which is notorious for smothering virtually everything unrelated to search or ads in the crib.

It's apparently deeply tied to YouTube, particularly as an onramp for more (often ad-opportunity-generating) gaming content on YouTube, a service that is both an ad platform and the venue for at least two distinct revenue-generating premium services (the older of which is ad-free), and one that Google has not strangled in the crib.

5 years is about as long as a console generation usually lasts, and this has less of a buy-in cost than either PC or console gaming.

If it runs for 4 years and makes money, that's a success for Google. It doesn't need to be permanent to be useful.

There's the major difference that you can't "pirate" a single-player game that is only available remotely, and that you can access such games remotely from any hardware.

So it could result in a renaissance of the classic AAA 3D single-player games, perhaps.

You already can't pirate the current generation of console games.

Nope, Switch piracy exists thanks to Nvidia

And what about the other 2 consoles? What are you contributing exactly with your boorish "Nope"?

Writing "ps4 piracy" and "xbox one piracy" into google is a whole lot shorter than your comment. Then again, it doesn't give karma like a cool retort would.

I agree, and I'll add that many gamers care a lot about their privacy which conflicts with Google's core business model.

Also, this will fail miserably if Google thinks it can continue with its current attitude toward user support, so there is hope.

"Serious" gamers can still buy consoles. This is not a zero sum market.

Your point being? People can still buy DVDs and yet the movie and tv industry is being completely changed by Netflix.

Three million people still use Netflix to rent DVDs.

... and yet the movie and tv industry is being completely changed by Netflix streaming

In 2015 there were still 2.1 million AOL dialup subscribers. Many people probably just don't realize they're still subscribed.


Barely. Games that don't require an Internet connection and constant updates are a relic of a bygone gaming era.

Everyone copied Apple when they removed the headphone jack; who's to say consoles in the traditional sense will still be around in 20 years?

Microsoft/Sony are NOT going to sit back and let Google win here. Expect replies from them (especially with Microsoft's Azure).

Streaming services for gaming are coming.

> Streaming services for gaming are coming.

It's already failed multiple times. OnLive tried & failed. Nvidia's has been in beta for years. Sony has one.

Everyone in this area has tried this. Nobody has seen what could be described as "success", and the cost models so far have been ludicrous. Turns out renting Xeons and Radeon Instinct GPUs in a professionally staffed, maintained datacenter is way, way more expensive than a handful of consumer chips in a box in the living room with nobody on call to monitor it.

The GPU here looks to be basically a slightly cut-down AMD MI25. That'd make a single GPU in this Stadia cloud gaming service cost more than ten Xbox One Xs. How do you make that price-competitive?
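To spell out the hardware-cost argument: the 10:1 GPU-to-console ratio is the claim from the comment above, and the console price is an illustrative assumption:

```python
console_cost = 500.0          # assumed console street price, USD
gpu_cost = 10 * console_cost  # parent's claim: one MI25-class GPU > 10 consoles

# Each GPU renders one session at a time, so on hardware cost alone
# it only beats a console if more than this many distinct players
# time-share the same GPU over its lifetime (without overlapping).
break_even_players = gpu_cost / console_cost
```

And that's before datacenter power, cooling, and staff, which consoles push onto the customer's living room for free.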

A big difference would be that OnLive had space in 5 colo datacenters in the US, while Google has 19 full datacenters around the world and is building more. Plus, Google has its own very large fiber network reaching POPs and ISPs around the world. That fiber backbone gives them lower and more predictable latency than traversing multiple upstream ISPs with different connections, issues, etc.

Since not everybody is playing at the same time, a single GPU will service multiple players (each of whom would require a console).

On the other hand, everyone on the east coast (therefore using east coast edge nodes) will be playing from 8-11 EST when the new Wolfenstein game comes out, so how is stuff rationed? Do you make people queue until there is a node close enough to them available? Do you sell the spare GPUs to people in GCP to use for their compute on off times to make up the cash? Do you make it $40 per month?

I think this comment is super underrated. If America is asleep, you can't really use that capacity for players in Europe, since the latency would increase. Likewise if Europe is over capacity, you can't really just assign players to a US server.

And (while I realize you're oversimplifying for the sake of example), it's not just per-continent in this case, but something more akin to per-metro-area.
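The per-metro provisioning problem is classic trunk-sizing math. Here is a sketch using the Erlang-B blocking formula, with invented demand figures (none of these numbers come from Google):

```python
def erlang_b(offered_load, servers):
    """Erlang-B blocking probability: the chance a new player is
    turned away, given `offered_load` erlangs of concurrent demand
    and `servers` GPUs in the metro pool (iterative recurrence)."""
    b = 1.0
    for m in range(1, servers + 1):
        b = (offered_load * b) / (m + offered_load * b)
    return b

# A metro pool sized with modest headroom over average demand
# (1000 concurrent sessions) blocks almost nobody...
block_at_avg = erlang_b(1000, 1050)

# ...but a 1.5x launch-night peak in the same metro overwhelms it,
# and idle GPUs on another continent can't help at these latencies.
block_at_peak = erlang_b(1500, 1050)
```

This is why the choices in the comment above (queueing players, reselling idle GPUs to GCP, or pricing for peak) are the whole menu: capacity has to be provisioned per metro for the worst hour, not the average one.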

Google can keep the GPU count near the player count, and when there is no big demand the GPUs can be used for other types of computation.

Yeah, if you know where to look, they left clues about using MI25 hardware. (I haven't been an employee for years, this all unfolded afterwards and, ironically, it is just one search away.)

I'm sure they got bulk/promotional pricing from AMD, plus they're very good at both running hardware with low overhead and packing it efficiently.

> plus they're very good at both running hardware with low overhead and packing it efficiently.

You can't really pack the hardware here since it's latency sensitive. It's straight dedicated resources to an array of VMs. Dedicated CPU cache, even, hence the odd 9.5MB L2+L3 number.

Bulk pricing only gets you so far here. You're still talking gear that's categorically way more expensive than similar performance consumer parts. Not to mention all the other costs in play - data center, power, IT staff, etc...

Making this price-competitive is a big problem.

You can't do time slicing, no, but you can definitely reduce time to first frame in many ways. If you don't do that, you need to provision even more hardware. Packing is also part of the capacity planning phases of a service.

The other costs (power, people, etc.) are amortized over Google's array of services.

Last but not least, it would be very dumb of them not to run batch workloads on these machines when the gaming service is idle. I bet $1000 these puppies run under Borg.

> The other costs (power, people, etc.) are amortized over Google's array of services.

Power doesn't really amortize, and neither does heat.

And capacity still had to increase for this. They didn't just find random GPUs under the table they forgot about, and now that they have a massive fleet of GPUs it's not suddenly going to start handling dremel queries.

This all still costs money. A shitload of it. Someone is going to pay that bill. More ads in YouTube won't really fund gaming sessions. So will this be ad breaks in the game? No way that's cost-effective for the resources used. Straight-subscription model? This seems most likely, but how much and how will you get people to pay for it?

Maybe it wasn't AMD, but they already had a massive fleet of GPUs. It wasn't running Dremel, either. Or maybe they found a way to do that, too, I don't know, but there are already enough workloads at Google to keep GPUs well fed.

I know from experience that Google is very cheap. You tell Urs you saved a million dollars and he'll ask you why you didn't save two. Or five.

If this takes off, the pricing of the service will pay for the hardware (assuming they did a reasonable job there of baking it in). Even if it doesn't, organic growth from other, much larger Google services can make use of the idle hardware.

For the record, I was involved in a couple of projects that required a lot of new hardware. One of them even ended up saving the company a lot of money in a very lucky, definitely unintended way.

>They didn't just find random GPUs under the table they forgot about, and now that they have a massive fleet of GPUs it's not suddenly going to start handling dremel queries.

This strikes me as rather amusing. Google was having such trouble getting their hands on enough GPUs that they decided to build custom hardware accelerators (TPUs) to fill the gaps.

I'm sure they'll find a use for these.

Or to look at it the other way, it's a Vega 64 with double RAM so Google probably pays $600 or less. Google doesn't pay enterprisey gouge pricing.

It'd be a Vega 56 basis, not Vega 64, but the problem is that "double the RAM" part.

HBM2 memory is super expensive. Like, rumors are 16GB of HBM2 is $320 expensive. Toss in anything custom here and there's zero chance this is under $600/GPU.

Even in the hotly contested consumer market, the 16GB HBM2 Radeon VII is $700. And that doesn't have any high-speed interconnects to allow for sharing memory with the CPU or other GPUs.

> How do you make that price-competitive here?

Could they be using these GPUs for other purposes during idle times (AI model training, cloud GPU instances)?

They have TPUs for their AI stuff, and you still have to dedicate these resources while gaming sessions are active. How much monetary value can they really get out of the idle population here to offset the active usage?

You underestimate how much Google tries to squeeze out of all the machines in its fleet. That includes old ones, sometimes to comical effect. A colleague at my current job told me about utilization targets at Amazon, where he used to work. At Google you could choose to be that wasteful if you really wanted to, but you'd lose headcount. Be more efficient and you'd get more engineers. I.e. you decide if you'd rather get machines or people.

There's also an old paper by Murray Stokely and co. about the fake market that was created to make the most use of all hardware planetwide.

Not really, the Samsung Galaxy S10 still has a headphone jack. And Samsung being the largest Android device manufacturer, makes a big counter point.

Sony launched Playstation Now 5 years ago. This is not a particularly new idea.

Microsoft already has project xCloud in the works.


There's so much that this will hurt in the gaming industry. Some of my thoughts:

1. As subscription services take over, the upfront revenue game studios see will drop. This is just simple math: Xbox Game Pass costs $10/month, which means six months of it costs the same as one AAA game. In a traditional model, many gamers would expect to buy, let's say, 2 AAA games per year. In this new model, I can play as many as I want. And even if I only play 1 or 2 every year, I'm almost definitely going to be "dabbling" in the collection for other games I may want to play, ESPECIALLY if they're instant-on like Stadia. Even if they pay out to studios based on some metric derived from time spent in game, there's no way studios will get the same level of income as they did before. (Note: this is exactly why Spotify is having such a hard time, and why they're branching out beyond music. Royalties abstracted behind a subscription service suck for the bottom line.)

2. So upfront revenue drops. How do studios make that up? In-game transactions. They're already huge, and they'll just keep getting bigger.

3. So what, micro transactions (mtx) are the "new normal". Well, the top 5% of games can afford that decreased upfront revenue by making it up in mtx (think: Fortnite, Apex Legends, CoD). The trailing 95% can't (think: Indie titles).

4. Beyond that, you can bet your bottom dollar that Stadia will pay out tons to the AAA studios just to get their names on the platform, given that Google has no first-party studios to speak of. Assassin's Creed gets enough upfront revenue to make it worth their while, meanwhile the next indie darling is left out to dry, further balkanizing the gaming industry.

5. Switching gears: A massive number of software engineers in the industry entered it because of gaming. Games, even back in the 90s, were such a clear application and value of computers that it was obvious, even to children, that they'd be something huge. It inspired a generation, to not only play, but to mod and even make their own. Now, we're moving that all off into the cloud, hidden from the next generation. Google wants you to own a Chromebook and consume their products, not understand how they work.

6. Speaking of modding: It's literally the source of the world's most popular games. Battle Royale? You can trace its roots back to mods for ARMA and Minecraft. MOBA? Dota, a mod for Warcraft. Creativity happens in environments that large corporations can't recreate, and traditionally a great platform has been starting with a base game, a great game, that some studio created, then exerting your creativity on top of that platform. It benefited everyone, including large AAA studios who could then copy your idea and make millions. Yeah, good luck modding on a blackboxed server a hundred miles away.

7. But fine. I guess we're moving into the future and this is part of it. Except, there are millions, even BILLIONS, of people around the world without the internet capability to even join this service. Google tried to help solve this with Fiber, and gave up. It's fucking hard. They'd rather do easy, cool things, like cloud gaming. Modern consoles are bad enough; my brother, who lives just an hour outside of a top-10 US city, recently told me that he downloaded Fortnite on Xbox for the kids. It took a week of 24/7 downloading. Most games drop with day-1 patches in the dozens of gigabytes, even if you buy the disc in stores. The sheer arrogance of Google, to get up on stage and claim this service is gaming for EVERYONE, the apex of accessibility, is disgusting to me. They're stuck so far up their own ass they've become the ouroboros.

8. Well, streaming games have taken over the world. Let's say you want to compete with Google on this streaming-game front. All of the top three cloud providers now want to get into game streaming. So, no way can you compete with them on cost; they own the data centers and give their game streaming divisions nice fat discounts. They all have the pockets to design nice custom silicon with AMD specialized for the task. And, oh by the way, all the latest games are now optimized for this silicon (whether it's the custom AMD chips in the PS4, or XB1, or whatever cloud streaming service we're talking about, they're all custom). You're stuck with off-the-shelf cards. Ha! Nvidia won't let you deploy their cards in a datacenter [1], because they ALSO want in on this big cash pile they've all convinced themselves exists. So, basically, good luck. The world has balkanized, and penetrating it becomes harder every year.

I hate this. I hate it so much. The only saving grace is that it is inevitable that this will fail to realize the results Google wants, and they'll pull the plug. And maybe the rest of the industry is smart enough to recognize how short-sighted a streaming-first/subscription-first strategy is, for literally everyone involved except the people who rent the metal.

I want to go back to the 2000s. This new world sucks.

[1] https://www.techpowerup.com/239994/nvidia-forbids-geforce-dr...

honestly none of this even matters until they can offer a service with comparable input latency to local play, and so far every indication is that they can't. the whole service falls apart if it feels like shit to play.

Have you played the service? How can you be sure it sucks when no reviews are out? Look for people who have dogfooded it and raved about it.

Unless they've solved the speed-of-light issue, we don't need to wait for reviews. Even using a remote desktop on the other side of the city isn't a pleasant thing to be doing full time, and that's not remotely as twitch-based as gaming.

I don't think that'll be a huge issue.

Google Cloud has regional US DCs in western Iowa and central South Carolina. A midpoint between those two locations roughly lands on Nashville TN, which is ~600 miles away from either. Light could make a roundtrip of that distance in 6ms. Of course, the internet doesn't allow for latency at the speed of light, but that's the physical limit, and that's plenty; a typical internet browser alone has input lag of 10ms [1]. In order to achieve 60fps, frames have 16ms to be rendered.
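The arithmetic behind those figures, using the vacuum speed of light as the physical floor (light in fiber travels roughly a third slower; the 2/3 factor below is an approximation):

```python
# Physical lower bound on round-trip latency for the Nashville example above.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
FIBER_FACTOR = 0.67             # approximation: light in fiber moves at ~2/3 c

one_way_km = 600 * 1.609        # ~600 miles to either datacenter
round_trip_km = 2 * one_way_km

vacuum_rtt_ms = round_trip_km / SPEED_OF_LIGHT_KM_S * 1000
fiber_rtt_ms = vacuum_rtt_ms / FIBER_FACTOR

print(f"Vacuum round trip: {vacuum_rtt_ms:.1f} ms")  # ~6.4 ms
print(f"Fiber round trip:  {fiber_rtt_ms:.1f} ms")   # ~9.6 ms
```

Even the fiber figure sits comfortably inside a 16 ms frame budget, before routing and queueing overhead is added.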

But the regional DC is only the worst-case, because they've said they're deploying these things in 7500 locations around the world. That's unprecedented scale for a tier 1 cloud provider at the edge. They know that they have to be close to consumer populations.

Also consider this: Once cloud streaming takes off, we're going to see deeper integration into the frameworks and game engines themselves. Imagine a game engine built for streaming. It could do input prediction, doing a "light rendering pass" of frames for the N possible inputs the input buffer might receive on the next frame, before it receives them. These custom chips have plenty of headroom to do this at 1080p, and most controllers have, what, 12 buttons plus all of the joystick states? Depending on the game this might be possible (for example, it's hard to do in multiplayer). Combine that with the natural advantage a cloud-hosted multiplayer game would have in networking with other clients to resolve game state, and you can see that it's not just a strict downgrade; it might be possible that we'll see improvements in the performance of games beyond just the typical "new year, better graphics" cycle.

[1] https://www.vsynctester.com/testing/mouse.html

[2] https://shadow.tech/
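A minimal sketch of that speculative-rendering idea. Everything here is hypothetical: `render_frame` is a stand-in for an engine's renderer, and the candidate input set is invented; no real engine API is implied.

```python
# Sketch: the server pre-renders one frame per plausible next input before
# the (network-delayed) input arrives, then ships whichever frame matches.
from typing import Callable, Hashable


def speculative_frames(
    game_state: dict,
    candidate_inputs: list,
    render_frame: Callable[[dict, Hashable], bytes],
) -> dict:
    """Pre-render a frame for every candidate input for the next tick."""
    return {inp: render_frame(game_state, inp) for inp in candidate_inputs}


# Usage with a fake renderer standing in for the real thing.
def fake_render(state: dict, inp) -> bytes:
    return f"frame:tick={state['tick']}:input={inp}".encode()


frames = speculative_frames({"tick": 42}, ["up", "down", "fire", None], fake_render)
actual_input = "fire"          # arrives late over the network
frame_to_send = frames[actual_input]  # already rendered, no added wait
```

The trade is GPU work for latency: N renders per tick buys back one network round trip, which only pencils out if the per-input "light pass" is cheap.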

I watch WebRTC streams from California in Germany with 100ms network latency. Speed of light has never been an issue. Latency within the same continent or within the same city is much lower.

People said the same about streaming video services.

No they didn't, that was simply a bandwidth issue, we've been streaming video since TV was invented and with the right equipment you could stream video over the internet before the web even existed. No equipment exists that can alter the fundamental speed limit of the universe.

Video isn't interactive.

Yep. This is the kicker. For streaming videos, only bandwidth matters but for video games, both bandwidth and latency matter.

Reminds me a bit of the 30/60 fps fights a few years ago. Sure, 30fps games look more "cinematic" and 60fps movies look uncanny, but 30fps games feel less responsive.

This is such a parochial and short-sighted view. Let's block new technology because "game studios will die". Businesses need to adapt. And if people still like games that don't rely on microtransactions or multiplayer, studios will make them. There is a place for both.

Also, to point 7: guess what, billions of people can't afford a console or an expensive PC rig. Yet in developing countries data is already very cheap, if not free. So this DEFINITELY is a big step closer to unlocking games for them.

Do you really think data is delivered to people in those countries in any sort of latency that would make a service like this remotely enjoyable to use? A massive number of people in the US don't even have internet that could support something like Stadia, let alone developing countries.

By comparison, shipping "edge devices" (aka, uh, COMPUTERS) running a Snapdragon 845 or Tegra (like the Switch) is cheap and getting cheaper. What makes more sense: asking someone in a developing country to pay $200 one time for a general-purpose computer useful for everything including 1080p gaming, or $10/month in addition to, uh, guess what, some computing device they'd already have to own to access Stadia?

The adage goes: never underestimate the bandwidth of a station wagon full of tapes barrelling down the highway. I mean, the follow up is usually "but never forget the latency", though in this case that doesn't apply. Point being: the internet isn't the answer to everything, but if the only tool Google has ever known is a hammer then every problem is going to look like a nail.

You clearly have never lived in a developing country to know how many families will balk at spending $200 on a gaming console. Accept your ignorance.

This isn't new technology. It's just a new product. Actually, it isn't even a new product since others have tried it before.

You hit every single point I tried to make to my friend about half an hour ago. On one hand, I'm terrified that all of this will come to pass; on the other, if this is like any of Google's other projects, it's going to evaporate in two years anyways. Whatever happened to OnLive?

It wasn't the right time for OnLive. The tech wasn't there, in networking infrastructure (the internet), compute infrastructure (DC tech), or hardware (graphics cards).

To that last point: it is impossible to overstate how fundamentally important Nvidia's Pascal architecture has been to the development of both gaming and AI. In my mind, it's the most important computing chipset invented in the 2010s, and belongs among the "world's greatest" chips next to the Intel Core architecture, Apple's A-series, the Pentium, and the 8086. It put Nvidia, quite literally, 5 years ahead of the competition almost overnight; AMD is still catching up, three years later, to the perf-per-dollar and perf-per-watt of the GTX 1080.

That chipset, and the cards that were made with it (GTX 1080/1080Ti namely) were the first indication that DC rendering with a stream-to-client architecture was actually possible for video gaming. Before that it was hard to make an economic case for it.

OnLive was purchased by Sony, and it's easy to conclude that they repurposed its tech for their PlayStation Now service. So it lives on.

The whole gaming industry is positioning itself for cloud-only streaming in the next 10 years. With 5G+ everywhere, reduced latencies, and nearby datacenters full of specialized GPUs, many gaming studios plan to release streaming-only games at some point, cutting off local players completely (it makes perfect business sense: collecting regular monthly rent for gaming). Unless there is some regulatory pressure, the only way to play AAA games in the future will be via streaming, and local computers will be pretty dumb latency-optimized streaming machines.

I'm not sure how that would affect game developers though, as there won't be any attractive path into programming for young people; before, making your own game was quite an attraction to jump into programming and a motivation to study hard.

Make some backup copies of those roms now, I guess.

This is my main concern.

Why would they charge per hour or per megabyte? That reduces stickiness. Much more likely is that they'll just get a cut of the game developers' sales, just like the game console vendors do. They'll set the cut so that the net profit pencils out over the long term.

Probably a subscription service like Google's other media-content-house offerings (Play Music and YouTube).

The cost will change per game, so it could be that a game's cost is associated with how much compute it needs. I think it will probably be a monthly subscription based on the gym-membership model: most people won't use it a lot, and they'll make all their money on them.

I think my concern about this wholly depends on if it completely takes over the business model. I really want to own my games. But, I don't mind if alternate streaming models exist, just if they become seriously required.

It's awesome, I can finally use Linux everywhere! Hell yeah, count me in

I think your worry is overblown.

This is going to be an alternative for some types of games.

And quite to the contrary, I think it would be awesome for indies, since their games tend to be pretty short. And it is much easier for them to gain customers if they go viral over streaming, as many of them do, like horror games.

Grindy games like classic WoW? Sign me right up.

Why are you sky-is-falling this announcement? Where in their release did they say anything about per-MB or ad based business models? Did they say anything at all about personalized costs in general?

No, they didn't, and every single thing you've suggested about their gaming platform is also true about every other streaming platform, and yet none, literally none, of those platforms have tried any of the things you're complaining about.

You basically just made up a bunch of things to complain about because nothing in the actual release was objectionable. Yours is not a helpful way to react, my friend, though it is a popular one.

I mean, I’ll give you that nothing is confirmed, but nobody has made this work yet. It’s mostly tech testing, so current business models aren’t a good metric.

I’m open to debate about how this could be priced, but I’m pretty comfortable pointing to existing cloud computing business models or streaming services as a precedent.

I would invite you to come up with an alternative business model for serving people who like to play high end graphics games for many hours a day.

> but I’m pretty comfortable pointing to existing cloud computing business models or streaming services as a precedent.

Why are you this comfortable? Netflix, YouTube TV, Twitch, Sling, Amazon Prime Video -- basically all streaming services offer flat rates, not per-MB ones.

Further, all existing game library services tout unlimited gaming as a primary selling point! That's the primary reason you opt into Gamefly or Nvidia Shield, at least according to their own marketing.

And this offering is not for high-end gamers. It's taking the benefits high-end gamers get for their investment into their hardware, and making it available to the millions of more casual gamers. This isn't for high-end gamers, so creating a business model for them using Stadia makes no sense.

Finally, you're not thinking of this at the right layer if you're thinking in terms of things like s3, ec2, lambda, etc.. This is the product that's built on top of those, and the single price problem has been present for hundreds of years. It's a solved one, just ask any current MMO or hell, any clothing manufacturer. You're basically saying that an XL t-shirt is going to cost the same as a S t-shirt, despite tens of thousands of examples to the contrary.

> Netflix, YouTube TV, Twitch, Sling, Amazon Prime Video -- basically all streaming services offer flat rates, not per-MB ones.

YouTube (and Google Play TV & Movies, which appears to carry the same for-sale/rent content in a different storefront) and Amazon Video also both offer purchase of individual content items as well as a common flat rate subscription to certain content.

That's got everything to do with digital rights management, and nothing to do with resource utilization.

The amount of variable cost to stream music or video to a user is significantly less than the variable cost to render high-end graphics. The high-end hardware costs money and depreciates. Why would it cost less to rent a GPU-hour for generic purposes than it would to rent a GPU-hour to play video games?

> The amount of variable cost to stream music or video to a user is significantly less than the the variable cost to render high end graphics.

I don't agree. Anything is scalable.

... no, not everything is scalable, nor does everything benefit from economies of scale.

This does benefit from economies of scale, but it’s not something you can just solve with infrastructure and fixed cost. No matter how many computers you have, you’re still going to do multiple orders of magnitude more calculations to render high end games than stream a song. And you’re going to deal with difficult load balances because every twelve year old gets home from school at the same time (exaggeration but point stands). GPU time costs money.

This benefits massively from economies of scale, and yes it is very much something you can solve with infrastructure and fixed cost. Fundamentally it doesn't matter if you're streaming a song or streaming a game.

GPU time costs money but it's a fixed cost, doesn't matter what the GPU time is being used for, therefore it won't be per-game. The end.

It’s only a fixed cost if the computers must be running at all times. That’s not the case. GPUs consume power, and require energy to cool. They also burn out over time and need to be replaced.

For similar reasons, you’re not just going to rake in the money mining bitcoin because you bought a bunch of computers.

Or maybe to make the point even stupider, you could make a game about training neural networks same as you would on a real cloud service provider. If you can understand why google doesn’t charge a simple monthly flat fee for cloud computing of neural nets, you can understand why they can’t charge a simple monthly fee for computing neural nets in a game.
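The "GPU time is not free" point can be put in rough numbers. Power price, card cost, PUE, and service life below are all invented assumptions for illustration, not datacenter figures from any provider:

```python
# Rough variable cost of one GPU-hour: electricity plus depreciation.
# Every figure is an illustrative assumption.
gpu_watts = 300            # assumed board power under load
pue = 1.1                  # assumed power usage effectiveness (cooling overhead)
power_price_kwh = 0.07     # assumed industrial electricity price, USD/kWh
card_cost = 1_000          # assumed amortized card cost, USD
useful_hours = 3 * 365 * 24  # assumed 3-year service life, always on

power_cost = gpu_watts / 1000 * pue * power_price_kwh
depreciation = card_cost / useful_hours
print(f"Per GPU-hour: ${power_cost:.3f} power + ${depreciation:.3f} depreciation")
```

A few cents per hour sounds small until it's multiplied by millions of concurrent multi-hour gaming sessions, which is the poster's point.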

I really don't understand why people here aren't talking about the real underlying principle of this service. Google has captured the education and cheap-laptop space with Chromebooks. This is the logical extension of that strategy. Sure, they will eventually combine ChromeOS and Android, but this is one more step in the direction of everything happening inside your web browser.

The entire reason this is happening is because Google has successfully captured what once was the holy grail of markets, education, and is doing things like this to keep people on chromebooks.

ChromeOS could be as important as Android. Especially with the end of Windows 7 extended support coming up, Google has less than 4 years to convince people to transfer to their platform instead of Windows 10. I know it seems heretical to imply that it will happen, but I think this is a good example that Google considers it not only a real possibility but something they might actually have a good chance of pulling off.

The most important thing to realize is that they are focusing on gamers as a trial, the people who are price sensitive and who have to move first off of Windows 7 since they can't pay for extended support even if they wanted it. They can then continue this into the workplace. This is a brilliant move by Google.

An alternative explanation might be that it's a buffer for overprovisioned GPUs for an undersold Cloud service

GCP uses Nvidia but Stadia uses AMD GPUs so they're not sharing hardware.

It could also be a way for AMD to get a ton of GPUs sold after oversupply due to the mining boom/bust.

The GPUs are custom for Google.

Do we know custom how? It might be the same chips with different microcode or whatever GPUs' equivalent is. Is it a custom spin like Intel has been doing for years for Google and Amazon?

What worries me is that I read about this being "Chrome-only".

Google is not just trying to win in console gaming space here - it's trying to establish itself as the browser monopolist and it seems to be doing it without much subtlety.

And of course, only Google can build its game service into its own browser, and into its own video-sharing site, and into the Play Store. Stadia is Google utilizing multiple of its existing monopolies to enter a new space. It's also a major shot at Twitch, not just traditional game platforms like PlayStation and Xbox, since streamers will need to be streaming on YouTube to take advantage of things like Crowd Play.

There's a difference between having a monopoly and being the best product in the market.

I'm not sure; if you drive everyone else out of the market then you are by elimination the best product in the market (since at that point you are the only product left).

Monopoly laws exist for a reason and hard experience and I'm of the opinion that a single large player having complete domination even if they also have technical superiority is something that probably should be broken up.

> I'm not sure, if you drive everyone else out the market then you are by elimination the best product in the market (since at that point you are the only product left).

I think this can be read in two ways:

1. You're the only product, therefore you're the best (and the worst too, at the same time I guess). In which case I want to point out you can be a monopoly without controlling 100% of the market. There are almost always smaller competitors around— you just happen to have a share large enough or the assets necessary to control what happens ¯\_(ツ)_/¯. De Beers, for example, was considered a monopoly when it controlled 90% of the world's diamond production.

2. You're the one that came on top in a market with other competitors. Therefore, you must be the best.

This is assuming the only way to eliminate players off the market is by being "better" than them, but that is sadly not the case. For example— in Mexico there's a monopoly over the telecommunications business, and part of the reason it happened and stayed that way was due to support from the federal government and political corruption. Microsoft has had monopolies over several software markets— not because they were "better", but because if a better product came to be, they'd either buy it or build one that was built-in to Windows. IE was pretty bad, in many ways worse than FF, but also came built-in.

Completely agree with both your points, you can also be a monopoly in a less direct form of corruption via regulatory capture, there are lots of ways for smart rich people in charge of massive companies to bend the system their way.

Democracy and a free press in theory should act as a retarding measure but somehow that’s gone off the rails more (or I’m more aware of it than I used to be and it’s always been that way), social media and the internet has changed the landscape, We have a sitting president screaming fake news at news where they have incontrovertible proof..often his own words from previous speeches and interviews.

The world has gone haywire and at a time when globally we need more unity to address the issues facing us as a global society the very bastions of that global society are getting beaten with a stick.

I wonder what the world is going to look like 2050, I’ll be 70 if I’m still around.

What would stop someone from building a Stadia competitor as a Native Client app?

Google's infrastructure, as well as Chrome, Google Accounts and Youtube. Budget too, I suppose.

I mean, this doesn't stop competitors, but it's quite the moat to climb.

They specifically said they are looking into making the client browser/platform-agnostic, which they'll have to do anyway if they want it on iOS since you aren't allowed to run anything but the Safari engine for web browsers (of course, they could build a native client).

When they say agnostic, it will mean proposing and implementing web standards that they push through and other browsers don't support yet and have to catch up on — look at all the browser extensions they came up with at light speed, like WebUSB. Not to mention that those had very serious security flaws too.

Google uses open standards and business practices to appear like the good guys, but make no mistake they are abusing their market power.

I mean, when will you see ads for Stadia on the Google homepage? What happens when you search for Assassin's Creed in the future? Oh look, we have it right here at Google, no need to leave our ad platform.

I feel like there's a lot more low-hanging fruit that could be added to the Chrome experience to win people over, rather than building a relatively expensive game streaming platform.

Unless I'm mistaken ChromeOS is only used in US schools on a wide scale.

The only Chromebook I've seen anywhere is the nice one I bought my mum for Christmas year before last, she loves it and I've had no issues supporting it (like not a single one, she uses it constantly as well).

I've got a school age stepson nothing there either.

Give it a couple years and they will be everywhere. I used to work for a company that sold a monitoring and filtering product for school computers.

It's only been in the last two or three years that the ecosystem around them has really matured enough that they can compete with Windows machines and iPads on anything other than price. With the way schools' budgets work, we've really only just passed the early adopter stage.

Not with the hardware division being downsized.

Are you referring to Google's hardware division? If you are that doesn't really mean much.

Most schools aren't buying fleets of Pixelbooks. They're buying chromebooks from companies like Acer and Asus which make devices that retail in the $200-400 range.

Yeah, but someone needs to actually develop ChromeOS, without Pixelbooks management might decide to focus elsewhere.

And right now from the outside it feels like there is ramping internal politics going on between the ChromeOS, Android, PWA, Flutter, Fuchsia, Kotlin, and Dart teams, with upper management giving them free rein in a let-the-best-win kind of way.

> Yeah, but someone needs to actually develop ChromeOS, without Pixelbooks management might decide to focus elsewhere.

I really, really doubt that. I don't think you understand how ubiquitous Chromebooks are becoming in the education space. Last I heard, in the US, 60%+ of all school-provided computers are Chromebooks. School sysadmins love them because they're dirt cheap and can be provisioned quickly.

From Google's perspective it's great too. Between Google Classroom and the way Chromebook device management works, students have to have a Google account to be able to go to school. There are rules on what data can be collected, but still, kids are forced into the Google ecosystem at a young age.

ChromeOS doesn't need the pixelbook to survive, it provides an enormous amount of value on its own.

> And right now from the outside it feels like there is a ramping internal politics going on with ChromeOS, Android, PWA, Flutter, Fuchsia, Kotlin, Dart teams

100% agree there. I've been hearing for the past 3 or 4 years that Android and ChromeOS were going to be merged, and nothing has come of it. It seems like even Google doesn't know what's going on there.

Check the worldwide market share; Chromebooks have a smaller share than GNU/Linux desktops.

Being king of only the US school system isn't something that holds up long term in a product roadmap.

> Being only king of US school system isn't something that holds long term in a product roadmap.

Dude... I'm sorry but you are so wrong. Like I said originally, I literally just left this industry after working in it for years.

The US spends more money on education than pretty much any other nation, both per student and as a total dollar amount. The reason the numbers are so low worldwide is that Chromebooks in education are a relatively new concept. Everyone has been going after the big fish, which is the US.

Additionally, the way you need to handle student data in the US is fairly consistent across state lines, which means you don't need to customize your solution very much to be able to sell to all 60 million students. Once you go overseas, you'd need to sell across multiple countries to find a pool of students that big (unless you're targeting China, Russia, or India, which all have their own issues).

If you don't believe me, here's a blog post from a few years ago where Google literally says ChromeOS is here to stay and then focuses heavily on its benefits to education. https://blog.google/products/chrome/chrome-os-is-here-to-sta...

Even if you want to ignore all of that, I don't think you realize how much of a PR nightmare it would be if Google just stopped supporting ChromeOS right now. Schools have spent hundreds of thousands of dollars buying into this ecosystem. For schools that buy at the district level, it's in the millions. Most schools/districts don't have the budget to just replace all their computers overnight. Shutting down ChromeOS would pretty much fuck all digital learning in a lot of school districts for years to come.

PS. I'm pretty sure iOS's adoption numbers in the US vs. worldwide (not in schools, just total consumer adoption) match up pretty closely with Chromebooks', so there goes your idea that dominating only the US market isn't a viable business strategy.

Should I list all the products Google has written blog posts about being here to stay?

Sure. I'd love a list of all the products Google has canceled (not merged into another offering) after coming out on their official Google blog and saying they are here to stay.

The source that claimed Google was downsizing their hardware division said that they moved "dozens" of employees. Just last year they finalized an acquisition of 2,000 employees from HTC. What difference do a few dozen make?

Furthermore, the source didn't exactly do a great job with their research. This is a direct quote:

>Pixelbook is a Chromebook, meaning it runs on Google's Chrome OS software and is only capable of using internet-based applications.

Worldwide it has even less market share than GNU/Linux desktops.

My school uses them in the UK but I believe that's a minority. A lot more schools seem to use iPads instead.

Not common in Singapore. We do use computers in schools, usually Windows machines or MacBooks. Same goes for universities.

There is a wide deployment in Norway I believe, but indeed, I've not heard of big deployments elsewhere.

Common in New Zealand too.

What you smoking foo?

>>The entire reason this is happening is because Google has successfully captured what once was the holy grail of markets, education, and is doing things like this to keep people on chromebooks.

Care to share how this tidbit is true?

>>The most important thing to realize is that they are focusing on gamers as a trial, the people who are price sensitive and who have to move first off of Windows 7 since they can't pay for extended support even if they wanted it. They can then continue this into the workplace. This is a brilliant move by Google.

What gamers are still on Win7? Most are already on 10. Most legit gamers care about input lag, which a streaming service will never solve vs. having hardware wired to your monitor. Most legit gamers use a wired mouse, for God's sake, because of the input lag introduced by wireless mice.

> What gamers are still on Win7? Most are already on 10.

Care to share how this tidbit is true?

Sorry to use your words against you, but I'm not so certain. I'd like to see real statistics from a few sources. I agree that Windows 7 is surely being phased out, but anecdotally (which is where most conjecture like yours and mine comes from), I know a number of people who specifically stayed on Windows 7 for two reasons: 1) compatibility with games; and 2) no telemetry (or at least considerably less than Windows 10).

Steam's hardware survey is probably the best measure available, and puts W10 at 68% and W7 at 27%.


Maybe, but I would like to see more than the one source. HN is becoming filled with people who tend to make such generalized statements without much to back it up. Your link helps, so thank you! I would love to see more data and less speculation around here :)

Re: the survey - the type of person who remains on Windows 7 due to concerns about 10 would be more likely to opt out of the survey, or not use Steam in the first place. I would put the number for Windows 7 a little higher at maybe 30%.

I expect the number of Windows 10 players to increase as DX 12 becomes a common requirement, but we're not quite there yet.

>Care to share how this tidbit is true?


I would say 60+% qualifies as most. On the other hand, ~25% is not negligible. So you both win?

Windows 10 supports more modern games and consumers outside the nerd sphere don't care about telemetry.

I suppose, then, that ignorance is bliss. Because once I explained Windows 10 telemetry to my own parents, they opted to run Linux. They're nowhere even close to the penumbra of the nerd sphere. But, it is anecdotal, though it does rebut the notion that ALL consumers outside the nerd sphere don't care.

> doing things like this to keep people on chromebooks.

Care to share how this tidbit is true?

Lots of filthy casuals just care that they can game. If they can, somehow, why shouldn't they buy the Chromebook? Hell, if that's what they had in school, they might have grown up to like precisely the genres of games that are easy to design around the lag.

Most legit gamers care about input lag

The majority of people don't notice < 40ms of round-trip lag, though "legit gamers" often do. At < 30ms of round-trip lag, few people outside of hardcore FPS players will notice. Get to < 20ms and you're down to rounding-error levels. Don't just take my word for it (hobbyist game dev here); ask gamedevs and companies who have conducted testing.

40ms roundtrip lag for most of the world seems pretty achievable to me.

> ping google.com

Pinging google.com [] with 32 bytes of data:
Reply from bytes=32 time=6ms TTL=50
Reply from bytes=32 time=8ms TTL=50
Reply from bytes=32 time=9ms TTL=50
Reply from bytes=32 time=10ms TTL=50

Ping statistics for google.com:
    Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 6ms, Maximum = 10ms, Average = 8ms


I'm on GDC public WiFi.

> ping google.com

--- google.com ping statistics ---
12 packets transmitted, 12 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 74.750/75.566/76.724/0.587 ms

I'm on 1GBit Google Fiber in Orange County, which should be about as ideal an experience as could be expected. Now, considering there would be additional time on top of the pure ping, I imagine there would be 80-90ms of latency for me. No way, lol.

I'm not sure why your pings are that high. I just asked a friend in an Asian country about his pings, and he averages 5ms to Google. Meanwhile, my average is 1ms in the US.

Pretty sure both of us have a Google datacenter nearby, but could the result of your ping be due to a bad configuration on your router?

I imagine there would be 80-90ms of latency for me.

I could totally make RTS and other real-time games work with that.


Unfortunately, any and all trust I had in Google maintaining its services for more than a few years is long gone. I have games from ~10+ years ago I can still download, update, and play via Steam. What are the chances this lasts more than 3 years? In addition to that obvious concern, I'd also be very hesitant to let Google have any control over my gameplay experience. The downsides of gaming as a service just have no appeal to me, although I'm sure it would be functional in certain genres, for a certain type of gamer.

Regardless of how many streamers are playing Skyrim in their browser, I don't think I'll be joining them.

This is actually something that worries me, long term. There are people who still play 30+ year old games; you can still buy a used NES and play classic Tetris if you wanted.

Mario Kart on the Wii still works. Online play is broken though because Nintendo has no interest in running old servers to keep a service in a long discontinued game running.

If any games are developed solely for Stadia, will they disappear if the platform disappears? Even if it doesn't, will they be playable in perpetuity, or will Google shrug breaking changes if it's for games that are over N years old, and have under K users?

I think most things we buy have a lifespan. When I pay $$ to get shoes, I don't expect them to last forever. Sure, it might be nice, but I _understand_ they might wear out after a while. We expect software to run forever, but I think we need to start associating a lifespan with it.

Now, if a shoe only works for a week, I'm not going to buy it. If it works for a year, sure, I'll take it. Similarly, for a digital good (phone, computer, software) we need to come up with a baseline for what an acceptable lifespan is. It can't be infinite, but it should be reasonable.

The big difference here, I think, is that for most things we buy, we mostly have control over their longevity. I can take care of something and have it last longer.

This all ties into ownership. If I own something I have control over it, so I can re-sell it if I'd ever like to re-coup some value or let someone else make use of it. I can lend it to a friend, and have it back to use or lend again. I can give it as a gift to a friend or family member. I can keep it indefinitely as a memory. All of these are reasons why some people still have a NES, GameCube, etc. It's probably not that they've been playing on the console for 30 years. They either kept the NES they owned as kids, or bought one second-hand for the nostalgia, or got it as a gift from someone who knows how much they like classic games, etc.

There are still tournaments for games on old consoles. Old games are perfectly good games; they don't have the same wow factor, but they are as fun today as they have always been. Will future "classic games" be lost to history, or available only when a publisher decides to monetize the public's nostalgia with a re-release?

>> When I pay $$ to get shoes, I dont expect that to run forever.

The better analogy is: when you buy a book, do you expect to be able to read it again in ten years? People who bought first pressings of Beatles albums 60 years ago can still play them with their grandkids. I can still play my old SNES with my siblings at Christmas. But imagine a 2019 game being unplayable in 2029. It's reasonable to be concerned.

Some may remember an earlier streaming game service, OnLive, that crashed and burned spectacularly: https://www.theverge.com/2012/8/28/3274739/onlive-report plus HN discussions here: https://hn.algolia.com/?query=onlive&sort=byPopularity&prefi...

I saw the OnLive demo. It was very slick, and the concept seemed more than viable, it was inevitable.

OnLive came both before the cloud era (they had to manually maintain many points of presence near consumers, and they didn't have the money to do it anywhere near what Google, Amazon, and Microsoft are able to do) and before GPUs were really mainstream in normal data center servers at the scale this type of service requires.

Key quote from the linked article:

> the company had deployed thousands of servers that were sitting unused, and only ever had 1,600 concurrent users of the service worldwide

They were burning through all of their money because they highly overestimated the audience. That's one of the main problems the cloud was made to solve.

This particular cloud will only solve the problem if Google finds a way to sell the time on these AMD GPUs to someone else.

Traditional GPGPU customers, and machine learning people, usually use CUDA instead of OpenCL / Vulkan compute / DirectCompute. At least from my position it looks that way. I'm a freelance developer and my clients pick CUDA in ~75% of cases. The rest is DirectCompute, when computing on client PCs and hardware compatibility is required.

Economics were not the main problem with OnLive; it simply didn't work from the player's perspective.

Right, partially because OnLive couldn't spend the money required to put servers close enough to give users a good experience. It would have taken a monumental amount of money for them to expand in a way that would facilitate that, and to custom-build servers with consumer-grade GPUs in them.

Furthermore, there are other technological advances that have come along even in the past two years that help facilitate this, like the Secure Reliable Transport protocol (https://www.srtalliance.org/).

Things were just too stacked against OnLive at the time to make it work, but most of those barriers have since been removed after they had gone under.

OnLive didn't work (well enough) even if you lived next door to their servers.

Gaikai was one of OnLive's competitors. It had better management and probably better tech as well. Sony bought it out and it still lives as Playstation Now. I believe Dave Perry is one of the founders.

EDIT: https://www.pcgamer.com/gaikai-founder-blames-onlives-pc-foc...

OnLive's patents were bought up by Sony in 2015, and I imagine they now power at least some of PlayStation Now, Sony's streaming service.

Internet speeds in North America have greatly increased since OnLive launched.

I suspect that Google will have better infrastructure than OnLive, and that their customers will have better Internet.

Speed is not the (main) problem; it's latency and jitter, with the latter being especially bad on mobile networks. To add at most 1ms of one-way delay, you also need a datacenter within roughly every 300 km, and that's at vacuum light speed; signals in fiber are slower.
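To put rough numbers on that distance claim, here's a back-of-the-envelope sketch (the constants are approximations, and real routes are longer than straight lines and add switching and queuing delay on top):

```python
# Back-of-the-envelope propagation delay; illustrative figures only.
C_VACUUM_KM_S = 299_792                 # speed of light in vacuum, km/s
C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3    # signals in fiber travel at roughly 2/3 c

def one_way_delay_ms(distance_km, speed_km_s=C_FIBER_KM_S):
    """One-way propagation delay in milliseconds over a straight run."""
    return distance_km / speed_km_s * 1000.0

for km in (100, 300, 1000):
    print(f"{km:>5} km: {one_way_delay_ms(km):.1f} ms one-way in fiber, "
          f"{2 * one_way_delay_ms(km):.1f} ms round trip")
```

So 300 km buys about 1 ms one-way at vacuum light speed, but closer to 1.5 ms in fiber, before any routing overhead.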

Google operates tiny little datacenters in every ISP POP in America. That's why the video I just watched on YouTube in Oakland came from someplace 2ms away, even though Google's nearest real datacenter is in Oregon.

How do you edge-cache games, though? Edge-caching YouTube requires a giant pile of hard drives; edge-caching games requires a giant pile of hard drives and a giant pile of compute power.

(Disclaimer in bio.)

It's not the edge cache itself that matters; it's that many ISPs peer directly with Google.

Google Cloud can route the audio/video/keyboard packets mostly over Google's private network and then only use the public internet once it gets to your ISP (or their transit provider). This provides Google with more control over how the packet gets to the end user.

Google provides a similar service to Google Cloud customers as the "Premium Network Service Tier".


Huh, that's actually really interesting. I was going to say, "but even then you're still dealing with the speed of light", but I guess at the limit it takes light about 5ms to cover 1,000 miles, which should be enough to get from most anywhere in the U.S. to a Google data center. The latency dynamics here are a lot less implausible than I suspected at first, given peering at the local ISP level.

According to Wikipedia, input lag for video games is between 67 and 133ms. Adding an extra 10ms for the network might be acceptable.


Half that value.

If you don't think Google can drop-ship racks of GPUs to major cities, I think you are underestimating them.

All these additional CPUs/GPUs require additional power and cooling capacity.

That's the challenge of edge-computing datacenter design.

Video files (even live stream HTTP based protocols like HLS) can be cached very easily on CDN infrastructure.

However, a gaming session means having a dedicated daemon running for you somewhere. I doubt this can be deployed anywhere on the spot, but perhaps they have some amazing technology for that.

Yeah, but that works because the same video file is streamed to lots of people.

CDNs aren't helpful if every stream is unique.

That's the point, there's no cache possible for interactive gaming sessions. It's a different architecture altogether.

I used the precursor of Stadia, Project Stream. There were moments where the video went fuzzy, but >95% of the time it ran fantastically. My gaming PC (i5 / 16GB RAM / GTX 1080) stutters more than the stream did (I assume that's on the CPU side). I'm sure Google has the compression-algorithm chops to make this all work.

Unfortunately, for me at least, what I want is more akin to a cloud VM where they host my games and I stream them wherever, and this isn't it. I've already got games spread between Steam, uPlay, Origin, etc. I have zero interest in buying a game twice just to have more portable play.

I'm pretty sure you won't buy them (again). You will rent them. Anything but subscription-based pricing model for Google Stadia would be a surprise to me.

No, Stadia is the perfect use case for F2P games. The only requirement to start playing will probably be a Google account.

It'll probably be a mix of pay-once games, lots-of-games-for-a-subscription, and pay-once, play-until-Google-shuts-the-service-down games.

Sounds great on paper but I'm skeptical.

Not only will you need a really good internet connection and WiFi, but you'll also need to be close to a Stadia data center. The best-case scenario will probably be a latency of around 50ms, but I'm guessing the average will be closer to 100ms, which is too much for many types of games (especially the competitive games streamers play).

Heck, even streaming in your local network isn't such a great experience these days.

I love the idea, but realistically we are still very far away from streaming games replacing consoles or PCs. Many companies have tried (Nvidia, Sony, etc) but no one has succeeded for the simple reason that latency is not there yet.

I tried this and found it pretty satisfying. I played entirely on a MacBook while traveling during my winter vacation. It worked surprisingly well even with mixed WiFi; I would compare it to the experience you get from using PS4 Remote Play or Steam Link over Ethernet.

The only time I found it lagging was similar to the times when both of the above lag: during particularly complex visual scenes (i.e. you're circle-strafing around a target and the entire screen is constantly redrawing). I thought it was great for playing a game casually, i.e. story mode. Lots of people use that phrase as a put-down, but the system is well suited for a game like ACO where you are mostly being tactical, planning, exploring, and moving the story forward.

I think of a game like 2016's Hitman: I hesitated to install 30GB of it to my PS4, but if you told me I could drop into the demo/prelude in less than a minute, even at 720p, that's a very appealing concept for somebody like me who plays video games the way other people watch Netflix while eating dinner: basically whenever I have some downtime and want to dip into a story or mechanic I like for ~30 minutes.

> Steam Link over ethernet

Note that even Steam Link over Ethernet is too laggy to play KB/M FPS games at any serious level; the input lag makes them almost unplayable.

I can see the growth of services like these, especially with more gamers being unable to access dedicated hardware, but there will always be a niche for dedicated hardware. The only way I think they could solve that problem with streaming is with some sort of hybrid approach where some of the UI is remote and some is local, with client-side simulation for input.

I perhaps have diminished standards, but I found Steam Link over Ethernet totally serviceable for playing through most of Hyper Light Drifter on a controller. The input lag was fairly minimal and the computer was already quite long in the tooth, but to your point, I wasn't getting 60fps, that's for sure (realistically it was probably a firm 30fps).

I don't doubt that consoles will remain useful, but I think that services like this will satisfy a pretty legit niche for a vast swath of games that aren't really dependent on low-latency input (i.e. puzzle / turn-based / RPG / simulators / board-game conversions) and that are often 'discovered' by people finding letsplays on YouTube.

For some varieties of MMORPGs, I can imagine devs being excited about the reduced surface area for cheating/exploits. For somebody like me who uses a Mac, I'm looking forward to playing a version of Civ that doesn't make my laptop sound like it's about ready to take flight. I don't think the idea is meant to replace consoles, though; rather, it seems like a way to grease the wheels of commerce and get people playing (and buying) games they've traditionally been priced out of because of the not-insignificant startup cost of building and maintaining a gaming PC/console.

I guess my question is: how would something like this scale? With Netflix and Spotify, the media is the same every time you play it, and even stuff like the Black Mirror CYOA has a limited number of combinations, so it's very easy to cache.

Every game has an extremely high number of potential combinations and outcomes, so it's effectively uncacheable. Fan that out to Steam level popularity and diversity of games and it sounds a bit nuts.

The only way is adding more servers closer to the users, which means Stadia will only offer really low latencies to users living close to the data centers, so probably only people in large cities.

I don't really have a need to stream a full game, but your idea about demos actually sounds great. I would like to play a demo instantly, evaluate whether the gameplay/controls are good, and if they are and I'm interested, purchase and download the full game to my system.

I was a beta tester for Project Stream. FWIW, I traveled with a Pixelbook and played Project Stream on hotel WiFi in Seattle. Everything was smooth, a similar (if not the same) experience as when I played on my home WiFi.

The latency was good in my experience. There were occasional resolution drops, but that happened less than once per hour. Mind you, my experience was based on playing ACO, which probably is not sensitive to latency.

I also played Nvidia Shield Now on Mac while it was in beta. I would say Project Stream is a much, much better experience (no need to sign up for a new account is a plus; no client required is a HUGE plus).

I'll try Stadia for sure when it gets to my country, probably in 2020 or 2021 since Google is not very fond of Mexico.

Agreed. Enthusiast gamers fret/obsess over a small difference in FPS, refresh rate and lag especially when it comes to competitive gaming. Until Google can deliver a service that is indistinguishable, it’s DOA in my view.

> Agreed. Enthusiast gamers fret/obsess over a small difference in FPS, refresh rate and lag especially when it comes to competitive gaming. Until Google can deliver a service that is indistinguishable, it’s DOA in my view.

I work for Google, opinions are my own.

I agree but I don't consider this service to be for hardcore enthusiasts. The enthusiasts will continue to buy their powerful gaming machines because they really care about having the best experience.

I think this is great for someone like me, who likes video games but doesn't want to spend that much money on a computer. Before, I was not able to play the latest titles because my computer is 6 years old, but this would make it so I could.

There are probably tons of kids out there whose parents would scoff at the idea of giving them a $1000+ computer but now would be able to play the latest games with this technology.

All that assuming it works, of course, which is no guarantee.

> I think this is great for someone like me, who likes video games, but doesn't want to spend that much money on a computer. Before I would not be able to play the latest titles because my computer is 6 years old, but this would make it so I could.

> There are probably tons of kids out there whose parents would scoff at the idea of giving them a $1000+ computer but now would be able to play the latest games with this technology.

Isn't a console affordable enough in that case?

Presumably this service is not going to be free, and the overall cost for someone who plays a lot of games would probably be similar to just getting a $1000+ PC and using it for 3-4 years. Of course maybe the pricing will prove me wrong.

How are those parents gonna react when their entire month's data plan gets used up in a weekend? Honestly, I would rather see Google try to cut in on the handheld market by creating peripheral controllers for Android devices. Every smartphone in the world is four buttons and a d-pad away from being the best Game Boy ever made.

In the civilized internet world, we don't have any data caps though.

Fair enough and I do want to see services like this and the developments from solving these difficult problems.

This is like saying that car-sharing services like Uber are DOA because there are people who like to own racing cars or are professional drivers.

It's a niche market overall, and each is a different segment you can target independently.

With current internet infrastructure in many countries, this is simply impossible.

Nothing can deliver the ~25ms input latency that local gaming can, but when you can't game locally it'll do.

This is a very valid point. Adding 100ms of latency to games that already struggle with their built-in ~100-150ms is basically a non-starter. So you're left with casual games, single-player games (with no modding), and maybe some less latency-sensitive multiplayer games.

Even the best hardware right in front of you can still currently cause a bad user experience for latency-sensitive games. All it can take is one player with high latency or an unreliable connection, and it can detract from the enjoyability of a game.

And games already have latency from input devices, inputs registering, the game calculating the next frame, the monitor displaying it, and the user perceiving what has happened on screen. Every millisecond of network delay gets sandwiched between all of those.
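To illustrate that sandwich, here's a toy latency budget. Every number below is a hypothetical assumption for the sake of the arithmetic, not a measurement of Stadia or any real pipeline:

```python
# Hypothetical end-to-end ("click to photon") latency budget for one
# streamed game frame. All figures are illustrative assumptions.
budget_ms = {
    "input device / OS": 4,
    "game simulation (one 60 fps frame)": 17,
    "video encode": 5,
    "network round trip": 30,
    "video decode": 5,
    "display (one 60 Hz refresh)": 17,
}

total_ms = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<36} {ms:>3} ms")
print(f"{'total':<36} {total_ms:>3} ms")
```

Even with an optimistic 30 ms network round trip, the network is only one slice of the total; shaving any single stage helps, but none of them can be skipped.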
