Run your own high-end cloud gaming service on EC2 (2015) (lg.io)





You could use EC2, but GeForce Now has left beta and announced pricing, and at $5/month it's much cheaper.

Doing this on-premise is also pretty tantalizing - I watched a video recently of Linus Tech Tips where he built a 64 core Threadripper and used virtual machines to replace four physical computers in his home including two players simultaneously gaming.

https://www.youtube.com/watch?v=jvzeZCZluJ0


>Doing this on-premise is also pretty tantalizing - I watched a video recently of Linus Tech Tips where he built a 64 core Threadripper and used virtual machines to replace four physical computers in his home including two players simultaneously gaming.

So basically a mainframe? I can't imagine it's economically viable though; a 64-core Threadripper costs more than eight Ryzen 3700Xs and clocks lower.


Inspired by Linus Tech Tips and bummed that the new Threadrippers were out of my budget, I decided to build a 4-gamers-1-CPU box based on a Ryzen 3900X (12 cores) and an X570 Taichi, with 4 GPUs in the Nvidia 1060 to 1660 range. (I had to use one of these: https://www.amazon.com/gp/product/B07YDH8KW9 to connect my 4th GPU to one of the M.2 slots.) My total cost was somewhere in the $2400 range.

I personally think it's a super awesome setup, but it's _definitely not_ for the faint of heart. You have to _really enjoy_ debugging crazy shit for it to be worth it.

But even at my "budget" $2400 price, a $20/month game streaming service for 4 users would take about _ten years_ to add up to that.

The economics of a streaming service are really quite killer.


Which OS do you use as a host? What I really dislike about my setup is the dedicated GPU I need just for the BIOS to POST.

//Edit: Using a Ryzen 5 2600 and a Gigabyte X470 Aorus Ultra Gaming


I'm using Unraid (a Linux distro), which uses KVM for the virtualization. If you're using KVM, you can pass through your primary GPU by dumping the vbios and then passing that along when you initiate the passthrough. Passing a custom vbios is pretty easy to do in Unraid, though dumping the vbios is still a manual process. I have to do that in my setup because I don't have any spare slots for another GPU, even a tiny one.
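
For reference, dumping the vbios is usually just reading the card's rom file in sysfs. A minimal sketch of that step (Python purely for illustration; run it as root while the card is idle, and the PCI address is a placeholder -- use your own from lspci):

    # Sketch: dump a GPU's vBIOS via sysfs (run as root).
    # The PCI address is hypothetical -- substitute your card's.
    PCI_ADDR = "0000:01:00.0"
    rom_path = f"/sys/bus/pci/devices/{PCI_ADDR}/rom"

    with open(rom_path, "w") as f:
        f.write("1")              # enable reads of the ROM

    with open(rom_path, "rb") as f:
        vbios = f.read()          # the raw vBIOS image

    with open(rom_path, "w") as f:
        f.write("0")              # disable reads again

    with open("vbios.rom", "wb") as f:
        f.write(vbios)
    print(f"dumped {len(vbios)} bytes")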

Thanks! That leaves me thinking, though. I'm also using Unraid, and the main GPU is exclusively used for my VM passthrough (it's a Radeon). I thought that once the BIOS claims it during boot, it won't be freed again for passthrough, hence the GeForce GT710 for the BIOS. If I could free that, I could host another gaming setup.

You can definitely run another gaming setup through that. Here's what you need to do:

- Follow the instructions in this video for getting a dump of your vbios: https://www.youtube.com/watch?v=mM7ntkiUoPk (you can stop once you've gotten the vbios)

- Make sure your Unraid is updated to at least 6.7

- Read the "New vfio-bind method" section of: https://forums.unraid.net/topic/80001-unraid-os-version-67-a...

- Use the knowledge gained from that to add the IOMMU group assigned to your GT710 to /boot/config/vfio-pci.cfg

- Reboot your Unraid server

- Do the normal gpu passthrough thing for the GT710 for a VM, but add the dumped vbios to the "Graphics ROM BIOS:" field in the VM "edit" gui

Hopefully it should work :D

One thing to keep in mind about the vfio-pci.cfg file is that it's effectively a blacklist, and if you do something that could change your IOMMU group assignments (such as adding or removing a PCI device) you could end up inadvertently blacklisting a PCI device you don't intend to. All you need to do is update the IOMMU groups in vfio-pci.cfg to fix it, but it can freak you out if you're not expecting it.

(For example if I remove one of my GPUs, one of my SATA controllers will inevitably end up getting the IOMMU group that _used_ to belong to a GPU, so it'll get blacklisted, and two of my array drives will appear missing until I update the vfio-pci.cfg to match the new IOMMU groups)
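
If you want to sanity-check the assignments after a hardware change, everything is visible under /sys/kernel/iommu_groups. A quick sketch (Python, just for illustration) that prints the current groups so you can update vfio-pci.cfg to match:

    import os

    base = "/sys/kernel/iommu_groups"
    for group in sorted(os.listdir(base), key=int):
        for dev in os.listdir(f"{base}/{group}/devices"):
            print(f"IOMMU group {group}: {dev}")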


Thanks a lot for the write-up! Once I have a tad more time on my hands, I'll tinker around a bit. :)

I'm using Linux and configured the kernel to basically entirely disown the GPU from the very start (blacklisted kernel modules, disabled framebuffer). After that, I'm able to passthrough the GPU to a VM.

Thanks for the hint - I always thought it was the BIOS grabbing the GPU and not releasing it again for the VM. I might need to read into some stuff.

Is your complete build posted somewhere (for example PCPartPicker)? Thanks

"Economically viable" isn't very glamorous these days. Or difficult.

You can pick up a used i5-3xxx or 4xxx system with 8GB of RAM real cheap. If you get one that has at least a 300W power supply, you can then put in a GTX 1660 or similar and do quite well with most games. Max settings @ 1080p for some older games, and still decently playable at decent quality for the newer ones. And all for $300 USD or less.

I just put an RX 580 with 8GB of video RAM into my old i5-3570S system. The card was just $135 at Microcenter, because it was open-box.


> The card was just $135 at Microcenter, because it was open-box.

Even then, at $5/month you'd need 27 months (2 years and 3 months) before the $135 card becomes more cost-effective than GeForce Now. And that's assuming you already have a desktop with a powerful enough CPU.

Personally, I started doing cloud gaming mostly because I was on a netbook at school (I liked the form factor, and the battery life was quite nice versus most laptops). At first on OnLive, then on LiquidSky, and for the past year, on GeForce Now.

I don't believe the service will stay at $5/month though; both services I used shut down because they weren't viable, and they were in the $10 range. LiquidSky was supposed to come back, but they've missed multiple release dates now, and with the release of GeForce Now, I have no hope. Shadow seems more viable at $25/month, which makes your solution much more cost-effective.


Shadow recently dropped their price to $13 USD a month for a year contract. The $34 USD / month for month-to-month is still pretty steep.

>I can't imagine it's economically viable though

Well he also built a 6 PC watercooling loop, so yeah not all of it is particularly applicable to the average viewer


The full-room water cooling was one of my favorite LTT projects. It might be inapplicable to most users but it was a pretty good idea. (If I remember correctly it had a lot of problems and never really worked, but I appreciate someone trying it out.)

Same here. That's usually the kind of video I watch, 'cause I'm not that interested in seeing benchmarks of the latest RGB memory sticks (or any other memory sticks), GFX cards, or headphones for that matter.

When I saw his LAN gaming center I thought it would be cool if he'd retry that project for it. :->


> I'm not that interested in seeing benchmarks of the latest RGB memory sticks,

With the kind of workshop they are building with Alex, it's clear they are aware of that. Let's just hope they can do more of them quicker!


I don't even know what his build ran at in total, but he had $1700 (x2!) USB repeaters using fiber optics to directly connect USB peripherals all through his house, half a terabyte of RAM, and 4 GPUs. I think you could build a lesser machine to cover most household computing needs.

I don't think the core speed would matter much - Steam reports 37% of gamers are using 3 GHz-and-below processors, so they may be mediocre but still competent:

https://store.steampowered.com/hwsurvey/processormfg/


If you live with 2 or 3 other SWEs this is an attractive option. You have enough PCIe lanes to pack in 4 GPUs. You have enough spare horsepower to also host basic home services like a file server, a GitLab instance, home automation stuff, a VPN, etc.

I think personally I'd opt for building four or five separate machines and managing them as a cluster though.


> If you live with 2 or 3 other SWEs this is an attractive option.

In some sort of software engineering commune?


Most 2 and 3 bedroom apartments in San Francisco are software engineering communes, yes.

Google intentional living or intentional community.

Or just, you know, having roommates.

But you can't write a blog about re-inventing an old concept if you don't give it a trendy new-age name.

The new age name was commune, I believe.

If by "new age" they mean 1871...

I've heard of a few people living like this, especially in areas with very high rent. I'm assuming it's not extremely uncommon and it doesn't sound like the worst way to live.

Are people really this unfamiliar with the idea of having roommates? I'm 26 and it's been a necessity for my entire adult life since moving out of my parents' house. I could buy a house in the city I currently live in, but I'd be spending >50% of my (relatively large) income on mortgage repayments, and I'd be utterly screwed if I ever hit the "housing prices crash" + "no job" combo.

I know students in the US sometimes have roommates. I didn't think many adults did it in the West, let alone well-paid professionals.

I think it's more common now among tech employees. It's also probably less common in areas with cheaper housing. Personally I feel this kind of shared housing situation would probably be much more enjoyable than living alone.

It's been great for me. But it heavily relies on the people you're living with being generally reasonable.

I've also found it fun to live with people with varying interests, living with just other software engineers might be boring. Common hobbies are great though. Imagine every night being a lan party.


Why would this be attractive to SWEs? Seems like something that would be more relevant to gamers.

Half of the SWEs I work with don’t game.


My workstation is set up like this, with one giant LVM pool for storage and two GPUs. It gives a few advantages:

- Can run two OSes at one time, each with half the resources
- Can run one with full resources if needed
- Can have multiple Linux and Windows installs
- Can have snapshots of installs
- Takes around 5-10 seconds to swap one of the running installs for a different install (sketch below)
- Can run headless VMs on a core or two while doing all the above, i.e. a test runner or similar service if needed

I use a 49" ultrawide with PBP, have one GPU connected to each side, so the booted installs automatically appear side to side, and Synergy to make mouse movement seamless, etc, etc

It took a little work to set up, but I've worked this way for ~ 3 years now and never had to think about the setup after the initial time investment, and during upgrades. Highly recommend it.
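
For the curious, the 5-10 second swap mentioned above is basically just a shutdown/start pair. A rough sketch using the libvirt Python bindings (the domain names are made up -- use your own):

    import time
    import libvirt

    conn = libvirt.open("qemu:///system")

    def swap(old_name: str, new_name: str) -> None:
        old = conn.lookupByName(old_name)
        if old.isActive():
            old.shutdown()                # graceful ACPI shutdown
            while old.isActive():
                time.sleep(1)             # wait for the guest to power off
        conn.lookupByName(new_name).create()  # boot the replacement install

    swap("win10-gaming", "linux-dev")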

I can definitely see the advantage for a small team of having a single large machine with multiple GPUs: let people sit at a thin workstation and "check out" whatever install they want to use, with however much CPU power and RAM they need, clone and duplicate installs to get a copy with their personal files in home, ready to go, and check out larger slices of the machine when they have more CPU/GPU-intensive tasks. That's probably my ideal office machine, after using my solo workstation for a while.


Any guide on this? I would love macOS and Windows side by side.

The Arch Linux wiki page [1] is a good place to start (note you don't have to use Arch for the host to get value out of the page; I use NixOS), and/or the macOS repo [2] (or maybe this newer one [3]).

The only real hurdles hardware-wise are your IOMMU groups and your CPU compatibility; if you have a moderately modern system it shouldn't be a problem.

I also have a couple of inexpensive PCIe USB cards that I pass through to the guests for direct USB access, highly recommended.

The guides will use qcow2 images or pass through a block device. As I mentioned, I have a giant LVM pool; I just create LVs for each VM, pass the volume through to the VM as a disk, and let the VM handle it. On the host you can use kpartx to bind and mount the partitions if you ever need to open them up.

[1] https://wiki.archlinux.org/index.php/PCI_passthrough_via_OVM...

[2] https://github.com/kholia/OSX-KVM

[3] https://github.com/yoonsikp/macos-kvm-pci-passthrough


You don't do this just because you like playing games. You do this because you're primarily an IT dork who likes building crazy setups. Maybe you'll play a few games with your friends on it later.

This is like the Arch Linux of computer hardware. You don't install Arch Linux to get shit done. You do it because you like building and configuring, and to learn about Linux.


Wrong point about Arch though. You can get a lot of shit done with Arch since it has the AUR and is not forcing you to reinstall anything from scratch from an old distro and kernel combo every 2 years.

And it doesn't surprise you with something - or some combination of things - that are installed. If you need it, you put it there, otherwise it's not in your way.

I'm on Mac at the moment and frequently pine for Arch. 'What on Earth just happened...', 'What did that...', 'What is this...', 'Well that wouldn't have happened on Arch.'

People claim Mac 'just works' because of out-of-the-box readiness, well I'd say in contrast Arch 'still works' or 'keeps working': it does what you tell it and if you don't tell it to change, it stays working in the same way.


I am starting to get somewhat disenchanted with the whole “the Mac just works” thing. I suppose their recent dip in QA effort is to blame to some extent. But also, whenever something doesn’t work, and Apple doesn’t care enough to fix it: tough luck. On Linux you at least have options if you’re a technical person and are not afraid of the command line.

I just wish I could pay a company that would have engineers and designers building an actual user-friendly, stable OS on top of Linux. Mac OS, with all its faults, is still miles ahead in terms of usability and polish.


Fair. I don't install Arch to get stuff done, I decided to stick with debian derivatives and xfce for that, but not everyone is the same.

The parent comment was likely referring to using the clients as workstations, not necessarily gaming computers. Though the cost of this server and the fiber-optic USB peripherals is crazy; one of those fiber USB repeaters costs as much as a high-end computer.

> One of those fiber usb repeaters costs as much as a high end computer.

Not if you "borrowed them from work for testing"...


> and clocks lower

And that's the core problem, isn't it? Aren't games notorious for being mostly sequential workloads?


There's no reason they have to be. Game development is slowly but surely migrating toward fully capitalizing on high core/thread counts.

I'd be interested to know if this has changed recently - the Xbox One and PS4 are both 1.6GHz/1.75GHz 8-core machines.

When you're designing for that known console configuration you probably make use of the extra cores if you need it. And the really cpu intensive genres (strategy games mostly) don't tend to get console releases.

As 4-core/8-thread machines become basically the minimum you can assume most PC gamers will have, we'll probably see devs making more use of multithreading everywhere.


> the core problem

I see what you did there.


Linus's setup doesn't make much sense economically but the concept works. You don't have to go to the extreme end of available CPUs. 64-cores for 4 machines = 16 cores per machine, which is overkill, especially for gaming.

If you go for a 12 or 16 core Threadripper like the 2920X or 2950X you're only looking at $425/$690, which is $106/$173 per "PC".

Combine that with the savings you get from not having to buy 4 cases, PSUs and motherboards and I think a multi-head threadripper setup will end up significantly cheaper than buying 4 machines.


The CPUs alone are cheaper, yes, but there is room for savings in hardware consolidation. Only one PSU, one case, etc. The effect may be small compared to peripherals and adds inconvenience by tying multiple user experiences to a single point of failure, but it’s still a neat idea.

Sadly GeForce Now doesn't exist in Australia yet, while EC2 does -- though I bit the bullet and bought a Razer Core X external GPU enclosure instead after using streaming services (specifically Parsec) for years.

GeForce Now is fantastic. I'm not a serious enough gamer to warrant buying a PC, and I do all my work on my MacBook Air 2013. It's pretty cool to boot up a GeForce instance and get to use all the resource intensive games (games like Civ, Cities Skylines) without my computer frying my legs off.

I tried a round of FPS with it. It wasn't that bad, honestly. My ping to the GeForce server was about 12ms, and the ping from GeForce to the game server was < 2ms. That's actually lower latency than I'm used to with an Xbox.

It did feel a bit laggy, though, despite the low latency. But I'd try it again. For me the benefit is in resource-intensive single player games.

Either way, for $5 a month, it's a great deal!


I’ve actually sort of done this exact thing on premise. I run a full VMware setup at my house with networked storage. I’m currently just running some Ryzen 2700s for the hosts with plenty of ram for each, but I did decide to just buy a graphics card and pass it through to a windows VM.

I then bought a dummy HDMI plug, and installed some open source software called Moonlight on my Mac and Linux systems.

I can stream my games locally no problem from my VM. Eventually I want to get wireguard set up, and test performance over VPN, but I really haven’t had a reason to try it just yet.


You might want to check out Parsec, which in my experience is much smoother than Moonlight was.

The one thing that is preventing me from going the virtualized route: peripherals. Shadow lets you lock USB peripherals connected to your local device to the VM, but for the life of me, I can't get HORI's firmware update for the Assault Commander FFXIV mini keyboard to work properly (I have an Orbweaver Chroma too, but I really like the stick placement on the HORI). Haven't even tried the mapping software yet.

If I can figure that out, may be in love.


That's fun but not particularly realistic for most.

The direction we're going in is massively parallel computing, not massive chips. Threadripper represents the pinnacle of individual chips, but I'd imagine having 4 boxes that are each a 3800X + 1 GPU + 128 GB RAM would be cheaper and easier to maintain, both hardware- and software-wise? Except it wouldn't sound as fun.


Still no Linux client with GeForce Now.

I wonder if using Wireguard could improve the performance of a custom solution.


We have a Linux client on Parsec (https://parsecgaming.com/downloads). You can install Parsec on any gaming machine (PC or VM) and use our game streaming software to play from your Linux machine

I tried GeForce now. Maybe it’s because I am a fairly heavy casual FPS gamer, but the input lag was very noticeable. Neat concept and I hope it can get better!

Did you pay for priority access, or was this with the free version?

Yeah it was priority. 90 days free trial.

Yeah, do not do this. You'll end up with a $1000 bill from Amazon because you forgot to shut it down, and discover very quickly they don't care at all once they already have your money.

FWIW, I did that once for an unrelated project and they ate the $1200 bill for a running EC2 instance.

I dunno if they'd eat that cost again, and I wouldn't try it on purpose.

But keep in mind that they have an incentive to do this: it's the cost of acquiring users without scaring them about a possibly humongous bill.

I now have better billing alarms set.


They should just add a "shut down at cost X" option, for private users at least.

https://aws.amazon.com/getting-started/tutorials/control-you...

Under section 4, anyone can set up an AWS cost budget.


Yes, you can set up a budget but it's only used for alerting. The services will happily keep running and building up to that 1k bill if you miss the alerts or don't react to them.

Those alerts can invoke a Lambda that shuts the service down, which is a ridiculously baroque solution but it's what you get.
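
If anyone wants to wire that up, the Lambda itself is tiny. A hedged sketch (boto3; it assumes the budget alert is delivered via an SNS topic the function subscribes to, and it just stops every running instance -- narrow the filter to taste):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def handler(event, context):
        # Find everything currently running...
        resp = ec2.describe_instances(
            Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
        )
        ids = [
            inst["InstanceId"]
            for res in resp["Reservations"]
            for inst in res["Instances"]
        ]
        # ...and stop it before the bill grows any further.
        if ids:
            ec2.stop_instances(InstanceIds=ids)
        return {"stopped": ids}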

Wow, they really made it complex, didn't they? On one hand they advertise how anyone can spin up a VM and connect to it but you still need to do some serverless black magic to keep your budget in check. Anyway, thanks for the heads up. I'll definitely try that out.

The aim of the game is for you to be depending on even more AWS services to help you use the ones you already have

> ridiculously baroque solution

It seems to follow their practice of building the “API first” solution. By the way, a lambda/programmatic solution allows you to disable lower priority resources and retain higher priority resources. It’s a lot of effort, but it’s incredibly flexible.


Additionally billing alarms have a max precision of a day, so if you manage to really screw up you can rack up a huge bill before the alarm goes off.

You can set up billing alarms with thresholds which send you sms/email alerts

Yeah, that's not a terrible idea. But I work with tools that could, like, cut my thumbs off... I feel like this is still pretty nerfed up by comparison.

>and discover very quickly they don't care at all once they already have your money.

That hasn't been my experience with AWS but then again it has been years since I allowed such a thing to happen.


Setting up a CloudWatch alert to email you when the bill goes over $10 in 6 hours is a good measure (of course you probably won't get the alert for 24 hours because of Amazon's billing update delay).
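
For reference, a sketch of that alarm via boto3 (billing metrics only live in us-east-1 and require billing alerts to be enabled first; the SNS topic ARN below is a placeholder):

    import boto3

    cw = boto3.client("cloudwatch", region_name="us-east-1")
    cw.put_metric_alarm(
        AlarmName="estimated-charges-over-10usd",
        Namespace="AWS/Billing",
        MetricName="EstimatedCharges",
        Dimensions=[{"Name": "Currency", "Value": "USD"}],
        Statistic="Maximum",
        Period=6 * 60 * 60,        # evaluate over 6-hour windows
        EvaluationPeriods=1,
        Threshold=10.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],
    )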

Or just spend $500 one time on a local rig that can max out 1080p at 100+ frames instead of paying more per year for less.

I know this is for private use, but the only point of game streaming is so that companies can put ads in the stream.


My son plays about 6 hours of BeamNG and X-Plane a week. It’s much cheaper (and faster) for him to play on an EC2 instance with Parsec at $.50/hr than to invest the money and space in an equivalent gaming rig. And he can play it wherever we go.

Setup details: I use Paperspace since it has an image with everything already configured (doing this yourself is surprisingly tricky; I was never able to get GPU drivers installed and configured myself after hours of trying, including various AMIs). It has auto-shutdown after an hour of inactivity. I use VirtualHere to forward a joystick to the host, and Parsec for streaming. It works great, and I pay $0.50/hr plus $5/mo for storage. Over WiFi with the cheapest Comcast plan, the latency is about 30 ms, which is fine for those games.
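
The auto-shutdown piece is the part I'd recommend to anyone rolling this themselves. A rough sketch of how such a watchdog can work on a Windows guest (ctypes + the Win32 GetLastInputInfo call; Paperspace ships its own implementation, this is just the idea):

    import ctypes
    import os
    import time

    class LASTINPUTINFO(ctypes.Structure):
        _fields_ = [("cbSize", ctypes.c_uint), ("dwTime", ctypes.c_uint)]

    def idle_seconds() -> float:
        info = LASTINPUTINFO()
        info.cbSize = ctypes.sizeof(info)
        ctypes.windll.user32.GetLastInputInfo(ctypes.byref(info))
        return (ctypes.windll.kernel32.GetTickCount() - info.dwTime) / 1000.0

    while True:
        if idle_seconds() > 3600:            # an hour with no input
            os.system("shutdown /s /t 60")   # shut the instance down
            break
        time.sleep(60)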


Really happy to hear you and your son are getting a lot of value from Parsec's streaming tech. Let me know if you have any suggestions/ideas to improve the product. Thanks!

I just set it up last night. Great stuff, but fix your manual.

Concretely, update the “solutions for cloud rented pc’s” section at this link: https://support.parsecgaming.com/hc/en-us/articles/115002601...

(Which the client points at)

On Paperspace, you need to set up a paid public IP or deal with higher latency. Now I know what ZeroTier is, but my network latency was 100 ms under load vs 30 with the public IP...

It took me four hours to debug this. Also, why does the server send every other UDP packet to a mysterious port on startup? (According to tcpdump on my router)

Also, regarding the question about broken mouse cursors: the correct answer is to close the Paperspace web tab displaying the desktop of the instance.

Cheers!


Thanks. We definitely fall behind once in a while on the support articles. Fixing it now.

Overall, your documentation is excellent; otherwise I wouldn't have bothered nitpicking it.

I’ll definitely recommend parsec to friends. I got rid of my last windows box a while back, but I have a few windows-only steam titles that I’d like to play.

Moving forward: my gaming desktop is something like 25% of the household electricity budget. I hate wasting all that power, so I really want to replace it with a thin client.


Nice to see someone from Parsec here! USB forwarding like VirtualHere would be nice, so that I don’t have to fiddle with that every time I start the instance.

IIRC I tried the Parsec AMI but it was difficult to use and required a lot of fiddling. I can’t remember the details unfortunately, but I would have preferred to just use that instead of Paperspace (this might be a niche use case since I’m an engineer). If I could have figured that out, I would have written a CLI around it and open sourced it for others to use.

Also because I’m an engineer, I really wanted to read the source code to see how it worked. But I get that it’s your magic sauce :).

Thanks for the awesome tech!


One other thing: Your onboarding flow at the paperspace blog still points to parsec.tv. It made me wonder if you went out of business until I found the new download links.

https://blog.paperspace.com/setting-up-your-cloud-gaming-rig...


$0.50 an hour only amounts to 200 hours of gaming for the price of a rig capable enough to play games at the qualities that are going to be streamed. That plus $0 for Hamachi ($50/year if you need more than 5 computers on the network) works pretty well. I've tried Paperspace, and the performance was about the same. I thought the Shadow streaming service was superior to all the other options I tried, but also far more expensive for not enough of a difference.

It might make sense for some people, I'm not saying that everyone should do this.

My son's young enough that he still goes through phases. That and he leaves every summer to go to his grandparents'. Sticking with cloud gaming gives me the freedom to just throw everything away or stop using it for long periods of time without guilt.

Also, it honestly makes me feel better to know that I didn't buy a bunch of fancy hardware to use 3.6% of the time. AWS rents the hardware out to somebody else when I'm not using it, and is strongly incentivized to get its utilization as close to 100% as possible.


Have you tried alternatives like Stadia and GeForce Now? I'm curious to hear your thoughts. I have a gaming PC with a 4770K and a 1080 Ti. I was considering upgrading it with the excuse that it would be my ML workstation (this is the lie I tell myself), but with things like Colab Pro coming out, I'm wondering if the future (for my use case) will just be renting hardware.

I did try GeForce Now, since it has BeamNG. From what I could tell, it was about the same as Paperspace, except I didn't have access to the underlying OS so I couldn't install mods outside of Steam. It doesn't have X-Plane. It's slightly cheaper and easier to use since you don't have to manage the lifecycle of a host.

And Stadia doesn't have either game.

If you run your own host, you can play whatever game you want, and mod it however you feel like. Otherwise, you're stuck with whatever curated stuff the platform provides.


> with the excuse that it would be my ML workstation

Ah, I see we're alike.

I'm glad I went ahead and got a gaming rig from Best Buy for $900. Because after playing legacy games at extremely high frame rates for about 2 weeks, I proceeded to spend all my spare time on a random JavaScript game.

At this rate I'll get to my ML workstation and train my own GAN by 2035.


Stadia is a complete non-starter for me because of a highly limited selection and no ability to bring my Steam & GOG library into it.

Or maybe you're in a small apartment without space for a desktop and don't have time to game that often so this is actually a better, more cost-effective solution?

Everybody's needs are different -- there's no reason to dismiss this.


I find it hard to imagine I’ll ever live in an apartment where I won’t make space for a desktop.

But my experience with cloud gaming on EC2 has been pretty great too, as I’m getting older I feel less and less inclined to shell out for a new video card every two years.


As long as you are fine with 1080p, cards from 5+ years ago are fine. 4K is generally not worth it.

1440p @ higher than 60 Hz refresh and 4K@60 are very well worth it these days, especially on a nice LG OLED with G-Sync ;)

HDR is pretty sweet, too.

HN is horrible for gaming/PC building advice. There are much better places.


> 1440p @ higher than 60 Hz refresh and 4K@60

I've tried 1440p @ 144 Hz and 4K @ 60 Hz, and I'll take the 144 Hz any day.

> HN is horrible for gaming/PC building advice. There are much better places.

Absolutely. I come here for my tech news and discussion, but gaming? No way.


That's what I have, and a 1070 is still fine for most games.

I'm not trying to take away from game streaming, but I find a sub-$1000 gaming laptop can let you play a LOT of PC games (the laptop being your smallest form factor and an all-in-one package with a display, etc). A budget gaming laptop really helped me get away from my MacBook, which basically killed the gamer in me for years (can't play shit on it).

Last time I tried GeForce Now in beta, there was a very perceptible delay in input. Not sure if that’s still the case now.


Everyone going to this would ruin games. These streaming services will always have compression artifacts or jitter, which kills immersion. Worse, they will be more expensive over time than hardware. Even worse, your games can get deleted and you won't own anything. Worse still, there will be ads put into the stream.

Price comparison will be variable. If you're chasing the highest specs and framerates, you're spending more than streaming would cost. As for artifacts, I played most of Assassin's Creed Odyssey at the highest settings through Shadow and noticed barely any difference from the gaming rig I have running it at the same settings. And I still owned the game; it was installed from my Steam library. Not owning is a legitimate concern with something like Stadia, but not most other services. Ads in streams? Again, maybe for Stadia if they release a free version, but for most services, where you're basically buying a friendly front end to something like EC2 and running from your own library of games, no - ads are not an issue. You're already paying for the commodity computing, and Amazon or whoever will no more insert ads into game streams than they would into an RDP session.

> the only point of game streaming is so that companies can put ads in stream

I don't want to own gaming hardware - I don't want the space taken up, the hassle of the upgrades, and I'm a super-super-casual gamer so I don't want to invest in anything. It sounds perfect for me and I can't wait for it to be working well!


It’s never going to work as well unless you are playing the kinds of games where latency doesn’t matter.

Yup. Tried DIY cloud gaming a few times, almost signed up for Shadow/Google, but then just bought a $200 card and now I'm able to play all games, from any store I want, for the next 2-3 years.

In the case of AWS/cloud it's not just gaming, but also always thinking about AWS/technical stuff.

If you can afford to buy all your games on Steam and have gigabit internet - go ahead and use a cloud gaming service. But you'll see the video compression, and you won't be able to play the great game you once bought at Humble Bundle.


I think the biggest thing Stadia is trying to enable is being able to play all games on non-Windows systems. You could play a desktop game on your Android phone or even on your Chromebook.

I run a linux desktop with windows installed as a kvm guest with GPU passthrough which gives near bare metal performance.

Do you have a link to a guide for this or more info? This is really interesting.

My main desktop is dual-booted, but I never really boot into Linux since I often game, although I really prefer the Linux interface+desktop. If Windows could be a KVM guest and GPU passthrough worked, I would def go with that option.

edit: I found this, which seems to give a good overview, but I'm still curious if you have any tips/hacks for getting it working.


It's kind of difficult to set up, but you will learn a lot if you don't have much virt experience. I recommend trying to share as little as possible. Pass through a whole PCIe USB hub to the guest instead of separate devices. Buy a second PCIe NIC. Use a cheap AMD card for the host (better for Linux) and NVIDIA for your guest. Don't try to share your motherboard audio; just use the passed-through card's audio out over HDMI and DisplayPort. Get an A/B switch to change which machine your KB/M is on instead of software. Basically, only share your processor and memory. I have Linux on an M.2 NVMe and my guest has a big SSD all to itself.

And the hoops you had to jump through to do that are one validation of why products like Stadia might be in demand.

The whole point of doing that is to not have to run dual boot or have Windows on your bare metal. Do you think someone who cares about that wants to give Google absolute power over their gaming?

Building a rig and just putting Windows 10 on it and installing Steam takes about an hour if you have done it before, and about 6 hours if you haven't and have a tutorial. You could also just buy a prebuilt rig for 10 percent more.


I have an eGPU and I find Geforce Now to be more convenient when I want to stream to TV, and when I don't want to reboot into Windows which is often. It's actually so convenient I often wonder if I could just use a Shield TV for PC gaming, the only downside is several of my favorite games are not yet playable on Geforce Now.

I have a similar setup and use SteamLink to stream to my shield for a lot of games. Works a treat for most of my games which are on Steam anyway.

What card?

Perhaps, but this article is ridiculous and great.

There are a FEW legitimate reasons to go this route, though not many good ones, but I have to commend this article for true hacker spirit. It's lovely!


There are about 10 major game streaming services available now, and none of them inject any advertising as far as I know.

Which ones are you referring to?


If Stadia goes for a free tier, that's the most likely candidate. But I can't imagine any service where it's basically a friendly front end to an EC2 clone trying to insert ads.

It's not about ads, IMHO. At least in Google's case, I bet the biggest thing in it for them, from a long-term perspective, is the most literal gamification of crowd-sourced AI training.

And then lug the local rig around the country for the occasional game?

This is assuming that wherever you're going to be has enough bandwidth to support streaming games.

Hotel internet seems to be about 10 years behind the current technology at all times.


Yea, this is where I envision this working best for me. I travel for work and have been playing a ton on RetroPie lately. I wouldn't mind being able to bring an SBC of some sort and being able to play higher-end games on the hotel TV. I don't like lugging around a bunch of extra stuff.

You can build a 1080p rig that is about as big as a Nintendo Switch.

Now I'm curious: GP's ~$500, let's say at most double the volume of a Switch + dock, even if we exclude the screen and input devices, seems like quite a tall order.

You can build some fairly impressive SFF PCs, but even an InWin Chopin breaks that volume budget if I've done the math right, and it's not a cheap hobby. To me they feel like a class above this.


You could definitely go eGPU + Intel NUC and tape them together for 2x the Switch + dock. M.2 NVMe drives are tiny, and I think some boards take laptop memory. You really only need the equivalent of a GTX 970 to do 1080p, which probably exists in PCIe x4.

Intel's NUC line has a model called Hades Canyon that is an awesome gaming rig for its form factor. 4K at mid-range settings, 1080p at anything you can throw at it. (And I'll be honest... maybe I'm getting old, but from 2 feet away I can't tell the difference between 1080p and 4K.)

And double the "just spend $500 one time" initially suggested as being the solution.

Not quite double, I waited for a ~$700 deal, but yeah, you pay a premium for the form factor. If you aren't looking to optimize both performance and small form factor, $500 is reasonable. Less if you can recycle an old case, psu, etc.

I personally can handle putting a desktop in a 1BR apartment for $200 less.

Yeah, if you're smart about these things, know where to look, and don't mind some DIY, an adequate desktop can be found for $200. A game streaming service would be perfect for that sort of setup. But I still think the price point for streaming is wrong for mass adoption, and for anyone with even minor gaming interests already (e.g. you've bought a decent number of games over the years), being able to use your existing library of games is an essential feature.

Wrong subthread, small apartment isn't regular travel ;)

But I don't think there's need to argue this further, I think the point that there isn't a one-size-fits-all solution has been made exhaustively, by everyone.


I barely find Steam Link playable on my wired home network. There is no chance Stadia is playable on hotel Internet.

Yeah, I didn't want to build, so I sprang for an Intel NUC Hades Canyon, and it was a significant, massive upgrade from my 3-year-old gaming rig. And it streams through Hamachi like a champ.

Yep, this plus Hamachi is my personal streaming server.

If you're thinking about doing this, it's actually way easier with Parsec, there are even scripts to set it up for you - https://github.com/jamesstringerparsec/Parsec-Cloud-Preparat.... It'll also set up an auto-shutdown script that prevents you from leaving it on and racking up a huge bill

Thanks for sharing Parsec! I'm the co-founder. It's really great that the script is helpful.

I really love the product; it's super fast and dead easy to use. Have you folx thought more about Android TV? I'd love to get it working on Shield in a way that's usable (right now you get to a Windows login screen and you're stuck b/c you only have a controller)

Parsec stopped providing the machines though. So now it’s actually quite inconvenient.

Why? The vast majority of us here can spin up an EC2 machine and run a script, that's literally all you have to do

I have a comparison between NVIDIA's service and an equivalent setup on Azure here:

https://taoofmac.com/space/blog/2018/09/30/1600

The setup I used is here:

https://github.com/ecalder6/azure-gaming

...and has been kept up to date (it was annotated last month, and the config is stable).


> €1.4 an hour for the CPU/GPU alone, plus a couple more Euro for hard disk usage and long-term storage (usage was pretty high during setup, which accounts at least for half the amount)

The Azure solution sounds much more expensive than the prices quoted in the article


At the time I did not have access to preemptible instances, which lowers costs considerably. You did not quote the next few paragraphs...

I have a gaming / home server at home. Using Unraid, it serves a gaming VM, a NAS, smart home (with Home Assistant and Node-RED), ad blocking (secondary Pi-hole), a media server (Plex) and a UniFi Controller.

It's based on a Gigabyte X470 and a Ryzen 5 2600. The rig is watercooled (using ZMT tubing), which makes it super silent (and I don't need a heater in my office any longer; it's amazing how efficient watercooling is - yet the heat still has to go somewhere).

The GPU is a Sapphire Pulse Vega 56. All in all, the complete rig consumes 80 watts at idle, and 130 watts when I'm using the gaming VM for surfing. When gaming on it, consumption can go up to a reasonable 500ish watts.

I'm also using a ThinkPad X230 as my "roadwarrior".

While "the cloud" might be cheaper, I love having everything "at home".


The paradigm pendulum continues to swing back and forth between geographically centralized and distributed computing.

Curious if it would be possible to do this setup with VR as well.

I doubt it: for VR, latency and jitter are much more important than for other games. My GPU (an old R9 290) has ~15 ms of delay in the encoder when using Parsec for streaming. Add a few ms in the decoder (1 ms), plus whatever the network/Internet adds to that. Compression artifacts on a flat game are also quite noticeable, and are the reason I have not yet put my desktop next to my home server and gone fanless in the study room. Oh, and that's with 1080p/60Hz at 20 Mbit - my Vive has a higher resolution and refresh rate ;-)

I connect an Oculus Quest to my shadow instance to play desktop VR games on steam. It works great, and I don't have any issues with latency.

I suspect that a lot of the people saying that sort of thing is impossible just haven't tried it. Most of my friend group uses Shadow for everything nowadays, and I can't see myself switching back to maintaining my own hardware.


Yep! It's very common in the oculus quest community. It's not on EC2 though. The best supported cloud service is called Shadow

I don't see any reason why it shouldn't be possible, but VR is extremely latency sensitive so I think the experience would suffer pretty badly.

VR streaming over local 1 Gbit Ethernet is certainly possible -- I use Virtual Desktop to stream PCVR to my Quest, and I wouldn't say the latency is noticeable, at least for me anyway. It's not really playable on 2.4 GHz (so, max ~300 Mbit/s) wireless but is nearly flawless on 5 GHz (max ~840 Mbit/s?), although the stream itself maxes out at 50 Mbps. Reported stream latency is about 15 ms.

FPS games like Skyrim work great, even latency-sensitive games which require quick reactions like Thumper do pretty well. I get lower latency / better image quality using Quest+VD than I do a standard 1080p stream on my Steam Link.....

I think the fact that the VR video stream is already in a 180-degree stereoscopic format helps with the latency; you're only ever looking at a subset of the stream, so when you move your head around it's the same latency as local VR.

The only noticeable thing is that if you move your head really quickly to look over your shoulder, it can take a few milliseconds for the footage to appear.


This is a nice article to prove it can be done, but honestly I would rather set up a dedicated gaming rig. If you're serious about gaming you'll soon rack up enough hours that your EC2 spot costs exceed the initial investment in your own hardware.

Also, the open UDP and TCP to 0.0.0.0/0 in the security group made me cringe. At least set it to your current IP /32, and perhaps set up a script that watches your public IP and updates the SG automatically when it changes (see the sketch below).
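
A sketch of such a watcher script (boto3; the security group ID and port are placeholders, and it assumes a single UDP rule to keep in sync):

    import urllib.request
    import boto3

    SG_ID = "sg-0123456789abcdef0"    # hypothetical security group
    PORT = 8000                       # whatever port your streaming app uses
    ec2 = boto3.client("ec2")

    # Current public IP, as seen from outside.
    ip = urllib.request.urlopen(
        "https://checkip.amazonaws.com").read().decode().strip()

    # Drop any existing rule for the port, then re-add it for this IP only.
    sg = ec2.describe_security_groups(GroupIds=[SG_ID])["SecurityGroups"][0]
    for perm in sg["IpPermissions"]:
        if perm.get("FromPort") == PORT:
            ec2.revoke_security_group_ingress(GroupId=SG_ID, IpPermissions=[perm])

    ec2.authorize_security_group_ingress(
        GroupId=SG_ID,
        IpPermissions=[{
            "IpProtocol": "udp", "FromPort": PORT, "ToPort": PORT,
            "IpRanges": [{"CidrIp": f"{ip}/32"}],
        }],
    )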


I think a dedicated rig + Hamachi or other VPN of choice is an ideal option for most use cases.

I followed this tutorial when Fallout 4 first came out and I had not yet built a desktop. It worked pretty well and, given spot prices, was pretty cost-competitive with a gaming desktop if you only cared about one or two games.

AWS inbound traffic is free but persistent storage is not, so I downloaded the entire game whenever I spun up the AWS instance, which probably happened 10+ times. I still wonder how much that cost Steam in terms of outbound traffic...


Game developer here, I'd love to have a cloud gaming PC as a workstation for development. Anyone know of any companies/services that offer this?

Parsec[1] is a Discord-like game-streaming host and client. I've used it a little, and it seems to work just as well as or better than Steam's streaming or GeForce Now. Furthermore, it has virtually no setup and works across WAN without a VPN.

[1] https://parsecgaming.com/features/


Thanks for sharing Parsec!

https://shadow.tech offers such a service

https://www.paperspace.com/ does exactly this.

Tried it for gaming and it was a great experience. Gaming only works if you are near their data centers though - afaik US and NL; I got high latency in Germany, but that's not their fault. You'll find $10 referral codes online to try it for free.


For what it's worth, my roommate does a lot of his development on Shadow. He works on one of the Nintendo Switch emulators, and his laptop isn't powerful enough to compile and run things at a reasonable speed. I could absolutely see cloud gaming machines being a useful platform for game development.

You may want to try http://shadow.tech/

Reasonable prices, and it works very smoothly.


I haven't used it, but I think this might work https://aws.amazon.com/products/end-user-computing/desktop-a...

Paperspace works pretty well; also Shadow.

I'm immediately drawn to the idea of trying to build such a service. I'll get this started on Friday; I know I have some free time then. :D

> <50ms

I don't think that's gonna be much fun. Especially not if you're used to playing on a powerful local machine


My experience is that anything with direct camera control via mouse doesn't feel good unless you're streaming from a machine on a local network, but if you're using a controller it's not really noticeable.

It's probably going to vary from user to user - even streaming on my local network (1Gbps wired), playing with a controller feels less than great.

I played most of Assassin's Creed Odyssey through Shadow and mouse camera control was never an issue. Felt as smooth as local. Of course, with all of these services, local network conditions play a huge part. I live near a few urban centers where most of these services have some presence.

I've tried Playstation Now on a PS4 with a 1Gbps enterprise-grade fibre connection and the latency was definitely noticeable, despite single-digit pings. I'm starting to think the latency from the video encoding/decoding could be a bigger issue than the network itself.

> I'm starting to think the latency from the video encoding/decoding could be a bigger issue than the network itself.

It depends on the setup but encoding and decoding combined can be done in less than 7 ms (1 frame at 144 fps) on good hardware:

https://blog.parsecgaming.com/testing-game-streaming-input-l...

Last year on more modest hardware I would regularly see around 12 ms (1 frame at 83 fps).


For some games it won't be a problem at all. I play Civ 6 with my brother, and I take too long to load the game. This could be a solution.


The same year, I released this video: https://www.youtube.com/watch?v=fB8htu_Bxpo

It's basically the same implementation. I must say that keeping the instance open cost me a bunch of money in the end.


It's good for occasional cloud gaming. Otherwise, I hope the PC game industry will remain an alla carte dinner instead of an all-you-can-eat cloud gaming buffet. Reason being, in terms of income sent to the game industry, it's healthier to pay for computer games directly to the developers.

It's "a la carte". If you want to be extra fancy, you include the accent on the "a".

A la carte is French, and alla carta is Italian. Alla carte is both :)

Is Steam just that optimized, or how come VNC/x2go still redraws the screen so slowly?

I believe they use the GPU for video encoding, while most Remote Desktop solutions are CPU-only. Also, I wouldn't be surprised if the encoding used by Steam is lossy, which is fine for games and fast-moving video but would be completely unusable for text.

Most remote desktop solutions don't even use video and instead transmit parts of the screen that have changed as still images.

When not consuming media content, this approach is better because the fine details look more crisp. In video, the finer details are limited by the bitrate & keyframe frequency.
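
To make that concrete, a toy sketch of the tile-diffing idea (Python/numpy, purely illustrative -- real protocols are far more sophisticated):

    import numpy as np

    TILE = 64  # tile edge in pixels

    def changed_tiles(prev: np.ndarray, curr: np.ndarray):
        """Yield (x, y, tile) for each TILE x TILE region that differs."""
        h, w = curr.shape[:2]
        for y in range(0, h, TILE):
            for x in range(0, w, TILE):
                a = prev[y:y+TILE, x:x+TILE]
                b = curr[y:y+TILE, x:x+TILE]
                if not np.array_equal(a, b):
                    yield x, y, b  # only this region goes over the wire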


I tried doing something like this last year. It took a long time to get right. When it finally worked, the bill was far higher than I had estimated due to bandwidth.

Overall, I love the idea, but it was a huge pain and not worth my time.


This still seems like a better option than Stadia or GeForce Now, because it should support all PC games (including store-exclusive titles like Satisfactory, Untitled Goose Game, and The Outer Worlds).

>$0.11/hr Spot instance of a g2.2xlarge

I'm not an AWS user, but don't spot instances risk being shut down/paused/etc at any moment? It seems like a bad solution for remote gaming.


Yes, they may be reclaimed at any time with only a two-minute warning; that's why they are cheaper. Still, this is a demo of what can be done and what's possible now. It could never be used for any serious gaming, as it's volatile. (You can at least catch the warning; sketch below.)
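
Catching the two-minute notice is just polling the instance metadata service. A sketch (assumes IMDSv1 is enabled; with IMDSv2 you'd need to fetch a session token first):

    import time
    import urllib.error
    import urllib.request

    URL = "http://169.254.169.254/latest/meta-data/spot/instance-action"

    while True:
        try:
            with urllib.request.urlopen(URL, timeout=2) as resp:
                print("interruption scheduled:", resp.read().decode())
                break              # save the game and shut down cleanly here
        except urllib.error.URLError:
            pass                   # a 404 means no interruption is pending
        time.sleep(5)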

Haha, I posted this years ago. Glad it's still helpful! Funny it's made its way back to the front page so many years later.

Using a commercial service like Google Stadia would be far cheaper than this no?

Lightsail offers fixed pricing per month.

Does Lightsail have GPU instances?


Nope. Lightsail has much more limited instance type options than EC2.

https://aws.amazon.com/lightsail/pricing/


That’s what I thought, I couldn’t find any GPU instances. Figured the GP knew more than I did :).

mbit != Mbit.

I'm not sure, but cloud gaming seems to be one of the "new" overrated hypes.

Based on the fact that it's been done before and didn't get that far (Gaikai and OnLive), but nobody seems to remember that.


Another way to view it is that, like electric vehicles, cloud gaming is not a question of if, but when.

OnLive was released when most internet connections were still crummy, and before HD capable computers were embedded in TVs, smartphones and tablets. Despite all of that it still worked pretty well, but at that time I thought if I can run the game on my desktop why bother? Nowadays I don't have a gaming desktop capable of the newest games, so I think about it differently.


Electric cars haven't even taken off yet. They're 1-2% of total car sales and valued like 90%.

There are also a lot of government subsidies for electric cars, and the range was not good.

The internet in those days was as good as it is now in Western countries.

We'll see, but I'll be happy to place an online money bet.


Timing is important. The enemy is latency. Today you can get below 1 ms network latency. But as network latency has improved, latency from I/O and, most importantly, the screen has gone up. So there is still work to do. You would basically need a custom-made device that you plug directly into your fiber outlet. Something like a VR helmet.

Where is it a "hype"? Opinion overall (users, press, ...) seems fairly mixed - but it does have some mainstream-ish appeal this time.

I'm really surprised that none of the cool kids here comment about the astronomical ecological cost of such rigs.

Reminder: IT is responsible for nearly 4% of CO2 emissions worldwide [1]. Video streaming represents 60% of that [2].

In a nutshell, cloud gaming is like switching on an oven at full power with the door open. If a majority of people use that kind of solution, the catastrophic effects of global warming will be even worse.

The true problem is that we're not paying the real price of AWS and other cloud services. We're expecting our children to pay for the ecological impact.

[1]: https://theshiftproject.org/wp-content/uploads/2019/03/Execu... [2]: https://www.dw.com/en/is-netflix-bad-for-the-environment-how...


Does that account for data centers getting built primarily in areas with easy access to renewable energy? I suspect the fuel mix for your average Amazon data center to be considerably greener than running the same power rig out of your home. I don't know if it's enough to offset the cost of the networking equipment between the DC and your house, but it's not so clear cut that this would be worse than buying a gaming rig and running the game yourself.

The pie chart in [2] implies streaming is << 60% of CO2 from IT. Networks generate ~16% of the CO2 of IT, and only 60% of that is streaming video. TV manufacturing is another 11%.

All device electricity usage is another 20% (there’s no breakdown there, but most of the streaming device electricity goes to the screen, which you’d pay for with broadcast or dvd playback).

Video streaming servers should be relatively low energy vs. the other servers running in the data center.


The second source you link to states 60% of that energy is in manufacturing, and 40% in use. Wouldn't that imply using a hosted solution only when you need it would be more environmentally friendly than buying your own gaming PC?


