Games should have “optimise for streaming” option (mcraiha.github.io)
58 points by mcraiha on Feb 26, 2023 | hide | past | favorite | 65 comments



Running a denoising filter through ReShade or ffmpeg (or better yet with VapourSynth post processing for non live streaming) can help.

In fact, hardware encoders have excellent built-in denoising (and other postprocessing) functions. But they are so obscure that probably fewer than a dozen streamers on the planet are using them... I can't even remember the syntax now; I would have to go look it up in StaxRip and then see if it even works in OBS :/


Don't the discrete cosine transforms that underlie most video codecs automatically remove noise, as noise is generally high frequency?


Not as well as more computationally expensive algorithms that take more frames/wider areas into account.
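As a toy sketch (plain numpy, not any real filter): averaging N aligned frames of a static scene cuts the noise roughly by sqrt(N), which is the kind of gain a single-frame transform can't get.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, (32, 32))                     # static "true" frame
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(8)]

single_err = np.std(frames[0] - scene)                    # noise level of one frame, ~10
averaged = np.mean(frames, axis=0)                        # temporal mean over 8 frames
temporal_err = np.std(averaged - scene)                   # ~10 / sqrt(8), about 3.5
```

Real temporal denoisers additionally have to motion-compensate, which is where the computational expense comes from.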

And as the blog mentions, noise "tricks" the encoder into blowing its bitrate budget on preserving noise that looks like detail, instead of something more useful.
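You can see the effect with a crude stand-in for an encoder's entropy coder (zlib here, purely illustrative): the same synthetic frame with mild grain added takes far more bits to represent.

```python
import zlib
import numpy as np

rng = np.random.default_rng(42)
# smooth synthetic "frame": 256x256 horizontal gradient, 8-bit
smooth = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
# same frame with mild film-grain-like noise (+/- 8 levels) added
noisy = (smooth.astype(np.int16) + rng.integers(-8, 9, smooth.shape)) \
            .clip(0, 255).astype(np.uint8)

clean_size = len(zlib.compress(smooth.tobytes()))   # tiny: redundancy is easy to exploit
noisy_size = len(zlib.compress(noisy.tobytes()))    # many times larger
```

A real video encoder under a fixed bitrate budget can't spend more bits, so it throws away genuine detail instead.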


The DCT is lossless. It's the subsequent quantization step that chops certain frequency components.
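A quick numpy demonstration (orthonormal DCT-II built by hand): the transform round-trips to machine precision; it's only quantizing the coefficients that loses information.

```python
import numpy as np

N = 64
n = np.arange(N)
# orthonormal DCT-II basis matrix: row k, column j = sqrt(2/N) cos(pi (2j+1) k / 2N)
C = np.sqrt(2 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0] *= 1 / np.sqrt(2)                       # scale the DC row for orthonormality

rng = np.random.default_rng(1)
x = rng.standard_normal(N)                   # one "block" of samples

coeffs = C @ x                               # forward DCT-II
roundtrip = C.T @ coeffs                     # C is orthogonal, so C.T inverts it
lossless_err = np.max(np.abs(roundtrip - x))  # ~1e-15: the transform loses nothing

q = 0.5
quantized = np.round(coeffs / q) * q         # the quantization step discards information
lossy = C.T @ quantized
lossy_err = np.max(np.abs(lossy - x))        # clearly nonzero
```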


Furthermore, it is a common (even logical) misconception that chopping out the higher frequencies after a Fourier transform is the same as filtering them out. Unfortunately this merely zeroes the exact frequencies used to define the bins but leaves a lot of high-frequency signal behind. You need a proper low-pass filter.
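A small numpy experiment illustrating the point: zero the high bins block by block, then look at the spectrum of the concatenated result — measurable energy remains above the cutoff, because the block-boundary discontinuities put it back. A proper low-pass filter works across block boundaries.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
block, cut = 64, 16                          # per 64-sample block, keep bins 0..15

y = np.empty_like(x)
for i in range(0, len(x), block):
    X = np.fft.rfft(x[i:i + block])
    X[cut:] = 0                              # "chop" the high-frequency bins
    y[i:i + block] = np.fft.irfft(X, n=block)

# spectrum of the whole reassembled signal: energy above the intended cutoff remains
Y = np.fft.rfft(y)
global_cut = cut * len(x) // block
hf_fraction = np.sum(np.abs(Y[global_cut:]) ** 2) / np.sum(np.abs(Y) ** 2)
```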


Don't think so. I read somewhere (probably here on HN) that Netflix purposefully removes noise before encoding, transmits the video without it, and re-adds the noise locally on playback, saving some 30% of bandwidth.

And the HBO intro is a nightmare.


How is it possible for those encoders to know that fog is less important than other elements of the game scene?

It seems like removing the elements will ensure bits aren’t wasted on those elements.


It can't, but I have learned that you can't rely on devs to put needed visual features into games, especially old already-released games. And a mild denoising prefilter is a reasonably good universal way to help the hardware encoder, and such filters tend to be much better at preserving detail than the encoder itself.


I completely agree with the article, but it's a little weird that the example game used is MTG Arena, when thanks to streaming extensions I find it's one of the best games to watch on Twitch.

Almost every MTGA streamer I've seen uses Untapped (https://articles.mtga.untapped.gg/set-up-twitch-extension-fo...) which solves issues 2 and 3 since you can in fact just mouse over cards to see what they are as well as get detailed info on the deck being run and game state.


This is true, but it would be nice if Arena got configurable backgrounds at some point even without the streaming case.


The noise thing is interesting. Modern games tend to have lots of noise. There's often a noise or "film grain" post processing filter to make the game look less artificial. Lighting effects often have noise injected at many stages to soften and liven them up. This was really noticeable in Control for example, I think it may have been a surrealist stylistic choice. Modern video codecs (and maybe H.264 as well) have the ability to denoise to improve compression quality and then add noise/film grain back in after decoding. Theoretically the game engine could integrate with this by producing a noise map of sorts to add to the video stream.


Only AV1 has this, and it's a pretty niche feature at the moment.

And a "noise buffer" is an interesting concept. I dunno if any other applications try to do this with AV1, or if that's even possible without major hacks (as its noise synthesis is basically designed for internal use).


This is a good example of the title turning people off from the content. The article itself is a nice intro to how some artifacts are created while streaming and how to deal with them; it's certainly interesting and useful for developers who want to build a name or community through streaming and lack that knowledge. But there's hardly any argument for why games in general should have this option, contrary to what you'd expect from such a title.


There are always trade-offs.

If you prioritize streamability you end up deprioritizing something else.


I think this is a SUPER astute observation and is looking to the future of what people will want from game engines.

I also think we can take this further: games should be setting up their pipelines to offer a "compression friendly" version of the frame as an interstitial product before producing the final render. That way the game produces two native frame streams: one that's extremely compressible and one that's got all the bells and whistles. Obviously that takes more horsepower, but the point is moving towards game engines as a broadcast-aware medium.

This is going to be a huge part of what makes a game accessible and successful going forward and the people who make tools for games will benefit from taking that seriously.


I bet it'd be more work on the side of the game developer and streaming platform, but IMO Zoom, Google Meet, Jitsi, etc. aren't a ton better at streaming text than Twitch looks to be from TFA. Even in the absence of effects (just small text on a monitor of a different resolution), things can be plenty hard to read.

I suspect that it should be possible for at least the streaming platforms that have native applications to tie into the OS's screen-reading APIs to grab text and provide it out-of-band, e.g. by streaming rectangles of the background color and render text over it on the viewer's client. Then it'd be copy-pastable by the viewer too, and their own assistive technologies could work on it.
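A minimal sketch of what such an out-of-band text channel could look like (all field names here are hypothetical, not any real streaming API): the client streams a flat rectangle of background colour inside the video, and sends the text as data alongside it for the viewer's client to render locally.

```python
import json

# hypothetical side-channel message sent alongside the video stream;
# the named region in the encoded frame is just flat background colour,
# and the viewer's client draws (selectable, accessible) text over it
text_overlay = {
    "frame": 1042,                                    # frame the overlay applies to
    "rect": {"x": 64, "y": 480, "w": 512, "h": 24},   # region in the video, in pixels
    "background": "#1e1e1e",
    "text": "Quarterly results - draft",
    "font": {"family": "sans-serif", "size_px": 16},
}

payload = json.dumps(text_overlay)    # what would travel over the wire
decoded = json.loads(payload)         # what the viewer's client reconstructs
```

Because the text arrives as data rather than pixels, the viewer's own assistive technologies (screen readers, magnifiers) could consume it directly.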


> but IMO Zoom, Google Meet, Jitsi, etc. aren't a ton better at streaming text than it looks like Twitch is from TFA

Come to think of it, it's a little bonkers that sharing a desktop over a meeting doesn't leverage any of the stuff that is in remote desktop and similar protocols - why send screenshots of text when viewer's PCs are capable of rendering the same text? I suppose the status quo is easier to implement but there'd be a measurable improvement in text quality and bandwidth usage if a different approach were taken.


That's actually something I never really thought about but makes total sense.

I think it's because applications are rendering their own text directly to the GPU these days. Back in the day, you used standard OS controls for everything, which allowed you to do exactly this without any direct attachment to the underlying rendering layers. But these days most applications are either Chromium with an icon, or something rendered to a GPU canvas (Flutter-style) to get easy animations for cheap, each with its own GPU acceleration system.

RDP has several modes (block updates, video streams, you name it) and switches between them on the fly. Windows especially uses the win32 render hooks to significantly reduce bandwidth necessary to run native applications, integrating in the window event loop mechanism.

I don't think browsers or streaming tools can leverage that tech as easily, and even if they could, I expect the benefit to diminish over the coming years. This understated use case for native applications is often forgotten, and at this point I don't expect the industry to move back to the old way of writing applications.


Video codecs can be extremely good at encoding static content, with the right tuning, and trying to capture the text of any arbitrary app on the screen would be quite a task.

One of the bigger issues for text is 4:2:0 chroma subsampling... not really sure how to work around that, as fuller chroma sampling (4:2:2 or 4:4:4) is extremely expensive.


Is it really chroma? I always thought poor text rendering was an artifact of dct+quantization


Depends. The chroma subsampling still hurts text even at really high bitrates or in static scenes, and you can see a big difference with 4:4:4 or RGB video.
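A toy illustration of why 4:2:0 hurts thin coloured strokes: detail that alternates in chroma at single-pixel pitch is destroyed outright by the 2x chroma downsample, no matter how much bitrate the encoder spends. (Heavily simplified: real resamplers use better filters than nearest-neighbour, but the sampling limit is the same.)

```python
import numpy as np

# single-pixel-wide coloured strokes: chroma flips every column
chroma_row = np.tile(np.array([0.0, 1.0]), 8)   # 16 pixels, alternating chroma
subsampled = chroma_row[::2]                    # halve horizontal chroma resolution
upsampled = np.repeat(subsampled, 2)            # nearest-neighbour reconstruction
chroma_error = np.abs(upsampled - chroma_row).mean()   # 0.5: the colour detail is gone
```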


Because in general, how do you verify that two PCs, or Mac, Linux, iPhone, and Android devices, have the same fonts installed?


This is for streaming low-end games on mobile, apparently. Are they thinking Amazon Luna?

Game streaming as a service hasn't worked out well. It's too expensive a service to provide, since each user needs a rackmount gamer PC in the data center while they're playing. Most of the streaming services of a few years ago went bust: Vortex closed in 2022 and Google Stadia closed recently, Shadow PC pivoted to B2B, and NVidia GeForce Now doubled their prices.

Technically, it works fine, but doing it profitably seems to require pricing around $30/month, and that's not selling.

Should film grain be removed from streaming movies, too? Some systems do that, and re-insert it in the player.


This is not about streaming a game running on a remote machine for personal use. It's about streaming a recording of you playing and commentating to viewers. This is a big part of modern gaming culture that video game producers are not optimizing for. Some of the best free advertisement you can get for a game you develop is having a popular Twitch or YouTube streamer play it for their audience. It makes sense for all parties involved to support this workflow.


Honestly, it's no secret that game companies pay popular streamers to stream their games, especially at launch.


Right. Paid advertisement is the next best thing :)

Especially since they're already paying the streamers to play their games, it makes sense to provide an optimized game rendering mode for this.


> Should film grain should be removed from streaming movies, too?

Yes! I hate film grain on Blu-rays, but it looks particularly awful in streams. And there are far better ways to fix banding, if that's the goal of adding grain.

I don't think many are leveraging AV1's grain synthesis yet.


I think the article is about streaming games on Twitch and YouTube.


Given the similar requirements and higher margins, I'm surprised streaming CAD as a service never took hold. Imagine a service where you can model CAD parts and run simulations; the frames are rendered in high fidelity and streamed. Latency isn't as much of a hard requirement, and you can charge more per user in such a situation.


That's what Autodesk Fusion 360 is, sort of. It's partly local and partly remote.


Rendering is still done remotely. The dream of Stadia or GeForce Now is that the whole program runs on a remote machine and you just stream the interface.


Another important issue for streamers/youtubers is the soundtrack of the games. Some tracks can cause DMCA takedown requests, sometimes even automatic ones. Game devs have taken notice, though: some more recent and popular games have a streamer mode which excludes tracks like these.


It would be nice to have, but probably too much dev/design/QA effort for the payoff. Then you have to consider the confusion of presenting two different appearances of the game to the audience.

This seems like it should be solved by streaming services, not game developers.


More games are already building a "streaming mode" where the UI is shifted, any passwords (e.g. multiplayer lobbies) are redacted, etc.

Many games rely on streamers as a kind of marketing.

Adding effect tweaks into streamer mode isn't a bad idea.


It's probably not a big deal if it's just another visual quality setting. You just make another set of LODs that remove some noisy effects.


Adding a checkbox which defines if the game is on "streaming mode" or not could be cool

What would not be cool is the huge expense of doing two UI/UX passes on the user interface and QAing that entire process and the bugs it creates. The piece also talks about VFX creating issues, and those are wholly unfixable except at enormous expense. At that point they would instead need to be disabled, which would make the game look worse overall and be a less appealing product; as an art director I would *not* want my game to be streamed under those circumstances.

That said, there are some good takeaways from it, like a setting to disable film grain or improve font/UI readability.


It’s extremely common for games to have dozens of different quality settings to support being played on hardware with different capabilities. On top of these quality settings they often offer a “high / medium / low” preset so users can quickly select an approximately correct setting for each individual quality option. Adding a “streaming” preset to these options that selects the correct settings for improving encoding seems fairly easy to integrate into the dev process and gets a lot of the benefits described in this article.
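A sketch of how a "streaming" preset could sit alongside the usual ones (hypothetical option names; Python used purely for illustration, a real engine would do this in its own config system):

```python
# each preset maps quality options to values; "streaming" keeps visual fidelity
# high while disabling the noisy effects that are hardest on video encoders
PRESETS = {
    "low":       {"film_grain": False, "motion_blur": False,
                  "chromatic_aberration": False, "fog_density": 0.3},
    "high":      {"film_grain": True,  "motion_blur": True,
                  "chromatic_aberration": True,  "fog_density": 1.0},
    "streaming": {"film_grain": False, "motion_blur": False,
                  "chromatic_aberration": False, "fog_density": 0.6},
}

def apply_preset(name, overrides=None):
    """Start from a named preset, letting the player override individual options."""
    settings = dict(PRESETS[name])
    settings.update(overrides or {})
    return settings
```

This keeps the dev cost close to zero: no second UI pass, just one more row in an existing table of presets.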


I guess if game publishers care about how their content will appear in streaming channels they might adjust their graphical approaches to avoid things that won’t compress well.

Similar to how pop music production values changed as kids started listening to music over phone speakers.

Or how failing to account for how some images handle compression led to HBO shipping those unwatchably dark Game of Thrones episodes. I guess directors targeting streaming nowadays know not to do that?

But I don’t think this passes as normative advice for all game developers. If as a game dev you value how your game looks on Twitch, you might want to think about some of the graphical flourishes you include.

On the other hand if, as a game dev, you want people to experience your game personally and you want to discourage streaming consumption, maybe you crank these features up to the max so your game looks terrible streaming. Entirely up to you how you use this knowledge, really.

Saying games “should” optimize for this is like saying back in the 80s/90s that movie directors “should” frame their shots so they can be cropped down to 4:3 because otherwise their movie will look bad on VHS and TV. Sure, some directors followed that approach; others went ‘screw that’ and framed shots wide so they were basically impossible to pan and scan without losing content.


I'm not sure streamers will opt for modes that prioritise the video encoder over their own experience though. Look at all the esports streamers that play the game on the lowest, ugliest graphics settings because the increased performance helps their reaction time or lowered effects help them focus. But if you watch e.g. tournaments organised by the game studio, they'll put the graphics settings on max to maximise the visual appeal over gameplay benefits.


> Or how failing to account for how some images handle compression led to HBO shipping those unwatchably dark Game of Thrones episodes. I guess directors targeting streaming nowadays know not to do that?

That blame lies with the service encoders. They should be allocating more bitrate to dark scenes, and gently debanding them, rather than relying on decades-old x264 defaults.

But to be fair this wasn't well known back in early GoT days.


>Magic: The Gathering Arena is a free-to-play digital collectible card game developed and published by Wizards of the Coast

More like "free to play" if you're OK with getting creamed by people shoving cash into the game for packs and drafts. MTGA is as rigged as every other game these days: power for money.


Free to play strongly implies pay to win. If it were just free, they'd say that.


It often doesn’t. Plenty of competitive games are free to play and make their money from cosmetics and battle passes. I’m not a huge fan of those things either but I don’t think you could call LoL pay to win for example.


If you actually look for good games and don't just play the latest AAA games made by big companies, you'll realize there are still good games out there.


As a lifetime game developer and publisher, I am going to disagree. I feel like for half my career people have been trying to push streamed games. In the old days it was a company called OnLive. Then it was a bunch of other services, including Stadia, Luna, etc.

Game developers don't want this. Gamers generally don't prefer it either; they can be sold on the idea but consistently reject the experience. Why is this such a thing? It's because business people know that if they can convert game buyers into people who pay for a streaming service, they can play gatekeeper and extract all the surplus value. That's not a priority for me!


Isn’t the article talking about live streaming like on Twitch? As opposed to game streaming over a system like Stadia


Then it doesn't need "optimize for streaming"; it needs a straight-to-stream output rather than render-then-re-encode.


That seems like a pretty good idea generally. For a single-player game an extra straight-to-stream output could even show extra info; be third person, etc.

Hell, why not include the ability to render locally, like a Halo or StarCraft replay (and many other series, of course; these are just the two I remember).


The player still has to see the game and "watching your own stream" introduces significant (inconsistent) latency making it inappropriate for any game with timing.


This is confusing me as well. I thought it was about game "streaming," but after reading the comments I'm thinking maybe it's about "game streaming."

But the problem is fundamentally the same, as they are both using GPU hardware encoders to stream a game to a client.


I think it is. It still seems like a minority situation to me. How much of a player base is streamers that it makes sense to optimize games for Twitch? Wouldn't it make more sense for Twitch to optimize video codecs for games in mind instead?


The viewership and exposure level is so great on Twitch/Youtube/Facebook/etc that games already have "streamer modes" that hide information to, for example, help prevent stream snipers (including big games like Call of Duty).

Streaming is big. Making your game look good when streamed is probably worthwhile. Arguably some game genres and trends wouldn't even exist if not for streaming.

Battle Royale games were based on a little-known ARMA mod that was popularized by streamers like Lirik, which led to more people playing them, which led to more games like that being made.


A small fraction of players, but it's a critical fraction for marketing a game. Many games already have a "streamer mode" that disables licensed music, among other things.


I might have misread something but I think the author means "streaming" to mean when you are streaming the game to an audience on a service like twitch. You mean it in the sense of playing the game on a game client that's actually rendered somewhere else (e.g. a data center)


It's about watching others play.


To be clear the article is mostly talking about optimizing for video compression. However it does also have some points about tooltips that maybe should be always visible as a viewer can't choose when to activate them.


So one neat thing about Twitch is that it has channel extensions which are basically interactive overlays.

So, for example, with some card games viewers can mouse over a card in the stream video and get tooltips/card text/information.

It only works for some live content and not vods though. I imagine storing all the required information for vods would be challenging.


I think this blog post only applies to simplistic time-sink videogames designed to seamlessly fit between the finely oiled cogs of capitalism.

Anything story-driven and/or single-player with even a pretense of artistic integrity should stay as far away as possible from wasting man-hours on optimizing the graphics, UI, and general aesthetic to please the H.264 overlords.


Lots of streamers stream single player games and many fans consume those games purely via stream.

Some of those viewers may have accessibility needs. All of them benefit from clear text on text heavy games.


This sounds like PR speak. Games are meant to be played, optimizing for watching should be at the bottom of any developer's to-do list. Some minor quirks with noise or the UI are not enough to justify pouring God knows how many hours to create alternative viewing modes and rendering options.

Streaming optimization is missing the forest for the trees: in a world with better internet connections, not only would streaming quality be solved, but so would every other problem caused by slow internet.


It's easier to disable an effect (it's usually an engine boolean) than it is to run hundreds of thousands of miles of fiber optic cable. We can disable an effect in a patch cycle - and meaningfully improve accessibility for players.

Literally millions of people watch streamers play games. There are, at the time of writing this comment, 4,800,000 people watching Twitch right now (according to TwitchTracker), and that excludes YouTube Live, Facebook Live, Steam, etc. Many games (Five Nights at Freddy's, Minecraft, Fall Guys) owe their extreme popularity to streamers. The issues here apply even to recordings of your game.

If you are ignoring streamers, you are ignoring a major part of your base, and an extremely public part of your base at that. You're also actively telling your players who need accessibility tools that you don't value them enough to invest time in some basic settings tweaks (that you should probably have anyway...).


There are plenty of games I've enjoyed exclusively via streaming.

Like TLoU (both) or some Assassin's Creed games. I did eventually buy most of those games, and maybe in the future I'll play them, but in the meantime they got a sale out of me, and made me care more about future releases (or even merchandise).


And did you give a single thought about some occasional video compression artifacts?


Not really. Maybe in dark scenes or particle-heavy environments, which, to be fair, are a bit more common in story-driven games.

But considering the lengths most streamers go to in order to have everything on ultra, since more quality doesn't really hurt either, it doesn't seem like it'd be an unwelcome change.

I'd love some of the things suggested here, like a separate stream so text doesn't get lost to compression, or changes like that.


I don't think grain is central to the "artistic integrity" of a game. I turn it off in every Mass Effect run with zero regrets.

And games look better if other high frequency effects, like texture shimmering or shadow flicker, are denoised away.



