So low-overhead (fast) decompression is essential for AAA titles. But I guess developers now think disks are fast enough, because PC games have been shipping more uncompressed assets to free up CPU time, massively bloating install sizes. Current-gen consoles also all have NVMe SSDs.
Given this trend, RAD's heyday was definitely in the days before consoles had NVMe disks (Bink in particular let games punch far above their weight), so this might be a nice forever-home.
Blame Intel for not making faster CPUs, and consumers for tolerating massive install sizes, I guess.
The other angle to this story is that the PS5 has a hardware decode unit for RAD Kraken. To get the best use out of the PS5 hardware, it's essential that the encoder is still made available. This is a huge competitive moat. (PCs probably won't get a comparable decompression accelerator card and such a card wouldn't get enough market penetration anyway.)
Meanwhile here I am with a sound card in my desktop PC for no real reason anymore :\
It takes about half a second on a single core to decompress a 4-minute MP3 to WAV on my 2012 MacBook Air, including writing to disk. On a gaming machine it would be way less. If anything, the audio could be decompressed and cached on demand when loading levels, or even during installation.
Also, sound cards do not participate in the decompression process. It's been done on the CPU from the start, barring rare exceptions. Sound cards just receive raw PCM. There used to be hardware codecs, but they were never really common, especially in consumer/gamer-grade sound cards.
Additionally, there can easily be 20-40 sounds playing at once, more if you haven't optimized yet (which typically happens in the last few months before release). These also need to be preloaded slightly and, once playing, stream from disk, so source starvation needs to be handled, and the codec must not glitch if it misses some packets.
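Very roughly, the shape of that starvation handling looks something like this sketch (my illustration, not from any real engine; StreamingVoice and everything in it is hypothetical). The mixer thread renders whatever PCM the I/O thread has managed to stage, pads with silence instead of blocking, and never touches the disk itself:

    // Hypothetical sketch of starvation handling in a streaming voice.
    #include <cstdint>
    #include <cstring>
    #include <atomic>
    #include <vector>

    struct StreamingVoice {
        std::vector<int16_t> ring;        // PCM staging buffer filled by the I/O thread
        std::atomic<size_t>  write_pos{0};
        size_t               read_pos = 0;

        explicit StreamingVoice(size_t frames) : ring(frames) {}

        size_t available() const {
            return write_pos.load(std::memory_order_acquire) - read_pos;
        }

        // Called on the real-time mixer thread every audio callback.
        void mix(int16_t* out, size_t frames) {
            size_t have = available();
            if (have < frames) {
                // Source starvation: render what we have, then silence.
                // A real engine might fade out or repeat the last packet
                // instead, but it must never block on disk I/O here.
                for (size_t i = 0; i < have; ++i)
                    out[i] = ring[(read_pos + i) % ring.size()];
                std::memset(out + have, 0, (frames - have) * sizeof(int16_t));
                read_pos += have;
                return;
            }
            for (size_t i = 0; i < frames; ++i)
                out[i] = ring[(read_pos + i) % ring.size()];
            read_pos += frames;
        }
    };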
It’s also happening in a system that needs to be real-time and keep each frame time on the order of milliseconds, though Moore’s law moving to parallelization has helped a lot. You’d be surprised how underpowered the consoles are in this regard (caveat: I haven’t developed on the upcoming gen, which is getting better).
As for loading and caching on demand, that’s limited by memory; given the sheer number of samples used in games, it’s just not practical. For a specific example: in a very well known game, there are over 1600 samples just for boxes hitting stuff (impact sounds). What I’m building right now is meant to make generative audio easier and reduce the number of samples needed, so more tools to process sound could make this naive caching approach practical.
That almost sounds as if it could be worthwhile to synthesize on demand from a very small set of (offset/overlaid) base samples through a deep chain of parameterized FX, with 1600 FX-parameter preset tuples as the MVP; bonus points for involving game state context.
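To make that concrete, here's a minimal sketch of what such a preset tuple could look like (all names and parameter choices are hypothetical, just illustrating the idea of replacing ~1600 recordings with a few base hits plus parameters):

    // Hypothetical "few base samples + parameter presets" sketch.
    #include <cstdint>

    struct ImpactPreset {
        uint16_t base_sample;   // index into a small pool of recorded hits
        float    pitch;         // resample ratio, e.g. 0.8..1.25
        float    lowpass_hz;    // material dullness
        float    drive;         // nonlinearity for harder impacts
        float    gain;
    };

    // Could be driven by game state: material pair, impact velocity, etc.
    ImpactPreset preset_for(int material, float velocity) {
        return ImpactPreset{
            static_cast<uint16_t>(material % 4),   // pick one of 4 base hits
            1.0f + 0.3f * (velocity - 0.5f),       // faster hits pitch up
            2000.0f + 8000.0f * velocity,          // brighter when harder
            velocity * 0.6f,
            0.5f + 0.5f * velocity,
        };
    }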
The "sound shaders" part is very interesting for me, and the kind of tech I would like to see more.
As another commenter mentioned, these games shipped with compressed audio on consoles. Also, that generation of consoles has pretty good hardware codecs for audio (320 channels on the Xbox).
And MP3 was just an example of what I had at my disposal. But as an exercise I converted my 4-minute MP3 to Vorbis. Decoding it and converting to WAV took the same amount of time as before: about half a second on a very old and underpowered MacBook Air. Most of this time is spent writing 50 MB to disk.
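For scale (my arithmetic, not from the parent): 4 minutes of 16-bit stereo PCM is 240 s × 44,100 samples/s × 2 channels × 2 bytes ≈ 42 MB at 44.1 kHz, or about 46 MB at 48 kHz, so a ~50 MB WAV is about right.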
Edit: reasoning is here: https://www.rockpapershotgun.com/2014/03/12/respawn-actually...
Minspec is a 2-core PC, probably to support a large player base, and as noted before there can be 20-40 audio files all streaming from disk and decoding at once. Sure, one file might decode fast, but 40 of them, all streaming from disk while keeping the frame rate playable? Just impossible.
Also, according to the article they're packing all the locales. To me this seems like a bigger issue.
And even if it had just a single decoder, there's always the possibility to pre-decode audio. Or just decode the music/film.
In general it doesn't make sense to use WAVs. Games have been using compressed audio since the late 90s without any significant trouble, and MP3s are aurally transparent at 320 kbps.
I think better op-amps and audio quality are always enough reason to have a separate sound card in your PC.
Eh, a PS2 with a ~300 MHz MIPS CPU is more than able to play VCDs with MPEG-1 encoded video. You must be young, because the PS2 is on par with a Pentium MMX/Pentium II, and you could certainly play MPEG videos back in the day.
On the PC Bink was a great choice for FMV as it had a really clean API and 'just worked'.
And the MPEG-1 spec is compatible with the MPEG-2 based decoders on the PS2, so the effort would be almost nil.
As I said, Gen-Zers underestimate late-90s/early-00s hardware.
Also, the PS2's GPU was a data monster, refilling VRAM like nothing. Ask the PCSX2 developers about that.
What I'm hoping to see on this front, since install sizes are bloating, is that installers/configuration become slightly more advanced.
When I install a game I want to be able to choose the following at install-time:
Texture Packs (the main blobs that take up a lot of space; why download 50GB if you only need the 10GB version?)
Mode Packs (single-player campaigns, multiplayer, maps)
This way a game that currently costs, say, 80GB for everyone could average out to 30GB-50GB for most players. On the low end the same game needs to work at 10GB, and at the high end it can consume the whole 80GB for the person with hardware that can take it. Obviously console players just want to play and get on with it, and maybe won't enjoy the choices mentioned above, but PC tweakers/benchmarkers/modders should.
So yes, 10GB here and there isn't much, but as a collective, if you have 1,000,000 users downloading 300GB less per month (I know, silly numbers, it's way more in reality), it would make a huge difference to the gaming industry's network load as a whole. Plug in some real numbers (which I don't have): players × average number of games downloaded per year × average game size.
Other industries like streaming/music sites optimize their transmission sizes, and even small gains are often worth it (10 million people downloading 2MB less per song on Spotify while playing 50 songs per day: the numbers add up quick). Somebody will pay for the transmission, either the consumer or the business. The business only needs to ask: is it cheaper to pay a few devs to optimize their code, or to pay for a fatter data pipe? I think long term, engineering effort always wins vs throwing money at the problem.
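Running the made-up Spotify numbers above: 10,000,000 listeners × 50 songs/day × 2MB saved per song = 1,000,000,000 MB, i.e. roughly a petabyte of transfer avoided per day.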
I have a bog-standard internet connection with a download speed of 1 Gb/s. Most games download faster than it would take me to go to the store or order a package.
I really like Nintendo Switch cartridges, and I prefer to buy physical copies when I can, but I don't think there's a future in them. I kind of wish someone would make "smart cards" that just give you access to the game online. Since there are so many essential updates to games these days, that's almost what cartridges are anyway.
You forgot the reference link btw.
That would be terrible from an archival perspective. Flash memory is subject to charge leakage that causes data to be lost over time. High-density flash, which has the lowest cost per byte, also has the poorest data retention.
So the mechanism for modular game delivery already exists in Steam; they could just repurpose it and give it a different label, or bake the options into the install dialog so you can choose the parts you want upfront.
And yeah, other launchers are a pain (with proton), I agree.
They are so used to badly implemented Electron software with humongous CPU and RAM requirements that a tool with basic performance is mind-blowing to them.
Heck, any Pentium II on par with a PS2's CPU or better could play DivX videos in MPlayer on Linux/NetBSD with ease.
480p, OK, but video.
You mean a single full-screen video, or videos inside the 3D content?
Supergiant Games' titles since Bastion all use RAD's Bink codec to store offline renders of their character models (as movies) instead of traditional sprite sheets or realtime rendered 3d models, so they're playing back dozens of movies at all times.
As another reply mentioned, it's also standard at this point for loading screens to be movies.
Of course you can do video decoding in software, that's what Bink is. It can do 4K60 on a PS4 CPU.
My feeling is that big budget games are prepared to jump through the hoops in order to wring the last drop of video quality (or add hours more 'content') but when timelines are short and engineers are thinly spread Bink is a great way to just get it done and move along.
On the PC, all formats are theoretically supported, but there's always that one person with an outdated AMD driver and Korean Windows XP.
I'll take a small handful of weird but very specific specs over the PC situation any day of the week.
I doubt this particular moat means much since PC has practically unlimited disk space. This recent install size hype is basically irrelevant if you're on PC. Just get a nice $100 SSD and continue. It matters only to PS/Xbox because they do not provide easy/cheap expansion.
You don't find out until after you've bought the game...
Mind you, it’s based on an algorithm that RAD devised…
Filesystem compression could help here. To my knowledge games consoles don't support filesystem compression, but I'm not sure why.
> Filesystem compression could help here.
How? Compression is not free; no matter where you put it, someone has to decompress it again.
Are you talking about some exotic SATA controller or other hardware that does that?
On release the new CoD:MW was 130 GB. A few updates later we are now at over 256 GB.
I first encountered Smacker videos in Caesar II (which even shipped smkplay.exe on the disc so you could play them from outside the game), but they were used for a lot of games of that era, including a bunch of early Blizzard titles like Warcraft 2 and StarCraft.
I hope that Epic does something interesting with what RAD has built here. Taking it more proprietary (e.g., halting new licensing) would be a huge blow for the industry.
I would trust him to run his own benchmarks fairly and properly, but I am not sure if there have been good independent benchmarks done.
He's funny, too. Highly recommend The Jeff and Casey Show for podcast content.
they have a seriously impressive group of developers
Some of the engineers are older but Fabian is in his thirties and last I checked his goatee wasn't greying.
Bink 2 - video codec
Oodle - data compression suite
Telemetry - profiling toolkit
Granny 3D - toolkit for building 3d games
Miles Sound System 10 - sound authoring toolset in 2D and 3D
Their stuff is very well regarded and widely used.
At least that is the impression I have got, watching from outside the industry. It's always possible that I've got a distorted view as an outsider.
Unless it was a DOS game from the 90s, in which case it may have used its 256-color predecessor, Smacker.
It's amazing how this tiny company has managed to stay relevant through decades of huge technological shifts.
I've seen their logo on tons of games. I never knew they were a local company (Kirkland, WA near Seattle).
GStreamer with its ridiculous threading architecture and lock-heavy approach? You think that's going to run on a console CPU? ffmpeg / libavcodec? Yes, we can hook that up, have it crank out YUV buffers and display those on the GPU, or we can just use Bink. Yes, it costs a ton of money, but based on when I evaluated it last, it was almost twice as fast as libvpx on our target platforms, it was well-suited for games (worked well with custom allocation strategies, didn't start a bunch of threads by itself, integrated well into our renderer), was easy to integrate, the toolchain was lovely, and when we needed help with a bug, we got an email back the same day from Jeff. RAD's support is top-notch.
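For reference, the libavcodec route mentioned above is roughly this much code (a sketch with error handling mostly elided; the libav* calls are the real API, while upload_yuv_to_gpu is a hypothetical renderer hook):

    // Rough sketch of the "hook up libavcodec, get YUV buffers" route.
    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    void upload_yuv_to_gpu(const AVFrame* f); // hypothetical renderer hook

    void play_file(const char* path) {
        AVFormatContext* fmt = nullptr;
        if (avformat_open_input(&fmt, path, nullptr, nullptr) < 0) return;
        avformat_find_stream_info(fmt, nullptr);

        const AVCodec* dec = nullptr;
        int vid = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);

        AVCodecContext* ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vid]->codecpar);
        avcodec_open2(ctx, dec, nullptr);

        AVPacket* pkt   = av_packet_alloc();
        AVFrame*  frame = av_frame_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vid && avcodec_send_packet(ctx, pkt) == 0)
                while (avcodec_receive_frame(ctx, frame) == 0)
                    upload_yuv_to_gpu(frame);  // frame->data[0..2] = Y, U, V planes
            av_packet_unref(pkt);
        }
        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
    }

That part is doable in an afternoon; the work Bink saves you is everything around it (platform availability, allocation control, threading behavior, toolchain).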
Multimedia is a small enough problem that a company like Epic can probably spend a few days integrating libvpx and Opus to make a custom video solution, but big enough that most smaller studios with custom engines don't have time to spend on it, and those are the sorts of problems that RAD likes to tackle.
When Bink was released and the PS2 and GameCube were on the horizon, this kind of thinking was needed. By the time you get to the PS4/Switch you're just running consumer PCs. The exact same Nvidia SoC used in the Switch is used by Nvidia for their Shield Android TV boxes, whose literal purpose is to play audio/video streams; it's quite capable of doing so.
"The Switch uses NVDEC along with VIC (Video Image Composer) in the GPU for video decoding. Games are exposed to higher level APIs by Nintendo which abstract the configuration of these modules."
If you're using a custom engine which does not want to use these APIs, you're of course free to do it yourself (and deal with how you're going to synchronize it and the higher resource utilization), but most major titles use the built-in hardware decoder provided via the API.
That said, yes, Bink certainly is a more turnkey solution, which can definitely make business sense to use. This is a byproduct of consoles being closed platforms.
This is not a byproduct of consoles being closed platforms; multimedia is a challenge everywhere. Even on open PC platforms there is no high-quality, usable, cross-platform multimedia playback product.
I echo the sentiments of many in saying I'm surprised no one scooped them up earlier (though I doubt it was for lack of offers).
The biggest pain in the ass with Oodle is recompressing everything after you integrate it, that first time takes FOREVER.
I too find those companies weird. They seem to have a nice product, and their website boasts how nice it is, but: no publicly available documentation, no evaluation SDK, no pricing; the only thing there is a sales@ address. And while their product might be good, it isn't really unique. Why would I jump through hoops to use it when I can likely implement video playback with libavcodec and generic compression with zstd or LZ4 faster than their salespeople will reply?
They haven't given me any reason to worry yet: when they acquired Quixel, they made Mixer and Bridge free, cut the price of all Megascans assets in half for everyone, and made them free for Unreal users.
his blog: https://fgiesen.wordpress.com/
one of my favorites: https://fgiesen.wordpress.com/2012/04/08/metaprogramming-for...
For game optimization it is completely unique and literally game changing.
It will show you where your hotspots are, where your threads are starving out, which CPU core is running an event (and when it is thread-switched), where your random frame hitches are happening, how your cpu workload is being spread across your frame.
You can seamlessly zoom from seeing the scheduling/tasks for 100 consecutive frames of a game down to what was happening in a 100us slice of a single frame.
That is: non-shared (per-cpu in Linux) ring buffers, small simple events, and formatting done after the fact.
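A minimal sketch of that design, assuming nothing about Telemetry's actual internals (all names here are hypothetical): each thread appends small fixed-size events to its own ring buffer with no locks, and all string formatting happens offline from the dumped rings.

    // Hypothetical per-thread ring buffer profiler.
    #include <cstdint>
    #include <chrono>

    struct Event {            // small and simple: 16 bytes, no strings
        uint64_t timestamp;   // e.g. TSC or steady_clock ticks
        uint32_t id;          // looked up in a name table when formatting later
        uint32_t payload;
    };

    constexpr size_t kRingSize = 1 << 16;   // power of two for cheap wrap (~1 MB/thread)

    struct Ring {
        Event    events[kRingSize];
        uint64_t head = 0;
    };

    thread_local Ring t_ring;               // non-shared: one ring per thread

    inline void emit(uint32_t id, uint32_t payload) {
        Event& e = t_ring.events[t_ring.head++ & (kRingSize - 1)];
        e.timestamp = std::chrono::steady_clock::now().time_since_epoch().count();
        e.id = id;
        e.payload = payload;
        // No formatting here; the viewer reconstructs names and timelines
        // from the dumped rings after the fact.
    }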
I wish this were next week so I could post a link to my alpha build at the bottom of this post.
IIRC the idea behind Intel's Larrabee at least partly involved running Pixomatic on that hardware.
They started the fight, and I think it's safe to say they knew what they were getting into.
Now, Epic, fix your 1GB Electron launcher; hire me if you can't.
On the plus side, this opens the doors for a "Pied Piper, but for games" style startup to come out of nowhere and take big market share.
How long this will last is anybody's guess, of course.
[Edit- I may have misunderstood your question, but I'll leave my reply anyway.]
Here are some possible reasons you might use Bink instead of a free codec:
- Bink video has been around for a long time as a product, even though the codec has been updated (probably multiple times, I don't know). There weren't always good-enough free codecs, and if you have prior experience integrating Bink with your stuff then you may prefer to stick to something you know is good even if it costs money.
- RAD focuses on producing libraries that are very easy to drop in, specifically in the context of games. They know how game code is written and structured. They know game developers want to be able to integrate video in a way that works -- and performs reliably -- within an existing game engine that does a lot of custom management of textures/buffers/graphics memory. State of the art free codecs don't necessarily have free libraries that are actually high performance and easy to integrate within a game context.
- Your state of the art free codec might work great on a modern smartphone or PC, but good luck getting it to work on a Nintendo Switch or whatever weird console you're targeting. Bink is available for whatever game-specific platform hardware you're targeting.
So basically this means something like Bink has limited applicability time-wise, and going forward it will become more and more obsolete as free codecs cover all the needed use cases sufficiently.
A game will always want more cycles for level loading or whatever else happens in the background while that video plays, so a Bink-like specialized, super-lean and fast library will always have a use in that scenario.
GStreamer is a pain to set up, and when you do set it up it uses tons of threads; ffmpeg isn't nicer either.
Well, that's exactly the point. It's been around for dark ages. Today things are different.
For new codecs like AV1 there are new and efficient decoders too, and they're only getting better. It's not all limited to ffmpeg, which is using external decoders anyway; it doesn't decode things itself.
Bink fits the bill today and has fit the bill since forever. I don't see any of the alternatives having similar performance characteristics, ease of integration or availability on platforms that are more or less closed and there's no indication of them being open in the future.
In addition to that, the switching cost is quite high, so any alternative has to be good, and I haven't yet seen a contender.
We use Bink because it is better than the open-source alternatives, and that's it.
Ah, you mean that 1% market share on Steam, sorry.
So instead of trolling again, analyze where it's heading. I don't see any reason why Bink can't be replaced by free codecs.
PS4/XB1 were enormously more powerful than PS3/Xbox 360, and yet almost all games on those consoles that play video do it using Bink, even though both consoles have that AMD hardware you mention.
The current generation (PS5, etc.) is way more powerful than the PS4, and yet Bink still seems to dominate in the limited titles that we can see.
I can call Epic, Nintendo, Sony, Microsoft, Apple, or Google when something doesn't work as expected; with free codecs dumped on GitHub, who knows.
This becomes a more notable tradeoff for a studio looking to get funding for a particular milestone from a publisher. Your technical leads (and content creators) likely also cut their teeth on games that integrated it, so it has existing operational momentum too.
This sounds like an artifact of the past. Today there aren't that many hardware platforms that free codecs don't already cover properly. And I expect with something like push by AOM this will become even less and less of an issue in the future. Codecs are becoming more of a commodity in this sense so something like Bink I expect will be less and less relevant.
It'll be nice if there are great, free, off-the-shelf codecs and tools, but at least right now almost every current AAA game seems to use Bink.