Epic acquires RAD Game Tools (epicgames.com)
262 points by jsheard on Jan 7, 2021 | 186 comments



The history of RAD is a history of media read speeds - playing back full-motion video from a CD on a PS2 with its near-zero available CPU resources was a superhuman effort.

So low-overhead (fast) decompression is essential for AAA titles. But lately developers seem to think disks are fast enough, because PC games have been using more uncompressed assets to free up CPU time, massively bloating install sizes. Current-gen consoles also all have NVMe SSDs.

Given this trend, RAD's heyday was definitely in the days before consoles had NVMe disks (Bink in particular let games punch far above their weight) so this might be a nice forever-home.

Blame Intel for not making faster CPUs, and consumers for tolerating massive install sizes, I guess.

The other angle to this story is that the PS5 has a hardware decode unit for RAD Kraken. To get the best use out of the PS5 hardware, it's essential that the encoder is still made available. This is a huge competitive moat. (PCs probably won't get a comparable decompression accelerator card and such a card wouldn't get enough market penetration anyway.)


While the codecs have been central to RAD nearly from the start (Miles was the first product, closely followed by Smacker), the talent pool they have is exceptional across many other categories relevant to Epic - so there is an element of acqui-hire here, even considering the IP. It's probably a good time to exit, since the alternative would mean coming up with a sufficiently awesome next-gen thing again, and even with experienced hands, that can be a moonshot.


I heard that the main issue with decompressed assets was audio, not video (granted, video is image plus audio, so video is a strict superset of audio). One game - Titanfall or a CoD game, IIRC - had something like 20 GB of uncompressed audio in a 35 GB installation footprint. The rationale was saving CPU cycles.

Meanwhile here I am with a sound card in my desktop PC for no real reason anymore :\


That doesn't really compute. Audio decompression is pretty light on CPUs these days.

It takes about half a second on a single core to decompress a 4 min MP3 to WAV on my 2012 MacBook Air, including writing to disk. On a gaming machine it will be way less. If anything, the audio could be decompressed and cached on demand, when loading levels, or even during installation.

Also, sound cards do not participate in the decompression process. It's been the CPU from the start, barring rare exceptions. Sound cards just receive raw PCM. There used to be hardware codecs, but they're not really common, especially in consumer/gamer-grade sound cards.


A couple of things to note: MP3 is not appropriate for use in real time due to a variable amount of silence added to the start of the sample, intrinsic to the compression. You can sometimes use it for music, but if there are any changes based on game events, MP3 is unusable. A lot of work has been put into optimizing MP3, including at OS and hardware levels, but that's not usable in games. Commonly it's some custom spin on Vorbis.

Additionally, there can easily be 20-40 sounds playing at once, more if you haven’t optimized yet (which typically happens in the last few months before release). These also need to be preloaded slightly, and once playing stream from disk, so source starvation needs to be handled and the codec needs to not be glitchy if missing some packets.
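
To make the starvation point concrete, here is a minimal sketch (purely illustrative, not from any shipping engine) of the shape of a streaming voice: a decode thread fills a ring of PCM ahead of playback, and the audio thread falls back to silence when it catches up to the writer instead of glitching or blocking:

    #include <algorithm>
    #include <atomic>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Hypothetical single-voice streamer; names and sizes are made up.
    struct StreamingVoice {
        static constexpr size_t kRingFrames = 8 * 1024;   // ~8 blocks of preload

        std::vector<float> ring = std::vector<float>(kRingFrames);
        std::atomic<uint64_t> decoded{0};   // frames written by the decode thread
        std::atomic<uint64_t> played{0};    // frames consumed by the audio thread

        // The decode thread pulls compressed data from disk, decodes, and appends
        // to `ring`, staying at most kRingFrames ahead of `played`.

        // Audio thread: must never block and never glitch.
        void render(float* out, size_t frames) {
            uint64_t read  = played.load(std::memory_order_relaxed);
            uint64_t avail = decoded.load(std::memory_order_acquire) - read;
            size_t   have  = static_cast<size_t>(std::min<uint64_t>(avail, frames));

            for (size_t i = 0; i < have; ++i)
                out[i] = ring[(read + i) % kRingFrames];

            // Source starvation: the decoder fell behind (slow seek, CPU spike).
            // Emit silence for the missing frames rather than stalling the audio
            // device, and resume from where we left off on the next callback.
            for (size_t i = have; i < frames; ++i)
                out[i] = 0.0f;

            played.store(read + have, std::memory_order_release);
        }
    };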

It’s also happening in a system that needs to be real-time and keep each frame time on the order of milliseconds, though Moore’s law moving to parallelization has helped a lot. You’d be surprised how underpowered the consoles are in this regard (caveat: I haven’t developed on the upcoming gen, which is getting better).

As for loading and caching on demand, that’s limited by memory; given the sheer number of samples used in games, it’s just not practical. For a specific example, in a very well known game there are over 1600 samples just for boxes hitting stuff (impact sounds). What I’m building right now is meant to make generative audio easier and reduce the number of samples needed, so more tools to process sound could make this naive caching approach practical.


> For specific example in a very well known game, there are over 1600 samples for boxes hitting stuff

That almost sounds as if it could be worthwhile to synthesize on demand from a very small set of (offset/overlaid) base samples through a deep chain of parameterized FX. With 1600 FX parameter preset tuples as the MVP, bonus points for involving game state context.
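
A toy illustration of that idea (my own sketch, with hypothetical names, not how any particular game does it): keep a handful of base hits and derive each variant from a small preset tuple, which could also be modulated from game state like impact velocity or material:

    #include <cstddef>
    #include <vector>

    // Hypothetical preset tuple; a real system might derive it from material,
    // impact velocity, object mass, etc.
    struct ImpactPreset {
        float pitch;    // playback-rate ratio, e.g. 0.8 .. 1.3
        float gain;     // linear gain
        float lowpass;  // one-pole coefficient in [0,1); higher = darker
    };

    // Renders one impact variant from a base sample and a preset.
    std::vector<float> renderImpact(const std::vector<float>& base, ImpactPreset p) {
        std::vector<float> out;
        const float n = static_cast<float>(base.size());
        out.reserve(static_cast<size_t>(n / p.pitch) + 1);
        float y = 0.0f;                                    // low-pass filter state
        for (float pos = 0.0f; pos + 1.0f < n; pos += p.pitch) {
            size_t i   = static_cast<size_t>(pos);
            float frac = pos - static_cast<float>(i);
            float s = base[i] * (1.0f - frac) + base[i + 1] * frac;  // linear resample
            y += (1.0f - p.lowpass) * (s - y);                       // one-pole low-pass
            out.push_back(y * p.gain);
        }
        return out;
    }

The point being that instead of shipping 1600 baked impact WAVs you could ship a few base samples plus 1600 tiny preset tuples, or compute the tuple at runtime from game state.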


That’s literally my startup. I won’t get deep into the reasons why good tools for this don’t exist yet, but if you imagine that game development is like regular development but with orders of magnitude more chaos, you can understand how difficult it is to build stuff for reuse. After 15 years in the industry, my approach is just the same as yours.


You got amazing stuff!

The "sound shaders" part is very interesting for me, and the kind of tech I would like to see more.


> You’d be surprised how under powered the consoles are in this regard

As another commenter mentioned, these games shipped with compressed audio for consoles. Also, that generation of consoles has pretty good hardware codecs for audio (320 channels in the Xbox).

And MP3 was just an example of what I had at my disposal. But as an exercise I converted my 4 minute MP3 to Vorbis. Decoding it and converting to WAV took the same amount of time as before: about half a second on a very old and underpowered MacBook Air. Most of this time is spent writing 50 MB to disk.
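
If anyone wants to reproduce that kind of measurement in-process rather than via a converter, here is a rough sketch using stb_vorbis (a single-file decoder commonly used in games); the file name is just a placeholder:

    #include <chrono>
    #include <cstdio>
    #include <cstdlib>
    #include "stb_vorbis.c"   // https://github.com/nothings/stb

    int main() {
        int channels = 0, sample_rate = 0;
        short* pcm = nullptr;

        auto t0 = std::chrono::steady_clock::now();
        // Decodes the whole file to interleaved 16-bit PCM in one call.
        int frames = stb_vorbis_decode_filename("music.ogg", &channels, &sample_rate, &pcm);
        auto t1 = std::chrono::steady_clock::now();

        if (frames < 0) { std::puts("decode failed"); return 1; }
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("%d frames, %d ch @ %d Hz decoded in %.1f ms\n",
                    frames, channels, sample_rate, ms);
        free(pcm);   // stb_vorbis allocates the output with malloc
        return 0;
    }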


Yeah, it is curious that consoles shipped with compressed audio but the PC didn’t. The prevailing wisdom on PC is that codecs are easier to deal with due to the dedicated audio thread. Decisions like that are not made lightly, so now I’m curious what the reason was.

Edit: the reasoning is here: https://www.rockpapershotgun.com/2014/03/12/respawn-actually... The min spec is a 2-core PC, probably to support a large player base, and as noted before there can be 20-40 audio files all streaming from disk and decoding at once. Sure, one file might decode fast, but 40 of them, all also streaming from disk while keeping the frame rate playable - just impossible.


Good points. But there's still the possibility of decompressing during installation, which shouldn't be too hard even for 2-core computers, and is probably faster than downloading.

Also, according to the article they're packing all the locales. To me this seems like a bigger issue.


Keep in mind that Titanfall had to ship on the Xbox 360 and PS3.


Interestingly, those two consoles had dedicated hardware decoders. It doesn't make any sense to have uncompressed audio for them.


If it's any similar to the hardware decoders on iPhones, they're probably limited to n streams. That might be good for playing some background music or ambient sounds, but it gets tricky really quickly when you have multiple sounds playing at once.


Not really: The Xbox has 320 independent decompression channels according to its marketing material, which is kinda powerful. The PS3 had similar tech in its Motion Decoder, but they didn't disclose how powerful it was.

And even if it had just a single decoder, there's always the possibility to pre-decode audio. Or just decode the music/film.


All the console versions shipped with compressed audio due to the anemic hard drive and disk speeds, as well as the tiny amount of RAM.

In general it doesn't make sense to use WAVs. Games have been using compressed audio since the late 90s without any significant trouble, and MP3s are aurally transparent at 320 kbps.


The rumour I heard was that they realised 20% of their customers, using much older PCs, wouldn't be able to decode fast enough, so they distributed the uncompressed audio to everyone because no distribution platform allowed "optional" downloads.


This sounds like the most plausible explanation.


> Meanwhile here I am with a sound card in my desktop PC for no real reason anymore...

I think better op-amps and audio quality are always enough reason to have a separate sound card in your PC.


I'm sure Sony didn't burn Kraken decompression into hardware without a license saying all their third-party devs can use the compressor for Sony's PlayStation console (and future ones, for backwards compatibility at least) in perpetuity.


>playing back full-motion video from a CD on a PS2 with its near-zero available CPU resources

Eh, a PS2 with a ~300 MHz MIPS CPU is more than able to play a VCD with MPEG-1 encoded video. You must be young, because the PS2 is on par with a Pentium MMX/Pentium II, and you could certainly play MPEG videos back in the day.


If that's all it was doing while the video played, sure. That's not all they were doing while videos played. Often, videos were shown while assets were unloaded from RAM and others were loaded from disk.


The PS2 had hardware to decompress DVD-quality video with very low CPU overhead. RAD's Bink video compression (at the time of the PS2) was slower and more DVD-bandwidth heavy; the only reason to use it was if you didn't want a separate set of video files for your PS2 game SKU.

On the PC Bink was a great choice for FMV as it had a really clean API and 'just worked'.


That's nothing when the PS2 is able to play MPEG-2 video (DVDs), something only a late Pentium II could do, often with a hardware decoder or a good video card.

And the MPEG-1 spec is compatible with MPEG-2 based decoders on the PS2, so the effort would be almost nil.

As I said, Gen-Zers underestimate late-90s/early-00s hardware.

Also, the PS2 GPU was a data monster, refilling the VRAM like nothing. Ask the PCSX2 developers about that.


Another thing that has changed from, say, 10 years ago: some games have better launchers and ways to configure them.

What I'm hoping to see on this front, since install sizes are bloating, is that installers/configuration become slightly more advanced.

When I install a game I want to be able to choose the following at install-time:

Texture Packs (the main blobs that take up a lot of space - why download 50GB if you only need the 10GB version)

Language Packs

Voice/Audio Packs

Mode Packs (Single Player Campaigns, Multi player, maps)

This way you can take a game that currently costs, say, 80GB for everyone and average it out to 30GB-50GB for most players. On the low end the same game needs to work at 10GB, and at the high end it can consume the whole 80GB for the person with hardware that can take it. Obviously console players just want to play and get on with it and maybe won't enjoy the choices mentioned above, but PC tweakers/benchmarkers/modders should enjoy it.


A 1TB SSD costs around 100 (€/$). A person who enjoys tweaking their PC can likely get terabytes of storage space easily. I don't really see the utility of fiddling with 10GB here and there...


Okay, imagine you have 300+ games on Steam/Epic/GOG etc... if you were to install 50 of them, that alone would eat up 1TB easy peasy. If you want to install more of them, a 1TB NVMe drive is not going to cut it. Maybe not even 10TB. So what do most people do? They install/uninstall frequently. But for those with crappy internet where 30GB takes 2 days to download, it's better to make a backup before uninstalling/reinstalling. If the backups are smaller, that helps too.

So yes, 10GB here and there isn't much... but collectively, if you have 1,000,000 users downloading 300GB less per month (I know, silly numbers, it's way more in reality), it would make a huge difference to the "gaming industry & network load" effects as a whole. Plug in some real numbers (which I don't have): players x average number of games downloaded per year x average game size.

Other industries like streaming/music sites optimize their transmission sizes, and even small gains are often worth it (10 million people downloading 2MB less per song on Spotify, while playing 50 songs per day... the numbers add up quickly). Somebody will pay for the transmission - either the consumer or the business. The business only needs to ask whether it's cheaper to pay a few devs to optimize their code or to pay for a fatter data pipe. I think long term, engineering effort always wins vs throwing money at the problem.


I don't have any stats, so I don't know how large this population is that wants to rotate through their Steam library at a high cadence but lacks fast internet. About the lack of fast internet - it feels like streaming and remote work have normalized "fast enough" internet connections of 100 Mbit and up, but I might be completely wrong here.


I wonder if it would be possible to get 250 GB SSDs for like $25 or less. You could then literally sell your game on an SSD. According to this [0] Nintendo charges $8-20 per cartridge to third parties.


Given the popularity of digital sales it's hard to see the business model where this would make sense.

I have a bog-standard internet connection with a download speed of 1000 Mbit. Most games download faster than it would take me to go to the store or order a package.


Yeah, this is basically exactly what Nintendo is doing. I guess for PC and larger consoles you could use larger memory cards that might be cheaper to make. But anyway, it's still a problematic approach since it eats into the margins unless you increase the price of the game, which most game publishers try to avoid.

I really like Nintendo Switch cartridges, and I prefer to buy physical copies when I can, but I don't think there's a future in them. I kind of wish someone would make "smart cards" that just give you access to the game online. Since there are so many essential updates to games these days, that's almost what cartridges are anyway.

You forgot the reference link btw.


Yes, https://www.aliexpress.com/item/4000266629080.html - guessing that the AliExpress price is 2x the Taobao price which is ?x the quantity price. I do wonder how fast the controller/flash in this is, though (this is limited to SATA3) - that's now many times slower than the Series X / PS5 speed.


> "...literally sell your game on an SSD..."

That would be terrible from an archival perspective. Flash memory is subject to charge leakage that causes data to be lost over time. High-density flash, which has the lowest cost per byte, also has the poorest data retention lifetime.


Not everybody has gigabit FTTH. On a 25 Mbit connection waiting for humongous downloads really sucked when I wanted to play a bit of GTA 5 (not even online!) after not touching it for a few weeks.


A 1TB NVMe drive isn’t that cheap yet (though it’s getting there). I’ll never go back to spinning drives now that I’m running 2x NVMe drives, haha.


Complicated launchers often fail with proton.


So in Steam, when you go into the properties of a game, there is a DLCs tab. When you select/unselect DLCs, it will often trigger a download (although a small one). With Dota, they deliver support for OpenGL & Vulkan basically as DLCs.

So the mechanism to have modular game delivery already exists in steam, they can just repurpose it and give it a different label, or bake the options into the Install dialog so you can choose the parts you want upfront.

And yeah, other launchers are a pain (with proton), I agree.


The PS2 had a hardware MPEG decoder and could also play DVDs so I don't think it had much trouble doing FMV.


Yeah, it's like most of HN must be kids, because, FFS, it's a PS2, not a Nintendo from the 80s.

They are so used to badly implemented Electron software with humongous CPU and RAM requirements that basic performant tools are mind-blowing to them.

Heck, any Pentium II on par with a PS2's CPU, or better, could play DivX videos under MPlayer on Linux/NetBSD with ease. 420p, OK, but video.


Every AAA game still uses RAD tools. Compression is still a thing for various reasons: fitting more textures in the GPU/RAM/etc., for instance, with decode happening on the GPU.


It seems weird to use specialized video decode hardware when videos are almost always played in a full-screen exclusive fashion. Like, other than loading, what is the CPU doing during those times?


I worked on COD:AW and we used video in-game often. The lead UI artist would have used it significantly more if performance had been better. It might be an exception, but it's not uncommon; I've worked on multiple titles with full 3D and video playback at the same time.


Hang on, the performance of playing static video content was not good enough? (and worse than dynamically rendered game art?)

You mean a single full-screen video, or videos inside the 3D content?


Video used as textures, either in-world on 3D meshes, or as part of UI (HUD) elements. It could be prohibitively expensive to play a single video in some scenes, and we didn't support multiple videos. Not because it wasn't possible or anything, mostly because of performance, although there would also have been an engineering cost (i.e. I would have had to add support to the engine).


Ah, video textures. That makes sense and indeed sounds expensive.


Also remember "expensive" is relative. I think it added a little over 1ms to total frame time per video, but in a 60fps console game that's a lot, since the whole frame is only ~16.7ms. Our budget was about 1ms or less for the entire UI during gameplay (including scripting, layout and rendering), so a video could more than double that. 4 simultaneous videos would be like 1/4 of the entire frame :).


Videos can be used for a lot of different things. You can animate textures, obviously, but you can also animate things like normal maps to get really interesting effects. I recently read an article that described how one of the more recent Assassin's Creed games used animated normal maps to make rainfall patterns on the ground.


Likely the performance of playing small video elements as part of the UI; oftentimes it's faster to keep a whole uncompressed texture atlas in memory for animations and such.


Yes, and in most cases we would do this, but some content made more sense as video.


Videos have been used frequently for all sorts of other stuff in games since the PlayStation, if not earlier. In StarCraft the portrait of your selected unit was a movie file (RAD Smacker, I think? But I don't have a copy of StarCraft lying around to double-check).

Supergiant Games' titles since Bastion all use RAD's Bink codec to store offline renders of their character models (as movies) instead of traditional sprite sheets or realtime rendered 3d models, so they're playing back dozens of movies at all times.

As another reply mentioned, it's also standard at this point for loading screens to be movies.


To be clear, Kraken is for general compression, not video.

Of course you can do video decoding in software, that's what Bink is. It can do 4K60 on a PS4 CPU.


Makes sense. If it were just video, it'd seem cheaper to include an H.264 chip; they're a lot more common and you get the same thing.


Encoding videos across consoles in H.264 is a pain and unreliable. I did it, and so I can see why a big-budget title would use Bink. Each console has its own odd and undocumented requirements for the encoding specification. Think 1079p at 29.99fps on one console while vanilla 1080p 30fps on another. Get it wrong and your video is blank with only a "failed to decode" in the logs.


Absolutely. Massaging video codec settings to exactly match the console hardware requirements and feeding video data in/out of each console's proprietary video API is a royal PITA.

My feeling is that big budget games are prepared to jump through the hoops in order to wring the last drop of video quality (or add hours more 'content') but when timelines are short and engineers are thinly spread Bink is a great way to just get it done and move along.


But there's relatively few consoles out there, which makes the situation much more tolerable.

On the PC, all formats are theoretically supported, but there's always that one person with an outdated AMD driver and Korean Windows XP.

I'll take a small handful of weird but very specific specs over the PC situation any day of the week.


Can’t Bink videos be rendered as textures? A lot of games use such effects on in-game objects (e.g. a TV playing real videos).


Video game loading screens, for instance.


> PCs probably won't get a comparable decompression accelerator card and such a card wouldn't get enough market penetration anyway

I doubt this particular moat means much since PC has practically unlimited disk space. This recent install size hype is basically irrelevant if you're on PC. Just get a nice $100 SSD and continue. It matters only to PS/Xbox because they do not provide easy/cheap expansion.


> consumers for tolerating massive install sizes

You don't find out until after you've bought the game...


I have stopped playing games because the updates were too big and I got sick of waiting for them to download. I'm sure the games companies are tracking this stuff, so I'm likely in the minority.


The PS5 launched with only 667GB of usable disk space, so asset compression still has some use.


The PS5 has hardware-accelerated decompression, which is probably what you’d want to use when targeting PS5.

Mind you, it’s based on an algorithm that RAD devised…


> PC games have been using more uncompressed assets to free up CPU time, massively bloating install size

Filesystem compression could help here. To my knowledge games consoles don't support filesystem compression, but I'm not sure why.


> > PC games have been using more uncompressed assets to free up CPU time, massively bloating install size

> Filesystem compression could help here.

How? Compression is not free; no matter where you put it, someone has to decompress it again.

Are you talking about some exotic SATA controller or other hardware that does that?


No, I was thinking of modern filesystem compression solutions. They offer relatively poor compression ratios, but computationally efficient decompression.
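
For a sense of what that class of codec looks like in practice, here is a rough sketch with LZ4 (used for transparent compression in ZFS, among others); the ratio is modest but the decompression side is extremely cheap:

    #include <lz4.h>
    #include <cstring>
    #include <string>
    #include <vector>

    int main() {
        std::string asset(1 << 20, 'A');  // stand-in for an uncompressed asset blob

        // Compress once, e.g. at build or install time.
        std::vector<char> packed(LZ4_compressBound(static_cast<int>(asset.size())));
        int packedSize = LZ4_compress_default(asset.data(), packed.data(),
                                              static_cast<int>(asset.size()),
                                              static_cast<int>(packed.size()));

        // Decompress at load time; this is the part that has to stay cheap.
        std::vector<char> restored(asset.size());
        int restoredSize = LZ4_decompress_safe(packed.data(), restored.data(),
                                               packedSize,
                                               static_cast<int>(restored.size()));

        bool ok = restoredSize == static_cast<int>(asset.size()) &&
                  std::memcmp(asset.data(), restored.data(), asset.size()) == 0;
        return ok ? 0 : 1;
    }

With a filesystem doing this transparently, the game just reads files; a decompression cost of roughly this shape is paid inside the read path.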


That would undo the entire point of it, no?


> massively bloating install size

On release the new CoD:MW was 130 GB. A few updates later we're now at over 256 GB.


So you’re saying Fortnite is gonna be amazing for us PS5 owners?


Lots of love for Bink, but Bink's predecessor was an earlier video codec/format called Smacker, built especially for 8-bit colour palettes, where the palette (or part of it) might be fixed because it was needed by UI elements outside the video, but some or all of it might be allowed to vary/rotate as the video played.

I first encountered Smacker videos in Caesar II (which even shipped smkplay.exe on the disc so you could play them from outside the game), but they were used for a lot of games of that era, including a bunch of early Blizzard titles like Warcraft 2 and StarCraft.


I immediately thought of Caesar II when you mentioned Smacker :). One of the impressive things in that game (besides being a great game) was the many very smooth video clips, enabled by that tech.


I played Caesar II for days and days as a teenager. Awesome game.


smkplay for life! Interlaced and all! Though Bink is great too.


RAD's Charles Bloom has a blog (https://cbloomrants.blogspot.com) which is excellent for anyone interested in high-performance programming. Especially compression.


Especially compression. The compression provided by RAD Tools is nothing short of otherworldly -- it can provide compression ratios comparable to XZ, but with decompression speeds more along the lines of what you'd expect from gzip (or even better).

I hope that Epic does something interesting with what RAD has built here. Taking it more proprietary (e.g, halting new licensing) would be a huge blow for the industry.


How does it compare with zstandard these days?


Cbloom has a bunch of blog posts with charts purporting to show it doing significantly faster decoding than zstd at every choice of compression ratio.

I would trust him to run his own benchmarks fairly and properly, but I am not sure if there have been good independent benchmarks done.


Not just high-performance, his "Library Writing Realizations" post is also well worth a read whatever domain you're working in.

http://cbloomrants.blogspot.com/2015/09/library-writing-real...


Everyone who works there is extremely talented. Fabian, Sean, Allen, etc. Jeff has a very keen eye for it.

He's funny, too. Highly recommend The Jeff and Casey Show for podcast content.


Fabian Giesen, also at RAD, has a very informative blog as well: https://fgiesen.wordpress.com/

They have a seriously impressive group of developers.


What were they paying to keep these greybeards around?


RAD always paid very well by gamedev standards. But that was never the main reason people worked there.

Some of the engineers are older but Fabian is in his thirties and last I checked his goatee wasn't greying.


AFAIR a public H1B entry from mid-2000 shows $300k salary, which was impressive at the time even for non-gamedev.


That's insane for wages.


Work environment matters more than pay, and the pay presumably was very good. Imagine being down the hall from the people who literally started indie game development, or demo-scene royalty.


Interesting job with other people they like is probably a good guess.


For those interested in what RAD Game Tools actually makes, as far as I understood from their website (http://www.radgametools.com/):

Bink 2 - video codec Oodle - data compression suite Telemetry - profiling toolkit Granny 3D - toolkit for building 3d games Miles Sound System 10 - sound authoring toolset in 2D and 3D


Basically any game you've played that has any pre-rendered video (for cutscenes, logos, whatever) most likely uses Bink video for it.

Their stuff is very well regarded and widely used.

At least that is the impression I have got, watching from outside the industry. It's always possible that I've got a distorted view as an outsider.


> Basically any game you've played that has any pre-rendered video (for cutscenes, logos, whatever) most likely uses Bink video for it.

Unless it was a DOS game from the 90s, in which case it may have used its 256 color predecessor Smacker.

It's amazing how this tiny company has managed to stay relevant through decades of huge technological shifts.


You need extra newlines. Reformatted:

Bink 2 - video codec

Oodle - data compression suite

Telemetry - profiling toolkit

Granny 3D - toolkit for building 3d games

Miles Sound System 10 - sound authoring toolset in 2D and 3D


It doesn't appear to be mentioned on the site, but they also collaborated with Sony to develop the hardware decompression unit in the PlayStation 5.


Sony hardware engineers put in a herculean effort designing the awesome custom hardware decompression chip, but it was more or less just implementing the existing Oodle algorithm, which they licensed - more like a translation of an existing book than writing a new work.


What’s amazing is they have made these same products for like...25 years?


Pretty much. Though they've periodically updated them to use new techniques like animation blending. I think there's also a lot of work in optimizing for different platforms. And perhaps more importantly, there's a lot of work in making the middleware easy to integrate. Ease of integration is often more important than performance for a lot of middleware.

I've seen their logo on tons of games. I never knew they were a local company (Kirkland, WA near Seattle).


Not to geezer-splain, but they had a fairly heavy advert presence in gamedev magazines and Dr. Dobb's. They are right behind Attachmate in being a big software outfit in Seattle.


Really hoping Epic takes the opportunity to give Bink a more open license, to be honest. Much as I appreciate how advanced Bink was at the time, I don't know any dev that still uses it given that AV1 and such are just Good Enough.


Bink isn't just the codec (though that's good too), it's the software stack. How do you render AV1 on a PS4 or a Nintendo Switch, and synchronize it with the audio stream? No, seriously, tell me.

GStreamer with its ridiculous threading architecture and lock-heavy approach? You think that's going to run on a console CPU? ffmpeg / libavcodec? Yes, we can hook that up, have it crank out YUV buffers and display those on the GPU, or we can just use Bink. Yes, it costs a ton of money, but based on when I evaluated it last, it was almost twice as fast as libvpx on our target platforms, it was well-suited for games (worked well with custom allocation strategies, didn't start a bunch of threads by itself, integrated well into our renderer), was easy to integrate, the toolchain was lovely, and when we needed help with a bug, we got an email back the same day from Jeff. RAD's support is top-notch.

Multimedia is not so big a problem that a company like Epic couldn't spend a few days integrating libvpx and Opus to make a custom video solution, but it's big enough that most smaller studios with custom engines don't have time to spend on it, and those are the sorts of problems that RAD likes to tackle.
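
To illustrate the "crank out YUV buffers" option, the core of a libavcodec decode loop looks roughly like this (a hedged sketch: demuxing, audio, A/V sync and error handling are omitted, and upload_yuv_to_gpu is a hypothetical stand-in for the engine's texture upload):

    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstdint>

    // Hypothetical engine hook: takes planar YUV and does YUV->RGB in a shader.
    void upload_yuv_to_gpu(uint8_t* const planes[], const int strides[],
                           int width, int height);

    // Feed one compressed packet in, hand any resulting frames to the renderer.
    // Pass pkt == nullptr at end-of-stream to flush the decoder.
    void decode_packet(AVCodecContext* ctx, const AVPacket* pkt, AVFrame* frame) {
        if (avcodec_send_packet(ctx, pkt) < 0)
            return;
        while (avcodec_receive_frame(ctx, frame) == 0) {
            // For planar formats, frame->data[0..2] and frame->linesize[] hold
            // the Y, U and V planes.
            upload_yuv_to_gpu(frame->data, frame->linesize,
                              frame->width, frame->height);
        }
    }

The hard parts the parent is describing are everything around this loop: building libavcodec for a console toolchain, clocking frames against the audio stream, and keeping its threading and allocations within the frame budget.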


PS4 and Switch both have hardware decoders (though neither would be using AV1 in particular due to age). I'm not sure about how it works on PS4 but on Switch games just use NVDEC like a PC with Nvidia hardware would and the functionality is exposed in the SDK. You can use h.264, h.265, VP8, and VP9 (the hardware supports more but it's not exposed).

When Bink was released and the PS2 and GameCube were on the horizon, this kind of thinking was needed. When you get to the PS4/Switch you're just running consumer PCs. The exact same Nvidia SoC used in the Switch is actually used by Nvidia for their Shield Android TV boxes, whose literal purpose is to play audio/video streams - it's quite capable of doing so.


I can't go into more detail because these platforms do not have any public documentation, but this is not true, you cannot use NVDEC. And it doesn't cover audio or synchronization.


Without referencing non-public documentation https://yuzu-emu.org/entry/yuzu-nvdec-emulation/

"The Switch uses NVDEC along with VIC (Video Image Composer) in the GPU for video decoding. Games are exposed to higher level APIs by Nintendo which abstract the configuration of these modules."

If you're using a custom engine which does not want to use these APIs you're of course free to do it yourself (and deal with how you're going to synchronize it, and the higher resource utilization), but most major titles use the built-in hardware decoder provided via the API.


Just curious, is there any reason the MS/SIE consoles (AMD custom SoCs) won't implement a hardware RAD decoder? From what I read in this submission, their codec is good and used everywhere, so it looks like it's better to have a HW decoder.


SDL has always been the stack of reference when implementing your own engine, as far as I've seen in game dev. That's probably not the case for AAA games like Cyberpunk etc.; I don't follow the tech there as much. But yeah, I've never seen GStreamer used for that. Curious if you can elaborate about your experience, especially wrt the Switch like you were mentioning downthread; I'm super curious.


GStreamer and FFmpeg would certainly run on a Switch, a PS4, or whatever else. We know this because they run on devices with the same processor.

That said, yes, Bink certainly is a more turnkey solution, which can definitely make business sense to use. This is a byproduct of consoles being closed platforms.


When I say "run", I mean "runs at target performance". Yes, sure, Turing completeness technically means that software raytracing "runs" on the Switch, we all know that, but it's not a viable solution for a shipping product.

This is not a byproduct of consoles being closed platforms; multimedia is a challenge everywhere. Cross-platform, usable multimedia playback lacks a high-quality product even on open PC platforms.


FFmpeg and GStreamer both run on Nvidia Tegra hardware and AMD Jaguar/Bulldozer at 4K 60fps.


Gstreamer is more suited for filtering/FX and media overlays.


That's true, but on the Bulldozer/GCN system I had as well as the Jetson I played with it was still able to decode 4K 60fps video.


Shows up on the Cyberpunk 2077 splash screen.


Doom uses Bink, Cyberpunk uses Bink.


Almost all of these logos show up at the beginnings of my games. Cyberpunk most recently comes to mind.


There's a lot of focus here on Bink, but Oodle has a fascinating history as well. From world-class compression ratios/decompression speeds to being implemented in hardware by Sony in all PS5s. RAD have a great history of getting exactly the right people that are interested in their fields and are great at what they do.

I echo the sentiments of many when saying I am surprised no one had scooped them up earlier (although I doubt it was for lack of offers).


Oodle is very expensive though.


Ehhh, I mean it's not cheap, but definitely within the realm of reasonableness for a funded semi-indy game - and to be fair, if you're creating enough assets that using Oodle makes a substantial saving to your disk/memory footprint (i.e. your game package is in the multiple GB range) then probably you have some money to pay all those artists, so hopefully you can afford Oodle. If you're doing a pixel art or retro game, it's probably not worth it.

The biggest pain in the ass with Oodle is recompressing everything after you integrate it, that first time takes FOREVER.


I wonder why some companies do not publicly post their price of the product? Or at least a price range? What's the harm in that?


Because pricing on stuff like that is negotiable and they don't want people to have a starting point.


Or less favorable take from Joel Spolsky: "Bad Idea #2: How Much Money Do You Have? Pricing." https://www.joelonsoftware.com/2004/12/15/camels-and-rubber-...

I too find those companies weird. They seem to have a nice product, and the website boasts how nice it is, but there's no publicly available documentation, no evaluation SDK, no pricing; the only thing there is a sales@ address. And while their product might be good, it isn't really unique. Why would I jump through hoops to use it when I can likely implement video playback with libavcodec and generic compression with zstd or LZ4 faster than their sales people will reply?


You're making a lot of assumptions. You can probably get a free trial just by emailing them, and you will be in direct contact with the dev who actually wrote the tool. A number of their devs have/had blogs where they were constantly talking about the development of the products etc. First of all these guys are superstars in the games industry, so everybody knows about them, and secondly they are super approachable and responsive.


That was a great read. Thanks for posting.


If nothing else, libavcodec is LGPL.


Well yes, you have a point there when developing for consoles.


I see. Thank you.


It always seemed like RAD makes the kinds of tools you want as a developer who likes to run their own core systems and license powerful components for key problems. To think these tools will now be locked into the bloated Unreal Engine makes me feel a bit sad. It's like a lot of game companies are getting bought up by giants in a "just because we can" fashion recently, and I can't help but wonder how this de-prioritizes quality and diversity in the industry.


Epic would have to be out of their mind to lock this tech up. Licensing it out to other companies brings them pure profit, and they still can incentivize people to use Unreal ecosystem by integrating this tech into their engine.

They haven't given me any reason to worry yet: when they acquired Quixel, they made Mixer and Bridge free, and cut the price of all Megascans assets in half for everyone while making them free for Unreal users.


They're in a strategic fight with Unity, though. The goal might not be to profit from these products specifically.


It certainly could happen down the road, but the post says the opposite for the time being, that they'll continue to support usage outside Unreal Engine.


We'll see about that in a few years when some of the contracts start to expire and the company becomes more integrated.


That's awesome! Congrats to the RAD team and Epic! Epic certainly has acquired a world-class team. Also, I've been a long time fan of Jeff's podcast, it is irreverent and hilarious if you haven't checked it out!

https://jeffandcaseyshow.com/


Casey, in this case, is of course Casey Muratori of Handmade Hero fame.


Of all the RAD offerings Telemetry is the most special. It lets you record millions of CPU profiling events (thousands per frame) with extremely low overhead, and then shows those events in a hierarchical timeline.

For game optimization it is completely unique and literally game changing. It will show you where your hotspots are, where your threads are starving out, which CPU core is running an event (and when it is thread-switched), where your random frame hitches are happening, how your cpu workload is being spread across your frame. You can seamlessly zoom from seeing the scheduling/tasks for 100 consecutive frames of a game down to what was happening in a 100us slice of a single frame.


How does it do these things without a high CPU overhead?!


Based on no knowledge at all about Telemetry, probably something similar to the amazing trace functionality in the Linux kernel.

That is: non-shared (per-cpu in Linux) ring buffers, small simple events, and formatting done after the fact.
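
A minimal sketch of that general technique (a guess at the shape, not RAD's actual implementation): each thread appends tiny fixed-size events to its own buffer with a couple of plain stores, and a background reader drains and formats them off the hot path:

    #include <atomic>
    #include <cstddef>
    #include <cstdint>

    // Illustrative only: one small, fixed-size event record.
    struct TraceEvent {
        uint64_t timestamp;   // e.g. TSC or another high-resolution clock
        uint32_t id;          // interned zone/label id, not a string
        uint32_t flags;       // begin/end, thread switch, etc.
    };

    struct TraceBuffer {
        static constexpr size_t kCapacity = 1 << 16;   // power of two
        TraceEvent events[kCapacity];
        std::atomic<uint64_t> head{0};                 // written only by the owning thread

        // Hot path: a few stores plus an index bump, no locks, and nothing shared
        // with other threads' buffers (so no cache-line ping-pong).
        inline void record(uint64_t now, uint32_t id, uint32_t flags) {
            uint64_t h = head.load(std::memory_order_relaxed);
            TraceEvent& e = events[h & (kCapacity - 1)];
            e.timestamp = now;
            e.id = id;
            e.flags = flags;
            head.store(h + 1, std::memory_order_release);
        }
    };

    // One buffer per thread; a background reader periodically acquires `head`,
    // copies out new events, and does all the expensive formatting/symbolication
    // later (possibly on another machine entirely).
    thread_local TraceBuffer g_trace;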


For those who don't know, Fabian Giesen (@ryg of demoscene group https://en.wikipedia.org/wiki/Farbrausch) works at RAD.

his blog: https://fgiesen.wordpress.com/

one of my favorites: https://fgiesen.wordpress.com/2012/04/08/metaprogramming-for...


I’ve been using Unreal for the past 6 months to make a game. It’s not as easy to get started with as Unity, but if you are good at figuring stuff out then it’s just a powerhouse. The renderer is incredible, the workflows are modular, you can work in a data-driven way, you can modify skeletal animations in-engine, performance is amazing, etc. etc. It’s also stable, and non-production-ready features are clearly identified as beta or experimental.

I wish this was next week so I could post a link to my alpha build at the bottom of this post


You are welcome to return and share your work in seven days!


As far as I know, they are one of the few video game compression companies out there. Undoubtedly this is amazing news for Unreal Engine 4 devs, but for those not using the engine this is going to be a bit of a blow.

On the plus side, this opens the doors for a "Pied Piper, but for games" style startup to come out of nowhere and take big market share.


"RAD will continue supporting their game industry, film, and television partners, with their sales and business development team maintaining and selling licenses for their products to companies across industries - including those that do not utilize Unreal Engine." - the article

How long this will last is anybody's guess, of course.


Pied Piper did have a whole subplot about compression in the context of reducing VR video game bandwidth and latency. You might be more spot-on than you think :)


I think we should all congratulate these guys. They have been knocking it out of the park for years, and I hope this brings them financial security, hopefully a life-changing reward.


Yes, the people who work there and who passed through there are the best of the best.


If Epic wanted to make a video game streaming service like Stadia, the video compression knowledge these guys have could come in very handy I imagine. So maybe it's not just about Unreal Engine, but about making the Epic store in to a streaming service.


RAD had an amazing software renderer on par with the Unreal one.


Pixomatic wasn't just on par with UE1's SW rasterizer; it was somewhat explicitly designed to be the SW rasterizer for UE2. The thing was notable for having feature parity with HW Direct3D implementations of the time, and it worked by dynamically compiling D3D shaders into x86 code (which had the interesting quirk that it used ESP as a general-purpose register).

IIRC the idea behind Intel's Larrabee at least partly involved running Pixomatic on that hardware.


I think the Havok engine ran on that in software mode too, at least for the first Max Payne. I remember forgetting to install the NV drivers back in the day, and the game launched smoothly at 640x480 with no issues. OK, it was uber-grainy, but still playable.


I suspect Intel was going to run OpenSWR on it - that's the software renderer that they eventually donated to Mesa after Larrabee shut down.


I'm not sure it's great news, tbh; in 5 years everything will be in UE, so people not using Unreal will be screwed.


I've seen Epic using Oodle in Fortnite replay files, so it makes sense for them to stop paying for a license and buy the company instead. I'm wondering if they're already using the other tools from RAD Game Tools, or if it's only about this compression algorithm!


These guys are my heroes!


Man, Epic is absolutely flush with cash. They’re moving their HQ down the road a mile or so by buying a dying mall for $100M and hiring 2k more employees.


Well, they're a big company right now, still riding the revenue success of Fortnite and the potential of their launcher/store. So investors are interested, including Chinese investors that own a decent swathe of shares. I just hope they don't become such huge industry giants that they stomp on anything with a faint smell of competition.


Sony recently infused Epic with $250M for 1.4%

Source: https://www.sportspromedia.com/news/sony-epic-games-minority...


I’m just amazed with what Fortnite has done for them. I pass by their current HQ regularly and it’s pretty modest. You’d never suspect that’s the HQ of such a successful company. I was surprised to learn they’re still taking funding. I figured they were pulling down absurd profit.


They literally bought a mall to use as a new HQ. https://www.eurogamer.net/amp/2021-01-04-epic-games-buys-sho...


Yep I mentioned that. It's a mall that's been slowly dying for several years. It is absolutely massive too.


Their legal challenges with Apple might’ve put a damper on things.


Epic's management is one of a kind. They made an amazing game that millions of people love and play, which makes them billions; then they decided to compete with Steam in the digital video game marketplace, to the point of bribing devs for exclusive releases on the Epic Store; then they got in a fight with Apple over App Store fees and got their game kicked out of the store. I mean, one hell of a ride. They grew some big balls in the last couple of years.


> Their legal challenges with Apple might’ve put a dampener

They started the fight, and I think it's safe to say they knew what they were getting into.


Of course they are. Tencent bought a "not technically a majority" stake there to try to breach Apple and push a 3rd-party app store (Apple has $7 billion in revenue just from selling game apps in the App Store). They will finance anything Epic does as long as it's useful for their plans.


Yeah, but they still generated a billion in profit last year all by themselves. Not exactly a “we need a cash injection” situation, which is why I was a bit puzzled.


[flagged]


Oh, so you’re one of the people downvote brigading me. Thanks, now I have your user ID and a comment that I can send the mods as evidence that people were downvoting unrelated comments of mine because they got mad about the thread last night. Thanks for making my case easier to prove!


Forgot about epic games for a second and thought it was weird that a medical software company was buying game tools.


Sounds like a reaction to the recent interview of the famous game repackager Fitgirl


Good - people who focus on performance and efficiency should get rewarded the most.

Now Epic, fix your 1GB Electron launcher; hire me if you can't.


What is the point of making custom codecs for Bink, instead of using state of the art free codecs?


(I don't work in the industry, but am an interested amateur)

[Edit- I may have misunderstood your question, but I'll leave my reply anyway.]

Here are some possible reasons you might use Bink instead of a free codec:

- Bink video has been around for a long time as a product, even though the codec has been updated (probably multiple times, I don't know). There weren't always good-enough free codecs, and if you have prior experience integrating Bink with your stuff then you may prefer to stick to something you know is good even if it costs money.

- RAD focuses on producing libraries that are very easy to drop in, specifically in the context of games. They know how game code is written and structured. They know game developers want to be able to integrate video in a way that works -- and performs reliably -- within an existing game engine that does a lot of custom management of textures/buffers/graphics memory. State of the art free codecs don't necessarily have free libraries that are actually high performance and easy to integrate within a game context.

- Your state of the art free codec might work great on a modern smartphone or PC, but good luck getting it to work on a Nintendo Switch or whatever weird console you're targeting. Bink is available for whatever game-specific platform hardware you're targeting.


> Your state of the art free codec might work great on a modern smartphone or PC, but good luck getting it to work on a Nintendo Switch or whatever weird console you're targeting. Bink is available for whatever game-specific platform hardware you're targeting.

So basically this means something like Bink has limited applicability time-wise, and going forward it will become more and more obsolete because free codecs will cover all the needed use cases sufficiently.


Given that Bink has been around for ages, it has almost always found a niche decoding stuff fast for video games.

As a game, you will always want more cycles for level loading or whatever else happens in the background while you play that video, so a Bink-like specialized, super lean and fast library will always have a use in that scenario.

GStreamer is a pain to set up, and when you do set it up it uses tons of threads; ffmpeg isn't nicer either.


> Given that bink has been around for ages, it almost always finds a niche of decoding stuff fast for video games.

Well, that's exactly the point. It's been around since the dark ages. Today things are different.

For new codecs like AV1 there are new and efficient decoders too, and they are only getting better. It's not all limited to ffmpeg, which uses external decoders anyway; it doesn't decode things itself.


As I said, in games there's always a need for fast, platform agnostic (and super low overhead API that just works). And I mean truly platform agnostic, PS4, PS5, Switch, etc. Those are new platforms that may or may not have ports of whatever else is "available" and "fast".

Bink fits the bill today and has fit the bill since forever. I don't see any of the alternatives having similar performance characteristics, ease of integration or availability on platforms that are more or less closed and there's no indication of them being open in the future.

In addition to that, the switching cost is quite high, so any alternative has to be good, and I haven't yet seen a contender.


Consoles held progress back for decades, but even they can't stop it. So I see Bink becoming obsolete, I don't really see any major arguments for it. Just because it fit the bill before doesn't say anything about the future.


The "we can use worse software now that hardware allows us to do it" is why software is sunk so low these days.

We use Bink because it is better then the open-source alternatives, and that's it.


Yeah, those Switch sales are tanking, while PS5 and Xbox Series X/S consoles pile up unsold on the store shelves.

Ah, you mean that 1% market share on Steam, sorry.


Not sure about Switch, but recent AMD based consoles should have good video decoding capabilities. You can do fine without Bink there.

So instead of trolling again, analyze where it's heading. I don't see any reason why Bink can't be replaced by free codecs.


:D It's heading towards what you predict at about the speed we're heading towards the year of Linux on the desktop.

The PS4/XB1 are enormously more powerful than the PS3/Xbox 360, and yet almost all games on those consoles that play a video do it using Bink, even though both consoles have that AMD hardware you mention.

The current-generation PS5 etc. is way more powerful than the PS4, and yet Bink still seems to dominate in the limited titles that we can see.


That's not a technical problem; it sounds like an incumbent problem to me. And this also negates your argument that Bink is needed because the hardware can't support other codecs. Apparently it's powerful and it can? So why even bring that up as a reason.


Because what matters is what comes on the SDK and AAA game engines, regardless of what AMD supports.

I can call Epic, Nintendo, Sony, Microsoft, Apple, or Google when something doesn't work as expected; with free codecs dumped on GitHub, who knows.


It's a combination of patents, features, target devices, and marketing proposition. There are many devices with hardware decoders that work great for watching a TV show, but if you're using video for interactive assets as games do, you probably have latency targets to hit, you might want seek-through performance, you may want transparencies, and so on. And RAD has long had a lock on really supporting the platforms game devs are using, so if you use their stack instead of bodging together a video solution, you're saving time overall.


The free codecs might not necessarily be designed or optimized for the platform you're working on. Porting an open-source player for your particular situation might cost more in developer time than simply licensing Bink. Bink's developers are also going to be more conscious of the technical limitations of a game console that you might be running into (eg: hard thread/socket/resource limits) compared to a more general-purpose codec.

This becomes a more notable tradeoff for a studio looking to get funding for a particular milestone from a publisher. Your technical leads (and content creators) likely also cut their teeth on games that integrated it, so it has existing operational momentum too.


> The free codecs might not necessarily be designed or optimized for the platform you're working on. Porting an open-source player for your particular situation might cost more in developer time than simply licensing Bink.

This sounds like an artifact of the past. Today there aren't that many hardware platforms that free codecs don't already cover properly. And I expect with something like push by AOM this will become even less and less of an issue in the future. Codecs are becoming more of a commodity in this sense so something like Bink I expect will be less and less relevant.


Maybe. Real time and production often have special needs that general-purpose tools don't cover. A lot of delivery codecs focus on things like lossy compression and buffering. I know from video production and computer graphics that delivery codecs are miserable to work with, for reasons such as: not allowing arbitrary channels (maybe you only need 1 channel, or maybe you need 5 for an alpha and a bump map), playing in reverse, cutting in on arbitrary frames, different compression on each channel (lossy RGB, but lossless alpha), etc. My understanding is that Bink is often used for realtime playback of multiple elements integrated with the 3D environment (triggered explosions, HUD elements, streaming textures), so it needs to be performant, handle many at once, and integrate with the engine, since you'd be applying transforms and LUTs.

It'll be nice if there are great, free, off the shelf codecs and tools, but at least right now almost every current AAA game seems to use Bink.


Modern codecs like AV1 and whatever succeeds it can already be good enough for such special needs too. It's only natural for free codecs to catch up to that eventually.


Sounds like a game developer's competitive advantage waiting to happen!


I sure hope so.


That's true. Windows-oriented developers don't realize libre codecs are everywhere, even under Windows and OS X. Ditto with graphics and text handling libraries; I am pretty sure FreeType or HarfBuzz exist in some form under Windows and OS X.


If you get video streaming with decent resolution and frame rate running on a PS1 directly from a CD using AV1 leave me a note. That'll be a good hire.


PS1? Why would you care about PS1? Besides, when did Bink become the only codec supported even by old consoles?


The PS1 played MPEG1 which was more than enough.


Look into cbloom's descriptions of Oodle.



