As a former POW (Vietnam), I hesitated to play WOLF for over a
month after downloading as I feared flashbacks. I didn't want to
remember all that I had been through all those years ago, when, as
POWs, my friend and I decided an escape attempt would be better than
a slow death by torture and starvation.
My friend and I made crude maps and hoarded food. The day of the
escape we clubbed the guard with stones, took his gun and fought our
way through two levels of underground tunnels (only a few guards, and
we had to crawl). I made it, my friend didn't.
Dreams...NO! NIGHTMARES...YES!! However, the more I play
WOLF the less frequently I have nightmares. The chilling part is
turning a corner and seeing a guard with his gun drawn.
WOLF is a powerful game. Fearful as well. I believe that a
person should face the past. So... when I can play EPISODE 1
comfortably (no nightmares), I plan on ordering the full series.
There's also a letter from a Microsoft manager requesting a multiplayer version.
I never ended up going into games myself, but in no small part I owe my career to him.
As well as the Doom book, his plan files are a fascinating time capsule to go back over, particularly if you 'came of age' during that golden age of PC games development.
Note JC's comment that "this is just my pet research project". If you've ever played quake and then quakeworld online, you'll know what an understatement that was and how freaking awesome the online gameplay became with qw.
So really, thanks John Carmack for the countless wasted hours :)
Amen. Especially when you consider that a massive number of players, such as myself, were using dial-up (56k) modems. QuakeWorld took multiplayer from a slideshow to something comparable to single-player performance, albeit with hiccups.
Back then, just managing to set up XFree86 correctly felt like a huge personal victory, and running Quake there was really the cherry on top.
I seem to recall having to actually input a bunch of numbers/frequencies taken from my crappy CRT monitor's specsheet into the X config and basically bruteforcing it until it worked. "Good times"!
Those were particularly thorny for me too, when I was getting started with linux (and was still a newbie with computers in most respects).
"Good times" indeed :)
Never really realized it until now, but this is true for me.
The original Quake, in its heyday, is what sparked my interest in Linux. Linux then sparked my interest in, well... more Linux - and gave me a new direction for my eventual shift into engineering. It really shaped my entire career.
I had heard somehow, somewhere that you could squeak out slightly better frame rates and ping times in Quake if you used this alternative OS called Linux (which really wasn't true, in retrospect). And so that began my quest, which triggered a cascade of deep dives into linux and its community for me. Everything from installation issues and troubleshooting, to compiling kernel modules (and everything in between). I even ended up corresponding with the driver developer for one of those modules (not yet in the kernel). A bug had surfaced - he actually fixed it and sent me a patch. So then I learned how to patch a driver too. One of the best "customer" experiences I've ever had, really :)
Ironically, I might have ended up in game development had there never been a linux port for quake :P
There's a table in the above link, as well as a few more details
First person immersive sims are generally best played on the PC. Once consoles capable of decent graphics stuck around for a reasonable period (i.e. once the pace of hardware evolution slowed) and came into their own, third person control worked better with joypads, and for commercial reasons PC games often ended up as second-rate ports of console games, with compromised controls. You really need a mouse for first person.
You did, until Golden Eye N64.
And many of those early games (Wolf3d, Doom, Hexen) didn't have mouse look on by default; you played them Closed (Arrow Keys and Right Ctrl/Alt) or Open (Arrow Keys and Left Ctrl/Alt).
It really wasn't until Quake I/QuakeWorld that FPSes made the shift to Mouse+WASD. And even then, in Quake 1, mouse look was off by default.
Improvements in aim assist really make console FPS games more approachable. Still not up to the usability of a mouse, but if the game is thoughtfully designed it meets a "good enough" bar.
I wasn't a big fan of Rare's cursor-aim functionality, though; it felt very clumsy.
Personally I'm a fan of sticky aim assist (Halo) or snap-to-body aim assist (Call of Duty, GTA). These systems are subtle enough that players may not even realize they are being helped.
But, my point is simply that GoldenEye 64 was a console FPS where the controls weren't clunky (for its time).
FPS stands for First Person Shooter. The games you're listing don't simulate much, even if some of those games have RPG elements.
FPS is too reductive. I'm using the phrase immersive sim in the same way as these articles:
See also Wikipedia - https://en.wikipedia.org/wiki/Immersive_sim
In 92 the Amiga 1200 came out, a wonderful computer with great hardware but unfortunately it never made it big. Four years later in 1996 mass market PCs were coming out with Pentium processors, 3dfx accelerators, high resolution bitmap displays, internet connections and powerful general purpose sound adapters.
Today we have orders of magnitude more computing power, but essentially they are powering the same experiences, albeit far more polished.
If we look back four years, to 2014, not much has really changed. The big game releases are indeed the same games (in the 90s they'd have been called mission packs).
The magic of experiencing something completely brand new, completely alien and magical, has gone away. Though, I am hopeful VR/AR is going to deliver that again. I don't know how far off it is, but it would be fitting if it's Carmack and Abrash behind it. Again.
What would a revolutionary Amiga 1000 look like in 2018, with modern hardware and the latest OS/system ideas?
Gaming consoles could be considered examples of this, but they are kind of single purpose.
Maybe a topic for an AskHN, for a rainy day.
- revolutionary custom graphics hardware
- better sound hardware than competitors
- pre-emptive multitasking
Therefore, if I was attempting something like this today, I would:
- make the graphics preeminent in the system. In other words, keep the graphics card and throw away the legacy CPU. Let it drive PCI devices.
- on its own, doing that makes it much harder to program, so put considerable work into making the instruction set and programmer's model open and well documented
- find a DAW engineer and let them build the audio subsystem with an obsessive focus on low latency. Let's aim for no more than 10 samples latency between input and output processing and see where that gets us.
- full multitasking in which nothing is ever allowed to block anything else unrelated, through resource reservation (qv the Nemesis research operating system). Having an Electron app on the system should not impair anything else, and the system's default editor should also be focused on low-latency.
- apps are by default fully security partitioned from each other. The operating system would maintain a CRDT-based record-orientated personal data storage system, incorporating lessons from PalmOS. This gives both native sync and automatic persistence across power-off.
- low latency non-USB keyboard and mouse. PS/2 would actually work but we could go for something really surprising like gigabit Ethernet or optical TOSLINK.
(Low latency is a good example of a feature which is extremely hard to retrofit and you end up redesigning the system around it).
If you throw away the CPU, how do you run general purpose programs (of which games are one type)? If the graphics card can run them, why are you not calling it a CPU?
The Amiga had a CPU :) Never owned one, but I understand a key feature was that graphics were very easy to program (mapped to main memory, or so a friend who owned one told me).
In hindsight I feel the Amiga's custom chips were actually an Achilles' heel. Yes, they allowed each model to sprint out of the gate, but they also limited how far the user could upgrade them.
In contrast the only thing on the motherboard of a PC of the era (once we hit the 386 and later) was the CPU and RAM. Everything else lived on replaceable boards hooked to a bus. And as the Pentium era showed, even the bus could be "replaced" by placing a new one (PCI) side by side with the old one (ISA).
All this allowed a continuation of sorts, where a humble DOS box could live on into the Win9x era in some way or other.
I would argue the closest mainstream machines currently available that meet this requirement are Macs and iOS devices. Sure, there's a handful-ish amount of different configurations, but they do not vary greatly. Backwards compatibility certainly isn't much of a concern in macOS.
Using the Motorola 68000 with 32-bit internal/16-bit external buses and color bit-mapped graphics, my ST, hardware-wise, totally blew away all other PCs from a technical POV, and it just annoys me when the Amiga gets all this love and the ST gets none whatsoever.
There are some great videos on YouTube (particularly the Computerphile ones) on the Tramiel-era Atari computers. They are great little machines to help learn computer organization on. They were powerful yet still relatively simple machines that you could fit in your head.
E.g. I love this one on linked lists where they write some 68k assembly and run it on a STE and Falcon. https://www.youtube.com/watch?v=DyG9S9nAlUM
I grew up in that era and agree that it was a golden age, but of course I'm aware that nostalgia plays a big part in that. If you look at, say, 2000-2010 or even 2010 to now it seems things have slowed down - which isn't necessarily a bad thing.
For a variety of reasons, in the last decade (or the last couple), video games matured as a narrative medium, which is a radical difference from the 90s. One could argue that, from a holistic perspective - including an artistic one - the 2010s are the golden era.
I don't think there is a golden age, though. There are still radical improvements to come in the next decade(s), and it will take a long time to be able to judge what characterized video games/development in each era.
Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks. Hence you get mostly sequels, online cash cows, linear on-rails shooters/cinematic experiences that are easy to sell. Content is expensive to produce so studios are afraid of "wasting" it. Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it.
1997-2004 I think was the golden age of gaming where technology was good enough that cool things could be made, yet bad enough that small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
System Shock 2, Deus Ex, Metal Gear Solid, Silent Hill 1-3, Resident Evil 1-3, No One Lives Forever, Age of Empires, Warzone 2100, Mafia 1, Half-Life, Syphon Filter, Parasite Eve - in no particular order; I am sure I am forgetting many many more great titles from that era.
The last AAA game I bought on release day was SimCity (5). That was a huge disappointment for me and I wouldn't do that again. I should have known better.
Only the triple-A developers.
> [...] small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
I would venture that Ninja Theory with "Hellblade: Senua's Sacrifice" fulfills this - they were a small studio with a game you can't really call dumbed down, and it was a profitable game with a relatively small budget (<$10M).
 Now they're part of Microsoft.
> Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it
Ever play the PS2 port?
They had to rework the maps to cope with the PS2's memory constraints. Everything's familiar, but a bit different. Worth a shot if you want another hit.
I suspect that many people in this discussion reference big names because they played in the past, but don't play anymore, therefore talk about what they actually see advertised.
> Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks [...] 1997-2004 [...] good enough that cool things could be made yet bad enough that small-ish creative studios could compete
Big studios are only a part of the gaming production landscape.
There is a lot going on in the small/indie studio segment; I've randomly picked up the first link for "Best 2017 pc games", and roughly half of it was not AAA.
Nowadays, with the availability of game frameworks, the entry barriers to game development are practically non-existent, to the point that at least one critically acclaimed game was made with... Game Maker (in fact, lauded for the narrative).
90s FPS was my most treasured era of gaming… watching the tech evolve inspired me to get into computing and become a dev.
More to the original point, the first GTA remains along with Age of Empires one of my all-time favourite games. It's easy to forget in light of the franchise it became what a novel and crazy little game that was.
In an era when most mainstream movies are way dumbed down, I doubt games got much more "artistic" and "narrative".
(Especially if one considers the text and graphic adventure games in the 80s/90s).
At best they got some ersatz narrative qualities, but nothing to write home about.
Deus Ex -- You play as a cybernetic anti-terrorist operative in a prosaic and cynical vision of the future fueled entirely by the conspiracies of the mid-90s BBS scene. The world is complex, coherent, and fleshed out. You have the illusion that your choices matter.
Deus Ex: Human Revolution -- You play as a super bad-ass private security operative in a neo-Renaissance world fueled by the hyperbole surrounding transhumanism. The developers deliberately make it less complex compared to the original for the sake of streamlining, use banal pop culture for their "inspirations," and intentionally design the game so that player choices are irrelevant. It can be boiled down to "We want to make a western Metal Gear Solid."
I'll just grab a quote from a random Fallout 3 retrospective here -- "[...] Fallout 1 and 2 were defined by complex storylines, detailed characters and far-reaching consequences to the player's actions. And that these elements are less prominent in FO3, while faster action and stunning visuals have been brought to the forefront." 
Or a Thief (2014) review: "The three major strengths of past Thief titles - wide open mission design, sound propagation and narrative - are this game’s biggest weaknesses. That is a fundamental problem it cannot hope to overcome." 
Video games "matured" in the sense that they became more like movies interspersed with interactive segments, but the notion that that made them more artistic is pretty unfounded.
The decision that the player should be able to win via multiple paths, be it sneaking, gunplay, or something else entirely, really made it a beast to work on.
Damn it, there is a whole sub-section of one of the maps where you can encounter mobs you normally only encounter towards the later end of the game. And you do this by following up on a missing person and finding a way into the sewers.
That said, i keep coming back to a quote found over at Filfre, where one of the people that worked on the early Lucasarts (still Lucasfilms Games back then) games muses over how game developers have a bad habit of getting distracted by new toys.
Meaning that whenever some new hardware came along that allowed more of something - more colors, more sprites, more anything - they would invariably churn out a mass of shallow action games or similar to show off how many sprites or colors they could make the hardware push. Effectively the industry was rolled back by several years of development practice (and I dare say something similar happened with mobile tech when the iPhone was released).
And it may well be that as AMD and Nvidia keep pushing out new GPUs, we are stuck in a rut of continually colorful and well-rendered but bland games.
Never mind that with the gamepad being the more likely input device, many interfaces are hampered (Deus Ex inventory tetris anyone?).
Many people want a movie-like experience from their game, but that isn't what I played games for. It was more about emergent situations, rather than scripted. That was also what made games replayable - it's very rare for me to want to replay a modern game these days.
Also, I agree about emergent situations. As a gamer, I'd call myself an "explorer": I'll check every nook and cranny. Often, with modern games, I end up breaking things because I go places I shouldn't yet, instead of following what the developer expected of me.
As for replayability, I agree that it has mostly been lost, but I don't feel it's a bad thing. I have a backlog of games on various systems to last me several more years, and I continue to buy games at a pace that means it'll sustain for some time to come.
Coupled with increasingly less time to game as I get older, I struggle to play some games at all.
Fallout 3 and Oblivion were two of my favs; New Vegas and Skyrim have sat in their plastic wrap since day one, as I haven't found the time to commit to them. Fallout 4 got some 60 hours of my time, compared to Fallout 3, in which I spent over 400 hours.
I don't play COD, or similar games, their "experience" is too shallow and linear, and it's easy to just keep them out of my backlog.
I want a game with a solid deadline that I can shelve when I'm done and move on in the backlog. The really greats will get a special "shelf" where they'll come back out, or are given to friends with similar taste.
I think the parent poster means this: because each "scene" in an AAA game costs a lot to produce, they want every player to experience it. If it were entirely optional, then some players would miss it, and then how could you explain to them why the game was so expensive? So they must see the scene; in order to ensure this, the game becomes more linear, with fewer optional missions/situations/paths.
Think of it as a big budget movie: they filmed the action-oriented, CGI-laden scene, so they want you to watch it.
None of these games are scripted in the same way CoD is. RPGs certainly have a lot of scripting, but they're at the other end of the spectrum - broad scripting rather than deep scripting. CoD has scripted experiences, where almost every detail of a scene is pre-planned, so that if you have two players in two different rooms, and they meet up later to talk about the game, they'll have had similar experiences and the same sequence of events.
It's the deep scripting, for complex cinematic scenes, that the game directors are afraid of players missing. These are what the players are buying. If the players miss out, they get a substandard play experience.
The late 90s and 00s to me was all dominant single games (e.g. Starcraft, Quake, Halflife). A lot of it was about having the latest graphics, with big studios winning. And spinoff games were done with an 'engine', which often made them look/feel/respond similar to the original. You also saw a lot of sequels and series taking over from new game concepts because of this. And most of them were in a few genres (FPS and RTS especially).
I agree it's a good time now though; we are so flooded with different types of games that you have to do something interesting to get noticed.
The way games are used as a narrative medium is by shoving movies inside them and forcing you to watch them instead of actually playing the game. It's like going to the movie theater, being handed a book, and being told to read the book while we pause the movie between two scenes.
Obviously some games are exceptions, but they are few and far between.
> The way games are used as a narrative medium is by shoving movies inside them
If that's how a game is presenting its story, then yes; they likely would be better served by simply making a movie. Games are an interactive medium, but that just means that storytelling in games will be different and use different tools than other mediums, while allowing for entirely new kinds of engagement with a story being told.
The memorable video game stories in my experience have been those that were engaged, collaborative experiences that I felt physically involved in as a player. The wholly emergent stories from Dwarf Fortress, the dynamically simulated open world with a strong interactive narrative in Star Control II, or the incredible physical connection to the tightly presented stories of Brothers: A Tale of Two Sons or What Remains of Edith Finch; these are all different ends of a spectrum of interactive narrative design.
The kind of games you mention are certainly terrible examples of storytelling, but to dismiss the medium as being a "terrible way to tell a story" is to miss out on some of the most interesting interactive stories being told.
Brothers: A Tale of Two Sons is pretty much the only game I can think of that does this successfully, and I don't think it's something that can be done with any story. Books and movies, on the other hand, can tell pretty much any story. They are universal, while video games are limited. I believe that to be inherent to the mediums, and not just because we are still learning how to tell stories in video games.
Emergent stories are not storytelling and have no bearing on video games' effectiveness as a narrative medium. They are stories, yes, but they are not being told. They are being created in real time by the people playing the game, which is quite fun but not storytelling.
Interesting that I have exactly opposite view. Gameplay elements if done well create a tight feedback loop between the game and the player, an illusion that the player is part of the story, not just an observer.
There are reverse design documents for some games explaining how quests and gameplay are weaved together to achieve a desired effect on the player: http://thegamedesignforum.com/features/reverse_design_CT_1.h...
The problem with "games as movies" is that most games tell a story that would feel amateurish or childish if told as a movie. But also, as one piece in The Atlantic controversially argued, games are a different medium than movies; trying to "tell a story" with a game, in the traditional sense, is failing to take advantage of the medium.
 Note I don't entirely agree with it, but it raises some valid points: https://www.theatlantic.com/technology/archive/2017/04/video...
To be fair, "director" has been a title in video game credits for 30+ years. Shigeru Miyamoto (or rather "S. Miyahon") is listed as a director on the credit roll of The Legend of Zelda, for example. I don't know how common it is relative to other titles like "designer" or "planner", but in any case Cage isn't breaking any more ground here than he is in his storytelling.
Did they really?
I agree things are more 'cinematic', but text based adventures were arguably just as 'narrative' if not more so, and these are some of the earliest games in existence.
Then there were the 'graphical adventure' games like King's Quest, etc., which were hugely popular, then things like 'Tomb Raider', and so on.
Taking these as a lineage, one can say it's really just the presentation that has matured, which one could then say makes the argument a 'technological view'...
I would say that kind of thing, finding ways of using interactivity to tell stories in ways that no other medium can, is an example of the field "maturing". (Beyond just "pass this test of skill or strategy to see the next part of the story, presented in an otherwise conventional manner").
More seriously, the rate of technical change in games development was huge during the 90s. We went from 3D being barely possible without textures to Half-Life 1's cinematic-style monorail introduction (which was visually better than most of the "full motion video" we'd seen in games up to that point). Graphics accelerators became a consumer product.
The development of the tech also made a wide open space for innovation in art direction - each new game looked noticeably different as well, in a way that's been attenuated among all the brown gritty shooters of today.
Doom also popularised multiplayer gaming (not really a thing on PCs at that point, limited to splitscreen on consoles). And it was the first game I encountered that really embraced modding - the Doom editors allowed the re-use and re-mix of the game assets into your own levels and all sorts of strange doom-flavoured experiences.
I also think it began a golden age.
I spin up OG Doom and have a rocking good time. Same goes for N64 games. They're still good.
The same cannot be said for games from a decade ago. No one has a passion for COD4.
Hence, the Golden Era title.
> Games from that era are still fun and enjoyable to pick up and play.
Is pure nostalgia.
Certainly strange to watch the definition shift over time.
In any case, the main thing I want to say is that it doesn't really matter whether they are 'in the trenches.' You are in different realms of existence if you own the company vs. work for it. As a founder, it's your personal creation; as an employee, you may care about the product, but the primary thing is your paycheck (most startup employees I've spoken with are much more mercenary with this than myself, too). You know that if it succeeds, the founder will become rich and famous and enter a social stratum that the employees will still only be able to fantasize about. No one will know their name or grant them ridiculous amounts of respect etc.
Even if none of that comes to fruition, the fact that your potential courses (as they relate to the company) are so divided creates immediate present-term social distance. And you lose the sense of shared struggle—or it's at least on a much lower level.
If you're all equal owners, the feeling of going to work on your shared thing must be very different, I imagine. I think it's probably also necessary to not take investment (or somehow do it in a very low pressure way), in order to have fun like the Id guys.
Thinking about how much Carmack has been up to since that book was written is super cool.
I grew up in this era, so I'm certainly enamored by the nostalgia trip, but I have a lot of respect for the team at id. It's fascinating to read about all of the mundane aspects of ray casting and compressing color palettes enough that a game like Doom or Quake could actually work. id probably did more than any other company at the time to make programming seem like an alternative subculture instead of a science, and to make games these dark and moody experiences that were a blast to play (especially with friends via LAN).
i = 0x5f3759df - ( i >> 1 ); // what the fuck?
Gameplay video: https://www.youtube.com/watch?v=V2Ddl4CM4ao
For those looking for a more period correct experience there's Chocolate Doom: https://www.chocolate-doom.org/wiki/index.php/Chocolate_Doom
Though I have to admit I generally just fire up my 486 to play. I've already had one complete playthrough of Ultimate Doom and Doom 2 this year and I'm sure it won't be the last. :)
Those 3D enemies that show up in the video, come walking towards the player as if they had a strong stomach ache!
Also explosions look too "fluffy"... Doom is not a game that needs to be reworked, IMHO.
150 people playing right now, tons of crazy mods (Survival mods, RPG-like leveling, Megaman, etc).
Remember the No Clipping cheat code?
Here’s the source code check to turn it on:
The Smashing Pumpkins paid tribute by using a Doom sample on their Mellon Collie and the Infinite Sadness album.
// Smashing Pumpkins Into Samml Piles Of Putried Debris.
I kept wondering what "samml" meant... in German "sammeln" means to gather and "semmel" in Austrian slang is a local kind of roll or bun.
Modding quake was one of my first great bursts of curiosity and creativity. I really remember those days fondly.
I think you mean Team Fortress? Although Half-Life does use a modified Quake engine, it was a commercial game that used none of the assets of Quake, so I don't think calling it a "mod" is really appropriate.
(Side note: my username is a Half-Life reference)
I wonder why id didn't continue to compete with Unreal in the middleware engine market. Why did Valve decide to leave what was at the time possibly the best game engine? How did Unreal go from some mediocre engine to an insanely great one that is still improving rapidly?
Doom (or John Carmack) used to be the sole cheerleader for OpenGL. I still loved 3dfx, Voodoo and Glide though. The API was small and fast, in an era when Direct3D was... really not very good. Interesting times: Intel is now coming back to the GPU market again after the i740, which was the first graphics card to use the AGP slot. Maybe instead of GPGPU we should go back to graphics with an AGP: an Advanced Graphics Processor.
IIRC from one of his later talks, Carmack wasn't interested in tech licensing and it was mostly the other owners pushing the idea. I remember even before Doom 3 was released and the Quake 3 engine was in its apex in terms of licensing (basically the most licensed of their engines by far), their licensing page wrote that all the support you'd get was a day with Carmack to explain some bits.
> How Unreal went from some mediocre engine
Unreal was never a mediocre engine; even from its first release it had amazing tools and a very flexible architecture. Even if the rendering tech was sometimes behind id's (I always noticed that id would come up with a neat new idea and the next UE would polish it up - the only time this didn't happen was with Rage and UE4), their toolset and architecture is what made other devs go after them, and their stance on support (you'd get constant updates and documentation, and I think at some point they set up an internal "community site" for people licensing their engine) was the icing on the cake.
The id Tech 3 wikipedia page "id Tech 3, popularly known as the Quake III Arena engine"
Someone tried for a pull request in Oct 2017 with 3,275 commits, still pending.
I can't find any evidence that anything new has happened.
When I did my first mods in 1995 there was no source code under the GPL, just reverse-engineered format descriptions, but I was able to write map writers in Lisp to generate maps in AutoCAD. This was fun.
But it's been fun doing ports to OS/2, DJGPP v1, Watcom C, and of course various other Unixes via the X11 code. The GPL'd code was 'cleaned up' and kind of bugged up on the way; sadly what was released really wasn't all that pure.
For anyone wanting something more 'pure' for MS-DOS I'd highly suggest: Mara'akate's DooM-New
And then there is Quake, which I compiled with the excellent MS-DOS based TCP/IP stack WATTCP so that I could bring native TCP/IP networking to Quake.
And on the heels of that, I did a QuakeWorld client for MS-DOS
which kind of took on a life of its own here:
And of course with the source to Quake 2 available, it only seemed proper to port it to MS-DOS.
So yes, there is GREAT fun to be had in the iD source.
Also don't forget some of their earlier games:
I've certainly found it really useful for reproducing machine-learning papers I've been reading.
This is because it used its own language, QuakeC, and a simple compiler. Beyond that there were few inherent limits to the engine, so people set off making wild weapon packs (some having upwards of 20 guns that could have multiple modes), RPGs (abusing the hell out of how the game transferred character state between maps), even simple flight sims and driving games.
Come Quake 2 you needed VC++ to do mods (leading to way fewer single-developer experimental mods), and come Quake 3 there was a hard cap on the number of weapons a mod could contain.
By the way, I love this from the readme:
> If you have obtained this source code several weeks after the time of release, it is likely that you can find modified and improved versions of the engine in various open source projects across the internet.
When I was a kid in 1996 I toyed with Quake C but really had no idea what I was doing. I'd love to toy with mods again but not really sure if there is a good way to get started. Most of the resources I knew of 22 years ago are gone or significantly outdated.
This is all you need for playing maps and mods. For making maps (the most common type of mod) you'll also need a map editor - currently the most common seems to be TrenchBroom, but it is far from the only option and a lot of people use other editors, such as BSP, QuArK, GtkRadiant; some people even use either the original Worldcraft 1.3 (i think) or a modded Hammer (from Valve) to make maps. I think some also use J.A.C.K., which is basically a Hammer clone, although i think it is abandoned now.
The map editor isn't enough; you also need the command line tools to "compile" the map (this is how the Quake engine can support multiple and different editors: by decoupling the tools from the engine and the individual types of tools from each other - which to me is one of the best ways to architect an engine and its tools, but sadly most popular engines have forgotten it). There are several variations, depending on the features you want. But i think these days most people are using ericw's tools since they support some advanced features from modified engines (like the BSP2 format for larger maps and colored lightmaps).
Beyond mapping, if you want to delve into QuakeC (the game's scripting language) you need the original QuakeC source code and a QuakeC compiler - while you can probably use the original QCC, i've seen FTEQCC from the FTE engine mentioned way more often. Some tutorials can be found here (the Inside3D site which was a sort of 'nexus' for QuakeC modders was shut down some time ago, and while the community created InsideQC it looks like not everything survived the transition).
Of the 3 quakes, which is easiest to dive into mod development?
Also i forgot to link to the Quake Wiki: https://quakewiki.org/
For instance, the idclev cheat handling in st_stuff.c has to check if the game is shareware and restrict level changes beyond a certain range.
I haven't done a deep dive on the code, but it looks like gamemode is set based on something in the WAD file.
Shouldn't Doom 3, Quake 4 and maybe Rage 1 be up here?
If you want native Doom, check out the source ports like Chocolate DooM. Doing ports to stuff like SDL is somewhat straightforward, or you can do a Cocoa one yourself, but I'd start with something known to work first.
Anyway, great work scruffy!
The game was bought by fans, the source was released and Keen Dreams was published on steam.
The ownership of the older stuff is quite fragmented.
It would be fun to see Keen source code if it’s out there!