In case anyone doubts how revolutionary some of these games were, check out these fan letters from the Wolfenstein source:
As a former POW (Vietnam), I hesitated to play WOLF for over a
month after downloading as I feared flashbacks. I didn't want to
remember all that I had been through all those years ago, when, as
POW's, my friend and I decided an escape attempt would be better than
a slow death by torture and starvation.
My friend and I made crude maps and hoarded food. The day of the
escape we clubbed the guard with stones, took his gun and fought our
way through two levels of underground tunnels (only a few guards and
had to crawl). I made it, my friend didn't.
Dreams...NO! NIGHTMARES...YES!! However, the more I play
WOLF the less frequently I have nightmares. The chilling part is
turning a corner and seeing a guard with his gun drawn.
WOLF is a powerful game. Fearful as well. I believe that a
person should face the past. So... when I can play EPISODE 1
comfortably (no nightmares), I plan on ordering the full series.
There's also a letter from a Microsoft manager requesting a multiplayer version.
As you say, that is an extremely moving letter. I would never have associated Doom with therapy - thanks for sharing. (I am a big fan of John Carmack and the id team.)
We owe a lot to John Carmack in many ways - but I hope one of his enduring legacies will be the number of programmers he inspired, both with the games he helped make and with the code he made available. It helped many of us get started somewhere.
I never ended up going into games myself, but in no small part I owe my career to him.
As well as the Doom book, his plan files are a fascinating time capsule to go back over, particularly if you 'came of age' during that golden age of PC games development.
At a time when pretty much every regular PC and internet connection sucked, here's what is surely a historic moment in online gaming: the birth of QuakeWorld, the internet-optimized Quake engine:
https://raw.githubusercontent.com/ESWAT/john-carmack-plan-ar...
Note JC's comment that "this is just my pet research project". If you've ever played Quake and then QuakeWorld online, you'll know what an understatement that was and how freaking awesome the online gameplay became with QW.
So really, thanks John Carmack for the countless wasted hours :)
> If you've ever played quake and then quakeworld online, you'll know what an understatement that was and how freaking awesome the online gameplay became with qw.
Amen. Especially when you consider that a massive number of players, myself included, were on dial-up (56k) modems. QuakeWorld took multiplayer from a slideshow to something comparable to single-player performance, albeit with hiccups.
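QuakeWorld's headline trick was client-side prediction: the client applies your inputs immediately, remembers them, and when a delayed authoritative server update arrives, it rewinds to the server's state and replays the unacknowledged inputs on top. A minimal sketch of the idea (hypothetical code, nothing like id's actual implementation; `apply_input` stands in for the real movement simulation):

```python
def apply_input(pos, move):
    """Advance the position by one input. Stand-in for real movement code."""
    return pos + move

class PredictingClient:
    def __init__(self):
        self.pos = 0          # predicted position shown on screen
        self.pending = []     # (sequence, move) pairs not yet acked by the server

    def local_input(self, seq, move):
        # Apply immediately -- no waiting for a server round trip.
        self.pending.append((seq, move))
        self.pos = apply_input(self.pos, move)

    def server_update(self, acked_seq, server_pos):
        # Server state is authoritative but old: drop acked inputs,
        # then replay the rest on top of it.
        self.pending = [(s, m) for s, m in self.pending if s > acked_seq]
        self.pos = server_pos
        for _, move in self.pending:
            self.pos = apply_input(self.pos, move)

c = PredictingClient()
c.local_input(1, 2)
c.local_input(2, 3)    # predicted position updates instantly, no network wait
c.server_update(1, 2)  # server confirms input 1; input 2 is replayed on top
print(c.pos)           # 5
```

On a 56k modem with 200ms+ round trips, this is the difference between moving instantly and moving a fifth of a second after you press the key.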
As I just posted above, this strikes home for me. For me it wasn't just the client-side prediction; it was that there was also a Linux client for QuakeWorld.
Yeah I remember that. It was also at about that time that I started using and learning linux.
Back then, just managing to set up XFree86 correctly was a huge personal victory, and running Quake there was really the cherry on top.
I seem to recall having to input a bunch of numbers/frequencies taken from my crappy CRT monitor's spec sheet into the X config and basically brute-forcing it until it worked. "Good times"!
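For anyone who missed that era, the config looked roughly like this (an illustrative sketch, not any real monitor's file; the sync ranges came off the CRT's spec sheet, and getting the Modeline timings wrong meant a black screen or a worrying whine from the monitor):

```
Section "Monitor"
    Identifier  "CRT"
    HorizSync   30-70      # kHz, from the monitor's spec sheet
    VertRefresh 50-120     # Hz, likewise
    # Modeline "name" pixelclock  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
    Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
EndSection
```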
> ... but I hope one of his enduring legacies will be the amount of programmers he inspired with both the games he helped make, and the code he made available. It helped many of us to get started somewhere.
Never really realized it until now, but this is true for me.
The original Quake, in its heyday, is what sparked my interest in Linux. Linux then sparked my interest in, well... more Linux - and gave me a new direction for my eventual shift into engineering. It really shaped my entire career.
I had heard somehow, somewhere, that you could eke out slightly better frame rates and ping times in Quake if you used this alternative OS called Linux (which really wasn't true, in retrospect). And so began my quest, which triggered a cascade of deep dives into Linux and its community for me. Everything from installation issues and troubleshooting to compiling kernel modules (and everything in between). I even ended up corresponding with the driver developer for one of those modules (not yet in the kernel). A bug had surfaced - he actually fixed it and sent me a patch. So then I learned how to patch a driver too. One of the best "customer" experiences I've ever had, really :)
Ironically, I might have ended up in game development had there never been a Linux port of Quake :P
I certainly learned a lot by writing Quake 3 Arena mods and Unreal Tournament mods. Even if I wasn't making huge changes, just discovering that I could dive into an existing codebase, figure something out, and make something happen gave me a lot of confidence for things in the future.
It was a golden age for the first person immersive simulation game. Wolfenstein 3D, Doom, Quake, Thief, System Shock II, Half-Life, Deus Ex - there was a period of classics with large improvements and innovation from one game to the next, where game design space was explored just as the technology to render it became available.
First person immersive sims are generally best played on the PC. When consoles capable of decent graphics for a reasonable period came into their own (i.e. as the pace of hardware evolution slowed), third-person control worked better with joypads, and for commercial reasons PC games often ended up as second-rate ports of console games, with compromised controls. You really need a mouse for first person.
And many of those early games (Wolf3D, Doom, Hexen) didn't have mouse look on by default; you played them Closed (Arrow Keys and Right Ctrl/Alt) or Open (Arrow Keys and Left Ctrl/Alt).
It really wasn't until Quake/QuakeWorld that FPSes made the shift to mouse+WASD. And even then, in Quake 1, mouse look was off by default.
Improvements in aim assist really make console FPS games more approachable. Still not up to the usability of a mouse, but if the game is thoughtfully designed it meets a "good enough" bar.
I wasn't a big fan of Rare's cursor-aim functionality, though; it felt very clumsy.
Personally I'm a fan of sticky aim assist (Halo) or snap-to-body aim assist (Call of Duty, GTA). These systems are subtle enough that players may not even realize they are being helped.
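A sketch of how "sticky" aim might work (a hypothetical illustration, not any shipping game's code): scale stick sensitivity down as the crosshair nears a target, so fine corrections get easier without the game ever visibly moving your aim for you.

```python
import math

def sticky_sensitivity(crosshair, target, base_sens=1.0,
                       sticky_radius=50.0, min_scale=0.4):
    """Effective sensitivity given crosshair/target screen positions (pixels).

    Outside sticky_radius the stick behaves normally; inside it, sensitivity
    blends linearly down to min_scale as the crosshair reaches the target.
    """
    dist = math.hypot(target[0] - crosshair[0], target[1] - crosshair[1])
    if dist >= sticky_radius:
        return base_sens
    t = dist / sticky_radius  # 0.0 on target, 1.0 at the edge of the radius
    return base_sens * (min_scale + (1.0 - min_scale) * t)

print(sticky_sensitivity((0, 0), (100, 0)))  # far from target -> 1.0
print(sticky_sensitivity((0, 0), (0, 0)))    # dead on target -> 0.4
```

The player just feels like their aim "settles" on enemies; a snap-to-body system would instead jump the crosshair to the target when aiming down sights.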
If the question is which is the more effective control scheme for an FPS, mouse+keyboard is clearly superior to dual analog sticks, especially in competitive matches.
But my point is simply that GoldenEye 64 was a console FPS where the controls weren't clunky (for its time).
I didn't even realize mouse + keyboard was a thing until I downloaded the Killcreek vs. Romero demo. I asked my friend, "How are they looking around so fast?" Until then I had been using Page Up to look up.
I know what FPS stands for, but Thief most definitely isn't a shooter, and System Shock II is less of a shooter than a creepy exploration game. Deus Ex is half an RPG.
FPS is too reductive. I'm using the phrase immersive sim in the same way as these articles:
No, it doesn't really help explain how Doom, Wolfenstein, or Quake, etc., are lumped in together with games like Thief as simulations. FPS is certainly not reductive. A game can be more than one genre, but sometimes an FPS is just an FPS, and more than half the games you listed are pretty much just an FPS.
They weren't even referred to as FPS games at that period of time, they were DOOM clones. The FPS moniker came along later after it settled into a genre.
Immersive sim is a genre. Games with strong narratives that react to the player often in significant ways. Some of the games grandparent listed aren't immersive sims.
As the comments below observe, I think the types and number of experiences built on the back of new (to mainstream audiences anyway) technology were transformative.
In 92 the Amiga 1200 came out, a wonderful computer with great hardware but unfortunately it never made it big. Four years later in 1996 mass market PCs were coming out with Pentium processors, 3dfx accelerators, high resolution bitmap displays, internet connections and powerful general purpose sound adapters.
Today we have orders of magnitude more computing power, but it's essentially powering the same experiences, albeit far more polished.
If we look back four years, to 2014, not much has really changed. The big releases are largely the same games (in the 90s they'd have been called mission packs).
The magic of experiencing something completely brand new, completely alien and magical, has gone away. Though, I am hopeful VR/AR is going to deliver that again. I don't know how far off it is, but it would be fitting if it's Carmack and Abrash behind it. Again.
I sometimes wonder what could be built with today's tech if it was focused like the Amigas back in the day: a single hardware spec and a dedicated OS just for that platform, without regard for backwards compatibility in the OS - like BeOS at the time.
What would a revolutionary Amiga 1000 look like in 2018? With modern hardware and latest OS/system ideas.
Gaming consoles could be considered examples of this, but they are kind of single purpose.
Therefore, if I was attempting something like this today, I would:
- make the graphics preeminent in the system. In other words, keep the graphics card and throw away the legacy CPU. Let it drive PCI devices.
- on its own, doing that makes it much harder to program, so put considerable work into making the instruction set and programmer's model open and well documented
- find a DAW engineer and let them build the audio subsystem with an obsessive focus on low latency. Let's aim for no more than 10 samples latency between input and output processing and see where that gets us.
- full multitasking in which nothing is ever allowed to block anything else unrelated, through resource reservation (qv the Nemesis research operating system). Having an Electron app on the system should not impair anything else, and the system's default editor should also be focused on low-latency.
- apps are by default fully security-partitioned from each other. The operating system would maintain a CRDT-based, record-oriented personal data storage system, incorporating lessons from PalmOS. This gives both native sync and automatic persistence across power-off.
- low latency non-USB keyboard and mouse. PS/2 would actually work but we could go for something really surprising like gigabit Ethernet or optical TOSLINK.
(Low latency is a good example of a feature which is extremely hard to retrofit and you end up redesigning the system around it).
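The CRDT idea in the storage bullet can be illustrated with the simplest CRDT of all, a last-writer-wins register (a hypothetical sketch; a real storage system would use richer types): each replica keeps a (timestamp, value) pair, and merging keeps the newer write. Because merges are commutative and idempotent, replicas that sync in any order converge - which is what makes "native sync" possible without a central coordinator.

```python
import time

class LWWRegister:
    """Last-writer-wins register: the simplest convergent replicated type."""

    def __init__(self, value=None, ts=0.0):
        self.value, self.ts = value, ts

    def set(self, value, ts=None):
        self.value, self.ts = value, (ts if ts is not None else time.time())

    def merge(self, other):
        # Keep whichever write is newer; ties broken deterministically so
        # every replica makes the same choice regardless of merge order.
        if (other.ts, repr(other.value)) > (self.ts, repr(self.value)):
            self.value, self.ts = other.value, other.ts

a, b = LWWRegister(), LWWRegister()
a.set("draft", ts=1.0)   # write on replica A
b.set("final", ts=2.0)   # later write on replica B, while offline
a.merge(b)               # sync in either direction...
b.merge(a)
print(a.value, b.value)  # both replicas converge to "final"
```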
> keep the graphics card and throw away the legacy CPU
If you throw away the CPU, how do you run general purpose programs (of which games are one type)? If the graphics card can run them, why are you not calling it a CPU?
The Amiga had a CPU :) Never owned one, but I understand a key feature was that graphics were very easy to program (mapped to main memory, or so a friend who owned one told me).
Given that, outside of the Switch, the consoles are AMD-made x86 SoCs, even consoles have bowed to the PC.
In hindsight I feel the Amiga's custom chips were actually an Achilles heel. Yes, they allowed each model to sprint out of the gate, but they also limited how far the user could upgrade.
In contrast, the only things on the motherboard of a PC of the era (once we hit the 386 and later) were the CPU and RAM. Everything else lived on replaceable boards hooked to a bus. And as the Pentium era showed, even the bus could be "replaced" by placing a new one (PCI) side by side with the old one (ISA).
All this allowed a continuation of sorts, where a humble DOS box could live on into the Win9x era in some way or other.
> ...a single hardware spec and a dedicated OS just for that platform. Without regard to backwards compability in OS, like BeOS at the time.
I would argue the closest mainstream machines currently available that meet this requirement are Macs and iOS devices. Sure, there's a handful-ish amount of different configurations, but they do not vary greatly. Backwards compatibility certainly isn't much of a concern in macOS.
I always dislike it when the Amiga 1200 is mentioned as a forerunner to modern computing while the wonderful, albeit buggy, Atari ST line, launched in 1985, is totally forgotten.
With its Motorola 68000 (32-bit internals, 16-bit external bus) and color bit-mapped graphics, my ST totally blew away all other PCs of the time from a technical POV, and it just annoys me that the Amiga gets all this love and the ST gets none whatsoever.
The ST was not as good as the Amiga; that's why. The only advantage it had over the Amiga was slightly higher CPU speed, because the Amiga's clock was reduced to keep in sync with its custom hardware - hardware that was exactly what it needed to trounce the ST. You only have to look at the range and quality of games for each to see what I mean. ST ports always had that muddy look to them.
I still own and love both :) I mentioned the A1200 because it was basically the apex of that line of computers (and indeed that entire approach to computing).
There are some great videos on YouTube (particularly the Computerphile ones) on the Tramiel-era Atari computers. They are great little machines to learn computer organization on - powerful, yet still simple enough that you could fit them in your head.
The Amiga had a hardware blitter - this made the difference in games. If you play the same game on both systems, the Atari port seems like a slideshow in comparison.
I am a musician as well, and I do remember its MIDI support was top-of-the-line. I never really got the chance to use it that way, but thinking about it almost makes me want to pick one up and give it a go...
Not GP, but I agree. The world of PC gaming was moving so fast in the 90s - sound cards, CD-ROM, "multimedia", the internet, 3D accelerators etc. (I appreciate that most of these technologies were invented earlier, but the 90s was when they really came of age and hit mass adoption).
I grew up in that era and agree that it was a golden age, but of course I'm aware that nostalgia plays a big part in that. If you look at, say, 2000-2010 or even 2010 to now it seems things have slowed down - which isn't necessarily a bad thing.
I think a strictly technological view of videogames is very restrictive.
For a variety of reasons, in the last decade (or the last couple), videogames matured as a narrative medium, which is a radical difference from the 90s. One could argue that, from a holistic - including artistic - perspective, the 2010s are the golden era.
I don't think there is a single golden age, though. There are still radical improvements to come in the next decade(s), and it will take a long time before we can judge what characterized video games/development in each era.
Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks. Hence you get mostly sequels, online cash cows, linear on-rails shooters/cinematic experiences that are easy to sell. Content is expensive to produce so studios are afraid of "wasting" it. Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it.
1997-2004 I think was the golden age of gaming where technology was good enough that cool things could be made, yet bad enough that small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
System Shock 2, Deus Ex, Metal Gear Solid, Silent Hill 1-3, Resident Evil 1-3, No One Lives Forever, Age of Empires, Warzone 2100, Mafia 1, Half-Life, Syphon Filter, Parasite Eve - in no particular order; I am sure I am forgetting many many more great titles from that era.
I agree; luckily we have games from indie developers to fill that gap. The tradeoff is that you sacrifice good graphics and accept a perpetual alpha/beta development cycle in exchange for gameplay. There have been so many great indie games: Minecraft, Rimworld, Stardew Valley, Terraria, Prison Architect, Rust, Undertale, etc. In the good ol' PC gaming days I used to buy 10-15 AAA titles per year on release day. Now I only buy a couple, and only in a heavily discounted Steam sale.
The last AAA game I bought on release day was SimCity (5). That was a huge disappointment for me and I wouldn't do that again. I should have known better.
> [...] small-ish creative studios could compete. Games were popular enough to make it profitable for studios to make them yet not too mainstream to dumb them down for the lowest common denominator.
I would venture that Ninja Theory with "Hellblade: Senua's Sacrifice" fulfills this - they were[1] a small studio with a game you can't really call dumbed down, and it was profitable on a relatively small budget (<$10M).
All good points. You're certainly right about the costs involved in 'triple-A' development. Indie games are out there, though. Some of them even look great, just not photorealistic (such as Limbo).
> Yet I was still discovering hidden things in the original Deus Ex 10 years after I first played it
Ever play the PS2 port?
They had to rework the maps to cope with the PS2's memory constraints. Everything's familiar, but a bit different. Worth a shot if you want another hit.
How many games have you played in the last, say, 5 years?
I suspect that many people in this discussion reference big names because they played in the past, but don't play anymore, therefore talk about what they actually see advertised.
> Largely because of improvements in graphics, game budgets have gone way up and studios are afraid to take risks [...] 1997-2004 [...] good enough that cool things could be made yet bad enough that small-ish creative studios could compete
Big studios are only a part of the gaming production landscape.
There is a lot going on in the small/indie studio segment; I've randomly picked up the first link for "Best 2017 pc games", and roughly half of it was not AAA.
Nowadays, with the availability of game frameworks, the entry barriers to game development are practically non-existent, to the point that at least one critically acclaimed game was made with... Game Maker (and was, in fact, lauded for its narrative).
I agree on the indie games. I enjoyed VVVVVV and Doki Doki Literature Club. Among recent AAA titles, Prey was very encouraging, GTA5 was pretty damn good, and the recent Hitman is quite good also. I just think the period I mentioned had a higher number/density of great AAA titles.
Wasn't Vice City the Grand Theft Auto of 2002/03? ;)
More to the original point, the first GTA remains, along with Age of Empires, one of my all-time favourite games. In light of the franchise it became, it's easy to forget what a novel and crazy little game it was.
What I find absolutely fascinating is that there are only 16 years between the release of GTA1 and GTA5. How could such amazing progress have occurred in such a short timeframe? It's unbelievable.
> For a variety of reasons, in the last decade (or in the last couple), videogames matured as a narrative medium, which is a radical difference from the 90s. One could argue that, from an whole - including artistic - perspective, 2010s are the golden era.
In an era when most mainstream movies are way dumbed down, I doubt games got much more "artistic" and "narrative".
(Especially if one considers the text and graphic adventure games in the 80s/90s).
At best they got some ersatz narrative qualities, but nothing to write home about.
Agreed. Just look at franchises that were around in the 1990s and compare them to their 2010s counterparts. Almost none of them look good in comparison to their antecedents.
Deus Ex -- You play as a cybernetic anti-terrorist operative in a prosaic and cynical vision of the future fueled entirely by the conspiracies of the mid-90s BBS scene. The world is complex, coherent, and fleshed out. You have the illusion that your choices matter.
Deus Ex: Human Revolution -- You play as a super bad-ass private security operative in a neo-Renaissance world fueled by the hyperbole surrounding transhumanism. The developers deliberately made it less complex than the original for the sake of streamlining, used banal pop culture for their "inspirations," and intentionally designed the game so that player choices are irrelevant. [0] It can be boiled down to "We want to make a western Metal Gear Solid."
I'll just grab a quote from a random Fallout 3 retrospective here -- "[...] Fallout 1 and 2 were defined by complex storylines, detailed characters and far-reaching consequences to the player's actions. And that these elements are less prominent in FO3, while faster action and stunning visuals have been brought to the forefront." [1]
Or a Thief (2014) review: "The three major strengths of past Thief titles - wide open mission design, sound propagation and narrative - are this game’s biggest weaknesses. That is a fundamental problem it cannot hope to overcome." [3]
Video games "matured" in the sense that they became more like movies interspersed with interactive segments, but the notion that that made them more artistic is pretty unfounded.
I think some article in PC Gamer back in the day quoted Warren Spector, regarding Deus Ex and its complexities, describing how he would regularly go into his office, sit down, and figuratively bang his head on the desk, asking why they made things so hard on themselves over and over.
The decision that the player should be able to win via multiple paths, be it sneaking, gunplay, or something else entirely, really made it a beast to work on.
Damn it, there is a whole sub-section of one of the maps where you can encounter mobs you normally only meet toward the later end of the game. And you reach it by following up on a missing person and finding a way into the sewers.
That said, I keep coming back to a quote found over at Filfre, where one of the people who worked on the early LucasArts (still Lucasfilm Games back then) titles muses over how game developers have a bad habit of getting distracted by new toys.
Meaning that whenever some new hardware came along that allowed more of something - more colors, more sprites, more anything - they would invariably churn out a mass of shallow action games or similar to show off how many sprites or colors they could make the hardware push. Effectively the industry rolled back several years of development practice (and I dare say something similar happened with mobile tech when the iPhone was released).
And it may well be that, as AMD and Nvidia keep pushing out new GPUs, we are stuck in a rut of colorful, well-rendered but bland games.
Never mind that with the gamepad being the more likely input device, many interfaces are hampered (Deus Ex inventory tetris anyone?).
Modern video games too often have far more linear gameplay, because AAA budgets mean assets get wasted if they don't get consumed. See the entire CoD series, for example - tedious to play if you're used to a different kind of game.
Many people want a movie-like experience from their game, but that isn't what I played games for. It was more about emergent situations, rather than scripted. That was also what made games replayable - it's very rare for me to want to replay a modern game these days.
Can I ask, what does "assets get wasted if they don't get consumed" mean?
Also, I agree about emergent situations. As a gamer I'd call myself an "explorer": I'll check every nook and cranny, and with modern games I often end up breaking things because I go places I shouldn't yet, instead of following what the developer expected of me.
As for replayability, I agree that it has mostly been lost, but, I don't feel it's a bad thing. I have a backlog of games on various systems to last me several more years, and I continue to buy games at a pace that means it'll sustain for some time to come.
Coupled with increasingly less time to game as I get older, I struggle to play some games at all.
Fallout 3 and Oblivion were two of my favs, New Vegas and Skyrim sit in their plastic wrap since day one, as I haven't found the time to commit to them. Fallout 4 got some 60 hours of my time compared to Fallout 3 in which I spend over 400 hours.
I don't play COD, or similar games, their "experience" is too shallow and linear, and it's easy to just keep them out of my backlog.
I want a game with a definite ending that I can shelve when I'm done, so I can move on through the backlog. The really great ones get a special "shelf" where they'll come back out, or are given to friends with similar taste.
> Can I ask, what does "assets get wasted if they don't get consumed" mean?
I think the parent poster means this: because each "scene" in an AAA game costs a lot to produce, they want every player to experience it. If it were entirely optional, some players would miss it - and then how do you explain to them why the game was so expensive? So they must see the scene; to ensure this, the game becomes more linear, with fewer optional missions/situations/paths.
Think of it as a big-budget movie: they filmed the action-oriented, CGI-laden scene, so they want you to watch it.
> Fallout 3 and Oblivion were two of my favs, New Vegas and Skyrim sit in their plastic wrap since day one, as I haven't found the time to commit to them. Fallout 4 got some 60 hours of my time compared to Fallout 3 in which I spend over 400 hours.
None of these games are scripted in the same way CoD is. RPGs certainly have a lot of scripting, but they're at the other end of the spectrum - broad scripting rather than deep scripting. CoD has scripted experiences, where almost every detail of a scene is pre-planned, so that if you have two players in two different rooms, and they meet up later to talk about the game, they'll have had similar experiences and the same sequence of events.
It's the deep scripting, for complex cinematic scenes, that the game directors are afraid of players missing. These are what the players are buying. If the players miss out, they get a substandard play experience.
Like the other reply, thanks for explaining - I see what it means now. This is one of the reasons I explore games so thoroughly: I don't want to miss anything.
For me the early 90s with the new technology came a lot of experimentation and new kinds of games. For example one of my favourite developers Bullfrog made games like Populous, Syndicate, Magic Carpet, Theme Park, Dungeon Keeper. All radically different game styles and all excellent games.
The late 90s and 00s, to me, were dominated by single big games (e.g. Starcraft, Quake, Half-Life). A lot of it was about having the latest graphics, with big studios winning. And spinoff games were built on an 'engine', which often made them look/feel/respond like the original. Because of this you also saw a lot of sequels and series taking over from new game concepts. And most of them were in a few genres (FPS and RTS especially).
I agree it's a good time now, though; we are so flooded with different types of games that you have to do something interesting to get noticed.
Really? Currently, I think video games are a terrible way to tell a story. Much worse than a book, tv show, or movie.
The way games are used as a narrative medium is by shoving movies inside them and forcing you to watch them instead of actually playing the game. It's like going to the movie theater, being handed a book, and being told to read it while the movie is paused between two scenes.
Obviously some games are exceptions, but they are few and far between.
Games are certainly a different way to tell a story, but I wouldn't say "terrible".
> The way games are used as a narrative medium is by shoving movies inside them
If that's how a game is presenting its story, then yes; they likely would be better served by simply making a movie. Games are an interactive medium, but that just means that storytelling in games will be different and use different tools than other mediums, while allowing for entirely new kinds of engagement with a story being told.
The memorable video game stories in my experience have been those that were engaged, collaborative experiences that I felt physically involved in as a player. The wholly emergent stories from Dwarf Fortress, the dynamically simulated open world with a strong interactive narrative in Star Control II, or the incredible physical connection to the tightly presented stories of Brothers: A Tale of Two Sons or What Remains of Edith Finch; these are all different ends of a spectrum of interactive narrative design.
The kind of games you mention are certainly terrible examples of storytelling, but to dismiss the medium as being a "terrible way to tell a story" is to miss out on some of the most interesting interactive stories being told.
The reason I say it is a terrible way to tell a story is not because you can't tell a good story in a video game, but because most stories don't benefit from having gameplay elements thrown into them. In fact, I think it takes away from the story because the storyteller loses control over important things like pacing and the person experiencing the story has to constantly switch between story-mode and gameplay-mode.
Brothers: A Tale of Two Sons is pretty much the only game I can think of that does this successfully, and I don't think it's something that can be done with any story. Books and movies, on the other hand, can tell pretty much any story. They are universal, while video games are limited. I believe that to be inherent to the mediums, and not just because we are still learning how to tell stories in video games.
Emergent stories are not storytelling and have no bearing on video games' effectiveness as a narrative medium. They are stories, yes, but they are not being told. They are created in real time by the people playing the game, which is quite fun, but it is not storytelling.
> The reason I say it is a terrible way to tell a story is not because you can't tell a good story in a video game, but because most stories don't benefit from having gameplay elements thrown into them.
Interesting that I have exactly opposite view. Gameplay elements if done well create a tight feedback loop between the game and the player, an illusion that the player is part of the story, not just an observer.
Like a sibling comment says, in the 2010s videogames have "gone Hollywood". My perspective is similar to yours, but points in the opposite direction: I think videogames in the 90s experimented more and were less focused on graphics, even though there were of course very technical programmers like Carmack. In the 2010s, videogames seemed to treat graphics as the most important improvement; even worse, every game designer came to think of himself as an amateur movie director. Unfortunately, most game designers would make terrible directors. I'm going to single out David Cage of Quantic Dream as someone who writes embarrassingly bad plots for his videogames (Indigo Prophecy, Heavy Rain) and actually calls himself a "director".
The problem with "games as movies" is that most games tell a story that would feel amateurish or childish if told as a movie. But also, as one piece in the Atlantic controversially argued [0], games are a different medium than movies; trying to "tell a story" with a game, in the traditional sense, is failing to take advantage of the medium.
> I'm going to single out David Cage of Quantic Dream as someone who writes embarrassingly bad plots for his videogames (Indigo Prophecy, Heavy Rain), and actually calls himself a "director".
To be fair, "director" has been a title in video game credits for 30+ years. Shigeru Miyamoto (or rather "S. Miyahon") is listed as a director on the credit roll of The Legend of Zelda, for example. I don't know how common it is relative to other titles like "designer" or "planner", but in any case Cage isn't breaking any more ground here than he is in his storytelling.
I agree things are more 'cinematic', but text-based adventures were arguably just as 'narrative' if not more so, and these are some of the earliest games in existence.
Then there were the 'graphical adventure' games like King's Quest, etc., which were hugely popular, then things like Tomb Raider, and so on.
Taking these as a lineage, one can say it's really just the presentation that has matured, which one could then say makes the argument a 'technological view'.
I would argue that Brothers: A Tale of Two Sons or What Remains of Edith Finch are both examples of games that use inherent features of physically playing a game to deliver their stories in a really unique way.
I would say that kind of thing, finding ways of using interactivity to tell stories in ways that no other medium can, is an example of the field "maturing". (Beyond just "pass this test of skill or strategy to see the next part of the story, presented in an otherwise conventional manner").
I would agree that the 90s saw huge fundamental leaps. More recently, on the innovative technology side you do have VR, SSDs, and steady performance increases across the board. But I would say the biggest leaps have come from better code and asset generation rather than brute hardware improvements. PBR was a massive change in the way assets are generated and in how real they can look and feel. Likewise, games are reaching really impressive levels of mechanical fidelity and visual depth. GTA V still looks amazing today, and that's down to the sheer amount of unique assets and texture variance in the game. Games now tend to flourish due to gameplay depth, scale and freedom over ever more impressive visuals. Which is great.
It was my teenagerhood, of course it was a golden age.
More seriously, the rate of technical change in games development was huge during the 90s. We went from 3D being barely possible without textures to Half-Life 1's cinematic-style monorail introduction (which was visually better than most of the "full motion video" we'd seen in games up to that point). Graphics accelerators became a consumer product.
The development of the tech also made a wide open space for innovation in art direction - each new game looked noticeably different as well, in a way that's been attenuated among all the brown gritty shooters of today.
Doom also popularised multiplayer gaming (not really a thing on PCs at that point, limited to splitscreen on consoles). And it was the first game I encountered that really embraced modding - the Doom editors allowed the re-use and re-mix of the game assets into your own levels and all sorts of strange doom-flavoured experiences.
Doom was the first networked PC game I played. It was one of the first of its kind. It opened up a world of connected gaming that lasted at least a decade.
It's not surprising at all: what comes after the very early stages of development in many areas of science and technology could usually be called "a golden age." (In math it was the period of roughly 100 years after Newton; in physics, the second half of the 19th century plus the first half of the 20th.) As for computers and software, we are now also well past the golden age, unfortunately...
For PC gaming, the move from pre- to post-Voodoo cards was a revolutionary leap I have yet to see again in that space. Everything since has been evolutionary. Carmack and glQuake were among the things leading the charge. Each new game, new video card, new processor at the time brought so much new power and excitement. It was the golden age.
CoD 4 is widely recognized as the last good CoD, and arguably, the best one. CoD was and is huge. Not among your peers, but among youth. We're no longer in "the gaming scene" as old farts, and
> Games from that era are still fun and enjoyable to pick up and play.
For some of the history of that time, I suggest checking out "Masters of Doom : how two guys created an empire and transformed pop culture", by David Kushner. The book is 15 years old (!!!) at this point, but I think it's still a compelling read.
I can second that—it's an excellent book. It put a clear image in my mind of the sort of environment I'd like to develop software in: primarily just a group of friends who have a shared idea of something cool they want to build together. I've been disillusioned that that will happen at a Silicon Valley type startup (not that it's impossible—but far more likely that the founders will segregate themselves from employees in such a way that everyone involved can't feel the same passion, commitment, interest, etc.—or even just the knowledge that they're going through the same struggle at the same time. Also the big stakes and investor pressure etc. are almost certainly gonna drain the fun.), but I'm still hopeful I can find or create similar circumstances elsewhere.
It's been interesting and sad to watch the definition of "startup" transform from the bootstrapped style "group of friends" company like Id Software who... well, defined the damn term - into the VC-funded "startups" of today. If anything startups today look like insanely corporate structures to me, answering to "the man" before they even get off the ground much less accomplish anything of note.
Certainly strange to watch the definition shift over time.
I have trouble believing the nyc startup scene is more egalitarian, but I’m on my 4th startup as an early employee (#1-18ish) and the founders have always been in the trenches. Not to invalidate your experience, just to emphasize the role of luck in all of this.
I've been at two startups, one in Boston and one in San Francisco. At the Boston startup, all the founders but one were very much 'in the trenches'; the one who wasn't was a professor splitting her time elsewhere, so it made sense. That company was ~20 people when I joined. At the SF startup, the founder was... (to avoid saying too much) alienating, though the whole company was only four people for much of the time, and never more than eight or so.
In any case, the main thing I want to say is that it doesn't really matter whether they are 'in the trenches.' You are in different realms of existence if you own the company vs. work for it. As a founder, it's your personal creation; as an employee, you may care about the product, but the primary thing is your paycheck (most startup employees I've spoken with are much more mercenary with this than myself, too). You know that if it succeeds, the founder will become rich and famous and enter a social stratum that the employees will still only be able to fantasize about. No one will know their name or grant them ridiculous amounts of respect etc.
Even if none of that comes to fruition, the fact that your potential courses (as they relate to the company) are so divided creates immediate present-term social distance. And you lose the sense of shared struggle—or it's at least on a much lower level.
If you're all equal owners, the feeling of going to work on your shared thing must be very different, I imagine. I think it's probably also necessary to not take investment (or somehow do it in a very low pressure way), in order to have fun like the Id guys.
I also recommend the NoClip documentary on id Software and, on the back of that, virtually any interview with John Romero -- the guy is awesome and so willing to share his thoughts, ideas, history, anything...
I hate to play the role of the grouch, but I read it after constantly hearing about it, and the book just isn't that good. The writing was poor and it tried to force narratives onto events with a heavy hand. The actual history in it is cool, but it's mostly lacking the technical details that would have made it more interesting. It's worth reading if you already find Carmack interesting, but keep your expectations low so you're not disappointed like I was. It's a fast and easy read, though, so it won't take long to try it regardless of whose opinion you listen to.
Thinking about how much Carmack has been up to since that book was written is super cool.
I grew up in this era, so I'm certainly enamored by the nostalgia trip, but I also have a lot of respect for the team at id. It's fascinating to read about all of the mundane work of ray casting and compressing color palettes enough that a game like Doom or Quake could actually run. id probably did more than any other company at the time to make programming seem like an alternative subculture instead of a science, and to make games these dark and moody experiences that were a blast to play (especially with friends via LAN).
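To make the palette point concrete, here is a minimal Python sketch of the nearest-color lookup that underlies rendering with a fixed 256-color palette like Doom's PLAYPAL. The tiny palette below is illustrative only, not the actual PLAYPAL data.

```python
def nearest_index(color, palette):
    """Return the index of the palette entry closest to `color`,
    by squared Euclidean distance in RGB space."""
    r, g, b = color
    best, best_dist = 0, float("inf")
    for i, (pr, pg, pb) in enumerate(palette):
        dist = (r - pr) ** 2 + (g - pg) ** 2 + (b - pb) ** 2
        if dist < best_dist:
            best, best_dist = i, dist
    return best

# Illustrative 5-entry palette (a real Doom palette has 256 entries).
palette = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
print(nearest_index((200, 30, 10), palette))  # → 1 (closest to pure red)
```

In the real game this lookup is precomputed into tables (e.g. for light levels and blends), since doing it per pixel at runtime would have been far too slow on a 486.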
If you'd like to try out Doom now, there is an OpenGL port called Doomsday[0] which is very complete and loyal to the original, while smoothing out/modernizing things enough to make it genuinely fun for modern players. For instance, you can look in all directions with 'mouse look' and jump and run faster while pressing shift, and there are dynamic spot lights attached to certain things (e.g. fireballs), even though the original sprites are still the primary visual. It also supports Hexen and Heretic and adds a more modern multiplayer interface. (Also allows swapping alternate game assets, so some people have done mods with 3d models.)
As someone who grew up playing Doom I have to admit that video makes me uneasy - almost like an "uncanny valley" effect. Although I can see why it might be easier on the eyes for modern gamers and if it introduces more people to the game then that can only be a good thing.
Though I have to admit I generally just fire up my 486 to play. I've already had one complete playthrough of Ultimate Doom and Doom 2 this year and I'm sure it won't be the last. :)
Actually I get that now too when I skip through quickly. For me it's because he mostly uses 3d character models during the video—although that's not the default setup. I mostly just watched the beginning earlier and didn't realize how much was using the 3d character set.
Id's move back in the day to open their software enough to allow a modding community was an early stroke of genius. I still play quake mods from time to time. It's amazing how much longevity they got out of those games, let alone the engines.
Modding quake was one of my first great bursts of curiosity and creativity. I really remember those days fondly.
I think you mean Team Fortress? Although Half-Life does use a modified Quake engine, it was a commercial game that used none of the assets of Quake, so I don't think calling it a "mod" is really appropriate.
This is a gift, just to have. Thanks for posting it. I have very fond memories of playing Wolf3d years ago on DOS. It will be fun to read through the source of the classics.
If you're looking for a high-level overview of some of id's source code releases, check out Fabien Sanglard's guides[0]. He's also written a book that goes into greater detail about the Wolf 3D engine.
His Wolf3D book is gorgeous (full-color, lots of great screenshots and illustrations). Well worth the price. Plus a lot of great info about the vagaries of sound, graphics, and memory architectures on the machines of the era.
There used to be a sort of Engine War in the late 90s and early 00s: id Tech, Source, Unreal. Now id is basically gone. And it is interesting that both Valve and Epic managed to create a platform: Valve created Steam, and Epic made Unreal almost the de facto standard. (Yes, I know there is Unity.)
I wonder why id didn't continue to compete with Unreal in the engine middleware market. Why did Valve decide to leave what was at the time possibly the best game engine? How did Unreal go from a mediocre engine to an insanely great one that is still improving rapidly?
Doom (or rather John Carmack) used to be the sole cheerleader for OpenGL. I still loved 3dfx, Voodoo and Glide though: the API was small and fast, in an era when Direct3D was... really not very good. Interesting times: Intel is now coming back to the GPU market again after the i740, which was the first graphics card to use the AGP slot. Maybe instead of GPGPU we should go back to graphics with AGP: the Advanced Graphics Processor.
> I wonder why ID didn't continue to compete with Unreal in the middleware engine.
IIRC from one of his later talks, Carmack wasn't interested in tech licensing and it was mostly the other owners pushing the idea. I remember that even before Doom 3 was released, when the Quake 3 engine was at its apex in terms of licensing (basically the most licensed of their engines by far), their licensing page said that all the support you'd get was a day with Carmack to explain some bits.
> How Unreal went from some mediocre engine
Unreal was never a mediocre engine; even from its first release it had amazing tools and a very flexible architecture. Even if the rendering tech was sometimes behind id's (I always noticed that id would come up with a neat new idea and the next UE would polish it up; the only time this didn't happen was with Rage and UE4), their toolset and architecture are what made other devs go after them, and their stance on support (you'd get constant updates and documentation, and I think at some point they set up an internal "community site" for people licensing their engine) was the icing on the cake.
I wonder why you said that Unreal (I assume the first, single-player Unreal) was a mediocre engine. AFAIK it was competitive with Quake II and released only a couple of months later. Care to elaborate?
From the OpenArena page on wikipedia: "The OpenArena project was established on August 19, 2005, one day after the id Tech 3 source code released under GNU GPL license."
The id Tech 3 wikipedia page says: "id Tech 3, popularly known as the Quake III Arena engine".
Someone opened a pull request in Oct 2017 with 3,275 commits; it's still pending.
I can't find any evidence that anything new has happened.
Because you should not use this unmaintained original source, but rather the various mods around it. There are many, and they differ for each game.
Eg https://ioquake3.org for latest quake 3 arena.
When I did my first mods in 1995 there was no source code under the GPL, just reverse-engineered format descriptions, but I was able to write map writers in Lisp to generate maps from AutoCAD. That was fun.
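In the same spirit as those Lisp map writers, here is a minimal Python sketch of a programmatic Quake .map writer. Quake's .map format is plain text: each entity is a brace-delimited block of "key" "value" pairs (brush geometry, which would be additional plane definitions inside the worldspawn block, is omitted here for brevity; the entity names used are standard Quake classnames).

```python
def entity(classname, **keys):
    """Emit one .map entity block: { "key" "value" ... }."""
    lines = ["{", f'"classname" "{classname}"']
    lines += [f'"{k}" "{v}"' for k, v in keys.items()]
    lines.append("}")
    return "\n".join(lines)

def write_map(entities):
    """Join entity blocks into a complete .map file body."""
    return "\n".join(entities) + "\n"

map_text = write_map([
    entity("worldspawn", message="Generated map"),
    entity("info_player_start", origin="0 0 24"),
])
print(map_text)
```

Because the format is just text, any language that can write strings can generate maps, which is exactly what made driving it from CAD software feasible.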
DooM is not only fun, but it's great fun to play around with from the inside, or even to port. I've helped bring Doom to the Sharp X68000, which was quite the mission, as the only compiler we had that could build 'large' executables was GCC 1.39, which had been translated into Japanese.
But it's been fun doing ports to OS/2, DJGPP v1, Watcom C, and of course various other Unixes via the X11 code. The GPL'd code was 'cleaned up', and kind of bugged up along the way; sadly, what was released really wasn't all that pure.
For anyone wanting something more 'pure' for MS-DOS I'd highly suggest: Mara'akate's DooM-New
The Doom source is also what enabled ViZDoom (http://vizdoom.cs.put.edu.pl/) - a Python wrapper around the Doom game engine for machine-learning research.
I've certainly found it really useful for reproducing machine-learning papers I've been reading.
IMO Quake 1 was the pinnacle of moddable id engines.
That's because it used its own language, QuakeC, and a simple compiler. Beyond that there were few inherent limits in the engine, so people set about making wild weapon packs (some having around 20 guns that could each have multiple modes), RPGs (abusing the hell out of how the game transferred character state between maps), even simple flight sims and driving games.
Come Quake 2 you needed VC++ to do mods (leading to way fewer single-developer experimental mods), and come Quake 3 there was a hard cap on the number of weapons a mod could contain.
I spent so much time reading the Doom specs and hacking around with C and .WAD files as a teen. Staying up until 2am, drinking Mt Dew, getting graphics from the game or maps to appear on the screen... funstuff
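For anyone curious what that .WAD hacking looked like, here is a small Python sketch of reading a WAD's lump directory. The layout follows the published format (a 12-byte header of magic, lump count, and directory offset, then 16-byte directory entries); the example builds a tiny one-lump PWAD in memory rather than opening a real file.

```python
import struct

def read_wad_directory(data: bytes):
    """Parse a WAD's directory into (name, offset, size) tuples.
    Header: 4-byte ident ("IWAD"/"PWAD"), int32 lump count,
    int32 directory offset; all integers little-endian."""
    ident, numlumps, infotableofs = struct.unpack_from("<4sii", data, 0)
    assert ident in (b"IWAD", b"PWAD"), "not a WAD file"
    lumps = []
    for i in range(numlumps):
        # Each directory entry: int32 filepos, int32 size, 8-byte name.
        filepos, size, raw_name = struct.unpack_from(
            "<ii8s", data, infotableofs + 16 * i)
        lumps.append((raw_name.rstrip(b"\0").decode("ascii"), filepos, size))
    return lumps

# Build a minimal one-lump PWAD in memory to demonstrate:
payload = b"HELLO"
directory = struct.pack("<ii8s", 12, len(payload), b"DEMO")
wad = struct.pack("<4sii", b"PWAD", 1, 12 + len(payload)) + payload + directory
print(read_wad_directory(wad))  # → [('DEMO', 12, 5)]
```

The same directory walk is how the game itself finds graphics, maps, and sounds, and why PWADs could override individual lumps without touching the main IWAD.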
Recently I watched his 2013 QuakeCon talk, "Principles of Light and Rendering". It helped me solidify a lot of the basic understanding of 3D rendering I already had, and provided a lot of additional knowledge as well.
Can I use this to play Doom 3 BFG edition on Linux? I'd love to just buy it on Steam but sadly it's Windows only.
By the way, I love this from the readme:
> If you have obtained this source code several weeks after the time of release, it is likely that you can find modified and improved versions of the engine in
various open source projects across the internet.
I wish I hadn't looked. It doesn't look anywhere near as good as Doom 3 in my memory looked. I've had this when playing old games from my childhood before. In my memory they were perfect, beautiful. Then I looked again and my memories were shattered. I didn't expect it to happen with Doom 3.
Thanks to the Quake engine being open source and LordHavoc's source port we have a very efficient VR shooter running on all VR devices, like the Oculus Go.
https://quakewiki.org/wiki/DarkPlaces
Love seeing LordHavoc mentioned. I collaborated a little bit with LordHavoc while helping build the open source shooter game Nexuiz (~2004-2005) and he is truly a brilliant game engine programmer!
Is there a good entry point into quake mods these days?
When I was a kid in 1996 I toyed with Quake C but really had no idea what I was doing. I'd love to toy with mods again but not really sure if there is a good way to get started. Most of the resources I knew of 22 years ago are gone or significantly outdated.
Check Quaddicted[1] for some singleplayer maps. The most common engine by far is Quakespasm[2] which keeps the original style but fixes bugs, reintroduces some features from the software renderer that were lost in the original OpenGL version (fullbrights, lightmaps that go above 100%, non-power-of-two textures, etc) and removes and raises some hardcoded limits. You probably also want QuakeInjector[3], which is a "quake mod installer" that handles dependencies. You may also want MiniQL[4] (made by me) as a more straightforward and minimalistic launcher (you can also launch QuakeInjector from MiniQL if it is installed in the same directory).
This is all you need for playing maps and mods. For making maps (the most common type of mod) you'll also need a map editor. Currently the most common seems to be TrenchBroom[5], but it is far from the only option and a lot of people use other editors, such as BSP[6], QuArK[7] and GtkRadiant[8]; some people even use the original Worldcraft 1.3 (I think) or a modded Hammer (from Valve) to make maps. I think some also use J.A.C.K.[9], which is basically a Hammer clone, although I believe it is abandoned now.
The map editor isn't enough; you also need the command-line tools to "compile" the map (this is how the Quake engine can support multiple, different editors: by decoupling the tools from the engine and the individual types of tools from each other, which to me is one of the best ways to architect an engine and its tools, but sadly most popular engines have forgotten it). There are several variations, depending on the features you want, but I think these days most people are using ericw's tools[10], since they support some advanced features from modified engines (like the BSP2 format for larger maps, and colored lightmaps).
Beyond mapping, if you want to delve into QuakeC (the game's scripting language) you need the original QuakeC source code[11] and a QuakeC compiler. While you can probably use the original QCC, I've seen FTEQCC[12] from the FTE engine mentioned way more often. Some tutorials can be found here[13] (the Inside3D site, which was a sort of 'nexus' for QuakeC modders, was shut down some time ago, and while the community created InsideQC[14], it looks like not everything survived the transition).
I think the first one (the list i gave is for that) since it seems to have the most tools and documentation available out there. After that Quake 3 seems to be the most popular, but that is multiplayer only (not that some people didn't try to make singleplayer mods though :-P).
I've cross compiled the MS-DOS stuff from OS X. From the PowerPC days even.
If you want native Doom, check out the source ports like Chocolate Doom. Doing ports to stuff like SDL is somewhat straightforward, or you can do a Cocoa one yourself, but I'd start with something known to work first.
Carmack did the original XFree86 3.3.x port to the Mac OS X Server beta back in 1999 (which still had the OpenStep GUI!). This eventually evolved into the XFree86 mainline, and then into XQuartz.
I don't think any of the projects will compile on macOS as-is. You'll want to look at project forks if you're looking for something that'll compile out of the box.
As a former POW (Vietnam), I hesitated to play WOLF for over a month after downloading as I feared flashbacks. I didn't want to remember all that I had been through all those years ago, when, as POW's, my friend and I decided an escape attempt would be better than a slow death by torture and starvation.
My friend and I made crude maps and hoarded food. The day of the escape we clubbed the guard with stones, took his gun and fought our way through two levels of underground tunnels (only a few guards and had to crawl). I made it, my friend didn't.
Dreams...NO! NIGHTMARES...YES!! However, the more I play WOLF the less frequently I have nightmares. The chilling part is turning a corner and seeing a guard with his gun drawn.
WOLF is a powerful game. Fearful as well. I believe that a person should face the past. So... when I can play EPISODE 1 comfortably (no nightmares), I plan on ordering the full series.
There's also a letter from a Microsoft manager requesting a multiplayer version.
https://github.com/id-Software/wolf3d/blob/master/WOLFSRC/GO...