I love the imaginative, alt-history attention to detail: rather than a derivative clone of Wolfenstein or Doom, this looks like something The Bitmap Brothers might have produced in 1993 if they'd had access to another 30 years of community knowledge optimizing for the Amiga's great-for-2D but not-great-for-3D architecture.
Playing Wolfenstein 3D on a friend's PC was one of the reasons I thought "It's over for the Amiga" [0]. When talking to Commodore management at conferences back then, they didn't want to see 3D coming [1]. (For reference, I had an A4000/40 with a Retina (Z3) around that time.)
So it is nice to see this :-)
[0] We were both demo scene coders
[1] Funnily enough, I was a Sega Saturn zealot around that time too; it was also 100% 2D-focused and Sega didn't see 3D coming either (though I loved Panzer Dragoon, a 3D game squeezed out of 2D hardware, and Daytona was great too). And yes, there was Starglider 2 for the Amiga (loved the space whales; I was recently reminded of them when watching Ahsoka).
Yes, you're right, my wording wasn't exact. I meant 3D taking over the home market (and the arcade market) and killing 2D platform games and RPGs. Sega was very successful running "3D" games on 2D hardware (that could scale sprites, like Outrun or Space Harrier), so they were aware of 3D from early on. But they released the Saturn with a main 2D focus, squeezed out some 3D games (Panzer Dragoon, Nights into Dreams, Tomb Raider, Burning Rangers and some arcade conversions) - the takeover was going on, people realized it, with Mario 64 and Wipeout for example - and then added 3D hardware to the Dreamcast. But if they had anticipated 3D taking over, the Saturn would have been the Dreamcast (or not; I think my first PC 3D card was a Voodoo, and that came after the Saturn).
The first 3D games I remember having played were Major Havoc and I, Robot - so yes, there were "real" 3D games in the market, but 3D completely took over the home market, except for handhelds like the Game Boy/Advance/DS/... (I think the 2D revival came with the indie developer movement, but I'm not sure).
Also see e.g. the transition from The Bard's Tale (3D in a small window) to Dungeon Master (3D in the main window, with 3D interaction) and then Ultima Underworld (all in: tilt, look up/down) as an example of the 3D-ification of games and genres.
You are forgetting the Sega Model 1 and Sega Model 2 arcade platforms. So many great 3D games (Virtua Fighter, Virtua Racing, Daytona USA)
Sega's problem wasn't that it didn't see 3D coming. They did; despite the common claims on the internet, the Saturn was designed to be a 3D-capable console, with dedicated 3D hardware.
Sega's problem is that they bet on the wrong 3D technology. Like Sega's arcade systems, the Saturn was designed for quads, and could only do forward texture mapping (which doesn't allow for UV coordinates).
The PlayStation, meanwhile, went for triangles and inverse texture mapping. It turns out that was the correct direction, and the whole industry settled on it as a standard. All game engines and tooling would come to assume triangles.
To make things worse, the Saturn's 3D hardware implementation wasn't that good even at the techniques it was trying to implement.
I mean the arcade version of Daytona USA, which ran on dedicated 3D arcade hardware (the Sega Model 2) that predated the Saturn by 2-3 years.
Part of the reason the Saturn port of Daytona USA was so good is that the game was designed for the Model 2 hardware, and the Saturn's 3D capabilities were cost-optimised versions of the Model 2's.
VDP1 was designed to do 3D. It's not just a 2D sprite engine that developers later tricked into doing 3D; it was absolutely intentional from the very start that it would be used for 3D. The fact that the 3D quads could also be used to show thousands of 2D sprites with full scaling/rotation/distortion was more of an intentional bonus of the design.
Yes, Sega did miss a trick by also dedicating a chunk of the silicon budget to 2D-only capabilities. VDP2's four layers of scrolling backgrounds plus two layers of mode-7 style "3D planes" are pretty useless for 3D work. They should have dropped VDP2 entirely and focused on making VDP1 more powerful.
When games were actually designed around the Saturn's 3D capabilities, the results were pretty good. Daytona is great, and Tomb Raider is another good example.
Some games went further, going out of their way to adjust their game design so they could take advantage of VDP2's 3D planes. That's the major reason Panzer Dragoon looks so good: the water is a 3D VDP2 layer. The skybox too, but that's a reasonably common technique in Saturn games.
Sega released the Saturn as full-fledged, 3D-focused hardware, a year before the PlayStation 1 was out.
A quick chronological refresher:
Sega had already been successful in making 3D arcade games and was the first to effectively bring true 3D to the home market, with multiple early flagship releases such as Virtua Fighter, beautifully ported to the home console.
The reason not many decent 3D games shipped on the Saturn was not down to the hardware's capabilities or to any lack of intent from Sega to embrace the expected 3D market.
It was rather due to the horrific developer experience. It was so terribly difficult to program that only Sega's own teams were comfortable with it, if that.
The Saturn's architectural complexity - multiple CPUs, among other design aspects - proved a bad choice, as it jeopardised third-party developers' productivity. Word was that most games shipped using only a single CPU, given how obscure dual-CPU programming was.
Ironically, Sega designed a more powerful machine than the PS1 and got it out a whole year earlier, yet games on the PS1 were significantly superior. All down to ease of development.
Wipeout was an early PS1 game and, along with many other popular titles, indeed sealed the fact that home gaming would be 3D. But Sega had already produced a full-fledged, 3D-capable Saturn; they didn't wait to see Wipeout in 1995/1996, or Mario 64 another two years later.
The Saturn was capable of running Wipeout 2097, Metal Gear Solid, Gran Turismo, and other late PS1 games that pushed the PS1 to its limits. But it had missed the momentum, studios were no longer invested in the Saturn, and its market share kept shrinking.
What the Dreamcast added was primarily developer-friendly features; the rest was simply a leap forward in compute power, especially on the GPU. Bump mapping, if I'm not mistaken, was supported by the Dreamcast first among home consoles. It was not "added" 3D: for Sega it was their second generation of 3D-focused hardware, their second attempt, and both ultimately failed, for different reasons.
The Saturn had two graphics chips, VDP1 and VDP2: on one you could draw (distorted) sprites, and on the other large planes, e.g. for backgrounds [0].
If you implemented depth sorting in software (rather than relying on a hardware z-buffer, which the machine doesn't have), you could draw those distorted sprites in the right sequence and use them as 4-corner polygons to render 3D models.
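To make that concrete, here is a minimal sketch (plain C, not actual Saturn or PS1 SDK code) of the kind of CPU-side painter's-algorithm sort being described: quads are ordered back to front before being handed to the draw hardware, since there is no per-pixel depth buffer. The Quad struct, its fields and draw_quad() are hypothetical placeholders.

    #include <stdlib.h>

    /* Hypothetical quad: four projected screen-space corners plus an
     * average camera-space depth that is used only for sorting. */
    typedef struct {
        int   sx[4], sy[4];   /* projected corner coordinates  */
        float depth;          /* average Z of the four corners */
        int   texture;
    } Quad;

    /* Placeholder for "append this quad to the hardware draw list". */
    extern void draw_quad(const Quad *q);

    /* Painter's algorithm: draw the farthest quads first so nearer quads
     * overwrite them, because the hardware has no per-pixel depth test. */
    static int back_to_front(const void *a, const void *b)
    {
        float za = ((const Quad *)a)->depth;
        float zb = ((const Quad *)b)->depth;
        return (za < zb) - (za > zb);   /* sort descending by depth */
    }

    void draw_model(Quad *quads, size_t count)
    {
        qsort(quads, count, sizeof *quads, back_to_front);
        for (size_t i = 0; i < count; i++)
            draw_quad(&quads[i]);
    }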
Yes, but it's basically the same story with the PlayStation.
VDP1 draws distorted textured quads (it's misleading to call them sprites). The PlayStation's GPU draws distorted textured triangles.
The PlayStation GPU also knows nothing about 3D; it doesn't have a depth buffer. Like on the Saturn, you have to implement depth sorting on the CPU and link all those distorted triangles up in the right sequence of 3-corner polygons to render a 3D model.
Yet nobody ever claims the PlayStation isn't a 3D console.
It's not incorrect to call them distorted sprites. They are. But they are also texture-mapped quads, because the two concepts are functionally identical. Sega explicitly pointed out this relationship in the documentation they sent out to developers.
And if you are in a discussion about if VDP1 is 3D capable or just advanced 2D hardware that clever programmers tricked into doing 3D, then it doesn't really matter what terminology Sega used.
It's misleading terminology because, if the person you are talking to doesn't understand the functional equivalence between distorted sprites and texture-mapped quads, the "sprite" terminology might lead them to assume it's 2D-only hardware.
"And if you are in a discussion about if VDP1 is 3D capable"
This was not the discussion. The discussion was whether Sega missed the total shift of gaming to 3D games that was brought on by 3D graphics cards and that propelled PCs into the mass gaming market (before 3D, the PC mostly excelled at things consoles couldn't do, like text adventures and point & click). And whether it needed not 2D hardware (at least for the international market; Japan had e.g. more shmups and, it seems, less interest in 3D shooters) but dedicated 3D hardware like the PowerVR2 they did put in the Dreamcast (too late - and I wrote Dreamcast software and loved the machine, so I'm not against Sega hardware in any way). If they had anticipated the move of games to mostly 3D (the vast majority of today's AAA titles are 3D games, for better or worse, and in the top 10 of most-sold games, 8 are 3D and 2 are 2D or 2.5D), they would at least have had a chance to overtake the PlayStation and survive as a game hardware vendor. The Saturn was only on par with the PlayStation (e.g. at the time I owned Tomb Raider on both), which wasn't enough.
If you take a look at the most sold PS1 games, it's mostly 3D games:
The Wikipedia page says this about the Saturn's 3D:
"[..] which resulted in the creation of the "SuperH RISC Engine" (or SH-2) later that year [..] According to Kazuhiro Hamada, [..] "we realized that a single CPU would not be enough to calculate a 3D world." Although the Saturn's design was largely finished before the end of 1993, reports in early 1994 of the technical capabilities of Sony's upcoming PlayStation console prompted Sega to include another video display processor (VDP) to improve 2D performance and 3D texture mapping."
Sega built the Saturn without a focus on 3D. Because they saw 3D coming too late, late in development and after the design had mostly been finished, they added a second CPU and a second VDP for 3D texture mapping to compensate for the lack of 3D capabilities.
> Sega released the Saturn as full-fledged, 3D-focused hardware, a year before the PlayStation 1 was out.
The Saturn only beat the PlayStation to the Japanese market by 11 days.
The gap was more like 3 months in other markets, but only because Sega brought the launch date forward (with disastrous results: in North America it was scheduled to launch one week before the PlayStation, but Sega said "surprise, it's launching today". Many retailers in North America went on to boycott Sega, because they weren't informed about the surprise launch and didn't have any stock).
Ah, Scud Race. I'll never forget finding that machine in a bowling alley arcade and noticing the coin feeder wasn't closed. Surreptitiously opening it revealed a little button that could be pressed to add a credit for free each time. That was one Saturday I'll never get back, and I don't care.
As I understand it, 3D was considered an afterthought for the Sega Saturn, their 1995 flagship console, with a separate processor for 3D only added late in the design. This made the Saturn's 3D capabilities very hard to program for.
It's on an Amiga 500, which has about 3 MIPS of performance, or about 4x less than a 486SX-25 at about 12 (yes, the 68k had a couple of extra registers).
On top of that, the Amiga 500's graphics chip placed all graphics in bitplanes. That was a great move for just copying 2D graphics around in the mid-80s with the help of simple hardware acceleration, but it became a real pain for anyone wanting to do 3D graphics.
To call it a reskin is quite a disparagement of the amount of work they've probably had to put in to get it running decently (unless they figured out a very clever way to make the hardware do a fair chunk of the work for them; kudos either way).
And as others have pointed out, it's that they've chosen a more "Amiga-y" art style.
My knowledge of the Amiga chipset is very cursory, but peeking at the source code, there are some blitter routines that look like they're used for texture scaling (i.e. stretching and drawing vertical strips/walls). I don't know whether that (alone) accounts for the significant speedups needed. A good comparison would be to whatever bottlenecks the original NeXT engine had, which ran on a 68k but didn't have any of the accelerators...
Hmh, browsing a bit, it seems that the Dreadtool part of the source is some kind of editor built with Visual C++. The Dreadmake folders seem to have the actual Amiga sources.
A reskin within games is usually defined as using the same binary with different assets; even a shared source-code lineage (Doom source used in Heretic/Hexen, or especially Quake to Half-Life) would stretch that definition.
In this case I doubt there is any shared code at all, other than conceptual ideas.
If you are willing to go around proclaiming that Rust is just a reskin of Java, then sure, I'll grant that you're willing to equate wildly different things. (I didn't use C# and Java because those are probably closer in lineage, due to J++, than this game and Doom.)
I think the Bitmap Brothers mention is because the 2D graphics feel very Amiga-ish / BB-ish. When I read the comment I wondered "why the Bitmap Brothers reference?", but once I saw the screenshot I thought "ah, I see". Compare the textures/2D artwork of, say, Doom, Duke3D or Blood with the sprites of something like Chaos Engine, Gods (even the DOS versions) or even some of their 90s PC games like Z, and I think you'll see how the visual style of Grind is closer to that of the Bitmap Brothers' games than to Doom/Duke3D/Blood/etc.
Consider that Doom is sometimes blamed for single-handedly causing the downfall of the Amiga's position in gaming because of how long it took to get even a half-decent Doom-like game for it, as a result of how badly mismatched the Amiga graphics chipset was for 3D graphics.
That is why this is impressive.
The number of gamers who switched to PC's just because of Doom was huge. After Doom everything was measured against it for a long time. Even many diehard Amiga fans suddenly decided the game was up and switched.
That your reaction is that it is "barely more than a Doom reskin" demonstrates why it is incredible.
To the extent that if a "barely more than a Doom reskin" had been achieved near the time Doom came out, computing history might have looked quite different (though Commodore was so badly mismanaged at that point that they'd probably still have managed to mess things up even if they'd been handed this kind of reprieve on a silver platter, but one can dream).
Fellow Amiga game dev here. The Amiga dev scene is alive and well, and I feel like these past five years especially, we've seen some mindblowing stuff on the platform.
If you haven't seen any recent Amiga demoscene entries, you're in for a treat. I'm tempted to share a whole list of demos, but here is a particularly amazing one released earlier this year that runs on a stock A500: https://www.youtube.com/watch?v=2jciCr8zEhw
I believe a number of the devs who have been around for decades are still writing actual code with their preferred tools that run on Amiga. But I wouldn't be surprised if even there, most of the coding is being done in emulator, because the modern workflow is so much snappier that way.
However, the cross-platform build stack has gotten comprehensive, and straight modern hardware has become the method of choice for a lot of devs (including myself). There is an "Amiga Assembly" extension for VS Code which does a ton of heavy lifting for you, including setting up the build tools, and even grabbing a fork of the emulator that has DAP support so you can use VS Code's built-in debugger.
I personally use Neovim and a couple plugins for syntax highlighting and DAP support. As part of my build process, I run Python scripts which convert images and SFX to Amiga-friendly formats and assemble them into a custom binary file. I can't imagine how long that would have taken me to set up in an Amiga-only environment—Python is just too convenient.
I think a lot of what you see possible on Amiga today is thanks to modern tools allowing for very rapid iteration as well as some crazy preprocessing for effects that depend on lookup tables, etc.
FreePascal still has official support for creating Amiga binaries. I never tried it since I have no idea how Amiga development works in general, but I tried FreePascal a bit for other systems and it seems pretty good (underrated?) - both the language and the compiler: fast, low on bloat, mostly memory-safe, huge standard library.
If I was doing it (as an ex demo coder) I would do as much as I could in emulator to make my life easier and then test it on the hardware and hope for the best. Demo coding is one of those annoying things where real hardware is absolutely essential because you are often doing things which aren't properly emulated.
This demo didn't impress me that much; I was actually hoping for more. Maybe I've just forgotten how underpowered the Amiga OCS was for things like this? Second Reality by Future Crew on the PC is still my #1.
A 33 MHz 486 CPU (the recommended spec for Second Reality to run smoothly) is MUCH more powerful than the 7 MHz 68000 of the A500... It's not just that it's nearly 5 times faster; the 486 is way more cycle-efficient, not to mention having a nice byte-addressable bitmap screen to draw into (vs 2-6 separate bitplanes).
So yeah you forgot how underpowered OCS Amigas are :)
The Amiga hardware doesn't really help much with 3D effects, which the demo is loaded with. Audio is a big help for a 68000 but a 486 can mix 8 PCM channels and not even notice.
I was fascinated by the Amiga when I was young, but they weren't widely available in my country.
Lately I've had an itch to do some retro programming work and was looking at the Amiga specifically. I tried Aztec C natively in UAE and VBCC as a cross-compiler, and got some command-line and hello world Intuition programs working.
But going beyond that has proven somewhat difficult. Documentation seems to be scattered all over the internet. I found a bunch of books on archive.org, but they mostly seem to focus on Intuition applications or beginner programmers.
I don't think you can follow the path originally trodden (competing in AsmOne on 68K/hardware with your fellow local coders, exchanging snippets of hardware register info and source), but there's a terrific short series of videos on YouTube you might consider:
Wei-ju Wu lays out how to code directly on the hardware in an accessible manner and, importantly, from C instead of 68K assembly, with lots of examples. Then, if (or rather: when) you have to, you can always step down a level into 68K assembly to get better performance.
This is truly amazing. I've always adored the demoscene, but never done anything other than consume, so I would appreciate more of these demos here if you can share some highlights from recent times.
Correct, the demo I linked is a non-interactive audiovisual demo that is designed to show off the leet coding and artistic skills of the team. ;) The OP is a fully-playable game, though.
When designing a game, you probably have to code a game engine that has certain capabilities and certain restrictions that persist through the entire game. But for these demos, you can write an entire isolated program that runs one 15-second visual effect before flushing it out and loading something totally different for the next effect. The effects are also usually designed with a lot of tricks that make them inflexible (like preprocessed data tables) and usually also take up every CPU cycle available, leaving none for stuff like game logic.
I bought an Amiga 500 about 10 months ago. Haven't touched one since way back when (89-90?).
If you are considering doing the same, I recommend going to YouTube and looking up two keywords: PiStorm and FS-UAE. These two allow me to prep a hard-disk image in emulation on my PC, transfer the image to a Raspberry Pi inside the Amiga and mount it. My Amiga 500 now boots a Workbench with all the things I was looking for.
I also bought a new case and keys, because the Amiga was very yellowed. And a new power supply, because the old one looks like it wants to burn my house down.
I know nothing about this. What's up with the vertical lines for color variations here? Is there some kind of hardware trick why that technique is used?
The last time I wrote a 2.5D engine was over 25 years ago, so please be gentle.
I suspect it's a way to dither with few colors, and possibly a way to render textures very quickly into the Amiga's peculiar pixel format. One pixel is not a triplet of bytes, or even a single byte. One byte is 8 pixels!
And that's for monochrome. (Pure black and pure white.)
To have more colors, you add more of these "bit planes" on top of each other.
So if you have, say, 32 colors and want to update a single pixel, you have to write five bytes, changing only one bit in each of those five bytes.
This is not very nice when you try to sweep textures over a 3D-ish mesh.
(But very good for 2D graphics with large chunks bobbing up and down.)
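To make the cost concrete, here's a minimal sketch (in C rather than 68k assembly, with made-up screen dimensions) of what that layout means for a single pixel write on a 32-colour screen: one read-modify-write per bitplane, five in total.

    #include <stdint.h>

    #define WIDTH_BYTES  (320 / 8)              /* 40 bytes per row per plane */
    #define HEIGHT       256
    #define PLANE_SIZE   (WIDTH_BYTES * HEIGHT)
    #define DEPTH        5                      /* 5 planes -> 32 colours */

    /* planes points at 5 consecutive bitplanes; colour is 0..31 */
    void put_pixel(uint8_t *planes, int x, int y, uint8_t colour)
    {
        int     offset = y * WIDTH_BYTES + (x >> 3);
        uint8_t mask   = 0x80 >> (x & 7);        /* leftmost pixel is the MSB */

        for (int p = 0; p < DEPTH; p++) {        /* one byte touched per plane */
            uint8_t *byte = planes + (long)p * PLANE_SIZE + offset;
            if (colour & (1 << p))
                *byte |= mask;
            else
                *byte &= (uint8_t)~mask;
        }
    }

A chunky display (like VGA's mode 13h) lets you do the same thing with a single byte write, which is a big part of why texture mapping was so much cheaper on the PC.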
The Amiga (except some early 1000s) also had an Extra Half-Brite mode with 6 bitplanes: 64 colours, with the second 32 being half the brightness of the first 32.
Other than using the copper to change palette entries at desired scanlines, the Amiga 500 can display 64 colors in EHB (Extra Half-Brite) mode with 6 bitplanes. Since there are only 32 palette entries, the upper 32 colors are the same as the lower 32, but at half the brightness.
And of course 4096 in HAM mode, but that mode has severe limitations and is completely impractical for textured 3D games.
The technique does two things (see the sketch below):
1. it simulates texture filtering and dithering, hiding the low resolution and low color count of the textures, and
2. it renders two or four columns at once.
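For point 2, a rough sketch (hypothetical C with a chunky intermediate buffer; the real engine works in planar memory and assembly) of what "two columns at once" means: each texel fetched for a wall slice is written to two adjacent screen columns, so only half the columns need a texture lookup, and the doubled pairs are what read as vertical stripes.

    #include <stdint.h>

    #define VIEW_W  160      /* hypothetical view width                    */
    #define TEX_H    64      /* hypothetical texture height (power of two) */

    /* Draw one wall slice spanning rows y0..y1-1 at column x, stepping
     * through texcol in 16.16 fixed point and doubling each texel
     * horizontally. */
    void draw_column_pair(uint8_t *chunky, int x, int y0, int y1,
                          const uint8_t *texcol, uint32_t step)
    {
        uint32_t v = 0;                              /* fixed-point texture row */
        for (int y = y0; y < y1; y++) {
            uint8_t  texel = texcol[(v >> 16) & (TEX_H - 1)];
            uint8_t *dst   = chunky + y * VIEW_W + x;
            dst[0] = texel;                          /* left column of the pair  */
            dst[1] = texel;                          /* right column of the pair */
            v += step;
        }
    }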
The Amiga had planar graphics, which did not support setting individual pixels directly. To set a pixel you had to set the appropriate bit in up to 6 bytes (64 colors max) scattered around in memory. Writing one byte modified one bit in each of 8 adjacent pixels, which was terrible for textured 3D graphics.
The original image is squashed vertically, so that you only compute half the pixels and then you stretch it back up. When you do this, you can't have the usual dither patterns, so you end up with something like this (unless you cheat and shift every other line, like in Sonic 3D, but then you have dither patterns everywhere).
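A similarly rough sketch of the squash-and-stretch idea, assuming a hypothetical chunky buffer: only half the rows are actually rendered, and the output stage duplicates each one, so every computed texel covers two scanlines.

    #include <string.h>
    #include <stdint.h>

    #define VIEW_W  160      /* hypothetical view width             */
    #define VIEW_H  128      /* final height; only half is rendered */

    /* halfbuf holds VIEW_H/2 rendered rows; screen receives VIEW_H rows. */
    void stretch_up(const uint8_t *halfbuf, uint8_t *screen)
    {
        for (int y = 0; y < VIEW_H / 2; y++) {
            const uint8_t *src = halfbuf + y * VIEW_W;
            memcpy(screen + (2 * y)     * VIEW_W, src, VIEW_W);   /* rendered row   */
            memcpy(screen + (2 * y + 1) * VIEW_W, src, VIEW_W);   /* duplicated row */
        }
    }

(On real hardware you'd more likely let the display hardware repeat lines instead of copying them with the CPU; this is just the idea.)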
with fast RAM. That pretty much means an additional accelerator card, because Commodore didn't bother to build a fast RAM memory controller into the thing. You couldn't just slap some SIMMs or RAM chips on a card; you needed additional logic. The cheapest contemporary fast RAM cards were ~100 pounds + ~30 pounds a meg, half the cost of a 270-pound Amiga 1200 in 1994. You could argue it was still cheaper than a PC; let's count.
And after all that, you'd still have a 68020 under the hood. You could, of course, spend an eye-watering amount of money on a 68040 accelerator... and then... you're still not going to beat the DX2/66. (Let's not even talk about the performance of AGA versus a local-bus SVGA card.)
It feels like I've heard so many Amiga users say that they finally broke down and jumped ship when Doom and the DX2/66 dropped. It really was a gamechanger. Before then it was possible to argue that a '040 Amiga 4000 still held its own against a DX/33 or maybe even a DX/50, but not this.
There were a lot of things that came together to kill the Amiga, mostly Commodore's mismanagement, mismarketing and complacency, but as the final blow to it being a viable computer with a future in the eyes of consumers, I feel like the DX2/66 was it. It would have killed the Mac too if Apple hadn't woken up to smell the coffee a few years prior and had Power Macs ready to go.
Back in the day, I played Doom 2 on a 40 MHz 386DX and it ran just fine. And Rebel Assault at 14 fps.
Also, I managed to play Doom 3 on a GeForce 2 MX DDR (using the patch sets that allow it to run on Voodoo cards).
The artwork really works with the engine: everything looks a bit grimy and muddy, which means the vertical stripes you get as part of the implementation fit in really well.
The sound, animation and gameplay are also great. I love the sound of the machine gun starting and stopping when the character gets into an enemy's line of sight. Great attention to detail.
Nah, it might have helped give it a bit more life for another year or two, but even if the Amiga (and Commodore) had survived the era of Doom (386s/486s), it was never going to survive the arrival of Quake and the Pentium, let alone the arrival of GPUs that followed shortly after.
The thing is, Commodore might have been as little as months away from getting the next gen chipset to production if they'd had just a bit better sales through '92 and '93. They were producing samples '92 onwards but struggled to afford iterations.
They were mismanaged enough that they might well have failed to leverage that and just failed a year or two later, as you said, and it might very well be that even the best case of completing AAA[1] would just have bought them another couple of years.
But Commodore had faced crunches like that several times before and survived and bounced back massively. So who knows what things might have looked like if AAA had gone to market (with chunky graphics modes and massively increased memory bandwidth) and bought them enough sales, and by extension enough time, to complete the next iteration - Hombre[2] - as well.
Though Hombre was based on PA-RISC, so it might well have ended up being the death knell instead, just a bit later. Then again, their "backwards compatibility" story for it was based on the option of either a "classic Amiga on a chip" or emulation, so maybe they'd have gotten to a sufficiently CPU-agnostic position to make further architecture switches survivable.
It's fun to speculate.
I'm of two minds about it. I'd have loved the Amiga to have survived longer (I still miss it), but I'm unsure whether I'd have liked the direction Commodore would have taken it in with Hombre (which was being designed to also run Windows NT... shudder). It's easier to have nostalgia when later iterations haven't ruined the original experience...
It’s wild to me that now people are making something like this just for fun, while if this had actually been released at the time it would have been a best-selling game.
Man, I remember the "chunky vs planar" Usenet megathreads when Doom came out; amazing that it's still producing results 30 years later. I miss my old A500, but not SLIP/PPP, X/Y/Zmodem downloads and the terrifying floppy checksum errors.
For 2D stuff definitely, but that was because of a neat dedicated graphics chipset and a fixed platform for developers to target. The Amiga range was WAY beyond others in terms of audio/visuals when it came out. In that sense it was closer to the home consoles than to a typical PC.
It led to the odd situation where you had PCs with these killer CPUs, but everything else about the system was holding them back. The Amiga had an average CPU (an 8 MHz 68K), but everything else was picking up a lot of slack.
As for this example, this is exceptionally good for the Amiga. I don't recall there being anything even remotely close to this in its time. Hackers are just gonna optimise way beyond reason and I love it.
They are optimizing, probably. Thanks to the internet, knowledge sharing is substantially simpler than in the 1990s, and it's massively easier to find people to collaborate with. It's hard to overstate how difficult it was to find useful info back in the day.
Additionally, anybody that grew up coding on the Amiga has had 30, 35 years to think about it since! - and they are probably still young enough (or, more accurately, probably not yet properly old enough...) that time has, for now, added more to their abilities than it has taken away.
And: modern PCs are ridiculously fast! A table or routine that would have taken your Amiga days to produce, even assuming you'd have considered the idea feasible in the first place ("b-but - you'd need a temporary 512 MByte table for that!!"), can be generated in 5 minutes with some Python code on your 10-year-old laptop.
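As a toy example of that sort of offline precomputation (the comment says Python; a quick host-side C program works just as well), here's a hypothetical generator that bakes a fixed-point reciprocal table into a 68k assembler include file. The file name, table size and format are invented for illustration.

    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("recip.i", "w");
        if (!f) return 1;

        fprintf(f, "recip_table:\n");
        for (int z = 1; z <= 1024; z++) {
            /* 1/z in 16.16 fixed point, clamped so it fits in a dc.w entry */
            long r = (1L << 16) / z;
            if (r > 0xFFFF) r = 0xFFFF;
            fprintf(f, "    dc.w $%04lX\n", r);
        }

        fclose(f);
        return 0;
    }

At runtime the 68000 then replaces an expensive divide per column or per pixel with a simple table lookup.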
Tooling and code generation improvements no doubt help a lot, but IMO those improvements must be coupled with creative manual optimizations in order to get something like this out of a platform that was tailored for rectangular 2D bitmaps.
There isn't much the compiler can optimize on such a simple CPU as the m68k when the source is simple C or hand-optimized assembler code to begin with.
Contemporary compilers definitely do a lot of CPU-agnostic optimizations that ye olde compilers weren't capable of. The only new CPU features that fundamentally change this are vector instructions, and compilers still suck at autovectorization.
Contemporary compilers are "optimized" to optimize for superscalar, fully pipelined, out-of-order CPUs with plenty of RAM and cache. Literally nothing in common with a 68k. Also, modern compilers wouldn't even know how to produce code for the Amiga coprocessors.
This engine is very likely written in hand-optimized, cycle-exact asm.
The IBM PC (and clones), at the time the Amiga 500 launched, had at best an 80286 at 6 to 12 MHz. I would call that roughly equivalent to the 68000 at 8 MHz (16-bit segmented addressing vs a 16/32-bit hybrid with linear RAM addressing).
At launch it definitely was a decent little performer. That said, the 386 was probably launched shortly afterwards as well. The funny thing is that it wasn't until almost the mass adoption of high-speed 386s and the 486 in the early 90s that we finally saw a lot of stuff overtake things like the Amiga, and the SNES for that matter. A big part of that was VGA graphics via the VESA bus; the ISA bus did a lot to hold back video output on these systems. Graphics was almost an afterthought by comparison.
I mean, things like Commander Keen were considered really decent side-scrolling graphics on the PC, but would be seen as trivial on a system like the Amiga, and maybe even on the NES using early-80s tech. It was a long time until the PC could scroll graphics as smoothly as the NES could with Super Mario in 1985. Custom ASIC designs just had so much leverage back then. It is something we are moving back into nowadays with chiplet designs and SoCs in general. Just look at the processor blow-outs on something like the M1/M2 to see this.
The Amiga really struggled to handle Wolfenstein/Doom clones, because the layout of its video memory made things even harder for the already slow CPU. The chips that made it great at 2D games held it back at 3D.
This new engine is an absolute masterpiece of Amiga coding. The many mid-90s attempts to do similar things tended to need more powerful Amigas (at least an A1200, preferably with an 030 or better accelerator card and extra RAM) for less impressive results.
Yes and no. It's complicated. For textured 3D games, even with 256 colors, PCs with VGA were better. Wolfenstein catered to EGA, with 16 colors.
For smooth, frame-synced 2D games (no lag, no jitteriness, no tearing) the Amiga was hands down better, but those kinds of games were falling out of style in the early 90s.
Yes and no. PCs were, from the beginning, very simple business machines, for accounting, but they later benefited from the free market and cheap clones.
As for me, VGA was a wonder; from marketing considerations, I now think the PC's natural path would have been a "Super EGA" with few colors but high resolution. But sometimes wonders happen.
Amigas, or to be more exact Commodore computers, were from the beginning made as home computers, with decent integrated graphics and sound.
Unfortunately, the Amiga's parent company was very aggressive against clones and, from what I hear, was not effectively managed (or maybe it just tried to make too complex an architecture). So after some time the initial superiority was lost, and PCs not only dominated the market but at some point also became better technically (because of constant competition in a free market).
From roughly 1987 to the mid-1990s the Amiga fell technically behind for good; then the parent company went bankrupt, and now, each year, Amigas become ever more historical artifacts.
Yeah they did at the time for sure. When the A500 was released PC games were generally CGA/EGA. The A500 was way ahead of its time, especially when you account for how much they cost.
Around the time I got my A500, my neighbour's dad bought a PC that ran California Games in CGA at about one frame per second. I'm sure there were much better PCs available, but it didn't impress us kids much. They _did_ have Leisure Suit Larry though...
What really impressed me is that I was born around the same time as the A500 but I remember playing PC games that looked worse than this, hahaha.
Part of it is surely that I was using hand-me-down PCs and there was probably some time lag on the games that my dad would get through the sneaker-net.
The art style is also really solid and modern which I’m sure helps hide some of the limitations.
The Amiga smoked PCs when it came out. PCs didn't really come into their own until the 486 became commonplace, but once it did the Amiga was struggling to keep up, games-wise. For example, the PC port of Mortal Kombat was the most arcade-accurate home port of any system. The Amiga port was nearly, but not quite, as good.
This is not how I remember games on the Amiga. Having said that, I loved the games and graphics on the Amiga. It just felt like a beautiful visual experience.
Judging from Amiga retro channels on YouTube, they all look like Gen X-ers to me. I suspect nostalgia drives this, not that these are particularly interesting machines today. (Shots fired, I know.)
This is true for the demoscene in general, as well. Not completely, thankfully, but it seems like it is becoming even more niche than it already was.
It is originally from the C64 demo "Andropolis" by Booze Design and Instinct, coded by Andreas Larsson, the guy/genius who ported Eye of the Beholder to the C64.
Here is the same thing in turbo mode on an Ultimate 64:
Once people know something is possible with video games, they tend to imitate it fairly well across platforms. I like to use Street Fighter 2 as an example… good ports on PCE, SNES, Genesis.. and surprisingly decent bootlegs on NES and others. But until it hit arcades, there was nothing else like it.
Oooooooookay, so I'm not about to talk this down, because I think it's awesome, but I do think it's time for a little more up-front transparency here.
When I clicked through and hit play on the first video I braced myself for the worst framerate in human history and/or a tiny viewport, and so my jaw briefly dropped when I was greeted with neither of those. The game is basically fullscreen and runs great: a perfectly respectable (for the time - I know a lot of you will moan about games that don't run at a constant 60fps but, man, maybe you don't know what it was like back then) 25 - 30fps, and a frankly amazing framerate for the Amiga. Plenty of 2D games couldn't have reliably held that and just forget it for any kind of 3D.
Then my rational mind kicked in with, "There's no effin way that's running on an A500 with a stock 68k CPU running at 7.16/7.09MHz."
And, sadly, I'm right: there is no effin way. The footage was recorded from an A1200 with a stock 68020 running at 14MHz. The graphics are OCS compatible, and I've no doubt the engine is caning the OCS blitter to within an inch of its life, so it looks exactly the same as it would on an A500, but you ain't getting that kind of framerate without more horsies than the A500's 68k can supply.
With that being said, the game runs at 10-12 fps on a stock A500 with 1MB RAM (512K chip + 512K fast, so you need the 1.3 ROMs and the Fatter Agnus), which is still incredibly impressive. For a lot of solid-polygon 3D games back in the day - think F/A-18 Interceptor, F-19 Stealth Fighter, and even games like Starglider 2 - drops to single-digit framerates, if not exactly the norm, were certainly commonplace (I remember F-19 being particularly bad for this). I seem to remember the Amiga version of Elite held a pretty decent framerate but, to be fair, the polygon count was very low and it had a pretty small viewport. Compared with those games, the Grind visuals, at 10-12 fps, on a bog-standard A500 with an absolutely era-appropriate amount of memory, are nothing short of extraordinary.
If, in 1990 - 1993, or even after I'd first played DOOM on a mate's dad's 486SX in summer 1994, you'd shown me Grind running on my Amiga 500 at 10 - 12 FPS I would probably have fainted or jizzed myself or something equally ridiculous. It would have literally blown my mind.
Incredible work.
EDIT: Is it me or is the shotgun sound effect a slightly edited version of the DOOM shotgun sound effect? Not complaining, because it's an awesome sound, but just wondering if others are hearing the same thing I am.