Kinda sad. Our current x86 hardware is filled with closed binaries and firmware that can't be easily replaced. That seems to be why so many _open_ laptops are using ARM.
It, like the C64 (and the rest of the 8-bit gen) that preceded it, had known, fixed hardware.
The PC, and to some degree the ARM SoCs, have a layer of software abstraction sitting on top of mutable hardware.
The fixed nature of the hardware allowed people to poke, prod, and time actions down to the clock cycle.
But all this also meant that even the smallest hardware change would break existing software.
There was nothing like that in the PC world for the first few years of the Amiga's life cycle, and for games there continued to be nothing like that until years after Commodore went bankrupt.
While most Amiga games certainly did hammer the hardware, just as PC games of the era did, the portion of Amiga games that were system-friendly from the start was far greater in the early years than it was on the PC (some notable titles: Cinemaware's King of Chicago, Ports of Call, and a number of Sierra On-Line games).
The bigger problem, apart from the ludicrous mismanagement at Commodore, was that Commodore was too successful at marketing the low-end Amigas. The A500/500+ and then the A600/A1200 were the big sellers, not the higher-end machines. These were price-sensitive buyers unwilling to spend money on graphics cards, because until first-person shooters hit they had very little reason to.
This made graphics cards a far more niche expansion on the Amiga, where they were a luxury, than on the PC, where you increasingly had to buy one to run what you wanted.
It was only when first-person shooters hit that the planar graphics, which had been a tremendous advantage for 2D games, became a limitation overnight, and the Amiga was suddenly playing catch-up.
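To illustrate the mismatch: a first-person shooter renders "chunky" pixels (one value per pixel), but the Amiga's display hardware wants bitplanes, so every frame needs a chunky-to-planar conversion before it can be displayed. Here's a minimal C sketch of that conversion for an 8-pixel run and 3 bitplanes; the function name and layout are illustrative, not taken from any actual Amiga codebase:

```c
#include <stdint.h>
#include <string.h>

/* Illustrative chunky-to-planar conversion: take 8 "chunky" pixels
   (one byte per pixel, as a 3D engine would render them) and scatter
   their bits into 3 bitplane bytes (the planar layout the Amiga's
   chipset displays). Each plane byte holds one bit from each pixel,
   with the most significant bit being the leftmost pixel. */
void chunky_to_planar8(const uint8_t chunky[8], uint8_t planes[3])
{
    memset(planes, 0, 3);
    for (int px = 0; px < 8; px++) {
        for (int pl = 0; pl < 3; pl++) {
            if (chunky[px] & (1u << pl))
                planes[pl] |= (uint8_t)(0x80 >> px);
        }
    }
}
```

Doing this bit-scattering for every pixel, every frame, is exactly the overhead chunky framebuffers avoid.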
The problem wasn't an inability to update the platform, but it was strongly exacerbated by a chronic culture of underinvestment in R&D, a hallmark of Commodore all the way back to their calculator days, which certainly made the updates happen far slower than engineering wanted. Commodore survived as long as it did because their engineers, and those of the companies they bought along the road, pulled off crazy feats when their backs were to the wall. But they were frequently unable to get even minor funding to complete projects, even during periods when the company was pretty much printing money. Commodore went on death marches when they were facing existential threats and were cash-strapped instead, and eventually it caught up with them.
It's a really smart design. Combines a forward-looking architecture with a realistic implementation for the time. It's a shame it died out while x86 survived.
It's sad overall really. The Amiga was a great machine with a fun and easy-to-use/manipulate operating system. Memory protection (beyond weirdnesses like Mungwall etc.) would have been nice.
For such a small team though they did very well. Commodore were desperate for a slice of that DOS/Windows PC pie though and that further drained the coffers :(
If they'd done what Intel did and continued to develop on the same platform, maybe they'd still exist and we'd have more diversity in platforms today. The 68000 was so much nicer to develop for than x86.
It wouldn't be a problem today, as designers have transistors to spare, but it was in the early '90s, when the high-performance market was taken over by the simpler OoO RISCs and by x86, which compared to other CISCs is much simpler, and of which, against expectations, Intel managed to build a competitive OoO implementation in the form of the Pentium Pro.
And now that's dead too. And yeah, I think the engineers at Motorola in the '90s basically saw the RISC writing on the wall, threw up their hands, and decided that was the way to go, customers be damned. Meanwhile, Intel had too much invested in x86 CISC to do that, and so was forced to make it work.
As a former Amiga user, I'm familiar with the Amiga's graphics co-processor(s). I actually forget now how many, I remember Copper and Blitter, but don't recall if those were chips, or functions on a single chip.
So, is this a sort of video adapter that converts the native Amiga video output to HDMI compatible signals, or is it full graphics card that brings Amiga graphics (32/64/4096 color, with acceleration) to modern output resolutions?
(Edit:) On the hardware side, I implemented the Zorro bus protocol in the FPGA (in Verilog) and hooked it up to an SDRAM controller/arbitrator and a DVI/HDMI encoder. There is also a simple blitter in the code.
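For readers unfamiliar with what a blitter does, here's a rough software sketch in C of its core operation, a strided rectangle copy between framebuffers. The actual implementation described above is in Verilog; this function name and signature are hypothetical, for illustration only:

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch of the rectangle-copy a simple blitter performs
   in hardware: copy a w*h pixel block from src to dst, where each
   buffer has its own row stride (bytes per scanline). A hardware
   blitter does this without tying up the CPU. */
void blit_copy(uint8_t *dst, size_t dst_stride,
               const uint8_t *src, size_t src_stride,
               size_t w, size_t h)
{
    for (size_t y = 0; y < h; y++) {
        for (size_t x = 0; x < w; x++)
            dst[y * dst_stride + x] = src[y * src_stride + x];
    }
}
```

The Amiga's own blitter additionally supported boolean minterms on up to three sources, but a plain copy like this captures the basic idea.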
Maybe even emulation of a faster CPU in FPGA (à la the Vampire for the A600)... though it'd probably use the CPU slot instead of Z2/Z3.
Also, is it possible to hijack the function of the OCS/ECS/AGA directly via the Zorro slots (proxying the native gfx output, e.g. with games), or would it require some kind of hardware bridge between the gfx chipset and your card?
On that note, I think the A500 my parents got me back in the day was an oddity. I distinctly recall it having 1MB chip RAM, suggesting it had the ECS inside. But it shipped with Workbench 1.x rather than 2.x.
I can't remember whether you did the 1MB chip config with a dip switch or by cutting a track on the motherboard - I think it may have been the latter, which is probably why I didn't do it.
RAM was mapped at 0x80000-0x100000 from the chipset's point of view. The CPU saw the same data at 0xc00000-0xc80000.
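A small C sketch of that dual mapping, assuming the window is the 512 KB range described above (the function name is made up for illustration):

```c
#include <stdint.h>

/* Illustrative address translation for the mapping described above:
   the same 512 KB of RAM appears at 0x80000-0x100000 to the chipset
   and at 0xC00000-0xC80000 to the CPU. Returns the chipset-side
   address for a CPU-side address, or 0 if outside the window. */
uint32_t cpu_to_chipset(uint32_t cpu_addr)
{
    const uint32_t cpu_base  = 0xC00000u;
    const uint32_t chip_base = 0x080000u;
    const uint32_t size      = 0x080000u; /* 512 KB window */

    if (cpu_addr < cpu_base || cpu_addr >= cpu_base + size)
        return 0; /* outside the mapped window */
    return cpu_addr - cpu_base + chip_base;
}
```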
You could even argue Doom killed Amiga as a gaming machine.
Really the big interfacing issue with a classic Mac is video, because many modern displays can no longer handle their sync frequencies. Either you need an older CRT (and to maintain it) or an older CCFL-backlit LCD (and to maintain it) that can hopefully show an image at a non-native size without upscaling.
With a custom digital video card and DVI/HDMI output you can actually hook up a modern display, just as with the card the OP created for Amiga. But unlike with Amiga, the classic Mac always used a straightforward framebuffer, so there's no software support issue.
Because optical ADB mice are hard to come by? There are a few, but they're rare.
A good USB implementation might also allow you to hook up storage devices, which would be pretty nice.
There are other (non-DIY) USB-to-ADB projects out there taking shape, too.
But I'm also hoping someone starts making NuBus cards for VGA, USB mass storage, etc. It would just take an FPGA or perhaps even an overclocked Teensy.
All you really need are the level shifting buffers just as with the Zorro interface, and enough I/O for the 50-some NuBus signals. That breaks down to 32 address/data lines, a few NuBus control lines, and some lines to control the level shifting buffers.
Edit: I highly doubt that this project would be doable in an iCE40 part (no SERDES, relatively small number of LUTs), but maybe it's an interesting challenge for the reader ;) Maybe with an extra transceiver IC and single bit depth, single Zorro protocol...
They also support Xilinx Spartan 6 which looks like what this board uses.