Thinking about storage now…
My favourite drop-in disk hack was probably diskspare¹, which stored 980k on a double-density disk by replacing trackdisk.device².
And awesome libraries like xpk³, which allowed you to add aftermarket compression/encryption support to applications. Looks like people are still writing modules too, including LZMA⁴ support (although I can't begin to imagine how slow that would've been on my Amiga ;).
*  http://www.generationamiga.com/2016/12/23/xcopy-amiga-pirate...
Commodore had a nickel-and-dime marketing plan and made PC clones as well. The Commodore 65 was going to be the next 8-bit C64-type computer after the C128, but it never made it out of the prototype stage.
Mac and DOS/PC tech caught up to the Amiga around 1987-1992, and Commodore could not produce a newer chipset in time to compete.
The Amiga was like the Mac but with true preemptive multitasking, at a third the cost of a similarly powered Mac. Commodore didn't earn a lot of money with the Amiga because they lowballed the price, while Apple won because they highballed the price until Steve Jobs could come back to fix the company. Commodore had no Steve Jobs savior and went out of business as the DOS/PC cut into their sales too.
What they should have done:
kept the Amiga extensible - kept the external bus when cost-optimizing the A500 into the A600. (Instead the A600 was a crippled, slightly incompatible A500 that split the market and made it slightly less interesting to game developers.)
The chipset in the A1200 and A4000 was too little, much too late. The A1200 was a case study in cheapskating and crippling an already anemic CPU. (It came with a disabled L1 cache; if it had only had 64 more kilobytes of RAM, they could have enabled the L1 cache.)
Should have partnered with Sun and made SunOS (Solaris-to-be) for high-end Amigas, more powerful than anything they ever produced. (They basically said "fuck you" to Sun.)
It could also have helped to buy fewer business jets for the CEO, but I think that was more a symptom of what was wrong. If they had made fewer completely idiotic moves, they could easily have afforded a few jets.
I don't think the Zorro bus was as good as ISA or PCI, and the A600 was exactly as you describe. The Amiga needed an add-on adapter for networking, whereas Apple Macs later had it built in or available via a NuBus slot.
Plus, most of the software for the Amiga was video games, which limited the system's reputation to that of a games machine; it needed more business software.
PCI is vastly superior to Zorro III though, in both features and performance. Z3 never achieved the performance given in the specs and had quite a few problems. IIRC Dave Haynie said that he would have used PCI instead of developing Z3 if it had been out at the time.
Due to this fact, piracy on the Amiga had a much larger impact on the platform than on the PC.
The final nail in the Amiga's coffin, of course, was the arrival of 3D: the Amiga couldn't play Doom (at that time), and so it quickly became the inferior gaming machine.
In the pro market, Commodore didn’t even know how to play - they treated their machines like toys. Despite this, for a period, if you wanted to make music you bought Atari, if you wanted to do publishing you bought Apple, but if you wanted to do TV and FX you bought Amiga.
Piracy was really, really not a factor at all.
What you're saying was fairly true by the time of GLQuake and Quake 2 around 1997, but not Doom in 1993.
Yes, you could; processor upgrades (CPU + chipset and, in some cases, RAM, IIRC) existed which plugged into both an expansion slot (to power some of the support hardware) and the processor socket.
The way you made Doom playable was swapping motherboard/CPU/RAM/VGA for 40-66 MHz 486 local-bus parts at the cost of multiple A1200s - not exactly what one would call "extended pretty cheaply".
We had something like that in the PC world too: https://en.wikipedia.org/wiki/2M_(DOS)
It also had a rather interesting feature for log files: you allocated a set size, and any append onto the end of the file that made it longer would discard the part at the beginning of the file, without allocating any new space.
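The behaviour described above amounts to a fixed-size ring-buffer log file. Here is a minimal sketch of that idea (illustrative only - `RingLog` is an invented name, not the actual on-disk implementation):

```python
from collections import deque

class RingLog:
    """A fixed-size 'log file': appends that exceed the allocated
    capacity discard the oldest data instead of growing the file."""

    def __init__(self, capacity: int):
        self.capacity = capacity  # fixed allocation, in bytes
        self.buf = deque()        # stored chunks, oldest first
        self.size = 0

    def append(self, data: bytes):
        self.buf.append(data)
        self.size += len(data)
        # Discard from the beginning until we fit the allocation again.
        while self.size > self.capacity:
            overflow = self.size - self.capacity
            oldest = self.buf[0]
            if len(oldest) <= overflow:
                self.buf.popleft()
                self.size -= len(oldest)
            else:
                self.buf[0] = oldest[overflow:]  # trim the oldest chunk
                self.size -= overflow

    def contents(self) -> bytes:
        return b"".join(self.buf)

log = RingLog(capacity=8)
log.append(b"AAAA")
log.append(b"BBBB")
log.append(b"CC")   # two oldest bytes are discarded to make room
print(log.contents())  # b'AABBBBCC'
```

The appeal on a floppy-based system is that the file's block allocation never changes, so appending can never fail for lack of free space.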
Neat! Thanks for that link.
(Well, it's an FPGA, not an actual taped-out ASIC, but it's still faster than the Motorola 68060. It's a very cool product.)
If you can’t get it, it may as well not exist.
Contact them; it was written for paying customers.
Its unofficial name was "Portable Genera".
You might want to look up what a real Lisp machine is.
Its primary purpose was to "restore" a damaged volume to a state which enabled read-access, allowing you to copy the data to a different volume. At the time most users (myself included) viewed it as a repair tool which should have been able to restore a damaged volume to operational state again. You needed a second floppy disk drive to be able to copy the data to a different disk which was a rare thing to have in the old days. So we settled for what should have been a "fix in place" repair operation, but in reality "Disk Doctor" never left a volume in a better state than it was before, even if the file system structures had been sound to begin with.
The reasons why the "Disk Doctor" was so bad at repairing anything are legion.
For example, if a track could not be read because of a flipped bit or physical damage, "Disk Doctor" would attempt to restore the entire track to a sane state by reformatting it (a physical low-level initialization). "Disk Doctor" used the same buffer for reading and writing data, which meant that when it reformatted a track, it would (as a side effect) write whatever data had last been read back to disk. That data had been read from a different track, and if it happened to contain file system data structures, then "Disk Doctor" would later pick them up and try to make sense of them: it was basically "fuzzing" itself.
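The shared-buffer failure mode is easy to model. This is a hedged toy sketch, not Disk Doctor's actual code (the track model and function names are invented); it shows how reformatting with a reused I/O buffer writes another track's stale contents onto the damaged one:

```python
TRACK_SIZE = 4  # toy track size in bytes

# Toy disk: track 1 is physically damaged and unreadable.
disk = {0: b"ROOT", 1: b"????", 2: b"DATA"}

buf = bytearray(TRACK_SIZE)  # one buffer shared by reads AND writes

def read_track(t: int):
    if t == 1:
        # Simplification: a failed read leaves the buffer holding
        # whatever was read last from a *different* track.
        raise IOError("read error")
    buf[:] = disk[t]

def reformat_track(t: int):
    # Bug: "formatting" the track writes the shared buffer's
    # stale contents back to disk as the track's new data.
    disk[t] = bytes(buf)

read_track(0)            # buffer now holds track 0 (root) data
try:
    read_track(1)        # fails; buffer unchanged
except IOError:
    reformat_track(1)    # track 1 now contains a copy of track 0!

print(disk[1])  # b'ROOT' - stale root data masquerading as track 1
```

If the stale data happens to look like file system metadata, later repair passes will try to interpret it, which is the self-"fuzzing" described above.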
Part of the diagnostic work performed prior to the "repair" was detecting damaged or deleted file system structures, such as files or directories which no longer had a valid parent directory ("orphans"). "Disk Doctor" would gather these and add them to the root directory. It did not check whether files or directories with the same names were already present in the root directory, which could have the effect of corrupting it. You might be able to list the root directory and find several files with the same name stored in it, but accessing them would only ever reach the first entry in the directory entry list.
When trying to recover the "orphaned" directory entries, "Disk Doctor" made no attempt to verify that the files and directories it added to the root directory were structurally sound. You could end up with fragments of files which had been deleted ages ago and whose parts had been partly overwritten since. These broken file data structures could reference data and metadata blocks which would in turn break the disk validation process.
(Jokes aside, this only occurred if track 40 was corrupt, IIRC)