From the beginning, the PC wasn't defined by a single hardware implementation, but by a set of capabilities and interfaces. There were always "compatible" PC knockoffs and "compatible" expansion cards that weren't really 100% bug-for-bug compatible. That meant software writers had to stick pretty closely to the approved interfaces. There was no sense trying to push the hardware beyond its intended capabilities, because even if you succeeded it probably wouldn't work on the bulk of the machines out there with different variations of the hardware.
It took a long time even to see things like scrolling games on EGA cards, even though the scroll registers were designed right into the chip and documented as such. Why? Because there were no BIOS calls to drive them, and most developers took the limits of the BIOS as the limits of what they could count on working. It took risk-taking shareware developers (Carmack and Romero) to prove to everyone that the EGA chip-level interface really was a usable target.
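As a rough illustration of what targeting the chip-level interface meant (a sketch, not Carmack's actual code, and assuming a Borland-style 16-bit DOS compiler with outportb()/inportb()): EGA smooth scrolling comes down to two documented registers the BIOS never wraps, the CRTC start address for coarse scrolling and the attribute controller's pel-panning register for the sub-byte remainder.

    #include <dos.h>   /* Borland-style outportb()/inportb() assumed */

    #define CRTC_INDEX   0x3D4  /* CRTC address port on color adapters */
    #define CRTC_DATA    0x3D5
    #define INPUT_STATUS 0x3DA  /* reading it resets the attribute flip-flop */
    #define ATTR_CTRL    0x3C0  /* attribute controller index/data port */

    /* Coarse scroll: start display refresh at a new byte offset in video
       RAM (8 pixels per byte in the 16-color planar modes). */
    void set_start_address(unsigned int offset)
    {
        outportb(CRTC_INDEX, 0x0C);          /* start address high */
        outportb(CRTC_DATA, offset >> 8);
        outportb(CRTC_INDEX, 0x0D);          /* start address low */
        outportb(CRTC_DATA, offset & 0xFF);
    }

    /* Fine scroll: shift the picture the remaining 0-7 pixels. */
    void set_pel_pan(unsigned char pixels)
    {
        inportb(INPUT_STATUS);               /* reset index/data flip-flop */
        outportb(ATTR_CTRL, 0x13 | 0x20);    /* index 0x13, keep video enabled */
        outportb(ATTR_CTRL, pixels & 0x07);
    }

Update both once per frame (ideally during vertical retrace) and the whole screen pans pixel-smoothly without redrawing it, which is exactly the kind of thing no BIOS call would ever give you.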
So because of the prevalence of the mostly-compatible compatibles, the PC software market had another layer of abstraction built in. Software should work on any machine providing the documented capabilities and interfaces, not just on one exact hardware setup. This allowed for a cycle of compatible upgrades where software could be used across hardware generations, and hardware could upgrade with sufficient backward compatibility.
That upgrade cycle was missing for machines like the Commodore 64, which were defined by a single hardware implementation. For the Commodore, developers always felt free to push beyond the intended limitations of the hardware, confident that if it worked at all it would work everywhere. That fed a vibrant demoscene, but closed off any possibility of a hardware/software upgrade cycle and eventually led to the death of the computer line.
Later, 'Jet' provided some test indication as well. I was always amazed, because CP/M machines were all different more or less by design; it was the sheer quantity of original 5150 PCs that made it worth specializing for that target. I later owned an Epson QX-10 with an "MS-DOS" card, and being merely MS-DOS compatible, it ran hardly any software at all!
In the 80s/90s, computers were very expensive and it was common to buy only exactly what you needed. PCs excelled at this. Everything was an option: video card (MDA, CGA, EGA, VGA), RAM, hard drive, hard drive controller, serial port controller, sound card. If you later needed more capability, you upgraded whatever was necessary.
As computers became cheaper, upgrading began to make less financial sense. Upgrading a component and paying to have it installed came to cost almost as much as a new computer. There was also less left to upgrade: onboard sound, networking, etc. were good enough. The only things left to upgrade were the CPU (which usually required a new motherboard) and the video card. Upgradability, originally a huge selling point, became a niche feature.
As integration continued, size, weight, and power consumption all became more important than upgradability and customization. Hardware-wise, all Android phones and the iPhone are pretty much the same.
Does technology get the credit for the difference from the 80s, or is it some mix of the current balance of geopolitics, interconnected global trade, wage arbitrage that still allows slavery in the supply chain, and free interest rates (in the case of Europe, negative)?
Maybe the end game is biodegradable 'computanium' that is killed and reborn daily, right before it becomes obsolete?
It still feels very strange and disturbing to throw away something less than a decade old, let alone two years.
It is harder with smartphones, but I keep older phones in service as guest map and GPS devices.
In many countries people pay full price for handsets and use pre-paid cards. You don't see them replacing their mobiles every two years.
The server side, seen from the client's perspective: that's effectively what "the cloud" really is.
On the one hand, yes. On the other hand, getting newer Android releases running on, for example, the G1 dev phone is an example of how this ethos has persisted, even on Android.
 Example screenshot: http://s.uvlist.net/l/y2008/04/49443.jpg
Bravo to you all for showing us that old-school hacking is alive and well!
Makes one wonder what kinds of fantastic things we might achieve with today's technology, but probably never will, moving on to build tomorrow's technology instead.
Edit: Found one article that quotes Woz on that: http://www.cultofmac.com/302087/38-years-later-woz-still-thi...
However, the era of CGA composite color graphics was very short-lived. By the time the IBM-PC started to become a popular home computing platform, 16-color Tandy/EGA graphics and RGB monitors were the norm. Any composite color hacks would be rendered obsolete and unusable on these later machines.
Same thing for all the things you see Commodore 64 demos do... by the time you're creating the awesome graphical effect, there's often not a lot left over for the game itself. (Though there are some interesting exceptions... there appear to be some surprisingly high-quality side-scrolling platformers now based on "bad lines"; both the technique and the platformers are shown at https://www.youtube.com/watch?feature=player_detailpage&v=fe... . The entire presentation is fascinating and shows a few other demoscene effects in real programs.)
I think the real reason it wasn't discovered earlier is that most CGA PCs were not connected to composite monitors or TVs (people who could afford the big expensive IBM machine could generally afford a dedicated digital monitor as well). A few games used 160x200x16 composite but even those generally had modes for RGBI monitors as well (which wouldn't work so well with the 500/1K colour modes, though I guess there are the dithering characters 0xb0 and 0xb1 which might have worked). These +HRES modes also suffer from CGA snow, which might have been a deal-breaker.
But also, the reality is that the demoscene didn't even figure any of this out until 2013-ish.
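For concreteness, the 160x200x16 composite mode mentioned above comes down to a single register write the BIOS never offers: set BIOS mode 6 (640x200, 1 bpp), then turn the color burst back on in the CGA mode control register. A minimal sketch, assuming a Borland-style 16-bit DOS compiler with int86() and outportb():

    #include <dos.h>

    #define CGA_MODE_CTRL 0x3D8   /* CGA mode control register */

    /* The BIOS writes 0x1E to 0x3D8 for mode 6 (640x200 graphics, video
       enabled, color burst off). Clearing bit 2 (-> 0x1A) re-enables the
       burst, so on a composite monitor each group of 4 hi-res pixels
       becomes one of 16 artifact colors: an effective 160x200x16 screen. */
    void set_composite_160x200x16(void)
    {
        union REGS r;

        r.x.ax = 0x0006;                /* int 10h, AH=0: set video mode 6 */
        int86(0x10, &r, &r);

        outportb(CGA_MODE_CTRL, 0x1A);  /* color burst back on */
    }

On an RGBI monitor the same screen just looks like 640x200 dither patterns, which is why the games that used this trick also shipped separate RGBI modes.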