

1K colours on CGA: How it's done - fcambus
http://www.reenigne.org/blog/1k-colours-on-cga-how-its-done/

======
dzdt
The reason it has taken the demoscene so long to pull this off is the same
reason that the PC beat its competitors in the early 1980s.

From the beginning the PC wasn't defined by a single hardware implementation,
but by a set of capabilities and interfaces. There were always "compatible" PC
knockoffs and "compatible" extension cards that weren't really 100% bug-for-
bug compatible. That meant that software writers had to stick pretty much to
the approved interfaces. There was no sense trying to push the hardware beyond
the intended capabilities because even if you succeeded it probably wouldn't
work on the bulk of the machines out there with different variations of the
hardware.

It took a long time even to see things like scrolling games on EGA cards, even
though the scroll registers were designed right into the chip and documented
as such. Why? Because there were no BIOS calls to implement this, and most
developers took the limits of BIOS as the limits of what they could count on
working. It took risk-taking shareware developers (Carmack and Romero) to
prove to everyone that the EGA chip-level interface really was a usable
target.
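
For the curious, the chip-level interface in question really is tiny. A
minimal sketch of the documented EGA scrolling registers (the helper name is
mine), assuming a 16-bit DOS compiler with Turbo C-style outp() in <conio.h>:

    #include <conio.h>

    #define CRTC_INDEX 0x3D4   /* CRTC index port (colour adapters) */
    #define CRTC_DATA  0x3D5   /* CRTC data port */

    /* Point the display start at a byte offset into video memory.
       Stepping the offset once per frame scrolls the screen in
       hardware, with no BIOS call involved. */
    void set_start_address(unsigned int offset)
    {
        outp(CRTC_INDEX, 0x0C);           /* start address high */
        outp(CRTC_DATA,  offset >> 8);
        outp(CRTC_INDEX, 0x0D);           /* start address low  */
        outp(CRTC_DATA,  offset & 0xFF);
    }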

So because of the prevalence of the mostly-compatible compatibles, the PC
software market had another layer of abstraction built in. Software should
work on any machine providing the documented capabilities and interfaces, not
just on one exact hardware setup. This allowed for a cycle of compatible
upgrades where software could be used across hardware generations, and
hardware could upgrade with sufficient backward compatibility.

That upgrade cycle was missing for machines like the Commodore 64, which were
defined by a single hardware implementation. For the Commodore, developers
always felt free to push beyond the intended limitations of the hardware,
confident that if a trick worked at all, it would work everywhere. That fed a
vibrant
demoscene, but closed off any possibility of a hardware/software upgrade cycle
and eventually led to the death of the computer line.

~~~
bluedino
Why didn't this work out for high-end Android handsets? They are in a similar
situation.

~~~
FigBug
By 'this' do you mean why didn't Android completely beat iPhone? Similar
reason to why the Mac is making a comeback. The hardware market has changed.

In the 80s / 90s, computers were very expensive and it was common to only buy
exactly what you needed. PCs excelled at this. Everything was an option: Video
card (MDA, CGA, EGA, VGA), ram, hard drive, hard drive controller, serial port
controller, sound card. If you later needed more capabilities you upgraded
whatever was necessary.

As computers became cheaper, upgrading began to make less financial sense.
Upgrading a component and paying for it to be installed became almost equal to
the cost of a new computer. There was also less to upgrade: onboard sound,
network, etc. were good enough. The only things left to upgrade were the CPU
(which usually required a new motherboard) and the video card. Upgradability,
which was originally a huge selling point, became a niche feature.

As integration continued, size, weight, and power consumption all became more
important than upgradability and customization. Hardware-wise, all Android
phones and the iPhone are pretty much the same.

~~~
AJ007
I wonder how long throwing away 'computers' after a year or two of use will
remain common for us in the first world.

Does technology get the credit for the difference from the 80s, or is it some
mix of the current balance of geopolitics, interconnected global trade, wage
arbitrage that still allows slavery in the supply chain, and near-zero
interest rates (negative, in the case of Europe)?

Maybe the end game is biodegradable 'computanium' that is killed and reborn
daily, right before it becomes obsolete?

It still feels very strange and disturbing to throw away something less than a
decade old, let alone less than two years old.

~~~
bjelkeman-again
I tend to retire my machines to less demanding but still suitable tasks. A Mac Pro
2006 (with a Cinema Display from 2000) is still going strong as a studio music
computer running Logic Pro X (essentially a Hackintosh on Mac hardware, as
Apple stopped supporting them). A MacBook from 2005 drives a TV with streaming
video (although it needs retiring now).

It is harder with smartphones, but I have older phones serving as guest map
and GPS devices.

------
FreakyT
It's impressive they got this many colors out of it! I still remember the
classic CGA Pink+Cyan color scheme[1]; those always seemed like a particularly
garish choice for "default" colors.

[1] Example screenshot:
[http://s.uvlist.net/l/y2008/04/49443.jpg](http://s.uvlist.net/l/y2008/04/49443.jpg)
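
That cyan/magenta/white look is one of only two fixed palettes in CGA's
320x200 graphics mode, toggled by a single bit in the colour-select register.
A minimal sketch of how a program picks it, assuming a 16-bit DOS compiler
with Turbo C-style int86()/outp():

    #include <dos.h>     /* int86(), union REGS */
    #include <conio.h>   /* outp() */

    #define CGA_COLOR_SELECT 0x3D9    /* CGA colour-select register */

    int main(void)
    {
        union REGS r;

        r.x.ax = 0x0004;              /* BIOS: 320x200 4-colour mode */
        int86(0x10, &r, &r);

        /* Bit 5 picks the palette: set = cyan/magenta/white (the
           garish default), clear = green/red/brown. The low nibble
           sets the background/border colour. */
        outp(CGA_COLOR_SELECT, 0x20); /* palette 1, black background */
        return 0;
    }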

~~~
tptacek
I just got a tattoo done in CGA magenta/cyan/green (a composite frame from
Sierra's Black Cauldron). This turns my whole world, or at least arm, upside
down!

~~~
fapjacks
Very cool! Got a link to a pic? I also have some oldschool "pixelated"
tattoos, though in greyscale only.

------
cubano
I am utterly amazed at the amount of hard work and programming mindshare
expended on, let's face it, a very unimportant technology.

Bravo to you all for showing us that old-school hacking is alive and well!

~~~
listic
What boggles my mind is that it took enthusiasts until now to uncover the
potential of the platform, compared to, say, the early 1980s, when people
were paid to produce entertainment software for the IBM PC.

Makes one wonder what kinds of fantastic things we might achieve with today's
technology, but probably never will, as we move on to build tomorrow's
technology instead.

~~~
Zikes
IIRC there was a post by Steve Wozniak here a few months ago where he stated
he still occasionally gets ideas for how he could have improved the code for
the original Apple computers.

Edit: Found one article that quotes Woz on that:
[http://www.cultofmac.com/302087/38-years-later-woz-still-thinks-ways-improve-apple-ii/](http://www.cultofmac.com/302087/38-years-later-woz-still-thinks-ways-improve-apple-ii/)

------
jbuzbee
Pushing those old systems to their limit (and beyond) was a lot of fun and
very instructional for those of us just getting into programming. On my Atari
800, I recall using 6502 assembly to hook the horizontal and vertical blank
interrupts to change the color lookup tables and character definition maps on
the fly, which let me greatly increase the number of colors displayed and do
crude character-set animation.
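
A simplified, polling version of that trick (not jbuzbee's interrupt-driven
6502, just a sketch in C for cc65's Atari target) shows the idea; the
register addresses are the documented Atari 800 ones:

    #include <stdint.h>

    #define VCOUNT (*(volatile uint8_t *)0xD40B) /* ANTIC line counter / 2 */
    #define WSYNC  (*(volatile uint8_t *)0xD40A) /* write: wait for HBLANK */
    #define COLBK  (*(volatile uint8_t *)0xD01A) /* background colour      */

    int main(void)
    {
        uint8_t band;

        for (;;) {                    /* repaint the bands every frame;  */
            while (VCOUNT != 0) ;     /* the OS vertical blank restores  */
                                      /* COLBK from its shadow anyway    */
            for (band = 0; band < 16; ++band) {
                while (VCOUNT < (band + 1) * 6) ; /* ~12 lines per band */
                WSYNC = 0;                  /* land the write in HBLANK */
                COLBK = (band << 4) | 6;    /* new hue, mid luminance   */
            }
        }
        return 0;
    }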

------
userbinator
Reminds me of this demo that uses a single microcontroller to generate a PAL
composite signal, with a few tricks to increase the colours available:

[http://www.linusakesson.net/scene/phasor/](http://www.linusakesson.net/scene/phasor/)

------
tibbon
Strange question: why didn't many games of this era exploit this? I'd guess
that the knowledge just wasn't as easily shared? Seems these systems were
capable of pretty amazing things, but those things were frequently overlooked.

~~~
jerf
If I'm reading correctly between the lines, they are pouring a _ton_ of CPU
into this effect. I wouldn't be surprised if you told me what we saw in the
demo is literally almost all this effect can do, and there's not a lot of
power left over for actually running a game.

Same thing for all the things you see Commodore 64 demos do... by the time
you're creating the awesome graphical effect there's often not a lot left over
for the game itself. (Though there are some interesting exceptions... there
appear to be some surprisingly high-quality side-scrolling platformers now
based on "bad lines"; they are explained, and the platformers shown, at
[https://www.youtube.com/watch?feature=player_detailpage&v=fe1-VVXIEh4#t=2900](https://www.youtube.com/watch?feature=player_detailpage&v=fe1-VVXIEh4#t=2900).
The entire presentation is fascinating and shows a few other demoscene
effects in real programs.)

~~~
ajenner
The CPU usage isn't _that_ bad compared to some of the other things we did in
the demo - with some help from interrupts it could be done with maybe 20% of
CPU. There is also a much easier ~500 colour variant which doesn't take any
CPU time at all once set up.

I think the real reason it wasn't discovered earlier is that most CGA PCs were
not connected to composite monitors or TVs (people who could afford the big
expensive IBM machine could generally afford a dedicated digital monitor as
well). A few games used 160x200x16 composite but even those generally had
modes for RGBI monitors as well (which wouldn't work so well with the 500/1K
colour modes, though I guess there are the dithering characters 0xb0 and 0xb1
which might have worked). These +HRES modes also suffer from CGA snow, which
might have been a deal-breaker.
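
For reference, the 160x200x16 composite mode mentioned above is reachable in
a few lines: BIOS mode 6 is 640x200 with the colour burst disabled, and
re-enabling the burst makes each group of four adjacent pixels blend into one
NTSC artifact colour. A minimal sketch drawing 16 colour bars (not the demo's
1K-colour mode), assuming a 16-bit DOS compiler with Turbo C-style
MK_FP()/int86()/outp():

    #include <dos.h>     /* int86(), MK_FP() */
    #include <conio.h>   /* outp() */

    #define CGA_MODE_CTRL 0x3D8           /* CGA mode-control register */

    int main(void)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
        union REGS r;
        unsigned int row, col;

        r.x.ax = 0x0006;                  /* BIOS: 640x200x2, burst off */
        int86(0x10, &r, &r);

        /* Clear bit 2 (B/W) to re-enable the colour burst: 0x1E is
           the normal mode-6 value, 0x1A the composite variant. */
        outp(CGA_MODE_CTRL, 0x1A);

        /* Mode 6 interleaves scan lines: even lines at 0x0000, odd
           at 0x2000, 80 bytes per line. Each nibble covers 4 pixels,
           i.e. one 160-wide artifact-colour "pixel". */
        for (row = 0; row < 200; ++row) {
            unsigned char far *line =
                vram + (row & 1) * 0x2000 + (row >> 1) * 80;
            for (col = 0; col < 80; ++col) {
                unsigned char c = (unsigned char)(col / 5);
                line[col] = (unsigned char)((c << 4) | c);
            }
        }
        return 0;
    }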

~~~
jerf
Cool, thanks for the details!

------
amelius
Interesting, but it would be nice to see a picture of the resulting palette.

~~~
ajenner
Here you go:
[http://www.reenigne.org/misc/cga_1k_colours.png](http://www.reenigne.org/misc/cga_1k_colours.png)

------
humanarity
The writer of this piece is a genius. Amazing hackery.

------
meneses
Reminded me of the Doom/Keen days

------
gcb0
two pictures of waveforms would be much better than text here ...

~~~
ajenner
I wanted to make some pictures to go with the article, but I didn't have time
(I wanted to get the article out quickly as there was so much curiosity and
speculation about the methods we used). I just tried adding a quick hack to
cga2ntsc to output the waveforms and viewed them in an audio package, but the
result wasn't terribly elucidating. Perhaps I will redo the article with more
pictures at a later date.
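
For anyone who wants to try the same visualization, here is a toy sketch (not
the actual cga2ntsc hack) that writes one idealized NTSC-ish scan line (sync
tip, colour burst, then a luma ramp with chroma riding on it) as raw unsigned
8-bit samples that any audio editor can import:

    #include <stdio.h>
    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define N 910   /* 4 samples per colour cycle, 227.5 cycles per line */

    int main(void)
    {
        FILE *f = fopen("scanline.raw", "wb");
        int i;
        if (!f) return 1;

        for (i = 0; i < N; ++i) {
            double chroma = sin(2.0 * M_PI * i / 4.0); /* carrier at fs/4 */
            double v;
            if (i < 67)                  v = 0.0;                 /* sync tip */
            else if (i >= 76 && i < 112) v = 0.3 + 0.15 * chroma; /* burst    */
            else if (i < 134)            v = 0.3;                 /* blanking */
            else                         /* active: luma ramp plus chroma */
                v = 0.35 + 0.5 * (i - 134) / (N - 134) + 0.1 * chroma;
            fputc((int)(v * 255.0 + 0.5), f);
        }
        fclose(f);
        return 0;
    }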

