Secret colours of the Commodore 64 (aaronbell.com)
398 points by a1r on March 22, 2017 | hide | past | favorite | 95 comments

> Sidebar: A note on frames per second. The European PAL C64 updated at 50fps, whereas the US NTSC systems updated at 60fps

This is because alternating current is 50Hz in PAL countries (e.g. Europe) and 60Hz in NTSC countries (e.g. the Americas). Analogue TVs' vertical refresh rate was synced to the AC frequency for a bunch of practical reasons, which meant gaming consoles had to send signals to the television at either 50Hz or 60Hz.

Result? Many PAL console games actually ran at 5/6ths of NTSC speed. Most notoriously Sonic the Hedgehog, which was noticeably more sluggish (and its soundtrack less vivacious) for a large fraction of the world. More information in this video: https://www.youtube.com/watch?v=cWSIhf8q9Ao
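The arithmetic behind that slowdown, as a quick illustrative sketch (the song length is hypothetical, just to show the effect on frame-locked music):

```python
NTSC_HZ, PAL_HZ = 60, 50

# A game that steps its logic once per vertical refresh runs slower on PAL
# simply because there are fewer refreshes per second.
speed_ratio = PAL_HZ / NTSC_HZ    # 5/6 of intended speed
print(speed_ratio)

# Frame-locked music stretches by the inverse ratio.
ntsc_song_seconds = 120
pal_song_seconds = ntsc_song_seconds * NTSC_HZ / PAL_HZ
print(pal_song_seconds)           # 144.0: a 2-minute tune drags 24s longer
```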

In a similar fashion, some turntables have four rows of dots on their platter and a strobe light that blinks at a frequency determined by your mains AC frequency. When the platter is spinning at a perfect 33 RPM or 45 RPM, the row of dots corresponding to your AC frequency appears to stand still.
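The standstill condition fixes the number of dots in each row: the platter must advance exactly one dot spacing between flashes. A sketch, assuming the lamp fires on both half-cycles of the mains (i.e. at twice the mains frequency, as a neon strobe does):

```python
def strobe_dots(mains_hz, rpm):
    # The row stands still when the platter advances exactly one dot
    # spacing per flash, which fixes the dot count per revolution.
    flashes_per_minute = 2 * mains_hz * 60
    return flashes_per_minute / rpm

print(round(strobe_dots(60, 100/3)))  # 216 dots: 33 1/3 RPM on 60Hz mains
print(round(strobe_dots(50, 100/3)))  # 180 dots: 33 1/3 RPM on 50Hz mains
print(strobe_dots(50, 45))            # ~133.3: not a whole number, so the
                                      # 45 RPM row on a 50Hz disc is approximate
```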


Synchronous clocks do this too... they're driven by a motor that's designed to work in phase with the electrical grid's frequency. This means they rotate at a fixed speed and can be used as a time base for a clock.

From the perspective of the grid operator, however, 50 or 60Hz is not always 50 or 60Hz. A sudden load or a generator tripping offline (to preserve itself) results in a transient slowdown of the frequency of the entire grid. I spent a summer in high school helping out with the analysis of these kinds of disturbances, and there's a distinct pattern to the fluctuation of grid frequency. There are also slight longer term errors in grid frequency, although operators are held to strict standards.

Getting back to clocks, integrating these transient frequency errors over time results in clocks that shift forward and backward relative to real time. This integrated time error is often displayed in grid control rooms, and it is something they deliberately manage to ensure that the 'grid time' is accurate. In practical terms, this means a period of ever so slightly less than nominal frequency is likely to be followed by a period of deliberately induced slightly higher than normal frequency, so that the overall integrated error tends to zero.
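A toy model of that time-error accounting: the clock's error is the time integral of the fractional frequency deviation. All numbers below are made up for illustration.

```python
NOMINAL_HZ = 50.0

def clock_drift_seconds(segments):
    """segments: (actual_hz, duration_s) pairs; returns accumulated error."""
    return sum((hz - NOMINAL_HZ) / NOMINAL_HZ * dur for hz, dur in segments)

under = (49.98, 3600)   # an hour of transient under-frequency
over  = (50.02, 3600)   # an hour of deliberate over-frequency to pay it back

print(clock_drift_seconds([under]))         # about -1.44: clocks fall behind
print(clock_drift_seconds([under, over]))   # ~0: integrated error driven to zero
```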

More details on the time control aspect on page 13 here: http://www.nerc.com/docs/oc/rs/NERC%20Balancing%20and%20Freq...

I previously worked on Unix sysadmin tasks and software development at Statnett SF [1], the transmission system operator (TSO) for the national level of the Norwegian electrical power distribution network, and am currently studying at university to become an electrical engineer. We started learning about transformers, generators and motors in three-phase systems this semester, so that document is of interest to me. I've looked it over briefly and intend to read it in full later, perhaps tomorrow on a two-hour train trip to Oslo.

The electricity sector in Norway relies predominantly on hydroelectricity. In 2008, hydroelectricity generated 141 terawatt-hours (TWh) and accounted for 98.5% of the national electricity demand. [2]

I have been told that the nature of hydroelectric power generator installations makes the act of balancing the power system in Norway quite different from what most other countries are dealing with, but I think the document you linked will be informative to me nonetheless. Besides, understanding power system balancing in general, and not just for Norway, would allow me to work in other countries in the future should I want to. Not that the university is going to teach me things that apply to Norway only, of course, but I hope you understand what I mean.

PS: Statnett has a live view of the Nordic power balance on their website -- http://statnett.no/en/Market-and-operations/

[1]: http://statnett.no/en/About-Statnett/

[2]: https://en.wikipedia.org/wiki/Electricity_sector_in_Norway

Cool... good luck! (If you ever find an intuitive way to think about reactive power, please let me know... I get it in broad strokes, I've been trying to wrap my head around the details for over 20 years, off and on.)

> I have been told that the nature of hydroelectric power generator installations makes the act of balancing the power system in Norway quite different from what most other countries are dealing with

Hydro has both upsides and downsides. On the upside, it can respond easily to changes in demand: it's the difference between opening a wicket gate versus adding fuel/air to boil more water, to make more steam, to apply more torque. Hydro can also store power... sometimes you see what are known as pumped-storage generators producing 'negative' output, meaning they're running as motors to pump water up a hill. When power is needed, they just drain the water back through turbines to generate it.

That said, hydro also has additional constraints on operation that can be imposed by flood control requirements, reservoir levels [1], environmental regulations, etc.

[1]: https://wrrc.arizona.edu/drought-diminishes-hydropower

> for a bunch of practical reasons

Basically makes it cheaper to build. You have a natural frequency there to use, and you don't have to come up with all this additional hardware to smooth out the existing frequency and come up with a new one.

I think there was a flip side to this: though NTSC had the faster refresh, PAL had the higher resolution (more lines). I'm not sure, but I think this may have been a deliberate tradeoff.

It also reduced the effects of mains interference. If you had a 60Hz vertical scan, but there was interference from 50Hz mains (as there often was) then this would cause rapid rolling vertical distortion.

At 50Hz, even if the vertical sync wasn't actually locked to the mains frequency, any mains distortion would roll much, much more slowly, and be less offputting.
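The roll rate is just the beat frequency between the vertical scan rate and the mains. A tiny sketch (the 50.08Hz figure is only an example of a slightly-off vertical rate):

```python
def hum_roll_hz(scan_hz, mains_hz):
    # A hum bar drifts at the beat frequency between vertical scan and mains.
    return abs(scan_hz - mains_hz)

print(hum_roll_hz(60, 50))      # 10.0: the bar races up the screen 10x a second
print(hum_roll_hz(50.08, 50))   # ~0.08: one slow drift every 12.5 seconds or so
```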

You are correct. A black bar at the bottom of the screen is a nostalgic trigger for me! (When playing games which were designed for NTSC resolution on PAL screen. This was a generation later on the Amiga)

With analog TVs, it's the signal source that generates the synchronization signal. For the C64 specifically, like all the home computers and video game systems I know, this is derived from a crystal, not mains A/C. Since you need a pixel clock and horizontal clock at consistent multiples of the vertical draw time, using mains to derive the video sync seems very impractical.

My (basic) understanding is that having the beam move at roughly the same frequency as the A/C current is done to mitigate noise and distortion caused by other appliances, which earlier tubes were much more susceptible to.

You have more lines with PAL, but in practice, since most game devs were either in Japan or the US (with a few notable exceptions), most games were designed for NTSC, and many didn't bother to increase the resolution when porting to PAL. Furthermore, many games refreshed the display at half the video framerate, so you ended up with 25fps on PAL.

So in many cases you ended up with games that ran at 5/6ths the nominal speed and had black bars at the top and bottom of the screen. PAL gaming was pretty crap, but of course at the time I didn't know any better, and I didn't understand English anyway, so it's not like I had a choice...

> Basically makes it cheaper to build. You have a natural frequency there to use, and you don't have to come up with all this additional hardware to smooth out the existing frequency and come up with a new one.

I'm not an expert, but I've read that while that was the original idea, it was never actually implemented, and TVs used independent oscillators.

I recently modded my Master System to have composite out and added a 50/60 switch at the same time. Flipping it in-game feels really strange, as I'm so familiar with the PAL speed, but the NTSC speed is obviously the correct one.

If you want to see how this type of colour magic works in video form, the 8-bit guy does a GREAT video on this:


He shows all the interesting artifacting that makes these extra colours possible.

Here's a great older example of this technique (and a couple others) used on the TRS-80 Color Computer 3 to show "hi-color" bitmaps on a machine with a 16 color palette:


Wow. Thanks! This CoCo is still teaching me. Love that thing.

Reminds me of another piece of magic I used to use during my Amiga demo days. Infinite Bobs. A Bob was a BlitterObject, a graphic rendered by the hardware, not dissimilar to a sprites.

People would try many things to render as many Bobs as possible.


At some point the infinite bob demo appeared (I couldn't find a video for this). Seemingly endless sprites rendered on the screen - all moving, with no slowdown. As a 15 year old programmer learning to code demos in assembly I was very confused, how was it done?

I did eventually (after many hours spent in devpac disassembling code) work it out. They were not rendering infinite bobs, they were rendering one!

You create three screen buffers. On the first buffer, draw the bob at position (x, y); switch to the second buffer and draw at (x+delta, y+delta); switch to the third buffer and draw at (x+2*delta, y+2*delta); then repeat, cycling through the buffers as you move the bob.

If you switch between the buffers on the vsync (50/60hz) then the difference between where you drew the bobs appears like movement. You can draw out infinitely complex patterns and make it look like you're rendering millions of bobs!
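A minimal model of why this looks infinite, sketched below (the "screen" is abstracted to three sets of drawn bob positions; the buffer cycling and one-draw-per-frame are the whole trick):

```python
# With the buffers cycled on vsync, each buffer is redrawn only every third
# frame, so whatever was drawn into it two flips ago is still there.
buffers = [set(), set(), set()]

def vsync_frame(n):
    buf = buffers[n % 3]   # buffer shown (and drawn into) this frame
    buf.add(n)             # draw ONE bob, at the n-th point along its path
    return buf

for n in range(30):
    vsync_frame(n)

# One bob drawn per frame, yet each buffer has accumulated 10 of them; on a
# real screen that reads as a formation of bobs that just keeps growing.
print([len(b) for b in buffers])   # [10, 10, 10]
```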

Happy days.

That Phenomena demo in the youtube video that you linked to is actually a very clever display of the infinite bob trick. It is just so convincing that it is, still, difficult to spot it.

Dear deity that music made me nostalgic.

This overlooks a key detail, namely interlacing. The C64 wasn't producing 60 frames per second, but rather 60 fields: sets of odd or even scanlines. So on the CRT TVs of the time, you'd get one colour on the odd scanlines, and another on the even scanlines. Scanline striping, essentially.

That isn't quite correct. The C64 outputs a progressive-scan image, which is a hack on top of NTSC or PAL. In this configuration, there are only even fields or only odd fields, which double-strike the same area. Thus it is 60 fields per second, and since a single field forms the full frame, it's 60 frames per second as well.

There is no scanline striping; that is strictly an artifact of poor NTSC capture devices (like those in most HDTVs) that naively assume the incoming signal is a 480i image with odd and even fields.

A progressive-scan image emitted as an interlaced image results in, well, an interlaced image. That the scanline pairs are usually identical merely means that it's wasting vertical resolution.

And nonetheless a CRT from the time would show these as different scanlines.

No, this isn't correct. The image isn't "progressive" or "interlaced" until the output stage. Whether or not they are identical is secondary to the timings, which is all that matters. If every frame has an integer number of scanlines (and VSYNC begins during H-blank), it's going to be a progressive scan image.

A "CRT from the time" - the things littered around my apartment that occupy a little too much of my time - can easily illustrate the effect I am describing of a progressive scan image without alternating scanline positions. Go grab almost any '90s game console, or have fun with the Sega Genesis, which supports 240p as well as 480i; Sonic 2 uses 480i to fit two viewports into the two-player mode, but uses 240p during a single-player game.

Well, alternate fields have a 0.5 scanline offset, so that's sort of true. But you definitely don't see them both at once. Are you sure you've ever used a proper CRT TV? ;) - you can get the alternating effect with LCD TVs, but that's because LCD TVs handle analogue input in a different way.

Justifications for this:

0. TV hblank rate is 15,625Hz - at 50 fields/sec, that's 312.5 lines/field, some way short of the 400 you'd need to display 2 fields

1. Out of the 312.5 lines/field, only 288 are booked for the visible area, and most TVs can barely resolve them precisely anyway

2. When the TV is displaying one field in one set of scanlines, and the previous in the other, where does the data for the previous field come from?

2.1. TVs of the period have no buffer

2.2. Computers of the period don't generally have the RAM (or sometimes the hardware) to double buffer

2.3. The image produced is based on more than just the contents of the RAM anyway, increasing the RAM requirements (if you were to try to do this)

2.4. Cheap DRAM of the period doesn't have the bandwidth to scan out two frames' worth at once

Here's a pic of my BBC Micro, with interlaced output, running a program that flashes the screen alternating red and white: http://i.imgur.com/1XvkRso.jpg - you can see that at the top it's scanning out an entirely white frame, and at the bottom there's the end of the previous entirely red frame that's in the process of decaying.

As an example of 2.3 - note that the difference between one frame and the next here is entirely the video registers - the RAM stays exactly the same. Only one of the palette registers changes (and the flashing cursor is added by the hardware as a sort of post process step).

(I suspect the red/green/blue blur at the bottom of the white region is an artefact of my phone's terrible camera. Photos from my (slightly) better camera don't have that, but they do look overall the same. However today only my phone is willing to play ball with my PC.)

Alternate fields have an offset, but only if the horizontal sync timings make it so. If a machine is built to output a progressive image, there aren't really "fields", or you can consider them "all odd" or "all even" fields. I don't know anything about the BBC micro, but the Commodore 64 outputs a progressive image.

It's definitely interlaced output, though in most display modes the content is progressive (i.e., both fields look the same) - which is a bit of a dumb arrangement, really, since you get all of the flicker and none of the benefit.

Fortunately there's a command to switch it off, and no TV seems to mind displaying 50.08Hz output rather than 50Hz...

Are you referring to the BBC Micro, or the Commodore 64?

No, the signal indicates whether a given field is for even or odd scanlines. I used plenty of computers and CRTs around that time and not one incorrectly displayed progressive as interlaced.

To clarify, I'm talking about CRT TVs, not computer monitors. The latter aren't (usually†) interlaced.

†There's some interesting exceptions!

Neither are most CRT TVs, if you keep feeding them the same odd/even field repeatedly. I keep CRT TVs and monitors at home and can easily verify this. The effect is usually dark gaps between the lines, or doubled lines (where the corresponding field would otherwise have been drawn); doubled as in the same as the previous line of the same field, not a line from the previous field. It's not a particularly obscure subject or uncommon trick, so I don't see why you keep insisting it isn't so when any source that could possibly have proven you right will prove you wrong.

I think I misunderstood the responses I got at first, and I couldn't easily find a source. Now that I do understand, it's quite interesting.

I'm talking about televisions and similar. You are still simply wrong. Nearly all of the 8-bit and 16-bit consoles and computers that output to CRT default to non-interlaced, without the TV alternating fields.

Look at other responses in this thread to see how interlacing used to be achieved and the flexibility it had.

Ah, I see now. Your comment about the half-length scanline made me understand.

If the TV displays N fields per second, and the C64 produces N fields per second, then you'll see one at a time. The only effect of the interlace is to shift alternate frames down by 0.5 scanlines. TV phosphor decay is too rapid for anything else to happen. By the time the next field starts the previous one is already gone. (I took some short-exposure shots with my camera and it looks like the phosphor decays from white to black in less than 1/6th of a frame.)

(Well... it's been a long time since I've seen a C64 plugged into a TV, but I don't see how anything else could happen! You definitely don't get an alternating effect from a BBC Micro.)

You are accounting for phosphor decay but not persistence of vision.

If you use an Amiga, which has interlaced graphics modes, you see very easily that persistence of vision is too short to make those images seem stable unless the colours on each pair of lines are very close to each other. It works quite well for photos etc., but is awful for e.g. text or graphics with sharp lines at high contrast.

Try it and see!

I was under the impression that there must have been some additional signaling to make interlacing happen. If for no other reason than the display would need to know which frame was the upper scanline and which was the lower.

Apart from the fact that they would have advertised the increased resolution (albeit flickery). On the Amiga, which did do interlaced modes, you could tell on a screen showing a single colour whether it was interlaced or not.

The difference in signaling is that in an interlaced mode, the final scanline is half-length - Vsync begins halfway through the line, causing the next frame to be offset by half a line vertically.

The monitor doesn't "know", in that case. The interlacing is simply a physical phenomenon. Later digital capture devices must recognize this situation and handle it appropriately. Many assume it's always happening, as progressive-scan 262-line NTSC is out of spec anyway.

It's not additional signaling.

TVs were designed in the 1930s, when electronics were extremely primitive and expensive. You wanted the consumer device to be as simple as possible so it could be within the consumer price range. So TVs were little more than a radio receiver hooked up to a cathode ray tube (CRT).

To drive a CRT you need three signals: X position, Y position and brightness. The dumbest possible design is to have three radio receivers and transmit all three signals over the air. But the extra receivers are expensive, and besides, the X and Y signals are very repetitive, so transmitting them would be a waste of bandwidth.

So two flyback transformers were added to the design of the TV, each generating a sawtooth wave: starting at 0%, the output ramps steadily up to 100% before rapidly snapping back to 0. One runs at the vertical refresh rate to drive the CRT's Y signal, the other at the horizontal refresh rate to drive the X signal. The brightness comes from the radio receiver.

With this design, you just need a way of synchronizing the TV studio's cameras and all the TVs in the area to the same horizontal and vertical refresh rates. You might think: easy, just lock the vertical rate to the mains power frequency and derive the horizontal rate from it (525 lines per frame).

But that frequency-derivation circuitry was way too expensive to put in every TV. Instead, they put a single divider chain in the studio to relate the horizontal and vertical rates, and transmitted a synchronization pulse embedded in the brightness signal. A simple circuit in the TV detects the synchronization pulse and nudges the horizontal flyback transformer to match. A second, longer synchronization pulse is transmitted between fields so the TV can synchronize its vertical flyback transformer.

So a basic Black and White TV is just a radio receiver, two flyback transformers and two synchronization detectors hooked up to a CRT. It doesn't know anything about interlacing or even how many lines there should be in every frame. Back then, a TV studio could theoretically start transmitting at 61 Hz or with a few extra lines per frame and every TV would follow along, right up until the point where the horizontal or vertical refresh rates went out of the spec of the shittest flyback transformers in consumer TVs.

Interlacing is a brilliant hack that is 100% done at the studio end. All they do is pick horizontal and vertical refresh rates that don't divide into each other a whole number of times: 15.750 kHz divided by 60 Hz is 262.5 lines. This means that when the TV's vertical flyback snaps back to zero (putting the CRT's Y position back at the top), every second field, the X position of the beam will be halfway along a line.
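The same arithmetic in a few lines (the "progressive" case nudges the vertical rate so the division comes out whole, as the game consoles described elsewhere in the thread did):

```python
def lines_per_field(hsync_hz, vsync_hz):
    return hsync_hz / vsync_hz

ntsc = lines_per_field(15750, 60)
print(ntsc)                      # 262.5: the half-line makes fields interlace

# Nudge the vertical rate so the division comes out whole:
progressive = lines_per_field(15750, 15750 / 262)
print(round(progressive))        # 262: every field scans the same lines ("240p")
```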

One thing you might have noticed is that the Y position is constantly incrementing; it doesn't step down by one line's worth of Y at the end of each line. This means the TV image is actually tilted very slightly, with the end of each line at almost the same Y position as the start of the next line.

Which means if the field starts halfway through a line, the start of the first full line on that field (and every line after that) will be half a line lower than it was on the previous field.

Other interlacing schemes are theoretically possible, just by picking appropriate horizontal and vertical refresh rates. You could have triple or quadruple interlacing (though I doubt either would be pleasing to look at). But most early game consoles and computers pick horizontal and vertical refresh rates that divide into each other with a whole number of lines, resulting in a progressive display.

The Amiga had the A2024 monitor, and I always thought it was some kind of weird quad interlace for an exotic monitor with long afterglow.

However, the truth is different, but also strange. http://bboah.amiga-resistance.info/cgi-bin/showhardware_en.c...

Apparently, the monitor samples four frames, puts them in 4 frame buffers, then outputs a complete image to the CRT, each frame representing a quadrant on the screen. Must have been very expensive...

Great point.

I wonder how much better the modern version of this effect (as seen in the article) would be if they implemented this.

I once toyed with temporal dithering on VGA Mode X.

While the bullet-point resolution for standard VGA was 320x200, by hitting the hardware registers and paying the price of a rather peculiar pixel-addressing mechanism you could get a lot more (and double buffering to boot).

320x240 was the most common tweaked mode, because it gave you square pixels and page flipping.

At the edge of what monitors could handle there was a 400x300 mode which ran at 87Hz. Flipping two images with this mechanism gives you a 43Hz shimmer, which is almost impossible to pick up on if the two colours are of similar luminance.
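The perceived colour under such a shimmer is roughly the average of the two flipped colours, which is the whole point of temporal dithering. A sketch with hypothetical RGB values:

```python
def perceived(rgb_a, rgb_b):
    # Averaged channel by channel: what the eye integrates at a 43Hz flip.
    return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

dark_teal  = (0, 96, 96)    # made-up colour pair of similar luminance
dark_slate = (32, 64, 96)
print(perceived(dark_teal, dark_slate))   # (16, 80, 96): in neither frame
```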

I never saw this get used for anything, but it would have made an excellent paint program for standard VGA.

I vaguely recall seeing a DOS demoscene demo that filled the 320x200 screen with a letterboxed 256x200 gradient. Then it slammed the actual display frame data into the VGA 256-color palette one line at a time on each horizontal sync. The result was an effective 18-bit color image from a 256-color device. (I think MCGA didn't actually have 24-bit palettes.)

Of course, it meant you had to spend 90% of the frame updating the palette registers and only had the vertical blank time to draw everything into the frame buffer. But the focus of the demo was the idea that high color was at all possible on VGA hardware.
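Back-of-envelope for what that buys you (the line, palette, and DAC figures are the standard VGA ones the comment describes):

```python
# 200 visible lines, a 256-entry palette reloaded per line, and VGA's
# 18-bit DAC (6 bits per channel) as the hard ceiling on distinct colours.
LINES, PALETTE_SIZE, DAC_BITS = 200, 256, 18

max_onscreen_colours = min(LINES * PALETTE_SIZE, 2 ** DAC_BITS)
print(max_onscreen_colours)   # 51200 simultaneous colours, versus 256 normally
```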

I'm pretty sure I've seen this used in the demo scene, but I can't name any productions after all these years. I remember Second Reality had a Mode X plasma, at least.

I recall Ambience by Tran as one of the first (and few) tasteful interlaced productions: https://www.youtube.com/watch?v=xJshv8BjdoM VESA-compatible hicolor and truecolor were about to become commonplace.

I think one of the Assembly party reports ('94?) used it to show true-color photos on VGA.

If using composite video and a PAL TV, you can also blend colors by using alternating horizontal stripes because PAL TVs use a delay line to cancel the chroma information with the line above it. Exactly what color you got would depend on the specifics of your TV's decoding.

Yes. I believe the term was a "raster split". There was a limit to how many colours you could use for your sprites, and using this technique you could effectively double it, as long as your characters were on different horizontal portions of the screen. The Rowland brothers of Apex (Creatures, Mayhem in Monster Land) were renowned for this and other sweet tricks. Remember how there used to be a border? They figured out how to get sprites in there. It's amazing, towards the end of its life, all the ways developers found to squeeze those extra tricks out of the ageing hardware.

The border was "opened" for sprites by the crack group 1001 Crew in the beginning of 1986.

This is very similar to how "grayscale" is achieved on a TI-83 calculator to make games like Zelda and Desolate (https://www.youtube.com/watch?v=5UHqPMxeZnY) possible.

Off topic-ish: There was a trick you could use on the Atari ST to display more than the allowed number of colours - you reloaded the palette data on each horizontal interrupt. Could this have been used on the C64 also?

You could do this on the Amstrad CPC (and many people did, mostly but not exclusively demo coders). The CPC had a palette of 27 possible colours, of which 16, 4, or 2 could be shown at once, depending on screen resolution (160px, 320px, 640px respectively, all x 200px). By reprogramming the palette selection on the fly, using two successive writes to output port &7F00, you could change the palette as the electron beam moved down the screen. You could even do it mid-screen row if your timing was good!

Talking of the Atari ST, there was an excellent example of this colour-switching in an ST software utility called Photochrome, which could use overscan effects, multiple palette switches per line, plus the colour switching technique in this article to greatly increase the number of perceived colours. It's the best example of this colour switching technique I've seen with barely any noticeable flicker (on a CRT-TV anyway). Seeing what was effectively a 'true colour' image coming from a machine that's only supposed to be capable of 16 indexed colours was a very impressive feat.

There was no palette to reload - the C64 (like most 8-bit systems) had a fixed palette in hardware.

But you could interrupt the CPU from the graphics chip based on scan line (just in case that's the part GP was interested in), for instance to do a classical graphics/text screen split or replicate sprites.

"Raster bar" type graphics were a specialty of the Atari 2600 and 800, and the Commodore Amiga(all Jay Miner-led designs), and starting with the 800, they had programmable "display list interrupts" built into their graphics chips which would let you switch between the various graphics modes and scroll, as well as change the palette. This effect is used all over the place to make nice sky and horizon gradients, to make text and objects look shiny, or to do fades.

The C64 missed out on some of this programmability, but had other goodies to compensate (more sophisticated sprite hardware, a really solid default palette).

Absolutely. The C64 had a bunch of things that were global but could be changed per scanline: graphics mode, border color, background color, sprite positions/bitmaps/colors, character offset, etc.

You could actually change them mid-scanline too, but it was tricky to get the timing right, so the exact pixel where the change took effect would be a bit random.

A brilliant talk on using these techniques to draw hi-res colour pictures on the C64: the NUFLI technique is based on combining hi-res mono images with colour sprites underneath AND colour flashing.


Incidentally, this type of temporal dithering is how 18-bit LCDs can approximate 24-bit colour:
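A sketch of that trick, usually called frame-rate control (FRC): alternate between the two nearest 6-bit levels so the temporal average lands on the requested 8-bit value (clamping at the very top level is ignored here for brevity):

```python
def frc_sequence(value8, frames=4):
    """Return `frames` 6-bit levels whose temporal average approximates value8."""
    lo = value8 >> 2       # nearest 6-bit level at or below the target
    frac = value8 & 0b11   # leftover quarters to spread across the frames
    # Show the higher level on `frac` of every 4 frames.
    return [lo + 1 if f < frac else lo for f in range(frames)]

seq = frc_sequence(130)        # 130 -> lo = 32, remainder 2/4
print(seq)                     # [33, 33, 32, 32]
print(sum(seq) / len(seq) * 4) # 130.0: the average recovers the 8-bit value
```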


I remember writing a piece of machine language on a C64 which flipped the screen colours as fast as possible. It was so fast that all you saw were individual horizontal line segments of different colours within every raster line. (I.e., it was faster than the horizontal scan rate of the CRT, which is, what, well over 10 kHz.)

Sheesh, when was that? Probably 1985.

Mid-rasterline color splits, often combined with snakes of sprites, often with border removal effects.

Watch this demo for state-of-the-art in that kind of programming on the C64.


This is cool, but I've never understood why people like the C64 palette. It looked washed out to me.

I guess people grow to like the look of the machines they have fond memories of. (Apple II, Coco, Sinclair)

The Amiga was the first machine to really impress me with its palette.

The Amiga was really the first home computer to have a truly impressive palette. It got the jump on the PC mainstream by a good few years. I don't think it was until SVGA happened that the PCs caught up again...

Well, the Amiga had 4096 colours. The colour palette was of course more flexible.

But if you compare the C64 palette with other 8-bit computers of the time, it looks better to me. And as he explains in the article, it is easier to generate realistic-looking pictures with it than with a more vibrant colour palette.

These kinds of tricks remind me a bit of how the Apple II produced color using the timing of the NTSC signal, or how Thexder tried to produce more colors https://www.youtube.com/watch?v=fBL2rtCdpZ0, or how the demo 8088 MPH produces more colors than CGA composite mode was ever intended to https://www.youtube.com/watch?v=yHXx3orN35Y

These were the "magics" you could do at that time! You could achieve amazing and unexpected results by going beyond common software programming, mixing software with physical effects, for example. This was the reason the author of the article got so impressed. I don't know what kind of today's unconventional programming techniques could achieve such a wow effect! Side note: I was 14 too in 1991... incredible times, as were the mid '80s!

I think this type of thing may be why the C64 has the reputation it does for encouraging hackers. This is the perfect example, to me, of hacking to make something better.

I think a big part of the beauty of the C64 is that there were so many of them, and they were all more or less the same (although the hardware went through many revisions). That meant people quickly ran up against the same boundaries, and identified amazing ways to push past them.

Question: the demos on the website are not smooth in my mobile Chrome browser. Is it even POSSIBLE to get glitch-free per-display-frame updates in a browser? I grew up with the C64; the kind of tearing and frame-rate glitches we see on nearly every platform since makes me sad!

If all the stars align. So, basically, no.

There are too many layers, half of which don't synchronize correctly, and if even one is out of alignment you'll get tearing/stuttering.

The page http://vik.cc/dvik-joyrex/download/105Colors.ppt has an algorithm to convert a 24-bit RGB to interlaced 8-bit on the MSX palette.

Cool article, but the Spectrum really only had 7 colours and black. And at the time I hated the Commodore 64's colour palette (and any non-ZX-Spectrum palette, until I saw the beautiful SAM Coupé's), so I guess it's a matter of preference.

You could use this same palette flicker on the speccy to remove the attribute clash and create new colours, I remember someone demoing a more or less pixel perfect version of Super Mario World, albeit with a black background.

The spectrum was capable of all sorts of odd things that were never intended. http://tarjan.uw.hu/zx_gfx_modes_en.htm

That's very interesting. I'm always impressed at how dedicated some people are to old systems. My Spectrums haven't been touched in 15 years, and here are people still making graphics cards for them.

There's even a new speccy coming out. http://www.specnext.com/

Niceee. I wonder if it has a real Z80 or if it's implemented on an FPGA as mentioned.

As I understand, the FPGA is to interface the Z80 with the raspberry pi that's onboard to include extra graphic modes and HDMI output, but I could well be wrong. There's a lot of unanswered questions about its inner workings, but they have some good looking test boards already etched.

I also preferred the vibrant colors of my CPC to those of a C64.

Sinclair ZX Spectrum had a palette of 15 colors. 7 base colors, plus 7 "bright" versions, plus black (bright black was the same as normal black).

Ah yeah, but they're essentially the same colours, whereas the C64 had 16 unique ones. If it had only used 4 bits for colour then I don't think it would've maintained its earthy palette, because it wouldn't have been useful enough.

Near the bottom, I wonder if high-speed img flipping in JS could be done by absolute-positioning the two images and toggling display:none/display:block (or flipping the z-index above/below) on the upper layer on every animation frame?

I think you'd have better luck with rendering a pair of ImageBitmap to a canvas via requestAnimationFrame.
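The flipping logic itself is trivial; the hard part is that the browser gives no guarantee requestAnimationFrame fires exactly once per display refresh. A minimal sketch of the toggle, where drawA/drawB are hypothetical callbacks that would each blit one ImageBitmap to a canvas:

```javascript
// Sketch: alternate two draw callbacks on successive animation frames.
// drawA/drawB are hypothetical callbacks; in a real page each would
// blit one ImageBitmap to the canvas, e.g. ctx.drawImage(bitmap, 0, 0).
function makeFlipper(drawA, drawB) {
  let showA = true;
  return function frame() {
    (showA ? drawA : drawB)();  // draw whichever image is "up" this frame
    showA = !showA;             // flip for the next frame
  };
}

// In a browser you'd drive it with requestAnimationFrame:
//   const frame = makeFlipper(() => ctx.drawImage(bmpA, 0, 0),
//                             () => ctx.drawImage(bmpB, 0, 0));
//   (function loop() { frame(); requestAnimationFrame(loop); })();
```

Even then, if the compositor drops or duplicates a frame, the two images spend unequal time on screen and the blended colour visibly shifts, which is exactly the stuttering described above.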

Would either be faster on mobile? I feel like a gif would have been better in this case, and I generally dislike the format.

I guess it's all about which 8-bit we have fond memories of but ... the C64 is not known for having a great palette. It looks rather dull and washed out.

The Amstrad CPC probably had the most vibrant palette of all of the 8-bits.

A C64 artist even tried to prove otherwise, but IMHO scored an own goal : http://www.indieretronews.com/2016/02/is-c64-palette-far-sup...

So there is a potentiometer on the C64's board that you can adjust to control the colour intensity. The people on the assembly line were supposed to connect each one to a screen and adjust the pot until it matched a particular intensity, but there were runs of C64s where they just said "fuck that" and turned it all the way up without checking. Early 64s were SUPER saturated.

The fact that we are having a conversation about it looking dull and washed out makes it clear that this unauthorized shortcut was cracked down upon. But it also helps explain why there are so many palettes available in most C64 emulators: people tried to match their memories, or a screen grab, of however their particular C64 was adjusted.

I'd adjusted mine; if I recall correctly it came with the intensity all the way up. I turned it down but didn't turn it down to the point most people's seem to have been set at, judging from most modern emulated screenshots.

Skin tones look way better on the C64 in the examples. Sure, the missing saturated red makes the Ferrari and tulip images pretty bad.

I would think the answer is NO, but (why not ask) is there a way to synchronize painting of the picture with the vertical sync of the screen in modern browsers, to prevent flickering?

That's `requestAnimationFrame` - it will lock to 60Hz if possible, though I believe 60fps is not enough to get rid of the flickering.

As an Australian user of the TRS-80 Color Computer I was always disappointed that the artifacting technique: https://en.wikipedia.org/wiki/Composite_artifact_colors ... didn't work on PAL, so I never got to see all the 'cool' colours promised on game adverts in USA-based magazines!

How did they get a screenshot of the blended colour in the magazine? Wouldn't a screenshot only capture one frame?

I'm wondering why they went with a dragon boss that flipped from one solid color to the other - wouldn't switching between a checkerboard pattern of the two colours have worked better?

Yes but it looks like this wasn't possible. C64 multicolor sprites have 4 colors and it looks like one frame of the dragon is already using 4 colors.

Also, having two separate ditherings of each sprite would use twice the memory per sprite, and that dragon looks to be composed of multiple sprites and possibly multiple frames of animation per sprite. Most games were memory limited, which means the devs had to make several compromises just to get the game to fit at all.

Ah, so a trade-off as always.

I wonder if the demoscene has some nice examples where they are more perfectionist about this.

Reminds me of the Future Crew demo that used similar techniques to emulate "thousands" of colors on a 256 color display. Very unimpressive in YouTube videos. Amazing on a CRT.
