Yeah, I know, I can be pretty hard on myself :-) (And I did say it made me cringe a little. I'm mostly pretty proud of what I did back then.)
I learned to hack in a shop called Computerland, in Perth, WA. Immediately after school, I'd march through the heat down into the shop, which of course had great A/C, insert my floppy (the most valuable thing in the schoolbag), and continue typing in code, reading mags, and so on. As long as I was actually doing stuff (i.e. not just playing games), the sales guys were cool with me hanging out, every day for weeks, hacking code.
So BYTE, and all the mags I could buy, pretty much got me booted up. For that I have to say: a) very well done on a great article, b) thank you for showing up here. And now: what have you been up to since?
But professionally speaking, although most Real Time Clock modules do generate interrupts, it's misleading to call this simple circuit a "Real Time Clock": there is no "real time" in it. "Counter" or "timer" would be the better term.
But then how do you measure time since boot? With simple hacks like this, of course. This hack also enabled a crude form of multitasking, a feature much sought after on 8-bit machines back then.
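The time-since-boot part can be sketched in a few lines of C (the real thing would be a short 6502 interrupt handler; all names here are illustrative, not from the article):

```c
#include <assert.h>
#include <stdint.h>

/* The technique: a mains-driven 60 Hz interrupt bumps a counter, and
 * "time since boot" is derived from it. On the Apple II this counter
 * would be a few bytes in zero page, incremented by the ISR. */
static volatile uint32_t ticks;            /* bumped 60 times/second */

void tick_isr(void) { ticks++; }           /* the whole interrupt handler */

uint32_t seconds_since_boot(void) { return ticks / 60; }
```

The appeal is that the "clock" costs essentially one increment per interrupt; everything else is arithmetic done only when someone asks for the time.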
Consistency was key in such clocks, not necessarily accuracy. The drift itself was not important, since it was consistent and could be accommodated in software, as many peripherals of the day indeed required.
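A consistent drift really does reduce to one calibrated constant in software. A sketch (the 59.94 Hz figure is invented for illustration, not a measured value from the article):

```c
#include <assert.h>
#include <stdint.h>

/* If the tick source is measured (once) to run at, say, 59.94 Hz
 * instead of a nominal 60 Hz, software just divides by the measured
 * rate. As long as the drift is consistent, the correction holds. */
static const double MEASURED_HZ = 59.94;   /* hypothetical calibration */

double corrected_seconds(uint32_t raw_ticks) {
    return (double)raw_ticks / MEASURED_HZ;
}
```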
And of course, it also is “fun” when you accidentally do not only pick up the frequency of mains power, but also its voltage in your electronics.
Ah, for the Apple of 1981.
There is no doubt it was expensive, but if you fit their target market, there was a fair probability you would get pretty enamored with the fit and find the budget. There are tradeoffs (some significant) to that approach, and Apple had, and continues to have, a cultural tendency to be pretty purist about it for some products some of the time. It has served them well so far, though I see eerily uncomfortable parallels between the ][ era and today's iPhone era.
He's literally taking the piss out of Apple, because it was expensive even then.
For one thing, it was about as fast as a Commodore 64 disk drive, although that was due more to incompetent engineering on Commodore's part than brilliant engineering on Woz's part. (There was plenty of unpicked low-hanging fruit in Woz's own Disk II system.)
For another, you could connect it to an old telephone hybrid and get a free 1200-baud modem, albeit a nonstandard one. Perfectly adequate for allowing two Apples to talk to each other.
And yes, I don't see why you couldn't rig it up to decode WWV(B) transmissions with a minimal amount of external hardware, especially in conjunction with an interrupt-driven timer like the one described in the article.
Do you have more details about this? Usually when the Disk II is brought up, it's praised for its simplicity and utility.
How he managed to overlook that is one of life's great mysteries, at least to me.
It makes a lot of sense to talk about saving $100 on an RTC board if you only want a clock for some one-off script you're writing. Things can be as fiddly as you want in that case: e.g., load the program from disk to memory, then enable the clock using the little switch you built into it and let it tick while the disk is spun down.
Beyond that, an RTC without battery backup to maintain the time across reboots might not be extremely useful. Given the low precision I'd consider this more like a hardware timer than an RTC, but I'm probably splitting hairs now.
It's still a cool hack nonetheless; it's always refreshing to see how simple and straightforward it is to hack these old machines, both in hardware and software. Can you imagine doing something like that on a modern system? It's doable, but it would take more than three lines of assembly to get it to work correctly.
The nearest things I can think of in terms of modern hacks would be tapping into the SMbus on the motherboard, or the I2C bus exposed on the VGA connector, e.g. https://dave.cheney.net/tag/i2c
I'm surprised there's nothing about Wozniak in this. The guy used to look at circuit boards as a kid and redesign them to use fewer chips... for fun! As a kid!
I discovered this back in the day when I modified a Breakout clone to use keyboard input instead of a paddle (joystick), which I didn't own. The game ran much, much faster, and because I had never run it with a paddle, I didn't realize for quite a while that it wasn't supposed to run that fast; I just thought I was bad at the game.
I'm also very surprised to see that apparently the Apple II didn't have something like this built in already, were all delays implemented as busy loops?
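Largely, yes: the stock technique was a calibrated busy loop, where each iteration costs a known number of CPU cycles (the Apple monitor ROM even shipped a delay routine of this kind, if I recall correctly). A C sketch of the idea, with a made-up calibration constant:

```c
#include <assert.h>
#include <stdint.h>

/* A calibrated busy-wait: burn a known number of iterations, each
 * with a fixed cycle cost. ITERS_PER_MS is a hypothetical constant
 * that would be tuned by hand for the machine's clock speed. */
#define ITERS_PER_MS 250u

uint32_t delay_iterations(uint32_t ms) { return ms * ITERS_PER_MS; }

void busy_delay_ms(uint32_t ms) {
    for (volatile uint32_t i = 0; i < delay_iterations(ms); i++) {
        /* nothing: the loop body itself is the delay */
    }
}
```

The obvious downside, and the reason the paddle/keyboard story above happens, is that the delay is tied to how fast the surrounding code runs, not to wall-clock time.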
There were sound synthesizers that could play chords, all based on toggling a digital output connected to a speaker with the right timing patterns.
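The timing math behind that trick is simple: the speaker line must flip twice per cycle of the desired tone, so the gap between toggles is cpu_hz / (2 × tone_hz) CPU cycles (the Apple II's 6502 ran at roughly 1 MHz). A sketch:

```c
#include <assert.h>
#include <stdint.h>

/* Square-wave tone generation by bit-banging a speaker: flip the
 * output twice per audio cycle, so the spacing between toggles is
 * cpu_hz / (2 * tone_hz) CPU cycles. */
uint32_t cycles_between_toggles(uint32_t cpu_hz, uint32_t tone_hz) {
    return cpu_hz / (2u * tone_hz);
}
```

Chords, as I understand those players, came from interleaving several such toggle schedules in one tight loop.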
And I found a web version that sounds just like it.
I don't think any of the popular 8-bit machines did. It might make more sense if you consider it from the other end - what would you do with an accurate date and time on such a machine?
For a 60Hz timer, just being able to have a delay without needing to count all the instructions in between when programming.
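That convenience looks roughly like this in C (names are illustrative, and the interrupt is simulated by a direct call so the sketch is self-contained):

```c
#include <assert.h>
#include <stdint.h>

/* With a 60 Hz interrupt maintaining a tick counter, a delay is just
 * polling the counter: no cycle counting required, and the delay
 * stays correct no matter what code runs in between. */
static volatile uint32_t jiffies;

void timer_isr(void) { jiffies++; }    /* fired 60 times per second */

void delay_ticks(uint32_t n) {
    uint32_t start = jiffies;
    while (jiffies - start < n)
        timer_isr();   /* stand-in: on real hardware the IRQ fires here */
}
```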
The Commodores had hardware timers in their 6522/6526 chips, on top of which a running clock was maintained. In BASIC it could be accessed via the TI and TI$ variables (numeric and string respectively). Not having timers or VBlank interrupts was unusual. The Apple ][ was an early, hacked-together design by Woz, not at all made for a mass market. Just look at the graphics formats :-/
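For what it's worth, TI counts "jiffies" (1/60 of a second since power-on) and TI$ renders the same count as an HHMMSS string; the conversion is roughly:

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Commodore BASIC's TI counts jiffies (1/60 s since power-on); TI$
 * formats that count as HHMMSS. A sketch of the conversion: */
void jiffies_to_ti_string(uint32_t jiffies, char out[7]) {
    uint32_t secs = jiffies / 60;
    sprintf(out, "%02u%02u%02u",
            (unsigned)(secs / 3600 % 24),   /* hours, wraps at 24 */
            (unsigned)(secs / 60 % 60),     /* minutes */
            (unsigned)(secs % 60));         /* seconds */
}
```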
Also, it would translate ImageWriter output to QuickDraw and you could print to any printer connected to your Mac.