Did a lot of embedded 6809 assembly for satellite ground station equipment in the 1980s. I really enjoyed coding for that platform. Some of the things that set it apart from contemporary 8-bit processors:
Multiple stack registers for complex argument passing.
Internal 16-bit registers.
(Integer) Multiply/Divide.
Sign extension (SEX instruction!)
Years later I did some embedded OS/9 (Not the Apple OS, but the Microware one.) The target was actually a 68k, but the OS had been ported from the original 6809 platform. It was pretty powerful for the day and although not completely POSIX compliant, it was very POSIX-like.
I also wrote a lot of 6809 assembly in the 80s- for broadcast character generators. I wrote a multi-tasking operating system for it at one point- I used the DP register to point to per-task local storage (exactly like errno in multi-tasking UNIX).
My co-workers went nuts with the PC-relative capabilities, but I didn't bother- those addressing modes were pretty slow.
We had a cool piece of hardware: an in-circuit emulator. The 6809 version of this one:
That thing was the bomb, because it had trace memory so you could watch what led up to an interrupt causing your system to crash. Non-symbolic though, you needed the linker map and assembly listings by your side to do anything.
There were some great tools for that platform. I used a different brand ICE, but also an HP1630D logic analyzer with a processor chip-clip and a built-in disassembler so you could see where your code failed. (Worked great for finding hardware problems too.)
So if all you have is a Tek 465 scope, you can make a poor-man's logic analyzer by connecting an oscillator to the reset pin, and trigger off of this pin on the scope. Then use the delayed sweep knob to scan through the bus, one or two bits at a time.
Video Data Systems, a small company in Hauppauge, NY. They made titlers ("Chyrons"), ad machines, and community bulletin board things, oh, and the NY Off Track Betting video stuff.
Oh, I bet you had this experience: hundreds of monitors whining at nearly 15,734 Hz, damaging your young ears...
I worked for Quanta in Salt Lake. They made the Delta character generator, and the Orion, and some others before my time.
Yeah, I've heard that whine. Haven't heard it in years, though. Both monitors and my ears have changed...
Did you ever go to the National Association of Broadcasters trade show? That was loud. Hundreds of monitors, but dozens of booths, each trying to play catchy audio loud to attract attention.
I used to work with a QCG-500 at a local TV station about twenty years ago. I can still pick out the way certain fonts looked at certain sizes on that thing. It eventually got replaced by a Chyron Maxine.
Prosumer and professional character generators are woefully underdocumented. Anyone wanna donate any old functional ones to me? I'll set them up and make a YouTube series about them lol. I'd love to see them eventually be emulated in the way that classic game consoles and home computers have been.
I'm 40 years old and I can still hear flyback. I actually have a Roku Express+ hooked up to a 27" Trinitron. I miss the days of CRTs, except for their weight.
I miss CRTs too, and I have a couple. One, near my work desk, is a lightly used SONY PVM, a high-pitch Trinitron. I love to have movies playing on it when working. Above it, on a shelf, sit a decent VHS deck that I use for capturing tapes and watching movies, and a not-so-special DVD/Blu-ray player that I should hook up via RGB but is running S-video.
Rambling aside, I would enjoy a video series on character generators. Lots of us geeked out on broadcast details.
Personally, I would enjoy any tech info on how the characters were done and what trade-offs got made and why. Some of what was out there looked very good. Other gear, not so much. Bet it's a few minor details that differentiate those outcomes too.
It's been 24 years. I probably don't owe a duty of confidentiality any more. So...
We had descriptions of each character of each font in a scalable, vector form. To draw a character, we'd draw it as a bitmap (1-bit depth) at 4 times the size. Then we would shrink each 4x4 cell down to one 8-bit value (weighting the center more than the edges). That 8-bit value was the alpha (transparency) value for drawing the character over whatever was already there in the frame buffer.
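The supersample-and-shrink step described above can be sketched in Python. The actual center-weighted kernel isn't specified here, so the 4x4 weights below are purely illustrative:

```python
# Sketch of the 4x-supersampled glyph antialiasing described above.
# The glyph is rasterized 1-bit at 4x size; each 4x4 cell collapses
# to one 8-bit alpha value. The exact center-weighted kernel used in
# the original hardware isn't documented here, so this one is a guess.

CENTER_WEIGHTED = [
    [1, 2, 2, 1],
    [2, 4, 4, 2],
    [2, 4, 4, 2],
    [1, 2, 2, 1],
]  # weights sum to 36; center samples count more than edges

def downsample_cell(bits):
    """bits: 4x4 list of 0/1 samples -> one 8-bit alpha value."""
    total = sum(CENTER_WEIGHTED[y][x] * bits[y][x]
                for y in range(4) for x in range(4))
    return (total * 255) // 36  # scale weighted coverage to 0..255

def downsample_glyph(bitmap):
    """bitmap: (4h)x(4w) 1-bit raster -> h x w grid of alpha bytes."""
    h, w = len(bitmap) // 4, len(bitmap[0]) // 4
    return [[downsample_cell([row[4 * cx:4 * cx + 4]
                              for row in bitmap[4 * cy:4 * cy + 4]])
             for cx in range(w)] for cy in range(h)]
```

A fully covered cell yields alpha 255, an empty cell 0, and partially covered cells land in between, which is where the smooth edges come from.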
When drawing (compositing) one thing over another with an alpha channel, we used true Porter & Duff compositing, with pre-multiplied colors. We had a custom ASIC that would do it, so it was fast.
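With premultiplied colors, the Porter & Duff "over" operator reduces to one multiply-add per channel, which is part of why it maps well onto hardware. A minimal Python sketch, with illustrative 8-bit fixed-point rounding (not the ASIC's actual arithmetic):

```python
# Porter & Duff "over" with premultiplied colors, as described above.
# With premultiplied RGB, src over dst is: out = src + dst * (1 - alpha_src)
# per channel, including the alpha channel itself. The +127 rounding
# is one common 8-bit convention, chosen here for illustration.

def over_premul(src, dst):
    """src, dst: (r, g, b, a) tuples, premultiplied 8-bit channels.
    Returns src composited OVER dst."""
    sa = src[3]
    return tuple(s + (d * (255 - sa) + 127) // 255
                 for s, d in zip(src, dst))
```

A fully opaque source replaces the destination; a fully transparent source leaves it untouched; everything in between blends.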
One of the trade offs was the rise time (or number of pixels, same thing) of the edge of a character. Too fast and the NTSC video signal couldn't handle it (and you'd have visually obvious stair-steps on the diagonals). Too slow and the characters looked soft rather than crisp.
At VDS, there was a 68000-based titler called Vidstar or Videostar, something like that. This was before the great DRAM price decline, so every effort was made to reduce memory, including the frame buffer. (On previous products they experimented with bubble memory and even shipped a product using CCD memory.)
So the frame buffer was compressed using run-length coding. The hardware decoded it on the fly when generating the video. Anyway, we could get very high resolution this way with a tiny fast SRAM frame buffer, it looked great.
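The run-length idea can be sketched like this. The (count, value) format below is a guess at the general scheme, not the actual hardware encoding:

```python
# Sketch of run-length coding a scanline, the scheme the titler's
# hardware decoded on the fly while generating video. The (count, value)
# pair format here is illustrative, not the actual hardware format.

def rle_encode(scanline):
    """List of pixel values -> list of (run_length, value) pairs."""
    runs, i = [], 0
    while i < len(scanline):
        j = i
        while j < len(scanline) and scanline[j] == scanline[i]:
            j += 1
        runs.append((j - i, scanline[i]))
        i = j
    return runs

def rle_decode(runs):
    """(run_length, value) pairs -> list of pixel values."""
    out = []
    for count, value in runs:
        out.extend([value] * count)
    return out
```

Titler output (large flat areas of background with occasional character edges) compresses extremely well this way, which is how a tiny fast SRAM could hold a high-resolution frame.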
A problem was that even with the 68000, composing a page from text was quite slow: 10 seconds for a single line, something like that.
One of the products was a teleprompter. To make smooth slow scrolling without complex sampling, we moved the screen by changing the vertical sync position. This is not so easy to do well, because the vertical sync detector in the monitor is pre-charged by the nearest horizontal sync (this is why NTSC has the double-rate serration pulses for interlace).
On the other hand, we did roll and crawl effects using fast 6809 assembly on other products. The font on this type was very crude- either fixed width characters like a terminal or semi-proportional spaced, where the width had to be a multiple of some number of pixels (so still character cells). You could get away with it on CATV bulletin board devices, but not in high quality titlers.
It also explains some of what we would see on various systems.
If, say, the target frame buffer was some multiple of the color clock, it would be 320, 640, maybe as low as 160 pixels. (We had a couple of those and it was painfully obvious.)
That ramp seems like it needs to align with that clock too, or there would be color fringing. Avoiding it entirely is gonna mean soft edges, or strict limits on color combos.
(How many TV newscasts went with blue and white?)
Going sharper would mean a potential color artifact, and/or different ones, depending on the alignment between the color clock and the ramp pixel start?
Say it was a blue fringe. Placing the glyph on an even pixel start might generate a bluish artifact, and a reddish one on an odd pixel. (Assuming 320 pixels for illustration)
Good placement against, say, a blue background, maybe with a gradient to also mask even/odd line effects...
Were there placement limits per glyph to keep these things consistent? Did you guys get down to that detail?
Some of the graphics seen near the end of SD were sweet! Like almost every issue woven into a nice looking, very clever package.
In the late 1970s I was working at Tymshare, and one of my tasks was maintaining the assembler and linker. (I think this was on the PDP-10 but it could have been another machine.)
We wanted a "weak external" feature, somewhat akin to a "weak reference" in modern languages: instead of failing to build if the external symbol was not defined in another object file, it would link OK but leave a null value that you would check at runtime.
The assembly directive for a regular external was EXTERN. I thought of calling the weak external WEXTERN but that looked silly. So I decided to call them Secondary Externals, with the directive being SEXTERN.
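The SEXTERN semantics (the link succeeds, and the caller checks for a null value at runtime) have a loose modern analog in Python's optional symbol lookup. A sketch, with the helper name being my own invention:

```python
# Loose modern analog of the SEXTERN ("secondary external") semantics:
# instead of failing when the symbol is absent, the reference resolves
# to a null value that the caller checks at runtime. The helper name
# weak_import is hypothetical, made up for this illustration.

import importlib

def weak_import(module_name, symbol):
    """Return the named symbol if it can be resolved, else None."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return None                      # "link" failed: null value
    return getattr(module, symbol, None)

# Runtime check, as with a SEXTERN symbol:
sqrt = weak_import("math", "sqrt")
if sqrt is not None:
    print(sqrt(16.0))   # the external was present, safe to call
```

The key property is the same as the assembler feature: an absent symbol is a checkable null rather than a build failure.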
I just finished a project that introduced features for both signal injection and signal extraction. So now we can enjoy both SIN and SEX, sometimes at the same time.
John Draper (Cap'n Crunch) once creepily invited a friend of mine (who was a strapping young lad at the time) into the back of a van to "execute some 1802 instructions".
Ah yes, I knew John well in those days. Not nearly as well as he wanted to know me!
We used to drive around in his VW microbus finding interesting payphones. Then when he was learning to program, I visited him once in a while at his place in Berkeley to help him with his code.
My motivation was that he usually had some decent weed. Then one time he asked if I wanted to "work out". That lasted about two minutes until I found out what he really meant.
Many years later, I ran into John at the Homebrew Reunion. He didn't recognize me at first, so I reintroduced myself and mentioned how we used to hang out and write code.
That is exactly correct. You beat me by 60 seconds: I was writing up my comment and only saw yours after posting. Interesting, I wonder how many neurons I'm wasting on useless 8-bit instruction set trivia.
Yes, and it actually is one of the easiest sets to remember because there are so few exceptions. Everything that you expect to be there is there and works like you expect it to.
I always thought of the 6809 as the Chrysler Cordoba of 8 bit microprocessors, with soft Corinthian Leather upholstery and a luxurious automatic multiply instruction.
"I have much more in this small CPU than great comfort at a most pleasant price. I have great confidence, for which there can be no price. In 6809, I have what I need."
If I recall correctly, that instruction doubled the clock speed of the M6809, usually at the expense of making it go out of sync with the woeful 6847 video chip, putting colorful garbage on the screen.
There is no divide instruction. You had to write a subroutine for that; you could mask it as a macro on a half-decent assembler, but the CPU did not have it natively. It did have an 8x8 multiply.
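What such a divide subroutine did can be sketched as a classic restoring shift-and-subtract division. A Python model of the algorithm, not any particular vendor's routine:

```python
# Sketch of the kind of divide subroutine you wrote by hand on the
# 6809: restoring shift-and-subtract division, here a 16-bit dividend
# by an 8-bit divisor, yielding a 16-bit quotient and 8-bit remainder.

def div16by8(dividend, divisor):
    if divisor == 0:
        raise ZeroDivisionError
    quotient, remainder = 0, 0
    for i in range(15, -1, -1):              # one pass per dividend bit
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        quotient <<= 1
        if remainder >= divisor:             # trial subtract succeeds:
            remainder -= divisor             #   keep it and set the bit
            quotient |= 1
    return quotient, remainder & 0xFF

# div16by8(1000, 7) == (142, 6)
```

On the real chip this is a loop of shifts, compares, and conditional subtracts, so division cost dozens of cycles where MUL cost eleven.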
I haven't tried his CUBIX on his emulator yet, but there's a certain richness to systems that start with a full fledged debugger on the bare metal. And he's got his own assembler, Forth, BASIC, APL, C compiler...
It is really an 8-bit CPU from a bus (external connections) perspective...but internally, there were 16 bit registers (sort of...) Perhaps this was why he made the distinction.
The 8-bit A and B accumulators could be addressed together as a 16-bit D accumulator. The contents of A and B could be multiplied with the product stored in D. There were also two 16-bit index registers with advanced and versatile addressing modes, X and Y, along with 16-bit user and system stack pointers, U and S.
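A tiny Python model of the concatenated D accumulator and the MUL instruction, as described above:

```python
# Tiny model of the 6809's D accumulator: D is not a separate register
# but the concatenation A:B, and MUL multiplies A by B, leaving the
# 16-bit product in D (A = high byte, B = low byte).

class Regs:
    def __init__(self):
        self.a = 0
        self.b = 0

    @property
    def d(self):                      # reading D reassembles A:B
        return (self.a << 8) | self.b

    @d.setter
    def d(self, value):               # writing D splits into A and B
        self.a = (value >> 8) & 0xFF
        self.b = value & 0xFF

    def mul(self):                    # 6809 MUL: D = A * B (unsigned)
        self.d = (self.a * self.b) & 0xFFFF

r = Regs()
r.a, r.b = 200, 100
r.mul()
# r.d == 20000, r.a == 0x4E, r.b == 0x20
```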
I have a copy of that book near me. For a very long time, Moto would send docs out for the mere asking. Free of charge. I got mine as a high schooler. Didn't use the mail though. Instead, I visited a local Moto office and they handed me the ones I wanted and I got to talk chips with the people there for a while.
For a small town kid interested in all this stuff, that made quite an impression. Instant Moto fan.
The 6809 and the 6502 had a "fast register file" page (the "zero page" on the 6502), which on the 6809 could be relocated anywhere in memory. These pages were looked at as an external extension of the register set, and lots of instructions could use the data there as 8- or 16-bit data or pointers.
Not all. The Texas Instruments TMS9900, designed as a single-chip version of the TI 990 minicomputer series, had three internal 16-bit registers but kept its 16 general-purpose registers in RAM.
Thanks for this! What a blast from the past! Look at the ADS! Nice little diversion here.
Sure do miss BYTE. Would be excellent to have a similar publication today. The scope is larger, perhaps too large, but maybe not. One got such great perspective from the pages of BYTE.
I love absolutely everything about this. I get the same feeling of admiration and envy from this that I get from machinists who build tooling and fixtures that they then use to make parts. This system feels like a serious tool the author built to do real work.
The author's simulator runs fine on a 32-bit Windows machine. I would expect it to run under DOSBox just fine.
The simulator's text user interface is also superb. Getting the machine running with the full OS was very straightforward (run "D6809.COM W=CUBIX.DDI" to start with the OS disk mounted read/write).
For what it's worth, the source code of OmegaSoft Pascal, a Pascal cross-compiler with linker/assembler/... for OS9/68000 targeting the 6809, is available: http://pascal-central.com/omega.html
This reminds me of a brief fling that I had with a SuperPET back in the 80s. [1] 96KB RAM and an enormous ROM (48K!) with all the Waterloo languages (microAPL, microFORTRAN, microBASIC, microPASCAL, microCOBOL) in it!
I never really wrote any ASM for it though, but it inspired me to write a p-code based runtime that had 16 bit registers for the 6502.
I wrote a FORTH for the SuperPET (sold as a Waterloo "Micro Mainframe" here).
The 96KB were done in a rather weird way, because there were really only 32KB of contiguous address space available once you accounted for ROM, and both the 6502 and 6809 had a 64KB address space anyway. So the SuperPET had 32KB of contiguous RAM in the lower half of the address space, and 64KB of paged RAM occupying a 4KB address range somewhere in the upper half of the address space. The code for the Waterloo languages typically ran in paged RAM, split into 4KB segments, and when it needed to jump from one page to another, it used a trampoline routine to switch the page and jump back.
For FORTH, I used a simpler scheme: I ran in contiguous RAM and used the paged RAM for the FORTH explicit virtual memory system, because there you only needed to guarantee access to one page at a time so it mapped onto the hardware without much sweat.
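The paging scheme described above can be modeled roughly like this. The window address is made up for illustration; the real SuperPET's layout details differ:

```python
# Toy model of the SuperPET-style memory layout described above:
# 32KB contiguous RAM in the lower half of the address space, plus
# 64KB of paged RAM visible through a 4KB window in the upper half.
# The window base address here is invented for illustration.

WINDOW_BASE = 0x9000          # hypothetical location of the 4KB window
WINDOW_SIZE = 0x1000

class SuperPETRam:
    def __init__(self):
        self.low_ram = bytearray(0x8000)      # 32KB contiguous
        self.paged = bytearray(0x10000)       # 64KB behind the window
        self.page = 0                         # 16 selectable 4KB pages

    def read(self, addr):
        if addr < 0x8000:
            return self.low_ram[addr]
        if WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
            return self.paged[self.page * WINDOW_SIZE + (addr - WINDOW_BASE)]
        return 0xFF                           # ROM etc., not modeled

    def write(self, addr, value):
        if addr < 0x8000:
            self.low_ram[addr] = value & 0xFF
        elif WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
            self.paged[self.page * WINDOW_SIZE + (addr - WINDOW_BASE)] = value & 0xFF

ram = SuperPETRam()
ram.page = 3
ram.write(WINDOW_BASE, 0x42)   # lands in page 3 of the paged RAM
ram.page = 0                   # same address now shows page 0 instead
```

This makes the trampoline problem obvious: code in one page literally disappears from the address space when you select another, so a cross-page jump has to run from somewhere that stays mapped.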
How do the 6809 features compare to ARM and RISC-V? If someone wanted to play with assembly, would it be nicer on contemporary computers, or is there something about 6809 that is still an advantage? Maybe now just the simplicity could be helpful?
ARM32 has the same "feel" to me as the 6502 and 6809 do. If you wanted to have a similar experience as those of us who cut our teeth on those systems, but on a more modern platform, I think you'd get it from ARM32. Really the most noticeable differences are, you won't have to waste cycles (literally) writing out 16- and 32-bit arithmetic carry code all the time, and you get registers instead of the zero-page. (Though the experience of writing carry code is still there, if you want to do any 64-bit arithmetic.)
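What "writing out the carry code" means in practice can be sketched in Python, modeling the byte-at-a-time add that an 8-bit CPU forces on you:

```python
# On an 8-bit CPU, a 16-bit add is two 8-bit adds with the carry from
# the low byte fed into the high byte (CLC/ADC/ADC on the 6502; the
# 6809's ADDD does it in one instruction). This models that dance.

def add16_8bit_style(x_lo, x_hi, y_lo, y_hi):
    lo = x_lo + y_lo                        # low-byte add
    carry = lo >> 8                         # carry flag out of the low byte
    lo &= 0xFF
    hi = (x_hi + y_hi + carry) & 0xFF       # high-byte add-with-carry
    return lo, hi

# 0x01FF + 0x0001 = 0x0200: the carry ripples into the high byte
lo, hi = add16_8bit_style(0xFF, 0x01, 0x01, 0x00)
# lo == 0x00, hi == 0x02
```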
The historical reason for this is that ARM32 was directly modeled after the 6502, which in turn was based on the 6800, as was the 6809.
Also: 6809 has PSH and PUL that take a register list. ARM's LDM/STM is a direct copy of this feature (via 68000's MOVEM). Other RISC processors did not have a list.
The conditional branch instructions (and set of flags) for all of these are from the PDP-11.
ARM32? I would disagree. I've had occasion to do reverse engineering and hand-assembly of various bits of ARM32 Thumb code recently. It brought back all the feels of hand-assembling 6502 on my Apple //e in 1994. Moreso even than AVR assembly. The ARM32, 6809, and 6502 are all cut from the same cloth. Same type of vehicle, just more cylinders.
ARM64 and RISC-V I would agree are very different beasts. One of the key differences I feel, is that caching and pipelining become much more important on these systems. Most (not all) ARM32 systems have very simple and short pipelines, and no cache. E.g. Cortex-M3 and -M4.
I am curious to read other opinions. That said, simplicity has two axes:
The instruction set and overall CPU complexity, and the hardware it's running on.
In my view, playing with assembly is best done while also either playing with hardware, or really understanding and having access to said hardware at the "raw metal" level. The distinction is between pure computation, which is not too different an experience across platforms, and making things actually happen, which can be a wildly different experience!
The simplicity, again in my view, of 8 bit type hardware is very helpful. It's not so large and or complex. People can understand it reasonably well, and with some degree of effort, make things happen.
Back when I was learning assembly language, it was on these kinds of systems. Sometimes the hardware played a lesser role, say a machine that has serial i/o for use with a terminal. Other times, the hardware was far more prominent, like with home computers of the time.
With the terminal, one would typically set up the hardware for serial comms, and once that was all done, basically stuff values from a buffer into the designated address, or capture them into a buffer as they arrived.
On a machine with built in display, the display was typically memory mapped in some fashion. Writing to a specific address in RAM would cause a character to appear on the display screen.
It was common to have a memory map of the computer handy when writing assembly language programs. Put a number here, or set a bit there, and things would happen, like maybe click a speaker, or change the color of the screen, turn a motor on, or make a sound.
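The "put a number here and things happen" model can be sketched as a toy memory-mapped bus. The addresses below are invented for illustration, not taken from any real machine's memory map:

```python
# Toy model of memory-mapped I/O: writes to ordinary RAM just store
# data, while writes to device addresses trigger side effects, the
# way a real machine's memory map routes them. Addresses are made up.

SCREEN_BASE = 0x0400      # hypothetical 40x25 text-screen region
SPEAKER_ADDR = 0xC030     # hypothetical speaker-toggle address

class Bus:
    def __init__(self):
        self.ram = bytearray(0x10000)
        self.speaker_state = 0
        self.screen = []          # (offset, char) writes that hit the display

    def write(self, addr, value):
        self.ram[addr] = value & 0xFF
        if SCREEN_BASE <= addr < SCREEN_BASE + 40 * 25:
            self.screen.append((addr - SCREEN_BASE, chr(value & 0xFF)))
        elif addr == SPEAKER_ADDR:
            self.speaker_state ^= 1     # any access toggles the speaker

bus = Bus()
bus.write(SCREEN_BASE, ord('A'))   # a character appears on the display
bus.write(SPEAKER_ADDR, 0)         # speaker pops out
bus.write(SPEAKER_ADDR, 0)         # ...and back in
```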
Have you looked at the 6502 hardware tutorial kits Ben Eater is doing?
(I mentioned up thread the idea of doing these with the 6809, which is tempting me badly right now...)
They are fantastic! Assembly language + hardware functionality = your program.
And this is the embedded view on things too. Having everything simple and small generally means being able to make good progress and learn a lot without it being too heavy of a lift. And depending on what your interests are, frankly you might be able to do more than you would expect.
Should your interests lie in this direction, a 6809 is an awful lot of fun to program. It's one of the best 8 bit experiences, IMHO. Be warned though! It can be an addiction, and once it starts... you may find yourself with a scope, spiffy new tools, and the need for some space to work, and, and, and! Mine has been life long, first triggered by writing a line of BASIC, pressing carriage return (enter key these days), and seeing a light bulb turn on. :D Soon after that, it was a speaker click, then a loop to make sounds, then one with parameters to make different sounds, and, and, and... There you go.
But, it's not necessary to fixate on this chip either. It's a beautiful CPU, no doubt. But others are also fun, and they have their advantages and there is a wide variety of hardware out there to program in and with.
If this kind of thing does not appeal in some way, then yes! By all means explore assembly language on something newer, bigger and faster.
Or maybe it does, and/or you don't know yet.
I would suggest emulation. We've got pretty great emulation for all the old machines.
Many of us, who enjoy these older CPUs, work with emulators, because we've got access to modern tools and such that make life a whole lot easier.
While there is something to be said for an authentic experience, to be frank? It's a lot of work, and kind of painful most of the time.
A good emulation smooths away a lot of pain points and "nicer" is a lot less important. Use an older CPU, newer one, freely. And you can save states and all the other good stuff that makes learning less painful and more fun and fruitful.
Right now I've got a little 6502 project in progress for an Apple //e type computer. It's being done in an emulator, and from time to time I move it all onto the real computer just to see it all go and interact with it for real, and or check it out at different clock speeds, or just show off.
But the majority of the time developing is on emulation. I can fire it up, make a little progress, or explore an idea, and then save everything for the next step later. In this way, it's easy to jump in for an hour, have some fun, and get back out again when that hour isn't costly in some way.
Probably more here than you wanted. Hope it helps.
I appreciate the comprehensive answer. I have done some experiments with assembly on the C64 and on the ZX Spectrum, and also many years ago with the 8086. And one or two other things. Mostly pretty simple. I grew up on the CoCo 2 when I was maybe 9 or 10, but never got into assembly on it. Maybe someday I can check that out. I have heard a lot of people say how great the 6809 is for that.
I haven't ever really tried to dig into a lot of details on assembly with the contemporary processors though. I know there are quite a lot of extra capabilities, for example various things for efficient vector math and virtualization, but don't know a lot of details. So I was just wondering what people thought about doing assembly on the latest CPUs compared to the old ones. But I think as you point out, a lot of that depends on what hardware you are interfacing with and how and more generally the application.
I think you are right to point out that the details and capabilities of the hardware interfaces are going to make a huge difference. That is something attractive to me about assembly programming, especially on older systems designed to be interfaced with assembly: hardware capabilities are directly accessible, and you can achieve interesting things straight away just by putting some numbers into memory.
So a few years ago I built this thing called Vintage Simulator https://vintagesimulator.com/ and one of the ideas I had was about directly accessing interesting hardware capabilities. For example, the system provides Lua scripting, and I created a way to interface with the RAM on the emulated systems. So I started working on the idea of a virtual 3D spaceship controlled by an onboard C64. I was making it so that I would put things like the elevation of the ship into memory with Lua, and then with C64 assembly I made a readout on the screen. I was planning on making it so the assembly program could control the ship, with the Lua program monitoring special memory locations.
Also along those lines, I was dreaming about making a virtual 3D holographic vector display. It could maybe monitor certain memory locations or something, so that to display into the 3D environment from the emulated 8-bit computer, you would just write to memory. Or maybe write to some kind of output system if that made more sense.
That is an assembly language game with surprising depth. People who play it seriously will learn quite a lot.
Using older hardware in tandem with current hardware is a lot of fun. The terminal, for example, still does exactly what it was designed to do and is still useful today. A C64 has a user port for this kind of thing. Apple 2 computers saw all manner of expansion cards capable of data capture, control, computation... making one's own card with a PIA was a standard type project, just as connecting a circuit to the user port was. BASIC was often enough for many basic controls, say regulating temperature, or turning things off and on based on time or other data states; assembly language brought more of the speed possible, and with more speed, we get more possibilities.
Take one bit wired to a speaker. That is what the Apple and IBM PC shipped with to make sound. The PC had a spiffy timer to bang out notes, and the Apple had nothing but a toggle. Speaker on, or off.
Here is where the fun part was back in the day:
At first, it seems useless. Access an address and the speaker pops out. Do it again, and the speaker pops in. One can type this right into the computer with a BASIC POKE statement, or the monitor by address and watch the speaker change states.
A loop gets one some clicks rapidly, or maybe a buzz, depending. But in assembly language a lot more is on the table! Suddenly one realizes it is possible to turn the speaker on or off long before it has reached the other state. There are lots of other realizations, such as what a read-modify-write instruction does compared to one that does a single read or write.
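The speaker-toggle timing can be sketched like this. The clock figure is an assumption (roughly an Apple II's 1.023 MHz), purely for illustration:

```python
# The one-bit speaker trick: toggle at the right rate and you get a
# pitch. At a given CPU clock, the delay-loop length between toggles
# sets the frequency. The 1.023 MHz figure is assumed here (roughly
# an Apple II), purely for illustration.

CPU_HZ = 1_023_000

def half_cycle_delay(freq_hz):
    """CPU cycles to burn between speaker toggles for a given pitch."""
    return CPU_HZ // (2 * freq_hz)

def square_wave(freq_hz, n_toggles):
    """Simulate the toggle loop: the speaker state after each access."""
    state, states = 0, []
    for _ in range(n_toggles):
        state ^= 1                # one access to the toggle address
        states.append(state)
        # ...burn half_cycle_delay(freq_hz) cycles here on real hardware...
    return states

# A 440 Hz tone needs about 1_023_000 // 880 == 1162 cycles per half-cycle
```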
There are any number of simple, seemingly shallow examples out there, racing the beam, making sounds, cramming more onto a floppy, and so on.
But what happens when simple hardware is driven by software gets right at the heart of assembly language, in my view. Hardware may be designed to do something, and if designed reasonably, will do that thing.
But where it may be simple, or over-engineered, or has bugs, whatever, all of that suggests more, and as time passed we saw people get far more out of a lot of hardware than intended. The demoscene comes to mind here.
Maybe four thoughts! You should definitely take a look at this guy:
Here is the assembly programmer mindset modeled well. People generally approach it wanting to get more out of their hardware, or understand what is really happening better, or to help them with subtle program errors.
Some do it for fun too.
This work suggests the ideas evolved during early computing, hacking on hardware, exploiting bugs, design choices, and more continue to play out on modern processors and systems just fine.
I remember seeing an ad for an Apple computer, possibly from the 80s, that was sold as portable, but putting together the whole package resulted in a huge cube. Does anybody know the model number, or have a link to the ad?
A friend and I have discussed doing something similar (i.e., a luggable system in a wooden box) with an rPI and a commodity 1080p display. We were thinking more of a campaign furniture aesthetic though.
An old Kaypro case would be heavy, but a pretty easy retrofit. You could fit a nice LCD in the spot where the 9 inch diagonal CRT goes. You could also have someone "chop" the depth of the case pretty cheaply at a metal shop.
What a nice build, though it does look like you might get stabbed in the wrist by the latches when typing on it. Good reason to wear a sweatshirt while using it.