Motorola M68000 Family Programmer’s Reference Manual (1992) [pdf] (nxp.com)
134 points by doener 11 days ago | 99 comments





68k was the easiest 'language' I ever learned. You had the reference manual, the instruction timings, no middleware, it was awesome and I made my future career from the age of 12 by writing Amiga / ST stuff.

Best thing is - it was me against the machine. If anybody did something that moved more sprites than I did, or did better parallax, it was because they were better than I. No driver incompatibility to blame. I loved programming in 1984, it made me tolerate less shit and produce better code than people who were less fortunate to be born later.

I miss those days :(


Oh my, the memories. I remember when I was fascinated with animated 3d plots of z = f(x,y) functions which I saw on TV, and when the only decently performant way I found to render those plots myself on my Amiga 500 was to write my own compiler that turned math functions into 68K machine code, driven from my C program.

The machine code was the easy part - what cost me so much time was reinventing a parser on my own, without knowing anything about parsing theory or ASTs (I was 16 then).

Today, you type your function inside Google search and you get a rotating 3d plot rendered in real time in js inside the browser... lol.
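For flavor, here is a hedged sketch of what such a tiny expression compiler might emit for z = x*x + y*y (register assignments and the 16-bit fixed-point convention are illustrative assumptions, not the original code):

```asm
; hypothetical compiler output for z = x*x + y*y
; entry: d0 = x, d1 = y (16-bit values); exit: d0 = 32-bit result
        move.w  d0,d2
        muls    d2,d0           ; d0 = x*x (MULS.W yields a full 32-bit product)
        move.w  d1,d2
        muls    d2,d1           ; d1 = y*y
        add.l   d1,d0           ; d0 = x*x + y*y
```

Straight-line sequences like this, emitted per node of the parsed expression tree, are exactly why compiled plots ran so much faster than interpreting the formula per pixel.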


For younger generations, C is a synonym for fast code, yet in those days it was quite common for asm { ... } blocks to make up something like 80% of function bodies.

I had a similar experience with parsing at around your age: I got a compiler development book at the local library, but the T diagrams of the bootstrapping process and the use of automata only made sense years later.

I ended up resorting to common sense and a very lousy parsing algorithm.

However my goal was pretty basic, doing syntax highlighting of Turbo Pascal code for printing.


Yeah, common sense and quite lousy parsing were my solution too. Only years later did I learn how much easier the proper solution would have been - but my local library didn't have any coding books back then :)

> No driver incompatibility to blame.

Actually, at least on the 8 bit machines we had ROM firmware to blame.

I remember some ZX Spectrum issues depended on the actual model one had, which were unofficially documented in the ROM disassembly books.


You could get in trouble if your code relied on specific contents of the ROM, for example using it as a source of deterministic random numbers. Also, installing a Kempston Joystick hardware interface would mess up the default way to set up interrupts, but a couple of years into the Speccy's life everyone knew how to do it safely. Those two are the ones I remember.

I feel ya. The 68k days were glorious indeed. I'm sure things weren't really as good as I remember them, but man the nostalgia pull is real. I agree with the Raspberry Pi people, it's a lot harder to hack on stuff than it used to be.

I completely missed the heyday of microprocessor programming, but I wholeheartedly disagree that hacking on things is harder than it used to be. TI-84 calculators are $100 and let you write Z80 assembly. 68k microprocessors are available for less than $10 from any number of sources, with easily available documentation and 30+ years of experience to draw on, plus freely available plans for whole systems so that you don't actually have to figure anything out yourself. Or you can bypass all that and build your own processor in a circuit simulator, like I did, or learn an HDL and buy a cheap FPGA and run it for real, or use something like Python to control your parallel port with single lines of code connected to a webserver, if you don't mind being silly.

Heck, if I want to, I can even massively hack things, write kernel drivers for Linux OR Windows, do basically whatever I want. Some of them are harder than others, some are more intended than others, and some require more time than others, but the world of hacking is still one of possibility. Pretty much the only locked-down systems are smartphones, which sucks, but I just bought a Raspberry Pi Zero W for 10 dollars, which is incredible compared to the BASIC Stamp 2 microcontroller that cost $120 just a decade ago.

Literally the only difficult (at least IMO) thing for hacking whatever you want is analog circuits. I still massively struggle with that.


Good points. When I think of hackability tho, I think about the devices and "toys" that I already have. Back in the day you just hacked on what you already had. Also, most of the time you were a kid or teenager that couldn't even afford the 10 bucks for the pi, so unless your parents bought one, you were out of luck. I had to take apart my dad's VCR and computers and stuff in order to learn.

But for an older person, you make some great points.


There are still jobs for asm and C/C++ programmers where dependencies on external libraries are not needed. I do understand tho, I miss those days too. My first 68k system was a Sun 3/80.

> I made my future career from the age of 12 by writing Amiga / ST stuff [...] I loved programming in 1984

My first computer was an ST in the late 90s! It was a bit of a franken-model that my dad assembled, basically the best bits of various STs, and I loved it. It was my favorite computer even in that era. I was of a similarly young age but didn't get beyond BASIC back then :). Still, I feel like I learned a lot more than my friends with their more expensive Windows boxes of the time.

I wish I had taken the programming further back then; I ended up rediscovering it later in life instead. I feel like I have some strange kind of false nostalgia for these machines though, since I discovered them almost two decades later :P


I agree, I had a similar experience.

But you didn't have an Amiga or ST in 1984, did you?


You're correct, my years are slightly off. But I am old and forget things!

Back in the 1980s, I took one of these chips and ran four copies of the same OS on it, using some virtual memory and driver magic. The full story can be found at http://jeff-barr.com/2008/01/08/inmsx-running-4-copies-of-an... .

Wow dude, you're amazing! How are you able to dig so far into (extremely technical, difficult!) uncharted territory without getting diverted one way or another - frustration, "too hard", giving up, etc.? ESPECIALLY since you did it all without the galaxies of information and well-trodden paths available today!

There was no Internet to distract him.

I'm always fascinated by old operating systems, especially ones which are different from Unix/Windows/etc.

Do binaries still exist? Documentation? Source code?


Brilliant read, thank you for sharing!

that was a fun read, thank you!

I think one of the greatest things about the M68K today is that it's a relatively powerful processor that's relatively straightforward to program and it's pretty easy to design hardware that uses it.

A single board computer with the m68k is still small enough that you can comprehend the schematics and build it at home (e.g. http://www.kswichit.com/68k/68k.html ), and it's a very valuable learning experience.


I used the 68K back in college for a couple of hardware/software codesign classes. It was a lot of fun and easy to work with. The boards we used were all wire-wrapped - nowhere near as clean as the kit you linked to. Still, it was pretty easy to work with and fairly forgiving (I don't remember any of them burning up despite all of the crazy lab circuits we connected). I still have my Programmer's Reference Manual sitting on my shelf, sadly just collecting dust.

Also the processor the Sega Genesis uses, so you can learn to make your own Genesis game!

Gamehut has a series on doing that

https://youtu.be/PSYhSmXBgIw


Are new M68K chips still produced? I looked a while ago, but couldn't find any.

The 68K evolved into two lines: the Dragonball and, more recently, the ColdFire line. NXP is continuing production of the ColdFire parts. Dragonball evolved into the iMX line by swapping the 68K core for ARM.

https://en.wikipedia.org/wiki/Freescale_DragonBall

https://en.wikipedia.org/wiki/NXP_ColdFire


I've got one of these: http://apollo-core.com/index.htm in an Amiga. It's an insane amount of fun.

"Apollo Core 68080 is the natural and modern evolution of latest 68000 processors. It's 100% code compatible, corrects bugs of 680x0 designs and adds on top most of the cool features which were invented the years after."


Check this out. Apollo core should have a neon sign above it.

Yes. They're known as ColdFire processors now, and they're sold by NXP.

Coldfire is a rather different product, though. The instruction set is similar but not identical, and there's no exposed memory bus. (Most of the parts have internal flash and SRAM, like a modern microcontroller.)

> The instruction set is similar but not identical, and there's no exposed memory bus.

Actually, there are a number of offerings in the ColdFire v2/v3/v4 family with decent performance and external buses, including PCI, 10/100 Ethernet, USB, ATA, and CAN, with DDR memory and a parallel bus interface.

https://www.nxp.com/docs/en/data-sheet/MCF5307BUM.pdf

https://www.nxp.com/docs/en/data-sheet/MCF54418.pdf

https://www.nxp.com/docs/en/data-sheet/MCF54455.pdf


Oh, the modern ones absolutely have other external busses -- but not a standard 68000 memory bus. There's no way to hook up an EEPROM with your program on it, for instance.

I see what you are talking about. While the external bus interface on these ColdFire parts is not a pin-compatible m68k CPU bus, it certainly can access memory devices and uses DMA. From the 5307 manual:

1.3.7.1 External Bus Interface
The bus interface controller transfers information between the ColdFire core or DMA and memory, peripherals, or other devices on the external bus.


I used one of the first generation dev boards, it had external DRAM and ROM.

Check out the Firebee. An Atari ST semi-compatible machine built around the Coldfire:

http://firebee.org/


Barely. The 68SEC000 is still in production, but is NRND (not recommended for new designs).

What about the second-source vendors? Historically I know there were Hitachi ("HD" prefix), Toshiba ("TMP" prefix), and Signetics ("SN" prefix) parts. At one point I saw a modern-looking Atmel data sheet for a TS68C000 (Thomson/STMicro license?), but I don't know how recent that actually was.

The Hitachi line, at least, seems to be explicitly EOL by Renesas. I'm not familiar enough with the other vendors to know who might have picked up their licenses.


I suspect those are all gone too. TS68C000 went EOL in 2010 -- it was produced by Thomson-CSF, which was acquired by e2v.

One interesting avenue to explore might be the Sega Genesis. It's still being produced in Brazil, believe it or not -- and there must be some sort of 68000-compatible processor at the core of those.


I think many were EOLed when the Freescale Sendai fab shut down.

AFAIK, no, at least not for the general market, but they're pretty easy to get on eBay, AliExpress & friends.

NXP still makes a range of ColdFire-based chips, which are based on the M68k.

I'd assume so, since there are still many devices that use them!

I still program the 68K today in K&R C. I recently released a Dropbox client for Tandy XENIX.

http://pski.net/trs-box-xenix/


Back in the 80's, I wrote a letter to their documentation department asking how much certain manuals were (I was a fan). I got a box of manuals in the mail instead of a price list. They sent the references for both the 68000 and the 6809. It was funny when I got to college and one of the EE courses involved programming a 6809. It's a true shame there isn't a 64-bit 68000, as it is such a nicer processor than the x86.

They never charged for manuals, as far as I know. I have a whole bookshelf of 68K and PPC books, datasheets and references.

As a high school kid, I got a ride to a local office to get docs.

Picked up 68k, 6809 and various databooks.

Not only were docs free, but the engineers there started up a good conversation. That I was a kid didn't matter much.


Zilog refused to send me Z80 manuals when I was a teenager. I was pretty sad. I did print out the datasheet but it wasn't the same.

I had the same experience with 6809E as a teenager. I have warm feelings toward Motorola to this day.

The 68K was such a glorious processor to program, especially after coming from the 6502. Hardware multiply and divide? Sign me up! :D

The assembly code looked like poetry.


I actually connected on LinkedIn last year with Darek Mihocka, a guy who I last knew of 30-odd years ago.

He'd made an Atari ST extension called (I think) Quick ST. The ST had awfully slow text output due to many things, such as the interleaved bitmaps and lack of hardware assist. His extension made it much faster. I wrote a competitor that was faster than his. A couple of months later he wrote a new version that obliterated mine.

I literally spent weeks trying to work out how he'd done it, went through the 68k reference manual, nothing.

Then one day it came to me in an actual dream - 12 year old me actually woke up and started typing. It was the 'move.p' instruction. From memory, this instruction split a 16-bit word into 2 bytes, but aligned on a word (16-bit) boundary. I added this and my code then beat his.
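For reference, per the 68000 manual MOVEP transfers the bytes of a register to alternating byte addresses (it was meant for 8-bit peripherals on the 16-bit bus, but that layout happens to map nicely onto byte-interleaved screen formats). A hedged sketch, with addresses and values purely illustrative:

```asm
        move.w  #$1234,d0
        movep.w d0,0(a0)        ; high byte $12 -> (a0), low byte $34 -> (a0+2)
                                ; i.e. the two bytes land two addresses apart
```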

Bear in mind this was before the internet, so no to-and-fro conversation; all I saw was what he did. It absolutely stood me in good stead for the rest of my career: now each time I see a programmer write something completely sloppy, I ask them how it will scale. I really miss those days and wish somebody could bring this back for newer people to learn the basics.


I'm still proud of my shell sort implementation in 10 or 12 68K instructions. I was young, so it felt like a huge achievement. :-) That and helping a couple of demo coders shave an instruction or two from core loops.

These days I yell at people because they build Docker images that take up several Gigabytes...


Yeah. I've yelled at frontend devs recently cos their node.js webserver with hardly anything on it took 1.2GB. For one thing they've got 300MB of Material Design icons there, wtf?!

I love that I don't have to use the vertical blank interval to do processing, but fuck me if we've not gone the other way, programming is lazy nowadays.


I really wish it wasn't so lazy these days. Computers getting faster shouldn't be an excuse to use less and less performant programming practices. But from my experience in the field thus far, this opinion seems to be in the minority.

There was also a great instruction called "movem.l" which moved multiple registers at once to almost anywhere (and vice-versa). It was used mainly to save and restore registers around stack frames, but could be creatively used to move stuff from place to place REAL fast.
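A hedged sketch of both uses (register lists, labels, and the loop counter in d0 are illustrative assumptions):

```asm
; the intended use: saving/restoring registers around a call
        movem.l d2-d7/a2-a6,-(sp)       ; push 11 registers in one instruction
        ; ... body ...
        movem.l (sp)+,d2-d7/a2-a6       ; pop them all back

; the creative use: a fast block copy, 48 bytes per iteration
.copy:  movem.l (a0)+,d1-d7/a2-a6       ; load 12 longwords
        movem.l d1-d7/a2-a6,(a1)        ; store them
        lea     48(a1),a1               ; movem-to-memory has no (An)+ mode
        dbra    d0,.copy
```

The copy loop trades setup cost for raw bus throughput, which is why it shows up in blitters and memory managers of the era.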

Yes! The worst thing about it is that it was at your fingertips the entire time - the reference manual was there and it was exact. The 68000 (not the 68020) didn't have cache so instruction timings were exact, so I had no excuse.

Just to chime in with more 68k nostalgia and trivia: The 68000 didn't have cache - but 68010 did.

Three whole instructions' worth of cache.

"Wha...?" you might say? Tha' heck good is that? The point was that the 3-instruction cache could cache tight loops, like the kind you would use for mem-to-mem copies, which is a terribly common thing to do. I put one in my Amiga 1000 (geez, did I hack that thing...) and got about a 10% overall system performance increase.

It was neat! More lovely details here: http://www.memphisamigagroup.net/diskmags/198803/68010-kit/M...


I remember it as just one instruction plus the jump. Checking up on this now, it looks like the "cache" (actually the IR register and the prefetch register) can only hold one 16-bit instruction and a DBxx instruction.
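That matches the classic 68010 "loop mode" pattern: a one-word instruction followed by a DBcc, both held in the prefetch/IR registers so the loop body never needs refetching from memory. A hedged sketch (label and counter setup are illustrative):

```asm
        move.w  #LEN-1,d0       ; LEN = number of words to copy
.loop:  move.w  (a0)+,(a1)+     ; one-word opcode, eligible for loop mode
        dbra    d0,.loop        ; 68010 re-executes from internal registers,
                                ; skipping instruction fetch cycles
```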

Ok, now that you mention it - that does sound correct. It was a long time ago... ;)

With a tower of MOVEM.L instructions and enough scratch registers you could get within a percent or two of theoretical bus speed for moving aligned data around. The ST used this in spots, and the memory manager and toolbox of the Macintosh used MOVEM extensively.

M68k assembler was so straightforward and beautiful. Easy to read and easy to write. Too bad that the monstrous x86 won out.

I studied the 68000 manual in jr high but didn’t have a thing to program it on. I went from 6502 in High School (I understand but damn this is tedious), to IBM 370 (what do you mean no ASCII), to 6809 (what sorcery is this, oh my precious), to 8088 (why would they do this), to 68000 (a pox on IBM’s house for not picking this).

68K is so nice, and damn do I miss programming that low. The x86 always felt like I had offended someone from Greek myth and was being punished. A cruddy blaster not a lightsaber.


Totally agreed. The first time I saw x86 assembler I almost cried.

My first computer had an Intel 8080 so when I got a 6502 it seemed so backward. How could you do anything with that? Then when the 68000 came along it was wonderful and put both the others to shame. I never understood why IBM didn't use it. I still have a hard copy of a preprint of the 68K manual dated 1979.

From http://yarchive.net/comp/ibm_pc_8088.html

> 68000 was carefully considered. "An excellent architecture chip, it has proven to be a worthy competitor to the Intel-based architecture." There were four major concerns:

> 1) 16 bit data path would require more bus buffers, therefore a more expensive system board.

> 2) more memory chips for a minimum configuration.

> 3) while it had a performance advantage, the 68000 was not as memory efficient.

> 4) Companion and support chips not as well covered as Intel.

> He also felt that the 68000 didn't have as good software and support tools, and the similar register model allowed the porting of 8080 tools to the 8086/8088.

> "In summary the 8088 was selected because it allowed the lowest cost implementation of an architecture that provided a migration path to a larger address space and higher performance implementations. Because it was a unique choice relative to competitive system implementations, IBM could be viewed as a leader, rather than a follower. It had a feasible software migration path that allowed access to the large base of existing 8080 software. The 8088 was a comfortable solution for IBM. Was it the best processor architecture available at the time? Probably not, but history seems to have been kind to the decision."

I don't remember where I read about it, but I remember reading that IBM ditched the 68008 (the 8-bit data bus version, like the 8088) because of some problem with it... its availability, perhaps.


The beauty of the 6502 is page 0, that gets pretty close to having a 256 byte register file. The 6809 one-upped the 6502 in this respect by allowing the 'direct page' to be set anywhere in memory, as long as it is on a 256 byte boundary.

Not only that but easy to disassemble. I created a monitor with a built-in disassembler, in 68K assembler. The single-step bit was a boon for debugging.

I wish the m68k architecture had beat out Intel.

Did anyone else get into ASM coding through the TI-89 (M68K) and TI-83 (Z80)? I had so much fun drawing sprites on graph paper and working on random projects and games I was thinking about.

Sharing them with my friends through the cable that went into the headphone jacks. The fun of seeing somebody at my school I didn't really know getting a kick out of something I made.


I did ASM on the TI-83 when I was in high school. Self-taught from the reference manual. Didn't know how to properly use the stack or write functions in ASM at the time...my code was a mess! Still, was quite a lot of fun with the only danger of having to yank the batteries if you messed up.

By the time I got a TI-89 around 1999, I was in college and just didn't have the time to do anything substantive with programming it.

I'd dust it off, but mine is old enough to just have the I/O port, and don't know that I have any functioning computers with a parallel or serial port for the old TI Link.


There are USB versions of the TI-Link.

Alternatively, a while ago I used Emscripten to build an emulator for the TI Voyage 200 (a late revision of the TI-92). There's some minor glitches, but it's mostly functional:

https://woofle.net/v200/


Anyone else use the hex “assembler” on the TI-83+? After a while you just memorize the hex opcodes for z80 insns...

TI-GCC for my TI-89 was nice...a few years back I tried to dust-off my IMSA-issued 89, and...it didn’t turn on. Sad. Replaced all batteries. Nothing... Then the flex connector on my 83+ started flaking out (the LCD corruption issue plaguing these...).


I did Z80 on the Spectrum.

In Portugal, TI wasn't that relevant; the dream of most students was to have either a Casio FX-850 (later FX-880) or an HP-48.


TI-89 was my gateway drug to 68k. I still have it on my desk actually, tho it's been in a box for the last 15 years.


Oh my god!! I loved MacGolf!! I think that back in the day I used TMON to break the copy protection scheme (for personal use only, of course!) so many good memories. Thanks so much.

Hey, I played that game on our Mac Plus! Or was it our LC II? Hard to remember the details now. :)

In the Air Force, I went to school to learn how to repair and maintain an 18 bit Hewitt-Rand mainframe computer in 1988. It was a four month course taught at Tyndall AFB near Panama City Beach, Florida.

At the time, I was an Amiga fan, and I had this manual. I was totally blown away at how nearly identical the architecture of the M68000 was compared to that mainframe. Imagine my thinking that my Amiga 500 had the equivalent of a room-sized mainframe in that chip!


An elegant weapon for a more civilized age.

Notice the high quality of documentation. Motorolas were some of the best as I remember it.

It was definitely among the very best, but more than 15 years ago I found an error in the E500/Book E reference manual that repeated itself for more than 200 pages. It was only relevant to you if you cared about the actual bit encoding of SPE instructions. Plus there was a random reference to someone named Gary that, obviously, wasn't meant to be there.

Motorola's PowerPC support hotline acknowledged the problem on the same day and published a new version of the manual by the end of the week or the next one. Of course they did, because that's how great they were.


But what became of Gary?

They never acknowledged Gary, but removed him anyway. The original said

"Scalar floating-point instructions operate on single 32-bit operands that reside in the lower 32-bits of the GPRs. These instructions are considered a proper subset of the SPE APU; this subset is referred to as the SPFPU now called the EFPU Gary cares only that it have embedded in it somewhere."

(more recent e500 chips considered the scalar instructions to be in a separate APU)


Good times ... EXCEPT "you cannot move 'word' or 'long-word' sized data in and out of memory if the address is an odd number" .. in other words, a hard fault. This misfeature can come up a lot, and was fixed in later CPUs.

http://mrjester.hapisan.com/04_MC68/Sect01Part06/Index.html
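Concretely, on a plain 68000 a word or long access to an odd address raises an address error exception (vector 3) rather than silently working. A hedged sketch (the label is illustrative):

```asm
        lea     buffer+1,a0     ; an odd address
        move.b  d0,(a0)         ; byte access: fine at any address
        move.w  d1,(a0)         ; word access at an odd address ->
                                ; address error exception on the 68000
                                ; (68020 and later handle it, at the cost
                                ;  of extra bus cycles)
```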


Well, purists would have told you that you had to align your data. Yes, later chips did support misaligned accesses, but that came at a cost: extra bus cycles.

And now compilers will make sure your data structures are aligned. Adding circuitry and compromising performance for lazy programmers is a bad idea. This is one thing RISC-V has carried over and I'm not sure they have justified the decision to do so.

In some cases, packing data structures with no alignment is more important because it means more data fits in the cache, and cache misses are extremely slow in comparison.

Modern x86 is mostly insensitive to data alignment, since it fetches internally in cacheline-sized blocks (I believe it's 64 bytes) and a very wide barrel shifter is used to access the desired bytes. There's a tiny (single-digit cycles) extra time when the element straddles two cachelines (assuming both lines are in cache)


The justification in the RISC-V spec is that it might make porting some software easier. They also say that unaligned loads/stores may be non-atomic and take much longer than aligned accesses -- allowing firmware emulation on simple implementations. The current atomic extension doesn't allow unaligned at all -- though I think there's an extension to allow atomic unaligned accesses in the works.

One could argue that adding circuitry and compromising performance for lazy programmers is the central idea of cpu design nowadays. The ones that assumed smart compilers or assembly programmers fell by the wayside in the market. VLIW cpus, Cell, Itanium etc.

Takes me back down memory lane, to when I was a 13-14 year old middle-schooler trying to read this extremely thick manual with little or no background in logic design, CPUs, etc. Well, I could still program it, or at the very least hack it to show some graphics, implement a sine scroller from scratch, etc. :)

I started with the 6502, but my second computer was an Atari STE and I did a lot of 68k programming. It was great. Then I discovered the book "The Art of Electronics" as a late teen. It had a whole bunch on the 68008 (which is gone from the current edition), which was nice for little embedded projects as it interfaced with the 8-bit world.

https://en.wikipedia.org/wiki/Motorola_68008

Later on I got to do some work on a 68030-series micro. That was pretty nice also. All assembly based.


What is the modern equivalent that kids can use today? Arduino?

I am not qualified to answer, so here I go:

If in the USA: Hack a TI-89 calculator. They are still 68K-based, are unfortunately inescapable bricks tied to mandated Maths curriculum, and have an active hacking community.

I re-learned M68K programming via the Palm Pilot, which has a fun SDK clearly inspired by the Classic Macintosh.

I would say that ARM programming can be as fun as 68K, so perhaps a Raspberry Pi to start. You can get closer to the glory days with Cortex-M Arduino-like toys such as the Teensy.

I haven’t done any assembly-level Arduino programming, so I don’t know what that feels like versus 68K. Lots of microcontrollers grew upwards out of the then-popular 8051, which does not feel the same to me.

I got started on the Motorola 6809 CPU and the TRS-80 Color Computer. You could almost call it an 8-bit precursor to the M68K; it is limited, but a very nice design.


Yep, here is a possible starting point with ARM Assembly on the PI.

https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/os/i...


POKE 65495, 0

:-)


Oh, the memories. Being able to predict the exact instructions the compiler would generate. Patching in memory using NOP instructions. Actually being able to write assembler code. Unsegmented memory. It has always disappointed me that Motorola didn't do better in microprocessors but that's water under the bridge.

I read old processor manuals for fun as a kid. 68k, x86, the rare ARM or SPARC book...

Thanks for posting. Well before my time. But powered some legendary video game consoles such as SNK Neo-Geo and Sega 16.

http://www.sega-16.com/


[turns to bookshelf] My copy is dated 1984.

I have an MC68000 User's Manual edited by Prentice-Hall with the HP logo (and an HP 9800 Computer Systems sub-title) that says December 1981.

Mine as well (albeit 4th edition, 1984), which is sort of weird because I never saw or touched an M68k HP; I only ever referred to it when debugging stripped binaries on Sun 3s.


