I have a question about that. Why would they do this?
What was so valuable in these few lines of (mostly proprietary) code that you would make the effort to port it instead of just writing from scratch on the new machine?
If you take a look at what came later, plenty of much more substantial software packages were written by single bedroom coders in their spare time.
Why port? Surely it wasn't to save effort? (At least the conceptual design would carry over if you started from scratch.)
I vividly remember my first encounter with Unix, in an ACM journal archive at the local university library as a teenager in the late '70s, utterly astonished to find lower-case computer commands and prompts.
Use by development groups for Western Electric products was different from the in-house Unix systems run by the Bell Labs data center, which were used as time-shared systems for email, general computing, and document production with nroff and the memo macros. Moving away from all upper case made line printers less efficient, but around that time laser phototypesetting became available.
It had extra address lines that weren't even used.
Same thing for the keyboard encoder, IIRC.
Wow, I wonder how the history of UNIX would have panned out if that machine hadn't happened to land at Bell Labs.
Then we moved on to a bit of 680x0. Yeah, that was different.
A simulator for it was pretty easy to write, switches and all.
Edit: just checked, and the 11 has about 70 instructions. I think MACRO-11 might abstract over many of them, making it look like there are even fewer. The instructions group nicely into a few families, which makes things really easy.
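That family structure is exactly what makes a simulator easy: the PDP-11's two-operand instructions (MOV, CMP, BIT, BIC, BIS, ADD/SUB) all share one encoding, so one decoder handles the whole family. Here's a minimal sketch in Python (the function name and return shape are my own, not from any real simulator):

```python
# The PDP-11 double-operand format: bit 15 selects the byte variant,
# bits 14-12 hold the opcode, bits 11-6 the source (3-bit addressing
# mode + 3-bit register), bits 5-0 the destination likewise.
TWO_OP = {0o1: "MOV", 0o2: "CMP", 0o3: "BIT",
          0o4: "BIC", 0o5: "BIS", 0o6: "ADD"}

def decode_two_op(word):
    """Decode a 16-bit PDP-11 double-operand instruction word.

    Returns (mnemonic, (src_mode, src_reg), (dst_mode, dst_reg)),
    or None if the word isn't in this family.
    """
    byte_op = (word >> 15) & 1       # bit 15: byte variant (MOVB, CMPB, ...)
    opcode = (word >> 12) & 0o7      # bits 14-12: which instruction
    src = (word >> 6) & 0o77         # bits 11-6: source mode + register
    dst = word & 0o77                # bits 5-0: destination mode + register
    if opcode not in TWO_OP:
        return None
    name = TWO_OP[opcode]
    if byte_op:
        # Quirk: opcode 16SSDD is SUB, not "ADDB" -- there is no ADDB/SUBB.
        name = "SUB" if name == "ADD" else name + "B"
    return name, (src >> 3, src & 7), (dst >> 3, dst & 7)

# MOV R0, R1 assembles to octal 010001:
print(decode_two_op(0o010001))   # ('MOV', (0, 0), (0, 1))
```

Six entries in a table cover a large slice of the instruction set; the single-operand and branch families yield to the same trick, which is why "switches and all" is barely an exaggeration.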
Compare that to Linux, its most popular Unix descendant today...
Today's Linux requires a tad more memory and disk than that... just a tad... <g>