One night over burrito bowls we got to talking about the Osborne 1 and its odd display size. Specifically, why 52x24 instead of 40x24?
As the story goes, there were huge concerns the computer wouldn't come in under budget. At the time, chip companies would bundle their CPUs and RAM, loss-leading with one but charging more than market rate for the other. Architecture differences should have meant these chips were not cross-compatible. Adam Osborne set a hard ceiling on the price, required Intel compatibility, and specified 40x24 native resolution for the display (so as to be compatible with other "business" applications). Facing these constraints, Lee couldn't make it work unless he got creative. Through some arcane use of the dark arts (read: layout and capacitor placement) he was able to get the timing working between the low-cost Z80 processor and the low-cost 4116 RAM.
When he presented the news it went something along the lines of: “Adam, I’ve got some good news for you and some bad news. The good news is I made the cheaper ram work with the Z80 and we’re going to come in under budget. The bad news is I can’t get you 40x24, but I can do you one better: 52x24!”
As I understand it, this has something to do with timing weirdness when integrating with the Z80's built-in DRAM refresh logic, which Motorola devs had to implement in software. My understanding is they rolled with it, but Adam went on to flush the savings windfall from this hack down the drain by announcing the Osborne 2 at a lower price with more features before the Osborne 1 shipped.
But imagine a true sci-fi-esque VR experience a decade or two from now, in the company of your children or grandchildren on a Christmas evening; that might make you feel like you're in magic-land all over again.
We're never gonna have an airliner that is 10 times faster than the ones we have now. The same principle applies to computing.
I worked for an aerospace contractor in the mid-80s, and the corporate computing center was an IBM mainframe shop (originally 360s/370s, later a mix of 370s and 3033s, later the 308x multiprocessor series).
Even when I left that world in 1986, punch cards were still used in several contexts, such as the starter decks for all production jobs. But the greatest reliance on punch cards was that every employee's time card was a punch card. Recurring data was pre-punched, employees would write billing hours and projects in the printed fields, and that data was punched in by the keypunch department at the end of the week. The timecards were then written to tape and fed into accounting, payroll, and shop order control.
Yet despite this primitive appearance, every employee got their paycheck by the following Thursday, every week.
wow. this is literally a sound wave. i thought it was some kind of euphemism.
as i was until now only familiar with the electric variety.
Well, not literally. A sound wave is a compression wave (vs. shear or torsion).
Which essentially means that there are going to be different stories about 80x24/5 from that time - probably all of them correct in context.
(oh, and the console monitors on our Burroughs 6700 mainframe had delay lines, while the user block-mode terminals - TD830s - had DRAM and microcontrollers - there's another non-IBM/non-DEC world view of essentially the same technology)
The first thing you notice is how everything is different. While the rest of the world was standardising on what a "file" was (a sequence of (not necessarily 8-bit) bytes) as early as the '60s, the IBM world has "datasets", which are rigidly formatted sequences of records.
The differences don't end there, and extend as far as terminology, referring to booting as IPL and to disk drives as DASD.
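The file-vs-dataset contrast is easy to make concrete: copy a fixed-format dataset (say RECFM=FB, LRECL=80) byte-for-byte to Unix and you get a stream with no record delimiters at all, so you have to re-slice it into records yourself. A minimal sketch, with a made-up file name and a zero-filled stand-in for real data:

```shell
# Stand-in for a two-record FB/LRECL=80 dataset copied to Unix:
# 160 bytes, no newlines anywhere (records are implied by position).
printf '%0160d' 0 > dataset.bin

# POSIX fold -b re-slices the byte stream into 80-byte "card image"
# records, one per output line.
fold -b -w 80 dataset.bin
```

On the mainframe side the record length lives in the dataset's metadata; on the Unix side it has to travel out-of-band, which is exactly the worldview gap described above.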
When reading documents and tutorials written by people who have seemingly worked with IBM systems for a long time, it also seems that while they know everything about how to operate z/OS, they are very ignorant of even basic features of "traditional" operating systems (which these days means Unix and Windows).
I was reading a discussion about Revedit, an editor that is popular on MVS, and it was praised for being incredibly advanced. I've used it, and compared to the other one I have access to (whose name escapes me) it sure is better.
But the editor lacks a lot of functionality that even the simplest editors on Unix have (regexps, for example), and the fact that no one in the mainframe world seems to miss it suggests to me that people who work in the mainframe world simply don't think about computing in the same way as people who were raised on Unix.
I'm not suggesting that one approach is better than the other. I think there are things that each side could learn from the other. But right now I only see the mainframe side learning from Unix, and very little going the other way. I probably will never work with mainframes for real, but learning how they work has been very helpful. It would probably help the industry if more people tried that.
PS: I always assumed 80x24 was a teletype (i.e. tty) standard, or at least two stacked 80x12 punch-card layouts. I also wonder if the serial cable throughput to the terminal played a role.
Unix was great mostly because it was simple and you could port it to new hardware... Prior to that, it was in no hardware manufacturer's interest to release their OS into the wild - that didn't sell mainframes. Unix also unleashed a generation of frustrated systems people who were not allowed to write kernel code unless they worked for a mainframe company.
I always imagined an OS had to be really user-hostile to have a movie villain named after it. IIRC, Alan Kay, Bonnie MacBird's husband, worked for them and probably made some of his opinions clear to his wife.
It IS the most user-hostile thing I've ever seen running on a computer.
The Model 40 was a nice machine. I remember it because the printer unit reused the type pallets from the Model 28 on a rubber belt that ran the full width of the paper (80 columns, of course). So it was a chain printer without the chain, which meant it was somewhat quiet when printing.
But it made it to the PC and its MDA video, IIRC. I used to use that mode all the time.
>The biggest problem with this theory is the VT100's display was 80×24, not 80×25
Having spent quite a few years banging away on VT100s and clones: wasn't the display actually 80x25, with the 25th line used to show status? So the electronics were actually set up for 25 lines.
I have a very simplistic question. The article asks, "Given the historical popularity of 80x24 terminals, why do so many modern systems use 80x25 windows?" My question is: Do any modern systems use 80x25 windows, other than the console of the IBM-compatible PC?
Almost half of that line could be written to using escape sequences -- in UNIX, a simple "echo" command with the properly encoded escape sequence did the trick.
In my organization, we had to install and run multiple generations of our products, and it was confusing to keep track of which "world" your current environment variables pointed to.
So I, being a tinkerer by nature, wrote separate environment files for each "world" that included an echo command emitting the appropriate escape sequence, so that the current environment setting could be seen at a glance in the bottom status line.
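The trick described above can be sketched roughly like this. The exact escape sequence depends on the terminal; this assumes a VT100-style terminal where the 25th line is reachable by ordinary cursor addressing, and the function name and "world" labels are made up for illustration:

```shell
# Hypothetical sketch: write a label into the 25th (status) line of a
# VT100-style terminal without disturbing the current cursor position.
show_world() {
  # ESC 7  = save cursor (DECSC)
  # ESC [25;1H = move cursor to row 25, column 1
  # ESC 8  = restore cursor (DECRC)
  printf '\0337\033[25;1H[%s]\0338' "$1"
}
```

A sourced environment file could then end with something like `show_world PROD` (label hypothetical), so the bottom line always reflects whichever world you last switched into.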