
A 9-bit byte is found on 36-bit machines in quarter-word mode.

Parity is for paper tape, not punched cards. Paper tape parity was never standardized. Nor was parity for 8-bit ASCII communications. Which is why there were devices with settings for EVEN, ODD, ZERO, and ONE for the 8th bit.

Punched cards have their very own encodings, only of historical interest.




>A 9-bit byte is found on 36-bit machines in quarter-word mode.

I've only programmed in high level programming languages in 8-bit-byte machines. I can't understand what you mean by this sentence.

So in a 36-bit CPU a word is 36 bits. And a byte isn't a word. But what is a word and how does it differ from a byte?

If you asked me what 32-bit/64-bit means in a CPU, I'd say it's how large memory addresses can be. Is that true for 36-bit CPUs or does it mean something else? If it's something else, then that means 64-bit isn't the "word" of a 64-bit CPU, so what would the word be?

This is all very confusing.


A word is the unit of addressing. A 36-bit machine has 36 bits of data stored at address 1, and another 36 bits at address 2, and so forth. This is inconvenient for text processing. You have to do a lot of shifting and masking. There's a bit of hardware help on some machines. UNIVAC hardware allowed accessing one-sixth of a word (6 bits), or one-quarter of a word (8 bits), or one-third of a word (12 bits), or a half of a word (18 bits). You had to select sixth-word mode (old) or quarter-word mode (new) as a machine state.
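The shifting and masking described above can be sketched in a few lines of Python. This is a hypothetical illustration of 9-bit quarter-word access on a word-addressed 36-bit machine, not any real machine's instruction set:

```python
# Quarter-word access on a 36-bit word via shift and mask.
# Quarters are numbered 0..3 from the left (most significant end).

WORD_BITS = 36
QUARTER_MASK = 0o777  # 9 bits

def get_quarter(word: int, index: int) -> int:
    """Extract quarter-word `index` (0 = leftmost) from a 36-bit word."""
    assert 0 <= index < 4
    shift = WORD_BITS - 9 * (index + 1)
    return (word >> shift) & QUARTER_MASK

def set_quarter(word: int, index: int, value: int) -> int:
    """Return `word` with quarter-word `index` replaced by `value`."""
    assert 0 <= index < 4
    shift = WORD_BITS - 9 * (index + 1)
    mask = QUARTER_MASK << shift
    return (word & ~mask & ((1 << WORD_BITS) - 1)) | ((value & QUARTER_MASK) << shift)

w = 0o111222333444  # four 9-bit quarters: 0o111, 0o222, 0o333, 0o444
print([oct(get_quarter(w, i)) for i in range(4)])
```

On a byte-addressable machine the hardware does this for you; on a word-addressed machine it is explicit instructions (or, on machines like the PDP-10, dedicated byte-pointer instructions).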

Such machines are not byte-addressable. They have partial word accesses, instead.

Machines have been built with 4, 8, 12, 16, 24, 32, 36, 48, 56, 60, and 64 bit word lengths.

Many "scientific" computers were built with 36-bit words and a 36-bit arithmetic unit. This started with the IBM 701 (1952), although an FPU came later, and continued through the IBM 7094. The byte-oriented IBM System/360 machines replaced those, and made byte-addressable architecture the standard. UNIVAC followed along with the UNIVAC 1103 (1953), which continued through the 1103A and 1105 vacuum tube machines, the later transistorized machines 1107 and 1108, and well into the 21st century. Unisys will still sell you a 36-bit machine, although it's really an emulator running on Intel Xeon CPUs.

The main argument for 36 bits was that 36-bit floats carry a few more bits of precision, or roughly one more decimal digit, than 32-bit floats. 1 bit of sign, 8 bits of exponent and 27 bits of mantissa give you a full 8 decimal digits of precision, while standard 32-bit floats, with a 1-bit sign, 8-bit exponent and 23-bit mantissa (24 significant bits counting the implicit leading bit), only give you 7 full decimal digits. Double precision floating point came years later; it takes 4x as much hardware.
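A quick back-of-the-envelope check of those digit counts: the number of decimal digits a binary mantissa can represent is log10(2^bits).

```python
import math

# Decimal digits of precision for a given number of mantissa bits.
for bits in (24, 27):
    print(f"{bits}-bit mantissa: {math.log10(2 ** bits):.2f} decimal digits")
```

24 bits gives about 7.22 digits and 27 bits about 8.13, matching the 7-full-digits vs 8-full-digits comparison above.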


I see. I never realized that machines ended up with unusual word sizes because they couldn't do double precision, so it was easier to make the single-precision word larger instead.

Thanks a lot for your explanation, but does that mean "byte" is any amount of data that can be fetched in a given mode in such machines?

e.g. you have 6-bit, 9-bit, 12-bit, and 18-bit bytes in a 36-bit machine in sixth-word mode, quarter-word mode, third-word mode, and half-word mode, respectively? Which means in full-word mode the "byte" would be 36 bits?


The term "byte" was popularized by IBM at the launch of the IBM System/360 in 1964 [1]; the same launch also introduced the term "throughput". IBM never used "byte" officially in reference to their 36-bit machines. By 1969, IBM had discontinued selling their 36-bit machines. UNIVAC and DEC held onto 36 bits for several more decades, though.

[1] https://www.ibm.com/history/system-360


I don't think so. In the "normal" world, you can't address anything smaller than a byte, and you can only address in increments of a byte. A "word" is usually the size of the integer registers in the CPU. So the 36-bit machine would have a word size of 36 bits, and either six-bit bytes or nine-bit bytes, depending on how it was configured.

At least, if I understood all of this...


One PDP-10 operating system stored five 7-bit characters in one 36-bit word. This was back when memory cost a million dollars a megabyte in the 1970s.
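That packing can be sketched in Python. This follows the usual PDP-10 convention as I understand it (five 7-bit ASCII characters left-justified in the word, leaving one unused low bit), offered as an illustration rather than a faithful reproduction of any particular OS:

```python
# Pack/unpack five 7-bit ASCII characters into one 36-bit word,
# left-justified, with one spare bit at the low end (bit 35).

def pack5(s: str) -> int:
    """Pack up to five 7-bit ASCII chars into a 36-bit word."""
    assert len(s) <= 5 and all(ord(c) < 128 for c in s)
    word = 0
    for c in s.ljust(5, '\0'):      # pad short strings with NULs
        word = (word << 7) | ord(c)
    return word << 1                # leave the spare low bit clear

def unpack5(word: int) -> str:
    """Recover the five characters, dropping trailing NUL padding."""
    word >>= 1
    chars = [chr((word >> (7 * (4 - i))) & 0x7F) for i in range(5)]
    return ''.join(chars).rstrip('\0')

print(unpack5(pack5("HELLO")))
```

Five 7-bit characters use 35 of the 36 bits, so the packing wastes only one bit per word, which mattered a great deal at those memory prices.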


36 bits also gave you 10 decimal digits for fixed point calculations. My mom says that this was important for atomic calculations back in the 1950s - you needed that level of precision on the masses.
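A quick sanity check on that digit count: a signed 36-bit word holds magnitudes up to 2^35 - 1, which comfortably covers every 10-digit decimal number.

```python
# Largest magnitude in a signed 36-bit fixed-point word.
print(2 ** 35 - 1)  # about 3.4e10, so all values up to 9,999,999,999 fit
```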



