My dad worked for HP from the mid-1970s through the mid-1990s. Needless to say, I used HP calculators in high school and college. The best things about having an HP calculator were the solid physical construction (the buttons on the 11C and 15C were awesome), the accuracy, and the fact that whenever your classmates asked to borrow your calculator they would recoil in horror when you asked them whether they knew RPN. Nobody borrowed my calculator. Anyway, I love this project.
I was part of a group touring HP labs in the mid-1970s.
They showed us how they computed the key dimensions and typography with an (HP) minicomputer, then had it print out the commands for some kind of numerically controlled cutting machine on paper tapes using an old ASR33 teletype.
The cutting machine generated moulds for the injection moulding machines making the keys.
The keys are in two parts: light-coloured plastic forming the digits and symbols, and darker plastic forming the shape of the key itself, surrounding the lighter symbol plastic. As a result, the keys can be worn down, but the legends can never wear off.
My biggest challenge the first time I ever used an HP calculator was less RPN than the syntax of it. I thought I had to hit enter after every token so I typed, e.g., 2⎆3⎆+⎆ rather than 2⎆3+. Needless to say, this did not work as expected, but being simultaneously vain and bashful, I was unwilling to ask for help and did almost all the arithmetic for my freshman physics class by hand.
I still have my dad's old HP with the glowing red letters and all the functions. Not sure if we still have the charger, and the battery is probably no good, but the calculator worked fine the last time it was turned on, decades ago. Any idea if it can be made to function again?
Yes, probably. What model is it? I repair and restore old HP calculators. One thing about all the models from 1972 through the early 1980s that use a battery pack: do not power the calculator off the adaptor alone. These models used a working battery pack as part of the adaptor's voltage "regulation"; if you power off the adaptor without the battery, a much higher voltage gets into the calculator and can permanently damage it. If you don't have a working battery pack, the only safe way to test it is with a bench power supply set to a safe voltage. For many of these models I install a lithium battery system and a USB-C charge port, so we no longer have to mess around with old leaky batteries and HP's poor implementation for charging them. Some of the models like this are the 35, 45, 55, 65, 67, the 20-series (21, 22, 25, etc.) and the 30-series (31E, 32E, 33E, 33C, 34C, etc.). After these came the "Voyagers" (11C, 15C, etc.), which don't use a power adaptor and are safe.
One of the coolest projects I've seen in a while! Amazing work! In case anyone missed the write-up^1, it's very well-written. I really enjoyed the chapter about designing the instruction set.
This is a brilliant project. (My DM42 returned 9 exactly.)
Blog post 6 had an error: a picture captioned as an HP-48GX actually shows an HP-71B (I have one, and used its Forth/Assembler ROM manual to write the first HP-48 ROM decoder). Both used a Saturn CPU.
This WebAssembly build is compiled using Qt and Verilator, so it runs the "hardware" and its microcode inside the simple UI shell that provides the calculator interface. In the debug version you can list the microcode, set breakpoints, inspect registers, etc.
Really hoping that Ghidra can add support for non-byte-aligned memory regions some day. There are so many cool 4-bit architectures out there, and attempting to shoehorn them into Ghidra produces poor results.
Are you trying to make a pun with byte/bite relating to nibble? Because that's actually where the term nibble (referring to 4 bits) comes from, so I'm not sure such a pun even counts as a pun anymore. Or am I misinterpreting your comment?
"Nibble" may well have always been in use by folks, and nybble may have actually come later. At the very least, references to each spelling being in use exist for the last ~60 years.
The first book match I get for "nibble" near "byte" is the 1964 "System 360 Assembler Language" by Don H. Stabley. The earliest match I can find for "nybble" in relation to computers is the 1968 "Encyclopedia of Library and Information Science". Nybble (and likely nibble) itself doesn't seem to have taken off until around the mid-1970s: https://books.google.com/ngrams/graph?content=nybble&year_st...
References to the coining of the term in 1958 of course don't provide a textual source.
I was wondering this as well. Probably when a new wave of people discovered the concept in the absence of the older wave? By contrast, "byte" has been in use continuously and widely.
The definition of a byte today is different from the definition of a byte when those machines were manufactured, just like how 'foot' is now standardized.(*)
(* technically, a 'foot' is not a standard unit of measure but that's due to the long history of 'foot' not being standardized until relatively recently)
No. 8-bit bytes are the de facto standard, but that is in no way official, nor the definition of the word. You can find modern niche projects with non-8-bit bytes (and, by extension, non-32- or 64-bit words).
Love this, and love seeing people building their own hardware/software tools. I hope to carve out the time soon to build one of my own. Calculators are a perfect project.
The core question: how did HP's scientific calculators actually work at the gate level? That rabbit hole led to building one from scratch.
The architectural decision everything else follows from: a decimal calculator should store numbers as BCD — one decimal digit per 4-bit nibble. A standard byte-oriented CPU (Z80, 6502) fights that layout constantly. So I designed a small custom CPU in Verilog where 4 bits is the natural data width and memory is nibble addressable.
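As a toy illustration of that layout choice (one decimal digit per 4-bit nibble, with digit-serial decimal carries), here is a Python sketch; the helper names are mine, not the project's:

```python
def to_bcd_nibbles(n, width=10):
    """Pack a non-negative integer into a list of BCD nibbles,
    least-significant digit first (a hypothetical register layout)."""
    nibbles = []
    for _ in range(width):
        nibbles.append(n % 10)   # one decimal digit per 4-bit nibble
        n //= 10
    return nibbles

def bcd_add(a, b):
    """Digit-serial BCD add with decimal carry, the way a
    nibble-wide ALU would work through a register."""
    out, carry = [], 0
    for da, db in zip(a, b):
        s = da + db + carry
        carry, s = (1, s - 10) if s >= 10 else (0, s)
        out.append(s)
    return out

# 275 + 138 = 413, computed one decimal digit at a time
print(bcd_add(to_bcd_nibbles(275), to_bcd_nibbles(138))[:4])  # [3, 1, 4, 0]
```

On a byte-oriented CPU each of those digit steps costs masking and shifting; with a nibble-addressable memory and a 4-bit ALU, each loop iteration is just one native add.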
What the project covers:
- Custom CPU: Harvard architecture, 12-bit ISA, 8-state execution FSM, hardware stack guard with a FAULT state for microcode debugging
- CORDIC for trig functions, verified to 14 significant digits
- Two-pass assembler in Python (~700 lines)
- Verilator + Qt framework: the same Verilog source runs in simulation, as a desktop GUI debugger, as WebAssembly, and on real hardware
- Scripting language on top of the microcode for adding functions without touching hardware
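The CORDIC item above can be sketched in plain Python. This is a floating-point toy, not the project's fixed-point microcode, but it shows the shift-and-add rotation structure that makes CORDIC a good fit for a small ALU:

```python
import math

def cordic_sin_cos(theta, iters=32):
    """Rotation-mode CORDIC: compute (cos, sin) of theta, |theta| <~ 1.74 rad.
    Each step rotates by +/- atan(2^-i); hardware does this with shifts/adds."""
    angles = [math.atan(2.0 ** -i) for i in range(iters)]
    # Accumulated gain of all the micro-rotations; start x at 1/gain.
    K = 1.0
    for i in range(iters):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = K, 0.0, theta
    for i in range(iters):
        d = 1.0 if z >= 0 else -1.0      # rotate toward z == 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x, y                          # (cos(theta), sin(theta))

c, s = cordic_sin_cos(0.5)
print(c, s)
```

A fixed-point BCD implementation replaces the `2.0 ** -i` multiplies with digit shifts, which is exactly why the algorithm suited HP's calculator chips.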
Very impressive, and obviously a labour of love! As a calculator and SystemVerilog enthusiast, it's wonderful to see a project such as this come to fruition - congratulations!
I'm holding in my hand a 4-bit Von Neumann Mostek MK50310N that my father and I used to use to build calculators long ago. Although Mostek made chips for HP (such as the HP-35), they weren't commercially available, but the 50310 was. We could only dream of a project such as yours. I was happy when the "open source" NumWorks was released, but this project aligns more with my interests.
Will definitely install the Qt simulator - would be even better to build one IRL!
I didn't think it was that much, but I checked my receipt from 2017, and sure enough it was!
I have in my hand (I guess I like using that phrase today) my father's original receipt for the HP-45 (it's with the box and manual). $299.00 in 1975, which is $1,850 today(!!!).
Relatively speaking, electronics are very, very cheap today compared to what they used to be. Still, I appreciate that my $30 CASIO does 90% of what the NumWorks can do, but I'm happy to support upstarts such as NumWorks.
The Z80 does have a DAA (decimal adjust arithmetic) instruction to facilitate BCD arithmetic. I don't think I ever used it in my Z80 years and I'm not familiar with the details. Sadly I have much less experience with the 6502, I didn't even know it had a BCD mode.
Intel 8080 already had DAA, so it could add BCD numbers. Because there was no BCD subtraction, that had to be done by an indirect method, e.g. by doing a nines' complement before a BCD addition.
The Zilog Z80 improved on this: its DAA could also be used after a subtraction, correctly adjusting the result to implement BCD subtraction directly.
This was one of the Z80 improvements that were considered important at the time of its launch.
While I have never considered it worthwhile to do BCD arithmetic instead of converting decimal numbers to binary, the DAA instruction on all Intel 8080 successors, including IBM PC compatible computers, was very convenient for converting binary numbers to their ASCII hexadecimal representation for printing or display: the decimal adjustment happens to add the correct ASCII offset between '0'...'9' and 'A'...'F', so the conversion to ASCII can be done branchless.
Slight correction, the correct offset is 7, and DAA only adds 6. But the trick is also adding the carry bit. This works on the 6502 in decimal mode too, e.g. https://news.ycombinator.com/item?id=6342286
On the Z80 and 8086, the code can be made one byte shorter by taking advantage of adjust-after-subtraction, which the 8080 didn't have (and on 6502 worked differently):
CP 10 / CMP AL,10 ;set carry if valid decimal digit
SBC A,69H / SBB AL,69H ;0..9 => 96h..9Fh (auxC=1), 10..15 => A1h..A6h (auxC=0)
DAA / DAS ;subtract 66h if auxC set, 60h if clear
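For anyone without a Z80 or 8086 handy, the three-instruction sequence above can be traced in Python. This is a minimal flag-level simulation, modeling only the borrow and auxiliary-borrow behavior the trick relies on (x86 DAS semantics assumed):

```python
def hex_digit_to_ascii(n):
    """Trace CMP AL,10 / SBB AL,69h / DAS for n in 0..15."""
    assert 0 <= n <= 15
    borrow = 1 if n < 10 else 0            # CMP AL,10: carry set iff n < 10
    t = (n - 0x69 - borrow) & 0xFF         # SBB AL,69h
    # SBB from a value <= 15 always fully borrows, so DAS's CF is set
    # and it subtracts at least 60h; the auxiliary (low-nibble) borrow
    # decides whether it subtracts 66h instead.
    aux = 1 if (n & 0xF) < ((0x69 + borrow) & 0xF) else 0
    t = (t - (0x66 if aux else 0x60)) & 0xFF   # DAS
    return chr(t)

print("".join(hex_digit_to_ascii(n) for n in range(16)))  # 0123456789ABCDEF
```

The CMP's carry flows through the SBB, which is what bridges the 7-wide gap between '9' and 'A' without a branch.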
I'd like to amplify this. Decimal math using nibbles containing 0 through 9 was standard operating procedure, one of the things that made the 6502 nice. Unfortunately you had to use a short loop to convert between 2s complement and BCD.
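That short loop can be sketched in Python using repeated division; real 6502 code would typically shift and decimal-adjust instead, but the digit flow is the same:

```python
def bin_to_bcd(n):
    """Binary to packed BCD, one decimal digit per nibble,
    via repeated division by 10 (a toy version of the 6502 loop)."""
    bcd, shift = 0, 0
    while True:
        bcd |= (n % 10) << shift   # deposit next decimal digit
        n //= 10
        shift += 4
        if n == 0:
            return bcd

def bcd_to_bin(bcd):
    """Inverse: unpack nibbles back into a binary integer."""
    n, mul = 0, 1
    while bcd:
        n += (bcd & 0xF) * mul
        bcd >>= 4
        mul *= 10
    return n

print(hex(bin_to_bcd(255)))  # 0x255
```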
Ironically, the Z80 has a nibble-wide ALU. That's why it's so slow compared to the competition: an 8-bit add on a "2 MHz" Z80 takes as much wall-clock time as an 8-bit add on a "1 MHz" 6809.
The Z80 is pipelined and thus has a higher latency but also higher throughput. Besides, memory was the bottleneck, in particular instruction fetches, so multicycle instructions made more sense. Related article: https://news.ycombinator.com/item?id=6341137