I’m going to leave smarter inquiries to those more qualified to raise them, but I wanted to take a moment to express my admiration and gratitude for your work in preserving, expanding, and advancing our collective knowledge and understanding of low-level computing and systems architecture. Hats off to you; it’s been incredibly rewarding to watch your work.
Why is it up to reverse engineering to preserve history and knowledge? I wonder if the original engineers are alive to help, and doesn't Intel have this information locked away somewhere?
I have a transistor-level 8086 simulator that mostly works but needs some cleanup and bug fixes. For now, I'm concentrating on analysis of the 8086 rather than finishing the simulator.
I haven't really timed it, let alone optimized it, but it runs at about 50 clock cycles per second. If I had a nice graphical display like Visual 6502, it would be way slower.
Yes, the simulator is extremely helpful for analysis. I started doing the analysis on paper, but it's very easy to make a mistake and end up confused. The simulator is also very helpful when trying to understand complicated state machines such as the bus control circuitry. So I plan to put more emphasis on simulation for future projects. The tradeoff is that it takes a lot more time up front to get the simulation working.
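For readers curious what a transistor-level simulator involves, here is a very rough sketch of the switch-level approach that Visual 6502-style simulators use. The netlist format and names below are hypothetical illustrations, not the author's actual code: each NMOS transistor is a switch that connects two nodes when its gate node is high, and the simulator re-evaluates nodes until values settle.

```python
# Switch-level simulation sketch (hypothetical netlist, for illustration).
# Netlist: (gate, source, drain) triples, all referring to node names.
transistors = [
    ("a", "out", "gnd"),   # pulls "out" low when input "a" is high
]

# Node states; "pulled_up" marks nodes tied high through a load device.
nodes = {"a": True, "out": True, "gnd": False}
pulled_up = {"out"}

def settle(nodes, transistors, pulled_up, max_iters=100):
    """Re-evaluate node values until they stop changing."""
    for _ in range(max_iters):
        new_nodes = dict(nodes)
        # Start each pass with pulled-up nodes high...
        for n in pulled_up:
            new_nodes[n] = True
        # ...then let any conducting transistor drag its connected
        # nodes low; ground wins over a pull-up, as in real NMOS logic.
        for gate, src, drain in transistors:
            if nodes[gate]:
                if not new_nodes[src] or not new_nodes[drain]:
                    for n in (src, drain):
                        if n != "gnd":
                            new_nodes[n] = False
        if new_nodes == nodes:   # values have settled
            return nodes
        nodes.update(new_nodes)
    return nodes

settle(nodes, transistors, pulled_up)
# With "a" high, the transistor pulls "out" low: an NMOS inverter.
```

A real simulator has to handle connected groups of nodes, pass transistors, and charge storage, which is where most of the debugging effort goes.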
I'm curious about the circuitry that drives the difference between "minimum" and "maximum" modes from the processor - another Intel pin-saving strategy. Is there basically an 8288 sitting in a corner of the 8086 die, or are things more complicated than that?
There's not really much to the minimum and maximum modes. You can think of it as multiplexers on the relevant pins, selecting the signal for the appropriate mode. (Of course, the logic isn't quite that clean but that's the basic idea.)
Thanks. I guess what I'm wondering is whether the maximum-mode signals (S0, S1, etc.) are somehow more fundamental to the 8086's operation, with the minimum-mode signals (WR, INTA, ALE, etc.) being derived from them solely for the benefit of people without an 8288?
Put another way, if you could, recursively, delete all the gates in the 8086 which only exist to drive minimum-mode pins, would any remnant of the minimum-mode signals remain for other internal uses?
I haven't looked at the signals from that perspective, but I'd say that both sets of signals are derived from various internal signals that are more fundamental.
Great article, thank you so much for taking the time to write all your explorations down. Regarding the mentioned patent (US4449184A): it turns out I am terrible at reading patents. Any explanation of what it is for?
My best guess is that it's for placing memory or state between the registers and the processing units. Or perhaps some subtlety of this, since registers are already memory or state. I could not figure it out.
Modern patents are especially prone to this. They try to claim as much as possible while avoiding publishing any trade secrets. The result tends toward an unreadable mess.
They mention that the attackers went ahead and built their own simulator of a CPU as part of their attack. Do you know what they emulated, and would you have more details on that specific hack?
I think the authors promised to come back with a part 2 eventually, but so far they haven't.