No, the iAPX 432's successor was the Intel i960, which confusingly is completely unrelated to the i860.
By the way, it's interesting to read the iAPX 432 architecture paper alongside Patterson and Ditzel's "The Case for the Reduced Instruction Set Computer," since the 432 paper argues the exact opposite. Essentially, because software costs are rising, you should put as much functionality as possible into hardware. The semantic gap between hardware and programming languages should be minimized by implementing high-level features such as objects and garbage collection in hardware, which they call the Silicon Operating System. The instruction set should be as complete as possible, with lots of data types.
I should also mention that the 432's instructions were bit-aligned, not byte-aligned: they could be anywhere from 6 to 321 bits long. Just decoding the instructions took a complex dedicated chip, with a second chip to execute them. This is the opposite of the RISC goal of making instruction decoding as simple as possible.
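To get a feel for why bit-aligned decoding is painful, here's a minimal sketch in Python of pulling variable-length instructions out of a bitstream. The opcode width, the length table, and the LSB-first bit order are all hypothetical choices for illustration, not the 432's actual encoding (the real 432 derived instruction length from several format fields); the point is only that each instruction can start at an arbitrary bit offset, so the decoder can't use simple byte indexing.

```python
def read_bits(data: bytes, bit_pos: int, n: int) -> int:
    """Read n bits from data starting at absolute bit offset bit_pos.

    Bits are taken LSB-first within each byte (an assumption for this
    sketch, not a claim about the 432's actual bit order).
    """
    value = 0
    for i in range(n):
        byte_index, bit_index = divmod(bit_pos + i, 8)
        value |= ((data[byte_index] >> bit_index) & 1) << i
    return value

# Hypothetical table mapping a 6-bit opcode to total instruction length
# in bits -- invented for illustration only.
LENGTHS = {0b000001: 6, 0b000010: 16, 0b000011: 51}

def decode(data: bytes, n_instructions: int):
    """Walk the bitstream, returning (bit_offset, opcode, length) tuples."""
    bit_pos = 0
    out = []
    for _ in range(n_instructions):
        opcode = read_bits(data, bit_pos, 6)
        length = LENGTHS[opcode]
        out.append((bit_pos, opcode, length))
        bit_pos += length  # next instruction starts at an arbitrary bit offset
    return out
```

Note that every field access needs shift-and-mask work across byte boundaries, and you can't even find the next instruction without fully decoding the current one's length. Doing this at speed in 1981-era silicon is why the decoder needed its own chip.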
The paper says, "The iAPX 432 represents one of the most significant advances in computer architecture since the 1950s."
The capability-based software architecture was pretty interesting too, particularly in light of security problems, ubiquitous distributed systems, and the adoption of OO. See also
Of course Java provided a similar architecture in many ways on mainstream hardware, much the same way that Common Lisp on stock hardware made Lisp machines obsolete. But Java never really got a universal architecture for serialization and persistence the way the AS/400 did.
That said, I find AVR-8 pretty interesting in that it is the last mainstream 8-bit architecture and is, from the viewpoint of the assembly programmer, technically superior to all those machines I had or craved in the 1980s.
That's quite an argument. "Software is getting too hard to write, so we will do the work in hardware instead, since hardware is so much easier to develop."
It's pretty ballsy of Intel's marketing department to so blatantly lie to your face like that. Maybe they were big believers in the "Big Lie" rhetorical device? If you say something so outrageous, people are less likely to question it, because you must be a genius with loads of inside knowledge to assert something so completely crazy on the face of it.
In the late 70s to early 80s that wasn't an obviously wrong idea, especially because, as I understand it, when the posts above say "hardware" they mostly mean microcode, which is more software than hardware. At that moment many architectures, like the iAPX 432, Lisp machines, Xerox machines, and the VAX, were microcoded.