"all the necessary knowledge was explained from the ground up. All you needed to know was supplied, clearly laid out, not just hints for efficient programming. Basicly, you could rebuild your own computer by reading these books."
This is what I'm trying to recapture with my ARM project. Basically, an ARM Cortex-M4 is of the same order of complexity as a large minicomputer "back in the day", when you could (and often did) learn all the basics of computers, from architecture to compiler construction. I realized that I had a tremendous advantage learning about computers because you could put the entire PDP-11 architecture in your head while you were writing code; you can't so easily do that with even an Atom version of the Pentium. Combine that with a straightforward I/O system that kept to a small number of principles, used repeatedly, and you did not have "needless"[1] complexity getting in the way of learning.
Another good reference for seeing how things were built is "Computer Engineering: A DEC View of Hardware Design" [2], which discusses all sorts of trade-offs in computer design; once you understand them, things like superscalar execution units make much more sense.
[1] It is all useful complexity, but before you know what you don't know, it is just a wall of confusing concepts and jargon.
Do you mean that you're attempting some kind of documentation/description project, by chance? If so, would you maybe be willing to do the writing in some "open doors" model?
I was going to do the same type of project, but using a MIPS chip; however, I determined that ARM chips are much more available. I could also use an FPGA, but that's a little too much abstraction.
There is a level of detail where things sort of fall down. And I have fallen into that sticky pit a few times so I am getting better at recognizing it.
On the one hand, it is useful to know everything about the CPU's construction; on the other, it is efficient to use something that other people are helping out with.
My current plan is a compromise position: early in the project there are basic concepts on computation, but once the basic concepts are presented they are "mapped" over into an ARM implementation. This takes the reader from "Ok, I get how computers do their thing..." to "Ok, I see how this type of computer does its thing." At the end of the day I felt that both were important: the how, and how the how matches up with existing practice.
That said, the board design that goes along with this has an FPGA, which is there to implement a simple frame buffer. That was before I found the STM32F429, which has its own simple frame buffer, so I'm digressing at the moment trying to figure out whether I should go that route or not. One of the benefits of having the FPGA on board was that, for some definition of "easy", you could do some stuff in the FPGA; the scare quotes are there, though, because FPGAs bring their own pile of complexity to the problem, and that defeats some of my goals of keeping it only as complex as it needs to be.
You both should use an FPGA; then you have control over the full stack. It is a little more work to bootstrap an FPGA system, but the effort is worth it.
[2] http://www.amazon.com/Computer-Engineering-Hardware-Systems-...