Code is my favorite technical book of all time [1]. Charles does an amazing job of building a computer up from basic principles ("two young boys who want to communicate after their parents tell them to go to bed at night") all the way to modern (for 1999) computers. He layers abstraction on top of abstraction all the way to a working computer. My only (slight) disappointment is that he tries to cover everything from operating systems to object-oriented programming in a couple of chapters at the end. That could have been a multi-volume series in its own right.
It goes really well with Elements of Computing Systems (2nd ed) [2] which I kind of think of as a "lab manual" where you get to build a computer from first principles.
I picked it up years ago, got through the first few chapters, but then never finished it. I loved the early buildup and still want to go back and keep reading.
If you're actually looking for something that does build up to a virtual computer, Nand2Tetris is a great companion to Code (the second edition of its companion book, The Elements of Computing Systems, is now out): https://www.nand2tetris.org/.
Code is more high level; Nand2Tetris and Elements are project based but cover some similar territory.
The hardware exercises expect you to use a simple HDL; you write an assembler in your language of choice (I used Ruby, but others have used Python or JavaScript); and you write programs for the computer you "built" in earlier chapters using Jack (a language with Java-like syntax).
"Project 4 is written in the computer’s assembly language, and
projects 9 and 12 (a simple computer game and a basic operating system)
are written in Jack—the Java-like high-level language for which we build a
compiler in chapters 10 and 11."
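To give a flavor of the assembler project: a Hack A-instruction such as @21 assembles to a 16-bit word, a leading 0 followed by the value in 15-bit binary (that format is from the book; the C below is just an illustrative sketch, and a real assembler also handles C-instructions, labels, and a symbol table):

    #include <stdio.h>
    #include <stdlib.h>

    /* Assemble a Hack A-instruction ("@value") into its 16-bit form:
       a leading 0 followed by the value in 15-bit binary. */
    static void assemble_a_instruction(const char *line) {
        unsigned value = (unsigned)atoi(line + 1);  /* skip the '@' */
        for (int bit = 15; bit >= 0; bit--)
            putchar((value >> bit) & 1 ? '1' : '0');
        putchar('\n');
    }

    int main(void) {
        assemble_a_instruction("@21");  /* prints 0000000000010101 */
        return 0;
    }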
Code is not more high level; it starts from as low as you can get in terms of abstractions. It explains what electricity is and how it moves through wires, then moves on to creating simple logic gates using switches.
Highly recommend Nand2Tetris. I completed it a few months ago as a relatively non-technical person (I had only taken CS50 as my first CS class ever in 2020) and learned a ton.
GP said “builds up to.” It’s more about the abstractions adding up so that the reader understands how electrons can be programmed. There is no implementation of any kind of virtual or hardware machine in the book.
It’s been a while, so I might have missed a step or two. You’ll come away knowing how we used electricity to get from lightning bolts to pocket computers. You will not come away with a programmable machine.
It’s more a book that actually shows you how a computer really works, from first principles.
It’s not a book about implementing a particular computer - it’s more about giving you the rough thought process about how it actually works at the level of electricity and wires, and how this gets built up into something that actually calculates stuff.
It’s the first book that took me from “computers are mystical magic” to “computers are understandable magic”.
There is a much older textbook called The Art of Digital Design. In it, if you buy and solder the necessary components, you get a fully working PDP-8.
This book was absolutely essential when I was first sinking my teeth into software programming. Things that seemed either arbitrary or completely mysterious suddenly made sense now that I had a contextual understanding of how computers worked. Without having that knowledge, I think my career would have been vastly more difficult and frustrating.
While I understand many “get started programming” books/tutorials put an emphasis on getting coding asap, I really had to stop and learn about computers before I could start coding in a well rounded way (and that’s coming from a WebDev, who doesn’t even have to deal with low-level stuff too often!).
Uh, as someone with too many years of programming behind me to imagine that, can you mention some concrete things that you found arbitrary or completely mysterious, and that the book cleared up for you? Thanks.
- why is volatile memory called volatile? and why can't we just keep that data around?
- what can a 64-bit computer do that a 32-bit computer can't?
- what do people mean when they say "code is data"
- what's an instruction, really?
- how did people program computers before they had screens?
- how did people program computers before they had keyboards?
- why is it called 2's complement? (see the sketch below)
And others that I can't recall right now. It's a fantastic book and I recommend it to everyone who is the slightest bit interested in how computers work.
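On the two's complement question in particular: an n-bit scheme stores -x as 2^n - x, the complement with respect to 2^n, which is where the name comes from; in practice that means flipping the bits and adding one. A quick illustrative check in C:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int8_t x = 5;
        /* In 8 bits, -5 is stored as 2^8 - 5 = 251 (binary 11111011),
           which is the same as ~x + 1: flip the bits, add one. */
        printf("-x     = %d\n", (int8_t)-x);         /* -5 */
        printf("~x + 1 = %d\n", (int8_t)(~x + 1));   /* -5 */
        printf("bits   = 0x%02x\n", (uint8_t)-x);    /* 0xfb = 251 */
        return 0;
    }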
That's a great question. I started out as a programmer having been a casual computer user through my childhood/teenage years. I never took apart computers or ventured into how they worked. Thus, when I decided to pursue this as a career, I had a lot of catching up to do. Here are a couple of things off the top of my head:
- The terminal was completely foreign to me. Why is it structured so? How are permissions set? Octal?!?!?! (see the sketch after this list)
- Why do I have to specify a type in a programming language?
- What/why are all these special characters used in programming?!
- Why is a program structured in the way it is? What are the levels of abstraction working in a given program?
- What happens when I run/compile my program?
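To pick up the octal question from the list above: UNIX permissions are three 3-bit read/write/execute fields (owner, group, other), so each field is exactly one octal digit, and 0755 reads as rwxr-xr-x. A toy C decoder, purely as illustration:

    #include <stdio.h>

    /* Decode a UNIX permission value: each octal digit holds the r/w/x
       bits for owner, group, and other, respectively. */
    int main(void) {
        unsigned mode = 0755;  /* octal literal: rwxr-xr-x */
        const char *who[] = { "owner", "group", "other" };
        for (int i = 0; i < 3; i++) {
            unsigned bits = (mode >> (3 * (2 - i))) & 7;
            printf("%s: %c%c%c\n", who[i],
                   bits & 4 ? 'r' : '-',
                   bits & 2 ? 'w' : '-',
                   bits & 1 ? 'x' : '-');
        }
        return 0;
    }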
I would also say that learning a little bit of C really helped illuminate computers for me. Not only in the sense of how they work, but also why programs use their current syntax. For example, for developers who look at JavaScript for the first time, the parentheses, colons, curly brackets, etc. all make initial sense: they seem familiar. However, to somebody first diving in, all of these characters seem totally arbitrary! Having gained a sense of how computers worked and then a very basic introduction to low-level programming, suddenly these high-level languages seemed much less arbitrary.
Thanks for sharing! I can (of course) understand that many of these things seem arbitrary, but would never have been able to come up with the list on my own. :)
What makes this book great is that it more or less only assumes you know how an on-off switch works. Then he goes on to teach you how a (albeit rather primitive) CPU works and how you would program it.
The teaching style in this book is so unbelievably good that even if you know all the ins and outs of a computer, you want to read on, because he explains everything in a way you wish you had come up with yourself.
The best and most useful classes I took in college, in the long term, were those taught from first principles, even if they weren't directly related to any work I did at the time (or ended up doing in the future).
The one I remember dearly was CS2110 at Georgia Tech. Most undergrads shuddered when they heard of that course and tried to avoid it, but it legitimately opened up my eyes to how computers "really" work. The class went all the way from logic gates, to building your own 8-bit ALU out of those gates (in a simulator), to writing some assembly code, to programming a Game Boy game, to implementing your own malloc. And the continuation of that course, CS2200, was great too; it started with logic gates and basic ALUs (the kind you implemented yourself in CS2110) and built up a primitive CPU pipeline (in a simulator as well).
I cannot overstress how useful that understanding (no matter how basic it was) ended up being later in my career, despite me doing mostly webdev for the first 5 or so years out of college. It basically turned my knowledge from "I know how to write code" into "I understand what I am actually doing on all levels of the stack (even if that knowledge is rather simplistic for the lower levels)". It came in pretty handy when I discovered a bug in one of the early versions of the TypeScript transpiler while writing my webdev code. I'm pretty certain I would've been completely stumped by my code not working if I didn't have an intuition for digging in that direction (which I definitely wouldn't have had if it weren't for that class).
Everything in that class built so well on what was learned earlier that I was wowed. I actually felt like I understood how those things were naturally "discovered" back in the day. Basically, it was the difference between "here is a formula, here is what it does, here is how you should understand it" and "here is what you know, here is a concern you might want to think about next; oh, congrats, you just 'discovered' this well-known concept following the same train of logic that led to it being originally 'discovered'".
I often wonder: for someone with no understanding of how computers work; no idea about electricity and transistors, no idea about CPUs executing instructions, no idea about software and abstraction: what do they think about when they click a ‘play’ button in a music player, and the interface updates and music starts? Do they think about what is happening ‘inside’ the box? I think I do.
I suspect I have a completely different mental model than them - just a completely different casual understanding of it. I find it hard to imagine how they must see the modern world. It must seem like magic!
Note that I’m not at-all speaking about intelligence here. Just knowledge.
Anyway: for that person, I think that if they read ‘Code’, their entire understanding of the world would change, which is sort of amazing.
> I find it hard to imagine how they must see the modern world. It must seem like magic!
I think this is unnecessarily infantilizing. There are a great many very complex things in the modern world, and people must employ abstractions for most of it.
I think with the rise of multi-form-factor computing, there is more basic literacy about the nature of computers these days. People don’t think that a phone god makes their phone work and a laptop god lets them work on their document.
>> I find it hard to imagine how they must see the modern world. It must seem like magic!
> I think this is unnecessarily infantilizing. [...] people must employ abstractions [...]
"Magic" to me means "does not (appear to) obey laws of cause and effect".
When the phone doesn't work after you install App X, do you attempt to remove App X? Why would restarting the phone help the situation? If none of this is logical, you're just following the 16-step Apple Fixit Checklist. Good luck... it might work... but if you miss something, what happens? You're just back to the start and run the checklist again... or call the ~~witch doctor~~ knowledgeable friend.
Now I won't say there's no magic in understanding computers. There certainly are cause and effect mechanisms that are too arcane for ordinary mortals to be familiar with (even like "don't build the project inside a directory that has a space in it" never mind "don't put a colon or slash in your filename").
The point is: "magic" does not have to mean infantilizing.
I said ‘like magic’. I know they know they’re made by humans and understandable with effort.
Complex chemistry and materials engineering feels ‘like magic’ to me. I know I could understand it with enough effort - but I don’t understand it now. I’m sure my view of everyday things would change if I did.
We etch runes into stones, imbue them with lightning, and thereby make them come alive and do our bidding... and you want to tell me computers are not magic?!
What's your mental model of a dog?.. What's happening inside when he barks or wags his tail? Why does he like carrots more than cheese? How does his memory work?..
The way you think of dogs is probably not very different from the way many people think of computers.
It's different for me because I know that computers are knowable because they were built by people, and lots of people (including me) know how they work. I believe dogs are knowable too because I believe in science, but nobody really knows (yet) how dogs work at anything close to the same level of detail.
Let's take something I know nothing about: industrial chemistry. That still seems less mysterious to me than dogs, because I know that I could find books about industrial chemistry at whatever level of detail I wanted. But dogs--not so much, beyond a rather superficial level.
I've asked very young students what they think a computer is and how it works. There's always at least one student who correctly replies: "It's a machine that does what someone told it to do".
Of course modern computing devices are absurdly complex and intricate machines all the way from silicon to software, but the basic mechanism is easily grasped by children. For all their complexity, computers are still just programmable calculators.
I didn’t discover it until after coding for over 20 years and getting a master’s in CS, and I really enjoyed it for filling in some gaps in my knowledge of the engineering side of things and for its historical treatment of how various things came to be. And it was a great review of all the stuff I did already know from formal training and experience. It was a quick and easy read, even after tracing through every schematic, so I found it very enjoyable.
Same. I think some of us learn best by having a map of the territory and an understanding of how everything all fits together first. Having that better helps us deep-dive into and master specific areas. This one single book provides that map.
Absolutely. I tend to be mystified and lost without some way to place what I'm doing in some larger context, even if that context is super-vague or wrong in technical ways.
Had several teachers who utterly refused to provide that, repeating "it'll all make sense eventually." Well, it did, after I got better teachers.
That's a great observation about teachers. I was fully convinced in late middle school that I was an idiot who had no hope to achieve good grades or a grasp of subjects. Then suddenly I found myself getting As in high school without a real change in effort/routines. I attribute that in large part to learning the "why" behind various subjects via new teachers.
Wish I had something to say other than "me, too." I read it sometime in high school, well after I'd learned C, and it really made me understand what was actually happening to the code I wrote. Very helpful later in my career and hobbies too, when I have to do reverse-engineering.
I'm a big fan of teaching the fun part first, but that really ought to be followed by learning the boring stuff. I think of it as teaching the What followed by the Why. Too many programs seem to stop after the first part.
This book (1st edition, at least) is astonishing. Perhaps the best book for coders that I’ve ever read. I recommend it to all the HN crowd.
Reading it, I felt like I actually understood how computers worked, right down to the “electricity going through wires” level and lower, and how that builds up to if-then statements, etc, in a high-level language.
As a computer science student (Informatik over here), I have to say that my university program was heavy on theoretical CS and maths, but there was extremely little in terms of hardware or low level engineering. The one networking class I took diving into the internet stack was probably the closest to it.
Obviously everyone's experience will be different, but I think of a CS education as more about treating computers like abstract machines, not physical ones.
Don't oversell it. Speaking as someone who has done VLSI layout of a 32-bit processor in my student days, Petzold's book is a solid popularization but it doesn't cover a tenth of what a computer engineering degree does.
Depends on the school, for sure. My CS education included a course on Computer Architecture, for which the final project was to implement your unique architecture on an FPGA and demonstrate it running a (simple) algorithm. I liked that course so much, I went back for Computer Architecture II and learned about pipelining, hazards, etc.
"I'm speaking about MY computer science degree". Fixed that for you.
My computer science degree included courses on microprocessors, circuit design, logic, system architecture, etc. and I think that most rigorous CS programs would also include these.
I can't tell whether I had an unusually good CS education, or if I'm missing something, but everything I see discussed in this thread as crucial insights taken from this book are things I recall being covered at least once in university, yeah. Perhaps it's just especially effective in its organization and ordering of fundamentals. Still, the praise it's getting makes me want to pick up a copy just to see if it can fill in any gaps I've missed in truly grokking those concepts.
> Perhaps it's just especially effective in its organization and ordering of fundamentals.
This is it. It’s not going to tell you some deep technical details about how modern processors work, but it’s an extremely well-written introduction to low-level computer concepts that any moderately intelligent person can follow with no prior specialist knowledge, which is rare among nonfiction books. But still, it’s a popular science book, not a thorough technical treatise.
Also, probably a lot of people read it before university. I read it at age 16 or so and learned basically my entire framework for understanding what a computer is and does. I doubt someone with a rigorous computer engineering degree under their belt would learn anything specific, but it’s still a fantastic piece of writing and you might enjoy seeing a good explanation from first principles of how everything fits together.
I think the difference is that this book, in about 200 pages or so, starts from two boys using flashlights to attempt to "communicate" with each other and goes through simple circuits to logic gates to a CPU, ALU, volatile memory, and the rudiments of assembly language, up to a high-level language. That's the difference: hand-holding you through the explanations with the emphasis on pedagogy rather than being a dry theory book.
Yes, it consolidates a ton of information very well. I think it can appeal both to the beginner/layperson as introductory, or to those with more experience to put lots of pieces together. I didn’t learn much new information but it tied together many things, and just reviewing and recalling old theory felt like a good exercise.
Maybe where you went, but that definitely wasn’t the case in my CS degree. Code is really great and he’s a better teacher than most of the professors I had.
As someone with very little coding experience (learned HTML and CSS like a decade ago, I am not a coder by any definition), is this book too advanced/technical for me? Sounds kind of conceptual/birds eye view and interesting.
"Programming Windows" exposed the simplicity behind the Windows architecture. If you had even a smattering of C experience, Petzold could get you writing Windows applications the same day you opened his book. He triggered an explosion of software development.
There are excellent books for novice Linux programmers, but unfortunately, nothing compared to Petzold's clear and direct presentation.
This brings back fond memories. Charles thanked me in the Acknowledgements section of the first edition of Programming Windows as "the indefatigable Michael Geary."
I had spent many hours on BIX and CompuServe and maybe GEnie helping other Windows programmers get their start.
So I like to think that in a small way I contributed to Charles' success in educating a generation of Windows programmers.
Absolutely! At my first software job back in '94, on my first day I was given a battered copy of Petzold and told to read and do the exercises up to chapter 7. It was an extremely effective boot camp for a novice Windows programmer.
You may well be correct, the instructions might just have been to code along with the book, I know that part of the exercise was definitely to ensure that I could use the tools. It was 28 years ago, I'm fuzzy on the details :)
Win32 _is_ hilariously complicated (albeit as powerful as anything else, with perhaps more steps upfront).
Consider creating a child process in GNU/Linux:
fork();
vs. in Win32's asinine API:

    CreateProcess( NULL,  // No module name (use command line)
        argv[1],          // Command line
        NULL,             // Process handle not inheritable
        NULL,             // Thread handle not inheritable
        FALSE,            // Set handle inheritance to FALSE
        0,                // No creation flags
        NULL,             // Use parent's environment block
        NULL,             // Use parent's starting directory
        &si,              // Pointer to STARTUPINFO structure
        &pi );            // Pointer to PROCESS_INFORMATION structure
> Fork is no longer simple. Fork’s semantics have infected the design of each new API that creates process state. The POSIX specification now lists 25 special cases in how the parent’s state is copied to the child [63]: file locks, timers, asynchronous IO operations, tracing, etc. In addition, numerous system call flags control fork’s behaviour with respect to memory mappings (Linux madvise() flags MADV_DONTFORK/DOFORK/WIPEONFORK, etc.), file descriptors (O_CLOEXEC, FD_CLOEXEC) and threads (pthread_atfork()). Any non-trivial OS facility must document its behaviour across a fork, and user-mode libraries must be prepared for their state to be forked at any time. The simplicity and orthogonality of fork is now a myth.
It's funny that you chose fork to make your point, because in my opinion Windows's version is vastly superior.
First of all, the name CreateProcess is obviously much more readable than fork. When I want to create a process, I usually want to run a particular program in it, which the Windows version takes as an argument; fork, alternatively, copies the entire god damn process (what?) and I have to choose what to do based on the return value. The only reason the performance of fork is not abysmal is that the underlying operating system has to implement some sort of copy-on-write optimization, but that just makes the performance characteristics of the program less transparent.
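For anyone who hasn't used it, here is a minimal sketch in C of the fork-and-branch idiom being debated (standard POSIX calls; in real code the child typically calls one of the exec* functions right away):

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();  /* clones the entire process */
        if (pid < 0) {
            perror("fork");
            return 1;
        }
        if (pid == 0) {
            /* fork() returned 0: we are the copy */
            printf("child: pid %d\n", getpid());
            return 0;
        }
        /* nonzero return: we are the original; pid names the child */
        waitpid(pid, NULL, 0);
        printf("parent: child %d finished\n", pid);
        return 0;
    }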
If I remember correctly, Programming Windows had a desktop app focus, whereas Jeffrey Richter’s book(s) had a system focus. Then .NET happened and almost anybody could program Windows apps, assuming Microsoft didn’t keep deprecating frameworks and languages all the time.
And in the end it turned out that it’s not worth programming windows any more. Just use Qt, Java or the awful web technology of the day and you’ll probably be better off than with whatever MS is recommending.
This is great news. This book has been highly regarded for being able to explain the magic behind what makes a computer actually work. The new edition has about 70 pages of additional material.
There is also a companion website, still under construction, which already has a delightful amount of interactivity, showing how binary switches, relays, and gates work.
Looking at the source code for the website is fun in its own right. There is a whole JavaScript framework for simulating and displaying circuits. Each circuit is represented by a JSON file.
I read the 1st edition while I was in high school. I knew I wanted to get into tech after reading that book. Many years later, I worked with Charles at Xamarin. For weeks after I joined, I thought of ways to get him to sign my book. One day I heard he was doing a book signing at an event, and I volunteered to go. He signed my book, "from bits to mobile", and now it lives on the top of my bookshelf.
I have no idea how I ended up buying this book last year (I hadn't heard of it from anyone)... the best book I have read in recent years. Definitely the best tech book I have or will ever read (I haven't even finished "Code" fully yet, lol).
Also: I found a book I was always too scared to start reading (because I didn't want to feel like the biggest idiot trying to read it):
"The Annotated Turing".
When I looked at it again after having read "Code", I saw it was also written by Petzold.
Given the way he wrote "Code", I know that this will be great. I am very excited to read it when I have a few hours to fully block off for it.
This is by far the best book I've read that really made computers click for me.
I think it's especially good for people like me who work in software development but don't have a computer science degree or background. Going from scratch and the very foundations of telegraphy all the way through to what opcodes really are and how code actually works in memory was an eye opener for better understanding what coding really is.
We need more people like Petzold in this world. The layers of abstraction are stacked so high that we need to provide a generalist view of what's going on. Some of these Jenga bricks need realignment and people to maintain them, sometimes to re-engineer those bricks to be stronger.
We can't all just sit at the top of the tower and wonder why it is behaving erratically! Please support him by purchasing the book.
I suppose I came to understand programming in an unusual way. I first knew a bit of BASIC and could write simple programs in it; how the computer actually worked was a baffling mystery.
My first semester in college, I took a class in semiconductor physics. That started with the PN junction, to diodes, to transistors, to gates, to flip-flops, to clocking, to registers, adders, etc.
Later on, this made learning microprocessors straightforward, then assembler, then C, etc.
I suppose it would have been faster to go straight to programming, but I am happier knowing how it works all the way down.
I learned it the hard way as well. I think my book was "Computer System Architecture" by M. Morris Mano.
You start from PN junctions to logic gates to truth tables to Karnaugh maps to writing your own adder to D flip-flops to memory to the bus to the clock to the CPU, to writing your own assembly to perform machine instructions.
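The gates-to-adder step of that progression fits in a few lines; here is an illustrative C sketch with bitwise operators standing in for the gates:

    #include <stdio.h>

    /* Half adder: sum = a XOR b, carry = a AND b. */
    static void half_adder(int a, int b, int *sum, int *carry) {
        *sum = a ^ b;
        *carry = a & b;
    }

    /* Full adder: two half adders plus an OR for the carry-out. */
    static void full_adder(int a, int b, int cin, int *sum, int *cout) {
        int s1, c1, c2;
        half_adder(a, b, &s1, &c1);
        half_adder(s1, cin, sum, &c2);
        *cout = c1 | c2;
    }

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                for (int c = 0; c <= 1; c++) {
                    int s, co;
                    full_adder(a, b, c, &s, &co);
                    printf("%d+%d+%d -> carry %d, sum %d\n", a, b, c, co, s);
                }
        return 0;
    }

Chaining full adders into an n-bit ripple-carry adder is exactly the exercise those books walk you through next.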
I also learned electronics first, but in a much more amateurish way. As a kid I loved electronics magazines and building projects from them. My dream was a Z-80 kit, but even much simpler kits were too expensive for me. I ended up making a make-believe computer with a few logic gates, decade counters, and flip flops. That was my "computer". Not a lot of power, but for a 10-11 year old it could compute the stuff I wanted, and it eventually taught me digital logic.
I never got the Z-80 kit, but I eventually got a computer with DOS, which had QBasic, where I learned to code by modding GORILLA.BAS and NIBBLES.BAS. What a weird language: there were AND, OR, and XOR, but no NAND, NOR, and others...
Great book -- read it when I was a teenager -- but one thing that I think makes the book a bit harder to explain to smaller kids these days is the use of relays to explain lots of things. It's a great concept, but it does tend to cause a bit of confusion for younger kids, who may not have played with electromagnets as much as kids did 20-30 years ago. Not sure what to replace relays with, but maybe having kids watch a super easy to understand video about how relays work would help make the book easier to understand for, say, an 8-year-old reading it.
Hydraulic relays? I vaguely remember coming across a similar "bottom-up" book about how computers work many decades ago (near the end of the mainframe/minicomputer era) which used water flowing through pipes and valves as its analogy. Unfortunately the title wasn't so memorable, but I do recall the cover was a photo of a little girl filling a bucket from a spigot on the outside of a house. Does anyone know what book that was? I've searched a bit before, but had no luck.
But that would introduce more complexity (polarity, biasing, etc) that would distract from the initial purpose. With a relay they should only learn how connecting a battery to pins 1 and 2 closes the contacts on 3 and 4. Using relays with more throws they can also learn the basics of logic gates, flip flops, etc.
I believe that relays still have a place for educational purposes.
The first edition of this book was my introduction into how computers worked at a lower level. It gave me enough of a grounding in various concepts that I was able to understand much of a Digital Systems class.
It also partly inspired my first efforts at building a scientific calculator (which would never be quite finished).
I definitely recommend it for folks who want to build context on the lower levels of computers, as a start into understanding CPUs, binary/hex, and other parts of how we tricked sand into thinking with lightning.
no M$-lapdog, he. I learnt my Win16 from the first edition of "Programming Windows" and would have learnt my modern UI from the newest edition if MS hadn't been so busy throwing the bathwater out with the Windows Phone baby that I lost heart.
Great, there goes my summer. :-) I'll essentially read this book with my 14-year-old Aspie and help him understand and appreciate software and the hardware it runs on. Thanks for posting this here.
I majored in electronics (more than a decade ago) but always worked in software development. Does it still make sense to purchase it? I am eager to buy it, just need the push. :)
I have a similar background in Electrical Engineering and, while I enjoyed the book, it did not change my perspective. I think the book does a good job of opening up what many programmers (and, of course, others) may see as a "black box" that they interact with on a daily basis. I recommend it to those who do not already have a fairly comprehensive understanding of how a computer works, but if you think you pretty much already understand how transistors become ALUs and have touched assembly, you may find it a bit boring (as I did).
Thanks for this solid comment. Assembly is the one I am most removed from at the moment; it's been 15 years or more. So it seems like I will enjoy it at the very least.
I finished my Electrical and Computer Engineering undergrad in 2008 and have been a software dev ever since.
You won't learn anything new if you had computer architecture and digital logic classes. But it's an excellent refresher. And so well written that it reads more like light fiction than a technical deep dive.
(Assuming the second edition is as good as the first) yes. This is the kind of book you can read on a plane ride from New York to San Francisco without taking notes or opening up your computer, and still get a lot out of. It's captivating but conversational and well written.
My wife bought me this book when I was in college and I never found time to read it. The comments here make me want to crack it open and give it another swing. Thank you all.
The Annotated Turing is probably my favorite technical and historical book, there's nothing else like it. But I'll never forgive the utterly strange typo in it:
"In a famous 1907 paper on relativity, Hilbert's friend Hermann Minkowski would coin the word Zaumreit or spacetime."
Zaumreit literally translates to "bridle riding" and is probably a play on words, Raumzeit means spacetime. This has puzzled me for over a decade now. How does this even happen?
You find it surprising that an American author and book editor don't read or speak German?
I'm guessing the author remembered the word incorrectly and wrote it down that way. It's not hard to transpose two letters when you don't know the language.
The book is really well researched, it includes 5 pages of selected bibliography. I doubt the author wrote about a scientist coining a word and then recalled said word from memory. There's no misspelling of the word "Entscheidungsproblem" which occurs like 80 times. It just stuck out to me, maybe the source material was already wrong, maybe he was trying to be funny or nerd snipe someone, it's a mystery.
I recommend this to anyone learning about programming or computers. That's usually kids. Last year, I went back to read it again, and it started with a story about trying to communicate with your friend next door. I thought, "oh, this story isn't really relatable to kids today - they all have phones".
So I'm really glad there is a second edition and I'm wondering if there is a new story.
Happy to see a new edition! Great book, really helped remove a lot of the "well it's just dark magic" parts of how a computer works for me. Really can't recommend this enough to anyone who wants to truly understand how computer hardware and software works from bottom to top.
My copy has a gigantic splat on its back cover because I threw it at some insect many years ago, and the insect was positively crushed by The Petzold's impact force. I never cleaned that off because it added to the aesthetic of raggedness.
As doors got heavier it became necessary to add Petzold's windows programming with MFC book to the stack too, another massive (and great at the time) tome.
I got the first edition and it was transformative.
Finally the connection between electricity, computers, and modern communication was made, all while being a fun, easy read.
If it was not for this book I would never have started going down the rabbit hole of dabbling with ICs, Arduinos and (basic) electric circuitry.
As Alan Kay famously said "People who are really serious about software should make their own hardware". No need to develop a whole computer, just getting your hands dirty with more basic electronics than consumer hardware will make you a much more complete technician.
Having read (and owning) Charles Petzold's Programming Windows, Code, and The Annotated Turing, I can unequivocally state that he is a Master of Technical Writing. No fluff; precise but detailed and straight to the point. Modern authors can learn more than a lesson or two from his style of writing.
He needs to write more books, that is his Gift and Calling.
PS: Andrew Tanenbaum's Structured Computer Organization makes a very good followup to Code.
I never did read this book, one of the bigger holes in my technology reading. I should use this as an excuse to finally pick it up and dive in, once the new edition arrives.
Looking back, it's really strange that nowhere in our CS/Soft Eng curriculum was it covered exactly what is meant by the term "abstraction" when it comes to how a computer works. That it's all, after all, shuffling of electrons (underlying MOSFET chemistry notwithstanding) and signals is the missing link.
There are other books out there, and Ben Eater's website, that show in depth how to construct a processor and gates, store "memory", and so on.
When I was in school 'computer engineering' was the degree you wanted for that level of understanding. It was a blend of 50/50 computer science and electrical engineering. You'd learn enough analog EE to understand transistors and enough digital EE to understand logic and computer architecture (this is really where you learn the gory details of how a CPU works internally). Then you'd focus on enough CS to flesh out low level OS, systems programming, etc. to make it all work. Basically learn enough to go from nothing but a circuit diagram to a computer booting up a display with a login prompt for an OS you designed and built, on hardware you designed and built.
You're describing what was the CompSci option for an engineering degree when I went to uni the first time. If you look at another great book for learning the "full picture", Nand to Tetris, CompSci is the second half of the book, while engineering seems to fill the space between physics and low-level software.
Right. Imagine a car. Now consider how it is abstracted to you. The wheel, the controls, the pedals. That's an abstraction of the complex system that is a car.
Great book. It might be time to replace my first edition hardcover. Looking forward to the follow-up blog post with the more detailed info on the updated content.
Wow this is awesome. Code was an eye-opening book for me as a child. I have a 2 year old, and have been wondering whether he could possibly appreciate camera film ISO codes (an early example of "a code" in the book), given that he'll likely never see a roll of film, or other things like that. An updated edition is great news.
Is this book useful for a software engineer with ten years of experience (I also studied computer science)? I usually see this book described as "worth it", but I don't know whether it is an introduction to computer science or also really interesting for an experienced developer.
It's interesting because you'll learn how to explain things better to the newbies. Pick up a pdf on libgen, check it out. If you like it buy it, if not you're still square even.
I’m definitely going to buy the new edition and excitedly re-read it, but it would have been deeply enjoyable if the second edition _only_ updated itself to dark mode. Could have been one of the rare actually funny April Fools Day announcements.
I bought Code and Code Complete around the same time. And started reading Code before switching to Complete. And now I'm reminded I have the 1st Edition still sitting on my bookcase read about 1/3 the way through.
Is the Microsoft Press site actually decent? Is there anywhere else y'all would recommend picking this up from? It seems like Amazon doesn't offer a combination ebook+physical bundle the way the Microsoft site does.
Bought it! I own the first one and it's just a fantastic book. I've shared it with family members to help them understand how computers work (more tech savvy ones at least). But yeah love this one.
The 1st edition of this book was very influential for a generation (or more) of engineers. I can't wait to see how it's evolved and what I learn reading a new edition through again.
I am beyond excited. I have my worn out copy of the first edition perched near me. Petzold's Code is by far my favorite computer science and programming related book.
> If you’d like to pre-order the book from the publisher, don’t try to find the book on the Pearson website. Instead, order the book from InformIT.com or MicrosoftPressStore.com or your favorite online retailer.
So I imagine any online retailer where you can preorder technical books would work. If you are not averse to doing so, it can probably be pre-ordered on Amazon in the coming months as well.
In Europe you could, for instance, order from Blackwell's; they are an independent bookseller and fairly cheap for English-language books. Even better would be to use a local independent bookseller in your own country, of course.
Any way to get this on the Internet as a web page? It's ironic how this book covers some of the most advanced technologies in our age, yet gatekeeps it all behind plain old printed pages or a "watermarked ebook."
Lol "the hidden language of computer hardware and software" imagine calling Chinese a "hidden language" on an intro text book just because it's not immediately obvious how to speak it.
[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
[2] https://www.amazon.com/Elements-Computing-Systems-second-Pri...