
Maybe I'm not smart enough, but I think building a CPU from NAND gates, writing a simple OS for the machine, and writing a compiler for a C subset for it does a lot more to demystify computing. I tried twice to read SICP but I don't get the hype. I guess it's fine to not read it after all.

Disclaimer: I still consider nand2tetris too trivial for this demystification work. A full project on the scale of UTokyo's CPU project [1] is a lot better.

Disclaimer: I only completed the hardware part of nand2tetris, so I can't claim to have done the full CPU+OS+compiler stack. But I do find it a lot more interesting than SICP.

[1] https://www.is.s.u-tokyo.ac.jp/isnavi/practice01-01.html



The approach you describe is excellent, but it builds upwards from the physical foundations of computing.

SICP builds from the mathematical foundations of computing (sort of). The end game is to take students from zero through:

1. Understanding several styles of programming (though most often in terms of "what fundamental operations does this language provide?").

2. Being able to build an interpreter.

3. Being able to build a compiler.

For a two term intro course, it's extremely aggressive.

I encountered it much too late to benefit much—I'd already been working on compilers at Lisp companies for a couple of years. But as an introductory text, it's a bit like the Feynman Lectures are for physics: probably too hard for almost everyone. But the right people, at the right time, will really love it.
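As a taste of item 2 above, here's a minimal sketch (my own illustration, not from SICP) of an evaluator for tiny Lisp-style expressions, written in Python rather than Scheme:

    # A toy evaluator for Lisp-style expression trees: numbers,
    # variables, and compound forms like ("+", 1, ("*", "x", 3)).
    def evaluate(expr, env):
        if isinstance(expr, (int, float)):
            return expr                      # a literal number
        if isinstance(expr, str):
            return env[expr]                 # a variable lookup
        op, *args = expr                     # a compound form
        vals = [evaluate(a, env) for a in args]
        if op == "+":
            return sum(vals)
        if op == "*":
            result = 1
            for v in vals:
                result *= v
            return result
        raise ValueError("unknown operator: " + op)

    print(evaluate(("+", 1, ("*", "x", 3)), {"x": 4}))   # prints 13

SICP's real interpreter handles definitions, conditionals, and closures, but the recursive shape is the same.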


I forgot to mention that the last part of SICP is indeed very interesting, and that's the only part that I completed when studying the course in Python. Compiler theory is always interesting, and that's the part of the syllabus I'm looking forward to drilling into more deeply in the future.


That would explain why I never got SICP. I can’t follow the math.


In particular, while I can visualize code executing in my head, I cannot just look at a formula and do the same: the formula is too static.

Formulae are a beautiful top-down model of some narrow, idealized slice of reality.

Understanding is built bottom-up.

For example, the raw torment of differential equations could have been eased if the professor had said something like: "Your car going down the street is a differential equation. Your position is x, your speedometer velocity is dx/dt, and the accelerator pedal sets d/dt(dx/dt), i.e. d^2x/dt^2. We will now set about abstracting this past all casual understanding."
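A rough sketch of that framing in code (my own illustration): integrate the pedal setting to get the speedometer reading, and the speedometer reading to get the position.

    # The car as a differential equation, stepped forward with
    # crude Euler integration. a = d2x/dt2, v = dx/dt, x = position.
    dt = 0.1            # seconds per step
    x, v = 0.0, 0.0     # position (m) and velocity (m/s)
    a = 2.0             # fixed accelerator setting (m/s^2)

    for _ in range(100):    # simulate 10 seconds
        v += a * dt         # the pedal changes the speedometer
        x += v * dt         # the speedometer changes the position

    print("x = %.1f m, v = %.1f m/s" % (x, v))   # roughly 101 m, 20.0 m/s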


And then you learn that most differential equations are not explicitly solvable, and the world is actually coded in partial differential equations, anyway.


> In particular, while I can visualize code executing in my head, I cannot just look at a formula and do the same: the formula is too static.

The right lesson to learn from this isn't that math is too hard, it's that you're using the wrong tools. The usual target architecture of math (so to speak) is a rewrite engine, not a register machine. You should almost never be "executing" anything.
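A hedged illustration of "rewrite engine, not register machine" (my own toy, not a real computer algebra system): apply algebraic identities as rewrite rules until nothing changes, instead of executing anything.

    import re

    # A toy rewrite engine: simplify expressions by repeatedly
    # applying identities until a fixpoint, rather than "running" them.
    rules = [
        (r"\(([a-z])\+0\)", r"\1"),   # (x+0) -> x
        (r"\(([a-z])\*1\)", r"\1"),   # (x*1) -> x
        (r"\(([a-z])\*0\)", "0"),     # (x*0) -> 0
    ]

    expr = "((a*1)+(b*0))"
    changed = True
    while changed:
        changed = False
        for pattern, repl in rules:
            new = re.sub(pattern, repl, expr)
            if new != expr:
                expr, changed = new, True
    print(expr)   # -> "a"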


Don't sell yourself short. The traditional mathematical guild has put religionists to shame with their penchant for obfuscation and ambiguity. Computing science math on the other hand is rigorous, concrete, and accessible to anyone who can, well, write a program.


It’s not actually. I have short-term memory issues that make holding math in my head next to impossible. Things get transposed or forgotten mid-operation. The only way I can do math is to have everything written down, including formulas. Even then, I may not be able to figure out how to change both sides of an equation. I can’t convert miles per hour to time required to travel X miles, even with paper in front of me. (I wish I could. I love hard sci-fi.)

I don’t have these issues with programming because it doesn’t use my logical memory, it uses visual/spatial memory. I see function calls as a series of lines creating decision trees.

You may be surprised how little math matters in practice.


I think I know what you mean. In high school I struggled most with maths that required non-trivial algebraic operations, and more often than not I'd just make some mistake while working out the steps or the answer. Most people are able to get around the issue by holding more steps in their heads, but somehow I wasn't able to do it. I'd make some trivial mistake when copying the mathematical expressions and end up in a dead end. I wouldn't know whether it was a typographical mistake or whether I was using the wrong approach.

With programming my compiler tells me when I'm off base, and I get instant feedback with testing. I can save all my intermediate steps and approach the problem with trial and error if needed. I have muscle memory with Vim (it's become my cyborg short-term memory buffer). I often wonder how much better I would be at maths if I had a Wolfram-alpha cyborg device attached to my brain :P


Use vim with Maxima, calling it as if it were an interpreter by filtering the buffer through it with :%! (something like :%!maxima --very-quiet) instead of using a Maxima session. Map it to a key binding. Or better: set readline to vi-mode, and you'll use the Maxima prompt almost as easily as editing a file in vi.


> It’s not actually. I have short-term memory issues that make holding math in my head next to impossible.

Next to the vast plains of mathematics, all of us have inadequate short term memories. This is why we write everything down.

The goal of studying any piece of mathematics is to commit it to long term memory. Not in the sense of memorizing dates and times for history class, but in the sense of muscle memory.

As for “how little math matters in practice”, that view is all too common among programmers. It is not a view shared by engineers or scientists in any other field. Without math we wouldn’t have any of the tools and amenities of modern life. Even the Amish teach their children math.


This isn't what I was referring to. It's not storage but retrieval that's broken.

I can't work out how to convert miles per hour to minutes per mile, even on paper. My brain confuses and transposes the operations necessary to do formulas. I have difficulty working out where to put the variables. This can happen with the instructions in front of me.

There's a certain kind of logical/calculative abstractness I can't process without a lot of effort. A lot of math (and LISP syntax) seems to require this. Matrix operations and accounting don't. (Discrete math was fun!)
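For what it's worth, the conversion mentioned above reduces to a reciprocal; here it is as code rather than as a formula (my own sketch):

    # miles/hour -> minutes/mile is just 60 divided by the speed,
    # and the time for X miles is X divided by the speed, in hours.
    def minutes_per_mile(mph):
        return 60.0 / mph

    def minutes_to_travel(miles, mph):
        return 60.0 * miles / mph

    print(minutes_per_mile(30))        # 2.0 minutes per mile
    print(minutes_to_travel(10, 30))   # 20.0 minutes for 10 miles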


> As for “how little math matters in practice”, that view is all too common among programmers. It is not a view shared by engineers or scientists in any other field.

Phrases like this are a red flag. They're usually thinly veiled anti-intellectualism, promising to free you from the oppressive gate-keeping of centuries of built-up knowledge. In reality, it's a pop culture prison.

The same ideas are prominent in guitar, where you "don't need to understand theory or how to read sheet music." Which is true, because you can find chords that sound good together with enough time and patience. But it's also not the full truth, as it is obvious that more classically trained musicians tend to bring a lot more to the table from both a technique and a composition standpoint.


And very often, what appears to be sugary pop music was made by very musically educated people. You can't go very far with only power chords.


One example of the kind of song I think you're talking about is "That's What Love is For" [1] by Amy Grant. I once read someone describe it as being like a quaint little Michael Bolton song (can no longer find the link). But the chord progressions and the way it moves between keys are fairly sophisticated for pop music.

[1]: https://www.youtube.com/watch?v=uLVV2TaI4Wo


You might be surprised how deeply math is connected to what you're doing!

It sounds like you're doing visual reasoning with decision trees, which means you're doing computer science (understanding the execution of an algorithm) using the same intuitive geometric understanding that is extremely effective in computational complexity theory. Drawing the evaluation of a sequence of function calls, for example, you can see that it will take up a certain amount of space, possibly growing with n in two different dimensions, indicating n^2 growth by purely visual reasoning. It's been trendy in mathematics for a while to disparage geometric reasoning as non-rigorous (which, to be clear, it can be), and even to regard anything that isn't purely algebraic as non-mathematical, which is a shame. You can certainly do mathematical reasoning without it being algebraic, rigorous, formalized, or written down.
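As a small sketch of that visual style of reasoning (my own example): print the work of a nested computation as rows of marks, and the quadratic growth is visible directly.

    # Each row i does i+1 units of work; the printout is a triangle
    # whose area is about n*n/2 marks -- O(n^2) by eye.
    def show_work(n):
        for i in range(n):
            print("#" * (i + 1))

    show_work(6)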


If people don't understand the math they are supposedly skilled at using, then it is math that is failing them, not them.


No. Math is math. Your brain must adapt to it. Go play with PostScript and geometry, make some figures, do some trig, try to display some integration curve by hand. Then go back to Algebra. The dots in your brain will connect and you will be enlightened.


I had no idea that I also do this. The visual logic thing, that is. I also have short-term memory issues, but those seem to be improving with lion's mane and bacopa. Having a large high-resolution monitor also helps: the more I can fit on a screen, the easier it is to work.

Double edit: if you're interested in learning some math, geometry is basically algebra, visualized. Algebraic proofs didn't really click for me until I took a different math class in college on geometric algebra.


> Having a high resolution large monitor also helps, the more I can fit on a screen, the easier it is to work

One of the most important things I learned at the Tufte seminar I attended was to let the optic nerve do as much work as possible to free up the short term memory for the hard parts. Big screens or big printouts are a great tool.

> geometry is basically algebra, visualized

This isn't still part of basic math instruction? It was literally middle school math where I went to school! We were frequently tested on both the algebraic and geometric solutions for a given problem. That explains a lot really.


It was split out when I took it, it wasn't until I took some math history geometry class in college that it all made sense.


> I don’t have these issue with programming because it doesn’t use my logical memory, it uses visual/spatial memory

It pleases me to be able to tell you that your visual/spatial reasoning is isomorphic to the symbol pushing that you're calling math. You're literally doing the same thing, just seeing it differently. This was a major part of my mathematical education; I guess that's not taught anymore?


Mathematical notation is a human language that nobody wants to teach.

The use of seemingly random symbols confuses people because they think there must be a meaning or reading for each symbol, but the truth is that it's largely arbitrary, and nobody tells you this.

They don't even teach you the Greek alphabet even though it is essential in college/university. I wonder how many people are weirded out simply because they don't know the Greek alphabet.


> Computing science math ... is ... accessible to anyone who can, well, write a program.

it most definitely is not.

maybe it's the notation or the sudden appearance of it on the page, or something I don't know about, but math on a page becomes a literal wall that I can't overcome when reading.

I can explain any concept I understand well enough to teach others, via analogy, metaphor, or demonstrative example, including algorithms I use when I write software. I can't even understand software I write on a mathematical level; forget describing it or teaching it with math.

math is just out of reach for me. it has been since high school and it will be for the rest of my life.

if I have to describe an algorithm I am using in a program by using math, then it will never be described by me, except via the program source code.

if you try to share an algorithm with me and you use math to define it, you might as well just save your time because you are as likely to extinguish the Sun with a single eyedropper of water as you are likely to communicate with me in any way using math.

I loved math when I was young. I would spend time on weekends making up numbers and then calculating their square and cube roots on paper for fun. My high school math teacher hated calculus and he taught us all to hate math in toto when we were in that class. he sucked the enjoyment of math out of everyone effortlessly and completely.

I will not be rejoining the mathematically literate in this lifetime, and it is something that I desperately want to learn.

to say that math is simply accessible to any given person is just plain incorrect.


You’ve convinced yourself that you can never do it. It may interest you to know that the greatest mathematicians also feel that they are groping and may never accomplish what they are trying to do.

Two comments: notice that the equations written in physics textbooks always use single letters for variables, possibly with sub/superscripts.

This is to keep the notation compact. If they used long variable names as programmers do, the essence of the relation described would disappear in the clutter.

That’s why math notation is as compact as possible.

(APL is an example of a programming language that tries to emulate a mathematical style.)

Second, try to look for the shape of an equation and what it’s trying to tell you about the relationship between the variables.
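A hedged side-by-side of the compactness point, using code in place of notation (my own example): the same relation with single-letter names and with long names.

    # The same relation twice. With single letters, the shape
    # "E = m c^2" is visible at a glance; with long names, the
    # relation drowns in the identifiers.
    m, c = 2.0, 3.0e8
    E = m * c**2

    rest_mass = 2.0
    speed_of_light = 3.0e8
    total_energy = rest_mass * speed_of_light**2

    assert E == total_energy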


Noting that Olympic runners and the physically disabled both want to run faster doesn't help.

I get really angry when math types just drive by and claim everything is easy if you practice and have the correct mindset. Everyone has something they aren't good at, and that's OK IMHO.


I'm not a "maths type." My passion has always been for programming, not what people think of when you say "math." Of course the Curry-Howard correspondence tells us that programs are mathematical objects, but that's not what comes to mind for most people.

I never said it was easy, I said you can do it. If you're smart enough to write a computer program you definitely have the required cognitive horsepower. Now if you want to only do the things in life that present no difficulty then that's your choice, but for your sake I hope you choose to push your boundaries and grow.

In high school I was taught that if the algebra is hard to understand then look at the geometry, and if the geometry is hard to understand then look at the algebra. Some people are definitely better at one way than the other. Definitely learn where your strengths lie and use them, but don't assume that you're bad at something because you're bad at one of many ways of doing it.

And yeah, you probably don't have the raw brainpower of a von Neumann or a Gauss (I sure don't), but so what? I'll never be a competitive weightlifter but I still lift because I enjoy seeing what my body can do. And I'll never win a Fields medal, Turing award, or probably ever even get a paper published, but I still play around with predicate calculus and other discrete math because I want to see what my mind can do. And as a happy side effect having some physical strength makes me more useful to other people, and knowing some mathematical reasoning does too, by making me better at writing correct programs.


thank you.


You protest too much. You surely know it is not out of reach, but you would have to find your own way back into the joy of it and no-one else can tell you what to read or where to start.


it's out of reach.

easy for you does not equate to easy for me.


You wouldn't have gotten into math the first time if it was just easy; we like it because it's hard.


The process you’re describing is demystifying computers, not computing.

It’s the path of: “Here are basic building blocks, here’s the machine from them, here’s simple things you can do with it, here’s a bare metal language for it, here’s a more advanced language. OK, you’re ready for a course about doing something non-trivial.”

It’s a totally valid approach, but the approach of computing first goes: “Here’s data, here’s operations on data, here’s more advanced composition of those operations, here are some languages created out of those operations, we can finally do compelling stuff. OK, you’re ready for a course about the details of how machines enabled this process.”


And that's why you get computing graduates who are still baffled by concepts like pointers, arrays, and sequential execution of statements. They can cargo-cult their way through some basic coding exercises and they can talk about computers, without really understanding what's happening when you type "x = y + z;"
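A hedged way to see those steps for yourself (my illustration, using Python's dis module rather than C): even "x = y + z" decomposes into loads, an add, and a store.

    import dis

    def f(y, z):
        x = y + z
        return x

    # The exact opcode names vary by Python version, but the shape
    # is always: load y, load z, binary add, store x.
    dis.dis(f)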


I've spent considerable time studying both approaches (SICP-like and nand2tetris-like; I'd include a third, OS-centric approach as in MIT's xv6 courses, and among the SICP-adjacent ones I would also include programmes based on Forth and self-bootstrapping Smalltalk). One is the bottom-up approach from the CompSci point of view, the other from the ElEng point of view.

The general bottom-up approach is dead or moribund, as is the idea that an educated professional should know more or less all the relevant aspects of his or her discipline to a decent standard.


Petzold's Code book is also highly recommended as a bottom-up treatment, although it focuses more on hardware than software.


Yeah, that's also a good one, but I think whoever completes nand2tetris needs university-level textbook + lab material for the next level: a proper book on computer architecture, some reading on FPGAs and HDLs, and two more books, one on operating systems and one on compilers.


Nand2tetris is much more advanced than Petzold. But yes, there’s a lot more to learn after that.


I recently finished reading it and highly recommend it. I was unable to put it down and read half of it in one sitting. The title is indeed a bit misleading though; it's really about electronics and hardware.


> The title is indeed a bit misleading though, it's really about electronics and hardware.

That's what code is made of.


The software part of nand2tetris is quite a bit harder than the hardware part. I didn’t bother to buy the book for the first part, but I needed it for the second part.

SICP doesn’t really teach hardware, and it teaches you to build a compiler for a sophisticated functional language, rather than for the imperative toy Java of nand2tetris.

They are both important.



