I'm probably in the minority here, but I think that starting with the low level - bits, data representation/interpretation, and logic (which leads up to CPU operation) - is the best path to take when learning to program, because it dispels a lot of the "magic" and mystery behind computers that confuses beginners when their code doesn't work as they expect.
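
For instance, a few lines at a Python prompt (a quick sketch of my own, using only the standard struct module) make the point that a value is nothing but bits plus an agreed way of reading them:

    import struct

    # Pack the float 1.0 into its four IEEE-754 bytes, then reinterpret
    # those same bytes as an unsigned 32-bit integer. Same bits, two
    # meanings - no magic involved.
    bits = struct.pack('<f', 1.0)          # b'\x00\x00\x80?'
    as_int = struct.unpack('<I', bits)[0]  # 1065353216 == 0x3F800000
    print(f'{as_int:032b}')                # 00111111100000000000000000000000

Sign bit 0, exponent 01111111, mantissa all zeros: exactly the IEEE-754 encoding of 1.0, laid bare.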

The idea that computers are conceptually very simple and logical in operation, and that any surprising behaviour is really the result of interacting layers of complexity rather than an intrinsic property of the machine, is worth keeping firmly in mind whenever you program, or even just use, a computer.

I'm sympathetic to this approach, though it's not necessarily the one I'd pick anymore. I think there's real value in understanding systems at this level, and I find that it informs my everyday work. I'm not sure, however, that it's the best way to get somebody engaged and interested. There are certainly personalities that dig this sort of thing (mine does), but loads of people I've worked with or taught seem most engaged by the fast iteration cycle of something like a REPL. That makes me lean towards teaching with Ruby or Python (probably Ruby, as I think it has more to teach). I guess it's horses for courses.
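
To be concrete about that feedback loop, this is the sort of first session I'd put in front of someone (Python here, though irb gives much the same experience; the function is just a made-up example):

    >>> def triple(x):
    ...     return x * 3
    ...
    >>> triple(14)
    42
    >>> triple('ab')
    'ababab'

Type something, see the result immediately, stumble onto a small surprise like string repetition, and you're exploring rather than waiting on a compiler.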