I wonder what the fate of this ambitious plan is, given that there are only two commits: the first in 2016, and then another one (converting from txt to markdown). Also, I found this [1] the funniest way of reaching out (a comment on a commit).
Did he actually do that? Is it available somewhere? If so, it would be more appropriate to link to that.
This seems a bit odd. This is already a fairly tedious hardware project to debug, and when the goal is learning the basics, building a much simpler circuit would provide more insight.
It's also a bit questionable why building hardware should be part of a full stack digital systems course. It's very good knowledge for certain, but seems like a sidetrack to me.
For the hardware to software part, anyone interested in some "passive" learning of these fundamentals might consider one of two excellent games I've played:
- Turing Complete
- Shenzhen I/O
I'm sure they both work on Windows and Linux, and likely Mac as well. They are fun dynamic puzzles that help build a framework for understanding software-hardware control for those with no experience.
I'm not sure if Shenzhen I/O is a good recommendation: as with all Zachtronics games, it sooner or later becomes very puzzle-like and gets difficult very fast.
Turing Complete, on the other hand, is focused on the exploration/learning experience and, for example, doesn't show leaderboards until very late in the game.
Having played both: Shenzhen I/O is great, but Turing Complete has a spot in my "best games ever" list.
Ambitious, but I like the idea. I feel like it would be too much to absorb for most juniors, but for your average senior like me with gaps, it may prove useful.
Geohot has a bit of an elitist mindset, despite not really being able to follow through himself. I'd say to anyone starting out in the field: don't get put off by this time frame, just keep at it and you will win.
>Hiring is hard, a lot of modern CS education is really bad, and it's hard to find people who understand the modern computer stack from first principles
Ever heard of Electrical and Computer Engineering degree programs? They cover everything from transistor physics, digital design (including HDLs), CPU microarchitecture, embedded systems, operating systems, programming, databases, web development, mobile applications, and machine learning. Heck, they also cover the engineering maths fundamentals for your DSP, control systems, and wireless systems courses. It's like drinking from a fire hose, but it's a very comprehensive, interesting, and relevant degree program completed in a four-year duration. Both Lisa Su and Jensen Huang completed their degrees in this engineering field.
A friend of mine has been designing rear-view mirrors for Porsche for about 15 years, and now he's seen as some sort of rear-view mirror expert and can't find anything else. It's hilarious.
I hope you've brushed up on the year 3 curriculum before going to fix that garage door motor, so you can answer some random medium/hard questions from Finite Element Analysis in Solid Mechanics or Vehicle Powertrain, Noise and Vibration.
I was going to say just that. I think we've had this debate for a long time (at least 25+ years); unfortunately CS won and CE lost. If only Computer Engineering programs could offer an additional year of material for all the extra CS stuff.
Big Tech also has a preference for CS, since a lot of them came from a CS background themselves.
What are the numbers after the languages supposed to be? Hours? Minutes? 10 hours for blinking an LED seems far too long, while 2500 minutes for an OS seems far too short.
Rarely have I seen such garbage getting so many github stars. It seems to be all signaling of how clever the guy who wrote it is, without any actual substance to back it up. There is literally no way for anyone to get anything meaningful out of such an information dump. Even if you could memorize all the information, internalizing it takes years.
It's pretty much the reverse of Peter Norvig's best article, Teach Yourself Programming in Ten Years [1]. If you haven't read it, I'd recommend you do, to cleanse your palate.
I agree with the "no substance" part. However, I think such a course is not for beginners; rather, it's for experienced programmers who already have an idea of how computers work and want to fill in some gaps so that they have the whole picture.
These 12 weeks can be part of that "10 years" progression. It addresses the "computer" part, and adds a language that isn't mentioned to the list: Verilog, a hardware description language.
As for the "guy", he is George Hotz, a controversial figure, who likes to think of himself as very clever indeed, but going up and down the abstraction ladder is, I think, one of his strengths. So I won't dismiss his article too quickly.
Section 3, building an ARM7 CPU, is in my view the best example of this: building a pipelined processor is exactly the sort of thing that sounds impressive, but in truth teaches you exceedingly little for the effort it requires.
* Architecture-wise, implementing a core in Verilog versus simulating it in a higher-level language is nearly the same from a learning perspective; there really aren't any special insights from hardware when just starting out. Doing this in Verilog just makes the entire process more obtuse, because it's hardware.
* Microarchitecture-wise, a simple pipelined processor teaches you basically nothing about how a modern processor (i.e. superscalar, out-of-order) functions from the perspective of someone higher on the stack. If you were using it as a springboard to writing an OoO superscalar processor, it would be far better to be given a well-designed multistage core and have the assignment be to add pipelining. That avoids the student implementing the multistage core of the processor, which is really mostly straightforward implementation of simple but large state machines, and instead focuses purely on the pitfalls of pipelining, while also giving a model of good Verilog/processor writing practices.
Another thing which is sort of weird is worrying about the type of RAM. That's getting way into the weeds of processor design and teaches you nothing except stuff like "how to interface with dual-port versus single-port RAM blocks". And given the time constraints, it makes more sense to just simulate memory anyway.
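To make the "simulate it in a higher-level language" point concrete, here's a minimal sketch of a fetch-decode-execute loop in Python. The three-instruction ISA is entirely made up for illustration (nothing to do with ARM7), but it exercises the same architectural ideas (registers, a program counter, branching) with none of the HDL overhead:

```python
# Toy instruction-level simulator: a hypothetical 3-instruction ISA.
# Instructions are tuples: ("li", rd, imm), ("add", rd, ra, rb),
# ("beq", ra, rb, target).
def run(program, steps=100):
    regs = [0] * 4          # four general-purpose registers
    pc = 0                  # program counter
    for _ in range(steps):  # bounded so bad programs can't loop forever
        if pc >= len(program):
            break
        op, *args = program[pc]
        pc += 1
        if op == "li":      # load immediate into rd
            rd, imm = args
            regs[rd] = imm
        elif op == "add":   # rd = ra + rb
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        elif op == "beq":   # branch to target if ra == rb
            ra, rb, target = args
            if regs[ra] == regs[rb]:
                pc = target
    return regs

# Compute 1 + 2 into r2.
prog = [("li", 0, 1), ("li", 1, 2), ("add", 2, 0, 1)]
print(run(prog))  # -> [1, 2, 3, 0]
```

Everything an intro course cares about (fetch, decode, execute, control flow) is visible in a few dozen lines, which is roughly the argument for doing the first pass in software rather than Verilog.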
When I, an educator who has an understanding of the topics, read the outline, I immediately felt sorry for the people who would be part of that curriculum.
This outline looks smart to people who have already dealt with all or most of the topics involved, but it would be soul-crushing to most students. That said, this is mostly an issue of how the material is organized and which parts are emphasized in which way.
In a way it is hard to pinpoint what is wrong with it. It is at once too niche and too broad, in a weird way.
If we want to let people decide on their learning journey, I think something like this is even more harmful, since it creates an unrealistic standard. It's the equivalent of an Instagram lifestyle for software/hardware engineering.
I've met many world class developers, and I'm sure that even after years of experience almost none could handle such a course even if dedicating themselves full time to it. Building a C compiler in Haskell is just one part of a 3-week section, and if you don't know Haskell you're really dead if you have to pick it up on the fly. It feels like an A/B test, to see how much bullshit can pass, or maybe the guy really is delusional, I don't really know.
A positive spin: people are so excited to learn this stuff that a syllabus got thousands of GitHub stars. I think a lot of his work is great in pedagogical form. I often learn more about ML by reading tinygrad, even if I've never used it in production. We should be celebrating this guy's efforts.
[1] https://github.com/geohot/fromthetransistor/commit/bc3e63e2a...