Computer Science Curriculum in 1000 YouTube Videos (laconicml.com)
341 points by nixass 4 days ago | 133 comments





This is the eternal debate, but man, I wouldn’t start somebody off with a C programming course as #1. Especially if there’s a Python course coming up shortly anyway. And then a C++ course to learn “OOP”, which is much easier to learn in Python. Ouch.

I suggest just starting with Python, learning C at some point because it’s universal, and never learning C++ without a specific reason.


I started with C, and if you don't dive too deep into it, you can get a pretty good concept of program structure, what a function is, what a pointer is, and all sorts of concepts that carry over easily to other languages.

I'm grateful to have learned on a strongly typed language first. The thing about students is that sometimes they're easily demotivated. My university switched to Python after having taught C first; after learning Python there were a lot of student complaints asking why C was so complicated and why they needed static typing, and it wasn't natural for them to think about compiling as a normal step in software development.


This sounds like survivorship bias. I'm willing to bet that if someone such as yourself (who is motivated to learn C) started with a weak/duck-typed language, you would have maintained motivation and later moved on to something like C, feeling grateful to be learning C. I.e., who you fundamentally are is not altered by the language you first start learning.

Now take the reverse. If someone who is not motivated to learn C started out in Python, they're more likely to stay motivated, perhaps never learning strong typing. If you start them on C, they lose interest and the field loses a valuable addition.

Not everybody has to learn C or static typing or pointers. It's okay for people to have a narrow programming/comp sci scope and still enjoy the field.


I learned C as my first language and suggest it as a first language. And I think it is a relatively small language (discounting all the different details open to implementation not covered in the spec).

One thing I would miss as a beginner is a REPL. That is how I later learned bash: with constant feedback, constantly tweaking the input code until the computer stops swearing at you. It was also a really fun way to learn a language, one small piece at a time. Other than that, I do not consider C to be a particularly hard language.

But if your goal is to learn a language with modern OOP design, then you should go for Python or Ruby IMHO. They are much better than Java (where even to do anything you have to create a class with a main method and so many modifiers). I think that would be the reason to learn Python: structuring code in C is archaic, or at least different from how other mainstream languages do it, and can be a hurdle for newbies who want to scale up their programs.

I think the good thing about Python is that it is simple and has mainstream features that translate to various other mainstream languages. But whether I should care about that as a newbie is an entirely different thing.

Edit : Added a comment regarding OOP


Why would they lose motivation with C? It's really not that difficult. At least not compared to something like Haskell, or even something like Python at times (are dictionary comprehensions actually that easy from a syntax POV? What about how copying a dictionary isn't an element-by-element copy, but a reference/ptr copy?)

Even with undefined behavior (which a newbie won't really have to care about at the beginning, IMO), C is relatively simple from a flow and structural point of view. If you are saying you can do cooler things in Python than C at the beginning, that's probably true, but it's unclear how many people drop CS because the language they are using doesn't enable easily making cool things.
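
To make the dict point concrete, here is a minimal Python sketch (the variable names are made up):

    prices = {"apple": 1, "pear": 2}
    alias = prices            # assignment copies the reference, not the entries
    alias["apple"] = 99
    print(prices["apple"])    # 99: both names point at the same dict

    snapshot = prices.copy()  # a shallow copy: a distinct dict object
    snapshot["apple"] = 1
    print(prices["apple"])    # still 99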


> Why would they lose motivation with C? It's really not that difficult.

This is the confusion right here. You're underestimating what is difficult to other people.

As a friendly internet stranger with modest leadership experience, I would very gently suggest being cautious of this pitfall. When we think like this, even if it's just coming from a place of logic, we can very easily send the wrong signal. We can make people feel stupid, and leave them wondering why we're being critical of them.


I think it helps motivate beginners if you pick a stack that allows them to create a modern UI, even if the functionality is something as useless as a vuvuzela app.

You probably know this, but correcting for anyone reading. C is generally considered statically but *weakly* typed.

No, it isn't. There isn't consensus on what weakly/strongly typed actually means, but C is strongly typed (with some unfortunate loopholes) by most definitions.

> I'm grateful to have learned on a strongly typed language first

Python is strongly typed.
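
A quick sketch of what "strongly typed" means here (Python refuses to mix unrelated types silently):

    print(1 + 1.0)  # 2.0: int-to-float promotion is a defined language rule
    print("1" + 1)  # TypeError: str and int are never implicitly coerced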


I always feel like people who suggest Python over C as the de facto thing to start with are probably people who started with something like C over something like Python themselves, but forget all about how all that experience and understanding helped them make sense of Python in an instant, and thus consider the 'easy' language a no-brainer.

To an absolute beginner, however, Python is not only an incredibly complex system (except for extremely trivial things), but one that goes out of its way to hide its internals; it cripples users for years before they can figure out what's actually going on under the hood.
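
One small example of that hidden machinery, assuming a CPython interactive session (this is an implementation detail, not a language guarantee):

    >>> a = 256
    >>> b = 256
    >>> a is b    # CPython caches small integers (-5 through 256)
    True
    >>> a = 257
    >>> b = 257
    >>> a is b    # two separate 257 objects were allocated
    False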

I'm fully in the "start with C, move to Python" camp. The lessons just need to not suck, which I concede may be easier to do with Python ... but it should still be C.


> it cripples users for years before they can figure out what's actually going on under the hood.

It doesn't. Python hides the unimportant aspects of the computer; namely, it abstracts away memory allocation and resource management. Under Python, you treat the computer as an abstract machine with unlimited memory and don't worry about its use. You focus on algorithmic understanding and fundamental data structures.

Once you've grasped the computer science fundamentals, then you tear open the box and learn the underlying implementation of a computer, such as memory allocation, assembly, etc. (ostensibly, by making your own operating system using C, and/or writing a compiler using C).

You can replace Python with Haskell, which IMHO does an even better job as a first language.


To be fair, Python has many robust features beyond GC. For example, dicts, which are bread and butter for Python, abstract away thousands of lines of C, which leads to very confused students when they first see errors like 'unhashable type: list'. Encountering leaky abstractions always made me uncomfortable when I took my first classes, but for a very first class it's still probably better than dealing with memory allocation in C, depending on what one expects from an intro class.
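
For anyone who hasn't hit that error, a minimal reproduction:

    d = {}
    d[(1, 2)] = "ok"    # tuples are immutable, hence hashable
    d[[1, 2]] = "boom"  # TypeError: unhashable type: 'list'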

But when you start using dicts at the beginning, you would be learning about hash maps and what hashing is, etc. This is basically computer science 101 stuff, and Python makes a much easier environment to learn it in.

> haskell

Sure, sure - monads and all... I wonder if there is even a beginner programmer’s learning material that uses Haskell.


I have TA'd both intro courses using C and Python as the first language. Python is IMHO better as the first language.

I respect that. I have similar experience but with the reverse conclusion.

I think this depends on the student. I do not have a CS degree. I dabbled in Python, but the first programming course that I took was Harvard's CS50 MOOC. I really appreciated learning C because it taught me about the underlying system through concepts like memory management. I agree that Python is friendlier, but I feel like C trains a particular mindset. Python may be more suitable for those who want to get off the ground quickly.

The thing about languages like C is that they will teach students to do many things by rote that they aren’t equipped to understand until later. I’m talking about things like includes, using a compiler, << and other facets of basic I/O, defining a main function.

Python allows students to hit the ground running much faster. Once they learn the very basics, then it can make sense to introduce something like C, but not require them to temporarily ignore magic incantations with the promise they’ll understand them later.


When teaching new students Java, the first step is "type these exact lines character for character, you won't understand what they mean for the next year, but you need to type them at the start of every new program.", which can be a little rough.

that's a fault with the teaching method, not the language.

You have to create a class with a main method just to start. And you need to know what access specifiers, classes, and static methods are. Of course, you can just ignore them as something you have to do, but I think by design OOP is not an additional requirement but the starting point in Java.

But then it's not like you have to use them; in that case, why start with a complex language as a beginner anyway? It will either be too many details or too much magic to remember.

(Digression: in my beginner days I was using C as a glorified assembly language, with control flow redirection only through (wait for it) goto. Even after it grew to 700 lines. Only while doing a Java port did I get used to using methods, because Java does not have goto.)


Not anymore! With Java 11 you can (finally) have top level code immediately execute as if it was run in main method. Only took 25 years.

Well, I am not sure about Java 11. It took 25 years, yes, and the organization I work for has just entered the Java 8 world completely. It was originally written in Java 7 and it took five long years to completely transition to Java 8. Even then (Java has a REPL now!!!), I am skeptical of the language in terms of beginner friendliness.

So, if you are going to read some code out in the world, you should be familiar with the archaic notations anyway. Even the frameworks, which the Java shops in my neighborhood are mostly about, have intro docs full of terms from almost 15 years ago and assume readers are aware of them. The thing that really bothers me (not that it is bad in an objective way) is how much the old Java platform has OOP encoded into its DNA. And it is great at it.

It is a language meant to be used by people who are familiar with writing such code. The FP is really enjoyable, but kind of ugly; it seems retrofitted onto the OOP. In general, I like opinionated, crisp languages, but Java is getting away from that. In a nutshell, it is disorienting for me nowadays.


No it's not. The language _requires_ those incantations, but explaining it all to a beginner requires a lot of time and effort which, to the student, doesn't look like it has any payoff (until they have internalized it completely and understand its meaning).

It's the same as humans learning natural languages - you have specific grammar and rules you _just_ follow, without understanding their etymology or how they evolved to be this way (and their uses).


And Python requires its own boilerplate incantation. One is neither considerably less verbose than the other, nor any less cryptic. Both take exactly two lines. The Python incantation simply "feels" simpler to you because you, as a seasoned programmer, comprehend the 'intent' behind it as being simpler. But this is more likely to relate to our inability to perceive, from a point of ignorance, that which we already understand.

A good lesson could start by explaining what these do by way of dissection and for the purpose of orientation, without going down the rabbit hole.
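
For reference, the two-line Python incantation alluded to above is presumably the ifmain idiom:

    if __name__ == "__main__":
        main()  # only runs when the file is executed, not when imported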


This resonates with me. When I was an undergrad about 25 years ago most intro to programming classes were taught in C. There were a lot of concepts I struggled with at the time that now I find trivial. In retrospect, what I was really struggling with was some of the lower level gnarly bits C exposes to the programmer.

While it's good that I know how those things work, I did not need to understand them to just get a basic understanding of strings, data structures, etc. For instance it wasn't that I couldn't grok linked lists, I couldn't grok pointer manipulation at the time. But that still meant my code didn't work.


it's really a big-endian vs little-endian matter, if you think about it.

Starting with C means sitting down, thinking hard, and taking the time to understand how the basic model of a computer works, along with all the issues and annoyances that come with it.

Starting with Python really means leaving out the details and just getting into programming; maybe one day you'll wonder how stuff actually works underneath.


I TA'd an introductory computer architecture course (basically, start with a MOSFET transistor and end up with "here's how you write malloc/free"), which was students' first introduction to both assembly and C. That experience helps solidify that you really want students' first experience with programming to be a high-level language.

One of the problems with teaching is a phase-ordering problem: what order do you teach the concepts in? As GP notes, C starts you off with a lot of boilerplate that isn't going to become relevant until a month or two down the line. Concepts like functions, libraries, and the insanity that is the C preprocessor are distractions when your first challenge is just getting people to understand the mental model of how a procedural program works (i.e., understanding what x = 5 really means!). On top of that, C imposes the mental burden of understanding the stack versus the heap and the relevant memory management concerns. And then you get into C's string handling...

I suspect many of the people commenting here are in the class of people for whom programming was obvious and easy, and who never struggled with any of the concepts. This class of people will do fine with any language being used to introduce the concepts, because they're not really learning the concepts. Where the language matters most is for those for whom programming is not obvious, and who need help identifying which parts of the material are more fundamental than others. Giving these people a language that is more forgiving of mistakes is going to go a longer way toward democratizing CS education than insisting people start by learning how computers really work.

That isn't to say that this isn't something that should be taught pretty early on: far from it, I think a computer architecture class should be the second or third class you take on the topic. Delaying the introduction of C until computer architecture means that you can use C as a vehicle to introduce all the other parts about computers that are important, such as 2's complement arithmetic, ASCII, what the stack and heap are, etc.
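
As a taste, 2's complement can even be poked at from Python before students ever see C (a sketch; the mask takes an 8-bit view of the integer):

    x = -5
    bits = x & 0xFF                              # 251: the 8-bit pattern for -5
    print(bin(bits))                             # 0b11111011
    print(bits - 256 if bits >= 128 else bits)   # -5: decoding the pattern back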


Personally, I dislike the whole 're-learning' process. However, having started with Python, I had to re-learn quite a few concepts when I touched C ground. Of course, it was an 'A-Ha' moment, to put it positively.

You might say it really "took you on"!

Puts/gets negate the need for includes, and >> isn't in C.

Also Python has a horrific main method syntax and duck typing is ultra confusing early on.


The ifmain-style main can be omitted for beginners (there is no required syntax for main). Duck typing can be explained easily: we need something that has a foo method.
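
A minimal sketch of that explanation (the class and method names are made up):

    class Duck:
        def speak(self):
            return "quack"

    class Robot:
        def speak(self):
            return "beep"

    def greet(thing):
        # duck typing: all we need is *something* with a speak() method
        print(thing.speak())

    greet(Duck())   # quack
    greet(Robot())  # beep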

In C, you learn to change a letter to upper or lower case by finding its ASCII number and adding to it, or subtracting from it, a magic number.

In Python, you just use the .lower() or .upper() function.

One and done. Move onto your next problem.

In C, you hope not to trigger an array-index-out-of-bounds error resulting in a segmentation fault. Then, not understanding what you did wrong, you have to fire up the debugger to find your needle in the haystack.

Hours later, you’re just like, I just want to modify a string. Why is it so difficult?
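
The contrast, sketched in Python (ord/chr expose the same ASCII arithmetic C forces on you; 32 is the magic number between the cases):

    c = "a"
    print(chr(ord(c) - 32))  # 'A': the C-style approach, via ASCII arithmetic
    print(c.upper())         # 'A': the Python approach, one method call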


This is a silly argument. You have to do the exact same thing in Python if you don't want to use a standard method/function. There's nothing stopping you from using a standard function in C either.

Your rant should be about bad teaching of bad habits instead


What I’m trying to point out is that C, as a programming language, has a lot of compromised and poor design decisions.

Maybe it was because it was designed in the 1970s, when computers were more primitive. But better alternative languages were designed back then that didn’t have the failings of C.

But these designs permeate the language, so the errors keep coming up.


Totally agree you need to learn C before getting too far, but it can make intro programming more challenging than it needs to be. Since this is supposed to be an ML-targeted curriculum it seems like Python is a better start. However, as I said, it’s an eternal debate. At least we have choices!

You learn Python to get the basics: what variables are, what values are, what control flow is, what functions are, what modules are, and how it all fits together as code to perform tasks. Worrying about the memory and the underlying machine can come later. The biggest problem with Python as a beginning CS course is that it's not a great language to learn about types (meaning it's more difficult to learn the cluster of related concepts: expression, value, type, variable).

I guess I have different thoughts about what the basics are. I would advocate to start with assembly, but I’m biased because I started with assembly. Things like what is memory? What is a register? What is the Program Counter? What (really) is the call stack and how does it work in memory? Fetch-decode-execute? Jumps, conditional branches, loops. Address modes.

If you know these things, then you know how the computer actually works, and a lot of the mystery of C is behind you already.

Then you can start introducing stuff that’s built on top of it all, like types, if statements, pointers... build the building from the foundation up; don’t start on the second floor and tell the student not to go downstairs yet.


That's not a good way to teach someone to code and there's a reason why absolutely no curriculum on the planet does it that way. If you're teaching someone how to operate a car you start by showing them which pedal is the gas and which is the brake, not by explaining to them how a differential works. The pedagogical basics and the lowest level system primitives are not the same thing.

Bottom up vs top down. A lot of people benefit from a top down approach, especially those who haven't already decided they want to heavily invest in the subject.

I think C doesn't have to be the first language you learn but I do think it should come fairly early, because otherwise you don't have an appreciation for a lot of the things the computer is actually doing when you're writing your code. Try explaining the concept of "stack vs heap" to someone who's only done python programming, for example.

My brother went through a coding bootcamp that focused on Python and JS, and then he started to get into Golang on his own, and one day he asked me what the point of pointers is. We ended up down a rabbit hole that kind of blew his mind: stack vs heap, the structure of the call stack, manual memory management, etc.


It's the fundamental debate of "are you training to be a software engineer or a computer scientist?"

SWE's can get away without understanding many underlying concepts if they are going to be writing "business code", that is, handling strings, ints and interacting with databases and APIs.

Computer scientists, and those programming on embedded systems or designing systems/OSes/theory themselves, need to delve into the math, algebra, and lower-level understanding of computers.


I think it's the other way around - computer scientists can get away without understanding the underlying hardware, and do their research based on a basic/abstract machine. Because they are doing research on computer science - how different algorithms can be improved or new data structures (or ML techniques etc).

Software engineers are more likely to need to know the underlying machine and write code suitable for such - because they need to write working/production ready code. It's like applied science, vs pure research science.


Yeah, I think the same. If I started today, I would love to begin with Python, or even Java/Kotlin, as they better show the good things about OOP.

C is great for the bare-bones experience with memory, and as a lang that runs on everything.

The modern uses I could see for C++ are usually too specific for a random learner. I would only recommend it for people who want to work on game engines, simulations, networking, embedded systems, or migrations of legacy systems, and other edge cases.

But for learning, I think it's better to start with a modern lang, as you will be less distracted learning the quirks of a vintage language, and can use that time to learn in parallel a lot of concepts outside the lang, like algorithms, devops, soft skills with teams, prioritizing requirements, etc.


Out of curiosity, is there anything in particular that makes C++ preferable for simulations? I’ve been trying to get into the topic lately - for now I’ve stuck to python and Julia for the visualizations that are more familiar to me.

C++ is really nice for low-level OOP, and compared with C it adds generics and classes. Compared to Python, C++ is lower-level and can have significant performance gains with a well written program. With regard to Julia, if you write your code in "the right way", it is also compiled and low-level.

Makes sense. I can see some benefits in having more control over what happens with the memory, rather than trying to fit the model I have in mind into the datatypes Python provides.

Found this post [0] fitting: > The best thing you can do when dealing with lower-level languages like C and C++ is understand that things like lists, strings, integers, floating point numbers, objects, etc. do not actually exist. They are simply abstractions. There are only the CPU and memory.

[0]: https://news.ycombinator.com/item?id=8356400


Performance, I'd assume. Simulations can take a lot of number-crunching.

Until teachers understand that learning programming and learning a programming language can actually be done separately, the debate will go on. There's this mindset that by teaching c early on you expose students to a whole range of useful concepts. That may be true, but you also distract them with useless language idiosyncrasies that have nothing to do with programming, strictly speaking.

I think starting with a language with static typing would be better. With Python or Javascript you still need to know how the types work otherwise you can run into unexpected "defined behavior". Something like Golang makes a fair compromise between C and Python: it is statically typed, compiled, and has a garbage collector.

I think the way Harvard's CS50 course approaches this is really good. It used to be entirely in C but in more recent years they start with C for the first few weeks and then move on to Python and expose you to a bunch of other tools as well.

> Python

The computer science classic of all time (by Knuth) uses an assembly language to teach analysis of algorithms. Using Python (or Scheme, for that matter, as is done in SICP) will only take you so far. C is a good middle ground.


> will only take you so far

Sure, and that distance is far enough for 95% of developers these days. If they feel the need to go even further, they could learn C at a later time. No need to scare possible developers away right off the bat.


A programmer should not be scared by the (still) most popular programming language. What's there to be scared by, anyway? For learning, C is just like Algol, very straightforward. Well, there's pointers: very useful for linked lists and such.

Start with assembly. And yes I’m absolutely serious.

Not “real” assembly, a simplified one that operates directly on memory without registers.


That's C with pointers.

Assembly without registers is absolutely pointless, since the only thing that matters in any flavor of assembly is how to get data into registers when you need it and then move it out before it gets overwritten.


The point isn’t to learn assembly, it’s to learn that all programs are just a series of instructions, operating in order, transforming memory. The loading and storing of registers is an implementation detail that’s not relevant in this context.

The advantage over C is having an even more limited instruction set and using descriptive names for each instruction.


They don't transform memory; they load memory into registers, transform the registers, and (maybe) store the register values back into memory. The difference is profound, and until you code in assembly you have no idea why (or that) string operations are so expensive.

Again, the point is not to have an real language for any efficient real world usage. The point is to have a simplified language that represents a series of commands operating on a bank of memory. Like this:

    SET 1 100  # Set memory[1] = 100
    SET 2 500  # Set memory[2] = 500
    ADD 3 1 2  # Set memory[3] = memory[1] + memory[2]
It's easy to explain what each individual step is meant to do and you can build up to more complicated instructions. It allows children to see the persistent effects of running the program, mutating the state of the memory, eventually introducing the instruction counter itself as a state variable.
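
As a sketch of how small such a machine can be, here is a hypothetical Python interpreter for that toy instruction set, with the instruction counter as just another piece of state:

    memory = [0] * 16

    program = [
        ("SET", 1, 100),   # memory[1] = 100
        ("SET", 2, 500),   # memory[2] = 500
        ("ADD", 3, 1, 2),  # memory[3] = memory[1] + memory[2]
    ]

    pc = 0  # the instruction counter, itself just a state variable
    while pc < len(program):
        op, *args = program[pc]
        if op == "SET":
            dest, value = args
            memory[dest] = value
        elif op == "ADD":
            dest, a, b = args
            memory[dest] = memory[a] + memory[b]
        pc += 1

    print(memory[3])  # 600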

> eventually introducing the instruction counter itself as a state variable.

If you're going to be talking about the IP register then you need to talk about the IR register. So we should avoid talking about registers by talking about registers?

You're not selling this registerless assembly very well.


It doesn't have to be a register. Just a labeled piece of memory.

It works great with children, and you don't even need a computer. You can do it all with pen and paper.


I wholeheartedly agree. But I think it’s just a matter of preference for top-down or bottom-up.

Perhaps the C course is supposed to fill this role, but I'd vote for including a course in assembler/assembly. Not because many students would use it, but because it teaches how a CPU actually functions without abstraction. I've used it only once, and even for that, I was mainly transcribing from a textbook. But understanding assembly was key for me to solve many computing problems over my career.

In other programming activities, understanding computing at the assembly level enabled me to realize that the bug I was chasing was actually in a library or, god forbid, the compiler.


I absolutely agree that asm should be taught for computer fundamentals, but I really LOL'd at assembly teaching how a CPU works without abstraction. There is tons of abstraction below assembly -- see microcode. It's this mistake in thinking that lets things like Spectre/Meltdown sit in computers for decades. But for the most part you're right: going down past assembly has not helped me understand __software__.

There are at least three levels of indirection between memory and registers and there is no language that allows you to access that directly.

It lets you see more of the stack but not all of it.

Before I coded in assembly for example I had no idea why string operations in C were so lacking compared to everything else. When you realize that you are literally dealing with bytes in an array of variable length the limitations make a lot more sense.


You also need to know how the machine actually executes the code. It's not that complicated, but it trains your intuition for where the real optimisations are.

C, and assembly as we know it on most architectures, is a fairly high-level abstraction compared to what most CPU cores are actually doing these days.


When I was in school, that course was called Computer Organization.

We have made computer science so complicated for ourselves. It may be evolution I guess.

On one hand (1) we have people who write OSes, compilers, etc. On the other (2) we have folks who use them and build applications for businesses. School teaches us for (1). We learn (2) ourselves at work. I wish schools taught us (2) as well, along with all the necessary soft skills to deal with it.


In Germany, we have two different institutions: universities (identical to a US university, I guess) and something called a Fachhochschule (I don't know if there even is a correct translation for that; let's call it FH).

University is focused on covering the theoretical aspects of computers, with a lesser focus on practical applications - although they are taught to some extent - while FH is more focused on building applications and real-life systems, with a smaller focus on theoretical aspects.

The only problem is that FH has long been considered second class and everyone wants to attend university, but in my opinion, and based on my experience, many people would be a better fit for FH.

For someone deciding whether to attend Uni or FH, it is often not easy because you cannot guess what you are going to do in your work-life and whether you might enjoy research or you would rather go into industry directly.

Aside from that though, I think it is great that we have different institutions for different audiences with differing goals. Does something like that exist in the United States or somewhere else in the world?


Well, the equivalent for Fachhochschule in the United States are called Trade Schools. But trade schools typically teach things like Aircraft mechanic, train engineer, welding, etc. While I'm sure there are some trade schools that teach programming, typically you would go to a more dedicated place like a coding bootcamp.

Often universities will have separate computer science and software engineering degrees. The first is more academically oriented (for those who want to get a PhD in computer science) and the other for people who want a job as a developer.


> the equivalent for Fachhochschule in the United States are called Trade Schools

This is not true. Students can get accredited Master's degrees at a University of Applied Science. Some of the Universities even do research.

https://en.m.wikipedia.org/wiki/Fachhochschule

I'm not aware of any trade school in the US where students can get higher credentials than a certificate or associates degree.


I think a better analogy in the US would be universities with an academic focus vs universities with a practical focus. It's not a codified difference and the degrees you get are the same, but it exists somewhat informally.

As an example, in San Diego, the two top universities are UCSD and SDSU. UCSD is much more prestigious but SDSU teaches more practical skills, for example while SDSU biology students have 3 years of laboratory experience, UCSD students will have 2/3rds of one year of laboratory experience. Actually labs will not even consider UCSD students when hiring because they have so little practical knowledge. UCSD students are expected to go on to get professional degrees or higher academic degrees.

In general, in California, "University of California" schools are known to be academic while "California State Universities" are known to be practical. This was somewhat codified in the California Master Plan for Higher Education, where the top eighth of students would go to the UC schools and the top third would go to CSU schools[0].

I don't know if other states are the same.

[0]: https://en.wikipedia.org/wiki/California_Master_Plan_for_Hig...


Maybe the word "equivalent" isn't a perfect fit, but I wouldn't go so far as to say it's "not true." There's some overlap in the concept, if not an exact correlation. I attended a university that had a strong focus on practical applications to the point that it was often tongue-in-cheek referred to as a "glorified trade school."

Edit: what koube said. I went to a CSU school.


World Education Services (and others) evaluate foreign degrees and determine the US equivalent, the Central Office for Foreign Education in Germany does the same for US degrees and determines the German equivalent.

The equivalence of credentials matters, for example, once you get into the realm of government jobs with strict requirements, or getting accepted into Master's or PhD programs.


As a US-based software developer, the concept of a credential that anyone cares about being issued by a school is a strange one. :D

FH == Community College / 2 year schools?

It doesn't appear to be.

From Wikipedia, "There is another type of university in Germany: the Fachhochschulen (Universities of Applied Sciences), which offer mostly the same degrees as Universitäten, but often concentrate on applied science (as the English name suggests) and usually have no power to award PhD-level degrees, at least not in their own right... The FH Diploma is roughly equivalent to a bachelor's degree."[0]

There doesn't seem to be a community college equivalent in continental Europe [1].

[0] https://en.wikipedia.org/wiki/Education_in_Germany

[1] https://en.wikipedia.org/wiki/Community_college


Hmm, in my view writing operating systems and compilers is also pretty far on the side of "applied CS", right around building websites and smartphone apps. But this doesn't strike me as a problem unless people are really confused and choosing CS programs in order to get experience building websites. From what I saw when I was in college a decade ago, even most of the smaller schools were starting to have special "Software Engineering" tracks since they recognized the growing popularity (or at least prestige) of programming jobs. I chose CS because I was fascinated with the actual science and math, and I wouldn't recommend anyone with the same fascination skip theory courses.

Yeah, I think CS is for (1), though, and the group in (2) can do a business IT degree or something like that in a different department.

I think it is a bit reductionist to say (2) can "do a business IT degree or something". Taught CS today - at least at UK universities - is a mix of discrete mathematics, electrical engineering, and software engineering.

I think it comes down to the age old debate as to whether a degree is supposed to prepare a student to roll into an industry job or whether it is supposed to be an introduction to academia. An academic purist might argue that these should all be separate degrees because they are different intellectual pursuits, while a company would likely much prefer to take on a graduate with some understanding of all of the above.


Keep in mind that it's almost impossible to teach everyone everything.

The people who write compilers still have to learn a lot on the job because writing a good compiler is a lot harder than writing a basic one. It's turtles all the way down.

Ultimately I think teaching you how to write a compiler will put you in much better stead than teaching you just how to use one.


(1) safely gets you to a $200k salary, though. (2) is much more risky to practice.

I think this is where coding bootcamps are winning...

Some schools distinguish between computer science and computer engineering.

a good CS program teaches you that everything is exactly the same

How do we make education more modular? The hope is that a free and open-source self-directed system could be suitable for (and thus compassionate/kind to) beginners of all kinds and at different skill levels; that it allows many entry and exit points, sort of like a branching of interconnected plateaus and islands. Stepping stones from everywhere to everywhere. Rhizomatic. Cross-connected topics with illustrated and detailed guides.

It would be beautiful if, as well as recording your progress, the system also enabled you to contribute back to the curriculum, allowing you to create a better-suited lesson (an alternative) for the next learner, who now has a choice of two lessons, the new lesson complementing the original. Or it could add a step that was missing. I think some distributed application framework like holochain or DAT would be perfect to build this on.

The closest I've seen to a modular education system is Ryan Carson's TreeHouse, where there are 'tracks', topics, and courses - which are all made up of modular components that are available and reorganized based on the learner's interests/wants/goals.


Beautiful idea.

This is a nice first cut at a curriculum in that it captures a lot of important topics, but it's far too blocky. For example, I'm glad they put computer architecture in the first year, but I don't think there's any reason to go through the entire set of Carnegie Mellon lessons on computer architecture for a computer science curriculum.

I think a more subtle approach would be to introduce the basics: say, for computer architecture, a simplified CPU model with an arithmetic logic unit, registers, and flags, so you have a feel for what your programs are running on. Then get back to programming concepts. Go deeper into computer architecture later.

If you really want a curriculum, you have to take the broad topics, present the introductory lectures, then create sections for more specialized topics with links to lectures that go into more depth, and maybe add prerequisites to those sections.

update for clarity.


This website is a joke, made by a recent graduate (23 year old guy): https://laconicml.com/author/laconicml/

Really, dude, you got an undergrad degree in CS and you will teach us all these topics? I am exhausted by the available curricula that would make me a computer scientist, data scientist, machine learning guru, Python ninja, and whatever else. All that by watching videos and attending online programs, mostly made by people who have zero content credibility yet are able to put together a bootstrap website and search YouTube for videos.

Please make it stop.


Well, you could learn much of modern theoretical physics (some topics several times over) from just 136 video lectures:

https://www.youtube.com/playlist?list=PLQrxduI9Pds1fm91Dmn8x...


Interesting tone of discussion happening in here compared to the same post on reddit.com/r/programming. The comments on Reddit seem to be geared toward how this isn’t a replacement for a CS degree, with a hint of elitism. I mean, I agree this is not a replacement for a CS degree, but they all seem combative and hostile towards people wanting to learn CS concepts via alternative means.

I’m self taught, but don’t generally recommend it. It’s really hard work, difficult to get right and I need to make more decisions, which is taxing. It also never stops. Still learning and planning to learn as long as I can.

There are upsides though: I learn at my own pace, I can dive more deeply into a topic for a while, I can choose topics pragmatically, and I choose how to learn and test.

Now this curriculum: I have a different approach. I like watching videos/lectures, but more as a supporting tool, when I’m tired. I much prefer books, papers and experimentation. Also I feel like the curriculum is all over the place. I agree with the math part mostly but the rest seems a bit like a hodgepodge.


I find there's a similar culture on a lot of subreddits. You're more likely to find criticism of existing ideas rather than discussion of new solutions. I think it is a function of a relatively young audience.

I always get a kick out of the goofy self-assuredness of 'computer science curriculums'. You must learn calculus and discrete math...oh, and CSS.

Along similar lines, this list links to the best MOOC for each course in a core CS curriculum:

https://github.com/ossu/computer-science


I noticed that this doesn't include a compilers course. I've seen others recommend the Stanford Compilers course on EDX[0] but it looks like audit access is going away this month. Does anyone have recommendations for free, quality compilers courses?

[0] https://www.edx.org/course/compilers


I really like this idea, just not for CS, but for any topic.

Take a subject matter expert and let them curate a set of free videos to help anyone learn.


I guess this is the evolution of "awesome" GitHub repositories. Instead of just dumping every matching link into a README, now a short description is scraped and saved along with the links.

Not saying this page has bad content, but there's at least one such "curated" list on the front page of HN, and I wonder how much of it is for the audience-building / Twitter-follower-building "profession".


I will need the Matrix capability to load them into my brain. Otherwise, the time it takes to watch all the videos is not something I can invest.

I watch all YouTube videos at 2-3x speed. Helps me a lot. I also watch a lot of TV shows at 1.5-2x speed.

I find that it's more efficient to watch two or more videos at the same time. I also end up working on multiple assignments simultaneously.

I drive a semi truck. I find I can drive more miles if I drive to the shipper and receiver simultaneously. But it really screws up my Google maps timeline.

Why stop at one load of cargo, though?

People speak at a much slower rate than your brain is capable of processing.

Videos are a horrible format for learning. At least for me personally. Just give me a textbook.

Pretty great resource for those who didn't go to university, but wanted to.

I'd say one might learn Rust instead of C, and learn Python first instead of second...

Also IMO UML and those Software Engineering classes are not useful for Bay area styled tech companies.


Do people actually use UML in practice? It's not something I've ever really found useful for my own development. Maybe it's more of a communication tool than a design tool? Or am I being too hasty?

Yes but only pragmatically not dogmatically. Activity diagrams can be an excellent way to reason about and document some more complex processes imho.

I work mostly in enterprises, I see UML all the time, especially Use Case diagrams and Sequence diagrams.

in my experience sequence diagrams tend to betray the challenges of distributed computing.

Do you mind explaining it a bit more? I use them mostly to document the steps between systems; I don't see how distributed computing changes that.

I think it also works as a mental tool. It helps you think about the structure of your software.

Learning Rust without the context of what it solves compared with C seems a bit weird to me. C isn't easy to master, but it's a much smaller language than Rust and easier to get up and running.

Agreed. Ownership would be way too confusing to start out with.

> Also IMO UML and those Software Engineering classes are not useful for Bay area styled tech companies.

Out of interest, what do you base this opinion on?


I'm not the original poster, but I'll share my opinion. As an undergrad I took two semesters of software engineering at Cal Poly San Luis Obispo (CSC 308 and 309), and as a grad student I was a TA for a year-long senior-level software engineering sequence at UC Santa Cruz (CMPS 115, 116, and 117), which culminated in a capstone project. In these courses we were taught formal software engineering methodologies, with an emphasis on iterative development (at UCSC we taught our students Scrum). However, at the companies I've worked for, which include a mixture of traditional, conservative companies (such as an aerospace company, an "old-school" Silicon Valley enterprise giant, and a traditional Japanese IT company) and large Web giants (like Google and Facebook), the software engineering processes I've encountered in industry have been much more informal.

Granted, at both Cal Poly and UC Santa Cruz we learned very useful skills that are very important when writing production software, such as source control, code reviewing, coming up with effective testing strategies, and other things that I use in my career daily. However, we learned other things that I haven't encountered in industry yet, such as writing formal requirements documents and drawing UML diagrams.


Learning it ~15 yrs ago and then never using it again.

Anecdata.


UML is pretty retarded.

Hey look. Let’s make system documentation easy by drawing a bunch of stick figure people.

Stickman is a bank customer.

And he needs to

1. Open an account

2. Deposit funds

3. Withdraw funds

4. Close an account

Seriously?


I mean, this is from the time when Waterfall was a thing and the "best practice" was: let's design the entire system spec upfront and then build it...

The interviewer: “In Rust We Trust, of course... But, O Say, Can You C?”

Who says we don't live in amazing times? No excuse for anyone not having a top-notch education available for free.

A lot of people don't thrive with self-directed learning, and even more don't do well without being able to ask questions. There's no doubt that YouTube is an amazing resource but the fact it exists doesn't automatically mean everyone can learn well from it. Those who can are quite fortunate. Those who can't still need good teachers.

A lot of college/university course work is openly available on the internet. This includes syllabi, textbooks, class notes, tutorials, homework assignments and solutions.

If one wants to learn, for example, algorithms, one does not need to do this in a self-directed manner. Several learning paths already exist. If one is having trouble grasping a specific topic, sites like Stack Exchange, Stack Overflow, Reddit, Khan Academy, and YouTube exist for one to ask questions on. I have had many issues cleared up by reading through previously asked questions, including seeing the person who a theorem is named after answer the question. Is it quicker to ask a teacher a question? Yes, of course. What will lead to longer lasting learning? For me, it's figuring it out on my own.

Eventually, one will come to a point where there is no longer an answer key, a teacher to guide you, or a well-worn path to follow. What happens then? People need to learn on the job all the time. Learning how to formulate a question, find the resources to help answer that question, and determine whether what you have found answers the question are all key skills people need to learn. It helps to do this in a structured setting, but it can be learned through the internet.

If self-directed means some people are not good at working without deadlines, then they can form a habit. Set aside time and work through some of the above material.

Cal Newport has put out many articles, blog posts, books, and now a podcast on how to learn and be a good student. I'm sure there are many other resources, including the soft-question tags on some of the previously mentioned websites. Old "Ask HN" posts are also a phenomenal resource.

So, I think those who are not good at learning on their own really do have a tremendous amount of resources out there. Will some people, even after using all of the resources, still fail to learn? I'm sure some will. I think many people, myself included, aren't putting in enough effort to really find their limits, or to discover a process that works for them. Yes, some people can figure it out, and others will have guidance from teachers, friends, or family, but for the rest, many resources exist to learn the process on their own. At the end of the day, it's easier to go on Tik Tok than it is to crack a textbook and grind problems.


> it's easier to go on Tik Tok than it is to crack a textbook and grind problems.

Learning requires work. There's no shortcut. And people who don't want to put in the work will find endless excuses. Just like there's no shortcut to playing a guitar like Eddie Van Halen.


The point is that the door to higher education is not closed to anyone due to lack of money, connections, etc. All it takes is a working brain and motivation.

This seems to veer a bit close to the idea of "learning styles", which as far as we can tell don't actually exist.

Any tips on easily making an offline copy of these? I'm afraid by the time I started some of these would go dark.

Try this:

  lynx -dump -hiddenlinks=listonly https://laconicml.com/computer-science-curriculum-youtube-videos/ | grep youtube.com | perl -pe 's/.*http/http/' | youtube-dl --batch-file -

Thanks, this works! It fails when one video isn't found, but gets 60% of the way there. I'll see if I can exclude that one and get it to keep going
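
If the failure is youtube-dl aborting on the missing video, its --ignore-errors (-i) flag should let it skip that one and keep going; same pipeline, one extra flag:

    lynx -dump -hiddenlinks=listonly https://laconicml.com/computer-science-curriculum-youtube-videos/ | grep youtube.com | perl -pe 's/.*http/http/' | youtube-dl -i --batch-file -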

youtube-dl, perhaps? Not sure in what order the naming will go.

Woow, this is awesome man! Thanks a lot for sharing this!!!

tbh at that stage I might as well register for a degree and get the piece of paper


