I suggest just starting with Python, learning C at some point because it’s universal, and never learning C++ without a specific reason.
I'm grateful to have learned on a strongly typed language first. The thing about students is that they're sometimes easily demotivated. My university switched to teaching Python first instead of C; after learning Python, there were a lot of student complaints asking why C was so complicated and why they needed static typing, and it wasn't natural for them to think of compiling as a normal step in software development.
Now take the reverse. If someone who is not motivated to learn C started out in Python, they're more likely to stay motivated, perhaps never learning strong typing. If you start them on C, they lose interest and the field loses a valuable addition.
Not everybody has to learn C or static typing or pointers. It's okay for people to have a narrow programming/comp sci scope and still enjoy the field.
One thing I would miss as a beginner is a REPL. That is how I later learned bash: constant feedback, constantly tweaking the input code until the computer stops swearing at you. It was also a really fun way to learn a language, one small piece at a time. Other than that, I do not consider C to be a particularly hard language.
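That feedback loop is hard to convey in prose; as a sketch, here is the same "tweak until it works" cycle written out as a script, where each line could be one REPL round-trip:

```python
# Each line below could be one REPL round-trip: type it, see the result, tweak.
text = "tweak the input until the computer stops swearing at you"

words = text.split()
print(len(words))                               # first probe: 10 words

print(sorted(w for w in words if len(w) > 4))   # tweak: keep only the long ones

print(text.replace("swearing", "cooperating"))  # tweak again, instantly re-run
```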
But if your goal is to learn a language with modern OOP design, then you should go for Python or Ruby, IMHO. Either is much better than Java, where to do anything at all you have to create a class containing a main method decorated with so many modifiers. I think that would be the reason to learn Python: the way Java makes you structure code is archaic, or at least different from other mainstream C-family languages, and can be a hurdle for a newbie who wants to scale up their programs.
I think the good thing about Python is that it is simple while still having mainstream features that translate to various other mainstream languages. But whether I should care about that as a newbie is an entirely different question.
Edit : Added a comment regarding OOP
Even with undefined behavior (which a newbie won't really have to care about at the beginning, IMO), C is relatively simple from a flow and structural point of view. If you are saying you can do cooler things in Python than in C at the beginning, that's probably true, but it's unclear how many people drop CS because the language they are using doesn't make it easy to do cool things.
This is the confusion right here. You're underestimating what is difficult to other people.
As a friendly internet stranger with modest leadership experience, I would very gently suggest being cautious of this pitfall. When we think like this, even if it's just coming from a place of logic, we can very easily send the wrong signal. We can make people feel stupid, and leave them wondering why we're being critical of them.
Python is strongly typed.
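A quick demonstration of the distinction: Python refuses to mix unrelated types implicitly (strong typing), while still letting a name be rebound to any type at runtime (dynamic typing). A minimal sketch:

```python
# Strong typing: Python will not silently coerce unrelated types.
try:
    result = "1" + 1          # str + int
except TypeError as e:
    print("TypeError:", e)    # no silent coercion here

# It is still *dynamically* typed: a name can be rebound to any type.
x = 1
x = "one"                     # fine - checks happen at runtime, per value
print(type(x).__name__)       # str
```

So "strongly typed" and "statically typed" are independent axes; C is static but comparatively weak (casts, implicit conversions), Python is dynamic but strong.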
To an absolute beginner, however, python is not only an incredibly complex system (except for extremely trivial things), but because it goes out of its way to hide its internals, it cripples users for years before they can figure out what's actually going on under the hood.
I'm fully on the "start with c, move to python" camp. The lessons just need to not suck, which I concede may be easier to do with c ... but it should still be c.
It doesn't - Python hides the unimportant aspects of the computer; namely, it abstracts away memory allocation and resource management. Under Python, you treat the computer as an abstract machine with unlimited memory and don't worry about how it is used. You focus on algorithmic understanding and fundamental data structures.
Once you have grasped the computer science fundamentals, then you tear open the box and learn the underlying implementation of a computer - memory allocation, assembly, etc. (ostensibly by writing your own operating system in C, and/or writing a compiler in C).
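As a small illustration of that abstraction (a sketch; the exact byte counts vary by Python version and platform): a list simply grows as needed, and you never call an allocator yourself.

```python
import sys

# Under Python's abstract machine, a list just grows; no malloc/realloc in sight.
xs = []
sizes = set()
for i in range(1000):
    xs.append(i)
    sizes.add(sys.getsizeof(xs))   # the interpreter reallocates behind the scenes

print(len(xs))         # 1000 elements...
print(len(sizes) > 1)  # ...and the backing storage was resized along the way
```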
You can replace Python with Haskell, which, IMHO, does an even better job as a first language.
Sure, sure - monads and all... I wonder if there is even any beginner-level learning material that uses Haskell.
Python allows students to hit the ground running much faster. Once they learn the very basics, then it can make sense to introduce something like C, but not require them to temporarily ignore magic incantations with the promise they’ll understand them later.
But it is not like you have to use those features, and in that case, why start with a complex language as a beginner anyway? It will either be too many details or too much magic to remember.
(Digression: in my beginner days I was using C as a glorified assembly language, with control flow redirected only through, wait for it, goto. Even after my programs grew to 700 lines. Only while doing a Java port did I get used to using methods, because Java does not have goto.)
So, if you are going to read code from out in the world, you should be familiar with the archaic notations anyway. Even frameworks, which the Java shops in my neighborhood are mostly about, have intro docs full of terms from almost 15 years ago and assume the reader is aware of them. The thing that really strikes me (not that it is bad in an objective way) is how deeply the old Java platform has OOP encoded into its DNA. And it is great at it.
It is a language meant to be used by people who are familiar with writing such code. The FP is really enjoyable, but kind of ugly, and seems retrofitted onto the OOP. In general I like opinionated, crisp languages, but Java is getting away from that. In a nutshell, it is disorienting for me nowadays.
It's the same as humans learning natural languages - you have specific grammar and rules you _just_ follow, without understanding their etymology or how they evolved to be this way (or their uses).
A good lesson could start by explaining what these do by way of dissection and for the purpose of orientation, without going down the rabbit hole.
While it's good that I know how those things work, I did not need to understand them to just get a basic understanding of strings, data structures, etc. For instance it wasn't that I couldn't grok linked lists, I couldn't grok pointer manipulation at the time. But that still meant my code didn't work.
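That distinction can be made concrete: in Python the linked-list idea survives intact while the pointer bookkeeping disappears. A minimal sketch:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next   # a reference, not a raw pointer: no malloc, no free

def to_list(head):
    """Walk the chain and collect values - the part that is about the concept."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

head = Node(1, Node(2, Node(3)))
print(to_list(head))   # [1, 2, 3]
```

The traversal logic is identical to the C version; what's gone is exactly the pointer manipulation (address-of, dereference, free) that was the actual stumbling block.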
Starting with C means sitting down, thinking hard, and taking the time to understand how the basic model of a computer works, along with all the issues and annoyances that come with it.
Starting with Python really means leaving out the details and just getting into programming; maybe one day you'll wonder how stuff actually works underneath.
One of the problems with teaching is a phase-ordering problem: what order do you teach the concepts in? As GP notes, C starts you off with a lot of boilerplate that isn't going to become relevant until a month or two down the line. Concepts like functions, libraries, the insanity that is the C preprocessor are distractions when your first challenge is just getting people to understand the mental model of how a procedural program works (i.e., understanding what x = 5 really means!). On top of that, C imposes a mental burden of understanding the stack versus the heap and the relevant memory management concerns. And then you get into C's string handling...
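The `x = 5` point deserves unpacking: even in Python, assignment means binding a name to an object, not filling a box of memory, and that mental model is the first hurdle. A sketch:

```python
# Assignment binds names to objects; it does not copy boxes of memory.
x = 5
y = x        # y now names the same object x names
x = 6        # rebinding x does not touch y
print(y)     # 5

# With a mutable object, two names can observe the same change:
a = [1, 2]
b = a        # same list, two names
a.append(3)
print(b)     # [1, 2, 3]
```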
I suspect many of the people commenting here are in the class of people for whom programming was obvious and easy, and never struggled with any of the concepts. This class of people will do fine with any language being used to introduce the concepts, because they're not really learning the concepts. Where the language matters most is for those for whom programming is not obvious, and they need help identifying which parts of the material are more fundamental than others. Giving these people a language that is more forgiving of mistakes is going to go a longer way to democratizing CS education than insisting people start by learning how computers really work.
That isn't to say that this isn't something that should be taught pretty early on: far from it, I think a computer architecture class should be the second or third class you take on the topic. Delaying the introduction of C until computer architecture means that you can use C as a vehicle to introduce all the other parts about computers that are important, such as 2's complement arithmetic, ASCII, what the stack and heap are, etc.
Then perhaps ASM would be an even better starting point?
Also Python has a horrific main method syntax and duck typing is ultra confusing early on.
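For reference, the idiom being referred to (a minimal sketch):

```python
def main():
    print("hello")

# Runs only when the file is executed directly, not when it is imported.
if __name__ == "__main__":
    main()
```

It is a magic incantation of exactly the kind discussed above, but it is optional for a first script, whereas Java's equivalent boilerplate is mandatory from line one.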
In Python, you just use the .lower() or .upper() function.
One and done. Move onto your next problem.
In C, you hope not to trigger an array-index-out-of-bounds error and the resulting segmentation fault - and then, not understanding what you did wrong, you have to fire up the debugger to find your needle in the haystack.
Hours later, you’re just like, I just want to modify a string. Why is it so difficult?
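For contrast, here is the Python side of that example, plus what the manual version looks like if you insist on doing it character by character (a sketch; real C code would additionally have to manage the buffer itself):

```python
s = "Hello, World"
print(s.lower())   # one call, no buffer management

# The C-style version, spelled out: walk the string and shift A..Z down.
out = []
for ch in s:
    if "A" <= ch <= "Z":
        ch = chr(ord(ch) + 32)   # ASCII trick: 'A' (65) -> 'a' (97)
    out.append(ch)
print("".join(out))              # same result: 'hello, world'
```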
Your rant should be about bad teaching of bad habits instead
Maybe it was because it was designed in the 1970s, when computers were more primitive. But better alternative languages were designed back then that didn’t have the failings of C.
But these design decisions permeate the language, so eventually the errors keep coming up.
If you know these things, then you know how the computer actually works, and a lot of the mystery of C is behind you already.
Then you can start introducing stuff that’s built on top of it all like types, if statements, pointers... build the building from foundation up, don’t start on the second floor and tell the student not to go downstairs yet.
My brother went through a coding bootcamp that focused on Python and JS, and then he started to get into Golang on his own, and one day he asked me what the point of pointers is. We ended up down a rabbit hole that kind of blew his mind: stack vs. heap, the structure of the call stack, manual memory management, etc.
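Python hides pointers, but the closest visible analogue is that variables hold references, which you can glimpse through aliasing and `id()` (a sketch; `id()` values themselves are an implementation detail):

```python
a = [1, 2, 3]
b = a                  # another reference to the same list ("pointer copy")
c = list(a)            # a new list with the same contents (a copy)

b.append(4)
print(a)               # [1, 2, 3, 4] - a sees the change made through b
print(c)               # [1, 2, 3]    - c is a separate object

print(a is b, a is c)  # True False
print(id(a) == id(b))  # True: same object identity
```

Which is roughly where the "what's the point of pointers" conversation has to start before stack frames and manual allocation make sense.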
SWEs can get away without understanding many underlying concepts if they are going to be writing "business code", that is, handling strings and ints and interacting with databases and APIs.
Computer scientists, and those programming embedded systems or designing systems/OSes/theory themselves, need to delve into the math, the algebra, and the lower-level understanding of computers.
Software engineers are more likely to need to know the underlying machine and write code suitable for it, because they need to write working, production-ready code. It's like applied science vs. pure research science.
C is great for learning the bare-bones experience of working with memory, and as a language that runs on everything.
The modern uses I could see for C++ are usually too specific for a random learner. I would only recommend it for people who want to work on game engines, simulations, networking, embedded systems, migrations of legacy systems, and other edge cases.
But for learning, I think it's better to start with a modern language: you will be less distracted by the quirks of a vintage language, and can use that time to learn, in parallel, a lot of concepts outside the language, like algorithms, devops, soft skills with teams, prioritizing requirements, etc.
Found this post fitting:
> The best thing you can do when dealing with lower-level languages like C and C++ is understand that things like lists, strings, integers, floating point numbers, objects, etc. do not actually exist. They are simply abstractions. There are only the CPU and memory.
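You can peek at that even from Python: the `struct` module shows the bytes behind an "integer" or a "float" (a sketch; the `<` prefix requests little-endian layout explicitly):

```python
import struct

# An "int" is just bytes once it has to live in memory.
print(struct.pack("<i", 1))          # b'\x01\x00\x00\x00'
print(struct.pack("<i", 256))        # b'\x00\x01\x00\x00'

# A "float" is just an IEEE-754 bit pattern.
print(struct.pack("<d", 1.0).hex())  # 000000000000f03f
```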
The computer science classic of all time (by Knuth) uses an assembly language to teach analysis of algorithms. Using Python (or Scheme, for that matter, as is done in SICP) will only take you so far. C is a good middle ground.
Sure, and that distance is far enough for 95% of developers these days. If they feel the need to go even further, they could learn C at a later time. No need to scare possible developers away right off the bat.
Not “real” assembly, a simplified one that operates directly on memory without registers.
Assembly without registers is absolutely pointless, since the only thing that matters in any flavor of assembly is how to get data into registers when you need it and move it out again before you overwrite it.
The advantage over C is having an even more limited instruction set and using descriptive names for each instruction.
SET 1 100   # Set memory[1] = 100
SET 2 500   # Set memory[2] = 500
ADD 3 1 2   # Set memory[3] = memory[1] + memory[2]
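This instruction set is small enough that a working interpreter fits in a few lines (a Python sketch, assuming only the SET and ADD instructions shown above; memory is modeled as a dict of numbered cells):

```python
def run(program):
    """Interpret the registerless assembly: memory is just a dict of cells."""
    memory = {}
    for line in program.strip().splitlines():
        line = line.split("#")[0].strip()   # drop comments
        if not line:
            continue
        op, *args = line.split()
        args = [int(a) for a in args]
        if op == "SET":                      # SET addr value
            memory[args[0]] = args[1]
        elif op == "ADD":                    # ADD dest src1 src2
            memory[args[0]] = memory[args[1]] + memory[args[2]]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return memory

mem = run("""
SET 1 100
SET 2 500
ADD 3 1 2
""")
print(mem[3])   # 600
```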
If you're going to be talking about the IP register, then you need to talk about the IR register. So we should avoid talking about registers by talking about registers?
You're not selling this registerless assembly very well.
It works great with children, and you don't even need a computer - you can do it all with pen and paper.
In other programming activities, understanding computing at the assembly level enabled me to realize that the bug I was chasing was actually in a library or, god forbid, the compiler.
It lets you see more of the stack but not all of it.
Before I coded in assembly for example I had no idea why string operations in C were so lacking compared to everything else. When you realize that you are literally dealing with bytes in an array of variable length the limitations make a lot more sense.
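Python can make that visible too: encode a string and you are back to the C view, an array of bytes you index and mutate (a sketch):

```python
s = "hello"
b = bytearray(s.encode("ascii"))   # now it's just bytes in an array

print(list(b))        # [104, 101, 108, 108, 111] - the ASCII codes
b[0] = ord("H")       # byte-level mutation, C-style
print(b.decode())     # 'Hello'

# C's strlen is "walk until the NUL byte"; with an explicit length it's just:
print(len(b))         # 5
```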
C, and assembly as we know it on most architectures, is a fairly high-level abstraction compared to what most CPU cores are actually doing these days.
On one hand (1) we have people who write OSes, compilers, etc. On the other (2) we have folks who use them and build applications for businesses. School teaches us for (1); we learn (2) ourselves at work. I wish schools taught us for (2) as well, along with all the necessary soft skills.
University is focused on covering the theoretical aspects of computing, with a lesser focus on practical applications - although those are taught to some extent - while the FH is more focused on building applications and real-life systems, with a smaller focus on theory.
The only problem is that the FH has long been considered second class, and everyone wants to attend university. But in my opinion, and based on my experience, many people would be a better fit for the FH.
For someone deciding whether to attend Uni or FH, it is often not easy because you cannot guess what you are going to do in your work-life and whether you might enjoy research or you would rather go into industry directly.
Aside from that though, I think it is great that we have different institutions for different audiences with differing goals. Does something like that exist in the United States or somewhere else in the world?
Often universities will have separate computer science and software engineering degrees. The first is more academically oriented (for those who want to get a PhD in computer science) and the other for people who want a job as a developer.
This is not true. Students can get accredited Master's degrees at a University of Applied Science. Some of the Universities even do research.
I'm not aware of any trade school in the US where students can get higher credentials than a certificate or associates degree.
As an example, in San Diego, the two top universities are UCSD and SDSU. UCSD is much more prestigious, but SDSU teaches more practical skills: while SDSU biology students get three years of laboratory experience, UCSD students get about two-thirds of one year. Labs will not even consider UCSD students when hiring because they have so little practical experience; UCSD students are expected to go on to professional degrees or higher academic degrees.
In general, in California, "University of California" schools are known to be academic while "California State Universities" are known to be practical. This was somewhat codified in the California Master Plan for Higher Education, where the top eighth of students would go to the UC schools and the top third would go to CSU schools.
I don't know if other states are the same.
Edit: what koube said. I went to a CSU school.
The equivalence of credentials matters, for example, once you get into the realm of government jobs with strict requirements, or of getting accepted into master's or PhD programs.
From Wikipedia, "There is another type of university in Germany: the Fachhochschulen (Universities of Applied Sciences), which offer mostly the same degrees as Universitäten, but often concentrate on applied science (as the English name suggests) and usually have no power to award PhD-level degrees, at least not in their own right... The FH Diploma is roughly equivalent to a bachelor's degree."
There doesn't seem to be a community college equivalent in continental Europe.
I think it comes down to the age old debate as to whether a degree is supposed to prepare a student to roll into an industry job or whether it is supposed to be an introduction to academia. An academic purist might argue that these should all be separate degrees because they are different intellectual pursuits, while a company would likely much prefer to take on a graduate with some understanding of all of the above.
The people who write compilers still have to learn a lot on the job because writing a good compiler is a lot harder than writing a basic one. It's turtles all the way down.
Ultimately I think teaching you how to write a compiler will put you in much better stead than teaching you just how to use one.
It would be beautiful if, as well as recording your progress, the system also enabled you to contribute back to the curriculum: creating a better-suited alternative lesson for the next learner, who would then have a choice of two lessons, the new one complementing the original. Or it could add a step that was missing. I think some distributed application framework like Holochain or DAT would be perfect to build this on.
The closest I've seen to a modular education system is Ryan Carson's Treehouse, where there are 'tracks', topics, and courses - all made up of modular components that are surfaced and reorganized based on the learner's interests/wants/goals.
I would take a more subtle approach of introducing the basics - say, for computer architecture, a simplified CPU model with an arithmetic logic unit, registers, and flags - so you have a feel for what your programs are running on. Then get back to programming concepts, and go deeper into computer architecture later.
If you really want a curriculum, you have to take the broad topics, present the introductory lectures, then create sections for more specialized topics with links to lectures that go into more depth, and maybe add prerequisites to those sections.
update for clarity.
There are upsides though: I learn at my own pace, I can dive more deeply into a topic for a while, I can choose topics pragmatically, and I choose how to learn and test.
Now this curriculum: I have a different approach. I like watching videos/lectures, but more as a supporting tool, when I’m tired. I much prefer books, papers and experimentation. Also I feel like the curriculum is all over the place. I agree with the math part mostly but the rest seems a bit like a hodgepodge.
Really dude, you got an undergrad in CS and you will teach us all these topics? I am exhausted by the available curricula that would make me a computer scientist, data scientist, machine learning guru, Python ninja, and whatever else. All that by watching videos and attending online programs, mostly made by people who have zero content credibility yet are able to put together a Bootstrap website and search YouTube for videos.
Please make it stop.
Take a subject matter expert and let them curate a set of free videos to help anyone learn.
Not saying this page has bad content, but there's at least one such "curated" list on front page of HN, and I wonder how much of it is for audience / Twitter follower building "profession".
I'd say one might learn rust instead of C, and learn python first instead of 2nd...
Also, IMO, UML and those software engineering classes are not useful for Bay Area-style tech companies.
Additionally, any reasonable system will have timeout + retry, but it's not possible to distinguish between a request that timed out and one that was inordinately delayed. That means your requests may be replayed many times, and systems frequently don't handle that well.
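A sketch of that failure mode (all names here are hypothetical): the server actually processed the request, the client only saw a timeout, so a naive retry replays a non-idempotent operation; an idempotency key is one common mitigation.

```python
# Hypothetical sketch: the server processed the request, but the response was
# slow, so the client treated it as a timeout and retried.
ledger = []

def server_apply(op_id, amount):
    ledger.append((op_id, amount))   # the side effect happens regardless

def naive_retry(op_id, amount, attempts=2):
    for _ in range(attempts):
        server_apply(op_id, amount)  # pretend each attempt "timed out"

naive_retry("op-1", 100)
print(len(ledger))                   # 2 - the operation was replayed

# Mitigation: have the server deduplicate by operation id (idempotency key).
seen = set()
def idempotent_apply(op_id, amount):
    if op_id in seen:
        return                       # replay detected, drop it
    seen.add(op_id)
    ledger.append((op_id, amount))

ledger.clear()
idempotent_apply("op-2", 100)
idempotent_apply("op-2", 100)        # the replay is now harmless
print(len(ledger))                   # 1
```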
Out of interest, what do you base this opinion on?
Granted, at both Cal Poly and UC Santa Cruz we learned very useful skills that are very important when writing production software, such as source control, code reviewing, coming up with effective testing strategies, and other things that I use in my career daily. However, we learned other things that I haven't encountered in industry yet, such as writing formal requirements documents and drawing UML diagrams.
Hey look. Let’s make system documentation easy by drawing a bunch of stick figure people.
Stickman is a bank customer.
And he needs to
1. Open an account
2. Deposit funds
3. Withdraw funds
4. Close an account
If one wants to learn, for example, algorithms, one does not need to do this in a self-directed manner. Several learning paths already exist. If one is having trouble grasping a specific topic, sites like Stack Exchange, Stack Overflow, Reddit, Khan Academy, and YouTube exist for one to ask questions on. I have had many issues cleared up by reading through previously asked questions, including seeing the person who a theorem is named after answer the question. Is it quicker to ask a teacher a question? Yes, of course. What will lead to longer lasting learning? For me, it's figuring it out on my own.
Eventually, one will come to a point where there is no longer an answer key, a teacher to guide you, or a well-worn path to follow, what happens then? People need to learn on the job all the time. Learning how to formulate a question, find the resources to help answer that question, and determine if what you have found answers the question are all key skills people need to learn. It helps to do this in a structured setting, but can be learned through the internet.
If self-directed means some people are not good at working without deadlines, then they can form a habit. Set aside time and work through some of the above material.
Cal Newport has put out many articles, blog posts, books, and now a podcast on how to learn and be a good student. I'm sure there are many other resources, including the soft-question tags on some of the previously mentioned websites. Old "Ask HN" posts are also a phenomenal resource. So I think those who are not good at learning on their own really do have a tremendous number of resources out there. Will some people, even after using all of these resources, still fail to learn? I'm sure some will. I think many people, myself included, aren't putting in enough effort to really find their limits, or to discover a process that works for them. Yes, some people can just figure it out, and others will have guidance from teachers, friends, or family, but for the rest, many resources exist for learning the process on their own. At the end of the day, it's easier to go on TikTok than it is to crack a textbook and grind problems.
Learning requires work. There's no shortcut. And people who don't want to put in the work will find endless excuses. Just like there's no shortcut to playing a guitar like Eddie Van Halen.
lynx -dump -hiddenlinks=listonly https://laconicml.com/computer-science-curriculum-youtube-videos/ | grep youtube.com | perl -pe 's/.*http/http/' | youtube-dl --batch-file -