Hacker News
Teach Yourself Computer Science (teachyourselfcs.com)
1169 points by JDW1023 12 months ago | 265 comments

One thing I'm always amazed by when casually talking to people about what I do is how they don't think they can do it. Many people think that software engineering and programming are some inscrutable thing. Sometimes I try to tell them that it is nothing special, it doesn't take a mega genius or math expert to do it. That the most important skill is just being able to solve real problems, soft skills and breaking problems into parts. As well as the ability to self learn and research. I get the idea that most people don't believe that, though. Or they think it's a humblebrag. Rant done, it's just frustrating that people limit themselves like that.

> the most important skill is just being able to solve real problems, soft skills and breaking problems into parts. As well as the ability to self learn and research.

Seeing a very non-technical friend of mine being completely immersed in games like "Factorio" and "Satisfactory" where the entire game is essentially just putting constantly more complex and valuable resources together in automated ways is fascinating to me. He will talk about efficiency ratios until someone stops him, and when he's not playing he's reading wiki articles to better understand the problems he's presented with. The sheer look of glee and pride when he gives a tour of all the automation around his base is infectious, and although he doesn't share my amazement at my conclusion, I can't help but think that he's already doing what most engineers I work with do every day, and he doesn't even realize it.

Sometimes I feel like there's so much untapped potential out there, people that would have loved to get their hands dirty with micro-controllers, soldering, programming and all the other stuff that could help them solve real world problems, but they were just never given the chance. They never had the mentorship or the right context to develop those skills. Instead, most programming is done under soul-crushing enterprise conditions, and a lot of teaching about programming is (understandably) only about whatever will benefit you in the workplace. When you're an adult, there's no time for play, so most programmers I know would rather do anything but write code and solve problems when they come home from work.

I'm not sure what my point is here, but it is interesting how people look at programming and software engineering as something so inconceivably complex. The way I see it, it's basically just digital plumbing and carpentry, as long as you know how to use the toolbox: and most people will never even try it.

>> The sheer look of glee and pride when he gives a tour of all the automation around his base is infectious, and although he doesn't share my amazement at my conclusion, I can't help but think that he's already doing what most engineers I work with do every day, and he doesn't even realize it.

One reason why I can't really get into Factorio. It's too much like work.

The funny thing is I do code at home, for fun, in my free time- or I used to when I had a clear line demarcating my "free time" from work time (I do AI research and I mostly work from home, so most of the time I'm working on something that I really have to work on). But Factorio is just ... too much like work.

... dah, I can't explain it :)

I understand what you mean. While I'm a Factorio fan I felt this way about the graphical-programming zachlike SpaceChem. Too much like work.

I approached Factorio like MineCraft but with the boring stuff automated.

To have fun in a game like Factorio, just mess around - you don't need to make a perfect factory, or compete with your friend.

> ... people that would have loved to get their hands dirty with micro-controllers, soldering, programming and all the other stuff that could help them solve real world problems, but they were just never given the chance

Anecdotally, this was _almost_ me. I ended up going to university to study CS much later in my life than most people, and only because of a fortunate meeting with someone who was from that kind of background and 'guessed' I would be into it, and if I didn't, who knows what crappy dead-end job I would've ended up in.

How lucky I am to have 'made it' into a career I enjoy, and how many other people are out there who would love to have this opportunity, but instead are working in dull, menial jobs, just like I was before uni, is something I think about often.

I'm not sure how we get more people involved. There are 'STEM ambassador' (https://www.stem.org.uk/stem-ambassadors) roles in the UK. I'm not sure of their efficacy.

There's a wonderful initiative in Suffolk, East of England, run by somebody I knew many years ago. Basically it's an educational club and community interest charity founded and run by Matthew Applegate to teach kids and young people across Suffolk (I think there were also talks of expanding it to Norfolk) all sorts of tech things like robotics, programming, game design, electronics, and probably more now.

I don't know what it is about his style of teaching, but he completely changed my view of IT and the IT industry. Before then, my main tech knowledge came from the (at the time) terrible exposure you got in rural schools back in the 2000s and from taking electronics apart to try and see how they worked. I thought programming was way beyond me, as I didn't do well at all in school and had major self-esteem issues as a kid. Matt is one of the key reasons I gained a serious interest in technology, and he has helped probably thousands more like me. He showed me that all it takes to learn anything is time, effort, and a healthy dose of curiosity.

Maybe other educators could reach out to him and ask how he does things differently? I'd love to see things like this crop up nationwide, I feel it'd seriously change the knowledge base of tech issues for future generations.

Relevant links: https://twitter.com/CCCSuffolk https://twitter.com/CCCCFSuffolk https://twitter.com/Pixelh8 (Matt's Twitter.) creativecomputingclub.com (Website appears down for maintenance at time of writing.) https://www.linkedin.com/organization-guest/company/creative...

God, yes, all the wasted potential out there hurts my soul.

They're having fun and learning. I think it's OK.

In cases where the bulk of their time is wasted on a boring job when they could be learning and having fun all the time on something impactful, no that's not OK.

I think a major difference between most games and real problem solving at work is that the environment is massively simplified and a lot of wrinkles when dealing with the real world are ironed out. In most games you can just relax and not think too hard, I would say, and reap massive rewards, while in real life, rewards come in much more intangible forms after much harder struggling.

This could also explain why many would rather just play games than solve real-world problems when they come home from work. It requires a bit of discipline, desire and resolve I would say. But one should also realize that the rewards of actually going out to solve real-world problems and interact with real people are so much sweeter than the virtual rewards you earn in simplified and idealized games, in many aspects. I would also say that while many don't do it at home, there are still many people that do, and do an amazing job at it.

Maybe LabView would be a good way to get Factorio players into programming. The model is very similar: components laid out on a 2d grid, connected by pipes that transmit values.

I got into programming with LabView.

I've played Factorio in the past and am playing Satisfactory right now. I generally don't code at home or have any side projects.

Let's do some introspection as to why I prefer playing these games over coding at work:

1. Games are designed in a way that speaks to intrinsic human values. For instance, in Factorio, the local fauna wants to kill me. I have to build and design my base accordingly or I'll perish. My work, in contrast, is far removed from intrinsic human values.

2. Games are designed to be accessible, with every step yielding rewards right away.

2.1. Coding in an enterprise environment is full of complex tasks, which have to be broken down first.

2.2. Games such as Factorio increase the complexity of tasks later on, but you always utilize mechanics you've already mastered.

2.3. Tasks in enterprise environments tend to require learning new knowledge first (business domain, existing code base, etc.) before you're able to solve them.

2.4. Games allow me to start from scratch. Existing code bases, which I didn't build myself, require me to invest quite some time before I can even start on my task.

2.5. Often, these existing code bases require me to learn new frameworks. Games teach new skills/tools in seconds or minutes. Frameworks tend to require hours or days to digest.

3. Games provide visual feedback. My labor results in visible feedback inside the (virtual) world. Accomplishing a coding task tends not to result in any feedback (I do mostly backend). If there is feedback, it's normally in text form and of a negative kind (error logs).

4. In general, feedback in games is unambiguous, clearly tells you what is happening. No debugging. No frustration.

When I actually know everything that is required to solve a task and can concentrate on achieving it in the best way, that's the moment I'm having fun. But normally there is so much I don't know, tools I can use but have not mastered, that it feels like a mess. Even if you believe you have everything, reality tends to throw some roadblocks along the way.

Now, why don't I code at home and do side projects?

That's in my first point. There is just nothing I could use coding for that solves problems I care about (intrinsic human goals). Even if I knew how to write the next Facebook, I wouldn't do it because I really don't care. Well, there are some code-solvable things I consider interesting, but they are out of scope in terms of my resources and abilities. Also, the way along would probably still suck for the most part, once it gets serious.

Right now, I'm learning some math in my free time. Quite enjoyable; it gives me new insights into how everything works. I tend not to have that feeling when learning new frameworks / programming languages. New insights are pleasurable to me; tech stuff just doesn't tell me anything interesting about the world.

Maybe some of you who don't feel the same can understand people like me a bit better (not saying that all non side project people have the same reasons).

Frustrating? It's our greatest luck! If people accepted how easy programming is, we wouldn't get paid this well for it at all levels.

Some jobs are protected by difficult, mandatory, exclusive education: Law, medicine, piloting. Software development is one of the least protected areas and we also face the threat of outsourcing. To most companies we are an unpleasant cost factor in need of optimization.

Unless you plan to become a manager or owner who profits from cheaper labor costs and more potential employees to choose from, do your part to keep people away from programming.

Realistically, the amount of time it would take to teach yourself CS is immense. It doesn't take genius, but it's a ton of work. So when people think "they can't do it" they might be right from a life/work/time standpoint. I know many people that have tried to get into it and it's extremely daunting.

To learn anything, on your own, self directed, is hard for any subject. I think programmers and devs underestimate the amount of time they spend on "getting good". It doesn't feel hard when you got into it in high school, got a 4 year degree, then spent 10 years in the industry. But if you're 26, working in real estate and have a significant other, it's a huge ask.

A lot of jobs are very linear, like taking a task through some steps and wrangling people to help along the way. I think programming in a professional context is a very unique job in how much fresh thinking is required most of the time. Even at low levels you really work in a very independent way and have to make serious choices. Having a ticket rock up for a JBoss 4 app written in Seam and RichFaces becomes a challenge because, yes, I know the big topics like Java, app servers, etc., but there's all this new old framework knowledge I have to reacquire to solve the issue. Most jobs are not like that.

There are two sides to this. On the one hand, yes, software needs fresh thinking because reuse is simpler than in other industries (not perfect, sure, maintainable/extensible/modifiable/modular code is really hard to write). In a lot of jobs people have to physically manufacture or process something and it requires effort for each new item. In software, if you write yourself a script that does the weekly maintenance on the server, then you won't have to type it every week. You still have to fix any new problems that inevitably pop up, but you don't have to do the exact same stuff over and over.

On the other hand, I think this attitude of programmers is too smug.

Many jobs that seem simple have to solve difficult problems and require engineer-like thinking. Sure some low-skill jobs, like cashier or mail delivery or truck driving don't require a lot of novel thinking on a daily basis. The bulk of the job is carrying out what you know.

But take for instance event organizers: no two events are the same. You need to negotiate the requirements from the client just like in software, you need to split up the work, plan it all out, execute together etc. Constraints often force you to be inventive.

Or how about lawyers? Even in mundane affairs, like divorce, no two cases are the same. There is probably more in common between one corporate mobile app or CRUD system and the other, than the circumstances of two divorce cases.

How about doctors? Sure some of it is routine, like much of a GP's work. But in complex surgeries they need to come up with a specific strategy and plan for the individual patient.

In fact, I think most jobs paid at the level of software engineers require a similar amount of novel thinking. The contrast is not software vs rest, but well-paid professional jobs vs low-paid, low-skill repetitive jobs (physical and mental as well, like office paper pushing).

Furthermore, many software jobs are really 9-5 "manufacturing". Not every dev is John Carmack. There's a huge demand for really simple business stuff, and the very same functionality gets built over and over again by different companies for different business clients. These may be lower paid than the fun Google-level, research-like creative software jobs, but they are out there. So top software jobs should be compared to the top of other professions, not just to any "white collar" job (because in the latter case you also have to include the "code monkeys" in the software jobs).

> Many people think that software engineering and programming are some inscrutable thing

It's not inscrutable, but it's not something everyone can do. And among those who can, not all of them necessarily enjoy it. It takes a high enough IQ (whatever reservations you might have about IQ tests, there's no doubt they match with programming ability) as well as an appropriate disposition.

I mean you wouldn't expect every single person to be able to play music or paint well enough to make a living out of it. I sure as hell wouldn't want to be a lawyer, I doubt I could even do the job at a mediocre level if my life depended on it. (Not that I dislike the field: I watch more law-related Youtube videos than I'd normally care to admit, which reinforces my belief that I'd suck at it.)

> whatever reservations you might have about IQ tests, there's no doubt they match with programming ability

If you happen to have some article or research in mind it'd be interesting to read

(Or maybe you read sth long ago and didn't save any links / references)

Have you taken or even just looked at an IQ test? The kind of things they make you do just look like the type of tasks a programmer has to handle day in and out.

Raven's Progressive Matrices.

I would expect good results to be strongly correlated with an ability to learn and be good at software -- and therefore I was curious about if you had read something related to that.

> just look like the type of tasks a programmer...

Hmm, I think RPM looks very different from writing code. Maybe you had other tests in mind? (Then I'm curious about which?)

> being able to solve real problems, soft skills and breaking problems into parts. As well as the ability to self learn and research

You are implying that most people can do that. I think only about 15% of adults can do what you listed.

And only about 50% of people employed as programmers.

And of those, only 10-20% do it reasonably well (I think).

There is a steep learning curve to go from scripting out some solutions, to understanding good design patterns for production quality code, and to run and deploy an application. I have a non-technical BA degree and it has taken me years to feel competent at building software, and every day I have to push myself to learn new things. And the learning is often "two steps forward, one step backwards." It's not for everyone.

> Rant done, it's just frustrating that people limit themselves like that.

"If I can do it, so can you" isn't a fair judgement. Everyone is different, and CS/programming has many complexities; not everyone is interested in putting in the time commitment to learn them.

For me, with time and experience it has become easier to put a solution together, but it's still not easy work, and many times I am mentally exhausted by the end of the day.

It's usually the other way around for me. People I talk to generally think that they can be software developers without much effort.

I even met one guy who said that he wrote some HTML back in the day and it was easy, so he was thinking of making a career in software development as it wouldn't take him more than a few weeks to pick it up.

Disagree. You need very good, structured, abstract thinking. Most people don't have that. Even many developers don't have that or aren't very good at it.

And even among the people who have it many can't stand stuff like programming - you have to have a particular type of mindset/disposition to be successful.

> That the most important skill is just being able to solve real problems, soft skills and breaking problems into parts.

Do you really believe this? I'd be interested to know what you do for a living.

Programming requires the ability to grasp abstract concepts which most people simply do not possess. Talent is a thing. I'm terrible at composing music even though I enjoy playing it. I have no talent. Most people are terrible at programming.

I have to conclude that anyone who thinks this is very young and/or has never actually tried to teach someone how to program.

I completely agree!

In my experience, this is often due to a lack of interest. Most people (including myself) often just want a product to work. When something fails to work correctly or you want extra features, you bring it to a shop to get it fixed. When you're not too interested in how it gets fixed, the process behind it can feel like magic.

I feel this way with most products I own or buy. For instance, for me, getting my car fixed can feel like this at times.

Getting deeper and deeper into computer science taught me that I could potentially learn a lot, as long as I'm interested enough! Debugging a car couldn't be much worse than debugging a computer system, can it? You probably shouldn't start with some high-end car, but an older example should be doable, right? This attitude got me through most of my college years. Sometimes you have to adjust the bar a little (or a lot), but in the end, you probably can pull it off.

I beg to disagree: becoming a solid software engineer requires a lot of intelligence and persistence, as well as a certain mindset.

Sitting down and staring at a problem for 10 hours with very little progress can be extremely demotivating for some.

To just get a job in software, though, the bar of skills needed isn't really that high.

Agree. This is a skill that can be acquired. No great computer scientists or programmers are born; they are made. Unfortunately, a vibrant interest and curiosity in the core skills of mathematics, logic, and even writing (communication) is pounded out of a lot of people early in life, and it makes computing seem inscrutable, unattainable, and probably even boring.

For me it's exactly the same. Usually I try to explain it to them by saying something along those lines, but halfway through I already find them distracted or confused.

Whenever I have someone ask me where to get started on CS, I give them a few tiered options to try and ensure some stickiness/higher utility for them specifically.

~Some Existing Touch Point With CS

Teachyourselfcs.com is definitely in the conversation - in general I recommend it to friends who have a pre-existing "gateway" to the field (software engineer or otherwise). The page talks about progressing from "type 2" to "type 1" software engineer - a deeper understanding affording a richer career/experience.

~Relatively Fresh - but have strong intent

For people who are more "virgin" and fresh but are willing to put in the work, I recommend OSSU ("Open Source Society University"), and I've found, for whatever reason, that its structure has let them ramp in with more success. (https://github.com/ossu/computer-science)

~Intrigued/Engaged - but intent is still nascent

For people who have a fleeting interest but don't necessarily have the time or intensity about their interest, I usually recommend a starter Python course; 90% of these people will fall off the path unless there is a more permeable access point to the space that offers a positive feedback loop. The University of Michigan has some engaging options.

I remember telling people to read "The Python Tutorial" on python.org, if they wanted to write Python. It takes a few hours for an experienced programmer, and a couple of days for a beginner. People had difficulty starting.

A beginner might have trouble assimilating this information - too few examples, not enough entry level drilling on data types, operators, flow control and functions. For example they might forget to put quotes around a string or not know how to use two for loops, one inside the other, even if they know how to use one in isolation. It's hard to grok how to compose things even if the student saw how to use them in isolation. And for the teacher, it's hard to keep the lessons easy enough at that entry level.
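
To make the "two for loops, one inside the other" hurdle concrete, here's roughly the kind of drill that trips beginners up (a hypothetical exercise, not from the tutorial itself):

```python
def times_table(n):
    """Build an n-by-n multiplication table using two nested loops."""
    table = []
    for row in range(1, n + 1):      # outer loop picks the row
        cells = []
        for col in range(1, n + 1):  # inner loop runs fully for EACH row
            cells.append(row * col)
        table.append(cells)
    return table

print(times_table(3))  # [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
```

A student who can write either loop in isolation still has to see that the inner loop restarts from scratch on every pass of the outer one, which is exactly the composition step that's hard to drill from prose alone.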

When I started programming (with Python) I remember how the modulo operator was introduced, without any further information. I found it quite confusing, since I didn't know what modulo meant (that could be solved with googling) but, more importantly, how it was useful. When you come from a CS perspective, it's completely clear how you can use it and why it is essential for some computing tasks, but for a non-CS beginner that can be much more obscure.
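
For what it's worth, a couple of concrete uses would have demystified it for me; a quick illustrative sketch of where `%` earns its keep:

```python
# modulo is the remainder after division: 7 = 2*3 + 1, so 7 % 3 == 1
print(7 % 3)  # 1

# classic use: testing for even/odd
def is_even(n):
    return n % 2 == 0

# another classic: wrap-around indexing, cycling through a fixed set
colors = ["red", "green", "blue"]
for i in range(7):
    print(colors[i % len(colors)])  # cycles red, green, blue, red, ...
```

Without an example like the wrap-around one, "remainder after division" sounds like trivia rather than a tool.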

Interesting perspective! And it seems connecting the dots is part of the role of an educator!

Quotes to indicate strings have been a problem, definitely. It's odd because it's so clear for me. Yes, I was lucky to dabble in all this a lot when I was younger.

> not enough entry level drilling on data types [..] they might forget to put quotes around a string

Just want to emphasize that first part; I've seen the second in new hires who passed a coding bootcamp, and it's really weird to watch. One of them even had to see the error message and go back to the code and still puzzle over it for several seconds before realizing what was wrong.

>For example they might forget to put quotes around a string

I wonder whether a "richer" editor/IDE, that flags such things immediately upon writing them, could greatly improve the learning experience.

Many tools even point out things such as non-existing variables in scope, which are not strictly "errors" in a dynamic language during edit time.

I would assume so. Using IntelliJ IDEA helped me a lot, compared to the NetBeans our professor insisted on. NetBeans has a really slow autocomplete and would slow me down a lot, whereas IntelliJ's autocomplete would pop up immediately and would learn from my usage.

I strongly recommend OSSU! Both books and MOOCs have their place, and sometimes a MOOC makes it easier to stay motivated, since it provides a schedule and exercises of appropriate number and difficulty.

I really like reading books too, but sometimes the completionist part of me is too strong and I shouldn't really do all the exercises.

In comparison with teachyourselfcs, OSSU looks lame, seriously. It's structured badly, with tons of links and strange YouTube channels. TYCS has chapters and the 2-3 best resources for each, which is ideal for this case.

For beginners I highly recommend Harvard CS50, the best of the best course available for free and paid too.

I agree that there seem to be a variety of links, but most of the courses linked there are of very high quality, and I don't see the "YouTube channels" that you're referring to. One thing I was unsatisfied with in TYCS is how it lists Crafting Interpreters for the compilers/PL section of the curriculum. I would argue that more important than the specifics of compilers is a high-level understanding of various ideas in programming language design, and being able to understand the building blocks of different languages and compare the trade-offs of using one language over another. Of course, if you go through SICP as an entry point to computing, as suggested in TYCS, then you might be somewhat better off, but it's still not the same as a dedicated course on PL itself.

Dan Grossman from the University of Washington has an excellent course Programming Languages on Coursera: https://www.coursera.org/learn/programming-languages/, which I think would be far more relevant to modern programmers than studying compilers specifically. And I'm glad to see this course listed under the "Core Programming" section of OSSU.

It's remarkable how the course goes from zero to several complete projects in different domains and technologies in just a few weeks. CS50 is a real stepping stone.

I'd never heard of OSSU, thanks. The maths courses linked in that are good, too.

The OSSU curriculum has a nice selection of textbooks.

However, I am concerned that these curated lists lack a unifying idea of what CS is.

If you go to pure math instead, the consensus is clear. A pure math bootcamp, like Math 55, is mostly linear algebra and real analysis, taught from Axler and Rudin in its current iteration.

Simpler math curriculums progress more slowly, or use easier textbooks, but the idea remains the same: Linear algebra and real analysis.

What is the equivalent distilled core of CS? It can't be so many things. There must be a core which you build upon by stacking other courses and textbooks.

My take on this is logic and computation. They are the algebra and calculus of CS, and they are intimately related by the Curry-Howard isomorphism. I find it alarming that CS students are often not taught basics of propositional logic, first-order predicate logic or lambda calculus. It's the basics, and it is really useful.

Software Foundations [1] implements a nice curriculum in line with the ideas I have written about, but it is aimed at graduates and consequently it lacks background. I am still thinking about combinations of textbooks for undergrads that would offer something equivalent. Any suggestions welcome.

[1] https://softwarefoundations.cis.upenn.edu/

Were I to suggest two things to read/watch it would be Nand2Tetris and The Little Schemer. They not only teach the theory but teach you to apply the theory, and the movement of theory to application is crucial to me when discussing computing, an applied science.

N2T teaches digital logic, what Turing saw, how it was physically implemented, and how programming languages were and are developed. TLS introduces various topics like the halting problem in an effective and playful manner. Both spark interest, something I find some maths courses unfortunately fail to do for many people.

This "Software Foundations" series feels to me like a focused exploration of PL. I totally agree that getting a solid grasp of PL theory is actually one of the most helpful things you could do to become a better programmer, and the MOOC by Dan Grossman from the University of Washington is amazing for this. But I'm not sure if this is exactly the same as learning "logic and computation".

+1 for recommending OSSU. The courses listed there seem to mostly be of very high quality, and I'm glad to see that the two courses listed under the "Core Programming" section are among the best I've ever seen. The course Programming Languages by Dan Grossman is definitely much more helpful and insightful in the area of PL than the "Crafting Interpreters" book listed in TYCS.

Has anyone here learned a subject by going through one of these "teach yourself" lists? I've learned most of what I know about computers on my own, but it wasn't from following a list of textbooks. And when I've tried to follow a course like this for other subjects, despite my initial interest, I've found myself unable to make it through more than a couple chapters of the first textbook before running out of steam.

But somebody must be getting some value out of these things, right? Am I just bad at studying? Or not as interested in the subjects I'm trying to learn about as I think I am?

I have used similar lists to go from almost entirely self-taught to having finished a half dozen CS and FP courses at my own pace. At one point in my life I was keeping up a GitHub streak every day by just pushing a couple of exercise answers to a private repo. After a while, I really had to go looking to find challenging enough work to keep me engaged. Books with homework like SICP, FP in Scala, Let Over Lambda, Software Foundations, Haskell from First Principles, Land of Lisp, and Programming Languages all offered a lot of homework to keep the streak up.

The streak really is a super power for self study of any form. It's so hard to ignore a 1600+ day streak even when you don't feel like it.

When GitHub took that streak counter down, I was crushed; combined with my taking a management-heavy job (which encouraged me to read a lot about soft skills), it really collapsed the whole system for me. One day I'd love to pick it back up, but right now I'm doing the same thing with violin as an adult learner, so that's taking up the time I'd otherwise spend on CS.

Learning violin might help with CS more than you can imagine. Giving yourself a break from something you have been doing for years/decades is absolutely necessary and will get you more excited to get back to learning about CS and prevent burnout. I know I am certainly that way. However, YMMV.

There's an app you may enjoy called Loop Habit Tracker. I used it to keep track of streaks for things I couldn't as easily track through other means. It's pretty amazing how just seeing your streaks can be so effective in building habits.

I just finished the distributed systems section of teachyourselfcs and I now have a solid understanding of a lot of concepts I previously only shallowly understood.

When I started learning, the recommended textbook was Distributed Systems: Principles and Paradigms (the new recommendation is Designing Data-Intensive Applications) and the recommended course was MIT 6.824.

6.824 is the first online course that I have completed fully and it was well worth it. I read, took notes, and summarized all required course readings (around 20 papers) and completed the labs, which involved creating a raft-based key-value store. The labs were especially useful because they forced me to really understand the details of the topics I had learned (e.g., MapReduce, Raft, Raft’s log compaction) in order for my code to pass all the tests.
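
For readers who haven't seen MapReduce, the map/reduce split those labs build on can be sketched in a toy, single-process form (this is just an illustration in Python; the actual 6.824 labs are distributed and written in Go):

```python
from collections import defaultdict

def map_phase(doc):
    # the mapper emits (word, 1) pairs for every word in a document
    return [(word, 1) for word in doc.split()]

def reduce_phase(pairs):
    # the reducer groups pairs by key and sums the counts
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["a rose is a rose", "a daisy is a daisy"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(pairs))  # {'a': 4, 'rose': 2, 'is': 2, 'daisy': 2}
```

The whole point of the real system, of course, is that the map and reduce calls run on different machines with failures in between, which is what the labs force you to grapple with.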

I’m very happy with the results of following the course and I now plan to put the same amount of effort into other topics.

How long did it take you to complete it on your own, including the exercises?

For the course I would estimate 120-180 hours spread over 6 weeks.

I read the textbook over a couple of months, but I couldn't give you a good estimate in hours.

Thank you, that sounds very reasonable!

I think these should be used more as a guide and dependency graph of study subjects rather than a rigid schedule. So, instead of going from first textbook until the last, maybe you just look at what interests you the most in the list and then check if you have the necessary prerequisites.

I plan to eventually complete most of the materials, but I take huge, sometimes multi-year detours. I've done the SICP and nand2tetris books; each took around a year (not full-time, obviously). I also went a bit harder on the second half of nand2tetris, writing unit tests and building an AST for the compiler instead of taking the easier approach suggested in the book. Then I decided I wanted to do OS, but first to learn C so I could implement my own stuff, so I also worked through the K&R book.

On the other hand I'm pretty bad at doing "real world" things, full of edge cases and details, and try to avoid them as much as possible, so I may not be an example of someone getting value in a practical sense. I feel that I've grown better as a programmer but I can't say how much of it can be attributed to my home studies and how much just to the experience on the job itself.

I graduated with a degree in BioSciences a few years ago and elected not to do grad school like my peers. I saw no point in spending thousands of dollars on a master's in Bioinformatics/Data just to get a jr-level job. The summer after undergrad I attempted to learn CS and go through the OSSU Computer Science program and TeachYourselfCS to break into the industry. The problem was that a lot of it wasn't job relevant; it felt very rigid, and sometimes even more complex than it needed to be. These university-level courses aren't really necessary. I like to view coding as a trade and a skill, something that can be learned and applied right away rather than a mechanism for research; feel free to disagree with me. In any case, I just wanted to code and build software (sorry, CS purists...). I slowly found myself less interested in CS theory and more in software engineering. Feeling disenchanted with CS, I decided to spend a few months learning web and software development fundamentals through online courses on Udemy and Frontend Masters. After that I did a certificate in fullstack development at my alma mater (it cost me a couple thousand dollars, much cheaper than grad school). I will say I was pretty lucky that as soon as I finished the fullstack program, I was hired by a small software shop.

I think that CS vs SWE thing happens with a lot of people, and I got caught there, too. Luckily, I ended up really liking CS and it introduced me to the field, whereas otherwise I would've gone for SWE. There really needs to be a 'teachYourselfSWE' for people who are into that.

How can you tell which one is right for you? So many computer science graduates I know ended up pursuing software engineering that I've mistakenly conflated the two.

It's really theory (CS) vs building things (SWE). As you want to build larger projects, or build them better, you rely on CS ideas to do that. So SWE is really engineering assisted by applied CS. I've been studying CS for about two years and am only now learning any real application development. You can study CS without ever touching a computer or building an application (though doing both will give you context for why CS topics matter).

It's kind of like building a rocket to get to space: you can tinker with propellant, metal, and fire as a hobbyist until you eventually build a rocket that goes to space, or you can be a physicist and calculate everything on paper. To get to space you really need both of those people, but the physicist can learn to build and the engineer can learn theory.

> but the physicist can learn to build and the engineer can learn theory.

So is this why frameworks exist? To abstract away the theory that proves something more "performant" or "sufficient", and allow the builder to do their thing?

More or less, yes. The idea is that a framework (ideally) does some feature A the "right" way, and programmers using the framework gain the benefits of convenience and safety, often at the cost of flexibility. Frameworks usually try to balance the power or depth of the domain they're abstracting with ease of use.

A good framework will stop beginners from making critical mistakes, decrease the amount of code its users have to write, and offer professionals well-constructed ways around its handholding as needed.

If your aim is to do some less demanding programming, e.g. using a framework without fully understanding how things all work together, then focusing more on practical projects could definitely help. However, this dichotomy between "CS" and "SWE" is a very dangerous idea. Once you have a solid grounding in CS, the software you write will be of much higher quality, you'll ship code faster, and you'll avoid many mistakes since you have a clear idea of what you're doing instead of just trying to blindly copy recipes.

Particularly helpful, I would say, is a solid grounding in programming language theory, and I can't recommend the MOOC by Dan Grossman from the University of Washington highly enough. As he says, even if you never get to program in the languages used throughout the course, you will become a much better programmer after finishing it.

Curious to know: do you feel that you are at any disadvantage to peers who came to your role who went through a traditional CS or computer engineering undergrad program?

On the job, not really. Plenty of resources online that allow me to commit to continuous learning.

> The problem was a lot of it wasn’t job relevant and it felt very rigid, and sometimes even more complex than it needed to be.

CS was born out of a time when computers were very limited. How do you write a word processor with a spell checker, when the dictionary alone won't fit on a floppy disk, or fit in memory?[1] How do you draw rounded rectangles on your GUI when they take too many CPU cycles to draw?[2]

CS people managed to carve out paths through problem spaces that could be handled on the computers of the day. These days, you put everything you need into a hashtable, lookups take microseconds, and you don't have to care anymore.

[1] https://prog21.dadgum.com/29.html

[2] https://www.folklore.org/StoryView.py?story=Round_Rects_Are_...

Interesting. I believe much of CS is practical now, and much is forward-facing and not relevant yet.

Having learned programming on my own and after spending some time doing it at work, I can now look at these books and say: Ok, these do make sense now and it's engaging to read them whilst a few years ago it was all too abstract and felt 'somewhere out there'.

Both are important. You do need to work on your own projects and your own concrete cases, find tricky cases, look up suggestions online, etc. It can be a great learning opportunity. For example, back in the day I used to do a lot of video re-encoding, ripping DVDs to SVCD, so I had to learn what the terms in the programs meant and why the results weren't what I expected. This led me to basic concepts and a mental model of coding theory, color spaces, and signal processing. Or when I built a website, I had to learn about networking, about programming some JavaScript, etc., just to get the site to display. Similarly with game modding: learning about graphics, textures, and scripts, and writing code to decode, modify, and re-encode binary files containing text so I could translate them from English to my language, which meant I had to learn about character encodings. If you have a personal purpose in mind, learning can "just happen". I didn't intend to learn these things; I was just striving to make the thing work.

I have no idea how I'd make someone interested in these things if their mind doesn't pop up such tinkering goals for them.

On the other hand, the above is quite inefficient on its own. It is hard to overstate the importance of reading good books alongside the practical learning described above.

You need to be able to relate to the topics to some degree, but once you have the hang of it, you can benefit from the years and years of experience of experts and hone your mental models, discover new topics and relations across topics and understand things more holistically.

Learning only from books/courses is not sufficient either; you won't be able to apply what you learned if you don't constantly try to map what the book says onto something you are already at least a little familiar with.

But you have to pick good books. Bad books can be confusing and discouraging. I think American CS books, especially those from MIT Press, are really high quality. They may be expensive (but there's LibGen).

You may also have problems with delayed gratification. The benefits of an hour spent reading a textbook are vague, uncertain, delayed, and often hard to attribute to the book. One hour of watching Netflix or YouTube or browsing through 20 genuinely interesting links from HN feels more rewarding. This is a big problem for us all in today's attention economy, and you need to be very conscious of it to break out of it.

Similar to you, a lot of what I have learned in CS/programming has been through experimenting (personally I like the Jennifer Dewalt method of just doing a lot of small projects: http://jenniferdewalt.com). You are not bad at studying! We all just have our own learning methods that are most comfortable. A proper education system would ask you what ways you like to learn, then have multiple versions of the same curriculum taught in different ways. Somebody build that please.

I learned networking by reading the book recommended on this website and playing around with Wireshark, along with some other stuff. The book itself was about 800 pages, but breaking it down by sections made the task much more manageable.

I think I'm pretty good at self-teaching w/ textbooks. I will admit that there were 1 or 2 classes at college where I completely skipped the class and just read the textbook because I could absorb info faster than attending lecture.

I got a copy of Benjamin Pierce's `Types and Programming Languages` and I've read basically the full thing and have completed 80% of the exercises.

Currently working through a couple other textbooks slowly. I find that taking notes and doing the exercises is essential, otherwise I lose interest. (Same with online video courses)

I finished university sometime last century, so most of what I know is self-taught at this point. Also, software engineering is more of an apprenticeship thing than something a university professor without relevant software engineering experience can teach.

I'm being harsh here because I used to have software engineering as a topic for my Ph.D. studies. Part of the reason I left the academic world (after a postdoc) was a bit of impostor syndrome: I realized I was telling others how to engineer software without having much experience engineering software myself. This always struck me as odd. I've since learned a lot about software engineering that was definitely not part of any computer science studies I know of.

I always think of college as being more about learning to learn than about learning anything specific. Also, the difference between a university and a technical school is that the former tends to be less about practical skills and more about the skills you need to learn new things and ask critical questions. Quite a few projects I've been on over the years have required me to dive into other domains: I've had to read up on metallurgy, geospatial algorithms, and medical topics at various points for different projects.

Not computer science, but I learned a lot by going through Teach Yourself Logic[1]. I did not know where to start or how one subject relates to the others, and this guide helped me a lot. I have followed it for two years now and wish for something like this for every topic.

[1]: https://www.logicmatters.net/tyl/

When I learn something to the point that I can consider it 'learned', it is never from one source or one guide. It's information gathered at the relevant time in my learning journey from many different searches.

I would say MOOCs are usually much easier to follow than books, and great courses can really change your whole perspective. The MOOCs listed in OSSU are generally of very high quality and I would highly recommend them.

I finished two courses on the list. All I feel is: I wish I had watched/read this earlier.

I think I had much more fun with these courses after having done many real projects over the years.

Maybe you get bored because it's a little easy and basic for you.

I'm sure many do, but clearly the hardest part isn't actually following a list. It's finding or creating the discipline necessary to get through the entire list.

I personally think there is a lot of value in these things, particularly in cases where it's impractical to go back to school/university or tuition is unaffordable. Quite a few years ago I learnt software engineering and basic computer science through free online resources (a few of which have already been mentioned in this thread); it got me a few successful interviews and eventually a job.

That was actually the first time I had taught myself anything with the aim of breaking into a field I had no experience with. It was difficult at the beginning because I had just quit my job at the time and fell into depression for a personal, unrelated reason; but perhaps that was also the reason I eventually had the drive and focus to get through what I would normally have found "boring" at the start — I had nothing better to do. The experience also turned out to be empowering, because now I feel like I could learn a lot of things by myself.

Not sure if it helps, but here are a few things based on personal experience of going through this type of syllabus:

* Assuming they are written by people who are good at what they do and know what they are talking about, these syllabuses are usually meant to be followed step by step unless explicitly stated otherwise.

* Different people benefit from different modes of learning. There is nothing stopping you from using auxiliary resources to help you to get through a syllabus.

* More often than not it takes a few reads of a text or attempts at an exercise for me to really understand something. In one particular case I read through a book three times, about half a year apart each time, before I felt that I really understood it; and when I finally understood most of it, I was only glad that I had decided to try a third time.

* I think it helps to have someone to talk to about what you are doing; it could be a close friend or family member who doesn't understand anything about what you are studying. Alternatively, if what you are studying is "popular" enough and there are communities of people doing the same, get connected. Emotional support helps.

* I always try to interact with people at different skill levels. There is always a lot to learn from people who are much more experienced than I am, and explaining what you have learnt to someone less experienced helps to consolidate it.

* Focus is very important. Find what works best for you through experimentation: if you can sit through some given material for a couple of hours straight, great (but do take care of your health, and at least get some stretches and water); if you need regular breaks, that's also fine; just make sure you set an alarm and commit to getting back to it when it goes off.

On a somewhat related point, for freely available resources, I would buy a coffee or make donations where possible to people who devote their time to help others. :)

I hope you will find something that works for you! Good luck!

Main updates as of May 2020:

Computer Architecture: added Computer Systems: A Programmer's Perspective as first recommendation over nand2tetris.

Compilers: Crafting Interpreters added as first recommendation over dragon book.

Distributed Systems: added Designing Data-Intensive Applications as first recommendation over Distributed Systems.

Online availability of some video lectures has changed as well.

CS:APP is now the recommendation over Computer Organization and Design.

My school actually uses the same book for Computer Systems (similar to CMU's 15-213), but less intensively. I wonder what parts the 400-level Computer Architecture course uses.

Harvard also uses CS:APP for their first systems course, CS61.

Are you thinking or debating whether getting a deeper understanding of CS will help you or your career?

I am thinking about the same. Even though I have a Bachelor of Science (BSc) in Computer Science, it has been 14 years since I actually used any of it. I regret it sometimes, since I know what it is to know how things work underneath the high-level languages (PHP, JavaScript, and Python for me).

Lately I have started taking more interest in CS and I think part of the reason is Rust. You could be learning Go, Swift or any similar language and come to similar feelings.

In Python (my language of choice for the last 7 years, along with JavaScript) I do not really have to care about pointers, stack or heap allocation, how memory gets created or cleaned up, etc. I also do not need to know whether there is a virtual machine, a garbage collector, etc. Data types and how they are handled are somewhat important, but I rarely get slapped by the Python interpreter for mistakes.

To me it seems that a lot of developers are choosing to go into more strongly typed and lower-level (closer to the metal) languages. My gut feeling says this is happening because we cannot crank up core speed from 2 GHz to 3 GHz to 4 GHz and beyond as easily as we did in the pre-2 GHz era. Yes, we hit ceilings of per-core power. But we have more users than ever before, and the world seems eager to bring computing to even larger audiences.

All this is great, but it means developers need a more refined and powerful set of tools that can work in low-power environments, boot up fast, and deliver more computing throughput. The high levels of abstraction that most of us have enjoyed will stay, but there will be new territories where you want to push code that is way, way more efficient. If we want to work in them, we need to know not only how to push a template into an `<h1 />` but about the metal underneath: how browsers work internally, how networks work, how processes run and communicate, how memory is allocated, and so forth.

So this is a choice we have to make individually. I choose to learn and I am going with Rust and spending time to learn CS again. It is a slow process though, I have full time commitment to my product as a solo founder. Cheers!

I want to add something to this, something that I am understanding now perhaps because I have slowed down as a person. Learning is HARD, really really hard. More so when there are tons of social, professional and personal distractions.

On top of this I am chased by procrastination. I have gotten a much better grip on it. But just as an example: I have tried learning Rust at least four times since Rust started getting highlighted on HN, and I failed each time. This time, I have at least gotten to the point where I opened the documentation on Vectors! My goodness, who am I? Complete lack of patience.

This is exactly why I sense pressure when I see statements like "type 1/2" engineer. It is not the material attacking me; it is my own failure hiding under the shelter of ego.

I think this is an extremely common experience. At least it maps really well to what I've gone through. Small surges of "motivation" wherein I embark on projects and then they taper off, constant battles with procrastination, etc.

However, at least in my experience, it's possible to develop tools and systems to stay disciplined enough to finish what I committed to. Not every tool works for every type of project so it's a constant battle. The good news, though, is that you can fail at this as many times as you'd like (I certainly have) but you only have to succeed once!

>> I know what it is to know how things work underneath the high level languages (PHP, JavaScript and Python for me)

I suspect this has helped you more than you seem to know. This knowledge opens doors; gets you jobs you couldn't otherwise get; and makes you a better programmer.

>> In Python (my language of choice for the last 7 years along with JavaScript) I do not really have to care about pointers, stack or heap allocation, how memory gets create or cleaned, etc. I also do not need to know if there is a virtual machine, a garbage collector, etc.

I have seen some horrendous Python code written by people who didn't understand these things. The Python VM "hides" them from you, but they're still relevant for writing fast, correct code, since almost every operation in the language uses one or more of them "under the hood".
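As a small illustration (my example, not anything from the parent comment), the same `in` operator hides very different machinery depending on the container, which is exactly the kind of under-the-hood detail that separates fast Python from slow Python:

```python
import timeit

haystack_list = list(range(100_000))
haystack_set = set(haystack_list)

# Same `in` syntax, very different work underneath: the list is scanned
# element by element (O(n)), while the set hashes the key and jumps
# straight to a bucket (O(1) on average).
t_list = timeit.timeit(lambda: 99_999 in haystack_list, number=1_000)
t_set = timeit.timeit(lambda: 99_999 in haystack_set, number=1_000)
print(f"list membership: {t_list:.4f}s, set membership: {t_set:.6f}s")
```

On any machine the list version should be slower by orders of magnitude, even though the two lines of calling code look identical.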

Best of luck with Rust and your startup!

You are perhaps right that I am not aware the core CS knowledge is helping me all the time. I have always been able to explain technical issues on a whiteboard all the way down to the level of protocols or internals, and that has helped me immensely in my work.

I guess since I do not spend a lot of time in CS literature, I have the feeling I do not know much. But I guess I already have an edge at work without noticing it.

Thanks and best of luck to you!

A lot of people link CS with a career as a software engineer for the web but leave out a lot of other software engineering and research opportunities. I get that deep knowledge of CS doesn't always directly help in web development, but it can really help in other areas of software development (embedded systems, OS, graphics, etc.).

Of course everyone should keep learning, or they will be left behind in pay or in life generally. But it shouldn't be framed negatively, like "type 1 vs type 2". There is much more to be achieved by being nice to people and learning to cooperate helpfully than by being a master of algorithms and low-level operating system details.

I think it is not intentionally set up as negative; I think that is our perception, since learning is hard. I have failed a lot over the last many years to learn (or relearn) deeper concepts, even though I wanted to.

So when I see "type 1/2" I automatically feel pressured. But that is my own inability to stick to the subject and the books, my own procrastination. Learning is hard; escaping with an excuse about ego is way easier. Rust is already teaching me this.

Rust also seems like a great choice because of its close ties with WebAssembly. Web developers who learn it should have an advantage as the tooling starts to get better around WASM.

That list is...not bad, as far as it goes.

I would suggest Essentials of Programming Languages by Daniel P. Friedman and Mitchell Wand instead of (or even better, in addition to) Crafting Interpreters. The former is about languages, the latter compilers (or rather interpreters). They are complementary.

A couple of things I would add:

Logic: Rather than saying it is "applied math", I prefer to view programming as "applied formal logic". Everything from fancy type systems to Fowler's refactoring to Dijkstra's "weakest precondition" semantics for normal-human-being programming languages is a simple application of formal logic. Logic for Computer Science by Steve Reeves and Mike Clarke seems a good choice.

Automata theory: OK, if you're going this far, you might as well go all the way and learn what all that "finite state machine", "universal Turing machine", and "incomputability" crapola is about. Introduction to the Theory of Computation by Michael Sipser is pretty good.

Old-school Artificial Intelligence: Games. Plus, what can I say, it'll come back again. Artificial Intelligence: A Modern Approach by Russell and Norvig. (Oh, there it is, down towards the bottom.)

Sorry, I don't have video suggestions. And yeah, some of those aren't free. But they're not horribly expensive either. Maybe there are alternatives.
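To give a flavor of the automata material mentioned above, here's a minimal finite state machine simulator; this is a toy sketch of mine, not something from Sipser's book:

```python
def run_dfa(transitions, start, accepting, s):
    """Simulate a deterministic finite automaton on input string s."""
    state = start
    for ch in s:
        state = transitions[(state, ch)]  # exactly one move per symbol
    return state in accepting

# A two-state DFA over {0, 1} that accepts strings containing an even
# number of 1s.
transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd", ("odd", "1"): "even",
}
print(run_dfa(transitions, "even", {"even"}, "1001"))  # True (two 1s)
print(run_dfa(transitions, "even", {"even"}, "1101"))  # False (three 1s)
```

The interesting part of the theory is what machines this simple *can't* recognize (e.g. balanced parentheses), which is where pushdown automata and Turing machines come in.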

"Still too much? If the idea of self-studying 9 topics over multiple years feels overwhelming..."

I'd have some snarky comment here, but I'm bitter.

I like Algorithms by Robert Sedgewick. It's a great book. He also has free online courses based on the books that are of excellent quality:

- https://www.coursera.org/learn/algorithms-part1

- https://www.coursera.org/learn/algorithms-part2

There's a website for the book as well, that has code that is well organized and commented.

Unfortunately, excellent books like "Programming Pearls" have code examples with weird naming conventions that just make me want to punch the wall.

For competitive programming and interviews, Antti Laaksonen's books are fantastic: https://www.cses.fi/book/index.php

I’ve heard that Part 1 of the algorithms course is sufficient for new grad interviews. Can you verify?

I got an interview that involved part 2 (substring search), which I solved using KMP.
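For reference, a compact KMP in Python looks something like this; it's my own sketch of the standard algorithm, not the commenter's interview solution:

```python
def kmp_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    # Failure table: fail[i] = length of the longest proper prefix of
    # pattern[:i+1] that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text without ever backing up in it: O(n + m) overall.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - len(pattern) + 1
    return -1

print(kmp_search("ababcabcabababd", "ababd"))  # 10
```

The failure table is the whole trick: on a mismatch you fall back within the pattern instead of rescanning the text, which is what gets you linear time.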

It sounds like you got hazed.

I still don’t understand why these software interviews do this, especially when the work in question is derivative and mostly just reformulating other ideas and libraries.

It’s like, how often are you going to have to reinvent your own red-black search tree to solve some obscure business problem? Most of the time, the business doesn’t give a damn. Just throw your data into a database and be done with it, then use some queries to filter through your data and get what you need to solve the problem at hand.

And usually, by looking at someone’s code style, you know how good they really are. You can determine whether (1) they are at a hackerish level and their code has bugs and occasionally breaks, or (2) their code is solid, software-engineering-quality work that’s airtight and can survive anything you throw at it.

I prefer these interviews rather than take-home projects, which usually take longer and can be cheated.

A company once sent me a really long take-home exercise that would have taken about 3 full days to implement to completion. I said no thanks, and in those same 3 days I had multiple interviews and even got an offer.

senior interview or new grad?

Phone interview for a senior role, not onsite.

> I’ve heard that Part 1 of the algorithms course is sufficient for new grad interviews

I am planning on returning to university for a CS PhD after discovering that studying this curriculum is consistently the highlight of my day. Studying CS is the most interesting thing I do on a regular basis. I owe my thanks to the authors.

I'm self-taught too, and strictly interested in research-focused CS master's or PhD programs (as opposed to industry-focused cash cows). I think I'm mostly interested in studying PL or program synthesis, but I'm also open to other ideas. How did you make that jump from self-taught to qualified for a PhD program?

I made this exact jump. The answer, for me, was a four year bachelor's degree.

I was a self-taught software engineer, not a self-taught computer scientist. I was writing production code for fairly normal stuff, not proving theorems or implementing research prototypes of new ideas. So YMMV.

If you can make it through TAPL or PFPL on your own, then you probably know enough to be useful to some random grad student somewhere. Latch onto an implementation effort, get authorship on a POPL/SPLASH/PLDI paper or two, and that's probably enough to convince some faculty somewhere to take a chance on you.

But most good graduate programs still require a regionally accredited degree. Getting an exception to that rule will likely require a publication record. (Getting into good graduate programs often requires a publication record in any case. I recommend against attending not-good graduate programs.)

I'm almost halfway through this: https://plfa.github.io/

I was auditing a grad class at UT that used this book, then corona hit. Still plan to finish it eventually though

I'm not even really sure where I would get started working on an implementation or working on a paper. Are these projects open source?

I would like to add this related question to the discussion: for those who have held an undergraduate Computer Science degree for a few years and are interested in academia but unsure of the next steps, what resources, courses, or learning techniques do you use to continue your study? At some point I would also like to return for a PhD or master's degree, and I would like to keep myself mentally trained, but there are so few resources for higher-level study. All of the online resources I've tried are almost exclusively introductory and don't provide the challenge I crave.

Has anyone taken higher-level courses at a local university in their free time, just for the experience and to be surrounded by like-minded individuals, while having a professional career?

Email them.

What topic would you be pursuing for R&D?

What field are you going into?

Ha, super ironic that this would be atop HN today. I've recently decided to dive back into this journey as well. Anyway, I was watching the final lecture of David J. Malan's CS50 course yesterday and found this awesome slide he shared. I screengrabbed it and now I share it with all of you.

It's essentially "where to go from here and beyond" (by course)


I re-uploaded that image to a host that doesn't require JS to view an image, because requiring JS to view an image is peak JS stupidity.


Wow, apparently Imgur is now redirecting their plain image links to pages which _do_ require JS for viewing. Thus, Imgur has now been removed from the list of image hosts that I use (and I suggest others to do so as well)

Instead, here's the actual image, without any of the JS: https://i.postimg.cc/hjzXvrNc/yvA5Eiz.jpg

CS50 is how I started to learn CS. The fact that it's freely available worldwide is crazy. I could never have followed a Harvard class otherwise, and the content is great.

I like to compare CS to medicine.

Most of its topics aren't difficult to understand, but overall it's just a lot of information. If you want a comprehensive CS education, you have to accept that it takes years of studying, learning, and practise.

Agreed. I've got a medical degree and have been a self-taught developer for 15+ years. The longer you do it, the easier it becomes, as the knowledge just kind of seeps in through immersion. But I can imagine that a mind new to either topic can quickly feel overwhelmed without significant motivation to overcome that learning inertia.

Would you say CS is generally easier, with a smaller body of fundamentals, and a front-end JS dev could swap places with a back-end C# dev and the two would get up to speed a lot faster than an orthopedist and cardiologist, with all the specialized information in those two fields? Medicine just comes off like a lot more memorizing of an enormous volume of disparate information compared to programming.

I'd say that in CS there's a lot more transferrable knowledge, whereas in medicine there's a lot of specialization and niche knowledge. The cardiologist would know very little about the orthopaedic specialty, for instance. I was a GP myself, and likely knew more about medicine and surgery in general. Although I had less specific knowledge, I could integrate information better (a project manager vs a database specialist, if you will).

So yes -- in CS, if you work in an area with a lot of general knowledge, you are likely to get up to speed faster. In medicine, if you're a GP, you can probably undergo additional specialist training much quicker than someone who pursued a single specialty first.

From a frontend dev perspective, a complex system like a living human being is nature's monolith of legacy hacks on top of legacy hacks, but you're damn right it holds up under millions of requests per second.

I wonder if there are any TeachYourselfMedicine resources?

That would be very dangerous, albeit useful. Making a diagnosis and giving a therapy have different consequences than programming software and executing code.

Yes, computer systems are carefully designed to prevent programming errors from killing or injuring anyone, costing too much money, and so on.

Feel the power of the sarcasm side...

Trusting doctors can be dangerous too (personal experience)

There are many similarities indeed (e.g. abstraction: cell -> tissue -> organ, modularity, etc.), but the fundamentals are different (deterministic CS vs. probabilistic biomed).

At 200 hours per subject, it would take 2 years to complete if studying part-time (20hr/week) during only the working weeks of the year (92 weeks). If you're uncomfortable working+studying 60-70hrs a week, it would probably take 4 years.
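Spelling out that arithmetic (assuming nine subjects at the commenter's 200-hour upper bound; the exact subject count and weekly hours are the assumptions here):

```python
# Rough timeline estimate for working through the curriculum part-time.
SUBJECTS = 9                 # assumed subject count from the guide
HOURS_PER_SUBJECT = 200      # upper bound of the guide's 100-200 hr range
HOURS_PER_WEEK = 20          # part-time study on top of a day job
WORKING_WEEKS_PER_YEAR = 46  # allowing for holidays and breaks

total_hours = SUBJECTS * HOURS_PER_SUBJECT            # 1800
weeks_needed = total_hours / HOURS_PER_WEEK           # 90
years_needed = weeks_needed / WORKING_WEEKS_PER_YEAR  # just under 2

print(total_hours, weeks_needed, round(years_needed, 1))
```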

Are there blog posts around where people share their experience of undertaking and completing this endeavour?

Heck, I don't know of any blogs with people sharing their experience of an undergrad CS degree.

Good point. I’ve been meaning to write up my experience, because I feel that it was really poor and Australian students need to be more aware of how sub-par our CS education system is.

Though it doesn't cover programming directly, I've always loved Computer Science from the Bottom Up - https://www.bottomupcs.com/

Not OP, but I think it's good enough to warrant reposts so new people can find it.

‘dang is saying “see also”, not “this doesn’t need reposting.” From what I’ve seen, reposts after a year or so are fine.

Also past comments are sometimes better (and other times not) than average. Just depends on who chimes in and their mood and what else is happening in their world at the moment.

As grzm and brudgers pointed out, the goal of such comments is simply to point curious readers to related threads they may find interesting. If there were a problem with the repost we'd mark it a dupe instead. Reposts are fine on HN after a year or so, or if a story hasn't had significant attention yet. This is in the FAQ: https://news.ycombinator.com/item?id=23150065.

I've been looking for a brief, unambiguous phrase that can point people to related threads without having to spell all this out every time. A reader suggested "See also" which seemed like it would hit the spot: https://news.ycombinator.com/item?id=23150065. I guess not!

Hmm, "see also" strikes me as pretty unambiguous phrasing.

I wonder if the problem is that when people are skimming, they're going to miss a pair of short words in favor of the larger URLs (and by "larger" I mean the physical space the text occupies on screen). A bunch of links to the same post will always seem damning if you don't know better.

> Note: this guide was extensively updated in May 2020.

Prior discussion may not be exactly relevant on some points.

Does anyone have a github 'awesome' type of page with aggregated teach yourselfs? I would love to have it for other subjects.

Having gone through all of these courses myself, I'm wondering if this stuff is really relevant to today's workforce. Compiler design? How to build an operating system?

I remember in a job writing my own BNF grammar and a parser for it to optimize something or other, and I got back basically "WTF" on the code review.

I would say database fundamentals is a must, so is networking. But honestly the rest is iffy. When was the last time you needed to build a red-black tree?

I think programming should be a larger part of CS. And no, not in Lisp. I also think ML needs to be more of a thing as well.

While most programmers are web and mobile application developers, not all are. Yes, there are people out there doing systems programming, contributing to the Rust compiler, and all those things. Further, debugging a system is harder than writing the system and much, much harder if you do not have a suitable store of background information.

Insert story about explaining to my aerospace engineer buddy why

    for i in 1..n:
      s += y
for strings s and y was fine on one browser and very, very slow on another.
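The gotcha, for anyone who hasn't hit it: some runtimes optimize repeated string concatenation and some don't, in which case each `+=` copies everything built so far, making the loop quadratic. A rough sketch of the two strategies (in Python for illustration; the original story was about JS engines):

```python
# Two ways to build a string of n copies of y.
# Naive += may copy the accumulated string on every iteration
# (O(n^2) characters moved in the worst case); collecting the
# pieces and joining once at the end is O(n).
def build_naive(y, n):
    s = ""
    for _ in range(n):
        s += y
    return s

def build_join(y, n):
    return "".join(y for _ in range(n))

assert build_naive("ab", 3) == build_join("ab", 3) == "ababab"
```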

When was the last time I wrote a red-black tree? Never. I've never had to write a red-black tree. If I need one right now, I'd have to go look it up. (But then, I'm charging $5 for hitting the pipe with a hammer and $95 for knowing where to hit the pipe with the hammer.) On the other hand, I had to build some damn data structure or other last week, and that is a very similar problem. Don't miss the forest for the trees.

Should programming be a large part of CS? Yes, absolutely. Doing these things in practice is a big part of learning them, and that's why a person with an undergrad degree in CS is employable and a person with an undergrad degree in Archaeology isn't.

To be fair, this isn't "teach yourself the skills necessary to be successful in a development job," it's "teach yourself computer science."

I'm being sincere when I ask: what would you do with a self-taught CS background? I understand many of the larger companies require an understanding of CS principles, but, if not to be a successful developer, how useful is a self-taught CS education? What do you do with it if not software development? In my somewhat limited experience, the more CS-focused folks hold PhDs, and sometimes I feel it's more about the credentials than the education.

Keep in mind that "successful software developer" is a fuzzy term. You can be a successful developer if you are gluing APIs together for an internal enterprise web app and making six figures per annum. You can also be an unsuccessful developer in those same circumstances, if your ambitions are different.

The "CS focused people" that you typically hear about usually have Ph.D.s because you usually hear about (and from) people doing fancy new things. There are many other "CS focused people" out there just keeping the lights on, or rather keeping the backbone routers from being swamped under their own routing tables.

In my experience, CS is useful when software engineering breaks down. Many foundational abstractions in modern high level languages and frameworks are leaky. This leads to pathological behavior under stress (extreme scale, extreme performance, groundbreaking tech, unusual requirements). CS is also a big help when you're exploring new technical areas that frameworks haven't been developed for yet.

You know that person who you always go to when your program malfunctions? Or the high level engineer at your company who also seems to know how to diagnose a site incident or other strange software problem? What about the technical founder who leads a startup to do something new and extraordinary? Or how about the person who wrote your framework/language of choice? These are people who understand CS fundamentals, irrespective of whether they have a formal CS degree.

Not coming from a CS background, I’m always looking for resources to grow.

Was quite turned off by the “two kinds of people” assertion, broken down by understanding of CS. There are still holes in my understanding, surely, and I fill them in where I can so I can feel like a true Scotsman, but I’ve never had trouble dropping down abstraction levels when I need to, and even 10 levels down I’m still not close to logic gates etc.

As far as I can tell the computer always tells you what’s wrong if you can follow the breadcrumbs.

I wouldn't trust myself to sit down and code a red-black tree. I'm guessing if you find a GH repo with a RB tree implementation, there will be more than a few closed issues because these kinds of things often have many edge cases.

What is more important I think is to understand why a RB tree exists: what are its unique characteristics vs a binary tree, B-tree, AVL-tree, etc.

Same thing with hash tables. If you have never coded a hash table implementation, it may be puzzling to discover there are many, many hash tables, each with pluses and minuses. The real skill is being able to understand the trade-offs and choose a hash table implementation that will work for your specific problem. And when it falls down for some specific data, load factor, etc., understand why.
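If you've never written one, a toy separate-chaining table makes the load-factor trade-off concrete. This is a deliberately minimal sketch (the names are my own, not any library's API): a higher load factor saves memory but makes chains, and therefore lookups, longer.

```python
# Toy separate-chaining hash table. The knobs are the bucket count
# and the maximum load factor before resizing.
class ChainedHashTable:
    def __init__(self, capacity=8, max_load=0.75):
        self.buckets = [[] for _ in range(capacity)]
        self.size = 0
        self.max_load = max_load

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:             # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.size += 1
        if self.size / len(self.buckets) > self.max_load:
            self._grow()             # keep expected chain length bounded

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

    def _grow(self):
        # Double the bucket count and rehash every entry.
        items = [kv for bucket in self.buckets for kv in bucket]
        self.buckets = [[] for _ in range(2 * len(self.buckets))]
        for k, v in items:
            self.buckets[self._index(k)].append((k, v))
```

Resizing when size / bucket count crosses a threshold is what keeps average lookups O(1); open-addressing tables make a different set of trade-offs around the same knob.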

I think ML should be included mostly just the fundamentals to understand what's happening, and to give an accurate radar to students for spotting bullshit.

Looks like the fundamentals of ML would be covered by the Math for CS section.

I really liked "The Computing Universe" by Tony Hey. It's a book which covers most of these topics on a higher level, but I felt it was (very roughly) an abstract of what I learned in the first CS semester. So I find it a very good introduction into the field for interested non-CS people.

As a 30 year old software developer who's now considering going back to school for the next 3-4 years to get a degree, I wonder if I should blame these kinds of resources for making me believe that I could learn everything on my own and that a degree was unnecessary. Having learned this material in school would have made my life so much easier...

For context, I recently received a lot of pressure to get a degree, despite years of experience and good knowledge of CS fundamentals.

Perhaps Western Governors University could be a good fit. https://www.wgu.edu/online-it-degrees/computer-science.html

The philosophy of WGU is that you pass self-paced courses by passing tests, so people that already know the material can complete the courses quickly.

I wish I had known about them years ago. I applied last week, but I'm still waiting for their decision (they don't officially accept international students).

Are they a respected degree in CS?

They are accredited and mostly pursued by already practicing software engineers who want to check the box for a degree.

That's not an answer. :-)

Accredited is indeed a valid answer: https://www.wgu.edu/missouri/about/accreditation.html

I'm doing that right now at 32, but more for personal enrichment than anyone pressuring me.

I feel like I'm getting much more out of my degree than I would have if I went the traditional route because I understand the context for the material, and because I've seen much of it before through self teaching, so I can focus more on consolidating my knowledge of the fundamentals rather than struggling to absorb all of these hard and often poorly justified concepts for an exam.

Georgia Tech has an online MS in CS or Cybersecurity. The CS degree is a little cheaper, classes are roughly $650 each, cyber are around $900. You're looking at around $7k or $12k for the entire degree. I'm currently in the Cybersecurity program if you have any questions.

https://omscs.gatech.edu/ https://pe.gatech.edu/degrees/cybersecurity

Another option is NYU's Cyberfellow's MS for cybersecurity. It's around $16k https://engineering.nyu.edu/academics/programs/cybersecurity...

Where is the pressure coming from? Your employer? I’d guess it varies widely between companies.

The pressure comes from everywhere:

- Current employer: "Your salary is already good for someone without a bachelor's degree"

- Current employer: "We're sorry but this promotion requires a bachelor's degree"

- Future employer: "We're sorry but we hired someone with a bachelor's degree for this job"

- Date: "You don't have a bachelor's degree?"

- Friends: "You're the only one without a bachelor's degree"

- Meetup: "What did you study in college?"

- Country: "We can't grant you a work visa without a bachelor's degree"

- Family: "You will have more opportunities with a bachelor's degree. We'll even pay for it." (offer expired)

- School: "Here's a scholarship" (offer expired)

- 80,000 hours: "You basically don't exist"

- Brain: "Maybe I'm not a superstar coder and job security is important"

Perhaps some of these are hypothetical, but they feel the same as if they had been concretized.

I have a hard time believing that any employer ever cared about my CS degree from 24 years ago, from a no-name state college, when they decided to hire me.

I'm not sure my classes in COBOL, FORTRAN, and data structures in Pascal made any difference. Well, I did take a course in C and 16-bit x86 assembly.

It was different times back then, I'd argue; you could basically do 5+5 in a shell and get hired and shipped off to New York, but they were also constrained enough by the hardware to be forced to become clever after getting the job.

In 2018, Apple had 20 million registered developers, and the number has only increased since then.

It's still very easy to become a developer right now (my partner found it to be the easiest path forward, for example, starting from no experience at all), but the trend is quite scary. I frankly have no idea how it will look when I turn 60.

I'm saying that once you have experience, as the original poster has, most companies even now don't care whether you have a degree.

In the immortal words of every poster on r/cscareerquestions: "learn LeetCode and work for a FAANG". No, I didn't do that to "get into a FAANG", but it is a much easier route than the one I took.

If you want to go the enterprise route, if you have experience and do enough resume driven development and can answer basic techno trivia on a company’s chosen stack, it’s even easier.

does similar pressure exist for a Master’s Degree? I’ve considered OMSCS for this reason.

Ask me after I get my bachelor's degree.

I suspect the pressure never really goes away.

> despite years of experience and good knowledge of CS fundamentals

Have you considered majoring in something other than CS? If you're adequately self-taught and need the degree, then might as well learn something new and useful.

This is what I'm considering. I'm relatively content having been programming long enough. If I go back to school it probably won't be CS.

On the flip side, unfortunately CS is one of the few useful degrees where enough schools have created worker-friendly programs (i.e. online), so many other degrees would probably mean dropping out of the workforce.

Don't quit working to get a degree, and don't take on a lot of debt to get one.

Unfortunately the way the US education system is designed I don't have much of a choice atm.

I have considered the idea, but I don't know if the additional effort would be worth the possibly marginal benefits.

I've considered biology and law because they seem to pair well with software. Perhaps smaller niches would be even better.

was a Bachelor’s in a quantitative subject not sufficient?

Just going to drop this link here: https://news.ycombinator.com/item?id=23564832

Save you a click: TeachYourselfComputerScience Slack channel

Can anyone with a CS undergrad from an elite US college (Stanford, Berkeley, MIT, Carnegie Mellon) comment on whether they feel their undergrad covered 100-200 hours worth of these subjects?

As someone with an Australian CS degree, I feel like my curriculum covered less than half of this properly.

Let's see: a one-semester class is 3 hours/week for 15 weeks, or 45 hours of lectures. The semi-official ratio was to expect 2-3 hours of work per lecture hour in addition to the lectures themselves (haha, yeah, I slacked off a lot), so that would be 135-180 hours. The books chosen sound appropriate. Sounds about right to me.

This was UT Austin, which may or may not, now or ever, be an "elite US college", and it was 30 years ago. (Holy crap, where's my walker?)

That's definitely strange. I studied at a university in the Netherlands which implemented what I think was a European standard curriculum, and what this article mentions is a subset of the first two years of the undergraduate degree if I remember correctly.

Some elective courses are missing, as is AI, though it is mentioned. We also used many of the books mentioned here. Also, back when I did it there were more separate math courses that we shared with the maths undergrads. I know that they stopped doing that after I graduated.

Undergraduate studies in the US in general are very different from in Europe.

I did my undergrad (EECS) at Imperial and MIT (for my final year), and the difference between the two was pretty enormous. My home department expected me to take five graduate classes (plus some undergraduate classes for "light relief") over the course of my year at MIT, something the other undergraduates there thought was very unusual.

In general US undergrad is much broader than in Europe, and doesn't go into as much depth, even though a US bachelors is a year longer than a European one. Not necessarily a bad thing, it just depends on what you consider the point of an undergraduate education to be; I'd say that most European universities try to structure degree programs which can funnel you straight into research or industry without having a sudden step up (the step up from undergrad to graduate studies in the US is pretty huge) whereas the US thinks it's more important to develop yourself in a broader range of areas rather than see the degree solely as a means to an end.

They both have strengths and weaknesses (having experienced both) and I don't think either can be said to be better.

I feel like a lot of my coursework was overstuffed with learning and re-learning waterfall vs agile, and fluff subjects like 'user-centred design' and 'professional computing practice'.

The latter was basically 'how to make a resumé' and 'how to not be an arsehole in email'.

That sounds like it might have been a professional education instead of a scientific education. Nothing wrong with that; you could get a full computer science degree from our university and not be able to code for shit. You definitely can't hire university grads blindly. You often see developers act tough about how knowing how a CPU works is important for being a developer, and it is, but knowing that doesn't make you a good developer. It just makes a good developer better.

Where was this?

MIT: A normal class will be 12 hours/week nominally. The average self-reported time is usually a little under this, in the 9-12 hour range. Most students are not taking databases or networking classes (not counting 6.033 which is a dud). Compilers, distributed systems, OSs are fairly popular classes but I'd hesitate to say most CS students take them. There are over 100 EECS classes offered each semester so there's a lot of variety in what people take outside of the requirements.

I'm currently studying CS at UNSW and half of these subjects I've taken relevant courses, and the other half are in my future. Where did you have your experience?

RMIT in Melbourne. UNSW is way better than the former, and from what I’ve gathered is the best program for CS in the country.

I went to Berkeley for undergrad.

Anyone in the EECS department who was serious about a future in programming took, at a minimum, two courses on algorithms / data structures (CS61B / CS170), one of which was required for the degree. I suspect that's partially covered by what any crack-the-coding-interview study course would try to teach you anyways (dynamic programming, tree / graph traversal; NP completeness and reductions probably less so).

In general, each CS course at Cal had ~3 hours a week of lecture, 1-3 of discussion / labs, and ~2-10+ hours of homework, varying on the individual and course (not to mention whatever auxiliary studying one might do for exams). With 15 weeks of instruction, that adds up to a minimum of 90 hours, but realistically, 200 might not even begin to describe some courses. A full courseload might look like 4 classes (potentially 3 CS classes + 1 humanities course) and is designed to be ~40 hours a week.

Agreed with the sibling comment that this list seems to have a number of notable omissions -- in particular, it looks to be very systems-oriented. But I also don't think that you _need_ all these courses to be a fully-fledged programmer. Sure, knowing how compilers / languages work might make it easier to pick up more languages, but learning languages across multiple programming paradigms (e.g. functional vs imperative vs declarative), which you'll get anyways if you follow this course list, is more valuable than implementing a recursive descent parser (which will make you really understand one language).
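For what it's worth, the "recursive descent parser" exercise mentioned above can be surprisingly small. A toy sketch for a tiny arithmetic grammar (my own illustration, not taken from any of the listed courses) is just one function per grammar rule:

```python
# Toy recursive-descent evaluator for "+", "*" and parentheses.
# Grammar (one function per rule):
#   expr   -> term ('+' term)*
#   term   -> factor ('*' factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(src):
    return re.findall(r"\d+|[+*()]", src)

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        tok = tokens[pos]
        if expected is not None and tok != expected:
            raise SyntaxError(f"expected {expected}, got {tok}")
        pos += 1
        return tok

    def expr():
        value = term()
        while peek() == "+":
            eat("+")
            value += term()
        return value

    def term():
        value = factor()
        while peek() == "*":
            eat("*")
            value *= factor()
        return value

    def factor():
        if peek() == "(":
            eat("(")
            value = expr()
            eat(")")
            return value
        return int(eat())

    result = expr()
    if pos != len(tokens):
        raise SyntaxError("trailing input")
    return result

assert parse(tokenize("2+3*(4+1)")) == 17
```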

When I went there, Berkeley didn't have an undergrad course dedicated to distributed systems. But, to be fair, the MIT course listed is graduate-level. That seems to be the biggest outlier, as half the catalog consists of intro-level courses at Cal: Programming, Computer Arch, Math for CS, Algos / Data Structures (at least, the first half of the lectures mirrors CS61B; the latter half is closer to CS170).

If I had to suggest a sequencing based on the course parallels, I'd go with:

Core: Programming -> Computer Arch, (Math for CS -> Algorithms / Data Structures)

Intermediate: OS / Networking / DBs / Compilers

Advanced: Distributed Systems (probably want OS / Networking as prereqs, at a minimum)

But, I'd expect a small portion of undergrads took all of the intermediate topics, much less distributed systems.

Regularly-offered upper div courses that evidently didn't make the cut, off the top of my head: Security (this is a big one, IMO), AI, Graphics, Software Engineering, HCI, Computability / Complexity (automata theory among other things), Computer Architecture (beyond the intro-level treatment). Supposedly ML also started being regularly offered for undergrads (CS189).

I would say ML has blown up in recent years. In addition to 189, all CS undergrads are now required to take EE16A+B which, although focused on circuits, teaches a bit of the linear algebra needed for ML; and some take, in addition, 126 (probability) and 127 (linear optimization).

To add on to the parent comment, a 4-unit class is usually considered to have a 12-hour workload per week (180 hours a semester). [1] Two caveats to this: 1. There was controversy in the past when a CS professor said students should actually be putting in 20 hours per week [2]. 2. With some project- or pset-heavy classes, I don't doubt that 20 hours was the reality. For example, in 162 (operating systems), every week we had a tight schedule of lecture, discussion, homework assignments, group projects, and possibly a midterm to study for.

[1] https://academic-senate.berkeley.edu/coci-handbook/2.3.1

[2] https://www.dailycal.org/2019/03/10/uc-berkeley-computer-sci...

Thanks for the detailed reply.

> If you’re a self-taught engineer or bootcamp grad, you owe it to yourself to learn computer science.

No, you don't.

I know all these topics and I don't recommend anyone working in a corporate environment to learn any of this. It'll likely result in you resenting your job, your peers and the entire software industry.


I gave my heart to know wisdom, and to know madness and folly: I perceived that this also is vexation of spirit. For in much wisdom is much grief: and he that increaseth knowledge increaseth sorrow.

Ecclesiastes 1:17-18, King James Version

If you're a self-taught engineer without computer science knowledge, you'll find that you simply cannot work on the lower parts of the stack. Fine for you, perhaps, but it's not wise to limit your own career possibilities.

That biblical argument is pretty much the rationale for the events that take place in Fahrenheit 451 and the reason for the Great Firewall.

That's just FUD.

Recommending ignorance while quoting religious text is so over the top that it makes me wonder if you're just trolling.

You made a contradiction and then gave an irrelevant reason behind.

What does it say about our industry if learning all this makes workers hate what they do? By this logic we should cut down on advanced education so people stay content in dull jobs.

I've included a Biblical quote concerning this matter to point out that this is an old human problem, not merely an industry problem.

> By this logic we should cut down on advanced education so people stay content in dull jobs.

This would indeed be the case, if corporate jobs were an inescapable necessity.

Isn't reduced education a method for keeping people content in most situations?

> It'll likely result in you resenting your job, your peers and the entire software industry.

Jokes on you I resent my job and the industry already.

Ignorance is bliss!

The biggest fish in the pond doesn't need to know there's an ocean next door.

I don't understand why so many programming jobs require a comp sci degree when very rarely do they require anything more than "programming"

The CS degree is not for the job, it's for the individual

Can you elaborate?

See "Why Learn Computer Science" in the OP.

Looks like a decent start; it's great to put these various sources together. Under algorithms, I found discussion of big-O time complexity, one of the real killers for non-CS majors in terms of interviews and career growth, covered in the Coursera course by "Tim Roughgarden ... available on Coursera and elsewhere".

This is very good. I would recommend this to anybody considering a career in software.

My only problem is that it's not very pedagogical; I don't think "read and study the foundations for 1000 hours, you twerp" is really the best approach to ease people in.

I would recommend starting small, making a webpage or some simple fun programs that you enjoy working on. MDN is a good resource (https://developer.mozilla.org/en-US/docs/Learn) for this, but there are probably better guides / tutors out there that can personalize an approach based on where you're at. Glitch is also cool (https://glitch.com/)

There is a distinct lack of Hopcroft and Ullman from this list:

Introduction to Automata Theory, Languages, and Computation, 2nd Ed.


Which is basically all the goodness in Structure and Interpretation... and any book on compilers and interpreters. Basically, though I don't reckon that any modern courses teach from Hopcroft & Ullman, it's a major textbook in the field (unfortunately the 2nd ed is easier to find but the 1st has the works).

Another foundational text is Andrew Tanenbaum's book on operating systems:

Operating Systems Design and Implementation


To be honest, I don't know how it compares with the book proposed in the article, since I haven't read that one.

Finally, two personal recommendations for anyone interested in AI (as a study of advanced CS concepts and not just as a way to make a quick buck with a shallow understanding of a few machine learning tutorials):

Artificial Intelligence: A Modern Approach (Russell & Norvig)


And the free pdf of AI Algorithms, Data Structures, and Idioms in Prolog, Lisp, and Java:


Which doubles as a good textbook for programming languages in general.

The recommendation of "Crafting Interpreters" is pretty weird. I would argue that more important than the specifics about compilers is a high-level understanding of various ideas in programming language design, and be able to understand the building blocks of different languages and compare the trade-offs of using one language compared with another. Dan Grossman from the University of Washington has an excellent course Programming Languages on Coursera: https://www.coursera.org/learn/programming-languages/, which I think would be far more relevant to modern programmers than studying compilers specifically.

Has anyone ever found a guide of similar format and quality for data science? I would love to find a similar collection of resources for learning data science fundamentals (math, statistics, programming, etc.)

You can find resources (including learning plans or "syllabus") at LearnAwesome.org . For eg: Machine Learning: https://learnawesome.org/topics/d6e39f8d-1ba0-4d46-9fbe-771c...

How I, as a mechanical engineering major, started in CS:

My lab had a project that nobody tended to, an evangelist at the lab told me all the good things about linux and git so I bought a burner laptop for $250, wiped the disk, installed ubuntu and wrote 3000 lines of the crappiest code you can ever imagine.

But it worked.

So now when someone asks me for advice on starting fresh in software engineering, I always tell them: just wipe your computer and install $DISTRO. Start from the terminal, and everything else will come naturally (with deadlines pushing, it might come faster).

There is the ArsDigita University curriculum[1], which is an interesting one.

[1] http://www.aduni.org/courses/

I'd like to know the reasoning behind replacing the very approachable, concise, hands-on and doable Nand to Tetris course with what looks like an enormous textbook.

Are you reading a different article to the one being linked?

IMAO, this Type 1 and Type 2 comparison is ridiculous. In this context, Type 2 seems to me to be more about continuous and pragmatic learning. I have done CS and I can't remember using any of it to make money. Do CS because you love the subject and crave more knowledge about it. Do the Type 2 stuff if you need a job/money.

One of my favorite interview questions was "What happens when you type a URL into a browser address bar and hit return?" My answer went into opening a TCP connection and DNS lookups because I'm a networking guy. And I've had the fun opportunity to explain to someone that "no, the application is fine. No one can reach it because NASA has a split-DNS monstrosity that is currently broken. Again." And then there was Friday, when I was helping someone get set up in our development environment; she was able to get to the web interface of Gitlab, but not check out a project. I haven't heard back because she had a meeting, but I'm betting it was because she was on the wrong VPN configuration.
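The very first network step of that interview answer can be poked at directly. A minimal sketch (using localhost so it works without internet access; the helper name is my own):

```python
# Step one of "what happens when you type a URL": ask the resolver
# for addresses before any TCP connection is opened.
import socket

def resolve(host, port=80):
    # getaddrinfo returns (family, type, proto, canonname, sockaddr)
    # tuples; keep just the family name and the address itself.
    infos = socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
    return [(family.name, sockaddr[0]) for family, _, _, _, sockaddr in infos]

print(resolve("localhost"))  # e.g. [('AF_INET', '127.0.0.1'), ...]
```

A broken or split-horizon DNS setup, as in the NASA anecdote, fails right here, before the application is ever reached.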

The problem with "type 2" stuff is that building a system is easier than debugging the same system, and "it works on my machine" isn't always an appropriate answer.

(But resigning and getting a (hopefully better paying) job before the excrement hits the rotary impeller is always an appropriate answer, I guess.)

I feel like the hardest part of modern software engineering is systems architecture. I've pretty easily picked up everything on this list that I've touched, but I still don't know how to organize systems. Is this just me? (if not, then I feel that this site misses the most important part of the discipline)

Because that is the part of software engineering which needs the most experience and general knowledge. There is no way around it. After all, you need to know the constraints around a system before even building it, and that is only possible if you have seen it before. It can help to study other people's work before starting a project in a similar domain.

These types of problems, ones which you can only learn how to solve by solving them, are known as wicked problems. Software design is well known to be a wicked problem.

After my first prototype(s) of a system, I start reviewing design patterns with the goal of identifying how my prototype(s) resemble various patterns; then I start looking at open-source projects with similar design patterns, and eventually I see the righteous path.

This is my approach. Except three days later I suddenly realize a cleaner approach and I wonder why my original plan was so ugly.

I wonder if anyone here has tried this (or other) self taught CS successfully while not being a "self-taught engineer or bootcamp grad".

What if you are a designer or PM type with several years experience in software industry? Is it productive to skip real experience or actual classes to get into CS/engineering?

I'm a PM turned programmer, I did both.

I started with Harvard's CS50 and web programming in Node independently, and I ended up with a masters in software engineering and can now push production-ready code (mostly in React/Node by choice, although studying university-level classes has exposed me to different paradigms and languages: C, Java, OCaml...)

I whole-heartedly recommend concurrently studying first principles and a couple of applied technologies. They serve different purposes but are ultimately connected in the big picture. University courses very rarely teach how to make real-life projects. Conversely, learning real-life technologies rarely shows you the theoretical principles that fuel them, at least not explicitly. React/Redux, for example, have really interesting ideas on managing state drawn from functional programming. To churn out React code, you certainly don't need an FP course or to have formally studied the pains of managing state in programs, but I found it immensely helpful. Same with many other areas: studying systems and networking will help you with many back-end technologies, compilers can help you understand many different programming languages, etc.
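The Redux idea mentioned above — state as a value transformed by pure functions rather than mutated in place — can be sketched in a few lines of Python. This is a generic illustration of the FP concept, not tied to the real Redux API:

```python
from functools import reduce

def counter_reducer(state: dict, action: dict) -> dict:
    """A Redux-style reducer: a pure function (state, action) -> new state.
    It never mutates its input, which is the functional-programming idea."""
    if action["type"] == "increment":
        return {**state, "count": state["count"] + 1}
    if action["type"] == "decrement":
        return {**state, "count": state["count"] - 1}
    return state  # unknown actions leave state untouched

# Replaying a log of actions is just a fold over the reducer.
actions = [{"type": "increment"}] * 3 + [{"type": "decrement"}]
final_state = reduce(counter_reducer, actions, {"count": 0})
print(final_state)  # {'count': 2}
```

The payoff of this discipline is exactly what an FP course makes explicit: because state is never mutated, every intermediate state can be kept, inspected, and replayed — which is what enables things like time-travel debugging.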

I don't think there's a program that teaches both, because it's probably impractical to design it. Simply pick one theoretical route like Teach Yourself CS and a couple of applied technologies, and work on them simultaneously.

You could certainly try to start learning to code with SICP and go from there, but for the most part these resources will be much more valuable to those who have some prior background in programming.

SICP sounds like a big waste of time when you can follow another guide in a language like Python and make applications along the way.

Yeah! Yeah! I'd say you're onto something here, buddy!

Agree/disagree, but after doing SICP myself, following other methods and making projects at the same time (or working on things I was interested in) forced me to learn a lot more than just doing some assignments in a language I would never use again.

I would recommend http://www.bottomupcs.com/ if you're looking for a quick crash course. It's a bit out of date, but it's the shortest book I've found which covers most of the computer science stack.

Of course an MS - which they recommend against - would likely cover additional topics (computer graphics, AI/machine learning, formal methods) as well as follow-on material to the introductory courses listed on the page.

Are these computer science books? They seem like great computer engineering books, but I have only read a few of the suggested ones.

Is there any math or science in those books (aka, could I write a paper from it)?

Anyone who has read them care to chip in?

I think you have a very strange definition of computer science.

There isn't enough math, but there's nothing in the undergrad CS curriculum most places to "write a paper from".

Equivalent “X guide for practitioners with deep knowledge” for any of these (X=) fields:

Deep learning and AI

Where would one go if they had money but minimal time to get a formal online education in CS? Syllabus, other students to discuss ideas with, etc?

Send me $1000. Where do you want the diploma to say you got your Ph.D.?

Good Programmer and Code JS

where do i send it?

how much of this curriculum is necessary for a FAANG job?

I’d love to go through all of them, but I’m constrained with work as well, so I’ll have to prioritize.

More than all of it, but also none of it.

You can always just hack the interview process if your primary goal is a job at a particular Big Co. In that case, learning all of the material is probably an extraordinarily inefficient path toward your goal.

But "any dumbass can hack mega-corp hiring processes" is sort of obvious. If your goal is to do the real thing, as opposed to knowing how to hack the American Corporate Tech Giant mechanism for determining whether you know the real thing, then the answer is "teaching yourself the content knowledge in this list sort of misses the forest for the trees".

I think a lot of low-cost/open source/self-directed programs sort of miss one of the points of education. The reason for learning all that stuff in CS courses is more than just the content knowledge. It's also the intellectual history of the field, the problem solving approaches, how people in this field communicate, etc.

If you think of fields like Econ or Law or Medicine, it's easier to see. There's some raw content knowledge (the rules of evidence, human anatomy), but also there's the huge amount of stuff around that content knowledge. Like, yes you need to know the rules of evidence and the names of every bone. But... there's also a bunch of other stuff you're supposed to be picking up as a result of going through the process of learning that material. The self-study approach somehow misses a lot of the "stuff you're supposed to learn as an effect of learning the material", especially in CS.

I just want to feed my family... I’ll worry about enlightenment later.

Having an actually solid foundation when you're being paid 300K to be an expert isn't "enlightenment". It's "appropriately qualified".

Hacking the hiring process has a good effort vs reward ratio, but it's also high risk. The risk is that obvious gaps in the candidate's background surface after starting the position. Which usually doesn't turn out well: best case career stagnates and worst case the candidate ends up on the bottom of a stack rank when things get lean.

It's a risk that can be taken in a calculated way. Sometimes things turn out well. Sometimes not.

Then would you provide any actionable advice?

I understand your concern about not gaining implicit knowledge while surrounded by peers, but I’ve gained “the ability to think” through my other course of study, and your concerns don’t seem to be attached to a solution. Rather, it seems like a desperate attempt to gatekeep over an increasingly egalitarian field. And sorry, but there isn’t much communication to be learned among CS undergrads.

The knowledge I lack on the job will be taught as I’m exposed to my shortcomings, as is the case with most jobs.

In addition, my school’s CS department was lacking, so most of it was poorly taught, and I relied on self-teaching regardless. Once I get through hiring season, I’ll worry about covering the remaining material before I begin work.

Don't look for a FAANG job? There are plenty of enterprise IT jobs where the primary skill is putting together a web app to look like some designer's Photoshop, with pay 3-4x the median US.

"Once I get through hiring season, I’ll worry about covering the remaining material before I begin work."

Someone who has spent the time wading through this stuff might consider that somewhat insulting. Your future co-workers, say.

> Someone who has spent the time wading through this stuff might consider that somewhat insulting

ok? If I have an entry-level offer in-hand 6 months from now and I’m behind a few textbooks, I think I can manage. Sorry to break it to you, but a lot can be done if you’re healthy and avoid distractions.

If you're ever offered a 1M job, you accept it and figure out the risks later. It's much better to fail after collecting 1M than to grind for 10 years through a risk-free 100K job.

I remember a similar conversation with a CTO of some cheap company that wanted to offer me 100K and arguing that he knew many risk takers who took on high paying jobs in bigger firms and failed. That was many years ago. Now my comp is approaching 1M, my job is super risky and demanding, but I don't care as in the worst case I quit with enough money to not work anymore. I know many vps and execs in smaller firms and their attitude is the same: they don't give a flying f..k so long as the pay/risk ratio is appropriate.

thank you. My risk profile is also higher than most. Do you mind sharing your story? I’m an undergrad looking to sell my youth. Would like to work at a non-profit once I can be financially independent with my family.

I really agree with this. I didn't take the distributed systems course when I was at university, but later I still managed to work at a few FAANGs. Flicking through the material in the MIT 6.824 course, I really wish I learnt this before I needed it, rather than scrambling to learn it piecemeal as things came up. I would've been promoted faster, and things would've been less stressful.

It is possible to learn this on the job. But with hindsight, my advice would be to just take the easy way out and learn it beforehand.

any of the other material you didn’t quite need for the interview itself?

The FAANG interview process is pretty standardized, if you're pressed on time I suggest you learn and master just 2 things: data structures and system design.
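As a concrete illustration of what "mastering data structures" buys you in that interview setting: the questions typically reward knowing which structure turns a brute-force solution into a linear one, e.g. the classic two-sum warm-up (a generic example, not taken from any particular company's process):

```python
def two_sum(nums: list, target: int):
    """Find indices of two numbers that sum to target.
    A hash map gives O(n) time instead of the O(n^2) nested-loop approach."""
    seen = {}  # value -> index where we first saw it
    for i, n in enumerate(nums):
        if target - n in seen:
            return seen[target - n], i
        seen[n] = i
    return None  # no pair sums to target

print(two_sum([2, 7, 11, 15], 9))  # (0, 1)
```

The interview conversation is usually less about the final code and more about articulating why the hash map trades O(n) extra space for the better time bound.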

> The FAANG interview process is pretty standardized

Not really. It is standardized at some FAANG companies but not others.

is system design necessary for new grads?

Honestly, start with web or mobile development and start working on projects. Start with the fundamentals and go from there. CS theory is great but it’s not gonna pay the bills.

FAANG interviews do cover DS, algorithms, math, and OS, though.

Pretty necessary, I can't imagine working at a FAANG without at least having read and understood "Designing Data-Intensive Applications" cover to cover.

Really cover to cover? Don't think that's necessary unless you have deep interest in the subject. There are folks who can barely walk and chew gum at the same time, working at FAANGs.

just to get through the new grad interview, I mean.

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact