Seeing a very non-technical friend of mine being completely immersed in games like "Factorio" and "Satisfactory" where the entire game is essentially just putting constantly more complex and valuable resources together in automated ways is fascinating to me. He will talk about efficiency ratios until someone stops him, and when he's not playing he's reading wiki articles to better understand the problems he's presented with. The sheer look of glee and pride when he gives a tour of all the automation around his base is infectious, and although he doesn't share my amazement at my conclusion, I can't help but think that he's already doing what most engineers I work with do every day, and he doesn't even realize it.
Sometimes I feel like there's so much untapped potential out there, people that would have loved to get their hands dirty with micro-controllers, soldering, programming and all the other stuff that could help them solve real world problems, but they were just never given the chance. They never had the mentorship or the right context to develop those skills. Instead, most programming is done under soul-crushing enterprise conditions, and a lot of teaching about programming is (understandably) only about whatever will benefit you in the workplace. When you're an adult, there's no time for play, so most programmers I know would rather do anything but write code and solve problems when they come home from work.
I'm not sure what my point is here, but it is interesting how people look at programming and software engineering as something inconceivably complex. The way I see it, it's basically just digital plumbing and carpentry, as long as you know how to use the toolbox; yet most people will never even try it.
One reason why I can't really get into Factorio. It's too much like work.
The funny thing is I do code at home, for fun, in my free time, or at least I used to when I had a clear line demarcating my "free time" from work time (I do AI research and I mostly work from home, so most of the time I'm working on something that I really have to work on). But Factorio is just ... too much like work.
... dah, I can't explain it :)
I approached Factorio like MineCraft but with the boring stuff automated.
To have fun in a game like Factorio, just mess around - you don't need to make a perfect factory, or compete with your friend.
Anecdotally, this was _almost_ me. I ended up going to university to study CS much later in my life than most people, and only because of a fortunate meeting with someone who was from that kind of background and 'guessed' I would be into it, and if I didn't, who knows what crappy dead-end job I would've ended up in.
How lucky I am to have 'made it' into a career I enjoy, and how many other people are out there who would love to have this opportunity, but instead are working in dull, menial jobs, just like I was before uni, is something I think about often.
I'm not sure how we get more people involved. There are 'STEM ambassador' (https://www.stem.org.uk/stem-ambassadors) roles in the UK. I'm not sure of their efficacy.
I don't know what it is about his style of teaching but he completely changed my view of IT and the IT industry. Before then, my main tech knowledge came from the (at the time) terrible exposure you got in rural schools back in the 2000s, and from taking electronics apart to try and see how they worked. I thought programming was way beyond me as I didn't do well at all in school and had major self-esteem issues as a kid. Matt is one of the key reasons I gained a serious interest in technology and has helped probably thousands more like me. He showed me that all it takes to learn anything is time, effort, and a healthy dose of curiosity.
Maybe other educators could reach out to him and ask how he does things differently? I'd love to see things like this crop up nationwide, I feel it'd seriously change the knowledge base of tech issues for future generations.
https://twitter.com/Pixelh8 (Matt's Twitter.)
creativecomputingclub.com (Website appears down for maintenance at time of writing.)
This could also explain why many would rather just play games than solve real-world problems when they come home from work. It requires a bit of discipline, desire and resolve I would say. But one should also realize that the rewards of actually going out to solve real-world problems and interact with real people are so much sweeter than the virtual rewards you earn in simplified and idealized games, in many aspects. I would also say that while many don't do it at home, there are still many people that do, and do an amazing job at it.
Let's do some introspection as to why I prefer playing these games over coding at work:
1. Games are designed in a way that speaks to intrinsic human values. For instance, in Factorio, the local fauna wants to kill me. I have to build and design my base accordingly or I'll perish. My work, in contrast, is far removed from intrinsic human values.
2. Games are designed to be accessible, with every step yielding rewards right away.
2.1. Coding in an enterprise environment is full of complex tasks, which have to be broken down first.
2.2. Games such as Factorio increase the complexity of tasks later on, but you always utilize mechanics you've already mastered.
2.3. Tasks in enterprise environments tend to always require learning new knowledge first (business domain, existing code base, etc.) before you can solve them.
2.4. Games allow me to start from scratch. Existing code bases, which I didn't build myself, require me to invest quite some time before I can even start achieving my task.
2.5. Often, these existing code bases require me to learn new frameworks. Games teach new skills/tools in seconds or minutes. Frameworks tend to require hours or days to digest.
3. Games provide visual feedback. My labor results in visible feedback inside the (virtual) world. Accomplishing a coding task tends not to result in any feedback (I do mostly backend). If there is feedback, it's normally in text form and negative (error logs).
4. In general, feedback in games is unambiguous and clearly tells you what is happening. No debugging. No frustration.
When I actually know everything that is required to solve a task and can concentrate on achieving the task in the best way, that's the moment I'm having fun. But normally, there is so much I don't know, tools I can use but have not mastered, that it feels like a mess. Even if you believe you have everything, reality tends to throw some roadblocks along the way.
Now, why don't I code at home and do side projects?
That's covered in my first point. There is just nothing I could use coding for that solves problems I care about (intrinsic human goals). Even if I knew how to write the next Facebook, I wouldn't do it because I really don't care. Well, there are some code-solvable things I consider interesting, but they are out of scope in terms of my resources and abilities. Also, the road there would probably still suck for the most part once it got serious.
Right now, I'm learning some math in my free time. Quite enjoyable; it gives me new insights into how everything works. I tend not to have that feeling when learning new frameworks / programming languages. New insights are pleasurable to me; tech stuff just doesn't tell me anything interesting about the world.
Maybe some of you who don't feel the same can understand people like me a bit better (not saying that all non side project people have the same reasons).
Some jobs are protected by difficult, mandatory, exclusive education: Law, medicine, piloting. Software development is one of the least protected areas and we also face the threat of outsourcing. To most companies we are an unpleasant cost factor in need of optimization.
Unless you plan to become a manager or owner who profits from cheaper labor cost and more potential employees to choose from, do your part to keep people away from programming.
To learn anything, on your own, self directed, is hard for any subject. I think programmers and devs underestimate the amount of time they spend on "getting good". It doesn't feel hard when you got into it in high school, got a 4 year degree, then spent 10 years in the industry. But if you're 26, working in real estate and have a significant other, it's a huge ask.
On the other hand, I think this attitude of programmers is too smug.
Many jobs that seem simple have to solve difficult problems and require engineer-like thinking. Sure some low-skill jobs, like cashier or mail delivery or truck driving don't require a lot of novel thinking on a daily basis. The bulk of the job is carrying out what you know.
But take for instance event organizers: no two events are the same. You need to negotiate the requirements from the client just like in software, you need to split up the work, plan it all out, execute together etc. Constraints often force you to be inventive.
Or how about lawyers? Even in mundane affairs, like divorce, no two cases are the same. There is probably more in common between one corporate mobile app or CRUD system and the other, than the circumstances of two divorce cases.
How about doctors? Sure some of it is routine, like much of a GP's work. But in complex surgeries they need to come up with a specific strategy and plan for the individual patient.
In fact, I think most jobs paid at the level of software engineers require a similar amount of novel thinking. The contrast is not software vs rest, but well-paid professional jobs vs low-paid, low-skill repetitive jobs (physical and mental as well, like office paper pushing).
Furthermore, many software jobs are really 9-5 "manufacturing". Not every dev is John Carmack. There's a huge demand on really simple business stuff and the very same functionality gets built over and over again by different companies for different business clients. These may be lower paid than the fun Google-level research-like creative software jobs, but they are out there. So top software jobs should be compared to the top of other professions, not just to any "white collar" job (because in the latter case you also have to include the "code monkeys" in the software jobs).
It's not inscrutable, but it's not something everyone can do. And among those who can, not all of them necessarily enjoy it. It takes a high enough IQ (whatever reservations you might have about IQ tests, there's no doubt they correlate with programming ability) as well as an appropriate disposition.
I mean you wouldn't expect every single person to be able to play music or paint well enough to make a living out of it. I sure as hell wouldn't want to be a lawyer, I doubt I could even do the job at a mediocre level if my life depended on it. (Not that I dislike the field: I watch more law-related Youtube videos than I'd normally care to admit, which reinforces my belief that I'd suck at it.)
If you happen to have some article or research in mind, it'd be interesting to read.
(Or maybe you read something long ago and didn't save any links / references.)
I would expect good results to be strongly correlated with an ability to learn and be good at software -- and therefore I was curious about if you had read something related to that.
> just look like the type of tasks a programmer...
Hmm, I think RPM looks very different from writing code. Maybe you had other tests in mind? (Then I'm curious about which?)
You are implying that most people can do that.
I think only about 15% of adults can do what you listed.
> Rant done, it's just frustrating that people limit themselves like that.
"If I can do it, so can you" isn't a fair judgement. Everyone is different, and CS/programming has many complexities that not everyone is interested in putting in the time commitment to learn it.
For me, with time and experience it has become easier to put a solution together, but it is still not easy work, and many times I am mentally exhausted by the end of the day.
I even met one guy who said that he wrote some HTML back in the day and it was easy, so he was thinking of making a career in software development as it wouldn't take him more than a few weeks to pick it up.
Do you really believe this? I'd be interested to know what you do for a living.
Programming requires the ability to grasp abstract concepts which most people simply do not possess. Talent is a thing. I'm terrible at composing music even though I enjoy playing it. I have no talent. Most people are terrible at programming.
I have to conclude that anyone who thinks this is very young and/or has never actually tried to teach someone how to program.
In my experience, this is often due to a lack of interest. Most people (including myself) often just want a product to work. When something fails to work correctly or you want extra features, you bring it to a shop to get it fixed. When you're not too interested in the process of fixing it, the process behind it can feel like magic.
I feel this way with most products I own or buy. For instance, for me, getting my car fixed can feel like this at times.
Getting deeper and deeper into computer science taught me that I could potentially learn a lot, as long as I'm interested enough! Debugging a car couldn't be much worse than debugging a computer system, can it? You probably shouldn't start with some high-end car, but an older example should be doable, right? This attitude got me through most of my college years. Sometimes you have to adjust the bar a little (or a lot), but in the end, you probably can pull it off.
Sitting down and staring at a problem for 10 hours with very little progress can be extremely demotivating for some.
To just get a job in software, though, the skills that are needed aren't really that advanced.
~Some Existing Touch Point With CS
Teachyourselfcs.com is definitely in the conversation - in general I recommend it to friends who have a pre-existing "gateway" to the field (software engineer or otherwise). The page talks about progressing from "type 2" to "type 1" software engineer - a deeper understanding affording a richer career/experience.
~Relatively Fresh - but have strong intent
For people who are more fresh to the field but are willing to put in the work, I recommend OSSU ("open source curriculum"), and I've found, for whatever reason, that its structure has let them ease in with more success. (https://github.com/ossu/computer-science)
~Intrigued/Engaged - but intent is still nascent
For people who have a fleeting interest but don't necessarily have the time or intensity about their interest, I usually recommend a starter Python course; 90% of these people will fall off the path unless there is a more permeable access point to the space that offers a positive feedback loop. The University of Michigan has some engaging options.
Just want to emphasize that first part; I've seen the second in new hires who passed a coding bootcamp, and it's really weird to watch. One of them even had to see the error message and go back to the code and still puzzle over it for several seconds before realizing what was wrong.
I wonder whether a "richer" editor/IDE, that flags such things immediately upon writing them, could greatly improve the learning experience.
Many tools even point out things such as non-existing variables in scope, which are not strictly "errors" in a dynamic language during edit time.
I really like reading books too, but sometimes the completionist part of me is too strong, and I really shouldn't do all the exercises.
For beginners I highly recommend Harvard's CS50, the best of the best; the course is available both free and paid.
Dan Grossman from the University of Washington has an excellent course Programming Languages on Coursera: https://www.coursera.org/learn/programming-languages/, which I think would be far more relevant to modern programmers than studying compilers specifically. And I'm glad to see this course listed under the "Core Programming" section of OSSU.
However, I am concerned with the fact that these curated lists lack a unifying idea of what CS is.
If you go to pure math instead, the consensus is clear. A pure math bootcamp, like Math 55, is mostly linear algebra and real analysis, taught from Axler and Rudin in its current iteration.
Simpler math curriculums progress more slowly, or use easier textbooks, but the idea remains the same: Linear algebra and real analysis.
What is the equivalent distilled core of CS? It can't be so many things. There must be a core which you build upon by stacking other courses and textbooks.
My take on this is logic and computation. They are the algebra and calculus of CS, and they are intimately related by the Curry-Howard isomorphism. I find it alarming that CS students are often not taught the basics of propositional logic, first-order predicate logic, or the lambda calculus. It's the basics, and it is really useful.
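As a small taste of how little machinery those basics need, here's a minimal sketch (the `implies` and `is_tautology` helpers are my own illustrative names, not from any particular textbook) of a brute-force propositional tautology checker, with Peirce's law as the example:

```python
from itertools import product

def implies(a, b):
    # Material implication: a -> b is equivalent to (not a) or b.
    return (not a) or b

def is_tautology(formula, num_vars):
    # Brute-force every truth assignment; fine for small formulas.
    return all(formula(*vals) for vals in product([False, True], repeat=num_vars))

# Peirce's law: ((p -> q) -> p) -> p, a classical tautology.
peirce = lambda p, q: implies(implies(implies(p, q), p), p)

print(is_tautology(peirce, 2))                      # True
print(is_tautology(lambda p, q: implies(p, q), 2))  # False (p=True, q=False)
```

Exhaustive checking obviously doesn't scale past a handful of variables, but that's exactly where the theory (SAT, proof systems) picks up.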
Software Foundations implements a nice curriculum in line with the ideas I have written about, but it is aimed at graduates and consequently it lacks background. I am still thinking about combinations of textbooks for undergrads that would offer something equivalent. Any suggestions welcome.
N2T teaches digital logic, what Turing saw, how it was physically implemented, and how programming languages were and are developed. TLS introduces various topics like the halting problem in an effective and playful manner. Both spark interest, something I find some maths courses unfortunately fail to do for many people.
But somebody must be getting some value out of these things, right? Am I just bad at studying? Or not as interested in the subjects I'm trying to learn about as I think I am?
The streak really is a superpower for self-study of any form. It's so hard to ignore a 1600+ day streak, even when you don't feel like it.
When Github took that streak counter down, I was crushed, combined with me taking a management heavy job (encouraging me to read a lot about soft skills), it really collapsed the whole system for me. One day I'd love to pick it back up, but right now I'm doing the same thing with violin as an adult learner, so that's taking up that time I'd otherwise spend on CS.
When I started learning, the recommended textbook was Distributed systems: Principles and Paradigms (the new recommendation is Designing Data-Intensive Applications) and the recommended course was MIT 6.824.
6.824 is the first online course that I have completed fully and it was well worth it. I read, took notes, and summarized all required course readings (around 20 papers) and completed the labs, which involved creating a raft-based key-value store. The labs were especially useful because they forced me to really understand the details of the topics I had learned (e.g., MapReduce, Raft, Raft’s log compaction) in order for my code to pass all the tests.
I’m very happy with the results of following the course and I now plan to put the same amount of effort into other topics.
I read the textbook over a couple of months, but I couldn't give you a good estimate in hours.
I plan to eventually complete most of the materials, but I take huge, sometimes multi-year detours. I've done the SICP and nand2tetris books; each took around a year (not full-time, obviously; I also went a bit harder on the second half of the nand book, writing unit tests and building an AST for the compiler instead of the easier approach suggested in the book). Then I decided I wanted to do OS, but before that to learn C to be able to implement my own stuff, so I also worked through the K&R book.
On the other hand I'm pretty bad at doing "real world" things, full of edge cases and details, and try to avoid them as much as possible, so I may not be an example of someone getting value in a practical sense.
I feel that I've grown better as a programmer but I can't say how much of it can be attributed to my home studies and how much just to the experience on the job itself.
It's kind of like building a rocket to get to space: you can tinker with propellant, metal, and fire, and be a hobbyist who eventually builds a rocket that goes to space. Or you can be a physicist and calculate everything on paper. To get to space you really need both those people, but the physicist can learn to build and the engineer can learn theory.
So is this why frameworks exist? To abstract away the theory that proves something more "performant" or "sufficient", and allow the builder to do their thing?
A good framework will stop beginners from making critical mistakes, decrease the amount of code its users have to write, and offer professionals well-constructed ways around its handholding as needed.
Particularly helpful, I would say, is a solid grounding in programming language theory, and I can't recommend the MOOC by Dan Grossman from the University of Washington highly enough. As he said, even if you never get to program in the languages used throughout the course, you will become a much better programmer after finishing it.
CS was born out of a time when computers were very limited. How do you write a word processor with a spell checker, when the dictionary alone won't fit on a floppy disk, or fit in memory? How do you draw rounded rectangles on your GUI when they take too many CPU cycles to draw?
CS people managed to carve out paths through problem spaces that could be done on the computers of the day. These days, you put everything you need into a hashtable and it looks up in microseconds and you don't have to care anymore.
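To make that contrast concrete, here's a toy sketch of the "put it in a hashtable and stop caring" approach to the spell-checker example above (the three-word dictionary and the `misspelled` helper are mine, purely for illustration):

```python
# A hash-based set makes the old "does the dictionary fit, is lookup fast"
# problem trivial. Real word lists run to ~100k entries; the approach is the same.
dictionary = {"cat", "dog", "fish"}

def misspelled(words):
    # Membership tests on a set are hash lookups: effectively constant time,
    # regardless of how large the dictionary grows.
    return [w for w in words if w not in dictionary]

print(misspelled(["cat", "dgo", "fish"]))  # ['dgo']
```

The clever trie compressions and disk-paging schemes of the floppy era are now a one-line data-structure choice.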
I have no idea how I'd make someone interested in these things if their mind doesn't pop up such tinkering goals for them.
On the other hand, the above is very inefficient in itself. It is hard to overstate the importance of reading good books, besides the practical learning of the above paragraph.
You need to be able to relate to the topics to some degree, but once you have the hang of it, you can benefit from the years and years of experience of experts and hone your mental models, discover new topics and relations across topics and understand things more holistically.
Learning only from books/courses is not sufficient, you won't be able to apply what you learned if you don't constantly try to map from what the book says to something you are already familiar with a little bit.
But you have to pick good books. Bad books can be more confusing and discouraging. I think American CS books especially from MIT Press are really high-quality. They may be expensive (but there's LibGen).
You may also have problems with delayed gratification. The benefits of an hour spent on reading a textbook will be vague, uncertain and delayed and often hard to attribute to the book. One hour of watching Netflix or Youtube or browsing through 20 genuinely interesting links from HN feels more rewarding. This is a big problem for us all in today's attention economy. You need to be very conscious of this to break out of it.
I got a copy of Benjamin Pierce's "Types and Programming Languages" and I've read basically the full thing and have completed 80% of the exercises.
Currently working through a couple other textbooks slowly. I find that taking notes and doing the exercises is essential, otherwise I lose interest. (Same with online video courses)
I'm being harsh here because I used to have software engineering as a topic for my Ph.D. studies. Part of the reason I left the academic world (after a postdoc) was a bit of impostor syndrome: I realized I was telling others how to engineer software without having much experience engineering software. This always struck me as odd. I've since learned a lot about software engineering that was definitely not part of any computer science studies I know of.
I always think of college as being more about learning to learn than about learning anything specific. Also, the difference between a university and a technical school is that the former tends to be less about practical skills and more about the skills you need to learn new things and ask critical questions. Quite a few projects I've been on over the years have also required me to dive into other domains: I've had to read up on metallurgy, geospatial algorithms, and medical topics at various points for different projects.
I think I had much more fun with these courses after having done many real projects over the years.
Maybe you get bored because it's a little bit easy and basic for you.
That was actually the first time I had self-taught myself anything with the aim of breaking into a field that I had no experience with. It was difficult at the beginning because I had just quit the job I had back then and fell into depression for a personal, unrelated reason; but perhaps that was also the reason that I eventually had the drive and focus to get through what I would normally have found "boring" at the beginning: I had nothing better to do. The experience also turned out to be an empowering one, because now I feel like I could learn a lot of things by myself.
Not sure if it helps, but here are a few things based on personal experience of going through this type of syllabus:
* Assuming that they are written by people who are good at what they do and know what they are talking about, these syllabuses are usually meant to be followed step-by-step unless explicitly stated otherwise.
* Different people benefit from different modes of learning. There is nothing stopping you from using auxiliary resources to help you to get through a syllabus.
* More often than not it takes a few reads of a text, or a few attempts at an exercise, for me to really understand something. In one particular case I read through a book three times, about half a year apart each time, before I felt that I really understood it; and when I finally understood most of it, I was only glad that I had decided to try it a third time.
* I think it helps to have someone to talk to about what you are doing; it could be a close friend or family member who doesn't understand anything about what you are studying. Alternatively, if what you are studying is "popular" enough and there are communities of people doing the same, get connected. Emotional support helps.
* I always try to interact with people who are at different skill levels. There is always a lot to learn from people who are much more experienced than I am; and explaining what I have learnt to someone less experienced helps to consolidate it.
* Focus is very important. Find what works best for you through experimentation: if you can sit through some given material for a couple of hours straight, great (but do take care of your health and at least get some stretches and water in); if you need regular breaks, that's also fine; just make sure you set an alarm clock and commit to getting back to it when it goes off.
On a somewhat related point, for freely available resources, I would buy a coffee or make donations where possible to people who devote their time to help others. :)
I hope you will find something that works for you! Good luck!
Computer Architecture: added Computer Systems: A Programmer's Perspective as first recommendation over nand2tetris.
Compilers: Crafting Interpreters added as first recommendation over dragon book.
Distributed Systems: added Designing Data-Intensive Applications as first recommendation over Distributed Systems.
Online availability of some video lectures has changed as well.
Lately I have started taking more interest in CS and I think part of the reason is Rust. You could be learning Go, Swift or any similar language and come to similar feelings.
To me it seems that a lot of developers are choosing to go into more typed and lower-level (closer to the metal) languages. My gut feeling says this is happening because we cannot crank up core speed from 2GHz to 3GHz to 4GHz and beyond as easily as we did in the pre-2GHz era. Yes, we hit ceilings of per-core power. But we have more users than ever before, and the world seems eager to bring computing to even larger audiences.
All this is great, but it means developers need more refined and powerful set of tools that can work in low-power environments, boot up fast and deliver more computing throughput. The high levels of abstractions that most of us have enjoyed will stay but there will be new territories where you want to push code that is way, way more efficient. If we want to work with them, we need to know not only how to push a Template into an `<H1 />` but about the metal that is underneath. From how browsers work internally to how networks work, how processes work or communicate, how memory is allocated and so forth.
So this is a choice we have to make individually. I choose to learn and I am going with Rust and spending time to learn CS again. It is a slow process though, I have full time commitment to my product as a solo founder. Cheers!
On top of this I am chased by procrastination. I have gotten a much better grip on this. But just as an example: I have tried learning Rust at least four times since Rust got highlighted on HN, and I failed each time. This time, I have at least gotten to the point where I opened the documentation on Vectors! My goodness, who am I? Complete lack of patience.
This directly causes me to sense pressure when I see statements like "type 1/2" engineer. It is not the material that is attacking me. It is my own failure hiding under the shelter of ego.
However, at least in my experience, it's possible to develop tools and systems to stay disciplined enough to finish what I committed to. Not every tool works for every type of project so it's a constant battle. The good news, though, is that you can fail at this as many times as you'd like (I certainly have) but you only have to succeed once!
I suspect this has helped you more than you seem to know. This knowledge opens doors; gets you jobs you couldn't otherwise get; and makes you a better programmer.
I have seen some horrendous Python code written by people who didn't understand these things. The Python VM "hides" them from you, but they're still relevant for writing fast, correct code as almost every operation in the language ends up using one or more of those "under the hood".
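As a concrete illustration of what "under the hood" means here (the `Meters` class is a made-up example): nearly every operator and built-in in Python dispatches to one of the language's special "dunder" methods.

```python
class Meters:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        # The `+` operator dispatches here.
        return Meters(self.value + other.value)

a, b = Meters(2), Meters(3)

# `a + b` is just sugar for a method call:
assert (a + b).value == a.__add__(b).value == 5

# Built-in types work the same way: indexing a dict goes through __getitem__.
d = {"x": 1}
assert d["x"] == d.__getitem__("x") == 1
```

Knowing this is why, for example, a seasoned Python programmer can predict which operations allocate, which are O(1), and why string concatenation in a loop is slow.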
Best of luck with Rust and your startup!
I guess since I do not spend a lot of time in CS literature, I have the feeling I do not know much. But I guess I already have an edge at work without noticing it.
Thanks and best of luck to you!
So when I see "type 1/2" I automatically feel pressured. But that is my own incapability to stick to the subject, the books. My own procrastination. Learning is hard, escaping with an excuse about ego is way easier. Rust is already teaching me this.
I would suggest Essentials of Programming Languages by Daniel P. Friedman and Mitchell Wand instead of (or even better, in addition to) Crafting Interpreters. The former is about languages, the latter compilers (or rather interpreters). They are complementary.
A couple of things I would add:
Logic. Rather than saying it is "applied math", I prefer to view programming as "applied formal logic". Everything from fancy type systems to Fowler's refactoring to Dijkstra's "weakest precondition" semantics for everyday programming languages is an application of formal logic. Logic for Computer Science by Steve Reeves and Mike Clarke seems a good choice.
Automata theory. OK, if you're going this far, you might as well go all the way and learn what all that "finite state machine", "universal Turing machine", and "incomputability" crapola is about. Introduction to the Theory of Computation by Michael Sipser is pretty good.
Old-school Artificial Intelligence. Games. Plus, what can I say: it'll come back again. Artificial Intelligence: A Modern Approach by Russell and Norvig. (Oh, there it is, down towards the bottom.)
Sorry, I don't have video suggestions. And yeah, some of those aren't free. But they're not horribly expensive either. Maybe there are alternatives.
"Still too much? If the idea of self-studying 9 topics over multiple years feels overwhelming..."
I'd have some snarky comment here, but I'm bitter.
There's a website for the book as well, that has code that is well organized and commented.
Unfortunately, excellent books like "Programming Pearls" have code examples that use weird naming conventions that just make me want to punch the wall.
For competitive programming and interviews, Antti Laaksonen's books are fantastic: https://www.cses.fi/book/index.php
I still don’t understand why these software interviews do this. Especially when the work in question is derivative, and mostly just reformulating other ideas and libraries.
It’s like, how often are you going to have to reinvent your own Red Black Binary Search Tree to solve some obscure business problem? Most of the time, the business doesn’t give a damn. Just throw your data into a database, and be done with it. Then use some queries to filter through your data, and get the data you need to solve the problem at hand.
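To make the "just throw it in a database" point concrete, here is a minimal sketch using Python's built-in sqlite3; the table, columns, and data are invented for illustration. The engine's B-tree index does the balancing a hand-rolled red-black tree would have done.

```python
import sqlite3

# In-memory database standing in for whatever store the business already has.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
# The index is the "balanced tree" -- maintained for us by SQLite.
conn.execute("CREATE INDEX idx_orders_total ON orders(total)")

conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 35.5), ("carol", 980.25)],
)

# A range query the index answers without any custom data structure.
rows = conn.execute(
    "SELECT customer, total FROM orders WHERE total > ? ORDER BY total",
    (100.0,),
).fetchall()
print(rows)  # [('alice', 120.0), ('carol', 980.25)]
```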
And usually, by looking at someone’s code style, you know how good they really are. You can determine if (1) they are at a hackerish level and their code has bugs and occasionally breaks, or (2) their code is solid, software-engineering-level quality that’s airtight and can survive anything you throw at it.
A company once sent me a really long take home exercise that took like 3 full days to implement to completion. I said: no thanks, and in the same 3 days I had multiple interviews and even got an offer.
I was a self-taught software engineer, not a self-taught computer scientist. I was writing production code for fairly normal stuff, not proving theorems or implementing research prototypes of new ideas. So YMMV.
If you can make it through TAPL or PFPL on your own, then you probably know enough to be useful to some random grad student somewhere. Latch onto an implementation effort, get authorship on a POPL/SPLASH/PLDI paper or two, and that's probably enough to convince some faculty somewhere to take a chance on you.
But most good graduate programs still require a regionally accredited degree. Getting an exception to that rule will likely require a publication record. (Getting into good graduate programs often requires a publication record in any case. I recommend against attending not-good graduate programs.)
I was auditing a grad class at UT that used this book, then corona hit. Still plan to finish it eventually though
I'm not even really sure where I would get started working on an implementation or working on a paper. Are these projects open source?
Has anyone taken higher-level courses at a local university in their free time, just for the experience and to be surrounded by like-minded individuals, while having a professional career?
It's essentially "where to go from here and beyond" (by course)
Instead, here's the actual image, without any of the JS: https://i.postimg.cc/hjzXvrNc/yvA5Eiz.jpg
Most of its topics aren't difficult to understand, but overall it's just a lot of information. If you want a comprehensive CS education, you have to accept that it takes years of studying, learning and practise.
So yes -- in CS, if you work in an area with a lot of general knowledge -- you are likely to get up to speed faster. In medicine, if you're a GP, you can probably undergo additional specialist training much quicker than someone who pursued a single specialty first.
Feel the power of the sarcasm side...
Are there blog posts around where people share their experience of undertaking and completing this endeavour?
I've been looking for a brief, unambiguous phrase that can point people to related threads without having to spell all this out every time. A reader suggested "See also" which seemed like it would hit the spot: https://news.ycombinator.com/item?id=23150065. I guess not!
I wonder if the problem is that when people are skimming, they're going to miss a pair of short words in favor of the larger URLs (and by "larger" I mean the physical space the text occupies on screen). A bunch of links to the same post will always seem damning if you don't know better.
Prior discussion may not be exactly relevant on some points.
I remember, in a job, writing my own BNF grammar and a parser for it to optimize something or other, and I got back basically "WTF" on the code review.
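For anyone curious what such an exercise might look like, here's a minimal sketch (not the parent's actual grammar, which we never see): a two-rule BNF grammar and a hand-written recursive-descent parser/evaluator for it.

```python
import re

# Grammar:
#   expr -> term (('+' | '-') term)*
#   term -> NUMBER

def tokenize(src):
    # Split the input into integer literals and the two operators.
    return re.findall(r"\d+|[+\-]", src)

def parse_term(tokens):
    # term -> NUMBER
    return int(tokens.pop(0))

def parse_expr(tokens):
    # expr -> term (('+' | '-') term)*
    value = parse_term(tokens)
    while tokens and tokens[0] in "+-":
        op = tokens.pop(0)
        rhs = parse_term(tokens)
        value = value + rhs if op == "+" else value - rhs
    return value

print(parse_expr(tokenize("12+30-2")))  # 40
```

Each nonterminal in the grammar becomes one function, which is the whole trick of recursive descent; a real grammar would add precedence levels and error reporting.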
I would say database fundamentals is a must, so is networking. But honestly the rest is iffy. When was the last time you needed to build a red-black tree?
I think programming should be a larger part of CS. And no, not in Lisp. I also think ML needs to be more of a thing as well.
Insert story about explaining to my aerospace engineer buddy why
for i in 1..n:
s += y
When was the last time I wrote a red-black tree? Never. I've never had to write a red-black tree. If I need one right now, I'd have to go look it up. (But then, I'm charging $5 for hitting the pipe with a hammer and $95 for knowing where to hit the pipe with the hammer.) On the other hand, I had to build some damn data structure or other last week, and that is a very similar problem. Don't miss the forest for the trees.
Should programming be a large part of CS? Yes, absolutely. Doing these things in practice is a big part of learning them, and that's why a person with an undergrad degree in CS is employable and a person with an undergrad degree in Archaeology isn't.
The "CS focused people" that you typically hear about usually have Ph.D.s because you usually hear about (and from) people doing fancy new things. There are many other "CS focused people" out there just keeping the lights on, or rather keeping the backbone routers from being swamped under their own routing tables.
You know that person who you always go to when your program malfunctions? Or the high level engineer at your company who also seems to know how to diagnose a site incident or other strange software problem? What about the technical founder who leads a startup to do something new and extraordinary? Or how about the person who wrote your framework/language of choice? These are people who understand CS fundamentals, irrespective of whether they have a formal CS degree.
Was quite turned off by the “two kinds of people” assertion, broken down by understanding of CS. There are still holes in my understanding, surely, and I fill them in where I can so I can feel like a true Scotsman, but I’ve never had trouble dropping down abstraction levels when I need to, and even 10 levels down I’m still not close to logic gates etc.
As far as I can tell the computer always tells you what’s wrong if you can follow the breadcrumbs.
What is more important I think is to understand why a RB tree exists: what are its unique characteristics vs a binary tree, B-tree, AVL-tree, etc.
Same thing with hash tables. If you have never coded a hash table implementation, it may be puzzling to discover there are many, many hash tables, each with pluses and minuses. The real skill is being able to understand the trade-offs and choose a hash table implementation that will work for your specific problem. And when it falls down for some specific data, load factor, etc., understand why.
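As a sketch of what "coding a hash table" exposes, here is a bare-bones separate-chaining table with a load-factor-triggered resize. Real implementations make very different choices (open addressing, Robin Hood probing, better hash mixing), which is exactly the trade-off space being described.

```python
class ChainedHashTable:
    """Toy hash table using separate chaining and doubling on high load."""

    def __init__(self, buckets=8, max_load=0.75):
        self.buckets = [[] for _ in range(buckets)]
        self.size = 0
        self.max_load = max_load  # the knob that trades memory for speed

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:            # overwrite an existing key in place
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.size += 1
        if self.size / len(self.buckets) > self.max_load:
            self._resize()

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

    def _resize(self):
        # Double the bucket count and rehash every entry.
        old = [pair for bucket in self.buckets for pair in bucket]
        self.buckets = [[] for _ in range(2 * len(self.buckets))]
        self.size = 0
        for k, v in old:
            self.put(k, v)

t = ChainedHashTable()
for i in range(100):
    t.put(f"key{i}", i)
print(t.get("key42"))  # 42
```

Even this toy forces the questions the comment is pointing at: what load factor, what growth policy, what happens with adversarial keys that all land in one chain.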
For context, I recently received a lot of pressure to get a degree, despite years of experience and good knowledge of CS fundamentals.
The philosophy of WGU is that you pass self-paced courses by passing tests, so people that already know the material can complete the courses quickly.
I feel like I'm getting much more out of my degree than I would have if I went the traditional route because I understand the context for the material, and because I've seen much of it before through self teaching, so I can focus more on consolidating my knowledge of the fundamentals rather than struggling to absorb all of these hard and often poorly justified concepts for an exam.
Another option is NYU's Cyberfellow's MS for cybersecurity. It's around $16k
- Current employer: "Your salary is already good for someone without a bachelor's degree"
- Current employer: "We're sorry but this promotion requires a bachelor's degree"
- Future employer: "We're sorry but we hired someone with a bachelor's degree for this job"
- Date: "You don't have a bachelor's degree?"
- Friends: "You're the only one without a bachelor's degree"
- Meetup: "What did you study in college?"
- Country: "We can't grant you a work visa without a bachelor's degree"
- Family: "You will have more opportunities with a bachelor's degree. We'll even pay for it." (offer expired)
- School: "Here's a scholarship" (offer expired)
- 80,000 hours: "You basically don't exist"
- Brain: "Maybe I'm not a superstar coder and job security is important"
Perhaps some of these are hypothetical, but they feel just as real as if they had actually happened.
I’m not sure my classes in COBOL, FORTRAN, and data structures in Pascal made any difference. Well, I did take a course in C and 16-bit x86 assembly.
In 2018, Apple had 20 million registered developers, and that number has only increased since then.
It's still very easy to become a developer right now (my partner found it to be the easiest path forward, for example, starting from no experience at all), but the trend is quite scary. I frankly have no idea how it will look when I turn 60.
In the immortal words of everyone on r/cscareerquestions: “learn LeetCode and work for a FAANG”. No, I didn’t do that to “get into a FAANG”, but it is a much easier route than the one I took.
If you want to go the enterprise route, if you have experience and do enough resume driven development and can answer basic techno trivia on a company’s chosen stack, it’s even easier.
I suspect the pressure never really goes away.
Have you considered majoring in something other than CS? If you're adequately self-taught and need the degree, then might as well learn something new and useful.
On the flip side, unfortunately CS is one of the few useful degrees where enough schools have created worker-friendly (i.e. online) programs, so many other degrees probably mean dropping out of the workforce.
I've considered biology and law because they seem to pair well with software. Perhaps smaller niches would be even better.
As someone with an Australian CS degree, I feel like my curriculum covered less than half of this properly.
This was UT Austin, which may or may not, now or ever, be an "elite US college", and it was 30 years ago. (Holy crap, where's my walker?)
Some elective courses are missing, as is AI, though it is mentioned. We also used many of the books mentioned here. Back when I did it, there were more separate math courses that we shared with the maths undergrads; I know they stopped doing that after I graduated.
I did my undergrad (EECS) at Imperial and MIT (for my final year), and the difference between the two was pretty enormous. My home department expected me to take five graduate classes (plus some undergraduate classes for "light relief") over the course of my year at MIT, something the other undergraduates there thought was very unusual.
In general US undergrad is much broader than in Europe, and doesn't go into as much depth, even though a US bachelors is a year longer than a European one. Not necessarily a bad thing, it just depends on what you consider the point of an undergraduate education to be; I'd say that most European universities try to structure degree programs which can funnel you straight into research or industry without having a sudden step up (the step up from undergrad to graduate studies in the US is pretty huge) whereas the US thinks it's more important to develop yourself in a broader range of areas rather than see the degree solely as a means to an end.
They both have strengths and weaknesses (having experienced both) and I don't think either can be said to be better.
The latter was basically 'how to make a resumé' and 'how to not be an arsehole in email'.
Anyone in the EECS department who was serious about a future in programming took, at a minimum, two courses on algorithms / data structures (CS61B / CS170), one of which was required for the degree. I suspect that's partially covered by what any crack-the-coding-interview study course would try to teach you anyways (dynamic programming, tree / graph traversal; NP completeness and reductions probably less so).
In general, each CS course at Cal had ~3 hours a week of lecture, 1-3 of discussion / labs, and ~2-10+ hours of homework, varying on the individual and course (not to mention whatever auxiliary studying one might do for exams). With 15 weeks of instruction, that adds up to a minimum of 90 hours, but realistically, 200 might not even begin to describe some courses. A full courseload might look like 4 classes (potentially 3 CS classes + 1 humanities course) and is designed to be ~40 hours a week.
Agreed with the sibling comment that this list seems to have a number of notable omissions -- in particular, it looks to be very systems-oriented. But I also don't think that you _need_ all these courses to be a fully-fledged programmer. Sure, knowing how compilers / languages work might make it easier to pick up more languages, but learning languages across multiple programming paradigms (e.g. functional vs imperative vs declarative), which you'll get anyways if you follow this course list, is more valuable than implementing a recursive descent parser (which will make you really understand one language).
When I went there, Berkeley didn't have an undergrad course dedicated to distributed systems. But, to be fair, the MIT course listed is graduate-level. That seems to be the biggest outlier, as half the catalog consists of intro-level courses at Cal: Programming, Computer Arch, Math for CS, Algos / Data Structures (at least, the first half of the lectures mirrors CS61B; the latter half is closer to CS170).
If I had to suggest a sequencing based on the course parallels, I'd go with:
Core: Programming -> Computer Arch, (Math for CS -> Algorithms / Data Structures)
Intermediate: OS / Networking / DBs / Compilers
Advanced: Distributed Systems (probably want OS / Networking as prereqs, at a minimum)
But, I'd expect a small portion of undergrads took all of the intermediate topics, much less distributed systems.
Regularly-offered upper div courses that evidently didn't make the cut, off the top of my head: Security (this is a big one, IMO), AI, Graphics, Software Engineering, HCI, Computability / Complexity (automata theory among other things), Computer Architecture (beyond the intro-level treatment). Supposedly ML also started being regularly offered for undergrads (CS189).
To add on to the parent comment, a 4-unit class is usually considered to have a 12-hour workload per week (180 hours a semester). Two caveats to this: 1. There was controversy in the past from a CS professor arguing that students should actually be putting in 20 hours per week. 2. With some project- or pset-heavy classes, I don’t doubt that 20 hours was common. For example, with 162 (operating systems), every week we had a tight schedule of lecture, discussion, homework assignments, group projects, and possibly a midterm to study for.
No, you don't.
I know all these topics and I don't recommend anyone working in a corporate environment to learn any of this. It'll likely result in you resenting your job, your peers and the entire software industry.
I gave my heart to know wisdom, and to know madness and folly: I perceived that this also is vexation of spirit. For in much wisdom is much grief: and he that increaseth knowledge increaseth sorrow.
Ecclesiastes 1:17-18, King James Version
Recommending ignorance while quoting religious text is so over the top that it makes me wonder if you're just trolling.
> By this logic we should cut down on advanced education so people stay content in dull jobs.
This would indeed be the case, if corporate jobs were an inescapable necessity.
Jokes on you I resent my job and the industry already.
The biggest fish in the pond doesn't need to know there's an ocean next door.
My only problem is that it's not very pedagogical; I don't think "read and study the foundations for 1000 hours, you twerp" is really the best approach to ease people in.
I would recommend starting small, making a webpage or some simple fun programs that you enjoy working on. MDN is a good resource (https://developer.mozilla.org/en-US/docs/Learn) for this, but there are probably better guides / tutors out there that can personalize an approach based on where you're at. Glitch is also cool (https://glitch.com/)
Introduction to Automata Theory, Languages, and Computation, 2nd Ed.
Which is basically all the goodness in Structure and Interpretation... and any book on compilers and interpreters. Basically, though I don't reckon that any modern courses teach from Hopcroft & Ullman, it's a major textbook in the field (unfortunately the 2nd ed is easier to find but the 1st has the works).
Another foundational text is Andrew Tanenbaum's book on operating systems:
Operating Systems Design and Implementation
To be honest, I don't know how it compares with the book proposed in the article, since I haven't read that one.
Finally, two personal recommendations for anyone interested in AI (as a study of advanced CS concepts and not just as a way to make a quick buck with a shallow understanding of a few machine learning tutorials):
Artificial Intelligence: A Modern Approach (Russell & Norvig)
And the free pdf of AI Algorithms, Data Structures, and Idioms in Prolog, Lisp, and Java:
Which doubles as a good textbook for programming languages in general.
My lab had a project that nobody tended to, an evangelist at the lab told me all the good things about linux and git so I bought a burner laptop for $250, wiped the disk, installed ubuntu and wrote 3000 lines of the crappiest code you can ever imagine.
But it worked.
So now when someone asks me for advice on starting fresh in software engineering, I always tell them: just wipe your computer and install $DISTRO. Start from the terminal, and everything else will come naturally (a deadline pushing you makes it come faster).
The problem with "type 2" stuff is that building a system is easier than debugging the same system, and "it works on my machine" isn't always an appropriate answer.
(But resigning and getting a (hopefully better paying) job before the excrement hits the rotary impeller is always an appropriate answer, I guess.)
What if you are a designer or PM type with several years experience in software industry? Is it productive to skip real experience or actual classes to get into CS/engineering?
I started with Harvard's CS50 and web programming in Node independently, and I ended up with a masters in software engineering and can now push production-ready code (mostly in React/Node by choice, although studying university-level classes has exposed me to different paradigms and languages: C, Java, OCaml...)
I whole-heartedly recommend concurrently studying first principles and a couple of applied technologies. They serve different purposes but are ultimately connected in the big picture. University courses very rarely teach how to make real-life projects. Conversely, learning real-life technologies rarely shows you the theoretical principles that fuel them, at least not explicitly. React/Redux for example have really interesting ideas on managing state drawn from functional programming. To churn out React code, you certainly don't need an FP course or formally studying the pains of managing state in programs, but I found it to be immensely helpful. Same with many other areas: studying systems and networking will help you with many back-end technologies, compilers can help you understand many different programming languages, etc.
I don't think there's a program that teaches both, because it's probably impractical to design it. Simply pick one theoretical route like Teach Yourself CS and a couple of applied technologies, and work on them simultaneously.
Is there any math or science in those books (aka, could I write a paper from it)?
Anyone who has read them care to chip in?
There isn't enough math, but there's nothing in the undergrad CS curriculum most places to "write a paper from".
Deep learning and AI
Where do I send it?
I’d love to go through all of them, but I’m constrained with work as well, so I’ll have to prioritize.
You can always just hack the interview process if your primary goal is a job at a particular Big Co. In that case, learning all of the material is probably an extraordinarily inefficient path toward your goal.
But "any dumbass can hack can hack mega-corp hiring processes" is sort of obvious. If your goal is to do the real thing, as opposed to knowing how to hack the American Corporate Tech Giant mechanism for determining whether you know the real thing, then the answer is "teaching yourself the content knowledge in this list sort of misses the forest for the tress".
I think a lot of low-cost/open source/self-directed programs sort of miss one of the points of education. The reason for learning all that stuff in CS courses is more than just the content knowledge. It's also the intellectual history of the field, the problem solving approaches, how people in this field communicate, etc.
If you think of fields like Econ or Law or Medicine, it's easier to see. There's some raw content knowledge (the rules of evidence, human anatomy), but also there's the huge amount of stuff around that content knowledge. Like, yes you need to know the rules of evidence and the names of every bone. But... there's also a bunch of other stuff you're supposed to be picking up as a result of going through the process of learning that material. The self-study approach somehow misses a lot of the "stuff you're supposed to learn as an effect of learning the material", especially in CS.
Hacking the hiring process has a good effort vs reward ratio, but it's also high risk. The risk is that obvious gaps in the candidate's background surface after starting the position. Which usually doesn't turn out well: best case career stagnates and worst case the candidate ends up on the bottom of a stack rank when things get lean.
It's a risk that can be taken in a calculated way. Sometimes things turn out well. Sometimes not.
I understand your concern about not gaining implicit knowledge while surrounded by peers, but I’ve gained “the ability to think” through my other course of study, and your concerns don’t seem to be attached to a solution. Rather, it seems like a desperate attempt to gatekeep over an increasingly egalitarian field. And sorry, but there isn’t much communication to be learned among CS undergrads.
The knowledge I lack on the job will be taught as I’m exposed to my shortcomings, as is the case with most jobs.
In addition, my school’s CS department was lacking, so most of it was poorly taught, and I relied on self-teaching regardless. Once I get through hiring season, I’ll worry about covering the remaining material before I begin work.
"Once I get through hiring season, I’ll worry about covering the remaining material before I begin work."
Someone who has spent the time wading through this stuff might consider that somewhat insulting. Your future co-workers, say.
ok? If I have an entry-level offer in-hand 6 months from now and I’m behind a few textbooks, I think I can manage. Sorry to break it to you, but a lot can be done if you’re healthy and avoid distractions.
I remember a similar conversation with a CTO of some cheap company that wanted to offer me 100K and arguing that he knew many risk takers who took on high paying jobs in bigger firms and failed. That was many years ago. Now my comp is approaching 1M, my job is super risky and demanding, but I don't care as in the worst case I quit with enough money to not work anymore. I know many vps and execs in smaller firms and their attitude is the same: they don't give a flying f..k so long as the pay/risk ratio is appropriate.
It is possible to learn this on the job. But with hindsight, my advice would be to just take the easy way out and learn it beforehand.
Not really. It is standardized at some FAANG companies but not others.