When he dropped out of engineering school he went to manage a restaurant. Made good money. Then started to build a house. Quit that too. He almost had occupancy on it and said, "I'm done. I'm out."
That's also what college is about. If you come to me looking for a job and didn't go to college, why? Did you drop out? Why? Do you not think learning is important? Do you not understand sticking it out until you have accomplished the goal?
Do you give up before you are through?
Not only that, but of course, I meet folks all the time who think "you don't need a degree to be a programmer." Sure. You don't need a degree to put Ikea furniture together either. I don't need programmers. I write programs to write programs. I need folks who know how to think... for themselves and learn and go out and find knowledge they need to solve problems and then solve the problems.
Until they are done solving the problem.
Not until they've given up.
I dropped out because it simply was not feasible to go any farther: I could not juggle full-time undergraduate pure math at Berkeley while working multiple low wage jobs in the SF Bay Area plus all the other responsibilities and commitments I had accumulated at 27, along with the stress of being able to afford my next meal, let alone rent.
I failed math thru high school. Barely graduated. Worked in grocery stores thru my 20s. On a whim I bought myself a trig book and taught myself up to calculus before deciding to go back.
After dropping out I taught myself to code. Night after night I learned SQL, built shit in Python and node and tinkered with Heroku and AWS and Docker while trying to fill gaps in my CS knowledge by reading SICP and the algorithm design manual.
The job hunt process was pretty brutal and lasted about a year and a half. Rejection after rejection after rejection. Take-home projects to work on and technical interviews to study for after work, when I was exhausted from super demanding physical jobs, and time and time again it wouldn't pan out.
Finally I got hired into a remote position at a great company, and left the bay for a place where life is slower, cheaper, and less crazy making. Now I'm going back to school on a part time basis at a local university.
There are so many stages between where I was a few years ago and where I am now where I could have given up, and where I think a lot of folks do. You don't really read those stories in the manic Medium learn-to-code success articles, about how one guy (with a trust fund and a credit card and a network of Ivy League grad friends) learned to code and got offers from every FAANG. I don't blame the ones who give up.
And I think if you have access to financial resources, and have friends who have them too, it's also much more straightforward, or at least less frustrating and painful, to finish school, and get a job that will pay you fairly.
But otherwise it's a tough road to walk, one that's physically and emotionally exhausting, and I suspect the level of commitment involved to walk it isn't too different from that needed to graduate with a degree. I wouldn't know tho -- that's not me (yet)
In the middle of my senior year I got a job at a startup that demanded so much time I felt it was better to pause college, and the hard work I've done since has paid off tremendously. I am going back to school, but there is nothing I regret about leaving initially, there is nothing I am learning that I didn't already study on my own time.
For every story about engineers who dropped out and got a menial job and had no ambition to become better engineers, there are stories of engineers who had that ambition from the start and for whom the degree became a nuisance. You shouldn't hold it against them until you know why they chose the path they did.
> If you come to me looking for a job and didn't go to college, why? Did you drop out? Why?
You, clearly, dropped out because you had an amazing opportunity that you jumped on, not because you decided you were "done with it" and aren't capable of seeing things through until they're done.
I think the important question is "why?" -- for basically any career decision!
There is an obvious and common answer to this: college is expensive, both directly and in opportunity cost of not working. It's an expense that few people can easily afford without significant family support or loans.
Filtering by college degrees is filtering by family wealth more than anything else (or otherwise someone's willingness to get into debt).
I assume you have the US in mind? This is less true in the many developed countries where education is free and/or programs exist to cover living expenses during studies.
Yet, people who need to cover their own living expenses have a far smaller success rate, need to drop out more often or need to pause studies to work.
Programs to cover these living expenses exist, but they are limited, and not everyone who falls outside their eligibility can easily cover all expenses on their own.
When access to education is considered an actual right, living expenses tend to be included in the conversation (as you just pointed out), as opposed to education that is merely incidentally affordable.
As an unrelated observation, the effect known as Berkson's paradox would probably come into play along the way, which is an interesting and little-discussed factor in this sort of thing: two of the factors for getting into college are wealth/connections and intelligence/ability.
Link for the lazy: https://en.wikipedia.org/wiki/Berkson%27s_paradox
As a counterexample: Dutch universities can't select their students (for most studies), and fees are ~€2k annually (regulated). There will be some bias towards wealthier backgrounds that are more able to invest that money and time, but compared to the US/UK systems I expect it is minimal.
This is not unique; I think the systems in the Nordics and Germany are similar.
Wealthy students get the best grades since their parents send them to expensive schools designed to maximise this system. They then get into the "best" universities.
What do the Dutch do if there are 10k spots available in a given university and 20k people apply?
Exceptions happen: e.g. the popularity of CSI led to a huge boom in people wanting to study forensic science. IIRC universities were allowed to limit the seats for those studies back in the day.
Similarly, dentistry and medicine have a fixed number of seats. They use a lottery that is weighted by your high school grades.
Overall I think this system has the better outcomes ‘on average’, but that the true top students are not challenged as much as they would be in a more selective system such as the US/UK
I didn't know that the Dutch had unlimited money. Weird that the portion of the population with degrees (44%) is comparable to the US (46%).
My family could afford neither, and I don’t know what I would do if it weren’t possible to work in my favorite field without a god forsaken degree.
I'm not sure it's wealth as much as filtering by situation. My family was not and still isn't wealthy. Quite the opposite in fact. I was fortunate enough to be born near a local college so I could live with my parents while going to school, and working 30 or so hours/week to pay tuition.
I definitely didn't have the 'college experience', but I graduated with almost no debt.
I got my first programming job when I was a sophomore in college through people I met there. A couple different companies asked me to put college on hold for a semester in order to work more, but I'm glad I said no. College is not the only path, especially today. But, it was the best path for me.
Sometimes it is a good idea to give up, sometimes it is not. It is not a function of the individual, but a function of the circumstances.
I've hired dropouts who had a good reason, and were otherwise qualified. I'd do it again. But there are plenty of people who drop out because they can't make it, because we idolize founders who dropped out, etc. And it's not automatically an admirable thing if you don't know why you're dropping out.
On the other hand, colleges do need to really rethink their education model. I fully believe in the value of a core or liberal arts education, as it provides useful analytical & communication skills, but there needs to be a significant pivot towards more concrete marketable skills. At the community college level, there is an excellent degree type for this: the Associate of Applied Science, or AAS. It has some, but reduced, humanities requirements and focuses much more on the career skills of the chosen area, all of which have specific, immediately accessible job opportunities. The really unfortunate part of this degree, however, is that it is very hard to build upon later at a 4-year school to finish a bachelor's if you so choose: the credits either don't transfer or transfer as electives, not as requirements towards a degree. Again, a major pivot is needed for traditional 4-year schools.
'Common sense ain't so common' is a trite aphorism, but very true.
I read your Practical Compiler Construction combined with The Elements of Computing Systems, and that was a great combo!
There are many reasons beyond just weak character for why someone would not go to college and it's not up to you as an employer to make that judgement call on someone's personality until you've dissected their real reasons for leaving.
This is something that takes a lot of time, way longer than an interview has to offer, so it's best to not make assumptions.
Did you not drop out? Why?
That's a genuine question I have of any interviewee. What was their thinking behind studying for 3–10 years at universities vs developing themselves through other avenues?
For someone with a master's degree, I'd like to witness their critical thought process. Their problem-finding skills. Their ability to research outside their own field. Nothing would be a bigger red flag of mediocrity than someone who has spent several of their best years gaining no more than mere vocational training would have provided.
I never consider having a degree a "plus." It is only a plus if one has something to show for it.
It acts as a social filter preventing a large number of people from pursuing a given profession that they otherwise could do, just because their parents couldn't afford to rent a flat for their kid in another city for several years.
I don't think that there is any relation between the type of education that someone has (online vs formal) and a personal characteristic like perseverance.
There could be many people who could not afford formal education, learned online, and are still perseverant. The two things are unrelated; it just sounds like prejudgment.
Speaking from personal experience, my team has hired college dropouts who have turned out to be some of the best learners and problem solvers we've got. I have also interviewed candidates who seemed to be under the impression that their Ivy League degree entitled them to the position despite having no experience working on "real-life" projects to speak of. Candidates should never be discounted on the basis of not having a college degree. If they've got a strong application, the interviewer should be able to determine the rest of these more qualitative things over the course of a conversation.
On the other hand, if done right, college can be a place where some very intense, sustained personal growth can take place at a level which cannot be rivaled by supplanting it with e.g., taking a series of online courses. Taking advantage of the resources around you (seeking mentorship, research experience, forming study groups with colleagues, doing co-ops, etc.) can help put you far ahead of a person who pursued a self-study route in the same period of time. I agree that it is unfortunate that college is financially unviable for a lot of people who have a lot of potential. However, I don't necessarily agree with the people who allege that college is nothing more than paying for a piece of paper so that you can get a job.
But lots of people do need programmers.
These "is college necessary" posts suffer from the same problem as "why are programmers making 200K right out of school" posts. The job title covers everything from hacking out WordPress plugins, to leading a team building an MVP, to building large distributed systems, to designing the software that goes into medical devices and autonomous vehicles.
I'd say that covers the majority of programming jobs. The problem, though, is that there are a vast number of people who can meet that bar, and salaries are bound to collapse to reflect that at some point. What happens to them then?
I don't know, I've been hearing people make this claim for a decade now. Hasn't happened yet.
Quite the opposite, people keep saying this, yet salaries continue to skyrocket.
Maybe basic programming skills are just really valuable, and when people learn them, they become able to do much more productive work?
Programming salaries are now very bimodal.
I call BS. I'd be impressed if you could even write a metaprogram to answer your own interview questions.
He didn't get the job though.
I dropped out of college. There were a number of reasons, but really it just boiled down to the fact that I was not a mature adult at 20 years old. I was enjoying my life and my freedom living alone (arguably too much!). I don't regret any of it. I started learning to code about 3 years later and worked my way into doing it professionally.
At almost 40 now it feels absurd for me to be judged based on my 20-year-old self. I know I'm biased here but, if anything, I feel like the college you went to and the number of years attended say just as much about your level of privilege and the rigidity of your upbringing as they do about your character.
That material can be an asset in your professional career, and that alone justifies studying it.
It's not necessary to study CS to become a programmer or even a good software developer but it can certainly help.
I'm not convinced you learn any faster at uni than you do on the job. Uni seems like a good way to get started if you have nothing but if you have the chance to take a job that seems to always be the better option.
It was about stuff like cryptography on a theoretical level and learning some assembler to understand the basics on a lower hardware level.
That's just two examples of things I'd maybe never have learned just by working at the company.
I didn't say it's faster but then I'm not a big fan of things that move too fast - as neuroscientists know learning works with restructuring the brain and that needs some time.
And I'm happy I didn't start this journey with university because then I might have quit. I learned to program before in a funny way just like you learn how to play with any toy. Think that might have helped to start studying and for me it was a good combination I think.
In the end it's your choice and you'll find your way. I can just say what my path looked like and what was good about studying for me; it doesn't have to be the same for you, of course.
I take graduate degrees seriously, because people tend to take those seriously and focus on the education more than the paper at the end. I don't take 4 year degrees seriously at all. If all you've proven is that you can survive with minimal effort for four years before losing patience then you certainly have more potential than the lowest side of a bell curve, but you haven't proven you bring added value.
There you go. You haven't proven you bring added value.
Doing research is by all accounts difficult and draining to a ridiculous degree but if you did well in undergraduate coursework graduate coursework is more of the same.
I've found university to be somewhat of a bad predictor for whether people think for themselves. It is however from my experience a good predictor for following instructions and follow-through (which, as you mentioned, is not unimportant).
Somebody with no degree but experience might also have follow-through, though, and often more resolve to solve problems themselves.
I had a university education myself, and while I think it's not strictly necessary for programming, it does give you a good overview of related fields (mathematics, algorithms and maybe others) that often come in handy as part of solving a problem more elegantly or efficiently. And self-taught programmers often don't touch on these topics and have a big blind spot there.
I wouldn't hire, work with, or work for somebody who doesn't know how to manage the risk of Type I/II errors in their heuristics.
That signals to me "lazy thinker".
> In statistical hypothesis testing a type I error is the rejection of a true null hypothesis (also known as a "false positive" finding or conclusion), while a type II error is the non-rejection of a false null hypothesis (also known as a "false negative" finding or conclusion). Much of statistical theory revolves around the minimization of one or both of these errors, though the complete elimination of either is treated as a statistical impossibility.
I usually start with the question: 'What data would convince me that I am wrong?'
If I can't answer that for myself - I know I have a problem.
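To tie that definition back to the hiring-filter discussion, here's a quick sketch of the two error types for a "has a degree" screen. All the numbers (the 30%, 70%, and 40% rates) are made up purely for illustration; the point is only how the two error types are counted.

```python
# Toy model of a hiring filter. H0: "this candidate is not qualified";
# the "has a degree" screen rejects H0 (i.e., treats them as qualified).
population = 1000
qualified_rate = 0.30            # hypothetical share of qualified candidates
degree_given_qualified = 0.70    # hypothetical: most qualified people have degrees
degree_given_unqualified = 0.40  # hypothetical: many unqualified people do too

qualified = population * qualified_rate
unqualified = population - qualified

# Type I error: rejecting a true null -- passing an unqualified candidate
# through the screen (a false positive).
false_positives = unqualified * degree_given_unqualified

# Type II error: failing to reject a false null -- screening out a
# qualified candidate who lacks a degree (a false negative).
false_negatives = qualified * (1 - degree_given_qualified)

print(f"False positives: {false_positives:.0f} of {unqualified:.0f} unqualified")
print(f"False negatives: {false_negatives:.0f} of {qualified:.0f} qualified")
```

Under these invented rates the screen admits 280 unqualified candidates while discarding 90 qualified ones, which is the trade-off the parent comment is pointing at: a heuristic that only tracks one error type can look great while quietly bleeding the other.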
Relax, you have done well. You have a good "why". Now consider someone who doesn't have a good "why" and also doesn't have a college degree to at least prove they can complete something. It's not a perfect indicator, but it's an indicator.
I make top-tier money outside of SV, I get to go to all the cool conferences, and I get company-paid international travel. For most of my vacations, all the flights are free thanks to the miles.
I might finish school one day, but it's mostly for the piece of paper and 0% for the knowledge. If WGU had a Linux track I would consider that as the path of least resistance.
Went to college, did my best until proper graduation. Dug my way into FP and semi-advanced topics at the time. No job, because I'm not mainstream (and barely interested in REST and similar).
On the other hand, some gigs can accept barely educated juniors and train them for one thing; they get a cute career and raises along the years, and by the time they get to my age they're comfy and set (potentially; not every life is blue sky and easy).
There's no clear better path. Even if you consider the academic education itself: yes, we had good classes (DB normal forms, optimizing compilers, computer graphics, etc.), but we also spent half the time on the obviously horrendous 2000s OOP that led the world at the time and will soon be forgotten. So it's not even great on that side either.
So, is going to college and earning a degree the only way to do that? Most college education is not designed with the idea of inculcating learning as a skill; in fact, college may be a bit late for developing that skill, and the degree comes as a consequence/side-effect.
Should we reject someone who can still give you the signals (thinking, persistence) one could be looking for, but who hasn't attended college or has dropped out?
There could be more optimal ways to learn to think, to solve problems, and to be reasonable enough about not giving up. Going to college, incurring debt, and doing things where the dead end is a degree is not optimal.
How about "because it is really expensive, and not everyone is provided enough to pay for it, or take out the debt, and there are other ways of learning that stuff".
When a person doesn't have a clear goal, or reevaluates their goals while already in college, dropping out is a kind of damage control.
And it is OK to drop out when you realize that it is a waste of time or effort.
And we assume that the college, the professors, etc. are good, which is not always true.
Do you have a PhD, or did you give up? Do you have a high school degree, or did you give up? Do you still work at the first job you ever had, or did you give up?
I found the material and tests straightforward but you absolutely needed to put in significant time to complete the assignments.
I'd already been working for years as a programmer before I went back as well.
Highly depends on the college and major. My CS degree had a ton of out-of-class homework assignments and projects, not to mention the need to study to pass the tests, preventing you from coasting by. Add to that the fact that at least 10 of my math and CS classes were graded on a bell curve, and a C is the minimum passing grade, so if you try to coast you will need to retake a bunch of classes or switch to an easier major, since everyone else is working really hard to stay out of the bottom (I had to retake 3 classes myself).
This statement seems deliberately obtuse.
The barrier to entry for most people to tech jobs isn't whether or not they can get interviews, it's whether or not they can pass the technical assessment that's become a standard part of the process. In other industries, GPA, school rank, and highest degree earned are hard pre-reqs for interviews, even for entry-level jobs. That isn't necessarily the case in tech.
I work in a field that falls under the "data science" umbrella and I take issue with a lot of online courses compared to traditional education because almost all of them overpromise and underdeliver and take advantage of naive students that don't know any better. I can't tell you how many applicants I've interviewed that list dozens of online certifications for this and that skill, but can't demonstrate any knowledge of it when asked or assessed.
Just my $0.02, but online tech courses and degrees are akin to the MBAs of a decade ago: exploitive, expensive, and often entirely unnecessary. I would still hold that a technical computer science degree from a 4-year university is worth it, however, for the benefit of being in a collaborative learning environment with peers (Note: That doesn't mean you have to go to Berkeley - I went to a top 50 state school and got the same education and job opportunities as all of my friends that went to top 10 schools, but I graduated with a positive net worth.)
(Disclaimer: I know there are always exceptions to everything. I have generally heard very good things about GT's courses - Udacity's I'm more skeptical of.)
I would use them for an introduction, but would definitely pick up a well recommended textbook after, or do a medium sized personal project after. Although the Coursera Cryptography class I took was just as good as my University one so I guess it really depends.
Lots of folks who have taken my Flask course said they learned more about web app development in 10 hours of self paced videos than they did in 4 years of university. Lots of them felt like they finished the course really knowing how to build something, and many of them have gotten hired for work shortly after.
But the course doesn't touch algorithms or any theory around computer science. It's just 10 hours of exposure to building a real SaaS app in stages.
I personally believe experience trumps almost everything and courses can be very good for people who consider themselves self guided learners, because you always have the power to research the theory while treating it as something that's on a need to know basis. Taking a course on a specific subject lets you focus your time on the exact thing you're trying to accomplish and some course instructors also provide free support (I do), so you always have an out or 2nd set of eyes to help get an answer for things you can't figure out alone.
I never went to college but I do sometimes regret missing out on the social / networking experience, but I have no regrets about taking a self guided approach to web development for the last 20 years and I'm happy with how things turned out.
Motivated real world exposure > real world exposure > motivated self education > traditional education
As someone who mindlessly "self taught" through tutorials, then did half a bootcamp, then did a MSCS-- none of those things were anywhere near as transformational for me as a dev than my first year as a software engineer.
But of course, building AlgoDaily has probably taught me even more than the many years of work at this point, purely because there's been a strong impetus.
The "need to know" basis is huge. I think with a strong enough "need to know", any method works.
Keep up the good work!
I will say that there is an interesting social component on the job, especially early on in a (my) career. More than once I had a colleague who would zero in on a gap in theory (almost always algorithms) and talk down to me. In one case I had someone jab their finger at my face and yell (yes... raised, angry voice in a large cube farm environment) "I got my masters from MIT and you dropped out of high school". The fact that this person was, in the end, completely wrong is immaterial to my point. About a year post-conflict we were (and still are) great and supportive friends.
The issue boils down to a need to "prove yourself" to certain people when you lack an undergrad degree. If you bounce jobs, regardless of reason, the process starts all over again. Someone stepping into most of the environments I've worked in with a fresh BS in CS/EE/whatever has not been placed in the same position of having to justify their existence.
For me, it resulted in a multi-year personal issue of harshly judging engineers with degrees from well-known schools like Stanford, MIT, and so on. I would think to myself "why can't you do this? I dropped out of high school, learned on my own while working crappy paying jobs as a teenager before getting a shot... mommy didn't send me to a fancy school... I worked my way up from the assembly line to the engineering team" and other such toxic inner monologues. Harsh judgements based on the fact that everyone has a different melange of life experience. Super unhelpful.
As time went on I realized that a papered engineer, especially one who was only a few years out of school couldn't possibly have covered all the things in their coursework...there is enough volume of knowledge it would take decades to learn it all in school...assuming that was all you had to study!
By now I have been around the block so many times that my lack of a degree is less of a barrier, outside of passing the resume-gatekeepers at larger tech orgs. It resulted in my career focus being in the startup-SMB sized orgs. Mostly people seem surprised I didn't go to college, maybe a little amused by the fact they took on a debt load to be sitting next to me.
I hear a lot of "damn, i could have saved so much money". My reply tends to be along the lines of "...yeah...but it took almost 10y for HR to stop trying to low-ball my pay based on my lack of a degree...so i think it may be closer to a draw than either of us realize".
1. They act as a coach: The professor and your peers expect you to attend lectures. The assignment is due by 5pm on Wednesday. If your performance isn't satisfactory you will be dropped from the course.
2. They get better feedback about their teaching. If half the students can't do the assignment on a particular topic, the professor can schedule catch-up lessons. Watching a group of students struggle with a question can give valuable insights about how to teach that topic effectively.
3. They act as a high-quality filter: Only high-quality applicants will be admitted to the university course, while anyone can pay $10 and start doing a Coursera course. The university also offers the opportunity to become part of a valuable alumni network.
(Some online bootcamps like Lambda also have these advantages because they insist on strict online attendance and are willing to drop students who don't put in the effort)
1. Online coaching can be provided through video conferencing or similar. An online course perhaps could have the added benefit of allowing students to experiment with a subject. If they need to be forced into staying on track then they're probably not interested in the subject to begin with. They could also come back to the course at a later time when they're more motivated.
2. Adaptive algorithms can solve this problem and do it in real time.
3. A well crafted examination/certification can do the same for less money. See the accounting industry's certified public accounting exams or the legal industry's BAR exams.
Ultimately though, I see a hybrid system being created. If you want to take the traditional class, you can, for a fee. If you can self-study then you do the online course and only pay a much smaller fee plus any supplemental services you buy. Overall everyone wins.
Only four states allow you to take the BAR exam without going to law school (edit: and good luck getting a good job without the law degree). The CPA exam requires either a Bachelors degree or 120 college credits to be taken.
So, in other words, your examples are in fact showing the exact opposite of what you think they do.
What we need is a way for people to verify their knowledge. Take the exam, if you pass you get the certificate or similar. I am not talking about some sort of full blown professional certificate for every single profession. CPA/BAR exam was just an example obviously.
Outside the US and possibly Canada you can become a qualified accountant by examination and work experience without ever getting a degree in most of the English speaking world.
If you’re saying credentialism exists I doubt many would disagree, but just because a bad system is in place doesn’t mean a good one is impossible.
I started college in 2001 as a computer science student and didn’t “finish” for over 10 years because of job (sysadmin for university) and consulting opportunities (travel). I’m glad I finished because it’s behind me and I don’t need to think about it any more. I still have the common nightmare of not knowing where my final exam room is located.
Anyways, I am a big fan of online courses. In the early 2000s I had learned and built many PHP sites and started with Rails. There were no classes/courses on PHP or Rails!
Fast forward to early 2010s and I find myself watching Stanford’s iOS development courses. I leveraged the knowledge to become a successful mobile app developer consultant.
A few years ago I purchased a handful of online courses on React/Redux. With that knowledge I’ve built a successful Electron JS app available on app stores.
These successes are not because I’m smart. It’s because I have a high tolerance for pain and boredom. When I see a challenge I keep digging at it until it’s solved.
Protip: watch lectures at 1.5X speed (2X if reviewing). Anything slower and my attention becomes highly distractible.
Software engineering is one of them. The amount of tutorials and videos available on the internet far surpasses any curriculum at school.
But on the other hand, anything that requires hands on training that you can’t get at the comfort of your house, like medical or scientific careers, those you probably need to go to a school for.
Furthermore, in terms of software engineering, I don't think the school system can ever keep up with the fast pace of the tech world. It's just a rigid system, too slow for anything fast-changing like web/app development.
What I found in my time in both environments is that Nanodegrees appeal more to students who don't have access to traditional education (college is too expensive, or grad school requires an undergrad degree, etc.). That makes most MOOC students less experienced, less qualified, and higher risk (in the sense that they mostly don't have the profile of successful college students). Udacity et al. then appear to have a very important role to play in satisfying the need for education unencumbered by academic gatekeeping.
But the _other_ constant undertone in the MOOC community is the “get-rich-quick” crowd who expects a Nanodegree to make them a 6-figure AI engineer in three months at 5 hours per week. The dirty secret is that we already have a fast-paced learning environment that can give you a good crash course on the required core skills to make you a useful apprentice: it’s called “college”. It’s arguable that the typical BS could be abbreviated a bit or focus a bit more on “job-ready” skills. But I think the time required for most people to get there is much closer to a 48-month BS than a 4-month Nanodegree.
The other dirty secret is that no one wants to hire you as a junior developer at SV rates if you don’t have experience and need a visa or want to work remotely in your low CoL hometown. Unless you already have strong qualifications, you’re fooling yourself if you think an ND or Udemy course is gonna help you break into Google as a fully remote worker.
A US Bachelor’s is not really 48 months: at most nine months a year are spent officially studying; the rest is holiday. That brings it to 36 months. If we pretend the average student treats it as seriously as a full-time job (ignoring all research on how students actually spend their time), we can still cut those 36 months in half, because half of the average US Bachelor’s is general education with no professional impact. That’d be 18 months.
If we want real-world examples, we can look at the UK, where most Bachelor’s degrees are three years with the same extensive breaks and holidays you have in the US, but where two-year, full-time, non-stop degrees also exist; or at Lambda School, which takes nine months to turn people into software engineers and demands more, and more consistent, work than well over 90% of university courses.
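The back-of-envelope calculation above can be written out explicitly (the figures are the commenter's assumptions, not measured data):

```python
# Assumed figures from the argument: a 4-year US BS, ~9 months/year
# actually in session, and roughly half the credits being general
# education with no professional impact.
years = 4
months_in_session_per_year = 9          # the rest is holiday

study_months = years * months_in_session_per_year   # 36 months in session
major_focused_months = study_months // 2            # 18 months on the major

print(study_months, major_focused_months)           # 36 18
```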
I went to college year-round, as it was the only way to balance my work schedule. I carried a minimal full-time load during the typical semesters and took classes all summer.
> because half of the average US Bachelor is general education with no professional impact.
I disagree. The general education is probably what everyone should go to college to learn. Reading and writing (communication) are the basis for almost every job a person may have, and they’re skills that last forever. When I was in undergrad I took random business courses for my electives, and I still use, and have built upon, concepts I learned in economics, finance, and accounting.
People that talented and motivated about CS should take computer engineering instead; far fewer general education courses are usually required.
Aside from that, there are two obvious ways to turn the general education requirements to your professional advantage: 1) writing courses, since the average developer can’t write or put together a logical argument worth a damn, and a lot of writing is needed as they get more senior; and 2) foreign language courses, which open up job opportunities if you take the time to achieve basic proficiency.
For the vast majority of students, online courses are not a good vehicle for learning. Not because online courses are, in themselves, ineffective, but because success in them requires a much higher degree of internal motivation. Without the structure afforded by the traditional classroom experience, a very large number of students do not engage with the coursework, especially beyond the first week or two. We see a rapid drop-off in activity and assignment completion.
As for my programming skills, I can safely say they are top 1%: I can afford to ignore inquiries from, say, FB recruiters. Yet I learnt everything myself. Without math, and without the structured way of thinking that’s required to prove theorems, I wouldn’t have been able to get to the CS fundamentals, and my CS knowledge would be very shallow. There were a few CS teachers in college, but even then it was obvious to me that they didn’t know much, and they had to cater to the least able students in the group anyway. I don’t see a way to bring highly skilled and competent CS teachers to college: those who really know programming, and have the interest and ability to deal with people, often make 500k+ a year with very relaxed work hours, so there is simply no incentive for them to bother teaching CS to (mostly uninterested) kids in college. And those who do teach CS in college as their full-time job don’t know much about CS, simply because gathering that knowledge is a separate full-time job.
Edit: so, online courses or college? Neither. You only need a book that thoroughly explains the fundamentals and the will to go through it. Not enough will? Then you need a teacher whose only job is to assess your knowledge twice a year in the form of an exam. Both online courses and college are too slow: I could honestly finish a master’s degree in one year if I could avoid wasting time on all the fluff.
Oh man, I wish this was an accurate measurement of programming skills :D
>The best way to learn is to do your own experiments. Once understood, that understanding lasts a lifetime. Facts can change, but the governing rules, if deciphered, won’t.
I recommend that all software developers join the ACM (Association for Computing Machinery). Membership gives you access to computer science papers, which are the foundations and the governing rules, as well as to the Safari learning platform, with the latest books and video courses.
It costs a few hundred bucks a year and I've learned more in reading random CS papers and having access to great books and video courses than paying for many courses.
If you really want to understand how machines work, how strings work,
How distributed computing works
How databases work
How to program efficient string search
How regular expressions work
How to represent problems well, for example as graphs
How statistics works
How to analyze problems
How to read papers
How to learn anything on your own
And the list goes on and on
It's all in real university courses; no online course will give that to you.
You have to focus on the theory of math and CS for a couple of years and stretch and train your mind.
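To make one item on that list concrete, "efficient string search" in an algorithms course usually means something like Knuth-Morris-Pratt. A minimal sketch (illustrative, not from any particular course):

```python
# Knuth-Morris-Pratt string search: O(n + m) instead of the naive
# O(n * m), because the failure table lets us skip re-examining
# text characters after a mismatch.

def kmp_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    # fail[i] = length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, falling back via the table on mismatch.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - len(pattern) + 1
    return -1

print(kmp_search("ababcabcabababd", "ababd"))  # 10
```

Why it's taught: the failure-table idea generalizes to regex engines and multi-pattern matchers, which is exactly the "stretch and train your mind" point above.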
I don’t agree with you. I don’t think you need a four-year degree for these concepts. Many of the concepts you outline are very important, especially as you work on larger and more complex systems. However, I’m not even convinced college does a good job of teaching them. College introduced me to some of them but didn’t actually help me build any intuition at all. I regurgitated the things my professors said, and that’s how I passed through my courses with straight A’s. I didn’t really understand distributed systems until I worked on them, and I didn’t understand them well until I read about them, not from a textbook but from other people who had worked on them, mostly in online papers, YouTube videos, or conference talks where companies presented their lessons learned.
I think the core classes you need for a computer science curriculum, such as data structures, algorithms, operating systems, networks, languages and compilers, databases, and distributed systems, can all be self-taught. I’m not convinced a four-year degree that’s full of fluff with various elective courses adds any value.
I learned all of what you enumerated online, usually practically applied too. This is not something you have to go to a university to learn.
Granted, I wasn't following a MOOC or a bootcamp - I just kept reading, and digging in deeper, and asking questions and practicing on my own.
It's rare for someone to have so much self motivation. School environment forces you to learn those things without the burden of having to be so self motivated.
Then I started tutoring and teaching. That's when I realized that I'm an extreme outlier. Most people are not particularly motivated and won't put in the hours upon hours of struggle.
The average experience of learning CS in university is just completely alien if you teach yourself how to program as a child. The biggest differences is the emotional labor. There are lots of things that my students would describe as "frustrating" that I have literally never thought of as frustrating (e.g., reading compiler errors). I think it's similar to learning a new language as an adult vs. as a small child.
Frontend engineer on my team: https://boards.greenhouse.io/udacity/jobs/4320541002
All open engineering positions:
We for sure had engineers who were self-taught / learned using online resources (myself included). In fact, I hold zero certifications related to programming.
Over the years we recruited and employed a handful of engineers who pretty much learned everything they knew about programming through Udacity courses. Frankly, I always felt it was a very supportive environment for non-traditional engineers :)
This is HN, I would expect a hundred anecdotes from those who have succeeded against the traditional educational system in CS.
But are there studies or other evidence that this is more than survivorship bias? Is any study following dropouts, non-BS grads, bootcampers, and online students in the industry over time, counting from the total population who registered for CS 101?
I disagree 100%. That is the primary value I got from college - learning how to not only learn, but break everything down, re-conceptualize it into new things, and build it back up into something productive. We absolutely applied what we learned to our own projects, and I've found those lessons continue to work well a couple decades into my career.
However, for many (myself included), it provided a framework that I wouldn't have been able to get otherwise. After I got that framework, then the idea made more sense to me.
It has some, but reduced humanities requirements, and it focuses much more on the career skills of the chosen area, all of which have specific, immediately accessible job opportunities. The really unfortunate part of this degree, however, is that it's very hard to build upon later at a four-year school if you choose to finish a bachelor's: the credits either don't transfer or transfer as electives, not as requirements towards a degree. Again, a major pivot is needed for traditional four-year schools.
The fully online approach seems good only for those who are natural autodidacts or as a fallback for those who simply can't get through some of the non-CS required college courses.
My self-education took about as long as a college degree anyway, but I was able to work the whole time, which I think gave me a bit of an edge over most college graduates. Self-education can feature as much or as little theory as you decide you want to learn.
This is definitely not true across the board. I have multiple peer-reviewed published papers (in pretty solid journals) in a field I never formally studied, and am a college dropout to boot.
There was a thread last year where lots of people commented about being rejected for a visa in Europe because they didn't have a degree, despite having multiple years of experience.
The broad understanding I came to was, it's important to make your own conclusions, and not take anyone else's conclusion for granted, whether it comes from universities, courses, or even blog posts like mine. In reality, however, it's difficult to devote that much time to personally research everything. So you should at least do it with the things that matter the most to you in life.
All that said, perhaps for the sharper students these classes demand much less time than they do for me.
I'm currently in GA and it is tough, probably my toughest class so far - 10 to 20 hours a week is about what I'm doing. I think I would need at least 20 hours/week to pull off an A, but I don't have the time unfortunately. My undergrad, which is a top tier state school, was also on the whole a lot easier. OMSCS is a tough degree - completing it is a big achievement.
The difference is in the number of subjects you can take per semester, which also adds to the difficulty. Also, I’m not entirely sure, but some courses are bound to be lab-heavy, and those probably aren’t available on OMSCS.
Understanding > knowledge
Start with LeetCode easy and see what I am missing if I am unable to solve a problem, then progress all the way to LeetCode hard.
Anything else is a pure waste of time and won't land you a job.
You may or may not be able to do this, but if you don't do LeetCode you won't get that job in the first place.
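For anyone unfamiliar with what "LeetCode easy" means in practice, the canonical starting point is the Two Sum problem, and the expected idiomatic answer is a one-pass hash map rather than checking every pair:

```python
# Two Sum (a classic LeetCode-easy problem): find indices of two
# numbers that add up to target. One pass with a dict gives O(n)
# time instead of the brute-force O(n^2) over all pairs.

def two_sum(nums, target):
    """Return [i, j] with nums[i] + nums[j] == target, or None."""
    seen = {}  # value -> index where we first saw it
    for i, n in enumerate(nums):
        complement = target - n
        if complement in seen:
            return [seen[complement], i]
        seen[n] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # [0, 1]
```

The interview point is recognizing the trade: spend O(n) extra memory on the lookup table to avoid the nested loop. That pattern recurs throughout the easy and medium tiers.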