Two out of three developers are self-taught, and other trends from a survey (qz.com)
113 points by raddad on Mar 31, 2016 | 118 comments



I started programming when I was 10 or 11 and continued on to get a CS degree. One thing I noticed was that a lot of people in my classes seemed to have just woken up one day and decided they wanted a degree in computer science. All throughout my years in school, those were the people who never ventured outside the curriculum (which was Java based) to explore and learn other things, which seems to set you up for a life of boring corporate Java desktop app development (unless you like that kind of development).

An example is my upper-level elective on database systems and design. For the final project, you were to take everything you'd learned about relational databases, design your own schema, get all the data properly normalized, and build an app that uses it. Other than that, the sky was the limit. You could use whatever programming language, framework, and database you wanted, and the app could do literally anything. Nothing limited you to a desktop Java app coupled with MySQL, yet 95%+ of the class did just that. I ventured off and used Postgres + Node.js and made a single-page web app, using skills I'd learned outside of school on my own time (I longed for a friend who knew what a promise was, or that Bluebird wasn't about a damn live animal).

Now to get to my point. I'm not going to say that self-taught developers are strictly better, but I feel like they are the other 5%, like myself, who see what we do as more than just a class, grade, job, or paycheck. They're the ones who spend the time to learn new, emerging things in software development, of their own accord. I would much rather work with someone who isn't a stock product of the Java CS-degree factory: someone who ventures outside the path their career and skill set guide them down, and who takes the time to learn things just because they want to.

I'm lucky enough to have my first job be somewhere where I get to dive right into Node.js, Angular, etc. which is what I want to do at this time. Maybe the other 95% feel this way about their first jobs which probably are making internal business Java applications, but I really can't believe that.


When I went to college the first time around, I really didn't have a desire to be there. If I dropped out, though, everyone would label me a bum since I also didn't have a desire to do anything else. So I got an easy degree in marketing and ended up working in retail. When I finally decided that I wanted to do something else, I went back to school for computer science. I'd never developed anything before. I didn't even know what programming was. I just knew that if I got the degree and made the right grades, I'd get a job paying 3-4 times what I was making in retail. That's exactly what happened and I do precisely what you describe: build boring internal Java apps for a big company. However, I never really got involved with this stuff because I loved it. I did it for the money and because it was a far easier job than what I had. Now I'm going back to school again for a graduate degree in an entirely different field (except this time debt free). I just don't dive too deeply into stuff. It's not my thing. I'd rather switch up careers a few times in my life and do a bunch of other hobbies like woodworking, music, etc.


I think there's room (and need) in the world for both types of developers: ones who stick to the curriculum and build internal Java apps for big companies, and ones who venture off the beaten path. Neither is better than the other, and each person has their own way of learning, growing, and leading a fulfilling life. If it's woodworking or music that makes you happy, great! If it's programming at work and then going home and programming even more that makes you happy... also great!


I agree 100%.


I think it's awesome you had the courage to admit that here. There's nothing wrong with just working a job from 9-5 to fund your passions. Programmers tend to take offense at it though, for some reason.


>Programmers tend to take offense at it though, for some reason

Probably the same reason passionate musicians take offense when lazy garbage tunes make it to the charts:

Most people can't gauge the quality of music, so passionate musicians feel robbed: they aren't compensated in proportion to their effort compared to the unpassionate ones.


The thing is, I feel the same way about my hobbies. Why can't I make as much per hour as a carpenter or landscaper? Why can't the market accept that this is what I would prefer, rather than working as a programmer by day and a carpenter in my spare time? Simply put, the market values what it values. There is a definite limit to how much it will care about your passion, because other people eventually care about their own money and time too much to give it to you. Unless I were willing to work long hours, I couldn't make nearly as much as I do now working just over 40 hours a week. In business for myself I wouldn't have access to the same benefits I get in my programming job. Nothing would compare.


That's the real reason we need to require programming courses in middle school. Lots of those people didn't really know CS was a thing when they were in their early teens. They hit college, but by then they are 6-8 years behind. They graduate and nobody wants to hire them because they have zero usable experience.

We need to require one year of programming to get the talented kids started early, then move all our "intro to CS" classes from the main syllabus to remedial college courses (you wouldn't call pre-algebra an "intro to physics" course). This would also help other professions. Engineering especially suffers from this problem: students take the required C or Java intro course, then have MATLAB dumped on them the next semester. They muddle through and graduate with little skill in programming, despite the fact that their entire job will revolve around it.


"zero usable experience" is a bit harsh and doesn't reflect reality. There are internship programs where students have a chance to gain work experience as well as work-study. And anyway, programming is not rocket science and even rocket scientists have "zero usable experience" in rocket science when they land their first jobs. Fresh-out-of-college hires should be placed in junior roles and mentored systematically by experts rather than handed over to project-management professionals who will thrust deliverables upon them.

As for "one year of programming" [before college], what do you mean? There are already AP computer science courses for high school students. Most high schools have enough trouble getting students ready for college; there's no need to add comp-sci when the basics are on such poor foundations. It's much more effective to provide a solid grounding in mathematics, writing, science, history, and literature in high school.


I worked in Huntsville, AL once upon a time (for those who don't know, Redstone Arsenal and the Huntsville research park do a huge percentage of the world's so-called "rocket science", and the city has the highest percentage of STEM workers anywhere in the world). Their university takes engineering pretty seriously (and is literally squashed up against the research park). Even so, things can be difficult.

If you get an internship with one of the big companies, you will very likely have a great career in Huntsville. If you don't, you'll have a much harder time. There are only around 180K EE jobs in the US, compared to a couple million CS jobs. The dropout rate for EE is huge, and the pass rate for the FE/PE exams is somewhat low (you have to become certified after graduating). Companies must take a risk to hire new people because the barrier to entry is so high, but even then, they are fairly picky about experience. How much pickier will they be when they have more options?

The argument about having trouble with the current school curriculum is a red herring. Just because schools are already failing and need an overhaul does not mean the overhaul shouldn't include an introduction to coding in middle school (and for the record, they need a new curriculum, better teaching methods, and, most importantly, parents who care about their kids enough to get involved in their day-to-day education).

I don't believe for a minute that everyone can code well enough to do it for a living any more than I believe everyone who takes English classes will make a living as an author. English is important for communication, math is important for calculating and thinking, and coding is important for focusing on creative logical reasoning (complementing math very nicely). Learning how to make a turtle go through an obstacle course will teach way more about logic than a thousand "word problems".
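To make the turtle point concrete, here is a minimal text-only sketch of my own (no graphics library assumed): the whole exercise is just tracking state and applying simple rules, which is exactly the logical reasoning being drilled.

```python
class Turtle:
    """A grid-walking turtle: position plus a compass heading."""
    MOVES = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
    TURNS = "NESW"  # turning right cycles N -> E -> S -> W -> N

    def __init__(self):
        self.x, self.y = 0, 0
        self.heading = "N"

    def forward(self):
        # Step one cell in the direction we're facing.
        dx, dy = self.MOVES[self.heading]
        self.x += dx
        self.y += dy

    def right(self):
        # Rotate 90 degrees clockwise.
        self.heading = self.TURNS[(self.TURNS.index(self.heading) + 1) % 4]


# Steering the turtle around a 1x1 "obstacle": four forward-and-turn steps
# trace a square and bring it back to the start, facing north again.
t = Turtle()
for _ in range(4):
    t.forward()
    t.right()
```

A kid who can predict where the turtle ends up after that loop has internalized more logic than a page of drill problems.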

Most children will take such a course and walk away disliking coding, but hopefully with a little better logical ability. The rest will discover a subject they enjoy and increase their proficiency before college.


"Word Problems" -- these are absolutely critical.

There's only so much students can absorb, and they need the basics FIRST. While computer programming is a nice enrichment for the students who are ready, it would be a waste of time if the kid can't do algebra and word problems by high school graduation.


We don't necessarily need a "programming class" in the formal educational style. We need an introduction to the possibilities of programming, showing the magic of having their ideas come to life on screen, or of personally modifying a running piece of magic. That's the sort of thing that hooks kids of that age, not churning them through boilerplate code.

Another goal would be to break the mystique around computing and show that it's accessible to everybody with just a bit of guidance. Too many people who didn't get into computing early on are too scared to touch anything.


I believe we should be pursuing equality of opportunity rather than equality of outcome. Currently, there's still a big class gap (poor people -- especially minorities -- are far less likely to code), and even among the middle class there's a lot of disinformation about coding (thanks, Hollywood).

I agree that proper rigor is best avoided at that age. At the same time, a little bit of real code (especially in a language like Scheme, where there's not a bunch of syntax to memorize) won't scare away the people who gravitate toward coding.

What we don't need to do (in my opinion) is attempt to appeal to the lowest common denominator. We need to find all the other good candidates rather than lying to the unqualified by saying "anyone can code". Give them a chance to fail or succeed and let nature take its course.


That isn't unique to computer science or programming. I've worked with engineers, accountants, project managers, and many tradespeople who never touch their area of expertise outside of the classroom or the office. No venturing outside the curriculum, no self-improvement on personal time. For most of them, their job is just a job, and that is enough for them.

I think what is amazing about programming is that so many people are able to get jobs without degrees. There is a general consensus that self-taught developers can be as good as or better than formally taught ones. I haven't seen this in other fields until you get to senior leadership. Sure, areas like law and accounting require a formalized foundation that's hard to get outside of a degree, but the trades are an area with a lot of room for self-development, and yet there is little respect for those without certificates. Sure, you can make the argument about insurance needs, but that just raises the question of why contracted developers don't need to be bonded.


Programming has a very low barrier to entry in the United States. You need a computer, not a particularly recent one, and time. I'm told that proficiency with English is also helpful (I speak English natively, so I can't comment on how helpful). Internet access can be had for free at libraries/schools to download tutorials and software.

I know several people who learned to program on scavenged 386s and 486s in the late 90s and early 2000s. These could be had for free from some places (they were being thrown away) or cheaply from a garage sale. You could get a copy of Linux to run on a 386/486 from a local users group, or by downloading it slowly at the library and putting it onto floppies (assuming your begged/borrowed/scavenged machine didn't have a CD-ROM). A copy of Red Hat Linux, SuSE, Slackware, Debian, FreeBSD, or Mandrake had everything necessary to teach you Python, Perl, Bash, Makefiles, Emacs Lisp, or Awk in 2000 (tutorials, compilers/interpreters, intermediate documentation). If you could get net access, a whole other world opened up.

This was my path, but I've met many others who progressed similarly. Scavenged hardware and an addiction to the power (instead of sports, dating, etc.) can pull you out of the trailer parks, high-rises, or hoods. (Not that you couldn't do multiple things, but if you're poor in the States, you probably had to work from the time you were legally able, which, combined with school, leaves little time for multiple hobbies.) Other trades don't have such a low barrier, which practically means they are limited to those who can progress through a college path.


Too busy to read all comments, apologies if it's been stated elsewhere, but a related remark: Whenever I interview a programmer, I ask them about their pet project at home.

I have yet to meet a good programmer that doesn't have at least one pet project developed outside of the office. Active development not required, just something that they put a lot of time/thought into. I ask them to tell me about it, platform used, problems encountered/resolved, things they've discovered. Fire in their eyes -> a big plus. A statement along the lines of "I had to stop because I dreamt about code" or they teach me something new -> double plus. Anyone who tells me "I don't program at home, once I leave the office the day is done" -> 'archived' immediately. 'School-coders' typically fall into the latter category.

PS I taught myself at ages 14-17 (Basic -> 6502 ASM -> Pascal -> C) then went to college (computer science BSc/MSc). Then the internet came about. Yes, I'm that old… No, I can't keep up with the latest and greatest either. Still have a pet project (or 10)


> I have yet to meet a good programmer that doesn't have at least one pet project developed outside of the office.

Frankly, I have never once had any need that could not be solved by free off-the-shelf software and/or a couple short scripts. I have absolutely zero desire to get home, having spent the day programming, and program some more.

I guess I would still fall into that category, because I developed and ran a website for some time (and forums that I sometimes hacked on a little), but I've never mentioned it in an interview, and really, looking back, I should have spent that time doing other, non-tech-related things.

Yet at every job I've been at, management considers me a "rock star" (ugh).

I think your criterion is likely (though of course not guaranteed) to get you good programmers, and it's not necessarily a bad thing to exclude the set of good programmers who don't meet it, but I find the idea that I should love programming so much that I dream about it and do it at home... unsettling. Nobody expects that from other professions. And I associate it generally with people who are willing to accept below-market pay (and therefore with companies that pay poorly), because they're happy just to get a job doing it.

That said, I do like programming, well enough. But what I really like is solving other people's problems. And what I love is my hobbies that have nothing to do with computers and are unfortunately impossible to monetize.


That really depends on how much you can put into your day job. If you're at a small firm and you have equity, it may be more worthwhile and interesting to focus on professional work, dealing with problems at a scale and complexity beyond personal projects.


> I have yet to meet a good programmer that doesn't have at least one pet project developed outside of the office.

I don't. I use my limited spare time outside of work to study.


But you did… at one point? (note that I wrote: "Active development not required").


Last time I had a side project was at uni. So a bit over a decade ago.

At the time I was 90% theory/learning (Uni) and 10% practical, so adding in a side project to increase the practical side was very valuable.

These days I'm 90% practical (work) and so adding extra L&D/theory is much more valuable to me than spending another hour coding.


I didn't study computer science, but I was a lot like this at university. It wasn't out of laziness or lack of imagination (because I worked pretty hard), but simply naivety and blind faith in the university.

I'd worked hard to get into a reputable university, and simply assumed that they would be teaching me all the stuff I needed to know. Anything that the wise professors (experts in their field!) chose to leave off the syllabus must be of lesser value, right?

Looking back now as the cynic I have become, this seems laughably, pathetically naive, and completely false. 18 year old me was pretty stupid.

If I ever get the chance to give advice to a young person I will be sure to impress on them that it's the stuff you do outside the course that really matters. The qualification just gives you credibility and gets your CV past HR. That cool side project you did is what you'll talk about in the interview. I don't know whether they'll listen, but I'll tell them anyway.

Some people (like you) seem to know this instinctively. I wish someone had told me!


I half-jokingly say there is a single test of whether a non-programmer has any chance of becoming a programmer: do they right-click?

I guess there are analogs for different environments. On the command line: do they know about man pages? For general problems: do they google first or ask for support? Etc.


I can't decide if this is a deliberate or accidental Mac joke.


I took an A.I. class and decided that was a good opportunity to teach myself Python, since the professor didn't care what language the assignments were in as long as they worked (I specifically asked her if using Python was okay too). It took me longer to turn in each assignment because I was getting tripped up on encountering certain things in Python for the first time, but it was definitely a better language for the programs we worked on (I think everyone else was using Java), and I liked the language so much I still use it whenever I can today.

But my degree was never going to teach me Python. I had to decide to learn it myself.

Pretty sure we also learned Prolog in that class, but we had instruction on that.


I would say that 3 out of 3 developers are self-taught, but that about one third of them also have a degree in Computer Science.


I agree completely - I studied Computer Science in college, but nevertheless I learned how to program by teaching myself. I got value out of studying comp sci, but by no means whatsoever did studying comp sci teach me how to build software.


Yes! I view my degree and experience getting it in college as incredibly useful for learning all sorts of theory and the like that I wouldn't have taught myself because I don't find it particularly interesting. But it's still useful knowledge to have in the back of your head.

90% of what I do daily on the job, and any practical/useful programming, is stuff I've taught myself over the years.


Or mathematics, or physics, or astronomy; and I'd rather hire any of those than a modern CS graduate, everything else being equal.


Having taught and graded students from backgrounds ranging from undergraduate degrees in CS and ECE to Maths, and from IIT and MIT to Duke, I can assure you that it's probably an inferiority complex about your own CS education, or math envy, that is behind this line of reasoning. Given a particular school a median CS major is typically always a better programmer/problem-solver than an equivalent Physics or Applied math major.


Self Disclosure: I was a Math/Econ major with a minor in Applied Statistics (almost a decade ago).

> Given a particular school a median CS major is typically always a better programmer/problem-solver than an equivalent Physics or Applied math major.

The highest scoring undergraduate degree for those who take the MCATs is Math/Stats. The second highest is Physical Sciences. Far and away, last place is Specialized Health Sciences, with Biology also lagging[1]. The reason isn't because "Math majors are smarter", it's because they are choosing to go into a field that isn't the directly assumed career path. A Math major HAS to stand out compared to the average CS major when going into a programming field, just as an English major HAS to stand out compared to a Finance major when going into finance. If I am interviewing a Math major and a CS major, chances are the Math major is "better" (whatever that contextually means), simply because they've already stood out above all the CS majors. All that being said, let's not pretend our undergraduate degrees are anything more than a rubber stamp.

[1] https://www.aamc.org/download/321496/data/factstablea17.pdf


>> let's not pretend our undergraduate degrees are anything more than a rubber stamp.

May be at the school you studied at. Having taught / studied at Cornell (a highly ranked CS school) and Syracuse (ranked ~50th in USA) there is a huge gap between difficulty & level of effort required in undergraduate courses. Ignoring it as if its just a rubber stamp is a huge mistake. Honestly while there were several good students at Syracuse, I can frankly attest that even an average CS Major at Cornell is better than the 90% of students at Syracuse.


> May be at the school you studied at.

Mature.

> there is a huge gap between difficulty & level of effort required in undergraduate courses

I'm not arguing the difference between the quality of education between schools. I'm commenting on its usefulness after the rubber stamp. I went to a top college. I still question whether or not $200k was worth it to buy my first job.

> I can frankly attest that even an average CS Major at Cornell is better than the 90% of students at Syracuse.

I can frankly attest that I don't give a shit when their resume ends up on my desk. The number of A students who end up with a B- in life is roughly the same as the number who go from B- to A. The A+ students end up in academics, which is the only place where that aspect actually matters. I care about practical utility, not a slip of paper. "Westchester Community College" ends up in the same pile as "Cornell" if they have 2+ years of work experience.

My father teaches at Cornell as well, but at Weill in NYC. In the last 4 years, he's accepted 1 doctor from Harvard Medical School (#1 in the country) to his fellowship program. He typically doesn't take any because he thinks they value the history of their education more than the usefulness of it in practical application. While I was growing up, EVERY year when reviewing his applicants, we would inevitably have a dinner conversation about "those entitled shit heads".

> Ignoring it as if its just a rubber stamp is a huge mistake.

Whatever floats your boat. However, I'd caution you to warn your students exactly the opposite.


A purely selfish request, I know, but may I ask why? :)


It's a lot harder to fake one's way to a mathematics degree if one lacks, or cannot acquire, rigorous reasoning skills, and those are critical to the construction of quality software. Meanwhile, I interview a lot of CS graduates who have degrees from fine schools yet can't solve simple analytical problems or write a straightforward predicate expression to test for a well-defined condition.
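To illustrate the kind of predicate in question (my example, not one from an actual interview): expressing "is this a leap year?" as a single boolean expression is exactly the well-defined-condition exercise that trips people up.

```python
def is_leap_year(year: int) -> bool:
    # Divisible by 4, except centuries, except centuries divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

The spec is three sentences long, yet a surprising number of candidates produce nested if/else towers, or get the century exception backwards.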


My experience is mostly counter to this, at least with respect to engineers (I've encountered fewer mathematician and scientist types professionally). My office prefers engineers over computer scientists, to such an extreme that they'd hire a civil engineer over a computer scientist.

The consequence is a lot of people who know nothing beyond basic algorithms and data structures, who try to code their own geographical databases using arrays and linked lists and wonder why queries like "find the nearest neighbor to X" and "find all objects within this shape" take forever. The answer is, they used the wrong data structures. But convincing them of that is like pulling teeth. Also, according to several of them, C = C++, and every language gets compiled to C, so why would you use anything else...
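For what it's worth, the right tool here is a spatial index. A toy 2-D KD-tree (a from-scratch sketch for illustration; real systems would reach for PostGIS, an R-tree, or scipy.spatial.KDTree) answers nearest-neighbor queries in roughly O(log n) instead of scanning every point the way the array/linked-list approach does.

```python
def dist2(a, b):
    """Squared Euclidean distance between two 2-D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def build_kdtree(points, depth=0):
    """Recursively build a 2-D KD-tree, splitting on x and y alternately."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, depth=0, best=None):
    """Find the point nearest to `target`, pruning half-spaces that
    cannot contain anything closer than the best candidate so far."""
    if node is None:
        return best
    point = node["point"]
    if best is None or dist2(point, target) < dist2(best, target):
        best = point
    axis = depth % 2
    diff = target[axis] - point[axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, depth + 1, best)
    # Only search the far side if the splitting plane is closer than `best`.
    if diff * diff < dist2(best, target):
        best = nearest(far, target, depth + 1, best)
    return best


points = [(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)]
tree = build_kdtree(points)
closest = nearest(tree, (9, 2))  # (8, 1) is the true nearest neighbor
```

Same interface as a linear scan, but the pruning is what makes "find the nearest neighbor to X" stop taking forever.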


But you said "everything else being equal", so I'm confused why you're highlighting the extreme negative examples. If you interview a math major and a CS major and they both do equally well on those simple analytical problems - would you still rather hire the math major?


I'd make offers to both, of course, once I've moved on from the simple question to the hard ones and observed good answers! Good candidates are rare.


yes, his point is that he could be wrong about the hire decision, and going with the mathematics degree is more likely to be right.

Whether or not that's true is another discussion, but that's the point he's making.

I have a degree in CS & Math, so I'm guessing I'm good to go :)


What are the odds that you do not have a CS degree, and have been made to feel inferior at some point(s) in your life, for that reason?


Two out of three developers who answered a survey on a self-teaching site are self-taught. I see this is yet another article assuming the SO survey is representative of developers in general. It is not. It is representative of people who use Stack Overflow heavily enough to see and click on a survey link.

Clearly there's a lot of interest in finding out what is true of developers in general: What languages do we prefer? What's our educational background? What are our demographics? It would be great to have a survey that would answer such questions. This one doesn't.


I have a CS degree and still consider myself self taught. I started programming as a kid, and learned mostly from books and coding. I only got my degree because I thought it'd be necessary for getting a job.


Anyone reading this who's in the same boat and intends to get a degree: please do yourself a favor and consider getting a degree in Physics, EE, Math, ..., and doing some CS on the side. Sure, CS might be easier, but you pass up the chance to widen your horizons.

EDIT: many will only study once; why not study something new? Even if tuition is free, why have someone teach you topics you might already be very familiar with? I'm not saying you won't learn anything new in CS. You very much will learn something new. But if you choose a related field with little to no exposure yet, you will learn so much more.


Anyone reading this and being in the same boat: Be very careful with this advice. There are high-paying jobs out there who will only consider CS grads. Why cripple yourself professionally learning something tangential to what you want to work in?

Anecdotally the majority of non-CS grads I've worked with haven't been anywhere near as good as CS grads.


> There are high-paying jobs out there who will only consider CS grads.

Are those entry-level jobs? For those I can see it. If you're hiring somebody with 5+ years of experience, though, only considering those with a CS degree is pretty short-sighted.


This is unbelievably bad advice. People underestimate the difficulty of learning core CS and programming material after they themselves have learned it, all while underestimating the difficulty involved in learning Math, Physics, and EE. As someone who studied Chemical Engineering, later did an MS in CS, and is now doing a PhD, I highly recommend studying CS if you are interested in CS. Barring ECE (not EE), the amount of overlap between even Maths/Physics and CS is minimal.


I understood CS a long time before I understood all the math in EE. CS doesn't start to catch up to engineering in mathematical rigor until you are on the back half of your masters degree. I can teach you missing CS theory about how compilers or algorithms work on the job without too much trouble. I wouldn't dream of teaching you how to do polar coordinate conversion for AC circuit calculations on the job under any circumstances though. 4 years of CS is very easy compared to 4 years of engineering.
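For the curious, the rectangular-to-polar conversion mentioned above looks like this with Python's cmath (the component values are illustrative numbers of my own choosing):

```python
import cmath
import math

# Impedance of a series RL circuit in rectangular form: Z = R + jwL.
R = 100.0        # resistance, ohms
L = 0.05         # inductance, henries
f = 60.0         # frequency, hertz
Z = complex(R, 2 * math.pi * f * L)  # about 100 + j18.85 ohms

# Polar form (magnitude and phase angle), as used in AC phasor arithmetic.
magnitude, phase_rad = cmath.polar(Z)
phase_deg = math.degrees(phase_rad)
```

The mechanics fit in a dozen lines; the hard part the parent is pointing at is knowing *why* you convert and what the phase angle means physically.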

Theoretical CS is very important, but also quite academic. 99% of developers won't ever delve into that theory and will instead reach for a library based on that theory. The biggest issue in programming is managing large amounts of information that changes over time. This is not only completely avoided in most courses, it is also close to impossible to teach outside of gaining years of experience because every decision is a tradeoff and it takes time to build up an intuition for such things.


This is REALLY bad advice. Yeah, the high-flying finance firms will pay big bucks for Physics or Math majors that can code and think quick on their feet, but you have to be REALLLLY good to get into those firms, and those firms hold CS to the same level anyway.

If you want to become a software engineer, get a CS or CpE degree and do whatever you want as a minor.

Some anecdotes:

* My girlfriend got a Math degree from a good, but not Ivy League, school with a minor in Physics. It turns out the market doesn't value a B.S. in Math as much as one would think. She ended up becoming a teacher, which she likes doing, but it's a field that's really hard to get out of without resetting your career.

* I went to school that had a cooperative education program. All of the engineering students were eligible to participate. Twice a year, the Co-op program had two days in which hiring managers from local companies (my school is in Hoboken, so local == NYC) would come to the school and interview people. We would put our names and majors on the lists that employers had, and they would select the people they wanted to speak with.

The CS and CpE students would always, without fail, get AT LEAST 7 interviews that day. Every other major would be lucky to get 2 or 3.


That's an interesting mindset. As someone who is self-taught, I consider a degree to be a formality that I have to wait for because I move faster than this.

The Internet is my chance to widen my horizons, if I want to learn physics or math, I can and I will, but a Physics or Math degree might not help me as much professionally.


As I've mentioned before, I really prefer to interview and hire mathematics students over CS. It's much harder to fake one's way to a math degree.


This thread focuses overly on algorithms, which are not necessary for the majority of coding. What is necessary is the ability to handle abstraction, purity, and regularization of thought. Mathematics, philosophy, and writing are just as applicable a skill, or more so, than being able to please an online judge.


I have a math degree. I do a LOT of coding on the side. Hire me? :)


I'm in the same boat right now. I have been programming since I was 12, and I'm getting a degree because I don't think I'll get a job without one (or rather, my parents think that).

Is this true? Can I start looking right now? Granted, I am not in the valley. I'm all the way in New Jersey.


I'm not sure I would so quickly listen to the HN trends. There are a lot of people on HN who are younger and they like the idea that this would be a trend. You are allowing them to define the competitive advantage.

You'll have to do the math and weigh the cost/benefits of your degree.

Personally I think we're in a bit of a bubble/gravy train. The idea that programming will become the new automotive assembly line for self-taught labor is a bit of a stretch.

A degree can impart many benefits (math, core CS ideas, subject-matter classes such as speech theory, advertising, management, etc.). While many self-taught people can build enterprise apps given some direction, not many can really build something in depth. Boeing isn't going to let just anyone write aerospace firmware. So if you want to get a job quickly and milk this while it's hot, maybe you go for it? I don't think it's clear cut, and I like having my degree. I'm good friends with fantastic developers without degrees who work alongside me.

In the long term, it depends on the competitive advantage you can get out of that time in school versus your peers' extra experience in the field, as well as your goals.


Is a degree objectively better?

Developers generally fit into two broad categories -- those who learned to program on their own at age 10 or so and those who decided to pursue a CS degree because the job pays well or they like playing video games. 7 years of experience beats 4 years of college and 3 years experience pretty much every time.

Unfortunately, a huge amount of CS degree time is spent dealing with those with no experience (We have remedial Math or English classes, why not remedial programming classes?). When I hit college (pursuing EE with CS on the side), I already had almost a decade of programming under my belt. I coasted through anything programming related (I'd already found and read books on the theoretical side of programming, so even those weren't that interesting).

At the end of the college road, the people with college only were mostly worthless. They may be able to parrot the big-O notation, but they couldn't tell if a function would be efficient. They could talk about design patterns, but they couldn't handle systems more complex than a couple files. That's nothing against them, I (and others who had been programming from an early age) simply had a lot more experience in thinking in that way (and over time, most of them have become much better).

We need to introduce programming to all children at an early age. Not because everyone can code (that's not at all backed by any studies I've ever seen), but because lots of kids don't find out that they have a knack for programming until college (or not at all).

If most CS students had been programming since middle school, CS could drop a bunch of the remedial classes and focus on the finer parts of programming. Many so-called masters classes are within easy reach if you have some programming time under your belt. Companies would be a lot more willing to hire a dev out of college if they knew college was worth something. Until that happens, I don't think a degree is actually better.


I'm actually confused why you seem to be attempting to argue my point, when I was stating it's a situation that is unique to an individual. Do you really objectively think a degree is a yes/no answer that applies universally? What if someone wants to work on deep sea exploration robotics? Yes/No is an oversimplification and your post actually helps to point that out in two ways.

The second way is easier, so I'll get it out of the way. More education earlier is an argument that formal programming education can be helpful. That wouldn't undermine the idea that a CS degree has importance on an individual basis, not as a general yes/no rule. We've agreed that formal programming education CAN help.

The first point is big for me, and it really bothers me, to be honest (not your post; I appreciate your post). It's this idea that a CS degree is some static, cemented thing, like we order it from Amazon. A degree is what you make it!

Your point about remedial classes, etc. is spot on. It's exactly an argument that each person has to weigh the benefits they can get from the degree against the benefits of going right into the industry. If a person shows up at college and picks programming for the reasons you described, then you are probably correct that it is not "better". However, college is a four-year chance to build a competitive advantage. Statistics. Economics. Linguistics. Physics. Accounting. Art. Etc.

If someone chooses to make their degree a worthless money sink, that does not cancel out the person who spends four years learning to code, analyze speech, and working hard on the OpenROV team. Those are two completely different people and neither needs to put their degree on their resume. But only one can put speech analysis and OpenROV there; which they got as part of that degree. They also now have those contacts and team building experiences. Another example might be someone who spends the four years working part time as a contractor. "Got a degree while building industry experience" trounces "got a degree" AND "industry experience" (imho).

A degree is something you pay for, it's an investment and should be treated as such (in the scope of our discussion). While the underlying argument may be if an employer cares; I think that honestly employers are looking at it as a signal that there might be something more to you than some Java/JavaScript syntax.


My main problem with attending my university is that a lot of my time is spent on classes I can't bring myself to care about.

Right now my in-major GPA is above a 3.2 while my cumulative GPA is hovering around 2.9; I just don't care about non-CS classes.

This would be fine if being at a university had any noticeable benefits, but for me it only has drawbacks:

  - No one will hire me for a paid position
  - I have to waste most of my time on classes I don't care about
  - No more time for my side projects that I use to learn.


My GPA was not awesome.

You could be affected by any number of things (wrong university, wrong goals, wrong order of operations, in a hurry). Please don't be offended, I just mean that you might leverage your resources better and find classes you care about (I ended up double majoring in CS and Economics, but I didn't care about econ until I tried it).

Here is a more plain and TLDR response. Pretend it's a new investment and write down/spell out what you are putting in and getting out. If it doesn't add up, don't make the investment.

Finally, not caring is a problem. For some arguing in favor of degrees, this is a big part of their point. Why don't you care about those classes? Do you think you'll only ever get cool and interesting projects at work? An employer needs to know you'll spend two weeks finding some ridiculous bug that doesn't even affect anything if that is what they assign. Every third Sunday, the system crashes. Fix it. While your peers do greenfield work. It means sometimes putting effort into things you aren't interested in as part of the bigger picture.


If you think of it as wasting time, then your attitude will come out in your work. All those classes you don't care about are there because university isn't trade school.

The more interesting work goes to programmers who can communicate. 90% of programmers can get the job done; I want someone who can articulate what he's built, how it can be improved, when it will be ready, what needs to be changed, what we're doing wrong, etc.

That's what all those classes you don't care about are for: giving you perspective outside heads-down coding that can be outsourced for $50/day.

You'd be amazed at how many people I have interviewed who simply either can't or won't talk (from the interviewer's perspective, they are the same thing!).

Again: most halfway decent programmers are adequate. The great ones are great not because they can write code, but because they can explain how that code works to someone else and, vice versa, they can understand someone who is trying to explain why the code doesn't do what they need. That is worth paying for.


It actually bugs me every once in a while when I do an interview and I know the guy can code but he just can't articulate anything. It's like he's locked in there and he would be fine if I slipped tasks under his door in an envelope. But to your point it just doesn't work that way and those people get turned away not because they didn't have the technical skills, but because they demonstrated they couldn't be effective beyond the IDE.

The other day I finally cut a guy off and said, "Yes, sure, you aren't good with terms and explanations. How do you have technical discussions with other developers? What do you do in code reviews?" Deer, meet headlights.

I also realize your point is beyond technical communication, and you are right on that too.


My college had three non-CS writing courses. I would say they helped improve my writing skill, especially one of the classes. I would also say writing well is an important skill to have when working for a business - when writing documents and sending e-mails to other techs, and especially in communications with non-techs. CS is not the only thing you need to know when working for a business.

Other classes folded back into CS or working for a business as well. For example, my psychology course was helpful in a number of areas, including as preparation for an AI elective I took.


I have to take two physics classes, ~4 history classes, and ~2 English classes.

We also have a requirement for attending a philosophy class about ethics in computing or something.


My education as an EE was somewhat similar. Before starting any EE classes, I took 4 physics courses, two chemistry, three English/composition/Literature (would have been 4 but I tested out of Engl101) courses, and one History course (that I failed and had to repeat because I blew it off). Then we can get into the stuff I only know I took because it's on my transcript. Stuff I forgot all about because it was never actually "used" like Thermodynamics, Heat Transfer, Linear Analysis, and Math, Math, Math. I think the only time I didn't have anything that looked like a math class was the semester right before graduation.

Somewhere in there I took a few actual EE courses.

Guess which ones had the most impact on my career? Hint: it ain't the EE stuff (although I worked as an EE for 6 years after graduation).


The difference I see is that EE is a math field. The concepts of EE can only be represented as numbers and math.

Computer science is only focused on logic and algorithms with state.

It's not really math.

But in the end I DON'T mind math classes; I like math classes. I mind stupid English, Lit, and Phil classes.

I'll probably go for a math minor. I like it. But I'm not going to go for an English minor.


Part of the signal of a degree is whether you can do a good job on things that are required but are not interesting. It's called work for a reason; you're not worth much to an employer if you can only do things that you find fun.


Yes, we just had a period after 2008 when open positions dried up, and I remember 2000-2001 when the economy hit the tech companies specifically (I went to work at an investment bank). People shouldn't ask if they and their co-workers have CS degrees, they should ask what percentage of programmers in their 40s and 50s they have met have CS degrees (or at least math/physics/EE degrees). 1999 was even more of a go-go time than now, then suddenly everyone was laid off and looking for work, and the few positions open that didn't require a BSCS would have HR go over your educational background asking about it. A lot of the ones without degrees who got jobs when people were desperate to hire got washed out when times got hard, and never came back to the industry. Something to think about when you see all the 20 somethings sitting around you (where are the older people?) and tell yourself a BSCS does not matter.

One point people make is the best programmers without degrees are better than the worst programmers with degrees. This is true but does not mean much.

I took courses in Java and C++, and self-taught programmers tend to know as much as I learned in those classes. What they tend to skip is the theoretical foundations. Calculus, graph theory, theory of computation, mu-recursive functions.

I took two classes (one was an elective) which dealt with mutual exclusion, race conditions, critical sections, etc. I spent months studying them, writing complex homework dealing with them, being quizzed on them, and so on. This came to be helpful, sometimes very helpful, for me. Obviously people are not learning this, as I run across race conditions in code far too often, and I have heard others say the same. I had to sit down and spend months learning to deal with this, and some programmers never do.
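The race conditions described above usually come from an unprotected read-modify-write. A minimal Python sketch (names are illustrative, not from any of the coursework mentioned) of the bug and the fix:

```python
import threading

# "count += 1" is really read, add, store. Without a lock, two threads
# can read the same value and both store value + 1, losing an update.
# The lock turns the increment into a critical section (mutual
# exclusion): only one thread executes it at a time.

count = 0
lock = threading.Lock()

def worker(iterations):
    global count
    for _ in range(iterations):
        with lock:  # remove this, and the final total becomes unpredictable
            count += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(count)  # 400000 on every run, because the increment is protected
```

Without the `with lock:` line the program still runs, which is exactly why these bugs slip through: the loss of updates only shows up intermittently under load.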

I don't understand the comments from people who say they already knew what was taught in class so it was a waste. Class is one fourth of the time, then you're supposed to spend three hours studying for each hour in class. So if you know 100% of what was taught in class you just saved yourself three hours to do something else. I was a Unix sysadmin for a long time and yes, the class where we learned "ls", "cd", "pwd", "chmod" etc. was not that educational. But the next class our homework was to study the process scheduler for Mac, Windows and Linux. I had always put off doing this, so I finally sat down and studied how Linux's completely fair scheduler worked and learned a lot. I learned what actually happened in full when an add instruction came in for two registers on a processor, and what logic gates they would go through. So even what I specialized in was filled out.

Another thing - often it is not what you know you don't know, but what you don't know you don't know. If I know a heap is a good data structure for quickly finding the highest number in a list, then I can learn about heaps. If I've never heard of a heap, I might use a slower method. Or a real-world example - in class I learned about Goedel numbers. Then a year later at work I had to take a (short) list of (small) numbers and turn it into a single hash value. But how? Then I remembered Goedel numbers. Again, it's not what you know you don't know, it's what you don't know you don't know. To learn by yourself, you have to know what to look for, and you often don't.
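For readers who haven't met them: the Goedel-number trick mentioned above can be sketched in a few lines. This is an illustrative Python sketch, not the commenter's actual code: each number in the list becomes the exponent of a successive prime, and the product is a single integer that decodes uniquely (by unique factorization). It's only practical for short lists of small numbers, since the result grows fast.

```python
# First few primes; extend the list for longer inputs.
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]

def goedel_encode(nums):
    """Encode [a, b, c, ...] as 2**a * 3**b * 5**c * ..."""
    code = 1
    for prime, n in zip(PRIMES, nums):
        code *= prime ** n
    return code

def goedel_decode(code, length):
    """Recover the list by counting how many times each prime divides the code."""
    nums = []
    for prime in PRIMES[:length]:
        exp = 0
        while code % prime == 0:
            code //= prime
            exp += 1
        nums.append(exp)
    return nums

print(goedel_encode([3, 1, 2]))  # 2**3 * 3**1 * 5**2 = 600
print(goedel_decode(600, 3))     # [3, 1, 2]
```

The encoding is injective, which is what makes it usable as a hash of a short list, though a modern codebase would more likely reach for a tuple hash or a standard hash function.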

As I said, I worked a while as a Unix sysadmin. I went back to finish my BSCS. I'm not rationalizing a decision I made at 18 by talking it up, I realized how important it was to filling in gaps of knowledge, as well as to getting a job, especially when one is needed. A lot of people reading this are too young to have been working when most of the startups and dot-com's folded in 2000, and the few IT positions at traditional companies were flooded with applicants, many with college degrees. Things can turn on a dime, and having 4 years experience in RoR or Node.JS can quickly go from meaning a $120k job to meaning next to nothing.


So what you are saying is that the classes are just there to force you to learn by yourself?

If I am paying money for a class, shouldn't I be taught the material? Not just do something I could do in a free two-year stint with a computer.


How does my three years of reading get compressed to two years?

As I said, most recommend each hour of lecture time to be followed by three hours of self-study.

Also, if someone is going to do those three years of non-class study (after class in my case) so they can have my equivalent non-class study - why not just do the extra year of in-class work and get the diploma? The attitude perplexes me - "I will study three years like you, but won't get a diploma that will help in employment".

No one is forcing me to go to class. But if I am taking a class on theory of computation, with quizzes etc., then it makes sense that that will be the months in which I read about pushdown automata outside of class. So after I choose which class to take in a semester, I am in a sense "forced" to study that topic outside class for the next four months. I can also e-mail my professor or classmates or wait after class or go to office hours if something confuses me.


1) You will spend 10x the time in those 2 years struggling by yourself (or asking questions online).

2) You will have gaps because you missed things.

3) You won't retain the knowledge as well because you won't be tested on it.

4) If you're someone who isn't good at math, good luck motivating yourself to study real analysis, optimization theory, statistical learning theory, etc.


A lot of doors will be closed to you if you don't have a degree. And college is a great place to make connections with like-minded people too.


Any advice for those who have a degree in an unrelated field, have been coding since they were 11, and are approaching thirty? I consider other options sometimes, but feel this might be an issue.

It isn't every middle schooler who reads a book on Perl for leisure.


The book "Cracking the Coding Interview" is a huge help for getting through interviews. It's a real grind to work through, but worth it. You don't have to do all the problems, but make sure you do some from each section.

Most business degrees can be useful, because you understand the business much better. Accounting actually turns out to be useful for distributed systems.

Edit: It wouldn't have occurred to me, but I think Anthropology would be very helpful in understanding why systems were put together the way they are.


and here I am with an anthropology degree.

In fairness, I am a developer for an anthropological research agency at a large university. That is a plus.

I'll take a look at that book. Thanks.


It depends on what you're experiencing. For me, I'm self-taught from age 8 and got a degree in Mechanical Engineering as I thought I wanted to work in automotive. Did an internship for MB in Stuttgart and when interviewing for permanent positions was steered away from US auto by a Ford engineer. If I had his address, I'd send him a Christmas card and a nice gift every year.

I've had zero issues finding work in computing. If you're also not finding issues, the advice is "do nothing and continue to ignore the line in any JD that says 'BS in CompSci required'". It's almost certainly not.

If you are experiencing issues getting interviews, consider leaving your degree off or explicitly highlighting your self-taught nature and work experience.

Anyone who was reading a Perl book in middle school for pleasure and has stuck with computing since then is almost surely a strong candidate, at least way more than sufficiently strong.


Not sure if this helps. I'm also a self-taught programmer with a similar story (started programming at 10 and never stopped). I'm in law school now, but I continue to spend nearly all my time hacking. Sometimes I think it's better not working as a programmer, because I'm very passionate about computers.

I was depressed for a long time that I couldn't study CS. Then I realized, I can learn what I want freely. I have an entire life dedicated to computers.

Although I secretly dream about inventing the next facebook, I mostly embraced the fact that I won't be able to get a formal education and a job in the field.

Tinkering with computers is part of who I am. The rest is not in my hands.


Use Coursera to get the education you wish you'd had.


In short, emphasize the skills and experience you do have. Apply and interview far and wide at all the companies that remotely interest you.

A bit of background: I have been coding since an early age and hold an arts degree. I had no serious experience in the industry a few years ago, but had work experience in other fields and I had many independent projects to showcase. I have been gainfully employed in software for the last few years as I approached 30 myself. Now I feel somewhat established enough that the arts/self-taught background is not as big of a deal as it seemed when I was just trying to break into the field. They even have me performing technical interviews now for some reason, which I'm mentioning to show that I've now been on both sides of the table.

My advice to you is to build your resume around the skills and experience that may be unconventional but are as relevant to the job you want as possible. Where most people would have job experience or education front and center, I placed a rundown of projects that detailed what techs I used to accomplish what ends and what results I got out of each project. These may have been even unreleased, work in progress personal projects (clearly stated as such), not for profit projects or websites or communities that I had a technical hand in creating, open source software contributions, and so on. Of course, if you don't have any projects to show at all then you should work on that before trying to get a job. But I assume you have some personal stuff, even incomplete and unseen by anyone, that you can showcase to demonstrate your skills. Links to working demos are a major plus, and you should always prefer concise descriptive text over just namedrop lists.

I list work experience where I think it is relevant to the job in software. Any previous office experience, for example, can at least show that you know how to work on a project, meet deadlines, act professionally (arguable), etc. Highlight the transferable skills of your previous employment. Likewise, highlight the transferable skills of your degree. It often became a point of intrigue that I had this esoteric degree after we've talked in-depth about programming for an hour -- so don't think of it as a weakness. In your resume, stay focused on the goal, which is to get a job in software. Do not use a generalist resume as your software resume.

Finally, pound the pavement. Do not wait. Tweak your resume and send it out often. I lost track of the number of companies that I've actually applied to, but in getting my first job I had in-person interviews (sometimes multiple rounds) at over 10 companies. I flat-out bombed some interviews. It didn't click in others. And sometimes it seemed like I was going to get the job, but with my untested background it was too 'risky' or I was too 'junior'. Don't let this stop you from going to the next interview. Embrace rejection and use it as an opportunity to learn. Try to strike a balance between quantity and quality in the jobs you apply for. Be directed and intentional in that you only apply to jobs that actually interest you and that you actually think you can do (so no "Sr. Dev", of course), but don't be so focused on one or two companies that you just sit around watching their careers page for an opening. Find a middle ground somewhere between "The One True Dream Job" and loading 100,000 copies of your resume into a biplane and dumping them over the city. In my case, beyond the usual job sites, I got a list of all the tech companies in my city in some local tech magazine's annual hottest tech company edition (or whatever it was) and went through the list applying to every single one that seemed like it would fit. And that got me my first job in tech, which has been utterly game-changing.

Others have already mentioned that you need to practice that awful hurdle that is "the coding interview". So be sure to do that as well, and take every interview as a learning experience. Hope this helps.


> college is a great place to make connections with like-minded people

I've heard this advice a lot but that hasn't been my experience. I stay in contact with quite a few of my co-workers from the past but I haven't stayed in contact with anyone from my time at the University. I was working and providing for a family while attending school so I didn't really have a lot of time to do anything other than the difficult CS assignments.


I suppose that everyone has a different experience. I'm still in contact with a number of people from my university years. I had a roommate who was more in your position, though (simultaneously studying and supporting a family). I'd imagine that his experience was closer to yours than to mine. He really didn't have time for a lot of social activity.


If you can get into a top 5 (10?) school, I'd absolutely go. It's utterly unfair (and probably not backed by actual facts), but the cachet from a top school is real.

University is also a great time to have fun, develop relationships, and become a proto-adult. I'm not sure that 4 years of "head start" (by skipping college) is a good decision for the average 18-year old to make. By all means, if you have a great idea or a compelling thing that you're running "towards", that's fine, give it a go. But the default "go to a good school and get a degree in engineering, math, or comp-sci" is pretty sound advice for most. I had no idea (more precisely: the wrong idea) what I wanted to do with my life when I was 17. College gave me time to figure more of that out and I don't feel like I lost out, in fact quite the opposite.


I still think having a degree is necessary. It lets you approach all job opportunities. But, it is true a lot of companies don't care about your degree if you show you're a good developer.


It isn't necessary for a career in software engineering as you can tell just by looking at your colleagues. But I agree that going to school will make you a more rounded individual and increase the likelihood that you can write professionally and have a better understanding of the world that you find yourself writing code in.


Sure, it'll impact you, but so will the loss of time, funds, etc. from attending to academic pursuits. The key is to find a good first job as soon as possible and leverage it to find your next job as soon as possible. After you've established a career path, the percentage of opportunities you have access to will be less and less limited by your academic history. Generally speaking, my suggestion is to go to school, though that's not the advice I'd give my past self, given the opportunity.


You might be able to get a job, but many places won't hire you without a degree. What this study shows is the share of people who are at least partially self-taught, but if you look at the numbers here, it looks like at least 60-70% possess some sort of related degree.


Companies say that they won't hire without a degree, but exceptions are made for those who have written substantial code for highly visible open source projects. It is possible, albeit difficult, to get around the very policy you mention existing at most companies. A bachelor's degree is only four years of education in a very controlled and contrived environment; it cannot make up for excellent achievement in the real world.


If you have even the slightest interest in finance and want to learn some quirks of the profession and the related engineering skill sets (start with the FIX protocol), you can make a good deal of money in NJ/NYC. No degree required.


I'm self-taught too, since maybe middle school, and did take C and C++ for giggles. I never got a CS degree though, since the math really was difficult. I consider myself a decent programmer though. It's amazing to me how strong the assumed link between an engineering degree and programming is, so much so that every job posting I've ever seen has included one as a requirement. It's a shame, really.


I remember sometime in the early 90s I was working at my first really huge contract -- hundreds of programmers.

One day after lunch I get off the elevator and take a look around the huge room. I could probably see 100 folks or so.

There were a dozen different nationalities, people of all ages and genders. There were extremely smart guys who didn't have a degree. There were extremely smart guys who had PhDs in things like particle physics. Here I was, a self-taught guy, leading a team of 30. I had 3 PhDs working for me. I knew more than one person with double degrees in a foreign country who came here for a better life.

And it didn't matter. All that mattered was whether you got along with people, what kind of attitude you had, and whether or not you could push through and solve problems for folks.

I think this was the moment that I decided that I love this industry.


It's the only industry in America that, to me, reflects the American dream. I'm a high school dropout, and so far nobody has even asked about my education. They check out my portfolio of work and go on that. Results are results; everything else is just window dressing.


Same here. I was scrubbing toilets for $8/hr two years ago, and now I work with PhDs and MBAs making six figures. God bless America.


My oldest son grew up fixing computers and programming them, but he wasn't so good at structured education.

By the time he was 20, he was out of school and working fast food. Programming on various projects in his free time. Making minimum wage.

I begged him to start looking for programming work, but he always told me that he wasn't qualified. How could he compete in the job market with all those professional coders?

Finally he tried. Of course, he got a job -- at a startup. He ended up being the go-to guy for both coding and infrastructure.

He's done a lot of things since then, but I'll always remember him looking at me, rolling his eyes, and telling me that what I was saying was impossible.

Yes, there's a huge role for luck, for having good parents, for being born in the right country, and for networking skills. But this is still an industry where if you love it, you can make terrific money just by being passionate about it.


Some interesting take-aways.

Most developers here are self-taught?

Nope, we're all self-taught. Though in this case you are faced with a survey where "self taught" is one option next to others, which include school.

Most developers aren't looking?

This may be the point which the government doesn't understand about tech jobs and immigration. I don't know what it's like to be looking for a job in the U.S. these days but I imagine most people who are decent at programming aren't looking. If you want a bunch of coders you need to get them fresh out of university or start looking abroad. The thousands of resumes going out to development job openings from the unemployed must be from crazy people who can't code.

People finding jobs from others they know?

This sort of goes along with developers not looking. If nobody is looking, then how do you find people to work for you? Get your current employees to hit their Rolodex. Nevermind all that stuff about degree requirements, etc. In my experience, the requirements hit the listing and then you never hear about them. I imagine that's because the listing attracts the crazies and then you get the gig when you sound like you halfway know what you're talking about.


I believe the biggest takeaway is the shift from preferring science fiction to preferring a soap opera in space. It seems like the number of thinkers relative to the population has remained static while the number of programmers relative to the population has increased.

What happens when you need more thinkers than society has to offer?


I absolutely agree with the general sentiment in this thread that we're all self-taught. I've had quite a few friends ask me to help them become programmers, and my first response is that (to paraphrase) "it's not just a job to learn and then work towards retirement. It's a constant learning experience where you have to wake up every day and realize you're ignorant, slow, and unimaginative compared to your peers and if that's not the case, then you're probably going to fall into obscurity. If you're ok with that, then let's get started."

One avenue I haven't seen yet approached here is that schooling was far behind the times at least through the mid-90s. I was set to graduate from a highly regarded prep school in 1996, and in the "advanced computer class", we were learning Pascal. My queries about the internet weren't answered well enough to keep me interested in the conversation.

By then I'd built a couple of silly websites for myself, met hundreds of people from around the world, and had my own little secret educational source - a step up from my peers in school, which I needed because they were all definitely smarter than me. I was hooked and had zero interest in plain old desktop applications, which was the end-game of what was being taught in every school I looked at (from my 17-year-old perspective).

A book on Perl understood what I was after. It wasn't even necessarily a very good book on Perl. It had an open source web-store on a CD in the cover and it told me step-by-step how to find a web host and then set up the web-store on a server. I set up a web store for my mom's retail business, which then stayed afloat for a little while longer (she now sells online full-time).

I proceeded to drop out of college and haven't stopped learning since.


Note that this doesn't mean completely self-taught!

Most (good) developers are definitely at least partially self-taught, but those who are employed in the industry also tend to hold a degree, not necessarily in CS but at least in a related field.


> related field

What exactly is a "related field" to programming, though? I guess you could argue math and statistics, but even that's not really applicable to the average CRUD app developer.


Our developer went to Poly and it shows. He can think of a solution to our program's needs almost instantly, whereas our other devs, who were self-taught, are no longer with us because they hit a brick wall.

It pays to have a good programmer with a degree.


The best programmers I've worked with are first and foremost problem solvers. One had a CS degree, another was an accounting major, and another was a musician with a high school diploma.

There are very few CS courses I took at the university that prepared me for the real world. I'm sure there are certain fields like AI, DB algorithms, math/science applications, and compilers that benefit from formal academic training, but for the most part being logical and a good problem solver are the most important skills.


I too have found problem solving to be the most important skill - understanding the problem so you can craft the proper fix or implementation is vital to saving time.

Of the best developers I've worked with/know, only 1 has a CS degree. One has a liberal arts degree IIRC (works at Netflix), another is a college dropout (works at Google)...and the one with a CS degree has a tendency to overengineer systems with great flaws that the others I know wouldn't do (tries to be too clever). Myself, I have a MS in math from a top 15 program (PhD dropout).


My Accounting degree has actually been really useful when working with distributed systems, as accounting systems were the original rigorous distributed systems. I only wish I'd gone heavier on the math.


I suppose I can't tell you your anecdata is incorrect, since an anecdote is just an anecdote. I can counter it with my own, though. Of the top 10 best devs I know, about half didn't get CS degrees. And I say this as someone with a CS degree.


I can second fishtoaster and add that I've worked with people who've been to university, some who did computer-related courses and others who didn't (more often than not an arts-based course), and have also worked with people who didn't do a degree at all.

None of that has been a factor in whether a developer is good or not. What matters is whether they "hit a brick wall" and give up. The best developers ask for help when stuck and help others when they're stuck.


He went to Poly because he is smart, he is a good developer because he is smart.


I was self-taught as well. Back in '82 I was an in-house illustrator at a children's book publisher. The company (now defunct) brought in Apple IIs in order to do some house-branded educational software. I fell in love with the little machines immediately, even though I really had no exposure to any computers in high school or college (art school). The real software guys that were hired taught me 6502 assembly and off I went. I got my first job as a programmer in '85 doing Mac assembly (68000) programming. So much fun.

That said, I have a huge amount of respect for CS grads. I have seen the difference it makes. My background in algorithms is totally non-existent, besides what I've been able to pick up on my own. I've been at a disadvantage MANY times due to my lack of formal CS/math training (my "formal" math edu stopped at plane geometry in HS!). Thankfully, I've always had good friends to help me through these issues, but it would have been a lot easier for me if I had had real training. Google is a huge help now that we no longer need programming manuals, as such.

I have always gravitated towards the visual, GUI aspect of software development, which is probably not a surprise given my art school background. I really think that companies should keep open minds regarding education. It really takes all kinds of people to do what we do, especially at big, diverse companies. Being visually oriented and a capable programmer is a unique kind of background that can be used to great effect. Not all cookies are the same shape.


If you make use of your university time correctly, then the primary value of virtually any degree is that it teaches you how to self-teach to an extent you could not reach if you just started self-teaching on your own.

Aside from occasional prodigies, the folks who are best at teaching themselves new things are folks who proved they could do it through degree programs.

Don't get me wrong, though, a lot of people do not actually make good use of their university time, and they get through to graduation without learning much about self-teaching. I'll never understand why they would want to waste so much money for that.

If I had my choice when recruiting, I'd select people in this order:

1. Someone whose degree and job experience clearly shows they are thoughtful and can self-teach.

2. Someone who acquired abilities by self-teaching even without a degree and/or prior experience.

3. Someone who has a degree and/or experience, but who clearly doesn't have much skill to self-teach.

4. Someone who does not have a degree, experience, or self-teaching ability.

Really, I'd prefer to never hire someone from groups 3 or 4. But sometimes it can be hard to detect fakers from group 3.

I think a lot of people share this opinion, which is somewhat tragic since very few hiring processes make even the faintest attempt to determine if a candidate is good at learning new things or self-teaching. Instead, just as with the tired old thread about HackerRank from yesterday, we spend all our time quizzing people on rote memorization of standard examples, which is something that the group 3 people are very good at faking their way through.


This contrasts strongly with "Dropouts need not apply": https://news.ycombinator.com/item?id=11393671

blogs.wsj.com/economics/2016/03/30/dropouts-need-not-apply-silicon-valley-asks-mostly-for-developers-with-degrees/


Silicon Valley != the entire programming world


Why is the photo of a gaming conference, or is that what people think developers look like?


Yeah, I noticed that as well; it looks like it's from a LAN party or something.


I like that you can clearly see video games on their screens, and the caption is "Learning at Work."


CS teaches a lot of theory, but not as much practical skill. Let's face it, 90% of programming is CRUD, and the hardest part of such applications (managing all the state) isn't something that can really be taught in 2 years' worth of CS classes (the other 2 years being used on non-CS-specific stuff). Employers don't want to pay to teach new programmers (even CS grads who couldn't get an internship), but generally expect a couple years' worth of experience before they'll even consider someone for an "entry-level" dev position.

Programming is a job that requires constant learning. If a programmer has what it takes to do that, then it's not too surprising that the programmer can learn the basics on their own. A lot of devs like myself started learning to program around 10 or so. When college enters the picture, these devs are bored for most of the classes (except perhaps things like algorithms or compiler classes).

If a would-be programmer had the foresight to look into programming as a career, then they'd probably note that there's more profit in skipping the degree and putting that $50K toward a mortgage instead. CS has more free teaching material online than any other white-collar job I know of. A programmer can get the exact same education quite easily if desired (you can't really say the same about other STEM fields, except perhaps math).


I had one class in high school and that was 1973. Other than that I was self taught and still coding today (iOS). A number of friends in college got the (new at the time) CS degrees and all but one eventually were unemployable as they had learned nothing beyond the degree content (mainframes at the time). As long as you are into keeping up to date, after a while the degree no longer matters.


I'm self taught as far as web development and programming skills are concerned. But to some degree, it feels like that was really the only option over here in the UK.

Okay, things have apparently improved significantly in the last few years, and schools are seemingly teaching a few more things actually considered programming now, but back when I was learning HTML and CSS and JavaScript and all that stuff in the early 00s, the standard of teaching in the IT classes at school and sixth form was absolutely dreadful, and it likely didn't get much better at university. GCSE classes for IT were literally "how to use Microsoft Office for dummies," and A-level courses were basically devoid of programming tasks.

As a result you basically had to teach yourself, since there's no way in hell your school education would teach useful skills for any sort of development and you'd likely struggle significantly to go from there to a degree level.


Even within CS education programs, this has a profound impact.

You can't design a useful introductory computer science course that works well for everybody. A subset of your class has been dabbling with code since childhood, while another subset needs to start from the beginning.

Other degree programs don't seem to suffer nearly this extreme level of experience gap within their incoming students. What fraction of incoming mechanical engineering students have had the chance to build a working motor and then take it through a dozen revisions until it works the way they want? It has to be much less than the comparable fraction of CS students.

I think this is a big reason for the famous bimodal distribution of outcomes experienced by most introductory CS classes.


This is from the SO survey and I think it's a bit misleading. Only 13% of respondents say they are only self-taught as opposed to combining autodidacticism with some other method of learning.


Does self taught simply mean "doesn't have a CS degree?" I don't have a CS degree but I don't consider myself self taught because I didn't learn in a vacuum. I would say that 75% or more of what I know, I learned from great colleagues.


Even if you started out in school, you are going to need to be self-taught after a year anyway. What you learn in school is going to become obsolete quickly enough, and most workplaces are going to assume that you will magically be up to date at all times.


The weird thing is that I went to University for a computer science degree and took 2-3 programming courses where I used C and C++ but 95% or more of what I know I learned outside of University.

So does that make me self taught or not? If so, I wonder if that should be 4 out of 5 developers being self taught.



