An example is my upper-level elective class on database systems and design. For the final project of the class you were to take everything you'd learned about relational databases, design your own schema, get all of the data properly normalized, and make an app that uses it. Other than that, the sky was the limit. You could use whatever programming language, framework, and database you wanted, and the app could do literally anything. There was nothing limiting you to building a desktop Java app coupled with MySQL, yet 95%+ of the class did just that. I ventured off and used Postgres + Node.js and made a single-page web app for my project, using skills I'd learned outside of school, on my own time (I longed for a friend who knew what a promise was, or that Bluebird wasn't about a damn live animal).
Now to get to my point. I'm not going to say that self-taught developers are strictly better, but I feel like they are the other 5%, like myself, who see what we do as more than just a class, grade, job, or paycheck. They are the ones who spend the time to learn new and emerging things in the software development realm, and do it of their own accord. I would much rather work with someone who isn't an uncustomized product of the straight-from-Java CS-degree factory; someone who ventures outside the path their career and skill set guides them down, takes the time to learn on their own, and learns the things they want to learn just because.
I'm lucky enough to have my first job be somewhere where I get to dive right into Node.js, Angular, etc., which is what I want to do at this time. Maybe the other 95% feel this way about their first jobs, which are probably making internal business Java applications, but I really can't believe that.
Probably the same reason passionate musicians take offense when lazy garbage tunes make it to the charts:
Gauging the quality of music is something most people can't do, and the passionate musicians feel robbed because they're not compensated in proportion to their effort compared to those other, unpassionate musicians.
We need to require one year of programming to get the talented kids started early, then move all our "intro to CS" classes from the main syllabus to remedial college courses (you wouldn't call pre-algebra an "intro to physics" course). This would also help other professions. Engineering especially suffers from this problem: students take the required C or Java intro course, then have MATLAB dumped onto them the next semester. They muddle through and graduate with little skill in programming, despite the fact that their entire job will revolve around programming.
As for "one year of programming" [before college], what do you mean? There are already AP computer science courses for high school students. Most high schools have enough trouble getting students ready for college; no need to add comp sci when the basics are on such poor foundations. It is much more effective to provide a solid grounding in mathematics, writing, science, history, and literature in high school.
If you get an internship with one of the big companies, you will very likely have a great career in Huntsville. If you don't, then you'll have a much harder time. There are only around 180K EE jobs in the US, compared to a couple million CS jobs. The dropout rate for EE is huge and the pass rate for the FE/PE exams is somewhat low (you have to become certified after graduating). Companies must take a risk to hire new people because the barrier to entry is so high, but even then, they are fairly picky about experience. How much pickier will they be about experience when they have more options?
The argument about having trouble with the current school curriculum is a red herring. Just because schools are already failing and need an overhaul does not mean that the overhaul shouldn't include an introduction to coding in middle school (and for the record, they need a new curriculum, better teaching methods, and most importantly, parents who care enough about their kids to become involved in their day-to-day education).
I don't believe for a minute that everyone can code well enough to do it for a living any more than I believe everyone who takes English classes will make a living as an author. English is important for communication, math is important for calculating and thinking, and coding is important for focusing on creative logical reasoning (complementing math very nicely). Learning how to make a turtle go through an obstacle course will teach way more about logic than a thousand "word problems".
Most children will take such a course and walk away disliking coding, but hopefully with a little better logical ability. The rest will discover a subject they enjoy and increase their proficiency before college.
There's only so much students can absorb, and they need the basics FIRST. While computer programming studies are a nice enrichment for the students who are ready, it would be a waste of time if the kid can't do algebra and word problems by high school graduation.
Another goal would be to break the mystique around computing and show that it's accessible to everybody, with just a bit of informing. Too many people who didn't get into computing early on are too scared to touch anything.
I agree that proper rigor is best avoided at that age. At the same time, a little bit of real code (especially in a language like scheme where there's not a bunch of syntax to memorize) won't scare away the people who gravitate toward coding.
What we don't need to do (in my opinion) is attempt to appeal to the lowest common denominator. We need to find all the other good candidates rather than lying to the unqualified by saying "anyone can code". Give them a chance to fail or succeed and let nature take its course.
I think what is amazing about programming is that so many people are able to get jobs without degrees. There is a general consensus that self-taught developers can be as good as or better than formally taught ones. I haven't seen this in other fields until you get to senior leadership. Sure, areas like law and accounting require a formalized foundation that is hard to get outside of a degree, but the trades are an area with a lot of room for self-development, yet there is little respect for those without certificates. Sure, you can make the argument about insurance needs, but that just raises the question of why contract developers don't need to be bonded.
I know several people who learned to program on scavenged 386s and 486s in the late 90s and early 2000s. These could be had for free from some places (they were being thrown away) or cheaply from a garage sale. You could get a copy of Linux to run on a 386/486 from a local users group, or by downloading it slowly at the library and putting it onto floppies (assuming your begged/borrowed/scavenged machine didn't have a CD-ROM). A copy of Red Hat Linux, SuSE, Slackware, Debian, FreeBSD, or Mandrake had everything necessary to teach you Python, Perl, Bash, Makefile, Emacs Lisp, or Awk in 2000 (tutorials, compilers/interpreters, intermediate documentation). If you could get net access, a whole other world was opened up.
This was my path, but I've met many others who progressed similarly. Scavenged hardware, and an addiction to the power of it instead of to sports, dating, etc., can pull you out of the trailer parks, high rises, or hoods. (Not that you couldn't do multiple things, but if you're poor in the States, you probably had to work from the time you were legally able, which, combined with school, leaves little time for multiple hobbies.) Other trades don't have such a low barrier, which practically means they are limited to those who can progress through a college path.
I have yet to meet a good programmer that doesn't have at least one pet project developed outside of the office. Active development not required, just something that they put a lot of time/thought into. I ask them to tell me about it, platform used, problems encountered/resolved, things they've discovered. Fire in their eyes -> a big plus. A statement along the lines of "I had to stop because I dreamt about code" or they teach me something new -> double plus. Anyone who tells me "I don't program at home, once I leave the office the day is done" -> 'archived' immediately. 'School-coders' typically fall into the latter category.
PS I taught myself at ages 14-17 (Basic -> 6502 ASM -> Pascal -> C) then went to college (computer science BSc/MSc). Then the internet came about. Yes, I'm that old… No, I can't keep up with the latest and greatest either. Still have a pet project (or 10)
Frankly, I have never once had any need that could not be solved by free off-the-shelf software and/or a couple short scripts. I have absolutely zero desire to get home, having spent the day programming, and program some more.
I guess I would still fall into that category, because I developed and ran a website for some time (and forums that I sometimes hacked on a little), but I've never mentioned it in an interview, and really, looking back on it, I should have spent that time doing other, non-tech-related things.
Yet at every job I've been at, management considers me a "rock star" (ugh).
I think your criterion is likely (though of course not guaranteed) to get you good programmers, and it's not necessarily a bad thing to exclude the set of good programmers who don't meet it, but I find the idea that I should love programming so much that I dream about it and do it at home... unsettling. Nobody expects that from other professions. And I associate it generally with people who are willing to accept below-market pay (and therefore with companies that pay poorly) because they're happy just to get a job doing it.
That said, I do like programming, well enough. But what I really like is solving other people's problems. And what I love is my hobbies that have nothing to do with computers and are unfortunately impossible to monetize.
I don't. I use my limited spare time outside of work to study.
At the time I was 90% theory/learning (Uni) and 10% practical, so adding in a side project to increase the practical side was very valuable.
These days I'm 90% practical (work) and so adding extra L&D/theory is much more valuable to me than spending another hour coding.
I'd worked hard to get into a reputable university, and simply assumed that they would be teaching me all the stuff I needed to know. Anything that the wise professors (experts in their field!) chose to leave off the syllabus must be of lesser value, right?
Looking back now as the cynic I have become, this seems laughably, pathetically naive, and completely false. 18 year old me was pretty stupid.
If I ever get the chance to give advice to a young person I will be sure to impress on them that it's the stuff you do outside the course that really matters. The qualification just gives you credibility and gets your CV past HR. That cool side project you did is what you'll talk about in the interview. I don't know whether they'll listen, but I'll tell them anyway.
Some people (like you) seem to know this instinctively. I wish someone had told me!
I guess there are some analogs for different environments. Command line, do they know about man pages. General problems, do they google it first or ask for support. etc...
But my degree was never going to teach me Python. I had to decide to learn it myself.
Pretty sure we also learned Prolog in that class, but that we had instruction on.
90% of what I do daily on the job, and any practical/useful programming, is stuff I've taught myself over the years.
> Given a particular school a median CS major is typically always a better programmer/problem-solver than an equivalent Physics or Applied math major.
The highest-scoring undergraduate major among those who take the MCAT is Math/Stats. The second highest is Physical Sciences. Far and away in last place is Specialized Health Sciences, with Biology also lagging. The reason isn't that "Math majors are smarter"; it's that they are choosing to go into a field that isn't their directly assumed career path. A Math major HAS to stand out compared to the average CS major when going into a programming field, just as an English major HAS to stand out compared to a Finance major when going into finance. If I am interviewing a Math major and a CS major, chances are the Math major is "better" (whatever that contextually means), simply because they've already stood out above all the CS majors. All that being said, let's not pretend our undergraduate degrees are anything more than a rubber stamp.
Maybe at the school you studied at. Having taught and studied at Cornell (a highly ranked CS school) and Syracuse (ranked ~50th in the USA), I can say there is a huge gap in the difficulty and level of effort required in undergraduate courses. Ignoring it as if it's just a rubber stamp is a huge mistake. Honestly, while there were several good students at Syracuse, I can frankly attest that even an average CS major at Cornell is better than 90% of the students at Syracuse.
> there is a huge gap between difficulty & level of effort required in undergraduate courses
I'm not arguing the difference between the quality of education between schools. I'm commenting on its usefulness after the rubber stamp. I went to a top college. I still question whether or not $200k was worth it to buy my first job.
> I can frankly attest that even an average CS Major at Cornell is better than the 90% of students at Syracuse.
I can frankly attest that I don't give a shit when their resume ends up on my desk. The number of A students who end up with a B- in life is roughly the same as the number who go from B- to A. The A+ students end up in academics, which is the only place where that aspect actually matters. I care about practical utility, not a slip of paper. "Westchester Community College" ends up in the same pile as "Cornell" if they have 2+ years of work experience.
My father teaches at Cornell as well, but at Weill in NYC. In the last 4 years, he's accepted 1 doctor from Harvard Medical School (#1 in the country) to his fellowship program. He typically doesn't take any because he thinks they value the history of their education more than the usefulness of it in practical application. While I was growing up, EVERY year when reviewing his applicants, we would inevitably have a dinner conversation about "those entitled shit heads".
> Ignoring it as if its just a rubber stamp is a huge mistake.
Whatever floats your boat. However, I'd caution you to warn your students exactly the opposite.
The consequence is a lot of people who know nothing beyond basic algorithms and data structures, who try to code their own geographic databases using arrays and linked lists and then wonder why queries like "find the nearest neighbor to X" and "find all objects within this shape" take forever. The answer is that they used the wrong data structures. But convincing them of that is like pulling teeth. Also, according to several of them, C = C++, and every language gets compiled to C, so why would you use anything else...
Whether or not that's true is another discussion, but that's the point he's making.
I have a degree in CS & Math, so I'm guessing I'm good to go :)
Clearly there's a lot of interest in finding out what is true of developers in general: What languages do we prefer? What's our educational background? What are our demographics? It would be great to have a survey that would answer such questions. This one doesn't.
EDIT: many will only study once; why not study something new? Even if tuition is free, why have someone teach you topics you might already be very familiar with? I'm not saying you won't learn anything new in CS. You very much will learn something new. But if you choose a related field with little to no exposure yet, you will learn so much more.
Anecdotally the majority of non-CS grads I've worked with haven't been anywhere near as good as CS grads.
Are those entry-level jobs? For those I can see it. If you're hiring somebody with 5+ years of experience, though, only considering those with a CS degree is pretty short-sighted.
Theoretical CS is very important, but also quite academic. 99% of developers won't ever delve into that theory and will instead reach for a library based on that theory. The biggest issue in programming is managing large amounts of information that changes over time. This is not only completely avoided in most courses, it is also close to impossible to teach outside of gaining years of experience because every decision is a tradeoff and it takes time to build up an intuition for such things.
If you want to become a software engineer, get a CS or CpE degree and do whatever you want as a minor.
* My girlfriend got a Math degree from a good, but not Ivy League, school with a minor in Physics. It turns out that the market doesn't value a B.S. in Math as much as one would think. She ended up becoming a teacher, which she likes doing, but it's a field that's really hard to get out of without resetting your career.
* I went to school that had a cooperative education program. All of the engineering students were eligible to participate. Twice a year, the Co-op program had two days in which hiring managers from local companies (my school is in Hoboken, so local == NYC) would come to the school and interview people. We would put our names and majors on the lists that employers had, and they would select the people they wanted to speak with.
The CS and CpE students would always, without fail, get AT LEAST 7 interviews that day. Every other major would be lucky to get 2 or 3.
The Internet is my chance to widen my horizons, if I want to learn physics or math, I can and I will, but a Physics or Math degree might not help me as much professionally.
Is this true? Can I start looking right now? Granted, I am not in the valley. I'm all the way in New Jersey.
You'll have to do the math and weigh the cost/benefits of your degree.
Personally I think we're in a bit of a bubble/gravy train. The idea that programming will become the new automotive assembly line for self-taught labor is a bit of a stretch.
A degree can impart many benefits (math, core CS ideas, subject-matter classes such as speech theory, advertising, management, etc.). While many self-taught people can make enterprise apps with some direction, not many can really build something in depth. Boeing isn't going to let just anyone write aerospace firmware. So if you want to get a job quickly and milk this while it's hot, maybe you go for it?
I don't think it's clear cut, and I like having my degree. I'm good friends with fantastic developers without degrees who work alongside me.
In the long term, it depends on the competitive advantage you can get out of that time in school versus your peers' extra experience in the field, as well as on your goals.
Developers generally fit into two broad categories -- those who learned to program on their own at age 10 or so and those who decided to pursue a CS degree because the job pays well or they like playing video games. 7 years of experience beats 4 years of college and 3 years experience pretty much every time.
Unfortunately, a huge amount of CS degree time is spent dealing with those with no experience (We have remedial Math or English classes, why not remedial programming classes?). When I hit college (pursuing EE with CS on the side), I already had almost a decade of programming under my belt. I coasted through anything programming related (I'd already found and read books on the theoretical side of programming, so even those weren't that interesting).
At the end of the college road, the people with college only were mostly worthless. They may be able to parrot the big-O notation, but they couldn't tell if a function would be efficient. They could talk about design patterns, but they couldn't handle systems more complex than a couple files.
That's nothing against them, I (and others who had been programming from an early age) simply had a lot more experience in thinking in that way (and over time, most of them have become much better).
We need to introduce programming to all children at an early age. Not because everyone can code (that's not at all backed by any studies I've ever seen), but because lots of kids don't find out that they have a knack for programming until college (or not at all).
If most CS students had been programming since middle school, CS could drop a bunch of the remedial classes and focus on the finer parts of programming. Many so-called masters classes are within easy reach if you have some programming time under your belt. Companies would be a lot more willing to hire a dev out of college if they knew college was worth something. Until that happens, I don't think a degree is actually better.
The second way is easier, so I'll get it out of the way. "More education earlier" is an argument that formal programming education can be helpful. That wouldn't undermine the idea that a CS degree has importance on an individual basis, not as a general yes/no rule. We've agreed that formal programming education CAN help.
The second point is big for me, and it really bothers me to be honest (not about your post, I appreciate your post). It's this idea that the CS degree is this static cement thing; like we order it from Amazon. A degree is what you make it!
Your point about remedial classes, etc is spot on. It's exactly an argument that each person has to weigh the benefits they can get from the degree with the benefits from going right into the industry. If the person shows up at college and picks programming for the reasons you described; then you are probably correct in that it is not "better". However, college is a four year chance to build a competitive advantage. Statistics. Economics. Linguistics. Physics. Accounting. Art. Etc.
If someone chooses to make their degree a worthless money sink, that does not cancel out the person who spends four years learning to code, analyze speech, and working hard on the OpenROV team. Those are two completely different people and neither needs to put their degree on their resume. But only one can put speech analysis and OpenROV there; which they got as part of that degree. They also now have those contacts and team building experiences. Another example might be someone who spends the four years working part time as a contractor. "Got a degree while building industry experience" trounces "got a degree" AND "industry experience" (imho).
Right now my in-major GPA is above a 3.2 while my cumulative GPA is hovering around a 2.9; I just don't care about non-CS classes.
This would be fine if being at a university had any noticeable benefits, but for me it only has drawbacks:
- No one will hire me for a paid position
- I have to waste most of my time on classes I don't care about
- No more time for my side projects that I use to learn.
You could be affected by any number of things (wrong university, wrong goals, wrong order of operations, in a hurry). Please don't be offended, I just mean that you might leverage your resources better and find classes you care about (I ended up double majoring in CS and Economics, but I didn't care about econ until I tried it).
Here is a more plain and TLDR response. Pretend it's a new investment and write down/spell out what you are putting in and getting out. If it doesn't add up, don't make the investment.
Finally, not caring is a problem. For some arguing in favor of degrees, this is a big part of their point. Why don't you care about those classes? Do you think you'll only ever get cool and interesting projects at work? An employer needs to know you'll spend two weeks finding some ridiculous bug that doesn't even affect anything if that is what they assign. Every third Sunday, the system crashes. Fix it. While your peers do greenfield work. It means sometimes putting effort into things you aren't interested in as part of the bigger picture.
The more interesting work goes to programmers who can communicate. 90% of programmers can get the job done; I want someone who can articulate what he's built, how it can be improved, when it will be ready, what needs to be changed, what we're doing wrong, etc.
That's what all those classes you don't care about are for: giving you perspective outside heads-down coding that can be outsourced for $50/day.
You'd be amazed at how many people I have interviewed that simply either can't, or won't talk (from the interviewer's perspective they are the same thing!).
Again: most halfway decent programmers are adequate. The great ones are great not because they can write code, but because they can explain how that code works to someone else and, vice versa, they can understand someone who is trying to explain why the code doesn't do what they need. That is worth paying for.
The other day I finally cut a guy off and said, "Yes, sure, you aren't good with terms and explanations. How do you have technical discussions with other developers? What do you do in code reviews?" Deer, meet headlights.
I also realize your point is beyond technical communication, and you are right on that too.
Other classes folded back into CS or working for a business as well. For example, my psychology course was helpful in a number of areas, including as preparation for an AI elective I took.
We also have a requirement for attending a philosophy class about ethics in computing or something.
Somewhere in there I took a few actual EE courses.
Guess which ones had the most impact on my career? Hint: it ain't the EE stuff (although I worked as an EE for 6 years after graduation).
Computer science is only focused on logic and algorithms with state.
It's not really math.
But in the end I DON'T mind math classes; I like math classes. I mind stupid English, Lit, and Phil classes.
I'll probably go for a math minor. I like it. But I'm not going for an English minor.
One point people make is the best programmers without degrees are better than the worst programmers with degrees. This is true but does not mean much.
I took courses in Java and C++, and self-taught programmers tend to know as much as I learned in those classes. What they tend to skip is the theoretical foundations. Calculus, graph theory, theory of computation, mu-recursive functions.
I took two classes (one an elective) that dealt with mutual exclusion, race conditions, critical sections, etc. I spent months studying them, writing complex homework assignments about them, being quizzed on them, and so on. This came to be helpful, sometimes very helpful, for me. Obviously people are not learning about this, as I run across race conditions in code far too often, and I have heard others say the same. I had to sit down and learn to deal with this over some months, and some programmers never do.
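The race-condition point fits in a few lines of Python (my own toy example, not anything from a course). The increment below is a read-modify-write, so without mutual exclusion two threads can load the same value and one update is silently lost; the lock makes the critical section safe:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    # "counter += 1" compiles to load / add / store: a read-modify-write.
    # Without the lock, two threads can load the same value and one of
    # the two increments is lost. The lock enforces mutual exclusion.
    global counter
    for _ in range(n):
        with lock:  # critical section
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000, deterministically, because of the lock
```

Delete the `with lock:` line and the final count can come up short, and worse, only sometimes, which is exactly why these bugs are so hard to find in code review.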
I don't understand the comments from people who say they already knew what was taught in class so it was a waste. Class is one fourth of the time, then you're supposed to spend three hours studying for each hour in class. So if you know 100% of what was taught in class you just saved yourself three hours to do something else. I was a Unix sysadmin for a long time and yes, the class where we learned "ls", "cd", "pwd", "chmod" etc. was not that educational. But the next class our homework was to study the process scheduler for Mac, Windows and Linux. I had always put off doing this, so I finally sat down and studied how Linux's completely fair scheduler worked and learned a lot. I learned what actually happened in full when an add instruction came in for two registers on a processor, and what logic gates they would go through. So even what I specialized in was filled out.
Another thing - often it is not what you know you don't know, but what you don't know you don't know. If I know a heap is a good data structure to quickly find the highest number in a list, then I can learn about heaps. If I never heard of a heap I might use a slower method. Or a real world example - in class I learn about Goedel numbers. Then a year later at work I have to take a (short) list of (small) numbers and make them into a hash. But how? Then I remember Goedel numbers. Again, it's not what you know you don't know, it's what you don't know you don't know. To learn by yourself, you have to know what to look for, and you often don't.
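Both examples above can be shown in a few lines of Python (my own toy code, not anything from the original class): `heapq` finds the k largest items without a full sort, and a Gödel-style encoding packs a short list of small non-negative integers into one number as the exponents of successive primes.

```python
import heapq

PRIMES = (2, 3, 5, 7, 11, 13, 17, 19)

def goedel_encode(nums):
    # The i-th element of the list becomes the exponent of the i-th prime.
    code = 1
    for p, n in zip(PRIMES, nums):
        code *= p ** n
    return code

def goedel_decode(code, length):
    # Recover each element by counting how many times its prime divides the code.
    out = []
    for p in PRIMES[:length]:
        n = 0
        while code % p == 0:
            code //= p
            n += 1
        out.append(n)
    return out

data = [5, 1, 9, 3, 7]
print(heapq.nlargest(2, data))   # [9, 7] in O(n log k), no full sort
print(goedel_encode([1, 2, 3]))  # 2**1 * 3**2 * 5**3 = 2250
print(goedel_decode(2250, 3))    # [1, 2, 3]
```

Nobody would ship the Gödel trick as a production hash, but it is exactly the kind of tool you can only reach for if you already know it exists, which is the point.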
As I said, I worked a while as a Unix sysadmin. I went back to finish my BSCS. I'm not rationalizing a decision I made at 18 by talking it up; I realized how important it was to filling in gaps of knowledge, as well as to getting a job, especially when one is needed. A lot of people reading this are too young to have been working when most of the startups and dot-coms folded in 2000, and the few IT positions at traditional companies were flooded with applicants, many with college degrees. Things can turn on a dime, and having 4 years of experience in RoR or Node.js can quickly go from meaning a $120k job to meaning next to nothing.
If I am paying money for a class, then shouldn't I be taught the material? Not do something I could do in a free two-year stint with a computer.
As I said, most recommend each hour of lecture time to be followed by three hours of self-study.
Also, if someone is going to do those three years of non-class study (after-class, in my case) so they can have the equivalent of my non-class study, why not just do the extra year of in-class work and get the diploma? The attitude perplexes me: "I will study three years like you, but won't get a diploma that will help in employment."
No one is forcing me to go to class. But if I am taking a class on theory of computation, with quizzes etc., then it makes sense that that will be the months in which I read about pushdown automata outside of class. So after I choose which class to take in a semester, I am in a sense "forced" to study that topic outside class for the next four months. I can also e-mail my professor or classmates or wait after class or go to office hours if something confuses me.
2) You will have gaps because you missed things
3) You won't retain the knowledge as well because you won't be tested on it.
4) If you're someone who isn't good at math, good luck motivating yourself to study real analysis, optimization theory, statistical learning theory, etc.
It isn't every middle schooler who reads a book on Perl for leisure.
Most business degrees can be useful, because you understand the business much better. Accounting actually turns out to be useful for distributed systems.
Edit: It wouldn't have occurred to me, but I think Anthropology would be very helpful in understanding why systems were put together the way they are.
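One way to read the accounting remark, as a toy sketch (the names and the whole example are mine): double-entry bookkeeping records every transfer twice, so the ledger carries an invariant you can audit after the fact. Distributed systems borrow the same idea when they reconcile replicas or payment logs instead of trusting that every node applied every update:

```python
# A toy double-entry ledger: every transfer is two balancing entries,
# so the sum over all entries is always zero. A broken invariant tells
# you an update was lost or half-applied, which is the same style of
# after-the-fact reconciliation used in distributed systems.

ledger = []  # list of (account, amount) entries

def transfer(src, dst, amount):
    ledger.append((src, -amount))  # debit the source
    ledger.append((dst, +amount))  # credit the destination

transfer("alice", "bob", 30)
transfer("bob", "carol", 10)

balances = {}
for account, amount in ledger:
    balances[account] = balances.get(account, 0) + amount

assert sum(balances.values()) == 0  # the books balance
print(balances)  # {'alice': -30, 'bob': 20, 'carol': 10}
```

The design choice worth noticing is that the invariant lives in the data, not in the code path that wrote it, so it survives crashes, retries, and partial failures.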
In fairness, I am a developer for an anthropological research agency at a large university. That is a plus.
I'll take a look at that book. Thanks.
I've had zero issues finding work in computing. If you're also not having issues, the advice is: do nothing, and continue to ignore the line in any JD that says "BS in CompSci required." It's almost certainly not.
If you are experiencing issues getting interviews, consider leaving your degree off or explicitly highlighting your self-taught nature and work experience.
Anyone who was reading a Perl book in middle school for pleasure and has stuck with computing since then is almost surely a strong candidate, at least way more than sufficiently strong.
I was depressed for a long time that I couldn't study CS. Then I realized, I can learn what I want freely. I have an entire life dedicated to computers.
Although I secretly dream about inventing the next facebook, I mostly embraced the fact that I won't be able to get a formal education and a job in the field.
Tinkering with computers is part of who I am.
The rest is not in my hands.
A bit of background: I have been coding since an early age and hold an arts degree. I had no serious experience in the industry a few years ago, but had work experience in other fields and I had many independent projects to showcase. I have been gainfully employed in software for the last few years as I approached 30 myself. Now I feel somewhat established enough that the arts/self-taught background is not as big of a deal as it seemed when I was just trying to break into the field. They even have me performing technical interviews now for some reason, which I'm mentioning to show that I've now been on both sides of the table.
My advice to you is to build your resume around the skills and experience that may be unconventional but are as relevant to the job you want as possible. Where most people would have job experience or education front and center, I placed a rundown of projects that detailed what techs I used to accomplish what ends and what results I got out of each project. These may have been even unreleased, work in progress personal projects (clearly stated as such), not for profit projects or websites or communities that I had a technical hand in creating, open source software contributions, and so on. Of course, if you don't have any projects to show at all then you should work on that before trying to get a job. But I assume you have some personal stuff, even incomplete and unseen by anyone, that you can showcase to demonstrate your skills. Links to working demos are a major plus, and you should always prefer concise descriptive text over just namedrop lists.
I list work experience where I think it is relevant to the job in software. Any previous office experience, for example, can at least show that you know how to work on a project, meet deadlines, act professionally (arguable), etc. Highlight the transferable skills of your previous employment. Likewise, highlight the transferable skills of your degree. It often became a point of intrigue that I had this esoteric degree after we've talked in-depth about programming for an hour -- so don't think of it as a weakness. In your resume, stay focused on the goal, which is to get a job in software. Do not use a generalist resume as your software resume.
Finally, pound the pavement. Do not wait. Tweak your resume and send it out often. I lost track of the number of companies that I've actually applied to, but in getting my first job I had in-person interviews (sometimes multiple rounds) at over 10 companies. I flat-out bombed some interviews. It didn't click in others. And sometimes it seemed like I was going to get the job, but with my untested background it was too 'risky' or I was too 'junior'. Don't let this stop you from going to the next interview. Embrace rejection and use it as an opportunity to learn. Try to strike a balance between quantity and quality in the jobs you apply for. Be directed and intentional in that you only apply to jobs that actually interest you and that you actually think you can do (so no "Sr. Dev", of course), but don't be so focused on one or two companies that you just sit around watching their careers page for an opening. Find a middle ground somewhere between "The One True Dream Job" and loading 100,000 copies of your resume into a biplane and dumping them over the city. In my case, beyond the usual job sites, I got a list of all the tech companies in my city in some local tech magazine's annual hottest tech company edition (or whatever it was) and went through the list applying to every single one that seemed like it would fit. And that got me my first job in tech, which has been utterly game-changing.
Others have already mentioned that you need to practice that awful hurdle that is "the coding interview". So be sure to do that as well, and take every interview as a learning experience. Hope this helps.
I've heard this advice a lot but that hasn't been my experience. I stay in contact with quite a few of my co-workers from the past but I haven't stayed in contact with anyone from my time at the University. I was working and providing for a family while attending school so I didn't really have a lot of time to do anything other than the difficult CS assignments.
University is also a great time to have fun, develop relationships, and become a proto-adult. I'm not sure that 4 years of "head start" (by skipping college) is a good decision for the average 18-year old to make. By all means, if you have a great idea or a compelling thing that you're running "towards", that's fine, give it a go. But the default "go to a good school and get a degree in engineering, math, or comp-sci" is pretty sound advice for most. I had no idea (more precisely: the wrong idea) what I wanted to do with my life when I was 17. College gave me time to figure more of that out and I don't feel like I lost out, in fact quite the opposite.
One day after lunch I get off the elevator and take a look around the huge room. I could probably see 100 folks or so.
There were a dozen different nationalities, people of all ages and genders. There were extremely smart guys who didn't have a degree. There were extremely smart guys who had PhDs in things like particle physics. Here I was, a self-taught guy, leading a team of 30. I had 3 PhDs working for me. I knew more than one person with double degrees in a foreign country who came here for a better life.
And it didn't matter. All that mattered was whether you got along with people, what kind of attitude you had, and whether or not you could push through and solve problems for folks.
I think this was the moment that I decided that I love this industry.
By the time he was 20, he was out of school and working fast food. Programming on various projects in his free time. Making minimum wage.
I begged him to start looking for programming work, but he always told me that he wasn't qualified. How could he compete in the job market with all those professional coders?
Finally he tried. Of course, he got a job -- at a startup. He ended up being the go-to guy for both coding and infrastructure.
He's done a lot of things since then, but I'll always remember him looking at me, rolling his eyes, and telling me that what I was saying was impossible.
Yes, there's a huge role for luck, for having good parents, for being born in the right country, and for networking skills. But this is still an industry where if you love it, you can make terrific money just by being passionate about it.
Most developers here are self-taught?
Nope, we're all self-taught. Though in this case you're faced with a survey that lists "self-taught" as one option next to others that include formal schooling.
Most developers aren't looking?
This may be the point the government doesn't understand about tech jobs and immigration. I don't know what it's like to be looking for a job in the U.S. these days, but I imagine most people who are decent at programming aren't looking. If you want a bunch of coders you need to get them fresh out of university or start looking abroad. The thousands of resumes going out to development job openings from the unemployed must be from crazy people who can't code.
People finding jobs from others they know?
This sort of goes along with developers not looking. If nobody is looking, then how do you find people to work for you? Get your current employees to hit their Rolodexes. Never mind all that stuff about degree requirements, etc. In my experience, the requirements go into the listing and then you never hear about them again. I imagine that's because the listing attracts the crazies, and then you get the gig when you sound like you halfway know what you're talking about.
What happens when you need more thinkers than society has to offer?
One angle I haven't seen raised here is that schooling was far behind the times, at least through the mid-90s. I was set to graduate from a highly regarded prep school in 1996, and in the "advanced computer class" we were learning Pascal. My questions about the internet weren't answered well enough to keep me interested in the conversation.
By then I'd built a couple of silly websites for myself, met hundreds of people from around the world, and had my own little secret educational source -- a step up from my peers in school, which I needed because they were all definitely smarter than me. I was hooked and had zero interest in plain old desktop applications, which was the end-game of what was being taught in every school I looked at (from my 17-year-old perspective).
A book on Perl understood what I was after. It wasn't even necessarily a very good book on Perl. It had an open source web-store on a CD in the cover and it told me step-by-step how to find a web host and then set up the web-store on a server. I set up a web store for my mom's retail business, which then stayed afloat for a little while longer (she now sells online full-time).
I proceeded to drop out of college and haven't stopped learning since.
Most (good) developers are definitely at least partially self-taught, but those employed in the industry also tend to have a degree: not necessarily in CS, but at least in a related field.
It pays to be a good programmer who also has a degree.
There are very few CS courses I took at university that prepared me for the real world. I'm sure certain fields like AI, DB algorithms, math/science applications, and compilers benefit from formal academic training, but for the most part being logical and a good problem solver are the most important skills.
Of the best developers I've worked with/know, only 1 has a CS degree. One has a liberal arts degree IIRC (works at Netflix), another is a college dropout (works at Google)...and the one with a CS degree has a tendency to overengineer systems with great flaws that the others I know wouldn't do (tries to be too clever). Myself, I have a MS in math from a top 15 program (PhD dropout).
None of that has been a factor in whether a developer is good or not. What matters is whether they give up when they "hit a brick wall". The best developers ask for help when stuck and help others when they're stuck.
That said, I have a huge amount of respect for CS grads. I have seen it. My background in algorithms is totally non-existent, beyond what I've been able to pick up on my own. I've been at a disadvantage MANY times due to my lack of formal CS/math training (my "formal" math education stopped at plane geometry in high school!). Thankfully, I've always had good friends to help me through these issues, but it would have been a lot easier if I had had real training. Google is a huge help now that we no longer need programming manuals, as such.
I have always gravitated towards the visual, GUI aspect of software development, which is probably not a surprise given my art school background. I really think that companies should keep open minds regarding education. It really takes all kinds of people to do what we do, especially at big, diverse companies. Being visually oriented and a capable programmer is a unique kind of background that can be used to great effect. Not all cookies are the same shape.
Aside from occasional prodigies, the folks who are best at teaching themselves new things are folks who proved they could do it through degree programs.
Don't get me wrong, though: a lot of people do not actually make good use of their university time, and they get through to graduation without learning much about self-teaching. I'll never understand why they would want to waste so much money on that.
If I had my choice when recruiting, I'd select people in this order:
1. Someone whose degree and job experience clearly shows they are thoughtful and can self-teach.
2. Someone who acquired abilities by self-teaching even without a degree and/or prior experience.
3. Someone who has a degree and/or experience, but who clearly doesn't have much skill to self-teach.
4. Someone who does not have a degree, experience, or self-teaching ability.
Really, I'd prefer to never hire someone from groups 3 or 4. But sometimes it can be hard to detect fakers from group 3.
I think a lot of people share this opinion, which is somewhat tragic since very few hiring processes make even the faintest attempt to determine if a candidate is good at learning new things or self-teaching. Instead, just as with the tired old thread about HackerRank from yesterday, we spend all our time quizzing people on rote memorization of standard examples, which is something that the group 3 people are very good at faking their way through.
Programming is a job that requires constant learning. If a programmer has what it takes to do that, then it's not too surprising that the programmer can learn the basics on their own. A lot of devs like myself started learning to program around 10 or so. When college enters the picture, these devs are bored for most of the classes (except perhaps things like algorithms or compiler classes).
If a would-be programmer had the foresight to look into the profession, they'd probably note that there's more profit in skipping the degree and putting that $50K toward a mortgage instead. CS has more free teaching material online than any other white-collar profession I know of. A programmer can get essentially the same education quite easily if desired (you can't really say the same about other STEM fields, except perhaps math).
As a result you basically had to teach yourself, since there was no way your school education would teach useful skills for any sort of development, and you'd likely struggle significantly to bridge the gap from there to degree-level material.
You can't design a useful introductory computer science course that works well for everybody. A subset of your class has been dabbling with code since childhood, while another subset needs to start from the beginning.
Other degree programs don't seem to suffer nearly this extreme level of experience gap within their incoming students. What fraction of incoming mechanical engineering students have had the chance to build a working motor and then take it through a dozen revisions until it works the way they want? It has to be much less than the comparable fraction of CS students.
I think this is a big reason for the famous bimodal distribution of outcomes experienced by most introductory CS classes.
So does that make me self taught or not? If so, I wonder if that should be 4 out of 5 developers being self taught.