
Two out of three developers are self-taught, and other trends from a survey - raddad
http://qz.com/649409/two-out-of-three-developers-are-self-taught-and-other-trends-from-a-survey-of-56033-developers/
======
JoshGlazebrook
I started programming when I was 10 or 11 and continued on to get a CS degree.
One thing I did notice was that a lot of people in my classes seemed to have
just woken up one day and decided they wanted a degree in computer science.
Throughout my years in school, those were the people who never ventured
outside the curriculum (which was Java-based) to explore and learn other
things, which seems to set you up for a life of boring corporate Java desktop
app development (unless you like that kind of work).

An example was my upper-level elective on database systems and design. For
the final project, you were to take everything you'd learned about relational
databases, design your own schema, get all of the data properly normalized,
and build an app that used it. Beyond that, the sky was the limit. You could
use whatever programming language, framework, and database you wanted, and
the app could do literally anything. Nothing limited you to a desktop Java
app coupled with MySQL, yet 95%+ of the class did just that. I ventured off
and used Postgres + Node.js to make a single-page web app for my project,
using skills I had learned outside of school on my own time (I longed for a
friend who knew what a promise was, or that Bluebird wasn't about a damn live
animal).
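For anyone who, like those classmates, hasn't met a promise: a minimal sketch
in plain Node.js. (`fetchUser` here is a made-up stand-in for a real database
query; Bluebird was a popular third-party promise library before promises
landed natively in JavaScript.)

```javascript
// fetchUser is a hypothetical stand-in for an async database query.
function fetchUser(id) {
  // A promise represents a value that will be available later.
  return Promise.resolve({ id, name: "ada" });
}

// .then() runs once the promise resolves; .catch() handles failures.
fetchUser(1)
  .then((user) => console.log(user.name)) // prints "ada"
  .catch((err) => console.error(err));
```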

Now to get to my point. I'm not going to say that self-taught developers are
strictly better, but I feel like they are the other 5%, like myself, who see
what we do as more than just a class, grade, job, or paycheck. They are the
ones who spend the time to learn new and emerging things in software
development, and do it of their own accord. I would much rather work with
someone who isn't fresh off the Java CS-degree factory line: someone who
ventures outside the path their career and skill set guide them down, takes
the time to learn on their own, and learns the things they want to learn just
because.

I'm lucky enough that my first job is somewhere I get to dive right into
Node.js, Angular, etc., which is what I want to do at this time. Maybe the
other 95% feel this way about their first jobs, which are probably building
internal business Java applications, but I really can't believe that.

~~~
notlisted
Too busy to read all comments, apologies if it's been stated elsewhere, but a
related remark: Whenever I interview a programmer, I ask them about their pet
project at home.

I have yet to meet a _good_ programmer that doesn't have at least one pet
project developed outside of the office. Active development not required, just
something that they put a lot of time/thought into. I ask them to tell me
about it, platform used, problems encountered/resolved, things they've
discovered. Fire in their eyes -> a big plus. A statement along the lines of
"I had to stop because I dreamt about code" or they teach me something new ->
double plus. Anyone who tells me "I don't program at home, once I leave the
office the day is done" -> 'archived' immediately. 'School-coders' typically
fall into the latter category.

PS I taught myself at ages 14-17 (Basic -> 6502 ASM -> Pascal -> C) then went
to college (computer science BSc/MSc). Then the internet came about. Yes, I'm
that old… No, I can't keep up with the latest and greatest either. Still have
a pet project (or 10)

~~~
sheepmullet
> I have yet to meet a good programmer that doesn't have at least one pet
> project developed outside of the office.

I don't. I use my limited spare time outside of work to study.

~~~
notlisted
But you did… at one point? (note that I wrote: "Active development not
required").

~~~
sheepmullet
Last time I had a side project was at uni. So a bit over a decade ago.

At the time I was 90% theory/learning (Uni) and 10% practical, so adding in a
side project to increase the practical side was very valuable.

These days I'm 90% practical (work) and so adding extra L&D/theory is much
more valuable to me than spending another hour coding.

------
metaphorm
I would say that 3 out of 3 developers are self-taught, but that about one
third of them also have a degree in Computer Science.

~~~
pklausler
Or mathematics, or physics, or astronomy; and I'd rather hire any of those
than a modern CS graduate, everything else being equal.

~~~
aub3bhat
Having taught and graded students with backgrounds ranging from undergraduate
degrees in CS and ECE to math, and from IIT and MIT to Duke, I can assure you
that this line of reasoning is probably driven by an inferiority complex about
your own CS education, or by math envy. At a given school, the median CS major
is almost always a better programmer and problem-solver than an equivalent
physics or applied math major.

~~~
lloyd-christmas
Self Disclosure: I was a Math/Econ major with a minor in Applied Statistics
(almost a decade ago).

> Given a particular school a median CS major is typically always a better
> programmer/problem-solver than an equivalent Physics or Applied math major.

The highest-scoring undergraduate major among those who take the MCAT is
Math/Stats. The second highest is Physical Sciences. Far and away in last
place is Specialized Health Sciences, with Biology also lagging[1]. The reason isn't
because "Math majors are smarter", it's because they are choosing to go into a
field that isn't the directly assumed career path. A Math major HAS to stand
out compared to the average CS major when going into a programming field, just
as an English major HAS to stand out compared to a Finance major when going
into finance. If I am interviewing a Math major and a CS major, chances are
the Math major is "better" (whatever that contextually means), simply because
they've already stood out above all the CS majors. All that being said, let's
not pretend our undergraduate degrees are anything more than a rubber stamp.

[1]
[https://www.aamc.org/download/321496/data/factstablea17.pdf](https://www.aamc.org/download/321496/data/factstablea17.pdf)

~~~
aub3bhat
> let's not pretend our undergraduate degrees are anything more than a rubber
> stamp.

Maybe at the school you studied at. Having taught and studied at both Cornell
(a highly ranked CS school) and Syracuse (ranked ~50th in the US), I can say
there is a huge gap in the difficulty and level of effort required in
undergraduate courses. Dismissing that as just a rubber stamp is a huge
mistake. Honestly, while there were several good students at Syracuse, I can
frankly attest that even an average CS major at Cornell is better than 90% of
the students at Syracuse.

~~~
lloyd-christmas
> May be at the school you studied at.

Mature.

> there is a huge gap between difficulty & level of effort required in
> undergraduate courses

I'm not arguing the difference between the quality of education between
schools. I'm commenting on its usefulness after the rubber stamp. I went to a
top college. I still question whether or not $200k was worth it to buy my
first job.

> I can frankly attest that even an average CS Major at Cornell is better than
> the 90% of students at Syracuse.

I can frankly attest that I don't give a shit when their resume ends up on my
desk. The number of _A_ students who end up with a _B-_ in life is roughly
the same as the number who went from _B-_ to _A_. The _A+_ students end up in
academia, which is the only place where that aspect actually matters. I care
about practical utility, not a slip of paper. "Westchester Community College"
ends up in the same pile as "Cornell" if they have 2+ years of work experience.

My father teaches at Cornell as well, but at Weill in NYC. In the last 4
years, he's accepted 1 doctor from Harvard Medical School (#1 in the country)
to his fellowship program. He typically doesn't take any because he thinks
they value the history of their education more than the usefulness of it in
practical application. While I was growing up, EVERY year when reviewing his
applicants, we would inevitably have a dinner conversation about "those
entitled shit heads".

> Ignoring it as if its just a rubber stamp is a huge mistake.

Whatever floats your boat. However, I'd caution you to warn your students
exactly the opposite.

------
ksenzee
Two out of three developers _who answered a survey on a self-teaching site_
are self-taught. I see this is yet another article assuming the SO survey is
representative of developers in general. It is not. It is representative of
people who use Stack Overflow heavily enough to see and click on a survey
link.

Clearly there's a lot of interest in finding out what is true of developers in
general: What languages do we prefer? What's our educational background? What
are our demographics? It would be great to have a survey that would answer
such questions. This one doesn't.

------
city41
I have a CS degree and still consider myself self taught. I started
programming as a kid, and learned mostly from books and coding. I only got my
degree because I thought it'd be necessary for getting a job.

~~~
gravypod
I'm in the same boat right now. I have been programming since 12 and I'm
getting a degree because I don't think I'll get a job without one (or more so
my parents think that).

Is this true? Can I start looking right now? Granted, I am not in the valley.
I'm all the way in New Jersey.

~~~
georgefrick
I'm not sure I would so quickly listen to the HN trends. There are a lot of
people on HN who are younger and they like the idea that this would be a
trend. You are allowing them to define the competitive advantage.

You'll have to do the math and weigh the cost/benefits of your degree.

Personally I think we're in a bit of a bubble/gravy train. The idea that
programming will become the new automotive assembly line for self-taught labor
is a bit of a stretch.

A degree can impart many benefits (math, core CS ideas, subject-matter classes
such as speech theory, advertising, management, etc.). While many self-taught
people can build enterprise apps given some direction, not many can really
build something in depth. Boeing isn't going to let just anyone write
aerospace firmware. So if you want to get a job quickly and milk this while
it's hot, maybe you go for it? I don't think it's clear-cut, and I like
having my degree. I'm good friends with fantastic developers without degrees
who work alongside me.

In the long term, it depends on the competitive advantage you can get out of
that time in school versus your peers' extra experience in the field, as well
as your goals.

~~~
hajile
Is a degree objectively better?

Developers generally fit into two broad categories -- those who learned to
program on their own at age 10 or so and those who decided to pursue a CS
degree because the job pays well or they like playing video games. 7 years of
experience beats 4 years of college and 3 years experience pretty much every
time.

Unfortunately, a huge amount of CS degree time is spent dealing with those
with no experience (We have remedial Math or English classes, why not remedial
programming classes?). When I hit college (pursuing EE with CS on the side), I
already had almost a decade of programming under my belt. I coasted through
anything programming related (I'd already found and read books on the
theoretical side of programming, so even those weren't that interesting).

At the end of the college road, the people with college only were mostly
worthless. They may be able to parrot the big-O notation, but they couldn't
tell if a function would be efficient. They could talk about design patterns,
but they couldn't handle systems more complex than a couple files. That's
nothing against them, I (and others who had been programming from an early
age) simply had a lot more experience in thinking in that way (and over time,
most of them have become much better).

We need to introduce programming to all children at an early age. Not because
everyone can code (that's not at all backed by any studies I've ever seen),
but because lots of kids don't find out that they have a knack for programming
until college (or not at all).

If most CS students had been programming since middle school, CS could drop a
bunch of the remedial classes and focus on the finer parts of programming.
Many so-called masters classes are within easy reach if you have some
programming time under your belt. Companies would be a lot more willing to
hire a dev out of college if they knew college was worth something. Until that
happens, I don't think a degree is actually better.

~~~
georgefrick
I'm actually confused why you seem to be arguing against my point, when I was
stating that it's a situation unique to each individual. Do you really think a
degree is an objective yes/no answer that applies universally? What if someone
wants to work on deep-sea exploration robotics? Yes/no is an
oversimplification, and your post actually helps point that out in two ways.

The second way is easier, so I'll get it out of the way first. More education
earlier is an argument that formal programming education can be helpful. That
doesn't undermine the idea that a CS degree has importance on an individual
basis rather than as a general yes/no rule. We've agreed that formal
programming education CAN help.

The first point is big for me, and it really bothers me, to be honest (not
your post; I appreciate your post). It's this idea that a CS degree is some
static, cemented thing, like we order it from Amazon. _A degree is what you
make it!_

Your point about remedial classes is spot on. It's exactly an argument that
each person has to weigh the benefits they can get from the degree against
the benefits of going straight into the industry. If a person shows up at
college and picks programming for the reasons you described, then you are
probably correct that it is not "better". However, college is a four-year
chance to build a competitive advantage. Statistics. Economics. Linguistics.
Physics. Accounting. Art. Etc.

If someone chooses to make their degree a worthless money sink, that does not
cancel out the person who spends four years learning to code, analyzing
speech, and working hard on the OpenROV team. Those are two completely
different people, and neither needs to put their degree on their resume. But
only one can put speech analysis and OpenROV there, which they got as part of
that degree. They also now have those contacts and team-building experiences.
Another example might be someone who spends the four years working part-time
as a contractor. "Got a degree while building industry experience" trounces
"got a degree" AND "industry experience" (imho).

A degree is something you pay for; it's an investment and should be treated
as such (in the scope of our discussion). While the underlying argument may
be whether an employer cares, I honestly think employers look at it as a
signal that there might be something more to you than some Java/JavaScript
syntax.

~~~
gravypod
My main problem with attending my university is that a lot of my time is
spent on classes I can't bring myself to care about.

Right now my in-major GPA is above 3.2 while my cumulative GPA is hovering
around 2.9; I just don't care about non-CS classes.

This would be fine if being at a university had any noticeable benefits, but
it only has drawbacks:

- No one will hire me for a paid position
- I have to waste most of my time on classes I don't care about
- No more time for the side projects that I use to learn

~~~
HeyLaughingBoy
If you think of it as wasting time, then your attitude will come out in your
work. All those classes you don't care about are there because university
isn't trade school.

The more interesting work goes to programmers who can _communicate_. 90% of
programmers can get the job done; I want someone who can articulate what he's
built, how it can be improved, when it will be ready, what needs to be
changed, what we're doing wrong, etc.

That's what all those classes you don't care about are for: giving you
perspective outside heads-down coding that can be outsourced for $50/day.

You'd be amazed at how many people I have interviewed who simply either can't
or won't talk (from the interviewer's perspective, they are the same thing!).

Again: most halfway decent programmers are adequate. The great ones are great
not because they can write code, but because they can explain how that code
works to someone else and, vice versa, they can understand someone who is
trying to explain why the code doesn't do what they need. That is worth paying
for.

~~~
georgefrick
It actually bugs me every once in a while when I do an interview and I know
the guy can code but he just can't articulate anything. It's like he's locked
in there and he would be fine if I slipped tasks under his door in an
envelope. But to your point it just doesn't work that way and those people get
turned away not because they didn't have the technical skills, but because
they demonstrated they couldn't be effective beyond the IDE.

The other day I finally cut a guy off and said, "Yes, sure, you aren't good
with terms and explanations. How do you have technical discussions with other
developers? What do you do in code reviews?" Deer, meet headlights.

I also realize your point is beyond technical communication, and you are right
on that too.

------
DanielBMarkham
I remember sometime in the early 90s I was working at my first really huge
contract -- hundreds of programmers.

One day after lunch I get off the elevator and take a look around the huge
room. I could probably see 100 folks or so.

There were a dozen different nationalities, people of all ages and genders.
There were extremely smart guys who didn't have a degree. There were extremely
smart guys who had PhDs in things like particle physics. Here I was, a self-
taught guy, leading a team of 30. I had 3 PhDs working for me. I knew more
than one person with double degrees in a foreign country who came here for a
better life.

And it didn't matter. All that mattered was whether you got along with people,
what kind of attitude you had, and whether or not you could push through and
solve problems for folks.

I think this was the moment that I decided that I love this industry.

~~~
cmdrfred
It's the only industry in America that to me reflects the American dream. I'm
a high school drop out and so far nobody has even asked about my education.
They check out my portfolio of work and go on that. Results are results,
everything else is just window dressing.

~~~
andersen1488
Same here. I was scrubbing toilets for $8/hr two years ago, and now I work
with PhDs and MBAs making six figures. God bless America.

~~~
DanielBMarkham
My oldest son grew up fixing computers and programming them, but he wasn't so
good at structured education.

By the time he was 20, he was out of school and working fast food. Programming
on various projects in his free time. Making minimum wage.

I begged him to start looking for programming work, but he always told me that
he wasn't qualified. How could he compete in the job market with all those
_professional_ coders?

Finally he tried. Of course, he got a job -- at a startup. He ended up being
the go-to guy for both coding and infrastructure.

He's done a lot of things since then, but I'll always remember him looking at
me, rolling his eyes, and telling me that what I was saying was impossible.

Yes, there's a huge role for luck, for having good parents, for being born in
the right country, and for networking skills. But this is still an industry
where if you love it, you can make terrific money just by being passionate
about it.

------
gexla
Some interesting take-aways.

Most developers here are self-taught?

Nope, we're all self-taught. Though in this case you are faced with a survey
offering an option of "self-taught" next to others that include school.

Most developers aren't looking?

This may be the point the government doesn't understand about tech jobs and
immigration. I don't know what it's like to be looking for a job in the U.S.
these days, but I imagine most people who are decent at programming aren't
looking. If you want a bunch of coders, you need to get them fresh out of
university or start looking abroad. The thousands of resumes going out to
development job openings from the unemployed must be from crazy people who
can't code.

People finding jobs from others they know?

This sort of goes along with developers not looking. If nobody is looking,
then how do you find people to work for you? Get your current employees to hit
their Rolodex. Nevermind all that stuff about degree requirements, etc. In my
experience, the requirements hit the listing and then you never hear about
them. I imagine that's because the listing attracts the crazies and then you
get the gig when you sound like you halfway know what you're talking about.

~~~
hajile
I believe the biggest takeaway is the shift from preferring science fiction to
preferring a soap opera in space. It seems like the number of thinkers
relative to the population has remained static while the number of programmers
relative to the population has increased.

What happens when you need more thinkers than society has to offer?

------
enobrev
I absolutely agree with the general sentiment in this thread that we're all
self-taught. I've had quite a few friends ask me to help them become
programmers, and my first response is that (to paraphrase) "it's not just a
job to learn and then work towards retirement. It's a constant learning
experience where you have to wake up every day and realize you're ignorant,
slow, and unimaginative compared to your peers and if that's not the case,
then you're probably going to fall into obscurity. If you're ok with that,
then let's get started."

One angle I haven't seen raised here yet is that schooling was far behind the
times, at least through the mid-90s. I was set to graduate from a highly
regarded prep school in 1996, and in the "advanced computer class" we were
learning Pascal. My questions about the internet weren't answered well enough
to keep me interested in the conversation.

By then I'd built a couple of silly websites for myself, met hundreds of
people from around the world, and had my own little secret educational source
- a step up from my peers in school, which I needed because they were all
definitely smarter than me. I was hooked and had zero interest in plain old
desktop applications, which was the end-game of what was being taught in
every school I looked at (from my 17-year-old perspective).

A book on Perl understood what I was after. It wasn't even necessarily a very
good book on Perl. It had an open source web-store on a CD in the cover and it
told me step-by-step how to find a web host and then set up the web-store on a
server. I set up a web store for my mom's retail business, which then stayed
afloat for a little while longer (she now sells online full-time).

I proceeded to drop out of college and haven't stopped learning since.

------
ultramancool
Note that this doesn't mean completely self-taught!

Most (good) developers are definitely at least partially self-taught, but
those who are employed in the industry also tend to possess a degree, if not
in CS then at least in a related field.

~~~
razster
Our developer went to Poly and it shows. He can think of a solution to our
program's needs almost instantly, whereas our other devs, who were
self-taught, are no longer with us because they hit a brick wall.

It pays to have a good programmer with a degree.

~~~
mmgutz
The best programmers I've worked with are first and foremost problem solvers.
One had a CS degree, the other was an accounting major and another a musician
with a high school degree.

There are very few CS courses I took at university that prepared me for the
real world. I'm sure certain fields like AI, DB algorithms, math/science
applications, and compilers benefit from formal academic training, but for
the most part, being logical and a good problem solver are the most important
skills.

~~~
Bahamut
I too have found problem solving to be the most important skill -
understanding the problem so you can craft the proper fix or implementation is
vital to saving time.

Of the best developers I've worked with or know, only one has a CS degree.
One has a liberal arts degree IIRC (works at Netflix), another is a college
dropout (works at Google)...and the one with the CS degree has a tendency to
overengineer systems, with flaws the others I know wouldn't introduce (he
tries to be too clever). Myself, I have an MS in math from a top-15 program
(PhD dropout).

------
eludwig
I was self-taught as well. Back in '82 I was an in-house illustrator at a
children's book publisher. The company (now defunct) brought in Apple IIs in
order to do some house-branded educational software. I fell in love with the
little machines immediately, even though I really had no exposure to any
computers in high school or college (art school). The real software guys that
were hired taught me 6502 assembly and off I went. I got my first job as a
programmer in '85 doing Mac assembly (68000) programming. So much fun.

That said, I have a huge amount of respect for CS grads. I have seen it. My
background in algorithms is totally non-existent, besides what I've been able
to pick up on my own. I've been at a disadvantage MANY times due to my lack of
formal CS/math training (my "formal" math education stopped at plane geometry in HS!).
Thankfully, I've always had good friends to help me through these issues, but
it would have been a lot easier for me if I had had real training. Google is a
huge help now that we no longer need programming manuals, as such.

I have always gravitated towards the visual, GUI aspect of software
development, which is probably not a surprise given my art school background.
I really think that companies should keep open minds regarding education. It
really takes all kinds of people to do what we do, especially at big, diverse
companies. Being visually oriented and a capable programmer is a unique kind
of background that can be used to great effect. Not all cookies are the same
shape.

------
p4wnc6
If you make use of your university time correctly, then the primary value of
virtually _any_ degree is that it teaches you how to self-teach to an extent
you could not reach if you just started self-teaching on your own.

Aside from occasional prodigies, the folks who are best at teaching themselves
new things are folks who proved they could do it through degree programs.

Don't get me wrong, though, a lot of people do not actually make good use of
their university time, and they get through to graduation without learning
much about self-teaching. I'll never understand why they would want to waste
so much money for that.

If I had my choice when recruiting, I'd select people in this order:

1. Someone whose degree and job experience clearly show they are thoughtful
and can self-teach.

2. Someone who acquired abilities by self-teaching, even without a degree
and/or prior experience.

3. Someone who has a degree and/or experience, but who clearly doesn't have
much skill at self-teaching.

4. Someone who does not have a degree, experience, or self-teaching ability.

Really, I'd prefer to _never_ hire someone from groups 3 or 4. But sometimes
it can be hard to detect fakers from group 3.

I think a lot of people share this opinion, which is somewhat tragic since
very few hiring processes make even the faintest attempt to determine if a
candidate is good at learning new things or self-teaching. Instead, just as
with the tired old thread about HackerRank from yesterday, we spend all our
time quizzing people on rote memorization of standard examples, which is
something that the group 3 people are very good at faking their way through.

------
Kinnard
This contrasts strongly with "Dropouts need not apply":
[https://news.ycombinator.com/item?id=11393671](https://news.ycombinator.com/item?id=11393671)

blogs.wsj.com/economics/2016/03/30/dropouts-need-not-apply-silicon-valley-
asks-mostly-for-developers-with-degrees/

~~~
TranquilMarmot
Silicon Valley != the entire programming world

------
cdnsteve
Why is the photo of a gaming conference or is that what people think
developers look like?

~~~
erikdared
Yeah I noticed that as well, looks like it's from a LAN party or something.

------
hajile
CS teaches a lot of theory, but not much practice. Let's face it: 90% of
programming is CRUD, and the hardest part of such applications (managing all
the state) isn't something that can really be taught in two years' worth of
CS classes (the other two years going to non-CS-specific material). Employers
don't want to pay to train new programmers (even CS grads who couldn't get an
internship), yet they generally expect a couple of years of experience before
they'll even consider someone for an "entry-level" dev position.
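To make the "90% of programming is CRUD" point concrete, here is a toy sketch
of the four operations over an in-memory store. The store shape and function
names are illustrative, not from any particular framework; in a real app the
Map would be a database table and the hard part would be keeping all the
surrounding state consistent.

```javascript
// A minimal in-memory CRUD store: Create, Read, Update, Delete.
const store = new Map();
let nextId = 1;

function create(record) {
  const id = nextId++;
  store.set(id, { id, ...record });
  return store.get(id);
}

function read(id) {
  return store.get(id); // undefined if missing
}

function update(id, changes) {
  const existing = store.get(id);
  if (!existing) return undefined;
  const updated = { ...existing, ...changes }; // merge changes over existing fields
  store.set(id, updated);
  return updated;
}

function remove(id) {
  return store.delete(id); // true if something was deleted
}
```

Most line-of-business apps are variations on these four verbs wired to forms
and reports; the theory-heavy parts of a CS degree rarely touch them.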

Programming is a job that requires constant learning. If a programmer has what
it takes to do that, then it's not too surprising that the programmer can
learn the basics on their own. A lot of devs like myself started learning to
program around 10 or so. When college enters the picture, these devs are bored
for most of the classes (except perhaps things like algorithms or compiler
classes).

If a would-be programmer had the foresight to look into programming, they'd
probably note that there's more profit in skipping the degree and putting
that $50K toward a mortgage instead. CS has more free teaching material
online than any other white-collar job I know of. A programmer can quite
easily get the exact same education if desired (you can't really say the same
about other STEM fields, except perhaps math).

------
coldcode
I had one class in high school, and that was in 1973. Other than that I was
self-taught, and I'm still coding today (iOS). A number of friends in college
got the (new at the time) CS degrees, and all but one eventually became
unemployable because they had learned nothing beyond the degree content
(mainframes at the time). As long as you keep up to date, after a while the
degree no longer matters.

------
CM30
I'm self taught as far as web development and programming skills are
concerned. But to some degree, it feels like that was really the only option
over here in the UK.

Okay, things have apparently improved significantly in the last few years,
and schools are seemingly teaching a few more things actually considered
programming now. But back when I was learning HTML, CSS, JavaScript, and all
that stuff in the early '00s, the standard of teaching in the IT classes at
school and sixth form was absolutely dreadful, and it likely didn't get much
better at university. GCSE classes for IT were literally 'how to use
Microsoft Office for dummies', and A-level courses were basically devoid of
programming tasks.

As a result you basically had to teach yourself, since there was no way in
hell your school education would teach useful skills for any sort of
development, and you'd likely struggle significantly to go from there to
degree level.

------
ef4
Even within CS education programs, this has a profound impact.

You can't design a useful introductory computer science course that works well
for everybody. A subset of your class has been dabbling with code since
childhood, while another subset needs to start from the beginning.

Other degree programs don't seem to suffer nearly this extreme level of
experience gap within their incoming students. What fraction of incoming
mechanical engineering students have had the chance to build a working motor
and then take it through a dozen revisions until it works the way they want?
It has to be much less than the comparable fraction of CS students.

I think this is a big reason for the famous bimodal distribution of outcomes
experienced by most introductory CS classes.

------
emodendroket
This is from the SO survey and I think it's a bit misleading. Only 13% of
respondents say they are _only_ self-taught as opposed to combining
autodidacticism with some other method of learning.

------
vkjv
Does self taught simply mean "doesn't have a CS degree?" I don't have a CS
degree but I don't consider myself self taught because I didn't learn in a
vacuum. I would say that 75% or more of what I know, I learned from great
colleagues.

------
Raphmedia
Even if you started out in school, you are going to need to be self-taught
after a year anyway. What you learn in school will become obsolete quickly
enough, and most workplaces will assume that you are magically up to date at
all times.

------
racl101
The weird thing is that I went to university for a computer science degree
and took 2-3 programming courses using C and C++, but 95% or more of what I
know I learned outside of university.

So does that make me self-taught or not? If so, I wonder if the figure should
be 4 out of 5 developers being self-taught.

