Ask YC: Why the degree?
38 points by _bbks on May 24, 2008 | 206 comments
I am a developer without a degree. I left school when I was 16, at which time I'd been programming for two years. I've studied several languages, worked for countless companies doing contract work, and been employed full time as a developer and lead developer.

I'm now 24.

I used to be very frustrated to see companies demanding degrees before they'll even talk to you about developer positions. I have always put that down to HR departments that don't really know what they're doing when hiring developers.

However, I'm sorry to see that many startups hiring developers at an early stage are also requiring that candidates have a degree.

For all the company knows, I (and countless others) could be the best developers they'd ever meet. And yet we're automatically excluded from even applying.

Why?

If/when I am employing developers, a degree won't even separate one candidate from another; as long as I judge both to be good developers, they're on an equal footing.

I know several people that have degree's in computer science, and none have any magical abilities that you only pick up at university.

So what gives?



>For all the company knows, I (and countless others) could be the best developers they'd ever meet. And yet we're automatically excluded from even applying.

The game is about probability, not possibility. The assumption is:

P(qualified | degree) > P(qualified | no degree)

That's probably reasonable, since schools do filter out at least some idiots. To find a qualified candidate, one must interview roughly 1/P candidates. If interview costs are high (e.g., face time is precious), you want as many easy filters as you can get.
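
To make the arithmetic concrete, here is a minimal Python sketch; every probability and the cost figure below are invented assumptions purely for illustration, not measured data:

    # Finding one qualified hire takes a geometrically distributed number of
    # interviews, so the expected count is 1/P and the expected cost is
    # cost_per_interview / P.
    P_QUALIFIED_GIVEN_DEGREE = 0.20     # assumption: 1 in 5 pass the filter
    P_QUALIFIED_GIVEN_NO_DEGREE = 0.05  # assumption: 1 in 20 without it
    COST_PER_INTERVIEW = 500            # assumption: dollars of face time

    def expected_search_cost(p_qualified, cost_per_interview):
        return cost_per_interview / p_qualified

    print(expected_search_cost(P_QUALIFIED_GIVEN_DEGREE, COST_PER_INTERVIEW))     # 2500.0
    print(expected_search_cost(P_QUALIFIED_GIVEN_NO_DEGREE, COST_PER_INTERVIEW))  # 10000.0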

This is especially true if you don't trust HR to properly find the "diamonds in the rough" (I certainly wouldn't).


OK, I can understand it from that perspective: it's not that so much weight is being put on a degree, but rather that it's used for lack of other filters when there are lots of applicants.

Doesn't sound very accurate, but I can see the reason why.


For example, the P(A | B) notation is something you learn in a university probability class. A good feel for elementary probability theory makes you much better at modeling, understanding, and predicting the world, and not just as a programmer. And there are lots of other classes like that in a university curriculum (although the overall percentage of really useful classes is, by my estimate, only 25-50%). Before you say that you could learn all that at home: well, normal people who want to learn such things choose to go to school, and corporate HR is usually not interested in deviants.

My personal opinion: I also have friends who are very smart and great programmers and who dropped out of school, and I think that particular choice was not very smart on their part. They could have just finished it and would be in a much better position. E.g., they wouldn't be reliant on personal recommendations when applying for jobs; also, how do you ask for a raise without a degree (where are you going to go instead)? Etc.


"normal people who want to learn such things choose to go to school"

That's a load of crap. Normal people who want to learn such things pick up a book and read about it. Only very young people consider "going to school" a viable option. Normal people have jobs, kids, and responsibilities and don't have time to go to school every time they pick up a new interest.


"Normal people who want to learn such things pick up a book and read about it."

I would be very surprised to find very many "normal people" who have the drive to learn, say, probability theory or statistics to any reasonable depth by self-study. Of course such people exist, but they are neither typical nor common.


I said normal people who want. Now it's true that normal people don't generally want to learn probability theory, but if they do, they'll turn to books and self-teaching long before school, unless they're extremely young and still think of school as life.

Once you've been in the real world for a while, the idea of going back to school is generally the last option, not the first. The only thing school has to offer over self-directed learning is very smart people to learn from, but if you look around, you can find those people in the real world as well. Mentors tend to show up just when you're sincerely looking for them.


"they'll turn to books and self-teaching long before school, unless they're extremely young and still think of school as life."

On the contrary, many people take classes and attend conferences, workshops and seminars at many different ages. You could learn cooking by self study, but cooking classes continue to be popular, despite being more expensive.

"The only thing school has to offer over self-directed learning is very smart people to learn from"

Hardly the only thing: a lot of things are easier to learn when you're in a group of intelligent people all interested in approximately the same thing, and all trying to learn the same subject matter (I'm thinking mostly of graduate seminar classes here, not 300-student undergrad lectures).


There's a difference between conferences, workshops, etc. and school.


Different venues perhaps, but for someone attending because they want to, the goals are much the same.


I didn't say adults don't go to school, only that it's usually not the first option.

Your second point just echoes what I already said: smart people to learn from, which includes your teachers and peers.


Well, you said that "normal people" would read a book "long before" going to school, which I think is not true in many domains.

My second point was not just that you can learn from your peers, but that there is a social environment at a school that is important, and not readily replicated in self-study. You don't learn much from your peers at a typical cooking class, but they still contribute to why many people choose to take classes rather than self-study.


The stats on college attendance would seem to be against you, since about 70% of the population doesn't attend or doesn't finish a 4-year degree. Those are normal people. The 30% who do attend college and finish with at least a bachelor's are the minority.

Yes, school is a social environment for learning, and that has value, but it's hardly the only way to obtain it. Outside of school, there are local clubs and hobby groups, forums and blogs such as this one on the Internet, alternate paths such as the military, etc. There are all kinds of ways to be around people who enjoy the same things you do, or who are trying to learn the same things you are, without going to school.

By saying that's a benefit of school, you're implying that it's not obtainable elsewhere. School just isn't that important if you really want to learn; it's more valuable for those who lack motivation and need someone to make them learn.


"The stats on college attendance would seem to be against you, since about 70% of the population doesn't attend..."

In what country? Citation?

Here are some numbers from the UK: http://www.universitiesuk.ac.uk/faqs/showFaq.asp?l1=2&ID...

According to that, 43% of 17-30 year olds enter higher education in the UK.


My apologies for the U.S. cultural assumption. See http://www.census.gov/prod/2004pubs/p20-550.pdf for what it's like here. I also presumed we're discussing those who finish college, not those who just enter it. What percentage actually finish college in the UK?


The OECD stats look quite interesting. They break things down into Type A[1] and Type B[2] programs. Superficially, Type A are 3+ year courses, Type B are 2+ year courses. Results for the UK and US are here[3][xls]. Executive summary: in the UK, 97% of students on 3-4 year Type A courses graduate; the OECD average is 67%. This report also has data for the US.

For actual numbers of people in the UK with degrees, there's this page[4] from the National Statistics. For people of working age, 18 to 60-ish, in the UK, 16% have degrees, and another 8.5% have "Higher education qualifications". 15% have no qualifications. It would be handy to have data for the 25-35 age range, since not many 18-year-olds have degrees.

[1] Type A: http://stats.oecd.org/glossary/detail.asp?ID=5440

[2] Type B: http://stats.oecd.org/glossary/detail.asp?ID=544

[3] World Stats: http://www.oecd.org/dataoecd/17/15/39245059.xls

[4] UK Degree Stats: http://www.statistics.gov.uk/STATBASE/Product.asp?vlnk=10446


The original poster was asking why startups use this practice. Startups should not be looking for normal people at all.


Thank you, now I feel special ;P


Normal is not easily defined; average is. You are confusing the average person with the average hacker, which is most likely the average type of person in your circle of acquaintances.

I have suggested self-teaching to many non-hacker types and they are completely opposed to it. Even those going into programming.


I was talking about college-age people, as in to enter college when you're 18 or get a job and hope to become an autodidact.


Not related to the original question (I agree with the above answer), but in my experience HR departments/managers are more likely to read a CV not sent through the usual channels. The easiest way to bypass them is simply sending an email; the most efficient is to deliver it in person. I don't know why, but from (limited) anecdotal evidence it seems to work.


That assumption will only allow you to pick based on a relative standard, not an absolute one. Also, a degree being an eligibility criterion makes no sense. If you want to hire well, then you should be willing to meet any relevant candidate. So in effect, with a degree being a REQUIREMENT, you are actually shrinking the pool of people who will apply, and hence reducing your probability, not increasing it.


The most desirable companies get so many resumes that they need to filter out most of them. I used to work for a popular search engine before Google came into the picture. As an engineer I had to interview people every week. HR handed me lots of resumes from people with CS degrees from top universities for phone screening. They didn't bother with the rest unless someone was heavily recommended by a trusted source.


Do you think you were doing it the best way possible?


His employer's search business got eaten by Google, at the time I think Google were hiring 50%+ PhDs.


Sure, a smaller search space will not enhance the value of the maximum you are searching for.

But the cost of searching may drop drastically.

And as another poster pointed out, it is the total of (quality of candidate - search cost) that you want to maximize.


The assumption does not hold with more information.

First, self-teaching processes filter out idiots too, since idiots can't self-teach.

P(qualified | no degree + autodidactic) > P(qualified | degree + little self-teaching ability)


Agreed completely. But how do you get more information cheaply?

Remember, all you know is P(qualified | resume says autodidactic) = P(qualified | autodidactic) * P(autodidactic | resume says autodidactic). The second term can easily kill you. Do you trust everything people put on their resume?

On the other hand, P(degree | resume says degree) is probably close to 1, since a degree is so easy for an HR person to check.
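
Plugging made-up numbers into that decomposition shows how badly the second term can hurt; a rough sketch in Python, where every probability is an assumption for illustration only:

    # Even if genuine autodidacts are very likely to be qualified, resume
    # inflation in the second term drags the usable signal down.
    p_qualified_given_autodidact = 0.6     # assumption
    p_autodidact_given_resume_claim = 0.3  # assumption: most claims are padding

    p_qualified_given_claim = (p_qualified_given_autodidact
                               * p_autodidact_given_resume_claim)
    print(p_qualified_given_claim)  # 0.18: the claim alone is weak evidence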

Look, I've got nothing against hiring good people who have no degree, and I certainly don't think a degree proves much (my current students prove that conclusively). If you know you have such a good person, hire them, ignore the degree.

I'm just pointing out that the game is to find the good people. And that's a bit tricky to do.

[Edit: by the way, in the interest of clarifying, what you want to optimize is not candidate quality, but (candidate quality - search costs). Don't ignore the search costs term.]


In the programming industry, it's not really that hard to filter out bad candidates. You shouldn't waste time talking to any programmer not willing to submit to some simplistic programming test in lieu of a resume. Resumes are crap and have no place in hiring programmers. Code is all that matters: either they can do it or they can't, and if they can, they'll accept your challenge.

Nothing makes for a better interview than doing a code review and critique of the candidate's own code. Something simple but telling, like: write a little slot machine that lets you bet money, spin the wheel, and win or lose, with the game ending when you run out of money.
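
For what it's worth, a minimal Python sketch of that exercise might look like the following; the symbols, payouts, and starting bankroll are invented for illustration, and a real screening test would specify its own rules:

    import random

    SYMBOLS = ["cherry", "lemon", "bell", "seven"]
    PAYOUTS = {3: 10, 2: 2}  # bet multiplier for three-of-a-kind / a pair

    def spin():
        return [random.choice(SYMBOLS) for _ in range(3)]

    def payout(bet, reels):
        best_run = max(reels.count(s) for s in set(reels))
        return bet * PAYOUTS.get(best_run, 0)

    def play(bankroll=20):
        while bankroll > 0:
            try:
                bet = int(input("Bankroll $%d. Your bet: " % bankroll))
            except ValueError:
                continue  # non-numeric input: ask again
            if not 0 < bet <= bankroll:
                continue  # bet must be positive and affordable
            reels = spin()
            won = payout(bet, reels)
            bankroll += won - bet
            print(" | ".join(reels), "-> net %+d" % (won - bet))
        print("Out of money - game over.")

    if __name__ == "__main__":
        play()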

Anyone who refuses such a test isn't a programmer you want anyway, and you can tell an enormous amount about a potential candidate by simply looking at the quality of the code submitted. Don't bother interviewing anyone whose code you can't stand. You can see everything you need to know about their skills in that code.

You wouldn't hire a graphic designer who refused to submit samples of his designs, nor should you hire a programmer who refuses to submit samples of his code. Merely by having such a test, you'll weed out all the fakers because they won't bother with it, they'll submit their fake resumes elsewhere.


Many good candidates could be put off by such a test. More importantly, even if they agree to your test and send you some code, you need an engineer's time to evaluate the code sample, and if you're going to invest an engineer's time you might as well do a full phone screen. Recruiters are by and large non-technical people. They would love to be able to ask some technical questions which could help them predict whether a candidate would pass the interviews or not. I've actually been asked by a recruiter friend of mine how to do it. But, if you think about it, this is not so easy if you're not a programmer yourself.

You are absolutely right that it would be madness to hire a programmer without having seen him code. The companies are well aware of it and they will surely ask you to write a lot of code during the interviews. But that comes at a much later stage, as it costs them much more money than checking on your resume whether you have a degree or not.

I'm not saying that I like the way things are, but that's just life. If it's any consolation, in most other industries the requirement of having a degree is much stricter than in the case of software engineers.


I disagree; any candidate worth having would enjoy such a test. Any programmer who is put off by being asked to program needs to find another profession, period. You should not ask someone to program during an interview; you should review the code they've submitted and make them explain it, all of it: their design choices, their idioms, naming conventions, etc. If they can't discuss code they just wrote prior to the interview, you don't want them, but many good people don't perform well under the pressure of an interview, so you won't get an accurate feel for them if they program at the interview.

No one but a programmer is qualified to evaluate another programmer, HR and recruiters have no place here and they'll just waste time and effort pretending to be useful, they aren't.

By doing the test you've already filtered out the wannabes, so you don't need phone interviews. The submitted samples should go straight to a qualified programmer; it'll take him only a few minutes to toss out any bad submissions, and he'll quickly know who's worth actually interviewing and who's not.

Code is the only resume a good programmer needs.


One issue to be aware of is that it's not just the corporation interviewing multiple candidates. It's also the candidate interviewing multiple companies. Putting a large burden on each candidate may not scale if every company does it. I agree that it's important to assess actual coding ability but I don't know if a programming test is the best way to go about it.


Any programmer who considers such a trivial test a large burden, you don't want. There's no other way to assess programming ability than programming, it's really that simple. If you hire an unknown programmer without making him submit a code sample, then you deserve what you get because you're gambling and likely to lose. The whole point is to find good programmers, and good programmers enjoy programming, they'll happily write a trivial program to prove it, happily. Anyone who balks at such a test is not someone you want to hire, period.


Don't be the kind of employer that asks for a week-long project as a resume. That's just irritating.


I didn't say that; I'm talking about an hour or two of work here.


That may work better than a resume.

Can you imagine a test for the ability to work on larger programs?


Good code is good code; I don't agree that it takes a vastly different skill set to work on different sized programs. During the interview you can simply ask questions such as "How would you provide the ability to configure various strategies for different user interfaces for this slot machine? Say a web UI, a command-line UI, and a native app UI?" or "How might you make the randomizer for spinning the slots configurable?". The discussions these open up will tell you a lot about their knowledge of making flexible software.
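
As one hedged sketch of the kind of answer the randomizer question is fishing for (all names here are invented for illustration): make the spin source a pluggable strategy, so the game logic depends only on a small interface and tests can inject a deterministic sequence:

    import random

    class RandomSpinner:
        """Production strategy: pseudo-random reels."""
        def __init__(self, symbols):
            self.symbols = symbols
        def spin(self):
            return [random.choice(self.symbols) for _ in range(3)]

    class FixedSpinner:
        """Test strategy: deterministic reels for unit tests."""
        def __init__(self, reels):
            self.reels = list(reels)
        def spin(self):
            return list(self.reels)

    def play_round(spinner):
        # Game logic depends only on spin(), not on any particular
        # randomizer or UI.
        return spinner.spin()

    print(play_round(FixedSpinner(["seven", "seven", "seven"])))

The same separation answers the UI question: keep the game state in one object and let each front end (web, command line, native) drive it through the same few methods.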

Again, the test is just to weed out the bad candidates and allow you to interview only worthy people; everything else you'll find out in the actual interview. Interviews will be rare; few people will submit code you'll find acceptable at all. The vast majority of people applying for programming jobs can't program. It's sad, but true.


There is a set of skills and body of experience that applies to working on large bodies of code. In fact, there are different skill sets for working on bodies of bad code and good code.


You'll note I said vastly different, I didn't say there was no difference.


The language syntax may be the same, but programming approaches might be very different. I've seen parts of a Smalltalk program where it was coded like spaghetti copy-paste Fortran, and it's actually hopeless to try and fully understand the semantics and still meet your deadline. But I was able to use a few tricks to prove that certain modifications wouldn't alter other functionality so I could get my work done. In the same program, there was well factored code with a nice object model, where it would behoove you to understand the part you're working on in a more conventional way.

On reflection, I think you may be right that good code is good code. Size of the system matters most when the code is bad.


There are lots of other cheap filters that companies do not use. For example, "Have you read Ayn Rand?" This question may well have better predictive power than a degree. And it's not hard to verify whether the person is lying about having read her at all.

Or, "how many blog posts do you have about programming?" Again this is easy to verify.


Who is Ayn Rand? An utterly dreadful writer and philosopher who believed that the axiom of reflexivity of identity had substantive ontological, social, moral and political consequences. It's no accident that Objectivists sought to justify their belief in the "virtue of selfishness" in the law of reflexivity; they have no use for altruism. Rand's philosophy glorifies the sociopath homo economicus, whose sole objective in life is to maximize his expected utility.

However, results in evolutionary game theory show that a society of self-seeking, self-regarding agents will generally face conditions that ultimately lead to its collapse.

Gary Cooper's goofy speech in The Fountainhead ( see http://www.youtube.com/watch?v=Zc7oZ9yWqO4 ) typifies Rand's attitudes. Among other preposterous propositions, Cooper is made to utter the nonsense that great inventions are uniformly the work of sole inventors, selfishly and reflexively seeking their own interests. This is ahistorical; see Against Intellectual Monopoly by Michele Boldrin and David K. Levine ( http://www.dklevine.com/general/intellectual/againstnew.htm ) for the history of inventions such as the steam engine, radio, telephone, and so on. In each case, ideas were in the air, and there were a number of people who came up with similar inventions more or less at the same time.

Cooper argues the basic notions of intellectual monopoly, which are that intellectual property is essentially indistinguishable from tangible property, and that all copies of ideas "belong" to their creator. These arguments come straight from the RIAA legal playbook. I'm surprised that any culture of hackers would want to subscribe to notions more commonly associated with corporate monopolists.


I suspect the point is that having heard of and bothered to read one of her books, whether you liked or agreed with it or not, probably implies various things about you.

A) You socialize(d) with people who read things that aren't sold in the grocery store.

B) You not only know how to read, but most likely voluntarily read a 700+ page book in your spare time in order to learn/see what it was about/etc...

C) If you can speak about what was in the book, and what you thought about it, you can follow the plot of a 700+ page book, you can understand the points the author was making, perhaps you can intuit the not-very-subtle philosophical and societal messages she was delivering, and you can discuss how you agree or disagree with those messages.

It's no IQ test, but frankly it's probably a much better question than "Do you have a degree?".

At least I'd rather work with people who have read a book like that, and have an opinion on its content and the author's points (even if they hated the book/points/etc...), than the average CS degree graduate.


Very few people are hiring based on an understanding of objectivism. Data structures and algorithms on the other hand...


I'd rather work with a smart, well read person who likes to think than someone who hasn't read much.

Sure asking about Ayn Rand isn't really an intelligence test, but it's not a bad start. Smart people can learn about data structures and algorithms. Slow people who've managed to get a CS degree from some random college may have learned enough to pass, but there's no demonstration of smarts there.

I'm hiring based on people being smart, self-motivated, willing to learn, willing to think, and people I can hold a conversation with. Teaching someone like that how data structures work is a lot easier than teaching a degree holder how to be someone I want to work with and someone I can trust to be on the ball as new technologies come out.


Exactly.


Ayn Rand seems to be (significantly) more popular here than in the general population, judging from a dozen or so times I've seen her come up in comment threads. Isn't that more relevant than your personal, negative opinion of her philosophy?

So, out of curiosity, do these "results in evolutionary game theory" have a source?


"Isn't that more relevant than your personal, negative opinion of her philosophy?"

No, because it's not only a personal opinion, but a statement that Rand's philosophy of rational self-interest is logically invalid ('rational' does not imply 'self-interest'; basing it on the reflexivity of identity could fairly be called desperate) and scientifically incorrect. Rand's philosophy is incompatible with findings of reciprocal altruism in evolutionary biology and experimental game theory.

"So, out of curiosity, do these "results in evolutionary game theory" have a source?"

Here's one: "Game Theory Evolving" by Herbert Gintis; see Chapter 11. http://www.amazon.com/Game-Theory-Evolving-Herbert-Gintis/dp...


As you must be aware, many smart people disagree with you about Rand, and can back it up with something better than a youtube video. And anyway, you said her philosophy contains certain flaws. That doesn't imply it's not valuable and useful overall, so even if I concede your points, it's not very important. The reasons I like Rand have nothing to do with "axiom of reflexivity of identity" or that other stuff you said.

edit: no online sources? I don't normally pay $35+ because a hostile, anonymous internet commenter said something would refute someone I respect but didn't want to explain the ideas himself.


You asked for a source: I provided one. A request for an explanation is something else. I'll give you another source instead: http://www2.owen.vanderbilt.edu/Mike.Shor/courses/GTheory/do...

As for $35, it pains me to mention libraries...it was a source, with a link so that you could see something about the book.

I can't say I know of a single public intellectual or professional philosopher who takes Rand seriously. I do know of a well-regarded mathematical logician who does, but this is an aberration.


>I can't say I know of a single public intellectual

Alan Greenspan (http://www.nytimes.com/2007/09/15/business/15atlas.html).

Clarence Thomas (http://www.law.com/jsp/article.jsp?id=1090180289132).


Your new source covers basics, not new results. I just wanted an interesting paper on new results to read; those are normally published online.

Your ignorance of public intellectuals and professional philosophers is not an argument against Rand. lol.


No professional philosophers take Rand seriously, except for a couple whackos in a couple places.

I understand, if you're 15, how exciting Rand can be. But to see adults take it seriously is sad.


Who's Ayn Rand? And why must you be a blogger to be a programmer? These are silly questions, imho.


The degree question is also silly, but has non-zero predictive capability. So do these. Certainly they rule out plenty of good programmers, but the issue is: do they leave plenty of good programmers, and a higher quality pool of remaining applicants?

If good programmers read Rand or write blogs at a higher rate than bad programmers, then it works, even if it's frequently wrong about individual people.


The difference is, if companies started using the filters you suggest, then the candidates would soon catch on and everyone would have a programming blog and have read Cliff's Notes for "Atlas Shrugged." So it's not, as they say, an evolutionarily stable strategy. That's why no one does it on a larger scale, I guess.


For a demonstration of this principle, compare the quality of people with degrees in Computer Science over time. Has the average gotten better in the last 30 years, stayed about the same, or gotten worse?

My feeling is it has gotten much worse as people have sought degrees solely for the purpose of using them to get jobs in the field, precisely as you suggest would happen with blogs and reading Cliff's Notes.

However, may I point out that blogging is very different from reading the cliff's notes of a book? A hiring manager can read your blog. If they simply check that you have a blog, well, whoop-de-doo. I have a blog, so that clearly proves nothing.

However, if the hiring manager reads your blog, they can deduce a great deal about what you pretend to think and how you communicate it. So it is a small example of your work, much as posting source code is a small example of your work.


Linus Torvalds, the best programmer I've heard of, doesn't write a blog.

You still haven't told us who Ayn Rand is.


You haven't addressed the point and you can google her.


They are generic and poor filters. For example: do you read person X? If person X isn't a household name (e.g., Bill Gates), then it says nothing about a candidate other than that no one ever introduced them to that person. I'm willing to bet there are a significant number of good C++ programmers who don't care or know who Bjarne Stroustrup is. I had to look up the spelling of his name (although I don't consider myself a good C++ programmer).

The ability to write blogs is irrelevant here too as what we most likely care about is the ability to program. Hence, the programming test.


Take the group of programmers who have written more than 30 blog posts about programming. X% of them are good hires, and 100-X% are bad hires.

Now take the group of programmers who have written less than 30 blog posts about programming. Y% of them are good hires, and 100-Y% are bad hires.

Is X > Y, or X == Y, or X < Y?

That is the issue.

And that is the way degrees are used (when used rationally). The claim with degrees is that X is more than Y, not that most good programmers have degrees or anything else. That may be so. Then someone defended using degrees by saying there is a lack of alternative tests that are sufficiently cheap. That's not true. There are lots of cheap tests, and I have suggested two for which I believe X > Y is likely, and which, if studied, might turn out to have a higher X than the degree test.
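
Toy numbers make the X-versus-Y point concrete (all counts below are assumptions for illustration): a filter helps exactly when the pass group's hit rate beats the fail group's, even if it discards many good programmers.

    bloggers = {"good": 30, "bad": 70}        # assumed: applicants with 30+ posts
    non_bloggers = {"good": 100, "bad": 900}  # assumed: applicants without

    X = bloggers["good"] / float(sum(bloggers.values()))          # 0.30
    Y = non_bloggers["good"] / float(sum(non_bloggers.values()))  # 0.10
    print("X = %.2f, Y = %.2f, filter helps: %s" % (X, Y, X > Y))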


To a degree, I did. Maybe not to the degree you desired.

My point was that even though it filters out many of the worst programmers, it would also filter out all of the really great programmers. I'd rather have my search take longer and cost more to have those great programmers on my team than end up with just good programmers.


Provided your filters (1) increase the probability that the remaining applicants are qualified and (2) do not reduce the applicant pool too much, then it is probably a good idea (even if unconventional).

Note however that criterion (2) is important. If you use the filter "Is named Linus Torvalds", you would certainly increase the probability that any given applicant is good. Then again, your applicant pool will drop to either 0 or 1.


I actually have a friend who was asked if he has read Ayn Rand while interviewing with an insurance company.


For those following along at home, the unpretentious word for "autodidact" is "self-taught".


It's not pretentious just because you don't know it. "Autodidactic" sums up a phrase using just one word. Basically it comes down to this:

Having a larger vocabulary allows you to spend more time communicating and less time talking.

Just like abstracting functionality away inside functions allows you to spend more time solving problems and less time (re)writing code.


You're certainly right about abstraction being useful. My first thought was that the simple-English equivalent of "autodidactic" here would be "self-taught", being aware of unnecessary abstractions, though the subtler meaning is closer to "someone who will teach themselves".

Simply saying "self-taught" wouldn't have had the right connotation: instead of "clever and driven", it would have been, "but what have they missed?" and we would have missed out on this interesting off-topic sub-thread!


I thought there was a difference, where self-taught would be "I did teach myself this thing" and autodidactic would be "I have the personality trait of continually teaching myself new things." Only one of the dictionaries I checked made this fine-grained (and relevant) distinction explicit, though.


Is autodidact such an unusual word in English?

(I would find it perfectly acceptable, but I am not a native speaker. That's why I am asking.)


It's the first time I've ever seen autodidact(ic). I assumed it meant self-taught based on context, probably like most of the others.


OK, probably I only know all the weird words you never actually use.


That actually happens a fair amount with non-native speakers, since our uncommon words are sometimes very similar to their common words. For example, an Italian recently called me ascetic, and I had to look it up.


"Autodidacta" is the most common way to say "self-taught" in Spanish.


I love how English is quite happy to absorb new words, even if the English speakers only know a subset!

Back in school, we were warned about "faux amis", words which sound like they'd be right to an English speaker, but very much out of place to a native French speaker. I think in English we'd just accept any and all alternative meanings and let context sort it out, who's up for some creative reading?


Yeah, in Spanish it's not unusual or fancy.


In French (autodidacte) it's pretty much the canonical word for that concept, too (French is similar to Spanish in lots of ways).


Anti-intellectualism will be the death of us all.


Using that word may not be pretentious, but -- worse -- it is bad form. An intellectual might object to the word's use in this context simply because it is bad writing.

The goal of writing is to communicate. Unnecessarily using obscure words clouds communication. In this case, I don't think using "autodidact" was necessary. It did not add to brevity or cadence or coherence. It was a mistake.

The only thing worse than an anti-intellectual is a bad intellectual.


No, using precise words enhances communication. There are different connotations between the words.


"Self-taught" and "autodidactic" are denotatively equivalent, so from that perspective the word "autodidactic" is absolutely not more precise.

What's more, relying on connotation to enhance precision is kind of silly. Connotation is highly subjective and often inconsistent across a community.

One good example of this fact is that few people here seem to see the connotative difference between the two words in question. I don't.

Maybe you can help us. Thanks.


Ironically (or pseudo-ironically?), the very difference in connotation between self-taught and autodidactic is pretentiousness. Perhaps not quite pretentiousness, but more of a difference in social class -- a self-described autodidact is indicating that even though they are self-taught, they are not "blue collar" self-taught.


And it's not that self-taught always means lower class, it's just that autodidact always means higher class. There must be a better word in English for the type of distinction in words I am describing here, maybe someone can enlighten me...


Um, fuck you. Really, I mean it. My mom, a single mother, worked full time doing hard physical labor in a bakery. Her back is still sore from all the hard work she did to raise me and my brother. We lived in cheap housing, paycheck to paycheck for years, and went to pretty crappy public schools. I don't see how you get anything about economic class from the word I used.

We were poor growing up. I liked to read. I learned big words when I read, and I use them when I communicate. End of story.


Can you look at the actual post I responded to and explain how communication was enhanced in this case?

If there is a different connotation in this case, I'd argue it makes the post more confusing.


I feel like I'm in junior high getting bullied for being a nerd. I'm so sorry for using a big word. In the future I'll try and limit myself to a fifth grade reading level (maybe reaching rhetorical, which is a fancy word for a skillful kind of speech, heights of seventh or eighth grade level) so that the idiotic (for you mouth-breathers, idiotic is a big cloudy multisyllabic way of saying dumb that fails to add brevity, or cadence or coherence) Techcrunch half of this forum can follow along without having to use a dictionary to find a word they don't know.

"Autodidact" is obscure? WTF? Where is this coming from? Did everyone come here right after watching the American Gladiator contest? Did we get an influx of Time Magazine readers?

For me, the goal of writing here is to practice at playing with ideas. If, in playing with ideas I use words that are (maybe, a little) obscure, and I fail to communicate to people who seem to me like idiots, well, I'm fine with that. Great, actually. If only the failure to communicate ran both ways.

I'm done being autodidactic. I'm going to put down my books and go watch some soap operas and sitcoms until I figure out how the idiots here think.


...and then you 'defend' yourself by finishing up with statements like that? Wow.


The only thing pretentious about 'autodidact' is the use of the word 'pretentious' to characterize a common vocabulary word.


It's not really that common; most people wouldn't know what it meant, so it is a bit pretentious. Most people really would just say "self-taught".


It's not at all pretentious to use a well-defined word whose meaning is precisely appropriate. To anyone who needs convincing of this I recommend a large stack of Christopher Hitchens.

What's pretentious is using big words to impress others. Most people who do this actually end up not using such words correctly, since nuances of meaning are hardly their primary concern.

Edit: what I really object to is your implied criterion that anything beyond what "most people" would say must be pretentious. Lord help us if we're supposed to go by "most people".


I think Richard Feynman would disagree with you, and his plain-spoken manner is a symbol of how to be smart without being pretentious.

Using uncommon words appropriately doesn't make them any less pretentious unless you really don't know they're uncommon, and Christopher Hitchens, whom I very much like, is hardly someone associated with being unpretentious. Hitchens certainly doesn't suffer fools well.


So anything beyond what's common is pretentious? As I read the definition of the word, that's just wrong. On the other hand, looking things up in a dictionary to find out what they mean is so far from common that, by your definition, I'm being pretentious just for doing so.

Edit: your invocation of the name of Feynman strikes me as gratuitous (oops, I'm being pretentious there) and a textbook example of appeal-to-authority (or rather, it would be one, if you actually were quoting him or had the right to speak for him).

Edit 2: what on earth does "suffering fools well" have to do with being pretentious or not?


I wish there was a voice recognition program for smartphones that kept a running log of uncommon words within earshot. You could then click on the log to go to a definition. (Which would be pre-fetched and cached.)


Well, it's not really worth arguing about, but my appeal to authority was in response to an appeal to authority (Hitchens). If you don't agree the word is pretentious, fine, but I, and I presume many others (since I didn't start this thread; someone else called it pretentious first), do find that people who use uncommon words are being pretentious and trying to sound impressive with their vocabulary. Given a choice to say something plainly or with fancy words, the unpretentious choice is plain talk.

As for the reference to suffering fools, I was merely pointing out why I think Hitchens can be pretentious, but it wasn't really relevant, I agree.

Maybe ostentatious is a more accurate word, but it is a synonym of pretentious. In any case, I really don't care that much, so I won't bother arguing further, if you disagree with me, then let's just agree to disagree and move on to something more productive than arguing about words.


Ok, but I can't resist adding one more! First, it seems we do agree on what's "pretentious": not using uncommon words as such, but doing so to sound impressive with one's vocabulary. Also, I brought up Hitchens not to invoke any kind of authority but rather as rich source material for the inventive and pitch-perfect and endlessly entertaining (and, yes, unpretentious) usage of all kinds of words. It seems we agree on most of that :)

Lastly, I'd like to add something to this conversation that isn't merely critical. The thing about less common words is that, often, they are not exactly synonymous with more common alternatives. There are often nuances that, consciously or otherwise, add to the meaning of what's being said. While thinking of this I remembered a brief post from Language Log a while back that I thought was brilliant. It takes several examples of forms that have been claimed to be interchangeable (and thus superfluous), and susses out real distinctions between them:

http://itre.cis.upenn.edu/~myl/languagelog/archives/005487.h...

There must be a few people here who find language as interesting as I do, and would enjoy parsing out the examples they give. I'd post it as an item to HN but can't think of a title that could possibly convey the point. Oh well, maybe I'll just post it anyway...


Terrific link!


It's pretentious to say that it's pretentious; in fact, any use of the word pretentious is extremely pretentious, including this one. The use/mention distinction also breaks down for the word 'pretentious': it's not even possible to mention the word without being fatuously pretentious. This is especially true if the criterion for pretentiousness is whether "most people" would know what it is.

'Autodidact' is another matter.


It's IGNORANT to refer to that word as pretentious.

I have seen that attitude from the room temp IQ crowd - never thought I would see it here.


What led you to believe that posters here have high IQs? Certainly not the articles or comments ;)


Average IQs would be sufficient to make his point. In my part of the world, "room temperature" refers to somewhere in the neighborhood of 75.


I've also met someone who thought it would be neat to be lumped in with the pseudo-intellectuals, because she thought it was a neat-sounding word.


You are correct, but note that just because "self taught" is unpretentious, it is not necessarily the case that "autodidact" is pretentious.


True, but it's getting "more information" that is costly. If someone else can do it for you, so much the better. Personal recommendations are the best source (if you know and trust the recommender).


Maybe it just seems to me like an investment obviously worth paying.


"I know several people that have degree's in computer science, and none have any magical abilities that you only pickup at university."

I think you are shortchanging the value of a formal education. You are forced to learn things you don't necessarily enjoy. For example, I was forced to take a few stats courses which I found extremely boring, but recently at work that material was incredibly valuable. I know I thought I was a great developer when I was 16, but looking back I was just a good problem solver. A degree also forces you to work extremely hard on some things (projects, exams) with rigid deadlines.

Are you suggesting that people are wasting 4 years of their life? I realize it is 'just a piece of paper', but that piece of paper signifies an accomplishment.

Now I am sure a few people are going to say I have drunk the Kool-Aid. To that I suggest there are some places you can break the rules, and some times you have to play within the rules of the game. I think getting a degree still opens many doors, and because of that I followed through with it. Clearly there were times I considered skipping university or dropping out to pursue an opportunity, but looking back I have no regrets.


"You are forced to learn things you don't necessarily enjoy. For example, I was forced to take a few stats courses which I found extremely boring, but recently at work it was incredibly valuable."

I don't think that's the way to go. If learning something is boring, then it's either because you'll never need it or you don't realize you'll ever need it. You can't know if it's boring because of the former or the latter before you really need it.

For example, I found math pretty boring in school (beyond 6th grade, say) because I just didn't see what applications it would have in my life, even though I knew I wanted to be a programmer and good math skills were supposedly a hard requirement (they only are if the problem domain you're working on requires math).

But now I have some big ideas for my projects, and I've learned that some fields of mathematics and other disciplines would be a great investment because they would ease, or even simply make possible, their implementation. So now those topics really look fun to me, which gives me the push I need to learn them as fast as possible and in an enjoyable way.


>I realize it is 'just a piece of paper', but that piece of paper signifies an accomplishment.

What's written on that piece of paper is more important than the accomplishment, in my experience, at least as far as finding a job is concerned.

If a company deals with hundreds of applicants they need some way of trimming them down, and not having a computer science degree will put you at a disadvantage even if you do have some other degree. I've been rejected for jobs because I have a physics degree and not "Comp. Sci"; I'm sure there are mathematicians on here who can say the same thing.

This might not be so true for startups, but I suspect that when faced with a deluge of applicants even startups will go straight for the "Comp. Sci." degrees, if just to make the numbers manageable.


I don't have a CS degree, but all else being equal, a good education trumps not having one. The same person will be more effective with a degree than without one, especially if the degree is a meaningful one (i.e., not just a piece of paper).


I don't mean to take any value whatsoever from a degree, and I believe that people who have them have accomplished something worthwhile.

What I am saying is that people shouldn't just dismiss those without a degree, because those of us who have a true interest in what we do, and who learn, read books, and experiment on our own (without being forced, or having a set syllabus to follow) can become very talented developers.

If you've gotten to a very high level alone, I think that's a damn good accomplishment too, one that is at least comparable to a degree in itself.


Matthew, I agree; however, there's a general trend I've noticed within this conversation: technical vs. thoughtful ability.

From my experience, programming requires both understanding and experience. What you learn in college does not necessarily correlate to success in a technical field (especially when you're coming out of a Liberal Arts university as I did).

Simply put, when it comes to a technical field:

(Past experience + understanding + execution) > a Liberal Arts degree.

But you shouldn't disregard a degree when it comes to thinking + innovation. College provides an invaluable means of looking at and analyzing the world around you.

I wrote about this a few days ago; it may be of interest:

http://datainsightsideas.com/post/35471878/


>Now I am sure a few people are going to say I have drunk the Kool-Aid.

Even if you point this out ahead of time, those people could still be right.

>I think getting a degree still opens many doors, and because of that I followed through with it.

Signals are wasteful.

>but looking back I have no regrets.

Most people think whatever choices they made were the best. That you have no regrets is not unusual.


I think you're shortchanging the value of being autodidactic.

The paper signifies an accomplishment that has nothing to do with actual ability. This obsession with degrees is a fetish.

Any decently self-teaching person is going to wade through the boring stuff on their own anyway. That's a straw man.

In fact, your needing the pressures of deadlines and professors to get you to work is more indicative of you not being a great worker, or being ambivalently committed to your work, than of anything about self-teaching people. If you need to enter a codependent relationship with an institution that often calcifies mediocrity in order to get your work done or to wade through the boring stuff, that doesn't mean the self-teaching dropout shares your problem or hasn't done the other work.


People who can self-teach well are in the minority, and many of those people still go to university/college. I am not shortchanging the value whatsoever.

I think being autodidactic is incredibly valuable - but I don't believe whether or not you have a degree is a good indicator of that. That skill is also incredibly hard to determine from a resume or interview. When you are hiring, commonly you have one piece of paper to look at and then a 30-minute interview to evaluate the person. Whether you like it or not, the majority of people doing this will filter by degree and institution. If you can think of a better process, create an HR startup; I am sure it would be amazingly successful! Many companies realize that the hiring process is flawed, but there are currently few options.

I also disagree that educational institutions "calcify mediocrity".


In my experience hiring software engineers, possession of an undergrad CS degree was not a helpful filter for resumes. I did find that candidates with advanced degrees or undergrad EECS degrees tended to be a little bit stronger than the average candidate. So did candidates with unrelated undergrad degrees, and a few years of job experience.

I never figured out how to look at a resume and meaningfully select for extremely good hackers who were within our ability to attract (although it was easy to identify them in interviews). The ones for whom that was obvious from their resume had better opportunities.

One thing that I find interesting about this conversation is how much more often it comes up in software than elsewhere. Why are there so many autodidact hackers out there? I think it might be that computers are such excellent, affordable learning environments. The quick feedback loop you can get actually working with a computer is unlike anything available for most fields.

On the other side, academic CS programs are relatively new, and possibly don't do such a great job yet. Or maybe it's just that they've been attracting large numbers of the wrong people in recent years.


Computer programming is something that you have to do for a long time to be good at. Almost all good programmers started well before college, and a lot of them go on to study something other than CS. My BS is in Electrical Engineering, and I have an MS in Math, yet I still consider myself a desirable candidate for programming jobs.


Back in my day I didn't know CS was its own major; I thought it was just a subset of EE, and programming was just one thing you had to know, in addition to actually being able to build a computer from scratch.


Math should also be very self-teachable, though it takes a lot of discipline or (better) passion.

Almost all ground-breaking mathematics was done at a very young age.


P.S. And at 22 I fear I am already too old to ever become a great mathematician.

And you think you have a depressing career outlook?


I haven't met self-taught programmers who learned much theory (CS, math, physics, etc). I bet there are some, but I guess they are rare.

Theory gets useful if you want to tackle harder problems. If you aren't aware of the theory, you often don't know what you are missing. As an ex-dropout, I was in that position for a long time.

Going out on a limb now, big company with little respect for theory -> Microsoft, big company with healthy respect for theory -> Google.

You can do fine without theory, but it's getting harder (and will continue to).


A Bachelor's degree (even in CS) isn't just job training. At least half the classes required for a degree are not related to your major. When you graduate with a CS degree you've (hopefully) learned some CS theory and some practical skills, but also you're more well-rounded from learning composition, speech, math, history, science, etc. If being a developer meant pounding out code all day long, then a college degree shouldn't be a factor in the hiring process. Most developers I know spend more time doing other things - planning, estimating, communicating with other teams, etc. - than coding. This non-coding workload increases as you move up the ranks, so most HR departments hire with an eye toward the future as well.

So given what you know to be true - that most companies require a degree - you have two options: 1. Get a degree. 2. Work for a company that doesn't require a degree.

And one more thing: the plural of degree is degrees, not degree's. I learned that in school.


Interesting points. However, you seem to be assuming that people without degrees disappear into some kind of a time warp while everyone else is at university.

Teamwork, communication, planning, and organisation skills can be picked up in real-life environments too, if anything in a better fashion.

When you enter employment at a young age, your employer and your colleagues are more willing to show you the ropes and give you time to learn on the job, considering you usually start at a low position. Then you proceed to gain real-world experience for a number of years, depending on how quickly you dropped out of education, and how quickly you landed a job.

I had been working as a developer for 5 years by the time my friends of the same age finished their degrees. Put that into perspective and compare to people fresh out of university.

I'm not saying people with a degree have wasted their time, or that it wasn't the right thing to do. Just give those of us who chose a different path some credit too.

As far as my options go, I'm now self employed and loving it.


> I had been working as a developer for 5 years by the time my friends of the same age finished their degrees.

I knew a fellow who didn't go to university and was managing a team of 12 developers within his first five years of work. He was richer than the average graduate too.


I worked 20 years in software and publishing with no degree. In my experience, no one cared.

I did not see any correlation between skill level and degree. Programming is a craft, like carpentry. You learn it by doing it, by doing it with other people, by thinking about it and talking about it, a little bit by reading books about it, and almost not at all by watching lectures about it.

That said, people who don't have the kind of experience with programming to tell a good programmer from a bad one have to resort to proxy measures like credentials. I usually worked for small companies, where this is less of a problem.


But a carpenter is not an architect. The barrier to entry for pure programming is rather low.

I see programming today as being in a similar state to auto mechanics in the 1970s and early '80s in the US. If you had good sense and aptitude, you could be one, and the barrier to entry was very low.

Now it's different. You (for the most part) need some sort of certification to be a "grease monkey"; and even then, you are fixing, tuning, and rebuilding engines, not designing them.


If the company is great, but HR acts as a filter, skip HR. Contact the department you're looking at directly and be prepared to wow them.


Most job ads I see say "Bachelor's degree or equivalent experience". In those cases, definitely apply even without a degree. And even if they don't, apply anyway. A lot of the time, stuff gets put in job descriptions by copy and paste, or because it's common, not because it's an actual requirement. Yeah, HR might filter you out for not having a degree in some cases, but you'll get past them to the hiring manager in other cases, and then you have a chance to sell yourself.


It all comes down to the energy required to make a decision. A degree acts as a quick filter. For example, having a degree in CS does not mean that you are a good programmer, but what I, as an employer, know about a person with a degree is that they possess a certain level of knowledge, and they have also proven themselves and shown that they can focus and accomplish something.

If you think about it, those two things I listed are incredibly important. When you're swamped with 100 resumes and you can interview maybe 10 people, these little markers help the employer make a decision. Now, that doesn't mean that a person without a degree doesn't have a chance... it just means that they will have to prove themselves in some other way (being a famous coder/blogger/speaker helps there)...


Actually, I am inclined to believe that work experience is a better filter. If you are hiring freshers, then ask them to build something in a day or a week and use that instead.

"Now, that doesn't mean that a person without a degree doesn't have a chance... it just means that they will have to prove themselves in some other way (being a famous coder/blogger/speaker helps there)..."

Actually the onus is on the employer. Shouldn't you want to hire the best?


The onus is not on the employer. The employer's job is to fill the hole with the best that they can easily ascertain. I can tell you from doing hiring in the past that people have gotten very good at filling their resumes with bullshit.

The last time I hired someone for a Senior Developer position, I had to filter through almost 500 resume submissions. Of those, I did a telephone technical interview with about 200 applicants. We did face-to-face interviews with the 6 applicants who passed an incredibly simple technical interview - things that any programmer who has done any real work would almost be insulted by.

Any available filter to help sort the wheat from the chaff is helpful, and a degree is one of those filters.

That being said, I do not have a degree in anything; I got into the industry at 17 and haven't had time to go to school. But I've been able to get far by working harder than everyone else. In the conversation of degrees making a difference, I'd like to have one, sure, but I've never needed one. I think it's come up in one interview once, and it ended up being a non-issue.


fill the hole with the best that they can easily ascertain.

As an employer, and as an employee of other organizations in the past, I can tell you that filling the hole with whatever is easily available lowers standards. While this does happen, it is not something to aspire to.

I can tell you from doing hiring in the past that people have gotten very good at filling their resumes with bullshit.

I agree. But how is this relevant to our discussion? I have not said that the resume is a criterion.

The last time I hired someone for a Senior Developer position, I had to filter through almost 500 resume submissions. Of those, I did a telephone technical interview with about 200 applicants. We did face-to-face interviews with the 6 applicants who passed an incredibly simple technical interview -- things that any programmer who has done any real work would almost be insulted by.

Apparently, your filters aren't effective.

Any available filter to help sort the wheat from the chaff is helpful, and a degree is one of those filters.

A degree may be one of those ineffective filters that you are already using. Perhaps it is just a segmentation factor like gender or age. If all one seeks to do is reduce the number of resumes/candidates one has to meet, then one might as well use gender or age or anything else.

That being said, I do not have a degree in anything; I got into the industry at 17 and haven't had time to go to school. But I've been able to get far by working harder than everyone else. In the conversation of degrees making a difference, I'd like to have one, sure, but I've never needed one. I think it's come up in one interview once, and it ended up being a non-issue.

And see you turned out to be alright :)


> Perhaps it is just a segmentation factor like gender or age.

Young people are more likely to have a degree, so perhaps asking for a degree is covert ageism.


Agreed, people with a degree have shown a commitment to a goal, and achieved it. That's definitely +1 to them, but as for the basic level of understanding, doesn't that become less effective as a gauge when candidates have 2-3 years of commercial experience?


You've mentioned "2-3 years of commercial experience", as if that indicated anything. What most of the interviewers are getting to is that, on paper, almost anyone looks good. Look at the discussions regarding 'The Dead Sea effect' (http://brucefwebster.com/2008/04/11/the-wetware-crisis-the-d...).

To get your degree from a school indicates that you demonstrated some abilities in a selective and competitive environment. The quality of the school tends to imply the level of quality vetting -- getting into MIT is more competitive than getting into ITT. The name on the commercial experience reflects a similar vetting process -- a development job at Apple is more difficult to obtain and maintain than one at a local utility company. Arguably, a job at a notable startup is probably another strong indicator (on the idea that a startup can't waste time on inferior talent); but I wouldn't consider job upgrades at an unknown startup to indicate much (i.e., everyone here could be a CTO or lead developer for their own startup... does two years of being a lead developer for a team of one to three, selected by company seniority, tell you anything about technical leadership ability?).

It is all just borrowing collective intelligence -- using the wisdom of previous crowds to give off quality indicators.


Their goal is not to be perfect judges in individual cases. A crude first pass works fine for them as long as it leaves a lot of good people.

http://www.paulgraham.com/judgement.html


Don't self link, that's called spamming.


Self-spamming?


I call it efficiency.


No one has humor here.


Hear, hear!


How do we encourage true meritocracy? Open Source is one thing that can do this for programming. It enables a programmer's output to be evaluated in detail by all. Blogs might do this as well.


That's the supporting theory. The problem is that this eliminates both the majority of the unqualified applicants and a large number of the very top applicants. Suitable for a big company? Sure. For a startup or more ambitious company? Unlikely.



Dealing with bullshit. That's what it all comes down to.

In the academic world, you invariably run into people who not only have explicit control over your success, but are possibly unreasonable and malicious. The fact that someone has a degree proves that he had enough dedication to stick with it and conquer (or at least endure) the propagators of bullshit.

Anyone intelligent can sit in the library and learn to code; not everyone intelligent can code AND deal with adversity.


Having interviewed quite a few people, I tend to think that there's a correlation between not just a degree, but also the quality of the school where it was attained, and the quality of the candidate -- as measured by his/her professional knowledge and ability. That's not always true. In fact, of the two best coders I know, one didn't bother with a degree, and the other did it offhandedly, graduating after 8 years, having worked full time all along. Nevertheless, when it comes to people I don't know, and given that I don't have the time to interview everyone who sends a resume, a degree serves as an initial filter. If the resume is impressive enough without it, then I'll go ahead and interview, but it has to be really impressive. In short, not having a degree is an impediment. You can succeed without it, but you're putting yourself at a disadvantage.


If a position stated that a degree is "nice to have" or "an advantage", I would certainly still send my CV and hope that it was glowing enough to counter my lack of a degree -- which I would hope it does, through sheer commercial experience gained while everyone else was in education.

But my gripe is that most employers don't even give us that chance.

In a way, that has played its part in my working life, sending me on a path to be independent and start my own company, so I guess it's not all bad :)


Smaller companies where productivity is measured more closely don't care that much. Big, dumb bureaucracies tend to hire based on buzzword counts on resumes and GPAs. You don't want to work at those places anyway.


Part of me wonders what would happen if you just faked some schooling on paper. It worked wonders for the dean at MIT, so if you are smart enough, give it a try.


I'm in the exact same boat: I don't have a degree. I must admit, I worked at a company that hired people with/without degrees, and the programmers that I could relate to best were the ones who had degrees.

I've been thinking about getting a degree, but between working a full-time job and doing a startup I just don't have the time. I have, however, bought myself two excellent books, which I'm hoping will fill the gaps:

* The Elements of Computing Systems (From Nand to Tetris...)

* Programming Collective Intelligence

Web developers who don't have degrees aren't exposed to the fundamentals and fancy algorithms that are beaten into you in CS courses.


"Web developers who don't have degrees aren't exposed to the fundamentals and fancy algorithms that are beaten in to you in CS courses."

That's not true. First off, people without degrees can be dropouts who learned that material in school before dropping out. Second, people can teach themselves that stuff too.


I can believe that someone would have the discipline to sit down with a textbook and teach themselves something. I've done it myself (although I wish I could do it more often). But the problem is that it isn't always obvious what parts are important or why. I found in my first year of university I learned a whole lot more CS than I ever did before, not just because I was being taught at a faster pace but also because I had a better feel for what I should learn on my own.


I don't have a degree either and I have to admit: when I was in my 20s, it bothered me too, as I thought, "I might be missing out on something." Today though, running my own business, I see that it worked out perfectly. In my latest job posting (for a .NET/SQL dev), I wrote "A Computer Science degree is not required but helpful" because I don't equate a degree with good code.

Joel Spolsky talks about this in his book and he prefers a degree or, if not a degree, evidence that the person had to go through some sort of highly selective process.


Thanks. I guess in that case it can be an indicator that the person has some kind of skill.

Although I do have friends that have managed to get a degree in computer science and they still can't code unless they're copying from a book.

Is Joel's book based on employing people straight out of education? Because presumably people who hadn't gone the degree route would have 2-3 years of commercial experience by the time people the same age finish their degrees.


Here's the book: http://tinyurl.com/3kr99k

It deals a lot w/ hiring interns-to-become-full-timers.


Two of the best software engineers I've ever hired (and worked with) did not have college degrees (one only had a GED for high school). But they were both highly talented and came highly recommended from people I trusted.

No, getting a CS degree doesn't guarantee that you're talented; far from it. However, it does show that you have enough aptitude to make it through a CS program, and it lessens the chance that you'll spend my time re-inventing -- or, worse yet, ignoring -- foundational concepts in computer science and software engineering.

Here are some more of my thoughts on the subject of hiring IT engineers:

http://brucefwebster.com/2008/01/10/the-wetware-crisis-tepes...

http://brucefwebster.com/2008/04/14/the-longest-yard-reorgan...

(and thanks to dmv for the link to my 'Dead Sea' post, which is what led me here). ..bruce..


The degree is seen as a benchmark, an easy early filter - you know someone with a degree has had the focus to knuckle down and study a subject in depth for a period of time. Frustrating for those who don't have a degree but have done the same amount of studying, if not more. But that's the way it goes.

You're only 24. If it's bothering you, go get a degree. Otherwise, make sure you produce work that demonstrates your abilities. Check out this article - 'How I got hired by Amazon' http://www.brunozzi.com/en/2008/05/22/how-i-got-hired-by-ama... - a great example of not needing to care whether the person has a degree or not.

I don't have a degree and I ended up working at Microsoft. But it would never have happened if I had tried applying through their recruitment channel, due to the degree filter. I was headhunted based on performance.


I used to do web apps for a hosting company to earn money to help pay for school. (I think earning a degree was worth every cent, BTW.) None of the coders there had a degree, and they were all very good at making everything work. This was a team of 2-3 people on a codebase of 50,000 lines of PHP/MySQL. The owners of the company also didn't have a degree, and they seemed to prefer hiring people without one.

The other posters who have spoken of a degree as a filter which lowers the cost of interviewing are correct. I can think of two occasions over the 1.5 years that I worked there when they hired people who didn't have a clue what they were doing. This was obvious immediately, and they had to be ruthless about firing them. I don't know if requiring a degree would have made this nastiness less likely, but these mistakes have a significant cost for both the employer and the employee.


A degree is proof that you were able to complete something where others have failed, and that you are willing to learn, and learn fast. It also proves that you are not completely dumb, since few people fail at university.

Even without a degree, I think that you still have a chance to get a job if your resume and the projects you have are good enough (or if you ran a business in the past). It probably gets more difficult in the consulting business, as the company sells you (and your resume) to their customers, so it gets difficult to sell you at a premium rate if you have no degrees or certifications.

That being said, if you haven't done so already, I would take a few algorithms classes, because that's something you normally never learn while programming (also a good tip for the Twitter guys... ;))
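
For a taste of what those classes drill in (my own toy example, not the parent's), here is the canonical lesson in Ruby: a hash-backed set turns a quadratic duplicate check into a linear one, which matters once your data outgrows the toy stage:

    require 'set'

    # Naive version: compares every pair, O(n^2) work -- fine for 100
    # items, painful for 10 million.
    def duplicates_naive?(items)
      items.combination(2).any? { |a, b| a == b }
    end

    # Set-backed version: one hash lookup per item, O(n) work.
    def duplicates_fast?(items)
      seen = Set.new
      items.any? { |item| !seen.add?(item) } # add? returns nil on repeats
    end

    puts duplicates_fast?([1, 2, 3, 2]) # => true

Nothing here is beyond self-teaching, of course; the point is just that an algorithms course puts this kind of analysis in front of you whether or not you were looking for it.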


I've consulted for a lot of years without a degree. I don't think it would be possible without a massive stack of recommendations and success stories.


"It also proves that you are not completely dumb, since few people fail at university."

Doesn't add up -- what exactly do you mean?


Sorry, I meant that many people fail at university, so finishing proves that you are somehow more capable of learning. "Few" was not well chosen.

However, I have observed in recent years (at least here in Switzerland, and other people have told me the same) that it seems to be getting easier and easier to pass university, as there is more and more pressure from the state (which finances the universities) to let more and more people pass, due to the lack of engineers.


Lots of people have probably read these, but Steve Yegge wrote some great posts about interview loops on his old Amazon blog.

http://steve.yegge.googlepages.com/five-essential-phone-scre...

http://steve.yegge.googlepages.com/why-phone-screens-matter

http://steve.yegge.googlepages.com/what-you-need-to-know

Here's an excerpt from that last one that is on point:

"My point today is that although I don't feel you need to have a Computer Science degree in order to be a good developer, I do believe that there is a set of learnings, which you normally acquire in the course of a CS degree, that I would consider to be part of the core set of ideas that are 'software common sense'. If you don't know them, then I will feel that you lack common sense.

"In our CS degree, we saw a lot of proofs, and had to work through many of them by hand. I feel that this was to some extent unnecessary, since it obscured the importance of the things we were proving by putting us all to sleep. Being able to prove something does allow you to reason through it to assure yourself of its correctness, and to derive it from first principles if you've forgotten it. But in everyday programming, you don't need to do proofs. You ought to have a feel for the outline of the proof, the sort of intuitive derivation of a rule, so you can reason through it with yourself and with others. But you don't need to be ultra-formal about it unless you happen to love proofs, as some folks do.

"Instead, you need to be aware of the major findings and learnings of Computer Science, and the rough reasons for them. I realize this opinion is going to displease two audiences: CS theory folks, who will think it's too weak, and professional programmers with no CS degree, who will think it's too strong. But it's my opinion and I'll stick with it."


There are opportunity costs either way. When I was your age (I am now, on one leg), I didn't have my degree. After my father's premature death, I went to work to help support my mother and two brothers. I was working in videotex--this was before the WWW. (You might be amused to know that one of the prototypes I worked with was written in lisp.)

While I was always able to find work, it certainly was more difficult finding employment before I finished my degree. It raised questions--the rhetorical kind (interviewers would raise them but weren't interested in the answers).

Now that I have my Ph.D., it's more difficult again.

The point is that timing matters. I don't have the same choices now--and that's without having a family to support.


Just curious, how did you teach yourself programming?

To me a degree would not so much signal skill in any particular programming language as a general skill at learning new stuff, which seems to me to be the most important skill of a programmer. A university degree seems to indicate that ability to some extent. Sure, if you taught yourself programming, you have learned something by yourself, too. But I think it is a lot easier to learn one particular programming language than to have a general understanding of computer science, be able to write things down, learn things you are not as interested in, and so on.


I "caught the bug" when I was 14 and messing around with mirc scripting with some IRC buddies.

I scripted a bunch of stuff for controlling MP3s, and a few socket bots to protect from channel takeovers.

Then I wanted an MP3 player more like iTunes is today, and so I set about making one in VB.

From there I progressed to games in VB and Java (Space Invaders, etc.), and dabbled a bit with 3D programming.

Moved to the web: HTML, JS, followed by PHP, then some JSP/Servlets and ASP through a work requirement. Learned SQL Server and databases in general from an MCDBA who was also an MCT.

Messed with C and Linux, borrowed some books from the library and read every page, lost my way a bit and learned about creating exploits.

Then got back on track: some C++, some ASP.NET/C#.

And the list goes on. My language and framework of choice is now Ruby on Rails.


And what is the reason you did not want to study? Just trying to understand... Obviously, it is a money issue (earning 3 years vs paying 3 years). But if you enjoy the stuff, wouldn't it be natural to study it?

For example, personally I am bored by now with web programming, and university really wasn't so much about practical programming. I liked the theory, complexity of algorithms and so on. In your self-taught curriculum, did you ever come across those kinds of things? Do you/did you care? If you are truly interested in that stuff, but don't want to spend the money, why not do it in a remote learning course or something? I mean, you want to learn the stuff anyway, why not get credit for it?

Obviously it is better to create a new distributed hashing algorithm by creating something like bittorrent, rather than doing theoretical work about it at university. But university is not so bad, and some results from academia do spill over into the real world.

Just today a friend told me about a program he got to analyze that did user authentication on the client. It is amazing to me that people can be capable of programming such Java clients, yet be unable to understand the security issue. They can program, yet they cannot program. Maybe it is because it is so easy to learn programming, and bad code actually runs too, that employers like to see some additional verification like a CS degree (and of course a CS degree is no guarantee that stuff like that won't happen).
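
To make that example concrete -- a deliberately broken sketch, in Ruby rather than Java, with invented names -- the mistake is running the check in code the user controls:

    # BROKEN: the secret and the check both live on the client, so any
    # user can read the source (or patch the comparison) and walk in.
    CLIENT_SIDE_PASSWORD = "s3cret"

    def client_side_login(password)
      password == CLIENT_SIDE_PASSWORD # decorative, not security
    end

    # Sounder shape: the client only ferries credentials; the server,
    # which the user cannot modify, renders the verdict.
    require 'net/http'
    require 'uri'

    def server_side_login(username, password)
      uri = URI("https://example.com/login") # hypothetical endpoint
      res = Net::HTTP.post_form(uri, "username" => username,
                                     "password" => password)
      res.is_a?(Net::HTTPSuccess)
    end

The broken version runs and even "works" in testing, which is exactly why someone who can program but never learned the trust model may not see the hole.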


Freedom to do what I want to do, I guess. In the UK there's a 5-year gap between finishing secondary school and finishing university (2 years of college in between).

During that 5 years, I've lived in 3 completely different locations within England, and lived in another country for a few years too. I've worked for big companies, small companies, startup companies and now myself.

It was a personal decision that I took, and I don't regret it one bit. I'm not even in the market for a job; I just had an old annoyance come back when I noticed a few startups on here requiring degrees for their jobs. I thought startups would be different from the usual useless HR departments, in that they should be able to tell good developers from bad without filtering on whether they have a degree or not.


I wouldn't let it stop you. I've applied for and been offered numerous jobs which listed a college degree as a requirement (I don't have one). It helps to have a good resume. And by that I mean one that not only has lots of strong relevant job history, but also one that is well presented/written and really shows off your strengths, while minimizing things that might be liabilities (like the lack of a degree). For instance, my resume doesn't even have an education section.

So don't let it stop you. Also, as other people have mentioned, if you can duck HR, that's great too. If you don't know anyone who works there, find one of their tech folks on a forum somewhere, and work your way in from there. Most places give employees hiring bonuses for referrals, so most people will be happy to help you get in the door (assuming they think you're good enough to want to work with you).

Personally, having worked at many places, I generally find folks with CS degrees to be liabilities. Especially if they're relatively fresh out of college. Granted I don't work in areas where you need lots of knowledge about machine learning, or AI, or... I do Java-based web application development.

I've found that most CS programs, by nature of the need to identify topics and build curriculums and supporting materials, are years behind real-world technologies. I'd rather hire/work with someone who had spent the last four years building cool things with the latest and greatest technologies; who knew the real-world tips, tricks, and pitfalls of a given approach/language/toolkit/framework -- rather than someone who'd spent the last four years in a classroom or a frat house, learning technology that is 1-6 years out of date, and who didn't have hands-on experience with how to handle a Slashdotting, how to run a secure world-facing server/postfix/whatever, or the fact that Firefox treats XML response types very strictly and breaks on Google-inserted AdSense JavaScript unless you force the type to html (while IE and Safari don't care).

Don't get me wrong, there are some smart people in CS programs, or coming out of CS programs, but based on my working/hiring experience, in general, a CS degree is a slight liability on a resume from my point of view.

Real World Experience > College Degree.

And frankly, for the type of work I do, Real World Experience + College Degree usually == Real World Experience.


When you were 16, did you know that employers use a college degree as a coarse filter when trying to get a manageable applicant pool? It would seem that your answer was most likely one of two. You were either ignorant of reality when making a huge decision, or you were the type who wants to fight reality because it felt like injustice and you 'knew better'. Of course at 16 we all 'knew better', and if that's how it went down I'm sorry you didn't get better advice... but it's a personality trait that some folks don't grow out of, and if you've ever managed a team you know they aren't the ones you want to have around.

Anyway, this is not to make any sort of judgment on you personally -- you have probably done more awesome things than most 24-year-olds and may be nothing like what I described -- but you asked: when an employer sees a resume without a degree, that's a snap-decision thought process they might go through.

The situation actually isn't that bad... most of the best jobs aren't "resume" jobs, they're "hey I know a guy" jobs. Networking is much more important than resume to getting a good job - especially at startups!


I used to think that a computer science degree was not useful. However, can you answer the following questions:

1. How would you program a compiler for a new DSL?

2. What is stdcall? How does it relate to the stack?

3. How many registers does a 486 have? What is a register?

4. How does ethernet work?

5. What is the most efficient way to break a project down into tasks and milestones when dealing with a large team?

6. If you break a computer which you bought, but has not yet been delivered, who is liable?

7. How would you mathematically model a computer-controlled temperature controller system?

I know the answers to those questions because I learnt them along the way as I got my degree. I have a wide spectrum of knowledge which is not directly useful to programming, but is useful to me as general knowledge.

When you learn by yourself, you tend to learn enough to do what you need to get done. In school you learn things that are not immediately obvious that you need them.


"When you learn by yourself, you tend to learn enough to do what you need to get done. In school you learn things that are not immediately obvious that you need them."

No, no, no. YOU tend to do only the minimum needed to get things done without outside pressure. Don't project your flaws onto us self-teaching people who don't share them. Any autodidactic person will constantly be learning both generalist and specialist things. Just because you don't have enough fire in your belly to do more than the minimum learning when you're on your own doesn't mean anything about anyone else.

(EDIT: Sorry, not to be too harsh on you. Your quiz is relevant. Of course, if the OP can't answer questions like yours in his field, he isn't the kind of autodidact I'm defending. But if he can (and given that many self-teaching people can do things like that easily), your point about learning just enough to get stuff done is more a personal thing than anything about autodidactic people.)


The thing is this: there are some things that one does not know that one does not know. And only in a formal teaching environment is one introduced to these topics. Most schools give a healthy dose of general knowledge together with the domain-specific knowledge. And most self-learners tend to stick to domain-specific topics.

I don't know how self taught you are, but 100% of my income comes from things I learned myself and applied myself. I spent $0 to learn those things, and now I make, well, a fair bit more than $0.

In spite of all I learned by myself, when I went back to school, I discovered things I did not even know were useful to me.

I'm not saying that a person is flawed because he is not aware of some specific area. He just is not aware of it because it's not something he directly requires. What Web programmer needs to know about register transfers? Well, if he went to do a CS degree, this knowledge would be forced on him, making him a better programmer in general!

Civilisation only exists because of the institution of school. Don't bash it, it's a very good thing.


"only in a formal teaching environment is one introduced to these topics."

That looks like dangerous thinking to me. I guess it's comforting to think that what you never heard of in your "formal teaching environment" doesn't exist or doesn't matter. School can only expose so many topics to you. It can go deeper than necessary into some topics that won't be relevant to you and completely ignore others that would be.


I went to college and was a semester away from graduating.

Here is my education history. I studied music and philosophy, not computers. Anyway, I was self-taught enough that in my senior-year contemporary philosophy of mind class (which I had studied independently, because it was my interest before the class was offered), the professor asked me to proofread and critique the final rather than take it. I never took notes, but I could correct other people's notes from memory. I also taught myself composition and music theory up to the 400 level before dropping out of school. In high school, I got perfect scores on AP tests for classes I skipped almost every day (I got 15 college English credits for a class I failed). I did knowledge bowls and was the knowledge bowl coach's TA, where he gave me old questions and encyclopedias on various topics, and we got 2nd at state (we were a poor public school). I had more AP credits than the salutatorian and valedictorian, but I sat in the back with the pregnant girls and delinquents at graduation. I don't think I learned things from class; I think all of that was self-taught. I always got in trouble for reading ahead or hiding a book.

School is, in general, good. But for certain bright people, putting them in classes with people of normal intelligence is exactly like putting normal kids in classes with severely mentally disabled kids, and limiting the normal kids' potential to the mentally disabled kids' potential. On balance school is good, but there is a class of people for whom school is devastatingly confining and intellectually restricting. Depending on early experiences, these people are going to hate school early, become self-taught college dropouts, or never go to college (who can blame them?) and teach themselves skills like this guy is saying. Because these people's dropout status is related to their high intelligence, excluding them with barriers based on degrees is a great way to protect yourself from some of the most creative, intelligent, original thinkers our planet has to offer.


I believe in people who execute. The tasks you have to do in a standard job are for the most part VERY similar to those in school. Little creativity is required: just follow the rules and deliver. People who cannot finish a college degree may be geniuses, but what use is a genius to a big company if the genius never finishes a project but gets bored after a few weeks?

University is a free place. Come and go as you please, just write the exams. Those geniuses should have no problem doing that, no?

A company needs a good mix of the following:

1. Slow and steady workers

2. Charming and friendly people

3. People with clever ideas who know how to explain their ideas properly

Some helter skelter genius with strong opinions and a low ability to complete projects is exactly the wrong type of person for a company.

And exactly the right kind of person to start their own business.


Not going to university was an option; I made a decision, got on with it, and I haven't lost anything in the pace of my career as a result. If anything, it's instilled an attitude that has prepared me well for running my own company -- the ability to think differently and break the rules.

Therefore I don't see it as failing to complete a project, or as a lacking ability to execute. When you don't go the normal path, you're not guided through: the route isn't clear, you have to make decisions, and you have to take action of your own accord.

Does that not demonstrate an ability to execute?

I have always encountered jobs on the standard job market that require degrees, and I just thought that was how it was. I found other jobs, and made my way despite that.

The reason for starting this thread was because I thought people starting their own companies, especially many of them being from a technical background themselves, would have a different line of thinking..


It's a false assumption that not going to college means a person can't finish something. Fact is, MOST people don't go to college, and many of them skip it for financial or family reasons. College isn't like high school; it's not available to everyone for free. Many, many smart people skip college, and that's not going to change any time soon.


Of course I don't think that people who don't go to college can't finish something. I ABSOLUTELY would hire a person who did not go to college if he was qualified. But a person who drops out of college because he felt the classes were holding him back is, for me, a person who cannot complete stuff.

If he dropped out to execute something (like starting a business), then it is a major plus point. If he dropped out to go work in a dead-end job, or to become an 'artist', then this is a sign of a person who cannot complete projects.

I mean, why would you drop out of college to go earn $3000 a month when you could just finish college and earn more than that?

I would NEVER discriminate against a person because they had financial trouble going to college. But I would see it as very negative if a person complained about college not being the right thing for them. Learning is learning; there is no right or wrong way. Everyone does it his own way.


Then you don't like people who can't finish things; stop equating that with college -- they are unrelated. Successfully completing college does not mean you are good at finishing things. It may be an indicator; it's certainly not a qualifier.

Many jocks finish college while barely being able to read; many students cheat their way through, or barely make it through with low grades. Going to college is much more an indicator of your social values than of your skills.

Most people tend to go to college because they're expected to; it has a lot to do with the values your parents instill. Many people skip college because it never occurred to them that they should go; their families simply didn't instill those values in them.

College is nothing more than a place where one can learn; it does not mean one does learn. People who enjoy learning don't ever stop learning, and self-learning eventually becomes a requirement for all who want to continue to grow. College is nothing more than a kick start for a minority of society; there are many other paths that are just as successful.

Many of the most successful companies in the world were started by college dropouts who realized school was getting in their way. Classrooms are good for teaching the masses; people who learn faster than average will absolutely feel like they're being held back, and turn to self-directed learning as the superior method it is.


Because these people's dropout status is related to their high intelligence, excluding them with barriers based on degrees is a great way to protect yourself from some of the most creative, intelligent, original thinkers our planet has to offer.

Thank you LPTS, that's pretty much exactly what I wanted to say. A lot of people have gone off on tangents discussing the merits of a degree, where I actually agree with them for the most part.

The point I was actually trying to make, you've conveyed very well. Don't judge our intelligence or ability on such an unrelated subject as having a degree or not.

Hopefully a few people will have read this thread who will be employing in the future, and the information contained may influence their decision when they're deciding to write the "Degree required" line on their job advert.


I think you misunderstand me. I do not learn exactly enough to get things done. That would be impossible. If I am asked to program a web app in PHP, I would study PHP, databases, and web server configurations, and have enough knowledge to develop a web app. I would not start by studying design patterns, software modelling techniques and abstract functional programming.

That's the difference between self learning and structured learning - most of the time people learn things specific to the topic they are interested in. If you are trying to claim otherwise, then your worldview is deeply flawed.


As a self-taught hacker, this just isn't true of me. I'm constantly reading about things that I have no immediate use for. In fact, one of the reasons I dropped out of my CS undergrad program was that I felt like its focus was too narrow.


So what stopped you from absorbing all of the narrow focus and then discovering more by reading on the side?


There's nothing wrong with doing things that way. I know a bunch of smart people who did that.

But if you're going to be teaching yourself on the side anyway, part of the value of being in school in the first place is reduced. Depending on your financial situation, it might make sense to just teach yourself full-time, or to do your reading on the side while working.

I hated school. After a while I decided that I was better off without it. Ten years later, I still feel that way.


(I don't mean to sound harsh; you sound smart, and what I wrote has the wrong tone.)

"most of the time people learn things specific to the topic they are interested in."

Your structure is wrong. It's not most of the time all people. It's most of the people all the time. An autodidactic person (admittedly rare and an outlier) will have already studied design patterns or whatever, because they read obsessively and learn best that way.

I'm not trying to claim most people don't work that way. I'm saying there is a group of outliers who have both an ability to learn way better and faster than school can teach, and who drop out. These outliers are both highly desirable employees and cast off by the system. I think that normal people are projecting their own inability to self-teach well at a college level onto the freaks who have this ability.

If you think there isn't a group of outliers who are better off self-teaching, as able as any PhDs, and who drop out or don't attend school because it is stifling to their natural urge to learn, your worldview is excluding a lot of the most talented and creative people on the planet.


What's so freaky about teaching yourself?! Learning college-level material isn't some mystical, magical thing that requires the professor who holds the keys to teach you -- unless the professor is the only repository of the information and no textbook exists for the material. Once you start understanding the vocab (and stop wasting your time by not reading the best books for your topic), it's just like anything else. Hopefully you can find a professor/writer who isn't horrible at writing and has learned that less is more. If not, you'll figure it out.


The key here is to see it from the opposite end -- from the potential employer's point of view. We (I am a programmer, but I do a lot of interviewing and general recruity stuff) get metric raftloads of resumes, interest emails, and so on. Most of them are pretty unexceptional to begin with, and most candidates also lie^w exaggerate.

In order to find the people you want to hire, you have to trim down the ones you talk to by some means. Think of it as a heuristic-based search -- it won't be perfect, but it can be very good. You take the traits which correlate highest with success (I will leave success nebulous on purpose; that is a whole other discussion) and start whittling down the search space. A college degree tends to correlate highly with being a good or great programmer, therefore it is frequently used. This is a simplification, of course (though I am sure there are shops which really do this), but there it is.
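
A minimal sketch of that whittling-down in Ruby, with invented traits and weights (picking the real signals and their values is, of course, the whole debate):

    # Hypothetical resume heuristic: score cheap-to-check traits, then
    # spend expensive interview time only on the top of the list.
    WEIGHTS = { degree: 2, years_experience: 1, open_source: 3 } # made up

    def score(candidate)
      WEIGHTS.sum { |trait, weight| weight * candidate.fetch(trait, 0) }
    end

    candidates = [
      { name: "A", degree: 1, years_experience: 2, open_source: 0 },
      { name: "B", degree: 0, years_experience: 6, open_source: 1 },
    ]

    shortlist = candidates.max_by(10) { |c| score(c) }

Like any heuristic, it prunes the search space cheaply -- and, as this thread keeps pointing out, it prunes some false negatives along with the noise.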


An employee who is 2+ years from a bachelor's degree is a long way from finishing while employed, and therefore there's a high chance they will leave to attend school full-time in the near future. Companies also justify such a hire by paying less than they would somebody with a degree, on principle, and are therefore always afraid you might find a higher-paying job or start college. Finally, a college-educated person will likely have debt to pay off, and is likely to stay with the company for 1-5 years after acclimating to a paycheck in the first 3-6 months.


In my experience, even if companies ask for a degree, if you show persistence -- call and introduce yourself, walk in, or network your way in -- they become more open to giving you an interview.

Basically (in my experience) you have to show a genuine interest and pursue them at first, kind of like a woman/girl/chick/etc.


I think it has to do with red tape surrounding the hiring process. There's an essentially infinite number of traits a potential employer cannot ask about (like intelligence), but a degree isn't one of them. So degrees matter because they are the only filter HR departments are allowed to use.


Many companies take their character from their founders and hire accordingly.

For instance, Microsoft focused on hiring high-IQ people who went to good schools, because that's where Bill Gates came from.

Google, same thing: high-IQ individuals with postgrad degrees, hence Google looks to hire people who fit this mold.


A degree signifies that someone knows what they don't know. I am quite competent in many technical areas, but my formal education pushed me into areas where I'm not comfortable and would have avoided if it was up to me.


There is no such thing as an autodidact, really -- untraditionally taught, perhaps, but even if you learn at home and not at a university, you still probably read roughly the same books, and you talk to a lot of people on forums.

I guess if two people have the same skills and one is self-taught, then that says a lot more about the self-taught person.

I personally have no experience in hiring people, but I could imagine that a lot of idiots who think they are the shit (but have a horrible lack of understanding of the fundamentals) would apply.

Wouldn't most jobs that ask for a degree consider someone who has a really proven track record?

I mean, at least for web development...


A degree proves that you can learn.


Why the third degree?


I think my original question clearly states why. I would like to know why this is the case with such a large percentage of the people employing developers.


do you know Ruby? let's talk.


Well, for what it's worth, I agree with you that engineers without degrees can be awesome. I've interviewed a couple hundred engineers and reviewed thousands of resumes, and when I see people without degrees I look for really solid experience and/or lots of personal projects that show what they've done/experienced. Often those candidates turn out to be just as good engineers in the long run, and sometimes better, since they've come this far without needing a degree in a college-focused career path.

That being said, I think you're misunderstanding a bit what a degree conveys and why it's attractive to companies.

For one, it gives you a grade against some sort of scale that can be used to measure general aptitude. Even if GPA isn't a score from a truly ideal "test", it's a known quantity and you can roughly gauge what that number means. Yet comparing a GPA of "3.6" against "dropped out of school and studied programming" is really hard to do -- the GPA gives you a rating from an accredited university of scholars about how much that candidate studied, learned and worked. The other is just one person's word, with some rough and hard-to-verify guidance you might be able to get from references (and/or your own research).

If someone were looking across listings to buy a good race horse, it's a gamble for them to take the seller's word (and do their own independent research) that the "horse is wicked fast" no matter how many times you said you personally timed it. The average buyer would be more inclined to go after the documented/well-known winners from public races. Now, we all know that's not necessarily the most profitable strategy, but it is usually a safe one. Those that ignore the other candidates risk missing a "sleeper" that really can shake things up, but that's part of the game.

So your strategy should be to look for those companies that are willing to take those chances, rather than worrying so much about the ones that filter everything by institution and GPA. I also advise you to approach them in some way that shakes up their expectations a little anyway (e.g. sending them your resume printed on rolled scrolls, handing it to them in person at some interesting event, going through a respected employee, etc.). You want them to think about you differently, so give them a reason to think you're a clever puppy outside the mold that they might have missed otherwise. If you only approach them like the other candidates do, it's not surprising that they'd grade you against the same criteria as the other candidates.

It's also important to keep in mind that bachelor's degrees don't just convey "Computer Science". They also convey all of the other non-CompSci courses that you need to graduate. It's a general show of discipline across courses and studies, with one of the biggest being written communication. If you don't have a degree, it would be awesome to also provide examples that you've written (white papers/essays/speeches/etc.) to help show that you're a well-rounded candidate who doesn't just know how to sit behind a computer and code. I promise you that's not all they're looking for.

You also have the fun option of showing them all up and just starting your own business. Here are just a few that didn't make it through college and changed the entire world: http://www.portfolio.com/executives/features/2008/04/14/Bril...

Best of luck to you.


Maybe a degree means you are too willing to accept mindless mediocrity (like that found in our education system), and no degree (but the same level of ability) means the person is both autodidactic and unwilling to accept the everyday mediocrities found in our education system.

So the effect of demanding degrees is that mediocre organizations (which are mediocre on account of, for example, routinely doing very dumb things, like making decisions using tokens of accomplishment that do not reflect actual accomplishments and abilities, instead of doing the hard work of thinking and taking intelligent risks) reinforce their mediocrity by selecting for people who have already demonstrated their ability to be content in a mediocre system. There are lots of studies that show very little to no relationship between grades and intelligence or grades and creativity. But organizations are unwilling to put this scientific knowledge into use.

I would feel more excited about interviewing a self-taught developer who was good than someone who was taught through school, since I value the self-teaching, meritocracy, and self-motivation a good self-taught person would display, and despise the kind of cronyism, groupthink, and gradual relaxation of standards to the lowest common denominator that schools all too often represent.

If finding the right employer is like finding a needle in a haystack, anything that takes away most of the hay is good.


"There are lots of studies that show very little to no relationship between grades and intelligence or grades and creativity. But organizations are unwilling to put this scientific knowledge into use."

I think that's confirmation bias at work. If such an organization learns that a PhD-holder is really good, the reaction will be: "Well, what do you expect, it's a PhD!" But facing a successful self-taught person they'd say: "How did he manage to do that without a degree? Must be luck..."


The only degree that should matter is the quality of programming/coding a person can do -- a degree certificate, which is basically paper, cannot guarantee the quality of a hacker and his/her skills. Society has become way too demanding, and supply and demand are off balance. The situation in India is different than in North America: in the USA/Canada I have seen that you can still survive without a degree, but in India you just can't survive; in fact, every other person applying for a job is a master's degree holder, and the competition over there is crazy. But there are a few entrepreneurs who think outside the box -- for example, the people behind Zoho.com hire poor students from village high schools, train them, and within a few months these high-school students become some of the best developers. This is a must-read interview, "The Smartest Unknown Indian Entrepreneur": http://www.forbes.com/2008/02/22/mitra-zoho-india-tech-inter...


Seems like every job (CS or sysadmin) I see here in the US requires a Bachelor's, which means that almost everyone else I'm competing with has one.


Thanks for the link, that's a good read.


What are you doing right now? I hope smart people like you are always busy doing something creative. If possible, can you post your email, please? I would like to contact you.


I am currently in the process of launching a new product for my business.

You can contact me at: matthewking [dot] yc [at] gmail.com


I can see requiring a degree if you want someone from a good university... Stanford, MIT... that way you know from the get-go that you are getting a smart person.

But in reality, neither that nor job experience really matters. Why? Because a person can write anything they want on their resume and you'll never know that they are BSing


" Why? Because a person can write anything they want on their resume and you'll never know that they are BSing"

Unless you pick up the phone and make a few calls. There are many things that are trivial to check, such as whether someone actually attended a given school or worked at a given company.

Certainly people can and do BS on their resume and get away with it, but it's not because resumes are inherently unverifiable.


How do you know the phone # someone gives you is their ex-boss's phone # and not their buddy's?


By looking the company up in the phone book/internet and calling from there. There might be some cases where endless bureaucracy gets in your way, but in most, it won't.



