
I think the college degree hiring issue is all about confusing a symbol (the diploma) with reality (having the skills). So, Steve's anecdote is about getting hired by someone smart enough to test him and find the reality.

I was in a similar situation to Steve's. I was studying physics, working at a nationally recognized lab under a guy who should have gotten a Nobel, when I got a job as a software developer and realized that was what I really wanted to do with my life.

I used to put down that I studied physics, but eventually I just dropped it. My resume is now one page, with a summary of my skills at the top and a list of the places I've worked taking up the rest.

I find that this is a really good filter. If someone won't hire you because you didn't go to college, you know that this is someone you don't want to work for. They are expressing a prejudice-- assuming you lack a skill based on their own assumptions, probably because they needed college to teach them. Many times people rationalize this by saying "college shows commitment". Well, keeping a job for 4 years shows commitment. Outside projects that are a lot harder than college was show commitment.

The real reason I didn't finish college is that it didn't make mathematical sense. It costs a lot, delays your career and doesn't deliver sufficient value to cover these costs. I think that situation has gotten a lot worse.[1]

So, the right thing to do is look at the education section skeptically. Did they work their way through college? Why did they go? Did they think they were getting more value than the cost?

I hear companies won't hire people without degrees. I see "BS required, Masters preferred" a lot. I never let that stop me from sending my resume, and back when I was willing to work for others (rather than myself) I tended to get interviews, and 4/5 of those interviews would result in an offer or another step in interviewing (for companies that had a multi-step process). I learned quickly to send all my resumes out on one single day, and to have interviews scheduled close to each other, lest I get offers from some companies before I'd had a chance to interview at others.

None of these companies cared whether I had a degree. (And the ones who did probably never called me in for an interview, but there's no way to tell which jobs had already been filled vs. which ones were at companies with that prejudice.)

But I consider that a blessing-- it filters out companies that confuse the symbol (e.g., the diploma) for the reality (e.g., having the skills).[2]

I've met a lot of "smart" people who think they are so smart that they don't realize how much smarter other people can be. This limits their world view. It even interferes with their ability to comprehend or think logically. They let prejudices and ideology get in the way of perceiving reality.[3]

The last thing you want to do is work for a boss who believes his fantasy over reality.

And filtering out the ones who think you're not qualified because you don't have a diploma is a useful tool for that.

Epilogue: I express strong feelings here. I am unabashedly opinionated, but I think it is critical in hiring to hire people who think differently than you. I think it's critical to give the benefit of the doubt, allow a wide variation, and then focus on what's really important: the relevant ability, their capabilities. I think "cultural fit" is often used to exclude good candidates for unfair reasons. I think I'd hire someone I disagreed with all the time if they were qualified (but I haven't put this to the test yet, having only hired someone who disagrees with me most of the time).

[1] I am very willing to hire people with degrees. Even though college is often a waste of time and money, and can show bad judgement, a degree can also show other things-- like the need to spend a couple years finding yourself outside the overwhelming influence of your parents, or the need to figure out what it is you really want to do with your life. Some do it out of a commitment to their parents, because it means so much to them, and I respect that. I don't think that turning 18 magically means someone has figured everything out.

[2] I have found, however, that hackers (i.e., people who taught themselves when they were young) right out of high school are about as prepared for employment as (most) people with CS degrees right out of college. Either way, it's going to take a couple years before they're really productive. Hackers shouldn't go to college.

I am assuming that hackers are generally autodidacts and not the kind of people who need to be trained, while college is for people who need to be trained-- the kind of people who can't just pick up a new language over the weekend, or can't just read a college textbook to get the stuff they hadn't learned otherwise.

[3] In fact, I think that because so many of the people who focus on degrees went to college because they needed to be Trained, they generally don't understand that some people self-train. They don't see the advantage of the autodidact, who will learn things that seem ancillary (e.g., economics) or irrelevant to someone who has been trained.

I think the training in college teaches a narrow way of thinking, or maybe it just doesn't expand the mind, while the autodidacts will expand their own minds.

Companies would be much better off hiring autodidacts and making sure at least one is in the interview loop, to ensure that the trained people don't exclude someone based on their own narrow thinking.

"Oh, your company has written your product in Haskell? That's nice. No, I've never written any Haskell in my life, but I learned Lisp when I was 14, write a lot of Erlang, and pick up languages easily. I'll have no problem picking up Haskell."

I think the conversation above sounds like nonsense to a trained person, because a trained person doesn't "just pick up" a language.

Actually, it's not about just showing "commitment"-- a university computer science education shows me (1) how you deal with a hard problem that you've never seen before, (2) how you deal with being around people smarter than you are, and (3) how you perform under deadline pressure. Your four years at an employer don't necessarily show me any of these things, at least not without a ton of hard, specific questions from me-- indeed, I think that's what Ben was trying to drive at in his interview of Steve. In my experience, it also happens to be true that top software engineers have learned quite a bit from a computer science education (especially when coupled with several years of practice), and when I find bright people who have dropped out of school or studied other disciplines and ask them tough, direct technical questions, I often find that they have wild misconceptions about the way computers actually work.

Now, you might very well be the exception to that rule, but the cost of a mis-hire is so astronomically high to me that you can expect that if I ever interviewed you, I would expect you to greatly outperform a top new computer science graduate -- and I would interview accordingly. This is possible, but not likely: a top computer science graduate not only has the same intelligence and aptitude that you had at 22, but also four years of formal computer science education and some top internships to really send the principles home. Of course, it would be a mistake to imply that you'd actually want to work for me: after all, I'm the kind of employer that won't hire a software engineer who doesn't have a university degree in computer science...

Completing a BSc with a third involves more deadline pressure than shipping, say, two or three video games? Are you sure about that?

Being more general and less snarky: I think that you (personally, as a single hiring manager) can probably get away with "only hire CS grads from great universities with firsts" if you work somewhere easy to hire into (Facebook/Google/Twitter/whatever), because they don't find it hard to find people, just to find ones who are good enough. If (as I suspect from your estimation of the costs of mis-hiring) you're working for a no-name start-up, you're not only cutting out the brilliant people without degrees (and I've worked with many over my career), you're also cutting out the excellent graduates who won't want to work for you because you're hiring a monoculture.

Facebook, Google, and Twitter certainly hire/make offers to individuals without degrees. I know folks who fit that bill and are greatly respected at all three: Wayne Rosing is a famous example. These companies also can't afford to make hiring mistakes and have an extremely high bar.

That said, a Computer Science education certainly improves your chance of getting past the hiring bar. To most it means exposure to topics they wouldn't have learned about had they spent the four years doing web development: what Bryan called "how computers work" (operating systems, CPU architecture, concurrency), algorithms and data structures beyond arrays and hash tables, and advanced topics (distributed systems, machine learning).

On the other hand, if you've spent those four years contributing to FreeBSD, doing game development (and here I mean doing AI and graphics yourself), or working on another technically challenging project such as a web browser or a compiler, it would be a different story.

I have an MS in CSE, but not from a nationally recognized top-tier school. I think I've done reasonably okay as far as professional success goes, but if I had to do it all over again, I'd have transferred to UC Berkeley (or another top CS school) when I had the chance, even if it meant delaying entering the work force by 1-2 years.

>Completing a BSc with a third involves more deadline pressure than shipping, say, two or three video games? Are you sure about that?

Personally - quite possibly. If you did the BSc I know you were at least able to hand in something that met the basic requirements, on time or close to it. If all I know is you worked at Company X and weren't fired, that could mean any number of things, and I'd have to be a pretty good interviewer to figure out which.

If you did the BSc I know you were at least able to hand in something that met the basic requirements, on time or close to it. If all I know is you worked at Company X and weren't fired, that could mean any number of things, and I'd have to be a pretty good interviewer to figure out which.

I don't follow this reasoning. Unless just having the BSc is actually good enough to get hired instantly, you're going to have to figure out if the candidate is good enough anyway. The additional information from the BSc is slim, at the cost of rejecting a chunk of the candidate pool. Just doesn't seem like a very worthwhile tradeoff.

The pool's big enough that other factors (e.g. interview scheduling) are more restrictive. Tossing out half the pool at random really wouldn't hurt. If there's a trivial-to-measure factor that's even 5% correlated with suitability for the position, it's worth looking at only those applications that have it.
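A quick sanity check of that 5% claim. This is a toy simulation, not data: the correlation, pool size, and cutoff are all illustrative assumptions. It shows that filtering on a factor that is even weakly correlated with suitability does shift the filtered pool's average suitability upward.

```python
import random

# Toy model: suitability is standard normal; the "trivial-to-measure factor"
# is mostly noise but 5%-correlated with suitability. All numbers here are
# illustrative assumptions.
random.seed(0)
N = 100_000
RHO = 0.05  # assumed correlation between the factor and suitability

pool = []
for _ in range(N):
    suitability = random.gauss(0, 1)
    # Construct a signal with correlation RHO to suitability.
    signal = RHO * suitability + (1 - RHO**2) ** 0.5 * random.gauss(0, 1)
    pool.append((suitability, signal > 0))

overall = sum(s for s, _ in pool) / len(pool)
kept = [s for s, has_signal in pool if has_signal]
kept_mean = sum(kept) / len(kept)

print(f"mean suitability, whole pool:   {overall:+.3f}")
print(f"mean suitability, after filter: {kept_mean:+.3f}")
```

With these numbers the filtered half of the pool is only about 0.04 standard deviations better on average, so whether that justifies discarding half the applications depends, as noted above, on how deep the pool is.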

Tossing out half the pool at random really wouldn't hurt.

There was a story posted here before about a hiring manager that flipped coins to sort the candidate pile. "We only hire lucky people here." I wish I still had the link...

As a guy still in school, I would say that more than half of the people in my CS classes are complete dunces. I would think that even by selecting for the degree, you're still faced with the problem of needing people to prove their skills, no? Is a guy who graduated with an almost-failing C really better qualified -- at least from a foot-in-the-door perspective -- than a self-taught guy that has the confidence in his skill to ignore the degree requirement and apply anyway?

No. But there's no easy way to tell the difference between "a self-taught guy that has the confidence in his skill to ignore the degree requirement and apply anyway", and an idiot who didn't read the ad properly, and the latter are sadly far more numerous.

You only hire people with degrees? I couldn't afford a degree, that's why I never got one, but have tons of experience. You still wouldn't consider hiring someone like me? Why?

This just seems like you're limiting yourself from hiring people that couldn't afford college, but that are still extraordinarily valuable to your company. Which is just wrong.

I don't know what country you are in, but in the US, they will extend all kinds of loans to you. (That really is the problem). In fact, the less money you have, the better the deal for the loans. At some point in college my girlfriend and I wished our parents had less money because it would have made our cost (which we were paying) so much lower.

Do 1-2 years at a community college, 1-2 at a state college, and you have a 4 year degree at a reasonable price.

I got a merit scholarship, which lowered prices for me quite a bit. Others have noted that although advertised tuition has gone up, the average price paid by students hasn't changed nearly as fast. I.e., the rich people are paying sticker price and the poor are getting subsidized "need"-based loans. The only people getting screwed are those in the middle.

I'm pretty sure there are a lot of people in that situation. Me included.

I bought myself a home when I was 24, so finishing up college wasn't reasonable for me. Instead, I started working on my own projects, as well as learning as much as possible to do the job of me and the developers above me. I've learned a lot in those 4 years (and am continuing to do so). So much so that thinking about going to college gives me mixed feelings.

It's either... 1) Should I spend X amount of money to re-learn the stuff I've already learned, just to get a small increase in pay, so I can pay off the college bills I'd accumulate from the year and a half of training?

2) Should I work on this project (the Bachelor's Degree) instead of finishing up my personal project (that's potentially a great startup idea)?

3) Is an extra $5-10k a year "guaranteed" now better than an extra $500k-10 mil a year "probable"?

I don't know. Perhaps I can compromise a little: finish up slowly and have my cake and eat it too. All I know is, with my personal project, I have an opportunity to do something great. In the meantime, I have a good job and am earning a very good paycheck (well, as far as the rest of the US is concerned).

I feel that I can always go back to school and finish up. In the meantime, I will continue to learn things on my own through my projects and my day job.

I couldn't get a degree because no school would accept me into their programs to begin with. I haven't found it ever to be an issue in the real world though. Companies often come to me these days.

The good news is that you only need one job. For every company that requires a degree, there is one that values the self-learner even more. There is no use in fretting over individual company hiring practices.

If they can hire the people they need by being that selective, why not? It's working.

What's your measure of something "working"? Maybe many companies are failing because their hiring is limited to a specific academic category, whilst startups with much smaller budgets surpass them because they're not limited to anything.

Your company: only academia

Startups: academia + anybody else

With that in mind, it's just plain math that startups have more chance in finding great people.

If you're looking for just another mediocre/average corporate drone, requiring a degree makes sense. The average person with a degree will outperform the average person without one. But if you're actually trying to hire top-level talent, a degree shouldn't matter. Someone who's passionate about what they do and intelligent will be able to learn from experience and teach themselves.

Actually you have it exactly backwards.

The average person with a degree will outperform an average person without one. But the average person applying for a particular job who can get in is about equivalent to other people doing the same. But people prefer people with degrees. Therefore, if all else is equal, the average person you hire with a degree is worse than the average person without one.

But if you want people to perform at the very top level for computer science stuff, then you both want a top intellect and a degree. (However you, as a company don't realistically have the option of top people.)

But the average person applying for a particular job who can get in is about equivalent to other people doing the same. But people prefer people with degrees. Therefore, if all else is equal, the average person you hire with a degree is worse than the average person without one.

I'm not sure it follows that a person you hire with a degree will be worse than the average person you hire without a degree. After all, you said "if all else is equal. . . ." If all else is equal, then the candidates are equally qualified and the presence or absence of the degree made no difference. I don't see how having a degree by itself (which is the scenario you envision) can ever count against you.

If you hire someone without a degree, the most you can conclude is that there was probably at least one person with a degree who was inferior to that individual. The problem is that you don't know how many people with degrees are inferior to this individual. Exact numbers matter in this case; I think the reason people prefer people with degrees is that having the degree is more often associated with the required skills than not having the degree.

It does follow. If you're hiring people who are equally attractive to you, and one is attractive in part because of the degree, then for them to be equally attractive they must be worse on your other desired qualifications - such as demonstrated competence. My claim is that demonstrated competence is more important than the degree. Therefore of those two candidates, the one without the degree usually turns out to be better.

This is actually true of any discriminated-against group (which people without degrees are). On average the discriminated-against group may be worse (for whatever historical reasons), but the ones who are good enough to become seriously considered despite that are actually better than the ones you would consider equally desirable. Therefore, if you're on the fence about a decision, you should prefer the one who lacks the most obvious signals, like degrees.

This may sound like an abstract and weird hypothesis. But it is a testable one. For example see http://hbswk.hbs.edu/item/6498.html for evidence that being willing to hire women into management in a culture where women are discriminated against results in better financial returns for the company that is willing to do so.
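The logic can also be sketched as a toy simulation (every number below is an illustrative assumption, not data): a screener scores candidates on competence plus a fixed bonus for the credential, and "seriously considers" anyone above a single bar. Among those who clear the bar, candidates without the credential had to clear it on competence alone, so they come out better on average.

```python
import random

# Toy model of the selection effect: competence is standard normal and,
# in this sketch, independent of the credential. The screener's score adds
# a fixed bonus for the credential and applies one bar.
random.seed(1)
N = 100_000
DEGREE_BONUS = 1.0   # assumed weight the screener gives the credential
BAR = 2.0            # assumed threshold for serious consideration

considered = []
for _ in range(N):
    has_degree = random.random() < 0.5
    competence = random.gauss(0, 1)
    score = competence + (DEGREE_BONUS if has_degree else 0.0)
    if score >= BAR:
        considered.append((has_degree, competence))

with_degree = [c for d, c in considered if d]
without_degree = [c for d, c in considered if not d]
mean_with = sum(with_degree) / len(with_degree)
mean_without = sum(without_degree) / len(without_degree)

print(f"considered, with degree:    {len(with_degree):5d}, mean competence {mean_with:.2f}")
print(f"considered, without degree: {len(without_degree):5d}, mean competence {mean_without:.2f}")
```

The fewer non-credentialed candidates who make it through, the larger the competence gap in their favor-- which is exactly the "on the fence, prefer the one without the signal" rule above.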

As for your hypothesis that people with a degree are more likely to have the required skills, that depends heavily on the job. Certainly if you're going to hire someone to work on a compiler, you'd prefer people who know how compilers work, which gives a big edge to CS grads. But for general software development, I'd prefer someone who can quote Code Complete back at me over an average CS grad.

(Disclaimer, I have a masters in math, and almost finished my PhD. If you are looking to hire me, and you have an alternate candidate without my academic qualifications who you think is a tossup compared to me, then yes, I am saying that you should hire them instead.)

I agree that demonstrated competence is what ultimately matters. The problem that I had with your example is that you said "all things being equal. . . ." That implied that the candidates were both technically competent, only one had a degree and one didn't. I understand your point that candidates that succeed despite not having a degree are probably better than a lot of people with degrees. But you are making a stronger claim than that; you're saying that discriminated groups that succeed in getting hired are probably better than others that also got in. Now, you did provide a link to a study that was true of women but I would want to see evidence specific to people with degrees versus those without. The problem with generalizing the way you seem to be is that the people you hire without a degree may not have been competing against the people you hired with degrees.

For example, suppose you have two positions to fill that are basically the same. You fill one position with person A, who doesn't have a degree, and you fill the other with person B, who has a degree. Person A may have gotten the position by beating people with and without degrees who were also applying. The same applies to person B. On one hand, you're saying that person B will get some level of preferential treatment because he/she has a degree. But on the other hand, you're saying that person A is probably better than person B in a manner that isn't reflected in the basic competency tests used to hire them in the first place. However, at this point you're comparing apples to oranges, because they weren't hired from the same pool of candidates.

Moreover, I don't know what you mean by "better". You might say that someone that succeeds despite lacking a degree is probably more passionate than an equally hireable person that had a degree. He/she may have more job experience. He/she may have passed with an A+ from the school of hard knocks. It seems to me that the only way to quantify this and aggregate over individual differences would be to compare the long-term salaries of people with degrees versus those without. Pay may seem like a coarse measure of whether an employee is better or not (I should know because I left a job in which I was underpaid for years despite being productive) but it is at least a simple measure of how much a company actually values an employee.

Sorry, I don't have access to a study directly on degrees. If you want that, go hire some social science people to study it.

However my personal, anecdotal experience is that peer coworkers that I've had who did not have college degrees have, on average, been better than ones who did. By "peer coworker" I mean "working with me, with a job title similar to or better than mine". By "better" I mean "impressed me more". My measure of being impressed is what I thought of the quality and quantity of work that I saw them doing.

I have no idea what their salaries were like. There were some that I know were making less than coworkers with degrees, even though I thought that their work was better. Most I never had a discussion about salaries with.

That said, on the whole I would wager that discriminated against workers get a worse salary even if their productivity is equivalent or better. Why? Because salary is the result of a negotiation, one of whose inputs is what your alternate options are. People are not paid what they are worth to the company. They are paid what the company thinks it needs to pay them to keep them happy, and the difference between that and their worth is kept by the company as profit. (Companies that do not act this way soon find that they are not able to make a profit and some time later find themselves out of business...) If some employees have a hard time being paid more elsewhere, then they will often be satisfied with less from you.

This phenomenon is presumably why the paper that I pointed you to measured productivity on the basis of company growth and profitability, and not on paid salary.

But if you want people to perform at the very top level for computer science stuff, then you both want a top intellect and a degree. (However you, as a company don't realistically have the option of top people.)

Then again, there are people (occasionally) like Steve Blank and Ed Fredkin.

I get bcantrill's point. A degree ensures the candidate meets a certain baseline. I graduated as an electrical engineer (specializing in computers) and, in order to graduate, had to educate myself in a lot of the stuff a comp-sci student covers, but with the help of an experienced teacher. There is no assurance I got everything right.

Interestingly, out of the dozens of people I have worked closely with over the past decade, the best workers have been the self-taught (degree or no degree). I have had the misfortune of making quite a few hiring mistakes with "excellent" post-grads with a first in CS or a related field who are totally useless without the university structure around them.

I have also learned that not everyone is lucky enough to have gone to university, for one reason or another; that does not stop them being brilliant though.

There is a lot of snobbery around having a degree (and more so having a Masters these days) which is a great shame.

Would a B.E. make the cut?

I don't want to imply that there's a hard-and-fast rule here -- there isn't. And it's possible (but highly unlikely) that I would hire someone who is entirely self-taught. It's more of a spectrum: if you've done very well in a program than I'm familiar with, I have a great deal of certainty about the kinds of problems you've dealt with; if your degree is slightly different (EE, CompEng) or from an institution that I have no familiarity with, that just means I need to familiarize myself and wade into the specifics -- it's not a strike against, by any means. If your degree is further afield but technical, that will require more validation that you understand computer science; I've seen way, way too many physicists who turn out to be horrifically bad software engineers. If your degree is non-technical, however, then you're going to get lumped into the "self-educated" camp -- and again, it's not impossible that I would hire someone self-educated, it's just highly unlikely to encounter someone self-educated who meets our bar for software engineer.

One of the things that I've noticed in this discussion is that there's been little mention of what your company does (I do know; I believe I'm citing some of your work in my M.Sc. thesis). While, in general, I suspect that self-taught developers can do a lot of great work, I also suspect that the kind of development that you're doing at Joyent would very strongly benefit from formal CS training.

I understand bcantrill's point as a quest to remove uncertainty. He wants to interview people who meet certain basic criteria so he can focus on what makes one CS grad different from another. If he interviews people with more diverse backgrounds, he will have to deal with more variation, and the interviews would consume more time and resources.

But an engineer or a physicist would probably be able to advise him on what he should sing to his hard drives to make them run smoother. I don't think a comp-sci grad could do that ;-)

>> "if you've done very well in a program than I'm familiar with"

That's not proper English grammar.

Bigots sound the same, even when they're degreed.

My philosophy on hiring has always been to hire the smartest people who are still capable of working and communicating well with others. Over the years I have found that I need to focus more on different areas during the hiring process depending on the presence of a degree.

With those who have degrees, I need to ferret out whether they actually like to program or are just in it "for the job." I tend to find that those who have come to programming via another route are more likely to actually enjoy it.

With those without degrees, I need to spend more time ferreting out the "plays well with others" skills. I tend to find that those without a college degree have less experience solving tough intellectual problems as a member of a group-- this is usually alleviated by more professional experience. One other quirk that seems to come up more often[1] is "Smartest Person in the Room" syndrome, the constant need to prove one's superior intelligence. If you get more than one of these people on your team, it tends to turn every discussion or meeting into an unproductive fireworks display of one-upmanship.

My perfect hire is the self-learner who loves to program, has a mastery of the fundamentals of computer science, can effectively communicate complex topics, and has enough confidence in their abilities to stow the ego.

Then again I'm also looking for some beach front property here in Colorado...

[1] - I also see this quite a bit with people who are _really_ proud of where they went to school. This is then usually combined with them somehow working the name of their alma mater into the conversation at least a dozen times during an hour-long interview.

I don't think your comment is totally on point. I've kind of seen both sides, and I think you're understating the value of formal education. I taught myself programming as a teenager, worked at a tech startup through college (getting a degree in a non-CS engineering field), worked as a software engineer after college, then headed off to law school.

There is no substitute for the skills you'll develop early in your career hacking code at a startup. At the same time, after years of working as a programmer, I feel unbalanced. I've had a lot of success at my previous jobs, but ran into challenging problems where I wished I had a few more formal analytical techniques in my toolbox. I can "just pick up" a new programming language, but I haven't been able to "just pick up" type theory.

After three years of education in law, I've come to appreciate what formal education brings to the table: an appreciation for the big-picture principles of a field that are hard to grasp when you're neck deep in code at a startup. Any autodidact could be a lawyer. With practice, doing similar things repeatedly, they could be a very efficient one. But even a mundane corporate bankruptcy can bring up a novel point of law, and it's when you encounter something totally unfamiliar that a formal education really proves useful. I think someone who can draw on the broad principles of the field is at a huge advantage, when dealing with a complex novel problem, over someone who has just seen the pretty specific areas that they've hacked on.

Now, if I could choose between an autodidact with no degree and someone with a great degree and no ability to self-teach, I'd choose the former most of the time. But if I were doing a startup that involved designing the next TCP/IP, I'd probably pick the PhD without any programming skills to speak of.

You can pick up formal analytical tools on your own, you just have to be very good at sniffing out the right kinds of information, which is hard.

EDIT: I'm not comfortable with the accuracy of what I wrote. I'm leaving it for the use of the term "lying culture", which I think is real and is a useful summary term.

Negotiating such circumstances continues to be a struggle, for me. To the extent my opinion has any value, though, I would recommend being on the lookout for becoming part of or involved with a "lying culture" and getting out of such circumstances as soon as possible.

As I stated, in my case I compensated. For me, that seemed to end up being a self-destructive approach.

Several bad personal circumstances perhaps kept me more "trapped" than I might otherwise have been -- if it was not just personal weakness. Regardless, from that perspective, my recommendation is to walk away. The sooner you do so, the less it's going to cost you. And you won't be further empowering those who are shoveling the shit.


I started a longer comment, but I'm going to reduce it to the phrase I used within it: A lying culture.

A lying culture is corrosive, and it is particularly damaging to those who don't lie and aren't interested in the priorities such a culture tends to emphasize. (Personal power, control, and exploitation.)

I've managed to negotiate some such cultures, in good part by finding and connecting with the decent sub-population within them. Sometimes that included winning somewhat disengaged employees over to my cause.

However, negotiating such a culture for a time is not the same as long-term success. I think, in retrospect, it is better to get out. Personal connections can mask, but not counteract, the larger influences.

It is also worth keeping a keen ear tuned to if and when your work culture starts turning into a lying culture.

Those "stay the course", "engagement", et al. memos can be one canary, and if you're paying attention, you'll observe some of the brightest employees -- where their personal investments and risks aren't too high -- jumping ship soon after the scent of such a change begins wafting around.

It is, I think, not just about personal opportunity. It's about a low or zero tolerance for bullshit.

College can also open your eyes to a vast array of scholarly disciplines and help you become a well-rounded, cultured, and thoughtful person. It doesn't always work, and it's not the only way to reach this end, but it's not a bad option.

You can only do so much in a limited span of time. Each person has to optimize for what they hold important.

Not going to college was the best thing that ever happened to my career. I got to spend my "college years" optimizing for business, which put me miles ahead of my peers who went to school. I now earn significantly more and get to work on more interesting projects (by my metric, at least) than my friends who even have PhDs in the field.

But that came with a cost. I missed out on a lot of non-career qualities by not having the college experience. I don't feel either direction is wrong, you just have to choose which is more important to you.

Really smart hackers should go to college, if they want to, but it is hard to find a college that won't be boring for them. There are a few out there, usually small. Don't study computers, it's a waste of time, you already know them. Study something else, like Art or History or Philosophy. Keep hacking on the side (because you can't help it, you're a hacker).

I agree on not studying computers. Happily, despite the odd name, Computer Science isn't actually about computers. Or a science. Even if you "know computers" chances are you don't know computer science. And if you do there's always abstract mathematics :).

Also, I think the advice to go to a small college is exactly backwards. If you're really smart and can teach yourself, go to a large research university. Compared to little colleges, these universities focus quite a bit more on novel research and less on teaching students. The professors are chosen on their aptitude in the field, not as teachers. So you get to work on ground-breaking stuff with brilliant people, but have less instructional support. This is a great compromise if you are something of an autodidact.

After all, the classes are not the most important thing you should get out of a university education. It's working with other extremely smart people, doing novel and nontrivial research and being able to pursue very specialized and advanced topics of interest. And the classes that are the most important--the most advanced ones, naturally--are the sort that aren't offered at little colleges anyhow. For example, I'm going to take a class on program synthesis next semester; I don't think I'd have that same option at a small college, and it's certainly a very exciting topic!

Besides, an engineering, mathematics or CS program at a good research university is more than challenging enough for anyone. This sort of education is a great complement to the average hacker--it ensures you have breadth even in topics you don't like (CPU design may be icky, but I had to learn the basics all the same :P), gives you depth in the subjects you do like and supplies a very helpful sort of systematic organization. It really helps draw connections between otherwise disparate areas of study and gives you a good base to pursue more advanced interests.

I agree completely. The truth is that there are techniques, finely crafted over decades, that you are unlikely to discover through your own hacking. There are algorithms that are subtle and the product of pure academic research. For example, a self-taught hacker probably wouldn't even know that there are methods to build a self-balancing binary tree. In the course of your own hacking projects, you may never encounter datasets large enough and ill-conditioned enough for it to matter whether you used a self-balancing insert algorithm or the naive one that any clever high school student could roll up. How many self-taught guys can code quickly and even bug-free, yet don't know that there are better sorting algorithms than bubble sort? These are the kinds of things someone with a formal education would be aware of, even if they couldn't code the algorithms themselves, or couldn't code them well.
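To make the sorting point concrete (this is my own illustration, not from the comment above): counting comparisons on the same random input shows the gap between bubble sort and a textbook O(n log n) algorithm like merge sort, even when both produce identical output.

```python
import random

def bubble_sort(a):
    """Classic bubble sort: repeatedly compare adjacent pairs, counting comparisons."""
    a = list(a)
    comparisons = 0
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break  # already sorted; stop early
    return a, comparisons

def merge_sort(a):
    """Textbook merge sort, counting comparisons made while merging."""
    comparisons = 0
    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged
    return sort(list(a)), comparisons

random.seed(0)
data = [random.randrange(10**6) for _ in range(2000)]
_, bub = bubble_sort(data)
_, mrg = merge_sort(data)
print(bub, mrg)  # bubble sort needs vastly more comparisons on the same input
```

On 2,000 random elements, bubble sort does on the order of n^2/2 comparisons while merge sort does roughly n log n, which is the difference the comment is gesturing at: you only feel it once your data gets big enough.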

I used to think that just being a good hacker was all it took to be a good developer but the fact is that you don't know what you don't know. There are things that you are forced to learn in academic courses that you may not have chosen on your own but which end up being valuable in unpredictable ways.

This comment shouldn't be in the gray. We, as hackers, shouldn't place as much importance as we do in our college diplomas. Very few of us are computer scientists. Very few of us learned as much in 4 years as you did at your first startup (or even hacking on the side, for that matter).

It's one thing to say that we shouldn't place so much importance in diplomas in the context of hiring. It's quite another thing to claim that "really smart hackers" will be bored if they take Computer Science at MIT, Stanford, Berkeley, Oxford, etc.

I actually sort of followed this advice. When I started college, I was working towards a CS degree. I already knew as much programming as I wanted to learn and quickly got bored with it. Some friends recommended I switch majors to something I didn't know and actually still had an interest in learning. So I switched to networking. Now I'm employed in information security and I keep my coding only on projects I want to work on.

It's actually sound advice (and like all advice, you should think it through fully), if you know something already why pay $80k to get a piece of paper to prove it? Many people are employed in jobs they know but didn't major in.

And that's cool, but it isn't a necessity if you are being employed to do a particular job.

(Because of the glut of liberal arts BAs, there aren't a ton of jobs for "well-rounded, cultured, and thoughtful" - though those might be little bonuses)

I'm going to give some personal experience about why I only hire people with degrees, which I realize is not data. If anyone can think of a way to actually evaluate this at a statistical level, I'd be very interested.

I've worked with a large number of people both with and without college degrees, in the US and abroad. And I've found that it's not the degree that counts -- it's the liberal education (I mean that in the sense of a well-rounded education, and many college graduates are not receiving that, particularly at colleges/universities where they're groomed for a role in software engineering). People with a broad background aren't just more inclined to problem-solve, they're better at it -- they come up with more innovative and subtle solutions, they're better able to manage their own time, etc. Time and time again I see issues being kicked around until they land at someone with a "well-rounded" background, who actually is willing to examine and solve the problem rather than looking at it from a single perspective.

I don't doubt that one can arrive at this state of mind without attending college, or that some who attend college fail to get there. But I want smart people who are able and willing to problem-solve, and in my experience college graduates have a serious edge.

Also keep in mind there are fields besides CS/CE in which teaching yourself is all but impossible because the materials with which to do so are not cheaply available to anyone the way computers are (I'm thinking specifically about biosciences, chemistry, etc.).

I also happen to believe there are myriad benefits to a (good) college education besides vocational preparedness, but I'll spare you that spiel.

You're still making the mistake of equating Computer Science as a college major with software engineering as a profession. The two overlap, but they are very different: the overlap is not large, and mostly consists of "coding".

Filtering for autodidact programmers will certainly find you people who've more strongly focused on programming, possibly on software engineering, than on computing science.

However, when you need someone to tell you how to build a filesystem that's redundant to the Nth degree or when to use a decision tree construction algorithm instead of a naive Bayesian filter for data classification.... you'll need a computer scientist, and you'll damn well call for someone with a degree.

"I have found, however, that hackers (eg: people who taught themselves when they were young) right out of highschool are about as equally prepared for employment life as (most) people with CS degrees right out of college"

I doubt you meant it this way but that's pretty damning - in my experience people with CS degrees (or any degree) right out of college are horribly ill equipped for most things.

The problem with college is that while it teaches some really great stuff, it gives people an over-inflated sense of what they know and what they're ready for.

In that sense they're often the opposite of many high school graduates who might be a little too unsure of themselves because they're too worried about not having a degree.

I suspect it's an accurate statement. Employment life comes with many requirements, many of which aren't taught in school and aren't always easy to pick up when self-schooling, either.

I don't think "ill-equipped" is the right word. You probably have the right "equipment", just none of the experience or secondary skills needed. Neither hacker nor college graduate. One might have more programming experience and the other might have more formal scientific training, but that's just not what is being talked about there.

Are you familiar with the work of Baudrillard?

Not so much. Can you explain?
