
Apprenticeships - Employers Must Get Past Degree Snobbery  - csjohnst
http://codemanship.co.uk/parlezuml/blog/?postid=1053
======
brudgers
Posing apprenticeships as a significant alternative to higher education is
delusional. First, because a true apprenticeship requires a contractual
relationship between the apprentice and the master. That sort of contract is
extremely problematic for modern business entities because it does not allow
flexibility in staffing levels, requires a significant investment in training
(several years), and because indentured service is generally frowned upon and
body warrants are hard to obtain these days -- there is little viable recourse
should the apprentice terminate the apprenticeship early.

In addition, apprenticeship is difficult because it easily runs afoul of equal
opportunity expectations and requirements (in the US). The difficulty in
differentiating between individuals and in determining each person's unique
skill set before they are on the job is the reason all soldiers go through the
same basic training and then are assigned to their specialties (of course one
could argue that military skills are often determined beforehand - but that
is an argument for prior training (college), not against it).

Indeed the significant latitude of the military is an indication of what is
needed to create any semblance of a workable apprenticeship program which
provides equal opportunity on a large scale - an organization where
meritocracy is highly advantageous both to the organization and the
individuals who lead it, extreme prejudice in the enforcement of contracts
(e.g. execution for desertion), and very one sided contracts (e.g.
imprisonment for AWOL).

Modern higher education has grown because it offers such a powerful solution
to many of the problems created by apprenticeship particularly lack of equal
opportunity, exploitation of apprentices, diversion of resources to training
and away from profit making activities, and long term commitments to
particular individuals who may not be suited for the profession.

~~~
gfdgfdgfd
"Posing Apprenticeships as a significant alternative to higher education is
delusional."

This is absolutely absurd and just shows pure ignorance of the economic
realities beyond your grasp. Germany thrives on apprenticeships. You can't
find a job in Germany without having an apprenticeship under your belt.
Students spend between 50% and 70% of their time at a company, while the rest
is spent on traditional education. Apprenticeships are a vital part of
Germany's economy. Dismissing them based on your pet theories is bordering on
asinine.

<http://en.wikipedia.org/wiki/Apprenticeship#Germany>

~~~
brudgers
The German apprenticeship model doesn't scale well to less affluent and less
homogeneous countries (i.e. pretty much the rest of the world) - and it is my
impression that German Hauptschule students don't apprentice in professions
such as computer programming. Then again, if one accepts the idea that it is
o.k. to channel children into career paths at the age of nine or ten and
generally reserve the best schools for the upper social classes, then the
German model might be as successful as you suggest.

~~~
eru
Germany is successful despite the Hauptschulen, not because of them. The
apprenticeships are one of the things that allow Germany to get away with its
awful schooling system.

------
DanielBMarkham
Here's my rough criteria when selecting applicants for technology roles:

1) What have you done lately that is like what I want you to do?

2) What is your attitude like? (past references very important here)

3) Have you taken some test or certification (or can we give you one) that
demonstrates skills in areas we might be concerned about?

From there, perhaps _you can start learning_ , that is, it might be worthwhile
to talk about a position. But all of that factual and procedural knowledge
will be put to the test when you are inserted into our actual environment
where your social skills are going to have as much to do with your value as
your technical skills.

None of that involves a college degree (unless the job duties and environment
mimic the college experience), and it all fits nicely into some kind of
apprenticeship program. Yes, there can be a lot wrong with apprentice
programs, but "apprentice program" is a very, very broad term. The trick is
going to be in the setup and execution of the program.

I freely admit that we apprenticeship supporters wave our hands around a lot
while yelling "apprenticeships! apprenticeships!" without providing necessary
detail. But I really feel that under this rubric is where the eventual
solution will lie. We need to bring education down to be as close to the
actual work environment as possible. We need more rapid feedback loops in
education and more specific, tailored in situ instruction. Apprenticeships do
this.

Note that there is another topic -- the importance of a classic liberal
education -- which I am a huge supporter of. But I think we have mixed up two
concepts: things that directly translate into money for me and my family and
things that make me a better overall person. Both may or may not be important
to a particular person, but by mixing up the terms and lumping them all under
"college education" it has confused the education argument to a terrible and
unnecessary degree. This confusion is what is at the heart of the seemingly-
unsolvable education discussion.

~~~
argv_empty
_I freely admit that we apprenticeship supporters wave our hands around a lot
while yelling "apprenticeships! apprenticeships!" without providing necessary
detail._

A lot of those details could be worked out along the way. I'd be more excited
to see the supporters provide actual apprentice programs than details on how
everyone should operate such a program.

~~~
bbarthel
I disagree. Knowing the details of an apprenticeship program upfront is
critical to attracting and retaining both the "apprentices" and the "masters"
necessary to make a program work. It would be unfair to both groups to have
divergent programs and skill levels being passed off as apprenticeships.

There are plenty of successful models that a program could be based on. In the
US, electricians require 5 years of work with a journeyman + classroom
instruction. Professional Engineers require a degree in their discipline, 2
examinations, and 4 years of relevant experience, usually under a licensed
engineer, before they can get their PE license (it's not called an
apprenticeship but an EIT - Engineer-in-Training. You are expected to learn
from a more experienced engineer who is responsible for overseeing your work
and providing a recommendation before you are licensed).

------
ajkessler
Many employers use a university degree as a proxy to judge whether or not
you're employable, not whether or not you are qualified. As the article
rightly points out, it's easier than ever to get a college degree, both
because there are more universities offering them than ever before, and
because the requirements to get those degrees are lower than ever before.
Obtaining a degree merely shows whether or not you have the minimal foresight
and work ethic required to be admitted to a university, and the even more
minimal determination and resilience to actually get the degree. Thus, many
employers think "If you can't muster a degree, in this lax environment, you're
probably not going to be a good employee."

That said, if you can prove you have that determination, work ethic, and
competence to do the job, many employers wouldn't give two shits if you never
got an otherwise meaningless degree. The problem is, there aren't a lot of
other great proxies to demonstrate those qualities, especially proxies that
would save your resume from getting tossed immediately.

As to the "degree snobbery" thought, it makes a lot of sense for some
employers to use universities as their recruiting system. Think about law
firms that only hire Harvard law grads. Snobbery at its worst, right?

Well, say Harvard Law gets 10000 applicants each year. This group is already
self-selected to a certain extent, because most people who don't have a shot
at getting in don't even apply. Next, Harvard only selects the most elite
candidates (those with perfect scores only have about a 50% chance of getting
in). So, anybody who makes it out of Harvard, even if they're at the bottom of
the class, has already been screened extensively. If an employer just picked
totally at random from this Harvard pool, he's got an excellent shot at
picking a great employee, because the barrel he's choosing from has already
been screened for him. This can hugely reduce the amount of effort an employer
needs to put into making a hire, and so, even if lots of qualified candidates
are overlooked, such a system might still make great sense.

------
dman
I think the idea of setting time aside in your youthful life where you invest
in yourself and expand your horizons is a powerful one. It makes the world a
more interesting place and pushes human achievement. To what extent a
university helps with this might be debatable and will depend on the person
and the university, but I wonder whether an average person would earmark 4
years of their life for learning and improvement if left completely to their
own devices.

~~~
shrikant
Patio11 said it best (re: "do I _have_ to go to college?")
<http://news.ycombinator.com/item?id=1182552>

_edit_ : just re-read your comment, and I realise you're not necessarily
taking sides and merely speculating if the average person can get by without
the investment into college. I agree so strongly I would upvote you twice.

~~~
delinka
If only US society actually did provide four years of education and living
expenses, then patio11 would be correct. Sure, such offers exist but only if
you're poor. Or are the valedictorian. Average Joe gets no free ride for four
years.

~~~
shrikant
He meant a social subsidy, as opposed to monetary. Basically, you aren't too
strait-jacketed as a student, the world is your oyster, you can try out lots
of different things, etc.

There was no mention of a 'free ride'...

~~~
delinka
That's a plausible interpretation of patio11's comment. I, however, think it's
a bit of a stretch given the language of the comment. Perhaps it was his
intent to say what you interpreted, but I'm inclined to disagree.

------
kayoone
Well, computer science is not only about programming. Programming is a big
part of it, and people without a degree might be good programmers, but have
they really gone through all the math and theory by themselves, or just
skipped it altogether? For example, most people without a degree might not
know what Big O notation is, how it works, or why certain data structures
are better than others.
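
To make that concrete, here's a toy Python sketch (illustrative only, with made-up sizes): the theory tells you a membership test on a list scans every element (O(n)), while a set is backed by a hash table and answers in O(1) on average.

```python
import timeit

n = 100_000
haystack_list = list(range(n))
haystack_set = set(haystack_list)

# Worst case for the list: the needle is absent, so every element is checked.
# The set hashes the needle and jumps straight to its bucket.
list_time = timeit.timeit(lambda: -1 in haystack_list, number=100)
set_time = timeit.timeit(lambda: -1 in haystack_set, number=100)

print(f"list lookup: {list_time:.4f}s, set lookup: {set_time:.6f}s")
```

Without the theory, it's easy to reach for the list version and not understand why the program crawls once n gets large.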

So it depends a lot on the work someone is going to do. Relatively simple
programming tasks? No degree required. Working on something that scales to
millions of users or has to run with exceptional performance? A CS degree
would at least tell me the candidate has learned about the theory required
for this.

Generally, it's easier for a good CS grad to get really good at programming
than it is to make a programmer comfortable with all the theory.

Try to get into any of the TOP software companies in the world without a
degree and I wish you good luck (not impossible, though).

I say this as someone who quit halfway into his BSc to start a company, btw ;)

~~~
nicpottier
My experience interviewing says that a great many CS graduates still have no
idea about Big O, or don't really understand hashtables. That's from years of
interviewing quite qualified candidates coming into AMZN, btw.

They might have passed the class and test where that was covered - surely it
WAS covered - but they didn't actually retain it in a meaningful way.

I'm a dropout and worked at AMZN for a few years BTW. I don't think I would
have had any problem working at Google or Amazon if I had wanted to either.

Granted, I probably couldn't have worked at those companies as my FIRST job,
but that isn't the argument. The argument is who is better off after four
years, someone who attended university to get a CS undergrad, or someone who
worked in the trenches.

I'd say if there were more good apprenticeship positions available the latter
would almost always be better.

~~~
_delirium
I agree on the last point. I think this sort of stuff _can_ be learned in an
apprentice-type way, but not many positions currently do a good job of it. A
great motivated way to learn big-O type analysis, for example, is to work on a
system where a bottleneck gets improved from an O(n^2) to an O(n log n)
algorithm, while working alongside someone who explains to you what that
means, how you determined which algorithm was which, and why this is useful
analysis to know how to do.
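
A minimal, made-up Python sketch of that kind of improvement (not from any real codebase): checking whether a list contains a duplicate.

```python
def has_duplicate_quadratic(items):
    """Compare every pair: roughly n^2/2 comparisons, so O(n^2)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_nlogn(items):
    """Sort first (O(n log n)); duplicates then sit next to each other,
    so one O(n) scan over adjacent pairs finds them."""
    ordered = sorted(items)
    return any(a == b for a, b in zip(ordered, ordered[1:]))

print(has_duplicate_quadratic([3, 1, 4, 1, 5]))  # True
print(has_duplicate_nlogn([2, 7, 1, 8]))         # False
```

Both are correct; the point of the mentoring is learning to see _why_ the second one stops hurting once the input grows.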

A lot of non-degree programming jobs seem to lack both the challenge and the
mentoring to make that happen, though, so you end up with people who learn
more about how to bang scripts together out of snippets pulled off the web.
That's actually also a pretty useful skill, especially if you become a very
fast and skilled applier of band-aids, but it's not quite the same as a CS
apprenticeship.

~~~
nicpottier
Good point, and actually one I overlook too often in my own experience.

I went to CMU for two years before dropping out, so I did get a decent
grounding in data structures / algorithms, etc. And I absolutely agree that
that grounding has been important and is hard to get in the trenches.

Maybe we need to be arguing for one or two years of school, then
apprenticeships?

My housemate teaches a one year CS fundamentals course here in Rwanda. It is
essentially the only real CS training available in the country, though there
are plenty of people with degrees from the universities. But he does a pretty
good job of covering the basics. I'm sure he'd confess that two years would be
better than one, but I'm not sure he'd argue for more.

------
csjohnst
I'm of the opinion that a university degree is less about learning the subject
matter, and more about teaching you how to learn. So a programmer who has not
completed a degree may be a fantastic programmer, but one who has done a
degree may have a few extra skills on top of simple programming skills.

I.e. communication skills, analytical thinking, knowing where to look for
solutions, how to ask the questions that improve your knowledge, and the
proven ability to see a project through, etc...

~~~
arethuza
When I completed a CS degree in '88 I remember thinking that what it was
really doing was lining you up to possibly go on to do postgraduate research -
which I did eventually do. If you aren't going to be doing something that is
vaguely like research then I'm struggling to see the relevance of CS degrees
for most development jobs.

Universities are really rather splendid places for research and absolutely
awful at vocational training!

~~~
liedra
I couldn't agree more. Universities _shouldn't be_ for everyone - they should
be for people who are in it for the learning, or for highly skilled research,
not degree factories for entry level positions.

What breaks my heart in academia in the UK is seeing courses being dumbed down
so that students are "happy" in their courses (i.e. not failing) so that the
university gets a good response in the National Student Survey. The other
thing I see is an increasing sense of entitlement - "we pay your wages so you
should pass us". Going to uni is like going to the gym. You don't get fit by
simply having a gym subscription - you have to work at it. Same goes for a
university education. Higher fees are only going to make that sense of
entitlement worse, which makes for more dumbing down... and the cycle
continues.

------
St-Clock
So the solution to the increased cost of education is apprenticeship? Is this
part of the current trend of bashing higher education to make sure that we
become obedient but efficient drones?

There are a lot of things broken in higher education, but saying that hiring a
CS/SE graduate for a developer position is like hiring a theoretical physicist
to repair a car is disingenuous at best.

Guess what: graduate students who write a compiler as part of their course,
who are working on a VM for MATLAB, who are improving IDE auto-complete based
on all sorts of algorithms, who are devising a new distributed merge algorithm
and evaluating its performance through hardcore network simulation - well,
they know how to program! As a bonus, they know how to apply the scientific
method and be rigorous when they report a result or an improvement. They have
been exposed to all sorts of things whose existence an undergrad doesn't even
suspect.

Sure, some grad students aren't good. Sure, people who don't go to
college/grad school can end up being way better and knowing more than grad
students, but don't discredit a degree because you _believe_ that it's too
theoretical. Just ask about the homework, the projects, and the thesis the
grad student worked on.

~~~
absconditus
"Is this part of the current trend at bashing higher education to make sure
that we become obedient but efficient drones?"

No, the goal is to remove vocational training from higher education and return
it to its supposed goal.

The graduate student that you describe is rare. A master's degree in CS
typically means that the person took more random computer courses and knows no
more about software development. What you fail to realize is that nearly
anyone who wants a masters in CS can shop around and find a school that is
willing to accept them.

~~~
St-Clock
"What you fail to realize" <\- ?

First, I believe there is a cultural difference between Canada and the US. In
Canada, a Master's degree is typically not a professional degree and you
usually cannot buy an M.Sc. Typically, half of the credits come from courses
and the other half (often even more) comes from your thesis. I don't know
about the situation in the UK, the author's country of origin.

Second, I TAed and taught programming, algorithms, and software architecture
courses for undergrads, Master's and Ph.D. students so I'm well aware of the
advantages and limitations of higher education. I saw students in an advanced
architecture course who did not know what a thread was or who had never
written a single SQL query. Well, they learned it in my course.

The graduate students that I described aren't rare. At least 75% of the
Master's and Ph.D. students in the SE lab and PL lab at my university match
that description. Maybe they would have a hard time building a web application
in a day, but I think they have demonstrated that they can learn pretty
complicated things and that they will learn how to solve your particular
technical problems.

I felt the article was really about "degree snobbery", meaning that the author
promoted snubbing people with degrees. I understand the frustration of people
without a degree who need to prove every time that they don't need one. But I
don't believe that having a systematic negative bias against candidates with a
degree is wise either. Honestly, does it make sense, as the author says, that
someone with a bachelor's in C.S. doesn't know how to implement a binary
search (see [1] for a possible explanation)?
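
For reference, the binary search in question is only a few lines, though it is famously easy to get subtly wrong (bad midpoint arithmetic, off-by-one bounds). A minimal Python sketch:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Halves the search interval each step, so O(log n) comparisons.
    Assumes sorted_items is in ascending order.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2  # Python ints don't overflow, unlike C's (lo+hi)/2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```

That it takes a whiteboard interview to discover a graduate cannot produce something like this is exactly the phenomenon [1] discusses.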

Regarding the vocational training that needs to be removed from higher
education, I believe a compromise is needed. I agree that you cannot
efficiently learn development processes and all the latest languages and
frameworks at school. But you need to learn some good programming skills and
software engineering practices; otherwise, it is a lot harder to understand
and play with more complex concepts, and it is also more difficult to make a
significant contribution if you do a Master's or Ph.D. later.

[1] [http://www.skorks.com/2010/10/99-out-of-100-programmers-cant...](http://www.skorks.com/2010/10/99-out-of-100-programmers-cant-program-i-call-bullshit/)

------
ig1
Imagine you offered a job position of "software apprentice" and say you got a
thousand responses. How would you identify the ones that would be worth
gambling on? There would be little to nothing to differentiate candidates.

By and large you'll have candidates who didn't do well at school (because
those who do well at school tend to go to university), have never done
anything to show that they're capable of long term commitment, and have never
done anything significantly intellectually challenging.

CS & SE courses at top universities, which get to have the pick of the best
students, still have a first-year drop-out rate of 10-20%.

Any company offering software apprenticeships can expect to suffer huge drop-
out rates with minimal upside.

~~~
ohyes
How is that different from any other job application? The job will clearly go
to the boss's nephew, who is totally great at computer stuff because he plays
so much WoW. (Though it is a well known fact that the best middle management
comes from Eve Online).

Seriously, you will generally want to look for people who are intelligent and
good at abstract thinking. In my biased opinion, you should be shooting for
the best humanities majors (in disciplines like English, music, philosophy and
math), as the targets of such an apprenticeship. Not high-school graduates.

Until it catches on, your competition for employing them will be Starbucks and
grad school which means they will be a steal compared to programmers trained
at Stanford or MIT (TM). A liberal arts major would be pleased to be making
30-40k a year out of school. Also, you can always fire them if they don't work
out.

If it then catches on, you might be able to expand to people who are
interested in making programming a career, then you might be able to catch
some of the

Currently, CS courses at top universities have very little to do with the
business of actually programming; they are based more on mathematical theory.
SE courses tend to look more like 'software engineering management.'

In each discipline, you learn important skills... but they tend to have very
little to do with the craft of sitting down at a computer and turning your
thoughts into working code. (I have a master of science in computer science;
I learned many things that made me a better coder, but the degree didn't teach
me to code - work experience did.)

~~~
gaius
_Also, you can always fire them if they don't work out._

I'm guessing you're in a right-to-work state; in the UK and EU, firing someone
for anything other than gross negligence is very, very hard. Hiring someone
who "just doesn't work out" is a huge risk for an organization that can't
afford to shuffle them off into a "VP of Paperclips" position.

~~~
arethuza
"in the UK and EU firing someone for anything other than gross negligence is
very, very hard"

It certainly isn't difficult to fire people in the UK providing they haven't
been working for you for too long (it's either six months or a year). Other
countries in the EU can be a nightmare though.

------
hamidpalo
Degrees are usually a signal of how well a person is expected to perform. A
degree from a top school with a decent GPA means that not only was this person
selected from among many to attend, but they also managed to get through it
okay. That is why employers ask for a degree in anything for certain jobs.
They may not care about art history, but a degree is a good signal that "hey
this person is smart."

For CS it's a little more practical but a degree is still fundamentally a
signal. As is an active github account, blog, etc...

------
sharmajai
We have to stop pretending degrees are worthless. The way I see it, going to
school expedites your learning process by exposing you to professors and your
peers. It helps you learn best practices which, although generic, save you a
lot of time by avoiding the same mistakes that others (your professors and
your peers) have made. Also, it gives you the focus and urgency to finish your
learning on time.

As an analogy, consider the knowledge accumulated while attending school as
open source software: even if your knowledge (or the OSS) is too generic to be
fully usable for the task at hand, it almost always gives you a big head start
in getting your job done, because it avoids the trivial and non-trivial
pitfalls through years of maturity.

~~~
mattdeboard
You're conflating the degree and the relevant knowledge gained in pursuit of
the degree. They are not one and the same, and I believe that is the point
most people are making. At least, that's what I believe.

The skepticism directed at college education is a result of cost/benefit
analysis. I would gladly take a 2-year associate's program that yielded a
degree in computer science, much like nurses have a 2-year nursing degree.
The four-year degree I'm in pursuit of now (in my 30s)? I'll likely drop out
after getting the discrete math, algorithms and other math-intensive courses I
likely would not study on my own.

~~~
crs
"I'll likely drop out after getting the discrete math, algorithms and other
math-intensive courses I likely would not study on my own."

I think that statement right there is why degrees are still valuable. To earn
your degree you will have to study subjects that an independent learner would
not likely study on their own.

So far in my career, the amount of math I was required to take to earn my CS
degree has been very valuable. I have to use it a lot. (Graphics, GIS).

~~~
mattdeboard
No, that statement is an example of why the information is valuable, not the
degree. There are still a whole lot of stupid hoops people have to jump
through to get a degree.

------
eftpotrm
Agree absolutely.

I'd long ago decided that even with the £3,000 per year fees (which Labour
said they wouldn't introduce in their election manifesto, got an overall
majority, then did anyway - remember that, Aaron Porter and others slamming
the LibDems at the moment), the investment in a degree before a career in
software just didn't seem to add up. Realistically, a £25k debt against a
3-year delay in starting a career - you may well earn less for the first few
years, but by the time you've paid off that £25k, is the degree _really_
going to be a differentiating factor?

With £27k now just on tuition, plus living expenses for three years, what's
the point? Honestly, I learnt more that I use professionally at A level than
in my degree, let alone what I've learnt professionally. Sad to say this but I
would actively recommend against an 18 year old with an interest in working in
IT studying at university, the way things are at the moment.

~~~
nickthedart
Agree with this. With £27k debt for tuition alone, plus other debts for living
costs, in a field that changes as fast as Computer Science, you'd be paying
off this debt long after much of what you'd have learned had become obsolete.
Far better to go straight into work, and study part-time (if employers will
take you without a degree that is, which they might if they realise that 18-yr
olds can be smart and cheap). It'll be interesting to see what 18-yr olds do
in response to these fees. Sadly I fear many may not be clued up at that age,
and will study Computer Science then regret later.

~~~
gaius
_Computer Science_ doesn't change quickly at all. Slightly faster than maths
does, perhaps. Only the fashionable language changes quickly.

I think that many people these days don't understand the difference between CS
and "making websites with RoR".

~~~
nickthedart
I agree with that. But a typical course contains bits that don't change, e.g.
algorithm theory, and bits that do, e.g. programming skills learned from
implementing algorithms, which back in the day would have been in Pascal or
C, nowadays is in Java or Python, and in future maybe some other language. So
taking such a course would have 2 aims - get a good grounding in theory, and
get some buzzwords on your resume / c.v. too. People who don't understand the
difference between CS and making RoR websites sadly include many hiring
managers, right? ;)

------
akmiller
I think it says something about the kinds of employers that look for people
who have spent upwards of 100k to obtain a piece of paper that implies some
arbitrary level of understanding in a given subject - the same level of
understanding that I could get on my own for probably a few hundred bucks.

Don't get me wrong, I'm not against higher education in theory. We've gotten
to a point, though, where it's not about the education anymore, it's about the
diploma. We enforce this idea in our high schools that you can't be successful
without one and as such send our kids in droves to Colleges. We've
artificially made the demand so high that these institutions can charge
whatever they want and the kids are still going to attend and put themselves
(or their parents) further and further in debt.

~~~
andylei
> says something about the kinds of employers that look at people who have
> spent upwards of 100k to obtain a piece of paper

great, you're an employer. for fun, let's say you're google, except you just
ignore the 'education' line on resumes

> The same level of understanding that I could get on my own for probably a
> few hundred bucks

oops, now you have 10k resumes of people claiming they have knowledge of
computer science. but you only need to hire 10 people. what do you do?

~~~
akmiller
Surely I could filter those resumes just like many employers currently do
anyway. First of all, we'd likely be talking about some type of entry-level
position, as education plays less and less of a role when looking for
experienced engineers. So I start with some type of test before I'll even look
at your resume. In fact, if I'm an employer as popular as Google, I may have
more than one level of tests to get through, based on the job, before I give
any resumes a look. It's not hard to test for knowledge in Computer Science,
and I'm sure companies like Google already do to some extent anyhow.

Once you've done the initial filtering it's not too terribly hard to spot some
potentially good candidates. Especially in the field of software development
where it's easier than ever to build products and make them available for
others to see and use.

I'm also not saying to ignore the education lines...I'm just suggesting that I
don't like it as the main criteria for filtering out candidates.

------
timtadh
> Let's face it, the best experts aren't the ones who knows all the answers,
> but the ones who know where to look for the answers.

In many ways I feel like this is exactly what a degree in Computer Science or
a related field gives you. Becoming an educated member of society isn't about
learning the "correct" answers; it is about learning to ask the right
questions. It is about learning that some questions outlive their answers.
Universities enable students to glimpse the horizon of human understanding -
a glimpse of the infinite unknown.

I understand people feel burned by: their experiences at university, hiring
practices of corporations, poorly performing well credentialed hires, and the
cost of education in general. However, let's not toss the baby out with the
bath water here.

@brudgers said it best:

> Modern higher education has grown because it offers such a powerful solution
> to many of the problems created by apprenticeship particularly lack of equal
> opportunity, exploitation of apprentices, diversion of resources to training
> and away from profit making activities, and long term commitments to
> particular individuals who may not be suited for the profession.[1]

[1] <http://news.ycombinator.com/item?id=2899059>

------
imperialWicket
I could not agree more, and although this applies to a broad range of
industries, it is extremely pervasive in software development. (Relatedly, I
have heard a few computer science folks highlight that computer science as a
major is not targeted at creating software developers -
<http://news.ycombinator.com/item?id=1884255>.) True enough.

I stand by my sentiment on that post, that open source projects are a great
'apprenticeship' opportunity for those interested in computer science-like
fields (software/web development and the like). That said, I have participated
and watched at <http://opensource.com> and <http://teachingopensource.com> and
come to realize just how difficult it is to get open source into the
educational system.

Knowing that difficulty, I might categorize experiences in the following best
to worst order for new hires:

1\. Active open source contributor

2\. Active open source contributor w/ non-CS degree

3\. Active open source contributor w/ CS degree

Two additional notes:

1\. These are not meant to be absolutes; there are certainly individuals who
fall into the above-mentioned category 3 that far surpass a category 1
candidate in a particular skill. I am merely suggesting that at a high level,
the _likely_ skill-set available to a category 1 candidate is often more
desirable than the _likely_ skill-set of a category 2 or 3 candidate. A lot
more could be said here, but it is not the point of this post.

2\. While this most obviously applies to software development, it also has a
natural home among technical document authors, marketing, customer relations,
QA, and many other aspects of business that exist and flourish in open source
communities.

------
delinquentme
I've run into this exact issue.

However, when the managing parties have degrees, they seek out justification
for their expenditure through self-serving activities, consciously or
subconsciously.

Additionally, when grad students are willing to work and have been trained in
the shadow of those who are managing, all the gears in the system work as
expected.

The issue comes into play when you've got kids like myself. We are ITCHING to
learn but fail to see the benefit in 4 years of schooling. And most of all:

We know the fastest way to the bleeding edge of research is NOT through the
"tried-and-true" channels. It is most efficiently attained through jumping
right into the fray and working with said researchers.

THIS is what will create the innovation that we seek. The degree system is
simply the easy answer and a way to continue the entrenched system by
providing slave labor.

------
Jach
I'm skeptical that this is going to happen on its own any time soon. While the
legal frameworks are in place to prevent discrimination on sex/race/age/etc.,
I think we should put similar mechanisms in place for formal education.
Especially when so many degrees are worthless as a measure of skill, they've
become as irrelevant to the job at hand as a person's race.

Make it so that employers can't ask for education, just like they can't ask
for age, nor make a degree a job requirement. Of course when an applicant
comes in for an interview, their race and relative age quickly become
apparent, so it's not really a matter of information hiding as much as
removing a more-and-more irrelevant filter. There's also nothing stopping an applicant
from explicitly exposing their age/education/etc. on a resume or during the
interview, and I'd still want to mention an MIT education if I had one. At the
very least you would want to talk about school projects since you may not have
any other experience, but that's up to the applicant. The question is "What
things have you made? How did you do it?", not "Did you take a data structures
course at an accredited university?" I'm not even sure it would create that
much extra burden on HR departments since I hear they're already swamped with
applicants matching degree requirements.

On the other hand, a free market approach may be to just leave it alone and
let the tech companies that require CS degrees, or black people only, suffer
to the companies that care about skill alone. I'm pretty okay with that too as
a practical outcome. The question there becomes really philosophical and
whether you want a big government to slim down in an inconsequential way or
continue its historical path of trying to enforce certain moral directives on
supposedly less enlightened people.

Downvoter(s): would appreciate a discussion on which idea(s) is/are most
offensive to you. There's the additional filter of "this person made it
through a 4 year program and may therefore be determined/have long-term
goals/etc.", but really I don't find that a very compelling or useful filter
for many jobs.

------
aidenn0
The problem with this is that universities get away with screening processes
that would be problematic at a company. For example, screening applicants
based on SAT scores is a legal gray area for a lot of companies, but the
college you went to is a proxy for your SAT score.

I'm convinced that more than half of the value of hiring someone from a top-
tier college is who they admit, rather than what they teach.

Frightening true story: Someone I know at a government contractor startup was
hiring a Fortran programmer. As part of the interview he was giving a simple
written Fortran test. The company lawyers found out and had him stop.
Apparently tests that haven't been vetted for cultural/racial biases are a
potential source of liability for government contractors.

~~~
crs
So very true. At the company I work for (a very large defense contractor) we
have to ask the same questions of every applicant. We can't ask follow-up
questions or deviate from the pre-defined question list. It was deemed legally
unfair to ask different applicants different questions. That makes it harder
and harder to distinguish the good from the bad.

------
shmageggy
_I suspect there are countless CS graduates who can describe Binary Search
theoretically but couldn't hand-roll a binary search implementation to save
their lives_

If this is the case, and I doubt that it is very often, then your CS program
has failed you miserably.

~~~
timtadh
I believe Knuth says, in his section on binary search in The Art of Computer
Programming, that it took a ridiculous number of years from when binary search
was first described until an implementation free of bugs appeared.

I find it highly likely that the majority of _programmers_ would produce a
buggy version of Binary Search on their first attempt. Why? History indicates
programmers often make small mistakes even when writing simple algorithms. A
survey of 26 papers on variations of binary search found that 6 of the papers
had serious errors in the published algorithms.[1] 4 of the errors were
"design" errors; that is, the algorithm as designed had a fault. 2 were
implementation errors (1 in assembly, 1 in COBOL). All of the errors appeared
in peer-reviewed publications. Therefore, even peer review does not always
spot errors in a "simple" binary search algorithm. Why would you expect
recent graduates to do any better?

[1] <http://comjnl.oxfordjournals.org/content/26/2/154.abstract>
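For contrast, a bug-free version is only a dozen lines; here is a minimal
Python sketch of the standard iterative algorithm, where the classic pitfalls
are the loop condition (`<=` vs `<`) and the `mid ± 1` updates:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:                # <= so a one-element range is still searched
        mid = (lo + hi) // 2       # safe in Python: ints never overflow
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1           # exclude mid, or the loop may never end
        else:
            hi = mid - 1
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```

Getting those boundary details right on the first attempt, without tests, is
exactly where the surveyed papers went wrong.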

~~~
Jach
That's fascinating that fewer were implementation errors than design errors,
though "the distribution of errors among families of algorithms is not
uniform" accounts for that, I guess. Though the paper is from 1983...

I scanned through the paper and found no explicit mention of the typical
coding error caused by using M := (L+H)/2 instead of M := L + ((H - L) / 2)
(though the paper interchanges both). So I suspect a re-analysis would find
more coding errors than design errors in languages without arbitrary-sized
integer auto-conversions. My reasoning for that conclusion is based on:
<http://googleresearch.blogspot.com/2006/06/extra-extra-read-all-about-it-nearly.html>
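The overflowing-midpoint bug is hard to trigger in Python itself, where
integers are arbitrary precision, but the 32-bit failure mode is easy to
simulate; a small sketch (the `add32` helper is just for illustration):

```python
import ctypes

def add32(a, b):
    # Simulate 32-bit signed integer addition (C semantics); Python's
    # own ints would never overflow here.
    return ctypes.c_int32(a + b).value

# Two perfectly valid non-negative array indices...
lo, hi = 2, 2**31 - 2

naive_mid = add32(lo, hi) // 2   # M := (L+H)/2 -- the sum wraps negative
safe_mid = lo + (hi - lo) // 2   # M := L + ((H-L)/2) -- stays in range

print(naive_mid)   # negative: an invalid "midpoint"
print(safe_mid)
```

In a language with fixed-width integers, the naive form indexes out of bounds
as soon as `lo + hi` exceeds the type's maximum, which is exactly the bug the
Google Research post describes.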

Of course, given the stories of all the CS-degree wielding applicants who
flat-out can't do FizzBuzz, and the amusing incorrect designs/implementations
of commenters who scoff at the notion of not being able to do FizzBuzz and try
to prove they can but fail, I was already inclined to believe there's a lot of
incompetence to go around. It's not even necessarily a problem with the CS
programs, but there is the danger of learning to recite an algorithm's steps
word for word without knowing what they mean or what the algorithm is useful
for, and then failing with a live use case.
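For reference, the FizzBuzz exercise those stories refer to asks only for a
few lines of code; one minimal Python version:

```python
def fizzbuzz(n):
    """For 1..n: "Fizz" for multiples of 3, "Buzz" for multiples of 5,
    "FizzBuzz" for multiples of both, the number itself otherwise."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

That so many applicants reportedly stumble on this is the point: the filter
measures whether you can program at all, not whether you hold a degree.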

------
CptMauli
A "Fachinformatiker" (for some reason translated as "specialist" by Google
Translate) is something you have been able to become through an apprenticeship
in Germany for some time now (there were predecessors to it with different
names).

see
[http://translate.google.de/translate?hl=de&sl=de&tl=...](http://translate.google.de/translate?hl=de&sl=de&tl=en&u=http%3A%2F%2Fde.wikipedia.org%2Fwiki%2FFachinformatiker)

or, in German: <http://de.wikipedia.org/wiki/Fachinformatiker>

So it is nothing new at all.

~~~
eiji
As a fellow German, but not a Fachinformatiker, I suspect colleagues without a
degree still have a hard time working their way up into more advanced
management roles or into the higher income brackets. I wish that were not the
case, though.

What I find intriguing: although I'm surrounded by "programmers", almost none
of them studied CS! We have physicists, mechanical engineers, mathematicians,
historians, and even English majors.

------
brackin
Many (if not most) startups I've seen look at portfolio and experience more
than at whether you have a degree, anyway. I'd rather hire an 18-year-old
who's got lots of great stuff on GitHub, internships, and other experience
than someone who's just finished a Computer Science degree from a random uni
with no experience. Almost all of those I know going to university are just
planning to do their course and apply for jobs afterwards, expecting their
degree to mean an instant job.

------
daimyoyo
I have found firsthand that the lack of a college degree has increasingly
become a scarlet letter in our society. It's not quite as bad for coders as it
is in the rest of tech (the fact that we have open source projects to prove we
know what we're doing helps a lot), but it's still disproportionately difficult to
get a job that pays a living wage unless you have the degree to "prove" you
know what you're doing.

------
tokenadult
"Discarding the bachelor’s degree as a job qualification would not be
difficult. The solution is to substitute certification tests, which would
provide evidence that the applicant has acquired the skills the employer
needs."

<http://www.nytimes.com/2008/12/28/opinion/28murray.html>

------
leon_
I'm a self-taught programmer from Germany, and since leaving school I have
been working as a freelancer.

Some time ago I decided that freelancing is a little too stressful for me (you
know: customer acquisition, taxes, business overhead), so I talked to some
friends of mine who work as "normal" office developers. They all said
something like "yeah, any company should be happy to have you working for
them", and so I started to send out applications.

Guess what: the HR guys won't even look at my CV without some sort of official
paper. Be it a CS diploma or a finished apprenticeship (yes, in Germany we
have programmer apprenticeships) - without that, not even an interview.

So what I'm now thinking about is doing the 3-year apprenticeship. Only to get
a paper that says "yeah, that guy knows how to spell Java and knows what ISO
norm 23542 is" (a programmer apprenticeship isn't more than that in Germany).
Oh happy time - I will spend the next 3 years "learning" nonsense I already
know or don't want to know. And my classmates will be ~17 years old.

------
NY_Entrepreneur
'Education', K-12, college, and later, is pushed and shoved by several large
influences.

As in this thread, two of the influences are (1) employers want some
'criteria', maybe even 'credentials', that will simplify selecting promising
employees and (2) students seeking jobs want to keep down the costs in time,
money, and effort to meet such 'criteria' or to get such 'credentials'.

Broadly there should be some 'market forces' that provide answers: (1) If the
education employers want is expensive, then employers will have to pay
employees enough to pay for the education; (2) if the education costs more
than it is worth to the employers, then students and employers will make do
with less such education; (3) if students don't come for the education, then
some educational institutions will have to make some changes to offer the
students more value for the costs.

Such 'market force' influences are easy to see, but there are some other large
influences less easy to see:

In the US, the 900 pound gorilla is the interest of Congress and the US DoD in
technology for US national security.

Such was not always the case: Indeed, during the rapid rise of engineering in
the decades before WWII, schools of engineering actually concentrated on
teaching engineering for students seeking careers in engineering! Amazing!
Radical! Astounding!

Then we had WWII and radar, sonar, and the bomb, right, the atomic bomb, and
Ike and other influentials concluded that "Never again will US academics be
permitted to operate independently of the US military" or some such. Congress
went along, that is, pulled out the US national checkbook, and started
signing.

Then the top three dozen US research universities got an offer they couldn't
refuse: Take the US Federal money for research in math, physical science, and
engineering, with a tilt toward what might be useful for US national security,
or cease to be a leading research university. They took the money!

Now, for such a university, for the money, the most important activity on
campus is research, research, and research as in "How do I get a grant?",
especially, now, from the NSF, DoE (that is, energy, not education!), or
DARPA. Since then Congress has also provided money to the NIH and its grants
-- you see, there are a lot of old people in Congress eager to see progress on
some of the serious diseases of old age and ..., well you can see the
connection!

Now for this 'research', there are some examples and, now, a 'model': The most
influential example, for both science and US national security, is physics,
especially as for the bomb, right, the atomic bomb and now, too, the hydrogen
bomb. So, 'research' has a really severe case of 'physics envy', especially
theoretical and mathematical physics envy. So, good 'research' is supposed to
'mathematize' a field.

So, at top three dozen US research universities, math, physical science, and
engineering pursue research, research, and research with physics envy.

Yes, there was that report, yes, the David Report, that said that some of math
was, well, not so applicable so soon and should, then, maybe get less money.
That's why now at some of the research universities the math department has
trouble keeping the lights on but is still working on the analytic-algebraic
topology of the locally Euclidean metrization of infinitely differentiable
Riemannian manifolds (extra credit for the source!). Math is still the most
respected field, but for some decades more applicable topics in math -- e.g.,
probability, stochastic processes, optimization, statistics, control theory,
signal processing, numerical analysis, computational fluid dynamics,
computational complexity, math for finance, math for theoretical physics --
have been done outside the 'pure' math departments.

Now it turns out that somehow some huge fraction of good students are really
eager to get their bachelor's degrees from such research universities. So, the
universities can be very 'selective' so that employers can get some of the
'criteria' they want just by a student being admitted to such a university!

But the students are being taught mostly just by professors interested mostly
just in narrow, leading edge research to get grants. So, the education is not
really about, say, engineering for engineering students who want jobs in
engineering! Instead, in, say, computer science, the education might be closer
to background for research in 'the fundamentals of computing', not that we
really know what they are but some people would like to!

How to work through the security model of SQL Server for SQL Server
installation, administration, and management? Professors can't get grants for
mud wrestling with the messed up SQL Server security model or its just awful
documentation and terrible problems with installation, management, and
administration and, thus, mostly DON'T! So if someone actually wants to work
with a real installation of SQL Server for a real, important, practical
database application, in the real world, outside of academics, as part of a
career that can pay enough to buy a house and support a family, then the
professors at a research university are not a very direct source of such
information!

Still, good students like to go to research universities.

Then there is the money, that is, what the universities charge. So, tuition
has gone up, way up, over the last few decades, up faster than even health
care! Why? Well, there's a dirty little secret! What has gone up is the
published, 'list price'! But there are also 'discounts', especially for good
students, called 'scholarships'! So, if the student's father is wealthy and
the student only so-so as a student, then go ahead and charge list price!
While this student pays list price, they can also do well in their studies in
beer and bed! But for a really good student, especially with poor parents,
there are scholarships.

In the US, there are also many colleges and universities that don't try to be
top research universities and concentrate on teaching.

And there are many community colleges where 'job training' is the goal and not
a dirty word and where there are courses in auto repair, auto body repair,
framing carpentry, finish carpentry, masonry, cosmetology, plumbing,
electricity, HVAC, and, yes, computer programming and network management.
Tuition is low; the courses are fully intended to be practical; the teachers
are not researchers and are often practitioners.

For computer science, there is a secret: The more advanced parts of computer
science have been heavily 'mathematized'. So, net, the best background for
such material actually is not even in the computer science department but in
the math department, especially a course in, say, abstract algebra.

Next, for a career, e.g., in computing, there is now a big, dark, ugly secret:
'Jobs' are no longer such a good idea! E.g., the Stanford AI course got
interest from 50,000 students in 175 countries! So, generally, if you take some
material in college and look for a job, then here in the US, both you and your
employer will be competing with thousands of eager people in 175 foreign
countries!

Meanwhile here in the US the people buying houses and getting their children
through college will, may I have the envelope, please? Yes, here it is: Often
they will own their own Main Street business where they have a geographical
barrier to entry and, thus, no competition more than, say, 100 miles away. In
particular they will have no competition from anyone in any of those 175
foreign countries.

So, what future for computing in the US? Broadly, gotta do something new, out
there, on the leading edge in at least some respect, something
entrepreneurial, something you can't get hired for because the guy hiring
doesn't understand that new thing yet. So, have to be an entrepreneur. For
that, something in advanced computing might help. Then, just getting 'skills'
with, say, Linux, C++, Java, Python, MySQL, SQL Server, Flash, HTML5, etc. is
'routine', maybe at some point necessary but not sufficient.

Just what should such new stuff be and just how to get help from a research
university? No one really knows! Welcome to the challenge of the future!

