Recruiter: "On a scale of 1 to 10, how's your COBOL?"
edw519: "10."
Recruiter: "Great! I have tons of work for you."
Recruiter: "On a scale of 1 to 10, how's your data base?"
edw519: "4. But I'm a 10 in COBOL."
Recruiter: "No one cares about COBOL. I need data base people."
Recruiter: "On a scale of 1 to 10, how's your Microsoft?"
edw519: "4. But I'm a 10 in data base."
Recruiter: "No one cares about data base. I need Microsoft people."
Recruiter: "On a scale of 1 to 10, how's your Oracle?"
edw519: "4. But I'm a 10 in Microsoft."
Recruiter: "No one cares about Microsoft. I need Oracle people."
Recruiter: "On a scale of 1 to 10, how's your HTML & CSS?"
edw519: "4. But I'm a 10 in Oracle."
Recruiter: "No one cares about Oracle. I need web people."
Recruiter: "On a scale of 1 to 10, how's your Ruby & Python?"
edw519: "4. But I'm a 10 in HTML & CSS."
Recruiter: "No one cares about HTML & CSS. I need back-end people."
Lighten up, guys. If you can build stuff, learn, and work well with others, you'll probably always be fine.
Some folks consider a 10 to be the guy who invented the language/technology. Linus, DHH, etc. Almost no one is a 10 on that scale.
Others consider a 10 to be "I'm an expert in any of the tasks you're going to be asking me to perform with this tech". That's the scale that's more important to me.
Then, of course, you get the brand new grad who says they're an 8 in a language despite never making anything that's not a basic class project in it...
The problem with this is that your overall numbers are likely to go down over time.
If you are a college freshman who spent a weekend learning JS, you can now probably do most of what you can do in your main language (say, Java). That means you put down "5" for JS.
Once you've got a few years under your belt, your knowledge in any new language has a much higher standard to meet.
I once filled in a questionnaire for a job that had vague questions such as "Rate your skills in TCP/IP: Novice/Intermediate/Advanced".
I had no idea what the scale was. Did "advanced" mean being able to write a TCP/IP stack from scratch, identify bottlenecks in huge networks, or what?
I was trying to decide whether to tick novice or intermediate; it turned out that "advanced" actually meant being able to run traceroute from a command prompt and set A records in DNS.
I remember the advice a friend of mine gave me, he said "Just put 10 for everything, you'll get an interview"
If I'd put "10" in TCP/IP for that company, I would have been slammed in the interview. As it was I passed. Apparently no one ever remembers the CRC byte in the TCP header. :)
(Although, in fairness, I'd only require that level of knowledge at the tip of the tongue if someone was going to be a network stack developer or high-level sysadmin/network admin. For a regular web developer, just knowing that there are TCP headers and flags which can be set, and which might get mangled by devices between client and server, is enough.)
It's CRC at the level below TCP (well, 2 layers below: ethernet)
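For the curious, the layering here can be made concrete: TCP's header carries a 16-bit ones'-complement checksum (not a CRC), while the CRC-32 lives in the Ethernet frame. A minimal Python sketch, with made-up field values, packing a bare TCP header and pulling that checksum field back out:

```python
import struct

# Minimal 20-byte TCP header (no options), field layout per RFC 793.
# All values below are invented for illustration.
header = struct.pack(
    "!HHIIBBHHH",
    12345,    # source port
    80,       # destination port
    0,        # sequence number
    0,        # acknowledgment number
    5 << 4,   # data offset (5 words = 20 bytes), reserved bits zero
    0x02,     # flags: SYN
    65535,    # window size
    0xABCD,   # checksum: a 16-bit ones'-complement sum, *not* a CRC
    0,        # urgent pointer
)

# The checksum occupies bytes 16-17 of the header.
(checksum,) = struct.unpack("!H", header[16:18])
print(hex(checksum))  # 0xabcd
```

The CRC the grandparent mentions would sit in the Ethernet frame's trailer (the frame check sequence), two layers down; it never appears in this struct at all.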
Here's why: when I see resumes, the only people who put 10s are recent college graduates. They put a 10 for Java and then don't know what a hashCode is. This demonstrates more about the question than about the candidate.
I've also interviewed at a few places, been asked the 0-10 question on a set of languages and been told that 5 "wasn't what they were looking for" and they wanted "10s only." This demonstrates more about the company than about the question.
The right answer is 7. It says "I'm good, but not Linus Torvalds," and if anyone asks a follow-up, just say "I'm never really sure what to say there." It doesn't matter whether your answer means anything, because the question doesn't.
That'd imply that game-theoretically, the right answer is 8. ;-) Enough to tell the interviewer that you've gotten past 7, yet not so high to make them think that you think you know everything.
(Personally, I've taken to saying "Judge for yourself" and listing my accomplishments. That's what they're really interested in, they're going to do it anyway, and this way I show enough confidence in my own abilities that I'm willing to let others judge them.)
The correct answer is "I dunno, what does 5 mean?" to show humility and an insistence on meaning in metrics; if pushed, say 6, and trust that decision makers anywhere you care to work will ignore the dufus who insisted on a number.
Please tell me what colleges these graduates are coming from so I know never to hire/work with any of them, ever.
It's just that when you spend 3-4 years in that kind of bubble and you keep getting good marks on all your assignments, you are naturally going to think "I've mastered Java" based on the only evidence presented to you.
Of course there is a difference between knowledge and aptitude here.
Would a candidate who doesn't know what .hashCode() does, but who after a 10-minute explanation says "yes, I understand, that makes sense," be a bad hire?
I program a lot of Java but I am unfamiliar with huge amounts of both the language and standard API (especially the parts dealing with 'enterprise' stuff and concurrency). This is mainly because I explicitly design my Java apps around a shared nothing/POJO-JPA configuration and I don't use RMI, Reflection, or JMS.
Does that mean I am a bad Java programmer?
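For readers outside the JVM, the contract the hashCode question probes (equal objects must hash equal, or hash-based collections silently misbehave) has a close analogue in most languages. A minimal Python sketch, with invented class names, where the interpreter actually enforces the contract up front:

```python
# Sketch of the equals/hash contract that the hashCode interview
# question is really about. In Python, defining __eq__ without
# __hash__ makes instances unhashable, surfacing immediately the
# issue that Java's HashMap hides until lookups quietly miss.

class BrokenPoint:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __eq__(self, other):
        return isinstance(other, BrokenPoint) and (self.x, self.y) == (other.x, other.y)
    # __hash__ is implicitly set to None once __eq__ is defined.

class Point(BrokenPoint):
    def __hash__(self):
        # Equal objects must hash equal: hash the same fields __eq__ compares.
        return hash((self.x, self.y))

seen = {Point(1, 2)}
print(Point(1, 2) in seen)  # True: hash and equality agree

try:
    {BrokenPoint(1, 2)}
except TypeError:
    print("unhashable")     # the contract violation is caught up front
```

The Java version of the bug is nastier: override equals() without hashCode() and a HashSet will happily accept the object, then fail to find an equal one later.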
In particular discussing how, despite using the same syntactic formalism, the semantics of what was considered "passing" for undergraduate students vs for graduate students was something no one had ever enunciated to me explicitly (e.g. you pass as an undergrad if you're getting Cs. If you're getting Bs as a graduate student, you are doing it wrong).
And I'd STILL get people who rated themselves 10 in multiple categories.
>> If you can go from 4 to 10 in something new every few years, that's amazing ...
free ebook, "The Best of edw519": http://edweissman.com/53640595
Isn't the reality more like the fact that you have to constantly obtain new skills, thus making you only "hireable" part of the time?
In 2001 you learn you don't have what folks are looking for so you go and "retrain", get a couple years solid experience[%] in that skill, and then have a decent run until 2009 when the whole thing rinses & repeats?
To me, scenarios like these make the case for actually having a CS degree -- that way you have a chance of obtaining a "zero" in number of days without work since 1979.
Recruiter: "On a scale of 1 to 10, how's your COBOL?"
CS Graduate: "10."
Recruiter: "Great! I have tons of work for you."
Recruiter: "On a scale of 1 to 10, how's your data base?"
CS Graduate: "4. But during my days at MIT, we did cover database theory quite intensely;
I'm sure I could adapt to your needs very quickly"
Recruiter: "Oh, MIT 'eh, I'm sure you could, when can you start?"
... and so on...
[%] I know a solid programmer can pick up new skills pretty quickly, but recruiters have a funny way of always insisting on a certain number of years of experience in a particular skill, don't they...
P.S. I don't have a CS degree.
Having a friend on the inside avoids two potential filter failures without having to actually change the HR process - it's saving HR the cost and hassle of a recruiter and essentially assuring them that they're not going to get flack for wasting anyone's time if they forward your resume through to the actual hiring manager.
Also, tech conferences seem to be a pretty expensive way (at least in terms of travel) of hoping to bump into someone.
If a company only trusts "recruiters" to select (or pre-screen) candidates for them, then this says a lot about their way of managing and interacting with "our" kind of programmers (by that I mean people who read HN, are enthused by cool innovative projects, write blogs, publish open source code, ...).
If the company doesn't help you bypass the recruiter, maybe you should bypass them altogether?
Then turn it around and look for companies based on the open source products that they use. There are plenty of companies that are heavy users of open source software, and don't hide that fact. You might have to move or something, but that's life.
"Also tech conferences seem to be a pretty expensive way"
There are many cheap conferences, or free exhibit hall passes for early-bird registration (which usually allow BoF sessions as well, and maybe parties); and user groups are ordinarily free (maybe you chip in for dinner or something). There are conferences or user groups about something in many areas, and maybe you can just drive. Often you can share a room with another attendee (ask on some mailing list related to the conference) to save on lodging. If you have developed something or have some kind of speaking experience, the conference may pay for some or all of the travel expenses and may waive the conference pass. There are even some open source projects that have money waiting for anyone who wants to bootstrap a user group in their town.
And these things are not theory. I have done absolutely everything I have suggested to you here (except trying to bootstrap a local user group): get free conference exhibit pass, drive to conference, share room, get started contributing, get major patches accepted, and get jobs through community connections. And I still do it, and I see many people rising up through very similar means that had little in the way of marketable talents a while ago.
Disclaimer: Nothing about this happens overnight. It takes a long time. There are no magical steps where I can say "do these things and you will get this". You have to look out for the opportunities on your own. Start attacking your own reasons why something can't be done, knocking them down one by one.
I do, in theory, like the idea of speaking at a tech conference. I am just worried about the level of experience that would be required to do so.
I don't want to stand up in front of X number of people and tell them stuff they already know.
I live in a place that is very very far from being a "tech center" in the way that SV is but there are a few user groups around (mainly PHP groups, LUGs etc). I should probably check them out.
I did go to a local LUG once but that was about 5 years ago and mostly consisted of unemployed neck beards grumbling into their pints of bitter. Hopefully things have changed since.
A topic that is always safe is "I used technologies X, Y, and Z to build this thing, and here's what I learned along the way". It's usually a little more entertaining because it's a story (often with a few funny anecdotes); and you're admitting that you didn't do everything perfectly from the start, so being wrong somewhere during the presentation isn't a deficit. Being wrong really only matters in a persuasive talk.
(Aside: it's pretty well established that teaching is one of the most effective ways to learn deeply.)
I'm sure you can find better speaking advice elsewhere, but one other thing I should mention: especially at user groups, a talk is just to start a conversation, which will often be held at a bar later. So don't worry about concluding anything in a profound way.
As for location, then yes, living within driving distance of the SF bay area helped me a lot. But I didn't begin speaking until I moved to Portland, Oregon, where I regularly attended a user group. I'd like to think that a lot of other areas have something to offer as well, but I'm sure it takes more searching.
Need a better recruiter. People are still making really good money with COBOL :-)
Recruiter: "So you were writing Java code that talked to an Oracle Database, right?"
Her: "What version of Oracle was it?"
Me: "8i I think"
Her: "Sorry, we're looking for people with 9i experience." (9i had just come out.)
Me: "Really? You know SQL is a standard and I'm writing Java code anyway, we were using a ORM as well. I didn't even end up writing that much direct SQL code."
Her: "Sorry, the client specifically said Oracle 9i -- I'm looking at it right now. They're not looking for Sequal experience."
This is when I decided to always avoid recruiters if I possibly could.
I pronounced this "Ess Que Ell" but she responded "Sequel", which I think was the name of another database vendor at the time.
Either they only want people who worked for MS on the platform/language itself or the whole thing was copy/pasted by a recruiter.
What's the worst thing that could happen if you fibbed "Oh yeah 9i, I've used that too."? You'd get an interview with the database guy who actually knows the difference.
Not because of ethics or morals but just because they refuse to say something they know is technically incorrect.
Think about it: most everyone you're going to be working with will have been selected on such a basis as well. Not so much bad people, but on the whole a thoroughly uninspiring workplace where creativity was actively discouraged.
Unless I was really desperate, I would prefer not to get hired at a place that used that sort of process.
Reality is most employers put up fake 'requirements' because they don't know how else to find 'smart, can learn, gets stuff done'.
Oh And: Self-Marketing, it isn't always evil.
But I'm being honest, sometimes you have to make yourself sound good to get the goods.
But I think claiming experience with an explicit version that was stated is going a little bit too far. It's important to clarify. Maybe someone actually does need help with the new platform, and after all, if they're having inept recruiters do their pre-hire screening, you're probably not missing much anyway.
There is a bit of line to fudge there but it definitely doesn't extend into falsifying tasks or skillsets imo.
Getting a job with a company where clueless people make important decisions. Not everybody handles such situations well.
Well, the worst thing would be if it was a trap question designed to weed out the people who lie at initial recruitment selection.
Skin that cat!
"Do you have experience with APIs?"
"Uh...Yeah. Yeah, I do."
(Just wait. It'll happen.)
I pronounce MySQL as My-ess-cue-el.
Secondly, there's a constant need for people who make CRUD apps. Constant. Almost every business can benefit from a totally custom app with its own special workflow. We tried RAD tools, we tried auto-generation tools, we tried plug-in workflows that would be 'user' edited. Turns out if you don't involve a programmer, it all goes very wrong.
After 20 years of promises from Delphi, VB6, Java, Rails, etc., the reality is that it's getting harder to make good apps because everyone's expectations only go up. Bottom line: to make a CRUD app you still need a programmer. Almost every business is realising they need a programmer.
The market's only going to get bigger, much, much bigger.
This reads like it's from a person who's never been out of the ivory towers, hasn't actually been inside a real business.
I've never heard a manager or developer say: "If only we had someone who could write an algorithm to solve this travelling salesperson problem, we'd be saved!" No. Inefficient algorithms are often good enough; typical programmers muddle through and, yes, pick up a textbook or reference Wikipedia once in a while.
All of this lifts the benchmark for what is an acceptable web app. The list goes on, and new bits keep getting added every year along with expectations.
It's bloody hard to find people that can deal with a majority of the above technologies. If you find someone that can do the full stack they're gold.
You want to be valuable? Prove that you're someone that can develop within modern software stacks and more importantly adapt as things change. Because change is coming, the music hasn't stopped and we're in for one hell of a ride.
Exactly, or google it and find the difference between the different sorts, or look at any one of the great visual representations of sorting strategies. I've seen otherwise very accomplished coders throw up a crappy bubble sort and move on because it just didn't matter. And it didn't. In any case, 95% of the time those sorts have already been implemented in your Hash object or database or somewhere else in your language's collections library.
I've been doing this...Good Lord nearly 15 years now -- mostly in the middle-tier and lower, so I've had plenty of opportunity -- and I can't even fill up one hand with the number of times that I had a problem that required anything beyond rudimentary algorithmic analysis, much less a deep examination of sorting efficiencies, etc. Those are solved, cut-and-paste problems, the least of your worries.
The list of technologies you give is spot on. I'd add to that: good data modeling/object design, testing/TDD (if you're into that), security issues to be aware of (XSS, SQL injection, mass assignment), classic performance killers like n+1 queries, on and on.
And yes, the change bit -- as a coder you can never assume you've learned your craft. You're always learning it. We're not blacksmiths that get better and better at one thing over the years and arrive at something like "mastery."
We're attention-deficit polymaths that have to scramble week-in, week-out just to maintain competence on the current problem, while anticipating the trends that will be turning the apple cart over a year from now. Remember in 2008 or so when RIA was the buzz-cronym and everyone was talking Silverlight, Flash and the lot? And 4 years from now, who knows what the acronyms will be. We may all be laughing about how awesome we thought HTML5 was going to be (probably not, but in this business...)
Sorting is a good example, though, because it is often complicated not because the algorithm is hard, but because the sorting criteria rely on heterogeneous data, sometimes coming from multiple services. Coming up with a clean code design in such cases is challenging, and that's where good developers shine.
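That point can be sketched in a few lines (record and field names invented for illustration): the algorithm is the library's problem; the design work is composing the heterogeneous criteria into one key.

```python
# Sketch: sorting by criteria drawn from different sources using the
# built-in sort with a composed key, instead of a hand-rolled algorithm.
# The data and field names here are invented.

users = [
    {"name": "carol", "priority": 2, "score": 0.9},
    {"name": "alice", "priority": 1, "score": 0.5},
    {"name": "bob",   "priority": 1, "score": 0.7},
]

# Compose the criteria: priority ascending, then score descending
# (negating the score reverses that component of the key).
users.sort(key=lambda u: (u["priority"], -u["score"]))

print([u["name"] for u in users])  # ['bob', 'alice', 'carol']
```

The built-in sort is stable and already O(n log n); the part worth a developer's attention is keeping the key function clean when the criteria come from several services rather than one record.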
Then there are the ones I've met who picked up some books, read the tutorials, and got their hands dirty with code completely outside their degree (which was in something totally unrelated), and those are the ones who totally amaze me with their skill. Even a lot of the big web development companies around me are WYSIWYGing their way through client work!
It just totally astounds me, and your comment and this article are making me ask myself (again): why is that? Is what I describe normal? I'm in Chicago, and maybe outside the Valley things are different? Has anyone else seen what I described, or is my experience just a fluke? If it's anywhere near common, I'd be a little disturbed.
100%, at least in my country.
For the past 4-5 years of my life (starting in high school and continuing through college) I've been concentrating a lot of my time on learning how to be a better entrepreneur. That's to the detriment of almost everything else in my life. And it's worth it to me because I love it. Even better still, it seems like a pretty safe bet, because if worst comes to worst and I completely fail over the next few years, I'll still be able to get a job as a coder somewhere.
But the thing about being an entrepreneur is that it encourages you to get marginally good at a wide range of skills instead of REALLY good at one area. And so something I was thinking about is the potential consequences of this decision on my life. This is what I came up with.
Being marginally good at many things shows that no matter what gets thrown at you, you'll pick it up fast. This is necessary in entrepreneurship, and it becomes ever more necessary elsewhere as the pace of software development increases.
There's no such thing as "algorithms" skills anyway. No matter what skills you have, you'll probably need to adapt them heavily to whatever new job you find yourself in.
Yes, we're mainly startup-ish people around here, but I'm pretty sure that even if I wasn't on, I'd hire a jack of all trades with a proven track record over skills in "math" and "algorithms", whatever that means.
The reason people here on HN are objecting to your story is that there are ALSO coding jobs for startups, PR firms, or consumer brands where they DO value entrepreneurship, speed, and creativity. And these types of jobs will never go away. Edelman or Nike is never going to convert their development shops to all CS Ph.D.s with 15 years experience optimizing compilers. They're always going to be building interesting interactive projects for major brands, so they need people who are always restlessly finding the next cool thing.
So just make sure you always apply for the latter kind of coding job and you should be good.
I too am an entrepreneur who sold one company and am well into my second one. (Though the fact that you've sold multiple companies is damn impressive!) Running your own company means you have to wear many hats, and it's tough to get really good at just one thing. Some days you are a graphic designer, or a developer, or marketing, etc.
However, I also think it's boring to be stuck in one specific area for a long time. I like to be challenged, and I am constantly learning new things to broaden my knowledge. Whether or not this is a good thing remains to be seen.
In the mid-80s people tried to tell me that software was too crowded, that there were too many people going into it and that I was unlikely to have the career my dad had. They were partly right: I've been even more in demand than he was.
Software development salaries are crazy high right now, which is a sign of a ton of suppressed demand. The web is far from done. Mobile is still rising. We are just getting started on the "internet of things". And that's only on the consumer side. Business is not going to be less dependent on software, and the ever-more-networked world is shortening cycles and increasing competitive pressure, meaning business software needs to change more often.
Yes, by all means, people should keep on learning. But the "gosh you'll be fucked in a few years if you don't go back for your CS degree right now" thing is BS.
I definitely wasn't saying that this is definitely what's going to happen. In fact I really don't think I'm going to be applying for jobs in 4 years. But I thought it was an interesting scenario to write about and get people's thoughts on.
What does it mean to "know" a language? I've been writing almost exclusively in C for 15 years, but I wouldn't say that I "know" C; at least once a month I end up needing to consult the C99 standard for some obscure point of library specification.
On the other hand, there's a lot of people who know how to write Hello World in all those languages and consider that to constitute "knowing" them. Would you hire all of them?
The answer is somewhere in between. My personal metric is: do you have a reasonably accurate estimation of what you don't know. The better you know a language, the better you know what you don't know.
That's a pretty good measure of how well you know any subject actually. You could call it the "Dunning–Kruger metric".
>> What does it mean to "know" a language?
People who know Java really well tend to not really know C++ even if they put it on their resume. Unless you count "how hard can it be" knowledge.
Same the other way around. People who know C++ really well think "how hard can it be".
Modern C++ is a very complicated beast, most people who claim to know it but do not have solid experience usually don't.
Modern Java is a huge tower of libraries and concepts, so it's the same.
I would never hire someone based on the libraries they are familiar with. In fact, the next time I interview someone, I'm planning on giving them some sort of "here's a totally unfamiliar API, do XYZ with it" problem. And they can find the documentation on the web themselves.
You should have a look at C++11 if you haven't already.
> Keeps saying "no enum const class".
I fit your description in full. What is your offer? You can find my contact information in my hacker news profile.
Well, at least he's honest about it. As a student you might hope the stuff that you learned was worth something; I have serious doubts now.
On the other hand, I like to think that maybe what we call entrepreneurship - the hacker ethos, autodidacticism, uppityness, get-shit-done, rationality, self-improvement, a weird mix of skills cut in a wide swath of "whatever I needed to learn at the time" - is actually an overarching set of tools and attitudes for turning your ideas and ambitions into real things.
I can't imagine a world where it's not useful to do that anymore, even if the technology changes. It's true that the entrepreneur toolkit and the 9-5 megacorp toolkit aren't compatible, but every trend I can see points to people needing less and less to get more done. Economies of scale are technological: they shrink as the cost of production shrinks. Diseconomies of scale are sociological: as long as people are still the same, they'll stay the same. The advantages of being a large organisation might not always outweigh the disadvantages.
What if it's not us that get left behind, but them?
I think we've not even tapped this yet. Not only are we not over the "peak software" hump - it's not anywhere in sight; we're not even on the main exponential growth curve.
Software-based pill: there's already a pill that uses a sensor to measure its place inside your gut and releases its medication in a specific location. There's also an in-body device that releases medicine at regular intervals.
Software-based glasses: augmented reality glasses. They might be a growth area for apps.
Programmable shoes: there are shoes with a programmable pattern, and shoes for people with Alzheimer's that transmit your location.
Programmable cars: cars are already programmable. They contain huge amounts of software.
Programmable chairs: some dental chairs are already programmable. You have programs for different seat positions.
My guess is, to create a new software employment boom, you need to create a new mass market programmable platform, that needs a lot of small volume applications and that developing for it is neither too complex(windows gui before VB) and neither too simple(excel).
They are just blossoms here and there, not a mesh of grass.
Everything will not be simply programmable: it would be flexible, changing. Your wall would not be concrete with some software. It would be software with some concrete.
It has the potential to be very relevant. 3D plastic printers are interesting, but too limited at the moment. 3D concrete printing has existed for a while though, that can definitely allow software to be/create the walls of your house.
For medicine it's just a case of getting the simulation down. It'll also be interesting if robots get better. There's a lot of boring lab work in the natural sciences, if this can be automated and controlled by software... Your testing can be done by software... Genetic algorithms could be used to find cures to things. (Although using robots to physically test would presumably be much much slower and more expensive than a computer simulation. :( )
Or perhaps dozens of little robots built the wall in the first place (by lashing together sticks and filling the space between with gravel, say) under software control, and can disassemble it and reassemble it in a different form under software control in an hour or two. In this case the sticks and stones are the pixels.
Here you will have the real thing. Real but no longer boring.
I've actually had this thought lingering in the back of my head as I'm in the midst of making the transition from primarily MS desktop/embedded C++/C# guy to open source stack web app guy.
A few years back, in a review my supervisor put in a recommendation to go into management. I asked if I would be able to code anymore. Nope. I felt that was some subtle death trap - to lose my skills through non-use, and turned it down, 20% pay bump be damned.
So now this transition, and I began to think of it in the same light. But it doesn't work. It's been 13 years since I graduated, and I'm probably better at most CS'esque interview questions than at any time since then. Why? Because of what I did that led me to making this transition: essentially becoming unhappy in my job and looking for answers, which started with a book called "The Passionate Programmer", which led to maybe 20-30 other books, completing some classics (such as K&R C), some incomplete (SICP, only halfway through chapter 2), but in general exploring my craft in a way that my job did not require me to. I'm a much better programmer because of it, language be damned.
So here's the question I have:
Who is more likely to be aware of things such as CLRS, SICP, Hacker News, Stack Overflow? Who is more likely to understand (or even be aware of) languages like Clojure, Haskell, Prolog, Python, Ruby, etc? Who is more likely to give you more lucid answers to problems with big data, having even touched a NoSQL database and pontificate on the advantages/drawbacks of that vs. relational? Your typical enterprise dev, or your typical open source web dev? After talking about all this stuff to a team of 30+ engineers over the past few years, I'd have to put my $$ on and double down firmly on the latter.
Of course, this is only an example of one engineer's experience. But I think there's a more meta issue at play here... it's more of the 501s vs. the non-501s, the latter being ones who well, just are more aware of what's actually going on in their respective industry. If the industry as a whole makes a tectonic shift toward algorithmically more difficult problems (ie. machine learning), I trust the open-source web devs will be on it and synthesize it faster than your typical enterprise guy will even be aware there is a need.
Advertising-supported websites care a lot about scale on the cheap, because serving banner advertising to 10 thousand people a day does not pay the rent even if the servers were free. But when you're actually selling stuff or managing internal projects, it's all about manpower costs, and .Net projects take less time to build. When a small project costs $10,000 a day in manpower and needs $2-20k worth of servers and, say, $100k worth of software, it's still almost meaningless next to manpower costs.
I doubt most of the web dev people are spending anywhere near 5 figures a day in manpower, and often their software costs are 0. Outside of corporations, nobody specs out 6 figure 3rd party software budgets.
If you're a MS shop with more money than brains (ie, more money to spend on hardware/software vs the skill of people you have on staff), sure, this might be true.
It definitely wouldn't be true if you were a Unix shop, where the tradition has come around to be "free" software for everything.
If it pencils out, why not? But this is a case of like serving like, and having been in this industry for a while now, the inefficiencies are phenomenal. Somewhere at the very end of the chain these costs end up with the end customer (in my case, the health care industry, and I don't want to go into too much detail about my company, but we're of a similar size and the market leader).
The enterprise is like government. It's a slow-moving beast. But things will not continue like this forever; it runs on the same principle that initially made Microsoft successful and is now leading to its downfall. True story: last week I was contacted regarding a position at a health care startup in Palo Alto. We're primarily a hardware company and they're all software, but the funny thing is we have some skunkworks-type projects that have some overlap with what they're doing. Having looked at what they already have on the market, if we actually wanted to go after them, I don't think we would stand a chance. Sure, we might be able to leverage network effects to some degree; but time-to-market, overhead, etc., are on different scales.
The problem will be when one or the other ends up making a breakthrough that allows one group to eat the other's lunch.
Obviously I and most of HN are betting on the webdevs.
Of course that's the upside; there are a world of risks, and many companies burn millions chasing pennies.
How about SpaceX? They've already accomplished what no government in the world has been able to do. How about startups emerging in the energy, scientific, and education spaces? It's not all photo sharing, geo-tagging, social engagement rah rah. The discontent is obvious on HN over real problems not being addressed and companies are now beginning to emerge. Not only is it socially pragmatic, but as you so clearly detailed, it's mind-numbingly lucrative.
If you perform a technical service encumbered with vast inefficiencies, your lunch will be eaten in time. It's not a matter of if but when. Just because there are all sorts of barriers to entry right now because of government contracts, regulations, etc, that allow you to offer a solution right now that is cost effective because it's less messed up than the system you address, does not guarantee those walls will not be chipped away, bit by bit, by more innovative, smarter, cost effective ways of doing things.
Enjoy it while it lasts.
In the consulting world, the product is irrelevant; you serve large organizations and the problems they develop over time. While they are young, Google, Facebook, and SpaceX don't have a lot of the internal cruft that consultants exist to deal with, but I can guarantee they are creating it right now.
Amazon EC2 and Gmail are probably the best examples of direct competition with traditional consultants by start-ups. But again, it's all about the organization and not what the organization produces, and none of them started by focusing on that stuff. All large companies need email and don't necessarily know how to make it safe, secure, and scalable.
PS: By software company, I am talking about mindset. GM creates a lot of software, but it's not what upper management focuses on day to day.
Must be either billionaire investment bankers or high ranking government/military people?
2) The article: "It says here you went to the University of Pennsylvania for undergrad. Were you an engineer there?”
We're not talking about quants here. If (quoting myself again): "the industry as a whole makes a tectonic shift toward algorithmically more difficult problems" we'll probably need to enlist the help of more than the top .1% of all software/cs folk.
If you are a quant, I humbly apologize for implying mere mortals could even begin to grasp the fundamentals of a concept as abstract as machine learning. Norvig/Thrun/Ng might have something to say about that though.
Can I quote you on this? Did you come up with this metaphor yourself, or is there someone else I should cite?
I can't lay any claim to the coder-as-cowboy metaphor in general - it's almost as old as actual cowboys - but I came up with it in the sense that I didn't rip off any one particular person's idea.
It's so easy to forget how much we had to learn to build websites. It's incredibly easy to forget how much time it took for the event loop or even MVC to click. It's easy to fail to remember how hard it was to learn the five different languages required to build the app we made in a couple of weeks over the summer. But those are skills, as challenging to learn as algorithms and big data.
As someone who has had to learn data warehousing very quickly, before being shown the joy of such things as MapReduce, before being slung into serious number-crunching, performance-eking territory, I can say with absolute certainty that, as "web scripters" or entrepreneurs, we have a huge advantage: we're the people who taught ourselves how to make things instead of regurgitating what a CS program teaches us.
I started a CS program at a decent university. While I think it's true that Ivy League and extremely competitive programs might force one to think about this stuff the right way, state universities absolutely do not. Most of the kids coming out of there will not be as qualified as someone who has taught themselves how to build a business.
Most importantly, if you're coming out of college, there is approximately a 0% chance the folks hiring you will have any expectation that you will be useful for the first several weeks while you get up to speed, which is plenty of time to become competent enough to be dangerous.
This part is reposted from another comment because I think it's important to your point:
As far as the meaning of the article, I wasn't trying to prove a point as much as elucidate something I was thinking about yesterday: the skills you pick up through entrepreneurship.
You'll be able to leverage your entrepreneurial knowledge with such platforms. Think of Facebook, Twitter, and the App Store, and how many times they made it easier to make money starting extremely small and at almost no cost.
They are very different worlds, much as "hard" systems programming and "soft" high level "web dev" are. And it's tempting to look from one to the other and imagine that it's only a passing fad, but it's not.
As far as CS degrees, my experience is the same. I've interviewed a fair number of dev. candidates in my career (in addition to working alongside coworkers of various educational levels), and a CS degree on a resume has never had any correlation in my experience with a higher quality developer skillset. And this is down to a basic almost fizzbuzzian level too. From a few colleges a CS degree may mean something, from most it seems to be pretty worthless.
I'm a guy who got a subject-specific degree more than twenty years ago now (a 1st in Computing and Artificial Intelligence, for those who care). I was selling software before that, and have spent most of the time since in industry.
What have I noticed since then? Amount I've actually used the "hard" CS stuff I learned there - close to zero. Correlation between "being good at math" and being a successful developer - basically zero. Correlation between having a degree and being a successful developer, after the first few years in industry, basically zero.
I don't see that magically being different in the next four years.
(Curiously, the "being good at math" thing seems to be US-centric. I've not noticed the same focus on it with folk in the UK or elsewhere in Europe.)
The space that developers get to play in has got larger and larger over the last 30 years. I don't see that changing. Quite the opposite in fact.
Sure, some of that is going to be in areas that really need some hard-core math or engineering skills. Those jobs are out there now (embedded development is exploding again, big data has been around for years, the clever end of game development). I'm sure there'll be more in the future.
But there are also many, many jobs out there that don't. Many, many jobs that involve developers being good generalists, or having cross-over with UX and design, or having a decent understanding of economics, or understanding big money. I'm sure there'll be more of those in the future too.
One thing we're really excellent at is wrapping up complicated stuff in abstractions that are stupidly easy to use. We're excellent at de-skilling our own job. And every generation whines that the previous one can't build their own computer / write microcode / write assembler / manage with less than 1k RAM / cope without a visual editor / manage their own memory / build their own OS / write their own application stack / whatever.
Yet people somehow carry on building new and neat things.
If you're a hard-core CS/algorithms person - go for it. They'll be lots of work for you. If you're not? Go find another niche. There are many, many out there. Be a good developer. Have fun. Make neat things.
And thus ends this particular Grumpy Old Man's Saturday Rant :-)
Reading Communications of the ACM (most recently, ), there's been an ongoing problem getting enough people into computer science programs (in large part because of the broad fears of "outsourcing"), and the projections are that EVEN MORE graduates will be needed through 2020.
So yes, many times yes: This article is completely and totally wrong. And as adrianhoward says, it's the same sentiment that my parents chided me with back in the 90s, and that I hear all the time from people who imagine that the silver bullet is just around the corner that will make it so that we need fewer rather than more programmers. (FWIW: My degree is in "Cognitive Science", the "other CS", and I haven't had problems getting jobs when I've wanted one.)
And as an aside, if you look at the graph with yellow and brown bars about 2/3 of the way down , getting a job where math is the primary required skill looks like a slog (looks like about 2x as many math grads as math jobs), while in order to hire people AT ALL, companies are going to have to ignore whether they have a computer science degree, because there are nearly 3.5x as many jobs in CS-related fields as there are graduates in CS.
Based on these numbers it looks almost inevitable that we'll see echoes of the "You've put together a web page? By yourself! You're hired!" dot-com boom, just because it will be so hard to hire anyone at all.
So, while the story was cute, its premise is FUD that has a chance of scaring people away from CS at precisely the time that we need more people in CS. OK, end rant.
Then what fields are these people going into? Law? I hope not: http://www.slate.com/articles/business/moneybox/2010/10/a_ca... . Medicine has its own problems too. Are they majoring in the humanities? Getting bogus business degrees?: http://www.amazon.com/The-Marketplace-Ideas-Resistance-Unive... ?
It might just be that undergrads have serious information asymmetry problems, of the sort I tried, probably futilely, to correct here: http://jseliger.com/2012/04/17/how-to-think-about-science-an... , but I wonder how these kinds of asymmetries can really persist.
And really, is what we're hurting for now more CS people? I don't think so: if anything, the issue is that there's too much a CS emphasis. That's why we have people spending time building Lisp VMs in JS and all that jazz here, while tons of other fields (even technical ones) have absolutely massive inefficiencies because their problems aren't theoretical enough for CS folks.
That suggests that one reason there are so few CS graduates despite demand is that those who enter CS programs because of said employment demand, and not because they are already interested in programming, don't make it out with a CS degree.
(I think this may be part of the reason that while there are increasing proportions of women in all the other sciences, the proportion of women in CS is dropping. Girls are statistically better in all school subjects than boys, and tend to be more diligent at homework and so on. Math and science fall neatly into that paradigm; programming doesn't. You get no gold stars from grownups by programming. So a lot of people who have the problem-solving aptitude to be good programmers never actually get started -- because nobody TOLD them to.)
In my experience it's even negative. Almost every person without a degree that I worked with was much better at their job than the rest. They were more driven, and that's probably how they got the job too. On the other hand, the code written by PhDs... often I regret looking. It's not that it's incorrect (usually), it's just barely usable and disconnected from the reality of how and where it needs to run.
I suspect you have a very limited (that is, "it's all arithmetic") notion of what math is. You cannot develop software without some kind of mathematical thinking.
Then you would suspect wrong :-) I have a friend who was briefly a career mathematician before ambling off into finance. Personally, I stopped math after A level (roughly equivalent to first-year university math in the US, as I understand it: calculus, etc.).
I know math isn't just doing sums.
From my mate I know that many mathematicians are bloody awful at coding - which is a PITA now that computers are much more a part of how mathematics is done in many fields. From personal experience I know that a large chunk of the very best coders I've ever worked with don't have a math background beyond basic numeracy.
What I think people experience is actually the opposite connection. Some of the problem solving skills that developers use are the same sort of skills that folk get from being good at mathematics.
You can get those skills in other places without learning higher level math. And being a successful developer needs much more than the skills from the mathematics toolset.
Nothing against math. Math is great. Moderately good at it myself. If you're a good developer, having it will help more than it will hinder. But having it won't make you a good developer, and not having it won't stop you being a good developer.
Then you're wrong about math and downvote to disagree.
I'm not making this up. I know many good developers that do not have a math background. I know people who have a heavy math background who are lousy at development (and it looks like I'm not the only one - see http://news.ycombinator.com/item?id=3872269)
A bit more than "you're wrong" would help with reasoned debate of course...
For companies who have their shit together, this scenario is unlikely for a few reasons:
1. Experience, not classes or school, is paramount. We're hiring. We need you to do X. Have you done X or things close to X before? If so, you're better equipped to do X than anyone who has only taken a class on doing X.
2. School does not indicate coding skill. I've met many people who (supposedly) went to every class who couldn't code their way out of a paper bag.
3. Academic coding != production coding. The two are light years apart, and the latter is worth way more than the former.
4. Classes don't give a good signal on the ability to execute. Execution means doing what is necessary so you can ship. It means knowing there's a first 90% then a second 90% that looks like 10%. Finished, launched projects show execution. School does not.
5. Algorithms are fantastic and useful, but not in the ways they taught you in class. If you can use Google or use your copy of the CLRS to find what you're looking for, then engineer it into your solution, that's almost always more than enough.
6. If entrepreneurship is ever 'just applauded' in your interview ... run. Don't work there. Entrepreneurship indicates that you know this is a business, and that engineering doesn't exist in a vacuum. It means you can balance sales concerns against user concerns against design, UX, product, scale, and not just do things and throw them over a wall. It means you can be trusted to make decisions that add value and not just code.
Programming is a scarce profession now, but the simple stuff will soon be done by too many people. Software will become a real engineering task. In 20 years, the age of the code monkey will be gone.
Maybe in 2016, you're going to need deep credentials to be a useful web dev, but none whatsoever to start something useful with 3-D printing.
EDIT: that said, nothing makes you more employable than knowing things at a deep level. A friend of mine, a former Plan 9 kernel contributor, quit the tech industry after the first bubble to become a wildlands firefighter. Returned to the tech industry in 2008 and resumed being a highly-paid infrastructure geek like nothing had happened.
You want to be a mathematician? You need to know how to program. You want to be a "secretary"? You need to be able to dig through your boss's e-mail using regexps when he needs to find something. You are a dentist? You will install your own scripts on the website because you learned how in high school.
In 2016 (or 2012) it won't be "oh, we need more skilled programmers"; it will be "sure, you know programming, everyone does, but what do you really know?" Programming will have the same place on a CV that "MS Office" or "keyboard typing" has right now. No big deal if you know it, but much harder to find a job if you don't.
Of course there will still be a place for real computer experts - algorithm designers et al. - but the basics will be known to more and more people.
In 1980 you'd say that in 20 years the average office worker would be performing calculations on thousands of rows of data, generating charts, typesetting documents, creating full color presentations, doing business with clients in multiple continents and they'd wonder how people would cope with the increase in cognition required to do all that. But it's just button-pressing for most people.
In 2016 they'll say "we have a database with 4 billion data points and we need to infer customer behavior patterns from it". You'll say "Sure.", sit at a desk, click "Segment", click "Demographic: 18-21", click "Intersect", click "Products", click "Make Recommendations", click "Apply" and a discount coupon for "Justin Bieber's Comeback Tour" will be beamed directly into the eye sockets of anyone who bought canned salmon last fall.
I don't see a society where 80% programs, I see a society where 10% builds things for the other 90% and a huge part of the middle class will be automated out of existence. This, to me, is the big issue that will shape this generation and the next.
These ideas are worthy of more than a two paragraph comment on HN. I third the notion that you should pen a full blog post.
and two books referenced in that article
A = 1
B = 2
C = 3
A = B
What does A equal?
I'm not saying people can't be taught. But think about how big the workforce actually is, think about how widespread MS Office skills are. For every power-user analyst and project manager that's really taking Excel out for a workout, there are 10, 20, 50 people who use Office in every day non-challenging tasks.
I've given that little test to my MBA wife and a GP family member and several other people. Hardly anybody gets it right.
The x-factor here, btw, that determines whether or not somebody understands it, is whether they see that assignment is happening, rather than going "wha? 1 equals 2? what is that?" And those who didn't just get it remained just as puzzled even after I explained assignment. The very concept of variable symbols confused and (I presume) disinterested them.
What we do here every day, this is difficult, challenging stuff that I don't think most of the workforce will ever understand. Instead, people like us will be busy for decades to come, building tools so they don't have to.
There was a time when machines were new concepts versus simple tools. You could say, in the early stages of the industrial revolution, that soon everybody would understand and be able to fix their machines. But machine complexity has out-paced the desire and ability to learn those skills.
Software is no different, I don't think.
In particular, the link posted in the second edit has a rather poor test using a and b, because it uses = in two different ways with no indication that the meaning of the symbol has changed. Maybe the problem with the test isn't just the people, it's the sloppy notation that assumes people with no programming background are able to infer when we mean equality and when we mean assignment.
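To make the ambiguity concrete, here is a minimal Python sketch of the same test. In most programming languages the two meanings get distinct symbols (`=` for assignment, `==` for equality), which is exactly the distinction the test's notation blurs:

```python
# "=" here is assignment: the name on the left is rebound to the
# value on the right. It is not a mathematical equation.
a = 1   # a holds 1
b = 2   # b holds 2
a = b   # copy b's current value into a; a's old value is gone

print(a)       # 2
print(a == b)  # True -- "==" is the equality *test*, a separate operator
```

Someone reading `a = b` as an equation has no way to answer "what does a equal now?"; someone who sees it as an action (copy right into left) answers immediately.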
And even after three weeks of instruction most of the people who didn't understand it immediately never understood it. I'll quote from the article:
"Either you had a consistent model in your mind immediately upon first exposure to assignment, the first hurdle in programming-- or else you never developed one!"
My wife is a brilliant woman, fantastic at what she does. The GP I mentioned in my post is a very good doctor who had no problems getting into a medical school, passing his boards, or running a successful practice. But that doesn't mean that everybody is meant to be able to understand the abstract concepts you have to master in our line of work.
Is there some sort of disruption of (basic) programming skills coming the way blogging has disrupted journalism?
Some of this was just normal boom-time labor shortage. There were lots of stories about how after the crash these guys would never work again. Some aspect of this was true, but some of it was bunk. The 2000 version of this story would be "sure you can do html, and you can do CGI, and you understand http headers and can whip up a server, but we need people who understand SQL and how to work with record objects and how to do live updates to a system, stuff you need a real degree for. It's 2005 not 2000"
So basically I am suggesting that while Rails may become a non-skill (like HTML has), and maybe good REST APIs will be auto-generated, and some future version or successor of Backbone.js will do most of our tricky JS stuff, there will likely be good toolkits that allow people to plug together data mining and data management without needing super-deep algorithmic understanding. We are already seeing the emergence of such tools.
So the other part of me disagrees: the code monkey will still be needed; they will just be putting together different bits than they are today.
It seems to me the Hiring Manager in the story was fishing for a jack-of-all-trades kind of applicant. The interviewee was obviously a web developer, but the position to be filled was a data analytics job.
What difference does it make if Rails/PHP/Node.js/Backbone/etc. become commodity jobs? In order to get the raw data that requires complex and high-speed algorithms to parse into usable data, you'll still need a website, built by a Rails/Backbone code monkey and a Photoshop designer, with the help of a decent DBA, at the very least. The data position, if it ever comes to it, will just be another job type that a well-rounded development team will need to fill, not a replacement for everyone else on the team.
Yeah, my technical screen for an 8-year career in Schlumberger was exactly this.
It is also an even bigger place when it comes to demand for people who can do simple stuff.
How do you deal with huge data: you just do.
There are tools for that, and you apply those and perhaps make new tools yourself, but math and algorithms tend to never enter the equation.
I can see why a college sophomore would fear this response. But in my 18 years of programming, I've seen that the vast majority of software development isn't about the stuff they teach you in school. It's about design, collaboration, languages, libraries, and frameworks. It's about working around crazy cross-version incompatibilities, solving heisenbugs, and keeping everything maintainable. Math and algorithms? Feh. Not the real issue.
Let's assume the startup bubble bursts and programming jobs become scarce. There won't be any kindly interviewer at the large bureaucratic companies. There will just be a faceless HR person with a keyword-searching database saying, "No CS degree--no interview."
But personally, even if the startup bubble bursts, I don't see the demand for programmers going anywhere but up. And that entrepreneurial background will only be an asset at the smaller, more interesting companies.
There will always be more people, and more need for people, writing high level application code, glue code, and "spit-and-polish" code than people writing deep, difficult systems code. Always.
Here's the thing, in computing the advances in tooling and performance continue to pile up at an amazing rate. In 2016 it will be even easier to roll out a product built by a couple "web guys" with little in-depth technical knowledge that does an amazing amount of business and has a profound impact on the tech world. Indeed, in time it will be possible to run ventures which support billions of active users per day on incredibly cheap hardware and with a rather modest amount of dev-hours behind them.
Imagining that the future is only for hard-headed systems programming is the time honored ego-stroke of the hard-headed systems programmer who sees all of these "dilettante" "web guys" doing amazing work in the real world and making an impact and money doing so. But that's just a fantasy. The truth is that software is art. It's often a thousand times more valuable to write software which communicates with users and evokes in them strong feelings and strong connections than to write software which is technically pure and strong, but sterile, impersonal, and useless.
No one cares if you can do an algorithmic analysis on different ways of sorting to choose the most appropriate way. These days it's dynamically built into the function. Just call sort.
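For illustration (using Python, though the point holds in most modern standard libraries): the built-in sort is Timsort, an adaptive hybrid of merge sort and insertion sort, so the algorithmic choice has already been made for you:

```python
# Python's built-in sorted() uses Timsort; no analysis of sorting
# algorithms is needed to get solid O(n log n) behavior.
data = [5, 2, 9, 1, 5, 6]

print(sorted(data))                # [1, 2, 5, 5, 6, 9]
print(sorted(data, reverse=True))  # [9, 6, 5, 5, 2, 1]

# Custom orderings are a keyword argument away, not a new algorithm.
words = ["banana", "fig", "cherry"]
print(sorted(words, key=len))      # ['fig', 'banana', 'cherry']
```

The interesting skill is knowing which comparison or key to use, not reimplementing quicksort.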
Educationally there's not much of a difference between a philosophy, math, or computer science degree. All of them are doing the same thing - logic. Philosophers approach it classically, mathematicians do it formally, and comp sci does it ad hoc or practically. Each has its virtues when you design or program.
What we need is good vocational education. Polytechnic schools that teach you how to code, be part of a project or lead one, design an application and so on. Then the CS taught in universities can focus on the basics of computing, analysis of algorithms, AI, programming languages, etc without the now mandatory Software Engineering 101.
Some sort functions might cover 99% of cases for web development, but not for other problems. The computer is not magical.
There are literally 100 cool things to learn and try: Like this weekend I thought about writing a small program for the DCPU-16, trying Meteor, making a small app using firebase, etc etc.
Possibly, learning more Math has a higher long-term ROI.
On another note: when everything melts down, it might be a good time to start another company rather than look for employment, though.
This will be even more true in 2016 than in 2012.
A nontechnical MBA is just blocked from this insight. And a great algorithm guy who is tone deaf to users is likewise blocked. Even together, they are handicapped compared to the guy who sees both sides.
The most powerful problem solving of all is a group of people who can see both sides. Pud's thread about 400K users and what to do next was stunningly wonderful to me. You don't see that on stack overflow and you don't see that in the Harvard Business Review. You see it here on Hacker News.
I know of 7 basic subcategories of knowing - all but the last susceptible to phrasing:
(1) Knowledge that is immediately accessible at great depth and can be traversed quickly
(2) Knowledge that is not immediately accessible but resides in the unconscious. It surfaces in dreams, showers and intuition.
(3) Knowledge that is not immediately accessible but can be so quickly understood from a search (physical or digital) that it might as well have been remembered. Truly, the old-fashioned idea that all your knowledge can only be kept inside your head is quaint.
(4) Knowledge that is not had but can be quickly acquired due to the similarity of the underlying structure to already possessed knowledge. With speed of acquisition proportional to similarity.
(5) Knowledge that is not had but can be acquired due to available learning strategies, knowledgebase and skill in acquiring knowledge.
(6) Knowledge that will be had in the distant future
(7) Knowledge that cannot be gained due to differences in interests, lack of motivation or sufficient strength of reason, an entrenched mode of thinking and set of beliefs which inhibit deftness with abstraction, and, for a very small few, reasons of biology.
Knowledge that cannot be accessed by Human Brains.
For many cases, the level 4 definition of knowing is sufficient and anything above should be good enough for almost all problems.
The point is: a university degree no longer means the accredited person is capable. It has lost its value; that's why some recruiters are turning to GitHub and other solutions to find skilled people.
To assure the author: I don't hold a university degree, and I live in a third-world country. A week ago, I was offered a software dev position at 30% of what a fresh engineer (five years of study) would get. The recruiter insisted that it was a starting salary, but it was only a fraction of what I make online. He told me the position is available anytime I change my mind.
Only four years earlier, in this same place, you'd be laughed at if you didn't hold a university degree, whatever skills and capabilities you could show off to the recruiter. Don't dream of getting that job, and even if you did, you'd be paid only a third or less of what your colleagues make.
I recently got a call from Google for my rank in Code Jam. I explained my position and expressed my desire to work on programming. The HR rep, a very nice person, made it all but definite that because of my background and experience, I should look for a risk analysis role and not coding. He is still willing to set up a programming interview if I insist... but I don't know what to do :(
"The problems we’re working on involve in-depth data analysis that require an extensive math and algorithms background"
So if you know a bunch of programming languages and built some fairly successful websites, why would you even apply for that job? You're an entrepreneur and a "general programmer" at best, or maybe even a "web programmer" only just good enough to hack a CRUD site together.
If the job posting actually indicated the need for in-depth data analysis with extensive maths and algorithms then this applicant wasted everyone's time by even applying.
I suspect that there is some assumption that the hiring manager doesn't know what she's talking about regarding the in-depth data analysis but I don't see where that assumption would come from.
But then there are a few start-ups who are designing and building fuel efficient diesel engines, or fully electric cars built from the ground up. These guys will absolutely know thermodynamics etc.
So if someone built "mycollegematehavingsex.com" and got 40,000 sign-ups via LaunchRock, who fucking cares. If they made twitter-for-squirrels and got 10,000,000 active users before the Great Nut Famine of 2015 killed the company, then there may be conversations to be had.
I am an OK Java & PHP programmer with some Linux Sysadmin experience (mainly running servers for apps I have built). I have basic CS credentials but am mostly self taught (starting with BASIC -> PHP -> Java) and have messed around with a bit of Game Dev & System programming in my spare time but nothing earth shattering.
I want to take the "next step" but my skills in math and Big Data analysis type stuff are fairly poor by HN standard, I am quickly approaching 30 and have very limited money.
A) Take a pure math degree (possibly also with stats) to improve my overall math skills in the hope that this will open more doors for me in stuff like NLP, big data, etc. (but also cost significant money + time).
B) Hack on existing open source projects to increase my knowledge of working on large & complex code bases (significant time cost but little money cost). Possibly complement this by working on the free online courses from Stanford etc.
C) Work on my own projects in the hope that I can build something cool that will get me recognition in some way.
D) Try to improve other skills outside programming/math and become more "well rounded".
B) Hack on existing open source projects to increase my knowledge of working on large & complex code bases (significant time cost but little money cost).
A pure math degree will not teach you the skills that you are specifically after: it is pure mathematics.
It's frightening to think with modern medicine and all the technology available, they can't really help you. In the old days, we were better off because now they're all "specialists." Everyone's getting better and better at less and less. Eventually someone will be absolutely amazing at doing nothing.
Good lord, DADS + ACM + CiteSeer and a healthy dose of curiosity and intelligence is enough to get most people through 99% of the "hard" stuff you're going to see even in the era of Big Data.
Have a look at puredata.info and Eastgate Systems' Tinderbox, especially the export template definitions and use of 'agents' for the latter
Why can't I have a visual flowcharty type thing that I model a business process in, click a button, and generate a Web app?
And as much as I would like to see a world where people take the time to learn theory, people are lazy, and that sort of thing isn't always viable (See also Sikha Dalmia's Gandhi Rule: http://www.thedaily.com/page/2012/04/19/041912-opinions-colu... - if you don't learn it, don't expect anyone else to). The consequence is that the computer is often smarter than the people using it.
Somehow, I don't really see this resulting in job losses, though, unlike many people claim. The variability and the amount of problems to solve usually mean that there is someone doing some design work, even if it is not on a very serious technical level. It also opens up new possibilities for startups, and I'm sure Stephen Wolfram would talk your head off about it...
As regards "cowboy nonsense": you don't always have to attach negative behaviors to ordinary things. Getting exercise and having a social life can be good for everyone, not just douchebags.
This article is a reminder that it's never too soon to start adding different skills to your arsenal.
10 years ago, if you learned Perl on the side it opened doors to great jobs. It's always the way that learning the up-and-coming, not-yet-mainstream skill will give you an edge in employment. Data analysis is the skill for 2012. Something else will be in 2016.
Company: Sorry, some dude named Dan Shipper hacked the recruiter industry 4 years ago.
Recruiter: So, do you have any jobs for me, myself?
Company: We need hackers, on a scale of 1 to 10, how good of a hacker are you?
Recruiter: Only a 4, but my toilet-cleaning skills are a solid 10.
I would hope by 2016 someone finds a way to make recruiters obsolete. Sounds like a perfect process to hack, with way too much inefficiency.
Are you looking to make a big difference and make lots of money? Be an entrepreneur. Who was that famous entrepreneur 80 years ago that said, when he was being questioned by an expert in court, that he didn't know the answers himself but he could easily hire someone who does?
If you are looking to get a job, then get the credentials AND be able to help them with what THEY need. Just like you need a product-market fit, you need an employee-job fit. If you are applying to a high speed trading firm, you'll probably need to know C++ and Java and low level concurrency mechanisms (this is what I was interviewed on), and if you know math and stochastic calculus you can make more money as a quant. I didn't like those jobs, so I got out of it. But you can still make a lot. Chances are though, you don't want to make money that way.
If you are looking to understand things better, then LEARN. It doesn't have to be in school. Personally I learned a lot from Wikipedia. Before that, I was coding on my own.
At 17 I enjoyed math and computer science. Check this out: http://www.flipcode.com/archives/Theory_Practice-Issue_00_In...
It has a lot of math in it, but in an accessible way. Why? Because I liked it.
So to summarize: it depends on what you want. If you don't have what it takes to help THEM and back it up with information, then do something else. Finding and picking your opportunities is often the secret!
It's like hiring an architect to do the structural engineering of a bridge. They both "make" buildings, but their heads are in different places.
I wasn't old enough to actually experience it, but I'm sure it was actually exactly like that. Job scarcity comes in cycles, but, assuming civilization doesn't collapse, tech jobs should be pretty safe ... forever.
If the former, go get some. No excuses!
If the latter, find a way to print yourself one.
For my part, I think that people who can present complex ideas well will always be employable. I think anyone who doesn't learn basic front-end programming concepts is doing himself a serious disservice.
The challenge of 2016 won't be solving hard mathematical problems. Yes, there will be high demand for people with those kinds of talents, and that kind of work will be (as it always has been) important. However, I think the biggest challenge is going to be educational in nature. It won't be enough to build great software; you'll have to teach people (who are too busy to learn and compare the intricacies of 35 technical assets just to do their jobs) how to use what you've built.
Maybe non-autodidacts need to go to college in order to learn how to program? (But I would doubt this-- you all knew how to program long before you were freshmen in college, right? I mean, hackers are born, they're not created in CS classes, right?)
On the other hand, the people I've interviewed with and worked with who were non-hackers, who went and got a CS degree, often were weak performers. Much of what's needed in the workforce is not taught in CS programs, and something about the way CS programs are taught seems to often condition people such that they have to unlearn a lot of stuff before they're fully effective in jobs.
Of course, I've known lots of good hackers with CS degrees. Hackers do tend to follow the custom and go to college and get their CS degree, and arguably could be better hackers than they would have been without the CS degree (though I think it's debatable whether 4 years of employment experience or the CS degree makes the better engineer- for some people it's one, for others it's the other.)
When I entered the work force, if not having a CS degree meant I couldn't get jobs it would have been a real issue-- but these days, it is a whole lot easier to start a company, and thus you don't need to be dependent on passing arbitrary HR requirements.
If you aren't playing the startup lottery (e.g.: starting an Instagram-like business and thus needing VC funding), it is vastly easier to start a profitable-from-day-one business now than it has ever been.
And 4 years from now, that's not going away.
This also shows how well resumes are read. Mine doesn't lie, but I put job history first. I'd usually have so many interview choices that I'd pick my top 5, do 5 interviews in a week, and get 4 offers and a callback. I'm sure some companies did read my resume and didn't give me the chance to interview as a result- but that's fine- it is like a built-in bozo filter from my perspective.
Frankly I think the requirement for a college degree is a bit like hazing. The people before you went through it, and so they aren't going to accept anyone who didn't have to go through it too. It has nothing to do with skills, just a way to exclude people who are different. Lord knows that piece of paper is not proof you can program.
He applied for a job where knowledge and/or experience in complex mathematics was required and did not get the job because he lacked the qualifications. How is this interesting or even news?
But that's also the point, right? Everyone with more than two years' experience in software development knows skill sets are changing continually. This story is speculating that data analytics will be the next in the long history of hot/baseline skills for the developer. It does so using the powerful narrative device of not saying so directly at all.
Is data analytics going to be the next critical skill for developers, like relational databases, object-oriented programming, or internet technologies have been in the past? Or is it going to be (or already is) the next bioinformatics, an interesting subspecialty but far from being the primary source of demand for software development talent?
What I am really saying is that data analytics is nothing more than the combination of math and programming. The programming part is evolving quickly, but not the math side.
I have never applied for a company coding position; I came from design and learned to code as I went; but 75% of my business now is in custom business apps. I've yet to meet a client who doesn't value the fact that I'm willing to learn what I need on the fly. Many times I take projects with the caveat that a certain amount of cash and time is probably going to be spent filling in what I don't know, and hacking around until I figure it out; and that if that becomes onerous, I'll knock some of it off the tab. I bill at $100/hr, modest by freelance coder standards, but obviously many times higher than coders on oDesk, and at least double what I'd earn in an office (if they'd hire me - which they probably wouldn't). And yet my clients end up paying less for rewrites and fixes, spend less time on the phone, and end up with a product they're happy with.
The small-to-midsized business owners who understand the value of letting me hack away, who ask what I think about how they can analyze their data, etc. get great value for their money, and I don't see a shortage of them. When I really, really don't know, I hire out to other hackers who do. I wouldn't want to hire a company composed of people who can memorize algorithms but can't think on their feet; I'd much rather have the exact opposite, and my clients at least feel the same way. And I'll use whatever tools are at my disposal. The first time I wrote an online store, in 2001, I did it from scratch. I had NO knowledge of databases at all... I actually didn't know there was such a function as serialize(). I ended up writing a whole custom back-end in PHP that did its own serialization, flat-file writing and retrieval on products, customers, orders, etc... hundreds of products in the store, thousands of customers... and that site is still running.
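For anyone curious what a "flat file instead of a database" back-end amounts to, here's a minimal sketch. This is not the commenter's actual PHP code; it's a hypothetical Python analog (the file name and function names are made up) of the same idea: serialize a whole collection to one file on every write, deserialize it on every read.

```python
import json
from pathlib import Path

# Hypothetical flat-file "database": the whole product list lives in one
# JSON file, rewritten wholesale on every save -- roughly the approach
# described above, minus PHP's serialize().
STORE = Path("products.json")

def load_products():
    """Read and deserialize the entire product list (empty if no file yet)."""
    if STORE.exists():
        return json.loads(STORE.read_text())
    return []

def save_products(products):
    """Serialize the entire product list back to disk."""
    STORE.write_text(json.dumps(products, indent=2))

# Usage: load, mutate in memory, save everything back.
products = load_products()
products.append({"sku": "A100", "name": "Widget", "price": 19.95})
save_products(products)
```

It works fine at the scale described (hundreds of products), which is the point: the naive approach was good enough for a real, long-running store. It breaks down under concurrent writes or large data, which is where a real database earns its keep.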
Companies that would rather have drones with degrees aren't companies I'd work for, and I'd argue that they aren't who successful businesses looking for software want to hire, either.
Thank God, that job interview was just a nightmare.
Upon entering the job market, I've encountered the new reality that the requirements for an entry-level position are going up faster than I can learn the new things that are required each year. Every position requires experience in something that I do not yet know or have only minimal experience playing with in a personal or academic setting, but there are many qualified candidates competing for the same position. Too many experienced journeyman programmers are out of work and competing for a smaller number of jobs.
The author's bit of sci-fi seems entirely reasonable. It's not where things are headed. It's where they already are, and it's been like this for years.
It's certainly not the Bay Area in California, where there are far more jobs than competent developers. I'm out in Colorado, and I still get constant offers even though I'm not looking for a job.
If you don't even have a day job, then you should have plenty of time to become an expert (or at least competent) in at least one narrow field that people are hiring for. If not, then there's no way you'd be able to handle a serious job if you were hired, since just about any job you take will require that you do a lot of learning on the job.
The key skill to have is how to learn. Unfortunately school (elementary through University) tends to hammer that skill out of people. Try to find it and reclaim it. Then build something that you can show off to prospective companies -- or sell it yourself! Good luck.
Nobody hires in one narrow field anymore, at least not according to the job ads. You need to show competence with six to a dozen different technologies from the basics (languages, server software) to the specifics (frameworks). Miss one and you're out. Have you worked with a CMS but not that specific CMS? You don't qualify. Have you lots of experience with several similar languages but never used this specific language in the workplace? You don't qualify. Are you an expert in the previous version of the language but you haven't caught up with the changes in the latest version? You don't qualify. Someone else evidently does because the company closes the position.
You need to be either on Bart somewhere or on Caltrain somewhere. Then you'll have tons of options -- in San Francisco, or in the 'burbs/secondary technology areas -- and the commute can be on public transit (to avoid the awful traffic all around the Bay).
And you don't NEED to move before you find a job. Suck it up and drive for two or three hours to interview if you have to. Try to get several interviews on the same day in the same area if you can. And make it clear to them that you're planning to move for the job -- some companies don't care what your commute distance is, but others do actually care how happy their people are, and assume that someone commuting 10 minutes will be happier than someone commuting two hours.