This Is 2016 Not 2012 (danshipper.com)
364 points by dshipper on Apr 21, 2012 | 231 comments



  1979
  Recruiter: "On a scale of 1 to 10, how's your COBOL?"
     edw519: "10."
  Recruiter: "Great! I have tons of work for you."

  1983
  Recruiter: "On a scale of 1 to 10, how's your data base?"
     edw519: "4. But I'm a 10 in COBOL."
  Recruiter: "No one cares about COBOL. I need data base people."

  1987
  Recruiter: "On a scale of 1 to 10, how's your Microsoft?"
     edw519: "4. But I'm a 10 in data base."
  Recruiter: "No one cares about data base. I need Microsoft people."

  1992
  Recruiter: "On a scale of 1 to 10, how's your Oracle?"
     edw519: "4. But I'm a 10 in Microsoft."
  Recruiter: "No one cares about Microsoft. I need Oracle people."

  1996
  Recruiter: "On a scale of 1 to 10, how's your HTML & CSS?"
     edw519: "4. But I'm a 10 in Oracle."
  Recruiter: "No one cares about Oracle. I need web people."

  2001
  Recruiter: "On a scale of 1 to 10, how's your Javascript & PHP?"
     edw519: "4. But I'm a 10 in HTML & CSS."
  Recruiter: "No one cares about HTML & CSS. I need back-end people."

  2009
  Recruiter: "On a scale of 1 to 10, how's your Ruby & Python?"
     edw519: "4. But I'm a 10 in Javascript & PHP."
  Recruiter: "No one cares about Javascript & PHP. I need Ruby & Python people."
Number of days without work since 1979: 0.

Lighten up, guys. If you can build stuff, learn, and work well with others, you'll probably always be fine.


edw519: "4. But I'm a 10 in Javascript & PHP."

Hang in there, 2009 edw519! JavaScript is increasingly back again in 2012 ;-)


especially on the back-end


If you can go from 4 to 10 in something new every few years, that's amazing in itself. (Although maybe we're using different 1-10 scales here; mine is probably logarithmic, and I'm probably an 8.x at my strongest skill, and 4-6 for secondary skills after some solid effort.)


That's why I hate the 0-10 scale for skill level.

Some folks consider a 10 to be the guy who invented the language/technology. Linus, DHH, etc. Almost no one is a 10 on that scale.

Others consider a 10 to be "I'm an expert in any of the tasks you're going to be asking me to perform with this tech". That's the scale that's more important to me.

Then, of course, you get the brand new grad who says they're an 8 in a language despite never making anything that's not a basic class project in it...


What most people seem to do with these scales is pick the language they are most comfortable with, say they are an 8 or 9 in that, and then arrange everything else on the scale relative to it.

The problem with this is that your overall numbers are likely to go down over time.

If you are a college freshman who spent a weekend learning JS, then you can probably now do most of what you can do in your main language (say, Java). That means you put down "5" for JS.

Once you've got a few years under your belt, your knowledge in any new language has a much higher standard to meet.

I once filled in a questionnaire for a job that had vague questions such as "Rate your skills in TCP/IP: Novice/Intermediate/Advanced".

I had no idea what the scale was here; did "advanced" mean being able to write a TCP/IP stack from scratch, identify bottlenecks in huge networks, or what?

I was trying to decide whether to tick novice or intermediate; however, it turned out that "advanced" actually meant being able to run traceroute from a command prompt and set A records in DNS.

I remember the advice a friend of mine gave me: "Just put 10 for everything, you'll get an interview."


I just put that I had worked with TCP/IP on a resume once, and they asked me what fields were in a TCP header. I was able to reconstruct all but one from first principles, since I didn't actually remember them all.

If I'd put "10" in TCP/IP for that company, I would have been slammed in the interview. As it was I passed. Apparently no one ever remembers the CRC byte in the TCP header. :)


People who have to worry about checksum offloading for high performance know this :) ... This is a feature of high end network cards.

(Although, in fairness, I'd only require that level of knowledge at the tip of the tongue if someone was going to be a network stack developer or a high-level sysadmin/network admin -- for a regular web developer, just knowing that there are TCP headers and flags which can be set, and which might get mangled by devices between client and server, is enough.)

It's a CRC at the level below TCP (well, two layers below: Ethernet).
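For anyone following along, here's a minimal sketch of the fixed 20-byte TCP header layout from RFC 793, with the field offsets as comments. Note that the field at byte offset 16 is a 16-bit ones'-complement checksum, not a CRC (the class and method names here are illustrative, not from any real library):

```java
import java.nio.ByteBuffer;

public class TcpHeader {
    // Fixed 20-byte TCP header (RFC 793), big-endian:
    //   bytes 0-1:   source port        bytes 2-3:   destination port
    //   bytes 4-7:   sequence number    bytes 8-11:  ack number
    //   bytes 12-13: data offset/flags  bytes 14-15: window size
    //   bytes 16-17: checksum           bytes 18-19: urgent pointer
    public static int checksumOf(byte[] segment) {
        ByteBuffer buf = ByteBuffer.wrap(segment); // big-endian by default
        int srcPort  = buf.getShort(0) & 0xFFFF;
        int dstPort  = buf.getShort(2) & 0xFFFF;
        long seq     = buf.getInt(4) & 0xFFFFFFFFL;
        long ack     = buf.getInt(8) & 0xFFFFFFFFL;
        int window   = buf.getShort(14) & 0xFFFF;
        // The "CRC" being discussed: a 16-bit ones'-complement checksum.
        return buf.getShort(16) & 0xFFFF;
    }

    public static void main(String[] args) {
        byte[] seg = new byte[20];
        seg[16] = (byte) 0xAB;
        seg[17] = (byte) 0xCD;
        System.out.println(Integer.toHexString(checksumOf(seg)));
    }
}
```

(Checksum offloading, as mentioned above, means the NIC computes that 16-bit field in hardware instead of the OS network stack.)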


From my limited experience, the right answer is always 7. It's a BS question, so yes, there is a right answer.

Here's why: When I see resumes, the only people who put 10s are recent college graduates. They put a 10 for Java and then don't know what a hashCode is. This demonstrates more about the question than about the candidate.

I've also interviewed at a few places, been asked the 0-10 question on a set of languages and been told that 5 "wasn't what they were looking for" and they wanted "10s only." This demonstrates more about the company than about the question.

The right answer is 7. It says "I'm good, but not Linus Torvalds," and if anyone asks a follow-up, just say "I'm never really sure what to say there." It doesn't matter if your answer means anything, because the question doesn't.


I recall a blog post where someone remarked that "7 is really a 3". Precisely because of Dunning-Kruger: people who don't know how big the field is just say 7 because it seems reasonably high and yet doesn't imply mastery.

That'd imply that game-theoretically, the right answer is 8. ;-) Enough to tell the interviewer that you've gotten past 7, yet not so high to make them think that you think you know everything.

(Personally, I've taken to saying "Judge for yourself" and listing my accomplishments. That's what they're really interested in, they're going to do it anyway, and this way I show enough confidence in my own abilities that I'm willing to let others judge them.)


The only legitimate use for that question is to have an easy excuse to no-hire an idiot who says "10".

The correct answer is "I dunno, what does 5 mean?" to show humility and an insistence on meaning in metrics; if pushed, say 6, and trust that decision makers anywhere you care to work will ignore the doofus who insisted on a number.


"When I see resumes, the only people who put 10s are recent college graduates. They put a 10 for Java and then don't know what a hashCode is."

Please tell me what colleges these graduates are coming from so I know never to hire/work with any of them, ever.


I don't think that's a specific problem with the colleges themselves.

It's just that when you spend 3-4 years in that kind of bubble and you keep getting good marks on all your assignments, you are naturally going to think "I've mastered Java" based on the only evidence presented to you.

Of course there is a difference between knowledge and aptitude here.

Would a candidate who doesn't know what .hashCode() does, but who after a 10-minute explanation says "yes, I understand, that makes sense now", be a bad hire?

I program a lot of Java, but I am unfamiliar with huge amounts of both the language and the standard API (especially the parts dealing with 'enterprise' stuff and concurrency). This is mainly because I explicitly design my Java apps around a shared-nothing/POJO-JPA configuration, and I don't use RMI, reflection, or JMS.

Does that mean I am a bad Java programmer?
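For readers wondering what the .hashCode() question is actually probing: the contract in java.lang.Object is that objects which compare equal must return equal hash codes, or hash-based collections quietly misbehave. A minimal sketch (the Point class here is a made-up example):

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

public class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    // Without this override, two logically equal Points would land in
    // different HashSet buckets, and contains() would return false.
    @Override public int hashCode() { return Objects.hash(x, y); }

    public static void main(String[] args) {
        Set<Point> set = new HashSet<>();
        set.add(new Point(1, 2));
        System.out.println(set.contains(new Point(1, 2)));
    }
}
```

It's the kind of thing you only hit once you've put real objects into a HashMap, which is probably why class projects don't surface it.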


A phonology professor of mine at Ohio State spent 20 minutes during one of my classes discussing the syntax and semantics of grade point averages, which, I have to say, was one of the most useful things anyone ever explicitly taught me while at Ohio State.

In particular, discussing how, despite using the same syntactic formalism, the semantics of what was considered "passing" for undergraduate students vs. graduate students was something no one had ever enunciated to me explicitly (e.g., you pass as an undergrad if you're getting Cs; if you're getting Bs as a graduate student, you are doing it wrong).


And it's not because grad students are supposed to do better, it's because grad school grades are more inflated.


It's because grad school grades are meaningless. No one ever asks about your grad school GPA. So "A" means "pass" and "B" means "fail".


In the job interviews I held, I clearly stated on the pre-hire self-eval: "0 is I've never used it, 10 is I wrote a book on it."

And I'd STILL get people who rated themselves 10 in multiple categories.


  >> If you can go from 4 to 10 in something new every few years, that's amazing ...
If you follow edw519's posts (and I recommend that you do#), you will see he is an amazing guy whose life revolves around programming.

#see http://news.ycombinator.com/user?id=edw519, free ebook, "The Best of edw519": http://edweissman.com/53640595

Also well worth following:

http://news.ycombinator.com/threads?id=pg

http://news.ycombinator.com/threads?id=patio11, http://www.kalzumeus.com

http://news.ycombinator.com/threads?id=tptacek

http://news.ycombinator.com/threads?id=cperciva

and many others...


I like this, but I don't understand it. How can you end up with "Number of days without work since 1979: 0"?

Isn't the reality more like the fact that you have to constantly obtain new skills, thus making you only "hireable" part of the time?

In 2001 you learn you don't have what folks are looking for, so you go and "retrain", get a couple of years' solid experience[%] in that skill, and then have a decent run until 2009, when the whole thing rinses and repeats?

To me, scenarios like these make the case for actually having a CS degree -- that way you have a chance of achieving a "zero" in number of days without work since 1979.

  1979
  Recruiter: "On a scale of 1 to 10, how's your COBOL?"
  CS Graduate: "10."
  Recruiter: "Great! I have tons of work for you."

  1983
  Recruiter: "On a scale of 1 to 10, how's your data base?"
  CS Graduate: "4. But during my days at MIT, we did cover database theory quite intensely; I'm sure I could adapt to your needs very quickly."
  Recruiter: "Oh, MIT 'eh, I'm sure you could, when can you start?"

... and so on...

[%] I know a solid programmer can pick up new skills pretty quickly, but recruiters have a funny way of always insisting on a certain number of years of experience in a particular skill, don't they...

P.S. I don't have a CS degree.


I think the point is: don't go through recruiters to find work. Even if the company says "we only accept applications through recruiters" that can be short-circuited if you have an acquaintance who already works there.


Depends on the company. Larger organizations will often have a mandate that EVERY hire go through prescribed HR processes and channels. MAYBE if you know a C-level person you can get an exception.


Last time I got hired by a big organization (definitely not C-level), my interaction with HR was as follows: they called me up to say they were sending an offer letter, and asked if by the way I could fill out an application and send them a résumé, for their records.


That's probably an extreme case. Often, knowing someone who will vouch for you will get you straight to the stage of interviewing with a technical person. You still have to go through the motions for their process, but it's a pure formality.


HR process, yes, but there's a huge difference: most (not all, but definitely 90+%) of recruiters are simply searching Google for buzzwords and forwarding you over to HR (for this, they expect large amounts of money - typically 6 months of your first-year salary). HR usually filters again and ships it through to the actual group which is hiring.

Having a friend on the inside avoids two potential filter failures without having to actually change the HR process - it's saving HR the cost and hassle of a recruiter and essentially assuring them that they're not going to get flack for wasting anyone's time if they forward your resume through to the actual hiring manager.


and if you don't?


Go to (and preferably speak at) the same user groups and conferences that people from that company go to. Hack on the open source projects that the company uses internally. You will soon have friends in that company, and probably 10 others.


This assumes that they are using open source projects internally, that you (a) know what they are using, and (b) that they care about who maintains them rather than treating them as a black box that does stuff for free (which is how most open source projects are used in practice).

Also, tech conferences seem to be a pretty expensive way (at least in terms of travel) of hoping to bump into someone.


At some point, if the company is not making an effort to reach out to better programmers (by actually looking up the names behind the open source projects it uses, and maintaining a visible presence at tech conferences), then it's their loss, no?

If a company only trusts "recruiters" to select (or pre-screen) candidates for them, then this says a lot about their way of managing and interacting with "our" kind of programmers (by that I mean people who read HN, are enthused by cool innovative projects, write blogs, publish open source code, ...).

If the company doesn't help you bypass the recruiter, maybe you should bypass them altogether?


"This is assuming that they are using open source projects internally and that you A. Know what they are using and B. That they care about who maintains them rather than treating them as a black box that does stuff for free"

Then turn it around and look for companies based on the open source products that they use. There are plenty of companies that are heavy users of open source software, and don't hide that fact. You might have to move or something, but that's life.

"Also tech conferences seem to be a pretty expensive way"

There are many cheap conferences, or free exhibit hall passes for early-bird registration (which usually allow BoF sessions as well, and maybe parties); and user groups are ordinarily free (maybe you chip in for dinner or something). Most areas have conferences or user groups about something, and maybe you can just drive. Often you can share a room with another attendee (ask on some mailing list related to the conference) to save on lodging. If you have developed something or have some kind of speaking experience, the conference may pay for some or all of the travel expenses, and may waive the conference pass. There are even some open source projects that have money waiting for anyone who wants to bootstrap a user group in their town.

And these things are not theory. I have done absolutely everything I have suggested to you here (except trying to bootstrap a local user group): get free conference exhibit pass, drive to conference, share room, get started contributing, get major patches accepted, and get jobs through community connections. And I still do it, and I see many people rising up through very similar means that had little in the way of marketable talents a while ago.

Disclaimer: Nothing about this happens overnight. It takes a long time. There are no magical steps where I can say "do these things and you will get this". You have to look out for the opportunities on your own. Start attacking your own reasons why something can't be done, knocking them down one by one.


Thanks for the advice.

I do in theory like the idea of speaking at a tech conference. I am just worried about the level of experience that would be required to do so.

I don't want to stand up in front of X number of people and tell them stuff they already know.

I live in a place that is very very far from being a "tech center" in the way that SV is but there are a few user groups around (mainly PHP groups, LUGs etc). I should probably check them out.

I did go to a local LUG once, but that was about 5 years ago and it mostly consisted of unemployed neckbeards grumbling into their pints of bitter. Hopefully things have changed since.


"I don't want to stand up in front of X number of people and tell them stuff they already know."

A topic that is always safe is "I used technologies X, Y, and Z to build this thing, and here's what I learned along the way". It's usually a little more entertaining because it's a story (often with a few funny anecdotes); and you're admitting that you didn't do everything perfectly from the start, so being wrong somewhere during the presentation isn't a deficit. Being wrong really only matters in a persuasive talk.

(Aside: it's pretty well established that teaching is one of the most effective ways to learn deeply.)

I'm sure you can find better speaking advice elsewhere, but one other thing I should mention: especially at user groups, a talk is just to start a conversation, which will often be held at a bar later. So don't worry about concluding anything in a profound way.

As for location, then yes, living within driving distance of the SF bay area helped me a lot. But I didn't begin speaking until I moved to Portland, Oregon, where I regularly attended a user group. I'd like to think that a lot of other areas have something to offer as well, but I'm sure it takes more searching.


Recruiter: "No one cares about COBOL. I need data base people."

Need a better recruiter. People are still making really good money with COBOL :-)


True story:

Recruiter: "So you were writing Java code that talked to an Oracle Database, right?"

Me: "Right"

Her: "What version of Oracle was it?"

Me: "8i I think"

Her: "Sorry, we're looking for people with 9i experience." (9i had just come out.)

Me: "Really? You know, SQL[1] is a standard, and I'm writing Java code anyway; we were using an ORM as well. I didn't even end up writing that much direct SQL code."

Her: "Sorry, the client specifically said Oracle 9i -- I'm looking at it right now. They're not looking for Sequel experience."

This is when I decided to always avoid recruiters if I possibly could.

[1] I pronounced this "Ess Que Ell" but she responded "Sequel" which I think was the name of another database vendor at the time.


Both pronunciations (SQL/sequel) refer to the same thing, and both are correct.

http://patorjk.com/blog/2012/01/26/pronouncing-sql-s-q-l-or-...


I think his point was that the recruiter did not know this.


I just call it "sqwel" or "squeel" that way nobody is happy.


Reminds me very much of the job adverts I used to see asking for "5 years C#" when .NET was about 6 months out of beta.

Either they only want people who worked for MS on the platform/language itself or the whole thing was copy/pasted by a recruiter.


I liked the theory that someday a recruiter would ask DHH how many years of Rails experience he has, and he should answer "all of them".


Wasn't there some actual recruiter offering DHH a senior Rails developer position? I'm pretty sure I read about that in DHH's Twitter stream some time ago.


Groupon didn't quite ask for his level but did ping him about web dev work.

https://gist.github.com/1285068#gistcomment-56327


I see some value in this type of thing. If someone can't bluff their way past a (non)technical recruiter, will they be able to handle a requirements meeting or communicate status to a PM? Some engineers really can only talk to other engineers.

What's the worst thing that could happen if you fibbed "Oh yeah 9i, I've used that too."? You'd get an interview with the database guy who actually knows the difference.


IME, a lot of the best programmers are anal (often to a fault), and won't lie in that situation.

Not because of ethics or morals but just because they refuse to say something they know is technically incorrect.


I have been hired largely on keyword matches before, it was a bad move.

Think about it: most everyone you're going to be working with will have been selected on such a basis as well. Not so much bad people, but on the whole a thoroughly uninspiring workplace where creativity was actively discouraged.

Unless I was really desperate, I would prefer not to get hired at a place that used that sort of process.


Like Sheldon Cooper in The Big Bang Theory, when the judge asked him to apologize: "I'm a scientist, I never apologize about the truth!"


Life just doesn't work that way. IME, the people who fib a bit and embellish when applying for jobs, because they know they can, do a lot better than those who tell the truth all the time.

Reality is most employers put up fake 'requirements' because they don't know how else to find 'smart, can learn, gets stuff done'.


Are you advocating lying in job interviews?


I think he's advocating 'advancing your skill-set'.

Oh And: Self-Marketing, it isn't always evil.

But I'm being honest, sometimes you have to make yourself sound good to get the goods.


I think this is valid to an extent, like accepting a title of "expert" in a technology used by the company that you know well but are not actually a real "expert" in. You and I know that the real experts are the guys constantly replying to stuff on the mailing lists and making lots of significant patches to the core of the project, but sometimes it's OK to condescend to use the vocabulary of the plebes and just embrace "expert" within that context. You aren't deceiving anyone here, because it's just a different definition of a word, and you are using the definition that the other people expect. In fact, one could say it's deceptive to insist on your specific concept of an expert when you know that the people interviewing just need someone qualified.

But I think claiming experience with an explicit version that was stated is going a little bit too far. It's important to clarify. Maybe someone actually does need help with the new platform, and after all, if they're having inept recruiters do their pre-hire screening, you're probably not missing much anyway.

There is a bit of a line to fudge there, but it definitely doesn't extend into falsifying tasks or skill sets, IMO.


Definitely. And I suppose I should have clarified; the 'advancing your skill-set' bit was a joke. Or at-least meant to have humour behind it. Sadly the internets doesn't pass off the jovial nature of my voice. :D


> What's the worst thing that could happen if you fibbed "Oh yeah 9i, I've used that too."

Getting a job with a company where clueless people make important decisions. Not everybody handles such situations well.


> What's the worst thing that could happen if you fibbed

Well, the worst thing would be if it was a trap question designed to weed out the people who lie at initial recruitment selection.


"Well, look, I was under an NDA, using 9i as a beta tester..."

Skin that cat!


"Sorry, but we can't stomach the risk of being seen as forcing you to divulge information that was acquired under an NDA. It just isn't worthwhile for us to risk the lawsuits."


> What's the worst thing that could happen if you fibbed "Oh yeah 9i, I've used that too."?

Well, the managerial term is "managing expectations". But it's just effort that seems unnecessary -- perhaps it's just my current situation.


The worst thing that could happen? I might get the job and find that all of my coworkers were the kind of people who lie whenever they think it will advance their own interests. That is, I'd be working with people like you. That would be a terrible outcome.


Here's another true one:

"Do you have experience with APIs?"

"Uh...Yeah. Yeah, I do."


"How about electrons?"

(Just wait. It'll happen.)


Interestingly, Microsoft SQL Server is pronounced "sequel server": http://stackoverflow.com/questions/3594718/how-do-you-pronou...

I pronounce MySQL as My-ess-cue-el.


Where I work, when people say "SQL" they usually mean "MS SQL Server". Perhaps that's what was in her mind.


This is absolute and utter nonsense.

Firstly, only a tiny number of people know C, C++, Java, Python, and Ruby. If I found someone who knew that lot, I'd probably hire them on the spot. That shows some real skill; multi-linguists are actually pretty rare, discounting the obligatory uni-taught Lisp and JavaScript.

Secondly, there's a constant need for people who make CRUD apps. Constant. Almost every business can benefit from a totally custom app with its own special workflow. We tried RAD tools, we tried auto-generation tools, we tried plugin workflows that would be "user"-edited. Turns out if you don't involve a programmer, it all goes very wrong.

After 20 years of promises from Delphi, VB6, Java, Rails, etc., the reality is that it's getting harder to make good apps because everyone's expectations only go up. Bottom line: to make a CRUD app you still need a programmer. Almost every business is realising it needs a programmer.

The market's only going to get bigger, much, much bigger.

This reads like it's from a person who's never been out of the ivory towers, hasn't actually been inside a real business.


I've worked for a very long time in software, network infrastructure and now Internet payments (complex stuff too) and mobile.

I've never heard a manager or developer say: if only we had someone who could write an algorithm to solve this travelling salesman problem, we'd be saved! No. Inefficient algorithms are often good enough; typical programmers muddle through and, yes, pick up a textbook or reference Wikipedia once in a while.

Modern software is a complex tangle of standards, protocols, and technologies. To build a typical modern web application you're going to need to know Linux administration, database administration, some server-side language (Ruby), a web framework in that language (Ruby on Rails), JavaScript (perhaps CoffeeScript), HTML, CSS (Less), message queues (Redis, Beanstalk, etc.), caches (Varnish), load-balancing proxies (HAProxy), SSL termination, web application firewalls, network firewalls, providing JSON REST APIs, mobile optimization, real-time communication (socket.io), third-party API integration, Amazon S3 management, and scaling. Etc.

All of this lifts the benchmark for what is an acceptable web app. The list goes on, and new bits keep getting added every year along with expectations.

It's bloody hard to find people that can deal with a majority of the above technologies. If you find someone that can do the full stack they're gold.

You want to be valuable? Prove that you're someone that can develop within modern software stacks and more importantly adapt as things change. Because change is coming, the music hasn't stopped and we're in for one hell of a ride.


"I've never heard a manager or developer say: if only we had someone who could write an algorithm to solve this travelling salesman problem, we'd be saved! No. Inefficient algorithms are often good enough; typical programmers muddle through and, yes, pick up a textbook or reference Wikipedia once in a while."

Exactly, or google it and find the difference between the different sorts, or look at any one of the great visual representations of sorting strategies. I've seen otherwise very accomplished coders throw up a crappy bubble sort and move on because it just didn't matter. And it didn't. In any case, 95% of the time those sorts have already been implemented in your Hash object or database or somewhere else in your language's collections library.

I've been doing this...Good Lord nearly 15 years now -- mostly in the middle-tier and lower, so I've had plenty of opportunity -- and I can't even fill up one hand with the number of times that I had a problem that required anything beyond rudimentary algorithmic analysis, much less a deep examination of sorting efficiencies, etc. Those are solved, cut-and-paste problems, the least of your worries.

The list of technologies you give is spot on. I'd add to that: good data modeling/object design, testing/TDD (if you're into that), security issues to be aware of (XSS, SQL injection, mass assignment), classic performance killers like n+1 queries, on and on.

And yes, the change bit -- as a coder you can never assume you've learned your craft. You're always learning it. We're not blacksmiths that get better and better at one thing over the years and arrive at something like "mastery."

We're attention-deficit polymaths who have to scramble week-in, week-out just to maintain competence on the current problem, while anticipating the trends that will be turning the apple cart over a year from now. Remember in 2008 or so when RIA was the buzz-cronym and everyone was talking Silverlight, Flash, and the lot? And 4 years from now, who knows what the acronyms will be. We may all be laughing about how awesome we thought HTML5 was going to be (probably not, but in this business...).


If someone gave me a sort implementation in a code review, I'd have to seriously wonder how the heck that person got hired. Libraries exist for a reason.

Sorting is a good example, though, because it is often complicated not because the algorithm is hard, but because the sorting criteria rely on heterogeneous data, sometimes coming from multiple services. Coming up with a clean code design in such cases is challenging, and that's where good developers shine.
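To illustrate that point with a sketch (the class and field names here are hypothetical): the interesting work is usually composing the sort criteria, not implementing the sort itself — e.g. with Java's Comparator combinators rather than a hand-rolled algorithm:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortDemo {
    // Hypothetical holder for data merged from multiple services.
    static class User {
        final String name; final int tier; final double score;
        User(String name, int tier, double score) {
            this.name = name; this.tier = tier; this.score = score;
        }
    }

    // The real design work: composing heterogeneous criteria.
    // Sort by tier ascending, then by score descending.
    static final Comparator<User> BY_RANK =
        Comparator.<User>comparingInt(u -> u.tier)
                  .thenComparing(Comparator.<User>comparingDouble(u -> u.score).reversed());

    public static void main(String[] args) {
        List<User> users = new ArrayList<>(List.of(
            new User("bob", 2, 9.5),
            new User("ann", 1, 7.0),
            new User("cat", 1, 8.5)));
        users.sort(BY_RANK);           // library sort; no bubble sort in sight
        System.out.println(users.get(0).name);
    }
}
```

The library handles the O(n log n) part; the clean expression of "what comes first" is where the developer earns their keep.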


You got that right. It's strange, though, that most programmers are never taught half that stuff in college. Hardcore computer science may be different, but related degrees basically walk you through C++, teach you some OO design patterns, and let you loose in the world. In my experience, every single programmer I've met who relies solely on their college education for their work doesn't know half that stuff! Now, I'm talking about people who "specialize" in web development here, to be clear. They know some HTML, JavaScript, or ASP and that's about it. The command line is mostly foreign to them, and they rely on a lot of pointing and clicking to create anything rather than digging into some code and settling in with a cozy text editor.

Then there are the ones I've met who picked up some books, read the tutorials, and got their hands dirty with some code completely outside their degree which was in something totally unrelated and those are the ones who totally amaze me with their skill. Even a lot of the big web development companies around me are WYSIWYGing their way through client work!

It just totally astounds me, and your comment and this article are making me ask myself (again): why is that? Is what I describe normal? I'm in Chicago, and maybe outside the Valley things are different? Has anyone else seen what I described, or is my experience just a fluke? If it's anywhere near common, I'd be a little disturbed.


> Is what I describe normal?

100%, at least in my country.


Thanks for your feedback. As far as the meaning of the article, I wasn't trying to prove a point as much as elucidate something I was thinking about yesterday: the skills you pick up through entrepreneurship.

For the past 4-5 years of my life (starting in high school and continuing through college) I've been concentrating a lot of my time on learning how to be a better entrepreneur. That's to the detriment of almost everything else in my life. And it's worth it to me because I love it. Even better still, it seems like a pretty safe bet, because if worst comes to worst and I completely fail over the next few years, I'll still be able to get a job as a coder somewhere.

But the thing about being an entrepreneur is that it encourages you to get marginally good at a wide range of skills instead of REALLY good at one area. And so something I was thinking about is the potential consequences of this decision on my life. This is what I came up with.


> But the thing about being an entrepreneur is that it encourages you to get marginally good at a wide range of skills instead of REALLY good at one area.

Being marginally good at many things shows that no matter what gets thrown at you, you'll pick it up fast. This is necessary in entrepreneurship, and it becomes ever more necessary elsewhere as the pace of software development increases.

There's no such thing as "algorithms" skills anyway. No matter what skills you have, you'll probably need to adapt them heavily to whatever new job you find yourself in.

Yes, we're mainly startup-ish people around here, but I'm pretty sure that even if I wasn't one, I'd hire a jack of all trades with a proven track record over skills in "math" and "algorithms", whatever that means.


Most multi-skilled people I know (and I include myself) would not be happy dedicating their time and talents to a single thing. Although it's not a technical limitation, I would say the reasoning makes sense to me.


Your story is the state of reality now for certain types of programming jobs. Here in DC there are plenty of government contractors who don't care how much demonstrated success a person has shown as a tech entrepreneur; they want a guy with a CS degree and 10 years enterprise Java experience ONLY. The folks who work these types of jobs are your classic middle-age engineers. They read a lot of documentation, have a lot of meetings, and don't much care about being agile or fast.

The reason people here on HN are objecting to your story is that there are ALSO coding jobs for startups, PR firms, or consumer brands where they DO value entrepreneurship, speed, and creativity. And these types of jobs will never go away. Edelman or Nike is never going to convert their development shops to all CS Ph.D.s with 15 years experience optimizing compilers. They're always going to be building interesting interactive projects for major brands, so they need people who are always restlessly finding the next cool thing.

So just make sure you always apply for the latter kind of coding job and you should be good.


Amazing article. I can relate.

I too am an entrepreneur who sold one company and am well into my second one. (Though the fact that you've sold multiple companies is damn impressive!) Running your own company means you have to wear many hats, and it's tough to get really good at just one thing. Some days you're a graphic designer, or a developer, or a marketer, etc.

However, I also think it's boring to be stuck in one specific area for a long time. I like to be challenged, and I am constantly learning new things to broaden my knowledge. Whether or not this is a good thing remains to be seen.


I agree. The notion that we've already peaked in the number of profitable places to apply software is ridiculous. And that's what it will take for them to have "too many qualified candidates".

In the mid-80s people tried to tell me that software was too crowded, that there were too many people going into it and that I was unlikely to have the career my dad had. They were partly right: I've been even more in demand than he was.

Software development salaries are crazy high right now, which is a sign of a ton of suppressed demand. The web is far from done. Mobile is still rising. We are just getting started on the "internet of things". And that's only on the consumer side. Business is not going to be less dependent on software, and the ever-more-networked world is shortening cycles and increasing competitive pressure, meaning business software needs to change more often.

Yes, by all means, people should keep on learning. But the "gosh you'll be fucked in a few years if you don't go back for your CS degree right now" thing is BS.


Exactly. Everything is becoming a software problem. Dispense coffee? Make a rug? Drive a car? They'll eventually all be software problems. Businesses are limited in their ability to understand and adopt software by the supply of talent. This will continue for a long time, and it'll be people who can integrate complex systems that will solve them, not academic algorithm experts (whoever they are).


Thanks for commenting! As mentioned elsewhere on this thread I'm not really trying to say that you need a CS degree to be a good programmer. But I think concentrating on programming vs concentrating on entrepreneurship leads to different results. And so I was thinking about the consequences of that concentration and what that could look like in a few years.

I definitely wasn't saying that this is definitely what's going to happen. In fact I really don't think I'm going to be applying for jobs in 4 years. But I thought it was an interesting scenario to write about and get people's thoughts on.


Firstly a tiny amount of people know C, C++, Java, Python & Ruby

What does it mean to "know" a language? I've been writing almost exclusively in C for 15 years, but I wouldn't say that I "know" C; at least once a month I end up needing to consult the C99 standard for some obscure point of library specification.

On the other hand, there's a lot of people who know how to write Hello World in all those languages and consider that to constitute "knowing" them. Would you hire all of them?


Oh come now, you would say that you know C (even if you're unwilling to say you "know" it for the purposes of debate). You don't have to have a language memorized to "know" it, and you don't "know" it if all you can do is write Hello World.

The answer is somewhere in between. My personal metric is: do you have a reasonably accurate estimation of what you don't know. The better you know a language, the better you know what you don't know.


>My personal metric is: do you have a reasonably accurate estimation of what you don't know.

That's a pretty good measure of how well you know any subject actually. You could call it the "Dunning–Kruger metric".


If you really knew what you were talking about, you'd express more doubt about whether it's actually a good measure of how well you know any subject :)


  >> What does it mean to "know" a language? 
My definition is that it is a language you would be willing to start a new project in, given a tight deadline.


I "know English" but I still need to consult a thesaurus or dictionary from time to time. I believe there are standard tests you can work through to determine your level of fluency (at least as far as that standard is concerned). But that isn't really necessary. If you know that you are consulting the C99 standard for "obscure" specifications, then it's safe to say you know the language. If you need an official standard, they are there.


Every job I have interviewed for in the past five years had "C, C++, Java, Python & Ruby" or a close equivalent as a basic requirement, not grounds to hire on the spot. (Before then, Python and Ruby weren't as widespread, and Java wasn't totally universal yet.) But yeah, it is on the higher end (large web companies paying top-quartile compensation) of the industry.


Sorry, but no.

People who know Java really well tend to not really know C++ even if they put it on their resume. Unless you count "how hard can it be" knowledge. Same the other way around. People who know C++ really well think "how hard can it be".

Modern C++ is a very complicated beast, most people who claim to know it but do not have solid experience usually don't. Modern Java is a huge tower of libraries and concepts, so it's the same.


...until you cross a certain age. The first generation of Java programmers were C++ refugees from the 90s. While C++ has advanced a little since then, it really hasn't changed that much.

I would never hire someone based on the libraries they are familiar with. In fact, the next time I interview someone, I'm planning on giving them some sort of "here's a totally unfamiliar API, do XYZ with it" problem. And they can find the documentation on the web themselves.


The first generation of Java programmers were C++ refugees from the 90s. While C++ has advanced a little since then, it really hasn't changed that much.

You should have a look at C++11 if you haven't already.


You are definitely not talking about the jobs this kid will be interviewing for 2 years out of college, which is the context of the piece.


Totally agree on CRUD applications. I would like a nice library app for my local library. Look at the comments on the iPhone app:

http://itunes.apple.com/au/app/bookmyne/id350625461?mt=8

> Keeps saying "no enum const class".


> Firstly a tiny amount of people know C, C++, Java, Python & Ruby. If you found someone with that lot I'd probably hire them on the spot.

I fit your description in full. What is your offer? You can find my contact information in my hacker news profile.


> thoughts on programming, startups and entrepeneurship by a college sophomore

Well, at least he's honest about it. As a student you might hope the stuff that you learned is worth something; I have serious doubts now.


SharePoint is a few steps closer to that compared to Rails/Java these days, assuming your company has decided to adopt it wholeheartedly.


I thought this was great. Very clever. Every frontier seems like it will be a frontier forever, until suddenly it isn't. Perhaps software is going to settle down, start a family, and quit this cowboy nonsense. I don't think it'll happen by 2016, but I'd be hard pressed to say it won't happen ever.

On the other hand, I like to think that maybe what we call entrepreneurship - the hacker ethos, autodidacticism, uppityness, get-shit-done, rationality, self-improvement, a weird mix of skills cut in a wide swath of "whatever I needed to learn at the time" - is actually an overarching set of tools and attitudes for turning your ideas and ambitions into real things.

I can't imagine a world where it's not useful to do that anymore, even if the technology changes. It's true that the entrepreneur toolkit and the 9-5 megacorp toolkit aren't compatible, but every trend I can see points to people needing less and less to get more done. Economies of scale are technological: they shrink as the cost of production shrinks. Diseconomies of scale are sociological: as long as people are still the same, they'll stay the same. The advantages of being a large organisation might not always outweigh the disadvantages.

What if it's not us that get left behind, but them?


Software might never end. It will grow and grow and grow and grow until everything is software (or food). Your chair? Software. Your medicine pill? Software. Your glasses? Software. Your boots? Software. Walls of your house? Software. Your car? Software. The road you drive on? Software.

I think we've not even tapped this yet. Not only are we not over the "peak software" hump, it's not anywhere in sight; we're not even on the main exponential growth curve.


Most of them do already exist.

Software based pill: there's already a pill that uses a sensor to measure its place inside your gut and releases its medication in a specific location. There's also an in-body device that releases medicine at regular intervals.

Software based glasses: augmented reality glasses. They might be a growth area for apps.

Programmable shoes: there are shoes with a programmable pattern, and shoes for people with Alzheimer's that transmit your location.

Programmable cars: cars are already programmable. They contain huge amounts of software.

Programmable chairs: some dental chairs are already programmable. You have programs for different seat positions.

My guess is, to create a new software employment boom, you need to create a new mass-market programmable platform that needs a lot of small-volume applications, and that developing for it is neither too complex (Windows GUI before VB) nor too simple (Excel).


When they finally arrive you'll see that no, they do not.

They are just blossoms here and there, not a mesh of grass.

Everything will not simply be programmable: it will be flexible, changing. Your wall will not be concrete with some software. It will be software with some concrete.


Interesting ideas. I'd love to read some good science fiction novels based on this.

It has the potential to be very relevant. 3D plastic printers are interesting, but too limited at the moment. 3D concrete printing has existed for a while, though; that can definitely allow software to be/create the walls of your house.

For medicine it's just a case of getting the simulation down. It'll also be interesting if robots get better. There's a lot of boring lab work in the natural sciences, if this can be automated and controlled by software... Your testing can be done by software... Genetic algorithms could be used to find cures to things. (Although using robots to physically test would presumably be much much slower and more expensive than a computer simulation. :( )
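Since genetic algorithms came up: here's a minimal toy sketch of one in Python, evolving a bit-string toward all ones (a stand-in fitness function for any search problem; this is purely illustrative, nothing like a real drug-discovery pipeline):

```python
import random

random.seed(0)  # deterministic for the example

TARGET_LEN = 20  # genome length; fitness = number of 1-bits

def fitness(genome):
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [b ^ (random.random() < rate) for b in genome]

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# Random initial population of 30 genomes.
pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(30)]

for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitism: keep the fittest third as-is
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = max(pop, key=fitness)
assert fitness(best) >= 18  # converges to (near-)all-ones on this toy problem
```

Swap the fitness function for a simulation score and the same loop searches chemical or design spaces, which is the idea the comment is gesturing at.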


I recommend reading The Diamond Age by Neal Stephenson. It is classic cyber-punk about what happens socio-economically when damn near everything can be made by the 3D printer in the corner of your house, as long as you have the raw materials...and the software.


I think by "your walls are software" he meant that your walls are controlled by software. Think heat management, view screens, maybe re-arranging them with a remote. But they're probably constructed with software, too.


I think he meant more than simply "your walls are controlled by software". Think of Utility Fog: your walls could be made of hundreds of little robots linked together, and can be reconfigured to a different shape in seconds under the control of software. The robots are sort of like three-dimensional pixels; the shape they assume at any given moment is just the shape that the currently running software gives them.

Or perhaps dozens of little robots built the wall in the first place (by lashing together sticks and filling the space between with gravel, say) under software control, and can disassemble it and reassemble it in a different form under software control in an hour or two. In this case the sticks and stones are the pixels.


Moving into deeper fiction, we'll return to 'no software' again: inherent inefficiencies of digital hardware will force us to build analog systems, sort of pure hardware, that will be in fact software for the machine that runs this world.


This feels like 'The Matrix'..


It would in fact be the opposite. The Matrix was about emulating an old boring world in software for people living in jars.

Here you will have the real thing. Real but no longer boring.


I absolutely agree.

I've actually had this thought lingering in the back of my head as I'm in the midst of making the transition from primarily MS desktop/embedded C++/C# guy to open source stack web app guy.

A few years back, in a review my supervisor put in a recommendation to go into management. I asked if I would be able to code anymore. Nope. I felt that was some subtle death trap - to lose my skills through non-use, and turned it down, 20% pay bump be damned.

So now this transition, and I began to think of it in the same light. But it doesn't work. It's been 13 years since I graduated, and I'm probably better at most CS'esque interview questions than at any time since then. Why? Because of what I did that led me to making this transition: essentially becoming unhappy in my job and looking for answers, which started with a book called "The Passionate Programmer", which led to maybe 20-30 other books, some completed classics (such as K&R C), some incomplete (SICP, only halfway through chapter 2), but in general exploring my craft in a way that my job did not require me to. I'm a much better programmer because of it, language be damned.

So here's the question I have:

Who is more likely to be aware of things such as CLRS, SICP, Hacker News, Stack Overflow? Who is more likely to understand (or even be aware of) languages like Clojure, Haskell, Prolog, Python, Ruby, etc.? Who is more likely to give you lucid answers to problems with big data, having actually touched a NoSQL database, and pontificate on the advantages/drawbacks of that vs. relational? Your typical enterprise dev, or your typical open source web dev? After talking about all this stuff to a team of 30+ engineers over the past few years, I'd have to put my $$ firmly on the latter.

Of course, this is only an example of one engineer's experience. But I think there's a more meta issue at play here... it's more of the 501s vs. the non-501s, the latter being ones who well, just are more aware of what's actually going on in their respective industry. If the industry as a whole makes a tectonic shift toward algorithmically more difficult problems (ie. machine learning), I trust the open-source web devs will be on it and synthesize it faster than your typical enterprise guy will even be aware there is a need.


I work at a reasonably large (20K+) well-respected consulting firm that has a significant number of software developers or ex-software-developers. Honestly, they do a lot of web development work and keep up with the latest things, but C# and SharePoint really are the best solutions for most of these projects.

Advertising-supported websites care a lot about scale on the cheap, because serving banner advertising to 10 thousand people a day does not pay the rent even if the servers were free. But when you're actually selling stuff or managing internal projects, it's all about manpower costs, and .NET projects take less time to build. When a small project costs $10,000 a day in manpower and needs 2-20k worth of servers and say 100k worth of software, the latter is still almost meaningless next to manpower costs.


I call BS.

I doubt most of the web dev people are spending anywhere near 5 figures a day in manpower, and often their software costs are 0. Outside of corporations, nobody specs out 6 figure 3rd party software budgets.

If you're a MS shop with more money than brains (ie, more money to spend on hardware/software vs the skill of people you have on staff), sure, this might be true.

It definitely wouldn't be true if you were a Unix shop, where the tradition has come around to be "free" software for everything.


Current project: 20-25k/day development budget (including 2 sysadmins and 4 testers), total users under 100. And oddly enough the customer is happy and wants to expand the project. I was once on a project that spent 3 million/year, and we had fewer than 40 actual users of a standalone desktop application, yet after 3 years we ended up saving the client money.


"And oddly enough the customer is happy"

If it pencils out, why not? But this is a case of like serving like, and having been in this industry for a while now, the inefficiencies are phenomenal. Somewhere at the very end of the chain these costs end up with the end customer (in my case, the health care industry; I don't want to go into too much detail about my company, but we're of a similar size and the market leader).

The enterprise is like government. It's a slow-moving beast. But things will not continue like this forever; it runs on the same principle that initially made Microsoft successful and now is leading to its downfall. True story: last week I was contacted regarding a position at a health care startup in Palo Alto. We're primarily a hardware company and they're all software, but the funny thing is we have some skunkworks-type projects that have some overlap with what they're doing. Having looked at what they already have on the market, if we actually wanted to go after them I don't think we would stand a chance. Sure, we might be able to leverage network effects to some degree; but time-to-market, overhead, etc., are on different scales.


There's obviously an economic disconnect between what you do and what a bunch of 21-year-old webdevs surviving on ramen do.

The problem will be when one or the other ends up making a breakthrough that allows one group to eat the other's lunch.

Obviously I and most of HN are betting on the webdevs.


Startups rarely compete in the same markets. Take an airline doing 10 billion in revenue that ends up with 250 million a year in profit. Suppose someone discovers an inefficiency, say in how they adjust routing based on maintenance scheduling, that could save them 1/10th of 1 percent of their operating expenses. Well, $10 billion * 1/1000 = $10 million a year in savings, based on some investment. Sure, you could try to outsource the whole thing to India, but when every month the system is not up and running they are out close to a million dollars, speed becomes important.

Of course that's the upside; there is a world of risks, and many companies burn millions chasing after pennies.
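The airline arithmetic above, spelled out (a back-of-the-envelope sketch; the assumption that operating expenses are roughly revenue-sized is mine, not stated in the comment):

```python
# Back-of-the-envelope from the airline example (all figures in dollars).
revenue = 10_000_000_000        # $10B annual revenue
profit = 250_000_000            # $250M annual profit
savings_rate = 1 / 1000        # 1/10th of 1 percent

# Assumption: operating expenses are roughly revenue-sized,
# so the estimate uses revenue as the base.
savings = revenue * savings_rate
assert savings == 10_000_000    # $10M/year in savings

# Relative to profit, that's a 4% boost from fixing one inefficiency.
assert savings / profit == 0.04
```

Which is why a multi-million-dollar consulting engagement can still pencil out for fewer than 100 users.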


"Startups rarely compete in the same markets."

How about SpaceX? They've already accomplished what no government in the world has been able to do. How about startups emerging in the energy, scientific, and education spaces? It's not all photo sharing, geo-tagging, social engagement rah rah. The discontent is obvious on HN over real problems not being addressed and companies are now beginning to emerge. Not only is it socially pragmatic, but as you so clearly detailed, it's mind-numbingly lucrative.

If you perform a technical service encumbered with vast inefficiencies, your lunch will be eaten in time. It's not a matter of if but when. Just because there are all sorts of barriers to entry right now because of government contracts, regulations, etc, that allow you to offer a solution right now that is cost effective because it's less messed up than the system you address, does not guarantee those walls will not be chipped away, bit by bit, by more innovative, smarter, cost effective ways of doing things.

Enjoy it while it lasts.


SpaceX is creating a new system from scratch. But at the end of the day they are not a software company, so in 10 or 20 years you can expect them to be hiring the same consulting companies for internal projects that Lockheed Martin uses.

In the consulting world product is irrelevant you serve large organizations and the problems they develop over time. While they are young Google, Facebook, and SpaceX don't have a lot of the internal cruft that consultants exist to deal with, but I can guarantee they are creating it right now.

Amazon EC2 and Gmail are probably the best examples of direct competition with traditional consultants by start-ups. But again, it's all about the organization and not what the organization produces, and none of them started by focusing on that stuff. All large companies need email and don't necessarily know how to make it safe, secure, and scalable.

PS: By software company, I am talking about mindset. GM creates a lot of software, but it's not what upper management focuses on day to day.


I'm curious as to who those users are?

Must be either billionaire investment bankers or high ranking government/military people?


Think multi-national industrial conglomerates whose sales-forces each only numbers in the dozens, but who do m(b)illions of dollars in sales each year. Reducing the time it takes to estimate a new job and submit a proposal from weeks to hours is a very lucrative niche.


An "open source web dev who reads Hacker News" will do better machine learning than a Stats MS/PhD at a bank? OK, enjoy your ninja rockstars...


1) I said: "I trust the open-source web devs will be on it and synthesize it faster than your typical enterprise guy will even be aware there is a need."

2) The article: "It says here you went to the University of Pennsylvania for undergrad. Were you an engineer there?”

We're not talking about quants here. If (quoting myself again): "the industry as a whole makes a tectonic shift toward algorithmically more difficult problems" we'll probably need to enlist the help of more than the top .1% of all software/cs folk.

If you are a quant, I humbly apologize for implying mere mortals could even begin to grasp the fundamentals of a concept as abstract as machine learning. Norvig/Thrun/Ng might have something to say about that though.


Well, I'm a math-y overtrained guy who just loves tech as a passionate hobby, and I found HN. I love the community here, and the new things I learn. I hope to never have to work where only people like me are employed. shudder.


"Every frontier seems like it will be a frontier forever, until suddenly it isn't. Perhaps software is going to settle down, start a family, and quit this cowboy nonsense."

Can I quote you on this? Did you come up with this metaphor yourself, or is there someone else I should cite?


Sure! (Joke's on me, you already quoted it in the comment.)

I can't lay any claim to the coder-as-cowboy metaphor in general - it's almost as old as actual cowboys - but I came up with it in the sense that I didn't rip off any one particular person's idea.


Quote me: software IS the infinite frontier.


My read was that the boom times won't last forever. A comment below says this seems more like 2012 vs. 2007. Well, you could just as easily say 1998 vs. 2007, which was the last time that anyone with any degree of technical skill could have a cushy job for the asking. And there were booms and busts before that... and in slack times, it can be hard for anyone to get any kind of tech job, regardless of qualifications.


"The computer world is like an intellectual Wild West, in which you can shoot anyone you wish with your ideas, if you're willing to risk the consequences."

http://www.amazon.com/Hackers-Painters-Big-Ideas-Computer/dp...


Wrong, wrong wrong wrong.

It's so easy to forget how much we had to learn to build websites. It's incredibly easy to forget how much time getting the event loop or even MVC to click took. It's easy to fail to remember how hard it was to learn the 5 different languages required to build the app we made in a couple of weeks over the summer. But those are skills, as challenging to learn as algorithms and big data.

As someone who has had to learn data warehousing very quickly, before being shown the joy of such things as MapReduce, before being slung into serious number-crunching, performance-eking territory, I can say with absolute certainty that, as "web scripters" or entrepreneurs, we have a huge advantage - we're the people who taught ourselves how to make things instead of regurgitating what a CS program teaches us.

I started a CS program at a decent university. While I think it's true that Ivy League and extremely competitive programs might force one to think about this stuff the right way, state universities absolutely do not. Most of the kids coming out of there will not be as qualified as someone who has taught themselves how to build a business.

Most importantly, if you're coming out of college, there is approximately a 0% chance that the folks hiring you will expect you to be useful for the first several weeks while you get up to speed, which is plenty of time to become competent enough to be dangerous.


I'm by no means suggesting that a CS degree is the only way to get good at this stuff. In fact, I think that most of the best programmers I know didn't graduate. The difference is concentrating on being an entrepreneur vs concentrating on being a coder.

This part is reposted from another comment because I think it's important to your point:

As far as the meaning of the article, I wasn't trying to prove a point as much as elucidate something I was thinking about yesterday: the skills you pick up through entrepreneurship.

For the past 4-5 years of my life (starting in high school and continuing through college) I've been concentrating a lot of my time on learning how to be a better entrepreneur. That's to the detriment of almost everything else in my life. And it's worth it to me because I love it. Even better, it seems like a pretty safe bet, because if worst comes to worst and I completely fail over the next few years, I'll still be able to get a job as a coder somewhere.

But the thing about being an entrepreneur is that it encourages you to get marginally good at a wide range of skills instead of REALLY good at one area. And so something I was thinking about is the potential consequences of this decision on my life. This is what I came up with.


Well, my opinion is that the Internet will just get bigger than ever. Opportunities opened up by huge platforms (like Kickstarter) will mean that you'll have the advantage of making money faster, and more of it, than anyone else.

You'll be able to leverage your entrepreneurial knowledge with such platforms. Think of Facebook, Twitter, and the App Store, and how many times they've made it easier to make money starting extremely small and at almost no cost.


Yes. It's amazing how I can quickly pick up my new job's random DSL, special secret-sauce algorithm, and in-house toolchain, and fix bugs in them immediately, whereas 10 years ago I was mind-blown for weeks trying to understand how the Y combinator and call/cc were even theoretically possible. Same with pointers before that, and the whole concept of a computer program drawing a picture on the screen before that.
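For anyone who hasn't hit that mind-blown-for-weeks stage yet, the Y combinator the commenter mentions fits in a few lines. This is a sketch of the strict-language (applicative-order, often called Z) variant in Python:

```python
# The applicative-order Y combinator: builds recursion out of nothing but
# anonymous functions -- no def, no named self-reference anywhere.
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# A factorial "step" that receives its own recursive self as `rec`:
fact = Y(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

assert fact(0) == 1
assert fact(5) == 120
```

The eta-expansion `lambda v: x(x)(v)` is what keeps a strict language like Python from looping forever during evaluation.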


It's really the same sort of difference of perspective as between industrial engineering and residential construction. If you've ever seen someone build an ordinary home you know how much it looks like silly bullshit sometimes. Everything is wood nailed into other wood, there's a lot more seat-of-the-pants planning than you'd think should be necessary, and there's a lot more shimming and toe-nailing and adapting and whatnot than a lot of people would be comfortable with. And then compare that to a world where everything is planned out meticulously in AutoCAD and everything is built out of reinforced concrete and bolted-together pre-manufactured giant hunks of steel.

They are very different worlds, much as "hard" systems programming and "soft" high level "web dev" are. And it's tempting to look from one to the other and imagine that it's only a passing fad, but it's not.

As far as CS degrees, my experience is the same. I've interviewed a fair number of dev. candidates in my career (in addition to working alongside coworkers of various educational levels), and a CS degree on a resume has never had any correlation in my experience with a higher quality developer skillset. And this is down to a basic almost fizzbuzzian level too. From a few colleges a CS degree may mean something, from most it seems to be pretty worthless.


I've had this kind of argument presented to me every few years since the 1990s. It was probably happening for the forty years before that. Soon we'll need "real engineers" - and these fly by night part timers who don't have a "proper" background in computing are doomed! DOOOMED I SAY!!!!

Tosh.

I'm a guy who has got a subject specific degree - more than twenty years ago now (1st in Computing and Artificial Intelligence for those who care). I was selling software before that, and have spent most of the time since in industry.

What have I noticed since then? Amount I've actually used the "hard" CS stuff I learned there - close to zero. Correlation between "being good at math" and being a successful developer - basically zero. Correlation between having a degree and being a successful developer, after the first few years in industry, basically zero.

I don't see that magically being different in the next four years.

(Curiously the "being good at math" thing seems to be something US-centric. I've not noticed the same focus on that with folk in the UK or elsewhere in Europe.)

The space that developers get to play in has got larger and larger over the last 30 years. I don't see that changing. Quite the opposite in fact.

Sure, some of that is going to be in areas that really need some hard-core math or engineering skills. Those jobs are out there now (embedded development is exploding again, big data has been around for years, the clever end of game development). I'm sure there'll be more in the future.

But there are also many, many jobs out there that don't. Many, many jobs that involve developers being good generalists, or having cross-over with UX and design, or having a decent understanding of economics, or understanding big money. I'm sure there'll be more of those in the future too.

One thing we're really excellent at is wrapping up complicated stuff in abstractions that are stupidly easy to use. We're excellent at de-skilling our own job. And every generation whines that the next one can't build their own computer / write microcode / write assembler / manage with less than 1k RAM / cope without a visual editor / manage their own memory / build their own OS / write their own application stack / whatever.

Yet people somehow carry on building new and neat things.

If you're a hard-core CS/algorithms person - go for it. There'll be lots of work for you. If you're not? Go find another niche. There are many, many out there. Be a good developer. Have fun. Make neat things.

And thus ends this particular Grumpy Old Man's Saturday Rant :-)


To follow up with a Grumpy Old Man Postscript:

Reading Communications of the ACM (most recently, [1]), there's been an ongoing problem getting enough people into computer science programs (in large part because of the broad fears of "outsourcing"), and the projections are that EVEN MORE graduates will be needed through 2020. [2]

So yes, many times yes: This article is completely and totally wrong. And as adrianhoward says, it's the same sentiment that my parents chided me with back in the 90s, and that I hear all the time from people who imagine that the silver bullet is just around the corner that will make it so that we need fewer rather than more programmers. (FWIW: My degree is in "Cognitive Science", the "other CS", and I haven't had problems getting jobs when I've wanted one.)

And as an aside, if you look at the graph with yellow and brown bars about 2/3 of the way down [2], getting a job where math is the primary required skill looks like a slog (looks like about 2x as many math grads as math jobs), while in order to hire people AT ALL, companies are going to have to ignore whether they have a computer science degree, because there are nearly 3.5x as many jobs in CS-related fields as there are graduates in CS.

Based on these numbers it looks almost inevitable that we'll see echoes of the "You've put together a web page? By yourself! You're hired!" dot com boom, just because it will be so hard to hire anyone at all.

So, while the story was cute, its premise is FUD that has a chance of scaring people away from CS at precisely the time that we need more people in CS. OK, end rant.

[1] http://cacm.acm.org/blogs/blog-cacm/148620-hot-job-market-fo...

[2] http://cs.calvin.edu/p/ComputingCareersMarket

[edit: typo]


there's been an ongoing problem getting enough people into computer science programs (in large part because of the broad fears of "outsourcing"),

Then what fields are these people going into? Law? I hope not: http://www.slate.com/articles/business/moneybox/2010/10/a_ca... . Medicine has its own problems too. Are they majoring in the humanities? Getting bogus business degrees?: http://www.amazon.com/The-Marketplace-Ideas-Resistance-Unive... ?

It might just be that undergrads have serious information asymmetry problems, of the sort I tried, probably futilely, to correct here: http://jseliger.com/2012/04/17/how-to-think-about-science-an... , but I wonder how these kinds of asymmetries can really persist.


Undergrads do have serious information asymmetry problems. My best guess as to the reason is that they don't have a lot of interaction with a wide range of people in the working world. They don't really have any idea what it'll be like to be adults. Incidentally, I think this is a big part of the reason undergrads at good schools gravitate to such a small number of employers (investment banks, consulting firms, and famous programs like Teach for America). Kids simply don't know what's out there, and so they reach for known quantities. And one consequence of that ignorance is that they don't have a very clear idea of which skills are most marketable.


Are there really asymmetries? CS degrees have a small value-add. Many people enjoy software development without having any desire to be able to implement a red-black tree on the fly. For instance, my boss majored in film studies; only one person on the team I work in has actual formal training in CS beyond introductory classes, and they're probably the least productive developer. Given that, why should people major in CS when they can instead spend 4 years debating social theory or going on class trips to Death Valley?

And really, is what we're hurting for now more CS people? I don't think so: if anything, the issue is that there's too much of a CS emphasis. That's why we have people spending time building Lisp VMs in JS and all that jazz here, while tons of other fields (even technical ones) have absolutely massive inefficiencies because their problems aren't theoretical enough for CS folks.


Part of the problem is that we don't know how to teach computer science to students that haven't already been programming for years before college. The major has the highest first-year dropout rate.

http://www.siliconrepublic.com/innovation/item/18532-compute...

http://dl.acm.org/citation.cfm?id=1151604

That suggests that one reason there are so few CS graduates despite demand is that those who enter CS programs because of said employment demand, and not because they are already interested in programming, don't make it out with a CS degree.


With respect to CS in particular, I think it's different from all other school subjects in that it is and has always remained a predominantly self-taught subject. You don't really get "taught" to program.

(I think this may be part of the reason that while there are increasing proportions of women in all the other sciences, the proportion of women in CS is dropping. Girls are statistically better in all school subjects than boys, and tend to be more diligent at homework and so on. Math and science fall neatly into that paradigm; programming doesn't. You get no gold stars from grownups by programming. So a lot of people who have the problem-solving aptitude to be good programmers never actually get started -- because nobody TOLD them to.)


" Correlation between having a degree and being a successful developer, after the first few years in industry, basically zero."

In my experience it's even negative. Almost every person without a degree that I worked with was much better at their job than the rest. They were more driven, and that's probably how they got the job too. On the other hand, the code written by PhDs... often I regret looking. It's not that it's incorrect (usually); it's just barely usable and disconnected from the reality of how and where it needs to run.


I hate to say this, and I'm sure it doesn't apply to anyone reading HN, but in my experience the ability to sit down at a keyboard and end up with a useful nontrivial program a few hours later is inversely proportional to the number of years of graduate school.


> Correlation between "being good at math" and being a successful developer - basically zero.

I suspect you have a very limited (that is, "it's all arithmetic") notion of what math is. You cannot develop software without some kind of mathematical thinking.


I suspect you have a very limited (that is, "it's all arithmetic") notion of what math is. You cannot develop software without some kind of mathematical thinking.

Then you would suspect wrong :-) I have a friend who was briefly a career mathematician before ambling off into finance. Personally I stopped math after A level (roughly equivalent to first-year university math in the US, as I understand it: calculus, etc.).

I know math isn't just doing sums.

From my mate I know that many mathematicians are bloody awful at coding - which is a PITA now that computers are much more a part of how mathematics is done in many fields. From personal experience I know that a large chunk of the very best coders I've ever worked with don't have a math background beyond basic numeracy.

What I think people experience is actually the opposite connection. Some of the problem solving skills that developers use are the same sort of skills that folk get from being good at mathematics.

You can get those skills in other places without learning higher level math. And being a successful developer needs much more than the skills from the mathematics toolset.

Nothing against math. Math is great. I'm moderately good at it myself. If you're a good developer, having it will help more than it will hinder. But having it won't make you a good developer, and not having it won't stop you being a good developer.


> Then you would suspect wrong

Then you're wrong about math and downvote to disagree.


I didn't downvote you - I don't have the rep to downvote anybody :-)

I'm not making this up. I know many good developers that do not have a math background. I know people who have a heavy math background who are lousy at development (and it looks like I'm not the only one - see http://news.ycombinator.com/item?id=3872269)

A bit more than "you're wrong" would help with reasoned debate of course...


[dead]


[dead]


I think you meant to post this on 4chan, not HN.


So I've done a bit of hiring, and my message to you is as follows: If anyone actually does this to you in an interview, be glad they acted like that because you shouldn't work for them anyway.

For companies who have their shit together, this scenario is unlikely for a few reasons:

1. Experience, not classes or school, is paramount. We're hiring. We need you to do X. Have you done X or things close to X before? If so, you're better equipped to do X than anyone who has only taken a class on doing X.

2. School does not indicate coding skill. I've met many people who (supposedly) went to every class who couldn't code their way out of a paper bag.

3. Academic coding != production coding. The two are light years apart, and the latter is worth way more than the former.

4. Classes don't give a good signal on the ability to execute. Execution means doing what is necessary so you can ship. It means knowing there's a first 90% and then a second 90% that looks like 10%. Finished, launched projects show execution. School does not.

5. Algorithms are fantastic and useful, but not in the ways they taught you in class. If you can use Google or use your copy of the CLRS to find what you're looking for, then engineer it into your solution, that's almost always more than enough.

6. If entrepreneurship is ever 'just applauded' in your interview ... run. Don't work there. Entrepreneurship indicates that you know this is a business, and that engineering doesn't exist in a vacuum. It means you can balance sales concerns against user concerns against design, UX, product, scale, and not just do things and throw them over a wall. It means you can be trusted to make decisions that add value and not just code.


This problem will be faced by many developers soon. The Internet is huge. The big companies are going to be dealing with huge data. You'll need to understand algorithms and math, and frankly, this stuff is a bit difficult to learn on your own. I thought I knew it all until I took the algorithms class - that's when I realized that not only did I not know it all, I was not as smart as I thought I was, and I would never have had the motivation to go through with it if I had not been forced to. And that goes for many developers.

Programming is a scarce profession now, but the simple stuff will soon be done by too many people. Software will become a real engineering task. In 20 years, the age of the code monkey will be gone.


Oh no, it will be cyclical. Programming is all about attaining higher abstractions, hiding more technology under a simple interface. Every now and then some new set of abstractions will be useful enough that we'll need a bunch of people to explore a field of opportunity. These explorers are called entrepreneurs.

Maybe in 2016, you're going to need deep credentials to be a useful web dev, but none whatsoever to start something useful with 3-D printing.

EDIT: that said, nothing makes you more employable than knowing things at a deep level. A friend of mine, a former Plan 9 kernel contributor, quit the tech industry after the first bubble to become a wildlands firefighter. Returned to the tech industry in 2008 and resumed being a highly-paid infrastructure geek like nothing had happened.


I really agree with this.


I think it will be the exact opposite. As computers take over more and more jobs from us, we will need fewer "office workers" who know how to shuffle documents around, and more workers with programming skills.

You want to be a mathematician? You need to know how to program. You want to be a "secretary"? You need to be able to dig through your boss's e-mail using regexes when he needs to find something. You're a dentist? You'll install your own scripts on the website, because you learned how in high school.

In 2016 (or 2012) it won't be "oh, we need more skilled programmers"; it will be "sure, you know programming, everyone does, but what do you really know?". Programming will have the same place on a CV as "MS Office" or "keyboard typing" has right now. No big deal if you know it, but much harder to find a job if you don't.

Of course there will still be a place for real computer experts - algorithm designers et al - but the basics will be known to more and more people.
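The "secretary digging through the boss's e-mail with regexes" scenario is the kind of small automation being described; here's a hedged Python sketch (the mailbox text, addresses, and pattern are all made up for illustration):

```python
import re

# A toy mailbox (invented sample text) that someone might
# otherwise have to search through by hand.
inbox = """\
From: alice@example.com   Subject: Q3 invoice #4417
From: bob@example.com     Subject: lunch on Friday?
From: carol@example.com   Subject: invoice #4423 is overdue
"""

# One regex pulls out every invoice number mentioned anywhere.
invoice_numbers = re.findall(r"invoice #(\d+)", inbox)
print(invoice_numbers)  # ['4417', '4423']
```

Nothing here requires a CS degree - which is exactly the point: this level of programming could plausibly become as unremarkable as knowing MS Office.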


This isn't going to manifest itself as more people knowing how to program, though. Easier interfaces and smarter searching, but not programming as we know it.

In 1980 you'd say that in 20 years the average office worker would be performing calculations on thousands of rows of data, generating charts, typesetting documents, creating full color presentations, doing business with clients in multiple continents and they'd wonder how people would cope with the increase in cognition required to do all that. But it's just button-pressing for most people.

In 2016 they'll say "we have a database with 4 billion data points and we need to infer customer behavior patterns from it". You'll say "Sure.", sit at a desk, click "Segment", click "Demographic: 18-21", click "Intersect", click "Products", click "Make Recommendations", click "Apply", and a discount coupon for "Justin Bieber's Comeback Tour" will be beamed directly into the eye sockets of anyone who bought canned salmon last fall.

I don't see a society where 80% programs, I see a society where 10% builds things for the other 90% and a huge part of the middle class will be automated out of existence. This, to me, is the big issue that will shape this generation and the next.


Excellent thoughts. I think that more people need to know how to program their computers. But, as you have so elegantly pointed out, the inexorable march of progress will not bring this to pass. It hurts a little to think about, but in a large way you seem to be on the money.

These ideas are worthy of more than a two paragraph comment on HN. I third the notion that you should pen a full blog post.


I second the need for a blog post on this!


Have you written a full blog post or article on the subject by any chance? I'm really interested.



Thanks.


Give this test to the next 5 random non-technical friends and family you talk to:

A = 1

B = 2

C = 3

A = B

What does A equal?

I'm not saying people can't be taught. But think about how big the workforce actually is, think about how widespread MS Office skills are. For every power-user analyst and project manager that's really taking Excel out for a workout, there are 10, 20, 50 people who use Office in every day non-challenging tasks.

I've given that little test to my MBA wife and a GP family member and several other people. Hardly anybody gets it right.

Edit:

The x-factor here, btw, that determines whether or not somebody understands it, is whether they see that assignment is happening, rather than going "wha? 1 equals 2? what is that?" And those who didn't just get it stayed just as puzzled even after I explained assignment. The very concept of variable symbols confused and (I presume) disinterested them.

What we do here every day is difficult, challenging stuff that I don't think most of the workforce will ever understand. Instead, people like us will be busy for decades to come, building tools so they don't have to.

There was a time when machines were new concepts rather than simple tools. You could have said, in the early stages of the industrial revolution, that soon everybody would understand and be able to fix their machines. But machine complexity has outpaced the desire and ability to learn those skills.

Software is no different, I don't think.
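For what it's worth, the little test above translates directly into executable code. A minimal Python sketch (variable names lower-cased from the test):

```python
# The sequence from the test, written as executable statements.
a = 1
b = 2
c = 3
a = b   # assignment: "a" now holds a copy of the value currently in "b"

print(a)  # prints 2 - the crux is reading "=" as assignment, not equality

b = 99    # changing "b" afterwards...
print(a)  # ...still prints 2: only the value was copied, the names were never linked
```

The people who "get it" are the ones who read the fourth line as an action performed in sequence; the ones who don't read it as a contradictory statement of fact.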

Edit Two:

http://www.codinghorror.com/blog/2006/07/separating-programm...


I don't think it's unreasonable for people to assume that = means equality, not assignment.

Edit:

In particular, the link posted in the second edit has a rather poor test using a and b, because it uses = in two different ways with no indication that the meaning of the symbol has changed. Maybe the problem with the test isn't just the people, it's the sloppy notation that assumes people with no programming background are able to infer when we mean equality and when we mean assignment.


The population that took that test were self-selected computer science undergrads!

And even after three weeks of instruction most of the people who didn't understand it immediately never understood it. I'll quote from the article:

"Either you had a consistent model in your mind immediately upon first exposure to assignment, the first hurdle in programming-- or else you never developed one!"

My wife is a brilliant woman, fantastic at what she does. The GP I mentioned in my post is a very good doctor who had no problems getting into a medical school, passing his boards, or running a successful practice. But that doesn't mean that everybody is meant to be able to understand the abstract concepts you have to master in our line of work.


I wonder how the experiment would change if you change it to "let A = 1, let B = 2, let C = 3, let A = B", or "make", or some other verb that seems more like assignment.


Or if this were explained to be a sequential process and not a just a descriptive list of unrelated declarations. It seems reasonable to think that that list contains a contradiction if you've never been exposed to these concepts before.


So education and the availability of technology will eventually produce many people with basic programming skills: not only computer literacy in the "I can use MS Office" sense, but in the "sure, I can hammer out a Python script to automate this task or sift through that data" sense.

Is there some sort of disruption of (basic) programming skills coming, the way blogging has disrupted journalism?


I'm of two minds on this one. Part of me agrees completely with what you say. Further tools will exist that automate a lot of what the code-monkey does. The level of work done by a lot of us will be push-button, or "plug these couple of things together".

On the other hand, in 1998, as a nerdy guy getting out of high school with minimal HTML, JavaScript, and programming experience, running Linux on a Pentium Pro, I faced a big choice. I went to the local ISP to pick up a "real modem" (vs. a winmodem) to connect my awesome Unix box to the internet. The guy there asked why I wanted these modems instead of going to Circuit City for some amazing sale they were having on a faster winmodem. When I told him I was running Linux, I was offered a $40K/yr job on the spot - just for getting Linux installed on a computer and understanding the basics (quote: "we can teach you anything else you need to know, you got the spark"). I was 18, and that was a HUGE deal. Anyway, this wasn't uncommon; at the time Wired was running stories about "HTML factories" where people were making pages and pages by hand. Minimal programming skills got you a job.

Some of this was just normal boom-time labor shortage. There were lots of stories about how, after the crash, these guys would never work again. Some of that was true, but some of it was bunk. The 2000 version of this story would be: "Sure you can do HTML, and you can do CGI, and you understand HTTP headers and can whip up a server, but we need people who understand SQL, how to work with record objects, and how to do live updates to a system - stuff you need a real degree for. It's 2005 not 2000."

So basically I am suggesting that while Rails may become a non-skill (like HTML has), good REST APIs may be auto-generated, and some future version or successor of Backbone.js will do most of our tricky JS stuff, there will likely also be good toolkits that let people plug together data mining and data management without needing super deep algorithmic understanding. We are already seeing such tools emerge.

So the other part of me disagrees, the code-monkey will be needed, just that they will be putting together different bits than they are today.


> So the other part of me disagrees, the code-monkey will be needed, just that they will be putting together different bits than they are today.

It seems to me the hiring manager in the story was fishing for a jack-of-all-trades applicant. The interviewee was obviously a web developer, but the position to be filled was a data-analytics job.

What difference does it make if Rails/PHP/Node.js/Backbone/etc become commodity jobs? In order to get the raw data that requires complex and high-speed algorithms to parse into usable data, you'll still need a website, built by a Rails/Backbone code monkey and a Photoshop designer, with the help of a decent DBA at the very least. The data position, if it ever comes to that, will just be another job type that a well-rounded development team will need to fill, not a replacement for everyone else on the team.


"When I told him I was running linux I was offered a $40K/yr job on the spot - just for getting linux installed on a computer and understanding the basics (quote "we can teach you anything else you need to know, you got the spark")."

Yeah, my technical screen for an 8-year career in Schlumberger was exactly this.


The Internet may be a huge place with a huge number of developers.

It is also an even huger place when it comes to demand for people who can do simple stuff.


Math is overrated. Nobody really needs math; if you find yourself needing some math you're probably reinventing some kind of wheel. Same with algorithms.

How do you deal with huge data: you just do. There are tools for that, and you apply those and perhaps make new tools yourself, but math and algorithms tend to never enter the equation.


I am reminded of Jeff Jonas' "Data Beats Math": http://jeffjonas.typepad.com/jeff_jonas/2011/04/data-beats-m...


Who do you think makes those tools?


Not computer scientists. Programmers themselves, based more on their experience than theory.


"The problems we’re working on involve in-depth data analysis that require an extensive math and algorithms background."

I can see why a college sophomore would fear this response. But in my 18 years of programming, I've seen that the vast majority of software development isn't about the stuff they teach you in school. It's about design, collaboration, languages, libraries, and frameworks. It's about working around crazy cross-version incompatibilities, solving heisenbugs, and keeping everything maintainable. Math and algorithms? Feh. Not the real issue.

Let's assume the startup bubble bursts and programming jobs become scarce. There won't be any kindly interviewer at the large bureaucratic companies. There will just be a faceless HR person with a keyword-searching database saying, "No CS degree--no interview."

But personally, even if the startup bubble bursts, I don't see the demand for programmers going anywhere but up. And that entrepreneurial background will only be an asset at the smaller, more interesting companies.


Besides, starting your startup on the downside of the bubble is exactly where you want to be. You don't get rich starting up at the peak. If you think there is a bubble about to burst, save up everything you can earn right now, and then start your business after the pop.


Couldn't disagree more.

There will always be more people, and more need for people, writing high level application code, glue code, and "spit-and-polish" code than people writing deep, difficult systems code. Always.

Here's the thing, in computing the advances in tooling and performance continue to pile up at an amazing rate. In 2016 it will be even easier to roll out a product built by a couple "web guys" with little in-depth technical knowledge that does an amazing amount of business and has a profound impact on the tech world. Indeed, in time it will be possible to run ventures which support billions of active users per day on incredibly cheap hardware and with a rather modest amount of dev-hours behind them.

Imagining that the future is only for hard-headed systems programming is the time honored ego-stroke of the hard-headed systems programmer who sees all of these "dilettante" "web guys" doing amazing work in the real world and making an impact and money doing so. But that's just a fantasy. The truth is that software is art. It's often a thousand times more valuable to write software which communicates with users and evokes in them strong feelings and strong connections than to write software which is technically pure and strong, but sterile, impersonal, and useless.



I found http://khanacademy.com extremely helpful when trying to catch up on the math involved in the ml-class / ai-class material.


Knowing data structures separates the men from the boys. If you do not understand the difference between a tree (C++ std::set, for example) and a hash table (std::unordered_set), you have placed yourself at a disadvantage. You can learn these things; they are not rocket science, and you don't need a CS or math degree to do so. But it's important that you do learn about them, and when to use which data structure, especially if you have to scale to more than you ever thought possible. Most programming languages have containers (lists, sets, dictionaries, maps, etc.) that are backed by various data structures, so you can experiment and learn.
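The tree-vs-hash-table trade-off carries over to any language. A Python sketch (standard library only; the sample values are invented): a plain `set` plays the hash-table role, while a sorted list searched with `bisect` stands in for the ordered tree:

```python
import bisect

data = [42, 7, 19, 3, 88]

# Hash-based set: O(1) average membership checks, but no ordering.
seen = set(data)
print(19 in seen)  # True

# Sorted structure (the role a tree plays): O(log n) lookups, and it
# keeps order, so queries like "smallest value >= 10" are cheap.
ordered = sorted(data)  # [3, 7, 19, 42, 88]
i = bisect.bisect_left(ordered, 10)
print(ordered[i])  # 19: first element >= 10
```

If all you ever do is membership tests, the hash table wins; the moment you need ordered traversal or range queries, the tree-like structure earns its extra log factor.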


Very true. My Objective-C skills got dramatically better after spending some time learning the STL in a C++ project. Reading about where you would want to use a deque vs. some other container was enlightening, and upon returning to Objective-C I started to pay attention to its various containers and design patterns in a whole new light.


Barrier to entry is going down, not up.

No one cares if you can do an algorithmic analysis of different ways of sorting to choose the most appropriate one. These days it's dynamically built into the function. Just call sort.

Educationally there's not much of a difference between a philosophy, math, or computer science degree. All of them are doing the same thing: logic. Philosophers approach it classically, mathematicians do it formally, and computer scientists do it ad hoc or practically. Each has its virtues when you design or program.


That's a pretty bad example, since sorting is exactly the kind of problem that really depends on the application. E.g., depth-sorting objects in a game runs quicker if you pick an algorithm based on the fact that consecutive frames typically don't change much.
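To make the depth-sorting point concrete, here's a hedged Python sketch: an ordinary insertion sort, which is O(n²) in general but runs in near-linear time on the almost-sorted lists that consecutive frames produce (the sample depth values are invented):

```python
def insertion_sort(items):
    """Sort in place; roughly O(n) when the input is already nearly sorted."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements right; on nearly sorted input this
        # inner loop usually exits immediately.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

# Depths from the "previous frame", with one object having moved slightly:
depths = [1.0, 2.1, 2.0, 3.5, 4.2]
print(insertion_sort(depths))  # [1.0, 2.0, 2.1, 3.5, 4.2]
```

A generic library sort can't assume this frame-to-frame coherence; knowing when you're allowed to exploit it is exactly the kind of analysis "just call sort" papers over.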


Except when you need to sort a big collection of strings and the standard library sort function never returns.

What we need is good vocational education: polytechnic schools that teach you how to code, be part of a project or lead one, design an application, and so on. Then the CS taught in universities can focus on the basics of computing, analysis of algorithms, AI, programming languages, etc., without the now-mandatory Software Engineering 101.


Can you point me towards a sort function which will sort my data in the most appropriate way, every time?

Some sort functions might cover 99% of cases for web development, but not for other problems. The computer is not magical.


I am reminded of Steve Yegge's excellent post on "Math for Programmers" : http://steve-yegge.blogspot.in/2006/03/math-for-programmers.... (posted multiple times before on HN).

There are literally 100 cool things to learn and try. Just this weekend I thought about writing a small program for the DCPU-16, trying Meteor, making a small app using Firebase, etc. But learning more math possibly has a higher long-term ROI.

On another note: when everything melts down, it might be a good time to start another company rather than look for employment.


The critical skill for creating value is understanding the customer PLUS the 80/20 rule of software development. A creative technical guy can come up with an elegant MVP if and only if he is attuned to real users.

This will be even more true in 2016 than in 2012.

A nontechnical MBA is just blocked from this insight. And a great algorithm guy who is tone deaf to users is likewise blocked. Even together, they are handicapped compared to the guy who sees both sides.

The most powerful problem solving of all is a group of people who can see both sides. Pud's thread about 400K users and what to do next was stunningly wonderful to me. You don't see that on stack overflow and you don't see that in the Harvard Business Review. You see it here on Hacker News.


I see in this thread that a number of people are talking past each other, each with a different subtle sub-definition of the word "know".

I know of 7 basic subcategories of "know" - all but the last susceptible to phrasing:

(1) Knowledge that is immediately accessible at great depth and can be traversed quickly

(2) Knowledge that is not immediately accessible but resides in the unconscious. It surfaces in dreams, showers and intuition.

(3) Knowledge that is not immediately accessible but can be so quickly understood from a search (physical or digital) that it might as well have been remembered. Truly, the old-fashioned idea that all your knowledge must be kept inside your head is quaint.

(4) Knowledge that is not had but can be quickly acquired due to the similarity of the underlying structure to already possessed knowledge. With speed of acquisition proportional to similarity.

(5) Knowledge that is not had but can be acquired due to available learning strategies, knowledgebase and skill in acquiring knowledge.

(6) Knowledge that will be had in the distant future

(7) Knowledge that can not be gained due to difference in interests, lack of motivation or sufficient strength of reason, entrenched mode of thinking and set of beliefs which inhibit deftness with abstraction and for a very small few - reasons of biology.

And beyond all seven: knowledge that cannot be accessed by human brains at all.

For many cases, the level-4 definition of knowing is sufficient, and anything above it should be good enough for almost all problems.


You push your super-repos to the datahub and link to them on your resume, just as we do it today by putting repositories on GitHub.

The point is: a university degree no longer means the accredited person is capable. It has lost its value; that's why some recruiters are turning to GitHub and other solutions to find skilled people.

To assure the author: I don't hold a university degree, and I live in a third-world country. A week ago I was offered a software dev position for 30% of what a fresh engineer (5 years of study) would get. The recruiter insisted that it was a starting salary, but it was only a fraction of what I make online. He told me the position is available anytime I change my mind.

Only 4 years earlier, in this same place, you'd be laughed at if you didn't hold a university degree, whatever skills and capabilities you could show off to the recruiter. Don't dream of getting that job; and even if you did, you'd be paid only a third or less of what your colleagues make.


I fit that storyline background perfectly, with one major exception: I got a job in finance risk management from campus, because my major was in Statistics. (Though after 8 years of experience, I learned that statistics is not really used in such jobs - anything more than regression is not well understood (even regression, in some cases), and 'intuitive' non-statistical solutions are always sure to be better received and sold despite being far subpar. But I digress.)

I recently got a call from Google for my rank in Code Jam. I explained my position and expressed my desire to work on programming. The HR rep, a very nice person, made it all but definite that, because of my background and experience, I should look for a risk analysis role and not coding. He is still willing to set up a programming interview if I insist... but I don't know what to do :(


This is a nice little story, but I don't see what the problem is. The hiring manager states:

"The problems we’re working on involve in-depth data analysis that require an extensive math and algorithms background"

So if you know a bunch of programming languages and have built some fairly successful websites, why would you even apply for that job? You're an entrepreneur and a "general programmer" at best, or maybe even a "web programmer" only just good enough to hack a CRUD site together.

If the job posting actually indicated the need for in-depth data analysis with extensive maths and algorithms then this applicant wasted everyone's time by even applying.

I suspect that there is some assumption that the hiring manager doesn't know what she's talking about regarding the in-depth data analysis but I don't see where that assumption would come from.


"Computer science is no more about computers than astronomy is about telescopes." - Edsger Dijkstra


Normally I get this kind of post, but this one not so much. Some knowledge isn't something you just "pick up". I'm a software engineer now, but my degree is in mechanical engineering. If I were going for an M.E. job that required advanced knowledge of thermodynamics, heat transfer, calculus, statistics, etc., it would be ridiculous to point out that I had started and run a few car repair shops, and even ran a high-end racing team where we built our own custom off-road truck and won the Baja 1000. It's amazing experience, but totally not applicable, and it doesn't mean I can just pick up the required skills. Am I misreading this? The recruiter doesn't seem clueless.


I think you hit the nail on the head. The number of brogrammer entrepreneurs out there is huge. They are the equivalent of Big Jim's Baja XTreme Racing Shop, which slaps lifted suspension on other people's 2WD Tundras.

But then there are a few start-ups who are designing and building fuel efficient diesel engines, or fully electric cars built from the ground up. These guys will absolutely know thermodynamics etc.

So if someone built "mycollegematehavingsex.com" and got 40,000 sign-ups via launchrock, who fucking cares. If they made twitter-for-squirrels and got 10,000,000 active users before the Great Nut Famine of 2015 killed the company, then there may be conversations to be had.


If you don't know algorithms, I certainly wouldn't want to hire you today.


Ok, I'm going to throw out an open question to the HN community here. I'm a working programmer working mostly on web dev/internal apps for an established company. However I do most of the work on my own rather than in a team and have never worked on a team of more than 5 people (and even that was brief).

I am an OK Java & PHP programmer with some Linux Sysadmin experience (mainly running servers for apps I have built). I have basic CS credentials but am mostly self taught (starting with BASIC -> PHP -> Java) and have messed around with a bit of Game Dev & System programming in my spare time but nothing earth shattering.

I want to take the "next step", but my skills in math and Big Data analysis type stuff are fairly poor by HN standards, I am quickly approaching 30, and I have very limited money.

Should I:

A) Take a pure math degree (possibly also with stats) to improve my overall math skill in the hope that this will open more doors for me in stuff like NLP, Big Data, etc. (but also cost significant money+time).

B) Hack on existing open source projects to increase my knowledge of working on large & complex code bases (significant time cost but little money cost). Possibly complement this by working on the free online courses from Stanford etc.

C) Work on my own projects in the hope that I can build something cool that will get me recognition in some way.

D) Try to improve other skills outside programming/math and become more "well rounded".

Discuss.


very limited money

B) Hack on existing open source projects to increase my knowledge of working on large & complex code bases (significant time cost but little money cost).

C) Work on my own projects in the hope that I can build something cool that will get me recognition in some way.


You should take one of the free online courses from Udacity, Coursera, or even University of Reddit, on algorithms and machine learning (separate topics).

A pure math degree will not teach you the skills that you are specifically after: it is pure mathematics.


That is true, but if many jobs will be in big data analysis, I would think this would reward people with knowledge of statistics more than CS.


Paraphrasing Kenneth Williams from http://www.youtube.com/watch?v=CdDtwc9HA7s

It's frightening to think with modern medicine and all the technology available, they can't really help you. In the old days, we were better off because now they're all "specialists." Everyone's getting better and better at less and less. Eventually someone will be absolutely amazing at doing nothing.


My knowledge of how a red-black tree works, or how hidden Markov models work, or that quicksort is O(n^2) worst case and O(n log n) typically, has saved me literally hours of searching. Over the course of my 15-year career.

Good lord, DADS + ACM + CiteSeer and a healthy dose of curiosity and intelligence is enough to get most people through 99% of the "hard" stuff you're going to see even in the era of Big Data.
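For what it's worth, the quicksort claim above is easy to check empirically. Here is a toy sketch (all names are mine, for illustration) that counts comparisons in a naive first-element-pivot quicksort; already-sorted input triggers the quadratic worst case, while shuffled input is far cheaper:

```python
import random

def quicksort(xs, counter):
    # Naive quicksort with a first-element pivot; counter[0] tallies comparisons.
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    counter[0] += len(rest)  # every element of rest is compared to the pivot
    lo = [x for x in rest if x < pivot]
    hi = [x for x in rest if x >= pivot]
    return quicksort(lo, counter) + [pivot] + quicksort(hi, counter)

n = 500
c_sorted = [0]
quicksort(list(range(n)), c_sorted)
print(c_sorted[0])  # 124750 == n*(n-1)//2, the quadratic worst case

shuffled = list(range(n))
random.shuffle(shuffled)
c_random = [0]
quicksort(shuffled, c_random)
print(c_random[0] < c_sorted[0])  # typical input needs far fewer comparisons
```

(Production quicksorts avoid this worst case with randomized or median-of-three pivots; the naive pivot is chosen here only to make the O(n^2) behavior visible.)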


I agree you can self-teach a lot of it, and a lot of it isn't needed in a lot of jobs, either. I somewhat disagree on the last part. I think there's a significant amount of work in Big-Data-related areas where not having a solid knowledge of statistics greatly increases the odds of doing something completely wrongly, or interpreting the results wrongly. On the other hand, there's a significant amount of work where deep knowledge of statistics is optional at most. But companies like Palantir do seem to put a strong emphasis on the statistical background of people they hire.


Honestly, CS is not that hard; it's just that a PDF/textbook isn't as sexy as a 1080p Rails tutorial. If an experienced developer spends 2-4 months working through a quality textbook (Skiena, Cormen, etc.), then he/she is already significantly more qualified than most four-year CS degree holders.


I think people are missing the point of the article: what the author is getting at is that the market might soon be flooded with CS graduates with a nice set of accomplishments. Probably not by 2016, but at some point it's going to become more competitive, I am sure.


This is like saying every plumber needs to know how to mine copper and smelt it so they can manufacture their own pipes. There are different kinds of software engineers, and the kind that you happen to be may not necessarily be the kind that's in demand by every employer.


Disclaimer: I'm not a programmer

Have a look at puredata.info and Eastgate Systems' Tinderbox, especially the export template definitions and use of 'agents' for the latter

Why can't I have a visual flowcharty type thing that I model a business process in, click a button, and generate a Web app?


Having worked with Pure Data before, I can tell you that it is pretty painful to use for anything substantial. When you've got 3+ lines crossing other lines and coming into the same terminal, it's next to impossible to keep track of how stuff works. You spend a lot of time wiggling components to see how their attached wires move.


As a theoretical physics grad student, I have deep knowledge of math and algorithms and things of this nature (but mostly as related to physics), and I've worked in the world of corporate engineering and programming, and from what I've seen -- having a lot of deep knowledge is not really the norm and it's not really the expectation. Usually, if something difficult or complicated is required, you paper over it with Mathematica or Maple or you use one of the algorithm libraries released by e.g. LLNL or you use ANSYS, things like this that do most of the heavy lifting automatically. Even in experimental physics -- people who have doctorates and loads of experience and such -- you don't typically run across anyone writing that sort of thing from scratch because the majority of problems fall into some categories which can be attacked by a lot of the fancy tools out there these days, and so training to do this stuff doesn't require the intuition and abstract knowledge that might go into understanding the algorithms behind the curtain. To some degree, this drove me away from having an ordinary career that actually pays a living wage and towards academia. The other thing is that you can learn to use the libraries in two weeks or less and it's pretty easy to declare "I have some experience with *" if you've tried it out on your own and can do something with it.

And as much as I would like to see a world where people take the time to learn theory, people are lazy, and that sort of thing isn't always viable (See also Sikha Dalmia's Gandhi Rule: http://www.thedaily.com/page/2012/04/19/041912-opinions-colu... - if you don't learn it, don't expect anyone else to). The consequence is that the computer is often smarter than the people using it.

Somehow, I don't really see this resulting in job losses, though, unlike many people claim. The variability and the amount of problems to solve usually mean that there is someone doing some design work, even if it is not on a very serious technical level. It also opens up new possibilities for startups, and I'm sure Stephen Wolfram would talk your head off about it...

As regards "cowboy nonsense": you don't always have to attach negative behaviors to ordinary things. Getting exercise and having a social life can be good for everyone, not just douchebags.


I'm not sure if this article is saying people with more CS theory and math do better than people with more work experience, or if it's saying it's hard for everyone to get hired.


This article is saying that web development (using frameworks a la Rails) might be enough to get you a job today, but as soon as the web-trend passes other skills will become needed. Recruiters are becoming more and more stupid (not in the derogatory sense, they just do pattern matching between job offers and what's written on candidates' resumes).

This article is a reminder that it's never too soon to start adding different skills to your arsenal.


Dan's premonition is actually one step behind. Knowing the mainstream skills of 2016 will not be enough. But you will have enough room in your inventory, because the mainstream skills of 2012 will be irrelevant.

10 years ago if you learned Perl on the side it opened doors to great jobs. It's always the way that learning the up and coming not yet mainstream skill will give you an edge in employment. Data analysis is the skill for 2012. Something else will be in 2016.


Recruiter: I have access to great CS people I can place at your company.

Company: Sorry, some dude named Dan Shipper hacked the recruiter industry 4 years ago.

Recruiter: So, do you have any jobs for me myself?

Company: We need hackers, on a scale of 1 to 10, how good of a hacker are you?

Recruiter: Only a 4, but my toilet-cleaning skills are a solid 10.

I would hope that by 2016 someone finds a way to make recruiters obsolete. Sounds like a perfect process to hack, with way too much inefficiency.


It's like my anxiety closet in print. Uncanny. Except I've been a lot less successful on a number of metrics AND I don't have the hard skills.


As a Penn grad, I'm really glad to see a fellow Quaker hitting the top of HN. I graduated in '07, when most engineers and Whartonites went to work for large consulting/finance firms. In the past few years, I've noticed a bunch of startups come out of Penn and as someone who went to work for a startup out of school, I'm glad to see others choosing this route as well.


Always good to hear from a Quaker! I'd love to chat some time :) my email is dshipper@gmail.com


It depends on what you are looking for.

Are you looking to make a big difference and make lots of money? Be an entrepreneur. Who was that famous entrepreneur 80 years ago who said, when he was being questioned by an expert in court, that he didn't know the answers himself but could easily hire someone who did?

If you are looking to get a job, then get the credentials AND be able to help them with what THEY need. Just like you need a product-market fit, you need an employee-job fit. If you are applying to a high speed trading firm, you'll probably need to know C++ and Java and low level concurrency mechanisms (this is what I was interviewed on), and if you know math and stochastic calculus you can make more money as a quant. I didn't like those jobs, so I got out of it. But you can still make a lot. Chances are though, you don't want to make money that way.

If you are looking to understand things better, then LEARN. It doesn't have to be in school. Personally I learned a lot from Wikipedia. Before that, I was coding on my own.

At 17 I enjoyed math and computer science. Check this out: http://www.flipcode.com/archives/Theory_Practice-Issue_00_In...

It has a lot of math in it, but in an accessible way. Why? Because I liked it.

So to summarize: it depends on what you want. If you don't have what it takes to help THEM and back it up with information, then do something else. Finding and picking your opportunities is often the secret!


I don't get it. Do people without a (hard) engineering or CS education, and without any kind of equivalent experience, get those kinds of engineering/CS jobs even today?

It's like hiring an architect to do the structural engineering of a bridge. They both "make" buildings, but their heads are in different places.


Didn't we go through all of this once before? Back when people were saying "This is 2002, not 1999"?

I wasn't old enough to actually experience it, but I'm sure it was actually exactly like that. Job scarcity comes in cycles, but, assuming civilization doesn't collapse, tech jobs should be pretty safe ... forever.


I studied Marine Transportation for 4 years, worked as a seaman for 7 years, and still ended up as an animator. I feel the same way applying for a job against competition that has the papers/diploma.


This reminds me of the shock I got when interviewing at big tech firms for lowly-sounding "scripting" positions, where most of the questions required a CS background. If only I had gone to college...


Did the questions check for knowledge or a piece of paper?

If the former, go get some. No excuses! If the latter, find a way to print yourself one.


If I do maths and algorithms, what will all the smart people of this world do? I am stupid by intention and not by chance.


Definitely inspired me to put more effort into my CS degree than I do currently!


You just demoralized me even more about studying CS.


Not entirely sure I get this!


In four years you will. If not before then.


Interesting. My angle of approach is the opposite of what the OP describes. I say this as someone who studied math as an undergrad and has a pretty solid grasp of algorithms and the mathematical principles, but who's had a weak point in the front end for a long time, which I'm now working to remedy, because presentation is just as important as algorithmic excellence and efficiency.

Currently, I'm studying Play (the Scala web framework) and, at the same time, having to ramp up on JavaScript, CSS, HTML... and getting an appreciation for how much there is to learn (MVC, database configuration, integrating a web app with a typical build system). It's not mathematically hard, but it's difficult in the way that biology is: there's a lot to learn, and the relationships between the concepts are just as important and intricate.

For my part, I think that people who can present complex ideas well will always be employable. I think anyone who doesn't learn basic front-end programming concepts is doing himself a serious injustice.

The challenge of 2016 won't be solving hard mathematical problems. Yes, there will be high demand for people with those kinds of talents, and that kind of work will be (as it always has been) important. However, I think the biggest challenge is going to be educational in nature. It won't be enough to build great software; you'll have to teach people (who are too busy to learn and compare the intricacies of 35 technical assets just to do their jobs) how to use what you've built.


My perspective might be in the minority, because I'm an autodidact. I did go to college, and studied Physics. But it has never hurt me in getting a programming job; in fact, I have always simply ignored any requirements listed in job listings.[0] Several times it has come out, many months or years after I was hired, and people are surprised I don't have a CS degree. It's like a prejudice: they assume that anyone competent must have gotten a CS degree.

Maybe non-autodidacts need to go to college in order to learn how to program? (But I would doubt this. You all knew how to program long before you were freshmen in college, right? I mean, hackers are born, they're not created in CS classes, right?)

On the other hand, the people I've interviewed with and worked with who were non-hackers, who went and got a CS degree, often were weak performers. Much of what's needed in the workforce is not taught in CS programs, and something about the way CS programs are taught seems to often condition people such that they have to unlearn a lot of stuff before they're fully effective in jobs.

Of course, I've known lots of good hackers with CS degrees. Hackers do tend to follow the custom and go to college and get their CS degree, and arguably could be better hackers than they would have been without it (though I think it's debatable whether 4 years of employment experience or the CS degree makes the better engineer; for some people it's one, for others it's the other).

When I entered the work force, if not having a CS degree had meant I couldn't get jobs, it would have been a real issue. But these days it is a whole lot easier to start a company, so you don't need to depend on passing arbitrary HR requirements.[1]

If you aren't playing the startup lottery (e.g. starting an Instagram-like business and thus needing VC funding), it is vastly easier to start a profitable-from-day-one business now than it has ever been.

And 4 years from now, that's not going away.

[0] This also shows how well resumes are read. Mine doesn't lie, but I put job history first. I'd usually have so many interview choices that I'd pick my top 5, do 5 interviews in a week, and get 4 offers and a callback. I'm sure some companies did read my resume and didn't give me the chance to interview as a result, but that's fine; it's like a built-in bozo filter from my perspective.

[1] Frankly, I think the requirement for a college degree is a bit like hazing. The people before you went through it, so they aren't going to accept anyone who didn't have to go through it too. It has nothing to do with skills; it's just a way to exclude people who are different. Lord knows that piece of paper is not proof you can program.


This is pointless and has nothing to do with what year it is. Companies that need people who know differential equations or linear algebra have always had these requirements (even before computers or programming languages existed). There is no way a humanities student knows this stuff, and no way a company wants to expend the resources to have them learn DE on the job...

He applied for a job where knowledge and/or experience in complex mathematics was required and did not get the job because he lacked the qualifications. How is this interesting, or even news???


I agree that we could look at it as simply a case of a bad match in the skillset.

But that's also the point, right? Everyone with more than two years' experience in software development knows skill sets are changing continually. This story speculates that data analytics will be the next in the long history of hot/baseline skills for the developer. It does so using the powerful narrative device of never saying so directly.

Is data analytics going to be the next critical skill for developers, the way relational databases, object orientation, or internet technologies have been in the past? Or is it going to be (or already is it) the next bioinformatics: an interesting subspecialty, but far from being the primary source of demand for software development talent?


There is no comparison to be made here between mathematics and programming languages as far as continually changing skill sets go. Mathematical concepts have seen no new paradigms in my entire lifetime. Mathematics moves slower than any science or technology I am aware of. Higher-level math fundamentals may be required at data analytics companies because the math needs to be understood and applied to their rules. Figuring out different ways to collect data is not math; it is innovation in technology, or perhaps psychology, but the math this innovation is paired with is not new or changing in any way.

What I am really saying is that data analytics is nothing more than the combination of math and programming. The programming part is evolving quickly, but not the math side.


The algorithms used in things like machine learning are progressing quite rapidly.


As well as things like graph algorithms for dealing with Google-scale data. We might even put skills like map/reduce, GPU programming, and some as-yet-undetermined cloud management API into this bucket too.
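To make the map/reduce mention concrete, here is a toy single-process word count in that style (function names are mine; this is a sketch of the programming model, not any particular framework's API):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit one (word, 1) pair per word in the document
    return [(word, 1) for word in document.split()]

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: collapse each key's values into a single count
    return key, sum(values)

documents = ["big data big plans", "big ideas"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle_phase(mapped).items())
print(counts["big"])  # 3
```

In a real framework the map and reduce phases run in parallel across machines, and the shuffle is the expensive distributed step; the structure of the program, though, is exactly this.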


First of all, a person with a C/C++ skillset is not just capable of "building an app or a website". Most web devs now (and I include myself in this) don't have the requisite skillset to write the database engines, low-level graphics routines, browsers -- all the numerous layers we take for granted to print 'hello world'. And yet self-taught coders, by definition, are always learning.

I have never applied for a company coding position; I came from design and learned to code as I went; but 75% of my business now is in custom business apps. I've yet to meet a client who doesn't value the fact that I'm willing to learn what I need on the fly. Many times I take projects with the caveat that a certain amount of cash and time is probably going to be spent filling in what I don't know, and hacking around until I figure it out; and that if that becomes onerous, I'll knock some of it off the tab. I bill at $100/hr, modest by freelance coder standards, but obviously many times higher than coders on oDesk, and at least double what I'd earn in an office (if they'd hire me - which they probably wouldn't). And yet my clients end up paying less for rewrites and fixes, spend less time on the phone, and end up with a product they're happy with.

The small-to-midsized business owners who understand the value of letting me hack away, who ask what I think about how they can analyze their data, etc. get great value for their money, and I don't see a shortage of them. When I really, really don't know, I hire out to other hackers who do. I wouldn't want to hire a company composed of people who can memorize algorithms, but can't think on their feet; I'd much rather have the exact opposite, and my clients at least feel the same way. And I'll use whatever tools are at my disposal. The first time I wrote an online store, in 2001, I did it from scratch. I had NO knowledge of databases at all... I actually didn't know there was such a command as serialize(). I ended up writing a whole custom back-end in PHP that did its own serialization, flat file writing and retrieval on products, customers, orders, etc... hundreds of products in the store, thousands of customers... and that site is still running.
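The flat-file approach described above, serializing whole record sets to disk instead of using a database, can be sketched in a few lines. This is a toy illustration in Python rather than PHP, and all names here are mine:

```python
import json
import os
import tempfile

class FlatFileStore:
    """A toy flat-file 'database': each table is one serialized file on disk,
    loosely mirroring the PHP serialize()-to-disk approach described above."""

    def __init__(self, directory):
        self.directory = directory
        os.makedirs(directory, exist_ok=True)

    def _path(self, table):
        return os.path.join(self.directory, table + ".json")

    def save(self, table, records):
        # Serialize the whole record list and rewrite the table's file
        with open(self._path(table), "w") as f:
            json.dump(records, f)

    def load(self, table):
        # Deserialize; a table that was never saved is just an empty list
        try:
            with open(self._path(table)) as f:
                return json.load(f)
        except FileNotFoundError:
            return []

store = FlatFileStore(tempfile.mkdtemp())
store.save("products", [{"id": 1, "name": "widget", "price": 9.99}])
print(store.load("products")[0]["name"])  # widget
print(store.load("orders"))               # []
```

It works at small scale, as the commenter's store did; the trade-off is that every save rewrites a whole file and there is no indexing or concurrent-write safety, which is exactly what a real database engine adds.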

Companies that would rather have drones with degrees aren't companies I'd work for, and I'd argue that they aren't who successful businesses looking for software want to hire, either.


The blog entry should end with the guy waking up in a cold sweat, tripping over his bedlinen to get to his laptop, fumbling for his 2-factor authentication fob, and checking his bank account. Inputs the second factor key. Navigates to total in all accounts. Counts the figures, one, two, three, four, five, six, seven, and some cents. Counts them again. Breathes deep, and goes back to sleep.

Thank God, that job interview was just a nightmare.


I grew up hearing stories from the '70s that any intelligent teenager could get a job in computer programming. No experience was necessary. No one was expected to have any experience. They would train you.

When I entered college, the general consensus was that anybody who knew HTML alone -- not Javascript, not CSS, not backend development or server administration -- was guaranteed a high paying job.

Upon entering the job market, I've encountered the new reality that the requirements for an entry-level position are going up faster than I can learn the new things that are required each year. Every position requires experience in something that I do not yet know or have only minimal experience playing with in a personal or academic setting, but there are many qualified candidates competing for the same position. Too many experienced journeyman programmers are out of work and competing for a smaller number of jobs.

The author's bit of sci-fi seems entirely reasonable. It's not where things are headed. It's where they already are, and it's been like this for years.


Huh. Where is this mystical place where there are lots of qualified candidates competing for jobs?

It's certainly not the Bay Area in California, where there are far more jobs than competent developers. I'm out in Colorado, and I still get constant offers even though I'm not looking for a job.

If you're not able to learn what's required, then maybe you should question whether you're in the right field. I've been keeping on top of HTML, CSS, backend development and server administration, especially high performance servers and NoSQL databases, with a smattering of JavaScript on the side, in my spare time while working on video games as my full time day job. Suddenly I'm finding that I needed all of those skills after all; understanding the whole stack can be quite useful, it turns out.

If you don't even have a day job, then you should have plenty of time to become an expert (or at least competent) in at least one narrow field that people are hiring for. If not, then there's no way you'd be able to handle a serious job if you were hired, since just about any job you take will require that you do a lot of learning on the job.

The key skill to have is how to learn. Unfortunately school (elementary through University) tends to hammer that skill out of people. Try to find it and reclaim it. Then build something that you can show off to prospective companies -- or sell it yourself! Good luck.


It certainly is the Bay Area in California, specifically the North Bay. There do seem to be more jobs south of the Golden Gate, but that is too far for me to commute and I have not built up enough of a financial cushion to survive long if I move and still cannot find work.

Nobody hires in one narrow field anymore, at least not according to the job ads. You need to show competence with six to a dozen different technologies from the basics (languages, server software) to the specifics (frameworks). Miss one and you're out. Have you worked with a CMS but not that specific CMS? You don't qualify. Have you lots of experience with several similar languages but never used this specific language in the workplace? You don't qualify. Are you an expert in the previous version of the language but you haven't caught up with the changes in the latest version? You don't qualify. Someone else evidently does because the company closes the position.


The North Bay doesn't really count. You've got, what, San Rafael, Santa Rosa, and a few bedroom communities -- barely any tech at all. The Oakland/Berkeley/Emeryville area alone probably has more jobs than the entire North Bay, and that's not counting San Francisco or other Peninsula jobs.

You need to be either on Bart somewhere or on Caltrain somewhere. Then you'll have tons of options -- in San Francisco, or in the 'burbs/secondary technology areas -- and the commute can be on public transit (to avoid the awful traffic all around the Bay).

And you don't NEED to move before you find a job. Suck it up and drive for two or three hours to interview if you have to. Try to get several interviews on the same day in the same area if you can. And make it clear to them that you're planning to move for the job -- some companies don't care what your commute distance is, but others do actually care how happy their people are, and assume that someone commuting 10 minutes will be happier than someone commuting two hours.

Good luck.


So did they actually reject you for one of those reasons, or did you just not send them your CV?


For these reasons. I am generalizing from specific cases in my past: having used Drupal but not Joomla, knowing Java and C++ and object-orientation in general but not having used C# outside a brief school project, knowing early Perl 5 but not the changes up through 5.10, having programmed a bit in Ruby but never used Rails when the job description just said Ruby. "I can learn it" is never an acceptable answer when they can get someone else who already knows it well.


Does anyone else feel that this Op-Ed piece is more 2012 : 2007 than 2012 : 2016?





