
I think that age is correlation, not causation.

What’s really happening is two things. First, companies typically don’t like paying people lots of money (even if they are great at blowing millions on other mindless things). Second, these companies do not understand that their unwillingness to hire expensive people is causing months of bug-chasing and monkey-patching in their products.

When your organization is under the impression that “a developer is a developer”, new college grads willing to work for peanuts are very attractive. It takes a good manager to understand that someone’s decades of experience or advanced degree really is worth a lot more money, and not just in the long run. Experienced people have seen more programming constructs in more languages, they have encountered more examples of APIs in more libraries, they have made more mistakes and learned from them, they are more likely to be able to apply suitable algorithms and data formats to problems, and so on. Also, having experienced people on staff actually gives your new developers somebody to learn from.




This matches my experience well. Many of the devs I know in the Bay Area who don't work at Google/FB aren't making more than $80k. These companies pull in grads who couldn't quite make the top companies and who don't realize they will never get a raise or promotion, and many of them stay for 5-6 years before they realize how badly they are getting screwed.

These companies would NEVER hire an older, experienced developer because they know they wouldn't be able to retain them.

It's worth noting that I work on a dev team where the average age is about 40 (I am 27). We have multiple members of the team nearing retirement. Having the older guys around makes my job much easier and gives me resources that I wouldn't have otherwise. We are able to keep them around because we work on interesting projects, don't work more than 40 hours a week, and have a great manager.


> These companies would NEVER hire an older, experienced developer because they know they wouldn't be able to retain them.

Not only that, but they don't have to retain them. There is a large supply of "~20 year olds who aren't quite Google material" lined up to take those $80K developer roles. The "churn and burn" strategy seems to be working for these companies, so why change?


It's the write-only culture of software development.

People only care about how long it takes to write something from scratch. That dictates everything, from which frameworks or languages are "hot" to which methodology people use, how projects are managed, and... who gets hired.

There isn't much argument that someone straight out of college, with little experience but who happens to know the latest trendy thing, will probably pump out a semi-functional MVP faster than someone more experienced. More energy, more willingness to work late hours at home because of fewer responsibilities, but also less experience, which means less time spent thinking about the real problems that could come down the road and how to prevent them.

That lets someone code really, REALLY quickly. It blows up down the road, and then that's just "normal": it goes in the bug queue and tech debt spirals out of control.

I was once in a meeting where someone said, "Well, of course this code is buggy and sucks: it's at least a year old!!!"

I was floored.


This.

Many SF tech startups overtly believe that actually planning your code before you write it is just, well, not very "agile" -- that's old school, that's waterfall, proven to be deficient, that's not how we do things in the new world. Planning an architecture before coding it?? Why, you may as well be wearing a pocket protector and hating on women and minorities from some massive IBM cubicle farm in the suburbs while chainsmoking and punching your code into paper cards.

In the new world, you break things and move fast -- when you're assigned a massive new project, you just sit down and start typing. When your unplanned, non-architected code turns out to have serious structural issues that make it inextensible and unmaintainable, why, you just layer another level of crap code on top of that to plaster over the cracks, and another on top of that, and so on.

I wonder how much of this has to do with the churn-and-burn VC culture, where they fund hundreds of startups with just enough money to build an MVP so they can "test for market fit," then burn 99 of them and move on to the next batch.


> new college grads willing to work for peanuts are very attractive.

Peanuts being a ways north of 100k a year at the bigger companies.

> advanced degree really is worth a lot more money, and not just in the long run.

Are we talking PhD?


New grads only get more than 100k at a tiny group of elite companies. Almost all new grads lack the credentials to get into those places.


>I think that age is correlation, not causation.

Agreed. I started working in tech when I was 22--now I'm over 30. I've never witnessed old-age discrimination. What I have witnessed are start-ups built on newer tech (AWS over on-prem, NoSQL over relational, etc.) passing on candidates whose last reference book was The Data Warehouse Toolkit and who still want to be called WebMasters.

The law of the tech industry is that when you stop learning--at any age--you lose market value.


There is a difference (both in terms of perceived and actual value) between "never stop learning" and constantly being on the language/framework-of-the-year treadmill.


It's funny, I've been in the industry so long and I usually stay at places 5+ years, so I've been able to skip quite a few trends. When new guys come in, I'll ask about the stuff I've skipped wondering if they are still relevant. Sometimes they are and sometimes I hear things like, "oh, that's so old, this is the new thing."


Yup, I lucked out by hitching my horse to C++ and Linux early in my career. The need for either of them never seems to go away. Also bet on a few technologies that kind of petered out, like OpenGL, and there were a few I'm glad I didn't waste time on.

You can always catch up and learn something that becomes important/mainstream, but you can't un-waste the time you spent deep-diving on a dead-end technology.




