It's official: developers get better with age. And scarcer. (coding-and-more.blogspot.com)
375 points by peterknego on June 13, 2011 | 167 comments



Since I don't see this specifically addressed in the text: I suspect this is a classic case of survivorship bias. The same way the stock market looks better and better the further back you go, because you aren't tracking the stocks of companies that are no longer in business, developers probably look better and better the further they are into their careers, since all the ones who've fallen by the wayside probably were the weaker developers who found something better to do with their time.

As a somewhat older developer, I find this a surprisingly difficult question to answer honestly. Comparing myself to myself from 10 years ago, I sincerely think I'm more effective, but self-delusion may play a part in that. I've probably lost some of my "step", in terms of raw capacity to memorize and compute mentally, and I have more commitments outside of the world of software, which dilutes my efforts further. Then again, the strategic ideas I have are more dependably correct, and I spend less time chasing down dead ends, either because I've been down them before or had the good luck of witnessing them second- or third-hand.

I've gotten a chance to see a world-class developer very closely between the ages of 36 and 45. He started this period as, very easily, the greatest engineer I'd ever even heard stories of, and I'm pretty sure he got better over that decade. It can be done.


I don't feel any smarter than I was in my 20s. OTOH, software has gotten much more complex. Mastering the Apple II ROM routines is one thing, but understanding how every piece in a complex web application (app server, rdbms, non-relational storage, cache) interacts is much harder. After wrapping my brain around so many concepts, it feels stretched very thin. We rely on Google and the web to supplement our memories for things that 25 years ago depended on books and synaptic pathways.

I'm quite sure I am much more effective today than I was 25 years ago, but a lot of that has to do with cognitive prosthetics.


I tend to think inherent in your reply is an idea that I only see with older developers: that we should actually understand the whole stack. Newer developers are content to let more and more of the development ecosystem be someone else's problem.

So, I empathize strongly, but I think the issue isn't that we need to use prosthetics so much as that the conventional wisdom is "that's devops' problem."


People have always been content to let parts of the development ecosystem be someone else's problem. It's only the boundaries that have shifted over time.

Today's webdevs consider the webserver and database to be black boxes that are just "there" to be used. The people writing those webservers and databases in the 90s and 00s usually considered the OS and compiler to be black boxes that were somebody else's problem. The people writing those OSes and compilers in the 70s and 80s considered the hardware to be a black box to be trusted (at least at a high level; they knew processor architecture, but I doubt very many thought hard about how to build a flip-flop, NAND gate, or multiplexer). The people designing those chips in the 50s and 60s considered the vacuum tubes and silicon wafers to be black boxes; you don't think very much about how your silicon gets out of the ground, but that's a pretty huge project on its own.


It goes past even that-- how many devs today do you figure see Rails as a black box?


I find web development less enjoyable than 5 or 10 years ago because of the increasing size and complexity of the stack.

At least we old fogies have had the last 15 years to learn web development. How the hell do young programmers learn such a big stack in a few years? I'm guessing half of it is covered by youthful energy and the other half skipped in blissful ignorance?! :)


Specialization.

When I was starting out, I did design, frontend dev, backend dev, and ops work. Over the last 15 years these positions have all split into specializations: design was the first to go, then ops. The split between frontend and backend opened up in the middle of the last decade, and continues to split even further into controller and model devs on the back end and Javascript and CSS people on the frontend.

The increasing popularity of frameworks stems from this specialization. It's more important than ever to have separation of concerns, because as apps get larger, individual devs are doing smaller and smaller sections of the work.

Young programmers don't know less than us -- they know way more than we do, but on a much narrower range.


I really don't agree with this: design and dev have always been different specializations, serious Javascript has only appeared in the last few years, and I've only seen people actually describe themselves exclusively as 'frontend devs' in the last 6 months.

And 'frontend dev' at the moment seems to be as malleable a title as 'SEO' was: sometimes it means a designer who can add jQuery and a couple of modules to a page, sometimes it means a talented Javascript programmer with an in-depth knowledge of HTML/CSS.

And young programmers can't know way more than older programmers, if you keep learning.


How the hell do young programmers learn such a big stack in a few years?

They don't. Witness the never-ending repetition of basic mistakes leading to SQL injection vulnerabilities, script injection, etc.


I would bet more SQL injections occur due to laziness than inexperience.


I think one would need data to confirm that.

I mean, I am lazy, but not so much that I would knowingly write insecure code for my customers; I can't imagine that many developers are different in that respect?


Plus, many database libraries make it significantly easier to write vulnerable code than it is to write secure code.
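To illustrate the asymmetry with a minimal JDBC sketch (the table and column names here are hypothetical): the vulnerable string-concatenation version is the shortest path, while the safe parameterized version takes an extra step.

    import java.sql.*;

    public class UserLookup {
        // The "easy" path: concatenating user input straight into SQL.
        // If name is "x' OR '1'='1", the query returns every row in the table.
        static ResultSet findUserUnsafe(Connection conn, String name) throws SQLException {
            String sql = "SELECT * FROM users WHERE name = '" + name + "'";
            return conn.createStatement().executeQuery(sql);
        }

        // The safe path: a parameterized query, where the driver handles escaping.
        // Slightly more ceremony -- which is exactly the asymmetry described above.
        static ResultSet findUserSafe(Connection conn, String name) throws SQLException {
            PreparedStatement ps = conn.prepareStatement("SELECT * FROM users WHERE name = ?");
            ps.setString(1, name);
            return ps.executeQuery();
        }
    }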


I find web development less enjoyable than 5 or 10 years ago because of the increasing size and complexity of the stack.

Part of this is that stacks that start off all light and fresh ("we're not Struts!") before long acquire a few too many "must-have" features until, Lo! they are Struts.

Then it's time to drop that framework and find something more fresh, light, and lean, and enjoy it while it lasts.


This past weekend (during a RHoK event) I was introduced to Tipfy by a young dude who's far faster than me at grokking new tools. It was easy, fresh, and led to a fun weekend of web development.

Tipfy runs on the Google App Engine Python stack. It's a bit rough around the edges, but that's part of the fun.


I started programming in web development about 4 years ago: first HTML and CSS, then adding Javascript and AJAX, then PHP, then Ruby and Rails. In the process I learned a fair amount about MySQL and I can do the most basic Unix server administration.

I'm on my third full-time web development job and I now feel decently confident in my abilities. But yes: it's a big stack, and overwhelming. Everywhere I turn I see more things I don't know. I'm always reading and trying to improve.

Were there ever simpler times? It's an interesting thought to me. I find the constant challenge to be interesting, but admittedly, sometimes tiring.


How the hell do young programmers learn such a big stack in a few years?

Welcome to the museum of modern wheels -- we have every shape but round.


That's the point - you don't learn the whole stack, you let your framework manage it.


This is one of my problems with younger developers. They slap framework code together rather than actually figuring out what's going on underneath.

It's one of the reasons I still use PHP. It's the perfect balance between framework and coding.


It's not about being able to add code to every piece of your stack, but about understanding what all of its external interfaces do. How many programmers do you know who can do that with their stacks?


Ho!

At 39, I solve tasks that in my 20s I couldn't even dream of approaching. I attribute it to the much higher-level languages I use today (mostly Haskell) and, of course, to experience in various fields.


cognitive prosthetics

I like that term.


I think Clarke's first law applies to programmers as well:

"When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."

When you are older you may really spend less time chasing dead ends, I agree with that; but to be fully sincere with oneself, some of those ends might turn out not to be that dead anymore 10 or 20 years after you last bothered to visit them.


That's a great point. I try to actively fight conservatism, but it's probably a neurologically losing battle; as I get more experienced, I necessarily get more scarred, and will instinctively avoid the areas that have caused me professional pain. When this helps, we call it "learning," but in the limit it tends towards rigidity.


"...and will instinctively avoid the areas that have caused me professional pain."

This is not good. You learned that your chosen solution did not work back then. This time, you know you have to try from the other side. This is learning too.

I appreciate it a lot when there is a chance to try again something that did not work a while ago. As long as you don't run out of new ideas. So far I haven't (46).


Oh man, I agree with this. Stuff I intended to get around to someday back in the 90's is just a CPAN download away now.


That quote sounds deep, but it reduces almost exactly to:

"Almost everything is possible."

More precisely: "If a distinguished but elderly scientist passes judgment on an idea, it is probably possible. Only ideas whose possibility has not been judged might be impossible."


If I recall correctly, Clarke's Law originally applied to fiction.

There it makes perfect sense.


is it fair to call it a bias in this case? it doesn't alter the conclusion that older developers are both better and scarcer, on average.

[disclaimer: 44 yo ;o] [ps. google has mitigated memory loss, i suspect.]


You are 100% right, a developer is a developer, so it's not bias. It could even be argued that the reason there are fewer older developers is because the better ones went into management and are no longer active developers.

Now, this all assumes that developer activity on Stack Overflow is correlated roughly equivalently across most ages. If it is, then the data plainly states that any random developer you would interview is more likely to be more knowledgeable (according to the definition extracted from Stack Overflow activity) the older they are. The fact that there may be fewer developers at an older age is irrelevant.

Disclaimer: I'm 25


"It could even be argued that the reason there are less older developers is because the better ones went into management and are no longer active developers."

On the contrary, in my experience, engineers who are fed up with coding, or who find maintaining their skillset too tedious or time-consuming to fit in with other responsibilities, generally move into management (have kids? move to an exec role). I've been offered several CTO positions, but I'm still building systems while many colleagues have chosen the ladder (33 yrs old here) - if anything, a subset of older programmers is healthy for the ecosystem.


"have kids? : move to an exec role"

Exactly - My point was that it is presumptuous (and irrelevant) to assume people leave development as they get older because they aren't very good at it. It's irrelevant because those people are, by definition, no longer developers.

A bias could come into play if you could prove:

Older developers who are (more/less) knowledgeable are (more/less) likely to participate in Stack Overflow.

or

Younger developers who are (more/less) knowledgeable are (less/more) likely to participate in Stack Overflow.

The latter proposition of bias is probably less likely since the sample size of younger developers is much larger.

I still think the conclusion is correct given the provided definitions: any random developer you would interview from Stack Overflow is more likely to be more knowledgeable the older they are. If the population on Stack Overflow is representative of the general developer population, then that conclusion also holds for the general population.

Also, you highlight another interesting thing: the average age at which a man in the US gets married is close to 28, right smack in the middle of that distribution. http://factfinder.census.gov/servlet/GRTTable?_bm=y&-geo...


I think "bias" is technically applicable. I.e., the population of older developers is not comparable to the population of younger developers because of survival (or if you prefer, self-selection) effects. Any non-trivial conclusions (e.g., that individual programmers should expect to get better with age, that your company should make an effort to recruit/retain older programmers, ...) from this data are confounded by these effects.


Well, survivorship bias can still be a reason to recruit older programmers - you want the survivors. As for retention, you would hope that a company will know which of their older programmers are valuable.


The question wasn't about finding out if older programmers are better, but if programmers get better as they grow older.

Measuring the latter in terms of the former is highly prone to survivorship bias. The OP's data is evidence in favour of progress over time, but it's weak. Now, progress over time doesn't sound like such a silly idea, so even weak evidence counts for me.


> Since I don't see this specifically addressed in the text: I suspect this is a classic case of survivorship bias.

The text didn't seem to do much besides present the stats. But he says this:

    I knew that with age coders tend to switch careers, but I
    was surprised to see the size of the drop. After the peak
    age of 27, number of developers halves every 6 to 7 years.
I figured that "survivorship bias" was sort of the exact point he was trying to make. Beyond that, there's really no discussion of causation (getting older makes one more "mature" or whatever), so I think it's implied that weeding out less committed devs is exactly what's going on.
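For a sense of scale, the quoted half-life compounds quickly. A back-of-the-envelope sketch (assuming a steady 6.5-year half-life past the peak at 27, per the article's figures):

    public class DeveloperHalfLife {
        public static void main(String[] args) {
            double peakAge = 27, halfLife = 6.5;  // figures quoted from the article
            for (int age = 27; age <= 47; age += 5) {
                double remaining = Math.pow(0.5, (age - peakAge) / halfLife);
                System.out.printf("age %d: %.0f%% of the peak cohort%n", age, remaining * 100);
            }
        }
    }

That prints roughly 100% at 27, 59% at 32, 34% at 37, 20% at 42, and 12% at 47 -- so by the late 40s, only about an eighth of the peak cohort is left.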


I think you are overthinking it, since everyone knows: Old Guys Rule


Aaargh! The data doesn't say this; all the data says is "Older Stack Overflow users have disproportionately high SO reputation."

An alternate explanation is that for some reason older developers are more likely to be addicted to Stack Overflow.

A big problem here is the unproven assertion that high SO reputation means you are a “better developer.” Does it really? (After all, with few exceptions, the more active you are on SO, the higher your reputation, period, regardless of your answer quality, partly because downvoting is strongly disincentivized. And the article itself notes that older programmers don’t receive significantly more upvotes per post!) Until that’s shown, the article’s conclusion is highly suspect.

Frankly, I’m embarrassed so few people seem to be calling out the terrible reasoning behind this post. It may well be that older programmers are “better,” but what we have here is nothing more than a colossal failure to understand science, reasoning, and evidence.


Yeah, my conclusion from the data was "senior developers know more things and have more time on their hands to tell others about those things", which is exactly what you'd expect. The more senior you get, the more your role is that of guide and mentor rather than immediate implementor.


One of alan's points is that it is incorrect to assign behaviors of "older Stackoverflow users" to the universe of "older developers".

The population of "older Stackoverflow users" is not randomly drawn from the population of "older developers", and nothing is put forward to claim that the former is representative of the latter, so you cannot make this assumption.


Exactly. The graph showing that older coders' answers are not significantly better than those of younger coders is a case in point: maybe the good older coders are too busy actually coding to spend time answering questions on StackOverflow.


The data could also be used to infer that Stack Overflow users are much "better" programmers than non-SO users. Non-SO Users have 0 reputation and are therefore, terrible programmers. While SO users have > 0 reputation, and are therefore, awesome programmers.


If they have higher reputation and give correct answers, couldn't we then assume that the older developers have a wider range of knowledge that lets them participate in more responses? Also, it is not really clear how "better" is defined in this case. Maybe it is "better" == I know more technologies than you?


I also wonder: Do developers who spend a lot of time on Stack Overflow get "better" than those who actually write code? I've met more than one "Aristotelian Programmer" who could quote you every design pattern from heart and draw UML diagrams on a white board all day, but who actually couldn't write code.


Thank you. I'm not against speculating but we all should know that correlation does not imply causation.


I'm 45 with a little short of 30 years of programming experience. Our team at HP (I'm not a manager) has a few programmers that are significantly older and a few that are very young. From my observation, what's different about young vs. old programmers is not related to the speed at which we pick up new technologies (we all love to tinker with the latest stuff) but at the general approach to problem solving.

Older engineers tend to compare new problems to experiences from the past. The tools at our disposal have become much better but the fundamental mechanisms haven't changed that much so it's easier to identify whether there's a real benefit to using a new tool or if it's better to stick with what you have.

As an experienced developer it's a little easier to avoid sinking effort into novel but misguided technologies.

As a young developer it's a little easier to be open-minded about promising technologies.

But don't pay any attention to me - my SO rep is less than 30% of the average for my age bracket...


I don't disagree with the conclusion: people who aren't as dedicated to or good at programming transition to people or product/program/project management[1]; the remaining folk receive additional experience which allows them to capitalize further on their passion and talent.

However, this isn't exactly proven by the data: what Stackoverflow shows is that older developers are better at talking intelligently about programming. That's extremely useful (and helps career wise), but it isn't the same thing as being a better developer. Sometimes it correlates (the best programmers I've known have also participated in organizations like IETF, written RFCs and have also thoroughly documented their work), but it isn't a total ordering (I know plenty of programmers who are better than I, but who don't participate in any public forums).

On the other hand, I've yet to find a successful programming language made by someone before their thirties. Contrast that with some of the most game-changing academic work in Computer Science and Mathematics being done by people in their twenties.

[1] There's nothing wrong with that: Google's APM program particularly is a great example of "engineers who don't want to code" being extremely useful. See also "The Russian Lit Major" by rands: http://www.randsinrepose.com/archives/2006/09/06/russian_his...


Historically, it would seem like 30 is almost a "sweet spot" for language creation.

* Dennis Ritchie was 27 or 28 in 1969 when C got going.

* McCarthy was about 30 in 1958 for Lisp.

* Sussman was 28 and Steele was 21 in 1975 for Scheme.

* Alan Kay made Smalltalk between 28 and 31.

And while it used to be true that lots of game-changing mathematics was done early, I don't see much of that lately. A huge amount is done by junior faculty and postdocs, but that's usually late 20s and 30s.


I thought I had a counter example with Yukihiro Matsumoto and Ruby, but Yukihiro was born April 14, 1965 and Ruby was released December 21, 1995. Thirty years and a handful of months.

http://en.wikipedia.org/wiki/Yukihiro_Matsumoto

http://en.wikipedia.org/wiki/Ruby_%28programming_language%29...


The greater number of answered questions could either be because they are better at talking about programming as you say, or it could be that they just know more answers. It could also be that they have more time to answer questions, maybe because they have cushier jobs.


Tell HN:

On a barely related subject, tomorrow is my 49th birthday.


Really? I could've sworn you were in your twenties from this photo: http://reginald.braythwayt.com


IIRC, that photo was taken at MeshU in Spring of 2008, so I would have been 45 or so.


So that's why you're such a good blogger!

Thanks for blogging and contributing on HN, you've really improved my understanding of software, and how to think about software.

Happy Birthday.


Hmm. I have upvoted you automatically: in the past that did not just mean I think your argument is interesting, but also (in cases like this one) that I share the sentiment and do not want to litter HN with "+1" comments. Of course, now that no one except you can see your score, this no longer works like that.

Happy birthday to raganwald!


Happy birthday, and thank you for all the wonderful things you've written over the years! I hope you find time to write many more.


Happy Birthday!!!

And I agree with 6ren. You definitely look like a 20-something.


I suspect Reg has a portrait hidden in a closet someplace that is not faring as well. :)


Ditto to what others have said. I figured you were in your early 30s.


Don't get scarce!


What this really shows is that developers get better at answering technical questions with age. There is probably _some_ correlation to development skill, but the data doesn't demonstrate that. I'd expect developers to know more things as they get older, and I wouldn't be surprised if they got better at answering questions. Not sure that makes them better at the development part.


I think that this is showing a correlation between those programmers that enjoy programming and want to share their knowledge with others, and those programmers that stick with it for more than a decade.

I bet that if you could separate out the younger developers who will still be developing in 10-20 years, their rep on SO would be similar to that of older developers. The developers that'll wash out in the next 5 years are dragging down the participation numbers of their peers.

So it isn't the age that's important, it's the personality type which is correlated to those people that'll stick with development.


Since programming is a skill built on technical information, and since practice/experience is even more important for skill development than for other types of learning, I would expect their abilities to be even better than predicted by a test of their knowledge.


I used to think younger developers were better, but with the experience of age, I now realize that older developers are better.

Honestly, though, I think programming is a "young man's game", partly because you are sharper, have more energy, etc. when young; but mostly because when everything is reinvented each decade, you are better off starting fresh, without being aware of other choices.

The exception is for higher-level tasks, such as marketing, managing people, strategic business decisions, and code architecture. Also, I would think, language/library/API/framework design. I hesitate a little, because many of these are based on the needs of current programmers, which the front-line troops know better because they are doing it (they are the users). However, for deeper insights, age has the benefit of seeing deeper patterns over decades, and over generations of usage. Most language designers seem to be older (but is that just because their languages are now old?)


Age correlates with experience.

If you've been doing something longer, you've had time to learn what works and what doesn't, and why.

Thus, you can do a better job at guiding people who are newer to the material.


But if you started programming at 50... that's different than having done it for the last 30 years.


That's pretty rare though, I think most programmers in their fifties have been doing it for many years.


I doubt there are enough such cases to distort the statistics.


Honest question: what happens to developers after they are no longer "Developers"?

As a 29-year-old I'm kind of freaked out by the fact that I'm on the older part of that distribution.


I don't have stats to base this claim on, but at my last job I heard from an employee in another department that the normal lifespan of a software developer was 7 years, after which they have some sort of career change, be it management or logging.


Maybe with coding, it's either up or out. Older developers who have better skills most likely enjoy what they are doing and are somewhat good at it. If they weren't, they would have left the profession, maybe to become project managers or some related position. It's not like you're always forced out of your position, it's just that when you look down the road, you can see that if you're not a great coder, it may be best to find something you're better at. A 25 year old coder may still be figuring out if that line of work is sustainable over the course of a career.


Wow, the chart only goes up to 49 years old?

For me, that is about 4 programming languages ago :-)


He said he only took age cohorts for which he had 100 or more data points.


Hardly seems like they get better with age, just more active in educating other developers. The quality of posts doesn't look related to age at all. As far as I can tell, that should be the parameter to measure if you are going to make any sort of induction about the quality of a dev from their SO profile.

And obviously, the fact that this only takes data from SO means whatever the conclusions, they only hold true for the kind of people that post there.


Having the right answers counts for a lot. I see a lot of less knowledgeable developers spend a day or two figuring something out that I know off the top of my head. These days, people tend to come by my cube and ask questions before they struggle too long with something, which makes me feel pretty darn good about myself. I've also learned that before I go too far down a rabbit hole in some area I'm shallow in (OS resources, PL/SQL, database performance, webby stuff, the list goes on and on) I should have a conversation with someone who knows what they're talking about.

Age doesn't necessarily have anything to do with knowing the right answers. (My first experience being a greybeard actually came within a few weeks of starting my first job.) Part of it is investing the effort to gain expertise, part of it is being smart and/or lucky about investing your effort in the right areas, and part of it is having the background and aptitude to absorb knowledge. Wherever it comes from, being able to answer questions that stump other developers provides a big gain in productivity, because you spend more time doing work and less time struggling with trivialities. If the ability to answer questions increases with age and quality of work doesn't decline otherwise, then the average value of developers increases with age.

So anyway, here's my first experience being the greybeard, even though you don't care. I was walking around in the office when someone flagged me down. There were three senior developers standing behind a junior developer, all huddled over staring at his screen. They were trying to figure out a snippet of Java code that looked something like this: set?foo:0

"It must be a valid identifier of some sort or it wouldn't compile." "If it is, where is it defined?" "Question marks and colons aren't allowed in variable names. We checked that three times in the book." "Maybe it invokes an implementation-specific feature in the compiler we're using. Whatever it is, it's some kind of deep magic." "Maybe there's a bug in the compiler and it's treating the question mark as a combination semicolon and comment character." Apparently they had been at this for quite a while and were repeating the same ideas they had had an hour ago. They were scared to just "fix" the code because it had been written by a "really smart guy" who had left the company, so it must be right.

"Sheesh, haven't any of you guys written any C?" I said. Two minutes later I had restored a junior developer and three senior developers back to productive work, saving them God knows how much more wasted time.

That's the value of knowing random things off the top of your head.
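For anyone who, like those four, hasn't written any C: here's what that snippet does (a tiny sketch, with hypothetical names matching the story):

    public class TernaryDemo {
        public static void main(String[] args) {
            boolean set = true;
            int foo = 42;

            // The mystery expression: the ?: conditional operator, inherited
            // from C. It evaluates to foo if set is true, and to 0 otherwise.
            int value = set ? foo : 0;

            // Exactly equivalent to:
            int verbose;
            if (set) {
                verbose = foo;
            } else {
                verbose = 0;
            }

            System.out.println(value + " == " + verbose);  // prints "42 == 42"
        }
    }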


LOL, what?

You're seriously telling me that they were stumped with "set?foo:0"??

And you stayed? :-)


Back in the day, it was very common among Java programmers to consider using ?: as atrocious style. It was a relic from C, eccentric and therefore confusing, just more damage inflicted on the language through the influence of conservative programmers who didn't "get" Java. Since most Java programmers were monolingual and learned from books that relegated ?: to a few sentences inserted somewhere for completeness, many didn't even recognize it as an operator.

Even setting aside my love for the ?: operator, I never understood that attitude. One of the major selling points of Java in the early days was that it was simple enough for everyone to understand pretty completely. It was supposed to be impossible for a guru to write code that a junior programmer couldn't understand at the small scale with some effort, line-by-line, if not at the large scale. Training programmers to be competent at only a subset of the language undermined the purpose of having such a simple language.


*nod* I understand your point; it's not the clearest operator. Java would not be worse if it had never been in the language.

Though, it has made many lines of my code more succinct!


The main question, if you really were to judge something like quality, is why older devs answer more questions.

Is it your assumption that they are just more invested in educating others, or is it, e.g., because they have a broader range of experience and can answer more (and more difficult) questions?

As it is, this data set is merely an interesting conversation starter. I hope somebody takes it and does some research on it, because it sure could be interesting.


It's probably the former. Older people--developers included--can enter a phase of "generativity" wherein they seek to give something back to the world in which they've prospered by helping the generations that follow.

See "Generativity vs. Stagnation" in this list of Erickon's Psychosocial Stages of Development:

http://www.psychpage.com/learning/library/person/erikson.htm...


The cynic view: they may have more time on their hands to was^C^C^Cspend on StackOverflow.


Even if I subscribed to the cynical viewpoint - how come they have that time? The snooty old dev in me counters the cynic with "Because we get stuff done" ;)

So, still, lots of room for a more detailed look.

Edit: The reply is of course tongue-in-cheek. The second reply to the GP seems to provide a better explanation.


There is a strong correlation between being a good developer and your ability to effectively educate other developers.


A quote from this really stuck out for me:

"So, senior coders earn their higher reputation by providing more answers, not by having answers of (significantly) higher quality."

A lot of people here are focused on "being smarter" or "doing a better job" or "higher quality".

Exclude all of the self-taught developers and limit ourselves to people who follow the standard "get a 4-year college degree, then go out into the real world and work" path, as that crowd is pretty sizable. Make extrapolations as necessary.

Remember your first year (or two?) of development? Looking back, you were probably in way over your head, had mentors looking after you, made tons of mistakes, etc.

Fast forward 5 years. Can you write code faster? Probably not. Can you write better code? Sometimes. It's all about experiences and learning from them. When you take a new job, or a new project, or a new anything, you call upon past experiences to guide your efforts. It might be something as vague as "I am going to write tests first because I found it helped me earlier", or (ignoring TDD), "I'm not going to write this function like this, because I know the code will be hard to test when I get around to writing a test for it later."

You learn all of this from experience. Senior people who have been in the field longer have more experience. They aren't "better" in the sense that they are smarter or have more intelligence; they just know MORE because they've been exposed to more.

It's also why so many people (especially on Hacker News) have been successful without degrees. It's not the degree that matters, it's EXPERIENCE.

It might be a fine line differentiating between smartness gained from pure intelligence and "smartness" gained via experience, but I think it's an important distinction, and one that I think this post highlights well.


Is the decrease in the number of developers due to real scarcity, or due to fewer older developers actively using Stack Overflow?


I think it's real scarcity. I'm 41 and am always one of the oldest in any group of devs (except for that Usenix event I attended :-).

Most people get into their career in their twenties. For computers it's often even younger. But it seems relatively rare for someone to pick up programming in their 30s or older.

If you're 40+ today, your twenties were ending around the time the "tech boom" was beginning, but there just wasn't as much information and inspiration around for getting into software development. Even through the mid-90s, a college degree and programming skills were no sure ticket to cushy employment.


Sometimes it's good to be a late bloomer.


Or just so obsessed that you stayed in programming even when it wasn't obviously a good idea. Come to think of it, it's still not obviously a good idea.


I think there might also be another selection bias here, in that StackOverflow might be a bit self-selecting for developers who know their stuff.

It's hard to say what this says about developers as a whole, including the ones not on SO, which I assume is a large number.

An alternative hypothesis might be this: Good developers get better over time, whatever that means.

I would suspect that people who form bad habits early on don't enjoy the same benefits of experience as those who built on solid foundations.


Interesting hypothesis, but the faulty data renders the results presented useless. Namely, only 53% of SO users enter their age. Therefore the data may be wildly biased towards people who are willing to enter their age in an online profile.

In addition, this only represents SO programmers, who, while a great bunch, are hardly representative of all programmers.


This analysis is completely wrong because the bell-shaped curve is not measuring the number of developers; it is measuring the number of SO users. There are SO users that are not developers, and a lot of developers that are not SO users, so no valid conclusion about the general population of developers can be drawn from these numbers.


The bell-shaped curve gives me more confidence in the result, not less. It shows that Stack Overflow has enough developers that the graph isn't choppy.


The bell curve is a result of the age distribution of just about any population: developers, guitar players, sci-fi readers, etc. In this case, the curve happens to represent SO readers.


The bell curve can be choppy when you've got too little data or bad data.


You're just looking at the shape of the bell curve. It tells you that it represents a population, but it doesn't tell you what population it actually is. That bell curve could be for the age of people owning cats, for example. There is no evidence that SO users is a good sample of the developers population, in fact it makes sense that it is skewed towards young people that have lots of time to search and answer programming questions on the web.


Stack Overflow is geared towards developers and doesn't have anything I can see to attract a particular kind of developer, except the curious kind. I can't think of a place to get a better sampling of developers right off hand. Certainly, while Hacker News is popular among developers, it would be a less accurate sampling of developers than Stack Overflow, because people here tend to be attracted to startups.

I disagree with what you said in your first post, that it is completely wrong, and that no valid conclusion that can be taken from these numbers. It's not perfect but it's far from being a terrible sample. It's a general-interest developer site rather than a specific-interest one. There are all types of developers. In a reverse-sorted list of popular recent tags, there are c#, javascript, php, java, jquery, .net, and android. Also, developers didn't choose to be a part of this graph; they merely got put into the survey result because they had Stack Overflow accounts. If it had been a survey that was announced on twitter it would be biased towards people who want to take surveys. This would be a worse bias IMO than people who want to ask and answer technical questions.

I wouldn't take issue if you had said it was problematic, but instead you went straight to a one-sided conclusion.


What about developers that don't spend much time online? Or that don't find much personal joy in answering the questions of strangers?

Just because Stack Overflow might be one of the best places to get a sampling online doesn't mean it is actually a correct sampling of the population of software developers.


Those are good points and I agree that there are numerous reasons why it isn't a correct sample. What I'm trying to say is that it's hard to get a correct sample for developers and that if someone is curious about the subject matter it's worth keeping this data until some clearly superior data comes along. There are plenty of datasets that are worse than this, either by having fewer data points or by being even more biased.


Is there any other career where this would not be accepted as a given?


Teaching, from grade school through college professors.


Why would you say that? Teachers have an incredibly high attrition rate. Barely half of the new teacher classes make it 5 years.

The vast majority of the highly-regarded teachers I know are ones who have been doing it for 25 years or more.


The original topic was: "It's official: developers get better with age. And scarcer."

I was answering the posted question "Is there any other career where this would not be accepted as a given?".

So my post was that for teachers it is not accepted as a given that they get better and scarcer with age. I did not make a statement about whether that is true or not, just that it is not accepted as a given. In support of my view I would offer the numerous recent news stories calling for the end of the last-hired, first-fired rule when laying off public school teachers.


I see your point, but I'm not quite sure that the two things go hand in hand.

If the vast majority of developers were publicly financed and had LIFO laws, I'd imagine that there'd be calls to end that practice as well. In both cases, though, I'd argue it's outliers that are the source of the consternation.

I think most education reform advocates would agree teachers get better as they gain experience.


The title s/b "Stackoverflow developers get better with age. And scarcer."

Stackoverflow is not representative of the overall developer population. As an example of an alternate view compare the relative number of items tagged on SO with C#, Java, and PHP. Then compare that to the number of listings of those tags on Dice.


So why the downvote? If you disagree with what I wrote then offer a rebuttal. Simply downvoting is churlish.


The myth that engineers lose their game when they get older has always seemed a little off to me. It usually gets clumped in with the claim that all the software jobs are going to go to China. I have met a lot of very bad developers that were older, as well as very good ones. The same goes for young developers.


There's one huge hole in the analysis: the dataset.

The older someone is, the less likely they are to spend time on an internet community site, especially at work, which is when a lot of people access stackoverflow.

Old dudes work while they're at work, because they learned their work ethic in pre-internet times.


"On the graph we can see a textbook example of a bell distribution curve."

Is this an actual bell curve? It's not symmetrical. (http://en.wikipedia.org/wiki/Normal_distribution)


It's hard for anything dealing with age to be a true symmetrical bell curve - it usually winds up being cut short by old age & death, or foreshortened by, well, not existing. Notice that the ages start at 16. You'd have to be pretty precocious to give high quality answers at age 10 or 5.


Not hard, impossible. Bell curves go to infinity.

Edit: Shouldn't have said that. Everyone knows basic stats here and I was being pedantic.


Yeah, it is literally impossible for a real bell curve to exist, but if you want to be pedantic, that's also because people are clumpy. The bell curve might say there ought to be 0.00000067 excellent answerers at age 12, say, but reality insists on there being 0 or 1 excellent answerers aged 12.

Whenever we look at real data, we acknowledge that it's a discretized approximation to a bell curve and not a real (continuous) one; the point was that on top of the discrete approximation, we have the additional problem of anthropic biases - people not existing or dying at either end.


Seeing that 70% decline in number of developers (on SO) from age 30 to age 40 and assuming they are still working, what do the older "has-been" developers do now and what titles do they have? Perhaps they have never heard of SO? Or they have stopped needing and/or using SO? Statistics based on educational website usage seem to always skew younger.

Analogously, if I measured academic skills and availability by time spent in libraries and time spent teaching, I'm sure we'd see a similar peak in the 20s, because grad students spend so much time doing both of these things and productive professors need them less.


I hope it doesn't mean that older developers are more likely to be unemployed and therefore have more time to waste on SO.


The number of developers by age is likely inversely proportional to the growth of the industry. I'd also say some people get into development later in their lives - in their late 20s - because they started doing something else before making the jump: marketers, engineers, painters, physicists.


One could also find another, not so positive trend in the article:

The older the developer, the less curious they are <= the fewer questions they ask.

Of course you could say it's because the older developers are more experienced. But would that quality per se quell the thirst for new knowledge?

Just thinking aloud...


Alternatively, as you get better and deal with complex issues and have more sophisticated thoughts, you realize that SO can't answer the majority of your problems.

That's what I've learned about SO: it's fine for popcorn questions, but for the in-depth knowledge and discussions... meh.


That's a very fair point. Overall the deeper answers are probably more in the books or at conferences, not in blogs and online forums.


Older people have, on average, been using stackoverflow for longer. SO rep scales roughly linearly with time spent contributing.

This probably isn't enough to explain the entire effect shown in the graph, but in 10 years (if people still use SO), it would be.


I wonder why they didn't do the correlation of length of time on SO (i.e., since join date) vs. reputation. Did the older developers join SO earlier, on average? How many rep points per day did older vs. younger devs accumulate?


I'm a little surprised to see that they provide more answers, but that they aren't of better quality. I would have expected experience to show more.

Of course, we only know they are older, not that they have more years of experience.


I think it's one of those things where the real issue is whether you have an answer at all. Of those people that do have answers, I'd expect the quality to be similar (across a population).


A minor nit, I know, but a bell curve is symmetrical. So the graph shows some sort of fat-tailed distribution, and we cannot "see a textbook example of a bell distribution curve".


Another minor nit:

You can still do a fit to a bell curve and find a chi-squared value. If your chi-squared is horrible, obviously you should be considering a different probability density function, as your model is incorrect; but if it's decent enough, you can overlook the deviations and call it a "bell curve".

That being said, I could see this as a composition of two gaussian bell curves with the means correlating to the ages of people in college and their early/mid career (let's say 23 and 28 respectively)
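As a sketch of what that check looks like (the bin counts and bump parameters below are invented for illustration; a real analysis would fit the parameters by least squares rather than eyeballing them):

    public class BellCurveCheck {
        // One Gaussian "bump": amp * exp(-(x - mean)^2 / (2 * sigma^2))
        static double gaussian(double x, double amp, double mean, double sigma) {
            double z = (x - mean) / sigma;
            return amp * Math.exp(-0.5 * z * z);
        }

        public static void main(String[] args) {
            // Hypothetical user counts per age bucket, ages 20..40.
            int[] ages =        {20,  22,  24,  26,   28,  30,  32,  34,  36,  38,  40};
            double[] observed = {310, 520, 780, 940, 1010, 890, 700, 520, 360, 240, 150};

            // Model: two bumps, one for the college-age population and one
            // for the early/mid-career population.
            double chiSq = 0;
            for (int i = 0; i < ages.length; i++) {
                double expected = gaussian(ages[i], 520, 23, 3.0)    // "college" bump
                                + gaussian(ages[i], 720, 28, 4.5);   // "career" bump
                chiSq += Math.pow(observed[i] - expected, 2) / expected;
            }
            // A value much larger than the number of bins means a poor fit.
            System.out.printf("chi-squared = %.1f over %d bins%n", chiSq, ages.length);
        }
    }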


How did this post title come from that article? If anything, we old fogies are just more prolix - but not necessarily better. (Judging from his statistics, anyway.)


What if there's a filtering effect where lower-quality developers are less likely to stick with the profession past age 27?


My guess is that the older developers have more time to comment and follow up. Plenty of times I have looked for solutions on stackoverflow, but have been too busy to post responses because of looming deadlines. This is like saying my grandparents would make for better farmers because they spend their whole day on Farmville.


I don't think a 45-year-old developer has inherently more free time than a 25-year-old. Given that a 45-year-old is more likely to have a family than a 25-year-old, I'd expect them to have less free time.


It could be though that a 45-year-old has better perspective on what's important and what's not and therefore has better time management skills.

I know when I was first starting out as a developer I spent a lot of long days and weekends working but not really getting a whole lot accomplished. As I've matured, I've gotten better at picking out what's really important and am now much more productive while working fewer hours. I anticipate this trend will continue as I gain more experience.


While this is true, you also don't see many 45-year-olds doing 60 and 70 hour work weeks.


Yes, but presumably they aren't spending those extra 20-30 hours writing StackOverflow answers.


What's up with the 48 year olds?


There are 148 of them, I'd wager there are a couple with super-high stats who throw off the average.


Also note the significant 34 y.o. bump from the legendary Jon Skeet [1]

   [1] http://stackoverflow.com/users/22656/jon-skeet


Using a median instead of a mean might help w/ this.


Yes - median is a far better method to use in a situation like this, especially when what is being measured (reputation) has a bounded minimum and an unbounded maximum, and double especially when the sample size in each age bucket is relatively small.


If you look at the original linked doc, you'll see that there are three 68-year-olds who have, on average, 19,000+ points. That's why I only used in the graphs age groups that have at least 100 developers.


It's official: developers spend more time on stack overflow with age.


'63 was a good year!


Maybe it's because great developers end up in management positions... each day I write less and less code...


By looking at the Reputation by Age graph it looks more like there are just a few really good older developers. The rest probably are so out of touch they do not participate in developer communities such as Stack Overflow.

In my experience, older developers hold on to old coding habits that are today considered dangerous, and are reluctant to change.


The rest probably are so out of touch they do not participate in developer communities such as Stack Overflow.

If you're curious about the down-votes, it may be because the above quote is an example of the particular kind of bigotry commonly referred to as "ageism". For future reference, if you are ever in a management position, that kind of thing is actionable in a work environment. You could get your backside sued off, is what I mean, if you are found to have used that in a hiring decision, for example.


I'm old enough to feel like employers may sometimes find younger and cheaper people to do dumb stuff faster.

What you say about the HR liability potential agrees with what I've heard from HR people, but I've never seen it be a factor in reality. In practice, I can't imagine suing over a development gig. I can think of a million bad reasons people might not hire (or let go) a good developer (e.g. questions about manhole covers), but the "he's a curmudgeon who's reluctant to slap stuff together quickly" argument seems halfway legitimate.

So I say, bring it on, let's have an open discussion about who's smart, who's fast, who's wise, and when it even matters.


I agree with you. Experienced developers with skills are key for higher-level functions such as architecture and system planning, due simply to the "been-there" factor. My statement above was geared toward the "curmudgeon".


Outside of, say, medical software, or embedded automotive stuff, can you give me an example of a coding 'habit' which is dangerous?


Functions that return multiple types. Relying on private variables declared many levels up with no intention of checking their existence or value, or blindly sitting on particular record numbers in tables from some other process. Five different versions of code doing the same basic process in slightly different ways, depending on each developer's habits, all living in one system. I've worked with a lot of old code from self-taught engineers. You all may not like it, or my non-PC statement, but it exists and it's not pretty.


"I've worked with a lot of old code from self taught engineers."

Wait until you deal with code by university-educated engineers who follow the rules of software development to a T. Design is emergent. If you see 5 different functions doing the same thing, it sounds like you should go refactor them. Private variables that are unchecked? OMG, you might want to add a few asserts and a couple of tests. These problems sound insurmountable.

It sounds like you're dealing with production code that makes money, probably so much that they can afford to pay you to improve it.


That's what I'm doing: refactoring and redesigning to be safer, more modular, and more optimized. It's one part engineering and one part CSI sometimes. It makes things like code contracts really exciting.


How is any of that dangerous?

I think you have the wrong adjective.


What this post shows me is that older developers are less willing to ask questions, less willing to admit when they don't know something, less willing to do anything to fix it.

My personal experience is that older developers just don't get it. They haven't kept up with the exponential increases in productivity that we have had in the last 5 years. Things that used to take 2 days 10 years ago can be done in 2 minutes now, but they are still used to thinking that they did it quickly if they finish it in 2 days, so that's how long it takes them to do it.

Also, my personal experience is that older developers can't handle the asynchronous nature of modern communications very well. They always want to work on only one thing at a time and get confused/much slower if they have to work on multiple things, whereas younger developers will happily switch between tasks while waiting for a previous task to finish compiling/running, without problems.


They always want to work on only one thing at a time and get confused/ much slower if they have to work on multiple things

This is the case with everyone. Multitasking negatively affects overall performance. There are many studies that confirm that. If some developers avoid it, maybe it means they are better at evaluating their own productivity?


So how old are you? Do you have any coworkers below 25? How well do they do with multitasking? Better than you?

Twitter, text messages and facebook have trained our minds to work differently. Have you been keeping up?


> Twitter, text messages and facebook have trained our minds to work differently. Have you been keeping up?

Have you?

> In a much-cited 2009 paper* in Proceedings of the National Academy of Sciences, for example, Stanford's Eyal Ophir, Clifford Nass, and Anthony D. Wagner show that heavy media multitaskers demonstrate significantly less cognitive control than light multitaskers. The heavy multitaskers "have greater difficulty filtering out irrelevant stimuli from their environment" and are also less able to suppress irrelevant memories from intruding on their work. The heavy multitaskers were actually less efficient at switching between tasks - in other words, they were worse at multitasking.

* http://memorylab.stanford.edu/Publications/papers/OPH_PNAS09...


If you read the details of that paper, they are really not relevant to what I mean.

Let me be more specific.

So when I talk about the asynchronous nature of twitter/texting/facebook, what I mean is that there are people now (usually young people) who are comfortable carrying out multiple conversations with different people/groups of people that proceed at different rates. I am not suggesting that someone who is frequently using twitter/texting/facebook while at work would be more productive than someone who doesn't, but that this experience helps them manage multiple workflows better.

From my own experience, when I talk about multi-tasking as far as it relates to a developer anyway, let me give you an example of something that I can do all the time: I get assigned a bug, I look at the bug board to see what other similar or related bugs there might be, and I assign them to me. Usually this means that some or all of the steps to reproduce the bug are the same, so that if I have to step into the debugger to identify the problem I can set breakpoints in places that should help me figure out more than one bug at a time. In the middle of this, a co-worker sends me an instant message asking for something. I don't immediately know the answer, but I know the general area of the code to look for the answer, so I dig around for a few minutes and then either reply with what was asked or a "I don't know but xyz worked with that code and might be able to help you better." Then I go back to my debugging. I find the bug, or I find a clue that will lead me to the bug, and I write some code, deploy it on the test server, and start a test, which I know will take 20 minutes or so. In the meantime I might reply to some e-mail, do some code review to see what might be refactored to be more readable and/or maintainable, or work on another set of bugs. Then when the earlier test completes I go back to check on it. I might not immediately go back to it after 20 minutes depending on where I am with my other tasks, I would probably find a good natural stopping point first, but the point is at the end of the day I am able to finish all these tasks much faster than if I did them one at a time sequentially.

I don't think what I just described takes particular mental prowess, and most of the younger people at my work (and a few of the older ones too) do the same. But enough older workers get hopelessly lost if you ask them to do more than one thing at a time - while a younger co-worker asked to do the same has no problem - that I have noticed it.


It's an interesting claim. You could contribute positively to the discussion by providing some sort of experiment to test it.

Would it be fair to say you believe that your high level of interaction with various social media sites and technologies has 'trained' your mind's agility? If so, several useful questions arise which I'd love to know the answers to:

1) Can you produce more lines of production code over unit time?

If you use a source repository you might be able to analyze this by check-ins. I suspect a large open source project like KDE or Hadoop, where you could correlate the 'agility' of committers with their commits, could shed some light here.

2) Do the designs and implementations produced have similar, better, or worse levels of quality than designs and implementations done by 'less agile' developers?

I'd probably track bug reports and rewrites against lines of code committed.

3) Does the scale of the problem change the effectiveness ratio? Which is to say, if you're coding/designing/implementing at the top level of a big project vs. at the fringe, does the difference between people trained by social media exposure persist?

Basically, correlating the above two data points across all levels of the code and design.
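To make question 1 concrete, here is the sort of rough, untested sketch I had in mind (in Python; running it from inside a local git checkout and bucketing by ISO week are my own arbitrary choices):

    import subprocess
    from collections import defaultdict

    # Tally lines added per (author, ISO week) from `git log`.
    # Run inside a local checkout of the project being studied.
    out = subprocess.run(
        ["git", "log", "--numstat",
         "--date=format:%G-W%V", "--pretty=format:%x00%an%x00%ad"],
        capture_output=True, text=True, check=True,
    ).stdout

    added = defaultdict(int)
    author = week = None
    for line in out.splitlines():
        if line.startswith("\x00"):        # commit header: \x00author\x00date
            _, author, week = line.split("\x00")
        elif line and line[0].isdigit():   # numstat: "<added>\t<deleted>\t<path>"
            added[(author, week)] += int(line.split("\t")[0])

    for (who, when), n in sorted(added.items()):
        print(when, who, n)

You'd still have to classify committers by 'agility' somehow, and (as the reply below points out) raw line counts are a crude proxy for productivity at best.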


I think it's an open question how to measure developer productivity objectively. We don't really have a good way to answer this question right now, but lines of code checked in is definitely one of the worst. Checking in a 1k-line class with lots of dependencies, a slow algorithm, and difficult-to-follow code that easily breaks when other code is changed would not be more productive than checking in a 10-line method that performs the same functionality.

But I see where you are going there.

I mean, come on: if you are comparing a fresh out-of-school graduate with someone with 10 years of experience, I think most people will agree that the older developer will be more productive. There are things you need to know about working with a large code base that can't be taught in school and can only be learnt with experience.

When I say "older" developer, I mean the 40-50 year old who probably was a really good developer 10 years ago, got a steady, cushy job, with a salary that he/she is more than happy with, and stopped learning because he didn't need to anymore.

I know a lot of exceptions to the rule. The older programmer who got into it because he loves to code, who stayed in it because he loves to code, he keeps up with the times and continues to be relevant. The older programmer who just wants to make a buck and go home to his family? He fell behind a long time ago and doesn't want to catch up.


"When I say "older" developer, I mean the 40-50 year old who probably was a really good developer 10 years ago, got a steady, cushy job, with a salary that he/she is more than happy with, and stopped learning because he didn't need to anymore."

Which I'd agree with, but doesn't that eviscerate your thesis about the new mind-training regimen? After all, the older developer who kept learning still missed out on the training you got from social media.

I think everyone here knows 'bad' developers. I was astonished at the number of people I knew who responded to the question "Why computer science?" with "I hear it pays well."[1] I suspect those folks stop being developers as quickly as they can and move into management (since it has a higher pay cap). So whether you are 20-something and programming by 'cut-n-paste' or 40-something and 'retired-in-grade', the failure mode is the same.

It's wrong to generalize, and generalizing an opinion based on race, color, religion, sex, or age is often a prelude to discrimination.

I'd love to get better tools and insights into developer productivity. I think it could be a useful differentiating factor on a source code control system.

[1] This contrasts with the people who respond "What? They'll pay me to do this? Cool."


In reference to your comment on discrimination.

http://www.paulgraham.com/say.html

Reading my original post, I think you are right; I should have qualified myself better. I was just really upset by the tone of the OP. IMO the reason why many older developers are worse than younger developers is because they THINK they are better than the younger developers. They don't have the hunger to improve themselves anymore, and so they don't.


Personally, if Paul ever does a 'greatest hits' list of his essays, I would vote to include that one. I'm interested in your reference to it, though; can you say more about how it relates to this discussion?

Are you suggesting that discrimination is one of the 'fashions' that Paul refers to?

It's statements like this:

"IMO the reason why many older developers are worse than younger developers is because they THINK they are better than the younger developers."

Which causes me to wonder. We could certainly debate the results of a study that polled a few thousand developers between the ages of 20 and 60 and asked them to evaluate themselves against developers older than, the same age as, or younger than them. Except we don't have that study; do you know of one?

Paul's essay is a good one on open-mindedness, and it gives great examples of how people can overturn or distance themselves from group-think by seeking out the unthinkable.

I certainly cannot claim to know what you are thinking, but it reads like you think labelling older developers as lazy, self-deluded parasites is an example of giving voice to something that is 'true' but 'unsayable' because of some sense of societal impropriety.

I can't really comment on whether or not it's 'true', because I've not seen any process where that question has been analyzed. The data from the Stack Overflow study says that people who self-report as older on Stack Overflow give more answers and have higher karma as a result. I didn't see anything in the data that would support a conclusion that these folks are making value judgements about their younger peers, or that they no longer wish to improve.

"I was just really upset by the tone of the OP."

Paul wrote in his essay:

"The prohibition will be strongest when the group is nervous. The irony of Galileo's situation was that he got in trouble for repeating Copernicus's ideas. Copernicus himself didn't."

I guess what I'm trying to figure out is what you're trying to say. Are you threatened by the idea that someone "older", with more experience than you, is probably "better" than you are by some definition? Or are you arguing that your youth and mad social skillz have permanently elevated you above the skill set of the people who came before you?

I'm not critical of either view, I'm just trying to understand the data that leads to it. I liked the analysis of the Stack Overflow answers because it was data + analysis. As someone who is always looking to hire top talent, understanding effectiveness is something that helps me do my job better.


I am saying that the older programmers who are downvoting me are feeling nervous.

"I didn't see anything in the data that would support a conclusion that these folks are making value judgements about their younger peers or that they no longer wish to improve."

That's not from the report, just my personal unhappiness bubbling up. I entered the workforce expecting to learn how to be a better programmer from my more experienced co-workers. It was a bit traumatic to realize that they didn't even understand basic things like why using a hashmap is better than a nested for-loop or why one of the basic OOP principles is "favor composition over inheritance."
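To make the hashmap example concrete, here's a toy sketch with made-up data: matching ids across two lists of sizes n and m costs O(n*m) comparisons with a nested loop, but O(n + m) on average after building a set first.

    # Find ids that appear in both lists.
    def common_nested(a, b):
        # Nested loop: O(n*m) comparisons.
        return [x for x in a if any(x == y for y in b)]

    def common_hashed(a, b):
        # Hash-based lookup: O(n + m) on average.
        seen = set(b)
        return [x for x in a if x in seen]

    print(common_nested([1, 2, 3], [2, 3, 4]))  # [2, 3]
    print(common_hashed([1, 2, 3], [2, 3, 4]))  # [2, 3]

Both return the same answer; only one of them survives large inputs.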

Since you mention data so much, I am curious: have you ever read "Fooled by Randomness" by Nassim Taleb?


"I am saying that the older programmers who are downvoting me are feeling nervous."

That is an interesting conclusion.

"Since you mention data so much, I am curious, have you ever read "Fool of Randomness" by Nassim Taleb?"

No, though many folks have suggested that Gladwell's article on him [1] covered all the bases in the book, so I did take a moment to read that article.

I think he brought a fascinating perspective to stock trading, and his discipline in removing confirmation bias from his observations seems to work for him. Since you brought it up, have you read the book?

It seemed from Gladwell's article that Nassim would argue, in this context, that age and experience don't matter to the quality of designs and implementations; rather, some folks will simply arrive independently at a better answer to any given problem than others.

Gladwell states that Nassim is/was a quant focused on derivatives (the stock market), a system in which randomness appears to dominate. I have not read the book, but Nathan Berg (UTexas) takes this a bit further by showing that 1/N diversification wins (see 'Simple Heresy' by Bruce Bower, Science News, 4-Jun-2011, pp. 26-29). It read like a pretty solid endorsement of Nassim's take on the behavior of markets.

What I did not get out of Gladwell's article was that Nassim was promoting an 'ignore all data' philosophy. It sounds like you've come to a different conclusion than I did about what constitutes a useful experimental result.

If it helps I'm sorry you've had to deal with some less than helpful people in your career so far.

[1] http://www.gladwell.com/2002/2002_04_29_a_blowingup.htm


I really urge you to read the book; the first half is a bit slow, but I couldn't put the second half down.

Here's a more succinct way to get (one of) his points across:

http://xkcd.com/605/

My emotional outburst is certainly non-scientific and should not be taken seriously. But it is equally non-scientific to be blinded by data. A good example would be rewarding your programmers based on how many lines of code they check in -- you will find yourself regretting that quite quickly.


In my case, they've helped to destroy my ability to focus (or provided interruptions and excuses not to develop it).


"They always want to work on only one thing at a time and get confused/ much slower if they have to work on multiple things, whereas younger developers will happily be able to switch in between tasks while waiting for the previous task to finish compiling/running without problems."

The younger developers get confused/much slower when they work on multiple things too; they just don't realize it. I'm pretty sure there is experimental data showing that everyone performs worse when they switch tasks often.

I actually just watched The Social Network for the first time a couple of nights ago, and one of the things I appreciated most about the movie is how they would yell at people who tried to interrupt someone who was coding with a question ("Don't interrupt Chris, he's in the zone!"). I suspect that attitude had a lot to do with the productivity needed to get Facebook off the ground, and they were certainly all young people.

(Or maybe the writers of The Social Network just made it all up.)


When I set out to write that post, I was trying to base it on some real data, not personal experience.


Your layout is nearly illegible in Google Chrome on Ubuntu 11.04. The "sky and clouds" background image + light blue text... why? Am I doing something wrong? Did I miss something?


I have Google Chrome 10 on OSX and everything is fine. I won't say great, but fine. And the background is snow. Text is grey.


http://i.imgur.com/NDKrT.jpg is how it looks from here. Really not trying to be a dick, but that's illegible.

edit: Disregard, your background image http://blogblog.com/1kt/travel/bg_black_70.png is blocked by firewall @ work.


They haven't kept up with the exponential increases in productivity that we have had in the last 5 years.

I would love to know where you've seen exponential increases in productivity in the last five years. In my experience, developer technology has been largely chasing its own tail for quite some time. Productivity improves dramatically on individual, emerging platforms, but seems little changed in a wider, overall view.



