On Getting Older in Tech (corgibytes.com)
499 points by mindfulgeek on Dec 9, 2016 | 417 comments



I'm going to apologize ahead of time. This might be a ramble.

63 year old white guy with little hair and a lifelong beard that is now white. A tad overweight as well. I feel for so many people expressing angst about ageism. I've seen it elsewhere but not where I work now.

I suspect that the faster-growing companies, and companies in tech centers, mostly on the coasts, see more pronounced ageism.

My last job was at a bank in Richmond VA and there was clear ageism in IT when I got there. I moved from IT to compliance for two years, and I was very successful and never observed ageism.

At 53 I moved to a web development manager and developer role in Higher Ed. I took a gigantic pay cut, if you factor in bonuses and options, to work in Higher Ed. But I got to send my oldest to college at a selective school for free. A $200K after-tax benefit.

I don't look back. My life is so much better now with a 40-hour work week and being 100% in control of how we architect our web and backend ecosystem. I spend more than 40 hours because I love learning, but I choose when, where and what I learn and work on after 40 hours.

I read these comments from people that are 36, 40, 40+ and shake my head. That is not old. 63 is not even old. I plan on working till I am at least 67. I love my job and I especially love the people I work with.

I find these days I spend less time coding and I end up with better applications because I think through the design before coding.

I'm going to ask around in Richmond VA and see what ageism exists in industry here and report back.

Keep learning, try as hard as you can to stay in shape and engage in critical thinking. Good luck to each of you in staying employed and staying happy.


"I find these days I spend less time coding and I end up with better applications because I think through the design before coding."

^A golden nugget buried in an insightful ramble.


And you don't have to be a certain age to learn that lesson and put it into practice.


Absolutely right. In my job, however, I find that my seniority gives me more time to think. It depends on your circumstances, the work culture, and your relationship with your boss.

I report to the CIO and he has little need to know details. He cares about long term progress and the big picture. He could care less about how we get there as long as we are taking care of people and doing it honestly.


Totally tongue in cheek, but you're never too old to learn that you should have written "He couldn't care less about how we get there..."


Definitely, but the key is to actually THINK and reason about it, not just worry/stress about it!

The great irony is that I've seen teams try to be agile, which they seem to think means code > design. It often ends up poorly.


Thanks buddy! 50+ here and optimistic I can remain gainfully employed, but nervous about it. It's been my experience as well that the kind of companies you might see parodied on "Silicon Valley" have a more pronounced age bias, but there are lots of other places that need IT/web and have no such issues.

I was in a room full of developers when I read aloud the story about the guy who started oldgeekjobs.com, who defined old geeks as over 35. The room erupted with the loudest "F U" I've ever heard.


Yup. I was like no Fing way when I read that one. Keep up the optimism.


lol


Was it a paycut or really just electing to convert less time into money?

So many salaries look and sound impressive until you actually do the math on what it costs you.


It was both. With bonuses and options + salary I took about a 50% pay cut. My hours went from an average of maybe 70/week to 40/week. My kids, now grown, thought I was a grump. Now they enjoy me being around. Same for my wife. Most importantly, I was having health issues that fewer hours and less stress allowed me to get under control.

As I said, I have never looked back. I don't miss the money because I don't miss the stress.


It's called compensation for a reason. Sounds like you were compensated for missing out on your family and for trading years of healthy life. Not so tempting when phrased like this, but we often only realise after it becomes "normal".

Also, for many there is not much choice. It sounds like you had a top-5% job. Could you exist well without the situation your former life gave you, such as owning your own home and not paying rent? Or would you be forced back into a bad family life to provide for that family?

EDIT: I'm having to reply here as I've hit the fake HN "you're submitting too fast" after doing 2 or 3 posts. The truculent censorship on this site when one hits specific topics sucks.

epalmer: Ok, good for you, and congratulations in finding your equilibrium!


I would have been fine. I've owned two small businesses that paid well at the time. I worked a lot of hours but I had fun with those companies.


I'm in Richmond VA as well and in my late 30s.

I do worry about aging out of this industry and my long-term plan is pretty similar to the path you have taken. My hope is that I can participate in the agency/startup ecosystem for another 10 years before I have to find something else to do. Hopefully that will still be in technology, and will probably be in Higher Ed or at a Nonprofit, for less money but more intrinsic rewards.

I haven't observed specific instances of ageism towards me or others (people being passed over for promotions or treated differently than younger employees), but I also haven't worked with many people 50+ since I started in this sector. I'm not sure how much of that is that people in that age group are looking for a more balanced lifestyle and how much comes from the companies that do the hiring.


> I find these days I spend less time coding and I end up with better applications because I think through the design before coding.

At the risk of going off-topic; every new/kinda-new/wish they were new engineer can learn volumes from that single statement.


How do you learn a volume from a single statement? It's a good thing to think about: design up front is a tool for increasing quality. But the whole field of agile design is questioning the universality of that. The "volumes" are filled with all of the little things that tell you what is worth thinking about and what isn't. They contain many more statements on the subject.


>I read these comments from people that are 36, 40, 40+ and shake my head.

I'm 47; maybe there's hope for me yet.


> I read these comments from people that are 36, 40, 40+ and shake my head. That is not old. 63 is not even old.

Thank you for this. You made my day. I just entered my 20s and am constantly in fear of ageism later in my career.


48 yo here, started coding when I was 14, so that's 34 years of building things.

Here's where I'm at.

I look back at my career and I can tell you about great projects that I got to be part of, awards and plaudits that I won, big paybacks from projects that went well and literally saved the company. Those are all nice war stories to have.

But I can't point to any of it and say, "I made that" because - and here's the kicker - it's all gone.

Software is ephemeral. One day your client does an upgrade, and then the thing that you spent years building and curating like a baby disappears. It isn't mothballed and put in the basement where visitors can walk by and see it. There's no photo of you standing by the thing that you can hang in the hallway and see every day. Your creation just completely vanishes without a trace.

All those years I've also been a musician and recording engineer. I've made a few dozen records, none of which amount to anything that anyone else would care about. And all told, I'm sure I earned more money in one year of my IT work than in my entire music career.

However, here is a collection of my work that I can point to and say, "I made that." It's a creation that I can reflect on years and years down the road.

I take much more satisfaction in my musical creations than from my software creations, even though I was much more famous and valued as a software architect.


Torvalds, 46, can point to Linux and say "I made that, over half a lifetime ago".

Stallman, 63, can point to GNU Emacs or GCC and say "I made that in the 1980s". Not necessarily the most recent version of it, but that hardly matters.

Gerald Sussman and Guy Steele can point to Scheme and say, "we made that".

John McCarthy was able to point to Lisp and say "I made that" right up to the day he died, and we can continue to say it for him.

Make the right stuff; then you can bask in it for longer and be a kind of living saint to a few generations after you.


That's like saying da Vinci is representative of artists or Prince is representative of musicians. The average person who doesn't achieve software legend status will see their creations vanish as the previous commenter described.

I've worked in this industry for a long time as well, but I'm not someone you've heard of. I too have built some very cool things that I'm proud of, a number of which no longer exist. Meanwhile, a shed that I built myself 30 years ago still stands, and I can still point to that and say I made it.


If you make something when you're 20, and just keep maintaining it until you die at the TTY prompt at 85, then all your life you were able to point to it and say "I made that (and am still making it better)". This is the case even if that work isn't well known. Perhaps nobody else will point to it for you after you're gone, but while you are here, you can say that.

Things you hacked up in the past are gone because they solved a narrowly defined problem which no longer exists, and even before that happened, you already abandoned those programs.

That this happens is almost inevitable, as part of making a living. All those programming da Vincis who are known for something also worked on lots of things that are now dust.


Why are you concerned about the longevity of your creations? Surely impact matters more than how long they exist? Your software might only last a few years, but if it's used thousands of times, that changes the world more than another record that's heard a few times a year but lasts decades.


Dude. Open source software.


Even jQuery, one of the most famous open source projects, will be forgotten in 10 years or even less.


Nope, it'll live on, and the code will be available for all to see and study, which means it'll end up as a case study or research material in a textbook or a Master's or PhD thesis.

One day someone may try to make an emulator for IE5 and ES4 just to run jQuery (like we've seen done with the 6502 and C64).


jQuery is _still_ in html5 boilerplate, which I consider to be a good simple starting point for content-based websites. https://html5boilerplate.com/


But it seems jQuery won't be in HTML 9 Boilerstrap, so I don't know how long it will live: http://html9responsiveboilerstrapjs.com


I disagree. Software with that level of popularity doesn't disappear.


"Unsurprisingly I now use React for most of my coding instead." - John Resig, creator of jQuery

https://twitter.com/jeresig/status/726058698989277185


Heck, even software with the equivalent level of unpopularity (PHP) doesn't disappear.


That can indeed be part of a legacy, but many companies are not comfortable with open sourcing their proprietary software. A lot of exciting work happens in that space as well.


Impermanence. Everything goes away. As software developers we often just get to see the whole lifespan of a project more often. Not many other professions get the privilege of building something from nothing, watching it grow and change, then get to slowly pull it down and retire it.


I'm 46, and I totally get this. My other thing is I'm a painter (though I make all my money from software).

I imagine when I kick the bucket all my software will be long gone, but I will leave a bunch of artwork that will survive for anything from days to centuries, mostly on its own merits.


I also took up painting. 10 years of writing software, and most of it was replaced or deleted. Weekends of learning a new tech that is obsolete in 5 years are gone too.

On the other hand my paintings will stay forever on my walls and my children's walls.


I like this aspect of software development. I've not seen the permanent and ephemeral aspects of things so cleanly expressed in any other field. The fundamentals don't change, but the software artifacts expressing those fundamentals do. Not sure why I enjoy that dichotomy, but it tickles my brain the right way.


I don't find this post inspiring, I find it sad.

It's partly self-encouragement, partly PR. The fact that it even exists is proof that the author is facing some issues, no matter how confident they would like to appear.

That recipe to stay current looks tiresome. Listen to two podcasts, two webcasts, subscribe to four magazines, teach courses, go to one conference per year, blog regularly, read blogs, follow the latest web trends. Your reward: you are still employable.

And why is it that most older people answering in these threads are so passionate about learning, about new technologies, and about the latest and greatest JavaScript frameworks? Do they really enjoy having such ephemeral knowledge and basically competing with anyone who's finished a bootcamp, or not even that?

It all seems fake. Like they're trying to put on a brave face while at the same time being scared and trying to convince themselves that all this new and shiny tech that they work with is awesome.

Why not have an honest conversation instead of pretending that learning some thing or another will make everything ok in the end?


Nah. The fact that this post exists is evidence that ageism is a thing.

What makes you think older people are passionate about the latest and greatest JavaScript frameworks? That is not my experience at all. New frameworks are just the same old stuff in a new wrapper. We've seen it all several times over. Like, backprop was a cool thing in the late 80s.

I, too, find this post sad, but probably not for the same reasons you did. This post, that makes me and you sad, doesn't make it any less of a fact that the young people in this industry think you are a useless dinosaur if you don't know what the new hot thing is about.


> Nah. The fact that this post exists is evidence that ageism is a thing.

Indeed, ageism exists and it's a form of discrimination. There are societies (e.g. Japanese, Korean, Central/Southern European) where ageism works the other way around, at the expense of younger people.

In both cases it's still discrimination and we should acknowledge the problem and fight it instead of trying to look/act younger or older.


It's tough to communicate this accurately. I've noticed a trend on HN of engineers who self-evaluate as old saying things along the lines of "I am having more fun than ever learning about the cloud/TypeScript/webfoo". Several people on all these "old in tech" threads.

I have the feeling these posts are not at all representative of the domain, and neither are the proposed solutions to just learn more, but it's all I've seen so far.

More of the same thing is not bringing us any closer to solving this very real issue.


OK, that's probably where our experiences differ. In my experience, the more experienced techies see that there's nothing really new in the new technologies. Many realize that to stay current, they must learn the new tech. But rarely do they really love it, because it's really a step sideways rather than forward.

I suppose it's a form of competition that forces new technologies to pop up all the time. I just wish people would take a really close look at what's already there before spending their prime bestowing the world with yet-another-framework or language.

By the way, I'm 40. I don't think I'm old. I think Scheme is superior to JavaScript.


> think you are a useless dinosaur if you don't know what the new hot thing is about.

Welcome to the technology industry.


I realized this way back in the mid 1990s when suddenly everyone wanted a "webmaster" with 10 years of experience, when suddenly HTML - a document markup language - became the most important "programming language" one could know.

Since then I have studiously avoided specializing in any technology. As soon as I feel like I've spent enough time in a particular stack to start to "know" it, I move on. I have refused to be pigeonholed into any particular tech.

The coding/language skills are the very least important skills I have, and I intend to keep it that way. However, I can present an extremely long laundry list of technologies that I have built solutions with - the length of the list, not the presence of any particular TLA on it, is the key to demonstrating my learning ability.


Hi! I've been here since the late 80s. What do you want to learn?


Tell me how you've survived in technology without learning new technologies!


Not sure why you think I haven't learned new technologies.

I have learned them whenever I needed to. Most of the new technologies are not that special. That makes them easy to learn, but, also, kind of annoying because I can see that they're just repeating a mistake I've seen 20 years ago already.


I obviously don't think you haven't learned new technologies in 25+ years. It seemed like the claim to ageism was that the kiddies expect you to know the hot, new technology. Requiring knowledge of new technologies for candidates isn't exclusive to veterans. And it's definitely not a requirement of the other 90% of companies that are using older technologies.


I thought you thought that, because you said it. Perhaps I missed a nuance, English is not my strongest language.

My point, going up this discussion thread a few clicks, was that new technology is not always better than the old. Ageism comes into play when one's opinion about the new tech is dismissed just because one has some gray hair.


> Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

I'm not old, I've just passed "il mezzo del cammin della mia vita" (meaning I'm 36), and I have to say that the latest and shiniest JavaScript frameworks are pure and utter shit, and I say that as a guy who finds languages like PHP decent enough. I've started working on a project where the "npm install" step took an hour and a half and generated 260,000 files (?!?!?). One of the countless modules in there is called Chevron (?!?!), with a capital "C", and I sincerely don't know what it does or what it's supposed to do (to be clear, Chevron has no relationship with said project). It's just baffling.


Heh. I'm right there with you. I've not done web development in over a decade. Yesterday, I decided to look into React. To generate a hello world took over a minute of pulling down and building dependencies. Things are bizarre and inane now. All I want is a .js file to link in my HTML and for that .js to be editable.


Almost every ageism post I've read on HN ignores the hundreds of thousands of other jobs that exist outside of [hot SV tech companies] (avg age at FB is 28? no way!), and also doesn't mention those people who are right of the bell curve and working at companies like that (because their experience actually is valuable).

Some of these posts are real ageism claims, but most are "I'm old, I'm scared, how am I going to survive with a highly valuable skillset?" It's fear mongering. If you're concerned with job security, go find a job with security in government or in some monolithic non-tech company based in Go-Fuck-Yourself, GA. The "Hi, Fellow Kids" hipster-posing BS isn't going to land you a job at Facebook no matter how old you are.


> I'm old, I'm scared ... some monolithic non-tech company

Wait, your solution to the problem of ageism in tech is "get out of tech, old fogey"??


Not so much "get out of tech" as it is "stay in tech, but not at a company that specializes in tech".

Non-tech companies need specialized software, too. When I was in college, I knew multiple people who had internships working on internal tools for a major bank. I imagine those banks also employ senior people to work on their internal tools, customer-facing web portals, etc.

And it's not just banks. I once interviewed for a job working on e-commerce stuff for Neiman Marcus. I'd imagine that other retailers like Walmart, Target, etc. need people working on their portals.

And if you want to work with technology but get out of programming, everyone needs IT.


> Not so much "get out of tech" as it is "stay in tech, but not at a company that specializes in tech".

That's not exactly a solution tho', is it? Why go to a company where you are a cost centre just because you reach a certain age?


They tend to be more mature environments, free of the "brogrammer" culture that's hostile to older people, women, LGBT people, etc. There's no attitude of "let's have a cool, hip office full of cool, hip young people". It's an environment where "culture fit" isn't used as an excuse to discriminate against older people.

It turns out that more corporate environments are actually friendlier to marginalized groups than a quirky, freewheeling startup.

Job security is great. Big, established juggernauts don't have the kind of churn startups have... there's no worry about "what if the VCs don't go for another round of funding?", and the markets are well-established and slow to change. And if you go into defense contracting or public sector, you might even have lifetime employment.

The work environment is probably going to be nicer. Traditional corporations don't do open offices and don't require engineers to work 60+ hour weeks. Some of us would prefer to do 9-5 in our own cubicle. Banks are also especially generous with PTO (and remember that the "unlimited" PTO you get at startups is a scam)... I'm just going to quote a friend of mine on Facebook when I decided to post a general question of "how much PTO do you get?":

> I used to work for a bank, and they're notorious for giving tons of time, but when in my first position, I had 2 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days. When I was rehired further up the food chain, I got 4 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days, and I could buy an extra week off by lopping a week of pay off my annual salary. If I'd stayed longer, climbed more, I could max out at 8 weeks paid vacation with all the rest of it.

Not all of us care about doing interesting or ground-breaking work. We just want to stay employed so we can fund our lives, and we want a work environment that doesn't make us hate ourselves and want to die.

Honestly, I'm pretty happy at my employer -- we're a tech company, but the environment is very corporate (we're a telecom), and it doesn't feel like a startup at all. The work environment is highly praised, we're ranked as one of the top work environments on Glassdoor, and half of my team are graybeards. I don't want to leave here, but if it ends up happening anyway, I'm giving serious thoughts to pursuing public sector work after this.


I think he's trying to say that ageism isn't as big a deal as people make it out to be (outside of SV's alternate reality) - everyone in the industry deals with the constantly shifting landscape that engenders feelings of job insecurity for programmers young or old. It's just tougher on older techies who haven't carved out their niche. If you're trying to fit in and compete with fresh bootcamp grads working new frameworks at age 60, you're probably due for retirement - there's a reason the traditional career path for senior engineers was into engineering management. Either management, or something you're really good at that isn't the hippest new tech but is still in high demand.


I really dislike this constant caveat that people keep placing against SV. "Well, it isn't a problem outside of SV!" I keep seeing this appear in comment after comment on this article.


I guess you're allowed to dislike a fact, but any particular reason you dislike this particular fact?


Because it isn't a fact? There's definitely an age bias outside of Silicon Valley as well.


That too, but also, a lot of us outside the valley have to deal with the unfortunate consequence of everyone looking to SV as the "trendsetters", the ones who set the "meta" for "modern" software development, if you will, especially since most major conferences are out in the SF/SV area. At my own company, directors and above take frequent trips out west.

What this all boils down to is: if SF/SV is going to be seen as the beacon of software development, it is almost worse to me if ageism is being exemplified there. To give a rather rough corollary, consider if Washington D.C. never hired another underrepresented group, such as women or minorities. It's sort of like, "well, maybe I halfway, sadly, expect that to happen in some small town somewhere, but c'mon! D.C.! Everyone's looking to you!" Same kind of thing.


Maybe, but maybe it's less pronounced? It's hard for me to say, as I've only ever not lived/worked in SV (hello from North Carolina). But FWIW, I am 43, and I don't feel like ageism has been a problem for me. I just went through a job search and had no problem landing a new gig in short order.

OTOH, to be fair, I am obsessive about learning new stuff, and I've been working with some "trendy" stuff the past few years (all big-data, hadoop, storm, kafka, etc. stuff) and I've been doing a lot of machine learning / data science MOOCs over the past year or so. So my skills are a good match for what there's demand for. But that would be valuable if I was 20, 30, or 80.


Agreed. Deliberately ignoring the whole world outside FB/Google is an alarming sign that the idea is just to get blog reads by piggybacking on the latest bombastic statement by someone famous in the software industry.


> ignoring the whole world outside FB/Google

The whole world outside of GAFA looks to those companies for "how to do software development right" - no matter if they are right or not.

I've watched more than a few well established companies start to ape Google's hiring practices, or Amazon's churn, or Facebook's development practices just to try and attract new young developers.


I've been following the technology my whole career. I am still employable, but it gets old.

My advice to younger devs is to focus on general computer science fundamentals and application development skills. Those are the only skills that stay with you and grow as your career advances. The technologies always change so don't memorize them. Keep a reference around instead. Memorize things like design patterns and sorting algorithms.


Sorting algorithms, are you serious? No one uses those in their career.


Just to prevent rampant over-generalization in an article whose subject is the topic of bias: We've implemented various special-purpose sorts at least three times in the last five years. Here's one example: https://github.com/efficient/cuckoofilter/blob/master/src/pa...

Yes, this is all in a very high-performance (sometimes insanely so) context, but it does happen. Most of them were like this - unrolled special-purpose versions derived from a sorting network. Some were for GPU.
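For readers who haven't run into the term: a sorting network is a fixed, data-independent sequence of compare-exchange steps, which is what makes unrolled versions suit hot loops and GPUs. A minimal sketch in Python for readability (this is not the linked code, just the standard 5-comparator network for exactly 4 elements):

```python
def compare_exchange(a, i, j):
    # Swap a[i] and a[j] if they are out of order.
    if a[i] > a[j]:
        a[i], a[j] = a[j], a[i]

def sort4(a):
    # Optimal sorting network for 4 elements: the comparison
    # sequence is fixed in advance, independent of the data.
    for i, j in [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]:
        compare_exchange(a, i, j)
    return a
```

In C or on a GPU the loop would be unrolled into five branchless min/max pairs, which is roughly what the special-purpose versions mentioned above do.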


While this is for the most part true, questions regarding sorting algorithms and data structures (debug this left-rotate function that operates on binary trees) still come up during interviews.
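For concreteness, here is a sketch of the kind of left-rotate those interview questions ask about, using a hypothetical minimal Node class (names are illustrative, not from any particular interview):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def rotate_left(x):
    # Rotate the subtree rooted at x to the left, returning the
    # new subtree root. x's right child y moves up; y's former
    # left subtree becomes x's right subtree. This is the basic
    # rebalancing step in e.g. red-black and AVL trees.
    y = x.right
    x.right = y.left
    y.left = x
    return y
```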


I mostly agree, however there are exceptions that prove the rule. Some engineers are working on standard libraries. (someone has to write that code!) Some people are taking advantage of their data set to write special-purpose sorting algorithms that blow generic algorithms out of the water. (guilty!) Some people are putting stuff together in interesting ways that requires them to understand and sometimes even re-implement the standard algorithms to take advantage of internal data structures or other interesting effects. (It's bad practice, but knowing the order a map will iterate things in can be helpful if you don't expect it to change -- and make sure to have a unit test that proves it!)
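As a toy illustration of exploiting knowledge of the data set (a hedged sketch, not anyone's production code): if you know every key is a small non-negative integer, a counting sort runs in linear time, which no generic comparison sort can match.

```python
def counting_sort(keys, max_key):
    # O(n + max_key). The assumption that every key is an int in
    # [0, max_key] is exactly the kind of data-set knowledge that
    # lets a special-purpose sort beat a generic O(n log n) one.
    counts = [0] * (max_key + 1)
    for k in keys:
        counts[k] += 1
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)
    return out
```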


You need them for those whiteboarding tech interviews.


I misread that as "waterboarding tech interviews", and I thought it was commentary on how whiteboard interviews feel like torture.


> and the latest and greatest javascript frameworks

There are two kinds of learning in play here: learning how to do new things, and learning how to do the same things a slightly different way. Learning a new JS framework is definitely the latter, when 99% of the time the end goal is a CRUD application that could have been done with anything from the last 20-odd years. Because that is all the vast majority of apps are at the end of the day...

The former is what older people should be doing, leveraging the experience gained as a springboard.


I agree - I started at 38 (41 now) and work on a team as an iOS dev where we have mobile, backend and machine learning engineers - ALL - under 30. My interview was blind, so no one knew I was old - or black. They couldn't discriminate if they wanted to. That said, I couldn't imagine using all the tools on this list to stay relevant. I find reading HN, Apple docs and related blogs, books on Swift and iOS, and just thinking about tech time-consuming but enough to keep me up-to-date or even ahead of my teammates. I do appreciate what he said - I don't imagine FB would hire me - but fuck them. Do your own thing then. That's why I got into tech - to stay employed long enough to create for myself. Still, if you have the right experience, someone will hire you if the big names (or startups) won't.


I know, right?

It's things like these that make me kinda regret going into tech. I should've gone into accounting or something like that instead.

I'm 32, and while I'm an excellent mid-level developer, I don't think I'm ever going to be principal or possibly even senior material. And everything I've heard from everyone is that by the time you're in your mid-40s, you'd better either a) go into management or b) hit principal or architect level, or you'll be unemployable. I'm probably going to end up unemployable in 20 years, and that frightens me.


I kinda agree in the sense that I think that a lot of the tech jobs out there are not "difficult" in a way that requires years and years of experience to perform reasonably well at.

So maybe someone older should look for jobs that suit their abilities rather than a job that, as you said, anyone who went through a bootcamp can perform. Hopefully with age comes some kind of niche specialization or skill that can't be obtained easily.


"Hopefully with age comes some kind of niche specialization or skill that can't be obtained easily."

The big issue I've seen played out multiple times is a hiring manager, dev lead or anyone making a hiring decision not being able to tell the difference between what requires experience/skills/specialization and what doesn't. It leads to the all-too-common situation of people with required-experience being passed on in favor of the (generally younger) "hey, this person can definitely do it - they know all this computer stuff!" (slight exaggeration). It gets even more complicated when the person without the required skills tries to bluff their way through (deceptively or just out of sheer desperation).


> And why is it that most older people answering on these threads are so passionate about learning and about new technologies and the latest and greatest javascript frameworks. Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

Nobody enjoys having their hard-earned knowledge become obsolete (hence the "X11/bash/sysvinit/etc were good enough for me so let's never improve them" crowd).

But the fact is computer technology does change fairly quickly. You have to keep learning to keep up.

If you're really worried, there are definitely some technologies that persist longer than others. If you really hate re-learning stuff, then I'd stay far away from the web and JavaScript. Stick with things like Java, Go, C#, C++ & Rust. Those aren't going away any time soon. Ruby, Docker, React, etc... I give them 5 years max. Then you'll have to learn something new.


    > Ruby, Docker, React, etc...
Just for the record, Ruby is 21 years old. Ruby on Rails is 10 years old.


Also, I wouldn't include docker in that list. It's evolving rapidly into a core component of many platform frameworks.


3 web development technologies :D

First tip for elders: don't be in web development.


I am doing just that (but Go and Rust don't belong on that list). However, if a lot of people are doing things radically differently it starts limiting my options too.


Can you explain what's the honest conversation to be had here? That "old" people should be looking for jobs outside of Tech? I am trying to understand your point of view. I can't even start to imagine how hard this gets when you are "old", have a family to take care of and, for god's sake, you'd like some kind of stability in life.

I am 30 by the way. I feel way more prepared than 5 years ago. I think experience is important sometimes.


This could probably only be solved by changing the forever young, fashion-oriented US software development culture.

Staying current is just mitigating the problem, not solving it.

And anyway, staying current should mean growing one's knowledge, not replacing it every X years or investing as much time as the author recommends.


> And why is it that most older people answering on these threads are so passionate about learning

...because older people are told all the time, even on HN, that they're too old to learn and that they don't keep up with technology.

It's not surprising that they pre-empt these doubts by saying what they do to keep learning.

> Why not have an honest conversation

What's the honest conversation?


Well, an honest conversation would involve acting like a regular human, and not a lean mean learning machine ready to go head to head with any younger developer.

Because most people want to live their lives, not jump on a learning treadmill.


Life IS a learning treadmill. Have you looked at the state of employment today? Unless you don't care about income at all, you have to be constantly learning, because things are constantly changing. Even outside the employment space, I'm amazed at how fast the world is changing. My little girls get music playing by talking to a tiny box in our kitchen. Do you realize how crazy/cool that is?


> acting like a regular human,

In this thread we see people saying that older people are unsuitable employees because they have families, and would prioritise those families over work.

We have people saying that older people are unable to learn new tech.

When those people stop being discriminatory the older people can stop being super human.


I'm not sure if you intend this, but your comment comes across as a bit like "God, it's so embarrassing watching these old guys try to stay trendy by using the latest JavaScript frameworks, it's like seeing an old guy in skinny jeans riding a skateboard. They're not doing it because they enjoy it, they're just trying to act like the kids".

Who says using the latest javascript toys is only for the kids? Are you really worried that if you see old coders using the tools you like, that maybe that means they're not as cool as you thought? Is that really how technology works?


I'd say it's normal and healthy. Many industries in the UK, for example, now have what are called CPD obligations. ("Continuing Professional Development"). To stay current you have to earn a particular number of points each year by attending courses or other public events. Even something light like a public lecture might count, for example, for an architect.

https://en.m.wikipedia.org/wiki/Professional_development


I'm 47 and don't believe it's a "macro-domain" issue. Maybe in some micro-domains, like getting hired at Facebook and Google, that is the case. Software is eating the world, and in my experience there are still way more jobs than competent people. I have an older friend who does C#/.NET/web-app consulting. He has an endless stream of opportunities and never works for less than $70 an hour, which is a great wage for the southeast region where he lives.


> That recipe to stay current looks tiresome. ... Your reward: you are still employable.

It does make pursuing a career in programming seem dubious, doesn't it? There's an old saying about "clawing your way to the top", but sometimes it feels like I'm clawing my way back to where I started. I feel like I'm expected to have instant answers to any question and immediately comprehend any technology (regardless of its level of documentation or comprehensibility). I actually do enjoy learning about new things (and old things!), but most of the time, we're not paid to learn, we're paid to instantly know, and if you don't instantly know, there are 20 guys lined up around the block who are ready to take your place as soon as you admit that you don't know how to set up pass-through SSL using the undocumented firewall product that was installed last week that you don't have credentials for and we don't have time for you to waste reading documentation because there's a customer deadline and there's no slack in the schedule. The only other career I can think of that you have to put this much ongoing personal effort into is entertainment; it's like we have all the downsides of entertainment careers without any of the upsides.

On the other hand, I can't imagine doing anything else - every other job (except maybe astronaut) looks murderously boring to me.


Not a bad post at all and it definitely hit home for me.

I'm 43 and have been doing professional development for 20 years (actually 20 years). I moved permanently to Saigon just 1.5 months ago. I'm teaching the Pivotal software engineering process (agile / extreme) to a 100 person consultancy full of really smart ~20 year olds who didn't know or understand process at all.

I keep up on all the latest tech and I have a youthful mind and body (most people think I'm in my 30's). I'm the oldest guy in the company and the only American here. This has quickly led to a lot of personal mentoring on many levels, not just software, but life in general. The culture in Vietnam is strong and my team wants to learn from me. It is very exciting and new for all of us. It has been an amazing experience so far and I look forward to the future.

The best additional advice? Just be nice. It is so simple. The culture here is to never raise your voice or get mad in public, so I've taken it to the other extreme and I just smile and laugh a lot. Even when the servers are melting down. Viet are shy and have poor personal communication skills. By being friendly and nice, they have learned to trust me, and that has opened them up a lot. It has infected my entire team and improved morale almost overnight.

Being older has a lot of advantages. I'm loving my 40's way more than my 20's. Cheers! =)


Some pretty sweeping generalizations about the culture after only 1.5 months. I live in a country neighboring Vietnam and there are similar perceptions of the culture here, yet in my experience these ideas are baseless and often ingrained in expats well before ever learning the language or sometimes even before entering the country.


"these ideas" -- which ideas?

Yes, there is a ton of racism and generalizations among expats. As soon as I got here, I was added to a few private facebook groups where expats vent steam over the craziness of this country. I'm actually not a fan of it because it is honestly very racist, but I want to know both sides of the story. Much like democrats read republican news.

The language is hard, and learning it is going to take me years. I'm trying my best, but when I say something as simple as 'một' (the number 1) to someone, they rarely understand me. Vietnamese people also want to learn English (my company has a full-time English teacher) and will not help me.

So instead of language, I've focused on learning the culture first and in 1.5 months (also this isn't the first time here, so it is more like 2.5 months) I think I have a pretty good handle on a lot of it. I make 1-2 (or more) new friends daily thanks to the friendliness of the people and because I'm out there networking like crazy. I have over a hundred friends here now, both from business and personal.

If I stop liking it here, I'll leave. I don't see that happening any time soon though.


You made two generalizations in particular:

The culture here is to never raise your voice or get mad in public

This stereotype also exists about Cambodia, the country I've lived in for 4 years now. I can tell you that it isn't true about either Cambodia or Vietnam, since I've seen plenty of people in both countries get mad and raise their voices in public. The same happens of course back where I'm from in the US, and if I had to guess it happens at about the same frequency but I wouldn't rely on faulty human memory to make that judgment.

Viet are shy and have poor personal communication

This is just a ridiculous thing to say if you've never spoken to these people in their native language. I have actually gone through the arduous process of learning the language in Cambodia, and I can assure you that speaking in a foreign language clumsily and constantly being worried about your inability to express yourself will make you more shy than you are in your native language. And I have to mention that famous park in Saigon where foreigners are literally swarmed by Vietnamese people eager to practice their English... is that shyness? I never talked to so many strangers in my life as the couple of nights that I sat out there.

And actually, please do not think I'm vilifying you or calling you a racist or anything. I can tell from your post that you're a nice person, whereas in both Cambodia and Vietnam there are an outsized number of expats who are just straight-up mean, bitter, and racist. These people unfortunately end up influencing the perceptions that new expats have.

I'm only encouraging you to come in with more of a blank slate, and to not delude yourself into thinking that you can know anything about the culture after 1.5 months. I'll be more willing to hear out generalizations after you've been there for years, learned the language, and traveled all around the country. Have you even been outside Saigon yet? I've been to Saigon, and it's not representative of the rest of Vietnam.


I feel like you're making generalizations about me without knowing the full story. You've taken what I've said out of context and quoted and commented it based on your own experiences.

I've been here longer than 1.5 months from multiple trips here. I've also been to Cambodia. I'm not pretending to know everything about the culture. I've travelled to other cities than Saigon. I drive a motorbike as well as or better than locals. I'm working here daily at a company full of (wonderful) Vietnamese. I'm not an idiot and can form my own opinions. I'm not an English teacher (which is another unfortunate stereotype in itself). I'm not bitter or mean or racist. I know about that park. I'm working on learning the language.

I still stand by what I said.


Given the topic of the thread, it's clear that your comment was upvoted because it's inspirational to do new things at the (oh so advanced!) age of 43. But I want to address it along a different axis.

> Viet are shy and have poor personal communication skills.

When I imagine how such a statement might land with your fellow HN users who are Vietnamese, let alone, say, a Vietnamese elder, such a claim is painful to read.

> I'm the oldest guy in the company and the only American here. This has quickly lead to a lot of personal mentoring on many levels, not just software, but life in general

I have a similar reaction here too. Other dynamics leap painfully to mind—a vast power differential and violent history—that don't have to do with "life in general".

We don't want a gotcha discourse in which well-intentioned people get scourged for saying things; I don't mean my comment that way. At the same time, civility means more than personal politeness. It includes respect for others different from oneself. That is a profound thing with many levels, each of which challenges us to awareness. Every one of us has these challenges, of course. They're just easier to see in somebody else's case.


I could have said it another way that was more civil and polite. I won't argue that. I have a very direct personality that I don't feel like I should subdue for HN. So I went with painful, but honest. #inmyopinion

Overall though, my experiences have been extremely positive. I literally wake up every day happier than the last, all because I live here now. They are happy and friendly people. They are young and full of energy. If I can help it, I'll never go back to San Francisco. Vietnam certainly isn't perfect by a long shot, but I love it anyway.


That is the same in the Philippines. Public shame is a big no no. You always want to handle things in private.


Thanks for your inspirational story. It seems there is hope if you are willing to step out of your comfort zone.


Thanks! I don't even know what my comfort zone is anymore. =)


Take acid and go to crazy festivals if you want to stay young. No really, do it once at least.

You need to bathe in youth from time to time in order to experience it - it's fantastic.

Of course you need to keep up to date, try to use your wisdom to understand which technology/language is going to survive the test of time.

For example, C/C++ is going to stick around for a while; make sure you're up to date (C++14 and C++17).

Pick technologies with steep learning curves, don't try to compete with 20-year-olds doing JavaScript bootcamps - go five steps deeper.

Broaden your horizon - read poetry, listen to all kinds of new music, watch experimental movies, travel around, talk to foreigners, eat weird food.

Study physics and philosophy, psychology and economics.

Have lots of sex - your wife will love you again :)

You have kids ? Great! Learn from them - everything. Try to teach them what they study at school - see if you can figure out a better explanation. Notice how much new stuff you learn about the subject, about yourself and your kid!

We're all getting old(er) every day - as we age this process seems to accelerate - and one day we will be no more.

But inside us lives the kid, the 20-year old, the 30-year old. It's still there, it can still be crazy and fun, we just need to remember to go on a date with our younger selves. All the rest will follow.

At least that's what I'm telling myself :)


> use your wisdom to understand which technology/language is going to survive the test of time

I usually use statistics. If you select a random point in the lifetime of something, there is a 50% chance that you are closer to the middle than to either the start or the end. Thus: always assume you are roughly in the middle of the lifetime. In other words, if some technology is only one year old, assume it will be dead in another year.
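As a pure illustration of that claim (not the commenter's code), a quick Monte Carlo sketch in Python: if you observe a fixed lifetime at a uniformly random point, that point is closer to the midpoint than to either end about half the time, and the rule of thumb then says "remaining life ≈ current age":

```python
import random

random.seed(42)  # deterministic for the illustration

def closer_to_middle(t, lifetime):
    """True if observation time t is nearer the midpoint than either end."""
    return abs(t - lifetime / 2) < min(t, lifetime - t)

# Monte Carlo check of the 50% claim: observe a technology at a
# uniformly random point in a unit lifetime and count how often that
# point is closer to the middle than to the start or the end.
trials = 100_000
hits = sum(closer_to_middle(random.random(), 1.0) for _ in range(trials))
print(hits / trials)  # hovers around 0.5

def lindy_estimate(current_age_years):
    """The rule of thumb itself: assume we're at the midpoint of the
    lifetime, so expected remaining life equals current age."""
    return current_age_years
```

So under this heuristic a one-year-old framework gets one more expected year, while 44-year-old C gets another 44 (the point made further down the thread).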


That's a great rule of thumb. You should give it a catchy name so people remember it.

When I don't have good insight into whether it's worth my time to learn some new tech, I'm going to try to apply this rule.


This is the Lindy Effect[1].

[1]: https://en.m.wikipedia.org/wiki/Lindy_effect



So that means C looks pretty safe... 44 years. C++ is 33 years.


Nice piece of wisdom right there, thank you!


> Have lots of sex - your wife will love you again :)

Not in my experience

... oh, you meant with her.


"Take a mistress. Your wife will believe that you are with your mistress. Your mistress will believe that you are with your wife. And you can spend more time hacking code."


I'm young but this was invigorating to read.


> Pick technologies with steep learning curves, don't try to compete with 20-year-olds doing JavaScript bootcamps - go five steps deeper.

Wanted to call that one out, don't think I've thought about it before, but it's good advice.


100% agreed, thanks for this!


> Have lots of sex - your wife will love you again

Unless she finds out.


Trust me. If she finds out, she will love you more.


Elephant in the room, IMHO, is the technical interview process.

In software engineering roles at big/desirable/fast-growing companies, the interview process favors faster (by definition, younger) minds. Both young and old are put thru the same/similar coding interviews at many of these places, and often faster coders are younger, and get the job.

You can't fix ageism without fixing the interview process. Being jovial, healthy, nice and culturally sensitive are necessary and useful things to keep your job after you join, but the gatekeeping itself is biased on the other side, which reduces the intake to a trickle.


36-year-old checking in. If there is any perceivable gap in speed between a younger and an older software engineer, that gap can EASILY be made up with experience. After over 10 years in the industry I can honestly say interview questions aren't that original. If anything, I have a huge advantage because I've heard all of them before: answer a question that demonstrates you understand polymorphism, plus Algorithms 101, and that covers most of the questions I see right there. Yes, it's insanely stupid to interview like that; no, I don't think my age does now, or ever will, put me at a big disadvantage.

In my experience, older people get cut out based on not being "a good cultural fit". This has been discussed ad nauseam on Hacker News because "cultural fit" leads to all kinds of discrimination: racial, gender, age, etc.

Every job I've had, we put a person through a series of interviews, then we have a group meeting and we vote. There is no quantifiable evidence that this person actually interviewed the best. It comes down to how people feel in a room. That is the issue, not the speed at which a person can give answers. I've seen people voted down for all kinds of illegitimate reasons, and with age I think it came down to fear in some cases. A lot of software teams don't want to hire the best person they can find. They want to hire someone who is pretty OK, but will also make them look good. Yes, sometimes people don't get the job because they are too good. Am I going to hire someone who makes me look like an under-performer, or who could get promoted before me? Bingo: bad cultural fit.


the interview process favors faster (by definition, younger) minds

Algo-on-the-whiteboard interviews favour people who have recently been cramming for their final-year CS exams - by SHEER COINCIDENCE they happen to be in their early 20s...


I would even say they favor people who have been cramming algo-on-the-whiteboard type of questions. I mean, there are several businesses built around this (CtCI, leetcode, ...)

I know some North American universities have adapted to the practice and are now preparing students, but I assume this is relatively new. My algo and DS classes weren't about cramming at all.

Now to be fair, reasonable companies will focus on higher-level, systems design and architecture type of questions when interviewing seasoned engineers. Or at least, they really should.


Since everyone knows what they're getting into with a tech interview, I don't see why it's such a bad metric. Many companies purposefully give you a rubric/criteria to study. It's a good way of measuring whether a candidate can take the time to learn/prepare a specific set of knowledge, and then work through problems in a way that includes the interviewer (ie. other devs if hired) in the steps to solve the problem.


> It's a good way of measuring whether a candidate can take the time to learn/prepare a specific set of knowledge

And how isn't that a terrible metric? Most of my current coworkers would fail as they simply don't have the time.


> Since everyone knows what they're getting into with a tech interview, I don't see why it's such a bad metric. Many companies purposefully give you a rubric/criteria to study.

It's often so broad that to really cover everything that might come up you've got to have time to make the studying a part-time job.

And even then you might get hit with one of those "you almost have to have seen the trick before" questions, like detecting a cycle in a broken linked-list with O(1) memory.
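For what it's worth, the standard answer to that particular trick question is Floyd's tortoise-and-hare algorithm, which is exactly the kind of thing you almost have to have seen before. A minimal Python sketch (illustrative, not from the thread):

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    """Floyd's tortoise and hare: O(n) time, O(1) extra memory."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next        # moves one node per step
        fast = fast.next.next   # moves two nodes per step
        if slow is fast:        # pointers can only meet inside a cycle
            return True
    return False
```

If the list is broken by a cycle, the fast pointer eventually laps the slow one; otherwise it simply falls off the end. Simple once you've seen it, nearly impossible to derive under whiteboard pressure if you haven't.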


Why interview in a way that's unrepresentative of the actual work?

The only reason is that you want to build an environment where the work is secondary, such as wanting to hire a bunch of bros to go drinking with and help you spend all that sweet VC cash...


Exactly, it's the choice of evaluation criteria.

Meanwhile, experienced developers have their brains tuned towards on-the-job skills that are harder to cram (like an instinct for edge-cases) while "shelving" the stuff that you don't need.


I'm not sure it's because the interview favors faster minds. When you're 25, and good at what you're doing, there is a good chance you simply don't see what a 40-year-old has to offer. Never mind that the day they turn 40 themselves, they'll be able to bring far more value to the table because they have 15 years of additional experience.

If there is no one with 20 years of dev experience within the company, there is no one to speak up for this, and the cycle continues.


Why do you think often faster coders are younger? With many more tricks up their sleeves, a more experienced coder will be faster. Or are you saying compared to literally older people, but not experienced?


I don't think the assertion has any basis in objective data in the first place but is based on a personal feeling. I don't see the value of trying to discuss reasons for something that only "exists" based on spurious subjective personal claims?


Young coders can code that 10,000 line bad idea they had really fast, while older ones can recognize that a problem isn't new and avoid the 10,000 lines altogether.


Cognitively, people do slow down after 21 or so. It is likely not a linear process, but I'm sure a quick google of the research will confirm this. It is not, as suggested by the other comment, a spurious feeling, but a well-known cognitive fact about humans. And experience in real things doesn't help with random tests, especially if you're 20 more years removed from your CS finals.

Whether this translates into a perceptible disadvantage in code tests is less likely to have been tested, but it is given what we know, and given other arguments above, quite possible and indeed likely.


At least one study suggests that unlike athletic abilities that do peak at around 20, fluid intelligence is more complex and may peak all the way up to 50 years. http://m.pss.sagepub.com/content/26/4/433


Technical interviews don't necessarily favor faster coders. I have been on both sides of technical interviews and what counts most is that an applicant asks the right questions and chooses the right approach.

What you're definitely not looking for is a candidate that hacks some solution together quickly but with badly structured code, makes a lot of unconfirmed assumptions and doesn't listen to advice.


> technical interview process

Even before that is the technical screening process. If the company has standardized on Angular 2, your experience with Dojo, Ext-JS, jQuery and even Angular 1 is considered irrelevant by the screeners - you may as well have experience in medieval basket weaving; they won't even call you. If the company has standardized on Groovy, your experience with Java is equally irrelevant. If the company has standardized on MySQL, your experience with Oracle is irrelevant. If the company has standardized on Linux, your experience with Solaris is irrelevant. And on and on it goes...

There's a perception (which I've already seen repeated 10 times in this thread, and I'm only halfway through it) that the "new thing" always completely replaces and invalidates the old thing. But that's almost never the case. Angular uses jQuery. jQuery uses Javascript. Javascript uses the DOM. Groovy uses Java. Hibernate uses JDBC. AJAX uses HTTP. HTTP uses TCP/IP. They all use the OS. And when X uses Y, Y can go wrong in ways you didn't expect if you just assumed that Y became meaningless when X came along.

So we have this environment where the hiring managers are shooting themselves in the foot by looking for style over substance and anybody who tries to bring it up is dismissed as a dinosaur with a case of sour grapes.


> let’s look at the average age of IT workers at well-established companies. Facebook: 28. LinkedIn: 29. Google: 30

I had an interview at Google a couple months ago and noticed that most people were pretty young. When I asked the person who was in charge of taking me to lunch about this, he said that it's probably because there are just many more CS graduates now than there were before, and that Google would very much like to hire senior people as well, but they're much harder to find.

I wonder how much of what he said is true vs ageism.

On the other hand, I wonder how much of a natural bias there is against older people if they have to go through the same interview process because it felt like a mental marathon to me. Although the interview only lasts a day it took a couple days for me to recover.


> ...and that Google would very much like to hire senior people as well but they're much harder to find.

> I wonder how much of what he said is true vs ageism.

Of course they want to hire more senior engineers now, it would likely help win their court battle[1] over it.

[1] https://tech.slashdot.org/story/16/07/02/0438216/age-discrim...


Also Atlassian says the same. They said last week that they just can't get senior people in Sydney. Personally I feel older people are not as good at the little puzzles they set as an entry test.


I applied to Atlassian many years ago and what struck me is their refusal to entertain non java devs to cross over. If they want more seniors they should consider c# or other language devs who could cross over.


That is actually a really hazardous approach. Almost every job I've had as a software developer has introduced me to a new language, and as someone who's been writing code for 10+ years, there's hardly a big struggle to learn a new language or framework, especially if it's just another C-like imperative language.


I've heard this so many times before about "just another language"... learning the language itself? Yes, that can be done damn fast, plus everybody had some Java-like language in their studies. Knowing this means almost nothing; we're talking about a very junior-level resource.

Do you know the gazillions of frameworks used to achieve everything these days, their integration tricks, the various app servers, CI toolsets, and so on and on?

I mean, if you are senior in something, are you also senior in language XXX, meaning I give you a spec, we talk, and you deliver a proper maintainable solution, leading the dev team and managing all the issues and bumps along the road? If not, and you just come in as the described junior, nobody has time to babysit you for weeks/months, and you are not willing to take a junior salary, but that's what you are to the company.


I'm sorry. No actual senior person who is senior at java or c# takes months of babysitting to switch to the other. The beauty of a senior person over someone who is junior and knows only js, is that the senior person should have used many languages over their careers. Picking up a new language and/or framework is what senior people should be doing best. Most of the factors that go into a proper maintainable solution are all language agnostic. Good design, DI, IoC, all of the 12 factor app suggestions are all language independent.

Even when I was back in college many years ago, only the first class taught a language. From that point forward the teacher of each class said we're using language X and suggested a book if you needed help learning.


I'm not the person you're replying to, but can you really argue that someone senior who doesn't know a language and its associated frameworks is equivalent to someone who does know those things or deserves to be paid the same?

Sure, it's true that many things about good design are language agnostic. On the other hand, frameworks and languages can actually limit or enable what you can do, and that lack of familiarity with them has the potential to lead to mistakes.


If everything else is equal, then obviously knowing the current language I need will push that person ahead. IME, the language(s) a senior person knows is the least important part whether they are a good hire though.

My reasoning for this is that we are always learning new languages and using new frameworks. Why would I let a better person go when the language is probably going to change, or worst case they pick it up in a couple weeks just by looking at the existing code base? One case where I would deviate a bit is if I was hiring for a functional programming position. In that case I would prefer experience with some functional language, but that is not much different than wanting OO experience for a Java/C# position.


If the more senior programmer is of the same calibre and merely lacks skill in a language, then 100% yes. Discounting proper experience is a clear example of the Dunning-Kruger effect.


Oddly Google's and Atlassian's software have been getting steadily worse over time but I'm sure that's just coincidental.


I can't really speak to Google's software because there's so much of it and it does so many different things (much of which I try my hardest to avoid), but I've been using JIRA for a long time - probably on and off since about 2010 - and I actually think it's a lot better now than it ever was.


Largely agreed, we're still running a confluence version from 2010 at one organization that I'm a board member of and I must say that it doesn't fare well with the more mobile oriented ecosystem of today.


The Google process is also veeeery long. I guess it's not unusual for an older developer to start it and get other good offers in the meantime. Another benefit of age is being able to pull contacts as a way of getting jobs, reducing the number of "meaningless" interviews.


His reply couldn't sound more scripted


It didn't sound that way to me, but maybe I'm just too trusting. For what it's worth he was probably in his mid 30s and had a family.


Your response couldn't sound any less sympathetic.


I wonder how many more people are graduating with CS degrees in the last few years compared to those who graduated two decades ago. I'd imagine there's a lot more today with how popular it is becoming. That's not even counting all the CS grads that have since moved up the corporate ladder/switched roles or careers and aren't applying for dev jobs now.


Actually, the average age of software engineers at Google is much higher than 30.[1] It's the non-SWEs that bring the average down. The cited source in that article lists average ages for all employees, not just technical ones. This is an unfortunately overlooked distinction when writing about the industry.

[1] source: I work for Google.


That is a recurring point that Uncle Bob tends to make. In my opinion, it sounds reasonable. E.g.: http://blog.cleancoder.com/uncle-bob/2014/06/20/MyLawn.html


I don't know your definitions of older/younger, but I interviewed at Google NYC recently, and my perception was that most people were in their 30s, with a few older and much older folks. Maybe a factor of the teams I was talking to, but it was clearly an older bunch compared to FB.


I can't speak for NYC but this is also my experience at Google Zurich and London.


Did they give you a job offer?


I did! Why do you ask?


You left the ending off your story! Nice ending too.


Haha! I didn't feel it was relevant to the comment but thank you! I'd rather be a math genius though! Maybe one day I'll be able to at least study some more math


Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

That said, the OP is correct. You're only as good as your last two years and even that's pushing it. If the tech changes, you have to adapt with it.

I'm 53 as of yesterday (the 8th). I started with PDP-11's in the 80's, then VAX's, then PC's, BASIC at first, then C, then Visual Basic, then ASP, then C#/ASP.NET, and now I'm deep into AWS (Lambda, DynamoDB, Redshift), NodeJS, AngularJS 1.x/2, ReactJS, and I'm still learning new technology all the time.

A lot of developers will transition to management and it's on my mind, but I'm also still drawn to solving problems at a code level. And there's always new toys to play with like Angular and React. Now we have .NET Core and all of its interesting avenues.

If you actually care about being a good developer, you will continue to work.

As long as there are jobs. Nothing will help you if the job market contracts. Then I do believe hiring becomes age-oriented, with us older devs labeled "over-qualified".


Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

The fact that people are still claiming there is a shortage of tech workers demonstrates that that isn't true. The industry wants young, cheap workers who come ready-made with the trendiest skills; then it wants to ditch them, rather than retraining them or allowing them to accrue seniority, and hire new ones who will work 80 hours a week for free soda and "stock options"...


Devs should train themselves. If you're waiting for someone to train you, you're not going to last in software development for very long.


Some rhetorical questions: How does a person know if they've trained themselves well? What if part of the training involves interacting with other people?


What other fields operate like that? Doctors, lawyers, accountants, etc all get CPD on the company's time and dime.


What companies fit this mold? Do you work at one or worked at one in the past?


>Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

You'd be surprised. The original post (TFA) gives the median age in major companies for example and it doesn't work like that there.


I'm not convinced that there's more jobs than people in the tech market, especially now that people are starting to wise up to the fact that jobs are being automated away and want to transition into stabler markets like ours.


Younger people are not smarter. They might learn new things faster. On the other hand, older people understand related new things much better because they already have a large context (experience). The biggest difference between my current self (44) and my younger self is that I spent much more energy on my projects when I was younger. I created results much faster, at the cost of limited consideration. Now, I can still burn for a while, but not as long as when I was younger.


I used to be able to burn longer when I was twenty, but I'd spend that fuel on shit I've learned doesn't matter like 100% test case coverage, setting up continuous integration servers, and load testing against massive amounts of traffic I'd never see.

I could go on and on about the stuff I'd do I don't do anymore.

It's not that those things don't have their place, or that all young programmers are guilty of premature optimizations, but there's no difference in terms of productivity between doing nothing and doing things that don't matter at all. It took me a while to learn that, and I still catch myself wasting time.


Do you ever get asked about this? e.g. what you don't do anymore as a result of learning from experience

Either in your current role / in interviews / on the street?


not that i can recall. why do you ask?


It seems like you have learned a tremendous amount throughout your career that others could benefit from, that's why! Not learning from you flies in the face of wisdom (1)(2).

I'm not afraid to generalize this either. I see so many people and organizations repeating the same mistakes, their own and others', over and over. I envision an organization built around the idea of learning from others' experience. It's a near-mythical creature, but it could run circles around its peers, who would doggedly pursue finding out its "secret" and promptly discount it when they hear "learn from others' experience / history."

This turned into a mini-rant, but it's because I am incredulous that you and others like you have learned so much yet the potential of that knowledge is so rarely tapped into.

(1)“Fools learn from experience. I prefer to learn from the experience of others.” ― Otto von Bismarck (2) Why Don't We Learn from History by B.H. Liddell Hart https://www.amazon.com/Why-Dont-We-Learn-History/dp/09850811...


I find that - very broadly speaking - old guns are better at strategy and young guns are better at tactics. The energy and interest of the young lead them to find the local maxima more readily, and the experience of the old helps you find the real maxima rather than the local. Or put another way, young'uns will get you out of the rabbit-hole more quickly, but old'uns will take you down fewer rabbit-holes. Both mindsets work well together.

YMMV, caveat, caveat, caveat, etc, roughly speaking, &c


Would you mind expanding a little bit more, from your perspective, on what falls into tactics and what falls into strategy?


Generally what marklgr said. By tactics, I mean being up to date with the latest lib for -foo- and being on top of what the cool kids are doing, chasing down the rabbit-hole to find that one setting to improve db performance. Individual, almost atomic tasks, especially those in a changing landscape.

Strategy is more about planning, forethought, and knowing ahead of time which avenues are worth pursuing and which aren't. As a very simplistic example, a tactician might spend a day down a rabbit-hole and improve a db's performance by a small but significant amount, and a strategist would not have spent that day, knowing that that db is going to be turned off in two weeks. Not the best example, really (and it implies that you're only one or the other, which isn't true at all). There is also overlap between the two.

Perhaps a military analogue - tactics is about how to take the forts (shorter timeframes, clearer objectives, obvious goals), and strategy is about determining which forts are worth taking (long-term timeframes, murkier objectives, sometimes unclear goals, concern for secondary effects). The difference is also in scope, I guess. Being a good tactician benefits from energy and focused interest, something which the young'uns tend to have more of (caveat, caveat), and strategy benefits from forethought and experience, something which the old'uns tend to have more of (caveat, caveat, &c)


In that context, to me, tactics is writing & debugging code, compiling stuff, settings things up to work with other libs--ie. mostly hands-on tasks.

Strategy is more about planning, architecture, knowing about the real pros and cons of different solutions eg. libs, tools or languages.


At 37 I'm much faster at debugging today than I was at 27. And 10 years ago, I was consistently one of the fastest debuggers in a company of 15-20 programmers. Debugging's always been one of my strongest programming skills.

So debugging, no, definitely not. I can spot root-causes of bugs now even easier than I used to be able to.

And one of the big new skills is that I'm much more able to 'guess' where a bug is in someone else's totally new code-base than I previously was able to.

In fact, I also write code faster as I get the general gist of it done much faster first time.


Some people specialize, and get more and more efficient at the same specific tasks; but I believe it's more common to widen one's experience in various areas (languages, tools, platforms, non-functional requirements etc.), leading to some lack of practice in the hands-on tasks one used to do everyday in the past, and also making it more difficult to commit everything to memory. On the other hand, one generally gets a much broader picture.

For instance, I used to be able to perftune UNIX and DB (mostly Oracle on Solaris) pretty well, knowing various kernel parameters by heart and all the tools of the day (tkprof, Cockroft and co). Now I guess I could still get by after some brush-up, but I'm not anywhere near as efficient as I used to be on that specific task. And I still write code, but I don't know every single method of every API anymore. But now, I can design full solutions, from choosing the hardware to setting up platform, languages, monitoring, high-availability, backups, security etc. Not because I'm smarter, but because I've been exposed to all that along the years.


Breadth of experience often turns into strategy. Lack of experience tends to show up as when you only have a hammer, every problem looks like a nail. At this point in my career I've written production code in many languages/frameworks and was even a DBA at one point. I've also worked in many types of businesses. Now I can look at a given problem and provide a range of possible solutions. Sometimes those solutions do not involve writing any code at all.


I've seen several examples where the younger developer (often me) comes up with some really clever code to make a certain thing run really fast, and the older developer coming along and pointing out that my underlying premise is wrong.


I agree with all your sentiments. I think the key thing is that younger people tend to equate efficiency with lines of code written. As you get older, you realize that pausing long enough to consider all implications leads to better solutions. Fast and hacky is fine for prototypes but (hopefully) eventually it needs to scale and be stable and supportable.

I also think that the Zuckerberg quote also points to the startup mentality. They want 22 year olds because you can get them to work 100 hour weeks against the promise of an IPO, not because they are "smarter". Hard to convince a 40 year old with two kids to do the same. There has already been a shift to more traditional CS companies (IBM, etc.) as the lack of IPOs has soured some people on the startup dream of being employee #3 at the next Uber of XXX.


Agreed. My current self can understand complex concepts faster than my younger self. Understanding the innards of a distributed database is easier now than ten years back.


> Younger people are not smarter. They might learn faster new things.

I was a young person once (no, seriously!). I _thought_ I was learning things fast back then. When I got older, I realized I was trading speed for depth.


Young people:

    1. Learn faster (better memory)

    2. Can keep their attention focused longer.
Number 1 actually becomes a bane with old people: my relatives over 70 have developed an active reticence to learn anything new. I even theorized that they are instinctively protecting their limited amount of functional short-term memory for vital tasks.


    > Learn faster (better memory)
And yet, when I needed to learn XSLT (and then React), I was able to simply inhale it in one go because of my experience of similar technologies and functional programming, which come about from having been doing this quite a while.

Younger and quicker developers took a lot longer over it and to find their feet because there were many new concepts there for them.

As you get older, there's less that you haven't already learned.


Let's put aside the unsubstantiated claims about biology (and old doesn't mean 70 yrs in this context). Jobs that rely heavily on attention span and learning speed can also quickly get boring and unfulfilling--the more experienced folks probably know to steer away from that.


Article mentions "RPG" many times: "RPG back-end", "RPG developer". For those who wonder what it is (like me): it is not "Role Playing Game", it is the IBM RPG language: https://en.wikipedia.org/wiki/IBM_RPG


How the author failed to explain this abbreviation is beyond me. Who in their 50s doesn't know that you spell out important concepts and key terms, then shorten them in parentheses?


Maybe it didn't occur to him that people wouldn't know what it is. If you were writing an article about programming, would you explain what BASIC stands for?


No of course not, everyone knows it stands for 'Badgers Are Super Intelligent Creatures'.

:)

(Beginners All-Purpose Symbolic Instruction Code... which now I type it out, obviously cheats. From here on, it shall be known as BAPSIC.)


STFU.


Please comment civilly and substantively on HN or not at all.


I spent my first 5 years out of college as an RPG developer, and even I was like "wait, can he mean THAT RPG?" I'm not even old, I started that job in 2005!


Although it is more fun to imagine he meant 'Rocket Propelled Grenade' back-end... (Etc)


"First appeared: 1959" and it looks it. I don't think the author is doing himself any favours by going on about such an archaic language!


...and that's exactly the attitude the author is pointing out.


No, I agree with the parent commenter - but it's not that the language is old that's the problem.

If the friend mentioned lost or left his current job, he'd probably (?) struggle to find another. It wouldn't be nasty ageism, it'd be "sorry all your experience is with a technology irrelevant to us; that we haven't even heard of".

I say this going on the article alone - of course, for all I know the guy's an avid Ruby/Node/Elixir/whatever's-hot user for a range of awesome side-projects.


No, the author is actually saying that that attitude is fine. His point is that you need to keep learning and doing things that are relevant today, not learn today's technology and then stick with it for the rest of your career.

> I could tell you about all my accomplishments over three decades, such as replacing the use of a System/3 punch card system with the AS/400, writing a Cobol debugger, or…. Ah, I’m boring you. What you do care about are things I did in the last two years.

RPG is only relevant today in the same way Cobol is.


The feeling your comment conveys is not "keep learning". The feeling is that by talking about "archaic" stuff, the author isn't "doing himself any favours", i.e. he makes himself sound old.

That's actually his point.

Someone who has seen a technology mature and develop over decades, seen the hype trains come and go, projects succeed and fail, might be a bit wiser than someone fresh out of Stanford. He might have some ideas about how to build maintainable systems after working on some that are older than most developers.

If you don't see it, that's on you, not on the author for mentioning his particular "archaic" specialization.


One of the things that has made me old (52) and crotchety is that I learned Lisp very early in my career. That gave me the ability to see that 99% of "new" technologies were really just poor re-inventions of (parts of) Lisp. Even today, Common Lisp -- despite (or, as some would argue, because of) the fact that it hasn't been officially updated in decades is still not only a viable language but one of the best choices for many applications. But no one knows it because it's not the shiny new thing, and even young people still can't seem to get their heads around the fact that the parens are a feature, not a bug. And that makes me grumpy sometimes.

The good part was that I was able to build a very successful career while not having to suffer nearly as much pain as many of my contemporaries. The bad part is that now it's hard to find people to collaborate with. :-(


Even today, Common Lisp is still one of the best choices for many applications.

Then why don't people build amazing and popular things with it? I don't mean one or two people build one or two things, but lots of people building lots of things.

"Nobody uses it" and "it's the best choice" can't both be true. And "nobody uses it" is approximately true. It's not a mainstream JVM language or CLR language, it's not an AWS or Azure or Google Cloud language, it's not used for the Linux kernel, Windows, Oracle, *BSD, SQL databases, or NoSQL databases; it's not used where Erlang is, it's not the research language Haskell is, it's not the fun esolang or the long-tail COBOL, it's not the new compiles-to-JavaScript, it's not behind 3D game or VR engines, it's not behind Amazon's shop or used where Go is. It's not in StackOverflow's most-popular-languages surveys, or the most profitable languages for devs to learn, or the most desired by employers. It's not an educational language like it once was.

Yet it's "one of the best choices".

I simply don't believe it. Any perceived advantages it has, in practise must be a wash.


Too much fun to program in. Lisp makes it remarkably easy to extend the language into just what you want it to be...which means that Lisp programmers tend to spend most of their time extending the language into just what they want it to be rather than solving the problem. (Indeed, this is often touted as one of the benefits of Lisp...once you have a suitably well-adapted DSL, the problem solution follows naturally.)

Meanwhile, the Java or Go programmer is thinking "this is boring and ugly. Let me finish this problem as quickly as I can so I can go home." So they finish the problem as quickly as they can, and then it's done, and it ships. And people have an incentive to use it rather than tinker with it because nobody really wants to peek under the hood, and the people who do peek under the hood tend to be really dedicated and care a lot about the problem domain because why else would you put up with the language?
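To make the "extend the language into just what you want it to be" point a bit more concrete, here's a rough Python analogy (plain higher-order functions standing in for real Lisp macros, and all the names are made up for illustration):

```python
from functools import reduce

# A tiny embedded "DSL": first build a vocabulary that fits the
# problem domain, then state the solution in that vocabulary.
def pipeline(*steps):
    """Compose steps left-to-right into a single function."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

strip = str.strip
lower = str.lower

def replace(old, new):
    return lambda s: s.replace(old, new)

# Once the vocabulary exists, the actual task reads almost like a spec:
slugify = pipeline(strip, lower, replace(" ", "-"))

print(slugify("  Getting Older in Tech  "))  # getting-older-in-tech
```

The temptation the parent describes is exactly this: polishing `pipeline` and friends is more fun than writing `slugify`.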


This one I can believe.


>Nobody uses it, but it's the best choice, both can't be true

Sure it can. It is possible for both to be true, as long as developers don't pick languages rationally, which they probably don't.

There are a few biases at play. One is that people are only exposed to a subset of the languages that exist, which are those that are either used in industry, or are making the rounds in news. Another bias is that we like to pick languages based on familiarity. For example most of my college courses used imperative languages: Python, C++, and Java. From that experience I am quicker at thinking in terms of for loops than folds and maps. So when I try a language like Haskell or LISPs I think "That's neat!" but when faced with a deadline I switch back to something closer to my first languages.

It could be that I am the only one that thinks like this, and everyone else sits down and spends equal time on every language in existence, but I doubt it.

In any case at one point LISPs were popular, so why did they lose all of their momentum?
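For what it's worth, the habit I mean looks like this (a Python sketch; both halves compute the same thing):

```python
from functools import reduce

data = [3, 1, 4, 1, 5]

# The imperative reflex: accumulate the sum of squares with a for loop.
total_loop = 0
for x in data:
    total_loop += x * x

# The fold/map framing: the same computation, expression-shaped.
total_fold = reduce(lambda acc, x: acc + x, map(lambda x: x * x, data))

print(total_loop, total_fold)  # 52 52
```

Under deadline pressure, the first form is what my fingers type; the second is what I reach for only when I have slack to think.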


I also doubt it. But LISP dates back to 1958.

JavaScript is from 1995.

"I chose the language which exists" doesn't apply to the people in 1994 who could have had /thirty-five years/ of LISP experience (potentially) and yet still chose to write another language, in another language. And when Brendan Eich was in college around 1981, the tutors there could have had /twenty years/ of LISP experience, but weren't there to convince him that it was amazing.

Same with literally any language dating post-LISP, and I note that that covers most languages which are popular today.

"Programmers don't pick rationally" is fine on the small scale, but accross the entire industry, even among people who do love exploring programming languages, even among young entrepreneurial risk takers, even among companies in tough markets angling for any and every edge they can get over their competitors, in decade after decade, over problem domain after problem domain after problem domain, there is this empty howling wasteland of happy and productive people using not-LISP, writing world-conquering systems that do just-fine-thanks and the claimed benefits of LISP just don't seem to be making any noticable dent in anything.

Therefore, they are over-hyped.


I agree that claims for Blub (for any value of "Blub") are too often over-hyped. But don't make stuff up in the course of justifying your reasonable point against hype.

I met John McCarthy in 1977. In college, a professor (Ruth Davis, I think; SCU EECS department) brought in someone who taught Friedman's "The Little LISPer". By the time I got to Netscape, I had read SICP. I knew enough about LISP to be dangerous.

But as I've said many times, and Netscape principals have confirmed, the reason JS isn't Scheme is because Netscape did the Java deal with Sun by the time I hired on (after I was recruited with "come and do Scheme in the browser"), and that meant the "sidekick language" had to "look like Java".

There was no "chose to write". Netscape management gave the MILLJ order; Netscape source was C (mostly), JS ("Mocha") was supposed to go server-side via "LiveWire" as well as embed in Navigator, and I was a C hacker. These are the reasons for the new language and its first implementation language.


But don't make stuff up in the course of justifying your reasonable point against hype

Where's the fun in that? ;) No, but really - I apologize for claiming there wasn't enough general LISP enthusiasm around when you were at college to affect you, and for not actually looking into the history of JS before making statements about how/why it happened.

JS design being partly a pre-made business decision - that's something I really should have known by now.


You are overlooking a crucial fact: very often people who want to use Lisp are not allowed to by their management precisely because "no one uses it" and so the experiment never gets done.

I was the lead engineer on the first release of AdWords. I wanted to write it in Lisp, but I was not allowed to, being forced, over my strenuous objections, to do it in Java. So now AdWords is a Java success story rather than the Lisp success story it might have been, not because Java is better (it certainly wasn't -- it was a disaster) but because my boss issued an edict.

This has happened to me many times in my career. The one time I was actually allowed to use Lisp in a project that I did not have direct control over (the DS1 remote agent) it was an overwhelming technical success. In fact, management tried and failed to get the software rewritten in C++, so this was actually a controlled experiment.


If that were the (only) issue, we ought to see Lisp used more in startups than in established businesses, because in startups you don't have some manager who doesn't know Lisp and is managing to minimize his perceived (not necessarily actual) technical risk.

But in fact I'm not sure that we see more Lisp use in startups, either. That could be because they aren't taught it in school. (Many startup founders haven't had the chance to pick it up on the job. For that matter, many programmers haven't had the chance to pick it up on the job.)

On the controlled experiment: Why was that? I mean, it has to be possible to rewrite anything in C++ (Turing complete, and all that - but possibly at the price of Greenspun's Tenth Law). Was it because the people who tried didn't know anything about Lisp? (Were they working from the source, or from the spec, or from the program documentation?) Were they just not as good programmers as the Lisp programmers (and don't say "if they were as good, they would have been using Lisp" - that wasn't their assignment). Did they have less time, less resources? Why didn't/couldn't they do it?


It's all a vicious cycle going back to the AI winter in the late 80's: DARPA stopped funding AI work, which meant that funding for Lisp work dried up, which meant that fewer people used it, which meant that fewer people learned it. Now, 30 years later, hardly anyone uses it, at least not directly. But it keeps getting re-invented again and again. You can't avoid re-inventing Lisp because it's part of the fundamental physics of computing. That is my frustration. People say that Lisp sucks, and then they proceed to re-invent it badly, not even being aware that that is what they are doing.

As to why the re-implementation of the Remote Agent (it wasn't the whole thing BTW, just the planner) in C++ failed, it was a combination of factors. This was twenty years ago (holy cow!) and C++ compilers were nowhere near as mature then as they are now. There was only one compiler available for the flight hardware, and it was pretty unstable. But mainly it was Greenspun's tenth: the application did a lot of dynamic allocation (it was an AI search algorithm), which is something C++ is particularly not well suited for. They essentially had to re-implement Lisp in C++, and that was hard. Of course it would have been possible given more time, but we didn't have more time. The Lisp code was working, so that's what we flew.


Supposing the AI winter never happened, what is your perception of where Lisp would be now? Dominant? Large niche? Small niche, but larger than present?


I have no idea, and it doesn't really matter because we can't go back. I really don't want to dwell on the past, except insofar as we can learn lessons that inform the future. The point I really want to make is that Lisp is still a viable option today (Common Lisp and Clojure in particular) and I would like to see people give it a fair shake going forward.


A few notes:

* Google actively develops one of the Lisp compilers. Google Flights powers Orbitz, Kayak, etc. That's Lisp.

* There are several Lisp compilers in active open source development.

* There's a graph database written in Lisp called AllegroCache. It's good enough to support a business (Franz) for more than a decade.

* Another company (LispWorks) also exists and has a large portfolio of clients.

* Lisp has been used to make entire operating systems. Ones of the past, and ones of now. (Of course, an OS needs a community. But where are real OS's with GUIs in other languages?)

* Lisp has been successfully used in my own career for embedded systems to control satellite acquisition systems to, most recently, quantum computing. (At real companies.)

Just because there's not this huge buzz around Lisp doesn't mean no one is using it.


> But where are real OS's with GUIs in other languages?

There's this thing called Windows you may have heard of...

And, out of curiosity, what OS are you referring to?

> Just because there's not this huge buzz around Lisp doesn't mean no one is using it.

jodrellblank's claim was not that absolutely nobody is using it. The claim was, compared to how wonderful Lisp advocates claim the language is, relatively nobody is using it. If you compare the amount of software written in Lisp to the total amount of software written, and compare that to how wonderful Lisp is claimed to be, jodrellblank has a real point. And citing a handful (or several handfuls) of counter-examples does not refute the point at all.


I don't even code in Lisp, but I knew the Google Flights engine would come up, because it seems to be just about the only "serious" application ever written in Lisp. And it wasn't built by Google, but their acquisition ITA, which hails from Boston and MIT's Scheme reality distortion field. (And even MIT has stopped teaching SICP in Scheme...)


Emacs, ViaWeb, Macsyma, the DS1 Remote Agent, much of the autonomous navigation software for the Mars Rover before the Pathfinder mission... And I have built a number of small-scale web applications in Common Lisp. I'm running one of them in production right now.


And Yahoo! paid 49 million dollars of stock to throw the LISP in the trash and use something else instead.

It wasn't even worth them spending 1 million dollars - 2% of that price - on training people.

http://discuss.fogcreek.com/joelonsoftware/default.asp?cmd=s...


Right. And that decision was surely one of the reasons Yahoo is the runaway success it is today.


Are AutoDesk still using it?


It might be that it really is a big secret, and people are quietly using it and gaining advantages from it.

If that were the case, I would still expect to see lots of gushing blogs of leaks from inside hush-hush companies and people desperate to learn LISP posting "how do I replace strings in files in Common LISP" on programming forums, and "I learned LISP and doubled my salary" on Twitter.

Yet what you really see is Steve Yegge and "my EMACS code at Amazon was replaced with Java years ago" and "Facebook working on a new JavaScript thing" and "Microsoft working on a new JavaScript thing" and "Apple working on Swift" and "Fog Creek compile VBScript to PHP" and "rPi comes with Mathematica and Python" and so on and so on.

I wasn't saying that no one is using it. I was saying "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it".

Your points about your career are pretty interesting.


Lisp is old and it's unlikely that you'll see hype around it anytime soon. But you can see that Lisp-derived languages like Clojure can generate minor hypes. Other new languages may even contain substantial Lisp influence. Examples would be Julia or R.

Generally the industry has problems reusing old/existing technology. See for example the Javascript domain, where new frameworks for web development pop up every week and the lifespan of frameworks is measured in months.

Instead of using/enhancing existing tools, there is a constant pressure to develop new stuff. Or take Apple with Swift. Instead of using Scala, Standard ML, OCAML, F# or Haskell, they developed a new statically-typed functional language.

It's the NIH syndrome at work. Everywhere. But it's also that tools are complex to learn, so people start new with simpler tools, they grow over time and after some time they are replaced with other stuff. If something gets updated in some incompatible ways, it already causes problems: some users are lost, some users will only use the old stuff, some only the latest stuff and some will try to use multiple versions. See Python.

Full Common Lisp is just too complex for most developers, but it has a life in many specialised and niche applications: CAD, some AI tools, music, robots (like the Roomba), planning/scheduling (crews, telescopes, ...), Expert Systems, verification of software and hardware, some maths stuff, ...

Since Lisp is only left being taught at a few universities, there are not many people able to develop with it. Even when it was taught, it was often only used to teach concepts like recursion and not programming. The younger Lisp programmers found it by themselves.

'Industry' often has no interest in diversifying its programming tools. Many enterprise software shops currently (still) use Java: standardised, broad industry support, ... You won't successfully propose that they use Lisp, even if the application would be better in some way. For example, if the project fails, it certainly wasn't Java's fault, because all the others are using it too. If you used Lisp and the project failed, it would be Lisp's problem: not enough people, little architecture experience, tools not broad enough, integration story too weak, etc... Even if it were successful and in production, there would be a lot of pressure to rewrite it in some industry standard in the next product iteration.

> "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it"

There is no general 'best'. It's all relative to a domain, community, demands, legacy, fashion/hype, ...

Lisp is not more dead than usual. Yesterday a donation effort was started for the Quicklisp library manager, and it's now at $16,606.37.


Very good list of some issues that lead to "the new hotness" all the time. I had not thought of some of these; it's not totally brain-dead fanboyism.

One issue that (good) bosses have is that they cannot allow their business to be held hostage by one person. If that person quits (or dies), they have to be able to replace them. Esolangs are a hard sell on that basis alone, no matter how fit for use they may be.

I'd forgotten Clojure (and others!) - yes, Lisp is more popular than "things named Lisp". And, arguably, Clojure is popular enough these days that the argument in the previous paragraph doesn't really apply to it any longer. (Whether bosses know that is a separate issue.)

Then there's the claim by Guy Steele that, in creating Java, he dragged a bunch of C++ programmers "halfway to Common Lisp". (Lisp purists might concede some fraction considerably less than half...) Paul Graham says that the Lisp feature set is slowly taking over programming languages. Lisp may die but still conquer, or mostly conquer.


What Java mostly got from Lisp was parts of the memory model and the managed memory, not so much on the language side. Smalltalk got that from Lisp too. Ruby also (the Ruby developer studied the Emacs Lisp implementation). Microsoft's CLR also. Actually, parts of the early Microsoft .NET CLR GC were written in Lisp and automatically translated to C.

But Java could not do a lot of things a typical Lisp implementation can do, sometimes with a good reason for that.

  * runtime compilation
  * loading of code (-> custom class loader)
  * garbage collection of code is sometimes difficult
  * saving/starting of memory dumps
  * fast startup times
  * tail call optimization
  * updating changed objects
  * calling error system -> Java has a terminating error system
For most of these problems some 'solutions' have been developed. On the language level the Java community experimented with some original Lisp stuff, too. Not much was added to the language, but at least they got lambda expressions (1958 in Lisp) and a first kind of multiple inheritance of code (around 1980 in Lisp via Flavors). Java will soon have a standard shell for incremental use ( https://blogs.oracle.com/java/jshell-and-repl-in-java-9 ) - that was around 1960 in Lisp. Now imagine when macros are added...

But generally knowing Lisp does not help you much with Java, since the language's OOP model is very different from the typical OOP in Lisp (see CLOS+MOP).


> Nobody uses it

This application is written in 7+ million lines of Common Lisp

https://www.ptc.com/cad/elements-direct/modeling

https://www.youtube.com/watch?v=mJGytRaNvec

The video shows how Eterna uses 'PTC Creo Elements/Direct' to develop their watches. There are many other clients of that in various domains.

Also:

https://www.youtube.com/watch?v=4rD0zmA-Trc


Good cite.

(Although the fact that some people can make successful Common LISP software could just mean that those people are exceptional enough to make successful software in any language, so it doesn't bolster the claim that 'Common LISP is one of the best choices for many applications' very strongly.)


Inmarsat uses a Lisp application (G2) to monitor their satellites and to provide a high level overview of the status of all Inmarsat services in real-time.

http://www.inmarsat.com

If you would run a Cement plant, an oil pipeline or a coal terminal in South Africa, then you might also be using G2 to control and diagnose technical processes in real-time.

The product used, written in Lisp:

http://www.gensym.com/wp-content/uploads/Gensym-l-G2.pdf


Then why don't people build amazing and popular things with it?

The reason is that "best" can have more than one meaning. Languages are usually only "best" at one thing, not at all things. In the case of many languages that people consider to be "best", they've been optimised for the development process (making it nice to write code in) rather than speed (C) or security (Ada) or a specific niche (R) or maintainability (not sure) or customisability (LISP). They're designed to make writing code more straightforward. Languages optimised to be easy to code with tend to bubble up to the top of popularity contests, because the more people who can access the language, the more popular it's going to be.

There is an argument that Ruby is mostly LISP, so it could be said that lots of people use it very happily everyday.


I would go with that, in the sense that "here's a box of parts" is the most customisable way to solve a problem - yet not the best way for most people.

Except, even people doing exploratory stuff apparently aren't using Common-LISP.

Let me point to Slava from RethinkDB, who wrote blog posts about how amazing LISP is and then went on to found a startup whose software was written in C++. Or Peter Norvig, AI researcher, who moved from LISP to Python.

And even your mention of Ruby - if "Common LISP is the best language", then why is Ruby even needed? Why aren't those people happily using Common LISP? Simply because it wasn't actually any better, and was in fact worse than Ruby by whatever metrics.


> There is an argument that Ruby is mostly LISP, so it could be said that lots of people use it very happily everyday.

How is that? I thought Lisp's "killer app" was its extremely powerful macro system. Does Ruby have an equivalent?


It has class level declarations which do the same thing.
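A small Ruby sketch of what that means (with a hypothetical `my_attr`, a hand-rolled cousin of the built-in `attr_accessor`): class bodies are executable code, so a class-level call can generate methods at class-definition time, which covers a slice of what Lisp macros are used for:

```ruby
# Class bodies in Ruby are executable code, so a class-level
# call can define methods at class-definition time: a limited
# analogue of a Lisp macro expanding before run time.
class Point
  # Hand-rolled version of the built-in attr_accessor,
  # shown here to expose the mechanism rather than hide it.
  def self.my_attr(name)
    define_method(name) { instance_variable_get("@#{name}") }
    define_method("#{name}=") { |v| instance_variable_set("@#{name}", v) }
  end

  my_attr :x
  my_attr :y
end

pt = Point.new
pt.x = 3
puts pt.x  # prints 3
```

Unlike a Lisp macro, this manipulates the object model at load time rather than transforming syntax, but it covers many of the DSL-ish jobs macros get used for.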


Sure, in practice people's beliefs about languages can become self-fulfilling prophecies. And the thing about self-fulfilling prophecies is that they are actually true. For example, Javascript is tremendously useful even though it is a horrible language from a technical point of view; it's just that the sheer weight of people using it because everyone else is using it results in enough infrastructure that you can get useful things done in Javascript despite its (lack of) technical merits, not because of them.


And when JavaScript was becoming a thing, where was the better alternative written in LISP and the community of people who like better software supporting it?

Nowhere, that's where. As in approximately all problem domains, whatever perceived benefits LISP has don't seem to make it all that desirable.


Please. You pick a lowest-common-denominator language like Java or Golang so that you have a large pool of cheap, semi-sentient code monkeys to hire from.

If you choose LISP, you're actually going to have to expend a little bit of effort on identifying, recruiting, and retaining actually competent people. That's a problem when you're looking to put butts-in-seats (as is the case in my industry: defense).


That just pushes the problem back a couple of decades. Common-LISP is from 1984, GoLang is from 2007.

In twenty years, Common-LISP with all the advantage it supposedly has couldn't make anything that a 'code monkey' could use? Why not? Is it awful at writing tools?

Or is it that the alleged advantages of Common-LISP don't really make any difference in reality, because they are over-hyped or even non-existent?

Note that I'm not saying "why did they pick one language over another", I'm saying "Common LISP can be the best tool for the job - yet they waited twenty years to use a 'worse' tool to implement a 'worse' tool". That doesn't make sense.


You are writing this comment on Lisp-based software. Yeah, I know, Hacker News is simple software and could have been written in anything else, but somehow the author, who I've heard is very smart, thought that Lisp was the best pick.


And his secret to winning big was "write lisp software, have it bought out by another company who will rewrite it in a different language to make it popular and useful".

Really, if PG's "LISP, how to win big" was as compelling and important as he tried to state in that essay, every ycombinator company would be using LISP for their secret advantage over other companies.

Applicants would choose LISP because it's "better", LISP-based applications would be preferred because the applicants have an advantage, existing companies would be encouraged to LISP because it would help them do more for less employee numbers and YC would become a LISPy community of alumni.

I have no proof that this is not happening right now, but would you bet on it happening right now in secret?


Your logic assumes that young founders/developers are very proficient in many languages (including lisp) and have perfect information for choosing. In reality, they use whatever crap they picked up hacking in high school. On top of that, the tools are only a part of the ingredients. However bad your platform is, at least it works, while most of the screw-up is hidden in business decisions - that's why every startup incubator's job is to help the youngsters not screw up business-wise. Tech is normally left up to the founders, most of whom have never heard of lisp.


> And that makes me grumpy sometimes.

If you will excuse a moment of cheekiness...

Could be that you have cause and effect backwards here: because you are grumpy you are dismissing 99% of other language developers' work as rubbish. Could be that it's less than 99% and you are overlooking some great ideas.


That's not cheeky, it's a perfectly fair question. Yes, that is certainly possible. And there have been a few cool new ideas that have come along that are not easily subsumed by CL, like Haskell's type system. It's easy to implement Hindley-Milner, but actually using that information to inform the compiler, plus adding laziness as a core language feature, is much harder. But I think the jury is still very much out on whether or not Haskell is really a net win.

But the most popular language on github at the moment is Javascript, and there is no question that it is simply a very badly designed Lisp with C syntax. This is not intended to disparage Brendan Eich. He had a week to design and implement something, and under those constraints he did a pretty amazing job. But I can't help but imagine how different the world would be if he had used Scheme as a starting point.

Was there something in particular that you had in mind that you think I may have missed?


My take is that Language X usually has major deficiencies compared with Language Y for domain Z, for many values of X or Y: Lisp, Scheme, Smalltalk, Forth, Erlang, Haskell, assembler, C, etc.

Confirmation bias makes it easy to bind those variables to values that make one's own favorite language obviously the best and everybody else's infuriatingly, irrationally terrible. If unchecked then this leads to wildly false conclusions, such as that languages fit into a hierarchy of powerfulness ("Blub Paradox.")


What do you see as Common Lisp's "major deficiencies"? (Pick your favorite value for Z.)

This isn't a challenge, I'm genuinely interested in your answer.


I prefer not to. I think it would be more productive as a private thought experiment: what would you expect a Smalltalk hacker curse about if you forced them to use Common Lisp? a Haskell hacker? a Rust hacker? I think they would miss some really valuable things, and that you could easily miss the value of those things if you took them out of their original context (e.g. considering only whether Lisp would be improved by adopting those specific features.)


that should be pretty obvious: lack of library support.

there is the famous example of the reddit founders, who believed pg's lisp story, and built the first version of their site with it. it went so badly for them that they had to start over again in python.

... but i bet you are going to have a very plausible-sounding reason why it didn't work for them.


I have no idea why Lisp didn't work for Reddit. But Common Lisp has exceptionally good library coverage today, and with Quicklisp, getting access to it is virtually seamless.


okay, maybe i betrayed my biases there too much, but, i agree with everybody else: if lisp was such a great secret weapon, there would be a hell of a lot more visible success stories by now, other than just pg's original viaweb implementation, and some flight routing software.

plenty of other tech has come up from nothing in the last few decades, to wide adoption, and big successes. the fact that lisp hasn't is, in my mind, prima facie evidence that it is not nearly as great as its proponents claim.


> there would be a hell of a lot more visible success stories by now, other than just pg's original viaweb implementation, and some flight routing software.

There are, but you are ignoring them; you simply did not do any research, or the use is simply hidden.

The success of Common Lisp today is relatively small, but a careful reader could find a few interesting applications of it, like the scheduling system for the Hubble Space Telescope, the design software for the Boeing 747 (and other aircraft), the software for the Roomba, the software for D-Wave's quantum computer, the crew scheduling software for the London subway, chip design verification software at Intel (and other chip companies), ...

There are some old application platforms which survived. For example Cyc, an attempt to provide common sense AI to computers, is under continuous development since the mid 80s. The company Cycorp has 50+ employees, is very secretive and you need to guess who pays for it. Customers are among others the United States Department of Defense, DARPA, NSA, and CIA. They are using it for various applications.

Note also that prototyping software was for a long time an application area for Lisp. Have relatively small teams develop a prototype and make it a product once the idea is validated. Example: Patrick Dussud wrote the core of the first Microsoft CLR (.Net Common Language Runtime) garbage collector in Lisp. The code was then automatically translated to C (IIRC) and enhanced from there after some time. Lisp now is no longer used and the GC has a lot of new features, but the first working versions came from that Lisp code.


> if lisp was such a great secret weapon, there would be a hell of a lot more visible success stories by now

Not necessarily. There are other possible explanations of Lisp's relative lack of commercial success, not least of which is the fact that a widespread belief that "there must be something wrong with it because no one uses it" can become (and I think has become) a self-fulfilling prophecy.

But another important factor is that the Lisp community seems to attract people who are really good at tech but really bad at business. I think if someone (or, more likely, some pair of co-founders) could bridge that gap they could still kick some serious ass.


Just a data point: I founded a Lisp startup together with a bunch of experienced Lisp hacker buddies from the SBCL community. Sadly and reluctantly, we found Lisp awkward and ended up rewriting everything in C, and then never looked back.

These days I am developing such software with LuaJIT and that is working much better for me than either C or Lisp.

One thing I learned along the way is that many tales of Lisp heroism are actually anti-paradigms. Once upon a time when I read about ITA Software rewriting CONS to better suit their application I thought it was impressive; now I see it as a farcical workaround for having chosen an ill-suited runtime system and sticking with it (and generally an indictment of Lisp not providing a practical performance model for the heap.)

Lispers are too expert at spinning bugs as features. "It's insanely complex, every line could be an interaction with undefined behavior or a race condition or an unexpected heap allocation" becomes "suitable only in the hands of trained specialists, like a chef's knife or a surgeon's scalpel or a Jedi's light saber."

I feel like we need to have a shared "our emperor didn't have any clothes" moment with regards to Paul Graham's essays.

(I say this as somebody who does love Lisp and will probably do a lot more Lisp work in the future but only on a project that is a peculiarly good fit.)


(Funny feeling of being a Lisp hacker searching for catharsis in the Hacker News comments section... :-))


Don't search; go do something cool that everyone will talk about!


Tangentially: I am working with LuaJIT these days and this feels really exciting to me. The compiler is a new kind of beast and possibly the beginning of a large Lisp-like family tree. Feels very "MACLISP" to me - exciting!


Some functional languages make certain behaviors implicit, such as partial evaluation and laziness. However, these work better when they are explicit: one of the two is severely confusing when implicit, and the other potentially performs badly.

  C:\Users\kaz>txr
  This is the TXR Lisp interactive listener of TXR 162.
  Use the :quit command or type Ctrl-D on empty line to exit.
  1> (defstruct integers ()
       val next
       (:postinit (me)
         (set me.next (lnew integers val (succ me.val))))
       (:method print (me stream pretty-p)
         (format stream "#<integers ~a ...>" me.val)))
  #<struct-type integers>
  2> (lnew integers val 0)
  #<integers 0 ...>
  3> *2.next
  #<integers 1 ...>
  4> *2.next.next
  #<integers 2 ...>
  5> *2.next.next.next
  #<integers 3 ...>
Why would I want implicit laziness everywhere? The best of all worlds is to have expressions reduced to their values eagerly before a function call takes place.

When I don't want an expression evaluated in (what looks like) a function call, I can, firstly, make that a macro.

If I really want lazy semantics, I can have a decent vocabulary of lazy constructs that fit into the eager language. For instance for making objects lazily I have lnew, distinct from new.

Implicit laziness everywhere is academically stupid. You're drowning the execution of the code in an ocean of thunks and closures.

The pragmatic approach is best: make a compromise between having everything explicit and visible, yet keeping it syntactically tidy and convenient.
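For what it's worth, Ruby takes exactly this opt-in route: evaluation is eager by default, and laziness is requested explicitly, much in the spirit of `lnew` versus `new` above (a minimal sketch, not TXR):

```ruby
# Eager by default; .lazy makes the pipeline demand-driven,
# so the infinite range is only walked as far as needed.
naturals = (0..Float::INFINITY).lazy.map { |n| n * n }
puts naturals.first(4).inspect  # prints [0, 1, 4, 9]
```

Without the explicit `.lazy`, the `map` over an infinite range would never return.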


> Why would I want implicit laziness everywhere?

Modularity; see the stone age paper discussed yesterday: https://news.ycombinator.com/item?id=13129540

> Functional programming languages provide two new kinds of glue - higher-order functions and lazy evaluation. Using these glues one can modularise programs in new and exciting ways, and we’ve shown many examples of this.

> This paper provides further evidence that lazy evaluation is too important to be relegated to second-class citizenship. It is perhaps the most powerful glue functional programmers possess.


The paper claims in its conclusion that it has provided evidence (what is more, "further evidence") yet I can't find any in there.

It argues that you can achieve a certain useful separation between programs when one produces data for the other.

This can be achieved in a very satisfactory way with explicit streams (i.e. lazy lists). It can also be achieved with delimited continuations, coroutines, threads, and often with lexical closures. Not to mention Icon-style generators.

Lazy lists can be incorporated into the language so that their cells are first-class objects and substitute for regular eager cells smoothly. (Thank you, OOP.)

The paper is actually wrong there, because laziness alone will not provide the kind of separation in which g can begin executing and f then runs only when an item is required. Not for an arbitrary f! Suppose f traverses a graph structure recursively and yields some interesting items. Lazy evaluation alone isn't going to let the f traversal behave as a coroutine controlled by g, proceeding only as far as g remains interested in further items. The author is attributing to lazy evaluation magical powers that it doesn't have.
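To make the explicit-stream alternative concrete, here is a rough Ruby sketch (illustrative only): an `Enumerator` producer f runs only as far as the consumer g pulls, which is exactly the coroutine-style control being discussed:

```ruby
# A conceptually infinite producer; the block runs only as far
# as the consumer demands items.
f = Enumerator.new do |y|
  n = 0
  loop { y << n; n += 1 }
end

g = f.take(3)   # consumer pulls three items, then stops the producer
puts g.inspect  # prints [0, 1, 2]
```

Here the producer is suspended and resumed under the consumer's control, no implicit laziness required.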


> Why would I want implicit laziness everywhere?

For the same reason you want automatic memory management: so you can fob off the job of figuring out where the thunks should go onto the compiler, just as you fob off the job of figuring out where the calls to malloc and free should go. At least that's the theory. It seems plausible to me. I think it's an open question whether my failure to grok Haskell is due to a problem with Haskell or the ossification of my brain.


It's not the same, and here is why: a program's correctness doesn't depend on when (or even whether!) that automatic memory management happens. Lisp systems have been bootstrapped without a working garbage collector up front. Short-lived Lisp images run as processes in a conventional OS might never get a chance to collect garbage.

Laziness has precise semantics which has to unfold properly, or else things don't work.

Delaying evaluation is not the same thing as delaying reclamation. They are opposite in a sense, because we only allow something to be reclaimed when it is "of no value".


Not trying to troll you here...

To what degree do you use Lisp as an FP language? As a pure FP language? The forced purity may make Haskell a very different language.

And, to return to your original complaint: If you dislike new languages, I bet XML drives you straight up the wall...


> To what degree do you use Lisp as an FP language? As a pure FP language?

It depends on what I'm doing, but I generally write in an OO style more than a functional style. Real problems have state.

> If you dislike new languages

I want to be clear that this is just a general observation. I don't dislike new things because they are new, I tend to dislike them because they are generally bad. But they are not all bad. Clojure is cool. WebASM is very cool. The work that has been done on Javascript compilers is nothing short of miraculous (even though the language itself still sucks).

> I bet XML drives you straight up the wall

Kind of, but not really. Yes, I dislike XML because it is nothing but S-expressions with a more complicated syntax. But it doesn't drive me up a wall because when I need to deal with XML I just parse it into S-exprs, do what I need to do, and render the results back into XML.
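A toy version of that round-trip's first half, in Ruby for illustration (assuming the REXML parser that ships with Ruby; `to_sexp` is a made-up helper):

```ruby
require "rexml/document"

# Turn an XML element into a nested-array "S-expression":
# [tag, child..., text], with empty text dropped.
def to_sexp(node)
  kids = node.elements.to_a.map { |e| to_sexp(e) }
  text = node.text.to_s.strip
  [node.name, *kids, *(text.empty? ? [] : [text])]
end

doc = REXML::Document.new("<a><b>1</b><c>2</c></a>")
p to_sexp(doc.root)  # prints ["a", ["b", "1"], ["c", "2"]]
```

Once the tree is in that shape, the "do what I need to do" step is ordinary list manipulation, and rendering back to XML is the mirror image.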


I dislike XML because it's a nested tree of internal nodes and typeless character string leaves.

In XML I have no way to place "255" and "FF" in such a way that XML understands them to be the same object, of integer type.


Sure you do. <base10>255</base10> <base16>FF</base16>


Pure FP deals with state!


Sure, everything non-trivial is Turing-complete. But FP begins as a stateless paradigm and then tacks on state as a sort of a kludge while all the while seeming to be a little embarrassed about it, while OO embraces state from the beginning as part and parcel of the mental model that it endorses. I find the OO model has a better impedance match to my brain and the real world. Reasonable people can (and do!) disagree.


State is something that is best just embraced rather than "dealt with".


An analogy: OO doesn't embrace IO either. Yet, weirdly enough, OO-style IO is better than in older languages that have built-in commands for writing to disk.

Haskell doesn't have state built in, but you have multiple models to choose from, from simple folds to STM or the State, Reader, or Writer monads, all of which serve different purposes and do different jobs well.


OOP absolutely embraces I/O. I/O begs to be OOP and makes, hands down, the best use case for illustrating OOP.


What I am getting at is that OOP languages like C++ have no IO commands built in; it is all delegated to libraries.

Haskell has no state support built in, it is all delegated to libraries.

So:

C++ has excellent IO support, but the language doesn't embrace IO at all.

Haskell has excellent State support, but the language doesn't embrace State at all.


C++ I/O libraries in fact depend on the sequencing semantics built into the language. If we make two calls to the library, they happen in that order; consequently, the I/O happens in that order. We can do wrong things like:

    f(cout << x, cout << y)
where we don't know whether x is sent to cout first or y.

I/O statements could be added to C++ (e.g. as a compiler extension). They would be straightforward to use; C++ doesn't inherently reject that, the way Haskell and its ilk reject sequencing and state.


Haskell doesn't reject sequencing. f = g . h requires that h be evaluated first.


h might not be evaluated at all! Consider

    h x = factorial x
    g x = 0
(but I agree that Haskell doesn't reject sequencing).


Yeah, the IO monad will, but it isn't generally true of monads. In fact, the Maybe monad will cease early on Nothing by design. So it is a brain shift.


> Haskell has excellent State support

Someone who understands where that support is and how to use it should rewrite atrocities like:

https://rosettacode.org/wiki/Assigning_Values_to_an_Array#Ha...


I don't think that's so bad; it's just that you are using a function instead of the usual built-in indexing operator [0]. Here's something a bit more convenient using Data.Array.Lens[1].

    arr ^. ix 2
vs the original:

    readArray arr 2


Failure to grok Haskell is probably due to the lack of easily accessible literature on it. Plus, with so few other people grokking it, there isn't as much osmosis available.


But I can't help but imagine how different the world would be if he had used Scheme as a starting point.

I've heard that he did in fact use Scheme, but the suits insisted on the whole curly-brackets-and-semicolons thing so they could call it Javascript and piggyback on Sun's marketing efforts.


To be fair, he only took some of the semi-colons.


He took an option on the semi-colons.


I'm certain lisper [parent] knows these, indeed may well have worked with PG in the past, but for those who don't, here's some useful linky: http://www.paulgraham.com/lisp.html


Isn't there a good, big community around Clojure now?


Depends on what you count as "big". If you look at languages used on github, for example:

http://githut.info

Clojure is doing OK (better than Common Lisp), but not great (worse than Haskell and Emacs lisp).


I bet more people are getting paid to do Clojure work on production systems than Haskell or Emacs lisp.

That's a really nice web site. It took me a moment to realize it's showing quarterly data (sadly last updated in 2014), and that the ranking is based on the first metric ("number of active repositories").


Not quite as much activity as Scala either. I think the good thing is that functional languages as a whole aren't going anywhere, it's just that the jobs for them are spread very thin over several languages.


Maybe because in Clojure you achieve more with less effort :)


Out of curiosity what flavor/dialect of LISP do you recommend? Common or something else?


I like Common Lisp, and Clozure CL in particular, but it doesn't really matter all that much. The cool kids seem to be using Clojure (with a J, not a Z) nowadays. Pick whatever works for you.


"That gave me the ability to see that 99% of "new" technologies were really just poor re-inventions of (parts of) Lisp"

"But no one knows it..."

It'll be really hard to work with a 23 year old with that attitude too.

