63-year-old white guy with little hair and a lifelong beard that is now white. A tad overweight as well. I feel for so many people expressing angst about ageism. I've seen it elsewhere but not where I work now.
I suspect that faster-growing companies and companies in tech centers, mostly on the coasts, see more pronounced ageism.
My last job was at a bank in Richmond, VA, and there was clear ageism in IT when I got there. I moved from IT to compliance for two years, was very successful, and never observed ageism.
At 53 I moved to a web development manager and developer role in Higher Ed. I took a gigantic pay cut, if you factor in bonuses and options, to work in Higher Ed. But I got to send my oldest to college at a selective school for free: a $200K after-tax benefit.
I don't look back. My life is so much better now with a 40-hour work week and being 100% in control of how we architect our web and backend ecosystem. I spend more than 40 hours because I love learning, but I choose when, where, and what I learn and work on after those 40 hours.
I read these comments from people who are 36, 40, 40+ and shake my head. That is not old. 63 is not even old. I plan on working till I am at least 67. I love my job and I especially love the people I work with.
I find these days I spend less time coding and I end up with better applications because I think through the design before coding.
I'm going to ask around in Richmond VA and see what ageism exists in industry here and report back.
Keep learning, try as hard as you can to stay in shape and engage in critical thinking. Good luck to each of you in staying employed and staying happy.
^A golden nugget buried in an insightful ramble.
I report to the CIO and he has little need to know details. He cares about long-term progress and the big picture. He couldn't care less about how we get there as long as we are taking care of people and doing it honestly.
The great irony is that I've seen teams try to be agile, which they seem to think means code > design. It often ends poorly.
I was in a room full of developers when I read aloud the story about the guy who started oldgeekjobs.com, who defined old geeks as over 35. The room erupted with the loudest "F U" I've ever heard.
So many salaries look and sound impressive until you actually do the math on what it costs you.
As I said, I have never looked back. I don't miss the money because I don't miss the stress.
Also, for many there is not much choice. Sounds like you had a top-5% job. Could you live well without the situation your former life gave you, such as owning your own home and not paying rent? Or would you be forced back into a bad family life in order to provide for that family?
EDIT: I'm having to reply here as I've hit the fake HN "you're submitting too fast" after doing 2 or 3 posts. The truculent censorship on this site when one hits specific topics sucks.
epalmer: Ok, good for you, and congratulations on finding your equilibrium!
I do worry about aging out of this industry, and my long-term plan is pretty similar to the path you have taken. My hope is that I can participate in the agency/startup ecosystem for another 10 years before I have to find something else to do. Hopefully that will still be in technology, and will probably be in Higher Ed or at a nonprofit for less money but more intrinsic rewards.
I haven't observed specific instances of ageism towards me or others (people being passed over for promotions or treated differently than younger employees), but I also haven't worked with many people 50+ since I started in this sector. I'm not sure how much of that is that people in that age group are looking for a more balanced lifestyle and how much comes from the companies that do the hiring.
At the risk of going off-topic; every new/kinda-new/wish they were new engineer can learn volumes from that single statement.
I'm 47 maybe there's hope for me yet.
Thank you for this. You made my day. I just entered my 20s and am constantly in fear of ageism later in my career.
Here's where I'm at.
I look back at my career and I can tell you about great projects that I got to be part of, awards and plaudits that I won, big paybacks from projects that went well and literally saved the company. Those are all nice war stories to have.
But I can't point to any of it and say, "I made that" because - and here's the kicker - it's all gone.
Software is ephemeral. One day your client does an upgrade, and then the thing that you spent years building and curating like a baby disappears. It isn't mothballed and put in the basement where visitors can walk by and see it. There's no photo of you standing by the thing that you can hang in the hallway and see every day. Your creation just completely vanishes without a trace.
All those years I've also been a musician and recording engineer. I've made a few dozen records, none of which amount to anything anyone else would care about. And all told, I'm sure I earned more money in one year of my IT work than in my entire music career.
However, here is a collection of my work that I can point to and say, "I made that." It's a creation that I can reflect on years and years down the road.
I take much more satisfaction in my musical creations than from my software creations, even though I was much more famous and valued as a software architect.
Stallman, 63, can point to GNU Emacs or GCC and say "I made that in the 1980s." Not necessarily the most recent version of it, but that hardly matters.
Gerald Sussman and Guy Steele can point to Scheme and say, "we made that".
John McCarthy was able to point to Lisp and say "I made that" right up to the day he died, and we can continue to say it for him.
Make the right stuff; then you can bask in it for longer and be a kind of living saint to a few generations after you.
I've worked in this industry for a long time as well, but I'm not someone you've heard of. I too have built some very cool things that I'm proud of, a number of which no longer exist. Meanwhile, a shed that I built myself 30 years ago still stands, and I can still point to that and say I made it.
Things you hacked up in the past are gone because they solved a narrowly defined problem which no longer exists, and even before that happened, you already abandoned those programs.
That this happens is almost inevitable, as part of making a living. All those programming da Vincis who are known for something also worked on lots of things that are now dust.
One day someone may try to make an emulator for IE5 and ES4 just to run jQuery (like we've seen done with the 6502 and C64).
I imagine when I kick the bucket all my software will be long gone, but I will leave a bunch of artwork that will survive for anything from days to centuries, mostly on its own merits.
On the other hand my paintings will stay forever on my walls and my children's walls.
It's partly self-encouragement, partly PR. The fact that it even exists is proof that the author is facing some issues, no matter how confident they would like to appear.
That recipe to stay current looks tiresome. Listen to two podcasts, two webcasts, subscribe to four magazines, teach courses, go to one conference per year, blog regularly, read blogs, follow the latest web trends. Your reward: you are still employable.
It all seems fake. Like they're trying to put on a brave face while at the same time being scared and trying to convince themselves that all this new and shiny tech that they work with is awesome.
Why not have an honest conversation instead of pretending that learning some thing or another will make everything ok in the end?
I, too, find this post sad, but probably not for the same reasons you did. That this post makes both of us sad doesn't make it any less of a fact that the young people in this industry think you are a useless dinosaur if you don't know what the new hot thing is about.
Indeed, ageism exists and it's a form of discrimination. There are societies (e.g. Japanese, Korean, Central/Southern European) where ageism works the other way around, at the expense of younger people.
In both cases it's still discrimination and we should acknowledge the problem and fight it instead of trying to look/act younger or older.
I have the feeling these posts are not at all representative of the domain, and neither are the proposed solutions to just learn more, but it's all I've seen so far.
More of the same thing is not bringing us any closer to solving this very real issue.
I suppose it's a form of competition that forces new technologies to pop up all the time. I just wish people would take a really close look at what's already there before spending their prime bestowing the world with yet-another-framework or language.
Welcome to the technology industry.
Since then I have studiously avoided specializing in any technology. As soon as I feel like I've spent enough time in a particular stack to start to "know" it, I move on. I have refused to be pigeonholed into any particular tech.
The coding/language skills are the least important skills I have, and I intend to keep it that way. However, I can present an extremely long laundry list of technologies that I have built solutions with - the length of the list, not the presence of any particular TLA on it, is the key to demonstrating my learning ability.
I have learned them whenever I needed to. Most of the new technologies are not that special. That makes them easy to learn, but, also, kind of annoying because I can see that they're just repeating a mistake I've seen 20 years ago already.
My point, going up this discussion thread a few clicks, was that new technology is not always better than the old. Ageism comes into play when one's opinion about the new tech is dismissed just because one has some gray hair.
Some of these posts are real ageism claims, but most are "I'm old, I'm scared, how am I going to survive with a highly valuable skillset?" It's fear mongering. If you're concerned with job security, go find a job with security in government or in some monolithic non-tech company based in Go-Fuck-Yourself, GA. The "Hi, Fellow Kids" hipster-posing BS isn't going to land you a job at Facebook no matter how old you are.
Wait, your solution to the problem of ageism in tech is "get out of tech, old fogey"??
Non-tech companies need specialized software, too. When I was in college, I knew multiple people who had internships working on internal tools for a major bank. I imagine those banks also employ senior people to work on their internal tools, customer-facing web portals, etc.
And it's not just banks. I once interviewed for a job working on e-commerce stuff for Neiman Marcus. I'd imagine that other retailers like Walmart, Target, etc. need people working on their portals.
And if you want to work with technology but get out of programming, everyone needs IT.
That's not exactly a solution though, is it? Why go to a company where you are a cost centre, just because you reach a certain age?
It turns out that more corporate environments are actually friendlier to marginalized groups than a quirky, freewheeling startup.
Job security is great. Big, established juggernauts don't have the kind of churn startups have... there's no worry about "what if the VCs don't go for another round of funding?", and the markets are well-established and slow to change. And if you go into defense contracting or public sector, you might even have lifetime employment.
The work environment is probably going to be nicer. Traditional corporations don't do open offices and don't require engineers to work 60+ hour weeks. Some of us would prefer to do 9-5 in our own cubicle. Banks are also especially generous with PTO (and remember that the "unlimited" PTO you get at startups is a scam)... I'm just going to quote a friend of mine on Facebook when I decided to post a general question of "how much PTO do you get?":
> I used to work for a bank, and they're notorious for giving tons of time, but when in my first position, I had 2 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days. When I was rehired further up the food chain, I got 4 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days, and I could buy an extra week off by lopping a week of pay off my annual salary. If I'd stayed longer, climbed more, I could max out at 8 weeks paid vacation with all the rest of it.
Not all of us care about doing interesting or ground-breaking work. We just want to stay employed so we can fund our lives, and we want a work environment that doesn't make us hate ourselves and want to die.
Honestly, I'm pretty happy at my employer -- we're a tech company, but the environment is very corporate (we're a telecom), and it doesn't feel like a startup at all. The work environment is highly praised, we're ranked as one of the top work environments on Glassdoor, and half of my team are graybeards. I don't want to leave here, but if it ends up happening anyway, I'm giving serious thoughts to pursuing public sector work after this.
What this all boils down to is: if SF/SV is going to be seen as the beacon of software development, it is almost worse to me if ageism is being exemplified there. To give a rather rough corollary, consider if Washington D.C. never hired another underrepresented group, such as women or minorities. It's sort of like, "well, maybe I halfway, sadly expect that to happen in some small town somewhere, but c'mon! D.C.! Everyone's looking to you!" -- Same kind of thing.
OTOH, to be fair, I am obsessive about learning new stuff, and I've been working with some "trendy" stuff the past few years (all big-data, hadoop, storm, kafka, etc. stuff), and I've been doing a lot of machine learning / data science MOOCs over the past year or so. So my skills are a good match for what there's demand for. But those skills would be valuable whether I was 20, 30, or 80.
The whole world outside of GAFA looks to those companies for "how to do software development right" - no matter if they are right or not.
I've watched more than a few well established companies start to ape Google's hiring practices, or Amazon's churn, or Facebook's development practices just to try and attract new young developers.
My advice to younger devs is to focus on general computer science fundamentals and application development skills. Those are the only skills that stay with you and grow as your career advances. The technologies always change so don't memorize them. Keep a reference around instead. Memorize things like design patterns and sorting algorithms.
Yes, this is all in a very high-performance (sometimes insanely so) context, but it does happen. Most of them were like this - unrolled special-purpose versions derived from a sorting network. Some were for GPU.
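For readers who haven't run into them, here is roughly what an unrolled sort derived from a sorting network looks like - a generic sketch of the technique, not the production versions described above:

```python
def sort4(a, b, c, d):
    """Sort four values with a fixed sequence of compare-exchange
    steps: the optimal 5-comparator sorting network for n=4."""
    # Each line is one comparator: swap the pair if out of order.
    # The comparator sequence is fixed regardless of the input,
    # which is why these unroll so well (and map nicely to GPUs).
    if a > b: a, b = b, a
    if c > d: c, d = d, c
    if a > c: a, c = c, a
    if b > d: b, d = d, b
    if b > c: b, c = c, b
    return a, b, c, d
```

Because the comparison pattern is data-independent, there are no mispredicted branches on the element order itself, and on SIMD/GPU hardware each comparator becomes a min/max pair over whole vectors.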
There are two kinds of learning in play here: learning how to do new things, and learning how to do the same things a slightly different way. Learning a new JS framework is definitely the latter, when 99% of the time the end goal is a CRUD application that could have been done with anything from the last 20-odd years. Because that is all the vast majority of apps are at the end of the day...
The former is what older people should be doing, leveraging the experience gained as a springboard.
It's things like these that make me kinda regret going into tech. I should've gone into accounting or something like that instead.
I'm 32, and while I'm an excellent mid-level developer, I don't think I'm ever going to be principal or possibly even senior material. And everything I've heard from everyone is that by the time you're in your mid-40s, you'd better either a) go into management or b) hit principal or architect level, or you'll be unemployable. I'm probably going to end up unemployable in 20 years, and that frightens me.
So maybe someone older should look for jobs that suit their abilities rather than a job that, as you said, anyone who went through a bootcamp can perform. Hopefully with age comes some kind of niche specialization or skill that can't be obtained easily.
The big issue I've seen played out multiple times is a hiring manager, dev lead or anyone making a hiring decision not being able to tell the difference between what requires experience/skills/specialization and what doesn't. It leads to the all-too-common situation of people with required-experience being passed on in favor of the (generally younger) "hey, this person can definitely do it - they know all this computer stuff!" (slight exaggeration). It gets even more complicated when the person without the required skills tries to bluff their way through (deceptively or just out of sheer desperation).
Nobody enjoys having their hard-earned knowledge become obsolete (hence the "X11/bash/sysvinit/etc were good enough for me so let's never improve them" crowd).
But the fact is computer technology does change fairly quickly. You have to keep learning to keep up.
> Ruby, Docker, React, etc...
First tip for elders: don't be in web development.
I am 30 by the way. I feel way more prepared than 5 years ago. I think experience is important sometimes.
Staying current is just mitigating the problem, not solving it.
And staying current should anyway mean growing one's knowledge, not replacing it every X years or investing as much time as the author recommends.
...because older people are told all the time, even on HN, that they're too old to learn and that they don't keep up with technology.
It's not surprising that they pre-empt these doubts by saying what they do to keep learning.
> Why not have an honest conversation
What's the honest conversation?
Because most people want to live their lives, not jump on a learning treadmill.
In this thread we see people saying that older people are unsuitable employees because they have families, and would prioritise those families over work.
We have people saying that older people are unable to learn new tech.
When those people stop being discriminatory the older people can stop being super human.
It does make pursuing a career in programming seem dubious, doesn't it? There's an old saying about "clawing your way to the top", but sometimes it feels like I'm clawing my way back to where I started. I feel like I'm expected to have instant answers to any question and immediately comprehend any technology (regardless of its level of documentation or comprehensibility). I actually do enjoy learning about new things (and old things!), but most of the time, we're not paid to learn, we're paid to instantly know, and if you don't instantly know, there are 20 guys lined up around the block who are ready to take your place as soon as you admit that you don't know how to set up pass-through SSL using the undocumented firewall product that was installed last week that you don't have credentials for and we don't have time for you to waste reading documentation because there's a customer deadline and there's no slack in the schedule. The only other career I can think of that you have to put this much ongoing personal effort into is entertainment; it's like we have all the downsides of entertainment careers without any of the upsides.
On the other hand, I can't imagine doing anything else - every other job (except maybe astronaut) looks murderously boring to me.
I'm 43 and have been doing professional development for 20 years (actually 20 years). I moved permanently to Saigon just 1.5 months ago. I'm teaching the Pivotal software engineering process (agile/extreme) to a 100-person consultancy full of really smart ~20-year-olds who didn't know or understand process at all.
I keep up on all the latest tech and I have a youthful mind and body (most people think I'm in my 30's). I'm the oldest guy in the company and the only American here. This has quickly led to a lot of personal mentoring on many levels - not just software, but life in general. The culture in Vietnam is strong and my team wants to learn from me. It is very exciting and new for all of us. It has been an amazing experience so far and I look forward to the future.
The best additional advice? Just be nice. It is so simple. The culture here is to never raise your voice or get mad in public, so I've taken it to the other extreme and I just smile and laugh a lot. Even when the servers are melting down. Viet are shy and have poor personal communication skills. By being friendly and nice, they have learned to trust me, and that has opened them up a lot. It has infected my entire team and improved morale almost overnight.
Being older has a lot of advantages. I'm loving my 40's way more than my 20's. Cheers! =)
Yes, there is a ton of racism and generalization among expats. As soon as I got here, I was added to a few private Facebook groups where expats vent steam over the craziness of this country. I'm actually not a fan of it because it is honestly very racist, but I want to know both sides of the story - much like Democrats reading Republican news.
The language is hard and learning it is going to take me years. I'm trying my best, but when I say something as simple as 'một' (the number 1) to someone, they rarely understand me. Vietnamese also want to learn English (my company has a full-time English teacher) and will not help me.
So instead of language, I've focused on learning the culture first and in 1.5 months (also this isn't the first time here, so it is more like 2.5 months) I think I have a pretty good handle on a lot of it. I make 1-2 (or more) new friends daily thanks to the friendliness of the people and because I'm out there networking like crazy. I have over a hundred friends here now, both from business and personal.
If I stop liking it here, I'll leave. I don't see that happening any time soon though.
The culture here is to never raise your voice or get mad in public
This stereotype also exists about Cambodia, the country I've lived in for 4 years now. I can tell you that it isn't true about either Cambodia or Vietnam, since I've seen plenty of people in both countries get mad and raise their voices in public. The same happens of course back where I'm from in the US, and if I had to guess it happens at about the same frequency but I wouldn't rely on faulty human memory to make that judgment.
Viet are shy and have poor personal communication
This is just a ridiculous thing to say if you've never spoken to these people in their native language. I have actually gone through the arduous process of learning the language in Cambodia, and I can assure you that speaking in a foreign language clumsily and constantly being worried about your inability to express yourself will make you more shy than you are in your native language. And I have to mention that famous park in Saigon where foreigners are literally swarmed by Vietnamese people eager to practice their English... is that shyness? I never talked to so many strangers in my life as the couple of nights that I sat out there.
And actually, please do not think I'm vilifying you or calling you a racist or anything. I can tell from your post that you're a nice person, whereas in both Cambodia and Vietnam there are an outsized number of expats who are just straight-up mean, bitter, and racist. These people unfortunately end up influencing the perceptions that new expats have.
I'm only encouraging you to come in with more of a blank slate, and to not delude yourself into thinking that you can know anything about the culture after 1.5 months. I'll be more willing to hear out generalizations after you've been there for years, learned the language, and traveled all around the country. Have you even been outside Saigon yet? I've been to Saigon, and it's not representative of the rest of Vietnam.
I've been here longer than 1.5 months from multiple trips here.
I've also been to Cambodia.
I'm not pretending to know everything about the culture.
I've travelled to other cities than Saigon.
I drive a motorbike as well as or better than locals.
I'm working here daily at a company full of (wonderful) Vietnamese.
I'm not an idiot and can form my own opinions.
I'm not an English teacher (which is another unfortunate stereotype in itself).
I'm not bitter or mean or racist.
I know about that park.
I'm working on learning the language.
I still stand by what I said.
> Viet are shy and have poor personal communication skills.
When I imagine how such a statement might land with your fellow HN users who are Vietnamese, let alone, say, a Vietnamese elder, such a claim is painful to read.
> I'm the oldest guy in the company and the only American here. This has quickly lead to a lot of personal mentoring on many levels, not just software, but life in general
I have a similar reaction here too. Other dynamics leap painfully to mind—a vast power differential and violent history—that don't have to do with "life in general".
We don't want a gotcha discourse in which well-intentioned people get scourged for saying things; I don't mean my comment that way. At the same time, civility means more than personal politeness. It includes respect for others different from oneself. That is a profound thing with many levels, each of which challenges us to awareness. Every one of us has these challenges, of course. They're just easier to see in somebody else's case.
Overall though, my experiences have been extremely positive. I literally wake up every day happier than the last, all because I live here now. They are happy and friendly people. They are young and full of energy. If I can help it, I'll never go back to San Francisco. Vietnam certainly isn't perfect by a long shot, but I love it anyway.
You need to bathe in youth from time to time in order to experience it - it's fantastic.
Of course you need to keep up to date, try to use your wisdom to understand which technology/language is going to survive the test of time.
For example, C/C++ is going to stick around for a while; make sure you're up to date (C++ 14 and C++ 17).
Broaden your horizon - read poetry, listen to all kinds of new music, watch experimental movies, travel around, talk to foreigners, eat weird food.
Study physics and philosophy, psychology and economics.
Have lots of sex - your wife will love you again :)
You have kids ? Great! Learn from them - everything. Try to teach them what they study at school - see if you can figure out a better explanation. Notice how much new stuff you learn about the subject, about yourself and your kid!
We're all getting old(er) every day - as we age this process seems to accelerate - and one day we will be no more.
But inside us lives the kid, the 20-year old, the 30-year old. It's still there, it can still be crazy and fun, we just need to remember to go on a date with our younger selves. All the rest will follow.
At least that's what I'm telling myself :)
I usually use statistics. If you select a random point in the lifetime of something, there is a 50% chance that you are closer to the middle than to either the start or the end. Thus: always assume you are roughly in the middle of the lifetime. In other words, if some technology is only one year old, assume it will be dead in another year.
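That 50% figure is easy to sanity-check with a quick simulation (my own sketch): a uniformly random point in a lifetime is closer to the midpoint than to either end exactly when it falls in the middle half of the interval.

```python
import random

def middle_half_fraction(trials=100_000, seed=42):
    """Estimate the chance that a uniformly random point in a
    lifetime [0, 1] lies closer to the midpoint than to either end,
    i.e. falls inside the middle half [0.25, 0.75]."""
    rng = random.Random(seed)
    hits = sum(0.25 <= rng.random() <= 0.75 for _ in range(trials))
    return hits / trials
```

The estimate comes out at about 0.5, matching the claim: observe a technology at a random moment of its life, and the middle half of that life is the single most likely place to be.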
When I don't have good insight into whether it's worth my time to learn some new tech, I'm going to try to apply this rule.
Not in my experience
... oh, you meant with her.
Wanted to call that one out, don't think I've thought about it before, but it's good advice.
Unless she finds out.
In software engineering roles at big/desirable/fast-growing companies, the interview process favors faster (by definition, younger) minds. Both young and old are put through the same or similar coding interviews at many of these places, and often the faster coders are younger, and get the job.
You can't fix ageism without fixing the interview process. Being jovial, healthy, nice and culturally sensitive are necessary and useful things to keep your job after you join, but the gatekeeping itself is biased on the other side, which reduces the intake to a trickle.
In my experience older people get cut out based on not being "a good cultural fit". This has been discussed ad nauseam on Hacker News, because "cultural fit" leads to all kinds of discrimination: racial, gender, age, etc.
Every job I've had, we put a person through a series of interviews, then we have a group meeting and we vote. There is no quantifiable evidence that this person actually interviewed the best. It comes down to how people feel in a room. That is the issue, not the speed at which a person can give answers. I've seen people voted down based on all kinds of illegitimate reasons, and with age I think it came down to fear in some cases. A lot of software teams don't want to hire the best person they can find. They want to hire someone who is pretty OK, but will also make them look good. Yes, sometimes people don't get the job because they are too good. Am I going to hire someone who makes me look like an under-performer, or who could get promoted before me? Bingo: bad cultural fit.
Algo-on-the-whiteboard interviews favour people who have recently been cramming for their final-year CS exams - by SHEER COINCIDENCE they happen to be in their early 20s...
I know some North American universities have adapted to the practice and are now preparing students, but I assume this is relatively new. My algo and DS classes weren't about cramming at all.
Now to be fair, reasonable companies will focus on higher-level, systems design and architecture type of questions when interviewing seasoned engineers. Or at least, they really should.
And how isn't that a terrible metric? Most of my current coworkers would fail as they simply don't have the time.
It's often so broad that to really cover everything that might come up you've got to have time to make the studying a part-time job.
And even then you might get hit with one of those "you almost have to have seen the trick before" questions, like detecting a cycle in a broken linked-list with O(1) memory.
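For the curious, the standard answer to that one is Floyd's tortoise-and-hare algorithm, which really is the kind of trick you almost have to have seen before to produce under interview pressure:

```python
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def has_cycle(head):
    """Floyd's tortoise-and-hare: advance one pointer by 1 node and
    another by 2. If the list loops back on itself, the fast pointer
    eventually laps the slow one and they meet; otherwise the fast
    pointer falls off the end. Uses O(1) extra memory."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

Obvious once you've seen it, nearly impossible to derive on the spot - which is exactly the complaint about these questions.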
The only reason is that you want to build an environment where the work is secondary, such as wanting to hire a bunch of bros to go drinking with and help you spend all that sweet VC cash...
Meanwhile, experienced developers have their brains tuned towards on-the-job skills that are harder to cram (like an instinct for edge-cases) while "shelving" the stuff that you don't need.
If there is no one with 20 years of dev experience within the company, there is no one to speak up for this, and the cycle continues.
Whether this translates into a perceptible disadvantage in code tests is less likely to have been tested, but given what we know, and given the other arguments above, it is quite possible and indeed likely.
What you're definitely not looking for is a candidate that hacks some solution together quickly but with badly structured code, makes a lot of unconfirmed assumptions and doesn't listen to advice.
Even before that is the technical screening process. If the company has standardized on Angular 2, your experience with Dojo, Ext-JS, jQuery and even Angular 1 is considered irrelevant by the screeners - you may as well have experience in medieval basket weaving; they won't even call you. If the company has standardized on Groovy, your experience with Java is equally irrelevant. If the company has standardized on MySQL, your experience with Oracle is irrelevant. If the company has standardized on Linux, your experience with Solaris is irrelevant. And on and on it goes...
So we have this environment where the hiring managers are shooting themselves in the foot by looking for style over substance and anybody who tries to bring it up is dismissed as a dinosaur with a case of sour grapes.
I had an interview at Google a couple months ago and noticed that most people were pretty young. When I asked the person who was in charge of taking me to lunch about this, he said that it's probably because there are just many more CS graduates now than there were before, and that Google would very much like to hire senior people as well, but they're much harder to find.
I wonder how much of what he said is true vs ageism.
On the other hand, I wonder how much of a natural bias there is against older people if they have to go through the same interview process, because it felt like a mental marathon to me. Although the interview only lasts a day, it took a couple of days for me to recover.
> I wonder how much of what he said is true vs ageism.
Of course they want to hire more senior engineers now, it would likely help win their court battle over it.
You know the gazillions of frameworks for achieving everything these days, their integration tricks, the various app servers, CI toolsets, and so on and on?
I mean, if you are senior in something, are you also senior in language XXX, meaning I give you a spec, we talk, and you deliver a proper maintainable solution, leading the dev team and managing all the issues and bumps along the road? If not, then you come in as the junior I described; nobody has time to babysit you for weeks or months, and you are not willing to take a junior salary. But that's what you are to the company.
Even when I was back in college many years ago, only the first class taught a language. From that point forward the teacher of each class said we're using language X and suggested a book if you needed help learning.
Sure, it's true that many things about good design are language agnostic. On the other hand, frameworks and languages can actually limit or enable what you can do, and lack of familiarity with them has the potential to lead to mistakes.
My reasoning for this is that we are always learning new languages and using new frameworks. Why would I let a better person go when the language is probably going to change, or worst case they pick it up in a couple weeks just by looking at the existing code base? One case where I would deviate a bit is if I was hiring for a functional programming position. In that case I would prefer experience with some functional language, but that is not much different than wanting OO experience for a Java/C# position.
 source: I work for Google.
That said, the OP is correct. You're only as good as your last two years and even that's pushing it. If the tech changes, you have to adapt with it.
I'm 53 as of yesterday (the 8th). I started with PDP-11's in the 80's, then VAX's, then PC's, BASIC at first, then C, then Visual Basic, then ASP, then C#/ASP.NET, and now I'm deep into AWS (Lambda, DynamoDB, Redshift), NodeJS, AngularJS 1.x/2, ReactJS, and I'm still learning new technology all the time.
A lot of developers will transition to management and it's on my mind, but I'm also still drawn to solving problems at a code level. And there's always new toys to play with like Angular and React. Now we have .NET Core and all of its interesting avenues.
If you actually care about being a good developer, you will continue to work.
As long as there are jobs. Nothing will help you if the job market contracts. Then I do believe hiring becomes age-oriented, with us older devs labeled "over-qualified".
The fact that people are still claiming there is a shortage of tech workers demonstrates that that isn't true. The industry wants young, cheap workers who come ready-made with the trendiest skills, then it wants to ditch them rather than retraining them or allowing them to accrue seniority, and hire new ones who will work 80 hours a week for free soda and "stock options"...
You'd be surprised. The original post (TFA) gives the median age in major companies for example and it doesn't work like that there.
I could go on and on about the stuff I used to do that I don't do anymore.
It's not that those things don't have their place, or that all young programmers are guilty of premature optimizations, but there's no difference in terms of productivity between doing nothing and doing things that don't matter at all. It took me a while to learn that, and I still catch myself wasting time.
Either in your current role / in interviews / on the street?
I'm not afraid to generalize this either. I see so many people and organizations repeating the same mistakes, their own and others', over and over. I envision an organization built around the idea of learning from others' experience. It's a near-mythical creature, but it could run circles around its peers, who would doggedly pursue finding out its "secret" and promptly discount it when they hear "learn from others' experience / history."
This turned into a mini-rant, but it's because I am incredulous that you and others like you have learned so much yet the potential of that knowledge is so rarely tapped into.
(1)“Fools learn from experience. I prefer to learn from the experience of others.” ― Otto von Bismarck
(2) Why Don't We Learn from History by B.H. Liddell Hart
YMMV, caveat, caveat, caveat, etc, roughly speaking, &c
Strategy is more about planning, forethought, and knowing ahead of time which avenues are worth pursuing and which aren't. As a very simplistic example, a tactician might spend a day down a rabbit-hole and improve a db's performance by a small but significant amount, and a strategist would not have spent that day, knowing that that db is going to be turned off in two weeks. Not the best example, really (and it implies that you're only one or the other, which isn't true at all). There is also overlap between the two.
Perhaps a military analogue - tactics is about how to take the forts (shorter timeframes, clearer objectives, obvious goals), and strategy is about determining which forts are worth taking (long-term timeframes, murkier objectives, sometimes unclear goals, concern for secondary effects). The difference is also in scope, I guess. Being a good tactician benefits from energy and focused interest, something which the young'uns tend to have more of (caveat, caveat), and strategy benefits from forethought and experience, something which the old'uns tend to have more of (caveat, caveat, &c)
Strategy is more about planning, architecture, knowing about the real pros and cons of different solutions eg. libs, tools or languages.
So debugging, no, definitely not. I can spot root-causes of bugs now even easier than I used to be able to.
And one of the big new skills is that I'm much more able to 'guess' where a bug is in someone else's totally new code-base than I previously was able to.
In fact, I also write code faster as I get the general gist of it done much faster first time.
For instance, I used to be able to perftune UNIX and DB (mostly Oracle on Solaris) pretty well, knowing various kernel parameters by heart and all the tools of the day (tkprof, Cockroft and co). Now I guess I could still get by after some brush-up, but I'm not anywhere near as efficient as I used to be on that specific task. And I still write code, but I don't know every single method of every API anymore. But now, I can design full solutions, from choosing the hardware to setting up platform, languages, monitoring, high-availability, backups, security etc. Not because I'm smarter, but because I've been exposed to all that along the years.
I also think that the Zuckerberg quote also points to the startup mentality. They want 22 year olds because you can get them to work 100 hour weeks against the promise of an IPO, not because they are "smarter". Hard to convince a 40 year old with two kids to do the same. There has already been a shift to more traditional CS companies (IBM, etc.) as the lack of IPOs has soured some people on the startup dream of being employee #3 at the next Uber of XXX.
I was a young person once (no, seriously!). I _thought_ I was learning things fast back then. When I got older, I realized I was trading speed for depth.
1. Learn faster (better memory)
2. Can keep their attention focused longer.
> Learn faster (better memory)
Younger and quicker developers took a lot longer over it and to find their feet because there were many new concepts there for them.
As you get older, there's less you haven't already learned.
(Beginner's All-purpose Symbolic Instruction Code... which, now I type it out, obviously cheats. From here on, it shall be known as BAPSIC.)
If the friend mentioned lost or left his current job, he'd probably (?) struggle to find another. It wouldn't be nasty ageism, it'd be "sorry all your experience is with a technology irrelevant to us; that we haven't even heard of".
I say this going on the article alone - of course, for all I know the guy's an avid Ruby/Node/Elixir/whatever's-hot user for a range of awesome side-projects.
> I could tell you about all my accomplishments over three decades, such as replacing the use of a System/3 punch card system with the AS/400, writing a Cobol debugger, or…. Ah, I’m boring you. What you do care about are things I did in the last two years.
RPG is only relevant today in the same way Cobol is.
That's actually his point.
Someone who has seen a technology mature and develop over decades, seen the hype trains come and go, projects succeed and fail, might be a bit wiser than someone fresh out of Stanford. He might have some ideas about how to build maintainable systems after working on some that are older than most developers.
If you don't see it, that's on you, not on the author for mentioning his particular "archaic" specialization.
The good part was that I was able to build a very successful career while not having to suffer nearly as much pain as many of my contemporaries. The bad part is that now it's hard to find people to collaborate with. :-(
Then why don't people build amazing and popular things with it? I don't mean one or two people build one or two things, but lots of people building lots of things.
Yet it's "one of the best choices".
I simply don't believe it. Any perceived advantages it has must, in practice, be a wash.
Meanwhile, the Java or Go programmer is thinking "this is boring and ugly. Let me finish this problem as quickly as I can so I can go home." So they finish the problem as quickly as they can, and then it's done, and it ships. And people have an incentive to use it rather than tinker with it because nobody really wants to peek under the hood, and the people who do peek under the hood tend to be really dedicated and care a lot about the problem domain because why else would you put up with the language?
Sure it can. It is possible for both to be true, as long as developers don't pick languages rationally, which they probably don't.
There are a few biases at play. One is that people are only exposed to a subset of the languages that exist, which are those that are either used in industry, or are making the rounds in news. Another bias is that we like to pick languages based on familiarity. For example most of my college courses used imperative languages: Python, C++, and Java. From that experience I am quicker at thinking in terms of for loops than folds and maps. So when I try a language like Haskell or LISPs I think "That's neat!" but when faced with a deadline I switch back to something closer to my first languages.
It could be that I am the only one that thinks like this, and everyone else sits down and spends equal time on every language in existence, but I doubt it.
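The for-loops-before-folds habit described above is easy to see side by side; here's a minimal Python sketch (the list and variable names are made up purely for illustration):

```python
from functools import reduce

numbers = [3, 1, 4, 1, 5]

# Imperative habit: accumulate with an explicit loop.
total_loop = 0
for n in numbers:
    total_loop += n

# Functional habit: express the same accumulation as a fold.
total_fold = reduce(lambda acc, n: acc + n, numbers, 0)

assert total_loop == total_fold == 14
```

Both compute the same thing; which one you reach for first under deadline pressure is mostly a matter of which style your early training burned in.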
In any case at one point LISPs were popular, so why did they lose all of their momentum?
"I chose the language which exists" doesn't apply to the people in 1994 who could have had /thirty-five years/ of LISP experience (potentially) and yet still chose to write another language, in another language. And when Brendan Eich was in college around 1981, the tutors there could have had /twenty years/ of LISP experience, but weren't there to convince him that it was amazing.
Same with literally any language dating post-LISP, and I note that that covers most languages which are popular today.
"Programmers don't pick rationally" is fine on the small scale, but accross the entire industry, even among people who do love exploring programming languages, even among young entrepreneurial risk takers, even among companies in tough markets angling for any and every edge they can get over their competitors, in decade after decade, over problem domain after problem domain after problem domain, there is this empty howling wasteland of happy and productive people using not-LISP, writing world-conquering systems that do just-fine-thanks and the claimed benefits of LISP just don't seem to be making any noticable dent in anything.
Therefore, they are over-hyped.
I met John McCarthy in 1977. In college, a professor (Ruth Davis, I think; SCU EECS department) brought in someone who taught Friedman's "The Little LISPer". By the time I got to Netscape, I had read SICP. I knew enough about LISP to be dangerous.
But as I've said many times, and Netscape principals have confirmed, the reason JS isn't Scheme is because Netscape did the Java deal with Sun by the time I hired on (after I was recruited with "come and do Scheme in the browser"), and that meant the "sidekick language" had to "look like Java".
There was no "chose to write". Netscape management gave the MILLJ order; Netscape source was C (mostly), JS ("Mocha") was supposed to go server-side via "LiveWire" as well as embed in Navigator, and I was a C hacker. These are the reasons for the new language and its first implementation language.
Where's the fun in that? ;) No, but really - I apologize for claiming there wasn't enough general LISP enthusiasm around when you were at college to affect you, and for not actually looking into the history of JS before making statements about how/why it happened.
JS design being partly a pre-made business decision - that's something I really should have known by now.
I was the lead engineer on the first release of AdWords. I wanted to write it in Lisp, but I was not allowed to, being forced against my strenuous objections to do it in Java. So now AdWords is a Java success story rather than the Lisp success story it might have been, not because Java is better (it certainly wasn't -- it was a disaster) but because my boss issued an edict.
This has happened to me many times in my career. The one time I was actually allowed to use Lisp in a project that I did not have direct control over (the DS1 remote agent) it was an overwhelming technical success. In fact, management tried and failed to get the software rewritten in C++, so this was actually a controlled experiment.
But in fact I'm not sure that we see more Lisp use in startups, either. That could be because they aren't taught it in school. (Many startup founders haven't had the chance to pick it up on the job. For that matter, many programmers haven't had the chance to pick it up on the job.)
On the controlled experiment: Why was that? I mean, it has to be possible to rewrite anything in C++ (Turing complete, and all that - but possibly at the price of Greenspun's Tenth Law). Was it because the people who tried didn't know anything about Lisp? (Were they working from the source, or from the spec, or from the program documentation?) Were they just not as good programmers as the Lisp programmers (and don't say "if they were as good, they would have been using Lisp" - that wasn't their assignment). Did they have less time, less resources? Why didn't/couldn't they do it?
As to why the re-implementation of the Remote Agent (it wasn't the whole thing BTW, just the planner) in C++ failed, it was a combination of factors. This was twenty years ago (holy cow!) and C++ compilers were nowhere near as mature then as they are now. There was only one compiler available for the flight hardware, and it was pretty unstable. But mainly it was Greenspun's tenth: the application did a lot of dynamic allocation (it was an AI search algorithm), which is something C++ is particularly not well suited for. They essentially had to re-implement Lisp in C++, and that was hard. Of course it would have been possible given more time, but we didn't have more time. The Lisp code was working, so that's what we flew.
* Google actively develops one of the Lisp compilers. Google Flights powers Orbitz, Kayak, etc. That's Lisp.
* There are several Lisp compilers in active open source development.
* There's a graph database written in Lisp called AllegroCache. It's good enough to support a business (Franz) for more than a decade.
* Another company (LispWorks) also exists and has a large portfolio of clients.
* Lisp has been used to make entire operating systems. Ones of the past, and ones of now. (Of course, an OS needs a community. But where are real OS's with GUIs in other languages?)
* Lisp has been successfully used in my own career for embedded systems to control satellite acquisition systems to, most recently, quantum computing. (At real companies.)
Just because there's not this huge buzz around Lisp doesn't mean no one is using it.
There's this thing called Windows you may have heard of...
And, out of curiosity, what OS are you referring to?
> Just because there's not this huge buzz around Lisp doesn't mean no one is using it.
jodrellblank's claim was not that absolutely nobody is using it. The claim was, compared to how wonderful Lisp advocates claim the language is, relatively nobody is using it. If you compare the amount of software written in Lisp to the total amount of software written, and compare that to how wonderful Lisp is claimed to be, jodrellblank has a real point. And citing a handful (or several handfuls) of counter-examples does not refute the point at all.
It wasn't even worth them spending 1 million dollars - 2% of that price - on training people.
If that were the case, I would still expect to see lots of gushing blogs of leaks from inside hush-hush companies and people desperate to learn LISP posting "how do I replace strings in files in Common LISP" on programming forums, and "I learned LISP and doubled my salary" on Twitter.
I wasn't saying that no one is using it. I was saying "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it".
Your points about your career are pretty interesting.
Instead of using/enhancing existing tools, there is constant pressure to develop new stuff. Or take Apple with Swift: instead of using Scala, Standard ML, OCaml, F# or Haskell, they developed a new statically-typed functional language.
It's the NIH syndrome at work. Everywhere. But it's also that tools are complex to learn, so people start new with simpler tools, they grow over time and after some time they are replaced with other stuff. If something gets updated in some incompatible ways, it already causes problems: some users are lost, some users will only use the old stuff, some only the latest stuff and some will try to use multiple versions. See Python.
Full Common Lisp is just too complex for most developers, but it has a life in many specialised and niche applications: CAD, some AI tools, music, robots (like the Roomba), planning/scheduling (crews, telescopes, ...), Expert Systems, verification of software and hardware, some maths stuff, ...
Since Lisp is only left being taught at a few universities, there are not many people able to develop with it. Even when it was taught, it was often only used to teach concepts like recursion and not programming. The younger Lisp programmers found it by themselves.
'Industry' often has no interest in diversifying its programming tools. Many enterprise software shops currently (still) use Java: standardised, broad industry support, ... You won't successfully propose that they use Lisp, even if the application would be better in some way. For example, if the project fails, it certainly wasn't Java's fault, because all the others are using it too. If you used Lisp and the project failed, it would be Lisp's problem: not enough people, little architecture experience, tools not broad enough, integration story too weak, etc. Even if it were successful and in production, there would be a lot of pressure to rewrite it in some industry standard in the next product iteration.
> "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it"
There is no general 'best'. It's all relative to a domain, community, demands, legacy, fashion/hype, ...
Lisp is not more dead than usual. Yesterday a donation effort was started for the Quicklisp library manager, and it's now at $16,606.37.
One issue that (good) bosses have is that they cannot allow their business to be held hostage by one person. If that person quits (or dies), they have to be able to replace them. Esolangs are a hard sell on that basis alone, no matter how fit for use they may be.
I'd forgotten Clojure (and others!) - yes, Lisp is more popular than "things named Lisp". And, arguably, Clojure is popular enough these days that the argument in the previous paragraph doesn't really apply to it any longer. (Whether bosses know that is a separate issue.)
Then there's the claim by Guy Steele that, in creating Java, he dragged a bunch of C++ programmers "halfway to Common Lisp". (Lisp purists might concede some fraction considerably less than half...) Paul Graham says that the Lisp feature set is slowly taking over programming languages. Lisp may die but still conquer, or mostly conquer.
But Java could not do a lot of things a typical Lisp implementation can do, sometimes with a good reason for that.
* runtime compilation
* loading of code (-> custom class loader)
* garbage collection of code is sometimes difficult
* saving/starting of memory dumps
* fast startup times
* tail call optimization
* updating changed objects
* calling error system -> Java has a terminating error system
But generally knowing Lisp does not help you much with Java, since the language's OOP model is very different from the typical OOP in Lisp (see CLOS+MOP).
This application is written in 7+ million lines of Common Lisp
The video shows how Eterna uses 'PTC Creo Elements/Direct' to develop their watches. There are many other clients of that in various domains.
(Although the fact that some people can make successful Common LISP software could just mean that those people are exceptional enough to make successful software in any language; it doesn't bolster the claim that 'Common LISP is one of the best choices for many applications' very strongly.)
If you would run a Cement plant, an oil pipeline or a coal terminal in South Africa, then you might also be using G2 to control and diagnose technical processes in real-time.
The product used, written in Lisp:
The reason is because "best" can have more than one meaning. Languages are usually only "best" at one thing, not for all things. In the case of many languages that people consider to be "best", they've been optimised for the development process (making it nice to write code in) rather than speed (C) or security (Ada) or a specific niche (R) or maintainability (not sure) or customisability (LISP). They're designed to make writing code more straightforward. Languages that are optimised to be easy to code with are the languages that tend to bubble up to the top when it comes to popularity contests because the more people who can access the language the more popular it's going to be.
There is an argument that Ruby is mostly LISP, so it could be said that lots of people use it very happily everyday.
Except, even people doing exploratory stuff apparently aren't using Common-LISP.
Let me point to Slava from RethinkDB, who wrote blogs about how amazing LISP is and then went to found a new-software startup written in C++. Peter Norvig, AI researcher, who moved from LISP to Python.
And even your mention of Ruby: if "Common LISP is the best language", then why is Ruby even needed? Why aren't those people happily using Common LISP? Simply because it wasn't actually any better, and was in fact worse than Ruby by whatever metrics.
How is that? I thought Lisp's "killer app" was its extremely powerful macroing system. Does Ruby have an equivalent?
Nowhere, that's where. Like in approximately all problem domains, whatever perceived benefits LISP has, doesn't seem to make it all that desirable.
If you choose LISP, you're actually going to have to expend a little bit of effort on identifying, recruiting, and retaining actually competent people. That's a problem when you're looking to put butts-in-seats (as is the case in my industry: defense).
In twenty years, Common-LISP with all the advantage it supposedly has couldn't make anything that a 'code monkey' could use? Why not? Is it awful at writing tools?
Or is it that the alleged advantages of Common-LISP don't really make any difference in reality, because they are over-hyped or even non-existent?
Note that I'm not saying "why did they pick one language over another", I'm saying "Common LISP can be the best tool for the job - yet they waited twenty years to use a 'worse' tool to implement a 'worse' tool". That doesn't make sense.
Really, if PG's "LISP, how to win big" was as compelling and important as he tried to state in that essay, every ycombinator company would be using LISP for their secret advantage over other companies.
Applicants would choose LISP because it's "better", LISP-based applications would be preferred because the applicants have an advantage, existing companies would be encouraged to LISP because it would help them do more for less employee numbers and YC would become a LISPy community of alumni.
I have no proof that this is not happening right now, but would you bet on it happening right now in secret?
If you will excuse a moment of cheekiness...
Could be that you have cause and effect backwards here: because you are grumpy you are dismissing 99% of other language developers' work as rubbish. Could be that it's less than 99% and you are overlooking some great ideas.
Was there something in particular that you had in mind that you think I may have missed?
Confirmation bias makes it easy to bind those variables to values that make one's own favorite language obviously the best and everybody else's infuriatingly, irrationally terrible. If unchecked then this leads to wildly false conclusions, such as that languages fit into a hierarchy of powerfulness ("Blub Paradox.")
This isn't a challenge, I'm genuinely interested in your answer.
there is the famous example of the reddit founders, who believed pg's lisp story, and built the first version of their site with it. it went so badly for them that they had to start over again in python.
... but i bet you are going to have a very plausible-sounding reason why it didn't work for them.
plenty of other tech has come up from nothing in the last few decades, to wide adoption, and big successes. the fact that lisp hasn't is, in my mind, prima facie evidence that it is not nearly as great as its proponents claim.
There are, but you are ignoring them: either you simply did not do any research, or the work is simply hidden.
The success of Common Lisp today is relatively small, but a careful reader could find a few interesting applications of it, like the scheduling system for the Hubble Space Telescope, the design software for the Boeing 747 (and other aircraft), the software for the Roomba, the software for D-Wave's quantum computer, the crew scheduling software for the London subway, chip design verification software at Intel (and other chip companies), ...
There are some old application platforms which survived. For example Cyc, an attempt to provide common sense AI to computers, is under continuous development since the mid 80s. The company Cycorp has 50+ employees, is very secretive and you need to guess who pays for it. Customers are among others the United States Department of Defense, DARPA, NSA, and CIA. They are using it for various applications.
Note also that prototyping software was for a long time an application area for Lisp. Have relatively small teams develop a prototype and make it a product once the idea is validated. Example: Patrick Dussud wrote the core of the first Microsoft CLR (.Net Common Language Runtime) garbage collector in Lisp. The code was then automatically translated to C (IIRC) and enhanced from there after some time. Lisp now is no longer used and the GC has a lot of new features, but the first working versions came from that Lisp code.
Not necessarily. There are other possible explanations of Lisp's relative lack of commercial success, not least of which is the fact that a widespread belief that "there must be something wrong with it because no one uses it" can become (and I think has become) a self-fulfilling prophecy.
But another important factor is that the Lisp community seems to attract people who are really good at tech but really bad at business. I think if someone (or, more likely, some pair of co-founders) could bridge that gap they could still kick some serious ass.
These days I am developing such software with LuaJIT and that is working much better for me than either C or Lisp.
One thing I learned along the way is that many tales of Lisp heroism are actually anti-patterns. Once upon a time when I read about ITA Software rewriting CONS to better suit their application I thought it was impressive; now I see it as a farcical workaround for having chosen an ill-suited runtime system and sticking with it (and generally an indictment of Lisp not providing a practical performance model for the heap.)
Lispers are too expert at spinning bugs as features. "It's insanely complex, every line could be an interaction with undefined behavior or a race condition or an unexpected heap allocation" becomes "suitable only in the hands of trained specialists, like a chef's knife or a surgeon's scalpel or a Jedi's light saber."
I feel like we need to have a shared "our emperor didn't have any clothes" moment with regards to Paul Graham's essays.
(I say this as somebody who does love Lisp and will probably do a lot more Lisp work in the future but only on a project that is a peculiarly good fit.)
This is the TXR Lisp interactive listener of TXR 162.
Use the :quit command or type Ctrl-D on empty line to exit.
1> (defstruct integers ()
     val
     next
     (:postinit (me)
       (set me.next (lnew integers val (succ me.val))))
     (:method print (me stream pretty-p)
       (format stream "#<integers ~a ...>" me.val)))
2> (lnew integers val 0)
#<integers 0 ...>
3> *2.next
#<integers 1 ...>
4> *2.next.next
#<integers 2 ...>
5> *2.next.next.next
#<integers 3 ...>
When I don't want an expression evaluated in (what looks like) a function call, I can, firstly, make that a macro.
If I really want lazy semantics, I can have a decent vocabulary of lazy constructs that fit into the eager language. For instance for making objects lazily I have lnew, distinct from new.
Implicit laziness everywhere is academically stupid. You're drowning the execution of the code in an ocean of thunks and closures.
The pragmatic approach is a compromise: keep everything explicit and visible, yet syntactically tidy and convenient.
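The explicit, opt-in laziness described above (lnew as distinct from new) can be sketched in any eager language. Here's a hypothetical Python version using a memoized thunk; the delay helper and the integers structure are my own names for illustration, not part of TXR:

```python
def delay(compute):
    """Wrap a computation in a memoized thunk: nothing runs until forced."""
    cache = []
    def force():
        if not cache:
            cache.append(compute())
        return cache[0]
    return force

def integers(val):
    # An "infinite" chain of integers, built lazily: each node's successor
    # is only constructed when someone explicitly forces it.
    return {"val": val, "next": delay(lambda: integers(val + 1))}

n0 = integers(0)
n2 = n0["next"]()["next"]()   # force exactly two successors, no more
assert n2["val"] == 2
```

The point of the explicit style is visible right in the call sites: laziness happens only where a thunk is forced, not invisibly everywhere.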
Modularity; see the stone age paper discussed yesterday: https://news.ycombinator.com/item?id=13129540
> Functional programming languages provide two new kinds of glue - higher-order functions and lazy evaluation. Using these glues one can modularise programs in new and exciting ways, and we’ve shown many examples of this.
> This paper provides further evidence that lazy evaluation is too important to be relegated to second-class citizenship. It is perhaps the most powerful glue functional programmers possess.
It argues that you can achieve a certain useful separation between programs when one produces data for the other.
This can be achieved in a very satisfactory way with explicit streams (i.e. lazy lists). It can also be achieved with delimited continuations, coroutines, threads and often with lexical closures. Not to mention Icon-style generators.
Lazy lists can be incorporated into the language so that their cells are first-class objects and substitute smoothly for regular eager cells. (Thank you, OOP.)
The paper is actually wrong there, because laziness alone will not provide the kind of separation in which g can begin executing and f then only executes when an item is required. Not for an arbitrary f! Suppose f traverses a graph structure recursively and yields some interesting items. Lazy eval alone isn't going to allow the f traversal to behave as a coroutine controlled by g, proceeding only as far as g continues to be interested in further items. The author is attributing to lazy evaluation magical powers that it doesn't have.
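For what it's worth, the coroutine-style control being discussed here (f's recursive traversal advancing only as far as g pulls items) is exactly what generator mechanisms provide; a small Python sketch with a made-up tree:

```python
def walk(node):
    """Recursively yield items from a tree. Execution suspends at each
    yield, so the traversal proceeds only as far as the consumer pulls."""
    yield node["value"]
    for child in node.get("children", []):
        yield from walk(child)

tree = {"value": 1, "children": [
    {"value": 2, "children": [{"value": 4}]},
    {"value": 3},
]}

gen = walk(tree)
first_two = [next(gen), next(gen)]   # consumer stops here; the rest of
                                     # the tree is never visited
assert first_two == [1, 2]
```

The suspension comes from the generator mechanism itself, not from lazy evaluation of expressions, which is the distinction the comment above is drawing.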
For the same reason you want automatic memory management: so you can fob off the job of figuring out where the thunks should go onto the compiler, just as you fob off the job of figuring out where the calls to malloc and free should go. At least that's the theory. It seems plausible to me. I think it's an open question whether my failure to grok Haskell is due to a problem with Haskell or the ossification of my brain.
Laziness has precise semantics that have to unfold properly, or else things don't work.
Delaying evaluation is not the same thing as delaying reclamation. They are opposite in a sense, because we only allow something to be reclaimed when it is "of no value".
To what degree do you use Lisp as an FP language? As a pure FP language? The forced purity may make Haskell a very different language.
And, to return to your original complaint: If you dislike new languages, I bet XML drives you straight up the wall...
It depends on what I'm doing, but I generally write in an OO style more than a functional style. Real problems have state.
> If you dislike new languages
> I bet XML drives you straight up the wall
Kind of, but not really. Yes, I dislike XML because it is nothing but S-expressions with a more complicated syntax. But it doesn't drive me up a wall because when I need to deal with XML I just parse it into S-exprs, do what I need to do, and render the results back into XML.
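That round-trip workflow is easy to sketch. A minimal, assumption-laden version in Python using the standard `xml.etree.ElementTree` parser (the `to_sexpr` helper is hypothetical, ignores tail text, and keeps only tag, attributes, stripped text, and children):

```python
import xml.etree.ElementTree as ET

def to_sexpr(elem):
    """Render an XML element as a nested (tag, attrs, text?, children...) tuple."""
    text = [elem.text.strip()] if elem.text and elem.text.strip() else []
    return (elem.tag, dict(elem.attrib), *text, *(to_sexpr(c) for c in elem))

tree = ET.fromstring('<a x="1"><b>hi</b></a>')
assert to_sexpr(tree) == ('a', {'x': '1'}, ('b', {}, 'hi'))
```

Once the document is nested tuples, the "do what I need to do" step is ordinary list processing, and rendering back out is the mirror-image recursion.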
In XML I have no way to place "255" and "FF" in such a way that XML understands them to be the same object, of integer type.
Haskell doesn't have state, but you have multiple models to choose from, from simple folds to STM or the State, Reader, or Writer monads, all of which serve different purposes and do different jobs well.
Haskell has no state support built in, it is all delegated to libraries.
C++ has excellent IO support, but the language doesn't embrace IO at all.
Haskell has excellent State support, but the language doesn't embrace State at all.
f(cout << x, cout << y)
Sequencing guarantees for expressions like that could be added to C++ (e.g. as a compiler extension). They would be straightforward to use; C++ doesn't inherently reject sequencing and state the way Haskell and its ilk do.
h x = factorial x
g x = 0
Someone who understands where that support is and how to use it should rewrite atrocities like:
arr ^. ix 2
readArray arr 2
Clojure is doing OK (better than Common Lisp), but not great (worse than Haskell and Emacs lisp).
That's a really nice web site. It took me a moment to realize it's showing quarterly data (sadly last updated in 2014), and that the ranking is based on the first metric ("number of active repositories").
"But no one knows it..."
It'll be really hard to work with a 23 year old with that attitude too.