I don't want to start a flame war about diversity; I just want to get the facts straight.
Anyways the article as a whole is heavy on anecdote and low on data & facts. There is an emphasis on outcome metrics but the author doesn't bother to examine any data on productivity vs age.
As a female developer I really don't like the idea that I might be selected just to fill a quota.
> The organizers know how male and white the tech industry is, so they make a special effort to recruit a diverse speaker lineup.
Yes, the author was using social awareness to segue into ageism, but it is still factually incorrect (or at least highly misleading) to state that the industry is predominantly "white".
Asians, especially from India, have a disproportionate presence in technology jobs. It is no different from any other ethnic industry affiliation: the Irish were cops, Greeks own diners, and South Asians are huge in tech, with ethnic social networks that connect folks to jobs, just like any other tightly meshed ethnic community.
It's worth mentioning that for 2017 leadership, the Google numbers are significantly different: 26% Asian, 68% white (and 75% male). So the decision making is still quite white and male.
In case you missed the memo, the current cultural climate and trendy narrative structure demands that everything, no matter the topic, must include some reference to identity politics. Facts only interfere in this sort of narrative building, so please ignore your data and focus on emotional triggers instead.
Bonus points if it fits somehow into the very trendy culture war narrative.
My take is that older programmers who maintain their skills on new technology are worth their weight in gold. The financial trick is to pair them with younger developers to create an overall cost effective team. If both the older and younger devs go in with the right attitude, it can be wonderful for both the project and their respective careers.
I also think it's really, really dumb to push developers into management positions if they can't do the job. I'd much prefer putting younger managers with great management skills into the mix than just rotating older devs into that role.
Bottom line: software requires lifelong learning. If you can't or won't stay current, you're in trouble. If you can, then it's just ageism or a bad business model that's keeping you out of awesome projects.
A lot of older Python candidates are being passed over for younger ones. Sure, they might not be learning Rust, but the younger Python developers who are having an easier time getting a job aren't necessarily learning Rust either.
: https://stackoverflow.blog/2017/09/06/incredible-growth-pyth... https://insights.stackoverflow.com/trends?tags=python%2Cjava...
Does it make sense to consider Python an old language if it's still a living language? The next version is coming out in October, and most new projects started today are running on versions released within the last three years.
This seems pretty different than something like COBOL, where it's technically still being developed but not in anywhere near the same way.
Python's really good at binding C code, which is one of the reasons you see it in so many machine learning and scientific projects (NumPy, scikit-learn, TensorFlow, etc.). I think that's what's given Python a lot of legs in the last few years.
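For anyone who hasn't seen it, here's a minimal sketch of that C-binding trick using the stdlib's ctypes. Calling cos() from the system math library stands in for what NumPy and friends do at much larger scale via the C API; the library lookup below is POSIX-flavored and just illustrative:

    import ctypes
    import ctypes.util

    # Locate and load the C math library (POSIX; name varies per platform).
    libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

    # Declare the C signature: double cos(double)
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double

    print(libm.cos(0.0))  # 1.0, computed by the C library, not by Python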
Python, I'd argue, is very much a living language. It was introduced in 1990, so it's 29 years old. The last release was 5 days ago as I write this.
Java is also a living language. It was introduced in 1995, so it's 23 years old. The last release was on September 25.
COBOL was introduced in 1959, so it's 60 years old. The last stable release was 2014 (which is a lot newer than I would have expected). COBOL is a great object lesson, I think, as the parent comment calls out. I don't think many software engineers would consider it a "modern" language, but according to the Wikipedia article, as of 2012, 60% of businesses use COBOL. So it's far from a "dead" language.
Lisp was introduced in 1958, so it's 61 years old. I don't think many developers would consider Lisp dead, either.
COBOL's longevity seems to be tied to the fact that much of the legacy code encapsulates business logic, as opposed to UX logic (websites, mobile apps, etc.). There are more than a few lessons in there, I think...
Whereas COBOL today is pretty much the same pile of stuff it was in 1960-something. It's not a vibrant language family with newer members. COBOL does not absorb new concepts.
There are new, young people learning COBOL, but only in order to become maintainers of legacy systems: job security in banking, finance and government sectors. Maybe a few have a perverse curiosity: they want to know what is really behind the dirty five letter word.
New people coming to a Lisp are coming for the Lisp itself. There aren't that many Lisp jobs involving new or legacy coding; new Lisp people learn and make new stuff from scratch.
There is simply no comparison.
I've had managers who were both technical and people-oriented. They both bring a lot to the table. But the older I get, the more I value having a manager I can go to who is good at managing optics and helping me deal with political or institutional roadblocks. In an ideal world, I'd have both.
But fundamentally, if you can't move the decision making power/responsibility/(+ to some degree compensation) out to non-traditional locations in your org structure, you'll naturally have to move those people towards the more traditional locations....
> I also think it's really, really dumb to push developers into management positions if they can't do the job.
and also true of older programmers who maintain their skills in older technologies. See all the COBOL comments below. If history is any guide, competent Java programmers in 10 years will not be aged out, and will be compensated more than whoever is competent (regardless of age) at the current tech stack.
This has been known forever, yet it still happens - Why?
>> I'd much prefer putting younger managers with great management skills into the mix...
One reason is that it's impossible to tell if a young manager has great management skills. It's really hard to tell even if an experienced manager has strong skills.
As a result, technical proficiency is used as a proxy for management potential. It probably has a stronger correlation than picking a random person, but it's not great. The new manager is then really unhappy and either quits, is fired, or eventually learns the new skills and is successful.
The real issue is that management is a pyramid but IC is a ziggurat with, if you're lucky, an obelisk sitting on top of it.
People like to point to the principal staff engineer or whatever, but there's one of them for every twenty upper-mid level managers making the same money. And all those managers can take their skills to another company, but if an IC guru leaves the company where he made his bones he's probably destined for consulting (at best). Consulting has its upsides, but that's no more for everyone than management.
The tracks and levels are commonly referred to as the "job family architecture" -- don't be afraid to ask a recruiter for this information. Otherwise you'll never really know if you're being properly leveled. (For instance, I believe a Principal SE at Microsoft is roughly equal to a Staff Engineer at Google.)
>This has been known forever, yet it still happens - Why?
This problem exists in hardware departments too. My best guess is that they want managers familiar with the tech/industry/company. My second best guess is that it's a natural human tendency and hasn't been addressed systematically.
I do find that companies seem to know how to evaluate young programmers: just test them on their knowledge of your tech stack. Any young coder who is good and is working in that stack will know the basics. The bad ones won't.
For older coders, I'm not sure that same signal works. I've forgotten more frameworks than most coders will ever learn. I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.
I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new one.
But how do you evaluate a coder on their ability to solve problems? Much harder.
Even more difficult, how do you get a 24 year old coder who only knows React and Firebase to evaluate a 37 year old coder who has built web frameworks from scratch?
It's another world, and a lot of companies just don't bother.
Thankfully I have no desire to work at any of those companies. I solve problems. If a company can't hire for that I don't belong there. There are plenty of companies trying to hire people to solve actual problems, not just put coder butts in seats.
Yes, you can learn anything at a superficial level in a couple weeks. If you've spent the last ten years maintaining an ASP.NET WebForms app running on IIS, then within a couple of weeks, you can contribute code to a Go microservice running in Kubernetes and using Kafka.
And that kind of "basic competence, with the ability to learn more" is what hiring managers want out of a junior dev, so if you don't keep your skills up, well, great, you're as useful as a junior dev.
But what they want out of senior devs is people who know the ins and outs, who understand what to do and why to do it and can explain that to others, who can avoid the pitfalls of bad tech choices and misguided implementation patterns.
If you've got decades of experience on your resume, my guess is that you want to be evaluated as a senior dev. And if you're a senior dev who doesn't know modern tech, then... you're not that useful. Your encyclopedic knowledge of the WebForms event lifecycle doesn't give you any particular insights into the design challenges of a microservice ecosystem; your ability to tweak IIS for maximum performance won't help you write a Helm chart.
If you want to be hired as a senior dev, you need to understand how the modern world works, and all the posturing around "I could learn it in a week" or "I prefer to focus on problems" doesn't impress anyone.
So yes, it's helpful that people keep current, and obviously waiting for a new developer to get up to speed on the particular stack is always a cost (and it isn't a couple of weeks, for anyone). However, if what you really need is a senior person, it may well be worth that time.
This assumes the hiring manager understands how software is actually produced and maintained, not just what their current tech stack needs. It's surprising how often that isn't true, but a contributing factor is how mid-level managers are made - which is related to this discussion, I think.
You might be great at mentoring... but you have to know the stuff to mentor people in it. You might be great at communicating with stakeholders... but you have to understand the technical constraints of your system in detail to have that communication. You might understand principles of software design and debugging... but you still need to understand the particulars of these technologies to make your design or to focus your debugging.
And yeah, you can learn the tech stack, and a company that really needs a senior dev eventually could do okay to hire a senior dev with out of date skills, and wait for them to get trained up. But don't plan on it taking three weeks, is all I'm saying. To really get to that point of fluency and domain expertise takes longer.
(And also, one of the things that senior devs are supposed to do is evaluate new technologies and figure out when and where adopting them can make sense. If you've fallen so far behind on learning new tech, then it's not clear if this is something you're able to do -- maybe it is, and you just worked for a workplace that stifled adoption of new tech, or maybe you're someone who'll just learn what you need and then stop paying attention.)
That's mostly not true. While there definitely have been advances, tech is also very faddy and repeats itself. Trends are often reactive, techniques repetitive. Someone who has been around for long enough to see some of these waves resolve themselves is in a much better position to evaluate current and upcoming approaches. Technologists also tend to be myopic, and view the things happening in their own tiny slice of the industry as "what's happening". There is a whole universe of interesting stuff on the go that you are at best barely aware of.
Personally, if I'm hiring senior technical people I'm much more interested in their continuing level of curiosity and engagement with something, rather than particular stacks.
So maybe you are worried that your candidate who has been doing ASP.NET WebForms for the last decade on IIS will have a hard time getting up to speed on your use of TypeScript and React... but if I find out that she's also been building Raspberry Pi infrastructure for fun and messing around with the Julia language to implement a cellular automata project, I'm not all that worried about her ability to get up to speed, even though none of that stuff will get used by the team.
But yeah, it doesn't take 3 weeks. It doesn't actually take 3 weeks for a new dev already well versed in your technologies, with rare exceptions. At least not to be firing on all cylinders.
The thing is, a huge number of teams really, really need a good senior dev. They often don't think so, and often erroneously think they already have some... but they don't. Instead they have people who had "Senior" (or "Staff" or whatever) added to their title because they'd been there for a little while. And every piece of work the team produces suffers for it. So yeah, it's typically worth the investment.
Now granted, that assumes you are good at identifying talented people. This is actually a big ask - and one of the issues surrounding the OP's questions really comes back to that. I think that a lot of software and technology development suffers fairly systematically from poor quality middle management. One of the symptoms of this is the "ageism" discussed here, but it's only one of many....
then those skills are still tied to your technical knowledge.
Focusing the hiring too much on that first 30% impact gets you in trouble.
I know developers in their 30s/40s/50s who are both great developers/architects and have current skills. Why would I settle?
Two other points: First, of course when hiring you want to find the “perfect fit”, but you almost never do, so you have to make tradeoffs. Hiring an inexperienced person with the “right” tech stack when you really need to bring experience into your team is a common anti-pattern. Secondly, as a hiring manager you need to properly understand the difference between “10 years of experience” and “one year of experience, repeated ten times”.
So if you only know old tech, I'm a lot quicker to believe that you're a "repeated ten times" person.
Also, most coding is maintenance. So yes, an experienced developer in some language is really worth a lot when you initially start out designing your system. But most of the work spent on that system will be maintaining and extending it and to be able to proficiently do that you need to:
1. Grok the problem domain, so you can connect the real-world / problem you're trying to solve with the code in front of you
2. Have a grasp on the limitations of your own system, which will come with experience as you work longer on it
Those things don't really count as languages; they appear to be a mix of development environments, software packages and architectures. Learning these things is critical and hard, but there is a key mitigating factor: what really needs to be learned is 'how we do things around here', which is typically company specific and needs to be learned by a new hire anyway.
My experience suggests new hires in cognitive roles (at medium-large businesses) take 1-2 years to get to the point where they can be proactively useful, purely from learning who is who, what the corporate history is, what has been tried before, why things are what they are, and what their role's problem domain is really intended to be by their boss. Compared to that set-up time, the cost of knowing or not knowing a technology is quite small.
> This whole "learn a new one in a couple weeks" mindset is toxic and needs to die.
It is true though. Learning a new programming language to the point you can write bug-free code and execute someone else's design is really easy. Learning how to solve a new problem is typically the hard part.
I 'learned' Python in 2 weeks. What I write looks a lot like C code, and I didn't make great use of the standard libraries, but it works fine. Compared to that, learning how to use a large new library for a new problem typically takes months.
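To make the "Python that looks like C" point concrete, here's a toy before/after of my own (both work fine, which is the point):

    # C-style Python: index-driven loop, manual accumulation.
    def sum_evens_c_style(xs):
        total = 0
        i = 0
        while i < len(xs):
            if xs[i] % 2 == 0:
                total = total + xs[i]
            i = i + 1
        return total

    # Idiomatic Python: same behavior via the standard library.
    def sum_evens_pythonic(xs):
        return sum(x for x in xs if x % 2 == 0)

    assert sum_evens_c_style([1, 2, 3, 4]) == sum_evens_pythonic([1, 2, 3, 4]) == 6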
Not necessarily. Except for maybe Go, those are industry-standard technologies, so you could find someone who already knew them and could add value almost immediately.
We are very heavily invested in the AWS ecosystem. The difference between hiring someone who has the same amount of useful development experience but no AWS experience and someone who does and how fast they can hit the ground running is stark.
Once the language and tools aren't barriers anymore, the interesting parts - solving business problems and figuring out how to apply the tools - become the fun ones.
Once you've learned and used a half-dozen imperative languages day-to-day, there's very little new when picking up another imperative language. It's mostly mapping common concepts onto new syntax, plus some new features and maybe quirks/gotchas.
Silly little example: After an hour of reading a Go tutorial, and having never looked at the language before, I was able to diagnose a bug a co-worker had been struggling with for two hours, because I recognized the pattern from something similar in Django (a python framework). Now knowing what to look for, a quick Google search had the exact answer to the problem right in the first result.
I wouldn't be doing anything large or complex in that short a period, but absolutely can become useful with smaller stuff in that period.
And I say this with experience, thought in a different context: Years ago my manager introduced me to a new codebase by giving me a bug to fix. It was in a Django site, and I'd never done any python before. Took about half a day to figure out what was going on (for comparison, nowadays it would maybe have been 20-30 minutes), and a bit more to fix.
But I did figure it out on my own, taking some work off of another developer who was focusing on more complex work. I would call that "useful".
I have no idea why all of the old folks (again I am in my mid 40s) consider it some great skill that excuses them from actually keeping up with what’s going on in the industry.
* My bugfixing example was indeed trivial, just the first thing to pop into my mind. It was mainly aimed at the use of the word "useful", which is woefully ambiguous.
* Sounds like we agree when talking about someone who already knows the technology stack.
* No matter how skilled the developer, there will be ramp-up time when they not only don't know the framework, but also don't even know the language.
* That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.
> Anyone can pull up an IDE in almost any language, step through existing code, pull up a watch window and fix a bug. That’s something developers can do with three years of experience.
Thanks for the compliment I guess; I was describing something from around 3-4 months into my first job.
And if a company is using a popular language/framework they can find senior developers who know the language and the stack. Why should they hire someone who hasn’t taken the time to learn both?
> That ramp-up time being significantly shorter for a senior dev is what we're talking about, and how selecting for the specific stack like you're describing isn't necessarily going to be worthwhile - learning the domain is likely to take longer than learning the technology.
Not really. If someone has ever been exposed to a similar framework - take your typical MV* framework for example - even a developer who's only been active for three years can easily transition.
I have no idea why old folks (putting myself in that category) think they can get by without keeping up. Do you really think that someone who has 15 years of experience but hasn't kept up with the industry won't be at a disadvantage against someone with 5 who has? On the other end, you have people like me who know what it's like being behind the curve and vowed never to be in that situation again.
You can’t imagine how long it took me to retrain a 50 year old who had been doing ASP.Net web forms for years to develop in ASP.Net MVC using server side rendering, let alone client side frameworks.
He was only eight years older than I was, but didn’t keep current.
Moreover, if you're proud of how hard it is to learn ins and outs of your tech stack, you're likely using a shitty tech stack and costing your company tons of money.
But you can’t be proficient at an architectural level with a stack in two weeks no matter how good you are.
Would you say that you could be an iOS/Android developer in two weeks just because you picked up Swift/Java?
But to your actual comment: I'm learning new things all the time. My focus when I'm coding in my spare time is a simple IDE, some project management tools, and a simple game engine I am building. I learn new things in those projects literally every day.
I'm not going to do a bunch of hello world apps in random new "Pro" technologies. I did a lot of that in my 20s, it was fun and I was curious about them, but my interests have shifted.
There are enough jobs I can say "no" to the people who want only coders who are using the newest "Pro" toolset.
Honestly, a good hiring manager will realize you ideally want BOTH that person and people like me on a team. One person will know that Rails 7 solves our new problem, and the other will know that a simple refactor will save 30 hours of headscratching over the next year.
And this is where ageism rears its head.
If two people apply for the same position, and you're OK hiring someone who has only 2 years of experience for the position, don't ask for more from someone who's been working for 20 years.
Trust me: There are plenty of older programmers who do not expect a higher pay because of their age. In my career, whenever I hear "older people expect a higher salary", it has almost always been by employers/managers, not by actual older professionals. On the other hand, I've met quite a few programmers who complained they didn't get an offer or were told they were ruled out because they were "too senior", and the complaint almost always was along the lines of "I was not asked salary expectations".
I have another friend in his 40s who took a job that lets him work from home, and it pays less than he was making when we were working together.
I wrote one parameterized CloudFormation template that creates our build pipeline for any new service. Most of the time we can use the prebuilt Docker containers (maintained by AWS) for our build environment. We have one or two bespoke Docker containers for special snowflake builds.
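Roughly what that looks like driven from Python with boto3, if you're curious - the template filename, parameter names, and stack naming below are invented for illustration, not my actual setup:

    import boto3

    cfn = boto3.client("cloudformation")

    def create_pipeline_stack(service_name, repo_name):
        # One parameterized template, stamped out once per service.
        with open("pipeline-template.yaml") as f:
            template_body = f.read()
        resp = cfn.create_stack(
            StackName=service_name + "-pipeline",
            TemplateBody=template_body,
            Parameters=[
                {"ParameterKey": "ServiceName", "ParameterValue": service_name},
                {"ParameterKey": "RepositoryName", "ParameterValue": repo_name},
            ],
            Capabilities=["CAPABILITY_NAMED_IAM"],  # pipelines usually create IAM roles
        )
        return resp["StackId"]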
And if you haven’t read my other posts, and I get accused of being young and inexperienced, let me just say that my first “internet based application” used the Gopher protocol....
But we really don't care about the lock in boogeyman. Out of all of a typical business's risk, lock in is the least of them. We use lambda where feasible, Fargate (serverless Docker) when necessary. Why waste time doing the "undifferentiated heavy lifting"?
Why not? I’m older than you and I have passed plenty of technical interviews on the $latest_tech.
Sure you can learn a language in two weeks, but a language is far more than syntax. There is the ecosystem, frameworks, third party packages, etc.
And honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch than someone who used an existing one. Building frameworks rarely adds business value.
If I were interviewing such a candidate, I'd dig into why they built a custom framework. The answer to that question could be quite enlightening, especially if they mention business reasons for doing so, or if they talk about what they've learned since then.
When I started my own business a decade ago, I decided that existing web frameworks wouldn't suit my needs and built my own. Though I built a framework that suited my needs extremely well, it was a massive distraction from my actual business (which wasn't even tech). In interviews, I freely discuss how my NIH syndrome contributed to my inability to get the business off the ground; I've found that it tends to go over rather well with interviewers.
I go to work to be on a team and pull in one direction and support the leadership and make hard compromises.
I write code at home to understand where the industry will be in 10 years.
If they're over 45, then they likely worked in the late 90s and early 00s. If they were like many of us, then they worked on at least some web applications back then. This was far before jQuery and even Prototype.js on the frontend. Java, PHP and Perl were commonly used on the backend (or where I lived, more often IIS with asp). But the start of the career of someone in their 50s and even 45+ was before any commonly-used framework in open source scripting languages used for web development (unless you consider PHP to be a framework).
Even the Spring framework was only created in 2003 (I just checked to confirm). Symfony in PHP was 2005. And those took several years to catch on in their respective ecosystems.
Tons of companies had in-house frameworks back then. Who did you think wrote them?
Even if someone working back then never explicitly built a framework, they used design patterns that mirror today's frameworks. If not, they were writing spaghetti code (and to be fair, that was pretty common back then).
Based on your username and your assertion ("older than you"), you're what, 45? Have you forgotten what it was like back then?
But when he said he “made his own web frameworks”, I’m assuming he was talking about this decade.
I re-read the comment in question. Not sure why you would have assumed that. Someone who is now 37 likely started programming before the age of web frameworks.
I wouldn't call it a framework, because it's built of small independent pieces, but it's a similar scope to something like Rails.
I did start coding long before web frameworks existed though. :)
Granted, I hear what you're saying, but I'm not convinced your blanket dismissal is the right analysis without digging deeper on the individual details.
Agreed. However, if you're a software company you've already likely made all of those decisions. If I come work for you, I'm stuck with them. It doesn't matter if I like the package manager or the frameworks. That's just the background against which we work.
Our work is making our lives better, and solving the company's problems to the best of our ability, given all those constraints.
> honestly, I would be more concerned about hiring someone who built their own bespoke framework from scratch
Yes! I feel honestly quite nervous about sharing that aspect of my work, because I worry people will see me as out of touch.
But I am quite up front that I don't advocate using my toolkit in production environments. My professional advice is usually quite conservative. I tend to advocate for minimal disruption and using standard tools as they were meant to be used.
However, because I do that all day at work, I see all the cracks and warts in the architecture and when I go home I want to work on the future.
And even though you are correct "building a framework rarely adds business value"... that's not what motivates me. I do think there is value being created, and I will eventually be able to cash in on that. But I am doing it because I am interested in the future, and I want to work on tools that transcend the pointless busywork I have to deal with day to day, in the name of meeting quarterly goals (which TBF is crucially important in a business context).
I get direct enjoyment from working in my own little hobby world where those constraints don't exist, and that is enough motivation for me.
Let’s say you learn C# in “two weeks”. Is that going to be enough time to get proficient with ASP.Net MVC? Entity Framework? IIS? The built in logging framework? The DI framework? Middleware? What if they are using Xamarin for mobile? Let’s say you did pick up Xamarin (which I haven’t).
Would you know the fiddly bits about doing multithreaded processing with desktop apps using Windows Forms?
It's almost like you're pretending the years between 2008-2012 never happened, but I'm sure that's not the case.
I learned my lesson. For the last 10 years I’ve kept my resume competitive with whatever the $cool_kids technology is and haven’t had a problem competing in the market.
I didn’t play in the Java space back then but there were plenty of Java frameworks (https://dzone.com/articles/what-serverside-java-web-framewor...)
Ruby on Rails was first released in 2004, with 1.0 in 2005.
Knockout JS was introduced in 2010.
Django was released in 2005.
There was also Mojolicious for Perl, released in 2008.
The things I've seen. shudders
Hum... My experience is that whatever framework a business uses in any widespread manner, I always have to build another one on top, because generalist libraries completely suck for productivity. Unless you are a single-app shop, in which case you can just hard-code everything.
I don't get how improving development speed does not add business value.
If there is a problem, they have to wait for a response from the one guru who wrote it.
Most architects who think their problem is a special snowflake are usually wrong.
A custom domain for a business would be something more than logging though. For example, perhaps you want your logging to include some regulatory compliance hooks. Or maybe you want to do logging that also serves as a persistence layer for undo/redo and you need that to be tied deeply into your presentation layer.
In those cases it's much more than just "log this" that's being abstracted, it's all of the business needs around logging.
And the next developer can't get away from learning all that code. Either it's tied up as a nice internal company logging framework, or it's spread out as a bunch of concerns through the app. But either way, the newbie has to figure it out.
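A toy sketch of that "business needs around logging" idea in Python - the field names (user_id, regulation) are invented, but the shape is the point: every record carries the compliance context, and that behavior lives in one place instead of being scattered through the app:

    import logging

    class ComplianceAdapter(logging.LoggerAdapter):
        """Stamp regulatory context (who/what/under which rule) onto every record."""
        def process(self, msg, kwargs):
            prefix = " ".join("%s=%s" % (k, v) for k, v in self.extra.items())
            return "[%s] %s" % (prefix, msg), kwargs

    logging.basicConfig(level=logging.INFO)
    log = ComplianceAdapter(logging.getLogger("payments"),
                            {"user_id": "u123", "regulation": "SOX"})
    log.info("ledger entry created")
    # INFO:payments:[user_id=u123 regulation=SOX] ledger entry created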
Again, my experience is that the personalization layer is very easy for a senior developer to pick up; maybe there is some unconscious architectural decision that is biasing it, but I've never seen one have problems taking on my code. Junior developers, in turn, just can't grasp it, which I consider a feature; they are better off just using the functionality until they get some experience.
>I've forgotten more frameworks than most coders will ever learn.
Forgotten knowledge is worthless
> I mostly don't care about programming languages, they're roughly the same and I can learn a new one in a couple weeks.
I used to think that, and it's possibly true, but the platform that comes with the language certainly can't be learnt in a couple of weeks, so I don't think this is as true as it used to be.
> I know technologies like Rails and SQL, but I don't play with the new toys as much as I used to, because lately I would rather solve a new problem or build a new tool than learn a new one.
This is a real problem for you if companies are hiring for people that know the new toys.
> It's another world, and a lot of companies just don't bother...Thankfully I have no desire to work at any of those companies...There are plenty of companies trying to hire people to solve actual problems
That's great now, but in the future the job market might not be so tight and then people with your attitude might find it really tough.
You might be getting a mid-six-figure salary in your first job, but you probably ought to be socking it away because you can't expect raises.
Not really, you pick up something you forget much faster than something you've never seen before. It's all in there somewhere.
I do agree that testing for the ability to solve problems is hard and gets neglected.
To play devil's (or manager's) advocate - have you ever crunched the numbers on what it takes to pay you for a couple of weeks? That's not a trivial investment, especially considering the risk that you just thought you could learn a new language in a couple of weeks but were wrong.
(although I agree with the sibling comment — one doesn't really learn most languages in a few weeks)
Architecture and processes that were fine for one environment were horrible for the other.
In development terms, you can write FORTRAN in any language.
Let’s not talk about the one Java developer who thought he could pick up C in two or three weeks. It was ugly.
With an IQ test, the same way you evaluate anyone else on their ability to solve problems.
Better to hire for your particular skills. And younger is cheaper.
I think what's actually happening is some programmers, who happen to be older, don't keep up with their skills. Then when they interview, they bomb it, or talk about how they could do the same thing in an older stack.
That's great, but the company is really invested in the current / new stack, so it doesn't make sense to hire someone who does not have those skills. So they pass them up and find a candidate who knows the software the company is looking for.
But the older interview candidate, instead of reflecting on themselves and realizing they need to spend time enhancing their skills, simply says the company was "ageist" and blames it on age discrimination.
Blaming others is generally easier than blaming yourself, I guess.
On the other hand, ageism is real. When I was younger I was told more than once by managers, straight up, this candidate is too old so find an excuse not to hire. And now in my 40s, when I walk into some companies for an interview, I can feel the decision has been made before I even start talking. Not everywhere, not even most places, but it happens for sure and it's not even that subtle.
Making a real time video chat client is now literally a programming exercise; it used to be the domain of multi-million dollar venture backed startups that were valued in the billions.
Real time chat between users is now a feature that is added to an application in a day, or even a few hours if you follow any of the myriad tutorials available on YouTube.
Go from writing a UI in C/C++ to writing it in any of the higher level languages. Assuming you don't get caught in some design pattern trap that results in massive code bloat, it is now possible to do things in hours that used to take days to weeks.
Heck, transparency is no longer "write some ASM" routines, it is setting an opacity variable!
Some tech stacks, such as those for making basic CRUD apps, may have actually degraded a bit, but at the same time someone who is only slightly technical can now drag and drop their way to an online store front that is a fair bit more powerful than Viaweb was back in the day!
Heck, it is now possible to go from an empty folder to writing and deploying HTTPS API endpoints in under half an hour.
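A minimal version of that, using Flask as one arbitrary example (the TLS part would typically come from whatever platform or proxy sits in front of this, not from the code itself):

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/api/hello")
    def hello():
        # The whole "API": one JSON endpoint, a few lines, minutes to deploy.
        return jsonify(message="hello, world")

    if __name__ == "__main__":
        app.run(port=8000)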
App (not web!) updates can be deployed to users by uploading a file and having a client pull down the changes and seamlessly switch over to the new code next time the app is launched. Versus mailing out disks!
There are app stores that you upload intermediate code to and they'll compile apps to the appropriate architecture for a customer's device!
During 72 hours of coding for Ludum Dare, game developers and artists work together to create games that are as complex as a full fledged commercial game would have been 25 years ago.
And of course, many corporate software engineering efforts continue to fail or go massively over budget for largely political reasons.
Likewise, moving from ASM to C to Smalltalk is a night-and-day improvement... that we made in the 1970s. The difference again is what hardware we get to run it on.
Video game construction kits existed in the 80s, and 80s GUIs looked pretty much the same as they do now if you ignore resolution.
Drag and drop application development was huge in the 90s, and the CRUD applications of the time weren't significantly different than now, other than they didn't run in a browser. HTTPS wasn't hard back then, and you could write a Perl endpoint in half an hour easy.
When it comes to things like playing video, it's easy now but I don't even really consider it programming. You're just installing software that does it for you.
Libraries and OSS and StackOverflow and various services have made a real difference. I'm not arguing that we haven't made progress. I'm just saying that I can see how some people feel that this year's incremental advance or retreat is not nearly as exciting if they've seen how the last 20 worked out.
Myself, I've gone back to using Ventrilo because I realized that I really don't care about having video.
Firefox Hello worked remarkably well IMHO, it was super simple to use.
The field is full of video chat clients though. Uber-Conf is popular, although I think the % of problem free calls I've had with them is less than 50%.
Zoom has worked well for me, no real issues.
Facebook Messenger, privacy issues aside, works well.
I've never done this in C/C++ specifically but I have to imagine that the GTK bindings are mostly language agnostic. Using a tool like Glade, you can build a GTK GUI with ease.
Glade was released 21 years ago.
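From Python, the whole Glade workflow is a handful of lines via PyGObject; the file name and widget id below are hypothetical, and the layout itself is drawn in Glade rather than written by hand:

    import gi
    gi.require_version("Gtk", "3.0")
    from gi.repository import Gtk

    builder = Gtk.Builder()
    builder.add_from_file("app.glade")          # UI designed in Glade
    window = builder.get_object("main_window")  # widget id assigned in Glade
    window.connect("destroy", Gtk.main_quit)
    window.show_all()
    Gtk.main()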
Edit: Ah, it's actually named that and I found a lot of results for it:
This is a topic I'm very interested in (and as I've gotten older, my interest has only grown).
In ~20 years as a coder I've seen only a handful of overt ageist issues. I've seen plenty of ageist language and attitudes, and even more assumptions, but those are a lot harder to answer "Yeah, but would they turn down a qualified candidate for that?" about. I've known people who were both experienced and open-minded, as well as those that were experienced and closed-minded.
End result: No idea what to believe. I'm not comfortable assuming one way or the other, because either assumption is harmful if incorrect.
On the other hand, companies who engage in age discrimination are probably shit companies, so are you really losing out by them not considering you?
If they have the money to make my life easier, and I end up not getting it, then yeah, I lose.
If they have the experiences that make me a better dev, and I end up not there, then yeah, I lose.
If they aren't actually that bad, they just have this one area of ignorance, but I never get to experience all those other benefits, then yeah, I lose.
If I'm NOT in a hot tech center and this is one of the relatively few jobs that is available/is a step up, and I don't get it, then yeah, I lose.
I myself have nothing to complain about, at least not in my experiences to date, but in terms of considering the issue overall, I don't think it can be so easily dismissed. I recall plenty of older techies struggling after the end of Y2K and the dot-com bust when the market wasn't "If you want a job, you can get a great one", and I assume there are plenty of people in the above categories.
Except they don't "simply say that".
It's a fact of the industry that is backed up by (a heck of a lot of) experience and observation, unfortunately.
In that sense, it's kind of condescending to deny the reality of the (sometimes painful and frustrating) experiences many people are having in this regard.
My strategy is: live in the bay area so I can get paid like crazy, live a very frugal life much below my means, invest diligently, and hopefully have enough money to retire in my late 30s (in a much cheaper area, of course), so that when I'll be told that I'm too old, I'll raise my middle finger and leave the scene with the few million dollars I saved.
I wanted to pop back in time and say thanks. Yeah, you didn’t retire in your late thirties like you thought you would, mostly because it’s harder to make FI cash than you thought. The crash of 2020 took a few years to get past, but hey, 45 is the new late thirties, as they say now.
The reason I’m thanking you is you went ahead and struggled and saved anyway while things got tough and now you are free. Just FYI, I texted my QUIT to the HR bot in the Philippines just before I came back to see you.
Btw quitting is harder than you think. Just to let you know, I don’t blame you. Not quitting at 40 when you could eke by, because they offered you 2x what you’d ever made to stay; yeah, the place was thalidomide for your soul, but then your wife (I’m not telling) got pretty sick and that cushion was worth it.
Also: take care of your health and sleep more. An extra ten years of long hours at the terminal has left me not as healthy as I’d like.
To be fair, I don't plan to retire at 39 or 40, as long as someone hires me with reasonable compensation (2x would be a dream!). I'm purely trying to protect myself against a foobar scenario where indeed, I become obsolete from the point of view of most employers, due to "old" age.
You're assuming that he would actually want to stay in that other place long enough for the house purchase to make financial sense. You have to own a house at least 5 years for it to work out, minimum. Moreover, in the big tech hub, you're making more money, so if you keep your living costs low, you'll bank more cash than living someplace cheaper.
Curious, why? I am having no problem being hireable right now at all (literally just joined a FAANG recently).
As far as I know, it would be impossible to get the same pay I'm getting now anywhere else. Even after considering the high cost of living, the spread between income and cost of living is very high, and allows me to stash away a significant amount of savings every year, to pay for my future freedom. I really doubt I'd be able to save that much if I were working anywhere else.
What else would you suggest?
I don't own a house here in the Bay, I dump everything in very diversified index funds and some rentals out of state, since I'm not interested in staying in this place long term and play the million dollar mortgage roulette.
The day I call it quits, my partner (she does well too) and I can move to Austin TX, or anywhere else really (we're both immigrants with citizenships in really, really cheap countries), with our fat liquid assets, and not have to worry about necessarily finding work in my 40s. At least that's my very optimistic plan; it might completely not turn out like this, or I might die tomorrow.
So it's really not that much, most people who get private or state pensions get a better deal than that from a cash-flow perspective.
Edit: sorry for the ninja edit. I read your comment too quickly.
1) If you keep it under your mattress it will be much less than what I described, because every year your nest egg will shrink due to inflation, so you'll just be able to take $25k non-inflation adjusted, which is a big difference from my $25k inflation adjusted (in 60 years, $25k will be $150k at 3% inflation). My assumption is 0% real growth, not 0% nominal growth, which is what you'd get by keeping it under the mattress.
2) The 4% rule is based on a shorter retirement interval (30 years) than what I'm looking for (60 years). Try to go on firecalc.com and look for the statistical odds of 1M giving you 50k/y for 60 years: the failure rate is higher than the success rate, and that's based on historical data.
3) Yes, my assumptions are very conservative, but I don't believe index funds will return 7% nominal over the next few decades, the world is going to face too many problems in my opinion. That being said, pretty much all I have is invested in index funds despite my opinions (mainly because I wouldn't know where else to invest it, since both cash and bonds are sure losers to inflation), so in the best case I'll be pleasantly surprised.
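The arithmetic behind points 1 and 2 is short enough to sanity-check yourself. This is a deterministic fixed-real-return sketch, unlike firecalc's historical sequences, so it understates sequence-of-returns risk:

    # Point 1: $25k of today's spending at 3% inflation for 60 years.
    print(25_000 * 1.03 ** 60)  # ~$147k, roughly the $150k quoted

    # Point 2: years a nest egg lasts at a fixed inflation-adjusted
    # withdrawal, given a real (after-inflation) return.
    def years_until_broke(nest_egg, annual_spend, real_return):
        years = 0
        while nest_egg >= annual_spend and years < 100:
            nest_egg = (nest_egg - annual_spend) * (1 + real_return)
            years += 1
        return years

    print(years_until_broke(1_000_000, 25_000, 0.0))  # 40 years at 0% real
    print(years_until_broke(1_000_000, 50_000, 0.0))  # 20 years at 0% real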
Again, quite possibly I'll die much younger without even enjoying any of that freedom :-)
Also, Austin, while not as bad as other parts of Texas, is still in Texas.
I've lived in six cities on three continents (and I'm talking signing a lease, not staying for a month) and I can assure you Austin is no better or worse than any other metropolitan area regarding... wait, what metric exactly are you using here? Weather? Burrito size? Number of smug, passive aggressive west coast urbanites per square mile?
So, what's the disadvantage?
If I could pop back in time, it would be to (somehow, because I didn't back in 1992) recognize that the Bay Area was where things were at, and move there instead of staying in Phoenix.
But here is where I am, and while I make a decent salary, I certainly am not retiring any time soon.
The whole age thing is standing in sharp relief in my mind currently, because things are slightly "up in the air" at my current employer; I want to stay employed here, and I'm sure my employer wants to keep me, but recent events may potentially cause the relationship to have to be dissolved if the projects and numbers don't line up. I can understand that from a business perspective - fair's fair.
But the prospect of interviewing yet again, 3 years after getting this position, is not something I look forward to. I have new skills I can bring with me that I didn't have before, so that's a plus, but I worry that the age thing may be a barrier.
I'll probably have to lower my salary ask just to have a chance, but I won't do that unless I don't get any traction after a few months of looking - should I have to do so.
Even with that, though, I tend to wonder if the question in the minds of those hiring will be something akin to "Why isn't he a team lead or a manager?" - a title I don't have and have never been offered. Possibly that's because I've never been able to stay at a company long enough to get to that level - either they go out of business or they get sold 2-3 years after I join, or the company is just too small to have any kind of path into such a position (I've found that I fit best in companies of under 50 employees, and the fewer the better).
Maybe if there's another time around coming for me, I should look into startups, though that seems like a bust from the get-go (ie - find a startup, in Phoenix, who needs a 45+ year old software engineer - that's probably a very small pool, if it exists at all).
It is what it is, I guess. I am fortunate in that I am debt free, at least.
If I could go back in time I would have done the Bay Area thing when I was young too. But life is a bit of give and take - there are some good things that happened in my life by not taking that course (even if they didn't all work out as planned). So you kinda have to appreciate those parts of life and accept the fact that you are maybe not in the financial situation you wish you were in.
The only guarantee we have in life is this moment :)
"the software engineering labor market can stay irrational towards older people longer than the 2 decades past 40 where I could stay effectively employable".
The company where I work administers a survey of all tech employees every year, then publishes the results as a series of graphs, which can be broken down at any branch in the org chart. The very first question is "how many years of industry experience do you have?" The graph for my team is, shall we say, bimodal. There's a bell curve between 1-4 years, and a spike at 30 years.
I've been building things for a long time, and I try to impress upon the other developers on my team (or in my org) that I've never, ever, had an opportunity to build at the scale that they're being asked to build at in their first gig out of school. If I come off as negative, it's only because I can say with surety that X won't work because it didn't work in the past. I expect to learn, with my team, whether we can make things work at today's scale.
There are times that I miss hours and hours of uninterrupted coding, and being inventive at small scale. My leadership would take me out to the loading dock and kick the shit out of me if I tried to contribute at that level.
My job is, essentially, to educate, and to slap the team off of local optima. To impress upon them that their customers are king, and that their customer base includes themselves and their teammates, paged out of bed at 3am. And to impress upon their management and the business that we really do need to focus on improving operations, rather than adding features, from time to time. It is also to observe the progress of, and advocate for, the developers on my team.
I feel (at the time of writing... Talk to me on Monday afternoon) like I'm lucky that my employer points senior engineers at problems like this.
As someone relatively old, I've been in a lot of developer interviews. I also noticed that older programmers tend to not be as up to date in newer technologies or ways of doing things.
I don't think it's strictly limited to technology. How many 50-year-olds do you know who listen to trap music? Or approve of the way teens dress these days?
It's just the classic "older people tend to be more conservative" fact. At some point most people get stuck in the things they were doing/liking when they were younger. They hate new music, they dismiss newer technologies (to open a rat's nest: Electron, anyone?)
I'm sure I'll get comments like "I'm 18 and I hate trap music and Electron". That's not the point, I'm talking about distributions not individual cases.
I've been doing this a long time and it doesn't matter to me.
The newer stuff is most of the time just a variation on a theme that I've seen turn over three or four times since I've been doing this.
What older coders have that younger ones do not, is experience. This can often halve the workload on some projects. The older I get in this profession, the more results I get out of less and less code.
Also, let's not immediately fetishize the newer stuff. It is often a less well implemented retread of older technology. Seems like people have less patience for learning existing systems, so they just re-write some quick and dirty replacement and call it good.
Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.
And this attitude is why older developers find it harder to get a job (I’m in my mid 40s myself).
> What older coders have that younger ones do not, is experience. This can often halve the workload on some projects. The older I get in this profession, the more results I get out of less and less code.
I keep seeing this as an excuse for older developers who don’t want to keep up to date and stay marketable. Code is code. At some point you have to put lines in the editor to make stuff work.
> Also, let's not immediately fetishize the newer stuff. It is often a less well implemented retread of older technology. Seems like people have less patience for learning existing systems, so they just re-write some quick and dirty replacement and call it good.
> Perhaps it might be better if younger programmers learned the old stuff instead of just badly reinventing the wheel every few years.
The market can stay irrational longer than you can stay solvent.
My post above didn't mention it, but I also learn new stuff. And some of it is really useful.
I don't insist the client use my preferred technologies. I may recommend them, sure, but in the end I'll learn whatever I have to in order to get the job done.
This whole thread reminds me of a story:
A small town barber has been cutting hair for decades, and charging his customers a fair price.
One day a national chain opens a salon just across the street and advertises $10 haircuts.
A customer asks, "Aren't you worried they'll drive you out of business?"
The barber grabs a piece of cardboard, scribbles something on it quickly, and props it up in the window.
The sign reads: "We fix $10 haircuts."
I think the best business model for an older developer is to fix $10 haircuts. Seek out those kind of jobs that capitalize on your experience.
There are so many messes out there that if you can bill yourself as a cleanup guy, you'll never run out of clients.
That's pure gold. Not to say there haven't been advances; there certainly have been. But there's also the issue of the shininess of new tech giving it undue cred, while simultaneously translating to an outsized denigration of the tech it purports to replace.
Frankly, a lot of these problems have been solved, and solved well.
Look at NoSQL as an example. There are certainly use cases for it, but then people went nuts, exclaiming things like "RDBMS must die" and touting no ACID transactions as a feature/benefit. Then, you get the trickle of reports that maybe NoSQL wasn't the best choice for this or that.
Finally, fast-forward to 2018 and you have announcements from MongoDB that they now support ACID transactions.
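For the curious, this is roughly what those look like from pymongo (MongoDB 4.0+, and it needs a replica set; the collection and documents are the classic textbook transfer, not anything real):

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
    db = client.bank

    with client.start_session() as session:
        with session.start_transaction():
            # Both updates commit or abort together: the exact transfer
            # example RDBMS folks have been teaching for decades.
            db.accounts.update_one({"_id": "alice"},
                                   {"$inc": {"balance": -100}}, session=session)
            db.accounts.update_one({"_id": "bob"},
                                   {"$inc": {"balance": 100}}, session=session)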
Many older coders have seen enough of these cycles, and have adopted a justifiably-cynical posture of "wait for it...".
Yet "Wait for it" usually turns out to be correct.
I think if you pick a concrete example of this phenomena (say, Java applets vs. WebAssembly) and you ask "so why didn't someone take Java Applets and update them, why do we need WebAssembly?", the answers to why this happens start to become more apparent. It's not so much that Java applets were good and then we decided they were unfashionable, as that Java applets were always kind of ugly and we eventually stopped tolerating them.
For examples from before, say, 2000 - take Turbo Pascal - a lot of the answer is that the thing in question was a proprietary product, so once the financially backed dev team stopped working on it, users were forced to move on to something else. Other times it's because the thing was written for an 80's computer platform that no longer exists outside of emulation and retrocomputing collections. Sometimes it's that the graphical standards the original used were for hardware with very different capabilities, and making the thing usefully work on a newer system would almost require a total rewrite anyway.
Even beyond all that, there's another problem: Software is fairly hard to do literature review for. If I want to know if a particular research idea has been done before, I can hop on Google Scholar and figure it out with a few hours of carefully constructed search queries. Doing the same thing for software is very difficult, especially because most products for the majority of the fields history were proprietary. As a consequence, we don't have our history available to analyze in any easily accessed form. The Internet Archive has been doing what they can to rectify this situation. (See: https://archive.org/details/BeyondCyberpunkMacintosh) Nonetheless, there are still serious problems in this area.
I'm a younger person (23) who shares your frustrations, but I don't think that you can really wag your finger at people when there are real barriers to what you're proposing.
I've been asking this exact question on various HN threads with WebAssembly announcements, and the answer is always one of "Applets are insecure" or "Java is insecure".
So I don't really follow your example here... whatever security model you want to have, it seems much more obvious to implement it for Java applets than to implement WebAssembly and then implement your preferred security model for WebAssembly. Why do you think we replaced applets with WebAssembly?
(It's kind of a leading question -- my experience as a browser user suggests that we stopped using applets, spent a while not having them, and then started working on WebAssembly. By that analysis, we never replaced applets with WebAssembly -- we replaced applets with nothing, and then replaced nothing with WebAssembly. That would tend to support the more crotchety answer of "people developed WebAssembly because they just forgot about applets".)
Now, where WebAssembly comes in is as a compilation target for anything to the browser. I had the sorry misfortune of working on a project that had an applet that interacted extensively with the host page. It was excruciating; the APIs were not up to the task. At the time, before the canvas tag, that's the length we had to go to get "advanced" functionality into a page. Nowadays that could all be done with JS (or X compiled to WebAssembly) and a canvas element. Things evolve.
Before that, computer networking killed most of the running systems. And before those, any new computer would require all new software, so there was no continuation at all.
Only security and networking seem stable; everything else shifts like sand dunes.
This eliminates much of the proprietary lock-in and portability problems you mention. I was never saying people should just keep using Delphi into the 21st century.
Also, my comments apply to broad categories of solutions not just software implementations in the narrow sense.
For example: I just replaced (had to, for perf reasons) a NoSQL solution in one of our applications with MySQL.
Ironically, the application would actually scale better with NoSQL in theory, but the choices the developers made along the way (data modelling and API design) made it perform worse on the NoSQL DB than on MySQL.
The tactical solution, which fits our needs and the size of our user base, turned out to be replacing a shiny new thing with an older technology. The benefit gained was more flexibility with queries, which in turn allowed us to speed things up significantly with little effort.
The previous team burned a lot of time learning the new NoSQL thing to then not get any benefit from it at all.
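To make the query-flexibility point concrete, here is a minimal sketch, with an invented schema and sqlite3 standing in for MySQL, of the kind of ad-hoc aggregate that is one SQL statement against a relational store but an application-side scan against a document store that wasn't modelled for it:

    import sqlite3  # stand-in for MySQL; the orders schema is invented

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, "alice", 30.0), (2, "bob", 12.5), (3, "alice", 7.5)])

    # Flexible and optimised by the database, no schema redesign needed:
    top = conn.execute(
        "SELECT customer, SUM(total) FROM orders "
        "GROUP BY customer ORDER BY 2 DESC").fetchall()

    # The document-store equivalent, when the data wasn't modelled for this
    # access pattern, is fetching every document and aggregating in the app:
    docs = [{"customer": "alice", "total": 30.0},
            {"customer": "bob", "total": 12.5},
            {"customer": "alice", "total": 7.5}]
    totals = {}
    for d in docs:
        totals[d["customer"]] = totals.get(d["customer"], 0) + d["total"]

    print(top, totals)  # same answer, very different query-time flexibility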
Creaky old software tools from the '70s that are actually still in use have fought off decades of competitors that attempted to improve on them.
I think the reason older coders don't "keep up" is because you have to lose ground first. If I want to adopt a new technology, I have to first learn how to handle problems I ALREADY KNOW HOW TO SOLVE with my current technologies. The better I am with solving the problems I encounter already, the more I will have to backpedal before I can go forward in a new tech. And while I definitely get better overall from learning a broader base of techs, over time there are definitely diminishing returns. The cost-benefit of learning something new vs spending that time either learning more of what I'm already in, or just solving any of the infinite things I want to get done is a balance that will slide more and more towards not bothering to learn new tech unless there's a real need.
Add in that this is an industry rife with Imposter Syndrome that DEFINITELY bites you in the face during any moment of struggling to "get" a new tech, and there are plenty of incentives against embracing new tech beyond just "I don't like change". (though that is definitely present - I'm a JS dev currently, so I'm well familiar with the dismissive and arrogant attitudes that can be tossed around).
So most of the issues I fix are due to mistakes I never would have made, because I already made them 10 or 20 years ago.
The worst part is that I no longer think in code, I think in piping data around and working functionally or declaratively because my brain is so apprehensive of side effects. I've found that younger programmers don't tend to realize how dangerous imperative programming (especially object-oriented programming) can be in the long term.
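To illustrate with a toy example (invented for this comment, not from any real codebase), here is the same transformation written imperatively with in-place mutation and as a side-effect-free pipeline:

    orders = [{"sku": "a", "qty": 2, "price": 5.0},
              {"sku": "b", "qty": 1, "price": 12.0}]

    def apply_discount_in_place(orders, rate):
        # Imperative: mutates shared state, so every other holder of
        # `orders` silently sees the change. This is where the long-term
        # bugs creep in.
        for o in orders:
            o["price"] *= (1 - rate)

    def apply_discount(orders, rate):
        # Declarative/functional: data flows in, a new value flows out,
        # and the input is left untouched.
        return [{**o, "price": o["price"] * (1 - rate)} for o in orders]

    discounted = apply_discount(orders, 0.10)
    assert orders[0]["price"] == 5.0   # original data untouched

    apply_discount_in_place(orders, 0.10)
    assert orders[0]["price"] == 4.5   # everyone sharing `orders` sees 4.5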
So what I really struggle with is motivation, because it can be so disheartening to not be listened to, or to work hard on an implementation only to have the elegance of its abstractions muddied beyond recognition by developers who don't take the time to understand them.
It's probably too late for me. I cling to a fantasy though of saving enough money that I can take a step back from programming and work on some of my own demo languages/frameworks that illustrate the flaws with the status quo. Because of handwaving and cargo culting, today's apps and websites encode so little core logic in proportion to lines of code that I'm hopeful I can make a niche for myself working at a higher level of abstraction and bidding less money on contracts someday. But it might just be a fantasy.
This post hit really close to home for me. It's especially depressing to be caught between these "pearls before swine" moments and Impostor Syndrome and for a while this made it very difficult for me to work up the courage to go to job interviews.
I am currently working somewhere that I genuinely love, as one of two senior engineers on a team of 7. I think I love it so much because I work with people who care about the quality of their work and respect the experience I bring to the table. They've been around long enough to see the cost of making the same mistakes I made 5 years ago, and have outgrown the "just ship it" mentality so many startups are born with and rewarded for. When you're trying to build a 100-year company, you value good engineering.
Most software jobs in California are at startups that are not trying to build a 100-year company, more like 100-week companies. In fact, most venture-funded startups are trying to have as short a lifespan as possible, because they exist to generate short-term returns through big-money exits instead of creating reliable revenue streams. This often actively disincentivizes high-quality engineering, and I think is a big reason why it can be so depressing to work with startups.
If a new technology is good or seems like it has potential, I dig into it, and thanks to experience I can generally pick something up to the point of productivity in a day or two, where junior developers take weeks or months to get there. Or, at a minimum, I know enough to troubleshoot blocking bugs others can't solve, because I know where to look for solutions and what those solutions might entail. So to an extent I keep up to date: I read the value propositions and reviews of new technologies so I know when they might be an appropriate tool in my toolbox for some solution or other. On the flip side, I'm not in a rush to pick up every flavor-of-the-week framework or library that is more than likely to vanish without a trace in 3-4 years.
Meanwhile, outside of a persistent Google recruiter who's been asking to do an onsite for the last year, I've gotten dear-John'd for something like 3 roles in the last month, including one where I'm personal friends with the SVP of engineering. Is it missing keywords on my resume, my age, my career path, my education? Who knows, but unless the market has tanked, it strongly differs from my experience a few years prior.
1. Objectives on resumes are useless fluff. Yours doesn’t add anything useful. It’s so vague, I couldn’t even tell what type of job you are looking for. Project management? Test? Software Development?
2. Your whole summary-of-skills section is word soup. I would completely ignore it. Anyone can list technologies they touched once. It's too unfocused. In your experience section, list how you actually used those technologies. Why put down that you know how to use source control? That's like listing that you know how to type. Why list 8 different languages? Could you handle a deep technical dive in all eight?
3. In your work experience section, you listed your accomplishments, but I have no idea what technologies you used to accomplish anything. You didn’t list the languages you used in any of your jobs since 2007.
I wouldn't call you in for an in-person or a phone screen based on your resume. You claim that you can pick up a language in a day or two, but I can't tell from your resume how deep you actually are in any specific area. Again, it looks like you think that because you can figure out the syntax of a language, you can avoid learning the ecosystem, the tooling, the frameworks, etc.
You can ignore picking up popular frameworks but you won’t be competitive with people who have stayed up to date.
If you came into an interview saying that you could advise on a technology based on “what you read” and you said you decided not to actually put it into practice, that would be a red flag.
I’ve been working for 20 years. I at one point had an encyclopedic knowledge of C and C++, that’s nowhere on my resume, neither is VB6, Perl, or PHP. I don’t want a job in any of those languages and I don’t want to discuss them.
Slowly my role has been morphing into an architect who specializes in "cloud native solutions" with AWS, and I've added Python and Node and hopefully will be adding the $cool_kids front-end stack over the next year or two.
I can deep dive into Elixir, Embedded C, C/C++, Java, C#, and PHP. Excluding Perl, Python, and Ruby, I've used all of these on the job in the last 5 years; Python and Ruby I've used on side projects and for tutoring a friend in algorithmic programming for an evangelist role Google asked her to do a follow-up interview for. This is also not a comprehensive list of every language I've ever worked in or touched. I don't list Lisp, F#, Haskell, ActionScript, etc.
It probably makes sense to prune out outdated frameworks, and I should probably be updating this with some of the more recent frameworks I've worked with. But the thing is, I'm bright (top 1% of the top 1% IQ, inability to write a compelling resume aside). I can pick up frameworks like candy: I read the manual. Further, when I want to figure out how to do something that's not initially obvious or covered in the documentation, I just read the source code or look it up, which is what everyone does, but I do it better. That 2-3 day number includes getting a handle on the libraries and frameworks being used on the project. I can pick up just the syntax in the 10-90 minutes it takes to glance over existing code and documentation for most languages, and for existing code I can usually follow the flow on first read without knowing the language in advance. I learned C/C++ when I was 14 from reading intermediate-level game programming books and reverse engineering the examples to figure out what the constructs do, since there was no internet or other books handy, and the skill stuck with me.
So how do I stress that flexibility on my resume, or tell that story, without coming off as braggadocious?
P.S. Do you mind if I ping you with a modified version in a few days that takes into account your suggestions?
But for the more generic comments that are germane to the topic....
Fair feedback. The summary of skills is mostly a keyword hack for passive job hunting; I can move it around.
You don’t need to move it around, you need to get rid of it. Your resume should express how you’ve used a technology/framework/language in a method that added business value to a company.
Besides, if you're waiting for the job tooth fairy to show up and hand you a job because you list a bunch of keywords on your resume, you're doing it wrong. At some point in your career you should have built a network of former coworkers, managers, and even good local recruiters that you can reach out to.
> I can deep dive into Elixir, Embedded C, C/C++, Java, C#, and PHP. Excluding Perl, Python, and Ruby, I've used all of these on the job in the last 5 years; Python and Ruby I've used on side projects and for tutoring a friend in algorithmic programming for an evangelist role Google asked her to do a follow-up interview for.
"Deep diving" is not knowing the syntax of a language. It is having enough experience to know the ecosystem, libraries, frameworks, and pros and cons, and to have written and been involved in large projects using the language. By definition, you can't have been focused on five languages in five years. What exactly is your focus? Your resume doesn't give a clue about what you're good at.
Again, you come across as someone who thinks they are a special snowflake because you "learned C when you were 14". You're not. You're posting in a forum with a lot of greybeards who are looking at this entire exchange just thinking "that's cute".
And "knowing algorithms", LeetCode, and how to invert a binary tree in Python puts you at about the experience level of a new grad.
From your responses you come across very much as an “expert beginner” (https://daedtech.com/how-developers-stop-learning-rise-of-th...).
I'm a reformed dev lead; what you call "showing flexibility", I would see as being a jack of all trades and a master of none. If, in 2019, you're in any major metropolitan city in the US with years of experience as a software engineer and you're not getting lots of hits, it's you.
Your resume doesn't (A) tell a coherent story about where you've been, which technologies you've used to bring business value, or what your focus is, and (B) say where you are trying to go.
I learned C, algebra, and calculus at 14, for what it's worth. I wrote a general-purpose game GUI system (scrollbars, font system, picture-in-picture, keyboard and mouse handlers, etc.), thousands of hours over the summer break between junior high and high school. On my own; I mean, a little brown kid whose mother had dropped out of high school, who didn't know anyone else involved in computer stuff, and who had no access to people with math backgrounds. I wasn't just writing hello world and simple console apps.
I know there are thousands of other developers on here with similar backgrounds, many of whom are much better than I am, but I started off good. I took AP C/C++ freshman year of high school and was at the top of my class, etc. Although I quit programming for a few years after getting refused entry, my sophomore year of high school, into the specific college program I wanted to attend, and after the shift from DOS to Win95 and the GameSDK, which I just didn't enjoy working with as much.
I never got heavily into the 31337 coding or algorithms side, although I've been meaning to now that I'm down to a nice 40-hours-a-month work schedule. My focus has mostly been performance tuning for high-volume web sites, system/software architecture, and internal systems and tools. I'm a strong autodidact, so I constantly go out of my way to learn new things, and I usually do things the hard way (e.g. investigating the best possible solution on a given platform/framework for a given problem instead of just going with the tech-debt-laden good-enough solution based on what I already know), but I've been tied up picking up IoT development and Elixir/Phoenix for the last few years instead of Electron or whatever is hip these days.
I'm currently the primary developer and architect on a multi-million-dollar IoT stack that includes a GAE Java client API layer with Objectify; an Elixir/Phoenix backend with Mnesia, Riak, and formerly Redis; and an Angular 4/Bootstrap 4 TypeScript admin page. I've had to help debug issues in the Objective-C iPhone client, and I wrote some of the initial framework code and library wrappers for it to interface with some of the Elixir libraries and to add a layer of indirection around the generated GAE APIs. I maintain some Node.js admin tools, although I've migrated most of that over to Elixir. I've also applied some patches to a weather-forecast proxy service in Go, maintained by another developer, to enable better handling of null values and some other odds and ends, and I've read through the documentation and plenty of articles here and elsewhere discussing the high-level pros and cons of the language versus Elixir to get a feel for it and its use cases, although I don't know Go well enough to list it, having only spent a few hours here and there with it. I could work in the language today if I had to, but I don't know all of its quirks yet.
In addition to the above recent Java/Elixir and some Node experience, I had to go in last minute and spend a good chunk of time rewriting my client's product line's 2017 firmware after the Chinese hardware team failed to get the product to function reliably. Prior to that, I had to help troubleshoot SSL connectivity and resolve an SSL cipher issue after another hardware team with much more experience than me in the area had failed to do so. These were my first real exposures to embedded systems work (discounting designing and writing an X-mode GUI on the 386 when I was a kid), and I objectively did incredibly well based on outcomes.
In addition to maintaining these Java, Elixir, TypeScript (mostly pushed off on the junior dev as of the last year), and C firmware projects, I maintain a PHP RPG game which, because I wanted to dig into it, I set up to run on a Docker cluster a few years ago: SEnginx with dynamic IP resolution (because it cost extra on vanilla nginx) routing to php-fpm, Elixir, and static containers; MySQL on containers; Postgres; Jira/Atlassian containers; and consul.io DNS for hooking things together without losing connectivity after reboots. Here I don't mean I checked out some existing containers; I went in and figured out how to do a lot of things myself (I should probably include that somewhere in my resume), so I could help give others advice on the stuff.
In the previous five years I've additionally helped troubleshoot some C# issues for JetzyApp, although I'm starting to get a bit rusty. At GreatNonprofits I helped push the change to Bootstrap 3 and responsive design, updated parts of the code and API to run on Laravel (because it looked more interesting than previous PHP frameworks I'd used and had good enough performance characteristics), and improved the workflow by setting up production-like Vagrant developer sandboxes, CircleCI continuous deployment, etc.
At Microsoft I took a crude, poorly working batch-processing script responsible for converting test data from one system and turned it into a reliable system service. I didn't need to make it a system service; it just seemed like the most reliable approach to get it to the point it was supposed to be at, and I'd never written one, so I wanted to dig into it.
I continuously go out of my way to learn new things, but I think I've been making the mistake of learning the things I find most interesting instead of what the market most values lately.
In school, many of us were used to being the smartest kid in the room. It's just like being the star basketball player in high school, then going to college and being surrounded by a dozen kids who were also the best in their high school.
> I know there are thousands of other developers on here with similar backgrounds, many of whom are much better than I am, but I started off good. I took AP C/C++ freshman year of high school and was at the top of my class, etc.
Bragging about being good at C based on how well you did in a programming class isn't saying much. High school programming classes have historically been easy for anyone who was already a programming geek. Do you know how many of us wasted time on comp.lang.c on Usenet before the web even existed?
On any technical subject that’s posted here, you could have experts in the field that could put either of us to shame.
I don't think I was a special snowflake, but by the time I was 14 (1988), I turned my nose up at any "high level" language because they were too slow and non-performant for the little hobby projects I wanted to do on my 8-bit computers. I had already been doing assembly for 3 years. A lot of the greybeards around here can tell you much more impressive stories.
The whole point of this lecture is that your professional experience is no more impressive than plenty of people I’ve worked beside every day and certainly not any more impressive than people on HN.
I am not trying to discourage you. But as has been said by myself and others, you don't have a consistent, easy-to-follow resume that tells a story about what your area of expertise is and how you stand out from dozens of other applicants, and I couldn't tell where you would be a good fit from reading it.
That's a good point. But you also need to consider it from the point of view of the person sitting across the table. You might be very productive solving a problem in COBOL (to exaggerate the point), but they are much more comfortable in Python/Vue.js and don't want to burden themselves with legacy ways, because they can find someone to solve the problem in the stack they want.
I switched jobs recently and tried to adopt a zen-like attitude of beginner's mind. It definitely helped... but I'd be lying if I said I never hit frustration points where I'm trying to go from A to C but have to learn all about B, when I don't WANT to care about B, I want to get to C. A good attitude helps, but I think the underlying reason is still true and is part of why experienced devs aren't adopting new tech at the rate of newer devs.
There are an awful lot of cyclical patterns in development, and an awful lot of convergence. The HTML5 canvas, for example, is an awful lot like writing custom 2D graphics back in the day. Docker, Vagrant, etc. are neat, but there were chroot, zones, VMware images, Cygwin, etc. before. A lot of hot new language features are just existing patterns from the functional discipline being tacked onto imperative languages, and vice versa.
The fact that the constraints and reasons for adding generics to Java or lambdas to C/C++ are different from the constraints and reasons for adding templating to C/C++ or lambdas to Lisp doesn't mean you can't dive in, if you understand one well, and identify the pitfalls or gotchas of the newer implementations.
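As a toy illustration of that point (Python purely for brevity): a closure and a callable object are the same old pattern in different clothes, which is why understanding one lets you spot the gotchas in the other:

    def make_adder(n):
        # "New" functional style: state captured in a closure.
        return lambda x: x + n

    class Adder:
        # "Old" OO style: the same state carried explicitly on an object.
        def __init__(self, n):
            self.n = n
        def __call__(self, x):
            return x + self.n

    assert make_adder(3)(4) == Adder(3)(4) == 7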
Is it really meaningful to go all in on learning how feature x from 2000, repurposed to solve problem y, is all that fundamentally different? WebSockets are great; so were long polling and Comet architecture. RESTful is great; so was SOAP, and so were their non-web COM messaging and plain cross-application communication ancestors.
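For instance, here's a rough sketch of the shape of both push techniques; the endpoint URLs are hypothetical, and it assumes the third-party `requests` and `websockets` packages:

    import requests

    def long_poll(url):
        # Long polling, ca. 2006: hold a GET open until the server has an
        # event, then immediately re-issue it. (A real client would also
        # catch requests.exceptions.Timeout and keep looping.)
        while True:
            resp = requests.get(url, timeout=60)
            if resp.ok:
                yield resp.json()

    # The modern spin: one persistent connection with framed messages.
    # Commented out since it needs a live server to run against.
    #
    #   import asyncio, websockets
    #
    #   async def listen(url):
    #       async with websockets.connect(url) as ws:
    #           async for message in ws:  # each push arrives as a frame
    #               print(message)
    #
    #   asyncio.run(listen("ws://example.com/events"))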
There are a lot of pitfalls and unique characteristics to new technologies, but there are a lot more new spins on old ideas, common underlying concepts, and reinventions of the wheel.
You had me until that (j/k)(mostly)
> Is it really meaningful to go all in on learning how ....is all that fundamentally different?
That'd be the diminishing returns I mentioned up-thread: each iteration of learning a new way to do an old problem gives you less. I'd argue it's still more than 0, but we aren't comparing to zero, we're comparing to the opportunity cost.
That said, I think, as a generalization, experienced devs tend to be prematurely dismissive. That is, however, a personal opinion.
But if you don't do it, you're stuck on the local maximum of Hill X, and as it gradually erodes away, one day you realize that you don't have any lucrative jobs that you're qualified for, and you're so far away from what Tech Z is nowadays that you're effectively starting over from scratch, except that also you need to unlearn a bunch of stuff that's no longer true.
And this happens fast. If you try to coast for even, say, five years, so many of your skills will have turned to dust.
This is why the only real way to have a long-term career as a developer is to genuinely be interested in this stuff for its own sake. You need to want to go learn the latest new tech not just out of a cynical career calculus, but because you think it's genuinely cool and interesting and sounds fun. As long as that's true, I think you'll be able to be a valuable dev even as you get older; as soon as it stops being true, well, time to consider what's next for you.
Now, a lot of companies do over-engineer and cutting edge tools aren't necessary in a lot of cases, but that's a separate problem.
Learn things that will be important forever. Life is a breath and then we are gone.
All they are is dust in the wind.
Within 10 years here I had encountered more than 10 different CPU architectures. I've lost count now, but it is probably at least a dozen. The same goes for OSes.
I'm pushing 40, and I like reggae, acid jazz, and electronic music. I keep up to date (as far as one can) with tech, programming, cloud, and the like.
It's not just me - it feels like a lot of older people around me don't match the old stereotypes. Older colleagues are learning Typescript, my grandmother reads erotica...
Youth culture is of course always changing, but older people are perhaps more liberal than they used to be, and don't feel like they need to behave a certain way anymore.
I'm proud of your grandmother, but TMI, man. TMI.
Any sane person with half a brain hates Electron.
(Not that there are any good alternatives, but still...)
I think it comes from having more and more empathy.
The more I live, the less I feel I can judge people and their choices. Even if it is to prefer Mongo over Postgres :-)
Can we please kill this myth already? That was extraordinarily so during the era when social changes swept through society, but someone who spent their formative years in the '70s will hardly regard today's society as anything other than an extension of the changes that took place then.
There are many different ways to act conservatively. Resistance to new ideas because "but we've always done it this way" is a lot more common with people whose definition of "always" stretches to five (maybe ten) years.
> Electron anyone?
Well, let's put it this way: Most ideas are bad. But you can't dismiss all ideas on that statistical ground, because then you would miss the really good ones.
Serious consideration: I wonder how the cost of "older programmers not being as up to date" compares to the cost of younger programmers too quickly adopting newer technologies.
If your bank has some COBOL systems running to keep accounts up to date, do you really want some team of pinheads to 'fix it' by porting it to python?
But you're not talking about distributions. You talk about the individuals you meet in interviews.
That's a different issue. New music really is garbage; the entire industry has changed over the last several decades. There's a reason so many younger people are listening to classic rock.
Can I ask where it is you live, to see hordes of young people finding solace in classic rock (or any sort of purely guitar-based music)? Except maybe in US rural communities, where it's all about country music (always has been, always will be), those days are well and truly over. "Nobody wants an electric guitar anymore": https://qz.com/1013293/rock-and-roll-is-dead-sales-of-fender...
Meanwhile, electronic music has taken millennials by storm.
The Tomorrowland festival sold out nearly 400,000 tickets in 40 minutes a few weeks ago. Same kind of numbers every year for Electric Daisy Carnival in the US.
Never in history has there been so much quality music released every day, in all genres, because musicians can now release it without the gatekeepers of the past, much like YouTube has shattered the TV networks.
The consequence is that it's all about fragmented niches, and discovery can be challenging. But there is gem after gem in my Spotify "Discover Weekly" playlist. There has never been a more exciting time to be a music lover!
This for example: http://www.hawaiianreggae.org/artists
Nice and mellow, lots of young new artists. It's a pretty good scene.
Ask the kids what they think about today's mainstream music. They will say it's amazing. Who do you think drives the charts?
Yes, yes, I know: you are, or you know, a couple of youngsters who think anything newer than Pink Floyd is shit. And you think the charts are not organic, that they're "marketing", where the record labels pay for entry and rank.
If you look at my resume over the past 10 years and the tech I’ve used, it’s indistinguishable from younger developers.
But... it was popular in the late '90s, which corresponds to us mid-40s people anyway.