1) Ones who keep up their skills
2) Ones who don't
The latter are liabilities, bringing in 1980s-era best practices. They're working on some legacy BASIC or COBOL system from the seventies, and are surprised they can't find a new job when that system is finally upgraded and they're downsized.
I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.
My own experience is that I need to devote about 20% of my time to keeping up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years if I do occasional deep dives. Basically, I dive headlong into whatever is the newest, trendiest stack, and get a product out using it, learning all the deep things behind it too. That's what works for me. YMMV.
And that's the problem with software engineering vs. other white-collar careers. For example, my accountant friend is expected to be trained by his employer in the latest accounting practices and legal frameworks; he doesn't devote 1-3 months per year of his personal time to open source accounting projects to learn the latest legal framework for fun. That would be crazy for him. Same for my friends in architecture, dentistry and law. Their employers pay them to learn and gather the expertise needed for their future in the firm.
Whereas, as a software engineer, very few companies (at least in Germany, in my experience) will invest in their existing workforce to train them on the job for the future language/framework they plan to use. Instead, they let people go once their expertise is no longer valuable, hire someone already experienced in the needed stack, and repeat the cycle several years or decades down the road.
That's why here you're expected to transition to management as a career progression; IC roles are not really valued at an older age unless you've dedicated your free time to coding. And I don't know about you, but I'd prefer to spend my free time with my kids and exercising outdoors instead of coding to keep myself employable in the latest stack.
When I worked in a place that offered CME/conference reimbursement, it covered about 1-2K a year, depending on budgeting issues. In my current place, and for all independent or small practice physicians, that comes out of your own pocket.
This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.
I wouldn’t mind if it was at least partially reimbursed though. It’s an enormous chunk of change, and not for my benefit.
I also find it interesting that you go so far as to repeat your licensing exams. Is this common among physicians as a whole? Having known a couple of med students personally, I got the impression these exams were usually seen as a hurdle to be overcome and a source of stress, but I suppose it might get easier after a few years of practice. On a related note, I find it hard to imagine that, say, lawyers would routinely re-sit the bar exam for funsies.
Regarding CME, isn't that required to maintain licensure? Or, are you talking about courses above and beyond the minimum to keep your license?
And, BTW, I don't know who you are, where you practice, or even what your specialty is, but you sound like the kind of person I'd like to have be my doctor.
"I do it for my patients, not for my paycheck." If you have a problem with that, then by all means, I'm sure you can find another physician that would suit you better.
Some of it is, some of it isn't. The worst doctor you know (okay, maybe not the worst, but close) is doing at least a couple of major journals and his CME. Really, you'd be surprised, but sitting on the other side of the exam table, believe me - it's the exception that doesn't try to stay fresh. That's really not what distinguishes bad from good from great doctors - it's finding a way to integrate and retain all that knowledge so you can apply it in an unexpected clinical scenario, rather than as watercooler talk or on an exam.
I don't re-sit step exams 1/2/3, but I do a lot of ongoing question banks to refresh my boards, and I do go back to refreshing material from steps 1/2/3 all the time (which is what I meant about repeating licensing exams - I see now that phrasing was unclear). You're right that it's largely a hurdle and a stress, but that's because as a med student you're drinking from a firehose and your career depends on it. Studying it at my leisure, I can dive into things as deeply or as superficially as is interesting at the time, and the broader my knowledge gets, the more insights I ultimately glean from going back to those fundamentals. Memorizing biochem pathways when studying for boards is hell; refreshing biochem at your leisure just to better understand and retain is... well, if not pleasant, it's certainly not hell.
CME is required to keep your license, but that's not a problem that I have - there are multiple sources of materials that grant CME credits that I already do "for fun", so I've got an over-abundance of credits. I hunt-and-seek interesting CME courses to stay abreast of interesting things. I went into medicine for love of medicine, and the idea of getting so narrow into my niche that I lose sight of all of those other exciting things would be a tragedy to me.
As much as I appreciate the praise, honestly, you'd be surprised by how much even your least-impressive physician puts into staying up to date. There's just so much to know that the moment you stop your knowledge base evaporates.
Keeping abreast of changes in the field and the state of the art offers no benefits and is done solely for patients' trust?
May be what OP is referring to.
I have seen it in teaching. Teachers need xx hours of training per year. Training often satisfies that requirement but provides no significant benefits in terms of pedagogical improvement or content knowledge.
Teachers that want to improve do so by other means. The training keeps us in compliance.
It is in no way a knock on teachers. They are caught up in a bad system and are responding to systemic incentives.
If we apply an always/sometimes/never framework to my assertion, we can find examples where teachers advanced their practice via continuing education. So the teachers you know certainly could have advanced degrees, some even very helpful in improving their practice.
My experience in K-12 as well as studying the history of education reform in America since Sputnik was launched inform this assertion. It has been a recurring theme for 60 years.
We don't gain or lose patients by it; the most recent changes in the field are often so far from settled clinical practice that they're years from anything that would be considered malpractice; we don't get reimbursed better for it; patients largely can't tell the difference, so it doesn't change your referral stream.
It does little-to-nothing for our careers. We stay up out of pride, and out of commitment for providing our patients with good care.
That's what I said. It's surprising how many people chose to read into my post something that isn't there.
Same for dentists - usually paid for by some whitening product they will push over the next year.
Both my parents are doctors, and I'm a software developer. And you know what? I have it really, really good in comparison.
edit: I noticed that you phrased the question as "earnings potential" -- well, in that case, it's comparable. High-level engineers at FAANG make boatloads of money.
I think the issue is more just that there aren't any clear ethical standards that have been set industrywide, and since developers tend to have limited oversight, at this point it's really just a matter of what standards you set for yourself.
IMHO, for things you're learning that will materially benefit your career, a reasonable standard would be that for every hour you spend on your own time teaching yourself that thing, you can spend an hour of paid time. Whereas for things that only benefit your employer, e.g. niche libraries or outdated frameworks, that should happen entirely on the employer's dime.
The core methods and principles of SWE are almost timeless, other things are well documented and can be learned on the go.
As an anecdote, and not to boast: I have had to write data visualization and function plotting software at least half a dozen times in as many languages and frameworks, from QuickBasic as a teenage hobby, to C in DOS, Java (Swing and JavaFX), OpenGL, WebGL, JS with charting libraries, Linux raw framebuffer... Also tweaked MRTG charts back in the day, wrote a super basic 3D editor in high school for games I never ended up making, etc.
A team I was on once had to add a simple chart to a webapp. When the task came up in meetings and was causing the dev assigned the task some grief, I mentioned that I've had to do some charting work before, and offered to help with any details if they got stuck. Instead of saying something like, "Ok, I'll let you know if I have any questions," they said, "Yeah, but was it D3?"
I would go so far as to say that a lawyer who just recently passed the bar exam is substantially worse at the practice of law than a paralegal with 20 years of experience. The legal principles learned in order to pass the bar exam are akin to... basic algorithms (maybe?) for a software engineer. They're important, but they're also not really what the job is on a daily basis.
Programming is not like that. If you ask 100 programmers what the new big thing is, you will get 100 different answers. I would much rather take responsibility for my own professional education than outsource it to a company that may not have my career best interests at heart.
This is just not accurate.
Accounting rules not only can be argued, they are all the time.
FASB and GAAP are rife with subjectivity.
I have an accounting degree.
An accountant, lawyer or architect can reasonably be expected to stay with the same firm for a decade or longer, often their entire career. It makes sense under that context for employers to invest more in long-term skills.
Whereas, once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months. That's maybe not the rule. But even Google and Microsoft have turnover rates that imply a half-life of no more than a few years for the average employee. The economics of long-term re-training just don't make sense.
Is this the worst thing in the world, though? It allows savvy workers to continuously jump between companies and re-negotiate higher compensation packages, which helps make sure that workers are paid at or near their market value - in a way that doesn't work in the accounting industry, because future employers would look down on that kind of resume history.
When they finally rolled up our team, after 27 years, the person with the least seniority had ten years.
It's entirely possible to keep folks for long periods of time, but it requires some pretty serious management chops, and those skills are not exactly encouraged in today's world - in any field, not just software development.
I worked for a Japanese company. The Japanese wouldn't even acknowledge my employees until they'd been around for a year. Having the same people on hand, year after year, was pretty much required, as our projects always took the long view.
I can't even imagine having an environment where engineers are expected to quit every year and a half.
Our HR was run by the Corporate General Counsel. It was pretty much as bad as you can get.
Also, we were paid "competitive" wages (read: below-market).
I was the manager for 25 years (I should mention that the engineer with the most seniority had 27 years - we started together), and I feel I did a pretty good job. Considering they did not "jump ship," I guess something went right, eh?
Nowadays, it seems that people "manage by fad," as opposed to managing humans. It's kind of heartbreaking, really.
Just assume that we were all idiots, and go back to your happy place.
Did you tell them that “we are all family here”?
For me personally, if the money is enough for me to make a comfortable life with my family, other things at work become important. I could never spend most of my days with assholes or do idiotic work even if the pay was way above market value, for instance. I'd also trade money for more free-time, if possible.
And yes, the company I work at does feel like a family. But nobody had to tell me this. It just does so, naturally.
I sincerely wish you the best.
Also, IIRC, 20-30k of that is taken up by the ORM.
First, I wonder if firms investing in training could reduce turnover, thereby creating a bit of a positive feedback cycle. It doesn't even have to be formal training, either. It could be something as simple as having a weekly journal club, or the equivalent, and encouraging engineers to read at least one research paper a month.
The second aspect, engineers moving jobs just to get raises, seems weird to me from a market efficiency point of view. Interviewing costs companies money -- so much so that it's something they should want to do as little as possible.
Many companies don't keep pace with the open market in terms of raises, which is a primary force driving people to job hop. Are there any studies comparing companies that do at least attempt to keep comp for current employees in line with the open market against those who don't?
In my experience, reading research papers thoroughly can be a pretty thought intensive process. In grad school, where I studied math, what I would do is read the abstract, decide if it was interesting, then skim the section headings and statements of theorems to see if I wanted to go further. If I did, and I was searching for a particular widget I needed for a proof, then I would read as much as I needed to read to digest the proofs of the useful theorems. If it was for general interest, then I would read the whole thing. I found that once I got an interesting paper in my hands, fully comprehending what it said could take up to 1 day per page for particularly dense papers.
Unless you're serially hopping from one shitty startup to another, it's pretty unusual.
I wonder if that's actually true, and if so, to which degree. On what basis are you saying this?
I think it very much depends on the company and culture. I always had jobs where learning and self-improvement were encouraged and expected (also in Germany), with a budget for conferences and books and a fixed time frame (~20%) for that. These were all companies that primarily did software development -- either direct product development or project work for customers. On the other hand, you have firms where software is treated as an appendage. They might have other great products but an entirely different managerial background. A mindset like "We need a software department, everyone else has one too" can easily lead to mismanagement -- and I think a lack of time and budget for self-improvement is an aspect of mismanagement in the business of creating software.
My employer requires us to do one hour of training per day. We are allowed to study whatever we want or work on personal projects. I personally don't think it's weird for an engineer to keep up with emerging tech and trends. I'm sure a lot of engineers outside of software do this.
That's all great, but much of it ends up not being applicable. For example, we were trained in a new language about 3 years ago, but we haven't been allowed to use it in our product yet! I've been doing my home projects in it, so I'm ready to go when we do start using it. But most of my coworkers took the class and haven't touched it since. They've likely forgotten everything about it. Likewise, the conferences are nice, but I've never implemented anything useful after having read a paper about it, or seen a presentation on it. (The few times I've tried, it turned out the paper didn't give enough information to do your own implementation!) It does keep me aware of what's going on in the field, but I'm not sure how useful it actually is to my job.
Maybe I've been fortunate, but in the UK and for a couple of years in Australia, I have had employers (and later clients for my contracting business) who have been happy to throw me at projects far enough outside my comfort zone that I keep learning and stretching my muscles. I feel at the top of my game (much of the time).
Interesting. I thought Germany's labor law highly discourages this. Isn't "It is easier to divorce than to fire someone." a German saying for your tough labor law?
As a non-expert, I think there's a 50% chance it costs 10x more to teach each new hire for a month.
Wow, exaggerate much?
This doesn't describe the vast majority of "older" workers, it's just another disappointing stereotype and expression of ageism.
Imagine an engineer walks into an interview. It's a young person, you think nothing of it and you go on with your normal interview. Now a different engineer walks in, and he has grey hair and some wrinkles. You feel the need to dig into whether they're in group 1 or group 2, in addition to your normal interview.
I'm not saying you're wrong, but if we were talking about how there are two groups of women and one of them is a liability a lot more people would be setting off alarms.
To get a rough idea of the value of up-to-date skills vs apparent age, let's consider a hypothetical:
If both a 50 and 24 year-old graduate from the same coding boot-camp, do they have equal odds of being seen after the first interview? (let alone actual employment)
And how big is the difference in probabilities?
I also note that desktop games are primarily coded in C++/C. If hiring were skill-based, we would expect that industry to be zealously recruiting older engineers.
Had a situation exactly like that once. I bypassed HR and went straight to the manager, who lambasted HR; I got the interview and the job, but still got shafted by HR, who messed up my salary, lied about a rise, and generally made my life hell with pettiness and bullying, for want of another way of putting it. That's along with one person in HR taking my side and telling me what her manager was up to - and next thing, that person was gone. So yeah - HR causes many of these ageist issues when it comes to the situation you outline.
Why do you assume young people don't learn C++ to work on games? They do to work on web servers.
The AAA in AAA games refers to media content, not the game engine. Game programmers are a trivial fraction of the C/C++ industry.
This is not a fact. I've seen both of your types in the same company. Let's not reduce people down to "you're either this, or this". It's not a good way of thinking. People are more complex than that and have different things to offer.
Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field. Why would coding/tech in general be any different?
I think there are lots of younger developers in the hiring process who start out with a bias of assuming older developers have atrophied skills. Then when that bias makes it harder for older developers to find jobs, the younger developers say "ah, well that's meritocracy for you".
A doctor specializes in one aspect - surgeon, anesthesiologist, ER, GP, psychiatry, etc. They might pivot once in their career, but most of them don't, and they're able to find employment as long as they're able and willing.
I know several lawyers, and most of them had to specialize by their early 30's if they ever want to make decent money - family law, real estate, employment, personal injury, whatever. Again, the older the lawyer, the more seniority they have and higher they can bill at most firms.
How many professors in academia do you know that have experience teaching in multiple schools, e.g. business, engineering, social sciences, etc? Not many, usually they have a very narrow niche.
The problem with the tech industry is mostly due to offshoring, the rapid (and pointless) pace of new frameworks and tech that's mostly due to shifting dominant players, and the naivety of most software engineers who've been unwilling or unable to organize and create some sort of protective barrier similar to every other industry (teachers or cop unions, AMA, legal bar, UAW, etc.).
And again, because this is HN, the majority of developers are not worried that they won't be making FAANG salaries with sweet equity and stock options into their 50's. They're worried that they'll be training their 25 year old replacements from Bangalore at the typical mega bank or insurance company, left with only sporadic temp gigs and 6 month contracts at half their salary and with 15 years left before Medicare kicks in.
Do keep in mind that the industry has exploded in size over the years.
Is this really the case? My limited experience is that the amount of constant learning expected from a coder is an order of magnitude more than in most other fields.
That's not the exception to the rule, either. You're expected to stay current on trends in your field, and stay ahead of best practices.
Machine learning - we know that just means linear regression or Bayesian filters plus marketing, and we prefer programming. We've also seen 20 years of "magic bullet" solutions like ML fizzle and die in the real world and know most ML projects never see a day in production.
K8s is great if running k8s is your job. But it is a specialized skill that is only needed to run very large infrastructures, unless your project architecture (read: microservices) has gone very wrong.
20- and 30-year-olds think "keeping your skills up" means learning every new programming fad that blows through, because they don't have the experience to differentiate the fads from the advancements. It's like telling master carpenters they need to keep switching brands of tools to "keep their skills up". But all these tools do the same things and are 99% identical. They are busy building stuff with the tools they have.
It's not being lazy or a "dinosaur", it's just better time management.
The absolute worst devs I've worked with were very "up to date" people who wanted to re-do all the already-working "outdated" stuff to new "modern" standards. Often, it's just a waste of time.
Revolutionary advance in industry best practices. /s
During the dot-com crash, I went from working at a 40 person start-up to a 25,000 employee utility company, and it was a real eye opener. A lot of my "cutting edge" (for the time) skills were dismissed as being flash in the pan, and all the "real work" was done with tried-and-true technologies. I ended up finding my way back to a start-up a few years later, and everything was reversed again.
Ageism is rampant in this industry.
My current team has a good mix of industry experience and excitement for new technologies, which makes planning both effective and exciting.
I realise this might make me unemployable in a modern web dev environment. Maybe I can just ride it out until the industry goes through the rest of the cycle and rediscovers simplicity.
It's why people use it in the first place. I get the need for it when you're dealing with huge scale. But it seems to be the new default deployment model, for services that really, and I mean really, don't need to scale that much.
And I've seen people justify breaking a nice monolith into microservices (usually badly) so they can deploy it easier using k8s. Which is totally putting the cart before the horse.
Easier to run a small service in a predictable environment where nobody can step on your toes. Also pretty easy to adjust resource allocations, update pieces independently, isolate screwups (one part going down sometimes is better than the whole thing going down), etc.
I mean, of course you can't approach the task as "we want to deploy on k8s first, no matter what" - you have to consider the task at hand, and if it works better as a monolith, keep the monolith (you can still use k8s - it's just a way to run code; any code can be run within it). But if the task suits the model - e.g. many data processing/transformation/ML workflows do - having a tool like k8s can make deployment easier. One doesn't have to make a religion out of it, but if your problem looks like a bunch of processes working on separate tasks, it may be a useful tool to manage them.
Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.
For most (80%+) of the applications I've seen k8s used on, the performance question is not tricky at all. Monolithic performance would definitely be orders of magnitude greater.
I can't help but draw the conclusion that people are using k8s because it looks good on their CV. Whether I'm wise in being skeptical about k8s at my age is a good question.
Moreover, I can see quite a few places where on my last job (which didn't use k8s) introducing k8s deployment could help in some places. That said, "one size fits all" is never a good strategy. But I think saying "people are using k8s because it looks good on their CV" is wrong. It has its uses, just don't put it where it doesn't belong.
JS is simply garbage we are stuck with, where you have to learn years of accumulated footguns to avoid creating bad code. ML is a buzzword of limited scope. k8s is system administration by another name. Web and mobile technologies are useless to learn unless you need them RIGHT NOW, as they have a half-life of 18 months.
I want to learn the "force multiplier" sitting beyond what we are using today. GC languages were the last round of force multiplier, and we haven't had much since.
Right now, the only candidate that looks to be a force multiplier is Rust, but I would jump to something else that looked like a force multiplier.
ML is a "force multiplier", but it has limited scope. It might be worth learning depending upon what field I'm sitting in.
Dart goes in a similar bucket even with the native compilation.
These languages are all effectively Java with some makeup.
I'm not seeing much force multiplication. I see no new language allowing me to write something more than what I can write now.
It remains to be seen whether Rust will wind up as a force multiplier or not. But it's really the only current candidate even if it's not a great one.
It's easy to stereotype people who have unrealistic expectations based on entitlement because of their age, but past a certain point, a lot of employers will reject an older candidate even at the same price as a new one, on the assumption that they can't learn any more.
Of course, if you are employable, you don't bang your head against the wall, you go and do something else. Like any kind of discrimination, if they were forced to accept you, it wouldn't make the culture palatable. It reduces your opportunities though.
Of the companies I've been to, most were highly-tech focused (and pretty elite teams), but one was a large, distinctly non-tech company (with mainframes, even, and not the modern types, running legacy algorithms). I ran a small skunkworks team there.
I can't give great statistics, aside from saying that at the tech companies, people tended to fall into the category of older is better. Senior engineers were senior.
At the non-tech company, people tended to fall into the category of older is obsolete. The tech team was there for the paycheck. They worked 9-to-5, maintained a healthy work-life balance, and kept fairly boring systems running. The work they did was really outside of the company's core competency. It just needed to be done, and someone needed to do it. If layoffs ever came, I'm not quite sure who would hire them, though. It was culture shock for me. It wasn't specific to this one company either, but to the industry (we had pretty close contacts with both collaborators and competitors). This company was industry-leading, actually.
This has been my experience, too. Generally, the closer to doing technology as the main offering -- e.g. network engineers at an ISP, or developers at a code shop -- the better and more technical they are.
Once you get into the Enterprise, where they're handling specific apps and frameworks, they tend to get stuck in patterns and wither.
That's not to say you can't be an old grognard doing COBOL at a niche code shop, but when the tech itself is the offering you find much more technically competent seniors.
- I love coding, and I still code every day. I have no desire to move into other roles (such as management).
- As for job security, I think it is harder to find a job as a manager than as a developer, simply because there are many more positions for developers than for managers
- Family-wise, it is much easier at my age, because my children are getting into their twenties now. Much more free time for me (including for work if I want), compared to when they were younger and needed lots more attention (and being picked up from daycare)
- As for learning - that's one of the attractions of being a developer. There are always new things to take in, it never gets boring
- On the other hand, even if some things in programming are fleeting (the currently popular frameworks etc), there is a core of fundamental knowledge that you gain over the course of your career that will always apply.
Besides, age is not an indicator of quality of work, nor are college credentials. I’ve seen much older devs run circles around younger ones, and vice versa.
Even if you're right, it's not a great argument, because there isn't really a rational counterargument.
If the argument is based on a pattern you've seen before that's germane to the situation at hand - the "I've seen this movie before and it ends with regret and a data breach" argument - it can be much more convincing.
This sounds like you want the respect due to age alone.
The respect due to age alone is horrible. I wouldn't inflict it on my worst enemies.
The respect due to age and experience, on the other hand, is substantial, and something to strive for.
Back up your gray hairs :) with some relevant knowledge and the intelligent people will listen attentively.
This sounds like something coming from the advanced or expert beginner:
What I actually mostly do is say, "Doing X never works, because Y; instead I like to do Z because...", and it has reasonable success.
If it's two people arguing their own opinions at each other and neither one has relevant data, where exactly do you go from there? The one with 2 years' experience or the one with 15?
It seems ridiculous but I wonder if it would work.
Spot on. Ignore the popular framework of the month.
A lot of developers focus only on the surface (popular tools) and neglect fundamentals, algorithms, OS internals, how CPUs work, networks...
Of course if you aren't having trouble finding a job then you don't need to follow this advice.
The interview topics of on-sites are general enough and focused on principles, and, by far, the large majority of candidates are rejected for giving vague and superficial answers.
My experience convinced me that, as a candidate, I don't want to divide my time preparing and interviewing for many "typical" companies. I instead focus on very few interesting ones and maximize my chances there.
"shotgun versus sniper" if you will: you need one good offer, not 10 so-so ones.
Among other reasons, the typical buzzword-driven company is less likely to be an interesting environment to work in.
You won't apply to 100% of jobs. So you already make a selection.
Some jobs require highly specialized skills and knowledge. Keeping up with the framework-of-the-month costs time as well - time you could have spent getting better at Linux kernel programming or COBOL.
The "framework of the month" is not "a requirement for 20-30% of advertised positions". The "framework of the month" requires a minimum of a year to get to that point and 2-3 years is a lot more common, even in the JS frontend space. By the time something gets to 20-30% penetration it's past that phase.
I've tried to keep one main language and work with something different each time. It creates diversity in my resume but allows me to use my core anchor language to find a role.
How do you pivot to something brand new?
A mature engineer doesn't need to be bleeding edge. It's perfectly fine to merely be tracking technologies that have gotten to their high-growth phase.
The other thing I try to do is have coverage, such that even if I don't know a particular tech I have a story that says I can learn it easily. I've used enough DBs that I don't need to chase the latest things; I'm confident that even if I've never used a columnar database that I can pick it up quickly if I need it. I've used enough programming languages that I don't need to go chasing the latest one, because it is still frankly mostly a different spelling of things I've already used. And so on. So I don't need to go chasing everything all the time. I generally don't pivot into anything brand new, because within the domains I work in, there isn't much fundamentally new stuff left for me to pivot into.
(A lot of my recent growth involves learning how to do engineering while being more directly tied into the business and interacting with business people, and learning how to be an interface between business and tech, rather than more types of tech. I'm doing this from an engineering perspective rather than a "management" perspective, because it turns out there is quite a difference between the two once a company scales up enough.)
Once you have seen, say, 15 frameworks, it becomes easier to map a 16th in the coordinate space, and quickly learn it when needed by reusing the knowledge you already have.
I'll just add that the perspective of many years of experience brings clarity to many discussions.
= have you chosen tooling that is going to last five years?
= is the design as simple as possible or are we building a Rube Goldberg machine that will take on a life of its own?
= how are you going to train people / maintain the system? do we have a resource plan or is this a death march?
Hasn't happened to me, but I've had a couple older coworkers who seemed to be in that place, which just seems like such squandered experience.
It is squandered. It's one thing to know the right path based on experience. It's another to be able to share that experience in a way that benefits others and helps them understand why it's the best choice. Nobody knows everything either, so having a real conversation can result in learning both ways.
I'm being a little hyperbolic here, but after a few decades in the industry it's kind of baffling how few novel ideas are in these new frameworks. There is even a trend to go back to server side HTML and SOAP-like APIs. The tech industry sometimes just looks like the fashion industry. But maybe I'm just too cynical :)
I won’t say I reconcile anything, but I get a chuckle out of it. Makes it easier to learn the new flavor.
As The Who said, “Meet the new boss. Same as the old boss.”
To add another example to the two you provided: how many different designs and implementations have you seen for pub/sub and message-based systems (topics/queues)? I literally lost count. Even fucking Redis has an implementation!
They go back at least to the 1970s.
(Remember Enterprise Service Busses in the 90s?)
Even C has evolved, in a kind of meta way: C itself hasn't evolved much, but the ecosystem around it has evolved a lot. How common was something like Valgrind back in 1990?
Whenever you forget how much we've moved forward, despite occasional setbacks, look at this: https://www.youtube.com/watch?v=fPF4fBGNK0U :-)
And, of course, there's also just the Eternal September effect with the vast majority of developers needing to learn things that the grey beards have already learned but have no succinct way to communicate (or just aren't heard over the roar of constantly greater numbers of developers with each passing year)
Do you mean things like Swagger?
[NB There have been few things I've hated more than SOAP]
Can you explain this? Anywhere I have ever worked (Fortune 500 orgs) I have always had an "at will" employment contract as a W-2 employee. They can fire me at any time for any reason. Furthermore, when budgets are cut typically the first group to go is the IT department, since they do not bring in any revenue, unless the company is selling IT products and services.
Has much less to do with software than strategy: maximize core competencies and minimize overhead.
And the IT budgets are different. Very different. On a related note, remember when some organizations had IT (and specifically web dev) under their marketing departments?
I keep reading that on Hackernews, so it must be true!
And besides, AWS has a stable API and a set of CLI tools to actually manage everything in the cloud without button pushing.
I am 29 now. I used Jenkins about 5 years ago because I couldn't find a much better alternative for a single-developer workflow, but it's a glorious pile of buttons that has zero consistency and no API to speak of.
And they also look better.
For god’s sake, you could even use GitLab today. What does Jenkins give you? What is there to defend?
Secondly, to your point, there you go: https://stackoverflow.com/questions/17716242/creating-user-i... I'm pretty sure the reason there is no REST API for it is that you're supposed to be using your favorite back-end (LDAP, AD, etc.), with which Jenkins can integrate.
It's quite disingenuous to complain about the Jenkins APIs; there are a lot of them. They're not perfect, nor necessarily designed like something you'd design in 2020, but they are there.
I used to work for Motorola; engineers with gray hairs are gems, full of knowledge, always learning new stuff, and eager to coach newcomers. Unfortunately that stopped when outsourcing became fashionable.
I'm old too. I keep learning every day, delving into C++17 these days, and it just makes my day fulfilling.
Can you describe what you do/have done at your day job, and what you are doing on your side projects to prevent boredom at your day job?
I also like to try to get better at what I do as a developer. Mostly this is by reading books or taking MOOC courses. I think there is quite a lot to learn about developing SW well (because it is a really complex activity). So that also keeps me interested.
About eight years ago I also started blogging about SW development. I've found that trying to formulate what I think about it has also kept it interesting.
I guess it is part of the learning.
- Older software engineers (40+) won't be discriminated against if they are doing cool stuff and are doing different stuff than they were doing a few years back. I've written this a thousand times before, but we often confuse ageism for stagnation. If you've had the same job for 15 years working on the same system and using the same languages, you aren't a victim of ageism as much as you are being discriminated against because they don't think you'll be able to learn new tricks.
- On resumes, don't advertise your age if you're a bit older. We don't need to list that internship, first job out of school, or graduation dates from college. List 15-20 years of your career and leave the rest off. It isn't a biography.
But I don't do cool stuff. What I do is to keep a ~$25 million per year money printing system from falling over while the youngsters are in year 7 of the 18 month project to replace it with super cool tech. Making money is so overrated these days.
Coming from “sysadmin”/tech/keeper of the lights on: there is no win at the bottom line. Keeping the lights on always costs money, they always wish it could cost less (except when inter-company bragging about having the new hot tool), and when you’re doing your best possible job they will forget you exist. If something happens on your watch they can just as easily assume you have not been working at all.
One thing I’ve distilled from all this: Always keep at least some focus on how much your work costs vs how much value it’s providing.
I wish there were more like you these days to interview (I am just 42, so take it with a grain of salt), but the youngsters straight from college with a streamlined CV are sometimes difficult to deal with, imho.
Oldies can be hard to work with, and you're sort of proving it right here.
And everyone younger than you fits this description? You're quite bigoted, I hope you realize that.
Wossamotta U, Computer Engineering, BS
Doesn't need a date or any details, my job experience is more relevant, and if they've got questions, they can ask. If I had another good stint of work, I could just put Education: yes. Also going to leave off my college jobs and write Recent Experience. Although I got my last job through networking without writing a resume, and I'd expect the same for future work.
I started doing this years ago. If I listed every job on my resume, it'd be 4-5 pages long. Most of them aren't relevant to what I'm doing now, and only serve to make me look old, or like a job-hopper (which I am--about every three years--but job-hopping in tech isn't a negative unless your list grows too long). At first, I compressed older jobs into one-liners at the end, but now I leave them off entirely. I list 4-5 of the six jobs I've held in the past 22 years.
I have no way of knowing what impact this has on my employability, or how people interpret my age based on it, but I'm 47 years old with 22 years of experience on my resume, so I assume I come off as both old and senior-skilled. I never have to try hard to find jobs.
Jumping around because you can't keep a job or nobody likes working with you is another story.
In Germany you're expected to include your age and a headshot on your CV. I always considered that an insane invitation to prejudice and have only ever done it once, for a job I was guaranteed to get.
I'm sure other countries have other expectations.
In the US it's easy to forget that not everybody is "against" discrimination in hiring, even in tech.
If you keep building 'cool new features', you don't appear to be doing the same thing over and over again.
You’ll get cheap, extremely motivated employees who spend almost all their time at your company thanks to boardgame/role-play/pizza nights, and you get them to keep up to date on tech by having them do weekly tech workshops and presentations.
They produce a lot of cheap code - sometimes the quality is decent, other times it isn’t - and their most talented developers tend to leave for “adult” jobs after a few years, but overall it has been a very successful strategy.
When I joined Snap about 5 years ago, I was middle-aged compared to the average, which was probably in the 20s - even my CEO is reasonably younger than me :). The company was also pretty small (100?).
From that (single/anecdotal) data point, I'd say it is not as bad as you portray. There is definitely an affinity for trying new things - part of your job is evaluating a technology for what it's actually worth rather than its glitter, and ensuring you communicate that well / provide value. I always try to focus the conversation on the problem we are trying to solve and how the new tech solves it better. The good alternatives usually seem to catch on very quickly (see Go vs Java, Kotlin vs Java) compared to the also-ran technologies. Your worth will kick in based on your skill level in judging these, and definitely in getting to a mentoring role rather than a gatekeeper role. But it has been a lot of fun.
The agility is also something to keep in mind: Snap allows for extreme career mobility. Recently I switched teams from doing data related work (for the past 15 years?) to a complete unknown of joining our spectacles team and Snap has been supportive. You just don't see the typecasting you tend to hear about in bigger companies.
So yeah - give it a shot with an open mind! You may be surprised!
And this is exactly the problem with tech. At 35, you're only about one decade into your career.
Lawyers, doctors, professors, etc. at 35 would still be considered relatively young, with plenty of time to become partner, senior surgeon, obtain tenure, whatever. But in tech you're at peak salary by 35, and can expect only inflation raises from there on.
And worst of all, you've got 30 years left until you can obtain Medicare, so if you aren't working full time at a job with decent benefits, you're absolutely screwed, especially if you 'dare' decide to have a family, 30 year mortgage, etc.
People often live 'till their mid-80s nowadays, and can't receive any sort of benefits (in the US) 'till their mid-to-late 60s. If your career has peaked and is trending downhill at 35, and you're not making millions of dollars in your prime (professional sports, fashion models), there is something wrong with your industry.
After ten years in a mature field, haven't you learned, or even mastered, a large portion of your specialty? You're 95% there already?
As I've gotten older my time has gone from '20% think about it - 80% code it', to '80% think about it - 20% code it'.
I currently work with a surprisingly well balanced spectrum of developers at my current employer. The median age is in the early 30s, with probably a third of us 40 and older out of 150 developers.
I'd say with some confidence, that the older developers in my company complete as many "tasks" but write less code when doing it, with a lower defect rate.
Now I have some knowledge and perspective, even the ability to pick out the fads and novelties from time to time. I can try new things on small projects and I can go with tried and true for the big things and I tend to understand which projects are better for which approach. I can visualize the data, the models, the inputs and outputs, and think through the logic from beginning to end, all within about 30-50 hours per week. That took lots of late nights and a ton of trial and error.
There's a good place for the less experienced and more experienced on any well balanced team, to be certain.
When I say tech company, I'm talking about companies where software engineering skills actually matter - where an O(n²) algorithm will actually cost your business. In my definition of a tech company there are tons of new things happening. Look at spaces like AR, self-driving, rocketry, machine learning, and computational photography - no one in any of those fields is doing the same thing they were doing even 2 years ago.
How would you describe the difference between those two types of companies?
I'm finally at the point where I'm confident enough in my career to actually make noise and complain about things I'm tired of dealing with - which I wasn't in my 20s - but now I'm dealing with my complaints not causing any change in the end.
Some of the reasons:
1. Boredom of doing the same thing over and over in a different language/framework.
2. Still have to get "permission" from (possibly younger) manager before doing anything.
3. Have to leetcode after work to switch jobs.
4. Have to keep learning latest framework after work.
5. In direct competition with new comers who are much hungrier and with ppl who don't have many personal responsibilities.
6. Honestly, it feels a bit weird to be the oldest person on the team by a huge margin.
7. Younger devs assume you must be bad at your job, since you haven't grown in your career, and don't give you much respect.
> 1. Boredom of doing the same thing over and over in a different language/framework.
Sounds like being stuck in the same junior developer position for the rest of your life. Not really common.
> 2. Still have to get "permission" from (possibly younger) manager before doing anything.
What's wrong with that? Age is no virtue per se, and really it's the same situation in the military, business and other organisations elsewhere.
> 6. Honestly, it feels a bit weird to be the oldest person on the team by a huge margin.
Not in an appropriately senior role.
> 7. Younger devs assume you must be bad at your job, since you haven't grown in your career, and don't give you much respect.
Never an issue if you can step in and do their job if necessary.
Example: the almost universally used service infra where I work is a nightmare of excessive context switches and tuning to avoid starvation/deadlocks. Why? Because the kidz who developed it apparently didn't read enough to know that the basic paradigm it's based on is known to have such problems. The people around me think this is normal or inevitable, and just live with it, but even the person who did most to popularize these ideas recanted a decade ago. Too bad; we're just stuck with it, because it's the young folk who refuse to learn.
Your older coworkers probably haven't lost their "building muscles" and aren't reluctant to learning anything. They're reluctant to repeat or build on past mistakes. Overall, your comment seems like a good example of how older programmers are often misrepresented by those who don't share their experience. Let people represent themselves.
... In some fields of software engineering!
There is a whole world of Software Engineering outside Web/Infrastructure. There are plenty of Software Engineering fields out there where slow changing standards and toolchain reliability are considered a valued feature.
That said, I don't disagree with the need for self-study in Software Engineering; I just disagree that this need originates from some issue with the high churn of languages/tools/frameworks (which is local to Web/Infrastructure).
It makes no sense to me to try to stay up to date with front end development when front end developers are rapidly becoming a commodity with a bunch of boot camp grads. The money isn’t worth it.
In my 20's I used to enjoy coding just for coding itself. I was always excited to learn a new language, a new library/framework/etc.
When I reached 30-ish, I was more interested in building things. The "coding" part was more of a chore to be honest, I just liked doing it because there was a product to build to solve problems. And the thing is, I can probably build any product with the languages I know today. So many of these "hot new things" are just rehashing ideas that have been around for decades.
After decades in the field, I feel like I'm mostly always doing the same thing. Grabbing data from here and there, making sure I rate limit and fail gracefully, serialize shit and deserialize a response, or vice-versa. Writing software is just plumbing, and it's infuriating when people keep changing the size and shape of the pipes for the sake of it. That's why to me the actual product being built is more important than playing with tech just to play with tech.
I went to school for CS and ended up being a data janitor. I get paid FU money to do what is essentially ETL in many fancy and various forms. All the excitement is gone.
Got half a mind to switch to interior design.
Do I need any of that or is it worth it? Surely not. But the freedom I feel now making 3x what I did when I worked in retail is something I can't really put into words well.
I also think there's a frustration limit. Like if your spouse is being a jerk, the kids are being dicks, and the plumbing broke you don't want to mess around with a new technology. You want something easy and easy is what you already know.
>I feel so broken because of it and I know my next job is going to be a third of what I'm paid now because it won't be software.
I would not do that. You're probably just burned out. Take some time off and switch jobs before switching careers. Especially one that will pay 1/3 of what you make now. At least start saving up 2/3s of your salary for a while to see what it'd be like.
I'm currently saving about half of what I make. Finances will be easier when I'm not the only one with an income and my partner is days away from a very probable job offer.
I'm wanting to transition to project management and eventually product management. Product management can pay pretty well ($150k+) once you're established. I can afford to not really save much for 5 years until I'm back into six figure income.
It's shitty because it doesn't advance me at all, there's no room for growth, no opportunity for learning something that would be meaningful at literally any other job, there is zero social aspect to it since I'm 100% remote and work on projects alone, all my coworkers are on the opposite coast of the country.
I have tried using my free time to get more familiar with ML/AI stuff, but my brain just shuts down as soon as I begin to try. I would love, in theory, to transition towards AI work, but the learning curve feels so steep. But the pay would be great, I think there is an absolute shit ton of upward room to grow, and I could probably work on some pretty interesting problems (though I'm sure there are tons of "make ads more effective" ML jobs out there too). Maybe if I take some time off work I can try to get back into it, spend 6 months learning, and try to get an entry-level ML job.
Fast forward to 2016, I was married, with a step son who was a freshman, tired of working on yet another software as a service CRUD app at my 3rd job since 2008 as an IC, and jumped on an opportunity to be a dev lead at a medium size non software company.
I thought the next step was to either stay a hands on dev lead/ “architect” and just muddle along for the next 20 years, go into management, or go the r/cscareerquestions route and “learn leetCode and work for a FAANG” and move to the west coast.
Neither sounded appealing. Then management decided to “move to the cloud”. I didn’t know anything about AWS at the time and saw how much the “consultants” were making, and that opened my eyes. If these old-school netops folks could pass one certification, click around in the console and make $200K+ a year, imagine what I could do if I knew AWS from the infrastructure and dev ops side and I knew how to develop and architect using all of the AWS fiddly bits.
It took three years and two job changes in between, but I really like consulting. It’s the perfect combination of development, high-level architecture and customer engagement, and you never know what you will be doing in three months - or in what language.
I started looking for a job and got lucky that another company was trying to build an in house development department led by a new CTO. They had outsourced all of the development before.
The new CTO was very forward looking and wanted to make the company “cloud native” and improve the processes. He only had a high level understanding of AWS as did I. He took a chance on me and I became both the de facto “cloud architect” and the person he called when he wanted a customer facing project done from the ground up without having to deal with the slow moving “scrum process”.
I was quite happy at the company and would have stayed a couple of years probably even knowing I could make more money somewhere else and then Covid hit along with an across the board pay cut.
I was still not really looking, a 10% pay cut at a time when we couldn’t travel or really go out was an inconvenience but not earth shattering.
Then a recruiter contacted me for a software development position at Amazon. I wasn’t willing to relocate or do the leetCode monkey dance but we talked a little and then she forwarded my information to a recruiter on the AWS side.
I saw the interview process was basically a high level technical interview to determine whether I knew the basics of AWS (I did) and all about the Leadership Principles. I knew I could answer the “tell me about a time when...” questions with the best of them and the interview process was going to be fully remote.
To keep a long story from getting longer - I work at Amazon as an AWS Consultant from the comfort of my own home in the suburbs in a low cost of living area.
Anecdotally from our conversations, I would say that they see computer programming as being less valuable than people programming. My friends do have lots of ideas for things and one in particular will keep saying that he wants to "re-learn iOS dev" or "learn Elixir", but he never does. I've started down the path of learning how to angel invest, which is where I'm trying to learn my people management skills.
> Would you trade places with any of those friends?
> Would either of them trade places with you?
I doubt it.
> Do you have an 'exit' plan out of software engineering or do you plan to stick with it to the very end?
I'm already on a trajectory where I won't need to work a traditional job the rest of my life. None of my friends who are climbing the management chain are anywhere near that. I honestly love building and learning (I just finished up a 3.5 day hack-a-thon yesterday). My next path will lead me either to building a company, helping people start tech-enabled companies or helping someone co-found a company. Unless I become a CEO, I don't plan to stop programming.
Actually I'd say that's a sign of maturity. They've grown wise to the idea that our industry has a fetish for reinventing the wheel and refuse to take part in it.
I'm a software engineer and I have no need to keep up with random web frameworks popping up left and right. I'm fine writing my C++ and C#, thank you. I have to keep up to date but the stack evolves over a decade, not every quarter (or what ever the cadence is for web stuff).
Become an expert in a field and then it's irrelevant whether you know framework xyz or not - it's no longer a critical requirement. It's critical that you have domain knowledge; whatever the tech stack is, it's expected you can get up to speed on it just fine on the job.