I learned FORTRAN in 1968. I learned Scala in 2012.
I'll be 61 this week...and I just started a new job developing in Scala.
EDIT: He completed the iOS application under budget with wonderful tests.
If you don't mind me asking, age aside, what tests qualified as wonderful? Coverage? Integration or unit? I am curious. Which framework? Also, was he one of those rare individuals who could make the app UI look beautiful on both iOS and Android?
The UI was set up in advance by a designer, so he didn't have much choice in that, but he implemented it pixel-perfect. No idea on Android, he was only doing iOS for us.
So, while a lot of the bids we got were at $85/hr, him working at $180/hr for 35% of the hours (an effective $63/hr) was still cheaper (by a good bit) -- and we got a rock solid result.
Rates are meaningless until multiplied by hours.
Not really. If the contractor gets sick, it's still your problem.
I care for my contractors and always make sure they are paid enough to afford good medical care and a stable financial situation. It makes sense, even from a deeply selfish point of view - the less time they spend worrying about such things, the more they spend working.
As for him being my problem: if a contractor can't work, you replace them. Depending on how the contract was set up, you might collect some insurance to help get over the ugly transition.
As for deciding what's enough for a stable financial condition, I can't. What I can do, as a manager, is pay attention to warning signs.
Years ago, I had a programmer, young and very bright, who kept taking on lousy freelance projects on the side. He was under great pressure because he had financial problems. I didn't want to lose him, but burnout was already on the horizon. When performance review time came, I enrolled him in a personal finance course. Within six months, the freelance jobs went away, he started coming in earlier, and the already good quality of his work improved further. He had more time to study, more time for his family and more attention for his work.
I am 62 and I have been consulting for 15 years, so age has not been such an issue. Honestly, I can understand why a company might not want to hire an older software engineer, but in my experience companies don't mind old consultants.
When asked for advice by people wanting to have a lifestyle (work on your own terms) consulting business, I suggest they, in addition to working, also invest time in writing a lot (blogging, books) and contributing to open source projects.
I learned C++ in 1999 and it still does the job in 2055.
As a guy in his 30's it would be interesting to read more on how you did this. Were you laid off at any point? Did you learn Scala on your own or through a previous employer, or a class? And finally, how hard was it for you to make your transition & land this job?
1. Finding ANY job when you're 60 is hard;
2. Engineers (who are still engineers) at 40+ will often be passed over in favour of twentysomethings.
Engineering is such a young industry that I'm not sure we've really faced (1) yet (since the number of people who started engineering in the 70s is but a drop in the ocean compared to the number in the workforce today). It'll be interesting to see how the industry has evolved in 30 years.
(1) is a pretty strong driver for everyone having a long-term plan to control your own fate, which means working for yourself. That is, if you're not in an industry with inbuilt protections (eg teaching).
(2) is harder to pin down.
On the one hand, there is a certain (un)survivor bias in that many who started as engineers in their 20s are managers or the like by their 40s. So are those who remain in engineering a good sample?
Also, the older you get in general the more non-work priorities you have. Family, for example. This can reduce the amount of time you spend on self-improvement. This industry almost requires constant learning, reinvention and skill acquisition. Is it an example of ageism if someone who is 45 can't find a job when what they know is Ada, Cobol and Forth?
It's often said that a good technical foundation and education means you can pick up any language. This is true, but picking up new skills is both a habit and a skill, one that atrophies if left untapped (IMHO). It's also a question of degree. Perl to Python? Not a big gap. Ada to Ruby? Well....
That all being said, there is age discrimination in this industry. I've personally witnessed someone say "I prefer new grads so I can mold them". That's fine and all but if you say that and don't hire a 50 year old programmer that's a good basis for an age discrimination suit because, well, it is age discrimination.
So yes, you either need to find a job that doesn't change (eg a plumber or a teacher) or you need to constantly battle to maintain relevancy in a fast-changing industry or you need to become master of your own fate.
This is known.
Um, I'm 54 and the languages of my 20's were C, C++, etc.
I also fairly regularly get calls from recruiters.
> So yes, you either need to find a job that doesn't change (eg a plumber or a teacher) or you need to constantly battle to maintain relevancy in a fast-changing industry or you need to become master of your own fate.
Or, you can be a driver of this change yourself by developing a modern, advanced programming language.
Not everyone has a wikipedia article :)
1. create a website under your own name, like walterbright.com, and put your professional work there
2. create a github account under your own name, and contribute to significant projects using it
3. ditto for stackoverflow, hackernews, gamedev.net and other relevant forums
4. write interesting and useful tech articles under your own name
I.e. your name is your brand, make use of it. If you use a pseudonym and hide your contact info or otherwise put up barriers to being contacted, you'll lose opportunities.
Aside from that, one of the best ways to get recruited is to show up at tech conferences and engage with the other participants. A lot of companies use tech conferences as ways to find good engineers.
If you're broke and can't afford to go to a tech conference, submit papers to them and try to do a presentation - speakers usually get expenses paid.
In my experience using a pseudonym consistently works just as well. Most people don't even realize my "real" name isn't Swizec until we start signing contracts.
Yes, I also introduce myself as Swizec when meeting people in person. Especially people who might google me.
The fact that you are the inventor of the D programming language puts you in a very different league of people.
The C guru looked at me and said "Who the fuck do you think you are thinking you can write a C compiler?", and not in a nice way, either.
So what's stopping you?
And frankly your post had me thinking this whole afternoon.
What is really stopping me? And I have to frankly say: nothing. Thanks for this valuable advice.
Internet branding is overrated. Personal network is a lot more important :)
One thing I've discovered is that knowing the new stuff in conjunction with some of the lost arts (messy old C builds, hardware-related mysteries, debugging without debuggers) gives me some skills that are not easy to come by, and that is a real advantage getting work. There will always be legacy systems that are 10-15 or more years out of date, and a market to keep them running.
the languages of my 20's were C, C++, etc. As in "I wrote compilers for these languages".
If you aren't getting calls from recruiters, then none of the rest of us will.
For those of you who don't know, he wrote, so far as I can tell, the first native C++ compiler.
ex-MWC guy: ...ihnp4!mwc!wgl
Judging by HN, about 60 per year
Do people really think if you're 45 you worked in Cobol?
It's a weird time to be a mainframer. If you are an experienced specialist you can charge insane hourly rates. At the same time, some bigger orgs have started transitioning off the old stack because all the experts keep dying or retiring.
At some point mainframe specialist skills will go from insanely valuable to completely worthless.
Although after I left Quantico they put me in a billet working with dBase III, I could just as easily been assigned to a billet at a RASC where I would have been using COBOL.
I once worked at a startup whose cofounders told me that I was the oldest person in the office (having just turned 26) because "people over 30 don't get technology", meanwhile, they couldn't understand HTML, much less programming, and therefore had to hire me.
The very best programmer I know works primarily in Ada at his day job, whereas I know all the trendy web fads. If I had to choose him or me as an employee to write code, regardless of language, I'd choose him, objectively. If he starts working on whatever I'm working on, he'd almost certainly be better at it than me in a week. Whenever I have a difficult programming problem, I can struggle with it for hours (or days), and he can make it clear in minutes. He can look at my code in a language he's never seen before, for example, and solve my problem. It seems like a lot of companies are in the habit of hiring resumes instead of people.
The older you are, the more time you've had not only to gain concrete skills, but to learn to abstract your existing skills and knowledge to apply to other things.
Some of the seemingly widespread opinions in this industry (or at least among startup scene types) make me wonder how most of you manage not to drown when it rains. Some of the stuff I hear just makes no sense. It's not based in logic.
Fortunately for guys like my friend, defense contractors and other huge corporations will be hiring devastatingly skilled people who have "boring, obsolete" skillsets for the foreseeable future. They're probably too smart to work at our on-average-destined-to-fail startups anyway.
Contrary to popular belief, computer skills really don't decay that much. "New" things aren't really that new, they're almost always related to or based on something we've seen before, and if you're really any good at what you do, it won't take you long to get a grasp on it.
Analogy: If you wouldn't hire Bruce Lee because he doesn't know the latest direct-to-DVD self-defense technique by Generic UFC Guy (TM)... you're making a huge mistake.
tl;dr: make snap judgements and you're going to wind up with sub-optimal employees.
Unfortunately, the older you are, the less flexible and adaptable you become; you run short of energy, you can't take risks because you have a family, and you can't work too hard or push yourself too much because of the associated health risks. The list is endless. It's not a very comfortable position to be in at an older age.
As someone in his 20s, watching people in their 40s (my uncles) and my father (in his 60s), I can tell you the only chance we engineers have is to make loads of money, real estate and savings before we are 40-45. Else we are screwed. That, unfortunately, is the most blunt way to put it.
You may believe in whatever you may like about experience coming with old age, but your employers will take raw passion and energy over accumulated experience any day.
Your best bet is to make a lot of money and be safe or risk going through a lot of pain when you grow old.
I don't spend as much time on work as I did when I was younger, but I get a lot more done. I have also learned a few things about how to motivate myself (eg, delayed gratification).
If you're trying to live a good life, your best bet is probably not to focus entirely on making a lot of money, but to find what makes you happy. As a physician, you get to meet old people. Now that I know some old investment bankers, I don't envy the ones who focused on making tons of money. They can't climb the stairs to their ocean view bedrooms, they can't fit into their British sports cars and they never have time to drive them. Managing the paperwork of their assets becomes their full time job.
My dad is a 68-year-old engineer, he seems happy, and making more than ever as a consultant. The fact that he likes restoring hot rods works well when the money guys find out and they can exchange pictures of Shelbys.
Best advice for growing old: buy a one-story house. You can get to your bedroom and your kitchen.
Bullshit. Stereotypes are just stereotypes. If you follow one, you will be one. I am a 41-year-old engineer (programmer) and entrepreneur. One project I am coding now has Java, C++ and Go all together. The younger programmers on my team were sort of scared to touch the C++ part. I am encouraging one of them to learn Go.
I do two projects at a time, one of my own and another for a client. Support the former a bit with the consulting money I get from the latter.
I wrote the above to counter your ageist myths. You are younger and I presume flexible enough to correct them :-)
PS: I didn't down vote you BTW.
Why would you think so? I left home at age 18. My parents were 37 and 40 years old at the time. At that time and for many years after that, I had tons of non-work priorities, certainly more than my parents. And I don't think that's an exception. Most people never really make work their priority. Don't mistake that for an age related phenomenon.
"Is it an example of ageism if someone who is 45 can't find a job when what they know is Ada, Cobol and Forth?"
No. Is it an example of racism if you didn't want to hire a drug trafficking nigerian scammer and pedophile as a baby sitter for your 5 year old daughter?
The idea that these are even relevant examples is ageism or racism. I don't know a single 45 year old software engineer who knows Ada, Cobol and Forth. They didn't serve in Vietnam either.
Everyone knows they have to be up-to-date with current technologies and it becomes easier with age because many changes are very superficial to the point of being trivial.
Ruby is Ruby, but do you use webrick? mongrel? thin? unicorn? passenger? Whoops! You chose what?! everyone knows that hasn't been supported for 6 months when the guy writing it quit supporting it because someone insulted him on a mailing list!
The stuff around a language is generally what changes more often than the language, and keeping up can be a challenge, regardless of age.
Do you mean 'software' engineering? Given it's about a 40th reunion, it's likely the comment was about a more established field of engineering.
If anything I would have thought there would be more 40-60 year olds in the work force than 20-40 year olds (or at least a close number). So either some of them have to move into management of non technical staff or you end up with a very top heavy management structure.
Tech companies seem to prefer a flatter management structure to avoid too much bureaucracy.
Not so well. Pyramidal company structures get narrower as you go up.
"Moving to management" doesn't mean you suddenly just create Powerpoints all day. If you're a good technical manager you can suddenly use your skills to multiply your contributions far beyond what you'd be able to do as a single person.
Let's start with the common premise that a good programmer is 10x better than a bad one. A good first level manager, who has 10 reports, could effect a difference of 100x. And a good senior manager, who has 10 midlevel reports, could effect a multiplier of 1000x.
If approximately half the workforce is between 40 and 60 and half is between 20 and 40, then half your staff are managers. That would work if only two people report to each manager (branching factor of 2), but it seems very inefficient to have so many levels of management: log_2(#staff).
On the other hand, with a branching factor of 10 only 1/9th of your workforce are managers, so management positions only support those between 55 and 60.
The point of the original comment was that you can't have "move into management" as the default career track for people in their 40's. There just aren't enough management positions.
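The branching-factor arithmetic in this thread is easy to check directly. A minimal Python sketch, under the simplifying assumption (mine) that the hierarchy is uniform: every manager has exactly `b` reports, and only leaf nodes are individual contributors:

```python
# Fraction of staff who are managers in a uniform hierarchy with
# branching factor b and depth d. Level k has b**k people; everyone
# above the leaf level manages someone.

def manager_fraction(b: int, depth: int) -> float:
    level_sizes = [b ** k for k in range(depth + 1)]  # 1, b, b^2, ...
    total = sum(level_sizes)
    managers = sum(level_sizes[:-1])  # all non-leaf levels
    return managers / total

# Branching factor 2: managers approach half the staff.
print(round(manager_fraction(2, 10), 3))   # ~0.5
# Branching factor 10: managers are roughly a tenth of the staff.
print(round(manager_fraction(10, 4), 3))   # ~0.1
```

This matches the comment's point: with two reports per manager, half the workforce would be managing, while a realistic branching factor of about ten leaves room for only a small fraction of the staff in management.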
Totally agree. The thing is that applies not only to management but to senior development positions as well. Most firms have no need for a workforce that's 100%, 80% or 60% senior devs. There is a LOT of grunt development that needs to be done, and the older you are, the more money you're going to want.
We can call them managers, senior devs, whatever, but the truth is that there's not enough room for everyone to constantly be moving up and making more money. Engineers have created a very egalitarian workforce and in most cases it's great but this is one of those cases where it doesn't work. Industries that have been around longer have already figured this out (law, banking, etc) where you have the "up or out" mentality. There's been a lot of work and research done showing how and why this works and how to create the proper balance depending what sort of work you're doing. The more cutting edge work you're doing the more senior devs you need but the lower percentage payout the partners get.
It's not a matter of moving people to management or not but a simple fact of we can't all be at the top of a firm and if you're not moving up it's better to replace you with someone younger and cheaper who can do the same thing.
Check out the book Managing the Professional Services Firm for more, it's a great read and really applies to software.
Engineering is an art.
In tech you are trying to make something no one has done before so you want the young hungry engineer.
My experience indicated that they were planning to use too many "new things" simultaneously. If any one beta-level technology failed, the whole project would fail. My Plan B was to essentially solve the main problem it was meant to solve with a minimal core built with tools we knew would work. Put that online on time, then incrementally upgrade features using the new technologies.
Since my group would need some of the services of this software project, I went off and built part of Plan B myself (the part covering what our group would need) as a secret side project using old fuddy-duddy technologies that worked but weren't new, exciting, and unknown. When the 90-day project wasn't ready on Day 90, I put my own online. It worked perfectly. This VP and her minions were angry about it. "It demonstrated a lack of faith in her team."
Two YEARS later, their 90-day Plan A finally shipped. She and the CIO never forgave me, though. I, the old guy, had used 80/20 to deliver the core of what was needed on the day they said it was needed, and I did it alone. Still, they remained convinced that "young, hungry" engineers were more likely to "be on board" with projects like the one they designed, and that was what mattered.
I've seen countless failures like this. I was able to stop a couple before they collided with harsh reality. It never got me any friends in upper management, but saved a couple companies.
If your success had made a big story, it would have become a case study. It becomes intellectually insulting to them.
It directly means they are not capable of doing their jobs properly.
One fact everyone needs to realize: it's great to be young, but age is not static. Those who are young today have a very narrow window of 9-10 years. Beyond that we are all the same.
The Asian locales had no old system to fall back on, so when Plan A didn't ship and I revealed my (fully localized) Plan B, they adopted it immediately. If I hadn't done my Plan B, the Asians would have had to go back to product registration by postcard.
Alternatively, choose the older engineer with the experience to know: "You know what? Fundamentally, what we're building here is the same as this thing from 10 years ago".
I walked through it with him and explained that you don't realize how much stuff is the same anywhere. Communicating with customers & stakeholders is the same anywhere. Working with your coworkers is the same anywhere. Managing a project has the same steps in every company and if you're good at it it will translate. Likewise requirements gathering. Your knowledge of various domains that you pick up. Etc etc etc.
MOST of what you're trying to do has been done before. All the day-to-day work that makes your business successful has been done thousands of times before. The only thing that's new is a few minor implementation details.
The other thing is all these founders are so young they may lack the experience or maturity to hire an older person.
This entire thread is based on the following:
"I saw this comment left on a Wall Street Journal article:"
So in other words somebody made a comment.
Not even a WSJ story, or a story in any newspaper, but a blog comment.
Add: I mean if someone had posed an "Ask HN" question and everybody gives their thoughts (anecdotal or otherwise) that's fine. But this???
Add: With the resulting suggestion, once again based on a blog comment taken as being true, authentic, researched and validated: "Do your future self a favor, find another occupation."
That being said, as a serial startup founder, on a few occasions I've had developers working for me in their late fifties and even sixties. I was, in fact, in my mid to late twenties at the time, and I had great relationships with these guys. They had kept their skills up and their experience enabled them to sometimes do the work of ten people.
In my first company, I had a sixteen year old whiz kid who was possibly the most gifted hacker I've ever worked with, and we later hired a portly, bearded, late-fifties nudist Unix guy who balanced the youngster out quite nicely.
I'm now working in a startup where of the eleven of us, nine of us are in our forties, including three 40-something engineers. I've never worked with a better crew. And our stack is about as cutting-edge as would be responsible in a serious startup. Our VP of engineering literally helped invent Java at Sun, but moved on to new things later on instead of getting stuck.
You had me until "nudist." Dare I ask?
(...and Other True Tales of Silicon Valley)
(To be honest I'd be down for a nudist hackathon, if it could be handled maturely... hah!)
1) Get your MBA and go into management;
2) Start or join a consultancy;
3) Start a company.
My buddies in more traditional engineering fields (aerospace, chemical, etc) are all in the process of setting themselves up with exit options (we're all just around 30).
Without ever doing anything he gets at least two calls a year from people looking to hire him away. (Though I guess the last thing he would want now is be flexible and move somewhere else or commute for hours. So in that sense he doesn’t have that many options should it ever not work out anymore with his current job – that he took right after college, by the way.)
There is a reason "developers" are getting hundreds of thousands of dollars in retention packages to keep them from starting their own shops.
"I know two or more engineers around the age of 60 that graduated from MIT or Stanford 35 years ago. Of those two or more, at least one has lost his job, and at least one is afraid of losing his job."
Here's what the article doesn't say: what field these engineers are in, whether or not they're good at their job, what company they work(ed) for, what they work on, how up-to-date their knowledge is, etc.
Maybe there is age discrimination in the engineering field, but you certainly can't infer that from this anecdote.
Wait, what I did 40 years ago won't guarantee my employment now?!? What is this insanity, I thought once I got my degree in 1973 I was done with all this "learning" nonsense. How can I be expected to just keep updating my skills?
You can guess my approximate age by knowing that my high school GPA was around 3.5 and that was considered pretty good, but not near valedictorian level.
However, in the 2010s you can get a 3.5 just for attendance so you can tell I'm older.
I'm so old that the highest grade point I could earn in a class was 4 points; I guess nowadays an A+ in an AP class gets you 6 points, or something like that, in modern high schools.
1) No one cares what the recruiters think. If you mean interviewer that might be different.
2) My GPA was worse than that, and it was only 5 years after the fact, when Google totally-ignored-my-GPA. Anecdote stalemate.
... I might add ", bitch!"
Nearly every employer in the UK asks for your degree class.
It's not a practice I like, but it's something everyone will have to face. The older you get, the more people want to know about your past (which is rarely as good as it should have been).
SICP, which, when it was published, summed up a lot of what I had learned as an undergrad. Nobody reads it to write a mobile app. Since I read it, I had jobs or book contracts that depended on knowledge of LISP machines, bit slice, the original Macintosh OS, Windows telephony API, edge routers, core routers trying to be edge routers, J2ME, Linux on ARM, and Android. Android will be around another 10 years, maybe a bit more or less, and it too will be a museum piece.
My shortest trajectory from zero to Noted Authority and back to zero was when I had a book contract to write about Visual J++. Thank you, Sun's lawyers, for that short trip.
All that said, if you don't know anything about project management tools, estimating market size, or how to write, or critique, a cash flow and P&L projection, your knowledge is going to be hard to package in a form that has high value.
For example, it's pretty embarrassing that we're re-inventing databases because today's generation of "experts" can't be bothered to understand SQL. And don't get me started on how an entire generation of coders will be able to work professionally without knowing how to manage memory. It isn't difficult to find examples of youth and exuberance trumping wisdom in this field -- I've already encountered working developers who are happy to tell me how technologies invented in the early part of this decade are "totally outdated".
The nature of SW is to create abstractions that are reusable. So if you've built a good reusable abstraction, no one needs to know what's "under the hood" except in rare circumstances.
Ask your frontend web dev to explain some differences between the x86 and ARM ISAs. Most won't have a clue. Even much higher up the stack, most probably couldn't explain how TCP/IP works. For the most part it's irrelevant.
Other fields don't/can't build on lower levels in such a way that practitioners never have to learn the lower levels at all. How many surgeons are there that don't know cell biology?
Your best hope is either to continuously churn with the latest trends, or to pick a part of the stack that is unlikely to change and become a niche expert (aka a networking guru, or x86 god). Your job opportunities will be much fewer, but if you can find your team (e.g., compiler architect or processor performance analyst) you're likely to have a job for as long as you like.
The evergreen underpinnings, the basic knowledge of how to build systems floats through languages and technologies.
Granted, there's some selection bias going on: I don't see many 40-something programmers in SOMA. That said: I don't see many 40-something programmers in SOMA.
Computers became cheap enough to have at home. BBSs and then the Internet took off. Computers were doing stuff -- but weren't commonplace yet. It was an odd window during a technology's birth that I think got people hooked in a way that is hard to emulate. Too early, and you are on punch cards and computers are rare. Too late, and you have app stores and iPhones and stuff just generally works, so it can be treated as a dumb tool. Sheer luck.
However, and I know that people will take the opportunity to disrespect me for this, but I realized, after playing around with 3d C++ graphics in college, that I wanted to build interactive desktop applications quickly. So I learned to use a very powerful tool called Visual Basic 6. Not because I didn't understand memory management or anything else, but because it was a better software engineering decision for most projects since it allowed me to not repeat solving the same problems that had already been solved.
I know people aren't going to be able to understand this because of the reputation that Visual Basic has, and they will lose respect for me, but I am going to go ahead and say it anyway. Visual Basic 6 was and still is a better software engineering tool than C++. Why? One of the core principles of software engineering is DRY. Manual memory management means you are repeating yourself. It also means you are doing the memory management yourself rather than letting the computer do it for you. Another reason why Visual Basic 6 is superior software engineering over C or C++ is the component models.
I eventually moved on to .NET and C# because of the stigma of VB and also because the new frameworks provided better tools.
Eventually I realized that the closed source world of desktop Microsoft development was outdated since it was tied to one platform and old-fashioned business models. So I moved into web development. The most practical tool for web development was PHP. It was the best software engineering decision to select PHP to build a web application. Because it allowed me to avoid solving the same problems that many other engineers had already been faced with and solved on their own. Something that many software engineers don't understand is that not only is it important not to repeat yourself in your own code, but not to repeat solving the same problems other people have already solved for you.
So then I wanted to build a realtime interactive collaborative software application in the web browser. Because that is the most compelling, useful, and challenging type of application. So I tried to do it with PHP. Very awkward. I tried to use another popular server-side tool, Python. Again it turned out to be very awkward to do realtime collaborative software applications with Python (using things like Twisted).
So I'm sorry, but if you're not adopting new programming languages, tools and frameworks, then I don't think you are a good software engineer. Because good software engineers only solve the problems that they have to, and don't continue to solve the same problems over and over.
Yes, and? You're arguing a straw man. I'm not claiming that things haven't changed for the better in 30+ years of coding, or that people should be writing web apps in FORTRAN, or that you shouldn't re-use code.
But you're being...imaginative...if you believe that the latest shiny webapp tool (e.g. Node.js) is actually as new and innovative as you're claiming it to be. Anyone who has been around long enough to have written early Windows or Mac apps will have a lot of experience with single-threaded, evented programming. (Certainly enough to know that Node.js is a wonky regurgitation of a very old idea.)
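For readers who haven't written early Windows or Mac apps: the single-threaded, evented pattern being described (a message pump dispatching queued events one at a time, which Node.js revived) can be sketched in a few lines. The event names and handler API below are purely illustrative, not any real framework:

```python
# Minimal single-threaded event loop: queue events, dispatch them one
# at a time. No handler can preempt another -- the same constraint old
# GUI message pumps and Node.js both impose.
import collections

events = collections.deque()
handlers = {}

def on(event_type, fn):
    handlers[event_type] = fn

def emit(event_type, payload=None):
    events.append((event_type, payload))

def run():
    # Drain the queue sequentially on a single thread.
    while events:
        event_type, payload = events.popleft()
        handlers[event_type](payload)

on("greet", lambda name: print(f"hello, {name}"))
emit("greet", "world")
run()  # prints "hello, world"
```

Long-running work inside a handler stalls every other event, which is exactly why both the old message pumps and Node.js push blocking I/O out of the loop.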
Same thing goes for your MongoDB apologetics: actually, no, the industry didn't settle on relational databases because "that was how you did it". They became widely adopted because they hit a sweet spot in terms of speed, reliability and flexibility (a spot that basically none of the current NoSQL products have yet to match).
In other words, even if you like new things, experience helps you know when the new thing you're evaluating is a pile of crap. And judicious decision-making is far more important than the latest shiny when it comes to being a good engineer.
So in your opinion Node.js and MongoDB are piles of crap?
Then, just curious, what system would you use to build the back end for a realtime collaborative web application, or an API that needs to hit several web APIs and do other IO to service each request?
And how do you recommend that we handle data storage for small business web applications? Really would like to know if there is something that is so much better than Mongo. You really think I should go back to building out fully normalized relational databases for small business web applications? What language/platform do you use, and how do you handle things like ORM?
I would like to know if you have ideas that are better. In my experience, Node.js and MongoDB (throw CoffeeScript in there too) are not crap, perform very well, reduce lines of code, and are better software engineering decisions in most cases. Have you actually tried building applications that way?
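For what it's worth, the "hit several web APIs and do other IO to service each request" case is a few lines in Node-style JavaScript. In this sketch, `fetchUser` and `fetchOrders` are made-up stand-ins for real network calls:

```javascript
// delay() simulates an async IO call that resolves after `ms` milliseconds.
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// Stand-ins for upstream web APIs (hypothetical names and data).
const fetchUser   = (id) => delay(30, { id, name: 'Ada' });
const fetchOrders = (id) => delay(40, [{ id: 1 }, { id: 2 }]);

// One request handler fans out both calls concurrently on a single thread.
// Total latency is roughly max(30, 40) ms, not the 70 ms sum.
async function handleRequest(userId) {
  const [user, orders] = await Promise.all([
    fetchUser(userId),
    fetchOrders(userId),
  ]);
  return { user, orderCount: orders.length };
}
```

Whether this is better engineering than a threaded stack is exactly the judgment call being argued about; the point is only that the fan-out pattern itself is short and natural in this model.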
It could be that the powerful elder members of these groups ensure that their professions don't evolve in a way that would overly favor young people.
For instance, suppose that software engineers were required to hold S.D.s (software doctorates: a three-year professional doctoral degree that limits enrollment overall to ensure scarcity and severely limits the enrollment of international students). To get this S.D., you are required to implement things in enterprise Java, with massive amounts of complexity, in a way that is difficult to change. Also, the scope is broad: making very minor changes to HTML code is OK for a para-software professional, but only under the supervision of a licensed member of the Software Bar. Anything involving "programming logic" counts as practicing software, and if you are not a member of the ASA and you do this, you can be imprisoned. In addition, suppose this group used its influence over Congress to place heavy software and database requirements on the business world, in short, forcing it to spend a great deal of money on licensed members of the "American Software Association".
I suspect that under these circumstances, software wouldn't evolve quickly at all.
1) Doctors and lawyers are seen as exercising judgment, while engineers are seen as building things.
2) Training opportunities (in the "learn by doing" sense of "training") for doctors and lawyers are much harder to come by than for engineers. If you're looking for a lawyer who has taken half a dozen big commercial litigations through trial, you're probably looking for someone very old because there just aren't very many of those trials that happen. Same thing if you're looking for a doctor with half a dozen experiences treating a patient with aggressive cancer. But how hard is it to find a young software engineer with half a dozen big websites under his belt?
Now, I don't think either perception is completely true, in the sense that I think people underestimate how much judgment software engineers exercise and overestimate how quickly they can build up valuable experience.
For the same reason as engineers, it comes down to keeping up with the field. I don't know for a fact that an old physician hasn't kept up with the field, but I do know a younger one can only be so behind.
You have to be clear about the relative rate of change, though: it's nothing like software, where technical knowledge from a decade ago is completely irrelevant today. Medical appointments are pretty much the same as when I was a kid.
The more important point, for me, is judgment: older doctors are far less goofy and spastic than younger ones. That means more conservative treatment, discretion, etc.
I have a young child, and I have had enough conversations with doctors and nurses to see that standards for medical care _do_ change rapidly enough to matter. Some examples:
- At the hospital where my wife gave birth, the SOP is an epidural and continuous fetal monitoring. This means that very little equipment is available which supports mobility. The hospital staff had a hard time wrapping their heads around the idea that _lying on your back is painful_ for someone in labor. The human body hasn't changed, but the medical practices are currently optimized around the assumption of an epidural and keeping to a schedule. Doctors who have let this assumption ossify will force the hands of patients who simply don't know otherwise.
- My pediatrician's practice currently recommends avoiding peanut butter until age... three? I think. However, there is recent research on some children of Jewish ancestry in the UK vs. Israel, suggesting that their very low rates of peanut butter allergy are probably due to environment rather than a genetic predisposition. If further studies confirm that finding, the AMA will probably revise its suggestions, causing all pediatricians to start offering the _opposite_ advice.
- Recent research (sorry, don't have the reference) showed higher rates of food-related allergies when children were exposed to solid foods earlier than 3 months.
[edit for list formatting]
Look at your examples: in the first, you're talking about the difference between lying down and...not lying down. And how long have people been giving birth in hospitals?
In the second example, you're discussing a subtle difference in the age at which kids are fed a specific food.
Medicine moves slowly and deliberately. Software does not.
In the U.S., we've been giving birth in a hospital under the supervision of a doctor for several generations now. When my grandparents were of childbearing age, there were some pretty horrific practices. Now they are recognized as such, but the turning point occurred during someone's lifetime. Likewise, it doesn't sound like a big deal, but the culture of "lying down" has a significant impact on how we as a culture have babies:
- The mother's position is not naturally conducive to popping a baby out. This adds difficulty to pushing.
- The mother's position puts all of the baby's weight onto her back, rather than on her legs. This causes more pain, which can lead to further medicinal intervention (e.g. epidurals or IIRC narcotics).
- Motion (walking, lunges, etc.) is helpful to get a fetus into the right position (head down, facing mother's back) and to advance its station. This benefit is lost while lying down.
- Gravity is not of assistance while lying down. It helps while squatting.
Etc. What sounds like such a little thing can end up making a world of difference. And today's ossified doctor who insists that the mother lie on her back is very similar to today's ossified developer who insists on using the Win32 C API for trivial data-entry UIs. Regardless of the rate of change of the field as a whole, ossification and inflexibility are problematic with respect to whatever is changing right now.
The exact rate of change is moot. The rate of change is "enough" over the course of 5-10 years, that it makes an impression on me.
But it's our (your doctor's) knowledge of the human body that has changed immensely every decade.
But often in programming you have managers who are ignorant, outdated and sometimes even complete idiots. Why do we put up with that? Why do we accept that our managers are not more capable than we are?
I think it's just human nature that your boss (or management in general) is out of touch and an idiot. It's why politicians are clueless -- until you get elected, and then you suddenly are viewed as clueless too.
By that I mean that any explanation of why people over 30 really are worse programmers will be interpreted as "discrimination" and therefore wrong. Any explanation of why people over 30 are better programmers will be taken as evidence that there must be a lot of discrimination, since the older programmers are not only not worse, but better.
And with every 5 years adding another level of abstraction, he can be invaluable for hard-to-debug things that need knowledge of what goes where under the hood.
That was this guy, now in his late 50s, who mentored me as a young whelp. I have seen him open a dump with a hex editor (not a disassembler) while tracking a bug, say "aha, a pointer is not initializing correctly", and fix it. I would hire him in a second given the chance.
OTOH my father, now in retirement, still occasionally works as a freelancer, because he still enjoys the job, and his skill is in demand.
I wouldn't want to take such a risk when I'm 60.
Yeah, I realized that I disliked management (the concept) when I realized that managing is, in most cases, just as bad as being managed.
My philosophy: mentoring, coordination, and very occasional police work (against internal harassment and bad behavior) are the only valuable contributions of management. The rest of it is useless and should go away. It's outdated.
If there were a way to pop back into the state one should be at, at one's age, were it not for the robberies, then forgiveness would be possible. You'd just recover and forget. But this is a game where if you get screwed, your competition is out there getting better experience so you keep getting screwed in the future, and soon enough you're 40 years old and still a nobody. Fuck that.
Also, these problems that I'm attacking are huge problems. If I could, for one example, start a process that made it socially unacceptable for a company to run closed allocation and call itself a technology company, that'd easily add $10+ trillion in value to the world economy. That's just a massive effect. Obviously, one person can't cause that kind of change, but there's no reason not to try to start the process.
So to sum it all up - I grow my leadership, management, and strategy skills.
There's no need to have hardware at home faster than work. Software that's newer or more complicated than at work, sure.
Keep up with changes in the industry, just like any engineering career. I never fail to be amazed at meeting people who have no idea what's going on in tech outside their work department. They're doomed in the long run.
I'm in my 40s, and I've been programming my entire career. I'm trying to get into management so as to extend my career into my 50s and 60s. The realities are that I likely won't be able to compete against kids 1/2 to 1/3 my age in the next 15 years, so I need to use my experience to my advantage.
It's too bad because I would rather just program.
He's a Manhattan-based lawyer who really doesn't like software developers, and is constantly warning his readers not to let their kids major in computer science. He cites outsourcing, stagnating wages, lack of long-term career development, and perceived low status as reasons why one should stay away from most things in tech, especially anything that could be conflated with "programming" or "IT" by people outside the tech world.
Then again, he doesn't offer any real alternative advice. It's usually some variation of "be rich and work at a non-profit".
Prior to that I ran the technical part of the interview and green-lighted the candidates. The problem is real, and when it happens, whatever skill set you have is irrelevant.
I wonder if there could be a boom, especially with the online options, in mid-40s programmers getting graduate degrees so as to jam the signals. That's what I'll do if I end up facing age discrimination, so long as I can solve the money problem.
Good luck with that if someone files a suit.
I feel like being a jerk to older people is just asking karma to kick you in the ass. Except for those who die young, which is a rarity at this point, everyone will get old.
Of course, age was never a formal reason. There were also other qualified applicants.
One case was borderline, as the senior candidate was also asking for a higher salary. However, in all cases I heard it mentioned, "isn't he, kind of, old?", and it was pretty clear there'd be no hire.
Ever notice how the flight attendants on U.S. airlines are mostly in their late 30s to 50s nowadays? The airlines learned a very expensive lesson there.
Once you have been in a job for a while it can be easy to stagnate because you end up spending most of your work time maintaining something that was written in ASP.Net or PHP4 or whatever and it doesn't make business sense to do a rewrite in something newer.
The alternatives are to job hop or to work somewhere that is constantly using or evaluating new tech. Or to spend your own time learning new skills, which you may have less of when you are older.
There's also the issue of deciding what to learn, since you can't realistically learn everything.
I've been burned in the past by spending time looking into various technologies that have never really gone anywhere. It's hard to make a bet like "Will I be more employable in 10 years time by learning node.js or go?"
Yep. I don't know why these types of posts get so many comments. It seems really obvious.
"The reason why California is flooded with thriving Asians is because the Golden State is a beta state, and all California cities are beta towns with a beta White majority. Smart, talented, and hard driven good looking – alpha Whites from different parts of the country leave for places in the Northeast, especially NYC. "
However, if you have a Stanford degree, you have an inside track on being a funded entrepreneur, so it may be a safer path.
At any age, if you've got something unique to offer and can contribute- you can get hired. If you've let your skills lapse, or have become inflexible then that's a different problem.
New immigrants I don't view as a problem at all, but rather an opportunity for us to increase the size of the market overall. There's work enough for all, as long as you've got something to contribute.
Maybe at one point 35 years ago simply having a degree from somewhere great was enough. I don't see that today as being a thing that makes someone immediately awesome.
Invest daily in your education. It doesn't stop when you're 22. Today, instead of doing consulting- I'm learning about Google's Polymer framework/platform.
Of course, he's been in the business since the punch card days and has a huge amount of experience to draw on; as such, he's miraculously (to me) been able to land guaranteed 60 hour per week gigs (even though he likely only works 30 hours) at $125+ per hour.
K-rist, that is a skill in itself, landing high paying telecommute gigs and putting in half the hours you're paid for, the f-er ;-)
There are more life frictions as we age that dissuade us from continually learning. If you realize this and work against it, there is no way someone significantly younger than you can realistically be a better fit for a job. They may have knowledge, but you'll have that and decades more experience and wisdom.