She emphasized that she was looking for diversity, because, as she said, "the type of people on the other side of the table from us are different than they used to be, and they don't want to be dealing with a bunch of 65-year-old white guys. So those aren't the type of people we're looking for." That's pretty much a direct quote.
I was at a small company before I went to grad school. The company began working with a guy in his sixties who'd been laid off for a while and couldn't find work, even though he had a ton of experience. We brought him in on a contract basis, and his role gradually expanded as it became clear that he had both excellent technical knowledge and great people skills. He became a mentor to me and helped me out a ton. He's objectively better at our profession than I am in multiple dimensions, but right now I could find a new job ten times more easily than he could.
I'd like to start a business in 3 or 4 years, and I'm seriously considering going out of my way to target people like him. It'll be my competitive advantage.
It doesn't matter how skilled a person is, or how much experience they have, in Silicon Valley you have a better chance of getting hired as a 20-something serial dog molester than a 50-year-old experienced tech.
And filtering out older people, while illegal, is very easy to do to 99% of applicants simply by looking at the year they got their undergrad degree.
Context matters a lot - such quotes are usually said in reference to a company's leadership team, and there's quite a bit of truth to them. When you look at the list of CEOs/Senior-Executives/Board-members at Fortune 500 companies, it's overwhelmingly middle-aged/old white men. I would consider that just as problematic as the age-discrimination that the article was referring to.
> assumed the speaker meant...
I am guessing you are not an old person being subject to ageism...
> Context matters a lot
Yes it does: the context in which the statement is made, the person who made it, and the context in which the statement is received. Asking us to consider only one side of the communication, and making excuses because in certain contexts the statement is not offensive, doesn't seem fair to me.
> such quotes are usually said in reference to a company's leadership team, and there's quite a bit of truth to them
There is truth in any stereotype; that's why they exist. The usual issue is that they are alienating and non-inclusive to some people.
> it's overwhelmingly middle-aged/old white men
And the prisons in America are filled with people who look like me; should I be okay with generalizations about me based on that? One thing that most leftists like me seem to forget is that most CEOs are white, but most whites are not CEOs...
> I would consider that just as problematic as the age-discrimination that the article was referring to.
Yes, and the first step is to adopt inclusive language, and not make excuses we wouldn't accept for any other protected class. The fight for diversity is won when we get everybody on board. When we start to have a hierarchy of protected classes, with some classes it's kind of okay to discriminate against because of X, Y, Z reasons, then it starts to look like the exact opposite of what we are trying to achieve.
And I guess the most interesting thing is the amount of mental gymnastics we have to do to find this statement okay, instead of just taking the most plausible interpretation... It simply reflects the latent ageism in our industry, and the fact that it is still okay to be ageist so long as it's directed at white men...
If someone could demonstrate convincingly that executive roles are being awarded in a purely meritocratic manner, then demographic imbalances might be excusable. But let's be honest: executive roles at mega-corps are almost always handed out on the basis of personal relationships and perceptions. I strongly suspect that in a purely meritocratic system, we would see far more diversity in the executive ranks.
It's also curious to see the extent to which many commenters are willing to accept the premise that old/white/men are being discriminated against in some roles, but not the premise that other demographics are being discriminated against in other roles. Is the evidence for the former really that much stronger than the evidence for the latter?
These things swing to extremes, but harm will come to some. There are still good candidates out there with mortgages, families, etc.
I'm self-employed now and don't expect to ever put myself into someone else's employ (that's my tip to everyone else in the 40+ camp... entrepreneurialism isn't a groovy lifestyle choice, it's the offensive measure to secure your future --- business startup and continuity risks notwithstanding), but the "we need diversity" bell is now mine to ring. That said, I used to dream of creating a startup, building a team, growing a business --- now the dream is zero employees. It's me and my (life) partner, and we won't take on employees if we can help it.
Is that problematic?
Tax system already penalizes those who are producing wealth in order to subsidize those who are consuming it.
I don't think most Libertarians want zero government, but I believe most government is best administered locally, where people can actually have influence and oversight.
Using your logic (only related to this portion of your post, because as you stated money, does in fact, beget money), one should not work hard for one's progeny to have an inheritance? Should estate law just be modified and have all assets of a dead person go to the state?
Inheritance is deeply ingrained in most cultures that I know of (so, Western first-world cultures), but it has never made sense to me. Inheritance is pure unfairness.
And people far more versed in the arcana of economics than I am have held the same opinion: see Karl Marx.
Why not? Why should a dead person continue to exert their influence in the form of cash after they're done personally contributing to society? Why does someone deserve the fruits of their parents' labor? Do we also judge children by the sins of their father? If someone's money is transferrable, why not blame or shame?
I think your sin analogy isn't an apples to apples comparison nor does it make sense in this context. However I'll play devil's advocate.
If your father was a felon / committed "sins", then most people in society are going to be skeptical about how you might turn out. It's an unfortunate truth. Not to say you will be a felon, but based on the environment you were raised in, you have a statistically higher likelihood of becoming a felon / deviant. So yes, you will be judged by the sins of your father whether you like it or not. That's why people leave their home towns to start new lives, or change their names. I'm not saying it's right, but that's the reality.
Not to defend the quote, but I wonder how else would you characterize “non-diversity” in this context? Specifically, if their current workforce composition is indeed “bunch of 60 something old white males” then how else would you express your desire to be diverse without referring to the current reality?
In fact it would actually make them significantly more diverse if they were to hire some 65yo white males.
She should not have said "65-year old", as that is blatant age discrimination. And she should not have brought in the terms "white guy", as it is racist stereotyping. It seems like that recruiter is really out of touch with what promoting diversity in the workplace is about.
However, despite this, you cannot deny that the majority of people working in the tech sector are white or asian and male. That's just a fact. I feel like you are demonizing diversification efforts in the work place just because of one experience you had.
That old white guy you worked with will not have a problem getting a job in the future, eventually, unless he has some sort of character flaw that renders him absolutely unemployable. Hell, he already has a job right now, though on a contract basis. Even getting your foot in the door is much harder if you're not white and male. You proved so yourself by saying you would have a job lined up for him and people of his demographic. You are acting like old white men are the victims here. They are victims, but only because of their age. Now you know what discrimination looks and feels like. This is why diversification is important. It's simply a protective measure to help ensure equality and fairness.
> She should not have said "65-year old", as that is blatant age discrimination. And she should not have brought in the terms "white guy", as it is racist stereotyping. It seems like that recruiter is really out of touch with what promoting diversity in the workplace is about.
I am glad we can agree on this
> That recruiter you were speaking of probably meant that they were looking to hire a diverse set of people, as opposed to primarily just 65-year old white guys (which she probably believes to be the majority demographic that you'll see in a stuffy corporate environment).
I am not sure what warrants this generous interpretation. All we have to go on is a derogatory sentence from someone not in the group (young vs. old). She is supposed to be trained to speak publicly on behalf of a big company. She is supposed to understand diversity a little more deeply than the average Joe. I am not sure what she meant, but to me the way she spoke also speaks volumes about our industry's ageism. I don't think we would have heard this about Hispanics, blacks, or women.
> However, despite this, you cannot deny that the majority of people working in the tech sector are white or asian and male. That's just a fact. I feel like you are demonizing diversification efforts in the work place just because of one experience you had.
OP is not denying anything; he did not even mention diversity efforts or anything of that sort. The article was about ageism, and his comment related a story where people who are supposed to be trained to look for diversity made "bad" comments about old people. He probably feels concerned about it because he can see himself in the old guy in a couple of years, or simply because his proximity to the guy in his story makes him feel some sympathy for him.
Straw-manning "look at this old white guy suffering" into "you are denying the struggle of all these other people" is bizarre and, from my perspective, shows a lack of empathy. Then going on to lecture about diversity is not helping.
> I feel like you are demonizing diversification efforts in the work place just because of one experience you had.
He is not; he is pointing out a deficiency in our inclusiveness.
> That old white guy you worked with will not have a problem getting a job in the future, eventually, unless he has some sort of character flaw that renders him absolutely unemployable.
Doesn't this article prove otherwise? Saying this is essentially denying ageism.
> Hell, he already has a job right now, though on contract basis.
Discrimination is more nuanced than getting or not getting a job. It's about being passed over for certain positions, it's about having to work harder for the same reward, or working the same for less reward. It's about, as in this case, someone having the capacity to be a full employee but having to settle for contract work.
> You are acting like old white men are the victims here. They are victims, but only because of their age.
I am not sure what the point of this is in a thread on ageism. There are so many things wrong with this statement...
Since we are sharing feelings: I feel like you are trivializing other people's experiences and trying to create a hierarchy of protected classes. We don't go around telling cancer patients about the pains of poverty, or vice versa. We don't go around telling white women complaining about sexual harassment about black men being killed by the police, or vice versa. We don't feel the need to compare, judge, and contrast those experiences. We just accept them as part of the general suckiness and unfairness of life, offer help when we can, and sympathy when we can't.
When we do so, we are alienating people without good reason. People will naturally feel more empathy toward certain groups, either through identification or just shared experience, and that's okay. We don't need to lecture them just because they aren't focusing on the group we want to focus on.
Ageism is a problem. If you don't feel strongly about it, that's okay, but please stop telling people who do that they should be focusing on something else...
Count me in if you can. No, not to give me a job, but to be your partner. I would guess that there are some really smart older people who are underutilized because of all the perceived drawbacks of age (often by not very smart people). But smart people are rare: they are immensely valuable whatever their age may be.
If you're being oblique to avoid libel, it's worth noting that libel law doesn't care about whether or not you name an entity. It cares about whether or not you've identified an entity.
Google has been identified.
Somehow, I don't think this one is going to get pursued ...
They are huge :)
It's possible that the report is true, but that the author doesn't want the consequences of unambiguous identification.
He could be 10 times more expensive than you. Did you consider that?
They won't accept work for less than premium wages; they don't need to.
That being said, I admit that there is also another part of the population who is struggling and desperate for work at any price.
Last but not least, there is the question of what a premium wage is. In London, for instance, a lot of folks over 30 moved to contracting, where they can charge a good daily rate without the hassle of whiteboard interviews, overtime, or politics. You'll see many companies (see the HN "Who is hiring" threads) allegedly struggling to recruit at their great wages, but the truth is that employees have moved out to greener pastures for twice the money.
I must travel in lowly circles.
Like, cashflow at 50 with 3 kids is important.
Yeah, you've saved for retirement at ~65ish, but you have 2 kids in college with one on the way up, plus a mortgage, plus just regular life stuff. That's not an unreasonable scenario at all.
Like, do you live in a first world nation? Honestly, the paradigm changes a lot in the 3rd world and your confusion would be more appropriate in Chad or something.
(In Chad, average age of first birth is 18, so you'd reasonably be a grand parent by 50, plus cost of living is very low)
If you have kids before your thirties, they will all be 25+ when you are 55. They most likely finished college, and they probably left the family home.
Edit: below, the table contents
Households age 55-64, with vs. without retirement savings:

                                     No retirement    With retirement
                                     savings          savings
  Percent of households age 55-64        41%              59%
  Median net worth                    $21,000          $337,000
  Median non-retirement                $1,000           $25,000
  Median income                       $26,000           $86,000
  Home ownership rate                    56%              87%
  Own a home that is paid off            22%              27%
  Have a defined benefit plan            32%              45%
Then an additional 50% who have a home but still a mortgage to pay. The statistics don't say when mortgages will end, so we can't do any meaningful analysis on the majority.
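The "additional 50%" figure can be sanity-checked with quick arithmetic on the table above. This is a rough sketch, not from the original data source: it uses the with-retirement-savings column and assumes paid-off owners are a subset of all homeowners.

```python
# Back-of-the-envelope check of the "additional ~50% still paying a
# mortgage" claim, using the with-retirement-savings column above.
own_home = 0.87   # home ownership rate
paid_off = 0.27   # own a home that is paid off
still_mortgaged = own_home - paid_off
print(f"{still_mortgaged:.0%} own a home but still owe on it")  # 60%
```

That works out to 60% of the with-savings group, in the same ballpark as the "additional 50%" the comment mentions.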
There are more people on the low side, but not WAY MORE, contrary to your statement.
Which is to say there are some people who don't have their house paid off yet, but will have it paid off before they retire, and then live rent free (but with other bills and taxes). There are other people who will not have anything paid off.
That is the EXTREME minority of people over 50, at least in the US
>I admit that there is also another part of the population who is struggling and desperate for work at any price.
I would not say they are "desperate for work at any price," but most people over 50 need a fair wage for work, and by fair wage I mean a wage comparable to younger people's.
My dad was in fact laid off at 50. Everybody knew he was very careful to save a lot of money, so they always said, "But you must have enough money saved away to retire by now." His response was always, "Yes, but I don't have enough money to get to retirement." In the 15 years between 50 and 65, his money is expected to double twice, which makes a big difference to his retirement income. If he had started drawing at 50, he would have had much less money to work with. He needed a job to pay the bills.
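"Double twice in 15 years" implies a specific annual return. A small sketch of the arithmetic (the 4x growth factor and 15-year horizon come from the comment above; the rate is derived, not quoted):

```python
# What constant annual return makes savings double twice
# (i.e. grow 4x) over the 15 years between age 50 and 65?
def implied_annual_return(growth_factor: float, years: float) -> float:
    """Annual return r such that (1 + r) ** years == growth_factor."""
    return growth_factor ** (1.0 / years) - 1.0

r = implied_annual_return(4.0, 15)  # "doubles twice" => 4x
print(f"about {r:.1%} per year")    # roughly 9.7%
```

Roughly 9.7% a year, which is an optimistic but not outlandish assumption for long-run stock-market returns before inflation.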
That is an assumption that may be invalid. Everybody is in their own private situation. Your assumption will be true for many but not for all. Many older folks may not need a top salary anymore and will take an average income for the pleasure of working with other fine people and doing interesting things. At some point in your life the career ladder will be less important than quality of life.
So, the only way that is even in the realm of possibility is if the kid was making $30K and the greybeard was making $300K. That's another benefit of greybeards: you get more bang for the buck simply because of the way salaries scale vs. performance that comes with good experience.
Cost is more than just salary: 5 weeks vs. 2 weeks of holidays, 401(k), health insurance, etc.
More like 50% more or at most twice as much.
This notion that experienced developers, as a whole in the industry, are making x times more than junior developers is nonsense. Utter nonsense spoken by people who have no idea of what they're talking about. It's true you can find outliers for that kind of thing, and I'm willing to bet they are specialty cases.
Early in my career I would get huge jumps in pay when I proved myself at a project. Almost fifteen years into this career and I don't get those huge jumps in pay anymore after a successful project. Not because I'm not deserving of the praise, but I'm already paid well to deliver based on my experience. A junior succeeds and gets a raise; a senior succeeds and moves on to the next project.
Could I get one of these mythical 300k jobs? I know I can. I'm willing to bet I could snag a 500k job if I pushed myself. But that would require sacrifices I'm not willing to make, give up hobbies that I enjoy, and live in a city I really don't care for. A city that would eat up the extra income anyway.
I'm more than happy with my current job that pays well and I get to mentor juniors every day, which I enjoy.
In SF/NYC, juniors can start under 100k, seniors can go over 300k. Either number could go either way, let's not argue about who got what. The point is, there is a LARGE multiplier.
I recall one friend in London, in a junior position. He left his company because of bad conditions and bad pay. One of his last assignments was to interview some possible replacements. There was one senior guy with 20+ years of experience who was great; the company made him an offer of around double my friend's own salary.
Do you know what happened? The guy declined.
Guess what we found out later? He joined another company that offered him around three times that double.
Young folks don't realize how expensive experienced folks can be.
I'm drawing the same salary as I drew twenty years ago. (Never mind adjustment for inflation. It's the same actual number of dollars.)
I have a clear enough understanding of my trade that I'm willing to buy my own tools when they save me time (keyboards, monitors, development tool licenses, VM service fees, etc).
This all works for me. My kids are out of college without debt. I've got some money stashed in 401(k). I wouldn't mind retiring, but my small-company employer can't afford to hire somebody younger and they've asked me to stick around.
Every other role I've ever seen has been priced based on experience, not age, and at the companies I've worked for, that experience premium is only about 30-50% at worst.
I have heard that companies like recent grads because they are (1) more malleable and (2) can be paid less. But neither of those reasons seem to me strong enough. I'm talking completely about the company's own interests.
Let's address the first reason: malleability. A recent grad presumably will adopt the company's culture faster, complain less, and in general pick up things sooner. Well, the hardest, meanest coworkers I've ever had were late twenties, early thirties. I've worked with people in their sixties, and they're sweet people. Even the grumpy old sysadmin had only a thin layer of spikes. After just a few days I could see through most of it, and he was 10 times more helpful than my other sysadmins. Not only was he softer (at least deep down) but he was smarter, having done it for decades. Even when he met a new problem, his keenly developed taste made him more likely to choose something that would be more maintainable long term.
Now let's address the second reason: salary. I am 10 times better than I was when I started. I know, because I still work with some of my code from back then, and I desperately want to rewrite it all. How much more does a senior developer make than a new hire? 50% more? Seems worth it to me. 100% more? 200% more? Still maybe worth it. And if some old fella can't get work at all, maybe he would settle for something between 50% and 100% more. I mean, why not at least make an offer?
It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good. You see some straight-out-of-college twenty-something in . . . any other field, you think, "I sure hope he knows what he's doing."
Well, iOS11 needs to ship on its annual schedule, non-negotiable. And the SVP will really complain if I ask for more money this quarter. So, cheap it is.
If a bridge falls down, it's really obvious and people die. If billions of computers become vulnerable to malicious actors and a few hundred or thousand people suffer dramatic personal damages, well that's a nice and quiet problem which will be quickly forgotten in the 24-hour news cycle.
This. I do development coaching and training. I’m totally burned out because my job doesn’t matter. These companies don’t care. They hire me to “increase code quality”, but that ends up being a lie. They just use me to get their teams to crank out shit faster, and I do mean “shit”. They don’t care if the systems blow up a few times a year or once a week. Banks, oil companies, airlines, all of them. They lose millions because of it too; still, it doesn’t matter. No one (in leadership) gives a fuck. “Just make this quarter’s goal” is all that matters.
Fucking Equifax, for crying out loud: NO CONSEQUENCES. If this isn’t ironic enough, do some googling on why Equifax is named such. tl;dr: This isn’t their first rodeo. They once fucked up so big they had to change their name.
Sucking at software only has consequences in a few industries, usually ones where the product is software, the product is discretionary, and the customer is fickle. Even then it usually sucks.
These same companies are audit conscious (about money), they’re crazy about regulatory compliance, and workplace harassment. Why? These things have real consequences.
The sooner some company gets fined one year’s profit or an exec goes to jail the sooner this shit will actually matter to them.
Your method is refactored to be super-fast and have no branches? Nobody cares.
Elegant, re-usable architecture? Nobody cares.
You've optimized an inner loop in assembly and increased speed by 30%? Nobody cares.
Made it to zero compiler warnings? Nobody cares.
No known crashers left in the bug tracker? Nobody cares.
Fixed that bug that's been in the code for two years? Nobody cares.
Patched a security hole that could allow a crazy bad attack? Nobody cares.
The vast majority of software shops out there care only that something--anything--ships as fast as remotely possible, and as long as it does not kill the customer, it's a success. Nobody in leadership gives a single fuck about anything else. There are a few places where this is not true, and good software people tend to collect there, but it's true for the vast majority of places where you will work. The sooner you accept this, the less likely you'll burn out.
Software should really only be measured by the value it provides. If a terribly buggy piece of poorly written code still saves hundreds of man hours a week, it's a win. (Unit tests be damned)
If that big ball of mud that's using 20 year old technology still prints a billion dollars a year - that's a win. (Microservices be damned)
If some slapdash jquery-and-duct-tape web app still solves a specific problem I have, that's a win. (React SPAs be damned)
Does crappy software flourish while beautiful software dies? Sometimes. But usually code hygiene, security, bug counts and crashes are not what makes or breaks the success of a piece of software. Developers often think their work is the most important, when in actuality it is usually not. Sometimes you really just are a cog in the machine.
This mentality is the reason why sometimes -- but not always! -- my phone won't play music in my car until I restart the _car_.
This mentality is why game development companies get away with charging full price for unfinished, bug-ridden games with crippled features and missing content.
This mentality is the reason why I couldn't see my credit card in the list of my accounts on my credit union's online banking site for several days.
This mentality is why Experian and Equifax don't report the same credit score for me.
This mentality is why I feel ashamed every time a receptionist apologizes for the wait, because they "have been having problems with the system."
Most importantly, this mentality is why I have to worry about whether someone will steal my identity because people keep writing shitty code on top of shitty code and my personal information keeps getting leaked.
When I'm so disgusted by my own profession, no wonder I ended up burned out.
We want everything fast! Nobody is willing to put in the work and get the reward after a longer period of time... No! We want results now! We don't want to wait for the food to be properly cooked, we want it now, so we go for fast food. We don't want to put in 1 year of work to learn something new so we can feel that we achieved something, we want it now, so we play a game where we can 'win' in 30 minutes. We don't want to build a relationship and eventually end up being intimate, we want it now, so we go for 'one night stands'.
Then you ask yourself... is it weird that the economic system works the same? That the software industry works the same? :/
> We don't want to wait for the food to be properly cooked, we want it now, so we go for fast food.
We go for fast food because it saves time. When I opt to order in instead of going out for lunch, or grab a burger from McDonald's en route, I do that because eating is mostly instrumental. I don't want to eat at that point, I have to, and the sooner I can fill myself up, the sooner I can get back to doing the things I actually care about.
Fast food, and food delivery services, allow people to choose to spend more time on other things, when those other things are more important to them. The same people will enjoy a finely cooked meal on a different occasion, when they prioritize it.
> We don't want to put in 1 year of work to learn something new so we can feel that we achieved something, we want it now, so we play a game where we can 'win' in 30 minutes.
Spending a year on learning something to just get a quick feeling of achievement is a very stupid way to go about it. Videogames are better for that. Learning is better for getting long-term feelings of achievement, and to actually gain knowledge/skills that you can use for something. And again, it's not something new - our generation wastes time on videogames, previous generations wasted time playing soccer, cards, darts, and doing tons of other quick-reward activities.
> We don't want to build a relationship and eventually end up being intimate, we want it now, so we go for 'one night stands'.
One-night stands have existed for as long as humans have been humans. Nothing fundamental changed; only the hookup methods evolved with population density and available communication tools.
> Then you ask yourself... is it weird that the economic system works the same? That the software industry works the same?
Nah, they don't work the same. In them, the actors are not driven internally; they're driven externally. People write shitty software, or sell shitty products, not because of their need for instant gratification, but because of market pressures. A fast hack sold by your marketing team can be the difference between getting a $100M contract and not getting it, or getting your product to market first versus a week after your competitor. The market economy, for all its efficiency, is what creates the culture of suck.
It's so obvious, but you have to realllllly sit down and internalize it... then you can be at peace with the shitty service of Comcast, etc.
In other words: Don't hate the player, hate the game.
Ha ha, maybe this is why they don't want older engineers. They're wise, cynical, bitter, skeptical and not likely to suffer bullshit management.
You just have to realize that most software buildings are tents, and that there are a few real buildings out there that you can use.
I don't know if the world is getting a shittier place. But it surely is getting less reliable.
It is, because everything is being optimized into literal minimum viable products. That's how competitive markets work - whatever aspect of your product or business you can cut out to save money and thus get ahead of your competitors, you will cut out, and then your competitors will cut it out too, in order not to get outcompeted by you. This race-to-the-bottom phenomenon is fundamental to how competition works, and the result is that products gradually lose quality.
It's the reason why your grandfather's washing machine probably works to this day, while you have to fix your new one every year and will probably replace it in five. "Building to last" is a good example of a quality that the market economy optimized out over time, across pretty much all products in all sectors.
Adjusting the cost of each product for inflation, things really are built like they used to be. Your grandfather's washing machine cost far more than a current model* (again, adjusted for inflation), and all of the poorly built old washing machines broke, so no one sees or thinks about them anymore.
* I am having trouble finding a source on washing machines specifically, however I'm assuming they followed the same trends as most commodities tracked by BLS: https://data.bls.gov/cgi-bin/surveymost?ap
As questionable as citing Reddit is, this post rather well encompasses what I am trying to get at: https://www.reddit.com/r/Showerthoughts/comments/4kn8ku/the_...
The way I read the post you're replying to: either you learn to cope with these things, or you'll burn out faster.
Since there's no accountability, if you want to do things the right way, you'll have to do it by yourself and be prepared to swim against the tide.
Most of those bugs and failures you listed are really inconsequential in their contexts.
Car won't play your phone's music? Use the radio.
Game has bugs? Play another game until it's patched, or find inventive ways to use the bug.
Can't see your CC? You can still charge on it and pay its balance.
Experian and Equifax don't report same credit score? Your creditors aren't reporting to every credit bureau.
Ashamed that someone else's system doesn't work? I can't help you there. That's some deep psychological issue.
About 10 years ago I came to understand that mediocre runs the world. All these people, who are your bosses, who are getting raises and promotions, they're the B/C students from college. They don't care about perfect. They really only care about finishing what they're assigned and going on with their outside life.
If you want perfect, do it on your own time.
Yes, I expect that for most things we could find a context that could render them inconsequential.
> Car won't play your phone's music? Use the radio. Game has bugs? Play another game until it's patched, or find inventive ways to use the bug. Can't see your CC? You can still charge on it and pay its balance. Experian and Equifax don't report same credit score? Your creditors aren't reporting to every credit bureau.
Elevator doesn't work? Use the stairs. Public restroom is filthy? Hold your breath and don't touch anything. Lost a tooth on the right side of your mouth because of an incompetent endodontist? Chew on the left side.
I'm not being facetious here. I really do agree that almost anything is tolerable if we decide to. That last example is from personal experience: I have lost three teeth on the right side of my mouth and I have to chew on the left. Most of the time, I don't even think about it anymore. And it really isn't that big of a deal -- it doesn't have a very significant impact on the quality of my life.
> Ashamed that someone else's system doesn't work? I can't help you there. That's some deep psychological issue.
It could be. Like I commented elsewhere, I would expect a doctor to be ashamed of Andrew Wakefield, but maybe most of them aren't. And even if they are, it could be a deep psychological issue that I choose to compare the state of our industry to what Wakefield did in his.
But here's the thing: I know that we can do better. Not perfect, better. But we don't have to, because we lack accountability.
Sure, perfect is the enemy of the good. But complacency is the last refuge of the mediocre.
Personally, I get angry in those situations. The larger the company, and the more safeguard crap supposedly in place to prevent those issues, the more pissed off I get. But that's just me. The more visible the evidence of poor, buggy quality in business software, the angrier I get.
Why would you feel ashamed about waiting a little longer?
That's an odd thing to be insecure about.
The problem is when the "negative value" is externalized leading to false evaluations. "Oops, so sorry we leaked all your social security numbers. Our bad."
Because it's not really the overall value the software provides that's being talked about, but the instantaneous, immediately visible payoff to a single consumer by which the software, the people, and the consumables get judged. If the software is going to cause trouble over time, if it's going to have security holes that will cost a lot over time, if it's going to commit you to garbage that's updated less and less frequently — none of this is calculated. Just as the health costs of sugary drinks don't get calculated, the social costs of poor education don't get calculated, etc.
Is it possible that the value assigned to keeping that information secret was, in the end, actually appropriate?
Ask someone who's had their identity stolen.
I’d hire you.
The vast majority of people making egregious security errors know no better. You could argue that whoever hired them shares or shoulders the blame, but to suggest they're not decent people is a stretch. Most, I'm sure, are. They're also likely infosec idiots.
Now if you know something is terribly insecure, understand what's involved in fixing it, and go out of your way not to, then yes, ethics come into play. I see that the same as an engineer (in the true sense of the word) staying silent on an issue involving automobile brakes that could lead to casualties. The consequences are not 1:1, but the ethical question is in the same category.
If your bugs can hurt people (financially, physically), I think you need to work a bit harder to make sure they don't sneak into your work, using whatever you believe helps (unit tests, formal verification, ...). But if you didn't do all you could because you're not paid for it (your boss tells you to add features, not to waste time on things that haven't broken yet), what should you do? It seems like a real issue for professional coders, as there aren't many options; not many companies will pay to prevent these kinds of things unless they've already had a big issue (and someone got fired or sued for it).
But as one HNer to another, I was really talking about your second case. To claim here that security can ever be discounted, as the person I was replying to implied, is pretty much inexcusable. Everyone here knows that security affects more than just your business, so no one here should be solely applying business logic to it.
But those bugs could easily lead to mistakes in the output.
Code quality matters and has tangible consequences in that niche, and it was awesome to work in that environment.
I haven't worked anywhere like that previously, and I think the key factors that brought code quality to the forefront were:
* Bugs cost sales
* Poor code caused friction for features a significant amount (reduced ROI)
* Small teams and an insanely tight customer relationship to those teams. The client needed to have a stakeholder in our team or as close to that as possible.
But most importantly it was just the company culture from top to bottom. The director cared, he made the project managers care, and we already cared because it's more fun to write good code.
We did take on a rushed project at one point, so there is a good comparison in that project. It shipped, but everyone was unhappy, some key people left and had to be re-hired for. It triggered a very transparent discussion roughly titled "let's figure out how to never do that again" and to that end they curated the clients more carefully, and put in some checks to see if projects are a good fit.
But I understand that is probably unique, and at previous jobs the "good enough to make money" mantra has worked out okay.
I think when your software domain is complex and rapid, it pays to have sound architecture. When it is simple, or slow to change, it doesn't matter quite as much.
Whoa now. The Toyota 'unintended acceleration' case showed us that it's fine to kill the customer. The court declared itself incapable of penalizing even the most stupendously egregious example of criminal negligence we're likely to see. (Of 90+ 'required' and 'suggested' practices in automotive-industry firmware coding standards, the code in question followed 4. The developers didn't have a bug tracker or static analysis tools. Typical of all organizations, the software engineers had no real power to demand more time or testing, etc.) They lost a civil wrongful death case and paid out $4 million, then settled with the plaintiffs out of court before the jury could award punitive damages. So killing is OK.
The issue is that, I think, software people need to stand up for themselves. They need to tell executives no, and refuse to pump out slapdash work. They need to take responsibility for the successes and failures of the company. The practical truth is that in most organizations, the software is the most valuable part of the entire business and the thing with the power to enable the company to excel or fall flat. We've been waiting for executives to recognize this since the 1980s and they still just see the software people as people who do typing.
After a 10-month search, NASA and NHTSA scientists found no electronic defect in Toyota vehicles. Driver error or pedal misapplication was found responsible for most of the incidents. The report ended stating, "Our conclusion is Toyota's problems were mechanical, not electrical."
A third piece of advice: it's not enough to fix the bug. Everyone MUST know that you fixed that weird bug customers had been complaining about for years, the one no one could figure out the cause of.
The important word here is pretend. If the system is safety critical you'd think that quality matters. It doesn't. What matters is following the process laid out in ISO 26262 or whatever standard applies to your field. Whether or not that improves quality is not important (it does, but only a little). The important part is that you have reduced/no liability for accidents if you can prove that you did everything by the book.
Edit: It's all about documenting that those things have been done and signed off on, so keeping track of paperwork.
...where he talks about his role and questions regarding the Challenger shuttle disaster; I don't know if things have changed much since then at NASA, but I suspect they haven't, given Columbia and such.
After all, that is the point of those heavy processes: to produce something fairly reliable despite the acknowledged presence of morons and shitty managers. The process does not try to eradicate those people; it works around them with some kind of redundancy.
Having been one of those 'concerned' people, I must admit it is exhausting. You know that if some work comes from this guy, or that company, it will be 90% shit, and you have to explain to them again and again what is wrong and how to fix it, or do it yourself when you are really fed up. So basically you do your job plus the slackers' job. That is how the system works. As a system, the result is not too bad. As far as individuals are concerned, however...
If you have to remind people of it, it's as good as not existing in the first place.
> What can you do for me or have done for me recently?
Services provided so far in the past that they need to be brought up are at best irrelevant, and bringing them up can actually hurt you.
I'm not saying that it is nice, only that humans — individually, and even more so collectively — behave according to this.
You might also carefully consider your company's culture when deciding how to split your efforts between new (user-visible) features, preventive maintenance, and bugs affecting end users (especially any VIP end users that have your managers' ears).
For example, a 'Hero' culture rewards last-minute, late night, death march efforts, whether to fix an urgent problem, or to add that one feature that is (supposedly) needed to land a specific high-value customer/sale.
I suggest reading the comments on this thread:
I learned this at my first software engineering job (though we didn't call it that) back when I was 18. I'm closing in on 45 years of age, I'm still at it, and I don't see that changing soon. I have thought about management, but I don't have that kind of experience, and I'm not a real people person anyhow - plus I like to code.
Another thing I have learned - at least here in the United States: Not only do they not care about quality, they usually don't have any loyalty to you as an employee. They'll let you go without any warning at all, and if you're lucky, you'll get some severance pay.
So save your money, build a savings account with FU cash in it (6 months to a year of salary - or more), and don't be afraid to drop an employer like a hot potato if things aren't working out, or a better offer is in the works.
And above all else, don't let your skills stagnate.
Do they care if a year down the road your time-to-market increases tenfold?
We know that the bad code slows us down. Why do we write it? Why do we write the code that slows us down? How much does it slow us down? It slows us down a lot. And every time we touch the modules we get slowed down again. Every time we touch bad code it slows us down. Why do we write this stuff that slows us down? And the answer to that is: well, we have to go fast. I'll let you deal with the logical inconsistency there; my message to you today is — you don't go fast by writing crap.
When nobody cares, you get to the point sooner or later where you can't ship. They've probably launched a product, but there's still a lot more to ship in the future. Eventually they become so mired in technical debt that they care.
When they realize they've invested a hundred million dollars in something that now has no value? They care. (especially if it's a turd before it gets out the door)
When a startup comes along and eats their lunch because it isn't weighed down by legacy code? They care.
When they lose revenue because they can't keep customers happy? They care.
The problem is that the professionalism of software developers doesn't allow them to learn these lessons early and hard enough.
> They care.
I disagree. They will hire a consulting company to do a root cause analysis. The Ha'vad grads from McKinsey will swoop in, determine that the past doesn't matter, that you need to get to black, and that to get to black, the St. Louis team needs to go, along with 20% of the following 7 teams...
No, they don't care. And never will.
Edit: Unless we elect people who take away legal protections in place to regulate safety critical software
It's over two decades since that was written. I remember reading it when it was first published. Does it look much like the future?
`Why? These things have real consequences.`
Businesses see software and engineers as an expense. The whole focus of a business is to provide a service that people are willing to pay more for than it costs to provide.
Everything else doesn't matter. What does matter for a business is $$$$$, and making sure the business doesn't get sued because Bill touched Jill with his #####.
For what my words are worth, I do think you need to take a step back and view it from a business person's perspective, and attempt to align your goals with theirs. Get out of your cube, go meet CEOs and CIOs, and talk to them about their problems and what their goals are. Then see if you two can work together to achieve their goals and bring value to the table.
Even writing this, I could think of 10 or 20 different business discussions you could have.
Anecdotally, this is one of the first things I try to gauge at a prospective employer. "Are their development teams seen as cost centers, or profit centers?" If the company works with technology a lot and doesn't see technology as being particularly important to its bottom line, that's usually a bad sign.
Also, discussions with C-levels tend to be pretty one-dimensional. It doesn't matter how much their infrastructure is falling apart, they'll still stonewall with something along the lines of, "our goal this quarter is new user acquisition - how does this improve those numbers?"
And they won't accept, "users cannot sign up if everything is broken" as an answer.
Like I said before, you're viewing it from the perspective of a developer, not from the perspective of a business person.
Instead of saying `Users cannot sign up if everything is broken`, you need to see it from a business perspective and communicate it to them in the language they understand.
`From our quarterly earnings report, we need 10,000 new clients signed up by the end of the month to reach the goal. We average 2,000 sign-ups per day. On the 2nd and 3rd of last month, two of the critical systems fell over, and this resulted in an estimated 4,000 failed client acquisitions. Per our quarterly earnings estimates, each new customer brings in $300 net profit for the quarter. As a result of the system crash, our quarterly earnings will be down $1.2 million.`
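The arithmetic behind a pitch like that is simple enough to sanity-check in a few lines. A minimal sketch, using only the hypothetical figures quoted above:

```python
# All figures come from the hypothetical pitch above, not real data.
avg_signups_per_day = 2_000
outage_days = 2                       # the 2nd and 3rd of last month
net_profit_per_client = 300           # dollars of net profit per quarter

lost_signups = avg_signups_per_day * outage_days          # 4,000 failed acquisitions
lost_revenue = lost_signups * net_profit_per_client

print(f"Estimated lost quarterly revenue: ${lost_revenue:,}")  # $1,200,000
```

The point isn't the spreadsheet math; it's that a dollar figure derived from numbers the business already tracks is the only form of the argument that survives the meeting.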
Your job is to communicate clearly, in the language they understand, how your work will achieve their goals. You're not the only department or manager requesting resources from the business. As I stated before, you're there to serve the business and help achieve its goals.
Let's put it this way: Netflix has one of the most robust microservice architectures I know of. I promise you will find zero mention of it anywhere on their customer-facing pages. Because that is not the problem that Netflix solves for their customer; they solve a media library and streaming problem.
The business is your customer. You need to understand their problems and how you're going to fix them. Everything else is an implementation detail.
I'm not. Our industry is full of academics who've been told "this is the right way to do it, everything else is stupid" for the entirety of their education. They're academics, trained by other academics, ignoring the broader reality of why we do these things at all.
Another way to think of it is as a user acquisition problem. You have to highlight the benefits in a way that is compelling to them. At the end of the day, all progress (and all fiascos) in a company comes down to consistent and effective communication.
You are devaluing yourself, and you are devaluing all the developers who don't want to ruin their weekend with work.
I had to break it down to one of them "Listen, we've seen that the site being faster gets us X% more conversions, if the site was down, or if the site was ugly X% of the time and cost us those conversions, you'd be breathing down my neck to fix it, why is this any different...?" reply after a few seconds of thought "You're right, but you'll never convince Leadership"
Thus fuelling the idea that this sort of thing will just be done as other work is happening with no time budgeted for it.
> I obviously didn't bring to the demonstration that I had volunteered the time, it suited my point to make it seem like it was something easy I could do as an aside
I understand the impulse, but in the long term this leads to inflated expectations from management - and the tech side has only itself to blame. When we pretend things take less time than they really do, don't be surprised when they learn to expect our lie to become the truth.
Long term here can be as short as three months.
No, there isn't.
Google is showing one series of ads per page. The study showed that having more legitimate results per page decreased their revenue. Of course it did: more results leave less room for ads and generate fewer page views.
The work you did probably is great, and I do think it's above your average developer - somewhere in the top 10%. But how you conducted yourself in the face of management was downright folly.
Yes, you did the best job in the world. You improved the performance of the servers and made managing them easier. But you need to think about what you've done in the eyes of management.
Firstly, you modified the servers without permission. You ran tests outside of work hours that could have impacted the business's bottom line. For all I know (it's up for debate), your senior managers had more pressing issues on the table, and a young code monkey running around outside office hours is the LAST thing they want to hear about.
`It's not as simple as you may think it is.`
Some of the things that come to mind - and honestly, I would have pulled you into an office to explain yourself if you were on my team:
1) Did you document all the changes you made to the server?
2) Did you back up or store any sensitive information off-site outside of office hours?
3) Did you modify critical system files that could cause another dependent system to fail?
4) What ETAs or service-level agreements, if any, did you break when bringing down the server for maintenance?
5) Did we lose any security certification from your modifying the server or changing any of the certificates on it?
We're software developers, and yes, when we see problems we want to fix them to the best of our ability. But when you're in a business, there are many things you may not be aware of. In this situation, I would honestly say you're lucky to keep your job! I hope you take this into consideration next time you want to go outside your duties.
I don't want to come down harsh on you, but there are times when you see a bad situation and there is nothing you can do about it. You notify the people higher up of the situation and the dangers. It's then up to them to make the decision, and after they've made it, that's that. If the server blows up during the weekend and they want you to come in and fix it (even when it's their fuckup), you just take your hourly rate and multiply it by 10. So if your rate is $50/hr and the server blows up, your new rate is $500 per hour to come in on the weekend and clean up the mess.
Many of the points you've raised are moot outside of large corporate environments and projects.
It is likely that the commenter has root access to all of the things you mention and touches them on a daily basis. It is likely that they are not inundated with all of the heavy process you're bringing up. It is likely that all of the security concerns you raise are not even part of their current business. Working outside of set business hours is likely not a concern; overtime is likely uncompensated anyway and there probably aren't considerable office security protocols (they probably have keys and an access code for the whole office).
The points you've raised do not apply to most small development teams.
The only "clandestine" part is that I added the split test, and did some work outside of the set of features that was scheduled.
We had no policies around SLAs, developing off-site, security, or restricting sensitive information (think potential FERPA/HIPAA violations) — yet more "non-features" that I tried to get them to adopt.
I implemented this entirely above-board, with more precautions and consideration than were taken by other developers on that team for the day-to-day.
The engineer is trying to optimize/normalize his free time by acknowledging quality issues, the management is trying to optimize their income by sweeping them under the rug.
The engineer is ultimately liable for the problem. Quality is insured with his reputation and free time.
The business guys almost always leave at 4:00pm and their bonuses are based on a metric not related to quality.
Not everything is quantifiable like that. If I'm writing code and I take the time to make it 30% faster then it won't make a difference to our bottom line. If that's done everywhere then customers will like us more, be more likely to stay with us and be more likely to recommend us, but that can't be quantified in my day to day work.
It is already possible to more or less show how a code change (bug fix, new feature, change in storage backend, etc.) directly affects your IaaS bill at the end of each month. Taking it a step further — tying in some instrumentation and formulas of the sort often used to measure the effectiveness of marketing funnels for optimization and A/B testing — shows the concomitant effect on revenue, which allows ROI to be calculated.
So, I am pretty sure that Finance-Oriented Programming is going to be a thing not so many years from now, tied into the whole toolchain, and very few - not developers nor their managers - seem prepared for that sea-change (some Operations folks are probably a bit better off in terms of mindset and processes).
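To make the idea concrete, here is a toy sketch of what such "finance-oriented" instrumentation might boil down to. Every number and name here is invented for illustration; a real system would pull these from billing APIs and funnel analytics:

```python
def change_roi(delta_infra_cost: float,
               delta_conversion_rate: float,
               monthly_visitors: int,
               revenue_per_conversion: float) -> float:
    """Net monthly dollar impact of a deploy: extra revenue minus extra infra cost.

    A toy model. Real instrumentation would feed these parameters from the
    IaaS bill and from A/B-tested funnel metrics rather than hand-entered guesses.
    """
    extra_revenue = monthly_visitors * delta_conversion_rate * revenue_per_conversion
    return extra_revenue - delta_infra_cost

# Hypothetical change: adds $500/month to the IaaS bill,
# but lifts conversion by 0.1 percentage points.
roi = change_roi(delta_infra_cost=500,
                 delta_conversion_rate=0.001,
                 monthly_visitors=1_000_000,
                 revenue_per_conversion=3.0)
print(roi)  # 2500.0 — the change pays for itself
```

Even a model this crude changes the conversation: the question stops being "is this code nicer?" and becomes "does this change net out positive?"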
It's not a matter of opinion, but what the term means. They are cost centers unless they are closing sales (or being billed for about 3x what they cost, which is still way less than sales people will generate, being closer to 10x+ cost).
Best thing any employee can do is find out how their company actually makes money and understand how they align. Unfortunately, the highest business expense is salaries and one of the most important things for most businesses is remaining runway. So there is huge pressure to keep salaries down, particularly in cost centres, which aren't directly extending runway. If it's a service business it can get worse, as the money is typically made on billable hours - creating a perverse incentive to have cheaper and less efficient people to bill out rather than a 10xer (outside a senior business facing role). As long as they are capable and not so bad that they lose the client, it's often more efficient to have cheaper mediocre people from a business point of view.
Remaining runway is a measure that only exists for companies in the red. For most businesses it doesn't even exist.
Edit: to be more clear the discussion usually goes like this:
- Me: “where are we on fixing this bug? it’s been two years and customers are still complaining”
- Engineering: “we can't touch this component, it's a mess, but we are rewriting it; this bug will no longer exist and it'll be so much more powerful, just wait a few months”
We all know how it ends. And when business pressure comes because it failed to deliver it’s all “business is evil we need more time to do great software!”
What we really need is senior and experienced engineers IMHO
Jesus Christ listen to yourself. You don’t know anything about me or how I approach things with my clients.
I care about how code affects humans, but businesses don't have to be responsible for how they hurt people with it.
Implied in this statement is that the parent has a myopic perspective on the business world. Based on my read of his comment, I don't think that's fair - and moreover would question whether or not your perspective is as unique as you apparently think it is.
In other words, I think you're patting yourself on the back a little much for figuring out how to chew before swallowing.
Good luck even ATTEMPTING to arrange a meeting with a Fortune company's CEO. "Who, from what department again?" Worse yet if it's not even your own company's CEO.
Maybe, if you are lucky, you can skip a level or two. Software engineers are usually very far removed from the top leadership.
I've friends at ReliaQuest and I know it is a good company, but I am frustrated that the leadership at Equifax thinks that security is a minor issue that they can outsource to a 3rd party. If a company deals with sensitive financial data, shouldn't security be a core competence?
But in my experience, this is not true AT ALL. More experienced people might work fewer hours, but they produce the actual end results much quicker - in both hours and calendar time.
Especially when considering that less experienced devs will (on the average) cause more bugs and worse code structure - in the long run this causes a ton of extra work. In my opinion that is the main difference that easily makes someone a "10x" or a "100x" developer.
Of course, on the other hand over-engineering at an early phase of a project is very much a thing too - starting with microservices, using the latest framework or language, etc.
Microservices however unfortunately have become a fashionable cure-all that's applied indiscriminately to software projects of all sizes, ages and purposes. Hardly anyone seems to be asking anymore why they're actually using microservices. Everyone uses them because everyone else does.
Microservices always incur overhead both in terms of engineering and network latency as well as maintenance / administration. If you have more than 1 microservice you need infrastructure for orchestrating and monitoring these services.
For all that to be worth it you need to have pretty solid arguments for using microservices in the first place. For a new project I'd usually advise against using microservices right from the start. Doing so is often rooted in the same kind of fallacy that has people worry about Facebook-scale scalability before they even have their first user.
If you start having problems that can be solved with microservices then good for you! This means your software is successful and has grown so much in terms of features and responsibilities it's starting to become intractable with standard approaches.
This is precisely why Martin Fowler argues for starting with a monolith first: https://martinfowler.com/bliki/MonolithFirst.html
A traditional monolith gives you the opportunity to learn about the problem domain first without having to worry about implementation details such as microservices.
Makes perfect sense to me.
A lot of what he's arguing against is cargo cult programming, and I think it's fair for anyone to argue against that. If your only explanation for why you chose one architecture or design pattern over the others (especially if that option introduces a lot of additional drawbacks) is "because that's just what you do", you need to stop and objectively evaluate what you're doing.
There is a link on that page to the more generic dataflow-based programming. I'd advise anybody to follow it and simply ignore its specializations until they master the generic one. I'd also say that any program should be designed from multiple points of view, so don't stop at dataflow.
That's my yardstick - if I see poor man's distributed transaction attempts happening across service boundaries, I'll start to assume over-engineering or cargo culting. If I see services acting as nodes in a pipeline or a flowchart, preferably with multiple incoming lines, it smells better to me.
If you are doing microservices, I hope you have a good reason for doing so. Because there are significant problems with using microservices.
In the last five years or so, I've done contract work at several large corporations. In every single one, they didn't care about defects or code structure. They just wanted to ship an application in a very short period of time.
When I asked about the same thing you pointed out, the response was the same, "We don't care about defects once it's released, and we don't care what the code looks like. It needs to work well for our end users, nothing else. Besides, in 6 months, we're going to redesign and rebuild it anyways."
It just seems nobody cares about long term maintenance, and likewise, they don't care much about the code or the amount of defects that code is generating. Pretty maddening when you think about it. Nobody cares about quality anymore.
1) working great and well for the end user, for the function of the software
2) bugs and code structure
Code quality can be so bad that it matters. But beyond a basic level, it does not. Bad code that works beats great code that doesn't fulfill its function, every single time.
I once had this great lesson. I was brought into a team as a TL/manager. We were making a product that was doing pretty well in terms of the attention it was getting and so on. One week in, I learn that about 70% of the code is tests and 30% is actual code used at runtime. Two weeks in, I learn that everybody's always working on the tests, never on the program (there's even a reasonably good reason for this). Three weeks in, I explore the program, and at some point I try to run it. Turns out the "main" function is broken: because of crashes (plural) during variable initialization, execution never reaches the first line of main. How long had these bugs existed (it wasn't just one), you ask? Well, over 3 months. They didn't show up in the (VERY extensive) test run.
Lessons learned in that project:
1) test driven development is a great place to start, and a VERY bad way to run projects once they're even 10% into their delivery schedule
2) even the best tests don't check everything
3) even the best tests don't guarantee that even the most basic simple parts of the program work. In fact, tests actually WORK AGAINST THIS.
4) the value of a system test (where the entire program runs, by actually going into main and doing what it's supposed to on a realistic example) is incredible. The criticism that TDD developers have is valid, that if it fails you won't know (necessarily) where it failed, but isn't that serious. Firstly, you'll usually have some idea of where it failed, and secondly, and this is the big one, you'll know it fails. FAR preferable to the other situation. They're also very hard to write. Tough. You need them, far more than you need unit tests on 5 line functions.
5) you are MUCH better off with a "hero" programmer (bad name, "loose cannon" would be more apt description) AND someone acting 90% anal about tests, and managing the inevitable conflict, than you are without the loose cannon. Managing that conflict will be a challenge though. And yes, that means that, to the utter dismay of the TDD developer, you'll have to tell him to let in code without, or with bad tests, from time to time.
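The value of point 4 is easy to demonstrate. Below is a self-contained sketch (the program text is an invented stand-in for a real entry point): a system test that launches the whole program as a subprocess and asserts on its observable behavior, which catches exactly the class of bug described above — crashes during module-level initialization that function-level unit tests never exercise.

```python
import os
import subprocess
import sys
import tempfile
import textwrap

# A stand-in program; in real life this would be your actual entry point.
PROGRAM = textwrap.dedent("""
    CONFIG = {"mode": "fast"}      # module-level init: a crash on these lines
    LIMIT = CONFIG["mode"]         # is invisible to unit tests of main()'s helpers

    def main():
        print("processed 3 records")
        return 0

    if __name__ == "__main__":
        raise SystemExit(main())
""")

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "main.py")
    with open(path, "w") as f:
        f.write(PROGRAM)
    # The system test: run the entire program and assert on what it actually does.
    result = subprocess.run([sys.executable, path],
                            capture_output=True, text=True, timeout=30)

assert result.returncode == 0, result.stderr
assert "processed" in result.stdout
print("system test passed")
```

If any line of module-level initialization raised, `result.returncode` would be nonzero and the test would fail immediately — unlike a suite of unit tests that import and call individual functions directly.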
I've made the mistake at one company of thinking that my job was to architect solid software that was sustainable for the long term.
What they really wanted was to have software that was "good enough" and had all of the checkbox features to be attractive to potential acquirers so the investors would stop breathing down their necks.
Jack Dorsey became a billionaire despite those losses. The same is true for probably all other early investors. For an investment in Twitter to have done really badly, you'd have had to buy during a "bear" period of about two years.
And before you say it, remember that even Google stock dropped 60%, Facebook's 70%. Twitter's drop actually looks pretty reasonable in comparison.
So firstly, I don't think Twitter is bad; secondly, I don't think people have given up on Twitter; and thirdly, that seems pretty justified.
Wouldn't you?
As an employee who has a job because the company exists? Yes.
As a retail investor - I wouldn't be quite as happy.
Contrast that to the small staff that Instagram had before getting acquired. It's still run on a relatively small staff now.
It's about producing working code, not about how many hours my butt is in a seat...
My personal belief is that our industry tends towards less experienced workers because they put up with shit managers more and can be used to cover for their issues. It's anecdotal only, but I have never heard management decide that their deadlines were improbable, if not impossible, when bugs get through. It's always just explained away with platitudes about code quality that suddenly don't matter when that requires more money or more time.
I've worked at two places where my tech lead at least recognized that if scope or requirements changed, then deadlines changed. Now I am at a place where agile is just a word thrown at any problem, and we have things like being asked to integrate third-party tools that we don't receive until a week before the deadline. My team stayed till 2 am and was back in at 9 am the next day, and _literally no one_ saw a problem, because it's "agile" and that's what you do when you're agile.
e: Actually I did Kanban too and that was OK, but that is a lot closer to the "no methodology" thing and anyway doesn't involve you making decisions like "oh, can't start working on that, because the sprint is almost over."
However, it is important to realize that you are not competing with the 20-somethings or the ones that are fresh out of college. You are competing with the ones that have already a good amount of experience (let's say 7-10 years) but are still in the beginning of their careers.
If they are minimally talented, they will probably produce code just as good as yours, yet they will probably (a) work for less money and (b) still be single and (c) more dedicated to work and (d) more tolerant of bullshit asks from the employer. Those are the ones that eventually fill positions as "Sr. Engineer" and crowd out the older ones.
The alternative is P&L, and boy, grey beards don't want P&L-based measurements.
P&L measurement would destroy the majority of the old timers, unless they work for the same little money that fresh-out-of-college people do.
The thought is that a team of greyhairs never ships anything, and a team of youngsters ships garbage. But it's better to ship garbage than nothing, hence the bias.
My experience is that programmers who become "architects" and are only responsible for design and code review tend to go downhill in their skills. It SEEMS like an efficient way to use experienced people, but it is actually an anti-pattern. Their tendency is to develop ideas that sound good but don't work well in practice, and there is no direct way to correct the mistake. (Any time it doesn't work, the tendency is to blame the implementer. And there is usually enough to blame that their own contribution to the problem gets missed. You're less likely to miss the problem when YOU are trying to make the implementation work.)
In fact the problem is sufficiently bad that in interviews it is important to have people actually write code to show that they still can. When you get an "architect" who takes offense at the exercise, that's a non-hire. They might have been good 10 years ago, but they aren't worth hiring now.
This was something I'd sort of noticed, but didn't become conscious of until I worked at Google. There they were very conscious of the phenomenon. Every programmer from the most junior to the most senior (for the record, that would be Jeff Dean) writes code. If you're not willing to write code, you're not a hire.
That said, the exercise goes both ways. If I interview with an employer and I discover that they design things up front as UML diagrams, odds are that this won't be a workplace that I want much to do with. If I'm working in a job and they force me into an abstract architecture role like you describe, I'm going to quit and find a better job.
It's been surprising to me how much I've had to fight people on this issue in the past.
The thing is, software architecture is kind of the same job as coding. So what the "architects" you describe really do is equivalent to writing code on a piece of paper. That is, doing one of the most mentally demanding jobs imaginable, but without the tooling to protect them from their own confusion. No surprise, then, that it later turns out the architecture doesn't make sense. Without a tool like a compiler to call you on your bullshit, it's too easy to start engaging in fuzzy thinking, and the longer you go without that kind of practical verification, the fuzzier your thoughts become.
1. All of you are overhead.
2. Customer is the profit.
3. Now how do I get (2) to be higher than (1) before we run out of money?
</three letter hat off>
Lots of companies are smart enough to figure out that it is cheaper to hire competent people than to accept the boneheaded mistakes that the cheapest warm body would make.
Besides, the ones that don't figure it out are no fun to work at. Who wants to be on the side that's bound to lose in the long run?
I just wish that they did less collateral damage on their way down.
This applies to everyone everywhere, and particularly positions that anyone with basic language and reasoning skills can fill (PMs, MBA types, etc)
No disagreement there. Just in my experience those jobs are either the first or second to go when things get tough - often because there's an MBA somewhere near the top who recognizes how replaceable all the rest are.
Aren't you being overly restrictive in scope here? Many executives are MBAs, but most MBAs are not executives.
Anecdotally, I know several full time top 10 grads who aren't exactly on the fast track to the c-suite, and I assume this is even more true for the broader pool of MBAs.
Us older folks won't build things we know don't work against synthetic deadlines, then work overtime to fix what we knew wasn't going to work in the first place, all while taking the blame politically.
Engineering has in my lifetime become "unwinnable." I'm either not "a team player" or "being negative" for planning for reasonable failure scenarios. Then politically I still take the heat when they happen.
This is not a quirk of engineering; it is a design feature at bad employers. "Heads I win, tails you lose."
Also, that the product is 10% better/more stable/whatever frequently does not translate into even a 10% increase in sales. So it's not worth it (from the business' perspective) to invest in that quality. Pick some other arbitrary point (e.g. 20% better/20% boost in sales, etc.), right down to some threshold whereby the company simply doesn't even have a product that can be demonstrated.
Here is the core of the problem, and it's more of an incompatibility of goals than errors of one of the parties. Businesses want to make money. They usually don't give a flying fuck about what their product is or does beyond the point it gets sold. Does it waste users' time and piss them off? They paid us, which means they value it, so everything is ok.
Engineers, on the other hand, tend to care about what value the product actually provides to their users. So they would rather invest effort in making the product better for the users, instead of making it better for sales team to sell.
I don't really see a way out of this conflict. The engineers are right, but the businesses are right too: it's the business that pays and suffers (some of) the consequences, so it's the business that gets to tell engineers what to do, and not the other way around. If the CEO is making a stupid decision, that's on the CEO.
The way I see it, actually useful software gets done outside of or on the side of a business, not within it.
I'm not optimistic that this will happen. Anecdotally, I expound to acquaintances the risks of unprotected PII or questionably-secured home security apps, but convenience seems to outweigh such concerns.
Regulation is one approach to solve asymmetric information transactions. I don't see how that could be applied in general to software quality, but it could target the cases that have severe or widespread effects.
Keeping up with new technologies is essentially a proxy question for, "How much free time do you have that I might exploit later?"
It's quite common for things to generate or save money. Sadly, most engineers won't bother counting the results related to their work.
On the other hand, there are tons of grey hairs who never shipped anything and are as useless as new grads are.
Yup, this is the software industry boiled down to the basics.
but those aren't really the two alternatives, are they--ie, crap code or nothing?
it might be in the very short term (ie, by this Friday)
but over any other span of time, the choice is more like:
ship crap code in 30 days, followed by 50% of the team's resources spent bug fixing (which can often be cleverly disguised as new features) for the next 90 days
ship high quality code in 45 days
That customer needs this code on the 1st, that is a hard and fast deadline! So shit is churned out, and handed over on the 1st. Then, three or four 1sts later, the customer finally gets their shit together and deploys it, and, voila, it is shit. And the cycle continues, as the scramble ensues to patch the shit by the 15th with yet more shit. And you end up like the little Dutch boy at the dike, except instead of fingers you're using hotfixes made of excrement.
This does not work. Your code monkeys will get demotivated fast and won't be able to learn anyway. The one architect guy will become increasingly out of touch, and his code review will become pointless red tape fast.
> The thought is that a team of greyhairs never ships anything, and a team of youngsters ships garbage. But it's better to ship garbage than nothing, hence the bias.
Why would greyhairs never ship anything? I don't get it. My experience was that young people need more supervision to not get demotivated and to actually finish things. (on average)
I don't know what that really means though. People who are smart, hard working, and get the right jobs can get a senior title in 5-6 years at the big companies.
The shipped project is one branch of an extremely large tree of possibilities which a younger, less experienced programmer would have taken a very long time to explore and discard (which I know for certain because I was that programmer).
I'm consistently amazed at how much of a pain in the ass things tend to get when people try to build solutions in their early 20's, vs 30's and now in my 40's. Not to mention how much my viewpoint has changed in the past 20+ years in software.
I'm pretty happy when I can remove a bunch of dead/unused code trees, commented out swaths of crap, and refactor portions of a codebase into 1/5 the size.
There are plenty of shitty older devs. And if they're old enough, they're a protected class, which will make things very awkward if you hit a shitty one. And with the newer generations, there are plenty of 18-year-olds that will run circles around more experienced devs, never mind mid-20s ones. There are of course plenty of crappy ones too. So all things being equal, but with a different salary, which one do you pick if you're a naive hiring manager?
Fortunately I now work for a company where this is a non-issue. While we definitely hire a ton of new grads, we'll never say no to older/more senior candidates (and sure could use more). Which is good, because I'm getting dangerously close to my 40s.
I propose that working code is necessary, but not sufficient.
I'm only being half snarky. Large chunks of these threads end up as people bashing young people for being idiots for various reasons.
I do believe that discrimination is an issue, but you also just don't need that many seniors. If you have a couple to make sure designs are reasonable, catch mistakes, and mentor the junior people then that seems pretty sufficient.
If you have 5 seniors and a junior you get stuff done, but the junior person quits because they don't see any point in sticking around on a team where they aren't being challenged because someone more senior always gets to do it. Someone else will give them the challenge they want. I'm moving on right now exactly because of this.
I'm primarily calling my designs shoddy when I had only a few years of experience.
If the timing of that bad hacky solution being shipped is critical, then it's actually the perfect solution and the right work was done quickly, which is great. If it's not, and is rather the first piece of a new feature intended to last for a while, then the work on that feature as a whole may be slow as a result, so then it looks more like bad and slow work.
I think with experience comes the ability to choose the appropriate approach for the problem at hand, whereas inexperience will usually lead to the short term quickest easiest route being chosen every time by default.
Another subtlety is that, typically, more experienced people will actually deliver the 'good' solution faster, which results in a double win if that is indeed the correct course of action.
If I want to end up with a great end result, I find this is the way I have to work.
I've spent far too much time around organisations where "more time with bums on seats" = "direct correlation to billing more", which = "better work" if you ask the right people. That leads to an unfortunately different scenario.
That was a fun bug to track down.
You're the second person. When was the last time you encountered anyone using KLOCs (thousands of lines of code) as a measurement of work done? Are they working on mainframe codebases?
The DOD and their "experts" love measuring software projects this way. They even take the cost/LOC ratio as a measure of value. I was once semi-seriously chastised for committing a change with net negative LOC because it broke the formula and implied negative value.
2) You can't be productive having 0 LOCs. So 0 LOCs = 0 productivity. n > 0 LOCs = (presumably) some productivity. I would say we have a trend.
2) I know what you mean. I do it often myself and I know it improves code quality. It's not my intention to measure "work done" in LOCs, which, as you say, could be contradictory. I propose to measure the size of a project in LOCs. Suppose, you have two projects. And you have all the time you want to reduce their LOCs (to improve code). After you are done, both projects will still have some LOCs. My point is that the project, which has more LOCs, after you had your fun with it, is the "bigger"/more complicated project.
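To make the "size of a project in LOCs" idea concrete, here is a crude sketch of what such a count amounts to (the file extensions and comment convention are assumptions; a real tool like cloc handles far more cases). It also shows why the metric is so blunt: a dense one-liner and a trivial getter count identically.

```python
from pathlib import Path

def count_loc(root, exts=(".py",)):
    """Crude LOC count: non-blank, non-comment-only lines under `root`.

    Counts every surviving line the same, regardless of how much thought
    (or deletion work) went into it, which is exactly why LOC measures a
    project's bulk rather than the work or value behind it.
    """
    total = 0
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in exts:
            for line in path.read_text(errors="replace").splitlines():
                stripped = line.strip()
                if stripped and not stripped.startswith("#"):
                    total += 1
    return total
```

Under the proposal above, you'd run this only after both projects had been minimized, and treat the larger remaining count as the "bigger" project.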
The added value a software engineer produces with his work in a single unit of time has a huge variance. Furthermore, it can range from negative to positive.
I.e., it does not matter how long one works. The only thing long work hours of a software engineer communicate is that they work long hours.
Crudely put, for project X you have the following three constraints: cheap, on time, and good quality. You get to pick two of the three. It would seem to me most companies are opting for cheap, on-time delivery, and thus the quality is simply absent.
My bet is that it's about easy and hard to measure costs, and how our modern big government world insulates large companies from competition.
This is also why grey beards look bad on a P&L, so they'd better be willing to work long hours.
Age shouldn't be used as a catch all filter, but I can see how it might be faster to screen those who have been too content to stick to their tools without catching up to industry trends.
You might want to check your own bias in favor of functional programming. To the more seasoned programmers this is just the latest fad. That's not to say there isn't value in it, but if you're making hiring decisions around it you've probably got your head in the clouds.
I'm not sure if I'd want to hire someone who is unable to point out valid criticisms of the technology stack I'm using.
People need to be aware of the limitations of their tools. I spent years in pure C, both complaining about it and singing its praises. Right now I'm in JS land, and I can do the same thing.
I started off in C#. No real complaints there, except for the lack of stackalloc, and that got added in the most recent version. :-D
I've only interviewed a few candidates for developer roles, a couple of times, and this was one of the questions I asked. I managed to get some interesting, constructive responses as well as some well-crafted bullshit.
I learned this after I told the interviewer that I thought that PHP was a crap language, when interviewing for a PHP developer position.
Working in a PHP shop, nobody would care if you shit on PHP during an interview. We know the shortcomings; it just doesn't really matter that much.
I happen to like the one language to rule them all (JS).
They're just biases that don't have a good justification. I usually keep this sort of bias to myself because I recognize it for what it is. Honestly, though, I wouldn't apply for such positions either, so the bias problem never surfaces.
how is this type of attitude not sufficient reason to not bother hiring someone? I immediately don't want to work with or for you.
I know this. And I wouldn't want to work with JS devs. At all. It's a win-win for all of us that I don't apply to those positions.
It’s a sign of a bad attitude.
One can be blunt without displaying a trashy attitude, and it doesn't take that much effort. Most people tell me that I am the most blunt person they know, in that I will give some of the most honest observations whether they are uncomfortable or not. I also have a reputation of being one of the nicest people many people know, never needing to put down one group of people to appease another, which is probably one of the biggest engineering productivity killers one can create (negative politics, cliques, etc.). Engineers often vent to me (coworkers, teammates, people from various programming communities, etc.), and I lend a sympathetic ear to all. That in turn builds trust and camaraderie, and lets me use my knowledge & experiences to guide responses & resolutions toward a healthy outcome for everyone in situations generating stress.
It shouldn't take much effort to treat each person with respect & humanity. If it does, one should look inwards at maybe something being wrong with oneself rather than with other people, and work on culling the toxic attitude.
I'm more likely to ride on the bleeding edge (Node 8, Webpack 3, Babel 7) than most. I am afraid though, in under a decade it will be an uphill battle to find work (I'm 43 at the end of the month).
Maybe the trend is web assembly with Rust now or something. Or Ethereum contracts and R are cool. I dunno but trends do matter.
It's very simple - it is harder for younger managers to work with older subordinates.
Age-based hierarchies are deeply ingrained into virtually all cultures: "Elders are wise, respect your elders, etc." This makes arguing with (or reprimanding) someone who's older than you a doubly uncomfortable task for many people. The reverse is also true, that is, being bossed around by people significantly younger than yourself.
When everyone does a splendid job, the age difference is not a problem. But if someone is not pulling their weight, that's when it starts to complicate the situation for everyone involved.
A good manager would figure out how to make it work. I've worked with younger managers, and I've even had to teach them how to deal with problem workers.
I've also had people all the way up to the director level that I've had to manage, as a developer, because they were so terrible. I even had one ask me once, "Why are you managing me?" and had to reply, "Why are you making it necessary?"
I don't think that we've got the whole management thing down. I think that there are probably the same % of bad managers as there are bad devs. I'd also suggest that there are 10x managers, probably ones that read Peopleware and took it to heart.
It is easy to manage most people, all it takes is a bit of diplomacy.
This I agree with strongly, and it's yet another reason why younger people, especially those who just learned to code, could be more liked by companies - they still get excited! Whatever turd you'll present them with, they'll think you're giving them truffles. Experienced people know that most dev jobs are just boring rehash of the same bullshit everyone else does.
More experience also means ability to see through management's bullshit.
> It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good
The difference here is employee vs professional
To be clear, there is a huge win here. And that's hiring good technical leaders and listening and consulting with them on technical strategy, standards, technical employee evaluations, etc. It's actually lower risk and higher reward to do so, but it is more work than telling new hires how to crank out strung-together features on your rule-the-world custom framework.
It still works this way in tech. When you need someone who knows what the hell they're doing, you're going to consider hiring a consultant, and you're likely going to get one of those same grayhairs.
I've found older developers fall into two camps. One is old, set in their ways, and didn't keep up with technology, and it's harder to teach them modern best practices. The other type are the ones who have learned a lot from their experience and are aggressive learners. I don't think older developers are ever "malleable"; we are always stuck in our ways. Our ways just may be "this is the way I've always done it" or "I love technology and learn for the sake of learning" or "I learn aggressively to stay competitive".
As far as being 10x better, the other side of the coin is that it doesn't matter if you're 10x better if your skill set is more than the company needs.
If all your company needs is another CRUD app and you've gained expertise in AI, they aren't going to pay you more based on you being better.
First, software engineering and programming are fields with way more skill diversity, which is a nice way of saying that there are a lot of relatively incompetent people in the industry. The assumption that people with experience are more skilled isn't generally true. I've seen people hired for the role of CTO in small startups who were explicitly chosen for their age because it somehow meant that they got more experience, except that they weren't as good as the graduate.
Second, software isn't as critical as other fields because users don't care (or are trained not to care). Your iPhone calculator not working isn't the same as your room's window not closing. Most companies don't really search talent. They just want to hire cheap "talented enough".
In the specific case of Facebook, Amazon, etc., the hiring process is focused on algorithmic questions and the like, which a recent grad is simply more likely to get, since it's still fresh.
My point is that it's likely the CTO is not the most technically adept person in the company, and that's to be expected.
In those other fields the cost of failure is much higher. It is easier to correct shitty software, so the focus is on delivering fast.
I really hope this isn't the case for you, and in general I don't really believe it to be true either. Most of what makes me successful as a developer today I knew 20 years ago, in a prior career.
You have to differentiate between tacit knowledge and explicit knowledge. The former changes very little and the latter changes a lot. You're also best off retaining almost _none_ of the latter. Those "Teach Yourself <language/framework> in <number> <days/weeks>" books you used to see were written for serious professionals, not newbies.
I can often get more done with less using tried, tested tools that change little than my peers can do with 'the new hotness'. It seems every language/stack has its own implementation of cron. Stop that. Use cron.
Networking protocol, binutils and SQL don't really change and that's literally 99% of the "make stuff happen" part of my job. A lot of "Full Stack" developers don't know any of this shit. They don't know what tools will tell them if a client's traffic is coming into their network at all or being rejected at their server.
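The "use cron" point above can be made concrete. Instead of pulling in a stack-specific scheduler, a single crontab entry does the job (the script and log paths below are hypothetical examples):

```
# min hour dom mon dow  command
*/5  *    *   *   *    /usr/local/bin/sync-reports.sh >> /var/log/sync-reports.log 2>&1
```

Installed with `crontab -e`; the shell redirection gives you a log of every run without any extra tooling.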
Con artists, honestly.
In the end, keeping things as simple as possible is usually best. If that means using language features that someone might not be familiar with, it should still be more understandable in the end.
I will agree that frontend is full of churn. I have to do it, but I wouldn't specialize in front end -- most of your role is relying on explicit knowledge rather than tacit knowledge.
Doesn't matter if it is a new language, framework or whatnot. There have been very few advances in our field, stuff changes incrementally. So you have seen everything before, it just has a different twist. You can pick this crap up in a fraction of time, as the concepts are already internalized.
... Unless, that is, if you used to call yourself "X developer", for whatever value of X. Then you are unidimensional, and screwed.
I've been around for a while. Some knowledge has become useless, but I learned how to sniff out technology that's going nowhere and avoid it. Since then, I've rarely wasted my learning on things that become obsolete.
The mistake is this: the interests of the company and the interests of its management are not always the same thing.
Sure, the company would be better off in the long term with experienced workers producing quality working sustainable 40-hour weeks. But in the short term, Jo(e) Manager can earn his or her bonus by burning some cheap people out with 80-hour weeks, then just getting fresh ones in.
I’ve yet to peer behind the curtains of corporate axings, though. So I don’t know when the lists of employees to fire are drawn up, whether or not the fully-burdened costs (especially medical insurance) are somehow accessible by line managers. The per-employee breakdown of medical costs is definitely available to companies.
Under individual 1099 wage treatment in the US, the healthcare cost shift is generally not in favor of the employees in older cohorts. If you are claiming to use actual individual freelance-style contributors engaging with clients as 1099, then you should look at the pricing and coverage constraints of individual healthcare plans in the US for older employees.
1099 corp-to-corp is done all the time of course, but that simply shuffles around the pieces on the board. The cost only appears to get magicked away in most 1099-oriented arrangements because the vast majority of 1099 individuals I've worked alongside and gotten to know have compromised pretty extensively on their benefit plans, compared to most conventional big group-based benefit plans. In the US, volume discounts don't cover just price, but all sorts of other transaction factors that simply do not show up in economic statistics.
There are some individuals out there who are comfortable doing 1099 work, while meeting or exceeding benefits plans of big groups, and I've done it myself in the past. It takes a lot of discernment, judgement, and work to navigate the thicket of options and constantly-changing regulatory landscape from many different jurisdictions. It's easy to wave the magic wand of the free market and say make everyone a free agent, but in reality it's not for everyone, and I would not promote it as a general policy for the general population. Ronald Coase's work on the economic effects of transaction costs at individual and firm level illuminates some rationales behind why encouraging individual 1099 transactions would not produce optimal results.
Do you know how many companies will string along a part time worker, promising full health care after X months, only to fire them if they start demanding benefits? I've seen dozens of cases.
Or if not then I'm not sure what your point is. People who miss a lot of work and make excuses for being belligerent don't make the greatest employees, but there's not as much correlation as you think between that and age.
I feel like you really have given a clear answer to combatentropy's question "why do employers discriminate against older people", but it's maybe not the answer you thought you were giving.
And now I am on a diet and quit smoking :-)
1. Work fewer hours, because of age and family
2. Pay is higher, because of experience
What this implies is that their wage-per-hour metric is higher than younglings'. So doesn't the veteran coder get stuff done more swiftly? IMO, yes; some of them are extremely valuable and irreplaceable. But those are RARE, and they don't need Facebook Ads to find another job; they cultivate connections, or head hunters find them.
So the rest of the old folks are the ones that suffer most from ageism. Their career growth got stuck when they were around their mid-30s; as the scope of their work doesn't expand, their experience also stagnates, and they end up doing the same stuff for many, many years. At some point age is going to catch up with them, when their years of experience become a liability rather than an asset.
The sad part of this is that it is not necessarily their fault. They get caught up in their own comfort zone, not because they like it, but often due to circumstances. And software is a field that moves so fast that talking about the relevance of skills on a 5-year horizon doesn't make much sense.
Solution? A better social safety net could help, but it won't solve the problem, unless you go to the Japan model, at the expense of suffocating the younger generation, which is both costly and not feasible in western societies.
This is a great point. Older devs are generally "better" than young ones on every metric (including per hour costs). BUT you won't find these. The people that will apply for jobs when old, are not representative of older devs, they are representative of older devs without strong networks.
> And software is a field moves so fast, talking about relevance of skills in 5 years term doesn't make too much sense.
I'm not sure this is necessarily true of all of software. We tend to forget (especially on this site, with its extreme "startup" focus) that most of software dev is still people working with older (5, 10 or 15 years old) tech in a megacorp or public authority somewhere. You don't "see" these people because they don't post to stackoverflow, they don't blog about tech and they don't show up on github. What they do doesn't show up in "language trends" that just look at stackoverflow or github. Because it's just a job, it's not breaking new ground. I work in a 30-man team of 40-year-olds, where no one really fits the "startup" model.
Of course many of these people (I'm one) will have problems finding jobs if they weren't actually "good" devs. If you grow comfortable and don't have a network, you'll have trouble finding a new job after 50 if your tech has gone out of style. But Java, C# etc are going to be in demand for a LONG time yet. Most of software tech isn't about the latest js framework.
Also, as long as you stay in the same industry, the main asset of a developer is the domain knowledge from the problem domain, not software development.
I think that is generally false today.
"Millennials" demand more of a work life balance than GenX or Baby Boomers, they also demand greater Maternity and Paternity leave when they have children
I think most 40-50-year-old Gen Xers will work MORE hours than a Millennial with a young child...
If you are a large company and have robust processes in place, you don't need smart people with tons of experience. Such people are expensive.
You need average people who don't need to think too much for themselves and can follow directions and work hard.
The process is designed to compensate for mistakes and guide inexperienced workers.
Thinking in computer terms, the analogy goes like this. A long time ago, servers had to be very powerful and very reliable. Then came cloud computing and redundant architectures. Now, you plug in relatively cheap components into an architecture that can tolerate failure. You no longer buy high-availability components as the system can function around issues seamlessly.
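That analogy can be made concrete with a minimal failover sketch (hypothetical names and failure rates, purely for illustration): individual components are cheap and unreliable, but the surrounding architecture routes around failures so the system as a whole keeps functioning.

```python
import random

def call_with_failover(replicas, request):
    """Try each replica in turn; the system tolerates individual failures."""
    errors = []
    for replica in replicas:
        try:
            return replica(request)
        except Exception as exc:  # a cheap, unreliable component failed
            errors.append(exc)
    raise RuntimeError(f"all {len(replicas)} replicas failed: {errors}")

def make_flaky(name, fail_rate):
    """Build a replica that fails with the given probability."""
    def replica(request):
        if random.random() < fail_rate:
            raise ConnectionError(f"{name} is down")
        return f"{name} handled {request}"
    return replica

# A dead primary plus a healthy backup: the caller never notices the failure.
replicas = [make_flaky("primary", 1.0), make_flaky("backup", 0.0)]
print(call_with_failover(replicas, "GET /feed"))  # backup handled GET /feed
```

The same shape applies to the people analogy in the parent: no single component has to be highly available, because the process around them compensates.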
That was the fallacy put forth in the CMM (Capability Maturity Model) right on page 1 or 2. The notion that process could eliminate the need to rely on experts. There are a couple problems with that. Number one is that there is no process that can solve all problems or write all code. Number two is that those processes are probably not lower cost - they rely on a larger group of people. All you really get from such things is better consistency, and that is often what large companies are looking for. I would argue that having some experts in such places is extremely beneficial at times. I've benefited from them and I've been one in my niche. Without pockets of expertise, a vast group of mediocre people will drift at times.
I completely agree with each thing you said. I should have been more clear and said "a large company does not need to fill their entire workforce with smart people with tons of experience. Just a few is enough once a good process is in place."
A small company needs very smart people when starting up. They don't have any processes in place and each problem is new.
As companies grow, these experts develop and polish processes. As time goes on, more and more can be done with less capable people.
Not all experts can be eliminated as a company grows. There will always be problems that don't fit the process. An expert is needed for those. A process also needs to adapt with the times. Experts are needed for that.
However, for any class of problem, the Pareto principle is in play. An army of mediocre people with an excellent process can handle 80%* of the issues. The remainder will need to be handled by experts.
If you have a small company you may need 10 experts to handle X load. You don't need 1000 experts to handle 100x load. You need 20 experts, a good process, and 980 hard working people.
* - numbers depend on actual cases. These values are for illustrative purposes only.
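The scaling argument above can be sketched numerically. This is illustrative only (as the footnote says, the numbers are made up): total headcount grows linearly with load, but with a good process in place the expert count grows far more slowly; the logarithmic curve here is simply chosen to match the 10-experts-at-1x, 20-experts-at-100x figures.

```python
import math

def staff_plan(scale, base_experts=10):
    """Return (experts, process-following workers) for a given load multiple.

    Total headcount scales linearly with load; experts scale with the
    log of the load, since the process absorbs most routine problems.
    """
    total = base_experts * scale
    experts = base_experts + math.ceil(5 * math.log10(max(scale, 1)))
    return experts, total - experts

print(staff_plan(1))    # (10, 0): a small shop is all experts
print(staff_plan(100))  # (20, 980): 100x the load, only 2x the experts
```

The exact curve doesn't matter; the point is the sublinear growth of the expert headcount once a process exists.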
Yea, I was going to say - do customers expect flaky products, and companies have structured themselves accordingly? Or have companies produced flaky products, thereby getting users accustomed to paying for flaky products with promises of "we'll fix it in the next version," thus allowing companies to produce ever flakier products? Chicken and egg question I guess.
Experienced people of any age who can adapt well to change have no trouble getting a job.
Instead you can say "that reminds me of a similar concept we used to implement using ABC", and quietly appreciate that you got a leg up in understanding because of your experience. If young people are smart and curious, they'll want to know more about ABC, and ask.
Well, the rate of change seems to be high. Whether it's "progress" is another matter. Progress usually implies change, but the reverse is not at all necessarily true.
Who's usually in a position to recognize that and be a judge of it? People with experience.
So the question I'd ask is this: when someone with experience doesn't like change, is it because they've "crystalized"? Or is it because they have some defensible reason for disliking the change?
If it might be both, how would someone be able to tell the difference?
Say Person B is proposing a change from Solution A -> Solution B.
Person A thinks Solution A is still the correct approach.
I think it's important to understand in detail what aspects of Solution B are appealing to Person B. It may be subtle, psychological, and/or political, but there's a reason why this change is important to person B. Person A & Solution A must address those issues, or risk eventual obsolescence.
It's also entirely possible that the more correct approach is Solution C, and this process can help establish what exactly that is.
Even if you didn't have family obligations I think when people get older they are less impressed with beer in the kitchen and ping pong tables and would rather not be in the office after 7:00PM.
They keep saying all the wrong things, like "no, I won't go on call, that wasn't in the job description I applied for", "sorry, but I won't take on the extra responsibility unless you give me a raise", or "I've been down this road before, I'm not going to work weekends because it's not going to significantly speed up the project and I'd rather be with the wife and kids".
But the same taste, and knowledge born of experience that there is little as long-lived as a temporary solution, might make him put his foot down and try to prevent management enforcing a short-cut that will generate technical debt. This can make older, more experienced, people seem more difficult to work with.
They may also be more likely to have family commitments or have developed hobbies that require commitment (when I was marathon training last year, sticking to my plan meant arranging extra work around that rather than the other way around) so are less inclined to do overtime because bad planning elsewhere has caused a "crunch". This isn't a case of expecting more money for doing it, it is simply not wanting to do it no matter the incentive because we don't want to compromise on the things that make life feel worthwhile (younger people are easier to convince that they'll have time for that later!).
None of this is fair to judge on, of course. It does not impact their ability to do the job well in the contracted time unless things are badly managed around them. But they are still considerations some people make when hiring.
If all I care about is degree + ~7 years experience in relevant technologies + generally solid head on the person's shoulders, and I have one person with those, asking X, and another person with those, as well as another 13 years unrelated experience asking tens of thousands more, why would I pick the latter? If I don't think the benefit of that extra unrelated experience (we do cloud based distributed systems, that experience is in windows desktop apps and drivers, say) offsets that cost, why would I hire him at his expected salary?
Note that I'm not saying there isn't a bias in this industry, just that it may appear magnified due to such considerations. I know where I work we ignore age, but we often dismiss resumes with "that person is far too expensive for what this position entails" when it comes to junior - senior devs. Leads, architects, technical managers, it's all considered worthwhile, but below that you're an individual contributor in a very particular niche (albeit one that allows you to learn the other parts of the system, and other projects).
A company like Facebook probably doesn't really care that they get the absolute best employee for 90% of their open positions. They go with the group that's going to get them the largest number of acceptable hits so they don't need to work as hard to filter out candidates that they're not interested in.
This is all speculation on my part, I have nothing to cite in this regard. I think it's fairly obvious that if Facebook filters out over-40's for these 90% of jobs right up front they will have less work to do gleaning the wheat from the chaff.
For the remaining 10% of high-productivity/high-value workers they probably wouldn't use such pre-screening and do more targeted recruiting. Again, this is all a guess on my part, but it makes sense. Not saying it should be legal, but I can understand why they do it.
Code quality is vastly, vastly overestimated.
Many very powerful systems that you rely on every day to do even the most basic of shit contain some of the most well-tested, battle-hardened shitty code you can't even imagine.
Writing “good, high quality code” is intangibly, unquantifiably expensive.
Sounds like you’ve only worked on very small projects.
I imagine there is similar logic for their cutoff.
Not saying it's valid, but last I checked (long ago) they had a max age.
Disclosure: I am a bootstrapping old fart.
Or maybe they believe Zuckerberg when he said that younger people are smarter than older people.
Take this article. It says that some companies ("dozens") place some ads targeted at younger audiences.
NOTHING indicates that it's a major trend or that they do this a lot. The article is written because of a legal issue. But if you only read the comments you get an impression of the streets filling up with discarded old ultracompetent programmers.
I don't have the whole picture either of course, but a long life in the industry gives me the impression that it actually mostly does work just the way you think it should.
To be perfectly honest, I'd rather work with people who are better than me and who know things that I don't. Sadly, I work in a location and an industry where those people are thin on the ground, and thus I'm usually deciding between people who have less knowledge than I think they really should for the roles we're advertising.
I'm not going to lie, when someone has 20 years experience on paper but has a shallow understanding of their trade, I'd rather hire the grad with similar knowledge if all else is equal.
With that said, I have hired people who are older than me as junior members of a team before, but that's because they show an interest in learning and improving. Usually this means they've been transitioning into the role from another area.
Interviewing is a crap shoot though. I have no idea if I'm good at it or not because I'll never experience the world where I've made different choices. All I know is I expect senior people who know their shit, and when they disagree with me, I expect them to be able to argue their point in a friendly and somewhat clear manner.
P.S. We're hiring "devops" contractors paying above market rate. It's not a thing about being cheap, it's simply having a quota of allowed hires and trying to get people who can do the job.
They don’t challenge you.
They don’t tend to need expensive healthcare benefits.
Experience is necessary but not sufficient for a top quality dev and top quality is neither necessary nor sufficient to make successful software. The cult of experience in other engineering disciplines is the reason those salaries are so low compared to ours (despite the fact that the work, schooling, and certification are orders of magnitude more involved). Let's not bring that to software please.
It's worth considering that, statistically, the discrimination might be justified. Of course the only way to know for sure is to get rid of age discrimination laws and see what happens to the market. Companies bias "incorrectly" at their own peril, to be outcompeted by companies without said bias.
Companies don't care for more experience. The experience the younger programmers have is "good enough" for them, especially weighted with all the other benefits from hiring them (smaller wages, more impressionable, longer hours, less/no family commitments, etc).
I certainly don't have data in front of me, but it seems like a reasonable assumption in my mind. The youngest people just starting their careers are highly likely to be looking for work, naturally. Slightly older people often start looking to change jobs to move into higher paying roles. And by the time someone is moving beyond 40 they have reached the top of their career mobility and are happy to stay put at their current employer, plus an increasing unwillingness to uproot their family to move to a new town for a different job. There are always exceptions, of course.
If, for argument's sake, we assume this is true, I can completely understand why they would only want to target those under 40. Statistically, the ads shown to anyone else are just a plain waste of money.
Anyways, yes, that grey haired architect probably knows what they are doing, but how many architects didn’t survive to that point?
Managers and above want people who don't really know their rights, and who have the energy and willingness to just do the job.
They also assume that younger people have better skill sets.
All things equal, younger people cost less.
Can you characterize how you are better? I would say the same about myself, it's not that I can do stuff 10 times faster than I could then, I've just got the experience to avoid things that might seem ok at the time but will cause long term problems. Today I know when I'm over engineering, I know when I'm under engineering, I know the patterns that will make code hard to maintain and I can avoid them, I can avoid technical debt (usually).
The powers that be never see that technical debt; they don't know they're groaning under the weight of it, they don't know it could have been avoided, and there was/will be a different set of people dealing with it. So why would they pay more to avoid problems they don't know they have?
Unlike other industries, software development never, ever plans for maintenance.
What if they're right on average, and your experience is the exception rather than the rule?
High housing prices. If houses are cheap, it means that young people can buy housing sooner and have kids. When they have kids, they can't take as much risk and don't have as much energy to start companies. (I have four kids—I barely have the time and energy to blog, much less start a company.) Also, if houses are cheap, it's easier to "make it big," and you want it to be hard to make it big.
Each generation of companies has come from dream teams from the previous. Space race R&D contracts and AT&T engineers begat HP, which begat Apple, Oracle, and Intel, then Sun Microsystems and the dot-coms, then Google and Apple 2.0, now the current crop of companies. SV housing being expensive has only been true for the last 20 years, and has only been utterly unaffordable for the last three.
Guy knows his probate housing, self-help writing, and public speaking, but I don’t take him as an authority for computer company insights.
If you're the hottest startup around, yes, you can only hire 10x developers.
However, if you're the average tech company, like most are, you cannot. Those 0.3x developers and 1x developers with bad work routines occupy the most job positions in companies.
There's probably a bunch of managerial bias as well about the type of person they can control more easily: young eager dummies, or folk with experience who know better than to be treated badly.
Don't be biased. Your improvement might be heavily influenced by the fact that you work on the same codebase, in the same company, with the same people. If you were to start a new job somewhere else, your productivity would likely drop significantly during the first years.
It may or may not make sense from the business point of view. But the point of the article, I think, is to debate whether this is legal or not, which is orthogonal to a purely business decision.
So if you are an older person, make a side project with some cutting edge stuff and prove them wrong.
They know seasoned workers are not going to put up with their shit and will make them look bad, possibly making higher management want to replace them, possibly with the seasoned worker.
A company of entirely 20-something dudes is not going to generate any sympathy. If there's a problem, it's not going to generate headlines or blow up outside the walls.
Whatever advantages you have, the common sense thinking is that more commitment and more control has high value.
My father is the pinnacle of excellence and professionalism as a plant engineer. Everything that got me ahead of my fellow students and into a Ph.D. program, I owe to his teaching and critical thinking. He walks into a building and immediately notices things that can be improved and will save them money. (And he's not a jerk about it, either.) He worked for a company for 25 years, and others almost as long. And ever since that 2008 recession? He's been out of a job. His plant got bought out and sold off. They kept him on longer than anyone else because he was one of the best employees they had--but the company eventually went away.
And now? Nobody will hire him even with countless references from management to coworkers. "He's too old." And one recruiter literally told him that.
Imagine how depressing that must be: to work your ass off your whole life doing "the right thing", putting in 110%, going into work at 3 AM because "the line is down!", and what's he got to show for all that extra work? A dwindling savings account that was supposed to be for retirement, and countless rejections from companies who wouldn't know a good employee if it smacked them in the dick.
That's not how you treat the people who literally built the infrastructure you grew up on. It's reprehensible. And I cannot, for the life of me, understand why anyone (with a brain) would really care about age when all that matters is: "Can you make me money?"
Someone may say, "Well, they can pay a junior engineer less money." Yeah, and a junior engineer is going to have less experience reducing costs and improving reliability. There's a reason we pay experienced engineers more: they add more value per unit time. It's not like we just go "Oh, you're older now. So here's some more money."
How old is he?
Well if the technology you're using has only existed for 3 years, 10 at most ... what's the point in hiring a grayhair? As far as you're concerned they have the same amount of experience as a fresh-ish grad, except they're more expensive.
Sure the underlying principles are all the same, but you don't know that if you're not technical yourself.