Dozens of Companies Are Using Facebook to Exclude Older Workers From Job Ads (propublica.org)
1186 points by danso 10 months ago | 786 comments



I'm a grad student at an elite American university. Several weeks ago, some other students and I attended a recruiting presentation put on by a large multinational company I won't name. The woman who gave the presentation services her company's contract with a tech company that, let's just say, has a big share of the search engine market.

She emphasized that she was looking for diversity, because, as she said, "the type of people on the other side of the table from us are different than they used to be, and they don't want to be dealing with a bunch of 65-year-old white guys. So those aren't the type of people we're looking for." That's pretty much a direct quote.

I was at a small company before I went to grad school. The company began working with a guy in his sixties who'd been laid off for a while and couldn't find work, even though he had a ton of experience. We brought him in on a contract basis, and his role gradually expanded as it became clear that he had both excellent technical knowledge and great people skills. He became a mentor to me and helped me out a ton. He's objectively better at our profession than I am in multiple dimensions, but right now I could find a new job ten times more easily than he could.

I'd like to start a business in 3 or 4 years, and I'm seriously considering going out of my way to target people like him. It'll be my competitive advantage.


I feel this every time I'm between jobs. I'm all of *gasp* 45, and I know from experience that I have to ride the coding job I have right now as long as I can, because at my age I'll probably never get another one.

It doesn't matter how skilled a person is, or how much experience they have, in Silicon Valley you have a better chance of getting hired as a 20-something serial dog molester than a 50-year-old experienced tech.

And filtering out older people, while illegal, is very easy to do for 99% of applicants simply by looking at the year they got their undergrad degree.


That's really sad. Why would you even want to work in Silicon Valley if they treat people like that? There are plenty of jobs elsewhere that are less inclined to discriminate.


So if an older techie redid his undergrad degree, he could reenter the workforce?


You could also just leave the degree off your resume. If you have recent experience, you can even omit some of your early experience. Degrees don't count for a lot (unless you have zero experience) in Silicon Valley or at tech companies in general, unless you're going for a research spot.


Thanks for actually noticing that. I think we lack some measure of empathy and observation these days.


I've heard similar quotes before, and always assumed the speaker meant that they didn't want everyone to be an old/white/guy, not that they didn't want anyone who was an old/white/guy.

Context matters a lot - such quotes are usually said in reference to a company's leadership team, and there's quite a bit of truth to them. When you look at the list of CEOs/Senior-Executives/Board-members at Fortune 500 companies, it's overwhelmingly middle-aged/old white men. I would consider that just as problematic as the age-discrimination that the article was referring to.


Visible-minority man here chiming in; I find this type of perspective (and the lack of empathy that underlies it) kind of saddening. Most of the arguments you are making are the same ones people used to make about other minorities: "No, I didn't mean all of them," "Understand the context in which I made a broad generalization," etc...

> assumed the speaker meant...

I am guessing you are not an old person subject to ageism...

> Context matters a lot

Yes it does: the context in which the statement is made, the person who made it, and the context in which the statement is received. Asking us to consider only one side of the communication, and making excuses because in certain contexts the statement is not offensive, doesn't seem fair to me.

> such quotes are usually said in reference to a company's leadership team, and there's quite a bit of truth to them

There is truth in any stereotype; that's why they exist. The usual issue is that they are alienating and non-inclusive to some people.

> it's overwhelmingly middle-aged/old white men

And the prisons in America are filled with people who look like me; should I be okay with generalizations about me based on that? One thing it seems most leftists like me forget is that most CEOs are white, but most white people are not CEOs...

> I would consider that just as problematic as the age-discrimination that the article was referring to.

Yes, and the first step is to adopt inclusive language, and not make excuses we wouldn't accept for any other protected class. The fight for diversity is won when we get everybody on board. When we start to have a hierarchy of protected classes, with some classes it's kind of okay to discriminate against for X, Y, Z reasons, then it starts to look like the exact opposite of what we are trying to achieve.

And I guess the most interesting thing is the amount of mental gymnastics we have to do to find this statement okay, instead of just taking the most plausible interpretation... It simply reflects the latent ageism in our industry, and the fact that it is still okay to be ageist so long as it's directed at white men...


It's certainly not ok to discriminate on the basis of age/gender/race, regardless of whether the victim is old/white/male or young/black/female. And that's exactly why I think it's problematic that the vast majority of fortune-500 executives are oldish/white/men. It hints at the discrimination faced by other demographics when they try to pursue executive roles.

If someone could demonstrate convincingly that executive roles are being awarded in a purely meritocratic manner, then demographic imbalances might be excusable. But let's be honest: executive roles at mega-corps are almost always handed out on the basis of personal relationships and perceptions. I strongly suspect that in a purely meritocratic system, we would see far more diversity in the executive ranks.

It's also curious to see the extent to which many commenters are willing to accept the premise that old/white/men are being discriminated against in some roles, but not the premise that other demographics are being discriminated against in other roles. Is the evidence for the former really that much stronger than the evidence for the latter?


So to sum up what you're saying, you're a hypocrite and no matter what anybody says showing you how much of a hypocrite you are, you're just going to dig in. Got it.


As a 46 year old, 25 year tech veteran, white dude, holding C-positions numerous times, I think what it indicates is that the first hire of certain kinds of positions aren't likely to go to people like me anymore. It closes doors all around, to people of a given demographic.

These things swing to extremes, but harm will come to some. There are still good candidates out there with mortgages, families, etc.

I'm self-employed now and don't expect to ever put myself into someone else's employ (that's my tip to everyone else in the 40+ camp... entrepreneurialism isn't a groovy lifestyle choice, it's the offensive measure to secure your future --- business startup and continuity risks notwithstanding), but the "we need diversity" bell is now mine to ring. That said, I used to dream of creating a startup, building a team, growing a business --- now the dream is zero employees. It's me and my (life) partner, and we won't take on employees if we can help it.


Most Fortune 500 companies were founded by white men -> Most CEOs of Fortune 500 companies are white men.

Is that problematic?

Tax system already penalizes those who are producing wealth in order to subsidize those who are consuming it.


Society vs Anarchy. If you'd like to try an experiment where there is no government and nobody pays taxes, then we'll see how long it takes most of the CEOs to either get killed or to submit to a different CEO feudal lord. The battles to determine who will be King will be quite bloody. Alternatively, we could agree to have a society, in which everyone reaps the benefits. To make that happen we will need taxes. I'm all for libertarianism, but if you want a little fantasy called "Libertarian minimal government, but oh, no, wait, of course I want a police force to enforce property rights so I can keep my millions of dollars worth of stuff", well then suddenly you aren't really against taxes are you? You just want only the benefits that help you.


Except most of the services that are truly useful and needed, like police, firefighters, most roads, and schools, are actually funded and handled at the local level.

I don't think most Libertarians want zero government, but I believe most government is best delivered locally, where people can actually have influence and oversight.


Alternately, the tax system "penalizes" those who lack wealth by advantaging those who make money by owning capital. Pretending like a pre-existing, often un-earned economic advantage implies some sort of inherent moral or economic superiority is an almost laughably privilege-blind perspective.


The latter was a quote[0] from Thomas Sowell[1], who lacks neither qualifications (Being an Ivy League economics professor), nor privilege-blindness (Black man born in 1930).

[0]https://twitter.com/ThomasSowell/status/943248573222850560

[1]https://en.wikipedia.org/wiki/Thomas_Sowell


"often un-earned economic advantage implies some sort of inherent moral or economic superiority is an almost laughably privilege-blind perspective."

Using your logic (only related to this portion of your post, because, as you stated, money does in fact beget money), should one not work hard so that one's progeny can have an inheritance? Should estate law just be modified to have all assets of a dead person go to the state?


Well, yes?

Inheritance is deeply ingrained in most cultures that I know of (so, Western first-world cultures), but it has never made sense to me. Inheritance is pure unfairness.

And people far more versed in the economic arcana than I am have the same opinion: see Karl Marx.


> Should estate law just be modified and have all assets of a dead person go to the state?

Why not? Why should a dead person continue to exert their influence in the form of cash after they're done personally contributing to society? Why does someone deserve the fruits of their parents' labor? Do we also judge children by the sins of their father? If someone's money is transferrable, why not blame or shame?


It's unfortunate there is so much wealth inequality, but that's a bit of a separate issue. In certain situations there should be a threshold, and that's why estate taxes exist. However, a lot of us work not only for ourselves but to better our children's future station, or at least give them a better chance at not being riddled with debt like we are / were.

I think your sin analogy isn't an apples to apples comparison nor does it make sense in this context. However I'll play devil's advocate.

If your father was a felon / committed "sins," then most people in society are going to be skeptical about how you might turn out. It's an unfortunate truth. Not to say you will be a felon, but based on the environment you were raised in, you have a statistically higher likelihood of becoming a felon / deviant. So yes, you will be judged by the sins of your father whether you like it or not. That's why people leave their hometowns to start new lives, or change their names. I'm not saying it's right, but that's the reality.


> as she said, "the type of people on the other side of the table from us are different than they used to be, and they don't want to be dealing with a bunch of 65-year-old white guys. So those aren't the type of people we're looking for." That's pretty much a direct quote.

Not to defend the quote, but I wonder how else you would characterize "non-diversity" in this context. Specifically, if their current workforce composition is indeed a "bunch of 60-something white males," then how else would you express your desire to be diverse without referring to the current reality?


I highly doubt that this mysterious "company with a large proportion of the search engine market" is filled with 65yo white males.

In fact it would actually make them significantly more diverse if they were to hire some 65yo white males.


You've confused the two companies the parent was talking about. The company that wanted diversity had a contract with the 'mystery' company that has a search engine.


My thought was Microsoft. There are a ton of those from back in the day still there.


I suspect Microsoft. I've heard more than one young white person complain they don't want to work with "old" people. I don't think this is a tech thing as much as an American society thing. We are too biased against older age groups.


Well, if you don't want to get entangled in a lawsuit, you dance around it better than that.


SV might be going too far in the other direction, but the market inefficiency in the rest of the world is overpaying old white guys. The people younger than me are smarter, harder-working, and underpaid. They lack a bit of experience, but that all balances out in six months. The key experience is learning how to work with the production ball of mud, which is different at every job. The people older than me are mostly lazy, mediocre at their jobs, and overpaid. The women tend to be good, but the guys are entitled. The competition is looking the part and being good at politics.


That recruiter you were speaking of probably meant that they were looking to hire a diverse set of people, as opposed to primarily just 65-year-old white guys (which she probably believes to be the majority demographic you'll see in a stuffy corporate environment).

She should not have said "65-year-old," as that is blatant age discrimination. And she should not have brought in the term "white guy," as it is racist stereotyping. It seems like that recruiter is really out of touch with what promoting diversity in the workplace is about.

However, despite this, you cannot deny that the majority of people working in the tech sector are white or Asian, and male. That's just a fact. I feel like you are demonizing diversification efforts in the workplace just because of one experience you had.

That old white guy you worked with will not have a problem getting a job in the future, eventually, unless he has some sort of character flaw that renders him absolutely unemployable. Hell, he already has a job right now, though on a contract basis. Even getting a foot in the door would be much harder if he were not white and male. You just proved so yourself by saying you would have a job lined up for him and people of his demographic. You are acting like old white men are the victims here. They are victims, but only because of their age. Now you know what discrimination looks and feels like. This is why diversification is important. It's simply a protection measure to help ensure equality and fairness.


Same minority man here, chiming in again... I think we need to dig into this kind of perspective, since I really believe it does more harm to the diversity fight than help.

> She should not have said "65-year old", as that is blatant age discrimination. And she should not have brought in the terms "white guy", as it is racist stereotyping. It seems like that recruiter is really out of touch with what promoting diversity in the workplace is about.

I am glad we can agree on this

> That recruiter you were speaking of probably meant that they were looking to hire a diverse set of people, as opposed to primarily just 65-year old white guys (which she probably believes to be the majority demographic that you'll see in a stuffy corporate environment).

I am not sure what warrants this generous interpretation. All we have to go on is a derogatory sentence from someone not in the group (young vs. old). She is supposed to be trained to speak publicly on behalf of a big company. She is supposed to understand diversity a little deeper than the average Joe. I am not sure what she meant, but to me the way she spoke also speaks volumes about our industry's ageism. I don't think we would have heard this about Hispanics, blacks, or women.

> However, despite this, you cannot deny that the majority of people working in the tech sector are white or asian and male. That's just a fact. I feel like you are demonizing diversification efforts in the work place just because of one experience you had.

OP is not denying anything; he did not even mention diversity efforts or anything of that sort. The article was about ageism, and his comment related a story where people who are supposed to be trained to look for diversity made disparaging comments about old people. He probably feels concerned about it because he can see himself in the old guy in a couple of years, or simply because proximity to the guy in his story makes him feel some sympathy for him.

Strawmanning "look at this old white guy suffering" into "you are denying the struggles of all these other people" is bizarre and, from my perspective, shows a lack of empathy. Then going on to lecture about diversity is not helping.

> I feel like you are demonizing diversification efforts in the work place just because of one experience you had.

He is not; he is pointing out a deficiency in our inclusiveness.

> That old white guy you worked with will not have a problem getting a job in the future, eventually, unless he has some sort of character flaw that renders him absolutely unemployable.

Doesn't this article prove otherwise? Saying this is essentially denying ageism.

> Hell, he already has a job right now, though on contract basis.

Discrimination is more nuanced than getting or not getting a job. It's about being passed over for certain positions, having to work harder for the same reward, or working the same for less reward. It's about, like in this case, someone having the capacity to be a full employee but having to settle for contract work.

> You are acting like old white men are the victims here. They are victims, but only because of their age.

I am not sure what the point of this is on a thread about ageism. There are so many things wrong with this statement...

> I feel like you are demonizing diversification efforts in the work place just because of one experience you had.

Since we are sharing feelings: I feel like you are trivializing other people's experiences and trying to create a hierarchy of protected classes. We don't go around telling cancer patients about the pains of poverty, or vice versa. We don't go around telling white women complaining about sexual harassment about black men being killed by the police, or vice versa. We don't feel the need to compare, judge, and contrast those experiences. We just accept them as the general suckiness and unfairness of life, offer help when we can, and sympathy when we can't.

When we do so, we are alienating people without good reason. People will naturally feel more empathy toward certain groups, either by identification or just by shared experience, and that's okay. We don't need to lecture them just because they aren't focusing on the group we want to focus on. Ageism is a problem; if you don't feel strongly about it, that's okay, but please stop telling people who do that they should be focusing on something else...


>I'd like to start a business in 3 or 4 years, and I'm seriously considering going out of my way to target people like him. It'll be my competitive advantage.

Count me in if you can. No, not to give me a job, but to be your partner. I would guess there are some really smart older people who are underutilized because of all the perceived drawbacks of age (often by not-very-smart people). But smart people are rare; they are immensely valuable whatever their age may be.


It is shocking how they do not understand how racist they are.


You sir are everything that is right in this world. You won the internet today.


So your competitive advantage is doing the exact same thing these companies are doing, except with a different demographic?


> The woman who gave the presentation services her company's contract with a tech company, that, let's just say, has a big share of the search engine market.

If you're being oblique to avoid libel, it's worth noting that libel law doesn't care about whether or not you name an entity. It cares about whether or not you've identified an entity.

Google has been identified.


And if there were multiple people in that audience who concur about that quote, said entity is on the hook for a class action for millions of dollars.

Somehow, I don't think this one is going to get pursued ...


There are multiple class-action suits for age discrimination going on in Silicon Valley as we speak. I know firsthand that Oracle has more than one in progress, and given that the O is full of gray hairs already, I can only imagine that the other big names are getting hit too...


Neither do I, which is why I find the whole exercise of not naming the company a bit odd.


For all we know it could be Bing!


Yandex


The Chinese version of Google. Forgot their name though.

They are huge :)


Baidu


Generally, a handful of possibilities counts as identification (at least from a jurisprudence point of view).


Remember that it's only libel if it's not true.

It's possible that the report is true, but that the author doesn't want the consequences of unambiguous identification.


>>> I'd like to start a business in 3 or 4 years, and I'm seriously considering going out of my way to target people like him. It'll be my competitive advantage.

He could be 10 times more expensive than you. Did you consider that?


As someone in their 50's, that argument makes no sense to me. If I were unable to find a job, you somehow think I'd hold out for premium wages?


Many people in their 50s do; a lot of people are anchored to their previous salary level and would never accept what they see as a "pay cut." And a company doesn't want to hire someone at a "below market" rate, for fear that they'd jump ship as soon as possible.


To quote Weird Al Yankovic responding to being overqualified while looking for work before his singing career, "Yeah, but I gotta eat!"


I'm in my 50s and I think your second statement is much more true than your first. Recruiters especially get in the mode where "what was your previous salary" determines what the offer is. When you're older, you're more likely to have savings, and taking a pay cut for the right (interesting) job is a no-brainer.


A lot of people in their 50's don't need to work. As long as they bought their home when it was affordable, they are fine. Some have sizable assets that the younger folks can't dream to achieve in their lifetime.

They won't accept work for less than premium wages, don't need to.

That being said, I admit that there is also another part of the population who is struggling and desperate for work at any price.

Last but not least, there is the question of what a premium wage is. In London, for instance, a lot of folks over 30 have moved to contracting, where they can charge a good daily rate without the hassle of whiteboard interviews, overtime, or politics. You'll see many companies (see the HN Who is Hiring) allegedly struggling to recruit at their great wages, but the truth is that employees have moved on to greener pastures for twice the money.


> A lot of people in their 50's don't need to work.

I must travel in lowly circles.


I suppose the distribution is extreme on both sides. Either you accumulated assets from early on and you're golden, or you're still struggling with rent and no pension in sight.


Um, what?

Like, cashflow at 50 with 3 kids is important.

Yeah, you've saved for retirement at ~65ish, but you have 2 kids in college with one on the way up, plus a mortgage, plus just regular life stuff. That's not an unreasonable scenario at all.

Like, do you live in a first world nation? Honestly, the paradigm changes a lot in the 3rd world and your confusion would be more appropriate in Chad or something.

(In Chad, average age of first birth is 18, so you'd reasonably be a grand parent by 50, plus cost of living is very low)


I live in a first world nation. College education is free, allegedly.

If you have kids before your thirties, they will all be 25+ when you are 55. They have most likely finished college, and they have probably left the family home.


In the US college is NOT free, and that's 330 million people living here.


[flagged]


Just because your life turned to having no home and no pension doesn't mean it's the same for everyone else.


I believe that was exactly my point.


It seems like there's way more on the low side than the high side, at least looking at Table 1 (below) from this PDF from the GAO:

https://www.gao.gov/assets/680/670153.pdf

Edit: below, the table contents

               Households age 55-64 with    Households age 55-64
               no retirement savings        with retirement savings

  Percent of households       41%               59%
  age 55-64

  Median net worth          $21,000           $337,000

  Median non-retirement      $1,000            $25,000
  financial resources

  Median income             $26,000            $86,000

  Home ownership rate          56%               87%

  Percent who own a home       22%               27%
  that is paid off

  Percent with a               32%               45%
  defined benefit plan


So, that's 25% of people having no rent to pay.

Then an additional 50% who have a home but still a mortgage to pay. The statistics don't say when mortgages will end, so we can't do any meaningful analysis on the majority.

There are more people on the low side, but not WAY more, contrary to your statement.
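For what it's worth, the 25% and ~50% figures above are just the table's home-ownership rows weighted by the share of households in each column; a quick sketch of the arithmetic (all numbers taken from the GAO table):

```python
# Shares of households age 55-64, from the GAO table above
no_savings, with_savings = 0.41, 0.59

# "Percent who own a home that is paid off" in each column
paid_off = no_savings * 0.22 + with_savings * 0.27
print(round(paid_off * 100))  # 25 -> about 25% have no mortgage or rent to pay

# Overall home ownership, weighted the same way
own_home = no_savings * 0.56 + with_savings * 0.87
print(round((own_home - paid_off) * 100))  # 49 -> about half own but still owe
```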


Too bad we don't know when they will pay off. A friend of mine just bought a house for $250k from someone who bought it in 1998 for $70k; six months before selling, they refinanced for $225k. (This is all public information if you go to the right courthouse; he knows because it is in the title.)

Which is to say there are some people who don't have their house paid off yet, but will have it paid off before they retire, and then live rent free (but with other bills and taxes). There are other people who will not have anything paid off.


So, um, where you live once you pay off your house everything else is free and you no longer require an income?


>>A lot of people in their 50's don't need to work. As long as they bought their home when it was affordable, they are fine

That is the EXTREME minority of people over 50, at least in the US

>I admit that there is also another part of the population who is struggling and desperate for work at any price.

I would not say they are "desperate for work at any price," but most people over 50 need a fair wage for their work, and by fair wage I mean a wage comparable to younger people's.


There are a few such people, but not very many. Most people in their 50s have very little savings; http://www.rothira.com/average-retirement-savings-age-2017 has a nice breakdown. The average baby boomer could live at most 6 years on their retirement savings, and that's assuming they eat rice and beans...

My dad was in fact laid off at 50. Everybody knew he was very careful to save a lot of money, so they always said, "but you must have enough money saved away to retire by now." His response was always, "yes, but I don't have enough money to get to retirement." In the 15 years between 50 and 65 his money is expected to double twice, which makes a big difference to his retirement income. If he had started drawing it down at 50, he would have had much less money to work with. He needed a job to pay the bills.
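The "double twice between 50 and 65" expectation is just compound growth: by the rule of 72, money doubles roughly every 72/r years, so a return near 10% doubles it about twice in 15 years. A quick sketch (the 10% rate and starting balance are illustrative assumptions, not from the story):

```python
# Rule-of-72 sketch: at ~10%/yr, savings double roughly every 7.2 years,
# so between age 50 and 65 they double about twice.
rate = 0.10          # assumed annual return
balance = 100_000    # hypothetical savings at age 50
for year in range(15):
    balance *= 1 + rate
print(round(balance))  # 417725 -> a bit more than 4x the starting balance
```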


> He could be 10 times more expensive than you. Did you consider that?

That is an assumption that may be invalid. Everybody is in their own private situation. Your assumption will be true for many but not for all. Many older folks may not need a top salary anymore and will take an average income for the pleasure of working with other fine people and doing interesting things. At some point in your life the career ladder will be less important than quality of life.


>He could be 10 times more expensive than you. Did you consider that?

So, the only way that is even in the realm of possibility is if the kid was making $30K and the greybeard was making $300K. That's another benefit of greybeards: you get more bang for the buck simply because of the way salaries scale vs. performance that comes with good experience.


Call it 5 times if you wish. That's a fairly normal range.

Cost is more than just salary: 5 weeks vs. 2 weeks of holidays, 401(k), health insurance, etc...


Still not realistic. Assuming an entry-level dev in SV is making $100k, most 50-year-old devs are not making $500k.

More like 50% more or at most twice as much.


I would say from my experience that I could be making three times that of a junior developer at their first job, in my area. But I would not say that I am making twice that of someone with half my experience. Someone that's just five years behind me is pretty much making what I'm making.

This notion that experienced developers, as a whole in the industry, are making x times more than junior developers is nonsense. Utter nonsense spoken by people who have no idea of what they're talking about. It's true you can find outliers for that kind of thing, and I'm willing to bet they are specialty cases.

Early in my career I would get huge jumps in pay when I proved myself at a project. Almost fifteen years into this career and I don't get those huge jumps in pay anymore after a successful project. Not because I'm not deserving of the praise, but I'm already paid well to deliver based on my experience. A junior succeeds and gets a raise; a senior succeeds and moves on to the next project.

Could I get one of these mythical 300k jobs? I know I can. I'm willing to bet I could snag a 500k job if I pushed myself. But that would require sacrifices I'm not willing to make, give up hobbies that I enjoy, and live in a city I really don't care for. A city that would eat up the extra income anyway.

I'm more than happy with my current job that pays well and I get to mentor juniors every day, which I enjoy.


The norm is to double or triple salary between graduation and having significant experience. Remember that cost is more than just salary.

In SF/NYC, juniors can start under 100k, seniors can go over 300k. Either number could go either way, let's not argue about who got what. The point is, there is a LARGE multiplier.

I recall one friend in London, in a junior position. He left his company because of bad conditions and bad pay. One of his last assignments was to interview possible replacements. There was one senior guy with 20+ years of experience who was great; the company made an offer for around double my friend's own salary.

Do you know what happened? The guy declined.

Guess what we found out later? He joined another company that offered him around three times that double.

Young folks don't realize how expensive experienced folks can be.


So 10x to 5x to 3x, which isn't really a large multiplier. Also, a guy worth $300K is a "large multiplier" more valuable than 3 entry level juniors at $100K.


3x the double gives 6x.


Yes and that's ignoring the compensation in bonus/RSU/healthcare.


I know the local pay scale pretty well. He was out of work for a while, and he definitely would have taken a salary equivalent to what I was making.


I'm a developer in my sixties. I guess I'm what most HN readers would call an "old white guy." I like working as a developer, and I do a decent job of it.

I'm drawing the same salary as I drew twenty years ago. (Never mind adjustment for inflation. It's the same actual number of dollars.)

I have a clear enough understanding of my trade that I'm willing to buy my own tools when they save me time (keyboards, monitors, development tool licenses, VM service fees, etc).

This all works for me. My kids are out of college without debt. I've got some money stashed in 401(k). I wouldn't mind retiring, but my small-company employer can't afford to hire somebody younger and they've asked me to stick around.


By "somebody younger" do you mean someone in their 50s or 40s, or are they looking for a recent graduate in their 20s?


In my experience that kind of pay differential exists in two places: Pakistan - US or Intern - Not intern.

Every other role I've seen pays based on experience, not age, and at the companies I've been at, that experience premium is only about 30-50% at most.


At most companies I've seen, in a single team, two people doing the same thing with a similar amount of experience can have a difference of pay of double.


This is absolutely true. A two-, three- or four-person team with similar skills, similar ideas of good coding practices, and a tight-knit method of idea sharing can blast through code and get things done in 24 hours that I've seen distributed teams of 15 in Pakistan, Mexico, and India take 60-180 days to deliver. (In this specific case I'm thinking of building out an MPI statistical patient-matching system.) I'm not saying that kind of skill isn't found in those countries; I'm specifically talking about a US company trying to outsource a software project like that. I'm fully aware a team in Pakistan can plan and perform really well when the idea and project originate in Pakistan.


Why is this getting downvoted?


It doesn't make sense to me to discriminate against people with more experience. Can someone explain it to me?

I have heard that companies like recent grads because they are (1) more malleable and (2) can be paid less. But neither of those reasons seem to me strong enough. I'm talking completely about the company's own interests.

Let's address the first reason: malleability. A recent grad presumably will adopt the company's culture faster, complain less, and in general pick up things sooner. Well, the hardest, meanest coworkers I've ever had were late twenties, early thirties. I've worked with people in their sixties, and they're sweet people. Even the grumpy old sysadmin had only a thin layer of spikes. After just a few days I could see through most of it, and he was 10 times more helpful than my other sysadmins. Not only was he softer (at least deep down) but he was smarter, having done it for decades. Even when he met a new problem, his keenly developed taste made him more likely to choose something that would be more maintainable long term.

Now let's address the second reason: salary. I am 10 times better than I was when I started. I know, because I still work with some of my code from back then, and I desperately want to rewrite it all. How much more does a senior developer make than a new hire? 50% more? Seems worth it to me. 100% more? 200% more? Still maybe worth it. And if some old fella can't get work at all, maybe he would settle for something between 50% and 100% more. I mean, why not at least make an offer?

It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good. You see some straight-out-of-college twenty-something in . . . any other field, you think, "I sure hope he knows what he's doing."


It's a race to the bottom. You could pay more for better work that gets done more slowly (fewer hours as people have families, etc,) or you could pay less for worse work that gets done more quickly.

Well, iOS11 needs to ship on its annual schedule, non-negotiable. And the SVP will really complain if I ask for more money this quarter. So, cheap it is.

If a bridge falls down, it's really obvious and people die. If billions of computers become vulnerable to malicious actors and a few hundred or thousand people suffer dramatic personal damages, well that's a nice and quiet problem which will be quickly forgotten in the 24-hour news cycle.


If billions of computers become vulnerable to malicious actors and a few hundred or thousand people suffer dramatic personal damages, well that's a nice and quiet problem which will be quickly forgotten in the 24-hour news cycle.

This. I do development coaching and training. I’m totally burned out because my job doesn’t matter. These companies don’t care. They hire me to “increase code quality” but that ends up being a lie. They just use me to get their teams to crank out shit faster, and I do mean “shit”. They don’t care if the systems blow up a few times a year or once a week. Banks, oil companies, airlines, all of them. They lose millions because of it too; still, it doesn’t matter. No one (in leadership) gives a fuck. “Just make this quarter’s goal” is all that matters.

Fucking Equifax, for crying out loud: NO CONSEQUENCES. If this isn’t ironic enough, do some googling on why Equifax is named such. tl;dr This isn’t their first rodeo. They once fucked up so big they had to change their name.

Sucking at software only has consequences in a few industries, usually ones where the product is software, the product is discretionary, and the customer is fickle. Even then it usually sucks.

These same companies are audit conscious (about money), they’re crazy about regulatory compliance, and workplace harassment. Why? These things have real consequences.

The sooner some company gets fined one year’s profit or an exec goes to jail the sooner this shit will actually matter to them.


The topic of your rant is something that it took me forever to accept as a software engineer. Nobody cares about quality.

Your method is refactored to be super-fast and have no branches? Nobody cares.

Elegant, re-usable architecture? Nobody cares.

You've optimized an inner loop in assembly and increased speed by 30%? Nobody cares.

Made it to zero compiler warnings? Nobody cares.

No known crashers left in the bug tracker? Nobody cares.

Fixed that bug that's been in the code for two years? Nobody cares.

Patched a security hole that could allow a crazy bad attack? Nobody cares.

The vast majority of software shops out there care only that something--anything--ships as fast as remotely possible, and as long as it does not kill the customer, it's a success. Nobody in leadership gives a single fuck about anything else. There are a few places where this is not true, and good software people tend to collect there, but it's true for the vast majority of places where you will work. The sooner you accept this, the less likely you'll burn out.


I agree with you, and this is a hard lesson to learn. But I don't think it's necessarily a bad thing.

Software should really only be measured by the value it provides. If a terribly buggy piece of poorly written code still saves hundreds of man hours a week, it's a win. (Unit tests be damned)

If that big ball of mud that's using 20 year old technology still prints a billion dollars a year - that's a win. (Microservices be damned)

If some slapdash jquery-and-duct-tape web app still solves a specific problem I have, that's a win. (React SPAs be damned)

Does crappy software flourish while beautiful software dies? Sometimes. But usually code hygiene, security, bug counts and crashes are not what makes or breaks the success of a piece of software. Developers often think their work is the most important, when in actuality it is usually not. Sometimes you really just are a cog in the machine.


This is what I call the Mediocre Mercenary Mentality, this idea that "as long as we can make money off of it, it's good enough".

This mentality is the reason why sometimes -- but not always! -- my phone won't play music in my car until I restart the _car_.

This mentality is why game development companies get away with charging full price for unfinished, bug-ridden games with crippled features and missing content.

This mentality is the reason why I couldn't see my credit card in the list of my accounts on my credit union's online banking site for several days.

This mentality is why Experian and Equifax don't report the same credit score for me.

This mentality is why I feel ashamed every time a receptionist apologizes for the wait, because they "have been having problems with the system."

Most importantly, this mentality is why I have to worry about whether someone will steal my identity because people keep writing shitty code on top of shitty code and my personal information keeps getting leaked.

When I'm so disgusted by my own profession, no wonder I ended up burned out.


We have an economic system where we reward maximizing short term financial gain. As long as that doesn't change companies will try to sell the crappiest shit they can sell you for the most money.


I think this is more than just the economic system. If you look at people in general, we live in the age of instant gratification.

We want everything fast! Nobody is willing to put in the work and get the reward after a longer period of time... No! We want results now! We don't want to wait for the food to be properly cooked, we want it now, so we go for fast food. We don't want to put in 1 year of work to learn something new so we can feel that we achieved something, we want it now, so we play a game where we can 'win' in 30 minutes. We don't want to build a relationship and eventually end up being intimate, we want it now, so we go for 'one night stands'.

Then you ask yourself... is it weird that the economic system works the same? That the software industry works the same? :/


I don't think the phenomenon you describe is in any way new or specific to this age. If anything, the things you mention only reflect the new choices we have. Going over your individual examples:

> We don't want to wait for the food to be properly cooked, we want it now, so we go for fast food.

We go for fast food because it saves time. When I opt to order in instead of going out for lunch, or grab a burger from McDonald's en route, I do that because eating is mostly instrumental. I don't want to eat at that point, I have to, and the sooner I can fill myself up, the sooner I can get back to doing the things I actually care about.

Fast food, and food delivery services, allow people to choose to spend more time on other things, when those other things are more important to them. The same people will enjoy a finely cooked meal on a different occasion, when they prioritize it.

> We don't want to put in 1 year of work to learn something new so we can feel that we achieved something, we want it now, so we play a game where we can 'win' in 30 minutes.

Spending a year on learning something to just get a quick feeling of achievement is a very stupid way to go about it. Videogames are better for that. Learning is better for getting long-term feelings of achievement, and to actually gain knowledge/skills that you can use for something. And again, it's not something new - our generation wastes time on videogames, previous generations wasted time playing soccer, cards, darts, and doing tons of other quick-reward activities.

> We don't want to build a relationship and eventually end up being intimate, we want it now, so we go for 'one night stands'.

One-night stands have existed for as long as humans have been humans. Nothing fundamental changed; only the hookup methods evolved with population density and available communication tools.

--

> Then you ask yourself... is it weird that the economic system works the same? That the software industry works the same?

Nah, they don't work the same. In them, the actors are not driven internally, they're driven externally. People write shitty software, or sell shitty products, not because of their need for instant gratification, but because of market pressures. A fast hack sold by your marketing team can be a difference between you getting $100M contract versus not getting it, or getting your product on the market first versus a week after your competitor. Market economy, because of all its efficiency, is what creates the culture of suck.


You say that like it's a bad thing. If I compare two services/products/relationships/experiences which are equal in all but one attribute, delivery time, guess which I will choose. Delivery time is an important attribute, and it reduces the (subjective) risk of no delivery. Delivery time is also measurable. In comparison, it is hard to measure how bug-free a piece of software is, or how healthy food is, or how successful your relationship will be, or how well you can learn a new skill in a year.


In b-school Profit=Revenue-Expense was drilled in over and over again.

It's so obvious. But until you realllllly sit down and internalize it... then you can be at peace with the shitty service of Comcast, etc.

In other words: Don't hate the player, hate the game.


> When I'm so disgusted by my own profession, no wonder I ended up burned out.

Ha ha, maybe this is why they don't want older engineers. They're wise, cynical, bitter, skeptical and not likely to suffer bullshit management.


I have this saying that most software is tent software. Nothing wrong with tents, music festivals are built on tents for example. They are cheap, quick to put up and put down and can do a lot!

You just have realize that most software buildings are tents, and there are a few buildings that you can use out there.


Tents are usually temporary though, software rarely is.


You have been able to put to words the feeling that I have had for such a long time. Thanks.

I don't know if the world is getting a shittier place. But it surely is getting less reliable.


> I don't know if the world is getting a shittier place. But it surely is getting less reliable.

It is, because everything is being optimized into literal minimum viable products. That's how competitive markets work - whatever aspect of your product or business you can cut out to save money and thus get ahead of your competitors, you will cut out, and then your competitors will cut it out too, in order not to get outcompeted by you. This race-to-the-bottom phenomenon is fundamental to how competition works, and the result is that products gradually lose quality.

It's the reason why your grandfather's washing machine probably works to this day, while you have to fix your new one every year and will probably replace it in five. "Building to last" is a good example of a quality that the market economy optimized out over time across pretty much all products in all sectors.


> It's the reason why your grandfather's washing machine probably works 'till this day, while you have to fix your new one every year, and will probably replace it in five years.

Adjusting each product's cost for inflation, things really are built like they used to be. Your grandfather's washing machine cost far more than a current model* (again, adjusted for inflation), and all of the poorly built old washing machines broke long ago, so no one sees or thinks about them anymore.

* I'm having trouble finding a source on washing machines specifically, but I'm assuming they followed the same trends as most commodities tracked by the BLS: https://data.bls.gov/cgi-bin/surveymost?ap

As questionable as citing Reddit is, this post rather well encompasses what I am trying to get at: https://www.reddit.com/r/Showerthoughts/comments/4kn8ku/the_...


> This mentality is the reason why...

The way I see the post you're replying to is either you learn to cope with these things or you'll burn out faster.

Since there's no accountability, if you want to do things the right way, you'll have to do it yourself and be prepared to swim against the tide.


It sounds like you'd burn out in any profession. Perfect is the enemy of the good. If you're holding your work, and that of others', to a perfectionist's standard, then you're just causing yourself unnecessary stress and anxiety.

Most of those bugs and failures you listed are really inconsequential in their contexts.

Car won't play your phone's music? Use the radio. Game has bugs? Play another game until it's patched, or find inventive ways to use the bug. Can't see your CC? You can still charge on it and pay its balance. Experian and Equifax don't report same credit score? Your creditors aren't reporting to every credit bureau.

Ashamed that someone else's system doesn't work? I can't help you there. That's some deep psychological issue.

About 10 years ago I came to understand that mediocre runs the world. All these people, who are your bosses, who are getting raises and promotions, they're the B/C students from college. They don't care about perfect. They really only care about finishing what they're assigned and going on with their outside life.

If you want perfect, do it on your own time.


> Most of those bugs and failures you listed are really inconsequential in their contexts.

Yes, I expect that for most things we could find a context that could render them inconsequential.

> Car won't play your phone's music? Use the radio. Game has bugs? Play another game until it's patched, or find inventive ways to use the bug. Can't see your CC? You can still charge on it and pay its balance. Experian and Equifax don't report same credit score? Your creditors aren't reporting to every credit bureau.

Elevator doesn't work? Use the stairs. Public restroom is filthy? Hold your breath and don't touch anything. Lost a tooth on the right side of your mouth because of an incompetent endodontist? Chew on the left side.

I'm not being facetious here. I really do agree that almost anything is tolerable if we decide to. That last example is from personal experience: I have lost three teeth on the right side of my mouth and I have to chew on the left. Most of the time, I don't even think about it anymore. And it really isn't that big of a deal -- it doesn't have a very significant impact on the quality of my life.

> Ashamed that someone else's system doesn't work? I can't help you there. That's some deep psychological issue.

It could be. Like I commented elsewhere, I would expect a doctor to be ashamed of Andrew Wakefield, but maybe most of them aren't. And even if they are, it could be a deep psychological issue that I choose to compare the state of our industry to what Wakefield did in his.

But here's the thing: I know that we can do better. Not perfect, better. But we don't have to, because we lack accountability.

Sure, perfect is the enemy of the good. But complacency is the last refuge of the mediocre.


> Ashamed that someone else's system doesn't work? I can't help you there. That's some deep psychological issue.

Personally I get angry in those situations... the larger the company, typically the more safeguard crap in place that's supposed to prevent those issues, the more pissed off I get. But that's just me. I get angry when I see evidence of poor/buggy quality in business software the more visible it is.


>This mentality is why I feel ashamed every time a receptionist apologizes for the wait

why would you feel ashamed for waiting a little longer?

that's an odd thing to be insecure about


I'm not ashamed for waiting longer. I'm ashamed for the same reason I would expect doctors to feel ashamed about Andrew Wakefield.


Because the guy works on computers for a living.


> Software should really only be measured by the value it provides.

The problem is when the "negative value" is externalized leading to false evaluations. "Oops, so sorry we leaked all your social security numbers. Our bad."


Yes, the "software should really only be measured by the value it provides" attitude is kind of the key problem we're facing in a whole lot of fields.

Because it's not really the overall value the software provides that's being talked about; it's the instantaneous, immediately visible payoff to a single consumer that software (and people, and consumables) gets judged by. If the software is going to cause trouble over time, if it's going to have security holes that will cost a lot over time, if it's going to commit you to garbage that's updated less and less frequently, etc., none of this is calculated. Just as the health costs of sugary drinks don't get calculated, the social costs of poor education don't get calculated, etc.


Also the externalities that are never mentioned: bloat and poor performance unnecessarily take up users' resources, frustrating them and preventing them from doing other things on their machines simultaneously, and also unnecessarily wasting electricity. Multiply that by the number of users of the software, and suddenly all that talk about "optimizing for developer time" starts to sound like "I'll save $100 for myself by inflicting $100k in externalities".
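A back-of-the-envelope sketch of that multiplication, with made-up numbers purely for illustration (none of these figures come from any real product):

```python
# Hypothetical scenario: a developer skips an optimization and saves
# a couple of hours, while every user loses a few seconds per day
# to the resulting slowness. The asymmetry is the whole point.
dev_hours_saved = 2          # one-off saving for the developer
seconds_lost_per_user = 5    # extra waiting per user, per day
users = 100_000
days = 250                   # rough number of working days in a year

user_hours_lost = users * days * seconds_lost_per_user / 3600
print(f"developer saved {dev_hours_saved} h; "
      f"users collectively lost ~{user_hours_lost:.0f} h in a year")
```

Even with modest per-user numbers, the aggregate loss dwarfs the developer-time saving by four orders of magnitude.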


It's up to regulation - self or government imposed - to factor externalities in, change society and/or change the law


Somehow I never consider solving a problem by delegating it to government or society. My first thought is always: there must be a technical solution for this. Even government and society are going to be solved by technology: crypto/blockchain or something which will evolve out of it.


There's no tech that specifically gets people to do the right thing. There's tech that might make people do things but that tech can be harnessed to get people to do either right or wrong things.


Even on a larger scale, the world hasn't burnt down with all the private information that's been stolen lately, has it? Have there even been any significant consequences?

Is it possible that the value assigned to keeping that information secret was, in the end, actually appropriate?


"Have there even been any significant consequences?"

Ask someone who's had their identity stolen.


It's a waste of the information stolen to use it while people are still on high alert.


I’m a web developer in my spare time. I write little things that solve problems in my two day jobs. They haven’t tended to be pretty, because often I’m learning as I go, and when something works it’s truly time to move on to the next problem, even though I now know enough to solve the old problem again more elegantly. I’m trying to use better practices and modern JS frameworks in new projects just for the experience, even though they are actually overkill in many cases. Certainly the people I work for would rather have two useful utilities written in plain ES5 and PHP than one useful utility plus a story about code quality. I’m gradually finding the middle ground where somebody can look at my code and not instantly want to take a shower.


Saying all this shows that 1. You care and 2. You know the difference.

I’d hire you.


Good to hear- I’m hoping to switch to full time web development some time in the new year. It's actually my favorite thing at these other jobs.


You're forgetting about ethics. Even if your boss doesn't care about security, if you want to be a decent person, you need to at least try to prevent your system from becoming a tool to harm others.


Hanlon’s razor comes through in full force.

The vast majority of people making egregious security errors know no better. You could argue that whoever hired them shares or shoulders the blame, but to suggest they’re not decent people is a stretch. Most, I’m sure, are. They’re also likely infosec idiots.

Now if you know something is terribly insecure, understand what’s involved in fixing it, and go out of your way not to fix it, then yes, ethics come into play. I see that the same as an engineer (in the true sense of the word) staying silent on an issue involving automobile brakes that could lead to casualties. The consequences are not 1:1, but the ethical question is in the same category.


I already have issues when I see something but I know no one is going to pay to fix it. So what can I do? I report it, no one cares; I might be able to fix it, but then I'm doing it for free.

If your bugs can hurt people (financially, physically), I think you need to work a bit harder to make sure they don't sneak into your work, whatever you believe helps there (unit tests, formal verification, ...). But if you didn't do all you could because you weren't paid for it (your boss tells you to add features, not to waste time on things that haven't broken yet), what should you do? It seems like a real issue for professional coders, as there aren't many options; not many companies will pay to prevent these kinds of things unless they've had a big issue before (and someone(s) got fired or sued for it already).


I'm normally on board with Hanlon, but "infosec idiots" who fail to educate themselves on basic security principles are guilty of negligence at best. There's no reasonable way they don't know that security is a real concern, so they should be looking into it. If they still fail, then we can go back to Hanlon's razor.

But as one HNer to another, I was really talking about your second case. To claim here that security can ever be discounted, as the person I was replying to implied, is pretty much inexcusable. Everyone here knows that security affects more than just your business, so no one here should be solely applying business logic to it.


I have a feeling you don't have much experience working on other people's code. You've bundled all kinds of issues into a single bucket of "buggy piece of poorly written code". Some of these are outright security vulnerabilities, and some make you wonder how the bug wasn't found sooner. Sometimes fixing these issues requires a heavy rewrite that cannot be justified as a bug fix. And how do you justify playing with your customers' data using "slapdash jquery-and-duct-tape" techniques?


This, ironically enough, is my main motivation to improve my software development skills, from algorithms to system design to development patterns. Because, like first impressions, the first write is the most important. If it's done well the first time around, the true risk of missing an improvement on the next iteration is significantly lowered. Edit: it's also the reason I'd like ongoing education in basic security for app devs.


"Software should really only be measured by the value it provides. If a terribly buggy piece of poorly written code still saves hundreds of man hours a week, it's a win. (Unit tests be damned)"

But those bugs could easily lead to mistakes in the output.


I had the fortune to work at an agency that cared a whole lot, and tended to take on clients who had learned hard lessons from bad code written for the platform we worked in.

Code quality matters and has tangible consequences in that niche, and it was awesome to work in that environment.

I haven't worked anywhere like that previously, and I think the key factors that brought code quality to the forefront were:

* Bugs cost sales

* Poor code caused friction for features a significant amount (reduced ROI)

* Small teams and an insanely tight customer relationship to those teams. The client needed to have a stakeholder in our team or as close to that as possible.

But most importantly it was just the company culture from top to bottom. The director cared, he made the project managers care, and we already cared because it's more fun to write good code.


That's a nice place to work in! How was this company doing? Would it be doing better just shipping "if it makes money - it's good enough" code?


It really was! I believe they were doing well, growing at a steady pace. Another part of the puzzle was that client relationships were long, over years, so if we wrote bad code it was us that would have to deal with it later. The clients were typically larger, which helped to that end.

We did take on a rushed project at one point, so there is a good comparison in that project. It shipped, but everyone was unhappy, some key people left and had to be re-hired for. It triggered a very transparent discussion roughly titled "let's figure out how to never do that again" and to that end they curated the clients more carefully, and put in some checks to see if projects are a good fit.

But I understand that is probably unique, and at previous jobs the "good enough to make money" mantra has worked out okay.

I think when your software domain is complex and rapid, it pays to have sound architecture. When it is simple, or slow to change, it doesn't matter quite as much.


>as long as it does not kill the customer

Whoa now. The Toyota 'unintended acceleration' case showed us that it's fine to kill the customer. The court declared itself incapable of penalizing even the most stupendously egregious example of criminal negligence we're likely to see. (Of 90+ 'required' and 'suggested' automotive-industry firmware coding practices, they followed 4 in the code in question. The developers didn't have a bug tracker or static analysis tools. Typical of most organizations, the software engineers had no real power to demand more time or testing, etc.) Toyota lost a civil wrongful-death case and paid out $4 million, then settled with the plaintiffs out of court before the jury could award punitive damages. So killing is OK.

The issue is that, I think, software people need to stand up for themselves. They need to tell executives no, and refuse to pump out slapdash work. They need to take responsibility for the successes and failures of the company. The practical truth is that in most organizations, the software is the most valuable part of the entire business and the thing with the power to enable the company to excel or fall flat. We've been waiting for executives to recognize this since the 1980s and they still just see the software people as people who do typing.


Bad example.

After a 10-month search, NASA and NHTSA scientists found no electronic defect in Toyota vehicles. Driver error or pedal misapplication was found responsible for most of the incidents. The report ended stating, "Our conclusion is Toyota's problems were mechanical, not electrical."

https://en.wikipedia.org/wiki/2009%E2%80%9311_Toyota_vehicle...


Totally agree. But the only way for software people to take any power or control (really, for any worker to have any say in the business at all) is to unionize, and us tech folks tend to treat that like a curse word.


Union is a curse word but I have a hunch that there are others that might make more sense. Guild maybe?


My universal advice to graduating CS students is, "no matter what your current code quality/dev speed balance is, your boss will think you're polishing too much."


Second advice. If you want to pretend that quality matters, move to aerospace, finance, or any critical systems.

Third advice: it's not enough to fix a bug; everyone MUST know that you fixed that weird bug that customers were complaining about for years and no one could understand the cause of.


> If you want to pretend that quality matters, move to aerospace, finance, or any critical systems.

The important word here is pretend. If the system is safety critical you'd think that quality matters. It doesn't. What matters is following the process laid out in ISO 26262 or whatever standard applies to your field. Whether or not that improves quality is not important (it does, but only a little). The important part is that you have reduced/no liability for accidents if you can prove that you did everything by the book.


Yep, part of the quality standard for medical devices is to have processes, and also processes to improve and change the processes. So theoretically there shouldn't be any doubt as what to do in a given circumstance when something fails, and that standards would rise as development matures. But in reality it's just a bunch of paperwork, and as far as the software is concerned, standard practices of unit and integration testing, code reviews, and strict compiler settings are more than adequate already.

Edit: It's all about documenting that those things have been done and signed off on, so keeping track of paperwork.


Testing, review, history of changes... many things that are completely unheard of in the average small development shop.


I'm on the last few pages of this book:

https://en.wikipedia.org/wiki/What_Do_You_Care_What_Other_Pe...

...where he talks about his role and questions regarding the Challenger shuttle disaster; I don't know if things have changed much since then at NASA, but I suspect they haven't, given Columbia and such.


Mmmyes, but the process implies a heavy chain of verifications. So even when there has been an idiot or a lazy guy, the work has to go through someone who cares at some later stage in the process, and it gets fixed at that point.

After all, that is the point of those heavy processes: manage to produce something fairly reliable despite the acknowledged presence of morons and shitty managers. The process does not try to eradicate those people, it works around them with some kind of redundancy.

Having been one of those 'concerned' people, I must admit it is exhausting. You know that if some work comes from this guy, or that company, it will be 90% shit, and you have to explain to them again and again what is wrong and how to fix it, or do it yourself when you are really fed up. So basically you do your job + the slackers' job. That is how the system works. As a system, the result is not too bad. As far as individuals are concerned, however...


Exactly. A prime example of this is the “quality” software at financial companies.


They care about losing money and about the risk of losing money.


andai.depression++;


And make sure to record it so that at the end of your yearly review cycle, and before the raises are decided, you can remind your upper management of your notable accomplishments.


Either it's easily perceived as bringing value or it won't have any effect.

If you have to remind people of it, it's as good as not existing in the first place.


I disagree. A year is a long time, and a valuable, difficult achievement in February may raise your prestige for a while, but by December that's faded to background levels. I even forget my own achievements from nine months ago, never mind those of a dozen other people on my team.


The question is always:

> What can you do for me or have done for me recently?

Services rendered far enough in the past that people need reminding of them are at best irrelevant, and it can actually hurt you to bring them up.

I'm not saying that it is nice, only that humans behave individually and (even more collectively) according to this.


So at my annual review in December I should only be assessed on work in the last quarter? I guess I'd better save my best work for November.


Depending on the attention span of your particular managers (and their managers as well, most likely), this can indeed be a viable strategy.

You might also carefully consider your company's culture when deciding how to split your efforts between new (user-visible) features, preventive maintenance, and bugs affecting end users (especially any VIP end users that have your managers' ears).

For example, a 'Hero' culture rewards last-minute, late night, death march efforts, whether to fix an urgent problem, or to add that one feature that is (supposedly) needed to land a specific high-value customer/sale.


Really? Ship an app to the store that crashes all the time and see your metrics go WAY down. Losing user data is another great way to lose users permanently. Sure, if you are Google or FB it probably does not matter, but for smaller companies NOT CRASH, NOT LOSE USER DATA seems to matter quite a bit in my experience.


B2C is a tiny, tiny fraction of the overall software industry. The vast majority of developers are working on godawful enterprise software that end-users have no choice but to tolerate. The absolute worst apps on the iOS store are remarkably high quality by the standards of enterprise software.

I suggest reading the comments on this thread:

https://news.ycombinator.com/item?id=13859961


> The sooner you accept this, the less likely you'll burn out.

I learned this at my first software engineering job (though we didn't call it that) back when I was 18; I'm closing in on 45 now, still at it, and I don't see that changing soon. I have thought about management, but I don't really have that kind of experience, and I'm not a real people person anyhow - plus I like to code.

Another thing I have learned - at least here in the United States: Not only do they not care about quality, they usually don't have any loyalty to you as an employee. They'll let you go without any warning at all, and if you're lucky, you'll get some severance pay.

So save your money, build a savings account with FU cash in it (6 months to a year of salary - or more), and don't be afraid to drop an employer like a hot potato if things aren't working out, or a better offer is in the works.

And above all else, don't let your skills stagnate.


Do they care if a year down the road your maintenance costs increase fivefold? Have they seen the analysis which shows that initial development cost makes up just a small fraction of the total cost, and that fraction gets smaller the longer software is in use?

Do they care if a year down the road your TTM times increase tenfold?

  We know that the bad code slows us down. Why do we write it?
  Why do we write the code that slows us down? How much does it
  slow us down? It slows us down a lot. And every time we touch
  the modules we get slowed down again. Every time we touch bad
  code it slows us down. Why do we write this stuff that slows
  us down?
  And the answer to that is: well, we have to go fast.
  I'll let you deal with the logical inconsistency there; my
  message to you today is - you don't go fast by writing crap.
https://www.youtube.com/watch?v=TMuno5RZNeE&feature=youtu.be...


No. When they have seen all the bugs, they will tell you to fix them, and when you make a new feature they will tell you the same thing: just make it work, just ship it.


Spot on in my experience. Curious if anyone on core teams at Google/Facebook/Amazon/Apple could comment on whether these issues come up there as well. I would imagine solving issues like these does make business sense at big firms with scalability problems.


The industry does itself no favours by regularly rejecting techniques to improve software quality tho'. We could all be writing rock solid Ada, but programmers said no, we prefer JavaScript, hell we'll even try to use it on the server...


Lack of profit focus is definitely something I see in junior devs of the higher-quality kind. It's important to sell any prospective improvement to management in a way that makes money. Quality that people can't see won't make money, and even quality they can see will only make money if there's a competitive market.


> The vast majority of software shops out there care only that something--anything--ships as fast as remotely possible

When nobody cares, sooner or later you get to the point where you can't ship; they've probably launched a product, but there's still a lot more to ship in the future. Eventually they become so mired in technical debt that they care.

When they realize they've invested a hundred million dollars in something that now has no value? They care. (especially if it's a turd before it gets out the door)

When a start up comes along and eats their lunch because they aren't weighed down by legacy code? They care.

When they lose revenue because they can't keep customers happy? They care.

The problem is that the professionalism of software developers doesn't allow them to learn these lessons early and hard enough.


> They care.

> They care.

> They care.

> They care.

I disagree. They will hire a consulting company to do a root cause analysis. The Ha'vad grads from McKinsey will swoop in, determine that the past doesn't matter, that you need to get into the black, and that to get into the black the St. Louis team needs to go, along with 20% of the following 7 teams....

No, they don't care. And never will.


Then they just rewrite it again shittily. lol


This will be changing as software is responsible for the control of an increasing number of physical systems (cars, medical, aircraft, etc.)

Edit: Unless we elect people who take away legal protections in place to regulate safety critical software


> There are a few places where this is not true

Where?


Shops that use strongly-typed functional languages are, in my experience, a good bet. They usually made that choice because they have an unusually compelling need to write correct code. Aerospace also has a few sub-industries where correctness is priority #1 (like commercial autopilot software).


Probably the Apollo moon lander software team, circa 1968.



This article bothers me because it implies that the path to correct code is simply to have code written by white-collar salarymen (as opposed to unkempt youth), and not improved formal methods.


It doesn't imply there is a correct path to code. It implies that the best path to the goal for this program is to minimize the opportunity for exhaustion to be a variable.


> That’s the culture: the on-board shuttle group produces grown-up software, and the way they do it is by being grown-ups. It may not be sexy, it may not be a coding ego-trip — but it is the future of software. When you’re ready to take the next step — when you have to write perfect software instead of software that’s just good enough — then it’s time to grow up.


but it is the future of software

It's over two decades since that was written. I remember reading it when it was first published. Does it look much like the future?


It might not be the future of software, but it's the software of the future, in the sense that it's what enabled us to get to the moon, and will enable us to get to Mars.


If there were a list of such companies, I would patronize them.


if you improved the efficiency of an HPC shop by 30% I am sure people would sit up and take notice :-)


I think you've nailed it on the head but haven't realized why you've nailed it on the head.

`Why? These things have real consequences.`

To a business, software/engineers are seen as an expense. The whole focus of a business is to provide a service that people are willing to pay more for than it costs to provide.

Everything else doesn't matter, what does matter for a business is $$$$$ and making sure business doesn't get sued because bill touched jill with his #####.

For what my words are worth, I do think you need to take a step back, view it from a business person's perspective, and attempt to align your goals with theirs. Get out of your cube and go meet CEOs and CIOs and talk to them about their problems and what their goals are. Then see if you two can work together to achieve their goals and bring value to the table.

Even while writing this I could think of 10 or 20 different business discussions you could have.


>To a business, software/engineers are seen as an expense.

Anecdotally, this is one of the first things I try to gauge at a prospective employer. "Are their development teams seen as cost centers, or profit centers?" If the company works with technology a lot and doesn't see technology as being particularly important to its bottom line, that's usually a bad sign.

Also, discussions with C-levels tend to be pretty one-dimensional. It doesn't matter how much their infrastructure is falling apart, they'll still stonewall with something along the lines of, "our goal this quarter is new user acquisition - how does this improve those numbers?"

And they won't accept, "users cannot sign up if everything is broken" as an answer.


You're 100% right but also 100% wrong.

Like I said before, you're viewing it from the perspective of a developer, not from the perspective of a business person.

Instead of saying `Users cannot sign up if everything is broken`, you need to see it from a business perspective and communicate it to them in the language they understand.

For example: `From our quarterly earnings report we need 10,000 new clients signed up by the end of the month to reach the goal. We average 2,000 sign-ups per day. On the 2nd and 3rd of last month two critical systems fell over, resulting in an estimated 4,000 failed client acquisitions. Per our quarterly earnings estimate, each new client brings in $300 net profit for the quarter. As a result of the system crash our quarterly earnings will be down $1.2 million.`

Your job is to communicate clearly, in the language they understand, how it will achieve their goals. You're not the only department/manager requiring resources from the business. As I stated before, you're there to serve the business and achieve its goals.
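
The kind of estimate in that example is simple enough to script. A minimal sketch, using the (entirely hypothetical) figures from the example above:

```python
# All figures are hypothetical, taken from the example above: two days
# of outages during which the usual sign-up volume was lost entirely.
daily_signups = 2_000       # average new clients per day
outage_days = 2             # the 2nd and 3rd of the month
profit_per_client = 300     # estimated net profit per client, per quarter

lost_signups = daily_signups * outage_days
lost_revenue = lost_signups * profit_per_client

print(f"Lost sign-ups: {lost_signups}")                # 4000
print(f"Quarterly revenue impact: ${lost_revenue:,}")  # $1,200,000
```

The point isn't the arithmetic - it's that the result is phrased in dollars, the unit management budgets in.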


Honestly, this is pretty straightforward stuff, and I'm surprised you're getting any argument. If a technologist cannot explain how a proposed implementation is going to provide value to the business, that project is probably better off left alone. Prettier code and better tests don't provide value. But fewer customer-facing P1/P2 tickets provide value. That prettier code and better tests lead to fewer tickets is an implementation detail.

Let's put it this way: Netflix has one of the most robust microservice architectures I know of. I promise you will find zero mention of it anywhere on their customer-facing pages. Because that is not the problem that Netflix solves for their customer; they solve a media library and streaming problem.

The business is your customer. You need to understand their problems and how you're going to fix them. Everything else is an implementation detail.


>Honestly, this is pretty straightforward stuff, and I'm surprised you're getting any argument.

I'm not. Our industry is full of academics who've been told "this is the right way to do it, everything else is stupid" for the entirety of their education. They're academics, trained by other academics, ignoring the broader reality of why we do these things at all.


The corollary here is that it is perfectly acceptable for business users to not understand significant parts of their own business operations. In fact, everybody should work around your shortcomings in understanding that "technical" stuff. As the guy with the MBA, you are the most important person in the room and everybody else is there to serve you.


You are a team. Not everyone has the same background and experiences. You would be amazed at how many people in an organization have incredible knowledge of their business domain and little to no knowledge of IT systems, their costs, and benefits. As an IT leader it is your responsibility to help bridge that gap, and the language of business is business not tech jargon, so I would caution against dismissing it.

Another way to think of it is as a user acquisition problem. You have to highlight the benefits in a way that is compelling to them. At the end of the day, all progress/fiascos in a company come down to consistent and effective communication.


Still doesn't work. I've done clandestine work over weekends to speed up an application, then as part of shipping it also added a split test, with half the audience getting a delay similar to what was in place before I fixed the problem. I brought that to the table with the "business" guys, and everyone nodded their heads "wow", but every sprint after that was just full of new features.
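
For readers who haven't set one up: a split test like the one described is often bucketed deterministically by hashing a stable user id, so the same user always sees the same variant. A minimal sketch of that approach (the experiment name and bucket labels are invented for illustration):

```python
import hashlib

# Deterministic 50/50 bucketing for a split test. The experiment name
# is mixed into the hash so different experiments get independent splits.
def bucket(user_id: str, experiment: str = "latency-fix") -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return "fast" if digest[0] % 2 == 0 else "delayed"

# The same user always lands in the same bucket across requests:
assert bucket("user-42") == bucket("user-42")
```

No per-user state needs to be stored; the assignment is recomputable anywhere the user id is known.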


I hope you got the lesson. Please stop working over the weekends for free.

You are devaluing yourself and you are devaluing all the developers who don't want to ruin their weekends at work.


I obviously didn't bring to the demonstration that I had volunteered the time, it suited my point to make it seem like it was something easy I could do as an aside... Problem is, they don't care about $ either; they care about what makes them look good, what makes their boss look good, what the sales guys are bugging them about - and it's features and visual polish, never quality or performance.

I had to break it down for one of them: "Listen, we've seen that the site being faster gets us X% more conversions. If the site was down, or if the site was ugly X% of the time and cost us those conversions, you'd be breathing down my neck to fix it - why is this any different...?" The reply, after a few seconds of thought: "You're right, but you'll never convince Leadership."


> it suited my point to make it seem like it was something easy I could do as an aside...

Thus fuelling the idea that this sort of thing will just be done as other work is happening with no time budgeted for it.


> I've done clandestine work over weekends to speed up an application

> I obviously didn't bring to the demonstration that I had volunteered the time, it suited my point to make it seem like it was something easy I could do as an aside

I understand the impulse, but in the long term this leads to inflated expectations from management - and the tech side has only itself to blame. When we pretend things take less time than they really do, we shouldn't be surprised when they learn to expect our lie to become the truth.

Long term here can be as short as three months.


There is an old Google paper where they talk about discovering that users returned more often if the search results list was 10 instead of 20, since the shorter list took marginally less time to load/render for the user. If your co-workers don't believe you then I am sure they would at least love the Google finding that minor performance improvements have a positive impact. They would then support improving performance in the app and brag about how much like Google they are.


>>> There is an old Google paper

No, there isn't.

Google shows one series of ads per page. The study showed that having more legitimate results per page decreased their revenues. Of course, it leaves less room for ads and fewer page views.


I saw your post but didn't reply.

The work you did is probably great, and I do think it's above your average developer's - somewhere in the top 10%. But how you conducted yourself in the face of management was downright folly.

Yes, you did the best job in the world. You improved the performance of the servers by some percentage and made managing those servers easier. But you need to think about how what you've done looks in the eyes of management.

Firstly, you've modified the servers without permission. You've run tests outside of work hours that could have impacted the business's bottom line. For all I know (it's up for debate), your senior managers had more pressing issues on the table, and having a young code monkey running around outside office hours is the LAST thing they want to hear about.

`It's not as simple as you may think it is.`

Some of the things that come to mind - and honestly, I would have pulled you into an office to explain yourself if you were on my team:

1) Did you document all the changes you made to the server?

2) Did you back up or store any sensitive information off-site outside of office hours?

3) Did you modify critical system files that could cause another dependent system to fail?

4) What ETAs/service-level agreements, if any, did you break when bringing down the server for maintenance?

5) Did we lose any security certification because you modified the server or changed any of its certificates?

We're software developers, and yes, when we see problems we want to fix them to the best of our ability. But when you're in a business there are many things you may not be aware of. In this situation I would honestly say you're lucky to keep your job! I hope you take that into consideration next time you want to go outside of your duties.

I don't want to come down harsh on you, but there are times when you see a bad situation and there is nothing you can do about it. You notify the people above you of the situation and the dangers. It's then up to them to make the decision, and after they've made it, that is that. If the server blows up during the weekend and they want you to come in and fix it (even when it's their fuckup), you just take your hourly rate and multiply by 10. So if your hourly rate is $50/hr and the server blows up, your new rate is going to be $500 per hour to come in during the weekend and fix the mess.


I have a counter-perspective to provide:

Many of the points you've raised are moot outside of large corporate environments and projects.

It is likely that the commenter has root access to all of the things you mention and touches them on a daily basis. It is likely that they are not inundated with all of the heavy process you're bringing up. It is likely that all of the security concerns you raise are not even part of their current business. Working outside of set business hours is likely not a concern; overtime is likely uncompensated anyway and there probably aren't considerable office security protocols (they probably have keys and an access code for the whole office).

The points you've raised do not apply to most small development teams.


Yes, but both of these viewpoints are important, depending on the context.


It was a simple performance bug (think N+1 query type) whose fix was trivial to implement, and it was trivial to be confident about its behavior vs the pathological implementation. I used the normal channels for shipping it, and it passed QA.

The only "clandestine" part is that I added the split test, and did some work outside of the set of features that was scheduled.

We had no policies around SLAs, developing off-site, security, or restricting sensitive information (think potential FERPA/HIPAA violations) - which are yet more "non-features" that I tried to get them to adopt.

I implemented this entirely above-board, with more precautions and consideration than were taken by other developers on that team for the day-to-day.
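
For anyone unfamiliar with the bug class mentioned above, here's a minimal, self-contained sketch of an N+1 query and its batched fix (the schema and data are invented for illustration):

```python
import sqlite3

# Tiny example schema: authors and their books.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO books VALUES (1, 1, 'A'), (2, 1, 'B'), (3, 2, 'C');
""")

# N+1 version: one query for the authors, then one more query per author.
def books_by_author_slow():
    result = {}
    for author_id, name in conn.execute("SELECT id, name FROM authors"):
        result[name] = [t for (t,) in conn.execute(
            "SELECT title FROM books WHERE author_id = ? ORDER BY id",
            (author_id,))]
    return result

# Batched version: a single JOIN replaces the per-author queries.
def books_by_author_fast():
    result = {}
    rows = conn.execute("""
        SELECT a.name, b.title FROM authors a
        JOIN books b ON b.author_id = a.id
        ORDER BY a.id, b.id
    """)
    for name, title in rows:
        result.setdefault(name, []).append(title)
    return result

assert books_by_author_slow() == books_by_author_fast()
```

With N authors, the slow version issues N+1 queries while the fast one issues a single query - which is exactly the kind of fix that is easy to implement and easy to be confident about.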


Agreed, but don't take this as "don't work hard," merely as "don't donate time to your employer." There are far more worthy causes out there.


The core conflict is this:

The engineer is trying to optimize/normalize his free time by acknowledging quality issues, the management is trying to optimize their income by sweeping them under the rug.

The engineer is ultimately liable for the problem. Quality is underwritten with his reputation and free time.

The business guys almost always leave at 4:00pm and their bonuses are based on a metric not related to quality.


So we just have to do their job as well as our job?


Making money for the company is every employee’s job.


So the janitor should analyse and quantify what difference he will make to the company's bottom line before he wipes those skidmarks off?

Not everything is quantifiable like that. If I'm writing code and I take the time to make it 30% faster then it won't make a difference to our bottom line. If that's done everywhere then customers will like us more, be more likely to stay with us and be more likely to recommend us, but that can't be quantified in my day to day work.


Probably the biggest change looming over our industry is coding to - and deployment on - platforms that are instrumented to capture exactly how much value a developer is adding to (or subtracting from) the bottom line with every commit (it's a natural outgrowth of Function-as-a-Service architectures).

It is already possible to more-or-less show how a code change (bug/fix, new feature, change in storage backend, etc.) directly affects your IAAS bill at the end of each month, and taking it a step further by tying in some instrumentation and formulas (of the sort often used to calculate the effectiveness of marketing funnels for optimization and A/B testing) to show the concomitant effect on revenue allows ROI to be calculated.

So, I am pretty sure that Finance-Oriented Programming is going to be a thing not so many years from now, tied into the whole toolchain, and very few - neither developers nor their managers - seem prepared for that sea change (some operations folks are probably a bit better off in terms of mindset and processes).


>"Are their development teams seen as cost centers, or profit centers?"

It's not a matter of opinion, but of what the term means. They are cost centers unless they are closing sales (or being billed out for about 3x what they cost, which is still way less than salespeople will generate, being closer to 10x+ cost).

Best thing any employee can do is find out how their company actually makes money and understand how they align. Unfortunately, the highest business expense is salaries, and one of the most important things for most businesses is remaining runway. So there is huge pressure to keep salaries down, particularly in cost centers, which aren't directly extending runway. If it's a service business it can get worse, as the money is typically made on billable hours - creating a perverse incentive to bill out cheaper and less efficient people rather than a 10xer (outside a senior, business-facing role). As long as they are capable, and not so bad that they lose the client, it's often more efficient from a business point of view to have cheaper, mediocre people.


> one of the most important things for most businesses is remaining runway

Remaining runway is a measure that only exists for companies in the red. For most businesses it doesn't even exist.


How long you can continue operations if sales drops off a cliff is a very important number for every company I've worked at, from startups to big multinationals. Maybe it's less of a consideration for Amazon and Apple but I wouldn't even bet on it there.


How can a software developer fight against this salary limitation?


Become a highly sought after specialist for a product offering (e.g. self-driving cars, blockchain, fintech), or get ownership (startup), or go customer facing (particularly if commission involved), or go into technical management on CTO route.


The better question to ask, in my opinion, is whether or not they capitalize their IP on the balance sheet. If they do, there is a very good chance that they care about quality. If they don't, there is a very good chance they treat software as something disposable.


Yet my experience is that the issue often comes from engineering: fixing bugs is not fun, so let's rewrite everything, because this time we've learned from our errors - or just to use the shiny new framework. A buggy legacy code base that no one wants to touch or improve, and V2/new projects that of course completely fail to deliver in the end, and the cycle repeats. Being on a customer team, I try every day to make engineering care about non-technical end users. My team and business might be responsible for some feature creep, but bad code quality comes from engineering practices.

Edit: to be more clear the discussion usually goes like this:

- Me: “where are we on fixing this bug? it’s been two years and customers are still complaining”

- Engineering: “we can’t touch this component, it’s a mess, but we are rewriting it; this bug will no longer exist and it’ll be so much more powerful, just wait a few months”

We all know how it ends. And when business pressure comes because it failed to deliver, it’s all “business is evil, we need more time to do great software!”

What we really need is senior and experienced engineers IMHO


> Get out of your cube and go meet CEOs and CIOs and talk to them about their problems and what their goals are.

Jesus Christ listen to yourself. You don’t know anything about me or how I approach things with my clients.

I care about how code affects humans, but businesses don't hold themselves responsible for how they hurt people with it.


> Get out of your cube and go meet CEOs and CIOs and talk to them about their problems and what their goals are.

Implied in this statement is that the parent has a myopic perspective on the business world. Based on my read of his comment, I don't think that's fair - and moreover would question whether or not your perspective is as unique as you apparently think it is.

In other words, I think you're patting yourself on the back a little much for figuring out how to chew before swallowing.


> Get out of your cube and go meet CEOs and CIOs

Oh right.

Good luck even ATTEMPTING to arrange a meeting with a Fortune company's CEO. "Who from what department again?" Worse yet if it's not even your own company's CEO.

Maybe, if you are lucky, you can get to skip a level or two. Software engineers are usually very far removed from the top leadership.


I routinely talk to CIO’s. They’re often the ones who hire me.


Regarding Equifax, when a company like that gets enough negative public reaction, they look for a bandaid. In the case of Equifax, they recently hired ReliaQuest https://www.reliaquest.com/

I have friends at ReliaQuest and I know it is a good company, but I am frustrated that the leadership at Equifax thinks security is a minor issue they can outsource to a third party. If a company deals with sensitive financial data, shouldn't security be a core competence?


> It's a race to the bottom. You could pay more for better work that gets done more slowly (fewer hours as people have families, etc,) or you could pay less for worse work that gets done more quickly.

But in my experience, this is not true AT ALL. More experienced people might work fewer hours, but will produce the actual end results much quicker - both in hours and in calendar time.

Especially when considering that less experienced devs will (on average) cause more bugs and worse code structure - in the long run this causes a ton of extra work. In my opinion that is the main difference that easily makes someone a "10x" or a "100x" developer.


But that part - bugs, structure - quickly becomes invisible to higher management, as they start to rake in millions the software issues become only minor hitches in comparison. When a company does well, software problems are easily forgiven - Twitter had huge scaling problems but still grew to one of the largest platforms; Coinbase is having regular problems now and is a multi-billion platform. Code quality doesn't matter if the business goes well. We purists like to think that high-quality software sells itself, but I'm more and more starting to think that you can't start off with that.

Of course, on the other hand over-engineering at an early phase of a project is very much a thing too - starting with microservices, using the latest framework or language, etc.


Please can you explain why using microservices is over-engineering? Is it because it's a relatively new approach? I'm getting involved in a new project that is using microservices and a new framework...


Because like any design pattern microservices should be used to address a specific problem. Martin Fowler describes that particular design pattern here in great detail: https://martinfowler.com/articles/microservices.html

Microservices however unfortunately have become a fashionable cure-all that's applied indiscriminately to software projects of all sizes, ages and purposes. Hardly anyone seems to be asking anymore why they're actually using microservices. Everyone uses them because everyone else does.

Microservices always incur overhead both in terms of engineering and network latency as well as maintenance / administration. If you have more than 1 microservice you need infrastructure for orchestrating and monitoring these services.

For all that to be worth it you need to have pretty solid arguments for using microservices in the first place. For a new project I'd usually advise against using microservices right from the start. Doing so is often rooted in the same kind of fallacy that has people worry about Facebook-scale scalability before they even have their first user.

If you start having problems that can be solved with microservices then good for you! This means your software is successful and has grown so much in terms of features and responsibilities it's starting to become intractable with standard approaches.

This is precisely why Martin Fowler argues for starting with a monolith first: https://martinfowler.com/bliki/MonolithFirst.html

A traditional monolith gives you the opportunity to learn about the problem domain first without having to worry about implementation details such as microservices.


I second this. You can internally build your monolith in a way that makes a transition to microservices easy. Avoid global state and build single concern processes which have their dependencies passed to them on creation.
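A purely illustrative sketch of that idea in Python (the names `OrderService` and `InMemoryOrderRepo` are invented, not from any real codebase): each concern receives its dependencies at creation instead of reaching for global state, so a module boundary could later become a service boundary without touching the logic.

```python
class InMemoryOrderRepo:
    """Storage concern. Passed in rather than accessed as a global,
    so it can later be swapped for a network-backed client."""
    def __init__(self):
        self._orders = {}

    def save(self, order_id, order):
        self._orders[order_id] = order

    def get(self, order_id):
        return self._orders.get(order_id)


class OrderService:
    """Single concern: order logic. Its dependency arrives via the
    constructor, so extracting the repo into a separate service later
    requires no change here."""
    def __init__(self, repo):
        self._repo = repo

    def place_order(self, order_id, items):
        order = {"items": items, "status": "placed"}
        self._repo.save(order_id, order)
        return order


repo = InMemoryOrderRepo()
service = OrderService(repo)
service.place_order("o-1", ["widget"])
print(repo.get("o-1")["status"])  # placed
```

The design choice is just constructor injection: the monolith stays one process, but the seams where it could be split are already explicit.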


Also note that it is very, very difficult for anyone, inexperienced or experienced, to build a monolith that can later be pulled apart. Your two suggestions are a great start. You have to really design with the major blocks of functionality in mind, and carefully manage dependencies between these blocks AND third-party dependencies. So many monoliths are balls of mud that can't be pulled apart with any reasonable effort. The code may not be spaghetti, but the dependencies and interdependencies are.


Honestly, if one cannot create a monolith that isn't a giant ball of mud, one has no chance at all of creating a distributed application, microservices or not, even setting quality aside.


In that case one is likely to end up with an anti-pattern known as The Distributed Monolith, which combines the worst of both worlds.


I think the strategy of "monolith first" isn't a poor one, but it is a little humorous to see someone who makes part of their living helping companies break apart their monoliths advocate it.


I guess. My partner is currently starting a mushroom farm. She hopes one day it will be big enough to make use of a very large facility. Right now it isn't, though. Do you find the same humor in her real estate broker guiding her towards smaller properties initially even though if her business successfully grows the broker will make part of their living helping her move into a bigger location?

Makes perfect sense to me.


I get what you mean by saying it's humorous, but I think it's important that nobody easily dismiss what he says because of who he is.

A lot of what he's arguing against is cargo cult programming, and I think it's fair for anyone to argue against that. If your only explanation for why you chose one architecture or design pattern over the others (especially if that option introduces a lot of additional drawbacks) is "because that's just what you do", you need to stop and objectively evaluate what you're doing.


What about flow-based programming (https://en.wikipedia.org/wiki/Flow-based_programming)? How would you argue (or not argue) that differs from microservices? When would you advocate for a monolith first vs building one tool that does one thing and does it well and then pipes its output into another tool?


It is a design technique that can be used for designing microservices-based applications or any other kind of distributed application. So it's something else entirely, and the comparison you're asking for doesn't quite hold.

There is a link on that page to the more generic dataflow-based programming. I'd advise anybody to follow it and simply ignore its specializations until they master the generic one. I'd also say that any program should be designed from multiple points of view, so don't stop at dataflow.


They work well when designed well - and for me that's more often designed organically. If they're designed badly, most bugs eventually reduce to trying to do distributed transactions across microservices.

That's my yardstick - if I see poor man's distributed transaction attempts happening across service boundaries, I'll start to assume over-engineering / cargo culting. If I see services acting as nodes in a pipeline or a flowchart, preferably with multiple incoming lines, it smells better to me.


Did you read the Honest Status Page? If you want your debugging sessions to be reminiscent of a whodunit, go with microservices.

If you are doing microservices, I hope you have a good reason for doing so. Because there are significant problems with using microservices.


Not OP, but microservices can cause you to spend a disproportionate amount of time writing glue code between services. Starting with microservices is often a case of implementing separation of concerns at the network rather than the functional layer, which is IMO too high in the stack. Microservices should be extracted for performance reasons, not built from the start because they are trendy.
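A toy Python sketch of that "network vs functional layer" point (all names and the fake transport are invented for illustration): the same separation of concerns costs one plain call in-process, but a service split forces serialisation, transport, and failure handling around the identical logic.

```python
import json

# Functional-layer separation: one plain call, no glue.
def price_order(items):
    return sum(item["qty"] * item["unit_price"] for item in items)

# Network-layer separation: the identical logic now needs
# serialisation, a transport (faked here with a round trip through
# JSON), deserialisation, and error handling - the glue code that
# grows with every additional service.
def price_order_over_wire(items):
    request_body = json.dumps({"items": items})     # serialise request
    payload = json.loads(request_body)              # "transport" + parse
    try:
        total = price_order(payload["items"])
    except (KeyError, TypeError) as exc:            # partial-failure handling
        raise RuntimeError("pricing service error") from exc
    response_body = json.dumps({"total": total})    # serialise response
    return json.loads(response_body)["total"]       # deserialise

items = [{"qty": 2, "unit_price": 5.0}]
print(price_order(items), price_order_over_wire(items))  # 10.0 10.0
```

Both functions compute the same number; everything in the second one beyond the `price_order` call is overhead you take on the day you draw the boundary at the network.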


In most companies upper management can change their mind faster than junior developers can write bugs, thus invalidating all the buggy code already written.


Literally laughed out loud. So many times, I've sat on requirements or even projects waiting for the inevitable "change of priorities" making the projects no longer relevant. "Forget X, Y is the new priority", it's very reliable in big dumb corporations.


>> less experienced devs will (on the average) cause more bugs and worse code structure - in the long run this causes a ton of extra work.

In the last five years or so, I've done contract work at several large corporations. In every single one, they didn't care about defects or code structure. They just wanted to ship an application in a very short period of time.

When I asked about the same thing you pointed out, the response was the same, "We don't care about defects once it's released, and we don't care what the code looks like. It needs to work well for our end users, nothing else. Besides, in 6 months, we're going to redesign and rebuild it anyways."

It just seems nobody cares about long term maintenance, and likewise, they don't care much about the code or the amount of defects that code is generating. Pretty maddening when you think about it. Nobody cares about quality anymore.


If they actually redesigned and rebuilt it every 6 months, that would be a reasonable approach. Often they just want to build more stuff on top of what they already have, however.


That may simply mean that only one of these things really is quality:

1) working great and well for the end user, for the function of the software

2) bugs and code structure

Code quality can be so bad that it matters. But beyond a basic level, it does not. Bad code that works beats great code that doesn't fulfill its function, every single time.

I once had this great lesson. I was brought into a team as a TL/Manager. We were making a product that was doing pretty well, in terms of the attention it was getting and so on. 1 week in, I learn that about 70% of the code is tests and 30% is actual code that's used at runtime. 2 weeks in, I learn that everybody's always working on the tests, never on the program (there's a reasonably good reason for this, even). 3 weeks in, I explore the program, and at some point I try to run it. Turns out the "main" function is broken. Because of crashes (plural) during variable initialization, execution never even reaches the first line of main... How long had these bugs existed (it wasn't just one), you ask? Well, over 3 months. They never showed up in the (VERY extensive) test runs.

Lessons learned in that project:

1) test driven development is a great place to start, and a VERY bad way to run projects once they're even 10% into their delivery schedule

2) even the best tests don't check everything

3) even the best tests don't guarantee that even the most basic simple parts of the program work. In fact, tests actually WORK AGAINST THIS.

4) the value of a system test (where the entire program runs, by actually going into main and doing what it's supposed to on a realistic example) is incredible. The criticism TDD developers make is valid - if it fails, you won't necessarily know where it failed - but that isn't so serious. Firstly, you'll usually have some idea of where it failed; secondly, and this is the big one, you'll know it fails. FAR preferable to the other situation. System tests are also very hard to write. Tough. You need them, far more than you need unit tests on 5-line functions.

5) you are MUCH better off with a "hero" programmer (bad name; "loose cannon" would be a more apt description) AND someone acting 90% anal about tests, and managing the inevitable conflict, than you are without the loose cannon. Managing that conflict will be a challenge, though. And yes, that means that, to the utter dismay of the TDD developer, you'll have to tell him to let in code without tests, or with bad ones, from time to time.
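Lesson 4 above can be sketched in a few lines of Python (the program logic here is a hypothetical stand-in): a system test drives the real entry point with a realistic input and checks the observable result, which is exactly what catches "crashes before the first line of main" bugs that unit tests on small functions never see.

```python
import contextlib
import io

def main(argv):
    # Stand-in for the real program. In the broken project described
    # above, crashes during initialization meant execution never even
    # got here - a fact no unit test of a 5-line helper would reveal.
    print(f"processed {len(argv)} input file(s)")
    return 0

def test_system():
    # Run the whole program on a realistic example and check what the
    # user would actually observe: exit code and output.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        rc = main(["orders.csv"])
    assert rc == 0, "program did not exit cleanly"
    assert "processed 1 input file(s)" in buf.getvalue()

test_system()
print("system test passed")
```

In a real codebase you would run the program in a fresh process (so module-level initialization is exercised too); the in-process version here just keeps the sketch self-contained.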


And Twitter became a successful (questionably so) company with buggy code and the infamous "fail whale". You don't need good code; you sometimes just need shippable code to get by until the company executes its exit strategy.


Hasn't Twitter racked up $2.5B in losses without ever turning a profit (though reportedly it's close to doing so for the first time)? $2.5B in losses for a platform you could build and scale now for what, $10M? Very questionably successful - some people got rich, so yes, I suppose?


That's why I added (questionable).

I've made the mistake at one company of thinking that my job was to architect solid software that was sustainable for the long term.

What they really wanted was to have software that was "good enough" and had all of the checkbox features to be attractive to potential acquirers so the investors would stop breathing down their necks.


Twitter "made $2.5B in losses". Yeah ... that's bad ... but not from the perspective of investors.

Jack Dorsey became a billionaire from those losses. The same is true for probably all other early investors. For an investment in Twitter to have done really badly, there was a "bear" period of about 2 years.

And before you say it, remember that even Google stock dropped 60%, Facebook's 70%. Twitter's drop actually looks pretty reasonable in comparison.

So firstly, I don't think Twitter did badly; secondly, I don't think people have given up on Twitter; and thirdly, that seems pretty justified.


A business model is by definition "bad" if it doesn't make sustainable profits. Of course you have to allow for time, but Twitter isn't a startup.


Well I would be very happy indeed if I ever make a business for myself 10x as bad as twitter.

Wouldn't you ?


As a founder - yes, if I could convince VCs to give me enough money to fund a money losing venture long enough to take public to get the market to value my company at a few billion.

As an employee who has a job because the company exists? Yes

As a retail investor - I wouldn't be quite as happy.


It's now the platform for official communications from the US Government's Executive Branch, so you can expect it to worsen based on red tape alone.


That has little to do with their architecture, methinks.


People have been asking for years why Twitter is so bloated with employees. A lot of people think that the bloat is partially caused by a poor architecture that they have to maintain.

Contrast that to the small staff that Instagram had before getting acquired. It's still run on a relatively small staff now.


A 24-year-old from a top CS college with 3 years of solid JavaScript experience will be better than a 40-year-old with 15+ years of C#/.NET experience for today's job market.


What about a 36-year-old from a top CS college with 6 years of Java experience, 6 years of C#, and 3 years of JavaScript? And many more failures and paradigm shifts under their belt, by the way. The core field of undergraduate CS hasn't changed much in the past decade; mostly there are new options for specialization, and the bleeding edge remains in graduate programs. If you still think the 24-year-old is the superior choice, then even generously, you're not looking for skill.


If you live on the web, maybe this is your perspective. There's a whole 'nother world out there than just the web.


I'm one of the greyhairs. I won't work the insane hours, but I'll still get more done. How? By knowing what to write and how to write it, so I don't have to fumble around trying things for nearly as long. By not writing the bugs that the 20-somethings write, which means I don't have to take the time to try to fix them. By not creating the shoddy designs that others create, which then have to be fixed. By having an idea where to look when bugs do show up, so that they get fixed more quickly. And so on.

It's about producing working code, not about how many hours my butt is in a seat...


What you won't do, however, is work insane hours to get something done because management changed scope or requirements at the last second but kept the deadline. The flavor of agile development that most shops seem to devolve into is one where management gets the part where they can update requirements based on new information, but deadlines stay the same and costs need to be constantly cut.

My personal belief is that our industry tends towards less experienced workers because they put up with shit managers more and can be used to cover for their issues. It's anecdotal only, but I have never heard management decide that their deadlines were improbable, if not impossible, when bugs get through. It's always just explained away with platitudes about code quality that suddenly don't matter when acting on them would require more money or more time.


Call me cynical, but I am always surprised when people sincerely appeal to "agile/scrum values" of team autonomy or whatever to contest management diktats because, like... isn't it totally predictable that that part will be ignored whenever it is inconvenient?


Agile is cargo cult. Customers are not agile to your fuckups. Competitors are not agile to your "scope creep". You know what is currently the most agile company in software? Microsoft. You know why? Because no matter how much it fucked up, its business is so huge and so vast that it didn't matter. It makes money. But even Microsoft is starting to feel the need to deliver, as otherwise it is going to become IBM or Xerox.


Agile is like dieting, simple but hard and requiring a lot of discipline. Very very few companies are any good at it but done well, it's a complete transformation.


I think it's more like a cult, where it's supposed to solve all your problems and any problem you identify with it just proves you aren't doing it right.


It is, at best, a different set of trade offs. What it has become is like you said, some sort of cult where management gets everything they want and don't have to pay any costs.

I've worked at two places where my tech lead at least recognized that if scope or requirements changed, then deadlines changed. Now I am at a place where agile is just a word thrown at any problem, and we have things like being asked to integrate third-party tools that we don't receive until a week before the deadline. My team stayed till 2am and was back in at 9am the next day, and _literally no one_ saw a problem, because it's "agile" and that's what you do when you're agile.


I've been lucky enough to be on projects where it's done well (not scrum) and would never go back after working that way.


I've experienced "no methodology; just people kind of describing what they want and you make it up as you go along" and Scrum and I prefer the former, but I suppose it's possible that someone out there is doing Agile in a way that is better than the one I learned in a bunch of excruciatingly boring training sessions from consultants.

e: Actually I did Kanban too and that was OK, but that is a lot closer to the "no methodology" thing and anyway doesn't involve you making decisions like "oh, can't start working on that, because the sprint is almost over."


Kanban is very much Agile. Scrum can be implemented in a variously heavy to lightweight way, though FWIW I found having leftover time at the end of a sprint that could only be used for non-business-facing work (e.g. improving the build system, refactoring of code that wasn't immediately being worked on) to be very valuable.


I guess the cynic in me finds it predictable that most Agile shops (at least IME) would be doing Scrum and not something like Kanban, even though it largely works against the supposed core values of Agile.


Yep, Kanban is the way to go: focus on flow, small batch sizes, and reducing the waste of wait times and handovers. Look at cumulative flow and rightsize stories instead of fretting over estimation, points, burndown, and all that mini-waterfall that scrum seems to entail. Takes most of the drama and hysteria out of delivery.


Ultimately it makes no difference whether you're doing "story points" or estimates in units of time. All that's going to happen is they'll be converted to units of time in a way you don't quite understand. It's just so naive to believe any of the Scrum promises.


It's why we always do rightsizing and never numerical estimates. You can calculate delivery dates by counting stories * average story time - very fast, very simple, and amusingly at least as accurate as detailed time estimates. Has an interesting side effect of tying scope to delivery. So if you want to reduce delivery time, it's clearly understood as under control of stakeholders and done by dropping stories. If you can't rightsize to a reasonably narrow range (we use 1-5 days) and have to estimate (e.g. sifting stories for prioritisation purposes), t-shirt sizing usually works ok and gives the required info without opening it to abuse.
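The arithmetic behind "counting stories * average story time" fits in a few lines; the numbers below are invented for illustration, assuming stories have been rightsized into the 1-5 day range described above.

```python
# Recent rightsized stories and the days each actually took (toy data).
completed_story_days = [2, 4, 1, 3, 5, 2, 3]
remaining_stories = 12

# Forecast = remaining stories * observed average story time.
average_days = sum(completed_story_days) / len(completed_story_days)
forecast_days = remaining_stories * average_days

print(f"average story: {average_days:.1f} days")          # average story: 2.9 days
print(f"forecast for remaining scope: {forecast_days:.0f} days")  # 34 days
```

Note how this ties scope to delivery directly: dropping a story shortens the forecast by one average story time, which is exactly the lever the comment says stakeholders end up controlling.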


As an overworking tryhard chronic cover-upper, this hits dishearteningly close to home.


Not many 20-somethings have the mental strength to insist on the deadline moving when such big change requests come in late. As a contractor, you'd insist that the client pay for the extra work and time.


I totally agree that measuring productivity by the amount of working hours is ridiculous.

However, it is important to realize that you are not competing with the 20-somethings or the ones that are fresh out of college. You are competing with the ones that have already a good amount of experience (let's say 7-10 years) but are still in the beginning of their careers.

If they are minimally talented, they will probably produce code just as good as yours, yet they will probably (a) work for less money and (b) still be single and (c) more dedicated to work and (d) more tolerant of bullshit asks from the employer. Those are the ones that eventually fill positions as "Sr. Engineer" and crowd out the older ones.


Which means you're competing for principal- and architect-level jobs, which are much more restricted.


> I totally agree that measuring productivity by the amount of working hours is ridiculous.

The alternative is P&L, and boy, grey beards don't want P&L-based measurements.


There are only two things that are measurable:

1. Time

2. P&L

P&L measurement would destroy the majority of the old timers, unless the old timers work for the same little money that fresh-out-of-college people do.


I'm not attacking you here (and in fact I agree with you), but from a manager's perspective, I need one guy like you to design it in an architecture modeling tool and 8 young code monkeys to do all the boring work of building and testing it. You can do the code reviews to make sure they don't do stupid shit.

The thought is that a team of greyhairs never ships anything, and a team of youngsters ships garbage. But it's better to ship garbage than nothing, hence the bias.


I STRONGLY disagree.

My experience is that programmers who become "architects" and are only responsible for design and code review tend to go downhill in their skills. It SEEMS like an efficient way to use experienced people, but it is actually an anti-pattern. Their tendency is to develop ideas that sound good but don't work well in practice, and there is no direct way to correct the mistake. (Any time it doesn't work, the tendency is to blame the implementer. And there is usually enough to blame that their own contribution to the problem gets missed. You're less likely to miss the problem when YOU are trying to make the implementation work.)

In fact the problem is sufficiently bad that in interviews it is important to have people actually write code to show that they still can. When you get an "architect" who takes offense at the exercise, that's a non-hire. They might have been good 10 years ago, but they aren't worth hiring now.

This was something I'd sort of noticed, but didn't become conscious of until I worked at Google. There they were very conscious of the phenomenon. Every programmer from the most junior to the most senior (for the record, that would be Jeff Dean) writes code. If you're not willing to write code, you're not a hire.

That said, the exercise goes both ways. If I interview with an employer and I discover that they design things up front as UML diagrams, odds are that this won't be a workplace that I want much to do with. If I'm working in a job and they force me into an abstract architecture role like you describe, I'm going to quit and find a better job.


My title is "architect". I use scare quotes because I agree with what you wrote and I dislike being associated with the type of people and the jobs that make it true. I, also, only work at places where the interview is sufficiently technical and there are no separate design-only roles.

It's been surprising to me how much I've had to fight people on this issue in the past.


I agree with you.

The thing is, software architecture is kind of the same job as coding. So what the "architects" you describe really do is equivalent to writing code on a piece of paper. That is, doing one of the most mentally demanding jobs imaginable, but without the tooling to protect them from their own confusion. No surprise, then, that it later turns out the architecture doesn't make sense. Without a tool like a compiler to call you on your bullshit, it's too easy to start engaging in fuzzy thinking, and the longer you go without such practical verification, the fuzzier your thoughts become.


<three letter hat on>

1. All of you are overhead.

2. Customer is the profit.

3. Now how do I get (2) to be higher than (1) before we run out of money?

</three letter hat off>


I mean; that’s fine. I wouldn’t want that job either — it’s why I got an MBA, got out of the tech side and now actually have power over hires/fires and product direction. But unless you’re directly bringing revenue in the door, you are a cost item to be cut. As soon as they can find someone cheaper, they will.


Don't worry, I'm OK.

Lots of companies are smart enough to figure out that it is cheaper to hire competent people than to accept the boneheaded mistakes that the cheapest warm body would make.

Besides, the ones that don't figure it out are no fun to work at. Who wants to be on the side that's bound to lose in the long run?

I just wish that they did less collateral damage on their way down.


> As soon as they can find someone cheaper, they will.

This applies to everyone everywhere, and particularly positions that anyone with basic language and reasoning skills can fill (PMs, MBA types, etc)


Naw, the MBA jobs are all about personal brand and networking. When you make the rules, they tend to benefit you.


> MBA jobs are all about personal brand and networking

No disagreement there. Just in my experience those jobs are either the first or second to go when things get tough - often because there's an MBA somewhere near the top who recognizes how replaceable all the rest are.


Yeah, but these jobs also often have contracts with severance packages. Executives (VP and above) are typically hired under fixed contracts that make it very expensive to get rid of them. You can demand these things if you have specialized knowledge, credentials, and relationships.


> Executives (VP and above)

Aren't you being overly restrictive in scope here? Many executives are MBAs, but most MBAs are not executives.


True; but stalling out in middle management is never the reason people get an MBA.


Waiting tables in Hollywood is never the reason people take up acting, and yet so it goes.

Anecdotally, I know several full time top 10 grads who aren't exactly on the fast track to the c-suite, and I assume this is even more true for the broader pool of MBAs.


Can we just be honest?

Us older folks won't build things we know don't work against synthetic deadlines, then work overtime to fix what we knew wasn't going to work in the first place, while taking the blame politically.

Engineering has in my lifetime become "unwinnable." I'm either not "a team player" or "being negative" for planning for reasonable failure scenarios. Then politically I still take the heat when they happen.

This is not an accidental quirk of engineering; it is a design feature at bad employers. "Heads I win, tails you lose."


Engineers aren't blameless, here. In fact they're at least as much the problem as "bad employers". Constantly seeking to use the latest shiny tech toy whether it's appropriate or not (e.g. "Big Data" tools to run stuff on a few GB of data) is a thing, especially in the Bay Area. It leads to a bunch of people with a small amount of shallow experience in a large number of soon-to-be-deprecated technologies. Few want to do the hard work of actual engineering.

Also, that the product is 10% better/more stable/whatever frequently does not translate into even a 10% increase in sales. So it's not worth it (from the business' perspective) to invest in that quality. Pick some other arbitrary point (e.g. 20% better/20% boost in sales, etc.), right down to some threshold whereby the company simply doesn't even have a product that can be demonstrated.


> Also, that the product is 10% better/more stable/whatever frequently does not translate into even a 10% increase in sales. So it's not worth it (from the business' perspective) to invest in that quality.

Here is the core of the problem, and it's more of an incompatibility of goals than errors of one of the parties. Businesses want to make money. They usually don't give a flying fuck about what their product is or does beyond the point it gets sold. Does it waste users' time and piss them off? They paid us, which means they value it, so everything is ok.

Engineers, on the other hand, tend to care about what value the product actually provides to their users. So they would rather invest effort in making the product better for the users, instead of making it better for sales team to sell.

I don't really see the way out of this conflict. The engineers are right, but the businesses are right too - it's the business that pays and suffers (some of) the consequences, so it's the business that gets to tell engineers what to do, and not the other way around. If CEO is making a stupid decision, that's on CEO.

The way I see actually useful software gets done, it's outside or on the side of a business, not within it.


Good point about incompatible goals. One way out is to align the value as perceived by users with the value as perceived by engineers. If users aren't willing to pay for reliability/security/etc. then it's rational for management to devalue those properties.

I'm not optimistic that this will happen. Anecdotally, I expound to acquaintances the risks of unprotected PII or questionably-secured home security apps, but convenience seems to outweigh such concerns.

Regulation is one approach to solve asymmetric information transactions. I don't see how that could be applied in general to software quality, but it could target the cases that have severe or widespread effects.


The value the product actually provides to users is the viral coefficient. Lots of companies value growth. Find one.


Growth has been completely gamed already. Just look at most startups.


I agree, but this is essentially a tragedy of the commons problem being exploited by management.

Keeping up with new technologies is essentially a proxy question for, "How much free time do you have that I might exploit later?"


Ok, but his specific complaint was that planning for reasonable failure scenarios makes you labeled not "a team player" or "being negative". And I have seen this happen, it is true problem. Someone else refactoring badly does not cancel that systemic problem out.


>>> Also, that the product is 10% better/more stable/whatever frequently does not translate into even a 10% increase in sales.

It's quite common for things to generate or save money. Sadly, most engineers won't bother counting the results related to their work.


You hired the wrong grey hairs; most new people have never shipped anything, while a person with 10-20 years of experience has probably shipped a TON of stuff. That's the key thing to look for when hiring: people who have shipped things and like to ship things.

On the other hand, there are tons of grey hairs who never shipped anything and are as useless as new grads.


> But it's better to ship garbage than nothing

Yup, this is the software industry boiled down to the basics.


well i think this is certainly the problem--ie, now "garbage" looks justified

but those aren't really the two alternatives, are they--ie, crap code or nothing?

it might be in the very short term (ie, by this friday)

but over any other span of time, the choice is more like:

ship crap code in 30 days, followed by 50% of the team's resources spent bug fixing (which can often be cleverly disguised as new features) for the next 90 days

OR

ship high quality code in 45 days


Sadly, it's hard to convince people that doing things right is better than doing them wrong, but quicker.

That customer needs this code on the 1st, that is a hard and fast deadline! So shit is churned out, and handed over on the 1st. Then, three or four 1sts later, the customer finally gets their shit together and deploys it, and, voila, it is shit. And the cycle continues, as the scramble ensues to patch the shit by the 15th with yet more shit. And you end up like the little Dutch boy at the dike, except instead of fingers you're using hotfixes made of excrement.


> You can do the code reviews to make sure they don't do stupid shit.

This does not work. Your code monkeys will get demotivated fast and won't be able to learn anyway. The one architect guy will become increasingly out of touch, and his code reviews will become pointless red tape fast.

> The thought is that a team of greyhairs never ships anything, and a team of youngsters ships garbage. But it's better to ship garbage than nothing, hence the bias.

Why would greyhairs never ship anything? I don't get it. My experience was that young people need more supervision to avoid getting demotivated and to actually finish things. (On average.)


I seem to recall an episode of the A16Z podcast where Adrian Cockcroft of Netflix explained they were top-heavy with senior engineers when they began moving to microservices. Didn't seem to slow them down, but I could be misremembering.


They only hire senior engineers.

I don't know what that really means though. People who are smart, hard working, and get the right jobs can get a senior title in 5-6 years at the big companies.


I know someone hired at Netflix who is belligerent, inexperienced, and lazy. He is a senior engineer there. Titles don’t really matter much at the end of the day.


Indeed! Over time I've become more and more aware (and proud) of the things I left out. As a young programmer, I would be far more proud of the things I included.

The shipped project is one branch of an extremely large tree of possibilities which a younger, less experienced programmer would have taken a very long time to explore and discard (which I know for certain because I was that programmer).


It's weird how that works... I try to avoid unneeded complexity as much as possible for as long as possible... and when I do need complexity, I try to make it as easy to use as possible, so that it's less in your face on reuse.

I'm consistently amazed at how much of a pain in the ass things tend to get when people try to build solutions in their early 20's, vs 30's and now in my 40's. Not to mention how much my viewpoint has changed in the past 20+ years in software.

I'm pretty happy when I can remove a bunch of dead/unused code trees, commented out swaths of crap, and refactor portions of a codebase into 1/5 the size.


I rewrote a 900-line class today by writing one generic function that encapsulated a pile of common logic that the monstrosity had copied and pasted with slight changes over, and over, and over, and over, ad nauseam, and wound up with less than 100 lines of code. What a glorious day.
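Purely for illustration (hypothetical names, nothing from the actual class being described), the pattern of that kind of refactor is roughly: near-identical copy-pasted blocks that vary only in a couple of values collapse into one parameterized helper plus a table of configurations:

```python
# Hypothetical before/after: imagine the original class repeated the same
# get-then-validate block for every field, varying only the field name,
# the validation check, and the default value.

def process_field(record, name, validator, default=None):
    """One generic helper replacing N copy-pasted variants."""
    value = record.get(name, default)
    if not validator(value):
        raise ValueError(f"invalid {name}: {value!r}")
    return value

# The dozens of near-identical methods reduce to a table of configs:
FIELDS = [
    ("age", lambda v: isinstance(v, int) and v >= 0, 0),
    ("email", lambda v: isinstance(v, str) and "@" in v, ""),
]

def process(record):
    return {name: process_field(record, name, check, default)
            for name, check, default in FIELDS}
```

Adding a new field becomes one line in the table instead of another pasted-and-tweaked method, which is usually where the 9:1 line reduction comes from.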


Indeed. When you see some of the waste that goes on - whole projects going in the wrong direction and having to be largely reworked. And the people in charge will never admit they made an error in how they set up teams, and that it's the 'greyhairs' getting involved and rescuing projects over and over again.


If we assume this is all 100% true...there's still no way to be sure during an interview.

There are plenty of shitty older devs. And if they're old enough, they're a protected class, which will make things very awkward if you hit a shitty one. And among the newer generations, there are plenty of 18-year-olds who will run circles around more experienced devs, never mind mid-20s ones. There are of course plenty of crappy ones too. So all things equal, but with a different salary, which one do you pick if you're a naive hiring manager?

Fortunately I now work for a company where this is a non-issue. While we definitely hire a ton of new grads, we'll never say no to older/more senior candidates (and sure could use more). Which is good, because I'm getting dangerously close to my 40s.


>It's about producing working code

I propose that working code is necessary, but not sufficient.


maybe they discriminate because you call the designs of other people shoddy.

I'm only being half snarky. Large chunks of these threads end up as people bashing young people for being idiots for various reasons.

I do believe that discrimination is an issue, but you also just don't need that many seniors. If you have a couple to make sure designs are reasonable, catch mistakes, and mentor the junior people then that seems pretty sufficient.

If you have 5 seniors and a junior you get stuff done, but the junior person quits because they don't see any point in sticking around on a team where they aren't being challenged because someone more senior always gets to do it. Someone else will give them the challenge they want. I'm moving on right now exactly because of this.


> maybe they discriminate because you call the designs of other people shoddy.

I'm primarily calling my designs shoddy when I had only a few years of experience.


In my experience, worse work typically gets done a lot more slowly than good work.


I think it depends on the timescale. In the short term, a 'bad' hacky solution is usually quicker than a 'good' one, but it will likely slow you down over the medium to long term.

If the timing of that bad hacky solution being shipped is critical, then it's actually the perfect solution and the right work was done quickly, which is great. If it's not, and is rather the first piece of a new feature intended to last for a while, then the work on that feature as a whole may be slow as a result, so then it looks more like bad and slow work.

I think with experience comes the ability to choose the appropriate approach for the problem at hand, whereas inexperience will usually lead to the short term quickest easiest route being chosen every time by default.

Another subtlety is that, typically, more experienced people will actually deliver the 'good' solution faster, which results in a double win if that is indeed the correct course of action.


I find writing a quick hacky solution and then fixing it over time (IF it's worth it; alternatively a rewrite might happen) is quicker, because I don't usually write what I want/need on the first try.

If I want to end up with a great end result, I find this is the way I have to work.


These discussions tend to assume in-house development though.

I've spent far too much time around organisations where "more time with bums on seats" = "direct correlation to billing more", which = "better work" if you ask the right people. That leads to an unfortunately different scenario.


Defense contracting. It's all about billing hours against a contract. As long as I'm doing that, I'm making the company money.


And that's how you get code from company A that performs sanity checks on parts of a file that are outside the scope of the spec the data in that file is supposed to meet, failing content that meets spec but has had all superfluous information discarded by a custom compression algorithm designed by company B to improve performance on a network with insane latency.

That was a fun bug to track down.


This is my experience as well, I've never seen slow result in good.


I don't think they mean slow development. Rather, overall rate - all other things equal, someone who insists on no more than a 40 hour week will accomplish less than someone willing to work 60-80 based on whatever metric you use (KLOC, story points, etc)


I've only ever heard one person mention KLOCs, and they were in their 60s nearly a decade ago. They were explaining how people used it as a measurement of work done in the '80s, and what a terrible idea it was to measure throughput that way.

You're the second person. When was the last time you encountered anyone using KLOCs (thousands of lines of code) as a measurement of work done? Are they working on mainframe codebases?


> You're the second person. When was the last time you encountered anyone using KLOCs (thousands of lines of code) as a measurement of work done? Are they working on mainframe codebases?

The DOD and their "experts" love measuring software projects this way. They even take the cost/LOC ratio as a measure of value. I was once semi-seriously chastised for committing a change with net negative LOC because it broke the formula and implied negative value.


Me too (also DoD contracting). Cleaned up a module full of amateurish over-complicated code resulting in a negative LOC count on the boss's spreadsheet. Oh, I heard about it.


I like to use LOC to see at a glance the size of a project. More complex business logic usually takes more LOC to write down. And I bet there is a strong correlation between LOC and hours worked on a project, which would make LOC a reasonable approximation of work done. Actually, I should research that. It shouldn't be too hard to find all the tickets for projects and compare the time logged to the LOC.
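For what it's worth, the "size at a glance" use is trivial to sketch. A crude counter like the one below (file extensions and the blank-line handling are arbitrary choices here, not any standard) is about as far as LOC is good for, i.e. a rough gauge, not a productivity metric:

```python
# Crude LOC counter -- a rough "project size" gauge, nothing more.
import os

def count_loc(root, exts=(".py", ".js", ".c", ".cpp")):
    """Count non-blank lines in source files under `root`."""
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for fn in filenames:
            if fn.endswith(exts):
                path = os.path.join(dirpath, fn)
                with open(path, errors="ignore") as f:
                    # Skip blank lines so whitespace doesn't inflate the count.
                    total += sum(1 for line in f if line.strip())
    return total
```

Tools like `cloc` do a more careful job (comments, per-language breakdowns), but even those numbers only loosely track effort, per the objections in the replies below.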


1) There is no fixed correlation between LOC and business logic complexity.

2) There is actually an inverse correlation between LOC and amount of work done. When I'm most productive, I actually decrease the LOC.


1) You're telling me that if you take some fixed business logic and add some exceptions to it (make it more complex), you won't need more LOC to address the added complexity? I don't believe you.

2) You can't be productive having written 0 LOC. So 0 LOC = 0 productivity; n > 0 LOC = (presumably) some productivity. I would say we have a trend.


1) I said that there is no fixed correlation. 2) I'm most productive when my LOCs are negative, deleting duplicate code or finding a better approach for a problem.


1) So you say there is a (positive) correlation, but it is not "fixed". Sorry, I don't understand what that means.

2) I know what you mean. I do it often myself and I know it improves code quality. It's not my intention to measure "work done" in LOCs, which, as you say, could be contradictory. I propose to measure the size of a project in LOCs. Suppose, you have two projects. And you have all the time you want to reduce their LOCs (to improve code). After you are done, both projects will still have some LOCs. My point is that the project, which has more LOCs, after you had your fun with it, is the "bigger"/more complicated project.


1) Imagine one project written using Ctrl+C, Ctrl+V everywhere; another, much bigger project written with good architecture and always following the DRY principle; and another, bigger still, written in a much more compact language like F# or Haskell. By your criteria the first project is the biggest and most complex, but in reality it's just a bloated mess.

2) It never works like that in reality. No one is paid to spend time only improving the code; we are paid to provide value to the users. The project with the good architecture is easier to work with, and it will be easier to add features and continue to improve the code. The bloated mess will be much more difficult to work with, and you won't have the time to improve it apart from picking the low-hanging fruit.


I think LOC probably works OK as a very crude metric until you start to measure it. Once you start to measure it, all hell breaks loose. If you measure me on lines of code, well, I can get you lots of lines of code. No problem at all...


I seem to recall a quote from some computer scientist, possibly Knuth but I'm not sure, that went something like 'You optimize what you measure.' So if you measure LOC, you get LOC... and far more of them than is by any reason 'necessary.' Really a horrendous metric.


There is no universal metric to gauge productivity of software engineers.

The added value a software engineer produces with his work in a single unit of time has a huge variance. Furthermore, it can range from negative to positive.

I.e., it does not matter how long one works. The only thing a software engineer's long work hours communicate is that they work long hours.


I think his argument still applies to an "overall" metric.


Are you saying the 40 hour developer has a lower code quality than the 80 hour developer?


If a bridge falls down and people die, the suits go to prison for criminal negligence for not hiring experienced people and giving them the power in the organization to set the schedules. If a system that involves software fails and people die, the courts throw up their hands and say 'nobody knows how computers work, and there are no standards we could claim they violated or ignored' and the company usually doesn't even take much of a PR hit. It should be obvious to everyone that this situation will not stand forever and it will be very bad when it stops... but everyone has motivations to resist the establishment of standards so they'll continue to bicker about it and do nothing until legislators force the issue in the most hamfisted and horrendous way possible.


Mind you, people are still happy to let bridges fall, and infrastructure in general, crumble. “That’s for the next guy to fix” seems to be the order of the day.


This is just a facet of the good old fashioned project management 'iron' triangle.

https://en.wikipedia.org/wiki/Project_management_triangle

Crudely put, for project X you have the following three constraints: cheap, on time, and good quality. You get to pick two of the three. It would seem to me most companies are opting for cheap, on-time delivery, and thus the quality is simply absent.


I am not sure it is cheaper in the end. You might pay a lower hourly rate, but because of bad work, rework, miscommunication, and delays in communication, you will pay more in the end. That has been my experience with, for example, Chinese and Indian development teams abroad. There is a reason why agile prefers collocated teams.


While I agree with you, that it isn't cheaper in the end, people think short term. Lots of companies are trying to pinch pennies at every corner. A great example: Why does extra sauce cost more money? It pisses most people off and a lot of people turn it down. But they still probably see slightly higher margins at the end of the year.


It is not about cost. It is about cash flow. Revenue is not king. Profit is not king. Cash flow is king.


Nope, hiring fewer more experienced programmers is also better for cashflow.

My bet is that it's about easy and hard to measure costs, and how our modern big government world insulates large companies from competition.


Nope. I can hire out of school programmers for $70k/year. I can hire grey beards for $200k/year. Not a single grey beard is worth 3 of those out of school programmers.

This is also why grey beards suck for P&L so they better want to work long hours.


To be fair, Apple already pays better than most companies.


so apply timeboxing


You have it. Terse and to the point, i’m just providing more recognition here than my upvote.


There's a perception in computer programming that any experience outside of the specific language, library and toolkit being used on the project is irrelevant. Or worse, an impediment; "I don't want somebody with bad habits and instincts acquired from working on now-obsolete systems". Managers really believe a kid with 3 years JavaScript experience and 1 year of Vue is as good (or better) than a senior programmer with the same, plus 20 years of Unix, C++, protocol implementation, and similar "baggage".


If that older Unix or C++ programmer doesn't have a strong bias against Javascript, functional programming, new ES6/ES7 concepts, and web applications in general, then sure. I see many of these types of people even on HN with such biases though.

Age shouldn't be used as a catch all filter, but I can see how it might be faster to screen those who have been too content to stick to their tools without catching up to industry trends.


>> If that older Unix or C++ programmer doesn't have a strong bias against Javascript, functional programming, new ES6/ES7 concepts, and web applications in general, then sure. I see many of these types of people even on HN with such biases though.

You might want to check your own bias in favor of functional programming. To the more seasoned programmers this is just the latest fad. That's not to say there isn't value in it, but if you're making hiring decisions around it you've probably got your head in the clouds.


The most seasoned programmers have been using functional programming since the 60's.


I presume you wouldn't hire someone who expresses a strong distaste for your software stack, regardless of age (unless their job is specifically to replace it).


> I presume you wouldn't hire someone who expresses a strong distaste for your software stack, regardless of age (unless their job is specifically to replace it).

I'm not sure if I'd want to hire someone who is unable to point out valid criticisms of the technology stack I'm using.

People need to be aware of the limitations of their tools. I spent years in pure C, both complaining about it and singing its praises. Right now I'm in JS land, and I can do the same thing.

I started off in C#. No real complaints there, except for the lack of stack alloc, and that got added in the most recent version. :-D


> I'm not sure if I'd want to hire someone who is unable to point out valid criticisms of the technology stack I'm using.

I've only interviewed a few candidates for developer roles, but this was one of the questions I asked. I got some interesting, constructive responses, as well as some well-crafted bullshit.


From experience, companies won't hire you if you express a strong distaste for their software stack.

I learned this after I told the interviewer that I thought that PHP was a crap language, when interviewing for a PHP developer position.


Honestly that doesn’t seem a bad decision. You will be eating PHP day in day out. It would be a bad choice for the company if they had candidates who otherwise don’t care that much about PHP’s problems, and a disservice to you if you could otherwise find equivalent jobs in a language you like better.

Working in a PHP shop, nobody would care if you shat on PHP during an interview. We know the shortcomings; it just doesn't really matter that much.


I'm definitely not criticising them for their decision. I honestly probably wouldn't have liked working with PHP all day long, especially since the job would've been working with quite a lot of legacy code.


I tend to feel the same wrt Java, not so much the language, but the tooling around the language feels like such a pain. I always hated PHP. I've worked with them both though.

I happen to like the one language to rule them all (JS).


I was once hired for a C++ position _because_ I hated on C++ during the interview.


I think that C++ is a horrendous collection of disgusting hacks. I also work as a C++ developer. If I were hiring for a C++ dev role, I wouldn't hire someone who couldn't point out several problems with C++, as the main difficulty of the job is avoiding those tar pits.


C++ is the exception. You can't be senior without despising all the intricacies of the language.


Linus Torvalds is perhaps the perfect depiction of this.


Linus Torvalds doesn't program in C++ at all, only C.


That's the joke, yes.


Feel like I have to share this hilarious fake interview with Stroustroup (creator of C++) here http://artlung.com/smorgasborg/Invention_of_Cplusplus.shtml


And such a person shouldn't be hired. I wouldn't hire me to work on a JavaScript stack. It's not because I don't or couldn't understand it in short order, and very much better than many (most, possibly all) of the 20-something hipsters they're using to fart out garbage. It's because I have strong biases against it and the kind of development style it attracts, some of which (as illustrated in this very comment) aren't really empirically or otherwise similarly justifiable.

They're just biases that haven't a good justification. I usually keep this sort of bias to myself because I recognize it for what it is. Honestly though I wouldn't apply for such positions, either, so the bias problem never surfaces.


> very much better than many (most, possibly all) of the 20-something hipsters they're using to fart out garbage.

how is this type of attitude not sufficient reason to not bother hiring someone? I immediately don't want to work with or for you.


From my comment: "I wouldn't hire me to work on a JavaScript stack.... They're just biases that haven't a good justification."

I know this. And I wouldn't want to work with JS devs. At all. It's a win-win for all of us that I don't apply to those positions.


I wouldn't want to work with you on anything, as I strongly suspect your attitude extends well beyond javascript.


You think having preferences, and being aware of them, and avoiding environments not in line with them, makes him a bad prospective co-worker?


Having preferences is one thing, but being judgmental/presumptuous towards people working in that realm & using slang with a clear derogatory bent to describe them for something that does not inherently relate to morals is another.

It’s a sign of a bad attitude.


It's a sign of a very healthy attitude to be open about your opinions and not try to be "nice" all the time. This is a very American thing, to be constantly positive and nice when you really shouldn't, and in my experience it's very detrimental to morale and productivity because of how obviously fake it is.


What you are saying is very different than the person’s comments I was referring to.

One can be blunt without displaying a trashy attitude, and it doesn't take that much effort. Most people tell me that I am the most blunt person they know, in that I will give some of the most honest observations whether they are uncomfortable or not. I also have a reputation as one of the nicest people many people know, never needing to put down one group of people to appease another - probably one of the biggest engineering productivity killers one can create (negative politics, cliques, etc.). Engineers often vent to me (coworkers, teammates, people from various programming communities, etc.), and I lend a sympathetic ear to all, which in turn builds trust and camaraderie, and lets me use my knowledge and experience to help guide stressful situations to a healthy outcome for all.

It shouldn’t take much effort to treat each person with respect & humanity - if it does, one should look inwards at maybe something being wrong with oneself than to other people, and work on culling out the toxic attitude.


I've been building web applications for over two decades... I love most of the newer tooling over anything I had even a decade ago. Though I do prefer React+* over Angular. I've kept up, some things I love, somethings I frankly use sparingly... that said, I'm definitely biased.

I'm more likely to ride on the bleeding edge (Node 8, Webpack 3, Babel 7) than most. I am afraid though, in under a decade it will be an uphill battle to find work (I'm 43 at the end of the month).


There is only so far you can go competing with teenagers in node/js space - pick a different tech stack for the future.


I'm 40 and also a JS developer for many years after using C++ and other stacks before that. From what I can tell Node.js and JavaScript stopped being cool for many young people like a few years ago when they got onto Go and other stuff.

Maybe the trend is web assembly with Rust now or something. Or Ethereum contracts and R are cool. I dunno but trends do matter.

Anyway JavaScript is less trendy but still very popular and there are a ton of crap codebases built up over recent years. So any of those that don't get scrapped for a trendier technology will be looking to recruit experienced people to fix up their crap (maybe adding TS declarations for everything or something like that).


On the contrary, there is only so far younger people can go competing in the node/js space with people who have gone through SICP. Higher order functions are everywhere.


When JS stops reimplementing Unix tools or stuff from AT&T Bell Labs from before you were born, I'll start getting worried. The fundamentals are important when you get into anything more complex than slapping CRUD apps together. Witness the monstrosities birthed by JS "programmers" who have never slung pointers, never been acquainted with tree operations, or think MVC was birthed with Web 2.0.


LOL... of course you can build crap in any platform/language. On the flip side, look at what Walmart and Netflix have accomplished with JS, which is a far cry from the monstrosities bemoaned. It's not all simple, because it's a relatively easy and flexible language to start with.


People who are early in their careers can be quite dogmatic too.


I think graphing dogmatism over age is an inverse bell curve on average, with a lot of point variation.


It's possible but it's a lot less common IME.


My observation is that earlier on in people's careers they tend to be more certain about the limited experiences they have had due to a lack of contradiction.. Of course, if you keep repeating the same limited experiences this will only be reinforced ;)


My experience is that fresh grads are aware they lack experience and are appropriately modest in their judgements of technologies, whereas people with ten years of experience are dogmatically confident that certain technologies are good or bad.


I've found that sometimes people have done one thing their entire careers and, therefore, think that's the only solution to everything. And I think that's more likely when your entire career is a couple years long.


Need someone's opinion on this since OP mentioned C++. If I take C++ as a specialty language, would companies be more willing to hire people in their 50s-60s? I know it's a difficult question to answer, but I feel like C++ is an ageism-proof skill, much like Fortran or COBOL. Let's all find a solution to this problem, as this topic comes up every month on HN. So which programming languages are "ageism proof"?


> Can someone explain it to me?

It's very simple - it is harder for younger managers to work with older subordinates.

The age-based hierarchies are deeply ingrained into virtually all cultures - "Elders are wise, respect your elders, etc." This makes arguing with (or reprimanding) someone who's older than you a doubly-uncomfortable task for many people. The reverse is also true - that is being bossed around by people significantly younger than yourself.

When everyone does a splendid job, the age difference is not a problem. But if someone is not pulling their weight, that's when it starts to complicate the situation for everyone involved.


> It's very simple - it is harder for younger managers to work with older subordinates.

A good manager would figure out how to make it work. I've worked with younger managers, and I've even had to teach them how to deal with problem workers.

I've also had people all the way up to the director level whom I've had to manage, as a developer, because they were so terrible. I even had one ask me once, "Why are you managing me?" and had to reply, "Why are you making it necessary?"

I don't think that we've got the whole management thing down - I think there is probably the same percentage of bad managers as there are bad devs. I'd also suggest that there are 10x managers, probably ones that read Peopleware and took it to heart.

It is easy to manage most people, all it takes is a bit of diplomacy.


Highly skilled professions like law and medicine don't have a problem with age discrimination, because of the marginally significant positive correlation between years of experience and results. These fields also weed out the incompetent early. There are plenty of well-paid >50 software engineers out there who have no problem finding work, because they've made a mark in some way throughout their careers in the field or have transitioned into engineering management. A lot of the more average older engineers looking for jobs at high-growth companies honestly just don't merit the salary premium. The ability to engineer something that's "maintainable long term" is (1) not really a skill that's exclusive to people with 20+ years of experience, (2) not really demonstrable or quantifiable, especially to middle management living quarter to quarter, and (3) often a skill that close-minded engineers who really are just set in their ways and resistant to innovation mistakenly think they possess. Most of your average software jobs really aren't that hard, and decades of experience don't gain you as much as you'd think.


Though there's been some research recently showing that younger doctors are better, because they're aware of the latest research. I wonder if that's going to lead to some ageism in medicine because of that.


Not a chance. The amount of doctors has a hard limit fixed by regulations, there will always be work for all of them and at good wages.


This is a myth perpetuated by medical schools. There is no regulatory limit to the number of residency slots hospitals can create. There is a limit to the number Medicare will pay for, and teaching hospitals generally refuse to create slots that are not paid for by Medicare.


> most of your average software jobs really aren't that hard, and decades of experience don't really gain you as much as you'd think.

This I agree with strongly, and it's yet another reason why younger people, especially those who just learned to code, could be more liked by companies - they still get excited! Whatever turd you present them with, they'll think you're giving them truffles. Experienced people know that most dev jobs are just a boring rehash of the same bullshit everyone else does.


This so much.


> It doesn't make sense to me to discriminate against people with more experience

More experience also means ability to see through management's bullshit.

> It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good

The difference here is employee vs professional


Age discrimination happens because your line manager doesn't want a more experienced and higher skilled person reporting to him. You might end up replacing him.


Or disagreeing with them when they're making million-dollar decisions. Whether the manager is right or wrong, they lose if their highly paid senior has a serious concern with a strategic decision. It's safer to just hire a bunch of junior engineers who will be team players.

To be clear, there is a huge win here. And that's hiring good technical leaders and listening and consulting with them on technical strategy, standards, technical employee evaluations, etc. It's actually lower risk and higher reward to do so, but it is more work than telling new hires how to crank out strung-together features on your rule-the-world custom framework.


> It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good. You see some straight-out-of-college twenty-something in . . . any other field, you think, "I sure hope he knows what he's doing."

It still works this way in tech. When you need someone who knows what the hell they're doing, you're going to consider hiring a consultant, and you're likely going to get one of those same grayhairs.


I'm in my mid 40s and an active professional developer who goes back and forth between being a team lead, an architect, and a developer, depending on what interests me at the moment.

I've found that older developers fall into two camps: those who are set in their ways, didn't keep up with technology, and are harder to teach modern best practices; and those who have learned a lot from their experience and are aggressive learners. I don't think older developers are ever "malleable" - we are always stuck in our ways. Our ways just may be "this is the way I've always done it", or "I love technology and learn for the sake of learning", or "I learn aggressively to stay competitive".

As far as being 10x better, the other side of the coin is that it doesn't matter if you're 10x better if your skill set is more than the company needs.

If all your company needs is another CRUD app and you've gained expertise in AI, they aren't going to pay you more based on you being better.


Software isn't like architecture or carpentry.

First, software engineering and programming are fields with way more skill diversity, which is a nice way of saying that there are a lot of relatively incompetent people in the industry. The assumption that people with experience are more skilled isn't generally true. I've seen people hired for the role of CTO in small startups who were explicitly chosen for their age because it somehow meant they had more experience, except that they weren't as good as a fresh graduate.

Second, software isn't as critical as other fields because users don't care (or are trained not to care). Your iPhone calculator not working isn't the same as your room's window not closing. Most companies don't really search talent. They just want to hire cheap "talented enough".

In the specific case of Facebook, Amazon, etc., the hiring process is focused on algorithmic questions and the like, which a recent grad is simply more likely to pass since it's all still fresh.


The CTO role is about 40-49% technical at the absolute most. The primary point of the CTO is to build the team below them. This requires a lot of "soft skills" that 99% of graduates don't have.

My point is that it's likely the CTO is not the most technically adept person in the company, and that's to be expected.


Knowledge in the software industry gets outdated much faster than in other industries. While there are timeless software engineering practices that are good and evergreen, about 70% of my knowledge today will be outdated in 5 years. This is not true of other industries. Sure, other professionals need to keep themselves updated as well, but the percentage of knowledge a bridge architect needs to refresh is much smaller (say 20%?).

In those fields, the cost of failure is also much higher. It is easier to correct shitty software, so the focus is on delivering fast.


> about 70% of my knowledge today will be outdated in 5 years

I really hope this isn't the case for you, and in general I don't really believe it to be true either. Most of what makes me successful as a developer today I knew 20 years ago, in a prior career.

You have to differentiate between tacit knowledge and explicit knowledge. The former changes very little and the latter changes a lot. You're also best off retaining almost _none_ of the latter. Those "Teach Yourself <language/framework> in <number> <days/weeks>" books you used to see were written for serious professionals, not newbies.

I can often get more done with less using tried, tested tools that change little than my peers can do with 'the new hotness'. It seems every language/stack has its own implementation of cron. Stop that. Use cron.
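For what it's worth, the whole pitch fits in one crontab line (the script and log paths here are hypothetical, just to illustrate):

```shell
# Run a nightly batch job at 02:30 every day -- no framework scheduler,
# no extra daemon, and it survives app redeploys.
# Paths are hypothetical placeholders; substitute your own.
30 2 * * * /usr/local/bin/nightly-report.sh >> /var/log/nightly-report.log 2>&1
```

Install it with `crontab -e` and confirm with `crontab -l`; that's the entire operational surface, versus a scheduler library with its own config, persistence, and failure modes.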

Networking protocols, binutils and SQL don't really change, and that's literally 99% of the "make stuff happen" part of my job. A lot of "Full Stack" developers don't know any of this shit. They don't know what tools will tell them if a client's traffic is coming into their network at all or being rejected at their server.

Con artists, honestly.


Not to be offensive, but the front end is quite often far more work than the SQL that goes on in the back. I've worked strictly on the back, the front, and in between. I've also had to work in a number of platforms and languages.

In the end, keeping things as simple as possible is usually best. If that means using language features that someone might not be familiar with, it should still be more understandable in the end.


Not offensive at all. I actually would say in the best case scenario, you can't do one without the other.

I will agree that frontend is full of churn. I have to do it, but I wouldn't specialize in front end -- most of your role is relying on explicit knowledge rather than tacit knowledge.


The kind of knowledge that gets obsolete is replaced by stuff that's very similar.

Doesn't matter if it is a new language, framework or whatnot. There have been very few advances in our field; stuff changes incrementally. So you have seen everything before, it just has a different twist. You can pick this crap up in a fraction of the time, as the concepts are already internalized.

... Unless, that is, you used to call yourself "X developer", for whatever value of X. Then you are unidimensional, and screwed.


> ...about 70% of my knowledge today will be outdated in 5 years...

I've been around for a while. Some knowledge has become useless, but I learned how to sniff out technology that's going nowhere and avoid it. Since then, I've rarely wasted my learning on things that become obsolete.

Besides, the most valuable knowledge is typically how product managers, salespeople, and people that set budgets think. That knowledge evolves much slower than javascript frameworks.


I'm talking completely about the company's own interests.

The mistake is this: the interests of the company and the interests of its management are not always the same thing.

Sure, the company would be better off in the long term with experienced workers producing quality working sustainable 40-hour weeks. But in the short term, Jo(e) Manager can earn his or her bonus by burning some cheap people out with 80-hour weeks, then just getting fresh ones in.


In the highly distorted US employment market, older employees are more expensive even at the same salary level as younger employees. A big factor I haven’t seen discussed in these comment threads yet is healthcare expenses. In aggregate, older employees in the US will tend to consume more, and more expensive, healthcare services, products, and pharmaceuticals. Unless you are an outlier, and somehow signal lots of health-oriented choices by mentioning struggling to maintain 15% body fat over the holidays, how proud you were to join the 1000-pound club as a milestone, or how you’re looking forward to the Tough Mudder in four months with your buddies, it is rational for the hiring side to be led by the statistics and presume average American health complications at each successive age cohort.

I’ve yet to peer behind the curtains of corporate axings, though. So I don’t know when the lists of employees to fire are drawn up, whether or not the fully-burdened costs (especially medical insurance) are somehow accessible by line managers. The per-employee breakdown of medical costs is definitely available to companies.


1099... problem solved.


You should clarify what part of the problem is solved.

Under individual 1099 wage treatment in the US, the healthcare cost shift is generally not in favor of the employees in older cohorts. If you are claiming to use actual individual freelance-style contributors engaging with clients as 1099, then you should look at the pricing and coverage constraints of individual healthcare plans in the US for older employees.

1099 corp-to-corp is done all the time of course, but that simply shuffles around the pieces on the board. The cost only appears to get magicked away in most 1099-oriented arrangements because the vast majority of 1099 individuals I've worked alongside and gotten to know have compromised pretty extensively on their benefit plans, compared to most conventional big group-based benefit plans. In the US, volume discounts don't cover just price, but all sorts of other transaction factors that simply do not show up in economic statistics.

There are some individuals out there who are comfortable doing 1099 work, while meeting or exceeding benefits plans of big groups, and I've done it myself in the past. It takes a lot of discernment, judgement, and work to navigate the thicket of options and constantly-changing regulatory landscape from many different jurisdictions. It's easy to wave the magic wand of the free market and say make everyone a free agent, but in reality it's not for everyone, and I would not promote it as a general policy for the general population. Ronald Coase's work on the economic effects of transaction costs at individual and firm level illuminates some rationales behind why encouraging individual 1099 transactions would not produce optimal results.


It's health care and pensions you silly people. The stereotype (which is massively true in most cases) is that this generation of aging people suck up vast amounts of health care and demand to retire, while young people don't want health care and this generation probably won't get any retirement anyway so no worries.

Do you know how many companies will string along a part time worker, promising full health care after X months, only to fire them if they start demanding benefits? I've seen dozens of cases.


Here's my experience with an elderly coder in his mid 50s. He was a decent Delphi and .NET coder, he was a decent human being, but he had health problems. Most of the week he had hospital appointments. He required nose surgery. His blood levels were low. He needed vitamin D and iron supplements, and during our four-month acquaintance, he never got any better. He constantly forgot things and he feared his lack of iron had caused permanent damage to his brain. He was a diabetic, which gave him license to be more hot-headed and harder to persuade. He had accumulated a lot of debts, which gave him license to be more hectic and disturbed. Overall, his long life and experience did not outweigh his negative properties.


I hate to tell you this, but you're looking in the mirror at your future self (and your peers), as coding usually isn't a healthy lifestyle choice.


That may be true, but it leaves me with the frustration of having no other choice, as I haven't developed an interest in any other subject that could possibly give me a healthier lifestyle. Perhaps being rich with no worries in the world would help?


So you met one guy with a lot of chronic health problems and assume that everyone else in his age group is similar?

Or if not then I'm not sure what your point is. People who miss a lot of work and make excuses for being belligerent don't make the greatest employees, but there's not as much correlation as you think between that and age.

I feel like you really have given a clear answer to combatentropy's question "why do employers discriminate against older people", but it's maybe not the answer you thought you were giving.


It is the other way around. I didn't understand why companies would even consider not hiring old/experienced software engineers until I met him.

And now I am on a diet and quit smoking :-)


Counter-anecdote: I was the oldest guy on the team and took the fewest sick days (none).


To be the devil's advocate from employer perspective, old folks:

1. Work fewer hours, because of age and family

2. Pay is higher, because of experience

What this implies is that their wage-per-hour is higher than the younglings'. So doesn't the veteran coder get stuff done more swiftly? IMO, yes, some of them are extremely valuable and irreplaceable. But they are RARE, and they don't need Facebook ads to find another job; they cultivate connections or head hunters find them.

So the rest of the old folks are the ones that suffer most from ageism. Their career growth gets stuck when they are around their mid 30s: the scope of their work doesn't expand, their experience stagnates, and they end up doing the same stuff for many, many years. At some point, age is going to catch up with them, when their years of experience become a liability rather than an asset.

The sad part is that it is not necessarily their fault. They get caught up in their own comfort zone, not because they like it, but often because they are subject to circumstances. And software is a field that moves so fast that talking about the relevance of skills on a 5-year horizon doesn't make much sense.

Solution? A better social safety net could help, but won't solve the problem, unless you go to the Japan model, at the expense of suffocating the younger generation, which is both costly and not feasible in Western societies.


I'd like to add just one thing to this explanation: Many younger managers simply don't like having their authority challenged by experience. Vanity aside, managers like to build 'their own' teams, with people that they can manage. Older, more experienced people add a level of complexity to that need.


> IMO, yes, some of them are extremely valuable and irreplaceable. But they are RARE, and they don't need Facebook ads to find another job; they cultivate connections or head hunters find them.

This is a great point. Older devs are generally "better" than young ones on every metric (including per hour costs). BUT you won't find these. The people that will apply for jobs when old, are not representative of older devs, they are representative of older devs without strong networks.

> And software is a field that moves so fast that talking about the relevance of skills on a 5-year horizon doesn't make much sense.

I'm not sure this is necessarily true of all of software. We tend to forget (especially on this site, which has an extreme "startup" focus) that most of software dev is still people working with older (5, 10 or 15 years old) tech in a megacorp or public authority somewhere. You don't "see" these people because they don't post to Stack Overflow, they don't blog about tech and they don't show up on GitHub. What they do doesn't show up in "language trends" that just look at Stack Overflow or GitHub, because it's just a job; it's not breaking new ground. I work in a 30-man team of 40-year-olds, where no one really fits the "startup" model.

Of course many of these people (I'm one) will have problems finding jobs if they weren't actually "good" devs. If you grow comfortable and don't have a network, you'll have trouble finding a new job after 50 if your tech has gone out of style. But Java, C# etc are going to be in demand for a LONG time yet. Most of software tech isn't about the latest js framework.

Also, as long as you stay in the same industry, the main asset of a developer is the domain knowledge from the problem domain, not software development.


>Work fewer hours, because of age and family

I think that is generally false today.

"Millennials" demand more of a work-life balance than GenX or Baby Boomers; they also demand greater maternity and paternity leave when they have children.

I think most 40-50 GenX's will work MORE hours than a Millennial that has a younger child...


It's about costs.

If you are a large company and have robust processes in place, you don't need smart people with tons of experience. Such people are expensive.

You need average people who don't need to think too much for themselves and can follow directions and work hard.

The process is designed to compensate for mistakes and guide inexperienced workers.

Thinking in computer terms, the analogy goes like this. A long time ago, servers had to be very powerful and very reliable. Then came cloud computing and redundant architectures. Now, you plug in relatively cheap components into an architecture that can tolerate failure. You no longer buy high-availability components as the system can function around issues seamlessly.


>> If you are a large company and have robust processes in place, you don't need smart people with tons of experience. Such people are expensive.

That was the fallacy put forth in the CMM (Capability Maturity Model) right on page 1 or 2. The notion that process could eliminate the need to rely on experts. There are a couple problems with that. Number one is that there is no process that can solve all problems or write all code. Number two is that those processes are probably not lower cost - they rely on a larger group of people. All you really get from such things is better consistency, and that is often what large companies are looking for. I would argue that having some experts in such places is extremely beneficial at times. I've benefited from them and I've been one in my niche. Without pockets of expertise, a vast group of mediocre people will drift at times.


>> Without pockets of expertise, a vast group of mediocre people will drift at times.

I completely agree with each thing you said. I should have been more clear and said "a large company does not need to fill their entire workforce with smart people with tons of experience. Just a few is enough once a good process is in place."

A small company needs very smart people when starting up. They don't have any processes in place and each problem is new.

As companies grow, these experts develop and polish processes. As time goes on, more and more can be done with less capable people.

Not all experts can be eliminated as a company grows. There will always be problems that don't fit the process. An expert is needed for those. A process also needs to adapt with the times. Experts are needed for that.

However, for any class of problem, the Pareto principle is in play. An army of mediocre people with an excellent process can handle 80%* of the issues. The remainder will need to be handled by experts.

If you have a small company you may need 10 experts to handle X load. You don't need 1000 experts to handle 100x load. You need 20 experts, a good process, and 980 hard working people.

* - numbers depend on actual cases. These values are for illustrative purposes only.


Here's my take: in the world of software products, speed of delivery trumps quality almost every time. Customers seem to be happy with flaky, feature-lite products and pay for them. They'll pay to have something that kinda-sorta works now and pay to improve it on an extended basis in the B2B space. Development companies recognize this and structure themselves accordingly.


It's not that customers (stakeholders) are happy, it's more that they just don't (immediately) know that what they are getting is mediocre or worse.


Customers are unhappy, but ignorant of the causes of their unhappiness.


The tyranny of low expectations: "Oh, it broke and I have to start over? Well, computers are like that, right?"


> Customers seem to be happy with flaky, feature-lite products and pay for them. They'll pay to have something that kinda-sorta works now and pay to improve it on an extended basis

Yea, I was going to say - do customers expect flaky products, and companies have structured themselves accordingly? Or have companies produced flaky products, thereby getting users accustomed to paying for flaky products with promises of "we'll fix it in the next version," thus allowing companies to produce ever flakier products? Chicken and egg question I guess.


The rate of change of technological progress is increasing, and many people just have a tendency to... crystallize.

Experienced people of any age who can adapt well to change have no trouble getting a job.


My advice to graybeards: when somebody is excited about a new technology XYZ, don't say "oh, we did that 20 years ago and called it ABC". It annoys people, because generally XYZ has important differences that prevent ABC from being a drop-in replacement.

Instead you can say "that reminds me of a similar concept we used to implement using ABC", and quietly appreciate that you got a leg up in understanding because of your experience. If young people are smart and curious, they'll want to know more about ABC, and ask.


I'm 53. What I do is learn XYZ at home, and use it in my side / research projects, while the people at work are still struggling with their legacy software and technical debt written in ABC.


When XYZ is just ABC with additional layers on top which kills performance and/or security...


And then you quote the drawbacks of ABC and it turns out that the interviewer has the same issues with XYZ because it's trying to do the same things. Now he hates you because he really believed in XYZ, maybe wrote it himself.


> The rate of change of technological progress is increasing

Well, the rate of change seems to be high. Whether it's "progress" is another matter. Progress usually implies change, but the reverse is not at all necessarily true.

Who's usually in a position to recognize that and be a judge of it? People with experience.

So the question I'd ask is this: when someone with experience doesn't like change, is it because they've "crystalized"? Or is it because they have some defensible reason for disliking the change?

If it might be both, how would someone be able to tell the difference?


Good question! I think it would be useful to try something like Double Crux: (http://lesswrong.com/lw/o6p/double_crux_a_strategy_for_resol...)

Say Person B is proposing a change from Solution A -> Solution B.

Person A thinks Solution A is still the correct approach.

I think it's important to understand in detail what aspects of Solution B are appealing to Person B. It may be subtle, psychological, and/or political, but there's a reason why this change is important to person B. Person A & Solution A must address those issues, or risk eventual obsolescence.

It's also entirely possible that the more correct approach is Solution C, and this process can help establish what exactly that is.


Certainly 1 and 2 but I believe younger people are more likely to work longer hours because they are less likely to have families.

Even if you didn't have family obligations I think when people get older they are less impressed with beer in the kitchen and ping pong tables and would rather not be in the office after 7:00PM.


> It doesn't make sense to me to discriminate against people with more experience. Can someone explain it to me?

They keep saying all the wrong things, like "no, I won't go on call, that wasn't in the job description I applied for", "sorry, but I won't take on the extra responsibility unless you give me a raise", or "I've been down this road before, I'm not going to work weekends because it's not going to significantly speed up the project and I'd rather be with the wife and kids".


> his keenly developed taste made him more likely to choose something that would be more maintainable long term.

But the same taste, and knowledge born of experience that there is little as long-lived as a temporary solution, might make him put his foot down and try to prevent management enforcing a short-cut that will generate technical debt. This can make older, more experienced, people seem more difficult to work with.

They may also be more likely to have family commitments or have developed hobbies that require commitment (when I was marathon training last year, sticking to my plan meant arranging extra work around that rather than the other way around), so they are less inclined to do overtime because bad planning elsewhere has caused a "crunch". This isn't a case of expecting more money for doing it; it is simply not wanting to do it no matter the incentive, because we don't want to compromise on the things that make life feel worthwhile (younger people are easier to convince that they'll have time for that later!).

None of this is fair to judge on, of course. It does not impact their ability to do the job well in the contracted time unless things are badly managed around them. But they are still considerations some people make when hiring.


Not every person in a position to make this kind of decision is thinking in the company's best interest. And their best interests are not always the same as the company's.


Honestly, just evaluating a person's relevant experience vs salary requirements would end up looking like age discrimination.

If all I care about is degree + ~7 years experience in relevant technologies + generally solid head on the person's shoulders, and I have one person with those, asking X, and another person with those, as well as another 13 years unrelated experience asking tens of thousands more, why would I pick the latter? If I don't think the benefit of that extra unrelated experience (we do cloud based distributed systems, that experience is in windows desktop apps and drivers, say) offsets that cost, why would I hire him at his expected salary?

Note that I'm not saying there isn't a bias in this industry, just that it may appear magnified due to such considerations. I know where I work we ignore age, but we often dismiss resumes with "that person is far too expensive for what this position entails" when it comes to junior - senior devs. Leads, architects, technical managers, it's all considered worthwhile, but below that you're an individual contributor in a very particular niche (albeit one that allows you to learn the other parts of the system, and other projects).


Ignoring age is the right thing. The title of the job will do the filtering for you. I'm not going to apply to a Jr Dev position. The pay and position handle all this for you. And if someone with extra experience wants the role, what's wrong with that?


I think it's more about the commoditization of employees than anything else. Big companies seem to have trended to thinking of and treating their employees as commodities. A piece of furniture that can be swapped in and out at will. I don't know that it's willful discrimination against older workers so much as willful disregard as to whether or not they're getting the best employees.

A company like Facebook probably doesn't really care that they get the absolute best employee for 90% of their open positions. They go with the group that's going to get them the largest number of acceptable hits so they don't need to work as hard to filter out candidates that they're not interested in.

This is all speculation on my part, I have nothing to cite in this regard. I think it's fairly obvious that if Facebook filters out over-40's for these 90% of jobs right up front, they will have less work to do separating the wheat from the chaff.

For the remaining 10% of high-productivity/high-value workers they probably wouldn't use such pre-screening and do more targeted recruiting. Again, this is all a guess on my part, but it makes sense. Not saying it should be legal, but I can understand why they do it.


Yeah you don’t understand business.

Code quality is vastly, vastly overestimated.

Many very powerful systems that you rely on everyday to do even the most basic of shit, contains some of the most well-tested, battle-hardened shitty code you can’t even imagine.

Writing “good, high quality code” is intangibly, unquantifiably expensive.

Sounds like you’ve only worked on very small projects.


Doesn't YC have a max age for founders?

I imagine there is similar logic for their cutoff.

Not saying it's valid, but last I checked (long ago) they had a max age.

Disclosure: I am a bootstrapping old fart.


It does not, no. The average age is 29, and they've brought in some people in their 60s. https://www.ycombinator.com/faq/#q42


Paul Graham is no spring chicken, so it would be somewhat hypocritical to be ageist when you're old.


Maybe Graham believes he couldn't start a successful startup himself at his current age. Maybe that was his reason for moving into VC.


Do they? I don't think they do.


Possibly the people hiring want similarly aged people to themselves, or the people hiring want to hire people of similar age to who they already have.

Or maybe they believe Zuckerberg when he said that younger people are smarter than older people.


I think you're right, and I also think the age discrimination issue is overblown.

Take this article. It says that some companies ("dozens") place some ads targeted at younger audiences.

NOTHING indicates that it's a major trend or that they do this a lot. The article is written because of a legal issue. But if you only read the comments you get an impression of the streets filling up with discarded old ultracompetent programmers.

I don't have the whole picture either of course, but a long life in the industry gives me the impression that it actually mostly does work just the way you think it should.


My dad is 67 and his phone rings 5 times a day with recruiters begging him to take a call. He codes in C++, C#, Python, you name it.


It could be slightly more complex than just being discriminated against based on age.

To be perfectly honest, I'd rather work with people who are better than me and who know things that I don't. Sadly, I work in a location and an industry where those people are thin on the ground, and thus I'm usually deciding between people who have less knowledge than I think they really should for the roles we're advertising.

I'm not going to lie, when someone has 20 years experience on paper but has a shallow understanding of their trade, I'd rather hire the grad with similar knowledge if all else is equal.

With that said, I have hired people who are older than me as junior members of a team before, but that's because they show an interest in learning and improving. Usually this means they've been transitioning into the role from another area.

Interviewing is a crap shoot though. I have no idea if I'm good at it or not because I'll never experience the world where I've made different choices. All I know is I expect senior people who know their shit, and when they disagree with me, I expect them to be able to argue their point in a friendly and somewhat clear manner.

P.S. We're hiring "devops" contractors paying above market rate. It's not a thing about being cheap, it's simply having a quota of allowed hires and trying to get people who can do the job.


You can pay less in a salary on average to young people.

They don’t challenge you.

They don’t tend to need expensive healthcare benefits.


Sometimes, it's the opposite. A 67 year old developer with a pension from a previous job and grown kids doesn't need any benefits


What I think you're missing is how easy it is to coast in this field. It certainly sounds like you're a smart, capable person with the good fortune to largely work with other smart, capable people. That's great; it's not normal. At least coming from the financial sector (which employs huge numbers of people with the title "software developer"), there's a plurality of people who stop improving somewhere around year one, then coast on office politics and inter-firm job hopping. These are the people I think the new wave of companies/teams are looking to avoid. After 10+ years, your jobs, if you're good, ought to be coming from network references rather than Facebook ads anyway.

Experience is necessary but not sufficient for a top quality dev and top quality is neither necessary nor sufficient to make successful software. The cult of experience in other engineering disciplines is the reason those salaries are so low compared to ours (despite the fact that the work, schooling, and certification are orders of magnitude more involved). Let's not bring that to software please.


>It doesn't make sense to me to discriminate against people with more experience. Can someone explain it to me?

It's worth considering that, statistically, the discrimination might be justified. Of course the only way to know for sure is to get rid of age discrimination laws and see what happens to the market. Companies bias "incorrectly" at their own peril, to be outcompeted by companies without said bias.


>It doesn't make sense to me to discriminate against people with more experience. Can someone explain it to me?

Companies don't care for more experience. The experience the younger programmers have is "good enough" for them, especially weighted with all the other benefits from hiring them (smaller wages, more impressionable, longer hours, less/no family commitments, etc).


The ultimate goal of advertising is to target the people most likely to take an interest in what you are offering. What if it is simply that, statistically speaking, younger people are more likely to be looking for work or willing to change jobs?

I certainly don't have data in front of me, but it seems like a reasonable assumption in my mind. The youngest people just starting their careers are highly likely to be looking for work, naturally. Slightly older people often start looking to change jobs to move into higher paying roles. And by the time someone is moving beyond 40 they have reached the top of their career mobility and are happy to stay put at their current employer, plus an increasing unwillingness to uproot their family to move to a new town for a different job. There are always exceptions, of course.

If, for argument's sake, we assume this is true, I can completely understand why they would only want to target those under 40. Statistically, the ads shown to anyone else are just a plain waste of money.


Tech is especially bad, but other fields don’t necessarily reward gray beards. Many industries have career path pyramids that get much narrower as they go up and no tolerance for not advancing. The military officer corps for one....

Anyways, yes, that grey haired architect probably knows what they are doing, but how many architects didn’t survive to that point?


You're suffering from survivorship bias, and you also seem to be talking from the perspective of a colleague at the same level, or near enough, as the new employee.

Managers and above want people who don't really know their rights and have the energy and willingness to just do the job.

They also assume that younger people have better skill sets.

All things equal, younger people cost less.


> Now let's address the second reason: salary. I am 10 times better than I was when I started.

Can you characterize how you are better? I would say the same about myself, it's not that I can do stuff 10 times faster than I could then, I've just got the experience to avoid things that might seem ok at the time but will cause long term problems. Today I know when I'm over engineering, I know when I'm under engineering, I know the patterns that will make code hard to maintain and I can avoid them, I can avoid technical debt (usually).

The powers that be never see that technical debt, they don't know they're groaning under the weight of it, they don't know it could have been avoided; there was/will be a different set of people dealing with it. So why would they pay more to avoid problems they don't know they have?

Unlike other industries, software development never, ever plans for maintenance.


The reason is that a lot of software - or a lot of the work, anyway - is throwaway. Startups want something that just barely works (an MVP that looks good), which then either picks up and takes off, or gets bought for a few billion by investment companies or Facebook/Google/Twitter/etc. - at which point it's their problem.


> Let's address the first reason: malleability. A recent grad presumably will adopt the company's culture faster, complain less, and in general pick up things sooner. Well, the hardest, meanest coworkers I've ever had were late twenties, early thirties. I've worked with people in their sixties, and they're sweet people. Even the grumpy old sysadmin had only a thin layer of spikes. After just a few days I could see through most of it, and he was 10 times more helpful than my other sysadmins. Not only was he softer (at least deep down) but he was smarter, having done it for decades. Even when he met a new problem, his keenly developed taste made him more likely to choose something that would be more maintainable long term.

What if they're right on average, and your experience is the exception rather than the rule?


Guy Kawasaki once had a post that states the reason why, but from a different angle - it's a post basically on why Silicon Valley is successful:

https://guykawasaki.com/how_to_kick_sil/

> High housing prices. If houses are cheap, it means that young people can buy housing sooner and have kids. When they have kids, they can’t take as much risk and don’t have as much energy to start companies. (I have four kids—I barely have the time and energy to blog, much less start a company.) Also, if houses are cheap, it’s easier to “make it big,” and you want it to be hard to make it big.


Wasn’t most of SV’s most important growth during a time when housing, talent, and funding were plentiful?

Each generation of companies has come from dream teams from the previous. Space race R&D contracts and AT&T engineers begat HP, which begat Apple, Oracle, and Intel, then Sun Microsystems and the dot-coms, then Google and Apple 2.0, now the current crop of companies. SV housing being expensive has only been true for the last 20 years, and has only been utterly unaffordable for the last three.

Guy knows his probate housing, self-help writing, and public speaking, but I don’t take him as an authority for computer company insights.


Another very strong factor is health insurance. A single employee with serious cancer can cause the premiums for the entire company to go up 20%. That’s not a made up scenario or a made up number. United Healthcare is not a small provider either.


They are discriminating against age, not experience. A 25-year-old with 10 years of experience (elite teen hacker!) is far more attractive than a 50-year-old with 10 years of experience (second-career washout).


That's a false equivalence. They're more likely discriminating between a 25-year-old with 3 years of experience and a 50-year-old with 20 years of experience.


Is there any evidence that companies discriminate between two people with exactly the same experience over the last 5 years, with one having an additional 20 in something else?


I'd be surprised if the average 50yo applicant seeking a development job was a second-career "washout" (whatever the word washout means for someone who actually has managed to stay working in any career for a decade). In fact, I'd guess that among 50yo applicants, more of them were teen hackers and have 30+ years of software development experience than have <10 years of experience, though perhaps the median might be more like 20+.


They had another career before this which means they WANTED to go into tech. I feel like the chances they would be more passionate and dedicated are much higher.


There are also a lot of developers with ingrained work routines that are just bad. You might be a 10x developer, but there are a lot more 1x developers and 0.3x developers than 10x developers. If they're younger, you can still change their ways much more easily.

If you're the hottest startup around, yes, you can only hire 10x developers.

However, if you're the average tech company, like most are, you cannot. Those 0.3x developers and 1x developers with bad work routines occupy the most job positions in companies.


It's a managerial move, and most of these companies are still quite manager-led. It's the reason for frameworks like Angular, which is a manager's tool, not a developer's: the manager can replace one Angular developer with another and stay the (fr)agile course.

There's probably a bunch of managerial bias as well about the type of person they can control more easily: young, eager dummies, versus folk with experience who know better than to be treated badly.


> I am 10 times better than I was when I started. I know, because I still work with some of my code from back then, and I desperately want to rewrite it all.

Don't be biased. Your improvement might be heavily influenced by the fact that you work on the same codebase, in the same company, with the same people. If you were to start a new job somewhere else, your productivity would likely drop significantly during the first years.


I genuinely wonder if the people in charge of companies read the tech news and think, "I sure would love to have a person, akin to a mathematician, who could do good work, operate in those abstract truths we see interwoven in all things, and reliably improve my business" - and then proceed to hire the cheapest mental workforce that barely gets the job done.


> It doesn't make sense to me to discriminate against people with more experience. Can someone explain it to me?

It may or may not make sense from a business point of view. But the point of the article, I think, is to debate whether this is legal or not, which is orthogonal to a purely business decision.


"I've worked with people in their sixties, and they're sweet people."

Survivorship bias.


There is a perception that as you get older you don't keep up with new tech and that you work less hard.

So if you are an older person, make a side project with some cutting edge stuff and prove them wrong.


Insecure middle managers.

They know seasoned workers are not going to put up with their shit and will make them look bad, possibly making higher management want to replace them - possibly with the seasoned worker.


Older employees are significantly more expensive for your employer-provided health insurance in the US. Not sure how that translates to bigger companies, but it's significant for small firms.


No women and no non-whites = no HR problems /s


You're being sarcastic, but...

A company of entirely 20-something dudes is not going to generate any sympathy. If there's a problem, it's not going to generate headlines or blow up outside the walls.


But what? Finish your statement.


They don’t want people with health issues, kids and spouses.

Whatever advantages you have, the common sense thinking is that more commitment and more control has high value.


It's about fitting a stereotype. The stereotype of a good programmer is a young man in formal clothing. Hence companies look for that.


Perhaps the extra experience is not worth 50% more? I'd be wary about calling an entire industry irrational when they're spending their own money.


Health insurance costs more for old people.


Have you been wondering why so many companies fail? They artificially limit themselves with things like refusing to hire people based on age.


I completely agree with everything you said.


I'll tell you right now. There is ABSOLUTELY discrimination against older workers. Even though most workers these days don't stay at companies for more than 5-10 years, if a worker is "too old" they think they'll be too hard to train, or "just looking to retire"--even if the length of time is the same.

My father is the pinnacle of excellence and professionalism as a plant engineer. Everything that got me ahead of my fellow students and into a Ph.D. program, I owe to his teaching and critical thinking. He walks into a building and immediately notices things that can be improved and will save them money. (And he's not a jerk about it, either.) He worked for a company for 25 years, and others almost as long. And ever since that 2008 recession? He's been out of a job. His plant got bought out and sold off. They kept him on longer than anyone else because he was one of the best employees they had--but the company eventually went away.

And now? Nobody will hire him even with countless references from management to coworkers. "He's too old." And one recruiter literally told him that.

Imagine how depressing that must be: to work your ass off your whole life, do "the right thing," put in 110%, go into work at 3 AM because "the line is down!" - and what does he have to show for all that extra work? A dwindling savings account that was supposed to be for retirement, and countless rejections from companies who wouldn't know a good employee if it smacked them in the dick.

That's not how you treat the people who literally built the infrastructure you grew up on. It's reprehensible. And I cannot, for the life of me, understand why anyone (with a brain) would really care about age when all that matters is: "Can you make me money?"

Someone may say, "Well, they can pay a junior engineer less money." Yeah, and a junior engineer is going to have less experience reducing costs and improving reliability. There's a reason we pay experienced engineers more: they add more value per unit time. It's not like we just go, "Oh, you're older now, so here's some more money."


> "He's too old." And one recruiter literally told him that.

How old is he?


60's.


> It just don't make no sense. Other fields reward grayhairs. You see some sixty-year-old painter or architect or carpenter, you think he's probably pretty good. You see some straight-out-of-college twenty-something in . . . any other field, you think, "I sure hope he knows what he's doing."

Well, if the technology you're using has only existed for 3 years, 10 at most... what's the point in hiring a grayhair? As far as you're concerned, they have the same amount of experience as a fresh-ish grad, except they're more expensive.

Sure, the underlying principles are all the same, but you don't know that if you're not technical yourself.
