> I spent many years working in a satellite office. A lot of the time when I had some deep technical disagreement about something, I'd fly to the main HQ, and go out for dinner with the other team. I wouldn't even talk about technology, just about random stuff. Once they knew me, and realized I was a human (and not a faceless blob), most technical disagreements tended to go away. People are more likely to assume positive intent and not malice/stupidity/laziness.
Building relationships is important to get bugs fixed and features built. Bug prioritization isn't just a technical or a business exercise.
Now, I want to hear which company this is that this guy could just "fly to HQ" (I'm assuming Microsoft Back in the Day) but the geographic gaps aren't always that large... :P
Part of this was to develop a healthy rapport, to build mutual trust while working on large-scale projects.
But another aspect was productivity; both my team and the other teams had a lot on their plate, and it could be difficult to align incentives such that our shared projects would be completed in a reasonable amount of time. Flying out, finding a desk near the other team, making yourself visible/available, and working directly on the project would get more done in a week than would have happened in 2-3 months of remote collaboration.
Of course, it would be lovely if all companies could get work done remotely (yay Gitlab, Basecamp, etc. etc.), but most companies aren't set up for remote collaboration success, especially across management lines.
I have never started a company and I was hire #76 at my current place, but if I were to start something remote, I would find a game or hobby appreciated by all the founders and make sure that remained a bedrock of our culture.
Golfing together, dining with the team, etc. are the traditional recipes for creating trust. But it's walking a fine line between personal and professional life, while also bridging technical ability and personal chemistry.
I understand we are all human and personal feelings have an impact on work; they can't be removed past a certain point. I'm just not sure we should see this as something that should be perpetuated, moved online, or adapted to new environments if those were by default devoid of these elements.
Basically I'd hope we can prioritise other ways to create trust than building personal bonds (TBH, being good at one's job for a decent amount of time seems plenty enough to me)
While outright racism is rare, I often see a sort of "you can't be better than me" or maybe a "you think you're better than me" attitude.
Maybe it has more to do with my personality than my looks, but it seems to overlap with in-person meetings vs virtual ones.
This is called networking and I don't think it's going away anytime soon.
I'm a consultant and I get along very well with the director of engineering where I'm working now. When cuts happen, I won't be the first to go. The first to go will be the people that have no relationship with upper management.
Career advancement isn't all about the job that you do.
"Basically I'd hope we can prioritise other ways to create trust than building personal bonds (TBH, being good at one's job for a decent amount of time seems plenty enough to me)"
Being only good at one's job is a great way to keep your current job and position forever. If that's what you want, it's fine. However, if you want to get into management or advance, you need to get noticed and build relationships so your potential can also be noticed.
It's never going away. It's human nature to trust and favor the people you know over those you don't.
A friend of mine who was one of the top salesmen in his outfit would eat his lunch at his desk to save money. But he wasn't getting promoted, while others were. He eventually realized the "do lunch" socializing with coworkers was essential to success.
The idea is to adapt to reality; that's far easier than trying to change human nature.
> The first to go have no relationship with upper management
This also means upper management had no clear view of how they were doing or what value they brought in.
If they did the same as you, being let go while you stay is mostly fair. If they were better at their job than you, I hope they find a job where they'll meet better upper management.
I know it feels overly idealistic or unrealistic to a lot of people who, as you describe, climbed the ladder by sticking with the big boys. Especially for those working in bigger companies where a single employee’s value is heavily diluted and political skills become huge assets.
But companies who care more about your code than where you eat are also increasingly common. It's also so much more comfortable when you can feel that your work output actually matters.
Given its potential to democratize the socializing that forms business partnerships, it could catch on all on its own, as a way to be inclusive during the pandemic in a way that golf never was.
Semi-related, I was reading a sociological research paper by Irving Crespi about playing cards in Endicott, NY and in his conclusion, he speculates that people used the structure of playing bridge or what have you in order to more easily tolerate each other's company. He was trying to explain why most card players weren't gambling seriously and I think he was right about structured activities providing a social lubricant analogous to alcohol.
Perhaps the eusocial aid of structured activity is even stronger than alcohol, because we're given a detailed script to play out and distract ourselves, instead of simply taking the edge off our tendency to waste energy overanalyzing the social situation.
For example, think about dating. It can be exhausting to just meet somebody new for dinner. You have to keep up an interesting conversation without much help, hoping you'll stumble upon some common interests.
It's generally easier to do an activity where you can focus on the task, talk about how to do it well, have an excuse for lulls in the conversation, etc.
I don't have any research to support this, but I have come to the same conclusion on my own while organizing social events at the workplace. I've found that working on a common goal-oriented task which requires communication is an amazing medium for getting people talking. The conversations start around the goal and eventually move outside of the task and into more casual conversation. But having that base that requires communication (think "pass the ball" in soccer) is required.
As an example: At the previous company I worked, the CTO and another engineer really liked bouldering. They'd often go out after work to the local climbing gym. Eventually, two other engineers joined that group and went climbing too. Mysteriously, all three climbing engineers got promoted and none of the non-climbing engineers did. Eventually the CTO left for greener pastures and the "new" two engineers (by now team lead and VP) stopped climbing the week after.
I worked at a place that hired offshore (European) consultants for a part of our product, and my colleagues would constantly complain about the quality of the consultants' work and skill set.
I didn't complain [1]
Eventually corporate switched out the current body shop for an equally bad offshore one from a different European country, but this time around they'd invite them to our HQ beforehand for a week to have lunch and spend time with us before being sent back home to work remotely with us.
This time around, no one in my team said anything bad about the new group. These consultants weren't better than the previous ones at all, and ended up doing more damage eventually. Hell, some of the mistakes they made, the previous ones would make as well.
Granted you could attribute it to "what's the point", but I suspect largely because of what you mentioned, face to face made them mellow... or from my PoV, weaker.
1: The reason I didn't complain is that they weren't any worse than my in-house colleagues, I found the irony funny, and I was on my way out anyway.
No point in stepping into the fire unless I can control it.
I've definitely fallen into the "attribute to malice/stupidity" trap as well, even when I know that the people responsible are working flat out, if you're not in the trenches it's so easy to say "but it's so simple! I could do it in an hour". It's something you have to constantly keep in check.
Honestly, I think the prioritisation thing is absolutely the most important thing any team or company can do. The person working on a feature almost never has the overall context to accurately prioritise. The next feature or bug on whatever they're working on will always feel more important than someone else's problem. Building a working environment which empowers people to work towards a common goal - and not just whatever feels important to them in the moment - is really hard.
Such "invisible" work may be difficult to justify over "visible" work, yet is crucial for the long-term quality of the product.
There is a decades-old ERP system (that shall remain unnamed), which to this day doesn't use foreign keys in its database. The consequences of this are not immediately visible, but are very serious in the long term, including data corruption (which I have actually seen with their customer) and (somewhat counter-intuitively) performance (https://stackoverflow.com/a/8154375/533120).
Having had a glimpse of their internal culture, I'm not surprised. Any "foreign key champion" would have been quickly assigned to "more important" tasks.
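For anyone who hasn't seen the failure mode first-hand: a foreign key constraint makes the database itself refuse orphaned rows, which is exactly the corruption vector the missing keys leave open. A minimal sketch using Python's built-in sqlite3 (the table names are made up for illustration; note that SQLite only enforces foreign keys after the PRAGMA is set):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite disables FK enforcement by default
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id)
    )
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (10, 1)")  # fine: customer 1 exists

try:
    conn.execute("INSERT INTO orders VALUES (11, 999)")  # no such customer
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the orphan row never lands in the table
```

Without the constraint (or the PRAGMA), the second insert succeeds silently, and the dangling `customer_id` is only discovered later, in a report or a join that quietly drops rows.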
This is such a common gripe, and I agree it's grounded in a lack of perspective. It reminds me of an image that helped me quit smoking. When you pick up a cigarette, imagine that a string is connected to every other cigarette you will smoke for the rest of your life, in a chain that stretches on into the future. You're not just picking up one cigarette. And similarly, you're not just picking up one hour of work, because there are 100 other features that fall into the same box.
That's when you clone their project and set aside four hours on a weekend/downtime and open a PR (if you happen to finish in four hours)!
I try to commit to fixing annoying bugs whenever I can spare a wee bit of time. Sometimes my attempt is a hacky or not-so-great solution, but it's close enough that those close to the project can realize that it only needs a few more lines in exactly the right place or choose to ignore it.
Opportunity cost is the cost associated with not reaping the benefits of an alternative that was not picked. This is a cost from the perspective of the party making the choice.
So in this story, 'opportunity cost' is not an explanation why something didn't happen. It is a side effect of something not happening.
You could say the real opportunity cost in this story is the startup teams having a bunch of unhappy users. The explanation for the missing features is simply: they did something else.
So better: "Never attribute to stupidity what is adequately explained by prioritisation"
>You could say the real opportunity cost in this story is the startup teams having a bunch of unhappy users. The explanation for the missing features is simply: they did something else.
The "opportunity cost" _is_ an explanation and your suggestion of shifting the label "opportunity costs" to apply to something else such as "unhappy users" is just adding to the misinterpretation.
Let me try to bridge the gap between what the author wrote and your critique of it...
Think of "explanations" as different levels:
- level 1: Why is feature X missing? Because they did something else.
- level 2: Why did they do something else? Because of opportunity costs.
- level 3: Why are there opportunity costs? Because no company has infinite budgets, infinite programmers, and infinite time, and therefore, they must choose what _not_ to work on.
- level 4: Why does the company not have infinite <anything>? Because that's what economists, philosophers, theoretical physicists, etc have observed in the visible universe so far and that's the framework of reality we're stuck with for the _practical_ purposes of running a company on Earth.
The level 1 cause & effect is "correct" and "true" but also trivial, obvious, and uninteresting.
The level 2 cause & effect is what the author is trying to explain.
The levels 3 and 4 start getting into metaphysical discussion which might be intellectually entertaining but hard to apply to real life.
"Why is feature X missing?"
1. They decided to do something else, and by extension, not this (e.g. because of opportunity costs)
2. They decided not to do anything (e.g. because the product is abandoned)
3. They evaluated the possibility of adding feature X, and decided it doesn't make sense now, even if the resources to implement it were available (similar to the above 2 but it means feature X is unlikely to get implemented without some external thing happening to make it make more sense)
4. They evaluated the possibility of adding feature X, and decided it doesn't make sense ever (e.g. because it directly cannibalizes their primary business model)
If you would like feature X as a customer or potential customer, you may care which of those is actually the case. If you are responsible for a large chunk of their revenue, and feature X is make-or-break for you, 1 and 3 are ok scenarios for you, but in scenarios 2 and 4 you're unlikely to have much luck. If you are evaluating whether or not to start using this product, scenario 2 may be alarming enough to make you choose another product entirely, and scenario 4 tells you important things about their business model.
The article is suggesting that people overestimate the probability of explanations 2, 3, and 4, and that most of the time explanation 1 is the real one, especially for obvious shortcomings in a product. By the author's evaluation, explanation 1 is "opportunity cost", explanations 3 and 4 are "strategic reason", and explanation 2 isn't really covered (though it could fall under "opportunity cost" if you stretch).
I’m applying the exact definition quoted in the original article.
The real problem is failing to estimate opportunity costs correctly.
If you need to get Product X out the door and you spend two years rewriting it, you have failed at basic cost/benefit analysis.
That's simple. More complicated: the user POV can look very different to the PM POV. Which is always a bad thing.
I use a number of DAWs, and on some DAWs the PMs keep releasing new versions with new features that are interesting but non-essential. Meanwhile version x.0 is always broken in important ways, and may be so broken it's literally unusable. And some bugs are just never fixed.
From one POV they need to ship n+1.0 to keep the annual upgrade cycle humming and the income stream ticking over. From another, these companies accumulate a lot of frustration and ill-will, and when a realistic alternative appears not a few users jump ship. So they lose that income anyway.
What's the ideal solution? Other DAW companies ship updates less often. They rely on positive PR, better marketing, and general customer good-will. They seem to be doing okay.
I have no idea how the financials work out, or if they're even comparable between two products that are nominally identical but are used by different demographics.
Even so - the critical point is that someone in the management food chain should at least be attempting to assess these strategic trade-offs.
To an outsider it often looks as if this isn't happening. Instead, the org made a decision a long time ago, the decision has never been updated or challenged, and now the org is just following it robotically whether or not it still makes commercial sense.
Also, prioritization is not only minimizing cost, you’re probably also maximizing reward. So the calculation is more complicated anyways.
It would also draw a bigger distinction between the "strategic" decisions he describes at bigcos (which also involve opportunity costs!)
Terminology aside, I did think he made a good point.
But it does actually work. When there are high-value things to do, then that is the opportunity cost of doing low-value things instead. ie, by doing the low-value thing you incur opportunity cost equal to high-value thing.
If we implement Feature Y, we can make $$$.
If we fix Bug X, we can make $.
So the opportunity cost of Bug X is $$$, the benefit of Bug X is only $, so we're not going to fix it right now.
There is more than one way to refer to this familiar situation. I could say "opportunity cost," I could say "priority," I could say "limited resources." I could pick one of those and say the others are all wrong, but what do I gain by doing that?
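The $$$/$-style comparison above can be written down as a toy prioritization rule: rank backlog items by value per unit of effort, and the opportunity cost of any item is the top-ranked item it would displace. All numbers here are invented for illustration:

```python
# Toy prioritization by value density. The "cost" of fixing Bug X first
# is the Feature Y revenue you forgo in the same engineering week.
backlog = [
    {"task": "Feature Y", "value": 100_000, "effort_weeks": 1},
    {"task": "Bug X",     "value": 1_000,   "effort_weeks": 1},
]

# Rank by value per unit of effort (with equal effort, a pure value sort).
backlog.sort(key=lambda t: t["value"] / t["effort_weeks"], reverse=True)

chosen, skipped = backlog[0], backlog[1]
print(f"Do {chosen['task']} first; deferring {skipped['task']} forgoes "
      f"${skipped['value']:,}, but doing it first would forgo ${chosen['value']:,}")
```

This is of course only as good as the value estimates that go into it, which is the point several replies below make: the label ("opportunity cost", "priority", "limited resources") matters less than whether those numbers are anywhere near right.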
You pick the bill up, and it is indeed $20.
Why didn't the man pick it up? It's free money.
It's true that it's free money. However, the man might have something that he's rushing to that in that instance, he's decided is worth more than $20. Perhaps it's a very important job interview, for example.
So while it is a free $20, it also costs him a job. That's why he can't afford to pick it up.
You don't have the same opportunity cost. For you the $20 didn't potentially cost a job. So you pick it up.
Developers are perhaps in a similar situation.
Let's say there is a bug that is so horrendous, every month 5% of your userbase leaves saying "We are leaving because you don't do x." Another 5% of your userbase is on forums desperately asking for x, saying they will start paying you the moment you do it.
Why don't you pick up all these customers? All you have to do is x!
Perhaps the reason you're not picking up these customers is that you have another feature that increases your reach by a factor of 5. So, although you could retain the 5% of your users who are leaving, and pick up the 5% of users who would join, you have another 500% of your users whom you are gaining by investing your time in doing y instead of x.
You can't "afford" to do x, because then you wouldn't be doing y.
In essence, you are rushing to an interview instead of picking up the bills your customers and potential customers are waving in front of you, hoping you'll take them.
Is your analysis of the opportunity cost rational and correct? Maybe not.
But this mentality does explain why a whole lot of software is extremely buggy and has forums full of people asking for simple bug fixes and features, all of which are ignored while the company pursues new customers. The company feels they "can't afford" to incur the opportunity cost of fixing their bugs, because they'd be losing out on the new customers they could be getting by focusing on new features instead.
This is also why making a big scene in the news gets a bug fixed very fast. Now you're not just "paying" with your dollars, which the company would ordinarily snub; you're paying with the dollars of all the new customers who are reading about their response in the press, and the company cares a lot more about those new customers than it does about you. So it fixes your bug, because now it can't afford not to.
Or, to quote Steve Jobs: "People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are."
I find personal projects very humbling. Even with no one stepping on my toes or setting unrealistic requirements, I still make dumb mistakes and write subpar code.
"It's not a matter of convincing me we should do this. Obviously, we should. But we also have a hundred other things that we should do, and a few dozen things that we need to do. So it's not a discussion of 'should we'; it's a discussion of understanding how important this thing is in relation to all of the other things."
Triage and prioritization is hard.
It's always going to be a choice of doing these 5 things versus these 20 others unless one of those "20 others" means we can now do 6 things in the same amount of time. If you can pile a bunch of those up and get to 10, a lot of rhetoric goes out the window, on both sides.
Sharpen your damn saw.
He finally got tired of me saying, "Given enough time and resources, yes, we can do pretty much anything! Talk to <Product Owner> about what we SHOULD do."
We started to gain larger clients. We thought that a 5,000 person company would be ahead of the 5 person company when it came to technology. Not so. Here is what we learned: there are political risks for changing technology. A CTO (these are not tech companies) would wait until people were complaining so extremely that people would have to feel that changes were good. It all arose from that bias most people have about change being bad unless it is ten times better.
Big companies have inertia to deal with for every change and once you get burned by something trivial that gets blamed for problems, you get very shy about improvements.
I'm in the process of building a system that had to make a lot of compromises based on existing expectations. It's probably a year away from being where I want it to be, because every change is so slow...
It's frustrating to defend these decisions every time someone comes along and notices how hard it is to use the system. I have to explain that every part of the system is currently broken out so we can find and fix issues; the end goal is to stack everything together, but right now all the sausage-making is on display, and frankly it's appalling...
(edited for readability)
The end goal is turning a push-button system that had to be carefully monitored for breakage (while it was running after hours) into a staged system that gets checked and verified during business hours, but still fires off after hours.
I also wrote about this recently: https://timwhite.digital/building-software-sharing-knowledge
This is from someone who's had the [dis]pleasure of working on all manner of steaming piles of rubbish.
If there are any Junior devs reading this, Chesterton's Fence is a great concept to understand and abide by!
This helped me develop relationships with other devs, QA, ops, etc. Also, I could learn or discover the reasoning behind trade offs and compromises that were made before I proposed changes.
One time I went direct into a director/manager lead role. It was a nightmare, some on the team thought I didn't know what I was talking about -- even though they were not using source control until I brought it up during my interview. They had not been exposed to professional software development before my arrival and took little stock in my advice/leadership.
Or that the way to do it better simply didn't exist back then :)
At a certain point it becomes inexcusable, and I personally never judge them as being stupid or lazy, I just attribute it to lack of empathy.
It’s painful for people that have to regularly deal with something not straightforward (over complex) or just plain old messy (no concern over what it would be like to revisit the code for anyone, even themselves).
Ruins my day(s), and it takes some mental gymnastics to justify serial offenses of this kind (e.g. "We got it done, that's what matters" - great, but you slowed down velocity at every later stage).
Lastly, I think when you find yourself asking ‘why is it done this way’, you’ll notice it’s a pattern. It’s never a few pieces here and there where you can reasonably understand, it’s always endemic throughout the codebase.
So sure, not stupidity, just a pungent disregard for others.
A casual company culture is an implied pungent disregard for others and it comes with its own pungent smell of entitlement too
In reality there probably was incompetence bordering on malpractice by the prior firm that did the work. But you never know.
If $100Ks (and multiple years) have been wasted, I know it is incompetence bordering on malpractice. Especially, if the firms that did the bad work have well known brands.
In professions like law, medicine, or professional engineering prioritizing is important and useful but providing shoddy work product or corner cutting should be unacceptable. And often corner cutting is malpractice.
In software development I used to push hard to get people to focus on getting from A to B efficiently while using an architecture that supports flexibility rather than over-engineering from the beginning. It seemed to work well.
Though I did run into some managers/directors/VPs who would carefully listen to a reasonable proposal and ignore it, or in one case do the opposite. I always took that as the sign to move somewhere else. Eventually that led me to go to law school.
Or, more succinctly, asking `Who has the problem?' has helped me over the years to adjust my expectations, because more often than not, it is me who has the problem, and not the person I am reaching out to.
You will see this especially at work: when you have something that makes a ton of money, people will still be uninterested because it doesn't benefit them as much. At that point you require lots of escalations.
This happens about 95% of the time when you don't know a certain person/team.
I used to do plant floor support in a manufacturing plant. Over the radio, people treated me like an idiot. When I walked out to the plant floor they treated me like a genius.
Ok, let's leave security for later!
I personally experienced an important delivery being delayed several days because the tech lead wasn't happy with the security settings on the database server. If the DB had been accessible outside our internal network, or contained any information that wasn't publicly available, I would have understood.
I also will not sign off on security concerns unless they are correct. I'm happy to let management override me, but I won't back down and say things are finished when they aren't.
IMO, your tech lead wasn't the problem here. Their boss was. Their boss decided that the security was more important than the delivery date, or they decided that they didn't care enough to get involved, which amounts to the same thing.
Why should sales stop littering when development is the one cleaning up the highway?
A larger company is generally not moving as quickly, and the scale of the things that they're choosing not to do for opportunity-cost reasons gets much smaller.
I worked for a money transfer startup at one point. You could use it on Android and iOS but not on desktop/web. We just hadn't gotten to it yet. If they're successful but still mobile-app only in five years, though, I'll expect that it's for reasons like "we looked into web but fraud rates were much higher" instead of it just being a matter of priorities.
- some 20% of people think the office is where politics and games are played, so they try to play it better. That's because openness at the office is usually quite low. Such people are unmoved by pushback, thinking themselves the smarter, grayer-haired guy. Survivors of this kind of BS actually are smart, and do have experience with effective but distorting coping skills
- software should be a formal science. It's not almost all the time
- the chief issue unaddressed here, but smartly led up to, is the management function: as the normal scenario is that resources are vastly dwarfed by problems, what do we say no to?
The management question, when examined, is complex, and it unfolds into systemic issues that are chiefly human, not technical; see any of the corporate histories on quality improvement in the '80s/'90s, especially company turnarounds: Detroit, Xerox, IBM under Gerstner. While technical issues remain on the table, sustained underperformance is ultimately rooted in management and company culture.
It's not uncommon to have a product person (or even team) that spend tons of time doing analysis to find high-value features. When they show up with some slick Powerpoint/Excel thing that says "If we build this feature we make $BIGNUM", it's extremely difficult for some dev (or dev manager) to come up with an effective counterargument for working on tech debt. Partially because it's not their main job, and partially because quantifying the impact of tech debt is extremely difficult. And it's very easy for that to happen every single planning cycle, even while everyone makes appropriate noises about tech debt being super important and we're totally gonna address that but we just can't justify not going for that $BIGNUM cash first!
I wish I had a solution for that - I don't. If anybody else does I'd love to hear it. :)
But, sometimes there's stupidity at a different level. Just because someone is making an effort at prioritization doesn't mean they are doing a good job of prioritization.
It's hard to tell from the outside (where you lack the relevant knowledge), so you're kind of forced into giving them the benefit of the doubt. But sometimes that internal tools bug which they never get around to fixing is eating up 5 minutes of every employee's time every day for 2 years, and it adds up to a lot of wasted time, and they should have prioritized it but they didn't. Maybe because they didn't appreciate what the cost was, or maybe because they had some bias toward certain kinds of tasks, or maybe some other reason.
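The 5-minutes-a-day example is worth actually multiplying out, because the total is surprisingly large. A back-of-envelope sketch (headcount and working days per year are assumptions, not from the comment):

```python
# Cost of a "small" internal-tools bug: 5 minutes per employee per
# working day, left unfixed for 2 years. Headcount is an assumption.
minutes_per_day = 5
employees = 200              # assumed headcount
working_days = 2 * 250       # ~250 working days per year, for 2 years

wasted_hours = minutes_per_day * employees * working_days / 60
person_years = wasted_hours / 2000   # ~2000 working hours per year

print(f"{wasted_hours:,.0f} person-hours, roughly {person_years:.1f} person-years")
```

At these assumed numbers the "minor" bug costs on the order of four person-years, which is more than the cost of fixing most internal-tools bugs; the hard part, as the comment says, is that nobody is doing this multiplication.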
Well put. I'm going to use this. It feels quite true that when analyzing the behavior of any entity (and to a lesser degree, a person) it's rare to take time into account. If something wasn't done it's either stupidity, malice, or something-I-don't-understand. It's refreshing to think that it might just not have happened yet, and perhaps never will because there will always be more important things in the pipeline.
So perhaps this idea is a corollary of my own pet aphorism: "Things take way, way, waaayyy longer than you'd think". Everything seems simple and easy until you set to actually do it. The kicker is that even looking back on my own work, I sometimes can't understand "What took me so long?!"
You just have to look at the k8s hype to see that it's not.
In this industry, I often have the feeling that people are fetishizing complex solutions even if the problem at hand doesn't require them.
But I have to admit, it's not easy to focus on the problem space and really work out what is needed. Most people do too much or not enough.
"Why didn't we invest in long-term issues?" (say, virus research...)
"Because I can make lots in finance this quarter."
As I understand it humans have (at least) 2 really big problems in rational thought: 1) A bias towards the short term and 2) unknown unknowns (cue Nassim Taleb)
Never attribute to higher order stupidity that which can be attributed to lower order brain functions.
The better pay and frequent incompetence are opportunity costs, because somebody has to write this code. The business need is present and so factors are adjusted to ensure delivery of necessary business solutions while reducing hiring risks. It made sense to hire people who could get by just writing minimal logic over a framework tool, because developers are interchangeable assets of the business that come and go rapidly in the marketplace. When there is a shortage of talent you take what you can and ensure it is commonly replaceable. It should also be stated the conditions for hiring developers were well in the developers' favor because the business need was clear and businesses could afford to hire more people while enjoying the longest running bull market in history.
I suspect, though, this party is over. We now exist in a period of rapid economic contraction unlike anything in history. Before the recent bull market developers primarily existed to provide automation. The value of most developers was in the reduction of expenditure on the employer's internal operating costs. During this bull market that value became lost as the lines between automation and product creation frequently blurred. Regardless employers just knew they need people to write code and could happily afford it.
COVID has put many developers out of work, and economists are anticipating the economic hardships will continue for about 2 more years. From a business perspective developers are expensive cost centers. Developers do not add to the revenue of the employer. They only exist to substitute automation for business expense. The people who actually generate revenue are called something like sales, marketing, or merchandising. So, for a hiring manager of a business struggling to stay afloat, why hire 12 barely competent people who write a few lines of code here and there for a framework SPA when instead they could hire 2 or 3 developers at a slightly higher rate who focus on automation in general? In other words, the safe expectation moving forward is that employers will require more from fewer developers, or will find a completely different medium with which to engage their users/clients. With many developers suddenly out of work, talent is no longer rare.
The opportunity costs have drastically changed in response to market pressures. The employers who will most rapidly adapt their hiring and use of developers are those under greatest financial pressure. The employers who are currently safe from market pressures, whether by their essential status or cash liquidity, will not feel this pressure and will become the new dinosaurs as they retain older and less efficient ways of shipping products.
What really upsets developers is the extra work that is generated by managers that should not manage anything really.
If you are competent enough to create and deliver a software product/service from zero, it seems the only way forward is to have your own product/service.
A software company is one that generates revenue directly from the software they write. In that regard I do not consider Facebook to be a software company, their revenue is from advertising and media. They are a software company the way CNN is. I consider Google to be a software company because their revenue is from micro-auctions to their search engine.