But someone in the business ought to have estimated the impact of a task, and it's very useful for developers to be in on that discussion because then they can suggest alternatives that are way cheaper but still worth just as much, or that cost slightly more but would also be worth so much more.
And that's just the immediate benefit. It goes beyond that to organisational learning and being able to calibrate past estimates and get better at picking high value fruit rather than just the low-hanging ones or the ones someone has a good gut feeling about.
But my experience from organisations both small and large, new and old, is that it takes a huge mental shift to go from "this sounds neat, let's do this" to
- How exactly do we think doing this will help us in the long run?
- How would we express that in dollars to be able to compare it with other things?
- How soon will we be able to tell that we were wrong?
Say you're building an enterprise SaaS, and several of your VIP customers ask for a button to "Export this table into Excel for offline analysis". These customers are paying >$100k p.a. and not threatening to leave or anything, but it's a feature you think would be valuable for the core product. You ask a developer to build this, and they ask, "How long can I spend on this before we're losing money?" There are several ways to answer, but few are polite. While every business has some prioritization matrix/process, rigorously estimating the "marginal revenue" for each movement in the dance would add friction and internal transaction costs that would make the business unviable.
I mean yeah, essentially this is uno reverse carding the business people with a question that's as difficult for them to answer as their question is for the dev.
They probably don't know. They're waiting for the dev to throw a number out so they can gut feel whether it'll work or not.
It's also likely a made-up number. I find many people think business is about careful planning and optimising what you do for the best outcome. In practice, it's the opposite - you do what you hope will add the highest value in the long term, then you scrabble about trying to realise that value or change tack in response to feedback. Some win, some lose, and all with varying degrees of severity.
The nice thing about estimates is that you can make them at whatever confidence level is sensible.
If a precise estimate (which seems to be what you have in mind) adds too much friction and internal transaction costs, make the estimate at a different confidence level.
So in your specific example, you might go, "I can't give you the exact tipping point right now, but I'm 98% certain it's worth more than five days. So if you end up spending four days on it and it's still not done, come back to me for a more precise estimate."
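To make that concrete, here's a rough sketch of how you could back out a "safe" number of dev-days from a deliberately vague value estimate. Everything in it - the lognormal shape, the parameters, the daily cost - is an assumption I'm inventing for illustration, not a real valuation:

```python
# Back out "how many dev-days is this almost certainly worth?" from a
# deliberately vague belief about the feature's value. All numbers invented.
import math
from statistics import NormalDist

# Assumption: our belief about the feature's yearly value is roughly
# lognormal, i.e. log(value) ~ Normal(mu, sigma). Median ~ $40k/year,
# with a lot of uncertainty either way.
mu, sigma = 10.6, 1.0        # parameters of log(value in USD/year)
daily_cost = 1_000           # assumed fully loaded cost of one dev-day

# The value we're 98% confident the feature exceeds is the 2nd percentile
# of that belief distribution.
p02_value = math.exp(NormalDist(mu, sigma).inv_cdf(0.02))

print(f"98% confident it's worth more than ${p02_value:,.0f}/year,")
print(f"i.e. at least {p02_value / daily_cost:.1f} dev-days of effort.")
```

With these made-up numbers you land at roughly the "more than five days" answer above; with real numbers you'd land somewhere else, but the shape of the reasoning is the same: a wide, honest belief distribution still gives you a usable lower bound.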
As a thought experiment: even though it's impossible to meaningfully answer "how long can we spend on this without losing money on it?", imagine some omniscient Business Analyst does the math and confidently declares, "I expect this button to increase our profit by $300k p.a.". Presumably, the developer now has the commercial guidance to announce, "My salary is $200k, so yes, I will create this button in the next 12 months, and it will be my only job to oversee The Button as long as it exists, and the business still gets to keep $100k p.a. Great planning session."
Estimation and prioritization systems are entire complex fields we could debate. I'm not trying to do that; I'm saying that asking "how long can we spend on this without losing money on it?" will not be a productive addition to most (all?) discussions with developers/product owners/executives/clients.
How do you define "meaningfully answer" to arrive at the conclusion that it's impossible? To me, meaningfully answering that is just as possible as meaningfully answering "when will Tobias die?", which is something life insurance companies have done for a very long time.
We're getting quite off topic here, but the "duration of a human life" follows a narrow, predictable, bell-curve-shaped distribution. "Impact of this feature" does not - it follows a power-law distribution. Life insurance companies can meaningfully estimate when you'll die, but they cannot meaningfully estimate the value of attending any given party/meeting/trip/conversation. Most of those will be meaningless by comparison to the few that cause an inflection point and make your life worth living, often by accident.
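For anyone who'd rather see that difference than take it on faith, here's a throwaway simulation. The distributions and parameters are invented; only the shapes matter:

```python
# Bell curve vs power law: why you can price a pool of lifespans but not
# pick the party that changes your life. All parameters are made up.
import random

random.seed(0)
N = 10_000

# "Lifespans": roughly normal (mean 82, sd 8). No single life dominates
# the pool, so the aggregate is very predictable - insurable.
lifespans = [random.gauss(82, 8) for _ in range(N)]

# "Impacts" of features/parties/meetings: heavy-tailed Pareto (alpha=1.2).
# A handful of outliers carry most of the total value.
impacts = [random.paretovariate(1.2) for _ in range(N)]

def top_share(xs, frac=0.01):
    """Share of the total held by the top `frac` of observations."""
    k = max(1, int(len(xs) * frac))
    return sum(sorted(xs, reverse=True)[:k]) / sum(xs)

print(f"Top 1% of lifespans hold {top_share(lifespans):.1%} of the total")
print(f"Top 1% of impacts hold {top_share(impacts):.1%} of the total")
```

Rerun it with different seeds and the lifespan share barely moves while the impact share jumps around - which is exactly the asymmetry between the two estimation problems.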
The neat thing about these power-law numbers is that you don't need to estimate whether we're talking 83 or 91. You just need to do enough work to tell 1000 apart from 10, which is often considerably easier.
And note that I'm not saying you'll always know down to the cent, or even the nearest hundred-dollar bill. Sometimes your best estimate is going to be "between zero and 999999" and that's fine. If you've determined that, you also know which of your inputs have the most uncertainty, and that might represent a blind spot of yours around an important aspect of the business.
Or it represents a fundamentally unknowable thing. But then at least you know you're staking this part of the development on a fundamentally unknowable thing. That's more than many others know.
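If it helps, here's the kind of crude range estimate I mean, applied to the Excel-export button from upthread. Every input range is invented; the interesting output isn't the dollar figure but which input keeps the interval wide:

```python
# Toy range estimate for the export button: yearly value as a product of a
# few uncertain inputs. Every range below is invented for illustration.
import random

random.seed(1)

# (low, high) guesses for each input; sampled log-uniformly between bounds.
INPUTS = {
    "accounts_that_care":    (5, 200),       # accounts that would actually use it
    "value_per_account_usd": (100, 20_000),  # yearly retention/upsell value each
    "prob_it_matters":       (0.05, 0.9),    # chance it influences a renewal at all
}

def sample(lo, hi):
    """Log-uniform draw between lo and hi."""
    return lo * (hi / lo) ** random.random()

def draw_value(inputs):
    vals = {name: sample(lo, hi) for name, (lo, hi) in inputs.items()}
    return (vals["accounts_that_care"]
            * vals["value_per_account_usd"]
            * vals["prob_it_matters"])

def interval(inputs, n=20_000):
    """5th and 95th percentile of the simulated yearly value."""
    draws = sorted(draw_value(inputs) for _ in range(n))
    return draws[int(0.05 * n)], draws[int(0.95 * n)]

lo_v, hi_v = interval(INPUTS)
print(f"90% interval: ${lo_v:,.0f} .. ${hi_v:,.0f} per year")

# Crude sensitivity check: pin one input to its geometric midpoint and see
# how much the interval narrows. The input that narrows it most is your
# biggest blind spot - or the fundamentally unknowable thing.
for name, (lo, hi) in INPUTS.items():
    mid = (lo * hi) ** 0.5
    pinned = dict(INPUTS, **{name: (mid, mid)})
    p_lo, p_hi = interval(pinned)
    print(f"pin {name:>22}: interval is now {p_hi / p_lo:,.1f}x wide")
```

Log-uniform sampling is just a cheap way of saying "I only know the order of magnitude"; swap in whatever distribution matches how little you actually know.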