Hacker News

I've been developing software for over 20 years, and I still can't estimate how long something will take me when I've never done it before. This uncertainty needs to be treated as more than just a stick to beat developers about the head and shoulders with. Most of the time the PMs understand this, but there have been many projects where they just don't get it. I have suffered great anxiety from being forced to give estimates when the truth is I have no clue: it depends on how easy the work turns out to be and how many unforeseen issues I encounter. It got so bad that once my husband asked me how long it would be before I was done cooking something, and I practically had a meltdown. That's when I knew it was time to leave that team. Can we stop pretending we can forecast the unknown?



No.

Even bad estimates are better than no estimates. If you are having meltdowns, your reputation is being tied too closely to your ability to give estimates.

You must never turn estimates into a promise, always remind people they are estimates.

Want to give fast estimates? Here’s how:

1) First, determine the scale of the task. Is it a year, month, week, or day kind of task?

2) Then, it’s just 3 of those units. The smallest task takes 3 days. One day to completely fuck up, one day to figure out why, one day to get right. The longest takes 3 years. One year to fuck it all up, one year to learn why, one year to finish it.

I suggest never giving estimates in units smaller than a day; they just become noise. If a task is smaller than day scale, just say the task is too small to provide any meaningful estimate but won’t take more than a day.


> Even bad estimates are better than no estimates.

No estimate is clearly better. Here's a common story I've seen across multiple companies.

1. Marketing management asks Engineering management how long it takes to do feature X so they know when to launch the online ad campaign.

2. Engineering management then asks potentially good coder how long it will take. Coder replies with a time and "it's just an estimate."

3. Engineering management reports the coder's estimate to Marketing, leaving off the most important caveat, and Marketing treats it as gospel truth.

4. Coder takes longer than expected because of some bad technical cruft that some other engineer put in, because that engineer was rushed or just plain inept.

5. Marketing is pissed because they now have to withdraw the ad campaign, and starts blaming engineering.

6. Under increased scrutiny, Engineering management gets a bad reputation and throws the coder under the bus in front of Marketing and the other managers.

7. This shows up on the coder's annual review, and the coder leaves.

8. Engineering hires a replacement, who will have a 3-6 month learning curve and may well write worse code than the person who just left.

EDIT: The point is that if there's no estimate, management has to deal with the uncertainty that the coder experiences. Hope for the best, plan for the worst.


What is "plan for the worst" in a scenario with literally zero estimate?

“It may take 0 to 3 years” ?

“We literally have no way of knowing?” “Not even a ballpark?” “No.”

This is what you’ve proposed with "no estimate", and it seems extremely unhelpful towards the goal of helping all groups have at least some idea of when certain “next steps” can be accomplished.


I work as a sound mixer for film, and estimating how long something will take is always hard, but I never really got a bad reception when I just said: I can't tell you unless I see the thing.

Hell, if you ask a mechanic to fix your car, they will also have to check the thing first before deciding how long it is going to take.

This is the professional thing to do: gauge the situation, take your time to figure out the scale of the thing for as long as you need, and then give a pessimistic guess with a disclaimer that things can easily get out of hand, through nobody's fault, if unforeseen problems arise.


Right, but there's a big difference between "Don't estimate until you've done your due diligence" and "Don't estimate".

It's perfectly reasonable to say "This is a big project, it'll take me a week to know where we stand"- there, you've provided an estimate of the scoping task and promised an estimate in the future as well.


My advice would be to live in the real world. We don't know how long it is going to take. Just like if you go to turn on your car and it doesn't start. Maybe it was a minor issue and the next time you turn the key it will start. Maybe there was a short circuit and the car is totaled. A passenger asking you when you are going to get moving is no help.


If your car won’t start, it will take more than a second, but less than a year to fix. That is a bad estimate, but better than no estimate for a space alien that doesn’t know what a car is.

PM’s are space aliens.


The only tool the space alien's PM has is a deadline: "Do X by this date or else." Because he has no hope of understanding the true problem domain before the project deadline -- just like a space alien can't be expected to learn English in 4 days.

A PM with a deep understanding of the software process can ask insightful questions and identify, and possibly mitigate, many of the issues beforehand. So by the time work gets to the software lackeys, many of the resource/architectural issues may already have been solved.


> If your car won’t start, it will take more than a second, but less than a year to fix. That is a bad estimate

It's actually a better estimate than most software estimates, because it isn't just an expected time but a range that results will usually fall within. It would be better still if it included an explicit degree of confidence that the actual result would be in the range, and if it was centered around the average time it would take for events in that class.


In these cases it’s often a matter of ‘give me a week and I’ll tell you’.

Then again, a similar number of times, the only information you get is ‘we need a chat program, can you please estimate how long that will take?’, which leaves it forever impossible to estimate.


> “We literally have no way of knowing?” “Not even a ballpark?” “No.”

That's when the engineering lead or whoever was giving that as the answer is told that his services are no longer required.


Oh come on. Any decent project manager understands the difference between an estimate and a deadline and plans and communicates accordingly. It's not rocket science. Stuff gets shipped on time all the time.


Any decent project manager who wants to keep their job will acquiesce to people further up in the org chart who want deadlines, not estimates, and who consider the difference between the two to be as fuzzy as it needs to be.


> Any decent project manager who wants to keep their job will acquiesce to people further up in the org chart

In my book, that project manager is not "decent". A decent project manager would recognize the situation for what it is and leave.


Then there won’t be any good project managers, as I’m fairly certain most executives do this shit.


That's the problem I've been seeing in non-tech-based companies. IOW, when tech isn't what they're selling.


A deadline is a requirement. It is not a PM’s job to reject requirements, though it is their job to help the customer understand and manage resolution of situations where the requirements given are in conflict, such as when a deadline is insufficient to achieve the functional requirements.


And any decent engineering team will accept that a project has deadlines and raise progressively more serious risks with appropriate explanations up the chain as the confidence that completion will occur before the deadline declines below near-certainty.

Deadlines can be legitimate project requirements as much as functional requirements are. Identifying practical conflicts between requirements is part of what an engineering team does. Managing resolution of those once they are identified so that the customer gets the best result achievable is what a PM does.


>who want deadlines, not estimates, and who consider the difference between the two to be as fuzzy as it needs to be.

This is exactly it. Well said!


There are two issues, right?

1. Inability to estimate effort. Admittedly an academic issue, and should be taught to all engineers in college.

2. Inability of management to deal with delays due to bad estimation. This might be caused by a "bozo explosion", say, (where inept managers hire more inept people underneath them.)

Edited to add:

3. Why do we keep making the assumption that management must be infallible? In any dysfunctional organization, it might take just one bad manager high up in the chain to cause pain for the entire organization beneath him.


I cannot believe how much I love the term "bozo explosion". I see this situation all the time as a consultant, and now I have a fantastic term for it. Thanks for sharing!


I'm a consultant as well. You go through enough companies and you can see how much of a problem this actually is.

As much as I did not particularly care for the management style of Steve Jobs, as far as I know, he was the one that first used the term.

https://guykawasaki.com/what-i-learned-from-steve-jobs/

Edited to add:

This just happened recently. Maybe you'll appreciate it.

15 years ago or so (before the term "technical debt" got widespread usage), management kept asking why we were working on so many bugs. The solution was to label them as "enhancements".

Just recently within the past year, I saw the exact same thing happen again.


> If you ask customers what they want, they will tell you, “Better, faster, and cheaper”

Amazon took that to heart. Seems to have worked out for them.


They did? In what way? When I look at the online ecosystem for buying things in the EU, Amazon is consistently the one that can't promise I will get what I order, can't promise when it will come, and can't make it easy to give them money. The only thing they do right is that they're theoretically cheaper, but because you might get a product that doesn't work (and it's very hard to resolve that issue to the standard you'd expect as someone in the EU), the cost-saving benefit goes right out the window.


They were the first that made online retail work. Plus there's the whole AWS thing.


I want to work where you do, where friction is negligible, cows are perfectly spherical, wind resistance is never a factor, all functions are continuous and differentiable across their entire domain, and all project managers are decent.


I'm not saying the story is implausible, but I dispute the idea that estimating software development efforts is an intractable problem that is better left unattempted.


Estimating things is important and valuable, but as soon as large corporate structures and multiple levels of communication get involved, it becomes a really bad idea. That's why it shouldn't be attempted in those kinds of situations.


Please don't assume decent project managers. That's how you get in trouble.


> Hope for the best, plan for the worst.

Oh dear, have you ever spoken to a finance department before?

Here's a common story I've seen across multiple companies[0].

1. The Finance department allots the Marketing department a $500k budget.

2. The Marketing department blows through the $500k budget on the engineering department and has no working app to speak of.

3. The Finance department goes back to the Marketing department asking what happened to all of the money they gave them.

4. The Marketing department says "well, engineering told us it was agile, which meant they didn't know when it would be done or for how much"

5. Finance department: "Ya, you're not getting a budget ever again from us"

> No estimate is clearly better.

Sorry, but this is not how the real world works. Agile assumes a perpetual budget, which is not realistic for most businesses in the world.

[0] - Personally seen among 100+ companies and counting.


Doesn’t agile presume you have some deliverables every week? At the very least they’d have ‘something’. It might not be fit for purpose, but that should have been fairly visible a ways before they blew through the whole budget.

Unless, you know, there are other systemic issues in the company.


I'm sorry, but if you're spending $500k or more based upon one engineer's "estimate" (for which you're paying $10k/month) then something is more fucked up in that company than what appears on the surface.

But I agree with you, if the engineering management can't budget and prioritize work to get done, that's a larger issue.


> but if you're spending $500k or more based upon one engineer's "estimate" (for which you're paying $10k/month)

The $500k was just a random number I came up with. Budgets wildly vary based on whether they get allocated weekly/quarterly/yearly/etc. And also it's never "one engineer's estimate", it's usually a project manager who works with N number of engineers to come up with an estimate.


And that's the way it should be.

But keep in mind that if a developer is in a sprint, he might start adding tickets to the epic because of technical/organizational issues. Suddenly the epic might look like 3x the work that was originally planned. Note this is not theoretical, as it's happened 3 times to our team already this year.

But meanwhile in the example I gave, the developer gets held accountable still for not making the correct estimate. Management just passes it up the chain rather than trying to increase the confidence around the estimate.


It's even worse in my organization. Funding is based on projects so the amount of funding you receive is based on exactly how long your estimate is (if they choose to fund your project). Often projects are estimated based on "here's how much money they are willing to spend, so we'll just figure out the number of man hours that amounts to and give that as an estimate."


> No estimate is clearly better. Here's a common story I've seen across multiple companies.

As strange as it may sound to developers, the salary is paid by the customers, current or future.


I’m sorry for your bad experience, but some companies are able to do this well.


>1. Marketing management asks Engineering management how long it takes to do feature X so they know when to launch the online ad campaign.

Why can't Marketing just wait until the feature has been built to launch their campaign?


Because campaigns have to be planned ahead too - anywhere between a few weeks and a year or so, depending on the size of the project and the company, and often timed to match big trade events.

And there's always the possibility that if you announce too late competitors will eat your lunch.

One way to handle this is to spend some time on a formal estimate. Two to four weeks of R&D to scope a project can help narrow estimates to something approaching realism. You'll still be wrong, but you're less likely to be hopelessly wrong.

Asking someone for an instant opinion is madness. That's not an informed estimate, it's just a guess, and usually worthless.


Sounds like marketing needs to be agile


Because if they launch on the original schedule with Partner X they get Massive Benefit Y that could make or break the product. For example.


This. Most developers either don't want to or can't understand that there are real and valid reasons for wanting a predictable software production schedule.

That said, I've only ever seen one software project consistently meet production deadlines. Is there benefit to committing to the original schedule with Partner X if there's no way you can deliver on schedule? Or is it one of those things where Partner X has committed itself and they have no real choice but to work with the sliding deadlines?


In the example I'm thinking of the partner wasn't really bothered that the schedule slipped, but the ideal marketing window came and went before the product officially launched, which certainly affected sales. They were really excited that all the planned features made it to market though.

I'd say that some aspects of the capability maturity model make sense, even for engineering groups that practice agile day-to-day.


> the ideal marketing window came and went before the product officially launched, which certainly affected sales.

Definitely. I used to see that all the time in the games industry. Getting the right launch window is critical because most of a game's sales happen in the first few days of launch. And yet, games are famous for shipping months, even years late. They usually seem to make it work. I guess you'd be particularly hosed if you couldn't afford the burn rate for the extra X months it would take to get to a decent launch window.


Lots of reasons:

1. Marketing management is full of idiots.

2. The coder's estimate is used to tell the CEO when a product will be rolled out. Who then questions why it didn't get rolled out on time.

3. Marketing gets a bonus for rolling out ad campaigns on time.

4. Marketing is using this for their big trade show coming up and wants to make a big announcement for maximum impact.

Why do you assume management is always competent?


I think this comes down to the idea that estimates are really for coordination.

At least, I think that’s the best reason for using estimates. I wrote something about this here: https://riskfirst.org/Estimates


Actually I don’t really understand why all the HN posts on agile just devolve into a discussion about the difficulty of estimating.

Surely there’s more to it than that?


Well, standup is also terrible. In addition, there's the general agile assumption that developers are capable of consistently producing X hours of work per day. Maybe it's just me, but I'm a bit more burst-y with my work habits than that.


I wouldn’t say Standup is terrible but one where you go round the room giving updates is almost useless. It’s not just you, most developers can do 8 hours work in an hour if they aren’t interrupted with meetings.


Shouldn’t it average out over 2 weeks?


Not if you're doing standup every day and expected to report daily progress.


Because the superbowl comes once a year -- seasonality matters!


> You must never turn estimates into a promise,

Whoever asked you for that estimate will do that for you…

> always remind people they are estimates.

…even if you remind them. I mean, the reasonable ones won't, but many people aren't that reasonable.

I was in an interview last week where they told me the teams were strongly committed to the features they planned to do each sprint, which I interpreted to mean that their estimates are promises.

> Want to give fast estimates? Here’s how:

My, this looks like it could actually work. I'm going to try it right away.


It’s funny, because even with capital-A Agile / Scrum, the term commitment got changed to forecast in 2011 [1]. But then many companies who are dogmatic about Agile still use estimates as commitments anyway. That makes technical debt really hard to avoid... One solution is to create technical-debt tickets in the backlog so people become aware of it.

1. https://www.scrum.org/resources/commitment-vs-forecast


>Even bad estimates are better than no estimates

Absolutely not! An estimate should not be a random number; it should be constrained by the available data, however little there might be. If you feel that you don't have enough to form a "guesstimate", do not give me a number, but first work on finding the data which will enable you to form a proper estimate.

Once you give an estimate, no matter how many times you explain that it is a "guesstimate", people tend to lock on to the given number. It then becomes a real battle to explain the hurdles (and there are always some unknowns) while revising the original estimate. Soon mutual distrust develops between the implementation engineers (stressful and detrimental to actual execution) and management, leading to everybody losing faith in estimates. Agile/Scrum have exacerbated the problem with their short time windows and sprints. In one team that I was on, people just gave up and started quoting 2 weeks for any and every feature, trivial or non-trivial, and the whole exercise became meaningless.

PS: The book "How to Measure Anything: Finding the Value of Intangibles in Business" is worth reading to get some ideas on how one might do proper estimation.


> You must never turn estimates into a promise, always remind people they are estimates.

The person giving the estimate isn't the one who does this. Other people turn them into promises, because that's what they were actually asking for when they asked for "an estimate".

Giving some kind of uselessly vague estimate isn't particularly useful from an engineering perspective, and everyone else has been trained by scrum over the last decade to treat estimates as promises. So don't do the thing that you know will have a bad effect.


> The person giving the estimate isn't the one who does this. Other people turn them into promises,

Why is that your problem? If someone tried to hold me to such an estimate, I would simply say "I never promised this, in fact I explicitly didn't promise this. I'm sorry X lied to your face about this." Don't own other people's failures.


> Even bad estimates are better than no estimates.

I disagree. The only bad estimates that are ok are the ones that err on the high side. I have found that the biz side may not like the longer estimate, but they much prefer that over missing a date.


Right. If they asked for a 95% confidence number, they would get something they could depend on more wholeheartedly. But that date will be too far in the future to be comfortable.

So they look for more of a natural median. And then are shocked when dates are missed more than half of the time (!).

So if you want honesty, ask for estimate ranges or 95%-sure estimates. Then Pareto your plan to hell and back so everyone understands the risk. Then re-estimate periodically as you home in on completion.
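To make this concrete, here is a minimal sketch of what a 95%-sure estimate could look like: a Monte Carlo sum over per-task (best, likely, worst) guesses. The task names and day counts below are invented for illustration.

```python
import random

# Hypothetical per-task guesses as (best, likely, worst) days.
tasks = {
    "auth": (2, 4, 10),
    "billing": (3, 6, 15),
    "reporting": (1, 3, 8),
}

def simulate(n=100_000):
    """Sample each task from a triangular distribution, sum across
    tasks, and return the sorted project totals."""
    return sorted(
        sum(random.triangular(best, worst, likely)
            for best, likely, worst in tasks.values())
        for _ in range(n)
    )

totals = simulate()
median = totals[len(totals) // 2]
p95 = totals[int(len(totals) * 0.95)]
print(f"median: {median:.1f} days, 95% confidence: {p95:.1f} days")
```

The gap between the median and the 95th percentile is exactly the discomfort described above: the honest "safe" date sits well past the date half the simulations finish by.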


Great, you know how you work best.

There are other ways. The best of humanity have a wide range of methods they use to get shit done at a level most of us dream of.

So I have a hard time taking a straitjacket process like you suggest as some sort of panacea. It's essentially another manifesto. What's the expiration date on this one?

Everyone goes on about the value of writing less code, and here we are incentivizing manufacturing line processes for crafting more code?

Note the authors of things like The Phoenix Project: rich Silicon Valley types. We're just discussing how to be better assembly-line workers for the tech aristocracy.

Figure out how to build your business in whatever way works for the business. Google didn't get big because it has perfect process; it can be, and often is, a mess. It got big by solving its problems and monetizing the solution (every company needs to search digital files, send email, edit docs, etc.).

Figure yourself out and compare with others. There is no one-size-fits-all way of thinking about problems, and even the biggest winners don't have all the customers there are. Why isn't Gmail the only email solution?

There’s never a perfect process.


I do something similar, except don't assign to a number. My back of the envelope estimates are: hours, days, weeks, months.

This gives people enough information as to whether or not a change is worth it. If the difference between, say, 2 and 5 days makes or breaks the feature it's probably not worth exploring in the first place.


>You must never turn estimates into a promise, always remind people they are estimates.

Herein lies the problem. Many companies that do "Agile" fail to realise that estimates are just guesses; they're not accurate, and yet companies keep taking these estimates and holding developers to them. That's the real problem.

It's all well and good to say remind people they're only estimates, but many of us who have been in this industry longer than a minute know that estimates, nine times out of ten, are taken as promises, and that's when we get crunch and burnout as developers are forced to achieve the impossible with excessive hours.

Deadlines need to drop dead. Code quality suffers when you put a timeframe on it.


Features and stories should be completable in the span of an iteration.

Epics and Features are prioritized by the business and product. They can be expected to slide from iteration to iteration.

Any User Story or task should be expected to be completed in a single iteration.

If you keep having open stories, stop bringing in complete stories. Just assign tasks.

Features and Epics are pointed.

Business looks at points earned per year or quarter. Look for trend.

IT leadership looks at the number of open stories and tasks that were moved from an iteration back to the backlog, over the span of all iterations.

This will ensure teams are breaking down work into actionable goals.


Good lord! Buzzword Bingo and Obfuscation at its finest!

Nothing personal, but you have just confirmed to me why I loathe the Agile/Scrum/[fad] processes so much. Take easily understood, commonsense ideas/terms and invent fancy names for them, convert heuristics into axioms, and sell a business around it.


So said the 2 day scrum master


> Even bad estimates are better than no estimates

Bad estimates are worse than no estimates. But if you are doing work-complexity estimation and measuring velocity (which you need to do to evaluate internal process changes, manage workload, and for lots of other purposes), you are incidentally gathering the info you need for excellent estimates. Those aren't necessarily hyperprecise; they are excellent because they can explicitly quantify uncertainty as well as mean expected delivery time.


> Can we stop pretending we can forecast the unknown

Within reason. I've worked in orgs where there is no estimate at all, and that brings a different set of problems (unbounded projects and no work getting done because of the complete lack of pressure).

Now you're totally right: software engineers rarely do the same thing (or even similar things) twice, so estimating is somewhere between "very hard" and "impossible".

"Scoping" however is still needed.

If you ask me "How long will it take to add a button to this page", that's a very ambiguous question.

I can solve the problem in a few ways:

- Tack on a plain HTML button element.

- Add a button from an existing styleguide/library/design system.

- Build our own button component instead of reusing one.

- Build a full-blown button editor that allows a non-software engineer to use a WYSIWYG editor to create their own custom button and insert it on the page without code.

Now obviously there's a range between these, going from a few seconds (plus deploy), to months or years of work.

The project/product/whatever managers and the tech leads/engineers need to work together to agree on scope. An arbitrary deadline can be picked, and then the engineers do whatever they can up to that deadline. Or we can agree on minimum functionality, with an unbounded deadline, that MUST be built regardless of how long it takes. Realistically, successful projects will be a mix of both, along with various forms of padding and revisiting estimates along the way to account for unknowns and mistakes.


> because of the complete lack of pressure

I would say it may be due to more of a lack of focus.


Estimating is either easy or impossible. You either have done it repeatedly and have data to show how long it took, or you have no fucking clue.

Tackling the most important features of each module first, and then the most important of the next, and the next, with an eye on progress towards something useful, is the only way to go.

When you have a consistent team and some tracking data comparing task size to hours required to complete, then you can start doing estimates for future work. But you never have to ask the dev team to estimate, other than setting some relative size between tasks and trying to stick to the sizing system. I prefer Fibonacci story points, but it can be anything that has a number.
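As a sketch of what that tracking data buys you (all numbers below are made up for illustration), once a team has logged (story points, actual hours) for finished tasks, you can quote a range for new work from historical hours-per-point rather than asking anyone for a number:

```python
# Hypothetical history of (story points, actual hours) for finished tasks.
history = [(1, 3), (2, 5), (3, 10), (5, 14), (2, 7), (8, 30), (5, 19), (3, 8)]

def estimate(points, history):
    """Quote a range for a new task from the 10th-90th percentile
    band of historical hours-per-point ratios."""
    ratios = sorted(hours / pts for pts, hours in history)
    lo = ratios[int(len(ratios) * 0.1)]
    hi = ratios[int(len(ratios) * 0.9) - 1]
    return points * lo, points * hi

low, high = estimate(5, history)
print(f"a 5-point task: roughly {low:.1f}-{high:.1f} hours")
```

The range widens or narrows automatically as the team's actuals get more or less consistent, which is the point: the developers only ever size tasks relative to each other.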


It's more about what is BEHIND the button.


The Why rather than the What. Maybe they don't even need a button.


> unbound projects and no work getting done because of the complete lack of pressure

Actual demos seem to help a lot here in my experience.


I feel as though you should be able to provide an estimate even if it's something that you have not completely done before. One should spend some time gauging how much of this new thing is really "new" and what parts should be easy to figure out. Then, try to look at resources about those unknown parts, and that should allow you to provide a rough estimate.

And when roadblocks come up, just communicate early, and then if the PM/boss doesn't understand, there is not much you can do but look for a better gig.


Great! Please let us know when cancer will be cured.

In other words, "I don't know" is sometimes a complete answer. I don't know. Period.

This is supposedly one of the main points of this religion (agile). Some things are unknowable. So you move ahead a bit and reassess. If you are scrumming, that is usually 2-3 weeks (and don't get me started on that arbitrary limit). But you try to move forward for a bit, learn something, and then that might:

1) let you know enough to know how to estimate

2) give you some information on what not to try next (I tried a, b, and c, and they all failed)

3) give you guidance on what to work on next (d seems promising, I can flesh it out more and see if it still seems like a promising avenue).

You proceed on, and eventually either get your hands around the problem, or the person writing the checks decides this is not an economical search (because that is what this is: a search on a multidimensional surface) and changes/deletes the requirement.

That's what agile is supposed to be, with a nod to the fact that yes, we can estimate writing a single database query or something fairly well, so in some cases estimates can be useful at this scale.

The bane of my existence is the endless pressure for estimates. I'm doing research; no one has done this stuff before. It is truly unknowable. If it were known, it would be in a paper somewhere, and I would merely be implementing that paper. So I get told "break it down into smaller chunks", as if my 30 years of success didn't teach me how to break down problems. Thanks, PM who has never coded or produced anything intellectually novel! I'm surely just being dumb and/or obstinate!

I got that written in my previous performance review, that I don't know how to plan and break down problems, because I flatly refuse to play this game. You get punished for trying hard things. It's nonsense. "I don't know" cannot be changed by insisting on an estimate.


Insisting on the estimate in that situation is basically a special case of the old trope of using something meaningless but easy-to-measure as a proxy for something important but hard-to-measure. It's hard to know how long something will take, and easy to ask someone how long they think it will take (or if it's you, to pull an estimate out of your butt).


> Please let us know when cancer will be cured.

An absurd comparison given most software engineering tasks have been done before, they're simply difficult to estimate for some given team without expertise in doing some particular task.


TBF, GP said they were doing research, so it’s entirely possible that they’re working in a space with little to no prior art.


More than a day but less than a millennium. I am 99% certain on that.

Granted that’s more broad than a PM (or humanity) would want, but it is SOMETHING.


Assuming you are not being sarcastic, a quote from another post:

>the old trope of using something meaningless but easy-to-measure as a proxy for something important but hard-to-measure


>I feel as you should be able to provide an estimate even if it something that you have not completely done before. One should spend some time to gauge how much of this new thing is really "new" and what parts should be easy to figure out. Then, try to look at resources about those unknown parts, and that should allow to provide a rough estimate.

This isn't personal, just a comment on a way of thinking that is not uncommon: each of those sentences is insane and not based in reality. You may "feel" these things are possible, but that's about as far as it goes. "Just look up all the unknown stuff." OK.


I wish people were more comfortable saying "I don't know, but I can do some research and get back to you later." Often that's where the conversation about time estimates ends, because they don't ask you again unless it's actually important. And if they do ask you, now you have a better answer (assuming you actually looked into the problem you're trying to solve).

The other problem with time estimates that I find is that even though I might know for sure that a certain feature will take only a day or two, I can't tell when it'll be done because I usually have other things to work on as well.


Well here's the thing. Often I'm asked for my estimate in the same meeting I first see the story. I have no way of researching before giving an estimate.

> ... and then if PM/boss don't understand there is not much you can do but probably look for a better gig.

If only it were that simple.


> Often I'm asked for my estimate in the same meeting I first see the story

That's definitely awful. Personally when I was leading agile teams, for anything non-trivial we'd start by creating an issue to scope the individual parts of the project, frequently allocating several days to do so.

And some stuff is still mostly unscopable. That's where Fermi estimation and the whole "how many chips can you fit in the Empire State Building" type of deal comes into play. You have no way to really know how long something will take, but we need to allocate resources SOMEHOW, even if it's completely off. An imperfect guesstimate is better than none at all.

The catch is that everyone involved has to know that it's an imperfect (and potentially completely wrong) guesstimate, and it has to be revisited regularly as new information comes in. Everyone also has to be OK with restructuring the project (or even cancelling it, in extreme cases!) if we learn it's completely wrong.

We recently discovered at work that an assumption/decision made nearly 2 years ago turned out to be totally false. The project that relied on that false assumption was about 10% in (the assumption was made long ago, but actual work started recently). We had to sit together and ask ourselves if it was actually worth pushing through the remaining 90% (and likely regretting it in a year), or agree to scrap the 10% and start over now that we know what we're doing. We have to be careful not to pull off a Vista/WinFS resource black hole though!


It's even worse when your lead is giving timeframes for you...


I used to think that too, but is it really? If estimates are always wrong, then is there really much of a difference? Besides, the lead often has business related information that informs his decision of "this feature must take no longer than X amount of time or we're going to have problems with Y" where Y is often political and not technical.


It just goes against agile development so it seems like if we're doing that, we're not doing agile. Then again, nobody is it seems? :)


I remember when Agile development used to involve chickens and pigs. If you have no idea what I'm talking about, then you're not doing Agile as it was written. Almost nobody is today.


I don't agree with you: you're presenting exactly the position that people here are trying to put into words why it doesn't work (and often hate about their jobs).

But downvoting you is just as counterproductive as burning the devil's advocate and you have my upvote. I hope people will get a hold of their emotions.

I've just spent a month doing a script I thought I could do in a weekend. I didn't realize several of the difficulties inherent in the data, I didn't realize I'd have to up my game with regard to techniques in shuffling data and I didn't know I'd run into so many quirks in the programming language.

I didn't know what I didn't know. That's why I guessed wrong.

In everything you haven't done before, you don't know what you don't know, and only a small portion of those things could have been realized by spending some hours researching the work I needed to do. The rest didn't pop up on my radar until I had to solve them :(

But that knowledge isn't without value. Knowing that it is doing stuff you haven't done before that is risky (timewise) means you can underscore when you have a high-risk assignment.


I'm not a coder, so maybe the domain is different in a way I don't understand, but I agree with you 100%.

Refueling a nuclear aircraft carrier has projections from start to finish, half a decade long. There are countless prerequisites and corequisites with interrelated projects, not counting mundane issues like material and manpower.

I simply do not accept that it is impossible to project a timeline for software. If someone stops you in the hall and says "hey you, how long will this take?", that is not a reasonable question, so sure, you can't give a reasonable projection. But if you're a professional coder and you show up to a planning meeting and your answer is "I don't know, I have no idea, and it's not possible to provide an answer even to an order of magnitude," that's just ridiculous.


> Refueling nuclear aircraft carriers have projections start to finish, a half decade long.

The difference is that those projections weren't spit out on the fly by an engineer after a 15-minute pitch of some manager's great new idea. Those plans took many months of effort to put together by a team of specialists planning out the various details of the project. Software does have the equivalent of this; it's called waterfall development. The reason we don't do this is that, unlike aircraft carrier design, if you take six months to create a project plan, your requirements will likely have already substantially changed.


I'm not a nuclear aircraft carrier refueller, so maybe the domain is different in a way I don't understand, but it's ridiculous to need a five-year plan just to refuel something; people refuel cars in five minutes every day.

I simply do not accept that it is impossible to refuel in a week or so, assuming it's a couple orders of magnitude more complex than refueling a car.

That's about how your comment sounded.


+1

However, your comment didn't help me understand. It doesn't help because even if I embark on something I have no idea about, even in total ignorance I can _bound the problem_.

I don't understand how a professional coder can approach a problem and have no idea at all. You have to DO the problem, so what is your approach? Just start coding, and somewhere between 5 days and 5 years you stop?

Planning meetings don't happen in a vacuum, so what kind of problem can you research but have literally no guess about its solution? Bear in mind, we're not talking edge cases (research grants or whatever), but coders hired to do a job.


>i can _bound the problem_

This is the nub of the problem when it comes to software. I understand your skepticism but software development is really very different from other activities. It is "mind-stuff" (and thus quite unstructured) which needs to be expressed in very precise language to solve an [almost always ill-defined/constantly redefined] problem. The inherent complexity involved is huge due to the number of degrees of freedom and malleability involved.

I can do no better than point you to the article "The Humble Programmer" (http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...), written by one of the pioneers of Computer Science, to really understand the issues that make software development such a complex and difficult activity. And since then (the article is from the '70s) we have made matters orders of magnitude worse.


The top answer to "Why are software development task estimations regularly off by a factor of 2-3?" on Quora is a great analogy.

https://www.quora.com/Why-are-software-development-task-esti...


Here's an analogy.

The spell for floating an object is Wingardium Leviosa. It usually takes an 11-year old a few days to become proficient at it.

Please give an estimate for a modification of this spell that will make the object lift, complete two vertical circles (720 degrees) and then go back to the starting point.

(Before you ask: yes, we're doing magic. We're combining words in specific ways to create significant effects in the world.)


Software development is filled with fractals. To do A, you break it down into A1, A2, and A3. To do A1, you break it down into A1.1 and A1.2. To do A1.1, you break it down into A1.1.1 and A1.1.2.

In even a small project, that means that your moment-to-moment work might be something like "do task A4.1.2.3.5.3.2.1.5.4". Not literally, but conceptually, that's what's happening. Every task involves a bunch of other tasks, which involve a bunch of smaller tasks, ad infinitum.

This is probably similar to your refueling of a nuclear aircraft carrier. Big tasks involve little tasks, which involve smaller tasks. I assume. I've never refueled a nuclear aircraft carrier. But I have developed a lot of software.

Every software project is working out a plan that has never been worked out before. Because when it is done, the result is software that is infinitely reproducible, so there is no need to do it again. It's as if somebody needed to figure out how to refuel a nuclear aircraft carrier once, and after that, everybody who wanted to refuel just cut-and-pasted a fully fueled carrier.

In many ways, writing software is like figuring out that plan. It's not following the plan, it's creating the plan. And somebody who knows how to do A can figure out that involves A1, A2, and A3. And they can probably figure out that A1 involves A1.1 and A1.2. But they can't predict all the way to A2.3.1.5.2.4.2.2.5.1.1.1.5.2. (It's been tried. It didn't go well. Google "Software Crisis.") Too many issues don't appear until you try to solve them for real.

And those little edge tasks way down at the bottom? They might take five minutes. Or they might turn into another nested set of tasks that takes five hours. Just today, I was working with a team trying to solve a simple problem: print the URL of their current web page. This was no problem. The tool we were using to serve web pages told us the current URL. But it only told us the URL's path (the part after the domain name). We also needed the scheme ("https:"), the domain name ("news.ycombinator.com"), and the port (":443").

And that wasn't something our tool expected us to want [1]. So a five minute task turned into a half-day marathon of reading documentation, trying things, and reading more documentation. It took us half the day to figure out how to do something that should have taken us five minutes, and we had assumed it would only take five minutes when we estimated the larger task two weeks ago.
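The five-minute version, had the tool simply exposed the pieces, might have looked something like this sketch (all names and values here are hypothetical stand-ins, not our actual framework code):

```python
from urllib.parse import urlunsplit

# Hypothetical stand-ins for what the framework would ideally provide;
# in our case it only exposed `path`.
scheme = "https"
host = "news.ycombinator.com"
port = 443
path = "/item"

# Omit the port when it's the default for the scheme.
default_ports = {"http": 80, "https": 443}
netloc = host if default_ports.get(scheme) == port else f"{host}:{port}"

absolute_url = urlunsplit((scheme, netloc, path, "", ""))
print(absolute_url)  # https://news.ycombinator.com/item
```

The half-day was all in discovering which framework objects actually held those four values.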

Coders who know professional estimating techniques approach this problem by using Monte Carlo simulations that provide a probabilistic range of dates. The high-confidence numbers resulting from these simulations are usually way too far in the future to satisfy stakeholders, because the simulations have a long tail. (More can go wrong than can go right.) Professionals have found it's often easier to refuse to provide estimates than to fight over high-confidence estimates or educate stakeholders in interpreting probabilistic date ranges. Not estimating saves lots of time, too.

I hope this long-winded explanation is the help you were looking for.

[1] For the nitpickers in the audience, I'm obviously leaving out a huge amount of detail about how our REST API was actually interacting with its framework. But that's the gist--we were trying to find a clean way to translate our current absolute URL to another absolute URL. Even now, I'm sure we were missing something obvious.


This is a much better explanation of the problem with estimates than my analogy :)

Especially this part:

Professionals have found it's often easier to refuse to provide estimates than to fight over high-confidence estimates or educate stakeholders in interpreting probabilistic date ranges. Not estimating saves lots of time, too.


False equivalence.

You're planning 'how to put fuel in a nuclear aircraft carrier'.

For most software projects the equivalent question being asked is 'can you put some fuel in this thing we have'.

When you ask things like: what kind of fuel? Is the thing a container or a vehicle? Etc.

The response is often 'isn't it obvious, you're the developer you should know'.

Can you tell I'm in the middle of training coworkers to offer up proper requirements?


Very true; to expand on this metaphor, past experience has led me to ask "does this thing actually need fuel, or is it in fact a bicycle?"; aka: please give me the full context: what are you actually trying to achieve, and why?


I'm going to steal this if you don't mind.


feel free to


>I simply do not accept it is impossible to project a timeline for software.

Well, as you said, you're not a coder.

"Refueling nuclear aircraft carriers" has so fewer fail states and unknowns, it's not even comparable to writing a large piece of software.


> If youre a professional coder and you show up to a planning meeting and your answer is "I dont know, i have no idea and its not possible to provide an answer even on an order of magnitude" thats just ridiculous.

True, but it's because that question is often ridiculous.

If it's technology I've never worked with before, in a domain I've never worked, for which the test data doesn't exist yet, it's perfectly reasonable to be outside an order of magnitude.

E.g., we mostly do Web app development with React and Django, sometimes going into more complicated data processing and visualization. In a recent planning meeting we were asked about recognizing waterways from aerial photographs to improve local government's GIS datasets. Ummm...


> we mostly do Web app development with React and Django, sometimes going into more complicated data processing and visualization. In a recent planning meeting we were asked about recognizing waterways from aerial photographs to improve local government's GIS datasets. Ummm

To be fair, that's something that happens all the time in "normal engineering". While the field itself is much better understood (mostly because, aside from record-beating skyscrapers, it's usually within the same problem space), not everyone knows everything.

For a more down-to-earth example, if I ask my carpenter to scope out a bathroom remodel, the first thing he's going to do is call a plumber to take a look. The carpenter's not going to be able to give an answer on the plumbing themselves, but they sure as hell can say "I'm going to ask someone and get back to you on it".

As a software engineer, I have a network of connections I can reach out to if I'm asked about a problem completely out of my space, and you'd be hard pressed to ask about something that no one in my team knows or has a connection to someone who knows. Definitely won't get the answer today or even tomorrow, but we can get SOME information.


The domain is different in a way you don't understand.


The domain is different, indeed.

Refuelling a nuclear aircraft carrier may be a complex operation but it's one for which the plan was already developed quite some time ago.

To compare to software development is missing the point. Building a PC, installing Windows and all the apps you need might take half a day if you're experienced, and it'll take half a day each and every time: it's labour and you can't reduce the time taken to zero. That's like refuelling an aircraft carrier: executing a plan.

But software development isn't like manual labour where you execute a pre-determined plan. It's more like researching how to build the aircraft carrier. Once the plan (the software) is built, installing and executing it is trivial and takes very predictable amounts of time, which means software developers are almost always doing "research" even if it doesn't seem that way.

How long did the first nuclear aircraft carrier take to design? Well, literally the first sentence of the Design section of the wiki article for USS Enterprise says:

Enterprise was intended as the first of a class of six carriers, but massive increases in construction costs led to the remaining vessels being cancelled

So I guess ship designers suck at estimating about as much as software developers do.


One way software is different is that you're always doing things you've never done before. Because if you've done it before, you can just reuse that code.

This is probably not how the refueling nuclear aircraft carriers projects work.

That said, it's usually not quite as completely unknowable as your last sentence. Then again, it frequently is.

The real tension is about trust. When people think engineers are lying about their estimates.


I will say that in software you can estimate, but because there are many unknown things it's difficult.

It's a bit like "How long will it take to discover the cure for cancer?"


Yes!

"We have this problem, you see, where cells sometimes start to multiply aggressively instead of doing their job. That's a bug, the customers are complaining. How long will it take to fix it? What do you mean you don't know - not even a ballpark estimate?"


You refuse to accept that something might be impossible in a domain you have no understanding of... That sounds like a very illogical refutation in itself.


What you are describing are reasons why software engineering isn't Engineering.


Could you expound on this? I’m not saying I disagree, I’m just not sure which part of what he says disqualifies software engineering from being “engineering”


Start with the HUGE failure rate in software development. There is no provable reliability and/or costing, nor is there any form of standardization beyond RFCs and "best practices." One might even doubt that it's even possible for there to be standards like you see in capital-E Engineering due to the unique and complex nature of general computing systems.

To sum up, software development requires too much trust, way more than would be acceptable in an aircraft carrier or airplane or nuclear bomb. Or refrigerator.


What about Engineering (or construction, say) being hundreds of years old vs. software development only 50?

Not arguing with your main point, but curious to hear your thoughts.


Software engineering is clearly engineering. The main difference is simply that in the software world, engineers often report directly to people who aren't engineers. This basically never happens in other fields - e.g. all buildings, tunnels, railways etc are built by dedicated engineering firms founded and run by more engineers. The exception in software is of course the tech industry, which routinely pulls off engineering marvels.

Projects usually go wrong, or are "late" (relative to estimates engineers didn't want to give in the first place), when they're being closely controlled by people who are not engineers. The recent article on Berlin's new airport being a case in point, where the politicians tried to double its size after it started being constructed and the entire project collapsed in a heap.

Now imagine that happening all the time, every day. That's the enterprise software world.


I'm the same way but only about 15 years experience. I have a friend recently retired who has done a wide variety of software development since the era of punch cards (at least 40 years development) and he agrees with me entirely.

I'm not sure how people can provide any reasonable estimates, especially these days when technology is shifting even faster under your feet unless it's a clone of something you've already done in a specific set of technologies that haven't changed.

For me, I simply make a very conservative guess and use a 2x or 2.5x multiplier to be safe. I'm usually far ahead of schedule, but there have been occasions I was happy I added a 2.5x multiplier in. Everyone is typically happy... the fact is, my estimates are garbage.


Even those estimates are good for people who have no clue if something can take 2 weeks or 6 months. They're not garbage.


This is what I do! I disagree on 'garbage', though, I would say that you sound like you estimate very well, and that factor of 2.5 uncertainty is reasonable given all the unknowns in this line of work.


> Everyone is typically happy... the fact is, my estimates are garbage.

If people are happy with your estimates then they are good estimates.


You go to the mechanic and tell them your car won't start - nothing happens when you turn the key. You ask them how long it's going to take to fix and how much it will cost.

They don't assume it's the starter and tell you it will be $400 for parts and labor. They tell you it will be $150 diagnostic fee and the diagnostic will take two hours. Then they call you and tell you the cost to fix and time it will take.

For whatever reason, software engineers don't have the luxury of doing a diagnostic. We are made to guess up front and assume it's the starter, when we really have no idea what rat's nest is under the hood until we look.


> software engineers don't have the luxury of doing a diagnostic

These are called "spikes" in the agile community (no idea why), and "prototypes" elsewhere.

If you feel that the error bars are too wide on your estimate, you should build the minimum prototype required to reduce the uncertainty to tolerable levels.

I like to schedule these at least a sprint prior to kicking off the main task, so that you can benefit from the improved estimate accuracy when actually scheduling the thing.


Often, you think it is a car, and then when you start to work, you realize it is actually an octopus


I love this comment!

The analogy I've been using lately is that it's like estimating how long it will take to pack your kitchen when you are moving, except sometimes you open a cupboard door and there's another kitchen inside.


For whatever reason, software engineers don't have the luxury of doing a diagnostic.

That's not necessarily true. I've worked with very good, serious consultancies, and I've seen a quote for thousands of dollars for exploratory work to give an estimate. Of course that was for a million-dollar project; smaller projects might not have this opportunity.

Other companies had some interesting approaches where they gave order-of-magnitude estimates and then refined the estimate iteratively.


This was a good analogy!


Most time estimates aren't necessary in the first place and it's aggravating that we need so many of them. Yes it's mostly a stick to beat devs with.

But since folks reading this will probably need to make estimates anyway, this tends to be accurate: Estimate it'll take as long as the most recent similar thing you've done (real time, not heads down time). Resist the urge to trim out parts of the previous task that aren't related (you'll have new yak shaving) or related to mistakes you've learned not to make (you'll make new ones). If it's really different from anything previous, look for a time you had to learn something and implement a new system based on what you learned (you're allowed to be meta).


There's a problem with this. See JB Rainsberger's video:

https://www.youtube.com/watch?v=WSes_PexXcA


After many years in large infrastructure transformation projects, Johnny’s Rule of Threes now goes like this:

1. The first time you do something, you can guess how long it will take and what the detailed steps will be, but that is all. You should still plan it, but during execution of this phase you must take copious notes. Do not expect it to be correct.

2. After you have done something for the second time — using what you learned from the first time — you should end up with an accurate schedule. This schedule contains all of the steps required to do the thing along with accurate estimates of time and cost.

3. The third time you do the thing, you are able to do it to schedule and to budget.


I can see how this would work for software boutiques doing fairly similar projects each time, or projects with very similar parts, but not in complex products. Much less for software that does not use any kind of framework or follows any kind of recipe. Sometimes it's just step 1 in a loop.


I agree. My theory now is that it's due to the fractal nature of the tasks in software development. To design at a high level and break out the work, you have to make some assumptions about the ground-level designs (i.e. how individual components talk to each other). As you start implementing those, you'll often encounter unexpected constraints that require you to rework the high-level design. As you progress through implementation, you find more and more tasks that need to be done, and eventually the number of tasks generated levels off and starts to decay; it's only then that you really have an idea of how long you need (assuming no large bugs surprise you at that point).

I find even heuristics like double your estimate or padding it aren't useful because there's so much variance.


Flip it around: it's really hard to budget, plan for staffing, make commitments, etc. without the ability to give estimates of this sort. The best thing to give yourself when offering estimates of a difficult sort is the freedom to be wrong, and the ability to express how confident you are in such an estimate.


> Can we stop pretending we can forecast the unknown?

Except that you're wrong.

We predict quite well, thanks. The problem is that everybody ignores the predictions.

Most people predict approximately where the 50% probability is with maybe a little fudging. And they tend to be pretty good at it.

The problem is that everybody just adds those up. And that's the disaster from a statistical viewpoint.

Things can only come in so early, but they can come in infinitely late. A blown schedule on an early dependency throws everything out of whack far more than a late dependency would.

We can roll these up in a proper way. We can run Monte Carlo simulations and get "real" numbers. People have done this and the results are remarkably accurate.
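The "just add up everyone's 50% estimates" disaster is easy to demonstrate with a toy simulation (the distribution and parameters are my own assumptions, not anyone's real project data):

```python
import math
import random

random.seed(7)

# Ten tasks, each lognormal with median exp(1) ~ 2.7 days but a long right tail.
MU, SIGMA, N_TASKS = 1.0, 0.9, 10
naive_schedule = N_TASKS * math.exp(MU)  # sum of the per-task medians

trials = 20_000
late = sum(
    sum(random.lognormvariate(MU, SIGMA) for _ in range(N_TASKS)) > naive_schedule
    for _ in range(trials)
)
fraction_late = late / trials
# Far more than half of the simulated projects blow past the naive
# sum-of-medians schedule, because the tails only point one way.
print(f"{fraction_late:.0%} of runs finish late")
```

Each individual estimate is honestly at the 50% mark, yet the summed schedule is blown most of the time.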

The problem is that someone in management will always undercut the realistic estimate for personal gain. And then wind up longer than the estimate at the end.

And, the worst part is that this is RATIONAL. The only way to get a project completed is to start and get the sunk-cost fallacy rolling in the powers-that-be.


I think this is where a 'spike' solves your problem.

By creating a spike, you can do some initial investigation into how the code is written, what's involved, how long it will take, etc.

Once you have completed your spike, you can come back to the original piece of work and give a better estimate based on your findings.


In the perfectly spherical Agile world, yes.

In practice, for anything that's new, your project devolves into a set of spikes for development because your time-boxed investigation leaves unknown unknowns everywhere. Which is fine in that perfectly spherical Agile world, but it makes less-than-perfectly-spherical stakeholders very upset that they are not getting to use the Agile stick to beat over their developers' heads, as they have been trained to do.


Maybe a spike takes a week. Maybe a spike takes an hour. I can't estimate that.

Which would be fine, but then management or the PM expects me to loop back around when a spike is complete. I'd rather just continue engineering a solution when I'm done investigating, which is the natural progression of things.

I'd rather treat {get requirements, investigate, implement, debug, deploy} as a single atomic task, rather than splitting it across N meetings for planning the sprint, planning the spike, meeting with the PM for more requirements they forgot to put in the user story, describing it at a retrospective, showing it at a demo.

Now I know somebody will say "But that's not how agile works!". Well, we have a few agile "coaches" who were embedded in our teams who would disagree.


What are you doing in a spike? If you can't work out after a day the rough complexity of the task (bearing in mind you should be working on a small deliverable) there's something seriously wrong. You're not meant to be doing the work in the spike. If your research spike shows that it could be a can of worms, then the outcome of it should be another ticket for a proof-of-concept for another sprint, but with specific goals (e.g. "get a POC working that does X, Y & Z").


> Maybe a spike takes a week. Maybe a spike takes an hour. I can't estimate that.

Timebox it then! If you think the spike is going to take a day, and it turns out you'd need a week to even get the prototype working, then that's a successful spike -- you dramatically increased the lower bound on your estimate.

(Sadly by the asymmetry of estimation it seldom happens that you think it's going to take 5d and it ends up taking just 1d).


> Which would be fine, but then management or the PM expects me to loop back around when a spike is complete. I'd rather just continue engineering a solution when I'm done investigating, which is the natural progression of things.

This has nothing to do with agile. Feature X may sound great if it takes 2 weeks, but if it takes 6 months I may not consider it worthwhile right now. Engineering is not just about building something, but dealing with constraints like time and cost.


Sure, that works for adding something without a lot of connections. But if you are tasked to add a feature that has many hooks into a complex system (plus additional unforseen hooks that you won't realize is needed until you implement it), it's basically a stab in the dark. You can guess a time, then triple it.


>That's when I knew it was time to leave that team

Currently in that situation. My "agile" estimate blew up by a factor of as much as 10, precisely because the ask seemed conceptually very simple, even to the domain experts I consulted. And by bad luck, the way I was implementing the story, the issues unfolded one at a time, rather than somewhere in the beginning where we could have broken things up into more stories with additional time.

I mean, it happens in engineering sometimes. We engineer our way out of it, and the engineering is solid in the end. But no, we blew up our story, so all that working nights and weekends to get back on track is for naught.


In retro we would have asked 'what happened and could we have avoided this?'. Then we would have broken up the now expanded unfinished work, and asked the PM if they want to continue knowing it is 10x more work than expected. If yes, cool we pull in the tickets next sprint and keep going. If no, we wrap up anything in progress and maybe come back to it later.

Shit happens. The end of a sprint is there to highlight issues like this so the business side can re-evaluate if continuing is still a good idea.

I don't want anyone on my team working nights and weekends to finish a sprint task (system going kaput is the only time I would reach out for help). Anyone letting their team work normal sprint tasks on nights/weekends is not going to keep their team very long. They will either quit or burn out and stop working.


I've had to give estimates for my entire 20-year career, half of which wasn't 'Agile'. So first, Agile has nothing to do with this; in fact, Scrum tries to overcome this by splitting up tasks into smaller chunks with frequent demoing, re-prioritization, etc. It was much worse before this.

But to everyone: if management is going to beat you up with your estimates, maybe find a place where management doesn't? The pathological thing described in all of these situations isn't Scrum, Agile, or giving estimates... It's the managers who beat up employees.


> It's the managers who beat up employees.

Agree completely. There is a difference between management pushing and beating up though. Sometimes I wonder if people get upset at normal pushing/asking for clarification. I've seen both from management, but have also seen engineers get unreasonably upset at simple questions around an estimate.


What I've found is that when engineers are sensitive about giving estimates, it's a symptom of some sort of insecurity around their work. They're probably also feeling that they are 'outside' the decision-making/power structure.

What helps is first identifying why an engineer is feeling insecure: about their job, imposter syndrome, some life events you may not know about. If the engineer hasn't been having issues delivering, then you want to really talk with them and find out what it is: why do they feel they're going to get in trouble or 'beat up' if they don't hit the estimate? If they have been having issues, then you should be identifying why those issues have been happening, but that is a whole different set of issues than what I think this thread is talking about.

If they are feeling outside, this is probably evidence that the engineer feels they've been shut out of giving input on the 'how' or 'why' questions surrounding the design of the project. They may also be resistant to opting into what the goal/product is. Perhaps the goal isn't clear, perhaps the goal isn't well defined, or perhaps there are bigger issues.

Finally, if you as a manager aren't constantly getting estimates as the project progresses, especially if you aren't using an agile framework, then you need to. If you are using one, have you pointed your backlog without input from the team? Are the scores stale? If you repointed with new info, would they be much different?

If you (anyone reading this) are a manager and you think your job is to get an estimate and drive your engineers toward it, then you are doing only a superficial part of the job.


>My "agile" estimate blew up by a factor of as much as 10, just because the ask was conceptually very simple

Please do not feel bad, nor let anybody make you feel bad about this. It's absolutely natural for this to happen. If anything you could use it to ask for a junior or intern to delegate implementation details while you dig further into the coal mine of this feature.

"Simple is hard." What you can learn from this (beyond team pushback) is what kinds of questions can be asked to figure out if a simple ask is masking 10 other things.


why in the world would you ever use the end of a sprint as a hard deadline?

That's broken from the get-go.


Broken, but very very common, IME. All of the agile-phile places that I have had the misfortune to interact with had things like sprint commitments, and etc. Very abusive and quite the opposite of 'people over process'.


>That's broken from the get-go.

Indeed.


I know that kind of environment too. Usually some department head is a YES person, saying the delivery can be done in some timeframe X. Then timeframe X is told to the dev team, and the dev team says X + 50 days. Then the dept. head goes back to the sales team, and by that time it's too late. Then the entire organization pushes down on the dev team with fury. The product is, of course, released at X + 75 (because you always need to double or triple the dev team's estimate).

Those environments are toxic and the sign of toxicity is that the estimation process is not up for discussion (AKA the dept head keeps doing it again and again and no heads roll)


There's a balance to it for sure. It sounds like estimation was taken far too seriously on your team if it was affecting you outside of work too. However, estimation is one of the primary skills of a software engineer. It's always hard to estimate well, but it's infinitely harder for a less technical person to do it for you. I think it's important to understand that and do one's best to improve at it over time.


I just screamed this at someone (over IM, so not really screaming): estimates in scrum are meant to measure complexity, not time to completion. There are still people in 2019 saying things like "1 point is 1 day of work" and I want to murder them. The whole point of scrum is that time estimation is futile.

Estimate complexity and then measure your velocity as tasks get completed and you can make a rough forecast of future velocity.
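The complexity-then-velocity flow above can be sketched in a few lines. This is a hypothetical illustration, not from the comment: the function name, window size, and numbers are all invented for the example.

```python
import math

# Sketch of velocity-based forecasting: track story points completed
# per sprint, compute a rolling-average velocity, and project how many
# sprints the remaining backlog would take at that pace.

def forecast_sprints(points_per_sprint, backlog_points, window=3):
    """Estimate sprints remaining from a rolling-average velocity."""
    recent = points_per_sprint[-window:]
    velocity = sum(recent) / len(recent)
    # Round up: a partially used sprint is still a sprint.
    return math.ceil(backlog_points / velocity)

history = [21, 18, 24, 20]             # points completed in past sprints
print(forecast_sprints(history, 130))  # -> 7
```

The point is that the output is a rough forecast in sprints, not a date promise; as velocity data accumulates, the projection self-corrects.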


Estimates are like the cones of uncertainty NOAA publishes during hurricane season. A simple task I have done before is at the bottom of the cone.

Possible complexity and unknowns push my estimates further out the cone. "Less than a year, but more than 3 months" is my standard answer for those random 'how long will this feature take' questions that I only have a vague idea about [0]. I then follow up asking if they would like to schedule a few weeks of research to close the cone a bit.

The point is that you have to manage the person asking for the estimate. Teach them that unknowns means something could take a day or a month.

I've only had one person really be a jerk about it, and my response was to make the estimate whatever they wanted. If they were not going to listen to me, then there was no point in giving an estimate at all. That response was from my younger, smart-ass self, though; YMMV.

[0] This also depends on what is being asked. How unknown are the unknowns? For example, is the feature clearly visible in another product?
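The cone metaphor above can be sketched as a lookup from how well-understood a task is to a multiplier range around a gut-feel number. The categories and multipliers here are invented for illustration, loosely inspired by the classic cone of uncertainty, not official figures.

```python
# Map knowledge level to a (low, high) multiplier on a gut-feel estimate.
# The less we know, the wider the range we should quote.
CONE = {
    "done it before":      (0.9, 1.1),
    "rough design exists": (0.67, 1.5),
    "vague idea only":     (0.25, 4.0),
}

def estimate_range(gut_feel_days, knowledge):
    """Return the (low, high) estimate in days for a given knowledge level."""
    low, high = CONE[knowledge]
    return gut_feel_days * low, gut_feel_days * high

print(estimate_range(30, "vague idea only"))  # -> (7.5, 120.0)
```

A few weeks of research, as suggested above, amounts to moving the task up a row in the table, which narrows the quoted range.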


If you are a good engineer and a valuable member of the team people will know, whether you hit your estimates or not. It’s almost like the better you are at actually producing the less you need to worry about the red tape. If you are invaluable and crushing projects no sane manager is going to fire you over your estimate accuracy, but if you are doing poorly and not producing they may point to it as a problem area. I have only worked for small-mid size teams though so maybe it is not like that for huge ones.


I usually reply: Give me a week to look into it then I will have more insight into how long it might take.

But you are right: the problem is that you can't estimate (correctly) how long it will take to do something your organization has never done before, and that's what should be made clear to them.

At the same time it is true that estimates get better as we start working on it, and the work can start by focusing on the unknowns to figure out how difficult they are to do. If that is a plan they agree to.


Sometimes it's good to give people what they ask for [insist on], even if it does not make sense.

No anxiety needed.

Then calmly explain that the reason things didn't work out as [they] expected is that in the real world, things don't work the way they want or fantasize about.

Suggest a better approach. If they refuse and keep doing [asking] the same thing over and over, expecting a different result, then let it be their insanity [or anxiety], not yours.


There are ways to get better at estimates, and a proper process will help you do it. Unfortunately, most orgs do not have one.

1) We know that "people are bad at estimates." That is why the estimate should come from the team, not one person on the team. Most of the orgs that I know don't do this, and to be honest they are not set up to do it, because people on a team do not have overlapping skills. Ideally you have a team where people can take over other people's jobs (to a certain degree). You can then, during grooming, play "planning poker" to estimate (where it is harder for people to just echo what the lead person says). It is OK for somebody to say "I have no idea." The AVERAGE (not the MODE) is used for the estimate. If the votes are more than 2 cards apart, have the two people at the extremes explain why they think it should take the small number of points and the large number of points, respectively.

2) If it is hard to agree on the estimate, the item probably needs a "spike" where more assessment or design is done in order to make clear what needs to be done. On my team we also agree that if a story has more than a certain number of points, it is too big and we have to break it down.

3) During the retrospective, or a specific meeting, we have the team look at the time an item really took, and when it was more than 2 cards off, try to understand what was under- or overestimated.

Usually after about 5 cycles, people start developing a sense of what the right number of story points should be for a story.

If you are always left to estimate by yourself, and you never properly scope the task beforehand or reflect on the variance afterward, it will be very hard to become good at estimation.
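The poker mechanics in step 1 can be sketched concretely. This is a hypothetical illustration of the process described above; the deck values and the two-card threshold are assumptions drawn from the comment, and the votes are made up.

```python
# Planning-poker tally: average the votes (not the mode), treat
# "no idea" as abstaining, and flag items whose extreme votes sit
# more than two cards apart on the deck.

DECK = [1, 2, 3, 5, 8, 13, 21]

def estimate(votes):
    """Return (average points, needs_discussion) for one backlog item."""
    known = [v for v in votes if v is not None]   # None means "I have no idea"
    average = sum(known) / len(known)
    low, high = min(known), max(known)
    # Extremes more than 2 deck positions apart: both sides explain why.
    needs_discussion = DECK.index(high) - DECK.index(low) > 2
    return average, needs_discussion

print(estimate([3, 5, 13, None, 5]))  # -> (6.5, True)
```

Here the 3-vote and 13-vote are four cards apart, so the tool flags the item for the step-2 discussion (and possibly a spike) before the average is trusted.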


Spot on! I’ve also been developing software for over 20 years and my estimates have never improved. I think an order of magnitude is the only reasonably useful estimate, which can then feed into the question “is it worth doing it or not?”


Use a random number generator to generate your estimates and slowly modify the mean until it matches expectations.

I bet your estimates won't be as far off as your colleagues'.
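Taking the joke above literally for a moment, the recalibration loop looks like this. Everything here is invented for illustration: the class, the Gaussian, and the learning rate are all assumptions, not anyone's real process.

```python
import random

# Tongue-in-cheek estimator: draw each estimate from a distribution
# and slowly nudge the mean toward observed actuals.

class RandomEstimator:
    def __init__(self, mean_days=5.0, spread_days=2.0, learning_rate=0.2):
        self.mean = mean_days
        self.spread = spread_days
        self.lr = learning_rate

    def estimate(self):
        # Never quote less than a day (see upthread advice).
        return max(1.0, random.gauss(self.mean, self.spread))

    def feedback(self, actual_days):
        # Shift the mean a fraction of the way toward reality.
        self.mean += self.lr * (actual_days - self.mean)

e = RandomEstimator()
e.feedback(actual_days=9)  # the task blew up; recalibrate
print(e.mean)              # -> 5.8
```

The dark humor, of course, is that this is roughly what human estimators do anyway, minus the honest variance.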


I think there is a way to give an estimate (have been doing it myself for a decade), even for dependencies that are not known from the start.

(I mean, reasonable dependencies. No one can account for unknown unknowns, but the software engineering field is not an art, more of a skill, we do have well established processes to minimize the surprises).

Is this something that HN audience is interested in?


> I think there is a way to give an estimate (have been doing it myself for a decade), even for dependencies that are not known from the start.

I agree with this. The problem is that estimates vary, and that management teams often don't understand the way they vary. It's one thing if your estimates are precise and are generally accurate within 20%; it's another if your estimate is "it will likely take between six months and three years", and a third of the time the actual required time falls outside even that range.

... and yes, there are many projects for which such a large estimate wouldn't be unreasonable; nor would it be unreasonable for there to be unexpected breakthroughs or challenges that significantly impact the timeline.

> Is this something that HN audience is interested in?

I can't speak for the community, but I think the only way you can know the answer to this question is to post it :)


> Is this something that HN audience is interested in?

Are you teasing? Yes, we want to know! Though frankly I'm incredulous.


No, I wasn't teasing, but I wasn't sure that people would find it interesting (at least, this was my expectation based on experience).

I tried to describe the process somewhere in this thread.


It just sounded like teasing because it's a famously hard problem.

But having read your proposal, I actually totally believe that that could result in accurate time estimates. To summarize, remove the unknowns by planning things out beforehand, then count how many hundreds of small pieces you'll need! That sounds plausible as long as what you're doing is straightforward. (E.g., no crazy database optimizations needed to make things fast enough, no crazy ML techniques needed to make things accurate enough, no crazy algorithms needed to solve NP-hard problems, etc.)


You are right: non-trivial things are hard to estimate - as tasks may not even have an upper time limit.

But in the majority of software development projects the tasks are trivial, and it is possible to enumerate all the little details and their dependencies.

I was thinking of writing a tool to support the process instead of using Excel, but unfortunately development has stalled due to lack of time on my part.

https://github.com/andy-goryachev/ReqTraq


Fred Brooks would be incredulous.


I'd love to hear what you have to say in the topic.


Sorry, tried to answer and apparently I was "posting too fast".

One may start by enumerating the features that need development (a.k.a. "user stories"). Also enumerate external dependencies and team overhead (more people working together means greater overhead). For each feature enumerated in the first step, try listing as much detail as possible, also listing open issues and unknowns.

For example, list all the UI elements under development. List all individual functions / use cases that need distinct functionality (classes, functions) to be developed. List all the test cases. List external APIs, enumerate different ways the said API can be used. Enumerate failures and possible recovery actions.

I'll give you an example: a login dialog for an application. This starts as a two page requirements document, which balloons to around 120 items using this method.

This alone should give a good idea at the start of the project. Keep doing it (enhancing the level of detail) while tracking the ratio of initial estimate vs. real value; that will help in estimating the remaining portion of the work.
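The calibration step just described can be sketched as arithmetic. This is an illustrative reading of the method, not the author's actual tooling; the numbers are made up.

```python
# Compare initial per-item estimates with actuals for finished items,
# then scale the gut-feel estimate of the remaining work by that ratio.

def remaining_estimate(finished, remaining_guess_days):
    """finished: list of (estimated_days, actual_days) pairs."""
    estimated = sum(e for e, _ in finished)
    actual = sum(a for _, a in finished)
    ratio = actual / estimated if estimated else 1.0  # >1 means we underestimate
    return remaining_guess_days * ratio

finished = [(1, 2), (2, 2), (1, 3)]      # (estimate, actual) in days
print(remaining_estimate(finished, 20))  # -> 35.0
```

In this sketch the team has burned 7 days against 4 estimated, so the naive 20-day remainder is scaled to 35; the ratio keeps improving as more items finish.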

The problem (the way I see it) is that we don't really have the tools to do this kind of tracking and analysis. Jira and similar systems don't even come close.

Here is an attempt to apply the methodology using Excel:

https://github.com/andy-goryachev/PasswordSafe/blob/master/F...


This is roughly how I was taught to estimate. It is accurate but takes time and everybody hates the result. Management halved my estimates without telling me, made fun of my mentor’s estimates, complained about needing to know everything upfront (not very agile eh!), etc. I think you are right and estimating is more of a solved problem than we think, some people just don’t want to admit it!


Oh yes, my experience as well. I just hope these companies do not manufacture aircraft or nuclear power plant software.


And then the customer decides that they don't want to move forward with the project because it's too expensive. Oh, and they don't want to pay you for all the time you spent estimating because no one pays for estimates up front anyway.


Yep. 30% upfront.

But actually, once you show the customer what's involved, and give them a realistic estimate, it might reduce the chances of cancellation. The discussion might be steered towards what they really want, or prioritization of deliverables.

A mockup (and multiple iterations thereof) might be necessary as well.


https://www.microsoftpressstore.com/store/software-requireme...

That book is from this perspective, that you can work out what needs to be done for a project before actually doing it.


> I still can't estimate how long something will take me when I've never done it before

In that case you're meant to do a spike to research the issue. You can timebox the spike to half a day or a whole day. Then use that knowledge to inform your estimation for the following sprint.


I've become convinced over the years that upper management doesn't really care about 'estimates'. What they care about is 'commitments'.

Every team I've been on has had upper management drive team leaders to get 'commitments'. They want some form of emotional investment in the work. That way, when scope has arbitrarily changed or a massive problem has led to overrun, they can stare at you like a whipped puppy-dog and say "I know it's 10PM and you want to get home to your wife, but you promised us, you committed, to getting this done on time..."

Pure emotional exploitation. The ideals of agile are laudable. I was the excited bannerman of my first agile experience, and I read the manifesto with a lot of hope. However high-minded agile is, the language is too easily co-opted into the worst sins of management. In many places it exists solely to blackmail free overtime out of engineers, get away with micro-management, and ride engineers with false deadlines under a guise of hipness and modernity.

Estimates are just there for a false sense of stakeholding, or at best a ballpark cap so you don't work yourself into suicide. They can't be made accurately even if we could estimate properly, because all estimates are constantly second-guessed by people making judgmental jokes about sandbagging, or asking "Are you really sure it'll take that long?" in voices that practically scream "chilling effect". More than a few possibly realistic estimates are torpedoed by managers who are looking at their roadmap and tsking about how they don't line up.

I would even say accurate estimates are actively undermined. I've worked at companies which have positive work images abroad, and even there any team which was on time found either its sprint budget slashed, or scope increased until it was forced behind. The underlings won't do over-time if they don't feel the squeeze.

Story points, velocity, burndown, and any other metric you can think of that Agile courts are absolutely useless, because this industry refuses to come to terms with Goodhart's Law. I just watched a 10 million dollar project go up in flames. Their sprint graphs were great, and a constant buzz was going about the company about how these teams were setting the standard, meeting their metrics and goals consistently. Only recently did the CEO get down to business and figure out that all the metrics were bogus, and that for the past year all of the PMs and teams had been gaming the system to hide the fact that they were way behind schedule and over budget. The project was quietly scrapped and nothing changed. The shell games continue to actively sabotage data. I only know of the fallout because of who I sit next to.

Most of the benefits of agile seem to fail basic contact with humans unless they are backed by outstanding and visionary leadership. Most companies do not have that, yet they still find value in switching to Agile. Why? Maybe because even in total collapse it provides an unparalleled system for squeezing out more overtime, in my cynical opinion anyways.

I don't really know what else to offer in place of Agile. I'm not that intelligent, but I don't feel the industry can come up with a good model of software development culture until it stops what it's doing and starts acknowledging the problems arising from basic psychology, politics, and data gathering.


Nobody in charge cares about the estimate of an individual task. They care about progress toward program goals. With experience and skill, estimation mistakes tend to come out in the wash.


Probably lost 10 years due to that.



