Stop wasting time getting estimates right (agileupgrade.com)
140 points by jakobloekke on Feb 14, 2015 | 87 comments

The problem I've seen with estimates is a mix of inconsistent ability and a lack of experience in what is being done.

For example, I know that I can build a basic CMS system in Rails in a day because I've done it 100,000 times. The details of the text, layout, etc. don't take much more time to develop, so if the requirements are some derivative of that functionality I can give a pretty accurate estimate for how long it will take to build.

However once team size and project scope expands, you have Developer A who is great at the assigned tasks and can finish a deliverable scoped at two weeks in two afternoons. Meanwhile, you have Developer B who is not so great at the assigned tasks, and requires a lot of hand-holding and assistance to build a below-par version of what the client expects. This mixture leads to ridiculously inaccurate estimates. What it usually boils down to on these projects is a spreadsheet written by an out-of-touch project manager who has to answer to a senior manager and pulls a timeframe out of thin air. A lot of planning and documenting goes into filling about 2/3 of this timeframe, and then the last 1/3 is spent killing yourself trying to finish a shitload of work in a short time frame to meet the original estimate so that the PM can show that he's really good at estimates!

I've been through this process many times, and I think about it constantly. I've come to the conclusion that it's done this way because, quite simply, there is no better way that accounts for the client's demand to have a firm estimate in place.

> What it usually boils down to on these projects is a spreadsheet written by an out-of-touch project manager who has to answer to a senior manager and pulls a timeframe out of thin air. A lot of planning and documenting goes into filling about 2/3 of this timeframe, and then the last 1/3 is spent killing yourself trying to finish a shitload of work in a short time frame to meet the original estimate

This is an accurate description of my last job using Agile practices. This was a shrinkwrapped product too. No customer demanding estimates, in the midst of fundamental architecture changes, and people still trying to slice-n-dice blue-sky new product development into sprints and user stories. The estimates were a joke and quality suffered because people rushed to get stuff done for the 3 week sprint deadline. All because some manager wanted it. Fucking stupid.

Maybe it's my years (I'm over 40), but I tend to ask for estimates I didn't create, and call bullshit when I have to. People believe me, because I back my claims up with a lot of experience and anecdote.

Also estimation and agile do work, if done with some forethought - I wrote about that here: https://www.wittenburg.co.uk/Entry.aspx?id=db2d81dc-c435-42c....

I agree. I find the thinking of developers here is all over the place due to the variety of types of work we're all doing. But some people take a hard-line approach to estimates based on the recommendation of others who are working in a totally different environment.

If you are in the consulting business and you refuse to give estimates, then you will simply miss out on a huge number of opportunities. That's just reality. No amount of explaining your methodology is going to get you a large corporate contract if you can't provide a quote. If you have no interest in that type of work, then that's fine, but there are plenty of people who do make their living that way.

If you are on a product team, you may have an understanding business that realizes the nature of programming. But you still have to work with the rest of the company so that press releases, support documentation, marketing, etc. can get done. The sales and support teams may need training on features. So some degree of estimation is still needed.

As far as other types of ground-breaking experimental research programming - I can't really speak on that subject as far as deadlines because I've never done that.

At work, we are constantly hounded by non-technical business people for estimates. They want to time/cost box you. To them - you're an expense, and an expensive one.

My analogy would be someone asking how long it will take me to drive to my kids' school to pick them up.

70% of the time there is no traffic and it takes 35 minutes. 25% of the time there is a wreck and it takes 50-60 minutes. 5% of the time there is bad weather and very bad traffic and it takes 60-80 minutes.

So, in your mind, most of the time it takes 35 minutes, and often the devs' answer is "35 minutes". This is where experience comes in. The correct answer is... 80 minutes.
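
The arithmetic behind that answer can be sketched quickly. The probabilities and durations come from the scenario above; taking the upper end of each range, and the "safe commitment" framing, are assumptions for illustration:

```python
# Sketch of the school-run arithmetic. Probabilities and durations are
# from the comment; using the upper end of each range is an assumption.
scenarios = [
    (0.70, 35),  # no traffic
    (0.25, 60),  # a wreck (upper end of 50-60 min)
    (0.05, 80),  # bad weather (upper end of 60-80 min)
]

expected = sum(p * minutes for p, minutes in scenarios)   # ~43.5 min
worst_case = max(minutes for _, minutes in scenarios)     # 80 min

print("typical answer:  35 min")
print(f"expected value:  {expected:.1f} min")
print(f"safe commitment: {worst_case} min")
```

Note that even the probability-weighted average (about 43.5 minutes) is well above the "35 minutes" gut answer, and a commitment you can actually keep sits at the tail.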

I tend to accompany this (for the particularly ignorant non-technical manager) with various disclaimers about the 40+ year ongoing issue of estimating software development.

That's the biggest danger with estimates, that you're being held accountable for a padded guesstimate. One way of avoiding this is to give estimates in ranges or with +/- attached:

"It will take 2 days, + 5 days, - 1 day" <-- that shows a lot of risk

"It will take 1-2 hours, + 3 hours - 0.5 hours" <-- again, showing that there's some risk

Or you push back and tell them to give you some time to form a better estimate or that you will give them an estimate update as soon as you know more.

The issue is that most software developers are not trained in how to estimate and in how to deal with these time/cost boxing scenarios. We don't attend a class like managers do on how to deal with situations like this. We have to learn as we go along.

I highly recommend Steve McConnell's Rapid Development. It's a fantastic book, and it has a chapter on estimation and on how to deal with clients and bosses who ask for estimates.

In my opinion, if someone is asking you for estimates on 1-2 hour tasks, that person should kindly remove him/herself from the software development industry.

You could simply say, "I don't give estimates in under 3-hour increments, because there are inevitably other fire-extinguishing and support activities that interrupt even the most trivial tasks."

Sometimes I get interrupted with "Alice needs X added to Y right away", or "Bob needs reassurance that this is working as intended in the original ticket" -- no problem. Other times, one might lose productivity to things like, "Java update broke our GWT compilation toolchain completely".

It helps if your manager knows that such interruptions happen, too. I also am a fan of overbudgeting and then revising later once I understand the problem better.

He also has a whole book on estimation called Software Estimation: Demystifying the Black Art. It's a great read as well.

Bought it and recommended it to someone already, thanks for the suggestion!

This happens when people confuse estimates with deadlines.

I've had clients trying to negotiate my estimate downwards. So stupid. Why would you want to underestimate the actual needed time? You want to go over budget?

They want to negotiate your estimate down, so they can then pin your quote and budget to the lowered estimate.

"Going over budget? That's fine, let's renegotiate. Remember though that you're already behind on this project so that needs to be taken into account. We'd prefer this last leg of the project to be flat rate..."

It's definitely not right, and as a contractor you should read the tarot cards and have preparations in place so it doesn't happen, but you'd be amazed at how often variations of this scheme work out with PMs and procurement at larger companies that should really know better.

In this case, they were billed by the hour anyway, so no.

> At work, we are constantly hounded by non-technical business people for estimates.

They actually want you to tell them a price. They just think that in the IT world a price is called an estimate.

I try to clarify on each occasion that what I'm giving them is not a price, not even an actual estimate. I'm giving them just my guess about the future.

To get an estimate they need to apply some math to my guess, hopefully based on project history and other factors.

The full price will never be known, because every action brings complexity into the system, which causes technical debt to grow by an unknown amount.
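
That "apply some math to my guess" step might look like scaling the raw guess by a historical overrun factor. All the numbers below are made up for illustration:

```python
# Historical (guessed_days, actual_days) pairs from past projects.
# Made-up numbers, purely illustrative.
past_projects = [
    (10, 16),
    (5, 7),
    (20, 34),
]

ratios = [actual / guess for guess, actual in past_projects]
correction = sum(ratios) / len(ratios)  # average overrun factor, ~1.57

raw_guess = 8  # days: the developer's gut feeling for the new task
estimate = raw_guess * correction       # ~12.5 days
print(f"correction ~{correction:.2f}x, estimate ~{estimate:.1f} days")
```

The point is only that the guess and the estimate are different artifacts: the second is the first, corrected by project history.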

I have been freelancing for the last 5 years without estimations or deadlines. It has worked out great: everyone involved is focused on the task, and you iterate and concentrate on business value, the quality of the solution, etc.

Nothing can be "late", there is no "I said that by X..." in your subconscious (because you like to do what you say in the end), no absurd negotiations or revisions, there is nothing to try to stick to, nothing artificial interfering with working hard to get the best out of the project or task at hand.

I wholeheartedly recommend it to anyone who can work that way.

Sounds like you haven't been crippled by Agile for these past five years. Meaning, no middleman (perhaps one with some sort of Agile "certification") has been gumming up the works.

This article reads like it was written by a disconnected manager type. Not that I don't value the efforts of a skilled project manager, but that's a different matter entirely. Too often the process becomes more important than the outcome. I know the Agile evangelists will say that such managers are doing it wrong, but in that case I'm not sure I've ever seen it done right.

The ironic thing, of course, is that the original goal of 'agile', by the people who invented it, was precisely to avoid middlemen gumming up the works. Precisely.

We are all (including many of the people who popularized the term) aware that it hasn't always worked out that way. But that's still the goal.

FWIW, I've seen kanban work extremely well, but never scrum.

I've also done this a few times, but it's always a tough sell. Do you mind sharing how you convince your clients that this is the right way to go? E.g., if a client asks for estimates, what do you reply?

Can you elaborate a bit more? What types of projects do you do? Where do you find these types of clients?

In practice, this starts with my availability. Since I don't work with estimations, my gigs do not have an estimated end date, because all I care about is that the work I am hired for finishes in its natural way.

In particular, I don't tell my clients "I need to be done by time T"; that would put a de facto deadline on the project and wouldn't be consistent with my focus on bringing the best value to what I am hired for.

So this is not a selling strategy; it is just the natural way the conversation normally goes. Then, I explain why I work that way. Also, the corollary is that if that person or company hires me, I will be as focused on their problem and on the quality of the work as I am now with my current gig (the commitment that doesn't allow me to give a starting date).

I have some public track record, so when someone hires me they know who they are hiring. That, I am sure, also helps. Though I have to say that when I started I didn't have as much exposure.

BTW, for me that includes Scrum sprints. No matter if you call it points or whatever, you are estimating that something is going to take so much, and that by next week we could deliver A, B, and C. It's the same shit in the end.

Don't do that; talk about facts. Use the past tense. Last week we DID A, B, and C. This project COST this much.

Some clients already work without that shit: a pipeline and priorities. That's all.

Also, do not put yourself in a compromise by inertia. "I'll have it by tomorrow", and nobody asked. Try to avoid that kind of sentence. Otherwise you absolutely have to do it by tomorrow no matter what, or else you need an excuse for failing the expectation. Be deliberately vague: "I'll have it soon", "I'll write back", "I'll prioritize this". Often that language is more than enough for communicating. But best of all: say nothing! Just work hard, and let your work speak for you.

Working hard is key. Work hard and communicate often to keep a close feedback loop. If you do that, within a few days your client already sees how it's going; they can feel the speed (as you do), they can feel the velocity, and they can see that everyone is concentrated on what matters for quality. If you enter that flow, everything is smooth and good for the success of the project.

I have been working non-stop that way for 5 years, and I truly believe it is the best way to deliver quality software.

While I completely agree with you, I don't see this happening anytime soon in big companies with multi-year projects easily worth 7+ figures.

Often enough the deadlines are set in stone within the contract, and every milestone is defined ahead of time as well. I've worked with many such companies over the years and it's always the exact same story. Some of them are even considered tech giants today.

It's also telling that I consider every single one of these projects to be of mediocre quality at best from an engineering point of view. Thing is, our customers usually bleed so much money that they barely notice it could be done any better.

The people in charge of these projects usually don't know anything about software engineering and place more importance on pleasing investors in the short term than on producing value in the long term.

Yes, in that case that's not your market. (Unless someone responsible for some of that budget hires you under those "local" conditions.)

But in general, as with any product or service, you have a market. Sometimes you may lose the lead, but it is important to say no in order to be true to yourself.

What I have found is that people and companies with which I get to work are clients of high quality themselves.

I recognize your way of working in how my girlfriend works. She is very successful working this way. Within a couple of days in a new job people see her working hard and delivering results. She earns a lot of trust from colleagues, clients and managers by working this way. What she does, she does it very well. It looks easy enough, but it is not. Working this way is actually difficult.

I have said it before but I think it bears repeating.

Time estimation is an industrial way of thinking applied to a post-industrial world.

In the post-industrial world, time isn't the problem; project definition and scoping are.

In the industrial world the problem was already solved: the machine was built, the market often established, and output depended on a few factors that could be adjusted. (Need more output? Add more of X.)

In the post-industrial world, every project is about problem solving and scoping.

To put it into perspective:

If we applied post-industrial conditions to the industrial world, it would mean that each time a product needed to be made, if not the whole factory then at least the machines would have to be developed from scratch. It will take many, many years before time estimation dies, but it will happen.

[Edit: Story changed based on feedback below.]

Here's my analogy:

"Developer, how long will it take to acquire a vehicle that can travel a mile?"

"That depends. Will the vehicle drive over a highway, through the sky, or dig through a mountain?"

"That's only an implementation detail, right? Give me a ballpark estimate."

"I can call a taxi here in 15 minutes and $50. Chartering an airplane will require two days and $2000. A granite-boring drill will take 6 months to design, 2 years to build, and several million dollars, possibly several times that if --"

"Geeze, no wonder no one trusts developer estimates. I'll double it and tell them half an hour and $100, just to be safe."

Having said that, the alternative of "estimating is a waste of time" is also ridiculous.

"Developer, will I be able to catch the next train if I walk, or do I need to get a taxi?"

"Estimating is a complete waste of time"

"But I just want to know if I need to spend money on a taxi. It's two miles away and I don't have any bags, and the train leaves in one hour"

"Start thinking post-industrially"

"FFS, I just want to know how long it'll take me to get to the station!"

So what's new here? That estimating an unknown task is silly? Well of course it is! But saying estimates are useless is also daft since there are plenty of times we know a fair amount about the problem, and knowing how long things are likely to take is very helpful in deciding what to do next.

If someone asks me how long it'll take to add a bit more information onto a page that we already have in the database, I can give them a fairly good answer since it's a problem that we've done plenty of times and therefore have a good idea about how long it's likely to take. If they want me to find new data sources and integrate them then I can tell them we've never spent less than X weeks doing that so we have to assume it'll take at least that long. If X weeks doesn't seem like a worthwhile investment of time, then even this level of estimation has been useful.

This analogy is flawed, and I think represents the exact problem developers experience with non-developers. There's an underlying assumption that we know what walking or a taxi are. We don't. In this analogy, developers would need to invent both before giving an accurate estimate. The methods of transport are the mechanism for delivering the result of arriving at the desired destination. Software development projects are also about creating the mechanism for arriving at a result. The software is bespoke almost by definition, or you could go buy it off the shelf. Bespoke software is unknown. It's invention. We may be able to apply past experience inventing similar things to our estimations, but there is always some amount of uncertainty. Sometimes there is a lot because we've never built anything like the thing requested. We often may not even understand what is being asked for. If I don't know what a car is, asking me how long it will take to invent one so that you could use it to get to the train is inviting trouble.

> There's an underlying assumption that we know what walking or a taxi are. We don't.

Of course we do! The idea that every task involves some groundbreaking research into an entirely novel field is ridiculous. We often know a great deal about what it is we're about to do. Yesterday I needed to fix some chef recipes to get some monitoring working on a different OS, I could give a fairly decent estimate about how long that would take and it had some uncertainty in it. I built a simple site with stripe integration and had a good idea about how long it would take. I know more about doing that next time, too and have a better idea about how long it will take.

> We may be able to apply past experience inventing similar things to our estimations, but there is always some amount of uncertainty.

Having uncertainty in an estimate does not make the estimate pointless. Even having a lower bound can be enough. If I know a feature will not be done in under a week (because similar ones take two weeks, and this has a few extra complexities so my base assumption should be "longer than two weeks") then that might be enough for someone to not choose to implement it. I've had that before on projects, where I've given a minimum time and the feature has been dropped.

> If I don't know what a car is, asking me how long it will take to invent one so that you could use it to get to the train is inviting trouble.

But if someone said "Invent a car for me, I need to get to the station tomorrow" you could easily say that it's not going to happen within a day, and that information would still be useful.

That's exactly the point why "software engineering" still isn't.

It's tinkering, and as much as we software developers would like to keep it that way, at some point mankind will just have to solve that problem once and for all.

And today's software development will be called "software art" or whatever.

We'll never solve that.

What we likely will do is develop higher-level abstractions for more and more sets of problems, so the subset of problems developers work on will change.

Just like an exceedingly small percentage of developers today needs to care about hitting the hardware directly.

>That's exactly the point why "software engineering" still isn't.

"Still"? It will never be "engineering", as we don't have enough constraints and fixed boundaries to guide our development process.

How long will it take to invent a flying car? That's engineering, right? I want a precise estimate. Also, it needs to be made out of a material that is both transparent and can withstand crashing into a concrete wall at 85 mph. And the entire thing needs to be powered by a fuel that can be harvested from inorganic compounds. It must seat 6 people as well as be able to interact automatically with an aerial car traffic control system that is being built by another team using standards that haven't yet been developed.

How is that any different than software? Software engineering is a thing. There are constraints however each implementation has the potential to be novel. Electrons or steel, it's all engineering. Even engineering has trial and error. Would you say the Wright Brothers didn't engineer the airplane?

What's the definition of engineering? Let's refer to Wikipedia:

Engineering (from Latin ingenium, meaning "cleverness" and ingeniare, meaning "to contrive, devise") is the application of scientific, economic, social, and practical knowledge in order to invent, design, build, maintain, research, and improve structures, machines, devices, systems, materials and processes.

Even "tinkering" is engineering. In fact every invention comes from tinkering. There's no definition of engineering in existence that would preclude software development from being considered authentic engineering.

In terms of constraints and fixed boundaries, to suggest software doesn't have those is just nonsense. Just try leaving out the curly braces next time you write something in C. Unlike a field such as writing, software does have rules. I can write a misspelled sentence and forgo the rules of grammar, and my sentence will still be "compiled" and probably understandable. A book doesn't stop working because of a bug. Software certainly does. Thus, there are plenty of fixed boundaries and constraints on our work. The characteristics of steel are analogous to the characteristics of a specific software class. Except in our business, many times we have to invent the steel ourselves.

> How long will it take to invent a flying car? That's engineering right? I want a precise estimate.

No, that's R&D, a task which then later leads to engineering when a working solution has been found.

How is invention tinkering?

It's not. People think that because inventions tend to result from tinkering that the two are causally related. They're not. Tinkering is merely one of the ways to provide a mental space in which invention can occur; invention itself is not engineering.

Implementing the invention is.

In that you don't know whether what you do will work before it does. The lightbulb is an example.

That might be true for Galois, but not for your run-of-the-mill webapp shop.

I'll be the devil's advocate and say that a lot of software development is not particularly novel. A lot of software development is essentially doing minor changes to existing systems, in the same way and format as everything existing, with nothing novel or particularly challenging.

If you're asked to add a couple of fields on a CRM form that you've worked on for years, estimating how long changing the storage, data, business, and presentation logic will take should be expected and easy, with high accuracy and very little to no risk. Or maybe you're making yet another minor derivative of a data import task that's slightly different from the hundreds you did before. This is often why businesses demand very specific skillsets: while I would have no idea how long a Wordpress integration module would take, I would bet that someone who works with it a lot would.

There are a lot of software developers who, when given such a task, will try to make it more challenging by rebuilding everything in the process, then throwing up their hands about the complexity involved. It's understandable why business partners get upset about this.

There is absolutely inventive software development, but we don't make a credible case when we lump it all in together.

You're right, but in my experience, even non-novel software features can be hard to accurately estimate, especially due to yak shaving.

Last week I was asked to add a button to a form that calls a service endpoint, same as the other five buttons that I already implemented, and which took me around 30m, so this should be easy, right?

Except during testing the service throws this weird error that makes no sense, and so I have to start debugging, by tracing the code and possibly looking at the network traffic. I find that the library I'm using is sending a weird request, which with a bit of Googling tells me it's a known bug, and which has a patch, but that doesn't apply to our deployed version of the library.

So now I'm two hours into this half-hour task, and I'm trying to decide whether I should spend the time to backport this fix and re-test all the code using the library, or implement some kludge that solves this problem but increases the effort of anyone in the future that touches this code.

And a lot of these seemingly trivial, familiar tasks require far different time than estimated (both less or more) precisely because no one went back and rebuilt things, thereby leaving an inconsistent, incoherent spaghetti mess behind them.

Editing code is more akin to rewriting essays than re-engineering a machine in that way: It's going to take a lot more effort to make a 3rd grader's essay print-worthy than one of Hemingway's.

> thereby leaving an inconsistent, incoherent spaghetti mess behind them

Generally my experience has been that the inconsistent, incoherent mess develops when people tasked with doing trivial changes try to "fix technical debt" in the process, leaving you partly on the way to a "better" model, at least until the next guy comes along. And then the next guy. Until you have a layer cake of different approaches and technologies and philosophies, degrading to a worse and worse state.

Look, I've been doing this for 20 years. Almost everything I work on tends to be novel code. I'm terrible at estimating, and now refuse to do it at all.

But there is a lot of self-serving, well, nonsense that we ply in this field, justifying our failures (and the reality that most developers don't want to learn other people's code, so they just short circuit the whole thing and pretend it's higher ground), and the whole "technical debt" thing is one of the most egregious.

Most developers, when asked to do something to existing code, will throw their hands up and declare it the worst code they've ever worked with, and they can't possibly make some trivial change without completely rewriting all of it, etc. (usually somehow pulling in whatever the pet technology of the month is... sorry, I can't change the web form without turning it into a single-page app built on io.js and React and...). We all tell each other this on boards that we dominate, not realizing how completely ridiculous and transparent it really is. It's always more fun to build something novel than to just modify things that exist.

Which is why I really think maintenance programmers are a unique and valuable breed for businesses to have. I personally have a weakness that I always need to rebuild things. I'm terrible as a maintenance programmer, so I simply don't do it. But there are some people who are perfectly happy adapting to whatever they are working on, learning its idioms and unique traits, and then competently and quickly making necessary changes without drama and theatrics about how dire of a situation they've been put in.

To abuse the analogies that have been plied, it's being asked to drive somewhere, and first re-building the car into an electric car, then a self-driving car, and then deciding that you want it to be a hyperloop, and then talking about relativity, all while the business just wanted to get from A to B.

Ideally you need something in the middle of "Rewrite this whole damn thing from scratch" and "don't touch a thing more than you need to, never change any architecture, just make it work and the tests pass."

You need to understand the existing architecture and idioms and traits. And you need to understand the business domain. Then you need to decide when it makes sense to refactor a sub-system first to make the change easier (and support future such changes, which requires you to make a _guess_ as to how likely future such changes are), or to resolve existing lingering problems while making your change easier. And when it doesn't. And you hardly ever, ever, need to rewrite the whole thing (but not never, just hardly ever).

Personally, I think this is what the art of crafting good code _is_, and something that needs to be the goal of everyone writing software, not something you can say you Just Don't Do because It's Not Your Style.

> all while the business just wanted to get from A to B.

Getting the business to realize that they just want to get from A to B can also be an issue.

Exactly this. The problem is that on getting to B, the customer thinks they need to visit C, D, and E, and they don't tell you until later that they also need to visit G, but only if you haven't visited H first.

The book The Nature of Software Development covers this concept much better than I could.

Cunningham on technical debt https://m.youtube.com/watch?v=pqeJFYwnkjE

Even minor changes to existing systems can be hard to estimate if the system has accumulated technical debt over time.

I agree that most software development isn't very novel, but I also lost count of the number of developers scared to change parts of a codebase in fear of what it could break, even if that codebase is less than a year old.

This is why spreadsheets were invented. The lack of automation in other areas is just low design maturity - Salesforce are partly there with the CRM.

A Wordpress integration module could certainly be inventive depending on what it was integrating with and what it was doing.

> So what's new here? That estimating an unknown task is silly? Well of course it is! But saying estimates are useless is also daft since there are plenty of times we know a fair amount about the problem, and knowing how long things are likely to take is very helpful in deciding what to do next.

What's more, if we don't know enough about the problem to come up with a reasonably-confident estimate, that indicates that we need to define or scope out the problem better.

That's still a kind of industrial example, IMO; the post-industrial issue is different.

It's fairly easy to estimate how long it's going to take to travel a mile once you understand how you have to travel it. You can rarely go faster than you estimated.

No, the actual problem is that in the post-industrial world you haven't even built the car to travel the highway or the tools to go through the mountain.

Good point. I'll update the story to better reflect that.

Even after the update I still see it as an industrial example. Switching from time to money doesn't change knowns into unknowns. If you know exactly how much money and time it's going to take for every option, then the choice is easy.

A better analogy would be someone asking a doctor how long it will take to cure cancer or AIDS. "Well, you've cured chickenpox before, surely you can estimate a cure for cancer based on that!"

We actually work in a software industry.

What we manufacture in this industry is deltas to software products.

Clients often want a reasonable idea of how long it will take for a piece of work in the production line to make it to the factory door.

Velocity works well in that environment. It asks "how long will it take this piece to reach the door?", not "if I want a pile of deltas this high outside the factory door, how long will that take?"

The former can be derived from actual fresh data. The latter is an estimate built from adding up other estimates and is harder to rely on.

We are in a situation where the client wants some enhancements, we know to an order of magnitude how hard they are - ie more than a few days, less than a month - but cannot be more accurate without actually diving in and basically doing the work itself. But they will not green light us without an estimate. Result = deadlock. I dream of a world where (a) customers understand that this is an art, not bricklaying, and (b) realise that always demanding a number up front is just going to cost them more money because the s/w folks will always highball it to cover themselves for unexpected issues.

I think it's widely understood that product teams write higher-quality software than consulting teams.

I would say that is probably true but isn't exactly fair to the individual developers on those teams.

Product teams tend to be stable, were hired for their domain expertise, and have been working on the same product for a long time. So naturally they are a well-oiled machine for producing one specific product.

Consultants are a mix of whoever is available at the time. They may or may not have any previous domain experience. Each programmer might be working on multiple projects of totally different companies at the same time. Developers might get rotated in/out of projects mid-way.

So it's not that consultants are crappy developers and product teams are excellent developers, it's more the situation that they're thrown in.

Sorry if I'm being dense, but how does that relate to what I wrote above?

I guess I was alluding to the idea that if one is in a consulting situation where he can no longer estimate, one possible solution may be to increase quality by being more like a product team, and the resulting clarity of thought will help out his estimates.

We recently had a discussion with a client about estimation. From the developer's point of view it's hard to predict the date because of the many things that might happen during development that we aren't yet aware of at the moment we need to declare the date.

Now we are using historical data to predict how many calendar days the feature scope may take at a specific confidence level.

From the client perspective, dates are important because they need to coordinate the release of multiple projects, and dates help them release the right stuff, in the right order, at the right moment, to production.

Here is an article about our approach: http://blog.lunarlogic.io/2015/project-management-estimation...
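The comment doesn't show the mechanics, but one common way to turn historical data into a confidence-level forecast is a bootstrap simulation. A minimal sketch (the function name, parameters and sample data are all illustrative, not from the linked article):

```python
import random

def forecast_days(per_story_days, n_stories, confidence=0.85, trials=10_000):
    """Bootstrap a completion forecast from historical per-story durations.

    Resamples past calendar-days-per-story to simulate delivering
    n_stories, then reads off the total at the requested confidence level.
    """
    totals = sorted(
        sum(random.choice(per_story_days) for _ in range(n_stories))
        for _ in range(trials)
    )
    return totals[min(int(confidence * trials), trials - 1)]

# Illustrative history: calendar days each recent story actually took.
history = [1, 2, 2, 3, 5, 1, 2, 4, 2, 3]
days = forecast_days(history, n_stories=12)
print(f"85% confident the scope fits in <= {days} calendar days")
```

The output is a statement like "with 85% confidence, this scope takes at most N days", which is the kind of hedged date a client can actually plan releases around.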

Reasonable time/cost estimation is certainly possible with software. It's not particularly easy, but you certainly have a better chance if you at least expose yourself to the decades of hard research on the subject and don't arrogantly imagine that everyone before you started programming or flogging your "agile approach" was an idiot who knew nothing about software development.

Software estimating is not a particularly pleasant task - neither for developers (they generally have no training in the area and so don't even know how to approach the problem) nor managers (who have too little technical knowledge).

And since 95% of development is writing in-house CRUD stuff or putting together lightly trafficked web pages, it's tempting to bask in ignorance and pretend that not only is the task almost impossible anyway (and so pointless) but that in fact it's unnecessary. And for IS-department mickey-mouse projects, this is probably the case.

Some of us however have had to pitch fixed price tenders for large software development projects, for example. And no, the customer isn't going to accept that it will be done when it's done and that we should just start a few sprints and see how it goes. And if you get the estimate wrong (too low), you screw yourself and lose money on the project - too high and you don't even get the contract. In this environment, you learn quickly that estimation is vital, difficult but possible with the appropriate effort. And yes I hated doing it the first time but it became almost satisfying after a few iterations having gained some confidence.

This assumption that a "post-industrial", agile, folksy-anecdotal approach is appropriate for all software development really pisses me off. It breeds developer myopia, ignorance, self-importance and exceptionalism. Believe it or not, engineers have estimated, planned and executed very complex systems inside and outside of software. But it's only in software engineering that you find clowns earnestly claiming that estimating or planning the development of some crappy CMS system is beyond human capabilities because software is so "different", and that estimating and planning a project like the Hoover Dam was easy in comparison.

Building projects are a terrible example of predictability. They are ultimately highly agile due to dependencies on inherently unpredictable factors such as weather, 3rd party suppliers, industrial disputes, archeological discoveries, legal disputes and so on.

The difference with software is that if you have done it before, you should have automated it. The great majority of programming is therefore design work. There is an answer here, which is reversing the problem and simply timeboxing based on expected ROI; there is certainly no need for task estimation as a component of fixed-price bidding, particularly where that has a lousy historical track record and wastes client money.

(I'm well aware that on many very large projects this is not the case and they are run like an old-fashioned production line with humans performing well-defined tasks - but this only illustrates the huge inefficiencies, lousy processes, and general ignorance of software development practises as standard in SI's.)

But aside from all that, how do you account for http://en.wikipedia.org/wiki/Planning_fallacy ?

I'd also be interested in how you apply cost estimation to debugging and other BAU tasks.

The majority of building projects are simple and predictable - like single domestic dwellings. Building contractors have gotten pretty good at estimating the cost and time to complete a family home from scratch on an already serviced plot (or else they go bust quickly), despite all the dependencies and unpredictability you describe. Applying "agile" (i.e. not doing requirements gathering/careful estimation/detailed planning and not having a clear goal at the start) would be a recipe for going bust very quickly.

And the majority of development is (despite our egos) as mundane as this really. The effort is certainly NOT spent on design work; have you ever quantified the effort that goes into various aspects of delivering working production software?

Re. planning fallacy, what of it? It doesn't make estimating impossible; once you are aware of it, you account for it in your estimates. You also calibrate your estimates with independent models and with empiricism (i.e. previous experiences).

But the literature is extensive on this topic - I can't give a summary here, but if you are interested, there are worse places to start than with Boehm; deeply unfashionable I know, but built on (shock! horror!) empiricism and not the rationalist (in an epistemological sense) sophistry which dominates the thinking of "agile visionaries".

Really it's like we've regressed to a pre-scientific age in the field of software engineering processes. Most of the debate seems about as relevant as arguing about how many angels can dance on the head of a pin. It seems actually observing and measuring reality went out of fashion sometime in the 90s.

> The majority of building projects are simple and predictable

The waterfall approach you are advocating can work on simple projects and indeed it's how most digital agencies operate. However it starts to fall apart at the seams precisely on large enterprise projects and it's no coincidence that agile came from experience on these kind of large projects.

It's worth noting that the proponents of agile and post-industrial approaches are not '20-somethings who don't know how the real world works' but people with extensive industry experience such as Alan Cooper.

> The effort is certainly NOT spent on design work

While I'd certainly agree that there is a lot of boilerplate and wheel reinvention out there, that does not represent effective development. Software development is, by definition, the practice of creating automation.

The mindset of much non-technical management today is frequently one of Henry Ford era industrialisation whereby processes are designed and carried out by developers in a form of mechanical turk model - for example just about any large scale outsourcing project.

However, if you take a look at a car production line today, that human work is automated and done by robots in darkness. Efficient software development is supposed to look like this - the developers designing and creating the production line itself, not manually creating the end product.

We see this better use of human labor and subsequent raising of the skills bar reflected in the job market with demand shooting up for all kinds of design roles, from architect to UX.

> Re. planning fallacy, what of it? It doesn't make estimating impossible; once you are aware of it, you account for it in your estimates

That's the whole point of the planning fallacy - it occurs even when we take it into account. There are supporting studies here and it's simply incorrect to say that this is non-scientific.

> It seem actually observing and measuring reality went out of fashion sometime in the 90s.

Are you really advocating that time-and-motion studies ever turned out to be effective at growing or even saving businesses? But seriously, agile delivery tends to be associated with an excess of observing and measuring, if anything.

I agree that software estimating is absolutely possible and often necessary. However, it's important to keep in mind that software engineering has much less in common with the more traditional engineering disciplines than they have with each other. A big difference is the ability to heavily iterate on a design after it's been shipped (not in all cases, and within limits, but still generally much more than a structure or physical machine). So we often see procurement cycles which treat software development like a construction project or equipment manufacturing (because the organizations making the purchase--enterprise and government--have built their procurement processes around that sort of project) even though that cycle does not necessarily match the way software "should" be built.

FWIW, that procurement inefficiency represents an opportunity for software entrepreneurs who have the patience to deal with the sub-optimal process. It also probably represents an opportunity for large firms, in terms of how to cut down on software expenses (presumably by bringing development in-house, although that obviously comes with its own overhead).

I'm afraid I don't fully buy into software engineering exceptionalism; software engineering is more immature but it shares far more with traditional engineering project processes than with artistic or even craft-style work (of course "in the small", for some tasks, an individual programmer uses an art/craft approach to implementing functionality).

And the majority of (non-software) engineering projects also involve iteration.

The majority of software projects, I believe, can be estimated with reasonable accuracy as long as you ensure the project is a development project and not a research project. Most of the literature on this subject classifies the types of software projects, but nearly all of it recognizes a category of "research project" and most agrees you cannot estimate such a project. Note this finding applies to all branches of engineering.

A major problem I see with software engineering is developer ego resisting accepting that most of what they do is mundane and has been done many times before. They resist this ego-deflating fact by trying to turn every project into a "research" project; thus instead of using simple, boring, dependable tools (let's say Apache, MariaDB or Postgres, and PHP) to deliver working software, many developers/geeks (and I consider myself one) will try their utmost to turn the project into a research project. So it will be proposed to use a Hadoop cluster for storage and the latest cutting-edge ORM tool with load-balancing and an XML-based messaging bus talking to a front end written in Nimgorust or whatever cool new programming language is out there. Of course, estimating such a project is an order of magnitude more difficult than a project involving mature, well-known tools. But this difficulty is a product of technical decisions (which you can control) and not inherent in the task of delivering working software.

> And if you get the estimate wrong (too low), you screw yourself and lose money on the project - too high and you don't even get the contract. In this environment, you learn quickly that estimation is vital,

In my vertical what happens is that customers field optimistic estimates from vendors much less qualified than you. But customers have no basis to know those estimates are fanciful. So they rely on a fanciful estimate from an unscrupulous vendor, and then get stuck.

Estimates are not judged on being right, they're judged on being low. In that environment, the only winning move is not to play.

Obviously in some verticals there could be people technically qualified to evaluate the incoming bids. I'm just reporting that it isn't the case where I work.

If you can't demonstrate prior track record on a similar project and are relying on price alone to win then you should qualify out in any case.

Simpler estimates are a step in the right direction, but this article is still advocating a very heavyweight (dare I say unagile?) process.

Time-tracking by individual story card? What would you even do with that data if you had it? Sure, you can put a precise numerical cost on what a "medium" story usually takes - but your estimates of feature value are never going to be accurate enough that the difference between $6000 and $5900 is a yes-no decision.

Going back to the points actually saves you time here - and saves you a lot of less tangible value by avoiding timesheets (when employers introduce timesheets, I leave. How much does hiring a new dev cost?). We got 59 points of stories done last iteration, therefore cards cost $TOTAL_TEAM_COST/59 per point. Done.

The next step is to break all stories down into similarly sized stories. Sometimes it is difficult, but I have seen teams who can do it consistently and achieve narrow confidence intervals on their times.

The title says "Stop wasting time getting estimates right", then outlines a number of classic ways in which estimates can be improved: give a range, collect data, break down estimates (which is problematic by the way[0]).

The process described actually sounds a lot like the Personal Software Process that Watts Humphrey described in the 1990s, right down to the use of proxy values for estimation purposes.

[0] obligatory self-promoting link -- http://confidest.com/articles/how-accurate-was-that-estimate...

I wrote the essay, and it seems it sparked a good discussion in here :-) Can't comment on it all, but the point I tried to get across is actually very closely related to the style of thinking most of you promote in here - focus on the outcome not the output, iterate fast and expect uncertainty in both cost and outcome. The principles I present try to do exactly that (IMO) while providing the structure to make informed decisions in terms of prioritization and risk.

Basically - make it short and simple for the team to estimate stuff so they can spend their time focusing on what's important (which is NOT worrying about whether 4 or 8 hours remain before you hit your original estimate, and the stress, drop in motivation and even potential blame games that might follow from that) - and be disciplined enough about data to answer all the other questions people will ask anyway. Estimating in time sucks!

And as I wrote, yes, you can skip registering time on the individual user story if you find yourself in an environment where that is doable. You will lose a bit of data for making better-informed decisions, but I'll admit, following the 80/20 rule, that you probably should not do it if you can avoid it.

Once the structure is set up it actually is quite lightweight (especially for the team but also the person collecting the data, both with and without the per user story time registration) and having at least some amount of real data does in my experience provide better and faster decisions in most contexts.

But anyway, whether you agree or you don't, thank you all for discussing and sharing. Never in a million years did I expect close to 13,000 users from 6 continents to visit my blog in less than 24 hours (and still rising).

BR Jesper

Estimate frequently, quickly, and move on to execution. This will give you real-world information to make a better estimate next time. Lots of little fuzzy guesses which trend in the right direction. I believe the complaint about "not getting any closer to resolution" is a red herring. The value in the exercise is creating a shared mental model, not creating COCOMO VII. There is no perfect estimation model, mainly because your team is not full of robots.

We screw up when we get up from the IDE and start thinking of technology development the same way we would think about using the jQuery framework. People do not work like machines.

Unrelated: the first heading should be "preface" not "prefase" :)

This is an interesting article to me as I'm currently making a series of screenshot videos to document a project that we're doing at my company. It is estimated at about 3 months and I'm recording the progress and everything to see if we hit our goals. I thought it would be fun to post them as we go, but for competitive reasons I'm going to hold off until we actually get it done and then post them all.

The videos so far are turning out to be somewhat long (10 minutes reviewing each week of the project) so I'm wondering if anybody will actually care about viewing such a thing?

#NoEstimates to me is often an indicator of poor company culture. It immediately makes me think of shops where there's a disconnect between developers and the rest of the organization.

The business and product people are unable to explain why they need estimates. Often they don't know it themselves and are just passing on the request from their bosses or customers. They hold devs accountable for estimates even if the requirements, priorities and teams change. They also think estimating is free - they don't realise that producing meaningful estimates takes time and requires a deep understanding of the code base. It also requires clear, precise requirements which they are unable to provide.

Developers at these shops believe their single responsibility is producing elegant code. They don't have to prioritise features based on their cost and value; and they don't have to communicate with clients, investors or sponsors. Many devs never learned estimating, that's why they call it an "art". They spend forever to produce estimates that turn out to be way off, even if requirements and circumstances don't change. The lazy way out is to say: Estimating is a waste of time because estimates are almost always wrong.

I have seen estimating done right: it took about 5%-10% of the overall effort spent. It helped make or break business cases, prioritise features and manage client expectations. It raised questions that, if left unanswered, could have resulted in massive cost explosions and unhappy customers later on.

Key cultural ingredients:

- Business and product people don't live on an island. They understand that estimating is neither free nor instant. They work closely enough with developers to know what level of detail is necessary, and they also understand the value of refactoring, test automation, pair programming, etc.
- Developers don't live on an island. They are involved in product decisions and client communication, so they understand why people are asking for estimates. They take pride not only in the quality of their code but also in their ability to deliver features quickly.
- Estimates are not used to crucify developers; the company culture allows an open and forward-looking conversation when the actual effort deviates a lot.
- Estimating is seen as a skill. Becoming good at it (i.e. accurate and efficient) is a requirement for senior dev roles.
- Everybody understands that when the world around them changes, the estimates change too.

Final note: work should be estimated in line with the predictability of the business. If your company is pivoting every 8 weeks then there's no use in estimating 6 months out. If you have a successful app for iOS and are considering expanding to other platforms, it's probably worth estimating.

Since you want to do the features with the highest impact per unit effort first, the units used for estimates don't matter. You could replace a time estimate with a number between 1 and 10, and that's equivalent for prioritization purposes.

As long as you don't let the difficulty numbers get translated back into dates, you get the best of both worlds. You don't waste time on projects that aren't worth doing, and there are no deadlines or estimates to cause friction.
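The prioritization logic described above can be sketched in a few lines (the feature names and scores are made up; `value` and `effort` are unitless 1-10 scores, which is the point - any monotonic scale gives the same ordering):

```python
def prioritize(features):
    """Order features by estimated impact per unit of relative effort.

    'effort' is a unitless difficulty score; the ranking comes out the
    same whether it is measured in hours, points, or an arbitrary 1-10.
    """
    return sorted(features, key=lambda f: f["value"] / f["effort"], reverse=True)

backlog = [
    {"name": "search", "value": 8, "effort": 5},   # ratio 1.6
    {"name": "export", "value": 3, "effort": 1},   # ratio 3.0
    {"name": "sso",    "value": 9, "effort": 9},   # ratio 1.0
]
ordered = prioritize(backlog)
# "export" ranks first, then "search", then "sso"
```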

I want to point out, that I'm not the author of this article. It was written by a former colleague of mine who is a very talented project manager.

The whole game around estimates is a dirty way of dealing with other issues:

1) Can this be done before we run out of money and market opportunity?

2) How should other assets in the company be coordinated around the product. e.g. marketing needs a lead time to produce their campaign.

3) Scopeboxing.

For #1: If you work in a big-corp, this is tied explicitly to some project budget you have to manage. Going over that budget is a no-no. For startups, it's the investment dollars in the company. Startups tend to be over-capitalized for the products they tend to produce. Big-cos tend to under-fund in an attempt to capture as much of the money as profit (especially if your big-co is a contractor of some sort).

Estimates are intended to be converted into budget dollars (time * team-size * burn-rate) and are critically important signalling mechanisms. No money, no work.

For #2: Development is not the only piece of the puzzle. There might be a team just as large and much more expensive that relies on your estimate to get their work done so that the company runs in a reasonably coordinated fashion. "It'll get done when it gets done" doesn't allow anybody else in the universe to plan anything at all, and now the company has a bunch of expensive resources sitting around doing nothing until the prima donna development team produces something. This is usually the source of adversarial and caustic inter-departmental relationships in a company.

For #3: Good development managers learn to invert the question of estimates. When asked "how long will this take to get done?" they answer with "how much time do we have? we can get this much done in that time" and then make that scope a commitment. This provides much more valuable information to corporate management, they get to set their schedule, but they also get to understand what the product they're going to be delivering is going to look like, which informs the teams from #2.

It also defines a clear barrier of what will be delivered and when. Scope creep gets cleanly pushed into "the next development phase" because anything that wasn't promised and agreed upon is outside of the current development phase. This approach lets the development manager set expectations, and also lets them never say "no". "Sure we can do that, I'll put that in the next development phase" becomes the answer. Phases should be scoped to deliver a completed working version of the product (as opposed to sprints which complete components of a product).

If the budget runs out, or a marketing campaign is underway and development is paused, the development manager now has a barrel full of things to scope out for the next development phase. Taking this time to work with corporate management and prioritize and scope out the next phase is usually a good idea. When the time comes to resume work, the development manager can again say "I'll deliver this much by this date".

Awesome development managers will underpromise on the scope and keep a reserve of features as a stretch goal. If they hit the major scope, they look like a competent and solid development team. If they hit the stretch goals they surprise and delight everybody else in the company.

I'm not saying it's easy, but it's better than a waterfall approach and what passes as Agile in most shops. It results in shipping products and allows development to be better integrated into overall strategy. What corp management needs is predictability, not undetermined development cycles and feature creep.

(#4) - To be able to say "how much time do we have? we can get this much done in that time" the devmanager needs to actually know how to estimate. A good technique is to break the tasks down and ask the developer who's likely to do the coding how long they'd take, add up those times and see if they fit by the due date. If not, start tossing things. It's amazing how much you can cut from a product and still end up with an MVP or with another version phase.

(#5) - To do all this also requires the devmanager to understand what are the minimum tasks required to achieve a working product at the end of a phase. Those tasks are the core of the scope. Anything else is stretch. This usually means the devmanager needs to have a bit of domain expertise to figure this out. It should come as no surprise: just like soda execs shouldn't run Apple, devmanagers who don't understand what they're building shouldn't be building it.

Estimates will always suck for any 10 of them, but they work in large numbers (you can tweak the knobs to compensate for bad estimates). My experience after around 5000 stories and 100 man-years (single project) is that we are slowly approaching acceptable accuracy!

Or save yourself a lot of time by making each story worth 1 point and just count how many stories you have. They will average out over the course of a program (this is borne out by studies of many projects at Thoughtworks).

The quality of estimates improves as the project approaches its completion date. Like forecasting the weather - looking out the window always beats the forecast for next week.

I've seen the opposite happen: the closer to the deadline, the worse the estimates. At that point the entire team is in constant crunch mode and getting tired. This causes even more defects, which cause even worse estimates...

One project I worked on last year saw four different authentication systems implemented. The last one was built on the leftovers of the scaffoldings of the previous three. We couldn't get rid of it because many other systems had been tightly integrated into authentication at various points in time.

I would say the quality of the estimates isn't a function of the project's completion but instead a function of the quality and simplicity of the codebase.

I liked this article but I couldn't stop thinking "you would do well in investing in a project manager" at every bullet point.

Are you supposed to have a project manager in a lean or agile shop? No, you're a self-reliant, democratically driven team with a PO as a single point of contact, and no hierarchy to mess up this beautiful utopia.

Exactly! No architecture, no structure, no process, no documentation and with some luck, no money either.

I've changed my mind on estimates; if you're giving a guesstimate it's your professional responsibility to make it clear that this is a guess and to explain that as you work or investigate the task(s) you will have a better estimate.

If you're estimating once at the beginning of a sprint and neglecting to update the estimate as you're working on the task or as you have new information on the task, you are being unprofessional. It's the finer-grained equivalent of your project manager or account manager failing to tell the client or the boss that the project will take longer than expected after re-evaluating risks.

At my current job, the CTO is asking for estimates in order to get a project under control. The first estimates at the beginning of our sprints are not fine-grained but as I work on a task I add more information and update the estimate based on what I know. Sometimes the second and third estimate will be a guess as well but at that point there will be some code written (or read/investigated) which makes it easier to do an estimate.

Padding estimates is something that I'm against when you have enough information about the task. If you're padding to account for things like meetings or interruptions, don't do it, just log how much those are taking and make a note of it and then everyone can see that your estimates are correct (it did take you 5 minutes to do task XYZ) and that meetings and bullshit interruptions are eating up lots of time. A daily standup that takes 30 minutes and then 1hr lunch and a 30min team meeting should be logged against the project but shouldn't affect your estimate of the task.

I think the issue is that the estimate can be seen, by managers and developers, as the end goal for the task. If you estimate at 2 hrs and you finish in 1 hr you can just get on Hacker News or reddit for an hour. If you estimate 0.5 hrs and it takes 10 hrs, you shouldn't be penalized as if you're late. Updating task estimates as you gain more information about them gets rid of that issue. If you think that 2 hr estimate was too optimistic, go ahead and update your estimate. You can't be held accountable for a guesstimate, you can only be held accountable for the failure or success of doing the task itself.

If you don't know how long it will take, make it clear to your team, boss or client that this is the case and that you will need some time to research and estimate.

If there are a lot of tasks that need to be done, estimating might take half a day or a day as you go through everything. That estimation work needs to happen for any project that has its deadlines already defined (this is frequently the case, and it's why agile can't be properly implemented in most workplaces). Management needs to give you time so that you can do the professional tasks that need to be done. You can't fix what you can't measure, and as time goes on your estimation skills will get better.

I'd like it if all my stuff took 100×1 min as in the author's story ;)
