Planning doesn't mean estimation though; we all know estimation is just guesswork, and basing business decisions on guesswork is madness. On top of that, it's a lot of wasted time and effort.

Milestones and projections are a much more constructive approach: simply tracking the number of tasks in your backlog over time gives you an increasingly accurate projection of when the project will be finished.

https://www.youtube.com/watch?v=QVBlnCTu9Ms
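
To make that concrete, here's a toy sketch of the idea (my own made-up numbers, not from the video): fit a trend to the remaining-task counts over time and extrapolate to where it hits zero.

    # Toy backlog projection: fit a line to the remaining-task counts
    # over time and extrapolate to when the backlog reaches zero.
    # The sample data below is invented for illustration.
    import numpy as np

    weeks = np.array([0, 1, 2, 3, 4, 5])                # weeks since start
    remaining = np.array([120, 118, 110, 99, 93, 84])   # open tasks each week

    slope, intercept = np.polyfit(weeks, remaining, 1)  # net tasks closed per week (slope < 0)
    if slope < 0:
        finish_week = -intercept / slope                # where the fitted line crosses zero
        print(f"Projected finish: week {finish_week:.1f}")
    else:
        print("Backlog isn't shrinking yet; no projection possible.")

As more weeks of data come in, the fitted slope absorbs both scope growth and actual throughput, which is why the projection tends to improve over time.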




Very good point: planning is a broader activity than estimation. I find people bring a lot of baggage to the idea of estimation, like in this case where you say "it's a lot of wasted time". There's an assumption in there somewhere about what constitutes an estimate. If you're talking about an eng-week-granularity estimate for a project that is going to take 6+ months, then absolutely I agree with you.

But for me an "estimate" is just a sense of the level of effort required, sufficient for the needs of planning at the time. It goes hand in hand with risks, dependencies, and other assumptions that need to be kept in mind. The exact form of an appropriate plan depends a lot on business constraints, like how the delivery timeline interacts with customers, external events, and the broader ecosystem the company operates in.

As for your suggested backlog approach, this makes sense if the requirements are rigid and accuracy on the delivery timeline is the most important thing. This is situational, but in many projects I've led the scope is negotiable, so it's important to keep the big picture in mind to find opportunities to refine or consolidate scope. A common anti-pattern I've seen when relying on ticket/task-level tracking is the team missing the forest for the trees. It's possible to leverage these systems to support high-level planning, but I tend to prefer documents or a simple spreadsheet Gantt chart for that purpose, and use ticket/task tracking for last-mile work, intakes, bugs, and other inputs that we don't want to lose track of but that may vary widely in their importance and relevance.


To your point, most planning is bad. That doesn't mean planning is worthless when done right. It seems that most bad planning fundamentally misses the interactions between different tasks (i.e., it largely treats the tasks as independent). A probabilistic approach that correlates the tasks seems to work better.
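
A rough sketch of what that could look like (toy numbers of my own, not any particular tool): draw task durations from a correlated distribution, so a project that slips in one task tends to slip in the others too.

    # Monte Carlo schedule estimate with correlated task durations.
    # All numbers here are made up for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    means = np.array([10.0, 20.0, 15.0])   # expected days per task
    sds = np.array([3.0, 6.0, 5.0])        # uncertainty per task
    rho = 0.6                               # shared correlation between tasks

    corr = np.full((3, 3), rho)
    np.fill_diagonal(corr, 1.0)
    cov = np.outer(sds, sds) * corr

    draws = rng.multivariate_normal(means, cov, size=100_000)
    totals = np.clip(draws, 0.0, None).sum(axis=1)   # total duration per simulated project

    for p in (50, 80, 95):
        print(f"P{p}: {np.percentile(totals, p):.1f} days")

Setting rho to 0 reproduces the independent-tasks assumption and the P95 drops noticeably, which is exactly how treating tasks as independent understates tail risk.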

Sometimes estimates are made out of ignorance or born of an optimism bias. But sometimes they are lies, because it's easier to get a project funded based on a misrepresentation, and to keep it funded once the initial cost is sunk.


The point is they're always guesses, so we multiply by pi or whatever to make reasonably sure we're not coming up short. But it's all a game, and everyone knows deep down that it isn't working.


> they're always guesses

Maybe bad estimation is so prevalent in SWE that most of us are jaded, but all it means is that we are pretty bad at modeling it, or simply don't care. Other domains, like aerospace, have similar problems but much better methods for arriving at reasonable estimates (when they care). For example, [1] gets into joint cost/schedule estimating using a data-driven method where you can put uncertainty around the numbers. It certainly beats 'multiplying by pi.'

[1] https://www.nasa.gov/ocfo/nasa-cost-estimating-handbook-ceh/
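
This isn't the handbook's method, just a toy version of the same idea of putting uncertainty around a point estimate using your own history (the ratios below are invented):

    # Scale a new point estimate by the distribution of actual/estimated
    # ratios from past projects and report percentiles instead of one number.
    import numpy as np

    past_ratios = np.array([0.9, 1.1, 1.2, 1.3, 1.4, 1.6, 1.8, 2.0])  # actual / estimated
    new_estimate_weeks = 12.0

    rng = np.random.default_rng(0)
    samples = new_estimate_weeks * rng.choice(past_ratios, size=50_000)  # bootstrap over history

    for p in (50, 80, 95):
        print(f"P{p}: {np.percentile(samples, p):.1f} weeks")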


Comparing to physical engineering doesn't make much sense to me.

Part of the problem is it's all virtual, so there are no limits; part is that we're always pushing the envelope in the complexity department.

Building the same software again with just a tweak or two doesn't happen very much; the reason we build new systems is often to try something so different that it's not feasible to adapt existing ones.

Most of the time we have no idea exactly what we're building until we get there; even the customers usually have no idea exactly what they want.

Imagine going to a car manufacturer and giving them the kind of specifications we usually get for software; I can assure you they wouldn't be able to give you a good estimate either.

Research is a better comparison. How long will it take you to build a fusion reactor? Depends, right?


There's a lot you've said here that I agree with, but I think we use it to arrive at different conclusions.

FWIW, I've worked in software, automotive, and aerospace. They are probably more alike than you may realize. Vague requirements are quite common. The link above has a section specific to software development, so I don't think it's fair to say it only applies to physical systems. I would characterize it as a systems approach, rather than domain-specific. It uses "WBS" or "work breakdown structure" to delineate tasks. That's an approach that is agnostic to the domain. So if you're working on a mechanical system you might have propulsion, deployment, and control surfaces. If you are working on a software system, it might be user interface, command & control, and data acquisition. But the applicability of the method is the same.

What I think you've highlighted is that SWE is generally much less well-managed than other domains. Mechanical engineers often get vague requirements too, but implement much more standardized processes to home in on those requirements, largely because the costs of jumping in and iterating are much higher. By comparison, software development is the wild west. Like you said, software is virtual, so people are lulled into complacency, believing they don't need a structured process because they can build and make tweaks cheaply. But what you point out isn't really that software development is inherently different, just that it's not managed well.

I think your research comparison misses its mark. Research isn't about developing some end-stage product for a consumer. The 'product' of research is an experiment, and every research project I've been involved with starts with an estimate of how much that experiment will cost and how long it will take. That follows the same basic structure as any other development effort. Maybe it's an apt comparison for some yet-to-be-proven tech like self-driving cars, but it doesn't explain why most basic CRUD applications miss cost and schedule estimates. I think the answer is that most software development has a culture in which process control isn't as well developed.


And that's my point exactly, every new software system is an experiment. No one has any idea if it's actually going to work out, it's all plug and pray.

If estimating research projects is as much of a thing as it is in SW, it's just as insane.

Pretending we know something we don't is lying.

Projections are another thing altogether: let's say we let this run for a month, then have another look and see where we are. That's much more connected to reality than pretending we know where everything is going from square one.


It’s no different from any new mechanical design. Every new car line or rocket design I worked on was an “experiment”, just like every new software product. You seem to be confusing engineering with mass production. They are two different things.

If you think you’ll get to do research without estimating cost and schedule, you won’t be a researcher for very long. Pretending you know something you don’t is bad project management. But what I’m advocating is estimating with transparency regarding uncertainty. Uncertainty can be quantified. Pretending software is some precious unicorn project that doesn’t have to follow any process is just rationalizing bad process management.



