Hacker News

> instead of sitting down and figuring out the actual requirements first like any real engineering job

If it were that easy, we would have been doing it that way for the last seven decades. The analogy falls apart on closer inspection:

civil engineers:

• client is not a domain expert

• for the most part, client needs are easy to convey to the civil engineer

• can employ a wealth of standard solutions refined over the course of millennia

software authors:

• client is a domain expert for the subject matter that the software is supposed to model

• for the most part, client needs are difficult to express and convey to the software author

• if a standard solution existed, the client would have already bought it off the shelf, so coming to a software author always means customised development

• the entire sector is still in its proverbial baby shoes

We have to come to grips with the fact that we unfortunately can't transplant what happens to work well in a different sector. The real value of non-traditional software methodology is that at least the client notices early that the requirements diverge from their actual needs.




I agree it's not easy, but instead of trying harder to become good at it, it seems like this industry in its proverbial baby shoes threw a tantrum and decided that it doesn't want to learn at all.

So instead of pushing for more effort & skill for up-front planning and modelling of the requirements and design (and trying to employ modelling tools & formal methods), we're just like fuckit nah let's just implement something and see where that leads us.

It doesn't help that so many of the up-front design failures were not examples of engineers screwing up but sales people selling shit and then having engineers cook it.

We'll be stuck in baby shoes for a long time with this approach.


We tried all that. Formal requirements gathering, modeling tools, the works. It was the SDLC/Six Sigma/ISO9000/vendor-certs-are-substitutes-for-degrees era of the late-90s/early-2000s. And it also didn't work.

Oftentimes it's very easy to determine the functional requirements: "Process this billing statement", "Spit out that report." The complexity doesn't come from the requirements. It comes from the constraints: "We only use this ancient version of this database", "We didn't tell you at the start that the users expect to interface with this system via email".

To quote Mike Tyson, "everyone has a plan until they get punched in the face." The point of Agile is to get punched in the face earlier and more frequently.


> The complexity doesn't come from the requirements. It comes from the constraints: "We only use this ancient version of this database", "We didn't tell you at the start that the users expect to interface with this system via email".

Those constraints sound like requirements: we require that only this ancient version of this database is used. We require that users can interface with this system via email. Different term, same thing.

That's in the bucket with the kind of stuff that I'm advocating people should pay more attention to up-front. Gathering these requirements before you start designing anything is a big deal. If you failed to do so, then you failed to do a proper job at the part I called sitting down and figuring out the actual requirements.


There's another problem not really addressed here: when the requirements you get from the customer are wrong, make no sense, could be done better another way, are for things they won't actually use, or are part of some internal political struggle.

This is something that crops up quite frequently in my work. We'll get a requirement like this: "the app should use HTTP like a web browser, not a different protocol". On inquiry as to why it's so important for something that's not a web browser to so strongly resemble a browser, the answer will be something like, "so we don't have to ask IT to open a firewall port". The person deploying the software could just fill out the paperwork, but they find it easier to "require" that the software sneak past their own corporate security controls. This is the sort of requirement that is real in some sense and not in another sense, depending on whether you define the customer as whoever you're talking to on the phone right now or the business as a whole.


A lot of places will never give you access to the people who can answer those questions and will expect you to start work right away and "quit wasting time", or they "put you on a Performance Improvement Plan"/fire you. The problem is not developers doing a bad job, it's management preventing them from doing a good job.


Agreed, the industry isn't actually allowing engineers to engineer. https://news.ycombinator.com/item?id=22391047


You're missing the point. In the real world some of those requirements are unknowable on any reasonable cost or schedule basis regardless of your analysis process. From an economic standpoint sometimes the best choice is to accept some uncertainty and get moving.


Companies are more than happy to churn through staff, losing millions in accumulated human knowledge, just to keep it out of the hands of employees.

My company just shut down an entire office, they offered three out of four critical employees the opportunity to move to Atlanta. They didn’t want to move, opting to retire instead. Company just shrugged their shoulders rather than letting them work remote, gave them like a month to transfer knowledge and that’s that.


Didn't we start by trying to plan ahead? There's a reason we have moved to agile.

In software, if you know what you are doing ahead of time, you are not really innovating and you might as well be using an existing system, since anything standardised is already available as a library.

When building bridges, you can't just use a library that's .buildBridge(...parameters), even though similar things have been done thousands of times before. With code, if something is done enough times that you know how to plan for it, it is likely already a library.
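The "solved problems become libraries" point can be made concrete with a small sketch in Python (the specific tasks chosen here are my own illustrative examples): jobs that once required bespoke per-project design, like serialisation formats or integrity checks, are now single standard-library calls, so the planning effort shifts to whatever isn't already solved.

```python
# Problems that have been solved "thousands of times" end up as
# library calls; only the unsolved part of a project needs design.
import hashlib
import json

record = {"id": 42, "name": "bridge"}

# Serialisation: once a bespoke format per project, now one call.
payload = json.dumps(record, sort_keys=True)

# Integrity checking: once hand-rolled checksums, now one call.
digest = hashlib.sha256(payload.encode()).hexdigest()

print(payload)
print(digest)
```

There is no equivalent `json` or `hashlib` for bridges: every physical bridge has to be rebuilt from raw materials, which is exactly the asymmetry the comment above describes.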

You can plan the building of a bridge because it has been done, and the experience gathered, thousands of times over.


There is a reason we moved to Agile, yes, but it was the wrong solution to the right problem. Scrum identified the problem as a lack of accountability for management, and then imposed new processes on employees without any teeth to hold management accountable. That's why I hate Scrum's too-cutesy-by-half zoomorphization of the problem with the analogy of the chickens and pigs (https://en.wikipedia.org/wiki/The_Chicken_and_the_Pig). In a real barnyard, the pigs and chickens are on equal footing: they are both slaves to the farmer. But in most software dev projects, the developers are not on equal footing with the project manager.


> Didn't we start by trying to plan ahead? There's a reason we have moved to agile.

"Everything was waterfall and then came agile" is a myth, but yes, it looks like it's fashionable to no longer try to plan ahead. I'm not convinced the reasons are good, and I'm not convinced the craft of planning ahead was ever perfected to the point where anyone could say it doesn't work.

Scrum, XP, Agile all date back to the 90s, when one server running PHP or Perl (with SQL injections) passed for software and others were still re-inventing basic data structures in C (and then gasp C++ with templates!) from first principles... The methodology, tooling, rigor, and collective experience in software development have taken leaps in the past two to three decades. Likewise, understanding systems and planning ahead has become so much easier. If only people took it seriously and spent as much effort on it as they spend on fad-of-the-year frameworks.

People just stopped trying, and when I look in software projects now, there's hardly ever anything resembling a model of the software that you could use to test new ideas "on paper" to see how they fit the given problem and current application.

People aren't even trying, just like they aren't even trying to achieve correctness: https://news.ycombinator.com/item?id=22224054

The less you try, the harder it will be.

> In software if you know what you are doing ahead of time you are not really innovating and you might as well be using an existing system.

I think the vast majority of software is boring and not even meant to be innovative, and indeed would be best written using an existing system, but customizing these to the specific needs is just another can of worms. As with bridges, you can't just copy & paste an old bridge to a new location and use it as-is; you need to design it for the locale and other constraints. It's still probably going to be yet another relatively boring bridge with no innovation.

And indeed a lot of software work is all about using existing systems, but perhaps at a lower level than you posited. Customizing complete off-the-shelf packages does happen, but it's more common to glue together something "new" (but not innovative) using existing frameworks, libraries, services and glue languages.

You still need to design to do it right. But even here people seem to rush ass-first into whatever kit looks fashionable now.


Agile as a named methodology may date to the 90s, but the concept of rapid iterations, incremental improvement, and constant feedback cycles has been on the books since at least the 1950s.

The last time[0] I vented on the subject, I had to dig out the research papers I keep on my desk, because there was more than passing curiosity. Oh yes. I keep these on my desk so I can quote them when needed.

Check out this research paper from your nearest sci-hub entry point:

- DOI: 10.1109/MC.2003.1204375 ; "Iterative and Incremental Development: A Brief History"

---

0: https://news.ycombinator.com/item?id=20563125


Fully agree. I wrote myself a summary at some point: http://beza1e1.tuxen.de/waterfall.html


"Not trying" is not new. The problem with both Waterfall and Agile is that nobody ever really did them the way they were intended. Waterfall was supposed to include prototyping during requirements gathering, specifically to aid in discovering constraints. Agile was supposed to give equal footing to all team members and not let the project manager spring new constraints on the team without planning for the change. Neither of those things happened very frequently, because our modern business culture treats knowledge workers as interchangeable minions, and management as unassailably perfect.

Yes, people don't do their jobs. That's not the fault of the process that they are failing to adhere to. Indeed, no process of any type could ever save them. It's the culture that is wrong.

So, in a roundabout way, I agree that Agile isn't the solution. No bottom-up process will ever succeed, because at the end of the day, the failure of the team is always the fault of the leader.


The binary thinking in this thread strikes me as odd. Is it bad to try to plan everything 6 months ahead of time? Yes. Is it also bad to fly by the seat of your pants and only plan 1 week ahead? Also yes. Instead, why don't we plan over some intermediate period, say 3 or 4 weeks ahead, or whatever time horizon ends up being the most efficient.


Yes, there is no single correct planning horizon. The optimal horizon is proportional to requirements stability. At the extreme, when requirements flux is extremely high, the best approach can be a Kanban process, planning no more than a day in advance.
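The "horizon proportional to stability" idea can be sketched as a toy model in Python. To be clear, the function name and the linear scaling below are my own illustrative assumptions, not an established project-management formula: it just maps the fraction of requirements that survived the last cycle unchanged onto a planning window, from roughly a day (Kanban-like) up to a full month.

```python
# Toy model: scale the planning window by requirements stability.
# "stability" is the fraction of requirements unchanged last cycle.

def planning_horizon_days(stability: float, max_horizon: int = 30) -> int:
    """Return a planning horizon in days for a given stability in [0, 1]."""
    if not 0.0 <= stability <= 1.0:
        raise ValueError("stability must be in [0, 1]")
    # High churn (low stability) -> plan about a day ahead, Kanban-style;
    # fully stable requirements -> plan out to the full horizon.
    return max(1, round(stability * max_horizon))

print(planning_horizon_days(0.1))  # high churn: short horizon
print(planning_horizon_days(0.9))  # stable requirements: long horizon
```

In practice nobody measures stability this precisely, but the shape of the function is the point: the horizon is an output of the project's volatility, not a number you pick up front.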



