My 20-Year Experience of Software Development Methodologies (zwischenzugs.wordpress.com)
413 points by zwischenzug on Oct 15, 2017 | 147 comments

There is an insider story about how these methodologies come about. There are a few groups of people whose sole job is to do consulting on failed/late/over-budget projects. Mind you, they don't write code; rather, they observe how things are going and then prescribe process/management improvements (McKinsey style). Once in a while, these folks bump into terrible projects where whatever they prescribed works like a charm. In that case, they take that prescription on the road and advertise the hell out of it in conferences, magazines, and blog posts. Unlike regular developers, they have all the time in the world for these activities. They write books and give interviews, and through the media's power of distributing information they suddenly pop out as process gods who know how to fix any project. Eventually the new thing starts to fade, people realize what worked in project X didn't work in project Y, speaking engagements start drying up, and then these folks need a new thing to repeat the cycle.

The obvious problem is that these folks prescribing the development process are not active developers. They are not even part of any real project over any long duration. They are in the business of inventing and selling processes and handing out management advice as consultants. Whatever they prescribe might have worked only in a specific context and for specific symptoms, usually with a huge dose of luck. Next time you see a new process fad, look up the history of the originator of this process: what company he is part of, how much code he has written, how he makes money. You will know what I'm talking about.

This is not true for agile. The agile manifesto was in fact drafted by software developers. Also, agile doesn't prescribe any process at all; in fact, it does the opposite. Agile in essence is quite beautiful; unfortunately it gets twisted and turned upside down until it's just another methodology (which is exactly the opposite of its original meaning).

Some 10 years ago I attended a purportedly OO development conference which had by that time been taken over by the agile crowd. It was full of nice, chatty, and energetic people.

I was lost. It felt like slipping through a crack in the universe, Sliders-style, into a reality where you can't understand people discussing the things you've been doing for years. There were a few other developers attending, seemingly just as startled by all this.

The whole thing ended with some Agile activity for all attendees which I can best describe as a mix of a kindergarten class and a Baptist sermon. I could not comprehend the intent, the procedure, or the desired outcome (or whether it was achieved).

That's how I felt when I attended my Scrum Master training. There was almost nobody in there who actually wrote code. The discussions were very abstract and about situations I had never encountered in my working life.

A couple of points re this.

People attending SM training are often just middle managers learning enough to know WTF their SMs and PMs are talking about in meetings.

Also, there's no requirement for SMs to be coders; it's a role a lot of BAs take on. Some experience in a software development context is desirable, though.

I found a lot of my SM training relatable to real life. My team went crazy for planning poker because it got a lot of assumptions out early.

Standups were impossible because some people started at 1pm, others finished at 4pm.

I see Scrum as a bag of tools to apply to teams. Some work, some don't.

Oh, you know, I'd forgotten about the paper airplanes and pasta sculptures.

I actually like vanilla Scrum and think you often need a formalized methodology for the team to follow. I've worked with a lot of different teams, and there are a few things I've noticed being pretty common among developers (and that I've sometimes been guilty of myself).

1. Not working on what you are supposed to be working on. Sometimes developers feel that they should probably rewrite X or add Y and do so without consulting anyone first. Sometimes it is good, and sometimes it just takes time or adds regressions you did not need at the time.

2. Senior developers owning a piece of the code base. They often grab a huge chunk of work on it for themselves, and then you don't hear from them for a while as they hack away. Sometimes they do good work and sometimes not, and if they are sick or away, no one knows how their code works.

3. Junior (or some senior) developers being stuck for a long while without asking for help.

4. Ignoring the customer and building what you think should be built.

None of those are automatically fixed with Scrum, and I know some here will just say "Yeah, we have a professional team that actually talks with each other and does code reviews," so they don't need it. I get that, but for a lot of those small teams building some CRUD application at a large enterprise, the formalized communication channels are a godsend in my experience.

I agree with you that a formalized methodology is required for you to start with agile, be it XP, Scrum, Kanban, or whatever flavor the team itself feels most comfortable with.

What no one tells these teams is that the methodology/process itself should be flexible and should be allowed to change to accommodate the idiosyncrasies of their work environment. What I've noticed helping some small companies is that teams learn some framework, use that as their "agile" concept, and then proceed to be overzealous about the process itself, completely forgetting the original manifesto.

Like always, it's a strange whisper game...

Adjusting the process is literally part of Scrum. At the retrospective, if something isn't working, you call it out and change it. That can be something as big as completely changing the methodology. I've never worked on an XP team, but my understanding is that formally there is no retrospective process; I would definitely want to put one in place if I did work with it.

Any development process that doesn't allow for changing itself is shortsighted and ultimately self-defeating.

Except at most companies that use Scrum, there are very rigid rules sent down from above regarding the implementation of Scrum. These rules apply to the whole company and can't be questioned or changed via retros.

This is definitely the case for me. I work at a reasonably large (~5,000-7,000 total employees) company that had a top-down Agile Mandate imposed about a year ago. It's worked reasonably well, honestly, but there was a lot of chafing initially, and there's still some resistance. Problems with the process are occasionally solved by abandoning it entirely, with mixed results.

The thing to be careful of is that a lot of developers also don't like certain parts of agile, and those don't get used. They may prefer to continue using long-cycle approaches (code reviews) where a shorter-cycle approach is now necessary (pair programming). The end result is they pick and choose the bits they want and end up failing to ever do the process even once, not because it's impossible but because it isn't their preference. They didn't do it out of knowledge; in fact, they failed to see the interplay of the practices working together and how the entire thing feeds together to produce the whole.

I think most teams drop agile practices before they have the knowledge to show those practices aren't appropriate; usually the practices never get used at all. They adopt only what they have to, because they don't really want to do it at all and have decided a large chunk of it doesn't work.

Changing the process from a position of knowledge and experience of it is one thing; doing it without that knowledge is the more usual thing, and it's harmful.

>I actually like vanilla Scrum

Scrum has some really serious flaws:

1) It creates a massive disincentive against refactoring technical debt (refactoring stories essentially have to be 'sold' to the product owner).

2) You end up spending more time than is necessary in too-long meetings. Planning should really be done continuously rather than in biweekly meetings.

3) The idea that you should deliver working software every two weeks is outmoded. You should be aiming to deliver potentially shippable increments continuously.

1. The PO is the representative of the customer and cares about the long life of the product. That sounds like a role that should care about the future state of the code/product, though they will obviously want to understand the cost/value of anything that isn't helping the immediate needs.

2. Planning can be continuous; every two weeks you just look at what you should focus on next, and you can still bring work into the sprint along the way if everyone agrees.

3. You plan to deliver a bundle of value every two weeks (so you can bite off enough to get stuck into, but not so much that you can head in the wrong direction for long), but you can release as soon as the code is ready; there's no need to wait until the end of the sprint.

>the po is the representative of the customer and cares about the long life of the product.

That does not make them qualified to rank and/or write refactoring stories.

IMHO, they should tell the developers how long to spend on refactoring/tooling (e.g. 30% of their time) and developers should figure out themselves how to spend that time.

That isn't scrum, though.

>planning can be continuous, just every two weeks you look at what you should focus on next

That's saying that planning both is and is not continuous.

>you plan to deliver a bundle of value every two weeks

I call this mini-waterfall. It's a good thing to shoot for if you were previously doing releases every 3 months. It's not a good thing to shoot for if you were previously doing daily releases.

Sorry, I wasn't clear about the value piece: you deliver (to live/the user) each piece/story when it is ready, but you have a grouping of items that you want to improve across the sprint. I.e., this sprint we will improve login with these 8 stories.

And I fully agree the PO won't write the refactoring stories, but anyone can write stories. I'm not that technical, but I've never worked somewhere where the team wasn't able to talk about the value of tech debt etc. and help me understand the value of bringing it in.

>I'm not that technical but I've never worked somewhere where the team wasn't able to talk about the value of tech debt etc

They can talk about it, but under scrum it has to be sold to the PO to get it prioritized. I've worked with plenty of dev teams that could talk but couldn't sell.

And it's the hardest kind of thing to sell, because the benefits are diffuse, abstract and usually long term, whereas feature stories are the exact opposite.

I've seen this whole sales pitch/pushback process fail in lots of places. Techie explains problem to PO in techie terms -> PO's eyes glaze over while they nod their head and agree that this stuff is important -> story keeps getting pushed down the backlog in favor of stories that have a concrete dollar value attached -> quality suffers as a result.

True, however I'd say this is a problem with having an incompetent person in charge, not a problem with the process. I've had the experience you describe, but also the opposite experience, where people understood the need to rework things and what the business impact of not doing it would be. Two things make the difference:

1. The person making a decision like "should we do refactoring" should not be incompetent. They should be technically great and also in tune with the company's business needs. I know some smaller companies have non-technical people in all non-dev roles, so it's either the devs who do the work deciding or a non-technical person deciding, but that is just stupidity; at the very least, the boss of the development team should be a development person. I seriously doubt that at, for example, Microsoft or Google, a non-technical person makes this type of decision.

2. The incentive system should not be messed up. Especially at companies with multiple levels of traditional managers, you have problems like people treating their job almost like a game, where they try to make their numbers look good at the end of the year or avoid ever telling their bosses something their bosses don't want to hear. They completely lose alignment with the company's interests and treat their job as a game of looking good to their boss, so even if they understand the need for something, they might say no to avoid missing a deadline they told their boss they would hit. This doesn't really happen at SD companies, because most of them are new and never developed this type of culture, but it happens at companies that are in another field and just have a SD department.

I also don't think the "30% of time" solution is good. Sometimes you need more, sometimes less; sometimes it's more important or more urgent, sometimes less. I still think you should "sell" refactoring to your boss, but the "boss" at the level where "should we refactor" decisions are made should be an extremely technical person who also knows how it will affect the business side of the project.

>True, however I'd say this is a problem with having an incompetent person in charge, not a problem with the process.

No, it's a process problem. Devs can't directly assess the relative importance of setting up a CI server compared to the new login workflow. If they could, there wouldn't be any need for a PO. POs probably don't grasp exactly how important setting up a CI server is. And why should they?

>The person making a decision like "should we do refactoring" should not be incompetent.

That's a given. If they're incompetent, no matter what process you give them they will inevitably just make things worse.

>I also don't think the solution of "30% of time" is good. Sometimes you need more, sometimes less, sometimes it's more important, sometimes less, sometimes more urgent, sometimes less.

Yeah, sometimes more, sometimes less depending upon how much pressure customers are putting on you, if the trade show is next week, if it's a quiet week, etc. IMHO PO should choose the % each sprint and devs should choose what goes in it.

>I still think you should "sell" refactoring to your boss

That's tantamount to saying you don't want it to get done, because devs are typically shitty salespeople. Also: do you really care whether module B is decoupled from module A or a new CI server is set up? Why not just say "spend 30% of your time this sprint on code quality improvements" and leave it at that?

At most, the devs should sell the % of time they want each sprint. Honestly though, I'd be happy to let that be 100% at the PO's discretion. Just don't make me try to explain to a non-techie why it's important that I need to refactor some tests.

I think what you don't understand in my post is that I am saying POs should grasp those things.

I just don't see the reasoning behind hiring someone who is non-technical for such a position.

And the answer to "why should they" is precisely because they can then understand the development side better and these types of problems are avoided. It allows them to make more informed decisions. Basically, why should they not?

That's a list of poor engineer attributes, which need to be addressed via conversations, not by imposing a process in the hope that it'll fix them.

The process is a tool you use to address them. You might have worked with more mature engineering organizations than I have and with people that are more professional.

But I've worked with guys who spent most of the time surfing the web or were really territorial with their code. Or teams that were just shitty at communicating with each other so you had frequent misunderstandings or blocking. Or people who said "Yeah, this will be done in a week" and then after one week they say one more week etc.

Some are personal issues that should probably be handled by a boss and some are just shitty engineering practice that could be fixed by conversations (provided someone actually knows good engineering practices). But that also requires people not to get defensive and say that we have always done it this way and it works or become angry because they think you are calling them out.

Instead, agreeing on things like these:

1. The work we do is agreed upon by all and is visible on a board or website.

2. Every day in the morning we chat for 15 minutes about how things are going and if we need help.

will fix some of them in my experience, and the angry people can whine about the process for a while instead of directing that anger at teammates.

Fair response - my assertions were a bit extreme :-).

You can use process to force change, but personally I think a strong manager/leader would aim to resolve these issues irrespective of the process. Also, and I think this is why so many complain about agile, it is used to change these types of behaviour and is then seen as 'the bad guy'.

My personal view is that the process should be aligned with your business, not as a tool to fix people challenges.

The 15 min daily catchup is invaluable irrespective of your overall process.

Easier said than done though :).

Happy to talk further and learn from your experiences. Thanks.

> Sometimes some developers feel that they should probably rewrite X or add Y and does so without consulting anyone first.

To be fair, project managers often decide to keep using X or leave out Y without consulting anyone first.

Also, to be fair, project managers are rarely organized or explicit about where the project is and where it should be with respect to cross-pollination. If they tracked those goals like they do other things, you'd see some developers taking this style of project under their wing.

That being said, my experience largely matches yours. Though I find the success rate in setting up a closed feedback loop (actually demoing for the actual people that should be happy with the actually deployed feature) is much lower than people notice, so I'm not sure agile is as technically useful as it seems. It's politically useful, for sure. You can say "Working on the wrong thing? I gave Henderson eight demos as work progressed this fall!"

The developer should consult not just the manager but also (or primarily) other developers. Unless you are working in isolated silos (which is fine), every developer should coordinate with the others.

And some of the best products ever created were written without agile. How did we ever manage? I don't like it when people tell an engineer they're wasting the company's time because they implemented something no one asked for. Usually, the customer or client needed that and didn't realize it. And it bites them in the ass later if they don't.

Sure, I'm not saying it is the only methodology to follow or even the best. I'm saying that in my experience it has been a good tool to fix issues like that.

And sure, you sometimes know that certain things need to be implemented, but it should not be done without anyone else knowing about it. At the least it should be known and agreed upon with the team, and then discussed with or made visible to the product owner/PM/whatever role.

> The agile manifesto was in fact drafted by software developers

Really? When I look at the list of authors of the Agile Manifesto, I recognize several names as belonging to people who have made a career out of coaching and advising. Several of those people made their careers by bragging about their involvement in a catastrophic failure.

I mean, they're probably all better software developers than I am. But I think that sytelus's description seems to be right on, at least for many of the Agile Manifesto authors.

You can't argue that there hasn't arisen an entire industry of snake oil salesmen whose sole purpose is pushing Agile, and Scrum, and whatever.

To me the key differentiator in agile approaches is whether it's business-led or developer-led. If developers can decide on what gets done and when, then agile can work quite well, but to me that's just a pipe dream in most cases. Unless a company's sole business is software, and the financial health of a company is strong, developer-led agile is a hard sell for business owners, even if it leads to long term benefits. So what happens instead is business-led agile which in my experience is worse than waterfall, as you get constant demands for poorly defined new features, with tight deadlines as that's being 'agile'.

In other words, agile has just become a stick for companies to beat their developers with. That wasn't the original intention, but I would suggest that's at least part of the reason why management types like the term 'agile' so much.

Developer led as in you give me some requirements and a deadline and then get out of my way and let me decide how i will break down my task and coordinate with other team members? Give me waterfall please! :)

Interesting. What issues have you had with developer-led agile?

That it's not really developer-led and is just another form of micromanagement. A developer-led way of development would do away with sprints if it suited the purpose. It would prioritize requirements over stories if it suited the purpose. It gets even more frustrating when you have to start to work a certain way with Scrum (e.g. frequent commits, vertical development, etc.).

Waterfall was also drafted by software developers, who noticed that many of the least tractable errors had their roots in poorly-specified requirements and poor design. It was a first pass at a problem that is still with us.

There's working with agility and there's "Agile." One is a product; one is not.

Something like Scrum springing up is completely predictable, though.

Well, let's see:

Extreme Programming (XP): created by Ron Jeffries, Kent Beck and Ward Cunningham - all developers.

The Agile Manifesto: written by seventeen developers (http://agilemanifesto.org/).

Scrum: created by two of the seventeen above.

I would distinguish between agile as conceived and agile as sold. Agile itself was very simple and limited as a manifesto, arguably trivial and obvious.

It was definitely not obvious at the time. Those people spent years experimenting with alternatives to the status quo before they could even really articulate the common direction.

And I'd say it still isn't trivial given the number of places that fall short of the vision as expressed in the manifesto's values and principles.

I would agree, though, that things have changed as it has been sold. That was much less about what programmers wanted to happen, and much more about what managers were willing to buy from consultants.

>Agile itself was very simple and limited as a manifesto, arguably trivial and obvious.

It's mostly just vague. Moreover, "Individuals and interactions over processes and tools" is, IMHO, wrong, and has been used to justify de-prioritizing tooling improvements on the teams I've worked on (e.g. automated testing, BDD).

The interpretation was, I think, defensible, but the outcome wasn't. Human interaction is nice but it doesn't scale.

I think a large part of Agile's popularity is due to its vagueness. A bit like religion, people just assign their own meaning to it.

I was too early in my career at the time to know how trivial and obvious agile was when it appeared; it sounds to me more like a Columbus' egg, but I'd appreciate it if anyone can give me sources on how obvious agile was before the manifesto was released.

For anyone else who hadn't heard this phrase before: https://en.wikipedia.org/wiki/Egg_of_Columbus

Thanks for introducing me to it :)

It's very much dependent on the tech. At the time, most of the languages in use were slow and error-prone to work with, requiring large teams and long timeframes to deliver systems, and upfront planning was highly important.

The people who created Agile were working with languages such as Smalltalk that simply allowed you to create more functionality faster, which enabled Agile. For smart, experienced people working with tooling that allowed small teams to produce whole systems fast enough that business feedback was useful, Agile could be called obvious. However, to the majority of the industry at the time, struggling with hugely complex and slow-compiling languages with a large time cost per feature, Agile was not obvious; it was impossible. And that was about 99% of the industry at the time Agile evolved.

It was good times - I was creating entire systems as a one or two person team, which the competition would have taken 5-10 times longer to deliver.

Industry-wide tooling has generally caught up, though, enabling the possibility of widespread Agile. However, judging from the dozens of companies I've seen the inner workings of, few people do it that well.

Another important point that just doesn't seem to be commonly discussed is that Agile works in the context of highly experienced developers, but has now been attempted by the wide majority. All the signatories that I'm aware of were career developers with decades of experience, and it worked for them. Most of the dysfunctional agile environments I've seen have come from lack of experience and the perspective it brings, combined with the empowering attitude that agile brings - they don't know what they don't know.

It's like jazz improvisation, in that the difference between mastery and cacophony is hard for beginners to understand, and basically impossible to master without a lot of experience.

And the reality is that many, perhaps the majority, of developers are actually beginners!

Consider the huge range of skills required (language, libraries, OS, networking, patterns and techniques, industry advancements, then general professional skills such as organisation, communication, and time management, then the economic understanding to apply all of these to the business domain; the list goes on and on). It basically takes most people 5-10 years to actually grasp the skillset, and then double that again to master it.

Combine this with the growth of the IT industry, which means greater numbers joined recently and fewer of the more experienced people started decades ago, even if they are still working. Overall the ratios are terrible, with the majority of people not having the level of mastery and perspective to deliver on the potential of Agile; they are probably better off with more up-front planning!

But the voices of the crowd are deafening these days.

Thanks for the thorough description; I couldn't agree more with agile working better with senior engineers. After about 12 years in the industry, I can definitely see how much my skillset lacked during the first 5 years to be able to understand and apply agile efficiently...

On the topic of the growth of IT and training, I haven't seen much talk about a team structure closer to what NASA does, with each engineer having a mentor. No company I've worked at so far had that as a process; it happened organically sometimes, but I think what hinders a lot of young professionals is finding someone who will make them learn faster. Personally I only had that feeling once, and it was definitely the best 2 years of development in my career; I would love to work more at companies which have something like this in place for engineers.

That's my point: you can't do as the original poster suggests ("look up the history of originator of this process"), since those are often developers.


I find it hard to take Ron Jeffries seriously after his failed attempt to write a Sudoku solver:


The one major thing I continue to love about Extreme Programming is that it contains a solid list of practices that you can implement, along with an understanding of why they work together. The issue I have with the agile manifesto is that it fails to acknowledge that changing away from that without solid experience of it usually means making that decision for the wrong reason. The agile manifesto's great flaw, I think, is that it assumes experience in these types of processes and the sort of practices that underpin these principles; without that experience it is very hard to adopt them.

What about Alistair Cockburn's Crystal Clear methodology?

Thanks to dang, we know exactly where the waterfall model came from:



It starts with a brilliant writeup on software process that shows waterfall as an example of what not to do. That document recommends iterative practices. Then, it got twisted into the waterfall model that was applied everywhere. The OP's comparisons to religion are quite warranted here.

> Once in a while, these folks bump in to terrible projects and whatever they prescribed sometime works like a charm

Not always true. For example Extreme Programming actually originated from an effort to save a large software system in trouble (a payroll management system for Chrysler). But in this case the prescription didn't work and the project was never completed.

It is pretty impressive to be able to sell a methodology on this background.

I read that the main selling point of XP is "a group of highly skilled people doing their thing"

Which only works as long as you got highly skilled people to start with.

>Which only works as long as you got highly skilled people to start with.

Moreover, I would argue that almost anything you put down on paper as the development process can work if you've got an interested team of skilled people. Such a team will naturally adapt and adjust until they've got something effective, if necessary to the point that the original on-paper idea has all but disappeared if it wasn't helping. That's also essentially the first point of the original Agile Manifesto, though of course it's been happening since long before anyone ever wrote it down and gave it a name.

You're not wrong, but all those software methodologies aren't meant for teams of very good developers, because they don't need them, but for the large majority of mediocre programmers who need to be herded.

Right, so the effectiveness of any particular methodology when applied by a team of very good developers isn't particularly useful information. What would be more useful is knowing whether the same methodology brings significantly better results when adopted by a team of average developers when compared to whatever they were doing before.

I'm agreeing with k__ here.

It's not as black and white as that. You could have very good developers on a team with a mediocre "scrum master" or manager, and they could conceivably become less productive because of it, despite best efforts to make it work. You can disagree, but I've seen it first-hand.

A bad manager can typically do far more damage than any individual member of a team, but IME it's unlikely that people who are good are going to stick around for long if they're being managed by a moron, whatever that moron's oh-so-trendy job title may be. A team of developers good enough to figure out a working process for any given job without management holding their hands has plenty of better options.

That aside, surely the more interesting scenario is still the reverse of the one you described: can good management leading average developers get better results from those developers by adopting certain working practices?

I think the problem is if you've got a mixed bag.

Part of the team needs it; the rest goes crazy because of it...

Yes, precisely, and the people who buy into it and force their organizations to adopt the latest fad DON'T WRITE SOFTWARE. I'm so fucking tired of managers trying to play chess with teams when they have never really dealt with a software problem. I really think all these methodologies are ways for non-programmers to CONTROL programmers. In the near future, I think programming will be required for any software company employee, because compared to writing competent software, whatever the fuck management is doing will eventually be some form of automated reporting/dynamic to-do list.

Test driven development is about the only methodology that has worked for me. Agile, etc has been useless, at least at the gigs I've worked.

It's about interactions within the team, about how the team interacts internally; the author pointed to an article about this:


meaning that process doodads are OK, but you have to focus on interaction.

I've been trying to find good material on software development in an enterprise setting in the 'real world', not the magical world of unicorns and pixies you nicely described.

If anyone knows any good sources shout out.

No, the Waterfall model and BS 5750/ISO 9000 both originated in traditional engineering; back in the day I actually worked on research to apply BS 5750 to software development.

What the parent means is: Can you point to any other (faddish) methodology that didn't come about in that way? Also, waterfall was provided in the original paper as a minimally-viable methodology of software development which was specifically recommended against. A kind of "If it comes down to using waterfall or using nothing, I guess use waterfall" type of thing.

ask the USA and UK MotorBike and Car industry about the "faddish" methodology the Japanese companies used to hand them their ass

But we're not talking about "the USA and UK MotorBike and Car industry" now are we? Or, you know, really anything but software development methodologies?

Actually, I'd say the statistical methods used by the Japanese (mostly to zero in on sub-par team members by shuffling them through teams and measuring the results) do have some similarity to test driven development, and even the sort of frequent (rough) measurement that Agile allows.

Really liked this article.

I liken software methodologies to cooking recipes.

On one end of the scale you have a highly skilled and experienced chef, who can craft a great (requested) dish with a variety of ingredients and equipment. Knowing how to adapt and what to add and when. This works well, but requires a heavy cost (experience) and is, from the outside, difficult to follow and control.

On the other end of the scale you have fast food. A strict process and fixed ingredients that ‘just works’. Those involved in cooking have very little knowledge of why, and are almost certainly not able to adapt. This works well for scenarios where you exactly know and control both the input and output.

A skilled chef can ‘codify’ their recipes, but this results in a specific description of what to do, and not why. Thus marginalising the adaptability.

In my experience, most software engineers are trying to attain the skilled chef role. While most _managers_ want the fast-food style predictability. With the ‘process’ being the battleground.

The best process is one where you - a complete multi-disciplinary team - iterate as quickly as possible, learning as you go. But always know where you are trying to get to (and why!) and constantly adjust accordingly.

Good luck out there.

Methodologies are a trap. The problem with any methodology is they tend to lead people to stop thinking about the context they are operating in.

"The methodology says X, we should do X, why oh why aren't we doing X?" is a familiar lament in the development world. I have been there and done that, to be sure.

It really isn't going to matter what the Methodology says, if something or someone in the context means it isn't being adhered to.

If the person ultimately paying for the project isn't convinced by the Methodology, or wants to deviate from it badly enough, guess what? The Methodology will be ignored.

You should always advocate for what the Right Thing To Do(tm) is, but you should ask yourself first: what is the most important criterion for determining what the Right Thing is?

Sometimes writing the quickest, hackiest, throwaway code can be the Right Thing. It can be a horrifying truth for a software developer. Sometimes doing the ballsy re-write is the Right Thing. Sometimes letting a project fail is the Right Thing.

Sometimes last week's context and decisions no longer apply.

More critical to project (and personal) success than any Methodology is gaining a thorough understanding of the context in which you are working, as quickly as you can. And then doing all the Right Things that will make you effective within those constraints.

This means there is no Single Truth for software development. No easy answers. But there are a wide variety of principles you can draw from and apply and discard as needed.

Or as Bruce Lee put it: Be like water.

This is what I call, development by a dozen managers.

If you have every developer on the team on the lookout to do The Right Thing, then you can be sure you'll start replacing your decision-making strategies with whimsical fancy.

A collective fiction, or methodology if you will, isn't supposed to replace individual thought with adherence to the all-powerful methodology. I'm suspicious of sales people with a vested interest in the methodology they're selling. A methodology isn't something you can purchase... the collective fiction metaphor is apt: you have to convince people to believe and adopt it.

In my experience a team will adopt a methodology like Agile, but the best teams will mold it to the way they do things. Their collective fiction needs to integrate their tribal knowledge, goals, experience, etc. There isn't Agile™ — there's Agile the X way.

This is a great point.

I don't mean to say everyone should be running around doing whatever they want. I consider the team, and working collectively in an effective way, of critical importance.

I think what you call collective fiction, I call operating principles. If you have any pointers to learn more about this, I'd be eager to take a look. Thanks!

What’s worked best for me is basically ‘agile’ but without a lot of what I consider to be the bullshit. Project manager and/or team lead prioritize new cards as they come in, and developers just keep working on them. No sprints, just keep pulling work out of the backlog as needed. Stand ups and most meetings can be replaced with quick updates via Slack, otherwise they can be organized as needed (they usually aren’t). We had agreed that estimation was mostly a pointless waste of time, so we stopped doing it.

I have never been happier or more productive under any other system than the one you described. We pared the agile stuff down to _just_ the stuff we needed. Planning was bucketing things into a super coarse High/Medium/Low priority, and then off we'd go. We'd still give a quick estimation, but that was primarily used as a tool to gauge whether anyone had "Thar be dragons" level hesitations about the feature.

The opposite end of the spectrum is where I'm at now. Where planning meetings last _hours_ and are mostly an exercise in squabbling over how we're going to get the points to agree with the TPM's completely arbitrary projected view of how things should go, which was somehow divined before actually talking to any of the developers.

I think that's called "Kanban".

This could work, but it relies on highly competent, autonomous team members who trust each other. Often not the case.

It should be the case though. Competent, autonomous, and trustworthy is a pretty low bar.

In general I agree with you, but that assumes you have people (and management) with sufficient big-picture thinking. I have found that it is usually safer to follow methodologies even if they aren't perfect, because the alternatives are worse when dealing with average or worse teams/management (and by definition that is, like, half the world!).

For example, consider the concept of MVP (Minimum Viable Product). I once had a boss who kept redefining the scope of the MVP until, 18 months in, we still had not completed the ever-expanding MVP. In light of these experiences, I'd rather follow methodologies to the letter just to avoid extreme abuse. For example, following some silly hard rule driven by methodology X that MVPs must be completed in 3 months would be helpful, even if arbitrary.

"The problem with any methodology is they tend to lead people to stop thinking about the context they are operating in."

Sometimes (quite often actually) that is precisely what we need. It would be quite bothersome to feel the need to explain the reasoning behind every decision and put everything you're doing in context. How about actually doing any work? Pick a methodology that fits your context, work according to the methodology and regularly evaluate the fit between the context and your methodology.

Of course, don't overdo methodology, but don't overdo not-methodology either. Try to find a balance that works. Don't let the means become the goal.

Tangent: agile, waterfall, and the rest are just individual "methods", not "methodologies". Or at least they should be. Unfortunately it seems our living language is allowing this "-ology" to stop meaning "study of" or "branch of knowledge" and start meaning, well, nothing. If agile, waterfall, etc. are "methodologies", then what do you call the actual research and study of the methods themselves? "Methodologyology"? I get it. I like saying big words instead of small words for effect as much as the next person. For some reason, this one in particular really gets me. I don't know why.

The Single Truth about software development is that there is only one deliverable: working software.

I would take that one step further and say the only deliverable is satisfied customers.

I would take that further and say the only deliverable is profit. Or more VC :)

Also true.

True story: was once asked by the co-founder of a company to sabotage the project I was working on.

"Drive a horse and cart through it" were his precise words.

So for my team, the deliverable was working software. The co-founder wanted another deliverable altogether.

That still requires you to specify what "working software" is.

> Or as Bruce Lee put it: Be like water.

So you are saying go with Waterfall?

I've worked in a variety of environments, and the most important thing in my experience is simply to ensure that your team is creating a working, tested, and "shippable" update every ~2 weeks. It amazes me how many teams fail to do this, despite calling themselves "agile" and "customer driven".

Use whatever methodology, technology, and processes you need to accomplish that singular task. Orient your entire organisation around doing that. Your customers don't care how you do it. They just want frequent updates and progress. And if you're not giving them useful updates (you'll know, they'll tell you) then the problem isn't methodology, it's probably your ideas. No methodology will fix that.

You still have a whole variety of environments to discover, I believe :-)

In none of the situations I've worked in have updates as frequent as 2 weeks been requested. It would have been either a nightmare to do that (you end up spending 50-80% of your time delivering under pressure), or just impossible (in cases where testing, verification, and delivery take 1 to 3 months), and in every case simply unwanted.

This only works in certain kinds of projects. I often work on projects where you need to spend months figuring out some technologies and see what works best. For a while we tried to deliver "shippable" updates every 3 weeks but this just created a lot of work creating artificial prototypes without any real benefit so we gave up.

If you work on some web site where the tech is well understood then shipping something every 2 weeks probably works.

> figuring out some technologies

That is research, not software development.

For software development you know your tools and how to use them; only what the resulting product should be is slightly unknown.

I call myself a developer but I often work on stuff where at the start we don't know whether it will be desktop, web, mobile or whatever. We don't know which database will be best so we try a few. I guess it's research but it feeds straight into development which is done by the same people within the same project.

>"We don't know which database will be best so we try a few."

There are but a handful of valid choices for a relational database, and all of them are pretty much able to handle every single thing you can throw at them.

Or are you just trying out "flavors" of database? In which case you're wasting the client or employer's time/money, in all honesty.

I work in medical devices. We can store data directly on the device, on some other device, the cloud or our data center. All have their advantages and disadvantages regarding security, maintainability, battery consumption and a lot of others. There is no clear answer up front so you have to try a few approaches and see how they go.

Both are typically filed under R&D on expenses reports and more often than not done by the same people.

>ensure that your team is creating a working, tested, and "shippable" update every ~2 weeks.

If you think about this, it falls down a bit because there is no way you can code and do a full regression test in 2 weeks on any substantial product. You might be able to run some automated tests, but would you ever ship a product without eyes on the screen testing? Heck no.

Before the agile thing, we did 4 releases a year, 2 majors and 2 minors, so you are looking at a 3 month cycle.

I think Agile in many ways was a reaction to a trend where internal teams were cargo-culting what external teams were doing.

When I was literally shipping software it made sense to have longer release cycles because the cost of shipping was so high and it was a cost your clients had to pay. But for internal teams, you could ship a lot more cheaply and if you needed to redeploy you typically had the power to do that.

You could also literally have the customer in the room with you, which isn't possible when you are shipping to N external customers.

I think that also explains the rise of SaaS. By keeping the software on my own site I can ship much more cheaply, though the client negotiation part then follows the more traditional model.

I understand what you are saying, but having done 2 week cycles for several years now, it seems a lot more chaotic to me. With the 3 month cycle, it seemed users (or at least project managers) and developers had time to hash things out before and during coding. Users could get a "look" at how the developer interpreted what they wanted and could offer suggestions. QA would be able to give it a thorough test, with a few bounce-backs during the cycle, without frantically trying to beat the window. Now it seems so hectic and rushed with the 2 week cycle.

It was also nice to have a sense of accomplishment after finishing a 3 month cycle release. I was also much more confident in the code quality because it was thoroughly tested with eyes on screen. Now it seems much more of an endless grind and crossing your fingers.

Also, I think "shippable, bug free code" in 2 weeks is a pipe dream. It's a great goal, but not often realistic in the time frame. Any time a bug comes back, it interrupts the new 2 week cycle and throws everything off.

I'm not saying it's horrible, it's just not ideal. A 3 month cycle seems cleaner. It also gives user time to grasp all the changes. There are always emergency patches allowed as an exception. I don't think a 2 week cycle is the best way to do it, it's just how everyone does it right now.

Customers want updates?

I want my software to stop changing!

Depends on the product, but indeed I want stability above all else, with updates for security. We have a product that was sold to us 2 years ago and still hasn't passed UAT. It is enterprise software, and they use agile, so every 3 to 6 weeks we get a new build to verify. There is always something new that fails, so we go back to waiting. At this point I wish we could have written the thing ourselves, as I would have been done 18 months ago.

> "shippable"

It's less vague, more valuable, and barely more work to replace "shippable" with built (no scare quotes), packaged (no scare quotes), and demoed. The first two can be automated. The third is a matter of getting the right people to show up for a quick meeting. If that's too hard, you can just post a changelog including screenshots, animated GIFs, etc. if needed.

Notice that I didn't say the product should be shipped. That varies a lot from domain to domain and project to project. Sometimes even feature to feature.

How about every week? :)

Nice post! I've worked in a few big and small companies, and from what I can see, software companies with more than ~40-50 devs don't scale well (I'm only talking about single-ish product software companies here). The reason for this, in my opinion, is the absence of financial liability for devs: if you're a dev in a company with 500 other devs, most likely you have zero financial liability to produce great code; on the other hand, you have all the time in the world to create job-security code around you. Also, if there are 500 devs on one project, you will most likely see a lot of infrastructure built just to keep these devs busy with something. This can be seen in companies where one department creates "the framework" and other departments create "features" based on this framework. In this case there is no "market" of frameworks; you're basically stuck with it, and "the framework" has zero market competition. Compare it to a normal market of middleware: if your framework is bad, I'll just go to your competitor. A free market is a much healthier situation in my opinion (could be a strawman argument though). I am not wise enough to say that creating internal competition inside one company is always a good thing, but financial contracts make for healthier political discussions.

I don't believe job-security code really works. If they're in a place where they're not paying any attention to you committing awful code, why would they care when developers tell them that only you can work on your code?

The job security comes when you write great code but awful designs and architectures. The resulting objections are more abstract and harder for project managers to really understand. Or they'll understand the objections but not know how to weight their severity.

They'll just ignore them and do what they please, is where I'm going with this.

Yes, but it's not as obvious as it sounds. For example, let's say we have dev A and dev B and we give them the same task. Dev A solves the task in such a way that we don't need to come back to the solution at all; it "just works". Dev B solves it in such a way that we now need 1+ devs maintaining the solution full time. Any solution that needs to be maintained full time is a form of job security. Sometimes it is needed (like placing a small team on devops), sometimes it is not (like in-house full-time development of dashboard frameworks).

I’m surprised that Don Reinertsen’s book The Principles of Product Development Flow doesn’t come up in these discussions. Rich Hickey mentioned it once after a conference years ago and that’s the only time I’ve heard it mentioned. I’ve since found a few blog posts responding to conference talks (of which there are several on YouTube from around 2015), but not many. I think people simply don’t know about it. I wrote it down at the time and finally digested it over the past several months. It’s an incredible book that I think would resonate with many of the commenters here.

I think of Reinertsen’s approach as industrial agile, in the most honorable sense of industrial. There is theory and some math. There is heavy emphasis on project economics as a means of making complex trade-offs and a basis for training, deputizing, and expecting employees at all levels of an organization to make better decisions on the fly as part of doing their jobs. Lessons are drawn from many fields, such as network communications, queueing theory, and the Marines. Contrasts are drawn with manufacturing to explain exactly why some of the techniques we adopt do not apply in product development but where many techniques from kanban systems do apply.

The book is organized around hundreds of principles given as tools, to be considered and applied as needed to your specific situation. Annoyingly, he refuses to get into implementation. At first I thought he was saving pages, but the more I think about I realize the seeming omission was necessary genius. Digging into even one example implementation would result in the death of the message and framework as people copied that and get dogmatic about a methodology all over again.

The whole presentation is beautifully constructed and tight. It puts a perspective and language on things in a way that makes me hopeful and interested in project management, similar to how Rich Hickey’s talks raise my understanding, vocabulary, and aspirations for software engineering.

I would welcome anyone involved in prioritizing commercial software projects to read and discuss this book (and document their implementations).

My 30 years of Software Development Methodology:

* The user is more important than any of this.
* The programmer is nothing.
* The computer is doing all the work.

Thus: use what makes all the above, True.

What does "the programmer is nothing" mean?

I'm guessing this means: if the software works and can be maintained as the users (by that I mean the client and the owner) need it to be maintained, then all considerations are satisfied. Other considerations of aesthetics, architecture, and language should be set aside.

Yup, you get it.

By the way: I think this is brilliant.


And "the computer does all the work"?

I'm guessing: The machine does what the user (client and org) expects effectively in a way that they easily understand.

Automate manual processes

The biggest skill when interacting with a "big serious man" type of client is making him pay you for what you want him to pay you for.

In my circle, we have a lot of discussions about clients like that: "and I told him, for what freaking thing do you need that freaking neural network when banal statistics does the job better?". Getting such a guy, "a Wharton grad with a stellar track record at the Big 4", to agree to the former requires compromising his ego and self-image, and exploiting his desire not to look incompetent in the eyes of his superiors.

In the end, all such guys are made to pay 6 or even 7 digit sums for a fancy UI over database querying tools and a stats package, and not for "big data, machine learning, blah blah blah"; the alternative is being given "the exact thing they wished for", which had no chance of even minimally functioning.

But how will his company make it to 100M+ valuation if he can't claim it is being done with AI?

The fiction I like best right now is "continuous delivery": you release often, push to production several times per day, one working branch, always ready to deploy. Features in development are behind flags. Regression tests help a lot: e.g., when you make an update or fix a bug, you add an automatic test that checks the new thing works, or repeats the steps that caused the bug and makes sure it's no longer there.
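A minimal sketch of those two practices (the flag store, function names, and cart shape here are hypothetical illustrations, not from any particular library):

```python
# Hypothetical feature flag: the in-development path stays off in production.
FLAGS = {"new_checkout": False}

def old_checkout(cart):
    # Bug fixed here: an empty cart used to blow up; now it totals to 0.
    return sum(item["price"] * item["qty"] for item in cart)

def new_checkout(cart):
    # In-development replacement, deployed but dormant behind the flag.
    return old_checkout(cart)

def checkout(cart):
    if FLAGS["new_checkout"]:
        return new_checkout(cart)
    return old_checkout(cart)

# Regression test: repeats the steps that caused the old bug,
# so a redeploy can't silently reintroduce it.
def test_empty_cart_totals_zero():
    assert checkout([]) == 0

test_empty_cart_totals_zero()
```

The same test keeps passing whether the flag is on or off, which is what makes the single always-deployable branch workable.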

We use 'waterfall' (detailed, fleshed-out specs in formal language) on the hardware, firmware & backend, and 'agile' on the frontend (web/app). So far that has served us well, and it addresses the difference in developer skills and needs as well. At the risk of generalizing: frontend people are different from backend people, who are different from hardware people (in my experience), and this gives everyone what they enjoy working with.

Hardware (and firmware to a lesser degree) almost require waterfall. Moving fast and breaking things doesn't really work well when you have to throw away the physical product you broke as opposed to iterating it to a fixed state.

I can actually see the merit in waterfall for elements of the backend... though how do you avoid premature optimisation, or even complete waste building out APIs that end up never being needed?

The two single biggest factors in determining what sort of process you should have are a) cost to ship/deploy and b) cost of mistake.

On one extreme are things that are expensive to change and mistakes cause critical failures and on the other are things that are trivial to change and mistakes are easily mitigated.

Author here: interested in any strong views on this - @ianmiell on twitter.

My brother and I wrote our first commercial videogame in 1987 as a 16-17 year old pair of self-taught programmer/artist/designers. We just did what seemed like common sense.

- We had periodic meetings with the client (publisher). We showed them progress and they suggested directions to move towards.

- We didn't stick religiously to the initial design, and adapted as we saw things work or not work to make the game fun

- We stuck to the deadline and just cut what didn't make it

It felt quite agile to me.

That sounds like heaven. What was the game?

Erikbye is correct: Stardust, for Topo Soft, on the Sinclair Spectrum and Amstrad CPC 8-bit computers.

I attribute a lot of the merit to us being mentally organized, but mostly to our producer Javier Cano (RIP), who had that magic touch for ensuring that things around him just ran smoothly, always with a smile and making it feel natural. My god, he was amazing.

Star Dust?

Slightly OT: the biggest collective fiction today is probably this idea that, as tech people, we can change the world, and as a result we should try to change the world. Anything less impactful feels like a waste of potential. Society in general has bought into it, because we've been hearing this for years: Johnny, you are such a smart programmer, where is your world-changing app?

Among other things, this translates into working ridiculous hours in expensive cities and convincing a set of investors that they too can be part of this world changing process in exchange for a few shekels. Now these shekels will drive innovation, and they will push civilisation forward.

But the magic of the fiction is that poor Johnny will never feel like he did much to change the world, and so he will keep trying ad infinitum.

Yes, capital is still king. However, I would say that Zuckerberg (for example) did achieve a lot without selling his product for capital. That is pretty magical.

> ...without selling his product for capital...

Maybe I'm missing what you mean somewhere in there, but facebook raised a lot of capital through investment rounds and an IPO.

He retained control of the company (is what I meant).

Good writeup :)

I've only done the whole rollercoaster for 11 years now, but I've also been in different companies, from startup to big corp.

Some people are totally into their methodology and won't do a step aside the predefined path.

Some people totally evade process. They will talk to you via email, Skype, WhatsApp, and JIRA, and you won't find any info later.

I have the feeling there is no methodology that makes devs work better, only ones that give devs peace from the non-devs who work with them.

Having lived through a variety of methodologies, all of which were taken very seriously and equally successful, this rings true.

The one thing I would maybe add is that the tools of the trade have changed for the better in thirty years, particularly source control management, document tracking, and the ability to collaborate remotely. I think ways of development have changed in response to that.

Imagine a world where compiling a build of a piece of software took as long as an hour, source code was stored on a single server so that you didn't even want two people working on the same set of code, and automated testing didn't exist... I can remember cases of that in my youth. It seemed to call for more planning and a more top-down approach.

Thanks! Yes, I remember when I deployed to live money-taking systems using tar files as my source control.

Collaboration tools have completely changed the game, totally agree. As has the death of the physical artifact (that couldn't be patched, recalled, or 'bounced').

I'm not sure I have any particularly strong views to share, but a couple of notes.

I felt like "collective fiction" was a harsh term that implies we're always lying to ourselves. While that might be true, I like to use the more ambiguous term "narrative" for the same concept. The need for a methodology is a narrative or story that may or may not be true. I guess I do believe ;) that some of our collective narratives are definitely true, and calling them fiction undermines the good ones, or at least makes it harder to evaluate which collective beliefs are more valuable & closer to truth and reality than others.

Side note - collective narratives are a human condition, not in the least limited to programming methodologies. We build collective narratives for everything, and our physiology is prone to it. Having kids, it was fun watching them repeat stories they hear about the world over and over, as a way to cement their narratives with their friends and parents. When they were younger it was almost in the form of a question, they were waiting to be challenged or get more specific information. As they grow older, they get more firm in their statements and beliefs. After watching them do it, I became more aware of how often adults do the same thing, and how pretty much all communication, even technical communication, is just narrative.

My own 20 years of experience with methodologies and with other programmers has led me to believe that pushing coders to communicate frequently about what they're doing and what needs to be done is a good thing; most programmers I've known (including me) tend to wander in directions that they want to go and don't naturally want to do all the things that really need to be done. I sometimes don't like being transparent about what I'm doing, sometimes because I'm working on a pet project that I shouldn't be working on, sometimes because I hate budgeting and I don't know when I'll be done, sometimes because I want to deliver a higher quality than was requested and I know I'll be asked to stop being a perfectionist and just check in my changes. I see others not wanting to be transparent about what they're doing for the same reasons. But making sure that transparency exists is probably the only thing I've seen that works. Checking in often, making sure incremental progress is going the right direction, and re-iterating the requirements and users' point of view.

Too much formal process is usually bad unless you're NASA, and by the time it's called a "methodology", it's probably too much. It gets in the way of a lot of the good things that need to happen. Two week sprints get exhausting after a while, it teaches devs to aim smaller, and it causes undue burdens from management on tasks that legitimately take a couple of months. Being forced to estimate everything in story points has some advantages, but ultimately makes measurement harder, it's subjective and easily manipulated, and it's just contrived anyway.

So, I agree about formal methodologies not working well, by and large, but I think the basic intents of keeping focused and communicating frequently are good goals. It is ironic that most software methodologies are reasonably good at helping people evaluate how to make certain kinds of focused technical decisions, but they're very bad at helping people evaluate when a formal process is wasteful or bad, or at suggesting a different formal process.

All these methodologies are there just to get things done. After many years involved in software development, I have found that it all comes down to leadership. Of course you need people with skills, but with no good leader, there is no good product.

I like the car analogy, and I had some additions to it with software components in mind:

engine choices (data storage): very big engines, for big ships; very small ones, for lawnmowers; self-made ones, smoking.

gear choices (libraries): a single gear; gears from 1 to 1000; no neutral and no reverse.

chassis choices (language choice): rusty old ones, hard to work with, but they last; new shiny ones, overpriced and falling apart.

locks (safety): one key fits all (all doors and all cars); a different key for each lock, bypassable by opening the window; a keypad with a password, written on the back of the car.

interior (user interface): millions of choices, all flashy and shiny, each part looking totally different. Unusable steering.

brakes (software stability): no brakes; when used, they cause a blue screen.

outside (program output): all parts have different sizes; holes covered with duct tape and rust covered with shiny paint.

Enjoy the ride!

I really enjoyed reading this, and I strongly agree this isn't an easy task at all. It depends on the business, the people, and the existing processes; it's up to us to find what really works. I had some thoughts based on my recent experience and wrote something quite long on my LinkedIn https://www.linkedin.com/pulse/waterfall-scrum-kanban-scumba... if anyone is interested.

Sorry, had a typo in the previous link, please use this one https://www.linkedin.com/pulse/waterfall-scrum-kanban-scrumb...

I've not found a methodology yet that can displace smart decisions.

Methodologies organize projects to help further smart decisions and avoid dumb ones, but smart decisions and skill still dictate the success or failure of projects.

In the end, the people are still more important than the methodology, but I've encountered a mentality a few times where people believed a good methodology would fix everything.

I've seen companies keep changing methodologies looking for success with largely the same staff and key decision makers, yet the results stay the same. Insanity, anyone?

The style he describes as RAD is really enjoyable to work in. No constant scheduled meetings and unread documents; just use your common sense and write something.

There is no need for formal methodologies. A person with no technical expertise should not be put in a position managing technical teams. That requirement divides the industry into "tech companies" and everybody else.

One example of a tech company with gigantic dev teams and very few MBA doctrinaires involved in the development process is Intel.

Companies with external development clients that practice "we will agree to any methodology you want for the right amount of money, but we will not let you anywhere near our developers (who will be doing their own thing anyway)" are the Tatas, Wipros, Luxofts, and others. Their biggest asset is their skill at managing the client.

Ah, and how would you, say, produce a management system for a core part of a country's data network (I was a team member developing a management system for the UK's SMDS network) without a process? In this case we used a hybrid waterfall/RAD approach: 12 weeks to develop the first release.

The very few cases where anything called a methodology works are when you deal with repeating tasks: making customized versions of standard tools, or what people call copy-and-paste development. SAP does it very well: it charges people seven digits for a "customized solution" that in reality is not customized beyond 20 lines of code from what the company sells to every sixth or seventh client.

The goal there is not to dive too deep into a task that is already solved in a way that "works well enough," and that the client is already willing to pay for.

Not everyone always works on me-too CRUD applications, leaving aside SAP and other "miracle" tools :-)

I think the key word is "formal," meaning, to me, a strict process to be used at all times in all situations. Like anything in development, it makes better sense to use whatever process works best for whatever team is implementing whatever solution is needed.

I work for one of those fancy startups. Everything is chaos all the time, you have all these extremely smart people in the small doing actually kinda dumb things in the large. This worked better when the company was small and is causing more and more problems as we grow. Anyway, the point being that there’s lots of smart individuals and little process.

I mentioned this during a meeting with my team, and so a process was devised for the next project we embarked upon. The process was waterfall, except nobody referred to it that way (most of the people are like 25, and weren’t working back when it was the norm). Everything old is new again.

It wasn’t important anyway; we didn’t follow the process.

