The obvious problem is that these folks prescribing the development process are not active developers. They are not even part of any real project over any long duration. They are in the business of inventing and selling processes and handing out management advice as consultants. Whatever they prescribe might have worked only in a specific context and for specific symptoms, usually with a huge dose of luck. Next time you see a new process fad, look up the history of its originator: what company he is part of, how much code he has written, how he makes money. You will know what I'm talking about.
I was lost. It felt like slipping through a crack in the Universe, Sliders-style, into a reality where you can't understand people discussing the things you've been doing for years. There were a few other developers attending, seemingly just as startled by all this.
The whole thing ended with some Agile activity for all attendees, which I can best describe as a mix of a kindergarten class and a Baptist sermon. I could not comprehend the intent, the procedure, or the desired outcome (or whether it was achieved).
People attending SM training are often just middle managers learning enough to know WTF their SMs and PMs are talking about in meetings.
Also, there's no requirement for SMs to be coders; it's a role a lot of BAs take on. Some experience in a software development context is desirable, though.
I found a lot of my SM training relatable to real life. My team went crazy for planning poker because it got a lot of assumptions out early.
Standups were impossible because some people started at 1pm, others finished at 4pm.
I see Scrum as a bag of tools to apply to teams. Some work, some don't.
1. Not working on what you are supposed to be working on. Sometimes a developer feels that they should probably rewrite X or add Y and does so without consulting anyone first. Sometimes it is good, and sometimes it just takes time or adds regressions you did not need at the time.
2. Senior developers owning a piece of the code base. They often grab a huge chunk of work for themselves and then you don't hear from them for a while as they hack away. Sometimes they do good work and sometimes not, and if they are sick or away, no one knows how their code works.
3. Junior (or some senior) developers being stuck for a long while without asking for help.
4. Ignoring the customer and building what you think should be built.
None of these are automatically fixed by Scrum, and I know some here will just say "Yeah, we have a professional team that actually talks with each other and does code reviews", so they don't need it. I get that, but for a lot of those small teams building some CRUD application at a large enterprise, the formalized channels of communication are a godsend in my experience.
What no one tells these teams is that the methodology/process itself should be flexible, allowed to change to accommodate the idiosyncrasies of their work environment. What I've noticed helping some small companies is that teams learn some framework, use it as their "agile" concept, and then proceed to be overzealous about the process itself, completely forgetting the original manifesto.
Like always, it's a strange whisper game...
Any development process that doesn't allow for changing itself is shortsighted and ultimately self-defeating.
I think most teams drop agile practices before they have the knowledge to show those practices aren't appropriate; usually the practices never get used at all. Teams adopt only what they have to, because they don't really want to do it, having already decided that a large chunk of it doesn't work.
Changing the process from a position of knowledge and experience with it is one thing; doing it without that knowledge is the more usual thing, and it's harmful.
Scrum has some really serious flaws:
1) It creates a massive disincentive against refactoring technical debt (refactoring stories essentially have to be 'sold' to the product owner).
2) You end up spending more time than is necessary in too-long meetings. Planning should really be done continuously rather than in biweekly meetings.
3) The idea that you should deliver working software every two weeks is outmoded. You should be aiming to deliver potentially shippable increments continuously.
2) Planning can be continuous; every two weeks you just look at what you should focus on next. You can still bring stuff into the sprint along the way if everyone agrees.
3) You plan to deliver a bundle of value every two weeks (so you can bite off enough to get stuck into, but not so much that you can head in the wrong direction for long), but you can release as soon as the code is ready; no need to wait until the end of the sprint.
That does not make them qualified to rank and/or write refactoring stories.
IMHO, they should tell the developers how long to spend on refactoring/tooling (e.g. 30% of their time) and developers should figure out themselves how to spend that time.
That isn't scrum, though.
>planning can be continuous, just every two weeks you look at what you should focus on next
That's saying on the one hand that planning both is and is not continuous.
>you plan to deliver a bundle of value every two weeks
I call this mini-waterfall. It's a good thing to shoot for if you were previously doing releases every 3 months. It's not a good thing to shoot for if you were previously doing daily releases.
And I fully agree the PO won't write the refactoring stories, but anyone can write stories. I'm not that technical, but I've never worked somewhere where the team wasn't able to talk about the value of tech debt etc. and help me understand the value of bringing those stories in.
They can talk about it, but under scrum it has to be sold to the PO to get it prioritized. I've worked with plenty of dev teams that could talk but couldn't sell.
And it's the hardest kind of thing to sell, because the benefits are diffuse, abstract and usually long term, whereas feature stories are the exact opposite.
I've seen this whole sales pitch/pushback process fail in lots of places. Techie explains problem to PO in techie terms -> PO's eyes glaze over while they nod their head and agree that this stuff is important -> story keeps getting pushed down the backlog in favor of stories that have a concrete dollar value attached -> quality suffers as a result.
1. The person making a decision like "should we do refactoring" should not be incompetent. They should be technically great and also in tune with the company's business needs. I know some smaller companies have non-technical people in all non-dev roles, so it's either the devs who do the work deciding or a non-technical person deciding, but that is just stupidity; at the very least, the boss of the development team should be a development person. I seriously doubt that at, for example, Microsoft or Google, a non-technical person makes this type of decision.
2. The incentive system should not be messed up. Especially at companies with multiple levels of traditional managers, you have problems like people treating their job almost like a game: they try to make their numbers look good at the end of the year, or avoid ever telling their bosses something their bosses don't want to hear. They completely lose alignment with the company's interests and treat their job as a game of looking good to their boss, so even if they understand the need for something, they might say no to avoid missing a deadline they told their boss they would hit. This doesn't really happen at software development companies, because most of them are new and never developed this type of culture, but it does happen at companies that are in another field and just have a software development department.
I also don't think the "30% of time" solution is good. Sometimes you need more, sometimes less; sometimes it's more important, sometimes less; sometimes more urgent, sometimes less. I still think you should "sell" refactoring to your boss, but the "boss" at the level where "should we refactor" decisions are made should be an extremely technical person who also knows how the refactoring will affect the business side of the project.
No, it's a process problem. Devs can't directly assess the relative importance of setting up a CI server compared to the new login workflow. If they could, there wouldn't be any need for a PO. POs probably don't grasp exactly how important setting up a CI server is. And why should they?
>The person making a decision like "should we do refactoring" should not be incompetent.
That's a given. If they're incompetent, no matter what process you give them they will inevitably just make things worse.
>I also don't think the solution of "30% of time" is good. Sometimes you need more, sometimes less, sometimes it's more important, sometimes less, sometimes more urgent, sometimes less.
Yeah, sometimes more, sometimes less, depending on how much pressure customers are putting on you, whether the trade show is next week, whether it's a quiet week, etc. IMHO the PO should choose the % each sprint and the devs should choose what goes in it.
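If you do go the per-sprint-percentage route, the arithmetic is trivial to make explicit. A minimal sketch (the function name, velocity, and percentage figures are all mine, purely illustrative, not part of any Scrum artifact):

```python
# Split one sprint's capacity between feature work and quality work
# (refactoring, tooling, tests). Numbers are illustrative only.

def split_capacity(velocity_points: int, quality_pct: float) -> tuple[int, int]:
    """Return (feature_points, quality_points) for a single sprint.

    The PO picks quality_pct each sprint; the devs decide which
    refactoring/tooling stories fill the quality_points budget.
    """
    quality_points = round(velocity_points * quality_pct)
    return velocity_points - quality_points, quality_points

# A team averaging 40 points per sprint, with the PO setting 30% this sprint:
features, quality = split_capacity(40, 0.30)
# features == 28, quality == 12
```

The point of writing it down is that the only negotiation left is the single percentage, not the relative dollar value of each individual refactoring story.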
>I still think you should "sell" refactoring to your boss
That's tantamount to saying you don't want it to get done, because devs are typically shitty salespeople. Also: do you really care whether module B is decoupled from module A or a new CI server is set up? Why not just say "spend 30% of your time this sprint on code quality improvements" and leave it at that?
At most, the devs should sell the % of time they want each sprint. Honestly though, I'd be happy to let that be 100% at the PO's discretion. Just don't make me try to explain to a non-techie why it's important that I need to refactor some tests.
I just don't see the reasoning behind hiring someone who is non-technical for such a position.
And the answer to "why should they" is precisely because they can then understand the development side better and these types of problems are avoided. It allows them to make more informed decisions. Basically, why should they not?
But I've worked with guys who spent most of their time surfing the web or were really territorial about their code. Or teams that were just shitty at communicating with each other, so you had frequent misunderstandings or blocking. Or people who said "Yeah, this will be done in a week" and then, after one week, said "one more week", and so on.
Some are personal issues that should probably be handled by a boss, and some are just shitty engineering practice that could be fixed by conversations (provided someone actually knows good engineering practices). But that also requires people not to get defensive and say "we have always done it this way and it works", or become angry because they think you are calling them out.
Instead, agreeing on things like these:
1. The work we do is agreed upon by all and is visible on a board or website.
2. Every day in the morning we chat for 15 minutes about how things are going and if we need help.
will fix some of them, in my experience, and the angry people can whine about the process for a while instead of directing that anger at teammates.
You can use process to force change, but, personally, I think a strong manager/leader would aim to resolve these issues irrespective of the process. Also (and I think this is why so many complain about agile), it is often used to change these types of behaviour and is then seen as 'the bad guy'.
My personal view is that the process should be aligned with your business, not as a tool to fix people challenges.
The 15 min daily catchup is invaluable irrespective of your overall process.
Easier said than done though :).
Happy to talk further and learn from your experiences. Thanks.
To be fair, project managers often decide to keep using X or leave out Y without consulting anyone first.
Also, to be fair, project managers are rarely organized or explicit about where the project is and where it should be with respect to cross pollination. If they tracked their goals there like they did other things, you'd see some developers taking this style of project under their wing.
That being said, my experience largely matches yours. Though I find the success rate in setting up a closed feedback loop (actually demoing for the actual people that should be happy with the actually deployed feature) is much lower than people notice, so I'm not sure agile is as technically useful as it seems. It's politically useful, for sure. You can say "Working on the wrong thing? I gave Henderson eight demos as work progressed this fall!"
And sure, you sometimes know that certain things need to be implemented, but it should not be done without anyone else knowing. At the very least, it should be known and agreed upon within the team, and then discussed with, or made visible to, the product owner/PM/whatever role.
Really? When I look at the list of authors of the Agile Manifesto, I recognize several names as belonging to people who have made a career out of coaching and advising. Several of those people made their careers by bragging about their involvement in a catastrophic failure.
I mean, they're probably all better software developers than I am. But I think that sytelus's description seems to be right on, at least for many of the Agile Manifesto authors.
In other words, agile has just become a stick for companies to beat their developers with. That wasn't the original intention, but I would suggest that's at least part of the reason why management types like the term 'agile' so much.
Extreme Programming (XP): created by Ron Jeffries, Kent Beck and Ward Cunningham - all developers.
The Agile Manifesto: written by seventeen developers (http://agilemanifesto.org/).
Scrum: created by two of the seventeen above.
And I'd say it still isn't trivial given the number of places that fall short of the vision as expressed in the manifesto's values and principles.
I would agree, though, that things have changed as it has been sold. That was much less about what programmers wanted to happen, and much more about what managers were willing to buy from consultants.
It's mostly just vague. Moreover, "Individuals and interactions over processes and tools" is, IMHO, wrong, and has been used to justify de-prioritizing tooling improvements on the teams I've worked on (e.g. automated testing, BDD).
The interpretation was, I think, defensible, but the outcome wasn't. Human interaction is nice but it doesn't scale.
I think a large part of Agile's popularity is due to its vagueness. A bit like religion, people just assign their own meaning to it.
Thanks for introducing me to it :)
The people who created Agile were working with languages such as Smalltalk that simply allowed you to create more functionality faster - which enabled Agile. For smart, experienced people working with tooling that let small teams produce whole systems fast enough for business feedback to be useful, Agile could be called obvious. However, to the majority of the industry at the time, struggling with hugely complex and slow-compiling languages with a large time cost per feature, Agile was not obvious - it was impossible. And that was about 99% of the industry at the time Agile evolved.
It was good times - I was creating entire systems as a one or two person team, which the competition would have taken 5-10 times longer to deliver.
Industry-wide tooling has generally caught up, though, enabling the possibility of widespread Agile. However, few people do it that well, judging from the dozens of companies I've seen the inner workings of.
Another important point that just doesn't seem to be commonly discussed is that Agile works in the context of highly experienced developers, but has now been attempted by the wide majority. All the signatories that I'm aware of were career developers with decades of experience, and it worked for them. Most of the dysfunctional agile environments I've seen have come from lack of experience and the perspective it brings, combined with the empowering attitude that agile brings - they don't know what they don't know.
It's like jazz improvisation, in that the difference between mastery and cacophony is hard for beginners to understand - and basically impossible to master without really a lot of experience.
And the reality is that many, perhaps the majority, of developers are actually beginners!
Consider the huge range of skills required (language, libraries, os, networking, patterns & techniques, industry advancements, then general professional skills such as organisation and communication and time management, then economic understanding to apply these to the business domain etc, the list goes on and on).
It basically takes most people 5-10 years to actually grasp the skillset, and then double that again to master it.
Combine this with the growth of the IT industry, which means greater numbers joined recently while fewer, more experienced people started decades ago, even if they are still working. Overall the ratios are terrible, with the majority of people not having the level of mastery and perspective to deliver on the potential of Agile - so they are probably better off with more up-front planning!
But the voices of the crowd are deafening these days.
On the topic of the growth of IT and training, I haven't seen much talk about a team structure closer to what NASA does, with each engineer having a mentor. No company I've worked at so far has had that as a process; it happened organically sometimes, but I think what hinders a lot of young professionals is finding someone who will make them learn faster. Personally, I only had that feeling once, and it was definitely the best two years of development in my career; I would love to work at more companies that have something like this in place for engineers.
It starts with a brilliant writeup on software process that shows waterfall as an example of what not to do. That document recommends iterative practices. Then, it got twisted into the waterfall model that was applied everywhere. The OP's comparisons to religion are quite warranted here.
Not always true. For example Extreme Programming actually originated from an effort to save a large software system in trouble (a payroll management system for Chrysler). But in this case the prescription didn't work and the project was never completed.
It is pretty impressive to be able to sell a methodology on this background.
Which only works as long as you got highly skilled people to start with.
Moreover, I would argue that almost anything you put down on paper as the development process can work if you've got an interested team of skilled people. Such a team will naturally adapt and adjust until they've got something effective, if necessary to the point that the original on-paper idea has all but disappeared if it wasn't helping. That's also essentially the first point of the original Agile Manifesto, though of course it's been happening since long before anyone ever wrote it down and gave it a name.
I'm agreeing with k__ here.
That aside, surely the more interesting scenario is still the reverse of the one you described: can good management leading average developers get better results from those developers by adopting certain working practices?
part of the team needs it, the rest is going crazy because of it...
meaning that process doodads are ok but you have to focus on interaction.
If anyone knows any good sources shout out.
I liken software methodologies to cooking recipes.
On one end of the scale you have a highly skilled and experienced chef, who can craft a great (requested) dish with a variety of ingredients and equipment. Knowing how to adapt and what to add and when. This works well, but requires a heavy cost (experience) and is, from the outside, difficult to follow and control.
On the other end of the scale you have fast food: a strict process and fixed ingredients that 'just work'. Those involved in cooking have very little knowledge of why, and are almost certainly not able to adapt. This works well in scenarios where you know and control both the input and the output exactly.
A skilled chef can ‘codify’ their recipes, but this results in a specific description of what to do, and not why. Thus marginalising the adaptability.
In my experience, most software engineers are trying to attain the skilled chef role. While most _managers_ want the fast-food style predictability. With the ‘process’ being the battleground.
The best process is where you - a complete multi-disciplinary team - make as quick iterations as possible, learning as you go. But, always knowing where you are trying to get to (and why!) and constantly adjusting accordingly.
Good luck out there.
"The methodology says X, we should do X, why oh why aren't we doing X?" is a familiar lament in the development world. I have been there and done that, to be sure.
It really isn't going to matter what the Methodology says if something or someone in the context prevents it from being adhered to.
If the person ultimately paying for the project isn't convinced by the Methodology, or wants to deviate from it badly enough, guess what? The Methodology will be ignored.
You should always advocate for what the Right Thing To Do(tm) is, but you should ask yourself first: what is the most important criterion that determines what the Right Thing is?
Sometimes writing the quickest, hackiest, throwaway code can be the Right Thing. It can be a horrifying truth for a software developer. Sometimes doing the ballsy re-write is the Right Thing. Sometimes letting a project fail is the Right Thing.
Sometimes last week's context and decisions no longer apply.
More critical to project (and personal) success than any Methodology is gaining a thorough understanding of the context in which you are working, as quickly as you can. And then doing all the Right Things that will make you effective within those constraints.
This means there is no Single Truth for software development. No easy answers. But there are a wide variety of principles you can draw from and apply and discard as needed.
Or as Bruce Lee put it: Be like water.
If you have every developer on the team on the lookout to do The Right Thing, then you can be sure that you'll start replacing your decision-making strategies with whimsical fancy.
A collective fiction, or methodology if you will, isn't supposed to replace individual thought with adherence to the all-powerful methodology. I'm suspicious of salespeople with a vested interest in the methodology they're selling. A methodology isn't something you can purchase... the collective-fiction metaphor is apt: you have to convince people to believe in and adopt it.
In my experience, a team will adopt a methodology like Agile, but the best teams will mold it to the way they do things. Their collective fiction needs to integrate their tribal knowledge, goals, experience, etc. There isn't Agile™ — there's Agile the X way.
I don't mean to say everyone should be running around doing whatever they want. I consider the team, and working collectively in an effective way, of critical importance.
I think what you call collective fiction, I call operating principles. If you have any pointers to learn more about this, I'd be eager to take a look. Thanks!
The opposite end of the spectrum is where I'm at now: planning meetings last _hours_ and are mostly an exercise in squabbling over how we're going to get the points to agree with the TPM's completely arbitrary projected view of how things should go, which was somehow divined before actually talking to any of the developers.
For example, consider the concept of the MVP (Minimum Viable Product). I once had a boss who kept redefining the scope of the MVP until, 18 months in, we still had not completed the ever-expanding MVP. In light of such experiences, I'd rather follow methodologies to the letter just to avoid extreme abuse. For example, following some silly hard rule from methodology X that MVPs must be completed in 3 months would be helpful, even if arbitrary.
Sometimes (quite often actually) that is precisely what we need. It would be quite bothersome to feel the need to explain the reasoning behind every decision and put everything you're doing in context. How about actually doing any work? Pick a methodology that fits your context, work according to the methodology and regularly evaluate the fit between the context and your methodology.
Of course, don't overdo methodology, but don't overdo not-methodology either. Try to find a balance that works. Don't let the means become the goal.
"Drive a horse and cart through it" were his precise words.
So for my team, the deliverable was working software. The co-founder wanted another deliverable altogether.
So you are saying go with Waterfall?
Use whatever methodology, technology, and processes you need to accomplish that singular task. Orient your entire organisation around doing that. Your customers don't care how you do it. They just want frequent updates and progress. And if you're not giving them useful updates (you'll know, they'll tell you) then the problem isn't methodology, it's probably your ideas. No methodology will fix that.
In none of the situations I've worked in have updates as frequent as every 2 weeks been requested. It would have been either a nightmare to do that (you end up spending 50-80% of your time delivering under pressure) or just impossible (in cases where testing, verification, and delivery take 1 to 3 months), and in every case it was simply unwanted.
If you work on some web site where the tech is well understood then shipping something every 2 weeks probably works.
That is research, not software development.
For software development you know your tools and how to use them, and only what the resulting product should be is slightly unknown.
There are but a handful of valid choices for a relational database, and all of them are pretty much able to handle every single thing you can throw at them.
Or are you just trying out "flavors" of database? In which case you're wasting the client or employer's time/money, in all honesty.
If you think about it, this falls down a bit, because there is no way you can code and do a full regression test in 2 weeks on any substantial product. You might be able to run some automated tests, but would you ever ship a product without eyes-on-the-screen testing? Heck no.
Before the agile thing, we did 4 releases a year, 2 major and 2 minor, so you are looking at a 3-month cycle.
When I was literally shipping software it made sense to have longer release cycles because the cost of shipping was so high and it was a cost your clients had to pay. But for internal teams, you could ship a lot more cheaply and if you needed to redeploy you typically had the power to do that.
You could also literally have the customer in the room with you, which isn't possible when you are shipping to N external customers.
I think that also explains the rise of SaaS. By keeping the software on my own servers, I can ship much more cheaply, though the client-negotiation part then follows the more traditional model.
It was also nice to have a sense of accomplishment after finishing a 3 month cycle release. I was also much more confident in the code quality because it was thoroughly tested with eyes on screen. Now it seems much more of an endless grind and crossing your fingers.
Also, I think "shippable, bug free code" in 2 weeks is a pipe dream. It's a great goal, but not often realistic in the time frame. Any time a bug comes back, it interrupts the new 2 week cycle and throws everything off.
I'm not saying it's horrible; it's just not ideal. A 3-month cycle seems cleaner. It also gives users time to grasp all the changes. Emergency patches are always allowed as an exception. I don't think a 2-week cycle is the best way to do it; it's just how everyone does it right now.
I want my software to stop changing!
It's less vague, more valuable, and barely more work to replace "shippable" with built (no scare quotes), packaged (no scare quotes), and demoed. The first two can be automated. The third is a matter of getting the right people to show up for a quick meeting. If that's too hard, you can just post a changelog including screenshots, animated GIFs, etc. as needed.
Notice that I didn't say the product should be shipped. That varies a lot from domain to domain and project to project. Sometimes even feature to feature.
I think of Reinertsen’s approach as industrial agile, in the most honorable sense of industrial. There is theory and some math. There is heavy emphasis on project economics as a means of making complex trade-offs and a basis for training, deputizing, and expecting employees at all levels of an organization to make better decisions on the fly as part of doing their jobs. Lessons are drawn from many fields, such as network communications, queueing theory, and the Marines. Contrasts are drawn with manufacturing to explain exactly why some of the techniques we adopt do not apply in product development but where many techniques from kanban systems do apply.
The book is organized around hundreds of principles, given as tools to be considered and applied as needed to your specific situation. Annoyingly, he refuses to get into implementation. At first I thought he was saving pages, but the more I think about it, the more I realize the seeming omission was necessary genius. Digging into even one example implementation would result in the death of the message and framework, as people copied that and got dogmatic about a methodology all over again.
The whole presentation is beautifully constructed and tight. It puts a perspective and language on things in a way that makes me hopeful and interested in project management, similar to how Rich Hickey’s talks raise my understanding, vocabulary, and aspirations for software engineering.
I would welcome anyone involved in prioritizing commercial software projects to read and discuss this book (and document their implementations).
* The user is more important than any of this.
* The programmer is nothing.
* The computer is doing all the work.
Thus: use what makes all of the above true.
In my circle, we have a lot of discussions about clients like that: "and I told him, for what freaking thing do you need that freaking neural network when banal statistics does the job better?". Getting such a guy, "a Wharton grad with a stellar track record at the Big 4," to agree to the former requires compromising his ego and self-image, and exploiting his desire not to look incompetent in the eyes of his superiors.
In the end, all such guys are made to pay six- or even seven-digit sums for a fancy UI over database querying tools and a stats package rather than for "big data, machine learning blah blah blah"; otherwise they are given "the exact thing they wished for," which had no chance of even minimally functioning.
I can actually see the merit in waterfall for elements of the backend... though how do you avoid premature optimisation, or even complete wastage building out APIs that end up never being needed?
On one extreme are things that are expensive to change and mistakes cause critical failures and on the other are things that are trivial to change and mistakes are easily mitigated.
- We had periodic meetings with the client (the publisher). We showed them progress and they suggested directions to move in.
- We didn't stick religiously to the initial design, and adapted as we saw things work or not work to make the game fun
- We stuck to the deadline and just cut what didn't make it
It felt quite agile to me.
I attribute a lot of the merit to us being mentally organized, but mostly to our producer Javier Cano (RIP), who had that magic touch for ensuring that things around him just ran smooth, always with a smile and making it feel natural. My god he was amazing.
Among other things, this translates into working ridiculous hours in expensive cities and convincing a set of investors that they too can be part of this world changing process in exchange for a few shekels. Now these shekels will drive innovation, and they will push civilisation forward.
But the magic of the fiction is that poor Johnny will never feel like he did much to change the world, and so he will keep trying ad infinitum.
Maybe I'm missing what you mean somewhere in there, but facebook raised a lot of capital through investment rounds and an IPO.
I've only been riding the whole rollercoaster for 11 years now, but I've also worked at different companies, from startups to big corps.
Some people are totally into their methodology and won't take a step off the predefined path.
Some people are totally elusive to process. They will talk to you via email, Skype, WhatsApp and JIRA, and you won't find any record of it later.
I have the feeling there is no methodology that makes devs work better, only methodologies that buy devs some peace from the non-devs who work with them.
The one thing I would maybe add is that the tools of the trade have changed for the better in thirty years, particularly source control management, document tracking, and the ability to collaborate remotely. I think ways of development have changed in response to that.
Imagine a world where compiling a build of a piece of software took as long as an hour, source code was stored on a single server so that you didn't even want to have two people working on the same set of code, and automated testing didn't exist... I can remember cases of that in my youth. It seemed to call for more planning and a more top-down approach.
Collaboration tools have completely changed the game, totally agree. As has the death of the physical artifact (that couldn't be patched, recalled, or 'bounced').
I felt like "collective fiction" was a harsh term that implies we're always lying to ourselves. While that might be true, I like to use the more ambiguous term "narrative" for the same concept. The need for a methodology is a narrative or story that may or may not be true. I guess I do believe ;) that some of our collective narratives are definitely true, and calling them fiction undermines the good ones, or at least makes it harder to evaluate which collective beliefs are more valuable & closer to truth and reality than others.
Side note - collective narratives are a human condition, not in the least limited to programming methodologies. We build collective narratives for everything, and our physiology is prone to it. Having kids, it was fun watching them repeat stories they hear about the world over and over, as a way to cement their narratives with their friends and parents. When they were younger it was almost in the form of a question, they were waiting to be challenged or get more specific information. As they grow older, they get more firm in their statements and beliefs. After watching them do it, I became more aware of how often adults do the same thing, and how pretty much all communication, even technical communication, is just narrative.
My own 20 years of experience with methodologies and with other programmers has led me to believe that pushing coders to communicate frequently about what they're doing and what needs to be done is a good thing; most programmers I've known (including me) tend to wander in directions that they want to go and don't naturally want to do all the things that really need to be done. I sometimes don't like being transparent about what I'm doing, sometimes because I'm working on a pet project that I shouldn't be working on, sometimes because I hate budgeting and I don't know when I'll be done, sometimes because I want to deliver a higher quality than was requested and I know I'll be asked to stop being a perfectionist and just check in my changes. I see others not wanting to be transparent about what they're doing for the same reasons. But making sure that transparency exists is probably the only thing I've seen that works. Checking in often, making sure incremental progress is going the right direction, and re-iterating the requirements and users' point of view.
Too much formal process is usually bad unless you're NASA, and by the time it's called a "methodology", it's probably too much. It gets in the way of a lot of the good things that need to happen. Two week sprints get exhausting after a while, it teaches devs to aim smaller, and it causes undue burdens from management on tasks that legitimately take a couple of months. Being forced to estimate everything in story points has some advantages, but ultimately makes measurement harder, it's subjective and easily manipulated, and it's just contrived anyway.
So, I agree that formal methodologies don't work well, by and large, but I think the basic intent to stay focused and communicate frequently is a good goal. It is ironic that most software methodologies are reasonably good at helping people make certain kinds of focused technical decisions, but very bad at helping people evaluate when a formal process is wasteful or bad, or at suggesting a different formal process.
engine choices representing data storage:
very big engines, for big ships.
very small ones, for lawnmowers.
self-made ones, smoking.
gear choices representing libraries:
gears from 1 to 1000. No neutral and no reverse.
chassis choices (language choice):
rusty old ones, hard to work with, but they last.
new shiny ones, overpriced and they fall apart.
locks = safety:
one key fits all (all doors and all cars).
a different key for each lock, can be bypassed by opening a window.
keypad with a password, written on the back of the car.
interior = user interface:
millions of choices, all flashy and shiny, each part looks totally different. Unusable steering.
brakes (software stability):
when used, cause a blue screen.
outside (program output):
all parts have different sizes,
holes are covered with duct tape,
and rust is covered with shiny paint.
Enjoy the ride!
Methodologies organize projects to help further smart decisions and avoid dumb ones, but smart decisions and skill still dictate the success or failure of projects.
In the end, the people are still more important than the methodology, but I've encountered a mentality a few times where people believed a good methodology will fix all.
I've seen companies keep changing methodologies looking for success with largely the same staff and the same key decision makers, yet the results are the same. Insanity, anyone?
The example of a tech company with gigantic dev teams and very few MBA doctrinaires involved in the development process is Intel.
Companies with external development clients that practice "we will agree to any methodology you want for the right amount of money, but we won't let you anywhere near our developers (who will be doing their own thing anyway)" are the Tatas, Wipros, Luxofts and others. Their biggest asset is their skill at managing the client.
The goal there is not to dive too deep into a task that is already solved in a way that "works well enough" and that the client is already prepared to pay money for.
I mentioned this during a meeting with my team, and so a process was devised for the next project we embarked upon. The process was waterfall, except nobody referred to it that way (most of the people are like 25, and weren’t working back when it was the norm). Everything old is new again.
It wasn’t important anyway; we didn’t follow the process.