In 1997 I worked for a small cell phone provider. One day my boss rounded up all the devs that weren't maintaining the billing system, and told us to write a customer care and account activation system.
That's all he told us.
We asked for requirements and were told to figure it out. So we did. We delivered a first version in three months, and a final product in eight.
Afterwards he told us about a paper he'd found after reading Wicked Problems, Righteous Solutions.
That paper describes what SCRUM was before it got butchered into what it is today.
In 1986, SCRUM consisted of choosing an elite team of experts, throwing them into a room, and telling them to solve a problem with seemingly impossible goals. This unsettles them somewhat, but you persevere. You tell the team how they do it is their business. Your business is to support them by providing all the resources they need. Management is decidedly hands-off, so you leave them alone but give advice when asked. After a while magic happens, the team self-organises, and a product starts taking shape. Soon after the project starts a leader naturally emerges. As the project progresses, leaders change, because the initial leader may not have the expertise required in later stages of the project.
If you're thinking that all of this sounds a bit like programmer anarchy without the stand-ups, you'd be right.
It kills me that almost 30 years after this was proven to work, we're still trying to come up with new ways to differentiate existing processes.
> In 1986, SCRUM consisted of choosing an elite team of experts, throwing them into a room, and telling them to solve a problem with seemingly impossible goals.
I used to work in scientific research, and the concept there is exactly the same. And it works, I can confirm that!
> It kills me that almost 30 years after this was proven to work, we're still trying to come up with new ways to differentiate existing processes.
The problem is, in 1986 you talked about experts. What people have been trying to do lately is to apply whatever methodology to average (or sub-average) developers and expect the methodology itself to substantially increase their productivity and the quality of their work.
But it doesn't work, simply because it cannot work. Buying a book on Agile/SCRUM/whatever is much cheaper than investing in the competency of your team, and you simply get what you paid for!
I don't fundamentally disagree with the author's conclusion (I think it is correct, but only for some organizations, and for different reasons), but it seems to me like he doesn't actually grasp the act of programming.
First of all, there are the truisms, which make this sound as if it was written by some HR drone:
> on the other hand it [Agile] improves productivity when correctly implemented.
Improves productivity over what? Over cases when it's incorrectly implemented? That's true for anything. Over cases when no policy is actually in place? That's also true for anything that introduces some form of order, at least in the beginning. Compared to other PM techniques? Good luck implementing Agile in projects that require hardware design.
> You don’t just need a good programmer, you need one with good communication skills, one that understands the business enough to work with it productively and one that is savvy enough about why tactical and strategic decisions are made.
Arguably, a programmer with poor communication skills, who does not understand what the system he's building does and why it's being done the way it's being done, is not a good programmer. He may be a good programmer from the perspective of the head of recruiting, or from the perspective of someone in accounting who sees the smaller paycheck he's given, but arguably not for someone who is interested in, you know, the programs that guy/gal is writing.
This perspective, which focuses on the code-churning aspect of a programmer's work rather than on the problem-solving aspect, is typical of people who have not actually worked on a non-trivial programming project. The truth we all painfully learn in our first four or five years on the job is that writing code isn't the hard part of programming (well, it is for those who made a wrong career choice...). Coding is the most visible and the least cryptic of the activities we do. To someone in a management position without any programming experience, it looks like the programming language and the technologies are the main obstacles to anyone becoming a programmer. Truth is, any idiot can write code.
> Programmer Anarchy presumes that everyone in the team is totally passionate about the project’s success
I'd say programmer anarchy presumes that everyone in the team is totally passionate about what they do.
Just in the paragraph above this, the author argues that PA worked for Fred George because he had a team of good programmers -- but that this is a methodology that won't work for unskilled programmers. If the determining factor were actually passion about the project's success, why would the level of skill matter? A poorly-trained programmer can be just as passionate about seeing a project succeed as a good one, especially if he's just starting out and is a bad programmer as a result of lack of experience, not lack of raw skill. In fact, in my experience, I've seen a lot of poor programmers scoring high points on the passion thingie, simply because they tend to compensate for their lack of programming productivity with really loud talking, and for engineering skills with social skills.
Martin Jee seems to look at this with the eyes of someone who's managing names on paper, not real people. There are factors that the waterfall model doesn't put into play, like peer pressure and team solidarity. It also scores high points on other motivation factors like trust; good developers are skeptical of project managers whose methodology has more buzzwords than ideas -- and Agile's thin, broad tent is a prime candidate for people who do that -- and more favourable towards solutions which they perceive to be closer to their actual job.
> You find that many other members of the team prioritise their private lives more when it comes to spending time in the office, or that their minds are somewhere else when they are sitting at their desks.
This is true of any project management methodology. Unless I'm missing something, waterfall also presumes people are actually doing work during their work hours. PA also doesn't come with an increased number of work hours.
> With a central leader figure you also get one person who has staked the next part of their career on the success of the project and they become the driving force behind its delivery. Programmer Anarchy presumes everyone is their own leader, but again not everyone is like that. Lots of people don’t want to take control or drive through the delivery, they just want to do a day job.
This is fallaciously presuming that every central figure is "staking" the "next part of their career" -- i.e. that every central figure's purpose in life is to climb the corporate ladder. This is usually symptomatic of malformed meritocracies. Everyone in the team is staking their careers on the success of the project they're working on; it is far easier to find scapegoats when you have prominent leader figures, and scapegoating is a fundamental component of the corporate ladder. But informal leadership can functionally provide some of the needed advantages.
I think the fundamental difference in view is this:
> I think the most powerful idea in Programmer Anarchy is the idea that programmers take personal responsibility for the success of each project
There is a very small set of products whose success depends more on marketing, sales or the IP lawyers than on the product's technical merit. This set is extremely small, confined to a handful of monopolized domains, more often than not with products that are not under serious active development, and while an investor might consider them successful, no self-respecting programmer would want to claim one of them as a high point of their career.
For anything else, the success of each project preeeeeetty much depends on the product it aims to develop actually, you know, working as expected, which is obviously up to the programmers. This is true for any field: a well-financed real estate project will fail if the architects and builders come up with houses that collapse under the weight of the roof, a heavily-advertised car that doesn't start or breaks down every hundred kilometers because the engineers and workers didn't do it right will not earn much income, and so on.
On the issue of a central leader figure, I have seen projects fail precisely because someone was staking a career on it. In that scenario, the customers you truly serve are your boss and your boss's boss. Users? Damn them and their expensive 'requirements'. Support? Someone else's problem. Maintainability? How is that going to get me promoted? All of those things cost time and money, and therefore are risks to be avoided. Better to spend your time burying bad news and selling your failure as 'success' to those who really matter.
All of the self-managing teams I've ever worked on have been much more user-focused and success-oriented (in the true sense) than any project where someone was afraid of losing their head. Self-managing teams just seem to be immune to certain kinds of office politics.
This has been my experience as well. That's why I mentioned malformed meritocracy: when you lead a technical project based on non-technical values, projects have a higher chance of failing precisely on the technical side. And the technical side, despite the wet dreams MBAs were brainwashed into believing, is usually a single point of failure, unless you're a company like Oracle, whose licensing practices and negotiating leverage allow them to shove any piece of crap along with their DBMS (which is otherwise worth its money).
Keeping good programmers without rewarding technical merit (while, worse, holding non-technical values in a disproportionally high esteem) is, in my experience, next to impossible. It quickly leads to a depletion of valuable team resources. This doesn't necessarily mean poor economic performance -- you can still achieve that through non-technical means, from aggressive licensing to patent trolling. But at that point, what PM methodology you use for your cargo cult software development process isn't really relevant anymore.
In my experience, most of the projects that were led by primarily ladder-ambitious people have been utter failures if judged by their overall impact. Many of them actually reached their economic goals, but alienated exceptional developers, blocked promotion for more value-oriented professionals, and had outrageous maintenance costs. Good if you look only at a handful of trees, bad if you look at the forest.
This sounds a lot like "What we do at startups smaller than 10 people." At least, it's pretty similar to what we did at my last company and a few others I've seen.
In my single anecdotal experience, we tried to keep this sort of "Everyone is self-directed" mentality as long as we could. As long as everyone could keep pretty much complete knowledge of the system and product needs in their heads, it worked great. Everyone coded, everyone answered support emails, and there were no formal rules.
After a while, though, it sort of broke down. One day I looked up from the subsystem I was working on and realized I had no idea what was being worked on in a couple of other areas, and needed to ask people when I wanted to do work that might affect them.
One dev found that he was increasingly the person people came to with higher level questions about what other people were doing, and the person to consult with when considering a new project.
We also found that important bugs and projects were falling through the cracks because there was no clear person to whom they would fall. So other devs took up that role. By the time we actually named a few people "Leads" (they were effectively proto-PMs), it was just putting a label on what was de facto happening.
That was the first of several aspects of project process and management that arose relatively naturally as we grew.
Eight, Bob. So that means that when I make a mistake, I have eight different people coming by to tell me about it. That's my only real motivation is not to be hassled, that and the fear of losing my job. But you know, Bob, that will only make someone work just hard enough not to get fired.
"Fred George’s experience of Programmer Anarchy was probably so positive because he was working in a team of high performance programmers."
This pretty much sums up the validity of programmer anarchy for any given work climate. Much like a society of good citizens may (theoretically) be self governing, high-performance programmers work best when they are given a task and all red tape is removed. Unfortunately, anyone who has worked in the tech field knows this isn't always the case. Some people/companies need guidance and lack of personal autonomy, however unfortunate that may be.
I don't necessarily disagree with you on that, but I'm still thinking about how we usually evaluate this. I don't think it's often that people come to this conclusion after actual self-directed work; rather, it's some kind of assumption made after perhaps showing a lack of passion in a directed-from-above workplace.
"There is some limited but direct evidence that Programmer Anarchy is effective, and as of yet no evidence that it is not. However, it confuses and scares me, so I will say many words about how it could never work."
The author says, haha, this won't work, but tell me about it. Then he says, oh, it worked, but I don't think it will work anywhere else. He thinks leadership is always necessary "to organize and motivate the team", but offers no counter-examples of why this is generally the case, or of where he has seen this approach to programming not work.
Oh yeah, and he threw in some Billy Bragg jokes and a Syndicalist flag.
Thanks for informing me about this approach, sounds interesting.
However, I think it's fairly obvious when reading your post that you're fundamentally opposed to a "leaderless" workplace rather than actually evaluating the effectiveness/viability of "Programmer Anarchy", let alone judging its future success.
I have never heard of the concept before. It would seem to me that, if you have a team of motivated, experienced, and intelligent programmers working on a creative project, this might be pretty effective.
The only work I consider 'dirty' work is work that doesn't help the final product. User documentation, support documentation, test plans, etc... all help the final product. I actually enjoy doing them as part of the process and as a break from heads down coding.
The only work that I genuinely consider dirty work is useless work. Things like status reports or faux process fall into the useless category.
You may be the exception. I have yet to see a software shop where the documentation and test cases were all current.
You never know when status reporting can help... if done right. If 20 teams that each impact the final deadline are all on time 90% of the time, you'll still likely have a late project, because that 10% adds up. And in most companies that 90% is lower. It helps to have quantifiable progress metrics to understand where the trouble lies. This might not be the type of status you're referring to, though.
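The compounding effect above is easy to quantify: if each of 20 independent teams hits its deadline 90% of the time, the chance that all 20 do is 0.9^20, only about 12%. A quick sketch (the team count and on-time rate are illustrative, and team independence is an assumption):

```python
# Probability that a project with many on-schedule dependencies ships on time,
# assuming each team's punctuality is independent of the others.

def on_time_probability(teams: int, per_team_rate: float) -> float:
    """P(all teams on time) = per_team_rate ** teams."""
    return per_team_rate ** teams

print(f"{on_time_probability(20, 0.90):.1%}")  # ~12.2%
print(f"{on_time_probability(20, 0.80):.1%}")  # ~1.2%
```

The point of a progress metric, then, is less "is each team fine?" and more "which of these compounding risks is drifting?".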
The article says that agile is a big-enough tent to encompass waterfall development with standup scrum meetings.
No way. No way. You can put lipstick on a pig, but at the end of the evening, you are still dating a pig. I lived through the waterfall era, where upwards of 90% of large projects were seriously over budget, seriously behind schedule, or outright cancelled. A colleague at the time told me he had worked on a $10M project that ended up $1B over budget. No, that wasn't a typo. Fortunately, I wasn't a party to any of that nonsense.
I think there's a pretty clear analogy between Programmer Anarchy and Free Market Economics.
Each posits that regulation (whether by central managers or by external forces like government agencies) hinders progress. Supporters of this view claim that spontaneous order is superior to any order that does not allow individuals to make their own choices of what to produce, what to buy, what to sell, and at what prices, due to the number and complexity of the factors involved. They further believe that any attempt to implement central planning will result in more disorder, or a less efficient production and distribution of goods and services.
When it comes to economics, true free markets almost never work. Some form of governing agency is necessary to safeguard social and environmental values. I imagine a similar form of regulation is beneficial to commercial software projects. However, without appropriate data, this (like the author's points) is only speculation.
When it comes to economics, true free markets almost never work.
Not that it proves they work, but when was a true free market ever even tried? Regulation (by public and private institutions), for good and bad, is pervasive; I don't see how one could even set up a free market, regardless of its consequences.
I read through it and I like the idea. I was into agile in the early 2000's but it quickly grew into a massive process in and of itself. The process geeks just replaced waterfall processes with Agile processes, still lots of formalized ... processes. I don't think the name will sell, anarchy, but Agile needs to be more agile.
This model assumes everyone in your organisation is Clueless and the author correctly points out that it falls on its face when you try to apply it to a group with a few Losers in the mix. Also, someone is always writing the checks and handing out the raises, so there will always be a leader somewhere in the group.
I suspect you are operating under the misapprehension that anarchist means no structure. It doesn't. It means self-organizing. Newby anarchists tend to have more meetings than any other group I know, because they often have a hard time learning to delegate.
Experienced anarchists tend to have meetings because they get good at them and enjoy them.
> Frequent visitors to this blog will notice that I use the term “Programmer” here a lot, as opposed to the usual “Developer” – that’s simply because Fred George uses that term, and he probably uses it because he’s American.
Does anyone else think this has anything to do with being American? I use programmer. Developer means real estate to me. I suppose a programmer could be in broadcasting, too, but I just don't find that usage in the common vernacular nowadays.
As a Brit my impression is that the "correct" term is "Developer". "Programmer" is used self-deprecatingly, like a violinist calling themselves a fiddler; it has overtones of button-pusher, code monkey rather than someone who makes things.
FTA: "Take the example of a tester, or a 3rd line support analyst – in the main their ranks are filled with the below-par programmer. They help the process by reducing the programmers’ work load and don’t cost as much to the CFO’s bottom line."
Don't cost as much as what? As much as a smaller team of expert programmers doing it right?
When I imagine programmer anarchy, I wonder whether a large software project could work with a Wikipedia-style development process. What if git.kernel.org allowed anonymous commits to the Linux kernel? Alternately, what if Wikipedia used git-style pull requests?
I've been around long enough to see all the trends in approaching software development. As they wither, one by one, so do the fanatic programmers behind them. Yet, I still continue to program as usual. Most of them talk too much and code so little.
They are, but it's easier for a programmer to develop business domain experience than for an end user to learn programming. Adding a business analyst between them works if it's a great BA, but frequently it cuts the accuracy of the communication.
I can't think of too many times when I got burned putting programmers in front of customers. They don't always facilitate the meetings or ask the open ended questions, but it's great when they're present.
Same concept as getting factory workers to observe end users of their cars. Valuable, even if the product managers and marketers own the interaction.
It's not a question of whether it's easier for programmers to learn the business domain, or for BAs to learn technical details; it's a question of whether you want to waste your highly trained and expensive programmers' time dealing with customers.
Sure you need somebody technical in front of customers, but the whole point of org structures is to delegate roles to increase efficiency. This methodology strikes me as ridiculous - who is going to volunteer for QA work? What if your 'QA guy' gets bored and decides to become lead programmer for an area, but can't really handle it? What if a bunch of programmers get together and find a really interesting technical problem, and get lost in the jungle for 6 months whilst contributing nothing to the project?
I've never heard of PMs or managers described as 'empowering' their teams to do work - in the engineering world they're simply considered to act as shields against the crap that flies around at the next level up (contractual and legal shenanigans - do you really want programmers getting into that?), and as helpers to get rid of admin problems.
Incidentally, when I worked for a college we spent some time investigating Healthcare IT possibilities. Speaking to hospital hiring managers, the general consensus was that they'd like to take someone with a healthcare certification (RN, etc) and train them to be a programmer, database administrator, or IT support person instead of hiring a skilled technologist and training them in healthcare.
I've noticed this in practice as well; one of my friends is a pharmacist who moved into an IT project management role at his facility. They would have never considered a non-healthcare professional for that role, is what I understand.
I have to agree with this sentiment. All throughout my studies of comp sci I felt the amount of study required was very small compared to that of other science degrees, or really any degree.
After being a developer for almost a decade in various roles, I've found that the thing I enjoy most is learning new domain knowledge at each business. Sure, I've picked up a lot of new tech knowledge along the way, but the thing that takes the most time is always business knowledge, needs, usability, and industry domain knowledge.
Further to that, many universities have started adding tech study to their courses. I guess they got one too many requests for managers who knew SQL and spreadsheets?
My impression (could be because I went to a non-engineering school?) was that computer science had the most work of any major. Certainly much more than anything out of the business school, and most non-science. Perhaps Physics could have had more work, but I don't think biology or chemistry did. Things could be different at other schools though.