Freeloading includes plain slacking, doing things the fun way rather than the good-enough way, playing politics, keeping yourself busy with unnecessary management/coordination, and so on.
The reality is much more complicated. I worked with some great people at a huge company; a lot of them had families. One had a child with very severe asthma, and one time only a helicopter could get to the hospital fast enough.
Others had perfectly healthy children... in college, and a huge mortgage on the family home.
They all engaged in counterproductive job-preservation strategies, the easiest of which is simply not sharing knowledge aggressively. At very large companies you really have to actively push your knowledge onto people or into a shared knowledge repo.
The next level is to put a very small bit of effort into actively avoiding the sharing of knowledge. Give less than the best answers you can give when asked for information. You don't have to lie or hide, just don't be quite as eloquent as you could be.
At yet another level are things like writing custom wrappers around things that didn't need wrappers, or that already were wrappers themselves.
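A hypothetical illustration of that pattern, sketched in Python (the function and its "company standard" framing are invented for this example): json.dumps is itself already a high-level convenience layer, so wrapping it again adds a name to learn and maintain without adding capability.

```python
import json

def serialize_to_json(obj, pretty=False):
    """Hypothetical 'company-standard' serializer: a wrapper around
    json.dumps, which is itself already a convenience wrapper."""
    return json.dumps(obj, indent=2 if pretty else None)

# Identical output to calling json.dumps directly.
print(serialize_to_json({"a": 1}))
```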
People were doing this because if they lost their jobs they would lose their homes, the children's health insurance (which can pay for things like helicopters), and so on.
Then there's personality conflicts. At very large companies you work with a lot of people and this increases the probability you'll meet your worst match. That is the personality that just happens to clash with your personality. The most common I've seen is very confrontational vs. very passive. The very passive then often becomes as passive aggressive as the other is openly aggressive. And boy can smart people like hackers be incredibly good at passive aggression.
The point here is that either of those people would be good on their own. It's the combination that's toxic for them and anyone caught up with them.
And then there's anything other than stellar management. Organizing any large group of people so that communication is effective is not easy. If that large group is programmers and the information is highly complex... you don't have to be bad to mess it up, just not being a great manager is enough.
And there's even more, I could write a large book about this.
When you're stack-ranked against your peers, I wonder if people ask themselves, "why should I help you, when all I'm doing is making it harder for myself to get a bonus next year?" I don't indulge in such behaviour, but I've wondered if others have thought it and acted upon it.
Yep, and what I've heard (and only heard, never worked there myself) is that Microsoft has some of the most programmer friendly corporate culture around.
A lot (if not most) big companies are not like that. The reason Joel's writing on company culture is so popular is because such simple things are still quite rare in the world.
I figure this is part of what has kept them around, and is fueling what seems to me to be a comeback in spite of everything going against them.
I don't have a great depth of knowledge on them, but I did interview with their Xbox team, and everyone was just really enthused to be there.
And you shouldn't allow confrontational people on your team anyway -- they can't work in a group (but would probably be awesome on their own).
A lot of great hackers are very confrontational; it comes from a love of truth. Look at Linux and Linus Torvalds: Linus freely calls himself an asshole because, well, he is one. Many hackers are.
You can't just magically decide not to work with confrontational people when you're in the software business. Not unless you have infinite amounts of money to hire whomever you want.
The bigger problem is that it would encourage bitter feelings between teams. If a silo is not successful and freeloaders are not pruned, there will be a tendency for one silo to sabotage another if they think their collective job security is at risk. It is not a perfect solution; it can be approximated, but actually separate small companies still make more sense.
Take a look at a farm, what's outside the silo? Rats, rot, waste, what's inside the silo? Stuff you can sell for money that is well protected from the elements. In a large company you need to be a very adept bargainer / barterer to acquire all the resources you need despite the bean counter mentality.
In cases like that you can improve productivity by actually cutting the team's size and paring away less important features.
Some programmers, in some environments, are negative net productivity programmers. If some programmers literally get nothing accomplished at all, then you can easily say that "some programmers are 10^3 or 10^6 or 10^100 times more productive than others"
Another factor in "rockstar" productivity is that you need to get all the bull out of the requirements gathering process. I've seen so many places where it takes six hours of time working with "customers" to figure out the requirements for something that takes 6 minutes to implement. Tasks that are absolutely trivial can stretch on for weeks if your organization can't be decisive about getting them done.
Of course, in large teams there's a lot more bull to go around.
And a thirty person team is not only more likely to have several interns, but also people who spend a significant amount of time training, mentoring, and managing them.
Large teams also allow people to hide out which I think kills morale for the rest of the team. I still don't understand how the entire team can realize a person is ineffective yet management somehow misses it and keeps them around. At a minimum if management wants to keep a person around they should explain they are not cutting it and need to improve.
I'm starting a new job soon and cannot wait to get back into a small company.
If one's idea of how to deal with personnel issues is to just fire people, one may not have the temperament for management in a large organization.
I don't really think that's what I said. People who show time and again that they are ineffective at their jobs should not be there. I'm certainly not advocating firing someone for a single mistake or even for many mistakes as long as it's not the same one over and over again. Hell, making mistakes means the person is actually working and trying to get things done. That's not the kind of worker I'm talking about here. I'm talking about the worker who does little and consistently gets in the way of everyone else who is trying to push the boulder forward.
I also blame management because they need to be proactive in explaining to people that they are not cutting it and need to step up or will eventually be let go. I know if I was not meeting the needs of my job, I hope my manager would tell me.
It often is significantly more cost-effective for a large company to keep them than to go through the expenses entailed in the hiring and training processes.
The cost-effectiveness you're thinking about is likely only short-term. If you keep someone who is mediocre, consistently slows down other teams, and burdens other team members to make up their slack, then long term it is cost-effective to let them go and get a quality person.
Rockstars may be 10x more productive individually, but anti-rockstars can be a huge minus on the productivity of the entire team. IMHO, it's much more important not to hire (or keep around) mediocrity than it is to hire rockstars.
On occasion a come-to-Jesus talk might be appropriate with some types of employees, but once one starts threatening people with the loss of employment, bad things often follow - particularly in the US, where such threats often mean loss of access to healthcare for the employee's family.
BTW, when the job is pushing the boulder forward, smart people recognize the Sisyphean nature of the task and respond accordingly.
If you only look at your organization then yes 1/2 of the teams in your org are below average for your org, not necessarily below average for all software developers. I'm not talking about trying to hire only the best of the best like Google. I'm talking about avoiding the worst of the worst.
So are you arguing to just fire them outright? I thought that's what you were against? And since when is a frank review of performance a threat? This must be where 'everyone gets a trophy' day has led us. Unless people are told their true performance and how it stacks up to what's expected they will not be able to improve. It may be that they are simply unaware how bad they are performing, or they simply need to be motivated.
Huh? Are you saying all work is useless? That's certainly not what I meant by the term. Pushing the boulder forward is what I consider advancing the business in a given direction, one that I certainly hope doesn't have to be backtracked.
In business there are people on the boulder deciding where it should go, then there are people on the ground making it happen (I'm including all the support people here). Finally, there are people standing on the boulder doing neither, who instead provide dead weight to be carried.
In small companies the people who decide the direction and make it happen are often the same. This is what makes small companies so nimble. Those deciding where to go also know directly of the challenges of getting there. Small companies also rarely have much dead weight just standing around.
In big companies there is often a huge divide between those who decide direction and those who make it happen. Big companies also tend to have lots of people just standing around on the boulder. Now, standing around isn't so bad until these people start getting in the way of the job of advancing the business, actively trying to make things harder for those pushing because doing so gives them some odd sense of job security.
If you look across the industry, odds are most or all of your organization's teams are below average, given that Google, Microsoft, Apple, Oracle, IBM, etc. retain a significant fraction of the top talent, and entrepreneurship claims many more as well.
In a large organization, smart people realize that those distant people who decide what to do with the boulder are often motivated to move the boulder because they must do something and moving the boulder is something.
Occasionally, moving the boulder improves things, often it doesn't. A good manager musters resources to the former and gives lip service to the latter. A good manager listens to the whining of those compelled to demonstrate needless nimbleness by pointing to the degree to which others are not participating in the fire drill of the quarter, while retaining adequate staff in a pleasant workplace to move the boulder when it actually needs moving.
Mythical Man Month
Very highly recommend you read both. The MMM is amazing because it's held true for over 35 years now.
Can't remember where I read that - might have been the MMM.
3 is unstable, as too often one person feels left out; 4 is an even number, so there's the risk of a split vote.
I'd agree that a larger team effectively mitigates risks around individuals not fitting in or decision-making being stymied, but in my own experience I've definitely preferred 3 or 4, based on a perception of maximising my own personal productivity. That is, of course, completely subjective!
However, here is a more recent article that suggests that "best" team size is 5 plus or minus 2 - which fits into my preferred range:
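One mechanical reason small numbers matter is Brooks's observation that pairwise communication paths grow quadratically, as n(n-1)/2. A quick sketch in Python:

```python
def channels(n):
    """Pairwise communication paths in a team of n: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for size in (3, 5, 7, 20, 32):
    print(f"team of {size:2d}: {channels(size):3d} channels")
# A team of 5 has 10 channels; a team of 32 has 496.
```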
In a given large system, you can and should have many small teams working independently.
Another small team should act as the 'curator' or 'middleman' for communication between teams.
(This does match my experience: 40+ person teams simply accomplish less and less than a very small, co-located or constantly-in-contact team.)
Computational efficiency also tends to correlate with cost per line of code in the man-month model the author requires, wherein all programmers are of the interchangeable $10k-a-month type.
Finally, large teams are likely to operate in environments where large teams are required due to bureaucratic overburden, the very environments where $1.8 million coding expenses are both justified and common.
With great respect, I find this approach flawed in analyzing the behaviour of large companies. Assuming rational actors, the thinking goes, every company is providing value and acting efficiently.
For starters, actors may be “rational,” but they suffer from information inefficiencies, so they are doing the best they can with what they know. Sometimes, they proceed “rationally” from flawed assumptions.
For example, they may believe that Waterfall is how to get a project done. Often this is not correct, so the “rational actors” produce waste.
Another problem is that rational actors in the small don’t necessarily produce rational behaviour in the large. Rationally, it is often in an actor’s best interests to do something that harms a team in exchange for providing a personal benefit.
This is how turf wars and silos competing with each other arise even with “rational” actors.
So in summary, I suggest it's very dangerous to advance an argument that since the actors are rational, the results must somehow lack waste.
I would argue the opposite: That the larger the team, the more likely that rational actors will face situations where being rational means producing waste.
However, economic rationality is one premise of the author's argument. Otherwise, there is no reason to care about efficiency.
Having worked in organizations both large and small: turf wars and petty behavior are hardly confined to large organizations, and at the level of the production worker it is my experience that they create far more drag in small organizations than in large ones. In general, thirty-person teams more easily replace a resigning worker than five-person teams do.
The term "waste" is as suspect as "dollars per line of code." A team of thirty is more likely to be dealing not only with an overburden of bureaucracy but also with additional constraints such as systems compatibility and legacy code bases (e.g. IE6), while small teams of $10k-per-month programmers are probably more likely to produce code worthy of The Daily WTF, e.g.:
    if s == "True": return True
    if s == "true": return True
    if s == "TRUE": return True
    if s == "tRue": return True
    if s == "TRue": return True
    if s == "TrUe": return True
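For contrast, the fix is a single case-insensitive comparison; a minimal sketch in Python:

```python
def parse_bool(s):
    """Replace the enumerate-every-casing approach with one
    case-insensitive comparison."""
    return s.strip().lower() == "true"

print(parse_bool("TrUe"))   # True
print(parse_bool("false"))  # False
```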
Exactly. A four-person team may be far more efficient on-task, but requirements-gathering and politicking at a large company could swamp them for months. There do exist problems so large that simply defining them exceeds the capability of a four-person team.
Furthermore, a small team's ability to bring in a project for less money ignores that time is a business-critical aspect of development. In many cases it is worth far more to a company to get a solution out the door in 12 months at greater expense than to wait for a more cost-efficient team to finish in, say, 24.
Lastly, I take issue with the entire idea of efficiency being definable as cost-per-line-of-code. If we as a profession are going to dismiss SLOC as a useful measure of productivity, we certainly can't turn around and allow it to be the basis for an efficiency argument.
I ask whether we can just take it as a given that large teams can get a given piece of software done more quickly than a small team. It may happen some of the time, it may not. But I have trouble granting it as axiomatic, for reasons that would fill a blog post.
Even if half the latter team is utterly useless, and the useful half runs at a fraction of the four-man team's per-person efficiency, you can still end up with more raw production power. Obviously it costs many times as much, but as a previous poster noted, sometimes that just doesn't matter.
Perhaps the biggest reason is that when you have a team of four, every hour you spend in a meeting involves roughly 15 minutes talking about the problems you're having and discussing their solutions. In the team of 20, I'd go to a meeting, spend three hours there, and two of them would be taken up by one particular aspect of the project, one I wasn't working on and whose implementation had absolutely no bearing on my work.
Meetings, or at least irrelevant unproductive meetings, are a huge time sink. At least half of the time spent in those meetings could have certainly been spent doing other, productive, tasks.
It's getting tiresome going over the same point that we've known for decades, if not centuries.
Or should I say, it's getting tiresome having to explain and/or defend a position that's so obviously correct, you'd have to close your eyes, put your fingers in your ears and go "whaaa whaaaa whaaa" till you're blue in the face not to see it.
By the original proverb fewer cooks only make the broth not-spoiled, not necessarily good either :P
Small populations are more variant. Most of the best schools are small. But most of the worst schools are small too. Large schools tend to converge on a median. The most dramatic rates of cancer and absence of cancer are found in rural areas because they have small populations. Cities converge on a median. And so on.
We don't hear about the small, unsuccessful businesses. They don't get on panels because they no longer exist. So the small firms that did well are over-represented.
That said, that smaller projects perform better is a consistent finding in classical software engineering research.
The small team works as long as there is good group cohesion. Unfortunately, if there is a morale issue with one person, it can quickly spread to the remaining 3 or 4 people on the team. Such an effect is at least slower in larger groups. It might not occur at all if the organization is such that there are silos for protection.
At larger companies the goals have been more abstract. Work might be to satisfy an ideal created by management ("testing is good! 30% more testing!") which may or may not actually be useful. In the worst case it's entirely possible to do work that nobody actually cares about, which I find demoralizing. I don't think that happens as much at small companies because I don't think they can afford it. Big companies with old cash cow products and lots of inertia can.
Whilst I think there is more than just increased communication overhead involved, I know in my gut that I much prefer working with a team of 3-5 than a team of 20. Things just happen so much quicker!
I think another point to look at would be the level of scrutiny that goes into hiring another member of a dev team of 5-20. A 100-dev team usually hires by resume via HR, with less focus on culture fit and in-depth technical knowledge.
points at CatB http://catb.org/~esr/writings/cathedral-bazaar/
I think part of the poor performance of large teams is because as soon as the business is big enough to have large teams, it's big enough that it starts having middle managers and PHBs. Large teams are only bad because of the conventional commercial software development practices (waterfall, anyone?)
Large teams should be more decentralised, like Linux: order emerges out of chaos. Adding managers that enforce correct use of waterfall won't do any good.
However, then you notice that the size of project being discussed here is only 100,000 equivalent source lines of code. Assuming that an "equivalent source line of code" is something like a real source line of code, if you're building projects on that scale with teams containing an average of 32 people then I'd say you're doing it wrong. Such projects are almost certainly simple enough for a single small team to complete inside a year, as indeed the study also found, and an insane proportion of the time spent by the members of your larger team is going to go on management, communication and integration overheads.
If you started looking at projects large enough that a single small team could not complete them within a useful timescale, perhaps an order of magnitude bigger or more, and you still found that teams averaging 32 members took almost as long to complete the work as teams averaging 4 members, that would be interesting. But that's not what this study seems to have considered.
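To make the per-person gap concrete, a back-of-the-envelope in Python, assuming (as elsewhere in the thread) roughly nine-month schedules for the 100,000-SLOC projects:

```python
SLOC = 100_000   # project size from the study
MONTHS = 9       # rough schedule figure used elsewhere in the thread

for team in (4, 32):
    per_person = SLOC / (team * MONTHS)
    print(f"{team:2d}-person team: ~{per_person:,.0f} SLOC per person-month")
# The 4-person team delivers roughly 2,778 SLOC per person-month;
# the 32-person team roughly 347, an ~8x per-person difference.
```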
Though I'm not sure the QSM article is strong proof of much anything, really. They don't seem to test for e.g. language or problem domain, and while SLOC has a relationship to the defect rate I'm not entirely sure I agree it's a good comparison metric, nor does it say anything about the success rate of the projects.
That said, one obvious reason for having a 30 man team to develop something is because it's composed of a multitude of disciplines. A software/hardware project might require people with deep skills in digital electronics, analog electronics, software development not to mention domain experts for what they're interfacing. Assuming the thing has an interface you might want a UI guy, then there's the supply chain and manufacturing to take in to account.
It's not hard to see that it could swell to thirty people quite quickly.
9 x 32 x $10k != $1.8m; it's ~$2.9m. And 9 x 4 x $10k != $245k; it's ~$360k.
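Spelled out, using the thread's $10k-per-programmer-month figure (a sketch of the simple man-month costing, with the rate being the article's modelling assumption):

```python
RATE = 10_000  # dollars per programmer-month, per the article's model

def cost(people, months, rate=RATE):
    """Total labour cost in the naive man-month model."""
    return people * months * rate

print(f"${cost(32, 9):,}")  # $2,880,000, i.e. ~$2.9m, not $1.8m
print(f"${cost(4, 9):,}")   # $360,000, i.e. ~$360k, not $245k
```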
Cost and schedule are the first things that managers look at, but as a client I would also be looking at stuff like SQA, code quality, completeness of my specification. There's also the 'bus factor' (if one of these guys gets hit by a bus, is there someone else who can work on and understand my code?) What additional documentation am I getting with the code? How is the customer service, is there someone there who I can call and talk to?
We also don't know how this 32 person team was comprised. If it was 4 programmers being supported by 28 other people who worked on the project off and on (management, sqa, marketing, documentation, requirements) these results are not surprising.
As a programmer I want to know things like: how many 16 hour days did the devs work, how maintainable is the code, did the programmer salaries increase commensurate with the additional effort that they seem to have put in? Am I confident in the reliability of this code, or has it been simply thrown over the wall with the promise of support.
I know we all want evidence for 'just Me, Steve and our Emacs buffers, is most efficient. Going it alone like Clint Eastwood and Arnold Schwarzenegger in a buddy comedy about programmers.' But studies in the social sciences like this one are incredibly difficult to get right and should be taken with a grain of salt.
If small teams _always_ and _unconditionally_ are so much more efficient, why do we still have the likes of, say, IBM, Hyundai, Microsoft, or ant colonies?
I think that it partly is because large teams do a lot of things that small teams deem unnecessary, such as documenting design choices, pre-estimating demand/bandwidth requirements etc, making sure one complies with legislation, etc. Not all of that is necessarily waste.
Another large part of the difference, I think, is due to the difference between programming and engineering. Many small teams, especially at smaller companies, program, while almost all larger teams engineer. Programming produces way more functionality per man month than engineering does (for an example, read GNU's well-engineered hello world source code).
So, I think the main rule should be "only start engineering if you have to". Large team size is, for a large part, just a side effect of the decision to engineer.
I would be inclined to agree with the premise of the article, but the evidence given is flawed at best.
Cost of change becomes excruciating when dealing with large established systems. New systems must be evaluated and understood, below-average performers trained on them, existing functionality migrated correctly, etc.
Simply put, there's more to do to release "big-co" number of features, as well as more people to talk to about releasing them.
Things are the way they are for a reason.