From a management perspective, you're forgetting about the obvious: the less-than-productive people. The benefit of giving employees no-strings (or few-strings) attached time to work on whatever they like is clear: things like GMail, the Apple I, etc. The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.
When your ability to make money hand over fist starts to get challenged, it is difficult to continue giving people free rein, especially when your competitors focus on cost, cost, cost. HP was a great place that made oodles of money selling tank-like PCs (among many other things) that cost $3k. But then Dell came along, invested $0 in R&D, and started cleaning HP's PC clock. Bell Labs was engineer/scientist nirvana; then the AT&T monopoly went away.
The other issue in big companies is that as the people with direct connections to the business start losing control, the bureaucrats (well-intentioned as they are) start moving in, and they worry about things that the engineers never really cared about. They are passionate about you using the appropriate PowerPoint template, and will speak to your supervisor if you don't comply!
Re: productive people. We have this notion of the 10x producer, but the real problem is high variability within the same person's work.
I know productivity superstars, but if you catch them on the wrong day or at the wrong time, they look like lazy bums. Over a year, they're extremely productive. On any given day, they may look like they're wasting time.
This property of conceptual work is at the core of a lot of culture clash with people who have more predictable work-comes-in, work-product-goes-out flows.
> I know productivity superstars, but if you catch them on the wrong day or at the wrong time, they look like lazy bums.
I've encountered a particularly nasty variation of this since Agile became a Thing, where anyone who isn't writing code or running tests or showing up in the commit logs every few minutes obviously isn't working. The idea that it might be better to step back for an hour, or a day, or even a month, to think things through properly and maybe do some throwaway prototyping before you start pushing code to production, doesn't even seem to be accepted as credible any more in some quarters.
People don't recognize it as such because LOC -- lines of code -- has acquired a time dimension and a different name. In Agile/Scrum it's issues per sprint: issues take the place of raw lines, and the metric is applied per unit of sprint duration.
But same idea, and same fallacy. It results in gobs of ugly Rube Goldberg machine code that's slow for fundamental O(crap) reasons and bug-ridden.
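The "O(crap)" failure mode has a concrete shape: each issue gets its quick local patch, and nobody steps back to notice that the patches compound. A contrived Python sketch (function names and data are made up for illustration):

```python
# Issue-by-issue patching: the quick fix rescans a list inside a loop,
# so deduplicating n records costs O(n^2) comparisons.
def dedupe_patched(records):
    seen = []
    for r in records:
        if r not in seen:  # linear scan nested in a loop -> quadratic
            seen.append(r)
    return seen

# The version you get by stepping back for an hour: same behavior,
# but set membership checks make it O(n) overall.
def dedupe_rethought(records):
    seen, out = set(), []
    for r in records:
        if r not in seen:  # average O(1) hash lookup
            seen.add(r)
            out.append(r)
    return out
```

Both return the same result; only the second survives the day the input grows from hundreds of records to millions.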
An org that calls what it does "agile" or "scrum" and then proceeds to accumulate technical debt like the Titanic taking on water is lying to itself about what it's doing. Piling up technical debt is a textbook Agile(flavor) failure, full stop.
What it sounds like happened in the case(s) you describe is that someone with dev-management preconceptions wore "Agile" as sheep's clothing and proceeded to dole out the same old ad-hoc nonsense. Nonsense as in being "efficient" (high KLOC metrics, butts in seats, long hours, etc.) while not caring about being "effective" (working on the right problem vs. a problem, having the necessary understanding of the company and customer needs, working smarter vs. simply harder, etc.).
I've seen this attitude a number of times, e.g. in program managers with history at big, established software companies. They learned a certain way of working, but then talk "Agile" as the trendy buzzword. Unfortunately, some of these folks never gained an understanding of the tools that Agile brings to the table, or of when and how to apply (or not apply) them to a situation.
I had a manager who believed that such metrics were the only true measure of performance. This same manager also believed that CUDA on a VM (which itself sat on a server hosting numerous other VMs, with a bare-minimum graphics card and therefore no support for virtualized CUDA) was a good suggestion for improving performance, that running a debugger on a specially compiled Apache web server in the production environment would be a great way to troubleshoot performance issues, and that having an 8-12 week sprint/"rolling release cycle" was "Agile" as long as we called it that.
So, yeah, some people just don't get it. At all. And in my experience management is definitely a place where one is more likely to find such people.
1,000x yes! My productivity comes in intense bursts of effort, with quiescent periods of research and reflection between them (or what looks like goofing off to rigidly process-oriented sorts). Combine that variability with mandatory daily scrum meetings and it makes me want to figuratively slit my wrists.
I'll even go a step further. I have a reputation as one of those people with an ability to get things done at an incredible pace, but there are definitely days when I'm flat-out procrastinating and being lazy.
For me, personally, and I suspect other people like me, it comes down to an ability to perform remarkably well under pressure, along with a lack of ability to perform well when the pressure is off. If there's no urgency to what I need to do, I find it very hard to commit myself to doing something.
It can also be an anxiety thing, if one has generalized anxiety disorder. The anxiety of doing something hard or with higher uncertainty (a challenging software problem) is high, and it takes the even greater anxiety of the high pressure environment to overrule it.
Worth investigating, because it seems like ADD, but it is not. ADHD drugs in this case can be counter-productive as they tend to increase anxiety.
There are techniques to cope with this that can be quite effective.
Over the years I've come to realize that what once felt like procrastination is actually my brain working out the solution to a problem, and it quite literally prevents me from reducing the solution to code until it's damned well ready for me to do. So I just go with it now. The only thing that seems to speed this process up is intense cardiovascular exercise, which is probably a good idea on its own anyway (but I imagine it seems awfully odd from the outside).
As for ADHD, I probably have a little of that going on, with a sprinkling of Asperger's symptoms, but seemingly not enough to have any significant effect on my life or overall productivity (except, that is, when it comes to %$!^ing daily scrum updates, which drive me bonkers - once a week would be fine; once a day is ridiculous).
Today I spent five hours staring at and stepping through some callback-heavy code, reading HN, and browsing the web, then about an hour writing, testing, and deploying roughly 20 lines of code to fix a really nasty race condition. And I'm still not completely sure things are really fixed. I might have spent even longer had it not become apparent that we have just two options:
1. band-aid around the problem
2. complete rewrite of a large portion of the website
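For what it's worth, the band-aid in cases like this often amounts to serializing the racy callbacks. A minimal Python sketch of that shape (hypothetical; not the actual code from the comment above):

```python
import threading

# Shared state updated from many callbacks. The read-modify-write in
# "state['n'] += 1" is not atomic across threads; the lock serializes
# the critical section so no updates are lost.
state = {"n": 0}
lock = threading.Lock()

def on_event():
    with lock:  # without the lock, concurrent updates can interleave and be lost
        state["n"] += 1

threads = [threading.Thread(target=on_event) for _ in range(200)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert state["n"] == 200  # deterministic once the section is locked
```

The lock is option 1 from the list above: it papers over the shared-state design rather than removing it, which is why option 2 (the rewrite) lingers as the "real" fix.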
I'll go even further and state that the problem is using MBAs to dictate how to manage creative sorts (cough agile cough cargo-cult coding cough). If the area under your productivity curve is greater than everyone else's around you, I don't give a you-know-what about the shape of the curve; you're doing just fine.
To be fair to agile though, I can see situations where it can work. The problem is that many of its adherents seem to see agile as a hammer and all software engineering as various forms of nails.
> The problem is that many of its adherents seem to see agile as a hammer and all software engineering as various forms of nails.
Agile isn't a methodology, but a metamethodology -- or, in terms of the metaphor, it isn't a hammer, it is a set of guidelines to use in selecting tools.
Scrum is a hammer, but Scrum ≠ Agile. Rigorous adherence to a particular methodology (usually Scrum) often gets misidentified as being "Agile", but rigorous adherence to a particular methodology is not only not the same as Agile, it is directly contrary to Agile principles (in particular, it's a direct violation of the first value from the Agile Manifesto: "Individuals and interactions over processes and tools").
Except that every incarnation of Agile that I've encountered is a rigid implementation of scrum plus sprint planning plus Jira. And this quickly becomes Waterfall with scrum. And it really sucks.
While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments, a lot in fact, so methinks the agile fans ought to take some ownership of this recurring problem and either find a solution or stop shoving agile down everyone's throats.
Finally, I'd take a 30% pay cut to escape agile to do exactly the same work I'm doing right now minus scrum. I'd get more work done and I'd feel better about it because I would no longer feel like I have the engineering equivalent of an ankle monitor attached to me. That's more than worth the loss of compensation to me.
> Except that every incarnation of Agile that I've encountered is a rigid implementation of scrum plus sprint planning plus Jira.
This is not a problem with Agile; anything that works anywhere will lead to imitations that steal the name and attempt to extract some simple recipe from the "lessons" of that thing that worked.
> While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments
No, it's how something that is nothing like Agile gets implemented in large corporate environments and called Agile.
Fundamentally, this is a symptom of a broader leadership-culture issue: environments where the authority structure and culture give decision-making power within a domain to people who neither know nor care to know about that domain. It's certainly beyond the power of people external to the affected organizations -- however interested they are in particular approaches to problems in any given domain (software development or otherwise) -- to do much about it. It is, however, a pervasive problem in large bureaucracies (not only corporate ones).
If the Agile Manifesto were just that and little more, I'd agree with you...
But instead it has become an enormous metastasizing moneymaker for minting Certified Scrum Masters, Certified Scrum Product Owners, Agile Certified Practitioners, and all sorts of other Agile titles for $1000+ a pop. So I guess we're going to have to disagree, because I think this means a little ownership of the issues that arise in practice from that training is appropriate here. What I'm hearing from you now sounds a lot like the usual "You're doing it wrong!" refrain, which accomplishes precisely nothing.
> But instead it has become an enormous metastasizing moneymaker for minting Certified Scrum Masters, Certified Scrum Product Owners, Agile Certified Practitioners, and all sorts of other Agile titles for $1000+ a pop.
As long as there are people looking for packaged solutions and willing to pay top dollar for them, there will be people willing to sell them to you under any name that you ask.
But if the name on the tin refers to something diametrically opposed to that kind of packaged-solution approach, well, it's probably not going to be an accurate label.
Me too. My productivity curve looks very rocky, with very tall spikes -- huge amounts of very hard stuff done in hours -- and days where nothing happens. There seems to be a longer-duration cycle too. I have super-productive weeks and ho-hum weeks. Overall the average is decently high.
> The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.
Isn't that the motivation for the whole 20% idea, though? If you can produce 1 GMail for every 1,000 ideas, it probably doesn't matter if the other 999 didn't produce much of tangible value, because the programme almost certainly paid for itself just on that one success anyway. Meanwhile, you still get to enjoy the morale benefits for all 1,000 staff for the other 80% of their time when they are working on assigned tasks.
I agree. But as an organization grows, the points of view multiply. People focus on their niche and don't necessarily get the big picture. Also, the political environment in a company changes as it grows, and tends not to reward speculative activity well. If you invent GMail, your boss becomes a VP or gets more people/prestige/money/etc. If you invent Orkut, your boss gets to talk about how popular it is in Brazil.
One of the bad side effects of innovative, rapidly developed things is that the support org gets left behind. When the organization is small, this isn't a huge problem. When you're a huge company, the Exec VP of Support has an incentive structure to deliver better support. That VP will lobby for more controlled changes and slower product release cycles.
I've worked in places where most of the organization would be angry if some team invented GMail. They didn't welcome disruption.
> From a management perspective, you're forgetting about the obvious: the less than productive people. The benefit of giving employees no- or few-strings attached time to work on whatever is clear. Things like GMail, Apple 1, etc. The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.
Actually, there are other reasons for non-rigid 20% time (that is, the 20% is a target, with considerable variation in the short term): it means that resources across the company, and on any team, aren't routinely fully committed to critical tasks, so surging to meet an emergent need doesn't mean dropping the ball somewhere else. When routine utilization gets too high, this rapidly becomes a very significant effect.
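The utilization effect the parent mentions is textbook queueing theory: in a simple M/M/1 model, the average number of jobs in the system grows as rho/(1 - rho), where rho is utilization, so backlogs explode as you approach 100% busy. A quick sketch (an illustrative model, not anything from the thread):

```python
# M/M/1 queue: with utilization rho (arrival rate / service rate),
# the mean number of jobs in the system is rho / (1 - rho).
# Running at 80% vs. 95% is the difference between an average
# backlog of 4 jobs and one of 19.
def mean_jobs_in_system(rho):
    assert 0 <= rho < 1, "the queue is unstable at 100% utilization"
    return rho / (1 - rho)

for rho in (0.5, 0.8, 0.95, 0.99):
    print(f"utilization {rho:.0%}: avg jobs in system = {mean_jobs_in_system(rho):.1f}")
```

The slack that 20% time builds in is exactly what keeps an organization on the flat left side of that curve instead of the vertical right side.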
It's also a way for people to gain experience and understanding, and to avoid developing tunnel vision around the way things are done on their existing primary products, and so become more efficient.
In the case of Google's 20% time, the numbers are easy to quantify: how much does 20% of the engineering staff's time cost? How much money has the output of 20% time earned? It might be that Google thought the work was too unfocused, and that outside of some pretty obvious early big hits, fairly little else of value was coming out of the work year after year.
The best way to deal with it, I think, is to have people submit proposals, pick the best, then take those people out of regular work for a few months and put them into a "lab" where they come up with an MVP. If it looks good at that stage, invest more in the idea.
> When your ability to make money hand over fist starts to get challenged, it is difficult to continue giving people free rein, especially when your competitors focus on cost, cost, cost.
That kind of thinking sounds like it would lead to a "race to the bottom" to me. If everybody is obsessing over - and competing on - a quest to cut costs the most and the fastest, I really don't see how anybody is going to benefit from that in the long run.
Or to put it another way: "You can't cost cut your way to a growing company".
Of course, I'm not saying there's never a time when circumstances change and some cost-cutting might be called for. But cost-cutting is a tactic, IMO, and not a strategy. Innovation, on the other hand, and committing to the activities that lead to more and better innovation, is a strategy.
My feeling: if you want to grow, you have to innovate. So if you reasonably believe that "20% time" is an approach that leads to useful innovations, then cutting it is a short-sighted and arguably mistaken move.
Kind of a strange point on cost cutting, but how does this square with the argument of "paying for talent" when hiring at the executive level vs. the "race to the bottom" point of view for your employees? I've seen this vomitous 180-degree flip of statements coming from the same company in a short timespan.
The top gets paid an exponential figure for 'talent' but is ignored in any cost-cutting measure... yet any purely non-financial 'talent' is set on a crash-course race toward minimum wage: "Oh, you did awesome this year, but times are lean... here's a 2% raise."
Maybe I am more irked now because the company I work for just got bought by a huge company, R&D staff was slashed in half, and "No more research" became a mandate.
And then consider that Google's hiring standards seem to have decreased as they need a lot of people and not just the best. Thus 20% of more average people's time is going to be a lot less valuable on average.
Their standards have definitely dropped, because they hired me in 2011. But they then proceeded to assign hirees like myself to all the work no one else wanted to do around the Googleplex.
I suspect this was one of Larry's failed experiments because a whole bunch of the people I met were let go within a year. I personally fled the place after a couple months of trying and failing to find work remotely suitable to my skillset, which was, ironically, what led them to recruit me in the first place.
I've gone on about this elsewhere (search my comments), but it comes down to the utter stupidity of blind allocation for experienced engineers. There were projects that literally needed my exact skill set, and engineers on those teams did their best to try and open up a position for me on them, but middle-level management and HR blocked all their efforts.
I could have stayed a year and hoped for the best, but by then I suspect I would have been so embittered that I would have become the embodiment of a bad culture fit so I left before that happened because I had a great opportunity dropped right into my lap.
Now I suspect I am blacklisted at Google because a few people have tried to get me rehired now that such openings exist and they were immediately shut down by HR.
I'm curious - were the projects you wanted to transfer to in niche technical areas or niche products?
It's very easy at Google to transfer from niche to core. If we (in Search) see someone languishing in another part of the company with skills that we want and they want to work with us, we make the transfer happen, and there's nothing HR or another manager can do about it. It's much harder to transfer from core to niche, and it usually requires a solid track record of sustained performance in your original assignment. I know a number of people that transfer from Search to Google-X after 4-5 years, but I know of virtually nobody that can make that transfer after 1-2. (Basically, the company wants to "make back" their initial investment in you before they'll let you work on speculative projects that may not show a return.)
Arguably some of the niche projects would be better done as startups - they don't face the constraints of working at a large, very visible multinational where they don't get resources or attention from higher management - but the entrepreneurial spirit isn't quite dead at Google.
I was in a core technical area trying to transfer into another core technical area in both cases (really, they needed me elsewhere, I could have made a genuine difference, and the blind allocation process completely hosed that up).
I also didn't know upfront that little tidbit that whatever assignment I took, I would be stuck there for 4-5 years. If I had known that, I would not have accepted my initial assignment, nor any other, until it was something I knew I'd remotely enjoy, and I'd probably still be there today.
So by relentlessly focusing on short-term ROI, Google lost 100% with me.
If they're going to insist on blind allocation, then they ought to not be surprised when it doesn't work out. But I gather the heuristic is to assume these cases are a 100% indicator of non-googliness.
From my perspective, Google recruited me aggressively away from a long-term gig where I had an absolutely stellar reputation. I uprooted my career with the mistaken belief that they wouldn't do this unless they had a reasonably clear idea what to do with me. Apparently they didn't and I'm not the only one who had an experience like that.
Sure, I could have said no and I take full responsibility for saying yes and for everything that happened as a result of doing so. And once I realized that Google was going to be of zero help in fixing what I think was a minor allocation error, I once again took responsibility to do what it took to fix the problem myself: I left.
So here's why I think they wouldn't help: the team onto which I was placed was losing an engineer a month. Every time they got a noogler to say yes (3 times during my short stay), another team would intercept them before they got to their first day on the job. The work was dreadful and tedious and the manager even seemed to hate running the team. And the only reason I said yes was because I had this naive faith that Google wouldn't do something as seemingly daft as blind allocation unless they had a pretty good idea how to make it work. My bad. But not my problem. High level people should have been fixing the root cause here instead of continually throwing nooglers into the pit and expecting a miracle.
I don't think you have a good characterization of this:
What Google has learned over the last few years is that the people they were hiring as "the best" weren't necessarily getting the job done any better than employees from more "average" backgrounds...who happen to be much more readily available in the job market.
I don't think Google's hiring standards have decreased, they've just shifted their focus away from inaccurate signals such as where you went to University and what your GPA was there. One could argue that Google's hiring standards have actually increased, as they're now hiring people who can demonstrate an ability to perform instead of people who were able to get a pretty piece of paper from Stanford.
Full disclosure: I'm a rising senior at Stanford, so maybe your comment is just irritating to me on a personal level. That said, the Stanford CS department is objectively very good; further, judging by the number of people I know who have abandoned or failed out of CS here, plenty of people wouldn't be able to complete the coursework for the undergrad degree even if they were all enrolled. I agree that hiring people based on their school is bad practice (see: people failing out of CS), but I don't think it's fair to call any engineering degree a "pretty piece of paper." I've put too much work into mine for that.
Ability to complete coursework is neither necessary nor sufficient for being a good engineer at a company. Sorry. I've seen too many people flounder around, never completing things, making inane suggestions, and so on, all while talking great theory. Of course, I've seen the opposite too, and Stanford is a very good school; I don't think anyone would deny that.
And that shouldn't be surprising. Look at brilliant physicists: most end up on either the theoretical or the experimental side, and are often quite bad at the other. Likewise, theorem-heavy CS has its place, but getting through a program like that doesn't mean you can write a for loop (I've interviewed Stanford grads who fumbled and failed through that), design readable, robust software, push through a sea of decisions and make effective, near-optimal choices (the whole software life cycle is an n-dimensional optimization problem), get along with peers, and so on.
There is a huge cachet attached to degrees from certain institutions that really isn't deserved, in my opinion. In that sense the paper is "pretty". That's not a slam on the effort anyone at the school is undoubtedly making, but on the reverence with which the degree is regarded.
Except that's really all it is. You might be fancy with your degree for a short period of time and land some interviews others might not, but it soon all goes out the window. The second you have some sort of industry experience, where you went to school and how you did there doesn't really matter.
Your "pretty piece of paper" is essentially like a driver's license. It allows you to get behind the wheel, but it makes no guarantees you'll be any good at driving. In the end, it really doesn't matter much which DMV you go to.
Second that. We engineers have our own perspectives, but management has theirs, and it would be unfair to say ours are unconditionally better. If management thinks that way, there must be a reason for it, and constructive communication plus concessions on both sides would be the real path to a better end.
If someone is "less than productive", you let them go. If they could use some improvement, encourage them to spend their 20% non-core time learning and improving their basic abilities. Better to grow a less-productive employee who could improve than to roll the dice on someone else, or to expect anything to change while piling on a full workload of critical tasks.