Then "professional" management came in and killed the proverbial goose. They had to focus more on the "bottom line". To do what was easy to measure and track, rather than what was necessary for the next step of the company, and now HP is a mere shadow of its former glory -- directionless and bleeding.
3M and Corning have largely avoided this fate, but it seems that Google won't. This should make a lot of entrepreneurs happy, as there will continue to be plenty of top-down, management-driven products that, if history is any guide, will continue to be market failures. Yet somehow I'm incredibly sad, as it seems that too many companies go down this road.
I mean, sure -- if your company is under cash flow pressure you have to pinch pennies. You have no choice: the spreadsheet says so, and the spreadsheet's the boss. But if you're not, you should be investing and thinking long term, because the other guys probably aren't.
I've seen a related phenomenon in the startup world. Watched it, front row seat. I did a stint in startup-tech-focused business consulting. If you have a top-ten MBA and connections you can raise millions of dollars, set fire to it like the Joker in Batman Begins, and then raise millions of dollars again, serially.
They were basically cargo cultists, mindlessly imitating the words, phrases, and superficial behaviors of supposedly-successful people and businesses. But there was no higher-order conceptual thinking beneath the surface-- no "there" there. They had no plan and no plan on how to acquire a plan. They got the money and then did a kind of mindless MBA rain dance until the money was gone. Then they'd raise more.
I watched them do shit like destroy products that were inches from release and that big customers were standing by, money in hand, ready to pay for. I mean a done product, ready to go, and better than anything else in its market. A product that they owned and had already paid to develop. The rationale was always some kind of MBA-newspeak blather. I can't even remember it, since my mind filters out sounds that imitate language but lack conceptual content. Otherwise I risk wasting a synapse.
But what do I know? I went to a po-dunk Midwestern state school, so what looks obviously stupid to me is maybe genius. I'm not saying I definitely could have done better, but I do think my probability of failure would have been no higher than theirs. But there is no way in hell I could get what they got. Not a chance. I saw people try with better credentials than me who were probably much smarter, but they lacked whatever special magic blessing the cargo cult guys had.
I'm convinced it's pure cronyism and ass-covering. I guess nobody ever got fired for losing their clients' money to a Harvard or MIT Sloan MBA. Nobody with a degree like that could be at fault. It has to be the employees (I've seen really good people get blamed for following stupid orders several times), bad market timing, etc.
The top-10 MBA cult is awful. These are usually those people you knew in high school/college who were excellent at studying for and passing tests. Excellent at getting great grades on projects. Excellent at everything except building ANYTHING.
I think that the people who are best at building things that people want don't want to get an MBA. They instead choose to spend their time building a business, a piece of software, a piece of hardware, whatever. People are good at what they love to do. MBA people love to go to school and get pieces of paper that say "pay me I'm smart."
As a data guy, I have to go in and deal with these assholes all the time. I have hard numbers; they have hand-wavey MBA-speak bullshit. It is probably the hardest thing we data people have to deal with: criticism from the "trusted advisors" who, thanks to the cognitive dissonance of the executives who pay them loads of cash, are deemed to be intelligent when they really aren't. After all, what executive wants to admit that the man or woman he has been paying high six figures to advise him for months is at best a talented actor/mimic and at worst an idiot?
This problem has nothing to do with education and everything to do with short-term professional management that is compensated based on short-term results. If you want to blame anyone, blame the current financial thinking of most boards of directors.
Of course, most of those guys are just in it for the short term too. So ultimately you need to blame the guys with money who give it to people who don't know how to invest. I'm sure most of us, including me, are guilty of this as well.
I don't have an MBA, but my understanding is that this is what MBA programs teach, and it is at least partly to blame for Wall St.'s, and boards of directors', short-term, bottom-line, quarterly focus.
There are some people in the MBA world trying to correct that; one group I know of is the Throughput Accounting advocates. Maybe there are more as well.
Can't come soon enough.
The board of directors is simply a group of large shareholders put there to make sure management acts in the best interests of shareholders. In most large companies, these people are mostly employees of pension funds and similar institutions. Their job is to do their best to make sure the company is committed to giving their investors a return of x%, and they are usually there because conventional thinking says that having control of a company is in your best interest.
Now obviously these guys have no idea what should be going on in a technology (or just about any other) company. They aren't really concerned with employees or anything like that - only with a few accounting and market figures such as return on equity, share price, potential acquisitions of other companies they have invested in, and whether or not to sell the company to others. Actually running the company or how the company works is the furthest thing from their minds.
Now, as I said above, these guys are not actually to blame. They are just doing their jobs - giving returns to the guys who have dumped money into their funds. The people ultimately to blame are you and me. We put our money in these massive pension funds and similar. We don't even care where the money goes. Each year we check our statements and say 'oh, 13% returns only, maybe I should switch pension funds?'.
As always, the problem is apathy, entitlement and 'well these guys are a big company, they must know what they are doing'.
This is inherently what the board of directors does. Furthermore, your characterization of the make-up of a board of directors is not necessarily correct. Many (if not most) boards also have independent directors, who may not own a single share.
"There really isn't anything in an MBA that would have anything to do with the job of corporate oversight that a board of directors handles."
This left me scratching my head; my experience was the polar opposite of this comment. In my MBA program the topic of the board came up a number of times in finance and management classes. The board and corporate oversight were very much top-of-mind issues.
That would be what the board of directors is theoretically meant to do. In practice it's extremely far from the truth, with board meetings being very infrequent and focused primarily on share price and dividends.
This discussion is not really about Google though: Google does actually have a very relevant board of directors with most of them being founders or directly involved in starting large tech firms. Many other companies (Nokia? Microsoft?) are not so lucky.
Interestingly enough, on Google's board only Paul S. Otellini (former CEO of Intel) and L. John Doerr (early Intel engineer and VC) have an MBA.
Personally, I have worked as a software developer, and I do also have an MBA, and I strongly believe that my broad/diverse education allows me to better interface with people from different backgrounds within a work environment.
Are there ways to obtain broad skills (technical, management, etc.) which are more time effective, cost effective, etc.? Maybe, maybe not. Each person is free to make their own choice.
However, back to the article's point: if Google's 20% time is dying down, it is ultimately due to Larry Page, who is clearly a technical guy, doesn't have an MBA (1), and is most likely seen as a "doer".
(1) According to Wikipedia, he actually received an honorary MBA for his entrepreneurial spirit.
Personally, I like to relate to people, rather than interface with them.
>You could also say: "Personally, I like to understand problems, rather than grok them."
It would have gotten the point across that the grandparent's comment was an unfair dig at a style of speech rather than the content of that speech.
SV technorati pretentiousness is awful.
>"MBA people love to go to school and get pieces of paper that say "pay me I'm smart."
As opposed to some "Silicon Valley" types that don't have a clue about what it means to run a business, and say "pay me with investors' money, I have lots of users!"
>"I have to go in and deal with these assholes all the time. I have hard numbers, they have hand-wavey MBA speak bullshit."
You sound like you don't have a clue what getting an MBA actually entails. Your comment is full of ignorance and generalizations.
Where does this idea that "an MBA" made this decision even come from? Google is run by an engineer. Point your misinformed, misdirected, stereotype-driven hatred elsewhere.
This community is getting worse every day.
You've been in "this community" for less than 3 months according to your username. I'm not SV technorati, and I've run a business, as well as been a founder in a non-tech business that grew to 150 employees. I know business, and I know what I said is correct.
"SV technorati" is itself a stereotype.
"SV technorati" is a stereotype because he's drawing a comparison; he wouldn't normally make a stereotype argument unless you had already. The point was to demonstrate a different perspective, and this entire thread is just demonstrating a disgusting level of stereotyping of a category of people - MBAs.
The bottom line is that there is nothing inherently evil or pejorative about an MBA. Judge human beings on individual merit, not on a piece of paper.
Except for those and probably thousands of other exceptions, your logic is water-tight and well-reasoned. Kudos, sir.
Apparently. Have you checked the stock market lately?
But last I checked, SV and California in general pump out more innovation than the rest of the country combined, and are preparing to colonize another planet and electrify transport.
There is obviously something fundamental in the "California mindset" that differs starkly from the majority of the rest of the world, and that ought to be understood. I think it has something to do with reasoning from first principles, and with an expansive risk-tolerant business culture.
That risk tolerance and general liberal thinking is going to generate a fair amount of silly stuff, but it's also going to permit genius.
I forget who said this, but I recall reading it somewhere: "the further West you go, the further into the future you go." I'd say it's not true literally but certainly philosophically.
This is what people refer to as a 'reality distortion field'.
It's a field alright, and it does have some reality-distorting side effects, but there is actually a "there" there.
If I had to boil it down I'd say it's this:
(1) California believes in the future. People in California (stereotypically) think about what they can do tomorrow, building on what they have today. Everyone else thinks about what they already have today and fears losing it tomorrow.
(2) California reasons from first principles more than elsewhere. Everyone else looks at what everyone else is doing and tries to superficially copy what looks like it works, or looks to the past. Ideas from the past will only get you what was done in the past, and other people's ideas will not make you competitive, since they're everyone else's advantages. (Assuming you succeed in copying them at all, and don't just end up cargo-culting them.)
I quoted this elsewhere in the thread. It bears repetition.
"For the engine which drives Enterprise is not Thrift, but Profit." - John Maynard Keynes
California runs on this.
I'm not saying that a lot of great stuff doesn't come out of California/SV (just like Apple, for whom the 'reality distortion field' thing was first coined, also produces great products). The casualness with which you suppose that California alone is producing more innovation than the entire rest of the country suggests that you're overwhelmingly focused on a very, very specific definition of 'innovation'.
The places where innovation comes from are going to be places that believe in the future, are open to experimentation and risk, and reason from first principles instead of cargo-cultism. It just seems like California (and SV in particular) has an above-average amount of these things.
There is no "California mindset" just like there's no "Wall Street mindset" in New York. You're just being sectionalist.
[A little background about me: I was born in Cali, left when I was 6 months old, returned when I was in my late 20s, and stayed there for 5 years (San Diego).]
The reason I ask you this is that your viewpoint is why I went to Cali. But when I got there I found that the idea of Cali is very different from the reality of Cali. I have lived in a variety of places around the world, and I must honestly say that California was one of the worst...
Re your first point: (1) California believes in the future.
I would say NO, California believes in itself. They think (sometimes) that they are the future, but most often they build on what they have today because they fear losing it, not because of what they can do tomorrow. A lot of what they do is about preserving the image of what they are, not about progress or the future. And as for the people (stereotypically)... rednecks, gangbangers, surfer dudes... there are a lot of them.
Your second quote: (2) California reasons from first principles more than elsewhere. Everyone else looks at what everyone else is doing and tries to superficially copy what looks like it works...
I think here you have it 50-50. Yes, some new things come out of Cali, but on the whole they copy (it's just that sometimes the copy is way better than the original). Except the idea of copying is a little changed: they don't copy exactly, they take an idea and 'shift' it. First it was a shift from the real world to online. At the moment it's a shift from many to individual (online).
And your quote does need repeating, as it sums up Cali well, from gold rush to dotcom boom -
California runs on this.
Cali is about profit, nothing more, nothing less.
Oh good, maybe you can show Europeans and Asians how to get around their countries without internal combustion engines.
I'm not sure 'California mindset' is the right way of framing the innovation inside silicon valley.
I'm not sure what objective measurement one could use to really determine "innovation", maybe economic activity from industries or products established within the last x years?
Furthermore, I think that constantly being praised during childhood for being smart did them harm as it made them underestimate the value of tenacity.
What? Go take a financial statements analysis class and then tell me how MBAs don't "study" things. There are certainly ways to coast through an MBA program, as there are ways to coast through many things. Simply because MBAs don't have to go through the mathematical rigor that an engineer or physicist does does not mean there is NO rigor or requirements necessary to attain an MBA. There are many vacuous, over-confident, arrogant MBAs out there, but that is part of the human condition not simply inherent in those who go earn MBAs.
This broad (and wildly inaccurate) characterization of hundreds of thousands of people is mind-boggling.
Engineering mathematics is not for everyone, and indeed not required for every MBA graduate, but it certainly helps for the more quantitative courses. And finance at one school can differ wildly from finance at other schools. Look for the professors. Look at the courses people take.
But also look at the culture. Some universities are focussed on the money, others, like Yale, on much wider impact in business and public service.
I know MBA school will be the first thing I do in hell.
You're all talk and no substance; let's have some examples.
Matt Soldo, a serial entrepreneur who has worked in both technology and management and founded his own startups, has an MBA. He currently works at Heroku.
Which one of us is wearing the SV blinders again? What have MBAs done, in the capacity of MBAs, that is incredible? Founding or working at some tech startup is not incredible.
My examples are meant to show you that your "Us and Them" mentality with engineers and MBAs is just false and doesn't reflect reality, no matter how many anecdotes people throw around on HN.
You asked for examples. I gave them. Now you're attacking me instead of my point. I've proven you wrong, bottom line. You just can't see past your anti-MBA bias.
I can only conclude that it is as I suspected; you are full of hot air.
If you want to keep innovating, the company needs to stay nimble, with the owner(s) interested only in innovation. I don't know, maybe that's why the startup ecosystem works.
Another model that seems to work as well is "dual roles". For example, Jeff Bezos lets the MBAs optimize the hell out of Amazon while he plays with Blue Origin in his spare time and focuses on more innovative technology ideas. Again, this is possible because Bezos probably has far more influence at Amazon than Page at Google.
So if Page can't convince "the board and the big shareholders" I believe it means he can't convince Brin, not some horde of faceless MBA's or institutional investors.
Your assumption about Bezos's relative influence over Amazon seems unlikely: Bezos's Amazon holdings, and the lack of a Google-style share structure, mean Bezos has far less voting power at Amazon than Page has at Google. Last I heard, Bezos held less than 20% of the voting power at Amazon.
It's the JDs and MBAs, IMO.
When your ability to make money hand over fist starts to get challenged, it is difficult to keep giving people free rein, especially when your competitors focus on cost, cost, cost. HP was a great place that made oodles of money selling tank-like PCs (among many other things) that cost $3k. But then Dell came along, invested $0 in R&D, and started cleaning HP's PC clock. Bell Labs was engineer/scientist nirvana; then the AT&T monopoly went away.
The other issue in big companies is that as people with direct connections to the business start losing control, the bureaucrats (well intentioned as they are) start moving in, and they worry about things that the engineers/etc didn't really care about. They are passionate about you using the appropriate powerpoint template, and will speak to your supervisor if you don't comply!
I know productivity superstars, but if you catch them on the wrong day or at the wrong time, they look like lazy bums. Over a year, they're extremely productive. On any given day, they may look like they're wasting time.
This property of conceptual work is at the core of a lot of culture clash with people who have more predictable work-comes-in, work-product-goes-out flows.
I've encountered a particularly nasty variation of this since Agile became a Thing, where anyone who isn't writing code or running tests or showing up in the commit logs every few minutes obviously isn't working. The idea that it might be better to step back for an hour, or a day, or even a month, to think things through properly and maybe do some throwaway prototyping before you start pushing code to production, doesn't even seem to be accepted as credible any more in some quarters.
People don't recognize it as such because the LOC metric -- lines of code -- now has an added time dimension and a different name. In Agile/Scrum it's issues per day, where issues take the place of raw lines and the metric is applied per unit of sprint duration.
But same idea, and same fallacy. It results in gobs of ugly Rube Goldberg machine code that's slow for fundamental O(crap) reasons and bug-ridden.
An org that calls what it does "agile" or "scrum" and then proceeds to accumulate technical debt like the Titanic taking on water is lying to itself about what it's doing. Piling up technical debt is a textbook Agile(flavor) failure, full stop.
What it sounds like happened in the case(s) you describe is that someone with dev-management preconceptions wore "Agile" like sheep's clothing and proceeded to dole out the same old ad-hoc nonsense. Nonsense as in being "efficient" (high KLOCs/metrics, butts in seats, long hours, etc.) and not caring about being "effective" (working on the right problem vs. a problem, the necessary understanding of the company and customer needs, working smarter vs. simply harder, etc.).
I've seen this attitude a number of times, e.g. in program managers with history at big, established software companies. They learned a certain way of working, but then talk "Agile" as the trendy hire-word. Unfortunately, some of these folks never gained an understanding of the tools that Agile brings to the table, or of when and how to apply (or not apply) them to a situation.
So, yeah, some people just don't get it. At all. And in my experience management is definitely a place where one is more likely to find such people.
For me, personally, and I suspect other people like me, it comes down to an ability to perform remarkably well under pressure, along with a lack of ability to perform well when the pressure is off. If there's no urgency to what I need to do, I find it very hard to commit myself to doing something.
Worth investigating, because it seems like ADD, but it is not. ADHD drugs in this case can be counter-productive as they tend to increase anxiety.
There are techniques to cope with this that can be quite effective.
As for ADHD, I probably have a little of that going on, with a sprinkling of Asperger's symptoms, but not seemingly in a sufficient way to have any significant effect on my life and/or overall productivity (except, that is, when it comes to %$!^ing daily scrum updates, which drive me bonkers - once a week would be fine, once a day is ridiculous).
1. band-aid around the problem
2. complete rewrite of a large portion of the website
During my internship I did quite a lot of work, but in a similar manner: implement an interesting feature, then a week of laziness. Another feature, then laziness.
However, the other replies to your comment also make me worry that I have some psychological problem. I think I need to visit a doctor! :)
To be fair to agile though, I can see situations where it can work. The problem is that many of its adherents seem to see agile as a hammer and all software engineering as various forms of nails.
Agile isn't a methodology, but a metamethodology -- or, in terms of the metaphor, it isn't a hammer, it is a set of guidelines to use in selecting tools.
Scrum is a hammer, but Scrum ≠ Agile. Often rigorous adherence to a particular methodology (usually Scrum) gets misidentified as being "Agile", but rigorous adherence to a particular methodology is not only not the same as Agile, it is directly contrary to Agile principles (particularly, it's a direct violation of the first value from the Agile Manifesto, "Individuals and interactions over processes and tools").
While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments, a lot in fact, so methinks the agile fans ought to take some ownership of this recurring problem and either find a solution or stop shoving agile down everyone's throats.
Finally, I'd take a 30% pay cut to escape agile to do exactly the same work I'm doing right now minus scrum. I'd get more work done and I'd feel better about it because I would no longer feel like I have the engineering equivalent of an ankle monitor attached to me. That's more than worth the loss of compensation to me.
This is not a problem with Agile; anything that works anywhere will lead to imitations that steal the name and attempt to extract some simple recipe from the "lessons" of that thing that worked.
> While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments
No, it's how something that is nothing like Agile gets implemented in large corporate environments and called Agile.
Fundamentally, this is a symptom of a broader leadership-culture issue: environments where the authority structure and culture give people who neither know nor care to know about a domain the authority to make decisions within that domain. It's certainly beyond the power of people outside the affected organizations, however interested they are in particular approaches to problems in any given domain (software development or otherwise), to do much about it. It is, however, a pervasive problem in large bureaucracies (not only corporate ones).
But instead it has become an enormous metastasizing moneymaker for minting Certified Scrum Masters, Certified Scrum Product Owners, Agile Certified Practitioners, and all sorts of other Agile titles at $1000+ a pop. So I guess we're going to have to disagree, because I think a little ownership of the issues that arise in practice from that training is appropriate here. What I'm hearing from you now sounds a lot like the usual "You're doing it wrong!" refrain, which accomplishes precisely nothing.
As long as there are people looking for packaged solutions and willing to pay top dollar for them, there will be people willing to sell them to you under any name that you ask.
But if the name on the tin refers to something diametrically opposed to that kind of packaged-solution approach, well, it's probably not going to be an accurate label.
Hold back some of the extra work on your killer productivity days and keep it in reserve. On those off days, reach into the "bank" and push some of those changes.
And now you are consistent.
But I learned long ago not to worry about it. The down cycles inevitably pass and then I'm producing again.
(no, I'm not bipolar)
Isn't that the motivation for the whole 20% idea, though? If you can produce 1 GMail for every 1,000 ideas, it probably doesn't matter if the other 999 didn't produce much of tangible value, because the programme almost certainly paid for itself just on that one success anyway. Meanwhile, you still get to enjoy the morale benefits for all 1,000 staff for the other 80% of their time when they are working on assigned tasks.
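As a rough back-of-the-envelope sketch of that argument (every number below is invented, purely to show the shape of the bet):

    # Hypothetical expected-value sketch of 20% time; all figures assumed.
    engineers = 1000                # staff given 20% time
    cost_per_engineer = 200_000     # assumed fully-loaded annual cost, USD
    ideas_per_engineer = 1          # assume one 20%-time idea per person per year
    hit_rate = 1 / 1000             # one GMail-class hit per 1,000 ideas
    payoff = 1_000_000_000          # assumed value of one GMail-class product

    annual_cost = engineers * cost_per_engineer // 5            # the 20% slice
    expected_hits = engineers * ideas_per_engineer * hit_rate   # about one per year
    expected_payoff = expected_hits * payoff

    print(f"{annual_cost:,}")            # 40,000,000
    print(f"{expected_payoff:,.0f}")     # 1,000,000,000

Even if you knock an order of magnitude off the payoff or the hit rate, the sign of the bet doesn't change, and that's before counting the morale benefits.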
One of the bad side effects of innovative, rapidly developed things is that the support org gets left behind. When the organization is small, this isn't a huge problem. When you're a huge company, the Exec VP of Support has an incentive structure to deliver better support. That VP will lobby for more controlled changes and slower product release cycles.
I've worked in places where most of the organization would be angry if some team invented GMail. They didn't welcome disruption.
Actually, there are other reasons for non-rigid 20% time (that is, the 20% is a target with considerable variation in the short term): it means that resources across the company, and in any team, aren't routinely fully committed to critical tasks, so surging to meet an emergent need doesn't mean dropping the ball somewhere else. When routine utilization gets too high, this rapidly becomes a very significant effect.
It's also a way that people gain experience and understanding and avoid developing tunnel vision around the way things are done on their existing primary products, and so become more efficient.
The best way to deal with it, I think, is to have people submit proposals, pick the best, then take those people out of regular work for a few months and put them into a "lab" where they come up with an MVP. If it looks good at that stage, invest more in the idea.
but...oh yeah...Google got rid of "Labs".
That kind of thinking sounds like it would lead to a "race to the bottom" to me. If everybody is obsessing over - and competing on - a quest to cut costs the most and the fastest, I really don't see how anybody is going to benefit from that in the long run.
Or to put it another way: "You can't cost cut your way to a growing company".
Of course, I'm not saying there's never a time when circumstances change and some cost-cutting might be called for. But cost-cutting is a tactic, IMO, and not a strategy. Innovation, on the other hand, and committing to the activities that lead to more and better innovation, is a strategy.
My feeling: if you want to grow, you have to innovate. So if you reasonably believe that "20% time" is an approach that leads to useful innovations, then cutting it is a short-sighted, and arguably mistaken, move.
The top gets paid astronomical figures for 'talent' but is ignored in any cost-cutting measure... yet any purely non-financial 'talent' sets you on a crash-course race toward minimum wage: "Oh, you did awesome this year, but times are lean... here is a 2% raise."
Maybe I am more irked now because the company I work for just got bought by a huge company, the R&D staff was slashed in half, and "No more research" became a mandate.
"For the engine which drives Enterprise is not Thrift, but Profit." - John Maynard Keynes
I suspect this was one of Larry's failed experiments because a whole bunch of the people I met were let go within a year. I personally fled the place after a couple months of trying and failing to find work remotely suitable to my skillset, which was, ironically, what led them to recruit me in the first place.
Great perks, lousy work.
I also joined in 2011 (SWE, normal hiring process) and my experience has been nearly the opposite of yours.
I could have stayed a year and hoped for the best, but by then I suspect I would have been so embittered that I would have become the embodiment of a bad culture fit. So I left before that happened, because a great opportunity had dropped right into my lap.
Now I suspect I am blacklisted at Google because a few people have tried to get me rehired now that such openings exist and they were immediately shut down by HR.
It's very easy at Google to transfer from niche to core. If we (in Search) see someone languishing in another part of the company with skills that we want and they want to work with us, we make the transfer happen, and there's nothing HR or another manager can do about it. It's much harder to transfer from core to niche, and it usually requires a solid track record of sustained performance in your original assignment. I know a number of people that transfer from Search to Google-X after 4-5 years, but I know of virtually nobody that can make that transfer after 1-2. (Basically, the company wants to "make back" their initial investment in you before they'll let you work on speculative projects that may not show a return.)
Arguably some of the niche projects would be better done as startups - they don't face the constraints of working at a large, very visible multinational where they don't get resources or attention from higher management - but the entrepreneurial spirit isn't quite dead at Google.
I also didn't know up front that little tidbit: that whatever assignment I took, I would be stuck there for 4-5 years. If I had known that, I would not have accepted my initial assignment, or any other, until it was something I knew I'd remotely enjoy, and I'd probably still be there today.
So by relentlessly focusing on short-term ROI, Google lost 100% with me.
If they're going to insist on blind allocation, then they ought to not be surprised when it doesn't work out. But I gather the heuristic is to assume these cases are a 100% indicator of non-googliness.
For me, I was with you until I read your suggestion to "read your comments" to learn about your experience.
From my perspective, Google recruited me aggressively away from a long-term gig where I had an absolutely stellar reputation. I uprooted my career with the mistaken belief that they wouldn't do this unless they had a reasonably clear idea what to do with me. Apparently they didn't and I'm not the only one who had an experience like that.
Sure, I could have said no and I take full responsibility for saying yes and for everything that happened as a result of doing so. And once I realized that Google was going to be of zero help in fixing what I think was a minor allocation error, I once again took responsibility to do what it took to fix the problem myself: I left.
So here's why I think they wouldn't help: the team onto which I was placed was losing an engineer a month. Every time they got a noogler to say yes (3 times during my short stay), another team would intercept them before they got to their first day on the job. The work was dreadful and tedious and the manager even seemed to hate running the team. And the only reason I said yes was because I had this naive faith that Google wouldn't do something as seemingly daft as blind allocation unless they had a pretty good idea how to make it work. My bad. But not my problem. High level people should have been fixing the root cause here instead of continually throwing nooglers into the pit and expecting a miracle.
What Google has learned over the last few years is that the people they were hiring as "the best" weren't necessarily getting the job done any better than employees from more "average" backgrounds...who happen to be much more readily available in the job market.
And that shouldn't be surprising. Look at brilliant physicists. Most end up on either the theoretical or experimental side, and are often quite bad at the other. Likewise, theorem-heavy CS has its place, but getting through a program like that doesn't mean you can write a for loop (I've interviewed Stanford grads that fumbled and failed through that), design readable, robust software, push through a sea of decisions and make effective, near-optimal choices (the whole SW life cycle is an n-dimensional optimization problem), get along with peers, and so on.
There is a huge cachet attached to degrees from certain institutions that really isn't deserved, in my opinion. In that sense the paper is "pretty". This is not a slam on the effort anyone at those schools is undoubtedly making, but on the reverence with which the degree is regarded.
And then the MBAs had Lucent finance customer purchases and count the promissory notes as income ...
This is a completely unjust attack. Quite frankly, I have no idea what "MBA-think" even is. You make an assumption that an MBA making a bad decision is making a bad decision because they have an MBA. That doesn't pass the test. Would the same person make the same decision even without the MBA?
I always seem to get sensitive over the general MBA hate expressed on HN. As someone who spent years in web development before getting an MBA, I completely fail to connect with any of the insults typically thrown at MBAs on here. I certainly don't recall a class where we learned it's best to destroy 20% time. I don't recall ever being indoctrinated into the type of business thinking that is negatively attributed to MBAs. I recall getting an education in things like finance, economics, marketing, strategy, operations, etc. that weren't covered in my undergraduate technical degree.
I appreciate the developer-oriented aspects of software startups and Hacker News, and I'm certain that many people have encountered assholes who happen to hold MBA degrees. I'm certain the degree attracts certain segments; I clearly had some as classmates. But attributing every business decision you disagree with to MBA-think is not a good approach.
This just seems like taking shots at a fuzzy construct for sake of taking shots and I'm not sure what value it adds to the discussion. I'd rather see legitimate reasons why removing 20% time is a bad idea for Google's operations.
I think the observed pattern is that we repeatedly see startups get kicked off, grow like wildfire, and then turn into empty shells of their former selves once the "professional management team" is brought in, as they promptly kill off all of the reasons the company was growing in the first place in favor of short-term (bonus-making) metrics that are almost never good for long-term growth and survivability.
It repeats over and over and over again, and it's especially frustrating when you're on the inside watching outsiders, whose only qualification is a top-10 MBA, come in as VPs, destroy unbelievably large numbers of man-hours of work, and turn thriving companies into joyless bean-counting husks.
I vaguely remember, going through my own management education, specific moments where I stood back and realized what a smoking pile of self-serving bullshit and handwaving the professional management industry had become. Most of what we were studying was full of vague and meaningless, but impressive-sounding, aphorisms and pretend sciency/engineeringy talk. Management methods were described like bold scientific experiments but constructed from the flimsiest methodology one could come up with. I felt like the books we were reading were consistently written by flimflam men who had no consistent, measurable success and wrote endlessly about management theory with the structure of those late-night get-rich-quick infomercials: they talk endlessly about how they're going to show you how to achieve success, but then never actually tell you. Hundreds upon hundreds of pages of it.
I've since tried to purge most of my management education from me. The only thing it really got me used to was a sense of comfort with sitting in front of spreadsheets all day, moving around millions of dollars, and writing status reports.
The Mayo lighting experiment is an awesome and sadly typical example of the kind of shit-poor "research" that goes into the field, with premises, methodologies, and outcomes so flimsy a six-year-old could poke holes in them. Yet these kinds of "studies" are published and taken as great advances and contributions to "management science".
Before you know it, based on one or two of these "studies", great fads sweep the ranks of professional management and we end up with bizarre and counter-productive management initiatives. When those plans inevitably fail and some consulting firm is brought in and recommends a house cleaning, new management is brought in whose only worth is that they're more up to date on the latest fads, and they reshape the company along those lines... generating lots and lots of activity (reorg after reorg after reorg) but no actual value.
The hardest of the social sciences will create experiments where they try to study a single variable, like a person's reaction to a specific set of stimuli or decision making under a certain specific set of conditions. That way you can at least pretend to control for things.
It is simply impossible to do this in management. There are millions of variables. You can't know them all, control them, or do multiple runs of an experiment.
One of the more recent management fads has been complex systems simulation -- trying to do computer simulations of difficult managerial problems. This can work for logistics, routing, and mechanistic process optimization, but you can't reduce human beings to "agents" in a model.
One of the major issues is the idea that a person with no specific experience or understanding of a certain industry can take a couple years of generic administration courses and get slapped into a VP role in any given company. This concept extends down to the worker bees, in that the assumption is made that workers are fungible.
I think this is a fundamental flaw in current management theory that needs to be burned out of the entire field with extreme prejudice. It colors the entire field and I believe is the root cause of most of the major failures in the field.
The case of John Sculley is a particularly notable example.
There are some schools that have toyed with industry-focused management degrees, a step in the right direction. You can learn how to manage a business in a particular kind of field... managing professional services in a software vertical is unbelievably different from managing the R&D division of a major cosmetics company, and managing a small company is unbelievably different from managing in a megacorp.
I think also that MBAs simply shouldn't be available until a person has a few years of industry experience under their belt -- much like executive MBAs are today.
I think at the very least, because of the damage a shitty MBA graduate can do in the world, the degree should require industry-specific professional certification that has to be maintained, with an ethics bar like the one for becoming a lawyer, so the profession can censure particularly bad apples.
My point is that I am curious if it is possible to change things so such a person won't be bad for the company after being slapped into the role.
Technology has a viable lifetime, after which you need new technology. As a technology company, you therefore need to be invested in creating new technology, or be willing to enter the spiral of customer loss and dealing only with legacy systems before a slow death.
Statistically speaking, if you know most of your staff is intelligent and has experience in your market, you're more likely to get an outlier idea (one that is way off at the end of the bell curve, i.e., actually really good) by casting a wide net and listening to the bulk of your employees.
The problem is that to go anywhere, ideas need time to gestate and develop, so you have to give all those employees a little bit of time to develop new ideas for your company.
It's been a while since undergrad, but if you want, I could try to bust out an equation modeling (and predicting the expected value of) the likelihood you hit a really important idea in the wide-net situation versus the "dedicated research staff" one.
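Roughly, under the crudest possible assumptions (each person independently has the same small chance p of producing an outlier-grade idea in a given year; p and the headcounts below are made up):

    # Crude model: P(at least one outlier idea) = 1 - P(everyone misses).
    def p_at_least_one_hit(n_people, p_hit):
        return 1 - (1 - p_hit) ** n_people

    p = 0.005                                  # assumed per-person annual hit rate
    wide_net  = p_at_least_one_hit(1000, p)    # whole staff gets 20% time
    dedicated = p_at_least_one_hit(50, p)      # small dedicated research group

    print(round(wide_net, 3))    # ~0.993
    print(round(dedicated, 3))   # ~0.222

Even if the dedicated group were individually four times as likely to hit (p = 0.02), 50 people would still only get you to about 0.64, so under these toy assumptions the wide net wins handily.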
The key distinguishing factor to me of MBA-think is a combination of posturing with credentials and cargo cult thinking. The essence of this thought pattern consists of thinking divorced from real-world referents.
A strikingly similar cognitive anti-pattern can be found in the humanities, by the way.
I wonder if this is because business borrowed something rather toxic from the post-1970s "postmodern" intellectual meltdown of the humanities? The worst kind of MBAbabble that I've endured over the years reminds me very much of postmodern literary criticism in its vapid, posturing use of language to hide the fact that the speaker is not actually saying anything.
"Supercalifragilisticexpalidocious! If you say it loud enough you'll always sound precocious!"
A closely related cognitive anti-pattern in computer programming leads to "architecture astronautism," premature generalization, and over-engineering: http://www.codinghorror.com/blog/2004/12/it-came-from-planet...
In stereotypical MBA-think you have people reasoning about businesses without reasoning about the business -- about what the business actually physically does in the real world. So you see something similar to the cargo-cultish application of design patterns in programming. A management practice will be applied because it worked once in business X, but it's being applied to a business with wildly different characteristics.
At no point is an attempt made to actually walk the halls, talk to the boots on the ground, actually ascertain the concrete nature of the business one is managing in order to tie one's thinking to reality.
Finally, there's the ugly aspect: all of this is posturing to justify unjustly high compensation relative to the people who do real work.
Back to programming, I have run into "enterprise architects" who do not know what they're doing and who make more than the people in the organization who do. What they do know how to do is how to sound impressive.
Back to the humanities: it's sort of transparent to me that postmodern psychobabble is a similar sort of imposture, hiding the fact that the people in question are supposed to be cultural vanguards but in reality have less to say than the street graffiti artists who tag up their buildings at night. Personally, I'd fire the humanities people and then hang out in the bushes at 2am and offer the clever social critics with spray paint cans a job.
Elon Musk is a great example of someone at the absolute opposite end of the spectrum from stereotypical MBA-think.
He's not Tony Stark. He's not superhuman. What he does do is get his hands dirty. When he founded SpaceX he actually taught himself some bona fide rocket science so he would know what the hell he was talking about. He did a similar thing with Tesla, actually dove into some of the hard problems of electric car design himself so he'd have a clue. I'm sure he spends most of his days doing managerial things and raising money like any executive, but the fact that he's gotten grease on his hands means that when he reasons about his businesses he's reasoning about his businesses and not about abstractions divorced from reality.
His astonishing success at building businesses doing some of the hardest things one could possibly choose to do can be chalked up, IMHO, almost entirely to the fact that he is a smart guy reasoning about real things instead of reasoning about abstractions of things. He does not confuse the map with the territory.
You described easily over 75% of the management textbooks I had to work through when I was getting my management degree. Vapid & content free. It was shocking sometimes to be reading a few chapters in a row, and not even realize different authors wrote them because they all had the same "voice" of page after page of absolutely nothing at all to say.
It's sort of funny how Soviet business culture actually is.
Capitalist hierarchies look like state hierarchies in a system widely criticized as being "State Capitalism".
Funny to the people who think that the Soviet Union represents the polar opposite of capitalism, expected by the left-libertarians who noted that the Soviet Union recapitulated the features of capitalism central to the socialist critique of capitalism.
The hierarchical megacorporation existed before socialists invented the name "capitalism" to refer to and criticize the system which spawned such beasts, so the Soviet Union mirroring the hierarchical structure of such an entity -- with a similar elite-vs-worker power relationship -- is exactly the USSR recapitulating features of capitalism central to the socialist critique, not "the opposite direction".
(Levchin, BTW, seems much more like someone who actually "gets his hands dirty" - which doesn't actually work out all that well sometimes, as evidenced by subsequent ventures like Slide that executed really well against a pointless market.)
I'm convinced it's not.
Betting on the long term at the expense of the short term is incredibly hard. Even the smartest people in the world will panic (and in some cases pivot) when doubts start to creep into their mind. I am sure all of us have been in a similar situation. It's always easy to judge from the outside but on the inside those decisions are tough, lonely, scary and rarely to do with just covering your own ass.
Now granted, I know far more about programming than business. I can spot a cargo cult programmer after three sentences. In business it takes me a lot longer, maybe 6-12 months of working with someone. But by then if I've watched the cargo cultism long enough I can be pretty sure that's what I'm seeing, and I stop giving the benefit of the doubt. Sometimes when someone repeatedly seems like an intellectual impostor, it's because they are.
"Even the smartest people in the world will panic (and in some cases pivot) when doubts start to creep into their mind."
I think that's a lame excuse. Maybe my expectations are irrational, but I expect the kind of people who get millions or tens of millions of dollars to experiment with to be fighter pilots, not bus drivers. And if you wreck the ten million dollar plane, you don't get to fly another right away. Someone else gets a turn.
I also expect them to be smarter than me. When I talk to them I should have the feeling I've talked to an intellectual superior, not an impostoring dumbass.
It downright offends me to see the American middle class collapsing and people with no jobs while this shit goes on. If you're an entrepreneur, you're a working stiff too. Your job is to create value, and probably jobs, by building new businesses. When ordinary working stiffs screw up they get passed over or fired. When these guys screw up they get promoted -- or at least endless second chances.
I think what it boils down to is credentialism and cronyism rather than performance-based investing. If I were an investor I would split my odds between two areas: taking a chance on newbies, and investing in people with proven track records. But I would not repeatedly invest in people with poor track records just because their resume says "MIT Sloan School of Business."
Yeah, it's a rant. You're talking to a war veteran here. I earned these war stories in the Boston metro area, and have heard a fair number from a friend in New York. I've been told that this problem is worse on the East Coast due to the absolute worship of Ivy League degrees and in-group connections there. The West Coast is more meritocratically focused. I've never lived or worked there, but the culture I've seen from the West supports that notion.
The prioritization of short term benefits is an obvious consequence of reward schemes that judge you based on yearly performance. There is little reason to invest in things that help the company long term if your pay depends on short-to-medium term results, unless you think your company might be going down and are fighting for job security.
Large sums of money can draw them in sometimes. But then they generally do the job for a while, stuff their pockets, and then leave.
People don't stick around with enthusiasm at a lame party.
A lot of this "idiocy" is in fact rational (if unethical) behavior. At many large companies, upper management is rewarded for short-term gains. In response, these managers adopt strategies that produce short-term gains. If these gains can be boosted by risking the company's future, that too is a rational trade-off: The managers will be long gone when the company falls into decline, but in the meantime they will have collected a lot of pumped-up bonuses.
When people are paid today to burn tomorrow, why are we surprised that they reach for the matches?
The fact that micro-optimizations harmful to the company in the broader perspective are rational on the part of individual mid-level managers working within the incentive framework provided by upper management doesn't mean they aren't idiocy on the part of the upper management responsible for creating that framework.
In fact, that means they are idiocy on the part of upper management.
If I/O 13 is to be believed, it's become an essential pillar of their current vision/roadmap for search (Answer, Converse, Anticipate).
It's a bad metaphor.
AdSense, Go? I'm sure there are others too...
It's an issue of central control versus decentralized organization, and where the company thinks innovation will come from. One could also argue you get a short term price bump too.
An MBA is neither necessary nor sufficient to suffer from MBA-think.
This is the con of being a public company: a lot of cynicism. Imagine that you have inside information that BlackBerry will revolutionize the mobile market: you put money into their stock, and maybe move on fast once the price increases significantly. Yes, value investing is very interesting, but how many daily transactions are based on that kind of thinking?
I think there was a lot of bias in favor of public companies, and now we are realizing that staying private may be the best option if you think long term and don't want to deal with massive conflicts of interest.
MBAs are like a gradient optimization algorithm without any simulated annealing process.
> MBAs are like a gradient optimization algorithm without any simulated annealing process.
Now that is content-free babble. If you are a bot, more work is needed. If you are a human, you should be ashamed of yourself.
He didn't express any big new insights, but there was an honest attempt to impart information behind that post.
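FWIW, the analogy does map onto something concrete: a greedy optimizer only ever accepts improvements and gets trapped at the nearest local optimum, while annealing occasionally accepts a worse move and can escape. A toy sketch (the objective function and every parameter are made up for illustration):

    import math, random

    random.seed(0)

    # Toy objective: a local maximum near x = -0.7, the global one near x = 2.2.
    def f(x):
        return -x**4 + 2*x**3 + 3*x**2

    def greedy(x, steps=5000, step=0.05):
        for _ in range(steps):
            cand = x + random.uniform(-step, step)
            if f(cand) > f(x):        # only ever accept improvements
                x = cand
        return x

    def annealed(x, steps=5000, step=0.5, temp=2.0, cool=0.999):
        for _ in range(steps):
            cand = x + random.uniform(-step, step)
            # Occasionally accept a *worse* point, with probability that
            # shrinks as the temperature cools (the annealing part).
            if f(cand) > f(x) or random.random() < math.exp((f(cand) - f(x)) / temp):
                x = cand
            temp *= cool
        return x

    print(round(greedy(-2.0), 2))     # stuck near -0.69, the local max
    print(round(annealed(-2.0), 2))   # typically lands near 2.19, the global max

The jab, as I read it, is that always taking the locally best-looking move (cut this cost, trim that program) is exactly the strategy that strands you on the nearest small hill.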
About MBAs: I don't think the changes at Google are because of an MBA invasion. Most of the changes we see today at Google were actually directly influenced by Steve Jobs. When Larry Page became CEO, he requested mentoring meetings with Jobs, which Jobs surprisingly granted even though he was very angry about Android. The strongest point Jobs made to Larry was that a company should do only 5 things and do them very well (paraphrase). Larry took this to heart and realized Google was extremely fragmented, with lots of small undirected efforts going nowhere. The rest is history... It's debatable whether Google's old model was better than the new one. Personally, I think the best thing for a company is to keep a continuous balance between the two extremes. Historically, companies start believing in one extreme, drift there, realize it's too much, and then drift toward the other.
They'd been there for a week and told me they hadn't seen any of the city. After work they'd go to the gym or drive their rented car to a restaurant recommended by colleagues. Taxis were $3 max if you lived anywhere in the core, which they did.
On a walk to a restaurant famous for mondongo, we had to cross a 4-lane street. It was 9pm with no traffic, and after waiting a while I noticed the crossing signal never activated. I looked both ways, with no car in sight, and started walking, looking back expecting them to follow. They all stood there awkwardly, with no cars and no other pedestrians, for another 30 seconds, waiting for the walk signal to turn on, while I watched them from the other side. The signal never changed.
They were scared of the city. Anyone who has been to Medellin in the past few years knows there is little to fear.
Not all MBAs are like this, but this group just didn't get it. I realize this is more about travel fears than business smarts, but there is something to be said for people who can adapt and embrace new environments.
I'm watching friends at two different, reasonably large consulting companies (~700-1000 employees) suffer as their respective companies go through the consulting version of this kind of MBA suicide. In one case the company even decided to change its name and go through a rebranding. Some examples of the idiocy at work:
- The rebranded company's schtick is to hire staff with advanced degrees (PhD preferred, and multiple Masters) and at least 10 years of industry experience, and charge bongo bucks for renting them out to do high-end but not necessarily mind-blowing work. Something like 80% of their contract staff has a PhD. Due to economic reasons they lost a couple of medium-sized contracts. Their response was to lay off half of the PhD employees in the company because they were too expensive to keep around while they looked for new work, even ones who were already working on a contract and making money for the company.
This meant they also had to cancel at least 3 more medium-sized contracts and a handful of smaller ones, because they had eliminated the staff who were working on them.
At one point they also had two CTOs (because if one is good, two look even better) until they came to their senses and laid one of them off.
- At the other consulting firm, a similar pattern: tightening customer budgets meant that they decided to replace staff already on contract with cheaper, less experienced, all-new staff to pump up the profitability score on the spreadsheet. (They'd already laid off all of the idle staff not on contract, so without winning new business, some MBA thought that this was the best way to increase the numbers and get another bonus.)
But now they've let go all of the people in the firm who had experience doing the work, sometimes decades of it. Now there are no mentors, no experienced hands, nobody. The quality of the consulting work rapidly went down, resulting in the loss of 2 contracts, and senior positions that they'd normally fill from within now have to be filled from outside the firm.
Both of these firms are in death spirals and all of the people that could help them pull out of it were fired.
At one company I used to work for, I also saw the leadership completely lose their minds and fire off all of the development staff, thinking we could coast on the product we had with a new sales team and pull ourselves into profitability. They also declined to staff the profitable professional services contracts we had, because the margins were lower than selling new licenses. I also brought them significant new work in a different line of business, which they decided to turn down because they didn't want to deal with low-paid temp staff, on the grounds that "temp workers get paid more than they're worth to their temp agencies".
I watched this particular face-plant occur once as well. When they actually got customers the results were hilarious. Hilarious because I was not a direct employee, mind you.
It was like having a high-end trendy restaurant with excellent branding, great decor, a great location, and no food. So they run across the street to McDonalds, order fifty Big Macs, run back over, dress them up on plates (who will notice?), and...
Running out of money sucks, but this maneuver was destined to fail from the get-go. Anyone who knows anything about tech could have told them that. One smarter alternative would have been to thin out the development staff -- they had to -- and hire a small number of salespeople on contract (not full time), offering them disproportionate bonuses to bring in sales.
What they actually did was fire anyone who knew how to make anything -- and in a way that burned bridges! -- and hire a bunch of sales guys full time and at full salary. Hilarity ensued.
If the company hits it big despite them, the stock is awesome.
If the company does poorly they might miss a bonus but they'll still get their inflated salary.
If the company ends up on the rocks, they'll get their salary cut "till company performance improves" as their pay is usually tied to the performance of the company as an incentive.
But what invariably happens is that, even with their pay tied to company performance, the moment it gets cut they high-tail it out of there and land a new job with their top-10 MBA and another couple years of job experience... saying "I'm looking for growth opportunities" over and over again in their interviews.
I don't know if it's still true, but Facebook used to filter out these types by making them do a version of an engineering interview. Except instead of regurgitating algorithms you learned in college, MBAs had to regurgitate stuff from MBA school. Like derive the time-value of money, or discuss the TPM or whatever. The ones who coasted through MBA school got filtered out really quickly and the ones who took the subject seriously might pass the gauntlet.
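(For the curious, the time-value-of-money part is less exotic than it sounds. A minimal sketch in Python -- my own illustration, not anything from Facebook's actual interviews:

    # Money now is worth more than money later, because money now can earn a return.
    # Discounting a future cash flow back to the present at rate r per period:
    def present_value(future_value, r, n_periods):
        return future_value / (1 + r) ** n_periods

    # Net present value of a stream of cash flows (cash_flows[0] happens today):
    def npv(r, cash_flows):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

    print(present_value(1000, 0.05, 1))   # $1000 a year from now at 5% -> ~$952.38
    print(npv(0.05, [-900, 500, 500]))    # invest $900, get $500/yr for 2 yrs -> ~$29.71

The candidates who actually understood the material could rebuild this from scratch; the coasters couldn't.)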
The truth is, nobody knows how to bottle "success" and reproduce it -- but in a sense that's what management school is trying to do. With just the right management approach, applied in just the right ways, you can get Instagram instead of Color, or Google instead of Cuil.
The problem is that it's very easy to look at the failures and pick apart the problems. But it's very hard to look at the 1% that succeeds and figure out why -- and when it's attempted it usually overlooks the cases where the company succeeded despite having many of the problems the failed companies had.
Most companies succeed because of 90% dumb luck and 10% business strategy.
(hell, business is so screwed up people still don't understand how to define success. You see it here all the time that a "successful" startup is one that raises a huge round, not one that's profitable and growing...even BusinessWeek does this)
And what exactly is a po-dunk school when it's at home?
(2) A po-dunk school is any school not on the coasts and not in the top ten, possibly excluding a small number of top-tier schools in "flyover country."
(*) Flyover country is the large area you fly over when traveling between, say, Boston and San Francisco.
Yes, as well as the similarly anglocentric (but somewhat opposite in perspective) aphorism: "Look after the pennies, and the pounds will look after themselves."
Google's per-ad revenue was down 6% last year. Ads are 95% of their revenue. They have huge data to see trends.
Just because Google is still profitable doesn't mean that they aren't seeing a near future of red ink. Say per-ad revenue goes down 10% next year with the same costs, and their profit goes from $10 billion-ish down to $5 billion. And what if this decrease is a trend?
Competition from iAd and Bing Ads, smaller screens that are harder to advertise on, and so on. It's easy to see a near-future Google that has to spam ads or make huge cuts to remain profitable.
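(The reason a modest revenue dip can halve profit is operating leverage: the costs stay roughly fixed while the top line shrinks. A back-of-the-envelope sketch, using round numbers of my own invention rather than actual Google financials:

    # Invented round numbers: ~$50B revenue, ~$40B costs, ~$10B profit.
    revenue, costs = 50e9, 40e9
    for drop in (0.00, 0.10, 0.20):
        profit = revenue * (1 - drop) - costs
        print(f"{drop:.0%} revenue drop -> ${profit / 1e9:.0f}B profit")
    # 0% -> $10B, 10% -> $5B, 20% -> $0B

Under those assumptions, a 10% top-line decline really does cut profit in half.)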
And I'd argue with more and more accessible technologies these behemoths won't have the luxury of "lots of time". And hopefully the large sums of money will mean less.
1.) Demography: Who is the ideal MBA candidate? From what I've gathered, the historical answer would be: "Someone from outside a business education about to move into a business role." So, of course, your average MBA is going to be someone who studied Biology or Mechanical Engineering as an undergrad, right? Well, they're in there, but they're surrounded by people who studied Business and Economics as undergraduates, often at elite schools! What could a Wharton undergrad possibly gain from a Wharton MBA? Social proof? A two-year vacation? It's unsettling, and it leads to the second problem.
2.) Intent: What sort of work is the MBA cut out for? Well, they're masters of administrating businesses, so...management? Nope. Consulting and banking. Like lemmings, they go right off the cliff into these fields. All of them. Even the ones who came in there looking to simply move up the ladder in their chosen field, like biotech or energy, will feel pressured into becoming some sort of banker or consultant. So, the MBA ends up becoming a launching pad for people outside of consulting and banking to get in, and for people on the inside to move up.
I don't know how business schools would go about rectifying these flaws. Even if they were to shut out business types, that second flaw is so deeply ingrained in the b-school ethos that anyone seeking to be a better manager in their original field would likely be derided by his or her peers as an underachiever.
Interestingly enough, I hear the Executive MBA is a far better experience, simply because the students are too far along in their careers to make a pivot into something else. They're just there to get better. And maybe for the two-year vacation thing as well.
Personally I think this makes sense. 20% time is great when you're a small company trying to see what works. When you're a big company, you usually get disrupted because you lost focus.
When you're small, you need a source of revenue quickly, or you'll die. You don't have the luxury of experimenting too much; you have to build one thing that works.
When you're a huge company and have untold millions in cash, you can afford experimentation. In fact, you have to, because your revenue source is probably finite and you have to find another before the current source dries up.
This is why Google dived into mobile phones, self-driving cars, home entertainment, renewable energy, Internet providing, consumer cloud computing, goods delivery, etc. Some of the experiments became huge successes (see Android), some failed (see the long list of closed projects), some are too immature to judge (e.g. self-driving cars).
It just looks like experimentation is now locked up at a thin top-exec level, while 'simple engineers' are not expected to do too much of it. Which is, of course, a pity.
When you're huge and cash-flush, you have the luxury of really innovating in truly hard areas.
This luxury is damn hard to obtain in a short-term gerbil-wheel economy like ours. Once you have it, throwing it away is like setting fire to a house as soon as you've paid off the mortgage or crashing your new luxury car as soon as you drive it off the lot. It is abysmally stupid and short-sighted. Investors should call for the heads of people who do this, literally. As in on a pike.
People think innovation comes from startups, but in reality it doesn't. Not because startups aren't smart and agile, but because they don't have the resources.
When I say innovation, I mean innovation. I don't mean application of existing innovations to new market areas or problem spaces. Startups excel at that.
But you'll never see a scrappy basement startup whip out a self-driving car, an artificial lung printed from a 3d printer, an orbit-capable reusable rocket, augmented reality goggles with a complete software stack, etc. Not unless the parts for those things already exist and can simply be combined in a novel way to yield a result in less than a year.
If you're big and cash-flush, you can do what nobody else can do (except other monsters like you). You can pick a genuinely unsolved problem and put a dent in it.
That dent gets you a Ph.D, but if you do it for something of high economic value it gets you early entry into a market nobody else can enter because they don't know how.
If you do it, you've now created a piece of value that you can do one of many things with. You can feed it into your more short-term product-dev branches and do things nobody else can compete with, you can license it, or you can use it to pump up the prestige of your company in ways no advertising can.
"Holy crap! A self-driving car! And it really works. I mean, there it is, on the road, and it's driving better than I am and getting around faster than I am! I'm going to move all my business's hosting to Google Apps, cause they obviously have the smartest people on the planet..."
I do have the sense though -- and keep in mind I am an outsider -- that there might have been some "ADD" issues with Google's 20% policy as it was implemented. But I don't think the solution is to ditch it. The solution is to focus it, to try to get people to spend their 20% time pushing harder into deeper and more difficult areas instead of whipping out hacks. Maybe incentivize more people to work together more formally on 20% projects over longer spans of time, and incentivize them to tackle things that are very difficult... things only a big elephant can do instead of things a scrappy startup could do.
But these projects need to be outlined, given OKRs and an estimate of potential impact, and then approved. Not impossible, but far fewer and far less wild projects can probably pass now.
Well, let's see how it works; in a few years it will be obvious whether the stream of innovation dries up or not.
That sounds completely backwards to me. Big companies get disrupted because they are too locked-in to milking their current cash cow, and they either don't see - or consciously choose to ignore - any threat to that cash cow. It's more like a case of myopia combined with tunnel-vision, IMO.
Meanwhile, scrappy upstarts are experimenting and trying new ideas, find a better new approach, and can launch it and start growing it, while $BIGCORP remains blissfully unaware, until it's too late. Basically, the classic Innovator's Dilemma situation.
And while $BIGCORPs often ignore disruptive innovations even when they develop them themselves, it still seems to me that you're better off being the one developing the disruptive innovation yourself, so you at least have a shot at adapting before somebody else comes along and takes your lunch money.
You went to Mizzou too?
Oh, well, at least I had low debt (few $K, paid off easily) even though I got screwed on scholarships by my high school.
Apple has a similar story with Android phones. Keep innovating or get your low-end eaten.
Google doesn't have to innovate. They're already China. If somebody makes something they like, they'll buy it. Or make a knock-off.
Yahoo has survived, and made a lot of money for a lot of stakeholders over the years. I would also say that they have a better record than Google when it comes to privacy, legal entanglements, and general good open-source citizenship. Yahoo is a success story. Anyone who presents it otherwise is just putting their own lack of perspective and/or business sense on display.
What is more, it's clear the founders are still very keen to push the limits of innovation.
Maybe, as has been said by others, they are not shutting this down but restricting it. In effect, they're replacing most of the 20% time with acquisitions but they do still have the process alive to enable some internal innovation.
Additionally, Corning sold off the Corningware brand to a non-research focused company. They've pivoted again, and their research (and relationships with Apple and Google) have allowed them to continue to be successful.
No, they don't "have to". But then they die, or just keep doing what they do. It's an option.
There are boundaries that sometimes protect business: geography, client base, product specifics, etc.
And arguably enabling the Apple I.
The article claims that "20% time" at Google is no longer real; do you dispute this?
We've seen Microsoft do it, I'd argue that Apple is well down that road, as you said we've seen HP do it, Google may be headed in that direction.
"This should make a lot of entrepreneurs happy, as there will continue to be a lot of top-down management-driven products that, if history shows, will continue to be market failures."
There's no rule to success.
Amen. I think as HN readers we suck up so much info on a daily basis (far more, I'd guess, than the everyday public) and get confused so many ways about what is 'The right way' to do things, 'The wrong ways to do things', and the '5n lessons you should learn to do things right'. But really, all that will change and become irrelevant when company x does things in way y. Then all the armchair experts and runaway copiers will wax philosophical, and no-one will know any better until the daring party actually does it, or completely bucks it.
That said, there's a lot of penny-wise, pound-foolish thinking everywhere. That also tends to relate to the growth of the business to the point where management can't really figure out what's going on any more (assuming they could at some point).
My last three years were spent turning my 20% project into a product, and my job now is spent turning another 20% project into a product. There was never any management pressure from any of my managers to not work on 20% projects; my performance reviews were consistent with a productive Googler.
Calling 20% time 120% time is fair. Realistically it's hard to do your day job productively and also build a new project from scratch. You have to be willing to put in hours outside of your normal job to be successful.
What 20% time really means is that you -- as a Google eng -- have access to, and can use, Google's compute infrastructure to experiment and build new systems. The infrastructure, and the associated software tools, can be leveraged in 20% time to make an eng far more productive than they normally would be. Certainly I, and many other Googlers, are simply super-motivated and willing to use our free time to work on projects that use our infrastructure, because we're intrinsically interested in using these things to make new products.
Then it's not 20% time, it's personal time you're giving to your employer for free. Why would you do that? Why not build your projects outside of Google and keep them for yourself (assuming it's a product and not open source)?
Because my entire career- well before I started working here- has been dependent on things that Google has given to me for free.
Like Google Search. Search helped me learn to run linux clusters effectively (it was far better than AltaVista for searching for specific error messages) which ensured I had a job, even in the dotcom busts. It helped me learn python, which also played a huge role in my future employment.
Like Gmail. Although I've run my own highly available mail services in the past, free Gmail with its initial large quotas hooked me early on. I have never regretted handing the responsibility for email over to Gmail.
Like Exacycle (my project): http://googleresearch.blogspot.com/2012/12/millions-of-core-...
in which Google donated 1B CPU hours to 5 visiting faculty (who got to keep the intellectual property they generated).
I would like to repay Google for their extreme generosity. Spending my "Free" time doing things I enjoy (building large, complex distributed computing systems that manage insane amounts of resources) so that Google can make products that it profits from seems perfectly reasonable to me.
If I had continued to work in academia, I'd spend most of my time applying for grants, writing papers, and working 150% time just to maintain basic status and get tenure. Anybody working in the highly competitive sciences, or in the tech industry, who wants to be successful, has to put in more than what most people consider a 9-5 job.
As for open sourcing: Google has a nice program to ensure that Googlers can write open source code. I haven't taken advantage of it, because most of my code is internal-facing and doesn't need to be open sourced. But I would certainly consider using my time to do that; I just think my time is best spent working on Google products, because I believe their impact will be much higher.
You certainly seem like a smart guy, working on some cool stuff, so I'm not surprised people are a bit confused (hence the term "brainwashed") by your (pretending to?) not understand the business model of the company you work for.
You are giving your time away, for free, to a for profit corporation. That's so irrational it's painful to hear.
If you like working with google systems and resources so much that you are willing to pay your employer to use them then ok, that's a bit weird, but it's your time. If you feel you need to work 120% time to keep your career on track then ok, that's not uncommon in this industry (but it's the opposite of generosity and it's not sustainable for you).
Framing this as repaying Google for "their extreme generosity" is delusional, which is why I'm assuming it's not the real reason.
And I still consider what Google provides (search, Gmail) "free". Free as in free beer: http://en.wikipedia.org/wiki/Gratis_versus_libre
I'm certainly not "giving my time away for free": to be clear, I'm a salaried worker, and I choose to work the hours I do. Further, to be clear: Google gives me immense resources to carry out life-saving scientific research, the intellectual property of which belongs to scientists (and the general public), not Google.
And you would join a large number of people who have recently started using that word to mean "can cost any amount in anything of value as long as it's not currency". So sure, it's "free" in that way and still not free in the definition of the term that's actually useful to people. Boring semantic argument, let's drop it.
As to the rest, this is a much better way of phrasing it than in terms of Google's "generosity", which is what I was pointing out as flawed.
I was a startup founder before Google. I called the shots on what I'd work on, owned all the code I produced, and worked when and how I wanted to. A large part of my motivation for writing all this code was to learn things; another large part was to produce stuff that would be a positive contribution to the world. Alas, nobody used my stuff (well, almost nobody - we had a userbase measured in the hundreds), and I didn't get paid for it. So it wasn't exactly sustainable.
In my official job duties, I now write software that's seen by over a billion users. In my unofficial job duties, I'm the maintainer of an active open-source project watched by thousands of people, which gets a dozen or so patches per week. When testing this open-source library, I had free availability of thousands of machines and a corpus of billions of documents. When developing it, I had the help and mentorship of experienced coworkers, some of whom had held leadership roles in major open-source projects. I get paid a fat salary for this. I do work longer hours, but it hasn't come at the expense of things I really care about. I go out with friends 3-4 times a week. I have a steady girlfriend. I call my mom every week. Most of the time for this has come out of loafing around on Reddit and Hacker News, where you are also giving your time away, for free, to produce something of value for a for-profit corporation.
In pretty much every dimension I care about, this is a win for me. My software reaches more users. I learn more. I get paid more. I meet more interesting people, and have more of a social life. My professional reputation increases more. I get more experience.
It used to bug me that I was giving away my labor "for free" to my employer, who would then profit handsomely from it. But what I realized is that not all labor has equal value. I used to reap all the fruits of my labor, and those fruits were worth virtually nothing. I'm now in a position where my labor has dramatically more leverage, and much of that is because I use the resources of my employer, and they're entitled to take a cut for that reason.
When the time comes where I feel I can accomplish more outside of Google than inside it, I'll quit. That was what drew me to them in the first place, the realization that, as an entrepreneur, all the ideas I had would be better executed as features of existing Google products. If that ever reverses and something that I really want to do would be better accomplished as an individual startup, that's what I'll do.
If you feel differently than I do about Google's contribution to society, or about the ethics of unpaid-overtime cultures in general, then even my rather tame "that's a bit weird" might seem wrong to you, and that's fair enough; these are opinions after all.
I meant for my comment to address the framing of similar arguments to yours as "repaying Google's generosity". Considering that type of statement really does "sound brainwashed" as another commenter pointed out, I was interested in a better articulated reason which you have definitely provided.
- He's building a reputation as an innovator in the company, which is certainly not bad for the next promotion or raise.
- He's sure well paid and doesn't need to risk his own wealth.
- He's out of the paper stuff and can concentrate on the technical side.
Of course, that way he misses the "get insanely rich" possibility of startups. The "famous" part not so much; you can get that in a big corp as well...
My form of brainwashing happens to include 40 hours weeks. It's pretty nice.
Anyway, I think you raise an interesting analogy: working on 20% projects at Google, then using the company's resources to launch the product, does have a number of parallels to VC funding for startups (note: I'm an advisor for Google Ventures, so I have some experience with both worlds). In a sense -- and not everybody will agree with me -- Google as an employer is a low-risk, low-capital way to launch my products. Larry and Sergey already took the risks (launching a company with no clear monetization strategy), they figured out a monetization strategy, and now they invest their capital in speculative projects.
Anyway, in my case, after it seemed like my project was in good hands and ready to be a product, I looked for something else interesting to work on. I think the main problem I have working here is that there are too many cool projects I could work on, learning from experienced SWEs and SREs, but I have to stick to one.
At my current job, I can't get a port opened to make an OUTBOUND connection to Amazon Web Services.
Yeah, working at Google, it's definitely easier to use big tools to try new things, huh?
The combination of the three is rare in industry.
This isn't a criticism or negativity, all things considered it seems to be a great company to work for; it just seems to be a bit more homogenized than it used to be.
Let me tell you a story from my recent experience.
My neck hurts from time to time, and as part of stopping that, I decided that one thing I needed to do was to not sit at my desk anymore. Fortunately, I already had a standing desk at work, so it was just a matter of using it. When I got it, I thought the best plan would be to gradually ramp up; 30 minutes today, 40 minutes tomorrow, until I was standing the entire day. That never happened and I think I maybe stood for an hour a day, basically bailing out as soon as my legs felt even the slightest bit uncomfortable. Last week, I decided that that was not going to work, and I mentioned this to some people that sit (well, stand) near me. They agreed to shoot me with Nerf guns if I sat down. I did, and they did, so I stopped sitting down. Now I can stand the entire day. What really did it was watching others around me stand for the entire day; if they could do this for months, why couldn't I? That's what really motivated me to endure the annoyance in getting used to something new.
So what does this have to do with Google? It boils down to: coworkers that care, coworkers that are "different", and the willingness to make expensive non-essential office furniture universally accessible and useful.
Ultimately, there are a lot of great reasons to work at Google, and HN comments can only give you a small snapshot at a time.
Here's what the company did when a coworker of mine suffered a medical catastrophe that left him with brain damage:
1) His manager (my manager) spent the next two days with a translator contacting his family back in Iran to explain what happened.
2) They paid a huge amount of money to have him rehabbed. They did a great job -- he had access to awesome rehab people and regained a ton of brain function.
3) They worked hard to get his work visa extended while he was out of work.
I use bullet points because they are a succinct way to list several orthogonal items. Please don't take my data and generalize to all of Google. All I can say is that this company is pretty incredible, and much of what gets published about it isn't very accurate.
Couldn't agree more. Partially-truthful stories titled "Google used to be a great place to work but isn't anymore," seem to make a lot of HN commenters very happy. I'm not sure why.
Anyway, back to complaining about how I don't like the new pour-over coffees that the baristas upstairs make. Replacing Intelligentsia with Stumptown? How dare they! What a terrible place to work! :)
This is bald hyperbole. There are a lot of reasons you would consider working there.
One of the things I love about our culture is InDay/HackDay. It works out to less than 20% time, but the entire company is given 1 day a month - generally the middle Friday - to do anything they want. This results in a lot of things ranging from prototyping ideas; learning new tools (I spent my day today brushing up on Scala); or just taking a fitness class.
The best part? The day is honored. It's not 100+n% time. Unless you own something that is bleeding money or users from a serious bug, no one is going to come to you and ask you to do anything other than maybe grab a beer.
I know we're not the only company doing this, I think Twitter has hack weeks every quarter, but it's a huge differentiator from our neighbors.
See, I don't get it. Management at my company encourages going home at the end of the day. I don't understand why I would work somewhere that expects me to work more than my 40 hours.
Every place I've worked has allowed employees to come in and work weekends on projects that would help the company. I guess those places just weren't innovative (read: arrogant) enough to rebrand it as an employee benefit.
I feel that the reason 20% time captured the attention of those wishing to become future Google employees is because it implies that you are literally granted the ability to take 20% of your work schedule and spend it on side-projects. Put another way, Google is granting you permission to pursue what interests you on their time.
What I appear to be hearing is that work obligations require 100% of one's work schedule as opposed to 80%, and that a 20% project is typically pursued with time that would otherwise be considered personal time.
If one were to use 20% of their work schedule to build something, and then invest some of their personal time into it, then I would say that 20% time is alive and well. If Google merely grants you access to resources, not time, then I would argue that 20% time is dead. It would be more appropriate to rename the perk.
Sorry if I'm coming off as confrontational, I'm genuinely interested in getting a clearer view of the situation.
However, I would agree that it is "as good as dead". What killed 20% time? Stack ranking.
Google's perf management is basically an elaborate game where using 20% time is a losing move. In my time there, this has become markedly more the case. I have done many engineering/coding 20% projects and other non-engineering projects, with probably 20-40% producing "real" results (which over 7 years I think has been more than worth it for the company). But these projects are generally not rewarded. Part of the problem is that you actually need 40% time now at Google -- 20% to do stuff, then 20% to tell everyone what you did (sell it).
I am a bit disappointed that relatively few of my peers will consciously make the tradeoff of accepting a slower promotion rate in return for learning new things. Promotion optimizes for depth and not breadth. Breadth -- connecting disparate ideas -- is almost invariably what's needed for groundbreaking innovation.
It's implicit because I'm not coding to a spec. I've got an area of responsibility and some general goals the people above me would like to see accomplished. If I have an idea, I try it out. I could see someone in a more spec-based, feature-driven role wanting to have time to work on their own features off the spec.
On the other hand, I personally don't want conventional 20% time. I don't really like working on two projects at once. One fills my head and all of my ideas relate to it and a second one gets in the way. I once again understand that some people can work on two things at once and enjoy it, but hey, I'm not one of them.
So I'm not sure there's an actual problem. If most of your engineers are like me, and are working with a great deal of independence on a project they find interesting, why would they want 20% time?
This is just a "They did WHAT?" moment for me. Like, wow. Stack ranking >_< Seriously?
It is a bit puzzling to me that Google was pretty innovative in a lot of areas, including HR policy, but the perf stuff is unimaginative and rote. Then again, I don't necessarily have a better solution for a company of its size.
From a personal perspective I think it's great to give up a level and 10-20% salary for increased time learning things. Google already pays at least 10-20% more than other places which DON'T have 20% time.
From the company perspective, I think it is sad that 20% time is becoming less and less relevant.
That's not what promotions are at Google. If you want to become a manager, that's called a ladder transfer, not a promotion. Promotions are basically for paying you more for making computers do more magical things. You are never asked to do anything other than software engineering, unless you want to do something other than software engineering.
(Note that software engineering is more than opening up an editor and typing in code, though.)
Google's higher engineering ladder rungs are defined more by scope of influence than the excellence or profusion of individual code. At higher levels, engineers will tend to spend more time communicating with and influencing other people.
Some people are happy to stop at level 4 or 5, where they can mostly do their own thing. They can do good work, they can feel they have time for 20% projects, and (to some extent?) they can keep getting raises and bonuses.
HR/MBA types value titles, buzzwords and other nonsense because most of them are incapable of actually parsing a technical resume. So, if you're going to look for a job, filling in your title as "developer" over a period of multiple years is a losing strategy.
By the same token, without titles, buzzwords, etc., as an engineer I won't be able to differentiate between the resumes of a regular graduate-level physics educator and one who is capable of Nobel Prize-level research.
If you're in the position to make a hiring decision and can't make such a differentiation -- well, yep, that is what we have in software engineering. In physics the situation seems to be a bit (though not much) better, in part because of CVs, in part because the network is smaller.
First of all, I haven't seen any evidence of the "fire the lowest X% of the stack rank" attitude which I saw at IBM and which has been reported to happen at Microsoft. Because Google is so selective during the hiring process, I don't think it's as necessary as at companies where dead wood accumulates, so the Jack Welch/GE philosophy that there's always deadwood to be eliminated isn't as applicable. (Sure, there will be PIP plans for problem employees, but that's on a case-by-case basis, and not driven by statistics.)
Secondly, the stack ranking is just one of the inputs in your perf score, which in turn drives salary increases, bonuses, and equity refreshes. It is also just _one_ component in the promotion process, which is handled outside of the management chain, by committees of engineers that are typically level N+1 and N+2 of the people being considered for promotion. Speaking personally, as far as compensation is concerned, for which the perf score is the primary driver, of course money is important, but it's not my primary motivator, and I'm paid generously enough that whether I get an extra X% isn't going to force me to try to get that extra bump in my perf score. It certainly wouldn't cause me to try to sabotage my colleagues --- and by the way, the stack ranking is also done by your team members and merged into the results (it's not just a stack ranking done by the managers), so trying to get a higher perf score by not being helpful to your colleagues is not a winning strategy.
Finally, I've touched a bit on the promotion process. Certainly at the higher levels, promotion is done by your potential future peers, and is more about recognizing the fact that you are already performing at the level of a Staff Engineer, or a Senior Staff Engineer. Your manager will write a recommendation letter, but at the higher levels it will be your future peers across the entire company who judge whether the work you are doing has the impact and wider scope that are the hallmark of the higher ranks of the engineering ladder. And of course, if your 20% project happens to have a high impact or affects teams across the company in a positive way, that is something that you can write up in your promotion package, and they will consider it at promotion time.
Also, once you get to Senior Software Engineer, there is no expectation that everyone has to keep on climbing the promotion ladder. There is no "up or out". If you want to do really great engineering work, but it's work that isn't something that fundamentally affects the company's bottom line, or isn't of broad scope, that's OK. You won't get promoted, but at the end of the day, how important is that? You can only buy so many toys, after all, and if you are doing work which is fulfilling and you have other things in your life that give you great satisfaction, who cares if you never make Distinguished Engineer or become a Fellow? And that's yet another reason why stack ranking may not be as important.
The bottom line is that "stack ranking" as done by Google is pretty different from what you might think of as "stack ranking" as practiced by other companies, so it's important to keep that in mind.
“The goals of the advertising business model do not always correspond to providing quality search to users.”
“We expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”
“Advertising income often provides an incentive to provide poor quality search results.”
"Since it is very difficult even for experts to evaluate search engines, search engine bias is particularly insidious. A good example was OpenText, which was reported to be selling companies the right to be listed at the top of the search results for particular queries. This type of bias is much more insidious than advertising, because it is not clear who “deserves” to be there, and who is willing to pay money to be listed.”
“We believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.”
“Search engines have migrated from the academic domain to the commercial. Up until now most search engine development has gone on at companies with little publication of technical details. This causes search engine technology to remain largely a black art and to be advertising oriented. With Google, we have a strong goal to push more development and understanding into the academic realm.”
Also, Baidu was accused by many of intentionally removing or down-weighting links to sites whose admins refused to bid on advertising or to increase their advertising bids. I can't explain such a strategy except to hypothesize that Baidu has sustained a de facto monopoly.
When that focus shifted to exactly what Page and Brin had criticized themselves in 1998, we can clearly see that it's not the interests of the users being served anymore.
The decline of 20% time appears to go hand in hand with this shift from a focus on the user to a focus on the advertiser and other interests first. I'm not claiming direct causation, but it's not a stretch to say there might be some connection.
Of course, putting it behind a paywall would have been a failure. But they could have possibly done a freemium model, offering better and more detailed search capability for a small monthly or yearly fee.
All the analytics, google alerts, etc would have been non-free products as well.
It just would have made them much, much, much less money.
But god damn that search engine would be amazing by now.
That's not "anti-Google propaganda". That's honest criticism from an old geek.
"The difference between Google's advertisements and OpenText's is Google (usually) identifies advertisements as advertisements."
Also, Google results have gotten significantly better than they were in 1998 so even if we ignore the OpenText context, I don't see how this could be a successful prediction as you're trying to imply.
Somebody on the Internet is wrong and said something bad about Google? How dare they! At what time did this serious offence occur?
Not everyone cares about whatever is going on in antitrust, but if you're not objective, you have no argument.
I don't have to get approval to take 20% time, and I work with a number of people on their 20% projects.
I can also confirm that many people don't take their 20% time. Whether it's culture change due to new hiring, lack of imagination, or pressure to excel on their primary project, I'm not sure, but it is disappointing. Still, in engineering no permission is needed.
There is a reason the article talked to many ex-Googlers: they are, perhaps, the ones who left when there was a difference of opinion about their contribution to the company. And if that conversation attempted to use some of their 20% projects as evidence of their contribution, and that was the disagreement, well, it ends with them leaving.
I left in 2010, which was the start of the lurch toward more managers, and those managers were being scored on what their team accomplished that was assigned to the manager, not on what their team accomplished in their 20% time. Some managers had taken that to mean that 20% time didn't count at all toward calibration, and that if you were working only 80% of the time on their projects, it counted against your calibration. Hence the disconnect. That is why "120%" sort of works, even in the face of a manager trying to make their own number. Except the benefit sounds different if you say "And on Saturday you can work on any project you want." :-)
The policy of course is to not do that, but how would Google enforce it? If you spent 200% of your time on your primary project, you might get excellent reviews. Should you be calibrated 50% lower to normalize everyone? So yes, if you work 100% on your primary and +20% on your 20p, then you might get higher perf scores. Eh, if you have a good idea for how to correct for that, go ahead, but it's a people problem, and people are hard to deal with in these kinds of things.
This is one reason why I actively encourage as many people to take their 20% time as possible. :)
My personal experience though is very positive regarding 20% time. My first project, I started, gathered a few contributors, built a prototype, pitched it and turned it into a full-fledged project. I got great reviews and promoted in large part because of that. In my second PA I work on open source projects. Less glamorous, and maybe less perf-impacting, but that's not my goal right now.
In general the people I've seen take 20% time in a focused way tend to be the higher-performers. They're more self-guided, more critical of problems that need to be solved, or use 20% time to teach themselves or move onto harder problems. Maybe as we've grown the percentage of employees like that has shrunk, maybe it takes a while to realize you have 20% time. Not sure, but I'd like to see it be used more, it's good for everyone.
I'm not being critical of your account, I'm glad it has worked for you, as the manager of the engineering effort here I want to take the "good" stuff and use it, and leave behind the "not so good" stuff. So that is my agenda in this discussion, nothing more, nothing less.
"In general the people I've seen take 20% time in a focused way tend to be the higher-performers."
That is my experience as well, and regardless of a company policy for or against 20% time, focussed, high-performing engineers will spend 20% of their time doing cool and innovative things. Whether they do it at home or at work doesn't really matter, and having them do it at work is good for the company.
So the interesting thing about the policy is to capture the next tier of engineers and help them be more productive by encouraging them to develop the habits of the aforementioned highly successful, focussed engineers. And to weed out the folks who are abusing the program, or at least not being any more productive with it.
If nothing else, people are different, right? Facebook's response has been "hackathons", which carry with them some characteristics of high performers (who quickly prototype ideas to test their validity or get a handle on their challenges).
But in all those scenarios, if you have managers, you need to also train your management on what the program is trying to achieve and how it might be addressed. So you don't end up with some managers giving their people 1 day a week off, and some demanding they work on Saturday if they want to use that extra time.
If it is "You have this huge resource available, dare to use it," then you can manage to that without damaging either morale or perf scores. From the anecdotes in the OP article, it sounds like they are still working on that part.
Like the guy who said he was trying to capture the great ideas he dreamed about, so in his 20% time he would spend several hours napping in one-hour stretches, waking up and writing down what he had dreamed.
My previous experience is mostly in R&D startups, doing robotics and natural language processing. I was hired in 2010 and then found out I would be working on YouTube ads. I was disappointed, but decided to try to make the best of it. After three months I realized my lack of interest was going to be a problem, so I talked to my manager about trying to transfer. He discussed it with the site director, and the response was "we don't care" and I couldn't transfer until I'd been there 18 months.
I decided to stick around and see if I could work the system in some way, with the probably naive thought of trying to demonstrate my abilities and catch someone's attention that would help me transfer to a project I'd enjoy where I could make a real contribution.
During Innovation Week (a hackathon) I led a team of three other engineers working on an idea I came up with, and we won the "Most Innovative" award. The other engineers decided they all wanted to devote their 20% time to working on my idea.
My Tech Lead and my manager had no interest in my Innovation Week project, and I still had no way out of YouTube ads. Unsurprisingly my performance on my 100% project wasn't great and even if I made it to 18 months it seemed unlikely that I'd be able to transfer. I left after 17 months at Google.
I've mentioned this story before, and I hope I'm not just grinding an axe--I'm just telling my experience in the hope that it will inform engineers about possible outcomes of working for Google. I am fully responsible for my experience there, but I can say the priorities of the (large, heterogeneous) company were not what I (again, probably naively) expected.
FYI, when it comes to reading any Michael Church claim, especially one about working at Google, I'd put more weight on the "outspoken" than the "ex-Googler." He admitted himself on HN that he engaged in what he called "white hat trolling" while at Google, a pattern he seems to have repeated throughout his life.
He's very articulate, but he has an EXTREMELY strong belief in the quality and accuracy of his own ideas. For illustration, he wrote a few months ago, "societies live or die based on what proportion of the few thousand people like me per generation get their ideas into implementation." When presented with such self-confidence beyond all reasonable proportion and corroborating evidence, I think an appropriate response is skepticism.
Engineers get 20% time, period. You can be asked to defer it for a quarter. On the other hand, most don't take it.
I don't know him personally, but I read the stuff he writes, and while his character can be a bit abrasive, he's an extremely intelligent dude who can make astute observations and connections that other people miss. I think you're doing him a great disservice by dismissing him the way you did.
(That's a trick question, since the answer is going to depend on the person's intelligence and ability to make observations.)
Also, a ton of long-time Googlers agree with the stuff he says. It's not like he's some lone dissenting voice.
Personally, I did get very soft discouragement against wide-open 20% time. I asked around, and a lot of people I talked to initially advised me against starting a 20% project so early, and especially against starting a new project rather than working on an existing one with engineers at a higher level than me. At least, they said, make sure I could get reviews out of it. That's not policy, but advice. People have their own theories about how best to get noticed and get promoted. Some of that has to do with 20% time. I guess if you're solely interested in promotion then you give more weight to such advice. I hope most Googlers aren't solely interested in promotion.
I'm very glad I ignored that advice, both because I got to do very interesting things and because I got recognized for it, and I'm glad that I could ignore the advice because of our policy.
Or are you in an unintentionally unique position?
That doesn't seem to track with the article, or the articles/blogs linked from the article.
Lots of no-comments from the PTBs at Google... too bad. They could end the discussion one way or the other quickly.
I can't imagine my former TLM ever taking 20% time, but I can thoroughly imagine him encouraging every single one of his direct reports to do so, unless we were on a launch sprint.
I don't want to equate this with working at Google, because I don't have enough information, but here's my anecdote about good managers:
Twice before I've worked with spectacular managers. They treated their employees well, and acted as great shields against the political infighting inside the company. But both times, the managers were forced out, and the employees that depended on them to get interesting work done ended up quitting as well.
If this is indeed, as the TFA posits, a memo from the top being filtered out by a few good managers, in a short amount of time those managers won't matter; they will be forced, or burnt, out. Neither case is good for the company.
If you need humans instead of robots doing a job, that usually (increasingly, as automation advances) means it requires judgement such that pre-written inflexible rules will be inadequate to handle it sufficiently. Which means you need to rely on the judgement of people applying flexible rules for the best results.
Rather, I've always imagined that it'd be closer to perks like "unlimited vacation time": you can't just decide "well, I'm taking the next 5 years off", because that'd be a dereliction of duty. In other words, if you have the premier release of your main project coming up in 2 weeks, it may be in your best interest not to take that 20% time for the time being, but instead use that time to fully ensure that your project has a successful launch. But if you don't have any big deadlines coming up soon, and you're reasonably on schedule, then by all means take the 20% time.
The 20% time is a perk, and just like many other perks, your use of the perk cannot preclude any tacit duties that you may have, which include delivering on your assigned projects.
But, this is just an outsider's view, I could be completely wrong.
That's incorrect, AFAIK 3M were the first company to pioneer this approach (with 15% of time spent on self-directed projects).
For example see: http://www.fastcodesign.com/1663137/how-3m-gave-everyone-day...
Talking to a friend at 3M (who has been there 20+ years, an engineer with dozens of patents) I am told that while 15% officially still exists, for a long time it's effectively meant working 115% of hours.
Nonetheless the tradition allowing self-directed research continues at 3M - and this might mean using lab resources, or creating prototypes without getting approval.
And get paid for those hours, even if they're extra hours you wouldn't otherwise work.
I don't work there, but I feel pretty safe in saying 3M researchers are not hourly employees.
This way it is not goof-off time as it will not attract anyone except self-starters and thinkers, but the company is providing intellectual and material resources. You get to have fun, rejuvenate yourself on the job, the company contributes some, and maybe, just maybe, you hit the jackpot at some point. If not, well, you learned some stuff and had fun.
To your point though, Google probably has many "Paul Buchheits" (by that I mean people at his level of intellect and ability), and so do many other companies. Being on the wrong end of having a good idea and not having the company's backing to build it sucks and is discouraging. That's why people found startups (or at least why I like to believe some people do): to take the idea that the company didn't give them the time/resources for and build it themselves. In part to show the company that they were dumb, but also because they believe in the idea. I think as corporations try to maximize profits and throw things like 20% time out the door, the world will see even more startups, and I believe we are currently seeing that kind of trend with the newer generation. Ideas can consume people, and if corporations learned to harness that power (like 20% time and hackathons do), they could innovate in ways that people believed impossible.
I haven't been at Google long enough, but I doubt there was ever a time when a majority of engineers did 20% work. It's hard: you have to take away not only time, but also focus from your main project (which like all software is probably already taking longer to build than you'd like). Most engineers aren't motivated enough.
20% time has always been a great marketing gimmick for Google. I remember interviewing candidates and mentioning it as a bonus to prospective candidates - most employees mentioned it as a bonus because we were inculcated into believing it was actually a bonus. It's much easier to see that it was never a real perk once you're outside the bubble.
And for what it's worth, I personally have never had a problem requesting 20% time. Like all big companies, it really depends on your team and your boss. I believe your experience of straight-out rejection is more the exception than the rule.
"What happened to the company’s most famous and most imitated perk? For many employees, it has become too difficult to take time off from their day jobs to work on independent projects.
This is a strategic shift for Google that has implications for how the company stays competitive, yet there has never been an official acknowledgement by Google management that the policy is moribund. Google didn’t respond to a request for comment from Quartz."
What I've heard is more nuanced: the policy is not dead, but it's very hard to both satisfy the core demands of your job and maintain 20% time. So it's not dead, but possibly not an actual priority in the culture the way it used to be...?
The article seems to be pretty nuanced already.
... and the difference in outcome is?
Guess who advised Larry Page to focus on a small number of projects? Steve Jobs. It seems like Larry's acceptance of Jobs' advice has sabotaged Google. I'm no conspiracy theorist, but one does wonder if this was Jobs' secret intention.
See the second quote in: http://www.edibleapple.com/2011/10/22/steve-jobs-advice-to-l...
But do any of the other companies listed do 20% time? AFAIK, they only have infrequent "hack days".
But this might affect their ability to attract absolutely top-tier talent.
If you fully de-risk your company, it will stagnate. No risk, no reward either. And in Google's quest to kill Microsoft, they are becoming Microsoft.
I totally understand the "management killing freedom of experimentation" comments, but they are only one side of the story, and a side explicitly pulled from disgruntled Xooglers at that.
The Garage is one of the things I miss most about Google - both the makerspace itself, and what it represents. We hosted several internal hackathons there, both within and outside the intern season and had great success.
Wouldn't that be 8% weeks (one hack week out of a thirteen-week quarter)?
Maybe current Google employees now prefer to develop their ideas in their own startups rather than as part of the 20% program.
There's nothing more soul-crushing for an organization than a board of directors.
But to argue that the board of directors is what's soul crushing about an organization seems odd to me.
It might be easier to manage a separate Innovation Department, but what happens when someone on another team has a great idea? Do we transfer that person to the Innovation Department to continue innovating? What would that mean for their team left behind, relieved explicitly of responsibility, and implicitly of capability, to innovate?
Maybe the whole company should be tasked with innovation, but innovators should be allowed 80% time to write CRUD apps if they so choose.
Exactly. At the end of the day, anybody in a company can be responsible for a valuable innovation. This is why something like "20% time" is such a good idea in the first place. It avoids the trap of assuming that you can actually accurately select the people who "should" be doing innovative stuff.
OTOH, having a dedicated Research department where people spend 100% of their time on research and cutting-edge stuff also seems like a good idea. An interesting approach might be to have both "20% time" AND "Google X", but with a policy that allows the hoi polloi a chance to rotate through the dedicated research group on occasion.
I know that it is a large company and control is required to manage everything, but this sounds depressing. I'm glad I work in a tiny company and even smaller team.
Their performance measurements are much more based off asking other engineers and colleagues what they think of your work.
There's lots of pressure to focus and do well on your main project, sure. But it's not external pressure, it's internal pressure. Everyone wants their software to be great. It's hard to detach for a day, ignore the influx of bugs and emails, switch gears, and work on something completely different. Most engineers don't want to spend the mental overhead.
But speaking from experience, when you do choose that path you get nothing but support.
They now seem to think that their management has obtained good enough crystal balls to be able to determine beforehand which of the potential products will succeed and which will fail. Meaning that they can scrap the 20% time and put all their "wood behind the few arrows" they have determined will succeed (you know, like Google+).
There was a good image I saw on reddit a while back that I can't find offhand. The left side, titled how most people view success and failure, had a single branch: you fail or you succeed. The right side, titled how successful people view it, was a chain of failure after failure finally leading to a success.
If you eliminate failure, you eliminate the success you gain from learning from failures.
Isn't that begging the question of whether you can know, ahead of time, which projects will or won't become viable and have a supporting business case?
You need your engineers to put up with mundane, boring work. Paying them to spend a minority of their time on an interesting project helps.
See also the "people simply empty out" article posted on HN. Google seems to be falling for the PHB fallacy that people are just machines that crank out code. Perhaps that is what google is becoming, but I certainly wouldn't want to work there.
Google is gigantic these days. It probably depends a lot on where you are in the company. Two people in different Google divisions or even physical offices are not going to have the same experience.
This means it's dead as a company-wide policy, and it's problematic to sell during recruitment.
With any project like 20% time, you should expect that almost everything produced will be worthless. So what? All you need are 1-2 big hits to justify the entire program.
Letting employees work on what they think will be valuable is a bottom up approach that can reveal knowledge that's inaccessible to managers.
Some comments on here say that the program 'only' created Gmail. That's been a massive success for Google. That + the smaller successes it has led to likely justify all the projects that didn't pan out.
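To make that portfolio argument concrete, here's a toy back-of-the-envelope sketch in Python. Every number in it (headcount, costs, hit probability, hit value) is an invented assumption for illustration, not a real Google figure:

    # Toy expected-value sketch for a 20%-time "portfolio".
    # All numbers below are made-up assumptions, not real figures.
    engineers = 10_000               # assumed participating headcount
    cost_per_engineer = 200_000      # assumed loaded annual cost, USD
    program_cost = 0.20 * engineers * cost_per_engineer  # 20% of payroll

    projects_per_year = engineers    # assume ~1 side project per engineer
    hit_probability = 1e-4           # assumed chance a project is a big hit
    hit_value = 10e9                 # assumed value of one Gmail-scale hit

    expected_value = projects_per_year * hit_probability * hit_value
    print(f"cost ${program_cost:,.0f}/yr vs expected ${expected_value:,.0f}/yr")
    # With these made-up numbers: cost $400,000,000/yr vs expected
    # $10,000,000,000/yr -- one rare hit pays for all the duds.

Even if you cut the assumed hit value or hit probability by an order of magnitude, the program still pays for itself, which is the whole point of a long-shot portfolio.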
Let's suppose you are a middle manager, and your performance is based on the completion of projects assigned to you by upper management, with perhaps a little weight given to the innovation your team cranks out in 20% time.
If the projects you are given are slipping, or are not achieving the kind of success that makes you look good (for the purposes of your own promotion/bonus), do you allow your "resources" to work on 20% time, which may bring very little benefit to you directly even if it is successful (it benefits the engineer, not you), or do you divert those resources to projects that _do_ give you personal benefit?
Some of the most useful stuff to come out of google has been due to the labs, and the people who work at google are supposedly some of the people at the top of their game. Now they are not going to benefit us with their great ideas, just work on the "more wood" approach -- which sounds like google's employees and google's consumers are the ones getting the wood.
To be fair, not everybody is interested in running a company and doing all the "other stuff" it entails beyond writing code. Also, Google provides great resources and infrastructure, which would be hard to replicate (assuming the hypothetical project in question required Google-scale infrastructure).
I can see where that would be true for some people, but I doubt it's universally true. And it may even be that it just makes more sense to do the idea inside Google than doing it independently anyway. Again, look at the resources and infrastructure Google already have assembled.
And some people may have just bought into the Google mission / vision / whatever, and / or just feel a sense of loyalty to GOOG.
Anyway, I don't mean to suggest that it never makes sense to quit and do your own thing. I mean, hell, I probably would, as far as that goes. I'm just saying that there are some legitimate reasons why some people might prefer to just work within Google than quit and run off to start a new business of their own.
- Many 20% projects are things that would not be easily monetizable: open source work, non-technical things, improvements to infrastructure.
- If the idea is very speculative, exploring it while still getting paid a handsome salary is a much easier step than quitting your job.
- Some things are easier to do if you have all the Google infrastructure to build on: for example the Transit Maps thing below is a lot easier if you can plug into Maps! (But some things would be easier to launch externally.)
- Part of the attraction, like for open source projects, is that it's something different from your main job, so you learn and stay fresh. If you turn it into your main job you lose that.
The theory is that if you do work on a new product and it works well, eventually it will be staffed full-time and you'll be rewarded. Apparently that did happen with Google Now. I doubt it happens every time, but then not every worthwhile startup succeeds.
You mean Orkut. That was weird: it got wildly popular in Brazil, India, Turkey and other places, but never really took off in "the West" (weird name, bad quality when first released, boring UI...). I guess most of its members (it's still going!) will become G+ users, so it wasn't a complete failure in the end (in fact, a large userbase in developing countries is an insurance policy for the future, if you can keep it going).
EDIT: Wikipedia says "Orkut was quietly launched on January 22, 2004 by Google. Orkut Büyükkökten, a Turkish software engineer, developed it as an independent project while working at Google. While previously working for Affinity Engines, he had developed a similar system, In Circle, intended for use by university alumni groups. In late June 2004, Affinity Engines filed suit against Google, claiming that Büyükkökten and Google based Orkut on In Circle code. The allegation is based on the presence of 9 identical bugs in Orkut that also existed in In Circle."
Wave wasn't a social network; it was a collaboration protocol with a proof-of-concept demonstration app.
I did get the impression it had some interesting innovation under the hood though that made it into Plus and Docs.
all the wood led by deadwood behind one arrow ... this arrow ... now this arrow ... no, this arrow that has just been renamed to the name of that arrow ... of course it is an arrow, it just looks like a deadwood tree trunk washed ashore, and weighs like one, and flies like one -- well, that is, if somebody can make it fly ...
Despite the rumors, I don't think 20%-time is dead.
When I was in the Valley a few years ago, lots of people were quite convinced that top talent had already been completely chained to Google/Facebook/Twitter anyway, and there was no chance of poaching it away. If you follow that line of reasoning, there is no point investing in recruitment, at least locally.
I also wonder, though, if the real estate hyperinflation of the Valley is a factor. It negatively impacts the ability to attract out-of-town talent when a "starter home" is one million dollars and a decent apartment is over $3000/month. It also impacts retention. I meet a lot of folks from the Valley and SF who loved it but "left cause they wanted to start a family" or "left cause they got tired of spending >50% of their income on real estate." You either have to pay exorbitantly high salaries or expand elsewhere.
Manhattan is similar to SV, but the thing is this: it is possible in New York to find reasonable (for the local market) real estate within a reasonable commute from work in one of the boroughs. It won't be the trendiest, but it's going to be okay.
In SV you are talking about $800k or $2500+ a month for places that in other parts of the country would be called a "crack den." Either that, or you are going to be both far away and in an utterly dull, depressing suburb.
SV is America's only six-figure ghetto. There is nowhere else where so much buys so little. Driving around on my most recent visit, I was shocked by the squalor these six-figure engineers and millionaires live in (relative to what you'd get for the same price in sane markets).
I'm not a local so I can't say for sure, but I'd be strongly tempted to blame the anti-development political mentality. With growing demand and no supply, this is gonna happen.
(Yes, yes, the % isn't correct. It doesn't matter ;-))
In many ways it's a shame, although I think most of us can appreciate that the Google of yesterday is a different beast to the Google of today. One area where I think it could really hurt Google is in recruitment. I imagine the "20% rule" would have been a huge factor in getting the best engineers to work at Google. With this news, I can see some of the magic of Google disappearing in my mind.
The two people I know said the 20% was a way for their brains to take a break and think creatively. Nothing was too crazy or far-fetched to try out. They said it helped combat the crazy hours and working from dusk till dawn. In short, it helped prevent burnout. I would be willing to bet the turnover rate will start to increase as employees work more hours, without any breaks and with nothing to look forward to, and burnout settles in much faster.
Sure, it's not going to make them go bankrupt, but it will have a real world cost to their bottom line.
IIRC, both Gmail and AdSense mentioned in the title didn't come to Google as 20% byproducts, but as acquisitions.
Does anyone know of any popular (by Google standards) product that started that way?
Full disclosure: one of the big leaders of the Google Transit project was Avichal Garg, who was a PM for the webspam team when he ramped up Google Transit in his 20% time.
Also, as others have pointed out repeatedly, 20% has not been "ditched." And this isn't a poster child, it's just one example.
If that was where 20% time was going, I don't think we're missing much.
This was yesterday.
This tendency is self-reinforcing: if there is less opportunity for a side project to make it, then developers will be less inclined to follow through with their ideas.
So, as an outside observer, I can identify a cycle: less room for innovation breeds less incentive for individuals to take initiative. Management sees that its liberalism is beside the point, so the free room for initiative is scrapped.
Maybe google can still hold on to a culture of innovation in areas of infrastructure and research, but in a wider sense google+ probably did them in.
Maybe that is the right thing to do: they have enough money to buy any start-up if it looks promising. I don't know which one is more cost effective: exploring ideas in-house or buying out start-ups; however if Google looks like any other ordinary shop then it will be less able to attract top talent. I would guess that the free food is not the winning perk that attracts developers.
I dare the management at Google to reverse course and allow their brilliant stable of top talent to keep the 20% perk - let a few Benjamins go for the sake of building awesome shtuff ("h" intended).
INNOVATION RULES - cash - well, it's just dirty paper...
What should be done is to first decide your company's mission statement, and if innovation appears anywhere within it at a priority on par with or above profit, then you need to take that seriously and set incentives that move folks toward that goal. I was unable to find an actual Google mission statement, but I did find their philosophy page (i.e. "Ten things we know to be true"), and it seems clear to me that innovation is definitely a core principle for the company.
As to all those detractors who say the 20% has become basically screw-off time, maybe-maybe not. But the proof is in the pudding and if amazing things come out of the practice - as is the case with 3M and Google - then the screwing off is clearly at a minimum if there at all.
For those who have an idea and wish to use 20% time to incubate it, restructure all of their core projects around a 32h week and pay them for 40h.
My dare to Mr. Brin and the rest remains - don't just not smother the policy but rather, embrace it with a massive kung-fu grip bear hug. Make it a top priority for those with ideas to use 20%-time and incubate to the N-th degree!!!!!
I get praise and appreciation, and bonuses from other teams. I think it looks good on a resume, I learned something, and it builds a connection across teams. I know the things I'm doing are helping Google as a whole.
People are always going to have different feelings, and often mixed feelings, about putting time into a side project rather than concentrating on their main thing or doing something away from the computer. For me it comes and goes: some days I want a change and sometimes I feel too busy.
How many of you who have startups would feel OK about taking time from it to spend Friday working on an open source project, even if you don't have a boss that's stopping you?
Google has had some success with it obviously, but I wonder how it would have worked had they done it by quarter or semi-annually instead.
Say, you meet a certain threshold in your performance review, you get a lump sum of ~20% of the time period that the review covers.
Say for a quarterly review, if you meet the threshold, you get 2.6 weeks. Or semi-annually you get 5.2 weeks, straight. You have to be at the office, but you can work on your own project during that time.
5.2 weeks is some serious hacking time, enabling a great deal of extended focus. Highly productive Google engineers could probably bootstrap a startup MVP in that time, except of course Google would own it.
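A minimal sketch of that arithmetic (the 20% share, the period lengths, and the pass/fail threshold are this comment's hypothetical policy, not anything real):

    # Lump-sum hack time earned per review period, per the
    # hypothetical policy sketched above (not a real Google policy).
    def lump_sum_weeks(period_weeks, share=0.20, passed_review=True):
        """Weeks of self-directed time banked over one review period."""
        return share * period_weeks if passed_review else 0.0

    print(lump_sum_weeks(13))   # quarterly review   -> 2.6 weeks
    print(lump_sum_weeks(26))   # semi-annual review -> 5.2 weeks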
Personally, the way I've always taken it was "I've got no pressing tasks I need to take care of for my main project or for coworkers? Great, I'll work on my 20% project!" And I've found out that over my nearly 5-year career at Google, this averages out to pretty close to 20%. But there were weeks that I did nothing but 20% work, and there were stretches of 2-3 months where I didn't do 20% work at all.
We're evaluated over a long period of time and nobody really knows what you do day-to-day, so it doesn't actually matter all that much how you take 20% time, the important thing is what comes out of it.
Google previously focused on creatively building new, better solutions across an array of different verticals. Now they are betting on the creativity of their own "super-man team" instead of the old "we hire amazing people, and our people make crazy stuff" approach of letting everyone go at it in their spare time and build innovative projects.
Maybe this time marks a period in Google's future where they stop innovating in a disruptive fashion.
So in some sense, this comes as depressing news.
But not really, because another company will take its place, and I can set my sights there.
Whelp, it's been grand, Google. Who's next?
Someday a company might actually grow correctly and preserve their humanity and agility despite our tendency for social discord as a species, but it's a damn hard problem.
I will try to add a point I haven't seen here yet:
Once I was in a big company with a big, famous research organization. The group I was in, just three of us, was led by a guy who had taken some technical work and published some papers. The journals were peer-reviewed but semi-popular, so they reached some ordinary business customers. The papers fit in with a lot of hype at the time.
So, some of the customers picked up a phone and called our group for more information. So, presto, we got contact with real customers, otherwise essentially forbidden in research.
Why forbidden? Because the sales/marketing guys wanted to be the guys who had total control over all contact with customers.
Eventually our little group broke up, and I wanted to do some research that would lead to products to be sold to customers. I started to discover that no one in the company wanted that. The big guys in research wanted to throttle all contacts outside research and mostly didn't want any. The people in sales, marketing, product development, and product production didn't want to deal with anything new from research. No one wanted a research worker bee talking to real customers.
Next, when research did come up with something that might be of interest to real customers, no one wanted to pursue that opportunity. Again, the marketing people wanted to be the source of all new product/service initiatives. No one in research wanted a research worker bee to get the credit/blame for having a product go from research to success/failure in the market.
In the end, research was forced to be essentially irrelevant to the business. Finally research was turned into a patent shop so that lawyers could take the big patent portfolio, go to companies, claim patent infringement, and 'license' the patents.
Point: Generally, a business organization, given a nice new product/service to offer its customers, has multiple layers of organizations and people, processes, reasons, etc. why they just do not, not, want to come out with anything new. Period.
Curiously, yes, some of the people at some of the customer organizations were eager to work with our group in research as a way to look more innovative within their own organizations. Here, of course, they didn't have to work through layers of managers but only had to exercise their own freedom to handle their own responsibilities.
So, lesson: Commonly individuals are ready to do and offer new things to others, or try new things from others, but it's the layers of managers who block such things.
So, the blockage by these layers of managers is one of the main reasons and opportunities for entrepreneurship.
Yes, it's dumber than paint, but it's usually the way of the world.
This brings me to my staple question again: "How in the world do those intelligent people, with all those qualifications, fail to understand something that is, well, dumber than paint?"
Or is it that the majority of the layers are full of unwilling/stupid people? I can see people being unwilling. During my internship, I became quite hesitant to touch any issue that wasn't assigned to me. The QA guy didn't know much about programming and assigned features randomly, and there was no incentive for me to develop a feature assigned to someone else (though, being curious, I did "snatch" a feature or two and implemented them myself)!
I guess one weak point causes the problems, and then they get amplified! That's how offices work.
Dumber than paint, but usually not totally irrational for the managers who do such things.
Generally, if such dumb behavior is made sufficiently 'public', then nearly everyone will fall in line and agree that, yes, sure, it was dumb and we don't want anything like that going on here.
Otherwise, dumber than paint is common, and here is some of why: You are a manager, at some level but not CEO, with several subordinates, managers and/or worker bees. One of your subordinates comes to you with a new 'idea' X. Maybe all X is is 'blue sky', back of the envelope; 10 pages of manuscript descriptions and/or mathematical derivations; a carefully prepared paper of 50 pages with technology, market analysis, and cost and revenue projections; running prototype software; breadboard hardware; joint research with some leading customers; or some such.
Your mission, and likely you have to accept it, is to decide what to do about project X.
Suppose you investigate project X. Okay, that's time and energy away from your assigned duties, which are what count with your manager. So maybe you put in some weekends investigating project X -- now your spouse is unhappy.
Suppose from your investigation you decide to try to support project X. In simple terms, there are two cases:
Success. Suppose project X is a success in some significant sense, say, the project grows in headcount, budget, floor space, etc. Suppose project X gets some publicity in the company and/or outside. Suppose some major customers hear about project X and tell your marketing people and/or CEO that they want to try X. Etc. Now you can be seen as a bright, rising star, and your management chain, from your manager all the way up to but possibly not including the CEO and COB, feels threatened by you and will, all together, at the first opportunity, hang you by your toes from a lamppost in the employee parking lot.
Failure. Suppose project X is a failure. Then all the resources you devoted to it will be considered a waste for the company. You will be painted as an incompetent, a loose cannon on the deck, and be chastised, ostracized, put on the slow track, given a black mark on your career, or just moved out the door.
Commonly there is only one person in the company who can push project X, put up with the blame if X fails, and really benefit from the credit if X succeeds, and that one person is the CEO.
If the CEO sees that you got dumped on for your efforts with project X, then he may shake a finger at the managers in your management chain, but likely you still don't report to the CEO, who still needs your management chain for the responsibilities they have been assigned to do.
On the other hand, suppose you decide just to get a list of excuses, that you release only one at a time as needed, for just why project X is 'not appropriate' for your group. That is, you pour cold water on project X and work to kill it. Here you are likely safe from any blame or attacks and don't have to lose some weekends on project X.
So, net, nearly never does anyone get blamed for not innovating.
Really, usually in an organization, there is work to be done, and the management tree is a partition of that work to get it done. No one is holding their breath waiting for innovation to save or even help the company. The company and its existing products/services, customers, revenue, earnings, stockholders, etc. is a valuable bird in the hand, and any innovation is seen as at best two birds in the bush.
So, possibly one reason for the old 20% time of Google was to let everyone in the organization pursue their case of a project X without blame and, also, to get a hearing, without much risk, in case their project X looks good.
Why stop 20% time? Maybe not much good was coming from the 20% time. Maybe a lot of middle and upper managers got tired of having to evaluate projects they really were not interested in -- a VC can get 2000 contacts from entrepreneurs in a year and invest in 0-3, so that's about 2000 contacts a year tossed in the trash -- and maybe Google managers didn't want to evaluate such projects. Maybe the Google top management was concerned that the main work was not getting done fast or well enough. Maybe the 20% time looked like chaos. Whatever.
As we know, Woz was at HP, presented to a manager there the idea for, say, the Apple I, and secretly smiled when the manager said that HP would not be interested.
It remains that some of the most important ideas in computing are authentication, capabilities, and access control lists. These were in good form in the MIT Project MAC computer system Multics, as was a processor architecture with gate segments, since seen in Intel's x86 architecture.
Well, Multics was done on a computer from GE, expensive. GE sold their computer division to Honeywell, so Multics was on a Honeywell. Honeywell didn't like selling Multics, and not many were sold -- one sale was in the basement of a five-sided building on the west bank of the Potomac River. Some engineers at Honeywell saw 'bit-sliced' hardware and microcode and concluded that with those two, some extra register sets, a tweaked Fortran compiler, and the Multics hardware/software ideas, they could bring up Multics on a single-board super-mini computer, sell it, and make money. The story went that the Honeywell managers said that they didn't believe there could be such a computer, that if there was it wouldn't sell, and that even if it did Honeywell would not be interested. So the engineers started Prime Computer, one of the three famous super-mini efforts, along with the DEC VAX and the DG 'Soul of a New Machine' MV8000, on or near Route 128 in Boston. As I recall, in 1980 Prime yielded the best ROI of any stock on the NYSE. Later Prime got a CEO from a big computer company and went down, down, splash!
It's a very old lesson: Big companies do not, not, want to innovate. An explanation doesn't really need the book 'The Innovator's Dilemma' and is simpler -- people acting dumber than paint and, the real cause, getting away with it, and even benefiting from it.
Why? Because nearly no one can be sure that a project X would work. So, early on, tossing X into the bit bucket is not provably a disaster. The short-term cost is easier to see than the long-term gain.
Why? Because accurately evaluating a project X is commonly a lot of work, still needs some judgment, and really cannot easily be communicated to others, say, higher managers, the CEO, the Board, security analysts, the tech trade press, or the stockholders.
So, dumber than paint is accepted as the way of the world.
Then one of the big opportunities, especially now with the fantastic price/performance of computer hardware, Internet bandwidth, and infrastructure software, is information technology entrepreneurship. Broadly, in economic terms, we want to automate everything in sight to give higher economic productivity. There should be lots of opportunities, better than wheels, bronze, iron, open-ocean sailing, steel, steam, electric power, motors and lights, internal combustion engines, oil, radio, TV, chemical engineering, etc.
But a bootstrapped entrepreneur can't do a 13 nm microelectronics fabrication facility, an iPhone 10, a Tesla, etc. So a company with everything in place -- HR, legal, finance, floor space, and cash -- should have a big, huge advantage in innovation, i.e., powerful, valuable new products that will make a bundle but that small entrepreneurs can't do. So maybe it's the first HP scientific pocket calculator, the first HP laser/ink-jet printer, the Apple iPad, iPhone, etc.
Still, apparently to be good at such 'internal innovation' takes a Jobs. For a Steve Ballmer, John Chambers, Marissa Mayer, Larry Ellison, or Page/Brin, maybe the best they can do is buy a Hotmail, YouTube, or Tumblr and just hope not to kill it right away.
Back to my startup!
I remember being in direct competition on a project with a google 20% team, and only beating them by a few weeks. We had fun with that.
I guess that's just the natural life cycle of companies?
Like most startups, we also operate at breakneck pace, we're not exactly sitting around twiddling our thumbs. This strikes me as the same as "I don't have time to..." excuses people make in their personal lives. If you value it enough, you will find a place for it.
It's all about how much you actually value the 10%-time, or if it's just a recruiting/PR gimmick. If the company approaches it from a "it would be nice if..." angle, it will always find a million things that are higher-priority than 10%-time. If you approach it from the "this is a critical part of company strategy" angle, you will find ways to keep it sacrosanct.
I thought it was Google that copied this from 3M, whose "15% time" is credited with inventing Scotch Tape?
- search in Gmail is actually losing functionality.
- hemming and hawing about the NSA scandal (not wanting to risk irritating political leaders)
- ending free google apps tier
- ending google reader
- ending 20% time
What a huge sacrifice HP made thereby. Friday afternoon is otherwise the most productive time of the week.
Hack Days work much better. Easier to accommodate and more fun.
It is probably not intentional but their inability to manage funding is actually a blessing in disguise for most labs.
Business is just not capable of long-term dedication to experimentation, planning, and innovation.
We under-value sincere-but-failed projects.
Expecting Google to build a webmail system would have been like expecting Twitter to release an accounting package.
Besides if you care about failure then you don't belong in the tech industry.