Hacker News
Google’s “20% time,” which brought you Gmail and AdSense, is now as good as dead (qz.com)
581 points by antr 1321 days ago | 481 comments



Few people remember it, but the same thing happened at HP. It used to be that HP engineers were expressly given Friday afternoons and full access to company resources to just play with new ideas. Among other things, this led to HP owning the printer market.

Then "professional" management came in and killed the proverbial goose. They had to focus more on the "bottom line". To do what was easy to measure and track, rather than what was necessary for the next step of the company, and now HP is a mere shadow of its former glory -- directionless and bleeding.

3M and Corning have largely avoided this fate, but it seems that Google won't. This should make a lot of entrepreneurs happy, as there will continue to be a lot of top-down management-driven products that, if history is any guide, will continue to be market failures. Yet somehow, I'm incredibly sad, as it seems that too many companies go down this road.


It boggles my mind, given the big money involved, why so many people continue to bet huge sums of cash on the proven short-term penny-wise/pound-foolish idiocy of MBA-think.

I mean sure -- if your company is under cash flow pressure you have to pinch pennies. You have no choice. Spreadsheet says so, and spreadsheet's the boss. But if you're not, you should be investing and thinking long term, because the other guys probably aren't.

I've seen a related phenomenon in the startup world. Watched it, front row seat. I did a stint in startup-tech-focused business consulting. If you have a top-ten MBA and connections you can raise millions of dollars, set fire to it like the Joker in Batman Begins, and then raise millions of dollars again, serially.

They were basically cargo cultists, mindlessly imitating the words, phrases, and superficial behaviors of supposedly-successful people and businesses. But there was no higher-order conceptual thinking beneath the surface-- no "there" there. They had no plan and no plan on how to acquire a plan. They got the money and then did a kind of mindless MBA rain dance until the money was gone. Then they'd raise more.

I watched them do shit like destroy products that big customers had money in hand ready to pay for when they were inches away from release. I mean a done product, ready to go, and better than anything else in its market. A product that they owned and had already paid to develop. The rationale was always some kind of MBA newspeak blather. I can't even remember it since my mind filters out sounds that imitate language but lack conceptual content. Otherwise I risk wasting a synapse.

But what do I know? I went to a po-dunk Midwestern state school, so what looks obviously stupid to me is maybe genius. I'm not saying I definitely could have done better, but I do think my probability of failure would have been <= to theirs. But there is no way in hell I could get what they got. Not a chance. I saw people try with better credentials than me and who were probably much smarter, but they lacked whatever special magic blessing the cargo cult guys had.

I'm convinced it's pure cronyism and ass-covering. I guess nobody ever got fired for losing their clients' money to a Harvard or MIT Sloan MBA. Nobody with a degree like that could be at fault. It has to be the employees (I've seen really good people get blamed for following stupid orders several times), bad market timing, etc.


You are sooooo right on with this.

The top-10 MBA cult is awful. These are usually those people you knew in high school/college who were excellent at studying for and passing tests. Excellent at getting great grades on projects. Excellent at everything except building ANYTHING.

I think that the people who are best at building things that people want don't want to get an MBA. They instead choose to spend their time building a business, a piece of software, a piece of hardware, whatever. People are good at what they love to do. MBA people love to go to school and get pieces of paper that say "pay me I'm smart."

As a data guy, I have to go in and deal with these assholes all the time. I have hard numbers, they have hand-wavey MBA speak bullshit. It is probably the hardest thing we data people have to deal with: criticism from the "trusted advisors" who, due to the cognitive dissonance suffered by the executives who pay them loads of cash, are deemed to be intelligent when they really aren't. After all, what executive wants to admit that the man or woman he has been paying high 6 figures to advise him for months is actually a talented actor/mimic at best and an idiot at worst??


I don't think it's the MBA degree itself that causes this. Most MBA programs go very heavily into the idea of culture and competitive advantage - and obviously stopping a core part of your culture like the 20% time would be against what most MBAs are taught.

This problem has nothing to do with education and everything to do with short term professional management that is compensated based on short term results. If you want to blame anyone, you need to blame current financial thinking by most boards of directors.

Of course, most of those guys are just in it for the short term too. So ultimately you need to blame the guys with money who give it to people who don't know how to invest. I'm sure most of us, including me, are guilty of this as well.


>If you want to blame anyone, you need to blame current financial thinking by most boards of directors.

I don't have an MBA, but my understanding is that this is what MBA programs teach, and it is at least partly to blame for Wall St's, and boards of directors', short-term, bottom-line, quarterly focus.

There are some people in the MBA world trying to correct that, one group I know of being the Throughput Accounting [1] advocates. Maybe there are more as well.

Can't come soon enough.

[1]: https://en.wikipedia.org/wiki/Throughput_Accounting
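For anyone unfamiliar with it, the core Throughput Accounting identities from Goldratt's Theory of Constraints are simple; a minimal sketch (the figures below are made up):

```python
# Throughput Accounting in three identities (Theory of Constraints).
# All numbers here are hypothetical, for illustration only.

def throughput(sales: float, totally_variable_costs: float) -> float:
    # T = revenue minus only the costs that vary directly per unit sold
    return sales - totally_variable_costs

def net_profit(t: float, operating_expense: float) -> float:
    # NP = T - OE; operating expense is treated as a lump sum,
    # not allocated to products the way traditional cost accounting does
    return t - operating_expense

def roi(np: float, investment: float) -> float:
    # ROI = NP / I
    return np / investment

t = throughput(sales=1_000_000, totally_variable_costs=400_000)
np = net_profit(t, operating_expense=450_000)
print(t, np, roi(np, 1_500_000))  # 600000 150000 0.1
```

The point of the framework is that it pushes decisions toward maximizing throughput rather than minimizing cost -- the opposite of the penny-pinching mindset discussed above.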


Not at all - MBAs teach company management and company finance. There really isn't anything in an MBA that would have anything to do with the job of corporate oversight that a board of directors handles.

The board of directors are simply large shareholders who are put there to make sure management does what is in the best interests of shareholders. In most large companies, these people are mostly employees of pension funds and similar institutions. Their job is to try their best to make sure the company is committed to giving their investors a return of x%, and they are usually there because it is conventional thinking that having control of a company is in your best interests.

Now obviously these guys have no idea what should be going on in a technology (or just about any other) company. They aren't really concerned with employees or anything like that - only with a few accounting and market figures such as return on equity, share price, potential acquisitions of other companies they have invested in, and whether or not to sell the company to others. Actually running the company or how the company works is the furthest thing from their minds.

Now, as I said above, these guys are not actually to blame. They are just doing their jobs - giving returns to the guys who have dumped money into their funds. The people ultimately to blame are you and me. We put our money in these massive pension funds and similar. We don't even care where the money goes. Each year we check our statements and say 'oh, 13% returns only, maybe I should switch pension funds?'.

As always, the problem is apathy, entitlement and 'well these guys are a big company, they must know what they are doing'.


"MBAs teach company management and company finance."

This is inherently what the board of directors does. Furthermore, your characterization of the make-up of a board of directors is not necessarily correct. Many (if not most) boards also have independent directors, who may not own a single share.

"There really isn't anything in an MBA that would have anything to do with the job of corporate oversight that a board of directors handles."

This left me scratching my head; my experience was the polar opposite of this comment. In my MBA program the topic of the board came up a number of times in finance and management classes. The board and corporate oversight were very much top-of-mind issues.


"This is inherently what the board of directors does."

That would be what the board of directors is theoretically meant to do. In practice it's extremely far from the truth, with board meetings being very infrequent and focused primarily on share price and dividends.

This discussion is not really about Google though: Google does actually have a very relevant board of directors with most of them being founders or directly involved in starting large tech firms. Many other companies (Nokia? Microsoft?) are not so lucky.

Interestingly enough on Google's board only Paul S. Otellini (previous CEO of Intel) and L. John Doerr (early Intel engineer and VC) have an MBA. [1]

[1] http://investor.google.com/corporate/board-of-directors.html


I agree with your point about the theoretical vs. the real function of a Board. I should have been more explicit: I am disagreeing that Boards of Directors are overlooked in MBA education. At least in my experience, the Board was discussed often.


I think executive hiring practices like competing based on references certainly don't help either.


I really don't see the need to stereotype, and even less to call names.

Personally, I have worked as a software developer, and I do also have an MBA, and I strongly believe that my broad/diverse education allows me to better interface with people from different backgrounds within a work environment.

Are there ways to obtain broad skills (technical, management, etc.) which are more time effective, cost effective, etc.? Maybe, maybe not. Each person is free to make their own choice.

However, back to the article's point: if Google's 20% time is dying down, it is ultimately due to Larry Page, who is clearly a technical guy, doesn't have an MBA (1), and is most likely seen as a "doer".

(1) According to wikipedia, he actually received an honorary MBA for his entrepreneurial spirit.


>allows me to better interface with people from different backgrounds within a work environment.

Personally, I like to relate to people, rather than interface with them.


"Personally, I like to understand problems, rather than grok them."


The two following comments are spammy but the only real issue with jmduke's comment was the form. Had it been:

>You could also say: "Personally, I like to understand problems, rather than grok them."

It would have gotten the point across that the grandparent's comment was an unfair dig at a style of speech rather than the content of that speech.


"Personally, I like people"


Personally, I like persons.


I'm glad you made this comment. It's disheartening that so many people in this community lash out at MBAs like they're viral scum. Stereotyping never solves anything.


>"The top-10 MBA cult is awful."

SV technorati pretentiousness is awful.

>"MBA people love to go to school and get pieces of paper that say "pay me I'm smart."

As opposed to some "Silicon Valley" types that don't have a clue about what it means to run a business, and say "pay me with investors money, I have lots of users!"

>"I have to go in and deal with these assholes all the time. I have hard numbers, they have hand-wavey MBA speak bullshit."

You sound like you don't have a clue what getting an MBA actually entails. Your comment is full of ignorance and generalizations.

Where does this idea that "an MBA" made this decision even come from? Google is run by an engineer. Point your misinformed, misdirected, stereotype hatred elsewhere.

This community is getting worse every day.


"This community is getting worse every day."

You've been in "this community" for less than 3 months according to your username. I'm not SV technorati, and I've run a business, as well as been a founder in a non-tech business that grew to 150 employees. I know business, and I know what I said is correct.

"SV technorati" is itself a stereotype.


It doesn't matter how long he's been here. His statement about the community may have been hyperbolic; do you have a real rebuttal against the content? Or will you nitpick one sentence you can attack?

"SV technorati" is a stereotype because he's drawing comparisons; he wouldn't normally make a stereotype argument unless you had already. The point was to demonstrate a different perspective, and this entire thread is just demonstrating a disgusting level of stereotyping for a category of people - MBAs.

The bottom line is that there is nothing inherently evil or pejorative about an MBA. Judge human beings on individual merit, not on a piece of paper.


Yep, absolutely correct. Well, save for Nike, which was founded by an MBA. Oh yeah, and Apple, whose CEO has an MBA and was personally selected by Steve Jobs. Well, now that I think about it, Meg Whitman, the current CEO of HP and former CEO of eBay, has an MBA. And wasn't Bonobos founded by a Stanford MBA? I think Birchbox was founded by a group of MBAs as well.

Except for those and probably thousands of other exceptions, your logic is water-tight and well-reasoned. Kudos, sir.


But Apple's founder didn't have an MBA. You know, the guy that came back in 1997 and brought the company back from the brink of bankruptcy after its collapse under the leadership of John Sculley, MBA.


So I wonder what positions the good MBAs tend to be in vs the bad, for example which kind are more likely to be big company CxOs.


Either it is a meritocracy for them, in which case we can draw some really nasty conclusions using people like Sculley, or they aren't operating in a meritocracy, which jibes with the other accusations leveled at them....


Did I miss HP turning around and becoming an awesome company again? Has Apple been doing as well as it did under Jobs? Do you think it still will be in 5 years?


Did I miss HP turning around and becoming an awesome company again?

Apparently. Have you checked the stock market lately?


Yeah, we're in the middle of a massive bull run which has swelled every company in my portfolio like a balloon. And the correlation between stock price and awesome company in the short run is really really tenuous. Many of the less savory leaders out there have a penchant for making short term profits look awesome by killing off investment in long term prospects, and the market eats it up.


Perhaps all stereotypes are harmful and should be avoided.


SV technorati pretentiousness is indeed awful sometimes, and stereotypical MBAs do not have a monopoly on brain-dead business ideas.

But last I checked SV and California in general pumped out more innovation than the rest of the country combined, and is preparing to colonize another planet and electrify transport.

There is obviously something fundamental in the "California mindset" that differs starkly from the majority of the rest of the world, and that ought to be understood. I think it has something to do with reasoning from first principles, and with an expansive risk-tolerant business culture.

That risk tolerance and general liberal thinking is going to generate a fair amount of silly stuff, but it's also going to permit genius.

I forget who said this, but I recall reading it somewhere: "the further West you go, the further into the future you go." I'd say it's not true literally but certainly philosophically.


But last I checked SV and California in general pumped out more innovation than the rest of the country combined, and is preparing to colonize another planet and electrify transport.

This is what people refer to as a 'reality distortion field'.


Reality distortion fields don't emit actual products that work.

It's a field alright, and it does have some reality-distorting side effects, but there is actually a "there" there.

If I had to boil it down I'd say it's this:

(1) California believes in the future. People in California (stereotypically) think about what they can do tomorrow, building on what they have today. Everyone else thinks about what they already have today and fears losing it tomorrow.

(2) California reasons from first principles more than elsewhere. Everyone else looks at what everyone else is doing and tries to superficially copy what looks like it works, or looks to the past. Ideas from the past will only get you what was done in the past, and other peoples' ideas will not make you competitive since they're everyone else's advantages. (Assuming you succeed in copying them at all, and don't just end up cargo-culting them.)

I quoted this elsewhere in the thread. It bears repetition.

"For the engine which drives Enterprise is not Thrift, but Profit." - John Maynard Keynes

California runs on this.


Reality distortion fields don't emit actual products that work.

I'm not saying that a lot of great stuff doesn't come out of California/SV (just like Apple, for whom the 'reality distortion field' thing was first coined, also produces great products). The casualness with which you suppose that California alone is producing more innovation than the entire rest of the country suggests that you're overwhelmingly focused on a very, very specific definition of 'innovation'.


I was talking about an idea more than a place, and should have made that clear.

The places where innovation comes from are going to be places that believe in the future, are open to experimentation and risk, and reason from first principles instead of cargo-cultism. It just seems like California (and SV in particular) has an above-average amount of these things.


You're making sweeping statements evidencing the superiority of one area in the world over any other place, and all you did was make a comment full of fluff and grandeur.

There is no "California mindset" just like there's no "Wall Street mindset" in New York. You're just being sectionalist.


Have you actually been to and/or lived in California?

[A little background about me: I was born in Cali, left when I was 6 months old, returned when I was in my late 20s, and stayed there for 5 years (San Diego).]

The reason I ask you this is that your viewpoint is why I went to Cali. But when I got there I found that the idea of Cali is very different from the reality of Cali. I have lived in a variety of places around the world, and I must honestly say that California was one of the worst...

Re your first point: (1) California believes in the future. I would say NO, California believes in itself. They think (sometimes) that they are the future, but most often they build on what they have today because they fear losing it. Not because of what they can do tomorrow. A lot of what they do is about preserving the image of what they are, not about progress or the future. And as for the people (stereotypically)... Rednecks, gangbangers, surfer-dudes... There are a lot of them.

Your second quote: (2) California reasons from first principles more than elsewhere. Everyone else looks at what everyone else is doing and tries to superficially copy what looks like it works...

I think here you have it 50-50. Yes, some new things come out of Cali, but on the whole they copy (it's just that sometimes the copy is way better than the original). Except the idea of copying is a little changed: they don't copy exactly, they take an idea and 'shift' it. First it was a shift from the real world to online. At the moment it's a shift from many to individual (online).

And your quote does need repeating, as it sums up Cali well, from gold rush to dotcom boom -

"For the engine which drives Enterprise is not Thrift, but Profit." - John Maynard Keynes

California runs on this.

Cali is about profit, nothing more, nothing less.


Hi api. I'm interested in your view about reasoning from first principles. Is there more behind that suggestion? And do you have evidence of it?


> electrify transport.

Oh good, maybe you can show Europeans and Asians how to get around their countries without internal combustion engines.


How much innovation has come out of San Diego, LA, Fresno or Sacramento? How many tech workers moved to silicon valley from somewhere else after college?

I'm not sure 'California mindset' is the right way of framing the innovation inside silicon valley.


San Diego is something of a Silicon Valley for genetics and other life sciences, LA is/was something of a hub for aerospace and arguably movie technology (it's more diversified now, but does have a Silicon Beach movement).

I'm not sure what objective measurement one could use to really determine "innovation", maybe economic activity from industries or products established within the last x years?


And Qualcomm is headquartered in San Diego.


When we say a child is talented, we usually mean they are a good learner. When we say an adult is talented, we mean they are a good doer. But these are actually two different axes.


I don't really believe there is such a great divide. In all probability, most of those who are "geniuses" in childhood will continue to perform better when they become adults, in all fields, including "doing" things. Sure there are some prodigies who turn out to be quite useless in real-world scenarios, but those are really few, IMO.


Unfortunately I have encountered the opposite. I have worked with plenty of folks who were good at learning and very poor at getting things done.

Furthermore, I think that constantly being praised during childhood for being smart did them harm as it made them underestimate the value of tenacity.


That's probably too much of a stretch. In fact I believe conventional wisdom states the opposite: those who get high scores on tests will likely be technical geniuses who understand how things work and do the hands-on stuff, while those who manage are likely those who know how to talk but don't know how to study. An MBA is more of a place to socialize, not to "study" things, right?


"MBA is more of a place to socialize, not to 'study' things, right?"

What? Go take a financial statements analysis class and then tell me how MBAs don't "study" things. There are certainly ways to coast through an MBA program, as there are ways to coast through many things. Simply because MBAs don't have to go through the mathematical rigor that an engineer or physicist does does not mean there is NO rigor or requirements necessary to attain an MBA. There are many vacuous, over-confident, arrogant MBAs out there, but that is part of the human condition not simply inherent in those who go earn MBAs.

This broad (and wildly inaccurate) characterization of hundreds of thousands of people is mind-boggling.


Not to sound arrogant, but I took a few MBA courses, including Enterprise Finance and Operations Management. There is some rigor to the courses, and I found the finance class in particular interesting and applicable. However, the level of difficulty was at least one order of magnitude below that of the EE and physics courses I had taken. Aside from some term memorization the math was really easy, and "coasting through" was definitely possible if you're used to stuff like semiconductor physics.


I agree - an MBA is not engineering, and many in my class would have been unable at that stage to finish third-year engineering mathematics. A few of us had engineering backgrounds though, and some very deep at that. However there is (usually) a deliberate effort to overload students with volumes of work, so that they can figure out how the real world works and work in teams. Also in my MBA course students had a minimum of quantitative work they had to do, and if they wanted they could dive deeply into the more serious mathematics in certain courses. It was part of the expectations that people who were more technical or more experienced in one area would help out those who had less experience, and vice versa.

Engineering mathematics is not for everyone, and indeed not required for every MBA graduate, but it certainly helps for the more quantitative courses. And finance at one school can differ wildly from finance at other schools. Look for the professors. Look at the courses people take. But also look at the culture. Some universities are focussed on the money, others, like Yale, on much wider impact in business and public service.


I can't believe the egocentrism so many engineers demonstrate when talking about MBAs. It's like the popular kids vs the nerds in high school. It's...sad.


Are you implying that the tables have turned?


No, not at all. Just drawing a comparison.


There's a cohort of people who do particularly badly on academic measures (especially the standardized-testing sort that is in such fashion now) yet are exceptionally good at what they do. I haven't seen a statistical study, but there are many, many anecdotal accounts of scientists, inventors, and businessmen who did poorly in academic settings but were brilliant.


"I think that the people who are best at building things that people want don't want to get an MBA."

I know MBA school will be the first thing I do in hell.


GMAFB, you're just showing a personal hatred and stereotypical perspective, stop being so melodramatic. Incredible things have come from MBAs. You're being ignorant.


> Incredible things have come from MBAs.

You're all talk and no substance; let's have some examples.


The two founders of http://flightcaster.com (tech startup) both have MBAs.

Matt Soldo, a serial entrepreneur who has worked in both technology and management and founded his own startups, has an MBA. He currently works at Heroku.


Incredible. Truly. I would have never guessed somebody with an MBA might start a company or work in management. Somebody with an MBA who also is capable of technical things? Stunning. You have opened my eyes; clearly getting an MBA allows one to create incredible things. These are incredible.

Which one of us is wearing the SV blinders again? What have MBAs done, in the capacity of MBAs, that is incredible? Founding or working at some tech startup is not incredible.


I'm not wearing any blinders. That's the point. I have no loyalties or hatred towards anyone just because they have letters after their names.

My examples are meant to show you that your "Us and Them" mentality with engineers and MBAs is just false and doesn't reflect reality, no matter how many anecdotes people throw around on HN.

You asked for examples. I gave them. Now you're attacking me instead of my point. I've proven you wrong, bottom line. You just can't see past your anti-MBA bias.


Your best defense of MBAs is that some are capable of doing other things as well. You may as well tell me that dogfood is a perfectly reasonable meal for humans because sometimes you feed your dog your leftovers: "Dogfood makes great people food. See, here is some dogfood that also happens to be people food" "MBAs can create just as well as technical people. See, here are some MBAs who are technical people as well" "Chiropractors aren't all quacks. See, here is a chiropractor who is also a licensed physical therapist."

I can only conclude that it is as I suspected; you are full of hot air.


The root cause of this is company ownership. Even if Page wants to follow an innovative approach, he might not be able to convince the board and the big shareholders to invest long term, because they want to see their shares appreciate.

If you want to keep innovating, the company need to stay nimble, with the owner(s) only interested in innovation. I don't know, maybe that's why the startup eco system works.

Another model that seems to work as well is "dual roles". For example, Jeff Bezos lets the MBAs optimize the hell out of Amazon, while he can play with Blue Origin in his spare time and focus on more innovative technology ideas. Again, this is possible because Bezos probably has far more influence at Amazon than Page at Google.


Unless something has changed that I'm not aware of, Google has a share structure that gives Brin and Page effectively full control through shares in a separate class that has 10 times the voting power of the A shares that were offered in the IPO.

So if Page can't convince "the board and the big shareholders" I believe it means he can't convince Brin, not some horde of faceless MBAs or institutional investors.

Your assumption about Bezos's relative influence over Amazon seems unlikely, as Bezos's Amazon holdings and the lack of a similar share structure mean Bezos has far less voting power at Amazon than Page has at Google. Last I heard, Bezos held less than 20% of the voting power at Amazon.
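The arithmetic behind a dual-class structure is worth spelling out; a quick sketch with made-up share counts (the 10x multiplier is from the structure described above, everything else is hypothetical):

```python
# How a dual-class share structure concentrates voting power.
# Class B shares carry 10 votes each, Class A shares 1 vote each.
# All share counts below are hypothetical, purely for illustration.

def voting_fraction(b_held: float, a_held: float,
                    b_total: float, a_total: float) -> float:
    """Fraction of total votes controlled by a given holder."""
    votes_held = 10 * b_held + a_held
    votes_total = 10 * b_total + a_total
    return votes_held / votes_total

# Say founders hold 60M of 70M Class B shares and no Class A,
# while 260M Class A shares trade publicly:
frac = voting_fraction(b_held=60e6, a_held=0,
                       b_total=70e6, a_total=260e6)
print(f"{frac:.1%}")  # 62.5% of votes from ~18% of all shares
```

With only a simple majority of votes needed, founders holding a minority of the economic interest can still outvote every institutional investor combined -- which is why "convincing the board" reduces to convincing the other founder.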


Not surprised...I am hearing rumors that the "do no evil" policy is next to go. What then?

It's the JDs and MBAs, IMO.


From a management perspective, you're forgetting about the obvious: the less than productive people. The benefit of giving employees no- or few-strings attached time to work on whatever is clear. Things like GMail, Apple 1, etc. The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.

When your ability to make money hand over fist starts to get challenged, it is difficult to continue giving people free rein, especially when your competitors focus on cost, cost, cost. HP was a great place that made oodles of money selling tank-like PCs (among many other things) that cost $3k. But then Dell came along, invested $0 in R&D, and started cleaning HP's PC clock. Bell Labs was engineer/scientist nirvana, then the AT&T monopoly went away.

The other issue in big companies is that as people with direct connections to the business start losing control, the bureaucrats (well intentioned as they are) start moving in, and they worry about things that the engineers/etc didn't really care about. They are passionate about you using the appropriate powerpoint template, and will speak to your supervisor if you don't comply!


Re: productive people. We have this notion of the 10x producer, but the real problem is high variability within the same person's work.

I know productivity superstars, but if you catch them on the wrong day or at the wrong time, they look like lazy bums. Over a year, they're extremely productive. On any given day, they may look like they're wasting time.

This property of conceptual work is at the core of a lot of culture clash with people who have more predictable work-comes-in, work-product-goes-out flows.


I know productivity superstars, but if you catch them on the wrong day or at the wrong time, they look like lazy bums.

I've encountered a particularly nasty variation of this since Agile became a Thing, where anyone who isn't writing code or running tests or showing up in the commit logs every few minutes obviously isn't working. The idea that it might be better to step back for an hour, or a day, or even a month, to think things through properly and maybe do some throwaway prototyping before you start pushing code to production, doesn't even seem to be accepted as credible any more in some quarters.


Agile / Scrum is the return of the K-LOC:

http://en.wikipedia.org/wiki/Source_lines_of_code

People don't recognize it as such because the LOC -- lines of code -- has an added time dimension now and a different name. In Agile / Scrum it's K-(issue/day) where issues take the place of raw lines and the metric is applied per unit of sprint duration.

But same idea, and same fallacy. It results in gobs of ugly Rube Goldberg machine code that's slow for fundamental O(crap) reasons and bug-ridden.
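The gaming dynamic is the same in both metrics; a toy sketch (hypothetical numbers) of why issues-per-sprint is as gameable as raw LOC:

```python
# Toy illustration of why "issues closed per unit of sprint time"
# inherits the K-LOC fallacy: splitting the same work into smaller
# tickets inflates the metric without producing anything extra.
# Numbers are hypothetical.

def issues_per_day(issues_closed: int, sprint_days: int) -> float:
    """Velocity as described above: issues divided by sprint duration."""
    return issues_closed / sprint_days

# One engineer ships a feature as a single well-scoped issue...
honest = issues_per_day(issues_closed=1, sprint_days=10)

# ...another splits the identical feature into ten micro-tickets.
gamed = issues_per_day(issues_closed=10, sprint_days=10)

print(honest)  # 0.1
print(gamed)   # 1.0 -- "10x more productive" for the same output
```

Just as padding code inflated K-LOC, padding the ticket tracker inflates velocity, and both reward exactly the Rube Goldberg output described above.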


"'Agile.' You keep using that word..."

An org that calls what it does "agile" or "scrum" and then proceeds to accumulate technical debt like the Titanic taking on water is lying to itself about what it's doing. Piling up technical debt is a textbook Agile(flavor) failure, full stop.

What it sounds like happened in the case(s) you describe is that someone with dev-management preconceptions wore "Agile" like sheep's clothing on a wolf and proceeded to dole out the same old ad-hoc nonsense. Nonsense as in being "efficient" (high KLOCs/metrics, butts in seats, long hours, etc.) and not caring about being "effective" (working on the right problem vs. a problem, necessary understanding of the company and customer needs, working smarter vs. simply harder, etc.).

I've seen this attitude a number of times, e.g. in program managers with history at big, established s/w companies. They learned a certain way of working, but then talk "Agile" as the trendy buzzword. Unfortunately, some of these folks never gained an understanding of the tools that Agile brings to the table, or of when and how to apply (or not apply) them to a situation.


I had a manager who believed that such metrics were the only true measure of performance. This same manager also believed that CUDA on a VM (which is itself on a server hosting numerous other VMs, and the bare-bones minimum graphics card--so no support for virtualized usage of CUDA) was a good suggestion for improving performance, that running a debugger on a specially compiled Apache webserver in the production environment would be a great way to troubleshoot performance issues, and that having an 8-12 week sprint/"rolling release cycle" was "Agile" as long as we called it that.

So, yeah, some people just don't get it. At all. And in my experience management is definitely a place where one is more likely to find such people.


1,000x yes! My productivity comes in intense bursts of effort, with quiescent periods of research and reflection between them (or what looks like goofing off to rigidly process-oriented sorts). Combine that variability with mandatory daily scrum meetings and it makes me want to figuratively slit my wrists.


I'll even go a step further. I have a reputation as one of those people who has an ability to get things done at an incredible pace, but there's definitely days where I'm flat-out procrastinating and being lazy.

For me, personally, and I suspect other people like me, it comes down to an ability to perform remarkably well under pressure, along with a lack of ability to perform well when the pressure is off. If there's no urgency to what I need to do, I find it very hard to commit myself to doing something.


It can also be an anxiety thing, if one has generalized anxiety disorder. The anxiety of doing something hard or with higher uncertainty (a challenging software problem) is high, and it takes the even greater anxiety of the high pressure environment to overrule it.

Worth investigating, because it seems like ADD, but it is not. ADHD drugs in this case can be counter-productive as they tend to increase anxiety.

There are techniques to cope with this that can be quite effective.


It's funny, I made a comment on reddit a couple of weeks ago describing myself in almost the exact same way in a discussion about dealing with ADHD as an adult:

http://www.reddit.com/r/science/comments/1j0ml8/psychopaths_...


I am the exact same way, and was recently diagnosed ADHD.


Over the years I've come to realize that what once felt like procrastination is actually my brain working out the solution to a problem, and it quite literally prevents me from reducing the solution to code until it's damned well ready for me to do. So I just go with it now. The only thing that seems to speed this process up is intense cardiovascular exercise, which is probably a good idea on its own anyway (but I imagine it seems awfully odd from the outside).

As for ADHD, I probably have a little of that going on, with a sprinkling of Asperger symptoms, but not seemingly in a sufficient way to have any significant effect on my life and/or overall productivity (except, that is, when it comes to %$!^ing daily scrum updates, which drive me bonkers - once a week would be fine, once a day is ridiculous).


Today I spent 5 hours staring at, stepping through some callback-heavy code, reading HN, browsing the web and about an hour to write, test and deploy about 20 lines of code fixing a really nasty race condition. And I'm still not completely sure things are really fixed. I might have spent even longer had it not become apparent that we have just the two options:

1. band-aid around the problem

2. complete rewrite of a large portion of the website


I'm glad that I am not the only one. I had this impression that I'm a bad developer because I'm not consistent.

During my internship, I did quite a lot of work but in similar manner. Implement an interesting feature, then a week of laziness. Another feature and laziness.

However the other replies to your comment also make me afraid about some psychological problem. I think I need to visit a doctor! :)


The difference between lazy people and people with a disorder is that lazy people don't feel bad about not getting work done.


I'll go even further and state that the problem is using MBAs to formulate how to manage creative sorts cough agile cough cargo-cult coding cough. If the area under your productivity curve is greater than everyone else's around you, I don't give a you-know-what about the shape of the curve; you're doing just fine.

To be fair to agile though, I can see situations where it can work. The problem is that many of its adherents seem to see agile as a hammer and all software engineering as various forms of nails.


> The problem is that many of its adherents seem to see agile as a hammer and all software engineering as various forms of nails.

Agile isn't a methodology, but a metamethodology -- or, in terms of the metaphor, it isn't a hammer, it is a set of guidelines to use in selecting tools.

Scrum is a hammer, but Scrum ≠ Agile. Often rigorous adherence to particular methodologies (usually Scrum) gets misidentified as being "Agile", but rigorous adherence to a particular methodology is not only not the same as Agile; it is directly contrary to Agile principles (in particular, it's a direct violation of the first value from the Agile Manifesto, "Individuals and interactions over processes and tools").


Except that every incarnation of Agile that I've encountered is a rigid implementation of scrum plus sprint planning plus Jira. And this quickly becomes Waterfall with scrum. And it really sucks.

While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments, a lot in fact, so methinks the agile fans ought to take some ownership of this recurring problem and either find a solution or stop shoving agile down everyone's throats.

Finally, I'd take a 30% pay cut to escape agile to do exactly the same work I'm doing right now minus scrum. I'd get more work done and I'd feel better about it because I would no longer feel like I have the engineering equivalent of an ankle monitor attached to me. That's more than worth the loss of compensation to me.


> Except that every incarnation of Agile that I've encountered is a rigid implementation of scrum plus sprint planning plus Jira.

This is not a problem with Agile; anything that works anywhere will lead to imitations that steal the name and attempt to extract some simple recipe from the "lessons" of that thing that worked.

> While I agree that this is against the agile manifesto, that's no excuse. This is how it ends up getting implemented in large corporate environments

No, it's how something that is nothing like Agile gets implemented in large corporate environments and called Agile.

Fundamentally, this is a symptom of a broader leadership-culture issue: environments where the authority structure and culture give decision-making authority within a domain to people who neither know nor care to know about that domain. It's certainly beyond the power of people external to the affected organizations, however interested they are in particular approaches to problems in any given domain (software development or otherwise), to do much about it. It is, however, a pervasive problem in large bureaucracies (not only corporate ones).


If the Agile Manifesto were just that and little more, I'd agree with you...

But instead it has become an enormous metastasizing moneymaker for minting Certified Scrum Masters, Certified Scrum Product Owners, Agile Certified Practitioners, and all sorts of other Agile titles for $1000+ a pop. So I guess we're going to have to disagree because I think this means a little ownership of the issues that arise in the practice resulting from that training is appropriate here. Because what I'm hearing from you now sounds a lot like the usual "You're doing it wrong!" refrain which accomplishes precisely nothing.


> But instead it has become an enormous metastasizing moneymaker for minting Certified Scrum Masters, Certified Scrum Product Owners, Agile Certified Practitioners, and all sorts of other Agile titles for $1000+ a pop.

As long as there are people looking for packaged solutions and willing to pay top dollar for them, there will be people willing to sell them to you under any name that you ask.

But if the name on the tin refers to something diametrically opposed to that kind of packaged-solution approach, well, it's probably not going to be an accurate label.


Me too. My productivity curve looks very rocky, with very tall spikes -- huge amounts of very hard stuff done in hours -- and days where nothing happens. There seems to be a longer-duration cycle too. I have super-productive weeks and ho-hum weeks. Overall the average is decently high.


I will tell you the secret to exhibiting consistent productivity, taught to me long ago by a master who shall remain nameless.

Hold back some of the extra work on your killer productivity days and keep it in reserve. On those off days, reach into the "bank" and push some of those changes.

And now you are consistent.


I know what you mean - my productivity definitely goes in cycles. I might create a great deal in 2 weeks, and then spend the next 2 weeks being unable to get much done.

But I learned long ago not to worry about it. The down cycles inevitably pass and then I'm producing again.

(no, I'm not bipolar)


The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.

Isn't that the motivation for the whole 20% idea, though? If you can produce 1 GMail for every 1,000 ideas, it probably doesn't matter if the other 999 didn't produce much of tangible value, because the programme almost certainly paid for itself just on that one success anyway. Meanwhile, you still get to enjoy the morale benefits for all 1,000 staff for the other 80% of their time when they are working on assigned tasks.
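The arithmetic behind that argument is easy to sketch. Here is a back-of-the-envelope expected-value calculation; every number in it (headcount, salary, hit rate, payoff) is an assumption invented purely for illustration, not an actual figure from Google:

```python
# Toy expected-value model of a 20%-time programme.
# All inputs are hypothetical, for illustration only.

engineers = 1000                # staff participating in the programme
cost_per_engineer = 200_000     # assumed fully loaded annual cost, USD
programme_cost = 0.20 * engineers * cost_per_engineer  # the 20% slice

hit_rate = 1 / 1000             # one GMail-class success per 1,000 ideas
ideas_per_engineer = 1          # say each engineer pursues one idea a year
payoff_of_hit = 1_000_000_000   # assumed value of a GMail-class product

expected_hits = engineers * ideas_per_engineer * hit_rate
expected_value = expected_hits * payoff_of_hit - programme_cost
print(f"cost: ${programme_cost:,.0f}, expected value: ${expected_value:,.0f}")
```

Even with a 1-in-1,000 hit rate, one GMail-scale payoff dwarfs the programme's cost in this toy model; the conclusion is, of course, sensitive to the assumed payoff and hit rate.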


I agree. But as an organization grows, the points of view multiply. People focus on their niche, and don't necessarily get the big picture. Also, the political environment in a company changes as the company grows, and tends not to reward speculative activity well. If you invent GMail, your boss becomes VP or gets more people/prestige/money/etc. If you invent Orkut, your boss gets to talk about how popular it is in Brazil.

One of the bad side effects of innovative, rapidly developed things is that the support org gets left behind. When the organization is small, this isn't a huge problem. When you're a huge company, the Exec VP of Support has an incentive structure to deliver better support. That VP will lobby for more controlled changes and slower product release cycles.

I've worked in places where most of the organization would be angry if some team invented GMail. They didn't welcome disruption.


> From a management perspective, you're forgetting about the obvious: the less than productive people. The benefit of giving employees no- or few-strings attached time to work on whatever is clear. Things like GMail, Apple 1, etc. The "cost" is that people are doing things that don't necessarily contribute to the bottom line -- for every GMail, there are 1,000 low-impact ideas.

Actually, there are other reasons for non-rigid 20% time (that is, the 20% is a target with considerable variation in the short term); it means that resources across the company and in any team aren't routinely fully committed to critical tasks, so surging to meet an emergent need doesn't mean dropping the ball somewhere else. When routine utilization gets too high, this rapidly becomes a very significant effect.

It's also a way for people to gain experience and understanding, avoid developing tunnel vision around the way things are done on their existing primary products, and so become more efficient.


In the case of Google's 20% time, the numbers are easy to quantify: how much does 20% of the engineering staff's time cost? How much money has the output of 20% time earned? It might be that Google thought the work was too unfocused, and that outside of some pretty obvious early big hits, fairly little else of value was coming out of it year after year.

The best way to deal with it, I think, is to have people submit proposals, pick the best, then take those people out of regular work for a few months and put them into a "lab" where they come up with an MVP. If it looks good at that stage, invest more in the idea.

but...oh yeah...Google got rid of "Labs".


When your ability to make money hand over fist starts to get challenged, it is difficult to continue giving people free rein, especially when your competitors focus on cost, cost, cost.

That kind of thinking sounds like it would lead to a "race to the bottom" to me. If everybody is obsessing over - and competing on - a quest to cut costs the most and the fastest, I really don't see how anybody is going to benefit from that in the long run.

Or to put it another way: "You can't cost cut your way to a growing company".

Of course, I'm not saying there's never a time when circumstances change and some cost-cutting might be called for. But cost-cutting is a tactic, IMO, and not a strategy. Innovation, on the other hand, and committing to the activities that lead to more and better innovation, is a strategy.

My feeling: If you want to grow, you have to innovate. So if you reasonably believe that "20% time" is an approach that leads to useful innovations, then cutting it is a short-sighted, and arguably mistaken, move.


Kind of a strange point on cost cutting, but how does this square with the "paying for talent" argument when hiring at the executive level vs. the "race to the bottom" point of view for employees? I've seen this vomitous 180-degree flip of position coming from the same company in a short timespan.

The top gets paid an exponential figure for 'talent' but is ignored by any cost-cutting measure... yet any purely non-financial 'talent' sets you on a crash-course race toward minimum wage: "Oh, you did awesome this year, but times are lean... here is a 2% raise."

Maybe I am more irked now because the company I work for just got bought by a huge company and R&D staff was slashed in half and "No more research." became a mandate


"Or to put it another way: "You can't cost cut your way to a growing company"."

"For the engine which drives Enterprise is not Thrift, but Profit." - John Maynard Keynes


And then consider that Google's hiring standards seem to have decreased as they need a lot of people and not just the best. Thus 20% of more average people's time is going to be a lot less valuable on average.


While their standards have definitely dropped because they hired me in 2011, they then proceeded to assign hirees like myself to all the work no one else wanted to do around the googleplex.

I suspect this was one of Larry's failed experiments because a whole bunch of the people I met were let go within a year. I personally fled the place after a couple months of trying and failing to find work remotely suitable to my skillset, which was, ironically, what led them to recruit me in the first place.

Great perks, lousy work.


If you don't mind my asking, what role were you hired for? Were you part of an acquisition?

I also joined in 2011 (SWE, normal hiring process) and my experience has been nearly the opposite of yours.


I've gone on about this elsewhere (search my comments), but it comes down to the utter stupidity of blind allocation for experienced engineers. There were projects that literally needed my exact skill set, and engineers on those teams did their best to try and open up a position for me on them, but middle-level management and HR blocked all their efforts.

I could have stayed a year and hoped for the best, but by then I suspect I would have been so embittered that I would have become the embodiment of a bad culture fit so I left before that happened because I had a great opportunity dropped right into my lap.

Now I suspect I am blacklisted at Google because a few people have tried to get me rehired now that such openings exist and they were immediately shut down by HR.

Whatever...


I'm curious - were the projects you wanted to transfer to in niche technical areas or niche products?

It's very easy at Google to transfer from niche to core. If we (in Search) see someone languishing in another part of the company with skills that we want and they want to work with us, we make the transfer happen, and there's nothing HR or another manager can do about it. It's much harder to transfer from core to niche, and it usually requires a solid track record of sustained performance in your original assignment. I know a number of people that transfer from Search to Google-X after 4-5 years, but I know of virtually nobody that can make that transfer after 1-2. (Basically, the company wants to "make back" their initial investment in you before they'll let you work on speculative projects that may not show a return.)

Arguably some of the niche projects would be better done as startups - they don't face the constraints of working at a large, very visible multinational where they don't get resources or attention from higher management - but the entrepreneurial spirit isn't quite dead at Google.


I was in a core technical area trying to transfer into another core technical area in both cases (really, they needed me elsewhere, I could have made a genuine difference, and the blind allocation process completely hosed that up).

I also didn't know upfront that little tidbit that whatever assignment I took, I would be stuck there for 4-5 years. If I had known that, I would not have accepted my initial assignment, nor any other, until it was something I knew I'd remotely enjoy, and I'd probably still be there today.

So by relentlessly focusing on short-term ROI, Google lost 100% with me.

If they're going to insist on blind allocation, then they ought to not be surprised when it doesn't work out. But I gather the heuristic is to assume these cases are a 100% indicator of non-googliness.

Again, whatever...


Have you considered that there might've been an issue that you didn't know about?

For me, I was with you until I read your suggestion to "read your comments" to learn about your experience.


Does that really matter?

From my perspective, Google recruited me aggressively away from a long-term gig where I had an absolutely stellar reputation. I uprooted my career with the mistaken belief that they wouldn't do this unless they had a reasonably clear idea what to do with me. Apparently they didn't and I'm not the only one who had an experience like that.

Sure, I could have said no and I take full responsibility for saying yes and for everything that happened as a result of doing so. And once I realized that Google was going to be of zero help in fixing what I think was a minor allocation error, I once again took responsibility to do what it took to fix the problem myself: I left.

So here's why I think they wouldn't help: the team onto which I was placed was losing an engineer a month. Every time they got a noogler to say yes (3 times during my short stay), another team would intercept them before they got to their first day on the job. The work was dreadful and tedious and the manager even seemed to hate running the team. And the only reason I said yes was because I had this naive faith that Google wouldn't do something as seemingly daft as blind allocation unless they had a pretty good idea how to make it work. My bad. But not my problem. High level people should have been fixing the root cause here instead of continually throwing nooglers into the pit and expecting a miracle.


I don't think you have a good characterization of this:

What Google has learned over the last few years is that the people they were hiring as "the best" weren't necessarily getting the job done any better than employees from more "average" backgrounds...who happen to be much more readily available in the job market.


I don't think Google's hiring standards have decreased, they've just shifted their focus away from inaccurate signals such as where you went to University and what your GPA was there. One could argue that Google's hiring standards have actually increased, as they're now hiring people who can demonstrate an ability to perform instead of people who were able to get a pretty piece of paper from Stanford.


Full disclosure: I'm a rising senior at Stanford, so maybe your comment is just irritating to me on a personal level. That said, the Stanford CS department is objectively very good; further, judging by the number of people I know who have abandoned or failed out of CS here, plenty of people wouldn't be able to complete the coursework for the undergrad degree even if they were all enrolled. I agree that hiring people based on their school is bad practice (see: people failing out of CS), but I don't think it's fair to call any engineering degree a "pretty piece of paper." I've put too much work into mine for that.


Ability to complete coursework is neither necessary nor sufficient for being a good engineer in a company. Sorry. I've seen too many people flounder around, never completing things, making inane suggestions, and so on, all while talking great theory. Of course, I've seen the opposite too, and Stanford is a very good school; I don't think anyone would deny that.

And that shouldn't be surprising. Look at brilliant physicists. Most end up on either the theoretical or the experimental side, and are often quite bad at the other. Likewise, theorem-heavy CS has its place, but getting through a program like that doesn't mean you can write a for loop (I've interviewed Stanford grads who fumbled and failed through that), design readable, robust software, push through a sea of choices and make effective, near-optimal decisions (the whole SW life cycle is an n-dimensional optimization problem), get along with peers, and so on.

There is a huge cachet attached to degrees from certain institutions that really isn't deserved, in my opinion. In that sense the paper is "pretty". It's not a slam on the effort anyone at the school is undoubtedly making, but on the reverence with which the degree is regarded.


Except that's really all it is. You might be fancy with your degree for a short period of time and land some interviews others might not, but it soon all goes out the window. The second you have some sort of industry experience, where you went to school and how you did there doesn't really matter.


Your "pretty piece of paper" is essentially like getting your drivers license. It allows you get behind the wheel, but it makes no guarantees you'll be any good at driving. In the end it really doesn't matter much which DMV you go to.


And hence Google's slow descent into becoming what Microsoft is today.


Second that. We engineers have our own perspectives but the management has theirs and it would be unfair to say ours are unconditionally better. If the management thinks in that way there must be a reason for it, and then constructive communication + concessions on both sides would be the real way to achieving a better end.


If someone is "less than productive" you let them go. If they could use some improvement, encourage them to spend their 20% non-core work time learning and improving their basic abilities. Better to grow a less-productive employee who could improve than to roll the dice and try someone else, or expect anything to change while piling on a full workload of critical tasks.


> Bell Labs was engineer/scientist nirvana, then the AT&T monopoly went away.

And then the MBAs had Lucent finance customer purchases and count the promissory notes as income ...


> proven short-term penny-wise/pound-foolish idiocy of MBA-think.

This is a completely unjust attack. Quite frankly, I have no idea what "MBA-think" even is. You make an assumption that an MBA making a bad decision is making a bad decision because they have an MBA. That doesn't pass the test. Would the same person make the same decision even without the MBA?

I always seem to get sensitive over the general MBA hate expressed at HN. As someone who spent years in web development before getting an MBA, I completely fail to connect with any of the insults typically thrown at MBAs on here. I certainly don't recall a class where we learned it's best to destroy 20% time. I don't recall ever being indoctrinated into the type of business thinking that is negatively attributed to MBAs. I recall getting an education on things like finance, economics, marketing, strategy, operations, etc. that weren't covered in my undergraduate technical degree.

I appreciate the developer-oriented aspects of software startups and Hacker News and I'm certain that many people have encountered assholes who happen to hold MBA degrees. I'm certain the degree attracts certain segments, I clearly had some as classmates, but attributing every business decision you disagree with as MBA-think is not a good approach.

This just seems like taking shots at a fuzzy construct for the sake of taking shots, and I'm not sure what value it adds to the discussion. I'd rather see legitimate reasons why removing 20% time is a bad idea for Google's operations.


I say this as somebody who suffered through getting a professional management degree as well:

I think the observed pattern is that we repeatedly see startups get kicked off, grow like wildfire, then turned into an empty shell of their former selves once the "professional management team" is brought in as they promptly kill off all of the reasons the company was growing in the first place in favor of short-term (bonus making) metrics that are almost never good for long-term growth and survivability.

It repeats over and over and over again, and it's especially frustrating when you're on the inside, watching outsiders whose only qualification is a top-10 MBA come in as VPs, destroy unbelievably large numbers of man-hours of work, and turn thriving companies into joyless bean-counting husks.

I remember specific moments during my own management education where I stood back and realized what a smoking pile of self-serving bullshit and handwaving the professional management industry had become. Most of what we were studying was full of vague, meaningless, but impressive-sounding aphorisms and pretend sciency/engineeringy talk. Management methods were described like bold scientific experiments but were constructed on the flimsiest methodology one could come up with. I felt like the books we were reading were consistently written by flimflam men with no consistent, measurable success, who wrote endlessly about management theory with the structure of those late-night get-rich infomercials where they talk endlessly about how they're going to show you how to achieve success but then never actually tell you. Hundreds upon hundreds of pages of it.

I've since tried to purge most of my management education from me. The only thing it really got me used to was a sense of comfort with sitting in front of spreadsheets all day, moving around millions of dollars, and writing status reports.


Matthew Stewart's "the management myth" runs with this thesis. Google it - the Atlantic article is a great read. He explicitly links the Tom Peters types with televangelists and self help infomercial pitchmen.


I remember this article when it came out and thinking how well it mirrored my own experience. I almost feel like I just summarized his 5 pages ;)

The Mayo lighting experiment is an awesome and sadly typical example of the kind of shit-poor "research" that goes into the field, with premises, methodologies, and outcomes so flimsy a six-year-old could poke holes in them. Yet these kinds of "studies" are published and taken as great advances and contributions to "management science".

Before you know it, based on one or two of these "studies", great fads sweep the ranks of professional management and we end up with bizarre and counter-productive management initiatives. When those plans inevitably fail, some consulting firm is brought in and recommends a house cleaning, and new management is brought in whose only worth is that they're more up to date on the latest fads. They reshape the company along those lines... generating lots and lots of activity (reorg after reorg after reorg) but no actual value.


Part of the problem is that it's immensely difficult to do science with people.

The hardest of the social sciences will create experiments where they try to study a single variable, like a person's reaction to a specific set of stimuli or decision making under a certain specific set of conditions. That way you can at least pretend to control for things.

It is simply impossible to do this in management. There are millions of variables. You can't know them all, control them, or do multiple runs of an experiment.

One of the more recent management fads has been complex systems simulation -- trying to do computer simulations of difficult managerial problems. This can work for logistics, routing, and mechanistic process optimization, but you can't reduce human beings to "agents" in a model.


I wonder what would be good professional management that is as close to general purpose as the old MBAs.


I'm of the opinion that the entire discipline is in need of a reboot. History has shown that most of the research into the management science bits is of an extremely poor quality and the field is in such a state of denial that they simply can't accept this.

One of the major issues is the notion that a person with no specific experience in or understanding of a given industry can take a couple of years of generic administration courses and get slapped into a VP role at any company. This concept extends down to the worker bees, in the assumption that workers are fungible.

I think this is a fundamental flaw in current management theory that needs to be burned out of the entire field with extreme prejudice. It colors the entire field and I believe is the root cause of most of the major failures in the field.

The case of John Sculley is a particularly notable example.

There are some schools that have toyed around with industry focused management degrees, a step in the right direction. You can learn how to manage a business in a particular kind of field...managing professional services in a software vertical is unbelievably different from managing the R&D division for a major cosmetics company, managing a small company is unbelievably different from managing in a megacorp.

I think also that MBAs simply shouldn't be available until a person has a few years of industry experience under their belt -- much like executive MBAs are today.

I think at the very least, because of the damage a shitty MBA student can do in the world, it should require industry specific professional certification that needs to be maintained and has an ethics bar like becoming a lawyer to self-censure particularly bad apples.


One of the major issues is the notion that a person with no specific experience or understanding of a given industry can take a couple of years of generic administration courses and get slapped into a VP role at any company.

My point is that I'm curious whether it's possible to change things so that such a person won't be bad for the company after being slapped into the role.


> I'd rather see legitimate reasons why removing 20% time is a bad idea for Google's operations.

Technology has a viable lifetime before you need new technology. As a technology company, you therefore need to be invested in creating new technology, or be willing to go into the spiral of customer loss and dealing only with legacy systems before a slow death.

Statistically speaking, if you know most of your staff is intelligent and has experience in your market, you're more likely to get an outlier idea (one that is way off on the end of the bell curve, i.e., actually really good) by casting a wide net and listening to the bulk of your employees.

The problem is that to go anywhere, ideas need time to gestate and develop, so you have to give all those employees a little bit of time to develop new ideas for your company.

It's been a while since undergrad, but if you want, I could try to bust out an equation modeling (and predicting the expected value of) the likelihood you hit a really important idea in the wide-net situation versus the "dedicated research staff" one.
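For what it's worth, the order-statistics core of that model is easy to sketch. This is a toy Monte Carlo simulation with made-up parameters -- it ignores development time, idea selection, and everything else that matters in practice, and only illustrates that outliers come from sample size:

```python
import random
import statistics

def best_idea(num_people, rng, mean=0.0, sd=1.0):
    """Quality of the best idea when each person contributes one
    draw from a normal distribution of idea quality."""
    return max(rng.gauss(mean, sd) for _ in range(num_people))

def expected_best(num_people, trials=2000, seed=42):
    """Monte Carlo estimate of the expected quality of the best idea."""
    rng = random.Random(seed)
    return statistics.mean(best_idea(num_people, rng) for _ in range(trials))

# Wide net: 1,000 employees each contributing one idea in their 20% time.
# Dedicated staff: 20 full-time researchers.
print(expected_best(1000))  # roughly 3.2 standard deviations above the mean
print(expected_best(20))    # roughly 1.9 standard deviations above the mean
```

The qualitative result survives any reasonable choice of parameters: the expected best of 1,000 draws beats the expected best of 20, simply because extreme values come from big samples.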


You're correct that there are smart MBAs, so it might be a bit unfair that "MBA" has become shorthand for something else. But it's not without cause. It's a stereotype with some basis in reality.

The key distinguishing factor to me of MBA-think is a combination of posturing with credentials and cargo cult thinking. The essence of this thought pattern consists of thinking divorced from real-world referents.

A strikingly similar cognitive anti-pattern can be found in the humanities, by the way.

http://en.wikipedia.org/wiki/Sokal_affair

I wonder if this is because business borrowed something rather toxic from the post-1970s "postmodern" intellectual meltdown of the humanities? The worst kind of MBAbabble that I've endured over the years reminds me very much of postmodern literary criticism in its vapid, posturing use of language to hide the fact that the speaker is not actually saying anything.

"Supercalifragilisticexpialidocious! If you say it loud enough you'll always sound precocious!"

A closely related cognitive anti-pattern in computer programming leads to "architecture astronautism," premature generalization, and over-engineering: http://www.codinghorror.com/blog/2004/12/it-came-from-planet...

In stereotypical MBA-think you have people reasoning about businesses without reasoning about the business -- about what the business actually physically does in the real world. So you see something similar to the cargo-cultish application of design patterns in programming. A management practice will be applied because it worked once in business X, but it's being applied to a business with wildly different characteristics.

At no point is an attempt made to actually walk the halls, talk to the boots on the ground, actually ascertain the concrete nature of the business one is managing in order to tie one's thinking to reality.

Finally, there's the ugly aspect: all of this is posturing to justify unjustly high compensation relative to the people who do real work.

Back to programming: I have run into "enterprise architects" who do not know what they're doing and who make more than the people in the organization who do. What they do know is how to sound impressive.

Back to the humanities: it's fairly transparent to me that postmodern psychobabble is a similar sort of imposture, hiding the fact that the people in question are supposed to be cultural vanguards but in reality have less to say than the street graffiti artists who tag up their buildings at night. Personally I'd fire the humanities people and then hang out in the bushes at 2am and offer the clever social critics with spray paint cans a job.

Elon Musk is a great example of someone at the absolute opposite end of the spectrum from stereotypical MBA-think.

He's not Tony Stark. He's not superhuman. What he does do is get his hands dirty. When he founded SpaceX he actually taught himself some bona fide rocket science so he would know what the hell he was talking about. He did a similar thing with Tesla, actually dove into some of the hard problems of electric car design himself so he'd have a clue. I'm sure he spends most of his days doing managerial things and raising money like any executive, but the fact that he's gotten grease on his hands means that when he reasons about his businesses he's reasoning about his businesses and not about abstractions divorced from reality.

His astonishing success at building businesses doing some of the hardest things one could possibly choose can be chalked up, IMHO, almost entirely to the fact that he is a smart guy reasoning about real things instead of hypothetical ones. He does not confuse the map with the territory.


> I wonder if this is because business borrowed something rather toxic from the post-1970s "postmodern" intellectual meltdown of the humanities? The worst kind of MBAbabble that I've endured over the years reminds me very much of postmodern literary criticism in its vapid, posturing use of language to hide the fact that the speaker is not actually saying anything.

You described easily over 75% of the management textbooks I had to work through when I was getting my management degree. Vapid and content-free. It was shocking sometimes to read a few chapters in a row and not even realize different authors had written them, because they all had the same "voice": page after page of absolutely nothing at all to say.


In the USSR they were called apparatchiks:

http://en.wikipedia.org/wiki/Apparatchik

It's sort of funny how Soviet business culture actually is.


> It's sort of funny how Soviet business culture actually is.

Capitalist hierarchies look like state hierarchies in a system widely criticized as "state capitalism".

Funny to the people who think that the Soviet Union represents the polar opposite of capitalism, expected by the left-libertarians who noted that the Soviet Union recapitulated the features of capitalism central to the socialist critique of capitalism.


It goes more in the opposite direction I think. "The Firm" has a centrally planned economy that in the case of an organization like Google can easily be as big as a small country. So it naturally tends to behave the same way as any other large centrally planned economy.


The thing to understand is that the socialist critique of capitalism has always been that it is a system which, both on the "micro" scale of firms and the "macro" scale of broader institutions, is centrally organized for the benefit of a narrow elite, with a distinct and well-defined social hierarchy (though, on the "macro" scale, a less formal one than in, e.g., feudalism).

The hierarchical megacorporation existed before socialists invented the name "capitalism" to refer to and criticize the system which spawned such beasts, so the Soviet Union mirroring the hierarchical structure of such an entity -- with a similar elite-vs-worker power relationship -- is exactly the USSR recapitulating features of capitalism central to the socialist critique, not "the opposite direction".


It's interesting that you use Elon Musk as an example, since Max Levchin's interview in Founders at Work suggested that his only contribution to PayPal was to attempt to get them to switch all their servers over to Windows.

(Levchin, BTW, seems much more like someone who actually "gets his hands dirty" - which doesn't actually work out all that well sometimes, as evidenced by subsequent ventures like Slide that executed really well against a pointless market.)


> I'm convinced its pure cronyism and ass-covering.

I'm convinced it's not.

Betting on the long term at the expense of the short term is incredibly hard. Even the smartest people in the world will panic (and in some cases pivot) when doubts start to creep into their mind. I am sure all of us have been in a similar situation. It's always easy to judge from the outside but on the inside those decisions are tough, lonely, scary and rarely to do with just covering your own ass.


I'd believe you if the folks I'm recalling had a good track record of having built successful businesses, or if their thinking struck me as... well... thinking, rather than cargo cultism. I know what it looks like when a person is thinking -- even fallaciously or sloppily, but still rubbing neurons together -- and I know what it looks like when a person is bullshitting to give the appearance of thought when they really have nothing at all going on upstairs. Buzzwords divorced from meaning and used without proper context are a dead giveaway for cargo-cult thinking.

Now granted, I know far more about programming than business. I can spot a cargo cult programmer after three sentences. In business it takes me a lot longer, maybe 6-12 months of working with someone. But by then if I've watched the cargo cultism long enough I can be pretty sure that's what I'm seeing, and I stop giving the benefit of the doubt. Sometimes when someone repeatedly seems like an intellectual impostor, it's because they are.

"Even the smartest people in the world will panic (and in some cases pivot) when doubts start to creep into their mind."

I think that's a lame excuse. Maybe my expectations are irrational, but I expect the kind of people who get millions or tens of millions of dollars to experiment with to be fighter pilots, not bus drivers. And if you wreck the ten million dollar plane, you don't get to fly another right away. Someone else gets a turn.

I also expect them to be smarter than me. When I talk to them I should have the feeling I've talked to an intellectual superior, not an imposturing dumbass.

It downright offends me to see the American middle class collapsing and people with no jobs while this shit goes on. If you're an entrepreneur, you're a working stiff too. Your job is to create value, and probably jobs, by building new businesses. When ordinary working stiffs screw up they get passed over or fired. When these guys screw up they get promoted -- or at least endless second chances.

I think what it boils down to is credentialism and cronyism rather than performance-based investing. If I were an investor I would split my odds between two areas: taking a chance on newbies, and investing in people with proven track records. But I would not repeatedly invest in people with poor track records just because their resume says "MIT Sloan School of Business."

Yeah, it's a rant. You're talking to a war veteran here. I also earned these war stories in the Boston metro area, and have heard a fair number from a friend in New York. I've been told that this problem is worse on the East Coast due to the absolute worship of ivy league degrees and in-group connections there. The West Coast is more meritocratically-focused. Never lived or worked there, but the culture I've seen in the West supports that notion.


Whenever you see theoretically smart people acting in stupid ways you should consider incentives before deeming them idiots.

The prioritization of short term benefits is an obvious consequence of reward schemes that judge you based on yearly performance. There is little reason to invest in things that help the company long term if your pay depends on short-to-medium term results, unless you think your company might be going down and are fighting for job security.


True to some extent, but over time fields with deeply perverse incentives end up driving away smart people. Smart people see perverse incentives and think "that field would suck," and run away.

Large sums of money can draw them in sometimes. But then they generally do the job for a while, stuff their pockets, and then leave.

People don't stick around with enthusiasm at a lame party.


> It boggles my mind, given the big money involved, why so many people continue to bet huge sums of cash on the proven short-term penny-wise/pound-foolish idiocy of MBA-think.

A lot of this "idiocy" is in fact rational (if unethical) behavior. At many large companies, upper management is rewarded for short-term gains. In response, these managers adopt strategies that produce short-term gains. If these gains can be boosted by risking the company's future, that too is a rational trade-off: The managers will be long gone when the company falls into decline, but in the meantime they will have collected a lot of pumped-up bonuses.

When people are paid today to burn tomorrow, why are we surprised that they reach for the matches?


> A lot of this "idiocy" is in fact rational behavior. At many large companies, upper management is rewarded for short-term gains.

The fact that micro-optimizations harmful to the company in a broader perspective are rational on the part of individual mid-level managers working within the incentive framework provided by upper management doesn't mean they aren't idiocy on the part of the upper management responsible for creating that framework.

In fact, that means they are idiocy on the part of upper management.


I guess that conclusion would depend on whether the upper managers expected to be better off (a) by staying at the same company for life, or (b) by pumping the short-term returns and then using them as a shortcut to a better position at some other company, where the long-term consequences won't be felt.


Well, from what I've seen of business people, their preferred method is (b), since by going to another company they can get a higher salary -- usually higher than the raise they'd get by staying.


This is a story about Google, which is still led by its technical founders (who are also huge equity holders with a long-term interest) and who don't have MBAs? If 20% is getting killed, it's getting killed because Larry Page wants it dead.


I don't think the decision had anything to do with money. It's part of the whole "more wood behind fewer arrows" initiative. Essentially, a defrag. 20% time was seen as having created too much fragmentation with little benefit. Gmail was the only major success to come from the program. Also, Google has a separate team for working on skunk works style projects.


Google Now: http://memeburn.com/2013/03/from-a-20-project-to-googles-fut...

If I/O '13 is to be believed, it's become an essential pillar of their current vision/roadmap for search (Answer, Converse, Anticipate).


Did not know about Now. Thanks.


Metaphor is not reality, but that metaphor is terrible. I'm an archer: you want as many arrows as will fit in the quiver, and as for more wood, well, we've switched to carbon-fiber shafts for a reason. When you shoot an arrow, velocity gets you to your target, and velocity penetrates it too. Extra weight is a literal drag.


Go the other way: you could have 500 arrows made from toothpicks, or 5 traditionals.


Our modern "arrow" is essentially a field point expelled via explosion at incredible velocity.

It's a bad metaphor.


>Gmail was the only major success to come from the program.

AdSense, Go? I'm sure there are others too...


I don't think AdSense has done very well. Does Go generate revenue?


While Go doesn't directly generate any revenue, it's probably not without reason that Google is rewriting some important pieces (dl.google.com for example) in Go. I can imagine their engineers being much more expressive in Go than in C++, causing them to spend less time on writing the code as well as resulting in simpler code (easier to comprehend and maintain, so less time/money spent on maintenance). This doesn't generate revenue but it does optimize development processes, which could indirectly save them some money.


I wouldn't blame this on MBA-think. The folks doing it at Google aren't MBAs, and the 3M/Google/HP approach is treated as a positive case study.

It's an issue of central control versus decentralized organization, and where the company thinks innovation will come from. One could also argue you get a short term price bump too.


I wouldn't blame this on MBA-think. The folks doing it at Google aren't MBAs

An MBA is neither necessary nor sufficient to suffer from MBA-think.


It boggles my mind, given the big money involved, why so many people continue to bet huge sums of cash on the proven short-term penny-wise/pound-foolish idiocy of MBA-think.

This is the con of being a public company: a lot of cynicism. Imagine you have inside information that BlackBerry will revolutionize the mobile market: you put money into their stock and maybe move on fast once the price increases significantly. Yes, value investing is very interesting, but how many daily transactions are based on that kind of thinking?

I think there was a lot of bias in favor of public companies and now we are realizing that staying private may be the best option if you think long term and don't want to deal with massive conflict of interest.


I've had a speculation for a while that this is why Google, Apple, and many other tech giants like to keep their stock float small and thus their stock price per share very high. I wonder if it's a way of trying to select for longer-term investors?


I'm not sure, because you can own half of a share. For example, if you invest in ETFs you don't hold an integer number of shares.


The shares of stock in an ETF are controlled by an independent group. Essentially you are buying shares of someone else's shares.


But this is a way to operate on fractional shares. Can you elaborate on api's original speculation?


MBAs are just trained to optimize around the current maximum in the "energy" landscape. These maxima are all too often not the best in the landscape, or even worse, can vanish over time as the landscape changes.

MBAs are like a gradient optimization algorithm without any simulated annealing process.


> MBAs are just trained to optimize around the current maximum in the "energy" landscape. These maxima are all too often not the best in the landscape, or even worse, can vanish over time as the landscape changes.

> MBAs are like a gradient optimization algorithm without any simulated annealing process.

Now that is content-free babble. If you are a bot, more work is needed. If you are a human, you should be ashamed of yourself.


He attempted to explain why MBA behaviour is rational using the language of hill climbing algorithms: http://en.wikipedia.org/wiki/Hill_climbing

He didn't express any big new insights, but there was an honest attempt to impart information behind that post.
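For readers who don't know the reference, the contrast is easy to sketch. Below is a toy illustration (nothing MBA-specific -- the landscape and parameters are made up): greedy hill climbing only ever accepts improving moves and so gets stuck on the first peak it finds, while simulated annealing sometimes accepts worse moves and so can escape to a better peak.

```python
import math
import random

def hill_climb(f, x, step=0.1, iters=1000):
    """Greedy local search: only accept moves that improve f."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if f(candidate) > f(x):
            x = candidate
    return x

def simulated_annealing(f, x, step=1.0, iters=1000, temp=2.0, cooling=0.995):
    """Sometimes accept a worse move, with probability that shrinks as the
    'temperature' cools, so the search can escape local maxima."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        delta = f(candidate) - f(x)
        if delta > 0 or random.random() < math.exp(delta / temp):
            x = candidate
        temp *= cooling
    return x

# A mediocre local peak at x=0 (height 1) and a better peak at x=4 (height 2).
f = lambda x: math.exp(-x ** 2) + 2 * math.exp(-(x - 4) ** 2)

random.seed(0)
print(hill_climb(f, 0.0))           # stays pinned near the local peak at 0
print(simulated_annealing(f, 0.0))  # often ends up near the better peak at 4
```

The point of the metaphor: optimizing greedily against this quarter's landscape keeps you on whatever hill you happen to be standing on, even when a much bigger one is a few risky steps away.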


Companies like Google are not limited by money; they are limited by available people. There are only about 1.3 million professional software developers in the US. Make it 2 million to account for developers who could immigrate. Now you want to hire the top 1% of the best of the best. How many can you hire? It's not a money limitation, it's a people limitation.

About MBAs: I don't think the changes at Google are because of an MBA invasion. Most of the changes we see today at Google were actually directly influenced by Steve Jobs. When Larry Page became CEO, he requested a mentoring meeting with Jobs, which Jobs surprisingly granted even though he was very angry about Android. The strongest point Jobs made to Larry was that a company should do only five things and do them very well (paraphrasing). Larry took this to heart and realized Google was extremely fragmented, with lots of small undirected efforts going nowhere. The rest is history.

It's debatable whether Google's old model was better than the new one. Personally I think the best thing for a company is to keep a continuous balance between the two extremes. Historically, companies start believing in one extreme, drift there, realize it's too much, and then drift toward the other.


Two years ago I was in Colombia and had lunch with four late-twenties MIT MBA students interning in Medellin. One was ex-Google; one was a consultant at a Big 4 firm.

They'd been there for a week and told me they hadn't seen any of the city. After work they'd go to the gym or drive their rented car to a restaurant recommended by colleagues. Taxis were $3 max if you lived anywhere in the core, which they did.

On a walk to a restaurant famous for mondongo, we had to cross a four-lane street. It was 9pm with no traffic, and after waiting a while I noticed the crossing signal wasn't changing. I looked both ways, saw no car in sight, and started walking, looking back and expecting them to follow. They all stood there awkwardly -- no cars, no other pedestrians -- for another 30 seconds, waiting for the walk signal to turn on, while I watched them from the other side. The signal never changed.

They were scared of the city. Anyone who has been to Medellin in the past few years knows there is little to fear.

Not all MBAs are like this, but this group just didn't get it. I realize this is more about travel fears than business smarts, but there is something to be said for people who can adapt to and embrace new environments.


> I watched them do shit like destroy products that big customers had money in hand ready to pay for when they were inches away from release.

I'm watching friends at two different, reasonably large consulting companies (~700-1000 employees) suffer as their respective companies go through the consulting version of this kind of MBA suicide. In one case the company even decided to change its name and go through a rebranding. Some examples of the idiocy at work:

- The rebranded company's schtick is to hire staff with advanced degrees (PhD preferred, or multiple Masters) and at least 10 years of industry experience, and charge bongo bucks for renting them out to do high-end but not necessarily mind-blowing work. Something like 80% of their contract staff has a PhD. For economic reasons they lost a couple of medium-sized contracts. Their response was to lay off half of the PhD employees in the company because they were too expensive to keep around while they looked for new work -- even ones who were already working on a contract and making money for the company.

This meant they also had to cancel at least three more medium-sized contracts and a handful of smaller ones, because they had eliminated the staff who were working on them.

At one point they also had two CTOs (because if one is good, two look even better), until they came to their senses and laid one of them off.

- At the other consulting firm, a similar pattern: tightening customer budgets meant they decided to replace staff already on contract with cheaper, less experienced, all-new staff to pump up the profitability score on the spreadsheet. (They'd already laid off all of the idle staff not on contract, so without winning new business, some MBA thought this was the best way to increase the numbers and get another bonus.)

But now they've let go everyone in the firm who had experience doing the work, sometimes decades of it. There are no mentors, no experienced hands, nobody. The quality of the consulting work rapidly declined, resulting in the loss of two contracts, and senior positions they'd normally fill from within now have to be filled from outside the firm.

Both of these firms are in death spirals and all of the people that could help them pull out of it were fired.

At one company I used to work for, I also saw my leadership completely lose their minds and fire off all of the development staff, thinking we could coast with a new sales team and the product we had and pull ourselves into profitability. They also declined to staff profitable professional-services contracts we already held, because the margins were lower than selling new licenses. And I brought them significant new work in a different line of business, which they decided to turn down because they didn't want to deal with low-paid temp staff -- "temp workers get paid more than they're worth to their temp agencies".


"At one company I used to work for, I also saw my leadership completely lose their minds and fire off all of the development staff, thinking we could coast with a new sales team and the product we had and pull ourselves into profitability."

LOLZ

I watched this particular face-plant occur once as well. When they actually got customers the results were hilarious. Hilarious because I was not a direct employee, mind you.

It was like having a high-end trendy restaurant with excellent branding, great decor, a great location, and no food. So they run across the street to McDonalds and order fifty Big Macs, run back over, dress them up on plates (who will notice), and...

Running out of money sucks, but this maneuver was destined to fail from the get-go. Anyone who knows anything about tech could have told them that. One smarter alternative would have been to thin the development staff -- they had to cut somewhere -- and hire a small number of salespeople on contract (not full time), offering them disproportionate bonuses to bring in sales.

What they actually did was fire anyone who knew how to make anything -- and in a way that burned bridges! -- and hire a bunch of sales guys full time and at full salary. Hilarity ensued.


Did the guys at the top ever lose any money?


What usually happens is that they've locked in some preferential options or stock grants + bonuses etc.

If the company hits it big despite them, the stock is awesome.

If the company does poorly they might miss a bonus but they'll still get their inflated salary.

If the company ends up on the rocks, they'll get their salary cut "till company performance improves" as their pay is usually tied to the performance of the company as an incentive.

But what invariably happens is that, even with their pay tied to company performance, the moment it gets cut they high-tail it out of there and land a new job with their top-10 MBA and another couple of years of job experience... saying "I'm looking for growth opportunities" over and over again in their interviews.

I don't know if it's still true, but Facebook used to filter out these types by making them do a version of an engineering interview. Except instead of regurgitating algorithms you learned in college, MBAs had to regurgitate stuff from MBA school -- like derive the time-value of money, or discuss the TPM or whatever. The ones who coasted through MBA school got filtered out really quickly, and the ones who took the subject seriously might pass the gauntlet.
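(For anyone curious, the time-value-of-money piece is the sort of thing they'd be asked to derive: a dollar in the future is worth less than a dollar today, because today's dollar can be invested at some rate r. A minimal sketch with made-up numbers, not Facebook's actual interview question:)

```python
def present_value(future_cash, rate, years):
    """Discount a future cash flow back to today: PV = FV / (1 + r)^n."""
    return future_cash / (1 + rate) ** years

def npv(rate, cash_flows):
    """Net present value of yearly cash flows; cash_flows[0] happens today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# $1,000 received five years from now, discounted at 8% per year:
print(round(present_value(1000, 0.08, 5), 2))      # 680.58

# A project costing $500 today, returning $200/year for three years, at 10%:
print(round(npv(0.10, [-500, 200, 200, 200]), 2))  # -2.63: destroys value at 10%
```

Ironically, the same discounting logic is exactly what makes short-termism tempting: crank your internal discount rate high enough and any long-term investment looks worthless.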


But what about the people investing in these companies? Money doesn't grow on trees, and you can't make money without products to sell.


Investors bank on most of their investments not succeeding, and the few that do making up for the rest.

The truth is, nobody knows how to bottle "success" and reproduce it -- but in a sense that's what management school is trying to do: with just the right management approach, applied in just the right ways, you get Instagram instead of Color, or Google instead of Cuil.

The problem is that it's very easy to look at the failures and pick apart the problems. But it's very hard to look at the 1% that succeed and figure out why -- and when it's attempted, the analysis usually overlooks the cases where a company succeeded despite having many of the same problems the failed companies had.

Most companies succeed because of 90% dumb luck and 10% business strategy.

(Hell, business is so screwed up that people still don't understand how to define success. You see it here all the time: a "successful" startup is one that raises a huge round, not one that's profitable and growing... even BusinessWeek does this.)


Early stage startup investors expect most investments to fail. Those investing in mature companies expect their investments to do well over many years. I'm sure MBAs have plenty of stories of incredibly inefficient companies that have squandered opportunities. MBAs aren't about growing startups, they're about taking the now $1bn Instagram and doing more with it.


I'd also ask them if they know when to use and when not to use these concepts.


Side Issue: do Americans really say "penny-wise/pound-foolish"? I'd have thought it'd been dollar.

And what exactly is a po-dunk school when it's at home?


> do Americans really say "penny-wise/pound-foolish"?

Yes.


(1) Yes

(2) A po-dunk school is any school not on the coasts and not in the top ten, possibly excluding a small number of top-tier schools in "flyover country."

(*) Flyover country is the large area you fly over when traveling between, say, Boston and San Francisco.


> do Americans really say "penny-wise/pound-foolish"?

Yes, as well as the similarly anglocentric (but somewhat opposite in perspective) aphorism: "Look after the pennies, and the pounds will look after themselves."


do Americans really say "penny-wise/pound-foolish"?

Absolutely.


> I mean sure-- if your company is under cash flow pressure you have to pinch pennies. You have no choice.

Google's per-ad revenue was down 6% last year. Ads are 95% of their revenue. They have huge data to see trends.

Just because Google is still profitable doesn't mean they aren't looking at a near future of red ink. Say per-ad revenue goes down 10% next year with the same costs, and their profit drops from $10 billion-ish to $5 billion. And what if this decrease is a trend?

Competition from iAd and Bing Ads, smaller screens that are harder to advertise on, and so on. It's easy to see a near-future Google that has to spam ads or make huge cuts to remain profitable.


Well, they still have lots of money, and quite a lot of time to do something about it. It sounds like their core business is losing to competitors, and their solution is to keep the old product (the one they believe is failing) and cut research?


Sounds like Microsoft, doesn't it?

And I'd argue that with more and more accessible technology, these behemoths won't have the luxury of "lots of time". And hopefully the large sums of money will mean less.


I'm pretty sure he burns the money in The Dark Knight, not Batman Begins.


I don't think that the MBA is so important that it could be blamed for so many ills in the world of work, but I do think there are two fatal flaws with the modern MBA: Demography and intent.

1.) Demography: Who is the ideal MBA candidate? From what I've gathered, the historical answer would be: "someone from outside a business education about to move into a business role." So of course your average MBA is going to be someone who studied biology or mechanical engineering as an undergrad, right? Well, they're in there, but they're surrounded by people who studied business and economics as undergraduates, often at elite schools! What could a Wharton undergrad possibly gain from a Wharton MBA? Social proof? A two-year vacation? It's unsettling, and it leads to the second problem.

2.) Intent: What sort of work is the MBA cut out for? Well, they're masters of administering businesses, so... management? Nope. Consulting and banking. Like lemmings, they go right off the cliff into these fields. All of them. Even the ones who came in looking to simply move up the ladder in their chosen field, like biotech or energy, will feel pressured into becoming some sort of banker or consultant. So the MBA ends up being a launching pad for people outside consulting and banking to get in, and for people on the inside to move up.

I don't know how business schools would go about rectifying these flaws. Even if they were to shut out business types, that second flaw is so deeply ingrained in the b-school ethos that anyone seeking to be a better manager in their original field would likely be derided by his or her peers as an underachiever.

Interestingly enough, I hear the Executive MBA is a far better experience, simply because the students are too far along in their careers to make a pivot into something else. They're just there to get better. And maybe for the two-year vacation thing as well.


These sorts of stunts also damage the company's and ecosystem's reputation. It's hard enough to get big customers to give startups a chance anyway (and I remember reading that big-company buy-in is one of the biggest issues in the U.K. startup scene, which I'm sure is not unique).


Who says this has anything to do with money? I think the decision is more about a new strategy - focus instead of experimentation.

Personally I think this makes sense. 20% time was great when you're a small company trying to see what works. When you're a big company, you usually get disrupted because you lost focus.


To my mind, it's exactly the other way around.

When you're small, you need a source of revenue quickly, or you'll die. You don't have the luxury of experimenting much; you have to build one thing that works.

When you're a huge company and have untold millions in cash, you can afford experimentation. In fact, you have to, because your revenue source is probably finite and you have to find another before the current source dries up.

This is why Google dived into mobile phones, self-driving cars, home entertainment, renewable energy, Internet providing, consumer cloud computing, goods delivery, etc. Some of the experiments became huge successes (see Android), some failed (see the long list of closed projects), some are too immature to judge (e.g. self-driving cars).

It just looks like experimentation is now confined to a thin top-exec layer, while 'simple engineers' are not expected to do much of it. Which is, of course, a pity.


Dead on.

When you're huge and cash-flush, you have the luxury of really innovating in truly hard areas.

This luxury is damn hard to obtain in a short-term gerbil-wheel economy like ours. Once you have it, throwing it away is like setting fire to a house as soon as you've paid off the mortgage or crashing your new luxury car as soon as you drive it off the lot. It is abysmally stupid and short-sighted. Investors should call for the heads of people who do this, literally. As in on a pike.

People think innovation comes from startups, but in reality it doesn't. Not because startups aren't smart and agile, but because they don't have the resources.

When I say innovation, I mean innovation. I don't mean application of existing innovations to new market areas or problem spaces. Startups excel at that.

But you'll never see a scrappy basement startup whip out a self-driving car, an artificial lung printed from a 3d printer, an orbit-capable reusable rocket, augmented reality goggles with a complete software stack, etc. Not unless the parts for those things already exist and can simply be combined in a novel way to yield a result in less than a year.

If you're big and cash flush, you can do what nobody else can do (except other monsters like you). You can do this.

http://matt.might.net/articles/phd-school-in-pictures/

That dent gets you a Ph.D, but if you do it for something of high economic value it gets you early entry into a market nobody else can enter because they don't know how.

If you do it, you've now created a piece of value that you can do one of many things with. You can feed it into your more short-term product-dev branches and do things nobody else can compete with, you can license it, or you can use it to pump up the prestige of your company in ways no advertising can.

"Holy crap! A self-driving car! And it really works. I mean, there it is, on the road, and it's driving better than I am and getting around faster than I am! I'm going to move all my business's hosting to Google Apps, cause they obviously have the smartest people on the planet..."

I do have the sense though -- and keep in mind I am an outsider -- that there might have been some "ADD" issues with Google's 20% policy as it was implemented. But I don't think the solution is to ditch it. The solution is to focus it, to try to get people to spend their 20% time pushing harder into deeper and more difficult areas instead of whipping out hacks. Maybe incentivize more people to work together more formally on 20% projects over longer spans of time, and incentivize them to tackle things that are very difficult... things only a big elephant can do instead of things a scrappy startup could do.


Actually, a few Googlers around me still participate in 20% projects.

But these projects need to be outlined, have OKRs and an estimation of potential impact, then approved. Not impossible, but far fewer and far less wild projects probably can now pass.

Well, let's see how it works; in a few years it will be obvious whether the stream of innovation dries up or not.


If an initiative doesn't seem far too wild, then it's not wild enough to be innovative. Quite a few lousy ideas seem too wild initially, but pretty much all of the actually valuable projects are as well.


Agreed. I think you're right.


When you're a big company, you usually get disrupted because you lost focus.

That sounds completely backwards to me. Big companies get disrupted because they are too locked-in to milking their current cash cow, and they either don't see - or consciously choose to ignore - any threat to that cash cow. It's more like a case of myopia combined with tunnel-vision, IMO.

Meanwhile, scrappy upstarts are experimenting and trying new ideas, find a better new approach, and can launch it and start growing it, while $BIGCORP remains blissfully unaware, until it's too late. Basically, the classic Innovator's Dilemma situation.

And while $BIGCORPs often ignore disruptive innovations even when they develop them themselves, it still seems to me that you're better off to be the one developing the disruptive innovation yourself, so you at least have a shot at adapting before somebody else comes along and takes your lunch money.


I'm not sure you could back that up with case studies. Most companies are disrupted because they focus too narrowly. They also tend to think they're in the "tool" business when they grew successful as a solution business. E.g., a whip-and-buggy company thinks it's in the buggy business and gets disrupted by a company that thinks it's in the transportation business.


It sounds more ego-driven and self centric to me. Is that what MBA programs teach you? To be completely focused on your own solutions that you are dismissing everyone else's?


" I went to a po-dunk Midwestern state school"

You went to Mizzou too?


Worse. The University of Cincinnati.


Ouch! UC Comp Sci graduate here, whats wrong with UC?


Nothing. It was great, and the profs were excellent. But it's not on the coasts so it might as well be from somewhere that ends in -stan. :)


Could be from the University of North Dakota, which got the pleasure (early '90s) of being told by Microsoft that they only hired our graduates for Tech Support because we had a nice accent and worked hard (being, of course, all farmers' kids).

Oh, well, at least I had low debt (few $K, paid off easily) even though I got screwed on scholarships by my high school.


MBA has nothing to do with bad decisions. Bad decision makers make bad decisions.


A team of HP managers walks into the Svalbard global seed vault a week after the bombs went off. They smile contentedly as they sit down to their first meal of popcorn, on which they will survive for literally weeks longer than everyone else.


Businesses that make material goods like 3M and Corning _have_ to innovate. If you don't keep innovating, you're going to lose your lead in established products (Post-it notes, CorningWare) to China or Walmart.

Apple has a similar story with Android phones. Keep innovating or get your low-end eaten.

Google doesn't have to innovate. They're already China. If somebody makes something they like, they'll buy it. Or make a knock-off.


That's what Yahoo thought, 15 years ago.


Yahoo has been, on the whole, an astonishingly successful company; they may not have been as sexy as those we cover here, but they've kept plodding along. Stuff like this is why I think Yahoo will outlast Google in the long run.


Exactly. If Yahoo was such a failure, we wouldn't be talking about them. There were many other very similar companies that were founded about the same time as Yahoo. Some got acquired (including some by Google or Yahoo) and more just died, but all essentially lost the race and even those of us who were around at the time barely remember them any more.

Yahoo has survived, and made a lot of money for a lot of stakeholders over the years. I would also say that they have a better record than Google when it comes to privacy, legal entanglements, and general good open-source citizenship. Yahoo is a success story. Anyone who presents it otherwise is just putting their own lack of perspective and/or business sense on display.


While Google does have a Borg-like aspect, I think they do still need to innovate.

What is more, it's clear the founders are still very keen to push the limits of innovation.

Maybe, as has been said by others, they are not shutting this down but restricting it. In effect, they're replacing most of the 20% time with acquisitions but they do still have the process alive to enable some internal innovation.


The examples you mention for 3M and Corning are valid reasons for patent protection. If a company is going to productize their invention, they should have some protection. They are encouraged to innovate because of the patent process. The same does not apply to Apple and Google.

Additionally, Corning sold off the Corningware brand to a non-research focused company. They've pivoted again, and their research (and relationships with Apple and Google) have allowed them to continue to be successful.


Which I suspect is doing Corning some brand damage. Pyrex(TM) used to be really tough low-expansion borosilicate glass, or perhaps properly tempered soda-lime glass, truly great stuff. Now outside of Europe and lab glassware it's non-tempered generic soda-lime glass, which is injuring quite a few people: http://www.consumeraffairs.com/news04/2008/08/pyrex.html


Wow! Thanks for that link!


I have no problem with Apple or any other IT company pursuing hardware patents. But I think software patents are BS.


The main difference in my mind is that innovation in software is relatively cheap compared to innovation in material goods. With material goods, the only way to know if your innovation works is to make a bunch of it and hope it sells. Software just wastes a little employee time and maybe a bit of server space/bandwidth. Plus, with software even if the whole project fails you can often recover parts of it to use elsewhere.


> Businesses that make material goods like 3M and Corning _have_ to innovate

No, they don't "have to". But then they die, or just keep doing what they do. It's an option.

There are boundaries that sometimes protect business: geography, client base, product specifics, etc.


I suppose I should have added "...if they want to continue to succeed."


>It used to be that HP engineers were expressly given Friday afternoons and full access to company resources to just play with new ideas. Among other things, this led to HP owning the printer market.

And arguably enabling the Apple I.


The same thing may have happened at HP, but at Google it isn't happening (yet). It may happen, but to claim that 20% time is gone is simply false.


Do you have information that contradicts the description in the article? Because the article claims that the 20% time is still "official policy", but that employees are expected to still put in 100% time on their normal project (and measured on doing so) and that they also need permission to start a "20% time" project and finally that management has been instructed to be parsimonious in granting this permission.

The article claims that "20% time" at Google is no longer real; do you dispute this?


Plenty of googlers on the record in this thread that this is bs, I don't see what my anecdotes relating friends' experience would add to that.


It really seems to be the life cycle of a lot of companies. Be small, lightweight and flexible, come up with great ideas that are able to quickly rise to the top. Those ideas make you tons of money and cause you to grow to be massive. But then you're crushed under your own weight.

We've seen Microsoft do it, I'd argue that Apple is well down that road, as you said we've seen HP do it, Google may be headed in that direction.


How common was this scheme? I bet all the giants had something like it - Bell, Kodak... Disregard for little creativity bubbles can be harmful.


Apple would be a good counter-example to your:

This should make a lot of entrepreneurs happy, as there will continue to be a lot of top-down management-driven products that, if history shows, will continue to be market failures.

There's no rule to success.


Actually, Apple (1988-1998) is a perfect example.


> There's no rule to success.

Amen. I think as HN readers we suck up so much info on a daily basis (I would guess far more than the everyday public) and get confused so many ways about what is 'The right way' to do things, 'The wrong ways to do things', and the '5n lessons you should learn to do things right'. But really, all that will change and become irrelevant when company x does things in way y. Then all the armchair experts and runaway copiers will wax philosophical, and no one will know any better until the daring party actually does it, or completely bucks it.


To be honest the world has also changed while this was happening. HP used to make quality stuff that people would pay more for. Today for the most part they have succumbed to commoditization where it's a race to the bottom. When people (for the most part) put price ahead of quality there's less value for things that create quality.

That said there's lot of penny wise pound foolish style thinking everywhere. That also tends to relate to the growth of the business to the point where management can't really figure out what going on any more (assuming they could at some point).


Shareholder primacy trumps all


I am a Googler. I will only speak to my personal experience, and the experience of people around me: 20% time still exists, and is encouraged as a mechanism to explore exciting new ideas without the complexity and cost of a real product.

My last three years were spent turning my 20% project into a product, and my job now is spent turning another 20% project into a product. There was never any management pressure from any of my managers to not work on 20% projects; my performance reviews were consistent with a productive Googler.

Calling 20% time 120% time is fair. Realistically it's hard to do your day job productively and also build a new project from scratch. You have to be willing to put in hours outside of your normal job to be successful.

What 20% time really means is that you- as a Google eng- have access to, and can use, Google's compute infrastructure to experiment and build new systems. The infrastructure, and the associated software tools, can be leveraged in 20% time to make an eng far more productive than they normally would be. Certainly I, and many other Googlers, are simply super-motivated and willing to use our free time to work on projects that use our infrastructure because we're intrinsically interested in using these things to make new products.


> Calling 20% time 120% time is fair. Realistically it's hard to do your day job productively and also build a new project from scratch. You have to be willing to put in hours outside of your normal job to be successful.

Then it's not 20% time, it's personal time you're giving to your employer for free. Why would you do that? Why not build your projects outside of Google and keep them for yourself (assuming it's a product and not open source)?


Why would I give my time for free to Google?

Because my entire career- well before I started working here- has been dependent on things that Google has given to me for free.

Like Google Search. Search helped me learn to run linux clusters effectively (it was far better than AltaVista for searching for specific error messages) which ensured I had a job, even in the dotcom busts. It helped me learn python, which also played a huge role in my future employment.

Like Gmail. Although I've run my own highly available mail services in the past, free Gmail with its initial large quotas hooked me early on. I have never regretted handing the responsibility for email over to Gmail.

Like Exacycle (my project): http://googleresearch.blogspot.com/2012/12/millions-of-core-... in which Google donated 1B CPU hours to 5 visiting faculty (who got to keep the intellectual property they generated).

I would like to repay Google for their extreme generosity. Spending my "Free" time doing things I enjoy (building large, complex distributed computing systems that manage insane amounts of resources) so that Google can make products that it profits from seems perfectly reasonable to me.

If I had continued to work in academia, I'd spend most of my time applying for grants, writing papers, and working 150% time just to maintain basic status and get tenure. Anybody working in the highly competitive sciences, or in the tech industry, who wants to be successful, has to put in more than what most people consider a 9-5 job.

As for open sourcing: Google has a nice program to ensure that Googlers can write open source code. I haven't taken advantage of it, because most of my code is internally facing and doesn't need to be open sourced. But I would certainly consider using my time to do that; I just think my time is best spent working on Google products because I believe their impact will be much higher.


I'm not seeing any "generosity" on Google's part, at least from the examples you provided.

You certainly seem like a smart guy, working on some cool stuff, so I'm not surprised people are a bit confused (hence the term "brainwashed") by your (pretending to?) not understand the business model of the company you work for.

You are giving your time away, for free, to a for profit corporation. That's so irrational it's painful to hear.

If you like working with google systems and resources so much that you are willing to pay your employer to use them then ok, that's a bit weird, but it's your time. If you feel you need to work 120% time to keep your career on track then ok, that's not uncommon in this industry (but it's the opposite of generosity and it's not sustainable for you).

Framing this as repaying Google for "their extreme generosity" is delusional, which is why I'm assuming it's not the real reason.


I worked in Ads for a year when I started here. I understand the business model of my company quite well :-)

And I still consider what Google provides (search, gmail) "free". Free as in free beer- http://en.wikipedia.org/wiki/Gratis_versus_libre

I'm certainly not "giving my time away for free": to be clear, I'm a salaried worker, and I choose to work the hours I do. Further, to be clear: Google gives me immense resources to carry out life-saving scientific research, the intellectual property of which belongs to scientists (and the general public), not Google.


> And I still consider what Google provides (search, gmail) "free"

And you would join a large number of people who have recently started using that word to mean "can cost any amount in anything of value as long as it's not currency". So sure, it's "free" in that way and still not free in the definition of the term that's actually useful to people. Boring semantic argument, let's drop it.

As to the rest, this is a much better way of phrasing it than in terms of Google's "generosity", which is what I was pointing out as flawed.


Not the OP, but I just launched a Google-owned open-source project that was developed partly during 20% time and partly during my personal time, so I can share some of my reasoning why I would do this.

I was a startup founder before Google. I called the shots on what I'd work on, owned all the code I produced, and worked when and how I wanted to. A large part of my motivation for writing all this code was to learn things; another large part was to produce stuff that would be a positive contribution to the world. Alas, nobody used my stuff (well, almost nobody - we had a userbase measured in the hundreds), and I didn't get paid for it. So it wasn't exactly sustainable.

In my official job duties, I now write software that's seen by over a billion users. In my unofficial job duties, I'm the maintainer of an active open-source project watched by thousands of people which gets a dozen or so patches per week. When testing this open-source library, I had free availability of thousands of machines and a corpus of billions of documents. When developing it, I had the help and mentorship of experienced coworkers, some of whom had held leadership roles in major open-source projects. I get paid a fat salary for this. I do work longer hours, but it hasn't come at the expense of things I really care about. I go out with friends 3-4 times a week. I have a steady girlfriend. I call my mom every week. Most of the time for this has come out of loafing around on Reddit and Hacker News, where you are also giving your time away, for free, to produce something of value for a for-profit corporation.

In pretty much every dimension I care about, this is a win for me. My software reaches more users. I learn more. I get paid more. I meet more interesting people, and have more of a social life. My professional reputation increases more. I get more experience.

It used to bug me that I was giving away my labor "for free" to my employer, who would then profit handsomely from it. But what I realized is that not all labor has equal value. I used to reap all the fruits of my labor, and those fruits were worth virtually nothing. I'm now in a position where my labor has dramatically more leverage, and much of that is because I use the resources of my employer, and they're entitled to take a cut of it for that reason.

When the time comes where I feel I can accomplish more outside of Google than inside it, I'll quit. That was what drew me to them in the first place, the realization that, as an entrepreneur, all the ideas I had would be better executed as features of existing Google products. If that ever reverses and something that I really want to do would be better accomplished as an individual startup, that's what I'll do.


That's all perfectly reasonable and very well stated. Your perspective was what I was trying to address with the "If you like working with google systems and resources so much that you are willing to pay your employer to use them then ok, that's a bit weird, but it's your time" line.

If you feel differently than I about Google's contribution to society or the ethics of unpaid overtime cultures in general then even my rather tame "that's a bit weird" might seem wrong to you and that's fair enough, these are opinions after all.

I meant for my comment to address the framing of similar arguments to yours as "repaying Google's generosity". Considering that type of statement really does "sound brainwashed" as another commenter pointed out, I was interested in a better articulated reason which you have definitely provided.


- He gets to work on what he probably considers to be much more "interesting than average" projects.

- He's building a reputation of innovator in the company, which is certainly not bad for the next promotion or raise.

- He's sure well paid and doesn't need to risk his own wealth.

- He's out of the paper stuff and can concentrate on the technical side.

Of course, that way he misses the "get insanely rich" possibility of startups. The "famous" part not so much; you can get that in a big corp as well...


Google doesn't give you anything for free; you give them data that allows them to sell directed marketing to you. I don't have a problem with that as for me it's a fair trade. But I wouldn't call what they offer as "free".


I believe that's also known as "TANSTAAFL".


To be honest, you sound brainwashed.


All the best corporate cultures are a form of brainwashing.

My form of brainwashing happens to include 40 hours weeks. It's pretty nice.


Are you kidding? Google provides you web search and email for free? Then how do they make a profit? They are a for-profit enterprise, not a non-profit. The information in the text of your emails and your clicks on search results is worth more than it costs Google to provide email and web search to you.


Are you a Googler? You are just brainwashed. Several months after you leave Google your brain restores its sanity, and you will laugh at all this sectarian bs that you currently consider as a deep truth or something.


You remind me of one of those users on YouTube: "Google, thank you very much for excellent music and movies! You give it to us for free! Thank you, Google, you are very kind!"


What happens if your 20% project becomes a product at Google? Do you get rewarded with a share or promotion? In a sense, Google could be acting as a mini-VC for you, so even if you work 120% that is just equivalent you doing a side project hoping it turns into a start up. Except you'd be able to work on different kinds of projects.


I don't really know if there is a standard "share or promotion" for 20% projects that become products. I'm sure some people have received bonuses. I don't think most people who work on 20% projects that become products really care that much about the monetary compensation- it's far more enjoyable to see people use your product happily, than get a wad of cash (at least for me).

Anyway, I think you raise an interesting analogy: working on 20% projects at Google, then using the company's resources to launch the product, does have a number of parallels to VC funding for startups (note: I'm advisor for Google Ventures, so I have some experience with both worlds). In a sense- and nobody everybody will agree with me- Google as an employer is a low-risk, low-capital way to launch my products. Larry and Sergey already took the risks (launching a company with no clear monetization strategy), they figured out a monetization strategy, and now they invest their capital in speculative projects.

Anyway, in my case, after it seemed like my project was in good hands and ready to be a product, I looked for something else interesting to work on. I think the main problem I have working here is that there are too many cool projects I could work on, learning from experienced SWEs and SREs, but I have to stick to one.


At my last job, I had to wait two weeks to get a VM created and assigned to me.

At my current job, I can't get a port opened to make an OUTBOUND connection to Amazon Web Services.

Yeah, working at Google, it's definitely easier to use big tools to try new things, huh?


We have a lot of perks, including great food, but ultimately it comes down to: 1) awesome coworkers I learn from daily, 2) access to infrastructure with few barriers, and 3) a competent corporate IT team.

The combination of the three is rare in industry.


What I'm hearing is that the dream of Google has been reduced to some characteristic bullet points and a reasonably positive conclusion.

This isn't a criticism or negativity, all things considered it seems to be a great company to work for; it just seems to be a bit more homogenized than it used to be.


I think you hear a lot of bullet points because it's hard to write a HN comment that describes everything about Google in one sentence.

Let me tell you a story from my recent experience.

My neck hurts from time to time, and as part of stopping that, I decided that one thing I needed to do was to not sit at my desk anymore. Fortunately, I already had a standing desk at work, so it was just a matter of using it. When I got it, I thought the best plan would be to gradually ramp up; 30 minutes today, 40 minutes tomorrow, until I was standing the entire day. That never happened and I think I maybe stood for an hour a day, basically bailing out as soon as my legs felt even the slightest bit uncomfortable. Last week, I decided that that was not going to work, and I mentioned this to some people that sit (well, stand) near me. They agreed to shoot me with Nerf guns if I sat down. I did, and they did, so I stopped sitting down. Now I can stand the entire day. What really did it was watching others around me stand for the entire day; if they could do this for months, why couldn't I? That's what really motivated me to endure the annoyance in getting used to something new.

So what does this have to do with Google? It boils down to: coworkers that care, coworkers that are "different", and the willingness to make expensive non-essential office furniture universally accessible and useful.

Ultimately, there are a lot of great reasons to work at Google, and HN comments can only give you a small snapshot at a time.


When one of my coworkers (at Google), outside of work, crash-landed his parachute and was in the hospital for months:

1) his manager (my manager) spent the next two days with a translator contacting his family back in Iran to explain what happened, 2) Google paid a huge amount of money to have him rehabbed (he had brain damage; they did a great job- he had access to awesome rehab people and regained a ton of brain function), and 3) they worked hard to get his work visa extended while he was out of work.

I use bullet points because they are a succinct way to list several orthogonal items. Please don't take my data and generalize to all of Google. All I can say is that this company is pretty incredible, and much of what gets published about it isn't very accurate.


All I can say is that this company is pretty incredible, and much of what gets published about it isn't very accurate.

Couldn't agree more. Partially-truthful stories titled "Google used to be a great place to work but isn't anymore," seem to make a lot of HN commenters very happy. I'm not sure why.

Anyway, back to complaining about how I don't like the new pour-over coffees that the baristas upstairs make. Replacing Intelligentsia with Stumptown? How dare they! What a terrible place to work! :)


How awesome is the food?


I like it a lot. I eat about 10 meals a week there, have lost about 25 pounds, and I hate going out to restaurants because you have to pay and the food isn't as good.


You're probably going to crappy restaurants.


I don't think you've eaten in a Google cafeteria then. The food here really is top-notch. An equivalent restaurant would cost you between $25 and $35 a plate, possibly more depending on the day. I've had meals that would have easily cost me more than $50 anywhere else.


I had an interview at Google (no job :(...), and their food really is excellent. It can easily be compared to a higher end restaurant on Yelp (think $$ or $$$).


You haven't eaten there, have you? Their "cafeteria" is a bunch of kitchens worked by chefs. Good ones. Their food is the only reason I would consider working there.


Their food is the only reason I would consider working there.

This is bald hyperbole. There are a lot of reasons you would consider working there.


The only advantage over my current gig, I meant.


> Calling 20% time 120% time is fair. Realistically it's hard to do your day job productively and also build a new project from scratch. You have to be willing to put in hours outside of your normal job to be successful.

LinkedIn'er here.

One of the things I love about our culture is InDay/HackDay. It works out to less than 20% time, but the entire company is given 1 day a month - generally the middle Friday - to do anything they want. This results in a lot of things ranging from prototyping ideas; learning new tools (I spent my day today brushing up on Scala); or just taking a fitness class.

The best part? The day is honored. It's not 100+n% time. Unless you own something that is bleeding money or users from a serious bug, no one is going to come to you and ask you to do anything other than maybe grab a beer.

I know we're not the only company doing this (I think Twitter has hack weeks every quarter), but it's a huge differentiator from our neighbors.


Atlassian does a similar thing roughly every quarter, called Ship-It days (it used to be called FedEx, because you deliver overnight... but the real FedEx didn't like it, so we don't call it that anymore), where you ship something over 24 hours (or 48, if you cheat a little and start early...), either by yourself or by convincing other people to team up with you. The criterion is that you must have something that can be demoed, or better, something shippable (or close to it). There are also prizes and presentations, then a food/beer bonanza. Sure, people put in extra hours, but I think they do it willingly because it's also fun and quite a social activity, all while generating innovation and allowing experimentation.


I work for a company that adopted 20% time a few years ago and the 120% time is true here as well. It probably works out to 110% time in our case, I can typically take a half day during the week and then nights & weekends. Really, I think the final paragraph is true anywhere. Every company has some cool data or infrastructure to play with and smart management encourages that hacking in a sensible fashion.


> smart management encourages that hacking in a sensible fashion

See, I don't get it. Management at my company encourages going home at the end of the day. I don't understand why I would work somewhere that expects me to work more than my 40 hours.


> Calling 20% time 120% time is fair

Every place I've worked has allowed employees to come in and work weekends on projects that would help the company. I guess those places just weren't innovative (read: arrogant) enough to rebrand it as an employee benefit.


I'm afraid I'm having a hard time understanding how calling 20% time dead vs insisting that it became 120% time are two different things.

I feel that the reason 20% time captured the attention of those wishing to become future Google employees is that it implies you are literally granted the ability to take 20% of your work schedule and spend it on side projects. Put another way, Google is granting you permission to pursue what interests you on their time.

What I appear to be hearing is that work obligations require 100% of one's work schedule as opposed to 80%, and that a 20% project is typically pursued with time that would otherwise be considered personal time.

If one were to use 20% of their work schedule to build something, and then invest some of their personal time into it, then I would say that 20% time is alive and well. If Google merely grants you access to resources, not time, then I would argue that 20% time is dead. It would be more appropriate to rename the perk.

Sorry if I'm coming off as confrontational, I'm genuinely interested in getting a clearer view of the situation.


Can you give me the names of 4-5 Google products launched recently which were the results of 20% projects? Please, just 4-5 products. Real products, not toys like "Google Surveys". Or could you name a few products that were initiated by engineers, not by VPs?


20% time isn't dead -- I have been using it at Google consistently for over 7 years, and it has immensely benefited me. You don't need any permission, at least in engineering.

However, I would agree that it is "as good as dead". What killed 20% time? Stack ranking.

Google's perf management is basically an elaborate game where using 20% time is a losing move. In my time there, this has become markedly more the case. I have done many engineering/coding 20% projects and other non-engineering projects, with probably 20-40% producing "real" results (which over 7 years I think has been more than worth it for the company). But these projects are generally not rewarded. Part of the problem is that you actually need 40% time now at Google -- 20% to do stuff, then 20% to tell everyone what you did (sell it).

I am a bit disappointed that relatively few of my peers will consciously make the tradeoff of accepting a slower promotion rate in return for learning new things. Promotion optimizes for depth and not breadth. Breadth -- connecting disparate ideas -- is almost invariably what's needed for groundbreaking innovation.


Eh, from my perspective 20% time is both implicit and unwanted.

It's implicit because I'm not coding to a spec. I've got an area of responsibility and some general goals the people above me would like to see accomplished. If I have an idea, I try it out. I could see someone in a more spec-based, feature-driven role wanting to have time to work on their own features off the spec.

On the other hand, I personally don't want conventional 20% time. I don't really like working on two projects at once. One fills my head and all of my ideas relate to it and a second one gets in the way. I once again understand that some people can work on two things at once and enjoy it, but hey, I'm not one of them.

So I'm not sure there's an actual problem. If most of your engineers are like me, and are working with a great deal of independence on a project they find interesting, why would they want 20% time?


> What killed 20% time? Stack ranking.

This is just a "They did WHAT?" moment for me. Like, wow. Stack ranking >_< Seriously?


Well, actually Google has had stack ranking forever, since before I got there. I remember my first manager (as a new hire) was telling me about this weird ritual where managers got together in a room and talked about everyone and wrote stuff down.

It is a bit puzzling to me that Google was pretty innovative in a lot of areas, including HR policy, but the perf stuff is unimaginative and rote. Then again, I don't necessarily have a better solution for a company of its size.

From a personal perspective I think it's great to give up a level and 10-20% salary for increased time learning things. Google already pays at least 10-20% more than other places which DON'T have 20% time.

From the company perspective, I think it is sad that 20% time is becoming less and less relevant.


[deleted]


You may want to graduate to management or UML architect or be bogged down in valueless meetings, but that's not for me. I just want to make computers do magical things.

That's not what promotions are at Google. If you want to become a manager, that's called a ladder transfer, not a promotion. Promotions are basically for paying you more for making computers do more magical things. You are never asked to do anything other than software engineering, unless you want to do something other than software engineering.

(Note that software engineering is more than opening up an editor and typing in code, though.)


> engineering is more than writing code, and senior-levels of engineering even more so.

Google's higher engineering ladder rungs are defined more by scope of influence than the excellence or profusion of individual code. At higher levels, engineers will tend to spend more time communicating with and influencing other people.

Some people are happy to stop at level 4 or 5, where they can mostly do their own thing. They can do good work, they can feel they have time for 20% projects, and (to some extent?) they can keep getting raises and bonuses.


Quite a lot has been accomplished by just that, opening an editor and typing in code. It's what it all comes down to, and it's why startups can take the lunch from big corps who lose competency in opening an editor and typing in good code.


Reviews affect salaries even without promotions, and salary ranges are often tied to titles. If you're at the top of the range for your grade, even a good review might get you nothing. Anything that affects reviews - such as stack ranking - is therefore going to affect salaries, and I hardly think it's unreasonable to "give a shit" about salaries.


> I don't understand why engineers who don't want to be managers give a shit about corporate promotions or corporate ladders.

HR/MBA types value titles, buzzwords and other nonsense because most of them are incapable of actually parsing a technical resume. So, if you're going to look for a job, filling in your title as "developer" over a period of multiple years is a losing strategy.

By the same token, without titles, buzzwords, etc., as an engineer I won't be able to differentiate between the resumes of a regular graduate-level physics educator and one who is capable of Nobel Prize-level research.


> By the same token, without titles, buzzwords, etc., as an engineer I won't be able to differentiate between the resumes of a regular graduate-level physics educator and one who is capable of Nobel Prize-level research.

if you're in the position to make a hiring decision and can't make such differentiation - well, yep, that is what we have in software engineering. In physics the situation seems to be a bit (though not much) better, in part because of CVs, in part because the network is smaller.


Promotions and titles play a huge part in a lot of people's self-image, even at Google. While I was there, I heard more than a few engineers describe their near future goals as "Achieve level n by next year" instead of "build this thing" or "Improve this technology" etc. We liked to think we were better than "traditional" software companies like Microsoft or IBM but it was all the same. Corporatism and careerism as far as the eye can see.


It's going to be up to individuals and to a lesser extent, teams, about how much stack ranking really is an important consideration.

First of all, I haven't seen any evidence of the "fire the lowest X% of the stack rank" attitude which I saw at IBM and which has been reported to happen at Microsoft. Because Google is so selective during the hiring process, I don't think it's as necessary as at companies where dead wood accumulates, so the Jack Welch/GE philosophy that there's always deadwood to be eliminated isn't as applicable. (Sure, there will be PIP plans for problem employees, but that's on a case-by-case basis, and not driven by statistics.)

Secondly, the stack ranking is just one of the inputs in your perf score, which in turn drives salary increases, bonuses, and equity refreshes. It is also just _one_ component in the promotion process, which is handled outside of the management chain, by committees of engineers that are typically level N+1 and N+2 of the people being considered for promotion. Speaking personally, as far as compensation is concerned, for which the perf score is the primary driver, of course money is important, but it's not my primary motivator, and I'm paid generously enough that whether I get an extra X% isn't going to force me to try to get that extra bump in my perf score. It certainly wouldn't cause me to try to sabotage my colleagues --- and by the way, the stack ranking is also done by your team members and merged into the results (it's not just a stack ranking done by the managers), so trying to get a higher perf score by not being helpful to your colleagues is not a winning strategy.

Finally, I've touched a bit about the promotion process, and certainly at the higher levels, promotion is done by your potential future peers, and is more about recognizing the fact that you are already performing at the level of a Staff Engineer, or a Senior Staff Engineer. Your manager will write a recommendation letter, but certainly at the higher levels, it will be your future peers across the entire company that will be judging whether the work you are doing has the impact and wider scope that are the hallmark of the higher ranks of the engineering ladder. And of course, if your 20% project happens to have a high impact or affects teams across the company in a positive way, that is something that you can write up in your promotion package, and they will consider it at promotion time.

Also, once you get to Senior Software Engineer, there is no expectation that everyone has to keep on climbing the promotion ladder. There is no "up or out". If you want to do really great engineering work, but it's work that isn't something that fundamentally affects the company's bottom line, or isn't of broad scope, that's OK. You won't get promoted, but at the end of the day, how important is that? You can only buy so many toys, after all, and if you are doing work which is fulfilling and you have other things in your life which are giving you great satisfaction, who cares if you never make Distinguished Engineer or become a Fellow? And that's yet another reason why stack ranking may not be as important.

The bottom line is that "stack ranking" as done by Google is pretty different from what you might think of as "stack ranking" as practiced by other companies, so it's important to keep that in mind.


I'd point out that Page and Brin predicted the course of their own search engine, and perhaps their own company, in 1998:

“The goals of the advertising business model do not always correspond to providing quality search to users.”

“We expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”

“Advertising income often provides an incentive to provide poor quality search results.”

"Since it is very difficult even for experts to evaluate search engines, search engine bias is particularly insidious. A good example was OpenText, which was reported to be selling companies the right to be listed at the top of the search results for particular queries. This type of bias is much more insidious than advertising, because it is not clear who “deserves” to be there, and who is willing to pay money to be listed.”

“We believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.”

“Search engines have migrated from the academic domain to the commercial. Up until now most search engine development has gone on at companies with little publication of technical details. This causes search engine technology to remain largely a black art and to be advertising oriented. With Google, we have a strong goal to push more development and understanding into the academic realm.”

> http://infolab.stanford.edu/~backrub/google.html


This is exactly what Baidu (NASDAQ:BIDU) has been doing in recent years. IMHO, the precision of Baidu's results is less competitive in many niche areas than Google's, and many keywords draw hoax sponsored links from liars/copycats/fake suppliers. The reason is obvious. Censorship in various ways blocks Google from entering the Chinese market, and the rest of the players are just not strong enough to compete with Baidu. However, Baidu is good at providing the latest rumours and gossip on entertainment subjects/celebrities.

Also, Baidu has been accused by many of intentionally removing or down-weighting links to sites whose admins refused to bid for advertising or increase their advertising bids. I cannot explain such a strategy except to hypothesize that Baidu sustains a de facto monopoly.


This is an interesting point, but I don't see how it relates to Google doing away with 20% time.


Just watching as an interested outsider for all these years and partly based on the paper quoted above, much of Google's early energy and innovation was based on the simple idea of extending and applying knowledge of the new field in service to users.

When that focus shifted to exactly what Page and Brin had criticized themselves in 1998, we can clearly see that it's not the interests of the users being served anymore.

The decline of 20% time appears to go hand in hand with this shift from a focus on the user to a focus on the advertiser and other interests first. I'm not claiming direct causation, but it's not a stretch to say there might be some connection.


I sometimes do wonder what would have happened if they'd decided to charge for their search engine.

Of course, putting it behind a paywall would have been a failure. But they could have possibly done a freemium model, offering better and more detailed search capability for a small monthly or yearly fee.


If they hadn't done advertising then many of the non-search products (Gmail especially) could have been behind a paywall. The brand recognition from search would build the other businesses for them.

All the analytics, google alerts, etc would have been non-free products as well.

It just would have made them much, much, much less money.

But god damn that search engine would be amazing by now.


The difference between Google's advertisements and OpenText's is Google (usually) identifies advertisements as advertisements.


Way to hijack the thread with anti-Google propaganda ...


My post in fewer words: Page and Brin have wandered from their roots.

That's not "anti-Google propaganda". That's honest criticism from an old geek.


Not really, as zeckalpha pointed out, this is quoted out of context and was talking about OpenText:

"The difference between Google's advertisements and OpenText's is Google (usually) identifies advertisements as advertisements."

Also, Google results have gotten significantly better than they were in 1998 so even if we ignore the OpenText context, I don't see how this could be a successful prediction as you're trying to imply.


Why is this anti-Google propaganda? I can't see a single point that is not reasonable enough on its own merit, and none of them applies to Google alone, but to all search engine providers.


I think he forgot the sarcasm tag, but it's hard to tell.


First and foremost it's off-topic. Second, these are the same citations used by those who were pushing for the FTC to sue Google for antitrust, so they have political baggage, third: it's before the company was founded so it's all academic and theoretical with no actual experience behind it, fourth: context matters - choice quotes from long texts have been used for Google bashing before, some earlier this week even.


> choice quotes from long texts have been used for Google bashing before, some earlier this week even

Somebody on the Internet is wrong and said something bad about Google? How dare they! At what time did this serious offence occur?


Being before or after the company was founded, being 10 years ago or today, has absolutely no bearing on whether what was said is right or not. Experience != truth, just having experience doesn't mean what you say is more likely to be right. The theoretical can be right or wrong. You haven't proven your argument at all.


no, it's confirmed now that he's just crazy. Dude, get an objective bone in your body, otherwise you're just a crusader.

Not everyone cares about whatever is going on in antitrust, but if you're not objective, you have no argument.


Propaganda is nothing more than an idea that spreads (propagates). Or in other words: an idea.


As a Googler, I can confirm that this article is... completely wrong.

I don't have to get approval to take 20% time, and I work with a number of people on their 20% projects.

I can also confirm that many people don't take their 20% time. Whether it's culture change due to new hiring, lack of imagination, or pressure to excel on their primary project, I'm not sure, but it is disappointing. Still, in engineering no permission is needed.


What would make this a more durable statement is if you could say "I don't have to get approval to take 20% time, and I work with a number of people on their 20% projects. And my calibration scores are no different than when I don't use my 20% time."

There is a reason the article talked to many ex-Googlers: they are the ones who, perhaps, left when there was a difference of opinion about their contribution to the company. And if that conversation attempted to count some of their 20% projects as part of their contribution, and that was the disagreement, well, it ends with them leaving.

I left in 2010, which was the start of the lurch toward more managers, and those managers were being scored on what their team accomplished that was assigned to the manager, not on what their team accomplished on their 20% time. Some managers had taken that to mean that 20% time didn't count at all toward calibration, and if you were working only 80% of the time on their projects it counted against calibration. Hence the disconnect. That is why "120%" sort of works, even in the face of a manager trying to make their own number. Except the benefit sounds different if you say "And on Saturday you can work on any project you want." :-)


Look, this is the type of thing that's impossible to settle, because yes, different managers and _peers_ might view things differently. As you know, but people here might not, there's as much weight on peer reviews as your manager's review. So what the article didn't mention (which is why I don't think they talked to that many ex-Googlers) is that if your peers feel like you're slacking your perf score will be lower, so even they could view 20% time negatively, if they wanted to.

The policy of course is to not do that, but how would Google enforce that? If you spent 200% of your time on your primary project, you might get excellent reviews. Should you be calibrated 50% lower to normalize everyone? So yes, if you work 100% on your primary and +20% on your 20p, then you might get higher perf scores. Eh, if you have a good idea how to correct for that, go ahead, but it's a people problem and people are hard to deal with in these kinds of things.

This is one reason why I actively encourage as many people to take their 20% time as possible. :)

My personal experience though is very positive regarding 20% time. My first project, I started, gathered a few contributors, built a prototype, pitched it and turned it into a full-fledged project. I got great reviews and promoted in large part because of that. In my second PA I work on open source projects. Less glamorous, and maybe less perf-impacting, but that's not my goal right now.

In general the people I've seen take 20% time in a focused way tend to be the higher-performers. They're more self-guided, more critical of problems that need to be solved, or use 20% time to teach themselves or move onto harder problems. Maybe as we've grown the percentage of employees like that has shrunk, maybe it takes a while to realize you have 20% time. Not sure, but I'd like to see it be used more, it's good for everyone.


Impossible to settle? Probably. Impossible to discuss? No.

I'm not being critical of your account, I'm glad it has worked for you, as the manager of the engineering effort here I want to take the "good" stuff and use it, and leave behind the "not so good" stuff. So that is my agenda in this discussion, nothing more, nothing less.

"In general the people I've seen take 20% time in a focused way tend to be the higher-performers."

That is my experience as well, and regardless of a company policy for or against 20% time, focussed, high-performing engineers will spend 20% of their time doing cool and innovative things. Whether they do it at home or at work doesn't really matter, and having them do it at work is good for the company.

So the interesting thing about the policy is to capture the next tier of engineers and help them to be more productive by encouraging them to develop habits of the aforementioned highly successful focussed engineers. And to weed out the folks who are abusing the program [1] or at least not being any more productive with it.

If nothing else, people are different, right? Facebook's response has been "hackathons", which carry some characteristics of high performers (who quickly prototype ideas to test their validity or get a handle on their challenges).

But in all those scenarios, if you have managers, you need to also train your management on what the program is trying to achieve and how it might be addressed. So you don't end up with some managers giving their people 1 day a week off, and some demanding they work on Saturday if they want to use that extra time.

If it is "You have this huge resource available, dare to use it." then you can manage to that without damaging either morale or perf scores. From the anecdotes in the OP article it sounds like they are still working on that part.

[1] Like the guy who said he was trying to capture the great ideas he dreamed about in his 20% time, so he would spend it napping for an hour at a time and then waking up and writing down what he had dreamed about.


My experience isn't directly about 20% time, but may be relevant to the question of who at Google is allowed or encouraged to innovate.

My previous experience is mostly in R&D startups, doing robotics and natural language processing. I was hired in 2010 and then found out I would be working on YouTube ads. I was disappointed, but decided to try to make the best of it. After three months I realized my lack of interest was going to be a problem, so I talked to my manager about trying to transfer. He discussed it with the site director, and the response was "we don't care" and I couldn't transfer until I'd been there 18 months.

I decided to stick around and see if I could work the system in some way, with the probably naive thought of trying to demonstrate my abilities and catch someone's attention that would help me transfer to a project I'd enjoy where I could make a real contribution.

During Innovation Week (a hackathon) I led a team of three other engineers working on an idea I came up with, and we won the "Most Innovative" award. The other engineers decided they all wanted to devote their 20% time to working on my idea.

My Tech Lead and my manager had no interest in my Innovation Week project, and I still had no way out of YouTube ads. Unsurprisingly my performance on my 100% project wasn't great and even if I made it to 18 months it seemed unlikely that I'd be able to transfer. I left after 17 months at Google.

I've mentioned this story before, and I hope I'm not just grinding an axe--I'm just telling my experience in the hope that it will inform engineers about possible outcomes of working for Google. I am fully responsible for my experience there, but I can say the priorities of the (large, heterogeneous) company were not what I (again, probably naively) expected.


Michael Church (an outspoken ex-Googler) would disagree with you. One thing I remember him saying that was verified by several other Googlers, both current and ex, is that whether you are allowed 20% time depends on your team and your manager. And in fact, most teams in Google do not get 20% time, so you may be one of the lucky ones.


Oh boy, michaelochurch. His experience was extremely atypical. He had a legitimately awful time at Google. I don't know what happened between him and his manager and team so I can't say who or what was to blame. But I think he was put on a performance improvement plan (whether fairly or unfairly). And I think it would make sense to most people that an employee on a performance improvement plan would be strongly discouraged from expanding into a 20% project until their performance improves.

FYI, when it comes to reading any Michael Church claim, especially one about working at Google, I'd put more weight on the "outspoken" than the "ex-Googler." He admitted himself on HN that he engaged in what he called "white hat trolling" while at Google, a pattern he seems to have repeated throughout his life.

He's very articulate, but he has an EXTREMELY strong belief in the quality and accuracy of his own ideas. For illustration, he wrote a few months ago, "societies live or die based on what proportion of the few thousand people like me per generation get their ideas into implementation." When presented with self-confidence so far out of proportion to the corroborating evidence, I think an appropriate response is skepticism.


Oh, and, unfortunately, he has a known history of creating sockpuppet accounts, which makes me unrecoverably skeptical of anyone who cites or endorses his ideas.


Citation?



Jesus Christ.


Michael Church is someone who is not worth listening to. He made massively incorrect assertions while at Google, and continues to after he basically talked himself out of his job in a very public way.

Engineers get 20% time, period. You can be asked to defer it for a quarter. On the other hand, most don't take it.


Just because he made some massively incorrect assertions while at Google does not mean he is not worth listening to. Everyone says incorrect things sometimes. I mean, can you claim that you are right all the time? Probably not.

I don't know him personally but I read the stuff he writes, and while his character can be a bit abrasive he's an extremely intelligent dude who can make astute observations and connections that other people miss. I think you're doing him a lot of disservice by dismissing him the way you did.


He was at Google for 6 months. Whatever you think about his opinions in general, he doesn't know anything about how things at Google work.


Not reflecting on this case specifically, but if six months isn't long enough for a new employee to generally understand how a company works, it would seem to suggest something's wrong with the company's culture, or at least with how people are brought on board. Six months is a long time in an industry where people change jobs every two years.


Part of the problem with mchurch was willful ignorance. A few colleagues, including some fairly senior people, reached out to him and volunteered to try and help him resolve his concerns. To my knowledge he never took them up on it.



I consider myself to be consciously observant of organizational issues. I would say it took me about 4 years to develop a reasonably rounded picture of "how things work" at Microsoft, and then only from the point of view of a low-ranking employee. Big companies are vast, layered, intensely game-oriented social universes.


How long do you have to be at a company to understand how things work in it?

(That's a trick question, since the answer is going to depend on the person's intelligence and ability to make observations.)

Also, a ton of long-time Googlers agree with the stuff he says. It's not like he's some lone dissenting voice.


I'm a Googler myself and my manager has told us that although 20% time is OK, we should only do it if the project has some direct contribution to our core work. It really depends on the team/department you work in; the policy is there to allow it, but different teams have different cultures. I believe it's more discouraged in certain departments such as Android and Social (G+) (those teams are probably under more constant pressure to "produce") than in some of the more "old school" departments.


I agree it depends on the culture in each department. For someone already working long hours and under constant deadlines, I can imagine how they don't feel like they have 20% time. That's not a good way to work continuously, and hearing of certain departments or projects doing that to their teams is really saddening.

Personally, I did have very soft discouragement against wide-open 20% time. I asked around, and a lot of people I talked to initially advised me against starting a 20% project so early, and especially against starting a new project rather than working on an existing one with engineers at a higher level than me. At least, they said, make sure I could get reviews out of it. That's not policy, but advice. People have their own theories about how best to get noticed and promoted. Some of that has to do with 20% time. I guess if you're solely interested in promotion then you give more weight to such advice. I hope most Googlers aren't solely interested in promotion.

I'm very glad I ignored that advice, both because I got to do very interesting things and because I got recognized for it, and I'm glad that I could ignore the advice because of our policy.


I think michaelochurch was pushing for things like open allocation and taking responsibilities for things like performance improvement away from HR.


I am not a Googler, but what you say is more in line with my thinking. You don't ask for the 20% time, you simply take it. Ask forgiveness, not permission. Until they send out a memo saying you're only allowed to work on or think about tasks assigned by a superior officer, assume you're an autonomous being :-)


The policy is that you don't ask, you tell. If your manager needs you to focus on your primary project, she can ask you to bank your 20% time for up to a quarter, but then you get to use that saved time.


So, the employees and other sources cited by the author are lying about needing manager permission?

Or are you in an unintendedly unique position?


They may personally feel that they need manager permission. That doesn't mean that they do. There's also a wide range of managers with different opinions throughout the company.


> They may personally feel that they need manager permission. That doesn't mean that they do.

That doesn't seem to track with the article, or the articles/blogs linked from the article.

Lots of no-comments from the PTBs at Google... too bad. They could end the discussion one way or the other quickly.


I'm a manager at Google and I think 20% time is great. Engineers explore something that's exciting to them, and the projects usually pay dividends, either in great launches or else people get a chance to learn new skills or hone their current skills.


It entirely depends on your manager and team. Some managers might want you to ask permission, and/or if you take the "forgiveness" approach your relationship with that manager might suffer. But that's because people are people.

I can't imagine my former TLM ever taking 20% time, but I can thoroughly imagine him encouraging every single one of his direct reports to do so, unless we were on a launch sprint.


Frankly, I don't think that inconsistent application of rules, and reliance on a good manager, is a good thing overall.

I don't want to equate this with working at Google, because I don't have enough information, but here's my anecdote about good managers:

Twice before I've worked with spectacular managers. They treated their employees well, and acted as great shields against the political infighting inside the company. But both times, the managers were forced out, and the employees that depended on them to get interesting work done ended up quitting as well.

If this is indeed, as the TFA posits, a memo from the top being filtered out by a few good managers, in a short amount of time those managers won't matter; they will be forced, or burnt, out. Neither case is good for the company.


> Frankly, I don't think that inconsistent application of rules, and reliance on a good manager, is a good thing overall.

If you need humans instead of robots doing a job, that usually (increasingly, as automation advances) means it requires judgement such that pre-written inflexible rules will be inadequate to handle it sufficiently. Which means you need to rely on the judgement of people applying flexible rules for the best results.


Not a Googler either, but I can't believe it'd be anything like the forgiveness/permission dichotomy, since I believe they're all given explicit permission upfront. It's a part of the culture.

Rather, I've always imagined that it'd be closer to perks like "unlimited vacation time": you can't just decide "well, I'm taking the next 5 years off," because that'd be a dereliction of duty. In other words, if you have the premiere release of your main project coming up in 2 weeks, it may be in your best interest not to take that 20% time for the time being, but instead use it to fully ensure that your project has a successful launch. But if you don't have any big deadlines coming up soon, and you're reasonably on schedule, then by all means take the 20% time.

The 20% time is a perk, and just like many other perks, your use of the perk cannot preclude any tacit duties that you may have, which include delivering on your assigned projects.

But, this is just an outsider's view, I could be completely wrong.


"if 20% time has been abandoned at Google, are other companies, which reportedly include Apple, LinkedIn, 3M and a host of others, wise to continue trying to copy it?"

That's incorrect: AFAIK 3M pioneered this approach (with 15% of time spent on self-directed projects).

For example see: http://www.fastcodesign.com/1663137/how-3m-gave-everyone-day...

Talking to a friend at 3M (who has been there 20+ years, an engineer with dozens of patents) I am told that while 15% officially still exists, for a long time it's effectively meant working 115% of hours.

Nonetheless the tradition allowing self-directed research continues at 3M - and this might mean using lab resources, or creating prototypes without getting approval.


The author needs to google 3M and find out the actual facts. It's written as if Google came up with it and everybody was in a mad rush to copy it. God forbid innovative ideas come from anywhere other than the companies we perceive to be the most innovative. Google was keen to adopt a practice that had worked well elsewhere, and they seem to be focusing on other methodologies now; maybe they've figured out something that works well for Google NOW. It doesn't make 20% time more or less effective. It's just how companies evolve.


Agreed, it's a silly question even if it were true that Google invented the idea. I do not get the impression that Apple is a company that mindlessly copies R&D strategies from Google.


> Nonetheless the tradition allowing self-directed research continues at 3M - and this might mean using lab resources, or creating prototypes without getting approval.

And you get paid for those hours, even though they're extra hours you wouldn't have worked otherwise.


"And get hours paid for it,"

I don't work there, but I feel pretty safe in saying 3M researchers are not hourly employees.


But it could correspond to other benefits such as applying the time towards paid time off or something like that.


That strikes me as very reasonable. 20% (15%, whatever) time is largely self-indulgence plus self-education. I have no problem with being expected to supply that initiative. I've done it before, and invented products that the company ended up selling (until I realized the only payoff in that case was a "we'll compensate you for this later, trust us" - I'm not talking about any company discussed here, btw).

This way it is not goof-off time as it will not attract anyone except self-starters and thinkers, but the company is providing intellectual and material resources. You get to have fun, rejuvenate yourself on the job, the company contributes some, and maybe, just maybe, you hit the jackpot at some point. If not, well, you learned some stuff and had fun.


I offered 20% time to my team and only about 5% of them ever wanted to take advantage of it. I wanted a higher percentage of time spent thinking outside the box so I ended up creating a formal "innovation" group, through which individuals are cycled based on ideas, needs, and desires.


The title of this article may unintentionally embody a point about 20% time, because GMail and AdSense have something else in common that Occam's razor implies was more important than Google policies: http://en.wikipedia.org/wiki/Paul_Buchheit


Are you saying that if you give gifted and talented people the resources and means to build things, they build amazing things? This is ludicrous!

To your point, though: Google probably has many "Paul Buchheits" (by that I mean people at his level of intellect and ability), and so do many other companies. Being on the end of having a good idea and not having the company's backing to build it sucks and is discouraging. That's why people found startups (or at least why I like to believe some people do): to take the idea the company didn't give them the time/resources for and build it themselves. Partly to show the company it was dumb, but also because they believe in the idea. I think as corporations try to maximize profits and throw things like 20% time out the door, the world will see even more startups, and I believe we're already seeing that trend with the newer generation. Ideas can consume people, and if corporations learned to harness that power (like 20% time and hackathons), they could innovate in ways people believed impossible.


I just launched a project, which made it to #2 on Hacker News, which was done almost exclusively in 20% time. (Well, also a few nights and weekends, and my manager gave me some time away from my main project to work on it, but it was in all ways a 20% project. Self-conceived, self-directed, and done in addition to my regular job duties.)

https://news.ycombinator.com/item?id=6209713


This means that a lowly engineer with a great idea will leave Google to develop it instead of developing it at Google. Perhaps, this is a good outcome given the way Google keeps killing web services that people find useful but aren't profitable for Google.


I'd be more worried about hiring the next great engineer and continuing to grow. I imagine this was a huge draw for potential new recruits who might now look elsewhere. Existing employees usually have a lot of other reasons for inertia and therefore not leaving (comp and all the other perks)


But then what's stopping Google from just turning around and buying their startup if it looks promising?


That was my first thought, but then I realized that no one can force them to sell to Google.


They can't force in the gun-to-the-head sense, but they can "force" a startup to sell with the threat of throwing a billion dollars at the problem themselves.


Because one can buy success? Like Google+? Or a lot of things Microsoft has tried recently? I realize, all else being equal, a threat of a major investment by a deep pocketed company shouldn't be taken lightly, but the idea that Google can "throw a billion dollars" at a market segment and thus dominate seems like wishful thinking by management.


I'm a Googler, and comments here from other Googlers are correct: 20% time still exists and requires no manager approval, but only a minority of engineers take advantage of it. For instance, in my team of 14, only 2 work on 20% projects.

I haven't been at Google long enough, but I doubt there was ever a time when a majority of engineers did 20% work. It's hard: you have to take away not only time, but also focus from your main project (which like all software is probably already taking longer to build than you'd like). Most engineers aren't motivated enough.


I worked at Google from 2002-2007, and the dirty little secret is that there is no 20% time. It's 120% time. You work a full schedule, and if you want to put in extra hours for a side project go ahead. I explicitly asked to use 20%-time to work on a side project (i.e., literally take 1 day a week to work on it) and was refused.

20% time has always been a great marketing gimmick for Google. I remember interviewing candidates and mentioning it as a bonus to prospective candidates - most employees mentioned it as a bonus because we were inculcated into believing it was actually a bonus. It's much easier to see that it was never a real perk once you're outside the bubble.


As a former Googler, I definitely agree with the 120% time. But I think that's more because it's hard to do any work effectively only one day a week. In order to make your 20% time useful, you just have to put in more than 20%. And since you obviously can't reduce your other workload, you put in extra time to make your 20% project successful.

And for what it's worth, I personally have never had a problem requesting 20% time. As at all big companies, it really depends on your team and boss. I believe your experience of outright rejection is more the exception than the rule.


"Google’s “20% time,” which allows employees to take one day a week to work on side projects, effectively no longer exists. That’s according to former Google employees, one who spoke to Quartz on the condition of anonymity and others who have said it publicly.

What happened to the company’s most famous and most imitated perk? For many employees, it has become too difficult to take time off from their day jobs to work on independent projects.

This is a strategic shift for Google that has implications for how the company stays competitive, yet there has never been an official acknowledgement by Google management that the policy is moribund. Google didn’t respond to a request for comment from Quartz."

What I've heard is more nuanced: the policy is not dead, but it's very hard to both satisfy the core demands of your job and maintain 20% time. So it's not dead, but possibly not an actual priority in the culture the way it used to be..?


You're quoting such wordings as "effectively no longer exists" and "as good as dead", not "no longer exists" and "is dead".

The article seems to be pretty nuanced already.


The rest of the article describes how it's being pinched out of employees' schedules using performance evaluation policies.


> "What I've heard is more nuanced: the policy is not dead, but it's very hard to both satisfy the core demands of your job and maintain 20% time. So it's not dead, but possibly not an actual priority in the culture the way it used to be..?"

... and the difference in outcome is?


> Six months after he took the reins, Page announced that Google would adopt a “more wood behind fewer arrows” strategy that would put more of Google’s resources and employees behind a smaller number of projects. This meant killing off Google Labs, which had previously been Google’s showcase for its experimental projects—many of them products of employees’ 20% time.

Guess who advised Larry Page to focus on a small number of projects? Steve Jobs.[1] It seems like Larry's acceptance of Jobs' advice has sabotaged Google. I'm no conspiracy theorist, but one does wonder if this was Jobs' secret intention.

[1] See the second quote in: http://www.edibleapple.com/2011/10/22/steve-jobs-advice-to-l...


On the other hand, focusing on a small number of great products is what Apple always did under Jobs. The first thing Jobs did when he came back to Apple was jettison superfluous products and concentrate on what Apple was good at. Seems to me like he was giving genuine advice.


He was giving advice that was genuine for himself. Apple did awesomely with focus and top-down management because the guy at the top was Steve Jobs, who had an uncanny knack for sensing what the market needed. But companies are different: what was a core competency of Apple is something that Google doesn't have, and Google has various other core competencies that Apple never had. The general rule of business is "play to your strengths", not "emulate the winner".


I think it lines up with how Jobs ran Apple. He killed a ton of "blue sky" research projects after he took over as CEO, and Apple under his leadership spent a lot less on research than Microsoft or Google.


A Google recruiter, while doing his spiel, told me in a light-hearted, offhand manner that some engineers call it "120% time". Having lurked here long enough and read similar sentiments from ex- and current google employees, it wasn't really news to me, but I was pleasantly surprised at his candor.

But do any of the other companies listed do 20% time? AFAIK, they only have infrequent "hack days".


Yeah, Google still strikes me as probably a great place to work. 20% time is not common in the industry, and trust me-- if you work for Google you work in a much better place than your typical "enterprise software" hell-hole.

But this might affect their ability to attract absolutely top-tier talent.


Ours instead does one week a year. They get scheduled like any other project. It means they can't be as ambitious or unbounded, but it also provides uninterrupted time to work on whatever-it-is you are interested in.


The absolutely relevant Dilbert strip:

http://dilbert.com/strips/comic/2011-12-19/



Businesses don't live and grow forever. They have a lifecycle, and when they get big enough and their business model becomes mature enough, they divert resources away from risky innovation and toward defending the fortune they've amassed, defending their primary cash cow products against competition through anticompetitive practices like patent litigation, and promising steady returns to shareholders. The problem is, this swing toward conservatism tends to fossilize both the product and the organization.

If you fully de-risk your company, it will stagnate. No risk, no reward either. And in Google's quest to kill Microsoft, they are becoming Microsoft.


It probably made more sense to have 20% time when Google didn't have tens of thousands of engineers. When you have a few hundred really smart people all working in the same building back in the early 2000's in Mountain View, it was I imagine a different thing than it would be now.


I think there may be some truth to this. With a company as large as Google, it may no longer be as beneficial to give everyone 20% time. Instead it's better to take the best engineers (and probably most innovative) and give them 100% time in Google X.


Seems odd that nobody is mentioning Google's in-house makerspace, hackathons, and 20% weeks (where engineers without 20% projects, or who prefer time-boxed commitments, take a week once a quarter to work on stuff). I ran a 20% week project last year - or maybe the year before? - and there are several really fantastic people working tirelessly on the Garage and other makerspace stuff to ensure that the 20% mindset is alive and kicking.

I totally understand the "management killing freedom of experimentation" comments, but they are only one side of the story, and a side explicitly pulled from disgruntled Xooglers at that.

The Garage is one of the things I miss most about Google - both the makerspace itself, and what it represents. We hosted several internal hackathons there, both within and outside the intern season and had great success.


> 20% weeks (where engineers without 20% projects, or who prefer time-boxed commitments, take a week once a quarter to work on stuff)

Wouldn't that be 8% weeks?


Actually that's good news for startups.

Maybe current Google employees now prefer to develop their ideas in their own startups rather than as part of the 20% program..


It's human nature to stop "screwing around" in order to get "real" work done and make your boss happy. The only way to keep 20% time is to force everyone to take it. If others on your team stop doing it, then the competitive balance isn't fair anymore. If your boss can get away with unofficially taking it from you, then she will. Something like 20% time must be in the company policies with punishment for those that discourage it.


Wow. I never thought of 20% time that way, but it makes perfect sense. Just like mandatory insurance. Thanks!


Six months after he took the reins, Page announced that Google would adopt a “more wood behind fewer arrows” strategy that would put more of Google’s resources and employees behind a smaller number of projects.

There's nothing more soul-crushing for an organization than a board of directors.


I think it's not unreasonable for senior management at a company to say, "Geez, you know, we have our fingers in too many pies." Sometimes their logic is sound - the side projects are distractions, they're consuming resources that could be used on core business ideas, we can shift innovation to a focused group of our core innovators. And sometimes it's not sound - it's purely about the dollars and cents.

But to argue that the board of directors is what's soul crushing about an organization seems odd to me.


"we can shift innovation to a focused group of our core innovators"

It might be easier to manage a separate Innovation Department, but what happens when someone on another team has a great idea? Do we transfer that person to the Innovation Department to continue innovating? What would that mean for their team left behind, relieved explicitly of responsibility, and implicitly of capability, to innovate?

Maybe the whole company should be tasked with innovation, but innovators should be allowed 80% time to write CRUD apps if they so choose.


It might be easier to manage a separate Innovation Department, but what happens when someone on another team has a great idea?

Exactly. At the end of the day, anybody in a company can be responsible for a valuable innovation. This is why something like "20% time" is such a good idea in the first place. It avoids the trap of assuming that you can actually accurately select the people who "should" be doing innovative stuff.

OTOH, having a dedicated Research department where people spend 100% of their time on research and cutting-edge stuff also seems like a good idea. An interesting approach might be to have both "20% time" AND "Google X" but with a policy that allows the Hoi Polloi folks a chance to rotate through the dedicated research group on occasion.


My point is that they tend to be penny-wise and pound-foolish. Killing an idea like 20% time because it'll save money in the short term while destroying a culture of innovation that can lead to gains in the future is precisely what I meant by "soul crushing."


I'd better stop working on my 20% project. I just wish I didn't have to hear this from Quartz!


As someone inside Google, this simply doesn't square with reality: just yesterday I spoke to a friend about joining a project he's working on, on a twenty-percent basis.


"Google has a highly developed internal analytics team that constantly measures all employees’ productivity—and the level of productivity that teams are expected to deliver assumes that employees are working on their primary responsibilities 100% of the time."

I know that it is a large company and control is required to manage everything, but this sounds depressing. I'm glad I work in a tiny company and even smaller team.


This description may lead you to imagine software counting how many lines someone changes or how many hours of the day someone is typing. That would be pretty depressing, but I don't think Google does that.

Their performance measurements are much more based off asking other engineers and colleagues what they think of your work.


Bullshit. I'm a Googler, and I'm 20%ing today (as soon as I finish breakfast).

There's lots of pressure to focus and do well on your main project, sure. But it's not external pressure, it's internal pressure. Everyone wants their software to be great. It's hard to detach for a day, ignore the influx of bugs and emails, switch gears, and work on something completely different. Most engineers don't want to spend the mental overhead.

But speaking from experience, when you do choose that path you get nothing but support.


Honestly, this completely makes sense given how Google has been killing off non-profitable products lately. Why pay engineers to work on projects that will never meet a business case? This isn't the olden days of Google, where they would shovel anything out there and see if it stuck.


The same reason why a VC fund will invest in 10 startups knowing that 9 will have to be closed down as failures. 20% time meant that Google was continually making those bets in various areas, knowing that most will fail but that a few will succeed in a big way.

They now seem to think that their management has obtained good enough crystal balls to be able to determine beforehand which of the potential products will succeed and which will fail. Meaning that they can scrap the 20% time and put all their "wood behind the few arrows" they have determined will succeed (you know, like Google+).


It really depends on the real stats though. Because if only 1 out of 100 20% projects is worth anything, maybe it's just not a good use of your engineers. Like a VC fund with too low a success rate just goes out of business.


Google Wave failed quite horribly. However, the resulting technology has been rolled into Docs, revolutionizing how Docs is used for collaboration.

There was a good image I saw on reddit a while back that I can't find offhand. On the left, titled how most people view success and failure, it had a single branch: you fail or you succeed. Then it showed how successful people view it: a chain of failure after failure finally leading to a success.

If you eliminate failure, you eliminate the success you gain from learning from failures.


Why pay engineers to work on projects that will never meet a business case?

Isn't that begging the question in regards to whether or not you can know, ahead of time, which projects will or won't become viable and have a supporting business case?


"Why pay engineers to work on projects that will never meet a business case?"

You need your engineers to put up with mundane, boring work. Paying them to spend a minority of their time on an interesting project helps.


The 20% thing isn't really much of a perk anyway. I would only ever work for a company that let me work whenever I wanted, take naps whenever I wanted, go for walks whenever I wanted, and work on other projects. I do this at my current job, and I still end up being more productive in my actual work than any other developer I've ever met (and more productive than the average google engineer, I'd venture to say). I run the company, so I guess I can decide how I work :)

See also the "people simply empty out" article posted on HN. Google seems to be falling for the PHB fallacy that people are just machines that crank out code. Perhaps that is what google is becoming, but I certainly wouldn't want to work there.


Just wanted to say one more thing: I've heard some Googlers say this is accurate, and I've heard some say 20% time is alive and well.

Google is gigantic these days. It probably depends a lot on where you are in the company. Two people in different Google divisions or even physical offices are not going to have the same experience.


> It probably depends a lot on where you are in the company

This means it's dead as a company-wide policy, and it's problematic to sell it during recruitment.


This is the opposite of Black Swan farming.

With any project like 20% time, you should expect that almost everything produced will be worthless. So what? All you need are 1-2 big hits to justify the entire program.

Letting employees work on what they think will be valuable is a bottom up approach that can reveal knowledge that's inaccessible to managers.

Some comments on here say that the program 'only' created Gmail. That's been a massive success for Google. That + the smaller successes it has led to likely justify all the projects that didn't pan out.


But the problem is that the incentivization scheme for a middle manager doesn't align with what 20% time gives the org.

Let's suppose you are a middle manager, and your performance is based on the completion of projects assigned to you by upper management, with perhaps a little weight given to the innovation your team cranks out in 20% time.

If the projects you are given are slipping, or are not achieving the kind of success that makes you look good (for the purposes of your own promotion/bonus), do you allow your "resources" to work on 20% time, which may bring very little benefit to you directly even if it succeeds (it benefits the engineer, not you), or do you divert that 20% resource to projects that _do_ give you personal benefit?


I do not run a multi-billion dollar tech company, but I look at this and feel a bit sad.

Some of the most useful stuff to come out of Google has been due to the labs, and the people who work at Google are supposedly some of the people at the top of their game. Now they're not going to benefit us with their great ideas, just work on the "more wood" approach, which sounds like Google's employees and consumers are the ones getting the wood.


Maybe developers smartened up. If you have a solid idea that Google would benefit from and allow you to work on it in your 20% time, why not quit and do something similar on your own? Google is no longer a cutesy company, they're a corporation like all others, and most employees probably woke up and realized they don't owe it a single thing, let alone their brightest ideas.


If you have a solid idea that Google would benefit from and allow you to work on it in your 20% time, why not quit and do something similar on your own?

To be fair, not everybody is interested in running a company, and doing all the "other stuff" that it entails beyond writing code. Also, Google provide great resources and infrastructure, which it would be hard to replicate (assuming the hypothetical project in question required Google scale infrastructure).


I think that when it comes to an idea one truly believes in, even if one is not inclined towards running a company, one is likely to give it a shot anyway. I'm talking about real passion here. The kind of passion that would cause you to break down if Google were to decide that your 20% time is misused and make you do other things instead, or if your side project took off and was taken away from you.


I think that when it comes to an idea one truly believes in, even if one is not inclined towards running a company, one is likely to give it a shot anyway.

I can see where that would be true for some people, but I doubt it's universally true. And it may even be that it just makes more sense to do the idea inside Google than doing it independently anyway. Again, look at the resources and infrastructure Google already have assembled.

And some people may have just bought into the Google mission / vision / whatever, and / or just feel a sense of loyalty to GOOG.

Anyway, I don't mean to suggest that it never makes sense to quit and do your own thing. I mean, hell, I probably would, as far as that goes. I'm just saying that there are some legitimate reasons why some people might prefer to just work within Google than quit and run off to start a new business of their own.


A few reasons:

- Many 20% projects are not things that would be easily monetizable: open source work, non-technical things, improvements to infrastructure.

- If the idea is very speculative, exploring it while still getting paid a handsome salary is a much easier step than quitting your job.

- Some things are easier to do if you have all the Google infrastructure to build on: for example the Transit Maps thing below is a lot easier if you can plug into Maps! (But some things would be easier to launch externally.)

- Part of the attraction, like for open source projects, is that it's something different from your main job, so you learn and stay fresh. If you turn it into your main job you lose that.

The theory is that if you do work on a new product and it works well, eventually it will be staffed full-time and you'll be rewarded. Apparently that did happen with Google Now. I doubt it happens every time, but then not every worthwhile startup succeeds.


Wasn't the "all the wood behind one arrow" motto famous for heralding the decline of Sun Microsystems?


In all fairness, 18 months ago the perception was that Google was doing a LOT of things but none of them very well. Embarrassing attempts at social networks (Wave, err, that other one I'm forgetting) are just one example. So narrowing their focus (still) seems like a good idea. But having great engineers is the most important thing for any success, so I'm not sure why you'd take away a mystical founding pillar of your culture.


>Embarrassing attempts at social networks (wave, err, that other one I'm forgetting)

You mean Orkut. That was weird: it got wildly popular in Brazil, India, Turkey and other places, but never really took off in "the West" (weird name, bad quality when first released, boring UI...). I guess most of its members (it's still going!) will become G+ users, so it wasn't a complete failure in the end (in fact, a large userbase in developing countries is an insurance policy for the future, if you can keep it going).


He probably meant Google Buzz, which was an unmitigated failure. Orkut, all things considered, is actually reasonably successful.


Fair enough! I always thought of Buzz as a twitter clone more than a real attempt at a social network though (which Orkut undeniably was).


For what it's worth, I really loved Google Buzz, and all the techies I knew used it, but we didn't migrate to Google+ when Google Buzz was discontinued. We were just very sad.


I did mean buzz. I think Orkut was an acquisition if I remember correctly.


I'm sure it was a 20% project, but I could misremember; it was a long time ago.

EDIT: Wikipedia says "Orkut was quietly launched on January 22, 2004 by Google. Orkut Büyükkökten, a Turkish software engineer, developed it as an independent project while working at Google. While previously working for Affinity Engines, he had developed a similar system, In Circle, intended for use by university alumni groups. In late June 2004, Affinity Engines filed suit against Google, claiming that Büyükkökten and Google based Orkut on In Circle code. The allegation is based on the presence of 9 identical bugs in Orkut that also existed in Circle."


More likely imho, he meant google buzz


or + ?


> Embarrassing attempts at social networks (wave, err, that other one I'm forgetting) is just one example.

Wave wasn't a social network, it was a collaboration protocol with a proof-of-concept demonstration app.


Wave was just bizarre. I called it "4chan enterprise" almost instantly upon checking it out.

I did get the impression it had some interesting innovation under the hood though that made it into Plus and Docs.


>Wasn't the "all the wood behind one arrow" motto famous for heralding the decline of Sun Microsystems?

all the wood led by deadwood behind one arrow ... this arrow ... now this arrow ... no, this arrow that has just been renamed to the name of that arrow ... of course it is an arrow, it just looks like a washed-ashore deadwood tree trunk, and weighs like it, and flies like it, well, that is of course if somebody can make it fly ...


This doesn't seem all that surprising. You get what you measure. If Google has been placing a greater weight on direct short-term contributions than on blue-sky projects, and then putting employees in competition with one another, then of course employees are going to feel under pressure to reallocate that 20% of the time toward the stuff that makes them money.

Some persist anyway, because of personal satisfaction or (much less often) sincere belief that what they're working on in their 20% time will eventually win big for both themselves and Google, but employees take those risks and make those deals at every company. The middle managers have no reason to support it as the general case, and most employees don't have the backbone to force the issue.

Ultimately it's a bit like working from home. Some do it even at companies that don't have a policy for it, and some companies that do have such a policy don't really make it possible.


Funny, I'm a Googler. I just went to the internal 20%-time page. It's still there. Still looks about the same as last year, and the year before that.

Despite the rumors, I don't think 20%-time is dead.


I'm curious whether they did any cost-benefit analysis regarding declines in hiring quality when they removed such a feature.


I'm positively sure that they did. The sad thing is that they still thought that it was worth doing. To me, that says a lot about where Google's priorities are today.


They might have, and it might have told them it was the right thing to do.

When I was in the Valley a few years ago, lots of people were quite convinced that top talent had already been completely chained to Google/Facebook/Twitter anyway, and there was no chance of poaching it away. If you follow that line of reasoning, there is no point investing in recruitment, at least locally.


Google is expanding quite a bit in Pittsburgh, New York, and a few other locations. I wonder if this is why?

I also wonder though if the real estate hyperinflation of the Valley is a factor. It negatively impacts the ability to attract out-of-town talent when a "starter home" is one million dollars and a decent apartment is over $3000/month. It also impacts retention. I meet a lot of folks from the Valley and SF who loved it but "left cause they wanted to start a family" or "left cause they got tired of spending >50% of their income on real estate." You either have to pay exorbitantly high salaries or expand elsewhere.


Expanding in New York because of real estate prices in Silicon Valley? That sounds nutty, but maybe I'm missing something.


Take a look. Silicon Valley real estate prices are unhinged. I mean like frothing at the mouth, must be restrained or else he'll bite his tongue off crazy.

Manhattan is similar to SV, but the thing is this: it is possible in New York to find reasonable (for the local market) real estate within a reasonable commute from work in one of the boroughs. It won't be the trendiest, but it's going to be okay.

In SV you are talking about $800k or $2500+ a month for places that in other parts of the country would be called a "crack den." Either that, or you are going to be both far away and in an utterly dull, depressing suburb.

SV is America's only six-figure ghetto. There is nowhere else where so much buys so little. Driving around on my most recent visit I was shocked by the squalor of these six-figure engineers and millionaires (relative to what you'd get for that price in sane markets).

I'm not a local so I can't say for sure, but I'd be strongly tempted to blame the anti-development political mentality. With growing demand and no supply, this is gonna happen.


There was a joke going around recently. "I work at Google and still have 20% time. It's called Saturday."

(Yes, yes, the % isn't correct. It doesn't matter ;-))


10 hours on Saturday. ;-)


First, I was sort of annoyed that I had to "sign in" in order to read public statements from former employees about the "20% time" being dead at Google.

Beyond that, I think few companies exist that just want to make great products and know that eventually the money will come. In the meantime, most companies are there to make money, and if the numbers show that people are more productive working 100% of their time on responsibilities rather than innovation, then who can blame them - they want to make money. However, this is Google, a mega company making tons! of money, so I expect them to encourage innovation. In fact, someone I know recently interviewed there and the "20% time" for side projects is something they seem proud to promote, so can it really be dead? I think the issue may be in the managers/company EXPECTING results out of this time investment, rather than seeing it as a way to encourage innovation that won't necessarily lead to a Gmail or Talk.


It's not like there's a shortage of startups to provide innovation so it probably just makes more sense to buy the successful ones than speculatively spend a fifth of their workforce on random ideas.


It's very logical. Back in the day G had very few, very bright employees. Nowadays they have a lot more people, who are, on average, less intelligent and have less willpower. Which means the majority of G-ers today are incapable of innovation, and letting everyone take Fridays off is a huge waste.


Seems like Google is splitting into Google-Page and Google-Brin (X Labs). I'd much rather join the latter, but perhaps it wouldn't exist without the former.


I know it'd be impossible to know these stats, but it'd be interesting to see how many (recent) ex-Googlers went on to start their own successful startup. With that, it'd also be quite interesting to speculate how many of these business ideas could have benefited by being "within" Google.

In many ways it's a shame, although I think most of us can appreciate that the Google of yesterday is a different beast to the Google of today. One area where I think it could really hurt Google is in recruitment. I imagine the "20% rule" would have been a huge factor in getting the best engineers to work at Google. With this news, I can see some of the magic of Google disappearing in my mind.


I know a few people who have worked there over the years and they confirmed the same thing - albeit with a huge caveat.

The two people I know said the 20% was a way for their brains to take a break and think creatively. Nothing was too crazy or far-fetched not to try out. They said it helped combat the crazy hours and working from dawn till dusk. In short, it helped prevent burnout. I would be willing to bet the turnover rate will start to increase as their employees work more hours, without any breaks and with nothing to give them something to look forward to, and burnout starts to set in much faster.

Sure, it's not going to make them go bankrupt, but it will have a real world cost to their bottom line.


Did that 20% time actually bring anything useful to Google, or is it just a myth?

IIRC, both Gmail and AdSense mentioned in the title didn't come to Google as 20% byproducts, but as acquisitions.

Does anyone know of any popular (by Google standards) product that started that way?


Transit directions on Maps started as a 20% project.


It's just one single feature of one of the projects. If that's the poster child of 20% projects then that's good enough reason to ditch them.


Transit directions have been a big reason why some people prefer Google Maps to (say) Apple's maps. In places like New York or Tokyo, it's a huge win for users. It's also spawned an open specification (the General Transit Feed Specification, originally the Google Transit Feed Specification) that has sparked a ton of innovation in the public transit space, including real-time location updates from public transit authorities.

Full disclosure: one of the big leaders of the Google Transit project was Avichal Garg, who was a PM for the webspam team when he ramped up Google Transit in his 20% time.


Transit compiles schedule data from hundreds of worldwide transit agencies, does custom pathfinding and is a pretty sophisticated project all on its own. It's not gmail, sure, but there are entire companies that do the same thing as this "one single feature."

Also, as others have pointed out repeatedly, 20% has not been "ditched." And this isn't a poster child, it's just one example.


For what it's worth, I had a friend who worked at Google a few years ago and asked her about her 20% time. I was extremely underwhelmed, as it sounded like the exact same project she was working on from day to day. My idea of 20% time was always this really creative unbounded time to pursue ideas and things that are interesting to you, but for her it really just sounded like a logical extension of what she was already doing. It was just weird, given all of the projects in all of Google, with all sorts of interests, why would she just focus on the things she was already doing?

If that was where 20% time was going, I don't think we're missing much.


Why? Maybe that was just what your friend found most interesting at that time.


Larry Page had that chat w/ Steve Jobs where Page asked Jobs for advice. Jobs supposedly told him he had to start cleaning shop if he was to avoid becoming the next Microsoft. Page apparently took Jobs' message to heart as he immediately began unifying services and shutting down anything that didn't fit with the new mission. It's easy to see how 20% time was part of the reason things got so messy and it makes sense why they're scaling it back now.


Pretty sure this is false. My brother is a google engineer and he was demoing a project for me that he and some other guy had been doing in their 20% time over the last week.

This was yesterday.


3M didn't copy 20% from Google. They were doing it well before Google ever decided to do it.


I think that due to the new focus on Google+, the company is now less able to adopt new projects if the new idea does not originate from project management. With Google+, all features have to be integrated, so there is simply less incentive/room for innovation; things that stand out from the Google+ user experience must be cut for the sake of consistency.

This tendency is self-reinforcing: if there is less opportunity for a side project to make it, then developers will be less inclined to follow through with the idea.

So as an outside observer I can identify a cycle: less room for innovation breeds less incentive for an individual to take initiative. Management sees that its liberalism is beside the point, so free room for initiative is scrapped.

Maybe Google can still hold on to a culture of innovation in areas of infrastructure and research, but in a wider sense Google+ probably did them in.

Maybe that is the right thing to do: they have enough money to buy any start-up that looks promising. I don't know which is more cost-effective: exploring ideas in-house or buying out start-ups. However, if Google looks like any other ordinary shop, then it will be less able to attract top talent. I would guess that the free food is not the winning perk that attracts developers.


Anywhere I've worked that says they do it ends up being more like 100% bottom line and the extra 20% is expected to happen as extra hours. I work on those "extra" projects during regular hours anyway. If I don't get to practice my creativity, programming turns into a drag and I get less done. Most of the time they happily deploy the extra stuff I make and I've never been reprimanded for it even when the extra work was a miss.


Lots of the discussion below focuses on the merits of MBAs, upper management, etc. - and while that is good discussion, I just want to add that the killing of this policy - if valid - is yet another example of our flawed practice of capitalism. Capitalism was never meant to destroy creativity nor innovation nor people - but our use of it inevitably leads to exactly that because of our worst human trait - GREED. Gekko was indeed wrong - GREED is not good - drive is - and they are not the same thing. We need to sharply change how we practice capitalism and build into all of our financial models factors both for total cost impact (i.e. environment, people, etc.) as well as innovation. Current economic models in use do not look at these critical aspects of our existence, and until they do, we're all screwed. Innovation and drive are what have brought humanity to this most amazing place in history - not cash (especially fiat cash) and not the hoarding of resources.

I dare the management at Google to reverse course and allow their brilliant stable of top talent to keep the 20% perk - let a few Benjamins go for the sake of building awesome shtuff ("h" intended).

INNOVATION RULES - cash - well, it's just dirty paper...


After initially posting, I found the Quartz update, and it seems the reality is that 20%-time has in reality become 120%-time. So it's the issue in reverse - the program is not officially dead, it's just smothered. In essence, Google is saying it's cool if you want to innovate aside from your core duties, but do so on your own time - the time that is taken away from your life and family. So the loud and clear message to employees is: innovate with new ideas only if you are willing to make sacrifices in very important areas of your life - what a hugely bad move. Some might say, "Well - sacrifices need to be made to do great things." True - but shouldn't the main beneficiary be the one who makes the main sacrifice??? Of course they should. In addition, 8h per week is likely not nearly enough time to incubate anything worthwhile, so while the 8h is a great starting point, it is more than likely that folks would still need to use additional time to make any degree of progress.

What should be done is to first decide your company's mission statement, and if innovation is anywhere within it and is at least on par with or above profit, then you need to take that seriously and set incentives that move folks toward that goal. I was unable to find an actual Google mission statement, but I did find their philosophy page (i.e. "Ten things we know to be true"), and it seems clear to me that innovation is definitely a core principle for the company.

As to all those detractors who say the 20% has become basically screw-off time, maybe-maybe not. But the proof is in the pudding and if amazing things come out of the practice - as is the case with 3M and Google - then the screwing off is clearly at a minimum if there at all.

For those that have an idea and wish to use 20% time to incubate it, restructure all of their core projects around a 32h week and pay them for 40h.

My dare to Mr. Brin and the rest remains - don't just not smother the policy but rather, embrace it with a massive kung-fu grip bear hug. Make it a top priority for those with ideas to use 20%-time and incubate to the N-th degree!!!!!


I can confirm this. When I went on interview at Google and asked various engineers what they did for their 20% time they all said they didn't have time for it.


Does this mean that we've reached peak Google?


This doesn't match my experience as a Googler. I do 20% projects, and my manager has a 20% project. It helps keep things fresh and interesting and I get exposure to more of the vast ecosystem of Google stuff than I would see in my main job.

I get praise and appreciation, and bonuses from other teams. I think it looks good on a resume, I learned something, and it builds a connection across teams. I know the things I'm doing are helping Google as a whole.

People are always going to have different feelings, and often mixed feelings, about putting time into a side project rather than concentrating on their main thing or doing something away from the computer. For me it comes and goes: some days I want a change and sometimes I feel too busy.

How many of you who have startups would feel OK about taking time from it to spend Friday working on an open source project, even if you don't have a boss that's stopping you?


While I am saddened by the fact that this is coming to an end, I understand it. I have always wanted to work in a "Bell Labs" type of environment. Now that the little fish I was working at got bought by "big Corp," I have been angling to get something like this started at our division. But.... while I have the type of mind that does very well pulling in disparate ideas and melding them together, I have found that a lot of the other engineers, even the very smart ones, are not always good at invention. I can imagine that Google has possibly found a way to identify the best group to allow the extra free time, rather than allowing it to everyone. Ideally, if they gave 75 to 100% freedom to the 1 to 5% that have the "gene" of true creativity, they would be better off.


I've always wondered if one day a week was the optimum use of 20% time, given the need for extended concentration and flow to do really good work in software development.

Google has had some success with it obviously, but I wonder how it would have worked had they done it by quarter or semi-annually instead.

Say, you meet a certain threshold in your performance review, you get a lump sum of ~20% of the time period that the review covers.

Say for a quarterly review, if you meet the threshold, you get 2.6 weeks. Or semi-annually you get 5.2 weeks, straight. You have to be at the office, but you can work on your own project during that time.

5.2 weeks is some serious hacking time, enabling a great deal of extended focus. Highly productive Google engineers could probably bootstrap a startup MVP in that time, except of course Google would own it.
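Back-of-the-envelope, the lump sums above are just 20% of a 52-week year per review period (a quick sketch; the 52-week year and review-period lengths are assumptions for illustration):

```python
# Sketch: 20% time banked as a lump sum per review period instead of one day a week.
# Assumes a 52-week working year; review-period lengths are illustrative.
def accrued_weeks(period_weeks: float, fraction: float = 0.20) -> float:
    """Weeks of dedicated project time earned over one review period."""
    return period_weeks * fraction

print(accrued_weeks(52 / 4))  # quarterly review: 2.6 weeks
print(accrued_weeks(52 / 2))  # semi-annual review: 5.2 weeks
```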


You don't have to do it as one day a week.

Personally, the way I've always taken it was "I've got no pressing tasks I need to take care of for my main project or for coworkers? Great, I'll work on my 20% project!" And I've found out that over my nearly 5-year career at Google, this averages out to pretty close to 20%. But there were weeks that I did nothing but 20% work, and there were stretches of 2-3 months where I didn't do 20% work at all.

We're evaluated over a long period of time and nobody really knows what you do day-to-day, so it doesn't actually matter all that much how you take 20% time, the important thing is what comes out of it.


This always happens when large companies assert more control in the form of statistical analysis of the productivity of virtually everything. When you are building a pipeline which must be efficient, statistical analysis plays a huge role in improving efficiency, but it's also a stifling factor when it comes to the creativity of new solutions.

Google previously focused more on the creativity of new, better solutions across an array of different verticals. Now they are focusing on the creativity of their own "super-man team" instead of the old "we hire amazing people, our people make crazy stuff" approach of letting them go at it in their spare time and build innovative projects.

Maybe this time marks a period in Google's future where they stop innovating in a disruptive fashion.


20% time helps bring in the "entrepreneurial" types of engineers that Google seeks to hire, and as it starts to disappear, they might see fewer of those hires and more of those engineers leaving. Perhaps it's simply because Google has grown to the size where it can take on really big innovative projects (Google X) instead of the comparatively smaller projects that were created under 20% time. They've found that they can't have both without hurting their bottom line, so they've decided to go after the big projects. It's a significant shift, but in the end it could be good for all of us, because many have the resources to go after the small and medium wins, but only a few can go after the really big wins.


After getting a PhD in (non-CS) engineering and a boring office job, I became somewhat enamored with the idea of Google's 20% time allowance and its apparent spirit of innovation. It led me recently to begin investing my free time in learning the foundations of CS properly. While this effort could lead me down different paths, in the back of my mind, I always thought it would be really cool to land a job at Google. That possibility, whatever its likelihood, has motivated me.

So in some sense, this comes as depressing news.

But not really, because another company will take its place, and I can set my sights there.


/play trombone

Whelp, it's been grand, Google. Who's next?

Someday a company might actually grow correctly and preserve its humanity and agility despite our tendency for social discord as a species, but it's a damn hard problem.


There are a lot of good remarks in this thread.

I will try to add a point I haven't seen here yet:

Once I was in a big company with a big, famous research lab.

The group I was in, just three of us, was led by a guy who had taken some technical work and published some papers. The journals were peer-reviewed but semi-popular so reached some ordinary business customers. The papers fit in with a lot of hype at the time.

So, some of the customers picked up a phone and called our group for more information. So, presto, we got contact with real customers otherwise essentially forbidden in research.

Why forbidden? Because the sales/marketing guys wanted to be the guys who had total control over all contact with customers.

Eventually our little group broke up, and I wanted to do some research that would lead to products to be sold to customers. I started to discover that no one in the company wanted that. The big guys in research wanted to throttle all contacts outside research and mostly didn't want any. The people in sales, marketing, product development, and product production didn't want to deal with anything new from research. No one wanted a research worker bee talking to real customers.

Next, when research did come up with something that might be of interest to real customers, no one wanted to pursue that opportunity. Again, the marketing people wanted to be the source of all new product/service initiatives. No one in research wanted a research worker bee to get credit/blame having a product go from research to success/failure in the market.

In the end, research was forced to be essentially irrelevant to the business. Finally research was turned into a patent shop so that lawyers could take the big patent portfolio, go to companies, claim patent infringement, and 'license' the patent portfolio.

Point: Generally, a business organization, given a nice new product/service to offer their customers, has multiple layers of organizations and people, processes, reasons, etc. why they just do not, not want to come out with anything new. Period.

Curiously, yes, some of the people at some of the customer organizations were eager to work with our group in research as a way to look more innovative within their own organizations. Here, of course, they didn't have to work through layers of managers but only had to exercise their own freedom to handle their own responsibilities.

So, lesson: Commonly individuals are ready to do and offer new things to others or try new things from others, but it's the layers of managers who block such things.

So, the blockage of these layers of managers is one of the main reasons and opportunities for entrepreneurship.

Yes, it's dumber than paint, but it's usually the way of the world.


This whole segmentation is really bad. And from outside, it does seem quite simple.

This brings me to my staple question again: "How in the world do those intelligent people, with all those qualifications, fail to understand something, well, dumber than paint?"

Or is it that the majority of layers are full of unwilling/stupid people? I can see people being unwilling. During my internship, I became quite hesitant to touch any issue that wasn't assigned to me. The QA guy didn't know much about programming and assigned features randomly. And there was no incentive for me to develop a feature assigned to someone else (though, being curious, I did "snatch" a feature or two and implemented them myself)!

I guess one weak point causes the problems and then they get amplified! That's how offices work.


> This brings me to my staple question again: "How in the world do those intelligent people, with all those qualifications, fail to understand something, well, dumber than paint?"

Dumber than paint, but usually not totally irrational for the managers who do such things.

Generally if such dumb behavior is made sufficiently 'public', then nearly everyone will fall in line and agree that, yes, sure, it was dumb and we don't want anything like that going on here.

Otherwise, dumber than paint is common, and here is some of why: You are a manager, at some level, but not CEO, with several subordinates, managers and/or worker bees. One of your subordinates comes to you with a new 'idea' X. Maybe all X is, is 'blue sky', back of the envelope, 10 pages of manuscript descriptions and/or mathematical derivations, a carefully prepared paper of 50 pages with technology, market analysis, cost and revenue projections, running prototype software, breadboard hardware, joint research with some leading customers, or some such.

Your mission, and likely you have to accept it, now, is to decide what to do about project X.

Suppose you investigate project X. Okay, that's time and energy away from your assigned duties that count with your manager. So, maybe you put in some weekends investigating project X -- now your spouse is unhappy.

Suppose from your investigation you decide to try to support project X. In simple terms, there are two cases:

Success. Suppose project X is a success in some significant sense, say, the project grows in headcount, budget, floor space, etc. Suppose project X gets some publicity in the company and/or outside. Suppose some major customers hear about project X and tell your marketing people and/or CEO that they want to try X. Etc. Now you can be seen as a bright, rising star, and your management chain from your manager all the way up to but possibly not including the CEO and COB feels threatened by you and will, all together, at the first opportunity, hang you by your toes from a lamppost in the employee parking lot.

Failure. Suppose project X is a failure. Then all the resources you devoted to it will be considered a waste for the company. You will be painted as an incompetent, loose cannon on the deck, and chastised, ostracized, put on the slow track, have a black mark on your career, or just moved out the door.

Commonly there is only one person in the company who can push project X, put up with the blame if X fails, and really benefit from the credit if X succeeds, and that one person is the CEO.

If the CEO sees that you got dumped on for your efforts with project X, then he may shake a finger at the managers in your management chain, but likely you still don't report to the CEO who still needs your management chain for the responsibilities they have been assigned to do.

On the other hand, suppose you decide just to get a list of excuses, that you release only one at a time as needed, for just why project X is 'not appropriate' for your group. That is, you pour cold water on project X and work to kill it. Here you are likely safe from any blame or attacks and don't have to lose some weekends on project X.

So, net, nearly never does anyone get blamed for not innovating.

Really, usually in an organization, there is work to be done, and the management tree is a partition of that work to get it done. No one is holding their breath waiting for innovation to save or even help the company.

The company and its existing products/services, customers, revenue, earnings, stockholders, etc. is a valuable bird in the hand, and any innovation is seen as at best two birds in the bush.

So, possibly one reason for the old 20% time of Google was to let everyone in the organization pursue their case of a project X without blame and, also, to get a hearing, without much risk, in case their project X looks good.

Why stop 20% time? Maybe not much good was coming from the 20% time. Maybe a lot of middle and upper managers got tired of having to evaluate projects they really were not interested in -- a VC can get 2000 contacts from entrepreneurs in a year and invest in 0-3 -- so, that's about 2000 contacts a year tossed in the trash, and maybe Google managers didn't want to evaluate such projects. Maybe the Google top management was concerned that the main work was not getting done fast or well enough. Maybe the 20% time looked like chaos. Whatever.

As we know, Woz was at HP, presented to a manager at HP the idea for, say, the Apple I, and secretly smiled when the manager said that HP would not be interested.

It remains that some of the most important ideas in computing are authentication, capabilities, and access control lists. These were in good form in Multics, the MIT Project MAC system. So was a processor architecture with gate segments, since adopted in Intel's x86 architecture.

Well, Multics was done on a computer from GE, expensive. GE sold their computer division to Honeywell, so Multics was on a Honeywell. Honeywell didn't like selling Multics, and not many were sold -- one sale was in the basement of a five-sided building on the west bank of the Potomac River. Some engineers at Honeywell saw 'bit-sliced' hardware and microcode and concluded that with those two, some extra register sets, a tweaked Fortran compiler, and the Multics hardware/software ideas, they could bring up Multics on a single-board super-mini computer, sell it, and make money. The story went that the Honeywell managers said that they didn't believe that there could be such a computer, that if there was it wouldn't sell, and even if it did Honeywell would not be interested. So, the engineers started Prime Computer, one of the three, along with the DEC VAX and the DG 'Soul of a New Machine' MV8000, on or near Route 128 in Boston. As I recall, in 1980, Prime yielded the best ROI of any stock on the NYSE. Later Prime got a CEO from a big computer company and went down, down, splash!

It's a very old lesson: big companies do not, repeat, do not want to innovate. The explanation doesn't really need the book 'The Innovator's Dilemma' and is simpler -- people acting dumber than paint and, the real cause, getting away with it, even benefiting from it.

Why? Because nearly no one can be sure that a project X will work. So early on, dropping project X into the bit bucket is not provably a disaster. The short-term cost is easier to see than the long-term gain. Why? Because accurately evaluating a project X is commonly a lot of work, still needs some judgment, and yields results that cannot easily be communicated to others -- higher managers, the CEO, the Board, security analysts, the tech trade press, or the stockholders. So dumber than paint is accepted as okay.

Then one of the big opportunities, especially now with the fantastic price/performance of computer hardware, Internet bandwidth, and infrastructure software, is information technology entrepreneurship. Broadly, in economic terms, we want to automate everything in sight to raise economic productivity. There should be lots of opportunities, better than wheels, bronze, iron, open-ocean sailing, steel, steam, electric power, motors, and lights, internal combustion engines, oil, radio, TV, chemical engineering, "plastics!", etc.

But a bootstrapped entrepreneur can't do a 13 nm microelectronics fabrication facility, an iPhone 10, a Tesla, etc. So a company with everything already in place -- HR, legal, finance, floor space, and cash -- should have a big, huge advantage in innovation, i.e., powerful, valuable new products that will make a bundle but that small entrepreneurs can't do. Maybe it's the first HP scientific pocket calculator, the first HP laser or inkjet printer, the Apple iPad, iPhone, etc.

Still, apparently to be good at such 'internal innovation' takes a Jobs. For a Steve Ballmer, John Chambers, Marissa Mayer, Larry Ellison, Page/Brin, maybe the best they can do is to buy a HotMail, YouTube, Tumblr, and just hope not to kill it right away.

Back to my startup!


That's sad.

I remember being in direct competition on a project with a Google 20% team, and only beating them by a few weeks. We had fun with that.

I guess that's just the natural life cycle of companies?


And that signals the beginning of the end for Google: an engineer-led company taken over by "managers". It has happened, and will happen, to all companies.


My previous employer had 10% time in place, which in theory was a great idea, but I found myself always too busy to take advantage of it. That's no fault of my employer or my direct management -- but let's face it, at a startup there is little room to not be working. When you're already coding long hours, trying to make quick changes, and implementing new features constantly, there just isn't enough time for it.


I disagree. I currently work at a startup with 10% time, and barring rare "the servers are on fire!" events, we're pretty religious about doing it. Even when we skip it, it's credited so we can take it later (arguably better since you get longer stretches of it).

Like most startups, we also operate at breakneck pace, we're not exactly sitting around twiddling our thumbs. This strikes me as the same as "I don't have time to..." excuses people make in their personal lives. If you value it enough, you will find a place for it.

It's all about how much you actually value the 10%-time, or if it's just a recruiting/PR gimmick. If the company approaches it from a "it would be nice if..." angle, it will always find a million things that are higher-priority than 10%-time. If you approach it from the "this is a critical part of company strategy" angle, you will find ways to keep it sacrosanct.


I was often encouraged to just take the 10% time -- and perhaps it is an excuse on my part, but I valued my work and took pride in accomplishing certain tasks, and I felt that work was far more important. In the end, I did spend some time on personal projects that ended up getting implemented, but in the year and a half I was there, it was definitely not an often-used perk (on my part).


This is great news. Even Google, which began as a startup, is picking up big-business bad manners. We all know that starting a company is much, much easier than it was before, and definitely not like the 1950s, when you needed a lot of money to get going. All my worries that Google will keep a dominant position on the market are gone. So let's go out there and kill it.


Right along with "anecdotes are not data" should be "long-term success is not made of little bits of short-term success".


So we can effectively say goodbye to a strong engineering culture at Google. With layers and layers of PMs, the best and brightest have no desire to enter their system. So Google's left with the academically talented individuals who don't have much initiative or drive outside of extremely narrow scopes. (So, innovation was killed by the bottom line.)


I think they are doing ok and will continue to do ok with "side-projects": Google Fiber, Google Glass, Google Cars.


Isn't their continued investment in things like Google X sort of the more scalable replacement to 20% time, now that not all of the engineers are brilliant and driven? Just kind of a natural result of scale - find the people that were actually using 20% time and give them 100% time.


> And what’s more, if 20% time has been abandoned at Google, are other companies, which reportedly include Apple, LinkedIn, 3M and a host of others, wise to continue trying to copy it?

I thought it was Google that copied it from 3M, whose "15% time" famously gave us the Post-it Note?


A completely wrong piece of HN-bait.


Mimms, the original author, posted a follow-up which incorporates a lot of the comments from here. I've posted it to HN here:

https://news.ycombinator.com/item?id=6225725


Public companies must focus 100% of their energy and attention on this quarter's earnings. Anything less and the CEO could get fired. Actual investment in long term strategies gets reduced to lip service.


Just because people aren't "approving" the project doesn't mean it's not going to happen. Chrome started as a 20% project and it wasn't approved until it was way into development.


There are a few other signs that Google is freaking out about wasting money.

- search in Gmail is actually losing functionality.

- hemming and hawing about the NSA scandal (not wanting to risk irritating political leaders)

- ending the free Google Apps tier

- ending Google Reader

- ending 20% time


> It used to be that HP engineers were expressly given Friday afternoons

What a huge sacrifice HP made thereby. Friday afternoon is otherwise the most productive time of the week.


No company should be offering 20% time anyway, because it's a difficult promise to keep between crunch schedules and the need to keep work synchronized between teams. Some companies also have creative interpretations of "20% time" -- as in "We offer 20% time: you can spend 20% of your time doing whatever marketing needs you to do" -- turning the whole notion of 20% time into a disincentive.

Hack Days work much better. Easier to accommodate and more fun.


Crunch schedules are often the result of poor management (at many points along the chain). My group seems to handle the balance reasonably well--we certainly go through crunch times (either because of deadlines, or customer requests), but when we are outside those times, we can spend some of our time playing with things. There's no guarantee that we get some percentage of our time each week or month to play with new things, but if we feel we can put it into our schedules, we have the freedom to do it.


If you have to crunch a lot, your manager has poor time management skills.


Are those during working hours? That's the important part.


Killing Reader -- something no engineer would ever do -- was already proof that the engineers are no longer in charge.


Would they approve a 20% project that aims to bring back Reader? Or would you be able to work on that without approval? Just seems like a lot of politics would be tied up in what people choose to work on.


When did Google first introduce 20% Time? I'm curious how it became an official activity.


And this is why business will never truly innovate, and why government-backed projects that waste tonnes of money will bring the truly revolutionary discoveries.

It is probably not intentional, but their inability to manage funding is actually a blessing in disguise for most labs.

Business is just not capable of long-term dedication to experimentation, planning, and innovation.


Good. Now they just need to spend that extra time fixing bugs in most of their software.


Yes, because for innovation they have a separate unit, namely Google Ventures.


Sure, but 20% time helps with innovation within Google, gives Googlers room to flex their creative muscles, and is attractive to potential hires.


And how many failed projects came out of 20% time?


Failure is a prerequisite for success, though. As long as those failures were inspected, digested, and understood, then the company gained more in terms of training than they lost in operational costs.


Does this matter if the engineers and/or the company learn something from the failed project?

We under-value sincere-but-failed projects.


They probably didn't lose as much money as they're gaining from the successful ones


I'm sure the gains from AdSense and Gmail alone outweigh anything they lost. Also, engineers stay happier if they work on things they like more often, so there could be an implied cost (losing employees) to not allowing 20% time. I'll also add that it's almost impossible to stay focused on one thing 8+ hours a day, 5 days a week. Even without 20% time, I'm sure they still lose time to unproductive activities. Maybe 20% time actually captured some of the time that would otherwise be spent browsing the web or making pointless comments like this on HN.


It's not as if Google never would have thought to make a web-based mail if someone didn't do it in their 20% time...


At the time, Google was a search engine, period.

Expecting Google to build a webmail system would have been like expecting Twitter to release an accounting package.


What is your definition of "failure?" A project that nobody uses and makes the company no money is a failure in terms of direct profitability, but may be a success for other reasons. Your engineers get to exercise their minds in unusual ways and keep themselves sharp. Your engineers are more willing to put up with boring tasks. You attract talented people who want to be self directed at some level.


Search and email are closer than tweeting and accounting. Not even a good comparison.


None; failures tend to be abandoned early in most cases.

Besides, if you care about failure, then you don't belong in the tech industry.


What a strange sentiment. We should all just not care about being successful?


How many successes you had matters. How many failures you had in addition is basically irrelevant.


You should care about successes, but you should ignore the failures that will follow.


How many failures gave you the knowledge and/or practice to allow a follow on (non-20%) project a better chance of success?


Does that matter as long as you get a multi-billion dollar business out of it every now and then?



