At Yahoo, I designed and built a large project. This took about six months to deliver. At the time, the company was doing quarterly reviews that I think Marissa had imported from Google. Anyway, my manager had rated me as "meets expectations" for a few quarters because this project and the one I had previously worked on weren't visible to "customers" till they were delivered, so he didn't have any evidence to justify a higher rating against other managers' team members.
Little did my manager know that a few quarters of "meets expectations" had caused HR to drop me into the bottom 5% of the company and so I received a letter from HR that I was at risk of being terminated.
So I deliver the project and now it's visible and everyone loves it and I get an "exceeds expectations" rating and then a promotion and a raise.
From being warned I'd be terminated to a promotion in less than six months, with no change in my work, but simply it becoming visible. Whee big company fun.
This is disturbing and, I think, a twisted form of grade inflation, mixed with the usual suitspeak where words don't necessarily mean what they mean. If an employee is as good as you think they should be, why would you put them at risk of termination?
For the record, I also work in a big company with long product cycles, meaning that the product I'm working on started more than one year ago, and the first public release is planned for next December, so I feel your pain. Luckily our process is a bit more sane and our manager follows our work closely, so I'm getting good ratings (but I wonder if the grade inflation will continue in such a way that "exceeds expectations" now means "subpar employee"...).
Stack ranking ensures that you have churn in your workforce, and not in a good way. You wonder why workers in the tech industry move around so much? Look at stack ranking as a significant contributing factor.
It’s often easier to get that pay raise by hopping companies. You’re then replaced with a new hire who is probably getting more than you did. And needs a few months to get up to speed.
Treating your proven employees well to retain them would cost less.
With such a messed-up evaluation process, engineering demanded a direct, documented process for doing evaluations. Instead we had managers trying to gauge output based on defects fixed and lines of code committed. You could tell management had no clue.
But, I bet, a large and well-resourced HR department, which was the real goal. Who invented the system, after all?
Because quality of the employee and the employee's work are not the only factors in hiring and firing decisions. Sometimes a smaller workforce is what makes sense, and if all your engineers are "meeting expectations", what are you going to do?
Never mind that this policy necessarily means churn, undervaluing aspects of software like security or fault tolerance, and a constant drain of institutional knowledge.
But I feel like this isn't very hard to understand and I don't know what you're not getting about it.
And if consecutively "meeting expectations" puts you near the bottom, then the expectation must actually be to exceed expectations.
Uh.. no.. you seem to think "meets expectations" necessarily means "will not get fired". This policy exists, as dumb as it may be. Imagine you're a manager at a company like this and you're tasked with giving a performance review. You have two choices: you could do what you seem to be implying they should do and fudge the ratings, so that the bottom 5% always get "does not meet expectations." Or you could say "fuck that stupid rule, I'm going to rate this person honestly, and if they still fire him that's their choice."
A performance rating and a company policy about firing are two completely separate things. I think you must have a naive understanding of how staffing decisions are made in the real world.
"Meets expectations" — which expectations? The expectations for the role or for the employee? If we're talking about the expectations for the employee, then obviously it makes no sense for meeting expectations to be a bad thing. But if instead we're talking about expectations for the role, it's perfectly consistent to say that an employee who perpetually merely meets the expectations for their role is subpar, if employees are expected to continually improve.
Now what's less clear is why companies care about the derivative of their employee productivity in a labor market where employees stay with a company for just a few years.
By what standards are bus drivers evaluated? Near accidents are OK as long as they're lucky enough to not be involved in a real accident? If so, I'd prefer they actually improve and decrease their odds of being in a future accident. Does efficiency matter? Maybe one who's on time more often is actually better? Maybe customer service matters, and they can get better at helping people who need to buy tickets for the first time?
> I don’t want to be driven around by drivers who think that they’re Formula 1 drivers
It seems like there are multiple dimensions that a bus driver can improve on that aren't "become a race car driver". Given that, it seems like the bus driver analogy already isn't a good one.
> How are programmers different from bus drivers?
It doesn't even seem like we need to answer this one. You started with the false assumption that there are no dimensions that a bus driver can improve on other than becoming a race car driver. One could argue that there are much wider ranges of value creation in programming between average and above average programmers, but that's irrelevant given your initial premise is obviously flawed.
Similarly, a bus driver who consistently drives without causing accidents, doesn't spill the passengers' coffee, doesn't burn too much gas or wear out the brakes too quickly is a good driver.
What you say to either of these people is “Great job, we love you, keep doing what you’re doing!”
Encouraging growth and improvement as an end in itself can be destructive: How many projects amount to rewriting something that works in some new language or framework because an engineer wants to pad their resume or just learn something new?
Craig Mod recently wrote Fast Software, the Best Software; I’m hoping he follows that up with Software That Already Exists and Works Fine Using an Unsexy Suite of Technologies, Software that Doesn’t Need to Be Rewritten.
We agree. If someone has maxed out all the relevant dimensions then there's no room for improvement. The problem is that there's a difference between that and average. The other problem is that there's continual room for being even better at identifying solutions.
> How many projects amount to rewriting something that works in some new language or framework because an engineer wants to pad their resume or just learn something new?
Unneeded re-writes and getting better are orthogonal concepts. That you're confusing them here underlines the logical fallacy that you're making. If there's legitimate room for improvement (and there generally is in bus driving and in software engineering), and if that improvement makes you substantially better at your job, then it's worth it. In neither of those areas is the "average" employee completely topped off in where they can grow.
In that programmers are supposed to be creative, over-deliver, etc.
You'd worry if a bus driver suddenly got from A to B in half the time by driving twice as fast.
But you'd be OK if a programmer you've asked to design a system with features A, B, C and performance P also delivers features D, E, F and performance 2*P without being asked!
Like I said, I don't want the programmer in charge of the software that manages my financial or health records to be creative or over-deliver, quite the contrary. I'd add personal-data to the mix, and now I've covered a huge chunk of SV companies. I think that the era of "move fast and break things" should be over by now, unfortunately relatively paltry $5 billion fines won't stop the creativeness of some companies.
Sure, but most programmers don't program systems that manage financial or health records or write the software for the ISS.
For the majority working on enterprise backoffice stuff, CRUD, web services, mobile and desktop apps, websites, and so on, being creative and over-delivering is OK, and even encouraged.
They probably deduced from this that their warning system worked as expected.
A new manager we had deliberately did this to us. A few of us got marked at "under-performer" levels at random times - at risk of being terminated by the bell-curve justification - and then within 6 months (not at the same time) we were all "exceeds expectations". The few of us figured out what the manager had done - after 2.5 years. By then she had gone from being a manager to a senior director with a decent portfolio - and that too pretty quickly. Oh, why not! She created "stars" and "performers" out of under-performers.
I ended up "deliberately" underperforming for the next 1-1.5 years after they first tagged me. How hard can it be to NOT work? Well, it is hard. I invested in learning a lot of stuff. But management always figured out I was more than meeting expectations, got me raises and promotions. Maybe forwarding emails did the trick. But then I got bored, frustrated and quit.
I watched a new hire recently linchpin himself into a critical part of the software, make larger-than-life promises about new "cutting edge" tech in presentations, constantly post articles in Slack, and finally stop showing up or being online much at all other than to do some lip-service and make the occasional PR. I am a little envious of how little he does and how much praise he receives for checking all the boxes with higher-ups.
As a rule of thumb: anybody who needs regular progress updates on a project is a "customer".
The difference in agile vs other development processes is progress reports are partial releases of working code instead of a percentage increase on a gantt chart.
But at an entry-level rank, it is possible to get good ratings and even get promoted if your product hasn't shipped to customers yet (and I would do this with my directs routinely), as long as you are delivering milestones. Part of the definition of T3 is that you cannot yet independently work on a project, ie you need help from senior devs to get things done, and IMO if you stay at this level for more than 1.5 years (3 reviews), and there are no other extenuating circumstances, it's appropriate to consider if the company is a fit.
At more senior levels, you have to show business/customer impact to get promoted, but there is also no notion of being managed out for meeting expectations.
Of course, these guidelines aren't always implemented or communicated equally across the company, to say nothing of other companies that copy only parts of the process.
Toxic environments like that are not worth working in. All sorts of other "interesting" corporate cultural aspects are usually also present. Their ethics are compromised so fraud and other criminal activity etc is probably going on as well. Even if not, the company is badly run and long term will fail in various ways for foolish reasons.
Also the fact that "meets expectations" is concern for firing makes absolutely no sense. A plumber who "meets expectations" in fixing my toilet is a good plumber. I don't expect a plumber to go "above and beyond" and give me a Thai massage too.
Absolutely unreal some of the crap we put up with in this industry. Blows my mind that there are actually humans out there in management positions advocating for and implementing this totalitarian dystopian garbage, throwing away their employees for "meeting expectations" as if they're disposable "widgets".
And on top of that, the criteria are different. Working a little faster than your peers is EE, and is rewarded with raise and extra bonus, but that's different from performing in a higher level role with qualitatively different expectations.
1) There is a pure relative ranking firing cut-off at Yahoo regardless of how well the people at the bottom do?
2) Does Yahoo make this clear when they hire you?
3) Your manager was so incompetent that he could not judge your performance outside of customer facing product?
4) A company as big as Yahoo thinks it's going to survive alienating the personality that looks for big-company culture with that kind of "make the cut" performance review?
This again can turn into a culture where doing what's expected is devalued to lowest score - because you have that system where there has to be bottom and top performers.
And you sadly end up with a situation where workers are (almost) embellishing their projects to get more visibility.
People are also useless at estimating time in advance, but pretty great at throwing a story-point number on a given story with the right guidance. Please feel free to schedule me in with your uneducated upper management. They're about to lose value.
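To make the story-point claim concrete: the usual trick is to never convert points to hours directly, but to divide the backlog by the team's measured velocity. A toy sketch (all numbers and the function name are made up for illustration, not any official Scrum tooling):

```python
import math

def forecast_sprints(backlog_points, past_velocities):
    """Estimate how many sprints a backlog will take, using the
    average of recent sprint velocities (points completed per sprint)."""
    avg_velocity = sum(past_velocities) / len(past_velocities)
    # Round up: a partially used sprint still occupies the calendar.
    return math.ceil(backlog_points / avg_velocity)

# Example: 34 points of backlog; the last three sprints closed
# 10, 12, and 11 points respectively (average velocity 11).
print(forecast_sprints(34, [10, 12, 11]))  # -> 4
```

The point of the indirection is that nobody has to be right about hours: relative sizing errors wash out as long as the team mis-sizes consistently, which is exactly what the comment above is claiming.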
This creates a situation where everyone is searching for tickets to make up for their daily 8h.
It's handy for charging clients for development work specific to them.
Now that I've delivered those improvements my boss added reworking the software stack that feeds my current system. He also said to me "Oh, BTW RIFs are coming, take that how you will."
So I'm not certain if at this point I'll be RIFd right after delivery of a complete retooling of our infrastructure build process.
Did you write the documentation yet? I think you know what I mean.
A specialist working on a platform that doesn't need the whole backend team is isolated and easily skipped over for all accolades.
That had an extended 9-month design phase - but a short 12-week agile sprint phase that delivered two intermediate POC versions and then a final working project (we even delivered hardware when at the last minute the customer admitted they hadn't ordered a server) - we even took enough networking gear that we could have installed a network if that was missing too.
What a luxury that must've been...
The first time I worked with a stable group where everyone was as smart as me, even the old new guy with a massive inferiority complex who managed to sneak one really profound comment into every meeting, there was a lack of humility on everybody’s part and things did not go well. Planning meetings were exhausting because you could get them to bash their own old work but couldn’t get them to admit to needing to change process at all. I should have quit, but I had a string of experiences with persistence paying off and I had not been beaten yet. That was the first time, and my current employer the second (I should probably quit here, too).
Cognitive dissonance is at its absolute worst in the head of a scary-smart person.
Where smart people fail is because they were "right" in a limited, seemingly appropriate context and it turns out that in a larger context, it didn't matter.
> (...), but it makes you a hell of a lot better at rationalizing your position and persuading others.
My reading has said the opposite: smart people are far more likely to be right, but it does no good because the important thing is to be able to communicate.
You might ask yourself if your current executives are aware of the issues you're seeing, then think through what you can control and re-prioritize.
(I was just looking at your profile because I was wondering if you were Kee who worked for Apollo?)
When you're building hard tech products (e.g. Borg or TensorFlow), the actual risk is technology and competition, and since this is new technology, there are not that many customers. In this case the risk is in the architecture (i.e. it would be hard to change the core architecture) and in the technology, since you do not know what kind of application will be built on top of this new technology. In addition, you face general competition risk, since the market can usually absorb only one dominant architecture.
So in this case, you invest much more time in design and careful construction.
While Agile is not perfect for this kind of work, it does have a lot of facility in this area.
Now if you could just get any two people to agree on what constitutes the last responsible moment...
And with new tech you often have to do it first or do it better. Oddly I find Agile seems to work better for “do it better” because if it’s easier for you to add features than your competitor, they will probably get burned out before you do. If you can survive that long you start to pull ahead.
I think that Agile is good after the market is captured. But when developing the core tech (usually with no customers beside product managers and based on research), it might be too constraining.
The whole MVP thing is more about selling demoware to investors. It just sounds like it’s being supportive and it gets them what they want. They don’t give one shit about what happens 18 months from now. That’s somebody else’s problem. There’s always someone with a higher pain tolerance they can hire (note: pain tolerance is negatively correlated with capacity to anticipate problems before they become emergencies). And there’s always advertising or mergers to mask the real problems.
For much of what Agile development is applied to, the greatest risk is market risk. You want to get something out to users as quickly as possible to see if your assumptions about your users and their needs are right.
For google style projects, there is primarily technology risk rather than market risk. If you are trying to build something that is obviously useful but is perhaps impossible, it makes sense to focus on delivering working systems that allow you to see if your assumptions about technological limitations are correct.
Presupposing that there are years of feature work to be done, do you go with Agile, or waterfall?
And I think we all learned in college that hill climbing rarely if ever finds the optimal solution. Often it doesn’t even find Good Enough. But it’s the best thing you can see in the vicinity so it gets labeled Good Enough even over the protests of people who have done it better elsewhere.
First of all, it's not like most engineers at Google are working on Bigtable or Borg, with a "very simple interface and tons of hidden internal complexity". Plenty of them are working on normal consumer-facing products, the "software with a simple core and lots of customer visible features that are incrementally useful", although maybe that's not the hype people want to believe.
But either way -- he insists "companies like Google write revolutionary software which has never been written before, and which doesn’t work until complex subcomponents are written." But putting aside the "revolutionary" hype, there's no reason subcomponents can't have agile philosophies applied.
A big part of agile is ensuring developers actually understand the requirements (consensus via point estimation), seek to define requirements where they're suddenly discovered to be vague or undefined, and have frequent check-ins to demonstrate that their software is making progress towards those requirements, raising any potential blockers early on instead of accidentally going down a rabbit hole of building the wrong thing (despite best intentions) for weeks at a time, which nobody notices until it's too late and the project is delayed by months.
That's just as important for a subcomponent of a subcomponent, as it is for a service dashboard. To assume agile is only for "consumer-facing" software is a deep and fundamental misunderstanding.
I'm sure there's ways we can improve our canonical agile-ness, but it's not lip service. We set quarterly goals sure, but they can change if situations change and the goals themselves are usually data-driven (based on prior quarters a/b testing and other research done by our UX team). We also have strong product and project managers, who own a lot of the scrum meta-work.
So, I guess #notallgooglers (which for context is an ironic, kind of self deprecating response here). There's tens of thousands of engineers at G. Some teams are agile, some aren't.
I would say that most engineers are working on something like that. Those two examples are from google cloud/technical infrastructure, but other products such as search and ads have tremendous complexity behind them.
Honestly, it seems a bit elitist to me.
I work at a company that pioneered SOA for both technical and organizational flexibility reasons and I'd say it's worked out pretty well for us!
If I’m building a new server and client architected system, the early sprints may be getting certain design details down. More documentation than code. Later on, I may have sprints that only establish the handshake between client and server and no real functionality. The point is taking milestones (big chunks of work) down to smaller chunks of work that can establish a clearer goal set for the near future.
And that isn’t even unique to Scrum. If you don’t establish small iterative goals it becomes difficult to measure progress. Saying, “We can’t measure success until this large number of lines of code are completed” is folly. If you have that design, you can measure your progress towards completion by creating inch-pebbles for the next few iterations. Like in my example, finalize design docs based on R&D efforts, make prototypes or simulations, create basic handshake code, then build up server and client features in tandem to ensure they make sense (maybe the design can be simplified when you realize several commands or messages are really specialized versions of a more generic one, or maybe something is more complex and requires a reexamination of the design). And yes, a lot of this is common sense to good engineers. Problem is, common sense isn’t common and you’ll get people pushing Waterfall again and again.
It does not. It merely acknowledges that Scrum is the most popular and widely practiced incarnation of Agile and is what most developers are familiar with.
From the article: "The simple high level Agile Manifesto is something I think is close to the way Google engineers think about software development...Google development style exemplifies the type of individual empowerment talked about in the principles."
And: "While the high level Agile Manifesto is flexible enough to work with these principles, these are very different than the short-iteration low documentation Agile/Scrum process which has become synonymous with the word Agile."
Can we do anything with that though?
The point is to adapt the process to the teams needs, not to dogmatically follow some other teams process.
Most people who are pessimistic about the agile development who I have spoken to are usually pessimistic because of poor implementations they've encountered. Any methodology is awful when misused or when hidden agendas are at play. No methodology will ever solve that.
That being said (probably an unpopular opinion), I often also see engineers use agile to blame everything that is going wrong on "management" and use it as a way of avoiding responsibility. Complaining and arguing as if there are only absolutes is an immature and unproductive attitude. Calling agile "nonsense" is a prime example of that.
What really happens is that you have a set of requirements and an idea of how you are going to implement the feature. You start working on it, maybe you discover that there is a whole area of code that needs updating in order to allow for your feature to be implemented. This adds on another week of work required.
In a common-sense-based development team, what usually happens is the developer has a quick word with the project manager and maybe other devs who need to be involved and says "Hey, I've found out that we need another week to complete this feature because X needs refactoring, do you still want me to continue with this?" Then the PM either says "Yes, this feature is important, even if it will take another week" or "No, the feature is not important enough to be worth spending another week on". It's really not rocket science.
In SCRUM what usually happens is the SCRUM master will tell you that the rules say that we have already committed to a set amount of work this sprint and that extra week of work falls outside the scope so we should create another card for the extra work and consider this card blocked.
The whole idea of having a rule for what to do in that situation just seems absolutely crazy. Just do whatever makes sense! It's really not that difficult.
They don’t, there are ‘spikes’ that are just time spent researching or exploring or designing rather than implementing.
If your scrum master is telling the team what they can or can't do, or how they should do their jobs, they are doing it wrong. Their job is to help you plan and help you resolve impediments to your work. They aren't your boss.
The first thing in the manifesto is people over process, there should never be a case where some rule tells you that you can’t do the right thing.
Formal Scrum is a nice way to organize ‘normal’ work, but there are lots of cases where you have to improvise, and the process should never get in the way of that.
Unfortunately in my experience this does happen quite often. I've found the biggest issue with scrum is when people (usually the scrum master and PM) take hard-line, dogmatic views of scrum processes.
In 2002, Schwaber with others founded the Scrum Alliance and set up the Certified Scrum accreditation series. Schwaber left the Scrum Alliance in late 2009 and founded Scrum.org, which oversees the parallel Professional Scrum accreditation series.
No, it doesn't. It wants the team to discuss features to help find preventable issues up front and to make it as clear as possible. The unknown unknowns remain unknown, which is why an estimate is an estimate, and a sprint is a best-effort attempt at getting a thing done. If and when it becomes clear that this feature is too complex, too big, or whatever, then it can be reestimated, or decomposed or reevaluated. It's supposed to be agile, not rigid and fragile.
Whether or not you think SCRUM is good or bad, you can't deny that a sizeable number (if not the majority) of companies are "not doing it right" and that a sizeable number (if not the majority) of developers loathe working in SCRUM teams. So wouldn't we just be better off if the industry ditched SCRUM entirely?
Some questions are easy to answer, but many questions can only be answered by doing a thing that takes work. And Scrum gives you a framework that tries to put a limit on how much work goes by with no answers.
So it uses a simple schedule.
The team works for X weeks, working on implementing specific tasks. (Which in turn answer some questions.)
Then the team stops, reviews their progress on the larger product, and adjusts plans accordingly. Now "do what makes sense" has demonstrably advanced.
That being said, stop complaining. If you can find room to complain, you surely can find room for improvement right? It might just be me, but everywhere I've worked I've never encountered a manager who isn't open to solid ideas for improvement.
Scrum is just not a good way to manage a team. I find Scrum to be a lot like religion - it means different things to different people but its evangelists always have a pre-prepared answer to every possible criticism, usually containing lots of jargon, buzzwords and double speak.
Sometimes it's even the right development model! If you're building the control system for a new power plant, you'd better follow the agreed-on spec even if you realize it's not the best way to do things.
The thing is, your story cards or whatever are meant to be a live document. You have the big meeting, the next day you realise that what you came up with doesn't work, you grab a coffee with your stakeholder and talk it through, come up with a new plan, and you change the card you're working on. Probably check with the tech lead first, but it's that flexible.
> In SCRUM what usually happens is the SCRUM master will tell you that the rules say [...]
This SCRUM thing sounds evil.
> Just do whatever makes sense! Its really not that difficult.
This is probably the best advice ever for any development team. Just do the thing!
"This Guide contains the definition of Scrum. This definition consists of Scrum’s roles, events, artifacts, and the rules that bind them together."
Definitions, Rules, Roles, Artifacts etc. Sounds pretty authoritative to me. If it wasn't authoritative, how would you even know what SCRUM "really" is? You cannot say both that SCRUM is not authoritative and lets you do what you want, and then also say "you're not doing scrum correctly".
You may say it's not a methodology's fault when it's misused, but I'd say it IS a methodology's fault if the feedback cycle on the misuse is easily hidden by the process.
In the end, I think that in many places where agile is put in, more value would be had by putting that work into the business communication flow. If your biggest problem to solve is that you don't know how to work on problems in smaller chunks and communicate when you've done something, and that's what you need to reform, you're probably doing pretty well and agile will be great. If you're implementing agile for another reason, it probably won't fix that problem and agile will be blamed.
This is what happens all the time in big companies, agile or not.
Where your methodology really matters is in how information flows between your dev team, the end users, and any other systems you need to interoperate with. Agile is great for providing development-as-a-service to a client who doesn't know what they want but will know it when they see it (e.g. building enterprise software or web apps). Waterfall is great for providing known, fixed functionality (e.g. building an ECU for an engine, or a back-end for a web bank).
Basically, developers who try to develop 10,000 lines of code on their own, without sharing it for a month, create massive risks and are an impediment to their team. Forcing regular code curation and sharing among the team makes the project stronger – that's what "short term planning" does.
It turns out that most projects can be broken down into absolutely tiny pieces, and that is generally a good idea. However, the core idea is you need to adjust the methodology to the problem at hand and not the other way.
Generally I agree, but I have noticed in doing things like that, no one ever seems to bother about refactoring, or tidying things up at a higher level.
Our designer complained about the CSS organization and spent a couple of days tidying it up. I have a feeling that breaking tasks into small pieces takes away focus from a more long-term commitment to code organization and architectural patterns.
That kind of time allows some actual deep thought to happen. Some kinds of outcomes won’t happen any other way. I agree with that point in the quora answer.
This comes to mind as a recent example of an engineering project that had no incremental
It’s important that it’s beautiful since the bytecode of a VM gets used so much throughout some of the VM’s most complex parts. Beautiful code is easier to use and maintain.
'short term planning' appears to be your construct, not found in TFA.
Short term implies reactive / tactical -- not the planned / strategic level that TFA is talking about.
It may well be that Agile 'precludes long proof-of-concept projects', given its essence is 'better ways of developing software', which clearly hides a wealth of complexity behind the heading of user-facing software.
Of course, if you have a very concrete goal and you need to do a trial-and-error, iterative research and development project, there may be no need to do "agile": the goal is fixed, the things you can try are probably largely predetermined, and you already know one of them will work adequately. (Or not, but then at least you might have some form of benchmark target in mind.)
But even then, almost no endeavor starts in a vacuum, so it likely pays off to have some way to quickly check progress (be it a CI system, a testbed in the lab, or the aforementioned benchmark), and similarly to adopt other aspects of "agile": trying to break up the work and estimate it (thinking about it, its dependencies, its complexity, your competence for the particular work item, typical pitfalls), letting the team know who works on what, interacting with stakeholders from time to time, and so on.
Secondly, well-meaning management often thinks that the reason they are having trouble delivering on time is that agile hasn't been implemented. If the developers and business analysts don't have a deep understanding of the problem set, then the project is at risk. All the agile in the world won't fix that.
Hiring developers is not easy, and finding those that can deliver is also tough. Agile doesn’t solve that.
That said, I do prefer an agile-like approach to dev ops because it usually encourages teams to communicate and break work down into smaller pieces. When that happens, system-wide issues come to light much more quickly than they would in a waterfall-style delivery.
As it relates to the quora question, I suspect that the person who asked it is pretty new to development. Google is huge; there are probably a million different styles of dev ops happening there. OK, that was hyperbole, but steelframe's comment is exactly right. Google isn't a monolith and Googlers aren't all the same.
Which is antithetical to the agile manifesto. The whole point of agile was to question the current methods and be agile enough to change them as needed.
And not just because all of my experiences have been agile (they have not been).
For agile to work well, you need a competent team and manager, AND be embedded in an organization (ie, who determines appraisal of your manager/team) which truly is concerned with prioritizing value to business/customer over, say, "looking good to other managers", or "trying futilely to predict what the software will look like a year from now, because it makes me feel in control" or "weird power struggles with other parts of the organization". Everyone (or at least the majority of those with the power to make you miserable or break your career) has to be really at least trying to work together to maximize business/customer value.
If you have that, is your experience (and your software) going to be good whether you use agile or not? Probably. But agile is a really useful and effective mindset for making your software product _and_ your experience (stress-levels for instance) even better. It works, in my experience.
If you don't have that, is your experience (and your software) going to be terrible whether you use agile or not, in proportion to how much you don't have it? Sure. And agile, and especially scrum, can probably make it even worse.
Worth a read or two if you want to be cured of the Agile affliction.
A related issue is software that cannot be meaningfully prototyped because any design that is reflective of the desired product has the same order of effort as the actual product, for many of the same reasons.
There are certain kinds of high-value software that cannot be built inside most tech companies now because they pattern-match so poorly to the way people think "agile" software development should look.
Google runs on design docs. You define the problem, dependencies, risks, end state, etc, in a 20-page doc, and then you drive its development for 1-2 quarters. Google doesn’t do extreme programming, we do extreme ownership.
I really prefer it to the time I spent working with Pivotal or Carbon5. They were great, but imo not a sustainable process for mature products. Agile is good for driving a new, greenfield MVP... and that’s about it.
When that awful Design Patterns book was off the hype train but the new devs had heard about it and asked if they should read it, I would tell them No, read Refactoring twice instead, because it tells you why and when to use this stuff, which is much more valuable. Like feature work, if you know why you are doing something you can often reason out the gaps in the “what” on your own.
We sort of take a bunch of XP processes as a given and forget that this was Agile for most people at the time of the manifesto. Scrum pretends it’s separate but is essentially useless without half of XP.
You can assign work to teams to complete, but you must drive the effort and gain buy in. Otherwise it’s not actually worth refactoring.
Anyway, I don't want to get into the weeds, but if anyone is interested I just posted a gist of all my notes I took while reading every Scrum document I could find a few months ago: https://gist.github.com/erikpukinskis/a8a61b8fbd19f8063737f7...
There's a lot to it; as a software engineer, I found learning more about it pretty exciting, because there are some good ideas in there for some of the standard headaches I see over and over in software companies.
The original Extreme Programming project at Chrysler was a payroll system, so many strategies for handling things like this have been thoroughly discussed in literature and forums in that community.
I don't see agile as exclusively requiring you to expose everything you do to your end-users as you make it. I get that it can accommodate that if that's what you want, but the real essence of it is about de-risking a project by breaking things down into shorter milestones.
Do you remember those Microsoft Press software management books? So good. One point was "Beware of a Guy in a Room": don't let developers shut their door and go dark for months at a time without surfacing with something working. It can take a bit more work to adapt a big project into smaller "working" milestones, and it can seem silly and inefficient, but in the end this helps avoid a lot of the risks that lead to death marches that can truly kill a project.
One way to adapt the "customer" aspect of agile is to not have an actual live customer but rather a "voice of the market" that the product manager represents. After all, you do want to empower them to course correct because the world can change over 8-20 months. This applies even to BigTable.
Different tools for different tasks. Neither a screwdriver nor a hammer is nonsensical, though using either in the wrong application is.
I'm not saying that agile processes are inferior or even no better than other software development processes. I'm just saying that there is some level of fundamental capability that a software organization must have before the process is even going to matter.
If your work doesn't have a customer, why are you even doing it?
The customer doesn't have to be the business's end user. They can be another team or even your own team. Someone must benefit from the work, or else the work should not be done.
There is just too much religion here :)
If you need it, use it; if it works, great; if it doesn't, or you don't need it, move on. The only advice I would give people is that there is no rule book and no one right way. Use your brain and pick the things that work.
In that kind of environment it doesn't lead to short-term thinking because a long-term business vision can exist independently of how a software solution is implemented.
Edit: removed the bit about Google being an exception.
Also a lot of video chat apps.
Does somebody have more on this? I'm never sure how to structure my design documents. What does a Google design document look like? Is there a template somewhere for how to write one? That would be super helpful.
There are teams at Google that look to use agile methods. E.g. I see standups, there are internal tools that track sprints, etc. I think these are mostly front end teams.
I've never worked on front end but was on a team that used scrum at a previous job. I hated it. It seems like a huge impedance mismatch for my kind of work.
I'm guessing some consultants sold management on the miraculous benefits of scrum at my previous job; they liked the snake oil they were sold and forced scrum on everyone. The amount of process overhead it brings is an enormous tax on productivity.
Some people call this "corporate scrum" to try to distinguish it from real scrum. But this feels like a No True Scotsman fallacy. Why is it so damn hard to do "real scrum" and actually get the productivity benefits it promises? I've never met any SWE in real life who has tried scrum and liked it.
At Google, the development process is not just "develop 10,000 lines of code on their own," as some posts here have imagined. There is typically a design phase that gets wide buy-in, then code reviews and extensive testing every step of the way during development. This feels far more realistic as a process.
And it is NOT some "waterfall" process, which only seems to exist as a strawman for agile consultants to knock down. There is no architect who designs the code and then hands it off to code monkeys to implement exactly as written. That's ridiculous, and not how software is written anywhere as far as I can tell.
If I were to answer the question in the title myself, it's that Google has allowed engineers to build a process that works for them, and they almost never pick scrum or other agile methods.
Agile at its core, to me as an engineering manager, is:
- People over process/tools
- Plannable execution and the ability to pivot quickly in rapidly changing environments
- Create consistent, repeatable work delivery
- Retrospection, learning and iteration over time
- Tools (can) support the process to multiply efficiency but do not solve the core goals and problems
Like the author of the comment said, the details of how you execute "agile" are where things get bumpy for teams.
However, if you try to create a product in a market with competition, a non-agile competitor can reach a global optimum (for example, Apple vs. MS in mobile phones), and then you will lose the market (and sometimes your future).
Amazon is relentlessly customer-focused and is now a veritable hit factory, mostly in industries that were already seen as mature.
Interesting definition of simplicity.
Is this how googlers refer to results of their work and people respectively?
Developing for a small but demanding set of enterprise customers with concrete (but sometimes arcane) requirements is a very different problem from developing for consumer markets.
> I would caution against trying to paint "developers at Google" using broad stroke
> a leapfrog for the masses
I think this is the logical leap that people aren't understanding. Generalizing means assuming things about individuals based on their group membership: OP wasn't saying it's a moral failing, but that it leads to inaccurate assumptions when applied to the heterogeneous group in question.
What's the analogy to using the term "the masses"?
The morality and intent is irrelevant to my core assertion. A criticism of contravariance (if that makes it clearer) followed by a usage of it is unexpected hypocrisy, which is ironic.
Pick a dictionary and you will find references to the lower classes, the not-elite, or the common people: https://www.merriam-webster.com/dictionary/the%20masses https://en.wiktionary.org/wiki/masses#English https://www.vocabulary.com/dictionary/masses
Put into words nicely here: https://email@example.com/why-i-left-google-to-join-gr..., but you can feel it in many of Google's products, where they make all the major decisions because they don't believe users are as capable of making good decisions as Googlers are.
And then, of course, nobody said you're part of those masses anyway. Maybe you're not. Do you enjoy Google product X, as hundreds of millions of others do? Then you're part of the mass that enjoys it. Otherwise you're part of the mass that doesn't.
People do sometimes seem to forget this, so it's good to have a reminder.
But is the average Googler contributing more to humanity than the average teacher? I'm not so sure.
Maybe you could ask: is the average engineer contributing more to humanity than 1,000 teachers? Or you could go by pay: is the average engineer contributing more than 10 teachers? (Guessing at the numbers, of course.)
One-to-one ratio, though? I'd argue: yeah, probably. It's indirect effects, but Android, Chromebooks, Google Docs... these have all brought computing technology and the internet into the hands of children around the world. Internet access is clearly one of the great opportunities for education. It might be hard to measure, but it clearly has an outsized effect.
But no, I can't cite a study; it's a pattern I've observed at a range of places.
I suspect a company like that has committees and subcommittees and review processes and 100 layers of bureaucracy before anything even gets done.
I bet there are projects that reached a production version and never saw the light of day for this very reason.
The only thing noteworthy about Google is _there is no process_.
Hell, just last week a coworker and I finally resolved a year-long difference of opinion when it clicked for him that there wasn't a defined system; inasmuch as there was one, I was the one defining it, and I was thrilled to have someone else help with that.
Dollars to doughnuts, there's multiple teams doing extremely prescriptive Agile, but I've never seen one in 3 years in a 2,000 employee office.
By far the most organized team I've been on merely kept track of bugs using an internal web app that renders bugs from the internal bug tracker as post-it notes arranged in columns, with each column representing a bug status (e.g. assigned / started work / code in review / done).
This alone caused a ton of grumbling, meanwhile I'm sitting there gobsmacked that Scrum and Agile aren't proper nouns that everyone has highly detailed opinions on.
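For what it's worth, a board like that is essentially just a group-by on bug status. Here's a minimal sketch in Python of the idea; the column names, `build_board` function, and sample bugs are all hypothetical, since the internal tool isn't public:

```python
from collections import OrderedDict

# Hypothetical column order mirroring a typical bug lifecycle.
COLUMNS = ["assigned", "started", "in review", "done"]

def build_board(bugs):
    """Group bugs into status columns, like post-it notes on a board."""
    board = OrderedDict((status, []) for status in COLUMNS)
    for bug in bugs:
        board[bug["status"]].append(bug["title"])
    return board

bugs = [
    {"title": "Fix login crash", "status": "started"},
    {"title": "Update docs", "status": "done"},
    {"title": "Flaky test", "status": "assigned"},
]

for status, titles in build_board(bugs).items():
    print(f"{status}: {titles}")
```

The point being: the lightest process that made that team the most organized one around was little more than this grouping, rendered on a screen.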
People do read your in product feedback, btw. But it takes quarters for anyone to triage and act on it.