Should you empower teams to make their own stack choices? I've seen chaos break out for a year in a major development organization of 500 devs when most teams were suddenly allowed to make choices that would have been better made centrally. Yes, they were faster, but they were faster going nowhere. And then there are all the aspects that go beyond the development phase: long-term maintenance, the behavior of software in the field, training field engineers, and moving people across teams.
Good technical leaders know where choice is called for and where it isn't. Managing these trade-offs requires a perspective that goes beyond the agile team.
Sure, this requires an "architect who codes" type of person to be involved (which can get expensive), as opposed to some ivory-tower PowerPoint architect, but it's typically worth it.
There's a fine balance between allowing choice and setting standards. There are productivity gains from having some standards too -- you can build tools that leverage those standards, so long as they aren't too restrictive.
E.g., in the world of microservices, you end up needing a lot of telemetry to keep things running smoothly and to figure out what went wrong when your call is four levels deep. That's hard to do without a standard your services must adhere to. It's easiest if you have a base application that takes care of it for you. That, however, becomes hard if you have the combinatorial explosion of [languages] x [frameworks].
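To make the "base application" idea concrete, here's a minimal sketch (all names hypothetical, not any particular framework) of what that shared telemetry layer might look like: a wrapper that gives every handler a propagated correlation id and a latency measurement for free, so individual teams never have to reinvent it.

```python
import time
import uuid
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("telemetry")

def traced(handler):
    """Wrap a service handler so every call emits standard telemetry:
    a correlation id (propagated across hops) and a latency measurement."""
    def wrapper(request):
        # Reuse the caller's correlation id, or start a new trace here.
        cid = request.setdefault("correlation_id", str(uuid.uuid4()))
        start = time.perf_counter()
        try:
            return handler(request)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info("cid=%s handler=%s elapsed_ms=%.2f",
                     cid, handler.__name__, elapsed_ms)
    return wrapper

@traced
def get_user(request):
    # A hypothetical service handler; the decorator does the telemetry.
    return {"user": request["user_id"],
            "correlation_id": request["correlation_id"]}
```

The point is that this only works once per language/framework pair -- which is exactly why the combinatorial explosion hurts.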
That doesn't mean you shouldn't entertain new choices (hey, team X wants to use Go), but there should be some discipline. It boggles my mind when individual engineers prefer to speed up their own short-term development at the expense of the overall organization.
I currently work in an environment where there is virtually no central technical management. There are groups of 2-4 developers who roam around their department's problem space, making whatever choices they want.
If someone doesn't like the interface on one microservice, they fork the code and make a new one. This leads to 5 microservices that accomplish the same thing, and 5x the maintenance burden whenever something changes. I've derisively called these "nanoservices", but I need a more offensive term.
People intentionally choose tech that no one else in the company has used, trying to cement their position and head off potential competition or unwelcome cooperation. They also select technology for the best PR effect, to be able to say "We're being like Google today". The result is a whole mess of different stuff running throughout production, each piece needing its own backup system, etc., which, to be honest, does not always get installed.
It's a disaster.
The author's contention is that giving more choice to teams is better because it makes them feel more ownership in their product. That's fine, but do recognize that this is a psychological solution to a management problem. The job of the manager is to actively control the psychology of employees and the structure of the organization such that employees are happy to do things that lead to cooperation, not to allow employees to always get their way.
Without a central coordinating force and everyone's acknowledgement that some compromise and sacrifice is necessary, everything falls apart quickly.
Normally, people don't do this, either a) because they don't want to be stuck dealing with the cruft, or b) because they believe someone will observe their bad behavior and stop it. Here, they don't really have to deal with the cruft, and it's clear by now that no one in this organization will hold them accountable, so why should they waste time and energy trying to build a consistent platform?
Re: Conway's Law, the organization's communication mechanisms are loose and ad hoc; there is no central coordination. Consequently, the software is coordinated on a loose, ad-hoc basis between small teams. The same feature set will be implemented four times by four different teams, not because they can't talk (there is an office where most people hang out together all day, and people generally do keep an awareness of what the other group is working on), but because they'd rather just do it the way they want to do it.
It's not that no one ever ends up sharing anything. It's just that instead of having a discussion and making consistent decisions about the platform, they just do what they want, because they can.
You're acting like they don't talk. They do talk; they just don't care. They know that guy X is making thing Y, but they don't want to do it in the language/platform/framework/whatever he's using, so they make their own thing Y. Structure needs to be in place to make cooperation happen.
So many reasons, but they usually boil down to simple self-importance.
Also, remember that it's harder to read code than to write it. Any non-trivial library or service is going to have a learning curve.
>Also even if a rigid structure existed I can still not use your thing Y most likely because I don't respect your work and I would implement thing Y differently ( and in my opinion better ).
Everyone has their own opinion.
In a rigid structure, this perspective wouldn't work because your paycheck would be imperilled by it.
>The problem isn't people chose different tech stacks its they don't want to work together.
Sure, I agree. In a correctly organized structure, their incentives would change, and their behaviors would be more cooperative.
>If they did they would choose tech stacks and work practices that foster collaboration.
No, that's like saying that if people wanted unity, they'd make choices everyone would agree on. Everyone wants unity. They just want everyone to unite around their own idea of what's important/right.
Management is all about crafting an environment and structure that causes people to work in a productive way. That could happen in this company, but it hasn't.
The reasons that get hand waved away in this article are exactly the reasons you can't ignore consistency across teams. There is value in having just one library that multiple teams use to communicate via HTTP to various services, there is value in members of different teams being able to work across projects due to the similarities in infrastructure. If each team has their own client library for communicating with a service, that's work you've paid for twice, or more.
I clicked on this article thinking it would be about choice within the prioritized stack on what to build and when, because that is valuable. Letting the folks building the stuff decide which things to build first is much more valuable than letting them pick between Postgres and MySQL, or memcache vs. redis, etc. (just examples, don't get too wrapped up in them).
Sure, we can sit here and come up with exceptions to choice, and yeah I do like being able to choose my IDE, the OS I develop on, etc., but those choices are totally different than which relational database I use, or which web framework I develop with. There, I don't think choice is good, at least not for everyone. Maybe this is written for my boss's boss, I dunno. But as for me, the dev writing the code, I don't really want me (or my colleagues) to have all that much choice.
If I can't trust you to make a good choice about which technology to use, why would I think I can trust you to make good choices in other aspects of your job? If you don't feel confident making a technical choice, then what you're telling me is that you don't understand and don't want to learn. Not really a desirable quality.
My expertise is in designing and building software, not in organizing teams for optimal productivity. The tools and tech that make the most sense for my project will get me to 100% optimization, but if paying a 5% optimization cost on my end brings all the teams in my organization from 80% to 90%, a manager should be the one making that choice, not me.
Pretending like everyone has the visibility into the org like you do will get you into trouble, and giving that visibility to everyone will harm their ability to focus on their jobs. I'm not a manager for a reason, I don't want to make those choices, but I need someone to.
Also, not everyone gets to work at Google. Advice like this needs to have value at Medium Sized Company, Inc., because those are the people who are most out in the cold. An Apple product manager has every tool on the planet available to him/her to figure out how best to help his/her team, but the guy/gal managing a dev shop in Carson City might not.
Empower your tech teams to do great work; it doesn't require you to be a Google or an Apple, or a podunk startup. I have seen empowered teams at 10-person contract shops and at 100,000-person Fortune 100 companies. If you run into a company that tries to tell you otherwise, run away: you will find politics and infighting alongside terrible technology and an inability to deliver.
Doesn't this just encourage silos?
There are lots of reasons why you may never hit that state of everyone self-selecting the same or a very similar stack, but they are never technical, and they always indicate a team-structure problem that is going to cost you way more if you don't deal with it than the amount of duplicated work you might encounter.
I agree that total anarchy is bad in the workplace, but don't you think that the people using the tools should have some input in choosing the tools?
The best standardization I have seen grew out of a team who was empowered to make decisions, made good decisions, showed how their decisions were good for the entire organization, and helped everyone migrate to one solution, which turned out to be pretty good at solving everyone's problems.
If you as a leader find yourself pushing a standard tool set on a team that is hesitant to adopt it, the problem is most likely you.
The truth is that for most types of applications, there are several technology stacks that would be perfectly adequate. You get into the tradeoffs, and unless the benefit posed by a specific thing is immediately obvious, these are a matter of subjective judgment. Great technical minds will have differing subjective judgments; no one's judgment is exactly the same as another's.
People also usually do not understand their own motives. This is the origin of fads. Everyone suddenly agrees that THING_Y is the best, and then, the next year, everyone suddenly believes it's THING_Z now. If someone can present a dense, semi-passable argument in favor of it, the rest of the uptake comes down to marketing. This is true no matter how smart you think you are.
If you have a technician who prides themselves on always knowing about new technologies, and that's their reputation, it's likely their primary motivating factor will be continuing to propagate that self-image. They might not even realize it.
They will come and say "There is a new tech called TECH_X and it allows us to do these amazing new things!" As long as TECH_X has some value proposition and isn't a blatant or obvious bad fit (meaning obvious and blatant to everyone), their argument is plausible enough and they'll run with that.
They think they're doing this because they learn all the new things and it gives them a competitive edge. In reality, it's because they view themselves as cutting edge, they want everyone else to view them as cutting edge, and need to feed themselves and everyone else some proof validating that desirable conception.
The difference between "political schemers" and your average employee generally comes down to their naivety around the universality and importance of these functions. Which is another danger -- the less-naive employees in the ranks will be actively working their own angles here to attempt to manipulate the psychology of the people around them. This can and does invade technical choices. You can make the best arguments in the world for your stack, but if you have a political competitor, they will work to undermine you, often in non-obvious ways. If they feel now is the time to strike and they can embarrass you, they'll do so; if not, they'll agree and subtly work to sabotage your plan.
There's another type of employee here; the type who wants to keep his day job on cruise control, and will oppose anything new because it would make his short-term daily existence more difficult (and perhaps long-term if the improvement could reduce inefficiencies so much that he becomes redundant), regardless of organizational benefit.
There are many arguments to be made in favor of keeping the status quo, so he will look OK. If your new stack is a wild success, he will be able to say "Oh, I just needed some time to come around", and if your new stack is an abject failure (which happens much more often than people wish to admit; even more often, it's made to appear an abject failure by political competitors regardless), he will be able to say "I told you so".
People are naturally self-interested and will behave that way all the time. Management must look beyond the individual goals of the employee or team and consider what is in the long-term interest of the organization.
tl;dr This is naive; a purely democratic approach doesn't work because people's goals and values don't necessarily align.
So, given people with differing goals and values, they come to different conclusions about how to pursue those goals. Unless you hire sociopaths, most of your employees will want the firm to do well and will want to work with functional tech stacks that accomplish the stated goal of the company. However, people in tech leadership don't always share that goal. They make decisions based much more on personal utility than the average worker does. They might sign that sweetheart deal with a vendor because they see it as an avenue for a career change in the near future, or they might want to make sure no one below them starts to outshine them, so they keep the stack something they understand (knowing full well that someone not in leadership has much more time to pursue learning), or they know that if they force an outdated but well-known stack they can hire cheap.
In general, none of those conditions are good for the people actually doing the work, and they know it, which is why they push back. It gets propagated up the stack as naivete or fad chasing, but in general it's leadership insulating their position ahead of the company or their co-workers. Those leaders fail, and those companies fail. When decisions like this are pushed down from on high, look out; it's generally a sign of bad culture and a bad company.
This is not at all the case. There are many rational arguments to be made for abandoning an outmoded-but-already-in-place practice or stack, regardless of the knowledge base that already exists in the company. Many people could and would suggest a conversion to a new platform in good faith. Sometimes this would be the right decision, and sometimes it wouldn't; it really depends on the particulars of the organization.
> If everyone is rational and had the same goals these things sort themselves out.
People are not always going to have the same goals. You can't limit your organization to a subset of people who are all ideologically aligned.
Also, people frequently don't understand their motives for something; that is, they ascribe some rational-sounding reason to behaviors they are biologically driven to perform. They may fully believe their goals are aligned, but come to find out, they're not.
>Unless you hire sociopaths most of your employees will want the firm to do well and work with functional tech stacks that accomplish the stated goal of the company.
Sure, and since there are many tech stacks that would do well, this doesn't mean anything. People can disagree. When they disagree, the subjective judgment on the tradeoff must be passed up to a central authority.
>However people in tech leadership don't always share that goal. They make decisions based much more on personal utility then the average worker.
You're grossly overestimating the "average worker". The average tech worker wants to keep his job simple and easy so he can browse reddit all day at work.
You also assume that these things don't factor into the decisions of average workers. They do. Some workers aspire to be leaders, and, less naive about human nature, are willing to employ some showmanship and artifice to get there.
>In general none of those conditions are good for people actually doing the work and they know it which is why they push back.
None of the conditions you listed are good, but mature technical leadership, including standardization on a reliable stack (and, of course, a willingness to make rare-but-justified exceptions) is a boon to all.
>It gets propagated up the stack as naivete or fad chasing, but in general its leadership insulating their position ahead of the company or their co-workers.
There is a great deal of naivete and fad chasing everywhere, inside and outside of leadership. The ability to recognize it is valuable.
>Those leaders fail
Quite the contrary. Leaders do these things because, as long as they maintain plausible deniability, they are rewarded richly.
>and those companies fail. When decisions like this are pushed down from on high look out, its generally a sign of bad culture and a bad company
Companies that understand human psychology, both in the public and in their employee base, are extremely successful, because ultimately power comes down to the consent and resources of others. They are also usually the companies people most want to work for.
As a result it's slow, messy, and difficult to maintain. It would be nice if managers prevented small teams from making any technical choice they felt like.
That requires knowledge, experience, and good technical and business judgement. Unfortunately, not everyone has these qualities. Refusing something simply to keep things as they are is not a sustainable strategy in technology. Decisions have to be made smartly, and they have to contribute incremental improvements.
The gamedev stuff I've helped tackle fundamentally required mostly C++... and a little C, and some x86/ARM/PPC for debugging, HLSL 3-5/GLSL ES/PSSL/<custom shader DSLs> for porting graphics, at least two shell-scripting languages for CI servers (Bash, and PowerShell or (ew!) Batch), Python (for various modeling tools)... Java, Objective-C, and C++/CX for platform-specific APIs... MSBuild, Makefiles, Gradle, Ant... and that's just the languages! The tooling front is no better.
A handful of additional language choices that theoretically could've been avoided (e.g. C# for tools and server stuff, ActionScript for UI) is really not that big a deal in this context. You can add pointless layers of cruft, but I've only seen it once: a Python script used to hack up the output of a C++ program. Even then they had a reason: some of the C++ didn't get checked in, so we couldn't even build the C++ program. (A happy ending: with a little reverse engineering of the resulting binary output, I was able to re-implement the missing C++ bits, implement the Python-script 'fixes' more cleanly in the original C++, and add my own necessary changes on top of that.)