Using a lightweight, comprehensible framework is good, until you hit the limits of that framework and start pulling in lots more libraries to fill the gaps (the Sinatra/Cuba world). Using a heavyweight, complete framework is good, until you start suffering from bugs caused by incomprehensible magic buried deep inside (the Rails world).
I see the same problem in microservices versus monolithic apps. If you use microservices, you have a lot of duplication and inter-component configuration management complexity. If you use monolithic apps, you get the big ball of mud.
Or, as I sometimes put it, "Which kneecap do you want to get shot in?"
The underlying problem isn't fashion, or bloat. It's that software is very, very complex. Unless you're doing hardcore embedded work in assembly language, you're building a raft on an ocean of code you won't read and can't understand.
A friend of mine put it well once. He said that you should have a deep understanding of systems one layer from yours (i.e. your frameworks), and at least a shallow understanding of things two or three layers away (operating systems, etc.). If you don't have some grasp of the things you depend on, you're relying on magic and are vulnerable. But you can't realistically expect to know it all.
A lot of the problems we face in software engineering are Kobayashi Maru tests. They're no-win scenarios, tests of character rather than ingenuity. There's a certain irreducible complexity to all interesting problems, so at some point you're not solving complexity, merely pushing it from one place to another.
Right, but I'm not about to believe we're even close to that limit. Every day we have people writing bugs which, from a state of the art perspective, are already solved problems. It's like people are out there riding horses while others are driving past in their cars.
I think it's a little better to say that a lot of the problems we face in software engineering are cultural, not technical. So many people adopt these tribalistic mindsets when discussing their preferred technologies rather than allowing themselves to be open to better ideas.
This is a good analogy, but in software development, as in life, there are roads and there are trails. Some developers try to drive their car on the trail and ride their horse on the road! But sometimes you need the horse, and have to face already-solved problems, because your environment requires it (e.g. using C for embedded development).
> So many people adopt these tribalistic mindsets when discussing their preferred technologies rather than allowing themselves to be open to better ideas.
The problem is there is always a better idea. You have to strike a balance between being open to new ideas and just getting the job done. But there is definitely tribalism, because that is human nature; you earn your way into a technology and community through pain and suffering, so you become attached to it. Maybe some people can avoid that, but I believe it's pretty hard, especially when the differences between technologies are mostly cultural. You're less likely to be tribal when considering C vs. Ruby, but perhaps more so when faced with Ruby vs. Python.
For example, I have to read data from one source and write it to another. I can have a single app provide an API for both the reading source and the writing source, or I can have two separate apps with two separate APIs. But I can't get away from the problem of reading and writing. That's irreducible. If I have separate apps for the two APIs, I have a configuration management issue. If I have a monolithic app that does both, I have a coupling issue.
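The tradeoff can be sketched in a few lines of Ruby. The read/write core is identical in both packagings; only where the coupling and the configuration burden land changes (all names here are hypothetical, for illustration only):

```ruby
# The irreducible core: something must read from one place
# and write to another, no matter how we package it.
module Reader
  def self.fetch(store)
    store[:in]
  end
end

module Writer
  def self.push(store, value)
    store[:out] = value
  end
end

# Option 1: a monolithic app exposes both behind one interface.
# Simple to deploy, but the two concerns are now coupled.
class MonolithApp
  def sync(store)
    Writer.push(store, Reader.fetch(store))
  end
end

# Option 2: two separate services, one per concern.
# Decoupled, but both must now be configured to agree on the
# store's location and format -- the configuration management issue.
class ReadService
  def call(store)
    Reader.fetch(store)
  end
end

class WriteService
  def call(store, value)
    Writer.push(store, value)
  end
end
```

Either way the reading and the writing still happen; you've only chosen which problem (coupling or configuration) to live with.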
Imagining that Bleeding Edge Technology of the Month will make this go away is wishful thinking. But facing the truth is awful, so we choose the wishful thinking.
If you compare software engineering to another field like aerospace or structural engineering, it's laughable how little rigour and formality take place in most software. It's shocking how much disdain programmers show for mathematics, which gives us the tools to prove that our applications are correct.
I personally think these things are great, but too expensive. My customers care about solving their problems, not perfection. There's a balance to be struck, and I'm not excusing bad software. But if the customer isn't willing to pay for better and is happy with the current quality, there is little incentive to put in this extra effort (and therefore cost) when the resources could instead be put to work providing value in ways the customer does care about.
I guess after you exceed the customer's quality requirements/expectations, there are diminishing returns.
I'd love to produce perfect software, but nobody will pay me to do it.
I didn't say you had to do all of this yourself. I don't believe every programmer has to have a degree in mathematics. Where I take issue is with the refusal to take advantage of the hard work of others. There exist languages, tools and libraries which have a strong mathematical basis that are ignored in favour of the latest fad. Places like stackoverflow are full to the brim with people asking for help solving problems that they wouldn't have to deal with if they'd chosen better tools.
I do, however, think there's a "which kneecap" problem to rigor as well. Rigor isn't free - it comes with costs, in learning curve, in the quality of developer required, etc.
I've been bouncing back and forth between not-rigorous Ruby, kinda-rigorous Go, and pseudo-rigorous Java. I don't think rigor is a killer solution, nor do I think it's a gruesome waste of time. But I find the really ugly problems to be at the requirements and architecture level, not the code level. Not surprising, considering coding isn't my primary work.
The popularity heuristic may be a little better than the OP suggests, though.
I seem to be noticing more often now that "the best tool for the job, for the person, at the time" is completely acceptable. I feel as though this didn't use to be the case, and I know a lot of more "established" engineers who believe it's naive to choose tools based on an inclination or personal/team preference. While in this case, we're getting less code in the dependency, I'm highly suspicious of how their working knowledge, method of code organization, etc... can be transferred over time.
Ultimately though, knowing what actually happened over time with this project would be the most interesting. Does he eventually find new team members who convince him to switch back to a framework that is more widely understood and practiced?
> 3. Cuba itself is extremely easy to work with because it barely does anything. You can read the entire source in 5 minutes and understand it completely. I'm confident future teams could pick it up.
> While in this case, we're getting less code in the dependency, I'm highly suspicious of how their working knowledge, method of code organization, etc... can be transferred over time.
I think more useful indicators are tests, documentation and pull requests.
* Tests. These are things you can actually run to see if the library works. In most cases you can read them and understand how the library is supposed to be used. On some platforms, even if the tests passed in the past, they may not pass now, for example under a different version of the language or platform. Test coverage would be a nice metric, but it is hard to measure at first sight, so the ratio of test LOC to library LOC might serve as an indicator.
* Documentation. The documentation itself does not change how the library behaves, but it is a clear indicator of whether the author expected someone else to be able to use it, and shows a sort of responsibility. Most weekend hacks would not have any.
* Pull requests. If there are old, open pull requests that seem useful, it is a bad sign about the maintainer of the project.
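The test-to-library LOC ratio is cheap to compute. A rough sketch in plain Ruby, assuming the conventional lib/ and test/ layout (a real tool would also skip comments and fixtures):

```ruby
# Rough proxy for test coverage: non-blank lines of test code
# divided by non-blank lines of library code.
def loc(files)
  files.sum { |f| File.readlines(f).count { |l| !l.strip.empty? } }
end

def test_ratio(root)
  lib_loc  = loc(Dir.glob(File.join(root, "lib", "**", "*.rb")))
  test_loc = loc(Dir.glob(File.join(root, "test", "**", "*.rb")))
  return 0.0 if lib_loc.zero?
  test_loc.to_f / lib_loc
end
```

It's only a first-sight signal, of course -- a library can have a high ratio of shallow tests -- but it's one you can compute before reading a single line of the code itself.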
Ah, the "industry standard" argument, where industry standard seems to be defined as "whatever I've read the most blog posts about", or "whatever was used at my last job", or possibly "what I learned in school (last year)".
One of my major dissatisfactions is newly hired, junior (or mid-level) developers coming on board and then immediately wanting to change everything to be more "industry standard". They rarely seem to try to understand why we're already doing what we're doing, they almost never seem able to explain why what they want to do is strongly better, and they frequently just go off and do it, leaving a pile of projects each doing the same task in different ways.
Part of the reason the ball of mud is as ugly and messy as it is is that repeated attempts to fix it just got absorbed into the structure. An application that was shaped by many different engineers and leads over many years can have a lot of different flavors of weird.
About 20% of the projects I've dealt with would have been measurably improved (by pretty much every measure you can come up with) by a complete rewrite. One of the commonalities I noticed is that no one from the original technical team was still around. The notion of losing "institutional knowledge" about the code goes out the window at that point - it's already lost. You now have new people just adding on more crap (or fixing crap) without ever knowing why it was crap in the first place.
If someone only wants to keep their systems running - that's 100% fine - no need to rewrite. If the org expects new functionality on a regular basis, and there's no one who truly understands any of the current code, a rewrite may make sense.
You can't build a product with a mediocre team, no matter what tools you have.
If you have a decent team, I think they will be able to manage to build great products even with not-so-great tools. Of course, if you force them to use shitty tools, they might just decide to leave.
I understand that you're saying it's a bad practice to adopt a tool for reasons other than its utility in addressing a specific problem.
But is it really so terrible for a developer to want to learn a new tool so he can broaden his skillset and make himself a more attractive candidate to potential employers (so long as the tool is the right one for the task at hand)?
There are types of developers who are very limited in what they can do. They learn some basic "tricks" in the form of "If write this magic spell, I get this magic result".
These are the mediocre developers.
They tend to pick some popular tool and try to market themselves as "an X developer" where X is something that's popular right now.
Then there are developers who can create magic. They can build something almost from scratch without needing a library that already does it for them. These developers can pick up X or Y or Z in a week or two, and then they will be 20x more productive with it than the mediocre developers are.
I'm not saying it's a bad thing to pick up a tool to market yourself as an "X developer". I'm just saying there's a lot more mediocre people in the pool of "X developer", and if that's what you go looking for, you'll get mostly mediocre people.
However, software companies that hire for a specific popular tool/framework deserve to be punished with mediocrity.
It's not mediocrity, it's called playing to your advantages.
For the developers that go out of their way to learn a tool only because people like you will hire them, the problem is that they'll be hired by people who hire by "stack" and not by competence. Jobs that select this way normally place a negative value on competence, and the developer will face the choice of adapting and becoming mediocre, or going away and losing some short-term benefit.
If you don't move to a new stack, how's your retention?
Here, have an up for this...
Edit: Ha! I seem to have touched a nerve!
Unfortunately for him, this is a bad example. Twitter was notorious in its early days for being down all the time. So Twitter decided to migrate to Java - a dull, over-verbose, yet reliable language.
1. Small team, small budget, smaller set of technical skills.
2. The need to do a lot with less.
For these needs, I would argue that Rails was the perfect tool for Twitter early on. It likely gave them a competitive advantage in their development.
However, as with all fast-growing products, nearly every app faces a new set of challenges every time its traffic scales up by an order of magnitude. Twitter is an extreme example, among the most difficult-to-scale apps you can imagine (data is rapidly being created and read -- by its nature it's inherently hard to cache).
Their original app was built by relatively inexperienced developers hacking together an MVP. By the time they reached scale, they had a roster of senior developers capable of rebuilding the system far more professionally than they ever could have in the early days. Rebuilding the core functionality was trivial, and this time they could build it from the ground up to support 5000+ tweets a second.
I'm primarily a Ruby developer. Literally every single time I've started a web-oriented project using something like Sinatra or Padrino etc., I've ended up regretting it. I end up building a shitty, half-featured, buggy version of Rails. My experience is that most other projects end up the same.
I suppose mileage may vary; perhaps it's easier to write bloated apps in Rails. I'm not convinced that's a good reason to avoid it, though.
Perhaps he is not suggesting that popular ideas are likely to be wrong.
Instead, maybe he is saying that to think and develop ideas like Knuth's, one needs a certain amount of irreverence for what is popular.
(Undue?) reverence is rampant in the software industry, in my opinion. Would Knuth agree?
The "elite" developers (who often don't actually maintain real production apps, or live with the consequences of their design choices) start to fawn over a new language. O'Reilly publishes a book and, sure enough, influential senior developers who obsess over these elites start fangirling, convincing their bosses/teams/companies to adopt this new "cool" technology. As the popular tide of generic bandwagoners rises, the "elite" developers begin to feel the pressure to redefine their identity again, and soon jump ship to the next hot thing.
Then the process repeats...
I'm happy for the guy that he gets to do anything at all beyond that which is promoted by Oracle or Microsoft. I suppose Google might belong on the list of "Promoters not to be Ignored", as well, but they haven't flogged the use of inappropriate hammers enough, yet.
"Oracle or Microsoft"
Would any self-respecting programmer follow along with such idiocy if they were not paid to do so?
Actually reasoning about stuff when stuff can be reasoned about, and considering most opinions to be superfluous nonsense is the right direction. Opinions follow abstract models. You can pretty much find a computational or mathematical model, throw some nouns and verbs on it, and bam, you've got an opinion that has nothing to do with reality.
I humbly request some illustrative examples. Note I agree with you. Examples can be powerful (and, unfortunately, polarizing). Whatever you can share would be appreciated.
Then again, is it just me who sees the patterns in the words, or are the patterns actually there? I think it probably depends on how your mind constructs analogies and relations between its map of cultural topics and dialogue, etc. It just bothers me when I see flashes of pictures that I normally equate with programming and math, modeling the relational structure of what I'm reading. The two have nothing to do with one another, and yet they persist in flashing across the visual-processing part of my mind without me doing anything. I am occasionally fascinated by determining whether they mean anything, or where they come from.
People use terms though - black and white thinking, etc. Lots of analogies and metaphors have very simple abstract forms. It's memes, deeply validated patterns. They don't have to be true in reality as long as people keep talking about them and agreeing they exist.
I feel as though I have become a scientist scientist. Thanks for the kind words, but I'm just trying to get through a rough mental space. I prefer maintaining a zen beginner mind, because it is fairly easy to silence doubt.
If most people believe X, and that makes you think X is probably wrong, then you are probably in one of the groups that believes an alternative to X. There are lots of groups like yours, they can't all be right, and therefore your chances of being any more right than X are slim (all things being equal). At least with X, you have the support of the population and can fit into society during your lifetime.
And the nice thing about X is that it has been battle-tested by many people, and while it may be "wrong", it WORKS.
With the past year in retrospect, I can say with certainty that I'm quite pleased with our decision to embrace Cuba. It's led to a highly-declarative codebase with clear layering and very little magic.
If anyone has any specific questions I'd be more than happy to answer.
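For a flavor of what "very little magic" means in practice, here is a toy Rack-style router, small enough to read in one sitting. This is a hypothetical sketch in the spirit of a micro-framework, not Cuba's actual API:

```ruby
# A toy router: map an HTTP verb and path to a block, return a
# Rack-style [status, headers, body] triple. No magic anywhere --
# the whole dispatch mechanism is one hash lookup.
class MicroRouter
  def initialize
    @routes = {}
  end

  def on(verb, path, &block)
    @routes[[verb, path]] = block
  end

  def call(env)
    block = @routes[[env["REQUEST_METHOD"], env["PATH_INFO"]]]
    return [404, {}, ["Not Found"]] unless block
    [200, { "Content-Type" => "text/plain" }, [block.call]]
  end
end

# Usage: declare routes, then hand the object to any Rack server.
app = MicroRouter.new
app.on("GET", "/hello") { "Hello, world" }
```

When the whole routing layer fits in a couple dozen lines like this, "read the entire source and understand it completely" is a realistic claim rather than a slogan.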
This is what makes established products unreasonably difficult to dislodge (from a technical perspective).
At the other end of the capability spectrum, Alan Kay said there's an exception to the rule to reuse not reinvent: those who can make their own tools should.
Really, I think the developers chose Cuba more than anything so that they could scratch the itch of getting to try something new.
Counter-point: at least in the open source world, popularity can translate into lots of eyeballs on the code, lots of bug fixes, better ability to hunt down solutions on Google, hire people, etc.
Counter-counter-point: lots of people depending on a project may slow its progress and prevent its maintainers from correcting fundamental design mistakes, all for fear of breaking existing installs.
Even with a non-open-source tool, lots of users means lots of references on Stack Overflow, and a much better chance of not encountering bugs for your particular use case.
Any framework of moderate size has bugs in it. The only question is, has the bug that would slow you down been fixed yet?
That's why the size of the alternative is so important. If the alternative is just 300 lines of code (as in this case), then even if you do run into bugs, fixing them is gonna be trivial. But what about 3,000 lines? 30,000? ...
I wouldn't say everybody should automatically go with the popular option, because this kind of herd mentality can really be destructive. If another, less popular tool seems better for your use, give it a fair consideration. I've actually personally chosen that route several times, and I would again. But by doing so you accept the strong possibility of dealing with bugs which your competition doesn't need to worry about.
Apparently they didn't do a good job since they've long since abandoned it.
Haha, indeed it does.
But the thing about groupthink is, it is at its most insidiously effective when its participants don't recognize it as such. It works best when one feels oneself an iconoclast for saying what most everyone around them believes.
If a product does not work well, it won't even get much attention or popularity. There are reasons why products got massively popular in the first place.