One should give credit to Joel for correctly identifying the issue: great thinkers excel in abstract thought. Many of the comments here, however, seem to misunderstand his point about the propensity of creative minds to continue the refinement process and "not knowing when to stop". One commenter (codeulike) apparently thinks simply being "bored" is what drives abstract thinkers. Let me assure you that is not the case, nor is it an inability to code. Many architecture astronauts started out as whiz coders.
Now let's review the gifts given by said astronauts - I'll limit myself to two examples, but there are more:
- LISP is the brainchild of a software astronaut.
- UNIX ("everything is a file") is the product of software space exploration.
My advice to serious young software engineers is to not accept mediocrity and imprecise thinking as the standards of their chosen vocation. This cult of celebrating mediocrity in software design is a transient phase (20 years to date), mostly a side effect of the facile soapboxes the internet and the blogosphere have introduced. When the dust settles, deep thinking and an ability to conceive powerful abstractions will once again take center stage.
Software, after all, is all about abstraction.
- If your goal is to create great new architecture, that's fine, go for it.
- But if your goal is to ship a product to customers, on a tight schedule – say, you are a startup trying to get an MVP out the door before the end of your runway – then you should probably not be doing more than a fairly minimal amount of architecture, and you should beware of the temptation to do so.
Note that both your examples were created by people employed by large, wealthy organizations (MIT and AT&T respectively) that could afford to give them plenty of time to explore interesting new ideas. That's a great situation to be in; the world would be a better place if we put more people in that situation, and if you are in it, then you should take advantage of it. But if you are unfortunately not in it, then you should be aware of that fact.
It takes "astronauts" who think hard about the architecture to come up with systems that are simple and powerful at the same time.
The original XML spec is pretty simple: https://www.w3.org/TR/xml/
Even if you add in namespaces, it's still simple: https://www.w3.org/TR/xml-names/
The difficulty in XML really came with XML Schemas and the need to weave them into every subsequent spec.
Right now I have a project of potentially infinite complexity, and we need to build really fast. A lot of my focus has been finding the very few pieces of bedrock and fixing on those, allowing the rest to vary as our appetite for complexity grows. Start today with an achievable goal, test hypotheses, don't paint ourselves into a corner. All the team's comments are about how simple it is. That's by... design.
eg: "everything is a file" maps properly to both physical storage with addressable, streamable content and to the human mind as data reachable by classification (file name). Maybe a revolutionary OS design could better that with "everything is a message", but I think you can understand why one can have doubt on this translation from files.
A "message" tends to have formats associated with it (which is what differentiate it from being just a "file" that's transferred via a protocol). These formats, like JSON, or xml, or whatever new fangled formats kids these days use, now requires architectural astronauting; namely, common field names, or standards, so that different programs would parse them similarly. And now you'd want schemas for those formats, and automatic parser generated for those formats, and more and more...
It's not. Plenty of Unix tools work with binary files.
> A "message" tends to have formats associated with it
It doesn't. At least, not any more than a "file" tends to have formats associated with it.
But you may have a point with "everything is a file", and I wouldn't say that pattern has been entirely successful outside of actual files. I'm inherently skeptical of any "everything is a..." philosophy. They tend to look good on paper but turn ugly when they hit the gnarly real world.
This is a pretty huge leap from what the commenter actually said, which was:
> I'll tell you what though, if you have a programming task that you find boring, over-engineering it and over-architecting it can make it _so_ much more enjoyable.
It seems you've misinterpreted many of the comments. People aren't upset about any kind of architectural design, only where it crosses a hard-to-define threshold into "over-architected". This is pretty hard to quantify exactly, but it is entirely likely you, codeulike and the rest would have an identical threshold where you'd say "you've gone too far ..." on some design.
> My advice to serious young software engineers is to not accept mediocrity and imprecise thinking as the standards of their chosen vocation.
Hmm? What straw man are you attacking?
So I guess the question is - what's the best way to scope and evangelize a 'moonshot' project? Or should you just shut your trap and go join a startup?
Instead we have ended up with an overly bureaucratic format and process that is mostly used for paper-pushing and vendor lock-in.
.NET and C# have evolved into a great general-purpose framework and language, but the original lofty hype around .NET was about turning everything into SOAP web services for B2B messaging with automatic discovery and blah blah blah, and that never happened. JSON and REST waltzed in and (somewhat) did that organically, without the hype.
XML is a great example of what Joel's talking about. It turned out to be moderately useful but was hobbled by all the over-architecting in its early days.
Joel isn't arguing against progress in general here, he's arguing against over-architecting.
You can see this happening with the newer technologies. JSON has a lot of quirks, like not having binary or integer types. So people have bolted those on. JSON doesn't have schema definitions (which are nice if you, say, want to generate code to talk to an API), so we're bolting those on (JSON Schema, OpenAPI, etc.) Admittedly, it's all a lot easier to understand this time around.
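For a concrete taste of the bolt-on, here's a minimal sketch using the third-party Python jsonschema package (the field name is hypothetical): JSON itself can't say "this must be an integer", so a schema layer says it instead.

    from jsonschema import validate, ValidationError  # pip install jsonschema

    # A schema bolted onto JSON, since JSON has no integer type of its own.
    schema = {
        "type": "object",
        "properties": {"user_id": {"type": "integer"}},
        "required": ["user_id"],
    }

    try:
        # "42" is a string, not an integer, so validation fails.
        validate(instance={"user_id": "42"}, schema=schema)
    except ValidationError as e:
        print("rejected:", e.message)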
To some extent, I'm not sure if we as a field learn from the past, or just reinvent it every 15 years and do a little better accidentally. (Remember the PC revolution, and how you wouldn't have to connect to a big mainframe to do computing? That revolution died and now we carry around $1500 dumb terminals in our pockets, while all the software runs on a mainframe!)
Which I suppose it sort of does, but that seems like much less of a good thing to me now than it did in the 90's.
This is a perfect description of how it happened, and how practicality trumped the bombast of the people pushing SOAP and what later became the terrible framework that is WCF (whatever happened to Don Box? He was all over the place for a while).
BizTalk also lands nicely in the 'conceived and designed by astronauts' camp.
Edit: And if we take your examples, the Architecture Astronaut would be the person who says "We should use node.js in our backend, it will solve all our problems!", not the person who wrote node.js.
As well as the excruciating angst you saw from others or anticipated at larger scale?
Here is one HN thread about some of the issues others have run into: https://news.ycombinator.com/item?id=19072850
> It processes these as quickly as possible, with no ordering guarantees.
Look at this for example:
> A recent example illustrates this. Your typical architecture astronaut will take a fact like “Napster is a peer-to-peer service for downloading music” and ignore everything but the architecture, thinking it’s interesting because it’s peer to peer, completely missing the point that it’s interesting because you can type the name of a song and listen to it right away.
He was exactly right. Now you can do the same thing on YouTube: type the name of a song and most of the time it will come up right away. And that's what matters the most.
Crypto comes to mind. Crypto is astronaut crack. Distributed-ledger astronautery for some, weird-economics astronautery for others. Meanwhile, the "listen to a song" feature of crypto is "I can make money with this." Eventually we'll have central-bank crypto with reversible transactions.
The example of generic file support seems a bit of a stretch. There's inherently nothing wrong with adding more support, but the contrived example conveniently meant the loss of a feature. Of course backward steps are bad... which is why certain 'contracts' are established.
I think he went into it wanting to make a different point - that delivery in the end matters more than design. This, I've [somewhat begrudgingly] come to agree with. He's probably gone on to make a post just like this that changed my views.
I don't think he gives credit to these astronauts and what they manage to achieve, and the flexibility it demands. These are the dreamers, and sometimes you have to keep them from wandering too far. Other times, you might want to see where it goes.
The real world isn't FIFO. You can go with the short term safe plan while you try to find a clever solution to never have $design_problem again, or bolt on the latest feature.
It's great that he can imagine his new favorite gizmo, but it has to happen somehow. What's more, the lessons learned along the way tend to be useful anecdotes elsewhere.
Like Mr. Savage implied (I won't try to quote him accurately), what makes science not just screwing around... is writing it down.
At the time (2001) .Net was the brand for a vaguely defined all-encompassing Microsoft initiative - something about connecting everything. The .net framework was just one piece of this, there were also SOAP services, a global identity system and various other stuff. See for example this old article: https://www.computerworld.com/article/2596383/update--micros... :
> For end users, .Net provides a sparse, browser-like user interface without any menu bars. A key concept in the new user interface is the "universal canvas," which Microsoft said eliminates the borders between different applications. For example, spreadsheet and word processing features will be available inside e-mail documents. .Net also will support handwriting and speech recognition, the company said.
It is clear nobody (even at Microsoft) had any idea what .net was supposed to mean!
This branding obviously failed, so .net was narrowed to just describe the development framework.
These are all tools, not solutions, but people keep pretending the tools are the solution. That's the 'astronaut' part - off in space and not grounded in actual user requirements.
Not to take away from that achievement, but he wasn't the first one to implement this idea.
The infamous ECMAScript 4 was supposed to be a more general-purpose language. There were also previous attempts at roughly the same thing, e.g. Rhino.
Oh but we still do, we still do...
What happens if you don't go far enough up, abstraction wise?
You end up repeating yourself - everywhere. You end up with globals everywhere. You end up with God Classes. You end up with epic-length functions nested 15 levels deep. You end up with primitives trying to do the job that well-crafted data structures, named using the terminology of the problem domain, should be doing.
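A minimal Python sketch of that last failure mode (all names hypothetical): the same loose primitives passed everywhere, versus a structure named in the problem domain's terminology:

    from dataclasses import dataclass

    # Not enough abstraction: three primitives every caller must keep in sync.
    def ship(order_id: str, customer_email: str, total_cents: int): ...
    def refund(order_id: str, customer_email: str, total_cents: int): ...

    # One step up the ladder: a named structure from the problem domain.
    @dataclass
    class Order:
        order_id: str
        customer_email: str
        total_cents: int

    def ship_order(order: Order): ...
    def refund_order(order: Order): ...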
It's odd that the author rails against "astronauts" without addressing the very real motivations that lead to good architectures. Like all design, there are tensions in software development, and navigating the forces pulling you in opposite directions regarding architecture is one of the most important contributions you can make.
It's also odd that the author never calls out specific architectures that are the work of astronauts, just "the stupendous amount of millennial hype that surrounds them." He seems to hint in the introduction that CORBA is (was) one of them, but even that reference is pretty vague.
When the architecture is underdeveloped, it only takes somebody with a little more experience to fix the problem using incremental refactoring. But when astronauts have run wild with too many abstractions, it can be incredibly difficult to unravel them, especially if you don’t have the original developers’ help.
Sometimes we deal with under-engineered stuff, that's the stuff that breaks the first time you put it to use. We don't tend to think highly of those things.
Under-engineering can sometimes be fixed. But not always. Sometimes the fix is to throw it all away and re-do it because the issues run so deep the design is not fixable.
Good software has good "bones". The right decisions were made early on to support building more stuff over it... I completely disagree that you can refactor bad software into good software. Though presumably that's not what the parent meant. Perhaps the parent's idea is to defer big decisions that don't have to be made early. That doesn't mean you shouldn't make good decisions on the things that do have to be decided early.
That's not what overengineering means.
To reuse your bridge example, over-engineering a bridge would be something like using a complex suspended-bridge design with expensive alloys, small tolerances, etc. to cross a 10-meter gap where a simple slab of concrete would have done the job.
And "underengineering" would be to pour more steel and concrete at the bridge until it sticks.
I guess "expensive" might fall under costs, so something that was targeting a cost of $1000 but the design costs a million dollars to make is certainly over engineered.
I'd consider load ratings of an elevator to be similar. We mandate those are over-engineered by some margin. But if the load rating of an elevator is 1 ton and you designed an elevator to be able to carry 100 ton then it's certainly over-engineered by more than the safety requirements...
So IMHO over and under engineering doesn't refer to the requirements, but to the engineering effort deployed.
What developers often refer to as YAGNI and KISS.
I like how you expressed this. It puts into words what I've experienced with various codebases, both my own and others'.
This kind of understanding is difficult to teach and learn - what it means for software to have good bones. It also brings up the thought that it's not just about under- and over-engineering, but the qualitative differences between well- and (for lack of a better word) badly engineered software.
Fortunately there are various concepts that people have developed which help in talking about things like "good bones": modularity, encapsulation, functional purity, idempotence...
I suppose over-engineered software goes too far in applying these concepts (especially OOP?) to the point where it stops being "well-engineered". But in general I'd personally prefer working with such a codebase over under-engineered software where things are disorganized and too interdependent, with rampant use of globals, etc. It follows your analogy of an "over-engineered bridge".
On the other hand, there is something to be said for software architecture that is tastefully and minimally engineered, with just enough strong bones as the basis - but no more! - to achieve its purpose. It's a quality that is somewhat rare to see, where the code is almost deceptively simple to understand, such as small functions with clear singular purposes, and not being too clever. It takes many years of experience (and perhaps certain kinds of personality) to reach that level of simplicity.
I believe such simplicity cannot be taught, but perhaps can be "absorbed" via osmosis, haha..
Joel's criticism targets PEOPLE not IDEAS. See the blog post title. There's nothing wrong with abstractions as an idea. The issue is when folks apply them in inappropriate and unnecessary ways.
When a smart, motivated, and caring person programs a computer then sensible abstractions will naturally emerge. A complex proprietary formula will not be copy-pasted because of course that's a bad idea. These judgement calls are important!
Globals are not always bad. Some data is truly global and should not be instanced, so to speak.
God classes are not always bad. Sometimes it's important to isolate experimental behavior from everything else and get it done quickly. Throw it into a class, throw the class into a file, mash a bunch of code in there, and test it out. If it works out then refactor it.
Primitives are not always bad. Data structures are not zero-cost! The number of times I've seen a "well-crafted" (???) data structure tank performance by orders of magnitude is just sad.
> It's odd that the author rails against "astronauts" without addressing the very real motivations that lead to good architectures. Like all design, there are tensions in software development, and navigating the forces pulling you in opposite directions regarding architecture is one of the most important contributions you can make.
This paragraph is over-engineered.
Or you end up with a flat hierarchy of isolated modules that can do one thing and one thing only but can do it well.
Architectures are what can't be changed easily and it is easier to change what is small. Architectures that try hard to accommodate every case always end up getting in the way of everything.
I've found more usefulness in being pithy than in being prolix.
Think of the quadratic formula for solving 2nd-order polynomials. You might say it is too abstract - you don't need to solve all 2nd-order polynomials, just this one and that one. If that kind of thinking had prevailed, we would be missing many things the modern world has to offer.
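To spell the abstraction out, here's a sketch of the general solver in Python - one function for every quadratic ever, instead of one ad-hoc calculation per equation:

    import cmath  # complex sqrt, so negative discriminants work too

    def solve_quadratic(a, b, c):
        """Both roots of a*x**2 + b*x + c = 0, by the general formula."""
        d = cmath.sqrt(b * b - 4 * a * c)
        return (-b + d) / (2 * a), (-b - d) / (2 * a)

    print(solve_quadratic(1, -3, 2))  # ((2+0j), (1+0j))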
I'll tell you what though, if you have a programming task that you find boring, over-engineering it and over-architecting it can make it _so_ much more enjoyable.
Even overengineered systems should be simple to change.
Clearly you've never met a 10x over-engineer
Over-engineering (for example over-generalising or adding too many layers of abstraction) is typically done with certain sorts of potential future changes in mind, so in the resulting system some changes are easy but some types of change (the ones the over-engineer didn't anticipate) can only be done with massive refactoring.
"Sure, our web application might take a minimum of two seconds to handle trivial get requests, and logging in doesn't work half the time, but say the word and I can instantly migrate from Postgres to MySQL."
It is not easy to predict the future of a software application. In Louisiana they were building levees to withstand a once-in-100-years flood, or something like that. Based on historical weather reports they were able to estimate how high and strong the levees would need to be. But with software it is hard to see how we could estimate what kinds of requirement changes might be needed over the next 100 years for any application.
So, for example, you architected the system so that a dozen teams with a dozen developers each can hack independently on the system with minimum conflict, but you only have one team with four developers, who would have been able to iterate much faster on a simpler codebase, and who pay a high cost for maintaining and operating such a complex system.
Often the realization that a system is overengineered in one dimension happens simultaneously with the realization that it is underengineered in another. For example, you realize that your system that is engineered to scale to petabytes of traffic per day scales awkwardly to more than three or four customers. It takes weeks to onboard a new customer, and it's impractical to onboard more than one customer at a time. Meanwhile your first four customers are only sending you hundreds of megabytes of traffic per day, and the mechanisms you added to scale to petabytes of traffic are making it really awkward to reengineer the onboarding process. Salespeople are quitting because they have prospects in the pipeline that they aren't allowed to move forward on, and upper management is demanding to know why we need more integration engineers when they already outnumber our existing customers. But by god, if one of these customers wants to send us tens of millions of requests per second, we're ready (for some untested meaning of the word "ready").
People who overengineer systems are often looking for a challenge because the real needs of the business (such as onboarding new customers, making the UI brandable, integrating with a dominant third party ecosystem) don't seem like engineering challenges. They want to do good engineering work, so they pick a challenge that they think of as engineering (tail latency, resilience, "web scale," you know, engineer stuff) and they throw themselves into it. And they run foul of the truth that LOC (and architectural complexity) are liabilities.
One of the most significant qualities of an over-engineered system is an unusual resistance to change.
The best part about the people who do such things is that they may not even realize they do it!
We don't think of it that way, but I personally believe that's really what it is. It's a reductionist (abstracted) approach to developing a framework around a set of observations. Just like research, it may be too far abstracted to be relevant to the problem at hand and may take years before it becomes useful. That's just my perspective, though.
Nor is it the case that everyone who does this kind of "research" is actually qualified to be doing it.
So they're publishing papers with falsifiable hypotheses and subjecting themselves to peer review, right?
Wow, this hasn't aged well. I always found Joel's articles to be interesting but pretty limited in actual wisdom. He writes as if he's making profound observations, but in reality they are pretty narrow-minded, apply only to a very narrow niche of computing as a whole, and really only apply well to a certain startup type - not to technically complex communications, productivity, or organizational software.
His advice is invalid for the development of anything more complex; for example, you shouldn't take his advice if you're building a biotech company or starting a new kind of tech company that isn't your run-of-the-mill CRUD app.
I think it’s rather disingenuous and quite arrogant to make generalizations about the types of people who work at big companies (without even knowing what they do or having ever met them). I’m sure the same traits that make one have disdain for corporations are the same drivers that cause one to become an entrepreneur, and then blog loudly about it to the world.
I get where you're coming from, with your reply as a whole and not just this snippet, but I think it's worth pointing out that Joel did work at Microsoft for quite a few years during its historically ascendant and dominant period in the 90s, so I think he has at least some idea about the types of people who work at big companies.
If you want to talk about things that haven't aged well, SOAP would be right up there - you may not remember all the fanfare surrounding it, but I certainly do, and where exactly is it now? I know it's still used in some legacy contexts - I've even seen SOAP responses within the last 5-7 years - but it's basically dead (or at least stagnant) tech that isn't used for newer projects and systems, and certainly isn't something we'd even consider using.
I suspect his advice would be: "Does this actually have to be more complex?" It might superficially look complex, but could it actually be handled in an old-school, simpler manner?
One factor is definitely climbing the abstraction ladder to the point of no oxygen - Astronaut Architecture. But I don't think that alone explains the amount of bombastic, heroic, utopian grandiloquence in those Astronaut Architecture quotes.
People describing their jobs, their companies, their ideas and such seem drawn to grandiose & abstract nonsense. "Solutions-talk." Walk around a lot of business-ey trade shows and read the plaques: 90% of the time, it is impossible to know what they do, who they do it for, or why. They all do "business, people, and technology solutions." Even if you stop with questions, the first answer is always hopelessly abstract. It takes a lot of digging to eventually find out they do custom spreadsheets for dentists.
Meaningful statements are limiting. Who wants to limit themselves?
Also, it works. Saying something specific enough to be meaningful opens you to criticism, being eliminated by process of elimination, etc.
Architecture Astronautary feeds comfortably into sales, marketing, investor relations, recruitment. It's acceptable in boardrooms, AGMs, job descriptions. Media is happy to report on it.
Obscurity by abstraction works.
From what I hear 'buck' at Facebook is directly modelled on Blaze/Bazel.
This is a very effective statement. Today p2p for music is certainly alive but has largely been surpassed by services like YouTube and Spotify (and Rdio, sadly dead before its time). One is ad-supported, the others paid. But they all just let the user search and listen.
Popcorn Time has done this for movies and TV. It's purely p2p. Disliked by content owners, but they've largely failed to provide the same kind of universal search-and-watch functionality that the app provides. Hulu, Netflix, Apple TV+, Disney+, insert other streaming services: they all basically fail at usable search and recommendation and have limited overall content.
Is it the really big company that makes you die inside, or the people themselves?
I tend to think it's the company... it's so hard for me to work and feel productive at a large company. Something there just oozes "nobody really cares about you or what you're doing." Everyone is just trying to do the bare minimum and go home.
That's the "Clueless" section of that essay.
A good architect worth their salt will be able to find the right balance: to distill and simplify a problem down to the absolute minimum, where it cannot and should not be abstracted any further without losing or missing its core objectives and reason to exist.
Personally I like the rule of thumb that you let 3 instances of something occur before you try to refactor it into a single design.
Without examples of real usage it's far too easy to come up with bad abstractions. I'd rather have code that's easy to delete later.
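A hypothetical Python sketch of that rule of thumb - tolerate the first two copies, extract on the third, once real usage has proven the shape:

    # Occurrences one and two: live with the duplication.
    def export_users(rows):
        return "\n".join(",".join(map(str, r)) for r in rows)

    def export_orders(rows):
        return "\n".join(",".join(map(str, r)) for r in rows)

    # Occurrence three: the pattern has earned its abstraction.
    def to_csv(rows):
        return "\n".join(",".join(map(str, r)) for r in rows)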
Nobody likes bombastic claims, but these likely come from marketing rather than from architects. It's important to see beyond the marketing and recognize the engineering thinking and potential behind technologies.
Strong blockchain vibes in this and the paragraph that follows it.
Spotify claims they have 356 million users. I would claim that torrent users outnumber them - you just don't know or hear about them!