
Hi! Author here. I think it's important to address certain aspects of this post that people tend to misunderstand, so I'll just list them here to save myself the effort.

* I do not argue against abstractions, but against the unrestricted application of them.

* I do not advocate for a reversion to more constrained platforms as a solution.

* I do not advocate for users becoming "more technical" in a "suck it up" fashion.

The key to understanding the software crisis is the pair of curves for "mastery of a platform" and "growth/release cycles". We have, in the past 40+ years, seen these curves diverge in all but a few sectors. We did not address the crisis when these curves were still close together, but the second-best time to do so is now.

As for folks calling this clickbait: it is the first post in my log, and it reflects my thoughts on the situation we find ourselves in as developers. The sentiments are mirrored, in various forms, across multiple communities, some of them rooted in counterculture.

I do want to deliver some part of the solution to these problems, so I do intend to follow up on "I'll show you how". I am a single entity, so give me time and grace.




I personally don’t understand what you are arguing for. While I agree that problems can be too abstracted or that bad abstractions abound, I don’t think that’s a controversial opinion at all. You’ll also literally never fix this problem because millions of people produce software, and you’re bound to disagree with some of them, and they’ll never all be as skilled as you. So my only interpretation of this post is that you think we should have even higher standards for what constitutes an acceptable abstraction than is the norm already.

That’s easy to say writ large but I think if you examine specific areas you consider “too abstracted” you might find it a humbling experience. Most likely, there are very good reasons for those “over abstractions”, and the engineers working in those areas also find the abstraction situation hairy (but necessary or unreasonable to fix).

For example, a lot of software is built by providing a good abstraction on top of a widely used or accepted middle abstraction (eg Kubernetes on Linux + container runtimes + the traditional control plane:config layer:dataplane architecture). All that logic could be implemented directly in a new OS, but that introduces compatibility issues with users (you can build it, but you may have no users) unless you reimplement the bad abstractions you were trying to avoid. And, that kind of solution is way, way, way harder to implement. I personally want to solve this problem because I hate Kubernetes’ design, but see it purely as a consequence of doing things the “right way” being so absurdly difficult and/or expensive that it’s just not worth it.


Indeed, it's tradeoffs all the way down.


It's important to remember "Clarke's three laws"[0]:

  The laws are:

    1. When a distinguished but elderly scientist states that
       something is possible, he is almost certainly right. When
       he states that something is impossible, he is very
       probably wrong.
    2. The only way of discovering the limits of the possible is
       to venture a little way past them into the impossible.
    3. Any sufficiently advanced technology is indistinguishable
       from magic.
Depending on one's background and when they entered this field, there exists a non-trivial probability that previous abstractions have become incorporated into accepted/expected practices. For example, at one point in computing history, it simply became assumed that an operating system with a general-purpose file system is in use.

The problem I think you are describing is the difficulty one experiences when entering the field now. The prerequisites in order to contribute are significant, if one assumes the need to intricately understand all abstractions in use.

While this situation can easily be overwhelming, I suggest an alternate path: embrace the abstractions until deeper understanding can be had.

0 - https://en.wikipedia.org/wiki/Clarke's_three_laws


A tangent, but Clarke was slightly wrong. Magic is not just indistinguishable, but it actually IS advanced technology.

In fantasy worlds, wizards spend years studying arcane phenomena, refining them into somewhat easily and reliably usable spells and artifacts, which can be used by the general public. (Don't believe depictions of magic in video games, they simplify all the annoying little details.)

The above paragraph is actually true about our world, we just happen to call these people scientists and engineers, not wizards.


Gandalf wasn’t a technologist.

(Edit) and his magic wasn’t remotely Vancean


He wasn't a technologist by the standards of his family, but it's somewhat implied in the Silmarillion that the magics of the Valar are intertwined with their understanding and knowledge of the world.


Yes, but in the Legendarium, the Valar were innately ‘magical’ creatures created by a supreme being, not people who studied spells


This is true, but practical LLM wrangling seems less technological than any tech I have seen until now.


I worked in image processing for some time, it’s very similar


Whoooosh


Subspace whoosh. Or more seriously, I think OP was going for a corollary to how stories about dragons aren’t false, they are more than true because they teach us the dragon can be slain. The same could be said for magicians - the stories are more than true because they tell us the complexity and arcana can be tamed. (Also note how many of the magic stories warn of unintended, unforeseen consequences resulting from unintended interactions between parts. (Sorcerer's Apprentice.) This should ring true for anyone in programming.)


I liked the way you showed the problem as ongoing in history. Indeed, the phrase "software crisis" is nice because it references the first point when the situation was articulated.

That said, I think the reason the situation is not going to change is clearly economic. That's not saying that bad software is cheaper. But there's a strong incentive toward cheap, bad practices, because cutting corners allows one person/organization to save money now, with the (greater) costs being picked up later by the organization itself, its customers, and society at large. Moreover, software isn't amenable to the standards of other sorts of engineering, and so there's no way to have contracts or regulations demanding software of a certain standard or quality.

Edit: The only imaginable solution would be a technological revolution that allowed cheaper and better software, with that same technology also not allowing even cheaper and worse software.


Yes to this.

Lately I feel like we have built a society with expansive software infrastructure, where that software is doomed to be crappy and inhumane because our society actually couldn't afford to build this quantity of software well.

So another hypothetical fantasy solution would be a lot less software, like being careful about where we used software, using it in fewer places, so that we could collectively afford to make that careful intentional software higher quality.

Certainly a society like that would look very different, both in terms of the outcome and in terms of a society that is capable of making and executing a decision to do that.


Our society / societies may well be able to afford to build this quantity of software well. We choose not to.


I don't know how/haven't seen an attempt to approach this question by a method other than "my hunch", but as a software engineer "my hunch" is it would cost at least 10-50x as much human labor (not just engineers but designer and UX researchers as well as all the other support roles like project managers etc) to build the software "well" (including more customization for individual enterprises or uses), and that it would become an unsustainable portion of the GDP.

Just "my hunch", but one I reflect on a lot these days.


You could easily reduce the amount of software that exists today by 10-50x and have an adequate amount of software for virtually all purposes.

But this is incredibly hypothetical. A lot of software labor today revolves around manipulating the user rather than aiding them, so where we'd go with "better" versions of this is hard to say.


> (not just engineers but designer and UX researchers as well as all the other support roles like project managers etc)

Oh! I thought you were going to say "testing teams, design reviewers, trainers".

I'm not on-board with this "10-50x" claim for the amount of effort. I'd say maybe 3x the effort, given the developers have been well-trained, the methodology is sound, and the managers' focus is exclusively quality. That last item is one I've never experienced; the closest I came was on a banking project where the entire development team was subordinated to the Head of Testing.


Humanity could easily afford to provide proper healthcare and schooling for every person on the planet if we didn't spend so much money on our collective militaries too, but "we" don't.

Getting everybody (or even a minority of any sufficient size) to act in service to a single goal has been a problem for humanity ever since we first invented society.


What doesn't get spent on quality goes down as profit. This is why we have fake mobile games. Vaporware is the real software crisis.


It's not just about cost, it's about survival. If I fail to generate revenue in a reasonable timeframe, I can't put food on the table. If I have a healthy income, I'm happy to tinker on the technology ad infinitum.


This state of affairs is very much due to competition (instead of cooperation) and the feast-and-famine shape of what software invention and development looks like, especially in the US.

It takes time and effort to optimize software for performance, user-friendliness, developer-friendliness and everything that ethical and mindful developers hold up so highly, but that’s not the kind of thing that the big money or academic prestige is very interested in. The goal is to get the most users possible to participate in something that doesn’t repulse them and ideally draws them in. The concerns of ethics, dignity, and utility are just consequences to settle, occasionally dealt with before the world raises a stink about it, but are hardly ever part of the M.O.

Imagine if developers could make a dignified living working to listen to people and see what they really need or want, and could focus on that instead of having to come up with a better ‘feature set’ than their competitors. It’s essentially why we have hundreds of different Dr. Pepper ripoffs sold as ‘alternatives’ to Dr. Pepper (is this really a good use of resources?) instead of making the best Dr. Pepper or finding a way to tailor the base Dr. Pepper to the consumer's taste, assuming enough people claim they want a different Dr. Pepper.


I don't think there is a software crisis. There is more of a software glut.

There is so much software on so many platforms, such a fragmented landscape, that we cannot say anything general.

Some projects may be struggling and falling apart, others are doing well.

There is software that is hostile to users. But it is by design. There is cynicism and greed behind that. It's not caused by programmers not knowing what they are doing, but by programmers doing what they are told.

The users themselves drive this. You can write great software for them, and they are uninterested. Users demand something other than great software. Every user who wants great software is the victim of fifty others who don't.

Mass market software is pop culture now.


Windows 3.1 and Word easily fit on a 40MB hard drive with plenty leftover. Word operated on systems with as little as 2MB RAM on a single core 16MHz 80386. Modern microcontrollers put this to shame! Word then didn’t lack for much compared to today’s version.

Windows and Office now require 50-100GB of disk just to function? 1000X for what gain?

This is sheer insanity, but we overlook it — our modern systems have 5000-10000X as much disk, RAM, and CPU. Our home internet is literally a million times faster than early modems.


You mean the Word where I had to save the document every 20 seconds just to be sure I didn't lose my school assignment to crashes at the most inconvenient moment.

It left scars on my spine, to the point that I still have to actively hold myself back from the reflex of hitting Ctrl+S halfway through this comment.


Yes, that Word. Imagine if we'd been focused on improving reliability this whole time...


Is that all you got?


> 1000X for what gain?

A stepping stone to the 1000000X gain that would make it possible for Windows and Office to emulate the mind of a Microsoft sales executive so it can value extract from the user optimally and autonomously. Also with memory safe code because Microsoft fired all the testers to save money. And so they can hire people who can't write in C or C++ to save money. And ship faster to save money.


I could not agree with you more. I'm trying to formulate data and arguments and I think I actually have the data to prove these points.

I'm working on a series of articles at enlightenedprogramming.com to prove it out.

I'm also working on "show you how" solutions because, absent data (and sometimes even with it), I still get people who believe that the industry has never been better, that we're in some golden age and there is not a whole lot to improve on, and it just boggles my mind.


It's hard to get past the hubris wrapped up in the statement "I'll show you how"... as if the tens of thousands of bright engineers whose shoulders you stand on were incapable and you're the savior... maybe you are! (But just adding the word "try" in that sentence would reduce the perceived arrogance by orders of magnitude.)


I didn't read it that way. I read it as an introduction to a counter-cultural paradigm shift away from the "move fast and break things" and "abstract over it" mindsets. I'm very interested in the path they are going to outline.


Well, that certainly wasn't my intention, but in an environment of "I have a silver bullet and it'll cost you $X", I can understand the sentiment.

At the same time, I do want to show that I have confidence in my ideas. Hubris and confidence must be applied in equal parts.


Hubris and confidence are two sides of the same coin. Did you maybe mean to say that one should balance hubris with humility to avoid coming across as arrogant?


Yep, that's what I meant. Lots of comments, not a lot of time for proofreading!


Well, I don't think many people would disagree about the crisis.

Personally, I'm anxious to see your proposal on "how".


You claim to be the author of Modal [0]. While Modal might not be your "solution", it likely aligns with what you have in mind philosophically. Modal seems like some sorta pattern-matching lisp. I don't think I'm alone in being skeptical that a "pattern matching lisp" is going to be the "solution" to the software complexity problem.

[0] https://wiki.xxiivv.com/site/modal


Assuming you're posting here at least in part to get feedback, I'll share mine. This piece would have been a lot more readable with somewhat better organization. Dominated by one- and two-sentence paragraphs, it reads more like a soup of claims and quotations than a coherent article or essay. A clear structure makes it faster and easier for readers to identify which parts are your main points and which are written to support them.


There is no software crisis. I have worked on software for 30+ years and it is as easy/hard to write and maintain software today as it was 30+ years ago.

And we are creating software today that would absolutely blow the minds of software developers working 30+ years ago. Mostly because we have amazing abstractions that makes it possible to stand on the shoulders of giants.

All that is happening is that as we get better at writing software, we and our users demand more of it. So we will always be at the bleeding edge of not being able to deliver. But we still do. Every day.


I think this might be premature if you're trying to apply this to how the world makes money with software. There's too much of the economy that depends on adding another abstraction layer to make money, so you'd need to rewire that as well unless we just want to break things. You'd eventually have to tell a business "no" because they're +1 abstraction layer above the limit.

I think most of us realize that working at corporate is still a necessary evil as long as we need to pay for stuff. Frankly that sector can burn because they've been taking advantage of developers. Most of the money goes to people above you, performance bonuses are not the norm, etc. We shouldn't be actively trying to give them improvements for free because of this behavior. Let them follow the hobby/passion projects and learn these practices and limits on their own.


I think society has made it convenient for people to lend their creativity to someone else's vision. I don't think it's a good idea for talented people to work for corporations that have no connection to the public good.

I don't think it's a necessary evil. I think most people either don't want to work to realize their own vision of what they want in the world, or want more than they need and are willing to sacrifice their soul for it.


Hi wryl. I'm interested in hearing your follow-up to this post, so I tried to add your log to my feed reader, but I couldn't find an rss/atom feed, which is the only way I'll remember to check back in. Do you have a feed I can follow?


Hi! Yes! I'm intending on adding an RSS feed at https://wryl.tech/log/feed.rss within the next few days.

I have a habit of hand-writing the markup for the site, so it's missing a few features people expect from things like statically generated sites or hosted blog platforms.

I'm watching this thread for accessibility and feedback, and am taking the suggestions to heart!


I've been a fan of handmade, and it seems like a new blog in the same spirit has been born. I look forward to reading more of your writings.

I have recently started writing myself and paying more attention to what I read. I really liked your style and enjoyed reading it.

So please keep writing!


Thank you very much! Very much in line with the handmade ethos, and I'm hoping to get involved with more handmade (and handmade-like) things in the future. :)


My problem with your article is that it seems to operate on the misconception that someone must completely understand from top to bottom the entire stack of abstractions they operate atop in all of its gory and granular detail in order to get anything done, and so a larger tower of abstractions means a higher mountain to climb up in order to do something.

But that's simply the opposite of the case: the purpose of abstractions — even bad abstractions — is detail elision. They make it so that you don't have to worry about the unnecessary details and accidental complexity of a lower layer that don't bear on the task you have at hand, and can instead just focus on the details and the ideas that are relevant to the task you are trying to achieve. Operating on top of all of these abstractions actually makes programming significantly easier. It makes getting things done faster and more efficient. It makes people more productive.

If someone wants to make a simple graphical game, for instance, instead of having to memorize the memory map and instruction set of a computer, and how to poke at memory to operate a floppy drive with no file system to load things, they can use the tower of abstractions that have been created on top of hardware (OS+FS+Python+pygame) to much more easily create a graphical game without having to worry about all that.
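
For a sense of scale, here is a minimal sketch of that kind of program, assuming Python 3 with the pygame package installed (the window size, colors, and the drifting square are arbitrary choices for illustration, not anything the article prescribes):

  # A rough sketch: a window with a square drifting across it.
  # Everything here rides on the OS, the file system, the Python
  # runtime, and SDL underneath pygame, none of which we touch.
  import pygame

  pygame.init()
  screen = pygame.display.set_mode((640, 480))
  clock = pygame.time.Clock()
  x = 0

  running = True
  while running:
      for event in pygame.event.get():
          if event.type == pygame.QUIT:
              running = False
      screen.fill((0, 0, 0))                         # clear to black
      pygame.draw.rect(screen, (200, 60, 60), (x % 640, 220, 40, 40))
      pygame.display.flip()                          # present the frame
      x += 2
      clock.tick(60)                                 # cap at ~60 FPS

  pygame.quit()

Compare that with hand-managing video memory and interrupt tables: the stack underneath is doing an enormous amount of elision on our behalf.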

Yes, the machine and systems underneath the abstractions are far more complex, and so if you set out to try to completely fit all of them in your brain, it would be much more difficult than fitting the entirety of the Commodore 64 in your brain, but that greater complexity exists precisely to free the higher layers of abstraction from concerns about things like memory management and clock speeds and so on.

So it's all very well and good to want to understand completely the entire tower of abstractions that you operate atop, and that can make you a better programmer. But it's very silly to pretend that everyone must do this in order to be remotely productive, and that therefore this complexity is inherently a hindrance to productivity. It isn't.

Have we chosen some bad abstractions in the past and been forced to create more abstractions to paper over those bad abstractions and make them more usable, because the original ones didn't elide the right details? Yes, absolutely we have. I think a world in which we abandoned C Machines and the paradigm where everything is opaque black box binaries that we control from a higher level shell but have no insight into, and instead iterated on what the Lisp Machines at the MIT AI Lab or D-Machines at Xerox PARC were doing, would be far better, and would allow us to achieve similar levels of ease and productivity with fewer levels of abstraction. But you're still misunderstanding how abstractions work IMO.

Also, while I really enjoy the handmade movement, I have a real bone to pick with permacomputing and other similar movements. Thanks to influence from the UNIX philosophy, they seem to forget that the purpose of computers should always be to serve humans, and not just serve them in the sense of "respecting users" and being controllable by technical people, but in the sense of providing rich feature sets for interrelated tasks, and robust handling of errors and edge cases, with an easy to access and understand interface. Instead they worship at the feet of simplicity and smallness for their own sake, as if what's most important isn't serving human users, but exercising an obsessive-compulsive drive toward software anorexia. What people want when they use computers is a piece of software that will fulfill their needs, enable them to frictionlessly perform a general task like image editing or desktop publishing or something. That's what really brings joy to people and makes computers useful. I feel that those involved in permacomputing would prefer a world in which, instead of GIMP, we had a separate program for each GIMP tool (duplicating, of course, the selection tool in each as a result), and when you split software up into component pieces like that, you always, always, always necessarily introduce friction and bugs and fiddliness at the seams.

Maybe that offers more flexibility, but I don't really think it does. Our options are not "a thousand tiny inflexible black boxes we control from a higher layer" or "a single gigantic inflexible black box we control from a higher layer." And the Unix mindset fails to understand that if you make the individual components of a system simple and small, that just pushes all of the complexity into a meta layer, where things will never quite handle all the features right and never quite work reliably and never quite be able to achieve what it could have if you made the pieces you composed things out of more complex, because a meta-layer (like the shell) always operates at a disadvantage: the amount of information that the small tools it is assembled out of can communicate with it and each other is limited by their being closed-off, separate things as well as by their very simplicity, and the adaptability and flexibility those small tools can present to the meta layer is also hamstrung by this drive toward simplicity, not to mention the inherent friction at the seams between everything. Yes, we need tools that are controllable programmatically and can communicate deeply with each other, to make them composable, but they don't need to be "simple."


> Operating on top of all of these abstractions actually makes programming significantly easier. It makes getting things done faster and more efficient. It makes people more productive.

Only if those abstractions are any good. In actual practice, many are fairly bad, and some are so terrible they not only fail to fulfill their stated goals, they are downright counterproductive.

And most importantly, this only works up to a point.

> Yes, the machine and systems underneath the abstractions are far more complex, and so if you set out to try to completely fit all of them in your brain, it would be much more difficult than fitting the entirety of the Commodore 64 in your brain, but that greater complexity exists precisely to free the higher layers of abstraction from concerns about things like memory management and clock speeds and so on.

A couple of things here. First, the internals of a CPU (to name but this one) have become much more complex than before, but we are extremely well shielded from them through its ISA (instruction set architecture). Some micro-architectural details leak through (most notably the cache hierarchy), but overall, the complexity exposed to programmers is orders of magnitude lower than the actual complexity of the hardware. It's still much more complex than the programming manual of a Commodore 64, but it is not as unmanageable as one might think.

Second, the reason for that extra complexity is not to free our minds from mundane concerns, but to make software run faster. A good example is SIMD: one does not simply auto-vectorise, so to take advantage of those and make your software faster, there's no escaping assembly (or compiler intrinsics).

Third, a lot of the actual hardware complexity we do have to suffer, is magnified by non-technical concerns such as the lack of a fucking manual. Instead we have drivers for the most popular OSes. Those drivers are a band-aid over the absence of standard hardware interfaces and proper manuals. Personally, I'm very tempted to regulate this as follows: hardware companies are forbidden to ship software. That way they'd be forced to make it usable, and properly document it. (That's the intent anyway, I'm not clear on the actual effects.)

> I think a world in which we abandoned C Machines and the paradigm where everything is opaque black box binaries that we control from a higher level shell but have no insight into, and instead iterated on what the Lisp Machines at the MIT AI Lab or D-Machines at Xerox PARC were doing, would be far better, and would allow us to achieve similar levels of ease and productivity with fewer levels of abstraction.

I used to believe in this idea that current machines are optimised for C compilers and such. Initially we could say they were. RISC came about explicitly with the idea of running compiled programs faster, though possibly at the expense of hand-written assembly (because one would need more instructions).

Over time it has become more complicated though. The prime example would again be SIMD. Clearly that's not optimised for C. And cryptographic instructions, and carry-less multiplications (to work in binary fields)… all kinds of specific instructions that make stuff much faster, if you're willing to not use C for a moment.

Then there are fundamental limitations such as the speed of light. That's the real reason behind cache hierarchies that favour arrays over trees and pointer chasing, not the desire to optimise specifically for low(ish)-level procedural languages.

Also, you will note that we have developed techniques to implement things like Lisp and Haskell on stock hardware fairly efficiently. It is not clear how much faster a reduction machine would be on specialised hardware, compared to a regular CPU. I suspect not close enough to the speed of a C-like language on current hardware to justify producing it.


I just want to say this was a wonderful log. There are not many in this field who have an intuition or impulse towards materialist/historical analysis, but whether you are aware of it or not, that is certainly what I read here. Just to say, it's not quite recognizing how we find ourselves thinking that is enlightening, but rather recognizing why we find ourselves thinking like we do that can bring clarity and purpose.

In a little more woo woo: abstractions are never pure, they will always carry a trace of the material conditions that produce them. To me, this is the story of computing.


Thank you for the kind words! "I appreciate them" would be an understatement.

And I agree entirely. Tracking the history of an abstraction will usually tell you its root, though it gets pretty muddy if that root is deep-seated!


Abstraction is the crabification of software: it gets so powerful and so good, it kills the carapace wearer, allowing fresh new script-kid generations to re-experience the business story and reinvent the wheel.




