
Software Disenchantment (2018) - ibdknox
https://tonsky.me/blog/disenchantment/
======
rogerdb
> Would you buy a car if it eats 100 liters per 100 kilometers? How about 1000
> liters?

I think the analogy here is backwards. The better question is "how much would
you prioritize a car that used only 0.05 liters per 100km over one that used
0.5? What about one that used only 0.005L?". I'd say that at that point, other
factors like comfort, performance, base price, etc. become (relatively) much
more important.

If basic computer operations like loading a webpage took minutes rather than
seconds, I think there would be more general interest in improving
performance. For now though, most users are happy-enough with the performance
of most software, and other factors like aesthetics, ease-of-use, etc. are the
main differentiators (admittedly feature bloat, ads, tracking, etc. are also a
problem, but I think they're mostly orthogonal to under-the-hood performance).

These days, I think most users will lose more time and be more frustrated by
poor UI design, accidental inputs, etc. than any performance characteristics
of the software they use. Hence the complexity/performance overhead of using
technologies that allow software to be easily iterated and expanded are
justified, to my mind (though we should be mindful of technology that claims
to improve our agility but really only adds complexity).

~~~
ljm
I have to be careful about what I describe, but I don't think people care
about speed or performance at all when it comes to tech, and it makes me sad.
In fact, there are so many occasions where the optimisation is so good that
the end user doesn't believe that anything happened. So you have to
deliberately introduce delay because a computer has to feel like it thinks the
same way you do.

At my current place of employment we have plenty of average requests hitting
5-10 seconds and longer, with N+1 queries made against the network rather
than the DB. As long as it's within 15 or 30 seconds, nobody cares; they
probably blame their 4G signal (especially in the UK, where our mobile
infrastructure is notoriously spotty, and entirely absent even in the middle
of London). But since I work on those systems, I'm upset and disappointed to
be working on APIs that can take tens of seconds to respond.
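
The N+1-against-the-network shape is worth making concrete. A minimal sketch
(the service, the user-loading functions, and the ~100 ms figure are all
illustrative, not from any real system):

```typescript
// A counter stands in for real network round-trips; each call to a
// "remote" function below represents one request/response cycle.
let roundTrips = 0;

interface User { id: number; name: string; }

// One user per request: the N+1 shape.
function fetchUserById(id: number): User {
  roundTrips++;
  return { id, name: `user-${id}` };
}

// Many users per request: the batched shape.
function fetchUsersByIds(ids: number[]): User[] {
  roundTrips++; // still just one round-trip, however many ids we pass
  return ids.map((id) => ({ id, name: `user-${id}` }));
}

const ids = [1, 2, 3, 4, 5];

roundTrips = 0;
ids.map((id) => fetchUserById(id)); // N separate calls
const nPlusOneCost = roundTrips;

roundTrips = 0;
fetchUsersByIds(ids); // one bulk call
const batchedCost = roundTrips;

// At ~100 ms per round-trip, that's ~500 ms vs ~100 ms for the same data.
```

The fix is the same as for database N+1: expose a bulk endpoint and collapse
the loop into one request.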

The analogy is also not great because MPG is an established metric for fuel
efficiency in cars. The higher the MPG the better.

~~~
TeMPOraL
> _In fact, there are so many occasions where the optimisation is so good that
> the end user doesn't believe that anything happened. So you have to
> deliberately introduce delay because a computer has to feel like it thinks
> the same way you do._

I never liked this view. I can't think of a single legitimate use case that
couldn't be solved better than by hiding your true capabilities, and thus
wasting people's time.

> _they probably blame their 4G signal for it_

Sad thing is, enough companies thinking like this and the incentive to improve
on 4G itself evaporates, because "almost nothing can work fast enough to make
use of these optimizations anyway".

~~~
danenania
"I can't think of a single legitimate use case that couldn't be solved better
than by hiding your true capabilities, and thus wasting people's time."

Consider a loading spinner with a line of copy that explains what's happening.
Say it's for an action that can take anywhere from 20 milliseconds to several
seconds, based on a combination of factors that are hard to predict
beforehand. At the low end, showing the spinner will result in it flashing on
the screen jarringly for just a frame. To the user it will appear as some kind
of visual glitch since they won't have time to even make out what it is, much
less read the copy.

In situations like this, it's often a good idea to introduce an artificial
delay up to a floor that gives the user time to register what's happening and
read the copy.

~~~
icebraining
Wouldn't it be better to delay the appearance of the spinner, so it doesn't
show at all for those fast operations?

~~~
danenania
You can still end up with the jarring flash. Say you delay 100ms--if the
action takes 120ms, you have the same problem.
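
One way to get both behaviours is to combine the delayed appearance with a
minimum display time once the spinner does show. A sketch of that timing
logic (the 100ms/500ms thresholds are arbitrary illustrations):

```typescript
// Decide when a loading spinner should appear and disappear, given:
//  - duration: how long the operation actually took (ms)
//  - delay: how long to wait before showing the spinner at all (ms)
//  - minVisible: once shown, the minimum time to keep it on screen (ms)
// Returns null if the spinner never appears, otherwise the visible window
// as [showAt, hideAt] offsets from the start of the operation.
function spinnerWindow(
  duration: number,
  delay: number,
  minVisible: number
): [number, number] | null {
  if (duration <= delay) return null; // fast path: never show the spinner
  const showAt = delay;
  // Hold the spinner until both the work is done AND it has been visible
  // long enough to be read, avoiding the one-frame flash.
  const hideAt = Math.max(duration, showAt + minVisible);
  return [showAt, hideAt];
}

spinnerWindow(80, 100, 500);   // → null: fast op, spinner never shown
spinnerWindow(120, 100, 500);  // → [100, 600]: shown at 100 ms, held to 600 ms
spinnerWindow(3000, 100, 500); // → [100, 3000]: long op, no artificial delay
```

The flash case disappears: an operation can no longer show the spinner for a
single frame, because once shown it is held for at least `minVisible`.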

~~~
TeMPOraL
Flash is good. If the state transition is "no indicator -> spinner ->
checkmark", then the user noticing the spinner flash for one frame only
assures them the task was actually performed.

It's a real case, actually. I don't remember the name, but I've encountered
this situation in the past, and that brief flash of an "in progress" marker
was what I used to determine whether my clicking a "retry" button actually
did something, or whether the input was just ignored. It's one of those
unexpected benefits of predictable UI code: the fewer special cases there
are, the better.

------
iamwil
I agree it's all slower and sucks. But I don't think it's solely a technical
problem.

1/ What didn't seem to get mentioned was speed to market. It's far worse to
build the right thing that no one wants than to build the crappy thing that
some people want a lot. As a result, it makes sense for people to leverage
Electron, but it has consequences for users down the line.

2/ Because we deal in orders of magnitude with software, it's not actually
good ROI to chase improvements too small to register on a human scale. So
what made sense to optimize when computers ran at 300MHz doesn't make sense
at all when they run at 1GHz, given limited time and budget.

3/ Anecdotally (and others can nix or verify), what I hear from ex-Googlers is
that no one gets credit for maintaining the existing software or trying to
make it faster. The only way you get promoted is if you created a new project.
So that's what people end up doing, and you get 4 or 5 versions of the same
project that do the same thing, all not very well.

I agree that the suckage is a problem. But I think it's the structure of
incentives in the environment in which software is written that also needs to
be addressed, not just the technical deficiencies of how we practice writing
software, like how to maintain state.

It's interesting Chris Granger submitted this. I can see that the gears have
been turning for him on this topic again.

~~~
ibdknox
I might strengthen your argument even more and say it's largely a non-
technical problem. We have had the tools necessary to build good software for
a long time. As others have pointed out, I think a lot of this comes down to
incentives and the fact that no one has demonstrated the tradeoff in a
compelling way so far.

I find it really interesting that no one in the future of programming/coding
community has been able to really articulate or demonstrate what an "ideal"
version of software engineering would be like. What would the perfect project
look like both socially and technically? What would I gain and what would I
give up to have that? Can you demonstrate it beyond the handpicked examples
you'll start with? We definitely didn't get there.

It's much harder to create a clear narrative around the social aspects of
engineering, but it's not impossible - we weren't talking about agile 20 years
ago. The question is can we come up with a _complete_ system that resonates
enough with people to actually push behavior change through? Solving that is
very different than building the next great language or framework. It requires
starting a movement and capturing a belief that the community has in some
actionable form.

I've been thinking a lot about all of this since we closed down Eve. I've also
been working on a few things. :)

~~~
Tossrock
I'll take this opportunity to appreciate C# in VS as a counterexample to the
article. Fast as hell (sub-second compile times for a moderately large project
on my 2011 vintage 2500k), extremely stable, productive, and aesthetically
pleasing. So, thanks.

~~~
jiggawatts
It's very hard for me to get away from C# because it's just so crazy
productive. The tooling is fantastic and the runtime performance is more than
good enough.

One thing I found, surprisingly, was that the C# code I write outperforms the
C++ code I used to write, at equal development time.

I was good at C++, but the language has so many footguns and in general is so
slow to develop in that I would stick to "simple" and straightforward
solutions. I avoided multi-threading like the plague because it was just so
hard to get right.

Meanwhile in C# it's just _so easy_ to sprinkle a little bit of multithreading
into almost any application (even command-line tools) that I do it "just
because". Even if the single-threaded performance is not-so-great, the end
result is often much better.

Similarly, it's easy to apply complex algorithms or switch between a few
variants until something works well. In C++ or even Rust, the strict
ownership semantics make some algorithm changes require wholesale changes to
the rest of the program, making this kind of experimentation a no-go.

The thing that blows my mind is the "modern" approach to programming that
seems to be mostly young people pretending that Java or C# just don't exist.

Have you seen what JavaScript and Python people call "easy"? I saw a page
describing a REST API based on JSON where they basically had thousands of
functions with no documentation, no schema, and no typed return values. It
was all "Just look at what the website JS does and reverse engineer it! It's
_so easy!_"

I was flabbergasted. In Visual Studio I can literally just paste a WSDL URL
into a form and it'll auto-generate a 100K-line client with async methods and
strongly-typed parameters and return values in like... a second. Ditto for
Linq-2-SQL or similar frameworks.

------
surround
From a Reddit comment:

> While I do share the general sentiment, I do feel the need to point out that
> this exact page, a blog entry consisting mostly of just text, is also half
> the size of Windows 95 on my computer and includes 6MB of javascript, which
> is more code than there was in Linux 1.0. Linux at that point already
> contained drivers for various network interface controllers, hard drives,
> tape drives, disk drives, audio devices, user input devices and serial
> devices, 5 or 6 different filesystems, implementations of TCP, UDP, ICMP,
> IP, ARP, Ethernet and Unix Domain Sockets, a full software implementation of
> IEEE 754, a MIDI sequencer/synthesizer, and lots of other things.

> If you want to call people out, start with yourself. The web does not have
> to be like this, and in fact it is possible in 2018 to even have a website
> that does not include Google Analytics.

[https://www.reddit.com/r/programming/comments/9go8ul/comment...](https://www.reddit.com/r/programming/comments/9go8ul/comment/e6689uu)

~~~
panic
Since this Reddit comment was made, the Twitter iframe responsible for the
megabytes of JavaScript has been replaced by a <video> tag. The only
JavaScript left on the page is Google Analytics, which is way less than 6MB.

~~~
surround
I feel bad now that my comment received so much attention. I didn’t realize
that the Reddit comment was made a year ago, and I should have tested the
webpage size myself. The author’s argument is still important, after all.

------
FartyMcFarter
Long ago I watched a documentary about the early Apple days, when management
was encouraging their developers to reduce boot times by 10 seconds. The
argument was that 10 seconds, multiplied by the number of boot sequences,
would add up to many human lives' worth of time saved.

Edit: found a link with the same story:
[https://www.folklore.org/StoryView.py?story=Saving_Lives.txt](https://www.folklore.org/StoryView.py?story=Saving_Lives.txt)

The software world needs more of this kind of thinking. Not more arguments
like "programmer's time is worth less than CPU time", which often fail to
account for all externalities.
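
With round, purely illustrative numbers (not the exact figures from the
folklore.org story), the arithmetic is easy to check:

```typescript
// All numbers are illustrative assumptions, not figures from the story.
const users = 5_000_000;     // assumed installed base
const secondsSaved = 10;     // boot-time reduction per boot
const bootsPerDay = 1;       // assumed one boot per user per day
const lifetimeSeconds = 70 * 365 * 24 * 60 * 60; // a ~70-year lifetime

const savedPerYear = users * secondsSaved * bootsPerDay * 365; // 1.825e10 s
const lifetimesPerYear = savedPerYear / lifetimeSeconds;       // ≈ 8.3 lifetimes
```

Even with conservative inputs, a 10-second saving compounds into several
human lifetimes of saved time every year.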

~~~
jchoca
I wish more companies thought like this in general. I often think about the
nature of the work I'm doing as a developer and wonder if it's making society
better off as a whole. The answer is usually a resounding no.

~~~
qlk1123
Same here, but why exactly?

In my country, software engineering is one of the best careers in terms of
income, and I bet it is similar in most other countries. Why do we deserve so
much buzz/fame/respect/income if the work we are doing is NOT making society
better?

These thoughts just haunt me from time to time.

~~~
bscphil
> Why do we deserve that much buzz/fame/respect/income if the work we are
> doing is NOT making the society better?

I understand that you're asking a theoretical question, not a practical one,
but in practical terms the answer is fairly simple. Our economy is not built
to (indeed, is built _not to_) reward individuals in line with what they
contribute to society. An entirely different set of incentives structures our
economy, and therefore the jobs and lives of most people.

In some sense, David Graeber's Bullshit Jobs is all about the widespread
awareness (and denial) of this phenomenon, and what caused it. I wouldn't say
it's a perfect book but it's the best one I've read on the subject.

------
d23
Performance is one thing, but I'm really just struck by how often I run into
things that are completely broken or barely working for extended periods of
time.

As I write this, I've been trying to get my Amazon seller account reactivated
for more than a year, because their reactivation process is just... broken.
Clicking any of the buttons, including the ones to contact customer support,
just takes you back to the same page. Attempts to even tell someone usually
put you in touch with a customer service agent halfway across the world who
has no clue what you're talking about and doesn't care; even if they did
care, they'd have no way to forward your message along to the team that might
be able to spend the 20 minutes it might take to fix the issue.

The "barely working" thing is even more common. I feel like we've gotten used
to everything just being so barely functional that it isn't even a
disadvantage for companies anymore. We usually don't have much of an
alternative place to take our business.

~~~
mherdeg
I'd love to see a software-industry-wide quality manifesto. The tenets could
include things like:

* Measure whether the service you provide is actually working the way your customers expect.

(Not just "did my server send back an http 200 response", not just "did my
load balancer send back an http 200", not just "did my UI record that it
handled some data", but actually measure: did this thing do what users expect?
How many times, when someone tried to get something done with your product,
did it work and they got it done?)

* Sanity-check your metrics.

(At a regular cadence, go listen for user feedback, watch them use your
product, listen to them, and see whether you are actually measuring the things
that are obviously causing pain for your users.)

* Start measuring whether the thing works before you launch the product.

(The first time you say "OK, this is silently failing for some people, and
it's going to take me a week to bolt on instrumentation to figure out how bad
it is", should be the last time.)

* Keep a ranked list of the things that are working the least well for customers the most often.

(Doesn't have to be perfect, but just the process of having product & business
& engineering people looking at the same ranked list of quality problems, and
helping them reason about how bad each one is for customers, goes a long way.)
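
The first tenet above, measuring whether users actually got the thing done
rather than whether servers returned 200s, can be sketched as a metric over
journey events (the event names and journey model here are hypothetical):

```typescript
// A user journey is a sequence of recorded events. HTTP 200s alone don't
// prove success; here we require the event that marks the user's goal.
interface JourneyEvent {
  journeyId: string;
  name: string; // e.g. "checkout_started", "order_confirmed"
}

// Fraction of journeys that reached the goal event, regardless of how many
// 200 responses were served along the way.
function journeySuccessRate(events: JourneyEvent[], goal: string): number {
  const journeys = new Set(events.map((e) => e.journeyId));
  const succeeded = new Set(
    events.filter((e) => e.name === goal).map((e) => e.journeyId)
  );
  return journeys.size === 0 ? 1 : succeeded.size / journeys.size;
}

const events: JourneyEvent[] = [
  { journeyId: "a", name: "checkout_started" },
  { journeyId: "a", name: "order_confirmed" },
  // Journey "b": every request returned 200, but the user never reached
  // confirmation. A silent failure.
  { journeyId: "b", name: "checkout_started" },
];
journeySuccessRate(events, "order_confirmed"); // → 0.5
```

Journey "b" would look healthy on an HTTP-status dashboard; this metric
surfaces it as a failure.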

~~~
six2seven
You might be interested in the Software Craftsmanship manifesto [0]. There
are many communities and initiatives around the world gathering folks with an
interest in producing high-quality software. Based on the few folks I have
worked with who are involved in SC, I can definitely recommend the movement,
and I'm also exploring options for joining some local meet-ups and/or events.

[0]
[http://manifesto.softwarecraftsmanship.org/](http://manifesto.softwarecraftsmanship.org/)

------
buboard
He has a nice follow-up which gets into the reasons why:

[https://tonsky.me/blog/good-times-weak-men/](https://tonsky.me/blog/good-
times-weak-men/)

Another take: rewrites and rehashes tend to be bad because they are not
exciting for programmers. Everything you're about to write is predictable,
nothing looks clearly better, and it just feels forced. First versions of
anything are exciting, the possibilities are endless, and even if the choices
along the path are suboptimal, people are willing to make it work right.

~~~
jiofih
He hints at Electron in the end, but I think the real blame lies with React,
which has become the standard in the past five years.

Nobody has _any fucking idea what’s going on in their react projects_. I work
with incredibly bright people and not a single one can explain accurately what
happens when you press a button. On the way to solving UI consistency it
actually made it impossible for anyone to reason about what’s happening on the
screen, and bugs like the ones shown simply pop up in random places, due to
the complete lack of visibility into the system. No, the debug tooling is not
enough. I’m really looking forward to whatever next thing becomes popular and
replaces this shit show.

~~~
jeffmcmahan
I completely agree, here. React has replaced the DOM, and it's pretty fast,
pretty efficient when you understand its limitations... but when you start
rendering to the canvas or creating SVG animation _from within_ react code,
everything is utterly destroyed. Performance is 1/1000 of what the platform
provides. I have completely stopped using frameworks in my day-to-day, and
moved my company to a simple pattern for updatable, optionally stateful DOM
elements. Definitely some headaches, some verbosity, and so forth. But zero
tool chain and much better performance, and the performance will improve,
month-by-month, forever.
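
For illustration, here is one possible shape such a pattern could take; this
is a guess at the idea, not the commenter's actual code, and every name in it
is made up:

```typescript
// A component is just an element plus a pure render function; setState
// re-renders that one element directly. No virtual DOM, no tool chain.
interface ElementLike {
  textContent: string; // the minimal slice of the DOM interface we need
}

function component<S extends object>(
  el: ElementLike,
  initial: S,
  render: (state: S) => string
) {
  let state = initial;
  const apply = () => { el.textContent = render(state); };
  apply(); // initial render
  return {
    setState(next: Partial<S>) {
      state = Object.assign({}, state, next);
      apply(); // re-render only this element; no diffing
    },
    get state() { return state; },
  };
}

// With a real DOM you'd pass document.querySelector("#counter")!; a plain
// object works here because only textContent is touched.
const fakeEl: ElementLike = { textContent: "" };
const counter = component(fakeEl, { n: 0 }, (s) => `Count: ${s.n}`);
counter.setState({ n: 3 }); // fakeEl.textContent is now "Count: 3"
```

The tradeoff is exactly the one described: some verbosity and manual wiring,
in exchange for zero dependencies and direct, predictable DOM updates.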

~~~
Aeolun
It seems to me that using your react components to render SVG animations, or
to canvas, is just inviting disaster.

~~~
jeffmcmahan
Well yeah. But I've seen it done; the attitude being "this is fine, React is
fast, it works on my Mac..."

------
cortesoft
He seems to make a contradictory point... he complains:

> iOS 11 dropped support for 32-bit apps. That means if the developer isn’t
> around at the time of the iOS 11 release or isn’t willing to go back and
> update a once-perfectly-fine app, chances are you won’t be seeing their app
> ever again.

but then he also says:

> To have a healthy ecosystem you need to go back and revisit. You need to
> occasionally throw stuff away and replace it with better stuff.

So which is it? If you want to replace stuff with something better, that means
the old stuff won't work anymore... or, it will work by placing a
translation/emulation layer around it, which he describes as:

> We put virtual machines inside Linux, and then we put Docker inside virtual
> machines, simply because nobody was able to clean up the mess that most
> programs, languages and their environment produce. We cover shit with
> blankets just not to deal with it.

Seems like he wants it both ways.

~~~
userbinator
It's possible to both improve efficiency _and_ maintain backwards
compatibility.

~~~
coldtea
Combining these two is only a non-issue with unlimited resources.

Otherwise it's a tradeoff if you add constraints like cost, effort, time to
market, and so on...

~~~
buboard
Windows does it. And despite that, versions like Windows 7 were pretty fast.

~~~
rogerdb
I'd argue that of any software project on the planet, Windows is the closest
to having unlimited resources; especially when you consider the number of
Windows customers for whom backwards compatibility is the #1 feature on the
box.

And speed isn't the only metric that matters; having both the 32-bit and
64-bit versions of DLLs uses a non-trivial (to some people) amount of disk
space, bandwidth, complexity, etc.

~~~
Wowfunhappy
Surely, Apple and Google have just about as many resources as Microsoft does.

If Android, macOS, etc. were super slimmed-down systems in comparison to
Windows, I would understand the argument much better. Instead, it feels like
we're in the worst of both worlds.

------
arh68
See also: in _Good times create weak men_ [0], the author explains his
interpretation as to why. I can't summarize it well. It's centered around a
Jonathan Blow talk [1] _Preventing the collapse of civilization_.

[0] [https://tonsky.me/blog/good-times-weak-men/](https://tonsky.me/blog/good-
times-weak-men/)

[1] [https://www.youtube.com/watch?v=pW-
SOdj4Kkk](https://www.youtube.com/watch?v=pW-SOdj4Kkk)

~~~
guitarbill
I watched that talk a while ago. It is great, and it did change my opinion on
a few things. Whether you agree with the premise or not, you can still learn
something. For me, it was the importance of sharing knowledge within a team
to prevent "knowledge rot". "Generations" in a team turn over much faster
than in the general population/civilisation, so that effect is magnified,
IMO.

------
city41
This article really resonates with me, but my biggest complaint is that
everything is _so_ buggy! I won't name any names, but I find many major
pieces of software from large, well-known companies are just riddled with
bugs. I also feel like you almost need to be a programmer to think of
workarounds: "hmm, ok, so clearly it's in a bad state. If I had coded this,
what boundaries would likely cause a complete refresh?" My wife is often
amazed that I can find workarounds to bugs that completely stop her in her
tracks.

Before we fix performance, bloat, etc, we really need to make software
reliable.

~~~
neilalexander
I'll gladly name names.

Apple have totally forgotten how to test and assure software against what
appear to be even stupid bugs. macOS Catalina has been fraught with issues
ranging from the minor to the ridiculous. Clearly nobody even bothered to test
whether the Touch Bar "Spaces" mode on the MacBook Pro 16" actually works
properly before shipping the thing. Software updates sometimes just stop
downloading midway through, Mail.app appears over the top of whatever I'm
doing seemingly at random, and Music.app frequently forgets that I'm
associated with an iTunes account.

Microsoft are really no better - Windows 10 continues to be slow on modest
hardware and obvious and ridiculous bugs continue to persist through feature
releases, e.g. the search bar often can't find things that are in the Start
menu!

My question is who is testing this stuff?

~~~
TeMPOraL
> _My question is who is testing this stuff?_

Telemetry.

Companies seem to be increasingly preferring to use invasive telemetry and
automated crash reports in lieu of actual in-house testing, and they use that
same telemetry to also prioritize work. I have a strong suspicion that this is
a significant contributing factor to the absurdities and general user-
hostility of modern products.

~~~
cytzol
I'm in complete agreement. Thanks to automated crash report uploading, the
software I use is more _stable_ than ever — it's a genuine surprise to me when
an application crashes, and I can't remember the last time I had to reboot
because my OS froze.

But this means that anything that's _not_ represented in telemetry gets
completely ignored. The numbers won't show you how many of your users are
pissed off. They won't alert you to the majority of bugs. They won't tell you
if you have a bloated web application that's stuffed full of advertising. They
won't tell you if your UI is incoherent.

I really do think that large companies are looking at the numbers instead of
actually using their software, and the numbers say that everything's fine.

~~~
TeMPOraL
Indeed. The way I've been summing it up recently: A/B testing is how Satan
interacts with this world.

It's understandable that people want to base their decisions on empirical
evidence. But it's important to remember that what you measure (vs. what you
don't) and how you measure it will determine the path you take as much as the
results of those measurements do.

------
jancsika
> How is that ok?

Probably because a browser like FF has the goal of loading and displaying
arbitrary dynamic content in realtime (like a reddit infinite scroll with
various 4K videos and ad bullshit), whereas a game has the goal of rendering
a known, tested number of pre-downloaded assets in realtime.

Also, on shitty pages the goal is different: load a bunch of arbitrary adware
programs and content that the user doesn't want, and only after that display
the thing they want to read.

Also, you can click a link somewhere in your scrolling that opens a new,
shitty page where you repeat the same crazy number of net connections, parsing
ad bullshit, and incidentally rendering the text that the user wants to read.

If you want to compare fairly, imagine a game character entering a cave and
immediately switching to a different character, like Spiderman, inheriting
all the physics and assets from that newly loaded game. At that point the
bulk of your gameplay is going to be loading new assets, and you're back to
the same responsiveness problems of the shitty web.

Edit: clarification

~~~
nearbuy
I'm both a web developer and a game developer, and this comparison doesn't
ring true at all. Games usually have tons of arbitrary dynamic content to
display in realtime. Minecraft will load about 9 million blocks around your
character plus handle mobs, pathfinding, lighting, etc. Reddit infinite scroll
loads a sequence of text, images, and videos. Multiplayer games have such
tight latency and bandwidth targets that game developers routinely do
optimizations web developers wouldn't even consider.

As a web developer, sending an 8 KB JSON response is no problem. That's nice
and light. In a networked action game, that's absurd. First, (hypothetical
network programmer talking here) we're going to use UDP and write our own
network layer on top of it to provide reliability and ordering for packets
when we need it. We're going to define a compact binary format. Your
character's position takes 96 bits in memory (float x, y, z); we'll start by
reducing that to 18 bits per component, and we'll drop the z if you haven't
jumped. Then we'll delta compress them vs the previous frame. Etc.
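
The 18-bit position trick can be sketched like this (the world-coordinate
range is an assumption; a real game would pick it per map):

```typescript
// Quantize one position component to 18 bits, as described above.
// Assumes positions fit a known range, here ±4096 world units (my choice).
const BITS = 18;
const RANGE = 4096;      // positions lie in [-RANGE, +RANGE)
const STEPS = 1 << BITS; // 262144 representable values

function quantize(x: number): number {
  const normalized = (x + RANGE) / (2 * RANGE); // map to [0, 1)
  return Math.min(STEPS - 1, Math.floor(normalized * STEPS));
}

function dequantize(q: number): number {
  return (q / STEPS) * 2 * RANGE - RANGE;
}

// Worst-case error is one step: 8192 / 262144 = 0.03125 world units,
// far below what a player can perceive, for 18 bits instead of 32.
const q = quantize(123.456);
const err = Math.abs(dequantize(q) - 123.456);
```

Delta compression against the previous frame then shrinks the common case
(small or zero movement) even further.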

Really, what's happening is things are getting optimized as much as they need
to be. If your game is running at 10 fps, it's going to get optimized. When
it's hitting 60+ fps on all target platforms, developers stop optimizing, even
if it could potentially be faster. Same for Reddit; it's fast enough for most
users.

~~~
theandrewbailey
> As a web developer, sending an 8 KB JSON response is no problem. That's nice
> and light. In a networked action game, that's absurd.

It depends on what that 8 KB is doing. If that 8 KB is a chat message, that's
way too big. On the other hand, I've never seen an 8 KB game patch.

~~~
nearbuy
This doesn't really relate to my point. The blog post asked why is it that we
can handle games (fancy 3D simulations, sometimes with hundreds of players and
millions of objects) at a smooth 60 fps but not scrolling a web page. The
parent comment suggested that it's easier to render games smoothly because you
know the content in advance. I'm suggesting that software gets optimized (by
necessity) until it works well enough. If some website had to display a
million elements, the devs would either optimize it until it could do so, or
the project would get scrapped.

When I talk about sending 8 KB in a "networked action game", I'm referring to
the update packets sent to and from clients in something like Fortnite or
Counter-Strike, not a game patch. I'm not trying to make a competition for who
uses the least bandwidth (which a 60 GB game would lose just on the initial
download). I'm trying to illustrate that games don't run faster than some
website because it's inherently easier to make games run fast, but rather that
developers, by necessity, optimize games until they run fast (or in this
example, until they reduce network lag enough).

I'm not sure why a chat app would tack on something like 7.5 KB of overhead on
a chat message, but I wouldn't be surprised if there's a chat app out there
that does so. Users won't notice the extra couple milliseconds (especially so
because they don't know exactly when the other person hit send). A 3 character
Discord message is close to 1 KB including the headers. The same message over
UDP in a game might be under 20 bytes, including the UDP header (games could
also use TCP for chat - text chat isn't going to strain anything). So I'd say
the overhead of a Discord message is still an order of magnitude or two bigger
than it could be. Which is perfectly fine; we can afford 1 KB of overhead on a
modern connection. It's optimized as much as it needs to be.

------
userbinator
A related article in a similar spirit from 4 years ago:
[https://news.ycombinator.com/item?id=8679471](https://news.ycombinator.com/item?id=8679471)

 _I can comfortably play games, watch 4K videos, but not scroll web pages?_

I think this is one of the more important points that the article tries to get
across, although it's implicit: while the _peak_ of what's possible with
computing has improved, the average hasn't --- and may have gotten worse. This
is the point that everyone pointing at language benchmarks,
compiler/optimisation, and hardware improvements fail to see. All the
"Java/.NET is not slow/bloated" articles exemplify this. They think that, just
because it's _possible_ for X to be faster, it always will be, when the
reality couldn't be further from that.

Speaking of bloat, it's funny to see the author using Google's apps and
Android as an example, when Google has recently outdone itself with a 400MB(!)
web page that purports to show off its "best designs of 2019":
[https://news.ycombinator.com/item?id=21916740](https://news.ycombinator.com/item?id=21916740)

~~~
bhauer
I agree that the peak is pulling away from the average, and most of us want
the average performance of applications to lift. We have to throw aside facile
"Good Enoughism" and genuinely respect the time of our users.

Where I differ a bit from your take: Languages and platforms that target high
performance are providing application developers an elevated performance
ceiling that allows them the luxury to use CPU capacity as they see fit.
Application developers using high-performance platforms may then elect to make
their application high-performance as well, yielding a truly high-performance
final product, or they may elect to be spendthrifts with CPU time, yielding
something middling on performance. And yes, a truly wasteful developer can
indeed make even a high-performance platform yield something low-performance.

What benchmarks and the resulting friendly competitiveness help us avoid is a
different and worse scenario. When we select a language or platform with a
very low performance ceiling, application developers continuously struggle for
performance wins. The high water mark for performance starts out low, as
illustrated by how much time is spent in order to accomplish trivial tasks
(e.g., displaying "hello world"). Then further CPU capacity is lost as we add
functionality, as more cycles are wasted with each additional call to the
framework's or platform's libraries. When we select a low-performance
platform, we have eliminated even the _possibility_ of yielding a high-
performance final product. And that, in my opinion, illustrates the underlying
problem: not considering performance at key junctures in your product's
definition, such as when selecting platform and framework, has an unshakeable
performance impact on your application, thereby pulling the average downward,
keeping those peaks as exceptions rather than the rule.

------
johnr2
One thing nobody seems to mention is the environmental cost of inefficient
software. All those wasted CPU cycles consume electricity. A single laptop or
phone on its own is insignificant, but there are billions of them. Combine
that with the energy wasted shovelling unnecessary crap around the internet,
and it adds up to a big CO2 problem that nobody talks about.

~~~
jupp0r
I hear that argument very frequently and I don’t buy it.

Think about all the gas that is saved because people don’t have to drive to
the library, all the plane trips saved by video conferencing, all the photo
film, all the sheets of paper in file cabinets, all the letters being sent as
emails, all the mail order catalogues, ... you get the idea.

Does anybody know of a comprehensive study on this?

~~~
UweSchmidt
If websites and business software were as lean as they could be, most
computers could have amazingly weak, low-powered processors.

I'm quite disenchanted with software myself. It takes way too long to open any
program, or for a JIRA ticket to display properly.

One thing that has improved is boot times; I seem to remember that Windows 7
was quite a bit faster than XP. Maybe someone in upper management wanted it to
be as fast as MacOS? So speed IS possible, if it is prioritized.

~~~
Wowfunhappy
> One thing that has improved was boot times, I seem to remember that Windows
> 7 was quite a bit faster than XP. Maybe someone in upper management wanted
> it to be as fast as MacOS? So speed IS possible, if it is prioritized.

I seem to remember boot times being a frequent topic of discussion in the early
2000s, because _people turned off their computers_.

In a way, this is a great little microcosm of the problem. Just fix habits
instead of fixing the software.

------
peter_d_sherman
Excerpt:

"An Android system with no apps takes up almost 6 GB. Just think for a second
about how obscenely HUGE that number is. What’s in there, HD movies? I guess
it’s basically code: kernel, drivers. Some string and resources too, sure, but
those can’t be big. So, how many drivers do you need for a phone?

Windows 95 was 30MB. Today we have web pages heavier than that!

Windows 10 is 4GB, which is 133 times as big. But is it 133 times as superior?
I mean, functionally they are basically the same. Yes, we have Cortana, but I
doubt it takes 3970 MB. But whatever Windows 10 is, is Android really 150% of
that?"

My favorite line: _" Windows 95 was 30MB. Today we have web pages heavier than
that!"_

If there's a new saying for 2020, it _shouldn't_ be that "hindsight is
2020"... <g>

Also... each web page should come with a non-closable pop-up box that says
"Would you like to download a free entire OS with your web page?", and offers
the following "choices":

"[Yes] [Yes] [Cancel (Yes, Do It Anyway!)]". <g>

~~~
pcurve
My favorite was "Google’s keyboard app routinely eats 150 MB. Is an app that
draws 30 keys on a screen really five times more complex than the whole
Windows 95? "

~~~
crazygringo
I mean... considering the fact that it contains the probabilities of typing
every single word in the English language versus every other, at every stage
of typing, including potentially the probabilities of all the ways you might
mistype each word while swiping without precise accuracy...

...maybe?

I don't know if that's 150 MB worth of data... but it's certainly a lot.
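
For a rough sense of scale, here's a back-of-envelope sizing sketch. Every number below (vocabulary size, bigram pruning, quantization) is invented for illustration, not Gboard's actual figures:

```python
# Hypothetical sizing of an on-device language model for a keyboard app.
# All constants are guesses, chosen only to show the arithmetic.
vocab = 150_000               # distinct words kept in the dictionary
avg_word_bytes = 8            # average UTF-8 length of a stored word
unigram_score_bytes = 2       # quantized log-probability per word
pruned_bigrams = 5_000_000    # word-pair entries kept (full vocab^2 is infeasible)
bigram_entry_bytes = 8        # two word ids plus a quantized score

total_mb = (vocab * (avg_word_bytes + unigram_score_bytes)
            + pruned_bigrams * bigram_entry_bytes) / 1e6
print(round(total_mb, 1))  # 41.5 (MB) under these assumptions
```

So even a fairly generous pruned bigram model lands around 40 MB under these assumptions: a lot of data, though still short of 150 MB.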

~~~
mikro2nd
So why is it persistently, perniciously and stubbornly insistent in refusing
to spell 'naughty' words like 'duck'?

~~~
noisem4ker
What's the problem again? I just typed "duck" by swiping. If you mean the "i"
variant, be informed that swear words are blocked unless a preference is set.

~~~
ShamelessC
They mean the "f" variant.

------
johnwatson11218
When I was in school and first learning about programming, I assumed that code
written in C or Java would eventually be ported to hand-tuned assembler once
enough people were using it. Then I got into the industry and realized that
we just keep adding layer after layer until we end up at the point this
article talks about.

I remember once reading that IBM was going to implement an XML parser in
assembler and people were like "Why? If speed is needed then you shouldn't use
XML anyway." I thought that concern was invalid because these days XML (or
JSON) is really non-negotiable in many scenarios.

One idea that I've been thinking about lately is some kind of neural network
enabled compiler and/or optimizer. I have heard that in the JavaScript world
they have something called "tree shaking", where the bundler analyzes the
module graph and removes code that is never actually referenced. I'm thinking:
why not train an LSTM to take in HTTP requests and generate the HTTP response?
Of course, sometimes the request would lead to some SQL, which you could then
execute and feed the results back into the LSTM until it outputs an HTTP
response. Then try using a
smaller network until something like your registration flow, or a simple
content management system was just a bunch of floating point numbers in some
matrices saved off to disk.

~~~
MaxBarraclough
> I'm thinking: why not train an LSTM to take in HTTP requests and generate
> the HTTP response?

Why? With responses generated according to what? Are you really just
suggesting using neural networks in the compiler's optimiser?

> Then try using a smaller network until something like your registration
> flow, or a simple content management system was just a bunch of floating
> point numbers in some matrices saved off to disk.

Why? What's the advantage over just building software?

~~~
johnwatson11218
I'm suggesting that you take an existing system and build up a corpus of
request/response pairs. Then you use the LSTM to build a prediction model so
that given a request it will tell you that the current production system will
produce the following sql statement and this http response. Once the LSTM's
output is indistinguishable from your current production system, for all use
cases, then you replace the production system with the LSTM and a thin layer
that can listen on the port, encode/decode the data, and issue sql queries.

Why would I want to do this? I'm not 100% sure ... I think it would be super
fast once you got it working. I think it would avoid many security bugs. You
wouldn't have to read "oh, Drupal 3.x has 20 new security bugs, better go
patch our code." I think when I had this idea I was thinking about it in terms
of a parallel system that could catch hacking by noting when actual HTTP
responses diverged too much from the predicted response. The main idea being
that for a given input the output really is 100% predictable, assuming your
app doesn't use random numbers like in a game or something.
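
A minimal sketch of that parallel "predict and compare" system, with a plain lookup table standing in for the LSTM (all names here are hypothetical; a real version would need a trained sequence model to generalize to unseen requests):

```python
from difflib import SequenceMatcher

class ShadowPredictor:
    """Learns request -> response pairs from recorded traffic, then scores
    how far a live response diverges from what the model predicts."""

    def __init__(self):
        self.memory = {}  # request -> response seen during training

    def train(self, corpus):
        # corpus: iterable of (request, response) pairs from production logs
        for request, response in corpus:
            self.memory[request] = response

    def predict(self, request):
        # An LSTM would generalize to unseen requests; a lookup table cannot.
        return self.memory.get(request)

    def divergence(self, request, actual_response):
        # 0.0 means identical to the prediction, 1.0 means totally different
        predicted = self.predict(request)
        if predicted is None:
            return 1.0  # never-seen request: maximally suspicious
        return 1.0 - SequenceMatcher(None, predicted, actual_response).ratio()

predictor = ShadowPredictor()
predictor.train([("GET /login", "200 login-form"),
                 ("GET /home", "200 home-page")])

assert predictor.divergence("GET /login", "200 login-form") == 0.0
# A tampered or hacked response drifts away from the prediction:
assert predictor.divergence("GET /login", "500 sql-error") > 0.3
```

An alerting threshold on the divergence score would then flag responses that drift too far from the model, which is the hacking-detection idea described above.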

To link this idea to the article, I think things like XML parsers could be
written this way .... I can't prove it but I suspect that they would be very
fast and not come with all the baggage that the article complains about.

I started thinking along these lines after reading stuff like this
[https://medium.com/@karpathy/software-2-0-a64152b37c35](https://medium.com/@karpathy/software-2-0-a64152b37c35)

~~~
capitalsigma
What if your app has literally any mutable state? Registering accounts,
posting comments, etc.

Also I'll bet you that your neural net is > 100x slower than straight line
code.

~~~
johnwatson11218
Mutable state in the sense of database writes would be part of the network's
output and just passed on to a regular db. Mutable state in the sense of
variables that the application code uses while processing a request? Well LSTM
networks can track state like that.

For session-based variables? Not sure; either it all becomes stateless and the
code has to read everything from storage for each request, or maybe the
lstm is able to model something like an entire user session and remember the
stuff that the original app would have put in the session.

That Andrej Karpathy article that I linked to two comments above ... he
pointed out, in a different blog post, that regular neural networks can
approximate any pure function. Recurrent neural networks like the LSTM can
approximate any computer program. It is their ability to propagate state from
step to step that allows them to do this.

As far as it being 100X slower, well at a certain point I will be willing to
take your money :)

------
anilakar
An unpopular but effective short-term solution: Developers ought to use five
year old hardware for development and testing tasks. FWIW, my work laptop and
mobile phone are both 2015 models and I feel they're completely adequate for
running all the software I've written.

~~~
mtm7
I remember a story about an electronic musician who preferred to mix his songs
with earbuds/headphones instead of with a high-end megabucks studio sound
system. His reasoning was basically, “that’s how my fans will listen to my
music, so I need to make sure it sounds good to them.”

I can’t remember who it was, but the idea always stuck with me.

Anyway, I agree that we should test our applications with the same hardware
(and internet speed) of our average user. Very few people use a computer as
good as a software engineer’s. :)

~~~
gonational
That’s funny; I’m always debating my brother, talking about how earbuds sound
much better than even high-end monitors, largely because of the surrounding
environment (or lack thereof, when earbuds are in).

When I make music I have to do the opposite; take my earbuds out and listen to
the music over my car stereo or via my MacBook Pro speakers, to make sure it
doesn’t sound good only through earbuds.

~~~
ratww
Hah, I have nice studio monitors since forever, but I also make music with
Earbuds.

I like them, but I've found that they make me a bit too conscious about the
sound, so I have to be extra careful. The isolation emphasizes noise and
frequencies that the monitors don't, and that would be masked in the full mix.
It used to lead me down a rabbit hole of noise-gating and EQs.

~~~
gonational
Interesting; I could totally see myself going down the same rabbit hole, or
the opposite, thinking that a certain faint sound will be great, but when I
play it in my car later it's completely unnoticeable.

~~~
ratww
> thinking that a certain faint sound will be great, but when I play it in my
> car later it's completely unnoticeable.

That's true! This happens all the time with super expensive recordings too,
and I love it. IMO discovering new sounds in the music you're used to only
adds to the experience.

------
lxe
I’m tinkering with the Atomic PI SBC — Quad Core 1.5ish GHz Intel Atom with 2
Gigabytes of RAM. You can load up an OS and launch Firefox with maybe 3 tabs
until you run out of memory and have to page out (which will basically hang
the device due to eMMC/flash I/O speeds).

Specs like these made for an incredible high-end desktop machine 15 years ago,
good for almost anything — gaming, browsing, etc...
What happened to software (Linux & Friends, Firefox, etc) that renders
hardware obsolete so quickly? Is it just purposeful optimization that uses
more RAM to benefit performance elsewhere, or is it truly this disenchantment?

------
megous
I was in Rome recently, and Google Maps was basically unusable on EDGE
(despite pre-downloading the area before the trip). We'd wait a minute (or
more) for a bus stop's timetable and a bus route to be shown on the map.

Try planning a route in an unfamiliar area with a UI this slow when you are
standing outside, there's no place to sit and rest, and you need to click
around on a bunch of stops just to see which buses go through a stop and where
they are heading.

So yes, optimization is still important.

We replaced the glorious and easily iterated and expanded google maps app with
a photograph of a public transport map, and we could get an answer of how to
get from any A to any B within seconds of looking at the map without typing or
searching or waiting for anything.

Which also shows that, sometimes, slow software is less than useless.

~~~
xchaotic
Why were you using EDGE if there’s dense 4G coverage in Rome ?

~~~
pantalaimon
Replace Rome with the German countryside and you'll still just have EDGE.

Sites like Hacker News or [https://i.reddit.com](https://i.reddit.com) are
still perfectly usable. In contrast to the 'modern' Reddit UI that takes
a couple of seconds to load even on my home WiFi.

~~~
itronitron
It would be nice to have a maps/routing service that could just produce a
basic HTML page with text directions from a given point A to another given
point B.

Bonus points if it also lists the before and after streets at direction
changes so people know if they missed a turn.

This could be implemented as a proxy to any of the maps apps (Google, Apple,
OpenStreetMap).

------
xorand
Programming is now a bureaucracy.

[1] "In any bureaucracy, the people devoted to the benefit of the bureaucracy
itself always get in control and those dedicated to the goals the bureaucracy
is supposed to accomplish have less and less influence, and sometimes are
eliminated entirely"

[1]
[https://en.wikipedia.org/wiki/Jerry_Pournelle#Pournelle%27s_...](https://en.wikipedia.org/wiki/Jerry_Pournelle#Pournelle%27s_iron_law_of_bureaucracy)

~~~
TeMPOraL
That's a good observation on many levels.

For instance, I started noticing that a lot of the code I've written or worked
with in many projects have a particular flavor to it. Pieces that take some
data, repackage it, and pass it on to different code that does essentially the
same - all arranged in a structure that's supposed to reflect some shared,
abstract understanding of the problem. I've started to call this type of code
"bureaucracy", and I see it as something to be kept in check.

------
megous
Even quite underpowered phones can boot in ~1-2 seconds if optimized for that.
Not everything in the phone will start up in that amount of time (modem,
wifi), but it's possible to boot to Linux userspace and display a fully
interactive UI in that time.

Even my e-book reader Linux port boots to UI in ~2s.

It really is just bloat and lack of care.

~~~
telegrammae
To be honest, I have never seen any phone or computer that could boot in about
2 seconds. 6-7 seconds is the absolute minimum I've ever seen, regardless of
hardware or OS.

~~~
LargoLasskhyfv
You know? Once upon a time i've done some freelance sysadmin for small
businesses. So i installed a new fileserver and a networking gateway with
least-cost router functionality for the phones, somewhere. When ready i asked
what that big tower did, that stood aside unplugged. 'Nothing, that is just
trash we haven't discarded yet.' Since the case had a nice design it seemed
wasteful to me, and i took it back home. At home i plugged it into some spare
screen, and since it had a networking card, into that also. When ready i
rocked the big red switch it had with a satisfying _clack_ into the upwards
position and wanted to do something else at my other systems while it booted
up (i thought). So i rolled, sitting on my chair, maybe 3 feet, and then heard
a loud 'Ta-Daa!'. I looked sideways and barely saw how Word 6 and Excel 5
popped up, out of the corner of my eyes. I was dumbstruck!

I shut it down and repeated that, looking at my wristwatch.

 _FOUR SECONDS!_

I couldn't believe that and tested maybe five times again. It never took more
than 4 seconds from rocking the switch to the WfW 3.11 desktop with Word 6 and
Excel 5 in autostart ready to use! It even got a DHCP address while being at
it.

The thing is, at that time i had some Sun SparcStations, some HP PA-Risc
Workstations, and assorted X86 PCs, one of them an AMD Athlon XP PR1800
overclocked to PR2100 with 1.5 GB virtual channel memory SD-RAM, running
NetBSD (because it just worked, don't ask). I felt very ahead of 'the curve'.

And then this trash came along and burst my bubble...

I sat there and only thought: _'what for?!'_

Now some old Windows running atop DOS isn't something to really envy, but in
this combination of components, BIOS, and drivers it got the job done without
any hassle, FAST!

That was eye opening for me.

~~~
LargoLasskhyfv
Edit: WRT BIOS. It had an MR BIOS from Microid Research which allowed you to
use it over the onboard serial ports; you could choose which, how, and so on.
Another piece of functionality which has since been gone, and came back later
only as an expensive add-on.

------
_trampeltier
It's not just the problem of large and slow today. There are also these dark
patterns everywhere, especially in Android. Even local things do not work if
there is a network connection but connections, e.g. to Google, are not
allowed. For example, the standard Android "Photos" app does not always show
you the newest pictures if the network blocks Google. Also, you can't share
something to another app if you have a network connection but Google is
blocked. If you switch off the network completely on the phone, everything
works again.

~~~
makapuf
This. I know that today we get so much software for free, but I see so many
good opportunities spoiled by the fact that someone from marketing told the
developer to hamper interoperability or usability to push some other product
or feature; when you find software without that, it's so refreshing.
Remembering my browser of choice. The ability to save some place in Google
Maps without letting it have all my location, always. Local-only functions.
"Accept: always / not now". "Click here for extra features"... we could save
so many bytes, CPU cycles, and UI missteps...

------
_bxg1
I'm so weary of this moral panic we've been having. There are so many other
factors to be weighed against efficiency when it comes to making software.
There are completely legitimate tradeoffs to be made that sacrifice
performance. There are also programmers who write bad code on all dimensions -
performance included - out of sheer laziness. But those aren't the primary
cause of this hardware "waste". Demanding efficiency for efficiency's sake,
ignoring all other constraints, is shortsighted and narrow-minded.

~~~
opportune
Yeah, I really dislike the black and white thinking. The python script example
the author gives is a perfect example of what doesn’t need to be made any
faster. If you are interested only in execution time, you might as well never
write anything in native python.

But on the other hand, a lot of web content does need to be faster. Gmail has
somehow gotten so much slower to load over time. And every time I visit a
newspaper/magazine website I am aghast at how bloated they are. Does that mean
nodejs is inherently bad, no, but it does mean people should try to optimize
noticeably terrible performance that actually degrades UX.

~~~
_bxg1
Sure. But even that probably has more to do with businesses prioritizing
features over quality, not programmers lacking character.

Much of it on the web also has to do with how _much_ browsers can do. The
number of CSS properties that can be applied, the ways different elements'
sizes can automatically influence the layout of other elements, etc. These
traits are what make the web such a powerful and attractive platform for user
interfaces, but the complexity of the platform is definitely becoming a real
issue that deserves attention.

A couple of points:

\- NodeJS is server-only and usually has nothing to do with perceived
performance of web apps

\- The biggest offender of web performance is ads. They dump piles and piles
of crappy JavaScript from dozens of different sources that all include their
own copies of common libraries and have no incentive not to slow down the
page.

\- Beyond ads, the bottleneck is usually not even JavaScript, but layout (as
in the paragraph above). Web layout is incredibly flexible and incredibly
complex. Computing and rendering it all is slow, but it does serve a purpose.
Not that it couldn't be improved.

Bad ads are a tragedy of the commons and I don't know what can be done about
them unless Google or Facebook decided to throw their weight around to force
them to be better.

I do wonder if a new web standard could be developed for using some
constrained subset of the layout vocabulary, that would be cheaper and more
straightforward to compute. The current version has to remain for backwards
compatibility reasons, but it's trying to serve a bunch of different types of
cases at once, and therefore doesn't do a great job at any one of them.

------
keyP
I think people who recognise this have generally been developing from the days
where computing resources were scarce (or on mediums now where they need to be
efficient). It was a necessity to implement efficient techniques instead of
a nice to have. Nowadays those restrictions have been lifted for the most
part.

In this day of "Agile" development, as long as something's working during UAT,
that's all that's needed for sales and consumers.

Webdev, IME, is an example where the ecosystem has facilitated bloated
websites. I've worked with developers who throw any library they can just for
basic things because they don't have a need to try to optimise. The meme of
using jQuery for everything when it came out has just been replaced by other
frameworks. I find it often depends on developers who really want to work on
something and take pride in it vs those who just need something on their CV or
got hired by following a few tutorials on the web but not understanding what
they wrote (which, to me, signifies a hiring problem in the company). During
code reviews, I encourage leads to keep calling out hacky code to the point where
the developer will just start writing it properly the first time round. As
developers, I feel we should be aware of not creating selfish software which
hogs memory from other software or requires huge data downloads for mobile
users (whenever doable). Possibly a naive ideal but if it's a byproduct of
developing fast software for my end users, I think that's a win-win.

~~~
amboo7
Time is as scarce as it ever was.

~~~
keyP
Indeed, and pragmatism should be applied, but I mean in the context of not
being rushed. I don't mind my team watching YouTube/browsing the web during
work if things are going well but I wouldn't accept it if it's done after
submission of suboptimal code. If there's time to watch YT, there's time to
improve your code (unless it's clearly too much of a refactor).

------
jmpeax
> Jonathan Blow has a language he alone develops for his game that can compile
> 500k lines per second on his laptop. That’s cold compile, no intermediate
> caching, no incremental builds. You don’t have to be a genius to write fast
> programs.

The guy is most definitely at least a genius.

~~~
ibdknox
Depends on your definition of genius, but I definitely agree that these folks
don't quite hold up the sentiment that "anyone can do it." I would put Martin
Thompson, Raph Levien, and Jonathan Blow at least in the top 0.1% of
programmers.

They are great examples for his overall point though. It probably would've
been better just to leave out the genius bit and talk about them as folks
proving it can be done.

~~~
roddux
You should add the late Terry Davis to your list, or if that area interests
you, read up on his work:
[https://en.wikipedia.org/wiki/TempleOS](https://en.wikipedia.org/wiki/TempleOS)

~~~
ibdknox
Yeah, there are many others I'd add to a more general list - Carmack, Bellard,
Wirth, the folks from Our Machinery, etc. I was just referencing the people
specifically mentioned in the original post.

------
S-E-P
The linked Medium article (when talking about npm) was what I expected, but
gives wonderful examples of just how bad it's gotten.

For those who don't want to dig, one example was Ember.js, which has a
dependency called "glimmer" which makes up ~95% of the code size. The author
looked into glimmer and found that it included the entirety of the
Encyclopaedia Britannica's "G" section, just to show a definition of "glimmer"
in its help menu.

And that wasn't even the most ridiculous example.

It's shameful that it's gotten this bad; but when you look at what is expected
of people in the current climate, it makes sense that this would happen.

    
    
       * Horrendously short deadlines for enterprise CRUD (and the "frameworks" that support it)
       * REUSE REUSE REUSE THIS REFUSE (few seem to know how to read source code before installing the dependency)
       * "Not paying me enough for that shit"
       * "We can't rewrite, we put 20 years into this codebase"
       * Even our languages are shit, JS (despite its usefulness) has undefined behavior as a feature.
       * [among many others I'm sure you could think of]
    

It's toxic, corps incentivize lazy quick work that won't hold up in the long
run, but they are too stupid to realize that. Though I blame even more the
sycophant who just silently nods and does the work without a sliver of
conscience telling them that "this is wrong".

Civility has a lesser place in efficiency than what we have now; you can't
make a decent product without bashing a few skulls (figuratively, of course).

Lastly, don't be afraid to reinvent the wheel if your wheel is better than
mine.

~~~
nneonneo
Not to burst your bubble but the linked Medium article you’re referring to is
satire. Glimmer doesn’t actually pull in Britannica - it never has.

~~~
grenoire
The sad state of affairs is that satire is dead and this guy fell for the
bait. It doesn't even sound _that_ ridiculous anymore.

------
hermitcrab
As a dinosaur who has been programming for >30 years, it shocks me how bloated
many modern programs are. The installers for my own products are around 20MB
and most of that is Qt libraries. But 100s of MB seems to be standard now. The
Airbnb app on my iPhone is 210 MB. I can understand that for a 3D game
shipping maps, textures, sound, etc., but not for a mobile phone app.

~~~
_trampeltier
The sound driver on my ZBook at work needs almost 500 MB of RAM. No idea what
is in there... maybe HP these days includes not just keyloggers in the
drivers, maybe they have screengrabbers too.

~~~
zamadatix
Probably 1 MB for the actual driver and 499 MB for the Electron GUI to
interface with it.

------
coldtea
> _And then there’s bloat. Web apps could open up to 10 times faster if you
> just simply blocked all ads. Google begs everyone to stop shooting
> themselves in the foot with the AMP initiative—a technology solution to a
> problem that doesn’t need any technology, just a little bit of common sense.
> If you remove bloat, the web becomes crazy fast. How smart do you have to be
> to understand that?_

If you "simply blocked all ads" the people making the pages wouldn't have the
income which they maintain the pages with.

How smart do you have to be to understand that?

> _We haven’t seen new OS kernels in what, 25 years? It’s just too complex to
> simply rewrite by now. Browsers are so full of edge cases and historical
> precedents by now that nobody dares to write layout engine from scratch._

Well, there's Fuchsia, speaking of new kernels. And Mozilla is doing exactly
that, writing a new layout engine from scratch (plus a language to write it
in).

(I agree with the general sentiment of the post, but the examples are often
shoddy)

~~~
lazyjones
> _If you "simply blocked all ads" the people making the pages wouldn't have
> the income which they maintain the pages with._

That's how we get deliberately slow pages: articles split over multiple page
loads and image slideshows (because time spent on your site is time not spent
on the competing websites).

On the other hand we have pages like HN, paid for(?) using other means and
built to be perfectly usable and fast. Or some CSEs where revenue comes from
affiliates and CPC fees instead of ads, so they try to keep things fast too.
Then we have news sites with paywalls and hopefully some day better engineered
UI to read those news.

People should just vote with their pockets for a better user experience, IMO.

~~~
jodrellblank
_People should just vote with their pockets for a better user experience,
IMO._

How? How can I vote with my pocket for old.reddit.com instead of new reddit
UI? How can I vote for another site which is different, when the main draw of
Reddit is the number of people on there?

How can I vote for an instant messenger which works the way I want, when the
people I talk to aren't using it because the people they talk to aren't using
it?

How can I pay for a faster Windows 10 where the start menu works every time I
press the start button and updates install in the time it takes to copy the
data to SSD and no longer? That option isn't on the market.

------
zzo38computer
Yes, it is true, many programmers don't do such a good job. I do try to make
better software, but will not always succeed. Also, computer hardware is
becoming too complicated too, actually, I think. I generally don't add so many
dependencies to a program, though. Some people (including myself) do still
write DOS programs sometimes. The web browser is too complicated. I use IRC
and NNTP; I think they are much better anyways than Slack and so on (and even
this HN, too, I think). I do program in C (I use other programming languages
as well, but mainly C). Many programs, I think, have too many animations. And
a lot of programs, they just write it stupid!! TeX is good and it still works
more than thirty years later, and is fast, too. But anyways I don't like
WYSIWYG, so that is why I don't use LibreOffice and Microsoft Word and so on.

------
trixie_
Energy (or more accurately power) is a scarce resource. If it's not being
spent to keep the organization or individual going, then it could be argued
that the energy is wasted. As it requires functioning organizations to acquire
more energy and so on.

All systems - biological, physical, meta-physical are built on layers that
once deep enough are pretty well cemented in. Hindsight is 20/20 and though we
know if the foundations were different things would be better - they won't
actually change until the energy gain exceeds the energy cost of uprooting
everything to make the change.

Just saying this problem isn't exclusive to software, but also laryngeal
nerves in giraffes, x86, and Esperanto.

~~~
TeMPOraL
On the other hand, life is extremely frugal with energy. The foundations of
biological systems are ridiculously efficient, and all the complex life forms
are pretty good too. Despite being designed by a biased random process, they
have at least this going for them: the fitness function favors energy-
efficient systems. Unlike our markets.

~~~
ncmncm
Plants convert 2% of incident light into usable chemical energy. Muscles turn
30% of consumed chemical energy into motion.

But they don't waste much material.

------
TheTank
In addition to the costs of developing good foundational code, I think revenue
is becoming a driver for slower-than-necessary software.

An increasing number of providers (for example databases) charge per server,
per core, or another hardware-usage metric. The more hardware, the more
revenue for them as they make about 90% margin for every new machine. There is
a high incentive to get users to need more machines on their "managed cloud".

Vendors could try to improve their software so it requires half the CPU. But
why bother, since this would halve their revenue? It makes
more sense to focus on horizontal scalability than on core efficiency so users
can keep on adding machines to their cluster over time.

If software is running at 1% of the maximum performance, as suggested in the
article, improving that by a single percentage point could reduce hardware
costs by 50%. But I think none of the
existing vendors will ever make the move as it conflicts against their own
interest.
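
The arithmetic behind that claim can be sketched quickly; the workload and capacity figures below are arbitrary stand-ins:

```python
# If software runs at 1% of attainable performance, one percentage point of
# improvement (1% -> 2%) doubles effective capacity and halves machine count.
workload = 100                # abstract units of work per second required
machine_capacity = 1_000      # units/s one machine could do at 100% efficiency

def machines_needed(efficiency):
    return workload / (machine_capacity * efficiency)

before = machines_needed(0.01)  # running at 1% of peak
after = machines_needed(0.02)   # improved to 2% of peak

print(before, after)  # 10.0 5.0: half the fleet, roughly half the hardware bill
```

Which is exactly why a vendor billing per machine or per core has little incentive to ship that improvement.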

------
vojta_letal
> Modern buildings use just enough material to fulfill their function and stay
> safe under the given conditions. ... Only in software, it’s fine if a
> program runs at 1% or even 0.01% of the possible performance.

> I can comfortably play games, watch 4K videos, but not scroll web pages? How
> is that ok?

IMO the comparison should be buildings <-> fine-tuned libraries (i.e. video
decoding algorithms), modern applications <-> cities.

Go to any city center in Europe. Urban planning a century ago was much more
elegant and elaborate, taking the city into consideration as a whole. Nowadays
developers and investors often ignore important aspects, such as surrounding
buildings and infrastructure, making cities inefficient for the people who
actually live there.

Any system with plenty of resources tends to become inefficient. It's just
that Moore's law allows for a staggering degree of inefficiency.

~~~
anthk
>Go to any city center in Europe. Urban planning a century ago was much more
elegant and elaborate taking into consideration the city as a whole.

Eh, no. Just a castle/wall, shops along the main streets, a market in the
middle, and chaotic expansion out from there.

~~~
vojta_letal
That's more than a century ago. I was thinking about stuff built like in
1850-1950.

------
arithma
Users don't know about this performance gap, but there is a gap, and that gap
is an economic opportunity.

Since current practices are so inefficient, users end up paying for it out of
pocket as hardware expenses: buy a $1000 smartphone to get the same experience
as last year's.

A different stack (no need to reinvent the universe) could be branded as
brutally efficient, run on slim hardware, and use strict engineering practices
to provide a much better experience at a fraction of the price (1/5 is
possible).

I believe nobody is doing that yet since there are two main barriers:

1- Risk of no-market (I think this might be proven unfounded, given the
current trend in price hikes)

2- Capital investment necessary to get started (But this also can be solved
given the obvious appetite for the next-big-thing money being poured left and
right, without anything catching on yet)

------
coldtea
> _You’ve probably heard this mantra: “Programmer time is more expensive than
> computer time.” What it means basically is that we’re wasting computers at
> an unprecedented scale. Would you buy a car if it eats 100 liters per 100
> kilometers? How about 1000 liters? With computers, we do that all the time._

The argument is incomplete.

The correct question (to maintain the analogy) is:

"Would you buy a car if it eats 1000 liters per 100 kilometers, but that
doesn't affect you at all (you still get to where you want to be fast enough),
and the time to manufacture and cost to buy it is much lower than would be
possible with more efficient car that used 10 liters per 100 km?"

The answer to which would be yes. Software that performs a task we run about
once a day in 1 second instead of 0.2 seconds doesn't cost us money (and even
the environmental impact is small).

~~~
juststeve
But in your example, you're only describing one specific task (n=1). The issue
is that time accumulates non-linearly as the number of tasks increases, i.e.
the petrol stations must be able to supply enough petrol to all the customers
who drive the most inefficient cars.

The _culture_ of "developer time is most important" makes overall system
performance someone else's problem, because "my program is fast enough when I
measure it off the wall clock". But who is responsible for fixing overall
system performance, and how can they fix it? I think a lot of people would
just upgrade their RAM, CPU, or I/O to solve the issue (build more petrol
stations), rather than ask vendors to change the programming language or be
more conservative with RAM.

And because there's costs to switching language stacks, people will stick to
writing in the language they are comfortable in, so critical business systems
get written in slow languages.

~~~
coldtea
> _But in your example, you're only describing one specific task (n=1)._

That's to maintain parity with the example TFA gives. I'm not saying it's
never worth it... Of course you'll optimize often repeated tasks, loops, etc.

Here's what TFA quotes as bad reasoning, because it makes fun of optimization
effort in some cases:

"@tveastman: I have a Python program I run every day, it takes 1.5 seconds. I
spent six hours re-writing it in rust, now it takes 0.06 seconds. That
efficiency improvement means I’ll make my time back in 41 years, 24 days :-)"

Unlike TFA, I agree with the tweeting guy above that the efficiency
improvement here wasn't worth it.
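For the record, the tweet's arithmetic roughly checks out; a minimal sketch, assuming one run per day:

```python
# Back-of-the-envelope check of the tweet's numbers (one run per day assumed)
seconds_saved_per_run = 1.5 - 0.06    # 1.44 s saved per run by the rewrite
rewrite_cost = 6 * 3600               # six hours of rewriting, in seconds

runs_to_break_even = rewrite_cost / seconds_saved_per_run  # 15,000 runs
years = runs_to_break_even / 365.25
print(f"{years:.1f} years")           # ~41 years, matching the tweet
```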

~~~
juststeve
Yeah, agreed. I mean, in that example, if tveastman had written it in a fast
language the first time, the rewrite probably wouldn't have been required. (I
think that tweet was a bit of a laugh anyway, because he now gets to write new
code in Rust.)

But since many statically typed languages support type inference, the folklore
that dynamic languages are "more expressive" is not actually correct.

For example (Rust-style):

let x = "string literal"; // type inferred as &str

let y = 42; // type inferred as i32

let z: i32 = 100; // type declared explicitly as a 32-bit int

------
mrcsharp
I agree with the author on every point except:

>That is not engineering. That’s just lazy programming

I don't believe that developers/programmers are all lazy. There are a lot that
want to do a good job optimizing their code and making sure it performs well
and is future proof as much as possible. I believe that budget limits and
pressure from deadlines set by non-technical people forces even the good
programmers to cut corners in order to deliver.

~~~
Aeolun
I believe fully in cutting corners when you know they are being cut. It’s when
you are unaware you are cutting corners that you end up with a seriously
unrecoverable system.

------
ken
> I want to take pride in my work. I want to deliver working, stable things.
> To do that, we need to understand what we are building, in and out, and
> that’s impossible to do in bloated, over-engineered systems.

Somebody needs to _write_ this software. Efficient, maintainable, and debugged
are not the low-energy state.

That means it either needs to come from business, or from open-source
hobbyists. Business doesn't see a competitive advantage in it -- even Apple,
which has traditionally cared more about UX than anybody, just has to be
better than their #2 competitor (and price of entry to "desktop OS" or
"smartphone OS" is high so that list is short, and not changing on any
relevant timescale). And the open-source world has never delivered well on the
end-user experience side of things.

The sad truth is that users would rather have new software for $0 (paid by
ads, or media subscriptions, or whatever) than pay what it truly costs to
develop software.

My main hopes today are that the end of Moore's Law will force companies'
hands, or that government will step in to regulate minimal quality, or that
workers will organize so they can stand for quality behind a CBA. These all
seem rather unlikely at this juncture. The number of programmers in it for the
paycheck far outweighs the number of people who care about simplicity.

Software is going to get much worse before it gets better.

------
nixpulvis
Heh, software in general, absolutely! But I think "Web 2.0" is the worst
offender: [https://nixpulvis.com/ramblings/2018-08-11-web-shit-point-oh](https://nixpulvis.com/ramblings/2018-08-11-web-shit-point-oh)

~~~
TeMPOraL
The irony is, Web 2.0 was enabled by a performance improvement! It's only once
JavaScript engines became fast enough to be useful that all these troubles on
the Web started.

------
dang
Discussed at the time:
[https://news.ycombinator.com/item?id=18012334](https://news.ycombinator.com/item?id=18012334)

------
perceptronas
I like the fact that people are starting to get fed up with slow software.
Maybe this will increase demand enough that we start seeing new, performant
software pop up.

------
valtism
The article quotes this in regard to npm: [https://medium.com/s/silicon-
satire/i-peeked-into-my-node-mo...](https://medium.com/s/silicon-
satire/i-peeked-into-my-node-modules-directory-and-you-wont-believe-what-
happened-next-b89f63d21558)

It seems he does not realise that this is a satire piece, and seems to
completely buy into his view of the world instead of seeing things in a more
nuanced way.

~~~
camillovisini
This made my day! It just gets better and better!

------
sytelus
A single car has about 30,000 parts, counting every part down to the smallest
screw (according to Toyota). A lot of people don't understand the difference
in complexity between a car and software. The analogy between the two is quite
unfortunate.

Software, in a nutshell, is programmable transistors. Each CPU instruction is
in effect just a convenient way to design a specific electronic circuit. Even
the trivial act of printing Hello World involves an astonishing amount of
complexity once you take into account all the protocols, APIs, driver code,
kernel code, fonts, rendering, and graphics that get executed in between. If
you showed a computer printing Hello World on screen to someone from the 1920s
who knew how to build electronic circuits and a primitive "display", they
could estimate the amount of work required to do that. Nothing has changed
from the 1920s to the 2020s in terms of the complexity needed to enable a
simple Hello World. A relatively simple program will easily exceed 30,000
low-level components working together in an intricate dance to achieve a goal.
Now think of large codebases with millions of lines of code... This is why
software is hard, software is complex, software is messy, and software is
magic.

------
burlesona
Nothing feels more broken to me than the React/JavaScript web ecosystem.
Writing rich, stateful UIs in HTML isn't trivial, but React plus webpack hell
turns it into a mess so difficult to develop, maintain, and reason about that
people just quit trying and accept that their error tracker will be full of
random bugs nobody can explain or reproduce. It makes me sad.

------
overgard
I totally agree with him that it's awful, but I think the problem is that
making things efficient is expensive in terms of time and of having to hire
expertise. Evidently efficiency just doesn't have enough return on investment
for companies to care. The thing about looking back at the Windows 95 era and
comparing it to now is that Windows 95 _needed_ to be efficient to be usable;
Windows 10 doesn't.

The exception would be games and embedded software, but even there, there are
certain degrees of laziness. For instance, games are very CPU/memory/GPU
efficient, but they're almost always ridiculous when it comes to disk space
usage. There's no reason your average AAA game needs to take up 50GB other
than that the things which could address it aren't worth the fuss. (I'm
thinking of common demo-scene tricks like procedural textures/data/everything,
aggressive compression schemes, reusing assets, etc.)

~~~
shiftless
But what real enhancements does Windows 10 provide, which couldn't have simply
been done as patches to Windows 7, keeping the same main structure and layout
and overall design, but just continually improving it, the way Henry Ford did
with the Model T for 20+ years?

For that matter, one could for example take the Windows NT 4.0 source code,
add in drivers for the necessary hardware, fix boot code, linking, etc to be
compatible with late model computers, spruce up the UI with better font
rendering, antialiasing, 24-bit color wallpaper, OpenGL rendering even--and in
the end, you'd have something just as functional as Win 7/10 but at 1/4 of the
bloat.

This sort of thing would be technically very easy to do. It's much easier than
the status quo of continually reinventing the wheel. So why, oh why, is there
this overpowering desire to continually throw out good code and replace it
with heavier, more bloated junk, which doesn't really offer any _real_
increase in functionality?

~~~
criddell
The overpowering desire is to make more money for your shareholders. That
sometimes isn’t going to align well with the interests of end users.

~~~
zozbot234
Except that it also happens with internal software, where there's no
separation between developers and end-users. It's second-system effect writ
large. "We'll rewrite this old, outdated software and add plenty of bells and
whistles! This time will be different, we'll finally be doing it right."

~~~
Wowfunhappy
See, to me, internal software may be the one place where "developer time"
really is more valuable (depending on the size of the organization), because
you're developing for far fewer uses.

One second times five million users is almost two months. One second times
five thousand users is less than 90 minutes.
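That arithmetic, spelled out in a trivial sketch:

```python
# One second of wasted user time, aggregated across the whole user base
SECONDS_PER_DAY = 86_400

large_user_base = 5_000_000 * 1   # seconds wasted across five million users
small_user_base = 5_000 * 1       # same waste across five thousand users

print(large_user_base / SECONDS_PER_DAY)  # ~57.9 days: almost two months
print(small_user_base / 60)               # ~83.3 minutes: under 90
```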

~~~
Wowfunhappy
^ Bad typo, I meant to say "you're developing for far fewer _users_."

------
cutler
Second the OS > VM > Docker > Kubernetes gripe. Soon no one will know how to
admin a system. Companies like Hetzner
([http://www.hetzner.com](http://www.hetzner.com)) are offering an 8-core VPS
with 32 GB RAM for a mere $40 per month, but all we hear about is AWS. It's
insane. Facebook conquered the world on a fraction of that hardware back in
2004. Docker and Kubernetes were originally designed to solve problems
managing massive fleets of servers, but now devs at every two-bit startup with
a single server are expected to hide their efforts behind these two extra
layers. "Over-engineering" doesn't even come close.

~~~
Aeolun
My problem with that 8C/32GB machine as opposed to AWS is that that’s
literally everything I want my VPC and direct connect.

~~~
cutler
Sorry, I don't understand the last part of the sentence.

~~~
Aeolun
I missed a period. I meant to say that Hetzner gives me great machines, but
nothing else (actually they changed that a little bit on their VPS offering,
which now has an internal network).

This works great for my personal machines, but not for my company, which wants
to do fancy networking things without going over the open internet.

I do fully agree that it’s an order of magnitude cheaper if you know what you
are doing though. I’m sure our $3000/month AWS stack would run a lot more
efficiently on my $20/month Robot server.

------
d_burfoot
Can someone from the quant/HFT world comment on whether these kinds of
problems exist in that sector as well? Somehow I imagine that the fierce
competitiveness of trading would force people to write better software, but
that’s just a theory.

~~~
pnako
It depends what you mean by better. Obviously, real-time trading systems are
going to be optimized to reduce latency, since it directly correlates with
profits. One way to do that is to completely bypass certain abstraction
layers, use polling instead of events, do more stuff in userland, tune
kernels, and so on. This works, but you end up with a really bespoke system
that is not particularly elegant.

But you don't have to go to those extremes to have good software performance.

------
phillipcarter
A coworker of mine explained his belief that professional software development
is an inherently economic activity. This was clarified by saying that the
amount of imperfections, performance problems, and bugs in a piece of software
reflects not the software or its writers, but what most end users ultimately
care about.

Whenever I read posts like Software Disenchantment, I find myself agreeing
with that philosophy. In other words, it’s probably by design. Of course this
doesn’t account for the enormous waste of time and money that occurs during
software development, but that doesn’t really affect my feelings on the
matter.

~~~
detaro
That assumes end users can make informed decisions about the software they
use, and have free choice to choose a less-buggy option if they just preferred
that. Neither of which is true in many cases.

------
itqwertz
...And someone just made their first million with some buggy PHP app.
Perfectionism is the enemy of progress. In the end, software is categorized
into two types: software that people bitch about and software no one uses.

------
cryptica
>> Build systems are inherently unreliable and periodically require full clean

What frustrates me above all else is the trend of compile-to-javascript
languages. IMO interpreted languages are great, of course some performance was
sacrificed to get there - I think that was a fair trade-off because saving the
developer from having to build the project is a HUGE advantage when developing
(at least for my particular development flow)... So when I see people throwing
away that massive advantage by adding an unnecessary compile step in order to
get slightly better syntax or static typing (e.g. CoffeeScript or TypeScript),
I find it deeply disturbing. Static typing can be a useful feature, but is it
worth adding a compile step? Not by a long shot.

And the idea of transpiling a language into an interpreted language is just
ridiculous in principle. We had an army of very smart people who invested a
huge amount of time and effort into making efficient interpreters for certain
languages but all that work is thrown away as soon as you add a build step.

And the stunning thing is that it's actually possible (easy even) to create
excellent software with clean code without a build step (I've done it many
times) but these simple, clean approaches are never popular. People want to
use the complex approach that introduces a ton of new problems, delays and
compatibility issues.

------
noyc
While I agree with the author's observations on efficiency and simplicity in
modern software, I think the article's tone is needlessly antagonistic. It
reads like a diatribe about all of us terrible programmers screwing up
software for him. Perhaps I am an exception, but so far the majority of
programmers I've worked with have been very conscious of their software's
performance and worked hard to make it as fast as they could.

The problem is that when your software is built on top of a framework and/or
uses X different web APIs etc then you often run into issues where a part of
the system that you don't have control over causes performance issues and you
don't have the expertise/time to profile it in order to fix it. So I think
what's causing problems is that software has become a lot more about putting
together frameworks, libraries and reusable components and when faced with
such a complex system a programmer will often give up and say "there! it's as
fast as I can make it without rewriting everything from scratch".

Therefore, the issue seems to be that programmers are building on top of other
systems that they don't know enough about to use efficiently. The author does
mention this issue in his article as well, but in a slightly derogatory
fashion blaming programmers for bringing in dependencies they don't need.

I think if everybody had the time and ability to write everything from scratch
like Jonathan Blow is doing with Jai then yes, things would be more efficient.
It is far easier to profile and debug code you've written yourself. However,
seeing how this isn't feasible for most projects, I think more focus should be
put on better documentation of frameworks and libraries.

------
eximius
I agree with this.

I think we need a guild. We need licensed software engineers.

Not every programmer needs to be one, just like not every engineer needs to be
licensed, but there needs to be a licensed engineer on every team. And, of
course, sometimes there doesn't need to be. But I sure wish there was the
option.

And hell, bring back apprenticeships and mentoring with the guild. There is so
much we could learn from the physical science engineering disciplines

~~~
pm90
While the apprenticeship model is a good one and can work in software, I would
be wary of such an effort considering how rapidly software evolves. Guilds
work well for technologies that are fairly stable and require years of
practice and study to get right (metal working, carpentry, plumbing, surgery
etc.), while with software you could specialize in languages and frameworks
that become obsolete in the order of decades.

However, the idea of working closely with Senior Engineers and learning from
them is certainly something that I vehemently agree with. I've been fortunate
to have had that opportunity.

~~~
eximius
While software engineering is growing and evolving, there's also a lot of
cyclical fads that we should look past. Sure, there is the framework du jour,
but the fundamentals of computer science and software engineering grow far
slower.

Mastery of frameworks is NOT mastery of our craft. They are useful tools that
come and go. But the underlying principles are what should concern us.

To that point, none of your examples of stable disciplines are static. New
surgical tools, techniques, and technology are constantly produced and
surgeons must learn to lay down their old ones and adapt. Metalworking,
carpentry, plumbing all need to learn about new materials developed and code
changes.

All of those things are like frameworks.

------
matheusmoreira
A good way to remove bloat is to find and remove needless abstractions. Even a
C compiler will produce huge executables and it takes a lot of work to reduce
a program to the smallest amount of code:

[http://www.muppetlabs.com/~breadbox/software/tiny/teensy.htm...](http://www.muppetlabs.com/~breadbox/software/tiny/teensy.html)

------
mc3
Sounds like he wants software to contain AGI, such that it can work out which
version of a document you want to keep and fix its own errors.

Honestly, things were way more sucky years ago. Your Windows computer crashing
was just normal in the '90s; rebooting and reinstalling were par for the
course. Getting Linux to install with working drivers seemed impossible unless
you chose your hardware carefully.

------
minimuffins
All these critiques really resonate with me. I've asked myself that same "What
could it possibly be DOING!?" question every time I update windows. Recently I
started learning 6502 assembly so I can write NES games. It's really pleasant
to abandon the layers and layers of dysfunctional, messy framework goo that is
now the norm in my daily work life in web based programming.

But something crucial is missing from this manifesto. As the author says, the
problems don't exist because we can't solve them, but because no one ever
takes an interest in solving them. We're all engineers, so why don't we just
do some engineering and fix this bullshit? Well, because it wouldn't make any
money.

Perplexingly and contradictorily, the profit motive drives both innovation and
stasis, both growth and sprawl, and both efficiency and inefficiency. The
problem is both technical and social, and probably so will be the solution.

------
slx26
we all want simple, fast, small programs with clear objectives.

but:

\- economic incentives do not align with common sense

\- landscape is fragmented, continually evolving and unstable

\- we are all posers. we all have opinions and morals, look at others and
criticise, but when it comes to our own work we are just like anyone else. we
need money to live, so we just go with the flow

------
cryptica
From the JavaScript ecosystem, I'm frustrated by Babel's popularity. Whenever
I discuss it with developers, everyone agrees that it's not worth it, and yet
people still use it! It's as if we have no choice in the matter!

The whole point of Babel was to allow us to use the latest JavaScript syntax
so that we wouldn't have to update our own source code when the new syntax
finally became broadly supported by browsers and other engines.

IMO, the Babel project is a failure because:

\- Babel itself is always being upgraded to support newer ECMAScript syntax,
so people still need to upgrade their own code whether they use Babel or not.
The only benefit is that using Babel allows you to use these features before
other people.

\- Instead of just worrying about how JavaScript syntax changes affect your
code, with Babel, you also need to worry about how Babel upgrades will affect
your code. The babel plugin dependencies often change and break over time
(even if you don't change your own code) and you always have to support both
ecosystems.

So when you consider the big picture, Babel doesn't save you from having to
upgrade your own code (as per its original promise). When you evaluate the
pros and cons, the cons greatly outnumber the pros:

Pros:

\- You get to use the newest language features before other people.

Cons:

\- You need to maintain your code for compatibility with two ecosystems
instead of just one. Keeping up with both ECMAScript + Babel is a lot of work.
I would even argue that staying up to date with Babel is more work because
dependencies keep changing underneath your project.

\- It forces you to use a build step so you lose the benefits/iteration speed
that an interpreted language brings to your development flow.

\- Adds a lot of bloat and unnecessary, hard-to-describe dependencies to your
project which can open up security vulnerabilities and make your code more
opaque and brittle.

------
didibus
> “Programmer time is more expensive than computer time.”

I support the idealism of the article, but this quote is very accurate. Nobody
wants to pay for quality software. Not your users, not your stakeholders, not
even you!

And that's because the price of quality isn't just a little higher. It's not
$1 a year extra; it's exponentially more expensive. And some of those paid
efforts will still end up slow and crappy, because of something else the
article doesn't acknowledge:

Writing fast, efficient, simple, correct, full featured software is really
really hard.

So not only is it expensive, it's also just plain difficult. Meaning it's not
just about money and resources, but also time, time to experiment and fail.

------
wffurr
>> That is not engineering. That’s just lazy programming. Engineering is
understanding performance, structure, limits of what you build, deeply.

If I think back to my engineering school days, the definition of "engineering"
for my classmates in civil and electrical engineering was to look up well
defined procedures and calculations from a book and apply them. No deep
understanding required to design a bridge that didn't fall down or a circuit
that didn't overheat.

What's the equivalent for software? Design patterns were a bust. SICP is for
cultists. It's a huge void. There is hardly any such discipline as "software
engineering" yet.

------
coldtea
> _Modern text editors have higher latency than 42-year-old Emacs. Text
> editors! What can be simpler? On each keystroke, all you have to do is
> update a tiny rectangular region and modern text editors can’t do that in
> 16ms. It’s a lot of time. A LOT. A 3D game can fill the whole screen with
> hundreds of thousands (!!!) of polygons in the same 16ms and also process
> input, recalculate the world and dynamically load /unload resources. How
> come?_

Well, where's your word processor buddy? Try to write one to achieve those
goals -- and offer what people want today, including syntax highlighting,
linting, auto-completions, etc, and come back to us...

~~~
anthk
>Well, where's your word processor buddy? Try to write one to achieve those
goals -- and offer what people want today, including syntax highlighting,
linting, auto-completions, etc, and come back to us...

Emacs has been able to do that for 25 years, if not more.

~~~
coldtea
Emacs never could, and still can't, draw while processing input, and it even
needs the occasional manual refresh. Badly written code (a plugin, etc.), a
busy loop, and so on can freeze Emacs. So there's that...

And, Lisps aside, the kind of linting, auto-completion, and understanding of
syntax/ASTs that Emacs could do 25 years ago is much simpler than what devs
expect today.

------
acvny
Each and every point the author makes shows lack of understanding of some
basic things. Comparing cpu resources with fuel consumption? Really? Or
windows update taking 30 minutes... Remember premature optimization is the
root of all evil.

------
angarg12
I can imagine how many software products get to this state. At some point
someone makes a poor design decision, and that sets a precedent. Or
programmers can't be bothered to clean up and refactor their code, and it adds
up over time.

But the best explanation for why this problem persists even in teams of
proficient engineers that I have seen comes from the 2005 GDC Keynote from
John Carmack [1].

> I can remember clearly in previous projects thinking [...] I've done a good
> job on all of this but wouldn't it be nice if I could sit back and really
> clean up the code, perfect the interfaces, and you know, just do a wonderful
> nice craftsman job [...] interestingly this project I've had the time to
> basically do that and I've come to the conclusion that it sort of sucks
> [...] there's a level of craftsman satisfaction that you get from trying to
> do what you do at an extreme level of quality and one thing that I found is
> that's not really my primary motivation [...] that's not what's really
> providing the value to your end-users [...] you can sit back and do a
> really nice polishing job, and it's kind of nice, but it's not the point of
> maximum leverage, I found.

So, as others have mentioned here, there is a threshold where extra
performance provides less value to the project than, say, an extra feature.

It seems that in some cases we have crossed that threshold and some software
has become comically bloated. I attribute the persistence of those cases to
the same reasoning: refactoring an existing project to reduce the bloat would
take too much time and effort from a single developer, effort that could be
used elsewhere, even though in the end everyone would benefit. So you are
better off adding stuff to the dumpster fire and moving on.

It's sort of a tragedy of the commons [2] of software performance.

[1] [https://youtu.be/N0auhzHZe5k?t=1015](https://youtu.be/N0auhzHZe5k?t=1015)

[2]
[https://en.wikipedia.org/wiki/Tragedy_of_the_commons](https://en.wikipedia.org/wiki/Tragedy_of_the_commons)

------
ateng
> Modern text editors have higher latency than 42-year-old Emacs. Text
> editors! What can be simpler? On each keystroke, all you have to do is
> update a tiny rectangular region and modern text editors can’t do that in
> 16ms.

Modern text editors have to deal with proportional fonts, right-to-left
writing, anti-aliasing, and a big bag of Unicode-related issues (Arabic is
going to break all the assumptions you have about language and text editing).

Fantastic read on this subject from a few months ago:

[https://news.ycombinator.com/item?id=21384158](https://news.ycombinator.com/item?id=21384158)

------
29athrowaway
In the past, there were elixirs that claimed to cure baldness, impotence,
cancer, etc. Marketing based on unproven claims was legal, and everyone was
doing it.

Software today is in the snake oil era. "Secure", "privacy-friendly",
"robust"...

We need the government to create the FDA of software, to protect the consumers
against potentially harmful software, marketed using false advertisement.

We also need job applicants to be protected against predatory companies that
hire people who want to do right by the customer, but force them to produce
rushed, poor-quality stuff.

------
misir
I remember 5-6 years ago I tried to create an app in Pascal to remove
cache/temporary files, fix registry errors, and apply a few optimizations,
like the popular "optimization/speedup" apps do. I wanted to show the progress
on a progress bar, but guess what: the whole process took 0.3-0.5 seconds, so
the progress bar wasn't even needed. I wanted to show it anyway, so I put a
random amount of wait time (sleep) between steps. I couldn't work out why the
popular apps need 3-5 minutes for a task my simple app finished in 0.5-1
second.
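The padding trick described here is a one-liner per step; a hypothetical sketch of the idea:

```python
import random
import time

def run_with_cosmetic_progress(steps, delay_range=(0.2, 0.5)):
    """Run each step (the real work takes milliseconds), then sleep for a
    random interval purely so the progress bar appears to be doing work."""
    for done, step in enumerate(steps, start=1):
        step()                                    # the actual (instant) work
        time.sleep(random.uniform(*delay_range))  # cosmetic padding
        print(f"progress: {done}/{len(steps)}")
```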

------
sergiotapia
>I’ve been programming for 15 years now. Recently, our industry’s lack of care
for efficiency, simplicity, and excellence started really getting to me, to
the point of me getting depressed by my own career and IT in general.

You and me both, buddy! 11 years here - you are not alone in this!

What I found helps take my mind off this nagging feeling is to bring in a
tool that is like a razor-sharp, brand-new scalpel. For me that was Nim. It's
readable, small, fast, and spits out tiny statically compiled binaries.

Find your scalpel and slice and dice. Back to basics. Purity.

~~~
eecsninja
25 years since I started learning C++ as a kid, 14 years in industry and
feeling like a disappointed old man despite still having the youthful looks to
pass for a student!

------
swills
I love that he linked this page in the article:

[https://docs.gitlab.com/ee/administration/operations/unicorn.html#unicorn-worker-killer](https://docs.gitlab.com/ee/administration/operations/unicorn.html#unicorn-worker-killer)

because I've had to deal with that a bit. N.B. the last sentence here:

> One other thing that stands out in the log snippet above, taken from
> GitLab.com, is that ‘worker 4’ was serving requests for only 23 seconds. This
> is a normal value for our current GitLab.com setup and traffic.

------
Izkata
> A 16GB Android phone was perfectly fine 3 years ago. Today, with Android
> 8.1, it’s barely usable because each app has become at least twice as big
> for no apparent reason. There are no additional features. They are not
> faster or more optimized. They don’t look different. They just…grow?

I jumped straight from Android 2 to Android 8 development and was surprised
myself, so did some investigation and have half an answer. The author is
actually wrong here, on both looking different and additional functionality.
However, the bloat is still far larger than it needs to be.

All the bloat comes from the AppCompat modules, which all the docs recommend
to the point of it apparently being required if you don't know better.

AppCompat is for both supporting differing APIs and creating a consistent look
and feel across different Android versions. Each Android version has its own
visual design, which Google decided was a bad thing, opting to use AppCompat
so the most recent designs (and in some regards design functionality like
coloring the selection handles) were used in older versions of Android.

To do this however, it includes a crap-ton of images. The build scripts are
supposed to remove the unused ones, but even with maximum trimming enabled it
can only remove somewhere around 10%. There are hardcoded inclusion rules for
some of the AppCompat Java code that no one's found a way to override, which
in turn reference the images - so they get kept as well, even if your app
never uses them.

As for differing APIs, notifications have changed massively over the years.
The interface is so different you do actually need the AppCompat subset for
notifications to target different Android versions (and that can be used
separately from the rest of AppCompat), but there also have been a huge number
of new features added to notifications - such as delay settings, shortcuts,
icons, even full-on fancy designs, that didn't exist early on.

I'm calling this only half an answer because there's no apparent reason for
some of the notification API changes, and the build scripts/AppCompat can
certainly be significantly improved to remove more cruft. I have a sneaking
suspicion that it's not done because this is low-hanging fruit for handing
over signing keys to Google for their "optimized" builds...

------
ChrisMarshallNY
Lots of moving parts.

I am a dependency skeptic. I think that you need them to do big stuff, but
should probably avoid them for small stuff.

High-quality dependencies can have a drastic impact on the quality of your
software, but so can low-quality dependencies.

I think we are at the tail-end of a "wild west" of dependencies.

When the dust settles, there will be a few really good, usable and stable
dependencies, and a charnel pit, filled with the corpses of all the crap
dependencies, and, unfortunately, the software that depended on them.

------
chrismmay
Great article. The author makes some really good points. Surely the big tech
firms have skunkworks projects going on to rebuild the problem areas?
Microsoft. Google. Facebook. Amazon. Netflix. They all must have decent-sized
R&D departments. Perhaps an independent, very public, curated list that points
out the areas that desperately need work would be helpful. Naming and shaming,
so to speak, would prod them to move in the right direction on the areas that
need work the most.

------
systematical
Couldn't agree more. I am going to throw marketing under the bus for web
application bloat. They make us install script after script to track users.
They are the worst.

------
vojta_letal
Most of the issues outlined are problems of the web. The HTML+CSS+JS combo is
just painfully slow and wasteful by design - it's way too many levels of
flexible abstraction, which is suboptimal for app development. Moreover, it's
expensive to maintain two apps - web and native - which share the exact same
UI/UX. Hence the rise of Electron, React Native, ...

The only way out of this is to rethink the web. Which is a hard one to tackle.

~~~
mikro2nd
I think it's a desktop problem, too. TBH I don't know how things are over in
the Mac/Win world, but over here in Linux land it's... not so good.

Boot used to be a small number of seconds. Now (on the rare occasions I'm
forced to actually boot/reboot) I start the machine and bugger off to make
coffee while it does its thing. I don't know what takes so fucking long, but
it's in the range of 'several minutes'.

Starting apps likewise. I just started up an infrequently used picture-editing
app a little while ago,... up towards a full minute of 'loading this crap',
'loading this other crap', etc.

And let's not mention Atom (an Electron app unless I've misunderstood
something) -- so laggy for some things that should be near-instant that I'm
developing an active hate.

 _Alright: get the hell off my lawn now!_

------
notSupplied
This is the Jevons paradox of software development.

As self-driving cars enable us to fit far more cars on the road before causing
the same level of congestion, people will start taking increasingly longer car
rides for decreasingly valuable reasons, until the roads are just as
intolerable as they were before the innovation occurred.

Replace "self driving cars" with "faster hardware / more memory".

------
pm90
IMO the article paints an overly negative picture of the state of technology.
Any profession with enough practitioners will create tons of crappy stuff;
with most professions that crap is just not discoverable. E.g. if you're a
crappy carpenter, only your city or neighborhood can see your shitty work. But
if you make a crappy website or app, anyone in the world can see and use it.

~~~
jodrellblank
Sturgeon's Law, "Ninety percent of science fiction is crud, but then ninety
percent of _everything_ is crud".

------
titzer
FTA: > As a general trend, we’re not getting faster software with more
features. We’re getting faster hardware that runs slower software with the
same features.

Or fewer features. Talk to an actual professional who uses spreadsheets all
day long about switching from Excel to Google Sheets. The infantilization of
UIs and the "oh, they'll never miss it" attitude are infuriating.

------
rckoepke
> Google Play Services, which I do not use (I don’t buy books, music or videos
> there)—300 MB that just sit there and which I’m unable to delete.

That's... not what Google Play Services is, like, at all.
[https://developers.google.com/android/guides/overview](https://developers.google.com/android/guides/overview)

------
bhupesh
> Modern text editors have higher latency than 42-year-old Emacs. Text
> editors! What can be simpler? On each keystroke, all you have to do is
> update a tiny rectangular region and modern text editors can’t do that in
> 16ms. It’s a lot of time.

Because we are going to make editors using JAVASCRIPT, which was never meant
to be used this way.

------
LogicalBorg
The law of software bloat has been known since the 1980's. It's called Wirth's
Law, or "What Intel giveth, Microsoft taketh away". See
[https://en.wikipedia.org/wiki/Wirth%27s_law](https://en.wikipedia.org/wiki/Wirth%27s_law).

------
smadurange
My concern about this state of affairs in software engineering is that it may
be producing generations of engineers who couldn't actually build things
better (e.g. a rendering engine or a mobile OS) if they wanted to,
concentrating that power in a few corporations. That's a bit terrifying.

------
jordanbeiber
I’m in a bad mood today, but IMO much of this is a symptom of modern
economics and business management.

Also - resources that are essentially free to use (customer/users CPU cycles,
storage and bandwidth) will be consumed.

We as consumers are paying the bills for it all in different ways
(electricity, new gadgets, cloud costs, etc.).

------
SebastianFrelle
How does this apply to programming languages? I've always heard that
developers should just stick to the language in which they're most productive,
but am I part of the problem if I pick Python over a more performant language
like Rust or Go for all of my work (web apps, command-line tools, etc.)?

------
amelius
Nowadays, the shittiness of software doesn't come from lack of performance,
but from ads and tracking.

------
tanseydavid
This is extremely well worded and gets right to the point. That is so
refreshing in contrast to the constant attempts to _explain_ why software is
so slow, bloated, and especially unreliable.

An explanation of "why" does not explain "why this is acceptable".

------
agumonkey
Large socio-economic factors are at play.

It depresses me to no end too, but I am not surprised in the least.

There were tiny groups trying to go frugal and solid. Remember suckless? I
forget the other names; there's also Alan Kay's VPRI project with OMeta.

Maybe we should make a FrugalConf. Everything at 25fps on a Raspberry Pi Zero.

------
Aeolun
In my opinion, the reason nobody cares about something taking 15 seconds is
that the process would otherwise have taken them 4 hours of manual work.

Of course they’re going to accept that delay, even if adding a few million
numbers together should really take less than a millisecond.
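
That intuition is easy to sanity-check; a quick sketch (the 2,000,000 is
arbitrary, and timings vary by machine):

```python
import time

def time_sum(n=2_000_000):
    """Sum a few million integers and measure how long it takes."""
    nums = list(range(n))
    t0 = time.perf_counter()
    total = sum(nums)
    elapsed = time.perf_counter() - t0
    return total, elapsed

total, elapsed = time_sum()
print(f"summed 2,000,000 ints in {elapsed * 1000:.2f} ms")
```

Even unoptimized Python finishes this in a small fraction of a second; a
15-second wait is coming from somewhere else entirely.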

------
jakear
Yet another Electron hit-piece... curious, how do other cross-platform
frameworks compare in both build time and live-reload time? For comparison, VS
Code clean builds in about a minute, and a change-build-reload dev cycle is
about a second or two.

------
jayd16
Everyone is so angry. Isn't this the world of ubiquitous code and unlimited
resources we wanted?

Not everything needs to be super efficient. Most things are tuned for
production cost and time. Efficient code isn't going anywhere. Relax guys.

~~~
nixpulvis
I'll relax when I can get through the day without fighting some idiotic
organization's excuse for a web form, or attention-hogging pop-ups, or, god
help me, what else?!

We shouldn't have to put up with all this shit. That's the point.

~~~
groby_b
Excellent. What, then, are you doing to make sure this shit doesn't happen?

Are you willing to pay $20 for a phone app, instead of fishing for the free
version? Are you willing to pay for websites to get rid of ads? Are you
willing to pay $300 for a new OS?

Alternatively, are you willing to ditch capitalism for a system that
prioritizes the commons?

It's us. We are creating the incentives for a world that produces this. Unless
we change that world, this is what we'll get.

~~~
nixpulvis
I'd like it if my phone let me install my own software. That would be a pretty
big step for me personally. But I'm still stuck on iOS for now.

~~~
literallycancer
iOS doesn't have 3rd party app markets?

------
davnicwil
> Linux kills random processes by design. And yet it’s the most popular
> server-side OS

There's no reference for this in the article, and it caught my attention -
anyone have any idea what the author is talking about here? Never heard of
this before

~~~
cube00
It could be referring to the out of memory (OOM) killer
[https://www.kernel.org/doc/gorman/html/understand/understand...](https://www.kernel.org/doc/gorman/html/understand/understand016.html)
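
For background: under memory pressure the kernel assigns every process a
"badness" score and kills the highest-scoring one. The score is visible
through /proc; a minimal sketch (Linux-only, returns None elsewhere, and
writing the adjustment normally requires root):

```python
from pathlib import Path

def oom_score(pid="self"):
    """Read the kernel's current OOM badness score for a process
    (higher = more likely to be killed); None on non-Linux systems."""
    path = Path(f"/proc/{pid}/oom_score")
    return int(path.read_text()) if path.exists() else None

def protect_from_oom(pid="self", adj=-1000):
    """Lower oom_score_adj so the OOM killer avoids this process;
    -1000 exempts it entirely (needs privileges to write)."""
    Path(f"/proc/{pid}/oom_score_adj").write_text(f"{adj}\n")

print(oom_score())
```

This is how daemons like sshd keep themselves off the kill list while leaving
everything else fair game.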

~~~
qqii
There has been some work on this in nohang[0].

> Customizing victim selection: impact on the badness of processes via
> matching their names, cgroups, exe realpathes, environs, cmdlines and euids
> with specified regular expressions;

[0]: [https://github.com/hakavlad/nohang](https://github.com/hakavlad/nohang)

------
jeromebaek
The AI/ML hype drives talented students away from engineering and systems
research, as well. Because of the hype, engineering is somehow less
prestigious than “data science”. I wonder how long until this trend blows up.

------
niknetniko
I agree with the overall sentiment, yet the examples could be better.

> Modern text editors have higher latency than 42-year-old Emacs. Text
> editors! What can be simpler? On each keystroke, all you have to do is
> update a tiny rectangular region and modern text editors can’t do that in
> 16ms. It’s a lot of time. A LOT. A 3D game can fill the whole screen with
> hundreds of thousands (!!!) of polygons in the same 16ms and also process
> input, recalculate the world and dynamically load/unload resources. How
> come?

Text is very complicated. Does your 42-year-old Emacs support Unicode? And not
just accents, but whole different scripts?

See
[https://news.ycombinator.com/item?id=21105625](https://news.ycombinator.com/item?id=21105625)
for some discussion and a good link about the complexities of rendering text.
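
To make that concrete: even "how many characters is this?" has no simple
answer. A small Python illustration:

```python
import unicodedata

# What the user sees as one character may be several code points.
decomposed = "e\u0301"   # 'e' followed by a combining acute accent
composed = unicodedata.normalize("NFC", decomposed)

print(len(decomposed))        # 2 code points, rendered as one glyph
print(len(composed))          # 1 code point after NFC normalization
print(composed == "\u00e9")   # True: identical to precomposed 'é'
```

An editor deciding what one press of backspace should delete, or which "tiny
rectangular region" a keystroke dirties, has to get this (plus bidi, ligatures,
and grapheme clusters) right on every keystroke.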

~~~
anyfoo
I'm not sure if the author meant current versions of Emacs (with their 42-year
lineage) or not. But if that's the case, then I think Emacs was one of the
very first editors to support Unicode and entirely different scripts. If my
memory does not fail me, Emacs had extensive wide-character support long
before Unicode, with an extraordinarily broad treasure trove of different
input methods on top.

Emacs has also always been (in)famous for its text-update render algorithms,
which probably still work well over the old slow terminal lines it was
originally used on.

------
block_dagger
This article ignores the primary reason for the basic “problem”:
changeability. How easy is it for me to hire a cheap programmer to iterate my
profitable saas biz to keep it relevant in the rapidly changing market?

------
bluedays
Whenever I read articles like this I often wonder how much overhead is due to
security concerns, and how much that overhead and additional complexity
contributes to the problem of slower software.

------
qwertygerty
It appears to me that much of your complaint is based on the flawed assumption
that the software "industry" is filled with engineers. ("Industry" because in
software what appears to be production of something new is really just a
re-invention or re-creation of what's been done before - and unfortunately
rarely an improvement.)

For example, how many authors on Medium proclaim themselves "Senior Software
Engineers", but when you dig you find they've got maybe 5 years' experience
doing web development, with no CS or engineering education - maybe stuff like
a 36-hour "Web Development Bootcamp". Do people really not understand the
definition of an engineer anymore?

From there they progress into the deeper parts of software, and create the
atrocities found in the npm registry, which become dependencies of
dependencies of dependencies and result in nightmares every time one needs to
navigate to a website.

If it were possible to see the background and education of the numerous
critics here, what might we find? If I (the developer as described by the OP)
am surrounded by people like me, and the world is filled with people who think
like me and create things at a similar cognitive level to my peers, would I
not misjudge the collective level of quality that I perceive to be acceptable?
Smells like confirmation bias to me. Maybe a few other things, too -
Dunning-Kruger, anyone?

In their defense, the marketing campaigns created by large corporations to
turn the supply-demand balance (cost of employment) of employees in their
favor have, I think, been a big part of the problem. First the programmers,
then the "Data Scientists", etc. - think of the amount of disappointment and
student debt being created as these people eventually realise they've been
sold something they're not suited for!

If we cannot critically look at our industry and admit our flaws, we cannot
move forward as a collective.

------
barrkel
The rant is entertaining. The problem is pricing. Developers largely don't pay
the costs for what they deploy; there's no profit in efficiency. Thus things
aren't efficient.

------
nprateem
So we really need to blame the developers of all the (free) libraries we use
for not optimising them to the nth degree so our code doesn't become
bloated...

No one's forcing you to use a library. But if you do, they come with tradeoffs.

OK, things are slow and buggy. But we've got lots _more_ things thanks to all
the productivity we've gained from using libraries, etc. That means we
collectively solve more problems for more people.

Purism is a nice idea, but ultimately probably not worth the effort until
things get so bad that it is, at which point it becomes a differentiator.
I mean, I don't care if a web page is twice the size of Windows 95 because my
computer is way faster than a 486.

------
xadhix
Had to remove background to read the article :D
[https://imgur.com/gFla2WG](https://imgur.com/gFla2WG)

------
mothsonasloth
Choose your scapegoat:

a) Lack of fundamental understanding of how computers work

b) Abstraction away from the bits, flops, shifts and pops

c) Quick sort

d) Electron.js

e) Ruby hipsters

f) Magical cloud computing

g) Software companies that know the cost of everything and the value of
nothing

h) All of the above

------
rs23296008n1
npm would be a lot better if it flattened out its node_modules. I've seen
plenty of structures with multiple identical installs of the same library.

All module folders should be at root level.

Need version 15 and version 16? Yep, different folders under the root:
modulename_version, e.g. somelib_0.1.15 alongside somelib_0.1.16.

It would allow clear identification of old versions and much less duplication.
Less bloat. No way to have duplicate copies of the same library.
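
The proposed layout is easy to sketch: walk the dependency tree and hoist
every distinct (name, version) pair to one root-level folder. A toy model (the
tree shape below is hypothetical, not npm's real lockfile format):

```python
def flatten(tree):
    """Collapse a nested dependency tree into root-level folder names,
    one per distinct (name, version) pair - duplicates disappear."""
    folders = set()

    def walk(deps):
        for name, info in deps.items():
            folders.add(f"{name}_{info['version']}")   # e.g. somelib_0.1.15
            walk(info.get("deps", {}))

    walk(tree)
    return sorted(folders)

tree = {
    "app-lib": {"version": "2.0.0", "deps": {
        "somelib": {"version": "0.1.15"},
    }},
    "other-lib": {"version": "1.1.0", "deps": {
        "somelib": {"version": "0.1.15"},          # identical copy: collapses
        "util-lib": {"version": "3.2.1", "deps": {
            "somelib": {"version": "0.1.16"},      # different version: own folder
        }},
    }},
}
print(flatten(tree))
```

Two installs of somelib 0.1.15 become one folder, while 0.1.15 and 0.1.16
coexist side by side, exactly as the comment suggests.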

------
zuhayeer
> Web pages ask you to refresh if anything goes wrong. Who has time to figure
> out what happened?

I feel like this is a feature, not a bug, of software dev. The fact that you
can push updates out so instantaneously allows you to work incredibly
incrementally. You can make barely functional, inefficient things (shortcut
hacks) just to make sure that your product is something people actually use
before focusing on optimization. If users are willing to put in the extra
work to "refresh" the page, then certainly there's some real problem you are
solving.

------
erikbye
This whole article is regurgitation. Aside from that, I was just wondering:
what complex, highly efficient software has the author written?

------
georgeecollins
The analogy to cars misses where they are in their life cycle. When cars were
evolving at a faster pace, and gas was relatively cheaper, they did see
something like 5-10% efficiency gains, particularly when you correct for the
level of emissions produced.

Gas car technology matured, gas got more expensive, and cars got more
efficient. Moore's Law isn't going to go on forever. We are hitting limits in
battery life. Software will tend to get more efficient over time.

------
Wald76
Safari on iOS would not render Nikita’s article until I killed and restarted
it, nicely illustrating one of his key points.

~~~
city41
iOS has evolved into such a buggy mess I switched to Android. I used to be a
huge Apple fan, and now I don't use any Apple products. I feel they are one of
the worst offenders in degrading software quality, unfortunately.

------
drbojingle
I understand what the author is saying but it sounds like wanting tech for
tech's sake. If the user doesn't care, then does it matter that sites are
slower? And if it matters, why hasn't someone come along and done something
better like the author suggests? Surely this trend in tech performance has
been going on for so long that someone could be doing something about it?

------
AkshatM
Obligatory mention of Parkinson's law
([https://www.wikiwand.com/en/Parkinson%27s_law#/Generalizatio...](https://www.wikiwand.com/en/Parkinson%27s_law#/Generalization)),
which states

> The demand upon a resource tends to expand to match the supply of the
> resource (If the price is zero).

Have cheap hardware, software _will_ expand to use more of it.

------
kraig911
Everyone forgets the spiritual cost it takes on the meat to make silicon
think. Does it suck? Certainly not. Could it be better? Yes. That's why we're
here. Right now, though, humans can barely perceive anything shorter than
1/60th of a second. Speed/performance isn't everything.

------
haecceity
Ha! And people call me crazy for using emacs and eww for everything.

------
EdSharkey
Since about 2004, when computers started having lots of excess ram and CPU,
all that excess went towards surveillance. Your software is slow because
watching you is hard work!

------
dustingetz
Peter Principle

Conway's Law

------
utxaa
Avast! the river was angry at us arr ...

------
xmzx
Something no one has said which probably should be mentioned is that
programmers today are just not as good as programmers of yesteryear.

~~~
hazeii
Perhaps it's more that most (not all) of the many programmers today are not
as good as the (relatively fortunate) few who had access to computers in the
past.

In fact, I'd go so far as to suggest there are more good programmers around
today than there were in the past. Having said that, though, percentage-wise
I'd say the profession is definitely being deskilled (a good thing if you're
a manager, a bad thing if you're a good programmer).

~~~
eecsninja
> few who had interest in computers rather than viewing programming as a path
> to easy money.

FTFY

------
idclip
This. I feel this to the bone.

------
graycat
From the OP:

> I hope I’m not alone at this. I hope there are people out there who want to
> do the same. I’d appreciate if we at least start talking about how absurdly
> bad our current situation in the software industry is. And then we maybe
> figure out how to get out.

Okay, I'll respond, especially on the last part "how to get out".

For the problems and struggles in the OP, I've seen not all of them but too
many, and I sympathize. Mostly, though, I don't have those problems, and the
main reason is the simple, old advice of the KISS principle, where KISS
abbreviates Keep It Simple, Silly - although the last _S_ does not always
abbreviate _silly_.

In particular my startup is a Web site and seems to avoid all of the problems
in the OP. Some details:

(1) Bloated Unreliable Infrastructure?

My Web site is based on Microsoft's .NET with ASP.NET for Web pages and
ADO.NET for access to SQL Server (relational database). The version of .NET
I'm using is some 4.x. So far I've seen essentially no significant revisions
or bugs.

For the software for my Web pages I just used one of the languages that comes
with .NET. I wanted to select between two, the .NET version of C# and the .NET
version of Visual Basic. As far as I can tell, both languages are plenty good
ways to get to the .NET classes and make use of Microsoft's _managed code_ ,
e.g., garbage collection, and their CLR (common language run time) code. And
IIRC there is a source code translator that will convert either language to
the other one, a point which suggests that the two languages really are deeply
equivalent.

I've written some C code off and on for 20+ years; mostly I remember the
remark in the Kernighan and Ritchie book on C that the language has an
"idiosyncratic syntax" or some such -- I agree. I never could understand the
full generality of a declaration of a function -- IIRC there is some code in
the book that helps parsing such a statement. I do remember that

i = ++j+++++k++

is accepted by some compilers (strictly, it is not valid standard C); I don't
want any such code in my startup. Also, my old tests showed that two compilers
gave different results.

I find Visual Basic to have a less "idiosyncratic syntax" and a more
_traditional_ syntax closer to the original Basic and then Fortran, Algol,
PL/I, Pascal, etc. So, my 100,000 lines of typing are in the .NET version of
Visual Basic (VB).

For my Web site, part of the code is for a _server_ process for some of the
_applied math_ computing. The file of the VB source code is 478,396 bytes long
(the source code is awash in comments), and the EXE version is 94,720 bytes
long. As far as I can tell, the code loads and runs _right away_. Looks nicely
small and fast and not bloated or slow to me.

(2) A bloated IDE (integrated development environment).

I have no problems at all with IDEs. The reason is simple: I don't use one.

Instead of an IDE, I typed all my code, all 100,000 lines, into my favorite
text editor, KEdit. It has a macro language, KEXX, a version of REXX, and in
that language I've typed about 200 little macros. Some of those macros let
KEdit be plenty good enough for typing in software.

E.g., I have about 4000 Web pages of .NET documentation from Microsoft's MSDN
site. Many of the comments in my VB source code refer to one of those pages by
giving the path of the HTML file on my computer; then a simple command
displays the Web page. When reading code and checking the relevant
documentation, that little tool works fine.

After all, VB source code and HTML code are just simple text; so are my Web
site log files, the KEXX code, Rexx language scripting code, all the
documentation I write either in the code or in external files (just simple
text or TeX language input), etc. So, a good general purpose text editor can
do well. And, "Look, Ma: I get to use the same spell checker for all such
text!" The spell checker? ASPELL with the TeX distribution I use. It's
terrific, really _smart_ , blindingly fast, runs in just a console window.

For KEdit, it seems to load and run right away. I just looked and saw that
what appears to be the main EXE file, KEDITW32.exe, is 1,074,456 bytes long --
not so bloated.

(3) Windows 10 Home Edition Reliability.

For a move, I got an HP laptop; it came with Windows 10 Home Edition. I leave
it running 24 x 7. It hasn't quit in months. It appears that now the Microsoft
updates get applied without stopping the programs I usually have running,
e.g., KEdit, the video player VLC, Firefox, etc.

Using carefully selected options for ROBOCOPY, I do full and incremental
backups of _my_ files. I keep the ROBOCOPY log output; that output shows the
data rate of the backup, and I have not seen that to grow slower over time.
The _disk_ in that laptop is rotating, and I've never done a de-fragmentation.
So, I can't complain about performance growing slower from bloat, disk
fragmentation, etc.
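
The full-plus-incremental idea is a few lines in any language; a toy Python
stand-in for ROBOCOPY's incremental behaviour (comparing size and mtime, not
what ROBOCOPY actually does internally):

```python
import shutil
from pathlib import Path

def incremental_backup(src, dst):
    """Copy only new or changed files (compared by size and mtime),
    a minimal stand-in for an incremental backup pass."""
    src, dst = Path(src), Path(dst)
    copied = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        s = f.stat()
        if target.exists():
            t = target.stat()
            if t.st_size == s.st_size and t.st_mtime_ns >= s.st_mtime_ns:
                continue                    # unchanged: skip it
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)             # copy2 preserves timestamps
        copied.append(str(f.relative_to(src)))
    return copied
```

Run twice against an unchanged source tree, the second pass copies nothing,
which is why the data rate in the logs stays steady over time.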

(4) Windows 7 64 bit Professional Server.

For a first Web server, I plugged together a mid-tower case with an AMD
FX-8350 processor, 64 bit addressing, 8 cores, 4.0 GHz standard clock speed
and installed from a legal CD and authentication code Windows 7 64 bit
Professional SP1. As I left that pair running 24 x 7, occasionally it would
stop with a memory error of some kind. I installed an update and never again
saw any reliability problem in months of 24 x 7 operation.

Since then I looked into Windows 7 64 bit updates and concluded that (i) there
was a big _roll-up_ of about 2016 or some such; (ii) since then there have
been updates and fixes monthly and cumulative since the big roll-up, and (iii)
the updates for Windows 7 64 bits and Windows Server 2008 are the same.

I can believe that Windows Server 2008 long ran and still runs some of the
most important computing in the world. So if my Windows 7 64 bit Professional
has the same updates as Windows Server 2008, maybe for use as a server my
Windows 7 installation will be about the most reliable major operating system
in computing so far. Fine with me.

So, I am not screaming bloody murder about operating system or software
reliability.

(5) Smart Phone Bloat and Reliability.

I have no problems with smartphones if only because I have no smartphone and
don't want one: When smartphones first came out, I saw the display as way too
small and the keyboard as just absurd. Heck, in my desktop computing I have a
quite good keyboard but would like a better one and, of course, would like a
larger screen -- no way do I want to retreat to an absurd keyboard and a tiny
screen.

Next I guessed that there would be problems in security, bloat, reliability,
system management, documentation, and cost. Maybe history has confirmed some
of these guesses!

For a recent move, I got a $15 cell phone and did use it a few times. Then I
junked it.

My phone is just what I want -- a land line touch tone desk set with a phone
message function from my provider. Works fine. So, I have a phone with lower
cost and no problems with keyboard, screen, security, bloat, or documentation.
Ah, old Bell Tel built some solid hardware!

(6) Web Site Speed, Reliability, and Bloat.

My Web site apparently has few or no problems with speed, ....

Why? The site is simple, just some very standard HTML code sent from now old
and apparently rock solid ASP.NET.

Fast? The largest Web page sends just 400,000 bits (about 50 KB).

The HTML used is so old that it should look fine on any device anywhere in the
world with a Web browser up to date as of, say, 10 years ago.

The key to all of this? The KISS Principle.

YMMV!

------
jimhefferon
> Modern text editors have higher latency than 42-year-old Emacs.

Hmmm. Let's think about this.

~~~
_trampeltier
At least the "small" Windows 10 still has a text editor. The larger Android
doesn't even have one.

~~~
zamadatix
Nor should it. Shipping an app 99% of users won't use as a system app is the
antithesis of what this post was saying.

------
fizixer
This is me.

------
omg_ketchup
RIP Flash

------
angleofrepose
What are you up to these days Chris? And the obligatory question, what is your
take on the article/why post?

I piled standout quotes below.

I think a big takeaway from the intersection of Bret Victor, Alan Kay, Jim
Hollan and the ink&switch folks and your work is that the right dynamic
interface can be the "place we live in" on the computer.

Victor shows a history of interactive direct manipulation interfaces, live
environments where explorations of models or the creation of art go hand in
hand with everything else related to that task: data input, explicit
(programmatic) requirements and the visual output.

Hollan and ink&switch show the environment (ZUIs, canvas) can contain
everything for doing work, the code alongside any manipulation of the viewport
that can be conceived. Tools infinitely more advanced than Microsoft OneNote
and designed 40 years ago.

From what I know about your work, I see another take on the environment I want
to live in on the computer. I don't understand why I would want to lose power
by stepping away from my language/interpreter/compiler/repl into a GUI or
some portal when I can bring whatever is nice about GUIs or portals into my
dynamic computing environment. I very much want a personal DSL or set of DSLs
for what I do on the computer, and I want to be able to hook into anything a
la the middle mouse button in plan9.

The superior alternative to walled gardens and this absurd world of bloat and
'feature loss' (for lack of a better term for software engineering's
enthusiastic rejection of history) seems to be known, and facets of it
advocated by you and these others. It seems clear that "using the computer"
needs to return to "programming the computer" and that to achieve that we need
to fundamentally change "programming the computer" to be a more communicative
activity, to foster a better relationship between the computer and the user.

Where is this work being done now? VPRI shut down 2 years ago, Dynamicland
seems to be on hiatus? I am inspired most these days by indie developers who
write their own tools and build wild looking knowledge engines or what they
sometimes call "trackers."[1] And of course the histories and papers put
forward by the above and their predecessors. And I play with my own, building
an environment where I can write, draw, code, execute and interact with it
all. I see no existing product which approaches what I want.

> Everyone is busy building stuff for right now, today, rarely for tomorrow.

> Even when efficient solutions have been known for ages, we still struggle
> with the same problems: package management, build systems, compilers,
> language design, IDEs.

> You need to occasionally throw stuff away and replace it with better stuff.

> Business won’t care. Neither will users. They are only learned to expect
> what we can provide.

> There’s no competition either. Everybody is building the same slow, bloated,
> unreliable products.

> The only thing required is not building on top of a huge pile of crap that
> modern toolchain is.

> I want something to believe in, a worthy end goal, a future better than what
> we have today, and I want a community of engineers who share that vision.

[1]: [https://webring.xxiivv.com](https://webring.xxiivv.com)

~~~
ibdknox
I'm working on tools/interfaces at Relational AI, which is doing really cool
work in the declarative languages space. It was started by several of the
folks whose papers were foundational to Eve. :)

I agree with the post, though as others have pointed out, it doesn't really
dive into the fact that this problem is systemic and would require a shift in
incentive structure.

I think the last quote you have is one of the most important missing pieces
for making a meaningful change in this space. A lot of people want something
better, but right now, as a community, I don't think we really know what that
is. What is the complete story for an ideal version of software development?
And by that I don't mean idealized examples, I mean the ideal version of the
real process we have to go through. What does perfect look like in the world
of changing requirements, shifting teams, legacy systems, crappy APIs, and
insufficient budgets? If we could show that - not the simple examples we had
for Eve, but something that addresses the raw reality of engineering - I think
it would just be a matter of beating the drum.

------
syntheticnature
Some valid and useful points wrapped up in a pile of failure to do even
trivial research (e.g. Google Play Services isn't what the author thinks, the
iOS 'nothing changed' hand-wave) and the sensibility of someone walking around
an art show saying "I could do that, better." The author could bear an
introduction to Chesterton's Fence, if nothing else, and a review of their
apparent GitHub profile points to, perhaps, needing some time spent in the
land of embedded systems to understand why a phone doesn't just boot in 1s.

~~~
zamadatix
I caught the play services as well but rather than try to take down the
article for not being perfect I've been trying to think about what it's
talking about.

I don't think anything mentioned in the article is unrealistic, including the
1-second phone boot time you bring up. The only reason we can't get a phone
with a static hardware configuration to boot in a second is that that's not
what we've been optimizing phones for. LinuxBoot is a great example of how
boot time is purely a function of how much effort we assign people to
optimizing it, not some law of physics that requires servers to take 7
minutes to boot.

------
bronlund
I've seen several similar posts like this, but this is the best one yet.
Kudos!

------
tyzerdak
Don't feed the Russian troll. These trolls take 1% truth and mix it with 99%
lies.

------
shiftless
To put a finer point on it, software today is absolute garbage. I've been
screaming about this for decades.

All of this bloated 'shitware' today is the result of it having been written
by people who a) have no deeper understanding of what the computer is actually
doing; also known as typical Python/Java/etc/etc/etc programmers, and/or b)
simply not giving a damn about conservation of resources--as further evidenced
by all of the other extremely wasteful and destructive habits they hold in
their personal lives, and in their societies in general.

After all, this is the same civilization that's burning through increasingly
vast quantities of oil at an astounding rate, despite the fact that previously
existing abundant and cheap oil is nearly depleted, with no possibility of
replenishment or replacement. So is it any surprise that foolish developers
also burn through CPU and memory with reckless abandon?

Really, the problems we face aren't just in software; they're more about the
foundations of our entire Western 'civilization.' Such problems generally tend
to be rather intractable, in the historical view.

I'm working to construct, in my own computing life, something of a 'personal
oasis', which is increasingly removed and estranged from all of the horrible
things I see Other People out there having to suffer in their personal
computing lives, thanks to talentless 'developers' who _Just Don't Fucking
Care_. Some of these pricks actually have the audacity to call themselves
'engineers', even.

~~~
uncle_j
> To put a finer point on it, software today is absolute garbage. I've been
> screaming about this for decades.

That statement contradicts itself. Is this satire?

> All of this bloated 'shitware' today is the result of it having been written
> by people who a) have no deeper understanding of what the computer is
> actually doing; also known as typical Python/Java/etc/etc/etc programmers,
> and/or b) simply not giving a damn about conservation of resources--as
> further evidenced by all of the other extremely wasteful and destructive
> habits they hold in their personal lives, and in their societies in general.

Everything has a cost/benefit associated with it. Pretending otherwise shows
how little you know about engineering.

> I'm working to construct, in my own computing life, something of a 'personal
> oasis', which is increasingly removed and estranged from all of the horrible
> things I see Other People out there having to suffer in their personal
> computing lives, thanks to talentless 'developers' who Just Don't Fucking
> Care. Some of these pricks actually have the audacity to call themselves
> 'engineers', even.

I dunno how far your head has to be up your own behind to actually believe
this. It's not like most developers these days work on large software
projects that are normally poorly costed, estimated, and planned, usually
with stifling restrictions because certain "enterprise" technology is
mandated by some architect who hasn't written a line of code in a couple of
decades. Most people like a consistent, regular income that allows them to
support themselves and their family. I suspect these concerns are more
important than the incoherent ramblings of some guy on HN.

------
qaq
Well, complaining about it does not change anything. Support the projects
that you believe will help with these problems with $, and things might
actually change.

~~~
catalogia
> _Well complaining about it does not change anything_

The twelve-step program may be controversial, but 'admitting you have a
problem' is generally the first step of _any_ productive approach to problem
solving.

------
lazyant
Whenever I see a rant that is a blanket "all is bullshit" list of complaints,
I check if there are any actionable proposed solutions. The author gave none, so I
dismiss all this with a flick of my hand.

------
CivBase
I get really tired of hearing people complain incessantly about how inefficient
software is. I want it too, but it isn't going to just happen.

Efficiency is a selling point that most users don't care much about in most
markets. There are efficient browsers out there, but everyone uses Chrome
because those browsers are inferior to Chrome in many other ways, ways that
are more important to the average browser user.

If there's a market for a more efficient software solution, go make it and get
rich. Otherwise, I'm getting sick of the complaining.

------
erikbye
Here's why I'm slightly annoyed by articles like this one. Oftentimes the
"software is slow" mantra rings true, but here's the thing: everyone
repeating it claims it's the shit further down the stack that's the cause of
the slowness, and this is often untrue. V8 is fast; it's your shit JS code
that is slow. PostgreSQL is fast; your shit queries are slow. We live in the
age of the Stack Overflow programmer. Think about it for just a second: what
requires the most competency? Writing V8 or PostgreSQL, or churning out some
JS for a web app or Electron? It's not the programmers working on the former
who are unconcerned about performance. They spend considerable effort on it.

The least competent programmers are the ones writing slow code. The least
competent programmers are the ones working at the top of the stack.
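As a minimal sketch of the "fast engine, slow code" point (the data and
function names below are my own, purely for illustration): both functions
intersect two id lists and return the same answer, and V8 executes both at
full JIT speed, but no engine can rescue an accidentally quadratic algorithm.

```javascript
// O(n * m): ys.includes() rescans the whole array for every element of xs.
function slowIntersect(xs, ys) {
  return xs.filter((x) => ys.includes(x));
}

// O(n + m): one pass to build a Set, one pass to filter.
function fastIntersect(xs, ys) {
  const set = new Set(ys);
  return xs.filter((x) => set.has(x));
}

// Hypothetical workload: two overlapping ranges of 20,000 ids each.
const n = 20000;
const a = Array.from({ length: n }, (_, i) => i);
const b = Array.from({ length: n }, (_, i) => i + n / 2);

console.time("slow");
slowIntersect(a, b);
console.timeEnd("slow");

console.time("fast");
fastIntersect(a, b);
console.timeEnd("fast");
// "slow" typically takes orders of magnitude longer than "fast",
// despite running on the same engine with the same input.
```

The point is that the 400-million-comparison version and the linear version
look almost identical at the call site, which is exactly how this kind of
slowness ships.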

~~~
lerpapoo
Yea, I don't like articles like this for the same reason. Nobody has the
brain power (yet) to rewrite the entire stack in a GPU shader and add in
complex logic and a dynamic interface. It's an optimization vs. delivery
tradeoff: if someone is getting paid to build a product, they must deliver it
with the tools at hand. You can invest time to optimize it, but it will come
at the cost of fewer features.

This also depends on the industry. If you are building lower-stack drivers
that others will depend on, you care about performance more. Higher-level
applications are more user-oriented and care more about UI and features.
Organically, people tend to spend their time and braincells in the most
valuable way they can, and for a lot of products, especially in the web
world, features hold more value than performance.

Once we get the AGI thing going, we can just task it to strip and redesign
the entire OS down to the kernel, tailored for every user: strip everything
down to only the buttons the user clicks. Some granny only knows how to open
up Google and read her newspaper website; all that other code can be removed
lol.

