
Niklaus Wirth was right and that is a problem - bowero
https://bowero.nl/blog/2020/07/31/niklaus-wirth-was-right-and-that-is-a-problem/
======
ummonk
The referenced AppSignal post about 13,000 dependencies for a todo list is
conflating dependencies required for the build tooling with dependencies that
are actually bundled into the web app.

It's also hilarious that right after referencing the dependency problem in the
JS ecosystem the OP then goes and advocates splitting up your library into a
bunch of mini libraries. That's exactly how we got into this mess in the first
place.

Also, computers have definitely gotten faster. There was a time period
during which this was not the case, but particularly the switch to SSDs
resulted in a massive jump in computer responsiveness. (The previous jump
happened when we no longer needed to wait for dial up to establish a
connection)

~~~
LeftHandPath
I agree with the core point of the article, but I agree with your points as
well. The conclusion of the article is definitely wrong.

I recently buddied up with someone who has a CS degree to try and make a
desktop app, as a side project. I asked if he is familiar with MariaDB and
whether or not he's used C++ ODBC.

Our discussion quickly turned into choosing libraries / existing code. He uses
Spring and Hibernate and was bewildered by the concept of tying columns to
application variables "manually". After I told him I have some CentOS servers
where we could put a shared database, he bought a Windows Server because he
"needs a GUI". He added, "Ideally, after it's set up, we'll never even have to
remote in to it." For the actual desktop application, he wants to use
Electron.

It seems there are two distinct branches of computing emerging - one where
performance matters, and one where it doesn't. Performance will always matter
in places like the stock market or on IBM mainframes. In the consumer-facing
world, all that seems to matter is perceived performance. Slack and Discord
seem fast, when you watch how quickly a new message pops up on your laptop
screen after you sent it on your phone. They seem egregiously slow when you
open up task manager and see just how much overhead the chromium-based engine
adds -- but most people won't care.

Applications made for "consumers" tend to be made like a cheap car - corners
are cut and the end result isn't pretty, but things in that category make up
90% of ordinary use cases. Slack isn't meant to be open on your work
machine while you're compiling code in the same way a Prius isn't meant to
chauffeur top-level executives. It doesn't mean that the Prius is bad or
unimportant - it will do far more for more people than the entire lineup of
many luxury car brands.

But I am damn sure that I'd rather be engineering a Bentley than a Prius.

Edit: In the metaphor, luxury and performance are sacrificed, not efficiency.
I probably should've used Fiat Chrysler, but unlike Fiat Chrysler, consumer-
facing software has its place. I just don't enjoy working on it myself.

~~~
viraptor
> but most people won't care.

Charity Majors said about reliability: "Nines don’t matter if users aren’t
happy."

I'd propose an alternative view here: "Bloat doesn't matter if users are
happy."

~~~
heavenlyblue
Users are happy until they see better :)

Today bloat doesn’t matter; tomorrow you’re a fat dinosaur.

~~~
viraptor
Sure, but... that confirms this view.

~~~
heavenlyblue
This view is more akin to working class parents telling their children to
become a builder because “people have always been building”.

You should not outsource your main product. And if your product is a chat
application that is supposed to be fast and always on the background, then
you’re outsourcing your main area of expertise.

------
hinkley
I was expecting Moore's Law to give us a renaissance in algorithmic thinking,
but The Cloud has shown me I was wrong. First, we're going to have to fully
explore Amdahl's law.

Eventually every problem goes to log n time, best case. The log n factor shows
up over and over, from constraints on on-chip cache to communication delays to
adding numbers of arbitrary size. We make a lot of problems look like they are
O(1) until the moment they don't fit into one machine word, one cache line,
one cache, into memory, onto one machine, onto one switch, into one rack, into
one room, into one data center, into one zip code, onto one planet.
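The ever-present log factor is easy to see with a toy binary search (a Python sketch for illustration; the comparison counter is instrumentation, not part of the algorithm):

```python
import math

def binary_search(sorted_list, target):
    """Binary search that also reports how many comparisons it made."""
    lo, hi, comparisons = 0, len(sorted_list), 0
    while lo < hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo, comparisons

# A thousand-fold increase in data costs only ~10 extra comparisons:
# the work grows with log2(n), not with n.
for n in (1_000, 1_000_000):
    index, comps = binary_search(list(range(n)), n - 1)
    print(f"n={n}: found at {index} in {comps} comparisons "
          f"(log2(n) ~ {math.log2(n):.1f})")
```

Doubling the data adds roughly one comparison; multiplying it by a thousand adds about ten.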

If we can't solve the problem for all customers, we dodge, pick a smaller more
lucrative problem that only works for a subset of our customers, and then
pretend like we can solve any problem we want to, we just didn't want to solve
that problem.

~~~
agumonkey
Herb Sutter had a wonderful talk on the constant-factor plague. As much as I
like Clojure and similar convenience and algorithmic beauty... I can
appreciate the devil-is-in-the-details much more since that video (forgot the
title, sorry).

~~~
pcstl
I went looking for the talk and all I could find was this paper:
[https://www.researchgate.net/publication/337590358_Inflation...](https://www.researchgate.net/publication/337590358_Inflationary_Constant_Factors_and_Why_Python_is_Faster_Than_C)

I'd like to watch the actual talk. I hope you can remember the name.

~~~
agumonkey
maybe this one
[https://www.youtube.com/watch?v=L7zSU9HI-6I](https://www.youtube.com/watch?v=L7zSU9HI-6I)
.. damn memory (sic)

------
horsawlarway
This article is the problem.

It's the same freaking fallacy I see again and again on here - It's simple,
easy to understand, and dead fucking wrong.

The vast majority of the complexity you're dealing with in modern computing
comes from three sources, in order of impact:

1. Networking. It turns out there are real and hard limits on how fast we can
pass data around over copper wires. Fiber is better, but you just literally
cannot move faster than light, and latency is a big deal when the items you're
accessing and changing live somewhere else (and basically everything of
_value_ does live somewhere else, or thinks you are "somewhere else").

2. Security. This is a direct consequence of number 1. When everything is
connected, _everything is connected_. You can't just lock the lab door and
call it a day now.

3. Compatibility. This is a direct consequence of both 1 and 2. Value is a
consequence of compatibility (this is why a good chunk of you on here still
support IE, even though you don't want to). We have more devices, of more
kinds than ever before. We have more people, with more use cases than ever
before. There is value in being as compatible as possible with all those
devices and people. All those devices are connected in ways designed to keep
them compatible, but also secure. It turns out this is not an easy task.

If you'd like to go wank off over how fast your pre-network, unsecured,
unsupported, inaccessible and manually configured systems are, be my guest
(oh, and I hope you read English...). The rest of us will continue to produce
items of value.

~~~
sheepz
No, it's the other way around. Advances in computer hardware have allowed the
use of more inefficient programming languages, allowing more inexperienced and
unskilled programmers to create programs, leading to more resource-hungry
programs. When there are few resource constraints, the only real constraint
becomes developer time.

I don't see how having IDEs implemented in browsers has anything to do with
security, the speed of light or compatibility. It's just the lack of
constraints allowed by advances in computer hardware.

Most software is written with no performance considerations in mind at first
and the performance issues are addressed only when they become visible.
However, if there is abundant memory available, why bother?

~~~
baryphonic
> I don't see how having IDEs implemented in browsers has anything to do with
> security, the speed of light or compatibility. It's just the lack of
> constraints allowed by advances in computer hardware.

This isn't a compatibility issue? We've seen about 8-16 branches of the
write-once, run-everywhere tree over the past 25 years; I'm not sure how that
isn't seen as a constraint on programmers. JWT, Swing, Web, Cordova, Qt,
React/<web front-end> Native, Xamarin, Electron, Flutter and even quirky ones
like Toga have all attempted to solve this problem. The only unifying thread
has been that managers follow greedy algorithms and choose the most
lowest-common-denominator platforms possible. Qt, the Java tools and Xamarin
at least can't be lumped into the inefficient-language bucket, though the UX
is just awful. Other than hardware drivers, it's hard to think of a clearer
example of compatibility constraints.

~~~
Sebb767
> JWT, Swing, Web, Cordova, React/<web front-end> Native, Xamarin, Electron,
> Flutter and even quirky ones like Toga have all attempted to solve this
> problem, and

... and for the most part they have. You can write your app right now and the
only thing you need to worry about is screen size. If you use bootstrap, even
this is mostly solved. Your app is accessible on Windows, Linux and Mac;
Chromebooks and Tablets; iPhones, Android and even the one Symbian user. Of
course it's not perfect yet, there _are_ edge cases and you cannot do
everything, but let's not act like things have gotten worse.

> The only unifying thread has been that managers follow greedy algorithms and
> choose the lowest common denominator platforms as possible.

Yes, I agree. But for nearly every use case, it's good enough. Take HN as an
example: Does it need anything more?

Of course, if you need access to specific hardware, you'll have to go deeper.
But if you do not, it would simply be you taking the lowest common
denominator. And I'd argue that the framework probably did a more thorough
search.

~~~
baryphonic
I basically agree, with the caveat that I'd still prefer a world where our
write-once-run-everywhere lowest common denominator at least required native
widgets and an ability to integrate new platform-specific capabilities at the
expense of writing a small amount of native code, rather than barfing up web
UI/UX at users (e.g. the execrable MS Teams).

------
wrs
The argument got rather confused at the end. The todo list has 13,000
dependencies precisely because the NPM community follows the advice here to
create many small libraries. So is that supposed to be a good or bad thing?

~~~
noobermin
Yeah... not sure where they were going there. Many small libraries become a
problem when said libs go out of sync; I mean, that is where dependency hell
bites.

~~~
Gibbon1
After 40 years I've decided that dependencies are the root of all evil. There
is
something to be said for curated libraries and frameworks.

------
jesstaa
>A good way to start would be to split up libraries. Instead of creating one
big library that does everything you could ever possibly need, just create
many libraries. Your god-like library could still exist, but solely as a
wrapper.

This isn't it. It's literally what is creating the problem. Small libraries
mean duplication, a lack of shared abstractions and dependency hell. The
reason garbage collection was such a huge win is that before it, every C
library shipped with its own completely different way of managing memory, and
that was a nightmare.

We need bigger libraries providing better abstractions that are reused
throughout and written to allow composition and dead code elimination ('tree
shaking' for the JS crowd) to work well. So you can opt in to functionality
you need and opt out of functionality you don't.

Communities working on big libraries can actually do good release planning,
backwards compatibility and timely deprecations. React is an example of a
library that does a really good job of this. All the smaller libraries in the
JS ecosystem are the cause of the pain in modern JS development.

~~~
m463
I think we should continually fold new libraries into the language. Basically
"batteries included": collapse the dependency tree.

I found that the standard python library has enormous functionality that makes
it possible to solve many problems in a self-contained fashion.
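As an illustration of that self-containedness (the toy log data here is invented), here's a parse-aggregate-persist-export pipeline with zero third-party dependencies:

```python
# Parse, aggregate, persist and export a toy access log using only the
# standard library: no package manager, no dependency tree.
import json
import sqlite3
from collections import Counter
from urllib.parse import urlparse

lines = [
    "https://example.com/a 200",
    "https://example.com/b 404",
    "https://example.com/a 200",
]
# Count hits per URL path.
hits = Counter(urlparse(line.split()[0]).path for line in lines)

db = sqlite3.connect(":memory:")  # a full SQL engine ships with Python
db.execute("CREATE TABLE hits (path TEXT, n INTEGER)")
db.executemany("INSERT INTO hits VALUES (?, ?)", hits.items())
report = json.dumps(dict(db.execute("SELECT path, n FROM hits")))
print(report)
```

URL parsing, counting, SQL storage and JSON serialization, all out of the box.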

This sort of solution will give you a big toolbox spread out in front of you.
You will be less likely to re-invent the wheel. Many eyes on the standard
library may lead to optimizations used by everyone.

And it makes it possible to share effectively. You can talk about functions
with others using common terms. You can share your code and it will work in
other environments. Education can teach in an unambiguous fashion.

------
markbnj
> 1995 was the year in which programmers stopped thinking about the quality of
> their programs.

Oh, please. There are some good points in the article but that hyperbole was
unnecessary. I was programming in 1995, and nobody stopped thinking about the
quality of their software.

~~~
wglb
I’ve been programming since 1965, read Dijkstra, Niklaus Wirth. And these days
I break software at people’s request. There is in fact much less concern and
emphasis on quality now.

It is also interesting to note that Knuth does not do libraries.

~~~
markbnj
> I’ve been programming since 1965, read Dijkstra, Niklaus Wirth.

Well sir, you have me by ten years, and believe me I respect that extra
decade. That said, I think you're falling into the trap many of us greybeards
tumble into as we get older: the things we cared about and focused on become,
in our minds, the right and true perspective that has been lost or disregarded
by younger practitioners. It's really not very different from "When I was your
age I walked to school every day, uphill, both ways!" Or maybe it's just me.

Yes, there are far fewer programmers hand optimizing assembly code, and yes
every programmer now cobbles together applications by reusing code written by
other programmers, some of which is very good, and some of which is not. But
if programmers were still spending their time optimizing low level loops and
eschewing any code that they did not personally write and verify the beauty
of, the world we have today would not exist. Instead of zooming with my family
in the middle of a pandemic we'd be exchanging emails, or posting on a BBS to
say "Hi!" There's obviously still a need and role for people who like to work
at that level, and that kind of engineering remains fascinating (one of the
reasons I love to read the linux kernel mailing list), but I don't bemoan the
rise of high level languages, libraries, package management ecosystems,
frameworks and the like. That stuff has given us the world we have today, and
a few warts notwithstanding I still like it much better than the one we had.

~~~
wglb
Whippersnapper. I grew up in Montana back when winters were severe. I did walk
to school—about 25 feet as my parents drove me.

None of what you seem to think about my rant represents my position. Read my
other note in this thread about the NYT article concerning Knuth and Norvig’s
commentary. There is a time today for a total deep dive. There is skill
involved in doing this, and wisdom in knowing when to do it.

There are folks who write with minimal libraries, cf. qmail.

What seems to be completely missing from today’s discourse about programming
is something Dijkstra said about interrupts. Paraphrasing: “If you don’t see
the code on the page in front of you, you will make mistakes.”

Take a look at modern Java. The levels of abstraction in use require a serious
deep dive to truly know what is going on. There is a famous Node package issue
where code that wasn’t even on your computer crashed a swath of applications.

Quality in the context of the article means the code is pleasing to read,
doesn’t crash unexpectedly and doesn’t have side effects that you may only
discover when Brian Krebs emails you about your customers’ data ending up in
some remote online flea market.

------
rossdavidh
There is an analogous problem in city design, where it was discovered that it
is empirically not possible to get the average commute below a certain time,
because as more roads are built, people move further away from town. If you
add more lanes to your roads, you don't get a faster commute, you get more
sprawl.

People spend time making software less bloated only when it's the number one
problem they have. When hardware speed takes care of making it only the #2 or
#3 problem, they work on whatever the #1 problem is, meanwhile adding more
software bloat.

When Moore's Law once and for all stops, due to some law of physics reason,
then software bloat will become a priority. Until then, other things are.

~~~
cortesoft
Yeah, you are describing Braess's Paradox:
[https://en.wikipedia.org/wiki/Braess%27s_paradox](https://en.wikipedia.org/wiki/Braess%27s_paradox)

It is a form of Induced Demand:
[https://en.wikipedia.org/wiki/Induced_demand](https://en.wikipedia.org/wiki/Induced_demand)

------
Sebb767
I guess it's more of a natural selection process - the software house which
can deliver a good enough _working_ piece will (nearly) always beat the one
which adds another two years of development time (and cost) to make the app a
bit snappier. Ask Lotus Notes and Netscape.

I don't want to say that performance does not matter at all - it does - but
with hardware being as cheap as it is and developer time and time-to-market
being as expensive, optimizing that last 500ms and 200MB out simply is not
going to be worth it.

And let's not disregard the expense of performance optimization - you'll not
only need a reference to benchmark and test against, but also spend a lot of
time debugging and writing very platform-specific code with tons of edge
cases. It's not like saving 2GB of memory comes for free.

~~~
mcguire
Multiply that 500ms and 200MB by more than a few thousand users and you are
talking about real time and money.

Everything seems to have been optimized for the enterprise market.

~~~
Sebb767
That's why I've mentioned time to market. Take GitHub as an example: it's
neither the fastest/most lightweight git hosting page (that would probably be
Gogs/Gitea) nor the most feature-rich (GitLab is far more advanced, as far as
I can tell). They are where they are because they've made a viable product
first. Same with Slack: Electron is not a good solution, but it was a desktop
app in 5 minutes - can't beat that.

I'm not trying to say performance doesn't matter - I'm using lower-powered
devices myself - but development time is also a big factor for B2C.

------
pacaro
What gets me is part of the '95 quote about computing in the 70s

"About 25 years ago, an interactive text editor could be designed with as
little as 8,000 bytes of storage"

Such a text editor likely couldn't handle lowercase in English, let alone any
other Latin-script language, let alone CJKV or bidi. The bloat in software of
'95 and the present day is real, but there is no real effort to make an
apples-to-apples comparison of what our expectations of software are, which
massively weakens the argument.

Parallel arguments can clearly be made for compilers etc.

~~~
userbinator
[http://www.texteditors.org/cgi-bin/wiki.pl?TinyEditors](http://www.texteditors.org/cgi-bin/wiki.pl?TinyEditors)

A lot of these are less than 8K, all of them can handle lowercase, and I bet
the majority of them will be fine with "high CP437" bytes (so other Latin
languages).

~~~
pacaro
Of course, but editors written in 1970 (25 years before the Wirth quote in
question) often couldn't handle lowercase (and neither could early versions of
Pascal), often because platform support was limited or missing.

Much of the capability of a modern tiny editor comes from the environment in
which it is running. We just expect more from an editor now, and the developer
expects more from their operating system.

------
UweSchmidt
There are many complaints but there doesn't seem to be a real movement to
make/use/cultivate small and fast software for all purposes.

I'd join.

Fragmented parts could be suckless.org, old cheap thinkpads from ebay, fast
Linux distros, unix command line tools, retrocomputing, raspberry pi; all
things with communities and fans who like a certain quality, simplicity and
the good old days.

~~~
black_puppydog
check out the suckless suite of software. it's neat :)

~~~
xvilka
Suckless is neat, but it's stuck in the past, in a way. Using modern
programming languages and native frameworks is the way to go. But the
philosophy is exactly what's required.

------
mabbo
Software speed is a usability feature, and usability is a subset of economic
value.

It turns out, people are perfectly happy to put up with slightly slower
software for all of the other benefits we get from modern software: rapid
development, rapid deployability, ease of code comprehension, and more.

When users complain about slow software enough to buy a different product (in
whatever variation of 'buy' that may be), then it becomes a high-priority
thing to fix. Software today is precisely as fast as it needs to be - and no
more, typically.

To me, this article is just the software engineer's version of Grandpa
complaining that things were better in the old days- he's ignoring the reasons
why things changed.

~~~
Barrin92
>people are perfectly happy to put up with slightly slower software for all of
the other benefits we get from modern software

I don't think this is true. I think this is more of a 'boiling frog'
situation. Increase the temperature one degree at a time and it won't jump out
of the pot.

Every individual piece of software slowly eating up more resources is
something the user barely notices or doesn't even attribute to the software in
question, but give it a few years and everything grinds to a halt, _and people
very much dislike this_, hence the infamous 'Windows rot' that everyone has
suffered from at one point in their lives.

------
realtalk_sp
"In economics, the Jevons paradox (/ˈdʒɛvənz/; sometimes Jevons effect) occurs
when technological progress or government policy increases the efficiency with
which a resource is used (reducing the amount necessary for any one use), but
the rate of consumption of that resource rises due to increasing demand.[1]"

[https://en.wikipedia.org/wiki/Jevons_paradox](https://en.wikipedia.org/wiki/Jevons_paradox)

~~~
Gibbon1
See modern IRC clients that take 500MB of memory on the desktop. Compare with
OrCAD Capture, a mid-'90s schematic capture program that would run acceptably
on a 486 with 16MB of memory.

~~~
realtalk_sp
To be clear, the "efficiency with which a resource is used" in this case can
be represented roughly by dollar per unit compute. As that value goes down,
you would expect software "waste" to go up. That's how the Jevons Paradox
applies here.

------
jwdunne
There are a few issues with this post. It starts off great but falls down
hard.

For one, PHP did not start out object oriented. Nor did it come with an IDE.
Sure, it was dynamically typed (unlike Java). If I recall, it was 1994 too but
I can’t be sure.

It was only PHP4 that added an abomination of a class system that motivated
the current object system in PHP5. In fact, the problems with PHP have more to
do with that pattern: creating abominations that motivate the development of
something that isn’t a total abomination (often whilst retaining said
abomination for a while).

And, as others have said, small “libraries” is exactly why the todo app needs
13k NPM packages.

I say “libraries” but you can’t call them that. I mean, where’s the analogy? A
library of one function? Behave.

------
thelazydogsback
For the most part, the output object/exe size should be the realm of the
tooling, not the programmer's concern -- a dependency-walking package manager,
linker or loader -- and in dynamic languages, unused code should never be
JITed. Of course, surrounding issues such as choosing _bad_ or incompatible
libraries, etc., are another matter.

~~~
Someone
There’s no way a linker, no matter how smart, can give us back the object size
and performance of the ‘90s.

For starters, there’s 64-bits. Pointers were a quarter of the size in the
‘90s.

Then, there is Unicode. All programs need ICU
([http://userguide.icu-project.org/icudata](http://userguide.icu-project.org/icudata)).
Even if you
dynamically link it, many of its symbols (or entry point IDs) end up in your
executable.

Unicode isn’t an exception, though. Every library choice you make adds a bit
more in size and takes a bit more in performance than the solution from the
‘90s would have.

For example, the moment you decide to use json, you get its entire feature set
(arbitrarily nested arrays, a multi-line string parser, ability to read fields
in arbitrary order, etc), even if all you need to do is pass two integers and
get one back.

A parser generator that generates code for the json subset you need would help
here, but would mean extra work for the programmer and the overhead typically
isn’t that large, so why bother? It all adds up, though.
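As an illustration of that trade-off (the two-integer message format and the function names here are invented), compare the general-purpose route with a parser for exactly the subset needed:

```python
import json
import re

def parse_with_json(msg):
    """General route: json accepts arbitrarily nested documents,
    even though this protocol only ever exchanges two integers."""
    obj = json.loads(msg)
    return obj["a"], obj["b"]

# Subset route: a parser for exactly the one message shape we need.
# It rejects everything else and carries none of the general machinery.
_TWO_INTS = re.compile(r'^\{"a":\s*(-?\d+),\s*"b":\s*(-?\d+)\}$')

def parse_two_ints(msg):
    m = _TWO_INTS.match(msg)
    if m is None:
        raise ValueError('expected {"a": <int>, "b": <int>}')
    return int(m.group(1)), int(m.group(2))

msg = '{"a": 2, "b": 40}'
assert parse_with_json(msg) == parse_two_ints(msg) == (2, 40)
```

Both agree on the one message that matters; only one of them drags in a full recursive-descent parser, string-escape handling and nested containers to do it.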

Even if you can’t remove some code, you still could optimize memory layout to
move code you expect will rarely run into separate code pages so that it
likely will never be mapped into memory, but that’s serious work.

And of course there’s all that argument checking/buffer overflow protection
people do nowadays that ‘wasn’t necessary’ in the ‘90s.

~~~
thelazydogsback
The article clearly focuses on size due to the use of libraries.

As for pointer size, personally I agree -- most processes can get by just fine
with their own 32-bit address space, so I'm not sure why we need to double the
working-set size of all pointer-based data-structures.

~~~
zozbot234
> so I'm not sure why we need to double the working-set size of all pointer-
> based data-structures.

If your data structures can fit in a 32-bit address space, you can just place
them in an arena w/ 32-bit indexes. You do need to use a custom allocator for
every element of that data structure, but other than that it ought to be
feasible. Link/pointer-based data structures should be used with caution
anyway, due to their poor cache performance.
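A sketch of that arena-with-indexes idea, transliterated into Python for illustration (all names here are invented; in C or Rust the links would be `uint32_t`/`u32` instead of full-width pointers):

```python
from array import array

class ArenaList:
    """A singly linked list whose links are 32-bit indexes into one arena
    instead of full-width pointers: half the link size on a 64-bit machine."""
    NIL = 0xFFFFFFFF  # sentinel index meaning "no next node"

    def __init__(self):
        self.values = array("q")  # node payloads, packed into one arena
        self.links = array("I")   # 32-bit unsigned indexes, not pointers
        self.head = self.NIL

    def push_front(self, value):
        # "Allocating" a node is just appending to the arena.
        self.values.append(value)
        self.links.append(self.head)
        self.head = len(self.values) - 1

    def __iter__(self):
        i = self.head
        while i != self.NIL:
            yield self.values[i]
            i = self.links[i]
```

Here `array("I")` stores each link in 4 bytes on typical platforms, versus 8 bytes for a native pointer or CPython object reference, and the contiguous arena is friendlier to the cache than scattered heap nodes.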

------
sillysaurusx
Sidenote: Niklaus Wirth designed Oberon, one of the coolest operating systems
I’ve ever seen. You can run it on an emulator. It’s basically a graphical OS,
which sounds like an oxymoron, until you realize that the programming language
itself is also graphical.

It’s so unknown that it’s shocking. Imagine designing an entire OS that was
used by dozens of people, and no one knows about it.
[http://worrydream.com/refs/Wirth%20-%20Project%20Oberon.pdf](http://worrydream.com/refs/Wirth%20-%20Project%20Oberon.pdf)

------
jacknews
"Languages like these made programming a lot easier and took many things out
of the programmer’s hands. They were object-oriented and came with things as
an IDE and garbage collection.

This meant that programmers had fewer things to worry about"

Yeah, right. Those lazy programmers.

It's obviously become easier to build complex software, but software is now
required to be much more complex. There are still many things to worry about
(actually far more than previously, I'd say), but they're not the same things.

------
wwright
I think a lot of this is governed by Conway's law: the software reflects the
social systems that built it. If you look at the larger software ecosystem, it
reflects the larger society that built it. The priorities and social customs
of communities are leading to the "bloat." It's hard to say whether that is a
growth on society's part or that software is simply still catching up with
society.

~~~
cryptonector
Back when a ton of software was written by graduate students... it was small
(because no time), fast-ish (because small), and buggy as all heck (because no
time).

------
userbinator
One only has to look at the demoscene to realise that limitations inspire
creativity and efficiency. It's just a form of art, but gives a glimpse at
what computers are capable of, if only we would try to use them more
efficiently. I think award-winning 4k or 64k demos should be required watching
for every software developer. Here are some personal favourites which have
also appeared on HN:

[https://news.ycombinator.com/item?id=11848097](https://news.ycombinator.com/item?id=11848097)

[https://news.ycombinator.com/item?id=14409210](https://news.ycombinator.com/item?id=14409210)

[https://www.youtube.com/watch?v=Y3n3c_8Nn2Y](https://www.youtube.com/watch?v=Y3n3c_8Nn2Y)

------
arexxbifs
> About 25 years ago, an interactive text editor could be designed with as
> little as 8,000 bytes of storage.

That would be in 1970, but my guess is that "ed" would be a hard sell today.

There is plenty of bloat to go around these days and I think we could do a lot
more to address that. But we've all got too much skin in the web game to own
up to the embarrassing fact that a chat program that's basically IRC with
pictures feels like glue on a 2.4 GHz, multi-core CPU.

With that out of the way, we shouldn't get silly, either. Every actually
useful feature added will increase complexity and resource usage. I like
split-window, code-folding, auto-indented, word-completing, syntax highlighted
multiple document editing more than I like saving a few fractions of a percent
of my hard drive space.

~~~
greggman3
That text editor quote really bugs me. Sure, my Atari 800 had a word processor
that came on an 8K cartridge. I couldn't type Chinese, Japanese, Korean,
Russian or Arabic into it. I could only type ASCII. Just the ability to do that
alone would likely entail many many megabytes of code.

For one, there is no text-based system like those old machines that handles
all that, so I have to switch to graphics. Just the font alone for all of those
languages will be multi-megabytes and I'll need multi-megabytes more for
space to rasterize some portion of those fonts. Rasterizing just 4 or 5 glyphs
is more work than that entire 8K word processor had to do on its 40x24 text
screen.

Then for each language I'll need multi-megabytes for handling the input
methods. The input methods are complex and will likely need a windowing system
to display their various options and UX, so add more code for that.

The point being that we need the complexity. That 8k editor wasn't useful in
the same sense as our stuff today. I don't know a good analogy. It's like
complaining that people use CNC machines today and at one point got by with a
hammer and chisel. I'm not going back.

~~~
userbinator
_Just the ability to do that alone would likely entail many many megabytes of
code._

_Just the font alone for all of those languages will be multi-megabytes and
I'll need multi-megabytes more for space to rasterize some portion of those
fonts._

_Then for each language I'll need multi-megabytes for handling the input
methods._

Those statements clearly show your lack of awareness of what things were
really like 40 years ago. They had CJK input and output
([https://en.wikipedia.org/wiki/Cangjie_input_method](https://en.wikipedia.org/wiki/Cangjie_input_method)
was invented in _1976_, for example) on the systems of the time, and that
certainly did not entail "megabytes of code".

What it did entail, however, was a certain amount of skill, creativity, and an
appreciation for efficiency and detail that led to being able to do it with
the hardware of the time, skills which are unfortunately a rarity today.
Instead, we are drowning in a sea of programmers who think the simplest of
tasks somehow requires orders of magnitude more resources than were available
decades ago, when the reality is that there existed software at the time able
to do those tasks perfectly well and at a decent speed.

 _The point being that we need the complexity._

The point is precisely that we _don't_.

~~~
arexxbifs
While parent's "many many megabytes" might be an overstatement, today's
editors are expected to display any number of languages and alphabets
simultaneously, using user-configurable, scalable, variable-width fonts that
render in a variety of different resolutions with sub-pixel smoothing. Things
like that add to the complexity of both the OS and the application itself.

------
coliveira
That's why Chuck Moore, the creator of Forth, used to say that he could create
any software with 99% less code than a commercial version. He always designs
software by thinking about how to keep it as close to the hardware as
possible, removing useless abstractions.

------
rini17
Seems to me the software features interact in a kind of network effect, so
that the amount of resources required grows faster than the amount of
functionality.

------
ericmcer
I do see this as a gap/opportunity in a lot of existing markets. Huge players
who dominate these areas are doing so with these giant, slow, unwieldy web
apps. Look at how Figma managed to take over in the design market by creating
a lighter/faster product than what everyone else was offering. I wholly
endorse any other teams that want to use Wasm to kick stagnant old web apps
off their thrones.

~~~
amoe_
Surely Figma actually took over that market by 1) making a web-based version
of a category of software that was primarily desktop-based, and 2) giving it
away for free? I don't think performance has much to do with it.

------
hyko
Modern software is doing orders of magnitude more than the software of the
past. The feature set of a word processor in 1980 is a rounding error compared
to the 2020 word processor’s print dialog.

The OP’s viewpoint is akin to claiming that aviation has not advanced beyond
Kitty Hawk because modern aircraft are simply wasting fuel when they take to
the skies. It doesn’t take into account the huge differences in the modern
computing environment.

The computers of the recent past weren’t even powerful enough to encrypt a
modern TLS session. Does that mean using all that power to encrypt the session
today is “bloat”? Of course not. _One person’s bloat is another person’s
feature_. Note that sometimes that feature is that the software can actually
exist and be maintained on your preferred platform.

If you think about it, it would be an utterly bleak future if we were just
running the exact same software stack on faster and faster hardware. That
would represent stagnation, not progress.

~~~
TheOtherHobbes
I think you'd be surprised how few extra features Word in Office 365 has
compared to WordStar 2000 considering the forty year gap. And also how poorly
designed, inelegant, and - yes - pointlessly bloated Word feels in comparison.

The biggest differences are screen size - full document vs a few lines of text
- and support for colours and outlining. Also, PDFs, which weren't a thing in
the early 80s.

But the core features in WS2000 are a lot more than a "rounding error"
compared to Word.

------
jb_gericke
The central conceit being that we are all doomed as software becomes more
bloated and less efficient while processing power remains at its current
levels?

While Moore's law may be over in terms of number of transistors per chip, we
are still seeing growth in the number of cores per CPU, not to mention faster
memory and the death of spindle storage.

IMO software will start leaning more heavily towards parallelism (if it
hasn't already, building a threaded or asynchronous application is no longer
an arcane dark art). Couple that with the surge in adoption of distributed
architecture (yes, microservices) and the emergence of languages which
encourage pragmatism and efficiency (a la Go and Rust), and I think we'll be
okay for the next while. Can't speak to Electron though (gross).
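The claim that threading is no longer a dark art is easy to illustrate. A minimal sketch in Rust (one of the languages named above, using only the standard library; the function name and chunking scheme are mine, purely illustrative):

```rust
use std::thread;

// Sum a non-empty slice in parallel by splitting it into chunks,
// one OS thread per chunk. thread::scope (stable since Rust 1.63)
// lets the spawned threads borrow the slice safely, with no
// external crates and no manual lifetime juggling.
fn parallel_sum(data: &[u64], chunks: usize) -> u64 {
    let chunk_size = (data.len() + chunks - 1) / chunks; // ceiling division
    thread::scope(|s| {
        let handles: Vec<_> = data
            .chunks(chunk_size)
            .map(|chunk| s.spawn(move || chunk.iter().sum::<u64>()))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1_000).collect();
    println!("{}", parallel_sum(&data, 4)); // 1 + 2 + ... + 1000 = 500500
}
```

Twenty lines and no locks in sight; whether the resulting ecosystem habits stay this disciplined is the open question the thread is arguing about.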

------
adrianmonk
Incentives are often a big part of the problem. In many cases, people are
writing software that will run on someone else's hardware. This applies to
basically all front-end web development, all mobile apps, and all software
licensed to someone else.

That means the programmer (or their employer) doesn't pay for the electricity
it uses (or battery it drains), the RAM it allocates, the disk space it
wastes, or the hardware upgrades necessary to make it run acceptably.

Why conserve a resource that you're not paying for? Especially if you have to
expend your own resources to do it.

I'm not defending crappy software (nor, apparently, offering a solution), but
if a programmer's personal sense of honor is the only weapon in this fight,
then it's not a great formula for winning the fight.

------
sally1620
There are a lot of these types of articles that conclude that "we", the
developers, need to fix this problem. But if you read the original article by
Prof. Wirth, it is not up to "us".

It is the reality of the software industry. Time-to-market is the only rule,
and it dictates how we write software: Agile, Scrum, don't reinvent, deploy
early, fix it later, technical debt.

There are many of us who care about the quality of our work. But in
performance reviews, the only thing that matters is how many features you
shipped. You cannot demonstrate the quality of your work in your performance
review, much less the fact that the software you wrote is going to be bug-free
for the next 10 years.

------
peter_d_sherman
>"Wirth’s law is an adage on computer performance which states that software
is getting slower more rapidly than hardware is becoming faster."

[https://en.wikipedia.org/wiki/Wirth%27s_law](https://en.wikipedia.org/wiki/Wirth%27s_law)

[...]

>"Niklaus Wirth, the designer of Pascal, wrote an article in 1995:

About 25 years ago, an interactive text editor could be designed with as
little as 8,000 bytes of storage. (Modern program editors request 100 times
that much!) An operating system had to manage with 8,000 bytes, and a compiler
had to fit into 32 Kbytes, whereas their modern descendants require megabytes.
Has all this inflated software become any faster? On the contrary. Were it not
for a thousand times faster hardware, modern software would be utterly
unusable.

Niklaus Wirth – A Plea for Lean Software"

[...]

>"Time pressure is probably the foremost reason behind the emergence of bulky
software.

Niklaus Wirth – A Plea for Lean Software

And while that was true back in 1995, that is no longer the most important
factor. We now have to deal with a much bigger problem: abstraction.
Developers never built things from scratch, and that has never been a problem,
but now they have also become lazy."

[...]

"The problem does not seem that big, but try to grasp what is happening here.
In another tutorial that Nikola wrote, he built a simple todo-list. It works
in your browser with HTML and Javascript. How many dependencies did he use?
13,000.

These numbers are insane, but this problem will only keep increasing. As new,
very useful libraries keep being built, the number of dependencies per project
will keep growing as well.

That means that the problem Niklaus was warning us about in 1995, only gets
bigger over time."

Related: "The Law of Leaky Abstractions" by Joel Spolsky:
[https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/](https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/)
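A dependency count like the 13,000 quoted above is something you can roughly reproduce yourself. A sketch in Rust (standard library only; the function name is mine, and the package names in any test tree are illustrative, not from the article):

```rust
use std::fs;
use std::io;
use std::path::Path;

// Approximate the installed-package count of a JS project by
// walking node_modules and counting every directory that ships a
// package.json. This is deliberately crude: it can over-count
// (packages that bundle nested package.json files in dist/
// folders) and it ignores symlinks. Real tools such as
// `npm ls --all` resolve the actual dependency graph instead.
fn count_packages(dir: &Path) -> io::Result<usize> {
    let mut count = 0;
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            if path.join("package.json").is_file() {
                count += 1;
            }
            // Recurse to catch nested node_modules and @scope dirs.
            count += count_packages(&path)?;
        }
    }
    Ok(count)
}
```

Pointing this at the `node_modules` of even a small front-end project makes the growth curve the quoted passage describes very concrete.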

------
Skunkleton
It’s true that lots of software is slower than it should be. It’s also true
that software does a lot more than it used to. What’s not true is that
abstraction is the reason for slow software, at least not on the order of
magnitude that the article claims. Even in the worst case, the cost of
language level abstractions is well outstripped by advances in processing
power.

I’m not actually sure we really even have “slow” software. At least not
relative to how much that software can do. Latency is a different story.

------
usefulcat
“1995 was the year in which programmers stopped thinking about the quality of
their programs.”

There are some valid points in this article, but seriously?

~~~
cable2600
Actually it was the year that managers stoppped taking programmers in that
know how to do quality control and focus on doing it faster with more bugs so
it ships sooner. If you look at some of the MS-DOS programs they rn with
little to few bugs, but Windows 95 ran them in MS-DOS mode. Later on Windows
programs needed updates and bug fixes to make them work faster and every
version of MS-Office fixed bugs but ran slower as CPUs and hard drives got
faster they didn't focus on quality any more just shipping it with bugs and
fix the bugs later on.

------
agumonkey
And the trend is not slowing. That said, some people like low-fat computing.

Another thing: computing is over, at least in its previous-era form. It's not
bringing dreams anymore, and will probably turn into a ubiquitous, invisible
form where intrinsic details such as resource usage won't matter.

~~~
alexisread
I think you might be a little pessimistic here - IoT has a wealth of fruit to
bear, from:

low-power always-on devices with long-range radio (LoRa), few resources (32k
RAM), the security constraints of securing every device, dealing with
terabytes of log data in the cloud, ML at the edge (Kendryte K210), open-
source firmware including radio (DASH7 firmware), open-source hardware (and
open-source FPGA tooling) to create custom hardware designs (hardware
implementations of algorithms), formal verification of OSes and protocols,
etc. etc.

Even a handful of innovations in any of the above would be groundbreaking.

~~~
agumonkey
I'm a bit jaded and at the same time not much.

These are all very advanced low level technology subjects (some of which I
like a lot btw).

What Wirth said only concerns a tiny fraction of people on earth; the rest
will stop using computers just like they stopped using desktops and just
stream/talk on smartphones. If you ask most users, they'll probably root for
whatever Electron app they use over frugal but powerful programs. For the
layman, computing will fade and become like roads. And I believe they never
really needed nor liked computers; it was just a 20-year period during which a
computer was thought to be a technological wonder to have in your home.

------
mikl
All that “bloat” enables us to do (much) more with less developer hours.

The balance is the same as ever, developer time vs. compute power. As compute
power gets ever cheaper, and developer time still costs the same, we put
bigger burdens on the machine to save developer time.

In other words, simple economics.

~~~
IncRnd
I don't think that developer time is the only metric here.

Yes, every project can get to market or to the next release with a startup
company's speed, but that's a trade-off, and you generally get "startup
quality" that way. That is the actual point.

This industry is so fast-moving that the costs associated with that level of
quality are often paid later by other people. There are plenty of examples of
this.

------
weinzierl
He was right in 1995 and he still is. _Bucky_, as he was called at Stanford,
is one of the founding generation of computer science that is still alive.
Being 4 years older than Knuth, he might even be the oldest one.

------
ausbah
I think this post has some validity to its main points, but I personally think
it veers toward "real programmers do X" territory and reeks of rose-tinted
nostalgia.

------
daniel-s
Meh, I think this whole argument is wrong. Software "bloat" is fine.

I'm reminded of this [1] article comparing Bruguet and Carson numbers. You
have ever-cheaper hardware. The marginal utility of any given unit of new,
equally serviceable computing power is going to be less than that of the
previous one. So eventually, if something can be 2% better or more functional
at 10x the bloat, it is rational to accept that. When you have lots of
something, you become less efficient in how you use it. If you're surprised by
that, go and read some economics.

[1] [https://www.cringely.com/2011/05/19/google-at-carsons-
speed/](https://www.cringely.com/2011/05/19/google-at-carsons-speed/)

~~~
xvilka
It comes with a cost; if not money, then environmental footprint. Bitcoin is a
good example. Or the toxic deadlands of computer and mobile scrap, all because
companies think it's cheap and blind developers think it's acceptable to
bloat.

------
hota_mazi
> 1995 was the year in which programmers stopped thinking about the quality of
> their programs.

-yawn-

Prove it.

The number of books and articles about the quality of software engineering
published in the past 25 years certainly seems to prove we care a great deal
about code quality. Probably more than we ever did prior to 1995, when Pascal,
BASIC, FORTRAN, COBOL, and self-modifying assembly code were being taught in
colleges.

As for the point of the article, that hardware doesn't accelerate as fast as
programs using it, same challenge: prove it.

This article is all handwaving and speculation with no data to support any of
its claims.

------
known
Reminds me [http://www.kegel.com/c10k.html](http://www.kegel.com/c10k.html)

------
xvilka
Another solution would be to use proper, statically and strictly typed
programming languages with AOT compilation. Rust, Swift, Kotlin - these are
among the best production-level languages that can produce both native code
and WebAssembly. There are other, similar languages as well. The key point:
when the compiler knows more about your intentions (e.g. via a strict and
sound type system), it can optimize better, especially with LTO or PGO.
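The point about compilers exploiting static type information can be made concrete with a small Rust sketch (function names and data are mine, purely illustrative): the high-level iterator pipeline below carries enough type information that rustc can inline and fuse it into essentially the same machine code as the hand-written loop, the so-called "zero-cost abstraction" property.

```rust
// High-level version: a chain of iterator adapters. With full
// static type information, an AOT compiler can compile this down
// to a single tight loop with no intermediate allocations.
fn sum_of_even_squares_hl(xs: &[i64]) -> i64 {
    xs.iter().filter(|&&x| x % 2 == 0).map(|&x| x * x).sum()
}

// Hand-written version: what the optimizer effectively reduces
// the pipeline above to.
fn sum_of_even_squares_manual(xs: &[i64]) -> i64 {
    let mut acc = 0;
    for &x in xs {
        if x % 2 == 0 {
            acc += x * x;
        }
    }
    acc
}

fn main() {
    let xs: Vec<i64> = (1..=10).collect();
    // Both compute 2*2 + 4*4 + 6*6 + 8*8 + 10*10 = 220.
    println!("{} {}", sum_of_even_squares_hl(&xs), sum_of_even_squares_manual(&xs));
}
```

This is a sketch of the principle, not a benchmark; the claim that the two compile to comparable code holds for rustc with optimizations enabled, and the gap only narrows further with LTO or PGO.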

------
stevefan1999
Yeah, you will always keep writing crappy software while your hardware will
also keep wiping your ass

------
coding123
I feel like we had this debate a couple of months ago. Someone posted that the
computer at his local library could search and display data on available books
super fast. (I think it was a Twitter thread.) The interface was programmed in
the 90s or something. And then they complained about software today.

My reply was that nowadays you can, at home, search for a book on a specific
interlibrary system, find out which specific libraries have it, download and
check out an e-copy, find out the number due back if you still wanted the hard
copy - AND have someone go hold it for you.

It's just not apples to apples anymore. I don't care how fast you can scan a
text file in 90s-written software.

~~~
noobermin
There is no reason you can't have the connectivity without the unnecessary
complexity of modern computing.

~~~
Sebb767
Alternative question: Why do I need to download 50 GB of book index to search
for that one title and that index can't even do a full text search?

That complexity is Google (or your favourite search engine) running a
datacenter indexing exabytes of data so that you can search it in the blink of
an eye. Yes, it takes 600ms now instead of 50ms - but that's like complaining
that your new eco car engine can't even properly run on aftermarket lamp
petroleum.

~~~
noobermin
What does Google's datacenter for a book index have to do with editing a file
or rendering a button? That doesn't seem relevant at all to Wirth's law, and
it isn't a justification for the increase in abstraction that has made things
slower.

------
Marazan
Sniff sniff. What's that I smell.

Gatekeeping bullshit? The scent is unmistakable.

------
wldcordeiro
Ah yes let's pine for the days when ~men were men~programmers were programmers
and we coded real quality, unlike this current trend...

~~~
basementcat
...and furry little creatures from Alpha Centauri were real furry little
creatures from Alpha Centauri.

