
Brilliant Hardware in the Valley of the Software Slump - mpweiher
https://craigmod.com/essays/software_slump/
======
Tom4hawk
The problem is that both hardware and software are garbage.

Spectre/Meltdown & friends are just the tip of the iceberg. We have layers &
layers of indirection/abstraction everywhere. We have hardware that lies and
tells you it has certain properties when in reality it doesn't (examples:
sector sizes in hard drives/NVMe devices, processors still pretending they
behave like a PDP-11), and we have hardware that is flat-out broken. We try to
fix those issues in software.

But in software we have another heap of workarounds, dependencies, and
abstractions, with a sprinkle of backward compatibility. We are now creating
"minimalist" applications with a fraction of the functionality of the software
from 30 years ago, but built on so many layers that the total amount of code
needed to make them work is orders of magnitude larger than what we had back
then.

I know that most programmers never worked with systems where it's very, very
easy to debug the whole stack and you can learn it in a short period, but it's
amazing to have knowledge of EVERY part of the system in your head.

There are some good things going on (like the push to replace C with something
that has similar performance characteristics but without its flaws), but it's
not enough.

Here are two things worth watching:

[https://www.youtube.com/watch?v=pW-SOdj4Kkk](https://www.youtube.com/watch?v=pW-SOdj4Kkk)
\- Jonathan Blow - Preventing the Collapse of Civilization

[https://www.youtube.com/watch?v=t9MjGziRw-c](https://www.youtube.com/watch?v=t9MjGziRw-c)
\- Computers Barely Work - Interview with Linux Legend Greg Kroah-Hartman

~~~
scroot
Simplicity and good design take lots of time and money. Our culture is not
truly ready to make these kinds of investments in the manner required. Why
would they? There is a whole universe of FOSS out there upon which anyone can
cheaply create "working" software. If your goals are short term (quarterly
earnings, looking only a year or two down the road) this is "good enough."
Worse, that FOSS foundation is typically filled to the brim with complexity.
We have created a computing culture that is premised on pushing the extremes
of the teletype model of computing, and tacking what customers think they want
on top of it.

We have good alternate examples from the past (Oberon, Lisp machines,
Hypercard, Smalltalk systems, etc). How often does the new generation of
computing people get exposed to these ideas?

~~~
zozbot234
> Worse, that FOSS foundation is typically filled to the brim with complexity.

Really? In my experience, FOSS tends to be a lot simpler and more streamlined
than non-free software with comparable functionality.

~~~
na85
I agree with 'de_watcher that FOSS/encumbered is an orthogonal axis to
complexity/simplicity.

Lots of FOSS software is excessively complex (the systemd ecosystem of
shovelware comes immediately to mind) and lots of FOSS is simple. Similarly
there are untold thousands of overcomplicated/overengineered proprietary
suites and of course it's hard for a graphical application to get simpler than
notepad.exe.

------
rcoder
Semi-related to this and the recent Haiku R1/beta2 announcement: just for
giggles, I installed that image on a USB3 external SSD and fired it up under
KVM, and was suddenly greeted with a faster, more polished desktop environment
than Windows 10 or Ubuntu offer out of the box. Consistent UI patterns,
applications that open in less than the 250ms perception threshold, and some
truly useful utilities installed (a fast POSIX terminal, programmer's editor,
media viewers, basic but serviceable WebKit-based browser, etc.). All of it
happy to run in 1-2 GB of RAM and two cores of my six-core laptop, even under
virtualization.

Running on an Atom-based SBC I had on the workbench, it's even more
responsive.

Yes, building native apps requires at least a basic knowledge of C++. No, it
won't seamlessly run the latest React SPAs as well as Chrome. The driver
situation isn't as good as (say) Linux or FreeBSD.

And yet, the focus on providing a productive environment for normal computing
tasks instead of endless up-selling to an app store, countless background
updaters, and vendor-bundled crapware is like a breath of fresh air.

This is what we've lost in the move to impose ever more layers of services,
unique per-app GUIs, and ubiquitous (even gratuitous) packaging of "webapps"
in lieu of targeted, native apps.

To be clear: this isn't a unique property of Haiku. A nice clean install of
FreeBSD or Debian has many of the same properties, and I have reason to
believe that a Windows 10 "distro" based on something like Server Core could
be similarly light and responsive.

Unfortunately, the major platform providers seem deeply uninterested in
building systems that don't push you aggressively towards newer hardware,
loads of subscription services, and "strategic" software bundles no one asked
for.

~~~
yellowapple
Yeah, I feel like as soon as Haiku has support for encrypted partitions it'll
be my daily driver (even if it's a performance hit, I'll take that over my
data being in the clear should someone snag my laptop). Implementing support
for it is something I'd love to try my hand at as soon as I can figure out how
I'd go about doing so (namely: how to make an encrypted partition look like a
normal partition to the overall system while still providing some mechanism to
prompt for a passphrase or key).

~~~
waddlesplash
Haiku already has support for encrypted partitions via "DriveEncryption" ...
but no support for encrypted boot drives, yet, indeed.

~~~
yellowapple
Whoa! How long has that been around?

Even without boot partition support, if Haiku's able to keep my user files on
a DriveEncryption'd partition and automatically decrypt it on boot (i.e. with
a password prompt at some point in the boot process or immediately after),
then that's good enough for me.

~~~
waddlesplash
I think DriveEncryption needs a few tweaks to work again now that Beta2 is
out, but it's been around for quite a long time. It was made by one of the
foremost kernel developers, who certainly knew what he was doing. I think it's
based on TrueCrypt, though he may have updated it to VeraCrypt at some
point...

Here's the GitHub, anyway:
[https://github.com/axeld/driveencryption](https://github.com/axeld/driveencryption)

~~~
yellowapple
Definitely looks interesting, and the advertised support for decrypt-on-login
is exactly what I'd need. Doesn't look like it's working on x86-64; I wonder
how hard that'd be to fix...

Whatever the case, thanks! I left enough room on my disk to add an encrypted
partition later (and move the stuff currently in ~ over to it), so hopefully I
can get up and running with this.

~~~
waddlesplash
Probably not that hard to fix; there are a few potential issues that come to
mind, but all have simple solutions if you know where to look (and if you
don't, ask in Freenode#haiku).

------
PaulHoule
I think it's the real problem of the industry. It used to be that Intel had a
strong brand because the new computer you'd buy in three years would knock
your socks off. It isn't that way anymore, and from the top to the bottom of
the stack we should be addressing perceived performance.

My #1 frustration as a Windows user is that every so often an application or
the whole system freezes up for a few seconds. I use macOS sometimes, and also
Linux, and I don't think either one is much better.

For instance, when I launch a command prompt it is often up in <1s, but
sometimes it takes 10s. That's a very simple program, but often there are
security checks involved; even ssh logins to a Linux server on a LAN have a
long-tail distribution in time-to-prompt, because maybe some daemon was
swapped out.

I would be satisfied if I had some visual indication that progress was
happening (ALWAYS, AND WITH <100ms LATENCY) and some insight into the process
(e.g. real progress bars... if Hadoop can do it, why can't pandas?)

We don't want to trouble people with details, but what's the harm in telling
the user we are waiting on a 30 second timeout?

~~~
RcouF1uZ4gsC
> For instance when I launch a command prompt often it is up in <1s, but
> sometimes it takes 10s.

On Windows, one reason for this might be virus scanners. Get rid of all
third-party virus scanners and tune the settings on the built-in one.

I am personally of the opinion that virus scanners are a waste of computing
resources: if you reach the point where a binary you run has been infected,
you are probably compromised in ways a virus scanner cannot fix anyway.

~~~
kyboren
What's more, virus scanners often run with SYSTEM privileges and consist of
hundreds of thousands of often shitty LOC, presenting a huge attack surface.
These days you might get infected _because_ you're running "anti-virus"
software.

------
yummypaint
I've recently been noticing that G Suite apps in Firefox tabs routinely grab
more than 500MB of RAM; it's often more than 1GB just to have a Gmail window
open. Just a few years ago this wasn't the case, but nothing has improved as
far as functionality goes. Does anyone have insight into how we got to a place
where a Google sheet with 4 entries can use more memory than the entire OS? I
used to routinely perform identical tasks on much lesser hardware with greater
speed, so how do these seemingly broad regressive changes happen at the ground
level?

~~~
hedora
I blame the Software as a Service model. Clearly, you should just downgrade to
the old version of google docs (or whatever) that used 10% as many resources,
and had a better UI.

Since you don’t get to control which version you use, teams don’t have to
compete with last year’s version. Therefore their management doesn’t need to
worry about extreme regressions in functionality, so they don’t take action to
avoid them.

Since the service provider is the development team, you’re locked in, and
won’t switch.

Usage metrics look good, management gets credit for shipping ${feature}, and
gets their promotion. This happens a thousand times, and everyone wishes the
irreparably broken ${megacorp} would just go out of business already.

This pattern doesn’t play out to the same extent with hardware, where people
have to pay big lump sums to upgrade, and can hold on to older models / switch
vendors instead.

------
nateburke
Brilliant post.

I think this trend will continue, sadly, until a crafty company is able to
steal 1B of revenue from a flaky incumbent by building a reliable and fast
version of a core utility.

I am reminded of the early google.com homepage. I can remember setting it as
my default homepage in IE because it got to the point that Yahoo would take
MINUTES to load, if not just crash the browser.

Or maybe we need hardware to just suck again, to force the issue.

~~~
tonymet
make a law requiring software developers to use 15 year old hardware when
doing development

------
phlhar
Electron apps are easy to develop, but since each one ships a browser, the
performance is pretty bad compared to native. The performance of all the
Electron apps I have used (Slack, Discord, Spotify, Twitch, ...) is always
worse than native apps, and not just by a little bit. You will notice it! I
understand it is easy to develop an Electron app when you come from a web
background, but it still seems wrong to me.

~~~
plehoux
Co-founder at Missive here, an email client built entirely in HTML/JS:
Electron on desktop, Cordova on mobile.

You can try the app; it's way faster than Apple's Mail. Apple has even
featured the app multiple times on the mobile App Store; again, all HTML/JS.

Making fast software in HTML/JS is definitely possible.

That said, an email client is mostly HTML/style rendering, so our use case
might be better aligned with the web stack than most.

You can learn more about how we made it fast by listening to:

[https://syntax.fm/show/184/desktop-and-mobile-apps-with-a-si...](https://syntax.fm/show/184/desktop-and-mobile-apps-with-a-single-codebase)

~~~
emsy
I haven't tried Missive, but how are memory and battery usage? I've seen fast
Electron apps, so I know they exist. Programmers basically treat memory as if
there's enough of it not to care about, but laptops are still sold with 8GB of
RAM, and the upgrade markups on current devices are insane.

~~~
Junk_Collector
Is it not insane to anyone else that we talk about _only_ having 8 GB of ram
with regards to the performance of mail and chat applications?

~~~
nitrogen
Seriously... 8GB is an ocean. People were reading and editing HTML email in
the days of 16MB.

~~~
efreak
HTML has changed since then, though. HTML email might basically be stuck with
whatever old version of HTML, but nobody writes an HTML renderer just for
email--they embed a fully functioning browser engine in the mail client (or
write the client itself to run in the browser).

------
centimeter
The complexity of software has far exceeded the ability of traditional
strategies to manage it. We know how to reliably scale our software: the
principled application of formal methods. More advanced type systems, proofs
of correctness, principled math-based strategies for software construction,
etc. Yes, all of these are locally more expensive than the slipshod strategies
most companies are using now, but I am confident that if the cost of buggy
software was correctly internalized, it would become clear that improving the
processes we use to create software would leave us a lot better off in the
end.

~~~
asdfman123
It could have to do with a talent shortage. If you're a company that wants to
hire a team of people like that, how much are you going to have to pay for it?

And if you're making, say, a chat app, are you sure you're not going to be
beaten by a bunch of 20 year olds who slap together JS and Electron,
ultimately winning because they get that people want funny reaction gifs built
in?

Maybe it freezes for a half second every so often and takes 10 seconds to
load, but no one cares because 1) it gets to market three times faster and 2)
it has funny reaction gifs.

~~~
centimeter
> how much are you going to have to pay for it

Right, this is the part where you first have to internalize the true cost of
bugs. You can't keep hiring $12k/yr third world sweatshop programmers if you
want reliable software.

> no one cares because 1) it gets to market three times faster and 2) it has
> funny reaction gifs.

This is absolutely true, but I maintain this is mostly because businesses have
an artificially high time preference due to government subsidies of debt, and
in a less distorted market people would probably care a lot more about
quality.

------
entropicdrifter
My grandfather used to say, "The people who sit up front catch all the bugs in
their teeth"

If you're trying to move your full desktop workflow to a platform that only
added the option to work that way recently and as an afterthought compared to
its original use-case, you're going to be catching bugs for quite a long time.
If you're unlucky, your particular use-case for that hardware will _never_
catch on enough for the value of working out those kinks to outweigh the cost.

I know techies like most of us on here love to be early adopters, but you have
to draw a line between using tech as a toy and using it as a tool. If you're
using something that you don't expect to be stable 99.9% of the time, odds are
that it's a toy and you should hold your expectations a hell of a lot lower
for any sort of productivity.

~~~
Nextgrid
The problem is that in some cases you are forced to upgrade your tool and turn
it into a toy. Windows 10 is an example. macOS Catalina (required for latest
Xcode which itself is required to target latest iOS) is another one and I've
been bitten by it.

------
_bxg1
Economics, economics, economics. Companies only put the amount of effort into
their software that's necessary to optimize profits.

Startups are incentivized to move fast and break things, and then to keep
adding to their broken prototype instead of rebuilding the product, because
it's more affordable.

OS vendors benefit from lock-in and hardware is fast enough that the vast
majority of consumers don't notice. If something breaks, you just take it to
the Apple store and they reset it. It's cheaper for everybody involved.

Online ad vendors have no incentive to create a less-than-terrible web
experience because _it 's not their site_ that's being trashed.

On top of it all, there's no regulatory or institutional quality standard.
It's left to be a race to the bottom.

I don't know what the fix is, or whether there is one at all, but we should at
least stop being surprised. We shouldn't really be blaming it on "kids these
days" either, which is an all-too-common refrain. There simply isn't a
business incentive to invest in quality.

------
bcrosby95
Of course, the hardware receiving all this praise has a lot of software
(firmware) behind it too. The line between great hardware and software isn't
really that distinct in my mind. Lots of things that may once have been
mechanical in nature now have software behind them instead, because software's
amorphous nature brings great flexibility.

------
tonymet
I feel that we as software engineers need a Hippocratic oath of software
development. One of the commitments should be to write efficient and
responsive (i.e., responding to user input) software.

I feel both angered and embarrassed at software inefficiency. Trivial Electron
"apps" require 1-2GB+ of RAM. Seconds of input latency when typing on the
keyboard or pressing a button. Not to mention the gigawatts of electricity and
gigatons of CO2 being wasted server-side on poor code, logging, encode-decode,
and other nonsense.

My desktop experience peaked on Windows 2k and has been declining rapidly for
the last 10 years.

~~~
bombela
I don't know why you are being down-voted. I think you make a good point here.

If we were real engineers, not just by pretension, maybe we would have to
abide by a minimal set of rules that respect our users and their wallets.

~~~
tonymet
Agreed. Think about how powerful modern smartphones are, yet the apps still
drag. Many apps still sync the UI with the server, making the UI hang with any
radio hiccup, and even take seconds to respond in good conditions.

------
shakermakr
You know what’s impressed me software-wise? Microsoft on the Mac.

For the home office I pretty much use their software constantly: Office 365,
Teams, SharePoint, VSCode. It’s not just rock solid, but pretty enjoyable, as
it’s all solidly integrated together.

Would never have thought I’d say MSFT having rock solid anything, nevermind on
a Mac, but credit where it’s due.

------
robomartin
> Something strange is happening in the world of software: It’s slowly getting
> worse. Not all software, but a lot of it. It’s becoming more sluggish, less
> responsive, and subtly less reliable than it was a few years ago.

> What baffles about these software moans is that Apple’s hardware is ever-
> more refined. While far from flawless, the entire lineup is now (finally)
> largely free from these “foundational” issues you see in software.

The answer to this is very simple (at some level): It is impossible to produce
mechanical designs that are the equivalent of the software engineering
abominations the world is stuck with today.

Going back to iPhone 3 days, I remember coding a genetic solver in Objective-C
that was an absolute dog. I optimized as much as I could and could only
squeeze so much performance out of it.

I finally gave up and re-coded the entire thing in clean C++. The code was
somewhere on the order of 290 times faster. Objective-C was an absolute dog
because, at a minimum, every data type you had to use was an object-oriented
mess. This was absolutely unnecessary, the proof being that my C++ equivalent
performed exactly the same function and lacked nothing. In fact, it was so
fast that it allowed the use of this genetic solver in real time in the
context of the app. Objective-C, er, objectively speaking, had no reason to
exist. Yet it did, and lots of people believed it had to exist, and likely
many do today. They are wrong.

Another way to look at this is that the clean solution used somewhere on the
order of 200~300 times less energy. This is something hardware engineers are
keenly aware of. Software consumes energy, and badly written software is even
worse. Think of it this way: a bit transition costs energy. Inefficient code
requires more bit transitions per unit of work, and therefore more energy and
more power dissipation.

Imagine the mechanical engineering equivalent: some monstrosity where, instead
of using a simple screw and nut, one ends up using a complex fastener that has
layers upon layers of mechanisms and unnecessary complexity. A screw with the
ability to grow and shrink to any length and diameter, including all the
wrenches, tools, nuts, washers, and devices needed to install, tighten, and
test them. Very quickly, a simple product that could mechanically fit in the
palm of your hand would become the size of a car.

And so, in this way, mechanical and industrial design is always "low level",
like always coding in assembler (not proposing we do that). Sure, materials
and manufacturing techniques improve, yet, at the most fundamental level,
excellent mechanical and industrial design is clean, uncomplicated, easy to
understand and easy to manufacture. It's machine language, or C. Not
Objective-C.

Software engineers who are not exposed to this reality, through no fault of
their own, are not aware of these issues. I don't blame them. If most of what
someone sees in school amounts to Python, their view of reality will be
skewed.

My son is studying Computer Science at one of our top universities and has
less than a year to graduate. He is home now for both the summer and due to
the virus. I've been working on several embedded projects and showed him a few
tricks to improve performance. He was amazed to see how you could optimize
code for execution speed (with sometimes dramatic results) by making simple
high-level choices. For example, a down-counter in a "for" loop ("i=10; i!=0;
i--") is generally faster than the typical up-counter ("i=0; i<10; i++"). This
is due to the fact that many processors have instructions like "DJNZ"
(Decrement and Jump if Not Zero) that don't require loading a comparison
value, engaging the ALU for a comparison, or fetching that value over and over
again from memory.

Software engineering doesn't have the same physical constraints found in
mechanical engineering unless someone is coding for an application and a
domain where excess really matters. One example of this is writing code for
space hardware, where, aside from reliability, you have to be aware of the
fact that every bit you are flipping will cost a quantum of energy and you
might not have enough to be careless about it. Energy quickly translates to
mass in the form of large cooling radiating surfaces that must be launched
into space along with the hardware.

It's an interesting problem this issue of bad or bloated software. Not sure if
there's a solution. There won't be until there's a forcing function that
requires a change in perspective and approach.

EDIT: To be clear, my point isn't to pick on a specific language or languages,
but rather to use easy examples to highlight one of the problems that has been
building up since the introduction of object-oriented programming. I remember
the early days of OO. It was a mess. Much like the early days of desktop
publishing, when people loaded up documents with every font they could find.

~~~
zozbot234
Objective-C will always be an absolute dog for high-performance work because
of how everything has to go through dynamic dispatch and indirection steps.
It's why even Apple is trying to replace it with Swift (that name is not
coincidental!). And Mozilla's Rust offers even better performance than Swift
while supporting a great set of principled, higher-level language features if
you want them.

~~~
saagarjha
The average overhead of a message send using the runtime Apple ships these
days is insanely low; it’s on the order of approximately two nanoseconds a
call. With some minor insight you can bring that down to well under a
nanosecond, literally single-digit clock cycles, just by testing if
indirection is necessary. (For reference, this is about on par with, or on
days I’m feeling a bit confident even better than, a C++ virtual method call.)
There is no way that Objective-C is hundreds of times slower than C++ unless
there is something else going on.

------
Agathos
An Apple keyboard is a funny example to open with. Weren't we just recently
celebrating the end of Apple's ill-advised MacBook keyboard experiment?

~~~
codeisawesome
The end of that failed experiment was marked by the release of a much better
keyboard, which is what ships in the device featured here.

If anything, it further illustrates the author's point: hardware has to
respond to customer complaints, but software can seemingly get as awful as it
wants, and, ironically, customers can't really vote with their feet.

