
Almost everything on computers is perceptually slower than it was in 1983 (2017) - LinuxBender
https://twitter.com/gravislizard/status/927593460642615296
======
dsalzman
Previous discussion -
[https://news.ycombinator.com/item?id=21835417](https://news.ycombinator.com/item?id=21835417)

~~~
SilasX
Can the mods un-dupe this? That was 2 months ago, and it's generated fruitful
discussion on a hacker-relevant matter. Certainly more relevant than the Atlas
Obscura "look at this quirky history" articles that get upvoted so much.

~~~
erikbye
I appreciate re-posts when they receive new insightful comments, but 60ish
days is perhaps a bit on the short side? However, this re-post now has 300+
comments and has justified its existence.

------
dayvid
I remember turning on the computer and waiting for the motherboard company
logo to finish flashing. After that was the Windows logo. I also remember
waiting for the dial-up modem to connect to the internet and waiting hours for
a video to download. Heck, last week I played a game streaming on GeForceNow
and I was surprised by how seamless the whole experience was.

Maybe some web apps are inconvenient, but you can easily set up a command-line
interface or a Linux system and move blazing fast.

~~~
soulofmischief
> you can easily setup a command line interface or Linux system and move
> blazing fast

Well... not really. Terminal response times are way slower than they used to
be in the 80's. Sure, we have greater throughput, but we also have greater
latency. [https://danluu.com/term-latency/](https://danluu.com/term-latency/)

~~~
mrdatawolf
"Terminal response times are way slower than they used to be in the 80's." In
1983 the top selling computers were the TRS-80, Apple II and IBM PC (among
others). I started on those computers. And what you said is crap... straight
out BS. Any person from that era should remember what happened when you typed
dir (or equivalent) into a folder with a couple hundred items. It was time to
go make some coffee while each...item...was...written...to...the...screen. The
OP has good points but the start was WEAK because it is factually inaccurate
like your comment.

~~~
carapace
Right, I can recall the first time I saw my friend's parents' 486.

I remember doing 'dir' on a large directory, it scrolled so fast, so fast. I
said to my friend, "The next generation will be so fast it will have finished
scrolling _before_ you finish typing the command."

Sadly, it was not to be. DOS on a 486 is as fast as computers have ever
gotten, perceptually.

------
gravislizard
As the OP who wrote this while drunk two years ago and thinks that about
30-50% of it is objectively wrong, it's extremely funny to me that every five
months someone reposts it here and everyone gets in a fight about it again.
Many of the details are entirely inaccurate - the thesis is still completely
valid, and it's Computer Person Thinking that wants to attack the details
while refusing to stand back and look at the overall picture, which is that
using computers is now an incredibly messy experience where nothing quite does
what you want, nothing can be predicted, nothing can be learned.

I hypothesize that this is because programmers and other various Computer
People consider "being on the computer" to be a satisfying goal in itself that
sometimes has positive side effects in the real world, while everyone else
simply has stockholm syndrome and has long since given up on even imagining a
better experience.

~~~
Reedx
That tendency to rathole and nitpick, while discarding the larger point, is
increasingly frustrating.

It seems like a lot of this boils down to people getting stuck in a local
maximum. If someone isn't used to zooming out to the big picture, are they
more likely to solve problems in isolation and end up with layers of
dependencies, npm installing all the things? The result is an unstable stack
and a less predictable ecosystem at large.

~~~
dougmwne
Yes, the ratholing vs. generalizing is an interesting phenomenon. I get the
sense that it's largely personality-based. Some people (most computer people)
are wired up to get every detail exactly correct and if anything is out of
place they break out the tweezers and start tweaking. That's exactly what's
required in most engineering work. It can lead to perfecting something that's
about to get thrown away.

The other personality deals with abstractions, analogies and top-down
thinking. When faced with an issue they'll start by defining goals and values.
And it's easy to be blind to the details when you're thinking of the big
picture and ask for things counter to the laws of physics.

Even if you're a two-minded kind of person and can deal with both generalities
and specifics, it's incredibly hard to deal with both at once and quite a task
to switch from the one mindset to the other.

------
hoorayimhelping
This post again with its ridiculous ranting examples.

"This text searching program that searches text against text is way faster and
simpler than a fully rendered, interactive map of the entire world with
accurate roads and precision at the foot / meter level."

No. Shit, really? Get out of town.

Yes, some very popular apps have bad UX. But some apps have incredible UX.
Cherry picking the bad ones while ignoring the good ones to prove a point is
boring and not reflective of the actual industry.

These posts fondly remember just the speed, but always seem to forget the
frustrations, or re-imagine them to be something we treasured.

Remember autoexec.bat files? Remember endless configuration to get one program
working? Remember the computer just throwing its hands up and giving up when
you gave it input that wasn't exactly what it expected? Remember hardware
compatibility issues and how badly it affected system stability? Remember when
building a computer required work and research and took hours? I do, and it
wasn't fun then, it was a detriment to doing what you wanted. It wasn't a rite
of passage, it was a huge pain in the ass.

So yeah, things are slower now, and I wanna go fast. But I also don't need to
spend an entire weekend setting up a PC and a printer for my mom anymore
either. I don't need to teach her arcane commands to get basic functionality
out of her machine. Plug and play and usability happened, and while things
feel slower, computers are now open to a much wider audience and are much more
usable now.

These posts always have a slight stench of elitism disguised as
disappointment.

~~~
TeMPOraL
Huh. I'd say the examples are perfectly good and on-point. While dealing with
autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact
that a DOS-era POS still offers orders of magnitude better UX than current-era
browser POSes, or than most web apps for that matter.

It also doesn't change the fact that Google Maps is very bad at being a _map_.
Its entire UI flow is oriented toward giving turn-by-turn directions to people
who know where they are and where they are going; it offers almost no
affordances for exploration and cross-referencing.

> _Remember when building a computer required work and research and took
> hours._

As someone who builds their own PC every couple years: it still does. It's
actually worse now, due to the amount of products on the market and price
segregation involved. Two PCs ago, I didn't have to use two parts
compatibility tools, several benchmarking sites and a couple of friends, and
didn't have to write CSS hacks for electronics stores, just to be able to
assemble a cost-effective PC.

> _But I also don't need to spend an entire weekend setting up a PC and a
> printer for my mom anymore either._

You don't? Printer drivers are only slightly less garbage than they were, but
now there are also fewer knobs to turn if things go wrong. When my Windows 10
doesn't want to talk to a printer or a Bluetooth headset, all I get to see is
a stuck progress bar.

Bottom line: I agree 100% with the author that one of the primary functions of
a computer is enabling easy cross-referencing of information. This ability has
been degrading over the past decades (arguably for business reasons: the
easier it is for people to make sense of information, the harder it is for
your sales tactics to work).

> _These posts always have a slight stench of elitism disguised as
> disappointment._

That I don't get. Is it "elitist" now to point out that the (tech) "elite"
_can_ actually handle all this bullshit, but it's the regular Joes and Janes
that get the short end of the stick?

~~~
ncallaway
> It also doesn't change the fact that Google Maps is very bad at being a map.
> Its entire UI flow is oriented toward giving turn-by-turn directions to
> people who know where they are and where they are going;

As it turns out, that is probably the most popular use case for maps in the
world.

Note also that for most smartphone users of Google Maps the use-case is
actually much broader than that. The UI flow also totally accounts for users
who _only_ know where they are going—thanks to GPS and Google Maps, knowing
where you are often isn't necessary.

I'm confused by the complaint that the "Maps" app only caters to the
90th-percentile use case for maps but doesn't cover the other use cases well.

> I agree 100% with the author that one of the primary functions of a computer
> is enabling easy cross-referencing of information. This ability has been
> degrading over the past decades

I just find this not the case at all. For expert-users the tools that existed
decades ago are still there and still usable. Or you can craft your own!

For non-expert users the information in the world is orders of magnitude more
accessible than it used to be.

~~~
TeMPOraL
> _As it turns out, that is probably the most popular use case for maps in the
> world._

There's a very subtle point the Twitter thread was making here. This use case
may be most popular not because it's what the people want, but because it's
_all that they can easily do_. The tools you use shape how you work, and what
you can work on.

FWIW, I learned to pay attention when the machine doesn't help me do what I
want (it's a good source of ideas for side projects), so I've noticed that I
do want a map that works like a map - something I can explore and annotate. I
do sometimes resort to screenshotting GMaps (or, in the past, photographing
paper maps) just to have a _map_ on my phone. I've seen non-tech people do that
as well. So I can be confident that it's not just me and the Twitter thread's
author that want this.

> _For expert-users the tools that existed decades ago are still there and
> still usable. Or you can craft your own!_

The Twitter thread's point (and mine as well) is that expert users can and do
work around this degradation. It's a frustrating chore, but it's not all that
difficult if you can code a bit and have some spare time. It's the experience
for the non-expert users that has degraded in a way they can't fix for
themselves.

> _For non-expert users the information in the world is orders of magnitude
> more accessible than it used to be._

The way it's accessible, it's almost as if it wasn't. Sure, you can easily
Google random trivia. But good luck trying to _compare_ things. That's always
a pain, and usually involves hoping that someone else made a dedicated tool
for similar comparisons on the topic you're interested in, and that the
information hardcoded in that tool is current and accurate. Notably, the
tools you use for searching have no support for comparing.

~~~
ncallaway
> so I've noticed that I do want a map that works like a map - something I can
> explore and annotate.

I don't doubt that there _are_ use cases for a map that works this way. Even
if Google Maps covers 80-90% of the use-cases for mapping, mapping is an
absolutely _massive_ domain. 10-20% of use-cases still represents a huge
volume.

But it doesn't have to be Google Maps. It actually seems _worse_ for one
"maps" app to try to handle all possible use cases for a map.

Why isn't there a separate different tool that handles the use-case you
describe?

I guess, going back to the original thesis: what would the "1983" equivalent
of what Google Maps does look like, only faster? Or, what would the "1983"
version of the mapping behavior you want look like?

In the thread they say:

> in 1998 if you were planning a trip you might have gotten out a paper road
> map and put marks on it for interesting locations along the way

I'd argue that this use-case still exists. Paper road maps haven't gone away,
so this is still an option. People largely don't use this and prefer Google
Maps or other digital mapping tools for most of their problems. Why? If you
gave me both the 1998 tools and the 2020 tools, for 95% of my problems I'm
going to use digital tools to solve it _because they let me solve my problems
faster and easier_. I know this because I have easy access to paper maps and I
never touch them. Because they're largely worse at the job.

> There's a very subtle point the Twitter thread was making here. This use
> case may be most popular not because it's what the people want, but because
> it's all that they can easily do. The tools you use shape how you work, and
> what you can work on.

Ultimately, my point above is my response to that. None of the old tools are
_gone_. Paper maps are still available. And yet they have been largely
abandoned by the large majority of the population. I agree that there are
limitations to our current digital tools, and I hope in 2030 we have tools
that do what the article describes. But the 1983 version of the tools _are
worse for solving problems_ than the current tools, for most people.

~~~
6510
Pretty much all games in the early 80's had [so-called] pixel-perfect
scrolling. Each frame showed exactly what was required.

Today it is entirely acceptable for a map to be a jerky, stuttering pile of
crap. The same goes for infinite-scroll implementations. It's preposterous
to start loading things after they are needed.

There is a good analogy with making things in the physical world. The
professional doesn't start a job before he has everything he needs to do it;
the amateur obtains what he needs after he needs it.
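In code terms, the professional's approach is just prefetching. A minimal
sketch (a hypothetical tile viewer, not any real map API; `on_scroll` and the
lookahead of 2 rows are made up for illustration):

```python
# Hypothetical infinite-scroll prefetcher: request rows just past the
# viewport so content is already cached when the user reaches it.
def tiles_to_prefetch(first_visible, visible_count, lookahead=2):
    """Row indices to request: everything visible now, plus `lookahead`
    rows past the bottom edge of the viewport."""
    return list(range(first_visible, first_visible + visible_count + lookahead))

cache = {}

def on_scroll(first_visible, visible_count=5):
    for row in tiles_to_prefetch(first_visible, visible_count):
        if row not in cache:
            cache[row] = f"row-{row} data"  # stand-in for a network fetch

on_scroll(0)  # rows 0-6 fetched before rows 5-6 are ever shown
on_scroll(2)  # only rows 7-8 are new; everything else is a cache hit
```

The point is only the ordering: the fetch happens before the need, not after.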

~~~
shadowgovt
Games have the huge advantage of constrained scope that mapping applications
don't. You can get pixel-perfect scrolling when you constrain the
max rate the user can pass through the dataset, you deny them random access
into the dataset, your dataset isn't trying to represent a space the volume of
planet Earth, etc.

There's a huge gulf between the use cases you're comparing here, and I don't
believe for one second that loading the Google Maps dataset into Commander
Keen's engine would make for a better experience.

(Also, not to be overly pedantic, but "The professional doesn't start a job
before he has everything he needs to do it" pretty much classifies all
building construction as unprofessional. The professional doesn't magically
have everything on-hand, especially bulky or expensive resources; they have a
plan for acquiring them at reasonable rates of input and mitigation strategies
if that plan can't be followed.)

~~~
6510
I'll ignore the pedantic part since it was just an analogy; if it doesn't work
for you, there is little to talk about.

> Games have huge advantages of ....

I have thoughts like that, but I consider them "making up excuses". You don't
_have to_ see it that way, but I can't see someone fix a problem by making up
excuses for it to exist. For me it is just like how you can always come up
with an excuse not to do something.

8 gigabytes of memory / 64 kilobytes of memory = 125,000 times as much memory.

14 gigahertz (4 cores x 3.5 GHz) / 1.023 MHz = 13,685 times as much processor
power.

4108 gigahertz (2560 CUDA cores x 1605 MHz) / 2 MHz = 2,054,000 times as much
video power.

Can I just call 2 MHz memory bandwidth 16 Mbit/s?

If so, 500 Mbit / 16 Mbit = 31.25 times the bandwidth

We are not rendering many layers of colorful animated game content. A map is
just a bunch of boring lines. The modern screen however is a lot bigger. I see
a glimmer of hope for an excuse!

320x200 px = 64000 px

1920x1080 px = 2073600 px

2073600 / 64000 = 32.4 times the screen size

meh?
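For anyone who wants to check the arithmetic, it reduces to a few lines of
Python (decimal units throughout; the 2020-era specs are the comment's own
assumptions, not measurements):

```python
# Back-of-envelope ratios: a hypothetical modern PC vs. a 1983 8-bit machine.
ram = 8e9 / 64e3                   # 8 GB vs 64 KB
cpu = 14e9 / 1.023e6               # 4 x 3.5 GHz vs 1.023 MHz
gpu = (2560 * 1605e6) / 2e6        # 2560 CUDA cores x 1605 MHz vs 2 MHz
net = 500 / 16                     # 500 Mbit/s vs ~16 Mbit/s
px = (1920 * 1080) / (320 * 200)   # modern screen vs 320x200

print(ram, round(cpu), gpu, net, round(px, 1))
# -> 125000.0 13685 2054400.0 31.25 32.4
```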

We must applaud everyone involved in making all this hardware progress. It
truly blows the mind and defies belief. No one could have imagined this.

Then came weee the software people and... and.....

I'm cringing too hard to continue writing this post.

The numbers don't lie; we suck. Let's leave it at that.

~~~
shadowgovt
I'm still happy with the configuration we have where my map is a little slower
than maybe I'd like it to be (though honestly, I just loaded maps.google.com
and moused around randomly and... it's fine? Certainly not so slow I'm
bothered by it) but the mapping app also can't crash my computer due to the
three layers of abstraction it's running on top of. Because _that_ would suck.

If you're curious where the time goes, btw... Most of the visible delay in
Google Maps can be seen by popping the browser inspector and watching the
network tab. Maps fetches many thin slices of data (over 1,000 in my test),
which is a sub-optimal way to do networking that adds a ton of overhead. So if
they wanted to improve maps significantly, switching out for one of the other
protocols Google has that allows batching over a single long-lived connection
and changing the client and server logic to batch more intelligently could do
it. I doubt they will because most users are fine with the sub-three-second
load times (and engineering time not spent on solving a problem most users
don't care about is time spent on solving problems users do care about).
You're seeking perfection in a realm where users don't care and claiming the
engineers who don't pursue it "suck;" I'd say those engineers are just busy
solving the right problem and you're more interested in the wrong problem. By
all means, make your mapping application perfect, as long as you understand
why the one put out by a company with a thousand irons in the fire didn't.
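A toy latency model makes the batching point concrete (the overhead and
payload numbers are made up, and it treats requests as serialized where real
browsers parallelize, but per-request overhead still accumulates either way):

```python
# Toy model: N tiny requests pay per-request overhead N times;
# batching pays it once per batch.
import math

def unbatched(n_requests, overhead=0.02, payload=0.001):
    return n_requests * (overhead + payload)

def batched(n_requests, overhead=0.02, payload=0.001, batch_size=100):
    n_batches = math.ceil(n_requests / batch_size)
    return n_batches * overhead + n_requests * payload

print(round(unbatched(1000), 2))  # -> 21.0 (seconds of cumulative overhead)
print(round(batched(1000), 2))    # -> 1.2
```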

Also, I think the analogy was great, but you reached the wrong conclusion. ;)
That _is_ how large-scale engineering works. Scheduling becomes the dominant
phenomenon in end-to-end performance. Games have huge advantages on
constraining the scheduling problem. General-purpose apps do not. Hell, to see
this in action in a game: Second Life's performance is crap because the whole
world is hyper-malleable, so the game engine cannot predict or pre-schedule
anything.

~~~
6510
> If you're curious where the time goes, btw...

Since it is software, nothing is set in stone; everything can be changed. We
know how to do it really, really fast and really, really efficiently.

People did incredible things to improve almost everything.

To me this means all of the performance loss is there for no reason. All we
need is for people to stop making excuses. I for one know how to get out of
the way when better men are trying to get work done.

You are the engineer if not the artist, impress me! Impress the hardware
people! Impress the people who know your field. Some attention to consumers'
wishes is good, but Gustave Eiffel didn't build his tower because consumers
wanted that from him.

Why would you even have tap water if the well is down the street? A horse and
carriage is just fine; it is good enough for what people need, no? What if our
doctors measured their effort in "good enoughs" and lived up to consumer
expectations only?

The hardware folk built a warp-capable starship and we are using it to do
shopping at the corner store because that was what mum wanted. Of course there
is no need to even go to Mars. It's missing the point entirely, you see?

------
ergothus
> almost everything on computers is perceptually slower than it was in 1983
> .... amber-screen library computer in 1998: type in two words and hit F3.
> search results appear instantly. .... now: type in two words, wait for an
> AJAX popup. get a throbber for five seconds. oops you pressed a key, your
> results are erased

So we start with something 15 years AFTER the title as evidence of the "good"
times, then make a vague anecdotal reference to something modern. I've seen
POOR performances in places, but the majority of experiences I see now are
faster than in 1998 and definitely than 1983. Faster AND more convenient.

> And it's worth noting that HYPERTEXT, specifically, is best with a mouse in
> a lot of cases. Wikipedia would suck on keyboard.

Um... no. It's less convenient than a mouse, but way better than the
function-key-based commands that the author lists.

I think there is a lot of room available to complain about terrible interfaces
today, and in particular how choked everything is by the need to track and add
advertising, but there's no actual evidence in this article, and it comes
across as a rant and selective nostalgia.

~~~
pixl97
Also the amount of data we query in most cases is many orders of magnitude
larger.

Most of the 1980-1995 cases could fit the entire datasets in CPU cache and be
insanely fast.

Most things I query these days are in the gigabytes to terabytes range.

Lastly, we have to make them secure, especially against malformed data
attempting to attack the app, which eats a lot of CPU cycles.

~~~
TeMPOraL
> _Most of the 1980-1995 cases could fit the entire datasets in CPU cache and
> be insanely fast._

They couldn't then. They had to fit it in RAM.

> _Most things I query these days are in the gigabytes to terabytes range._

That still is in "fits in RAM on a typical PC" to "fits in SSD on a PC, fits
in RAM on a server" range.

There's little excuse for the slowness of the current searching interfaces,
even if your data is in gigabytes-to-terabytes. That's where the whole "a
bunch of Unix tools on a single server will be an order of magnitude more
efficient than your Hadoop cluster" articles came from.
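The single-machine point is easy to demonstrate. A plain streaming scan (the
Python equivalent of `grep -c`, used here as a stand-in for the shell
pipelines those articles describe) handles gigabytes on one box in constant
memory:

```python
# Minimal streaming search: memory use is constant regardless of file size,
# because the file is consumed line by line, never loaded whole.
def count_matches(path, needle):
    count = 0
    with open(path, "r", errors="ignore") as f:
        for line in f:
            if needle in line:
                count += 1
    return count
```

For data that fits on one SSD, a scan like this is typically bounded by disk
throughput, not by CPU or by cluster coordination overhead.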

------
michaelfeathers
When an area of engineering is new we make products better than the market
will eventually pay for. Over time, we learn how to make them just good enough
to reach an equilibrium in a market. This is an idea that is explored in _The
World Without Us_ by Alan Weisman. The Roman Colosseum still stands, but
buildings today don't need to be that durable. It would cost significantly
more to make them that way, so we don't. The same could be said about
supersonic travel. We could do it again, but enough of us are happy to not pay
the premium, so that product has disappeared.

~~~
TeMPOraL
The problem is, the market equilibrium is usually the worst possible garbage
that's still fit enough for purpose that it won't break (on average) before
the warranty period ends. Some of the money you save on this isn't really
saved, you just pay for it in having to replace the product sooner, or being
constantly stressed and irritated with the quality and fragility of your
tooling (it's essentially death by a thousand cuts for your psyche).

So there is a reason to add some extra pressures to the market to raise the
floor.

~~~
jeffdavis
Is that true or does it just seem that way?

Cars today are pretty reliable. Random other stuff seems pretty decent as long
as you don't buy the absolute cheapest stuff around. I'm working on a solid
wood desk with no nails or screws (it's called a "puzzle" desk because it fits
together in a stable configuration without any connectors). I'm looking at a
backpack, which I assume is nylon, and it seems pretty tough and has lasted
through a few adventures. I'm looking at my metal-framed bookcase, which has
fake-wood shelves but it still seems pretty sturdy.

There are certainly some things that seem worse. Furniture is probably worse
overall, but that's because good wood is so much more scarce now. If you buy
metal/glass furniture then it's fine. There are also some worrisome practices
that are pretty widespread, like using plastic-threaded screw holes rather
than metal; or just using really cheap metal that can be stripped or cross-
threaded easily.

On balance, where are we? I have a feeling that we're basically better off. We
remember the old stuff that lasts, and forget about the stuff that breaks
(unless it's still celebrated/historical, like the Liberty Bell).

~~~
TeMPOraL
> _Is that true or does it just seem that way?_

Definitely seems that way, but I don't think it's just an illusion.

Cheapest stuff is absolute garbage, arguably to the point you could consider
it a waste of natural resources (for a few cents here and there, you could
make things not break halfway through their first use). What worries me though
is that
price isn't a reliable quality signal anymore - there's profit being made in
selling cheap garbage at high prices, because many people are fooled into
believing a more expensive good must be quality. Brands are no longer a
quality signal either. Even ignoring that the market is currently DDoSed by
throwaway "brands", even the proper ones don't mean much anymore. For
instance, witness any discussion about white goods. Next to the people telling
you which brands are historically quality ones, you'll find people reporting
that recently, those quality brands started to cheap out on hardware and their
products are no longer reliable or repairable.

Ultimately what I'm trying to say is that the market equilibrium for price-
sensitive customers is complete garbage; some companies can survive on less
price-sensitive customers, but I see a trend towards converting quality into
better margins even there. It's true that we have more of everything now, but
it also seems most of those things are short-lived, and increasingly shorter-
lived. It would be fine if matter and energy were too cheap to meter, but they
aren't - especially not if you factor in the environmental costs.

------
dahart
There are indeed a lot of curious examples of computers seeming to run slower
today than 20, 30, or 40 years ago. But on the whole, I completely disagree with
the summary. My laptop today is faster than anything I’ve ever used before
today. It boots faster, it loads files faster, it responds faster, it does
more simultaneous things, it crashes far less often.

My first computer was an IBM PCjr, and using it was an exercise in patience
_at all times_. Maybe the author never played Ultima III on an IBM PC. It
was _sooooo_ slow. The exact same was true of the computers my friends had,
C-64, Atari 800, IBM PC. It probably took a full minute to boot. Maybe the
author never saw the memory test PCs used to do before even starting to boot,
or how slow floppy drives were. My first modem was 110 baud... that’s a
whopping 15 7-bit bytes per second. Downloading a 40x40 character image
(picture made out of keyboard characters) took a minute and a half.
Downloading games routinely took hours. The PCjr hard crashed and needed
reboots all the time. Even my later 486 would do the same thing. Rebooting was
something you just did constantly, multiple times per hour. Today, almost
never.
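The modem arithmetic above holds up, give or take framing overhead (a quick
check using the comment's own figures, ignoring start/stop bits, which would
make it even slower):

```python
# 110 baud, 7-bit characters: throughput and time to download a
# 40x40-character ASCII-art image.
bytes_per_sec = 110 / 7       # ~15.7 seven-bit bytes per second
image_chars = 40 * 40         # 1600 characters
seconds = image_chars / bytes_per_sec
print(round(seconds))         # -> 102, i.e. about a minute and a half
```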

One thing this article completely avoids acknowledging is the general
difference in functionality we have today versus computers in 1983. The
database lookups and maps that were faster were faster precisely because the
data was 7 orders of magnitude smaller. The article is comparing DBs and maps
that fit in the main memory of a computer at the time to databases and maps
that are out on the internet and don’t fit on any single computer. It’s
amazing that we get the results as fast as we do.

~~~
rsecora
"Maybe the author never played Ultima III on an IBM PC. It was sooooo slow."

You nailed it.

------
davidguda
"one of the things that makes me steaming mad is how the entire field of web
apps ignores 100% of learned lessons from desktop apps" Perhaps not 100%, but
sure: in the modern age of web apps for everything, I miss having proper
desktop apps.

Even more so, programming web apps is nowhere near as convenient as developing
desktop apps was in the late 90's with tools like Delphi.

~~~
fwsgonzo
I remember that. It was very easy... I sort of have that same feeling with
.NET, but I wonder if it's still going to be a layer on top of Windows C
system functions, which might just be wrappers for system calls, which require
an expensive context switch.

One big difference between Win95 and now is the security aspect. Another big
difference is the tons of services on Windows 10 that are always running (such
as YourPhone). On Android, for some vendors you are forced to use Facebook and
it can't easily be removed.

------
stiray
I am just stepping away from the web topic as it is irrelevant. Slow VM, huge
text files, lasagnas of code stacked one on another. The comparison here just
isn't fair; I understand that people who never experienced an MFC 6
application can't compare it to a browser, as they know only
browser/Electron/...

But just last weekend I installed Windows 7 on a modern i7 computer (with an
SSD). Sure, half of the drivers didn't work, etc., but it did boot. Everything
I started was instantaneous. The dreaded decompressing stage at the end of
installation? 5-10 seconds. Then I installed Windows 10 with Office. I am so
sorry I didn't install Office 97 on the Windows 7 machine (I didn't have the
installation media); I installed 365 on the Windows 10 one, so I can't compare
here (and for my usage they are both more than enough). But: Windows 7 was
much faster at everything while not offering major improvements in features.

Bottom line: the software was optimized for what the hardware speed was at the
time. On new computers it is lightning fast. When the hardware became faster,
no one cared to optimize anything, and layers of bloat were added. Sure, the
production of software is cheaper, and everyone can code today ("omg, I need
to free allocated memory, the horror").

Instead of improving on features and performance, we have optimized for
production costs. And the major improvements in hardware processing power are
lost in more code bloat and endless added layers of abstractions, virtual
machines, wasteful protocols, and languages. It is sad.

~~~
nonbirithm
I think if something is possible to do given better hardware, no matter how
bloated or useless, developers at large will end up doing it. I don't think
RAM usage or performance are concerns for the people who go straight to
Electron, because in the present it's easy to develop for and nobody has the
right to force them to choose otherwise. Electron isn't going away; the only
ways it improves are Chrome getting more performant or computers getting even
faster.
Either one is nontrivial to work on as an outsider - there are only two
significant browser engines in the present age because they've become so
complex and accumulated so many features that they require the support of
well-funded corporations to maintain them, and computer processors can only
become so small.

Browsers on the whole will not lose features. The web is ubiquitous and cutting
down the number of features to maintain would break existing webpages. The
resulting web standard establishes a hard baseline on the amount of complexity
needed to engineer a standards-compliant browser. That in turn means a hard
baseline on the amount of processing power required to run it acceptably, save
for more performance gains that will surely be buried immediately under even
more features.

I once tried using Firefox 1.0 and found it was impossible, since it has no
understanding of modern security protocols. The web as viewed by Firefox 1.0
in 2003 no longer exists. You can't practically divorce the features necessary
to use the web, such as new protocol compatibility, from extra features like
Pocket in the code changes. And trying to do so anyway leads to Pale Moon,
where a scant few contributors are responsible for merging in dozens of
security patches - which themselves come about due to the growing complexity
of the web platform. Numerous people have called using it irresponsible from
a security standpoint. In the end, the endless growth of web standards and the
code needed to support them has made that outcome inevitable.

For all practical purposes, you either use one of the two major browsers or
get left behind by coders and organizations that have orders of magnitude more
resources than you.

Society and the economy don't incentivize creators to be happy with a modest
set of features when there's a glaring increment in performance that ought to
be seized on. It's about growth and innovation: finding new ways to reinvent
the wheel and get further away from the hardware, because it's now easy to.

Sometimes I wish we could undo growth. It would be nice if Electron and the
like fell out of favor and we had a renaissance of software designed the way
it was in 2004. But enough developers have decided that the performance
tradeoff is worth it, and average users have become desensitized to SPAs and
their expectations are low enough that they would rather put up with it all in
order to do what they need to instead of demanding better.

I feel like it can only keep growing. That's one thing the collaborative
nature of software pushes forward. More features. It's just that some are
better at limiting the scope of projects than others.

~~~
stiray
I have explicitly said that I will refrain from discussing the browser, as it
is incomparable. The whole browser-based infrastructure is so bloated: from
the browser, which can be treated as a VM, to a bloated, text-based,
human-readable language (thank god someone is pushing WebAssembly - I said it
was a necessity in 2001, but everyone was fine hacking together JS for the
next ~20 years, and now it has come to the Electron absurdity of 100MB+
applications that run JS on a packaged browser on an operating system that
runs on a CPU; the overhead is just incredible) to text-based protocols
(HTTP! REST!). Node.js? I won't even start. The waste of CPU power,
electricity, hardware... is just fantastic.

On the other side, the Android SDK/framework/whatever looks like a bunch of
academics pouring their frustrations into one library - if you want to learn
all the patterns, just dig in. MFC 6 in 1998 was more organized
([https://t1.daumcdn.net/cfile/tistory/11781B244C45903F42](https://t1.daumcdn.net/cfile/tistory/11781B244C45903F42)
please try to draw this one for Android! No vision, just chaos. Oh, you lose
context on orientation change? How nice...). The Android development
environment barely caught up with what Windows Mobile had in 2000 when they
pulled in IntelliJ (which is a copy of Visual Studio), while for machine code
they still lag behind. Please, DON'T believe any of this, DO check VS 6.0 and
Windows Mobile. Everyone is reinventing everything while ignoring the past and
its mistakes, repeating them over and over again. Losing 20 years for the sake
of another corporation gaining a monopoly is so crazy that you need to see it
with your own eyes.

Cloud. We had mainframes. Enough said. We are moving to those again, for
short-term profit for users and long-term profit for providers. I wonder when
affordable server hardware will disappear from the market.

While I think the hardware industry is moving forward, development is racing
backwards, or barely catching up with what we had 20 years back - drowning
itself in lasagnas and stacks of different overheads, from overdesigned
interfaces to VMs. IoT is still sane (due to the lack of resources to waste),
but common software development has just gone crazy.

I do suspect that at some point hardware will stop supporting the bloat due to
physical limitations and optimized code will come back, but I wonder who will
still remember how to write it? Ah, we will reinvent it again?!

But never mind the rant of an old fart. He just doesn't understand waiting at
50% load on a CPU 1000 times faster than what we had, while loading megabytes
of web-based software doing what he would implement in 100KB. With 0.0015% of
CPU load. It is just him. He doesn't understand the hype.

------
ckastner
The author bemoans how everything has gotten (perceptually) slower, and then
proceeds to communicate this not through a single letter or blog post, but
instead through ninety-five (95) tweets.

C'mon.

~~~
ThomasWinwood
Allow me to reply to your complaint about tweet threads with a tweet thread.
[https://threadreaderapp.com/thread/1066547670477488128.html](https://threadreaderapp.com/thread/1066547670477488128.html)

------
rsecora
The experience of the 80s was really one of slowness; things were so slow
that it's unimaginable now. So unimaginable that we no longer remember how
common actions were done in 1983.

The experience was so slow that you could play another game while loading [1].
Namco even patented "playing games during load" by 1987 [2].

As an example, I have fond memories of my C64 tape loading Hunchback [3] ...
while watching "V". A single tape could take one hour if it failed in the
middle and you needed retries. All those waiting times were orders of
magnitude beyond what a common toddler would expect today. We were so relieved
when the fast loaders appeared [4] or when a 1541 arrived home [5]. And then
we were so frustrated when the 1541 was only 300 bytes/s instead of the 300
bits/s of the Datasette.

The scenario in Xenix or PC-DOS loading from 8" floppies was similar.

I'm not arguing that now are better times; I'm arguing that in the 80s we had
the best for the time, and those were good times to remember.

[1] Mini Games: [https://en.wikipedia.org/wiki/Invade-a-
Load](https://en.wikipedia.org/wiki/Invade-a-Load)

[2] Namco patent:
[https://patents.google.com/patent/US5718632A/en](https://patents.google.com/patent/US5718632A/en)

[3]
[https://en.wikipedia.org/wiki/Hunchback_(video_game)](https://en.wikipedia.org/wiki/Hunchback_\(video_game\))

[4]
[https://en.wikipedia.org/wiki/Fast_loader](https://en.wikipedia.org/wiki/Fast_loader)

[5]
[https://en.wikipedia.org/wiki/Commodore_1541](https://en.wikipedia.org/wiki/Commodore_1541)

(edited link formats)

~~~
rahuldottech
Links formatted as code sorta defeat the point, right?

~~~
rsecora
Right, edited

~~~
rahuldottech
All good :)

------
gravislizard
As the OP who wrote this while drunk two years ago and thinks that about
30-50% of it is objectively wrong, it's extremely funny to me that every five
months someone reposts it here and everyone gets in a fight about it again.
Many of the details are wrong - the thesis is still completely valid, and it's
Computer Person Thinking that wants to attack the details while refusing to
stand back and look at the overall picture, which is that using computers is
now an incredibly messy experience where nothing quite does what you want,
nothing can be predicted, nothing can be learned.

I hypothesize that this is because programmers and other various Computer
People consider "being on the computer" to be a satisfying goal in itself that
sometimes has positive side effects in the real world, while everyone else
simply has stockholm syndrome.

~~~
drbawb
>I hypothesize that this is because programmers and other various Computer
People consider "being on the computer" to be a satisfying goal in itself that
sometimes has positive side effects in the real world, while everyone else
simply has stockholm syndrome.

Oh believe me, as much as I love being on the computer for the sake of it, I
don't enjoy having that screen time utterly wasted by shit software. There's
only so many hours in a day after all.

I work w/ ERP software, so I see people still using text-mode UIs on a daily
basis (hell I track my time w/ one), and I also support a "modern" ERP that is
GUI based and can "run inside a browser." (Which nobody actually does, because
it sucks, doesn't support all features, and swallows up tons of keyboard
shortcuts.) One of these packages can be run comfortably from a $5 Linux VPS.
The other package asks for two _very fast_ SAS storage arrays, 32GB of RAM,
minimum of 6 CPU cores, etc. Of course you've gotta license Windows for all of
that, which thankfully is not my job. (That or you spin it up in "the cloud" I
guess, and pay thousands of dollars annually to rent somebody else's computer,
since this software is not "cloud native" at all, no matter what their sales
people say.)

I try to leave the software better/faster/more usable than I found it, but
it's hard when the upstream vendor is just piling shit on the fire so they can
pitch their half-implemented features in the sales brochure, with absolutely
no regard for the added operational overhead of the garbage code.

------
smkellat
A tweet thread where you complain about AJAX popups and other
JavaScript-based slowdowns seems utterly ironic. Whatever happened to having
your own blog?
Even better is that these tweets were from 2017.

This makes me yearn for building an ncurses-rooted non-GUI flavour of Ubuntu.

------
blfr
Yes, for a good long while UIs were improving and the last few years I'm
fighting most of my computers to stay as usable as they were a decade+ ago.
And it's usually a losing battle wherever a terminal isn't available, most
painfully on the phone.

Online stores have fewer and fewer useful categories and filters. Ubuntu's
take on GNOME is a weak version of their Unity DE and much less efficient on
smaller screens. Google Maps, mentioned in the OP, has on top of all these
problems slowed down significantly.

And yes, others have it worse because they don't even have the escape hatches
like command line apps and are cast to the webshit wasteland looking for a
webapp to unlock their pdf or extract images or whatever.

 _Joe should be offered incredible power by software that encourages him to
use it with purpose-built interfaces optimized for him._

 _I am upset by the way that computers disenfranchise non-nerds. I wish it was
better for me; I wish it WORKED AT ALL for everyone else._

[https://twitter.com/gravislizard/status/927621687952076800](https://twitter.com/gravislizard/status/927621687952076800)

~~~
mindcrime
 _Yes, for a good long while UIs were improving and the last few years I'm
fighting most of my computers to stay as usable as they were a decade+ ago._

I blame the advent of the "mobile era" and the much smaller screens, and
differing I/O capabilities, of mobile devices. And the fact that, at some
point, everybody decided "we need one UI that is applied for both desktop and
mobile, despite them being completely different types of devices." And now UI
design is all muddled up and nobody can (or will) optimize for large screens,
keyboards, and mice, because "not everybody has a large screen, a keyboard,
and a mouse." But on mobile, the pull remains to keep a UI metaphor that is
basically a slight variation of the WIMP interface that originated on desktop.
In the end, nobody wins.

------
seapunk
Original source:
[https://twitter.com/gravislizard/status/927593460642615296](https://twitter.com/gravislizard/status/927593460642615296)

Threader version (if you’d like to read the compiled version without ads &
fake news recommendations):
[https://threader.app/thread/927593460642615296](https://threader.app/thread/927593460642615296)

~~~
dahart
OH, thank you! This explains why the article is a wall of one-sentence
paragraphs.

------
Santosh83
I wouldn't generalise. Behind the scenes modern computer systems are of course
processing _billions_ of times more instructions than computers of old. Some
portions of the UX do have noticeable delay and sometimes even inordinate
delay. This is usually due to a poor choice of algorithms and program logic,
but sometimes it is the case that the blazing speed of modern hardware has
trained most of us to expect _every_ computation to be almost instant, while
some algorithms are still pretty slow no matter how fast the hardware, and they are
likely to remain that way. Of course the UI in particular doesn't fit into
this algorithmic complexity constraint and there is little excuse for their
perceptible slowness in recent years except for poor coding and feature creep.

------
TomBombadildoze
Almost everything on computers is perceptually more rich, useful, and
intuitive than it was in 1983.

~~~
collyw
Disagree. It looks prettier but it's often not more intuitive.

Not 1983, but in Windows 95 all the icons had text underneath them, so you
knew what they did.

More recently it took me a couple of years of seeing hamburger menus to make
the connection "it's an icon to give you a menu". Those are everywhere but are
in no way intuitive.

~~~
nimajneb
Google has been dropping the hamburger menu, now it's your face, lol. (Google
Maps 10.34.4)

------
miguelmota
A problem with a lot of web apps today is that they are designed by visual
designers who put a lot of emphasis on aesthetics and animations to make the
app 'stand out', rather than focusing on user experience and keeping it dead
plain and simple. This is why I love TUIs: they make use of all the screen real
estate and all the possible actions are laid out on the screen to easily
navigate.

------
pjungwir
I have a lot of sympathy for this article.

(And all the comments about how much faster computers are today: I interpreted
the article to be talking about the ease of UI, in other words how fast the
interface lets _you_ do things, not how fast the _computer_ does things.)

I have a private Rails app I've used as a freelancer for 7-8 years for time
tracking, invoicing, expenses, and financial reports, and I've tried to make
it very keyboard-focused, so that I can enter info without clicking around.
I've always wanted to polish it enough to share and charge money for, and I've
thought a nice niche would be to double down on the "keyboard productivity"
theme. But making that work well in a browser is tricky.

Do I mostly rely on browser-native functionality for keyboard nav/input?
That's how it is today. It's fine for me, but it's not really "enough", I
think. There are lots of other navigation moves I wish had keyboard shortcuts.
But a lot of taste is required here. Usually when people override browser
behavior it makes things worse (e.g. scrolling). But there are still things
that would be actual improvements.

So a couple years ago I rewrote the Invoices section in React+Redux, hoping
that would make it easier to do some custom UI behavior. It made a couple
small things better, but it's a _lot_ more code, and it breaks some things the
browser used to give me for free. For example if you add a new invoice and see
it at the bottom of the list, then you click to a non-React part of the site,
then click Back, the new invoice disappears from the list! I'm not even sure
how to fix that, unless I want to add a spinner and Ajax call on page load.
(Right now I pass the initial state as JSON in the rendered HTML.) Maybe I
could use local storage, but then somehow when you click Back I have to decide
whether to trust local storage or make an Ajax call. . . .
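One workable pattern for the back-button problem above (a sketch only, not the author's code; every name here is hypothetical) is to snapshot the client state into sessionStorage when leaving the page and, on load, prefer whichever copy is fresher than the server-rendered one:

```typescript
// Hypothetical sketch: keep a Redux-style state slice alive across
// back/forward navigation. The pure helpers below are testable without
// a browser; the sessionStorage wiring is shown in comments at the end.

type Invoice = { id: number; total: number };
type AppState = { savedAt: number; invoices: Invoice[] };

const STORAGE_KEY = "invoices-state-v1";

function serializeState(state: AppState): string {
  return JSON.stringify(state);
}

// A corrupt or missing cache entry yields null, so the caller falls
// back to the server-rendered snapshot.
function deserializeState(raw: string | null): AppState | null {
  if (raw === null) return null;
  try {
    return JSON.parse(raw) as AppState;
  } catch {
    return null;
  }
}

// Prefer the cached copy only when it is newer than the snapshot the
// server embedded in the page (the JSON-in-HTML initial state).
function pickFresher(rendered: AppState, cached: AppState | null): AppState {
  return cached && cached.savedAt > rendered.savedAt ? cached : rendered;
}

// Browser wiring (not runnable here):
//   window.addEventListener("pagehide", () =>
//     sessionStorage.setItem(STORAGE_KEY, serializeState(store.getState())));
//   const initial = pickFresher(
//     serverSnapshot,
//     deserializeState(sessionStorage.getItem(STORAGE_KEY)));
```

The timestamp comparison sidesteps the "trust local storage or make an Ajax call" dilemma for the common case; a background refresh can still reconcile with the server afterwards.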

Anyway, rather than complaining about the article, I'd be more interested in a
conversation about how to build webapps with a keyboard focus, so that you can
complete entire tasks without touching the mouse. What native browser features
would you leverage? What frontend tech would help you? Can you imagine
"standard" patterns you could encode in a JS lib, both for DRY code and for a
more predictable UI? A "Bootstrap for keyboard-first webapps" would be really
cool. In fact Bootstrap already does a lot for accessibility.
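As one possible starting point for the "Bootstrap for keyboard-first webapps" idea (purely a sketch with invented names, not an existing library), a declarative shortcut table with a pure lookup function keeps the bindings predictable and documentable:

```typescript
// Hypothetical sketch of a declarative keyboard-shortcut table. Keeping
// the match logic pure makes the shortcut map testable, and lets the UI
// render a cheat-sheet from the same data.

type Shortcut = { key: string; ctrl?: boolean; action: string };

const shortcuts: Shortcut[] = [
  { key: "n", ctrl: true, action: "new-invoice" },
  { key: "/", action: "focus-search" },
  { key: "Escape", action: "close-dialog" },
];

// Returns the action name bound to a keypress, or null if unbound.
function matchShortcut(
  table: Shortcut[],
  key: string,
  ctrl: boolean
): string | null {
  const hit = table.find((s) => s.key === key && Boolean(s.ctrl) === ctrl);
  return hit ? hit.action : null;
}

// Browser wiring (not runnable here): ignore keys while the user is
// typing in a form field, then dispatch by action name.
//   document.addEventListener("keydown", (e) => {
//     if ((e.target as HTMLElement).tagName === "INPUT") return;
//     const action = matchShortcut(shortcuts, e.key, e.ctrlKey);
//     if (action) { e.preventDefault(); dispatch(action); }
//   });
```

Because the table is plain data, a help overlay listing every binding falls out for free, which goes a long way toward the predictability this thread keeps asking for.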

~~~
BlueTemplar
Stop. Making. Web. "Apps".

------
lasermike026
The web is lightning fast when I turn off javascript.

~~~
Waterluvian
I heard people saying this a lot so I decided I'd give it a try. My experience
is that this quote is actually, "The web is lightning fast when we don't use
most of it." The number of features and websites that just wouldn't work was
significant.

However, some sites I visited that were informational only did get a bit
faster which for some people with bad Internet connections might make a
difference.

This probably already exists, but what I want is an AdBlocker-style plugin
that executes a community-curated list of "websites that need JavaScript to
work." I can then get the best of both worlds: faster websites with less
unnecessary JavaScript, but without crippling a large portion of the Internet
for myself.

------
agumonkey
Said this already but worth repeating:

the best computer application I ever interacted with was an AS/400-era tax
filing system.

It was a superb tool in the prettiest meaning of the word. It was a carefree,
lag-free, pedagogy-free thing. Fill out data as it comes (the only smart keys
you needed were tab and shift-tab), with smarter validation checks and
suggestions than the average web-app form. Think instant Elm typechecking,
where you enjoy the computer assisting your work.

I was in my 4th year of college and blown away by how this thing beat
anything we'd seen or studied in school, yet it was antique, pre-OO, pre-UML
(maybe an ancestor of modeling hygiene, though). I wish I could have a talk
with the team who wrote it.

Of course, it was about to be replaced by a Web 2.0 variant which was a
resource hog, required constant scrolling, and didn't do half of the old work,
but was full-color CSS.

------
jungletime
The problem is human memory! People forget all the features, and dependencies
software has.

Since the natural direction of the universe is entropy, as time goes on
software is developed, dependencies are broken, and features quietly stop
working.

When you look into fixing it, you notice that it's this giant hairball of
dependencies which nobody can unwind. So the next best thing is to put
another layer of abstraction on the hairball, and you build on. The price is
that things get slower.

My Commodore 64 had a more responsive terminal than my MacBook, since it was
a thin layer over the hardware, only one or two abstractions deep.

HTML/CSS and JavaScript are garbage technologies for making responsive UIs;
they are so many levels deep, with so many dependencies and constraints.

The way forward is start fresh with something new, but no idea what it is.

~~~
BlueTemplar
Rust/Vulkan?

------
aforwardslash
Vaguely related, someone measured keyboard latency and it does seem slower

[https://www.extremetech.com/computing/261148-modern-
computer...](https://www.extremetech.com/computing/261148-modern-computers-
struggle-match-input-latency-apple-iie)

------
lunias
Computers of the past were undoubtedly slower on average, both in terms of
physical performance and UX. The software of now, compared to the software of
then, is slow relative to the hardware it runs on. Before, we strove to do
more with less. Today we strive to do more with more. In an attempt to do
more, faster and easier, we've made trade-offs.

I wish we would circle back and remember the
[https://en.wikipedia.org/wiki/Unix_philosophy](https://en.wikipedia.org/wiki/Unix_philosophy).

Not every app needs every feature rolled out by FAANG in the last decade.

JavaScript is not the best choice for every part of every application stack.

It seems important to me that programs become focused and composable again.

------
strictnein

      LOAD"*",8,1
    

Go get a snack and a drink, walk back to the computer

    
    
      bzz bzzzztttzzzt bzzzz
    

Oh, great, I recognize that noise from the disk drive; it means the game is
almost loaded. Just another 30 seconds.

~~~
rsecora
Right... or it can be even slooooower, you can watch a full "V" episode if you
have a Datasette.

As you point out, the waiting times were not in the seconds but in the
double-digit minutes.

~~~
strictnein
Yep.

    
    
      PRESS PLAY ON TAPE
    

Might as well go walk the dog.

------
Mattasher
This is a major source of dissatisfaction for me as well. Especially latency
related to keyboard input and program startup. Are there any resources out
there specifically intended to help you speed up your (Mac) with these things?

------
stopads
It boggles my mind how terrible the web is becoming on lower end connections.
I have about 200ms of delay to most servers and about 5Mbit of bandwidth, and
many major web pages take over 30 seconds to load if I don't block most of the
assets.

It's insane that webpages are now loading megabytes of data across hundreds of
http requests before the images even start downloading.

I've completely given up on the mobile web since I can't block things the way
I need to. It's just unusable. It's a platform that just flat out doesn't work
anymore for lots of people in remote areas.

~~~
squiggleblaz
> I've completely given up on the mobile web since I can't block things the
> way I need to.

Can't you just use uBlock Origin with Firefox Mobile and run it behind a
localhost VPN like Blokada? It's not perfect but it seems enough for me. A
different localhost VPN will probably give you even better results, if you
customise it to your specific purpose.

~~~
OJFord
Not on iOS. I got one recently for while my Android phone's being repaired,
and it was an unwelcome surprise that A) FF (or anything else) can't be set
as the default browser instead of Safari; B) 'content blockers' only work in
Safari, even though all browsers are forced to use its engine to render
anyway.

~~~
magduf
You shouldn't be trying to set anything besides Safari as your default browser
in iOS: Apple doesn't want you to. Your desires are not important if you're
using an iOS device, and you're thinking wrongly to be wanting to do anything
differently. If you want to do things your own way, Apple products are simply
not for you; they've made this abundantly clear for decades now.

~~~
OJFord
I thought I'd try it out while my usual phone's being repaired (plus I'm
sceptical it can even be repaired, so that two weeks might turn into four if a
replacement isn't forthcoming) - since Android has its issues too.

But yeah, I don't think it's for me. It may be a step forward on first-party
privacy, but it's two steps back on third-party. And 20 steps back on
'platform freedom', no changing defaults, no file access for running apps like
Syncthing, extremely limited customisation of the home screen (I like a simple
alphabetical list of apps, nope, can't have that) or 'control centre' (not
possible to put hotspot toggle there? Come on...) etc.

Plus fonts and UI elements feel massive, on the lowest text size. The Slack
'sidebar' takes up almost the full width, and about five channel names fill
the height. (On an SE. Larger models of course have more space, but it's the
size of things that irritates me, and that presumably wouldn't change; there'd
just be a few more of them.)

------
root_axis
This is just totally untrue. Even my Windows 10 computer boots from BIOS
screen to login screen in under 8 seconds.

I remember what it was like running windows in 1994, even after waiting 40
seconds for windows to start up, I'd wait upwards of an additional 3 minutes
before the computer was actually usable. I remember with keen frustration
waiting for the churning disk noise to die down as a signal that the computer
was now usable. Since the mid-2000s or so this has just never been a problem.

------
excerionsforte
Google Maps is great for A to B. I love it. Use it even if I know the route
since traffic can be a pain. It appears fast for me. I'm very much
disconnected from what the post says. I've used DOS, Windows 3.1, Windows 95
and so on. Now I'm on Mac and Fedora. Never will look back.

Let's remember here that most operations were executed against a cache of data
on the local system. That cache is way too big to store on the local system
nowadays, so of course we are going contact a server to perform the operations
and we will experience network latency that will vary. US internet speeds
have stagnated. Some time in the future, if typical ISPs don't get their way
and we get faster and faster connection speeds, these latency complaints will
be irrelevant.

Developers could do better in sending less data in general, but that will take
more time. Does product management care much about latency if they are not
given the mandate? We seem to all want features and more features, but less of
the cruft that follows as a result of development, which is weird. Best way to
vote in the end is to just build your own application however you can. Waze's
features would have never made it into Google Maps if users were not using
them instead of Google Maps.

------
ecpottinger
I use Haiku, which is a fast-responding OS. Yet I was able to double the
speed of text search by writing a custom version of GREP; however, I also
knew the kinds of searches I did, and got rid of a lot of the matching
options that were in the original code.

The result: a far faster program, but far fewer options available to the
user. I think you will find that many of the older systems that are faster
are also more limited in the options they offer to users.
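The speedup from dropping options is easy to believe: a literal-substring matcher skips pattern compilation and backtracking entirely. A minimal sketch of that trade-off (not the commenter's Haiku code):

```typescript
// Sketch: a "grep" stripped down to literal substring matching, the kind
// of option-trimming the comment above describes. String.prototype.includes
// compiles nothing, so per-line cost is a plain substring scan.
function grepLiteral(lines: string[], needle: string): string[] {
  return lines.filter((line) => line.includes(needle));
}

// The general version pays for flexibility: the pattern is treated as a
// regex even when the user only wanted a fixed string.
function grepRegex(lines: string[], pattern: string): string[] {
  const re = new RegExp(pattern);
  return lines.filter((line) => re.test(line));
}
```

Real greps expose the same trade-off explicitly: `grep -F` exists precisely so fixed-string searches can bypass the regex machinery.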

------
crazygringo
Even if true, our computers are able to access many, many, many orders of
magnitude more information than my computer could in 1983. What my computer
can do for me today is probably 1,000 times more valuable to me than what my
computer could do in 1983.

If it takes being slower, I'll take that tradeoff in a heartbeat.

(But I also remember it taking up to a couple minutes to load a program from a
floppy, so there's also that.)

------
bovermyer
Related to this, is there a native desktop application for kanban?

Essentially, a local version of Trello without input lag. I'm tired of web
apps.

------
seemslegit
I wasn't around computing in 1983, but phone and web app latency are still
way behind the average early-2000s office or line-of-business app.

Also ironic is the fact that the most significant improvement in
responsiveness during the last 20 years was afforded not by better software
or even faster CPUs, but by the move to SSDs for OS, data and swap space.

------
arminiusreturns
I would say it doesn't have to be like this. I have made a whole project out
of stripping complexity from things, which includes favoring CLI tools over
GUI counterparts, etc., and most of my daily stack is nice and fast and not
suffering from this. Webpages with good adblocking and scriptblocking start to
seem functional again, but are still the main sticking point since I am doing
a whitelist approach (and sometimes forget to commit my rule changes).

If a program is written in Electron, it's an automatic "won't install", for
example. The same goes for npm. If one learns to be more discerning in their
tooling, they can avoid many of these pitfalls.

I would also say; go on a pid hunt sometime. Does that process really need to
be running? No, like really?

    
    
      systemctl list-units --all --type=service  # what is actually running?
      systemctl disable $service.service         # keep it from starting at boot
      systemctl mask $service.service            # prevent it being started at all

------
fortran77
I'm old enough to have done applications that were meant to run on IBM 3270
terminals.

Applications that dealt with text and filling out forms were probably better
on the 3270-style interface than they are on most web applications, for no
good reason. Several times a week I get bitten by a bad form. Forget to put in
my Country, hit submit, and the entire form gets erased with an error message,
etc.

That being said, for graphical applications his hated mouse is better than
what we had in the late 70s. Doing CAD on Tektronix 4010 light-pen terminals
wasn't fun. People still went back to T-squares and drafting tables because of
this.

[https://upload.wikimedia.org/wikipedia/commons/6/62/Computer...](https://upload.wikimedia.org/wikipedia/commons/6/62/Computervision_piping.agr.jpg)

------
StreamBright
I think it has something to do with the 30M lines of code problem.

[https://www.youtube.com/watch?v=kZRE7HIO3vk&feature=youtu.be](https://www.youtube.com/watch?v=kZRE7HIO3vk&feature=youtu.be)

------
vectorEQ
People went with HTTP and other shitty models; now we live with the pain.
Good for selling new hardware though, so who really cares!

There could be tons done to still have our modern looks but old-school
performance. The issue is mainly in how we store and subsequently use data.

Tons of people still use DOS-era software for just this reason: their old
programs can perform the same work 1000 times faster because of how data is
stored and presented. It generally doesn't go through 1000 layers of
processing on each click, and a lot of data is stored as a 'view', not some
raw binary blob to be parsed out again on demand.

------
zwaps
This may be related to my bad hardware (which is, admittedly, not bad on paper).

However, there used to be a time when websites were fast. Like HN is now.

Now, most "apps" and websites are slow. Some are downright sluggish and
unusable.

Maybe I am doing something wrong.

~~~
rahuldottech
I wish all websites were as lightweight and fast as HN.

I sorta want to use New Reddit (because I like having smaller windows for
multitasking, and responsiveness makes that much more convenient), but boy is
it painfully slow, even on a 100+ Mbps connection.

------
nvesp
Whoa, no need to attack the poor mouse. It has its place, especially for a
generation of PC gamers that came up with optical mice. Me personally, the
muscle memory I have from PC gaming seems, for some reason or other, to have
made navigating UIs with a mouse that much more intuitive over the last two
decades and some odd years. To the point that it actually feels really good
and extremely proficient to me. I feel like I'm faster using a mouse than
pure keyboard shortcuts, although I do use a combination of keyboard
shortcuts and my mouse. I guess it just depends on the task/workflow.

~~~
mike_ivanov
"Whoa no need to attack the poor mouse."

Did you actually read the article?

~~~
nvesp
I think I responded to the wrong comment.

------
jp57
Opening a web browser is way faster now. In 1983 it took ten years.

------
anonnyj
I hate when games do something like have their main menu be a webpage. All I
care about is the start button, and sometimes it takes a good 10 seconds just
for it to pop up.

------
downerending
One of the things that drew me to Linux in '93 was running 'find /' on the
console. This was blindingly fast versus the same on a SparcStation 4 (a far
more expensive and generally superior platform).

This was sort of an accident, in that SparcStation consoles were slow partly
because they weren't meant to be used. But still, it was impressive as hell at
the time.

------
Semiapies
My problem is that I remember hard drives randomly and audibly grinding for
several seconds, making me unable to do anything else.

I remember old word processors lagging behind my typing.

I remember waiting for greenscreen systems to pull down results from a search
on a remote server.

When everything was great and optimized (and optimal), many things could be
instant, back in the day. Funnily enough, the same is true now.

------
Covzire
The only slowness I feel running off an NVMe SSD is iTunes on Windows. For
some reason Apple made every basic click in iTunes have a built in delay.
Doing almost anything has 300-5000ms of delay. Playing video and seeking ahead
or back freezes for about 5 seconds for me, while VLC, MPC-BE or Power DVD are
nearly instant for 60GB 4K UHD files played off a NAS.

------
hackpert
A use case where I see this issue very often and would love to see pure text
based interfaces is healthcare. Half of providers’ and nurses’ time is spent
clicking on tiny things in drop down boxes before scanning a barcode, which to
me is pretty silly. Hitting F4 or something and entering whatever, and then
scanning would make it so much faster.

------
carapace
This _kills_ me. Everything from waiting for the machine to boot (WTF ARE YOU
DOING IN THERE???) to waiting for the machine to shut down.

 _Oberon_ on a twenty-year-old thinkpad is a better _faster_ OS.

Go try it for yourself:

[https://schierlm.github.io/OberonEmulator/emu.html?image=Ful...](https://schierlm.github.io/OberonEmulator/emu.html?image=FullDiskImageWithSource&width=1024&height=576)

This is faster in your browser than your whole computer can run the current
bloated crap. Click "reset" in the gear menu and watch it reboot. OOoooo so
fast.

Oberon OS plus "Gadgets" has never yet been equaled or exceeded by any other
UI, anywhere. If you haven't used it you can't know.

Just the separation of selection and (text, not mouse) cursor is so useful,
let alone the built-in cut/copy/paste with mouse chords. Don't get me started
on the Gadgets system...

But enough about the greatest OS+UI ever made...

(And my computer freezes for a full second when I scrolled down to the
_broken_ (on my machine) image. The fuck Firefox Y U so slow!? Junk. It's all
junk. Made out of junk, fractal dimensions of over-wrought crap. Redundant
computations that burn energy to do nothing but fail to be better than
decades-old versions of itself, the fires of a digital hell.)

------
Mikeb85
> When you're all done, you go back to your plotted trip and start laying out
> the chosen locations and optimizing your path.
>
> You can do this with a paper map. You can't do this with gmaps. So you just
> don't do it.

Has this guy never used Google Maps? You can definitely plot out a trip with
multiple points and pick specific routes for each leg along the way. I know
because I take a lot of road trips.

Anyhow, computers might be perceptually slower but do so, so much more and are
more pleasant to use. Being able to save all my documents to the cloud and use
them from any browser is massive. Google Maps has changed the way I travel.
The fact I can control my TV from my browser while using HN and planning a
route on Gmaps at the same time. The fact I can plug anything into my laptop
and have it just work (even my Linux laptop!). And if you really, really miss
that 80's era text-only interface, well, you can still have that.

Again though, the sheer amount of data you have access to nowadays and the way
computers can visualise it is mind-boggling. Staying on the maps theme, the
fact you can find any location on earth, get a satellite image of it and a
whole bunch of info in it just by dragging a map around is insane, and I grew
up in the early internet era.

------
dagaci
For web apps, the more time a user spends clicking and browsing, the better
the app's engagement statistics look.

Apps like Facebook, Twitter, LinkedIn, and Instagram often feel like you're
wading through mud.

The drive to have users spend more time in your web app takes priority over
the efficiency of the user experience.

------
warrenmiller
I remember when all of this was trees.

------
pelasaco
Some 50 years old developer is angry because he must use "the web".. I'm 40, I
was using PC XT/AT in 1988, and no, clipper, reading from floppy disk A and
rendering it, Wordstar and Lotus weren't faster. You are just older.

------
Causality1
I found recently that dragging the middle half of the screen on mobile street
view now scrolls up and down the street instead of rotating the view. I don't
know why companies obsessively break things that worked fine for years.

------
kfarr
Dan Luu's post on computer latency is my favorite data driven approach to this
topic: [https://danluu.com/input-lag/](https://danluu.com/input-lag/)
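Dan Luu's numbers come from filming keypress-to-photon with a high-speed camera, which software alone cannot replicate. Purely as an illustrative sketch (my own, not his methodology), you can at least time one slice of the pipeline from inside a process: how long a write-and-flush to the terminal takes. Treat the result as a lower bound on end-to-end latency, since the terminal emulator, compositor, and display all add time afterward.

```python
import sys
import time

def echo_latency_ms(samples: int = 100) -> float:
    """Median time (ms) for one write+flush of a character to stdout.

    This captures only the syscall/buffering slice of the latency
    pipeline, not rendering or display scan-out.
    """
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        sys.stdout.write(" \b")  # emit a space, then backspace over it
        sys.stdout.flush()
        timings.append((time.perf_counter() - start) * 1000.0)
    timings.sort()
    return timings[len(timings) // 2]

if __name__ == "__main__":
    print(f"median write+flush: {echo_latency_ms():.3f} ms")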

------
shadowgovt
Useful to know: the 'past' link underneath a headline on HN will show you past
instances of the same URL being shared as news.

This is our 6th time discussing this topic, and we have become exceedingly
efficient at it.

------
mseepgood
Is a 1982 Commodore 64 faster than a new MacBook Pro?
[https://www.youtube.com/watch?v=nI3C9yLVsVE](https://www.youtube.com/watch?v=nI3C9yLVsVE)

------
Godel_unicode
What about color-grading video? Photos? Backing up all of your family slides,
negatives, and portraits? Learning to program for free on your own time?

Get out of here with this spotlight-y maga-esque ridiculousness.

~~~
erikbye
> Learning to program for free on your own time?

We have been doing this for many decades.

~~~
Godel_unicode
Maybe a small select group have been. Do you really believe that the resources
available in 1983 are even in the same category as the current ones? Come on.

~~~
erikbye
> Do you really believe that the resources available in 1983 are even in the
> same category as the current ones?

No, I never wrote anything remotely close to suggest so, if by "in the same
category" you mean volume. If you implied quality, well, that is another
matter. There is so much low-quality content online; you could argue it makes
enough noise to drown out the high-quality content. The first search results
are rarely the best.

However, people should stop disseminating distorted realities. People learned
things before the Internet. I would even suggest many learned better before
the Internet. You did not need a superficial Medium blog post on a subject or
a hand-holding tutorial. You reverse engineered the binary. Besides, there's
been a ton of books on programming for a long time. I have a great one from
'78, The C Programming Language.

~~~
Godel_unicode
> you could argue it makes enough noise to drown out the high-quality content

> I would even suggest many learned better before the Internet.

Ok Boomer.

~~~
erikbye
Your response contributes nothing of value and is better suited to Reddit.

Anyway:
[https://duckduckgo.com/?q=internet+attention+span](https://duckduckgo.com/?q=internet+attention+span)

------
chooseaname
I remember my BBS days, waiting minutes for stuff that happens instantly now.

That being said, it is a bit of a shame that the faster computers get the more
time the computer spends on fancy UI elements, etc.

~~~
geocrasher
"That being said, it is a bit of a shame that the faster computers get the
more time the computer spends on fancy UI elements, etc."

Isn't that the point of faster computers? ;-)

------
dragonsh
Google Maps is a bad example in this rant. I've used it to explore many
cities, not only for turn-by-turn directions. It worked well, although I
personally prefer OpenStreetMap, because I feel one company shouldn't control
so many aspects of life, and I like its open-source nature.

I was even able to discover an onsen built by the Ainu community living in
Hokkaido, starting from a lake I noticed just by zooming in and out of the
map. Turn-by-turn directions then helped me reach it, simply by clicking and
dropping a pin there. I submitted a text description of it, and it has since
been visited by many people.

I also took a journey through treacherous mountains, climbed to 2,500 meters,
and crossed via a tiny off-road track designed for motorcycles. That too was
based on exploring the map on my phone.

Paper maps didn't provide such convenience. Yes, POS systems are slow, but the
expectations are different today. Try integrating an old POS with ten payment
networks that authorize transactions in milliseconds. I worked with DOS-based
POS systems from 1990 on and wrote IBM MQ C++ code for integration. I don't
want to go back there.

I think today is a step forward. As always, there will be good and bad, just
as there was in 1980 or 1990. But largely, we do much more with computers
today than we ever did in those days.

So not everything on computers is perceptually slow. The Apple Watch, iPhones,
and Android devices are responsive and usually fast. Windows 10 might be slow
for certain tasks, but overall it's not that bad either.

As far as exploration goes, you can do the same or even more today than
before. I can discover planets, build 3D models in physical form, use a
desktop CNC machine, and do laser cutting in my home.

I can build a computer like the BBC Micro or Raspberry Pi, fly a plane using
fly-by-wire and test it with a real physical model, or do small-scale farming
at home using hydroponics. I could go on and on. All of this is possible
thanks to advances in speed and reductions in the size of computing devices
and peripherals.

So I feel it's not that computers are slow; it's that the work we did with old
computers is very different from what we do with modern ones.

------
ThaJay
Google Maps is actually bloody good at just being a map. It's extremely
convenient to zoom in and out, click around, view locations, add them to your
route, etc. Yes, you can up the productivity by using more tabs, and that is
the standard nowadays because we humans are being trained to be multitasking
monsters. Forgive me for ending my read there, because to me it just sounds
like a rant from a bitter old person who doesn't understand this brave new
world any more.

------
ixtli
I hate to be this guy, but there are good points in here, so _why_ is there a
newline after every sentence? This is not how written English is supposed to
work :(

EDIT: Apparently it's output from a Twitter thread unroller. My point still
stands and, ironically, it's another example of how randomly smashing a bunch
of applications together is broadly a degradation of the status quo.

~~~
arathore
The original content was a series of tweets and the link is a website that
summarizes long twitter threads, so each newline indicates a follow-up tweet.

------
dmead
Mice became worse when they switched from PS/2 to USB. Ask the Quake
community.
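Some context on why the Quake crowd makes this claim: PS/2 mice are interrupt-driven, while USB HID mice are polled, historically at a 125 Hz default. A small arithmetic sketch (mine, not from the thread) of the worst-case delay the polling interval alone can add:

```python
# Worst-case latency added purely by USB polling: an event that occurs
# just after a poll has to wait a full interval for the next one.
def worst_case_poll_delay_ms(poll_rate_hz: float) -> float:
    """Maximum delay (ms) introduced by polling at the given rate."""
    return 1000.0 / poll_rate_hz

if __name__ == "__main__":
    for rate in (125, 500, 1000):  # common USB mouse polling rates
        print(f"{rate:>4} Hz -> up to {worst_case_poll_delay_ms(rate):.1f} ms")
```

At the old 125 Hz default that is up to 8 ms; gaming mice now typically run at 500 or 1000 Hz, which is why the difference matters far less today.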

------
geocrasher
I just wish the author used actual paragraphs. I find a string of 1-2 sentence
"paragraphs" difficult to read. I glanced through it, thought "he's got a good
point here and there, but it's not THAT big of a deal," and then closed it.

------
kavi87
The author does not own a smartphone apparently.

------
fourzs
It was really nice of this guy to take time out of his busy day at the
retirement home to write this.

------
bonestormii_
To me, the major paradigm shift is away from native clients to a model that,
at the very least, _allows_ for the entire client to be re-downloaded every
time the application is loaded, which is the case in the current web-app
landscape.

Even if the entire application is stored locally, the emphasis is frequently
on code reuse with systems like React, so re-downloading is still baked into
the design.

We talk about function keys being missed--but the fact is, one of the main
issues is that your web browser already makes use of many of these keys. This
is not a problem with the web app you are using, but with the fact that you
are using a web app, which is a client running within your browser, itself a
native client.

Simply returning to native desktop clients for productivity software would be
a huge step in the right direction. There are economic incentives _not_ to
accommodate this user need, however. Developers need to offer cloud-based,
app-based access to their services in order to remain competitive. For many
companies, it is built into their SaaS business model! If you develop a
productivity application professionally, you would need to believe that the
existence of the native client (and all of the resources and man hours it
consumes) will give you a meaningful competitive edge. The benefits to users
are obvious to developers who have time to reflect on it, and to the users
running a POS system, but not to the people several levels above the POS
operator who are making decisions. For them, a cloud system that centralizes
access, control, and supervision simply works better.

I had a job recently where I had to do a lot of invoice data entry into an
intranet cloud system. The system was slow; fields had bugs that literally
caused the text to be entered backwards for some reason (seriously)... and it
sucked for me. But for our business partner, who was a majority owner of our
main asset, it afforded control, centralization, and remote administration.

I actually agree with the OP aesthetically, but it dwells on the POS operator
and forgets the hidden users: the POS operator's managers and the business's
owners.

I will also say, to those of you who feel anti-JavaScript: an SPA is _much_
better at managing stateful applications than pages dynamically generated by
the server, and I'd much prefer to minimize chat between client and server
where possible, which SPA frameworks do afford.

I think the whole thing could be improved by better support for native-style
applications in the browser, possibly running in some "native mode".

The last thing I'll say is that windows are nice for multitasking, but they do
represent a significant cognitive load to manage, and you can't usually do it
from muscle memory. I remember teaching my grandparents to use AOL in around
~2000, and I literally had to make exercises for them to try to establish
"object permanence" with windows that were hidden. The interface appeared flat
to them, and they didn't intuitively grasp that another window jutting out
behind another was a part of a separate interface altogether. They got it
eventually, but it is the type of thing that is easy to forget when we live
our lives enmeshed in window based operating systems. But look at phones--they
don't have window-like functionality even after all this time, with bigger and
better screens, multi-core processors, etc. Why? Because we still consider a
phone to be a "device". It's capable of multitasking, but we subdue that
aspect of the interface in favor of focus and usability. Terminal applications
are similar in that respect, and I think that's why people find them
attractive in some contexts.

To anyone who feels strongly about this, I'd say: Develop a beautiful
productivity app for native desktop usage. Market it. Make people fall in love
with it. A deluge of such apps may promote a reversal of this trend.

------
cathames
thanks for writing what I've been thinking for so long

------
jimhefferon
TeX is faster.

~~~
BlueTemplar
It tends to be compiled into PDF these days, which is computer-hostile.

------
B008L355
Caffeine be messing with your head boy

------
tracerbulletx
rofl. no its not.

------
shp0ngle
This post must be 5 years old; nobody says “AJAX” in 2020

~~~
masswerk
Yes, we now have a few layers of abstractions on top of this. :-)

------
rboes2
Sorry, this seemed like a terrible article to me. Pure BS.

------
maxdo
Another version of “back in those days everything was so good”

------
dave_j
An article like this gets posted every so often, and it is normally absolute
tripe.

When I was a kid I had a hand me down computer. Generally it took 20-30
minutes to load a game from cassette. There was no interchangeability between
computer manufacturers. I had the choice of BASIC or Assembly to program in,
and the whole interface was BASIC.

Today I can load up a web browser in a second or two, transfer files between
machines with ease (for the most part) and I can download in minutes any
language I want to program in.

------
friedman23
I remember programming culture in the 2000s, and it was so stupid. Every
programmer was whining 24/7 about 'bloat', though back then they were
complaining about Java, not Electron. Thank God the whiners never got anything
they wanted.

~~~
icedchai
It's even worse now. JavaScript, originally intended for form validation and
image preloading, took over. And they decided to compile it (sorry,
transpile), thus making front end work as tedious as back end.

