
Almost everything on computers is perceptually slower than it was in 1983 (2017) - seesawtron
https://twitter.com/gravislizard/status/927593460642615296
======
topoftheforts
This is a repost favorite on HN apparently

Previous discussions (from oldest to newest):

\- [https://news.ycombinator.com/item?id=15643663](https://news.ycombinator.com/item?id=15643663)

\- [https://news.ycombinator.com/item?id=15648652](https://news.ycombinator.com/item?id=15648652)

\- [https://news.ycombinator.com/item?id=21831931](https://news.ycombinator.com/item?id=21831931)

\- [https://news.ycombinator.com/item?id=21835417](https://news.ycombinator.com/item?id=21835417)

\- [https://news.ycombinator.com/item?id=22365795](https://news.ycombinator.com/item?id=22365795)

~~~
shadowgovt
I can only assume HN has a critical mass of people who weren't alive in 1983
willing to believe the headline assertion. ;)

~~~
dijit
I had a c64 and it was what made me fall in love with computers.

I tried my friend's recently and it really did feel like a joy. It's basically
a toy now, impossible to use for anything meaningful (even IRC).

It’s hard to convey in text the “feeling” of immediacy and low input lag. I
suppose it’s similar to how some gamers talk about fps, it feels immersive,
like the machine is in-tune with you. It genuinely feels like something
mechanical that you’re interacting with rather than something that “will get
to it”.

It’s very subtle but it’s there.

~~~
shadowgovt
It feels in tune until the moment you hit Return, generally. Then the machine
has to do actual calculation, and you're reminded of the clock speeds you're
dealing with.

Yes, I have very low latency while I type RUN EXCEL, but who cares when the
floppy disk access that comes next takes 20 seconds?

~~~
dijit
Yeah, but you still feel like the machine's working, right? You get this
immediate crunching or a light blinking immediately.

I did say that throughput is significantly worse, for sure. But when I open a
program there's about a second where it seems like my computer's doing nothing
at all.

Websites too, click a link and it seems like nothing until the page goes blank
and starts rendering the new site.

Most laptops don't have hard disk indicators, all indication of work has gone
away, and rendering it to the screen takes time - and isn't often followed
anyway.

~~~
shadowgovt
This is true. Apple, in particular, is aware of this; iOS has specific visual
indicators designed in to make it look like the machine is doing work even
when it's just loading state. It was one of the distinguishing factors between
the iPhone and Android user experiences early on.

------
napoleond
I call bullshit. The sort of attitude expressed in this thread is important,
because it pushes us forward. The author identifies real UX shortcomings with
current software. Nevertheless, the assertion that "almost everything on
computers is perceptually slower than it was in 1983" is patently false, and
the comparison between online maps and paper maps seems almost intentionally
obtuse. Yes, online maps could have even more capability and better UX than
they do today. No, it is not more difficult to conduct mapping activity today
than it was in the 90s. (For instance, the example given about finding
interesting places to stop along a route: the 90s equivalent of that would
have been a book, separate from the map, not searchable, and full of things
completely irrelevant to your interests. So yes, maybe putting a mark on a
piece of paper is easier than doing it in online maps, but every other aspect
of that process is way harder without the internet.)

~~~
CodeWriter23
Try servicing your neighbor's P4 machine running XP, then review your
position. (I recently had the pleasure of doing this.) It's not all the way
back to 1983, but the snappiness compared to Win 10 or macOS on recent
hardware is startling.

~~~
bluedino
I don't miss the 2 minutes to boot up or 10 seconds to start Word and 60
seconds to start Adobe products.

~~~
usrnm
The progress in boot times has much more to do with SSDs than anything
software-related.

~~~
coding123
Not if you go back far enough. With Ultima 8: Pagan, whenever I switched
"maps" on my 386 it literally took about a minute to load the next area. When
I got my next computer, a Pentium 100 MHz, it loaded the next map area in
about 1-2 seconds. Neither of those computers used an SSD.

------
jfengel
What version of 1983 did he live through?

My Osborne 1 took a minute to load WordStar. Searching a document would take
more minutes if it was longer than about 30k because it had to swap to floppy
disk. Connecting to a BBS was time for a bathroom break. So were compiles.

I think most of what is perceptually slower for the author is that it's a
small percentage of the 40 years it took to get here. I loved my O1 but it did
what little it could do very slowly. My phone in my hands right now is doing a
trillion more things better and faster.

~~~
drchopchop
Clearly this person never tried to load a program on a C64 with a tape drive.
I could make myself lunch and the thing still wouldn't be done, and there was
a non-zero chance it would screw up somehow and you'd have to rewind the tape
and start it over. Upgrading to floppy wasn't much better, either, before the
advent of fast load cartridges.

Contrast that with my current Win 10 install, which on a 3-year-old computer
goes from "off" to login screen in under 10 seconds.

~~~
dijit
Depends on what the goal is of course. A c64 started and dropped you into the
interpreter in under a second.

Input latency being another absolutely astonishing difference.

But modern computers have so much more computational throughput that they’re
many orders of magnitude more useful as tools.

I think the author is just saying that we lost the feel of immediacy for the
common case, which I agree with. Sometimes I press something and there's
either a sub-second delay (text input) or a large pause (clicking an icon)
before the machine seems to be doing anything.

Back in the day, you press a key and your computer is displaying it before the
key has risen; you press enter and your hard disk churns immediately and a
flashing light would blink merrily indicating work. That’s what perception is.

(Also, you could use a c64 with a floppy or hard disk, which was obviously
much faster than tape.)

------
rplnt
Don't know about 1983, but a lot of software is definitely slower than it was
10, 20 years ago.

Something that used to result in 2MB application with 5MB in memory
requirement is now done as half a gigabyte of resources that will spawn dozens
of processes totaling at least 1GB of memory. Sure, computers are faster, but
not by that much apparently.

Looking at all those electron (et al) apps. Cheap to write beats user
experience I guess.

~~~
The_Colonel
I miss some concrete examples.

What exactly was faster to do 20 years ago than now?

~~~
miltondts
Visual studio startup and debugging. Source:
[https://youtu.be/GC-0tCy4P1U](https://youtu.be/GC-0tCy4P1U)

~~~
The_Colonel
IDEs are, I think, a good example. There were IDEs in the 80s and 90s, but
they were very different from today's.

For example, back then they did not index files at startup, since there was no
feature (like "find references") that needed it. These days IDEs provide a lot
of "intelligence" and static analysis, for which you need memory, a lot of
reads, and processing.

So yeah, I guess VS got slower, but it's because it's now doing many more
(useful) things.
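A toy illustration (my own sketch, not from the thread, with made-up file contents) of the kind of work a modern IDE does up front: scanning every file once to build a reverse index, so that a later "find references" is a dictionary lookup instead of a project-wide search:

```python
import re
from collections import defaultdict

# Hypothetical project files, purely for illustration.
FILES = {
    "main.py": "result = add(2, 3)\nprint(result)",
    "math_util.py": "def add(a, b):\n    return a + b",
}

def build_index(files):
    """Scan every file once, mapping identifier -> [(file, line), ...]."""
    index = defaultdict(list)
    for name, text in files.items():
        for lineno, line in enumerate(text.splitlines(), start=1):
            for ident in re.findall(r"[A-Za-z_][A-Za-z0-9_]*", line):
                index[ident].append((name, lineno))
    return index

index = build_index(FILES)
# "Find references" is now instant, paid for by the indexing pass above.
print(index["add"])  # → [('main.py', 1), ('math_util.py', 1)]
```

Real IDE indexers also parse the code properly and resolve scopes, which is exactly the extra memory and CPU the comment describes.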

------
rsecora
The experience of the 80s was slowness; things were so slow that it's
unimaginable now. So unimaginable that we don't remember the feeling anymore.

The experience was so slow that you could play another game while loading [1].
Namco even patented "playing games during load" by 1987 [2].

As an example, I have fond memories of my C64 tape loading Hunchback [3] ...
while watching "V". A single tape could take an hour if it failed in the
middle and you needed retries. All those waiting times were orders of
magnitude longer than anything a toddler would tolerate today. We were so
relieved when fast loaders appeared [4] or when a 1541 arrived home [5]. And
then we were so frustrated that the 1541 managed only 300 bytes/s, against the
300 bits/s of the Datasette.

The scenario in Xenix or PC-DOS loading from 8" floppies was similar.

I'm not arguing that today isn't better. I'm arguing that in the 80s we had
the best available for the time, and those were good times to remember.

[1] Mini Games: [https://en.wikipedia.org/wiki/Invade-a-Load](https://en.wikipedia.org/wiki/Invade-a-Load)

[2] Namco patent:
[https://patents.google.com/patent/US5718632A/en](https://patents.google.com/patent/US5718632A/en)

[3]
[https://en.wikipedia.org/wiki/Hunchback_(video_game)](https://en.wikipedia.org/wiki/Hunchback_\(video_game\))

[4]
[https://en.wikipedia.org/wiki/Fast_loader](https://en.wikipedia.org/wiki/Fast_loader)

[5]
[https://en.wikipedia.org/wiki/Commodore_1541](https://en.wikipedia.org/wiki/Commodore_1541)

------
bluedino
We replaced an AS/400 (well, iSeries) accounting/inventory system with the hot
new web-based ERP from DATABASE_CORP.

People had been using this system for over 20 years. Muscle memory from
hitting Tab, Tab, F2, enter...that pulls up the inventory detail. F5, 3, 6,
Tab, that brings up the wholesale price.

Now, with the 'better' system, people have to move the mouse up to Lists,
Items, wait for the Item page to load, then choose the item they want, wait for
that to load, then click on the Inventory or Pricing tab, wait for that to
load. They can save a few steps by typing 'item: 25630' in the search bar. But
of course it loads about 5x slower.

You could also hit PgUp/PgDn to cycle through items. Now you have to click the
little arrows at the top of the page, and the view resets each time, so if you
were on the Pricing tab, you have to click on that again. The old system left
you where you were.

Sure, the new system has all kinds of advantages in other places, but to most
of the internal users, you just ruined their world and they hate the
unfamiliar, slower system that costs millions of dollars.

~~~
jcwayne
This seems to be a common problem with web-based business applications.
Little, if any, consideration seems to be given to user efficiency. I consider
that a critical part of UX, but I seem to be in the minority.

------
umvi
Because each successive generation of programmers grows less efficient with
the resources available.

A new programmer in 2020 probably treats RAM and CPU as near-infinite
resources. Hence you have super simple 2D games that are 900MB in size popping
up in /r/Unity2D, webpages weighing in at 15 MB, etc. They just don't know
those are unreasonable sizes because they've never tried to make a 100KB
game/website.

~~~
The_Colonel
"unreasonable" is totally arbitrary.

30 years ago, 100 KB for a simple 2D game would have been considered unreasonable.

~~~
npongratz
30 years ago was 1990. I remember 100KB being fairly reasonable for a 2D game
in 1990, even when having to download it with a 1200 baud modem.

~~~
u801e
There must have been a large increase in game size between 1990 and 1995. I
remember games like Lemmings 2: The Tribes and SimCity 2000 requiring multiple
floppy disks.

~~~
umvi
No, 100KB is a bad estimate. Most SNES ROMs are 1-3 MB, though NES ROMs are in
the KB range.

------
spottybanana
The statement is simply not true. The first example compares an offline
interface to an online one - a more realistic comparison would be to something
like the first BBSes, and the experience with those was generally insanely
slow.

I didn't bother reading the rest of the rant because I think it doesn't make
sense; it's just the personal "feeling" of the author. Objectively speaking,
computers are insanely faster and can do many more things. For example, I use
the Sublime Text editor with multiple plugins and features, and all of it
feels instant. The same goes for the web apps I use 95% of the time - the
experience is nearly instant.

~~~
dcabrejas
Totally. I read about 10 lines and gave up because it's just a massive angry
rant.

------
temporallobe
This mostly sounds like the ramblings of a frustrated user for a particular
application who is projecting this to the entire domain of modern
applications. The thing is, they’re not wrong.

I worked on a multi-year multi-million dollar project to replace an aging
greenscreen US Federal government legacy system (originally written in Natural
/ Adabas). The replacement was supposed to have a modern web application
front-end with J2EE middleware and a modern relational database (Oracle IIRC).
Literally tens of millions spent, but the project was cancelled. Why? First of
all, there was a nasty project management issue I won’t go into now, but the
more important problem in my opinion was that basically it sucked. The
application was extremely slow and cumbersome, response times were
unacceptably long. Users complained they actually preferred the old
application because it actually worked and actions were instantaneous. There
were key combinations that meant something, etc.

In addition to the usual incompetence that abounded on projects like these, my
conclusion was that far too much time was spent on the infrastructure and not
enough on UI/UX. I am a solid UI developer with no official UX background but
I have enough experience to know what pissed users off, and the application
design had these issues in droves. All of this also revealed just how much
abstraction we use in modern application development. The old greenscreen apps
were very interconnected and were run on “bare metal” systems such as
mainframes. An action you took immediately got stored in the database and
lookups were lightning quick. The modern equivalents to these go through many,
many layers of processing that never before existed.

Modern application development is hard, really hard. Sure it’s easy to throw
together an app with modern development tools, but once you want to make
anything Enterprise-grade, customizable, reliable, and actually usable, it
takes time, dedication, and talent.

Don’t ignore UI/UX.

~~~
leeter
This reminds me of something Joel Spolsky said a while back[1]: "Never Rewrite
from Scratch". I stand by that too. Even if it's a legacy mainframe app, it's
probably better to use the strangler pattern if you really need to extend it
and the development platform doesn't suit the need.

I'm also reminded of the SABRE rewrite that had similar issues where agents
got massively slowed down by the removal of the 10key navigation that ruled
the old system and they had to add it back as emulation.

[1] [https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/](https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/)

------
root_axis
I remember waiting at least 8 minutes going from boot to a usable desktop on
windows 95. Opening up IE completely froze the computer for 2-3 minutes while
it churned away. Minimizing mech warrior would take 45 seconds as I literally
watched the desktop be slowly repainted pixel by pixel. Installing the 400MB
Daggerfall took all of half an hour. Applying an image filter in Paint Shop Pro
took 15 seconds at a time... the idea that computers are perceptually slower
today than in the 80s is just totally false. I get the sense that people
parroting this kind of nonsense are simply invested in an ideological
narrative that means to disparage the contemporary programmer as lazy or
uninformed because they grew up at a time when you didn't have to manually
manage memory... as if _that_ was some kind of computing golden age.

~~~
marcosdumay
That's the thing: 95 was not 83.

In 83 you'd boot into the computer's ROM or an OS floppy. The floppy completed
in a few seconds; for the ROM it's meaningless to talk about wait time.

You wouldn't install software. You would load it from a tape, a floppy or a
ROM cartridge. For the cartridge it would, again, be meaningless to talk about
wait time. The floppy would load faster than a 95 program from the hard drive
(thanks to bloat), and the tape would take minutes.

No program would thrash your system, because the memory management algorithms
that are subject to thrashing were only available on mainframes, and people
didn't have disposable disk space anyway.

------
JoeAltmaier
Hm. I remember Windows 2.1 would composite the screen image right on the tube.
You could watch it paint the components in bursts - a window background with
holes where the components would go (with other windows showing through the
holes!) then components would populate then data be drawn. Could take a second
or two.

But sure, sometimes programmers take advantage of speed and space to do more.
Which can look like "the same thing got slower". But it's hardly the 'same
thing', since now you can have HTML renderings inside components etc.

Takes a moment to reconcile code across the web as well. If we amortize the
time it took to update an app from floppies (hours?) to each execution,
compared to modern self-updating code over the net we'd see a fairer
comparison.

~~~
theandrewbailey
> Hm. I remember Windows 2.1 would composite the screen image right on the
> tube. You could watch it paint the components in bursts

I wish there were more things like that. I would rather see computers work
than see some fancy animation that is there only to look pretty, but
ultimately ends up wasting time.

------
foobarian
> one of the things that makes me steaming mad is how the entire field of web
> apps ignores 100% of learned lessons from desktop apps

Unfortunately it's worse than ignoring. It's intentionally abandoning, after
years of A/B tests optimizing for revenue have found a perceptual sweet spot.
Sadly for people who care about fast UI that point is way up on the
sluggishness scale...

~~~
MattGaiser
Is there a reason that is the case? What is it that makes the new Reddit UI in
all its slowness nicer to people?

~~~
celeritascelery
It looks more modern. That seems to be the only optimization target.

------
oarsinsync
This is from 2017.

Previous discussions:

[https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...](https://hn.algolia.com/?dateRange=all&page=0&prefix=true&query=almost%20everything%20on%20computers%20is%20perceptually%20slower%20than%20it%20was%20in%201983&sort=byPopularity&type=story)

------
JulianMorrison
Random fact: I've been playing with RISC OS on my Pi 3B+ and oh my gosh, how
snappy is it? Even relative to pure desktop, non-web apps. It's the last of
the true 80s OSes and honestly, I feel it represents a missed opportunity.
Computers used to be simple enough to start in seconds.

------
danpalmer
Requirements change.

A search field in a local application opening instantly is not the same as a
search field searching orders of magnitude more data on a machine thousands of
miles away.

Things were "faster" because so much _blocked_. Modal windows suspended
applications. Now in the ~100ms it takes to open a window your computer is
still doing other things in that application, in other applications, in the
kernel, on other cores. We may have traded latency for _throughput_ in some
cases, but that's not necessarily a bad thing, and often perceptually much
faster still.

Lastly much of the speed back then came from things that are not practical for
security reasons now. Shared memory makes a lot of stuff possible, and
quickly. TempleOS (for all the faults of the author) demonstrates that today
there are still really neat (and fast) things that can be done when you don't
have to worry about security at all, but unfortunately we don't live in that
world anymore, and a cost of our connected world is that security must be a
design principle from the outset, and there are performance costs to that at
every level of the stack, whether it's memory protection, ASLR, stack
canaries, SSL/encryption, process sandboxing, containerisation, etc.

------
jrockway
I thought this was going to be a rant about how slow modern keyboards are with
long debounce cycles and waiting for the USB poll interval. Or if not that,
how every change to the screen is buffered and waits until the next vertical
sync before being drawn. Those things are quite noticeable once you start
looking for them (you can thank me later!)
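These pipeline delays can be put into rough numbers. A back-of-the-envelope sketch, using typical figures that are my assumptions rather than measurements of any particular machine:

```python
# Rough worst-case budget for how long a keypress can sit in the pipeline
# before the glyph reaches the screen. All figures are typical/assumed
# values, not measurements.
debounce_ms = 5.0   # firmware debounce window (varies by keyboard)
usb_poll_ms = 8.0   # 125 Hz USB polling: up to one full interval of waiting
vsync_ms    = 16.7  # 60 Hz display: update buffered until the next vsync

worst_case_ms = debounce_ms + usb_poll_ms + vsync_ms
print(f"worst-case added latency: ~{worst_case_ms:.0f} ms")  # → ~30 ms
```

That is before the application or compositor does any work at all, which is why these delays become noticeable once you start looking for them.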

But instead it's "when I write an email to someone, the software looks through
every person I've ever corresponded with and suggests an email from that
list". The 1983 version of that was waiting 5 minutes for a bounce because you
typo'd the address. Now it's taken care of for you in a few hundred
milliseconds. Not that bad.

As for autocomplete being slow... there isn't really a good reason for this.
I've seen PHP apps that add a solid 100ms to every request because they are
built in the one-process-per-request model, and the framework does a lot of
setup and teardown for every request regardless of whether or not the request
uses those resources. Obviously, don't do that. There is no intrinsic reason
that "AJAX" apps need to be slow -- you can do a database roundtrip in less
than 10ms. Websockets and HTTP/2 eliminate TCP setup/teardown overhead.
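To make the sub-10ms claim concrete, here is a minimal sketch (my own illustration with a made-up table, not from the comment) timing one autocomplete-style prefix query against an indexed in-memory SQLite table; the database round trip itself is tiny, so user-visible slowness comes from per-request setup and the network:

```python
import sqlite3
import time

# Build a toy contacts table with an index. Names and data are invented
# for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (email TEXT)")
conn.executemany(
    "INSERT INTO contacts VALUES (?)",
    ((f"user{i}@example.com",) for i in range(100_000)),
)
conn.execute("CREATE INDEX idx_email ON contacts(email)")

# One autocomplete query: everything starting with "user1234",
# expressed as an index-friendly range scan.
start = time.perf_counter()
rows = conn.execute(
    "SELECT email FROM contacts WHERE email >= ? AND email < ? LIMIT 10",
    ("user1234", "user1235"),
).fetchall()
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"{len(rows)} suggestions in {elapsed_ms:.3f} ms")
```

On typical hardware the query finishes in well under a millisecond, which supports the point that the framework and transport, not the lookup, are where the 100ms goes.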

Things could be faster, for sure. How much extra would you pay? Most customers
wouldn't pay extra for a faster app, so apps aren't faster.

As for why people make mouse-based apps instead of TUIs... it's because the
average person has to interact with tens or hundreds of pieces of software on
a daily basis these days. If they all work differently, people will never use
them. They ask for and prefer fast to learn but slow to use. So developers
build on a common set of idioms, even if they are slow in the long run. You'll
note that "power user" software is often hard to learn (think Emacs, Vim,
Blender, Solidworks) but quite efficient once the user has been trained.
People aren't interested in doing that much work these days.

Finally, business processes account for a lot of this stuff. I've been to
stores that use a nice efficient TUI for their in-store pickup. But my
question is: why is software with a UI even involved? You should scan my order
number barcode, scan my driver's license barcode, and then hand me my
merchandise. (Why even scan my driver's license. Just eat the fraud and save
me 5 seconds! That's a business decision, not a software implementation
quirk.) Having to navigate around to a bunch of fields and manually type that
information in isn't efficient. The UI shouldn't even exist.

~~~
altfredd
> There is no intrinsic reason that "AJAX" apps need to be slow -- you can do
> a database roundtrip in less than 10ms.

This is wishful thinking. You can't expect consistent 10ms (or even 100ms)
response times from network-driven autocomplete. Never mind the fixed speed of
light (which already makes the 10ms figure impossible for transatlantic
users); you will have to account for all kinds of temporary bottlenecks in the
network. DNS can take _a lot_ longer than 10ms, and many hosts use short DNS
TTLs by design. There are some totalitarian countries (Belarus, Great Britain,
etc.) that use DNS-based website blocking - each time you send a DNS request,
it will have to be compared against a bunch of censorship lists. Then there is
ISP throttling/AQM, which will add small delays to each SYN packet... As a
result, if your autocomplete responds in under 10ms during local testing, it
may take between 100ms and 900ms for most of your users.

~~~
jrockway
Yeah there's the network round trip and that is going to be the bulk of the
time. You have to serve interactive apps from servers that are close to your
users.

DNS, even when censored by the government, should be cached by the client for
the TTL. And, for interactive apps, you should keep the TCP connection open
between requests. (Not really sure what the browser state of the art is here,
but the idea behind HTTP/3 is to skip TCP entirely, and let the server control
how long it remembers the state necessary to keep the session alive. One
autocomplete request should be one UDP packet.)

I'll also point out that networks aren't very good. For example, I have a RTT
of 50ms to Chicago, but the speed-of-light round trip between New York and
London is 36ms. (For Chicago, it's 8ms!) At least on my setup, the vast
majority of the time is spent waiting for a DOCSIS timeslot. Once my packet
gets to the internet, it's pretty fast.
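The round-trip figures above can be reproduced with a quick vacuum speed-of-light calculation. The distances are approximate great-circle values (my assumptions); real fiber paths are longer and light in fiber travels about a third slower, so these are hard lower bounds:

```python
C_KM_PER_MS = 300.0  # speed of light in vacuum, ~300,000 km/s

def min_rtt_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time over a given distance."""
    return 2.0 * distance_km / C_KM_PER_MS

# Approximate great-circle distances, assumed for illustration.
print(f"New York-London (~5570 km): {min_rtt_ms(5570):.0f} ms")   # → 37 ms
print(f"New York-Chicago (~1150 km): {min_rtt_ms(1150):.0f} ms")  # → 8 ms
```

A measured 50ms RTT to a city whose physical lower bound is 8ms shows how much of the delay lives in the access network rather than in distance.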

Remember, people are playing real-time shooting games over the Internet. If
they can make that smooth, you can make your autocomplete smooth.

------
rbrtl
> one of the primary reasons computers were _created_ was to cross reference
> data. that is nearly impossible in most software now.

Because the data became the hotly-guarded commodity of the "software"
companies.

------
cowmix
Back in 1983, I/O was much, much slower... which slowed everything else down.
Floppy drives, cassette recorders, 300 baud modems, etc.

------
MattGaiser
> GUIs are in no way more intuitive than keyboard interfaces using function
> keys such as the POS I posted earlier. Nor do they need to be. > GUIs
> require you to learn how to use a mouse, how input focus works, how multiple
> windows work, how modal dialogs work.

Here is the difference. People can learn how mice, input fields, and modals
work and use those learnings across products.

With keyboard driven input or command lines, people need to read something to
make it work. That creates a higher barrier to initial usage.

I would bet that the challenge for most software is getting that user to do
something for the first time and far fewer people are concerned about whether
it is a good long term way to do things.

------
mmaunder
I was programming on an Apple IIe in 1983 and this post is ... weird. Firstly
some things on the Apple were crazy slow. Gaming for example. And the PFS
database system when doing any meaningful work. And disk access was a killer
on those burping floppy drives.

The Apple had no network, LAN or WAN. I think a modem may have been available
but most people didn't have one.

All AJAX apps use the network. And you will never get away from some amount of
latency, even for edge apps. Which means that for certain things, physics
dictates that the Apple IIe will always be faster than AJAX.

Much of the rant isn’t even related to the comparison. I’m also curious about
the author’s broken shift key.

------
marcinzm
I don't get his maps example. It's like he's expecting Google Maps to contain
all the information he'd ever need for a trip. If it did he'd probably
complain the UI was too complicated and heavy handed. He compares it to a
paper map but a paper map doesn't have all the information. You used a book to
supplement it. Same way with Google Maps you can use TripAdvisor, a separate
Google Maps tab, Yelp, etc. The Google Map you use to plot your route is just
for that, you put in the addresses and it gives you a route. The research is
done elsewhere just like it's always been.

------
marcus_holmes
I used to use UIs like the ones he talks about. And yes, you can get quick
with them.

But OMG they took forever to learn. And if there were more functions than
could map onto the function keys that fit in the bottom row, then stuff got
hidden. Cue people yelling across the room: "How do I get into the stock
reorder menu?", "F3 for stock, then F2 for management, then F5 for reorder",
"thanks!" (no help window, or tooltips, or anything).

This isn't "easy", and it's certainly not discoverable. And it's only fast if
you know exactly what you're doing.

~~~
chosenbreed37
> This isn't "easy", and it's certainly not discoverable. And it's only fast
> if you know exactly what you're doing.

True. Would it be fair to assume that users would work with a subset of the
commands available? If so, once they got used to those it would have been
plain sailing 80-90% of the time. If I'm not mistaken, I've seen a few of
those DOS programs recently in different settings. The folks using them seemed
to be getting along just fine.

------
treerunner
Loading data isn’t!

It’s simply not possible to make a blanket statement like this. Computers
today are so very different and have so many more layers than those from 1983.

I recently refurbished an old Mac Plus. Of course I couldn't leave it stock;
booting from floppies was simply too ridiculous, so it boots from a Zip drive.
Anyway, once I had this machine up and running I was quite surprised at how
snappy the interface is - once everything was in place - and that's the
difference. The old interface is paper thin and has nothing going on: nothing
asynchronous, few if any threads. It is just what it is.

------
ChrisRR
This has always bothered me as an embedded C developer. We always strive to
squeeze the last drop of performance out of a 32MHz processor in order to
reduce battery consumption.

Yet in the PC space, CPU manufacturers are pushing the limits of physics to
produce faster, more efficient CPUs, and it's all being wasted on dozens of
layers of abstraction and interpretation.

Software should be absolutely blazing fast by now, and yet computers feel just
as sluggish as they did years ago (ignoring the boost from SSDs).

------
Zenst
Given the latency of CRTs compared to modern panels, yes, on that aspect
alone.

Though GUIs and the like would be the main factor in perception: the added eye
candy, then things like security, encryption... The CPU gods giveth with one
hand and the software gods take more with the other - that has played out many
a time. With the early stuff, machine code to the metal was easy, as there was
less to know; today, even fragments of the libraries our languages use are
larger than early operating systems. So the ability to know the whole machine
inside and out is very much lost: even if you did learn it all, things would
have moved on faster than you could learn.

Which is another aspect. A user could once almost know, to the bit, what their
machine was doing. Today, with so many subsystems, management CPUs running in
the background with microcode you can never pick apart, and far more grunt and
processing than the simpler early systems, the odds of knowing that level of
detail and being that in touch with what's ticking underneath are gone.

It's a bit like moving from manual gear changing (stick, as some call it) to
automatic. You lose that connection as it is obfuscated from you. Akin to
flying a plane by wire or, as is the case today, by electronics - you lose
that level of being in touch.

------
myth2018
An interesting experiment: if you have the chance of going to one of those
decades-old stores using text-based interfaces, ask the employees which
systems they like most -- the old-school ones or the new desktop or web based
ones.

I invariably get votes for the first. And the reasons are basically always the
same: they are fast and they just work.

It feels like the entire user interface field is creeping backwards.

------
myth2018
I believe that one area the author's criticism mostly applies to is that of
business applications.

I've been working as a user and developer in this field, and I see a number of
advantages of older applications (even the text-based ones) over the more
recent web-based ones.

Newer technology for web/mobile user interfaces is great, but it shouldn't be
the "default" approach, in my opinion, and I think that they are at the heart
of that overall disappointment with many applications. They are harder to
build and maintain; they are less constrained, leaving room for "creative"
deviations from core functionality. In summary, we are spending lots of
resources on something that eventually destroys value.

I'm aware of a number of anecdotes about companies whose 30-40 year old
systems beat newer ones in terms of user speed. This, in conjunction with the
fact that they are cheaper to develop, is very strong evidence that we are
doing something wrong.

------
cromwellian
Nope, not on my vic20, c64, c128 and Amiga. Scrolling was slow as hell. BBSes
slow. Rerendering a full page of text, slow. GEOS slow. Disk operations
hellaslow.

This is an example of nostalgia goggles in action. You know how that movie you
swore was amazing in your childhood sucks when you rewatch it as an adult and
realize it had atrocious acting and writing?

~~~
jansan
Disk operations? You mean loading from datasette. Using Turbo Tape
accelerated loading times by a factor of 10, but waiting a minute for a game
to load was still common.

------
ballenf
I think the real issue today is the unpredictability of delays even though
computers are overall much faster.

When the same action can have such variable response times, depending on
unknowable variables, your brain begins to believe that you're working for the
computer and not the other way around. It changes your relationship with
your computer in a subtle yet fundamental way.

If you hired a professional cutlery expert to feed you, it would be faster,
but it's no longer mechanical: they might occasionally get distracted by a
text message, and 1 in 100 bites is completely missed. Now there's a
gatekeeper between you and your meal. When it's working, it's far more
efficient and awesome, but your brain now understands that your meal isn't
really guaranteed anymore.

Our relationship with computers is fundamentally different in a way that our
subconscious immediately grasps despite being somewhat irrational.
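The variance point can be made concrete. A toy sketch in Python (all numbers
are made up for illustration, not measurements): two systems with the same
mean response time feel completely different once you look at the tail.

```python
import random
import statistics

random.seed(1)

# Hypothetical response times in milliseconds (invented for illustration):
# "steady" is always ~50 ms; "spiky" is usually 20 ms but sometimes 320 ms.
steady = [50 + random.uniform(-5, 5) for _ in range(1000)]
spiky = [20 if random.random() < 0.9 else 320 for _ in range(1000)]

def percentile(samples, p):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    return ordered[int(p / 100 * (len(ordered) - 1))]

for name, xs in [("steady", steady), ("spiky", spiky)]:
    print(f"{name}: mean {statistics.mean(xs):.0f} ms, p99 {percentile(xs, 99):.0f} ms")
```

Both means come out around 50 ms, but the spiky system's p99 is over 300 ms.
That occasional stall is what the brain remembers, and why it stops trusting
the machine.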

------
buboard
It's weird that we don't have psychometric benchmarks to capture this trend.
How many times a day do you get frustrated, thinking your mouse has broken,
because you clicked something and nothing started spinning? (It will be
spinners, of course, lots of them, and by the time they stop spinning you're
already fuming and banging on your screen.)

Can anybody point out an app that has become better at perceived feedback over
time? Everything is a horrible, slow mess. And the bigger issue is: due to the
choice of language/UI libraries, they are un-optimizable.

Incidentally, this may help explain the popularity of vim, which came up in
another thread. Console feedback is truly instant and, importantly,
predictable. You can hit a bunch of keyboard commands in vi and it will
execute them 100% predictably, even over a slow connection or under high
system load (of course, it was designed for that).

------
lakis
This is not how things were. Maybe a few micro-actions were like that, like
turning on a C64, but then it was just sitting there doing nothing. Let me
tell you an experience that's etched in my memory. While learning image
processing, we were supposed to implement a low-pass filter on a 512x512 color
image. That's a 1 MB image. It took 20 minutes to change a single character,
20 minutes to compile the (C) program, and 20 minutes to run it. The VAX we
had in 1988 had 4 MB of RAM, and three teams were going through the cycle, so
the machine was constantly swapping. Today it takes me less than a few seconds
to change one character in my program, compile it, and run it.
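For scale, here is roughly the same job today: a 3x3 box low-pass filter over
a 512x512 color image, sketched in Python with NumPy (NumPy assumed available;
the printed timing is machine-dependent, not a benchmark). It completes in
milliseconds rather than twenty minutes.

```python
import time
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(512, 512, 3), dtype=np.uint8)  # 512x512 color image

def box_blur(a):
    """3x3 box low-pass filter via shifted sums; edges handled by padding."""
    h, w = a.shape[:2]
    p = np.pad(a.astype(np.float32), ((1, 1), (1, 1), (0, 0)), mode="edge")
    acc = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return (acc / 9).astype(np.uint8)

t0 = time.perf_counter()
out = box_blur(img)
print(f"filtered 512x512 in {(time.perf_counter() - t0) * 1000:.1f} ms")
```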

------
jkoberg
Yeah, "Designers" destroyed latency in UX with animation, transitions, and
other such noise.

------
codingdave
Log into a BBS in 1983 and watch the text slowly render on screen as it
traverses that 300-baud connection.

Sure, we shoot ourselves in the foot with bad UX, but a long rant with a false
headline and deliberately obtuse examples isn't the strongest way to make this
point.

------
bitcurious
There are two things happening - one is straight up bad UX, especially for
users with poor internet connections. This is called out in the search
example, and is fixable.

Another is a little harder: if you were a professional who used a computer in
1983, you probably only used one or two programs. That meant that, as a
developer, you could reasonably expect your users to become power users, and
design for them. This is no longer the case for most programs, so you see
slower, less information-dense, but more intuitive software succeed.

Exceptions to this are software which only power users use - 3D modelling
software, video editing, etc. These are still fast fast fast.

------
rammy1234
Function keys are the best part of this article. As the years have passed,
I've seen the use of function keys decline with every software update.
Function keys are a thing of the past. The MacBook doesn't have dedicated
function keys; instead I have to go into settings to make them dedicated. Why
are volume control, brightness, and emojis brought to the forefront when they
could be handled just as well with function keys? I hear some say the keyboard
is a barrier to newcomers, but my experience is different: they quickly get
used to keyboard shortcuts and use products like pros. We underestimate our
customers.

------
beart
This rant is full of "missing the forest for the trees."

One example.

> On the library computer in 1998 I could retry searches over and over and
> over until I found what I was looking for because it was quick Now I have to
> wait for a huge page to load, wait while the page elements shift all over,
> GOD FORBID i click on anything while its loading

Which completely overlooks how bad search results were 20 years ago compared
to now. I would rather see the right result in 500 ms than the wrong result in
20 ms.

Or to take that example even further, I can now perform that same search on my
phone, anywhere, instead of having to travel to the library.

~~~
altfredd
> Which completely overlooks how bad search results were 20 years ago compared
> to now.

Care to elaborate? Do you have examples of today's "good results" as opposed
to the "bad results" of old?

~~~
beart
No, I don't have any examples of 20 year old search results.

------
Koshkin
I wouldn't say "slower," but the fact remains that perceived performance has
not improved much, and the reason is in fact surprisingly simple: the way it's
developed, software is like a gas; it occupies whatever space it is given. If
some action or response is allowed to take, say, 10 seconds, it will tend to
take close to 10 seconds; if a program "expects" to be given a machine with
16 GB of RAM, it will need that much just to run acceptably fast, and so on.

------
Seb-C
This is completely wrong.

I still remember when my Windows 95/98 computer took 10~15 minutes to boot,
when launching everything from Word to a game took minutes of waiting, when
installing a game from a CD could take one hour or when loading an HTML page
with no images took dozens of seconds.

God help you if you had to load an image, or even worse: a photo or animated
gif.

------
ilaksh
I have an idea for an alternative web browser/protocol that would be faster:
[https://github.com/runvnc/noscriptweb](https://github.com/runvnc/noscriptweb),
although I don't have time to code it.

Gemini could be pretty fast if they did not insist on TLS and new connections
all the time.

------
thibaut_barrere
Sorry, I am laughing out loud. In 1983 everything I owned used a cassette and
was slow as hell. Rendering was sluggish at best.

------
geogra4
Disagree about 1983. Though I would agree with maybe around 1999-2003 era. Web
browsing was fast in the web 1.0 era.

------
jdmichal
Most of this is a rant against the Google Maps UI. I'd like to give a
shout-out to Bing Maps, which _does_ keep a list of items as you look them up.
You can swap between them and see all of them on the map. I also find the
Windows 10 app (called Maps) very smooth.

------
voldacar
There should be an OS that kills any program that doesn't respond to user
input within a frame.
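A user-space sketch of that idea in Python (the class name, grace period, and
callback are my own invention; a real OS would preempt or deprioritize rather
than kill): a watchdog thread fires a callback if the main loop stops sending
per-frame heartbeats.

```python
import threading
import time

FRAME = 1 / 60  # one frame budget at 60 Hz, ~16.7 ms

class FrameWatchdog:
    """Fire on_miss if heartbeat() isn't called within a few frame budgets.

    Pass on_miss=lambda: os.kill(os.getpid(), signal.SIGTERM) to get the
    "kill the unresponsive program" behavior described above.
    """

    def __init__(self, on_miss, grace_frames=3):
        self.grace = grace_frames * FRAME
        self.deadline = time.monotonic() + self.grace
        self.on_miss = on_miss
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._watch, daemon=True)
        self._thread.start()

    def heartbeat(self):
        # Call once per frame from the UI loop to push the deadline forward.
        self.deadline = time.monotonic() + self.grace

    def _watch(self):
        while not self._stop.is_set():
            if time.monotonic() > self.deadline:
                self.on_miss()  # deadline missed: the UI loop stalled
                return
            time.sleep(FRAME / 4)

    def stop(self):
        self._stop.set()
        self._thread.join()
```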

------
ibobev
In his talk "Preventing the Collapse of Civilization"
([https://www.youtube.com/watch?v=pW-SOdj4Kkk](https://www.youtube.com/watch?v=pW-SOdj4Kkk)),
Jonathan Blow defends similar ideas.

------
radarsat1
This guy never used GEOS. :)

(Ok fair enough, apparently it was 1986, so 3 years later..)

[https://en.wikipedia.org/wiki/GEOS_(8-bit_operating_system)](https://en.wikipedia.org/wiki/GEOS_\(8-bit_operating_system\))

------
azangru
> perceptually slower

Well, interfaces are perceptually very different than they were in 1983 too...

~~~
u801e
Yes. Typing a message in Slack takes longer to render on screen (at least a
half-second delay in some cases) than typing the same message did on a
Commodore 64.

------
awinter-py
web/dom based apps (web + electron) have been particularly bad for text entry

there's no satisfying option for text input that doesn't come with various
kinds of tricky formatting, scroll jank, periodic slowness, and general
trouble

the assumption that we can write sophisticated + performant GUIs on cross-
platform frameworks may not be right

if contenteditable didn't ship in the next version of chrome, I think people
who maintain CE-based tools would just shrug, assume it was a new
compatibility quirk, get it working at some cost to their codebase, and move
on without ever checking the docs

------
genidoi
I'd like to see an implementation of Google maps with a 1983 toolchain

------
6510
How about this?

1 -
[https://www.youtube.com/watch?v=HxXhLhTHkD0](https://www.youtube.com/watch?v=HxXhLhTHkD0)
Look at that scroll speed!?! And the accuracy!?! Infinite accuracy?!! Being
analog, it goes infinitely beyond _pixel perfect_. These things can print,
too!

Slow and choppy:
[https://www.youtube.com/watch?v=tNtuvNO54Rs](https://www.youtube.com/watch?v=tNtuvNO54Rs)

------
cheschire
It's because in 1983 everyone was riding the white pony

------
quattrofan
He's right, and 100% so about Google Maps: it's awful. Other than just looking
up an address, for any kind of planning, which is what maps used to be used
for, it's shit.

------
JTbane
[https://en.wikipedia.org/wiki/Wirth%27s_law](https://en.wikipedia.org/wiki/Wirth%27s_law)

------
magoghm
On my Atari 2600 all games ran at a steady 60 FPS.

------
flatfilefan
Maybe we could start the revolution by doing away with the caps lock?

~~~
MaxBarraclough
Isn't that the reason FORTRAN code is so fast?

------
shadowgovt
Computers were much simpler in 1983.

------
koiz
What... lol, this is totally BS. Why do people make such absolute statements?

------
metalliqaz
This one again?

------
foobar_
Because pixels

------
draw_down
Alright gang, let’s put our heads together and figure out how we can blame it
on JavaScript

------
tannhaeuser
(2017) so it's not relevant anymore ...

~~~
oarsinsync
It's definitely from 2017, and it's definitely still relevant.

My terminal lags more today than it did 10 years ago.

------
fractal618
Here is some self-aware, hypocritical advice: either do something about it, or
stop complaining.

