
Almost everything on computers is perceptually slower than it was in 1983 (2017) - fao_
https://twitter.com/gravislizard/status/927593460642615296
======
notacoward
Computers are perceptually slower because we've replaced CPU/memory slowness
with remote latency. Every click, often every keypress (for autocomplete),
sometimes even every mouse hover, generates requests to who knows how many
services in how many different places around the world. No wonder it's slow.
Speed of light. I see this even using my development tools at work, where
actions I didn't really want or ask for are being taken "on my behalf" and
slowing me down. I notice because I'm on a VPN from home. The developers don't
notice - and don't care - because they're sitting right on a very high-speed
network in the office. It's the modern equivalent of developers using much
faster machines and much bigger monitors than most users are likely to have.
Just as they need to think about users on tiny phone screens and virtual
keyboards, developers need to think about users with poor network connectivity
(or just low patience for frills and fluff).

~~~
auiya
When dealing with web applications at least, people don't realize how many
non-essential requests are being made peripheral to the action the user
actually wants to accomplish. For instance, install NoScript and make a page
fetch to cnn.com. There are 20+ external page requests being made to all kinds
of tracking, advertising, and analytical domains which have fuck-all to do
with the user's request to see content hosted by cnn.com. The page loads
almost instantly when all these non-essential requests are filtered. It's a
hilariously bad side effect of the web becoming as commercialized as it is.

------
kristianp
Windows explorer is seriously slow on Windows 10. Things like the right-click
menu and creating a new folder are too slow for what is done. The new item
menu is very slow, perhaps due to having office 365 installed? Creating a new
folder sometimes doesn't update the display of the containing folder at all.

~~~
PetahNZ
Data point of 1, but mine is as fast as anything. On my 7 year old desktop,
upgraded from 7 to 10, it's near instant. But given almost anything can hook
into that menu (i.e. I have winrar, vscode, treesize, etc) it probably comes
down to what you have running.

~~~
Cougher
A second data point for the OP: I have a brand new laptop with 16GB RAM and an
SSD. When I mouse over "new", it takes 3-4 seconds for "folder" to appear.
This is first thing in the morning with only Opera running. I do have a bunch
of tabs open, but it does this without my browser running as well.

------
WalterBright
The productivity boosters for me since 1983 aren't so much speed, it's:

1\. A large high-res screen so I can see lots of context

2\. Lots of disk space

3\. Online documentation available

4\. Protected mode operating system

5\. Github

6\. Collaboration with people all over the world

The productivity destroyers:

1\. social media

~~~
gpderetta
can't comment about 4 as it was before my time doing useful work on a
computer, but everything else sounds right.

> The productivity destroyers:
>
> 1\. social media

 _stares at HN page_

~~~
WalterBright
> can't comment about 4

Having a real mode operating system (DOS) meant that an errant pointer in your
program could (and often did) crash the operating system, requiring a reboot.
Worse, it would scramble your hard disk.

My usual practice was to immediately reboot on a system crash. This, of
course, made for slow development.

With the advent of protected mode (DOS extenders, OS/2), this ceased being a
problem. It greatly sped up development, and protected mode found many more
bugs more quickly than overt crashes did - and with a protected mode debugger
it even told you where the bug was!

I shifted all development to protected mode systems, and only as the last step
would port the program to DOS.

------
issa
Apparently I use my computer differently than a lot of commenters. Because
when I dust off my 1983 Apple IIe it gets REALLY slow when I try to have 50
open browser tabs, edit video, and run a few virtual machines.

~~~
dijit
Yet if you check how fast it renders a character to the screen it will almost
certainly be faster.

We've made trade-offs in the computer space, input latency and rendering of
the screen (also in terms of latency) has suffered strongly at the hands of
throughput and agnosticism in protocols. (USB et al.)

~~~
pas
The latency issues are dealt with, but you have to accept the RGB LEDs that
come with gaming things.

~~~
TapamN
Not really. Here's what a 70's/80's PC and OS had to do to print a single
character in response to user input (simplified):

Poll the keyboard matrix for a key press. Convert the key press coordinate to
ASCII. Read the location of the cursor. Write one byte to RAM. Results will be
visible next screen refresh.
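
That whole "driver stack" is small enough to sketch in a few lines. A toy
model (all names and the memory layout here are illustrative, not any real
machine's):

```ts
// Toy model of the entire 70's/80's input path.
const SCREEN_COLS = 40;
const screenRam = new Uint8Array(SCREEN_COLS * 25); // memory-mapped text buffer
const matrixToAscii = new Map<string, number>([["2,3", 0x41]]); // coord -> 'A'
let cursor = 0;

// Scan the matrix, translate the coordinate, write one byte to screen RAM.
function pollKeyboard(pressed: [row: number, col: number] | null): void {
  if (pressed === null) return; // no key down on this scan
  const ascii = matrixToAscii.get(`${pressed[0]},${pressed[1]}`);
  if (ascii === undefined) return;
  screenRam[cursor++] = ascii; // visible at the next screen refresh
}
```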

A modern PC and OS would do something more like this:

The keyboard's microcontroller will poll the keyboard matrix for a key press.
Convert the key press location to an event code. Signal to the host USB
controller that you have data to send. Wait for the host to accept. Transfer
the data to the USB host. Have the USB controller validate that the data was
correctly received. Begin DMA from the USB controller to a RAM buffer. Wait
for the RAM to be ready. Transfer the data to RAM. Raise an interrupt to the CPU.
Wait for the CPU to receive the interrupt. Task switch to the interrupt
handler. Decode the USB packet. Pass it to the USB keyboard driver. Convert
the USB keyboard event to an OS event. Determine what processes get to see the
key press event. Add the event to the process's event queue. Task switch to
the process. Read the new event. Filter the key press through all the
libraries wrapping the OS's native event system. Read the location of the
cursor. Ask the toolkit library to draw a character. Tell the windowing system
to draw a character. Figure out what font glyph corresponds to that character.
See if it's been cached, rasterize glyph if it's not. Draw the character to
the window texture. Signal to the compositor that a region of the screen needs
to be redrawn. Create a GPU command list. Have GPU execute command list. Page
flip. Results will be visible next screen refresh.

I could drag this out longer and go into more detail, but I don't really feel
like it.

I'm sure people who actually work on implementing these things can find
inaccuracies with this, but it should give an idea how much more work and
handshaking between components is being done now than in the 70's/80's.
Switching to gaming hardware isn't enough to get down to ye olde latencies.

~~~
lonelappde
We have 1000x faster machines to handle that. Notepad is perfectly responsive,
but most apps do crazy side gunk that interferes with typing.

~~~
joepie91_
Except we don't, and this really _is_ faster on several older machines:
[https://danluu.com/input-lag/](https://danluu.com/input-lag/)

------
stock_toaster
I think it comes down to the fact that GUIs _sell_. GUIs have visibility and
appeal, they are something users can actually see, and have opinions about
(right or wrong). GUIs are the ultimate bikeshed, and for many users, the
lipstick IS the pig.

\----

Anecdote: I can't count the number of times I have seen a team change a
color, update a logo, or move an image a few pixels, resulting in happy
clients/customers and managers sending a congratulatory company-wide email,
while teams solving difficult engineering problems may have garnered a quiet
pat on the back, if they were lucky.

~~~
TeMPOraL
> _I think it comes down to the fact that GUIs sell._

IMO not just that, but also that the sale happens very early - before people
get a chance to discover the UI is garbage. What's worse, in work context, a
lot - probably most - software is bought by people other than the end users.
Which means the UI can be (and often is) a total dumpster fire, but it'll win
on the market as long as it appeals to the sensibilities of the people doing
the purchase.

------
EL_Loco
And he's not even talking about software bloat. The word processor I have on
my early 90's PowerBook is more responsive, and generally faster to use, than
my current one running on a Core 2 Duo processor. Oh, and by the way, I was
once complaining about this to a friend who's in IT, and he told me that the
speed at which software runs doesn't mean anything regarding its quality. What
I mean is, I was telling him how bad some new software was because it was
quite a bit slower than software 10 years older which did the same thing, and
he told me that, in software engineering, this (speed) is never a measure of a
program's quality. Is this universally accepted? Speed and responsiveness are
not taken into account? I always meant to ask other people in this field, but
always forgot.

~~~
TeMPOraL
> _Oh, and by the way, I was once complaining about this to a friend who's in
> IT, and he told me that the speed at which software runs doesn't mean
> anything regarding its quality._

Your friend is wrong. It's an imperfect proxy, but looking at programs that do
work, speed is a good proxy for quality, because speed means someone gives a
damn. There are good programs that are slow, but bad programs all tend to be
bloated.

Of course "speed" is something to be evaluated in context. In a group of e.g.
3D editors, a more responsive UI suggests a better editor. A more responsive
UI in general suggests a better program in general.

> _this (speed) is never a measure of a program's quality. Is this
> universally accepted?_

Universally? No. It all depends on who you ask. Companies tend to say speed
isn't, but the truth is, a lot of companies today don't care about quality _at
all_ \- it's not what sells software. If you ask users, you'll get mixed
answers, depending on whether the software they use often is slow enough to
anger them regularly.

------
jtbayly
>how many times have i typed in a search box, seen what i wanted pop up as i
was typing, go to click on it, then have it disappear

Regardless of anything else, this is 100% happening to me on a regular basis.
And the ironic thing is that I think it is caused by the attempt to speed up
getting _some_ results onscreen. But it’s always 500ms behind, so it “catches
up” while I’m trying to move the mouse to click on something.
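
A common mitigation - sketched below with a made-up endpoint, so take it as
illustrative rather than what any given site actually does - is to tag each
request and drop responses that arrive after a newer query has been issued, so
late results never repaint the list under your cursor:

```ts
// Minimal sketch of dropping stale autocomplete responses.
// `/api/suggest` is a placeholder endpoint, not a real API.
let latestRequestId = 0;

async function onSearchInput(query: string, render: (items: string[]) => void) {
  const requestId = ++latestRequestId; // tag this request
  const res = await fetch(`/api/suggest?q=${encodeURIComponent(query)}`);
  const items: string[] = await res.json();
  // A newer keystroke has superseded us: don't repaint the list the
  // user may already be aiming their mouse at.
  if (requestId !== latestRequestId) return;
  render(items);
}
```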

~~~
baq
firefox is notoriously bad at this - type a bookmark name into the url bar,
move mouse onto the bookmark, search results come in and you click something
you didn't want. gets me all the time.

------
jokoon
I despise the web.

HTML was designed for static documents; it boggles my mind that things like
nodejs were created. It's not a secret.

HTML techs can't even run efficiently on a cheap smartphone, which is the
reason apps are needed for smartphones to be usable.

Every time I'm talking to someone for job offers, I state that I want to avoid
web techs. No js, no web frameworks. I prefer industrial computing, to build
things that are useful. I don't want to make another interface that will get
thrown away for whatever reason.

Today the computing industry has completely migrated towards making user
interfaces, UX things, fancy shiny smoothy scrolly whatnots, just to employ
people who can't write SQL. Companies only want to sell attention. This is
exactly what the economy of attention is about.

All I dream about is some OS, desktop or mobile, that lets the user write
scripts directly. It's time you encourage users to write code. It's not that
hard.

~~~
ActorNightly
>It's time you encourage users to write code. It's not that hard.

It is. Try teaching coding to someone non technical, especially someone that
doesn't want to learn, and by the time you get them to understand what a
variable is, you will fully understand that coding is not for everyone.

~~~
iteratorloopmap
This.

What he suggested isn't viable in terms of productivity either. One may be a
programmer but not want to spend time administering the insignificant parts
of the system.

I never understood the hobbyist crowd's culture of elitism around system
micro-administration.

------
johnday
I think we can actually blame React and similar frameworks for the issues we
see in many modern apps, including the ones mentioned in the article.

Part of the issue stems from the "strong data coupling" that's all the rage.
Everything on the page should correlate at any given point in time. Add a
character to a search box and the search results should be updated. The
practical effect of this is that any single modification could (and often
does) rewrite the contents of the entire page.
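
A minimal sketch of that coupling (component names invented for illustration):
with the search box's state at the top of the tree, every keystroke re-runs
the render of everything below it, unless subtrees are explicitly memoized,
which is opt-in rather than the default.

```tsx
import { useState, memo } from "react";

function Page() {
  const [query, setQuery] = useState("");
  return (
    <div>
      {/* every keystroke updates `query`... */}
      <input value={query} onChange={e => setQuery(e.target.value)} />
      <Results query={query} />
      <Sidebar /> {/* ...and re-renders unrelated subtrees too... */}
    </div>
  );
}

function Results({ query }: { query: string }) {
  return <p>Results for {query}</p>;
}

// ...unless they are explicitly memoized.
const Sidebar = memo(function Sidebar() {
  return <nav>Unrelated navigation</nav>;
});
```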

The other thing the article brings up is that developers and designers often
disregard input flow. This may be partly driven by not having sufficiently
dynamic tooling (Illustrator can hardly be used to design out flow patterns,
for example.)

These two issues have a unifying quality: Websites must be "instagrammable",
which is to say look good in single snapshots of time, and the dynamics take a
serious back seat.

~~~
proc0
Yeah, but React's popularity exists because there is demand for, and an
expectation of, interactive UIs with tons of features that are taken for
granted. Users expect a basic amount of interactivity and usability that adds
a ton of complexity, and React offers a good model for handling it.

~~~
bryantraywick
Uhh, no. React's popularity is because of lazy devs that only know JavaScript
and want to use it everywhere. Large companies want to have one "unified"
codebase that runs on all "platforms". It has nothing to do with interactivity
or usability because if it did then developers would write native apps with
native UIs with much better interactivity, usability, and performance.

~~~
mc3
Uhh, no. React is popular because devs HAVE to use JS & DOM to do web
development in the browser, and they want speed, and to use a library endorsed
by a big company. And some of those devs (like me) prefer the unidirectional /
functional paradigm.

~~~
ScottFree
> and they want speed

They won't get it with React. And I'm referring to both how fast the webapp
runs and development speed. It gets too complicated too quickly, even for
relatively small sites.

~~~
mc3
Whether or not it seems complicated depends on if you have nailed down the
mental model. It's a bit like Git, in that respect.

~~~
ScottFree
Complexity is a very real and objective thing. When you can't 100% guarantee
your program won't go into an infinite loop when you change a single line,
it's too complex.

~~~
mc3
I don't know of any infinite loop problems in React. I would think they are
less likely than using MVC / JQuery style patterns.

~~~
ScottFree
Have you done a lot of work with Hooks and useEffect? React changed the entire
component lifecycle with the introduction of hooks. All sorts of weird things
trigger re-renders now, including when a dependent function changes. You have
to surround all of those with useCallback.

I should do a thorough writeup of the infinite loop issue in React.
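
In the meantime, here's a minimal sketch of the loop (the component and
endpoint are made up): a fetch-wrapping function gets a new identity on every
render, the effect that depends on it re-runs, its setState triggers another
render, and round it goes; useCallback pins the identity to its real
dependency.

```tsx
import { useState, useEffect, useCallback } from "react";

function Search({ query }: { query: string }) {
  const [results, setResults] = useState<string[]>([]);

  // BUG: defined inline, `search` is a brand-new function object on every
  // render, so the effect below re-runs each time; setResults causes a
  // re-render, which recreates `search`... an infinite fetch loop.
  // const search = () =>
  //   fetch(`/api/search?q=${query}`).then(r => r.json()).then(setResults);

  // FIX: memoize so `search`'s identity only changes when `query` does.
  const search = useCallback(
    () =>
      fetch(`/api/search?q=${encodeURIComponent(query)}`)
        .then(r => r.json())
        .then(setResults),
    [query]
  );

  useEffect(() => {
    search();
  }, [search]);

  return <ul>{results.map(r => <li key={r}>{r}</li>)}</ul>;
}
```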

~~~
mc3
I've not used hooks yet, but it's on my list to learn. Thanks for the warning
though!

------
unlinked_dll
If you spent what people paid for a PC in 1983 (literally, without any
inflation) you probably wouldn't notice anything being perceptually slower.

Like the first Mac retailed for $2500 US. Go spend $2500 on a PC today, you'll
have a great time.

Granted, economies of scale make this kind of a dumb argument. But it has a
bit of truth to it. People are just less willing to spend as much on their
machines, as well as push much more limited platforms like mobile to their
limits. We should definitely deal with that as developers, don't get me wrong
- but not having to deal with the optimizations they dealt with 40 years ago
doesn't make me _un_ happy.

~~~
asserttrue5
Not true.

I have a top of the line Intel processor that’s less than 2 years old
(launched, not bought). A 970 Evo Pro that's one of the fastest drives
around. 32 GB RAM (don’t remember the speed but it was and is supposed to be
super fast).

Explorer takes a second or two to launch. The ducking start menu takes a
moment and sometimes causes the entire OS to lock up for a second.

The twitter rant is spot on.

There's so much supposed value-add BS that the core usage scenarios go to
shit.

And this is coming from a Product Manager. :-)

Anyway, the referencing problem is painful. I feel it often. Google Maps or
Apple Maps: try to plan a vacation and mark interesting places on it to
identify the best location to stay. Yup, gotta use your own memory. Well,
isn't that one of the rules of UX design: don't make me think?

Regarding OSes: storage has gotten so much faster while CPUs haven't, so
storage drivers and file systems are now the bottleneck. We need fewer layers
of abstraction to compensate. The old model of "IO is super slow" is no longer
accurate.

~~~
jlarocco
Honestly it sounds like your problem is Windows.

I'm writing this on an AMD Phenom II, running Debian and StumpWM, that's over
10 years old. I've upgraded the hard drive to an SSD, and the memory from 8 GB
to 16 GB (4 GB DIMMs were very expensive when I first built it), and it's as
fast as can be.

My work computer is much newer, has twice as much memory and a newer Intel
processor, and I really can't tell the difference except for CPU bound tasks
that run for a long time, like compiling large projects.

~~~
ganzuul
Have to voice my agreement. Linux is an expensive investment but so very much
worth it. Each time my colleagues complain about their computers it is because
of Windows. I count myself lucky to have Linux as my only desktop and the
skill to maintain it. I run an ancient i5 2500k with 8GB RAM and SSD. All the
games I play work fine on Steam Proton. I still have to figure out how in the
world Reddit on Firefox manages to completely lock the system up, with looping
audio and frozen cursor. Nothing else causes that fault.

~~~
yosamino
> I still have to figure out how in the world Reddit on Firefox manages to
> completely lock the system up, with looping audio and frozen cursor. Nothing
> else causes that fault.

Fellow X220 user here... a solution for this exact problem, where the system
runs out of memory and you sit there staring and waiting until it churns
long enough to do stuff again, is to run earlyoom[0].

It will kill off the firefox process (or whichever is the main memory hog)
early, which is also annoying but less so than having to wait minutes until
you can use your computer again.

[0] [https://github.com/rfjakob/earlyoom](https://github.com/rfjakob/earlyoom)

------
growlist
I miss how tight my Amiga used to feel, and in the electronic music sector
some people still prefer dedicated hardware rather than PC/Mac for these types
of reasons. Click a button and the device responds instantly. When you're
doing something creative the aesthetic quality of your tools can genuinely
affect the output, and a tight response just feels right, not to mention how
fast your workflow can become over time using dedicated hardware that responds
instantly.

~~~
frankling_
This is one reason I prefer simple guitar pedals to VSTs and such. With their
instant response and tight controls, pedals feel like solid tools you can
build muscle memory on. When unzipping plugins and awkwardly dragging sliders
with a touchpad, it feels like it's myself who's the tool.

------
DrScientist
There are two problems with interfaces like google maps - and one exacerbates
the other.

\- it's _not bloody obvious_ how they work - randomly clicking on meaningless
icons, trying to uncover functionality.

\- then just as you get used to it, they change it!

My biggest feature request would be a key stroke to hide all the floating crud
that is obscuring my view of the map!

------
AtlasBarfed
"one of the things that makes me steaming mad is how the entire field of web
apps ignores 100% of learned lessons from desktop apps"

But dude, DESIGN. The design. Look at those rounded corners.

------
fxtentacle
Financials are to blame.

Selling a software release is a one-time payment. Selling a support
subscription is recurring revenue. And if you make your software horrible
enough to use without the support subscription, it is automatically immune to
piracy.

As a practical example, I don't know anyone who uses the free open source
WildFly release. Instead, everyone purchases JBoss with support. It's widely
known that you just need the paid support if you want your company to be
online more than half of the day. And as if they knew what pain they would be
causing, their microservice deployment approach was named "thorn-tail".

~~~
tpxl
Anecdotally, we used WildFly with great success on a project or two.

------
hasperdi
Most of the arguments mentioned in the article are just bias
[[https://en.wikipedia.org/wiki/Rosy_retrospection](https://en.wikipedia.org/wiki/Rosy_retrospection)]
and personal preference.

Remember when software was stored on floppies? It took a while to load. And
every application came with different behavior and key bindings.

~~~
cf
No this has actually been measured by people.

[https://danluu.com/input-lag/](https://danluu.com/input-lag/)

The computers are faster, can do more stuff, and monitors have higher frame
rates. But for many applications that aren't games, latency and non-responsive
UIs are a growing problem.

~~~
aidenn0
I remember being able to type faster than the machine could keep up on an old
Mac. (Maybe a Mac Plus?)

I couldn't type up handwritten notes reliably, because half a page in, I would
fill up the buffer and characters would get dropped.

~~~
boring_twenties
If you want to relive that experience, just use voice.google.com.

~~~
nikanj
Or the desktop Outlook client.

------
Taniwha
I've been trying to explain for years that for the past 4 decades the hardware
guys have been surfing Moore's law and the software guys have been pissing it
away ....

Well, Moore's law is falling by the wayside. If they want to start doing more
with less, the software guys are going to have to stop using interpreted
languages, GC, passing data as JSON rather than as binary - all that overhead
that's de rigueur but doesn't directly go to getting the job done.

~~~
technics256
Is json really that difficult to deal with?

~~~
Ace17
JSON provides a lot of flexibility*: it's human-readable, it has explicitly-
named/optional/variable-length fields, it provides nesting... all of this
comes with a cost:

\- extra branching in the parsing code (the parser cannot predict anymore what
the next field will be, they could be in any order)

\- extra memory allocations, decreased memory locality (due to variable-
length/optional fields, and also the tree-like structure).

So if your data consists of a single object composed of a fixed set of
little-endian integer fields, you're comparing the above costs to the cost of
a single fread call** with no memory allocations.

* many other data formats provide similar flexibility, text-based ones (XML) and also binary ones (IFF, protobuf, ISOBMFF, etc.)

** don't write it as such though; you must write the endianness-decoding code (which the optimizer should trash anyway on little-endian architectures - e.g. LLVM does).
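
To make the comparison concrete in JS terms, here's a sketch (a made-up
three-field record, with DataView standing in for the fread):

```ts
// Flexible: parse JSON. Branching on field names, allocations, any order.
const fromJson = JSON.parse('{"x": 1, "y": 2, "timestamp": 3}');

// Rigid: a fixed layout of three little-endian 32-bit integers. The offsets
// are known up front, so there is no branching on what field comes next and
// no intermediate allocations.
function readRecord(buf: ArrayBuffer) {
  const view = new DataView(buf);
  return {
    x: view.getInt32(0, true), // true = little-endian
    y: view.getInt32(4, true),
    timestamp: view.getInt32(8, true),
  };
}
```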

------
aiCeivi9
[https://threadreaderapp.com/thread/927593460642615296.html](https://threadreaderapp.com/thread/927593460642615296.html)

Still not very readable.

~~~
tiborsaas
Unfortunately so, because twitter threads force the user to dumb down their
main points into single, compressed sentences. It's a shame, since I like to
read well-thought-out articles.

Twitter takes that away because it offers a UX that makes publishing your
random ideas too easy. People with low self-control will create threads like
this. With hundreds of likes comes self-validation, so they keep doing it.

------
trickledown
In 1983 virtually everything was text-based. Since moving to graphical user
interfaces, a great deal of effort has gone into more visually stimulating UI,
such as animations and better fonts. Not all of this should be counted as
progress/innovation. We have wasted much of the HW performance we achieved
over the years on baubles and trinkets :)

------
reportgunner
That's what you get for catering to people who don't care for your work one
bit.

The people who tell me that their computers are slow are the same people who
need a flashy animated button for every single action, and the same people
who refuse to understand that passwords are not just a formality.

To each his own.

------
userbinator
Reminds me of this article from 12 years ago:
[https://hubpages.com/technology/_86_Mac_Plus_Vs_07_AMD_DualC...](https://hubpages.com/technology/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins)

Computers have gotten much faster in terms of raw speed and throughput, yet
that hasn't translated into much of an improvement in basic UI interactions
and general functioning.

------
kissickas
That keyboard-centric design for GUIs (I clearly have never taken a design
class) is what makes Reddit Enhancement Suite such an effective product, in my
opinion. HN's interface is possibly just as effective in that it discourages
me from taking too many actions; I can vote on basically every reddit comment
I read but using a mouse to do it on HN represents such a massive barrier
compared to keyboard navigation.

------
toss1
It is NOT just Web apps

I'm here trying to write, in LibreOffice, a several-page document with minimal
formatting, using a high-spec CAD laptop/workstation -- and every damn
keystroke is laggy!

My muscle-memory arrow-keys & quick moving around to edit portions of
sentences, merge/split lines -- all rendered useless - because I need to wait
for the cursor to catch up.

A part of my mind keeps wandering off to whether I should go set up another
old DOS box and load XYWrite - which was feature-rich, always lightning-fast,
never laggy, and worked great. Of course, the lack of printer drivers...

In every area, the software developers just squander more of the processing
power than the incredible continuing hardware advances provide.

Anyone have any advice on software that at least attempts to work closer to
the metal and lets us see the performance that we should see from modern
hardware (for all values of modern)?

I got out of the software industry because of this trend to building on
multiple layers of squishy software, instead of requiring efficiency - this
framework compiles to that pcode, talking to the other API, which gets thunked
down to the other ..... where the h*!? is the hardware that does some actual
computing?

It seems like it happened for the same reason that high-fire-rate rifles took
over military use -- because they figured out that most troops couldn't
actually shoot straight, so it worked better to just let them spray bullets in
the general direction than to require & teach real skills.

Similarly, this whole morass seems designed to make it easy for mediocre
programmers, and programmers that learn something more serious like Haskell
are considered exceptional, while the bulk of stuff is written for

------
dang
Discussed at the time (2017 that is—not 1983):
[https://news.ycombinator.com/item?id=15643663](https://news.ycombinator.com/item?id=15643663)

------
mcv
When we plugged in our Acorn Electron (it had no power switch), it would be on
and ready to use immediately (unless it hung). Nowadays it takes a minute for
my work laptop to boot. When I'm lucky.

I also totally agree with his complaint about things disappearing when you try
to click on them. Most of his rant is just about how crap Google Maps is,
though.

------
zwaps
I can say that I have yet to see ANY web app that is fast and snappy. ANY. At
some point, they all have problems.

Even many frameworks for iphone and android that are essentially web apps are
terrible and make every app slow and miss clicks. On the latest and most
powerful iPhone no less.

If you are creating a product as a web app only, you are telling me you do not
care about UI enough.

Programs have never been free of bugs or issues, but we never had the
situation where every app, even though it technically works, is either
sluggish, breaks somehow, or requires the user to learn intricate timings to
use a simple UI.

Developers took the easy way out and used these frameworks because it's simple
and, they feel, good enough.

And here we are.

~~~
nottorp
Sadly it's not the "easy" way. It's the "cheap" way. So we're most likely
stuck with the javascript garbage.

------
daxorid
Casey Muratori and Jonathan Blow have been bitching about this for many years
now, and they're absolutely right. But it's not just UI latency - it's
everything. Modern software is, technically and morally, a five alarm tire
fire.

The profession needs to actively fight Moore's law in order to keep our jobs
relevant, and find more "work" to do - most of which is not only poorly
engineered, but culturally destructive.

If you care about this, there's a tiny community of developers that actually
care about reversing it:
[https://handmade.network/](https://handmade.network/)

------
blablabla123
That's also how I remember the XT (~8086) I got in the 90s. When I typed or
clicked, the response was instant. But that was about it, sometimes the HDD
didn't work for weeks because I forgot to properly "park" it before turning
the computer off. So I had to boot from floppy disk. And that was slooooow...
Probably that's one core difference from today's computing: batteries are
usually included, systems can run for months or even years without
maintenance, and they crash much more rarely. In the past people often had to
wait for the computer to work again...

------
fouc
danluu has some articles on this topic

Computer latency: 1977-2017 ([https://danluu.com/input-lag/](https://danluu.com/input-lag/))

[https://news.ycombinator.com/item?id=16001407](https://news.ycombinator.com/item?id=16001407)

------
auiya
I blame two things.

1\. The "release early, iterate often" culture - as it encourages half-assed
software to flood the marketplace.

2\. Poor or non-existent incentives for proper code maintenance.

~~~
gchamonlive
> The "release early, iterate often" culture - as it encourages half-assed
> software to flood the marketplace.

That is blaming the wrong aspect of agile development. Software should fail
early. It is wasteful to go through a lengthy process only to find out the
final product either is incompatible with the market or someone else has made
a tool that already dominates the niche.

The problem most of the time is not realising you are actually selling a
prototype, so what should be a proof of concept ends up in production.

------
phsource
Have some empathy! I've seen a trend where rants against the modern web often
get upvoted lots on Hacker News. Consider this quote from the article:

> I make no secret of hating the mouse. I think it's a crime. I think it's
> stifling humanitys progress, a gimmick we can't get over.

Does the world's typical computer user today hate the mouse, and prefer a
keyboard-only interface (CLI)? No -- in fact, command-line interfaces are less
discoverable and harder to use, starting out. Even as a programmer, I struggle
to remember the flags to many common command line utilities.

Sure, the author's example of a cashier's checkout console might be great as a
text-only interface -- cashiers use it day-in, day-out, and can learn all the
keyboard shortcuts in a day. But what about the self-checkout machines that
shoppers use maybe once a week? Would you rather have every person have to
learn a list of keyboard commands while navigating a two-color interface?

Does the modern web poorly serve the author, who's good enough with technology
to master any UI? Sure!

But the modern web works better for the billions who otherwise would not have
started using it in the first place.

~~~
yosamino
> command-line interfaces are less discoverable and harder to use, starting
> out. Even as a programmer, I struggle to remember the flags to many common
> command line utilities

Not to disagree with the general point you're making, but autocompletion of
commands just using the tab key is how CLIs get discoverability, and it's
kind of cool.

Whenever I "don't know" I just 'cmd -<tab><tab>' and suddenly I am presented
ith a list of options that I can filter by continuing to type the option I
suspect I need, or tab to the one that I see on screen. Then if that requires
an argument <tab><tab> it let's me select the, for example, file that is
needed as the argument.

~~~
Const-me
> Whenever I "don't know" I just 'cmd -<tab><tab>' and suddenly I am presented
> with a list of options

You assume you already know which `cmd` to type. Most users don't.

------
torstenvl
(2017)

Google Maps has improved the primary complaint here. You can now search along
your route.

~~~
danjayh
Not on the desktop...

------
zzo38computer
It is true. Many stupid programming designs, and other stuff, result in
slowness.

They say, type two words and push F3, well, you can implement a telnet (or
SSH) service which provides such a program.

Or, better maybe, what I thought of: an "SQL Remote Virtual Table Protocol".
You can access remote data using local SQL, allowing you to cross-reference
data, both within the same and across different data sources.

Of course, there is still going to be network latency regardless what you do.
But many local programs are still slow (as many comments mention), also due to
doing too many things, I think. (Maybe network latency may make it a bit slow
even if you use telnet to implement the old interface now, but not as slow as
with HTML, which is just bad for this kind of things.)

Modern user interfaces, I think, are also bad, and make it slow.

I hate touch screens, and hate the mouse slightly less. Command buttons and
toolbar icons are bad and the keyboard is better, I think. There are some uses
for the mouse, but it is way overused.

------
d--b
The guy's rant is a pain to read, but he's mostly right.

Don't get all defensive: take this as a boatload of opportunities to make
things better.

I woke up the other day to find my mouse broken, and believe me, on macos,
it's very hard to do anything without the mouse. I had to look up all sorts of
crap from my phone just to find out how to reboot the thing.

~~~
username90
> it's very hard to do anything without the mouse

Is it? I tried your example on windows and I could shut off the computer
easily (alt-f4), then I opened up a browser (windows button, write chrome),
navigated to your post, wrote this message and logged in without touching the
mouse. I've found that you can navigate most websites without a mouse, as you
can just move to the links by using the browser search and then click them
with ctrl-enter.

Edit: I even managed to go back and edit this message without touching the
mouse.

~~~
d--b
Yes, windows is a lot keyboard-friendlier than apple. The windows key / start
menu is definitely missing on mac

~~~
username90
Then why do people like mac computers? I never liked them and I grew up with
them.

~~~
timw4mail
Consistency, better design, fashion, take your pick.

------
mikorym
> twitter.com

I found it funny that this appeared on Twitter, a website which always slows
down my browser, especially in VMs.

------
musicale
Some things that are drastically faster: email, copying files, booting a disk
OS and/or waking instantly from sleep (though some 1983 laptops were instant-
on), printing, GUIs (try using a Lisa from 1983), any kind of complex computer
graphics, software downloads and installation.

------
bullen
There is going to be a major C64 revival when the average Joe realizes we have
hit the peak of the last human invention:

The most sold computer ever in history is still alive:

New cases:
[https://shop.pixelwizard.eu/en/commodore-c64/cases/90/c64c-c...](https://shop.pixelwizard.eu/en/commodore-c64/cases/90/c64c-case-transparent)

New keycaps: [https://www.indiegogo.com/projects/keycaps-for-your-commodor...](https://www.indiegogo.com/projects/keycaps-for-your-commodore-computer#/)

New software: [https://csdb.dk/](https://csdb.dk/)

It's happening boys, back to the future!

------
jammygit
Including reading essays, now written half-paragraph by half-paragraph in
twitter threads?

~~~
dilaudibble
Exactly right. This person wants me to read his thoughts on UX but could not
make it less work for me, the user, to read them?

------
thaumasiotes
Boot times are way faster.

~~~
sixothree
Maybe boot times are faster than 1996, but not 1983.

~~~
thaumasiotes
Boot times are faster than they were in 1986. I feel comfortable assuming
they're also faster than they were in 1983.

SSDs are fast.

~~~
rongenre
A powered-off Apple 2 or C64 could be operational in seconds. We're nowhere
close to that now.

~~~
thaumasiotes
A powered-off Dell laptop running Windows 10 is operational in seconds. I know
this because that's what I use.

Whereas an IBM PC booting into DOS in 1986 took, sure, seconds, but a lot more
seconds. You could read a lot of the messages as they scrolled by during boot.

To get to a BIOS configuration screen now, you need to independently research
the key that will bring it up and memorize it. Then you have to frantically
mash it during the whole very brief boot process, because there's only a split
second during which it will actually work. It used to just be a boot message.
When you saw the message, you had time to hit F12 or whatever.

~~~
michaelmrose
If it's operational in seconds it was hibernated or suspended.

Windows now by default has "quick startup", which is effectively: log the user
out, kill their apps, and hibernate.

Beware if you dual boot and want to access the Windows files, or if your
machine does not handle hibernation well.

An actual startup probably takes more like 20-40 seconds.

~~~
Reelin
> If it's operational in seconds it was hibernated or suspended.

This is not true. Are you still using a platter hard drive? (If an SSD, have
you looked up benchmarks for it?)

My ~5 year old laptop used to cold boot Windows 7 in less than 10 seconds
(once I'd disabled most autostarting programs, at least). It currently cold
boots Ubuntu in ~5 or so; most of that time is spent displaying the UEFI and
Grub splash screens. This is made possible _almost entirely_ by a Samsung Evo;
I'm looking forward to getting an M.2 drive when I replace the computer.

~~~
michaelmrose
Are you pressing the button to start and externally timing the process of
arriving at a usable desktop, and have you explicitly disabled fast boot? This
isn't a new feature in Windows 10.

Internally my computer tells me the process takes about 5 seconds from OS
start to graphical environment, but in reality there are several steps. For
example, this doesn't account for the period of time between hitting the power
button and the OS itself starting to run, entering the full disk encryption
password, and unlocking the volume.

I would be surprised if a full restart actually took so little time. Maybe not
loading a menu or unlocking a volume is sufficient to explain the difference?

------
Cougher
Sorry, but it's just a very dumb statement meant to be a hot take. So he has a
select few examples that he's choosing as "everything on computers". I have to
admit, I didn't start working on computers until 1985 when it took forever to
load the program via floppy . . . as long as the drive wasn't drifting out of
alignment; then it would either take even longer or it wouldn't load at all .
. . so maybe his golden age of computers was just a year.

------
phendrenad2
I think we software developers are so used to doing inherently slow things
("find all" for some function name in a huge codebase, installing packages,
compiling assets, starting the iOS simulator, making API calls, waiting for an
unnecessary meeting to be over so I can get back to working) that we need a
good UX team to tell us when to make things faster. We just don't notice slow
things anymore.

------
woah
The thread started out being about how library computers were faster in the
80's and then derailed into angry Google Maps feature requests.

------
inamberclad
Not quite the same topic, but I remember someone measuring keyboard to screen
input latency, and that has also increased over the years.

------
shrubble
Consider the precursor to current MacOS, NextStep v3.3.

It ran very well on a 33 MHz 68040 with 32 to 64 megabytes of RAM.

------
strooper
A few days back, I installed the last version of Microsoft Encarta (2009) on
an old PC (CPU E5200, 4GB RAM). To my surprise, all the millennials around me
were astonished at how fast and smooth that software was, running and
searching for information on a typically (very) slow machine.

------
wodenokoto
I feel like my 2017 MBP takes longer to log in than my 2009 MBP did. The old
one broke, so I can't compare, but I remember wake-up being done by the time I
opened the lid. On my new one I can sit and wait for the screen to turn on.

------
lysium
So this is mostly about UI. I would agree that many operations take very long,
e.g. accessing our web-based info system or waiting until Outlook opens or
closes that window. It's strange that these operations don't happen instantly
nowadays.

------
stevebmark
Searching a map of the entire world online is slower now than it was in 1983?

------
lwhi
Lots of great points - but it feels slightly ironic that he's decided to divide
this into segments of 160 characters, and spread the words between the
nomenclature of the Twitter UI.

------
taneq
This is a comparison of native applications doing simple things in 1983 vs.
web apps in 2017. Not all the world’s a web app, although that does seem to
get forgotten here.

------
tripzilch
Half the people in this thread keep confusing latency and speed. The other
half keep explaining it to them. It's a little repetitive here :)

------
haecceity
Code running in real mode has less overhead than long mode, but who wants to
use an OS that runs in real mode besides that TempleOS guy?

------
musicale
The performance of Microsoft Word seems to be a universal constant across all
known hardware.

------
elil17
I thought about reading this article, but the webpage was taking too long to
load

------
narrator
This guy has obviously never had to wait for GI Joe to load on a Commodore 64.

------
crazygringo
But my computer is at least 1,000 times more valuable in what it can do for
me, than it was in 1983. Probably even more.

Seems like a pretty good tradeoff to me.

I'll take a slow autocomplete box of _all the world's knowledge_ over a
lightning-fast lookup of my local files in a single directory any day.

~~~
adatavizguy
I wonder whether your computer gained more value from 1947 to 1983, when it
ran the first version of Lotus 1-2-3, than it has gained from 1983 to now,
with everything it can do today. Not sure how to quantify that, but it might
be similar.

------
shmerl
Define everything. A lot of things were way slower in 1983.

------
thetanil
This guy never copied floppy discs

------
_Codemonkeyism
Except the internet it seems.

------
clarry
Speaking of slow and crappy UIs, what the fuck is up with the trend of "click
to read more."

Youtube does this in video description and comments. If I'm scrolled down
there to read comments, maybe I want to actually just read them and not click
to read them?

Reddit does this. I don't know what reason I have for reading a thread of
comments other than reading comments, so why do I have to keep clicking read
more?

Twitter evidently does this. If I'm reading a thread, why do I need to click
to read more? And after a couple dozen posts, click again. In this case, it
also seems to expand the unread posts above the point where you're currently
scrolled to, so you have to scroll back up and manually figure out where
exactly the last post you read is and where the new stuff begins.

Many shops do this, by cutting product descriptions at a few lines so you
can't read what the product is all about without clicking read more.

And they do the same thing with reviews.

I'm really tired of clicking read more over and over again in places where
reading is the whole point!

~~~
jon-wood
There are three possibilities, although I'll warn you now that none are great.

1\. In an attempt to improve perceived performance on initial load a decision
was made not to load all content in at once. In the case of a Twitter thread
containing potentially hundreds of items that’s reasonable, for product
descriptions less so.

2\. The widget being used to display a product description on the product page
itself is also used elsewhere on the site, but in a context where space is
constrained to fit a grid. They got round that with a read more link.

3\. Sadly the most likely for product descriptions: in an attempt to determine
customer interest, an arbitrary cutoff was chosen for how much of a
description is shown. Metrics are then tracked on which descriptions are
expanded, and taken as a proxy for customer interest in those products (a
sketch of this pattern follows below).
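
For that third case, the mechanism is roughly the following (the endpoint and
event names are invented for illustration; real sites will differ):

```ts
// Sketch of the "truncate and measure" pattern.
const CUTOFF = 200; // arbitrary character cutoff

function renderDescription(el: HTMLElement, fullText: string, productId: string) {
  el.textContent = fullText.slice(0, CUTOFF);
  const more = document.createElement("a");
  more.textContent = "Read more";
  more.addEventListener("click", () => {
    el.textContent = fullText;
    more.remove();
    // The click itself is the data point: each expansion is logged as a
    // proxy for customer interest in this product.
    navigator.sendBeacon(
      "/metrics",
      JSON.stringify({ event: "description_expanded", productId })
    );
  });
  el.after(more);
}
```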

~~~
croon
> 1\. In an attempt to improve perceived performance on initial load a
> decision was made not to load all content in at once. In the case of a
> Twitter thread containing potentially hundreds of items that’s reasonable,
> for product descriptions less so.

In an age where web pages are several MB large, bandwidths surpass 100Mbps,
and GPUs alone have 11GB of RAM, we can't render more text on a screen.

This page right now contains around 8kB of pure text, and everything is
already expanded, as opposed to most other comment sites. I'm aware that
formatting, layout, data modelling, messaging, etc. increase that amount, and
that's fine. I'm just baffled that it's possible to have a slower experience
with perceivably the same amount of brain-data as 20 years ago, but with
hardware that is orders of magnitude better.

We shouldn't lose this much to UI fluff.

~~~
mewpmewp2
The bottleneck in this case would be within the YouTube servers that have to
fetch the comments - nothing to do with UI. Since comments work at scale, it
is possible it takes quite a bit of resources to load them instantly with the
video.

~~~
croon
It's not directly related, sure. But it's certainly a second-order effect of
the UI choices that have brought us here. They manage to compute (batched and not
real time) personal ads tailored to you, load media for this, push a whole
bunch of data from your clicks to further this tailoring, push metrics, grab
libs from 40 different locations, etc, etc, etc.

If we wanted to do some work around getting more text to users and improve the
reading experience of sites with at least a portion of them designed for that
purpose, we certainly could.

------
nrp
Is it deliberate satire that this was written as a very long series of tweets
rather than as a well-structured article, as it might have been in 1983?

~~~
csomar
It reminds me of people who complain that the streets are dirty and then throw
their coffee cup on the street just as they are talking.

I mean, the dude is using the worst medium possible to display and transfer
data. It is also slow and barely follow-able/readable.

Interesting world we are living in.

~~~
reportgunner
> _I mean the dude is using the worst medium possible to display and transfer
> data._

Whoa, hold your horses. Are you sure about the _worst_ part? Imagine if this
were an instagram story or a series of snapchat snaps. Would it be a better
medium? I beg to differ.

------
_bxg1
Started with what could have been an interesting premise for discussion, and
then lost me when he declared mice to be intrinsically bad in a writeup about
user interfaces.

It always confounds me when people assert that the number-one desirable
feature in all user experience is _velocity of input_. As if there's nothing
to be thought about, much less discovered. Just data to be entered. That may
be true in certain narrow segments of interfaces, like POS software and
development environments, but to pretend that all human/computer interaction
fits that mold is to be willfully ignorant of the reality of personal
computing.

Edit: It's been pointed out that towards the end he does acknowledge some
valid use-cases for mice, but he still drastically oversimplifies the space of
GUI use-cases and ignores some of the key benefits of mouse-based interaction.

~~~
godot
I got to that point and had the same reaction you did, but finished reading
the thread and realized he's actually claiming something far more ambitious
and interesting than I thought he was.

Particularly when he clarified that keyboard-only doesn't just mean text-
based: it can be graphical and keyboard-only. I'm trying to imagine UIs that
are graphical and beautiful but work with keyboard only, and it actually could
be a very interesting world if that were popular. It's not unlike games from
older days, where the ways of input were limited and input was purpose-driven.

For sure, as you said, there's the discovery/exploratory aspect of using a
software where GUI helps a lot. But there's definitely merit in what the
author is saying.

~~~
asdff
I'm not understanding the aversion to the keyboard in the comments here. It's
just faster. I prefer not having to reach over and find the mouse or trackpad,
then wiggle around to find the cursor. A mouse is useful only when the
keyboard UX is bad.

Imagine playing piano while having to take a hand off the keys and find a
mouse during the song. It would be impossible to keep up. Keyboard only is
speedy when learned.

~~~
jspash
I'm not taking sides on this one, but thought you might find this article
interesting that was posted here 2 days ago.
[https://www.asktog.com/TOI/toi06KeyboardVMouse1.html](https://www.asktog.com/TOI/toi06KeyboardVMouse1.html)

The key take-away was they spent a lot of money on a research project
specifically about mouse vs. keyboard and found the precise opposite to be
true! Surprising to say the least.

* Test subjects consistently report that keyboarding is faster than mousing.

* The stopwatch consistently proves mousing is faster than keyboarding.

~~~
marcus_holmes
I have no idea why you're getting downvoted.

Supplying actual data is always useful.

Especially when it contradicts a common opinion ;)

~~~
TeMPOraL
Supplying actual data is always useful. If only that article did that :).

Sometimes, articles that contradict a common opinion do so because they're
simply dead wrong. This is one of those cases.

~~~
marcus_holmes
I'll admit it's not much data, and there doesn't appear to be a source. But it
vaguely refers to a study, and doesn't just reflect the author's preferences.

Did Apple do a study? Did it come to the conclusion that the mouse was faster?
If so, why is it dead wrong?

~~~
TeMPOraL
I don't know if Apple did a study, and I'm not particularly inclined to look
hard for it, because the argumentation from the article series itself is
pretty much bogus and uses some weird test setups (like
[https://www.asktog.com/SunWorldColumns/S02KeyboardVMouse3.ht...](https://www.asktog.com/SunWorldColumns/S02KeyboardVMouse3.html)
and the e -> | replacement game); that, coupled with my real-world experience
which every day proves superiority of keyboard over mouse in structured
interfaces, leads me to assign very low prior to the validity of the
conclusion of that Apple study, as reported by Tog.

(The study perhaps had a more narrow set of conclusions than presented in the
article. Wouldn't surprise me.)

------
johnobrien1010
Why do people keep writing essays on twitter instead of writing an essay and
posting it as a blog post?

~~~
dgellow
Twitter works quite well for capturing thoughts you have in the moment, though
of course the reader experience is awful. The blog post format somehow implies
more work and preparation, whereas you can just open twitter, start to write a
short line, then continue message by message.

I don't know why twitter doesn't even try to improve the reader experience,
threads are just a complete mess.

------
webboynews
I didn't have a computer in 1983. I wasn't born even. But Windows 98 felt
faster than Macintosh and even Linux.

~~~
Erlich_Bachman
Today Linux can be much faster than Windows 10.

------
vl
It's such BS. UI in the '80s and '90s was beyond slow by today's standards. We
just expected to wait more back then.

Computer magazines used to publish comparisons on how fast editors can scroll
through the text document!

