
Almost everything on computers is perceptually slower than it was in 1983 (2017) - zdw
https://twitter.com/gravislizard/status/927593460642615296
======
dang
Discussed yesterday as explained here:
[https://news.ycombinator.com/item?id=21835519](https://news.ycombinator.com/item?id=21835519)

------
kabdib
I had an Atari ST in a closet, and decided to get rid of it a while back. I
pulled it out to test it. The boot sequence, all the way to a desktop with a
mouse that moves, takes less than one second. If you boot from a hard drive,
maybe another second. For a while I just kept hitting the reset button,
marveling at the speed at which the ST came up.

Most machines I work with these days take minutes to get rolling.

Okay, I know that systems are bigger and more complicated now; buses have to
be probed and trained, RAM has to be checked, network stuff needs to happen,
etc., etc., but _minutes_? This is just industry laziness, a distributed
abdication of respect for users, a simple piling-on of paranoia and "one or
two more seconds won't matter, will it?"

Story time.

A co-worker of mine used to work at a certain very large credit card company.
They were using IBM systems to do their processing, and minimizing downtime
was very, very, _very_ important to them. One thing that irked them was the
boot time for the systems, again measured in minutes; the card company's
engineers were pretty sure the delays were unnecessary, and asked IBM to
remove them. Nope.
"Okay, give us the source code to the OS and we'll do that work." Answer:
"No!"

So the CC company talked to seven very large banks, the seven very large banks
talked to IBM, and IBM humbly delivered the source code to the OS a few days
later. The CC company ripped out a bunch of useless gorp in the boot path and
got the reboot time down to a few tens of seconds.

When every second is worth money, you can get results.

~~~
bobcostas55
The latest AMD CPUs are particularly bad at this. I got a 3600, and for half a
year now there have been known problems with extremely slow booting. The
latest BIOS update made things a bit better, but it's still at completely
unacceptable levels.

~~~
floatboth
That's very board-specific, I think; some boards from the first Zen generation
also had that problem, but my MSI board boots very quickly.

~~~
anon73044
This. MSI B450 + 3600, and I'm still on BIOS 7B85v19. The SSD is an older 850
EVO, and it boots to desktop in under 30 seconds.

------
joncrane
Kind of related: does anyone else notice how long it takes to change channels
on the TV these days? It used to be instantaneous when cable first came out
and at some point it became this laggy experience where you'll press buttons
on the remote and the channel takes forever to change. I hate it and it's one
of the reasons I don't have cable any more.

~~~
wolfgke
The central reason is that modern video codecs use I-frames and P-frames (and
sometimes B-frames, though as far as I am aware those are not used for TV
broadcasts); see
[https://en.wikipedia.org/w/index.php?title=Video_compression...](https://en.wikipedia.org/w/index.php?title=Video_compression_picture_types&oldid=916270429)

I-frames are only sent, say, once or twice a second.

When a channel is switched, the TV has to wait for the next I-frame, since
P-frames (and B-frames) only encode differences from other frames (earlier
frames for P-frames; both earlier and later frames for B-frames) and cannot be
decoded without a recent I-frame.

If you are aware of a possibility for efficient video compression that avoids
this problem, tell the HN audience; the really smart people who developed the
video codecs apparently have not found a solution for this. ;-)

Otherwise complain to your cable provider that they do not send more I-frames
to decrease the time to switch between channels (which would increase the
necessary bandwidth).
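
As a rough back-of-the-envelope sketch (the fixed I-frame interval here is an assumption; real broadcast encoders vary), the channel-switch delay contributed by key frames alone is easy to estimate:

```python
def channel_switch_wait(iframe_interval_s):
    """Worst-case and average wait for the next I-frame after a channel
    switch, assuming I-frames arrive at a fixed interval and the switch
    lands at a uniformly random point between two of them."""
    worst = iframe_interval_s        # switched just after an I-frame went by
    average = iframe_interval_s / 2  # uniform over the interval
    return worst, average

# With an I-frame every 0.5 s ("twice a second"), you wait 0.25 s on
# average and up to 0.5 s, before decode and display latency even start.
print(channel_switch_wait(0.5))  # (0.5, 0.25)
```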

~~~
seanalltogether
Unless I'm very mistaken about modern digital transmissions, cable coming into
the house is still based on a broadcast system, which means you're getting
everything all the time. The frames are all there for every channel, they're
just not being read. I don't know how much processing or memory it would take
to read and store frames for the channels surrounding the one you're on, but I
imagine it's possible.

~~~
phonon
Feel free to find a chipset on the market that can decode hundreds of channels
of Full HD H.264 simultaneously...

~~~
seriesf
You wouldn’t have to actually decode it, just receive it all and buffer
everything after the last key frame. That eliminates waiting for the next key
frame.

~~~
sigstoat
receiving it all _is_ the problem. nobody would pay the price for an RF front
end with the bandwidth to digitize the entire OTA TV portion of the spectrum.
they're spread from 54MHz to 806MHz. that's 752MHz of analog bandwidth. that's
huge. (i'm not even sure you could buy that frontend for love or money. ok,
maybe you take advantage of the gaps and just have 3-4 frontends. now there's
a correspondingly large amount of PCB space taken up, more interference
issues, increased defect rate, etc)
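
The arithmetic above checks out; a quick sketch (the 6 MHz channel width is an assumption based on North American broadcast channels):

```python
# Span of the OTA TV spectrum cited above, and how many 6 MHz broadcast
# channels a front end would have to digitize to capture all of it at once.
low_mhz, high_mhz = 54, 806
span_mhz = high_mhz - low_mhz   # 752 MHz of analog bandwidth
channels = span_mhz // 6        # about 125 channel slots
print(span_mhz, channels)  # 752 125
```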

------
AStellersSeaCow
To call this rose-tinted glasses when considering how things worked in 1983 is
a massive understatement.

A counterexample: in 1983, enter two search terms, one of them slightly
misspelled or misremembered, hit F3: "no results". Spend 10 minutes trying to
find the needle in the haystack, give up, and physically search for the thing
yourself.

Enter two search terms slightly incorrectly now: most of the time it will know
exactly what you want, may even autocorrect a typo locally, and you get your
accurate search results in a second.

When things were faster 30+ years ago (and they absolutely were NOT, the vast
majority of the time; this example cherry-picks one of the few instances where
they were), it was because the use case was hyperspecific, hyperlocalized to a
platform, and the fussy and often counterintuitive interfaces served as guard
rails.

The article has absolutely valid points on ways UIs have been tuned in odd
ways (often to make them usable, albeit suboptimally, for the very different
inputs of touch and mouse), but the obsession about speed being worse now
borders on quixotic. Software back then was, at absolute best, akin to a drag
racer: if all you want to do is move 200 meters in one predetermined
direction, then sometimes it was fine. Want to go 300 meters, or go in a
slightly different direction, or don't know how to drive a drag racer? Sorry,
you need to find a different car/course/detailed instruction manual.

~~~
romwell
All of this is absolutely no excuse for the ridiculously high latency
everywhere now.

Want to give me fancy autocorrect? Fine. But first:

* Make a UI with instant feedback, which doesn't wait on your autocorrect

* Give me exact results instantly before your autocorrect kicks in

* Run your fancy slow stuff in the background if resources are available

* Update results when you get them... if I didn't hit "enter" and got away from you before that.

It's not that complicated. We've got the technology.
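
The pattern in that list can be sketched in a few lines (this is an illustrative sketch, not any real framework's API): show exact matches instantly, run the slow pass in a background thread, and only merge its results in if the user hasn't committed yet.

```python
import threading

def responsive_search(query, items, on_update, slow_fuzzy, committed):
    """Instant exact-substring results first; slower fuzzy results merged
    in later, but only if the user hasn't already hit Enter (`committed`)."""
    exact = [it for it in items if query.lower() in it.lower()]
    on_update(exact)  # instant feedback, never blocked on the fancy pass

    def background():
        fuzzy = slow_fuzzy(query, items)  # spellcheck, ranking, network...
        if not committed.is_set():        # user already moved on? drop it
            on_update(exact + [it for it in fuzzy if it not in exact])

    threading.Thread(target=background, daemon=True).start()
    return exact
```

Calling `responsive_search("note", ...)` returns the exact matches immediately; the fuzzy pass fires `on_update` again only when, and if, its results are still wanted.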

And also, there's still no fucking reason a USB keyboard/mouse should be more
laggy than their counterparts back in the day.

~~~
ultrarunner
Windows 10 search and macOS's Spotlight do something like this. It's
irritating when results reorder themselves milliseconds before I hit Enter.

Either way, I'm not sure it rises to the level of indignation shown here.

~~~
romwell
The only thing the Windows 10 UI rises to is being a paragon of lagginess -
and I say that about a UI I generally _like_, and one that runs on
10-year-old equipment just fine.

There's no good reason for a lag after hitting the "Start" button.

There's no good reason for a lag in the right-mouse-button context menu in
Explorer (this was a "feature" since Windows 95, however).

I could go on for a long time, but let's just say that Win+R notepad is still
the fastest way to start that program, because at least the Win+R box wasn't
made pretty and slow (but it still has history of some sort).

The search box behaves in truly mysterious ways. All I want it to do is bring
up a list of programs whose names contain the substring that I just typed.
It's not a task that should take more than a screen refresh, especially in
2019. And yet, I still have no clue what it actually does - _if_ it works at
all[1].

[1] [https://www.maketecheasier.com/fix-windows-10-start-menu-sea...](https://www.maketecheasier.com/fix-windows-10-start-menu-search/)
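
For scale, a case-insensitive substring filter over a Start-menu-sized list of names (the 5,000-entry list below is made up) fits comfortably inside one 60 Hz screen refresh (~16.7 ms):

```python
import time

# A made-up Start-menu-sized list: a few thousand program names.
programs = [f"Program {i}" for i in range(5000)] + ["Notepad", "Notepad++"]

def filter_programs(query, names):
    """Case-insensitive substring match over every program name."""
    q = query.lower()
    return [n for n in names if q in n.lower()]

start = time.perf_counter()
hits = filter_programs("notepad", programs)
elapsed_ms = (time.perf_counter() - start) * 1000
print(hits, f"{elapsed_ms:.2f} ms")  # a small fraction of a 16.7 ms frame
```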

~~~
folmar
Just install Classic Shell (now Open-Shell). Resistance is futile.

------
ktm5j
> one of the things that makes me steaming mad is how the entire field of web
> apps ignores 100% of learned lessons from desktop apps

I can't agree with this enough. The whole of web development in general really
grinds my gears these days. Stacking one half-baked technology on top of
another, using at least 3 different languages to crap out a string of html
that then gets rendered by a browser. Using some node module for every small
task, leaving yourself with completely unauditable interdependent code that
could be hijacked by a rogue developer at any moment. And to top it all off
now we're using things like Electron to make "native" apps for phones and
desktops.

It seems so ass-backwards. This current model is wasteful of computing
resources and provides a generally terrible user experience. And it just seems
to get worse as time passes. :/

~~~
themagician
I’ve gone back to making simple HTML pages for all sorts of projects. Even
trying to avoid CSS when possible.

It’s funny, in a way, because the “problem” with straight HTML is that it was
strictly hierarchical (and thus vertical), so a lot of space was wasted on
wide desktop displays. We used tables, and later CSS, to align elements
horizontally.

Now on phones, straight HTML ends up being a very friendly and fast user
experience. A simple paragraph tag with a bunch of text and a little padding
works great.

------
mtm7
Related: I was bored last Sunday, so I decided to install Sublime Text. I'm
normally a VS Code user, but VS Code is built on Electron and it felt a little
sluggish for certain files, so I wanted to give something else a try.

I've been using Sublime all week and it feels like an engineering masterpiece.
Everything is instantly responsive. It jumps between files without skipping a
beat. My battery lasts longer. (I don't want to turn this into an editor
debate, though. Just a personal example.)

If you would've asked me a month ago, I would've said that engineers cared too
much about making things performant to the millisecond. Now, I would say that
many of them don't care enough. I want every application to be this
responsive.

I never realized how wasteful web tech was until I stopped using it. And I
guess you could say the same for a lot of websites – we bloat everything with
node_modules and unnecessary CSS and lose track of helping our users
accomplish their goals.

~~~
efdee
Weird to compare an IDE and a text editor, though.

~~~
SAI_Peregrinus
VS Code ~= VS. It's a text editor, not an IDE (unless you consider things like
VIM to be IDEs because you can use plugins to do IDE-like stuff with them.)

~~~
temac
That's kind of a marketing distinction. VS is not a kind of monolithic
spaghetti monster system: internally, it is made of components, which provide
a text editor, indexing for various languages, etc.

You can do likewise with VS Code or other environments, except maybe some
plugins are not installed by default.

In the end it boils down to: how do we define an IDE? And even if it is about
bundled capabilities, I would still be able to create a "dedicated" (would not
need much modification) Linux distro and declare it to be an IDE.

It was easier to distinguish IDE from other things in the MS-DOS era.

------
BuildTheRobots
Reminds me of John Carmack complaining that it was quicker to send a packet of
data internationally than to draw to the screen:
[https://www.geek.com/chips/john-carmack-explains-why-its-fas...](https://www.geek.com/chips/john-carmack-explains-why-its-faster-to-send-a-packet-to-europe-than-a-pixel-to-your-screen-1487079/)

More relevant to the article, I fully agree with the author's upset at trying
to do two parts of the same task in Google Maps; it's entirely infuriating.

Edit: duplicate submission: one directly on Twitter, this one through the
threaded reader. The other submission has >350 comments:
[https://news.ycombinator.com/item?id=21835417](https://news.ycombinator.com/item?id=21835417)

------
Diederich
> in 1998 if you were planning a trip you might have gotten out a paper road
> map and put marks on it for interesting locations along the way

In 1998, I used [https://www.mapquest.com/](https://www.mapquest.com/) to plan
a road trip a thousand miles from where I was living, and it was, at the time,
an amazing experience, because I didn't need to find, order and have shipped
to me a set of paper road maps.

In the 1970s, when I had a conversation with someone on the phone, the quality
stayed the same throughout. We never 'lost signal'. It was an excellent
technology that had existed for decades, and, in one particular way, was
better than modern phones. But guess what? Both parties were tied to physical
connections.

Google Maps is one product, and provides, for the time being, an excellent
experience for the most common use cases.

> amber-screen library computer in 1998: type in two words and hit F3. search
> results appear instantly

So that's a nice, relatively static and local database lookup, cool.

I wrote 'green screen' apps in Cobol for a group of medical centers in the
early and mid 90s. A lot of the immediate user interface was relatively quick,
but most of the backend database lookups were extremely slow, simply because
the amount of data was large, a lot of people were using it in parallel, and
the data was constantly changing. Also: that user interface required quite a
bit of training, including multi-modal function key overlays.

This article has a couple of narrow, good points, but is generally going in
the wrong direction, either deliberately or because of ignorance.

------
Pxtl
This seems to be conflating several separate problems

1) I can't manipulate data in my resultset well enough in google map.

2) Searches are too slow.

3) Mousing is bad.

Now, you can argue that those are related.

The first two are an argument for moving away from full-page post/response
applications to SPA-style applications where the data is all in browser memory
and as you manipulate it you're doing stuff on the client and pulling data
from the server as needed, desktop style.

The latter? I don't know why he had to go back to DOS GUIs. Plenty of windowed
UIs are very keyboard-friendly: tab indexes, hotkeys, etc.

> GUIs are in no way more intuitive than keyboard interfaces using function
> keys such as the POS I posted earlier. Nor do they need to be.

This is where he loses me. I remember the days of keyboard UIs. They almost
all suffered from being opaque. You can't say "the problem is opaque UIs" when
that describes the vast majority of keyboard-based UIs.

While there are obviously ways to create exceptions, GUIs are intrinsically
more self-documenting than keyboard inputs, because GUIs _require_ that the UI
be presented on the screen to the user, and keyboard inputs do not.

~~~
ehnto
Mousing is bad. It really is a cumbersome tool. I do a lot with my workspaces
so that I can avoid using it.

~~~
riversflow
Hmmm

Most Millennials I know who are technical absolutely love mice because they
grew up using them, and most of them have extensive PC gaming experience to
boot.

I’m the Linux/CLI junky among them and even I don’t find mousing cumbersome—to
use someone else’s words, it’s an amazing, first class input device. Same goes
for trackballs. By comparison touchscreens are a joke, there’s no depth of
input like with a mouse(RMB,LMB,etc) and every action requires large physical
movements.

I’m used to seeing people fly through menus with a mouse at speeds people here
expect to see only from keyboard shortcuts. Just because mousing is cumbersome
for you doesn’t mean it’s universally true at all. I know keyboard shortcuts
are fast, but they're a lot to memorize compared to menus, which typically
have the same basic order: File|Edit|View|...|Help

~~~
ehnto
Interesting! I am definitely in your first group. Born in the 90s, grew up
using a mouse and playing mouse driven PC games, yet I far, far prefer the
concise precision of a keyboard.

I guess it just depends on what the program requires of your inputs. When it
comes to software development, window switching, and maneuvering around
websites, keyboards are precise and rapid, whereas the mouse can only do one
thing at a time before needing to travel to the next input.

The other important part about ditching the mouse, is that when you're
predominantly typing and using both hands on the keyboard, switching over to
the mouse takes a non-trivial amount of time. You have to move your hand over
there, figure out where the cursor is on the screen, then do what you need to
do with it. When you're doing it hundreds of times a day, it adds up.

------
rohan1024
After reading this headline I felt like I was having déjà vu. I wasn't; it was
posted 11 hours ago.

[1]
[https://news.ycombinator.com/item?id=21831931](https://news.ycombinator.com/item?id=21831931)

[2]
[https://news.ycombinator.com/item?id=15643663](https://news.ycombinator.com/item?id=15643663)
(posted two years ago)

~~~
dang
I marked this one as a dupe and was about to merge the threads, but the
discussion here actually seems to be better than yesterday's. There's clearly
a major appetite for this topic, so perhaps we'll break with normal practice
and leave this thread up.

~~~
Multicomp
I agree. I went to click this link thinking it was the link I saw yesterday
that looked interesting but hadn't clicked at the time; now I'm doubly excited
to find that there are multiple threads covering the same thing.

Additional: anyone know of a good F# library for the gui.cs framework? Before
I manually write bindings I figure I can throw a quick query out there.

------
arminiusreturns
Coming from the ops side and as a user, I blame a focus on abstractions away
from fundamentals, which lures new and old developers alike into
overcomplicating stacks.

For example, I would say some significant percentage of both websites and
Electron-style apps don't need anything other than pure HTML5+CSS3, with
JavaScript only if necessary - _not_ "here are 5 different JavaScript
frameworks!" Of course there are cases where a framework is applicable, but
I'm talking about the
vast majority of cases. Then on top of all that, there are 20+ advertising and
spying scripts per page, and pages become unusable without a good JS blocker
(uMatrix is my favorite). Some of this is technical and some of it is
business; either way, developers need to focus on the fundamentals and push
back against dark patterns.

Now on the desktop side, this is also why I am a CLI junky who lives in a
terminal. CLI apps don't get in the way, because most of the time you are just
dealing with text anyway. There are many websites that ought to be CLI apps,
at least via an API. This is also one of my criticisms of companies trying to
force people into the browser by disallowing API clients.

It was the constant bloat and spying that finally spurred me to go
GNU/Linux-only many years ago, and things have only gotten better since then.
It requires a change in how you do your computing, yes. It may not be easy
(higher learning curve), but the rewards are worth it.

------
rticesterp
I was driving across the country a few weeks ago, became tired, and had to
check in at a roadside motel. It was 3 AM and the clerk took about 15 minutes
to complete the check-in. She apologized several times and said "this was so
much faster years ago when we had DOS. There are too many clicks, screen
freezes, and confirmations with the newer computers". She was older, so I'm
assuming by "newer" she means POS systems built within the past 20 years.

------
acheron
One thing that got worse and then better again, at least in my experience, is
boot-up time. I remember old computers running MS-DOS booting up very quickly
(unless
you waited for the full memory self test), then for quite a long time it took
forever to get into Windows no matter what you did to try to speed it up. More
recently things start up pretty quickly again; I think it's mainly hardware
(solid-state disks being the major improvement), but I do think Windows seems
to do a little better software-wise too. (Linux and BSD I haven't used as a
desktop in a very long time so I'm not sure where those are now. OSX I don't
have much of a sense of, partly because it just doesn't need to fully reboot
as often as Windows.)

~~~
dexen
_> but I do think Windows seems to do a little better software-wise too_

Windows 10 does a little trick to speed up boot - when you perform a shutdown,
Windows 10 saves a mini-hibernation image to the hibernation file. When you
perform normal boot, it can start up quite fast[1]. This gives noticeably
shorter boot time especially on spinning rust drives (i know, i know,
$CURRENT_YEAR). However, if you perform a "reboot" instead of "shutdown +
power on", you'll get the full-length boot, which takes notably longer.

[1] assuming the hardware setup is sufficiently unchanged

~~~
Fabricio20
This exact hibernation feature is what I feel is making my shutdowns slower!

I've been rocking an M.2 SSD for quite some time now, and Win10 _always_ takes
a considerable 1-2 minutes to shut down.

~~~
arh68
I get 11 seconds to shut down, 11 seconds to boot, on a 256 GB NVMe SSD. I'm
curious what's going on on your machine.

------
sonofaplum
There are some good points here, but of course I am going to ignore those to
talk instead about the stuff I disagree with!

The section about Google Maps follows a form of criticism that is widespread
and particularly annoys me, namely: popular service 'x' doesn't exactly fit my
power-user need 'y', therefore x is hopelessly borked, poorly designed, and
borderline useless.

There is always room for improvement, but all software requires tradeoffs. One
of the things that makes a product like google maps so powerful is that it
makes a lot of guesses about what you are actually trying to do in order to
greatly reduce the complexity and inputs required in order to do these
incredibly complicated tasks.

So yes, sometimes when you move the map some piece of data will be removed
from the screen without your explicit consent, and yeah, in that moment that
feels incredibly annoying. But balance that against the 100s or 1000s of times
you used google maps and it just worked, perfectly, because it reduced the
number of inputs needed to use it to the bare minimum.

Google Maps doesn't need to fit every use case perfectly, and while it's fine
to talk through how your hyper-specific use case could and should work,
remember all the times it seamlessly routed you around traffic from your
office to your house in one touch while you were already hurtling down the
highway at 70 mph.

~~~
PostPost
The example for maps is not a "hyper specific use case": "The process you
WANT: pick your start and end. now start searching for places in between. Your
start and end are saved. When you find someplace interesting, add it to your
list. Keep doing that, keep searching and adding."

That's a common use case. The problem with Google maps (and the problem with a
lot of modern software) is, as you say, it makes a lot of guesses.

The definition of a good user interface is "to meet the exact needs of the
customer, without fuss or bother"*

Google Maps is _great_ for finding directions to a very specific place. But
after mapping those directions, doing almost anything else destroys that
route. If I have to (and I do) open multiple map tabs, or repeatedly enter the
same route info after making a search (if I'm on a phone) it is not a good UI.

*[https://www.nngroup.com/articles/definition-user-experience/](https://www.nngroup.com/articles/definition-user-experience/)

~~~
sonofaplum
Why do you think that use case is that common? And why do you think google
maps should elevate that particular use case above other competing use cases?

~~~
hyperdeficit
I hadn't even thought of this specifically, but after it was mentioned in the
initial post I realized just how much that is my use case most of the time and
how often I am fighting with google maps to accomplish what are relatively
simple tasks like adding an additional point along a route. If you are trying
to add multiple additional points on a route it gets even worse.

This is all much worse on the Android app as well, where it makes the
assumption that your use case is to get from where you are right now to
somewhere else. Trying to get from point A to B, where neither is where you
are now, is unnecessarily frustrating.

~~~
jakelazaroff
_> This is all much worse on the Android app as well, where it makes the
assumption that your use case is to get from where you are right now to
somewhere else._

That strikes me as a _fantastic_ assumption. I wonder what percentage of
routes involve the user’s current location? I bet it’s high!

~~~
eitland
Yep. But it used to be even better when it made that assumption clear by
adding a pre-filled box with your current location.

It just worked for the default case but when you needed something else it was
straightforward to do that.

~~~
StavrosK
Doesn't it? It gives me two boxes, "current location" and "destination" and I
can change either.

~~~
eitland
Straight away after you open Google Maps on a mobile?

~~~
StavrosK
No, I open it, look for the destination, then press directions and can edit
the starting point.

~~~
eitland
Then we agree. I find that utterly annoying since I've seen how simple it
could be but it seems many people disagree with me :-)

------
lolc
What bugs me about these nostalgic rants is the assumption that the
convenience of today should somehow be mixed with the frugal interfaces of the
past.

So you couldn't be bothered to find a better route planner and defaulted to
Google Maps? Which you got for free, instantly available? And now you're
unhappy because it doesn't do exactly what you want? Please spend some effort
on your tool selection before you spend time ranting. And let's turn this
around: Did you at some point in 1990 wish you'd brought a map? How did you
resolve that? Google Maps on a pocket computer would have been a marvel back
then!

~~~
cgh
His general point is that the point and click interface is a blunt tool and
there is room for specialisation. The POS example in the tweet-rant is ugly
and non-intuitive but the perfect tool for the job. I'm not sure how I'd
improve Google Maps exactly because I haven't really thought about it, but I
do find it frustrating enough that I end up breaking out the good old BC
Backroad Mapbook from time to time[0].

0\. [https://www.backroadmapbooks.com/](https://www.backroadmapbooks.com/)

------
halfcreative
I don't understand the purported problem this man is facing when using Google
Maps. Like, step by step, what is going on that makes Google Maps erase what
you're doing? I've never had a problem as described, and seeing these other
comments makes me question whether I'm using Google Maps in a drastically
different way than the common person, or if the people complaining about this
issue are using it weirdly, or if there's some different version of Google
Maps that we are using.

If I open up Maps, type in the location I want to go to, click 'directions'
and add my starting location, I can click and drag all over, move the map
around, and zoom in and out to look at the route, and my info does not
disappear. I click "add a destination" if I want to add a destination. If I
want distance from the center of town, I change the starting location of my
directions to the center of town.

For the people who face issues with Google Maps: can you please describe what
you are trying to accomplish and the steps you take that result in the info on
screen disappearing? I'm genuinely curious about what I might be doing
differently to have such a seamless experience vs. the awful experience
described.

------
dwheeler
It is absolutely true. I don't know of any current computer that comes close
to the low latency of the 8-bit Apple //e for example. Here's a survey:

[https://www.extremetech.com/computing/261148-modern-computer...](https://www.extremetech.com/computing/261148-modern-computers-struggle-match-input-latency-apple-iie)

That survey shows that the Apple //e latency from keypress to screen display
of 30 milliseconds is something current computers don't even approach, even
though their processors are far faster.

Here's another article: [https://www.pcgamer.com/the-latency-problem-why-modern-gamin...](https://www.pcgamer.com/the-latency-problem-why-modern-gaming-pcs-are-slower-than-an-apple-ii/)

Of course there are reasons for this. We demand a lot more functionality. But
there are costs to the functionality. In addition, most systems are not
designed to minimize latency or jitter.

I believe we could do a lot better, but it would require that more hardware
and software developers care about it.

------
regulation_d
In aggregate, search is orders of magnitude faster because it's so much more
accurate. When was the last time you had to think about your search terms?

The other day I was thinking about the Baader-Meinhof phenomenon, but I
couldn't remember the name of it. So I googled "when you suddenly notice
something everywhere" and Baader-Meinhof was the first result. Go back to 1983
(maybe even 2003) and let me know how long it takes to structure your search
terms so you get the right answer.

~~~
AnIdiotOnTheNet
> When was the last time you had to think about your search terms?

Literally every day because Google has become so ridiculously bad at guessing
what I want. I'm trying to tell it what I want, but it keeps throwing out
words or adding synonyms or whatever else the fuck it does.

------
MadWombat
"amber-screen library computer in 1998: type in two words and hit F3. search
results appear instantly"

Bullshit. The search results appear instantly if you are searching a small
text file in an editor. But if your app is actually fetching the data from
somewhere (a dBase database on a network disk mounted from a NetWare server),
good luck. You type your words, you hit the search key, and then you wait.
There is no indication that anything is actually happening: no spinning
things, no progress indicator, nothing. You cannot do anything while you wait
either (multitasking is not a thing in DOS, remember?); you just sit there and
wait, hoping for the best.

And this is an application that is trying to keyword-search a few megabytes of
data over a local network, not an application doing a fuzzy search on hundreds
of terabytes of data across half a planet...

~~~
CrankyBear
Exactly so. I used those computers back in the day; I actually designed
library software in the 80s. Users today would go crazy waiting for a hard
drive a 10 Mbps Ethernet connection away to slowly pull data out of a dBase
database.

------
Someone1234
This is a very hard rant to read. The examples either aren't factually true or
are poorly explained and the "solution" is largely nebulous.

I feel like this topic has merit and with a well written article with real
examples (both 2019 and 83) it could be something special, but this isn't
that.

I'd be interested to know how many upvoted based on the title/what they
expected this to be, rather than after having tried to read it. Most of the
replies from the older thread are about the title alone (or being critical of
the content).

~~~
asdfman123
I appreciated his complaints, but not his solution. Only a programmer would
think that going back to keyboards is a good idea.

Most people are happy waiting for a few seconds for their webpage to load, if
it means they can do it without learning anything new.

Programming is a profession, and we design software for people who have other
interests than software. Many of us would be better off if we got better at
empathizing with people who are not like us.

~~~
mlyle
> Only a programmer would think that going back to keyboards is a good idea.

He goes too far and his suggestions are nebulous.

But it sure would be nice to go back _somewhat_ to the keyboard.

There are so many apps with broken tab-orders; so many common operations
without common shortcuts; so many badly tuned completion engines; bad
interactions with autofill, etc. There's a whole lot of stuff that I could do
faster on the keyboard, but I am being constantly trained to not dare assuming
I can fly through and the right things will happen. In any application that I
am not positive will do the "right things", I am slow and tentative.

It's bad for accessibility, too. A vision deficit or dexterity deficit impacts
mouse use harshly.

We need to go back to making the keyboard experience good. Not just in
individual applications, but across the board. While we're at it, we should
realize that being too free in design choices has negative impacts on users.
There was a time that Apple really cared about this stuff, for instance, and
usability on Apples excelled because you knew that there was a lot of effort
in the developer community to do the right things and conform to common
standards.

~~~
hindsightbias
And you have to keep 3 browsers around because 2 of them are broken on some
field. I wonder how often updates cratered green-screen consoles back in the
day. Curses could be tricky at times.

The path for many commercial UIs seems to be to map out complex processes into
the most linear common path so that anyone off the street could do it. All
that mouse action kneecaps productivity, and as soon as you come to an
exception you enter a hell of popups or bazaar of UI elements.

Then of course they chop out all the keyboard shortcuts.

Great UX is sort of like fusion, it’s always 30 years away.

~~~
sudosysgen
Great UX is worse than fusion - at least once we get fusion done it'll be
worked out, but UX will always be on the brink of regression.

------
rayiner
Computers jumped the shark in 1998. I remember dual-booting NT4 and BeOS on a
PII 300 MHz with 64MB of RAM. Connected to a 256 kbps SDSL modem, it was the
best computing experience I've ever had. You could do everything you can on a
modern machine. Indeed, even more, because the modern versions of software
like OneNote and Word are neutered compared to the ‘97 versions.

It feels like all of this effort has been spent making computer interfaces
worse. The only improvement I can point to between MFC and a modern web UI is
DPI scalability. Besides that, there are tons of regressions, from keyboard
accessibility to consistency in look, feel, and operation.

~~~
geogra4
Yes, that late 90s/early 2000s nirvana. I could perfectly browse the web and
write documents and send emails on a PowerMac G3 without too much fuss. Only
things like photo/video/audio manipulation were really lacking from machines
at the time compared to today.

~~~
Multicomp
I don't know if it's the baby duck syndrome, rose tinted glasses, or 'back-in-
my-day-ism', but I agree with you wholeheartedly.

Computing for me peaked in about 2002-2005 (I'm young, sorry), coasted okay
until about 2010, then began gaining weight until today where I have a sense
that computing latency is like a type 2 diabetic man having his midlife
crisis. Either he loses some weight, or he has a heart attack and dies.

I agree on your point re: number crunching for rendering videos, manipulating
photos, audio DSP etc.

------
derpherpsson
The reason for this is, among other things, systems programmers wrote the UIs
in 1983. Today random twenty-year-old web-muppets write the UIs.

The system programmers of 1983 were used to low-level programming, and most of
them had probably written code in assembler. Web programmers seldom have that
deep understanding of the computer.

At least, this is true from my own personal experience.

------
cgh
I agree with much of this, mainly because as a teenager in the ‘80s, I
witnessed the speed and efficiency of secretaries navigating WordPerfect’s
non-GUI. The POS UI in the tweet thread is similar. There’s certainly room to
rethink the use of GUIs everywhere for everyone.

------
duxup
I distinctly remember watching the Netscape Navigator logo for long periods of
time waiting for a page to load.

~~~
wolco
You should have turned images off.

~~~
duxup
I wonder if there was an "images are ruining the internet" panic like the
modern-day "Look my blog is just html, why can't everything be that way?"
rants?

~~~
MiddleEndian
I just remember wishing it was faster. Now that it's faster I still wish it
was faster.

------
johnklos
Even reading that stupid post is slower because the poster can't be bothered
with capitalization, punctuation, or complete sentences.

~~~
krick
Yeah, this looks like a collection of badly written tweets. Who types like
that?

------
azinman2
I think this is a case of selective memory. I’m restoring and programming for
an old Mac Plus, and it’s anything but blindingly fast. Booting, opening an
application, hell even dragging a window has visible refresh of the contents
behind it. Windows used to take forever to boot (it still isn’t that fast), and
anything requiring networking was orders of magnitude slower.

------
jkaptur
For those interested in metrics, Dan Luu wrote a really cool article that
measured input latency on computers from different times:
[https://danluu.com/input-lag/](https://danluu.com/input-lag/)

------
333c
As I scrolled through this, using my mouse to click the heart on a number of
the tweets, I really understood the point he makes about actions that don't
need to use a mouse. When I program in vim and run things on the command line,
I don't use my mouse. Even when I browse Reddit, I don't (often) use my mouse
because of the RES extension, which lets me browse with keyboard shortcuts. I
haven't really thought about how much easier things generally are when I'm not
using my mouse, and I wish there were a similar extension for other websites.
Does such an extension exist for Twitter, or for HN?

------
Aaronstotle
I worked at a Fry's for about a month back in 2014, when they were phasing
out the blazing-fast POS system he mentions and moving to a web-based one.
Nearly every employee hated it, and it made everything much slower.

------
slowhand09
This post is GOLD! I'd prefer WordPerfect or MS Word in DOS over any of them
in Windows. Same with 1-2-3, SuperCalc, Excel. The Windows versions are so
kludgey. I could write 3x as fast as I can now.

~~~
ttctciyf
Or Wordstar.. :)

Remember when you could see _and edit_ embedded format codes in your word
processor? Surely that never happened!

------
exabrial
Speed is always the #1 feature: [https://varvy.com/pagespeed/wicked-
fast.html](https://varvy.com/pagespeed/wicked-fast.html)

------
ksaj
I got a Raspberry Pi 4 with 4G of RAM. It is so much faster at booting and
logging in than my Mac Pro, by far. And that Mac Pro has a whole lot more
cores and memory, etc.

How much LESS am I getting out of the Raspberry Pi? I'm not exactly sure. I
just know I can reboot in seconds and get back online while my Mac is still
showing that stupid grey logo screen.

Of course, the Raspberry Pi can't come close to running Logic Audio and 5
virtual machines simultaneously doing other things in the background as I
generally do on that Mac. But on the boot-up-to-useful-state metric, the much
lower-specced device wins by far.

For daily stuff like going from OFF to editing an office document, the Mac is
sloth slow. It's a very interesting and wide divergence of competencies when
it comes to performance and ability.

We are definitely witnessing the latter part of the Law of Diminishing Returns
when it comes to updating and upgrading computer equipment.

I can't say whether or not we've reached the apex, but we sure are close if we
haven't. Today, anyone can have a super computer. But nearly nobody does.

------
WaltPurvis
_" There's no reason for Twitter to use a mouse. There's nothing mousey about
this website, not a damn thing twitter: i need to navigate through a linear
list and perform one of four actions on discrete items, almost all text-
based"_

He's apparently never bothered to learn Twitter's built-in keyboard shortcuts.
You can do almost everything on Twitter without touching a mouse.

~~~
oneeyedpigeon
Any reference? Closest I can find on Twitter's site itself is this [1] but the
second shortcut (l to like) doesn't work for me.

[1] [https://help.twitter.com/en/using-twitter/how-to-
tweet](https://help.twitter.com/en/using-twitter/how-to-tweet)

Edit: OK, it works, when you use default keyboard controls (e.g. tab) to
select the tweet _first_. I'd assumed it would work from a 'single tweet' page
without having to do anything else first. Overall, looks promising, but not as
nice as the keyboard controls in gmail, for example.

------
tom_
As somebody who actually uses a computer from 1983, I question the
conclusion.

The speed of modern PCs is surprisingly bad, relatively speaking, but this
mainly just means they aren't as ridiculously much quicker as they probably
should be.

(The main issue cited is not meaningful, in my view, as computers from 1983
just couldn't do graphical maps at interactive frame rates at all.)

------
msla
I don't know, I can compile code perceptually instantly and get optimizations
which perceptually instant compilers didn't do in 1983. I know that Turbo
Pascal was very fast when it came to compiling Pascal, and I certainly
appreciate the effort which went into making that happen, but Turbo Pascal
didn't turn string handling code into SIMD operations after turning a
recursive function into an unrolled loop.

Partly, this was due to the fact the IBM PC did not have SIMD operations in
1983, but the rest is because modern compilers run complex optimization passes
in tiny fractions of a second.
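To make the transformation concrete, here is a toy Python sketch of the recursion-to-loop rewrite described above (the function names are illustrative; the SIMD part only applies in compiled languages, where the optimizer would also vectorize the adds):

```python
def rec_sum(xs):
    # the "natural" recursive form a programmer might write
    if not xs:
        return 0
    return xs[0] + rec_sum(xs[1:])

def loop_sum(xs):
    # the loop an optimizing compiler effectively rewrites it into;
    # real compilers go further, e.g. vectorizing the additions
    total = 0
    for x in xs:
        total += x
    return total

data = list(range(500))
print(rec_sum(data), loop_sum(data))  # same answer, very different cost
```

Both produce identical results; the loop form avoids the call overhead and the O(n²) list slicing of the recursive version.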

Also, I'd like to see this person say that operations on computers were
perceptually instant while logged into a PDP-11/70 or (let's be a tad
generous) a VAX-11/780 with a half-dozen other people all trying to use it at
the same time. Yes, faster computers existed in 1983. Didn't mean the likes of
you got to use them.

------
Koshkin
This also reminds me of the conspiracy perpetrated by the industry to remove
the antiglare treatment of laptop screens and at the same time reduce the
screen area by changing the aspect ratio. All this, of course, was done solely
for the benefit of the consumer. (Oh, that, and a wild ornament around the
keyboard.)
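The screen-area point checks out arithmetically; a quick sketch (generic geometry, not tied to any particular laptop model):

```python
import math

def panel_area(diagonal, w_ratio, h_ratio):
    # recover width and height from the diagonal via Pythagoras,
    # then multiply for the area (inches in, square inches out)
    scale = diagonal / math.hypot(w_ratio, h_ratio)
    return (w_ratio * scale) * (h_ratio * scale)

# same 15" diagonal, two aspect ratios
print(round(panel_area(15, 4, 3), 1))   # 4:3  -> 108.0 sq in
print(round(panel_area(15, 16, 9), 1))  # 16:9 -> 96.1 sq in, ~11% less
```

Same advertised diagonal, noticeably less glass.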

------
TheOtherHobbes
I used to know a computer music academic who used to attempt to run FFT
convolution and other DSP processes on an Atari ST.

One time he started a process, went away on holiday for two weeks, and it was
still running when he got back.

These days it would be much faster than real time - not just native, but _in a
web browser._
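For scale, FFT convolution is simple enough to sketch in a few lines; even this unoptimized pure-Python version (a textbook radix-2 Cooley-Tukey, not whatever DSP package the academic was using) runs interactively on inputs that would have taken the ST ages:

```python
import cmath

def fft(x):
    # radix-2 Cooley-Tukey; len(x) must be a power of two
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(x):
    # inverse transform via the conjugate trick
    n = len(x)
    y = fft([v.conjugate() for v in x])
    return [v.conjugate() / n for v in y]

def fft_convolve(a, b):
    # zero-pad both inputs to the next power of two >= len(a)+len(b)-1,
    # multiply the spectra, and transform back: O(n log n) overall
    n = 1
    while n < len(a) + len(b) - 1:
        n *= 2
    fa = fft([complex(v) for v in a] + [0j] * (n - len(a)))
    fb = fft([complex(v) for v in b] + [0j] * (n - len(b)))
    res = ifft([x * y for x, y in zip(fa, fb)])
    return [round(v.real, 6) for v in res[: len(a) + len(b) - 1]]

print(fft_convolve([1, 2, 3], [4, 5, 6]))  # [4.0, 13.0, 28.0, 27.0, 18.0]
```

The O(n log n) transform is the whole trick: direct convolution of two long signals is quadratic, which is what ate the ST's two weeks.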

------
thedaemon
Well, I think it's much more valid to claim it's much slower than 2000. Take,
for instance, Microsoft Office products. On a business-class desktop, i3-i5,
they are much slower than they were in 2000 in all respects. Software has
become slower as computers have zoomed ahead in speed.

~~~
geogra4
I agree I think this is generally more accurate. Browsing the web, writing
documents, and navigating around (modest-sized, not enterprise-sized)
spreadsheets feels much slower than it did in the late 90s/early 2000s. I
won't go back as far as 1983, but maybe 2003?

------
microtherion
The author must have lived in a different 1983 from me. I remember Commodore
BASIC freezing up for a minute to perform garbage collection on its 30K of
heap if you created too many temp strings.

The C-beams glittering in the dark were pretty cool, though.

------
jborichevskiy
I agree with a lot of this.

Every app wants you to "experience" its data solely in its walled garden with
no ability to cross-compare data. Say I want to find a coffee shop on the way
to somewhere but also one which appears in a list of great coffee shops on a
separate Reddit post I found. Pretty much impossible on mobile, and a major
pain on desktop.

At the very least, give me the ability to hide records/items/instances I've
already seen as I'm working my way through several searches or lists.
Oftentimes, searching things on Yelp and Google feels like re-reading the same
list, just differently ordered, over and over again.

------
8bitsrule
I got mad just yesterday at a new transit system interface with a map. It has
moving icons to represent the present locations of buses. I clicked on a
single point in the map, and it popped open a big black box of information. So
big that it covered the icons of interest. Hmmm. If I could move the box to
one side, I could leave it open ... nope, the box won't move.

Bad enough. So I wanted to close the box. That took a minute, because you
couldn't just click anywhere inside, you had to click at one single ... and
unmarked ... location. Here? Here? Here?

------
ineedasername
When comparing to amber/green screen interfaces, hatred of the mouse is odd.
Sure, for a simple search interface the text interface was fine-- search
field, arrow down through results. But for anything more complex you end up
having to tab between a dozen or more fields. ERP interfaces were especially
tedious in this regard, so the author's nostalgia for the "solved" problems of
that age isn't completely warranted, and the mouse is at worst just as bad.

------
znpy
In 1983 there was no network delay when editing a document. Nowadays it's "in
the cloud!"

------
PaulHoule
The "things shifting around" may be very deliberate.

I've noticed that many advertising-supported pages (say trade publications
such as 'FierceWireless') are a disaster on mobile with ads and other
intrusive pop-ins causing page elements to move all around to the point where
it isn't worth trying to click anything because it won't be there by the time
your finger gets there -- but "oops!" you clicked on an ad so the cash
register goes KA-CHING!

------
awinter-py
The maps feedback is spot on -- as a consumer I want a decision support tool
that helps me run searches reliably and quickly. I think G wants to show me a
maximum of three things at once so they can optimize ad clicks.

G maps had the option to be excel and instead chose to be the bottom shelf of
the cereal aisle. It's fine to treat your users like consumers instead of
power users, but that opens a hole in the market for a city-aware, mobile-
friendly GIS tool.

------
cjbprime
I wanna know more about these super fast library computers. The public library
I worked in, in 1999, used computer terminals that were just doing full-screen
telnet sessions to a remote Internet host. It was like connecting to a BBS
from my home modem.

Random other info: One time the application on the remote host crashed and
dropped me to a Unix shell. ls showed the directory had around 50000 files.
The system's name was "DS Galaxy 2000".

------
zrm
See also:

[https://en.wikipedia.org/wiki/Wirth's_law](https://en.wikipedia.org/wiki/Wirth's_law)

~~~
mikro2nd
Ah, the same Nickel's Worth who said (iirc), "Europeans call me by name.
Americans call me by value."

------
lez
Reading through the comments, it would seem that the article is a false alarm.
The count of upvotes suggests the opposite.

My opinion is that the baseline for what we accept as good (and quick)
service has been slowly lifted by the industry. At the same time, specific
design practices were introduced to ease our frustration with slow service,
so that it at least "feels" quick.

~~~
krick
It's a common trend. A headline with a popular opinion says something a
typical HN user is inclined to agree with without even reading it, so there
will be a bunch of upvotes and some "me too" comments (disguised as personal
anecdotes). But those who actually clicked the link will find a number of
issues with its content and will say so here. And here we are: lots of
upvotes, and comments suggesting it isn't really worth the attention.

------
christopoulos
I feel like it’s a behavior encouraged by Google’s Material Design or derived
projects: slide details in and out of the screen (Google Maps pin details, for
example), hide input boundaries (no border on search text input fields), tons
of sporadically placed spinners.

I really dislike that tendency in design and behavior and find it
counterproductive.

------
thibaut_barrere
Sorry, but in 1983 I was using CLOAD and CSAVE, and it took very long minutes
to save & load anything (on cassette!).

~~~
cgh
If you read the article, he's talking about the UI experience for certain
narrow applications, not I/O on '80s 8-bit home computers.

------
tangue
I won't go back to 1983, but I have an old Mac with Snow Leopard, and TextEdit
is way more responsive on the old iMac than on my new MacBook Pro. I know
there was some change due to security, but I definitely feel this one. And I
won't talk about the Adobe Suite ...

------
tikiman163
Jesus Christ on a cracker people, are you all really this stupid? The
difference between 1983 and 2017 is the magnitude of information and
functionality. In 1983, a whole MB of RAM was what supercomputers ran on.
Your average desktop typically had around 2 KB of RAM. My desktop today runs
16 GB; that's 8 million times more data.

First of all, when data sets start to get that big it becomes a monumental
task to organize just the execution order of the compiled code. Second, the
main thing the bitch from the article is complaining about not being able to
do is something you just can't do from the phone UI. Google has a trip planner
app that specifically helps locate interesting landmarks between the start and
destination points, and even plan where to stop for gas or find a hotel for
the night all from a single tab. For having spent so long writing his rant,
I'm surprised he never tried googling a trip planner app. It would have been
faster and prevented him from proving what an idiot he is.

------
Isamu
Well, in 1983 I was waiting all day for my program to load from floppy. Then
maybe I had to swap disks out and wait again. But yeah, if you had one of
those awesome 10 Meg drives, things would load fast!

Then you would run out of memory if you were doing anything ambitious.

------
donohoe
In December of 2018, the average news article took 24 seconds to load,
consumed 3.58MB of your data plan, and Google scored it a Speed Index value of
11,721 (a Speed Index of 3,000 or lower is considered good).

It has gotten worse this year.

------
hsnewman
So what's with the punctuation (no capitalization, no commas, periods, etc.)?

------
annoyingnoob
When I worked on an IBM mainframe, everything we did had a guaranteed response
time built-in. We had to plan and engineer the required response time. Seems
like we have forgotten to care about response time.

------
foreigner
This especially drives me crazy about my phone. I'm not talking about using it
as a web browser or handheld computer, just basic phone functionality like
making a call. It's so damn slow! WHY?

~~~
tartoran
Bloat

------
kingkawn
I wish everything was optimized for speed of interaction above all other
metrics.

Make it so.

------
coupdejarnac
The human condition in a nutshell. We have to relearn the same lessons every
20 to 30 years and probably much more often in the software world. Progress is
mostly sideways instead of forward.

------
Tempest1981
Earlier discussion:
[https://news.ycombinator.com/item?id=21831931](https://news.ycombinator.com/item?id=21831931)

------
bluSCALE4
The guy hates the commoditized web. Get in line. It's not about what you
want, it's about what the service you're using is trying to sell you.

------
domador
Ironically, this rant is posted in chunks on Twitter, making it hard to
actually read it in a fluid manner. How about a blog post with paragraphs
instead?

------
brosinante
Has anyone ever also noticed how cars are always heavier than the bikes we
used when we were children. And how much more air you need and how much longer
it takes to fill a car tire with a hand pump.

Also, remember when none of us was a Portuguese author and we used paragraphs
rationally and wrote things down without splitting them into 200 letter long
strings because we used mediums made for rants and did not need a separate app
to wrap it all into something vaguely resembling text?

Also has anyone noticed that time passes and things change?

~~~
ryanianian
Your comment is ad-hominem (dismissive) and does not contribute to the
conversation. You can dislike the style/syntax of the article, but the point
it's making isn't "things change"; it's that "things change for the worse,"
complete with several examples. Other commenters here either support or
disagree with those examples, but helpful comments address the article's
central points in good-faith conversation.

~~~
xpe
I am not confident you are using ad-hominem correctly. From the Merriam
Webster online dictionary:

“1 : appealing to feelings or prejudices rather than intellect ('an ad
hominem argument')

2 : marked by or being an attack on an opponent's character rather than by an
answer to the contentions made ('made an ad hominem personal attack on his
rival')”

(Here on HN, I think that #2 is the most common usage.)

------
jcadam
Hey, the purpose of a computer is to consume electricity while running IA-
mandated corporate policy enforcement and virus scanning software. That we
provide users with monitors and input devices so that they can use any
remaining spare CPU cycles on whatever machines just happen to be located near
their desks to help themselves do their "jobs" is just icing on the cake.

They should be grateful if opening a modestly-sized word document takes less
than 10 seconds.

------
KaoruAoiShiho
Isn't it obvious? It's because webapps get data from very far away, and back
in the day the data was right there.

------
baybal2
I like programming in C, but it seems that C in GUI dev is slowly dying.

GTK is truly the last bastion, and even it took a hit with GTK 3.0

------
peter_retief
Absolutely agree, I remember when hospital clinical data was captured by
keyboard hotkeys with no mice. It was quick to learn and lightning fast to
capture and search. Along came progress in the form of a Visual Basic
frontend. Network traffic escalated, frozen screens became common, data was
corrupted, the computer that was supposed to save time sucked up the energy of
busy professionals. This was 30 years ago and it is still in use.

------
aj7
As Pauli said, “That’s not even wrong.”

------
Vaslo
I don’t know, I had a Commodore 64 back in the 80s, and loading those games
was painful.

------
arpa
So you hate the mouse. The mouse is a benevolent dictator. It tries to help. You
want to know who Stalin is? Fucking touchscreens. You're blind? Fuck you. You
have Parkinsons? Off to the death camp you go. You want to type in your pocket
while paying zero attention to the poison screen which is taken up by half by
an idiotic keyboard on which you will continuously mistype and be moronically
autocorrected to a point where you WILL LONG FOR THE GOOD OLD DAYS OF TYPING
WITH T9 - enjoy the bullet to the back of your head you sexual deviant. You
will drag, pinch, stretch, lick the glossy oh-so-fragile glass screen, you
will fucking obey the tech giants that want your eyes glued to the screen so
you can see more ads until you lose it and start loving the uncaring big
brother and all the pain you are brought from him. Mice are evil. Pshaw.

~~~
glitchc
Do you need a hug?

~~~
arpa
Actually, honestly, yes.

------
zwieback
This is two years old - a lot of stuff in this rant already sounds totally
obsolete. Early web apps were crap and we're still learning how to do stuff
that works on both phones and PCs but the sentiment expressed here seems off
base now.

------
dmead
I got really annoyed when new motherboards stopped including PS/2 ports for
keyboards and mice.

Not surprisingly, my railgun accuracy in Quake 3 started to decline after
that.

I blame the ports.

------
elproxy
Explained before the advent of computers as Jevons’ Paradox

------
strathmeyer
Before web tracking, webpages would load instantly.

------
ttctciyf
> Suppose during my regular game development everyday life I've installed
> Photoshop recently and I want a look at a screenshot of the game - someone
> reported a problem or something.

> So I double-click on the thing ... one ... two ... three ... four ... five
> ... six ... it's about seven seconds before I can actually see the image.
> [...]

> So that's really slow and I'm going to talk about that for a bit, but
> there's an element of severe irony to this which is that as soon as I double
> click this thing within one second _it draws an image_ and it's a pretty
> high resolution interesting image. It's just not the image that I care
> about.

> So obviously it's not hard to start a process and draw an image on the
> screen in much less than seven seconds. They just don't manage that.

> Now. I gave a speech a year ago that started this same way. That was 2016.
> It's now 2017 and of course a new version of Photoshop has come out and of
> course what will align directly with my point in the next few slides:
> they've made it worse. And the great way in which they've made it worse is:
> say there's some operation that you want to do maybe once in a while like
> create a new image... So I'm going to go to file, new ... Urghh. And that
> menu takes - it probably takes about a second to come up.

> And you might think "Oh, well, you know it was just all these assets were
> cold or something.. Maybe they come off the hard drive. It'll be faster next
> time." And it's like: "Well, let's test that out. Nope." Like every time.
> I'll use the keyboard shortcuts. File. New. Nyeaarghh. [...]

> Imagine if the people who programmed this were trying to make VR games.
> Everybody would be vomiting everywhere all the time.

> Well what machine am I running this on? It's actually a pretty fast machine,
> it's a Razer Blade laptop with a pretty high-end i7 in it, and you can talk
> about how fast the CPU is or the GPU is and some arbitrary measurement and
> I'm going to discuss CPU speeds here and I want to say in advance that none
> of what I'm about to say is meant to be precise or precise measurements. I'm
> making a general point.

> And the general point is that the CPU of this thing would have been
> approximately the fastest computer in the world when I was in college. Or
> the GPU would have been the fastest computer in the world in the year 2000
> or thereabouts.

> Now you might say "That's a really long time ago. This is ancient Stone
> Age." Well. Photoshop was released in 1990 before either of those dates. And
> Photoshop 6, which I used heavily during my earlier days in game development
> is from the year 2000. And this is what it looks like. This is a screenshot
> of Photoshop 6. It's got all the same UI that Photoshop has today. It's got
> all these same control widgets and it's got our layers and channels and all
> that stuff. Today the UI is a different colour but apart from that it's
> essentially the same program.

> Now I don't doubt that it has many more features. But you have to ask how
> many more features are there and what level of slowdown does that justify?

\- Jonathan Blow, Reboot Develop 2017:
[https://www.youtube.com/watch?v=De0Am_QcZiQ&t=155](https://www.youtube.com/watch?v=De0Am_QcZiQ&t=155)

------
cabaalis
What is this? Are these tweets? It's like reading a programming language
where vast amounts of syntax are optional. This style of blogging is making
the world worse.

~~~
FemmeAndroid
This is a website that 'rolls up' tweets. Here are the original tweets:
[https://twitter.com/gravislizard/status/927593460642615296](https://twitter.com/gravislizard/status/927593460642615296)

~~~
dang
We've changed the URL to that from
[https://threadreaderapp.com/thread/927593460642615296.html](https://threadreaderapp.com/thread/927593460642615296.html).
The community is divided on which interface it prefers, so we usually break
the tie by linking to the original source, which the site guidelines call for
anyhow.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
Deleriumm
_oops you pressed a key, your results are erased_

Every damn time!

------
buboard
Thread reader loads fast; that's because it doesn't force you to watch the
page "booting up" and then its equivalent of the Windows hourglass (AJAX
spinners). Maybe we should try a browser without JavaScript ...

------
blackearl
>Mice are bad. Mice are absolutely terrible.

Bah humbug, I hate modern technology!

This luddite author should go into goat herding if they hate modern tech so
much.

~~~
keedon
The computer mouse was invented back in the 60s; it's hardly modern technology.

~~~
blackearl
Doesn't that make the author even more wrong?

------
alexnewman
Thank G-d. I have been saying this for years. Code is:

\- Getting less stable

\- Getting less speedy

\- Getting less efficient

I blame the fact that old programmers and young programmers do not have the
appropriate relationship.

