
Almost everything on computers is perceptually slower than it was in 1983 - dredmorbius
https://tttthreads.com/thread/927593460642615296
======
Declanomous
This could be a copy-paste of rants I have written. I think marketing has
ruined pretty much everything. Everything has a subscription now, and it's _in
the cloud_.

My experience is that everything is slower than it was when I first started
using computers as a child in the '90s, and does its job worse. You don't own
anything anymore, upgrades routinely take away features, and nothing
interfaces with anything else.

There are a lot of reasons why building an app on a web framework makes sense,
but the fact that I routinely lose access to applications I've paid for
because the company has gone out of business or stopped supporting them drives
me insane. I can still install Excel 97 if I want to, but I have 0 confidence
that anything I've bought from Google will work a year from now.

I mean, there is perceptible lag in character entry that never used to happen.
Sure, my computer used to freeze inexplicably, but Atom crashes often enough
that I don't consider that a win, and text has a noticeable delay before it
appears on the screen when using Atom. Text entry is something I can implement
flawlessly on a microcontroller and the majority of applications I use on a
daily basis get it wrong.

~~~
bonaldi
We've even managed to slow down things like Excel 97. I played with a 486
laptop recently, and was completely amazed at how fast things like Word felt
(once they'd loaded off the ancient hard disk).

The most immediately notable part was the typing, which felt completely
instantaneous. I haven't found out exactly why, but the switch from serial to
USB keyboard interfaces (and the lag therein) as well as the various windowing
pipeline "improvements" over the years seem to be in the mix.
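For what it's worth, the USB half of that hunch checks out arithmetically: an AT/PS/2 keyboard raises a hardware interrupt the moment a key goes down, while a low-speed USB HID keyboard is only polled by the host, commonly every 8 ms. A back-of-envelope sketch (the 8 ms figure is a typical default, not a measurement of any particular keyboard):

```python
# PS/2 keyboards are interrupt-driven; low-speed USB HID keyboards are
# polled by the host at a fixed interval (commonly 8 ms, i.e. 125 Hz).
USB_POLL_INTERVAL_MS = 8.0  # typical low-speed HID polling interval

def worst_case_usb_delay_ms(interval_ms: float = USB_POLL_INTERVAL_MS) -> float:
    """Worst case: the key goes down just after a poll, so the report
    waits a full polling interval before the host sees it."""
    return interval_ms

def average_usb_delay_ms(interval_ms: float = USB_POLL_INTERVAL_MS) -> float:
    """On average the keypress lands halfway through the interval."""
    return interval_ms / 2

print(worst_case_usb_delay_ms(), average_usb_delay_ms())  # 8.0 4.0
```

A few extra milliseconds per keystroke is small on its own, but it stacks on top of event-queue, compositor, and rendering latency, which is presumably the "windowing pipeline" part of the mix.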

~~~
gugagore
On PCs, keyboards were never "serial" as in "the serial port" (mice were), but
they were always "serial" in the same way that the USB port is "serial".
Perhaps it's better said that the switch from a special-purpose interface to
USB is what introduced the lag.

~~~
crankylinuxuser
I know that on microcontrollers, if you want to reduce jitter (sometimes by up
to an order of magnitude), you disable USB. I've noticed the jitter in my
interrupt loops on ARMs goes way down when the micro doesn't have to service
USB.

But then you have to use an STLink to upload programs.

------
ambrosite
I wonder how much of this is a result of UI designers deliberately putting
tiny delays/animations into their UIs to make them more "usable"?

For example, it was a common piece of advice that if clicking a button opens
up a dialog box, there should be a brief animation showing the box expanding
from the direction of the button, so the user would associate the clicking of
the button with the appearance of the box and understand where it came from.
This does actually improve usability the first few times it happens, but over
hundreds of iterations these tiny delays really slow down the overall
experience of using the interface.

~~~
kgwxd
Just last week I discovered I could disable animations in Android through the
hidden developer options. It's fabulous. The only initial quirk was no visual
feedback when a picture is successfully taken, so it felt like I missed the
button, but I've gotten used to it already.

~~~
tome
Does anyone know if you can do the same with iPhone?

~~~
kec
Settings -> General -> Accessibility -> Reduce Motion

~~~
tome
Oh nice, thanks! Unfortunately it seems that apps still animate.

~~~
SilasX
Yeah and I still see a lag from opening apps.

------
vikramkrishnan
The front-end developer at one of the companies I worked for nearly fainted
when I told him I like Craigslist's web interface. The only criticism he could
muster was "But it looks like the 90s". Then he proceeded to tell me off
"because you don't want to learn how to use a new interface". Of course I
don't want to; it is a skill with zero transferability. Learning how to be
productive with an interface that only works with one given app, for a
duration of six months until some bozo changes the whole UI because some focus
group decided that "it would look so pretty if ..." (and that is the best-case
scenario; mostly it is because some UI/UX guy was inspired while sacrificing
chickens to a graven image of Steve Jobs), is bunk.

The two most hated things to come out of Redmond in the last 20 years were the
'07 rejig of the Office UI (so you had memorized a bunch of keystrokes that
made you super productive? Well, F U buddy, we are going with this weird
ribbon thingie for no discernible reason. I fixed that by recording a bunch of
macros that did what Office '03 did) and the Win 8 UI. I mean, which diseased
mind thought that would be a good idea? And as if that was not bad enough,
they forced it on server users.

~~~
gmmeyer
Craigslist has an amazing interface. It's so easy to use. It's ugly, but who
cares! I don't care if it's pretty. It does exactly what I want. There was a
great article on this a few years ago that I think is perfect:
[https://m.signalvnoise.com/why-the-drudge-report-is-one-of-the-best-designed-sites-on-the-web-c34f764c3c4c](https://m.signalvnoise.com/why-the-drudge-report-is-one-of-the-best-designed-sites-on-the-web-c34f764c3c4c)

~~~
wtetzner
And honestly, to make it not be ugly would really only require some tasteful
CSS, not a redesign or rewrite in some JavaScript framework.

------
fusiongyro
In the name of usability, we have practically neutered these computing
machines. I think optimizing for user experience is partly to blame for this.
You don't want users to have a confusing experience the first time they use
the application, and you want the application to look good and be inviting.
But that may be at cross purposes with making applications that allow open-
ended exploration. Exploration implies that the user is driving the process,
which suggests that the UI is not actively trying to constrain them to the
usage patterns it is designed for.

I think pretty much everything in computing is like this. By setting our
sights on enabling users rather than training operators, we have not only
limited what they are able to do, we have actually turned ourselves into
users, and we're basically unable to program without the help of frameworks,
libraries and Stack Overflow.

~~~
aeorgnoieang
> I think pretty much everything in computing is like this.

But then there's Dwarf Fortress, Blender, Vim, Emacs, ...

Accounting for Sturgeon's Law, things aren't _quite_ so bleak.

~~~
rglullis
Did we forget how Emacs used to mean "Eight Megabytes And Constantly
Swapping?"

Saying that as an Emacs user for almost 20 years now. Even today there are
things that can bring Emacs crawling to its knees. The moment you start adding
auto-complete to a big project, or if you have your files hosted in some NFS
server, you're SOL.

~~~
taeric
The autocomplete is somewhat easy to fix in principle: just offload the
autocompletes to a daemon, for example.

No, I haven't bothered with it. :( I did set up GLOBAL, which is quite nice.
Though many languages nowadays would benefit greatly from a presentation
compiler. Which seems silly, at the extreme. It isn't like projects have
gotten more complicated in scope. They have ballooned in implementation
complexity, though. :(
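The daemon idea can be sketched in a few lines: the input loop hands the expensive completion scan to a background worker and returns immediately, instead of blocking on every keystroke. Everything here is illustrative (a toy word list, not Emacs' actual completion API):

```python
from concurrent.futures import ThreadPoolExecutor

WORDS = ["defun", "defvar", "defmacro", "dolist", "dotimes"]  # toy index

def complete(prefix):
    # Stand-in for an expensive scan of a big project or an NFS-hosted tree.
    return [w for w in WORDS if w.startswith(prefix)]

executor = ThreadPoolExecutor(max_workers=1)  # the "daemon"

def on_keystroke(prefix):
    # Submits the scan and returns a future immediately;
    # the editor loop never blocks on the scan itself.
    return executor.submit(complete, prefix)

future = on_keystroke("def")
print(future.result())  # ['defun', 'defvar', 'defmacro']
```

This is roughly the arrangement language servers later standardized: completion work lives in a separate process, and results arrive asynchronously.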

------
michrassena
Last year, in a fit of nostalgia, I finally bought an Apple IIe at a garage
sale, a system I had wanted 30 years ago. Somehow in moving it back to my
house, I dislodged the floppy controller card, leaving the system to boot from
ROM. I turned it on, it beeped, and there was the prompt. I perceived it as
instantaneous. I know that from that point I could have written a BASIC
program and saved it to a cassette tape. The anecdote falls apart once you
actually start trying to boot ProDOS from disk and have to rely on something
besides solid state.

This experience got me thinking. What has changed in the meantime? Computers
have become communication devices and communications devices have become
computers (though even this ancient Apple came to me with a modem installed).
Developer time has outstripped hardware costs, and somehow an hour of
developer time is worth the same as wasting a million hours of the users' time
(1 million users x 1 hour or whatever formula you wish).

I think we all recognize this problem, and it's already too expensive to fix.
The hardware and software space are too federated, too balkanized, too
complicated to ever integrate a system into a cohesive whole in the way the
Apple II was.

~~~
ryandrake
> Developer time has outstripped hardware costs, and somehow an hour of
> developer time is worth the same as wasting a million hours of the users'
> time

Key point here. It's hard to get the biz guys and product managers to agree
with spending developer hours on performance improvements, because it's hard
to measure the effect it has on business metrics. Does making your application
take 750ms to launch rather than 8 seconds really bring in more sales? Who
knows? Nobody tries to measure it either, so we'll never fix the problem.

Feature cram, on the other hand, is easier to justify, which is why every
application eventually ends up with a plug-in interface, theme-able, skin-
able, able to read E-mail, and able to interact with Facebook and Twitter.

~~~
michrassena
And I'm not even talking specifically about application performance, though
what you're saying is still absolutely spot on. Think of all the time wasted
because of poor or incomplete documentation. And I sympathize with anyone
writing documentation. We're definitely still in the era of "move fast and
break things."

These costs which accrue to everyone because no one can afford to take it on
themselves, I would call externalities in the same sense we do when talking
about pollution.

------
zaarn
The core point of the entire thread can be summed up by one of the tweets: "I
believe well designed keyboard interfaces and well designed GUI interfaces
have exactly the same learning curve."

I agree with this. A properly and well designed keyboard interface is faster
than any mouse. On the other hand, a properly designed mouse interface can be
fast too. Each should be applied where it makes sense.

I can also relate to the GMaps example; people find it rather ridiculous that
I prefer to use pen and map to plan routes, but GMaps simply does not cover
the complex demands of holiday routes with family.

~~~
jgh
IMO Google Maps became completely awful when they made that new version of it,
and they've been doing everything in their power to keep making it worse.
Sure, the scrolling is a lot smoother, but it's missing a ton of features the
old version had. Plus the problems mentioned in the tweetstorm: everything
disappearing as soon as you change one item. PLUS that stupid sidebar that
takes up half the screen now.

Edit: Here's another good one I just ran into -- I can't sign out of just one
Gmail account. I have to sign out of _all_ my Gmail accounts, and sign back
into the ones I didn't intend to sign out of.

~~~
pjc50
I suspect most of gmaps' problems are there because it's an advertising
company. They don't want you to plan a route, they want you to click on local
search ads:
[https://support.google.com/adwords/answer/3246303?hl=en](https://support.google.com/adwords/answer/3246303?hl=en)

Note that if they clear your pins so you have to re-search for something,
that's potentially another CPC payment to them...

You can sign into _multiple Gmail accounts_? How on earth does that work, does
it involve the menacing prospect of "linking" them?

~~~
jgh
I'm not really sure how it works under the hood, not being terribly familiar
with how these login systems work, but you can basically (in your browser)
click "add an account" and it will open a new tab and you can sign into the
account. So right now I have 3 tabs open for gmail with 3 different accounts.

------
glial
Given the speed of modern computers, I'm fascinated by the fact that so many
processes complete in human-scale time (seconds to minutes), rather than
milliseconds or years-plus, depending on problem complexity. If times for
compute tasks have some power-law distribution (a big if...), I'd expect a
very small part of that distribution to overlap with the 'seconds' range.

Then again, computers do _so many things_ in the millisecond range and faster
that maybe what we observe IS only a small fraction of the total.

~~~
Cthulhu_
It's a relief to do a bit of Go from time to time: render a complete HTML
template in 40 ms (with almost 2 MB of JSON data) instead of spending forever
figuring out the JS framework du jour to do the same thing, only with it
feeling a lot more involved, heavyweight, and slow, even if said framework
claims to be faster than all the other JS frameworks.

I think I want to do back-end or terminal-based interfaces again. Native
interfaces. Mmmmm

~~~
wesleytodd
This has nothing to do with Go and a lot to do with choosing the right tool
for the job. Maybe Go is the best tool for you in this job, but a given js
framework can serve someone else just as well if it is in their expertise
area. Don't confuse your expertise with language/ecosystem pros and cons.

~~~
theon144
This has nothing to do with choosing the right tool for the job, but with
performance, which is what this discussion is about. Practically all JS
frameworks, having multiple levels of (near-inscrutable) abstractions,
naturally degrade performance, and they certainly contribute to the trend the
submitted Twitter thread is talking about.

------
gilbetron
No? Getting on a BBS in 1983 at 300 baud means text files "downloaded" at the
same speed you can read. Word processors were crazy painful to use, and
incredibly slow. A text editor was just a line-by-line replacement. My TRS-80
would slow to a crawl even making the most basic (ha!) of programs.

Over the years, I've often thought about why computers never seem to get
faster - mostly it is because people have a tolerance for response speeds, and
that is unchanging. So software sits somewhere inside that tolerance range,
because why be ultra fast when most people don't really care that much?

~~~
antod
Yeah. For me in 1983 loading something meant 5 mins waiting for the cassette
to play through. Switching to a different program meant repeating the process
again.

~~~
gilbetron
One of the computers I had was cassette tape based, and the tape "reader"
broke. I was rather shocked to find I could use my parents' cassette player
and connect its output to my computer, and it would (generally) read tapes
just fine. As a bonus, I figured out how to keep the tape player's speakers on
at the same time, so I could listen to my programs being read in :D

------
owenversteeg
It's a tragedy of the commons. It's very rare for one piece of software to be
solely responsible for slowness, but each individual piece contributes just
ever so slightly.

I happen to know quite a bit about front-end technologies, so I'll speak to
those. Bootstrap is around 100 kilobytes. One hundred thousand bytes - after
gzip and minification. By itself, is it slow? Not very. Bootstrap fans will
point you to endless benchmarks and tests that show how little impact it has
on performance. Same with React: one hundred and forty-five thousand bytes,
after gzip and minification [0]. According to React fans, React is blazing
fast!

But development happens, so you throw in a few hip libraries and frameworks
and suddenly Slack takes one billion bytes of RAM. Whoops. "But it's not
React's fault!" Sure it's not. If React is the only bloated thing on your
site, it will work great. But chances are that if you've got React, you've
also got Bootstrap, and some visualization things, and some code from
Stackoverflow that iterates over your DOM in O(n!^n!). And all of that is how
things get slow.
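The O(n!^n!) is a joke, but the mechanism is real: each layer does its own "cheap" full pass over the same data, and two innocent passes nested together turn linear work into quadratic. A framework-free sketch of that trap:

```python
def common_items_quadratic(a, b):
    # Looks innocent, but each `x in b` scans the whole list:
    # O(len(a) * len(b)) comparisons in total.
    return [x for x in a if x in b]

def common_items_linear(a, b):
    # One pass to build a set, one pass to filter: O(len(a) + len(b)).
    b_set = set(b)
    return [x for x in a if x in b_set]

a = list(range(1000))
b = list(range(500, 1500))
assert common_items_quadratic(a, b) == common_items_linear(a, b)
print(len(common_items_linear(a, b)))  # 500
```

Neither function is "slow" in isolation on small inputs, which is exactly how this kind of regression ships.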

Now, I might be a bit biased, because I spent years of my life working on a
CSS framework 100x smaller than Bootstrap, but I think that if everyone spent
time optimizing things to be 100x smaller and faster, we could get back to
snappy UIs. Yes, it would be hard, and yes, it would require compromises, but
the result just _feels_ good. There's something about a webpage loading in
250ms, or a button reacting as soon as you tap it, that just feels nice. Maybe
it means not using React; maybe it means you don't use as many nice-to-have
frameworks, but I think it's an achievable goal.

[0] React fans will point out that if you don't need to interact with the DOM,
this gets smaller. Yes, this is true, but obviously for most webpages you kind
of need DOM interactions.

~~~
zinckiwi
I've written many sites with React, Bootstrap, and a dozen or three other
third party dependencies, but I've never run into a case of a button not
responding when clicked or things taking a noticeably long time to load
(assuming that the server processing or sheer size of the payload over the
network isn't the cause).

I think a lot of it boils down to "don't do work you don't have to." But
there's nothing about a framework that causes that -- or saves you from it.

~~~
owenversteeg
Great work on the sites you've made! Unfortunately, most of the time when I
see a simple webpage not responding or lagging heavily, it's made with
React/Bootstrap.

It's particularly tragic when the page is only text and images.

------
fallingfrog
This! I once took over programming a point of sale system that our (internal
to the company) users were complaining about. Turns out there was no concept
of tab sequence, you had to use the mouse for everything. I asked the previous
programmer to show me how to use the form he created, and he started typing a
phone number using the _top row_ of the keyboard instead of the numpad. But
hey, it had lots of javascript!

~~~
mwcampbell
I don't get your point about typing phone numbers. When I was being taught to
touch-type in first and second grade, my teachers told me that I should type
on the top row, not the numeric keypad. Not sure why. Maybe because it takes
less time to move to the top row and back? Anyway, to me, typing numbers on
the top row is not an obvious problem.

~~~
fallingfrog
I mentioned that because I thought it showed how he'd never really watched one
of the users in their job- once you get used to the numpad it's a _lot_
faster. You can type in a long number very quickly without taking your eyes
off the screen.

------
sunseb
The web feels slower than it was 10 years ago too.

It's fascinating how we tend to over-engineer and bloat things.

~~~
Cthulhu_
The web is hundreds of times more feature-rich than it was 10 years ago,
though. But sites like HN are among the few still built like they were 10
years ago. The rest all add rich JS frameworks (for supposedly faster
interaction after the first load), a few billion social links, ads, cookie
banners, "sign up for our newsletter" or "pay our subscription pls" popups
when you first land on them, rich fonts, animations, images, etc.

I mean, it's a lot nicer if you look at it from a distance after it's loaded,
but it's not as snappy as it used to be.

~~~
sunseb
I agree modern websites look a lot nicer... Or do they?

They all look the same: big images, not much text, the top bar, yeah, this
newsletter thing to upsell with those crappy (but effective) marketing
techniques...

Somehow I feel old-school websites (HN, Reddit, and so on) are more sticky,
more addictive, more unique. They focus on what we really want: content and
communication with others.

~~~
sunseb
Another thing I realized lately is nonsense: object-oriented programming.

It's often over-engineered bloat that only works for trivial Programming 101
courses (yeah, the famous Employee or Bike class). But in 10+ years of
programming, I have found that OOP is just a complete mess (you end up with
awful classes in your code like Service, Manager, AbstractFactory, and so on).

I wish we could just use variables and functions, that's all we need really.
:)

~~~
ams6110
> I wish we could just use variables and functions, that's all we need really.

You still can. You need decent data structures too, but I never in over 20
years really "got" OOP. It seemed needlessly complicated. Functions and data
always seemed to do the job for me.

~~~
digi_owl
OOP as most people get to know it seems to be a bleak application of the
original concept.

And the original was not aimed at dealing with the internals of a single
program sitting on a single CPU, but at grand simulations running on massive
clusters, where each "object" could very well be a process of its own, running
on its own dedicated hardware.

Effectively OOP became another one of those buzzwords on the bingo board...

------
finchisko
I think what this guy is actually experiencing is a strong feeling of
nostalgia for older times. And I understand that; I see it in myself more and
more as the years go by. We were simply born in a certain era, which holds
strong memories for us. We hold sentiment for old things (programs) because
somehow they're connected with the best memories from a time when we were much
younger. And the best memories are usually bound to the ages of around 20-30.
Ask a senior to tell you a story from his time, and there is a high
probability that the story will be from his younger years rather than his
older ones.

My point is that maybe today's software is not all that bad. We just don't
feel the same needs as the 20-year-old guy somewhere at Google who programmed
it. And it works the opposite way too: show a 20-year-old guy VisiCalc and see
what he thinks of it. Or check teens' reactions to Windows 95.

[https://www.youtube.com/watch?v=8ucCxtgN6sc](https://www.youtube.com/watch?v=8ucCxtgN6sc)

~~~
tomc1985
But maybe it is. On nearly every metric that matters to power users software
is worse.

Control? We have less.

Options? Less.

Ownership? Not that we ever had it, but now you don't even get a disc.
Licenses are even more restrictive.

The pace of 'upgrades'? Way, and unnecessarily, faster.

And when the business dies now? Now you're fucked.

There are stories of mechanics running their shops today on 40-year-old
dinosaurs, where their biggest logistical issue is getting old parts for
repairs when something breaks. It was sad that the magazine article I read
that story in laughed at the "backwards" proprietor, when in fact he is a hero
for standing up to the current trend of users-as-serfs cloud-everything.

~~~
s73ver_
"Options? Less."

How so, given that more people are able to write software, and it's easier to
create something than ever before?

"And when the business dies now? Now you're fucked."

How is that any different than before?

~~~
tomc1985
You've answered your own question: more people are learning to program, from
fewer and more professionally run sources, which preach a gospel that prizes
engagement, conversions, and ease of use over empowerment, functionality, or
choice. Plus, in this larger and more mainstream community, 'coders' pulled in
off the street with little technical knowledge (let alone passion), who never
grokked complex apps anyway, produce what they know. The end result is a sea
of gimped, appliance-like 'apps' that do little more than funnel users towards
business objectives.

And when the business died, their software still ran. Hard-won expertise kept
it running. No stupid CEO or developer could stop you, and neither could they
pull the rug out from under your feet with a sunset or some other shitty move.
You bought your software and that, mercifully, could be the end of your
relationship with that publisher.

~~~
s73ver_
This doesn't really address my point. You still haven't explained anything
regarding choice, or any of your other points.

------
aeorgnoieang
What's _really_ annoying and frustrating to me is that, even with a browser
extension/plugin like Vimium, websites and webapps seem to almost go out of
their way to break my ability to use my keyboard.

A lot of the changes that break usability for me are purely cosmetic changes
that, e.g. replace a `button` with an `img` or `svg`.

------
squarefoot
The corporate culture of releasing a product on time rather than having it
perform fast has ruined the IT world and is infecting the FOSS ecosystem too.
Just throw away all VM-constrained or interpreted languages (or use them only
for prototyping) and the problem is almost solved. Then make everything work
offline unless it is absolutely necessary to do the opposite (backups,
updates, etc.). Rediscovering the lost art of optimization would also help a
lot.

~~~
gmueckl
This does not work. The economic incentives are biased towards fast releases
of mediocre software. Whoever delivers faster and cheaper tends to win.

Quality is detrimental to the business. Good software does not require
support. Paid support earns money.

An initial release that is stuffed with features takes longer to develop (the
competition gets the customers) and has fewer incentives for later upgrades
(less income after initial release).

Software cannot really improve unless quality standards become mandatory. The
liability disclaimer should go. Engineers don't get a free pass if their
products don't work. Why should software be treated differently?

------
hawski
Funny to have a rant on UX on a medium not suited to long-form posts.

It is sadly very similar in the desktop realm. You can’t copy text from most
parts of the interface. There can be an error message, and you have to retype
it to find out what it is all about. The computer already has this written
down; it’s just silly. That is the one thing that is usually better about
webapps. Of course there are silly designers who want to expose their users to
the same limitation, by blocking copy or the context menu. But usually they
don’t, because it’s more work.

That’s why the CLI will likely never die. By default you can manipulate the
data to your heart’s content. It may be slow and sometimes half-assed, but it
will still be faster. You don’t have to retype anything. It is, a bit sadly,
the closest we have to a data-oriented system, where one can manipulate all
available information without massive hurdles. Only with some hurdles.
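That heart's-content manipulation comes from composition over plain text streams. A toy equivalent of `grep error access.log | sort | uniq -c`, sketched as ordinary functions (the log lines and names are illustrative):

```python
from collections import Counter

def grep(lines, needle):
    # Like grep: pass through only the lines containing the needle.
    return (l for l in lines if needle in l)

def uniq_c(lines):
    # Like `sort | uniq -c`: count occurrences, emit "count line" sorted.
    counts = Counter(lines)
    return [f"{n} {line}" for line, n in sorted(counts.items())]

log = ["error: disk full", "ok", "error: disk full", "error: timeout"]
print(uniq_c(grep(log, "error")))
# ['2 error: disk full', '1 error: timeout']
```

Every stage speaks the same interface (a stream of lines), which is why arbitrary tools compose without anyone having designed them to work together.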

What else: Oberon, Plan 9 interfaces (Acme!), PowerShell, AppleScript (? -
when I was using OS X at work a few years ago I couldn’t find a reference
manual for the language), and certainly more throughout history.

We can’t do anything useful in 2D, and yet we are working on VR/AR interfaces
where undoubtedly we will make the same mistakes (and more!).

~~~
npsimons
> By default you can manipulate the data to your heart’s content.

[http://blog.vivekhaldar.com/post/3996068979/the-levels-of-emacs-proficiency](http://blog.vivekhaldar.com/post/3996068979/the-levels-of-emacs-proficiency)

------
dfox
Recently I discovered how well sapGUI works from the UX perspective. It is
essentially a glorified graphical 3270 terminal emulator which runs
applications that very often rely on the user entering cryptic but
human-readable IDs (there is lookup functionality integrated into essentially
all such fields, but frequent users typically do not use it that much).

Another interesting fact is that the 3270/sapGUI/dynpro model is an almost
perfect match for how HTML forms work (with the inline lookups and such being
the small part that would require some trivial JS/AJAX). I don't understand
why today's web apps universally try to reinvent the wheel by being "single
page", emulating desktop WIMP interfaces and such, when simple HTML would work
perfectly fine.

Edit to add another point: for our anime conventions I wrote our own online
ticketing system. It has a web interface for customers and general
administration, but the on-site box office uses an ncurses interface which
intentionally has a very ISPF-ish feel to it. The general consensus of the
staff seems to be that it is significantly more efficient and intuitive than
the web-based systems used by other local conventions.

~~~
zokier
3270 is a really fascinating evolutionary dead end. I don't have first-hand
knowledge (wish I had!), but as far as I can tell it is a bit like high-level
ncurses/dialog(1)/forms built directly into the terminal layer. Of course,
being very proprietary, very enterprisey tech, there isn't that much
information about how it worked in practice, but it certainly feels like a
very different path for UIs.

~~~
dfox
3270 allows the user to edit screen contents without interaction with the
host computer, and allows the host to mark screen regions as user-editable or
not. Interaction with the host mostly consists of sending (text) framebuffer
contents back and forth.

SAP's dynpro is an abstraction layer on top of that which presents a somewhat
higher-level view (forms and fields, not framebuffers and characters) and also
allows 3270's behavior to be emulated on normal character-oriented terminals
(obviously by ncurses-like software running on the host). This abstraction is
sufficiently high-level that sapGUI can look like a somewhat modern desktop
application instead of an obvious 3270 emulator. (The Windows version of
sapGUI allows you to render parts of the UI as embedded HTML or ActiveX
components, but that capability does not seem to be used much outside of SAP's
development tools.)

From a programmer's point of view it is not that different from a web 1.0
application built on a semi-modern form-handling library (e.g. WTForms, or
whatever your all-in-one webapp framework provides). Probably the only
difference is that for web frameworks you will have
if(form_submitted_and_valid()) somewhere, while in SAP the body of that
conditional and the rest of the controller implementation are always two
distinct procedures.
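That web-1.0 shape can be sketched with no framework at all; every name here is hypothetical, with the if(form_submitted_and_valid()) branch marked in a comment:

```python
def render(form_data, errors):
    # Stand-in for templating the whole screen back to the client.
    return f"<form>{form_data} {errors}</form>"

def validate(form_data):
    errors = {}
    if not form_data.get("customer_id"):
        errors["customer_id"] = "required"
    return errors

def controller(request):
    if request["method"] == "POST":
        errors = validate(request["form"])
        if not errors:  # the form_submitted_and_valid() branch
            return "redirect:/success"
        return render(request["form"], errors)  # redisplay with errors
    return render({}, {})  # initial GET: blank form

print(controller({"method": "POST", "form": {"customer_id": "C123"}}))
# redirect:/success
```

In the dynpro model the redisplay half and the success half would be two separate procedures rather than one conditional, but the round-trip is the same shape.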

------
peterlk
This speaks in favor of functional physical design. I'm getting tired of the
screenification of everything. I don't want a touchscreen in my car. I want a
bunch of knobs that I can reach for, feel, and use without looking at them. I
cannot do that with a touchscreen.

~~~
the_cat_kittles
i strongly agree. a physical thing is just more expressive, and when designed
right, that can be leveraged. cars are the example ne plus ultra of how
touchscreens arent always a great idea

------
bajsejohannes
> And it's worth noting that HYPERTEXT, specifically, is best with a mouse in
> a lot of cases. Wikipedia would suck on keyboard.

A decade ago I could navigate web pages quicker than I can now. I was using
Opera, which had keys for navigating from hyperlink to hyperlink (spatially,
using shift+arrow keys). It also had keys for navigating to the logical
previous and next page--that is, not navigating history, but following "link
rel=prev" and "link rel=next" links. Unfortunately, the web evolved to make
this harder: everything is a hyperlink, and no one uses link rel tags anymore.

~~~
magnet_ball
It's possible that the vimium extension for chrome will give you what you
want. The 'f' shortcut gives something similar to what you want.

------
romaniv
The thing that _horrifies_ me about all of this is that instead of criticizing
or fixing issues, most people adapt what they do, and even how they think, to
match all the deficiencies in software. Often users simply can't imagine
systems that work fundamentally differently and better. Often, when
deficiencies are pointed out, people start _defending_ the mess and pointing
out all the clever (half-assed, really) workarounds and hacks they came up
with. They are proud of all the time they spent (wasted) learning about
obscure and counter-intuitive software functionality. They are proud of the
barely-working plugins/extensions/add-ons they found and set up. All of that
to do things that should be trivial to begin with.

One counter-point, though.

 _> I make no secret of hating the mouse._

If you look at the original uses of the mouse it was great. Especially in
systems like Xerox Star. Star allowed people to perform complex tasks with
almost no learning curve.

[https://www.youtube.com/watch?v=Cn4vC80Pv6Q](https://www.youtube.com/watch?v=Cn4vC80Pv6Q)

(Note how they weren't shy of using keyboard either. There are dedicated
hardware buttons for standard commands like copy, find, repeat, "properties"
and even for common text editing actions. Meanwhile, our keyboards don't have
dedicated keys for undo, redo, cut, copy and paste - operations that are used
in almost every application today.)

Trouble is, we lost most of the driving ideas behind Xerox-style interfaces:
using a predefined set of generic, powerful commands, object-oriented UI,
uniformity of representations. Modern systems have those things only as
vestigial traits and in very limited contexts.

I don't think there have been any quantum leaps in conceptual UI design since
Xerox PARC days. There have been some minor improvements in very specialized
apps and significant regressions in software that's used universally. For
example, phones and tablets have almost completely lost drag-and-drop
functionality and generic file UI.

For example, drag-and-drop is a very powerful concept, because it allows you
to perform actions by combining things you already know about - and those
"things" figure out how to interact in the best way possible. So, for example,
instead of having N "Print" buttons in N applications you can have a single
drag-file-onto-printer-icon action that does different things based on the
type of the file. [BTW, this is also the key idea behind the original notion
of OOP.] Unfortunately, that's not how it works in modern UIs. They don't use
_either_ the keyboard or the mouse to their full potential.
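The "things figure out how to interact" idea is essentially polymorphic dispatch. A toy sketch (all class and method names invented for illustration): the drop target stays generic, and the dropped object decides how it renders, which is the message-passing notion the comment attributes to the original idea of OOP.

```python
class TextFile:
    def __init__(self, body):
        self.body = body
    def render_for_print(self):
        return self.body

class ImageFile:
    def __init__(self, pixels):
        self.pixels = pixels
    def render_for_print(self):
        return f"<{len(self.pixels)} px rasterised>"

class PrinterIcon:
    """One generic drop target; behaviour comes from the dropped object."""
    def accept_drop(self, obj):
        return f"printing: {obj.render_for_print()}"

printer = PrinterIcon()
print(printer.accept_drop(TextFile("hello")))     # printing: hello
print(printer.accept_drop(ImageFile([0, 1, 2])))  # printing: <3 px rasterised>
```

One `accept_drop` replaces N per-application "Print" buttons; adding a new file type never touches the printer.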

~~~
snerbles
People get used to doing things a certain way. When they have to overcome a
terrible interface, an attachment forms. It's no different than the pride of
those that live in physically harsh environments.

Habituation is a powerful thing.

------
mattlondon
I think a lot of the perceived slowness is now the fact that so much more is
reliant on the network now - and more often than not requires several hops
over the public internet to some data centre somewhere.

As a reminder - network is really really slow compared to pretty much anything
else:
[https://people.eecs.berkeley.edu/~rcs/research/interactive_l...](https://people.eecs.berkeley.edu/~rcs/research/interactive_latency.html)

I'd not be surprised if the author's library search terminal from the 80s had
a local copy of the index, or at least the index was stored on a server on the
same lightly-loaded LAN as the terminal.

My home laptop is hugely, hugely faster than ones I've had a decade ago for
local workloads, but my network connection's latency and bandwidth have not
improved nearly as fast.
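To put rough numbers on that (figures rounded from the kind of chart linked above, so treat them as order-of-magnitude only): a single cross-country round trip is worth hundreds of millions of L1 cache hits, which is why one extra network hop dominates everything a fast local machine does.

```python
# Rough, rounded latency figures in nanoseconds (order-of-magnitude only).
L1_CACHE_NS = 0.5
SSD_READ_NS = 150_000                 # ~150 microseconds
COAST_TO_COAST_RTT_NS = 150_000_000   # ~150 milliseconds round trip

print(int(COAST_TO_COAST_RTT_NS / L1_CACHE_NS))  # 300000000 L1 hits per round trip
print(int(COAST_TO_COAST_RTT_NS / SSD_READ_NS))  # 1000 SSD reads per round trip
```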

~~~
dx034
Network latency can be one reason, but I found the lack of optimisation to be a
bigger factor. As long as a programme somehow works, many developers won't try
to optimise it. That gives us Electron apps which are easily 10x slower than a
comparable native app (pgAdmin is a prime example).

------
j_s
I'm still thankful for [https://tinyapps.org](https://tinyapps.org) even well
after it has gone nearly read-only!

 _To qualify for TinyApps, a program must:

1\. Not exceed 1.44mb

2\. Not be adware

3\. Not require the VB/MFC/.NET runtimes. Also, preference is given to apps
which are 100% self-contained, requiring no installation, registry changes,
etc.

4\. Preferably be free, and ideally offer source code. Shareware will only be
listed if there is no freeware alternative._

------
dvfjsdhgfv
Fortunately we still have a choice. That's also one of the main reasons I
always use the command line when it makes sense. In most cases you get instant
or near-instant results. Our lives are too precious to waste them sitting at
the computer and waiting for the results - only because some people decided we
should do it this way, and other people followed, and still other people had
to implement it, dealing with mediocre infrastructure and additional
complexity, usually with mixed results.

Nobody will steal this freedom of choice from me.

------
jancsika
> i posit that nobody wants autocomplete-style live DB lookups. They don't fit
> the mold that autocomplete fits in.

1\. I live-code an example in about:blank with devTools window open.

2\. I roll my own little jQuery-like closure for the sake of convenience.

3\. It works.

4\. I close the browser and forget about the thing.

5\. Later, I decide I wanna test out a feature of the Web Animation API, so I
open my browser to about:blank again.

6\. I remember that I used a little closure last time-- as I type the var name
Chromium pops up an "autocomplete-style live DB lookup" menu. At the bottom is
My Little Closure from last time!

7\. I move the mouse down to the relevant line in the menu. Or, I use keyboard
arrows to navigate to it. Or I keep typing and narrow down the menu options.

Useful? Check. Discoverable? Check. Obtrusive? Negative -- in the case that I
don't even want a previously typed expression, that menu option is put at the
bottom, out of the way of the more common JS internals and DOM methods.

Default settings in a terminal (or terminal-based GUI) generally require the
user to either type something to get into a history mode, or type tab to do
completion. But both of those options are less discoverable-- they require me
to know ahead of time that I want to retrieve something. With devTools it
shows me what can be retrieved so that I know what's available _even if_ I'm a
neophyte.
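The behaviour described above can be modelled as a ranked prefix match: built-ins first, session history demoted to the bottom so it is discoverable without being obtrusive. A toy sketch, with all names invented:

```python
def suggest(prefix, builtins, history):
    """Return completions for `prefix`: built-ins first, history entries last."""
    hits = [s for s in builtins if s.startswith(prefix)]
    hits += [s for s in history if s.startswith(prefix) and s not in hits]
    return hits

builtins = ["document", "decodeURI", "dispatchEvent"]
history = ["d = (sel) => document.querySelector(sel)"]  # "My Little Closure"

print(suggest("d", builtins, history))
# ['document', 'decodeURI', 'dispatchEvent', 'd = (sel) => document.querySelector(sel)']
print(suggest("doc", builtins, history))
# ['document']
```

The key property is that the menu appears unprompted as you type, so past work is surfaced even to a user who didn't know there was anything to retrieve.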

------
thibran
It's 2017 and I wait for my computer. I wait on Linux, I wait on Windows. This
sucks hard. It is still complicated to search for stuff or to morph data from
one representation into another. The web is even worse. And let's not talk
about sharing data or other computer resources with others.

When looking back at what people invented in the 80s till today, I hardly see
much progress.

I don't care much about how much RAM an application takes; I care about speed.
The next big OS should have a golden rule: never make the user wait. That's
the ultimate goal, which all OSes fail to address today.

What I want in the end is usually some text laid out in a visually pleasant
way, some images, some videos. I have thought multiple times about writing a
program which would fetch all data from all the websites I'm interested in,
messengers, email accounts and so on and store it somewhere – so that I could
later view it (with whatever program) without any fucking delay.

------
inetknght
This puts to more eloquent words the very same thing that I've been ranting
about to peers and coworkers for years. Cheers for that; I'm going to share
this with them.

------
kabdib
The day I sold my Atari ST, I pulled it out of the stack of old computers,
connected it to a monitor, hooked up a disk drive and turned it on.

Poof: Desktop with a working mouse cursor in under one second.

I booted a nearby PC, and then hit reset until it booted. In the time that the
"hot shit" 386 box took to get to a DOS prompt, the ST booted about fifty
times. To something with graphics.

Modern system boot times depress me. We have database servers that take 15
minutes to POST. Doing updates or diagnosing hardware-level issues with these
things is g l a c i a l l y slow. I work with message queueing systems that
are not shy about several seconds of latency -- what are these stupid things
doing? The datacenter they're running in is less than a hundred microseconds
wide, and everything is high-end Xeon CPUs with SSDs and more cache than the
first 30 computers I used, put together.

Sigh.

~~~
bitwize
Everything is faster when you build it into ROM. The cost is flexibility.
Should you wish to run another OS on that ST, you would have to bootstrap it
through TOS.

~~~
kayamon
On a modern system, you could use flash RAM to get the same speed with all the
flexibility advantages.

~~~
bitwize
Indeed and unlike the 90s, SSDs are cheap now. :)

------
ams6110
> i posit that nobody wants autocomplete-style live DB lookups

Amen. I abhor this. I always get tripped up on these "smart" autocomplete
entries.

------
yalogin
Awesome. Looks like a lot of people are as frustrated with this as I am. More
importantly, there is nothing wrong with me :)

------
JustSomeNobody
Perception of speed means a lot. I once was evaluating reporting apps back
when people did reporting on the desktop. One app would render all the pages
and then start displaying them. Another app would display each page as it
rendered them. Which was faster? Overall, the first. Which _felt_ faster? The
second.
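The perception gap is easy to put numbers on (figures invented for illustration): if each of 10 pages takes 100 ms to render, batch display shows nothing for a full second, while progressive display puts the first page up in 100 ms even when total time is the same or slightly worse.

```python
PAGES, RENDER_MS = 10, 100

# Batch: render everything, then display -- nothing visible until the end.
batch_first_visible = PAGES * RENDER_MS       # 1000 ms until anything appears

# Progressive: display each page as it finishes rendering.
progressive_first_visible = RENDER_MS         # 100 ms to the first page

print(batch_first_visible, progressive_first_visible)  # 1000 100
```

Time-to-first-content, not total time, is what users experience as "fast".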

------
JustSomeNobody
> one of the things that makes me steaming mad is how the entire field of web
> apps ignores 100% of learned lessons from desktop apps

Burns me up too, but I think this is by design. Meaning, web app developers
generally don't want to listen, even when they're bringing that to the desktop
in the form of "native" apps.

~~~
digi_owl
Feels like this is the cycle of computing.

First came the big irons.

Then came the desktop.

Then came mobile.

then came the web.

And none of the later ones seems particularly interested in learning from
those that came before.

Maybe it is willful ignorance, maybe it is youthful hubris...

------
chaoticmass
The computer store I used to work at switched from a dedicated cash register
to a PC-based QuickBooks POS system, and it was a POS (piece of sh __).

Windows-based GUI app. You couldn't ring up anything without looking at the
screen, constantly taking your hand off the keyboard to move to the right
input box.

------
dx034
The argument about function keys resonates with me. I still use F5 (refresh)
and F2 (rename) but can't remember when I last used any of the other keys.
Instead of using F6 for a function in a programme I now have to use something
like CTRL+Shift+K. Why? Why not utilise the keyboard properly?

------
kuon
About the Google Maps thing: does anybody know an alternative where I can
measure things (like the length of a named road or the width of a building)
and count things (for example, I draw a polygon and I want to know the number
of houses within it)?

~~~
vinw
OsmAnd is pretty good for measurements and stuff.

[http://osmand.net/](http://osmand.net/)

------
gergles
The point of sale system shown in the tweetstorm is this one, and it comes
highly recommended if you're ever in a situation where you need to sell stuff.

[http://keyhut.com/pos.htm](http://keyhut.com/pos.htm)

------
peatmoss
I wonder if there is a business model for a company to recreate the 1995
native desktop UX for popular web services (scraping, API). Relatedly, how
much various web SaaS tools can be... de-webbed without significant loss of
functionality?

------
carapace
If this resonates with you check out "The Humane Interface" by Jef Raskin.

[https://en.wikipedia.org/wiki/The_Humane_Interface](https://en.wikipedia.org/wiki/The_Humane_Interface)

------
LordHumungous
Gmaps is so far ahead of anything available in 1983 that a comparison is
almost silly.

~~~
falcolas
Except for actual, physical maps, with indices for finding locations, and all
kinds of markings to denote things of interest (such as restaurants, museums,
etc.)

~~~
lolsal
Physical maps don't necessarily have:

\- satellite imagery

\- reviews for destinations

\- automatic re-routing

\- traffic congestion

\- toll/public-transit information

\- On a mobile/gps enabled device you get your location as well

\- etc.

Comparing physical maps to Google Maps is one-dimensional: you can see a
specific area (if you had the foresight to buy it before hand!) with roads and
mile markers.

~~~
hawski
Do people actually use satellite imagery other than to appreciate how cool it
is to look at one's house from above?

Reviews mostly average around 3 stars for most places, because there are two
types of reviews: 1 star - the waiting times were horrible, the meal was cold
and the waiter was rude; 5 stars - the best meal I ever had, everyone was
cheerful and helpful.

Automatic re-routing is good, especially with traffic information. However, it
is miserable where there are unknown closed roads. It will constantly reroute
you to the closed road, and you can't pick a road segment and remove it from
consideration. Then you are back to the experience of physical maps, only
slightly worse, because you don't know the road numbers - up to this point
they were irrelevant.

Traffic congestion is a big plus.

Public-transit is also a great thing.

~~~
lolsal
> Do people actually use satellite imagery other than to appreciate how cool
> is to look at one's house from above?

Definitely! I used it when trying to find a good spot to view the recent
eclipse - I wanted to find an open place that would not be obstructed by
buildings, trees or mountains. I've used it to understand better how water
drains through my community and across my property.

> Reviews are mostly around average 3 stars for most places.

That is exactly what I'd expect!

> Because there are two types of reviews

That seems to be hyperbole.

> Automatic re-routing is good. Especially with traffic information. However
> it is miserable where there are unknown closed roads.

That's definitely true. Paper maps would not have this information either. In
a worst-case scenario, you could use Google Maps like a paper map - disable
routing and try to plan a route manually!

------
_Codemonkeyism
My CPC took 30min to load a game from datasette.

------
bajsejohannes
One way to improve this is to do like Gmail and provide keyboard shortcuts in
addition to the point-and-click interface. (It has to be explicitly enabled in
settings, but it didn't have to be that way.) As the author claims, it takes
very little time to get used to, and you get more productive quickly. The
only problem I've run into (also covered by the OP) is that input focus might
be different from what you think, and you end up quickly doing unintended
things to your email. Luckily there's an undo function for that.
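The focus pitfall is the classic gotcha with single-key shortcuts. A sketch of the usual guard -- invented names, not Gmail's actual code -- is to swallow shortcuts whenever a text field has focus:

```python
TEXT_INPUTS = {"input", "textarea"}

def handle_key(key, focused_tag, shortcuts):
    """Dispatch single-key shortcuts, but never while a text field has focus."""
    if focused_tag in TEXT_INPUTS:
        return None  # let the keystroke insert text instead of firing an action
    return shortcuts.get(key)

shortcuts = {"j": "next message", "e": "archive"}
print(handle_key("e", "body", shortcuts))      # archive
print(handle_key("e", "textarea", shortcuts))  # None (typing an email, not archiving one)
```

When the guard is missing or focus is somewhere unexpected, "e" archives a thread instead of typing a letter -- exactly the unintended-action problem described above.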

~~~
digi_owl
Ugh, now i am reminded of how Youtube hijacks the home button to mean "start
playing the video from the beginning". This makes it impossible to scroll back
to the top of the page quickly using a keyboard.

------
IronWolve
No doubt, Microsoft Office took tasks that were 1 button and made them 5+
clicks. And this feature creep expanded everywhere, with companies thinking
they are smarter than you, so they try to train "you" to work their sites.

Also why I don't like Apple computers: you use DOS/Windows/Linux, then get to
a Mac, and if you don't know the secret gestures or commands, it's just
painful.

It's like how GNOME 3 took over and had to redesign everything, then finally
came back to GNOME Classic.

Or Windows said TILES is the future, deleted your start button, then went "My
Bad" and gave you a start button again.

Directories are disappearing; now it's "search boxes" everywhere. Good luck if
you know the asset file and the file doesn't have the name "asset" in it. Or
whether the damn file's permissions even let you open it if you find it.

Don't even get me started on Android phones: I hit the dialer and I don't get
a keypad when I need it; I'm in a menu tree now and the dialer wants to
display the current contact info... grrr.

Yeah, KISS is no longer a thing; it's all whiz-bang "we can do it better than
you". NoSQL is awesome! Oh wait, back to SQL we go. Perl SUCKS! Python 2, er,
3 now.

Get off my yard.

------
thoae55tbuoe0
This rant has some good points, but completely misses other very important
points. The main one being that those "really fast interfaces" were only fast
or even usable if you spent months to years learning how to use them. They
were also horribly limiting. (For example, with no mouse and no way to change
where the input is focused, you can't re-sort the lists by a different field.)

I worked for a major cash register manufacturer writing DOS-based software for
point of sale (PoS) systems in the 90s. One thing our customers (big stores)
constantly reiterated was that training costs were very high and retention
very low, so they wanted simpler-to-use systems, but with no less
functionality.

The big wins tended to be combinations of hardware that made the process
easier (think grocery store scanners), and better UI on screen that was easier
to understand and use. We ended up making a graphical (but still DOS-based) UI
along with some specialized hardware (an LCD screen with function keys around
it so the command name was right next to the key, but could change per
screen). It's been used for the past 20 years by the likes of Walmart, Target,
and the USPS, so I guess it worked. (I haven't worked on it in 20 years, so
I'm sure it's changed from how it was when I was there.)

It may very well not be as fast to load any given screen or respond to any
given command. But it costs less to train people to use it, and it's easier to
understand. I believe that it also helped improve checkout speed, as well,
which was important to some of the bigger stores.

I feel the pain the OP has, but I don't think their rant is entirely
justified. There are good points (especially about maps!), but they're also
seeing the past through rose-colored glasses.

------
cpburns2009
I certainly agree. The one case where this isn't necessarily true is a fresh
boot. Granted, my reference point starts from the 90s, and that's assuming
you're booting from an HDD (not an SSD, flash, or restoring cached state from
quick/fast boot).

~~~
inetknght
I disagree with that. Even spinning HDDs are several orders of magnitude
faster than the main RAM of computers from 20 years ago.

~~~
davidgay
Not in latency, by a very long stretch (500ns 30 years ago).

~~~
inetknght
Not in latency but definitely in throughput. I remember being amazed 20 years
ago at 3MiB/sec throughput when copying some files around. Today, even a
spinning disc can sustain 160MiB/s and peak at 240MiB/s (personal anecdotal
evidence) before looking at RAID or hybrid SSD options.

Sure, latency is still pretty bad. But a well-designed application is going to
minimize individual seeks required, and therefore minimize the total aggregate
latency cost.

It's the same way across the internet; a well designed application is going to
minimize the latency cost across a socket by minimizing the amount of
synchronization (seeking) required to continue work. TCP connection, TLS
handshake, HTTP processing and routing, application processing to
disk/database, multiple queries/seeks to handle the request... it all adds up.
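The "it all adds up" point is just multiplication: with a fixed round-trip time, total latency scales with the number of synchronous calls, so batching N queries into one request wins by nearly a factor of N. A back-of-envelope sketch with invented figures:

```python
def total_latency_ms(round_trips, rtt_ms, server_ms_per_query, queries):
    """Wall-clock cost: one RTT per synchronous round trip, plus server work."""
    return round_trips * rtt_ms + queries * server_ms_per_query

RTT, WORK, N = 50, 2, 20  # 50 ms round trip, 2 ms per query, 20 queries

print(total_latency_ms(N, RTT, WORK, N))  # 1040 ms: one round trip per query
print(total_latency_ms(1, RTT, WORK, N))  # 90 ms: all queries batched
```

The server-side work is identical in both cases; the difference is purely how many times the client waits on the wire.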

------
juliesulti
Well, this is a cute bit of nostalgia, but it used to take 6.5 seconds to
redraw the terminal screen at my local library, which was connected to some
distant system at 2400bps.

------
SN76477
Software fatigue for sure

It is so bad on iOS with an iPhone that I am trying to figure out how to live
without one.

Doing anything productive on iOS takes 12 clicks, 6 button pushes and 9
minutes.

~~~
dep_b
As an app developer I have been guilty of this in the past as well. I really
try to make interfaces instantaneous nowadays so you can keep hammering away
while in the background things like fetching and saving happen.

Things like naive saving (always continue immediately, assuming the insert or
update really happened), command queues and aggressive preloading can make
applications really quick again, even though they're just an interface for an
API.
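A minimal sketch of that naive-saving idea (all names invented; a real version needs failure handling and undo): the UI-facing write returns immediately, and a command queue persists changes in the background.

```python
from collections import deque

class OptimisticStore:
    """Apply writes locally at once; persist them later from a command queue."""
    def __init__(self):
        self.local = {}         # what the UI reads -- always up to date
        self.pending = deque()  # commands awaiting the slow backend

    def save(self, key, value):
        self.local[key] = value            # the UI never waits on this
        self.pending.append((key, value))  # the real save is deferred

    def flush(self, backend):
        """Drain the queue, e.g. from a background thread or idle callback."""
        while self.pending:
            backend.write(*self.pending.popleft())

class FakeBackend:
    def __init__(self):
        self.rows = {}
    def write(self, key, value):
        self.rows[key] = value

store, backend = OptimisticStore(), FakeBackend()
store.save("draft", "hello")
print(store.local["draft"])   # hello -- visible immediately
store.flush(backend)
print(backend.rows["draft"])  # hello -- persisted later
```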

------
sharpshadow
"GOD FORBID i click on anything while its loading" hahaha that's how you
distinguish experienced from not experiences users

------
peterwwillis
But, you know, clearly AI is going to replace the job of people making maps in
the future. Because software just keeps getting better.

------
fapjacks
Oh, man. That Google Maps peeve absolutely speaks to my soul. Every single
time!

------
perlpimp
this
[https://www.youtube.com/watch?v=4YgLVFXpuq0](https://www.youtube.com/watch?v=4YgLVFXpuq0)
from chrome dev summit

------
0x006A
"everything" seems a bit much. There are so many things you just could not do
at all in 1983, like HD videos or 3D engines or even rendering Mandelbrot
fractals.

~~~
xamuel
The weirdest thing is when navigating to an HD video is slow, individual
folders take entire seconds to open up, then when you finally get to the video
file and open it, the video runs smoothly and perfectly. What on earth is the
file navigator doing??

------
snambi
Totally agree. Even though machines and networks are faster than ever, web
apps are getting slower every day. :(

------
dragontamer
Ehhh?

* I remember spending hours, sitting on my new 2GB hard drive, formatting it to FAT32. Or defragmenting every week... because that disk format was rather subpar compared to modern file systems.

* I remember the 10+ Floppy Disks that needed to be manually swapped in and out to install... something. I think Microsoft Word. Or maybe it was Windows. In any case, Floppies were slow as all heck, and when CDs came out everyone was amazed by their speed.

* I remember being able to "see" the flood fill algorithm in "Paint" applications line-by-line fill up regions. Computers didn't have the memory to fill up those regions in one pass back then and had to resort to slower algorithms.

* I remember waiting multiple minutes to download a 5MB file in the 90s. 56kbps downloads 5MB in 11 minutes, for example. Even small pictures were unreasonable to distribute on the internet, let alone audio or movies. Flash was popular because a lot of it was rendered vector art and could fit into the ~3MB or 4MB size needed to be practical.

* I remember the dial-up sound. It took multiple minutes to CONNECT to the internet, and you couldn't use your phone while you were on the internet. To check your email took at least 5 minutes before you saw your first email. ~2 minutes dialing in, a few more minutes downloading the emails, and then finally you could begin reading them
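The line-by-line fill mentioned above is the classic flood-fill algorithm; a compact modern version, sketched here with an explicit stack, fills the whole region in one pass now that memory is no constraint:

```python
def flood_fill(grid, x, y, new):
    """Iterative 4-way flood fill; an explicit stack avoids deep recursion."""
    old = grid[y][x]
    if old == new:
        return grid
    stack = [(x, y)]
    while stack:
        cx, cy = stack.pop()
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]) and grid[cy][cx] == old:
            grid[cy][cx] = new
            stack.extend([(cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)])
    return grid

grid = [[0, 0, 1],
        [0, 1, 1],
        [1, 1, 0]]
print(flood_fill(grid, 0, 0, 7))  # [[7, 7, 1], [7, 1, 1], [1, 1, 0]]
```

The old Paint programs made this visible precisely because they couldn't afford the stack (or a scratch copy of the region) and fell back to slower, pass-by-pass scanning.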

\------------------

Seriously, did the author here ever use 90s technology? I remember searching
through 3 CDs worth of information to write reports in the 90s.

Yes, CD-rom based encyclopedias. You'd juggle CD-roms constantly to do things
that are as simple as "Wikipedia" or "Google" today.

We are way, way wayyyyy faster than the 90s. I can imagine that one would be
jaded by the past, but... speed? Nah man, we're way faster today than back
then. It's not even a close contest.

\------------------

I mean seriously, when did people stop using Floppies? It was like, 2004 or
2005 if I remember correctly. And there was a good chunk of time in there
where we spent many minutes burning CD-ROMs (when our data was too big for
Floppies: MS Word documents easily over the 1.44 MB limit...) but Flash Drives
were way too expensive to be practically used.

I think some people used those "Jaz" and "Zip" drives as a "floppy
replacement", but really it was burning CDs / DVDs (which had a significant
delay to spin up and write). Modern flash drives are instant. Modern cloud
storage is instant.

I cringe at the thought of returning to the 90s. I have a 30MB Powerpoint
sitting on my desktop that I've been working on between my home and work. Do
you have any idea of how much time it would have taken to transfer that back
and forth in the 90s?

~~~
nottorp
Do you also remember that the Word that you installed from 10 floppies was a
lot more responsive than the Word you subscribe to now?

You are saying hardware is a lot faster now. And you are correct. What he is
saying is that applications are a lot slower, in spite of the faster hardware.
And he is also correct.

~~~
dragontamer
Word hasn't gotten much slower or faster in my experience. Honest.

Word has a bunch of additional features that are nice to have. We have spell-
check and grammar check today, layouts that don't explode, decent "styles
system" (kinda like LaTeX's tagging: you can change a style throughout your
document pretty easily under modern Word)

I'd take Word 2016 any day over Word 97. Granted, I'm one of those "strange"
people who see absolutely nothing wrong with the modern Ribbon (way better
than menus inside of menus, that change from system to system. My school's
toolbars for Word 97 looked completely different than my home computer's
toolbar... it was kind of ridiculous)

------
hyperpallium
git is _very_ fast

------
megaman22
Was originally a tweet-storm ->
[https://twitter.com/gravislizard/status/927593460642615296](https://twitter.com/gravislizard/status/927593460642615296),
if anyone is confused by the stream-of-consciousness style.

It is frustrating that I've got a machine on my desk more powerful than
supercomputers a generation ago, and it locks up and can't refresh the screen
as fast as my relatively slow typing on a regular basis...

~~~
sushibowl
> Was originally a tweet-storm ->
> [https://twitter.com/gravislizard/status/927593460642615296](https://twitter.com/gravislizard/status/927593460642615296),
> if anyone is confused by the stream-of-consciousness style.

I don't mean to detract from the point, but this piece would be helped so much
by properly fleshing it out into an article. Presenting it as a sequence of 92
tweets is combining the worst of both worlds.

~~~
gumby
I thought the choice of medium was itself an ironic comment on the crappiness
of modern computer usage. Tweetstorms are essentially unreadable, even with a
tool like this thread aggregator.

~~~
dredmorbius
Precisely my response on first encountering the content.

Wherein I discovered that this particular tweet aggregator _could_ be
initiated by a third party (others ... cannot).

[https://plus.google.com/+TimWesson/posts/VSGs2BddARD](https://plus.google.com/+TimWesson/posts/VSGs2BddARD)

------
almonj
I agree with this guy that having to use a mouse is often a chore and
inefficient. Google search used to have a feature where you could hit tab
after a search and it would auto select the results in order, but that seems
to be gone. We need more features like that. Keyboard control supremacy will
come back; it is inevitable.

------
the_cat_kittles
its all about the command line! right!? that thing rules!!

------
grandalf
My terminal responds more quickly than it did in 1983. In every other way
computers have subsumed televisions as entertainment devices, and have
performance characteristics appropriate to their main uses.

Because of Moore's law advances, the industry has learned to design for the
next generation CPU. Before Moore's law was understood, industry designed for
the current hardware and everything was extremely fast.

In a way I wish RIM had won the smartphone wars, but after the patent loss and
removal of the click wheel, the devices were far less usable.

Apple's foray into Skeuomorphism offers us a history of the competitive
landscape for usability and design. Skeuomorphism is a way of avoiding the
learning curve associated with abstractions. If the software is meant to let
the user take notes, making the screen yellow with lines on it helps the most
abstraction-resistant users grok what is going on.

Most adults can't do basic algebra, so it should be no surprise that the
dominant mobile software platform succeeded only after substantially dumbing
down the UI to the point where it was essentially abstraction free.

With Jony Ive taking more control of iOS we are finally breaking free of the
Skeuomorphic training wheels. Google's material design is an ambitious attempt
to create an abstraction that retains some of the more textural and familiar
"material-ness" of real-world objects, without resorting to crude mimicry. I
wouldn't say it's close to perfect, but it is promising.

So consider the need for a fast GPU and high res graphics in a mobile device
to be (in a sense) a tax that we must pay to help the small percentage of
abstraction-phobic luddites understand how to write a note or add a meeting to
the calendar. It also makes the device useful for gaming and for watching TV,
which many people enjoy doing and which create massive revenue for content
companies.

Apple came to dominate the smartphone market because it understood that
consumers wanted an all-purpose device that didn't shove too many abstractions
down anyone's throat, had decent video games, would let you watch TV and
movies, and did not feel like using a computer.

With Android, Google has mostly stayed behind by about one year to cut costs
and capture market share, but has increasingly been focusing on delivering a
top-tier experience and offering/supporting top-tier hardware.

Movies, shows, and songs make up 99% of most users' stored data. So of course
the device needs to work with that data. Any comparison to 1983 cannot
consider such massive data because it was unfathomable. Back then we thought
calendars, note taking, simple spreadsheets, etc., were what computers were
good for. Nobody realized that they would become handheld TVs with a built-in
gossip rag with custom content about all of our friends.

We will someday reach a point where a mobile device reaches "peak dopamine",
meaning that no further improvements will be necessary to make the device more
entertaining or more addictive. We have a long way to go, and a lot more
transistors will be needed in the hardware to reach peak dopamine, so we can
expect the trend to continue. UI responsiveness figures in somewhere, but is
obviously not the main driver of hardware cost increases.

~~~
grandalf
Wow this comment is getting a lot of downvotes. I'm curious what I said that
offended.

------
stevebmark
tl;dr we have a network layer? what is this garbage

------
dredmorbius
There's a ton here, virtually all of which I agree with strongly and have
long-harbored, deep frustrations with.

I've addressed a number of them in "The Tyranny of the Minimum Viable User",
arguing that this is a case of Gresham's Law and market interactions, _in a
domain with a tremendous range of human capabilities_. The result though is a
disenfranchisement of more capable users.

[https://www.reddit.com/r/dredmorbius/comments/69wk8y/the_tyr...](https://www.reddit.com/r/dredmorbius/comments/69wk8y/the_tyranny_of_the_minimum_viable_user/)

There's the whole matter of computer inputs, and the tremendous and persistent
utility of keyboards.

There's my immense frustration with mobile computing, where a near perfect set
of physical characteristics (form-factor, self-supporting-but-removeable
cases, stow-away keyboards) are sabotaged at every possible point by atrocious
and user-hostile OS and application design and lock-down, crippled storage and
capabilities, gratuitously thwarting advanced use. The lack of form-factor and
compatibility standardisation means that keyboard- and case-pairing to devices
_requires not manufacturer- but model-specific compatibility_. Warranties are
not honoured (Logitech). Devices aren't updated (Samsung/Google).

[https://ello.co/dredmorbius/post/lqgtwy_rhsfbdh5cdxb1rq](https://ello.co/dredmorbius/post/lqgtwy_rhsfbdh5cdxb1rq)

On the failure of systems to cross-reference, I give you Pocket, which
absolutely and deliberately stymies advanced use and actively gets worse the
more you use it. I've submitted long and detailed feedback to Pocket (now part
of Mozilla), and whilst acknowledged, there's been absolutely no movement on
any of these, even the simplest, such as incremental search through my
copious tag classification. _It literally takes several minutes to swipe
through this by hand._ The application -- intended for long-form reading --
_has no search-in-page functionality_.

Whiskey. Tango. Foxtrot.

[https://www.reddit.com/r/dredmorbius/comments/5x2sfx/pocket_...](https://www.reddit.com/r/dredmorbius/comments/5x2sfx/pocket_it_gets_worse_the_more_you_use_it/)

I'd really like to know how the hell to clobber the tech industry with a
cluestick, because the present model is absolutely not working.

------
mistermann
Nice rant, but sadly, as he says, this is just how it is. There are very few
pieces of software I use that aren't absolutely chock full of brain-dead
stupid behaviors... not just mildly annoying or difference-of-opinion style
issues, but downright dumb/broken. Gmaps is a good example; my blood pressure
always rises while trying to use it.

~~~
dx034
No, the people on HN are the ones who could change it. Many here develop
software. The current mantra of many is to avoid premature optimisation. Who
cares about a function that runs 50ms? Well, in reality this function will at
some point run within other functions and needs to be called 10 times, adding
half a second to page load. Micro services are also really nice to develop but
you'll almost always pay with performance. Even inter-datacentre latency adds
up if you need 100 calls until you can show the result to the user.

------
stevewilhelm
Back in the mid eighties rendering very simple three dimensional scenes using
ray tracing took a great deal of time.

"...images generated using some of the above improvements to ray tracing. They
all took approximately 50 minutes each to compute on a VAX 780." \- From
[http://www.cs.yorku.ca/~amana/research/cones.pdf](http://www.cs.yorku.ca/~amana/research/cones.pdf)

Example image found here:
[http://www.cs.yorku.ca/~amana/research/images/spheres.jpg](http://www.cs.yorku.ca/~amana/research/images/spheres.jpg)

A DEC VAX 780 cost over a hundred thousand dollars -
[http://www.computerhistory.org/revolution/mainframe-computers/7/182/736](http://www.computerhistory.org/revolution/mainframe-computers/7/182/736)

