
Slow Software - Thibaut
https://www.inkandswitch.com/slow-software.html
======
dandare
Input latency, I can understand. Web latency, at least you know what you are
paying for (trackers, fonts, bloated stylesheets, MBs of JS libraries...). But
core software latency, that is pure madness.

Android share menu: what on earth takes 500ms-2000ms to display a menu??

Android scrolling: I could have a smoother scrolling if I rendered the view on
a wall in Doom III. Why on earth is the scrolling process not prioritized
above everything else?? iOS got this right.

Microsoft Outlook on Mac: it can easily take 2000ms to close(!) a window, are
you kidding me?

My new 2018 Touchbar Mac does not "feel" any faster than my old 2013 retina
Mac, while the paper specs and benchmarks show at least 100% increase in
computational power. They can both just as easily choke on some unwanted
Skype update or stupid WebEx video stream (yes, I work in corporate).

~~~
nugator
Amen, the Android share menu. Seems like it tries to go through the entire
history of my previous shares to see what is most popular, then asks every app
that enables sharing to list their respective sharing targets. Some of these
apps also check which of their own sharing targets are most popular (like the
SMS app and Messenger). Finally it renders a long list that to me feels random
each time and not always relevant. /End rant
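
The sequence described in this rant, querying every sharing-capable app for its targets before anything can be drawn, can be sketched as a toy model. App names and per-query costs below are invented for illustration; the real mechanism is Android's intent resolution, not these functions:

```python
# Toy model of the share-sheet pattern described above. The point is that
# sequential per-app queries sum into user-visible menu latency.
APPS = {"sms": 3, "messenger": 5, "mail": 2, "drive": 8}  # hypothetical ms per app query

def build_share_menu(apps):
    """Ask each app for its share targets, then rank the combined list."""
    targets, total_ms = [], 0
    for app, cost_ms in apps.items():
        total_ms += cost_ms      # each app's query adds its own latency
        targets.append(app)
    targets.sort()               # stand-in for the popularity-ranking pass
    return targets, total_ms
```

With only four apps the toy menu already costs 18 ms; multiply by a few dozen share-capable apps plus history lookups and a half-second menu becomes plausible.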

~~~
Bjartr
I recently learned I could long press share targets and pin them. So the few
I actually use, out of a list of a couple dozen, now show up first.

~~~
deanCommie
Which, infuriatingly, is disabled in some tier-1 Google apps, including
YouTube and Google News.

------
throwaway286
I work at a large company that is not one of the famous silicon valley tech
companies. One of the worst parts of working there is the heap of various
enterprise anti-virus software they install on our computers. It brings huge
typing and disk access latencies. Even opening files in vim with FZF is slow.
I can't explain why but this makes programming so much less pleasant. I really
just want to work somewhere without anti-virus.

~~~
silviogutierrez
I'm currently picking the OS stack for all machines at my company. All
guidance, even people I respect, point towards antivirus protection. Yet I
lean towards nixing that. I know it's hugely ineffective. In fact, it opens up
holes of its own[1].

And yet... it's like scaffolding in NYC[2]. Absolutely useless[3]. But if you
are all for removing it, and a brick falls and hurts someone, heads will roll.
Quite a quandary I - and other C-levels - face.

[1]
[https://www.computerworld.com/article/3089872/security/secur...](https://www.computerworld.com/article/3089872/security/security-vulnerabilities-in-symantec-and-norton-as-bad-as-it-gets-warns-researcher.html)

[2] Another contrarian passion of mine.

[3] Bricks do fall and hurt people. But no more than scaffolding itself falls
with the same effect.

~~~
coder543
Microsoft Defender typically doesn't hurt performance or security much. The
alternative is to run Mac or Linux stacks instead of a Windows stack, of
course.

~~~
godDLL
While running a Linux stack may still work, your Mac info is at least three
years out of date. Macs entered the zoo in 2015.

~~~
coder543
For practical purposes, that's not true.

Macs do not allow running unsigned software by default, and _no one_ runs
antivirus on Mac, ever. So even if they were commonly being infected, which
they aren't, the person above would have organizational indemnity if someone
were infected because they're following industry best practices by not running
antivirus on Mac.

If you want, you can further restrict Macs to only App Store software, which
is heavily sandboxed. Then you can go even further by not allowing the
individual users to install software on their own, if you really want to be
draconian about it.

Unless someone is being individually targeted, running very outdated software,
or is intentionally trying to get themselves infected, it will not happen.
Even if all three conditions are true, it's still very unlikely.

Anyone who says otherwise is just fear mongering. That same level of fear
mongering could point to the dozens of pieces of malware that have been
released for Linux.

I say this as someone who uses a Linux laptop for work and a Windows desktop
at home. I don't have a dog in this fight. I do, however, try to stay very
informed about the state of software security.

~~~
pjmlp
In many enterprises you are only allowed to connect laptops into the network
if an anti-virus is present, that includes company issued Macs.

~~~
petepete
And this is why I used to have a laptop in my drawer I used for logging into
the proper intranet, booking holidays and organising my pension.

All my _actual_ work was done on Unix/Unix-like machines on our own old
network, something we clung on to after acquisition.

------
coder543
I think this article places too much emphasis on input devices and hardware
constraints, and not enough on software architecture. JIRA doesn't really feel
any faster just because your input devices are fast; the application level
latency dwarfs the input latency by a large margin.

I would take measurements, but JIRA prohibits benchmarking for some reason...
¯\\_(ツ)_/¯ they're probably just trying to save everyone else the
embarrassment of seeing how incredibly fast JIRA is compared to their own
sluggish offerings, right?

I am confident that JIRA's application latency has nothing to do with Java's
(backend) or JavaScript's (frontend) garbage collectors.

Many companies just feel no impetus to write fast software, or to use
commensurately powerful hardware.

~~~
ploxiln
Jira is in a league of its own. With latest stable Firefox on a Macbook Pro
from 2014, I get a full 10 seconds loading time for the notifications sidebar.
Every click in Jira takes between 5 and 15 seconds. It is absolutely nuts and
it seems like many people just don't understand how insanely bad it is (but
obviously many engineers do ;)

Meanwhile my browser can load, render, and scroll a 4000 line colored diff in
less than half a second (e.g.
[https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...](https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=01897f3e05ede4d66c0f9df465fde1d67a1d733f))

~~~
sb8244
Is your Jira heavily customized? I haven't seen this level of slowness and I
use it daily. I know that customizations can be the root of all evil on Jira.

~~~
notduncansmith
And this level of open-ended configurability is why many vendors prohibit
benchmarking. While Atlassian is exceptional in some regards, this is not one
of them.

~~~
michaelmrose
If Jira is slow crap when used in the way real people use it, then isn't that
a meaningful fact worth sharing?

------
taneq
I've said it before and I'll say it again - one of the greatest boons from the
new wave of VR is not the tech itself (fun though it is) but the focus on
latency as a first-class metric. I've always been sensitive to microstuttering
and so find it irritating when a game that's "running at 90fps" (or a document
that's scrolling) still has perceptible judders.

~~~
chrisweekly
is 'judder' a synonym for 'jank'?

~~~
jblow
‘judder’ refers to dropped frames in a fixed-frame-rate application, frames
that are not dropped but are rendered too late, or, more generally, variance
in frame rate of a program that should be smoothly animating.

~~~
Aeolun
I think ‘jank’ would refer to the same thing, as soon as it lasts longer than
a few frames.

E.g. the program runs consistently otherwise, but somehow freezes for 0.5s
every minute or so.

~~~
lloeki
My intuitive interpretation has been that `jank` is related to frame dropping
but with time perception being constant (like sampling being insufficient yet
played at correct times), whereas `judder` also has a sort of rubberbanding
effect on time, where time is perceptually compressed or dilated as playback
is delayed or "catches up", with frame dropping happening upon e.g. a deadline
being missed. Here, with frames 2 and 4 being costly to render:

    
    
        normal: F1....F2....F3....F4....F5....F6
        jank:   F1....F2..........F4..........F6
        judder: F1....F2........F3....F4......F6
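
The distinction in the diagram can be generated with a toy model (frame period, dropped-frame sets, and delays are invented for illustration): jank skips frames but keeps the survivors on schedule, while judder keeps every frame but shifts the late ones in time.

```python
FRAME_MS = 16  # nominal frame budget at ~60 fps

def jank(n, dropped):
    # frames in `dropped` are skipped entirely; the rest stay on schedule
    return [(i, i * FRAME_MS) for i in range(n) if i not in dropped]

def judder(n, late, delay_ms):
    # no frame is skipped, but late frames shift, warping perceived time
    return [(i, i * FRAME_MS + (delay_ms if i in late else 0))
            for i in range(n)]
```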

~~~
taneq
This use of 'judder' is what I would call 'jitter'. 'Jank' I'd just call
'skipped/dropped frames'. I never realised how much this terminology varies
between groups, I think it's fascinating!

------
jumarm
Further reading from Dan Luu, who wrote about this as well. [0]

He goes into why the Apple 2e is so quick, the iOS Rendering pipeline, and
general complexity of computing input.

I find it really interesting that latency is so high on a lot of devices in
2018.

0\. [https://danluu.com/input-lag/](https://danluu.com/input-lag/) (2017)

------
gleenn
I like how the site is itself an example of very fast software: such a simple
design, and it renders immediately; a single file load with the tiniest bit of
JavaScript to make the videos work. A full page load in 27ms with only 14kB
transferred!

~~~
merb
To be fair, the site is HTML-only and gets served via Cloudflare -- not a
fair metric against a "dynamic" site.

Also, most webservers/frameworks/whatever will favor throughput over latency.

~~~
pdpi
Dynamic sites are a choice, and achieving that richer level of interaction
comes with its own costs. For every legitimately dynamic page out there (going
all the way up to full-on applications), there are many more that have no good
reason to be anything other than static content served from a CDN. Going pure
HTML for a page that doesn't need more is precisely the sort of design choice
that goes with the mentality they're advocating.

Optimising for throughput rather than latency is, again, a design choice, and
the article serves as an indictment of taking it too far.

------
robochat
My least favorite type of latency is when you start typing and the application
takes 1-2 seconds to catch up with you. This still happens to me often enough
to be a normal occurrence on both my desktop computer and my phone.

~~~
Shank
You know what, I'm actually more okay with this than perceptible, constant
latency. If _nothing_ is happening then I know I can just stop looking for
characters to appear and keep typing. A tiny part of me loves to "get ahead"
and just keep going because I know it'll be magical when it does arrive. If
it's just always laggy, though, it's disorienting and disconnects me much
worse.

Now if it's always doing that -- like a 1-2 second delay more than a few times
in 10 minutes, for instance, it's terrible. But if it's a one-off, I don't
mind it.

~~~
O_H_E
What is _extremely_ annoying is when I get used to this "getting ahead" dance,
but every once in a while chrome would just ignore what I typed and I would
have to go back and retype it.

------
voltagex_
I keep coming back to thinking that it's the gap between dev and "mainstream"
hardware.

Forget about phones for the moment (lol, Javascript performance on a $99
Android).

My PC is a 6-core, 32GB-of-RAM beast. I suspect that most (non-PC-gamer)
people's machines look closer to
[https://www.officeworks.com.au/shop/officeworks/p/acer-aspir...](https://www.officeworks.com.au/shop/officeworks/p/acer-aspire-3-15-6-laptop-a315-21-40al-aca3152140).

Dual core, 8GB of RAM, spinning hard drive (!)

I can tell you from experience that Windows 10 isn't a fun experience on
anything other than a fast SSD.

~~~
flukus
The machine you posted is pretty typical of PC gamers according to the steam
user survey
([https://store.steampowered.com/hwsurvey?platform=combined](https://store.steampowered.com/hwsurvey?platform=combined)).
8GB+ of RAM only reached 50% of users a couple of years ago. I suspect most
non-PC gamers' machines are even worse; the Firefox survey has 30% of people
still on 4GB:
[https://data.firefox.com/dashboard/hardware](https://data.firefox.com/dashboard/hardware).

> I can tell you from experience that Windows 10 isn't a fun experience on
> anything other than a fast SSD.

Agreed. I think MS are dogfooding exclusively on Surface Books.

~~~
imtringued
>The machine you posted is pretty typical of PC gamers according to the steam
user survey

The linked laptop still has an ancient 1366 x 768 screen, a resolution shared
by only 12.62% of all players that have taken the survey.

------
okaleniuk
Well, that's all good, but it doesn't explain how we got here. I think
Moore's law resulted in a resource curse for PC and mobile. Unlike with the
shared computers of the past and the cloud services of today, you pay not for
the resources you use, but for the resources you own. Under Moore's law those
grew exponentially for quite a while, so software simply followed the trend,
growing exponentially in overhead too. Technical reasons are probably
secondary, since we do still have real-time computing and high-performance
computation in their own respective niches. It's not that all software is
inherently bad; it really depends on what it is for. I believe the socio-
economic reasons for the current state of software are the most interesting
and the least researched.

------
stirner
> When dragging items on the screen, for example, users perceive latencies as
> low as ~2ms

This seems doubtful, given that even a state-of-the-art 240Hz screen only
refreshes every 4.16ms. I guess they could compare dragging on a screen to
dragging a physical object, but that would still be comparing 4.16+ms latency
to 0ms latency, which doesn't explain the 2ms figure.

~~~
alimbada
> This seems doubtful

Ever tried playing a fast paced game with VSync turned on?

~~~
stirner
True, the next frame can start getting drawn before the last one has finished
displaying, but every desktop compositor I know of uses vsync.

------
jccalhoun
When I read the headline I thought this was going to be something like slow
food or the slow movement in general
[https://en.wikipedia.org/wiki/Slow_movement_(culture)](https://en.wikipedia.org/wiki/Slow_movement_\(culture\))
and arguing for software to be slower.

I'm glad I was wrong.

~~~
SmellyGeekBoy
My understanding of Slow Food is that it isn't necessarily about "slowness"
as such. It's more about keeping things close to nature and not interfering
with or overprocessing them too much. Something similar would make sense for
software (e.g. using lower-level languages, less extraneous JS on websites,
etc.)

~~~
DoctorOetker
turn on, tune in, NOP out

------
gok
The "user-hostile" section is kind of silly. Poor/lazy coding is also user-
hostile. Requiring 20 megabytes of JavaScript frameworks because you're too
lazy to figure out how to solve your feature requirements is user hostile.

Maybe call that other stuff "monetization"

------
thinclientwoe
I spent a decade working in engineering at Intel. Basically all design work
(as of 2016, and for 10+ years prior) is done over VNC to Linux servers in
data centers.

The experience was barely tolerable over 100mbit Ethernet to an on-site data
center, and anything less was fairly abominable for just about anything other
than working in a black and white terminal.

The majority of the work is fairly graphical and most engineers rarely have
the luxury of connecting to an on site data center.

Over the years the situation both improved and worsened:

Pluses:

\- Circa 2010 some of us were lucky enough to get gigabit Ethernet connected
to our desks.

\- Circa 2010-2015 there were improvements to the VNC protocol, like JPEG and
Zlib compression, that helped a lot in bandwidth-constrained situations
(nearly all of them).

\- Circa 2014 a lot of us got 802.11ac-capable office APs and laptops, often
pushing 300+ Mbps reliably.

Minuses:

\- The company shut down a bunch of datacenters and set up “hub” sites,
making most of us work over high-latency WAN links even in the office.

\- More and more work got organized across sites, making many of us remote to
far-off datacenters even if we were local to a hub.

No one in engineering management or IT seemed to take the problem seriously.
No wonder the company has floundered so much.

To me an unresponsive interface completely spoils flow and dramatically
reduces my productivity.

------
turrini
Somebody once posted a well known news website with an alternate link that
loaded instantly. Anyone remember this? I don't remember if it was WSJ, or NYT
or something else.

~~~
shakna
A couple news sites have 'text-based' versions. Became popular as a way to
give people in disaster areas with spotty signal a chance to find things out.

These are some I found:

[0] [https://lite.cnn.io/en](https://lite.cnn.io/en)

[1] [http://thin.npr.org/](http://thin.npr.org/)

~~~
DoctorOetker
how do I get my browser/OS to modify link URLs when I go there so it prepends
the right "lite" URL?

~~~
anoncake
[http://einaregilsson.com/redirector/](http://einaregilsson.com/redirector/)
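
For the simple host-to-host cases, the rewrite that extension performs can be sketched like this (the host mapping is taken from the lite sites linked above; passing the path through unchanged is an assumption that may not hold for every site):

```python
from urllib.parse import urlsplit, urlunsplit

# Hosts assumed from the lite/thin sites linked in this thread.
LITE_HOSTS = {
    "www.cnn.com": "lite.cnn.io",
    "www.npr.org": "thin.npr.org",
}

def to_lite(url):
    """Swap a known host for its text-only mirror, else return url as-is."""
    parts = urlsplit(url)
    lite = LITE_HOSTS.get(parts.netloc)
    if lite is None:
        return url
    return urlunsplit((parts.scheme, lite, parts.path,
                       parts.query, parts.fragment))
```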

------
jodrellblank
Related, dadgum.com's article "How much processing power does it take to be
fast?" commenting on an arcade machine playing Defender with very low latency
30-40 years ago -
[https://prog21.dadgum.com/68.html](https://prog21.dadgum.com/68.html)

------
Corrado
On a related subject, I've noticed that my local gas station updated their
software for their fuel pumps and it is terrible. For some reason the latency
is very high, so high that I can't reliably type in my PIN. It literally takes
500ms-1000ms for key-presses to register, plus it doesn't seem to buffer them
very well, so if you go too fast it drops the key-press altogether. Finally,
they changed the font to a fancy script that is difficult to read on the low
resolution display.

At first I thought it must be a hardware change, or that something on the
back-end was slower. There is another gas station of the same brand just 3
miles away
and it is still running the old version of the pump software, which is fast
and user friendly. In fact, it makes me want to drive that extra bit in order
to not have to put up with the slow software.

------
jancsika
Why aren't ads a data point in the infographic?

Ads are a necessary component of the web atm. They typically insert a delay
between the user's action (clicking a button, scanning paragraphs of text with
one's eyes) and the desired behavior (watching a video, comprehending which
text is the article vs. advertising pictures and/or text).

So a YouTube app running on Fuchsia could become a poster child for "anti-slow
software" based on the author's guidelines. Yet this would only deliver the
user more quickly to the problem of ad latency -- a problem that is orders of
magnitude worse for UX than the problems listed in the article.

It seems like inside baseball to make ad latency an externality to the core
problems of slow software.

------
dorukane
While comparing Samsung and Apple mobile device latencies, the article gives
these tapping latency examples (videos slowed 16x):

\- Opening a settings tab on an iPhone 6s with ~90ms of latency.

\- Toggling a setting on a Samsung S3 with ~330ms of latency.

I agree latency is evil; I hated Android a while ago because of this. Apple
always felt really fast compared to other OSes. BUT it seems normal that
toggling a setting proceeds a bit slower than just opening a tab, no? It
seems like just a bad example.

~~~
keldaris
Isn't that just ridiculously slow animations for the most part? I still use
an ancient OnePlus X that I got when it came out, and I've disabled all UI
animations; toggling most settings (with legitimate exceptions like activating
the wifi hotspot feature) feels almost instant, certainly nothing close to
300ms. Admittedly, I haven't used any iPhone in many years, so I can't really
compare.

~~~
SmellyGeekBoy
> Isn't that just ridiculously slow animations for the most part?

Agreed, and as the recent example with the iPhone calculator proved, Apple
aren't exactly immune to this either.

~~~
keldaris
Can you just disable animations on Apple devices the way you can on Android?

~~~
gitgud
Yes, I think Low Power Mode disables almost all animations on Apple devices.

~~~
keldaris
Wouldn't that disable other features as well (push notifications, maybe)?
Animations are something I disable permanently just to have a better user
experience; I wouldn't want to sacrifice anything else.

~~~
pdimitar
Yes it would. But you can only reduce effects:

Settings -> General -> Accessibility -> Reduce Motion.

The only thing I dislike is the slightly counter-intuitive quick fading effect
when minimizing or switching between apps. Outside of that though, any iDevice
feels snappier with that option enabled (== effects reduced).

------
reallydontask
Link is blocked at work by McAfee Web gateway

Url category is pornography

------
AnonC
I have an HP EliteBook with an i5 processor, 8GB RAM, and an SSD that’s about
70% empty (only 30% space is being used). It runs the latest Windows 10 image
from work and _is slow as molasses._ Almost every action I take, be it a mouse
click or hitting a key or switching between applications, takes a few seconds
or much longer. I thought it was a McAfee issue, but the CPU usage is above 50%
almost all the time and this usage is across many processes (whose names I
don’t understand), not just McAfee.

Since it’s a work image of Windows, it has policies set to prevent me from
changing many things.

Where do I even start troubleshooting this issue and finding the culprits? Is
it just a CPU usage issue and/or some kind of I/O issue? I haven’t yet tried
using something like Process Explorer (from the sysinternals tools) to get a
clearer idea of what’s happening (though I’m not sure if that’d help).

I’m thinking of putting Linux on it as an alternative.

Any and all suggestions are welcome and appreciated.

~~~
hugg
McAfee intercepts every IO read/write afaik, which makes it horribly slow

------
pier25
I seriously doubt the vast majority of the population cares.

I mean, I do. I hate websites that take many seconds to fully load when I
_know_ they could take less than 1 second without the bloat.

Hardcore desktop gamers and developers are usually also very performance
conscious, but that is a minority.

Sites keep getting hits while adding bloat, and it's totally
counterintuitive. Why?

~~~
HankB99
I care. One of my worst slow-system experiences is my "new" Xfinity X1
"Entertainment System." Latency from a remote control button press is on the
order of a second or more. Worse, it will put prompts up on screen before it
is ready for the input. For example, when I finish watching a recorded program
it puts up a "Delete" prompt. If I press the OK button when this prompt
appears, nothing happens; I have to wait a second or two and press OK again.
When I open the list of recorded programs it can take 2-4 (or more) seconds to
display anything. The delays are long enough that I am often left wondering
whether it registered a button press or whether I need to press again, so I
have to wait at minimum 4-5 seconds to see if the system is catching up. It's
a constant irritation. It baffles me that their flagship product is so non-
performant. I guess that when you have no real competition there is no
motivation to produce a better system.

~~~
majewsky
I had an LG "Smart" TV for a year. It was horrible. Suppose that I want to
change the channel to number 54:

1\. Pick up remote. Type "5", "4", "OK". Put down remote.

2\. Wait for 5 seconds. A "5" appears on the screen.

3\. Wait for 2 seconds. A "4" appears on the screen.

4\. Wait for 2 seconds. Channel switches.

I sold it and got a 40" screen for my PC instead. The PC actually boots faster
than the "Smart" TV.

~~~
pier25
Smart TVs are generally terrible. It's usually better to just use an Nvidia
Shield or an Apple TV for the smart functionality.

Manufacturers like LG and Samsung are terrible at UX/UI, and then you have
others, like Sony, using Android TV, which is great, but pairing it with
crappy low-performance SoCs.

------
phtrivier
> We hope this material is helpful for you as you work on your own software.

Sadly, as interesting as the material in the article is (great to learn about
the measured latencies of the hardware side), I fail to see much that is
actionable for a run-of-the-mill software developer.

It seems the only advice is "don't download ad / tracking / social media
related stuff", but even that is not exactly in the developer's circle of
influence. Who's going to make Google Analytics smaller to download? (Except,
well, Google?) Is any developer really in a position to say "great news, our
pages now load xxx ms faster! However, you won't be able to compute your KPIs
for this semester, is that a problem?"

Also, is "using a language without GC" accessible today for a web frontend
developer (through some Rust / wasm / whatever magic)?

~~~
icebraining
The author doesn't say you should use a language without GC, but minimize its
effects. There are techniques to avoid GC churn by reducing allocations and
the subsequent cleanups.
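
One such technique is reusing a caller-provided buffer instead of allocating fresh objects on every call. A minimal sketch of the idea in Python (function names invented; the principle carries over to any GC'd runtime):

```python
def step_allocating(coords):
    # builds a fresh list of fresh tuples on every call -> steady GC churn
    return [(x + 1, y + 1, z + 1) for x, y, z in coords]

def step_in_place(coords, scratch):
    # writes into a preallocated buffer; no per-call allocations to collect
    for i, (x, y, z) in enumerate(coords):
        scratch[i][0], scratch[i][1], scratch[i][2] = x + 1, y + 1, z + 1
    return scratch
```

In a hot loop, the second form keeps allocation (and the subsequent collection work) out of the steady state, at the cost of a less tidy API.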

~~~
majewsky
I heard an anecdote about how Minecraft got much slower when Notch (the
original developer) turned it over to a team of employees. The new team did
some refactoring, e.g. instead of calling functions like

    
    
      doWork(int x, int y, int z)
    

they refactored that into

    
    
      doWork(Coordinate c)
    

and that's when Minecraft started eating RAM like some sort of delicious
candy, because now each time you deal with a new Coordinate, it's one more
object to garbage-collect. The old method may not have been particularly
pretty, but plain ints are allocated on the stack and thus reduce GC pressure.

(BTW, can anyone confirm or deny that anecdote?)

~~~
MaxBarraclough
Was `Coordinate` a class or a struct? C# structs generally [0][1] don't force
use of the garbage-collected heap.

[0]
[https://blogs.msdn.microsoft.com/ericlippert/2010/09/30/the-...](https://blogs.msdn.microsoft.com/ericlippert/2010/09/30/the-
truth-about-value-types/) (ignore the usual comments telling the reader they
are wrong for wondering about whether the garbage-collected heap is used)

[1]
[https://jacksondunstan.com/articles/3453](https://jacksondunstan.com/articles/3453)

~~~
slavik81
Minecraft is a Java application.

~~~
MaxBarraclough
Derp, of course! Mention of Microsoft threw me off :-P

In the JVM there are of course no structs, but I'd expect the escape-analysis
optimisations in the HotSpot JIT to eliminate most of the GC churn. If that
isn't happening, I'm curious as to why.

------
carapace
I recently switched to an OpenBSD machine with no mouse. I have a large
screen and use tmux -- no X windows -- and a clicky gamer's keyboard. It's
soooo nice.

The only downside so far is that about ~60% of the WWW sucks through Lynx... I
have a separate machine on my desk that's basically a Firefox and VSCode kiosk
now. But I've gotten a dead-tree hardcopy _book_ on Vim...

------
piccolbo
At first glance I thought it was the sw equivalent of the "slow food"
movement. No such luck.

------
wink
Only since working full-time with gcc hogging all my CPUs and most of my RAM
for up to 10 minutes at a time (and slowing down the computer) have I fully
appreciated having worked in dynamic languages for many years. Yes, that
problem might be solvable with a beefy build box.

~~~
kpmah
The difference is interpreted vs. compiled, not dynamic vs. statically typed.
You can use a C interpreter to instantly run C code without compilation.

~~~
wink
Of course you're right but it's equally often used for interpreted scripting
languages. And I don't like the term scripting languages. Apparently naming is
hard, who knew.

------
daxfohl
At the app level, I'd say it's about responsiveness not latency. Instagram had
a post a while back about how they cheat a bit to make their app feel
responsive even if actual latency was high.

From this perspective one could say this article puts too much focus on the
raw numbers: how many ms to a response. As techies, we like that: cheating is
cheating, and numbers are important. But really we need to look harder at how
we can make users perceive that things are better than the raw numbers
suggest.
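
The usual trick here is an optimistic update: reflect the action immediately and reconcile when the server answers. A toy sketch (names and structure invented, not Instagram's actual code):

```python
class LikeCounter:
    """Optimistic UI: the tap shows instantly; failures roll back later."""
    def __init__(self):
        self.count = 0     # what the UI displays right now
        self.pending = 0   # taps the server has not confirmed yet

    def tap_like(self):
        self.count += 1    # update the display immediately, zero latency
        self.pending += 1

    def on_server_reply(self, ok):
        self.pending -= 1
        if not ok:
            self.count -= 1  # undo the optimistic guess on failure
```

The perceived latency is zero even when the round trip takes a second; the cost is the occasional visible rollback.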

------
leowoo91
I hope devs can fix the racing wheel latency first, so we can enjoy those
racing games.

~~~
voltagex_
Which racing wheel have you tried? Was it implemented as a HID device (subject
to the polling rate limitations described in the article) or with its own
driver?

~~~
leowoo91
I didn't buy a single one, since I have seen many YouTube videos and all seem
to have latency (yep, HID devices). By latency I mean: when you turn the
wheel, it takes about 1/3 of a second to register in the gameplay. It is very
hard to blame the slowest part -- probably the game itself -- but I really
wish that would be solved.

------
agumonkey
There's slow as in laggy, and slow as in tempo.

I remember how I loved my slow HP-48: the input buffer was still listening,
and I could easily think and keep typing operations while the screen was
busy. It never felt "slow".
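
The HP-48 behavior described here, accepting keystrokes while the display lags behind, amounts to a key queue that is always serviced. A minimal sketch (details invented for illustration):

```python
from collections import deque

class KeyBuffer:
    """Keystrokes are enqueued even while the machine is busy, so typing
    ahead loses nothing and the backlog is replayed in order later."""
    def __init__(self):
        self._queue = deque()

    def key_pressed(self, key):   # cheap "interrupt handler": never blocks
        self._queue.append(key)

    def drain(self):              # main loop catches up when it gets time
        out = []
        while self._queue:
            out.append(self._queue.popleft())
        return out
```

Because input capture is decoupled from processing, a slow machine can still feel responsive: nothing you type is ever dropped.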

------
PavlovsCat
> Android and iOS both make substantial use of "long press" to access context
> menus, which require that the user wait hundreds of milliseconds in the
> middle of their command gestures.

> A related source is delays for disambiguation. For example, on mobile Safari
> there's a default 350ms delay between when the user taps a link and when the
> browser begins fetching the new page, in order to tell the difference
> between a link click and a double-tap zoom.

I really wish mobile devices had one or two modifier buttons on the side. That
way you could have "right click", maybe even positioning a cursor without
clicking, all sorts of crazy stuff, like being able to "mouse over" a link in
a mobile browser.

~~~
ggreer
Apple has 3D touch, where if you press firmly on the screen, the phone makes a
tactile "click" and triggers a different action than normal tap. Annoyingly,
Apple mapped 3D touch to a totally new set of actions instead of replacing
long press.

~~~
saagarjha
Apple has slowly started merging the two, because it makes it easier for them
to support devices with and without the feature.

------
RemarkableMan
You will never massage and cram all the remote cruft efficiently enough to get
back the responsiveness of an Apple II. Today’s software is turtles all the
way down from the library to the sub-library to the JIT to the application and
OS and hardware and they each need 5-25 milliseconds to even wake up. That’s
before you even hit the network.

Things don’t get bad, they get worse.

~~~
majewsky
> Today’s software is turtles all the way down from the library to the sub-
> library to the JIT to the application and OS and hardware and they each need
> 5-25 milliseconds to even wake up.

Where do you get these numbers? I have programs that run from start to finish
in 5 ms, including tons of OS syscalls.

I think many of us forgot how efficient OSes are because of the shitshow that
app developers put on top.

~~~
zimpenfish
I have a Go program that parses command line arguments, loads and parses a
text file, then interprets that code on a faux-CPU. Total time 24ms start to
finish for 107 faux-CPU steps (sorting 12 numbers of bunches of 3) according
to `time`.

~~~
majewsky
Mine is also a Go program that reads several files and does some basic
computations on top of it. I wrote about the `time` measurements here:
[https://blog.bethselamin.de/posts/latency-matters.html](https://blog.bethselamin.de/posts/latency-matters.html)

