Almost everything on computers is perceptually slower than it was in 1983 (2017) (twitter.com)
461 points by fao_ 32 days ago | 405 comments



Computers are perceptually slower because we've replaced CPU/memory slowness with remote latency. Every click, often every keypress (for autocomplete), sometimes even every mouse hover, generates requests to who knows how many services in how many different places around the world. No wonder it's slow. Speed of light. I see this even using my development tools at work, where actions I didn't really want or ask for are being taken "on my behalf" and slowing me down. I notice because I'm on a VPN from home. The developers don't notice - and don't care - because they're sitting right on a very high-speed network in the office. It's the modern equivalent of developers using much faster machines and much bigger monitors than most users are likely to have. Just as they need to think about users on tiny phone screens and virtual keyboards, developers need to think about users with poor network connectivity (or just low patience for frills and fluff).


When dealing with web applications at least, people don't realize how many non-essential requests are made peripheral to the action the user actually wants to accomplish. For instance, install NoScript and load cnn.com. There are 20+ external requests being made to all kinds of tracking, advertising, and analytics domains which have fuck-all to do with the user's request to see content hosted by cnn.com. The page loads almost instantly when all these non-essential requests are filtered. It's a hilariously bad side effect of the web becoming as commercialized as it is.


Inserting network and server latency into the local UI feedback loop is a terrible anti-pattern.


Windows Explorer is seriously slow on Windows 10. Things like the right-click menu and creating a new folder are far slower than the work involved justifies. The New Item menu is very slow, perhaps due to having Office 365 installed? Creating a new folder sometimes doesn't update the display of the containing folder at all.


The built-in image viewer application in Windows 10 takes so long to start up that it's outright unusable for its purpose.


It also has a large permanent gutter, can't successfully fullscreen an image, can't switch images while zoomed (really common lately but still dumb), resizes to some third size if you double click on it twice(??), and loses its zoom state if you switch desktops.

Seriously the worst image viewer I've ever seen.


Someone from Microsoft, I know you are reading this, please explain to us how this makes it all the way to launch like this.


Their priority seems to be to make their "programs" into "apps" so PCs are indistinguishable from cell phones. It may make business sense to them, as so many of these companies are aiming to be everything to everyone, but it doesn't make user sense, because if I wanted my PC to work like a phone, I would have just saved the money and stuck with my phone. They're also trying to make their apps social magnets to sell news stories, movies, music . . . it's very tiresome.


Same as everywhere else: some managers needed to justify a promotion.

P.S. not from MS


And all the more irritating because they had a perfectly usable image viewer which they could have used.


Data point of 1, but mine is as fast as anything. On my 7-year-old desktop, upgraded from 7 to 10, it's near instant. But given that almost anything can hook into that menu (e.g. I have WinRAR, VS Code, TreeSize, etc.), it probably comes down to what you have running.


A second data point for the OP: I have a brand-new laptop with 16 GB RAM and an SSD. When I mouse over "New", it takes 3-4 seconds for "Folder" to appear. This is first thing in the morning with only Opera running. I do have a bunch of tabs open, but it does this without my browser running as well.


How's your start menu?


I concur. I handle video files on an SSD, and it can take 2-3 seconds to just view a folder with one file inside. All I want is the file name and an icon, but apparently in 2019 this is a Herculean task.


I really wish we could turn off what appears to be a deep scan of all the files every time one opens a directory in Windows 10. I just need to see the filenames to do what I want 99% of the time.


"This folder contains a single .wav file. I'll automatically switch over to an artist / album / etc view, and hide all the details you normally see"


It all depends on what hooks into the context menu. The vanilla menu is fast. But once you accrue some unpackers, file sharing, VCS tools, and garbage like Intel graphics drivers, the context menu gets unusably slow.


One part of me says Microsoft still needs to take responsibility for managing the experience.

"git-torrent is lowing your system down, do you want to disable it?"

Another part of me says careful what you wish for: you don't want everything locked down like Apple, do you?


Microsoft does this with Outlook add-ins. Many people accidentally disable them, or wonder where their Skype add-in went when Outlook killed it because it took too long to load.


I'm using a CalDAV add-in to Outlook and it gets stopped every now and then. It seems pretty lightweight, and my guess is that Outlook is so borked that it causes the problem itself and then mistakenly attributes it to the add-in.


I drop to a command prompt quite often to do a dir /S thatfile.exe

Finding that file via Explorer search takes 10 minutes. Via dir, it somehow takes 10 seconds or less.


I use Everything from Void Tools.


I use Total Commander on Windows, and Midnight Commander and DoubleCmd on Linux. All nice and instant, with a plethora of operations. Not sure why the Explorer-type UI dominates.


Off-topic story time. So, Midnight Commander has a nice feature where you hit a hotkey to drop into a full-screen shell of the tab which is currently selected. Back in the early 2000s it had some sort of bug where it would sometimes drop you into a shell where PWD was from the _opposite_ tab.

And of course, the one time this bit me was when I issued ‘rm -rf *’, suddenly realized the command was taking waaaay too long, ctrl-c’ed it, and felt the blood drain from my face as I realized I had just lost 1/4 of my MP3s.

Not a bad first text editor though. Cut my programming teeth with it.


The weird thing is that these are solved problems.

The most impressive, simple piece of software I've tried is a search tool called Everything.

I thought search was just hard and slow. Everything indexes every drive in seconds and searches instantly. I imagine it must be used by law enforcement to deal with security by obscurity.


I agree, I think there are a few things in the Windows file explorer that conspire against its good performance (file preview is a big factor, but recycle bin content seems to affect it too), and it does seem to get worse over time. I think there's a market now for a third party 'back to basics' explorer.


Classic Shell, but Windows 10 updates sort of broke it


7z file explorer works well for me. Now I am mostly using Linux though.


The productivity boosters for me since 1983 aren't so much speed; they're:

1. A large high-res screen so I can see lots of context

2. Lots of disk space

3. Online documentation available

4. Protected mode operating system

5. Github

6. Collaboration with people all over the world

The productivity destroyers:

1. social media


can't comment about 4 as it was before my time doing useful work on a computer, but everything else sounds right.

> The productivity destroyers:
> 1. social media

stares at HN page


> can't comment about 4

Having a real mode operating system (DOS) meant that an errant pointer in your program could (and often did) crash the operating system, requiring a reboot. Worse, it could scramble your hard disk.

My usual practice was to immediately reboot on a system crash. This, of course, made for slow development.

With the advent of protected mode (DOS extenders, OS/2), this ceased being a problem. It greatly sped up development, and protected mode found many more bugs more quickly than overt crashes did - and with a protected mode debugger it even told you where the bug was!

I shifted all development to protected mode systems, and only as the last step would port the program to DOS.


> The productivity destroyers:

> 1. social media

2. Project Managers


Also modern programming languages, right Walter? ;-P


7. The Internet itself


Apparently I use my computer differently than a lot of commenters. Because when I dust off my 1983 Apple IIe it gets REALLY slow when I try to have 50 open browser tabs, edit video, and run a few virtual machines.


Yet if you check how fast it renders a character to the screen it will almost certainly be faster.

We've made trade-offs in the computer space; input latency and rendering of the screen (also in terms of latency) have suffered strongly at the hands of throughput and agnosticism in protocols (USB et al.).


The latency issues are dealt with, but you have to accept the RGB LEDs that come with gaming things.


Not really. Here's what a '70s/'80s PC and OS had to do to print a single character in response to user input (simplified):

Poll the keyboard matrix for a key press. Convert the key press coordinate to ASCII. Read the location of the cursor. Write one byte to RAM. Results will be visible next screen refresh.

A modern PC and OS would do something more like this:

The keyboard's microcontroller will poll the keyboard matrix for a key press. Convert the key press location to a event code. Signal to the host USB controller that you have data to send. Wait for the host to accept. Transfer the data to the USB host. Have the USB controller validate that the data was correctly received. Begin DMA from the USB controller to a RAM buffer. Wait the RAM to be ready. Transfer the data to RAM. Raise an interrupt to the CPU. Wait for the CPU to receive the interrupt. Task switch to the interrupt handler. Decode the USB packet. Pass it to the USB keyboard driver. Convert the USB keyboard event to an OS event. Determine what processes get to see the key press event. Add the event to the process's event queue. Task switch to the process. Read the new event. Filter the key press through all the libraries wrapping the OS's native event system. Read the location of the cursor. Ask the toolkit library to draw a character. Tell the windowing system to draw a character. Figure out what font glyph corresponds to that character. See if it's been cached, rasterize glyph if it's not. Draw the character to the window texture. Signal to the compositor that a region of the screen needs to be redrawn. Create a GPU command list. Have GPU execute command list. Page flip. Results will be visible next screen refresh.

I could drag this out longer and go into more detail, but I don't really feel like it.

I'm sure people who actually work on implementing these things can find inaccuracies with this, but it should give an idea how much more work and handshaking between components is being done now than in the 70's/80's. Switching to gaming hardware isn't enough to get down to ye olde latencies.


We have 1000x faster machines to handle that. Notepad is perfectly responsive, but most apps do crazy side work that interferes with typing.


Except we don't, and this really is faster on several older machines: https://danluu.com/input-lag/


That’s not true, polling of usb keyboards, multi-process scheduling and rendering through the various translation layers adds latency, quite a lot actually. You really notice it if you type on a C64 today. The machine really feels instant. Obviously it’s too anemic to do real work on and that’s kind of my point. We traded a lot of latency for a lot of throughput in other areas.


I know, there's input lag, USB driver fiddling, kernel queues, font processing, kerning, glyph harfbuzzing or whatever, blitting, compositing, rendering, FreeSync/G-Sync, and then waiting for the liquid crystals to deform and so on.

Yet there is 1000 Hz USB polling, and optimizations for all of the above, and end-to-end lag might soon become a solved problem, and then if we're lucky it'll percolate down to consumer stuff eventually.


I think it comes down to the fact that GUIs _sell_. GUIs have visibility and appeal, they are something users can actually see, and have opinions about (right or wrong). GUIs are the ultimate bikeshed, and for many users, the lipstick IS the pig.

----

Anecdote: I can't count the number of times I have seen a team change a color, update a logo, or move an image a few pixels, and it resulted in happy clients/customers and managers sending a congratulatory company-wide email, while teams solving difficult engineering problems may have garnered a quiet pat on the back, if they were lucky.


> I think it comes down to the fact that GUIs _sell_.

IMO not just that, but also that the sale happens very early - before people get a chance to discover the UI is garbage. What's worse, in work context, a lot - probably most - software is bought by people other than the end users. Which means the UI can be (and often is) a total dumpster fire, but it'll win on the market as long as it appeals to the sensibilities of the people doing the purchase.


I just remember doing CAD in the 1980s. If you want to talk slow - as in getting coffee while the computer redraws the screen slow. Some time around 1993-4 the hardware was suddenly fast enough.

I think at this point we're trading performance for a bunch of ultimately worthless bling bling. No one's added a damn thing that improves user productivity.


> changing a color, updating a logo, or moving an image a few pixels, resulted in happy clients/customers, and managers sending a congratulatory company wide email. While teams solving difficult engineering problems may have garnered a quiet pat on the back, if they were lucky.

That sounds like an extremely unhealthy business environment. It'll also leave you with just the worst engineers who cannot find a better job. A company that doesn't do this should be able to run circles around one that does.


And he's not even talking about software bloat. The word processor I have on my early-'90s PowerBook is more responsive, and generally faster to use, than my current one running on a Core 2 Duo processor. Oh, and by the way, I was once complaining about this to a friend who's in IT, and he told me that the speed at which software runs doesn't mean anything regarding its quality. What I mean is, I was telling him how bad some new software was because it was quite a bit slower than software 10 years older which did the same thing, and he told me that, in software engineering, speed is never a measure of a program's quality. Is this universally accepted? Speed and responsiveness are not taken into account? I always meant to ask other people in this field, but always forgot.


> Oh, and by the way, I was once complaining about this to a friend who's in IT, and he told me how the speed in which a software runs doesn't mean anything regarding its quality.

Your friend is wrong. It's an imperfect proxy, but looking at programs that do work, speed is a good proxy for quality, because speed means someone gives a damn. There are good programs that are slow, but bad programs all tend to be bloated.

Of course "speed" is something to be evaluated in context. In a group of e.g. 3D editors, a more responsive UI suggests a better editor. A more responsive UI in general suggests a better program in general.

> this (speed) is never a measure of a program's quality. Is this universally accepted?

Universally? No. It all depends on who you ask. Companies tend to say speed isn't, but the truth is, a lot of companies today don't care about quality at all - it's not what sells software. If you ask users, you'll get mixed answers, depending on whether the software they use often is slow enough to anger them regularly.


To me (on the internet since '92), speed is 100% a measure of a program's quality. Intensive tasks get a pass (especially if they are pushed to a background queue), but UI responsiveness is definitely a measure of quality for me. Jason Fried has also written about optimizing extensively for UI speed in Basecamp (a quick Google shows an article from 2008). Speaking of: there has also been a lot written about Amazon's discovery that every 100ms of latency cost them 1% of sales from people simply walking away from the "slow" site.

Especially when you’re doing the same task “template” on a day to day basis, even 1 second per input adds up quickly.


More like speed is a quality; whether you care about it or not is a different story.

In many cases I'm happy with simple and slow, as long as it's fast enough.


One reason I'm happy not to be in IT is said bullshit. Maybe that means I am part of the problem, because if every programmer who has a problem with that leaves, you're left with just the programmers like your friend who don't see anything wrong with this.


>how many times have i typed in a search box, seen what i wanted pop up as i was typing, go to click on it, then have it disappear

Regardless of anything else, this is 100% happening to me on a regular basis. And the ironic thing is that I think it is caused by the attempt to speed up getting some results onscreen. But it’s always 500ms behind, so it “catches up” while I’m trying to move the mouse to click on something.
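A common mitigation for that race (just a sketch, with a hypothetical /suggest endpoint) is to tag each request and drop any response that arrives after a newer keystroke, so stale results can never repaint the list under the cursor:

    let latestRequest = 0;

    async function autocomplete(query: string): Promise<string[] | null> {
      const requestId = ++latestRequest;
      const response = await fetch(`/suggest?q=${encodeURIComponent(query)}`);
      const suggestions: string[] = await response.json();
      // A newer keystroke has superseded this request: discard the result
      // instead of replacing the list the user is already aiming at.
      return requestId === latestRequest ? suggestions : null;
    }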


Firefox is notoriously bad at this - type a bookmark name into the URL bar, move the mouse onto the bookmark, search results come in, and you click something you didn't want. Gets me all the time.


I despise the web.

HTML was designed for static documents; it boggles my mind that things like Node.js were created. It's not a secret.

HTML techs can't even run efficiently on a cheap smartphone, which is the reason apps are needed for smartphones to be usable.

Every time I'm talking to someone for job offers, I state that I want to avoid web techs. No js, no web frameworks. I prefer industrial computing, to build things that are useful. I don't want to make another interface that will get thrown away for whatever reason.

Today the computing industry has completely migrated towards making user interfaces, UX things, fancy shiny smoothy scrolly whatnots, just to employ people who can't write SQL. Companies only want to sell attention. This is exactly what the economy of attention is about.

All I dream about is some OS, desktop or mobile, that lets the user write scripts directly. It's time you encourage users to write code. It's not that hard.


>It's time you encourage users to write code. It's not that hard.

It is. Try teaching coding to someone non-technical, especially someone who doesn't want to learn, and by the time you get them to understand what a variable is, you will fully understand that coding is not for everyone.


This.

What he suggested wasn't viable in terms of productivity either. One may be a programmer but not want to spend time administering the insignificant parts of the system.

I never understood the culture of elitism around system micro-administration in the hobbyist crowd.


> HTML techs can't even run efficiently on a cheap smartphone

That's largely the fault of ads. Some well placed JS stuff is lightning fast, even on mobile.


Fortunately we now have webasm, which will allow developers to write customized web browsers that will recreate the same HTML/Javascript/DOM web runtime environment that we have now, only somewhat less efficiently.


I agree with your general point, but:

> I despise the web.

The web used to be great. I think you're despising something else.

> HTML was designed for static documents, it boggles my mind that things like nodejs were created. It's not a secret.

Someone already told me we were heading this way in 2005, using JS to write apps inside web pages. It boggled my mind then, but it hasn't really boggled since. My main worry (and sadness) was that JS was such an utterly shitty programming language back then. It was something you loved to hate, writing JS functionality for web pages was almost like a boast; look at the trick I can make this document do by abusing this weird little scripting language.

But that has changed, oh boy has it changed. Almost all the warts in JS can be easily avoided today. With the addition of the `class` keyword (standardizing the already possible but hacky class-like constructs), the arrow functions, and the extreme performance increase in current engines, it's actually become one of my favourite languages to code for. But don't worry I don't use it to write bloated web apps :)

> HTML techs can't even run efficiently on a cheap smartphone, which is the reason apps are needed for smartphones to be usable.

That's not the reason why apps are "needed". It's simply because it allows for more spying on the user. A website can only do it when it's loaded, an app can do it all the time, periodically, on boot, or whenever. They get a neat device-global tracking ID (and more than enough fingerprinting info, just in case), which makes tracking super easy and the advertisers happy. They don't have to do anything with that cookie permission banner EU law, because, well yeah apparently the EC didn't realize that apps are being used to do everything the advertisers want but the websites can't. Cookies are child's play compared to the trackers they can insert at elbow depth.

And the few apps which are actually native apps because web-app performance doesn't suffice tend to actually be about performance, and aren't so bad in the bigger picture.

You see the same thing on the web though, all the bloat and slowness and shit is caused by ads and tracking. Normally we fight against industries that are a net negative on society. Except that ads happen to be equivalent to propaganda, and tracking happens to be equivalent to surveillance, so somehow there's not a lot of push from the powers to get rid of these things, because of how convenient it is this industry just builds the infrastructure for them. They especially like it when the tracking is sent over unsecured HTTP.

That said, there should be sufficient work for a qualified engineer to write code for industrial applications, no? Many web devs can only write PHP/JS framework code. If you know how to program industrial controllers, or have similar qualifications and experience with various industrial systems, I doubt you're going to have to explain to anyone that you're not a web dev ...


I think we can actually blame React and similar frameworks for the issues we see in many modern apps, including the ones mentioned in the article.

Part of the issue stems from the "strong data coupling" that's all the rage. Everything on the page should correlate at any given point in time. Add a character to a search box and the search results should be updated. The practical effect of this is that any single modification could (and often does) rewrite the contents of the entire page.
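A minimal sketch of that coupling (hypothetical names, not any particular site's code): every keystroke updates shared state and the whole results subtree is re-rendered, so the page always reflects the latest query.

    import React, { useState } from "react";

    function Search({ search }: { search: (q: string) => string[] }) {
      const [query, setQuery] = useState("");
      const results = search(query); // recomputed on every keystroke

      return (
        <div>
          <input value={query} onChange={e => setQuery(e.target.value)} />
          <ul>
            {results.map(r => <li key={r}>{r}</li>)}
          </ul>
        </div>
      );
    }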

The other thing the article brings up is that developers and designers often disregard input flow. This may be partly driven by not having sufficiently dynamic tooling (Illustrator can hardly be used to design out flow patterns, for example.)

These two issues have a unifying quality: Websites must be "instagrammable", which is to say look good in single snapshots of time, and the dynamics take a serious back seat.


> any single modification could (and often does) rewrite the contents of the entire page.

I thought the entire point of react was that it _doesn’t_ rewrite the entire page (DOM diffing)?


I said that it rewrites the contents of the entire page, by which I mean ~100% of the things visible to the user may change.


If the devs use SSR and code splitting and whatnot, load time isn't so bad. But yeah, these frameworks get abused because everyone wants to stay competitive and make a decent salary.


Yeah, but React's popularity is because there is demand for, and expectation of, interactive UIs with tons of features that are taken for granted. Users expect a basic amount of interactivity and usability that adds a ton of complexity, which React offers a good model for.


I feel that, for a discussion on a site with the minimalistic design of HN, the demands of that sort of user are sufficiently foreign that you need a citation.

Not because I particularly doubt that expectations on the wider web are different to HN. Just that the crowd here isn't going to be able to easily understand the people out there. Anyone who expects interaction was weeded out long ago.


I think the core point missed by some people on HN and in dev circles in general is that non-tech people don't expect shit. They accept what they're given, mostly unquestionably, because they don't know any better and couldn't know any better. It's not their domain.

To get meaningful answers about what kind of UI people prefer, you'd have to sit a lot of them in front of several different interfaces, show them around, and then let them use those interfaces for a prolonged amount of time, and then - only then - ask which one they prefer. But this almost never happens in the wild, so the market is completely detached from what people want.


Uhh, no. React's popularity is because of lazy devs who only know JavaScript and want to use it everywhere. Large companies want to have one "unified" codebase that runs on all "platforms". It has nothing to do with interactivity or usability, because if it did then developers would write native apps with native UIs with much better interactivity, usability, and performance.


React leverages Functional Reactive Programming concepts, with lifecycle and other conventions that basically make it easy to reason about events and their effects. Every click/scroll will have an effect on the app state and/or a network call, and I actually think a sufficiently complex app would end up with React-like patterns or the alternative, which is much worse: a ton of repeated code and logic.


Uhh, no. React is popular because devs HAVE to use JS & the DOM to do web development in the browser, and they want speed, and to use a library endorsed by a big company. And some of those devs (like me) prefer the unidirectional / functional paradigm.


> and they want speed

They won't get it with React. And I'm referring to both how fast the webapp runs and development speed. It gets too complicated too quickly, even for relatively small sites.


Whether or not it seems complicated depends on if you have nailed down the mental model. It's a bit like Git, in that respect.


Complexity is a very real and objective thing. When you can't 100% guarantee your program won't go into an infinite loop when you change a single line, it's too complex.


I don't know of any infinite loop problems in React. I would think they are less likely than using MVC / JQuery style patterns.


Have you done a lot of work with Hooks and useEffect? React changed the entire component lifecycle with the introduction of hooks. All sorts of weird things trigger re-renders now, including when a function in the dependency array changes identity. You have to wrap all of those in useCallback.

I should do a thorough writeup of the infinite loop issue in React.
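To make the failure mode concrete, here is a minimal sketch (a hypothetical component, not from any real codebase) of the loop: the effect depends on a function that is recreated on every render, and the effect sets state, so each render schedules another one.

    import React, { useState, useEffect } from "react";

    function ItemList() {
      const [items, setItems] = useState<string[]>([]);

      // Recreated on every render, so its identity changes each time.
      const load = () =>
        fetch("/api/items").then(r => r.json()).then(setItems);

      // The effect depends on `load`, `load` is new every render, and the
      // effect sets state, which triggers another render: an endless cycle
      // of renders and requests.
      useEffect(() => { load(); }, [load]);

      // The usual fix is to give the function a stable identity, e.g.:
      //   const load = useCallback(() => { ... }, []);

      return <ul>{items.map(i => <li key={i}>{i}</li>)}</ul>;
    }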


I've not used hooks yet, but it's on my list to learn. Thanks for the warning though!


You seem to think React is mostly (only?) for building mobile apps.

Most usage of React is for web apps. For most of those, a native UI makes no sense and has massive distribution implications.


If you spent what people paid for a PC in 1983 (literally, without any inflation) you probably wouldn't notice anything being perceptually slower.

Like the first Mac retailed for $2500 US. Go spend $2500 on a PC today, you'll have a great time.

Granted, economies of scale make this kind of a dumb argument. But it has a bit of truth to it. People are just less willing to spend as much on their machines, as well as push much more limited platforms like mobile to their limits. We should definitely deal with that as developers, don't get me wrong - but not having to deal with the optimizations they dealt with 40 years ago doesn't make me unhappy.


Not true.

I have a top-of-the-line Intel processor that's less than 2 years old (launched, not bought). A 970 Evo Pro that's one of the fastest drives around. 32 GB RAM (I don't remember the speed, but it was and is supposed to be super fast).

Explorer takes a second or two to launch. The ducking start menu takes a moment and sometimes causes the entire OS to lock up for a second.

The twitter rant is spot on.

There’s so much of supposed value add BS that the core usage scenarios go to shit.

And this is coming from a Product Manager. :-)

Anyway, the referencing problem is painful. I feel it often. Google Maps or Apple Maps: try to plan a vacation and mark interesting places on it to identify the best location to stay. Yup, gotta use that memory. Well, isn't that one of the rules of UX design - don't make me think?

Regarding OSes: storage has gotten so much faster, and CPUs haven't, that storage drivers and file systems are now the bottleneck. We need fewer layers of abstraction to compensate. The old model of "IO is super slow" is no longer accurate.


Honestly it sounds like your problem is Windows.

I'm writing this on an AMD Phenom II, running Debian and StumpWM, that's over 10 years old. I've upgraded the hard drive to an SSD, and the memory from 8 Gb to 16 Gb (4 Gb DIMMs were very expensive when I first built it) and it's as fast as can be.

My work computer is much newer, has twice as much memory and a newer Intel processor, and I really can't tell the difference except for CPU bound tasks that run for a long time, like compiling large projects.


Have to voice my agreement. Linux is an expensive investment but so very much worth it. Each time my colleagues complain about their computers it is because of Windows. I count myself lucky to have Linux as my only desktop and the skill to maintain it. I run an ancient i5 2500k with 8GB RAM and SSD. All the games I play work fine on Steam Proton. I still have to figure out how in the world Reddit on Firefox manages to completely lock the system up, with looping audio and frozen cursor. Nothing else causes that fault.


> I still have to figure out how in the world Reddit on Firefox manages to completely lock the system up, with looping audio and frozen cursor. Nothing else causes that fault.

Fellow X220 user here... a solution for this exact problem - where the system runs out of memory and then you sit there staring and waiting while it churns around long enough that it can do stuff again - is to run earlyoom [0].

It will kill off the firefox process (or whichever is the main memory hog) early, which is also annoying but less so than having to wait minutes until you can use your computer again.

[0] https://github.com/rfjakob/earlyoom


My previous two laptops came with Windows installed (XP and Win7), which I happily used for a year or two until, despite my best efforts, the whole system got crudded up and slow, and at some point a teensy bit crashy, or maybe I got a virus, and that's when I put Linux on the thing. That easily gave the device a few more years of useful life. (My current laptop also came with Windows, but I just backed up the factory image, wiped it, and installed Linux right away.)

Anyway, while every version[0] of Windows I have used has become inescapably crudded up and slower over time, on Linux, even the old laptop, the only thing that got slower over time was the web browser. Which has mostly to do with webpages becoming heavier.

[0] Actually since Win95, because I can't remember if this also happened on Windows 3.11 and the like.


The new Reddit just does that on every device. Luckily there is still old.reddit.com and i.reddit.com


I don't feel like Linux is an expensive investment for everyone.

I am a first-year CS student. When I got my first laptop recently, I went crazy and installed Debian (I had some prior experience with the command line), and it didn't work very well on the laptop. All DEs except Enlightenment (yeah, I even tried it) had lots of display-related glitches due to the cheap hardware.

Then I moved on and installed Fedora. Nothing to tweak from the CLI. Just changed a few settings from the GUI, and peace of mind even on relatively obscure hardware.

It has been vastly simplified and worth it for anyone in IT / CS related fields.


> memory from 8 Gb to 16 Gb

2 GiB total is not a lot


Very satisfying to read about the new start menu. Every year or two I'll wonder how things are in Windows land, install a fresh copy of Windows 10, open the start menu and wait. Oh, there it is!

This is one of the things that made me ditch Windows when it came out, but I was pretty sure they would have fixed it by now. Now I'm convinced Windows 10 is part of an authoritarian experiment in getting populations to gradually submit to a worse quality of life.


I have been told I have to update my Windows 7 system at work starting next year and I'm already dreading it. Too much software is Windows-only to switch to something else.

Privately it was so easy to ditch. (Still have it on dual-boot, rarely use it and so every 2-3 times it needs to update for long minutes while I wait. Meanwhile Mint updates the kernel during operation while I barely notice at all.)


I heard you can get security updates til 2023 with some registry tinkering, but for your work computer it might not be worth the social consequences.


Actually, a computer from 1983 is still the reigning latency champion. https://danluu.com/input-lag/


That comparison's a bit ridiculous, considering how much more a modern system is doing and making possible. I think all it shows is that latencies under 200ms are widely regarded as acceptable. What latencies are observed if you run an OS of comparable simplicity to the Apple 2e's on a modern machine?


If you ran an OS that was exactly as simple as the Apple 2e, the Apple 2e would still win.

Modern hardware introduces a significant amount of latency; it's important to differentiate throughput from latency. A modern computer would crush a 2e in throughput a million times over, maybe more, but that doesn't mean its pipeline is shorter.


Why? Can nothing be done about it short of literally going back to 90s era hardware?

I see latency as a silent killer, of sorts. For instance, if you introduce a tiny bit of mouse latency, users won't notice the additional latency, but they will sense that their mouse doesn't feel quite as good. Give them a side by side comparison, and I bet most will be able to tell you the mouse with slightly less latency feels better.

This extends to everything. Video games with lower latency appear to have better, smoother controls. Calls with less latency result in smoother, more natural conversations. Touch screens with less latency feel more natural and responsive.

(I only have anecdotal evidence of this, but I am absolutely convinced of it.)


Definitely dig this theory and feel like it might be spot-on, but do we have any linkable evidence?


Evidence? Of how computers work?


Evidence that “If you ran an OS that was exactly as simple as the Apple 2e, the Apple 2e would still win.”...


> I think all it shows is that latencies under 200ms are widely regarded as acceptable.

They are literally not. At all. You're way off. For anyone who cares about latency, you gotta be sub 50ms at least. For anyone doing generic not latency-sensitive work, maybe you can get away with 100ms, but that's stretching it.

200-250ms is the (purposefully built-in) latency with which an autocomplete may appear while typing. Not the latency for a single character or mouse click!

Where do you get 200ms latency anyway? That's a lot


> ... you probably wouldn't notice anything being perceptually slower

I disagree. I have such a PC (64 GB of RAM, Quadro GPU, SSD, etc.) and I absolutely do notice things being slow, even things like Word, Excel, and VS code, let alone resource-intensive professional software.


Personal machine or work machine with antivirus and enterprise spyware?


Work machine. But it's a small company, no dedicated IT staff, no anti-virus, and everyone has admin privileges to their computer.


A more expensive PC does very little to address the latency issues at play here, the problems are very much not lack of processor speed, gpu speed, or even ssd speed (most times).

I know from experience, the most godlike PC you can possibly build does virtually nothing to make common applications less laggy.


Modern day tools such as Slack, VS Code and other Electron & browser based apps do bring a fair amount of lag into day to day work.

The common denominator there is browser tech & I think that will improve with time. And network-delivered services like Google Maps & Wikipedia are best compared to CD and DVD-ROM based services like MapPoint and Encarta, which had their own latency and capacity challenges.

In the meantime, you can still use tools like vim for low-latency typing. And it's kind of interesting to see a Java GUI (IDEA) perform as well as it has (https://pavelfatin.com/typing-with-pleasure/).


I get your point, but I don't agree on the anecdotal front. I haven't used fewer than 4c/8t and 16GB of high-speed RAM in probably 5 years - and the only "common" applications that I notice going slow on me (except on the occasions I'm booting older/slower hardware) are things like IDEs and absurdly large spreadsheets. Even stuff like Electron apps are snappy to me and I haven't had issues with them (GitKraken, VS Code, and Slack are daily drivers for me).

Browser-based apps are a shitshow though, but I figure that's mostly out of anyone's control. I chalk that up to the browser being fundamentally a poor place for most applications, even ones that are tightly coupled to a server backend.


Problem is, it's perceptual. Go back and use Windows XP: it's a complete nightmare lagfest under any appreciable CPU load compared to Windows 7+ (all UI rendering was done directly by the CPU, no GPU acceleration except for a few minor things).

I bet at the time you barely noticed though.


I will try to explain - going back to XP may not take you back far enough.

I recently read a history of early NT development, and then installed NT4 in a VM to play with, choosing a FAT disk. It is /extremely/ responsive. Much more so than the host OS, Windows 10.

The NT4 and 95 shells were tight code. They were replaced a few years later by the more flexible "Active Desktop". This was less responsive.

In later releases, Windows started to incorporate background features, such as automatic file indexing. File indexing is IO intensive and hammers your CPU cache.

When I was regularly using NT4 (years ago), I had an impression that there was some overhead caused by registry searches. If this was ever a thing, improvements in raw computing power have conquered it.

If anyone else wants to try, NT4 and VC++ cost me next to nothing on amazon. For a good editor, get microemacs. Python2.3 works. (Don't let it near an open network.)


I do recall being a luddite in upgrading from Office 2003 to 2010 (rip, '07 on Vista) and rued the day that it became permanent. It did get better though.


You'd think so, but yet here we are. Even on a modern $3,000 machine it takes 5+ seconds to open Photoshop.


That's more a statement on Adobe's quality of engineering than computing. CC is awful software and I hope their engineers are embarrassed by its deployment for the fantastic platform that is their creative tooling.


Right, that's that point. Why does Photoshop take so long to start up and be usable even on incredibly powerful hardware?

It's hard to find an excuse, considering:

- Adobe has vast resources

- Photoshop is a mature piece of software

- It's image editing, not a complex video game (look at what something like Red Dead Redemption 2 can accomplish with every frame, @ 60FPS)


I would argue that it’s because of vast resources.

Both Adobe’s, and their customers.

At a certain level, when a graphic designer complains that Photoshop is too slow, they don’t push back against Adobe for optimizing poorly, they just buy a new computer.


The whole point of that rant, and of the supporting comments here, is that almost all software today is of such "quality of engineering".


Most user facing software is more like adobe's quality than unlike adobe's quality.


Most software surprises me with how much time it needs to start. Games are particularly slow, I guess because they need to load the entire engine and all assets before they can even show you a game menu, but even very simple applications can be slow.

On the laptop I'm typing this on, Windows Explorer often takes several seconds to open.


Games tend to be optimized for framerates so it makes sense to sacrifice load times for that. Of course there are plenty of games that are just badly optimized and could improve both framerates and load times.


Games on old machines also used to have loading screens, remember? They actually sometimes took quite a bit longer than, say, starting PS on a modern machine does. I don't think games are a very useful comparison in this context; it's more about utility and application software.


> "People are just less willing to spend as much on their machines"

And why should they? Today's smartphones are much more powerful than the most powerful supercomputer of 1983. Computers have been powerful enough for most practical purposes for years, which means most people select on price rather than power. And then a new OS or website comes along and decides you've got plenty of power to waste on unnecessary nonsense.


The first Mac was 3.5-inch-disk based, IIRC. I remember test driving it and being kind of shocked at that price, since it felt slower than my Commodore 64 with a hard drive (the tape drive was so slow, but cheap!) or my next computer, an Atari ST with a hard drive. Of course, the disk access/read/write speed was the dominating speed factor.


> People are just less willing to spend as much on their machines,

Please stop blaming the consumers, they have very little freedom of choice.

> as well as push much more limited platforms like mobile to their limits.

I don't think anyone has really pushed any recent smartphone to their limits. I haven't checked if any demoparty maybe had a smartphone compo, but if they didn't, then yeah nobody has really tried.

The C64, Amiga and early x86 PCs have been pushed to their limits though, squeezing out every drop of performance. And there still exist C64 scene weirdos that work to make these machines perform the unimaginable.

Smartphones haven't been around long enough, and have been continuously replaced by slightly better versions, so really nobody has had time to find out what those machines are capable of.

> but not having to deal with the optimizations they dealt with 40 years ago doesn't make me unhappy.

I used to have to deal with such optimizations and I totally get that. It's freeing and I occasionally have to remind myself what it means that I don't have to worry about using a megabyte more memory because machines have gigabytes. Except that a megabyte is pretty huge if you know how to use it.

But not having to deal with the optimizations also means that new developers never learn these optimizations and they will be forgotten. And that's bad. Because there's still a place for these optimizations: like 95% of the code doesn't matter, but for that 5% of performance-critical stuff ... if all you ever learned is the framework, then you're stuck and your app's gonna suck.

It's kinda weird to optimize code nowadays though, at least if you're writing JS. It's not like optimizing C or machine code at all. If you're not measuring performance, it's 99% certain you'll waste time optimizing the wrong thing. Sometimes it feels like I'm blindly trying variations on my inner loop, because there is often little rhyme or reason to what performs better (through the JIT). Tip for anyone in this situation: disable the anti-fingerprinting setting in your browser, which fuzzes the timing functions. It makes a huge difference for the accuracy and repeatability of your performance measurements. Install Chromium and only use it for that, if you worry about the security.
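For what it's worth, a minimal sketch of the measure-first approach (the loop variants here are just placeholders): time each variant with performance.now() over many iterations and compare, rather than guessing what the JIT will like.

    const data = Array.from({ length: 100_000 }, (_, i) => i);

    function timeIt(label: string, fn: () => void, iterations = 200): void {
      const start = performance.now();
      for (let i = 0; i < iterations; i++) fn();
      const elapsed = performance.now() - start;
      console.log(`${label}: ${(elapsed / iterations).toFixed(3)} ms/iteration`);
    }

    // Two variants of the same inner loop; swap in whatever you are tuning.
    const sumFor = (xs: number[]) => { let s = 0; for (const x of xs) s += x; return s; };
    const sumReduce = (xs: number[]) => xs.reduce((s, x) => s + x, 0);

    timeIt("for loop", () => { sumFor(data); });
    timeIt("reduce", () => { sumReduce(data); });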


I seriously doubt there is a huge difference in how fast I can access files, scan memory, or iterate through a loop, which is what has a huge impact on perceptual latency.


NVMe drives (versus SATA) will drastically improve file access times. You will find these on newer, pricier machines, but if your mobo has a slot, use it, because the drives are fairly cheap.

Going from low clocked memory to high clocked memory can cost a bit of money (last I looked, it was like a 30-50% premium going from 2666 to 3200 to 3600MHz). As well, if you're comfortable, tightening the CAS timings on your memory can see noticeable improvement in memory bound applications. I personally have measured a 25% performance increase once my memory profile for 3200 was set correctly (mostly a Ryzen thing) and just upgraded to 3600 and haven't tested, but in my larger projects with tons of in-memory code I'm noticing improvements.

Iterating over a loop can be a world of difference depending on what is happening in the loop and what vector instructions your CPU supports, and how well it is supported. As well as your CPU's clock, L1/L2 cache sizes... basically everything.


I have used a computer with NVMe, higher clock speed, better caches, more RAM…the works (but the computers I’ve personally owned only have had some of those ;)). They’re faster, yes, but fractionally for short latencies. Typing a character on one and waiting for it to show up is not significantly better on the other.


I miss how tight my Amiga used to feel, and in the electronic music sector some people still prefer dedicated hardware rather than PC/Mac for these types of reasons. Click a button and the device responds instantly. When you're doing something creative the aesthetic quality of your tools can genuinely affect the output, and a tight response just feels right, not to mention how fast your workflow can become over time using dedicated hardware that responds instantly.


This is one reason I prefer simple guitar pedals to VSTs and such. With their instant response and tight controls, pedals feel like solid tools you can build muscle memory on. When unzipping plugins and awkwardly dragging sliders with a touchpad, it feels like it's myself who's the tool.


Knob-per-function with no menu dives is what makes an instrument.

I hate software.


There are two problems with interfaces like google maps - and one exacerbates the other.

- it's not bloody obvious how they work: you randomly click on meaningless icons, trying to uncover functionality
- then, just as you get used to it, they change it!

My biggest feature request would be a keystroke to hide all the floating crud that is obscuring my view of the map!


"one of the things that makes me steaming mad is how the entire field of web apps ignores 100% of learned lessons from desktop apps"

But dude, DESIGN. The design. Look at those rounded corners.


Financials are to blame.

Selling a software release is a one-time payment. Selling a support subscription is recurring revenue. And if you make your software horrible enough to use without the support subscription, it is automatically immune to piracy.

As a practical example, I don't know anyone who uses the free open source WildFly release. Instead, everyone purchases JBoss with support. It's widely known that you just need the paid support if you want your company to be online more than half of the day. And as if they knew what pain they would be causing, their microservice deployment approach was named "Thorntail".


Anecdotally, we used WildFly with great success on a project or two.


Most of the arguments mentioned in the article are just bias [https://en.wikipedia.org/wiki/Rosy_retrospection] and personal preference.

Remember when software was stored on floppies? It took a while to load. And every application came with different behavior and key bindings.


No this has actually been measured by people.

https://danluu.com/input-lag/

The computers are faster, can do more stuff, and monitors have higher frame rates. But for many applications that aren't games, latency and non-responsive UIs are a growing problem.


I remember being able to type faster than the machine could keep up on an old Mac (maybe a Mac Plus?).

I couldn't type up handwritten notes reliably, because half a page in, I would fill up the buffer and characters would get dropped.


If you want to relive that experience, just use voice.google.com.


Or the desktop Outlook client.


<cough> Android Studio. I'm pretty sure code completion used to be way faster and less intrusive (pop up with suggestions comes up when you're finished typing the keyboard, decides to pick a random thing, erases your keyword).


Ouch. "finished typing the keyword" not "keyboard".

I don't think I could have typoed this, there must be a spell checker somewhere that I haven't disabled...


That input lag is real, but it is not the argument the article is presenting.

The article argues that the keyboard is better interaction hardware than the mouse, that Google Maps doesn't work exactly as he wanted, that there are popups everywhere, etc.


Last night I downloaded an app update on my handheld computing device (a phone). It took around 30 seconds to download and install the 100 MB update, on an internet connection I can use pretty much anywhere in Europe for £10/mo.

15 years ago I would have been waiting 20 minutes for a single song to download on a hard wired PC.


I've been trying to explain for years that for the past 4 decades the hardware guys have been surfing Moore's law and the software guys have been pissing it away ....

Well, Moore's law is falling by the wayside. If they want to start doing more with less, the software guys are going to have to stop using interpreted languages, GC, passing data as JSON rather than as binary - all that overhead that's de rigueur but that doesn't directly go to getting the job done.


This is widely joked about, there's even a Wikipedia article: https://en.wikipedia.org/wiki/Andy_and_Bill%27s_law

"What Andy giveth, Bill taketh away"


I mostly disagree with the conclusion. GC can be very fast these days. Serialization to JSON does not really take much time. Granted, some scripting languages are terribly slow.

But the main problem seems to be a lack of a clear architecture in many systems. These systems have often accumulated so much technical debt that nobody understands why they are slow. Profiling and optimization might remove the worst offenders but usually don't improve the architecture.

Basically, in the software industry, we use the hardware gains to cover up our organizational deficits.


I don't think interpreted languages or GC are inherently the issue you think they are.

You can write seven layers of lagging crap in C if you like.


Is json really that difficult to deal with?


JSON provides a lot of flexibility: it's human readable, it has explicitly-named/optional/variable-length fields, it provides nesting ... all of this comes with a cost:

- extra branching in the parsing code (the parser cannot predict anymore what the next field will be, they could be in any order)

- extra memory allocations, decreased memory locality (due to variable-length/optional fields, and also the tree-like structure).

So if your data consists of a single object composed of a fixed set of little-endian integer fields, you're comparing the above costs to the cost of a single fread call* with no memory allocations.

* many other data formats provide similar flexibility, text-based ones (XML) and also binary-based ones (IFF, protobuf, ISOBMFF, etc.)

Don't write it as such though; you must write the endianness-decoding code (which the optimizer should discard anyway on little-endian architectures - e.g. LLVM does).
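As a rough illustration of the trade-off (field names and layout are made up): a fixed little-endian binary record can be decoded with a few DataView reads and no branching on field names, while the JSON path has to scan text, branch on every token, and allocate intermediate strings and objects.

    interface Sample { timestamp: number; sensorId: number; value: number; }

    // Fixed 12-byte layout: u32 timestamp, u32 sensorId, u32 value, little-endian.
    function decodeBinary(buf: ArrayBuffer): Sample {
      const view = new DataView(buf);
      return {
        timestamp: view.getUint32(0, true),
        sensorId: view.getUint32(4, true),
        value: view.getUint32(8, true),
      };
    }

    // The flexible path: any field order, optional fields, human readable -
    // and correspondingly more parsing work and allocation.
    function decodeJson(text: string): Sample {
      return JSON.parse(text) as Sample;
    }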



Unfortunately, twitter threads force the user to dumb down their main points to a single, compressed sentence. It's a shame, since I like to read well-thought-out articles.

Twitter takes that away because it offers a UX that makes publishing your random ideas too easy. People with low self-control will create threads like this. With hundreds of likes comes self-validation, so they keep doing it.


In 1983 virtually everything was text based. Since moving to graphical user interfaces, a great deal of effort has gone into more visually stimulating UI, such as animations and better fonts. Not all of this should be counted as progress/innovation. We have wasted much of the HW performance we gained over the years on baubles and trinkets :)


That's what you get for catering to people who don't care for your work one bit.

The same people who are telling me that their computers are slow are the same people who need a flashy animated button for every single action and the same people who refuse to understand that passwords are not just a formality.

To each his own.


Reminds me of this article from 12 years ago: https://hubpages.com/technology/_86_Mac_Plus_Vs_07_AMD_DualC...

Computers have gotten much faster in terms of raw speed and throughput, yet that hasn't translated into much of an improvement in basic UI interactions and general functioning.


That keyboard-centric design for GUIs (I clearly have never taken a design class) is what makes Reddit Enhancement Suite such an effective product, in my opinion. HN's interface is possibly just as effective in that it discourages me from taking too many actions; I can vote on basically every reddit comment I read but using a mouse to do it on HN represents such a massive barrier compared to keyboard navigation.


It is NOT just Web apps

I'm here trying to write a several-page document with minimal formatting in LibreOffice, using a high-spec CAD laptop/workstation -- and every damn keystroke is laggy!

My muscle-memory arrow-keys & quick moving around to edit portions of sentences, merge/split lines -- all rendered useless - because I need to wait for the cursor to catch up.

A part of my mind keeps wandering off to whether I should go setup another old DOS box and load XYWrite - which was feature-rich, always lightning-fast and never laggy and worked great. Of course, the lack of printer drivers...

In every area, the software developers just squander more of the processing power than the incredible continuing hardware advances provide.

Anyone have any advice on software that at least attempts to work closer to the metal and lets us see the performance that we should see from modern hardware (for all values of modern)?

I got out of the software industry because of this trend to building on multiple layers of squishy software, instead of requiring efficiency - this framework compiles to that pcode, talking to the other API, which gets thunked down to the other ..... where the h*!? is the hardware that does some actual computing?

It seems like it happened for the same reason that high-fire-rate rifles took over military use -- because they figured out that most troops couldn't actually shoot straight, so it worked better to just let them spray bullets in the general direction than to require & teach real skills.

Similarly, this whole morass seems designed to make it easy for mediocre programmers, and programmers that learn something more serious like Haskell are considered exceptional, while the bulk of stuff is written for


Discussed at the time (2017 that is—not 1983): https://news.ycombinator.com/item?id=15643663


When we plugged in our Acorn Electron (it had no power switch), it would be on and ready to use immediately (unless it hung). Nowadays it takes a minute for my work laptop to boot. When I'm lucky.

I also totally agree with his complaint about things disappearing when you try to click on them. Most of his rant is just about how crap Google Maps is, though.


I can say that I have yet to see ANY web app, that is fast and snappy. ANY. At some point, they all have problems.

Even many frameworks for iPhone and Android that are essentially web apps are terrible, and make every app slow and miss clicks. On the latest and most powerful iPhone, no less.

If you are creating a product as a web app only, you are telling me you do not care about UI enough.

Programs have never been free of bugs or issues, but we never had the situation where every app, even though it technically works, is either sluggish, breaks somehow, or requires the user to learn intricate timings to use a simple UI.

Developers took the easy way out and used these frameworks because it's simple and, they feel, good enough.

And here we are.


Sadly it's not the "easy" way. It's the "cheap" way. So we're most likely stuck with the JavaScript garbage.


Casey Muratori and Jonathan Blow have been bitching about this for many years now, and they're absolutely right. But it's not just UI latency - it's everything. Modern software is, technically and morally, a five-alarm tire fire.

The profession needs to actively fight Moore's law in order to keep our jobs relevant, and find more "work" to do - most of which is not only poorly engineered, but culturally destructive.

If you care about this, there's a tiny community of developers that actually care about reversing it: https://handmade.network/


That's also how I remember the XT (~8086) I got in the 90s. When I typed or clicked, the response was instant. But that was about it; sometimes the HDD didn't work for weeks because I forgot to properly "park" it before turning the computer off, so I had to boot from floppy disk. And that was slooooow... Probably that's one core difference from today's computing: batteries are usually included, systems can run for months or even years without maintenance, and they crash much more seldom. In the past, people often had to wait for the computer to work again...


danluu has some articles on this topic

Computer latency: 1977-2017 (https://danluu.com/input-lag/)

https://news.ycombinator.com/item?id=16001407


I blame two things.

1. The "release early, iterate often" culture - as it encourages half-assed software to flood the marketplace.

2. Poor or non-existent incentives for proper code maintenance.


> The "release early, iterate often" culture - as it encourages half-assed software to flood the marketplace.

That is blaming the wrong aspect of agile development. Software should fail early. It is wasteful to go through a lengthy process only to find out that the final product is incompatible with the market, or that someone else has already made a tool that dominates the niche.

The problem, most of the time, is not realising that you are actually selling a prototype, so what should be a proof of concept ends up in production.


(2017)

Google Maps has improved the primary complaint here. You can now search along your route.


Not on the desktop...


It is true. A lot of poor program design, and other stuff, results in slowness.

They say: type two words and push F3. Well, you could implement a telnet (or SSH) service which provides such a program.

Or, maybe better, something I have thought of: an "SQL Remote Virtual Table Protocol". You could access remote data using local SQL, allowing you to cross-reference data, both within the same data source and across different ones.

Of course, there is still going to be network latency regardless of what you do. But many local programs are still slow (as many comments mention), also due to doing too many things, I think. (Network latency may make it a bit slow even if you use telnet to implement the old interface now, but not as slow as HTML, which is just bad for this kind of thing.)

Modern user interfaces, I think, are also bad, and they make things slow.

I hate touch screens, and I hate the mouse slightly less. Command buttons and toolbar icons are bad and the keyboard is better, I think. There are some uses for the mouse, but it is way overused.


The guy's rant is a pain to read, but he's mostly right.

Don't get all defensive: take this as a boatload of opportunities to make things better.

I woke up the other day to find my mouse broken, and believe me, on macOS it's very hard to do anything without the mouse. I had to look up all sorts of crap from my phone just to find out how to reboot the thing.


> it's very hard to do anything without the mouse

Is it? I tried your example on Windows and I could shut off the computer easily (alt-f4), then I opened up a browser (Windows key, type chrome), navigated to your post, wrote this message, and logged in without touching the mouse. I've found that you can navigate most websites without a mouse, as you can just move to links by using the browser search and then click them with ctrl-enter.

Edit: I even managed to go back and edit this message without touching the mouse.


Yes, Windows is a lot more keyboard-friendly than Apple. The Windows key / Start menu is definitely missing on the Mac.


Command-spacebar is probably what you’re looking for

And there are many ways to restart the Mac using shortcuts: https://support.apple.com/en-us/HT201236


Then why do people like mac computers? I never liked them and I grew up with them.


Consistency, better design, fashion, take your pick.


> twitter.com

I found it funny that this appeared on Twitter, a website which always slows down my browser, especially in VMs.


Some things that are drastically faster: email, copying files, booting a disk OS and/or waking instantly from sleep (though some 1983 laptops were instant-on), printing, GUIs (try using a Lisa from 1983), any kind of complex computer graphics, software downloads and installation.


There is going to be a major C64 revival when the average Joe realizes we already hit the peak with the last human invention:

The best-selling computer in history is still alive:

New cases: https://shop.pixelwizard.eu/en/commodore-c64/cases/90/c64c-c...

New keycaps: https://www.indiegogo.com/projects/keycaps-for-your-commodor...

New software: https://csdb.dk/

It's happening, boys: back to the future!


Including reading essays, now written half-paragraph by half-paragraph in Twitter threads?


Exactly right. This person wants me to read his thoughts on UX but couldn't be bothered to make it less work for me, the user, to read them?


Boot times are way faster.


Maybe boot times are faster than 1996, but not 1983.


I don’t understand why you are being downvoted. Flip the switch and the BASIC prompt was essentially instantaneous.


Boot times are faster than they were in 1986. I feel comfortable assuming they're also faster than they were in 1983.

SSDs are fast.


> "SSDs are fast."

ROM is faster. That's what microcomputers booted off in 1983.


A powered-off Apple 2 or C64 could be operational in seconds. We're nowhere close to that now.


A powered-off Dell laptop running Windows 10 is operational in seconds. I know this because that's what I use.

Whereas an IBM PC booting into DOS in 1986 took, sure, seconds, but a lot more seconds. You could read a lot of the messages as they scrolled by during boot.

To get to a BIOS configuration screen now, you need to independently research the key that will bring it up and memorize it. Then you have to frantically mash it during the whole very brief boot process, because there's only a split second during which it will actually work. It used to just be a boot message. When you saw the message, you had time to hit F12 or whatever.


If it's operational in seconds, it was hibernated or suspended.

Windows now has "fast startup" enabled by default, which effectively logs the user out, kills their apps, and hibernates.

Beware if you dual boot and want to access the Windows files, or if your machine does not handle hibernation well.

An actual cold startup probably takes more like 20-40 seconds.


> If it's operational in seconds it was hibernated or suspended.

This is not true. Are you still using a platter hard drive? (If an SSD, have you looked up benchmarks for it?)

My ~5-year-old laptop used to cold boot Windows 7 in less than 10 seconds (once I'd disabled most autostarting programs, at least). It currently cold boots Ubuntu in ~5 seconds or so; most of that time is spent displaying the UEFI and GRUB splash screens. This is made possible _almost entirely_ by a Samsung Evo SSD; I'm looking forward to getting an M.2 drive when I replace the computer.


Are you pressing the button to start and externally timing the process of arriving at a usable desktop, and have you explicitly disabled fast boot? This isn't a new feature in Windows 10.

Internally, my computer tells me the process takes about 5 seconds from OS start to the graphical environment, but in reality there are several steps. For example, this doesn't account for the time between hitting the power button and the OS itself starting to run, entering the full-disk-encryption password, or unlocking the volume.

I would be surprised if a full restart were actually that short. Maybe not loading a menu or unlocking a volume is enough to explain the difference?


I often hear that hibernation is one of the great macOS features. For my part, I decided years ago that cold booting Windows is much faster than hibernating it.

I won't start counting the seconds and arguing over which OS boots faster, but it is certainly much faster than it was in the nineties. Boot times are certainly one area where modern computers have significantly improved. Everyone who compares to an instant-on 8-bit machine is oversimplifying things. Try booting into, e.g., GEOS on one of those.


That's Windows. On Linux, I get this from a cold boot (on a ThinkPad T495):

    > systemd-analyze 
    Startup finished in 8.878s (firmware) + 1.666s (loader) + 1.592s (kernel) + 3.265s (userspace) = 15.403s 
    graphical.target reached after 3.176s in userspace
It's crazy that the slowest part of the whole process is the firmware stuff.
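
For anyone who wants to see where those seconds actually go, systemd ships tooling for exactly that; a quick sketch using standard systemd-analyze subcommands (output will obviously differ per machine, and the firmware portion is outside systemd's reach):

    # per-unit startup times, slowest first
    systemd-analyze blame
    # the chain of units that gated reaching graphical.target
    systemd-analyze critical-chain graphical.target
    # render the whole boot as an SVG timeline
    systemd-analyze plot > boot.svg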


Is that wall-clock time? I get 13s, but actual wall-clock time is more like 26 seconds from power on to desktop. Around 30 seconds to opening a browser.


I haven't checked wall-clock time; I would have to enable auto-login to eliminate pauses from user interaction.


> I would have to enable auto-login to eliminate pauses from user interaction.

Why? By the time you're logging in, booting is already finished.


To check at what point the system is responsive enough to launch a browser. So, not just the calculated boot times (we have those already), but the time from power on to launching a browser.


While that's not an entirely unreasonable idea, I note that launching a browser has its own startup time. Power on to opening a document in evince is faster than power on to launching a browser, and that's support for the original thesis in a way, but it also feels like a little much to count it against "boot time".


Just tested mine from totally off to login screen, took slightly over 12 seconds.


A powered off Dell laptop is not nearly as fast as a TRS-80 Model 4.


I used to use an Apple 2. A powered-on Apple 2 is practically useless - it doesn't do anything useful until much later.

My Chromebook boots in seconds, with full GUI and everything, and is usable. My Windows desktop boots in seconds and is usable. I'd say anyone claiming the Apple 2 was faster is comparing apples (ahem) to oranges. In no way did the Apple 2 provide a faster user experience for anything compared to modern machines.


That's because it doesn't do anything. We expect many orders of magnitude more functionality from a modern computer than we expected back then.


Remember when OS's had startup bleeps/bloops to entertain us while they booted? Those were the days. Now I'm completely irked waiting for an old 5400 RPM drive to boot Fedora on the occasion I need to do it.


You felt like you really got your money’s worth when that counter slowly ground its way up to “8192KB RAM OK”.


The POST alone for my AT clone took over a minute. I remember watching it count up to 512k.


1992ish?


No, I definitely had upgraded to 1MB by 1992. The AT clone was an old CAD computer my dad's work had gotten rid of in the late 80s.


You need really good firmware and storage to actually see this. Otherwise it’s 25s in firmware and 35s to load the OS. If you have a LOT of RAM, that’ll slow things down as a single thread inits all the pages.


I definitely recall that back in the late 90s, boot times lasting minutes were the norm, not the exception.


Sorry, but it's just a very dumb statement meant to be a hot take. So he has a select few examples that he's choosing as "everything on computers". I have to admit, I didn't start working on computers until 1985 when it took forever to load the program via floppy . . . as long as the drive wasn't drifting out of alignment; then it would either take even longer or it wouldn't load at all . . . so maybe his golden age of computers was just a year.


I think we software developers are so used to doing inherently slow things ("find all" for some function name in a huge codebase, installing packages, compiling assets, starting the iOS simulator, making API calls, waiting for an unnecessary meeting to be over so I can get back to work) that we need a good UX team to tell us when to make things faster. We just don't notice slow things anymore.


The thread started out being about how library computers were faster in the 80s and then derailed into angry Google Maps feature requests.


Not quite the same topic, but I remember someone measuring keyboard to screen input latency, and that has also increased over the years.


Have some empathy! I've seen a trend where rants against the modern web often get upvoted lots on Hacker News. Consider this quote from the article:

> I make no secret of hating the mouse. I think it's a crime. I think it's stifling humanitys progress, a gimmick we can't get over.

Does the world's typical computer user today hate the mouse and prefer a keyboard-only interface (CLI)? No -- in fact, command-line interfaces are less discoverable and harder to use when starting out. Even as a programmer, I struggle to remember the flags to many common command-line utilities.

Sure, the author's example of a cashier's checkout console might be great as a text-only interface -- cashiers use it day-in, day-out, and can learn all the keyboard shortcuts in a day. But what about the self-checkout machines that shoppers use maybe once a week? Would you rather have every person have to learn a list of keyboard commands while navigating a two-color interface?

Does the modern web poorly serve the author, who's good enough with technology to master any UI? Sure!

But the modern web works better for the billions who otherwise would not have started using it in the first place.


> But the modern web works better for the billions who otherwise would not have started using it in the first place

We need to start talking about expected utility.

For software that's used briefly and once in a blue moon, it's perhaps not worth the effort to make the UI particularly ergonomic. Most web pages fall into this category - the random e-commerce shop or pizza delivery service you're using today. It would be nice if the UI wasn't actively user-hostile, but it's not critical.

The problem is with software used regularly, for extended periods of time. Like, during a work day. A very large part of the world's population interacts with software at work. A lot of them sit in front of a small set of programs 8+ hours a day, day in, day out. For example, a word processor + e-mail program + IM + e-commerce platform manager + inventory manager. That software needs to be as ergonomic as possible, otherwise it's literally wasting people's lives (and their employers' money). Such software needs to be keyboard-operable, otherwise it's just making people suffer.

A lot of software falls into this category. If you're doing a startup whose product is meant to be, or even conceivably could be, used in a business, you probably have some full-time users. You probably want those full-time users. If so, then for the love of $deity make it more like that old DOS POS than the hip mobile-on-desktop web garbage. Otherwise you're wasting people's health, money and sanity.


Keyboard-only input is not antithetical to discoverability. You can have a menu system (including top-menu bar) with clear hotkeys displayed.


> command-line interfaces are less discoverable and harder to use, starting out. Even as a programmer, I struggle to remember the flags to many common command line utilities

Not to disagree with the general point you're making, but autocompletion of commands using just the Tab key is how CLIs get discoverability, and it's kind of cool.

Whenever I "don't know", I just type 'cmd -<tab><tab>' and suddenly I am presented with a list of options that I can filter by continuing to type the option I suspect I need, or tab to the one that I see on screen. Then, if that requires an argument, <tab><tab> lets me select, for example, the file that is needed as the argument.
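
That discoverability is the shell's programmable completion at work, usually fed by per-tool completion scripts. A minimal sketch, assuming bash, a hypothetical command `mytool`, and made-up flags:

    # register a completion function for the hypothetical `mytool`
    _mytool_complete() {
        local cur=${COMP_WORDS[COMP_CWORD]}
        # offer only the flags matching what has been typed so far
        COMPREPLY=( $(compgen -W "--help --verbose --output" -- "$cur") )
    }
    complete -F _mytool_complete mytool
    # now `mytool --<tab><tab>` lists the three flags above

Real tools ship much more elaborate completion scripts, which is part of why <tab><tab> can also offer sub-commands and filenames.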


> Whenever I "don't know", I just type 'cmd -<tab><tab>' and suddenly I am presented with a list of options

You assume you already know which `cmd` to type. Most users don't.


I've never seen a self checkout machine with a mouse.


mouse or touchscreen, the point still stands


If we are talking about how intuitive an interface is, I strongly disagree. I remember in the 90s seeing people who had never used a mouse before learn to use one. There's a definite learning curve. A touchscreen with large enough targets and low enough parallax is a different story.


Consider the precursor to current macOS, NeXTSTEP 3.3.

It ran very well on a 33 MHz 68040 with 32 to 64 megabytes of RAM.


A few days back, I installed the last version of Microsoft Encarta (2009) on an old PC (E5200 CPU, 4GB RAM). To my surprise, all the millennials around me were astonished at how fast and smooth that software was, running and searching for information on what is typically a (very) slow machine.


I feel like my 2017 MBP takes longer to log in than my 2009 MBP did. The old one broke, so I can't compare, but I remember wake-up being done by the time I'd opened the lid. On my new one I can sit and wait for the screen to turn on.


So this is mostly about UI. I would agree that many operations take very long, e.g. accessing our web-based info system or waiting until Outlook opens or closes that window. It's strange that these operations don't happen instantly nowadays.


Searching a map of the entire world online is slower now than it was in 1983?


Lots of great points - but it feels slightly ironic that he's decided to divide this into segments of 160 characters and spread the words between the nomenclature of the Twitter UI.


This is a comparison of native applications doing simple things in 1983 vs. web apps in 2017. Not all the world’s a web app, although that does seem to get forgotten here.


Half the people in this thread keep confusing latency and speed. The other half keep explaining it to them. It's a little repetitive here :)


Code running in real mode has less overhead than long mode, but who wants to use an OS that runs in real mode, besides that TempleOS guy?


The performance of Microsoft Word seems to be a universal constant across all known hardware.


I thought about reading this article, but the webpage was taking too long to load


This guy has obviously never had to wait for GI Joe to load on a Commodore 64.


But my computer is at least 1,000 times more valuable in what it can do for me than it was in 1983. Probably even more.

Seems like a pretty good tradeoff to me.

I'll take a slow autocomplete box of all the world's knowledge over a lightning-fast lookup of my local files in a single directory any day.


I wonder whether a computer was more valuable in 1983, running the first version of Lotus 1-2-3, compared to 36 years earlier in 1947, or more valuable now, with everything it can do, compared to 1983. Not sure how to quantify that, but it might be similar.


I didn't have a computer in 1983. I wasn't even born. But Windows 98 felt faster than the Macintosh and even Linux.


Today Linux can be much faster than Windows 10.


Define "everything". A lot of things were way slower in 1983.


This guy never copied floppy disks.


Except the internet, it seems.


Speaking of slow and crappy UIs, what the fuck is up with the trend of "click to read more."

YouTube does this in video descriptions and comments. If I'm scrolled down there to read comments, maybe I want to actually just read them and not click to read them?

Reddit does this. I don't know what reason I have for reading a thread of comments other than reading comments, so why do I have to keep clicking read more?

Twitter evidently does this. If I'm reading a thread, why do I need to click to read more? And after a couple dozen posts, click again. In this case, it also seems to expand the unread posts above the point where you're currently scrolled to, so you have to scroll back up and manually figure out where exactly the last post you read is and where the new stuff begins.

Many shops do this, by cutting product descriptions at a few lines so you can't read what the product is all about without clicking read more.

And they do the same thing with reviews.

I'm really tired of clicking read more over and over again in places where reading is the whole point!


There are three possibilities, although I’ll warn you now that none are great.

1. In an attempt to improve perceived performance on initial load a decision was made not to load all content in at once. In the case of a Twitter thread containing potentially hundreds of items that’s reasonable, for product descriptions less so.

2. The widget being used to display a product description on the product page itself is also used elsewhere on the site, but in a context where space is constrained to fit a grid. They got around that with a “read more” link.

3. Sadly the most likely for product descriptions, in an attempt to determine customer interest an arbitrary cut off was chosen for how much of a description is shown. Metrics are then tracked on which descriptions are expanded, and taken as a proxy for customer interest in those products.


> 1. In an attempt to improve perceived performance on initial load a decision was made not to load all content in at once. In the case of a Twitter thread containing potentially hundreds of items that’s reasonable, for product descriptions less so.

In an age where web pages are several MB in size, bandwidths surpass 100 Mbps, and GPUs alone have 11 GB of RAM, we can't render more text on a screen.

This page right now contains around ~8kB of pure text, and everything is already expanded, as opposed to most other comment sites. I'm aware that formatting, layout, data modelling, messaging, etc. increase that amount, and that's fine. I'm just baffled that it's possible to have a slower experience with perceivably the same amount of brain-data as 20 years ago, but with hardware that is orders of magnitude better.

We shouldn't lose this much to UI fluff.


The bottleneck in this case would be within the YouTube servers that have to fetch the comments. Nothing to do with the UI. Since comments work at scale, it possibly takes quite a lot of resources to load them instantly with the video.


It's not directly related, sure. But it's certainly a second-order effect of the UI choices that have brought us here. They manage to compute (batched, not real-time) personal ads tailored to you, load media for them, push a whole bunch of data from your clicks to further this tailoring, push metrics, grab libs from 40 different locations, etc, etc, etc.

If we wanted to put some work into getting more text to users and improving the reading experience on sites at least partly designed for that purpose, we certainly could.


HN doesn't show every comment if there are too many comments, FYI.


How could it be #1?

Loading text, or anything statically, is faster than many megabytes of JavaScript frameworks.

I think web devs have collectively broken their brains if they think this.


I suspect that for a surprising number of sites it's secretly #3: they track the clicks to keep track of what you're reading on the page.


Try going to a Reddit page from Google nowadays... it shows, like, one comment and then “click to read more”. Oops, you were on the AMP page, so now it’s loaded the exact same page but non-AMP. Click to read more again. Now the long comments are still not shown fully, so you have to click each one to read the whole comment. Now the thread replies are still collapsed, so click those to read the replies. This is absolutely infuriating if you’re in a place with patchy internet. With HN I can load a comment page and read it top to bottom for ten minutes; with Reddit I’m clicking non-stop just to see text on a text-based website!


First rule of reading reddit: edit the url and replace the leading "www" with "old".


I keep an account and remain logged in for the single purpose of using the "Preferences" setting that says I always want the old version. That way I can just use the regular www URLs and still get the old interface.

EDIT: Checking my preferences, there is an option at the bottom that is actually disabled and says the opposite: "Use new Reddit as my default experience". So I guess that's on by default and you have to disable it.


I have that set too but still use https://addons.mozilla.org/de/firefox/addon/old-reddit-redir... to make sure I don't accidentally share a link to that awful new interface.


What a fantastic user experience!


I also love it when there is an ellipsis menu … and once you click it there is just one option in it.

Edit: another of my favorites is a cogwheel icon for the contextual settings, another cogwheel icon on the other side of the screen for the account settings, then a hamburger menu for navigation, then an icon to display all apps (like the numpad icon), and then another menu when I click my profile picture.

Every time I'm looking for a preference it's like it's Easter! :)


Performance and money. You don't want to run the DB, cache, and network operations to fetch the comments for every user, only for those who really want to read them, which is not most of them. At the size of Google, you save millions of dollars with such a trick.


At the cost of a crappy and sluggish user experience, of course


Everything is a matter of compromise. What's important is to make those compromises consciously, which I believe Google did in that case.


Yet they insist on auto-playing the next video when most users will not be interested in that. Serving all comments is much less data than even the initial buffer of the video. The real reason, I think, is that a user reading comments on YouTube isn't watching ads during that time.


Auto-play means more views, which means more ads, which means more money, yes. But I don't think it's the same for comments.

Comments don't have ads. They are a feature that costs money but doesn't bring any in.


This is what mobile-first design gives you. On mobile it might be welcome. On desktop or any big-screen environment, this is just silly.


There have been very few times that I have enjoyed clicking on one of those, even on mobile.


I think it's actually close to what I'd like without getting there. The "click to read more" essentially gives me an overview of the comments. When you click on an individual comment, it expands to give you a more detailed view.

I think what I really want is something more akin to org mode, where it's expanded/contracted by default (configurable) and when I hit a single key, it expands portions to reveal more detail. I often find when I'm in the middle of a thread, I start to think that I'm wasting my time and want to collapse that thread in some way so I can start searching for where I want to reinsert myself. I rarely want to read every single comment on one of these services. Basically, I think the intention is on the right track, but the execution is poor. Getting it right would be tricky, but I hope someone tries and sets a better bar than the one we have.


Gmail even does this on long emails! It’s insanely annoying.


I can't explain Reddit, but for Twitter it's a side-effect of how they decided to create the "simple" linear view.

Tweets are actually structured as a tree similar to Reddit comments, but while Reddit essentially displays a depth-first traversal, Twitter opted for a breadth-first traversal so it can show all the immediate replies. Almost every "show more" is for going one level deeper in that tree.

(With some caveats/custom rules, at least - if a subtree has only 1 reply it'll often be shown inline, and then there are tweet chains like this one.)


Another possibility: this destroys the "reader view" that some browsers have. Since the reader view usually skips ads, it is considered undesirable.

