My late father, who was a software developer by trade before his retirement, had a desktop machine set up running Windows 7 (probably?) that he used for checking email and buying stuff online.
A few years later, when he had been diagnosed with cancer and was on chemo, it updated itself to Windows 10 without his explicit consent. The update completely fucked up the install, and the machine was unusable for him from then on. He was too tired to go through the process of getting it sorted out, and was thus unable to book a vacation he had intended to take to recover from that round of chemo.
Microsoft's unfriendly us-first, customers-second process robbed him of his last holiday and I will not easily forgive them for it.
If a seasoned developer can be robbed of quality of life by this flavour of bullshit, what chance do the non-technical types stand?
This is the type of thing I imagined could happen, but hoped never actually would.
The worst example I had heard until today was an update that caused a friend to lose a good chunk of her dissertation. Windows 10 decided that the final week of writing was an excellent time for an update. She had no option to say "no" and just wanted to use Word uninterrupted.
Software is too important now for this level of user hostility.
this is much more minor than a cancer patient missing his last damn holiday, but I have a device running android tv, and every few days it auto-redownloads “Android TV Home”, which is an advert-infested replacement home screen run by (I assume) Google.
every time, I have to go into the Play Store and uninstall it again. uninstall the home screen! how illogical is that? and how would I have known how to do that if I wasn’t tech-savvy enough to find a reddit post telling me how?
every time, this resets my normal home screen, and I have to set it all back up again, removing the semi-advertising “channels” that are already in the less-bad default home screen.
the device this is on cost more than £500, and yet I’m still paying out in attention because greedy Google wants to please their shareholders
Just sideload an Android app called "NetGuard", a free firewall app for Android (it's only published for phones, but you can still sideload it on the Shield TV). Download the NetGuard APK file and install it on your Shield TV using a 3rd-party file manager app (you can find the NetGuard .apk via a Google search, or use an APK extractor to pull the app from your Android phone after installing it from the Play Store). You might also need a 3rd-party launcher to see sideloaded apps, as they will not appear on the stock Android TV home launcher. Then open NetGuard, disable network access for the "Software Upgrade" app, and switch on the NetGuard toggle. That blocks the "Software Upgrade" system app from reaching the internet, so it cannot download the update and prompt you to install it with an annoying pop-up message.
If you get a software update message and the update has already been downloaded, just go to the app settings of "Software Upgrade" and clear the app's cache and data.

Once you have NetGuard enabled, it should auto-start every time you boot the Shield TV and always run in the background. I suggest you regularly open the NetGuard app and keep it in the background, just in case Android OS puts it to sleep.
Additionally, I suggest you go to Google Play Services and the Google app and disable all the app permissions to access storage or change system settings (though if you use things like Google Assistant or voice search, you may need to leave the microphone permission on). Do the same for any other Nvidia apps you think might try to control or change settings on your Shield TV without your permission.
I'm still on Android 8.0 on my 2017 Shield and I'm happy with it. I don't really need any extra features and nothing needs to be fixed. This is the only method I found to work. I have tried to delete the "Software Upgrade" app package from my computer using Android ADB but I just couldn't do it (maybe Nvidia blocked it). So the firewall method is the only option. I don't know if Nvidia made changes and blocked this method in newer updates, but try it and let me know if it works.
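For anyone who wants to try the same from a computer, the rough adb flow is sketched below; the updater's package name here is a guess on my part, so check the output of pm list packages for the real one:

    # enable network debugging in the Shield's Developer options first
    adb connect 192.168.1.50:5555        # substitute your Shield TV's IP address
    adb install NetGuard.apk             # sideload the NetGuard APK you downloaded

    # the removal attempt that failed for me:
    adb shell pm list packages | grep -i nvidia          # look for the "Software Upgrade" package
    adb shell pm uninstall -k --user 0 com.nvidia.ota    # package name is a guess; this is the step that gets blocked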
Wow. I think your reply demonstrates the crux of the issue perfectly. I can't tell if it's satire or not.
It is insane that this level of workarounds and hacks is required to avoid advertising and crapware on a £500+ _television_. Normal people have almost no hope of a good experience when you consider something like the Lenovo laptop from the OP.

You have the manufacturer-installed adware, Windows 10 + Edge nonsense constantly nagging you, and then there is just the state of the web itself. It's all just too much.
It's one thing to hack for fun or to find new uses for hardware and software. It's another to live in a world where we purchase devices for hundreds of dollars, only to have a bait-and-switch on the software and suddenly we get ads on our splash page and a slower UI.
I'm glad there are workarounds, but they _shouldn't be needed_.
The thing that amazes me is the sheer amount of money that seems to be in advertising. It’s sometimes more profitable to surveil or shoehorn in ads than to sell products.
Could this be somehow related to two decades of cheap money and the bloated corporate budgets it created?
It works; there wouldn't be so much advertising without the gains. Of course, in this process lots of companies and businesses that are amateurs at advertising lose money, and of course big companies try new stuff all the time and lose money too. I'm a marketer and I really feel the ad sector is a disservice to humanity, throwing money into the script-driven money factory when lots of people can't even eat every day.
Samsung has now decided that for me to use the camera on my Android phone, I must give full permission to something called Nearby Devices. Thankfully, I can live without the camera, but for the first time I'm mulling over moving to Apple.
Samsung has fallen far. I switched to a cheap version of the Pixel last time I got a new phone. Much longer OS update cycle, and they essentially always get ROM support first of all the phones. There is certainly better hardware out there, but the Pixel only has Google's adware, not the vendor's and Google's adware. And ROMs with no Google are an option. I think I may consider an iPhone when it gets USB-C, but I would miss having a Fedora install on my phone to do weird stuff. If they gave me a way to get a terminal where I could have rootless Docker or jails on an iPhone, I'd be on it. Bonus for having the USB-C recognize keyboards, thumb drives, and Ethernet adapters.
" For the past five years, Apple has been setting up for what will soon be its most profitable product line. It's not a new phone or computer: it's advertising. "
As some C-suite suit's bonus will depend on that, you can be sure that the various protections will evaporate over the years.
Fearmongering or maybe you just haven't been watching? Apple has already been doing advertising for years now. They make it quite easy to disable any tracking you might be concerned about. There's zero reason to think they'll change that.
Plague or cholera - that's what happens when there is a monopoly and the monopoly buys out or sues every competitor. Google and Apple are laughing all the way to the bank, just like Microsoft did with the PC.
> I suggest that you regularly open the NetGuard app and have it in the background so just incase Android OS puts it to sleep.
Shouldn't happen, since NetGuard works as a 'vpn client' to pass connections through the app, so AFAIK it should reasonably stay working as long as the pseudo-vpn connection is up.
(Not to be confused with actually connecting to a vpn server, which it doesn't do.)
I run Blokada on my phone, which also acts as a pseudo-vpn, and that does have problems with dying randomly and needing to be restarted. Not sure if it's a bug in the software or a general Android issue.
Perhaps Blokada doesn't properly run the background service, with a persistent notification. The notification is necessary since around Android 9 or so—and while NetGuard's ‘vpn’ is indicated in the status bar, it also still has a notification.
Some apps have a setting to disable the notification, but all that achieves is that the app doesn't stay running in the background.
By the way: I semi-randomly landed again on the dontkillmyapp site, and it notes that killing background apps works differently on different phones—presumably first of all between various manufacturers, since they tend to modify Android somewhat deeply: https://dontkillmyapp.com/google
More to the point, the site reminded me that you might want to verify if Blokada is exempted from ‘battery optimizations’ (works via app info —> ‘Battery’ —> ‘Battery optimization’ for me). I checked NetGuard on my phone, and sure enough I have the ‘optimization’ disabled for the app.
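By the way, if it's easier to check from a computer, something like this should dump the list of apps exempted from battery optimization (assuming USB debugging is enabled):

    # prints the Doze/battery-optimization whitelist; Blokada should appear in it
    adb shell dumpsys deviceidle whitelist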
(Let me know if you see this comment, or I'm gonna notify you manually.)
hey, thanks for following up! I checked and blokada is set to "unrestricted battery use", so I'm going to guess it's just buggy (the previous version didn't have this issue)
It's getting harder and harder to turn this stuff off. Firefox wants to talk to its Sync and Pocket servers. I still have an Android phone with F-droid, no Google account, and all Google services turned off, but I don't know if that will work in my next phone.
I really don't think it was that bad. I'd put it in a DIY category with self-install of internet service or hooking up your own car alarm or other such "beneath the surface" technical things non-technical people manage to do for themselves every day by following youtube videos or whatever.
Either way do we need a new post every hour making this same point from yet a different angle?
I am not doubting you, but I'm surprised that it was cheaper. Every time this is discussed here on HN, or on other forums, the prevailing narrative is that it is increasingly difficult to buy a non-smart TV, and that the only option is to look for "industrial display panels" (or some similar designation), which, people claim, are rather expensive.
Note that I am only recounting hearsay, and I haven't done my research on this, as my three PC monitors cover my needs completely, and I have no need for a TV. I'm only "idly curious" about this topic, because one day I may need such a dumb TV.
HN consists of many different people in many markets.
For US folks, yes, you generally have to go for an industrial display panel which is slightly more expensive than a smart TV [whose lower price can be subsidized by the data they're eventually selling].
Maybe the previous poster, who is in the EU, faces a completely different market than the US, one which is more favorable to non-smart TVs.
“Dumb TVs” can still be found in the US, but they tend to be quite low spec, using panels with mediocre at best performance. That’s not a problem for a lot of people but anybody looking for a nicer panel is probably stuck buying a smart TV.
Current Sony TVs offer a “basic TV” mode that disables all the smart stuff though, so they’re a decent option for more discerning buyers looking for a dumb TV.
the reason for the expense being that "industrial display panels" are specifically built to prevent image burn-in, as consumer TVs will retain an image if held on the same image for too long
I have a 'smart tv' (HiSense) - and while it has a bunch of smart stuff that isn't that bad, the best thing is it starts on the input you left it on. So for me it always opens as connected to hdmi1/chromecast, and I never have to see the smart stuff and the tv itself has no internet connection
My 2018 Sony X900F behaves similarly. Have never connected it to the internet, and its near-stock Android TV install doesn’t nag me about that. When I turn it on it starts on the last used input and you don’t see any “smart” UI unless you explicitly summon it with the home button on the TV remote.
I’ve been using it with an Apple TV 4K and it’s been great.
don’t bother with a firestick. firesticks used to be good, but last year they forced an update that did pretty much the same thing as Android TV, but worse. a massive ad-banner across the top of the home screen that you can’t get rid of, with your apps relegated to a tiny ribbon, followed by semi-adverts underneath
at least with android TV it’s sort of, somewhat a choice, for now, but this is flat-out immutable. you used to be able to sideload an alternative home screen, but another forced update removed that option too
buying a firestick is buying adverts for inside your house
Was not expecting the level of animosity for this app. The last update in August apparently forced a ton of ads on the interface, even if people had paid subscriptions that would remove ads.
Absolute trash. In their boundless greed, Google decided to force endlessly cycling ads onto the home screen, and had the temerity to highlight it as a feature. I give two @$#@$# about the content they are highlighting, from a Christmas movie (in July) to the latest teen pablum. Pretty sure I had automatic updates disabled, and that was overwritten without my consent (with no way to roll back to the prior version). I took a chance on the Android ecosystem with this device (NVIDIA shield), and that's the last time I make that mistake. I'll be tossing that out and not looking back...
Unbelievable. They added unremovable ads right on the home screen. The "Staff Picks" takes over the top 30% of your screen advertising random videos from apps that I don't ever want to install. No way to turn it off. How quickly my $2k TV turned from pleasant to ad-infested junk. The only solution I had was to uninstall updates in the app manager. If this is the future of Android TV, I'm certainly not buying another TV with this OS again.
I paid a hefty sum for a top-end TV, then again for the most expensive TV box one can find, and what did I get? Ads! Ads that blink and distract. Ads that take more than a third of the screen, pushing what I want to see to the second page. Ads for movies I could not be less interested in, on services I have absolutely no intention to subscribe to. Ads that have been trying to sell me the exact same three movies in genres I don't care about since they appeared. Evil greedy corporation at its worst.
IANAL but some of this stuff might be class-action-worthy. if you’ve paid to remove ads and they forced them back upon you, surely that’s an infringement of some variety?
I would think this would fall under the "false advertising" laws in any given state:
Minnesota Statute 325F.67 FALSE STATEMENT IN ADVERTISEMENT.
Any person, firm, corporation, or association who, with intent to sell or in anywise dispose of merchandise, securities, service, or anything offered by such person, firm, corporation, or association, directly or indirectly, to the public, for sale or distribution, or with intent to increase the consumption thereof, or to induce the public in any manner to enter into any obligation relating thereto, or to acquire title thereto, or any interest therein, makes, publishes, disseminates, circulates, or places before the public, or causes, directly or indirectly, to be made, published, disseminated, circulated, or placed before the public, in this state, in a newspaper or other publication, or in the form of a book, notice, handbill, poster, bill, label, price tag, circular, pamphlet, program, or letter, or over any radio or television station, or in any other way, an advertisement of any sort regarding merchandise, securities, service, or anything so offered to the public, for use, consumption, purchase, or sale, which advertisement contains any material assertion, representation, or statement of fact which is untrue, deceptive, or misleading, shall, whether or not pecuniary or other specific damage to any person occurs as a direct result thereof, be guilty of a misdemeanor, and any such act is declared to be a public nuisance and may be enjoined as such.
They advertised there would be no ads, then promptly started serving ads while still collecting monthly subscription revenue; I think that falls into this area.
Yeah, I abandoned Android on my TV for this exact reason and now I use the TV as a dumb screen for an external streaming box (also running Android TV, but one that's less locked down and allows me to install my own launcher and control the UI).
I wish dumb TVs with good panels were easier to buy.
I'm getting close too, starting with the cancellation of my YT Premium subscription. All I wanted was to be able to watch my subscriptions in peace on my tv (e.g. bigclivedotcom) but instead I get random breaks (e.g. only the home page shows on the yt app, no access to subscriptions), daily reset of my playback speed preferences, etc. I hope the people who think it's cool to experiment with paying customers get to experience the same for the rest of their lives.
I really have to wonder what the corporate process behind implementing this was. they must have known people wouldn’t like it. they must have known they were literally making their product worse. when these decisions were made, did they feel bad? did they not care? did they delude themselves that people want “recommended content” against their will? did anyone at any point say “hey maybe we shouldn’t do this?”? I bet at least one person who was involved is on hacker news. maybe they’ll see this and can comment. or maybe someone who’s been in a similar position at a company
> did they delude themselves that people want “recommended content” against their will?
Probably. In fact, I think you're underestimating the degree to which they can.
Very few evil people actually think they're evil. Most think they're legitimately doing good and helpful things, while being extremely misguided as to what is good and helpful.
I have given up the fight. I have pihole. It blocks ads. I can see the broken ad square on my sony android tv/google chrome. I disabled the recommendation. I just scroll down to my favorite app "Emby" and start watching my stuff.
>because greedy Google wants to please their shareholders
I sometimes wonder: if the founder has controlling voting shares, and the shares on the public exchange / stock market are non-voting, would the company still have to please its non-voting shareholders?
Yes (to the extent that it's reasonably possible). Voting rights give you power, but you have a contractual obligation to use that power in the interest of _all_ your shareholders.
It's the same reason why, at least in theory, VC founders can't structure a deal that devalues everyone else to ~0 and pocket the extra money in a side-channel transaction. They definitely have the power to do so, but if you can prove that better options existed and have the inclination to take that to court, then you have a decent shot at recovering damages.
Non-voting shareholders can and will sue, and in the US they will probably win. This pretty much forces companies to please shareholders as much as possible at the expense of everything else.
"When people had a choice in updates, too many would postpone them. This left our OS susceptible to attacks which made for some bad press for us. If we force updates, this won't happen."
- some exec, probably
If by "bad press" you mean "other users getting fucked over by an entire homogeneous ecosystem of the same OS being a breeding ground for gigantic botnets that criminals were able to turn on services to knock them off the internet," yes.
The only real way to address it without the mandatory updates was for the ecosystem to diversify; can't use a Windows exploit to compromise a Mac OS machine.
You can imagine why that solution was not on Microsoft's list of recommendations.
The real way to address this would be to not make users hate updates.
That would mean doing as much of the work as possible in the background in a way that doesn't impact the user and making sure the reboot where it actually gets activated doesn't take noticeably longer. It would also mean not doing things that are against the user's interest like moving things around, installing new ads or misfeatures, or breaking things.
I don't know anyone who hates or wants to disable Chrome updates, for example (I'm sure such people exist, but it's much less common than Windows updates), despite Chrome updating way more often than once a month.
Of course, overcoming the aversion people built over years because of past bad behavior is going to be hard, but forcing people against their will and making updates more disruptive isn't going to help with that aversion. People really dislike being forced to do something, even if they don't actually mind the thing itself that much.
This is the key. Lots of systems have seamless updates where the only real interruption is a reboot.
Android does an A/B update, and after an only slightly slow reboot it is on the new version. It then does some optimization in the background that can have a slight performance impact.
NixOS must be the gold standard here. It downloads and completely installs the new version in the background, then a regular-speed reboot is enough to be running the new system (~10 s on my system).
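(For the curious, a minimal sketch of that flow: the new generation is built while the system runs and only becomes the boot default.)

    # build the new system generation in the background and mark it as the boot default,
    # without switching the currently running system
    sudo nixos-rebuild boot --upgrade
    # a regular-speed reboot then starts the new generation
    sudo reboot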
But Windows and macOS both have crazy slow update processes. When I had a Mac for work I was dumbfounded by how slow the update process was. It regularly took >30 min! Those machines are crazy fast; it seems like they could have written a TiB of data in that time, so IDK what it is actually doing.
I think practically all the problems with Windows will go away with one simple change: do not ever reboot a computer without the user's consent. Any feature that requires an automatic reboot should be opt-in.

However, this exposes the real challenge: we cannot have informed consent without informed users. I think this problem will haunt us repeatedly in the future. We need users to know at least a little bit about their options; only then can they make informed choices about which option to pick. This means people can't gloss over the details and not worry about them. It does not mean you have to be a computer wizard. You just need to know what you want and pick an option that is good for you.

For example, I run this on Fedora:
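(A sketch of the flow, assuming the stock dnf offline-upgrade plugin:)

    sudo dnf offline-upgrade download    # stage the updates while the system keeps running
    sudo dnf offline-upgrade reboot      # reboot into the minimal environment that applies them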
Any update requires a reboot. However, I am OK with that, because I am positive each update comes after a lot of testing. More importantly, I am in control of when to run the update; I can't accidentally run it, because it requires my sudo password.

I used to wonder why no corporation complains about Windows Update. Then I learned that virtually all companies use group policy to change Windows Update behavior to suit their organization, so for them it is a non-issue.
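(For reference, a sketch of the kind of knob involved; this is the documented NoAutoRebootWithLoggedOnUsers policy that gpedit exposes and admins push domain-wide. Whether your Windows edition honors it is another matter:)

    reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoRebootWithLoggedOnUsers /t REG_DWORD /d 1 /f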
Welp.
The problem here is that a modern OS should only need to be rebooted when the kernel is updated, and Microsoft's security model requires updating the kernel all the time. On Linux, kernel security updates are usually, "Hey, update this when you get a chance, because a bad actor already on your system could escalate privilege in a rare scenario." With Windows, it's an almost-weekly, "Hey, update right now, because someone in another country is pwning all unpatched machines that aren't air-gapped from the internet."
It doesn't nag me at all, other than asking for my sudo password. It is perfect.
Here is what I understand about dnf offline-upgrade:
> The process of restarting, applying updates, and then restarting again is called Offline Updates. Your computer boots into a special save-mode, where all other systems are disabled and where network access is unavailable. It then applies the updates and restarts.
> With Windows, it's an almost-weekly, "Hey, update right now, because someone in another country is pwning all unpatched machines that aren't air-gapped from the internet."
I see. I feel like I don't quite understand the extent of the problem. In any case, I am not qualified to even attempt to propose a fix.

My mode of thinking was that Google Chrome and derivatives are able to slowly walk along with increasingly scary visible warnings asking users to restart the browser, but they don't (as far as I know) restart the browser while the user is still using it. Even Google Chrome OS didn't do that (well, it didn't back when I had my Cr-48).

When I updated Mozilla Firefox on my Fedora machine, existing tabs continued to work; new tabs would come with a warning for me to close Firefox and open it again. This is the kind of warning I was getting rid of with offline-upgrade.
I agree with my parent comment, though: updates are too bloated. Windows updates should be very fast with a modern processor (Intel eighth gen, AMD equivalent, or greater), enough memory, and a fast SSD. Even DNF, which has a notorious reputation (at least among Fedora users) for being slow, is pretty reliably fast.

I think the point my parent comment was making was to make people not dread rebooting their machine. My thought was:

1. educate users about why reboots are essential

2. allow users to postpone reboots indefinitely

However, I don't know what to say if there is an active remote code execution exploit in the wild. It is important to get the update to the user before the exploit hits, but that goes against everything I've said so far. I guess this leads us to the status quo.
Those who have the know-how to use group policy can opt out of certain Windows Update behavior. The rest can either learn to do this, or set up their own Windows Update server? Periodically check CVEs and gate update releases? Not sure how it works...
The reason I hesitate and say automatic reboots should be OPT-IN is that sometimes I like to run a simple long-running task, like stitching photos I took into a video using ffmpeg. That can take hours on a slow laptop processor. It would really suck to start something like that and come back to find the computer had rebooted itself.
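(For concreteness, that kind of job is a one-liner along these lines; the glob and framerate are whatever suits the shoot:)

    # stitch a folder of photos into a video; happily runs for hours on a slow CPU
    ffmpeg -framerate 24 -pattern_type glob -i '*.jpg' -c:v libx264 -pix_fmt yuv420p timelapse.mp4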
> Do not ever reboot a computer without a user's consent.
That also includes never crashing, which Microsoft has historically struggled with.
If you don’t reboot in an orderly way, then the machine crashes while the user is doing something, and when it restarts everything is different. “Don’t restart” is insufficient for good UX here.
If memory serves: symbol-binding optimizations to make calls into the Objective-C libraries performant.

Under the hood, Objective-C really does use strings to reference library functions. So when the OS is updated, the string interning table is updated and all of the installed software has to have its string-to-symbol caches rewritten to account for the new API.
There are almost certainly better ways to do this but if I understand correctly, Apple inherited the existing solution without putting enough thought into how it would scale and now changing the solution would break an indeterminate amount of software depending upon the current implementation.
Yes, it’s been like this since the 10.0/public beta days at least. I remember wondering about it back then. I believe the command the system updater always runs is something like update_prebinding -root /
When a new version of the OS is installed, the OS re-runs prebinding for applications to the dynamic libraries they rely on because the dynamic libraries may have changed. The alternative is to have a big latency bump every time an app is launched for the first time as the caches are rebuilt on-the-fly.
(I think this is the step that used to be called "Making Your Macintosh Happy" during an install, but the veracity of that notion has fallen too far down the Google search hole for me to easily pull up a reference to confirm or deny it).
Software companies' bad behavior has trained me to no longer trust updating. I would love to auto-update my software if I had some kind of assurance that all I was getting in those updates were security fixes. But instead, companies abuse auto-update to shovel everything else at me too: Unnecessary UI changes, new features I don't want, removal of old features I regularly use, larger binaries, slower binaries, and so on.
More software companies need to get on board with the idea of having a separate maintenance branch and "next version" branch. Too many of them just develop on trunk and every time you update for bug fixes and security, you get all the other crap their developers have been working on too.
Funny enough, I was trained off of that not by software companies, but by the open-source community.
Too many instances of installed Linux distros breaking because some foo relied on some bar that the distro declared should work fine but must have slipped through the testing cracks somehow. Or I'd gone off-book on some custom project or other because it relied on library versions that weren't in the distro, then I forgot I did that and updating the distro libs broke my project.
>I don't know anyone who hates or wants to disable Chrome updates
That would be me, for one. I hate it when Google sneaks in privacy-reducing preferences that are automatically turned on. I like to occasionally go through settings to see what has changed, and was surprised when an update showed up with a whole host of new active features.

Why on earth should USB ports on my laptop be automatically exposed to any web page that demands it? Chrome can do that now. And not just USB/serial/MIDI ports and motion sensors; Chrome has also enabled 'presence' (a way for sites to know when you're actively using the browser). I found that creepy.
My first thought ends with 'enabling YubiKey-style MFA', but it seems that some [0] hardware vendors use WebUSB to enable a binary protocol, rather than translating their USB sensor output into (keyboard) scancodes.
> The real way to address this would be to not make users hate updates.
That's basically not possible in some cases. Not all of us are on super-fast broadband, even in places that are nominally 'first world'. The _best_ my parents can get is 8Mbps on DSL. Every time an update starts auto-downloading, you know immediately because all other connections grind to a halt.
(Now why a minor point update security fix for Monterey is 1.6GB I have no idea. But presumably that's not something Apple is capable of/interested in fixing)
Chrome seems to me to be an example of how some users will never update regardless of how easy upgrading is. I can't count how many colleagues, when screen sharing, always have the red icon in their Chrome window showing that they need to update. It doesn't matter how easy it is - they just won't do it.
Just to be clear, this isn't an argument that as a result updates should be forced.
I lost some work a while ago because in Win 10 home you can't easily turn off automatic updates. The amount of hate I got on some forums for suggesting that automatic updates are bad was beyond me. I just upgraded to Pro instead and turned it off in group policies, but I still got hit with an automatic restart at some point.
While I have upgraded to Pro and turned off those settings myself, I have not experienced my whole system restarting to do an update. I have had to neuter the TPM modules in my BIOS to prevent Windows 11 from automatically installing though.
However, the issue that I've had was my Terminal app shutting down by itself. I thought it was a crash, but it turns out it was because the Windows Store decided to update it.

Now that is complete BS! How can a store arbitrarily decide to update an app while it is still running, without asking, without waiting? Even Steam waits to download and update games for when you're not playing one. If it ever did what the Windows Store does, there'd be a riot for sure. I guess that means the Windows Store is just not being used by people at all.
For those interested, the "fix" for not having the Terminal app restart for an update is to download the MSIX package, extract it to a directory, and run it from there. That's directly from the developer's words on GitHub.
Automatic updates to address real issues and security risks are fine, unless you're talking about updates that can't be postponed if you're in the middle of something.
What we see, of course, is exactly what the posted Twitter thread mentions: some "dickweasel" hijacked the update system to push crapware or do something user-hostile for purposes completely unrelated to what the user would want.
James Williams discusses this in his book "Stand Out Of Our Light", and likens it to a GPS that takes you off to places that you never wanted to visit in the course of (maybe) getting you to your intended destination. Of course, you arrive late, if ever, and burned way more gas (or battery) than you needed. But the GPS got you to drive by a specific set of billboards that they wanted you to see.
In my situation, I was doing GPU intensive work that took about two weeks to complete and got hit with an automatic reboot overnight at some point far into the run. It hit me again overnight while I hadn't saved something, which is my own fault, but it still bugged me. I update my system when I need to update it, but I don't like that it happens without my consent and that there is no way to turn it off.
It's when they try to launder in a bunch of "feature" updates (i.e. more spyware, more crapware) with needed security updates.
They force the user to make a decision: Do I skip this update and avoid having new "features" from crippling my machine, or do I download the update to prevent my machine from being hacked?
If vendors really gaf, they would make it so that you could download needed security updates without the rest of the garbage. But then they couldn't force more malware on you.
In their defense, it would be difficult to keep security updates fully separate from feature updates, since sometimes replacing a vulnerable feature with a new one is the easiest fix, rather than pushing the new feature while also issuing a separate fix for those who don't want it.
In other words, it's only reasonable to expect security updates for a certain length of time and not indefinitely.
It’s not just software execs, I’ve actually heard customers asking for auto-update recently as approving and updating it themselves was too cumbersome, or they relied on a department for this that had no time to do this soonish. They literally asked for “auto updates like chrome or Firefox”…
Auto updates are fantastic if, and only if, the developer can be trusted not to break things for fun. Firefox's auto-updates consist of downloading stuff in the background, prompting me to "restart when you want to apply the update", and then changing next to nothing about my experience of the application between updates.
That is the perfect update story. Everyone loves it: the users aren't having their workflows upended, the developers are able to push updates regularly for security problems and the like, the only loser is the graphics designer who got hired because daddy owns the company and needs to constantly re-design the user interface to justify their job. (seriously, people in companies that do this kind of shit, what is it with re-designs? Why?)
They were wrong in the sense that they used it to condition people to accept non-security updates and they were wrong in the sense that some people are better placed to determine whether any form of update is an unacceptable risk.
As things stand today, I only trust updates from community-driven open source software projects. With exceedingly few exceptions, commercial software seems to offer the guarantee of delivering undesirable payloads.
I work for one of those very large commercial software companies. In general, I think we do a pretty good job. We absolutely fuck some things up, for sure, for a lot of different reasons.
But I promise you there is never any kind of Machiavellian multi-year plan to condition people for upcoming bad behavior. Most of the things we do screw up are from lack of foresight; we are definitely not strong enough at long range planning to run psyops on customers.
Somehow Windows devices are OK at installing security updates, and most Apple phones are on the latest OS version.
I’d be interested to know what the actual story at MS was for this. I think the executives there in the past demonstrated pretty good decision making. Here’s[1] someone claiming something about PM’s thinking they knew better than users and maybe there is some hidden OKR incentive driving that reasoning, but my weak understanding is that that person wasn’t really involved in what happened so I don’t know how true it is.
I still regret that when the Fall Creators Update came out, I assumed that my active stylus (Staedtler Noris Digital, purchased as a bundle with my Samsung Galaxy Book 12) being used for scrolling, and becoming unable to select text, was a bug, so I simply rolled back, assuming it would eventually be fixed.
If I'd realized that that was a feature which would be pervasive for all future versions of Windows, I would have returned the machine.
I've rolled back to 1703 twice now, and am managing to stay there by the expedient of keeping my hard drive too full for Microsoft to download any further updates.
I despair of replacing this device --- it wasn't quite the replacement I wanted for my Fujitsu Stylistic ST-4110 --- no daylight-viewable transflective display.
Why is it so hard to purchase a device which has:
- a good quality stylus (Wacom EMR)
- a high-resolution display (the 2160x1440 on a 12.something inch display is fine, though I'd love more)
- decent battery life
- reasonable size/thinness
- reasonable price
- access to the file system and the ability to install arbitrary software, esp. opensource stuff
I'd also like a daylight viewable display, but that's probably not happening.
I worked for the computer store on my University's campus during the initial rollout of Win 10. Many such cases. There were also too many instances of Dell computers doing a BIOS update in the background and booting to the BitLocker recovery screen, because the drives come from the factory encrypted.
Not that I want to defend MS, but the longest I have ever seen it take to update is a couple of hours. Annoying, yes. Should it be user postponable? Yes.
Actually, it is user-postponable: you can tell the update to wait up to 5 days by default. It should of course still allow you to wait an unlimited time, but you can always press that button once.
5 days! How magnanimous of them!
(Snark not directed at you personally at all) But I write as a person who once had to make idle chitchat with an audience of 300 for 8 minutes when a university presentation computer decided that it had to restart RIGHT NOW.
It's gotten a bit better, but I still remember booting up my machine (after not using it for a while), getting a "WOULD YOU LIKE TO UPDATE NOW OR IN 10 MINUTES?" dialog, and sure enough, after 10 minutes, the system force-rebooted, discarding any unsaved work (!), and spent the better part of the hour I had available to finally play a game on installing an update.
Might have been the same update that reset my privacy settings.
Well, "better" here means you get the chance to start the update by yourself withing a few days. No warning is displayed on those few days, and if you fail to update by yourself, the system still force-reboots and loses any unsaved work.
Also, my impression is that the updates have become much less reliable. So your computer may not be back at all. Or something you need never work anymore.
Not only can you not say no, sometimes the computer will wake itself from suspend in the night and update - or fail trying. And you can't stop it (except for pulling the power cord) because Windows won't provide an option to shutdown without update. It's so incredibly disrespectful towards users.
That's why friends don't let friends run Microsoft software. In return for your money, you get to follow their orders, watch their ads, and generally compute in whatever way they want you to this quarter.
Windows is a control mechanism that, for now, runs third party code.
The only way I know of to reliably prevent forced reboots is to use Reboot Blocker, a background service which continually shifts the configured (mandatory) daily time window for reboots, so that you’re always outside of that window.
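As far as I can tell, the setting such tools keep shifting is the documented "active hours" pair in the registry, so the idea boils down to re-running something like this on a schedule so that the current hour always falls inside the window:

    :: keep the (up to 18-hour) "active hours" window wrapped around the present; values are hours, 0-23
    reg add "HKLM\SOFTWARE\Microsoft\WindowsUpdate\UX\Settings" /v ActiveHoursStart /t REG_DWORD /d 7 /f
    reg add "HKLM\SOFTWARE\Microsoft\WindowsUpdate\UX\Settings" /v ActiveHoursEnd /t REG_DWORD /d 1 /f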
I hate bringing this up, because that loss should never have happened. But I ensure all such data lives on the cloud as well as locally because there could be a catastrophic hardware failure, natural disaster, or other external cause of that same problem regardless of Microsoft’s obvious negligence.
That is a good point. However, when I am working on my thesis, or even a piece of code that resides in a OneDrive folder, I usually pause sync for at least 24 hours. This is because OneDrive will often lock files or otherwise interrupt running/saving code so that it can do continuous sync, frequently ramping the CPU usage up to 100%, sending the fan into a frenzy, or causing other distractions. And a lot can go wrong in those 24 hours, or even 8 hours, which is the lowest setting.
And curiously enough, this was the moment she converted to being an Apple user. She expected and found the same quality on her Mac as she did on her iPhone.
> There isn’t another computing hardware company which doesn’t have a “hot garbage” tier of products.
There were many more vertically integrated software-hardware PC companies in the 80s. A true shame that some of the others didn't survive; I'd love to be writing this message on my 2022 Acorn RISC PC or Amiga. What we witness now is the end result of the Wintel monopoly.
>There were many more vertically integrated software-hardware PC companies in the 80s.
>A true shame that some of the others didn't survive; I'd love to be writing this message on my 2022 Acorn RISC PC or Amiga.
>What we witness now is the end result of the Wintel monopoly.
A great point. BeOS was basically the last attempt anybody made at creating a new desktop OS, and when it failed, so did the last vestiges of anybody else's appetite for trying again.
I love my Mac, but really enjoyed GEM on my Atari ST too.
And people wonder why I still hold a grudge against Microsoft. We still live with the consequences of three decades of bad behavior. A little deathbed conversion doesn’t mean as much as some think it does.
Is OSX on average more reliable than Windows though?
I've never had much experience with it, but I've heard some stories about its bad backward compatibility and not-too-pleasant upgrades. I'm thinking about finally getting a MacBook instead of my current Linux/Windows dual boot, since Macs have the unique combination of a professional-grade display, very powerful hardware, and full support for the proprietary apps that I need. And that's despite mostly disliking Apple's UX choices.
Yes, it is. I used to be very into Windows, but after the same upgrade issue (they just switched me over without asking for permission and ruined some projects I had), along with how annoying it was to deal with different vendors and driver issues, I just decided it wasn't worth my time and I wasn't enjoying tinkering there.
Mac has been great. My 2015 MBP is still going strong, it was many updates behind and I resisted for a while but I eventually just upgraded to the latest (2-3 major versions) and it went perfectly smoothly.
Every company and ecosystem will have its quirks. But some are way more hostile than others, and I've never seen that level of hostility from apple (yet).
Ports can be annoying though but I'd rather take minor annoyances over big problems like this.
BTW -- one of the bigger problems with the forced Win10 upgrade was that it changed the UI / icons and for older folks caused a lot of grief, because they felt lost on top of everything else.
> Is OSX on average more reliable than Windows though?
In my personal, anecdotal experience, macos is more reliable than windows. Neither is as reliable as my 14-year-old but constantly updated and fresh today Gentoo/Linux install. But the things that a laptop needs, macos is better. So I have a macbook and a Gentoo/Linux workstation and I generally feel like I get the best of both worlds.
I also often just run a debian in qemu on my macbook rather than installing a bunch of software from macports or homebrew, which has worked very nicely.
The big thing to remember is that Apple owns the entire stack. This has good and bad parts but one thing it means is that they can’t play support games where Dell and Microsoft point fingers saying it’s the other one’s fault, and because they make revenue after the initial sale Apple doesn’t have the same incentive to push you to buy new hardware every year or two. This thread started with some crapware Lenovo bundled to get another revenue stream, which Apple doesn’t need to do and knows would hurt their reputation.
The "Backwards compatibility" issue is an interesting one. For years, companies like Adobe used OSX upgrades as an excuse to force paid upgrades. Hardware manufactures did (still do?) this with Windows drivers. The other side is that Apple deprecated 32bit support around 2011, dropping it in 2019/20. They've also introduced and deprecated a few frameworks in this time. They have always announced well ahead of time. People get caught short by not paying attention.
With regards to OS upgrades, there has always been a vocal minority that likes to complain, ever since the System 7 days, sometimes not without reason but it's minor. Overall, the upgrades are decent, but I'll agree it's not always plain sailing for everyone. The thing with macOS is last year's Lemon is next year's gold. Rose tinted glasses are abundant online, especially with competing OSs.
Absolutely. I've owned many machines: two running Mac OS X, basically every Windows since 3.1, Linux Mint, Ubuntu 14.04-20.04, and several generations of Android. OS X for me wins reliability and user experience, hands down. XP Service Pack 2 is probably my favorite Windows experience, followed by 7. Linux is nice for control, but not winning any awards for stability or smooth updates (headless/CLI is fine, but GUI Linux desktops are always a bit buggy).
I have not reinstalled MacOS since I bought my laptop, 5 years ago. And before that it was the same with another 5 year old laptop. These laptops feel pretty snappy still.
I bought a secondary laptop - a 12-inch MacBook Pro - and it took a few hours to set up. I didn't have to turn a bunch of things off. I didn't have to find the button that wouldn't send my entire life to a tech company. Just a few clicks and done. It's decently fast for such an old, underpowered machine, too.
macOS (which is what it's called now; the OS X name got retired) will annoy you (changes annoy people), but if you buy an Apple product it'll actually work for quite a few years. Not 15, unless you're unusual; they're not supported that long. If you want your software or hardware accessories to be supported forever, they won't be. If you have really opinionated positions on how something will function forever, you'll be disappointed.
But out of the box it works. Hard crashes are very rare. User hostile things like forced updates don't happen.
The worst things are much much less likely to happen than with hardware that runs Windows, and if something happens the tech support at Apple stores is pretty good. If you want a good Windows machine you have to do extensive research to figure out how not to get crap, and you still end up wrong sometimes. If your grandma wants a computer, you can just tell her to go to an Apple store and get whatever she wants and it'll work.
It wasn't that many months ago that I had a conversation with a customer where, if they wanted to continue to use email (outside the browser), the best option was to buy a new Mac: their current macOS version didn't support the new certificates required for modern TLS, and in order to get a newer macOS version they needed to buy a new device.
If I remember right their mac-thing was around 5 years old?
They may have bought it 5 years ago, but it wasn't 5 years old. Also, for future reference, you can update the certs on older OS X versions to get a few more years out of a system for someone that's happy with it: https://apple.stackexchange.com/questions/422332/how-do-i-up...
Yep. My wife's still running a MBP from 2012 that I think finally got its last update to Big Sur (released 2020) without issue.
Looking at the compatibility charts, everything from 2015 onward supports Monterey (2021, the latest version). The last laptop that couldn't update past El Capitan was from 2009. As far as MacBooks (non-Pro) go, the last one stuck on El Capitan is the 2009 model as well.
And it's not even that Apple broke anything on the 13 year old laptop, it's just that the rest of the world moved on and they're no longer fixing it.
An interesting hack. We went with a different hack to get a working solution, but it's good to know there are alternatives.

The idea in the link is that one transfers the System Root certificates from another (more modern) Mac, which the customer should already own. In theory, I guess we as an email provider could continuously buy the latest Mac and use that as the transfer host (hopefully not transferring any account-specific certificates). It must work without any corner cases (not overwriting anything on the customer end), the script needs to work in future macOS versions, and the assumption is that the first group of certificates is the relevant one and that it doesn't break anything on the customer end.

For those who have not read the link: this does not update the System Roots keychain, since users are not allowed to do that in macOS. Instead it adds a trustRoot certificate. A nice trick. One of the comments says that it didn't fix the problem for them, but who knows what their situation was.
> If I remember right their mac-thing was around 5 years old?
My 2015 MacBook still runs absolutely fine.
For that matter, I'm considering (but probably not) buying a new iPad to replace my TEN YEAR OLD iPad that I'm still using every single day, as it can no longer update to the latest iPad OS.
But my 7 year old (soon 8 years old) laptop is as good as the first day I bought it.
My 2012 MBP that I'm using to write this piped up to disagree. It retrieves mail from a variety of sources every day, including work email, and there is never a problem. All using the stock Mail app that comes with the OS. So I'm guessing some details are being mis-remembered in the 2nd-hand information supplied by a person who doesn't sound technically inclined enough to have gotten the story right in the first place.
IOW, if one is going to supply anecdotal data, probably best to make sure it's your anecdote, maybe?
Email is not the same as the web; HTTPS seems a bit more permissive than SMTP/IMAP. That said, I know that some banking sites do reject old versions of browsers on the basis of security.

Some providers (email and others) can also continue to support deprecated/broken security for a very long time, in order to not have to deal with customers using old deprecated software. 512-bit RSA public keys, for example, were used for a very long time, long past when various standards and recommendations had a much higher minimum. Algorithm and key attacks are quite rare, so companies can keep a low security standard for years or decades before anyone exploits it. For customers that could be seen as a positive or a negative, depending on perspective.
I would, if Apple would just make a MacBook w/ a stylus --- I don't want the hassle of multiple devices, but I'm not willing to give up a stylus --- at this point, I'm considering just breaking down and getting a Mac Mini and Wacom Cintiq Pro 16 and giving up portability/battery.
What applications do you want to use a stylus with? iPads are really good at what they do, and Apple makes integration with their laptops straightforward. You can easily share files back and forth.
In fact I wouldn’t be surprised if you could use an iPad as an external display with the Apple Pencil, but I haven’t researched that.
No, that is why people turn to Linux (and *BSD, to a much, much smaller extent) after having been thoroughly disgusted by these practices, whether they come from Microsoft (Windows et al), Apple, or anyone else. It is the only "platform" where you can be really free of this type of behaviour.

They first install it on the hardware previously used for Windows or Mac OS and quickly find out their old hardware could perform much better than it did when burdened by those marketing strategies disguised as operating systems. The next step comes when they buy new hardware on which they directly install some Linux distribution, usually keeping the factory-installed Windows or Mac OS around, where it tends to languish. When they then decide to boot into those factory images, they are immediately hit with the upgrade hammer, which only increases their determination to stay away from those systems as much as possible.

This is easier for those who used to run Windows than for those who need to use Mac OS to develop for one of the other Apple brands, since these all mandate the use of Xcode [1], which needs Mac OS, which "needs" [1] Apple hardware.
[1] as in "is only licenced for", there are ways out of this...
I don't think I've ever seen a Debian or Ubuntu "stable update" / "patch" (apt-get upgrade, not dist-upgrade - what is enabled by default these days) break anything - I've been using various variations of Debian since around Debian 3(? 1.2/1.3-kernel era).
There have been a few (documented!) issues with major upgrades, like the lilo-grub transition and Ubuntu changing system group ids.
But there are no forced lts-to-lts upgrades - and generally they are quite stable.
As a sibling comment mentions, Canonical even has a live-patch service for kernel updates without reboot (not that I would recommend it - doing a reboot after a patch is a nice way to verify that the system still boots OK, which it should, and it's easier to debug when you know there's just been a patch..).
There are countless examples of people having problems post-update. It's understandable that you personally haven't gone out of your way to look, but it's easy to find them with a quick internet search. (BTW I don't mean to attack you personally, I just wanted to present examples.)
For two of these, it's unclear whether the update or just the reboot triggered the error. The last one (middle one, machine boots but is unstable) is also unclear: as far as I can tell the user doesn't report whether the system still runs OK with the old kernel? (And all of these appear to affect distros other than stable Debian or LTS Ubuntu - I've certainly seen a few issues when running Debian testing or sid. Updates without the "stable" part aren't always stable!)
Okay, fair enough; my general point was that it's not exactly rare (as many have implied) for Linux users to have problems with updates. A cursory internet search links to forums filled with people having issues.
Oh, I've had issues with Manjaro Linux for example (I inherited a laptop at work that I didn't immediately reinstall) - but Arch/Manjaro are rolling, bleeding edge distros - it's like using beta channels of windows or being an early adopter of macos releases.
My point was specifically about Debian stable and Ubuntu LTS.
Edit: And of course errors found and bugs filed in Debian sid (unstable) or testing - or Ubuntu non-LTS - contribute to those bugs not affecting those who run stable releases.
AFAIK it's run via cron/systemd, so it should only affect login if the computer was off before login? And apt update/download should not impact a reasonable system (maybe something with really slow storage, like a microSD card, though?).
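For reference, that periodic behaviour is switched on by a tiny apt config; the stock /etc/apt/apt.conf.d/20auto-upgrades looks like this:

    // /etc/apt/apt.conf.d/20auto-upgrades
    APT::Periodic::Update-Package-Lists "1";    // daily metadata refresh (apt update)
    APT::Periodic::Unattended-Upgrade "1";      // daily unattended-upgrades run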
You don’t have to reboot for every update - for a server it would be mostly kernel updates that require a reboot. If you enable live patch or subscribe to another live kernel update service you don’t need to reboot for most security patches.
Because even with the low bar set by Windows, desktop Linux is genuinely less usable. If you need anything outside of a browser and encounter a problem, you need substantially more understanding of how the OS works to get it fixed, not to mention that many common use-cases are either not supported or have an inferior, less stable version on Linux.

For common problems in Windows you can find a current guide with pictures in seconds; on Linux you are more likely to find a slightly out-of-date forum. And that's ignoring the complexities of different desktop environments and hardware compatibility.
> For common problems in windows you can find a current guide with pictures in seconds, on linux you are more likely to find a slightly out of date forum. And thats ignoring the complexities of different desktop environments and hardware compatibility.
This actually isn't my experience. It's possible that I've only ever had less-common problems on Windows, but invariably, whenever I've looked something up, I've found terrible Windows support forums with incorrect information and usually no correct solution at all.
For Linux - and again, I'd consider myself a power user, so I'm familiar with most things - I usually find the information easily and it's quite straightforward. I believe part of this is due to the more technical nature of Linux users, and due to there being active Stack Overflow resources etc., which are far better than Windows support forums (both in quality of questions and answers).
Obviously this is a very opinion-based position, but I find the opposite of what you've asserted.
It’s not bias, it’s just how it is. If it’s not something bleepingcomputer or tenforums has covered, it’s a nightmare to find an answer for Windows problems. The difference is that most people are scared of the CLI, so they think Linux is harder because of that. When, in actuality, it saves you several seconds of clicking, and gives you far better error reporting.
Windows is a fucking joke of an OS, and that’s why nearly no one defends it on its merits. It’s always defended on the stance of compatibility because of things like O365 and Adobe products. It has nothing to do with the quality of Windows as a platform because as a platform, it’s a pile of shit, and everyone knows it. We’re just forced to deal with it if we want to game, edit photos and music, etc., without a ton of config changes and other hassles.
Command line and config files are really not a great way to learn a system. Once you know what you're doing it's faster, but arranging the actions you can take in space, with something that represents the current state of the system, is by far superior for anything new. There are bad GUIs, but fundamentally I believe a system where you can see what is going on will always be easier to learn.
M$ forgot what's actually good about their OS; it's been slowly metastasizing. But even so, I don't think desktop Linux is the answer. There are some real advantages to the Windows paradigm: the idea that the program and the window are the same thing is important; the application should always show what it's doing and what you can do. Settings are selected from all the possible options (you don't need to know what's available in advance).
I have always found that, personally, as I become more capable I get stranger and more obscure problems with computers, but never fewer problems. The range of edge cases and weirdness you can get help with is certainly greater in an active forum.
When talking about common problems, I mean truly common: the things an average high-school student runs into, like joining an online classroom or editing a document (or even playing a game!). These things do fail on Linux, and without being a power user who knows where to start and some keywords to search, getting past square one is not so easy.
The only data point I have is my non-technical partner, who has had far fewer issues since moving to Linux than she had on Windows. It's a data point of n=1, and I'm not sure I'd recommend it for everyone.
I was more than happy to recommend it to her, since I knew that whatever issue she had, I'd be able to solve it quickly and easily. The fact is, though, after throwing a standard Ubuntu system on the machine, I don't think she's had any issues past the first week or two of figuring out where the settings were and which program does what.
There might have been the odd LibreOffice question, maybe?
The fact is, I'm pretty much the family tech support. My linux "support burden" is far lower than my windows "support burden" was when she used that (both in frequency, and for me, complexity). I still get a fair few macOS or iphone questions from my parents from time to time, but almost never things about linux.
Video editing, photo editing, connecting a Focusrite to the machine. Connecting a MIDI keyboard. Connecting a remote printer. Connecting a Bluetooth headset. Getting usable battery life. Installing software that allegedly supports Linux. All this and more are issues I've had with my Linux installations. Yeah, I could have used a different distro, and yes, many of these you could solve. But it sure was a pain learning it all.
Trying to disable mouse acceleration for touchpads when running Ubuntu 20.04 (and derivatives) made me lose more than 6 hours of my life.
The culprit was libinput, which, for some reason, ignored the "flat" acceleration profile for touchpads. It was eventually fixed, but the fix didn't hit an LTS version of Ubuntu until this year.
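For anyone on a release with the fix, requesting the flat profile under X11 looks roughly like this (a sketch - the touchpad name below is an example from one machine; check xinput list for yours):
# Find your touchpad's device name or id
xinput list
# The property holds two booleans (adaptive, flat) - enable the flat profile
xinput set-prop "SynPS/2 Synaptics TouchPad" "libinput Accel Profile Enabled" 0 1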
You plug in your phone, indicate file transfer is ok (on your phone itself, which I assume must be the same for windows) and nautilus gives you the option to browse it.
nautilus, nemo, etc support MTP and have done so for many years. You just plug your phone in and navigate to it in your file manager, which is the same thing you'd do on Windows.
We have to tolerate it because some software isn't available on Linux.
For example, I can't think of one RAW image processor from the original manufacturer that's available on Linux. Honestly, if there was, it would actually be a good reason to switch to that brand.
Also yes, I'm aware of RawTherapee and DarkTable. I use RawTherapee regularly, it's absolutely excellent and I highly recommend it, but it's also absolutely useless for certain use cases.
Nowadays I tend to use linux apps on Windows, thanks to WSL. For example, Windows' file browser crashes frequently on my setup, so I use Dolphin instead.
> RAW image processor from the original manufacturer
I'm active in a few photography spaces online and I don't think I've talked to a single person who actively uses one of those. Most people use Lightroom or Capture One, which are also not available on Linux.
I'm curious as to what value you get from OEM RAW processors.
> I'm curious as to what value you get from OEM RAW processors.
Exactly (or almost) the same render as from SOOC images.
I hate it when I deliver SOOC preview images that the customer really likes, but wants some small tweaks (remove a fly from the image, for example), and the colors or contrast on the resulting image is different because I didn't use the same processing engine or color profiles.
You can do the reverse too: Wine or a VM to run the couple of things you still need. If Valve can get all their AAA game titles to run using Proton, you should be able to make it work. I had one tool at work, an XML viewer that was enough better than the competition to be worth running in Wine. I stopped even noticing it wasn't a Linux app.
I tried Wine for a variety of things in the past, but never quite could get it to work right.
Also, some of those software are mission-critical for me, and I can't really afford to use something that may break unexpectedly when I need to deliver images same day.
Sounds like you know what you're doing; I wasn't trying to guilt-trip anyone, just provide options. Despite the topic of the thread, Windows is a much better OS than it has ever been. If it's possible to keep an install of some (potentially older) version running in a VM, that would be sufficient mitigation against unscheduled MS decisions interrupting your business. I'm not an IT professional though, I just code.
This is an advanced task, and if you want to focus on games, this is so wrong it hurts. Most new games (and especially AAA titles) need a substantial amount of work on the part of the Proton (and sometimes game) developers to run well; very few have good support, fewer still without some performance penalty, and even fewer within a few months of release. That's before you run into driver oddities, and good luck with multiplayer.
The idea that you can slap Wine on the few things you need is a weirdly repeated fiction; repeating it won't make it true. Stop doing it.
I've never used Proton; that was just hearsay from my gamer friends using Steam Decks. I remember having problems with Wine in the late 90s, but the single data point of that XML app was really good. I guess I just got lucky. Running a Windows VM in Linux has never worked as well as the other way around, but it does work. And for people with critical software that runs on W95 or some version of DOS, it isn't unusual to keep it in a VM with extremely limited connectivity. Do you think that might be a reasonable approach for the GP?
But after being unable to even _install_ the Win7 downgrade that was promised with my purchase of a Win10-only machine (several years back), I decided Microsoft's obsession with licenses and stuff was just getting in my way. I grabbed Ubuntu and haven't looked back.
Linux is awful for its own reasons, not least of which is a flippant glee with which decades-reinforced UI reflexes are jettisoned in favor of someone's new window-management metaphor-of-the-moment. But those seem easy enough to opt out of, I can tweak a few things and I'm largely fine day-to-day.
Updates tend to break things in weird ways, but I can always just reinstall, or decide it's time to try a new distro, restore /home/ from backup, and go on with my life. A Windows reinstall is usually a multi-day affair and barely works afterward.
My biggest Linux gripe lately, and this might be Ubuntu-specific, is hiding the names of the programs. There's an image viewer apparently just called Image Viewer, and if you think that's fucking obnoxious to search for help about, you're right! Ditto with pretty much all the other default GUI tools. I think the file manager is probably called Nautilus but you can't get it to tell you that, you have to just infer that from googling error messages and finding them mentioned in issue-trackers for something called "nautilus". Aaaaagh.
> Linux is awful for its own reasons, not least of which is a flippant glee with which decades-reinforced UI reflexes are jettisoned in favor of someone's new window-management metaphor-of-the-moment
That's just Ubuntu.
> My biggest Linux gripe lately, and this might be Ubuntu-specific, is hiding the names of the programs
I don’t use Gnome (want tiling, am able to set up whatever, only use like 3 applications), but feel it’s pretty amazing. My 6-yo with no prior knowledge of computing devices that aren’t an iPad can use it (to launch Scratch, watch videos I loaded, show around his photos, and keep an audio diary) after 10 minutes of explanation, and seems to prefer it for some things over the iPad. My 70+ yo parents (with computing experience starting from the late 1960s, but limited capacity for new bullshit) use it with no issues. My wife uses it for the few tasks she doesn’t do on her phone, and is fine with it. I occasionally use it (as a minimum, to show things to all of them), and it’s very easy to remember how. By comparison, other environments (I include Windows) are overwhelming and have tons of confusing stuff that only gets in the way.
This is all re: vanilla Gnome, not whatever Ubuntu delivers.
Apple loves giving things generic names, e.g. “Music”. The only reason they get away with it is that you can search for “macos $APP”, “Apple $APP”, or “$APP.app”.
I've worked for a non-profit that is officially Microsoft Office based, and our policy manual says all field staff (who provide their own laptops/computer hardware) need to be able to interact with MS Office file formats. A number of colleagues and I have used LibreOffice for years without ever having incompatibility issues with MS Office; for me it's been at least 10 years since I've had MS Office on any of my machines. I run Debian on an old ThinkPad, and Mac OS X with MacPorts.
Solution: Office 365. Since it runs in the browser, Office 365 will work for the average home user of MS Office on Linux. No, it's not the full-fat version of Office, but for most home users it's fine.
A lot of them. Have you tried to fill out forms (for stuff like homeowner taxes, personal income tax, etc.) prepared by a government office in MS Word using LibreOffice Writer? Often it doesn't work at all, and even if the document opens, both it and the output are completely fucked.
Anything a tiny bit smarter than your usual document (with scripts and/or macros) won't work either.
I mean, in the past 3 years I've sold a home, rented a place, and bought a home, and filled out my taxes and other forms from .gov websites and haven't had an issue to be honest.
Thank the US government, they're pretty good with the digital things. The EU governments are much worse with computers - Word documents and Excel tables that don't work in LibreOffice are usual here.
I've always found a mixture of google docs and libreoffice to suffice for me, but I'm not a particularly heavy MS office user.
There's definitely pieces of software that aren't available for linux, and if you require that to do your job, it probably isn't the best solution for you. For me, and my partner (who is a classic browser + office software user, as I suspect 90%+ of computer users are), it's never been a problem (and office software has never been an issue for me personally).
It's not beside the point. The usability of Windows vs. Linux must factor in the availability of software. I would literally be unable to do my job on Linux.
> including the forced updates making desktop unusable for hour every month.
Patch day is once a month and takes 15 minutes at worst, sometimes only a reboot after the work day.
edit: Not to mention how upgrading Firefox on Linux-based systems through the console forced me to restart my Firefox mid-day, because "it works that way". Never had that on Windows.
I've got a six-year-old T460, so I'm not really top of the line either. Nevertheless, I don't sit in front of the machine for hours each and every month watching Windows do its updates.
People like to dramatically exaggerate how long Windows updates tie you up each month. Probably not actual Windows users, but it's an easy target.
It doesn’t matter how long it takes if it elects to do it when you want to do something else. Computers are our servants, not our managers. Choosing to do something which can neither be deferred nor cancelled is unforgivable.
I was at a conference, and 5 minutes into the plenary speaker's keynote address, the Windows 10 laptop he was running his presentation slides from started an obligatory, non-cancelable system upgrade. I don't think Microsoft scored any PR points there.
These posts are perplexing. It's like watching people getting angry because they were pulled over for not renewing their car tabs.
- Everyone knows that car tabs have to be renewed by a deadline and everyone knows Windows has forced updates by a deadline.
- Everyone knows how to pay to renew their car tabs and everyone knows how to check the Windows update so that it updates when you have free time for it. And if they don't, it's just 5 seconds of Googling to find either.
- Everyone knows what is going to happen if they don't renew their car tabs or allow the reboot by a certain date.
- Everyone knows that neither the vehicle licensing authorities nor Microsoft is going to change their policies. It doesn't matter what people think the policy ought to be because it isn't that way and they know it.
So basically, they picked up a gun, took careful aim at their foot, pulled the trigger, and are shocked and angry that there's a bleeding hole? Sure, they can go ahead and change operating systems but it isn't going to fix the fundamental problem, if you know what I mean.
Actually it would fix the problem. No other operating system forces updates whenever the fuck it feels like, and people should not accept this as a fait accompli.
Also: where do you live such that there isn’t a grace period on registration renewal? Do you think a car just stops working when that happens in the middle of your commute to work?
Yeah, but I'm not referring to the original post, but to complaints that Windows updates will take hours of your productive time each month for nothing. That's simply not true.
It doesn't matter whether it is hours or one minute, it's unacceptable. A sibling comment here describes how a Windows update started in the middle of a presentation someone was giving, which indicates quite clearly that Windows Updates take productive time away - not just of the user of the computer, but those who were listening also.
Again, computers are our servants, not our managers.
I have the opposite experience. 4GB of ram + i3-4005U + spinning drive, windows update is unbearable. That's on top of anything newer than 2016 being unusable anyway.
I don't do media on it, so I've been quite happy running 9front lately.
There's your problem. Even the cheapest of SATA SSD will massively outperform almost any laptop hard disk drive when it comes to random I/O. And it'll probably increase your battery life a smidge as well.
I wouldn't put applications on any spinning rust these days, especially laptop 5400 (or worse!) drives.
$40 for a 512GB SATA SSD that'll blow the socks off whatever spinning drive you have.
I don't have the budget for that at the immediate moment, but yes, I know.
I do think my point still stands, though. I don't think it's acceptable for software to run this slowly, even on this hardware. I'm talking Windows has been idling for hours and basic tasks like clicking the volume slider are still sluggish.
On a side note: I'm fascinated HOW cheap SSD storage has become. Really looking into changing my backup drives to SSDs instead of good old spinning drives...
With those specs, I would guarantee the update actually takes less than 15 minutes on average each month. I don't remember an update adding more than 1 or 2 minutes to my restart time on my i5 with 8GB since Windows 11.
I mean switching away from an operating system that has a chance to randomly brick your computer to one that doesn't seems pretty damn reasonable to me? I guess YMMV.
It's a very fair point, and it is most _definitely_ possible (and not even particularly hard) to brick a linux system.
However, I think the difference is that it's always due to an action you take.
I do think this is something qualitatively different: your machine decides to automatically update to Windows 10 even though it's not compatible with Windows 10, and bricks itself.
If you randomly turn on a computer and it says "hold on, I'm updating" and then never turns on again without you ever agreeing to an update, I think you'd quite rightly be somewhat upset with it.
Now, I don't even mind windows that much. I prefer other things, but I recognize the strengths of windows and I understand why people like to use it. At the same time, I think it makes sense to recognize the frustrations that windows can sometimes bring to some users.
Yes, there is a difference, but the auto-scheduling part is not the one that has the bug.
The bug is that the OS is in a borked state and the update infrastructure cannot deal with it. This is what ends up bricking the system. (Using the loose definition of the term here; I understand that it doesn't literally damage the hardware.)
The overall big picture is that most users don't want to be system administrators; they just want the system to be managed automatically. However, a subset of users do want to administer the system and have a level of control, and this is where Microsoft screwed up, by not providing us those tools.
At work we have our IT manage the updates on all of our machines, and they have managed to apply some kind of policy where they can control the updates. I am assuming this is only possible in a domain environment, and not on a stand-alone PC.
> it updated itself to Windows 10 without his explicit consent.
It's BS how they can just do this. Microsoft thinks that just because they aren't legally liable for their software breaking a user's system, they can do anything, like these automatic updates.
I think it's really high time that there be regulation on software and its reliability - that is, some sort of consumer protection, where a breaking update such as this becomes a liability for Microsoft. And of course this doesn't just apply to Microsoft, but to Apple and Google and any other software manufacturer.
We're actually lucky that the current global political & economic system allows governments to enforce this. Companies can weasel out of paying taxes to the country in which they operate, but they generally can't escape the local country's regulation with regards to how they render their services.
All we need is political momentum to enact the regulation - and that's easier said than done.
I have been thinking about this a lot lately. I went through chemo less than two years ago, and while I don't use windows, quite a bit of technology became outright hostile to me. I even fell for a scam on Instagram and lost $40 trying to buy a gift for my son.
I keep day dreaming about designing systems that focus on stable, simple, consistent interfaces. That don't even give application developers a choice in the interface. Where service providers aren't even tempted to become user-interface developers. If I could afford to take a year or two off, this is what I would work on.
>That don't even give application developers a choice in the interface.
Then you get the crowd of “indie developers” crying about how the platform has taken away user choice and that they should be able to use every API of the OS to do whatever they want, user-be-damned.
Apple doesn’t go nearly far enough, and every “indie developer” working in adtech constantly cries that the entire OS should be accessible to every application developer on the planet.
> Microsoft's unfriendly us-first, customers-second process
Fortune 1000-first, themselves second, and customers last.
Thankfully, Apple has recognized that their devices and software are -- at least philosophically -- designed for people, not companies. A phone is a distinctly personal device. I think the reason Windows Phone flopped is that it was ultimately designed to deploy in corporate fleets, with all the usual "the company owns your device and everything on it" experience that most of us have with the corporate laptop. You can say that Microsoft's monopoly in businesses pushed Apple in this direction, but at least they have capitalized on the distinction, and didn't sacrifice user experience to try to court the "corporate purchaser."
This is why I HATE Gartner "market share" numbers. It lumps in corporate purchases (to Microsoft's overwhelming benefit), which completely distorts the view of how people are using operating systems "on the ground." It's my gut-level impression that Apple is leading at least 4-to-1 in individual purchases, and Windows is something everyone but gamers just put up with to do their jobs.
I installed Teams on my personal PC last week, and as part of the install Microsoft managed to change my whole windows install to use my Microsoft account to login, rather than my local account. It took me a while to work out what had happened and unfuck it all.
Microsoft accounts are the biggest clusterfuck they've produced in recent history. I hate them with a passion, and they keep pushing them no matter how much you keep rejecting them. You need to jump through lots of hoops to simply get a local account in Win 10, and then they keep harassing you about upgrading.
When my son was ready for his own laptop, I originally set it up with a Microsoft Family account. That was a terrible mistake. Microsoft Family causes only pain and misery and completely fails at what it's supposed to do.
Microsoft bought Minecraft and recently forced everybody to migrate from perfectly functional Mojang accounts to Microsoft accounts, which makes everything more complicated. My son's Minecraft account ended up on my wife's Microsoft account, and suddenly I'm called UnshavenFiber (though not entirely incorrectly, I've got to admit).
To clarify mcv's last statement... if you have taken the time to create local user accounts because who is using your computer is none of Microsoft's business, and then you make the mistake of buying Minecraft and logging in to it, Minecraft will automatically and with no warning [1] irrevocably tie that Windows user account to the Microsoft account used to log in Minecraft.
I now have n+1 accounts on my Windows system because I had the kid who did that leave that account just to use Minecraft and created a new account for everything else.
I'm pretty close to going to Linux on this system. It was the house game system, but I just got a Steam Deck, which is fulfilling that role nicely, and that leaves Windows hanging on only by a thread of my personal laziness/busyness (depending on how charitable you are); everything else that system does is either equally doable by Linux or better done by Linux.
[1] At least, no warning I saw and/or understood, and I think it's fair to say that if I didn't realize that's what going to happen, neither did 99%+ of the user base. I'm sure lawyers have their bases covered somewhere but I don't care.
> I'm pretty close to going to Linux on this system
Well, I'm one step ahead of you. As of last weekend, I purchased and returned 3 games due to unplayability on Windows 10 (yes, patched, all drivers updated, msconfig nothing running, ran exclusive/un-bordered/windowed, every recommended change and sfc checks ok), and they all crashed regularly at irregular intervals.
I've partitioned the SSD and have installed Ubuntu. Now to learn how to use Proton to make games work.
That was my last Windows machine. Goodbye Microsoft.
edit: Just to add...I'm the most technical of my group of friends and typically lead the way. If this works, I'll have 5 other guys migrating in the next 12 months with me, they too have been complaining since leaving Windows 7.
Also Microsoft stores the escrow keys for disk encryption and login on their servers when you use a Microsoft Account, and will happily hand it to the government on request.
I can't express how much the MS account thing totally ruined me from ever using or buying MS products again. Trying to make a PC have an account for my kids to play MS flight simulator was pure insanity. Never again MS.
Oh the pains of Minecraft administration! I could write a book about cross platform incompatibility, Microsoft account dysfunction, billing issues, and upgrade blocking execution. We’ve spent $100s and it does nothing but get worse.
I tried setting up Minecraft the first time on my kids ipads and the whole Xbox/Live account thing literally made me angry. I am technically competent and I couldn't figure that shit out for hours.
During the Windows 10 install they make it look like you need to log in with your MS account. The option to use a normal local login is right there, but it’s named in some funny way so you don’t think it’s the normal login. At first I thought this was just for the main admin account and let it slide, because I was only intending to use Windows for games (it’s not good for anything else). But when you want to create a second account, it tries to trick you into creating a new MS account.
With Windows 11 there's no UI option to use a local account at all. You need to press a magic key combination, launch cmd, execute some bat file with an unpronounceable name, and reboot. Then the option appears.
I mean, it's one Google search away, so not a big deal, but they do push hard for MS account.
I recently had to install Windows 11 and did so by downloading an ISO and flashing it to a USB. To my surprise the latest version of Rufus (https://rufus.ie/en/) can detect if you are using the official Windows 11 ISO and let you skip the initial setup and create a Windows 11 local account. You can also disable the TPM and processor requirements IIRC. IDK how it even does this, I guess it modifies some boot menu when writing the image.
Win 11 still sucks but this makes it much much better.
The process is available (though you also need to click away a few nudges to get you onto the cloud) using the normal setup if you use Pro instead of Home. With Home they basically force it on you, allegedly so they get to back up your BitLocker recovery keys.
I think in general Microsoft is suffering from an incentive problem. Unlike Apple, they don't really have any revenue stream from users except the up-front cost of the installation, but they still incur costs for patching etc., so they do all this trickery to at least get access to data and metrics to improve their paid line of products (Office etc.).
Idk what happens on the backend, so I won't engage in conjecture on how safe it is, but keep in mind that BitLocker is not available to Home users at all in Win 10, and enabling it by default in Win 11 does increase the data safety of the average user quite a bit, because a petty thief will not be able to get any data from the device.
Thanks, I was not aware of the edition difference. I'm surprised that MS ships that bat file in their install files, because it definitely looks like it was specially created for this purpose (it sets one key in the registry and then reboots the computer, which starts the installation again, but now the registry key enables the UI option to use a local account).
I believe this is about to change in 22H2 Pro: you will need internet and a Microsoft account to set up. And a Pro license is a $1000 markup on consumer laptops, from what I've seen.
Telemetry/metrics and device-account relation is not just for revenue or improving the paid products; it helps immensely in tracking down and debugging Windows and external device drivers as well.
On 11, just click the 'login with Microsoft Account' and for email use no@thankyou.com and when you click Next it'll automatically dump you to the local setup flow.
I recently ran into this setting up a laptop for work, and how user-hostile this step has become made my blood boil. There is no option to set up an initial local account unless you intentionally provide bad input to the Microsoft login page. I ended up just wiping the machine and installing Windows 10 from scratch.
I'd really like to be able to send invoices to Microsoft and other companies like Dell who pull this bullshit, for the time I waste working around this stuff (and I wonder how much potential productivity is eaten up by these practices).
To run Android apps on Windows 11 you need a Microsoft account AND an Amazon account. The Windows Subsystem for Android comes with the Amazon Appstore app which is only available through the Microsoft Store.
The last time I installed Windows on a computer, back in late 2020, that option would also be hidden if you were connected to the internet. Since the Wi-Fi configuration occurs first, ostensibly to start downloading updates while going through the setup, this was usually the case. The only way to get the local option back was to entirely shut down the machine, then start the process over.
What about Windows 10 Enterprise LTSC? I recently installed it in a Qubes VM to be able to run some specific database software until I can export the data to an open, libre database. My understanding was that LTSC didn't require an account or even an activation code, and indeed I've been able to run a trial ISO I downloaded directly from Microsoft.
In my experience, shutting down and rebooting wasn't enough; Windows 10 still forced me to create a Microsoft account. I had to open a cmd prompt on the installer screen and remove my home Wi-Fi network from the known networks list, then possibly reboot.
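Concretely, from the Shift+F10 command prompt during setup it was something like this (the profile name is an example):
REM List the Wi-Fi profiles the installer has saved
netsh wlan show profiles
REM Delete the one for your network so setup can't silently reconnect
netsh wlan delete profile name="MyHomeWifi"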
For me, because I know this is subjective, fucking with Linux to make it work how I want is less of a hassle now than fucking with Windows to make it work how I want (again).
Reinstalling graphics drivers from a real TTY when the occasional update hoses them is far and away more preferable than, say, ads in the start bar (for me).
... or installing fucking candy crush by default (which, really, is just a long form ad).
But maybe that's my inner (outer, now?) old man talking? I mean, Minesweeper, patience and a couple of others, maybe even a pinball game always came with Windows.
Windows never came with fortune though, which is the best "I didn't ask for this, but would have if I'd known" default I've come across. Yep, old man is now outer and child is inner.
The difference to me between Minesweeper + Solitaire and the Candy Crush incident is that Candy Crush was developed by a third party and, more importantly, is designed to be addictive and separate users from their money. The point of putting Candy Crush on Windows machines was to lure new people into playing Candy Crush and hopefully giving King + MS money. Microsoft is providing access to its users to third parties for profit.
The point of putting Solitaire + Minesweeper on the machine was to demonstrate the OS's capabilities + get people used to using either a computer or a GUI. Solitaire and Minesweeper were installed for the user, Candy Crush was installed for King.
Removing Windows and putting on a user-friendly Linux distro changed everything. No more invasive software in her way, and her aging PC was brought back to life. At first we were a bit scared that technical challenges with Linux might appear, but so far nothing of the sort.
And yet you'll still get technical folks parroting mindlessly "Updates are good! Make sure you always update! Muh security!"
No, updates are objectively not all good. Updates frequently break things. This is colloquially referred to as your user experience being "enhanced." Any solution begins with an honest appraisal of these realities.
Very sorry to hear about your father's illness, and how the situation was made even worse by such brazen disrespect for user experience. My parents also have to regularly go through what I think amounts to violence by their electronic devices, but nothing as horrifying as this.
This terrifies me, TBH. I'm almost 50 and a little bit technical (used to admin Solaris boxen), and I just know the day will come when I haven't a damned clue how any of this works any more. For the time being, I'm happy using Linux as a bootloader for Emacs; will that be enough to see me through to my personal end? I sadly doubt it.
> and I just know the day will come when I haven't a damned clue how any of this works any more
I don't think this is a foregone conclusion, and it's easy to say this and sort of throw your hands up and go "oh well, woe is me" but it's kind of a cop out, no?
I have family in their 90s who can navigate whatever tech you put in front of them, because they think it's important to figure things out when they get stuck. My father can barely turn his phone on, because at the slightest frustration he gives up and waits for someone to "fix it" for him. Certainly different folks have different aptitudes, but choice and working through things play an exceedingly large role.
> I don't think this is a foregone conclusion, and it's easy to say this and sort of throw your hands up and go "oh well, woe is me" but it's kind of a cop out, no?
The problem is that as we get older, we tend to value our time differently. Learning a new interface just isn't as important to us as spending that time doing something enjoyable.
Imagine if car companies randomly updated the way you drive a car overnight every 7 or 8 years. You go out one day and suddenly, instead of a steering wheel and pedals, you have a stick with paddles; then a few years later it's changed to something that looks like an oar. Eventually you too might say to hell with it.
The irony is that if a vendor can't provide a consistent experience every time, why should I remain loyal to that vendor on the next update?
If Windows is going to change their user interface every other year, why would I relearn their UX-du-jour when I can just re-learn once on a Mac and be good for a few years (if not longer)?
If I use hosted Gmail and my UI changes every other month, it just tells me that I should host elsewhere where the UX is more stable.
If you abuse users long enough with this ever-changing, constant-beta UI nonsense, you are just telling them to move on to a platform that isn't so unstable, where you only have to learn the UX once, not every X months.
> The problem is that as we get older, we tend to value our time differently.
> Eventually you too might say to hell with it
I agree 100% and that's even kind of my point. You're choosing not to care about that stuff, which is fine (honestly, probably even healthy at a certain point), but it's not as if there is some biological imperative that as you age you are less able to use technology.
I’d point out that it’s not necessarily those people’s problems. It’s the tech makers’ problems. Hardly any of this stuff needs to change so wildly and so often.
The somewhat recent FaceTime updates have befuddled even my brother and me. And there’s zero reason for any of the changes. They were just changes, not improvements - probably even regressions.
I am typically able to poke around and figure out why something broke so I can fix the dependencies. And I can search online to figure out where some feature I depend on was moved to, or what extension I need to re-enable it. But I nevertheless resent this kind of mandatory administrative overhead that comes with relying on computers.
On the flip side, I enjoy learning new techniques, languages, and approaches to advance my craft.
The difference is that the former is dictated by developers and required on a periodic basis just to tread water, whereas the latter is self-directed and helps me become more capable.
I am an active engineer and software developer, and I already feel like I don’t have a clue how any of it works already. I am being a little bit hyperbolic of course, but that’s the feeling.
And to be honest, I don’t think it’s me. It isn’t clear to me that the people who make all this stuff know how it works either. Nothing works as intended in technology. At least some form of technology doesn’t work as it should every day, from your work machine’s OS, or some program, or your smartphone, or Android Auto or Apple Car Play, or your TV, or your Internet provider or WiFi router, or that website you visit, or your smart thermostat, and on and on. And that’s stuff just plain not working or encountering bugs. It doesn’t even address the usability of all this.
We are just shitting out technology left and right, all at the altar of scale. In my opinion, capitalism is part of the problem; the other part is human nature. There are no incentives to get this right. And these days, large companies do not care whether they get it right. Statistics of failure and user frustration are explicitly part of their business models. They don’t even want to know if there’s an issue with their product. They just let some statistics drive their decision making.
It’s all just a tragedy and comedy in one. Amazon has their dozen-plus leadership principles they hire with, which make it seem like they hire geniuses left and right. And they can’t even get book selling right these days. And there’s no way to report issues. They do not care.
It is difficult to get a person to understand something, when his/her bonus depends on not understanding it.
Most of these bonuses are probably tied with frivolous, often highly unpopular and unwanted, feature updates. I doubt people get bonuses (or as large bonuses) for fixing usability bugs or improving quality.
A coworker of mine never switched from DOS to Windows (he told me he tried it a few times but... meh); recently I helped him get his new(-ish) Lenovo workstation set up with FreeDOS 1.3.
So... it IS possible to stay on a system where you are comfortable with, if you are willing to make concessions...
He mostly uses it for embedded programming, but he also does the finances for a club he's in via a spreadsheet (As-Easy-As) and writes the occasional letter in (if I remember correctly) WordStar. For email and web he uses Arachne (with a hack that allows Arachne to fetch HTTPS via wget).
Already happened to me: I have less and less idea how my Linux system works now. I haven't kept up with all the various systemd rewrites, buses, and random shit I don't know about, and it's annoying when I actually want to do something.
Has journalctl -xe ever dumped any useful information for anyone, ever, or am I just an idiot? When I restart nginx I want to know if it failed, why, and what the syntax error is. Instead of old reliable /etc/init.d/nginx configtest, now I have to dig around.
# Output all logs since boot
journalctl
# Follow all logs in real time
journalctl -f
# Output logs for a given systemd unit
journalctl -u $unit
# Combine follow and unit flags to follow logs for a given systemd unit
# Easy to remember: You are saying F*** U to a broken piece of software
journalctl -fu $unit
# Output kernel logs
journalctl -k
# Follow kernel logs
journalctl -fk
# Tip: Use -b flag to see logs from previous boot cycles
# Learn more, including advanced filtering and formatting
man journalctl
You can also add that configtest command as an ExecStartPre to the unit, and it will run that before starting nginx and optionally fail early if it finds an error.
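A minimal sketch of such a drop-in (the path to the nginx binary may differ per distro; run systemctl daemon-reload after creating it):
# /etc/systemd/system/nginx.service.d/configtest.conf
[Service]
# Validate the config first; a syntax error aborts the (re)start with a clear message
ExecStartPre=/usr/sbin/nginx -t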
> has journalctl -xe ever dumped any useful information for anyone ever
It can be useful when there are potentially multiple things leading up to the error in whatever you're running, as it shows and explains nearly every action that's recently happened in the system log. But if you just want to look at logs related to nginx starting, then query for nginx's system log instead of the whole system's.
If a process or service is using systemd to start and stop (e.g. you start it with `systemctl start $SERVICE_NAME`), I often do `systemctl status $SERVICE_NAME` to get the service-specific logs. For me, it seems to get the job done much better than plain journalctl.
If you can set up nginx to do what you want, you can learn how to use systemd. In my experience it is quite usable once you get a handle on it. I'm sure there are plenty of "getting started" kinds of introductions that could make it useful to you after maybe half an hour of time investment. Other commenters have answered your specific question.
YMMV. I find systemd a lot easier to work with (I actually started wrapping daemons in proper services instead of sticking everything into nohup or tmux sessions).
Right, then I have to relearn my userspace setup... well, maybe it's time - but I've been on Debian since long before I met my child's mother, before I even grew a single hair on my chest, before I had a bank account, a job, or a cell phone. But perhaps you're right, it's time to quit it.
MX Linux (Debian-based, uses sysvinit with systemd shims) and Devuan would seem to be obvious alternatives that wouldn't require changes to your setup.
I'll check it out. I have all sorts of weirdness with this system. Snap is great because Debian doesn't package a lot, yet all my snap programs are sandboxed in a non-useful way (like MySQL Workbench won't save passwords in a keystore because that hookup doesn't work - and it can't run without a working keystore; like Firefox will start once, but if you run the firefox command, it'll error out that Firefox is already running). Lots of annoying little things.
Yes, a well run / nice project. KDE3 seems far better than anything in the KDE or Gnome world since.
(I don't use kwin, but instead a different window manager, and only part of trinity for this and that, but that's not due to a lack of quality, I just prefer to mix and match.)
This is a good example of why there should always be an alternative way to access services besides a website or app. There's always an assumption that a computer or a phone is available to do things as simple as making an appointment right through to something as complex as planning and buying a vacation, which is fine up until it's not an option any more. In the past 20 years more and more companies have started to fail at this. It's practically impossible to do anything more complex than buying something from a local shop without access to tech now.
Conventional and unconventional behavior can both have unwanted consequences. We each pick our poison.
Another senior here.
As a child I scratched my father's guitar with a belt buckle I had been told to wear. I learned to distrust adult judgment, I still don't wear belts half a century later, and I have a successful research career for which I partly thank this errant belt buckle. I lost a potential life as a rock God, but I've been protected from ever buying Microsoft products. This story horrifies me but it doesn't surprise me; I made a fair trade.
So you did not learn to weigh consequences correctly.
Wearing a belt keeps you from a lot more inconvenience than scratching a guitar does.
Besides, unless it was some kind of guitar Jimi Hendrix played, I would not scold a kid for scratching it - certainly not so much that he would stop touching guitars, or stop wearing a belt, for life.
Buying Microsoft products saves you a lot of inconvenience as well.
It really does seem that Microsoft is now in the business of trolling its users. The degradation of "Pro" with ads and uninstallables (like frikking Xbox) is beyond disgusting. When a post mortem of its lost OS dominance is done in the (hopefully near) future, I have a feeling these anti-patterns will feature prominently. Edit: I'm sorry for your loss.
Microsoft also decided to wipe my entire inbox because I didn't log into my hotmail account all the time.
This was the first time an email company this big pulled some shit like this in the internet era.
All my correspondence from my high school years to my mid-twenties were gone, including cherished back-and-forths with people that were now deceased. It felt like someone came and burned down my house while I was on vacation.
Just thinking about it continues to paralyze me with rage.
Sorry for your loss, such a horrible story. The small things really do make a difference and this isn't even small but a major nuisance.
But...
Are you really saying he didn't just buy a new laptop for his last holiday!? Just buy the new laptop, even if the plan was it'd immediately go to someone else in inheritance.
To quote the parent post, "He was too tired to go through the process of getting it sorted out".
I don't know if you've ever been really sick (or just been around someone who's really sick or suffering from chronic pain), but being on chemo is a thing that completely shuts you down. I fully believe that going out to the store and buying a new laptop and getting it set up was beyond what the parent-poster's father felt capable of. Similarly, going out to the library and doing it all from a shared computer would be entirely too much. Anything that's not the well-worn familiar path would feel insurmountable.
My father was indeed too tired to do anything much in the immediate aftermath of chemo. I would cheerfully have booked and paid for the holiday myself, but he didn't tell me about it at the time. He wasn't much one for asking for help.
The internet is not the only place to buy things. It's also not strictly necessary to buy a computer solely to access the internet. For instance, you could ask your neighbour if you could use theirs for an hour or two or go use one at the local library. Especially if it's just to book a, what seems to be, very important trip while you're dying of cancer.
What in the world could that upgrade have possibly fucked up for an "email and e-commerce" machine?
I completely understand how things get fucked up for engineers and serious tool users, or users who need very specific hardware that requires very specific operating systems to run. But an email and e-commerce machine?
what could possibly have broken email and e-commerce?
Also, I'm very impressed that the very first comment I read on this article has twisted the topic of "value-add software that hardware vendors put on is garbage" into blaming Microsoft in record time.
It's amazing how good the vocal HN commenters are at things like this.
I'm sorry to hear about this. I wonder if there was anything you could have done for him. Like, if you knew he was having such difficulties, why not drop by one day and book the vacation for him?
Every time Microsoft forces its product upgrades (I'm not talking about security patches) on people through shady dark UI patterns, it turns thousands, if not hundreds of thousands, of people's lives to shit.
The fact that they keep insisting on repeating this awful practice, year after year, even perfecting it at each iteration, can only be explained by a company being run by sociopaths, in my humble opinion.
Microsoft has absolutely no sense of responsibility when it comes to its retail market, its product owners would rather compromise the integrity of any senior's computer to make it unusable than just accept the fact that most people on earth don't need a new Windows every two years.
I don't get it. For a few more bucks they instantly get their brand associated with slowness and unreliability.
How do manufacturers throw in all the crapware and expect a good user experience and happy customers? Oh of course, they don't care.
I've moved to Mac, but back in the days when I used Windows, the VERY FIRST thing I did was install a fresh copy of Windows, then install only the drivers absolutely required for proper function (I remember just finding the driver files and installing them manually from the Add Hardware dialog instead of installing the manufacturer's bloatware).
Wish things had changed in 10 years... apparently they haven't.
And of course, I'm tech savvy. I don't see any less-savvy senior being able to solve this problem. It should be regulated: a mandatory, ugly label showing all the crapware included with the device at the point of sale, perhaps even an extra tax for each piece of crapware included, discouraging manufacturers from bundling them.
You have to watch out these days even when installing a fresh copy of Windows as I found out recently, since some motherboards coughASUScough have software in the BIOS which can get automatically installed. There's an option that comes up on first boot and I clicked yes thinking it was some drivers, but it was the regular company branded bloatware more or less. Not going to make that mistake again.
It's called the Windows Platform Binary Table, and it lets the firmware provide an EXE for Windows to execute on boot. So even if you install a clean OS, this will run, and there is no mandatory prompt (if it's run through this approach and you see a prompt, then the software that was run was nice enough to ask; it was not Windows asking you to confirm whether that is OK).
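If you want to check whether your firmware ships one, the table shows up like any other ACPI table under Linux - a sketch; on a clean board the file simply won't exist:
# Is a WPBT present at all?
ls /sys/firmware/acpi/tables/WPBT
# Peek at the embedded binary's strings (needs root)
sudo strings /sys/firmware/acpi/tables/WPBT | head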
Yep, experienced this a few times on their consumer products and even older Thinkpads licensed with Windows 10 Home. It is possible to neuter all (most?) of the bloatware by deleting the associated binaries and wiping all of Lenovo's background processes, but it's still a royal pain-in-the-ass when reinstalling Windows on it. They're easy to shut up with Linux, but my heart weeps for those who insist on using Windows.
I had the same experience recently. It's a rootkit installed by the BIOS to inject the crapware.
Immediately returned the ASUS board and got something else. It sucked to tear apart a computer I had just built, but I don't trust the hardware in my machine now.
Not new. Windows Update will by default pull and automatically install drivers and software based on hardware IDs. If you try to uninstall it, it might be "self-healing software" (you can't actually uninstall it; you have to delete the services, autostarts, and files manually and in the right order), but in any case Windows Update will put it back soon anyway, until you disable automatic driver updates completely. GP already mentioned ASUS by name; MSI also does this - their boards install some weird, very hard to remove "audio improvement software" or something like that.
Fair enough. I suppose I should have said "new to me".
Also, I stopped using MSI when their motherboards weren't properly supporting the power, and thus thermal, requirements of Piledriver/Steamroller AMD CPUs. At least on some of the mid-high tier boards. (It said it could handle an 8350. That apparently was a lie.)
Windows allows the firmware to provide a binary that will be executed on first boot (or maybe every boot? not sure) - ASUS is abusing this feature to make their crapware permanent.
> Wish things had changed in 10 years... apparently they haven't.
We no longer have browser toolbars. I remember a time when I had customers come in with 4, 5, or even 6 different browser toolbars. The Google toolbar, the Yahoo toolbar, the McAfee toolbar, the Because-We-Fucking-Can toolbar, etc. Most laptops came with at least one toolbar pre-installed.
Haha! I was working IT support for the local municipality back then. One time I had to go into the office of the local orchestra to fix their computer which was "slow". When I turned it on there was a cat or a dog walking around on the screen, IE had at least 5 toolbars, including two porn tool bars and the bookmark menu was full of porn links. A hot mess to say the least. Cleaned it up and removed all junk, malware, spyware, toolbars and "fun" applets. Pretty sure it was back to the same mess a couple of weeks later ...
Toolbars were a big mistake, but I understand why people thought they might be a good idea. Explorer context menu hooks (i.e. "open in VS Code") are really useful, and without the benefit of hindsight, those seem similar to toolbars.
Those "few more bucks" end up in the right manager's PnL who gets promoted, though. Nobody at Lenovo leadership cares about Lenovo beyond the next few years; that's the problem.
Would be interesting to develop an executive/managerial compensation structure that functions like a 10-year call option. Maybe it could feel a bit like TV show 'residuals' - even if you leave the company you'd still get a check in the mail for the company's longer term success.
So much of the societal harm of corporate america comes from short term optimization at the expense of long term value. Would be great if businesses recognized this and found a good way to attract talent that thinks in decades rather than quarters.
How many CEOs dump all their stock right after leaving a company?
Maybe you could do a 10 year stock grant that is fully vested regardless of employment, and grant the same amount at the same schedule each year (or more frequently) so number of options is proportional to length of employment?
Exactly. A lot of boneheaded corporate decisions make a whole lot of sense when you think about the individual manager/exec who would've been empowered to make that call and which metrics might get that manager promoted or fired.
The guy at Facebook who decided to randomly re-sort your feed and emphasize a rotating set of "like if you like kittens or hate Obama" posts at the cost of your friends' messages wasn't thinking of the long-term success of Facebook. They were clearly aiming to increase "like" clicks on a graph.
The guy who decided to integrate Google+ into YouTube and Chrome and everything else didn't give a shit if YouTube or Chrome or Google succeeded. He cared whether or not Google+ membership/participation went up. Same with whoever made most of the decisions about Google Buzz.
The guy at Sony who decided to put rootkits onto music CDs didn't care whether people liked Sony. He cared about whether he could say that he'd done something to stop piracy, and presumably also had a goal of efficiently damning his soul to Hell.
I think this has generally been true of a lot of American companies for a while now. I would add it's not even "next few years", it's more like "next few quarters". If the CEO can juice the stock price for say, four quarters, he may walk away with several million in the bank and who cares what happens after that?
Being slow and crap is basically what consumers expect in a Windows laptop, which is why they largely moved to mobile, or to a work-provided laptop without the crapware.
Imposing restrictions on OEMs got them in a bunch of trouble in the past, and restrictions like what you're suggesting are actively getting Google in trouble (see the recent fine imposed by India).
Incidentally, this kind of thing is why my personal laptop is a Microsoft Surface, and my personal phone is a Google Pixel.
I see this indeed; it is why some older people move almost entirely to an (entry-level) iPad and are quite happy. The Windows laptop is something of a necessary evil to them: they use it from time to time but hate it (fear of viruses, nagware, always needing to spend time updating because they use it so little, confusion because of Windows S).
On the laptops I've used with pre-installed Windows (which I reinstalled), the license key was in ACPI table "MSDM". I assume Windows gets it from there automatically. You can extract the license key from Linux with: cat /sys/firmware/acpi/tables/MSDM
Yes; your laptop should have the Windows license embedded in firmware. If it came with one, that is. You can download the Windows installer directly from MS.
Installing all the drivers needed is on you. They might or might not be downloaded from Windows Update.
If you happened to buy a machine with an OEM Windows Home license baked into the motherboard and want to install retail Windows 11 Professional, this makes it extremely frustrating because you no longer get the choice of which version of Windows to use during the install process.
You need to add the EI.cfg and PID.cfg [0] files to the installer medium before booting it. Once you have those files present with the correct syntax, it will install the version you want, but I can't imagine a non-tech person being able to figure this out on his own.
I was mostly concerned with making sure none of the preinstalled Windows Home bloatware would remain after an upgrade. I figured the safest way would be installing Professional right off the bat.
In general, I think, yes, as long as you're installing the same edition. Usually the license is somehow stored in the firmware.
However, in any case, even with a separate, expensive "boxed" license, Windows helpfully provides a convenient way for the firmware to drop additional useful software (aka crap) onto your fresh clean install. As in, it actively checks whether the firmware would like something executed and if so, executes it without prompting. https://news.ycombinator.com/item?id=19800807
You can install windows 10 for free and leave it like that forever if you can deal with the watermark and non-restrictions like not changing the wallpaper.
> How do manufacturers throw in all the crapware and expect a good user experience and happy customers? Oh of course, they don't care.
Edit: To be clear I'm not defending the companies, they shouldn't put all this crapware on there. I'm just saying most customers won't notice it on a new laptop.
In most cases on new laptops their crapware doesn't make a huge difference, because there is enough CPU headroom. The people who are doing so much that they would notice probably will reinstall fresh Windows anyway.
- Running Linux (assuming they can deal with everything else that comes with that)
Even these aren't perfect, but everything else I assume to be hot garbage unless you're planning to install your own OS anyway. The problem, though, is that this mostly excludes the real budget-tier devices (at least for nontechnical folks), which is unfair to those who can't afford anything else
While this is covered under "first-party," I wanted to name the Chromebook category here. Getting my father a Chromebook totally eliminated the regular fix-it sessions that I'd gotten used to having with all his previous computers.
Yes. Technologists seem to have a blindspot around Chromebooks even though they are quite popular.
Related rant to OP: compute power is so ubiquitously affordable nowadays that there is no excuse for apps to be slow even on these cheapest new devices.
I recently made a 3D game in my custom engine for the Ludum Dare 51 game jam[0] and I was shocked to see it run at a solid 50fps on a $118 IdeaPad 3[1].
50fps means every _20ms_ the game does at least:
- builds an oct-tree and auxiliary data structures for 1000+ physics objects, traverse this to find collisions, solve all physics constraints, compute new transform matrices for each obj.
- serialize and upload each object, view, and scene data to the igpu (no dedicated gpu)
- do vertex transformations on ~25k vertices + rasterize 30k triangles TWICE: once for the main camera, once for the shadow camera; rendering to 5 textures/gbuffers
- run two post processing passes over 1,049,088 pixels, doing outline + concavity detection, vignette, and gamma correction (18 texture samples per pixel).
- all the game logic including: reading inputs, character controller, firing cannons, updating projectiles, splintering meshes, updating game state, audio, UI, custom ECS query engine, etc.
The whole payload, unminified code + 3D assets + sound + textures, is 1.2mb over the wire. This is an engine I built over the last ~year in javascript(!), so it's not had nearly as much optimization as Unity etc.
And yet every day I encounter websites and apps that fail to be responsive to even basic button clicks. There's no excuse for this.
There is some serious flaw with software people (all stakeholders). It's like they are addicted to "staying relevant." Most of the time people don't need new features. Why isn't it being drilled over to the project managers? Instead of maintaining and supporting what they created, they are more into ruining and adding bloats of features. (looking at you, Jira).
There are enough engineers to have custom tailored softwares. Figma was a stellar example of one well-designed single purpose software. All they did was maintained and optimized the crap out of it!
Game dev is one crazy field that proves how bloated the modern tech stack is.
With games, it is brutally competitive and the end user cares about performance.
With web dev, there is often total separation from customers so huge resources are expended on things that add no value for users but that create lots of work for devs. Then you have people using inefficient bundling solutions, not splitting UI up, using huge imports, etc.
Also, UI is tricky which can lead to a general over-engineering that adds to package size...even if there was good knowledge of customers, I think devs often find it hard to produce clear/functional UIs (for example, stuff like change of behaviour on hover...bad idea).
I think project managers are unnecessary. If your devs don't understand your business, then there is nothing that a PM can do. It is like polishing a turd.
The whole industry has a massive issue in that developing software is inherently a commercial venture, but 95% of devs leave college understanding very little about business. That is fine when Google wants to flush billions a year down the toilet for some software dev/bureaucrat...but the RoI at other companies is clearly being interrogated more closely now.
Anecdotally, I purchased a surface pro 7 and the ridiculous number of unexpected automatic updates and reboots without warning (overnight) lost me a bunch of data. Luckily it wasn't work related, but it has continued my deep-seated hatred of the MS "user experience". Don't get me started on the tracking & surveillance crap built into Windows 10+.
It was okay after about a year of random reboots. It's a shame because MS was looking to become "better", but they're just better at increasing their bottom line at the expense of user data and their privacy.
Software development and everything feeding into it should have a code of ethics like other professions: medical, law, engineering. I don't know how it will survive long-term without losing trust. But then again it's been limping on from scandal to breach to incident to overreach for decades now.
A Surface device just gets you away from additional OEM crap (since most MS "OEM crap" is included in Windows anyway). You still have to deal with running an OS written by Microsoft.
Apple or MS Surface or Pixel are the only ones where there's some direct connection between quality (os quality, ux, physical build quality, etc) and the people actually building it. Every other player who is not first party can just point fingers, it seems.
A couple years ago, when there was an MS Store in a physical mall in the area, I got close to buying a surface book (or... surface pro, or ... I can't remember the naming). The build quality and specs seemed decent in person. It's been the only time I have been tempted to jump back in to the windows world in the past... 15+ years.
Even google - I bought what as a 'flagship' phone from them in ... 2010(?). About 2 weeks later, it was discontinued, and I couldn't get any updates a year later. And the experience of setting it up and using gmail was... abysmally slow and just BAD. And this was getting whatever their 'top of the line' was at that time (it cost north of $500 at that time).
I've dipped my toes in and out of Android over the years, needing to get devices for testing client projects, etc. Sometimes it's been OK, but never enough to make me trust it again. If the experience from google on their own device with their own service was subpar, I just don't have the time to dig in and 'make it work'. When I showed someone, they told me the problem was gmail was slow because I had 'too much email'. As if the selling point of gmail for the previous 4-5 years wasn't "we'll hold all your email".
100% agreed - but I'd also add Framework[1]. Completely unaffiliated with them, just a fan. They make an incredible product for a new company and the ability to repair and upgrade parts in the laptop form factor is incredible.
Is it any wonder why Apple has been so successful with the iPhone, iPad, and their services? I can't stand the iron grip they have on the App Store, but on the other hand - traditional, wide open computing environments have become so hostile to users in terms of data collection and tracking. If I was a non-technical person who had to use this Lenovo laptop mentioned in the tweet thread, being given an iPad would feel like water in a desert.
Thankfully, it is a laptop (presumably a Windows PC), not a phone or tablet.
It may be full of junk, but you can remove it, reinstall a fresh copy of Windows, and there is a good chance it can be fully functional on Linux too. I don't know how it is built but 10 years from now, there is a good chance it will continue to work as a useful computer, with OS security updates.
It doesn't mean there isn't a problem with Lenovo installing crapware, and not everyone is tech savvy, but that senior just had to hand out the computer to someone who is, and he solved his problem.
Now, with phones and tablets, you can't do that. You are stuck with the manufacturer's OS. Sometimes, when the manufacturer is nice enough and there is an active community, you can install third party ROMs, but often with significant loss of functionality: lower quality camera, some apps refusing to install, etc...
I am a bit worried about Apple right now, they are making their laptops more and more phone-like. Apple has a rather good UX, and a consistent ecosystem that their users love, so, fine. But once Apple starts to do something, other manufacturers (PC, Android, ...) have a nasty tendency to copy, even the worst parts, and do it poorly on top of that.
My grandma bought a mid-prized Android phone. All she does is lookup recipes on Youtube and Whatsapp her family.
After 2 months (!) she came to me with the phone. Somehow she now had ads on her lockscreen, her default browser was hijacked by some other browser app with ads, and multiple apps had persistent notifications and were running continuously. Among them AVG anti-virus, some cleanup tool...
I have no idea how they got there. Probably because they pop-up in Ads with texts like "YOU NEED TO INSTALL THIS".
My grandma can't differentiate between an Ad like that and a serious System update notification.
It's stuff like this that leads people to just "give up" and buy an expensive Apple device. For all their faults at least they protect you from this kind of nonsense. The Lenovo is just the same, really. A profound disrespect for their own users and the usability of their own product.
Apple isn't for everyone, but it's a lot more consumer focused than Android is.
It's one reason why Apple phones still do really well on the secondhand market.
My parents both had Android tablets (Samsung iirc), but they didn't use them for very long, switched to iPads, and used them at the dinner table for years afterwards - just checking news, playing games, that kind of thing.
My dad bought a laptop the other day to play games and watch F1 with - at least the pre-installed shit wasn't too bad. Still had to talk him out of getting both AVG and some other virus scanner though, that would've bogged his system down and make him lose trust, because virus scanners these days are scareware - that is, if they don't pop up every once in a while pretending to be useful, people will get rid of them.
On the other hand, it’s hard to make great innovative leaps with a mature product. The iPhone is something like 15 years. And I’m happy that they don’t change things just because they think they should.
And it looks like their laptops actually found the path again.
At the same time, they keep adding useful features. I noticed something new in the Notes app the other day. I had entered a recipe as a note and all the recipe quantities (e.g. "1 tbsp", "5 cups", etc) had a faint underline. I tapped it and got an overlay with the units in a bunch of variations: liters, CCs, gallons, pints, etc.
Sometimes Apple products just have these really nice touches. Other times, it's confounding where the gaps are and leave me wondering if Apple even uses their own product. I have two garage doors that I've HomeKit enabled and if I tell Siri "close the garage door" I get asked which one, even if only one is open. Even if I say "close both garage doors" or "close all garage doors" I still get prompted.
I'm coming from a techy 30-something person with Android for years, but also had iPhone for work from 2014-2019.
The UI of Nova Launcher on my S21 is so much more intuitive than Apple. Apple does everything through swipes. Swipes suck for several reasons, one of them being that I must swipe certain positions at certain speeds to get everything done. I miss buttons. Related: After two weeks of owning an iPhone 14, I was complaining about how I can't move my cursor in the URL field. It just kept highlighting words! My friend told me the super-intuitive iOS experience: hold down Spacebar to move your cursor. I'm glad that was covered in the non-existent tutorial.
Then there are the anxiety inducing badges, the fact that settings is an app instead of a system menu, the fact that control center (or whatever the system drop down is) can't have nearly the function of Android's system drop down, the fact that notification center doesn't feel trustworthy or consistent, the fact that notification center shows me notifications from blacklisted apps even in focus mode...
Android's UI just makes so much more fucking sense. And I hate that I am tempted now to go back to my S21 with inferior camera and DAC/bluetooth stack just to have software that makes sense.
I acknowledge this is a rant and not as well put-together as most of my comments, but effing Hell, iOS is a mixed bag.
> one reason why Apple phones still do really well on the secondhand market
Apple products do well on the second-hand market because Apple usually doesn’t cannibalize that market by selling low-end devices. By positioning themselves as a premium brand and ensuring that the low end of the market is not served with new products, they ensure that the low end will be served by the second-hand market. PC/Android can’t do this because they don’t control the hardware and there’s always some brand that will sell new low-end devices which compare favorably to any used device.
In the few cases where Apple has deviated from this strategy of not serving the low end, resale values have suffered. The iPhone 8 and iPhone X were released at basically the same time. But the X currently sells for roughy 25% of its original price whereas the 8 sells for closer to 15%. This is due to the introduction of the SE which is basically just a cheaper and better 8.
Similar story for my mom. She was used to windows because at work she used locked down, centrally managed, IBM thinkpads for many years. The trick is that those were secure and managed. If anything didn't work, they just got swapped by IT for another clean one.
But she always had problems with her private laptop, and no assistant or IT department to fix it. We thought she needed a windows one because that is what she knew, but the switch to a macbook air took maybe a week or so to adjust to.
It was a bit of a gamble, since I don't own a mac so supporting one via phone would be very tricky. But she has had the macbook now for years and I've had to do absolutely zero tech support on it. For emails, Word documents and some web browsing it really just works.
My dad had never used a Mac in his life until I helped him buy an M1 Air. He loves that thing. The only thing I did was install Brave and make it the default browser.
I think he gives Apple credit for how fast his internet is, when really it’s due to Brave. But the rest of the experience— no crap ware, no spyware, no ads in the system— that’s on Apple.
Edit: forgot the main point, which is that he needed no handholding. Given how often he had to replace his crapware-ridden netbooks in the past decade due to performance degradation or something physically breaking, I showed him the trivial math that made the MacBook look like an economic decision as much as a QoL decision.
Safari with a content blocker like Wipr is very fast too, and easier on the battery compared to chromium based browsers.
My dad’s life is made easier with a MacBook Air + iPhone combo because all his 2FA SMS codes get auto inputted into the 2FA field so he does not have to type out the number.
The absolute worse for me was when I found myself trying to explain to my grandma that you must ALLOW cookie preferences, but you must absolutely DENY notification permissions. For trying to read some news online... The web is a harmful mess.
The shitty thing is, she'd probably be better served by denying cookie preferences... but despite whatever regulations say, companies find impressively difficult ways to keep the user from answering "no, please don't spy on me".
Every cookie banner that I've ever clicked "no" on has let me proceed to the main site. The user is still a potential viewer of ads, even if they can't be microtargeted, so it's worth it to the site to let them proceed.
i hope you understand the inherent irony of your second suggestion, but if you missed it: I Don’t Care About Cookies just got bought by one of these companies known for similar types of abusive UX (surveillance-ware).
There is a fork by the community "I still don't care about cookies" (note the "still") that should avoid any evil stuff Avast would do to the original extension.
I had the same thing with my mother. She plays Candy Crush, actively uses WhatsApp, Youtube, Facebook, takes photos to share with friends/family and not much else.
Tapping adds apparently lead to installing these things called "Launcher" which are essentially complete phone UI and those launchers have ads that lead to even more hideous apps. They also lie about the thing you are installing, she was trying to get some emojis and I checked back then, the launcher in question was disguised as emoji thingy. You need to carefully read the text to understand that you get the emojis together with complete phone UI replacement.
I can't stand the Android way of doing things. How it is possible to replace the main user interface of the device? I understand the desire to completely control your device and I do support it in principle but this sort of modifications should be possible only by going through scary screens that let only people know what they do achieve that sort of device modifications.
One has to grant permissions to install unknown software. That means going into settings and flipping a toggle. Are you saying these ads somehow bypassed that?
These days on Android when sideloading things the installer will give you instructions and then can automatically send you to the correct settings screen and scrolls down to the correct toggle, and it may even highlight it I believe, it's been awhile.
Who said anything about unknown software? Unless of course by unknown you mean software on Google Play store, then yes that switch is probably switched to install Candy Crush and WhatsApp. It's not a 2007 Nokia after all, some "unknown" apps are needed.
> It's stuff like this that leads people to just "give up" and buy an expensive Apple device. For all their faults at least they protect you from this kind of nonsense. The Lenovo is just the same, really. A profound disrespect for their own users and the usability of their own product.
Be careful what you wish for. For many power users, overlays and replacing the default browser are 100% essential. Removing them just for the sake of user safety would be a grave mistake.
Google is already pushing to replace overlays with "bubbles" for example.
I'm not sure how much protection this affords, but I'm aiming to only buy phones that are on the LineageOS support list, and can have their bootloader unlocked.
Weird how things that the manufacturer would consider insecurities, eg unlocked bootloader, allow end users the kind of control that can increase security and privacy (admittedly, maybe conflating privacy and security). Highlighting the disconnect between industry security and user interest (even though users, by and large, aren't interested in their own interests).
I treat use of SafetyNet or attempting to detect root as actively malicious and give apps that do that 1-star ratings on Google Play.
I realize there's a theoretical risk the OS could be compromised or malware could have superuser permissions, but my previous attempts to find any significant data breaches or user harm caused that way have come up empty. If anybody knows of one, I'd be interested to read about it.
SafetyNet doesn't protect me from having my personally identifiable browsing data sucked away and sold off, and that happens daily to all users of their devices. Google can't even sanitize their own ad platforms.
What are you attempting to protect me from then SafetyNet? Oooh, I get it, you're trying to protect apps created by companies that collect and store sensitive data to minimise the chances of said companies being sued for, and having to publicise, a data breach. Ok, why didn't you just say that? Yeah, you're right, it doesn't play anywhere near as well to the great unwashed.
SafetyNet sounds like shit to me:
> requires Google Play Services to be enabled on the device for the API to function smoothly.
> SafetyNet works in combination with the snet service on Android devices which collects data about the integrity of the device and constantly ping Google about the safety status of the device.
> Google offers many other options like application sandboxing, encryption, app-based permissions and so on to secularize apps but none of them are considered as an all-inclusive solution.
(GPS permissions required for wifi scanning? Fix your basic shit first!)
> By identifying devices which are currently in the non-tampered state the API provides an assurance that the device on which the app is running is neither rooted nor using a custom ROM.
"expensive Apple device" I know it's a lot of money for some folks, but a new iPhone SE is $400, and old iPhones remain serviceable for an absurdly long time — the iPhone SE 1st gen was compatible with latest-versions on release date for seven freaking years.
How is a $400 device that lasts seven years more expensive than even $200 budget Androids that have to get replaced every two years?
My dad accepts every websites request to send notifications. His phone is just constant chime after chime. I went into his Chrome, removed them all, and disabled the option, but somehow it finds itself on and his phone loaded with them every couple months.
And yet every time mobile Safari comes up people on this site go off about how it's "broken" because it does not support notifications from websites. There are legit uses of website notifications, but 99% will be spam like this.
It's because people assume that these things are benevolent because they were made by people. Gen X on down are natural skeptics so we don't end up with as much garbage on our phones, it's still possible to accidentally click a pop up when you were getting ready to hit a link.
> possible to accidentally click a pop up when you were getting ready to hit a link.
I personally believe that some sites are designed to shift just a bit to cause this. Like they've used analytics to know how quickly people push the link and now they've designed a slow loading image to shift the page at that point.
I find the best solution for people who accidentally click on ads is ad blockers.
Firefox for Android has AdBlock plus support now. The browser itself is somewhat slower than chrome on Android but totally worth it for the ad blocker.
That's why you should install an adblocker for any relative.
Brave will kill all ads, and many apps like Trackercontrol offer a local DNS server that will block any ad - related domain.
> It's stuff like this that leads people to just "give up" and buy an expensive Apple device
It’s usually not even “expensive” when you consider the frequency with which people replace low-end hardware. All of my Android-using relatives replace their phones and tablets 2-3 times more frequently so they end up paying more for slower hardware and then having to spend time playing tech support instead of using their devices.
To be clear, that’s not Android’s fault in the sense that the market is broken: Apple is fine with you having a 6 year old device because you’re probably using the App Store, iCloud, Apple Music, etc. There’s also some lesson about how consumers want the ability to install things outside of the App Store but statistically a large fraction of us can’t do that safely.
I have the exact same issue with mine. As a workaround I set a password to install apps from the play store and it works, she never download new apps so it's not an issue.
An iPhone SE isn't "expensive" if you value your time at all.
I got all of my older relatives and in-laws to buy one (eventually) and they've been 99% problem free ever since. With Androids it was even worse than trying to get their "got it from the store on sale" Windows-laptops cleaned up.
My mother is on her second iPhone SE, she used the first one until it didn't get software updates (5 years IIRC). If everything goes as usual, she'll be using it until 2025.
what's messed up is these apps are in the play store, rated e for everybody (so parental controls can't filter them) and have 5 star ratings from thousands of users.
Have you used one? I know someone who bought one because she wanted an easy-to-use device, and it was nearly unusable. The thing one had like 1 or 2 GB of RAM and 16 GB of storage which was quickly filled by photos (on crap quality camera). The thing somehow became slower than molasses to operate.
Thankfully I was able to copy off the pictures, and I bought her a Pixel 4a, installed a tempered glass screen protector and thin/minimal TPU case, went through the settings with her, and even showed her screenshots of more user-friendly third party launchers. She opted to keep the default launcher, and it's been smooth sailing since. (I also installed Firefox and uBlock Origin on it.)
My father, who is now 76, introduced an Apple II into the home. He wrote a number of programs himself for a business he started (that lasted 30 years). Followed by a series of PCs, the first of which had the pre-Windows 95 GeOS on it.
These days, he fumbles with his phone; half of the time hanging up while trying to answer it. His texts messages are usually unintelligible as he totally relies on voice to text. This can mostly be attributed to cognitive decline following a major heart attack which was accompanied by an undiagnosed stroke, a minor heart attack, and a diagnosed stroke. He still dabbles with Kodi and VPNs but is always at least a year behind the curve, and new technology thoroughly confuses him.
That's left me wondering: will there be a point at which I struggle, after having spent decades writing code that powers the web?
>That's left me wondering: will there be a point at which I struggle, after having spent decades writing code that powers the web?
I'd say it's not that tech will become too difficult to use, but the constant and frequent changes will start to annoy you to a point where you just don't invest as much time into learning how the new stuff works. YMMV, but I am less than half your dad's age and I'm already starting to feel it.
I used to spend hours customizing everything I used to make them "just right" and updates to software would always inevitibly break them, and I'd repeat the cycle. As I've gotten older with less free time, I've just started changing my own preferences and workflows to fit the defaults. Even that doesn't always seem to be the way as UI/UX changes mean disruptions for disruptions sakes rather than changes that make my life easier...
I'm 50 and I start to struggle. Not because I can't keep up with the tech but because I refuse to. No smart tv or doorbell, a 2004 unconnected car, physical light switches etc. I spend too much time figuring out what printer will not DRM me into buying branded ink. I wonder how much longer I can keep up with this, because everything is becoming smart and "as a service". It seems the perfect products for me are the products from 15 years ago that I can hardly buy any more.
Well, my dad wrote sophisticated engineering programs for mini's, and taught me to program BASIC, but now he needs me to get his new laptop to print on wifi, and install a weather app on his Android phone. The world is quite a bit different from 40 years ago, and you do have to put in some effort to keep up with the changing paradigms, but I think that effort is up to each of us. What I worry about is being able to see the screen, and type. As long as I can still do both, I will probably never retire.
Apple IIc was our first family computer (Really mine since I used it most). I loved that I could carry it around and plug it into the TV and have a color monitor.
My father uses a Windows 10 laptop, and a Samsung Android phone.
The biggest pain point is that Windows sometimes decides to do an update, and upon restart, shows a splash screen with a wizard which wants you to make decisions and read/understand a few things. He has no idea what going on or if something will break, or if he actually did something wrong. I solve this via WhatsApp video calls "quickly", and then have to explain to him for 5-10 minutes how it's not his fault.
When it comes to the phone, all he essentially uses it for is YouTube and WhatsApp. When I visited a while back an OTA update had showed up and I promptly started to install it. While it downloaded i found ELEVEN registry cleaner derivatives installed on his phone. He couldn't have gotten this from anywhere other than YouTube.
Again I have to hear this; "I'm so sorry son, you know we have no idea what's going on, please forgive us for causing you so much trouble."
I know they feel guilty for not knowing how to handle this stuff themselves and wish they didn't have to "bother" me so much. In my eyes everyday tech took a gigantic leap from having a TV with a remote to requiring people to be super users in tremendously more complex software/hardware.
From the thread:
> Maybe the most frustrating thing for me is how often I'm working with people who are apologetic and unsure BECAUSE of these derailments
> They think it's their fault! They think they're not capable of doing things BECAUSE they keep getting fucked with by this user-hostile crap
So much this. I have similar conversations with my family all the time. My mom actually uses a hoke computer for business related stuff more than anybody. At one point a few years back after some Windows update that messed some things up I booted a vanilla linux mint off a USB and had her play around with it. She liked it because it was simple and easy to understand, so she's been with thay for a few years.
I've had the exact same experience. Bought a cheap ultrabook (whatever was within my wherewithal back then) and wiped everything. Put Linux mint on it and gave it to my mom. After years of struggling with Windows she got a user interface where she was comfortable in for around 2 years.
My brother bought her a "better" laptop with standard issue Windows bloatware. She hated it and kept asking for Linux Mint. She said that the Windows UI was too confusing.
Ethics and privacy aside, I just find it plain weird that hardware vendor would deliberate make their own product worse like that. And this is not the first time Lenovo has done something like this, apparently they did face any consequences.
When I still worked as a sysadmin/support person, the ritual for preparing a new PC for use included replacing the preinstalled Windows 8/8.1 with a fresh install of Windows 7 from a vanilla installation medium. This was in part because the software our SCADA people used required Windows 7, but we did it to all PCs just to get rid of the useless crap vendors would ship with their machines.
Fortunately, the Windows 8/8.1 Pro license allowed us to do that. But I don't know if the Home license does, and frankly, you cannot expect non-tech people to do that.
As a long-time Linux/BSD user on my private machines, I am glad I don't have to deal with these problems in my spare time, but I am part of a pretty small minority in this regard.
I remember reading a story about the development of the Apple Macintosh and its OS, and one developer talked about how Steve Jobs gave him a hard time about the time it took the machine to boot. Even if you shave off just one second, he argued, imagine how much time that saves all the users over the lifetime of their computers. And now imagine how much time is wasted, in total, because of hardware vendors sabotaging their own products.
This is why I tell every senior or “muggle” I know to just get an iPad for their computing needs. Sandboxed. Cheap. Backs everything up for you. Totally managed experience. I’ve switched numerous folks in that demographic over and have never once heard anything but rave reviews. F*ck PC’s and the entire race to the bottom market they represent.
That was the right move for my mom, mostly. Not perfect and she’s completely unable to remember passwords so I’m concerned she will brick it one of these days.
Personally I have one of these Lenovo’s and a default install of windows helped tremendously. I also went with linux mint on it to just completely get out of that s** show. So much lighter weight, fast. Not completely stable though, Vivaldi and firefox have some crashing and high CPU issues on it.
This is why I buy Apple laptops. They always work and come with zero bloatware. You can buy the cheapest laptop they sell, and it will work effortlessly for email, web surfing and most other generic tasks.
Their lack of bloat shows in their hardware specs. I've worked on Dell laptops and they were spec'd out at 64GB. My latest MacBook Pro has 16GB and its feels MORE responsive than the Dell computer.
Each MacBook Pro and MacBook Air I've purchased has also lasted many more years than my friends HP, Dell or Lenovo laptops. Honestly, the only reason I typically upgrade my laptop every 4 or 5 years is because I just want something new.
Apple products cost more, but given their full lifecycle, I think they're cheaper to own than most other brands.
I would immediately reinstall a just bought laptop from scratch, as in through a Windows/Linux ISO, not through their own recovery, especially from Lenovo, they have been caught with Superfish adware/malware before:
That's fair, I would recommend a MacBook, Chromebook or iPad to senior folks, even though I loathe the companies, they're perfectly fine for those who do not want to tinker.
Have the exact same experience with my mothers laptop. Some acer shit.
Within a few months of her using it she complained it’s slow, takes ages to turn on etc. After removing almost everything from the manufacturer it was usable. But even now she barely uses the machine and never installs anything and it’s unusable slow with essentially zero usage after a couple of years. She’s not even ran any updates. It wouldn’t surprise me if these machines are built to die.
Try Ubuntu (or whatever Linux you enjoy, it's just that Ubuntu is mostly hand-off for 5 years), I just bought a ~6 y/o laptop for my brother, a Fujitsu Lifebook E736 (introduced in 2016), 512 GB hard drive, 8 GB ram, some Corei5 CPU 150 eur, I did put a new battery in it. Works like a charm, 0 complaints.
But, to be honest, the guy that sold it put some super bare Win10 on it and that also felt pretty fast.
In any event, for email and web-browsing I think Ubuntu is just perfect. I have it on our HTPC and the whole family uses it without thinking twice. And it stays fast.
Oh, our family laptop (the one the kids use, and I do on occasion) is a Dell Latitude E5450, first introduced in 2015(!), I got it 2nd hands in 2019, it looked barely used. I put Ubuntu on it (but recently I installed Arch, when I regularly use a machine myself I prefer it), the thing still feels like a fast new laptop, the kids went through Covid with it, using Teams (for Linux) and doing online homework. I use it for dev work on the side. Love the machine.
My advice: Get a second hand business model from a good source (some local trader that gets good reviews, or you can visit, we have a lot of local IT people gathering on a website here that sell older business hardware [0, Dutch website]). Plus, it's nice to give stuff a second life instead of buying cheap throw-away shitware for cheap.
I'm surprised there aren't more suggestions to ditch the windows install totally and go with a Linux distro like Ubuntu. Personally I also know of family members that get on very well with chromebooks
I agree, but the biggest obstacle is still hardware compatibility. Any device you buy out there that plugs into a computer will work with Windows. They might not work with Ubuntu or even have been tested with Ubuntu (so even the manufacturer might not know whether it's compat).
Having said that, many devices just need to sync with your phone and not computer, so there's maybe less of an issue than before. For example, my bike computer doesn't need to be Win compat since it syncs with an app on my phone.
During the pandemic I used what I had to hand, I chucked Fedora on a Vostro 3750 (which was my old dev laptop years ago so had an SSD and 16GB of RAM) and the thing was perfectly acceptable for all his schooling, youtube stuff, since that model was a 17" battleship desktop replacement it was perfect for him with an external keyboard and mouse .
I would honestly just move right on to Linux Mint. If they're familiar with Windows, they will be right at home with the Cinnamon DE.
I'm really impressed with this distro and use it as my own daily driver. One of the few times where I do very little tweaking to get the UI/UX just right for my own preferences.
I use Fujitsu since a long time at home (3 notebooks, 1 workstation). I think Fujitsu has almost none bloatware. For me the reason, why I buy this (a bit expensive) brand .. and Linux does work mostly well too.
My mother has a gas meter in her home, which requires a usb device to be plugged into her laptop to load value on to this device which is then plugged into the meter and transfers the value. It is not compatible with anything but Windows.
arguably just force you to use windows. you could possibly go to a neighbor or library or something, but.. well.. a library would probably lock down random usb drives being put in or random software to manage gas meter accounts. and ... your neighbor may not want to share because you'd have access to their meter info as well (assuming you could even do that)... so... yeah...
Just give up on low-end Windows laptops in combination with non-savvy users. Give them an iPad instead.
Starts instantly, not in 10 minutes. Great battery life. Automatic backups when properly enabled. Far less security and malware risks. Few to none resource hogs slowing down the entire system. Not really possible to get a corrupt system as seems inevitable on Windows.
Even then you'll have some support. Like that time my mum called to say it's completely broken, doesn't turn on.
I do the same thing with my parents. They use their iPhone for pretty much everything and I get to spend thanksgiving talking to family rather than fixing their tech.
People complain about Apple having a closed ecosystem, but doing so disables all the spying / crapware that plagues Windows and Android.
I use Windows 11 on an Intel 10th generation NUC and it is very fast. It came from Novatech in the UK.
My last computer was a 32 core 128gb RAM Threadripper from ChillBlast.
Surprisingly the Threadripper had a slower single core performance than the Intel NUC.
I also use a Ubuntu virtual machine in VirtualBox with Vagrant for development machines to do DevOps.
But I also use IntelliJ and it is fast enough on the NUC and it's SSD and 32GB RAM. I don't play games. But I write multithreaded software so I get 12 logical cores to play with. (6 physical cored and 12 hardware threads)
I think the innovators dilemma has occurred with mobile technology. Mobile chips such as my NUC and ARM chips in phones are good enough for development. (See apple silicon) I can run 6 virtual machines without slowdown.
Admittedly my IntelliJ is running on a small repository. I suspect it I was working on an employer's project it would slow down due to all the files.
Most of the time the computer is waiting for IO and the CPU can be different levels of idle. I recommend the Gist "latency numbers every developer should know"
I would recommend people buy from manufacturers that don't preload stuff into their laptops. Such as Framework, System76, ChillBlast, Novatech.
EDIT: I had an ASUS but I installed Ubuntu on it and I cannot remember if the windows had lots of manufacturer software on it. I rarely booted into windows.
This is true even when installing a fresh copy of Windows. Microsoft have made some deals to include a lot of apps that gives you nothing. Zero. Well, unless you for some reason need candy crush on your laptop. If that's your thing, sure, enjoy. For the rest of us; please just give us the minimum.
Pre-installed Office package just in case we will buy a license in the future?
It's not even limited to a 'fresh' copy of Windows either. When you 'upgrade' to Windows 11, it automatically installs and enables Microsoft Teams (formerly Skype) at startup without the user's consent. I have never once used the software in my life. Surely the capable engineers at Microsoft have a way to detect if I have Skype/Microsoft Teams in my previous installation to make an educated guess as to whether it should be enabled in the new installation?
Perhaps ironically given the thread author's concerns about surveillance, I solved the computing problems of my elderly relatives by switching them to Chromebooks.
They really just use the computer to browse the internet anyway. The fact that there's no room for malicious actors to install crap on that architecture to interfere with the user experience because the user experiences completely dominated by one megacorporation's notion of how the computer should work has been a Godsend for maintenance of their machine.
Now, the Android phone is an entirely different story, unfortunately.
On the other hand, my very high end work-issued Lenovo laptop came with zero crapware installed. Only one Lenovo support tool which I used when I needed and has never bothered me ever.
I think this problem is more with the consumer line.
I've been in the same situation as OP before though with computers from parents and friends, and it's unbearable. A straight reinstall of Windows after buy is the only viable option. I had to diagnose a not-so-old pc sold with all kinds of crap installed, 4GB of RAM, Windows 10 and a rotating 5400HDD thrashing all the time.
>I think this problem is more with the consumer line.
That's my experience as well. So if you need a cheap Windows laptop, a used corporate class laptop is often a better path than a cheap new consumer class one.
> I think this problem is more with the consumer line.
Not even all of their consumer products are bad. It's often the cheap (sub $600) laptops that I suspect gets subsidised by the crapware that comes preinstalled.
When people ask me for recommendations for a laptop for around $400, I always say "second hand" or "save up a bit longer" because these days spending less than $600 is often a recipe for disaster.
> When people ask me for recommendations for a laptop for around $400, I always say "second hand" or "save up a bit longer" because these days spending less than $600 is often a recipe for disaster.
This is the best advice you can give, but I'd include the option of an iPad/Chromebook. And even with a Chromebook anything <$300 is asking for trouble. Realistically, a decent laptop that will be performant and cheap that is well made doesn't exist. You're looking at ~$800 before you get get anything relyable.
My Dad has 2 old laptops. Both perfectly serviceable but dog slow due to Windows 10 BS. I've been called to help get them back in a usable performance state on several occasions. A combination of MSs own stuff and third party software bogging things down. Installed Pop_OS 6 months ago and haven't heard from him about those machines since. Linux is the way forward.
I got my grandfather a laptop and loaded a Linux distribution on it. With a little bit of planning I was able to create a set of desktop entries to allow him to instantly open the applications he needs, on top of that I created a basic html landing page to take him to news/email/&c. It was a breeze to set up and it has allowed him to actually use his computer. On windows I would have had to constsntly fight against it, with this setup I am in control in a way that let's me craft an experience that fits his needs.
Asus has a line of products called Asus Pure. Those are the same Asus laptops as their "normal" ones, but Pure models do not have any 3rd party software, no trialware, no test versions etc. Plain Windows, nothing else. But I think those are only available in Finland, Sweden, Norway and Denmark.
At least us, en, gb, es, it, and de versions the the URL did not work. Not sure why those are not available everywhere, but those are the only laptops I buy for my parents.
EDIT: The marketing material on those pages says:
"ASUS PURE comes without pre-installed third-party software, trial versions and apps. You choose what you install on your machine. ASUS PURE is designed for fast and optimal performance by freeing up memory and CPU power. The ASUS PURE laptop starts up faster and the machine feels like new for longer. Your computer is protected by Microsoft's Windows Defender antivirus program and a firewall that receives regular updates and does not include annual fees. Your ASUS PURE laptop is therefore always safe."
Like, why would anyone want to have it any other way? What's the benefit for us customers to have all that crapware installed when it does no good for us?
I think you've stumbled on the answer everyone is searching for here. I'm certain that these specific SKU's exist only in these specific countries because of legislation. We could do the same thing in the US, but companies are runningour government, so... no bueno.
Windows 10 LTSC + Snappy Driver Installer Origin + Firefox + uBlock Origin + TinyWall removed my family PC support burden by 95%. Vast majority is used business class Lenovo hardware. No ads, no spyware, no bloat, no tracking, no telemetry, no viruses. Add an older, pre chip Brother laser printer/scanner and home computing for frugal non technical people is almost bearable.
This is certainly the case, and I think more and more people are going this way. My 70 year old mother uses her 4 year old iPad for things she might have previously done on a PC. She gets on with it without issues
I used to think the iPad was going to be the best way forward for my (non-technologically-inclined) father, now in his 90s. But in the end, the combination of the touch UI and his arthritic fingers just didn't work out. We ended up going back to a Chromebook, with slightly better success.
Don't give anyone you care about (who is not a techie) a Windows machine. Especially if you'll have to support it.
At this point, the best thing for a "typical" user is a Chromebook, or better yet one of the few Chrome desktop boxes. For $600 you can have a very functional Chrome machine with a giant monitor. And remote-management + screen sharing is baked in.
The worst is visiting a Best Buy (or similar electronics store) and hearing a neophyte go up to a sales clerk and say, "I want to buy a computer". I can't stay near, because the discussion I will hear will kill me.
"It's not your fault. You didn't do anything wrong. You're completely capable of making sense of this machine. It's just that some asshole got a bonus for trying to confuse you so a little graph would go up."
I don't want to sound like a broken record, but I am really satisfied with my Framework laptop. It came with the RAM not installed, I assume so they'd "force" me to open it up and familiarize myself with the process. It took all of one minute to watch their video, unscrew five screws, pop the keyboard out and install the RAM.
The machine itself feels really solid, I was expecting a few rough edges from a first-time manufacturer, but it feels as good as my old Macbook Air did. I just love it, and I am really enjoying how repairable it is. All the stuff has stickers and QR codes internally to explain what components are and link you to how-to videos.
Well worth whatever premium I paid for it, if it ensures that the company continues to make products.
I'm just waiting for them to build a 15" model and ship to Europe.
If they are reading this thread, an option with no numberpad and hardware buttons on the touchpad. I'm paying an extra for that. The language of the keyboard is much less important. Thanks.
The digital divide is a major problem today, and manifests all over the place, whether it’s spyware like this laptop, issues with phones or just interacting with service providers.
I recently tried helping my aunt switch internet providers to one 50% cheaper, I called, set everything up, it was all finalized they just needed to come switch the connection over.
After I left, she made a call to the company to switch the appointment and ended up getting the contract cancelled, then stuck in limbo and a month later I still haven’t been able to get it fixed.
It’s impressive how if someone doesn’t know the right phrases and things to say, you end up in digital purgatory where no one can help you.
I'm so tired of clicking on posts like this expecting a civilized discussion on how things like this should be stopped, possibly regulated, and instead finding dozens of comments that read like they were written by Apple marketers.
I’m really not sure how you concluded that Apple is astroturfing here. Maybe I’m biased or blind but I think it’s this kind of company that really avoid this way of promotion. It’s also one of the tech giants. They don’t need to market to a site full of people that got a macbook from their employer to develop software on.
But this is how you get those dirt cheap laptops isn't it? Likewise for super cheap huge TVs.
I've long believed that most computers sold cheap as consumer devices simply aren't fit for purpose, sure, most people here could take one and make the changes required to remove bloat and spyware, but it's out of the reach of a typical consumer
> Getting lots of questions about how to do this on Lenovo and other machines. YMMV, but here's what I did:
> Add/Remove Programs, in Windows. Then I found everything from Lenovo and uninstalled.
These days you can usually get at the oem license key for windows - I would recommend just doing a clean install of windows 10 (possibly 7) from Microsoft install media. Just make sure to make a backup of user data.
IMNHO that's easier and better than playing wack-a-mole.
And if the user can afford it - just upgrade to a pro license while you're at it.
For users on a budget - I would probably recommend Ubuntu LTS or another stable desktop distro.
I have a razer laptop that gets 20-30% higher FPS when you uninstall all the razer crapware. You'd think a "gaming" brand would value performance, though I did put gaming in quotes because razer is definitely more of an aesthetics brand today.
The hardware is great* but the software kneecaps it. I'm sure most people buying a gaming laptop aren't doing a full windows re-install on it and are losing out on noticeable performance amounts.
* It could definitely use more ventilation, they're aiming too much for the macbook look but on factory overlcocked i7 that doesn't work well.
In my view Google and Facebook turned evil the same way Microsoft did in the mid-1990s, switching from building technological improvements to using vast resources to dictate how each personal compute node should be run and how one could access the internet.
But in the 1990s many developers hated it and actually used and contributed to alternatives, which eventually brought other options, from Linux to Firefox.
I see no such work now, as big tech got better at buying competing tech and engineering talent, so I am not very optimistic. Maybe if the next generation is not so smartphone obsessed.
Imagining what a bootable linux distribution for seniors might look like from a feature perspective.
- skeumorphic icons
- familiar light colours
- large fonts
- emphasis on storage and management of images and videos, archiving service
- a support app that goes right to a irc channel.
- journaling for remote support, wonder what else.
Kind of like android was intended to be before google's ecosystem became a landfill. They didn't invent poverty for the internet, but they scaled it as a service for people who prey on those who are less advantaged.
First of all, THANK YOU for volunteering your time to help out seniors. You are a great human.
I tried volunteering to help a few years ago, just with a group of seniors in my mother's circle, and it just became too depressing for me. It was a never-ending battle against adware/bloatware/crapware that I just couldn't keep up with. I spent more time telling them what not to do with their laptops/smart-phones than showing them all the things they could do.
I bought an external monitor two months ago or so. Since then there were a total of three firmware updates, each one throwing out the display driver before showing the update dialog which, when closed without confirmation, had to be fished out of the bloatware again for the procedure to continue and actually re-install the drivers.
My question is, why does a monitor even need firmware updates? I didn't gain any features, nor did I get an update report.
Why wouldn't the author re-install Windows without the bloatware? Seems easier and more likely to succeed than trying to find every single piece of garbage Lenovo installed.
Crap like this is one of the main reasons my household went 100% Apple well over a decade ago. Apple isn't perfect, but after doing software all day at work, the last thing I want is to spend time debugging shitty software on a $500 Windows laptop.
A similar annecdote, my wife's work laptop is a Lenovo and it had a part fail under warranty.
The Lenovo technician that showed up was 19 years old and said "wow that must be an old laptop!" And proceeded to replace the wrong part, breaking something else in the process.
The Lenovo laptop is a 2021 model. Now I'm scared that future generations will regard technology as branded magic. This was his job.
I'm also a bit scared. Sent mine in to Lenovo with an obvious hardware fault (rebooting randomly, happened under both Windows and Linux) explained to them. They reinstalled Windows and sent it back still broken. The second round I called in and got to talk to a guy who had no idea how they sent it back like that, who got it fixed, but sounds like the first attempt wasn't even the worst it could be.
> "How often does the trackpad REALLY need a firmware update?"
Never. Literally had broken trackpad drivers on Lenovo for years. They were aware of the issue, it was "normal" on the model I had. This was around 2013.
I will never ever again recommend anything from Lenovo. Trackpad was a minor issue compared to other problems that I had with that same laptop.
I was a student, barely managed to afford it, couldn’t justify replacing it. Wifi issues were worse than trackpad for example - absolutely terrible range and speeds up to 400KBs. I sent it back only to get the same laptop back with extra scratches. After that I didn’t even get a reply from support anymore.
I managed to fix wifi issues with extra dongle, but how sad is that? Trackpad was always an issue.
You think I’m speaking badly of Lenovo just for fun?
They don't get a lot of love on HN, but Chromebooks are awesome for seniors, and those that need to support them.
They are malware resistant, and easy to remotely support. They are difficult to really mess up, and you don't need to worry about data loss. Migration to a new chromebook is trivial.
Unpopular opinion (on HN at least): A Chromebook, under locked down mode to prevent installing any plugins, is best for non-tech savvy, casual internet users.
I still remember when I bought my first sony vaio in about 2004, running on Windows XP in Japanese, I was just appalled to find that a whole family of useless crap softwares pre-installed. There was an app that notifies me when the battery was fully charged with a dolce female sound "Batteri no shuuden wa kanrio shimashita". I tried to reinstall the system with the recovery CD, but all these softwares were also installed.
Years later when I visited in Japan again, I wanted to buy another sony vaio, and asked the people to give me one with clean windows system. They did not have one with clean system, but they showed me that there were only a few harmless pre-installed apps, and they were easy to remove.
These aren't the Thinkpads power users usually buy. The bloatware machines are generally their consumer lines.
I use the Lenovo Vantage update app that I think the thread is talking about on my own dual boot Thinkpad. It's a particularly slow app with pointless animations to look good. It makes warranty info prominent to try to sell customers on an extended one, and has some other 'useless to power user' features. However it only runs when I run it, and I'm just there for the firmware updates.
I expect the consumer line also includes other bloat beyond that though, or enables mis-features to 'protect' the user.
I don’t even think the real issue here is the crapware. Sure, the laptop would be a lot better without it, but if it subsidises cost, then you could make a case for it.
I think the bigger issue is the lack of quality control. If you want to put crapware on a device and sell it to people, at least make sure it is moderately efficient and test that the final product is usable. If the final product is not fit for function, it should not be sold (or alternatively no one should buy it).
In this specific case, it’s really impressive how they’ve made surveillance software (which shouldn’t need to do THAT much) so resource hungry.
I don’t think selling utter shit that’s not fit for purpose is a software/hardware industry thing though.
The topic of seniors and tech is something I despair over quite frequently. My own dad is 75. He’s by no means a genius, but he isn’t a stupid man (though he has seen significant cognitive decline in recent years). He worked in finance most of his career so he was working with computers fairly early on and throughout most of his career. We had a home PC by 1989, and he spent a lot of time in the 90s messing with DOS and Windows 95 and programming with C as a hobby.
These days he has an iPhone and iPad that I got him because of accessibility and ease of use, a Windows PC, and a Roku (he used the Hulu local tv service instead of cable). On all of those devices he can only do the basics and is completely helpless if something unexpected happens. Just a few days ago he was watching a show on Prime Video, accidentally skipped to the next episode, and couldn’t figure out how to navigate back. On his iDevices he routinely gets confused trying trying to look at messages, i think because he’s used to seeing/answering them through notifications and doesn’t understand the layout of the messages app. He also has a lot of frustration with the iDevices due to arthritis in his hands. On his PC he’s been using Quicken to track his finances for probably two decades, and last year he ask me to help set up online download of his accounts to make the reconciliation easier. I thought it was a good idea at the time, but every month or two now I have to sit down with him and spend an hour unravelling a mess he created due to getting confused about Quicken’s process for matching downloaded transactions with those he manually entered in the register.
I could go on and on with examples, but the question I ponder over is whether or not there’s a fundamental difference between his generation that didn’t grow up with technology and the younger generations that did. Will technology move on and change so that even those of us who a tech savvy today are hopelessly outclassed in 30-40 years? Or are have our brains been wired differently to make us more adaptable? I don’t have answers and I’m rambling at this point, but I feel like something has to give. For seniors today technology is mostly a convenience (they can choose to largely ignore it if they want to) but by the time we reach their age I don’t think that will be the case.
The "fundamental difference", in my observations, is a willingness to explore interfaces. I reckon it's because before computers, most devices were relatively single-purpose, and had a limited interface that you were expected to either be already familiar with, or could get away with a very limited understanding of (how many people used more than 5 buttons on their tv remotes?). Wandering off the beaten path was either not possible, or likely to mess something up.
Computers, on the other hand, are wildly more complex, and thus their interfaces need to do a lot more. Finding what you want to do involves some level of exploring the interface, which comes naturally to, say a child introduced to a computer, but grates against the learned behavior of someone who's only interacted with a much simpler world.
I would actually love to volunteer to help seniors with tech related stuff but I can't for the life of me find an opportunity. I looked online and sign up for different groups but never got a request or email.
I stopped using Windows because I was tired of Windows doing updates / telemetrics without my control when I was playing games or waiting for code to run over night. I was tired of other technically minded people telling me that I was being unreasonable when I wouldn't have those issues on Ubuntu...
Sometimes I have to switch to my windows partition to use some software and it will sometimes update, completely messing up my bootloader until I get off my behind and load grub from a USB stick.
On a similar note, I wish OS updates came not just with supported hardware list, but also how well an OS works on older hardware.
I have a late 2013 15” MBP with 16GB RAM as my daily driver, and it was great till it had Mojave on it. In a moment of insanity I decided to update it to Big Sur , since it was no longer getting security updates.
And guess what, it massively effed it up. It became much slower, services started taking unnecessary CPU and memory pressure running into yellow more frequently.
As much as I appreciate my x220’s god knows what they would be like if I hadn’t replaced the stock OS. Thank god Ubuntu aren’t shipping Lenovo Vantage just yet
Wipe your machine and install Linux. It's all very easy these days. You don't need all that proprietary spyware garbage, you just think you do. (They put a lot of marketing $ into making you believe that.) If you really, really need a particular piece of garbage install a castrated version of Win 7 or 10 in a VM.
Run GrapheneOS on your phone.
Buy hardware from people working to make it open, like Purism, Frame.work, Pine64, or Star Labs.
What's funny is Lenovo is hurting themselves in the long run on this. Every time there's an experience from someone who wants to check emails and watch Youtube, I'd recommend their next computer be an iPad. You can see this in long-term computer buying trends. PCs (and Macs) are overly complicated for 99% of what 99% of people want to do, and PCs make it worse with bloatware.
I know I'm privileged and so is my family, because I can insist that my elder relatives opt for Apple, where such issues are far, far more rare. For most of the last 6 or 7 years, my 82 year old mom has done nearly all her computing on an iPad. She only needs her Air when -- no kidding -- she needs to transfer embroidery patterns to the USB stick her sewing machine uses (about which: yes, sewing machines have gotten HELLA COMPLICATED).
No bloatware. No marketing software. No problems. But also: only possible because we can afford Apple hardware, and because she has someone in her family (me) that knows to send her there.
It's still largely a Windows world. That Windows machines are so routinely loaded with ALL THIS CRAP is completely unacceptable. MSFT could do something about it if they wanted to. I remember joking, in the 90s, that MSFT's greatest crime would end up being how much they lowered everyone's expectations about what was possible in computing, and that's definitely turning out to be true.
When I was thirty five years old, I hired in for a building maintenance job at a major airline data center. I had little computer experience and didn't even know that Windows was an operating system. I quickly discovered that almost all of the power plant equipment I operated was computerized, and I was way behind the learning curve!
A fellow co-worker offered to help me and showed me many things, but the best advice he gave me was to buy some books and study up on computers. As I studied, I brought him plenty of questions! Years later I marveled to him how he was so patient with all the questions. He told me if I hadn't used the books to learn what to ask, he would have cut me off for not really trying. I've always treasured this. I'm about to retire now after 30 years and have seen many changes with many machines, I've never regretted learning any if it!
As for bloatware, it is evil and unfair to the buyer/owner of the device. But let's not kid ourselves, they don't consider us owners.
I think about this a lot and one thought experiment would be to imagine yourself time travelling 10 years into the future and question if you would know how to use email or phone without being confused. I’m pretty sure everyone on HN is considered as tech-savvy as one would get but I’d also imagine that many people would be lost.
I used to work at a computer shop and one of the things we always offered and pushed quite strongly when customers brought a new computer was a free setup service. As part of that setup service we uninstall all of the junkware that new computers come with pre-installed and install some useful applications like Libre Office and Chrome.
The primary reason we did this was because if we didn't it was very likely customers would eventually bring the computer back and complain that it's running slow or that some software had stopped working (30-day office trial, etc).
It's kind of insane that companies ship computers with so much junk on them because it literally degrades their product for anyone who doesn't know how to service a computer. I always thought it would be easy win to just ship a computer without all the junk and just some useful freeware...
Every company from here to end of the universe in competition of who can have the most intrusive surveillance shoved down uses throats.
It use to be enough to just wipe newly bought laptop to remove manufacturers spyware, but now it comes with OS itself and packaged into big browsers.
Today i had a somewhat similar experience. The printer at one of my step grandpas location was on and off working. I quickly noticed problems with the networking (wireless) and several other wireless stations in the same building. Appears the issue was automatic channel setting in his wireless router (clashes with other stations sharing the same channel, probably).
After changing the routers defaults to fixed channels everything worked like a charm and the printer had stable networking. He told me they apparently had the same problem with their old printer, which they assumed was broken and was disposed...
What a waste of money and resources, the other printer clearly suffered from the same issues caused by the default ISP router configuration and it would have been fixable...
Similarly, this kilobyte of text took over half a minute to load and render (on Firefox Mobile). Slower than a 33k modem. Presumably Twitter's attempt to steer me toward using their app for better surveillance.
Got my mom an iPad. I replace it every 10 years. So far once. She’s totally happy. Almost no effort on my side needed.
My dad has will only use android based devices. Nothing but endless issues.
Most likely the machine had not been wiped and reinstalled, and there was some real malware on it (probably a crypto-miner of some kind).
It makes no sense that Lenovo's dev teams would add bloatware to a machine to the point it literally crawls, and then say "yeah, this works great" and sign off on it for manufacturing.
My own experience of Lenovo is the opposite: snappy and fast even on low-end hardware with integrated graphics, many months between needed firmware updates, and the machines are always fairly priced for what you get.
I realize that not everyone is technically capable of doing this but my 3 suggestions are:
1. Wipe Windows and install Linux
2. Wipe Windows, download the Windows .iso image or buy media from Microsoft and install that so there is no extra crap
3. Buy a Mac
I realize that it is ridiculous to buy a brand new windows PC and it is unusable and you have to do a bunch of stuff yourself to make it usable but that is the state of much of the PC industry. Vote with your wallet or find a technically capable friend to help.
The optimistic POV on this is that there is a massive opportunity in many industries to knee-cap the incumbents and establish dominance in markets that:
1.) Don't need to be proven (i.e., no time to waste on "market fit" as what they want is already clear).
2.) Have low expectations in the form of "it just works."
If you're comfortable with jamming on "boring stuff," you can become a low-level tycoon just cleaning up messes over the next decade or two.
It is somewhat ironic that this is posted on Twitter, where if I try to read it, once I scroll down too far Twitter demands I create an account to keep reading.
I only boot windows to play Counter-strike. I just got a new video card and wanted to get into task manager but I forgot what it was called. I searched "activity monitor" in the little search box, and the top most result was definitely a program to install from the microsoft store. If I was a novice, I might click that and find myself frustrated that my computer is trying to get me to buy something...
Not that I think all of these shitty actions companies do are good. But really people have only themselves to blame. They will consistently go for the lowest of the rung bottom dollar device. Essentially forcing a single company to this shit. Once the first company has started the other companies have to follow suit to not be outcompeted. It takes an incredible strong brand to not do it (Apple for example).
If he thinks that's bad Lenovo installs an ad framework on there most recent laptops that is not uninstable from add/remove. there is a specific command that you have to use to uninstall it. It took me about 30 mins googling for the right keywords to find it. When that kind of think starts happening. You know there's a problem in the industry.
I think a lot of internet connected hardware is sold cheaper than it would be otherwise just so they can get some consumer data harvesting software into your life.
ie, that laptop would be more expensive without the junky pre-installed software.
Nothing new, and still a shame they don’t let consumers know what’s going on. It’s even more of a shame when expensive things do it too —- like cars.
Maxing out the RAM on any PC you buy is probably the easiest solution. Modern CPUs are fast, and I can't imagine a user having this much trouble unless there was hardware swapping.
Which leads me to a question: is virtual memory even a good idea with most PCs? Once your computer starts swapping heavily, it's over. I think I would prefer an informative kernel panic.
You are suggesting that we are OK with programs using excessive amount of resources or doing things we don’t want? All advantages on getting better hardware in the past decade goes straight down the drain if compensate by just getting more RAM.
My grandpa got an Arch Linux which I update every once in a while via SSH. Since most of his use cases revolve around a browser (news, weather, ebay, e-mail, banking) it doesn't really matter which OS is running below the browser as long as it is stable and performant.
His last OS before Arch Linux was Windows Vista which didn't match those requirements.
If I ever have anything to gripe about when I’m a senior, it’ll be passwords. Same is true now! How do you rotate passwords for a 200 account accumulation over the years? They’ll never solve that and whatever they use to solve it will be terrible. Unless implemented at the protocol level and widely adopted by all devices at that lower level
Windows is fine if you do the install yourself, but every major device manufacturer does this to some extent.
When you issue a computer for work IT will almost always, without fail, emulate this by installing a bunch of their own junk spyware and “security” software.
Reviewers of laptops do a fresh install of Windows before reviewing like that is normal consumer behavior.
I doubt it. Instead they are focused on chasing down profits by engineering out the value.
My family has several 'copies' of a laptop from the same company, one of which is the newer model, by one year. The newer model is flimsy and has required multiple repairs to the display, hinges, keyboard, case, etc... Parts are easy to source for the newer model but impossible to find for the older one, although naturally we haven't needed any yet.
Had a similar experience, I have one that I installed windows on myself, and it's fast. I have one, that is three generations ahead in CPU, but came with a pre-installed windows. After booting, it basically became unresponsive for a solid minute. Removing the built-in software helped so now it's fast.
I don't know why people here primary blame about Windows Update so much. Restart is sometimes pain but please update. Microsoft cares security. There are many other complains about Windows, like Win11 taskbar, ads, pushing Edge, Microsoft account, and so on
Besides this there's also the pre-installed AVS crapware from "reputable" companies that tries to take over your entire system and continuously pinging you with scam-like "you're in danger, let me in" messages.
There is no choice because the users made them to do so, since market shows quality won’t keep you in the business like what IBM think pads did in the past.
Most of the customers WANTS NEW laptop every couple of years even there is no real point to change.
Bologna. Customers want to NEVER migrate to a new device if they don't have to. These low-end, piece-of-crap, plastic machines simply do not LAST more than a few years, before the hinge won't hold the screen up, or the keys start falling off, or the battery won't hold a change. My wife complained about the cost when I replaced a busted up Asus laptop with our first MacBook Air. The PC was $500, and lasted 4 years. The Air was $900 (refurb), and lasted 11. Our house is filled with Apple tech now.
I also help locals with their PC issues ... my favorite solution is to wipe the drive and install Ubuntu ... they get the same browsers without fighting Windows ... these new linux users absolutely love their PC once again
It's really stunning how Microsoft and Google have allowed this. 3rd party OEMs are significantly crippling the Windows & Android experience. Wouldn't surprise me if attitudes change here.
Switching my 60 y/o mom from Windows laptops to a Macbook reduced the amount of family IT support I do by about 75%. She still struggles with some stuff, but more on the level of how to do things than issues with the machine itself.
My mum (in her 70s) switched to a MacBook Pro and while it increased the family IT support for me in the short term as the Mac "expert" among us, once she'd figured out her way around, I've not had a single help request in about 5 years now.
Similarly for me, I switched my dad to my old Dell XPS which came pre-installed with Ubuntu. He's on Fedora and it's been running fine for about 4-5 years, across multiple (automatic) version upgrades.
The only thing I've had to help him with across that time are little things like setting his Firefox home page and finding some files "I definitely saved in Documents" but turned out to be in Downloads. I can deal with those.
And that's the reason why I love Linux. Even though I complain a lot about several of its shortcomings, I am a happy user. I just recently installed Mint it in a 2011 Macbookpro , and it is great.
I know people dont want to hear that but none of these problems would happen if people installed Linux. Windows is just a bottomless pit of malware either by third parties or MS themselves
A few years later, when he had been diagnosed with cancer and was on chemo, it updated itself to Windows 10 without his explicit consent. It completely fucked up the install, and was unusable for him from then on. He was too tired to go through the process of getting it sorted out, and was thus unable to book a vacation that he had intended to take to recover from that round of chemo.
Microsoft's unfriendly us-first, customers-second process robbed him of his last holiday and I will not easily forgive them for it.
If a seasoned developer can be robbed of quality of life by this flavour of bullshit, what chance do the non-technical types stand?