Your Computer Isn't Yours (sneak.berlin)
1514 points by sneak 20 days ago | 742 comments



What we poor souls didn't understand all these years is that we're effectively helping other companies to steal from Apple by running software that isn't distributed via their app store [1].

I think that now that Apple has learned from iOS how profitable a walled-garden business model is, they are trying to bring that model to the PC world as well. Shipping hardware with their own processors is an important step in that direction, because it gives them control over the IP of arguably the most important component of the computer, which in turn makes it easier to control software distribution for their architecture as well.

The transition will be slower of course, because people have to get used to seeing computers as closed-source devices with app stores; there are still too many of us who have the mindset that you can just install anything you want on a computer without asking Apple for permission and without paying them a tax.

At this point I really wonder how any serious "hacker" can work on such a device; it's becoming the antithesis of everything that the original hacker culture stood for.

1: https://arstechnica.com/gaming/2020/11/judge-dismisses-apple...


Good thing said hacker culture spent 15 years buying overpriced laptops and desktops from them, ultimately for convenience, giving them the market presence to do this in the first place. It's the same thing with Windows.

People will preach all day that they won't make any personal sacrifices to avoid feeding the literal cancers that are eating the software industry, and then are shocked when said all-consuming voids take away their autonomy; but by then they are so locked into their ecosystems that they are trapped.

These are for-profit corporations, not your friends. Their bottom line dictates that they maximally abuse copyright and their ability to sell and distribute proprietary software out of your control, to maximum effect. Microsoft, Google, et al. want you using devices you cannot control, because then they hold all the keys and can demand the greatest ransoms for you to do what you want. It's inevitable, and the persistent total arrogance and hubris on display in the tech hive mind regarding any of these walled gardens is just... depressing.

Free software isn't going anywhere, especially with RISC-V now being an established failsafe if the UEFI PC and ARM totally lock down, but it would make the struggle a lot easier if the principal victims would stop martyring themselves for the corporate ecosystems they have no say, influence, or control over.


What's worse is the number of other posts with the top comments being vociferous defenses of these companies as if they needed these people to defend them or cared one iota for their welfare. It feels like it's Stockholm Syndrome on a mass scale.

I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens. They scream outrage when they're not allowed their tiny little dance-jig operators provided courtesy of an abusive regime.

Sure, I reflect that I may now be nearing the curve of an obsolescent person attached to such silly, outmoded principles as "ethics", but if we're all just selling ourselves out constantly to the angry god-machine of the id, is this really the future we want for our daughters and sons?


> I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens.

Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

I recommend Jonathan Blow's talk "Preventing the Collapse of Civilization"[1] for an unsettling view of how far we have already traveled down that path.

[1] https://www.youtube.com/watch?v=ZSRHeXYDLko


Gen Z can get real general-purpose computers with a development environment for cheap. The new Raspberry Pi 400 is a modern take on home computers like the C64, but 15 times cheaper. Pi 400 full kit: $100, C64: $595 (1982) / $1576 (inflation adjusted for 2019).
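The arithmetic is easy to sanity-check from those figures (a quick sketch in Python, using only the prices quoted above):

    c64_1982 = 595      # C64 launch price in USD
    c64_2019 = 1576     # the same price, inflation adjusted for 2019
    pi400_kit = 100     # Pi 400 full kit

    print(c64_2019 / pi400_kit)  # ~15.8, hence "15 times cheaper"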

Computers are not just getting cheaper; educational material is easier than ever to find.

Because modern computers are much more complex and do much more than before, you don't do much bare-metal programming anymore, but then again, nothing is stopping a kid from buying a microcontroller and playing with it. It's cheap, and information is plentiful. I would even say that it is easier for Gen Z to play with electronics than it was for millennials.

But that unprecedented availability of computers didn't turn everyone into a nerd. Just because someone has a smartphone doesn't mean they have any interest in computing. When computers were limited and expensive, only those who had some affinity with them had them; now everyone has them, so mechanically, you can't expect the same average level of interest.


> The new Raspberry Pi 400

Learning about the Raspberry Pi is nice, but that doesn't help preserve the institutional knowledge needed to build system/low-level features in a modern OS. I'm not talking about teaching kids to program; I'm concerned about preserving, into the future, the knowledge of how to create all of the complex tech we use today. Many civilizations throughout history collapsed after they lost the technical knowledge on which they depended.

> Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore

Yes, that's the problem! Unless those skills are actively used and passed on to the next generation of engineers, that knowledge decays. Part of the reason you don't see a lot of bare metal programming anymore is due to the knowledge decay that has already happened!

> nothing is stopping a kid from buying a microcontroller and playing with it

This article is about how those same kids are being stopped from learning the complex systems we currently use.

> didn't turn everyone into a nerd.

Nobody is trying to turn everyone into a nerd. I'm talking about making sure the nerds of the future have the ability to learn about the tech they use, so the ability to understand and make that tech isn't lost. Locking down the OS into an "appliance" that cannot be inspected or changed is a direct attack on the ability to learn.


The problem with low level programming is just the general trend of specialization.

The modern computer is the pinnacle of human civilization, a thing so complex that it requires thousands of people to fully understand how it works. Low level programming is just a small link in the chain of complexity. From mining the materials, to producing silicon of incredible purity, to designing circuits at quantum scale, to CPU design and finally the guy who places the frame that will display cat videos. So if you argue that no one knows how to program the bare metal (that is not that bare anymore), one can argue that the knowledge of how to make logic gates from transistors or the metallurgy of copper traces is getting lost.

Maybe fewer people will know how to program on the bare metal. But think about it: it has been thousands of years since most people were able to hunt and gather. That is a great deal more worrisome than not knowing how to program computers, and yet human civilization has been thriving.

The important thing is that it is still accessible, and it is.


What you're saying was truer in the '00s, when 'scopes, logic analyzers, etc. were professional instruments costing thousands and FPGAs were inaccessible to all but the most dedicated and wealthy hobbyists. Today, with adequate hobbyist-grade instruments and dev boards available in the $100-200 range pretty much anywhere you look, the situation for low level engineers and developers is better than it has been for years and, in some ways, even better than the '70s heyday of homebrew computers.

The problem is more the lack of interest. Low level systems development is hard, a computer engineering degree is hella hard, and all the hype and easy money was in following the webdev career path starting from the late '90s dotcom boom.


It was always a tiny minority of the general population that had real knowledge of, and capability for, creating the actual technological equipment widely used by said general population. In the same vein, there are no fewer people capable of low/OS level programming than before, and you can't expect every proficient user (even a power user, or e.g. the now numerous web developers) to be at this level of ability.


What changed is that the very things capable of eliciting interest in programming also offer overpowering content consumption functions with huge, never ending catalogs of games, movies, videos, short funny clips etc.

As computing and "content" proliferate, the uncompetitiveness of creation, especially symbolic creation such as programming, is increasing. At some point, broadening of access no longer offsets this effect, and the talent pool may start to shrink even if capability and permeation are a million times higher than they once were.


I think Arduino has us partially covered there, allowing kids to get into electronics easily with some wires, sensors, and C code.

And for cobbling together a computer out of microelectronics there's CollapseOS https://collapseos.org/


It's also a fait accompli in education where Python reigns supreme as the primary teaching language.


There is nothing wrong with teaching Python as an introduction to programming methods and data, then diving deeper. My son started with Python in his Junior year of high school and as a senior is now learning Java.

It is just top down learning, instead of bottom up. For programming, I think either works.

I just happened to learn in the other direction because of the year (early 1980s). On CP/M, you had a macro assembler right there, and it took me a while to get my hands on a BASIC interpreter.


My university course in 1980 started with Pascal and Fortran along with Niklaus Wirth's "Algorithms + Data Structures = Programs".


Excellent book! Although written with Pascal in mind, it became really useful to me. I always brought it to work along with K&R 2nd Ed. and the famous TCP/IP networking book by W. Richard Stevens.


I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn. Now, most people get computers where those things are forbidden. You still can get "programmable" computers, but you have to do it explicitly. Before, it was implicit. So there's a filter that some people won't pass and won't learn, and the pool of people shrinks.


>I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn.

That's a rose-colored view that maybe reflects a small window in time.

But take the 80s. First of all, that computer was maybe $5,000 in today's money. (Yes, there were some limited alternatives like the Sinclair and Commodore 64 that were cheaper.) Long-distance telephone calls were expensive, as were the big online services like CompuServe. Any software you purchased was costly. A C compiler was hundreds of dollars. (Turbo Pascal at $50 was an innovation.)

Perhaps more to the point, most people didn't just have a computational device of any sort. So the fact that, if you had one, you could use it for a lot of different purposes is sort of irrelevant.


My experience, being born in 1992, is that as a kid I could toy with QBasic, for instance (and then with Dev-C++ a bit later), as well as go peek and change stuff in random files (hello, Red Alert save files :D). You can't really do that with an iPad.


I was a QBasic kid as well, but I think it was pretty clear to me even at the time that QBasic was a Mickey Mouse environment. You couldn't get into 32-bit mode, or dynamically allocate memory with pointers, much less blit sprites or backgrounds around at a usable speed.

I'm not saying it's exactly equivalent, but there's certainly a perspective which could argue that a) the free JavaScript interpreter available on every device is more featureful and interesting than anything available to us 80s and 90s kids, b) low-cost SBCs running free open source operating systems and compilers are more accessible and better supported than the commercial compilers that few could have accessed back then, and c) the overall ramp from curious/creative gamer kid to capable hacker is much smoother now than it was then, with a lot of interim options between JavaScript and commercial environments in which to gain comfort and achieve interesting results (thinking stuff like GameMaker and RPGMaker, modding communities for existing games, hacking/homebrew/porting communities for consoles, etc).


I get the argument that many kids are growing up today without necessarily touching a "real" computer, just a tablet. That said, as someone who got into PCs quite early on, I'm a bit skeptical that the world we live in today where you can assemble a working computer based on a Raspberry Pi with loads of open source software for probably about $100 somehow is less accessible to a kid who wants to hack around on computers.


I think the key with the Pi is that it needs to find its way into the house for some other reason than just to be an educational aid. Positioned that way, it will be about as interesting to most kids as a plate of cold spaghetti. I'm thinking here of the book I was given as a teenager on developing ActiveX components, because it was an enterprise-y thing, when what I really wanted was a book on DirectX, for making games.

But yeah, if the Pi shows up as part of an IOT system, or as a TV/streaming box, or to play retro games on, or whatever, then it's there and it's available to be tinkered with; and from my limited experience, basically none of those use cases will run on their own without at least a little bit of tinkering. :) Even my little Linux-running Powkiddy emulation handheld has probably consumed about as much of my tinkering time as it has my retro gaming time.


Right. So quite a bit later than the period I'm talking about. Late 90s/2000s period is arguably when commonly-used hardware was most accessible/hackable.


Eh, I mean the kids who are passionate about how systems work will find a way to dig deep. I don't think this is going away with locked-down systems, regardless.

The only thing it will do is make it easier for the general public to work with machines; the masses never cared to begin with, and to be frank, that's a good thing. They are less likely to install malware and cause trouble for themselves.

Plus you're forgetting that these companies still need engineers to build and maintain their infrastructure so it's not like the knowledge is going to disappear, never mind the fact that the corporations heavily rely on OSS.


I am a developer and I use a Mac... so I don't understand how programming is "forbidden". macOS even comes with Python, Ruby and PHP preinstalled.


The lockdown, and the control of software distribution.

Great, you can write a script, but its checksum is then sent over the net and verified as not malicious, and if the service on the other side is experiencing lag... well... you've got to wait, and wait.

To the point where all choice is removed from the operating system. So much for root access.
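For illustration, a purely hypothetical sketch of the kind of phone-home gate being described. The endpoint and response format here are made up, and Apple's actual mechanism checks developer certificates via OCSP rather than per-script checksums, but the blocking-on-the-network failure mode is the same:

    import hashlib
    import subprocess
    import sys

    import requests  # assumes the third-party requests library

    TRUST_SERVICE = "https://trust.example.com/check"  # hypothetical endpoint

    def run_if_trusted(path):
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        # Execution blocks here until the network answers -- hence the lag.
        verdict = requests.get(TRUST_SERVICE, params={"sha256": digest}, timeout=30)
        # Fail closed: run only on an explicit "not malicious" answer.
        if verdict.ok and not verdict.json().get("malicious", True):
            subprocess.run([path])

    run_if_trusted(sys.argv[1])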


It's not about the ability to write programs, but the ability to write programs that influence your base system in a meaningful way. Yes, you can write programs, but can you write a different /bin/init for your system? I'm not sure you can (either because Apple will not let you, or because you lack the ability to do things like set up a system for actual use). Maybe you can, we're wrong about this, and more power to you. But it's quite likely that you can't.


$100 isn't cheap for a curious kid with no income, especially if they're not already sold on it as an interest.

There's a big difference between buying a computer explicitly for the purpose (even if it's cheaper), vs being able to play around with the hardware that you already have, for no new monetary investment at all.


But computing tools are cheaper now than they have been in ages.

You can buy an Atmel DIP version of the chip in an Arduino really cheap, and build what is basically an Arduino on a breadboard. Then get a USB programming adapter (again, cheap) to get it running.

Then you can get inexpensive USB logic analyzers that are plenty capable of monitoring I2C and SPI buses, and learn how all that works.

None of that existed in the 1990s. It simply wasn't there.

The price and the availability of tools are so much better today. You don't even need to buy any books; it is all online, with community members jumping in all the time to help.


Get involved in the local community board. Present to the library council that this is a good investment for education. Purchase 20, lead a community class to teach the next generation of students.


If a kid with a phone but no money wants to know if programming is something they might like to study, and somehow the school also can’t afford a Pi, I’d point the kid at something like https://js.do/


The problem is that's not really how people get started. Hello world is how you get started in a university intro course, not in your parents' basement.

What happens there is that you have a pile of photos that are all in one folder and you want to organize them into separate folders by date. A problem you're actually having, not a homework assignment. Which is possible to do automatically, because they all have metadata with that information in it. So you find a python script somebody wrote to do that, and it works! Then you realize the script is just a text file and you can modify it to sort using geotags or something else instead, and now you're down the rabbit hole. But only if it's easy to see and modify the code that runs on the device you actually use.
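The kind of script in question fits on a page. A minimal sketch (assuming Pillow for the EXIF data; the source folder is made up), which a curious kid could then rewrite to key on geotags instead:

    import shutil
    from datetime import datetime
    from pathlib import Path

    from PIL import Image  # assumes Pillow is installed

    SRC = Path("~/Pictures/unsorted").expanduser()  # hypothetical folder

    def taken(photo):
        # Date the photo was taken: EXIF DateTime (tag 306) if present,
        # falling back to the file's modification time.
        try:
            with Image.open(photo) as img:
                stamp = img.getexif().get(306)  # "YYYY:MM:DD HH:MM:SS"
            if stamp:
                return datetime.strptime(stamp, "%Y:%m:%d %H:%M:%S")
        except OSError:
            pass  # not an image Pillow understands
        return datetime.fromtimestamp(photo.stat().st_mtime)

    for photo in list(SRC.glob("*.jpg")):
        dest = SRC / taken(photo).strftime("%Y-%m")  # one folder per month
        dest.mkdir(exist_ok=True)
        shutil.move(str(photo), str(dest / photo.name))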


Not everyone learns the same way or for the same reasons.

For example, I learned to read at the same time I learned to code, and from the same source: I had the unusual combination of not just the right birth year, but well-off parents and two elder siblings who had gotten bored of their Commodore 64 just as I was reaching reading age.

Back then my coding was, naturally, 95% transcribing from the manual, but it was fun. One of the few programs I can remember reasonably well, this far removed from the experience, was a sentence generator that had lists of nouns, verbs, etc. and picked from them to build grammatically correct gibberish.
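In spirit it was something like this (a Python re-imagining, since the original was C64 BASIC; these particular word lists are invented):

    import random

    # Tiny word lists -- the original would have used BASIC DATA statements.
    articles = ["the", "a"]
    adjectives = ["purple", "sleepy", "enormous", "shiny"]
    nouns = ["dog", "wizard", "computer", "sandwich"]
    verbs = ["eats", "admires", "reprograms", "juggles"]

    def sentence():
        # Fixed article-adjective-noun-verb-article-noun shape keeps the
        # gibberish grammatically correct.
        words = [
            random.choice(articles), random.choice(adjectives),
            random.choice(nouns), random.choice(verbs),
            random.choice(articles), random.choice(nouns),
        ]
        return " ".join(words).capitalize() + "."

    for _ in range(5):
        print(sentence())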


Hello world is how a lot of us started in C with the K&R book, wherever we were. Most of us didn't want to organize photo folders, we wanted to make the computer display something cool or play a song or something, and usually we had to start with something elementary when tackling a new language, especially a complicated one like C.

I can still remember being 11 years old and meticulously keying in the hello world program on my dad's aging Tandy Model 16, then saying 'cc hello.c' and watching as the compiler chugged through the tiny program for a minute or two. (It #includes stdio.h, so there was more code to go through than it seemed, plus there was a linking step. And not much actually happened, because Unix C compilers are silent if there are no errors.)

When I ran a.out and saw "hello world" appear, I was over the moon. It was like clearing 1-1 in Super Mario Bros. for the first time. Like I had surmounted the first obstacle and was thus ready for all the challenges that lay ahead.


I don't know why a Raspberry Pi is supposed to be a good device for learning programming, other than the fact that it forces you to use Linux. Do kids really ssh into their Raspberry Pi from their phones? Or do they really buy an expensive screen, expensive cases and expensive peripherals, adding up to $200?

I also doubt that Raspberry Pis are actually the type of device that schools are interested in. What they'd want is the TI-84 equivalent of a laptop: easy to reset to factory conditions, and running only a few specific preinstalled programs (one of which would be a barebones IDE, probably for Java). To me it feels like the Raspberry Pi fails at all of these. You have to mess with the hardware and then mess with the software. A school-provided Chromebook with student accounts managed by the school would be way more practical.

However, if you actually want people to learn programming organically, it's much simpler to just get a Windows laptop and use that as the main device.


I too have doubts about the real-world suitability of the Raspberry Pi in formal education environments, but certainly the business of resetting it _is_ a solved problem; every kid is issued their own SD card, and can flash it back to the starting image whenever they want.


It all depends on what you want to learn.

On an RPi, with a standard laptop on the side (Windows/Mac/Linux, doesn't matter) and some basic installed tools, you can pull down and configure Alpine Linux and run a bare-bones IoT Linux platform. If you want to get really gritty, you also configure and build U-Boot as a primary loader.

Once you get past that point, you pull Docker into your Alpine build and start running containers.

Stepping through that full process will teach you research skills (because you are going to be reading a lot about how to pull that off), it will teach you about Linux kernel configuration, and it will teach you to be at least comfortable with git.

There is a lot you can learn on a Raspberry Pi, cheaply, that only involves plugging in an Ethernet cable and a power supply, and never seeing anything on a terminal.


> The computers are not just getting cheaper, educative material is easier than ever to find. [...]

Being accessible and being accessed are two different things.

I'll make a more general parallel.

Never in history was such an enormous amount of audiovisual media about the decades preceding a generation's birth ever produced, whether during those decades or later. And this astonishing amount of written documents, audio and film, both fiction and non-fiction, live takes, documentaries, is readily available: often immediately, often gratis.

And yet this generation is disconnected from the previous ones in an unprecedented manner. And yet this generation seems to be the most ignorant about how things were just a couple of decades before them. I've never seen such a break in the flow. The material is present but not reached or understood.


The Raspberry Pi is not open hardware, and is actually part of the problem, not the solution.


Yes and no. At least it created a modest tinkering culture while the mainstream went with smartphones. I wonder if RISC-V will help us get past ARM*, or if RISC-V SoC vendors will try to create their own walled gardens, proprietary interfaces and GPU/codec support.

* = The issues with the Raspberry Pi are both with Broadcom's closed-source hardware (there are other tinkering boards with more open hardware) and, of course, with the license model of ARM.


I agree there are other 'tinkering' boards with more open hardware, but the Pi has the advantage of a huge community of very nice supporters. Maybe even as nice as the community over on Arduino. The biggest problem with Broadcom on the Pi is the VideoCore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.

As far as the ARM license goes, very few people are ever going to spin their own hardware, and at least the ARM programming API/ISA is very well documented. We'll see if that continues under Nvidia.

Certainly the MOS, ST, National, Zilog, Motorola, Intel, IBM, DEC (etc., etc.) CPUs were always closed source. That did not prevent anyone from learning about computer systems and programming then, and it won't stop them now.

That said, I doubt Apple will ever make the M1 available for purchase. Which is where this thread started.


>> The biggest problem with Broadcom on the Pi is the videocore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.

Avoiding the black hole does not solve the part of the problem where you are financially supporting closed hardware and disincentivizing the Raspberry Pi Foundation from doing the right thing.


This is the same problem we had with TiVo: you could get the operating system, tinker with it, and run it. Just not on your TiVo. What good is having the source if you can only run it on toy hardware? You need to be able to alter the code on the tools you use so you aren't beholden to a company that drops warranty and support within 5 years and may never fix the problems you have even while it's supported.


I just made the C64 <-> Pi 400 comparison to someone last week. It is completely analogous.

Pair that up with an Arduino and some wiring, and you can have hardware interacting with a Linux desktop on the 400.

I just helped a young arduino user with reading an analog pot yesterday. There are still young people out there playing with hardware, and it is way easier today than it was for me in 1984.
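For flavor, here is roughly what that looks like in MicroPython on a Raspberry Pi Pico (the actual exchange used Arduino C; the wiring assumed here is a pot wiper on GPIO26/ADC0):

    from machine import ADC
    from time import sleep

    pot = ADC(26)  # ADC0 on the Pico

    while True:
        raw = pot.read_u16()          # 0..65535
        volts = raw * 3.3 / 65535     # assuming the 3.3 V rail as reference
        print("%.2f V" % volts)
        sleep(0.5)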

Then go over and look at the people configuring and compiling custom versions of Marlin for their 3D printers.

Or the nerds designing flight computers for solid fuel model rockets.


Wouldn’t it be great if you could route the mac’s internet through the raspberry keyboard, strip the surveillance automatically, and continue to use the internet like normal?


So you see the non-free-and-open Raspberry Pi as a real development environment. That's nice.


> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

Gen Z here. I tried to give manufacturers the finger and play with systems-level software on modern PCs, and ended up retreating after realizing, among other things, just how much modern computers are not for hackers. Everything feels like an "undocumented" mess (by which I mean the documentation is likely only accessible to OEMs and those with large business deals with them). Even when it is available, I'm too scared to touch it. I found something I wanted to hack around with, until I saw the license agreement I had to accept in order to access the docs. I couldn't parse the legalese, especially regarding open source works, and I'm not consulting a lawyer to figure out if I can post some project I might hack together online under an OSS license. The few things that do get open sourced are incredibly half-assed, and you can still tell they're designed for corporations, not hackers (e.g. edk2).

Conversely, I have an old IBM XT-compatible luggable sitting in my closet. The manual for that has the full BIOS source listing. Nowadays I mostly just hack around with old non-PC devices, but for the most part computing just isn't fun for me anymore.


As other commenters have said, just picking up an open source *NIX is a great start.

If the lack of open hardware bothers you, there are several non-x86 architectures that are fully open source. Chief among them is RISC-V[0]; SiFive[1] is taping out RISC-V processors to real hardware, selling dev boards, and stuff like that. These are real, competitive, Linux-capable CPUs, although widespread adoption has not happened yet.

On the more minimalist front, the ZipCPU[2] is simpler (though it has less software support), and its author writes some really great blog posts about HDL[3] (Hardware Description Language -- how these CPUs are defined).

You might also enjoy developing for very small embedded systems. I like AVR CPUs such as the ATMega328P (of Arduino fame) for this purpose. Although not open-source, they have quite good and thorough datasheets. AVRs are very small 8 bit systems, but as a result, you can fit most of the important details in your head at once.

If you want to talk more about these topics, feel free to get in touch with me. You can find my contact info on my profile, and on my website (also listed in my profile).

0 - https://riscv.org/

1 - https://www.sifive.com/

2 - https://github.com/ZipCPU/zipcpu

3 - http://zipcpu.com/


Most of the issues you list should not be a problem if you use a Linux distro on a fairly standard PC/laptop.

Pretty much everything inside it is open for you to look into and change as needed.


At the OS level, sure. It's still running on a completely non-free, hardly hackable platform, though, riddled with all sorts of backdoors and other curiosities. I'm aware of things like coreboot/libreboot, but support is even more limited there, and porting involves a deep understanding of the x86 platform, in my brief experience.


You don't need any of that to understand how x86 or the OS works. Realistically, you can do all of that learning with QEMU/a VM and not be restricted by the hardware at all.

At some institutions they teach how operating systems work by having students implement Pintos, a full x86 OS.

In my first year I had to build a full Raspberry Pi emulator from scratch and then code an assembler for it. These programs would then run natively on the Raspberry Pi. People wouldn't even touch the hardware until much, much later in the project, when the emulator was confirmed to be working.

I disagree with the view that you need fully FOSS hardware to understand a platform. You can do a lot with a VM/QEMU.


It really depends on what you want to do. A Linux distro on x86 is IMHO a good start, but of course it's not the only option: there are now some reasonably open ARM devices (including an ARM laptop) from Pine64, and there is the very open Talos workstation running on POWER, with RISC-V showing up everywhere.


Why do people make these kinds of comments? Talos isn’t showing up anywhere, and even during the Kickstarter it was more than $3000 for just a motherboard.


As an example of what is available? And these do get used, for example as a desktop: https://m.youtube.com/watch?v=ktSwuF32ywM

Also they seem to have released a cheaper micro ATX board: https://www.raptorcs.com/BB/


> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.

Kids today aren't as skilled with computers as I would've expected years ago. I feel the app culture stunts deeper learning.


I'm aware that I am looking at this through the lens of self-bias, but I have definitely found that much of what I know about computers came from being able to (and sometimes needing to) poke around and figure out how to do something.

I am not a software developer and have little formal education in computer software or hardware. With that said, I've picked up enough just by growing up with Commodore, DOS, Linux, etc. to at least have the basic understanding needed to research or just figure out solutions to most common problems.

I work in media production and IT in an educational setting, and while I'm far enough along the Dunning-Kruger curve to know just how little I know, the sort of things I see from students (grad and post-doc, at that) definitely give me pause sometimes. Often, the mention of basic things like how to change a setting or search for a file in macOS or Windows is met with glazed eyes or signs of discomfort. Apologies of the "I'm not really tech savvy" sort follow talk of anything more complex than pointing at pictures on a screen or doing something outside of the web browser.

And yes, I do understand that the very nature of these students' specialization in other fields can mean they haven't had the need or opportunity to learn much about computing. I just feel like I only picked it up because it was how you got things done or because it was there for poking and prodding when I got bored or curious.

I think in the end it's not just that I have a personal connection to computing and think everyone should share my interests. It's more akin to a basic competency that opens a lot of doors and prevents you from being taken advantage of.

My analogy is usually along these lines: if your job requires you to drive a car, you don't need to be a mechanic or be able to build a car...but you should at least know the basics of how to operate and maintain a vehicle beyond just gas pedal, brake pedal, and wheel.


10 years ago, when I started working, I feared the younger generation and their enthusiasm and energy. Then I had to teach a few of them how to ssh, and that fear dissolved. Most of these kids will not cut it in heavy reading/comprehension jobs.


In my experience the up-and-coming generation are far, far better at branding than my own generation ever was. Talking to my younger colleagues I generally get a deep feeling that they know what they are talking about, are thoughtful and make good decisions.

But the thing is, that's just branding: they aren't actually much more competent or thoughtful than my generation was at that age, they just seem like it. When I look at the code they write, it is as bad as the code I wrote. They have startling gaps in their knowledge and experience, just as I did (and no doubt still do — there is always something to learn!).

The thing that worries me is that until one gets to more objective measures, they really do seem more competent and trustworthy — which means that others are more likely to trust them, which is likely to lead to more bad decisionmaking at scale.

I wonder, though, if there is really a difference at all. Maybe my generation actually seemed more competent to our betters than we really were, too!


As an "older" coder I get where your hinting at but your view on Zoomers in tech from your high horse seems very narrow and entitled.

I'm sure there are Zoomers out there who can code rings around you(and me).


not really meant as a hard judgement, just sharing my experience. i am 100% sure there are folks at all age levels that can be and are better than me. i have definitely worked with some.


I think the parent just meant the surprise at the lack of enthusiasm, based on the ubiquity of the technology/device.

Perhaps it is considered more of a commodity now?


> Perhaps it is considered more of a commodity now?

It is and it should be. What the white beards all too often forget is the fact that the generation before them would say the exact same thing about them (lack of enthusiasm, lack of interest and knowledge, etc.).

This is a story as old as time:

> The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households.

attributed to Socrates (469–399 B.C.)

How many 80s kids were into HAM radio or actually building computers (Ben Eater-style: https://www.youtube.com/c/BenEater/videos ) instead of toying with micros or basically playing Lego with PC components? Same difference.

Today's computers have reached a level of tight integration and complexity that simply cannot be grasped by a single individual anymore.

People keep whining about that when it comes to computers but happily accept the very same thing in other commodities like cars, clothes(!), highly processed food(!), hygiene and cleaning products (soaps, hair gels, rinsing agents, ...), utility grids, etc.

It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.


> It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.

that is perhaps the most poignant take on this i have come across in a while. thanks that got the cogs turning in... different directions. i do need to ease up. :)


I have to disagree with you about cars.

There has been a significant increase in demand and cost for pre-owned vehicles that do not have integrated entertainment systems and drive-by-wire systems.

In the used truck market, I have seen vehicles triple in value over the last decade.


The issue with cars and trucks is the ability to repair them (not necessarily by yourself - just in general), not interest in the technology, though.


Problem is the pipeline. Schools that really bought into iPads for "digitalization" will produce learning results that are significantly worse for understanding tech applications and information technology, because the principles get obscured by fancy UI. Never mind that it wouldn't even be different from their phones.

Another problem is that teachers would require some extensive training before they could even try to teach kids something they didn't know before.


I do apologise if this sounds condescending: what are you on about?

I do agree that the majority are limited in terms of technical knowledge, but I think that was always the case. It's always a (driven) minority that is responsible for tech advancements, preservation and magic.

I wish I was 14 in this day and age, with this much information, all these how-tos, and cheap (and expensive) tech and microcontrollers (when is the last time you checked places like AliExpress, or the Git repos out there?).


We live in a potential golden age for technology education:

- Hardware is ridiculously cheap and powerful and widely available

- Inexpensive educational computers such as the BBC micro:bit and Raspberry Pi are widely available along with associated educational materials; these devices are easily interfaced to the real world, and can often be programmed at a very low level or high level as desired

- Overall, robotics and real-world computing is cheaper and more accessible than it has ever been

- FPGAs are ridiculously cheap and powerful and allow you to explore processor and digital hardware design

- Nearly every computer, smartphone, or tablet contains a web browser that can be programmed in JavaScript, WebAssembly, and dozens of other languages that compile to JavaScript or WebAssembly; web apps can display 2D, 3D and vector-path graphics, generate sound, and interact with all sorts of input devices

- Nearly every computer, smartphone, or tablet has a Python implementation as well

- Free and open source software is widely available

- Free and open source game engines are widely available

- Games are often moddable and many contain built-in level editors

- Source code of almost any type of software you can imagine, from operating system kernels to compilers and libraries to databases to web browsers to scientific computing to games and productivity software, is widely available

- Billions of people have internet access

- Tons of free and often high-quality educational materials are available from universities, khan academy, and many other sources on youtube, on github, and all over the web

- Many books including textbooks and tutorials are available for free as PDFs or readable on the web

etc.

The materials are out there, but the challenge is empowering people to find and make use of them.


Omg, I can't upvote this enough. I have been telling principal/director-level engineers this for almost a decade. We shouldn't be making the hiring bar so high that it requires a PhD, but rather hire on aptitude and hunger for knowledge, and feed them. So much of being a senior+ in tech is knowledge sharing, which we don't do enough of. I'm guilty of this too! I value my personal time and don't want to spend it building training materials.

The whole “playing” with system level software is how I became an engineer in the first place!

It's also extremely apparent when you suggest a new way of doing things and older folks look at you like you're crazy, or worse, actively block it because "we've always done it this way". Those youngsters on those tiny screens have big ideas, from a different perspective, and we should acknowledge that and learn what they know.

Back to topic: I think the maturity of software engineering has enabled a lot of what we see today due to the macro cash grab of capitalism. There are things that stand firm beyond it, but 80%+ of software written is logical duplication of software already written (for another company, perhaps).

I'm guessing at the percentage, but it's important to know that global overall LOC is really just rehashing the same stuff we've been doing (albeit in maybe a slightly more novel way) to get business value.

The remaining 20% or less is moving the stick forward.


> We shouldn’t be making hiring bar so high that it requires a PhD but rather hire on aptitude and hunger for knowledge and feed them.

As a self-taught developer with just a few years of experience, I appreciate seeing this sentiment. I have this burning desire to be a great engineer and to learn as much as I can, but I'm not exactly sure what I should be doing or learning.

Do you have any advice as to activities, books, or other resources that you think would benefit a young engineer who would like to learn how to do things right?


Learn as much as you’re enthusiastically able!

I’m on year 20 of this amazing ride. I have another 20, possibly 40 in me. I love it!

I was self-taught. The trick is to find something you’re super super interested in and learn as much as you can from books, youtube, Google, blogs, other members of HN.

I was really into games and wanted to make quake-style fps games. I started with art. Went to college for graphic design and hated it so I dropped out. Worked on games while I did crap jobs. Eventually made a few small ones and got really into web development. Got a new job...

I can’t really tell you what books without knowing your interests.


As an old dude, I agree with you completely. We push way too much on the degree/school thing in HR, long before we get to interview them. For most positions, I would rather have an enthusiastic learner with a BS, or even an associate's, and have them learn how it is done from experienced staff. If they really like it and want to pursue more college, fine. If they want to dive in and learn by doing, fine.

I am in my retirement position now, stepped back to staff engineer, because I don't want to spend any more time stuck in meetings. The best principals, the best directors, all came up through the ranks. Yes, some of them returned to college for a few years before coming back, but the best ones just came in as enthusiastic greenhorns and dove straight in.

I have worked long enough now (36 years in this field, and this is my third 'career') to have seen people I mentored grow into director level positions, and that is perhaps the most rewarding thing there is.


> I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy

Aren't there more kernel developers now than ever before?

It's not the case that we're struggling to produce technically skilled software engineers.


Thanks, that was a really good watch!


> I worry for Gen Z because they're tiny-mobile-device native.

Gen Z here. I learned most of my basic computer skills by playing / modding games, editing videos, and having no friends. Gradually snowballed into writing scripts to make things easier, learning more programming languages and eventually getting books on more formal CS topics. I hope my kids can have a similar experience.

The ugly and maladjusted among us will always find a way ;)


On a related note, game consoles seem to be going in the same direction, focusing on services and downloadable content or games over physical media (and even the games that are physical seem to have a large online/multiplayer component). After a console generation or two, these services get taken offline as they are no longer profitable, leaving a portion of your system useless.

I can go back and play the old games I grew up with on any number of platforms, thanks to emulation or buying old physical media. This generation won't be able to do the same in 10-30 years.


Also, it seems a lot of games are only meant to be enjoyed as distributed multiplayer, not really as a single-player 'quest'.


> What's worse is the number of other posts with the top comments being vociferous defenses of these companies

I increasingly tend to think that some of those commenters, who fiercely defend any and all actions taken by Apple to limit the freedom of its users, are paid by marketing and PR firms that scan forums to oppose any opinion not in line with Apple's vision of a closed system.


I understand the sentiment, but how many people were ever going to understand their systems at that level? I'm sure Gen Z will have their own tech equivalent of Roy Underhill telling them to put down the Ikea bookshelf and grab an axe and a log, and that the number of people who respond then will be about the same as now.


It is a choice to buy or not buy a device from a company. In this particular case, the greatest threat is that devices that were always used by power users are a step away from denying us superuser privileges, leaving a moderator in charge instead, as on an iDevice.

At the same time, Gen Z has way more choice of, and exposure to, "computers", so worrying about them sounds very diminishing of their prospects. Sure, adoption of computers, even ones that fit in your bag or your pocket, is much greater nowadays, which leads to a way wider range of usages, for better or for worse.

In the end, the same could be said about the Edison generation and electricity. I barely have a clue how electricity networks operate. Can I experiment with them? Sure. Do I need to if I just want to power on my computer? No.


The next laptop for me is going to be from a company that explicitly supports Linux. Probably Dell or Lenovo.

A lot of people like to rag on Ubuntu, and I agree with some of the criticisms. But the one thing they have done right is broker partnerships with hardware vendors like Dell and Lenovo.

https://www.dell.com/en-us/work/shop/overview/cp/linuxsystem...

https://news.lenovo.com/pressroom/press-releases/lenovo-laun...



I'm personally quite fond of System76.


Unfortunately, System76 still uses 10-year-old LCDs. Let me know when their displays are at least as good as an 8-year-old Retina MacBook Pro's.


Highly recommend the Dell XPS 13 if you enjoy the MacBook build quality. The edge-to-edge screen feels like the future.

I run EndeavourOS, which is like a GUI installer for Arch. Arch's package management, with the AUR (Arch User Repository), will remind you a lot of Homebrew. And the Arch wiki helps you get up to speed quickly.


I did that already with a ThinkPad T495. Couldn't be happier. With Arch Linux and KDE it's objectively far superior to Mac OS for development (I do mostly TypeScript and Rust).

The crazy thing is that I've been blind to the massive improvement that happened in Linux-land for the 7 years I was using Mac OS.

Of course there is still some PITA with missing productivity software (in my case for hobby photography), but so far I am determined to suffer with open source tools and maybe try to contribute a little.


Superfish was reason enough for me to never touch Lenovo, and all the other malware they've infested their systems with since has only made me feel better about that decision. Dell might be okay, though.


Use Fedora. I run Fedora on a Dell XPS, and version 33 is great.


I remember people pretending to be "skaters" way back, "hacker culture" reminds me a lot of that.


Hah, I've become a skater relatively recently. It's my Zen place.

It's a good analogy: you have to want it. Either skater or hacker, any label really, must be earnt with time and effort. True for all things, and those who have earnt it can spot the pretenders a mile away.


And what do you expect us to do, install Linux and spend days configuring the touchpad?

Computers are a tool we use to do our work. Setting up the computer cannot become a job by itself.


I've spent far more of my life trying to get development-environment shit working on a Mac than I ever have getting touchpads working on Ubuntu.

For example, the latest version of macOS broke Docker; we discovered this literally today.


Not the most productive comment, but I always wait 6-12 months before doing a major upgrade; I let other people find issues like this. Or just wait a month if you're in a hurry for some reason.


>latest version of Mac OS broke docker

That's wrong; Docker will run fine on M1/Big Sur after an ARM release and probably some patching. Not too different from what you'd expect on any major OS release.


How do you know it will?


Apple showed it on stage at WWDC, along with Debian ARM running inside a VM. It's coming, if for no other reason than developers inside Apple need it when writing infrastructure software. (Honestly, I'm surprised that Docker didn't get the same pre-release hardware Parallels did)


Because M1 has virtualization and emulation support, unlike the A12Z (not M1) ARM processor on which the docker issue was opened. This is explained here: https://news.ycombinator.com/item?id=25073010

And Docker has the motivation and means to patch and build the M1 Arm version.

You can take it directly from the Docker people:

> Thanks. We're working closely on this with Apple. This is expected right now (and indeed there's nothing we can do about it yet) as the new chips don't have virtualisation support yet.

https://github.com/docker/for-mac/issues/4733

Notice the lack of panic or any mention of "it'll never be done", even implied.


Because Docker has a large financial incentive to get it working?


Buy one of the many laptops that work out of the box with GNU/Linux. Vote for manufacturers with your money. It's not that hard.

I haven't had to spend days configuring anything for over a decade.


My colleagues often joke that I have to recompile my kernel whenever I want to do something novel with my computer.

I play along, but I haven't compiled a single thing in years. I don't even think proper build tools are installed on my PC.


I find that statement a bit dubious, but I guess if you only use tools that are in the repos, plus Chrome, that's probably easy enough to accomplish. I remember fairly recently having to compile my own DOSBox because of a many-years-old bug that remains unfixed and causes garbage graphics in the black borders [0].

My history of using the Linux desktop is riddled with crap like that.

[0] https://sourceforge.net/p/dosbox/bugs/447/


No, I expect you to buy a well-built tool from a Linux-first manufacturer, where you pay a premium to get a device that "just works". This is, after all, the Apple way, right? You pay a premium for a Mac because it "just works", right?

https://linuxpreloaded.com/


Doesn't even need to be "Linux-first." Our standard Linux laptops at Red Hat are Thinkpads, many of which are used by people who are not particularly technical and use a locked-down corporate build. (Many others run Fedora or CentOS.) I'm sure we have many thousands in use.

The Dell XPS 13 is supposedly good too although I don't have personal experience.

I use Macs quite a bit but the idea that running Linux on a properly-selected modern laptop is likely to be an exercise in frustration is just a myth at this point.


I was mostly being tongue-in-cheek, making a point that one doesn't have to buy a Mac to get a great software-hardware combination from a recognized manufacturer.

But you are absolutely correct, and my Linux laptop is a Thinkpad X1 Carbon. It's also well supported under OpenBSD but for what I do with it (mostly media consumption and some light gaming mixed in with the usual laptop stuff), Linux works out better.


The Dell XPS 13 is a solid choice; the network/wifi card may have to be swapped out, but they support Ubuntu natively. Cheap to buy used and generally very good.


An open computing device doesn't need to be unusable, and a usable computing device doesn't need to be locked up. There's a lot of space to explore between those extremes.


It's funny, because almost nobody uses Apple where I live (Poland). The only people who use MacBooks are some "artsy" types and some managers (because they only need Office and Outlook to work :) ).

Developers never use Apple because it has tons of compatibility issues.


Is it really the case though? I am also from Poland (though I do not live there anymore) and based on my experience it was rather a popular dev machine a couple of years back, easily seen at conferences. It is of course less popular due to pricing but definitely not absent.

Can you maybe also comment further on the compatibility issues? Mac is perfectly fine for any general-purpose frontend/backend development. Unless you work in a Windows-centric environment or in a setting that clearly rules out a Mac (e.g. Linux system programming), I cannot really imagine what compatibility issue would prevent you from using a Mac e.g. for Typescript, Java, Python or Go development. Not sure why Poland would be less friendly for developer Macs than the rest of the world.


It is a luxury brand in the US. Every tech person that has made it would never buy anything but Apple. Yes, things like battery life are great. But if you are seen working at a coffee shop, there is a 100% guarantee of that glowing Apple logo on your laptop.


I know, I just can't understand it. For anything open source, Apple ports are third-rate, after Linux and Windows. Sometimes even after *BSD :)

Good PC laptops are much better if you're using them like a developer would (move it twice a day, code on it for hours without moving).

Thermal design is awful on Apple laptops; they are basically running at half speed by design. And you can't fix anything if it breaks.

Why people would value brand over all these concerns is beyond me. Must be consumer patriotism, like with Harley-Davidson :)


Could it be that working at a coffee shop benefits from good battery life?


This is not the case any more. A few months ago I migrated out of the macOS ecosystem and got myself a ThinkPad X1 Carbon. I run Fedora on it. It works as seamlessly as macOS did on a MacBook. I haven't had a single hardware compatibility issue to date.


I've had a ThinkPad and Ubuntu for 5 years. The 'hardest' thing I've had to deal with is some flaky drivers for DisplayLink. I'm not at all putting crap on DisplayLink; it's been fantastic. I just upgraded to 20.04 and hit a few small issues.

Ubuntu's been really solid. Boring, even. Which may not seem like a compliment, but I don't nerd out on OS stuff; I just want a computer that I can do stuff with when I need it. And I personally found that with Mac and Windows, my ability to 'just do work' wasn't there.


Use a mouse. It works out of the box and is more convenient anyway.


The exceptionally good touchpad on the MacBooks is almost half the reason you'd buy the machine in the first place.

Wish it was as good on the alternatives. Hopefully it works 100% out of the box, but I wouldn’t bet my life on it.


What are you on about? Apple's touchpads feel like mush, even by touchpad standards.


I don't believe people have many complaints when it comes to the usability of Apple hardware. Touch Bar excluded, of course. And that keyboard.


There's probably a business opportunity in that ;)


But it should be your tool. Apple just grants you the right to use it in the way they think is best for you. That leads to this: https://mobile.twitter.com/lapcatsoftware/status/13269902964...

How can it be that people are all about freedom when it comes to governments, but have no problem if private companies censor and control what they write and do?


Because if governments get involved we’ll never be able to solve the problem.

I.e. if you take away the rights to privately create and distribute software, the current situation will become permanent.


We meet again zepto!

Nobody wants to take away their right to do so (ok well maybe some do; I'm not one of them). Just that there should be rules we all agree to when we do.

We accept this in other venues. Seatbelts. Stoplights. Nutrition labeling on packaging. etc..

No such rules exist for software, which is odd. It does seem to suggest the industry is ripe for regulation. It's always disappointing to me that we only accept limits when the law places them, as though ethics and morals are somehow not good things to pursue for their own reasons.

Which goes back to OP's point: just because a company CAN be as dystopian as they can imagine, doesn't mean they should be.


We already have general rules against fraud and other kinds of crime which apply to software just like any other product.

Only a very small set of products have special regulations, generally those which we deem unsafe for consumers unless regulated.

“Seatbelts. Stoplights. Nutrition labeling on packaging. etc.”

A good set of examples. Every one of them is about consumer health and safety. This is consistent with liberal democracy - people should be free to do what they want unless it hurts someone else. That is why all product related regulations are related to safety.

It seems reasonable to assume you are arguing for rules which prevent the installation of unsafe software on computers which are sold to consumers.

Apple would be the most compliant company with such rules, and would benefit greatly.


That's not what I said. I didn't say the government should get involved. I asked why it's ok that a private company controls and censors us.


It’s ok if you think it’s ok.

If you don’t, you don’t have to use their products.


Because they own shares


LOL. Right


We expect you to be a hacker.


A touchpad is not really a proper tool for work (a laptop is hardly one either, except in cases that require it); it's more of a handy last-resort option that comes included with the laptop and hence is always available.


Well said. The hacker culture is not immune to the trappings of luxury brands. You can’t show off your wealth using a no name brand running Linux.


Switching to RISC-V won't change anything if those same hackers just build SaaS hidden behind a server wall, with a pretty web interface implemented in WebAssembly talking over gRPC.


The one thing that gives me hope is the pendulum of open->closed->open has swung back and forth so many times now that even as depressing as current trends may be, I'm confident things will swing back again.

Not sure when, but I'm certain it will.

It is one of those strange things, the more closed things become, the greater the incentive is to open them back up again.

The hacker ethic is still out there, just this recent boom has dulled it a bit. A steady diet of locked down computing will fix that.


One of the major drivers is app developers. You can't build an iOS app without buying a Mac, and if you're buying a Mac anyway, you end up living in their ecosystem.

It's kinda abusive how you need to pay the hardware tax, the yearly $100 USD rent, and 30 percent of your sales for the privilege of developing for their platform.


I'm glad windows server lost to linux so we'll at least be able to run our own servers for the foreseeable future. But I see platform layers being the threat there (AWS, Azure, GCP).

> literal cancers that are eating the software industry

figurative cancers


What is your ideal solution? Linux seems to have largely won. Will it run on RISC-V?

I wish we had a completely open system of hardware and OS. Is Librem making any progress in this space?


Like all subcultures, when the hacker ethos went mainstream its spirit died and all that was left was fashion. No one cares anymore whether their consumer choices are a threat to freedom as long as they look cool. Outward appearances are now more important than principles. Prickly principles require sacrifice, and why have character when you can have a disposable "personality"?


The hacker ethos didn't go mainstream. People who don't deserve the moniker started self applying it.

I'm not a hacker, I'm a corporate shill, because I work for my own profit within a system that's set up to ensure its own continuity. I'd like to be a hacker, but I have a family to support and am too weak, or addicted to lifestyle, or scared, or realistically minded to actually commit to what's required to earn the title of hacker. Same with anyone working in advertising, marketing, or consumer data collection. It doesn't matter how intelligently you solve a problem in those fields; no starry-eyed kid ever wanted to grow up to make the world a better place by more accurately identifying an individual's preferences on the internet.

Fuck fake hackers. The best thing I've done with my life is being a good husband to a teacher.


I totally agree with you.

The ethos went "mainstream" in the sense that it was fashionable to play the part and be in your words a "fake hacker". And for future historians and those unfamiliar with the jargon, hacker ethos isn't about criminally breaking into computer systems.


amen. all these so-called hacker capitalists either starting a company or working corporate gigs forgot to read the hacker manifesto.


A real hacker who has any concern about his or her privacy wouldn't touch this kind of product with a 10-foot pole!


And yet, Apple still has a reputation of being really privacy- and security-conscious. Windows' telemetry has had all the alarms raised for years now, and how does e.g. Canonical earn money?

Same on mobile, Android phones - especially in the cheaper ranges - are dodgy as fuck when it comes to privacy and security.


Does Apple really have literally all telemetry disabled by default, out of the box, on all devices? Is there an independent audit proving that? If so, I would buy all their products in an instant.

Then how do they know what to fix/improve without that? I can't imagine average iOS/OSX users, who aren't devs, are writing and submitting detailed bug/crash reports on a regular basis.

Or is it just bias and personal preference where "I don't mind Apple collecting my telemetry because I like Apple but Microsoft collecting my telemetry is evil because I hate Microsoft"?

I don't have a dog in this fight but don't trust any for-profit company when it comes to privacy, no matter how good their PR is on the matter, especially since both of them(and most SV companies) were part of the NSA Prism bulk collection program[0].

[0] https://en.wikipedia.org/wiki/PRISM_(surveillance_program)


Apple gives you the option to turn telemetry on/off as part of the machine setup & OS upgrade process. This option is explained in simple English and not buried in the legalese.
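
A minimal sketch (Python) of checking that choice after the fact; the plist path and the AutoSubmit key below are what macOS has historically used to record the diagnostics opt-in, so treat them as assumptions rather than a stable API:

  import plistlib

  # Historical location of the "share Mac analytics with Apple" choice
  # (an assumption: Apple may move or rename this at any time).
  PLIST = "/Library/Application Support/CrashReporter/DiagnosticMessagesHistory.plist"

  with open(PLIST, "rb") as f:
      prefs = plistlib.load(f)

  # True means diagnostics auto-submit was accepted during setup.
  print("AutoSubmit:", bool(prefs.get("AutoSubmit", False)))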


The question is: is the telemetry really completely off on the Mac, or on either platform?

Is there any guarantee that Apple gets zero bytes of your data or do we just take their word for it because their products are shiny and cool?


Those questions apply to literally every computer.

https://www.bunniestudios.com/blog/?p=5706


No hacker would trust Apple, or an independent audit, to make such a claim anyway.


There's a conundrum here that requires further investigation: the intertwining of the concepts of privacy and security, and the way that increasing the security protections of a product can require a decrease in privacy. Security requires a probing knowledge of the details of what is running on the device.

The same information that can contribute to increased security for the end user is also valuable to sell to advertisers if, delightfully ironically, the security vendor is ok sharing private information with said advertisers.


You can't apply nearly the same heavy accusations to Canonical as you do to MS and Apple; Canonical makes the bulk of its money from Ubuntu Server. The desktop OS is just a byproduct. If you really want to get technical, you could say that Ubuntu's popcon tracks you, but you can disable that from the package manager settings, because "oh no! they have anonymized package usage records of me!"
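
For the curious, a minimal sketch (Python) of checking the popcon opt-in yourself; the config path and the PARTICIPATE key are how Debian/Ubuntu's popularity-contest package stores the setting, and `dpkg-reconfigure popularity-contest` flips it:

  from pathlib import Path

  # popularity-contest keeps its opt-in flag in a plain config file.
  conf = Path("/etc/popularity-contest.conf")

  if not conf.exists():
      print("popularity-contest is not installed")
  else:
      enabled = 'PARTICIPATE="yes"' in conf.read_text()
      print("popcon submissions enabled:", enabled)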


Real hackers don't need secure boot and code signing either. :) Real hackers have backups, plan for failure, and are not afraid of consequences.


As a wise man once said: "haha, pwnd!"


> people have to get used to seeing computers as closed-source devices with app stores

Consider all of the school age kids using chromebooks, many of them their 1st computer.

Will they care in 10 years, as adults, that their Mac doesn't allow programs to be installed from "untrusted sources"? Unfortunately, this transition probably won't faze them.


They won't be able to build a disruptive startup or a modest lifestyle business anymore without having to pay upfront and play by the rules of big tech. But nobody will be able to, so they won't feel that something was taken away from them.


This actually describes my feelings about all this at the moment. I imagine that now and in the future, the two-Steves-in-a-garage scenario will be next to impossible.


Ironically, Chromebooks are the "freest" hardware you can get. They all support full-featured coreboot, including measured boot and verified boot.

https://www.coreboot.org/


The PC world only happened due to a singular turning point that IBM was unable to prevent, even though it tried.

Everyone else was vertically integrated; if anything, Apple is the survivor of those kinds of systems, and now everyone else (the OEMs) wants that model back.


> it's becoming the antithesis of everything that the original Hacker culture stood for.

A hacker in the “Hacker News” sense is more than likely working on an ad-supported web app and dreams of working at a FAANG. The old hacker dream is dead, sadly.


You're not looking hard enough, which is also part of the problem.

Back in the "golden years" you'd have to put effort into finding and participating in the community. The real hackers are still somewhat underground, simply because they're anti-establishment, actively seeking out the high-effort, loss-making niches that attract fringe thinkers and the unique characters required for true innovation.

Crypto parties, maker faires, get to one. If nothing exists locally, build it and they will come. (And admittedly I need to take my own advice).


We have an open space workshop and hack shop in my town, and it’s frequented by quite a lot of people. You’ll see anything IoT, 3D printing and Linux you can imagine being hacked upon. You can also see people building race cars, or ideas that eventually turn into businesses.

Everyone has an Apple or Android device, and Macs easily make up 90% of the personal workhorse laptops. Because at the end of the day, technology that just works is nice too.

I get that it’s probably very different in crypto communities, but we don’t really have any of those around here. I wouldn’t say that real hackers aren’t using walled gardens though.


Maker faires perhaps, but crypto parties seem a lot like pyramid scheme parties on steroids.


Long gone. The last time I had a conversation here about what a "hacker" is, there was a sizeable contingent who defined it as "someone who knows how to write computer programs".

If that hacker ethic lives, it's not in people labelled "hackers" anymore.


It's alive and kicking, but certainly not on "hacker" "news" or SV.

See all the events like CCC and the fediverse.


I know so many CCC people who have trouble making a living or are (un/happily) employed by companies that went all in on Apple, Google, Microsoft, AWS.

Many of them are for sure brilliant brains, but they fail to monetize it. To break or stop near-monopoly/walled-garden systems, you have to dismantle them through the market: build better/cheaper (primary) alternatives which are free and open (secondary).

IMHO the mainstream only values convenience and price; freedom is only a "nice to have" item, and when it requires more work it is a non-starter.


> Many of them are for sure brilliant brains, but they fail to monetize it.

...as if selling your brain was the most important thing in life.

> you have to dismantle them through the market

Citation needed.

The large majority of modern technology was developed with public money.


Fortunately Apple has like 3% market share in the desktop market globally[1], so it doesn't really matter what they do.

[1] Browsing data shows around 10% market share, but it is based mostly on English websites, which are a minority (and English-speaking countries have a significantly higher percentage of Apple users than the rest of the world).


Even English-speaking countries don't have that high MacBook usage, because the premium sticker price is extremely high. Unlike in phones, where an iPhone is comparable to or in some cases less expensive than equivalent Android flagships, Macs tend to be several hundred dollars more than even their flagship laptop or PC counterparts, and from my personal experience a computer running Win10 is not that flaky compared to some Androids I've used.


But Windows also phones home, and the Linux desktop is still a shitshow (pardon my French), arguing over and over again about the same melange of disputes that should have been settled a decade ago...

Not sure if it’s a sign of aging, but I’m losing hope and interest in most of these pantomimes /s


Linux desktop: I just use what's presented and get on with my work. I don't have a particularly complicated setup, but I found something that works and I've been using it happily enough for over a year, having migrated from Windows.

I don't know what disputes are being argued over, so it bothers me not in the slightest. Would it help if you just ignored the detail and worked with what's presented?


Dual booting is still a problem. I just tried to install ubuntu as dual-boot on a separate SSD from windows (which I keep around for video games) and I ran into 13 bugs (I wrote them down!) within two hours - including two which caused fatal issues in the installation process itself which I had to run around eight times before it succeeded. Then it somehow permanently trashed the ability of windows to boot. After spending three hours trying to fix that I said fuck it and nuked both OSs with a fresh windows install.

This is the first time I've tried installing linux in about a decade, after using a stable dual-boot setup during university. Things seem to have gotten worse. It'll be another decade before I try again, probably. I lost two days of pay and probably years of non-grey hair dealing with this fiasco.


Really? I just set up LVM2 on my new Dell, partitioned some space, ran the Windows installer, and it just worked(tm). I do still remember how hard it was to dual boot on my old laptop; I don't know why it was different for me this time.


If you are happy to just ‘work with what’s presented’, why not buy an Apple device or ChromeOS machine?


There are many reasons to choose Linux over the alternatives that have nothing to do with one's ability to customize it.

One example is the simplicity of system management. Desktop oriented Linux distributions tend to take care of everything from application installation to updates with a single interface with minimal interruption to my work flow. This is only sometimes the case with macOS and Windows. Android, iOS, and ChromeOS are not realistic contenders in the application space for some users.

Cost of ownership is another factor. Linux may not make sense for some businesses if they have to hire someone to manage their systems, but an end user who can handle often trivial tasks can usually support their own system and benefit from less downtime. While paying for software is a good thing, it frequently adds many constraints on what can be done while modern business models can make licenses prohibitively expensive (e.g. subscription models or various forms of forced obsolescence).

Other reasons include: sometimes the desired software just works better under Linux since it was designed for Linux, a desire for privacy or a need to ensure confidentiality, compatibility with older hardware that is no longer supported by the vendor (but may be supported by open source developers).

Working with what's presented simply means that you are unlikely to modify what is shipped by the vendor. You can still add to it or benefit in other areas.


So you are claiming that simplicity of system management is a strength of Linux?

Ok, we disagree on that, but I get what you are saying.

But, another aspect of what you are saying is ‘trust the vendor and don’t look under the hood’. This is no different from trusting Apple or Microsoft.

You are actually confirming the general point that Linux provides no benefit in terms of who you have to trust.

I.e. it doesn’t in fact give you a system that you own and control - you are at the mercy of a vendor.


Most Linux distributions are also easier to audit than commercial operating systems and some go as far as encouraging it. While this is optional and requires a deeper understanding of what you're looking at, it does not require handing control over to a third-party (including the vendor). You can choose to trust the vendor by clicking a button, you can use integrated tools to choose what is done and when it is done, you can use those tools to audit what is done before it is done, or you can audit it down to the source code level. This is a far cry from Apple, Google, and Microsoft's approach where you have very little control and what little control you do have is complex to access.


To be clear: my commentary on "work with what's presented" was merely in response to above comment about Linux desktop shit-show. The following is about choosing Linux over alternatives.

I avoid Apple because of the slow descent into vendor / walled garden lock-in, which suits neither my wallet nor personality.

I have a couple of ChromeOS devices for the kids' schooling, but they're unwieldy for my workflow.

You didn't ask, but I moved away from Windows because of increased bloat, telemetry, and the dual issue of decreasing control over "services running" alongside a noticeable slow-down in performance frustratingly soon after a fresh install. I've also had two occasions (which is two too many) where Windows decided, on its own, to install updates and reboot, losing whatever I had open at the time, with no message pre- or post-update, just a clean login screen the next time I went to use the machine. I think they've made that better, and there are options to control how updates are dealt with, but I've already made the jump.

"what's presented" by Linux may not be perfect, but it's closer to my perfection than the alternatives I've tried (to be clear, this is what works for me, not necessarily anyone else).


Windows is also a shit show. Windows updates have bricked systems for decades.


People, outside of HN, don't care about operating systems. They are all good enough and have been for a long time. What people care about are applications. If you want to play games, you are probably going to buy a Windows computer. If you want to run Logic, you're going to buy a Mac. If all you really need is a browser and email, you are going to choose by other criteria like price or beauty.


Microsoft tried this with Windows RT. It didn't go down well at all. The slow burn might be what's needed, but if they do this, it could damage the reason people still buy Macs vs. just buying iPads. Let's see; time will tell here.


I don't think Apple would object to everyone dropping Macs and buying just iPads.


They have had ads in the recent past to that effect


There is always a war on when property is not democratized. In the Anglophone world, the easiest example to think of is enclosure[1] which was a war between the mercantile class on one side and the lazier half of the lords and the little people on the other.

If any company, be they Google, be they Apple, should experience total "victory" in the war on intellectual property they would find everyone who is not

  * An investor
  * In management
  * An employee
  * A temp/contractor, or
  * A happy customer
Set against them and they would fall. Imagine there was a neural network or network of neural networks that could arrange bits into entertaining movies. It would print money and it would not be strategic for whatever company had the resources to do that to hoard the benefits, they would have no customers if they did (I suppose if they had enough compute and memory resources to succeed in such a task they might already have a monopoly on money).

I used to understand the CEO's job as cheerleader, for the investors, for the employees. I still think that is how they operate tactically, but strategically, I think the goal of a CEO in an Obama-era neo-liberal economy is to put the whole world into one of those categories above, I suppose, with the acceptable additions of

  * Vendor/supplier
  * Companies in partnership
  * Competitors who participate in a trade association together in a reasonably civil manner, or
  * Members of government who are lobbied to advocate for your firm or industry or a strategic need that you hold in common with the broader public
In the long run the alternative is to perish. Utopia I think may have been inevitable by the 50s/60s when people started to believe that some companies really would last forever. On a long enough time scale, it is best to do everything in everyone's best interest considered in toto.

So all of that to say that I expect one day, maybe 40 years from now, maybe 80, that IP as it operates today will be transformed into an index of ideas that everyone can take and implement who might put them to use beneficially for the citizenry and other residents. It will be nice after all of that work to have a primary key for the ideas that have been created so far, so I don't think we the people will burn down the USPTO or the LOC or anything like that.

[1] https://en.wikipedia.org/wiki/Inclosure_Acts


"What we poor souls didn't understand all these years is that" computing and information processing started out as tactical and then strategic components of military, intelligence, and industry. The central computing infrastructure ("the Cloud") and attendant closed/dumb psuedo-terminals (closed systems, walled gardens) is precisely the vision outlined in the 60s if not the 50s.

Two unexpected events in information processing technology threw monkey wrenches in the industrial and geopolitical policies of the strategic thinkers dreaming in DARPA, SRI, and elsewhere:

1 - Personal Computing was an unexpected event. It has taken all of 4 decades to put that genie back in the bottle. The topic of this thread.

2 - AI. This mainly threw a big wrench in the planned integration of Communist China into the Western system. AI enabled CCP to maintain control of the Chinese society, which was completely unexpected.

Having advanced general-purpose, user-programmable computing machines networked globally in the hands of peasants is simply unacceptable to the folks who gifted humanity with these tools.


True, but I think the people who gifted us these tools were the first to have them taken away again.


> 2 - AI. This mainly threw a big wrench in the planned integration of Communist China into the Western system. AI enabled CCP to maintain control of the Chinese society, which was completely unexpected.

Could you expand on this or better yet, point to some reading on the subject? This sounds like a very interesting topic.


Pure speculation on my part, trying to understand the nature of the relationship between the Communist Party of China and Western power centers.

Back in the early 70s, when the deal that Kissinger and Mao shook hands on was made, there were no PCs, nor pervasive networking and mobile communication and computing. It was always my speculation that the risky gambit of fast-forwarding China's development and economy and freely transferring crown jewels of technology was made with the conviction that the process of integrating with the West would naturally create a power base in Chinese society that would push aside the Communist Party. Tiananmen ("June Fourth Incident") was a manifestation of this correct original prognosis of the planners of the deal, that it would cause political headaches for the CCP. A cultural instead of a kinetic overcoming by the West.

I think the consensus now in the West is that the CCP, armed with AI, pervasive networking, and the emergence of a digital society in the true sense, can stifle any possible organic formation of alternative poles of power in Chinese society. The only remaining challenge to the CCP were the Tech Mandarins (like the Senior Self-Crowned Bozo in the West, a certain Bill Gates), and just this week Jack Ma was reminded of who is in charge in New China.

And of course, Eric Schmidt et al. are now salivating at the "Chinese Model" and we're gonna get the same, regardless of what little you and little me thinks about it.

/end speculation


I hate walled gardens, and in Microsoft land they did get a lot of flak, which I am thankful for. I doubt many Apple users have the foresight of a Stallman to see where their environment is headed.

There is money in creating apps for Apple systems, but you are giving up a lot in exchange.

Some developers use it for convenience, but I would not say it is widespread, and most are fundamentally opposed to using a Mac.

Our advertising department is the only one with Macs. Sadly, we still support Apple because every sales rep gets the whole suite: iPhones and iPad Pros (because the others are too small and you cannot work with them... what a revelation...), that is a few thousand bucks for every employee that Apple gets. They even get the ridiculously priced pen ($120?) and keyboard ($300+). And our sales people still complain about not having notebooks, because those are superior devices. There is just the presentation argument.


Probably 90% of the developers I've worked with in the past 5 years used a Mac, and really wanted to do so. I know a handful that left Linux to do it, and they have told me they can't imagine going back. I'm not sure that you're right about developers.

I do know that my next machine will be Linux. I'm probably going to put together a desktop for my office, and I'll buy a System76 laptop or whatever for those times when I want to work from the porch.


Anecdote: I was on OSX for ~9 years, +/- 1 or 2. I came from Linux (Ubuntu) and Windows for ~10 years before that. I made the switch back to the same (Windows + Linux) a year ago and I've been nothing but happy. I know people tend to complain about Linux on the desktop, but I had great experiences a decade ago, and I do again.


I always have complaints about my Linux system, until I realize that my Windows colleagues do the same. It's shit here, shit there, shit everywhere, I imagine. I've personally had good luck so far; each of my machines worked well with Linux out of the box, and I have to admit I've had around the same number of issues as I did in my Windows days.


An estimated 1-3% of the developers I've worked with run macOS. Maybe it is a local or SV thing, but most I know prefer the freedom. In the industry I work in, there aren't many tools available for Apple ecosystems.

edit: https://insights.stackoverflow.com/survey/2018/

It is quite high, so to correct myself: some developers have a Mac for iOS development. Most people I know hate it.


IBM has nearly all of its developers on Macs at this point. I came from using exclusively Linux for 10+ years and have decided to switch my personal machine to a Mac and my phone to an iPhone because it's been so good.

Mac has everything I liked about Linux but with zero cognitive overhead. With brew and a window manager (amethyst) I'm not missing any features I enjoyed in Linux.


I hope that changes with the acquisition of Red Hat. I think they did it to help Red Hat, but they also wanted to buy the community of developers in my opinion.


Yet despite all the wailing and gnashing of teeth - both in the article and in the comments here in the various threads - people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges.

These companies have repeatedly shown that they don't respect people's privacy and people have resoundingly responded that neither do they.

I would suggest that the users here are probably some of the best suited in the world to help smooth those rough edges and even surpass the features and usability of the privacy-respecting alternatives. I just hope we collectively notice this before it's too late.


I'm not. Since the last few iterations of Apple hardware and OS software, I've decided to switch to something else. I'm also advising all friends and colleagues to do the same. There is simply no reason to stay with Apple anymore as a professional. It used to be stability and ease of use, but Windows has since closed that gap, and Linux is a close second (and a must for programming).

Most likely that means I'll run a Linux laptop for programming, and have a partition for Windows on a gaming computer, cuz I love me some games. I know Steam is gradually approaching Linux, but most AAA games still run best under Win. Plus it's an easy platform (outside OS X) to run multimedia production tools. Sorry, I just don't think Gimp or other Linux tools for multimedia production are good enough yet, but the hope remains that they will someday compete with giants like Adobe. For real. And not just as free but woefully inadequate alternatives. Please correct me if I'm wrong here, though. I'd love to be surprised, if you are able!


I haven't used Windows in long time, but it feels like "the gap" was closed by Apple, by letting their system degrade. The high end 2019 MacBookPro is in many regards less reliable for music production than my old 2012 MacBookAir, which is really, really sad. Unless Apple gets their act together my next music production setup will be Windows-based (or maybe I'll even try Bitwig on Linux).


> “the gap” was closed by Apple, by letting their system degrade.

100% agreed. And I would even go so far as to say that Steve Jobs had such a strong vision of how things could be, that it was his greatest mistake to put someone in charge who did not (Tim Cook).


> it feels like "the gap" was closed by Apple, by letting their system degrade.

While there is some merit to this claim, I don't agree with it entirely. In my experience using these systems side-by-side for over 10 years now, it's pretty obvious that Microsoft has stabilized while "stealing" Apple's good parts, and that Apple hasn't really evolved any further than they got in about 2010. In essence, Apple has allowed Microsoft to close the gap, although by about 2005 Apple managed to become so high quality on the computer market, and so ahead of its time (by "stealing" the good parts from Linux) that it's hard for me to say how they could possibly keep that position for long. (I'm not saying that they literally stole things. Obviously they've done a lot of innovation themselves, but there are also some pretty obvious signs of "inspiration" going on too.)

There have for sure been downgrades at Apple too, but not really in regards to pure stability. In my experience most of it started in the 2010s, when Apple decided to turn away from the pro market to focus more on the trend-consumer market instead. At first it happened slowly, with a few technical changes that put off web designers. However, the first major blow was against Final Cut Pro. I guess it turned out well in the end, but in the beginning they stumbled badly and gave away a large market share to Adobe as media pros started fleeing from Apple. Soon after came system changes that made Adobe crash more often (I'm sure it was just a coincidence...), only resulting in customers fleeing from Apple as a whole. Today Apple Macs are decidedly more of a tightly controlled trend brand than the prosumer brand they used to be. (Kind of like Hasselblad has gone from the photo engineer's brand to an almost purely luxury brand by now.) (Also I really miss the MagSafe lol.)


Apple still is best for audio and animation applications and there are synergies in certain industries like game development.

But apart from that it is a bad deal.

Windows degraded itself with its bad store, telemetry and PaaS dream, but yes, as soon as they could Apple developed in the same direction. Microsoft suffers from bad decisions on some management level.


Gravit designer, Blender and Bitwig studio would come to mind as tools for multimedia production that are as good as anything on the other platforms.


Also Krita is improving pretty much constantly to become the "Blender of drawing".


Great tips! Thank you! I'm more into 2D compositing for video, but I've had my eye on Blender for some time. Not sure how it stacks up against stuff like After Effects. For audio production I already use Reaper FM, but I might give Bitwig a go too. ^^


Not open source; however, supposedly DaVinci Resolve is supported for Linux: https://www.blackmagicdesign.com/products/davinciresolve/. For most people the free variant should be more than enough.


I don't even bother dual-boot anymore and just use WSL. Windows has basically become what OS X used to be except better because it's actually Windows and actually Linux. It just goes back to Apple not caring one iota about developers. Focusing on developers used to be Microsoft's strong suit and they've positioned themselves well for that again.


You pointed out a core problem here, and a common blindspot with the HN crowd:

> continuing to use Apple/Google/Microsoft

That is probably 98%+ of the devices that most typical people use for everyday computing, across phone, laptop, and desktop. Whether it is an Android device, an iPhone, a Dell laptop or a new M1 MacBook, you are beholden to a corporation that doesn't care about privacy. I would argue that Apple cares a bit more, and I really hope they respond to the most recent shit show with some changes, but I'm not holding my breath.

And before the "have you tried this *nix distro??" comments appear: they are still too hard to set up and nowhere near as user-friendly for most people on an everyday basis. And hell, if you set up Chrome and GSuite as your default tools of choice, as many do (or are forced to because of their job), how much are you really breaking free anyway?

I don't know the solution, but it's clear that in the war between privacy and convenience and usability, privacy is on the ropes, and it's the last round of the fight.


You are totally right that Linux is only a single digit percentage in user share if you look at all computers deployed world-wide.

However, in the developer sphere, GNU/Linux is far better represented. In the Stack Overflow 2020 survey, 26% of respondents said they were using a Linux-based OS [0]. This number has been gradually increasing over the years, so further growth is to be expected. It's less than one percentage point below the macOS share of 27%, but I'm not sure how much noise there is.

So Linux is usable enough for a growing community of developers, and many laptop/computer vendors targeting the developer market offer GNU/Linux as a preinstalled option.

As a result, at least for software people, Linux is a real privacy-preserving alternative on the desktop. HN is a software forum, and the blog post was also written with software-literate people as its audience. So I think the post you are replying to should be understood not in a global context but in the context of the IT crowd.

Of course, it's not great that Linux is less of an alternative for people who need to use complicated niche ISV software, but generally any industry can make the move if there is a subset of users pushing towards GNU/Linux.

[0]: https://insights.stackoverflow.com/survey/2020#technology-de...


Every company I've worked at for the last 10 years has developed exclusively on macOS.

Linux is fine, but it doesn't just work, and it needs to in order to gain a foothold not just in the developer community but with the general public -- that's the lesson Android has taught us. Until it does it will not budge the numbers. It just won't.

The real apologism here comes from the Linux community, who believes that by virtue of existing, and not being an Apple or Microsoft project, it should be widely accepted in spite of being a distant third in usability. Linux needs to step up its game or it will forever remain a distant third. The general public does not care about "free and open source" -- heck they barely care about privacy at all. That's just us, the HN crowd.

I love Linux in theory, I use it a lot, but my daily driver is macOS. Frankly it likely will remain so even as I transition to being an Android engineer in the coming months.


Have you really used a Linux desktop recently? Sure, it might have some problems, but Fedora for example is in no way "a distant third in usability".

I haven't used MacOS for a couple of years but I can say for certain that Linux is not that far behind Windows and sometimes even ahead. I think the main problem is lack of application support but specifically for developers most applications are already there.


I use Ubuntu on an officially supported Dell enterprisey machine (E7400) and the experience is noticeably worse than on my Mac.

There are no multitouch gestures, there's screen tearing, 4K video playback on a 4K screen is choppy, sleep and wake up both take a long time, deep sleep doesn't work properly and I HAD TO DOWNGRADE MY KERNEL because of the stupid "Resetting rcs0 for hang on rcs0" bug that would freeze the machine for five solid seconds every minute.


I use Fedora.

I work on dotnet. I've not been able to run my Azure Functions on Linux at all. Even on Windows, I have to start the Azure Storage Emulator directly and then start func (using the full path). On Linux, Azurite is not the same thing as the Azure Storage Emulator.

So I must use Windows for dotnet development, even after four years of .NET Core. That being said, Linux isn't the problem. Gnome 3 isn't the problem either. Flatpak works great.

There are a few regressions once in a while though. For example, I've been unable to use the built-in mic on my Lenovo Flex 14 (Ryzen 3500U) since I upgraded to Fedora 33. Looks like the issue will be fixed in kernel 5.9, but for the moment I'm on 5.8. Other than that it has been great. If I could use it for work, that'd be great.


> Have you really used a Linux desktop recently?

Someone has to say this literally any time someone mentions problems with Linux Desktop, along with "it works for me", "my grandma uses it!", "you just have to buy the right hardware", and "you're using the wrong distro". It is incredibly tiring.


It's a reply to the tiring cognitive dissonance of posts like GP that basically broadcast that 'oh, I would simply love to support open software and general-purpose computing instead of locked-down proprietary platforms, but alas, Linux is simply unusable you see, so I have no choice!'


Yet rather than take them at their word that Linux is unsuitable for their use case, or seek to remedy those problems, people insist that it must be because of one or more of the reasons I listed above. Alternatively, they attempt to change the user's use case to one that fits Linux better.

It's extremely tiring to watch this happen for as long as I have been watching it happen.


They didn't say Linux is unsuitable for a particular use case or give any meaningful criticism or definition of a use case, they just said that it 'just doesn't work'. Is it any wonder people call out that tripe.


'just doesn't work' and 'doesn't just work' are two very different statements. You said the former, but I've only spotted people saying the latter in here.

Also if you have to argue with someone that linux does work for them, you've missed the point.


Ah yes you're right, I did misread that.


Are you actually seriously of the belief that Linux ‘just works’ for most use cases or people, and there is a small minority for whom it doesn’t?


The implication being that it doesn't just work for them. They know that, that's why they asked if they'd personally tried Linux lately. Nobody cares if it "just works" for your grandma or whatever, because that doesn't help them at all.


Linux is hardly unusable, it's just a worse experience, and for something I spend 10 hours a day on and earn my living with, I want the best experience.


For macOS, you also have to buy the right hardware. You cannot throw it at random $300 Acer garbage either.


You probably could if they'd let you, heh; it's really well optimized, and works great on 5-year-old Apple hardware with much worse specs than a $300 modern Acer. This is especially true as low-end cheap devices tend to be much more "standard" -- no exotic hardware.


It would either not even boot due to a buggy UEFI implementation, or the machine would be cheaping out on some hardware for which there is no driver for anything except the OS release it came with.

In some respects, the hardware in high-end-ish laptops has moved nowhere in the last few years. Yeah, they've got slightly better GPUs to handle 4K, Vulkan/SPIR-V and new media codecs, and faster SSDs once they got rid of SATA, but CPU performance has moved just by percents and it comes with the same memory. My wife still uses a 2012 i7 ThinkPad (T430s) with 16 GB RAM, and there are not many cheap machines that could objectively be called better. Certainly not in the $300 range. Heck, until Ice Lake, you couldn't purchase a laptop with more RAM unless you went into the luggable workstation segment, so it is no wonder that macOS works relatively OK on 5-year-old machines. They are still way better than a $300 Acer, despite that Acer possibly having better paper specs.


How good the specs are doesn't matter for most driver bugs. Instead, it's important which components you use. And even if cheaper devices use "standard" hardware, they still use a much larger variety than the few models that Apple releases.


I use Fedora on a Thinkpad on a daily basis and it sucks compared to macOS IMHO.


I use both, and they both have their strong and weak points. Purely system-UI-wise, I prefer Gnome Shell to Finder. Upgrading to Big Sur was a shock, the aesthetic went downhill.


> Every company I've worked at for the last 10 years has developed exclusively on macOS.

There are probably also plenty of companies which exclusively develop on Windows. You can have an entire career in the embedded industry, target Linux all the time, but never use it for development. Development is not all Linux based, I never said that. What I quoted were percentages.

> it should be widely accepted in spite of being a distant third in usability.

As you shared your anecdote, let me share mine. Recently I had to set up a new networked printer on both Windows and Linux. On Linux it worked immediately; on Windows it didn't, and required installation of a manual driver. That installation was partially botched and removal didn't work. Only after a few attempts did it work, at which point I had spent an hour on the problem. It's the same old crap that you had with Win 95, but now on Windows 10, while Linux has a proper IPP driver and no need to install outside software. Maybe Windows has an IPP driver too, no idea, but it didn't work, which is what matters :).
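
As a rough illustration, here is a minimal sketch (Python, using the pycups bindings, which you may need to install) of asking the local CUPS daemon - the component behind driverless IPP printing on Linux - what it already knows about your printers:

  import cups  # pycups bindings for libcups (pip install pycups)

  conn = cups.Connection()  # talks to the local cupsd
  for name, attrs in conn.getPrinters().items():
      # device-uri shows how the queue reaches the printer (e.g. ipp://...)
      print(name, "->", attrs.get("device-uri"),
            "|", attrs.get("printer-state-message", ""))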

Also, there are definitely distros that are more usable and ones which are less usable. I think there is a true core in the meme of people being lured to linux by "use linux it's user friendly", then being told that Arch isn't that much harder anyways, and then ending up using one of the expert distros which often break in subtle and hard to debug ways.

But yes, in many ways Linux could do better usability-wise. For example, it's a nightmare to target GUI Linux and then expect your binaries to work for years. It's almost easier to just ship Windows software and then use Wine. Nvidia also has shitty drivers, and in general, the driver situation isn't perfect. But things are improving, I think, and the gap is shrinking.

Also note that there is an effect: when more people use a piece of software, there is a larger market for people who make a living improving that software, via support contracts, consultancy contracts, etc. So once desktop Linux gets adopted widely, it might improve in many ways from the current state, simply due to ecosystem size.


+1 for the printer story. Linux was plug and play. On windows... Uuuugh.


macOS uses CUPS :)


I wonder whether it'll continue using it. I remember some drama about the maintainer leaving Apple and then forking it.

https://news.ycombinator.com/item?id=24831162

Anyways, the claim was that Windows and Mac are way better than Linux and it doesn't even compare. My anecdote shows that it's not always the case :).


So I jumped to Linux two years ago. Since then, I've installed Linux Mint on five different machines, old and new. I just put in the live DVD. It boots. I launch the installer. End of story. I honestly never ran into a problem. Not going back to Windows anytime soon. Don't see what's so great about macOS either.


This could have been written by an Apple user circa 2000, when the entire world ran on Windows.

I switched from Mac to Linux, and it's been fine, partly because I used a laptop that has Linux installed by the manufacturer. If you only ever used Hackintoshes you'd think MacOS was a mess, too.


Even better : Switching to Linux on a Mac. So you get a blazing fast Linux experience on a very sturdy long-lasting laptop. I did just that. I love it.


Dude. Installing Linux on a Mac laptop is a nightmare. What model/year did you use? Do external displays work reliably? Does your web cam work?


Long-lasting? Sturdy? You mean the ones tuned by Apple to maintain silence by overheating?


The number of aluminium balls (battery swelling) that used to be Apple laptops I've seen recently makes me a bit worried.


Let me counter with my own anecdote.

Every researcher I've met in the last 10 years needs some version of Linux to get their job done effectively, largely because tools like Python/NumPy have replaced tools like Matlab that stagnated (due to years of monopoly), and they are much easier to set up and use on Linux.

Mind you, I am not talking about just computer scientists. Biology, physics, maths: all use Linux-based setups.


No company (or academic institution) I've ever worked for developed _anything_ on MacOS. That's a US-only phenomenon, I think.

The thing about Android is that it was essentially foisted on people, not chosen because it just works. Also, many Linux distributions today "just work" on typical hardware, almost as well as Android on your phone. And if you were using a hardware config chosen by someone else, with a Linux-based distribution pre-installed and pre-configured for that hardware, then Linux very much "just works".

It is true, though, that this is not enough to expand the user base far enough. That needs a lot of coordinated, collective, public activity.


> Every company I've worked at for the last 10 years has developed exclusively on macOS.

Counter-anecdote: I have had a Linux computer on my desktop for over the last twenty years, have used Linux primarily for maybe the last eleven, and exclusively for the last eight.

I do not get the near-religious love for Apple products. Pre-OS X, they really were wonderful, best in class without a doubt. I kinda get the attraction back in the early 2000s, when Linux could still be a chore and Macs were a decent choice to get a computer which had a shell and ran real software. But now? They are just not for me. I want to own my own computer, write my own software and control my own destiny.


You can’t control your destiny when it’s based on millions of lines of other people’s code, billion transistor hardware designs that you have never seen, fabricated by distant mega factories using closely guarded secret processes.

You may want some idea of controlling your own destiny, but it’s a mirage.


Rome wasn't built in a single day either. If you want to eventually get to your destination, you must do the first step.


That is a sentiment that I partially agree with. I frequently argue that iOS (for example) represents more than a decade of investment, so we can’t expect an alternative to pop up overnight.

However using a Linux desktop simply is not a step in any particular direction. It is especially meaningless for non-developers to do if one of the other platforms just works for them.

Steps in the right direction involve developing software to solve the problems that the corporate operating systems have in fact solved.


It works both ways: Windows (or Mac) is meaningless to any user who can be served by Linux.

The first step in solving the problems that corporate OSes have solved is identifying them. For example, one of the problems is stable/versioned/sandboxable/distro-agnostic ABIs, and that is something that is being worked on; the results are quite usable nowadays. The related problem is that corporate software ignores them (how many conferencing solutions support PipeWire for desktop sharing? Or Wayland in Chrome? Heck, VA-API in Chrome? Exactly... but whatever Apple comes up with, they support within weeks).


It’s easy to know what Apple is going to support and for how long, and how much warning you’ll get when things are deprecated.

How would I even begin to know the answers to these questions when it comes to the Linux technologies you are naming?

Based on what I read here, the community of people who do use them isn’t even sure.


Is it? For a few things, yes (OpenGL or 32-bit deprecation); other things are simply not found in the new releases (Kerberos Ticket Viewer, TWAIN). Then there's a middle ground, where there is a window of a few months to make significant architectural changes to your product (see VPNs and personal firewalls).

So excuse me, I'm going to look for a new release of the Cisco VPN that works with Big Sur. Of course, it is the customer's VPN, and they don't care that their vendors might use a Mac, so no help from them; it is my problem now.


Sounds like don’t buy Cisco.

But more importantly - you verified my assertion.

Even if a small number of security related technologies have a smaller deprecation window, the others have very long ones.

More importantly - we can even find out what they are!

With Linux, can you even tell me whether Cairo, the graphics library behind GTK, for which there is no alternative, is even still supported?


> Even if a small number of security related technologies have a smaller deprecation window, the others have very long ones.

It uses a mechanism that was introduced in Mojave, so that deprecation window was very short. Definitely shorter than Cairo's, which already has more than a decade and a half under its belt.

There are alternatives to Cairo, but not drop-in ones; they would require porting (Skia, for example). Other than that, Cairo is basically "done". Yeah, there is an occasional need for a release with small fixes, but as a software-based rendering library it is not going anywhere. Not that announcing deprecation is otherwise widely respected; the Xorg server was declared obsolete years ago, the last major release was in 2018, but hey, some still think there's a future and in a few years are going to be surprised by "sudden abandonment".


What are you talking about? Mojave is 2 years old. That isn't a particularly small window. Even with the 2-year window on macOS, an alternative was delivered.

But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.

You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.

Is it done? How do you know? Does it need to be developed further? It sounds like it does, since people say it's very slow compared to Skia.

It seems like you think you know what the status of things are, but other people don’t. How can that be?

This is the reality of Linux.


> What are you talking about? Mojave is 2 years old. That isn't a particularly small window. Even with the 2-year window on macOS, an alternative was delivered.

2 years/2 releases is a small window. An alternative was delivered, but it is not a drop-in one either. Just like Skia vs. Cairo, it requires porting/rearchitecting and might not have 100% coverage of what the old solution did. You might have noticed that in Big Sur there is a list of system apps that ignore VPNs and filtering. That's a problem.

> But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.

Because it works differently. There is no dictator (a vendor controlling access) telling you what you are going to use whether you like it or not, and presenting it as a done thing. Instead, it works by forming a consensus (not 100%, mind you) on whether something is needed or not. The community around a piece of the stack might arrive at the conclusion that a given piece is no longer maintainable and make something new (e.g. the Xorg server vs. Wayland compositors, or systemd vs. sysv-init), and then it will bubble through as distributions adopt it - either because the new solution is maintained and the old one is not, or because the new one solves their problems better than the old one.

> You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.

Supported by whom? In this case, the original maintainer is no longer around. But that's not a problem; distributions will support whatever they ship, so if you have an application that uses it, it will continue to work.

Should you use it for a new project? No. But that has been true for years. Cairo is a 2000-era software library. Consider it the equivalent of QuickDraw; you would not use that for new software either. GTK (which was the subject of an HN thread recently) uses it for software fallback (and they've made it part of their API, so they've locked themselves in; changing it would mean breaking their ABI. On the upside, that means there will always be some support). Firefox abandoned Cairo years ago in favor of Skia, and so did LibreOffice in its latest release.

So it shows that a library or piece of the stack might be deprecated for years, but there might be someone somewhere who really needs it and is willing to keep it on life support (exactly like GTK needs Cairo). So the question for you, as someone who requires support, is: what level of support do you require, and what are you willing to pay (not meant in a purely monetary sense; though when users are expected to pay for continued support, what's wrong with expecting that at a different layer of the stack?) to get it? Because you can always get it supported, or support it yourself if you really need to. No one is going to kick it out from underneath you. The source will still be available, and it is always possible to fix it when needed. You will never be completely denied access, as is the case in proprietary stacks.

> It sounds like it does since people say it’s very slow compared to skia.

It is going to stay slow (unless someone invests heavily into it, which is not probable). It is a software library, deliberately; if you want hardware acceleration, switch to Skia and accept the tradeoffs that come with hardware acceleration.
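
To make the "software library, deliberately" point concrete, a minimal sketch (Python, using the pycairo bindings) that renders entirely on the CPU into a memory buffer, with no GPU or display server involved:

  import cairo  # pycairo bindings for the Cairo library (pip install pycairo)

  # An image surface is just a block of memory; all drawing is CPU-side.
  surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 200, 100)
  ctx = cairo.Context(surface)
  ctx.set_source_rgb(0.2, 0.4, 0.8)  # flat blue
  ctx.rectangle(20, 20, 160, 60)
  ctx.fill()
  surface.write_to_png("rect.png")   # works headless, no X/Wayland needed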

> It seems like you think you know what the status of things are, but other people don’t. How can that be?

Simple. I asked.

There are many venues where you can ask. These people will tell you. However, due to there being no dictator, as mentioned above, their answer to roadmap-like questions might be: "depends on community adoption".


If the answer is "don't adopt Cairo for new projects" there is a problem, since Skia has poor documentation and an unstable API.

And so if the answer is always ‘it depends’, you are making my point for me.


In Cairo's specific case, it is "don't adopt Cairo for new projects".

Others may have a specific reason why they are still using it (like the GTK example). Your new project doesn't.


If Cairo is not to be adopted, then Linux doesn’t have a good option right now.


In the previous comments, I've pointed out an alternative several times, including what other applications use. So yes, Linux does have a good option right now.


> Linux is fine, but it doesn't just work, and it needs to in order to gain a foothold not just in the developer community but with the general public

I understand the general public, but the developer community? Why can't the "professional computer whisperer" demographic be arsed to set up their computers?

In all seriousness: The extent to which you have to manually configure / fix stuff in Linux is seriously overstated, especially w/ the "easy" distributions like Ubuntu and Mint. I've installed both of those on other people's laptops and they've had no complaints at all.


I think Ubuntu 20.10 with Gnome 3.38 is a viable alternative to macOS, for developers.


I'd say Mint with Cinnamon, since you can install the 20.10 Kernel from the Update Manager.


You can install the HWE kernel on LTS too. It might not be a good idea for a normal user if it's not necessary.


The general public in Europe cares enough about privacy to pass the GDPR.

Choosing examples from the UK so people can search for them in English, they care when their healthcare data isn't secured, when dating apps leak information, when local councils abuse the rules to "snoop" on people.

If you explain that without a blocking extension (or Firefox's built-in one now?) most pages they visit are sent to Google, and Google maintains a detailed profile, they will ask how to install the extension.

You can say the common man doesn't care about free speech, but enough do care that we maintain it in Western countries.


> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.

Uh, no.

There are many reasons why people don't use Linux. The ease-of-setup-and-use lines are pure nonsense. Many Linux distributions have been easier to set up and use since at least the introduction of Ubuntu, and perhaps earlier. Just pop in the live installation media, try it out for a bit to ensure your hardware is compatible, then run an installation program that asks questions that make sense to people rather than marketers. Getting the applications you need installed has involved using a store-like interface for longer than software stores have existed on commercial desktop operating systems.

So why is it perceived as harder? One reason: users typically have to install their operating system, while end users rarely see the process on commercial operating systems. Second reason: those who do install their own OS typically dedicate their system to commercial operating systems, yet use dual-boot for Linux (which will add technical steps). Third reason: those making the transition may try to install commercial applications via a VM or WINE due to compatibility, familiarity, or (periodically) the lack of an open source alternative. In other words, it is perceived as harder because people make it harder.

As someone who has been using Linux for decades, I find the setup and user friendliness of something like Windows far inferior to Linux. Yes, part of that is due to familiarity. On the other hand, some of it is inherent due to there being fewer restrictions with open source software.


> As someone who has been using Linux for decades, I find the setup and user friendliness of something like Windows far inferior to Linux. Yes, part of that is due to familiarity. On the other hand, some of it is inherent due to there being fewer restrictions with open source software.

Non-tech people don't want (or don't know how) to "setup". The most user-friendly setup won't ever beat "no setup" (e.g., macOS).

Besides, marketing plays a huge role as well. Ask people to name a few "computer" brands: Apple, Microsoft, Google. No one would name "Linux". So, it's not just that people should be able to buy Linux computers with no setup (Dell is selling them, I think); it's also that these kinds of computers should get enough marketing so people know about them.


> Non-tech people don't want (or don't know how) to "setup". The most user-friendly setup won't ever beat "no setup" (e.g., macOS).

I don't like this line of thinking because it is condescending. More importantly though, there are plenty of competent technical users like myself that would love to be using an open operating system but are fed up with dealing with the systemic problems Linux Desktop has that its developers and community keep papering over and pretending don't exist.


I'm going to let you in on a secret: most of the people who advocate a particular operating system are pretending that systematic problems don't exist.

Strictly speaking, Windows is a usability nightmare. Most people don't notice simply because most people learn and use a tiny subset of what is there. Beyond that, there is an entire industry to support Windows (which is part of the reason why people like it). Apple isn't much better. They tend to paper things over by simply dropping support. An example was brought up by another commenter when they mentioned that some *nix programs use raster fonts. Strictly speaking, that can happen under Linux yet not macOS, since Apple dropped support for legacy software while some *nix software is decades old. Linux will have its own issues, but I'm not the best judge of that since it is my operating system of choice.

At the end of the day, any operating system will be a compromise of some form or another. Which you choose will depend upon what your wants and needs are.


I don't disagree, Windows especially has been getting a lot worse as it embraces modern development and the users-are-cattle mindset that comes with it.

My problem isn't so much that there are tradeoffs, or even that those tradeoffs are not ones I want to make, it's that there are people out there who seem to insist that Linux Desktop is the one and only proper choice and everyone who doesn't choose it is either stupid or misinformed. If Linux Desktop people were more willing to listen to why people don't use it and take criticism to heart instead of as some kind of personal attack, it might have evolved into something more people would actually want to use.


> My problem isn't so much that there are tradeoffs, or even that those tradeoffs are not ones I want to make,

That's fine, and I agree that advocates sometimes detract from progress.


It's not condescending, it's realistic. It's how I (as a tech person) approach most non-tech things in life. I rent because I don't want to deal with maintaining a house. I use public transit because I don't want to deal with maintaining a car. And so on.

I may be an extreme example in some ways, but everyone only has so many fucks to give, and most people have legitimately exhausted their budget by the time they get to tech policy.


Extending your analogy, I always feel like when I set up my Linux workflow (something I attempt and abandon at least once a year) it feels like someone dropped a car off in my driveway with the engine and transmission on the sidewalk and no instructions on how to assemble. Installing the "engine" always requires hours of browsing through comment threads written by random people on the internet that are years old.

This year is different though! I'm going to make it work. Also looking at Pinephone for my mobile.


I think another thing that contributes to that last problem about "computer" brands is the lack of knowledge of the distinction between operating system manufacturer and hardware manufacturer - I just moved recently, and in the process of getting to know more people and accidentally becoming their tech support, I was surprised to see how many people didn't have that basic bit of knowledge. When pointing at a chromebook, people say "Google" without questioning why "lenovo" is also written on it.


They can buy a PC with GNU/Linux preinstalled then. If I bought a Mac and tried to install Windows there, it would be a shitshow too.


> They can buy a PC with GNU/Linux preinstalled then.

That's the problem: non-tech people don't know they can do that. Besides, why would they buy a PC with "GNU/Linux" preinstalled if they don't know what "GNU/Linux" means to begin with? Almost everyone out there knows what "Mac" or "Windows" means, and so they buy Apple stuff. Again, marketing.


It would actually work rather well; that's the whole point of Boot Camp.


Interesting. :)


Fifth reason: many, many laptops have at least some piece of hardware or behaviour that requires fiddling to get to work on major Linux systems. Sometimes it's power management, sometimes it's running with multiple monitors or 3D acceleration or the touchscreen, sometimes it's getting the touchpad to work properly or the fingerprint reader... In my experience, there's usually something.


While this may be true, we should still be impressed by the number of laptops Linux operates flawlessly on. Very few people would expect Windows to run properly on a MacBook and far fewer would expect macOS to run on a generic PC. Yet people expect Linux to work flawlessly on almost anything thrown at it. From my experience, it does a very good job of this (at least in recent years).


From a technical perspective it's a marvel, but as a user, "works flawlessly on the device you have," beats, "mostly works, but with daily annoyances, on your device and many others."


I started to reply earlier in the thread and then decided not to, but basically what I wanted to say was this.

I have used Linux as a daily driver desktop for years, but I don't just buy random hardware and expect it to work. I do some research first and make sure what I buy works well enough with Linux to meet my needs.


This is actually an excellent point that rarely sees the light of day.


> In other words, it is perceived as harder because people make it harder.

If by "people" you mean "Linux Desktop developers". They have taken relatively simple things and abstracted them two or three times until they are so complex and fragile they need special programs and standards to automagically manage it all for the user. As soon as you step outside the box of what they expect (hey, why can't I install a program on a different disk than my OS?) it is revealed for the Goldbergesq garbage pile it is.


> hey, why can't I install a program on a different disk than my OS?

Apple user here (had Linux on the desktop for 5+ years and windows for a longer time):

Why would I care whether my program is on one disk or another? Why would I even "install" something, instead of dragging/dropping a single "file" (yes, I know it's a directory) like I usually do? Do Linux applications all self-update using Sparkle like Mac applications do?

I also love Apple Music, where I can listen to everything on all my devices; can I have that in Linux?

Does Linux have the same beautiful font rendering that I get in OSX, or do I have ugly non-antialiased bitmap fonts here and there?

I also haven't used installation media for operating-system installation in many, many years; everything just works via the internet. I input my Apple ID on a new Mac and everything syncs perfectly without any issue. Do I get that in Linux? One login, and my mail, notes, calendar, photos, music, files, backups, and keychain sync across all devices?

And also simpler UI questions: Can I right-click on a file/folder and have it say "Compress to zip"? I can have that in Windows; can I have it in Linux? Can I easily create an encrypted bundle and mount it in the UI? Can I drag the little icon on top of any open window into any file dialog to rapidly access that file?

Can I easily configure keyboard shortcuts system-wide?

Can I have a photos-app that is as good as Apple Photos?

I think OSX is great. It feels more and more locked down, that is true, but I haven't had any real issue with this yet. I still can develop whatever I want and run it if I feel like it.


Linux is not a single thing. There are many different distributions and combinations of software you can have on any of them. You get what you choose.

There are like 7+ different file managers that you can install and use. One of them probably has compress to zip. Some of them are much more powerful than finder in OS X. I use `mc` and I don't even need to click. I just select a folder <F2> <Enter>, and the folder is compressed. Two pane file managers are great. I love them ever since MS-DOS times.

The fact that I can use bitmap fonts is the major benefit of Linux to me, because I don't use hidpi screens. I can have small fonts that don't suck to look at, and are perfectly crisp. I like that I have that choice.

All of those automatic sync things are an anti-feature to me. I was given an iPhone for webdev testing, and was snapping an odd photo or two with it from time to time, including photos of my ID documents, without realizing that it by default uploaded everything into some stupid cloud. Great, now my state ID is with Apple.

The few times I had to use Mac OS for development have had me running the other way ever since. Why would I search online and download files to drag and drop around to "install" them, and sometimes click through next-next-next-finish dialogs like on Windows, when I can just type a list of programs I want installed and they get downloaded and installed for me with no fuss, while I do something more meaningful in the meantime? I can install 10 different programs I need for some project in a single command, and it's such a time saver.
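For example, on a Debian/Ubuntu-style system (the package names are just an illustration):

    # one command, ten programs; go do something else while it runs
    sudo apt install git curl vim tmux ffmpeg imagemagick sqlite3 \
                     build-essential pkg-config libcairo2-dev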

To some people Mac features may be great, to some they suck horribly.


> Why would I care whether my program is on one disk or another? Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do.

The answer to your second question is very similar to the first one. Why should the user drag anything anywhere, and decide which folder to put applications in? It is much easier to click the install button in a store-like application.

Many users are confused by DMGs, and they keep launching apps straight from their ~/Downloads folder.

> Do Linux applications all self-update using Sparkle like Mac-applications do?

Linux applications were auto-updated by their respective package managers before Sparkle was a thing. I consider keeping a Linux system updated to be much easier than keeping a Mac one updated; even if it is updated by multiple mechanisms underneath (apt/dnf, ostree, flatpak, fwupd), from the user's POV it is unified in the form of Gnome Software or KDE Discover. On a Mac, you have the Apple App Store, Sparkle, brew, Microsoft Update, Adobe Update, the Eclipse updater, and a myriad of other mechanisms specific to each app.

> I input my apple-id on a new mac and everything sync perfectly without any issue. Do I get that in linux? One login and my mail, notes, calendar, photos, music, files, backups, keychain sync across all devices?

I don't get that on a Mac. But then, I'm not locked into their products.

> And also simpler UI-questions: Can I rightclick on a file/folder and it says "Compress to zip?"

Yes, you can. In Gnome (default for Ubuntu, Fedora), you can right click a folder and there is a "Compress..." menu item. It will give you a choice of .zip, .tar.xz, .7z. I'm sure KDE has something similar.

> Can I easily configure keyboard shortcuts system-wide?

Easily? There's no system that does that. Some are more flexible but more difficult to configure, and vice versa. macOS goes into the less flexible camp; for example, I've never managed to get the media keys to control VLC instead of launching iTunes, or Apple Music nowadays.

> Can I have a photos-app that is as good as Apple Photos?

This is about the first time I've seen 'good' and 'Apple Photos' used in the same sentence.


> Why would I care whether my program is on one disk or another?

A lot of reasons. Maybe your OS disk is on a ludicrously fast SSD but is space-limited and you don't want to waste that space on applications that don't need it.

> Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do.

Agree wholeheartedly. It makes perfect intuitive sense: the application is exactly where I think it is, and if I move it somewhere else then it is exactly there, and if I delete it then it is exactly gone.


I got fed up with Win10 last year and finally switched to a Mint distro. I run a software development company that works on the MS stack. I stuck with Mint for six months and finally switched back to Windows.

I didn't find it particularly difficult to set up. Some things are much easier (TAP drivers for connecting to multiple VPNs and running side by side remote sessions to difficult workstations across different networks, for instance). Even setting up a Windows virtual box was no biggie.

There are two reasons I switched back: 1) lack of consistency across applications hampers productivity, and 2) lack of stability in the applications/tooling hampers productivity.

I gave myself 6 months to let myself get used to it, and there were a great many things I loved about it, but in the end it was less of a hurdle to deal with Win10's oddities than to deal with the various inconsistencies and "usage maintenance" of keeping a suitable, productive workspace across Linux (and yes, I realize a large part of this was due to the fact I essentially run an MS-development shop and therefore have to work with MS tooling, but on the other hand, everyone working in any end-user business scenario has to work with Office documents, etc.)


There's installation and then there's setup. I would agree with you that installation on linux is easy these days, but setup... no.

I have tried moving over to linux several times throughout the years, but setup was always a huge issue. Installation was easy, but things like trying to get my aging peripherals to work properly, and trying to get font rendering not to look like absolute ass were always a huge hindrance.


> Just pop in the live installation media

So... way harder than Mac OS.

In fact, that’s literally impossible on my computer.


USB thumb drive is still "live installation media"
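For what it's worth, writing the installer image to the drive is a single (careful) command on Linux; `/dev/sdX` is a placeholder for your thumb drive, and dd will happily overwrite whatever you point it at, so verify the device with `lsblk` first:

    # double-check the target device with: lsblk
    sudo dd if=ubuntu-20.10-desktop-amd64.iso of=/dev/sdX bs=4M status=progress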


Now don’t I feel sheepish!

But seriously, I guess it is possible to use a USB thumb drive on my Mac with only Thunderbolt ports, but I never have and it’s never occurred to me. I suppose I still have one somewhere and maybe a dongle that would let me do it.


which is still a pain in the butt because you have to own, find, back up, and completely wipe a thumb drive to do it. Or you have to go to the store, because who uses thumb drives anymore?


You don't have a USB port of some kind?


MacBooks from 2019 have 4 Thunderbolt 3 ports and 1 mini-jack. And that's all.

If you want to power the device, you do it through Thunderbolt. If you want to connect a USB device that is not USB-C, you have to use an adapter and/or USB hub. So you would have to connect your USB drive through an adapter/hub, and if for some reason the installer didn't support this you would be screwed (but I guess most of the time it would be doable, just inconvenient).


> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.

This isn't true. This perception continues to hurt the Linux community along with others. Use a properly supported -buntu distro and you are good.


>Use a properly supported -buntu distro and you are good.

Honestly it sounds insane, but I've started recommending Manjaro KDE. Stuff breaking due to the rolling release seems to happen less often than newcomers to Linux running into the other stuff they're unfamiliar with.

Like adding a PPA, or struggling to find a package, since it lets you easily add the AUR, snaps, flatpaks, what have you, without a care about the competition between those. The install is painless. KDE's stuff feels intuitive and familiar to Windows users and is customisable enough that you don't have to deal with some minor personal-preference annoyances.


Personally, I gotta disagree. This is still absolutely true, even for large distributions.

Every 2 years or so, I give Linux another try on the desktop, and it always ends after a couple of weeks, because I'm just tired of fixing mouse and keyboard settings, touchpads, HI-DPI and multi-monitor settings, font rendering and video codecs.

I've tried Ubuntu, Mint and recently Manjaro and – don't get me wrong – all of them had great stuff in them and worked fast out of the box, but it's always the last little details where desktop Linux fails for me.


I have. Many times. Every time I end up debugging suspend/resume, graphic drivers, multiple monitor, etc.

I spend enough time changing config files for my job, I’m not spending my own time doing it just to get basic functionality working for workstations.


I think it’s fairly simple to avoid this by buying hardware that is proven to be effective with the OS. E.g. buy a machine with *buntu preinstalled.

Same as with a Mac, really.


Even so, I install Ubuntu on a wide range of machines, laptops included, and I can't recall having any hardware related issues since, oh, 2005 or so?

Parent must be very unlucky, or in fact not realise it is their tweaking and tuning that causes issues.


Could also be selective memory. "I can't recall having any hardware related issues since, oh, 2005 or so" --> oh, that issue? Yes, of course I had that, everybody has that, that's par for the course...


The last time I attempted it was 2017/18. A pretty plain clone PC desktop: MSI motherboard, Ryzen 1600, AMD Radeon 580, etc. I bought a couple of different WiFi dongles to make sure the chipset was supported.

Yet the problems remained. Maybe when people say they had lots of problems and it wasn't worth their time, take them at their word?


In which case we can remark that Windows and macOS are not without problems either. I do remember having hardware-support issues with those this decade.


True, installing Ubuntu is probably simpler than installing Windows; it doesn't ask you a million privacy-shirking questions.


Let's call this user-friendly.


Agreed apart from fingerprint scanners on laptops.


And LTE modems. And sleep and suspend. Oh, there is also this bug with screen locking that shows your desktop for a few seconds. Oh, and with dual graphics chipsets too.


> Oh there is also this bug with screen locking that shows your desktop for few seconds.

I definitely encountered this on my mac work laptop from three years ago. (Left that job and have been Linux-based since then.)


I've been a happy GNU/Linux user for many years, and I don't remember when I last had to do those kinds of things.

Choose a mature/stable distro and that is it. Be prepared to trade a bit of convenience for privacy in case of weird/non-free hardware/firmware.


> This isn't true. This perception continues to hurt the Linux community along with others.

The continual denial, constant evangelism, and needless condescension, which I have watched for 20 years now, is what hurts the Linux Desktop community.


>This isn't true. This perception continues to hurt the Linux community along with others. Use a properly supported -buntu distro and you are good.

And yet, invariably, people aren't good, and when they complain, they are told "use this other distro" or whatever...


Setting up Ubuntu was easy enough. But the file manager locks up regularly, switching between multiple windows of the same app with alt+tab doesn't work reliably, and I actually had to Google how to create a new folder. The stock picture viewer lacks information such as the ICC profile, and when I installed the recommended Gimp, it was a mess of windows and the save dialog was horrible.

I suffer through all of it because I need docker with gpu support, but it's honestly a lot worse than what I expected.


You can activate single-window mode in Gimp under Windows -> Single-Window Mode.

I would generally recommend Kubuntu over Ubuntu -- it seems like Gnome aims for minimalism somehow, and has thrown out a lot of functionality that used to exist, and generally tends to emulate OSX for a lot of things. (E.g., OSX doesn't do alt-tab for multiple windows of the same app either IIRC.)

BTW, Kubuntu's default image viewer would likely be Gwenview -- You can get it to show a lot of information by clicking "More..." in the Meta Information panel and selecting stuff in the resulting dialog window. Here's a screenshot I found on the nets: https://www.fossadventures.com/wp-content/uploads/2018/05/Gw...


"You can activate single-window mode in Gimp under Windows -> Single-Window Mode."

Thank you sooo much.


I have not had any of these experiences with Ubuntu. Installing an OS on any new personal computer is much more difficult than when they had DVD drives. But once installed on compatible hardware, the Gnome 3 desktop is nearly the same experience as Mac OS. Installing software is usually a matter of searching the software menu or downloading a .deb and double-clicking to run it, and the New Folder button is prominent in the file window.

The office suite is easy to use. Thunderbird is as easy to set up as Mail. Networking and devices are easy to configure from the gui, and Firefox and VSCode run without crashes. Proprietary video drivers for common hardware can be downloaded and installed via the gui. It's at least as usable from an admin standpoint as Windows 10, and from a user standpoint as Mac OS, albeit with a much smaller "app store" experience.


I like Thunar for file browsing. I would give it 6 stars out of 5.


So you weren't able to come up with the idea of right-clicking without Google?


Right clicking didn't work. The main menu also didn't offer the option.

You need to open a burger menu in the top right corner. And then the new folder button is purely an icon and it doesn't show a folder.


Literally the first item in the context menu is "New Folder Shift+Ctrl+N"


It isn't for me :( So then I guess this isn't a usability issue, but rather a problem with unreliable setup.


What the heck are you using? :D


Ubuntu 18.04.5 LTS


That should be Nautilus already and you should have the "create new folder" context menu on right click. I also can't imagine anything going wrong during setup that leaves you in that state.

One thing with Nautilus is that if there is no empty space to right click on, you can't get to the context menu, because you always end up in the context menu of one of the icons. There is no dead space between them to click on. In this case you can click the breadcrumb to get the context menu or use the icon in the overflow.


I’ve worked out that breaking free from this is just not for the average person. If you have the power then stand outside and look in but there’s absolutely nothing you can do about the churn other than watch it burn. I’ll carry on educating people but even suggesting this might be an issue is drowned out by the marketing machines of the large vendors and downvoted into oblivion even here. That is unless there is some large story that something fishy is going on. But a big story is made of a thousand small ones which people don’t care about.

Look where fighting Facebook and political influence got us. Nowhere.


>>> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup

I think the reason Apple, Android, and Windows have the advantage is that none of those three OSes require installation by most people. Why treat Linux differently?

>>> and nowhere near as user-friendly for most people on an everyday basis

User-friendliness, I think, is more about day-to-day interaction. It would be interesting to see a study in an area with low computer/smartphone exposure: split people into four groups, each group with its own OS, and later test them on some tasks.


I bought a Purism laptop with Linux installed, and it was definitely as simple to get going on that as it was a Macbook.

Actually even easier because there's no "You must have an AppleID to use this machine - if you don't have one go and register now" step, and no associated problems with that.


> I would argue that Apple cares a bit more

Have you read the article?

Spoiler: They don't.


Exactly. Apple has done a good job of selling the "we care about your privacy" line, but it's still just a line they're feeding you.


If you compare how Siri or Apple Maps work against comparable services, this just isn't true.


The operative word is more. The article doesn't examine other platforms at all, so it doesn't address this.


> And before "have you tried this *nix distro??" comments appear

Your style of comment is also quite the meme at this point, though.

I always reply with: if my dad, mom, grandfather and 10+ other members of my family can use Mint, and I have to do minimal maintenance for them, much less frequently than with Windows, then so can you.


Yeah, because your family members definitely represent literally every possible use case in the the world anyone could possibly have for a computer.

Ugh.


Did I say definitely, literally every and could possibly? Notice to what I am replying:

> [..] they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.

I think my comment is a fair rebuttal of nowhere near and most. If it was true, all of my family members would have to be outliers, which is statistically improbable. The more credible explanation is that it is not really true (and in my opinion, it clearly isn't).


I think you're being disingenuous about what constitutes "most people" in the context of desktop computing, but alright, fair enough, I seem to have read more into your comment than was intended.


> Whether it is an android device, an iPhone, a Dell laptop or a new M1 macbook you are beholden to a corporation that doesn't care about privacy.

Yet Dell and Lenovo offer computers with Linux pre-installed, but so many technical people ignore them because Apple is shiny and fast.

If this trend continues then Linux and other open source OS will eventually exist only as a virtualized guest inside a locked-down proprietary system.

There is genuine choice in the market now, but in ten years there won't be if people keep chasing baubles.


> Yet Dell and Lenovo offer computers with Linux pre-installed, but so many technical people ignore them because Apple is shiny and fast.

I got a Lenovo last year with Ubuntu preinstalled. It was the "blessed hardware". Wifi didn't work after wake/sleep, and my connection via USB-C to my monitor was unusable. I switched back to a Mac because I wanted to work, not fix my tools.

These aren't one off anecdotes, there are dozens of people in this thread with the same story slightly different. This isn't people going to Apple because it is shiny, it's going to Apple because managing hardware isn't their job.

Either address that meaningfully or drop the condescending attitude.


A number of users stopping using these products won't change the slightest thing. We all encouraged our friends and family to ditch IE for Chrome, and look where that got us.

Legislation and regulation are the only fixes for these practices. The EC has a lot of irons in the fire here - on multiple fronts. I trust that, although it will take time, "fairness" will prevail.


> We all encouraged our friends and family to ditch IE for chrome, and look where that got us.

I didn't. I encouraged friends and family to ditch IE for Firefox and nowadays I encourage them to ditch Chrome for Firefox. I never cared that Chrome was a little faster for some time.

For one person who only uses their (very old and underpowered) computer for mail and printing documents, I helped install Linux (not Ubuntu ;)), and since then they are much happier, because it doesn't get slower with each security update and needs less maintenance help than Windows did.


> encourage them to ditch Chrome for Firefox

You should probably start advising that they ditch Firefox for IceCat.

https://spyware.neocities.org/articles/browsers.html


The last IceCat update was over a year ago. Might as well leave my front door open.


> We all encouraged our friends and family to ditch IE for chrome, and look where that got us.

Antitrust entities bashed Microsoft for shipping IE, and look where that got us.

If anything, it's worth keeping in mind that volumes of regulations written for big corps may indeed force them to behave, but are an impediment for tiny companies to enter.

Corps like Microsoft call for regulations and copyright laws all the time; they have an army of lobbyists, and they know that even without regulatory capture these extinguish competition.


Actually, if those users went to alternatives and offered their money and feature requests there, the alternatives would improve.


Legislation doesn’t solve software problems.


> "...people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges."

I found myself being one of those people, so I made a commitment to no more Microsoft or Apple for my personal devices[1]. I'm eagerly awaiting PostmarketOS to reach full functionality on the PinePhone and I'll give up my Android phone at that point, taking Google out of the equation as well.

I'm even considering trying to learn C so I can contribute to OpenBSD for my hardware that is "rough around the edges".

[1] My wife's PC with Windows 10 is exempt for now; she's not a techie person at all and I don't want to force her into something alien. She tends to borrow my Thinkpad with Linux every now and then and is getting more used to it, so we'll see how that goes.


Don't hold your breath. The majority of people would trade security and openness for convenience and profit.

I think we are handing over the future of tech to these few tech giant megacorps.


Unless I missed something, we can still build custom computers, right?


Apple has NOT been the same since Steve has been gone. It is MS 2.0 now!

With better design and products, of course; nonetheless the business practices are the same. Much of this is enforced by the current economic and political system.


Yep, Steve Blank wrote an article analysing the situation in 2016: https://www.inc.com/steve-blank/tim-cook-steve-ballmer-visio...


That was a really good read, it really explains what has been happening to Apple in its post-Jobs life.


Not MS. MS today is much less into the anti-user, anti-privacy, rent-seeking game, compared to Apple or Google. Apple today is M$ 2.0, as much as people here hate that trope.


Only because their attempts at copying Apple and Google fail miserably.

Otherwise, here's Windows 10, have it for free, hell, here's Linux on Windows, just stay on the OS, buy apps on the Store and see these ads! Also give us telemetry.

Side note: no one cares, but I finally tracked down the cause of a really annoying bug, hibernation due to "thermal event" (CPU "overheating") in Windows 10: It's because of using dynamic disks in mirror mode (perhaps in other modes, as well). Makes Windows 10 hibernate randomly. Converted disks back to basic, no problems. Took me a long freaking time to finally solve this.


The privacy-invasive tracking in Windows 10 is much more obnoxious and tricky to work around (since it is spread out and hard or impossible to disable). The tracking in macOS is at least under the guise of security, whereas the Windows tracking is pure telemetry.


And you can disable that tracking on setup/installation.


Not all in one place, and not everything. See for example https://superuser.com/a/1512224


MS is still the same, they just matured a bit. They just have to behave better in the open source community to attract the developers. I don't believe for a second that they are doing it for the benefit of information sovereignty or the development of tech and software.

Their backdoor left a really bad taste in my mouth. I will try to avoid them if I can.


> I don't believe for a second that they are doing it for the benefit of information sovereignty or the development of tech and software.

They're doing it because the cloud is Linux and they need to get into the cloud because Nadella's vision for the MS OS going forward is a unified system that spans PCs, phones, tablets, IoT, everything, not just a desktop OS.


> much less

Define "much". Windows 10 is a mind-boggling privacy invasion. It's just the other offenders attract more attention, that's all.


How about being able to disable all the privacy-invading features on setup/installation? It's far easier than the number of hoops one has to jump through for Google.


> "I would suggest that the users here are probably some of the best suited in the world to help smooth those rough edges"

How would you get around the problem that many vocal Linux users think easy-to-use software is for weak losers, and consider difficult software a sign of manliness and an identity, even a religion? This is like saying Buddhists would be the best suited people to disarm gun-lovers, without considering that said gun-lovers don't want to be disarmed, and see pacifists as weak and unmanly.


The alternatives are still completely unusable by most people.


Not anymore. Consumer usage nowadays is very browser centered, which is something one can experience with a Linux distro on pretty much any machine.

I agree that it's not as polished an experience, but that's the point, as indicated - trading off polish for ownership.

I'm excluding professional document editing from the discussion - compatibility with LibreOffice is mediocre (at best) in my experience, but very good compatibility is not a consumer requirement IMO.


> Consumer usage nowadays is very browser centered

But it's also just as likely that you're using a browser made by Google, and using other services made by one of the big companies that don't care about your privacy. Where's the difference?


Google's not in control of the CPU, OS, or desktop. It's not fingerprinting every binary you run, nor locking you into any of the Google tools. You could easily use a different email service, different photo service, etc.


Sure, but since this subthread argues that users actually don't need binaries other than the browser, that's moot.


There's some confusion in the thread, in the fact that there's a mixup between operating systems and programs (or classes of programs), however the parent was referring to operating systems:

> The alternatives are still completely unusable by most people.

Otherwise the statement wouldn't make sense.


> Not anymore. Consumer usage nowadays is very browser centered, which is something one can experience with a Linux distro on pretty much any machine.

Or their smartphone, which is a hell of a lot smoother experience. So why does Linux Desktop continue to pride itself on being "good enough" for an "average user" [0] that doesn't even use a desktop?

[0] https://news.ycombinator.com/item?id=25047042


This is completely conceding the point. Nothing but the browser is up to par, and even that is not as ‘polished’.

If all people need is a browser, they should buy ChromeOS or an iPad.


"Not as polished" doesn't imply "not up to par".

Ironically, one of your two suggestions is a Linux distribution.


Android is a Linux distribution and MacOS is a FreeBSD distribution.

There is nothing ironic about that. Users don’t know and don’t care.

Are you equating the presence of the Linux kernel with some notion of software freedom or independence from corporate control?

If all people want to use is a browser, Linux is irrelevant.


> Not anymore. Consumer usage nowadays is very browser centered, which is something one can experience with a Linux distro on pretty much any machine.

Browser apps need a connection. Most of the services are also subscription-based.


Ok, suppose I can find an alternative to Adobe & co. on Linux. Which laptop features a multi-finger touchpad as nice and feature-rich as the MacBooks'?


Well, the alternatives can't rely on an unpaid workforce while Apple/Microsoft and co spend billions on their software, GUI and user experience. It is simple economics. Open source needs financing, not just free GitHub repositories that MS/Apple and co can exploit for free because the code is licensed as MIT.


I think this is something we need to fix with education. We spent so long meeting the user more than halfway that it developed an expectation. For any piece of technology, its peak function is always reached when the human meets it halfway and makes a concerted effort.

Whenever someone says “I don’t get computers” or gets angry at their smartphone they’re just leveraging that expectation and conning themselves out of a gain they could make. Same problem - laziness and expectation. At the end of that path is a life of a sharecropper and a cow to milk.


I disagree. It's a matter of taking a couple of minutes to educate people. You don't have to be a CS major to be able to use a modern Linux OS. "Most people" don't do anything on their Windows or Mac computer that they couldn't do on Linux. IMO it's a combination of fear of the unknown and a lack of general availability.

Macs are shiny and their marketing department is fantastic. Windows has a monopoly with OEM preinstalls. Those are the main reasons why macos and Windows are so popular.

I've tried using Windows; it's "completely unusable" to me. I couldn't even figure out how to install it without creating a Microsoft account, nor could I figure out how to turn off "telemetry".


> It's a matter if taking a couple of minutes to educate people.

While it sounds nice in theory, it is just unrealistic.

We'd also need to educate people why they cannot use random proprietary rental application on their mobile device that's not google or apple, or connect to a university vpn that doesn't support linux properly. Many workplaces use zoom for meetings, or google suite, etc etc

Exclusively switching to open/ethical software starts to resemble living like a hermit in the woods, and in reality makes one's life more difficult, reducing the impact the person can have on others.


Android handles this reasonably well. I can use F-Droid reproducible builds where possible and fall back to Play Store (or Amazon App Store) apps where I don't care that Google (or Amazon) knows what I'm running. I get to choose where I want to make the trade-off between privacy and convenience. If I want convenience, it lets me go further than any Apple product, and if I want privacy, it lets me go further than any Apple product in that direction as well.


I’ve considered making the switch back to Android to play with F-Droid; do you have a device recommendation?


For phones, anything that gets updates directly from Google (Pixel and Android One) will update quickly and will have community support for LineageOS for many years after official support ends. Unfortunately, Google makes absolutely terrible hardware decisions, but nothing comes close in terms of software value, which is far more important in my experience.


Keep in mind that some of the most user-friendly machines today are Chromebooks. Simple, not many knobs to turn, relatively secure, and of course browser-centric. ChromeOS is of course running the Linux kernel and the Chrome web browser. The experience is very similar to installing Ubuntu and Google Chrome, then dragging the other icons off the launcher.


But Chrome OS is just another of the locked-down, centrally controlled, telemetry-driven products that people are complaining about.

The fact it happens to use the Linux kernel is like saying MacOS is based on FreeBSD.


Bollocks. I've installed Linux for non-technical people any number of times and, provided I make sure everything (network, printers) is working for them, generally never hear from them again. At least not for computer help.

Most people are using their (personal) computers for browsing, maybe an email client if they're not using one of those web-email abominations, the odd bit of word-processing and watching movies/listening to music. Linux works just fine out of the box and the install is substantially simpler than anything I've ever seen over in the MS world.


Right, they just need an expert to install things for them to get started... that's not simple enough for people who don't have an expert they can call.


Why could anyone's auntie get it done when she needed MS-DOS or Windows installed in the 90s, but couldn't get it done today? The 'expert' being a neighbour, the little nephew himself, or whichever friend of the family.


The world outside of Linux has moved on from people needing experts to set up computers.

Today we just tell auntie to buy an iPad.


or, trust.


> These companies have repeatedly shown that they don't respect people's privacy

I mean, this is just plainly not true.

Apple has done many things, things they did not at all need to do, in order to protect user's privacy. They have shown, repeatedly, that they do in fact care.


I need Xcode to build software for iOS and OSX, and there isn't, to my knowledge, any other feasible, performant and offline-capable way to do that besides running OSX on a Mac.

This is the only reason I had to move away from (arch) linux and it saddens me every day.


>Yet despite all the wailing and gnashing of teeth - both in the article and in the comments here in the various threads - people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges.

Or we like what Apple/Google/Microsoft are putting out, including the "walled" nature of the iOS/macOS. People's homes are also "walled" for a reason.

Your concerns are not everybody's concerns.

That said, I don't like the telemetry and information leak as described in the article, and would want Apple to stop it and allow disabling it.

But I also don't care for using another OS just because "it's actually yours and you can do anything you want with it". That's also true for TempleOS, but I wouldn't want to use it (to use an extreme example of an alternative), and similarly, I prefer macOS and its app ecosystem to any Linux distribution I've seen (and I've used UNIX when there were still UNIX wars and SunOS workstations with black-and-white monitors, and Linux for close to 25 years).


It's super infuriating reading posts from developers who are like "Well I would love to use Linux, but I can't because the touchpad drivers don't behave exactly the way they do under MacOS..."


It was super frustrating until I got used to the Linux way of handling things, and then the other system was the weird one. Companies don't mind this effect at all, I think; it increases their retention.


I kinda pity them; I have a TrackPoint.


My main workstation has been Linux for a couple of years now. I have a dedicated PC for Ableton and I disabled all known telemetry calls. I got rid of all Apple products except an iPad 2.


The alternatives are not "a bit rough round the edges". They are much, much worse.


How much worse are they? Can you explain your experience?


Reminds me of Eben Moglen's "Freedom in the Cloud" talk.


I was looking into various differences among the demographics which use and prefer Apple products. Nobody rated any of the problems often thrown around here as a priority.

Majority of iPhone users are women, and HN/Reddit are primarily male-dominated. It should be no surprise that Apple caters to aesthetics over thermals, and that you see people complaining about that on these platforms.

Apple customers put status and prestige above cost when deciding to purchase something. They are rich. They care about the environment, which is why Apple "cares" about the environment. They care about privacy, which is why Apple "cares" about privacy.

So for them, it's more reasonable to replace the device than to have it repaired at some shady shop. It's consistent with their image. They can't compromise on privacy or safety, because that's why they bought into the Apple ecosystem.

https://www.theguardian.com/technology/2020/feb/26/apple-doe...


>Majority of iPhone users are women

I sold iPhones for years and never noticed this trend. I highly doubt it is true in any significant way. I would also challenge the assumption that women want aesthetics over some other factor. In this day and age, that's an incredibly 'old fashioned' statement to make.

From what I could tell, people bought iPhones because they were:

a) Cool and of good quality. (Your everyman can just go buy an iPhone and KNOW it won't be junk. They don't know the other brands, and researching stuff is a pain in the ass.)

b) They were used to using them and didn't want to use Android / didn't like using Android phones.

c) They had shit experiences with other platforms and just wanted something 'that would just work'.

d) The iPhone really was the first good smartphone on the scene. They have inertia associated with points a->c above.


> that's an incredibly 'old fashioned' statement to make

If spending in the cosmetics industry is any indicator, women definitely care more than men about how something looks. There are hard numbers, even if you don't believe in the various evolutionary theories and biological differences which make women more prone to noticing subtle visual and design cues.

From 2010: https://appleinsider.com/articles/10/12/01/women_want_apples...

From 2013: https://www.statista.com/statistics/271204/android-vs-iphone...

From 2015: https://www.statista.com/statistics/513995/smartphone-user-g...

I am not aware of any large-scale recent surveys focused entirely on that, but there are many dating surveys which report women having a preference for men with iPhones. The numbers most of them report (70%?) are likely dubious and could be influenced by other factors, but still, it seems women do prefer iPhones more.

I don't think this should be odd. iPhones have the best overall cameras every year and are huge on Instagram and Pinterest (platforms dominated by women), so I can only extrapolate that the difference would be bigger now with social media booming.


I will state it more plainly than 'old fashioned'. I find your assumptions about what women want from their tech products offensive, and from what I can gather, just based on you wanting to believe it really badly.

In addition to that... I actually sold the damn things to actual human beings! I KNOW who is buying them! I've met them by the thousands! It's like approaching a mechanic about why Fords don't break down by pointing vaguely at some tables.

Your data doesn't particularly support your views either.


If you wanted to compare purely biological imperatives, then you'd only take data from cultures that treat women and men wearing makeup equally.

As such - men spend a lot on fashion and grooming, while the same can be said about women and cosmetics... but that's mostly about cultural customs.

PS: Only the latest iPhone has the best camera. They have traded places with the Pixel, Samsung, etc. over the last few years.

PPS: Your stats are a little bit out of date... Not to mention, the 2013 data contradicts your claim: women were distributed between Android and iPhone equally in 2013. There would be a shift towards the iPhone if women wanted iPhones more.


For cultural customs, I am not sure which country to take. For US and UK (which dominate the statistics on stuff like this due to language and other factors), women spend significantly more.

> The 2013 data contradicts your claim.

It doesn't. Men want the iPhone less than women do. My initial point was simply that people here and on Reddit (male-dominated sites) are not reflective of Apple's focus.

31% of men wanted Android while only 24% of women wanted the same.

> PS: Only the latest iPhone has best camera. They traded places with Pixel, Samsung, etc... over the last few years.

This is misleading. While iPhones fared worse than the Pixel in the photo department for 3 years, they make up for it in other aspects, especially video. As for Samsung, it depends on what you consider a better camera. The iPhone arguably has the best color profile and accuracy, while Samsung phones apply beautification and increased contrast, which does make photos look nicer but, overall, less accurate.


> For cultural customs, I am not sure which country to take. For US and UK (which dominate the statistics on stuff like this due to language and other factors), women spend significantly more.

You're trying to make a biological-preference claim out of what is customary in Western European cultures.

Sometimes it's better to say that "we don't know", because that is exactly it.

> This is misleading.

No... Your original claim was misleading; I just forced you to correct yourself.


To be fair, if this was truly a big deal, it would be a political issue with bipartisan support and taken care of short order.

In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)


> To be fair, if this was truly a big deal, it would be a political issue with bipartisan support and taken care of short order.

Are you serious? Since you're using the term bipartisan, I'm going to assume you're also from the US. Have you been following the news/current events recently (and by recently, really I mean any time at all over the course of the past forty or fifty years)? Almost nothing actually gets "bipartisan" support these days. Our government is in a constant state of gridlock and it is next to impossible to get anything done.

> In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)

I'm really having a hard time telling if this comment is sarcastic. I'm not trying to cause you to lose sleep or anything, and honestly, I commend your optimism if you're serious, but the fact is that democracy is an abstract concept that is almost never put into practice perfectly. Even if democracy's promise of "most people being happy" (meaning 51 people are happy while 49 are not) is actually sufficient, even that idealized version is very far from our actual political system in the United States.

I think one thing most Americans actually would agree with is that the way we do things here, right now, in terms of government, doesn't work so well. I think there is a lot of disagreement about what should change, and how, but it seems like very few people are satisfied with the current state of affairs. The more I think about it, the more I think this comment just has to be sarcastic, so this reply is probably a waste of everyone's time, but in any case, wow. I wish what you were saying here were actually true. That would be a nicer world to live in than this one (although, frustrating as it might be at times, this one is also not so bad).


> Almost nothing actually gets "bipartisan" support these days

There is bipartisan support - for breaking encryption: https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020#Legislatio...


Democracy is probably fine, it's the fact that we allowed politicians anywhere near it that's the problem...

