What we poor souls didn't understand all these years is that we're effectively helping other companies to steal from Apple by running software that isn't distributed via their app store [1].
I think that, now that Apple has learned from iOS how profitable a walled-garden business model is, they are trying to bring that model to the PC world as well. Shipping hardware with their own processors is an important step in that direction, because it gives them control over the IP of arguably the most important component of the computer, which in turn makes it easier to control software distribution for their architecture as well.
The transition will be slower, of course, because people have to get used to seeing computers as closed-source devices with app stores; there are still too many of us who have the mindset that you can just install anything you want on a computer without asking Apple for permission and without paying a tax to them.
At this point I really wonder how any serious "hacker" can work on such a device, it's becoming the antithesis of everything that the original Hacker culture stood for.
Good thing said hacker culture spent 15 years buying overpriced laptops and desktops from them, ultimately for convenience, giving them the market presence to do this in the first place. It's the same thing with Windows.
People will preach all day that they won't make any personal sacrifices to try to avoid feeding literal cancers that are eating the software industry, then act shocked when said all-consuming voids take away their autonomy; but by then they are so locked into their ecosystems that they are trapped.
These are for-profit corporations, not your friends. Their bottom line dictates that they exploit copyright, and their ability to sell and distribute proprietary software outside your control, to maximum effect. Microsoft, Google, et al. want you using devices you cannot control, because then they hold all the keys and can demand the greatest ransoms for you to do what you want. It's inevitable, and the persistent, total arrogance and hubris on display in the tech hive mind regarding any of these walled gardens is just... depressing.
Free software isn't going anywhere, especially with RISC-V now being an established failsafe if UEFI PCs and ARM totally lock down, but it would just make the struggle a lot easier if the principal victims would stop martyring themselves for the corporate ecosystems they have no say, influence, or control over.
What's worse is the number of other posts with the top comments being vociferous defenses of these companies as if they needed these people to defend them or cared one iota for their welfare. It feels like it's Stockholm Syndrome on a mass scale.
I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens. They scream outrage when they're not allowed their tiny little dance-jig operators provided courtesy of an abusive regime.
Sure, I reflect that I may be now nearing the curve of an obsolescent person attached to such silly outmoded principles like "ethics" but if we're all just selling ourselves out constantly to the angry god-machine of the id, is this really the future we want for our daughters and sons?
> I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens.
Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.
I recommend Jonathan Blow's talk "Preventing the Collapse of Civilization"[1] for an unsettling view of how far we have already traveled down that path.
Gen Z can get real general-purpose computers with a development environment for cheap. The new Raspberry Pi 400 is a modern take on home computers like the C64, but 15 times cheaper. Pi 400 full kit: $100, C64: $595 (1982) / $1576 (inflation adjusted for 2019).
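A rough sanity check on that inflation adjustment, using approximate US CPI annual averages (roughly 96.5 for 1982 and 255.7 for 2019; those figures are my own assumption, not something from the Pi Foundation's comparison):

```python
# Rough check of the C64 vs Pi 400 price comparison.
# CPI values are approximate annual averages and an assumption on my part.
c64_price_1982 = 595
cpi_1982 = 96.5
cpi_2019 = 255.7

c64_in_2019_dollars = c64_price_1982 * (cpi_2019 / cpi_1982)
pi400_kit_price = 100

print(f"C64 in 2019 dollars: ~${c64_in_2019_dollars:.0f}")                     # ~$1577, matching the ~$1576 figure above
print(f"vs Pi 400 kit: ~{c64_in_2019_dollars / pi400_kit_price:.1f}x cheaper")  # ~15.8x
```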
Computers are not just getting cheaper; educational material is easier than ever to find.
Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore, but then again, nothing is stopping a kid from buying a microcontroller and playing with it. It's cheap and information is plentiful. I would even say that it is easier for Gen Z to play with electronics than it was for millennials.
But that unprecedented availability of computers didn't turn everyone into a nerd. Just because someone has a smartphone doesn't mean they have any interest in computing. When computers were limited and expensive, only those who had some affinity with them had them; now everyone has them, so mechanically, you can't expect the same average level of interest.
Learning about the Raspberry Pi is nice, but that doesn't help preserve the institutional knowledge needed to build system/low-level features in a modern OS. I'm not talking about teaching kids to program; I'm concerned about preserving in the future the knowledge of how to create all of the complex tech we use today. Many civilizations throughout history collapsed after they lost the technical knowledge on which their civilization depended.
> Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore
Yes, that's the problem! Unless those skills are actively used and passed on to the next generation of engineers, that knowledge decays. Part of the reason you don't see a lot of bare metal programming anymore is due to the knowledge decay that has already happened!
> nothing is stopping a kid from buying a microcontroller and playing with it
This article is about how those same kids are being stopped from learning the complex systems we currently use.
> didn't turn everyone into a nerd.
Nobody is trying to turn everyone into a nerd. I'm talking about making sure the nerds of the future have the ability to learn about the tech they use, so the ability to understand and make that tech isn't lost. Locking down the OS into an "appliance" that cannot be inspected or changed is a direct attack on the ability to learn.
The problem with low level programming is just the general trend of specialization.
The modern computer is the pinnacle of human civilization, a thing so complex that it requires thousands of people to fully understand how it works. Low level programming is just a small link in the chain of complexity. From mining the materials, to producing silicon of incredible purity, to designing circuits at quantum scale, to CPU design and finally the guy who places the frame that will display cat videos. So if you argue that no one knows how to program the bare metal (that is not that bare anymore), one can argue that the knowledge of how to make logic gates from transistors or the metallurgy of copper traces is getting lost.
Maybe fewer people will know how to program on the bare metal. But think about it: it has been thousands of years since most people were able to hunt and gather. That is a great deal more worrisome than not knowing how to program computers, and yet human civilization has been thriving.
The important thing is that it is still accessible, and it is.
What you're saying was truer in the '00s when 'scopes, logic analyzers, etc. were professional instruments costing thousands and FPGAs were inaccessible to all but the most dedicated and wealthy hobbyists. Today, with adequate hobbyist-grade instruments and dev boards available in the $100-200 range pretty much anywhere you look, really, the situation for low level engineers and developers is better than it's been for years and, in some ways, even better than the '70s heyday of homebrew computers.
The problem is more the lack of interest. Low level systems development is hard, a computer engineering degree is hella hard, and all the hype and easy money was in following the webdev career path starting from the late '90s dotcom boom.
It was always a tiny minority of the general population that had real knowledge of and capability for creating the actual technological equipment widely used by said general population. In the same vein, there are no fewer people capable of low/OS-level programming than before, and you can't expect every proficient user, even a power user or e.g. the now-numerous web developers, to be at this level of ability.
What changed is that the very things capable of eliciting interest in programming also offer overpowering content consumption functions with huge, never ending catalogs of games, movies, videos, short funny clips etc.
As computing and "content" proliferate, the uncompetitiveness of creation, especially symbolic creation such as programming, is increasing. At some point, broadening of access no longer offsets this effect, and the talent pool may start to shrink even if capability and permeation are a million times higher than they were.
There is nothing wrong with teaching Python as an introduction to programming methods and data, then diving deeper. My son started with Python in his Junior year of high school and as a senior is now learning Java.
It is just top down learning, instead of bottom up. For programming, I think either works.
I just happened to learn the other direction because of the year (early 1980s). On CP/M, you had a macro assembler right there, and it took me a while to get my hands on a Basic interpreter.
Excellent book! Although written with Pascal in mind, it became really useful to me. Always brought it at work along with the K&R 2nd Ed. and the famous TCP/IP networking book by W. Richard Stevens.
I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn. Now, most people get computers where those things are forbidden. You still can get "programmable" computers, but you have to do it explicitly. Before, it was implicit. So there's a filter that some people won't pass and won't learn, and the pool of people shrinks.
>I think the point is that before, if you had a computer, you had the ability to inspect, modify and program, and with curiosity you'd learn.
That's a rose-colored view that maybe reflects a small window in time.
But take the 80s. First of all that computer was maybe $5,000 in today's money. (Yes, there were some limited alternatives like the Sinclair and Commodore 64 that were cheaper.) Long distance telephone calls were expensive as were the big BBS systems like Compuserve. Any software you purchased was costly. A C compiler was hundreds of dollars. (Turbo Pascal at $50 was an innovation.)
Perhaps more to the point, most people didn't just have a computational device of any sort. So the fact that, if you had one, you could use it for a lot of different purposes is sort of irrelevant.
My experience, being born in 1992, is that as a kid I could toy with QBasic for instance (and then with Dev-C++ a bit later), as well as go peek & change stuff in random files (hello Red Alert save files :D) - can't really do that with an iPad.
I was a QBasic kid as well, but I think it was pretty clear to me even at the time that QBasic was a mickey mouse environment. You couldn't get into 32-bit mode, or dynamically allocate memory with pointers, much less blt sprites or backgrounds around at a usable speed.
I'm not saying it's exactly equivalent, but there's certainly a perspective which could argue that a) the free JavaScript interpreter available on every device is more featureful and interesting than anything available to us 80s and 90s kids, b) low-cost SBCs running free open source operating systems and compilers are more accessible and better supported than the commercial compilers that few could have accessed back then, and c) the overall ramp from curious/creative gamer kid to capable hacker is much smoother now than it was then, with a lot of interim options between JavaScript and commercial environments in which to gain comfort and achieve interesting results (thinking stuff like GameMaker and RPGMaker, modding communities for existing games, hacking/homebrew/porting communities for consoles, etc).
I get the argument that many kids are growing up today without necessarily touching a "real" computer, just a tablet. That said, as someone who got into PCs quite early on, I'm a bit skeptical that the world we live in today where you can assemble a working computer based on a Raspberry Pi with loads of open source software for probably about $100 somehow is less accessible to a kid who wants to hack around on computers.
I think the key with the Pi is that it needs to find its way into the house for some other reason than just to be an educational aid. Positioned that way it will be about as interesting to most kids as a plate of cold spaghetti. I'm thinking here of the book I was given as a teenager on developing ActiveX components, because it was an enterprise-y thing, when what I really wanted was a book on DirectX, for making games.
But yeah, if the Pi shows up as part of an IOT system, or as a TV/streaming box, or to play retro games on, or whatever, then it's there and it's available to be tinkered with; and from my limited experience, basically none of those use cases will run on their own without at least a little bit of tinkering. :) Even my little Linux-running Powkiddy emulation handheld has probably consumed about as much of my tinkering time as it has my retro gaming time.
Right. So quite a bit later than the period I'm talking about. Late 90s/2000s period is arguably when commonly-used hardware was most accessible/hackable.
Eh, I mean the kids who are passionate about how systems work will find a way to dig deep. I don't think this is going away with locked-down systems regardless.
The only thing it will do is make it easier for the general public to work with machines; the masses never cared to begin with, and to be frank that's a good thing. They are less likely to install malware and cause trouble for themselves.
Plus you're forgetting that these companies still need engineers to build and maintain their infrastructure so it's not like the knowledge is going to disappear, never mind the fact that the corporations heavily rely on OSS.
The lockdown and control of software distribution.
Great, you can write a script; its checksum is then sent over the net and verified to not be malicious, or if the service on the other side is experiencing lag... well... gotta wait, and wait.
To the point where all choice is removed from an operating system. So much for root access.
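For what it's worth, mechanically the kind of check being described boils down to something like this. A minimal sketch, purely my own illustration of the commenter's description - not Apple's actual protocol, API, or endpoint:

```python
# Sketch of the check described above: hash a file, then (conceptually) ask a
# remote verification service whether that hash may run. Purely illustrative;
# the real macOS checks work differently and no real endpoint is named here.
import hashlib

def file_sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

digest = file_sha256(__file__)  # hash this very script as a stand-in
print(digest)
# A gatekeeper of the sort described would now send `digest` over the network
# and block execution until a verdict (or a timeout) comes back, which is
# exactly where the "gotta wait, and wait" part comes from.
```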
It's not about the ability to write programs, but the ability to write programs that influence your base system in a meaningful way. Yes, you can write programs, but can you write a different /bin/init for your system? I'm not sure you can (either because Apple will not let you, or your lack of ability to do things like setting up a system to be used). Maybe you can, we're wrong about this, and more power to you. But it's quite likely that you can't.
$100 isn't cheap for a curious kid with no income, especially if they're not already sold on it as an interest.
There's a big difference between buying a computer explicitly for the purpose (even if it's cheaper), vs being able to play around with the hardware that you already have, for no new monetary investment at all.
But computing tools are cheaper now than they have been in ages.
You can buy an Atmel DIP version of the chip in an Arduino really cheap, and build basically what is an Arduino on a breadboard. Then get a USB programming adapter (again, cheap) to get it running.
Then you can get inexpensive USB logic analyzers that are plenty capable of monitoring I2C and SPI buses and learn how all that works.
None of that existed in the 1990s. It simply wasn't there.
The price and availability of tools are so much better today. You don't even need to buy any books; it is all online, with community members jumping in all the time to help.
Get involved in the local community board. Present to the library council that this is a good investment for education. Purchase 20, lead a community class to teach the next generation of students.
If a kid with a phone but no money wants to know if programming is something they might like to study, and somehow the school also can’t afford a Pi, I’d point the kid at something like https://js.do/
The problem is that's not really how people get started. Hello world is how you get started in a university intro course, not in your parents' basement.
What happens there is that you have a pile of photos that are all in one folder and you want to organize them into separate folders by date. A problem you're actually having, not a homework assignment. Which is possible to do automatically, because they all have metadata with that information in it. So you find a python script somebody wrote to do that, and it works! Then you realize the script is just a text file and you can modify it to sort using geotags or something else instead, and now you're down the rabbit hole. But only if it's easy to see and modify the code that runs on the device you actually use.
Not everyone learns the same way or for the same reasons.
For example, I learned to read at the same time I learned to code, and from the same source — I had the unusual combination of not just the right birth year, but well-off parents and two elder siblings that had gotten bored of their Commodore 64 just as I was reaching reading age.
Back then my coding was, naturally, 95% transcribing from the manual, but it was fun. One of the few programs I can remember reasonably well this far removed from the experience was a sentence generator that had a list of nouns, verbs, etc. and picked from them to build grammatically correct gibberish.
Hello world is how a lot of us started in C with the K&R book, wherever we were. Most of us didn't want to organize photo folders, we wanted to make the computer display something cool or play a song or something, and usually we had to start with something elementary when tackling a new language, especially a complicated one like C.
I can still remember being 11 years old and meticulously keying in the hello world program on my dad's aging Tandy Model 16, then saying 'cc hello.c' and watching as the compiler chugged through the tiny program for a minute or two. (It #includes stdio.h, so there was more code to go through than it seemed, plus there was a linking step. And not much actually happened, because Unix C compilers are silent if there are no errors.)
When I ran a.out and saw "hello world" appear, I was over the moon. It was like clearing 1-1 in Super Mario Bros. for the first time. Like I had surmounted the first obstacle and was thus ready for all the challenges that lay ahead.
I don't know why a raspberry pi is supposed to be a good device to learn programming other than the fact that it forces you to use Linux. Do kids really ssh into their raspberry pi from their phones? Or do they really buy an expensive screen, expensive cases and expensive peripherals adding up to $200?
I also doubt that raspberry pis are actually the type of device that schools are interested in. What they'd want is the TI-84 equivalent of a laptop. It should be easy to reset the laptop to factory conditions and it would only run a few specific preinstalled programs (one of which would be a barebones IDE, probably for Java). To me it feels like the raspberry pi fails at all of these. You have to mess with hardware and then mess with the software. A school provided Chromebook with student accounts managed by the school would be way more practical.
However, if you actually want people to learn programming organically, it's much simpler to just get a Windows laptop and use that as the main device.
I too have doubts about the real-world suitability of the Raspberry Pi in formal education environments, but certainly the business of resetting it _is_ a solved problem; every kid is issued their own SD card, and can flash it back to the starting image whenever they want.
On a RPi, with a standard laptop on the side (Windows/Mac/Linux, doesn't matter), and some basic installed tools, you can pull down and configure Alpine Linux and run a bare bones IoT Linux platform. If you want to get really gritty, you also configure and build U-Boot as a primary loader.
Once you get past that point, you pull Docker into your Alpine build and start running containers.
Stepping through that full process will teach you research skills (because you are going to be reading a lot about how to pull that off), and it will teach you about Linux kernel configuration. It will teach you to be at least comfortable with git.
There is a lot you can learn on a Raspberry Pi, cheaply, that only involves plugging in an ethernet port and power supply, and never seeing anything on a terminal.
> Computers are not just getting cheaper; educational material is easier than ever to find. [...]
Being accessible and being accessed are two different things.
I'll make a more general parallel.
Never in history was such an enormous amount of audiovisual media about the decades preceding the birth of their generation ever produced, whether during those decades or later. And this astonishing amount of material (written documents, audio, films, both fiction and non-fiction, live takes, documentaries) is readily available: often immediately, often gratis.
And yet this generation is disconnected from the previous ones in an unprecedented manner. And yet this generation seems to be the most ignorant about how stuff was just a couple decades before them. I've never seen such a break in the flow. The material is present but not reached or understood.
Yes and no. At least it created a modest tinkering culture while the mainstream went with smartphones. I wonder if RISC-V will help us get past ARM* or if RISC-V SoC vendors will try to create their own walled gardens, proprietary interfaces and GPU/codec support.
* = The issues with the Raspberry Pi are both with Broadcom's closed-source hardware (there are other tinkering boards with more open hardware) and of course with the license model of ARM.
I agree there are other 'tinkering' boards with more open hardware, but the Pi has the advantage of a huge community of very nice supporters. Maybe even as nice as the community over on arduino. The biggest problem with Broadcom on the Pi is the videocore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.
As far as the ARM license goes, very few people are ever going to spin their own hardware, and at least the ARM programming API/ISA is very well documented. We'll see if that continues under NVidia.
Certainly MOS, ST, National, Zilog, Motorola, Intel, IBM, DEC (etc. etc.) CPUs were always closed source. That did not prevent anyone from learning about computer systems and programming then, and it won't stop them now.
That said, I doubt Apple will ever make the M1 available for purchase. Which is where this thread started.
>> The biggest problem with Broadcom on the Pi is the videocore block, and if you never run the video (using it for IoT), you avoid the biggest black hole on the device.
Avoiding the black hole does not solve the part of the problem where you are financially supporting closed hardware and disincentivising the Raspberry Pi Foundation from doing the right thing.
This is the same problem we had with TiVo: you could get the operating system, tinker with it, and run it. Just not on your TiVo. What good is having the source if you can only run it on toy hardware? You need to be able to alter the code on the tools you use so you aren't beholden to a company that drops warranty and support within 5 years and may never fix the problems you have even while it's supported.
I just made the C64 <-> Pi 400 comparison to someone last week. It is completely analogous.
Pair that up with an Arduino, and maybe some wiring, and you can have hardware interaction with a Linux desktop on the 400.
I just helped a young arduino user with reading an analog pot yesterday. There are still young people out there playing with hardware, and it is way easier today than it was for me in 1984.
Then go over and look at the people configuring and compiling custom versions of Marlin for their 3D printers.
Or the nerds designing flight computers for solid fuel model rockets.
Wouldn’t it be great if you could route the mac’s internet through the raspberry keyboard, strip the surveillance automatically, and continue to use the internet like normal?
> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.
Gen Z here. I tried to give manufacturers the finger and play with systems-level software on modern PCs, and ended up retreating after realizing, among other things, just how much modern computers are not for hackers. Everything feels like an "undocumented" mess (by which I mean the documentation is likely only accessible to OEMs and those with large business deals with them). Even when it is available, I'm too scared to touch it. I found something I wanted to hack around with, until I saw the license agreement I had to read in order to access the docs. I couldn't parse the legalese, especially regarding open source works, and I'm not consulting a lawyer to figure out if I can post some project I might hack together online under an OSS license. The few things that do get open sourced are incredibly half-assed, and you can still tell they're designed for corporations, not hackers (e.g. edk2).
Conversely, I have an old IBM XT-compatible luggable sitting in my closet. The manual for that has the full BIOS source listing. Nowadays I mostly just hack around with old non-PC devices, but for the most part computing just isn't fun for me anymore.
As other commenters have said, just picking up an open source *NIX is a great start.
If the lack of open hardware bothers you, there are several non-x86 architectures that are fully open source. Chief among them is RISC-V[0]; SiFive[1] is taping out RISC-V processors to real hardware, selling dev boards, and stuff like that. These are real, competitive, Linux-capable CPUs, although widespread adoption has not happened yet.
On the more minimalist front, the ZipCPU[2] is simpler (though it has less software support), and its author writes some really great blog posts about HDL[3] (Hardware Description Language -- how these CPUs are defined).
You might also enjoy developing for very small embedded systems. I like AVR CPUs such as the ATMega328P (of Arduino fame) for this purpose. Although not open-source, they have quite good and thorough datasheets. AVRs are very small 8 bit systems, but as a result, you can fit most of the important details in your head at once.
If you want to talk more about these topics, feel free to get in touch with me. You can find my contact info in my profile, and on my website (also listed in my profile).
At the OS level, sure. It's still running on a completely non-free / hardly hackable platform, though, riddled with all sorts of backdoors and other curiosities. I'm aware of things like coreboot / libreboot, but support is even more limited there, and porting involves a deep understanding of the x86 platform, in my brief experience.
You don't need any of that to understand how x86 or the OS works. Realistically you can do the entire learning with QEMU/VM and not be restricted by the hardware at all?
At some institutions they teach how OSes work by having students implement Pintos, a full x86 OS.
In my first year I had to build a full Raspberry Pi emulator from scratch and then code an assembler for it. These programs would then run natively on the Raspberry Pi. People wouldn't even touch the hardware until much, much later in the project, once the emulator had been confirmed to be working.
I disagree with the view that you need fully FOSS hardware to understand a platform. You can do a lot with a VM/QEMU.
It really depends on what you want to do - a Linux distro on x86 is IMHO a good start, but of course it's not the only option - there are now some reasonably open ARM devices (including an ARM laptop) from Pine, there is the very open Talos workstation running on POWER, and RISC-V is showing up everywhere.
Why do people make these kinds of comments? Talos isn’t showing up anywhere, and even during the Kickstarter it was more than $3000 for just a motherboard.
> Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.
Kids today aren't as skilled with computers as I would've expected years ago. I feel the app culture stunts deeper learning.
I'm aware that I am looking at this through the lens of self-bias, but I have definitely found that much of what I know about computers came from being able to (and sometimes needing to) poke around and figure out how to do something.
I am not a software developer and have little formal education in computer software or hardware. With that said, I've picked up enough just by growing up with Commodore, DOS, Linux, etc. to at least have the basic understanding needed to research or just figure out solutions to most common problems.
I work in media production and IT in an educational setting, and while I'm far enough along the Dunning-Kruger graph to know just how little I know, the sort of things I see from students (grad and post-doc at that) definitely give me pause sometimes. Often, the mention of basic things like how to change a setting or search for a file in MacOS or Windows is met with glazed eyes or signs of discomfort. Apologies for how "I'm not really tech savvy" follow talk of anything more complex than pointing at pictures on a screen or doing something outside of the web browser.
And yes, I do understand that the very nature of these students' specialization in other fields can mean they haven't had the need or opportunity to learn much about computing. I just feel like I only picked it up because it was how you got things done or because it was there for poking and prodding when I got bored or curious.
I think in the end it's not just that I have a personal connection to computing and think everyone should share my interests. It's more akin to a basic competency that opens a lot of doors and prevents you from being taken advantage of.
My analogy is usually along these lines: if your job requires you to drive a car, you don't need to be a mechanic or be able to build a car...but you should at least know the basics of how to operate and maintain a vehicle beyond just gas pedal, brake pedal, and wheel.
10 years ago when I started working I feared the younger generation and their enthusiasm and energy. Then I had to teach a few of them how to ssh and that fear was dissolved. Most of these kids will not cut it in heavy reading/comprehension jobs.
In my experience the up-and-coming generation are far, far better at branding than my own generation ever was. Talking to my younger colleagues I generally get a deep feeling that they know what they are talking about, are thoughtful and make good decisions.
But the thing is, that's just branding: they aren't actually much more competent or thoughtful than my generation was at that age, they just seem like it. When I look at the code they write, it is as bad as the code I wrote. They have startling gaps in their knowledge and experience, just as I did (and no doubt still do — there is always something to learn!).
The thing that worries me is that until one gets to more objective measures, they really do seem more competent and trustworthy — which means that others are more likely to trust them, which is likely to lead to more bad decisionmaking at scale.
I wonder, though, if there is really a difference at all. Maybe my generation actually seemed more competent to our betters than we really were, too!
not really meant as a hard judgement, just sharing my experience. i am 100% sure there are folks at all age levels that can be and are better than me. i have definitely worked with some.
> Perhaps it is considered more of a commodity now?
It is and it should be. What the white beards all too often forget is the fact that the generation before them would say the exact same thing about them (lack of enthusiasm, lack of interest and knowledge, etc.).
This is a story as old as time:
> The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households.
attributed to Socrates (469–399 B.C.)
How many 80s kids were into HAM radio or actually building computers (Ben Eater-style: https://www.youtube.com/c/BenEater/videos ) instead of toying with micros or basically playing Lego with PC components? Same difference.
Today's computers have reached a level of tight integration and complexity that simply cannot be grasped by a single individual anymore.
People keep whining about that when it comes to computers but happily accept the very same thing in other commodities like cars, clothes(!), highly processed food(!), hygiene and cleaning products (soaps, hair gels, rinsing agents, ...), utility grids, etc.
It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.
> It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.
that is perhaps the most poignant take on this i have come across in a while. thanks that got the cogs turning in... different directions. i do need to ease up. :)
There has been a significant increase in demand and cost for pre-owned vehicles that do not have integrated entertainment systems and drive-by-wire systems.
In the used truck market, I have seen vehicles triple in value over the last decade.
Problem is the pipeline. Schools that really bought iPads for "digitalization" will produce learning results that are significantly worse for understanding tech applications and information technology, because the principles get obscured by fancy UI. Never mind that it wouldn't even be different from their phones.
A problem is that teachers would require some extensive training first before they can even try to teach kids something they didn't know before.
I do apologise if this will sound condescending: what are you on about?
I do agree that the majority are limited in terms of technical knowledge, but I think that was always the case. It's always a (driven) minority that is responsible for tech advancements, preservation and magic.
I wish I was 14 in this day and age, with this much information, how-tos, cheap (and expensive) tech and microcontrollers (when is the last time you checked places like AliExpress and git repos?).
We live in a potential golden age for technology education:
- Hardware is ridiculously cheap and powerful and widely available
- Inexpensive educational computers such as the BBC micro bit and Raspberry Pi are widely available along with associated educational materials; these devices are easily interfaced to the real world, and can often be programmed at a very low level or high level as desired
- Overall, robotics and real-world computing is cheaper and more accessible than it has ever been
- FPGAs are ridiculously cheap and powerful and allow you to explore processor and digital hardware design
- Nearly every computer, smartphone, or tablet contains a web browser that can be programmed in JavaScript, webasm, and dozens of other languages that compile to JavaScript or webasm; web apps can display 2D, 3D and vector-path graphics, generate sound, and interact with all sorts of input devices
- Nearly every computer, smartphone, or tablet has a Python implementation as well
- Free and open source software is widely available
- Free and open source game engines are widely available
- Games are often moddable and many contain built-in level editors
- Source code of almost any type of software you can imagine, from operating system kernels to compilers and libraries to databases to web browsers to scientific computing to games and productivity software, is widely available
- Billions of people have internet access
- Tons of free and often high-quality educational materials are available from universities, Khan Academy, and many other sources on YouTube, on GitHub, and all over the web
- Many books including textbooks and tutorials are available for free as PDFs or readable on the web
etc.
The materials are out there, but the challenge is empowering people to find and make use of them.
Omg, I can't upvote this enough. I have been telling principal/director-level engineers this for almost a decade. We shouldn't be making the hiring bar so high that it requires a PhD, but rather hire on aptitude and hunger for knowledge and feed them. So much of being a senior+ in tech is knowledge sharing, which we don't do enough of. I'm guilty of this too! I value my personal time and don't want to spend it building training materials.
The whole “playing” with system level software is how I became an engineer in the first place!
It’s also extremely apparent when you suggest a new way of doing things and older folks look at you like you’re crazy or worse, actively block because “we’ve always done it this way”. Those youngsters in those tiny screens have big ideas, from a different perspective, and we should acknowledge that and learn what they know.
Back to topic, I think the maturity of software engineering has enabled a lot of what we see today due to the macro cash grab of capitalism. There are things that stand firm beyond it, but over 80% of software written is logical duplication of software already written (for another company, perhaps).
I’m guessing on the percent but it’s important to know that global overall LOC is really just hashing out the same stuff we’ve been doing (albeit in maybe a slightly more novel way) to get business value.
> We shouldn't be making the hiring bar so high that it requires a PhD, but rather hire on aptitude and hunger for knowledge and feed them.
As a self-taught developer with just a few years of experience, I appreciate seeing this sentiment. I have this burning desire to be a great engineer and to learn as much as I can, but I'm not exactly sure what I should be doing or learning.
Do you have any advice as to activities, books, or other resources that you think would benefit a young engineer who would like to learn how to do things right?
I’m on year 20 of this amazing ride. I have another 20, possibly 40 in me. I love it!
I was self-taught. The trick is to find something you’re super super interested in and learn as much as you can from books, youtube, Google, blogs, other members of HN.
I was really into games and wanted to make quake-style fps games. I started with art. Went to college for graphic design and hated it so I dropped out. Worked on games while I did crap jobs. Eventually made a few small ones and got really into web development. Got a new job...
I can’t really tell you what books without knowing your interests.
As an old dude, I agree with you completely. We push way too much on the degree/school thing in HR, long before we get to interview them. For most positions, I would rather have an enthusiastic learner with a BS, or even an associate's, and have them learn how it is done from experienced staff. If they really like it, and want to pursue more college, fine. If they want to dive in and learn by doing, fine.
I am in my retirement position now, stepped back to staff engineer, because I don't want to spend any more time stuck in meetings. The best principals, the best directors, all came up through the ranks. Yes, some of them returned to college for a few years before coming back, but the best ones just came in as enthusiastic greenhorns and dove straight in.
I have worked long enough now (36 years in this field, and this is my third 'career') to have seen people I mentored grow into director level positions, and that is perhaps the most rewarding thing there is.
On a related note, game consoles seem to be going in the same direction, focusing on services and downloadable content or games over physical media (and even those that are physical seem to have a large online/multiplayer component). After a console generation or two, these services get taken offline as they are no longer profitable, leaving a portion of your system useless.
I can go back and play old games I grew up on, on any number of platforms, thanks to emulation or buying old physical media. This generation won't be able to do the same in 10-30 years.
> What's worse is the number of other posts with the top comments being vociferous defenses of these companies
I slowly tend to think that some of those commenters, who fiercely defend all and any action taken by Apple to limit the freedom of its users, are paid marketing and PR firms who scan forums to oppose any opinion that is not in line with Apple's vision of a closed system.
I understand the sentiment, but how many people were ever going to understand their systems at that level? I'm sure Gen Z will have their own tech equivalent of Roy Underhill telling them to put down the Ikea bookshelf and grab an axe and a log, and that the number of people who respond then will be about the same as now.
It is a choice to buy or not buy a device from a company. In this particular case the greatest threat is that devices that were always used by power users are a step away from denying us superuser privileges and instead putting a moderator in charge, like on an iDevice.
At the same time, Gen Z has way more choice and exposure to "computers", so worrying about them sounds like diminishing their prospects. Sure, adoption of computers, even ones that fit in your bag or your pocket, is much greater nowadays, which leads to a far wider range of usage, for better or for worse.
In the end same could be said about the Edison generation and electricity. In the end I barely have a clue how electricity networks operate. Can I experiment with it? Sure. Do I need to if I just want to power on my computer? No.
Next laptop for me is going to be from a company that explicitly supports Linux. Probably Dell or Lenovo.
A lot of people like to rag on Ubuntu. I agree with some of the criticisms. But the one thing they have done right is broker partnerships with hardware vendors like Dell and Lenovo.
Highly recommend the Dell XPS 13 if you enjoy the Macbook build quality. The edge-to-edge screen feels like the future.
I run EndeavourOS, which is like a GUI installer for Arch. Arch's package management with the AUR (Arch User Repository) will remind you a lot of Homebrew. And their wiki helps you get up to speed quickly.
I did that already with a ThinkPad T495. Couldn't be happier. With Arch Linux and KDE it's objectively far superior to Mac OS for development (I do mostly TypeScript and Rust).
The crazy thing is that I've been blind to the massive improvement that happened in Linux-land for the 7 years I was using Mac OS.
Of course there is still some PITA with missing productivity software (in my case for hobby photography), but so far I am determined to suffer with open source tools and maybe try to contribute a little.
Superfish was reason enough for me to never touch Lenovo, and all the other malware they've infested their systems with since has only made me feel better about that decision. Dell might be okay though.
Hah, I've become a skater relatively recently. It's my Zen place.
It's a good analogy: you have to want it. Either skater or hacker, any label really, must be earned with time and effort. True for all things, and those who have earned it can spot the pretenders a mile away.
Not the most productive comment, but I always wait 6-12 months before doing a major upgrade - I let other people find issues like this. Or just wait a month if you're in a hurry for some reason.
That's wrong; Docker will run fine on M1/Big Sur after an ARM release and probably some patching. Not too different from what you'd expect on any major OS release.
Apple showed it on stage at WWDC, along with Debian ARM running inside a VM. It's coming, if for no other reason than developers inside Apple need it when writing infrastructure software. (Honestly, I'm surprised that Docker didn't get the same pre-release hardware Parallels did)
Because M1 has virtualization and emulation support, unlike the A12Z (not M1) ARM processor on which the docker issue was opened. This is explained here: https://news.ycombinator.com/item?id=25073010
And Docker has the motivation and means to patch and build the M1 Arm version.
You can take it directly from the Docker people:
> Thanks. We're working closely on this with Apple. This is expected right now (and indeed there's nothing we can do about it yet) as the new chips don't have virtualisation support yet.
I find that statement a bit dubious, but I guess if you only use tools that are in the repo and Chrome that's probably easy enough to accomplish. I remember fairly recently having to compile my own dosbox because of a many-years-old bug that remains unfixed and causes garbage graphics in black borders [0].
My history of using Linux desktop is riddled with crap like that.
No, I expect you to buy a well built tool from a Linux-first manufacturer, where you pay a premium to get a device that "just works". This is, after all, the Apple way right? You pay a premium for a Mac because it "just works", right?
Doesn't even need to be "Linux-first." Our standard Linux laptops at Red Hat are Thinkpads, many of which are used by people who are not particularly technical and use a locked-down corporate build. (Many others run Fedora or CentOS.) I'm sure we have many thousands in use.
The Dell XPS 13 is supposedly good too although I don't have personal experience.
I use Macs quite a bit but the idea that running Linux on a properly-selected modern laptop is likely to be an exercise in frustration is just a myth at this point.
I was mostly being tongue-in-cheek, making a point that one doesn't have to buy a Mac to get a great software-hardware combination from a recognized manufacturer.
But you are absolutely correct, and my Linux laptop is a Thinkpad X1 Carbon. It's also well supported under OpenBSD but for what I do with it (mostly media consumption and some light gaming mixed in with the usual laptop stuff), Linux works out better.
Dell XPS 13 is a solid choice, network card/wifi card may have to be changed out, but they support ubuntu natively. Cheap to buy used and generally very good.
It's funny because almost nobody uses Apple where I live (Poland). The only people who use MacBooks are some "artsy" types and some managers (because they only need office and outlook to work :) ).
Developers never use apple because it has tons of compatibility issues.
Is it really the case though? I am also from Poland (though I do not live there anymore) and based on my experience it was rather a popular dev machine a couple of years back, easily seen at conferences. It is of course less popular due to pricing but definitely not absent.
Can you maybe also comment further on the compatibility issues? Mac is perfectly fine for any general-purpose frontend/backend development. Unless you work in a Windows-centric environment or in a setting that clearly rules out a Mac (e.g. Linux system programming), I cannot really imagine what compatibility issue would prevent you from using a Mac e.g. for Typescript, Java, Python or Go development. Not sure why Poland would be less friendly for developer Macs than the rest of the world.
It is a luxury brand in the US. Every tech person that has made it would never buy anything but Apple. Yes, things like battery life are great. But if you are seen working at a coffee shop, there is a 100% guarantee of that glowing Apple logo on your laptop.
An open computing device doesn't need to be unusable, and a usable computing device doesn't need to be locked up. There's a lot of space to explore between those extremes.
This is not the case any more. A few months ago I migrated out of the macOS ecosystem and got myself a ThinkPad X1 Carbon. I run Fedora on it. It works seamlessly, just like macOS did on a MacBook. Haven't had a single hardware compatibility issue to date.
I've had a Thinkpad and Ubuntu for 5 years.
The 'hardest' thing I've had to deal with is some flaky drivers with DisplayLink. I'm not at all putting crap on DisplayLink. It's been fantastic; I just upgraded to 20.04 and hit a few small issues.
Ubuntu's been really solid. Boring even. Which may not seem like a compliment, but I don't nerd out on OS stuff, I just want a computer that I can do stuff with when I need it. And I personally found previously with mac and windows my ability to 'just do work' wasn't there.
How could it be that people are all about freedom when it comes to governments but have no problem if private companies censor and control what they write and do?
Nobody wants to take away their right to do so (ok well maybe some do; I'm not one of them). Just that there should be rules we all agree to when we do.
We accept this in other venues. Seatbelts. Stoplights. Nutrition labeling on packaging. Etc.
No such rules exist for software, which is odd. It does seem to suggest the industry is ripe for regulation. It's always disappointing to me that we only accept limits when the law places them, as though ethics and morals are somehow not good things to pursue for their own reasons.
Which goes back to OP's point: just because a company CAN be as dystopian as they can imagine, doesn't mean they should be.
We already have general rules against fraud and other kinds of crime which apply to software just like any other product.
Only a very small set of products have special regulations, generally those which we deem unsafe for consumers unless regulated.
“Seatbelts. Stoplights. Nutrition labeling on packaging. etc.”
A good set of examples. Every one of them is about consumer health and safety. This is consistent with liberal democracy - people should be free to do what they want unless it hurts someone else. That is why all product related regulations are related to safety.
It seems reasonable to assume you are arguing for rules which prevent the installation of unsafe software on computers which are sold to consumers.
Apple would be the most compliant company with such rules, and would benefit greatly.
A touchpad is not really a proper tool for work (a laptop is hardly one anyway, except in cases which require it); it's more of a handy last-resort solution that comes included, and hence always available, with the laptop.
It won't change anything switching to RISC-V, if those same hackers just build SaaS placed behind a server wall with a pretty Web interface implemented in WebAssembly talking over gRPC.
The one thing that gives me hope is the pendulum of open->closed->open has swung back and forth so many times now that even as depressing as current trends may be, I'm confident things will swing back again.
Not sure when, but I'm certain it will.
It is one of those strange things: the more closed things become, the greater the incentive to open them back up again.
The hacker ethic is still out there, just this recent boom has dulled it a bit. A steady diet of locked down computing will fix that.
One of the major drivers is app developers. You can't build an iOS app without buying a Mac, and if you're buying a Mac anyway, you might as well use it as your main machine.
It's kinda abusive how you need to pay the hardware tax, the yearly $100 USD rent, and 30% of your sales to have the privilege of developing for their platform.
I'm glad Windows Server lost to Linux, so we'll at least be able to run our own servers for the foreseeable future. But I see platform layers being the threat there (AWS, Azure, GCP).
> literal cancers that are eating the software industry
Like all subcultures, when the hacker ethos went mainstream its spirit died and all that was left was fashion. No one cares anymore if their consumer choices are a threat to freedom as long as they look cool. Outward appearances are now more important than principles. Prickly principles require sacrifice, and why have character when you can have a disposable "personality"?
The hacker ethos didn't go mainstream. People who don't deserve the moniker started self applying it.
I'm not a hacker, I'm a corporate shill, because I work for my own profit within a system that's set up to ensure its own continuity. I'd like to be a hacker, but I have a family to support and am too weak, or addicted to lifestyle, or scared, or realistically-minded to actually commit to what's required to earn the title of hacker. Same with anyone working in advertising, marketing, or consumer data collection. Doesn't matter how intelligently you solve a problem in those fields; ain't no starry-eyed kids ever wanted to grow up to make the world a better place by more accurately identifying an individual's preferences on the internet.
Fuck fake hackers. The best thing I've done with my life is being a good husband to a teacher.
The ethos went "mainstream" in the sense that it was fashionable to play the part and be in your words a "fake hacker". And for future historians and those unfamiliar with the jargon, hacker ethos isn't about criminally breaking into computer systems.
And yet, Apple still has a reputation of being really privacy- and security- conscious. Windows' telemetry has had all the alarms raised for years now, and how does e.g. Canonical earn money?
Same on mobile, Android phones - especially in the cheaper ranges - are dodgy as fuck when it comes to privacy and security.
Does Apple really have literally all telemetry disabled by default, out of the box, on all devices? Is there an independent audit proving that? If so, I would buy all their products in an instant.
Then how do they know what to fix/improve without that? I can't imagine average iOS/OSX users, who aren't devs, are writing and submitting detailed bug/crash reports on a regular basis.
Or is it just bias and personal preference where "I don't mind Apple collecting my telemetry because I like Apple but Microsoft collecting my telemetry is evil because I hate Microsoft"?
I don't have a dog in this fight but don't trust any for-profit company when it comes to privacy, no matter how good their PR is on the matter, especially since both of them(and most SV companies) were part of the NSA Prism bulk collection program[0].
Apple gives you the option to turn telemetry on/off as part of the machine setup and OS upgrade process. This option is explained in simple English and not buried in the legalese.
There's a conundrum here that requires further investigation: the intertwining of the concepts of privacy and security, and how, in order to increase the security protections of their products, there must be a decrease in privacy. Security requires a probing knowledge of the details of what is running on the device.
The same information that can contribute to increased security for the end user is also valuable to sell to advertisers if, delightfully ironically, the security vendor is ok sharing private information with said advertisers.
You can't apply nearly the same heavy accusations to Canonical as you do to MS and Apple; Canonical makes the bulk of its money from Ubuntu Server. The OS is just a byproduct. If you really want to get technical, you could say that Ubuntu's popcon tracks you, but you can disable that from the package manager settings, because "oh no! they have anonymized package usage records of me!"
The PC world only happened due to a singular turn of events that IBM was unable to prevent, even though they tried.
Everyone else was vertically integrated, if anything Apple is the survivor of those kind of systems, and now everyone (OEMs) else wants that model back.
> people have to get used to seeing computers as closed-source devices with app stores
Consider all of the school age kids using chromebooks, many of them their 1st computer.
Will they care in 10 years, as adults, that their Mac doesn't allow programs to be installed from "untrusted sources"? Unfortunately, this transition probably won't faze them.
They won't be able to build a disruptive startup or a modest lifestyle business anymore without having to pay upfront and play by the rules of big tech. But nobody will be so they won't feel that something was taken away from them.
This actually describes my feelings about all this at the moment. I imagine that now and in the future, the two-Steves-in-a-garage scenario will be next to impossible.
> it's becoming the antithesis of everything that the original Hacker culture stood for.
A hacker in the “Hacker News” sense is more than likely working on an ad-supported web app and dreams of working at a FAANG. The old hacker dream is dead, sadly.
You're not looking hard enough, which is also part of the problem.
Back in the "golden years" you'd have to put effort in to finding and participating in the community. The real hackers are still somewhat underground simply because they're anti establishment; actively seeking out the high effort, loss making niches that attract fringe thinkers and the unique characters required for true innovation.
Crypto parties, maker faires, get to one. If nothing exists locally, build it and they will come. (And admittedly I need to take my own advice).
We have an open space workshop and hack shop in my town, and it’s frequented by quite a lot of people. You’ll see anything IoT, 3D printing and Linux you can imagine being hacked upon. You can also see people building race cars, or ideas that eventually turn into businesses.
Everyone has an Apple or Android device, and Macs easily make up 90% of the personal workhorse laptops. Because at the end of the day, technology that just works is nice too.
I get that it’s probably very different in crypto communities, but we don’t really have any of those around here. I wouldn’t say that real hackers aren’t using walled gardens though.
Long gone. The last time I had a conversation here about what a "hacker" is, there was a sizeable contingent who defined it as "someone who knows how to write computer programs".
If that hacker ethic lives, it's not in people labelled "hackers" anymore.
I know so many CCC people who have trouble making a living or are (un/happily) employed by companies that went all in on Apple, Google, Microsoft, AWS.
Many are certainly brilliant, but they fail to monetize it. To break or stop near-monopoly/walled-garden systems you have to dismantle them through the market: build better/cheaper (primary) alternatives which are free and open (secondary).
IMHO the mainstream only values convenience and price; freedom is only a "nice to have", and when it requires more work it is a non-starter.
Fortunately Apple has like 3% market share in the desktop market globally [1], so it doesn't really matter what they do.
[1] Browsing data shows around 10% market share, but it is based mostly on English-language websites, which are a minority (and English-speaking countries have a significantly higher percentage of Apple users than the rest of the world).
Even English-speaking countries don't have that high MacBook usage, because the premium sticker price is extremely high; unlike phones, where an iPhone is comparable to or in some cases less expensive than equivalent Android flagships, Macs tend to be several hundred dollars more than even their flagship laptop or PC counterparts, and from my personal experience a computer running Win10 is not that flaky compared to some Androids I've used.
But also Windows phones home, and the Linux desktop is still a shitshow (pardon the French), arguing over and over again about the same melange of disputes that should have been settled a decade ago...
Not sure if it’s a sign of aging, but I’m losing hope and interest in most of these pantomimes /s
Linux desktop. I just use what's presented and get on with my work. I don't have a particularly complicated setup, but I found something that works and I've been using it happily enough for over a year having migrated from Windows.
I don't know what disputes are being argued over, so it bothers me not in the slightest. Would it help if you just ignored the detail and worked with what's presented?
Dual booting is still a problem. I just tried to install Ubuntu as a dual boot on a separate SSD from Windows (which I keep around for video games) and I ran into 13 bugs (I wrote them down!) within two hours - including two which caused fatal issues in the installation process itself, which I had to run around eight times before it succeeded. Then it somehow permanently trashed Windows' ability to boot. After spending three hours trying to fix that, I said fuck it and nuked both OSs with a fresh Windows install.
This is the first time I've tried installing linux in about a decade, after using a stable dual-boot setup during university. Things seem to have gotten worse. It'll be another decade before I try again, probably. I lost two days of pay and probably years of non-grey hair dealing with this fiasco.
Really? I just set up LVM2 on my new Dell, partitioned some space, ran the Windows installer and it just worked(tm). I do still remember how hard it was to dual boot on my old laptop; I don't know why it was different for me this time.
There are many reasons to choose Linux over the alternatives that have nothing to do with one's ability to customize it.
One example is the simplicity of system management. Desktop oriented Linux distributions tend to take care of everything from application installation to updates with a single interface with minimal interruption to my work flow. This is only sometimes the case with macOS and Windows. Android, iOS, and ChromeOS are not realistic contenders in the application space for some users.
Cost of ownership is another factor. Linux may not make sense for some businesses if they have to hire someone to manage their systems, but an end user who can handle often trivial tasks can usually support their own system and benefit from less downtime. While paying for software is a good thing, it frequently adds many constraints on what can be done while modern business models can make licenses prohibitively expensive (e.g. subscription models or various forms of forced obsolescence).
Other reasons include: sometimes the desired software just works better under Linux since it was designed for Linux, a desire for privacy or a need to ensure confidentiality, compatibility with older hardware that is no longer supported by the vendor (but may be supported by open source developers).
Working with what's presented simply means that you are unlikely to modify what is shipped by the vendor. You can still add to it or benefit in other areas.
Most Linux distributions are also easier to audit than commercial operating systems and some go as far as encouraging it. While this is optional and requires a deeper understanding of what you're looking at, it does not require handing control over to a third-party (including the vendor). You can choose to trust the vendor by clicking a button, you can use integrated tools to choose what is done and when it is done, you can use those tools to audit what is done before it is done, or you can audit it down to the source code level. This is a far cry from Apple, Google, and Microsoft's approach where you have very little control and what little control you do have is complex to access.
To be clear: my commentary on "work with what's presented" was merely in response to the above comment about the Linux desktop shit-show. The following is about choosing Linux over alternatives.
I avoid Apple because of the slow descent into vendor / walled garden lock-in, which suits neither my wallet nor personality.
I have a couple of ChromeOS devices for the kids' schooling, but they're unwieldy for my workflow.
You didn't ask, but I moved away from Windows because of increased bloat, telemetry, and the dual issues of decreasing control over "services running" alongside a noticeable slow-down in performance frustratingly soon after a fresh install. I've also had two occasions (which is two too many) where Windows decided, on its own, to install updates and reboot, losing whatever I had open at the time, with no message pre- or post-update, just a clean login screen the next time I went to use the machine. I think they've made that better, and there are options to control how updates are dealt with, but I've already made the jump.
"what's presented" by Linux may not be perfect, but it's closer to my perfection than the alternatives I've tried (to be clear, this is what works for me, not necessarily anyone else).
People, outside of HN, don't care about operating systems. They are all good enough and have been for a long time. What people care about are applications. If you want to play games, you are probably going to buy a Windows computer. If you want to run Logic, you're going to buy a Mac. If all you really need is a browser and email, you are going to choose by other criteria like price or beauty.
Microsoft tried this with Windows RT. It didn't go down well at all. The slow burn might be what's needed, but if they do this, it could undermine the reasons people still buy Macs rather than just buying iPads. Let's see; time will tell here.
"What we poor souls didn't understand all these years is that" computing and information processing started out as tactical and then strategic components of military, intelligence, and industry. The central computing infrastructure ("the Cloud") and attendant closed/dumb psuedo-terminals (closed systems, walled gardens) is precisely the vision outlined in the 60s if not the 50s.
Two unexpected events in information processing technology threw monkey wrenches in the industrial and geopolitical policies of the strategic thinkers dreaming in DARPA, SRI, and elsewhere:
1 - Personal Computing was an unexpected event. It has taken all of 4 decades to put that genie back in the bottle. The topic of this thread.
2 - AI. This mainly threw a big wrench in the planned integration of Communist China into the Western system. AI enabled CCP to maintain control of the Chinese society, which was completely unexpected.
Having advanced general purpose, user programmable, computing machines networked globally in the hands of peasants is simply unacceptable to the folks who gifted humanity with these tools.
> 2 - AI. This mainly threw a big wrench in the planned integration of Communist China into the Western system. AI enabled CCP to maintain control of the Chinese society, which was completely unexpected.
Could you expand on this or better yet, point to some reading on the subject? This sounds like a very interesting topic.
Pure speculation on my part, trying to understand the nature of the relationship between Communist Party of China and Western power centers.
Back in the early 70s, when the deal that Kissinger and Mao shook hands on was made, there were no PCs, or pervasive networking and mobile communication and computing. It was always my speculation that the risky gambit of fast forwarding China's development and economy and freely transfering crown jewels of technology was made with the conviction that the process of integrating with the West would naturally create a power base in Chinese society that would push aside the Communist Party. Tiananmen ("June Fourth Incident") was a manifestation of this correct original prognosis of the planners of the deal, that it would cause political headaches for the CCP. A cultural instead of a kinetic overcoming by the West.
I think the consensus now in the West is that CCP armed with AI, pervasive networking, and the emergence of a digital society in the true sense, can stifle any possible organic formation of alternative poles of power in Chinese society. The only remining challenge to CCP were the Tech Mandarins (like the Senior Self Crowned Bozo in the West, a certain Bill Gates) and just this week Jack Ma was reminded as to who is in charge in New China.
And of course, Eric Schmidt et al. are now salivating at the "Chinese Model" and we're gonna get the same, regardless of what little you and little me thinks about it.
There is always a war on when property is not democratized. In the Anglophone world, the easiest example to think of is enclosure[1] which was a war between the mercantile class on one side and the lazier half of the lords and the little people on the other.
If any company, be it Google, be it Apple, should experience total "victory" in the war on intellectual property, they would find everyone who is not
* An investor
* In management
* An employee
* A temp/contractor, or
* A happy customer
set against them, and they would fall. Imagine there was a neural network or network of neural networks that could arrange bits into entertaining movies. It would print money, and it would not be strategic for whatever company had the resources to do that to hoard the benefits; they would have no customers if they did (I suppose if they had enough compute and memory resources to succeed in such a task they might already have a monopoly on money).
I used to understand the CEO's job as cheerleader, for the investors, for the employees. I still think that is how they operate tactically, but strategically, I think the goal of a CEO in an Obama-era neo-liberal sense is to put the whole world into one of those categories above, with the acceptable additions of
* Vendor/supplier
* Companies in partnership
* Competitors who participate in a trade association together in a reasonably civil manner, or
* Members of government who are lobbied to advocate for your firm or industry or a strategic need that you hold in common with the broader public
In the long run the alternative is to perish. Utopia I think may have been inevitable by the 50s/60s when people started to believe that some companies really would last forever. On a long enough time scale, it is best to do everything in everyone's best interest considered in toto.
So all of that to say that I expect one day, maybe 40 years from now, maybe 80, that IP as it operates today will be transformed into an index of ideas that everyone can take and implement who might put them to use beneficially for the citizenry and other residents. It will be nice after all of that work to have a primary key for the ideas that have been created so far, so I don't think we the people will burn down the USPTO or the LOC or anything like that.
I hate walled gardens, and in Microsoft land they did get a lot of flak, which I am thankful for. I doubt many Apple users have the foresight of a Stallman to see where their environment is headed.
There is money in creating apps for Apple systems, but you are giving up a lot in exchange.
Some developers use it for convenience, but I would not say it is widespread, and most are fundamentally opposed to using a Mac.
Our advertising department is the only one with Macs. Sadly, we still support Apple because every sales rep gets the whole suite: iPhones and iPad Pros (because the others are too small and you cannot work with them... what a revelation...), which is a few thousand bucks per employee that Apple gets. They even get the ridiculously priced pen ($120?) and keyboard ($300+). And our sales people still complain about not having notebooks, because those are superior devices. There is just the presentation argument.
Probably 90% of the developers I've worked with in the past 5 years used a Mac, and really wanted to do so. I know a handful that left Linux to do it, and they have told me they can't imagine going back. I'm not sure that you're right about developers.
I do know that my next machine will be Linux. I'm probably going to put together a desktop for my office, and I'll buy a System76 laptop or whatever for those times when I want to work from the porch.
Anecdote: I was on OSX for ~9 years, +/- 1 or 2. Before that I was on Linux (Ubuntu) and Windows for ~10 years. A year ago I made the switch back to the same (Windows + Linux) and I've been nothing but happy. I know people tend to complain about Linux on the desktop, but I had great experiences a decade ago, and I do again.
I always have complaints about my Linux system, until I realize that my Windows colleagues do the same. It's shit here, shit there, shit everywhere, I imagine. I personally have had good luck so far: each of my machines worked well with Linux out of the box, and I have to admit I've had around the same amount of issues as I did in my Windows days.
An estimated 1-3% of the developers I've worked with run macOS. Maybe it is a local or SV thing, but most I know prefer the freedom. In the industry I work in there aren't many tools available for Apple ecosystems.
IBM has nearly all of its developers on Macs at this point. I came from using exclusively Linux for 10+ years and have decided to switch my personal machine to a Mac and my phone to an iPhone because it's been so good.
The Mac has everything I liked about Linux but with zero cognitive overhead. With brew and a window manager (Amethyst) I'm not missing any features I enjoyed in Linux.
I hope that changes with the acquisition of Red Hat. I think they did it to help Red Hat, but they also wanted to buy the community of developers in my opinion.
Yet despite all the wailing and gnashing of teeth - both in the article and in the comments here in the various threads - people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges.
These companies have repeatedly shown that they don't respect people's privacy and people have resoundingly responded that neither do they.
I would suggest that the users here are probably some of the best suited in the world to help smooth those rough edges and even surpass the features and usability of the privacy-respecting alternatives. I just hope we collectively notice this before it's too late.
I'm not. Since the last few iterations of Apple hardware and OS software, I've decided to switch to something else. I'm also advising all friends and colleagues to do the same. There is simply no reason to stay with Apple anymore as a professional. It used to be stability and ease of use, but Windows has since closed that gap, and Linux is a close second (and a must for programming).
Most likely that means I'll run a Linux laptop for programming and keep a Windows partition on a gaming computer, cuz I love me some games. I know Steam is gradually getting there on Linux, but most AAA games still run best under Windows. Plus it's an easy platform (outside OS X) to run multimedia production tools. Sorry, I just don't think Gimp or other Linux tools for multimedia production are good enough yet, but the hope remains that they will someday compete with giants like Adobe. For real. And not just as free but woefully inadequate alternatives. Please correct me if I'm wrong here, though. I'd love to be surprised, if you are able!
I haven't used Windows in a long time, but it feels like "the gap" was closed by Apple, by letting their system degrade. The high-end 2019 MacBook Pro is in many regards less reliable for music production than my old 2012 MacBook Air, which is really, really sad. Unless Apple gets their act together, my next music production setup will be Windows-based (or maybe I'll even try Bitwig on Linux).
>it feels like "the gap" was closed by Apple, by letting their system degrade.
While there is some merit to this claim, I don't agree with it entirely. In my experience using these systems side-by-side for over 10 years now, it's pretty obvious that Microsoft has stabilized while "stealing" Apple's good parts, and that Apple hasn't really evolved any further than they got in about 2010. In essence, Apple has allowed Microsoft to close the gap, although by about 2005 Apple managed to become so high quality on the computer market, and so ahead of its time (by "stealing" the good parts from Linux) that it's hard for me to say how they could possibly keep that position for long. (I'm not saying that they literally stole things. Obviously they've done a lot of innovation themselves, but there are also some pretty obvious signs of "inspiration" going on too.)
There have for sure been downgrades with Apple too, but not really in regards to pure stability. In my experience most of it started in the 2010s, when Apple decided to turn away from the pro market to focus more on the trend consumer market instead. At first it happened slowly, with a few technical changes that put off web designers. However, the first major blow was against Final Cut Pro. I guess it turned out well in the end, but in the beginning they stumbled badly and gave away a large market share to Adobe as media pros started fleeing from Apple. Soon after came system changes that made Adobe crash more often (I'm sure it was just a coincidence...), only resulting in customers fleeing from Apple as a whole. Today Apple Macs are decidedly more of a tightly controlled trend brand than the prosumer brand they used to be. (Kind of like Hasselblad has gone from the photo engineer's brand to an almost purely luxury brand by now.) (Also I really miss MagSafe lol.)
> “the gap” was closed by Apple, by letting their system degrade.
100% agreed. And I would even go so far as to say that Steve Jobs had such a strong vision of how things could be, that it was his greatest mistake to put someone in charge who did not (Tim Cook).
Apple still is best for audio and animation applications and there are synergies in certain industries like game development.
But apart from that it is a bad deal.
Windows degraded itself with its bad store, telemetry and PaaS dream, but yes, as soon as they could, Apple developed in the same direction. Microsoft suffers from bad decisions at some management level.
Great tips! Thank you! I'm more into 2D compositing for video, but I've had my eye on Blender for some time. Not sure how it stacks up against stuff like After Effects. For audio production I already use Reaper FM, but I might give Bitwig a go too. ^^
I don't even bother dual-boot anymore and just use WSL. Windows has basically become what OS X used to be except better because it's actually Windows and actually Linux. It just goes back to Apple not caring one iota about developers. Focusing on developers used to be Microsoft's strong suit and they've positioned themselves well for that again.
You pointed out a core problem here, and a common blindspot with the HN crowd:
> continuing to use Apple/Google/Microsoft
That is probably 98%+ of devices that most typical people use for everyday computing, across phone, laptop, and desktop. Whether it is an android device, an iPhone, a Dell laptop or a new M1 macbook you are beholden to a corporation that doesn't care about privacy. I would argue that Apple cares a bit more, and I really hope they respond to the most recent shit show with some changes, but I'm not holding my breath.
And before "have you tried this *nix distro??" comments appear, they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis. And hell, if you setup Chrome and Gsuite as your default tools of choice, as many do (or are forced to because their job) how much are you really breaking free anyways?
I don't know the solution, but its clear that in the war between privacy and convenience and usability that privacy is on the ropes and its the last round of the fight.
You are totally right that Linux is only a single digit percentage in user share if you look at all computers deployed world-wide.
However, in the developer sphere, GNU/Linux is far better represented. In the Stack Overflow 2020 survey, 26% of respondents said they were using a Linux-based OS [0]. This number has been gradually increasing over the years, so further growth can be expected. It's about one percentage point below the macOS share of 27%, though I'm not sure how much noise there is.
So Linux is usable enough for a growing community of developers, and many laptop/computer vendors targeting the developer market offer GNU/Linux as a preinstalled option.
As a result, at least for software people, Linux is a real privacy-preserving alternative on the desktop. HN is a software forum, and the blog post was also written for people educated about software. So I think the post you are replying to should be understood not in a global context but in the context of the IT crowd.
Of course, it's not ideal that Linux is less of an alternative for people who need to use complicated niche ISV software, but generally any industry can make the move if there is a subset of users pushing towards GNU/Linux.
Every company I've worked at for the last 10 years has developed exclusively on macOS.
Linux is fine, but it doesn't just work, and it needs to in order to gain a foothold not just in the developer community but with the general public -- that's the lesson Android has taught us. Until it does it will not budge the numbers. It just won't.
The real apologism here comes from the Linux community, who believes that by virtue of existing, and not being an Apple or Microsoft project, it should be widely accepted in spite of being a distant third in usability. Linux needs to step up its game or it will forever remain a distant third. The general public does not care about "free and open source" -- heck they barely care about privacy at all. That's just us, the HN crowd.
I love Linux in theory, I use it a lot, but my daily driver is macOS. Frankly it likely will remain so even as I transition to being an Android engineer in the coming months.
Have you really used a Linux desktop recently? Sure, it might have some problems, but Fedora, for example, is in no way "a distant third in usability".
I haven't used MacOS for a couple of years but I can say for certain that Linux is not that far behind Windows and sometimes even ahead. I think the main problem is lack of application support but specifically for developers most applications are already there.
I use Ubuntu on an officially supported Dell enterprisey machine (E7400) and the experience is noticeably worse than on my Mac.
There are no multitouch gestures, there's screen tearing, 4K video playback on a 4K screen is choppy, sleep and wake up both take a long time, deep sleep doesn't work properly and I HAD TO DOWNGRADE MY KERNEL because of the stupid "Resetting rcs0 for hang on rcs0" bug that would freeze the machine for five solid seconds every minute.
I work on dotnet. I've not been able to run my Azure Functions on Linux at all. Even on Windows, I have to start the Azure Storage Emulator directly and then start func (using the full path). On Linux, Azurite is not the same thing as the Azure Storage Emulator.
So I must use Windows for dotnet development, even after four years of .NET Core. That being said, Linux isn't the problem. Gnome 3 isn't the problem either. Flatpak works great.
There are a few regressions once in a while, though. For example, I've been unable to use my Lenovo Flex 14's (Ryzen 3500U) built-in mic since I upgraded to Fedora 33. Looks like the issue will be fixed in kernel 5.9, but for the moment I'm on 5.8. But it has been great mostly. If I could use it for work, that'd be great.
> Have you really used a Linux desktop recently?
Someone has to say this literally any time someone mentions problems with Linux Desktop, along with "it works for me", "my grandma uses it!", "you just have to buy the right hardware", and "you're using the wrong distro". It is incredibly tiring.
It's a reply to the tiring cognitive dissonance of posts like GP that basically broadcast that 'oh, I would simply love to support open software and general-purpose computing instead of locked-down proprietary platforms, but alas, Linux is simply unusable you see, so I have no choice!'
Yet rather than take them at their word that Linux is unsuitable for their use case or seek to remedy those problems, people insist that it must be because of one or more of the reasons I listed above. Alternatively, they attempt to change the user's use case to one that fits Linux better.
It's extremely tiring to watch this happen for as long as I have been watching it happen.
They didn't say Linux is unsuitable for a particular use case or give any meaningful criticism or definition of a use case, they just said that it 'just doesn't work'. Is it any wonder people call out that tripe?
'just doesn't work' and 'doesn't just work' are two very different statements. You said the former, but I've only spotted people saying the latter in here.
Also, if you have to argue with someone that Linux does work for them, you've missed the point.
The implication being that it doesn't just work for them. They know that, that's why they asked if they'd personally tried Linux lately. Nobody cares if it "just works" for your grandma or whatever, because that doesn't help them at all.
Linux is hardly unusable, it's just a worse experience, and for something I spend 10 hours a day on and earn my living with, I want the best experience.
You probably could if they'd let you heh, it's really well optimized, and works great on 5 year old Apple hardware with much worse specs than a $300 modern Acer. This is especially true as low-end cheap devices tend to be much more "standard" -- no exotic hardware.
It would either not even boot due to a buggy UEFI implementation, or the manufacturer cheaped out on some piece of hardware for which there is no driver for anything except the OS release it shipped with.
In some respects, the hardware in high-end-ish laptops has moved nowhere in the last few years. Yes, they've got slightly better GPUs to handle 4K, Vulkan/SPIR-V and new media codecs, and faster SSDs once they got rid of SATA, but CPU performance has moved just by percents and it comes with the same memory. My wife still uses a 2012 i7 ThinkPad (T430s) with 16 GB RAM, and there are not many cheap machines that could objectively be called better. Certainly not in the $300 range. Heck, until Ice Lake, you couldn't purchase a laptop with more RAM unless you went into the luggable workstation segment, so it's no wonder that macOS works relatively OK on 5-year-old machines. They are still way better than a $300 Acer, despite that Acer possibly having some better paper specs.
How good the specs are doesn't matter for most driver bugs. Instead, it's important which components you use. And even if cheaper devices use "standard" hardware, they still use a much larger variety than the few models that Apple releases.
I use both, and they both have their strong and weak points. Purely system-UI-wise, I prefer Gnome Shell to Finder. Upgrading to Big Sur was a shock, the aesthetic went downhill.
> Every company I've worked at for the last 10 years has developed exclusively on macOS.
There are probably also plenty of companies which exclusively develop on Windows. You can have an entire career in the embedded industry, target Linux all the time, but never use it for development. Development is not all Linux based, I never said that. What I quoted were percentages.
> it should be widely accepted in spite of being a distant third in usability.
As you shared your anecdote, let me share mine. Recently I had to set up a new networked printer on both Windows and Linux. On Linux it worked immediately; on Windows it didn't and required installation of a manual driver. That installation was partially botched and removal didn't work. Only after a few attempts did it work, by which point I'd spent an hour on the problem. It's the same old crap you had with Win 95, but now on Windows 10, while Linux has a proper IPP driver and no need to install outside software. Maybe Windows has an IPP driver too, no idea, but it didn't work, which is what matters :).
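For the curious, a hedged sketch of what "no outside software" can look like on the Linux side with CUPS driverless/IPP Everywhere printing (the printer name and hostname here are made up; most desktops do the equivalent automatically from the printer settings GUI):

    # add a network printer using the generic IPP Everywhere driver
    sudo lpadmin -p office-printer -E -v ipp://printer.example.lan/ipp/print -m everywhere

    # print a test page
    lp -d office-printer /usr/share/cups/data/testprint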
Also, there are definitely distros that are more usable and ones which are less usable. I think there is a true core in the meme of people being lured to Linux by "use Linux, it's user friendly", then being told that Arch isn't that much harder anyway, and then ending up using one of the expert distros, which often break in subtle and hard-to-debug ways.
But yes, in many ways Linux could do better usability wise. For example, it's a nightmare to target GUI Linux and then expect your binaries to work for years. It's almost easier to just ship Windows software and then use Wine. Nvidia also has shitty drivers, and in general, the driver situation isn't perfect. But things are improving I think and the gap is shrinking.
Also note that there is an effect when more people use a piece of software, there is a larger market for people who make a living with improving that software, via support contracts, consultancy contracts, etc. So once desktop Linux gets adopted widely, it might improve in many ways from the current state, simply due to ecosystem size.
So I jumped to Linux two years ago. Since then, I have installed Linux Mint on five different machines, old and new. I just pop in the live DVD. It boots. I launch the installer. End of story. I honestly never ran into a problem. Not going back to Windows anytime soon. I don't see what's so great about macOS either.
This could have been written by an Apple user circa 2000, when the entire world ran on Windows.
I switched from Mac to Linux, and it's been fine, partly because I used a laptop that has Linux installed by the manufacturer. If you only ever used Hackintoshes you'd think MacOS was a mess, too.
Even better: switching to Linux on a Mac. You get a blazing fast Linux experience on a very sturdy, long-lasting laptop. I did just that. I love it.
Every researcher I've met in the last 10 years needs some version of Linux to get their job done effectively. Largely because tools like Python/Numpy have replaced stagnated (due to years of monopoly) tools like Matlab and they are much easier to setup and use on Linux.
Mind you, I am not talking just computer scientists. Biology, physics, maths, all use linux based setups.
No company (or academic institution) I've ever worked for developed _anything_ on MacOS. That's a US-only phenomenon, I think.
The thing about Android, it was essentially foisted on people, not chosen because it just works. Also, many Linux distributions today "just work" on typical hardware - almost as well as Android on your phone. And if you were using a hardware config chosen by someone else, and a Linux-based distribution pre-installed and pre-configured for that hardware - then Linux very much "just works".
It is true, though, that this is not enough to expand the user base far enough. That needs a lot of coordinated, collective, public activity.
> Every company I've worked at for the last 10 years has developed exclusively on macOS.
Counter-anecdote: I have had a Linux computer on my desktop for over the last twenty years, have used Linux primarily for maybe the last eleven, and exclusively for the last eight.
I do not get the near-religious love for Apple products. Pre-OS X, they really were wonderful, best in class without a doubt. I kinda get the attraction back in the early 2000s, when Linux could still be a chore and Macs were a decent choice to get a computer which had a shell and ran real software. But now? They are just not for me. I want to own my own computer, write my own software and control my own destiny.
You can’t control your destiny when it’s based on millions of lines of other people’s code, billion transistor hardware designs that you have never seen, fabricated by distant mega factories using closely guarded secret processes.
You may want some idea of controlling your own destiny, but it’s a mirage.
That is a sentiment that I partially agree with. I frequently argue that iOS (for example) represents more than a decade of investment, so we can’t expect an alternative to pop up overnight.
However using a Linux desktop simply is not a step in any particular direction. It is especially meaningless for non-developers to do if one of the other platforms just works for them.
Steps in the right direction involve developing software to solve the problems that the corporate operating systems have in fact solved.
It works both ways: Windows (or Mac) is meaningless to any user who can be served by Linux.
The first step in solving the problems that corporate OSes have solved is identifying them. For example, one of the problems is stable/versioned/sandboxable/distro-agnostic ABIs, and that is something that is being worked on; they are quite usable nowadays. The related problem is that corporate software ignores them (how many conferencing solutions support PipeWire for desktop sharing? Or Wayland in Chrome? Heck, VA in Chrome? Exactly... but whatever Apple comes up with, they support within weeks).
Is it? For a few things, yes (OpenGL or 32-bit deprecation); other things are simply not found in the new releases (Kerberos Ticket Viewer, TWAIN). Then there's a middle ground, where there is a window of a few months to make significant architectural changes to your product (see VPNs and personal firewalls).
So excuse me, I'm going to look for a new release of the Cisco VPN client that works with Big Sur. Of course, it is a customer's VPN, and they don't care that their vendors might use Macs, so no help from them; it is my problem now.
> Even if a small number of security related technologies have a smaller deprecation window, the others have very long ones.
It uses a mechanism that was introduced in Mojave, so that deprecation window was very fast. Definitely faster than Cairo, which already has more than a decade and a half under its belt.
There are alternatives to Cairo, but not drop-in ones; they would require porting (Skia, for example). Other than that, Cairo is basically "done". Yes, there is the occasional need for a release with small fixes, but as a software-based rendering library it is not going anywhere. Not that announcing deprecation is otherwise widely respected; the Xorg server was declared obsolete years ago, the last major release was in 2018, but hey, some still think there's a future and in a few years they are going to be surprised by "sudden abandonment".
What are you talking about? Mojave is 2 years old. That isn't a particularly small window. Even with the 2-year window on macOS, an alternative was delivered.
But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.
You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.
Is it done? How do you know? Does it need to be developed further? It sounds like it does, since people say it's very slow compared to Skia.
It seems like you think you know what the status of things are, but other people don’t. How can that be?
> What are you talking about? Mojave is 2 years old. That isn't a particularly small window. Even with the 2-year window on macOS, an alternative was delivered.
2 years/2 releases is a small window. An alternative was delivered, but it is not a drop-in one. Just like Skia vs Cairo, it requires porting/rearchitecting and might not have 100% coverage of what the old solution did. You might have noticed that in Big Sur there is a list of system apps that ignore VPNs and filtering. That's a problem.
> But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.
Because it works differently. There is no dictator (a vendor controlling access) telling you what you are going to use whether you like it or not, and presenting it as a done deal. Instead, it works by forming a consensus (not 100%, mind you) on whether something is needed or not. The community around a piece of the stack might arrive at the conclusion that a given piece is no longer maintainable and make something new (e.g. the Xorg server vs Wayland compositors, or systemd vs sysv-init), and then it will bubble through as distributions adopt it - either because the new solution is maintained and the old one is not, or because the new one solves their problems better than the old one.
> You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.
Supported by whom? In this case, the original maintainer is no longer around. But that's not a problem; distributions will support whatever they ship, so if you have an application that uses it, it will continue to work.
Should you use it for a new project? No. But that has been true for years. Cairo is a 2000-era software library. Consider it the equivalent of QuickDraw: you would not use that for new software either. Gtk (which was an issue here on HN recently) uses it for software fallback (and they've made it part of their API, so they've locked themselves in; changing it would mean breaking their ABI. On the upside, that means there will always be some support). Firefox abandoned Cairo years ago in favor of Skia, and so did LibreOffice in its last release.
So it shows that a library or piece of the stack might be deprecated for years, but there might be someone somewhere who really needs it and is willing to keep it on life support (exactly like Gtk needs Cairo). So the question for you, as someone who requires support, is: what level of support do you require and what are you willing to pay (not necessarily in a purely monetary sense; though when users are expected to pay for continued support, what's wrong with expecting that at a different layer of the stack?) to get it? Because you can always get it supported, or support it yourself if you really need to. No one is going to kick it out from underneath you. The source will still be available, and it is always possible to fix it when needed. You will never be completely denied access, as is the case with proprietary stacks.
> It sounds like it does since people say it’s very slow compared to skia.
It is going to stay slow (unless someone invests heavily in it, which is not probable). It is a software library, deliberately; if you want hardware acceleration, switch to Skia and accept the tradeoffs that come with hw accel.
> It seems like you think you know what the status of things are, but other people don’t. How can that be?
Simple. I asked.
There are many venues, where you can ask. These people will tell you. However, due to having no dictator mentioned above, their answer on roadmap-like questions might be: "depends on the community adoption".
In the previous comments, I've pointed to an alternative several times, including what other applications use. So yes, Linux does have a good option right now.
> Linux is fine, but it doesn't just work, and it needs to in order to gain a foothold not just in the developer community but with the general public
I understand the general public, but the developer community? Why can't the "professional-computer-whisperer" demographic be arsed to set up their computers?
In all seriousness: the extent to which you have to manually configure / fix stuff in Linux is seriously overstated, especially with the "easy" distributions like Ubuntu and Mint. I've installed both of those on other people's laptops and they've had no complaints at all.
The general public in Europe cares enough about privacy to pass the GDPR.
Choosing examples from the UK so people can search for them in English, they care when their healthcare data isn't secured, when dating apps leak information, when local councils abuse the rules to "snoop" on people.
If you explain that without a blocking extension (or Firefox's built-in one now?) most pages they visit are sent to Google, and Google maintains a detailed profile, they will ask how to install the extension.
You can say the common man doesn't care about free speech, but enough do care that we maintain it in Western countries.
> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.
Uh, no.
There are many reasons why people don't use Linux. The ease-of-setup-and-use lines are pure nonsense. Many Linux distributions have been easier to set up and use since at least the introduction of Ubuntu, and perhaps earlier. Just pop in the live installation media, try it out for a bit to ensure your hardware is compatible, then run an installation program that asks questions that make sense to people rather than marketers. Getting the applications you need installed has involved using a store-like interface for longer than software stores have existed on commercial desktop operating systems.
So why is it perceived as harder? One reason: users typically have to install the operating system themselves, while end users rarely see the process on commercial operating systems. Second reason: those who do install their own OS typically dedicate their system to commercial operating systems, yet use dual-boot for Linux (which adds technical steps). Third reason: those making the transition may try to install commercial applications via a VM or WINE due to compatibility, familiarity, or (occasionally) the lack of an open source alternative. In other words, it is perceived as harder because people make it harder.
As someone who has been using Linux for decades, I find the setup and user-friendliness of something like Windows far inferior to Linux. Yes, part of that is due to familiarity. On the other hand, some of it is inherent, due to there being fewer restrictions with open source software.
> As someone who has been using Linux for decades, I find the setup and user-friendliness of something like Windows far inferior to Linux. Yes, part of that is due to familiarity. On the other hand, some of it is inherent, due to there being fewer restrictions with open source software.
Non-tech people don't want (or don't know how) to "setup". The most user-friendly setup won't ever beat "no setup" (e.g., macOS).
Besides, marketing plays a huge role as well. Ask people to name a few "computer" brands: Apple, Microsoft, Google. No one would name "Linux". So, it's not just that people should be able to buy Linux computers with no setup (Dell is selling them, I think), it's also that these kind of computers should get enough marketing so people know about them.
> Non-tech people don't want (or don't know how) to "setup". The most user-friendly setup won't ever beat "no setup" (e.g., macOS).
I don't like this line of thinking because it is condescending. More importantly though, there are plenty of competent technical users like myself that would love to be using an open operating system but are fed up with dealing with the systemic problems Linux Desktop has that its developers and community keep papering over and pretending don't exist.
I'm going to let you in on a secret: most of the people who advocate a particular operating system are pretending that systematic problems don't exist.
Strictly speaking, Windows is a usability nightmare. Most people don't notice simply because most people learn and use a tiny subset of what is there. Beyond that, there is an entire industry to support Windows (which is part of the reason why people like it). Apple isn't much better. They tend to paper things over by simply dropping support. An example was brought up by another commenter when they mentioned that some *nix programs use raster fonts. Strictly speaking, that can happen under Linux yet not macOS, since Apple dropped support for legacy software while some *nix software is decades old. Linux will have its own issues, but I'm not the best judge of that since it is my operating system of choice.
At the end of the day, any operating system will be a compromise of some form or another. Which you choose will depend upon what your wants and needs are.
I don't disagree, Windows especially has been getting a lot worse as it embraces modern development and the users-are-cattle mindset that comes with it.
My problem isn't so much that there are tradeoffs, or even that those tradeoffs are not ones I want to make, it's that there are people out there who seem to insist that Linux Desktop is the one and only proper choice and everyone who doesn't choose it is either stupid or misinformed. If Linux Desktop people were more willing to listen to why people don't use it and take criticism to heart instead of as some kind of personal attack, it might have evolved into something more people would actually want to use.
It's not condescending, it's realistic. It's how I (as a tech person) approach most non-tech things in life. I rent because I don't want to deal with maintaining a house. I use public transit because I don't want to deal with maintaining a car. And so on.
I may be an extreme example in some ways, but everyone only has so many fucks to give, and most people have legitimately exhausted their budget by the time they get to tech policy.
Extending your analogy, I always feel like when I set up my Linux workflow (something I attempt and abandon at least once a year) it feels like someone dropped a car off in my driveway with the engine and transmission on the sidewalk and no instructions on how to assemble. Installing the "engine" always requires hours of browsing through comment threads written by random people on the internet that are years old.
This year is different though! I'm going to make it work. Also looking at Pinephone for my mobile.
I think another thing that contributes to that last problem about "computer" brands is the lack of knowledge of the distinction between operating system manufacturer and hardware manufacturer -
I just moved recently, and in the process of getting to know more people and accidentally becoming their tech support, I was surprised to see how many people didn't have that basic bit of knowledge. When pointing at a chromebook, people say "Google" without questioning why "lenovo" is also written on it.
> They can buy a PC with GNU/Linux preinstalled then.
That's the problem: non-tech people don't know they can do that. Besides, why would they buy a PC with "GNU/Linux" preinstalled if they don't know what "GNU/Linux" mean to begin with? Almost everyone out there knows what "Mac" or "Windows" means, and so they buy Apple stuff. Again, marketing.
Fifth reason - many many laptops have at least some piece of hardware or behaviour that requires fiddling to get to work on major linux systems. Sometimes it's power management, sometimes it's running with multiple monitors or 3d acceleration or the touchscreen, sometimes it's getting the touchpad to work properly or the fingerprint reader.... In my experience, there's usually something.
While this may be true, we should still be impressed by the number of laptops Linux operates flawlessly on. Very few people would expect Windows to run properly on a MacBook and far fewer would expect macOS to run on a generic PC. Yet people expect Linux to work flawlessly on almost anything thrown at it. From my experience, it does a very good job of this (at least in recent years).
From a technical perspective it's a marvel, but as a user, "works flawlessly on the device you have," beats, "mostly works, but with daily annoyances, on your device and many others."
I started to reply earlier in the thread and then decided not to, but basically what I wanted to say was this.
I have used Linux as a daily driver desktop for years, but I don't just buy random hardware and expect it to work. I do some research first and make sure what I buy works well enough with Linux to meet my needs.
> In other words, it is perceived as harder because people make it harder.
If by "people" you mean "Linux Desktop developers". They have taken relatively simple things and abstracted them two or three times until they are so complex and fragile they need special programs and standards to automagically manage it all for the user. As soon as you step outside the box of what they expect (hey, why can't I install a program on a different disk than my OS?) it is revealed for the Goldbergesq garbage pile it is.
> hey, why can't I install a program on a different disk than my OS?
Apple user here (had Linux on the desktop for 5+ years and windows for a longer time):
Why would I care whether my program is on one disk or another? Why would I even "install" something, instead of dragging/dropping a single "file" (yes, I know it's a directory) like I usually do? Do Linux applications all self-update using Sparkle like Mac applications do?
I also love Apple-music where I can listen to everything on all my devices, can I have that in Linux?
Does Linux have the same beautiful font rendering that I get in OSX, or do I get ugly non-antialiased bitmap fonts here and there?
I also haven't used installation media for operating system installation in many, many years; everything just works via the internet. I input my Apple ID on a new Mac and everything syncs perfectly without any issue. Do I get that in Linux? One login and my mail, notes, calendar, photos, music, files, backups, keychain sync across all devices?
And also simpler UI questions: Can I right-click on a file/folder and it says "Compress to zip"? I can have that in Windows, can I have it in Linux? Can I easily create an encrypted bundle and mount it in the UI? Can I drag the little icon at the top of any open window into any file dialog to rapidly access that file?
Can I easily configure keyboard shortcuts system-wide?
Can I have a photos-app that is as good as Apple Photos?
I think OSX is great. It feels more and more locked down, that is true, but I haven't had any real issue with this yet. I still can develop whatever I want and run it if I feel like it.
Linux is not a single thing. There are many different distributions and combinations of software you can have on any of them. You get what you choose.
There are like 7+ different file managers that you can install and use. One of them probably has compress to zip. Some of them are much more powerful than finder in OS X. I use `mc` and I don't even need to click. I just select a folder <F2> <Enter>, and the folder is compressed. Two pane file managers are great. I love them ever since MS-DOS times.
The fact that I can use bitmap fonts is the major benefit of Linux to me, because I don't use hidpi screens. I can have small fonts that don't suck to look at, and are perfectly crisp. I like that I have that choice.
All of those automatic sync things are an anti-feature to me. I was given an iPhone for webdev testing, and was snapping an odd photo or two with it from time to time, including photos of my ID documents, without realizing that it by default uploaded everything into some stupid cloud. Great, now my state ID is with Apple.
The few times I've had to use macOS for development sent me running the other way. Why would I search online and download files, drag and drop them around to "install" them, and sometimes click through next-next-next-finish dialogs, like on Windows, when I can just type a list of programs I want installed and have them downloaded and installed for me with no fuss, while I do something more meaningful? I can install 10 different programs I need for some project in a single command, and it's such a time saver.
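To make that concrete, a hedged example of what that single command can look like on an apt-based distro (the package list here is purely illustrative):

    sudo apt install git nodejs postgresql nginx ffmpeg inkscape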
To some people Mac features may be great, to some they suck horribly.
> Why would I care whether my program is on one disk or another? Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do.
The answer to your second question is very similar to the first one. Why should the user drag anything anywhere and decide which folder to put applications in? It is much easier to click the install button in a store-like application.
Many users are confused by dmgs, and they keep launching apps straight from their ~/Downloads folder.
> Do Linux applications all self-update using Sparkle like Mac-applications do?
Linux applications were auto-updated by their respective package managers before Sparkle was a thing. I consider keeping a Linux system updated to be much easier than keeping a Mac updated; even if it is updated by multiple mechanisms underneath (apt/dnf, ostree, flatpak, fwupd), from the user's POV it is unified in the form of Gnome Software or KDE Discover. On a Mac, you have the Apple App Store, Sparkle, brew, Microsoft Update, Adobe Update, the Eclipse updater, and a myriad of other mechanisms specific to each app.
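If you'd rather skip the graphical front end, a rough sketch of the same thing from a terminal on an apt-based system (just one way to do it; the Flatpak and firmware lines only apply if you use those mechanisms):

    sudo apt update && sudo apt upgrade    # distro packages
    flatpak update                         # sandboxed desktop apps
    fwupdmgr refresh && fwupdmgr update    # device firmware via fwupd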
> I input my apple-id on a new mac and everything sync perfectly without any issue. Do I get that in linux? One login and my mail, notes, calendar, photos, music, files, backups, keychain sync across all devices?
I don't get that on a Mac. But then, I'm not locked into their products.
> And also simpler UI-questions: Can I rightclick on a file/folder and it says "Compress to zip?"
Yes, you can. In Gnome (the default for Ubuntu and Fedora), you can right-click a folder and there is a "Compress..." menu item. It will give you a choice of .zip, .tar.xz, or .7z. I'm sure KDE has something similar.
> Can I easily configure keyboard shortcuts system-wide?
Easily? There's no system that does that well. Some are more flexible but more difficult to configure, and vice versa. macOS falls on the less flexible side; for example, I've never managed to have the media keys control VLC instead of launching iTunes, or Apple Music nowadays.
> Can I have a photos-app that is as good as Apple Photos?
This is about the first time I see 'good' and 'Apple Photos' used in the same sentence.
> Why would I care whether my program is on one disk or another?
A lot of reasons. Maybe your OS disk is on a ludicrously fast SSD but is space-limited and you don't want to waste that space on applications that don't need it.
> Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do.
Agree wholeheartedly. It makes perfect intuitive sense: the application is exactly where I think it is, and if I move it somewhere else then it is exactly there, and if I delete it then it is exactly gone.
I got fed up with Win10 last year and finally switched to a Mint distro. I run a software development company that works on the MS stack. I stuck with Mint for six months and finally switched back to Windows.
I didn't find it particularly difficult to set up. Some things are much easier (TAP drivers for connecting to multiple VPNs and running side by side remote sessions to difficult workstations across different networks, for instance). Even setting up a Windows virtual box was no biggie.
There are two reasons I switched back: 1) the lack of consistency across applications hampers productivity, and 2) the lack of stability in the applications/tooling hampers productivity.
I gave myself 6 months to let myself get used to it, and there's a great many things I loved about it, but in the end it was less of a hurdle to deal with Win10's oddities than it was to deal with various inconsistencies and the "usage maintenance" of keeping a suitable, productive workspace across Linux (and yes, I realize a large part of this was due to the fact I essentially run an MS-development shop and therefore have to work with MS tooling, but on the other hand, everyone working in any end-user business scenario has to work with Office documents, etc.).
There's installation and then there's setup. I would agree with you that installation on linux is easy these days, but setup... no.
I have tried moving over to linux several times throughout the years, but setup was always a huge issue. Installation was easy, but things like trying to get my aging peripherals to work properly, and trying to get font rendering not to look like absolute ass were always a huge hindrance.
But seriously, I guess it is possible to use a USB thumb drive on my Mac with only Thunderbolt ports, but I never have and it’s never occurred to me. I suppose I still have one somewhere and maybe a dongle that would let me do it.
which is still a pain in the butt because you have to own, find, backup, and completely wipe a thumb drive to do it. Or you have to go to the store, because who uses thumb drives anymore?
MacBooks from 2019 have four Thunderbolt 3 ports and one mini-jack. And that's all.
If you want to power the device, you do it through Thunderbolt. If you want to connect a USB device that is not USB-C, you have to use an adapter and/or USB hub. So you would have to connect to your USB drive through an adapter/hub, and if for some reason the installer didn't support this you would be screwed (but I guess most of the time it would be doable, just inconvenient).
> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.
This isn't true. This perception continues to hurt the Linux community along with others. Use a properly supported -buntu distro and you are good.
>Use a properly supported -buntu distro and you are good.
Honestly it sounds insane but I've started recommending Manjaro KDE.
Stuff breaking due to the rolling release seems to happen less often than the other stuff newcomers to Linux would otherwise have to deal with.
Like adding a PPA or struggling to find a package; Manjaro lets you easily add the AUR, snaps, flatpaks, what have you, without a care about the competition between those.
The install is painless. KDE's stuff feels intuitive and familiar to windows users and is customisable enough you don't have to deal with some minor personal preference annoyances.
Personally, I gotta disagree. This is still absolutely true, even for large distributions.
Every 2 years or so, I give Linux another try on the desktop, and it always ends after a couple of weeks, because I'm just tired of fixing mouse and keyboard settings, touchpads, HI-DPI and multi-monitor settings, font rendering and video codecs.
I've tried Ubuntu, Mint and recently Manjaro and – don't get me wrong – all of them had great stuff in them and worked fast out of the box, but it's always the last little details where desktop Linux fails for me.
Could also be selective memory. "I can't recall having any hardware related issues since, oh, 2005 or so" --> oh, that issue? Yes, of course I had that, everybody has that, that's par for the course...
The last time I attempted was 2017/18. Pretty plain clone PC desktop - MSI motherboard, Ryzen 1600, AMD Radeon 580, etc. I bought a couple of different WiFi dongles to make sure the chipset was supported.
Yet the problems remained. Maybe when people say they had lots of problems and it wasn’t worth their time, take them at their word?
And LTE modems. And sleep and suspend. Oh, there is also this bug with screen locking that shows your desktop for a few seconds. Oh, and with dual graphics chipsets too.
> This isn't true. This perception continues to hurt the Linux community along with others.
The continual denial, constant evangelism, and needless condescension, which I have watched for 20 years now, is what hurts the Linux Desktop community.
Setting up Ubuntu was easy enough. But the file manager locks up regularly, switching between multiple windows of the same app with alt+tab doesn't work reliably, and I actually had to Google how to create a new folder. The stock picture viewer is lacking information such as ICC profile and when I installed the recommended Gimp, it was a mess of windows and the save dialog was horrible.
I suffer through all of it because I need docker with gpu support, but it's honestly a lot worse than what I expected.
You can activate single-window mode in Gimp under Windows -> Single-Window Mode.
I would generally recommend Kubuntu over Ubuntu -- it seems like Gnome aims for minimalism somehow, and has thrown out a lot of functionality that used to exist, and generally tends to emulate OSX for a lot of things. (E.g., OSX doesn't do alt-tab for multiple windows of the same app either IIRC.)
BTW, Kubuntu's default image viewer would likely be Gwenview -- You can get it to show a lot of information by clicking "More..." in the Meta Information panel and selecting stuff in the resulting dialog window. Here's a screenshot I found on the nets: https://www.fossadventures.com/wp-content/uploads/2018/05/Gw...
I have not had any of these experiences with Ubuntu. Installing an OS on any new personal computer is much more difficult than when they had dvd drives. But once installed on compatible hardware, the Gnome 3 desktop use is nearly the same experience as Mac OS. Installing software is usually a matter of searching the software menu or downloading a .deb and double-clicking to run, and the New Folder button is prominent in the file window.
The office suite is easy to use. Thunderbird is as easy to setup as Mail. Networking and devices are easy to configure from the gui, and Firefox and VSCode run without crashes. Proprietary video drivers for common hardware can be downloaded and installed via the gui. It's at least as usable from an admin standpoint as Windows 10, and from a user standpoint as Mac OS, albeit with a much smaller "app store" experience.
That should be Nautilus already and you should have the "create new folder" context menu on right click. I also can't imagine anything going wrong during setup that leaves you in that state.
One thing with Nautilus is that if there is no empty space to right click on, you can't get to the context menu, because you always end up in the context menu of one of the icons. There is no dead space between them to click on. In this case you can click the breadcrumb to get the context menu or use the icon in the overflow.
I’ve worked out that breaking free from this is just not for the average person. If you have the power then stand outside and look in but there’s absolutely nothing you can do about the churn other than watch it burn. I’ll carry on educating people but even suggesting this might be an issue is drowned out by the marketing machines of the large vendors and downvoted into oblivion even here. That is unless there is some large story that something fishy is going on. But a big story is made of a thousand small ones which people don’t care about.
Look where fighting Facebook and political influence got us. Nowhere.
>>> And before "have you tried this *nix distro??" comments appear, they are still too hard to setup
I think the reason Apple, Android, and Windows have the advantage is that none of those three OSes require installation by most people. Why treat Linux differently?
>>> and nowhere near as user-friendly for most people on an everyday basis
User-Friendliness, I think, is more on day to day interaction. It will be interesting if there is a study in an area with low computer/smartphone interaction. Split them into 4 groups. Each group has their own OS. Later on they will be tested with some tasks.
I bought a Purism laptop with Linux installed, and it was definitely as simple to get going on that as it was a Macbook.
Actually even easier because there's no "You must have an AppleID to use this machine - if you don't have one go and register now" step, and no associated problems with that.
> And before "have you tried this *nix distro??" comments appear
Your style of comment is also quite the meme at this point, though.
I always reply with: if my dad, mom, grandfather and other 10+ members of my family can use Mint and I have to do minimal maintenance for them, much less frequently than with Windows, then so can you.
Did I say definitely, literally every, and could possibly? Notice what I am replying to:
> [..] they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.
I think my comment is a fair rebuttal of nowhere near and most. If it was true, all of my family members would have to be outliers, which is statistically improbable. The more credible explanation is that it is not really true (and in my opinion, it clearly isn't).
I think you're being disingenuous about what constitutes "most people" in the context of desktop computing, but alright, fair enough, I seem to have read more into your comment than was intended.
> Yet Dell and Lenovo offer computers with Linux pre-installed, but so many technical people ignore them because Apple is shiny and fast.
I got a Lenovo last year with Ubuntu preinstalled. It was the "blessed hardware". Wifi didn't work after wake/sleep and my connection via USB-C to my monitor was unusable. I switched back to a Mac because I wanted to work, not fix my tools.
These aren't one off anecdotes, there are dozens of people in this thread with the same story slightly different. This isn't people going to Apple because it is shiny, it's going to Apple because managing hardware isn't their job.
Either verbally address that meaningfully or drop the condescending attitude.
A number of users stopping using these products won't change the slightest thing. We all encouraged our friends and family to ditch IE for chrome, and look where that got us.
Legislation and regulation are the only fixes for these practices. The EC have a lot of irons in the fire here - on multiple fronts. I trust that, although it will take time, "fairness" will prevail.
> We all encouraged our friends and family to ditch IE for chrome, and look where that got us.
I didn't. I encouraged friends and family to ditch IE for Firefox and nowadays I encourage them to ditch Chrome for Firefox. I never cared that Chrome was a little faster for some time.
For one person who only uses their (very old and underpowered) computer for mail and printing documents, I helped install Linux (not Ubuntu ;)) and since then they are much happier, because it doesn't get slower with each security update and needs less maintenance help than with Windows.
> We all encouraged our friends and family to ditch IE for chrome, and look where that got us.
Antitrust entities bashed microsoft for shipping IE and look where that got us.
If anything, it's worth keeping in mind that volumes of regulations written for big corps may indeed force them to behave, but are an impediment for tiny companies to enter.
Corps like Microsoft do call for regulations and copyright laws all the time; they have an army of lobbyists, and they know that even without regulatory capture these extinguish competition.
> "...people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges."
I found myself being one of those people, so I made a commitment to no more Microsoft or Apple for my personal devices[1]. I'm eagerly awaiting PostmarketOS to reach full functionality on the PinePhone and I'll give up my Android phone at that point, taking Google out of the equation as well.
I'm even considering trying to learn C so I can contribute to OpenBSD for my hardware that is "rough around the edges".
[1] My wife's PC with Windows 10 is exempt for now; she's not a techie person at all and I don't want to force her into something alien. She tends to borrow my Thinkpad with Linux every now and then and is getting more used to it, so we'll see how that goes.
Apple has NOT been the same since Steve left. It is MS 2.0 now!
With better design and products, of course; nonetheless the business practices are the same. Much of this is enforced by the current economic and political system.
Not MS. MS today is much less into the anti-user, anti-privacy, rentseeking game, compared to Apple or Google. Apple today is M$ 2.0, as much as people here hate that trope.
Only because their attempts at copying Apple and Google fail miserably.
Otherwise, here's Windows 10, have it for free, hell, here's Linux on Windows, just stay on the OS, buy apps on the Store and see these ads! Also give us telemetry.
Side note: no one cares, but I finally tracked down the cause of a really annoying bug, hibernation due to "thermal event" (CPU "overheating") in Windows 10: It's because of using dynamic disks in mirror mode (perhaps in other modes, as well). Makes Windows 10 hibernate randomly. Converted disks back to basic, no problems. Took me a long freaking time to finally solve this.
The privacy-invasive tracking in Windows 10 is much more obnoxious and tricky to work around (since it is spread out and hard/impossible to disable). This tracking in macOS is at least under the guise of security, whereas the Windows tracking is pure telemetry.
MS is still the same, they just matured a bit. They just have to behave better in the open source community to attract the developers. I don't believe for a second that they are doing it for the benefit of information sovereignty or the development of tech and software.
Their backdoor left a really bad taste in my mouth. Will try to avoid them if I can.
> I don't believe for a second that they are doing it for the benefit of information sovereignty or the development of tech and software.
They're doing it because the cloud is Linux and they need to get into the cloud because Nadella's vision for the MS OS going forward is a unified system that spans PCs, phones, tablets, IoT, everything, not just a desktop OS.
> "I would suggest that the users here are probably some of the best suited in the world to help smooth those rough edges"
How would you get around the problem that many vocal Linux users think easy-to-use software is for weak losers, and consider difficult software a sign of manliness and an identity, even a religion? This is like saying Buddhists would be the best suited people to disarm gun-lovers, without considering that said gun-lovers don't want to be disarmed, and see pacifists as weak and unmanly.
Not anymore. Consumer usage nowadays is very browser centered, which is something one can experience with a Linux distro on pretty much any machine.
I agree that it's not as much as a polished experience, but that's the point as indicated - trading off polish for ownership.
I'm excluding from the discussion professional documents editing - compatibility with Libreoffice is mediocre (at best) in my experience, but having very good compatibility is not a consumer requirement IMO.
> Consumer usage nowadays is very browser centered
But it's also just as likely that you're using a browser made by Google, and using other services made by one of the big companies that don't care about your privacy. Where's the difference?
Google's not in control of the CPU, OS, or Desktop. It's not fingerprinting every binary you run, nor locking you into any of the google tools. You could easily use a different email service, different photo service, etc.
There's some confusion in the thread, in the fact that there's a mixup between operating systems and programs (or classes of programs), however the parent was referring to operating systems:
> The alternatives are still completely unusable by most people.
> Not anymore. Consumer usage nowadays is very browser centered, which is something one can experience with a Linux distro on pretty much any machine.
Or their smartphone, which is a hell of a lot smoother experience. So why does Linux Desktop continue to pride itself on being "good enough" for an "average user" [0] that doesn't even use a desktop?
Well, the alternative can't rely on an unpaid workforce while Apple/Microsoft and co spend billions on their software, GUI and user experience. It is simple economics. Open source needs financing, not just free GitHub repositories that MS/Apple and co can then exploit for free because the code is licensed as MIT.
I think this is something we need to fix with education. We spent so long meeting the user where they are that it developed an expectation. For any piece of technology, its peak function is always reached when the human meets it halfway and makes a concerted effort.
Whenever someone says “I don’t get computers” or gets angry at their smartphone they’re just leveraging that expectation and conning themselves out of a gain they could make. Same problem - laziness and expectation. At the end of that path is a life of a sharecropper and a cow to milk.
I disagree. It's a matter of taking a couple of minutes to educate people. You don't have to be a CS major to be able to use a modern Linux OS. "Most people" don't do anything on their Windows or Mac computer that they couldn't do on Linux. IMO it's a combination of fear of the unknown and a lack of general availability.
Macs are shiny and their marketing department is fantastic. Windows has a monopoly with OEM preinstalls. Those are the main reasons why macos and Windows are so popular.
I've tried using Windows, it's "completely unusable" to me. I couldn't even figure out how to install it without creating a Microsoft account nor could I figure out how to turn off "telemetry".
> It's a matter if taking a couple of minutes to educate people.
While it sounds nice in theory, is just unrealistic.
We'd also need to educate people why they cannot use random proprietary rental application on their mobile device that's not google or apple, or connect to a university vpn that doesn't support linux properly. Many workplaces use zoom for meetings, or google suite, etc etc
Exclusively switching to open/ethical software starts to resemble living like a hermit in the woods, and in reality makes one's life more difficult, reducing the impact the person can have on others.
Android handles this reasonably well. I can use F-Droid reproducible builds where possible and fall back to Play Store (or Amazon App Store) apps for apps I don't care that Google (or Amazon) knows I'm running. I get to choose where I want to make the trade-off between privacy and convenience. If I want convenience, it lets me go further than any Apple product, and if I want privacy, it also lets me go further than any Apple product in that direction also.
For phones, anything that gets updates directly from Google (Pixel and Android One) will update quickly and will have community support for LineageOS for many years after official support ends. Unfortunately, Google makes absolutely terrible hardware decisions, but nothing comes close in terms of software value, which is far more important in my experience.
Keep in mind that some of the user friendliest machines today are chromebooks. Simple, not many knobs to turn, relatively secure, and of course browser centric. ChromeOS is of course running the linux kernel and the chrome web browser. The experience is very similar to installing ubuntu, google chrome, and dragging the other icons off the launcher.
Bollocks. I've installed Linux for non-technical people any number of times and, provided I make sure everything (network, printers) is working for them, generally never hear from them again. At least not for computer help.
Most people are using their (personal) computers for browsing, maybe an email client if they're not using one of those web-email abominations, the odd bit of word-processing and watching movies/listening to music. Linux works just fine out of the box and the install is substantially simpler than anything I've ever seen over in the MS world.
Right, they just need an expert to install things for them to get started... that's not simple enough for people who don't have an expert they can call.
Why could anyone's auntie do it when they needed MS-DOS or Windows installed in the 90s, and they could not do it any more today? The 'expert' being a neighbour, the little nephew himself, or whichever friend of the family.
> These companies have repeatedly shown that they don't respect people's privacy
I mean, this is just plainly not true.
Apple has done many things, things they did not at all need to do, in order to protect user's privacy. They have shown, repeatedly, that they do in fact care.
I need XCode to build software for iOS and OSX, and there isn't to my knowledge any other feasible, performant and off-line capable way to do that beside running OSX on a Mac.
This is the only reason I had to move away from (arch) linux and it saddens me every day.
>Yet despite all the wailing and gnashing of teeth - both in the article and in the comments here in the various threads - people are still making excuses for continuing to use Apple/Google/Microsoft/et al products because the alternatives are a bit rough round the edges.
Or we like what Apple/Google/Microsoft are putting out, including the "walled" nature of the iOS/macOS. People's homes are also "walled" for a reason.
Your concerns are not everybody's concerns.
That said, I don't like the telemetry and information leak as described in the article, and would want Apple to stop it and allow disabling it.
But I also don't care for using another OS just because "it's actually yours and you can do anything you want with it". That's also true for TempleOS, but I wouldn't want to use it (to use an extreme example of an alternative), and similarly, I prefer macOS and the app ecosystem to any Linux distribution I've seen (and I've used UNIX when there were still UNIX wars and SunOS workstations with black and white monitors, and Linux for close to 25 years).
It's super infuriating reading posts from developers who are like "Well I would love to use Linux, but I can't because the touchpad drivers don't behave exactly the way they do under MacOS..."
It was super frustrating until I got used to the Linux way of handling, and then the other system was the weird one. Companies don't mind this effect at all I think, it increases their retention.
My main workstation is Linux for a couple of years now. I have dedicated PC for Ableton and I disabled all known telemetry calls. I got rid of all Apple products except iPad 2.
I was looking into various differences among demographics which use and prefer apple products. Nobody rated any of the problems often thrown in here as a priority.
The majority of iPhone users are women, and HN/Reddit are primarily male-dominated. It should be no surprise that apple caters to aesthetics over thermals and that you see people complaining about that on these platforms.
Apple customers put status and prestige above cost when deciding to purchase something. They are rich. They care about the environment, which is why apple "cares" about the environment. They care about privacy, which is why apple "cares" about privacy.
So for them, it's reasonable to change the device than allow it to be repaired at some shady shop. It's consistent with their image. They can't compromise on privacy or safety because that's why they bought into apple ecosystem.
I sold iPhones for years and never noticed this trend. I highly doubt it is true in any significant way. I would also challenge the assumption that women want aesthetics over some other factor. In this day and age, that's an incredibly 'old fashioned' statement to make.
From what I could tell, people bought iPhones because they were:
a)Cool and of good quality. (Your everyman can just go buy an iPhone and KNOW it won't be junk. They don't know the other brands and researching stuff is a pain in the ass)
b)They were used to using them and didn't want to use android/didn't like using android phones
c)They had shit experiences with other platforms and just wanted something 'that would just work'
d)The iPhone really was the first good smartphone on the scene. They have inertia associated with points a->c above.
> that's an incredibly 'old fashioned' statement to make
If spending in the cosmetics industry is any indicator, women definitely care more than men about how something looks. There are hard numbers, even if you don't believe in the various evolutionary theories and biological differences which make women more prone to noticing subtle visual and design cues.
I am not aware of any large-scale recent surveys focused entirely on that, but there are many dating surveys which report women having a preference for men with iPhones. Likely dubious at the numbers most of them report (70%?) and possibly influenced by other factors, but it still seems women do prefer iPhones more.
I don't think this should be odd. iPhones have the best overall cameras every year and are huge on Instagram and Pinterest (platforms dominated by women) so I can only extrapolate the difference would be bigger now with social media booming.
I will state it more plainly than 'old fashioned'. I find your assumptions about what Women want from their tech products offensive and from what I can gather, just based on you wanting to believe it really badly.
In addition to that... I actually sold the damn things to actual human beings! I KNOW who is buying them! I've met them by the thousands! It's like approaching a mechanic about why Fords don't break down by pointing vaguely at some tables.
Your data doesn't particularly support your views either.
If you wanted to compare purely biological imperatives, then you'd only take data from cultures that treat women and men wearing makeup equally.
As such - men spend a lot on fashion and grooming, while the same can be said about women and cosmetics... but that's mostly about cultural customs.
PS: Only the latest iPhone has best camera. They traded places with Pixel, Samsung, etc... over the last few years.
PPS: Your stats are a little bit out of date... Let alone - the 2013 data contradicts your claim. Women are distributed between Android and iPhone equally in 2013. There would be a shift towards iPhone, if women wanted iPhone more.
For cultural customs, I am not sure which country to take. For US and UK (which dominate the statistics on stuff like this due to language and other factors), women spend significantly more.
> The 2013 data contradicts your claim.
It doesn't. Men want iPhone less than women. My initial point was simply that people here and on reddit (men dominated sites) are not reflective of apple's focus.
31% of men wanted Android while only 24% of women wanted the same.
> PS: Only the latest iPhone has best camera. They traded places with Pixel, Samsung, etc... over the last few years.
This is misleading. While iPhones fared worse than the Pixel in the photo department for 3 years, they make up for it in other aspects, especially video. As for Samsung, it depends on what you consider a better camera. The iPhone arguably has the best color profile and accuracy, while Samsung phones do beautification and increase contrast, which does make them look nicer but, overall, is not accurate.
> For cultural customs, I am not sure which country to take. For US and UK (which dominate the statistics on stuff like this due to language and other factors), women spend significantly more.
You're trying to make a biological preference claim out of what is customary in the Western European cultures.
Sometimes it's better to say that "we don't know", because that is exactly it.
> This is misleading.
No... Your original was misleading, I just forced you to correct yourself.
To be fair, if this was truly a big deal, it would be a political issue with bipartisan support and taken care of in short order.
In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)
> To be fair, if this was truly a big deal, it would be a political issue with bipartisan support and taken care of in short order.
Are you serious? Since you're using the term bipartisan, I'm going to assume you're also from the US. Have you been following the news/current events recently (and by recently, really I mean any time at all over the course of the past forty or fifty years)? Almost nothing actually gets "bipartisan" support these days. Our government is in a constant state of gridlock and it is next to impossible to get anything done.
> In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)
I'm really having a hard time telling if this comment is sarcastic. I'm not trying to cause you to lose sleep or anything, and honestly, I commend your optimism if you're serious, but the fact is that democracy is an abstract concept that is almost never put into practice perfectly. Even if democracy's promise of "most people being happy" meaning 51 people are happy while 49 people are not is actually sufficient, even that idealized version is very far from our actual political system in the United States.
I think one thing most Americans actually would agree with is that the way we do things here, right now, in terms of government, doesn't work so well. I think there is a lot of disagreement about what should change, and how, but it seems like very few people are satisfied with the current state of affairs. The more I think about it, the more I think this comment just has to be sarcastic, so this reply is probably a waste of everyone's time, but in any case, wow. I wish what you were saying here were actually true. That would be a nicer world to live in than this one (although, frustrating as it might be at times, this one is also not so bad).
I'm not going to comment on the linked article, other than say I think it contains serious mistakes in describing what OCSP is for and errors in other statements.
In the background there is a war going on, but it isn't what you think. It is a war between malware creators and OS creators.
From my perspective it goes like this:
To "identify" malware you need signatures. Signatures need valid certificates. Signing keys get lost (or worse stolen), bad actors need to get identified and their identity revoked.
If you don't have signatures you will end up with polymorphic malware. (Brew, pip, ruby gems, npm make me uneasy with how much external trust they require!)
To stop persistence you need some system immutability and secure startup / boot.
If you don't have secure startup / boot you will allow persistence in ways it may be impossible to remove.
If you don't have an encrypted disk protected by a passphrase, a lost device means your personal data is potentially now "public". If you don't also have the other protections above, having a passphrase at this point may be meaningless.
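To illustrate the signatures point above: a content hash alone is useless against polymorphic code, since flipping a single byte produces a completely different digest. A toy sketch, nothing Apple-specific:

```python
# Toy illustration: a one-byte change defeats any blocklist keyed on file hashes,
# which is why detection leans on signatures over a developer identity instead.
import hashlib

malware_v1 = b"\x90" * 64 + b"payload"
malware_v2 = b"\x90" * 63 + b"\x91" + b"payload"   # trivially "polymorphed"

print(hashlib.sha256(malware_v1).hexdigest())
print(hashlib.sha256(malware_v2).hexdigest())       # entirely different digest
```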
I use an iPhone because it boots from ROM and a DFU Restore wipes all mutable firmware. The flash, where my data is stored is encrypted. When I can, I will also use an Apple Silicon Mac because it works the same way.
Hardware products and operating systems from other vendors and sources do not have these protections. Some attempt to, but the implementations are very flawed or poor - I choose not to use them with my personal information.
For me, Apple products really are designed to protect my personal information.
I think this is complete FUD to be honest. There is a war on the market due to economic incentives. There are some criminals, but they don't target your workstation specifically.
Previously a hacker might have gotten access to your computer and maybe you had a keylogger, which could compromise you to a great deal. Today you have so much of your info not on your local device that you have countless additional attack vectors. Where do peoples info get leaked? In cloud services mostly.
edit: And of course your info is not safe. The programs you run get sent to Apple. That is nothing other than a data leak.
You're absolutely right, and I use a Pixelbook and an iPad Pro for much the same reason. The cryptographic protections are great. (They'd be even more great if I could blow my own bootrom CA into the fuses.)
The phone-home is the issue, however. I've long understood the issue with certificate validity periods and the tradeoffs between short notAfter/frequent reissue and revocation check intervals.
The side effect is that it functions as telemetry, regardless of what the original intent of OCSP is or was. Additionally, even though the OCSP responses are signed, it's borderline negligent that the OCSP requests themselves aren't encrypted, allowing anyone on the network to see what apps you're launching and when.
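For the curious, the pyca/cryptography library will happily show what travels in one of those plaintext requests: essentially an issuer hash and a certificate serial number, which is exactly what any passive observer gets to read. A sketch, assuming you've captured the DER body of a request off the wire (the filename is hypothetical):

```python
# Sketch: inspect a captured OCSP request (DER bytes) to see what a passive
# observer on the network can read. "request.der" is a hypothetical capture.
from cryptography.x509 import ocsp

with open("request.der", "rb") as f:
    req = ocsp.load_der_ocsp_request(f.read())

print(req.hash_algorithm.name)        # e.g. sha1
print(req.issuer_name_hash.hex())     # identifies the issuing CA
print(req.issuer_key_hash.hex())
print(req.serial_number)              # the certificate being checked
```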
Many things function as telemetry, even when not originally intended as so. The intelligence services that spy on everyone they can take advantage of this when and where it occurs, regardless of intent.
It's not worth putting everyone in a society under surveillance to defeat, for example, violent terrorism, and it's not worth putting everyone on a platform under the same surveillance to defeat malware. You throw out the baby with the bathwater when, in your effort to produce a secure platform, you produce a platform that is inherently insecure due to a lack of privacy.
I don't think that snooping on the connection explicitly reveals what app is being launched - it just reveals the identity of the developer that signed it. But I agree that if the developer only has one app, then it does sort of give the game away.
Yes, and DNS is another example of Telemetry. What is google doing with the data they get from their DNS services?
I have in the past examined other devices - a QNAP NAS, for example, phones home, sending the last part of the device MAC address. It did this once a minute from when it was turned on. I stopped using it - I do not know if this has changed in recent QNAP versions.
Having OCSP encrypted would cause a chicken and egg problem... OCSP is supposed to validate a certificate, but how do you check the validity of the certificate if the OCSP endpoint also requires validation?
You could ignore the validity of the TLS certificate when checking OCSP. That way, passive listeners are foiled, and only active MITM would be able to see which certificates you're checking. It's better than HTTP plaintext, which is how it works now.
Most of the bulk surveillance, pattern-of-life IC stuff is passive, not active.
Ultimately, though, I should be able to opt out of app/binary signing (and associated certificate checking) entirely if I so desire, ideally with a preference setting, or at least with Little Snitch. It looks like I'm going to have to compromise platform security overall to disable it, or use external network filtering hardware.
Additionally, it seems excessive to check every time. Checking once a day would be enough (if it needs to be global and immediate, Apple could push a kill hash to Gatekeeper as I understand it). The volume of queries would greatly exceed the size of getting the updates to the CRL every day (or hour!). Indeed, OCSP stapling is a cache of the signed proof of validity.
It also seems like a bloom filter could be used instead.
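Roughly like this: ship clients a small filter of revoked-certificate hashes and only fall back to a network check on a (rare) positive hit. A toy sketch with arbitrary parameters:

```python
# Toy Bloom filter for revoked-certificate hashes: check locally, only go to
# the network on a hit (which may be a false positive, never a false negative).
import hashlib

class BloomFilter:
    def __init__(self, size_bits=1 << 20, num_hashes=7):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

revoked = BloomFilter()
revoked.add(b"serial-of-a-revoked-developer-cert")   # hypothetical entry

if revoked.might_contain(b"serial-of-the-cert-being-checked"):
    pass  # only now ask the OCSP responder; most launches never touch the network
```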
It really points to Apple being quietly satisfied to have this massive stream of usage data. And it's available to every ISP and snooper along the way too.
>In the background there is a war going on, but it isn't what you think. It is a war between malware creators and OS creators.
No, this is _exactly_ the war we think. It is a war between Apple and malware over control of _my_ personal(!) computer.
At this point both malware and Apple leak personal data from _my_ personal computer. So from the perspective of controlling MY personal computer they are both malware, and the differences are negligible if I do not even have a choice to install my own OS.
Secure boot is secure if I(!) can secure it, not Apple or any other external authority.
>For me, Apple products really are designed to protect my personal information.
Apple products designed to get _control_ of your personal information from you and thus to control you and your behavior.
The computer is no longer a "Personal Computer" with all of these developments. It's simply a terminal of some big machine where your 'personal' space isn't even really yours.
Probably it is a good idea to recall what "Personal Computer" meant and what it should mean.
For me it is My control over My computer and My data if I wish to, and this a minimum for My privacy.
And privacy is a minimum for society respecting freedom and rights of a person.
How much private entities get to shape this stuff is a sensible concern but this sort of absolutism is how pretty much nothing actually works. You can't drive an unsafe car on public roads. You can't build an unsafe house, even on "your" land, etc. It's not crazy to regulate what kind of computer you can put on the public network.
It has nothing to do with absolutism; we are talking about basic human rights to privacy, massively disrespected recently. That fact, though, doesn't make them less valuable, relevant, or required for a free society to function.
Maybe for you it's not even crazy to regulate what kind of paper a person uses for their own notes, because who knows, maybe they will bring those notes into the public one day. But for me this communist and totalitarian disrespect for personal rights and freedoms crosses a red line. I've been in such situations and I saw in person the results of such disrespect, therefore I think those who are willing to attack personal freedoms should learn some history and shouldn't be surprised about what people can do to protect their freedoms when words do not work.
No, you're trying to elevate some flaky security feature to a debate about human rights. It's not particularly meaningful or conducive to any kind of insight because once you make that jump, you just get to fulminate endlessly about principles and oppression. It's like starting to recite the Universal Declaration of Human Rights every time you stub your toe instead of just saying 'shit!' and thinking about maybe moving that table out of the way.
The broader topic is a debate about human rights that's actively being fought in elections, parliaments, and courts. The right to privacy and the right to repair, for example, and a nascent debate about the right to run whatever software you choose.
"Your computer isn't yours" is a powerful way to express the essence of those debates. Who has the right to do what they want with a computer, the company that built it or the person who owns it?
Software signing is admittedly a relatively minor battle in that war, but it's not completely separate from the broader issues.
>you're trying to elevate some flaky security feature to a debate about human rights.
It is not a "flaky security feature". It is a main thing that defines who controls the personal computer.
Accessing and controlling Personal Computer with personal notes is the same as accessing and controlling personal notes written on paper.
There is no elevation here from my side, it is literally about human right for privacy and I am not the one who tried to elevate the issue to unrelated areas like cars and houses.
> It is a main thing that defines who controls the personal computer
It's a flaky feature that people had a workaround for in minutes. Nobody lost control of their computer.
> There is no elevation here from my side, it is literally about human right
My secondary school had a bigass Universal Declaration of Human Rights mounted on a wall behind plexiglass. Whenever I got detention I'd protest citing Article 9. At 12, this seemed both funny and trenchant commentary on my predicament.
So what? There are risks everywhere in life. You buy a new car and you decide to go on a road trip and there is a dangerous road ahead. The car company should stop your car remotely?
When you are driving a car, the dangers you encounter are immediately perceptible, and your driver’s license means you specifically trained for and are certified as able to deal with them.
With our laptops full of PII, corporate secrets, cryptocurrency wallet keys and so on, without proper security measures one can be taken advantage of without detection over long periods of time, and only a tiny percentage of the user base can adequately assess potential attack vectors (even the scope of possible damage is not intuitively obvious).
There may be issues with specific implementations, and the degree of trust one has to put in their device’s manufacturer could probably be lower, but without those measures a typical user would be like a 5-year old driving a Ferrari on a winding mountain road. Yes, I’d be okay if my car automatically stopped before a cliff that I couldn’t see even if I drove right into it.
This all sounds well and good. But why not just use signed binaries? Have a user editable keystore that includes the accepted signatures. The default would be apple, installing chrome would require accepting the Google signature, photoshop would require the adobe key. Then users could add their own for brew, firefox, or whatever community software they would use.
This would give good protection against hackers, and not require uploading a signature for every binary run.
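Something like this, say with Ed25519 keys and the pyca/cryptography library (the keystore layout and names are made up; this is the idea, not Apple's actual mechanism): the binary ships with a detached signature, and verification happens entirely against keys the user chose to trust, with no network call.

```python
# Sketch of a user-editable keystore: verify a binary's detached signature
# against locally trusted developer keys, with no network call involved.
# Key names, the example key bytes, and the file layout are hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Keys the *user* chose to trust (e.g. Apple, Google, Adobe, their own builds).
keystore = {
    "example-developer": Ed25519PublicKey.from_public_bytes(
        bytes.fromhex(
            "3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c"
        )
    ),
}

def is_trusted(binary: bytes, signature: bytes) -> bool:
    for name, key in keystore.items():
        try:
            key.verify(signature, binary)
            return True            # signed by a developer the user trusts
        except InvalidSignature:
            continue
    return False
```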
Mac users can already add any certificate to the Apple keychain and authorize it for code signing. The outage today, which is what the article was written about, was caused by the OCSP.APPLE.COM service not responding. The OCSP service was likely being used to validate whether an Apple Developer certificate was still valid.
Operating a "trusted" Certificate Authority generally requires operating under some rules. For example, the "Certification Authority Browser Forum" requires operating a CRL (now considered bad) or a live OCSP endpoint.
Let's Encrypt does this, as does every other certificate issuer.
As is being discussed, separate OCSP is bad from a privacy standpoint - if a check of OCSP is being made, it gives telemetry on if you are trying to validate a certificate. If you can see the traffic it does reveal the certificate being checked.
FWIW, there is an OCSP Stapling method of attaching "recent" OCSP responses inside of TLS requests so that a TLS client doesn't have to make a separate request to an OCSP service.
And for websites, there's a better way to do OCSP. The web server using the certificate can get an OCSP response for itself (usually once every few minutes) and attach it to all TLS handshakes for that same domain ("OCSP stapling"). That way, clients get an up-to-date OCSP response, but without having to reveal their browsing behavior to the OCSP server.
Unfortunately, there is no obvious way to carry over this behavior to application binaries, since we're not dealing with a client-server architecture here.
Don't forget that Apple is a multinational megacorp, and is user centric only when it suits them. Consider Tim Cook speaking at the conference used by the Chinese government to promote internet regulation, saying that the vision of the conference is one that Apple shares.
Sorry, but there are no details about how PRISM works.
The 2012 date is also suspicious - it is in the same year a new Apple datacenter in Prineville came online.
I personally think that PRISM works by externally intercepting data communication lines running to / from the unwitting companies.
The NSA has previously tapped lines (AT&T), but they made the mistake of doing it inside the AT&T building. That eventually leaked out -- so I think it is most likely that PRISM is implemented without the knowledge of anyone except the NSA.
[edit] I started doing more research. Facebook also has a datacenter right next door. I wonder who else is there and where do all the network cables go.
> Sorry, but please can you point to any real evidence that Apple allows state-level actors access? From what I can see it is exactly the opposite.
> Sorry, but there are no details about how PRISM works.
> I personally think that PRISM works by externally intercepting data communication lines running to / from the unwitting companies.
Unless you can prove you're ex-NSA or worked in SIGINT, sharing your opinion is not proof. Snowden's leaks are proof.
Why not give weight to the arguments and proof put forth by Edward Snowden? There have been no attempts by the US govt. or the NSA to disprove his leaks. No counter-evidence presented.
My question is: what do you benefit from continuing to deny and refuse to accept Snowden's whistleblowing?
I'm not ex-NSA or current NSA, and I have not worked in SIGINT for any country. I cannot demonstrate proof of this.
I do not believe that the Snowden slides show proof of collusion by any of those companies. All the slides do is indicate that data is being collected from those companies and that the program is called PRISM.
If for example, there was also a slide that indicated payments were being made in exchange for the data, then that would be an entirely different thing. That would indicate collusion and be more believable.
I am not denying or refusing to accept Snowden's whistleblowing. I think it is highly likely that PRISM exists. What I refute are the speculations that the companies listed are complicit.
I will also add that PRISM and "beam splitting" are a bit too close for coincidence.
Room 641A at 611 Folsom Street, SF is where "beam splitting" was done. That info leaked. The NSA isn't stupid, I doubt they wanted to repeat that sort of discovery - which is why I think it is believable and likely that the companies listed on the slides have no idea what has been done.
[edited to fix grammar and that there was more than one slide leaked]
They aren't, since they are proprietary software. And taking into account Apple's history, and "proprietary software in a relative position of power", such as the ones we have keeping tabs on our entire lives, chances are quite against users at best.
And the government is designed to protect you but it can also come after you.
If you are not going to allow "malware" on your computer, you're giving the power to Apple to label anything they don't want you to run as "malware".
They already have a walled garden on the phones and tablets, which has greatly benefited them. They can deny anyone access to publish software and they have exhibited everything that is obviously anti-competitive.
Now, they're bringing it to laptops.
Honestly, I don't see anything happening to Apple anytime soon. They're way too well off and they have very important government officials in their pockets. The best thing that may happen will probably come from the outside... like the EU antitrust investigation that is going on. Even so, it's not going to stop, not anytime soon anyway.
For most people, if their computer was ‘theirs’, it would be that way for about 10 minutes until it was pwned by a bad actor.
The general issue of other people (I.e. Apple et al) having a lot of control over your computer, is entirely valid even if some of the points in the piece are exaggerations.
The problem is that people want it that way.
They are in fact correct that computers are scary and too complex for them to manage, and they want someone they can trust to do it for them.
Stallman may have predicted the issue, but he really didn’t do much to solve the real problems.
The presence of open software isn’t enough without a way for people to know what is trustworthy without being experts.
Until we provide that, this situation cannot change.
> The presence of open software isn’t enough without a way for people to know what is trustworthy without being experts.
As Apple has demonstrated, neither has the presence of closed software. Between Apple and Microsoft, they're nearly 100% likely to get software that is not trustworthy, and cannot be made so.
This would be less worrisome if the state did not compel these organizations to provide their data to the military on demand without a warrant, but due to the vertical integration of big tech companies and US military spying, this becomes a major threat to our society and the basic exercise of civic participation.
It's hard to do things like, for example, investigative reporting against corrupt government officials if the government officials know every time you open Premiere, and where you do it, because they forced your chip/OS manufacturer to tell them.
> For most people, if their computer was ‘theirs’, it would be that way for about 10 minutes until it was pwned by a bad actor.
I own a company which manufactures and sells showerheads. Since "most" people are prone to a fall while taking a shower, my company also mandatorily installs a camera alongside, so that we can monitor for accidents, given our deep concern for our customers.
"Some" customers have raised concerns about their privacy, but this is for the greater good. My company is very ethical in its practices, and really values each customer's privacy (pinky promise).
The footage of you and your loved ones showering is completely safe with us, although you can choose not to be recorded while showering by not purchasing our products.
Oh by the way, our showerheads are sleeker and more convenient than anything else in the market.
> For most people, if their computer was ‘theirs’, it would be that way for about 10 minutes until it was pwned by a bad actor.
I remember very well the times where when you’d connect a fresh Windows XP SP2 to the internet without a router or firewall it immediately showed malicious popups and eventually a BSOD.
No, the real story here is that Corporations in the USA, and other Global North countries, have both 1) benefited from state/taxpayer-funded subsidies, grants, and research, and 2) are now locking up and monopolizing these discoveries using the international TRIPS system, 'claiming' Patents and Copyrights, 'kicking away the ladder':
"How did the rich countries really become rich? In this provocative study, Ha-Joon Chang examines the great pressure on developing countries from the developed world to adopt certain 'good policies' and 'good institutions', seen today as necessary for economic development. Adopting a historical approach, Dr Chang finds that the economic evolution of now-developed countries differed dramatically from the procedures that they now recommend to poorer nations. His conclusions are compelling and disturbing: that developed countries are attempting to 'kick away the ladder' with which they have climbed to the top, thereby preventing developing counties from adopting policies and institutions that they themselves have used."
+
“Rich countries have ‘kicked away the ladder’ by forcing free-market, free-trade policies on poor countries. Already established countries do not want more competitors emerging through the nationalistic policies they themselves successfully used in the past.” [1]
Professor Yash Tandon adds:
“During the 1980s and 1990s I worked in many countries in eastern and southern Africa, and then for four years at the South Centre—2005–09. I can say from my experience that the industrialised countries of the North have been trying systematically to block all efforts by the countries of the South to industrialise. Their mega-corporations have tried—and, alas, succeeded—in privatising knowledge, and using it to promote corporate profits over the lives of people."
[...]
“It is the seeds and pharmaceutical companies of the West that have pirated the knowledge of seeds and medicinal products from the South. But whereas in the South this knowledge was shared as a public asset, the Western companies, having learnt from the South, proceeded to claim it as their private property. They are guilty—morally guilty—for the avoidable deaths of millions of people in the South who cannot afford their ‘patented’ medicines against, for example, AIDS, malaria, tuberculosis and other killer diseases. It is a sordid story. But it is not all doom and gloom. Those who control the system (the global corporations and the international organisations that the West controls) do not get their own way entirely. Wars do not always end in the victory of the militarily or ‘intellectually’ powerful.” [2]
All of this then comes together through Tech's Great-Man theory, that has us idolizing those monopolizers:
"In the movie Steve Jobs, a character asks, “So how come 10 times in a day I read ‘Steve Jobs is a genius?’” The great man reputation that envelops Jobs is just part of a larger mythology of the role that Silicon Valley, and indeed the entire U.S. private sector, has played in technology innovation. We idolize tech entrepreneurs like Jobs, and credit them for most of the growth in our economy. But University of Sussex economist Mariana Mazzucato, who has just published a new U.S. edition of her book, The Entrepreneurial State: Debunking Public vs. Private Sector Myths, makes a timely argument that it is the government, not venture capitalists and tech visionaries, that have been heroic.
“Every major technological change in recent years traces most of its funding back to the state,” says Mazzucato. Even “early stage” private-sector VCs come in much later, after the big breakthroughs have been made. For example, she notes, “The National Institutes of Health have spent almost a trillion dollars since their founding on the research that created both the pharmaceutical and the biotech sectors–with venture capitalists only entering biotech once the red carpet was laid down in the 1980s. We pretend that the government was at best just in the background creating the basic conditions (skills, infrastructure, basic science). But the truth is that the involvement required massive risk taking along the entire innovation chain: basic research, applied research and early stage financing of companies themselves.” The Silicon Valley VC model, which has typically dictated that financiers exit within 5 years or so, simply isn’t patient enough to create game changing innovation." [3]
This is the same sort of logic that some people have that people shouldn't be allowed to make choices in life because they "might get it wrong". People should be allowed to make their own choices in what they believe to be their best interests and they should be allowed to get things wrong.
> For most people, if their computer was ‘theirs’, it would be that way for about 10 minutes until it was pwned by a bad actor.
This may have been true back in the early 2000s when people first had good broadband. However, all the major operating systems now have pretty decent security out of the box, mainly due to the embarrassment of events like the MS Blaster worm.
> The general issue of other people (I.e. Apple et al) having a lot of control over your computer, is entirely valid even if some of the points in the piece are exaggerations.
The problem is that if you buy a general purpose computing device, you should be able to run whatever you like on it.
They should just have a button somewhere in whatever the equivalent of the BIOS is on these machines that says "I am an adult and I accept the risks of turning off these protections" and then lets you install Temple OS if you so choose.
This is something that needs to be enforced by legal means IMO, something similar to right to repair.
> The problem is that people want it that way.
I don't know about that. They frequently get irritated by it and are just resigned to it IME.
However, the popularity of single board computers such as the Raspberry Pi, of people building their own gaming rigs, and of people tinkering with gadgets goes some way toward refuting this notion. There will of course be many people who just won't care and will use an "iDevice", and that is fine; however, there is a large spectrum between "I only run stuff from the app store" and "I run a custom Arch install with an optimised kernel with tk kernel patches".
> The presence of open software isn’t enough without a way for people to know what is trustworthy without being experts.
I agree. This is a failing of the open source community. I've actually written a draft manifesto called "better than freedom". This may actually push me to at least have it critiqued by my friends.
> Until we provide that, this situation cannot change.
Something has to change. We are sliding back into the 1980s where all the computer hardware was incompatible with one another.
> This is the same sort of logic some people use when they say people shouldn't be allowed to make choices in life because they "might get it wrong".
No, it's not at all the same sort of logic. We are talking about technology here, not overall life experience. Everyone starts out life on the same footing, with zero experience, and has the innate potential to learn how to live it. But computers are a very specific, highly specialized machinery that the vast parts of the population will never be able to directly understand or fix. I can drive a car but I don't know the first thing about fixing it.
You could claim that no one should own technology if they don't understand it intimately, but that is a different kind of logic and argument. It's silly to conflate the two.
> You could claim that no one should own technology if they don't understand it intimately, but that is a different kind of logic and argument. It's silly to conflate the two.
Can you point out where he made such a claim? Because I don't see it.
>But computers are a very specific, highly specialized machinery that the vast parts of the population will never be able to directly understand or fix.
Computers are all of that only in the vaguest sense. They're not specialised at all; they're extremely general purpose. But even setting that aside, there are plenty of items, pieces of equipment and tools that are very specialised and that most people don't understand, yet if those were made to act anything like this just because someone somewhere managed to use them foolishly, it would surely raise some eyebrows.
Company incentives are trash, and I trust companies a lot less not to make stuff designed to fail, to lock in customers, to snoop on data, etc., than I trust myself not to do something stupid when I want an exception to some safety feature or the like.
>I can drive a car but I don't know the first thing about fixing it.
And would you buy a car with the hood welded shut? With tires that need some kind of manufacturer's signature or they lock up? One that locks up if you try to use third-party seat covers?
No.
And yeah, there are some types of modifications and actions that we generally try to prevent... but that's because a car becomes a high-speed deathmobile to surrounding drivers and pedestrians if you start putting spikes on it or mess up your brakes. That, however, is not the case for a PC, phone, printer or what have you.
All you are advocating for is a new priestly class.
> Everyone starts out life on the same footing, with zero experience, and has the innate potential to learn how to live it. But computers are a very specific, highly specialized machinery that the vast parts of the population will never be able to directly understand or fix.
To throw your own words back at you: I started off with zero experience and learned more over time.
I learned how to program on a computer when I was 10 years old (which was 28 years ago now) after my grandfather (who was born in the 1920s) learned how to operate his own computer at home. We had manuals, trial and error, and the library. It was much more difficult to learn back then because there was no internet, no YouTube, and almost nobody had a computer. Computers back then were a lot more difficult to use as well. So don't give me this nonsense about it being so opaque it is impossible for an ordinary person.
Also, most people will know a friend or relative who is handy with a PC, or you could go to a local specialist for help. A large multi-billion-dollar company "looking out for the plebs" isn't required.
Apple, it seems, has been quite good at selling this narrative.
> I can drive a car but I don't know the first thing about fixing it.
Plenty of people used to be able to work on their own cars in the past (and still do today). Most of them would be considered "laymen". Plenty of people fix their own phones and computers, or go to a repair shop where people have taught themselves to repair electronics and turned it into a business. Just because you choose to be ignorant (which is absolutely fine) doesn't mean other people aren't able or willing to.
> People should be allowed to make their own choices in what they believe to be their best interests
> You should be able to run whatever you like on it.
You are contradicting yourself. People are explicitly choosing the iPhone, a walled garden. Then you say that that's a wrong choice because you believe it's a general purpose computing device and should work differently.
If people buy it then, well, they are fine with it.
> If people buy it then, well, they are fine with it.
I don't think people have much choice these days. If you want a smartphone then you will either get something from Apple or something running Android. There are other choices, but are they mainstream? In my local shops I can't see anything else. Android or Apple.
That's why I'm buying a PinePhone and am going to waste lots of my life making it fit my needs, but I hope that once the time is spent I will be a happier human being ;)
> I don't think people have much choice these days.
Well, choice is about compromise. You have to choose between freedom and convenience, but you can choose. I'm saying that as a person who has used 4 Linux phones as daily drivers over the last 10 years (N900, N9, Jolla, Xperia with Sailfish).
> There are other choices but are they mainstream
Well, there are. Maybe they are not mainstream because people value convenience over freedom, which is unfortunate to me, but who am I to dictate my values to them?
At least there are always some Linux phones to choose from, and hopefully fabless/new open platforms will increase the number of options. I mean, 10 years ago it was Nokia or nothing; today there are the PinePhone, Fairphone and Librem.
I am not talking about the software itself. I am talking about the ability to run whatever software you like. This means an alternative OS, or allowing someone to install another app store.
They could literally put in a button to let you install what you like. In fact they already did this in the past.
Those statements aren't mutually exclusive. People should as a matter of principle be able to run whatever they like; however, if they want to stay in the walled garden they can.
If you read the rest of my comment, I specifically mentioned some sort of mechanism for turning off the walled garden protections. I would imagine it would work something like Secure Boot. I have a motherboard that lets me turn off that mechanism if I want to via the BIOS software.
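(As an aside on the Secure Boot comparison: that toggle is already user-visible on PCs. From a Linux install you can even query the current firmware state, assuming the mokutil utility is installed:

    # prints "SecureBoot enabled" or "SecureBoot disabled" depending on the firmware toggle
    mokutil --sb-state
)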
If the only way for it to happen is through lobbying politicians and requiring it by law (not a solution I would prefer) so be it.
Also, it is quite clearly a general computing device, as it has all the characteristics of one. It isn't a matter of belief.
> People should as a matter of principle be able to run whatever they like
Well you are free to provide a platform that is as convenient as ios and as free as linux.
In practice that's not easy, to put it mildly, since convenience requires quite an investment in design and development, which free platforms tend to lack.
So in reality it's either a well monetized platform (which is either a walled garden or just sells your data), or a free platform which is far behind in terms of convenience and support.
> Also, it is quite clearly a general computing device, as it has all the characteristics of one. It isn't a matter of belief.
No, it's exactly your belief. Apple doesn't advertise it to you that way; they don't say you'll be able to run whatever you'd like on it, so I don't know where you got this misconception that they should provide you such an ability.
> Well you are free to provide a platform that is as convenient as ios and as free as linux.
Respectfully you obviously don't understand what I wrote.
I was quite clearly talking about the ability to run an alternative OS on the platform. Not whether LineageOS or similar is as good as iOS.
You can install whatever operating system you want on a laptop or a PC. You can already run other ROM images on Android (though on some phones you have to root the device, which is not okay).
It isn't about an OS already being there; it's about being able to run whatever OS might be available.
> No, it's exactly your belief. Apple doesn't advertise it to you that way; they don't say you'll be able to run whatever you'd like on it, so I don't know where you got this misconception that they should provide you such an ability.
There is no misconception on my part. If you look up what makes a general purpose computer, that includes things such as PCs, laptops, servers and smartphones. The iPhone has all the properties that make it one.
So whether you like it or not, it fits all the criteria. It doesn't matter what Apple markets it as or whether they artificially lock it down. That doesn't fundamentally make it a different thing.
> I was quite clearly talking about the ability to run an alternative OS on the platform.
I got it, and explained how it will hurt monetization of their product. Android is a different story with different (I dare say way more scummy) strategies of monetization.
The point is you can't have a convenient platform that doesn't produce the revenue that makes it convenient and rewards the investment.
It was quite clear that you were conflating many things.
As to whether it would hurt monetisation: I don't think it will, and even if it did I don't care; they have more money than most countries do. If they must be required to by legal means, so be it.
Also they already kinda allow some of this when you enable developer mode via a Mac with the appropriate iOS developer account. So there is no reason why they cannot do this tomorrow without the paywall.
> So there is no reason why they cannot do this tomorrow without the paywall
Sure you don't, when you are dismissing the economics of software development so vigorously. Try to look at things from the perspective of economic incentives, not from a purely technical standpoint. Then you'll understand why free platforms are far less convenient, and why closed platforms are the way they are.
The incentives can change through, e.g., law, as forest_dweller has already mentioned in his comments. I think most people "understand" that the current economic situation works out quite well for closed platforms, but that doesn't mean a better way does not exist and could not work out just fine economically.
As I said if it has to be mandated legally through a movement similar to right to repair so be it. The incentive will be "you must follow the law or be sued".
“This is the same sort of logic some people use when they say people shouldn't be allowed to make choices in life because they "might get it wrong". People should be allowed to make their own choices in what they believe to be their best interests, and they should be allowed to get things wrong.”
I am not making any argument that can be construed as this. You are misreading me if you think I am. I agree that people should be allowed to make their own choices which is why I strongly oppose government regulation of software distribution. It’s also why I support people’s right to choose platforms like iOS.
“Something has to change. We are sliding back into the 1980s where all the computer hardware was incompatible with one another.”
I’m not sure freedom and compatibility always go hand in hand, but I agree that something has to change.
As to the tinkerers and the people building their own rigs, and the idea that there is a large spectrum: I potentially agree.
Serving those people well is a hugely untapped potential. I used to believe that Apple would see this, but I no longer do. I suspect that Jobs would have done.
I would like to think the open source community and commercial ecosystem outside of Apple and Microsoft can seize the opportunity.
We won’t if we spend our time feeling victimized by corporations doing what corporations do.
> I am not making any argument that can be construed as this. You are misreading me if you think I am. I agree that people should be allowed to make their own choices which is why I strongly oppose government regulation of software distribution. It’s also why I support people’s right to choose platforms like iOS.
Fair enough. I've read some of your other replies on here and I got the wrong end of the stick.
This is hardly about choices and more about shared resources and interaction. You can do whatever you want, but you can't do it to whatever/whoever you want.
All your recent comments are dead. Don't bother. It's unfortunate that contrarian opinions are silenced, but this is the result of another walled garden.
Forced autonomous cars: Yes, once they are good enough. We make people wear seat-belts. Same thing.
Stopping selling of knives: Yes, once we have a safer way of cutting things. Though the real problem is probably people cutting each other on purpose, rather than themselves by accident.
> Forced autonomous cars: Yes, once they are good enough. We make people wear seat-belts. Same thing.
Seat belt wearing is rarely enforced anywhere (especially in Europe; in Italy nobody wears one even though going without is illegal).
As for autonomous cars: if they ever work, one of the first mods people will do is add a steering wheel.
> Stopping selling of knives: Yes, once we have a safer way of cutting things.
Knives are a safe way of cutting things. Billions of people use them every day without issue.
> Though the real problem is probably people cutting each other on purpose, rather than themselves by accident.
How are you going to stop people from making knives? Humans have been making them for thousands of years; IIRC chimps can make them now. In prisons inmates literally make shivs out of toilet paper.
In the UK carrying knives is banned and their sale is heavily restricted. However, in our capital city we still have a lot of knife violence. Guns are also heavily restricted, but people still have them illegally and people do get shot. My uncle is a convicted bank robber; he had plenty of guns.
You are solving the wrong problem. You must solve the problem of why people are being violent to one another. That will actually make people safer.
Enforcement and compliance are two different things. I was saying that it cannot be enforced. Not that people won't comply.
> You probably missed the part where I said we would only outlaw knives if we had a safer way of cutting things (reliably)
You seem to have misunderstood. The way we have is perfectly safe as it is; billions of people use knives every day without issue. There is neither the incentive nor the need to outlaw them.
You cannot enforce a ban on knives, as they are so trivial to make (you can make one by striking a piece of flint with a stone). Even in a very locked down environment such as a prison, inmates still manage to acquire or produce them.
Even if, in your fantasy world, you did outlaw them and were successful, people would hit each other with baseball bats, umbrellas, pieces of wood, bricks and lumps of metal.
You must solve the problem of why people are being violent, not what they are using to be violent.
Makeshift weapons can easily be as dangerous as a knife (or more so). People have lost their lives simply from being punched in the head. Life isn’t like the Hollywood action movies; the human body can be very fragile.
It is obvious that you are either idiotic or disingenuous. Judging by the user name, I would guess the second. Have a nice day.
I'm so glad I have stuck with Arch Linux since the early days. It felt wild in the beginning but now, after 10 years, I'm glad I took the risk. Yes, I have had a few sleepless nights trying to solve problems (90% of the time caused by myself, the remaining 10% caused by not reading the wiki/docs/forum before doing stuff). And yes, Arch is not as polished and pretty as macOS is, but... I don't care; I do backend work, so I can survive without frontend niceness. At the same time I feel sorry for people who rely on tools from Apple... it won't be as easy for them to switch.
Or you could ditch Arch and go with a workstation-oriented Linux distribution like Fedora or Pop_OS that works out of the box with a lot of hardware. GNOME is more polished than the macOS UI imho.
Been using Fedora for a few months now and I haven't had a sleepless night yet. Everything just works.
I always loved Arch for its minimalistic approach. Well, maybe this is not entirely true since we were pretty much forced to switch to systemd :) but it is still my favourite distro, even after those sleepless nights. Maybe I am a weirdo, but I don't regret a minute spent fixing my installation :)
> These machines are the first general purpose computers ever where you have to make an exclusive choice: you can have a fast and efficient machine, or you can have a private one.
Most of the components inside my PC are over 10 years old. I do video editing. I do music production with many VSTs open and running. And of course web browsing with a million tabs open at once. For any of those tasks I can't see what the point of a faster machine would be. Everything happens at once.
Also, having a modular machine lets you replace individual parts when they fail (which they almost never do anyway).
What are people doing with their computers that need so much computing power? Is it gaming?
It's the new software. Take the note editor inside Evernote, for example. It's just slightly more sophisticated than plain notepad.exe, but every few seconds it lags and freezes as if it were computing a Moon flyby trajectory by numerical simulation.
Or take the fancy new autocompletion module for the Spyder Python IDE. Some stupid autocompletion requires TensorFlow to run, and therefore I just can't use it, because my 6-core Phenom II (which is enough for most of what I work with) doesn't have the AVX instruction set. MSVC has had decent autocompletion for ages, and Visual Assist raised the bar even higher... But now I'm kind of forced to upgrade my PC just to get decent code autocompletion... What next? A quad-GPU rig to run Notepad?
I quite agree... The moment I find a simple note-taking app that synchronizes well between Win7 and Android, that moment will be the last moment I use Evernote.
(well, I use the excellent DokuWiki engine as my primary knowledge base, but making it available on my mobile phone requires setting up and securing a web-hosted machine, which is too much bother for me right now)
> What are people doing with their computers that need so much computing power?
Running and developing web apps and microservices in proprietary languages, and using oh-so-great modern environments and IDEs that are only there to lock you into a vertical garden.
I agree. My machine is a new modular one. You can have both with no compromises. It’s not even hard. And it works with open source operating systems! I doubt it will last 10 years though because it is modular. It will evolve.
I have a custom-built Ryzen 3700X / 64 GB / 2 TiB SSD desktop which is heavy, large and silent. I use a Lenovo T495s as a terminal for it, locally and remotely. That covers all my use cases.
I'd strongly encourage fellow hackers/technical users to move out of the Mac ecosystem and get yourself a Thinkpad. My Thinkpad X1 Carbon that I got a few months ago runs Fedora as seamlessly as the MacBook runs macOS. I haven't had a single hardware compatibility issue to date.
I don't even miss the Apple trackpad, since I've gotten used to the Thinkpad Trackpoint, which is an order of magnitude better as an input device than any trackpad, with the only caveat that it'll take you a week or two to get the hang of it.
Hey, thanks for mentioning this. My current dread is having to switch to a Linux laptop, because I hate the yak shaving inherent in Linux systems and driver issues. However, I won't be getting Big Sur or any future macOS versions until they fix their API and allow firewalls to actually work with their software.
So I'm starting to scope out what options I have, and this helps.
Lenovo has started to offer a few Thinkpad models with Fedora pre-installed afaik. I believe you can select the OS on Lenovo's website while placing an order. So you can hit the ground running with Linux.
Alternatively Dell XPS offers Ubuntu out of the box as well. But I prefer Thinkpads since XPS has only USB-C ports like the Macbook, requiring you to carry a dongle around.
16GB is enough for the sort of work that I do. But if that's not sufficient for you, you could get the X1 Extreme which is the "Pro" version of the X1 Carbon, or you could get the Thinkpad P1 (ultra-portable workstation) or even a Thinkpad T series (corporate focused - cheaper than X1 & P series). X1 & P1 have a Carbon fibre chassis which is why they are more expensive than the T series, which has a magnesium alloy chassis.
The newer Thinkpads are not as extensible/repairable as back in the day. However, anything is a significant improvement when compared to the horrible dictatorial ecosystem of Apple.
My T460p can go up to 64 GB. Some of the newer ones have part of the RAM soldered on though; you need to watch for that if you want to max it out in the future.
I'm so glad I caught on to RMS and Eben Moglen when I did. I saw this coming a long way off, and have spent a lot of my time completely GPL'ing my daily-use stack. There are a few rare exceptions (Steam being the largest one)... but I'm happy I put in the effort.
What we are headed for is a world where a certain subset of people with the technical knowledge have computing freedom, while the vast majority give up all kinds of rights and data for convenience: a techno-caste system underneath the incoming neo-feudal system.
Well said. We’ve come to the point where, if you want the latest silicon, very good build quality and the most hyped technology, you’ll have to give away some of your privacy.
Extra points for mentioning Cory Doctorow, who is providing most of his (prophetic I dare say) books for free over at https://craphound.com (you can also buy them and support him ;-) )
Quote: "These machines are the first general purpose computers ever where you have to make an exclusive choice: you can have a fast and efficient machine, or you can have a private one."
I call BS on Apple having the fastest machine, even at the current moment, not to mention in the future. I can build a gaming rig with the latest Ryzen CPU that will run whatever I want and be both faster and cheaper than Apple's expensive crap.
If you know of any other 8 core, 64 bit, 220+ ppi, >=13", 18 hour battery life machines with a great keyboard that weigh less than 1300g, please do let me know. Bonus points for metal enclosure.
If you find such a beast, I will buy it instantly, and I will happily ignore the fact that the MBAir will still have a better trackpad and speakers.
It's worth noting that Apple's "18 hour" battery life on the MBA is listed when watching something via the Apple TV app, presumably as they have various optimisations both in the OS for that and at a chip level.
For regular desktop use it's 15 hours, which is about the same as other higher battery spec laptops.
Doesn't the Dell XPS meet those expectations? The 15" version can be specced to an 8 core i9 with around 290ppi and an 87Whr battery, and if you're happy with a 6 core i7 you can get a 13" with around 330ppi (same resolution), but with a 52Whr battery.
I'm not sure what "18 hours" means; I am inherently skeptical of any "hours" claim for battery life — obviously the less you demand of your machine, the less battery it will use.
The 15" is ~700g heavier. The 13" has 2 fewer cores (as you noted), and a slower GPU. 330ppi does sound amazing, though.
Apple's way out in front here, and they know it. That's why I wrote this article: because the choice is private xor fast.
There's perhaps some good news, though: it may be possible to disable the signed system volume checks on the Apple Silicon macs, which would allow editing of the OS startup scripts/settings.
> I can build a gaming rig with the latest Ryzen CPU that will run whatever I want and be both faster and cheaper than Apple's expensive crap.
No, you cannot. The M1 chip has the fastest single-core performance of any consumer chip available today, beating every chip in AMD’s brand-new Zen 3 lineup. You can get faster multi-core performance (5950X), but definitely not for the same price as Apple’s new computers - not to mention the massive power draw and much worse thermals in comparison.
The Mac Mini is a desktop computer, it's just much smaller than your typical desktop tower. It doesn't have a screen or keyboard built-in. It's not a laptop. It's a box you connect your screen to. How is this relevant?
The comment I was responding to was talking about screen size and only screen size. Your points may all be true (except that it has much more space for cooling than a laptop, which has significant effects on noise and sustained performance) but have no relation to the topic at hand.
It would be weird if the person you’re responding to were referring to personal efficiency—that depends on the person. They were likely referring to power efficiency.
Apple's mobile chips have been consistently the best in the business. When the A14 was released, the second fastest mobile chip you can buy became... the A13.
Is this insurmountable? Obviously not; it's excellent engineering, not a pact with the Devil. But you can't just hand-wave it away either: Apple's silicon is the one to beat, and they have a head start.
You can only run Apple’s awful software platforms on Apple Silicon, so they’re only competing with themselves and, in the case of the new MacBooks, for the few people who actually run macOS.
Even if the rest of the PC industry never goes to ARM, Apple’s market share won’t change more than one measly percent IMO.
In every case, the greatest return on investment for any surveillance system comes from extortion. Extortion takes many, many forms. Some forms are favored by the FBI and prosecutors. Others are favored by spooks. Criminals have many of their own. In every case, they involve coercion.
A key fact about extortion is that the person being coerced is often not the person surveilled. It may be, instead, a judge, or a prosecutor, a purchasing agent, an admissions director, a night watchman. Each has parents, children, spouses, bosses, friends they would not want to suffer. The person surveilled need not even know they were used; the coercion is often more effective when they don't.
For each time you have heard of a defendant who pled guilty to a false charge, a judge who ruled in obvious contradiction to settled law, a legislator who voted against constituents' interests or resigned for "personal reasons", a manager who embezzled, or a contract awarded to the weaker proposal, consider whether it really was just on a whim.
This is the world of pervasive surveillance we are plunging into. This is the world that only consistent encryption and anonymity can have any chance of protecting against. But it is your family and friends who need the encryption and anonymity, not just you.
You make an interesting point about judges. I wonder how prevalent behind-closed-doors blackmail is, given that we probably wouldn't hear about it if the person complied.
This is really bad. Apple touts itself as a privacy company, but it seems like a fox in disguise, and meddling with VPNs is just completely immoral. Now Google and Microsoft seem better to me, as they at least tell you they are collecting your telemetry data and don’t modify VPN traffic.
I have found it curious how this has gotten zero traction in the press. Earlier today, I was naïvely sure this would cast a pall over the product launches.
I might be mistaken, but wasn't Windows well ahead of Apple on this one?
First Microsoft sneaked Windows 8 and 10 updates onto people's computers, then used every dark pattern imaginable to get them to consent to tracking, then slowly included ads in the operating system.
For the last 2-3 years, I thought "at least Apple isn't doing that", but that is changing.
Unless I'm mistaken, Windows 10 still allows you to disable everything and doesn't play games with what you can and cannot block at the software/host firewall level.
Thanks for summarizing a bunch of changes that make me depressed about modern computing. Now Android, 'privacy apple' macOS, iOS & Windows all act in a similar way. The only thing left is bad user experience linux.
I've opted to purchase the new MBAir, and will only use it in conjunction with a travel router, on which I have root, and can default-deny all network traffic from the laptop except for the stuff I explicitly permit. Dual-Wi-Fi, small travel routers will run for quite some time on a USB battery pack.
I've been meaning to do this for some time anyway, due to the pervasive spyware that's embedded in most iOS apps, which Apple explicitly permits in the App Store. One travel router device should serve me for phone+tablet+laptop. I'll probably have it just do LTE+WireGuard back to a server I run, and then do all of the filtering/monitoring on the VPN server.
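Roughly what I have in mind for the default-deny part, as a minimal sketch (this assumes an OpenWrt-style router using iptables, with the LAN bridge on br-lan and the WireGuard tunnel on wg0; those names are placeholders for whatever the router actually uses):

    # drop everything the router would otherwise forward
    iptables -P FORWARD DROP
    # let LAN clients out only through the WireGuard tunnel
    iptables -A FORWARD -i br-lan -o wg0 -j ACCEPT
    # and let the replies come back
    iptables -A FORWARD -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT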
It sucks that it's come to this. Hopefully Apple can find their way back. I have a machine running KDE Plasma, which is worlds better than it was in years past, but it still has enough rough edges compared to macOS's mirror sheen that it's annoying to use. I'm keeping the Intel rMBP 16" I just got, as it's likely the last machine of this quality level that will be able to run such a system.
It's not going to happen. Apple talks up privacy and security but the truth is they don't care about those things. What they care about is having a locked-down platform because locked-down platforms make Apple $hitloads more money than open general-purpose computers.
Let's take another example: Remember back before 9/11 and you didn't have to show an ID when you boarded a plane? The airlines hated that because it meant people could resell tickets when they changed their plans, and there was no way for the airlines to get a cut of that money. After 9/11, the Feds made the airlines ask for ID, which the airlines absolutely loved because it destroyed the secondary market for unused tickets. Of course they didn't frame it this way--the airlines said "they were happy to help enforce additional security measures"--even though you and I both know that asking for ID before boarding a plane is just another aspect of security theater.
Apple is doing the same thing. Locked-down platforms they justify with security theater are making Apple a lot of money. They have absolutely no incentive to change anything.
I've been using Macs since 1984. I bought a Lisa to do development for the Mac after Steve Jobs came to my university and told us we could buy his weird new computer for half price. Guy Kawasaki mailed me printouts of new sections of Inside Macintosh every month. I've single-handedly bought several hundred Macs (mostly through my employer) over the years, fighting prejudice and bias the whole time about Macs not being "real computers." I had a rack of about 50 XServes at one point that was my lab's private supercomputer.
But I'm about done. Apple has turned into an entity I no longer recognize. They have become--to use Google's word--"evil".
It's bad enough that they can't seem to fix long-standing and new bugs in their operating systems.
It's bad enough that every version of the operating system deprecates or eliminates useful features in favor of shiny useless ones.
It's bad enough that they don't care about developers any more, and cannot be bothered to create sane developer tools or documentation.
It's bad enough that they don't do backward compatibility any more and you are pretty much forced into a newer -- and in most respects worse -- version of MacOS at least once a year, or else you don't get security updates any more.
It's bad enough that the iOS upgrade treadmill is an even worse situation, where if you have an iPhone, you must keep upgrading it constantly, cannot downgrade, and will have to buy new hardware just to stay on the treadmill.
But what I cannot abide is Apple's increasing lockdown of the platform. It's now to the point that they no longer sell general-purpose personal computers. As your article points out, it's not even yours any more and it cannot be made yours.
Apple just sucks now. And I get to say that because I've been their biggest cheerleader for 36 goddamn years.
It's likely that I have purchased my last mac and last iPhone this month. This post marks the end of an era, and a real-life humanscale era, not a computer-time one: 30+ years!
I'll miss it.
EDIT: It's possible that, via bputil(1) run from the Recovery partition, the signed system volume checks might be disabled in M1 macOS 11. This means I could remove the plists for certain intrusive/telemetry system services and still boot. I've updated the post, and will write more about it once I have the equipment in hand.
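For context, on Big Sur the documented switch for the sealed system volume has been csrutil run from Recovery; whether bputil(1) is the right tool for the same job on M1 is exactly what I want to verify once the hardware arrives. A sketch of the Intel-era command, unverified on Apple Silicon:

    # run from Terminal in macOS Recovery; documented for Big Sur's signed system volume
    csrutil authenticated-root disable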
I disagree with that. User experience on Linux is, and for the foreseeable future will be, the most flexible of all, and hence it lets users make things look exactly the way they like.
For comparison, I would say Linux is like Bootstrap, which you can customize to make it look the way you want; iOS is like a paid theme you have bought on ThemeForest, with near-zero flexibility.
If you want a predefined good-looking desktop, you can look at Manjaro (which I am currently using and am pretty happy with; my decent 16 GB RAM system with an SSD boots faster than my MacBook Pro 2019 and opens applications faster from a cold boot than the Mac does).
I switched back to Linux a couple of years ago (after a 12-year Mac stint), and although hardware support is much better, there are still some very sharp edges you could easily bleed out on if you so much as brush against them.
I have a ThinkPad T470s running Arch Linux (I also tried Fedora and Ubuntu and saw these there too):
- Desktop performance is too poor to run at 4K, even though it works fine in Windows. I'm not talking about playing games (they actually work well), just having a desktop with a browser and an editor running at 4K. I've tried GNOME, XFCE and Plasma, on Wayland and X11, and they are all the same. The desktop feels slower, and, strangely, running tests (backend TypeScript and Scala) is noticeably slower too.
- Multiple monitors mostly work, but it's not always plug and play. Sometimes when I plug in an external monitor it doesn't switch to it, so I have to set it up manually. Other times it just doesn't detect the monitor, so I need to reboot. See the note above about testing different DEs.
- Bluetooth audio mostly works (including LDAC), but the sound system often gets confused about which output device should be used. Sometimes Bluetooth audio connects but it still defaults to the speakers. Sometimes the volume control turns down the Bluetooth audio but turns up the speakers. See the note above about testing different DEs.
- Suspend/resume mostly works, but it sometimes has the same issues as connecting monitors. Sometimes when I resume I get a black screen, but if I go to a virtual terminal I can restart X11 and everything works. Other times it doesn't work without a reboot.
Now, of course, these aren't the end of the world. I've been using this system and dealing with these issues for a couple of years now. The trade-offs versus Apple (privacy and better developer tooling on Linux; native Docker is so much nicer) make it worth it for me. I also have a desktop system which works perfectly, so it seems to mainly be an issue with laptops (or this laptop?). But yeah, compared to macOS or Windows, Linux really lacks a lot of polish.
> Desktop performance is too poor to run at 4K, even though it works fine in Windows.
I have used 4K with Intel, AMD, and NVIDIA GPUs on Linux (both Wayland and X11) and it is as smooth as it is on macOS. My only data point is GNOME though, because I am happy enough with it to never try another DE.
I fully agree about the Bluetooth problems. I just gave up and use wired headphones with a mic now :(.
> But yeah, compared to macOS or Windows, Linux really lacks a lot of polish.
I generally agree. I am a NixOS user, though I recently tried Fedora Silverblue and I was surprised how smooth the whole experience is. They nailed the desktop. Atomic upgrades/rollbacks. Flatpaks are really a nice model for installing desktop applications. It's one of the first times I felt that it's a Linux-based system that I wouldn't worry much about recommending to non-technical users.
I think the more serious problem is the lack of applications like Microsoft Office, the Adobe Suite, the Affinity Suite, OmniGroup applications, etc. Sure, there are free replacements, but they are not as good. And even if they were as good, people generally do not like to switch away from what they know.
"I think the more serious problem is the lack of applications like Microsoft Office, the Adobe Suite, the Affinity Suite, OmniGroup applications, etc."
I'd buy Affinity again if I could use it on Linux. I'm hoping that MS does port Office as it will make me stop wondering why I pay for Office Live (365 or whatever they call it this week).
History repeats itself. 10 or 15 years ago the problems I had with Linux were related to Wi-Fi and graphic card drivers. Nowadays these issues are solved, but new ones appear (the ones you mentioned). I'm pretty confident these issues will be solved in the future... but again, new issues will appear and Linux will always be one step behind.
I use two 4K monitors. Desktop performance is uniformly excellent, using only Intel Iris graphics. I cannot imagine what you must be doing. I didn't need to do anything special.
> The only thing left is bad user experience linux.
I occasionally have to use a MacBook for work. The last thing I want when I'm trying to get work done is an unsolicited notification that OS updates are available, with no direct option to decline the update and dismiss the notification. I'd much rather update my system by issuing "pacman -Syu" on my own terms, and I consider Linux distros to offer a superior user experience in this regard.
"The only thing left is bad user experience linux."
I'm not so sure Linux is that bad of a user experience anymore. I just switched back to Kubuntu from MacOS and Windows 10. The only app I really need that I can't run on Linux is Xcode for building iOS apps. I can't really think of one hardware feature that doesn't work better under Kubuntu than Windows or MacOS: printer works, bluetooth works way better, multiple displays work, sleep/suspend works, sound works better, USB works better (Linux will mount and connect to stuff that Windows and MacOS won't) and that's before we even talk battery. The biggest surprise is that I'm getting 4-5 hours of battery with Kubuntu out of a Dell XPS-15 (i7 w/ Radeon graphics and 4k display) when Windows would kill the rather large battery in about 35 minutes. Honestly, I've been pleasantly surprised, and I have to go do command line config stuff about as often as I did on the Mac.
What kind of source are you looking for? This is about hundreds of changes that all build up to a more locked-down experience.
iOS has always been locked down.
MacOS now won't run unsigned software. Mac Pros will brick if you replace the SSD with another genuine SSD.
SafetyNet on Android makes it harder to use root. There is an API to block screenshots. Security changes in Android 11 break Termux and all other bash apps.
Windows tried RT and the S version but it doesn't seem to be replacing the regular one yet.
Regarding the state of computing - there are so many cheap critiques but so little construction.
For years now, cutting edge OS research has just been reimplementing unix with $new_security_feature.
So many lines of code are written but so little changes in terms of what we are supposed to expect out of our computers, and if it ever does, it's always in the direction of walled garden phone-ification of the personal computer. It feels like our CPUs and RAM have grown, but our ambition has atrophied.
What we need is a new synthesis.
A personal computing platform that picks up in 2020 from where the lisp machines left off in the 80's. Something built from scratch to be interactive, dynamic, and hackable.
It will never actually happen, for "worse is better" reasons, of course. But it's nice to dream about.
If anyone out there is working on something like this, even on a conceptual level, I would love to hear your thoughts :)
I’m surprised more has not been made of the claim that all this information is being sent unencrypted.
If that’s really true, how can anyone take the privacy claims made by one of the largest and most privacy-touting companies seriously?
If they send your location and which program you are running, in real time, unencrypted over the Internet for anyone to read, this should get a lot more traction, especially because it is Apple, which makes it a huge double standard.
I, and the vast majority of computer users, don’t want to manage every nitty-gritty detail of the security and privacy of my computer. I have better things to do. In fact, doing it fully competently is more than a full time job.
So farming this out to a third-party is a necessity, for me and many, many others.
Of course, it’s not perfect. How much can I trust the third party? How well can they prevent, fix, or remediate bugs, breaches, and other threats? How much do they even want to?
It’s not perfect, that’s for sure. We see bugs. We see potential tracking opportunities. We see difficult to work-around limitation...
But no one is offering a better solution. Not for me, that is, nor the vast majority of computer users.
If your response to this is “Look at the sheeple defending Apple!” you aren’t getting it.
Oh c'mon, that is the oldest argument in the tool-box of Apple/MS "sheeple", as you put it. Most Linux distros don't require you to "manage every nitty-gritty detail" of anything. In fact, some of them make it easier for you to choose your privacy settings once and only during installation.
> I have better things to do.
Oh, Mr. I'm-so-important, if your privacy and the right to use your own machine are low on your priority list, you still gotta respect the people who see them as important, as well as their demand for something better. If you are fine with Apple/MS, great, use it. But don't try to take away the right of others to look for something that better addresses their needs, and, especially, their right to tell others that it could work for them too.
A prison with golden walls is definitely safer and more convenient than freedom in the wilderness. The day the jail guards decide they don’t like you for some reason, you may start rethinking your lifestyle.
I’m 100% Apple hardware atm, my first Mac was the Macintosh Plus with 128 KB of RAM, and I’m a professional iOS developer. Let me tell you that moving away from Apple is not going to be easy for me, but with the recent trend in government behaviors (always in the name of Good and Security) I feel that I have no choice.
While I disagree that this is a strong counter, it does explain how this has happened.
People are lazy. They want their computer to show them pictures on Facebook, funny videos online and run some numbers for them. People don't want to care about stuff like privacy, because thinking gets in the way of fun and ease.
Malware has already bypassed Apple's signature mechanism and has been vetted to run by Apple as a result. The security and trust argument only works if Apple's security mechanisms work.
The vetting Apple is doing here is a solution looking for a problem. People have been installing third party applications forever, the ability to do so is a feature, not a bug. Instead of fixing the lacking antivirus software built into macOS, Apple is branding third party code as dangerous and untrustworthy.
I don't know anybody who cares about the problem this is solving, because it's a theoretical problem only tech literate people care about; the same tech literate people are also the people who run into this stuff the most. The reason that no one is offering a better solution is that there isn't a real problem that this solution is solving.
It's not laziness. I want Apple to take responsibility for privacy and security mechanisms in the OS. I want them to be the experts on this. That's part of what I'm paying them for.
> The security and trust argument only works if Apple's security mechanisms work.
If a security mechanism isn't 100% effective that doesn't mean it doesn't have substantial value. Black and white thinking isn't a useful way to evaluate security (or privacy) mechanisms.
> Apple is branding third party code as dangerous and untrustworthy
I think the design of the code signing warning is good: If someone is unsophisticated enough to be frightened by that warning, they probably should be.
What I want is a council of wizards I can pay a monthly fee to, and in return they will vet all the software updates I get on my computer. I want Bruce Schneier to be on that council, and ideally get all the major committers on major projects to review each other's work. Do it transparently. I'd pay $100/mo for this service, maybe even more, since it is insurance on critical infrastructure. Could be a nice chunk of change for some deserving folk, too.
I own a company which manufactures and sells showerheads. Since the "vast majority" of people are prone to a fall while taking a shower, and since it is difficult to come up with a fool-proof solution against slipping and falling (who wants to get into the nitty-gritty of that?), my company also mandatorily installs a camera alongside, so that we can monitor for accidents.
Of course, it's not perfect. Some customers have raised concerns about their privacy, but this is for the greater good. My company is very ethical in its practices, and really values each customer's privacy (pinky promise). This is the best solution for everyone.
The footage of you and your loved ones showering is completely safe with us, although you can choose not to be recorded while showering by not purchasing our products.
Oh by the way, our showerheads are sleeker and more convenient than anything else in the market.
We get it. We want you to accept a "worse" experience and contribute to making it competitive. We don't "deserve" Apple, we chose it long ago (edit: many of us, because of problems with MS that Apple is now emulating) and it has changed to take advantage.
Evangelists do nothing but attach a higher value to computers in an effort to convince others that they are not doing "work" and that their way is the right way.
You would be hard pressed to find a better tool to do your work than a Mac/iPad/iPhone in 2020.
They stopped being mere tools for work a long, long time ago. Computers are - apart from nowadays being ubiquitous in many everyday objects - sold as consumer appliances for end users to watch TV and store their personal photos, texts, correspondence and conversations.
And even if people stop doing that, computers are also used to do banking and contact government agencies, in many places so much so that they're _needed_ to do this, creating a massive lock-in for most people to accomplish everyday tasks.
I wonder how many people there'd be who'd really have to "do work" on computers if the above wasn't the case.
Haha, hard pressed to find something better than an apple product in 2020? C'mon, go to any computer/phone store and you'll have plenty of options, for a cheaper price and with better privacy.
What a despondent and dystopian attitude shift from "bicycles for the mind." If this is the modern attitude of Apple users, no wonder the products have also gone to shit.
I remember when that third party was antivirus software. Then, given enough time, those programs basically became malware themselves, and we got rid of them for the most part.
Now that third party is the computer vendor itself. Will we be able to get rid of them when they all start going down that path?
I read here a lot of nostalgia for the 80s and 90s computers that were open to users' curiosity and changes, helping teenagers and young adults develop computer and programming skills... It is true, but it is only part of the story.
For me, the real issue today is not so much the disappearance of opportunities to practice and to learn: some of you noted the availability of open source, single board computers, cheap hardware, free software... and even the presence of a free JavaScript engine on all devices.
No, the real issue when you have any flavor of computer (PC, phone, tablet...) in your hands is that there are too many distractions. In the past, you could get bored in front of your computer and look for new challenges: trying new stuff, trying some programming, changing files...
Now, you can switch from an unlimited stock of games, to an unlimited volume of press, to an unlimited volume of video, to an unlimited volume of discussions, to an unlimited volume of films and series...
There is no room left to get bored and look for new challenges. It even takes a strong will to refuse to engage with these endless supplies and keep programming and experimenting.
I see it every day with my teenager: I have a hard time stopping the brainwashing of day-long shallow social chatter, shopping, videos and series on a small phone screen. They are consumers instead of actors. And when the phone is off, they are just lost, unable to engage in anything.
To come back to the article, these machines are not only technical jails, they are also little mind-jails.
Fellow Gen Z-er here. I recall reading Stallman's writings some time during high school (a few years ago), at first taking them seriously and using free software as much as possible (I even installed and used Trisquel at one point), but then gradually, due to peer, environment and even school-related pressure, I became more complicit in using non-free software. I began to dismiss Stallman's predictions as unrealistic or overreactive, but as the years passed, as many have noted, these predictions have been realized on a massive scale.
Opting out of this stuff is hard, and in many cases would involve changes at educational institutions and companies. IMO it's the normalization by adults, and the following of their example, that leads children to sacrifice their freedom and privacy. Things such as:
- creating G-Suite accounts for users joining an institution
- school-provided MacBook Pros/Chromebooks
- making use of a proprietary program (when free alternative exist) mandatory in a course
- strong one here, peer pressure from the network effect of services such as iMessage, social networks, or even AirDrop
If anyone is in a position of influence for the next generation of society, please think about these points and at the very least make it possible to use FOSS alternatives and make users aware of the issues surrounding digital privacy and freedom.
All OSes are chatty in some way or another. It's a small price to pay for having your system 'just work' without going down the Arch Linux road and having ultra control over every component of the OS. Even Ubuntu constantly phones home to `snapcraft.io` and `canonical.com`, but it needs to do that to check for updates and to do things like syncing the clock with NTP. Do I care that Canonical now knows when my box goes online? Hardly. It's a small price to pay for a pretty solid OS that has good hardware compatibility; I have no issues with my printer, and it's plug-and-play. If I need Windows to open a .DOCX file (for work), I use VirtualBox, where I have a Windows VM to do Windows things. I really don't see the fuss about using Linux as my daily driver. Others are so biased and fanboyish towards macOS that it's disturbing to witness.
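If you're curious, it's easy enough to watch that chatter yourself; something like the following (assuming tcpdump is installed) will show the relevant DNS lookups as they happen:

    # line-buffered so matches show up immediately while piping into grep
    sudo tcpdump -l -n -i any port 53 | grep -Ei 'snapcraft|canonical'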
I’m finding it kind of funny that the conversation about this Apple issue bounces between “you don’t own your computer” and “you can fix this problem by editing your hosts file to block the OCSP calls.”
Doesn’t the success of the latter kind of undercut the former?
You're assuming that the ocsp process looks at /etc/hosts. I would not be surprised if it did its own DNS lookup against a hardcoded DNS nameserver for the ocsp.apple.com lookup.
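For reference, the workaround being passed around is a one-liner along these lines (a sketch; as noted, it only helps if the trust daemon actually resolves ocsp.apple.com through /etc/hosts):

    # blackhole the OCSP host; remove the line again to undo it
    echo '0.0.0.0 ocsp.apple.com' | sudo tee -a /etc/hosts > /dev/null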
Your first choice to edit a config file should be: sudo --edit pathname
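For example (sudoedit is the same thing: it copies the file, runs your $EDITOR unprivileged, then writes the result back with root privileges):

    # open /etc/hosts in your editor without running the editor itself as root
    sudo --edit /etc/hosts
    # equivalent short form
    sudoedit /etc/hosts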
That $() trick has the same problem: the command substitution is evaluated before sudo sees anything.
I see so many people flailing about here, and I've done it myself. So let's consider why these tricks don't work.
sudo is an ordinary command that accepts an environment, a series of words, and whatever filehandles it inherits.
Read /etc/sudoers to note that it's then munging its environment per the env_reset and env_keep directives.
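The relevant lines typically look something like this (distribution defaults vary; view yours with visudo):

    # from /etc/sudoers: scrub the caller's environment, then whitelist a few variables
    Defaults    env_reset
    Defaults    env_keep += "EDITOR VISUAL"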
man sudo notes that it's closing non-standard filehandles, by default.
And, by design, sudo doesn't want you to do clever tricks because it's trying to stop you from hacking the Internets.
Let's also look at what a shell is doing. All a shell really does is munge some environment vars, set up redirections, and ultimately fork itself and run commands.
All the redirection, command substitution, process substitution, etc. must take place either before or after the shell forks to run sudo. All the shell can provide to sudo is command line arguments, environment variables, and usually stdin.
All the shell can get from sudo is going to be a return code, and whatever it can capture from the stdout and stderr.
The 'echo foo | sudo tee -a path > /dev/null' trick works because we redirect stdin, and we delegate opening the file for append to the 'tee' command, which has now been run as the privileged user by sudo, and we ignore the stdout.
Likewise, if you want to read a privileged file, var=$(sudo cat foo) will work, because you get sudo's stdout.
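Putting those two patterns side by side (a minimal sketch; the file paths are just examples):

    # append as root: 'tee -a' runs privileged and opens the file; its stdout is thrown away
    echo 'scheduled maintenance at 22:00' | sudo tee -a /etc/motd > /dev/null

    # read a root-only file (/etc/sudoers is mode 0440): we only need sudo's stdout
    sudoers_content=$(sudo cat /etc/sudoers)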
You can try sudo sh -c 'arbitrary shell script', but it's not good general advice as shells are often banned as a security precaution.
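For completeness, that looks like this; the whole quoted script, including the redirection, runs as root:

    # works where a plain 'sudo echo ... >> file' would fail, but many sudoers policies forbid shells
    sudo sh -c 'echo "0.0.0.0 example.test" >> /etc/hosts'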
Also, if you get curious and want to edit sudoers without locking yourself out by corrupting it, use 'visudo'. Note that it won't prevent you from locking yourself out by removing your own permissions! RTFM, take some notes, and experiment on a machine you have a root account on or can log in via single-user mode.