Now that Apple has learned from iOS how profitable a walled-garden business model is, I think they are trying to bring that model to the PC world as well. Shipping hardware with their own processors is an important step in that direction, because it gives them control over the IP of arguably the most important component of the computer, which in turn makes it easier to control software distribution for their architecture as well.
The transition will be slower, of course, because people have to get used to seeing computers as locked-down devices with app stores. There are still too many of us with the mindset that you can just install anything you want on a computer, without asking Apple for permission and without paying them a tax.
At this point I really wonder how any serious "hacker" can work on such a device; it's becoming the antithesis of everything the original hacker culture stood for.
People will preach all day that they won't make any personal sacrifices to avoid feeding the literal cancers eating the software industry, then act shocked when those all-consuming voids take away their autonomy. By then they are so locked into their ecosystems that they are trapped.
These are for-profit corporations, not your friends. Their bottom line dictates that they abuse copyright, and their ability to sell and distribute proprietary software out of your control, to maximum effect. Microsoft, Google, et al. want you using devices you cannot control, because then they hold all the keys and can demand the greatest ransoms for you to do what you want. It's inevitable, and the persistent arrogance and hubris on display in the tech hive mind regarding any of these walled gardens is just... depressing.
Free software isn't going anywhere, especially with RISC-V now established as a failsafe if UEFI PCs and ARM totally lock down, but the struggle would be a lot easier if the principal victims would stop martyring themselves for corporate ecosystems they have no say, influence, or control over.
I worry for Gen Z because they're tiny-mobile-device native. And the only usable tiny mobile devices are walled gardens. They scream outrage when they're not allowed their tiny little dance-jig operators provided courtesy of an abusive regime.
Sure, I reflect that I may now be nearing obsolescence, attached to such silly outmoded principles as "ethics," but if we're all just selling ourselves out constantly to the angry god-machine of the id, is this really the future we want for our daughters and sons?
Without the ability to grow up playing with system level software, combined with the software industry's unwillingness to pass on institutional knowledge to younger generations, I fear we are already on a path towards a civilization that loses a lot of the technological capability we currently enjoy.
I recommend Jonathan Blow's talk "Preventing the Collapse of Civilization" for an unsettling view of how far we have already traveled down that path.
Computers are not just getting cheaper; educational material is easier than ever to find.
Because modern computers are much more complex and do much more than before, you don't do much bare-metal programming anymore, but then again, nothing is stopping a kid from buying a microcontroller and playing with it. It's cheap, and information is plentiful. I would even say that it is easier for Gen Z to play with electronics than it was for millennials.
But that unprecedented availability of computers didn't turn everyone into a nerd. Just because someone has a smartphone doesn't mean they have any interest in computing. When computers were limited and expensive, only those who had some affinity for them had them; now everyone has them, so mechanically, you can't expect the same average level of interest.
Learning about the Raspberry Pi is nice, but that doesn't help preserve the institutional knowledge needed to build system/low-level features in a modern OS. I'm not talking about teaching kids to program; I'm concerned about preserving in the future the knowledge of how to create all of the complex tech we use today. Many civilizations throughout history collapsed after they lost the technical knowledge on which their civilization depended.
> Because modern computers are much more complex and do much more than before, you don't do much bare metal programming anymore
Yes, that's the problem! Unless those skills are actively used and passed on to the next generation of engineers, that knowledge decays. Part of the reason you don't see a lot of bare metal programming anymore is due to the knowledge decay that has already happened!
> nothing is stopping a kid from buying a microcontroller and playing with it
This article is about how those same kids are being stopped from learning the complex systems we currently use.
> didn't turn everyone into a nerd.
Nobody is trying to turn everyone into a nerd. I'm talking about making sure the nerds of the future have the ability to learn about the tech they use, so the ability to understand and make that tech isn't lost. Locking down the OS into an "appliance" that cannot be inspected or changed is a direct attack on the ability to learn.
The modern computer is the pinnacle of human civilization, a thing so complex that it requires thousands of people to fully understand how it works. Low level programming is just a small link in the chain of complexity. From mining the materials, to producing silicon of incredible purity, to designing circuits at quantum scale, to CPU design and finally the guy who places the frame that will display cat videos. So if you argue that no one knows how to program the bare metal (that is not that bare anymore), one can argue that the knowledge of how to make logic gates from transistors or the metallurgy of copper traces is getting lost.
Maybe fewer people will know how to program on the bare metal. But think about it: it has been thousands of years since most people were able to hunt and gather. That is a great deal more worrisome than not knowing how to program computers, and yet human civilization has been thriving.
The important thing is that it is still accessible, and it is.
The problem is more the lack of interest. Low level systems development is hard, a computer engineering degree is hella hard, and all the hype and easy money was in following the webdev career path starting from the late '90s dotcom boom.
As computing and "content" proliferate, the uncompetitiveness of creation, especially symbolic creation such as programming, is increasing. At some point, broadening access no longer offsets this effect, and the talent pool may start to shrink even if capability and permeation are a million times higher than they once were.
And for cobbling together a computer out of microelectronics, there's CollapseOS.
It is just top down learning, instead of bottom up. For programming, I think either works.
I just happened to learn in the other direction because of the year (early 1980s). On CP/M, you had a macro assembler right there, and it took me a while to get my hands on a BASIC interpreter.
That's a rose-colored view that maybe reflects a small window in time.
But take the 80s. First of all that computer was maybe $5,000 in today's money. (Yes, there were some limited alternatives like the Sinclair and Commodore 64 that were cheaper.) Long distance telephone calls were expensive as were the big BBS systems like Compuserve. Any software you purchased was costly. A C compiler was hundreds of dollars. (Turbo Pascal at $50 was an innovation.)
Perhaps more to the point, most people didn't just have a computational device of any sort. So the fact that, if you had one, you could use it for a lot of different purposes is sort of irrelevant.
But yeah, if the Pi shows up as part of an IOT system, or as a TV/streaming box, or to play retro games on, or whatever, then it's there and it's available to be tinkered with; and from my limited experience, basically none of those use cases will run on their own without at least a little bit of tinkering. :) Even my little Linux-running Powkiddy emulation handheld has probably consumed about as much of my tinkering time as it has my retro gaming time.
The only thing it will do is make it easier for the general public to work with machines. The masses never cared to begin with, and to be frank, that's a good thing: they are less likely to install malware and cause trouble for themselves.
Plus you're forgetting that these companies still need engineers to build and maintain their infrastructure so it's not like the knowledge is going to disappear, never mind the fact that the corporations heavily rely on OSS.
Great, you can write a script; its checksum is then sent over the net and verified to not be malicious. And if the service on the other side is experiencing lag... well... gotta wait, and wait.
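The fingerprinting step being complained about is easy to reproduce locally. Here's a minimal sketch of computing such a checksum (the function name and chunk size are my own, and a real notarization service involves far more than a bare hash):

```python
import hashlib

def script_checksum(path):
    """Return the SHA-256 hex digest of a file: the kind of
    fingerprint a remote verification service might look up."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

The point of contention isn't the hash itself, of course, but that running your own script can be gated on a network round trip to someone else's server.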
To the point where all choice is removed from the operating system. So much for root access.
There's a big difference between buying a computer explicitly for the purpose (even if it's cheaper), vs being able to play around with the hardware that you already have, for no new monetary investment at all.
You can buy an Atmel dip version of the chip in an arduino really cheap, and build basically what is an arduino on a breadboard. Then get a USB programming adapter (again, cheap) to get it running.
Then you can get inexpensive USB logic analyzers that are plenty capable of monitoring I2C and SPI buses, and learn how all that works.
None of that existed in the 1990s. It simply wasn't there.
The price and availability of tools are so much better today. You don't even need to buy any books; it is all online, with community members jumping in all the time to help.
What happens there is that you have a pile of photos that are all in one folder and you want to organize them into separate folders by date. A problem you're actually having, not a homework assignment. Which is possible to do automatically, because they all have metadata with that information in it. So you find a python script somebody wrote to do that, and it works! Then you realize the script is just a text file and you can modify it to sort using geotags or something else instead, and now you're down the rabbit hole. But only if it's easy to see and modify the code that runs on the device you actually use.
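A minimal sketch of that kind of sorter, using the file's modification time as a stand-in for the photo metadata (a real script would read EXIF tags, e.g. with a library like Pillow; the function name and folder scheme here are invented):

```python
import os
import shutil
import time

def sort_photos_by_date(src_dir, dest_dir):
    """Move files from src_dir into dest_dir/YYYY-MM subfolders,
    keyed on each file's modification time."""
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        stamp = time.localtime(os.path.getmtime(path))
        folder = os.path.join(dest_dir, f"{stamp.tm_year:04d}-{stamp.tm_mon:02d}")
        os.makedirs(folder, exist_ok=True)
        shutil.move(path, os.path.join(folder, name))
```

The rabbit-hole moment is exactly as described: once you see that this is an ordinary text file, swapping the date key for a geotag or anything else is a one-line change.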
For example, I learned to read at the same time I learned to code, and from the same source. I had the unusual combination of not just the right birth year, but well-off parents and two elder siblings who had gotten bored of their Commodore 64 just as I was reaching reading age.
Back then my coding was, naturally, 95% transcribing from the manual, but it was fun. One of the few ones I can remember reasonably well this far removed from the experience was a sentence generator that had a list of nouns, verbs, etc and picked from them to build grammatically correct gibberish.
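The structure of that generator was roughly this; a modern Python sketch (the original was Commodore BASIC, and these word lists are invented for illustration):

```python
import random

# Tiny stand-in vocabulary; any word lists would do.
ADJECTIVES = ["purple", "sleepy", "recursive", "shiny"]
NOUNS = ["dog", "computer", "wizard", "sandwich"]
VERBS = ["eats", "compiles", "chases", "admires"]

def gibberish_sentence(rng=random):
    """Build one grammatically plausible but meaningless sentence
    by filling a fixed template with random word choices."""
    return "the {} {} {} the {} {}".format(
        rng.choice(ADJECTIVES), rng.choice(NOUNS), rng.choice(VERBS),
        rng.choice(ADJECTIVES), rng.choice(NOUNS),
    )
```

The charm is that a fixed template plus random lookups is enough to produce endless "correct" nonsense, which is a surprisingly good first lesson in how grammar and meaning come apart.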
I can still remember being 11 years old and meticulously keying in the hello world program on my dad's aging Tandy Model 16, then saying 'cc hello.c' and watching as the compiler chugged through the tiny program for a minute or two. (It #includes stdio.h, so there was more code to go through than it seemed, plus there was a linking step. And not much actually happened, because Unix C compilers are silent if there are no errors.)
When I ran a.out and saw "hello world" appear, I was over the moon. It was like clearing 1-1 in Super Mario Bros. for the first time. Like I had surmounted the first obstacle and was thus ready for all the challenges that lay ahead.
I also doubt that raspberry pis are actually the type of device that schools are interested in. What they'd want is the TI-84 equivalent of a laptop. It should be easy to reset the laptop to factory conditions and it would only run a few specific preinstalled programs (one of which would be a barebones IDE, probably for Java). To me it feels like the raspberry pi fails at all of these. You have to mess with hardware and then mess with the software. A school provided Chromebook with student accounts managed by the school would be way more practical.
However, if you actually want people to learn programming organically, it's much simpler to just get a Windows laptop and use that as the main device.
On a RPi, with a standard laptop on the side (Windows/Mac/Linux, doesn't matter), and some basic installed tools, you can pull down and configure Alpine Linux and run a bare bones IoT Linux platform. If you want to get really gritty, you also configure and build U-Boot as a primary loader.
Once you get past that point, you pull Docker into your Alpine build and start running containers.
Stepping through that full process will teach you how to research (because you are going to be reading a lot about how to pull that off), it will teach you about Linux kernel configuration, and it will teach you to be at least comfortable with git.
There is a lot you can learn on a Raspberry Pi, cheaply, that only involves plugging in an ethernet port and power supply, and never seeing anything on a terminal.
Being accessible and being accessed are two different things.
I'll make a more general parallel.
Never in history has such an enormous amount of audiovisual media been produced about the decades preceding a generation's birth, whether during those decades or later. And this astonishing amount of written documents, audio, and film, both fiction and non-fiction, live takes, documentaries, is readily available: often immediately, often gratis.
And yet this generation is disconnected from the previous ones in an unprecedented manner. And yet this generation seems to be the most ignorant about how stuff was just a couple decades before them. I've never seen such a break in the flow. The material is present but not reached or understood.
* The issues with the Raspberry Pi are both with Broadcom's closed-source hardware (there are other tinkering boards with more open hardware) and, of course, with the license model of ARM.
As far as the ARM license goes, very few people are ever going to spin their own hardware, and at least the ARM programming API/ISA is very well documented. We'll see if that continues under NVidia.
Certainly MOS, ST, National, Zilog, Motorola, Intel, IBM, DEC (etc., etc.) CPUs were always closed source. That did not prevent anyone from learning about computer systems and programming then, and it won't stop them now.
That said, I doubt Apple will ever make the M1 available for purchase. Which is where this thread started.
Avoiding the black hole does not solve the part of the problem where you are financially supporting closed hardware and disincentivizing the Raspberry Pi Foundation from doing the right thing.
Pair that up with an Arduino, and maybe Wiring, and you can have hardware interaction with a Linux desktop on the 400.
I just helped a young arduino user with reading an analog pot yesterday. There are still young people out there playing with hardware, and it is way easier today than it was for me in 1984.
Then go over and look at the people configuring and compiling custom versions of Marlin for their 3D printers.
Or the nerds designing flight computers for solid fuel model rockets.
Gen Z here. I tried to give manufacturers the finger and play with systems-level software on modern PCs, and ended up retreating after realizing, among other things, just how much modern computers are not for hackers. Everything feels like an "undocumented" mess (by which I mean the documentation is likely only accessible to OEMs and those with large business deals with them). Even when it is available, I'm too scared to touch it. I found something I wanted to hack around with, until I saw the license agreement I had to read in order to access the docs. I couldn't parse the legalese, especially regarding open-source works, and I'm not consulting a lawyer to figure out if I can post some project I might hack together online under an OSS license. The few things that do get open sourced are incredibly half-assed, and you can still tell they're designed for corporations, not hackers (e.g., edk2).
Conversely, I have an old IBM XT-compatible luggable sitting in my closet. The manual for that has the full BIOS source listing. Nowadays I mostly just hack around with old non-PC devices, but for the most part computing just isn't fun for me anymore.
If the lack of open hardware bothers you, there are several non-x86 architectures that are fully open source. Chief among them is RISC-V; SiFive is taping out RISC-V processors to real hardware, selling dev boards, and stuff like that. These are real, competitive, Linux-capable CPUs, although widespread adoption has not happened yet.
On the more minimalist front, the ZipCPU is simpler (though it has less software support), and its author writes some really great blog posts about HDL (Hardware Description Language -- how these CPUs are defined).
You might also enjoy developing for very small embedded systems. I like AVR CPUs such as the ATMega328P (of Arduino fame) for this purpose. Although not open-source, they have quite good and thorough datasheets. AVRs are very small 8 bit systems, but as a result, you can fit most of the important details in your head at once.
If you want to talk more about these topics, feel free to get in touch with me. You can find my contact info on my profile, and on my website (also listed in my profile).
0 - https://riscv.org/
1 - https://www.sifive.com/
2 - https://github.com/ZipCPU/zipcpu
3 - http://zipcpu.com/
Pretty much all inside it is open for you to look into and change as needed.
At some institutions they teach how operating systems work by having students implement Pintos, a full x86 OS.
In my first year I had to build a full Raspberry Pi emulator from scratch and then code an assembler for it. These programs would then run natively on the Raspberry Pi. People wouldn't even touch the hardware until much, much later in the project, once the emulator had been confirmed to be working.
I disagree with the view that you need a full FOSS hardware to understand a platform. You can do a lot with a VM/QEMU.
Also they seem to have released a cheaper micro ATX board: https://www.raptorcs.com/BB/
Kids today aren't as skilled with computers as I would've expected years ago. I feel the app culture stunts deeper learning.
I am not a software developer and have little formal education in computer software or hardware. With that said, I've picked up enough just by growing up with Commodore, DOS, Linux, etc. to at least have the basic understanding needed to research or just figure out solutions to most common problems.
I work in media production and IT in an educational setting, and while I'm far enough along the Dunning-Kruger graph to know just how little I know, the sort of things I see from students (grad and post-doc at that) definitely give me pause sometimes. Often, the mention of basic things like how to change a setting or search for a file in MacOS or Windows is met with glazed eyes or signs of discomfort. Apologies for how "I'm not really tech savvy" follow talk of anything more complex than pointing at pictures on a screen or doing something outside of the web browser.
And yes, I do understand that the very nature of these students' specialization in other fields can mean they haven't had the need or opportunity to learn much about computing. I just feel like I only picked it up because it was how you got things done or because it was there for poking and prodding when I got bored or curious.
I think in the end it's not just that I have a personal connection to computing and think everyone should share my interests. It's more akin to a basic competency that opens a lot of doors and prevents you from being taken advantage of.
My analogy is usually along these lines: if your job requires you to drive a car, you don't need to be a mechanic or be able to build a car...but you should at least know the basics of how to operate and maintain a vehicle beyond just gas pedal, brake pedal, and wheel.
But the thing is, that's just branding: they aren't actually much more competent or thoughtful than my generation was at that age, they just seem like it. When I look at the code they write, it is as bad as the code I wrote. They have startling gaps in their knowledge and experience, just as I did (and no doubt still do — there is always something to learn!).
The thing that worries me is that until one gets to more objective measures, they really do seem more competent and trustworthy — which means that others are more likely to trust them, which is likely to lead to more bad decisionmaking at scale.
I wonder, though, if there is really a difference at all. Maybe my generation actually seemed more competent to our betters than we really were, too!
I'm sure there are Zoomers out there who can code rings around you (and me).
Perhaps it is considered more of a commodity now?
It is and it should be. What the white beards all too often forget is the fact that the generation before them would say the exact same thing about them (lack of enthusiasm, lack of interest and knowledge, etc.).
This is a story as old as time:
> The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households.
attributed to Socrates (469–399 B.C.)
How many 80s kids were into ham radio or actually building computers (Ben Eater-style: https://www.youtube.com/c/BenEater/videos ) instead of toying with micros or basically playing Lego with PC components? Same difference.
Today's computers have reached a level of tight integration and complexity that simply cannot be grasped by a single individual anymore.
People keep whining about that when it comes to computers but happily accept the very same thing in other commodities like cars, clothes(!), highly processed food(!), hygiene and cleaning products (soaps, hair gels, rinsing agents, ...), utility grids, etc.
It's hard to get young people enthusiastic about glue or forging metals, even though both are essential to our daily lives - often in ways we don't even realise. Same deal, no whining from metallurgists or chemists.
that is perhaps the most poignant take on this i have come across in a while. thanks that got the cogs turning in... different directions. i do need to ease up. :)
There has been a significant increase in demand and cost for pre-owned vehicles that do not have integrated entertainment systems and drive-by-wire systems.
In the used truck market, I have seen vehicles triple in value over the last decade.
A problem is that teachers would require some extensive training first before they can even try to teach kids something they didn't know before.
I do agree that the majority are limited in terms of technical knowledge, but I think that was always the case. It's always a (driven) minority that is responsible for tech advancements, preservation, and magic.
I wish I were 14 in this day and age, with this much information, how-tos, cheap (and expensive) tech, and microcontrollers (when is the last time you checked places like AliExpress or GitHub?).
- Hardware is ridiculously cheap and powerful and widely available
- Inexpensive educational computers such as the BBC micro bit and Raspberry Pi are widely available along with associated educational materials; these devices are easily interfaced to the real world, and can often be programmed at a very low level or high level as desired
- Overall, robotics and real-world computing is cheaper and more accessible than it has ever been
- FPGAs are ridiculously cheap and powerful and allow you to explore processor and digital hardware design
- Nearly every computer, smartphone, or tablet has a Python implementation as well
- Free and open source software is widely available
- Free and open source game engines are widely available
- Games are often moddable and many contain built-in level editors
- Source code of almost any type of software you can imagine, from operating system kernels to compilers and libraries to databases to web browsers to scientific computing to games and productivity software, is widely available
- Billions of people have internet access
- Tons of free and often high-quality educational materials are available from universities, khan academy, and many other sources on youtube, on github, and all over the web
- Many books including textbooks and tutorials are available for free as PDFs or readable on the web
The materials are out there, but the challenge is empowering people to find and make use of them.
The whole “playing” with system level software is how I became an engineer in the first place!
It’s also extremely apparent when you suggest a new way of doing things and older folks look at you like you’re crazy or worse, actively block because “we’ve always done it this way”. Those youngsters in those tiny screens have big ideas, from a different perspective, and we should acknowledge that and learn what they know.
Back to topic, I think the maturity of software engineering has enabled a lot of what we see today due to the macro cash grab of capitalism. There are things that stand firm beyond it, but more than 80% of software written is logical duplication of software already written (for another company, perhaps).
I’m guessing on the percent but it’s important to know that global overall LOC is really just hashing out the same stuff we’ve been doing (albeit in maybe a slightly more novel way) to get business value.
Less than 20% is moving that stick forward.
As a self-taught developer with just a few years of experience, I appreciate seeing this sentiment. I have this burning desire to be a great engineer and to learn as much as I can, but I'm not exactly sure what I should be doing or learning.
Do you have any advice as to activities, books, or other resources that you think would benefit a young engineer who would like to learn how to do things right?
I’m on year 20 of this amazing ride. I have another 20, possibly 40 in me. I love it!
I was self-taught. The trick is to find something you’re super super interested in and learn as much as you can from books, youtube, Google, blogs, other members of HN.
I was really into games and wanted to make quake-style fps games. I started with art. Went to college for graphic design and hated it so I dropped out. Worked on games while I did crap jobs. Eventually made a few small ones and got really into web development. Got a new job...
I can’t really tell you what books without knowing your interests.
I am in my retirement position now, having stepped back to staff engineer, because I don't want to spend any more time stuck in meetings. The best principals, the best directors, all came up through the ranks. Yes, some of them returned to college for a few years before coming back, but the best ones just came in as enthusiastic greenhorns and dove straight in.
I have worked long enough now (36 years in this field, and this is my third 'career') to have seen people I mentored grow into director level positions, and that is perhaps the most rewarding thing there is.
Aren't there more kernel developers now than ever before?
It's not the case that we're struggling to produce technically skilled software engineers.
Gen Z here. I learned most of my basic computer skills by playing / modding games, editing videos, and having no friends. Gradually snowballed into writing scripts to make things easier, learning more programming languages and eventually getting books on more formal CS topics. I hope my kids can have a similar experience.
The ugly and maladjusted among us will always find a way ;)
I can go back and play old games I grew up on any number of platforms thanks to emulation or buying old physical media. This generation won't be able to do the same in 10-30 years.
I slowly tend to think that some of those commenters, who fiercely defend all and any action taken by Apple to limit the freedom of its users, are paid marketing and PR firms who scan forums to oppose any opinion that is not in line with Apple's vision of a closed system.
At the same time, Gen Z has way more choice and exposure to "computers," so worrying about them sounds very diminishing of their prospects. Sure, adoption of computers, even ones that fit in your bag or your pocket, is much greater nowadays, which leads to a way wider range of usages, for better or for worse.
In the end same could be said about the Edison generation and electricity. In the end I barely have a clue how electricity networks operate. Can I experiment with it? Sure. Do I need to if I just want to power on my computer? No.
A lot of people like to rag on Ubuntu. I agree with some of the criticisms. But the one thing they have done right is broker partnerships with hardware vendors like dell and lenovo.
I run EndeavourOS which is like a GUI installer for Arch. Arch's package management with (AUR -- Arch User Repo) will remind you a lot of Homebrew. And their wiki helps you get up to speed quickly.
The crazy thing is that I've been blind to the massive improvement that happened in Linux-land for the 7 years I was using Mac OS.
Of course there is still some PITA with missing productivity software (in my case for hobby photography), but so far I am determined to suffer with open source tools and maybe try to contribute a little.
It's a good analogy: you have to want it. Either skater or hacker, any label really, must be earnt with time and effort. True for all things, and those who have earnt it can spot the pretenders a mile away.
Computers are a tool we use to do our work. Setting up the computer cannot become a job by itself.
For example, the latest version of Mac OS broke docker - literally today we discovered this.
That's wrong; Docker will run fine on M1/Big Sur after an ARM release and probably some patching. Not too different from what you'd expect on any major OS release.
And Docker has the motivation and means to patch and build the M1 Arm version.
You can take it directly from the Docker people:
> Thanks. We're working closely on this with Apple. This is expected right now (and indeed there's nothing we can do about it yet) as the new chips don't have virtualisation support yet.
Notice the lack of panic or any mention of "it'll never be done", even implied.
I haven't had to spend days configuring anything for over a decade.
I play along, but I haven't compiled a single thing in years. I don't even think proper build tools are installed on my PC.
My history of using Linux desktop is riddled with crap like that.
The Dell XPS 13 is supposedly good too although I don't have personal experience.
I use Macs quite a bit but the idea that running Linux on a properly-selected modern laptop is likely to be an exercise in frustration is just a myth at this point.
But you are absolutely correct, and my Linux laptop is a Thinkpad X1 Carbon. It's also well supported under OpenBSD but for what I do with it (mostly media consumption and some light gaming mixed in with the usual laptop stuff), Linux works out better.
Developers never use apple because it has tons of compatibility issues.
Can you maybe also comment further on the compatibility issues? Mac is perfectly fine for any general-purpose frontend/backend development. Unless you work in a Windows-centric environment or in a setting that clearly rules out a Mac (e.g. Linux system programming), I cannot really imagine what compatibility issue would prevent you from using a Mac e.g. for Typescript, Java, Python or Go development. Not sure why Poland would be less friendly for developer Macs than the rest of the world.
Good PC laptops are much better if you're using them like a developer would (move it twice a day, code on it for hours without moving).
Thermal design is awful on apple laptops, they are basically running half-speed by design. Can't fix anything if it breaks.
Why would people value brand over all these concerns is beyond me. Must be consumer patriotism, like with Harley-Davidson :)
Ubuntu's been really solid. Boring even. Which may not seem like a compliment, but I don't nerd out on OS stuff, I just want a computer that I can do stuff with when I need it. And I personally found previously with mac and windows my ability to 'just do work' wasn't there.
Wish it was as good on the alternatives. Hopefully it works 100% out of the box, but I wouldn’t bet my life on it.
How can it be that people are all about freedom when it comes to governments, but have no problem if private companies censor and control what they write and do?
I.e. if you take away the rights to privately create and distribute software, the current situation will become permanent.
Nobody wants to take away their right to do so (ok well maybe some do; I'm not one of them). Just that there should be rules we all agree to when we do.
We accept this in other venues. Seatbelts. Stoplights. Nutrition labeling on packaging. etc..
No such rules exist for software, which is odd. It does seem to suggest the industry is ripe for regulation. It's always disappointing to me that we only accept limits when the law places them, as though ethics and morals are somehow not good things to pursue for their own reasons.
Which goes back to OP's point: just because a company CAN be as dystopian as they can imagine, doesn't mean they should be.
Only a very small set of products have special regulations, generally those which we deem unsafe for consumers unless regulated.
“Seatbelts. Stoplights. Nutrition labeling on packaging. etc.”
A good set of examples. Every one of them is about consumer health and safety. This is consistent with liberal democracy - people should be free to do what they want unless it hurts someone else. That is why all product related regulations are related to safety.
It seems reasonable to assume you are arguing for rules which prevent the installation of unsafe software on computers which are sold to consumers.
Apple would be the most compliant company with such rules, and would benefit greatly.
If you don’t, you don’t have to use their products.
Not sure when, but I'm certain it will.
It is one of those strange things, the more closed things become, the greater the incentive is to open them back up again.
The hacker ethic is still out there, just this recent boom has dulled it a bit. A steady diet of locked down computing will fix that.
It's kinda abusive how you need to pay the hardware tax, the yearly $100 USD rent, and 30% of your sales for the privilege of developing for their platform.
> literal cancers that are eating the software industry
I wish we had a completely open system of hardware and OS. Is librem making any progress in this space?
I'm not a hacker, I'm a corporate shill because I work for my own profit within a system that's set up to ensure its own continuity. I'd like to be a hacker, but I have a family to support and am too weak or addicted to lifestyle or scared or realistically-minded to actually commit to what's required to earn the title of hacker. Same with anyone working in advertising, marketing, or consumer data collection. Doesn't matter how intelligently you solve a problem in those fields, ain't no starry-eyed kids ever wanted to grow up to make the world a better place by more accurately identifying an individual's preferences on the internet.
Fuck fake hackers. The best thing I've done with my life is being a good husband to a teacher.
The ethos went "mainstream" in the sense that it was fashionable to play the part and be in your words a "fake hacker". And for future historians and those unfamiliar with the jargon, hacker ethos isn't about criminally breaking into computer systems.
Same on mobile, Android phones - especially in the cheaper ranges - are dodgy as fuck when it comes to privacy and security.
Then how do they know what to fix/improve without that? I can't imagine average iOS/OSX users, who aren't devs, are writing and submitting detailed bug/crash reports on a regular basis.
Or is it just bias and personal preference where "I don't mind Apple collecting my telemetry because I like Apple but Microsoft collecting my telemetry is evil because I hate Microsoft"?
I don't have a dog in this fight but don't trust any for-profit company when it comes to privacy, no matter how good their PR is on the matter, especially since both of them(and most SV companies) were part of the NSA Prism bulk collection program.
Is there any guarantee that Apple gets zero bytes of your data or do we just take their word for it because their products are shiny and cool?
The same information that can contribute to increased security for the end user is also valuable to sell to advertisers if, delightfully ironically, the security vendor is ok sharing private information with said advertisers.
Consider all of the school age kids using chromebooks, many of them their 1st computer.
Will they care in 10 years as adults that their Mac doesn't allow programs to be installed from "untrusted sources"? Unfortunately, this transition probably won't faze them.
Everyone else was vertically integrated, if anything Apple is the survivor of those kind of systems, and now everyone (OEMs) else wants that model back.
A hacker in the “Hacker News” sense is more than likely working on an ad-supported web app and dreams of working at a FAANG. The old hacker dream is dead, sadly.
Back in the "golden years" you'd have to put effort in to finding and participating in the community. The real hackers are still somewhat underground simply because they're anti establishment; actively seeking out the high effort, loss making niches that attract fringe thinkers and the unique characters required for true innovation.
Crypto parties, maker faires, get to one. If nothing exists locally, build it and they will come. (And admittedly I need to take my own advice).
Everyone has an Apple or Android device, and Macs easily make up 90% of the personal workhorse laptops. Because at the end of the day, technology that just works is nice too.
I get that it’s probably very different in crypto communities, but we don’t really have any of those around here. I wouldn’t say that real hackers aren’t using walled gardens though.
If that hacker ethic lives, it's not in people labelled "hackers" anymore.
See all the events like CCC and the fediverse.
Many for sure are brilliant brains, but they fail to monetize it. To break or stop near-monopoly/walled-garden systems you have to dismantle them through the market: build better/cheaper (primary) alternatives which are free and open (secondary).
IMHO the mainstream only values convenience and price; freedom is only a "nice to have" item, and when it requires more work it is a non-starter.
...as if selling your brain was the most important thing in life.
> you have to dismantle them through the market
The large majority of modern technology was developed with public money.
Browsing data shows around 10% market share, but it is based mostly on English websites, which are a minority (and English-speaking countries have a significantly higher percentage of Apple users than the rest of the world).
Not sure if it’s a sign of aging, but I’m losing hope and interest in most of these pantomimes /s
I don't know what disputes are being argued over, so it bothers me not in the slightest. Would it help if you just ignored the detail and just worked with what's presented?
This is the first time I've tried installing linux in about a decade, after using a stable dual-boot setup during university. Things seem to have gotten worse. It'll be another decade before I try again, probably. I lost two days of pay and probably years of non-grey hair dealing with this fiasco.
One example is the simplicity of system management. Desktop oriented Linux distributions tend to take care of everything from application installation to updates with a single interface with minimal interruption to my work flow. This is only sometimes the case with macOS and Windows. Android, iOS, and ChromeOS are not realistic contenders in the application space for some users.
Cost of ownership is another factor. Linux may not make sense for some businesses if they have to hire someone to manage their systems, but an end user who can handle often trivial tasks can usually support their own system and benefit from less downtime. While paying for software is a good thing, it frequently adds many constraints on what can be done while modern business models can make licenses prohibitively expensive (e.g. subscription models or various forms of forced obsolescence).
Other reasons include: sometimes the desired software just works better under Linux since it was designed for Linux, a desire for privacy or a need to ensure confidentiality, compatibility with older hardware that is no longer supported by the vendor (but may be supported by open source developers).
Working with what's presented simply means that you are unlikely to modify what is shipped by the vendor. You can still add to it or benefit in other areas.
Ok, we disagree on that, but I get what you are saying.
But, another aspect of what you are saying is ‘trust the vendor and don’t look under the hood’. This is no different from trusting Apple or Microsoft.
You are actually confirming the general point that Linux provides no benefit in terms of who you have to trust.
I.e. it doesn’t in fact give you a system that you own and control - you are at the mercy of a vendor.
I avoid Apple because of the slow descent into vendor / walled garden lock-in, which suits neither my wallet nor personality.
I have a couple of ChromeOS devices for the kids' schooling, but they're unwieldy for my workflow.
You didn't ask, but I moved away from Windows because of increased bloat, telemetry and the dual issue of decreasing control of "services running" alongside noticeable slow-down of performance frustratingly too soon after a fresh install. I've also had two occasions (which is two too many) where Windows decided, on its own, to install updates and reboot, losing whatever I had open at the time, with no message pre- or post-update, just a clean login screen the next time I went to use the machine. I think they've made that better, and there are options to control the nature of how updates are dealt with, but I've already made the jump.
"what's presented" by Linux may not be perfect, but it's closer to my perfection than the alternatives I've tried (to be clear, this is what works for me, not necessarily anyone else).
If any company, be they Google, be they Apple, should experience total "victory" in the war on intellectual property they would find everyone who is not
* An investor
* In management
* An employee
* A temp/contractor, or
* A happy customer
I used to understand the CEO's job as cheerleader, for the investors, for the employees. I still think that is how they operate tactically, but strategically, I think the goal of a CEO in the Obama-era neo-liberal sense is to put the whole world in one of those categories above, I suppose, with the acceptable additions of
* Companies in partnership
* Competitors who participate in a trade association together in a reasonably civil manner, or
* Members of government who are lobbied to advocate for your firm or industry or a strategic need that you hold in common with the broader public
So all of that to say that I expect one day, maybe 40 years from now, maybe 80, that IP as it operates today will be transformed into an index of ideas that anyone who might put them to use beneficially for the citizenry and other residents can take and implement. It will be nice after all of that work to have a primary key for the ideas that have been created so far, so I don't think we the people will burn down the USPTO or the LOC or anything like that.
Two unexpected events in information processing technology threw monkey wrenches in the industrial and geopolitical policies of the strategic thinkers dreaming in DARPA, SRI, and elsewhere:
1 - Personal Computing was an unexpected event. It has taken all of 4 decades to put that genie back in the bottle. The topic of this thread.
2 - AI. This mainly threw a big wrench in the planned integration of Communist China into the Western system. AI enabled CCP to maintain control of the Chinese society, which was completely unexpected.
Having advanced general purpose, user programmable, computing machines networked globally in the hands of peasants is simply unacceptable to the folks who gifted humanity with these tools.
Could you expand on this or better yet, point to some reading on the subject? This sounds like a very interesting topic.
Back in the early 70s, when the deal that Kissinger and Mao shook hands on was made, there were no PCs, or pervasive networking and mobile communication and computing. It was always my speculation that the risky gambit of fast-forwarding China's development and economy and freely transferring crown jewels of technology was made with the conviction that the process of integrating with the West would naturally create a power base in Chinese society that would push aside the Communist Party. Tiananmen ("June Fourth Incident") was a manifestation of this correct original prognosis of the planners of the deal, that it would cause political headaches for the CCP. A cultural instead of a kinetic overcoming by the West.
I think the consensus now in the West is that the CCP, armed with AI, pervasive networking, and the emergence of a digital society in the true sense, can stifle any possible organic formation of alternative poles of power in Chinese society. The only remaining challenge to the CCP came from the Tech Mandarins (like the Senior Self-Crowned Bozo in the West, a certain Bill Gates), and just this week Jack Ma was reminded as to who is in charge in New China.
And of course, Eric Schmidt et al. are now salivating at the "Chinese Model" and we're gonna get the same, regardless of what little you and little me think about it.
There is money in creating apps for Apple systems, but you are giving up a lot in exchange.
Some developers use it for convenience, but I would not say it is widespread, and most are fundamentally opposed to using a Mac.
Our advertising department is the only one with Macs. Sadly, we still support Apple because every sales rep gets the whole suite: iPhones and iPad Pros (because the others are too small and you cannot work with them... what a revelation...), that is a few thousand bucks for every employee that Apple gets. They even get the ridiculously priced pen ($120?) and keyboard ($300+). And our sales people still complain about not having notebooks, because those are superior devices. There is just the presentation argument.
I do know that my next machine will be Linux. I'm probably going to put together a desktop for my office, and I'll buy a System76 laptop or whatever for those times when I want to work from the porch.
It is quite high and to correct myself: Some developers have a Mac for iOS development. Most people I know hate it.
Mac has everything I liked about Linux but with zero cognitive overhead. With brew and a window manager (amethyst) I'm not missing any features I enjoyed in Linux.
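The setup described above boils down to a couple of Homebrew commands. As a sketch (the install URL is Homebrew's official one from brew.sh; Amethyst really is distributed as a cask; the extra CLI packages are just illustrative examples of a Linux-style toolchain):

```shell
# Install Homebrew itself (official bootstrap command from brew.sh)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Amethyst is a GUI app, so it is installed as a cask, not a formula
brew install --cask amethyst

# Illustrative CLI tooling commonly carried over from a Linux workflow
brew install git tmux coreutils
```

The cask/formula split is the relevant design detail: formulas build or install CLI software under the Homebrew prefix, while casks install prebuilt GUI apps into /Applications.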
These companies have repeatedly shown that they don't respect people's privacy and people have resoundingly responded that neither do they.
I would suggest that the users here are probably some of the best suited in the world to help smooth those rough edges and even surpass the mainstream's features and usability in the privacy-respecting alternatives. I just hope we collectively notice this before it's too late.
Most likely that means I'll run a Linux laptop for programming, and have a partition for Windows on a gaming computer, cuz I love me some games. I know Steam is gradually approaching Linux, but most AAA games still run best under Win. Plus it's an easy platform (outside OS X) to run multimedia production tools. Sorry, I just don't think Gimp or other Linux tools for multimedia production are good enough yet, but the hope remains that they will someday compete with giants like Adobe. For real. And not just as free but woefully inadequate alternatives. Please correct me if I'm wrong here, though. I'd love to be surprised, if you are able!
100% agreed. And I would even go so far as to say that Steve Jobs had such a strong vision of how things could be, that it was his greatest mistake to put someone in charge who did not (Tim Cook).
While there is some merit to this claim, I don't agree with it entirely. In my experience using these systems side-by-side for over 10 years now, it's pretty obvious that Microsoft has stabilized while "stealing" Apple's good parts, and that Apple hasn't really evolved any further than they got in about 2010. In essence, Apple has allowed Microsoft to close the gap, although by about 2005 Apple managed to become so high quality on the computer market, and so ahead of its time (by "stealing" the good parts from Linux) that it's hard for me to say how they could possibly keep that position for long. (I'm not saying that they literally stole things. Obviously they've done a lot of innovation themselves, but there are also some pretty obvious signs of "inspiration" going on too.)
There have for sure been downgrades with Apple too, but not really in regards to pure stability. In my experience most of it started in the 2010s, when Apple decided to turn away from the pro market to focus more on the trend consumer market instead. At first it happened slowly, with a few technical changes that put off web designers. However, the first major blow was against Final Cut Pro. I guess it turned out well in the end, but in the beginning they stumbled badly and gave away a large market share to Adobe as media pros started fleeing from Apple. Soon after came system changes that made Adobe crash more often (I'm sure it was just a coincidence...), only resulting in customers fleeing from Apple as a whole. Today Apple Macs are decidedly more of a tightly controlled trend brand than the prosumer brand they used to be. (Kind of like Hasselblad has gone from the photo engineer's brand to an almost purely luxury brand by now.) (Also, I really miss MagSafe lol.)
But apart from that it is a bad deal.
Windows degraded itself with its bad store, telemetry and PaaS dream, but yes, as soon as they could Apple developed in the same direction. Microsoft suffers from bad decisions on some management level.
> continuing to use Apple/Google/Microsoft
That is probably 98%+ of devices that most typical people use for everyday computing, across phone, laptop, and desktop. Whether it is an android device, an iPhone, a Dell laptop or a new M1 macbook you are beholden to a corporation that doesn't care about privacy. I would argue that Apple cares a bit more, and I really hope they respond to the most recent shit show with some changes, but I'm not holding my breath.
And before the "have you tried this *nix distro??" comments appear: they are still too hard to set up and nowhere near as user-friendly for most people on an everyday basis. And hell, if you set up Chrome and GSuite as your default tools of choice, as many do (or are forced to because of their job), how much are you really breaking free anyways?
I don't know the solution, but it's clear that in the war between privacy and convenience/usability, privacy is on the ropes and it's the last round of the fight.
However, in the developer sphere, GNU/Linux is way more represented. In the Stack Overflow 2020 survey, 26% of people said they were using a Linux-based OS. This number has been gradually increasing over the years, so further growth can be expected. It's less than one percentage point behind the macOS share of 27%, though I'm not sure how much noise there is.
So Linux is usable enough for a growing community of developers, and many laptop/computer vendors targeting the developer market offer GNU/Linux as a preinstalled option.
As a result, at least for software people, Linux is a real privacy preserving alternative on the Desktop. HN is a software forum, and the blog post was also made with people educated about software as the recipients. So I think the post you are replying to should be understood not in a global context but in a context of the IT crowd.
Of course, it's not beautiful that Linux is less of an alternative to people who need to use complicated niche ISV software, but generally any industry can make the move if there is a subset of users pushing towards GNU/Linux.
Linux is fine, but it doesn't just work, and it needs to in order to gain a foothold not just in the developer community but with the general public -- that's the lesson Android has taught us. Until it does it will not budge the numbers. It just won't.
The real apologism here comes from the Linux community, who believes that by virtue of existing, and not being an Apple or Microsoft project, it should be widely accepted in spite of being a distant third in usability. Linux needs to step up its game or it will forever remain a distant third. The general public does not care about "free and open source" -- heck they barely care about privacy at all. That's just us, the HN crowd.
I love Linux in theory, I use it a lot, but my daily driver is macOS. Frankly it likely will remain so even as I transition to being an Android engineer in the coming months.
I haven't used MacOS for a couple of years but I can say for certain that Linux is not that far behind Windows and sometimes even ahead. I think the main problem is lack of application support but specifically for developers most applications are already there.
There are no multitouch gestures, there's screen tearing, 4K video playback on a 4K screen is choppy, sleep and wake up both take a long time, deep sleep doesn't work properly and I HAD TO DOWNGRADE MY KERNEL because of the stupid "Resetting rcs0 for hang on rcs0" bug that would freeze the machine for five solid seconds every minute.
I work on dotnet. I've not been able to run my Azure Functions on Linux at all. Even on Windows, I have to start the Azure Storage Emulator directly and then start func (using the full path). On Linux, Azurite is not the same thing as the Azure Storage Emulator.
So I must use windows for dotnet development. Even after four years of dot net core. That being said, Linux isn't the problem. Gnome 3 isn't the problem either. Flatpak works great.
There are a few regressions once in a while, though. For example, I'm unable to use my Lenovo Flex 14's (Ryzen 3500U) built-in mic since I upgraded to Fedora 33. Looks like the issue will be fixed in kernel 5.9, but for the moment I'm on 5.8. But it has been great mostly. If I could use it for work, that'd be great.
Someone has to say this literally any time someone mentions problems with Linux Desktop, along with "it works for me", "my grandma uses it!", "you just have to buy the right hardware", and "you're using the wrong distro". It is incredibly tiring.
It's extremely tiring to watch this happen for as long as I have been watching it happen.
Also if you have to argue with someone that linux does work for them, you've missed the point.
In some aspects, the hardware on high-end-ish laptops has moved nowhere in the last few years. Yeah, they've got slightly better GPUs to handle 4k, Vulkan/SPIR-V and new media codecs, and faster SSDs once they got rid of SATA, but the CPU performance moved just in percents and it comes with the same memory. My wife still uses a 2012 i7 Thinkpad (T430s) with 16 GB RAM, and there are not many cheap machines that could be objectively called better. Certainly not in the $300 range. Heck, until Ice Lake, you couldn't purchase a laptop with more RAM unless you went into the luggable workstation segment, so it is no wonder that macOS works relatively OK on 5-year-old machines. They are still way better than a $300 Acer, despite that Acer having possibly some better paper specs.
There are probably also plenty of companies which exclusively develop on Windows. You can have an entire career in the embedded industry, target Linux all the time, but never use it for development. Development is not all Linux based, I never said that. What I quoted were percentages.
> it should be widely accepted in spite of being a distant third in usability.
As you shared your anecdote, let me share mine. Recently I had to set up a new networked printer on both Windows and Linux. On Linux it worked immediately; on Windows it didn't and required installation of a manual driver. That installation was partially botched and removal didn't work. Only after a few attempts did it, at which point I had spent an hour on the problem. It's the same old crap that you had with Win 95, but now on Windows 10, while Linux has a proper IPP driver and no need to install outside software. Maybe Windows has an IPP driver too, no idea, but it didn't work, which is what matters :).
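For reference, the "proper IPP driver" the anecdote alludes to is CUPS driverless (IPP Everywhere) printing, where adding a networked printer is a one-liner. This is a sketch: the queue name `office-printer` and the `ipp://` address are hypothetical placeholders, not values from the comment.

```shell
# Add a driverless IPP Everywhere queue via CUPS
# "office-printer" and the ipp:// URI are hypothetical examples
lpadmin -p office-printer -E \
    -v "ipp://192.168.1.50/ipp/print" \
    -m everywhere

# Verify the queue is up
lpstat -p office-printer
```

The `-m everywhere` model tells CUPS to query the printer for its capabilities over IPP instead of requiring a vendor driver, which is why no outside software is needed.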
Also, there are definitely distros that are more usable and ones which are less usable. I think there is a true core in the meme of people being lured to linux by "use linux it's user friendly", then being told that Arch isn't that much harder anyways, and then ending up using one of the expert distros which often break in subtle and hard to debug ways.
But yes, in many ways Linux could do better usability wise. For example, it's a nightmare to target GUI Linux and then expect your binaries to work for years. It's almost easier to just ship Windows software and then use Wine. Nvidia also has shitty drivers, and in general, the driver situation isn't perfect. But things are improving I think and the gap is shrinking.
Also note that there is an effect when more people use a piece of software, there is a larger market for people who make a living with improving that software, via support contracts, consultancy contracts, etc. So once desktop Linux gets adopted widely, it might improve in many ways from the current state, simply due to ecosystem size.
Anyways, the claim was that Windows and Mac are way better than Linux and it doesn't even compare. My anecdote shows that it's not always the case :).
I switched from Mac to Linux, and it's been fine, partly because I used a laptop that has Linux installed by the manufacturer. If you only ever used Hackintoshes you'd think MacOS was a mess, too.
Every researcher I've met in the last 10 years needs some version of Linux to get their job done effectively. Largely because tools like Python/Numpy have replaced stagnated (due to years of monopoly) tools like Matlab and they are much easier to setup and use on Linux.
Mind you, I am not talking just computer scientists. Biology, physics, maths, all use linux based setups.
The thing about Android, it was essentially foisted on people, not chosen because it just works. Also, many Linux distributions today "just work" on typical hardware - almost as well as Android on your phone. And if you were using a hardware config chosen by someone else, and a Linux-based distribution pre-installed and pre-configured for that hardware - then Linux very much "just works".
It is true, though, that this is not enough to expand the user base far enough. That needs a lot of coordinated, collective, public activity.
Counter-anecdote: I have had a Linux computer on my desktop for over the last twenty years, have used Linux primarily for maybe the last eleven, and exclusively for the last eight.
I do not get the near-religious love for Apple products. Pre-OS X, they really were wonderful, best in class without a doubt. I kinda get the attraction back in the early 2000s, when Linux could still be a chore and Macs were a decent choice to get a computer which had a shell and ran real software. But now? They are just not for me. I want to own my own computer, write my own software and control my own destiny.
You may want some idea of controlling your own destiny, but it’s a mirage.
However using a Linux desktop simply is not a step in any particular direction. It is especially meaningless for non-developers to do if one of the other platforms just works for them.
Steps in the right direction involve developing software to solve the problems that the corporate operating systems have in fact solved.
The first step in solving problems that corporate OSes have solved is identifying them. For example, one of the problems is stable/versioned/sandboxable/distro-agnostic ABIs, and that is something that is being worked on, and they are quite usable nowadays. The related problem is that the corporate software is ignoring them (how many conferencing solutions support PipeWire for desktop sharing? Or Wayland in Chrome? Heck, VA-API in Chrome? Exactly... but whatever Apple comes up with, they support within weeks).
How would I even begin to know the answer to these questions when it comes to the Linux technologies you are naming?
Based on what I read here, the community of people who do use them isn’t even sure.
So excuse me, I'm going to look for a new release of Cisco VPN that works with Big Sur. Of course, it is customer's VPN, and they don't care that their vendors might use Mac, so no help from them, it is my problem now.
But more importantly - you verified my assertion.
Even if a small number of security related technologies have a smaller deprecation window, the others have very long ones.
More importantly - we can even find out what they are!
With Linux, can you even tell me whether Cairo, the graphics library behind GTK for which there is no alternative, is even still supported?
It uses a mechanism that was introduced in Mojave, so that deprecation window was very fast. Definitely faster than Cairo, which already has more than a decade and a half under its belt.
There are alternatives to Cairo, but not drop-in ones; they would require porting (Skia, for example). Other than that, Cairo is basically "done". Yeah, there is the occasional need for a release with small fixes, but as a software-based rendering library it is not going anywhere. Not that announcing deprecation is otherwise widely respected; the Xorg server was declared obsolete years ago, the last major release was in 2018, but hey, some still think there's a future and in a few years are going to be surprised by "sudden abandonment".
But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.
You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.
Is it done? How do you know? Does it need to be developed further? It sounds like it does, since people say it's very slow compared to Skia.
It seems like you think you know what the status of things are, but other people don’t. How can that be?
This is the reality of Linux.
2 years/2 releases is a small window. An alternative was delivered, but it is not a drop-in one either. Just like Skia vs Cairo, it requires porting/rearchitecting and might not have 100% coverage of what the old solution did. You might have noticed that in Big Sur there is a list of system apps that ignore VPNs and filtering. That's a problem.
> But you are confirming the point - nobody even knows what is deprecated or what the alternatives are.
Because it works differently. There is no dictator (vendor controlling access) telling you what you are going to use whether you like it or not, and present it as a done thing. Instead, it works by forming a consensus (not 100%, mind you), whether something is needed or not. The community around a piece of a stack might arrive at a conclusion that given piece is no longer maintainable, and make something new (i.e. Xorg-server vs Wayland compositors, or systemd vs sysv-init) and then it will bubble through as distributions adopt it - either because the new solution is maintained and the old one is not, or because the new one solves their problems better than the old one.
> You still haven’t answered the question of how someone is supposed to tell whether Cairo is supported or not.
Supported by whom? In this case, the original maintainer is no longer around. But that's not a problem, distributions will support whatever they ship, so if you have an application that uses it, it will continue to work.
Should you use it for a new project? No. But that has been true for years. Cairo is a 2000s-era software library. Consider it the equivalent of QuickDraw; you would not use that for new software either. Gtk (which came up here on HN recently) uses it for software fallback (and they've made it part of their API, so they've locked themselves in; changing it would mean they would have to break their ABI. On the upside, that means there will always be some support). Firefox abandoned Cairo years ago in favor of Skia, and so did LibreOffice in its last release.
So it shows that a library or piece of a stack might be deprecated for years, but there might be someone somewhere who really needs it and is willing to keep it on life support (just like Gtk needs Cairo). So the question for you, as someone who requires support, is: what level of support do you require, and what are you willing to pay (not necessarily in a purely monetary sense; though when users are expected to pay for continual support, what's wrong with expecting that at a different layer of the stack?) to get it? Because you can always get it supported, or support it yourself if you really need to. No one is going to kick it out from underneath you. The source will still be available, and it is always possible to fix it when needed. You will never be completely denied access, as is the case with proprietary stacks.
> It sounds like it does since people say it’s very slow compared to skia.
It is going to stay slow (unless someone invests heavily in it, which is improbable). It is a software rendering library, deliberately; if you want hardware acceleration, switch to Skia and accept the tradeoffs that come with hw accel.
> It seems like you think you know what the status of things are, but other people don’t. How can that be?
Simple. I asked.
There are many venues where you can ask, and people will tell you. However, since there is no dictator, as mentioned above, their answer to roadmap-like questions might be: "depends on community adoption".
And so if the answer is always ‘it depends’, you are making my point for me.
Others may have specific reason, why they are still using it (like the Gtk example). Your new project doesn't.
I understand the general public, but the developer community? Why can't the "professional-computer-whisperer" demographic be arsed to set up their computers?
In all seriousness: The extent to which you have to manually configure / fix stuff in Linux is seriously overstated, especially w/ the "easy" distributions like Ubuntu and Mint. I've installed both of those on other people's laptops and they've had no complaints at all.
Choosing examples from the UK so people can search for them in English, they care when their healthcare data isn't secured, when dating apps leak information, when local councils abuse the rules to "snoop" on people.
If you explain that without a blocking extension (or Firefox's built-in one now?) most pages they visit are sent to Google, and Google maintains a detailed profile, they will ask how to install the extension.
You can say the common man doesn't care about free speech, but enough do care that we maintain it in Western countries.
There are many reasons why people don't use Linux. The ease-of-setup-and-use lines are pure nonsense. Many Linux distributions have been easier to set up and use since at least the introduction of Ubuntu, and perhaps earlier. Just pop in the live installation media, try it out for a bit to ensure your hardware is compatible, then run an installation program that asks questions that make sense to people rather than marketers. Getting the applications you need installed has involved using a store-like interface for longer than software stores have existed on commercial desktop operating systems.
So why is it perceived as harder? One reason: users typically have to install their operating system, while end users rarely see the process on commercial operating systems. Second reason: those who do install their own OS typically dedicate their system to commercial operating systems, yet use dual-boot for Linux (which will add technical steps). Third reason: those making the transition may try to install commercial applications via a VM or WINE due to compatibility, familiarity, or (periodically) the lack of an open source alternative. In other words, it is perceived as harder because people make it harder.
As someone who has been using Linux for decades, I find the setup and user-friendliness of something like Windows far inferior to Linux. Yes, part of that is due to familiarity. On the other hand, some of it is inherent, due to there being fewer restrictions with open source software.
Non-tech people don't want (or don't know how) to "setup". The most user-friendly setup won't ever beat "no setup" (e.g., macOS).
Besides, marketing plays a huge role as well. Ask people to name a few "computer" brands: Apple, Microsoft, Google. No one would name "Linux". So, it's not just that people should be able to buy Linux computers with no setup (Dell is selling them, I think), it's also that these kind of computers should get enough marketing so people know about them.
I don't like this line of thinking because it is condescending. More importantly though, there are plenty of competent technical users like myself that would love to be using an open operating system but are fed up with dealing with the systemic problems Linux Desktop has that its developers and community keep papering over and pretending don't exist.
Strictly speaking, Windows is a usability nightmare. Most people don't notice simply because most people learn and use a tiny subset of what is there. Beyond that, there is an entire industry to support Windows (which is part of the reason why people like it). Apple isn't much better. They tend to paper things over by simply dropping support. An example was brought up by another commenter when they mentioned that some *nix programs use raster fonts. Strictly speaking, that can happen under Linux yet not macOS, since Apple dropped support for legacy software while some *nix software is decades old. Linux will have its own issues, but I'm not the best judge of that since it is my operating system of choice.
At the end of the day, any operating system will be a compromise of some form or another. Which you choose will depend upon what your wants and needs are.
My problem isn't so much that there are tradeoffs, or even that those tradeoffs are not ones I want to make, it's that there are people out there who seem to insist that Linux Desktop is the one and only proper choice and everyone who doesn't choose it is either stupid or misinformed. If Linux Desktop people were more willing to listen to why people don't use it and take criticism to heart instead of as some kind of personal attack, it might have evolved into something more people would actually want to use.
That's fine, and I agree that advocates sometimes detract from progress.
I may be an extreme example in some ways, but everyone only has so many fucks to give, and most people have legitimately exhausted their budget by the time they get to tech policy.
This year is different though! I'm going to make it work. Also looking at Pinephone for my mobile.
That's the problem: non-tech people don't know they can do that. Besides, why would they buy a PC with "GNU/Linux" preinstalled if they don't know what "GNU/Linux" mean to begin with? Almost everyone out there knows what "Mac" or "Windows" means, and so they buy Apple stuff. Again, marketing.
I have used Linux as a daily driver desktop for years, but I don't just buy random hardware and expect it to work. I do some research first and make sure what I buy works well enough with Linux to meet my needs.
If by "people" you mean "Linux Desktop developers". They have taken relatively simple things and abstracted them two or three times until they are so complex and fragile they need special programs and standards to automagically manage it all for the user. As soon as you step outside the box of what they expect (hey, why can't I install a program on a different disk than my OS?) it is revealed for the Goldbergesq garbage pile it is.
Apple user here (had Linux on the desktop for 5+ years and windows for a longer time):
Why would I care whether my program is on one disk or another? Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do. Do Linux applications all self-update using Sparkle like Mac-applications do?
I also love Apple-music where I can listen to everything on all my devices, can I have that in Linux?
Does Linux have the same beautiful font rendering that I get in OSX, or do I have ugly non-antialiased bitmap fonts here and there?
I also haven't used installation media for operating system installation in many, many years; everything just works via the internet. I input my Apple ID on a new Mac and everything syncs perfectly without any issue. Do I get that in Linux? One login and my mail, notes, calendar, photos, music, files, backups, keychain sync across all devices?
And also simpler UI questions: Can I right-click on a file/folder and have it say "Compress to zip"? I can have that in Windows, can I have it in Linux? Can I easily create an encrypted bundle and mount it in the UI? Can I drag the little icon on top of any open window into any file dialog to rapidly access that file?
Can I easily configure keyboard shortcuts system-wide?
Can I have a photos-app that is as good as Apple Photos?
I think OSX is great. It feels more and more locked down, that is true, but I haven't had any real issue with this yet. I still can develop whatever I want and run it if I feel like it.
There are like 7+ different file managers that you can install and use. One of them probably has compress to zip. Some of them are much more powerful than finder in OS X. I use `mc` and I don't even need to click. I just select a folder <F2> <Enter>, and the folder is compressed. Two pane file managers are great. I love them ever since MS-DOS times.
The fact that I can use bitmap fonts is the major benefit of Linux to me, because I don't use hidpi screens. I can have small fonts that don't suck to look at, and are perfectly crisp. I like that I have that choice.
All of those automatic sync things are an anti-feature to me. I was given an iPhone for webdev testing, and was snapping an odd photo or two with it from time to time, including photos of my ID documents, without realizing that it by default uploaded everything into some stupid cloud. Great, now my state ID is with Apple.
The few times I had to use Mac OS for development have had me running the other way ever since. Why would I search online and download files to drag and drop them around to "install" them, and sometimes click through next-next-next-finish dialogs, like on Windows, when I can just type a list of programs I want installed and they get downloaded and installed for me with no fuss, while I do something more meaningful in the meantime? I can install 10 different programs I need for some project in a single command, and it's such a time saver.
To some people Mac features may be great, to some they suck horribly.
The answer to your second question is very similar to the first one. Why should the user drag anything anywhere and decide which folder to put the applications in? It is much easier to click the install button in a store-like application.
Many users are confused by dmgs and they keep launching apps they downloaded from their ~/Downloads folder.
> Do Linux applications all self-update using Sparkle like Mac-applications do?
Linux applications were auto-updated by their respective package managers before Sparkle was a thing. I consider keeping a Linux system updated to be much easier than a Mac one; even if it is updated by multiple mechanisms underneath (apt/dnf, ostree, flatpak, fwupd), from the user's POV it is unified in the form of Gnome Software or KDE Discover. On a Mac, you have the Apple App Store, Sparkle, brew, Microsoft Update, Adobe Update, the Eclipse updater, and a myriad of other mechanisms, specific to each app.
> I input my apple-id on a new mac and everything sync perfectly without any issue. Do I get that in linux? One login and my mail, notes, calendar, photos, music, files, backups, keychain sync across all devices?
I don't get that on a Mac. But then, I'm not locked into their products.
> And also simpler UI-questions: Can I rightclick on a file/folder and it says "Compress to zip?"
Yes, you can. In Gnome (default for Ubuntu, Fedora), you can right click a folder and there is a "Compress..." menu item. It will give you a choice of .zip, .tar.xz, .7z. I'm sure KDE has something similar.
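For the terminal-inclined, the same operation is a one-liner. A minimal sketch, assuming a standard GNU/Linux userland with `tar` available (the directory and file names are just placeholders):

```shell
# Make a demo folder with one file in it (placeholder names).
mkdir -p my_folder
echo "hello" > my_folder/note.txt

# Pack it into a compressed archive -- the CLI equivalent of the
# file manager's "Compress..." menu item.
tar -czf my_folder.tar.gz my_folder

# List the archive's contents to verify it worked.
tar -tzf my_folder.tar.gz
```

For .zip specifically you'd use `zip -r my_folder.zip my_folder` instead, assuming the `zip` utility is installed.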
> Can I easily configure keyboard shortcuts system-wide?
Easily? There's no system that would do that. Some are more flexible but more difficult to configure, and vice versa. MacOS falls into the less flexible camp; for example, I've never managed to have the media keys control VLC instead of launching iTunes, or Apple Music nowadays.
> Can I have a photos-app that is as good as Apple Photos?
This is about the first time I see 'good' and 'Apple Photos' used in the same sentence.
A lot of reasons. Maybe your OS disk is on a ludicrously fast SSD but is space-limited and you don't want to waste that space on applications that don't need it.
> Why would I even "install" something, instead of dragging/dropping a single "file" (yes I know it's a directory) like I usually do.
Agree wholeheartedly. It makes perfect intuitive sense: the application is exactly where I think it is, and if I move it somewhere else then it is exactly there, and if I delete it then it is exactly gone.
I didn't find it particularly difficult to set up. Some things are much easier (TAP drivers for connecting to multiple VPNs and running side by side remote sessions to difficult workstations across different networks, for instance). Even setting up a Windows virtual box was no biggie.
There's two reasons I switched back: 1) lack of consistency across applications is hampering to productivity, and 2) lack of stability in the application/tooling is hampering to productivity.
I gave myself 6 months to get used to it, and there were a great many things I loved about it, but in the end it was less of a hurdle to deal with Win10's oddities than to deal with the various inconsistencies and "usage maintenance" of keeping a suitable, productive workspace across Linux (and yes, I realize a large part of this was due to the fact that I essentially run an MS-development shop and therefore have to work with MS tooling, but on the other hand, everyone working in any end-user business scenario has to work with Office documents, etc.)
I have tried moving over to linux several times throughout the years, but setup was always a huge issue. Installation was easy, but things like trying to get my aging peripherals to work properly, and trying to get font rendering not to look like absolute ass were always a huge hindrance.
So... way harder than Mac OS.
In fact, that’s literally impossible on my computer.
But seriously, I guess it is possible to use a USB thumb drive on my Mac with only Thunderbolt ports, but I never have and it’s never occurred to me. I suppose I still have one somewhere and maybe a dongle that would let me do it.
If you want to power the device, you do it through Thunderbolt. If you want to connect a USB device that is not USB-C, you have to use an adapter and/or a USB hub. So you would have to connect your USB drive through an adapter/hub, and if for some reason the installer didn't support this you would be screwed (but I guess most of the time it would be doable, just inconvenient).
This isn't true. This perception continues to hurt the Linux community along with others. Use a properly supported -buntu distro and you are good.
Honestly it sounds insane but I've started recommending Manjaro KDE.
Stuff breaking due to the rolling release seems to happen less often than the other stuff newcomers to Linux would otherwise have to struggle with.
Like adding a PPA or struggling to find a package: it lets you easily add the AUR, snaps, flatpaks, what have you, without a care about the competition between them.
The install is painless. KDE's stuff feels intuitive and familiar to windows users and is customisable enough you don't have to deal with some minor personal preference annoyances.
Every 2 years or so, I give Linux another try on the desktop, and it always ends after a couple of weeks, because I'm just tired of fixing mouse and keyboard settings, touchpads, HI-DPI and multi-monitor settings, font rendering and video codecs.
I've tried Ubuntu, Mint and recently Manjaro and – don't get me wrong – all of them had great stuff in it and worked fast out of the box, but it's always the last little details where desktop linux fails for me.
I spend enough time changing config files for my job, I’m not spending my own time doing it just to get basic functionality working for workstations.
Same as with mac really.
Parent must be very unlucky, or in fact not realise it is their tweaking and tuning that causes issues.
Yet the problems remained. Maybe when people say they had lots of problems and it wasn’t worth their time, take them at their word ?
I definitely encountered this on my mac work laptop from three years ago. (Left that job and have been Linux-based since then.)
Choose a mature/stable distro and that is it. Be prepared to trade a bit of convenience for privacy in case of weird/non-free hardware/firmware.
The continual denial, constant evangelism, and needless condescension, which I have watched for 20 years now, is what hurts the Linux Desktop community.
And yet, invariably, people aren't "good", and when they complain, they are told "use this other distro" or whatever...
I suffer through all of it because I need docker with gpu support, but it's honestly a lot worse than what I expected.
I would generally recommend Kubuntu over Ubuntu -- it seems like Gnome aims for minimalism somehow, and has thrown out a lot of functionality that used to exist, and generally tends to emulate OSX for a lot of things. (E.g., OSX doesn't do alt-tab for multiple windows of the same app either IIRC.)
BTW, Kubuntu's default image viewer would likely be Gwenview -- You can get it to show a lot of information by clicking "More..." in the Meta Information panel and selecting stuff in the resulting dialog window. Here's a screenshot I found on the nets: https://www.fossadventures.com/wp-content/uploads/2018/05/Gw...
Thank you sooo much.
The office suite is easy to use. Thunderbird is as easy to setup as Mail. Networking and devices are easy to configure from the gui, and Firefox and VSCode run without crashes. Proprietary video drivers for common hardware can be downloaded and installed via the gui. It's at least as usable from an admin standpoint as Windows 10, and from a user standpoint as Mac OS, albeit with a much smaller "app store" experience.
You need to open a burger menu in the top right corner. And then the new folder button is purely an icon and it doesn't show a folder.
One thing with Nautilus is that if there is no empty space to right click on, you can't get to the context menu, because you always end up in the context menu of one of the icons. There is no dead space between them to click on. In this case you can click the breadcrumb to get the context menu or use the icon in the overflow.
Look where fighting Facebook and political influence got us. Nowhere.
I think the reason Apple, Android, and Windows have the advantage is that none of those 3 OSes require installation by most people. Why treat Linux differently?
>>> and nowhere near as user-friendly for most people on an everyday basis
User-friendliness, I think, is more about day-to-day interaction. It would be interesting to see a study in an area with low computer/smartphone exposure: split people into 4 groups, give each group its own OS, and later test them on some tasks.
Actually even easier because there's no "You must have an AppleID to use this machine - if you don't have one go and register now" step, and no associated problems with that.
Have you read the article?
Spoiler: They don't.
Your style of comment is also quite the meme at this point, though.
I always reply with: if my dad, mom, grandfather and other 10+ members of my family can use Mint and I have to do minimal maintenance for them, much less frequently than with Windows, then so can you.
> [..] they are still too hard to setup and nowhere near as user-friendly for most people on an everyday basis.
I think my comment is a fair rebuttal of nowhere near and most. If it was true, all of my family members would have to be outliers, which is statistically improbable. The more credible explanation is that it is not really true (and in my opinion, it clearly isn't).
Yet Dell and Lenovo offer computers with Linux pre-installed, but so many technical people ignore them because Apple is shiny and fast.
If this trend continues then Linux and other open source OS will eventually exist only as a virtualized guest inside a locked-down proprietary system.
There is genuine choice in the market now, but in ten years there won't be if people keep chasing baubles.
I got a Lenovo last year with Ubuntu preinstalled. It was the "blessed hardware". Wifi didn't work after wake/sleep and my connection via USB-C to my monitor was unusable. I switched back to a Mac because I wanted to work, not fix my tools.
These aren't one off anecdotes, there are dozens of people in this thread with the same story slightly different. This isn't people going to Apple because it is shiny, it's going to Apple because managing hardware isn't their job.
Either verbally address that meaningfully or drop the condescending attitude.
Legislation and regulation are the only fixes for these practices. The EC has a lot of irons in the fire here, on multiple fronts. I trust that, although it will take time, "fairness" will prevail.
I didn't. I encouraged friends and family to ditch IE for Firefox and nowadays I encourage them to ditch Chrome for Firefox. I never cared that Chrome was a little faster for some time.
I helped one person, who only uses a (very old and underpowered) computer for mail and printing documents, install Linux (not Ubuntu ;)), and since then they are much happier, because it doesn't get slower with each security update and needs less maintenance help than Windows did.
You should probably start advising that they ditch Firefox for IceCat.
Antitrust entities bashed Microsoft for shipping IE, and look where that got us.
If anything, it's worth keeping in mind that volumes of regulations written for big corps may indeed force them to behave, but are an impediment for tiny companies to enter.
Corps like Microsoft call for regulations and copyright laws all the time; they have an army of lobbyists, and they know that even without regulatory capture these extinguish competition.
I found myself being one of those people, so I made a commitment to no more Microsoft or Apple for my personal devices. I'm eagerly awaiting PostmarketOS to reach full functionality on the PinePhone and I'll give up my Android phone at that point, taking Google out of the equation as well.
I'm even considering trying to learn C so I can contribute to OpenBSD for my hardware that is "rough around the edges".
 My wife's PC with Windows 10 is exempt for now; she's not a techie person at all and I don't want to force her into something alien. She tends to borrow my Thinkpad with Linux every now and then and is getting more used to it, so we'll see how that goes.
I think we are handing over the future of tech to these few tech giant megacorps.
With better design and products, of course; nonetheless, the business practices are the same. Much of this is enforced by the current economic and political system.
Otherwise, here's Windows 10, have it for free, hell, here's Linux on Windows, just stay on the OS, buy apps on the Store and see these ads! Also give us telemetry.
Side note: no one cares, but I finally tracked down the cause of a really annoying bug, hibernation due to "thermal event" (CPU "overheating") in Windows 10: It's because of using dynamic disks in mirror mode (perhaps in other modes, as well). Makes Windows 10 hibernate randomly. Converted disks back to basic, no problems. Took me a long freaking time to finally solve this.
Their backdoor put a really bad taste in my mouth. Will try to avoid them if I can.
They're doing it because the cloud is Linux and they need to get into the cloud because Nadella's vision for the MS OS going forward is a unified system that spans PCs, phones, tablets, IoT, everything, not just a desktop OS.
Define "much". Windows 10 is a mind-boggling privacy invasion. It's just the other offenders attract more attention, that's all.
How would you get around the problem that many vocal Linux users think easy-to-use software is for weak losers, and consider difficult software a sign of manliness and an identity, even a religion? This is like saying Buddhists would be the best suited people to disarm gun-lovers, without considering that said gun-lovers don't want to be disarmed, and see pacifists as weak and unmanly.
I agree that it's not as much as a polished experience, but that's the point as indicated - trading off polish for ownership.
I'm excluding professional document editing from the discussion; compatibility with LibreOffice is mediocre (at best) in my experience, but having very good compatibility is not a consumer requirement IMO.
But it's also just as likely that you're using a browser made by Google, and using other services made by one of the big companies that don't care about your privacy. Where's the difference?
> The alternatives are still completely unusable by most people.
Otherwise the statement wouldn't make sense.
Or their smartphone, which is a hell of a lot smoother experience. So why does Linux Desktop continue to pride itself on being "good enough" for an "average user"  that doesn't even use a desktop?
If all people need is a browser, they should buy ChromeOS or an iPad.
Ironically, one of your two suggestions is a Linux distribution.
There is nothing ironic about that. Users don’t know and don’t care.
Are you equating the presence of the Linux kernel with some notion of software freedom or independence from corporate control?
If all people want to use is a browser, Linux is irrelevant.
Browser apps need connection. Most of the services are also subscription based.
Whenever someone says “I don’t get computers” or gets angry at their smartphone they’re just leveraging that expectation and conning themselves out of a gain they could make. Same problem - laziness and expectation. At the end of that path is a life of a sharecropper and a cow to milk.
Macs are shiny and their marketing department is fantastic. Windows has a monopoly with OEM preinstalls. Those are the main reasons why macos and Windows are so popular.
I've tried using Windows, it's "completely unusable" to me. I couldn't even figure out how to install it without creating a Microsoft account nor could I figure out how to turn off "telemetry".
While it sounds nice in theory, it is just unrealistic.
We'd also need to educate people why they cannot use random proprietary rental application on their mobile device that's not google or apple, or connect to a university vpn that doesn't support linux properly. Many workplaces use zoom for meetings, or google suite, etc etc
Exclusively switching to open/ethical software starts to resemble living like a hermit in the woods, and in reality makes one's life more difficult, reducing the impact the person can have on others.
The fact it happens to use the Linux kernel is like saying MacOS is based on FreeBSD.
Most people are using their (personal) computers for browsing, maybe an email client if they're not using one of those web-email abominations, the odd bit of word-processing and watching movies/listening to music. Linux works just fine out of the box and the install is substantially simpler than anything I've ever seen over in the MS world.
Today we just tell auntie to buy an iPad.
I mean, this is just plainly not true.
Apple has done many things, things they did not at all need to do, in order to protect user's privacy. They have shown, repeatedly, that they do in fact care.
This is the only reason I had to move away from (arch) linux and it saddens me every day.
Or we like what Apple/Google/Microsoft are putting out, including the "walled" nature of the iOS/macOS. People's homes are also "walled" for a reason.
Your concerns are not everybody's concerns.
That said, I don't like the telemetry and information leak as described in the article, and would want Apple to stop it and allow disabling it.
But I also don't care for using another OS just because "it's actually yours and you can do anything you want with it". That's also true for TempleOS, but I wouldn't want to use it (to use an extreme example of an alternative), and similarly, I prefer macOS and the app ecosystem to any Linux distribution I've seen (and I've used UNIX when there were still UNIX wars and SunOS workstations with black-and-white monitors, and Linux for close to 25 years).
The majority of iPhone users are women, while HN/Reddit are primarily male-dominated. It should be no surprise that Apple caters to aesthetics over thermals and that you see people complaining about that on these platforms.
Apple customers put status and prestige above cost when deciding to purchase something. They are rich. They care about the environment, which is why Apple "cares" about the environment. They care about privacy, which is why Apple "cares" about privacy.
So for them, it's more reasonable to replace the device than to allow it to be repaired at some shady shop. It's consistent with their image. They can't compromise on privacy or safety, because that's why they bought into the Apple ecosystem.
I sold iPhones for years and never noticed this trend. I highly doubt it is true in any significant way. I would also challenge the assumption that women want aesthetics over some other factor. In this day and age, that's an incredibly 'old fashioned' statement to make.
From what I could tell. People bought iPhones because they were:
a)Cool and of good quality. (Your everyman can just go buy an iPhone and KNOW it won't be junk. They don't know the other brands and researching stuff is a pain in the ass)
b)They were used to using them and didn't want to use android/didn't like using android phones
c)They had shit experiences with other platforms and just wanted something 'that would just work'
d)The iPhone really was the first good smartphone on the scene. They have inertia associated with points a->c above.
If spending in the cosmetics industry is any indicator, women definitely care more than men about how something looks. There are hard numbers, even if you don't believe in the various evolutionary theories and biological differences that make women more prone to noticing subtle visual and design cues.
From 2010: https://appleinsider.com/articles/10/12/01/women_want_apples...
From 2013: https://www.statista.com/statistics/271204/android-vs-iphone...
From 2015: https://www.statista.com/statistics/513995/smartphone-user-g...
I am not aware of any large-scale recent surveys focused entirely on that, but there are many dating surveys which report women having a preference for men with iPhones. The numbers most of them report (70%?) are likely dubious and could be influenced by other factors, but still, it seems women do prefer iPhones more.
I don't think this should be odd. iPhones have the best overall cameras every year and are huge on Instagram and Pinterest (platforms dominated by women), so I can only extrapolate that the difference would be bigger now with social media booming.
In addition to that... I actually sold the damn things to actual human beings! I KNOW who is buying them! I've met them by the thousands! This is like telling a mechanic why Fords don't break down by pointing vaguely at some tables.
Your data doesn't particularly support your views either.
Likewise, men spend a lot on fashion and grooming, and the same can be said about women and cosmetics... but that's mostly about cultural customs.
PS: Only the latest iPhone has the best camera. They have traded places with the Pixel, Samsung, etc. over the last few years.
PPS: Your stats are a bit out of date... What's more, the 2013 data contradicts your claim: women were distributed equally between Android and iPhone in 2013. There would be a shift toward iPhone if women wanted iPhones more.
> The 2013 data contradicts your claim.
It doesn't. Men want the iPhone less than women do. My initial point was simply that people here and on Reddit (male-dominated sites) are not representative of Apple's target audience.
In that data, 31% of men wanted Android while only 24% of women wanted the same.
> PS: Only the latest iPhone has best camera. They traded places with Pixel, Samsung, etc... over the last few years.
This is misleading. While iPhones fared worse than the Pixel in the photo department for three years, they make up for it in other aspects, especially video. As for Samsung, it depends on what you consider a better camera: the iPhone arguably has the best color profile and accuracy, while Samsung phones apply beautification and increased contrast, which makes photos look nicer but less accurate overall.
You're trying to turn what is customary in Western European cultures into a claim about biological preference.
Sometimes it's better to say that "we don't know", because that is exactly it.
> This is misleading.
No... Your original claim was misleading; I just forced you to correct yourself.
In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)
Are you serious? Since you're using the term "bipartisan," I'm going to assume you're also from the US. Have you been following the news or current events recently (and by recently, I really mean any time at all over the course of the past forty or fifty years)? Almost nothing actually gets bipartisan support these days. Our government is in a constant state of gridlock, and it is next to impossible to get anything done.
> In times of strife, at least there's one thing you can always count on: Democracy. It never fails. If you can just remember that one thing, you should be able to sleep soundly at night. (It works for me at least, I used to be a constant worrywart until I learned this trick from my therapist, and I've been golden ever since.)
I'm really having a hard time telling if this comment is sarcastic. I'm not trying to cause you to lose sleep or anything, and honestly, I commend your optimism if you're serious, but the fact is that democracy is an abstract concept that is almost never put into practice perfectly. Even if democracy's promise of "most people being happy" (meaning 51 people are happy while 49 are not) were actually sufficient, that idealized version is very far from our actual political system in the United States.
I think one thing most Americans actually would agree on is that the way we do things here, right now, in terms of government, doesn't work so well. There is a lot of disagreement about what should change, and how, but it seems like very few people are satisfied with the current state of affairs. The more I think about it, the more I think this comment just has to be sarcastic, so this reply is probably a waste of everyone's time, but in any case: wow. I wish what you were saying here were actually true. That would be a nicer world to live in than this one (although, frustrating as it might be at times, this one is also not so bad).
There is bipartisan support — for breaking encryption: https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020#Legislatio...