I have seen people here address this before, but it's always like a list someone from Lebanon would write after moving to Costa Rica: never really that things are wrong on the other side of the fence, just that they're different. Stick it out a bit and the pain of those differences melts away.
If I had to shift to MacOS now it would be far far more painful than the shift from Mac to Linux.
With a Mac I can have it set up to my liking and be working in little more than an hour. Under Linux (and to a lesser extent Windows) it’s a multi day project.
This used to be the difference.
But between Preview.app bugs (it crashes all the time when previewing Beamer slides, renders slowly, jumps through slides in non-linear order), sticky or stuck keys on the butterfly keyboard, an ancient UNIX userland (yes, you can install newer versions with Homebrew or Nix), weird bugs when running XQuartz, etc., the gap is narrowing pretty quickly.
Also, I have been using NixOS on my Linux workstation at home. I like it so much that I want fully declarative system configurations everywhere ;).
I would like to elaborate, but I think it is pretty futile to do so online, which is pretty telling in itself. One example that should challenge people's view would be the ssh defaults story that is on the front page right now. Everything in Linux is pretty much like that if you dig deep enough. And while its competitors aren't perfect either, they are just far more consistent, because consistency is a major selling point for them.
I don't see how you could even compare enterprise linux to home linux. Enterprise linux not only has a different userland (i.e. it's designed for LTS setups -- stability, not usability), but almost all enterprise places run 20 versions behind anyway. Not only that, but you're facing peculiarities of your setup that no sane home user is going to face in a billion years. The entire experience is different.
Go pick up Linux Mint's Cinnamon distribution, then see if you have the same complaints.
Quality isn't an absolute measure. If you were to plot the security holes in the Linux kernel on a timeline, you would most likely find that there is at least one open the majority of the time. Security is essentially a percentage of how hard something is to discover and exploit. And you don't know what that percentage is until someone goes looking.
The same is essentially true for the rest of the operating system. Your average user can be perfectly happy, e.g., clicking on random links and executing random code without apparent consequences. They will rave about Linux until reality catches up with them, whether that is software conflicts, (lack of) backups, security, or just maintenance in general. Usually at that point they get disillusioned and switch to something else. That is why we have the perpetual "year of the Linux desktop": people leave at about the same rate as they join.
By working in enterprise Linux I, involuntarily, get to see all this at different points in time. Not just from my own perspective as someone who has largely accepted these problems, or from the perspective of a user that is excited about Linux until they aren't. It is from this perspective, and because I know how hard it can be to do things that are outside of the default, that I say that the other big operating systems are better overall.
Lack of either a near complete monopoly or a track record with dedicated, quality hardware. Throw in some questionable leadership decisions.
I don't think there's anything inherent to Linux that's unable to level up on the desktop, but the people best placed to shake things up are probably manufacturers who'd rather take a cut of a Windows license.
Windows has certainly been consistent in not offering an SSH client at all for a very long time.
On MacOS there is one way to do everything. Want to make a desktop GUI app? Xcode has a project template for you. If you deviate from the APIs it suggests, you'll get nothing but pain. On Linux, there is no default window API. You have to choose between Qt and GTK and <a billion other things>. That leads to fragmentation at the windowing-system level. And just think: that fragmentation happens at all the other levels too.
Ubuntu has removed some of the fragmentation, since it’s so popular that you can just target Ubuntu and let your application fail on different configurations (or worse than failure - technically work, but require tons of tweaking in config files. Ugh).
However, Ubuntu has reached the point, IMO, where to provide better UX, they would need to start making Ubuntu actually incompatible with other Linux distros, or push the Linux community to adopt new standards that are better for desktop linux. A perfect example is Wayland: supposedly Wayland is a lot better than X11 for desktop purposes. It’s just too bad that Ubuntu can’t just break compat and blaze their own path here, or we would probably have a great Linux UX story already.
What they should do is the opposite: start with the toolkit, the network infrastructure, the management utilities, or whatever else serves the use case. Once you have, e.g., a solid toolkit that is attractive to developers and people start making good applications, the rest is a matter of time. There is of course a reason why people don't do this, which is that it is hard. But hard is good; it is complexity that will kill you.
This is of course essentially why Android and Chrome OS are successful in their own right.
I'm a Java dev mostly, so perhaps I'm insulated from such things by the JVM. So long as I can run Chrom[e|ium], Eclipse, Docker and a terminal with a bash shell I'm pretty content. Even with that small list I've got issues with keyboard input lagging in Chrome on MacOS and Docker performance is poor relative to Linux. iTerm is a thing of beauty though.
I've got a box running Wayland on Arch. Can't say I've noticed any differences as a user from X11, I'll have to look them up and see what I'd missed.
That has changed. One of the Windows 10 previews offers a client and server (Ed25519 only). I've been experimenting with it on WSL and it seems functional... time will tell whether or not it is secure.
Back to the point of the original discussion: having worked using OSX/MacOS, Windows10 and GNU/Linux in various combinations for the last couple of decades I honestly prefer the desktop experience of GNU/Linux. As long as you avoid the dodgy hardware (Nvidia cards, an ever decreasing number of wireless chips) then everything is painless.
At this point really the only thing distinguishing these operating systems as general desktops (as opposed to gaming systems) is whether or not they are Free Software or not. That is something that is becoming more and more important to me in order to avoid being compromised by criminals (governmental or not).
On Mac OS, I'm also pretty happy with the stock interface, but I spend quite a bit more time downloading and installing dev tools. Then I end up running out of memory because Docker for Mac and Minikube both end up running Linux virtual machines in the background.
With a Mac I set it up as much as I can in little more than an hour, with no hope to achieve what I really want. Under Linux (and to a lesser extent Windows) it's a multi day project but I can get whatever I want.
How much of that is due to being accustomed to using Mac's defaults? I mean, Debian's default setup is pretty much my ideal setup, but that's because throughout the years I've got accustomed to it, not because Debian's UI guidelines are awesome.
I have had Mac users use my laptop for a little while and love the usability. Everything simply works. I use Skype Web and Zoom to do videoconferencing ...and obviously all the Slack-like tools work great.
And you have the choice to buy spectacular developer focused hardware like XPS or ThinkPad laptops.
The usual suspects: fonts, video drivers, sounds drivers, various USB devices supports, sleep/wake issues.
And lack of basic apps, like a good multitabbed SSH/RDP terminal.
Fedora is spectacular and has been spectacular for many years now. The driver support is brilliant. It was one of the first distros to have absolutely seamless support for RAID-mode NVMe (which is what the XPS shipped set to).
I will be very surprised if you have to "tweak" your laptop. Everything that you have in OSX is already there - including nightmode, etc... the works.
And here's the cool part - customizing Fedora is a browser extension away! http://extensions.gnome.org/
It usually breaks for users who do not respect the package manager and what it does, break their installation with miscellaneous convenience scripts and tweaks run as root, and then wonder what went wrong.
This comes up somewhat regularly, and probably always will - it’s the nature of the beast.
I’ve found myself very happy using a Mac as my main desktop. It’s not perfect either, but good enough.
It's tempting to compare Linux with MacOS, but also (in my opinion) unfair. Apple appears to be making conscious decisions about what goes in their machines and not simply picking the cheapest components. I think Windows is closer to a fair comparison (they have to try and support everything out there). I have had no more hassle under Linux than I've had with Windows.
EDIT: Rather than downvoting, perhaps those who disagree with me could tell me why other companies aren't capable of "making conscious decisions about what goes in their machines"?
Apple is, after all, far from the only Unix vendor to do so. Whether it's HP-UX or Solaris, it's fairly common for Unix vendors to tightly couple their software to supported hardware. What is preventing Red Hat from offering a fully Red Hat compatible laptop that works near-perfectly out of the box?
If you make your purchase outside this list and something doesn't work, that was your decision and it is now your problem. It is not Red Hat's or Canonical's problem. They are not in the hardware business; their partners are, and the information about which models were tested, and with what result, is publicly available.
Everyone has their own stories and use cases, so why are people so hell-bent on insisting that their use case and solution is The One True Way and-I-can't-understand-why-you're-using-anything-else?
So yes it's silly, but it's fairly strongly motivated silliness.
Skype and WhatsApp video are the only two that work reasonably well in India with shaky mobile bandwidth. I mentioned Zoom as well.
But here's the thing - have you used Skype Web? There's no installation needed.
Regarding the rest of the world, WhatsApp definitely comes to mind. For chat, video, and audio.
I'm using an XPS 15. A recent purchase - I'm an OS X user by choice, but Apple currently makes no laptop hardware I'm interested in buying.
Here's the relevant archwiki page: https://wiki.archlinux.org/index.php/Dell_XPS_15_9560
It would take me days to work through that horror. I'd still end up with an only partially-working machine. Yes, it's the fault of Dell & chipset manufacturers for not supporting Linux, but if I'm going to spend effort apportioning blame for the world's ills, I'll do that on things that matter more (torture, ecosystem collapse, social isolation).
In practical terms, I can spend time setting up & configuring Linux, or I can use Windows today and plant some vegetables, or walk up a hill, read a new volume of history or philosophy, talk to a friend. Windows is kind of nasty, but all the applications I need are available for it, it can launch them, it can give me access to files, and I don't spend any time messing with it. Good enough for me.
By "user-oriented", do you mean kitsunesoba-oriented? Or do you think all users have the same preferences?
> With a Mac I can have it set up to my liking and be working in little more than an hour.
macOS doesn't even include tiling window management out of the box. Apple clearly optimizes macOS for the average user. If you're an average user, that's cool. But for people with sophisticated workflows who aren't average users, the out-of-the-box experience provided by macOS (and Windows, GNOME, etc.) may be grossly inadequate.
In general, the existence of a bias is insufficient proof that you should reverse your conclusion.
Sure it’s a convenience, but so is basically everything we do to alter a computer after unboxing it.
I agreed it's nice - but essential enough to merit the time sink of running linux? Not for me.
Maybe I’m just a Luddite that doesn’t know any better, but I’ve been quite happy without tiling windows for many years. Then again, I spend most of my time in a terminal in a tmux session. Maybe that counts.
A common thread here is: your use case is not my use case. Many people are assuming that their workflow is the 'one true way'. That's not a healthy way to have a discussion.
I was merely pointing out that macOS does allow for the following, in some way, out of the box:
> Every OS should let you line up two windows side-by-side out of the box
Personally I feel like I’m using a toy when I have to use a non-tiling window manager. It may not be for everyone, but the display server or desktop environment should at least provide an interface to allow someone else to implement it.
A one time xmonad + arch setup that I did a couple of years ago has lasted me across 3 machines with only minor tweaking.
I try to make few customizations to the default macOS install, and just roll with the changes coming from Apple. I do have some dotfiles, containing mostly just SSH and ViM configs. But I've already had to fight ViM several times because of it.
- can't run apps I depend on (e.g., Sketch.app only runs on macOS, and I have a lot of existing files in it, and there's a lot more support for Sketch than, say, Inkscape. Other examples: Things.app, 1Password... hell, even Spotify barely works on Linux (does hi-dpi work yet?))
- hardware quality & support (if my Apple laptop has an issue, I can take it into an Apple store and get it fixed same-day in many cases. also, recent keyboard issues notwithstanding, Mac laptops are generally seen has having best-in-class build quality and durability)
- hardware support+power management (much better than it used to be, but depending on which laptop you choose, many people still have issues getting hardware edgecases, like suspending and resuming, adjusting the backlight based on ambient light, applying optimal power management, etc)
- touchpad drivers (libinput and wayland are making progress on this front, but the experience is still not on par with Windows Precision and macOS)
- mobile device integration (unlock my computer with my watch, get phone texts forwarded to my laptop, make and receive calls via my phone, etc)
Trust me when I say, I really wish I could move back to Linux. I spent many years helping develop a desktop Linux distro. And certainly things have improved substantially in the last few years — hiDPI support in particular is really excellent these days. But there are still a number of barriers for even technical users. At the end of the day, I want to get work done with my laptop. I don't want to fuss around with configuring, or worry about what happens if an update upsets my driver compatibility (yes, I know about btrfs snapshots and kernel image caching).
Just my $0.02. Sorry to be a downer.
Having integration between iPhone/iWatch/MacBook is great and I have come to accept Apple’s UI experience as definitely good enough for me.
I love Linux, and do much of my day to day coding on Linux servers with far more horsepower than my laptops. Working in macOS for email, writing, research, small code experiments, etc. is fine, as is working in a few SSH shells on a server with a nice Emacs/tmux/etc. setup for development. I find that I can live without the nice JetBrains IDEs, VS Code, etc.
Edit: I am nearing ‘retirement’ when I will do less software development and more writing. I am seriously considering switching to a really nice Chromebook since that would also meet my needs - so I am not permanently married to Apple.
That kind of depends on the country you are in, I guess. Here in the Netherlands you first need to make an appointment (~1 week), then they order the parts (~2-3 days), then they fix the issue (~1 week). All in all you are without your laptop for 1.5 weeks. That is really lousy support compared to something like Dell's next-business-day on-site support.
I'm in the USA.
I suppose the difference in handling is that other vendors often have a wide open parts chain, and Apple doesn't. You can't buy them as a consumer or user, and as an APSP or AASP you can only get parts when ASD2 says there is a reason to, and you can only order them via GSX. In some cases, for example when it's a big item like a screen assembly or a mainboard, you have to send the item in first. All of that stuff takes time, and for many users, that is a reason to get next-day AppleCare support so you don't have that problem. You just get a replacement sent or someone at your door. Yes, that exists, just like with Dell and HP.
The AASP I contacted could not return the hardware until the repair was done, so bringing it in when the parts arrived was impossible. The Apple stores need 1 week for repairs nowadays.
> a reason to get next-day AppleCare support (...) Yes, that exists, just like with Dell and HP
Only for enterprise users, and it sets you back 400 EUR/yr. Dell charges 190 EUR for 3 years (in total). Quite a difference.
I used to have Macs but made my way back to Linux (Manjaro) and happiest ever
In the hardware dept, people are always bringing up the issue that Linux doesn't work well on all the random laptop hardware. It's true but it's not exactly a disadvantage compared to OS X, which works well on a much smaller set of hardware.
This is the infuriating part. You can buy an OS X license, but only to run on Apple hardware. So you can't legally run it on other hardware.
Not sure if this is really enforceable here (Norway) vs consumers due to somewhat strong consumer protection - but it's probably enforceable against businesses, which is bad enough.
This is the only thing I need before I move to Linux on my laptop. Until then I'm sticking with Linux on my desktop and MacOS on my Macbook.
I have been using 1password with wine for over 3 years now.
These aren't Linux issues, they're hardware issues. Pick up a decent-quality laptop which lots of developers use and support is rock-solid. Pick up a cheapo Win10 plastic thing, and yes, nothing will run great on that.
And the first step was to open up the laptop to swap out a part!?!?!
I’m sure Linux probably supports the built in WiFi card on this laptop too. But I remember having to do just this same thing with an older laptop to get Linux to support the WiFi chip.
Every response I ever get is always so disappointing too because it's usually people who think your time is worth nothing and they casually want you to spend hours screwing with Bumblebee or whatever. And when I do build up the patience to actually try it ends up breaking something else about my graphics.
Sorry this is a sore spot for me. I do robotics so using a Mac for years was painful when I had to run ROS, but everything else about it stayed completely out of my way.
Ubuntu is fantastic only when I low-pass my actions, avoiding anything slightly edgy, unusual, out there.
Oh my god I had to compile mouse drivers for my Logitech MX to make the wheel stop screwing up. It's a mouse!!! They were invented in the 20th century!
My Bluetooth driver just crashes once a week. Silently. My mouse just stops working. I have to power it off fully because a reboot won't do it.
I lived in an apartment surrounded by 50 or so wifi networks. I would have to toggle wifi on and off over and over the first time connecting, because the damn list wouldn't show them all and mine wasn't on it!
I've worked in product and while I still think so much about product is just utter nonsense, some of it isn't, and it's those practical parts that seem lost on so many engineers.
Why can't I make that damn Ubuntu software update thing not appear every day? I've tried to kill it with Stack Overflow answers a few times.
Why do I have to pick between chrome having tiny controls and the rest of my UI having massive controls on my 4k monitor?
I don't care about what brand it is. If it's cool. If it's open source or free. I just want it to all just quietly work and stay well out of my way.
No actually I'm not done yet. Why do I have to tell it every time what kind of headphones I just plugged in? And when I leave sleep it forgets and starts blasting 80s eurodisco in the office even though they're still plugged in?
Okay, I think I'm done for good this time. That's just, off the top of my head, the list of things that make me feel like I'm trying to become a master musician by practicing 10,000 hours on a kazoo.
Aside from needing to write a quick bash script to get the settings for my trackpoint to stick with systemd after reboots, it's been absolutely rock solid.
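For the curious: "stick with systemd" here just means a oneshot unit that re-applies the sysfs settings at boot. A minimal sketch of such a unit; the serio path and the values are examples from a ThinkPad-style setup and will differ on other hardware:

```ini
# /etc/systemd/system/trackpoint.service -- illustrative example
[Unit]
Description=Apply TrackPoint sensitivity and speed at boot

[Service]
Type=oneshot
# Device path and values are examples; check /sys for your own device
ExecStart=/bin/sh -c 'echo 200 > /sys/devices/platform/i8042/serio1/serio2/sensitivity'
ExecStart=/bin/sh -c 'echo 160 > /sys/devices/platform/i8042/serio1/serio2/speed'

[Install]
WantedBy=multi-user.target
```

Enable it once with `systemctl enable trackpoint.service` and the settings survive reboots.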
I don't know which laptop you're having issues with, but a $120 used Thinkpad (X/T/W430+ for dual external monitor support) and any modern linux distro will work out of the box with little to no fooling around and will keep up with almost any new laptop in terms of performance, reliability, and battery life.
But one of the problems I attribute to Linux more generally, is that if you Google for information on something like KDE vs. Gnome, you tend to find discussions where people get into the weeds on things very removed from "which one works and lets me get things done?". The threads are more like sports fans debating the merits of their teams while trying hard to be civilized about it. Which is how I ended up not trying KDE first even though that was the obvious default.
Not just the visual appearance, but for example keyboard shortcuts are relatively universal; Cmd + , is the shortcut for Preferences/Settings in Safari, Chrome, iTerm, Slack, NewsReader etc. I don't know if that would even be possible on the leading Linux DEs.
I love the customizability of linux, but I've decided that a good user experience goes beyond the appearance, and what's required is putting serious thought into user workflows. Those sorts of usability studies don't seem to be a priority for most FOSS projects.
There are things about Linux that I miss on Mac, but overall I prefer a system that's consistent to one that ticks every single box.
Kinda funny, but the inverse is true for me. I am forced to use a Macbook at work and shortcuts are a frustrating thing. Do I have to use Cmd + <x> or Ctrl+ <x>?
Considering both Windows and Linux follow almost the same shortcuts, Mac is the odd one out here.
No reason to be confused. You need to use Cmd + <x>, because off the top of my head Ctrl is involved only in text-editing shortcuts (what GTK+ calls the "emacs theme") and some obscure stuff.
To be fair I wouldn't really count this as a negative, this is probably very far down in my list of gripes I have against Apple devices.
It's just that it's equally pointless when a Mac user complains about shortcuts in Win/Linux. Even in this thread, you can see most of the complaints boil down to "It doesn't work like the way it does in <Insert the system you are used to>"
It was hard to consider it a valid selling point then, so I'm not really sure why people would consider it to be a valid selling point now for MacOS (although I do get why people would find them useful).
And why exclude non-native apps? They account for a large portion of the app ecosystem nowadays, and they don't have the same problem on Windows/Linux.
As for why exclude non native apps... it’s a personal preference thing, and I don’t enjoy using them. I avoid them whenever I can and will opt for a native alternative where possible, even if it means shelling out some cash.
In fact, I’d say that’s one of my gripes with desktop Linux — for some things, an electron app is the only option and nobody cares to develop a GTK+ or Qt alternative. I’m a big believer in use of the local UI toolkit, regardless of platform.
But laptops are a whole different story. The last time I tried using Linux (and Windows too, actually) on a laptop, I couldn't get through a day without wishing for the touchpad from my (2015 vintage) macbook pro. Most touchpad hardware sucks to begin with, and installing Linux on the MBP itself still sucked due to drivers. My computer is sadly getting long in the tooth and I definitely won't be getting one of the new MacBooks after reading people's experiences with them, so I think I'll have an opportunity to try out a Linux laptop again soon, but I'm more dreading than looking forward to it. Maybe I'll be pleasantly surprised.
Current crop of laptops aren't particularly good. They are all plagued by various thermal issues due to the quad-core CPUs in ultrabook form factors. Even thinkpads have suspend resume issues right now. Avoid the thin ones. Avoid nvidia. Save some money, and go with something a bit older.
If I need to suffer a laptop, then the TrackPoint is head and shoulders better than any giant Apple glass slab. For all other use, a $20 Logitech Marble Mouse is supreme bliss.
What would be painful for me is to lose my i3 environment with the perfectly tuned keyboard setup, all configuration stored in a version control, including the system software through the declarative NixOS configuration.
Basically getting a new laptop for me is to boot from NixOS USB stick with my configuration, run the installer and boot to the familiar i3 desktop I've been using for years. All the software including the right configuration will be there. Nothing will change through a whimsy update that introduces some new UI feature that will break my workflow.
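To give a flavor of what "declarative" means here: the whole machine is described in one file, and re-creating it is just re-applying that file. An illustrative (not my actual) excerpt of a configuration.nix; the user name and package list are made up:

```nix
# /etc/nixos/configuration.nix -- illustrative excerpt
{ config, pkgs, ... }:
{
  # Window manager configured declaratively
  services.xserver.enable = true;
  services.xserver.windowManager.i3.enable = true;

  # System-wide packages; a rebuild adds or removes them atomically
  environment.systemPackages = with pkgs; [ git vim firefox ];

  users.users.alice = {            # hypothetical user
    isNormalUser = true;
    extraGroups = [ "wheel" "networkmanager" ];
  };
}
```

`nixos-rebuild switch` applies it on a running system; the same file fed to the installer reproduces the same system on new hardware.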
I also really like the ThinkPad hardware, especially the perfect keyboards.
Why would I futz around with Linux when I can get a Mac that has an operating system tailor made for it and a true Unix environment when needed? Only on the Mac can I develop for every mainstream desktop and mobile OS.
You can develop on Windows.
And use Microsoft’s hosted build environment for up to 40 hours a month for free.
The whole kerfuffle you are referring to is about 7 years out of date.
The post I was responding to said “it was against Apple’s TOS”, probably referring to a policy that was rescinded back in 2011.
There is a big difference between Apple not allowing something and Apple not supporting something. I choose to write native apps for the platforms I care about. If I wanted to use Xamarin I could.
It's also against ToS to virtualize OSX on non-Mac hardware as far as I know.
I don't think they are just doing that for fun.
Apple is locking you in and you reward them for that.
Again there is an existence proof you can deploy code directly from Windows to iOS devices....
Microsoft tells us that it talked to Apple about this functionality and that it has its rival’s blessing and that the Live Player application complies with all of Apple’s usual rules..
Now you are playing word games, just to avoid the point.
But as Microsoft has just shown, you don't need to run "their" emulator to develop for another platform.
> because Apple does not supply one. You cannot supply your own,
Yet Microsoft did just that....
> because the emulator/simulator/whatever has to run Apple code inside, and you don't have permission to distribute it, even if you had the inclination to make your own emulator/simulator/whatever.
Microsoft had the inclination to do just that....
So OK, Microsoft made an agreement and can now provision iOS packages. They had to find a way to test-run them somehow, and their solution won't allow you to write against native frameworks, only against theirs, which they can run on their platform.
So first you said that there was no way to run on any OS, that you couldn't write your own emulator, etc., but now that I've shown you that you can do all that, it's that you can't write against the native frameworks.
Guess what? You can’t write the same binary against different versions of Linux.
No, you run the .net code on the .net runtime. If you use UIKit namespace, for example, you will run with the dotnet shim.
> Microsoft had the inclination to do just that....
Because Microsoft doesn't have an emulator or simulator. Nothing comparable with what Apple has, or what Google has (and what Microsoft ships for Android).
> So first you said that thier was no way to run on any OS, you couldn’t write your own emulator, etc. but now that I’ve shown you that you can do all that, now it’s that you can’t write against the native frameworks.
It is not a first-class solution.
> Guess what? You can’t write the same binary against different versions of Linux.
Actually, you can.
And when you run your code on the iOS simulator, you are also not running against the same code that's running on your phone...
You aren’t running ARM code.
When you run your code on your desktop, you run in a simulated environment: the Xamarin runtime on top of Windows. When you run your code on MacOS, you run on an x86 version of the iOS runtime. By definition, you are running a simulator.
So in your experience using both, what capabilities were you missing?
Actually you can
You haven’t had to ship C binaries to various Linux distributions....
For instance, the official MySQL driver for Python is not pure Python. Even if you are on Linux, you can’t just do a pip install from your box, create a zip file and guarantee it will work. You need to use the package built for whatever version Amazon Linux is based on....
There are plenty of compatibility issues distributing binary builds for different versions of Linux.
Not a big deal, I use AWS CodeBuild to create my distribution anyway.
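For anyone wondering what that looks like: a CodeBuild buildspec that pip-installs into a directory and zips it up, so the native wheels get built on an Amazon Linux image matching Lambda. A rough sketch; the module and file names are made up:

```yaml
# buildspec.yml -- illustrative; package names and paths are examples
version: 0.2
phases:
  install:
    commands:
      # Built on an Amazon Linux image, so native extensions match Lambda
      - pip install mysqlclient -t package/
  build:
    commands:
      - cp handler.py package/
      - cd package && zip -r ../function.zip .
artifacts:
  files:
    - function.zip
```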
> You aren’t running ARM code.
You are running the same framework by the same author, which certifies to some extent that they behave the same. It's for the same reason that GNUstep is not an adequate replacement for Cocoa.
> When you run on your code on your desktop you run on a simulated environment - the Xamarin runtime on top of Windows. When you run your code on MacOS, you run on x86 version of the iOS runtime. By definition, you are running a simulator.
Are you still interested in arguing in circles? I'm not, it makes no sense and is a waste of time. You demonstrated that you are not willing to understand the point, you just want to win an argument on the internet. Go find someone else for that.
> You haven’t had to ship C binaries to various Linux distributions....
Now it is C binaries... It was just binaries before. You can move Go binaries around with no problem, the cross-compiled ones too, because they do not import symbols willy-nilly.
You can do that with C binaries too, but you have to take care. Just make sure you know what you are linking against, bundle it if it is not present on the target, or just link statically. It is no problem for a competent developer who knows what a linker is and can maintain ABI hygiene.
If you think that is complicated, go set up Apache with mod_wsgi on Windows; you will run into the same problem. There you will have to align the right Visual Studio version (because each msvcrt version has its own ABI) with the Apache version and the Python version, otherwise all hell will break loose. Of course, all the native Python extensions you are going to use will have to be compiled with the same Visual Studio version. Yet nobody says it is impossible to distribute binaries that run on all Windows releases, mainly because this has nothing to do with the specific system but with the apps you choose to run and how they are built, and it boils down to ABI hygiene.
Have you done mobile development?
If you run on top of the iOS simulator, you still don't have any “guarantees” that they will run the same way. I've been doing mobile development since before the iOS App Store ever existed (Windows Mobile); you always had to run on the real device, and the behaviors and performance characteristics were nowhere near the same.
When you are developing with Xamarin, you smoke test on the simulator at first and then you run on the device using the Xamarin player - the same way you are going to deploy your code.
> Now it is C binaries... It was just binaries before. You can move Go binaries with no problem, the cross-compiled ones too, because they do not have a problem with importing symbols willy-nilly.
> You can do that with C binaries too, but you have to take care. Just make sure you know what you are linking against, bundle it if it is not present on the target, or just link statically. It's no problem for a competent developer who knows what a linker is and can maintain ABI hygiene.
Excuse me for referring to the most popular language for writing native code (and the language that Linux itself was written in) instead of specifying that I’m not talking about Go of all things. Almost all non C languages that I’m aware of have a way to call native C code - not Go.
Your “solution” for distributing C code is “statically linking”? So if I have four python modules that all have native dependencies they should all ship with statically linked libraries? Not only will my code be larger, so will the memory requirements. If you have four processes all using the same dynamically linked library, the library is only in memory once, with statically linked libraries, you are increasing your code size and your memory footprint.
How well do you think that will work with the 128MB VMs that you get to run Lambdas? Or do you think I should just spend more on higher-capacity Lambdas? If I wanted to use heavyweight VMs and Docker containers, I might as well use Windows.
But this whole thread you’ve shown when it comes to developing mobile applications, you’re speaking hypothetically while I’m speaking from a perspective of doing mobile development for over 10 years.
I’ve even played around with J2ME development.
The largest issue is that you’re over-indexing on the value of the simulator. Mobile simulators have never been a good way to validate the functionality, user interaction, performance, etc. of a mobile app. They are pretty good for development, and okay for negative testing (if it doesn’t work on the emulator it probably won’t work on the device), but you can’t depend on them for positive testing.
It’s not a word game. Both of your assertions were incorrect. You don't need Apple's proprietary “emulator” to develop apps for iOS - Xamarin is proof of that - nor is there a “policy” against developing or deploying iOS apps without a Mac - again, Xamarin is proof of that.
Calling Apple's simulator an emulator is just as technically incorrect as calling Safari with the window size set to match a mobile screen an “emulator”.
What matters is that you cannot run it on another OS, because Apple does not supply one. You cannot supply your own, because the emulator/simulator/whatever has to run Apple code inside, and you don't have permission to distribute that, even if you had the inclination to build your own emulator/simulator/whatever.
Still, I might try the next time I'm up for getting a new laptop. I'm not pleased with the way Apple is going with their laptops, and I'm not interested in Windows.
That seems to be the general consensus around here. If you're a dev, you either run MacOS or Linux. As a dev who runs Windows and MacOS I don't really get this. Windows 10 is excellent for dev work. I understand there are privacy issues that make people not want to use MS products, but from a purely functional perspective it's a great user experience.
* There's now a major upgrade every six months and Microsoft still hasn't figured out how to make the process reliable.
* Chocolatey is good but the available packages are sometimes lacking. Linux distros have always been miles ahead in this regard, and nowadays with snap I can even find third party software that wasn't packaged before.
When that third-party package replaces something that is a part of the system, you are going to have a difficult upgrade next time the new distro release is out, unless the third party ensures smooth transition. Do you have that trust in them?
All systems can be broken by an incompetent user with the root or administrator password.
The graphical environment is ChromeOS, which is basically just Chrome and extensions. Yes - HBO Go, Spotify, Inbox, and Photos can be used as apps, but they still run in a browser.
And with crouton I have a Debian stretch CLI which is just that - Debian - solid, stable, and there is a solution to most problems.
This setup is amazingly portable. I make regular backups with crouton and keep a USB stick with a ChromeOS installer on my keychain. I am now on summer vacation and left my Mac at home. So I just installed ChromeOS on a 10-year-old Acer laptop, Google downloaded all my documents and extensions in 15 minutes, and I installed crouton and recreated Debian from backup.
Within 15 minutes I was up and running in my home environment. Of course, the touchpad on the Acer sucks compared to the Mac's, the laptop gets hot, and the screen estate is 1/4 of the Mac's Retina, but every single setting in the environment is 1:1 with my MacBook.
I have never been a heavy user of macOS native apps, and I have slowly moved from the Apple/iCloud walled garden to Google (Photos, Drive etc.), so jumping ship has been painless for me. My memories of macOS are of never-fixed annoying glitches, heavy 'obligatory' updates, and tons of unnecessary files cluttering my drive.
Details: try a HiDPI display and a non-HiDPI display both connected to the same PC. Nothing but MacOS handles this well. Even Win10 struggles with windows that straddle the boundary or things dragged across it.
The Qt and Tk (since 8.6) applications also scale nicely with GNOME3.
The only glitch is that currently only integer scaling works well.
sudo pacman -S gnome
sudo systemctl enable gdm
sudo systemctl start gdm
The last time I used Fedora it worked without extra work (besides setting scaling to 200% in the Display settings).
Things only started working well with GNOME 2.26 and very well with 2.28.
In any case, you have to make sure that you are using Wayland, support for multi-DPI screens on X.org has been terrible for me. Even a single 4k screen is not very consistent in X.org (e.g. the mouse cursor sometimes becomes tiny when moving between windows).
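A quick way to check which display server a session is actually using (this environment variable is set by most login managers, including GDM):

```shell
# Prints "wayland" under a Wayland session, "x11" under X.org
echo "${XDG_SESSION_TYPE:-unknown}"
```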
If you use your Mac to manage email, calendar, Skype, Chrome, etc., then you are better off with a Mac and OS X.
When I developed on a Mac, it used to be nice to have an integrated environment. I really liked having the iPhone/iPad/Mac ecosystem available even right on my working dev laptop. But dropping function keys, along with longer update cycles, indicates my workflow isn't Apple's priority. I've found the transition really easy, and I now think of Apple devices as appliances, not computers I can program, which is what Apple wants me to think anyway.
Well I'm a developer (if not 'in this community'). I don't run Linux on the desktop because when I have I've just found it too much trouble in so many ways. It took up too much of my time. Working full-time as a developer already consumes as much of my life as I'm willing to spend confined to a screen and keyboard. I know a dozen folk will chime in with the automatic responses informing me that it takes no more time to run than macOS or Windows. But in my experience that's just not true, and, frankly, the OS I use isn't of much concern to me. It wouldn't get anywhere near my "1000 things I'm most concerned about in life" list.
Things are installed all over the place, hard to have consistent version across different machines.
I think that a developer OS MUST have a proper package manager.
I'm imaging now some grey bearded dev at Apple getting immediately shot down for suggesting they build the Mac App Store on an open package manager foundation.
I’d also have a hard time going back to OSX now. At the same time, it’s not for everybody. This was something I wanted to do for me and I was determined to see it through. If somebody had pushed me into it I probably would have spent more time focusing on deficiencies.
I’d held on to my 17” MBP until last January when it was clear they were sticking with their current direction on laptops.
I've been slowly moving from Linux (mostly Ubuntu 14.04) to High Sierra, and there are a few things that make absolutely no sense about OSX:
- There's a difference, in almost every piece of software, between closing the last window and quitting. In a browser, just pressing the "close" button would lose all your tabs, whereas "quit" will preserve them. That makes no sense to someone coming from Linux or Windows; I'm sure OSX users have gotten used to this difference, but ... it's clunky.
- OSX puts one small dot next to the icon of an app that's open. Unity puts one dot for every open window the app has (but no more than 3, even if you have more than 3), _and_ a marker that tells you which app has focus (and whether it is on the current workspace or on a different one); also, middle click on the icon universally opens a new window. I never realized how convenient and informative this is, until I lost this feature with OSX.
- OSX litters all the (local) directories you visit with Finder preview files. It does it for remote ones too, but at least that can be turned off - not so for local. There used to be 3rd party kexts to stop that, but not anymore. Gnome/Unity keeps the previews in one directory; Windows is almost as bad as macOS (except it can be turned off universally).
- Having used locally integrated menus for the past ~30 years or so, I find OSX's global menus uncomfortable. I spent a while with Unity's global menus to try them out, and after two months of use gave up and went back to local menus. OSX doesn't give you that choice. (And yes, I know of Apple's studies that resulted in the global menu - I suspect the originals are no longer relevant, due to higher resolution screens, a larger number of open apps on average, and familiarity with window systems; does anybody know if these studies were repeated any time in the last 5 years?)
- "apt install" (and many of its GUI frontends) are years ahead of Mac OS app installations, whether app store or DMG, both in ease of use and in breadth covered; "brew" comes close, but no cigar yet; Haven't tried Nix, Fink or MacPorts yet.
- A kernel update that requires restart is uncommon on either system; less than once a month. Yet, on the Mac it's 5-30 minutes of downtime to apply, depending on the update; on Linux I have my desktop back in less than 30 seconds.
- keyboard shortcuts. "End" goes to end of document, "Home" goes to beginning; low frequency of use (for anyone I know). Beginning/end of line: Option+Left, Option+Right; high frequency of use. It's a "different strokes" thing, for sure, but even though the Mac is consistent, it is very weird compared to everything else going back 50 years, and not optimized.
Yes, OSX looks more polished, and even small utility apps are really gorgeous. But so far it feels that functionality wise for a non-artist engineer like me, OSX is trailing.
MacOS distinguishes between closing an application's window and closing the whole application. Therefore, closing the last window will not automatically close the application as well. Consequently, in a browser pressing the close button results in all tabs getting closed, with the browser still running.
Quit, however, closes the whole browser and, hence, preserves the tabs. This is the action triggered by pressing the close button on Windows and Linux.
I can understand that this is counterintuitive coming from Linux or Windows. But it is a design-choice coherently implemented throughout MacOS.
It used to be that terminating a process you were going to reuse would cost you tens of seconds. That's no longer the case. The fact that it was a rational, consistent design decision made 35 years ago doesn't mean it still makes sense. Apple killed the headphone jack and replaceable batteries, two things which are even older than that - things change.
>Beginning/end of line: Option+Left, Option+Right; high frequency of use.
on macOS you can just use the 30+ years old Unix convention of Ctrl+A and Ctrl+E to get to the beginning and end of every line everywhere in the OS.
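For those who miss Home/End behavior, the Cocoa text system can also be remapped per user. A commonly cited sketch (the file path and key codes come from Apple's text-system key bindings mechanism; this affects Cocoa text fields, not every app):

```
/* ~/Library/KeyBindings/DefaultKeyBinding.dict */
{
    "\UF729" = moveToBeginningOfLine:;   /* Home */
    "\UF72B" = moveToEndOfLine:;         /* End  */
}
```

Applications pick the file up after a restart; terminal emulators and non-Cocoa apps have their own keymaps and are unaffected.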
> "apt install"
That is because no generic consumer in the world actually uses a text-based package manager. That's only the CLI users, Linux/Unix/BSD users and the like. At the same time, software distribution isn't a one-size-fits-all approach. In line with apt, on macOS there is a packaging format much like .deb, and it's called .pkg. It is compressed, has a manifest, some pre/post install scripts, and the actual files. The .app distribution method is much more like a snap or a container, which is what the world has been moving to for decades. In one way, it's even more comparable to nix, since applications and their dependencies live in their own location-independent namespace. On top of that, there used to be a package manager that actually provided apt and dpkg for macOS, but almost nobody wanted to use it, because that method doesn't actually fix anything.
> There's a difference, in almost every piece of software,[..]
> OSX puts one small dot next to an icon of the app that's open. [..]
> OSX litters all the (local) directories you visit with finder preview files. [..] (<-- actually nothing to do with previewing anything)
Those three are pretty much all the same thing: preservation of state. By default, the concept is that it doesn't really matter if a piece of software is 'open' or 'running'; what matters is that when a user takes an action (i.e. starts browsing the web), it continues from the last state the user left it in. That is a really hard problem to solve, and something Linux will probably never be able to solve due to the differences in even basic GUI SDKs like GTK, Qt, wxWidgets etc., and that's not even considering X11, which is a whole different mess by itself (data exchange via atoms on window objects? wtf?)

There is a setting that will undoubtedly become default where the dock will no longer show a difference between 'open' and 'closed', since the difference will pretty much be too small to be relevant (i.e. whether it is in the process table or not will be about the only difference left). Currently, there are a number of frameworks in place that do almost all of this: some are purely for managing state of the UI, some are for document-based programs that offload revisions, storage etc. to the OS, some are for managing idle time (i.e. App Nap). The idea that an application is just a binary that hooks into the UI thread to dump some pixels on the screen hasn't been a reflection of reality on macOS for about 15 years. Having this information in mind might make all of your observations a lot more sensible.
Regarding .DS_Store files: they are data files containing the settings you set for a directory. So if you don't change anything for a directory, then no .DS_Store file is made. But when you say, change to icon view, move stuff around, go to list view, and setup custom sorting orders, then the OS will store your settings inside that directory. If you then open that directory on another Mac, you will get the exact same representation (be it via network, mass storage or target mode).
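For reference, the network side really can be switched off with a well-known `defaults` key (a macOS-only command); as noted in the thread, there is no supported equivalent for local volumes:

```shell
# macOS only: stop Finder writing .DS_Store files to network shares
defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true
```

A logout or Finder restart is needed before it takes effect.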
> Global menus
This often boils down to taste, but regarding those studies: they were about 'how hard is it to aim', which is relevant today, even (to an extent) for touch screens. The idea is that the global menu is at the top, and if you throw your mouse pointer at the top of the screen, you are always at the global menu, no aiming required. You basically remove an entire axis of aiming from the process. You can't overshoot either, as the screen simply ends there. For 99% of users, this is much easier than trying to aim for a strip of pixels at a random location somewhere in the top half of the screen. On top of that, multiple menus make little sense: you can't use more than one, as you can only focus one, you only have one pointer, and you can't (in any implementation I've seen) use them without losing focus of your current window. Maybe this background information alleviates some irritation ;-)
> A kernel update
There are no stand-alone kernel updates on macOS, at least not by default and not planned. The type of updates that require a reboot are pretty much always those system updates, which are run as transactions instead of copy+replace. They aren't small or as insignificant as a single package either; most of them are comparable to complete distro upgrades like FC27 to FC28. And yes, I'm talking about those point releases (i.e. 10.13.3 to 10.13.4). Just because the version delta isn't high doesn't mean the content delta isn't ;-) Most of them are a few GB in size and comparable to a reinstall of the complete OS (because again, it is a single, reliable transaction).

Keeping that in mind, isn't it great that this can be done in under an hour, only breaks if you have made extreme system modifications, and basically doesn't touch your own user data or programs? While not-touching-anything-it-doesn't-need-to-touch is standard, just like it is on Linux packaging systems (well, most of them), doing a complete distro upgrade on the fly isn't always that easy and reliable on other operating systems. Windows mostly just fails or takes forever; on Linux distros you have to do things like 'sudo apt-get update; sudo apt-get upgrade; sudo apt-get dist-upgrade; sudo reboot;' and hope nothing broke. Same with yum and dnf: I've had simple upgrades (bare FC install upgrades) fail and leave the system in an unworkable state, mostly when libc was upgraded, or kernels (and the modules failed to update in the initramfs), or when the packaging tools themselves were upgraded (when they break, good luck fixing anything).
Re: global menus, as I mentioned, I'm well aware of the original Apple studies. I admit I'm a power user, but many professional (not necessarily power) users these days have more than one app open on the screen, and I've observed the pattern of "click app to get the right menu, then locate menu item" with a few co-workers -- whereas on Windows (and linux), they only have to locate the menu inside the right window.
When Apple did their tests, they focused on use in a single app, IIRC -- which, back then, was justified. That's why I wonder if anyone repeated that with a modern workload.
Re kernel updates: No, it isn't great that it can be done in under an hour. I often install ubuntu or debian from a very old DVD, and then, within just a few minutes, it is all up-to-date, kernel and all. Never failed once.
I haven't used dnf or yum in a while, so can't comment; I know Windows breaks. But debian-stable is rock solid, and even ubuntu is (with the exception of the OOM kernel that did bite -- but even that took just 5 minutes to update, and left you a copy of the older kernel to boot to if you wanted - which, once you got bitten by the OOM bug, you obviously did).
I agree Apple updates are miraculous if you are coming from Windows. But compared to debian ... not so much.
... which appears nowhere in the keyboard help or shortcuts. I actually used this in shells, from muscle memory, but never actually tried it in editors or input fields.
> > "apt install"
> That is because no generic consumer in the world actually uses a text-based package manager.
I did mention the variety of GUIs; I like the CLI better, but I know people who prefer Synaptic. Regardless of the interface, once you've decided to install a piece of software, modern Linux does it much, much better than MacOS. It's possible that your use case requires packages unavailable in the repositories, in which case it doesn't matter - but the breadth of packages in the debian/ubuntu/fedora repositories covers 99% of the users 99% of the time. No app store or repository on the Mac does as well.
> Those three are pretty much all the same thing: preservation of state.
No, they aren't - two dots for two windows vs. one dot for one window is purely informative but hugely helpful. And while it might at some point converge to "whether the binary is loaded doesn't matter", it's not there yet by a long shot - and likely won't be for a decade.
The "quit vs. close" difference on OSX goes directly against the "preservation of state" concept you are promoting, in my opinion. In Ubuntu, whichever way I closed my Firefox (closing the window, shutting down the computer, upgrading the kernel, killall firefox, etc.) I get my state back. On the Mac, if I press the red button it doesn't, but if I choose "quit" on the menu it does ?!?
I mentioned them as things in which the MacOS fails compared to modern Linux, whereas somehow you took it as if the Mac does better.
Linux does preservation of directory view state as well. Except it is centralized and not all over the place. Windows does most of it in the registry (but puts previews in the directory). These make much more sense -- and I think this comes from Linux's Unix legacy of serving multiple users and often having a lot of remote, read-only file systems mounted. OSX is Unix, but has been almost entirely single-user-at-a-time-and-they-own-everything-they-see from the day Apple adopted it.
Personally, I have configured my Linux to automount every USB device or network share read-only unless I specify otherwise - I hate accidentally modifying files or directories, and a lot of software drops swap/lock/backup files even if you're in read-only mode. MacOS is a huge offender from this perspective.
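For anyone wanting the same behavior, a minimal `/etc/fstab` sketch (the UUID and mount point are placeholders, not real values):

```
# Mount a particular USB stick read-only unless remounted explicitly
UUID=XXXX-XXXX  /mnt/usb  vfat  ro,noauto,user  0  0
```

You can still remount writable on demand with `mount -o remount,rw /mnt/usb`.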
Quit and Close mean two different things. The state that is preserved is the state you put it in. So if the state is: I have a window open with a site loaded, then the next time I visit this task, I want that same state. If I tell a program that I wish to not see a window or site, then that is the state I put it in, and the next time I visit that same task/program, I want it to return to that state as well. While you might not agree with that logic, that is what the HIG describes and how it is implemented. The logic is there, and it is documented and followed, both on the desktop and on the ~1 billion mobile devices where much of the same HIG applies.
Let me add a car-analogy here:
Say you are in your car with the doors open and the engine running. If I turn off the engine, I don't expect my doors to close. And when I close my doors, I don't want my engine to turn off by itself. And when I open my doors, turn the engine off, go for a walk and come back, I expect my doors to remain open when I turn the engine back on.
Regarding the view state: Linux and Windows do this for the local machine, whereas macOS does it for the directory. From a UX perspective both ways have their own good and bad points, but if you use or manage multiple machines and share view states across machines and accounts, this is the way to do it (short of exchanging view state via other methods, like a network service or integration into the filesystem).
Regarding your read-only mounting of mass storage and remote filesystems, that makes total sense to me, but makes zero sense for the general user. While it is nice to reconfigure your local system to do this (and macOS has this as well via autofs if you want it -- it's built in), this would make your environment completely unportable to anyone else.
I know why it's there. As I wrote in another post, this was a great choice in the 128KB/floppy days of the early macs. It no longer is - for the general user, on a computer (unlike a car) it feels like a distinction without a difference. The fact that it is consistent and based on consistent logic (albeit one that was prescribed 30 years ago) does not matter, IMO.
> Regarding the view state, Linux and Windows do this for the local machine, where as macOS does it for the directory.
I think you're rationalizing a poor design choice. For local files, the end result is mostly the same (except Linux preserves the directory's mtime) -- as local files are rarely if ever accessed across the network by someone with write permissions for that directory. And those permissions often make it seem dysfunctional on a shared directory - you can't store settings, so it doesn't stay the same between visits.
How does your explanation sit with the fact that there's a way to disable this for network/remote directories, but not for local ones?
> and macOS has this as well via autofs if you want it -- it's built in
I'm aware that's a power-user preference, I couldn't figure out how to do it on the mac; I'll look up autofs permissions - any recommended documentation?
This is a weird omission in OS X. For what it's worth, they are documented here: https://support.apple.com/en-us/HT201236
Part of ubuntu's and debian's "apt" magic is not just in the code; It's in the meticulous curation of the main repositories - which brew can't really compete with.
My System76 laptop, pre-installed with customized Ubuntu, needs to be rebooted just to switch between Nvidia and Intel GPU if you want power savings (external monitors are directly wired to Nvidia chip).
Which is why I've used Linux for nearly 20 years. It Just Works.
YMMV, the plural of anecdote is not data, etc.
I'm just happy that nowadays it's a genuine choice.
Any link to do this? I've tried a number of times and it still didn't work :/
I believe SMCDeviceKey and the product names/serial numbers are the critical part. The OS doesn't mind the rest of the virtual hardware.
The only real headache on the laptop front, beyond drivers, is that MacOS only supports a small number of WiFi cards, so you need to do your research and find a laptop where the wireless card is user-replaceable.
Or at least one with existing open-source drivers that you can port, as has happened with at least one popular model:
Hackintosh is, after all, a learning experience. Including kernelmode programming. ;-)
(Or at least it was --- the whole driver signing thing has probably put off quite a few who would've otherwise given it a try...)
I recently built one and it's super easy. My build: Z370M + i5 8400 + 240GB SSD + 16GB DDR4. Benchmarks are near a 2018 MacBook Pro 15-inch.
Total cost: $550
How about major breakage because of the installation of a new program? Like having installed programs that use readline version 6 and then installing a program that needs readline version 7?
Homebrew will happily upgrade readline to version 7 and install the new program, thus silently breaking the programs built against readline 6.
Edit: Every program and library involved has been installed with Homebrew.
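A partial mitigation, assuming a current Homebrew (both are standard brew subcommands), is to check what depends on a keg before upgrading, and pin it:

```shell
# Which installed formulae depend on readline?
brew uses --installed readline
# Freeze readline so a routine `brew upgrade` leaves it alone
brew pin readline
```

Pinning doesn't resolve the underlying version conflict, but it stops the silent breakage described above.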
> Every program and library involved has been installed with Homebrew.
I'm the first to blame Apple for various stuff but you can't blame Apple for using OSX in a way that Apple hasn't intended and isn't supporting.
If you're not happy with the pre-installed unixy stuff you are (still) free to add your own.
Which Homebrew allows you to do in a very Apple way. That is, works mostly as intended and almost OOTB but if you are a power user you will soon run into various problems.
If I could slap macOS on a Matebook X Pro, I'd switch tomorrow.
I like macOS, but Apple's making it incredibly difficult to take their product line seriously lately.
Stuff like this is why Hackintoshes are sadly still super frustrating.