From Mac to Linux – the setup I've grown to love (shooting-unicorns.com)
492 points by derp_unicorns 53 days ago | 457 comments



I use Windows/Linux/macOS.

From a performance-to-value perspective you cannot beat Linux. With Docker/MicroK8s the overhead is so low. Dev speed is leagues ahead of the unfortunate circumstance of having to run Docker/Minikube in a VM on Windows and macOS. Filesystem IO is also unreal, compared to Windows at least.

Getting a refurb ThinkPad on eBay and having better compute hardware than a Mac Pro for half the price is also a nice cherry on top, so to speak. That, and the insane sales Lenovo runs all the time on brand-new machines, is kind of hard to beat as well.

Next up is Windows, from a hardware perspective. The same refurb ThinkPad can dual-boot without issue.

Then lastly macOS. I have had a mac since 2011. I am having a hard time with the direction Apple is going with their laptops.

I have all 3 and they all have their merits, but I find myself using Windows/Linux at home exclusively and macOS at work and I don't mind the context switch.

To each their own!

With the uncertain future of the Mac, given the potential switch to ARM and macOS no longer shipping Python and Ruby by default, I see some drawbacks for the dev ecosystem. I know brew will package a Ruby version to handle this, but I do worry about the ARM switch.

Linux used to be quite difficult, but I stuck with Ubuntu and the UX/UI has improved so much :)


> Then lastly macOS. I have had a mac since 2011. I am having a hard time with the direction Apple is going with their laptops.

This, a thousand times this. I had a discussion earlier this week with the owner of a Mac repair shop of 15+ years here in Toronto, who lamented the release of any Mac portable since 2015 - saying 'thank God for the 2012-2015 units, or I would be out of business.'

I told him I'd been buying Macs for 15 years, and that, especially at times like buying my first iBook at age 15, I absolutely relied on, and still rely on, purchasing a laptop with the intent to upgrade the RAM and the HDD/SSD in the future.

With the laptops continually increasing in price, justified by tacking on useless features nobody wants, and then preventing upgrades, the laptops are out of reach for me to justify as an intermediate iOS developer. The 2017 models locked to 16GB are already virtually obsolete to a serious developer or film editor.

I will not, would not, on principle, buy a computer whose hard drive is soldered to the logic board, if only for the sake of being able to keep the drive itself separate from the laptop.

There is no possible, potential benefit a soldered hard drive, or soldered RAM, gives me, and the detriments far, far outweigh any benefits.

Previously, if the hard drive or RAM got corrupted or damaged, I could replace those parts the same day. What now?

Truly - and I mean truly, butterfly keyboard and lack of ports aside, even internally - Apple has finally gone from arguably putting form over function to making its form-over-function focus a literal insult to its long-term dedicated users, while simply not responding to criticism.

So the media laughs at the Touch Bar, fans and critics deride it, and Apple's response? To cancel the non-Touch Bar version of the 13" MacBook Pro.


You mention the RAM and hard drive, but it irks me more that my keyboard is not replaceable. It's the most-used moving component of the entire unit, it's exposed, and it's the easiest part to damage. I've hated every MacBook Pro since the release of the unibody. My favorite was the PowerBook 12 and 17.

I've had a MacBook Pro/PowerBook (and a ThinkPad) since the Titanium and the first MacBook Pro, and if it weren't for macOS and the convenience of being in the Apple ecosystem with their iPhone, TV, HomePod(s), Watch, Music, and iCloud, I would be back on Linux (I actually came from FreeBSD on my ThinkPad). I'm currently using a 2017 and a 2018 MacBook Pro. The 2017 would be great except the keyboard is absolute trash (and of course the soldered RAM/SSD). The 2018 is much better, but nothing like a ThinkPad. I also miss the TrackPoint, but the Apple trackpad is good.

I miss my ThinkPad (+ BSD) badly, but being an iOS/web developer entrenched in the Apple ecosystem (whose convenience I honestly like), I feel stuck, and it's hard to justify even a secondary machine. I even keep a fully maxed-out P53 and P1 (I can't decide which one I want) in my Lenovo cart, ready to buy at any given moment.


What's so good about the Apple ecosystem everyone raves about? I don't see why you'd want to lock yourself into paying 5x more for every tech product you'll ever want. All of that stuff has an Android equivalent and Bluetooth now. It's not 2011.


You get more than just the ecosystem, which is just one part of the experience. The hardware is that much better. It boils down to: you get what you pay for. I value the convenience of what I need or want just working, as well as the quality of the devices, especially compared to the alternatives. It's also not 5x as much, but I assume you were exaggerating.


> My favorite was the PowerBook 12 and 17.

Golden, golden days. With every other film editor and director I've ever met, I've ended up talking about how blessed the 17" G4 was (at the time).


The keyboard keys fit the shape of your fingers.


You could also walk into the Apple store. Go to the back corner. Pick up a box with a new battery in it for $119 (iirc). Buy it and walk out with a new battery. All in a single day. You could also order them online.

Mind-blowing.


And replace them yourself without catching on fire, or whatever "safety" issue they keep blabbering on about...

Ho-lee sheee-yt


The screen bezel was symmetrical enough to keep my OCD in check.


I have had 3 MacBook Pros in the last 4 years. Had to, because of some iOS development I had to do at my job.

Never had a Mac before, everybody was saying they were fantastic. You'll see, coming from Windows, what a difference, they would say.

The first had some serious hardware failure which made it reset at random times. With time the resets became more frequent until it became impossible to use it. I gave it back to IT with the order to destroy it.

The second had the infamous keyboard. God knows how much I hated it. Random keys wouldn't work, most commonly the ones you need most, like Shift. Thanks, Apple. Went to IT and told them to throw it in the bin.

This last one I got has the horrid Touch Bar, which starts the bloody Siri 3 to 5 times a day because my finger randomly flies by the upper-right corner of the laptop (typically when I am looking for the backspace). I hate it. The network sometimes goes away, for unknown reasons, until I reset the network card. Recently, the screen has sometimes shown some worrisome fast-disappearing black areas.

You'll see, they would say. Very reliable, they would say.


I've had a few Macs - and my 2013 MacBook Air is still going very strong, but my 1yo MacBook Pro's keyboard has been replaced already, which is a serious issue, to the point where I doubt my next laptop will be a Mac again for that alone.

Other than that, though, I think you might just have been unlucky. In professional environments, I've seen bad units from pretty much every brand out there. It shouldn't happen, but it does. In my experience, Apple will replace these without much fuss, but their service, certainly towards businesses, is a far cry from that of, for example, Dell.

The Touch Bar complaint is just you not investing enough time with the system to get to know it. You can disable Siri and completely customise the Touch Bar.


Have you tried customizing your touchbar to address at least that annoyance? Obviously things like butterfly keys you can't do much about, but you can do something about the touchbar.


I am so sad this has been your experience. From my first iBook G3 in 2004 to my last 2012 13" I have only ever had the most positive experience with the combination of hardware and software.

Oh, how the mighty have fallen.


My wife has my old 2012 MacBook Retina for some hobby iOS development. Apparently all her friends are envious because it's the last Mac that doesn't suck. I don't know if it's true or not - but I replaced that one with a Dell because, as I always do when I buy a new machine, I consider my needs, the cost, and the benefits of the hardware, and for the first time in a decade Apple didn't come up on top. So maybe I agree with them too.


I switched from Linux to Mac about six years ago, because I believed it's better. Now I think that it wasn't worth it and I'm not even talking about money. It's overhyped. From the start, I was annoyed by various things (some things worked just better on my old Ubuntu), but recently I'm becoming fed up with it, mostly because of carbon, finder, lacking bash utils, touch bar, and crashes. I'll be switching back to Linux+Windows.


> carbon

What? I haven't dealt with Carbon APIs in forever. Care to add more detail?


I meant that Mojave does not support Nvidia drivers [1]. Sorry, I've confused Carbon with Metal, the 3d graphics API.

[1] https://www.forbes.com/sites/marcochiappetta/2018/12/11/appl...


There were also a ton of people asking for a lower end Mac Pro. So what does Apple do? Make the highest of highest end computers ever! $6000!

It's like their product management is run by sadistic assholes.


I know of a ton of people asking for a MacBook Pro with a keyboard that doesn't suck, but I genuinely haven't heard many people asking for a "lower end Mac Pro." The trashcan Mac Pro, love it or hate it, clearly positioned the line as the intended spiritual successor to old SGI workstations -- the ones that started around $10K in 1999 money -- and the new Mac Pro is doubling down on that.

"Lower end Mac Pro" sounds an awful lot like the "xMac," a Mac that would be more like an iMac but a box that takes cards, which is something that some folks have wanted for literally two decades at this point. It's clearly not what Apple wants to bring to market. The low-end headless desktop Mac is the Mac mini -- and I suspect it would be a great machine for a lot of developers. That doesn't mean it's what you want, and I'm not saying you're wrong to want something else! I am saying, though, that "Apple won't make my dream Mac" isn't a sign of sadism.


Such a ridiculous stance for Apple. I am 100% convinced that the xMac would double their desktop computer market share.


I agree. Their decisions seem almost intentionally against what their vocal user base is asking for.

From removing the non-touch bar MacBook Pro model, to the ghastly price of the new Mac Pro, to their refusal to create an upgraded iPhone SE-sized model - Apple has started to shove things down the user base's throat now more than ever.

Apple has always been guilty of this to some extent, but in the past its offerings worked with me and my workflow, and their direction grew with me - I embraced the move to Intel because, as a developer, being able to dual-boot, and eventually even virtualize the dual-boot with Parallels, was excellent for me. (Waiting for Office 2008 and the Universal Adobe CS3 package, however...)

The point is - Apple has always at least provided options in the past that were, well, options. At this point, there is no option in Apple's lineup I would keep for free - I'd simply rather sell a new MBP and create a hackintoshed Lenovo ThinkPad in a heartbeat than actually use any of their offerings on the daily, which I can say from experience, as I happen to have to use both on the daily.


Completely agree. I've been saying this since 2012 when the Retina machines came out, and skeptical since 2009 when they moved to non-user-replaceable batteries in the MacBook Pros.

I don't think I've actually ever had a non-Apple main notebook, and that goes back to the 1990s!

That said, I only tolerate the 2013-2015 rMBPs, use one for my main personal and work laptops, but the soldered RAM pisses me off (a lot, because my personal machine only has 8GB and my god is it hard to find a 16GB model for a reasonable price in the used market), and the proprietary storage irks me. Thankfully 10.13 supports NVMe with an adaptor, which to me basically confirms that there was zero reason for Apple to use the proprietary stupid thing in the first place.

As for any machine they've built after 2016, well, I don't want them. I don't want a butterfly keyboard with no travel that breaks with a skin flake. I don't want screens that stop working because they use a flex cable connector that's too short. I don't want a touchbar if it means no function row. I don't want to give up MagSafe. I don't want to give up my SD card slot. I don't want to give up USB Type A. I don't want a massive trackpad, and I don't want the fscking T2 chip.

In fact, the only things on the >2016 machines I do want are the faster CPUs and GPUs, the better quality displays, and Touch ID!

... As for Lenovo though, they're slowly turning into Apple 2.0. Have you seen the T/X x90 and X1 series? Soldered RAM. At least they still make the X1 Extreme and P series.


It’s easy to dismiss since it was obsolete for so long, but the new Mac Mini is actually a pretty capable machine - 6 reasonably fast cores, 32 gigs of ram, fast storage, and 10 GbE onboard. I’ve been extremely happy with it.


And if you need it you can add an external GPU for an extra graphics boost.


But nothing from nvidia, correct?


Currently no, or at least not if you want to run the latest version of MacOS.


> I know brew will package a ruby version to handle this...

Not to mention brew is vastly inferior to most Linux package managers (apt, yum, pacman, etc.)


Until you need/want some project that's relatively new and you pine for the days of brew. Technically inferior, perhaps, but often much more up to date. :/


Linux distributions often have a parallel system for bleeding-edge software.

On Ubuntu, it's snap.
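
For example, grabbing a newer tool from the snap store is a one-liner. This is a sketch; `some-tool` is a placeholder package name, and whether `--classic` is needed depends on the snap's confinement:

```shell
# Search the snap store for a package
snap find some-tool

# Install it (--classic is required for snaps that need full system access)
sudo snap install some-tool --classic

# Snaps auto-update, but a refresh can be forced
sudo snap refresh some-tool
```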


I think the way to go is adding PPAs.

But yes, I agree we have great ways to have the latest and greatest.
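
For anyone who hasn't done it, adding a PPA on Ubuntu looks like this (`ppa:example/backports` and `some-package` are made-up names; substitute the PPA and package you actually want):

```shell
# Register the PPA and refresh the package index
sudo add-apt-repository ppa:example/backports
sudo apt update

# Install (or upgrade to) the newer build from the PPA
sudo apt install some-package
```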


Not only on Ubuntu. I installed it on my PureOS machine. Works like a charm.


Not a problem with Arch packages and the AUR.


I heard great things about the AUR when moving to Arch, but honestly I've been unimpressed. A lot of the stuff already exists in official repos for other distributions, and I don't actually care about bleeding-edge versions, just ease of install. And most of the time, in my experience, AUR packages aren't very recent, or aren't configured how I'd like.

That said, you can just customise the PKGBUILD, and even then it's no real stress to build from source yourself. Even if you then stabbed yourself in the eye with a pencil, it'd still be a better system than on macOS or Windows.
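
The whole manual AUR workflow is only a few commands, assuming `git` and the `base-devel` group are installed (`somepackage` is a placeholder):

```shell
# Clone the package's build recipe from the AUR
git clone https://aur.archlinux.org/somepackage.git
cd somepackage

# Inspect or customise the PKGBUILD before building
$EDITOR PKGBUILD

# Build and install (-s resolves dependencies, -i installs the result)
makepkg -si
```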


The problem I find with Arch Linux, and this has really only started happening recently, is that for some reason Arch loves to break stuff. My most unstable distribution experience has been with Arch. I find that distros like Void and Alpine Linux offer more robust rolling-release systems.


What in particular has broken for you? I've had the same Arch install for almost 6 years now, with no breakage at all. (Other than once or twice when I decided to reconfigure something and screwed it up. Always recoverable though, without a reinstall.)


I personally find that Alpine offers the least robust, because it doesn't even have a package mirror, so once a package gets updated (especially in Edge), then you can no longer download the old package, unless you build and sign it yourself. Arch has been incredibly stable for me, even more so than Ubuntu or Debian.


> The problem I find with Arch Linux, and this really only has started happening recently, is that for some reason Arch loves to break stuff.

Then I must be lucky. I've been using arch on my dev machine without notable breakages for about 2 years.


I've considered switching to arch because of that.


Manjaro has access to the AUR as well, and it's as polished as any distro I've used. For me it's the clear winner for developers and hobbyists


I completely agree. Having used Manjaro (xfce and i3 variants) as main OS for half a year, it has been a good experience.

To be fair, I can't compare with other distros as this was the first time I used Linux as main OS.


I always assumed this is because Linux package managers such as apt and yum are first-class citizens on the platform, while Homebrew is a bit of a de-facto solution on macOS.

This still holds, right?


No, MacPorts is non-native as well and is an excellent, stable alternative, although Homebrew has slightly better coverage. Homebrew's main issue is that it was written from scratch, ignoring lessons from 30+ years of package-management experience, and there's no obvious benefit that doing so has brought. Compared to MacPorts, which is based on FreeBSD ports, Homebrew is brittle, and normal operations frequently result in an inconsistent state.

In my experience, unless your needs are extremely basic, sooner or later you'll run into an issue where the solution is basically to commit nuclear warfare on your filesystem and start over again. Also, expect to rely on random blog posts and stack overflow as the de facto user's guide (which maybe is just the state of the world for everything now.)
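
Short of the nuclear option, Homebrew does ship some self-repair subcommands worth trying first (these are standard `brew` commands, though how far they get you varies; `some-formula` is a placeholder):

```shell
# Diagnose common problems with the installation
brew doctor

# Remove stale downloads and old versions
brew cleanup

# Re-create symlinks for a formula whose links are broken
brew link --overwrite some-formula

# Reinstall a misbehaving formula from scratch
brew reinstall some-formula
```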


I've honestly never found a good reason to use brew instead of macports, aside from an annoying hipsterism. Welcome to try to convince me otherwise.


Macports is good, I'm also a fan of pkgsrc[0].

As to Homebrew, I don't understand why it complains if I use sudo to do an install, but then also complains if I'm not running as an admin account! If there's a reason for this hair-splitting, I don't know what it is.

[0] https://pkgsrc.joyent.com/


Could also have a lot to do with Ruby. Chocolatey is way worse though (Windows).


Try Scoop on Windows instead of Chocolatey.


*shrug* I use a Mac for the generally superior UI, interface hardware, and app ecosystem, and ssh into a Linux VM/server/workstation for the generally superior development experience. Best of both worlds.


Would be nice if Apple stepped up in respect to tools...


Totally tangential, but who could have predicted, 20 years ago, that in 2019, more than zero people would be talking about having to use Macs at work, then going home to their comfy PC. How the turns table. I increasingly don't have strong opinions either way (pros and cons to everything, etc.), it's just such an unexpected timeline we've ended up in.


If it wasn't for Microsoft pumping millions into Apple around that time, we very well may not be having this conversation now either.

https://appleinsider.com/articles/18/08/06/august-6-1997----...


I also use Windows, Linux and MacOS, but mostly Windows.

The "Moby" VM on Windows is a bit annoying - it takes 30s or so to start the Docker engine, bind mounts/volumes are a bit pernickety, and resource use is obviously higher than without a VM. Having said that, once it's started everything works pretty well, with containers starting almost instantaneously.

I believe there are some IO perf issues if you're using WSL (I don't use it much, preferring git bash for most things).

Both of these issues should be fixed when WSL2 finally arrives. But unless you're on an Insider's build, I believe that's going to be 2020 (someone please correct me if that's wrong).


Alongside pouring resources into the Windows Subsystem for Linux, it would be nice if MS would put some resources into a Linux Subsystem for Windows. That is, do for WINE what they've done for Cygwin.


They have hardly done anything for cygwin.

Windows NT had Windows Services for UNIX, replaced by the Subsystem for UNIX-based Applications from Windows 2003 until Windows 8, eventually replaced by the Windows Subsystem for Linux, given that Linux API compatibility has now become more relevant than straight POSIX code compatibility.


I agree. I would argue that bringing PowerShell to Linux is a good step in this direction, but it's still missing some key components (Active Directory...).


Don't worry about Active Directory, just sign up to Azure.


I love macOS. The only reason I have Windows at all is for a few games. That being said, Apple has gutted the MacBook lineup.

The hardware, especially the keyboard and Touch Bar, is utter trash and reminiscent of the Dell and HP laptops from around 2008, which were the reason I dumped those brands in the first place.

All the pointless dongles were a moronic excuse to make thinner machines. They forgot about function and also failed to realize that aesthetic doesn’t matter if the keyboard is literally a dumpster fire.

I wonder sometimes if Tim Cook is becoming senile or if he just doesn't care, since the iPhone is printing money.


I remember when it was the opposite. My PowerBook G4 from 2005 had DVI and S-Video. S-Video! Why would anyone in 2005 need an S-Video port?


Projectors. Was a big deal back then.


The Docker/IO perf story is changing with WSL2 significantly.


Yes indeed! And I will gladly attempt to finally dig into WSL at that point. For now I just use choco and pwsh, and I find myself productive with Go/Node/Elixir/Rust without needing WSL just yet.

I mainly use Windows for games but every now and then I don't want to leave my desktop to make some changes to a repo while my character is respawning/spectating or something haha

WSL2 is going to be awesome for sure


I’m loving WSL2 on my Insiders build. But there are issues with local host networking and file updates to Windows apps right now that need to be worked around

But it is very fast!

The Insider build also green screens less often than the most recent general release did!

Having Github available for issues is great too, as you can see progress being made in fixing the bugs - I feel more connected then when I actually worked inside Microsoft!


I'm curious, are you still experiencing issues with localhost networking? I'd read that as of build 18945, those issues should mostly be fixed. [1] Or is it that you're hosting applications on Windows and trying to access them from Linux?

[1] https://devblogs.microsoft.com/commandline/whats-new-for-wsl...


Some people still are - mine now works. I think it depends on your Linux server binding to 127.0.0.1 or 0.0.0.0

I’m currently working on cross host code, so being forced to use the Hosts file actually helped me out!

I haven’t tried going from Linux to Windows via localhost though

I do have issues running multiple Selenium tests with a headless browser. Running 1 process works ok though
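
The 127.0.0.1-vs-0.0.0.0 distinction is easy to demonstrate with Python's built-in HTTP server (any server with a bind option behaves the same way; which of the two WSL2's localhost forwarding picks up has varied between builds):

```shell
# Bound to loopback: reachable only from the same host
python3 -m http.server 8000 --bind 127.0.0.1

# Bound to all interfaces: reachable from other hosts too
python3 -m http.server 8000 --bind 0.0.0.0
```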


5 years too late


> Thinkpad

Which Thinkpad model do you recommend?

And which Linux distro do you use?


I'll go ahead and drop my preferences here as well:

The X1 Extreme is nice. It is one of relatively few PC laptops that are comparable to the 15" MBP in terms of performance (H-series processor rather than U-series) and portability. It has upgradeable RAM/SSD, so I got the base model with the CPU upgrade, then purchased 32GB/1TB separately and saved something like $600-700 as a result. With slightly better cooling and a 3:2 display I think it would be nearly perfect.

Concerning distro, if you want to "just get work done" I recommend Ubuntu if you like the way it looks or want to use one of the non-GNOME desktops. If you do like GNOME but not the Ubuntu desktop, Fedora is great.


I have had the X1 Extreme since December, running Ubuntu 18.04. My experience with it, in short: "the worst possible ThinkPad I've had in 15 years" - and I have had 8 ThinkPads.

Maybe I am unlucky. I bought it because I finally wanted a 15" screen with a centered keyboard, i.e. without the num-pad, because an off-center keyboard is really, really weird for touch typists. It started with the fact that, in order to get Linux running, I had to switch the graphics card to discrete mode - at least this is what I found on the Internet. This bricked the machine, and according to the Lenovo support thread I was far from alone [0]! Luckily I had so-called 24h on-site support. They were able to come only after a week... with the wrong board... With travel in between, I had to wait a month in total to get the machine working.

Now it works and I am using it and trying to accept it.

- It is extremely loud! BIOS updates made it better as of lately and I got a bit more used to it.

- It gets extremely hot. So hot that actually typing on it gets uncomfortable.

- I am not able to do any meaningful work for more than 3 hours on battery. With my last X25 with 2 batteries I was able to work a whole day!

- The screen is like a mirror! I found a workaround by trying to work as much as possible with white background.

- And finally, but this is probably more the fault of Ubuntu/Gnome/nVidia - it is the laggiest experience ever! I mean, I am working most of the time in a terminal! Typing in the terminal is so laggy that I don't remember having such a laggy experience even back in 1995 when I started using Linux. Come on, this is supposed to be the most powerful machine I have ever had?!

- Using an external screen is possible only when switching to nVidia. When using the Intel graphics card (to prolong battery life) you cannot turn on an external screen.

- Another annoying thing, though probably this is Gnome/Ubuntu/nVidia again: as soon as I lock the screen in Gnome, the fan starts turning like hell and the temperature rises! Come on, is the Gnome screensaver mining bitcoins or what? I mean, I configured it to just turn off the screen when I lock it! And instead of saving energy it is heating the planet!
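
(For what it's worth, the Intel/nVidia switching above is done on Ubuntu with nvidia-prime, and each switch needs a logout or reboot. A sketch, assuming the proprietary driver and the nvidia-prime package are installed:

```shell
# Show which GPU is currently selected
prime-select query

# Switch to the discrete GPU, which drives the external outputs here
sudo prime-select nvidia

# Switch back to the integrated GPU to save battery
sudo prime-select intel
```
)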

By now I have spent so much time trying to configure, update and whatever that I am really tired of it. I mean, I have work to do! I am preparing to move to NixOS, because I heard from some people that they got it configured to be usable. Preparing for this slowly, when I can dedicate time.

It's not all bad though. There are some positive things:

- I like the physical build quality. It feels solid and sturdy.

- The screen (I have the 3840x2160 resolution) brightness, resolution and colors are really good for viewing photographs or watching movies. Unfortunately, working on text is only possible with a white screen; otherwise it is like a mirror. It would even be possible to work outside, but you need to have electricity - the battery life is horrible.

- I like the keyboard. The key travel is nice, and I have the impression that it is a bit more distinctive than on the prior models that I had (X25 and T450s).

- The CPU power is more than enough for me.

[0] - https://forums.lenovo.com/t5/ThinkPad-X-Series-Laptops/Anoth...

EDIT: Typo, s/X24/X25/


I use a T490: 500-nit display with 100% ARGB coverage, upgradeable RAM (well, one slot, but more than enough for a JS dev), excellent keyboard. Was running Manjaro, but switched to Ubuntu Budgie. Workhorse machine for getting shit done.


Props for recognizing the awesomeness that is budgie!


Not gp, but I currently use a Lenovo T 480s dual booting Windows 10 Pro and Ubuntu 18.04 LTS and it's the best development setup I have used so far. No driver or hardware problems whatsoever.


Second that T480s, love it. 24GB RAM, 2TB SSD. Booting OpenBSD, macOS and Linux. Scaling on WQHD display works solid. Only drawback is that the WWAN (4G) only works under Linux and the touchpad is not as good as in a MacBook Pro.


Yea this is a great route as well! Considered it but found a deal I couldn't pass up on a refurb x1c6 haha


YMMV, but I use an X1 Carbon 6th Gen, 1080p/matte screen. No dual boot, so some things to tweak, but not much at all tbh.

Installed erpalma/throttled from GitHub to squeeze max perf out of my CPU. Also make sure to set sleep to S3 in the BIOS. Otherwise nothing else to do - everything seems to work just fine.

The 7th Gen just came out so there might still be some issues but I would look around and see if it's good to go.

I use Ubuntu 18.04 LTS. Once 19 is done I will more than likely move on to 20.04 LTS
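
The throttled setup above is roughly this (a sketch from memory of the erpalma/throttled README; check the repo for current instructions, and note the service name has changed between versions):

```shell
# Fetch the throttling workaround and install its systemd service
git clone https://github.com/erpalma/throttled.git
cd throttled
sudo ./install.sh

# Confirm the service is running (older versions named it lenovo_fix.service)
systemctl status lenovo_fix.service
```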


My 7th gen X1 works on Ubuntu out of the box, except for an annoying trackpad bug that can be fixed with a one-line config file change.


Oh good to know! Thanks for the info


I use a 6th gen X1 Yoga at home (for its oled panel and integrated wacom tablet) and an 8th gen X1 (non-yoga) at work. (note - the generation I refer to is the processor generation, ie the 6th gen is an i7-6600u, which lenovo calls a first gen x1 yoga)

The yoga needed some tweaks to get the OLED brightness to work correctly, but other than that they basically worked out of the box under ubuntu 18.04. My only complaints are that the oled panel only came in gloss and has since been discontinued (although there are rumors that a 15" 4k oled will be available on the x1 extreme), and that the mouse click on the yoga is several orders of magnitude too loud. The trackpoint buttons are fine and the keyboard is great, but the trackpad click is loud enough to hear in the next room over.


I still use a T420 and T430. Both are great for development use. The screens are lacking, but when you spend most of your time in the terminal or browser, you don't really notice.


T5x0 (Avoid the T540, the touchpad buttons are supposedly shit.)

I'm running a T530. Max out the ram, get a ssd, it's great. I'm probably on my third one, I keep buying junkers for $50 from eBay for the parts. I majorly abuse it. The magnesium frame likes to crack near the heatsink, the steel frame around the screen likes to crack 1 inch above the hinges, don't spill water on the table it's sitting on.


T480 archlinux and kde


I come at this from a productivity perspective after using Macs at several jobs - Linux boosts my productivity by 30-40%. Anecdotal, sure, but things just work in Linux, and you're not constantly having to fiddle around and click through things. It could also be that Macs have declined in quality, and I can put Linux on nearly anything and it lasts forever, especially when you put it on high-quality gear that just doesn't exist at Apple.


My experience is somewhat opposite. I love Linux. I've used it for over a decade in various roles. But it always ends up the same - either some obscure Bluetooth or graphics bug frustrates me to the point I can't stand it anymore, or I get fed up with 2 hours of battery life and go back to macOS.

In this case, I'm measuring my productivity by the time it takes to mess around with the OS to get the desired result, and the fact that the stability baseline just never seems to get there.

I get and support the attraction to Linux on the desktop, but find YMMV to be very much true.


Yeah, this is exactly my experience too.

I try to move over every six months to a year or so, and it's the same gripes every time at this point.

Driver support's reasonable now, and the desktop environments are generally solid enough, but things like mixed DPI work really badly on Linux, my browser nearly always tears when scrolling on my secondary display, etc.

But... the single biggest killer for me though is how badly Linux copes with very low amounts of free memory. Put 32G in a machine and it still periodically runs completely out under my dev workload and when that happens, the whole system becomes unusable and I have to hard reboot it. I'm not sure what macOS and Windows do differently, but it just doesn't happen on either of those two OSes.

I really want to have the freedom to pick and choose my hardware more, but at the moment I keep falling back to macOS.

It's a UNIX environment so it has the tooling I want and a solid GUI that works well.


There are discussions happening on LKML right now about how to solve this. I don’t have a link handy, but saw them either here or on LWN recently.

I used to have that problem too, but it went away when I stopped using JetBrains products :). Not for any reason other than the contract I was working on ended.


Then I shall live in hope that it will be fixed :)

And I didn't even have a JetBrains product in the loop, it was a mix of virtualbox and VS code, along with browser, mail client, etc.


To elaborate a bit based on my understanding of the issue: VirtualBox seems like a great program to tickle the issue.

During a memory pressure scenario, the kernel starts looking around for things that it can get out of RAM to free up space. If swap is enabled and not saturated, paging out some data to disk is a likely option. Reducing disk cache size works too. But... when the usual candidates run out, things have to get more clever. Things like shared libraries can get paged out! If one of those pages is requested, it can be reloaded from disk. Or, in the VirtualBox case, the mmap'd disk image can be removed mostly from RAM and have those pages loaded from disk as needed. Performance sucks terribly, but it keeps trucking on.

The wrinkle in all of this is SSDs. The out-of-memory (OOM) killer heuristically watches the system and kills off processes that cause memory pressure problems. These heuristics, however, are expecting these page-in and page-out operations to be slow (as they were on HDDs). On newer SSDs, the disks are too fast to trip the OOM killer into action! This is why, when this problem manifests, your disk activity light goes on solid, even if you don't have swap enabled. The kernel is sitting there trying every trick in the book, and the OOM killer doesn't see what's happening. Every individual page fault is handled quickly, there's just waaaaaay too many of them.
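If you want to watch that pattern happen live, the standard procps tools make it fairly obvious (a rough sketch; exact column names vary a little between versions):

```shell
# Watch system-wide paging activity: in the scenario described above,
# 'si'/'so' (pages swapped in/out) and 'bi'/'bo' (blocks in/out) spike
# while free memory sits near zero and the machine feels frozen.
vmstat 1 5

# Rank processes by major page faults (each one is a read back from disk):
ps -eo pid,comm,maj_flt --sort=-maj_flt | head
```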


Yep, this is an accurate description according to my understanding of the issue too.

The lesson I've recently learned is that, for now, swap is necessary on Linux machines with SSDs. I've enabled zswap and added a 4 GB swap file to my machine with 16 GB of RAM, and the problem hasn't recurred for me since. Supposedly, the memory pressure measure in the kernel gets a more accurate reading when swap is enabled, but I don't know for sure that that's true. At the very least, you can page out the memory you're using the least, instead of paging out file-backed pages, which is what happens in memory-pressure situations on SSDs (as opposed to OOM killing).
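For anyone wanting to replicate that, here's a minimal sketch of the setup (a 4 GB swap file plus zswap; sizes and paths are illustrative, and everything needs root):

```shell
# Create and enable a 4 GB swap file
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab  # persist across reboots

# Turn zswap on for the running kernel (to persist, add zswap.enabled=1
# to the kernel command line, e.g. GRUB_CMDLINE_LINUX)
echo 1 | sudo tee /sys/module/zswap/parameters/enabled

# Verify
swapon --show
grep -r . /sys/module/zswap/parameters/
```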


Wow, that's a really cool explanation. Although, having had an SSD in my desktop for 8 years now, it's a bit sad to hear.


I saw it as more of a Java problem on Linux than a Jetbrains one with Android Studio.

When Android Studio is run with a large code base alongside the emulator, memory issues were frequent on Linux, sometimes to the point of the system halting. Stack Overflow has several such cases.

No such issues with macOS, even with multiple JetBrains IDEs in parallel (same memory config).

I wonder how Android Studio is doing on ChromeOS, considering many of those are low end machines. I'm sure they had to optimize it, but I assume the issue would persist till the Linux kernel itself is fixed.


Another alternative for people who still love macOS but can't tolerate the dumpster fire that is the butterfly keyboard is to get a new Mac Mini, then get a monitor of your choice and righteous clickety-clack buckling spring keyboard to go with it.


Seconding this, the Mini has really reduced my periodic urges to upgrade my 2013 MBP.


I find that for low-RAM situations zram is very handy, allowing a graceful reduction in performance under memory stress rather than the cliff edge you get with a swap partition.
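For the curious, a rough sketch of setting up a zram swap device by hand with util-linux's zramctl (size and compression algorithm are illustrative; needs root, and many distros now ship a zram service that does this for you):

```shell
sudo modprobe zram                                     # load the zram module
dev=$(sudo zramctl --find --size 4G --algorithm lz4)   # allocate a device, prints e.g. /dev/zram0
sudo mkswap "$dev"
sudo swapon -p 100 "$dev"   # high priority, so it's used before any disk swap
swapon --show               # verify
```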


zram has been replaced with some other technology (z-something, can't remember the name) that also compresses swap and removed duplicate pages.


zswap?


And this is why many people like me stick to Mac. Yes there's a solution for it on Linux, but no I don't want to look for it, maintain it, and at some point when a new 'best solution' is available keep up to date with all that...


On Linux, the distributions do that for you. You don't have to, but if you want, you can.

On Mac, you can't, even if you want. So you won't see discussions like these, because Mac does not have that kind of visibility inside. If something is broken (and Mac has its share of broken things), you get to keep all the pieces.


I had the same issue. The solution I settled on (and have been very happy with) has been a Mac Mini as a polished front end/web browsing machine, and then a Threadripper workstation running Ubuntu that I ssh into and do all dev work on. The pleasure of OS X without being so limited by Apple’s hardware options (and extreme markup).


How big is your swapfile?


I'm really curious, what is your dev environment/setup like?


Have you tried earlyoom?


+1 to this as a workaround until the kernel finally addresses the issue. Earlyoom is a user space OOM-killer that kicks in before the system starts the mad paging dance.

https://github.com/rfjakob/earlyoom

Packages are available in Debian Stable (Buster), so they should be available in most child distros by now as well.
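On Debian/Ubuntu the whole setup is roughly this (needs root; the systemd unit name matches the package):

```shell
sudo apt install earlyoom
sudo systemctl enable --now earlyoom

# earlyoom logs every kill with the reason, so you can audit what it did:
journalctl -u earlyoom --since today
```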


It seems likely that something can be done to make the behavior less perverse but I'm not convinced that the behavior CAN be better than something like earlyoom.

What makes earlyoom useful is the fact that you can tell the machine what is low-value and likely to be problematic. I'm not sure that information can be determined automatically. I'm further not sure what a better strategy looks like than "start killing low-value, problematic processes when we reach a threshold".


Do you have a swap partition?

I've been in a couple of interesting discussions about Linux memory management lately that enlightened me somewhat. I won't claim to be an expert now, but I've been around the low-memory block enough to understand that there's no simple right answer to the question "Do you have swap?"

"The Linux kernel has overcommit baked into the fiber of its being." I've begun to understand that this idea is so deeply engrained in the kernel that in a multi-tenant or desktop workstation, you simply can't extract it back out and "just provide enough RAM," unless you know the performance characteristics and you really mean it when you say "that should be enough RAM." If you don't have any swap and the kernel starts to run out of memory, it's going to start evicting whatever pages it can back to disk.

(Wait, pages back to disk? I told you I didn't have swap.) Yes – the Linux kernel can evict pages back to disk even if you don't have swap. Remember, all of the binaries you're running originally came from that disk, and the kernel knows it doesn't strictly need to keep those clean, file-backed pages in memory unless they've been modified, or until you try to read them again.
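You can see the kernel's overcommit policy and its commit accounting directly (read-only, no root needed):

```shell
# 0 = heuristic overcommit (the default), 1 = always allow, 2 = strict
cat /proc/sys/vm/overcommit_memory
cat /proc/sys/vm/overcommit_ratio

# How much memory has been promised vs. the strict-mode ceiling
grep -E '^(CommitLimit|Committed_AS)' /proc/meminfo
```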

Having some swap gives the kernel something else to evict, so have a healthy amount of swap and Linux will find the occasion to use it for the least frequently used pages that are not already on disk. This will improve your "nearly out of memory" performance.

The second-worst thing that you can do is put your swap on a fast SSD or NVMe drive, and it's not for the reason you think. The kernel is making decisions based on a heuristic which is complicated and well documented, but inscrutable. If the solid-state disk is 50x faster than the spinning disk that swap was originally designed for, then swapping will cost less overall, and the heuristic will lean on it as a strategy to keep the OOM killer away even more often. You may find your cache recycle rates going through the roof, because things can be paged out to disk and reloaded faster than should be possible. I don't fully understand this part, but I suspect the answer is "try to use swap less, and be aware of when you are using it."

The kernel really does not want to kill off your processes, and it has more opportunities than ever to keep too many balls in the air when you have asked it to do so. So, find a way to stay ahead of the kernel and know better. If you have a dock widget that tells you when you are above 50% swap usage, you can close some tabs before it becomes an unrecoverable situation. It's a mystery to me why modern computers don't come with disk activity lights; we didn't need dock widgets to spot this problem 20 years ago, when literally every computer came equipped with one.
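Absent a widget, a tiny script over /proc/meminfo does the same job (the 50% threshold is the arbitrary one from above; adjust to taste):

```shell
#!/bin/sh
# Warn when swap usage crosses 50%, reading straight from /proc/meminfo
awk '/^SwapTotal:/ {t=$2} /^SwapFree:/ {f=$2}
     END {
       if (t == 0) { print "no swap configured"; exit 0 }
       pct = 100 * (t - f) / t
       printf "swap used: %.0f%%\n", pct
       if (pct > 50) print "WARNING: close some tabs"
     }' /proc/meminfo
```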

The best advice is to have enough RAM for whatever you're doing, and at 32GB, "I think you've had enough." At any rate, the one suggestion I can give is: if you anticipate ever running out of memory (and it looks like you still do), then be sure to have a healthy amount of swap. To me that's probably at least 5 or 6GB, but YMMV.

But 32GB for a desktop workstation really ought to be enough IMHO, so try to find a way not to run out. If you're eating all that memory up with VMs, try a lighter-weight solution for your ephemeral workloads like footloose, which behaves like a VM in the ways you generally tend to want for dev workloads (for example, it can run systemd like your deploy target most likely does, if you're using VMs to match the deploy target). Footloose doesn't impose the VM's whole footprint upfront, because it's actually a container, so when you run out of memory it will be because your application workloads used too much, not because your virtual machine manager grabbed much more than it needed.


All that being said, my daily driver is a Mac and I don't think about this stuff either, until it affects a server.


> I get fed up with 2 hours of battery life

As I'm typing this, I have 14 hours and 33 minutes of battery left and it's only charged to 83%. In hardcore scenarios off the grid for a few days (when I go sailing) I have a few spare batteries. (The laptop is a Lenovo X270.)

> In this case, I'm measuring my productivity by the time it takes to mess around with the OS to get the desired result, and the fact that the stability baseline just never seems to get there.

The time investment goes down significantly over time. After a decade of using Linux as a daily driver, I don't remember the last time I tweaked anything.

> fed up ... and go back to macOS.

Same thing but the other way around. I stay away from my Mac except for:

- making sure my applications look good enough on a Mac and are usable

- making music as I'm not patient enough to relearn everything on a different platform but that's just laziness from my end


>The time investment goes down significantly over time. After a decade of using Linux as a daily driver, I don't remember the last time I tweaked anything.

So, it gets stable after the first 4-5 years of tweaking. Doesn't that make the parent's point?

And after those 4-5 years, won't one have to get a new laptop at some point, upgrade to a newer OS version, and adjust to whatever changes FOSS projects like Gnome/KDE/etc. made in the intervening years, all from the beginning?

>- making music as I'm not patient enough to relearn everything on a different platform but that's just laziness from my end

Just laziness? As if Linux has anything remotely as powerful/coherent as Live/Cubase/Logic/etc, Native Instruments, Arturia, and all the other VSTs?


> And after those 4-5 years, won't one have to get a new laptop at some point, upgrade to a newer OS version, and adjust to whatever changes FOSS projects like Gnome/KDE/etc. made in the intervening years, all from the beginning?

But that's part of what I love about Linux: my text-based configuration doesn't have to change because I upgrade, unlike on the Mac, where the name of some `defaults write` key suddenly gets renamed or disappears altogether.

In the rare cases that something in a Linux distribution changes so much it upends your config, you're just a package install away from getting the old behaviour back until you want (if ever) to deal with it.


Reaper and Bitwig run on Linux. Plenty of Reaper users come from other DAWs. DAW choice is mostly workflow preference rather than feature differences.

VSTs, though... people use Carla, which is based on Wine, to run Windows VSTs. Results vary.


Yeah sure, I've heard it all. Let me guess: what you conveniently forgot to mention is that you only use terminal applications with a minimal window manager, which makes a comparison to macOS or Windows completely pointless, since their desktop environments are 40 years more advanced and thus have to do a lot more computation.


There's very little net gain to changes in GUI. Most of the fundamental metaphors actually date from the Mother of all Demos, December 1968, fifty-one years ago this year.

Actually, that still has capabilities lacking from "modern" GUIs.

https://www.invidio.us/watch?v=yJDv-zdhzMY

jwz observed that "UI is different" years ago when discussing Safari vs. Firefox interface changes. (Wayback link to avoid his "special greeting" for HN visitors): https://web.archive.org/web/20120511115213/https://www.jwz.o...

I watch "normal" computer users struggling to keep up with even very modest changes to MacOS UIs. Which, for the record, are remarkably consistent with the first iteration, deployed in 2001, eighteen years ago. It's older now than the Classic Mac interface was when OS X was introduced (1984 - 2001: 17 years).


That's not the point.


What is the point of advances if not to provide greater end-user utility, functionality, ease of use, etc?

Again: changes in GUI demonstrably do not deliver that.

And good GUIs don't change.

Because, in large part, of the institutional cost of breaking shell scripts, TUIs don't change often (and tools violating this principle are quickly and sharply deprecated and/or replaced with ones that don't). Which means that, as a user (or administrator or programmer), the investment you put into console tools tends to have an exceedingly long half-life.

Mind: I'd given this deliberate and conscious thought in the mid-1990s when I was faced with a few possible directions to take my own computing career and use. I'd already seen numerous platforms, notably proprietary and GUI ones, change substantially, or die entirely. Seemed to me that the skill-preserving route would be with Linux or the BSDs. That's proved a good decision and rationale.

Even a "minimal window manager" -- say, twm or vtwm, provides extensive functionality and does not change. There's a hell of a lot to be said for learning a skill once and not having to either replace it with another, or keep obsoleting previously acquired knowledge and habits.

I don't use twm myself, outside occasional testing. One of the best and most skillful programmers I've ever known did use it, and had a highly tricked out configuration, almost completely keyboard driven, that let him fly around his display and workspaces with an amazing faculty. The fact that the windowmanager itself is flyweight and bedrock stable only added to this.

My own preference is WindowMaker, based on the design principles of NextStep (1988), and largely static since the late 1990s. It has capabilities modern WMs and DEs still lack, is extremely high performance, and extraordinarily stable. Graphically, it's nonobtrusive. I might swap it for a tiling WM, but it's served me well for over two decades.


> After a decade using linux on as a daily driver

I mean, this is hardly an endorsement for people early in their careers who haven't already concluded that Linux is the best path forward.


Actually, it is.

I'm in my fourth decade of technical activity. I'm leveraging skills and tools I learned in my first day using Unix, in the mid-1980s.

Over the same time, I've gained, and obsoleted, skills on CP/M, MacOS, VM/CMS, MVS, VMS, DOS, Windows 3.x, WinNT, and classic Macintosh.

Yes, there are a few flavours of Unix -- BSD, SysV4, Solaris, HPUX, Irix, AIX, FreeBSD, and numerous Linux flavours. Those, and even OSX/MacOS share far more in common than all the other platforms.

Unix knowledge has proved extraordinarily durable, as have the tools. Though new utilities and environments come out frequently, old standards remain available and still work. I'm not forced onto that treadmill, most especially not for my personal work.

GRRM still uses Wordstar. Works for him.

(That's ... one of the editors I've used as well, though I vastly prefer vim these days -- one of those "first day on Unix" skillsets I'm still earning dividends on.)


I'm not exactly sure the GRRM point holds true. He might write in WordStar, but the distraction-free writing environment hasn't exactly helped him finish a book in the past decade.


Have you seen the control-group results? ;-)

There's also Stephen Bourne, who had initially programmed in Algol, and has a bunch of Algol-like macros that he uses when programming in C. I'm not finding an original source, though several references turn up.

Muscle memory is a real beast to change. The local optimum is always "stick to what I know".


I've not had obscure graphics or bluetooth problems for like 4 years now. Graphics problems have not been a problem for the last decade.

I've been running linux desktops and laptops for about 20 years, starting with early/pre-RHEL redhat, and moving around to many others. My first laptop was a pentium thing at 75 MHz, and I triple booted Linux, OS2, and Windows/DOS on it. I wound up kicking off the last two, as I used them only infrequently.

Battery life is an issue for me, but its not linux specific. The laptops I have, all have power hungry ram and GPU cards. I get 2 hours on them, or if I play with the brightness and other things, I can stretch it to 4 hours. My old 2010 laptop (still in use, still running linux) is a 16GB ram, 0.5TB SSD affair with an NVidia GTX560m card. My 2018 laptop is a 48GB ram, 1.5 TB SSD (0.5 + 1.0) with an M2 256GB SSD for the included windows 10 home, and a GTX 1060m card. Windows 10 on the newer laptop lasts about 2.5 hours before it shuts down. I now run the pre-installed windows 10 via a kvm with passthrough of the M2 into the instance.

All of these are currently running late model Linux Mint 19.2 with accelerated graphics.

Work laptop is a Mac 16 GB ram, 512GB SSD with an intel/AMD hybrid graphics bit. This will last 5 hours with significant tweaks to aggressive power off, and me not running any builds on it.

I like the mac for its physical fit and finish, weight, etc. But I need to bring the power supply with me, as I can burn through much of the power in a 2 hour meeting.

I like the Linux machine for work, and everything else. It just works. The drivers just work. The networking just works. Single/multi displays just work. I have Cinnamon (the desktop environment) set up to a very comfortable configuration.

I am hopeful that the day job will enable me to trade up to a bigger machine with linux and nvidia graphics at some point ... 32GB is bare minimum for a functional machine for me, 48->64GB is better.

My home office deskside is an older Sandy Bridge machine with 16 cores, 128 GB ram, old GTX750ti card, running the same environment as on my laptop.

Of course, YMMV.


> Graphics problems have not been a problem for the last decade.

Except when one happens to have a laptop with an older AMD card or an Optimus Intel/NVidia combo.


Older AMD? Any AMD card or integrated GPU made in the last 14 years works out of the box with Linux, save their newest GPU arch.


Tell that to my AMD Radeon HD 6330M, released in 2010, 8 years ago.

The open source driver still doesn't provide feature parity with the proprietary one that Ubuntu LTS dropped support for.

https://www.omgubuntu.co.uk/2016/03/ubuntu-drops-amd-catalys...

Namely hardware video decoding and OpenGL version.


Sounds more like AMD dropped support for Linux. It's shitty, but I'm not sure what you can expect from distributions if AMD stops updating their drivers for newer versions of xserver.


Yep, while you can still get the up-to-date driver for Windows, fully DirectX 11 compliant.

So this thing about marvelous open-source AMD support depends pretty much on how much luck one has.


I don't have laptops with older AMD. My current laptop can switch between the integrated and discrete GPU, but it doesn't work well even under Windows, so I disabled it in the BIOS.

AMD drivers have been hit and miss for a while, which is one of two reasons I tend to prefer NVidia cards. NVidia took time to make sure their whole stack works reasonably well.


What brand/model is your 48GB laptop? I'm shopping for something that can take lots of RAM.


This is a Sager Notebook NP8156. There are newer models, including better NVidia cards. Up to 64 GB ram. I bought mine with 16GB and added 32GB. Very expandable. Battery life is 2-ish hours, though can be increased by reducing brightness.

Lenovo has a 32 GB max model, and HP has a 64 GB max model.


I've been using Linux full time for 3 years now and I agree. Linux comes with lots of little edge-case bugs. Like a thousand paper cuts, they add up and hamper productivity.

I can finally say Ubuntu MATE 18.04 for me is pretty solid. There's still two issues I wrestle with, but other than that it's been very dependable for me.

When it comes to OSes these days, I feel like you have to pick the least bad one. A truly rock solid OS just doesn't exist in my experience.


Same. I use Linux (specifically Ubuntu) on my work/study computer and I love it.

But weird hardware issues popped up now and then which I didn't have to deal with on my MacBook, such as the wifi driver that had to be installed manually, the headphone jack sometimes doing weird things, and the processors overheating.

I was able to solve those problems by searching online and finding others who had the same issue, which led to instructions to fix the problem, but I just can't imagine my family members, who have never used the Terminal in their entire lives, having the same success.

The fact is that with my old MacBooks (and the computers I've given to my family), the only hardware-related issues we've had over the last 10+ years are the battery issues on the MBP.


I've been using Linux for the last two years, having switched from macOS, and I haven't had a single weird problem. Not one.

I use Arch Linux and a combination of KDE and i3, so one would expect me to have a ton of ridiculous problems. But I haven't had a single one. It took me a single day to set up my computer how I wanted it (I've used Linux before), and I haven't had to touch a config file since.

Part of this might be that I'm using a Dell XPS DE, which is designed to work well with Linux, but I think it might just be a YMMV situation. And I do a lot of back end, applications, web, and front end development (mostly in my spare time), so I think I've hit a lot of programmer use-cases.

Also, I get 15 hour battery life streaming 4K video on my 4K touch screen, and more doing other stuff.

I also used an Apple Magic Touchpad or whatever over Bluetooth with it, including gestures and it worked really well. Better, actually, as far as Bluetooth, than my Mac. I only stopped because it wasn't good for my hand and wrist health.

Just adding my anecdata. Overall my experience with Linux has been hugely positive.


> or I get fed up with 2 hours of battery life and go back to macOS.

Not sure about the Bluetooth part (I had that kind of issue once in like 10 years), but on battery life, getting 2 hours is not common anymore. Sure, you won't last as long as Windows on the same hardware most of the time, but 5 to 7 hours on a full charge is what you can expect, if not longer.


This, and it's come a long way in the last 10 years, but it's more dependent on hardware. Optimus is still an issue: Nvidia (3 hrs) vs the Intel 630 (5 hrs), and you have to switch between them. However, I just bought a pair of (used) X270s to travel with - one has Fedora and the other Windows 10. Linux lasts 8-9 hours and Windows about the same (maybe less).


A maxed-out X250 has been reported online to last 26 hours, with a plain Debian install and no tricks like the Mac's "oh, you look outside, I'll go to sleep now".


Anecdotally, my cheapo $500 linux laptop gets a full work day (so 6-8 hours depending on what I'm doing, unless I run docker, then it gets maybe 3 hours tops) while my previous 2017 $2000 macbook pro typically got around 4 hours for the same type of work.

I did spend about 2 hours optimising for battery when I got the laptop, but it was a once-off thing.


I guess this all depends on your dependence on mobile hardware.

Bluetooth? I don't even use it. Battery life? I'm on a desktop.


It's a good idea to research any components for compatibility before buying them.

It's true that Linux won't run with all hardware, but this is even more true for MacOS ;)


I find that many people who say things like this have never actually tried to boost their productivity on Mac in the first place. As someone who uses Alfred, Karabiner Elements, Keyboard Maestro, Spectacle, Omnifocus, and a host of plugins on fish shell besides, it baffles me how slow my peers are at doing basic stuff they do twenty times a day. Just set up a shortcut, it's trivial.

Not so trivial on Linux. The fact that the author considers Firefox add-ons (!) a Linux feature is a clear indicator that they never even tried on a Mac and were enamored with Linux due to its lionisation in the particular subreddits they follow.

Kind of like here! Not that there is anything wrong with getting excited about something. Just don't make sweeping generalizations.

Also, a small tip: You can put your Mac apps and configurations in Dropbox and they will show up and work as you expect across your multiple machines.


One thing I sorely miss after switching from Linux to Mac is i3. Spectacle gets close, but I couldn't figure out a solution for something as trivial(?) as switching window focus with the keyboard... Can Keyboard Maestro do it?


I haven’t used i3, but I missed good keyboard context switching support when I started using macOS. There’s some reasonable solutions out there, but my current favorite interface customizer is BetterTouchTool, which is setup to let you customize any input for any kind of OS function you might want. I have a 4-monitor setup so easy window/tile management is important to me. I have a handful of keyboard chords to put windows on the screen/quadrant combo that I want. My TouchBar is permanently set to media keys with media info + notification area + emoji keyboard. For keyboard control over focus, I use the Contexts app inter-app and ShortCat within app.


I personally use Contexts (https://contexts.co) for switching between applications. It functions similar to rofi's application switcher (fuzzy keyboard driven switching).

There are a couple of automatic tiling window managers for MacOS, the most notable being Yabai (https://github.com/koekeishiya/yabai) and Amethyst (https://github.com/ianyh/Amethyst).

Yabai doesn't handle window switching on its own but it can use SKHD (https://github.com/koekeishiya/skhd) or any other application that can bind terminal commands to keyboard shortcuts such as Hammerspoon (https://www.hammerspoon.org) or BetterTouchTool (https://folivora.ai) or even Keyboard Maestro. The commands are context-aware of spaces and the placement of windows on the x-y plane so you can move between windows relative to their position on the screen.


Depends on your use-case. KM comes with an app-switcher out of the box that looks like this and can be controlled via Keyboard:

https://puu.sh/E9qyC/cdbeb9ce4c.png

On another note, highly recommend Puush (https://puush.me/) for instantly taking and sharing screenshots without having to manually upload.

Anyhow, you're more likely looking for something like Amethyst.

https://github.com/ianyh/Amethyst


Switching between windows of the same application:

https://superuser.com/questions/299241/in-mac-os-what-is-the...


I agree with your premise, I'd suggest Yabai if you're curious about getting the productivity of a power user on Linux using something like i3


I find the opposite (and I've used Linux since 1997, plus several other unices starting from Sun OS).

Things tend to just work on the Mac, even though I use it for a wide variety of tasks (programming mainly, writing, music, and video secondarily, plus some photo work). Music and Video (DAWs and NLEs) are almost a joke in Linux.

You have to be cautious to buy compatible laptop hardware, and still there's always something not working on new setups, usually sleep, sound, GPU compositor, bluetooth, etc.


> Things tend to just work on the Mac, even though I use it for a wide variety of tasks (programming mainly, writing, music, and video secondarily, plus some photo work).

I'm pretty sure that would be my problem if I tried to move (back to) Linux. A lot of the HN crowd is (understandably!) focused on development, and the chances are your favorite dev environment is going to be good-or-better on Linux as it is on the Mac. But I can't find a screenplay writing program on Linux I personally like as much as Highland for the Mac, or a Markdown editor that I like as much as iA Writer or BBEdit for the "heavy lifting" of technical writing at work, or a graphics editor I like as much as Acorn, or a Twitter client I like as much as Twitterrific, or a Markdown previewer/converter I like as much as Marked, and and and. I know it's all subjective, but it's a sticking point. And the last I checked, at least, I couldn't find good equivalents to OS X Services/Quick Actions, which can be just amazing.

I'm sure if Apple really tanks, I could make the switch and hit a happy place, but it'd be turbulent for a while.


> I'm pretty sure that would be my problem if I tried to move (back to) Linux. A lot of the HN crowd is (understandably!) focused on development, and the chances are your favorite dev environment is going to be good-or-better on Linux as it is on the Mac.

Those of us into graphics programming and UI/UX are better served with macOS/Windows tooling, development environments, and SDKs.

That is what triggered my move back into those platforms.


> Those of us into graphics programming and UI/UX are better served with macOS/Windows tooling, development environments, and SDKs.

That's been my observation and occasional experience. I'm more of a technical writer these days, but I've done light graphics and UX work at various points over the years and prefer what's available on the Mac. (My experience on Windows is pretty limited.)


For some reason I wrote macOS/Windows and not only Windows.

Also note that not every country is enamoured with Apple price levels, hence Windows for software that runs on both.

Then there are the games console SDKs, DirectX, high performance graphics with SLI cards, e-gaming, ...


> You have to be cautious to buy compatible laptop hardware, and still there's always something not working on new setups, usually sleep, sound, GPU compositor, bluetooth, etc.

Technically, there are more 100% Linux-compatible laptops than macOS-compatible ones...

That is true even if you limit yourself to ThinkPads.


It also works excellently on low-quality gear. I have some point-of-sale machines with very old computers; they just need internet, a browser, and a thermal printer.

I installed Xubuntu and forgot about them for years; the experience is still smooth and fast, and the computers turn on quickly.


“Things just work in Linux” - please do share this magical setup you have because my experience over the last 20 years is that nothing just works. MacOS mostly works, Windows sometimes works, Linux never works.


I'm actually being driven to Linux by the absolute crappiness of Windows 10 and OS X. Oh, how the wheel turns ...

> Windows sometimes works

I'm going to take exception to this. Windows 10 has been a disaster for me.

Effectively, Windows 10 treats me like a "supplicant". "Oh, great computer, can you please do some work for me right now?" "I say NAY! I must now commence my Update Ritual. Please come back and ask again. But I make no guarantees."

Every ... single ... time. WTF!

> MacOS mostly works

Unless you want to do modern graphics and then you get the "joy" of learning Metal (or not). Or, you can simply throw in the towel and switch to Windows/Linux where you can use OpenGL and Vulkan. Valve is funding MoltenVK (a Vulkan shim on top of Metal) because it is more cost effective than dealing with Metal. Let that sink in for a minute.

And OS X hardware is a disaster. We've actually stockpiled used 2015-era OS X laptops in the office. At this point we have enough spares for when someone's laptop goes down.


I didn't use Windows XP or anything since, but I'm now using Windows 10 due to a work situation. But seriously: the Update Ritual happens out-of-hours, when you're not using your computer. It seems to take a few days longer than it needs to (why not do it tonight? why must you ask me for days and days, and then finally do it overnight?). But it's generally quite reasonable.

I don't fully understand why Linux is able to do its system updates as one of several tasks, but Windows has to take over the computer to do it though. Is it just that if a Linux box gets stuffed during update, the user is probably able to recover, so the relative risks are different?


I've never had a minor update on Linux break the entire graphics stack like macOS 10.14.4 did. And I've certainly never waited months for a major breakage to be fixed. For an ecosystem with the smallest variety of hardware, they sure are bad at supporting it...


Upgrading to Ubuntu 18.04 broke my graphics stack. I couldn't even log on. The only option was to roll back to a snapshot of the previous system. The problem was acknowledged but not deemed important.


Unfortunately, as Apple tries to integrate more and more of the supply chain, and other computer-makers are copying Apple's business model, I predict that generic computers (the ones you install Linux on) will slowly disappear from the market.


> I predict that generic computers (the ones you install Linux on) will slowly disappear from the market.

It seems to be going the opposite way: by and large there's more hardware on offer than there has ever been, so what you see is manufacturers catering to niches more than before.


Here in Germany, the majority of DIY PC shops have died or have moved to selling parts over the Internet.

Everyone just buys laptops with pre-installed OSes, phones and tablets.

Most parts shops are now targeting the maker movement (Arduino, Raspberry Pi and friends), and even gamers prefer solutions like the Asus Republic of Gamers series.

So actually it looks like the return of the vertical integration of 8 and 16 bit home markets of yore.


Agreed with vbezhenar that Apple doesn't really move the needle to make this happen. And then you have vendors like Dell that specifically cater to the Linux market.

https://www.dell.com/en-ca/shop/dell-laptops-netbooks-and-ta...


Yes, but what prevents Dell from becoming more like Apple, e.g. by installing vendor lock-in features at the BIOS level, or working with Intel to install such features at the CPU level?

Also, you can buy a GPU but in some ways it is locked down (by NVidia), so as technology progresses we might be moving away from the "generically useful computer" model.


Because competition: it would only limit their market, not increase it. Dell competes with other laptop vendors in this "generic" market, vs. the lock Apple has on its OS. And do you really think Intel would work with one specific vendor in this space to give them exclusive CPU features that nobody else would get?

NVidia has always been locked down so that is not news. If you are concerned about the GPU, buy AMD and enjoy their open drivers. Intel is entering this space too with open drivers as well. And who knows, perhaps NVidia might be opening up a bit after all vs. going the other way?

https://github.com/NVIDIA/open-gpu-doc


Android, ChromeOS and the now gone netbook market, show what happens when each OEM delivers their own Linux derived flavour.


Why is Android included?


I don't know, because it uses the Linux kernel and is a good example of OEM fragmentation?

Or should I rather write Android/Linux, ChromeOS/Linux, PuppyLinux, Xandros then?


I always wonder where Apple is going with proprietary hardware. For example the "T2" chip, which seems to be a "computer in a computer" and is Apple-only. I suspect other hardware manufacturers just want Windows to work for the most part.

From Wikipedia: “The Apple T2 chip is a SoC from Apple first released in the iMac Pro 2017. It is a 64-bit ARMv8 chip (a variant of the A10, or T8010), and runs a separate operating system called bridgeOS 2.0,[95] which is a watchOS derivative.[96] It provides a secure enclave for encrypted keys, gives users the ability to lock down the computer's boot process, handles system functions like the camera and audio control, and handles on-the-fly encryption and decryption for the solid-state drive.[97][98][99]”

https://en.m.wikipedia.org/wiki/Apple-designed_processors


There are similar things in PC land and most of the time they can be disabled.

Things do not look as good in the mobile space (mainly thanks to Qualcomm).


I am not sure this is true. I believe that generic computers will be on the market as long as there are people willing to buy them.


“Generic computers” were never a thing. For decades, every computer shipped with Windows, the drivers were for Windows, the hardware was only tested on Windows, and to make it work, Linux would usually have to pretend to be Windows (e.g. when evaluating ACPI). Except for Macs, which were the same but with macOS. Most hardware was not documented. If Linux support existed, it was thanks to the work of reverse engineers; it often didn’t exist. If you bought a random laptop, you could expect some of the hardware to be unsupported.

These days… I’d say things aren’t all that different. On one hand, Linux has better hardware support overall, and Dell and some smaller manufacturers are offering Linux laptops. On the other, a lot of hardware still doesn’t work, or doesn’t work well. Some laptops have been getting more custom hardware, including Apple’s and others, and Linux has fallen behind a bit in supporting it. But it’s nothing new for Linux to take time to support new hardware.


Apple's market share is tiny. They don't have any significant influence on generic computers. I don't really see other vendors following Apple's model. I can still build a PC from different parts, like 20 years ago, and it just works. While I haven't closely followed non-Apple laptops, I believe they are the same: just ordinary computers with ordinary parts which can run Linux just fine, if drivers are implemented. Glued batteries and soldered SSDs are a cancer, yeah, but that's more about the overall quality of the product than about restricting Linux from running. There are enough "enterprise" laptops which you can disassemble with a screwdriver, and they are not going anywhere because enough people understand their value.


Isn't it a bit harsh to say that the company that practically shaped the very form of modern computers doesn't have any significant influence?


Well, I still can't install Linux on a phone. This tells me that once freedoms are lost, they don't easily come back, and new technology doesn't necessarily let you run Linux.


What are some of those things? (Curious question, haven't thought of switching yet but could try if it's worth it.)


I'm curious about this as well. I recently got a desktop computer to supplement my MacBook, and am now using Ubuntu as my more-or-less primary OS. I'm quite enjoying it, but I certainly wouldn't claim that it works better out of the box than macOS. Perhaps it's because I've used Mac for so long, but I can't think of anything that I find difficult to set up. Ubuntu on the other hand has several issues that I've just sort of ignored - keybindings don't always work, you can't drag to/from the Desktop for some reason, some startup programs don't always run, little things like that.


> I'm curious to this as well. I recently got a desktop computer to supplement my MacBook, and am now using Ubuntu as my more-or-less primary OS.

I have used macOS as my primary OS from 2007 to ~2017 (before that BSD and Linux). I am now mostly back on Linux, though I also have a MacBook Pro that I use every now and then. Primary reasons for switching back to Linux:

* MacBook hardware limitations: too few ports, keyboard problems, expensive upgrades.

* Competitive hardware prices for Linux. I got a NUC8i5, which was somewhere between 300-400 Euro and has the same quad-core CPU as my 2000-Euro MacBook Pro. I added a 500GB SSD I had lying around and 16GB RAM. I have more resources for a fraction of the price, and can always bump up the SSD or memory relatively cheaply.

* Nix. There is package/system management before and after Nix. I actually started with Nix on macOS, but being able to manage your whole system declaratively is awesome.

* The subscription disease on macOS. I am fine with buying good applications. Overall I have probably spent thousands of Euros on licenses for macOS software. But I will not use an application with a subscription model. Period. [1] It transfers a huge amount of control from me to the software vendor. Unfortunately, more and more macOS applications are switching to subscriptions.

* Linux is generally faster than macOS.
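To make the Nix point above concrete: on NixOS the whole system is described in one file and rebuilt atomically. This is only an illustrative fragment (the package list and user name are made up, not the commenter's actual setup):

```
# /etc/nixos/configuration.nix (fragment)
{ pkgs, ... }: {
  environment.systemPackages = with pkgs; [ git emacs firefox ];
  services.openssh.enable = true;
  users.users.alice.isNormalUser = true;
}
```

Applying it with `nixos-rebuild switch` builds the new system as a separate generation, so you can roll back if anything breaks.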

There are also things that I like about macOS: Apple's strong push for security (including sandboxing of applications, T2, etc.), fewer issues with drivers and random paper cuts, better support for hardware decoding throughout applications, traditionally strong 3rd-party applications (OmniGraffle, Little Snitch, LaunchBar/Alfred, Things, OmniFocus, etc.), integration through AirPlay, handover, et al.

[1] Admittedly, there is one exception: 1Password, we like using it for password sharing and arguably, you are paying for a cloud service.


Are you a developer?

I think the productivity aspect would only be true in that case, as the things that “just work” are (for me) external libraries, GitHub readmes, SDK examples, etc.


Yes, and I agree that is the case. I came from DevOps, so Linux still feels natural. Though I use it for everything now: developing, image/video editing, browsing. Literally, I feel like the only thing I can't do is iOS development.


As a developer and hobbyist photographer and maker, I'm between two worlds. Linux is perfect for development but nothing Adobe or Autodesk runs on it, which is extremely frustrating. I don't even have to reboot to game, but I do need to reboot to edit a photo.


> Linux is perfect for development but nothing Adobe or Autodesk runs on it, which is extremely frustrating.

True for Adobe, technically false for Autodesk if you aren’t in CAD. Maya and MotionBuilder (acquired from Alias in mid 2000s) run on Linux for the film/VFX industry.

Though to be fair, I’m pretty sure they are the only applications in AD’s entire portfolio that run on Linux, and it wasn’t because of them.


I am in CAD, alas. Fusion 360 barely runs on Windows.


Being a developer does not equate with UNIX.

Plenty of us are developers in other platforms.


Docker has been my biggest gripe lately but admittedly has gotten better.

Updates: macOS requires reboots and nags you constantly until you comply, whereas apt and dnf are simple and can be automated in the background.
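For example, on Debian/Ubuntu the stock unattended-upgrades package handles the background automation; the usual setup is a two-line config file (paths shown are the distro defaults):

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

With that in place, security updates install daily with no reboot nag (kernel updates still take effect on the next boot).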

Doing anything 'interesting' requires you to reboot and fiddle with the firmware. Where linux sudo works as expected.

Outdated software due to licensing issues - see GPL and bash. Not to mention you will be much closer to a production environment and will hit fewer bugs caused by OS differences while developing.

Lots more, but this is a good start. All of these things are small and can mostly be worked around, but they add up in a big way.

The road on Linux isn't completely rosy and can take more learning if you need to do anything really complicated. Tools are great but not necessarily pretty, etc.


This seems rather uncharitable.

Docker [was] rough on macOS through no fault of the OS. Particular software dev teams taking shortcuts is on them.

Linux also requires reboots for certain updates (plus you can disable checks for updates entirely and run one only when you want to on macOS, or even use `softwareupdate` for finer-grained control, so if it was "nagging you constantly" that's kind of on you).

`sudo` works "as expected" for everything that doesn't involve conflict with the built-in System Integrity Protection, which is most things (in the past three or so years on macOS I've only run up against it twice). There isn't any "fiddling with firmware" going on. Plus if you want total freedom to delete your entire `/bin` folder or something, again you can disable SIP and move on with your life.

I'm currently running Bash 5 on my MBP, so I don't get your outdated software complaint either. macOS doesn't have any magical power to force you to use the versions of (third-party!) software it ships with.

If these are the first complaints that come to mind it just sounds like you haven't used the OS much and entirely refused to explore it or give it a chance in the time that you did. I mean, not even trying to set your own update preferences?


> Doing anything 'interesting' requires you to reboot and fiddle with the firmware. Where linux sudo works as expected.

If you mean SIP, can't you just disable it for good? Then you should be in a situation similar to Linux.


There are a few low-hanging improvements (updates almost never require you to restart). But it's a lot of tiny things that add up, some of which are hard to put into words. If I had to pick one thing that encompassed a lot of them, it would be that when I use Mac I feel like I'm adjusting my work-style to what Apple thinks it should be. In other words, my Linux setup now feels like it's a professional tool, and the Mac I use for work feels like a consumer-grade OS that happens to have work tools bundled on it.

This is hard to put into words. If I'm an artist, my tools are very, very focused and robust. I might have specific pens and brushes that I know the feel of very well. They're not flashy and they don't have advertisements written on them, and they don't change their properties behind my back. Everything about them is designed to help me draw. If I'm a musician, I spend a lot of money to buy an instrument, and I get to know it very well. I have particular brands of reeds that are consistent that I'm likely preparing or sanding myself. I know my instrument so well that I can tell you which notes trend slightly flat or sharp, and after a while adjusting to that becomes instinctive.

So if I'm a professional programmer, I likewise want a computing environment that I understand completely and can service myself, and that is very customized to my own preferences. It's no different from any other professional field -- the point of the computer is to help me get work done, everything else is secondary.

You'll get different answers if you ask someone why Linux makes them productive, because the benefit of Linux is that it adapts to you. For me, personally, the biggest upgrades to my productivity have been:

1. Switching to Linux in general

2. Switching to Emacs/Spacemacs (Emacs works best on Linux)

3. Switching to Arch as my main distro (which is hard to do unless you already know Linux)

4. Most recently, switching to EXWM as a window manager (which is a lot easier to do on Arch)

Each step of this process has been me getting rid of things that distract me from work, and each step has built on the last. Switching to Linux gives me a setup that is much more customizable and stable, switching to Emacs gives me an editor that is very tightly integrated into the host operating system, switching to Arch allows me to have a very minimal setup (its easier to debug because there's less going on), and switching to EXWM allows me to focus the entire setup on work.

On the other hand, I have an old Surface Pro 3 that's running Manjaro/Gnome that I use for drawing. It's a very different setup from my main computer, because I use it for different things. Again, my computer should adapt to my workflow, not the other way around. The Surface setup is actually interesting, because it suffers from driver issues (unreliable Wifi, bad suspend support). And yet I'm still more productive on it than I was on Windows. I think people underestimate how much time and energy can get lost to distractions, surprise updates, stuff like that. Specialized devices are really stinking good for getting stuff done.

But everyone is different. I know people that get frustrated by the initial setup times or needing to dig more into the OS internals, and I get that -- it's reasonable. For me, once I got past that I found Linux to be really stable, because it doesn't change until you tell it to. Linux is the only OS I'll set up for someone who's not tech-savvy, because putting in more work up front means I won't need to do as much regular maintenance.


> 2. Switching to Emacs/Spacemacs (Emacs works best on Linux)

Just to echo this: Magit (Emacs git client) alone has given me a large boost in productivity.


I didn't want to go into details there, but Magit is wildly good. Line-by-line git-blame an entire file with 3 keystrokes, time travel back-and-forth over commits for a single file, quickly preview any file in the repo from any branch/commit using fuzzy-search. This kind of stuff really shines when you're working on a large company repo -- it meant if someone from another department called me up to talk about some obscure feature branch, I could open the relevant files without switching branches or stashing my current changes.

Aside from Magit, I also get a lot of use out of Org-mode (Emacs pure-text notetaking/todo-list client). I'm syncing to Android with Orgzly. Org-mode was the original feature that got me to try out Emacs, and for a while it was the biggest reason I stuck with it, since I'd never used Vim keybindings before. Vim keybindings made me less productive until I learned them, but were balanced by just how good Org-mode was.

I've even grown to appreciate packages like Calc (Emacs calculator). Dang if RPN style input isn't actually faster to use once you get used to it.


I use and appreciate all of these tools on MacOS, FWIW.


I do as well. If you're on Mac, you should still totally look into Emacs, it's great. Heck, if you're on Windows you should still at least think about Emacs.

Emacs overall works slightly better on Linux, because it's primarily optimized for that system. One big area where you'll notice that is if you start embedding X windows into buffers. EXWM is definitely not something I'd try to set up on Mac.

If you're not trying to do stuff like that, then Emacs on Mac is fine. I use a Mac at work and Emacs is a big productivity boost.


How is pen support for your Surface under Manjaro? Is it as good as it is on Windows (which is decent at best)? I have a Surface 3 sitting around and would love to repurpose it for drawing using Linux. I imagine battery life is probably terrible though.


Pen support is great, touch support is adequate, HiDPI support is bad. Most of this comes down to individual apps -- Linux devs just don't think about touch or responsive design, and the frameworks they use are buggy or need config options set. Occasionally in Krita I'll get issues where I need to hit the tablet's home button to jump out of the app and back in, to reset the touch "mode" that the app thinks I'm in. Kind of annoying.

I would say that it is not nearly as good as the touch support in Windows 8.1, but is comparable or potentially a little better than the touch support in Windows 10. Gnome's touch UI is good, but that's more just a testament to how much worse touch support got in Windows 10.

Krita is not amazing, but is still surprisingly good. When I first started using it, Krita was a massive pain and I missed Clip Studio all the time. It's gotten way, way better, and I now only rarely miss Clip Studio.

I'm honestly not sure what battery life is like. I will regularly use it for about 4-5 hours a day unplugged, but usually I'm at a desk and everything I own is plugged in. I still have Windows 10 on an old partition just to make it easier to calibrate the pen hardware (https://www.sony.com/electronics/support/downloads/W0009338), but I've never taken the time to compare the battery life for both.

It was kind of a pain to get everything set up, but that was years ago, and now that it is set up I just don't think about it any more. I'm very happy with its performance as a drawing tablet; at least for the type of illustration work I personally do. If you're comfortable with Linux, I'd say go for it. If not, you're probably better off with a Wacom tablet that won't force you to fight with Linux drivers.


Thanks. Yeah, I'm not super comfortable with Linux, but agree with your mindset. I want a professional tool that works for me without constantly demanding updates. I also dabble in Clip Studio, primarily for comic inking. I really like it. It was unfortunate to see it go to a subscription model for iOS.


There's a guy at the office who has to tinker and customise everything. He can't use anything that touts itself as opinionated because he has his own opinions.

Kind of reminds me of this article. You say you're more productive but honestly: how much time have you spent working on and customising your OS and is it a continuous project? Can you really say you're more productive than the people who open their lid and just work?


That’s a good question. Then again, it seems that for some people tinkering serves more as a hobby that they just like doing.

By tinkering and customizing you might gain a productive work environment but the way I see it is that some people just want to tinker because it makes them happy. The ’increased productivity’ seems to be more of a way to rationalize it to oneself.


Shortly after I graduated from college, I switched from Arch to Ubuntu and it was a bigger step in growing up than getting my actual diploma


I was initially very cynical about Arch, assuming that it was basically a form of role play for people who didn't actually need to get anything done with their machines but wanted to feel clever. And while that's 100% true, I did eventually realise that I could stop trying to install it on my laptop and put it on a tiny, meaningless server instead. Now I get all the hard-won lessons without having to think about X config again, something I hadn't done in 10 years.

Now, don't put Arch on your servers either because there's no real security story for Arch, but in that time I have learned tons about systemd, and you can't put a price on that.


And how often do you draw on skills you learned on Arch that you wouldn't have learned on Ubuntu?


Arch Linux teaches skills like:

- Knowing what happens if I have the audacity to update my computer without reading 3 different forum threads

- Understanding how to fix hilariously bad font rendering issues in a terminal, a software paradigm that's almost as old as computers themselves

- Tempering expectations that incredibly obscure apps like "Spotify" will "just work".

I think to imply Arch teaches much beyond the skills needed to deal with Arch Linux is a tenuous premise at best.


Contrary to the popular belief that Linux has no games, Arch includes many fun games such as "find out why the audio decided to stop working today for no reason"


I use Spotify for hours every day and I've had absolutely zero issues with it since I moved to Arch 3 or so years ago.

Agreed on the fonts, that has always been a pain. That being said, I recently installed Arch on a new laptop and the fonts weren't too bad out of the box - times change.

I update at least twice a week and I've only had one breakage in 3 years - I didn't update a config file with some new settings.

All in all, your response comes across a little FUDdy IMO.


My experience of Arch is that it's fun as a home computer tinkering exercise but I don't trust it for my work computer since I can't resist a version number bump and the amount of time spent faffing around would be a nightmare.


I think it can definitely differ between individual systems - the fact that Arch is so flexible also means that each system can have individual little foibles. Personally speaking, I've used Arch for work for the 3 years mentioned previously with no problems. At my previous job I used to be oncall for a week at a time and I wouldn't upgrade during those periods just in case, but that was the extent of my caution.


This is all way too close to home, but I legitimately think the time I spent on Arch made me more productive on Linux machines.


I have used a variety of GNU/Linux distros: Ubuntu, Debian, Arch, Slackware, Frugalware, Fedora and, more recently, openSUSE Tumbleweed.

Honestly, I don't understand what makes Arch so special.

I have mostly learned things on Ubuntu and I can use Arch just fine, and I think I have a decent understanding how things work.

Really, I think there is nothing fundamentally different across these distributions technically speaking.

Their biggest differences lie in their package and release management and policies (and, agreed, this is huge).

I'm not sure what skill you would learn on Arch and not on Ubuntu. It seems some vocal Arch users are lying to themselves and to the rest of the world about this.

Gentoo (or LFS) might probably be more educational, but I haven't used Gentoo enough, and haven't tried LFS enough to say.

(Kudos for Arch's documentation though, useful even when using another distro)


I'm not a zealot or anything, I just like Arch because of certain good aspects. Great documentation for everything I could want to do (though it used to be better). Rolling release. Upstream, latest packages. AUR. On that note, I tend to prefer Manjaro unstable nowadays because it works out of the box and still gets me the latest packages.

I don't particularly care for the whole DIY aspect. I spend enough time tinkering with Emacs and other tools that I don't want to waste any time on configuring/tweaking the OS itself. The different Manjaro versions are pretty decent (currently using the Awesome WM one), and I really like the Manjaro CLI installer (manjaro-architect) if I want a bit more control over my installation.


I went the opposite direction, Ubuntu to arch.

I think Arch Linux's advantage is that it's minimalistic; the philosophy is simple, and you know what is running on your computer and generally why.

Tinkering with arch itself isn't that time consuming because the package management is so incredibly good, along with the AUR and wiki. These are invaluable.

I really do think the difficulty is overhyped though, it's really not hard at all.


I have been using Arch (coming from Windows/Ubuntu at home; I run Ubuntu in a VM at work) since the beginning of the year on two personal laptops, an Asus and a Dell, and haven't had any issues.

On the Asus, Nvidia updates broke my window manager on Ubuntu; that didn't happen with Arch.

The Arch install is a little more complex, but you can do exactly what you want, which for me was: use systemd-boot instead of GRUB, and encrypt only the home partition.
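For reference, a systemd-boot setup like this is just a couple of plain-text files in the EFI partition. A minimal loader entry looks roughly like the following (the root device path is a placeholder, not the commenter's actual layout):

```
# /boot/loader/entries/arch.conf
title   Arch Linux
linux   /vmlinuz-linux
initrd  /initramfs-linux.img
options root=/dev/nvme0n1p2 rw
```

Compared to GRUB there is no generated config to regenerate; you edit the entry directly.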

Since the setup was done, I haven't had any issues or fiddling to do.

For me, where I lost time on Linux was when I tried to customize my desktop environment to my liking with i3/polybar/etc. Now I just run Gnome3 on Wayland; far from perfect, but TBH it is a good compromise between ease of setup/integrated UI components and features.


I'm using a very similar desktop environment (mainly i3 + terminal + vim). What I really like about this setup is that it's compatible with every project I work on. I rarely have to learn new tools, and honestly I used to spend more time getting used to new IDEs and the like than I now spend configuring my work environment. Configuring it is an ongoing project, but I commit changes to my dotfiles, so no time is lost.
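One common way to version dotfiles is a bare git repo plus a tiny wrapper function. This is a sketch of that pattern, not necessarily what the commenter uses; DOTS_HOME stands in for $HOME so the example is safe to run anywhere, and the identity/file names are placeholders.

```shell
# Bare repo holding dotfile history; work tree is the home directory itself.
DOTS_HOME="$(mktemp -d)"                       # stand-in for $HOME
mkdir -p "$DOTS_HOME/.config/i3"
touch "$DOTS_HOME/.vimrc" "$DOTS_HOME/.config/i3/config"

git init --bare "$DOTS_HOME/.dotfiles"
dots() { git --git-dir="$DOTS_HOME/.dotfiles" --work-tree="$DOTS_HOME" "$@"; }

dots config status.showUntrackedFiles no       # keep `dots status` readable
dots config user.email "you@example.com"       # placeholder identity
dots config user.name "you"
dots add "$DOTS_HOME/.vimrc" "$DOTS_HOME/.config/i3/config"
dots commit -m "Track vim and i3 config"
```

In day-to-day use you'd point the wrapper at your real $HOME (often as a shell alias) and push the bare repo to a remote to sync machines.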


I definitely believe in knowing the most about your tools - regardless of profession. I’m less sure on where this line sits for OS choice. There are or at least were professionals who really needed Mac OS to get their work done properly along with the Adobe suite. Giving them Linux and OSS alternatives would almost certainly hinder them too greatly. But being able to hop onto just about any box and quickly get whatever done is itself a demonstration of broad knowledge.


You absolutely should know your tools so you can trace problems and troubleshoot by yourself, which saves you the time of researching and finding expert opinion. This is the exact reason why I don't use Linux: I know my tools pretty well (macOS and bash), and I can troubleshoot my own problems.

But there are always going to be exceptions; new problems will arise that slow you down. That's another reason why I'm on a mainstream OS and not Linux: when I do encounter a problem that slows me down, devs work at breakneck speed to fix bugs and there is extensive documentation, because the community is much larger than the Linux community. I'd be spending waaay too much time in the weeds on Ubuntu, sifting through pages of SEO crap search results to find the one sage forum post from 10 years ago with now-deleted screenshots, just trying to run the software and workflows I know like the back of my hand on macOS. I don't want to start over and be a dumbass again; I've found my niche.


Check out the Arch Wiki. Mind you: I am not saying use Arch necessarily. But the wiki is fantastic documentation for Linux.

Sure, it’s one thing to compare knowledge that you already have. But does macOS even have a counterpart to the Arch Wiki for those starting out?


I loathe tinkering/driver setup. I was using a MacBook, but when it came time to replace it..

I bought a pre configured Linux computer (laptop). They’re not very common but common enough.

I haven't had to spend any time on setup, and frankly I've been pleased that everything just works. My main complaint is battery life: it isn't terrible but isn't great, though for what I do it's good enough.


> Can you really say you're more productive than the people who open their lid and just work?

I don't know if overall all the tinkering I've done over the years was a net positive in terms of efficient use of working hours.

But I can for sure tell you that if I've just opened my lid and worked, I would have been much less satisfied while working when I would constantly run into unnecessary limitations of my tools. Hard to put a quantifier on work satisfaction.

As a professional, using the right tools for your job should be part of your job. You wouldn't trust a workman hammering a nail into the wall with a screwdriver just because that's the only tool in his tool belt.

And this isn't even an exaggerated metaphor. For decades Windows was unusable OOTB for any serious development. OSX is still handicapped by a decades-old userspace (better than nothing, but not good).


You have to be smart about what you tinker with:

1. Do you know what the end state is, and do you know that you'll be significantly more productive? If so, then spend some real time on it.

2. Do you not know what the end state is, OR are you not sure what the best setup is? If so, then do the absolute minimum amount of work to make your changes functional and no more. Then use the incomplete setup for a while and see how it feels.

3. Are you not sure of the end state, AND are you not sure it will make you more productive? Then put it off and keep using your current setup, or at most try it out on a separate computer in your free time.

It's not unlike working on software architecture. You can get so focused on good architecture and clean code that you never get any real work done. Some code is fine to leave ugly. But not all code -- the art is knowing when code actually needs to be refactored, and figuring out how to refactor it in a way that doesn't lock you out of developing new features for a month.

Environment customization is the same way.


> Can you really say you're more productive than the people who open their lid and just work?

- I do open a laptop and just work. Using configuration that I’ve already created to suit my work.

- "Other people" is a big group, and one that doesn't matter so much here. I'm more productive with what I've opted to change than without it. Yes, the minor time investment was worth it.


It's a multiplicative effect. Spend a few hours setting it up just right for your own workflow, and it tightens up your productivity going forward.

It may not be by 30-40% as some people claim, but even at 1% that's around 20 hours in a year assuming a light schedule (40 hours a week 48 weeks a year).


Well, I think asking a Gentoo guy if the compile times are worth the performance boost is quite valid (I used Gentoo for several years).

But just because you are using Linux, doesn't mean you have to spend a lot of time customizing. Actually, customizing can be a lot quicker on Linux than on Windows for example (due to the well-integrated package managers).

That said, I do think that some customizations help with productivity, but you should know your limits. If you are trying something completely new, which nobody has done before, you are unlikely to find huge productivity boosts. But if you cautiously follow some best practices you might find some productivity treasures.


It's a good point, it can be true, if done well. You have to have an abstract/productive view on tinkering. Not just bike shedding.. otherwise yeah it's just constant costs.

ps: one talk that I find a pretty strong example of this is "The Unix Chainsaw" by Gary Bernhardt (of "Wat" JS fame). He shows how to use tools `against` themselves, treating their output as data, to help your work. It's not rocket science, but 1) it's something I rarely truly do, and 2) it's easy to fall back to using tools as silos instead of as `objects` collaborating.


If tinkering/customising my OS was still my hobby, I wouldn’t consider it wasted time.

Agreed, this doesn’t apply to everyone. Years ago, every 6 months I’d switch to Linux/hackintosh purely out of frustration (or envy) as OSX looked sooo much better than anything else (antialiasing!) AND supported Adobe CC.

Nowadays I’m a Mac user and I’m unlikely to switch, because I’ve grown a bit tired of my tweaking marathons and need a proper *nix system.

I wish Windows just became a GUI layer on top of Linux though.


This is something I sometimes also ponder. I think the answer is yes, as long as you keep your ultimate business goal in mind and optimize for workflow rather than looks. Yes, you do take longer to get going but once your setup is honed in you gain a little more time every minute you use your system.


I have a couple shell scripts that bootstrap any debian machine to work as my dev machine. Probably less time than starting from scratch on a Mac.
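
For anyone curious what such a bootstrap can look like, here's a minimal sketch. The package list and dotfiles layout are my own illustrative assumptions, not the parent's actual scripts:

```shell
#!/bin/sh
# Minimal Debian bootstrap sketch. Package names and the dotfiles
# layout are illustrative; adjust to taste.
set -eu

install_packages() {
    sudo apt-get update
    sudo apt-get install -y git curl tmux vim build-essential
}

# Symlink every dotfile from a repo checkout into $HOME.
link_dotfiles() {
    repo=$1
    for f in "$repo"/.[a-z]*; do
        [ -f "$f" ] && ln -sf "$f" "$HOME/$(basename "$f")"
    done
}

# Typical usage:
#   install_packages
#   link_dotfiles "$HOME/dotfiles"
```

Run it once on a fresh install and the machine is ready; re-running it is harmless since apt and `ln -sf` are idempotent.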


I see customisation as a form of optimisation. Optimising tasks that you perform hundreds of times a day will make you more productive. Optimising everything is a waste of time.


You can if the people who are just working do manual steps that I do in one step. I write TONS of bash functions that end up automating everything my peers do manually.
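
A trivial made-up example of the kind of function I mean (the names and commands are illustrative, not the parent's actual scripts): turning a ticket title into a git branch in one step instead of typing it out by hand.

```shell
# Hypothetical helper: turn a ticket title into a branch name,
# e.g. branchify "Fix Login Bug" -> fix-login-bug
branchify() {
    printf '%s' "$*" \
        | tr '[:upper:]' '[:lower:]' \
        | tr -cs 'a-z0-9' '-' \
        | sed 's/^-*//; s/-*$//'
}

# A second function chains it with git, so one command replaces
# the two or three manual steps peers would type out each time:
start() {
    git checkout -b "$(branchify "$@")"
}
```

Each function is tiny, but dozens of them compound into a noticeably faster day.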


Bash scripts are definitely needed. It's not optimal to atrophy over time and have to rely on those scripts, though. I have coworkers who want to turn seemingly any useful combination of POSIX commands into a script. I'm not sold on that being the best use of time.


It's a start, hopefully they will discover .bashrc, dotfiles and the power of CLI


> As a developer, I spend almost 80% of the time in my terminal

In which case then sure, pretty much anything will suit you fine. It's not a universal rule though - I spend the vast majority of my time in TextMate or an IDE and probably 10% of time in my terminal, so the comfort of the Mac UI is a more significant factor.


I think the problem is the vague use of the word "developer."

Some people develop programming languages. For them, a GUI-less environment may be ideal.

I do web development. So I need the GUI, not just for testing, but for dealing with image and video assets that come in from the art department, or creating mockups, or presentations, or maps, etc.

It's still "development," but unsuited for a 100% command line experience.

As is often the case, it's easy for someone in one field of "development" to forget that it's a broad category of experiences.


My preference for Linux over macOS is actually the UI. I like having a real package manager and such, but it’s not like I care much about kernel subsystem differences. I have used macs a lot and I just find their UI cumbersome.

And this is from someone who loves the iPhone.


To each his own. For me, I appreciate the way most applications interoperate and the consistency of the experience.

Lately, for me, the killer app has been the ability to cut and paste content between machines and devices automatically and seamlessly. Once I started using it in my workflow, it became indispensable.


If you're talking about universal clipboard, I believe that KDE has also supported that for a while via KDE Connect. Other DEs also support it.


Between machines, but not devices. I can't copy something on my boss' iPad and paste it into KDE.


I'm not really sure I see the distinction between machines and devices.

I can't copy something from my Android phone to a Mac, but I can copy something from my Android phone into KDE.


How do you do this on macOS/iOS?


For some strange reason being a developer seems to only mean doing UNIX GUI-less for some people.


It sounds like you do web design, at least in part.


No, we have an art department for that. But if a map needs to be replaced on the site or a new director headshot or making documentation or training people how to use the site I built for them, those are all better assisted or accomplished with the UI than from the command line.


>> As a developer, I spend almost 80% of the time in my terminal

>In which case then sure, pretty much anything will suit you fine.

I thought the same. I'm married to Photoshop. How good is graphics acceleration support inside a VM these days?

I need to run the latest version, fast and glitch free. Wine will never be a solution unfortunately, unless Adobe supports it officially.


Except the author uses i3.

If you enjoy the tiling window, with nearly everything driven by the keyboard way of working, then Mac OS can’t come close from a UI perspective.
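
For readers who haven't used it: the whole UI is driven by a plain-text config file, and this fragment gives the flavor (these are roughly i3's stock default bindings):

```
# ~/.config/i3/config -- fragment, close to i3's shipped defaults
set $mod Mod4
bindsym $mod+Return exec i3-sensible-terminal   # open a terminal
bindsym $mod+h split h                          # next window opens beside
bindsym $mod+v split v                          # next window opens below
bindsym $mod+f fullscreen toggle
bindsym $mod+1 workspace number 1               # jump straight to a workspace
bindsym $mod+Shift+1 move container to workspace number 1
```

Everything from layout to workspace switching is one keystroke, no mouse required.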


Really no OS can. Tiling window managers are Linux's (and the BSDs) killer exclusive. I've been using Linux full time for 3 years now and it's been a struggle at times. i3 is the sole reason why.


Oberon used a tiling window manager; I never found it that great an experience.

The other stuff that the OS featured was great and still missing on UNIX clones.


Try Spectacle on osx. I much prefer it over forced tiling WMs like i3. Been looking to replicate it on linux.


All the people I've seen who praise i3 and tiling managers seem to spend way more time finding the right window among dozens of tiles across half a dozen desktops than I do on a very minimal setup with Divvy on one desktop.


If you set up your workspaces for different modes, then this rarely becomes an issue.

While some spaces I use are more flexible for their use, certain spaces are designated for specific tasks. I use a space for communication/chatting, a space for my music/email, a space for my calendar/task planning, etc. These don't deviate so for these common tasks it becomes second nature.
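
In i3, for example, that pinning can be done declaratively in the config (the window classes below are assumptions; check yours with `xprop`):

```
# ~/.config/i3/config -- pin apps to dedicated workspaces
assign [class="Slack"] workspace number 9
assign [class="Thunderbird"] workspace number 8

# Some apps set their window class too late for assign to catch;
# for_window handles those instead.
for_window [class="Spotify"] move to workspace number 8
```

After that, each app always appears in its designated workspace, so muscle memory does the rest.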


Also, even if I spent 100% of my time in a terminal, iTerm offers a much better experience than Linux terminals (that's just my personal experience; I used various Linux distributions for ~6 years).


I use iTerm2 when I'm on macOS, but I basically use it like I use a Linux terminal. I just don't need that many features from the terminal itself... I do need Unicode with RTL support, colors, and a nice scrollback buffer.

For example, I don't use tabs in the terminal. I usually open a new window and use the WM from outside the terminal, or use tmux inside the terminal. Tabs in the terminal itself are not that useful in my workflow.

My current favorite Linux terminal is kitty (https://sw.kovidgoyal.net/kitty/), and before that I used rxvt-unicode.


Wait, you use iTerm2 on macOS, but you need RTL support? Is there a way of making that work? I’ve long wanted to switch to iTerm2, but its treatment of Arabic script is so janky, I just can’t do it. I’m still using the default Terminal app as a result.


There's a proposed patch in the bug tracker: https://gitlab.com/gnachman/iterm2/issues/1611#note_10740636...

Maybe you can give it a try.


Thanks, I’ll check that out!


I don't have a solution for you, sorry... I need RTL on Linux, on macos I just avoid it.


He's getting that experience through the window manager. i3 offers the features like tabs or horizontal/vertical splitting.


I believe you mean "she's" as the author of this article is a woman.


This. I love iTerm2 and hadn't found an acceptable replacement on Linux until I installed Terminator, which I would say is 70% there.

Other than that I use Linux (Mint) for my main Desktop OS, but I still deal with plenty of rough edges. The latest ones:

- Wifi doesn't work after waking from suspend.

- After upgrading the OS to the next version using the recommended UI method, the PC was unable to start graphics mode because it couldn't find some random UI package; I had to log in on a TTY and install it manually.

- CS:GO, a game with a native Linux port, suddenly developed choppy, laggy sound. The same game works fine in Windows on the same PC.

- Connecting Bluetooth headphones sometimes works, sometimes doesn't.

So yeah, plenty of rough edges. Still I use it because I love the programming workflow and use docker with Linux containers.


I recently switched from using a Mac professionally for 8 years to Ubuntu and used iTerm on the Mac, and I honestly have no idea what you're talking about. iTerm was good, but I miss nothing about it. The default terminal with Ubuntu 18 is just as good. Well, better, because it has Ubuntu underneath it all ;)


I had colleagues who returned their company-issued X1 Carbons and bought a Mac with out-of-pocket money, so they could use iTerm2. Some of its features are indispensable and not found anywhere in Linux, although YMMV of course.

I use iTerm to connect to Ubuntu and I feel like I have the best of both worlds, a great terminal emulator and Ubuntu beneath it.


What features? (Honestly asking)


- Auto-summon of the Password Manager on password prompts

- Grep for things that look like IP address and color them in blue, or MAC address in green, or errors in red

- Auto-Complete based on the text in terminal (this leaves people watching me `docker rmi f7<Cmd-;>` breathless)

- Broadcast same keypresses into several panes (having SSH sessions to several servers)

- Making an icon jump when a long running command just finished

- etc.etc.etc.


Funny thing, given all the emphasis the Linux community puts on doing things via the terminal: I cannot find a decent Linux application to manage SSH connections. My work laptop is a Windows 10 machine, where I installed the pro version of MobaXterm (used to access/manage about 50 Linux servers), and I fell in love with it. No Linux application that I know of comes close. Any suggestions?


.ssh/config is awesome, and together with a decent shell (one with tab completion for the ssh command, e.g. host completion based on your known_hosts file), I find it hard to understand why these applications even exist.
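
A small sketch of what that looks like (hosts, users, and addresses invented); after this, `ssh db1` just works, and tab completion picks the aliases up:

```
# ~/.ssh/config -- illustrative entries
Host web-*
    User deploy
    IdentityFile ~/.ssh/id_ed25519

Host web-prod
    HostName 203.0.113.10

Host jump
    HostName bastion.example.com
    User admin

# Reach an internal box through the bastion transparently
Host db1
    HostName 10.0.0.5
    ProxyJump jump
```

Fifty servers fit comfortably in one file, and wildcards like `web-*` keep the shared settings in one place.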


Why would you not just use the ssh command?


Guidelines | FAQ | Support | API | Security | Lists | Bookmarklet | Legal | Apply to YC | Contact

Search: