Apple Plans to Use Its Own Chips in Macs from 2020, Replacing Intel (bloomberg.com)
1302 points by uptown on April 2, 2018 | 1111 comments

This might just be a bargaining move on Apple’s part, but I don’t think so. I think that long term they are much better off controlling their entire hardware stack. I wouldn’t be surprised to even see them make their own display screens.

As an Apple customer, I like this idea also. For 20 years, I used to be a desktop Linux fanatic and later became a fan of Android. In the last few years, I have switched to using all Apple devices.

Even though I am a computer scientist, I actually spend more time on my iPad and iPhone, by far, than on my Mac laptops. I only use a laptop for software development (which, given the nature of my work, is often just SSHing into Linux servers); the rest of my workflow and entertainment is on iOS devices.

The older I get, the more I want to get my work done expediently, leaving time to study new technologies and spend time with family and friends. At least for right now, I am maximally effective in Apple’s environments.

As a bonus, I trust Apple more than Microsoft, Google, Intel, etc.

I am a software developer, and I detest working on my Mac laptop. At $lastjob I had a Linux desktop, and it was, I believe, the most productive environment I have ever developed in. At the job before that, I had a Windows desktop, and I preferred even that to the Mac.

You say you want to just "get things done expediently," but in my experience Apple software is flat-out inferior, and OS X is the worst of the three major operating systems I have to choose from.

Lastly, what does Apple have to gain by switching away from Intel? Not much, at least, not much that benefits me as a customer. Likely they are interested in making their laptops have more in common with their iOS devices, which does little to nothing for me. Apple's behavior towards OSX and macbooks in the past few years should be of great concern for anyone, especially if you actually like the devices.

2/3s of all PR activity on Github is on a Mac, for what it's worth. Certainly the platform punches above its weight relative to market share.


My last couple of jobs have given me no choice aside from an MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective. (I'm not sure I agree with this, but it is what it is.) Depending on how widespread this is, usage may not reflect actual user desire (though I would wager a large number of users either don't care what they're using, or do care but find an MBP adequate and would be fine on something else too).

At my second-to-last company (a web development shop), the original policy was that a new developer could choose either a Mac, Linux, or Windows machine. When I arrived it was 75% Mac and the rest used Linux; not one person had chosen Windows. After a while some new people were hired, and one of them wanted to use Windows. We found out immediately that this caused a lot of problems: he had serious trouble getting our projects to run under Windows, even though we thought it shouldn't be an issue because all the dev environments ran in VMs managed by Vagrant. After watching this guy fail to get productive, running from one issue into another, the CTO forced him to install Linux on his machine to end the horror show, and Windows was removed from new hires' choices, at least for developers. The company I went to after that and my current one are 100% Mac environments, for the reasons of homogeneity you mentioned.

We run a Mac and Linux setup here - me being the admin, this is tolerable. The Mac's still-essentially-Unix underlying OS makes this reasonably consistent - I can SSH between either platform fairly easily, for example. We offer new starters the option, and are about 50/50, though I've had a few surprises - one user had only previously used Windows, and in my experience these sorts of users switch to Mac more easily than Ubuntu, but he gave his MBP back after a day and switched to Ubuntu instead, which he's stuck with. I could not be more proud :D

It's very nice to give end users the option - when I joined, I asked for Linux and was given a brand-new Dell XPS with Ubuntu pre-installed, root access, and told, customise it to suit you. I keep that spirit with my users.

Throwing Windows into such a setup is a nightmare though - two users actually installed their company laptops with their own personal Windows 10 licenses without my knowledge or approval - I still hold a grudge because I had to actually read the MS EULA to make sure they weren't about to cause trouble! It means I can't manage them (I have no real tools to do so on Ubuntu), so they're on their own for that. Fortunately management has my back, declaring the company to be a *nix shop.

Our actual product is all containerised so it should run anywhere.

> I have no real tools to do so on Ubuntu

Ansible works pretty well for managing Windows machines but it requires a little bit more up-front setup on the Windows side and as always YMMV depending on what you're trying to do.

I would have been pretty upset with that. Unless you're writing machine-native code, development environments should work on anything, especially if they run on VMs deployed by Vagrant.

Well the thing with Vagrant is that it (in theory) runs your application code inside the VM, but you typically run all your development tooling in the host OS.

I suspect that it was on the local tooling side that things fell apart for this Windows developer, if he was grabbing a project mostly built on Mac or Linux.

A lot of popular dev tooling built around web technologies (node, Ruby, PHP, etc) isn't as mature on Windows as it is on MacOS or Linux.

My source on this is me. I have a Mac and a Windows box at home, and I've had projects fail to set up and build on the Windows machine for these reasons.
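The host/guest split described above can be sketched with a minimal Vagrantfile (the box name and paths here are illustrative assumptions, not from the thread): the application code executes inside the Linux guest, while the editor, git, and other tooling on the host work against the same files through a synced folder.

```ruby
# Vagrantfile (sketch): app code runs in the guest VM, but the project
# directory lives on the host, so host-side tooling (editor, git,
# linters) operates on the same files via the synced folder.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"          # assumed box for this sketch
  config.vm.synced_folder ".", "/vagrant"    # host project dir shared into guest
  config.vm.provision "shell", inline: "apt-get update -y"
end
```

This is exactly where Windows hosts tend to hurt: the guest is identical everywhere, but anything run on the host side hits the platform differences.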

I’ve had problems even getting anaconda (scientific python distro) to work in a VM on windows (because the file system of the host OS lacks certain UNIXy features).
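The "UNIXy features" in question are typically symlink (and hardlink) support, which VM shared-folder filesystems often lack and which conda environments lean on heavily. A small probe, as a sketch:

```python
import os
import tempfile

def supports_symlinks(directory):
    """Return True if the filesystem under `directory` allows creating symlinks."""
    target = os.path.join(directory, "symlink_probe_target")
    link = os.path.join(directory, "symlink_probe_link")
    try:
        with open(target, "w") as f:
            f.write("x")
        os.symlink(target, link)
        return True
    except OSError:
        # Shared-folder filesystems (e.g. VirtualBox's vboxsf) commonly
        # reject symlink creation, which breaks conda/anaconda installs.
        return False
    finally:
        for path in (link, target):
            try:
                os.remove(path)
            except OSError:
                pass

print(supports_symlinks(tempfile.mkdtemp()))
```

On a native Linux or macOS filesystem this reports True; run against a VM shared folder it will often report False, which is the failure mode described above.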

In theory there is no difference between theory and practice. If no one is testing the build on platform X, no one should be surprised if it doesn't "just work" on platform X.

You would think that, right? Except then you actually want to run a simple Python script on Windows and observe things like: os.rename not being able to replace opened files, the default encoding not being UTF-8 (wtf), not being able to just `pip install scipy` because of some weird BLAS/MKL dependency...
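For the first two of those there are portable workarounds worth knowing: `os.replace` (Python 3.3+) is the rename that overwrites an existing destination on every platform, and passing `encoding=` explicitly sidesteps the platform-default encoding (the ANSI code page, e.g. cp1252, on Windows of that era). A sketch:

```python
import os
import tempfile

# Work in a throwaway directory.
d = tempfile.mkdtemp()
src = os.path.join(d, "new.txt")
dst = os.path.join(d, "old.txt")

# Always pass encoding= explicitly: on Windows the default is the ANSI
# code page rather than UTF-8, so relying on the default garbles or
# rejects non-ASCII text.
with open(dst, "w", encoding="utf-8") as f:
    f.write("old contents: naïve\n")
with open(src, "w", encoding="utf-8") as f:
    f.write("new contents: naïve\n")

# os.rename() raises FileExistsError on Windows when dst exists;
# os.replace() overwrites the destination on every platform.
os.replace(src, dst)

with open(dst, encoding="utf-8") as f:
    print(f.read().strip())
```

(The opened-file case is harder: Windows locks open files, so no rename variant fully escapes that one.)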

I guess from a business point of view it makes sense, but I really get frustrated when I find an open source project that's completely unbuildable on Windows.

Maybe the developer in question would have been better using a manually created VM or WSL?

Now, take this with a grain of salt, as I'm no open source maintainer, and I only rarely use Windows, and even then only for development. Up until very recently, to get real developer tools that are supported by Microsoft, one had to buy Visual Studio. VS2017 has a Community edition that's usable for open source projects.

Not to mention, the Windows environment is completely different from Linux and OS X. Until x64, they had a different calling convention. On OS X and Linux, also until recently, I could use the same compiler for both platforms and still be supported by the platform vendor.

All of these non-trivial differences make it a lot more resource-intensive to support a codebase on Windows that already runs on Linux and OS X. Aside from my work VMs, I don't even have a Windows device that I could use for development. So to me, it's no surprise that most open source projects don't build on Windows.

Community Edition is really nice nowadays, usable for day-to-day professional usage as well. If ReSharper works - it works for me.

Student and hobby editions of Visual Studio have existed for decades; before they introduced Community, there were the Express editions.

Also, MinGW and Cygwin have existed since the late '90s.

OS X tools are, in a sense, included in the price of Apple's hardware.

In many countries, Apple hardware costs about the same as an average PC plus a VS license.

Whilst I agree with you on all but MinGW and Cygwin, those tools are wildly different. Integrating Microsoft's C/C++ toolchain into an existing makefile would be hell. And again, most people who have a Mac didn't have to buy the compiler separately. Most people don't buy extra software after the fact just to compile open source libraries. I can't imagine a reason to buy Visual Studio for personal use, and I can't imagine why I'd dick around with Visual Studio in my free time.

As for MinGW and Cygwin, most regular people don't have them installed and configured. And Cygwin and MinGW are not _platforms_ which receive first-party support, as unfortunate as that might be.

I once had to use cygwin to port an old C library from UNIX (I don't recall the flavor of UNIX) to windows. It involved editing makefiles to use the MSVC toolchain, amongst many other things.

It was pure pain. Especially when the boss kept asking me what was taking so long (it's just a recompile right?).

People that have a Mac already paid for the compiler.

500 € PC + Visual Studio vs 1000 € Mac.

I don't believe that the majority of people who buy 500€ PCs would spend money on Visual Studio. In the Mac case, nobody pays for the compiler because Clang is open source - this is a massive oversimplification, but whilst the fact that Clang and Make are distributed in the standard OSX base install is a value add in my book, I don't believe that it's something that people pay for. Also, the people who buy Apple hardware are a completely different set of customers to those who buy 500€ PCs. However, this is all a bit moot as you and many others have pointed out that the community edition is out there and it is usable.

Windows is a bad product made by a company that has historically been a bad and unethical actor, one that attempted to limit user freedom and destroy freedom of choice by illegally destroying competitors. Recently we are to believe that they have found Jesus and ethics via quiet contemplation and peaceful regime change.

All projects are ultimately created to scratch somebody's itch. If it doesn't work on Windows out of the box, that isn't their use case. What you are wondering, in effect, is why people don't pay money for a Windows license - which would ultimately fund a bad and evil company - in order to enable the project's software to run on an inferior OS that the dev doesn't run or care about. If it's not end-user software for desktop users, it doesn't even have the positive effect of letting a substantially bigger group of potential users benefit from the software. For anything server-related, they are either already running Linux or can as easily run a Linux VM as a Windows one.

Further, the users who would benefit will by and large buy a license if the software is non-free, but probably won't contribute anything but complaints, phrased similarly to the support requests they would make for paid products that had failed to perform adequately.

A Windows developer just needs to create a build.

What was the problem? Just install an X Window server (the new Bash environment should support it) or just turn on the Telnet client.

The new Bash env is a bit of a letdown for development currently, at least for me. I love this move by MS though, and am willing to give it another go in the future once it's more complete.

> My last couple of jobs have given me no choice aside from a MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective.

Not that many years ago that was the same reason IT departments would give for only giving people Windows machines.

Apple shot themselves in the foot for this market with the Touch Bar. It adds such a small amount of utility at the cost of annoying every Vim user.

15-year vi and Vim user here. This is a stupid complaint: anyone seriously using Vim remapped Escape a long time ago, or learned to use Ctrl+[.

I've been using vim seriously for years, and I've had escape on capslock for many of them. However, not all users know how to do that, or haven't thought of the amazing benefits that can be obtained. There are people who are only picking up vim today. Give them a chance.

To those who haven't: Give it a go. Put Escape on Caps lock. If you've got Control there and you're running Linux or Windows with a standard PC keyboard, you probably actually want to swap Control and Alt. If you haven't, you probably want to swap Control and Alt anyway. (This will give you bindings similar to the Mac's Command key. The thumb is a much stronger finger and far more suited to these combinations than the weak pinky.)
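On Linux with X11, both remaps suggested above can be done with stock xkb options (these option names ship with the standard xkeyboard-config; no third-party tools needed):

```shell
# Make Caps Lock send Escape, and swap Left Ctrl with Left Alt.
setxkbmap -option caps:escape -option ctrl:swap_lalt_lctl
```

Run it in a terminal to try it out; add it to your session startup (e.g. your desktop environment's autostart) to make it stick.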

I got lucky on that one! I have been remapping caps lock to escape for years now on every machine I use. It's great, until I need to use literally anybody else's machine.

I have the same experience with the Dvorak layout, which is what I used when I learned to type, at age... um... nine? People crack up when I try to type on anyone else's machine, because my WPM drops by a factor of ten.

This is part of the reason I basically refuse to customize anything on a new linux machine; my needs are already weird enough that I'd rather just adapt myself to the defaults for everything else.

Not to mention every IntelliJ user, and every touch typist.

Vim users commonly map escape to caps-lock.

Caps to Ctrl

Ctrl to Escape

Escape to Caps

I mostly use Ctrl-[ for Escape. I have a lot of weird Ctrl bindings. Not an Emacs level of Ctrl bindings, but enough that I use this particular rearrangement.

I don't, because my agency uses a very locked down Windows env and I can't even remap keys (shell?! You jest....)

Then you don’t have the touchbar anyway.

No, my main work environment makes the remapping inaccessible, so I have declined to buy a touchbar. A consultant offered me one free and I still declined, requesting an original rMBP instead.

> map! jk <Esc>

This remaps the sequence "jk" to Escape; you don't even have to take your hand off the home row to exit insert mode.

> It adds such a small amount of utility for annoying every VIM user.

You have to love Emacs and Vim users.

Quite sure Apple's world doesn't revolve around the needs of such a small user base.

FWIW I'm an Emacs user on a MBP with Touch Bar.

No issues.

> My last couple of jobs have given me no choice aside from a MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective.

This is a huge improvement over when I started developing and was forced to use Windows.

The issue of managing multiple kinds of machines is a real one, speaking from the other end. For uniquely talented software engineers I might let them use what they want, but from the perspective of managing an IT department, you definitely want anyone who has no reason to be otherwise on the same exact hardware and software configuration.

Yeah, at my last place, the deal was: first we had to find 5 - 10 devs who wanted to switch to Linux (there were 50 - 100 in total). Then we'd get a PXE boot image to install Linux and, well, DHCP. We got to try it for a couple of months, while producing documentation on how Linux works with the infrastructure.

That feels like a fair solution.

I don't trust that for two reasons:

1) There is a lot of cargo-culting around the MacBook because hurr durr Windows is terribad (though, to be fair, Docker does work better natively with OS X than Windows, even under WSL), and

2) Most companies give their developers the choice of a MacBook or the shittiest Windows machines known to man (because of irresistible volume deals from Windows OEMs). I bet the spread would be more even if companies were willing to offer the Surface Book as an option. That laptop is nice.

While I do prefer MacOS over Windows, I believe that you have a point in regards to the computers companies (and people in general) run Windows on.

Most of the issues people have with Windows can easily be explained by low-quality hardware. If you need a decent laptop, you're looking at price tags above $1000. Depending on where you live, I wouldn't buy a Windows laptop below $1400 if you expect to be happy with it.

Honestly Microsoft should take steps to prevent the sale of Windows 10 on laptops without an SSD and at least 4GB of RAM. That would help the Windows brand tremendously.

What I do every few years: buy a refurb/clearance business-class laptop (Latitude, Precision, et al.) from the Dell Outlet. The specs may be merely OK, but the build quality is pretty good. And you'll spend less than $1000 :)

Do you think that they want to cede the entire low end market to chromebooks/linux?

The satisfaction rate among Surface owners is abysmal. At this point, I'd call the Surface brand a failed experiment.

I was going to say: most people I know brought them back, for many reasons. One of my Windows-fan friends had a Surface Book 2 and brought it back for repair a few times because of erratic battery life; he ended up buying a Lenovo T470 (I think) instead, as they kept insisting it was his fault rather than the glitchy laptop. No issues with the Lenovo, of course.

Do you have stats for that? Maybe I'm an outlier, but I loved the Surface Pro so much, I bought a Surface Book for work and a Surface Book 2 for personal use.

The first year was really bad. Plain and simple. Now it is a decent device, but I'd take a Lenovo today if I had to make the choice. However, for note taking, the Surface Pro is second to none. The form factor, pen, and OneNote is a really good combo.

Until the screen breaks and you can’t turn it on...

If that kind of failure recovery is important, then no tightly integrated device is a good choice.

Had two surface pros fail just out of warranty, if that’s the price of a tightly integrated device it’s too high for me. No more surface devices here.

That is indeed poor reliability. The SP4 I used, and others I know people have had since launch, are still working fine.

We had the choice between MBPs and surface books (the good ones) and still only management opted for windows.

When I asked for a newer MBP, I got bitched out by the purchasing director. He made the point that I was one of only a handful of people in a 500-person IS department allowed to have a Mac.

Many companies don’t allow mac purchases, even for developers.

A company that balks at spending 2k to enable an asset who costs 50k-250k annually to perform maximally isn't very wise. We are talking about spending 1-4% more.

Docker has had a native Windows client for like a year now. Just FYI.

Yes, it does, if you're on Windows 10. And volume mounting from within WSL doesn't work with POSIX paths (at least it didn't when I last used it, over six months ago) regardless of the container engine type you select (Hyper-V or Windows Native Containers, which has its own quirks).

Docker for Windows doesn't exist for Windows 7 (still used in a number of enterprise environments). You're stuck with Docker-Machine in that case.

My only complaint is that if the Mac were truly a developer-centric device, I wouldn't need Homebrew to install software. It should just be in the App Store for me to download.

I think that'd be a disservice to both developers and non-developers alike. It'd clutter the app store and force developers to use the app store and whatever approval process they decide.

I think it would be cumbersome to say the least...

  * Would you have to install through the app store gui?

  * How would it manage dependency chains and conflicts (system ruby vs local ruby)?

  * How about explicit paths or build parameters?

  * How would it handle different shells?

I know other operating systems do this but I've never liked it... I guess I just don't mind installing and managing command line/developer tools from a terminal window ¯\_(ツ)_/¯

And, with all that said, I have quite a bit of negative sentiment towards Apple lately for their hardware and platform choices. So I would still say it's not a developer-centric platform (the Mac)...

> Would you have to install through the app store gui?

Possibly, or not. Ubuntu gives me both a GUI and a command-line option for apt. Thinking about being able to type "app-store install openssh" gives me goosebumps.

> How would it manage dependency chains and conflicts (system ruby vs local ruby)?

First, that example is a problem for Ruby, not for the package manager. Second, software can be in the store statically compiled. Third, apt handles dependency management really well. Perhaps that's something Apple can learn from?

> How about explicit paths or build parameters?

This is a solved problem in ubuntu.

> How would it handle different shells?

Again, this is a solved problem.

> So I would still say, it's not a developer-centric platform (the Mac)

The question I have is if we deploy on linux, why are we using a mac to develop software on?

As someone who lives on Linux and has to touch OS X for things like building for iOS devices, I find it really odd that OS X doesn't just solve these problems like all the FOSS distros do. It's insanely odd that Apple gives me a bash shell which is years out of date, and it feels hacky to just use what I want. I know proprietary and free software kind of have a hard time co-existing on operating systems built on free or mostly free software, but if OS X is half free software and it's POSIX compliant, it's hard to believe that Apple couldn't give you both proprietary software and an easy package system for developers.

The sad thing about Linux is that as much as I love it and its ecosystem, I can't recommend it to anyone who wants things to "just work":

- X and Wayland crash on me all the time on this laptop because of its HiDPI screen and my kinda-works-but-is-wonky fixes for multiple monitors.

- Hardware support is the best it's ever been, but graphics cards, wifi, exotic devices, laptop power states, and embedded devices can still be a pain because manufacturers simply don't care.

- Desktop applications can still be a little glitchy. Web browsers work fine, as do first-party DE apps, but the further you get from the big-name GUI toolkits into custom controls and behaviour, the more problems you seem to run into.

But aside from all that, I'm happy here in Ubuntu. When your software library feels as easy as picking a book from a shelf, and 90% of the system updates via the update manager and says "hey, restart when you feel like it," I'm quite comfortable.

GPLv3 appears to be the reason for shipping that ancient version of bash.

If you look, Apple ships a modern version of zsh with their OS, I believe because it isn't affected by the same potential licensing issues.

But why? That doesn't seem like a rational choice by Apple.

Apple wants to use open source code, but doesn't want to make their proprietary code open or to license patents to anyone who wants to build off of their code.

Ultimately, Apple will sell developer devices at the price point of their Mac Pro devices that come with all the approved tech you are allowed to use to build end-user services/apps. In addition to the high purchase price, you will have to sign up for a developer account and pay annually.

Everyone else will buy consumer-oriented devices that are as open as the iPad.

It's not a bicycle for the mind, it's a train, and if you dress appropriately and pay your fee you may set up a concession stand on the route.

* Would you have to install through the app store gui?

With various Linux package management systems you have multiple interfaces to the same system; you can at one moment use the GUI to install foo, and having closed that, fire up your favorite terminal and install bar.

* It'd clutter the app store and force developers to use the app store and whatever approval process they decide.

It also has a concept that neither Microsoft, Apple, nor Google has opted to pick up on, because they desire control of their platform and want to extract a substantial tax on all software sold on it via that control: sources, also called repositories.

Linux package management systems draw not from a single centrally managed source but from a user-editable list of sources. Each source is free to run with its own set of requirements. There is nothing requiring a hypothetical dev-tools source to be any more restrictive than whoever maintains the source used by Homebrew.

Further packages can actually contain sources. It would be entirely trivial to package up a source of dev tools as a package and allow people to install that via the front end of their choice.
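Concretely, "a package that contains a source" is how many third-party repos already distribute themselves on Debian/Ubuntu: the package just drops a file into /etc/apt/sources.list.d/. A sketch of the idea by hand (the repo URL and package name here are hypothetical placeholders):

```shell
# Add a hypothetical dev-tools source, refresh the package index,
# then install from it with the usual front end.
echo "deb https://devtools.example.com/apt stable main" | \
    sudo tee /etc/apt/sources.list.d/devtools.list
sudo apt-get update
sudo apt-get install some-dev-tool   # hypothetical package name
```

A GUI front end pointed at the same sources list would see the same packages, which is the multiple-interfaces point made earlier.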

* How would it handle different shells?

Different shells are probably the simplest part: the arguments to a hypothetical install command would be simple text strings. If a package name includes characters that the shell considers special, they would have to be escaped or wrapped in quotes, like with any other combination of shell and CLI tool. Generally, most package names just don't include characters like ([])$\~`'" and most commands don't require any particular attention to shell escaping.

* How would it manage dependency chains and conflicts (system ruby vs local ruby)?

* How about explicit paths or build parameters?

These are implementation details that don't go away by exiling developer tooling to officially unsupported channels. For example, language-specific package managers will probably remain a thing, but it would be vastly easier if you could do officialmacpackage install developerrepo, then officialmacpackage install cargo|node|whateverfloatsyourboat, then use that to deal with whatever.

All your concerns basically come down to a lack of experience with more reasonable systems; there are zero good reasons not to do this. There are literally no downsides other than the work required for an official solution. However, it seems vastly unlikely that we will ever see such a thing.

The logical endpoint of Apple's vision seems to be two classes of device: one with a very high introductory price and annual maintenance that allows you to create and run whatever you like, so long as such software is distributed via blessed channels and you tithe the required 30% to Apple; and more reasonably priced but still expensive devices that only allow you to consume software.

The former will come with Xcode and technology to deal with Apple-approved languages. You will be able to play with other tech on your local devices, but it won't be able to be distributed to end users.

The latter will be as open as your iPad is now.

It's not a developer-centric OS and it's not supposed to be. But that doesn't change the fact that a lot of developers prefer it over other OSs. I prefer it because it has been more reliable for me than Windows. It's not right for everyone, though, and I completely understand anyone who prefers Windows or Linux.

I could see an argument that Apple should make a tool like Homebrew themselves or officially support Homebrew development (maybe they do, I haven't looked), but including it in the App Store would just be confusing. Plus, I can type 'brew install <package>' far faster than I could type <package> into the search bar and make the necessary clicks to install.

That same argument extends to the users of Homebrew who are overflowing with cash from startup get-rich moments: please consider funding (if you have $1/mo or more to spare) the Homebrew patreon, which currently lists a total of 285 paying users out of the million developers that depend on it to earn a paycheck every month.


(I do not participate in Homebrew other than as a user.)

I've been using homebrew for years and have never heard of this. Maybe the reason there's only 285 patrons is that the patreon page has not been promoted in any way that I can remember?

Can’t say I blame them for not promoting it through the only way any current user would see it: inside the brew command line tool. I can see the angry commenters now:

“How dare they inject advertising into a critical command line tool that I use to get paid money!”

“It’s inexcusable to promote your own financial success when someone types brew upgrade”

“No doubt they’re blowing the money on burrito delivery in SOMA”

“I’ve been a $1 Patreon since they launched three months ago and they haven’t implemented my favorite wishlist feature!”

“If they can’t do it for free, they should give up and shut down and let someone fork it.”

> ... or official support Homebrew development ...

From memory of speaking with an ex-senior-Apple-dev person: Apple doesn't support any 3rd-party open source projects or communities, e.g. Homebrew, or even the software they bundle in their OS.

Apple will never distribute GPLv3 software; this is the reason Bash is ancient on the Mac and why Samba is gone.

Any license with a patent clause or Tivoization clause will be verboten. As an example, you will never get something that depends on Postgres in the Apple store.

Postgres is BSD licensed. Apple even uses it in its Server app

Apple supports and hosts MacPorts


macOS Forge shut down in 2016. All those links are to the new homes of the projects.

Macs are devices centered on macOS and iOS developers, and those developers hardly need Homebrew.

Other types of developers were a nice-to-have as far as sales go, but are no longer relevant.

Probably related to the fact that the top language on GitHub is JavaScript.

If you do anything other than UI scripts, I would agree with the above and say you are better off with a Linux box.

I think you're probably right. I feel like the one thing Mac is really good for is writing the porcelain. node.js came out of the porcelain side, so it seems to fit really well there too.

When you get into plumbing, I feel like there are better alternatives.

Yes that, plus it could be that MacBook/Pros are popular with students: many of whom create lots of tiny Hello World etc repos.

That's because most GitHub activity is by frontend web devs. What about the rest of the world?

Those stats also don’t include GitHub Enterprise..

Do you have a non-video source on this? 2/3s seems really high.

That really is "2/3s of all PR activity on GitHub is Unix-based". People don't use Macs to develop on because of OS X itself, but because it is Unix-based. It sounds like they are going iOS-based, which makes a Mac much less interesting to develop on for anything but iOS devices.

That does not mean anything. People can use Macs for development and still be miserable. I am practically the only dev in my company using a non-Mac laptop; the amount of envy in people's eyes is astonishing.

Couldn't agree more.

Finder is garbage compared to Windows Explorer. OS hotkey navigation is much better on Windows and GNOME/KDE/Fluxbox/xmonad. (I actually miss using xmonad; it was soooo good at this.) Office applications (still very real in a lot of environments) are mostly awful on the Mac. iTerm2 is fine, but I prefer PuTTY. Starting applications by their binary name with WIN + R is better (for me) than trying to find them within Spotlight. There are other little areas where I prefer Windows over OS X, but I'm not remembering them right now.

I also hate the new MacBook keyboards. They take "getting used to," i.e. typing "lighter" than I'm accustomed to. Also, not having USB-A ports and requiring a 90W USB-C to USB-C power brick (i.e. not being able to charge the thing from common chargers, which for me is the most attractive reason for using USB-C) is garbage.

The one thing I like about OS X is that it's a BSD with GNU utilities built in. The thing that I don't like is that it's not a Linux and many of the dev environments I've worked in run on Linux, especially now in the age of Docker. While that's good news in that it forces me to do almost everything within Docker, it's bad whenever I need to maintain parity for whatever reason.

Odd, I find hotkeys much better on Mac. They follow a more logical, predictable pattern and are much more standardized. E.g. what is the standard Windows hotkey for getting info/properties of an object? Never found one.

Also if an app doesn’t have configuration of hotkeys you are screwed. On Mac hotkey config is OS wide.

The command console experience on windows is horrible.

Office apps are better on Mac. Pages, Numbers and Keynote offer me a totally superior experience. The ribbon interface on Windows is a disaster. You spend too much time hunting around among a zillion incomprehensible icons.

In fact I do GUI design at work and use MS Office as an example of how to NOT design a UI.

Not sure why you complain about Finder. You got plenty of alternatives on Mac. I use Terminal and Finder a lot in combination and they interact with each other a lot better than on Windows. E.g. does windows have an “open” command yet?

"E.g. what is the standard windows hotkey for getting info/properties of an object. Never found one."

Alt + Enter (since Windows 95, at least)? You can find it in here: https://support.microsoft.com/en-us/help/12445/windows-keybo...

The Mac environment takes some getting used to, it took me years to learn all the little details, like if you select a bunch of folders and double-click, it will open all of them. To close them all, option-click the close button.

I do find Finder to be inferior to Explorer in almost every way. That tree pane on the left is just too useful to give up.

I'm far more of a Mac user, but I use Windows, too, and I agree that the Finder is quite inferior to Win Explorer. The simplest, most obvious things, such as having 2 side-by-side trees for file organization, or right-click on anything and create a new folder there--things that are easy and obvious--are NEVER going to come to Finder, which occasionally adds trivial eye candy such as "album flow" views or colored "tags", but basic workaday functionality, no.

Yes, you can buy a 3rd party replacement with the hassle of deploying and maintaining on multiple machines, if you have them, but my main complaint is what decades of not caring about the most fundamental of all Mac apps (the only one you can't quit) implies about Apple's strategic attitude toward the Mac overall.

For those of us who find the Mac the best pro dev platform, the implications are not good, because what we tend to like about it (desktop unix where all the client stuff just works) is just a historical accident that Apple would not create again and does not intend to maintain longer than necessary.

Rather than make the Finder more powerful to make the Mac better for serious users, Jobs took the approach that people who couldn't understand the difference between a file and a folder were the real market for Apple, solving the problem by creating iOS with no user view of the file system at all.

This turn from computer company to fashion accessory for those who don't care about computers per se was so successful as a business strategy that I can hardly criticize it. It just bodes ill for those of us who like the Mac for features its only supplier wants to be rid of.

Each time Apple has an event where Cook pointedly emphasizes that the iPad is "what we at Apple see as the future of computing", I google for an update on the current state of "best desktop Linux distro".

On OSX you can create a new folder with Command+Option+N. As for the side-by-side trees, I don't understand what you mean, I usually have 2 Finder windows open, then drag and drop as needed.

When I transitioned from Windows to Mac, 13 years ago, I needed some adaptation time, especially with Finder, but in the end I found it powerful. The main issue coming from Windows Explorer is that basic Explorer workflows such as copypasting don't have a Finder equivalent.

As I said, Windows lets you put a new folder (as well as various types of new documents) in any part of the tree that you right-click, while Mac's cmd-opt-n is limited to the root of the folder you're looking at. And having to separately position two windows that have/lose focus independently and can have one come to the front without the other is ridiculous compared to having both trees side by side in the same window like Win Explorer and every 3rd-party Finder replacement on the Mac.

Having been a Finder user since my 128K Mac in spring, 1984, I've had adequate time to get used to an application that hasn't improved in even the most obvious ways for going on two decades. It's fossilware.

Must be something about a particular way you are working. I have never experienced this as a problem.

Why would losing focus on one window be a problem?

Why would you need to create all these directories in multiple parts of the tree?

I can't quite comprehend what sort of workflow you have which requires these things.

Perhaps I don't see it because I use the Finder and Terminal a lot together. I don't use one exclusively for very long periods.

But I am curious what your workflow is like, because I see a lot of Finder hate, but don't really get what people's problems are, and how they are using Finder in a way that causes them so many problems.

Sounds more like you are not familiar enough with using a Mac. Putting two finder windows next to each other is easy, and so is quickly creating a directory.

Finder also supports tree view you know. OTOH does windows have as good filtering tools, smart folders, labels etc?

Option-Click to close is a novelty to me and a very handy one at that... the number of times I’ve accidentally shot myself in the foot by selecting and accidentally opening maybe hundreds of folders is not something I’ve kept count of, but it’s definitely happened multiple times. I’ve been using OS X (now macOS) intensively since 2002 and I’ve come to think of myself as a “Power User”, but I’ve never known of the option-close trick. Thanks.

On your last point: START "title" [/D path] [options] "command" [parameters]

It's been there since forever (Win95?).

start (without parameters) - open a new CMD window

start . - open an Explorer window of $PWD

start <DPATH> - open an explorer window of DPATH dir

What? As much as I'd like for the Excel dominance to end, I can't see how anyone would ever work quicker in Numbers over Excel for any work that takes more than an hour a month. The ribbon interface offers shortcuts to pretty much every single function through the keyboard in an interactive way. I can't say the same about Numbers.

What exactly is it you do in Excel which is quicker than Numbers?

Numbers excels at what a spreadsheet application should be about. Once you get very complex sheets you are much better off using more specialized software such as DataGraph, R, Julia, Matlab, Numpy, SAS.

'open -a appname.app'

is what I generally use to open applications. I never use Spotlight.
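For reference, a few common `open` invocations on macOS (macOS-only command; the app, file, and URL names below are just placeholders):

```shell
# Open an application by name (the .app suffix is optional)
open -a Safari

# Open the current directory in a Finder window
open .

# Open a file with its default application, as if double-clicked in Finder
open report.pdf

# Open a URL in the default browser
open https://example.org
```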

I have long said that Mac OS X is only useful as a way to start Emacs, but since 10.10 I am not even sure about that.

Of course it is a matter of taste. I get that. I still have to bite my tongue to not yell profanities whenever I hear anything about productivity and Mac OS.

Most of the time I keep my mouth shut. Now I just wanted to tell you that you are not alone. I have to use Mac OS at work. To avoid frustration I have started to let my soul and will leave my body before logging in and let my lobotomized shell I left behind just go with the flow. At least until Emacs is running full screen.

I don't suppose it's any consolation, but I feel the same way if I have to use Linux or Windows as a workstation. Different strokes for different folks.

I guess it depends on what work you do.

As for writing software macOS is nice since it is quite close to Linux and does contain a lot of the UNIX-thinking.

Windows is, IMHO, a user experience mess. Mac is more gentle and aimed to be simple and useful for everyday people. Windows feels like it was designed by a lot of different groups that in the end glued their pieces together in order to ship it.

> As for writing software macOS is nice since it is quite close to Linux and does contain a lot of the UNIX-thinking.

You know what's even closer to Linux?


Nice thing on windows is I can open the command line type >bash and I've got vim, ssh, sftp, bash whatever immediately. I still don't use that because other telnet, x-window setups are better for me. I never found out how to be productive in OSX vs what I can do in Windows.

> Nice thing on windows is I can open the command line type >bash and I've got vim, ssh, sftp, bash whatever immediately.

Yes, macOS has a terminal emulator as well.

> I never found out how to be productive in OSX vs what I can do in Windows.

Open Spotlight, type in "Terminal" and hit return.

Unsure how that magically makes OS X productive. Stuff is just easier and faster on windows or linux. And I’m tired of OS X randomly crashing or freezing. It’s like windows 98 now.

>Stuff is just [...] faster on windows

Windows is the slowest of OSes, see http://www.bitsnbites.eu/benchmarking-os-primitives/

Then how does opening the command line in windows magically make it productive? They're just citing a counterpoint.

It's a trendy way to start Emacs. I used one just to use vim.

I was with you until the "I preferred Windows" part. Windows right now is the most infuriating piece of software I can imagine for a developer. Forced upgrades? Forced advertisement? Gigantic, expensive SDKs with incomprehensible versioning and install times measured in hours, no standard library installation facilities, no standard build facilities, a default compiler that is markedly inferior to the alternatives, it just goes on on and on.

No idea where you're getting most of this from.

I was a C# developer as recently as 2017, and I'd say that I prefer Windows 10 to OSX as a development environment, but going through your points:

* Upgrades are handled by the system administrator at most companies, so it's unlikely that automatic upgrades will be set up if you've got anyone remotely competent handling your IT.

* I can't say I've ever seen an advert when working on Windows. There's some Cortana crap, but that takes a few seconds to click away, and you'll never see it again. It's no different to your standard desktop setup for OSX or Linux.

* If you're a .NET dev, it's extremely unlikely that you're paying for the OS or the platform, in the same way that you're not paying for OSX. Admittedly, Microsoft tools take an age to set up, but the latest versions of .NET and Visual Studio are much quicker - if anything, I spend far more time upgrading/installing stuff on OSX. Hell, sometimes setting something up on Homebrew will take longer than a standard Windows installation for a given tool.

* I'm yet to see a Windows machine, outside of a brand-new one, set up without the necessary .NET framework. If it's not on there, Visual Studio will install it for you. Again, not an issue.

* Not sure what you mean by build tools and a default compiler - IMO building/compiling is ridiculously easy for .NET apps, either through the command line or through Visual Studio.

In my view, as someone who has worked on all three sides (OSX, Linux (Debian), and Windows) I'd say that Windows is just as capable as the other platforms for its main use cases. Where Windows struggles is in its differences. It's a very different experience, and people from each side struggle to make the switch, and it's a switch where you feel that you can run before you can walk at times. You have your own way of doing things efficiently, but even though you're looking to do something similar on a different stack you're using entirely different tools.

I don't understand you guys; it significantly depends on the programming language you use. What you said would apply to certain languages only. For example, we use Delphi, and Delphi runs only on Windows; I believe C# programmers are in a similar boat too.

For C# there's .NET Core nowadays, which IIRC is a cross-platform JVM/JDK-like thing; it runs on Linux and everything too.

Developers really should be using Windows Server for Windows development, particularly if the software is going to be running on Windows Server in production. That solves at least some of the issues you mentioned.

Good point. But the license cost would be too much. I don't think there's an upgrade path to server from consumer. You have any helpful tips?

If you are a Silver Partner or better you don't have to pay to use Windows Server for internal purposes, including development and testing. It's not expensive to get that.

As mainly a Windows developer, with UNIX experience going back all the way to Xenix I can enumerate similar complaints about developing on UNIX.

Expensive SDKs? You should have seen UNIX compiler prices before GNU and BSD actually mattered.

Don't try to use non-UNIX OS as if they are UNIX and the experience will be much better.

This is contrary to my experience. I've been on various Macs for years, tried to switch to a Surface Pro (i7 model) + WSL. Nice machine, but it didn't work. I ended up running Ubuntu in a VM just to get `npm install` to work reliably. And that was horrible and slow, even with VMWare. Installing Linux on that thing looks like a lot of work (there's a whole SurfaceLinux subreddit...)

You know what I ended up doing after nine months of this crap? I switched back to my trusty old 15" MacBook Pro. The backlight is dying, but it works a lot better for me than Windows or Linux!

I wish I could agree. Half of the time I spent on my last linux laptop was spent frustratingly trying to get things to work. The screen font was too small for some stuff, too big for others. It didn't wake up from sleep properly. sometimes it didn't GO to sleep properly. It was death by a thousand cuts.

This has been my experience too. I love Linux, but it's just too buggy (and I had all these issues on a ThinkPad that was marked as 'Linux Ready'). It really does give me a nice environment for development, but the hours and days lost to fixing problems made it a total time sink. MacOS on the other hand has all the tools I need and works incredibly well. I also cannot stand Windows; it's a total mess and drives me mad. So I am stuck with MacOS and for now I couldn't be happier.

Thing is, I could honestly say the exact same thing, but with Linux and Mac swapped around.

I had so many issues with Mac and all my Linux devices just work flawlessly for years without issues.

As a counterpoint, I installed Arch Linux on a Lenovo gaming laptop and I'm delighted with the result. Arch being Arch, I had to configure lots of things manually, but there are no bugs to speak of. Maybe you mean weird behavior instead of bugs? That would make more sense.

What does Linux give you that macOS doesn’t?

I use both a Mac and an Arch Linux running i3/awesomewm, and to me macOS is like any Linux distro with a user-friendly desktop environment.

I guess it doesn’t have an "official" package manager, but homebrew has most packages anyway?

Package management is core to the OS, not bolted on. I can change the desktop environment as I please, and potentially run different environments for different purposes, if I so choose. For any piece of software on the system, I can have a part in its development process, if I choose to. I like a non-GUI-centric system. I'm not limited to Apple's drivers, or Apple hardware in general (it's pretty and sleek, but I find some of the design choices grating).

I feel like Apple provides a computer and OS that are user-friendly to the general population. But it also seems like the whole culture is "No, don't do it that way. We've provided this method as the One True Way."

How is package management via apt any different than using homebrew? They're both equally "bolted on".

Apt is the update mechanism for the system. OS updates, application updates, etc. It's a core piece of software on a Debian-related system. Brew behaves more like an alternate software repository...it's not like you're using it to fetch your kernel updates in macOS.

Why does the method of updating the core OS matter? Unless you're a system admin, and have to do it a lot. If not, that's hopefully something you don't spend a significant amount of time on each day.

Most of my concerns with regards to the OS I use has to do with the stuff I do 10s or 100s of times a day.

> Why does the method of updating the core OS matter

Because updating Arch is about 100x faster than updating a Mac to a new version. The loading bar looks nearly complete and then "About 17 minutes remaining."

I’m not sure Arch Linux is the best thing to compare it to. It might update quickly but it might not boot up next time, either.

That's a popular myth, but nothing more. I've been running a single Arch install for close to 4 years now, no problem.

> Why does the method of updating the core OS matter?

Because it's part of the answer to the question "How is package management via apt any different than using homebrew?", and (I think) supports my assertion that Apt is more of a piece of core OS functionality than Brew is.

So, it's more of a philosophical hang-up than a practical one.

I love apt, but I have no philosophical stake in the game. Both allow me to install things from command line. In that respect, they are functionally identical to me.

Then once a quarter or half-year, I need to do an OS upgrade, and then I use two different systems depending on platform (I use both regularly). Let's say one takes 15 minutes and uses apt, and the other takes 45 minutes and uses App Store.

Then I amortize that over the preceding three months, and in both cases the attention required, confusion created, and effort expended approaches zero rapidly, regardless of system.

Like I said, if you're a system admin, then sure.

try 'brew install gnome-desktop' maybe?

Wait this works? Gnome shell? The full gnome experience?

Nope :-/

        $ brew install gnome-desktop
        Updating Homebrew...
        ==> Auto-updated Homebrew!
        Updated 3 taps (caskroom/cask, caskroom/versions, homebrew/core).
        ==> Renamed Formulae
        php70 -> php@7.0

        Error: No available formula with the name "gnome-desktop" 
        ==> Searching for a previously deleted formula (in the last month)...
        Error: No previously deleted formula found.
        ==> Searching for similarly named formulae...
        ==> Searching local taps...
        Error: No similarly named formulae found.
        ==> Searching taps...
        ==> Searching taps on GitHub...
        Error: No formulae found in taps.

I only realized later that it wouldn't really work anyway guessing from all the missing dependencies.

Uniform platform support for multiple architectures. I have Linux desktops on i686, x86_64, arm and aarch64. Same desktop environment, same programs, same easily mirrorable configuration, same firewall system, wireguard, same or fairly equivalent package management between systems.

I work on container-tech and the absence of namespaces and cgroups in the macOS kernel is a continuous source of frustration for my team since you need to work through a VM abstraction for Macs.

What you're talking about is way over my head, but I do understand Linux might be more appropriate for low-level stuff and hardcore users like you at the kernel level. I'm happy doing React on my Mac :)

> What does Linux give you that macOS doesn’t?

perf, case sensitive file systems, non stupid alt-tab behaviour, strace, pstack, gdb (these don't seem to work without sacrificing animals), gnome-shell (better than finder by a long way, imo).

I've got a mac. I don't install programs except for things through brew. It's basically shitty linux with outlook.

perf, strace, and pstack can be replaced with dtrace on MacOS.
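As a rough strace-like substitute, a dtrace one-liner can count syscalls per process (sketch; requires root, `1234` is a placeholder pid, and on recent macOS SIP may block it unless relaxed):

```shell
# Count system calls made by the target process, roughly what `strace -c` gives you;
# the aggregation @[probefunc] tallies each syscall name until you hit ctrl-C.
sudo dtrace -n 'syscall:::entry /pid == $target/ { @[probefunc] = count(); }' -p 1234
```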

You can have a case-sensitive partition. (Separate from the root volume so it doesn't break some apps.)

Do you need to add dtrace to the keychain each time it's updated, or do they just not update it like most of their command line software?

What do you mean "add dtrace to the key chain"? I've never done anything related to keychain while using dtrace.

gdb needs to be added to the keychain as a signed program in order to attach to programs. But brew is a rolling release so I might need to add this frequently. It's painful.

DTrace is built-in software.

I took some time to try dtrace out. Sadly it's broken unless one turns off SIP:


Only when tracing the operating system, not your program. But yes, that is still a shame. You don't need to turn off SIP globally, you can just enable DTrace while keeping the rest of SIP on.
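If I understand the partial-SIP option correctly, it goes through Recovery mode (sketch; run from the Recovery-mode Terminal, not a normal session):

```shell
# Boot into Recovery (hold Cmd-R at startup), open Terminal, then:
# re-enable SIP but leave DTrace unrestricted, and reboot afterwards.
csrutil enable --without dtrace
```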

Thanks for the tip!

I was trying to work with some students with Swift and the Mac character encoding ending up causing no end of problems.

> the Mac character encoding

You mean UTF-8?

..with outlook?

Sorry man, but you’re doing it wrong.

And the Mac file system is case sensitive.

Alt-tab behavior is more of a preference thing.

>It's basically shitty linux with outlook.

Linux is shitty Linux. Except on the server. (IMO, of course).

I can accept that alt-tab is a preference if you want to use alt-tab and alt-`. It's heresy to prefer alt-`, but whatever - the vim-spaces-only (but automated code formatters are best and I don't care what they use) alt-tabbers will eventually win out.

But on multiple desktops if you alt-tab to the previous application again, it only brings up the application windows in the current screen instead of bringing you to the last window you used (on another screen). Wrong! Broken! Sad!

Also, on mac alt-tab raises ALL the application windows. So if you have shed loads of terminal windows or loads of browser windows open, then alt-tab brings them all to the front. This is definitely broken since it stops common workflows like copying between windows; or finding some text that you want to type into a terminal and then alt tabbing to the terminal only to have the screen covered in terminal windows. SAD.

Mac is low energy (That's the reason I think I have it for a laptop).

I find separate inter-app cycling (cmd-tab) and intra-app cycling (cmd-`) much superior, faster, giving you better control.

However, agreed that on a Mac with multiple "spaces" it's completely broken.

Compared to an optimized Arch, or even a non-overloaded Windows 10, the Mac never appeared to me very energy friendly.

> What does Linux give you that macOS doesn’t?

Stable, long term, OS support. I'm running CentOS 7 x64 now, as I got completely sick and tired of Apple's "new OS release every year" bullshit.

You can run things like the Adobe Suite without any ridiculous overhead. Where Xcode is important you've also got that.

For a lot of people that don't care the differences are largely subjective and the difference between Ubuntu, for example, and macOS are largely academic and aesthetic.

With academic, I'm not sure if you also include 'philosophic'. Personally, I'd rather develop on mac than on windows but I'm the happiest on Linux.

A large factor in this is that I like to contribute to FOSS and find that 'ideology' to be a match with my beliefs regarding software.

For work I prefer the macOS environment partially for the software, but mostly because the machines are standardized and interchangeable. If one machine dies I can swap it for another without any fuss. Restore from Time Machine and get on with life, something that takes about an hour or so.

This is really not the case with Windows or Linux. These require a lot of tinkering and tuning. A recent swap from one Windows 7 machine to a Windows 10 one took days, the migration procedure is basically garbage.

I've never had much luck with desktop Linux even though I use it all the time on servers but those get rebuilt with a new OS when they're out of date. Upgrading them is just too much of a fuss.

If you've got a workflow for keeping desktop Linux up to date and rolling over from one machine to another as you upgrade hardware, that's worth sharing.

> Upgrading them is just too much of a fuss.

Switching to a rolling distro will eliminate the upgrade pain. Keeping dd backups is also relatively easy.

And unless you're on a custom kernel, you can just roll over to a new machine with your image and the appropriate kernel modules would get loaded for the new hardware at boot.
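A dd backup of that sort can be as simple as the following (sketch; `/dev/sda` and the output path are assumptions for illustration, and the source disk should be unmounted or the machine booted from other media first):

```shell
# Image the whole disk to a file; bs=4M for throughput, status=progress for feedback,
# conv=fsync to flush the output before dd reports completion.
sudo dd if=/dev/sda of=/mnt/backup/disk.img bs=4M status=progress conv=fsync
```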

> What does Linux give you that macOS doesn’t?

For one, you largely don't pay strategy tax, which is under discussion in the very title of this submission.


Strange, I have the complete reverse experience. Mac is by far the best. Windows is worst.

I use Linux at work daily but miss my mac. It is much more unstable and unpolished. The app selection is really weak and the integration between gui and console is weak although better than on windows.

Windows is an utter mess these days as Microsoft is jumping between so many different UI paradigms.

Can you give examples of the GUI and CLI integration? I have a mac, but mostly use a Linux desktop, so I don't know what I'm missing.

Drag and drop an item from Finder to the terminal and it expands to its full path. Not sure if Linux does something similar as I'm not a regular user.

It does; under a modern DE, KDE/Plasma does this.

No comparison is fair if someone hasn't used Plasma.

I could not care less about "freedom" or these other philosophical aspects of Linux.

Plasma is just straight up amazing on every level.

I also quite like the 'open' command to do the opposite.

The Linux equivalent of this one is "see", an alias to the run-mailcap program, which on Ubuntu is in the "mime-support" package, which is probably installed by default.

Drag and drop almost anything into the CLI and it produces a sensible result. The open command works as if you double-clicked the item in Finder, which means it can be used to open directories, launch the program associated with a file, etc.

In addition, because Macs use command+C, command+V for copy-paste rather than ctrl+C and ctrl+V as Linux and Windows do, you feel no different working from a CLI than from any other app. You don't have to mentally jump in and out of two different ways of working.

This extends all the way into GUI apps. Typical CLI command keys work in all Mac GUI apps. I can use readline keys such as ctrl+a and ctrl+e for moving the cursor, for example. It works even in office apps like Keynote and Pages.

It is very frustrating to not be able to use these Unix conventions on Linux!!

There are lots of little things like this which makes a superior experience IMHO.
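For what it's worth, those readline-style keys come from Cocoa's text system, and they can be extended system-wide via a key-bindings file (standard macOS mechanism; the particular binding below is just an illustration, not something the thread mentions):

```shell
# Create ~/Library/KeyBindings/DefaultKeyBinding.dict, which Cocoa text fields
# read at app launch. This example maps ctrl-w to delete-previous-word.
mkdir -p "$HOME/Library/KeyBindings"
cat > "$HOME/Library/KeyBindings/DefaultKeyBinding.dict" <<'EOF'
{
    "^w" = "deleteWordBackward:";
}
EOF
```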

I only use KDE, so I can't comment on Gnome etc.

Dragging a file into Konsole (KDE's terminal) gives a menu with the options "Copy here", "Link here" and "Paste location". Seems reasonable.

Dragging a hyperlink or image from Chromium gave the same options. "Copy" downloaded the file, although it set a timestamp in 2106 for some reason. This doesn't work from Firefox.

Dragging selected text pastes it in.

"see ." opens the current directory in the file browser thing, "see thing.odt" opens LibreOffice, "see my.pdf", etc. "see http://example.org" doesn't work, although I can right-click the link to open it. (Naturally, "alias open=see" if you prefer that word.)

I set a custom shortcut for Konsole for "Super+C" etc (Windows/Cmd key), but I don't use it very often. I mostly select + middle click to paste, which is a Unix convention I miss on a Mac! The readline keys are nice, they seem to work about 80% of the time on a Mac, and I haven't found a way to get that working in Linux.

My "lots of little things" favours KDE. Properly maximizing a window, having a "keep above" button for any window, focus-follows-mouse, and the general feeling that the computer does what I ask in a boring way, not what it thinks I want in a stylish way.

So why are you sticking with it? Go out and get a nice Linux machine. Expense it. If you're working at a company too square to approve that, install Linux on your Apple laptop. If you're working at a company that won't even allow that, well, you have my condolences. Things to ask before you sign, I guess.

This works, but only for simple environments. Given a work environment where you have a choice of: a) prepared environment with all application dependencies available and a tested-by-everyone, one-click-install/update on a bad system, or b) preferred system, but you have to do all of that from scratch yourself... Sometimes the reasonable answer is "a", even if you're allowed to do "b" (on your own time of course)

I've always been a b guy, and it's helped me understand how things work. One time, I joined a company, and two weeks after joining, poking around with a weird custom machine setup, I found that the machine was running a world-writable batch file as SYSTEM upon boot. Who knows how long this huge gaping security hole would have gone unnoticed if I hadn't felt compelled to poke around with Cygwin?

I asked for both a and b.

I could make do with an older computer for Office and Outlook.

Windows is good, but getting hardware working is a lottery. I recently built a PC with pretty standard components (Asus MB, i5 CPU, Asus GTX 1060 GPU) and installed Windows 10; it worked fine until I enabled Hyper-V. With Hyper-V enabled it BSODs a few times a day because of a buggy NVidia driver, so I had to disable it. While macOS is pretty buggy and I've experienced crashes, they are not that frequent, maybe a few times a month or less.

I must be very lucky. I haven't seen a BSOD in 5+ years now, while using Windows nearly every day.

Maybe I'm unlucky, I don't know. The hardware seems to work fine, because with Hyper-V disabled a long stress test performs just fine. I'm trying to prepare a bug report, but NVidia doesn't seem to even have a proper Bugzilla, so I'm not sure if it'll go anywhere. I hate interacting with corporations.

What exactly does your development workflow look like where Linux was so much better than MacOS?

Apple frequently introduces changes to its OS that cater to the average user without any consideration for developers. A good example I had to deal with: I wanted to change the port the SSH daemon listens on. With El Capitan and later versions you have to temporarily disable SIP (System Integrity Protection) - which requires two reboots - in order to make the change to the ssh.plist file. As you can see it's a common problem: https://apple.stackexchange.com/a/208481 It's lots of little things like this that require extra effort on OSX but which are straightforward on a decent Linux distro.

Flatly, this isn't true.

I can still go into /etc/ssh, sudo vim sshd_config, change my values, and they stick.

Just to test it, I did it right now, and I'm running 10.13.4, for what it's worth.

What exactly was the issue with changing the sshd config? This isn't a SIP protected directory, which is the only thing that would prevent such a move.

Perhaps the requirement of sudo? I feel like most linux distros force that in the /etc directory too.

Changing /etc/ssh/sshd_config was my first attempt as well. But since this is OSX things are a little different, see here: https://serverfault.com/a/67616/
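For context, on macOS sshd is socket-activated by launchd, which takes its port from the symbolic "ssh" service name rather than from sshd_config's Port line. One commonly cited workaround (a sketch based on that serverfault answer; whether it fits your setup is an assumption, and port 2222 is a placeholder) is to remap the service in /etc/services, which is not SIP-protected:

```shell
# /etc/services maps the symbolic "ssh" service that launchd's ssh.plist
# references (via SockServiceName) to a port number. Edit both ssh lines
# (tcp and udp) to the port you want, e.g.:
#   ssh              2222/tcp
#   ssh              2222/udp
sudo vim /etc/services

# Reload the launchd job so sshd picks up the new port:
sudo launchctl unload /System/Library/LaunchDaemons/ssh.plist
sudo launchctl load /System/Library/LaunchDaemons/ssh.plist
```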

  this security feature that is great for the vast majority of Apple's user's isn't convenient for me.
Luckily, Apple shipped SIP with the option to disable it, and it's not hard. So you can disable it once and then you never need to deal with issues like that again. It's weird because it sounds like you want the protection of SIP without the inconvenience of SIP, but that's never been possible with pretty much any security measure ever -- more safety means less convenience. That being said, to me Apple has actually been the best when it comes to the safety/convenience ratio. Linux distros don't even have the option of SIP or something similar, so I can't say I find your argument compelling.

It's not the fact that the SIP is there which bothers me. It's that Apple introduces these kinds of things with little notice and without caring if they break compatibility. This has always been Apple's approach and it's just not friendly for developers.

Is there anything stopping you using a separate sshd to run on a different port? Or using one from homebrew instead of Apple's bundled one?

What was the workflow in which windows was better than anything UNIX based....

I’m missing the logic as well.

Linux > Windows > macOS?

Desktop software development, graphical debugging tools.

The only UNIX-based OS that tops it is macOS.

But Linux is better for that?

Not at all. Linux is a command-line culture, where people improving UI/UX on GNOME/KDE always get bashed for taking the power away and doing irrelevant work.

There is no single cohesive stack of desktop technologies like what macOS calls Kits, Android's frameworks, Windows UWP and such.

Something like Glade still falls short of what Xcode or VS Blend are capable of.

Sure there is something like Qt QML designer, but that isn't Linux specific anyway.

You are missing context. Grandparent was claiming that Linux was the best OS he’d used for development (fair enough), but that he’d rather have Windows over macOS otherwise. That part I was hoping for clarification on.

Not OP, but the LLVM toolchain provided by Apple is a bit clunky and missing features relative to what you get from a typical Linux distro.

Something I don’t quite understand, coming from a scripting-language background: why are you using your OS’s provided compiler toolchain (for anything other than building OS packages for distribution)? Is there no version manager for clang the way there is for e.g. Rust?

Speaking about the classical C/C++ mainstream, and more from the Linux perspective:

Tooling for C and C++ mostly relies on some external package manager, often the OS-provided one (on most Linuxes, for example). There isn't a standard cpan/npm/pip/cargo for C/C++, although there are plenty of tools that can do kind of the same thing.

There's also not much support for virtual environments (there are tools out there, but not ubiquitous tools). It's pretty easy to point the compiler to a different set of header files and libraries, even on a per-file basis, to get a similar effect.

And from the Apple side (which I have a vague understanding of, having dipped my toes in a few times): Most of the documentation assumes that you're using XCode, and I'm pretty sure that the version of the compiler is just tied to whichever version of XCode you're using (which has a somewhat looser tie to the version of MacOS you're running). So in that case, you'd be using the XCode-provided toolchain rather than the OS-provided one.

You can install GCC from Homebrew if desired. Personally I don't bother because, well, Apple's clang works just fine!

> Is there no version manager for clang the way there is for e.g. Rust?

Not an official one for sure, Cargo is a blessing for people used to dealing with C/C++ dependency management.
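That said, a versioned Homebrew formula gets you most of the way to a pinned toolchain (a sketch; the `llvm@15` formula name and install prefix are assumptions about your Homebrew setup):

```shell
# Install a pinned LLVM alongside Apple's toolchain and point
# the build at it explicitly -- no global switch needed.
brew install llvm@15
export CC="$(brew --prefix llvm@15)/bin/clang"
export CXX="$(brew --prefix llvm@15)/bin/clang++"
"$CC" --version   # should report the brewed clang, not Apple's
```

It's not Cargo-level ergonomics, but per-shell `CC`/`CXX` exports cover the common "I need a newer clang than Xcode ships" case.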

Macs ship with outdated versions of common bash commands and tools as well. Homebrew goes a long way to fixing this for me.

I have the same problem on Ubuntu LTS distributions. Eg the ancient version of git in 16.04LTS.

You can fix that of course but homebrew does the same job on macOS.

Roll on 18.04LTS.

Homebrew exists on Linux. I like running LTS distros because they just work. For newer per-project exceptions, I put everything into a container (again layered on LTS); it is especially nice for things like specific versions of LLVM which I wouldn't want to pollute my base machine with. My personal userland gets shipped via cargo, go, pip, npm, etc.


I try to put everything our team does into a container for the same reason. I'd point out though that the version of docker listed by apt on 16.04LTS is also really ancient. Pre docker-ce.

Yeah, I install docker on to 16.04LTS via this install guide [0]. I'd happily use something else, and probably will, but docker is low friction.

[0] https://docs.docker.com/install/linux/docker-ce/ubuntu/

The most enjoyable and productive dev workstations I have used/set up have all used rolling-release distros for this exact reason.

It does have the overhead of requiring you to actually understand (or know how to google) the system/packages that you use, though.

It's not just understanding the packages. When I upgraded from 16.04LTS to 17.10 I had to relearn how to compile the kernel because my laptop would no longer boot.

Wasn't that hard to do in the end but it's not something I've done for many, many years and doesn't fill me with confidence.

BASH is outdated, but I compile the latest version and install it in /bin (I have system integrity protection turned off).

Other UNIX utilities are actually the standard POSIX ones. If you are used to the GNU extensions on Linux, then the Mac ones may seem outdated, but ironically (the macOS kernel is called XNU, which stands for "X is Not Unix") macOS is certified UNIX, while Linux is only UNIX-like.

StumpWM & a full GNU userland just like the machines we deploy on are killer.

It's sort of apples and oranges... I'd thought the parent was talking about day-to-day consumer stuff (which even computer scientists and programmers do) like managing music, photos, various online accounts, and so on. Very different from all the stuff that software engineers specifically do. Of course there is some overlap, but overall it's two vastly different sets of user experiences.

I can concede Linux might be superior, but were you comparing apples to apples with OSX vs. Windows? Most Windows development seems to be on MS-based development stacks. A lot more platform-independent development happens on OSX, I'd be willing to bet, and iOS development on OSX would be about as productive as .NET development on a Windows machine.

The issue seems to be many equate developer == UNIX developer, as if there wasn't anything else.

Also, the UNIX underpinnings of macOS, just like on NeXTSTEP, were a mere convenience; the OS culture was never about crufty CLI programming, and many who only jumped to Apple after OS X have not yet grasped that.

Aww this thread is such a waste of time and space. Bunch of totally subjective thoughts without any specific examples.

What apple software do you need to use? Sure, it's annoying not having the GNU command-line utilities. But `brew install coreutils` basically fixes that for me. I admit I don't work heavily with compiled languages on macOS so I can't comment on technical aspects of that. iTerm2 is not inferior to linux terminal emulators. The Apple laptops are quite obviously the best hardware experience out there. I mean obviously iTunes is a complete insult to humanity, and I've no fucking clue how to use Finder etc, but does that matter really?

I mean the laptops are great.

Provided you don't need a HDMI port. Or more than 1 spare USB port. Or an ethernet port.

You could get the touch-bar version, but I've read literally nothing but bad reviews and stories about that thing, plus tb versions have smaller batteries.

Yes, I could get a dock, but here's a better idea: how about apple chills out on their unnecessary-thinness fetish, and gives back some actual functionality in my laptop.

With the touch bar I agree, I hate it. But the thunderbolt dock thingie totally rocks. It's really nice to only plug in one cable to connect all the hardware on your desk (multiple screens, ethernet, power, etc).

In theory docks are nice, but in reality they don't work at all. Does Apple make their own dock?

Every time I come back to my desk with my laptop I have to go through some ritual involving opening and closing the laptop lid and connecting cables in a certain order to maintain screen orientations and try to get the thing to even wake up. I power cycle my dock at least three times a week to try and make it all work

I used to have a 2014 RMBP with the Apple display and it was fantastic because it was all built by Apple and Just Worked (TM).

Hopefully moving to their own chips is the start of them going back to owning the entire device solution.

Yeah, this is why I'm so happy with a Surface Book. In a couple of generations once all the bugs have been ironed out I'm sure third-party Thunderbolt docks will be great, but for the time being I want a first-party dock.

TB3 is definitely an improvement over the TB2 ports on previous Macs. But there's no reason Apple couldn't build a laptop that has both TB3 and HDMI/USB-A ports.

Here's a counterpoint: I have the MBP with touch bar and I like it. The only problem I had was the Esc key, but after a while you get used to it. I also like how light and thin it is; I can go to work now with my backpack pretty much empty.

I mean, I used to ride to work/uni with my 2009 macbook in my backpack without any issue (along with textbooks, notes, etc). Compared to today's machines that thing would seem like a tank, so I don't know why everyone goes on about these new ultra-thin MBPs like they're the first laptop to be portable lol.

I've heard it said that the greatest minds of our generation are thinking about how to make people click ads. However, I feel like the greatest minds of our generation are spent trying to figure out how to enable people to build Linux software on OSX. Whether it is:

* a convoluted process involving Vagrant, Docker, or both, because you depend on one or more pieces of software that don't run at all on OSX, including Docker itself.

* slogging through bugs specific to services running on OSX because they really only support Linux well, such as Cassandra or Kafka, or even MySQL.

* Getting shell scripts that work reliably on both OSX and Linux, especially as the tools used to do so break backwards compatibility, whether by Apple itself or by 3rd party tools like Homebrew.

* Getting a consistent development environment at all on OSX. Doing so seems to be much easier on a Linux distribution than on OSX.

And that's not even talking about general issues like:

* incredibly flaky bluetooth drivers, often requiring a full restart to fix, if not having to reset some weird hardware bit.

* My laptop randomly not waking up properly from sleep and requiring a restart.

* My laptop randomly beach-balling more often than I ever saw BSoDs on windows.

* OSX seeming to just run really slowly compared to Linux whenever it is stressed in any meaningful way.

* My laptop's wifi not working with random wifi endpoints, such as at the airport or hotel. Whether it's router software bugs or OSX bugs, I am always able to connect just fine on my windows laptop.

* Having to deal with OSX's incredibly outdated, BSD-specific userland. Yes, there are workarounds, but they are generally a pain to figure out and have not been standardized in our environment in any way.

* For a few months my OSX terminal was segfaulting about twice a week. I learned to be very grateful for screen/tmux during that time. At some point Apple seemed to fix it, at least.
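On the shell-script portability bullet above, the classic offender is in-place sed; a shim along these lines (illustrative only, with a throwaway /tmp file for the demo) papers over the GNU/BSD split:

```shell
# GNU sed accepts a bare -i; BSD (macOS) sed demands an explicit
# (possibly empty) backup suffix after -i. Detect which one we have.
sed_inplace() {
  if sed --version >/dev/null 2>&1; then
    sed -i "$1" "$2"       # GNU sed (Linux) -- supports --version
  else
    sed -i '' "$1" "$2"    # BSD sed (macOS) -- errors on --version
  fi
}

echo "built on osx" > /tmp/demo.txt
sed_inplace 's/osx/linux/' /tmp/demo.txt
cat /tmp/demo.txt   # prints: built on linux
```

Multiply this little dance by every GNU-vs-BSD difference (date, readlink, xargs, mktemp...) and you get a sense of the tax being described.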

At $lastjob we were actually developing a Linux service that relied heavily on Linux APIs because it was essentially creating full-on Linux containers like Docker. Before I joined, the previous developers, all big apple fans, were actually going through the tremendous effort of trying to make the service at least build and run on OSX, even if it no-oped most of the things it did.

When I joined, one of my first acts was to completely remove OSX support for this service, and I promise you that life got way easier ever since. Our development processes got simpler. Our build system got simpler. And most of all, our source code got simpler and easier to read.

Granted, most of us aren't developing software that directly calls into Linux APIs, but even then I think you'll find huge productivity wins if you just use a Linux laptop or desktop, assuming everything else is equal.

Everything you listed under your general issues section is my exact experience when running Linux. I don't seem to have any of those issues on OSX.

I did maintain a Linux laptop (Ubuntu) for all my personal and development usage, from 2003-2011. Based on my experience at that time, it is entirely incorrect to claim that the experience with device drivers (wifi, bluetooth) and sleep/suspend/hibernate is better on Linux. Those things basically never worked right, battery experience was terrible, and many times during that era I lost a day's work because something was broken early in the boot sequence, and I couldn't even start X Windows.

In 2011 I switched to macOS due to my job and I have never had to deal with any of that. Ever. Perhaps the Linux laptop experience has improved significantly since then though.

Furthermore, there was no hardware nearly as nice as Apple laptops on which to install linux. (Yeah, other than Apple laptops, but I felt like I didn't have the money to justify that).

However I of course agree with your points regarding Vagrant/Docker and shell scripting. It is a shame that MacOS could not be based on Linux.

Device driver compatibility and availability on Linux has never been better. Even better than Windows and OS X out of the box in many cases. Seriously, there have been huge improvements in the last few years.

If you want to run Linux on a laptop and get the best experience, a Mac is not the best option. Thinkpad (business class X and T series) is what you are looking for.

Linux has come far in seven years. I haven't had issues with wifi or bluetooth for years on quality hardware (Thinkpad, Dell.) YMMV

I begrudgingly use a MBP now after many years dealing with similar issues on Linux. The hardware is great, things adjacent to hardware are great (drivers, external monitors, configuration), but I would prefer the windowing system to behave more like GNOME. Things like Alt+click to drag a window, Alt-Tab cycling through windows not apps (and sticking to the same desktop workspace at that), menu bar location, etc.

FWIW Alt-Tab, for me, does cycle through windows not apps, and windows within the same workspace only. Check SystemPreferences => Keyboard => Shortcuts => "Move focus to next window"

For everyone who wants to experience the hell of developing software for linux under OSX, I suggest trying to build the apache httpd server from source on a mac. It seems to be a trivial task on Linux, but it is a nightmare on OSX. I am saying this because having a custom-built version of apache is a common need for many web projects.

Once you've got to the point of needing to match your production server technology, doesn't it feel like you should be using a VM or docker (assuming you're tied to MacOS as the development environment)?

Yeah, i don't understand this argument. Even all the devs i know working under linux are doing all their development under docker or VMs to match the live environment and to be able to switch between different projects more easily. And even when you have only one project, who wants to run debian stable or centos as his main OS?

I do use docker. Do not worry about this part.

But first of all, development without docker is much faster. Setting up a docker container takes some time (especially with a custom-built apache server).

Secondly, running a docker machine (which needs a Linux VM) on a mac in parallel with Vagrant (I have to use it too) is CPU taxing, as both VMs (docker machine and vagrant) take a lot of power from the main CPU.

Add to this the constant port conflicts and network path resolution issues between different docker containers and hosts. It also forces me to use Oracle VirtualBox (who likes Oracle?). These little things all add up.

My point is that what should be a trivial task, is not so trivial when deployment is done on one platform and development is done on another. I agree with original comment that it probably makes sense for industry as a whole to switch to Linux development workstation instead of Mac.

The only reason I use a Mac, besides company policy, is that OSX somehow renders fonts better. My eyes get tired after looking at poorly rendered fonts on Linux machines. If you use a Mac and want to see how Linux fonts look, install the SeaMonkey browser. It turns off all the proprietary patented algorithms available on OSX and renders fonts exactly as on pure Linux, where due to licensing issues many of the nicer font rendering techniques are disabled.

Use an intermediate container, and repeat builds shouldn't take very long.

> Even all the devs i know working under linux are doing all their development under docker or VMs

I don't, because docker is a buggy, embarrassingly-poorly-designed system and VMs are a pain. I develop on the same system I deploy on: Debian stable.

> And even when you have only one project, who wants to run debian stable or centos as his main OS?

I loathe CentOS, but I love Debian stable. It's a wonderful, solid (one might even say … stable) system. Why wouldn't I want to run it as my main OS?

One of the things I really hate about our development culture is the cult of the new. It's a good thing to use stable, well-tested systems. Let others find the bugs; I'm happy to get work done.

> who wants to run debian stable or centos as his main OS?

I haven't used these two in particular, but if they support flatpak, why not? You'd have the same packages as on the server plus the latest versions of GUI apps.

Interesting point - is flatpak already a mature thing?

I've used flatpak on an Ubuntu 16.04 base without issues. There are limitations, for example you can't easily replace/update your window manager through flatpak or snap. But for user-facing apps, it's really a great step forward.
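For anyone curious, the basic flow looks like this (the Flathub remote URL and the Firefox app ID are the commonly documented ones; treat the exact names as things to double-check):

```shell
# Add the Flathub remote once, then install/run apps that update
# independently of the stable LTS base system.
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install -y flathub org.mozilla.firefox
flatpak run org.mozilla.firefox
```

The appeal for LTS users is exactly the split mentioned above: a frozen, boring base OS with current user-facing apps layered on top.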

The macOS system shell, python, compiler etc. are out of date, and out of your control. They're good enough for basic tinkering. However, as soon as you develop on a Mac professionally, you should create your own dev environments and stack independent of the macOS maintained one, starting with brew for example.

I thought it was common knowledge that you should not rely on system provided Python, Ruby, Tcl etc. They are there for scripting the system and nothing else.

If you are a Python developer, install the specific version your project needs and use that. Same for any other language.
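For example, with a version manager like pyenv (one of several options; these are the standard pyenv commands, and the version number is just an example):

```shell
# Install and pin a project-specific interpreter instead of
# touching the system-provided /usr/bin/python.
pyenv install 3.11.4
pyenv local 3.11.4      # writes .python-version in the project dir
python --version         # now resolves through the pyenv shim
```

The system Python stays untouched for the OS's own scripts, which is the whole point.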

Not sure what you mean by an outdated compiler? For what? C, C++, etc. are distributed with Xcode, and clang is usually standards-compliant and recent.

BASH is the only problem. Newer BASH versions switched to GPL v3 and Apple will never upgrade to that.

My personal solution for that is to disable system integrity protection, build latest bash and install in /bin.

This way most scripts that have #!/bin/sh or #!/bin/bash and rely on new BASH features just work.
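For the curious, the build itself is the standard autotools dance (the version number is just what's current as I write this, and the /bin install step is my own habit; SIP must be disabled for it on macOS):

```shell
# Sketch: build a current bash from source and drop it over /bin/bash.
curl -LO https://ftp.gnu.org/gnu/bash/bash-5.2.tar.gz
tar xzf bash-5.2.tar.gz && cd bash-5.2
./configure && make
sudo cp bash /bin/bash    # requires SIP disabled on macOS
bash --version            # should now report 5.x, not Apple's 3.2
```

A less invasive variant is installing to /usr/local/bin and putting it first in PATH, which leaves SIP alone but won't catch hardcoded #!/bin/bash shebangs.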

You always ought to build on the same hardware you deploy on - to be blunt, having multiple developers messing around on n different Macs is just madness!

Just use docker or a VM for matching your production environment. Then your devs can use whatever device they're comfortable with. Add in a CI environment and then suddenly you don't need to force the same OS on everyone. Everybody likes something different.

It's better than nothing, but you have still added extra risk by doing it this way:

Are all the VMs exactly the same?

Can you 100% prove that the VM behaves identically on all the varied hardware?

And no "professional" would ever consider "everybody likes something different" valid for a paying job!

No problem if it's a hobby project, but on a project with even a small team of, say, 5 or 6, the risk isn't worth it.

Once again, this is why CI is important, it acts as a final test to make sure things build and run correctly. If your deployments go through the entire CI process before going anywhere important, the source of the work (developers' particular workstation setup) is not as important.

> Are all the VM's exactly the same

Is the silicon die on your processor exactly the same as everyone else on your team? If not, you're not a TRUE professional.

> Can you 100% prove that the vm behaves identically on all the varied hardware.

No, but I also recognize that I (and many other devs) are not writing space shuttle / train control / self-driving car software that is responsible for human lives. If I were, I wouldn't be advocating for the style of development I mentioned above. That is a different situation which I imagine most people on this site are not dealing with.

> And no "professional" would consider "Everybody likes something different" is ever valid for a paying job!

What does this even mean? Of course it can be "valid" for a paying job. There are literally companies out there who offer different OSes and machines to use for a job and the developers get paid. You get to deem they are not "professional" because...?

> No problem if its a hobby project but on a project with even a small number say 5 or 6 the risk isn't worth it.

The risk doesn't lie with the number of people on a project but the scope of the project itself: basically, can human lives be impacted in a significant way if a dev screws up? Then yeah, there are better, more rigorous ways. Are you writing a web app, a desktop GUI, some CLI tool, or perhaps...the Linux kernel? Then develop on the machine you like! Have a good review process, set up CI to build for the platforms you support, and implement a good set of tests.

"Is the silicon die on your processor exactly the same as everyone else on your team?"

Of course. I have worked on projects where all the test, dev, and live hardware was explicitly bought from the same production run, so the hardware was identical down to the rev number of the PCBs - our hardware guy would have liked all the disks to be from the same production run as well.

Can you "100% prove" that your desktop Linux distribution has the same kernel modules and packages, and that none of the versions or patches diverge from the server your project is destined for?

I don't even know what hardware I'm deploying on. Amazon/Google/Azure has abstracted that away from me.

> What apple software do you need to use?

Not sure if you're asking about "Apple" software (eg from Apple themselves), or OSX Software.

Personally, I miss Affinity Designer and SnagIt. Both are OSX applications I've paid for (though not made by Apple), and which have not-as-good-to-use equivalents on Linux.

I still find myself switching back to OSX when I need to get things done using them. Rare now, but still happens.

Quite obviously the best hardware experience? That doesn't seem obvious. I've been running Linux on Apples for many many years, but mid last year I bought a Dell. I'm certainly satisfied with it; I definitely miss the magsafe power adaptor, but that was a trade off I knowingly made. It seemed on balance that Dell had the better product.

What was so obvious that after checking out the options, I bought the wrong hardware?

I like Linux better than Mac which I like better than Windows. I don't use Linux for work right now because HiDPI support just isn't there for a lot of Linux software.

What has given you HIDPI trouble?

I have Kubuntu at home and a 4k screen, and also at work with a "3k" screen. I don't do development at home, but the general desktop works fine.

You're lucky that you get to have just one dev environment. I have 2 Windows, 1 Mac, 2 iOS and 1 Android device on my work desk and I have to constantly switch between them.

That said, I do most of my heavy lifting on the Mac, and all primary dev on a Mac. Doing the equivalent tasks is a chore in Windows. I used to be all-Windows, but once I understood that MacOS is file-oriented and not program-oriented, it helped a lot.

Really? I've always found Mac OS unpleasant, because it seems to me very program oriented. I typically work in a task oriented way (a collection of windows, probably from different apps, on one virtual desktop, referring to related files; another virtual desktop has windows from the same apps for another task).

This has always seemed to me to be a species of "file oriented" working, and I prefer apps that run in a file oriented way. For instance, a graphical file manager that shows a folder in a window, where clicking to open an item in the folder always gets you a new window - even if the item in the folder is just another folder opened by the same file manager. Or office apps which always open each document in a new window, and when I close the current document, the only thing I notice is that the current document is closed - it doesn't try to focus some other window from the same app.

But Mac OS has always seemed an almost perfectly application-based interface, with its fully application-based dock and its application-based global menu (close a window for a file that was loaded by Cool App, and Cool App's menu still shows). I think its Finder is a bit confused: it used to be mostly file based, at some point in the OS X years it got more and more application based, but perhaps it's swung back - it's been a few years since I've bothered trying Mac OS. It had some of the most appalling virtual desktop support when I last used it, which made me think "the people who implemented this only work in an application-based way". I can only imagine that's got better. But it seems that with Mac OS the application-based workflow is always prioritised, and the file-based workflow is secondary.

I have found that Linux can[1] excel in this workflow, Windows is ambivalent to all workflows, supporting all of them badly because it supports none at all, and Mac OS prefers you to think "what tool am I using", not "what task am I doing".

Is my experience entirely unique?

[1] i.e. it depends on what particular tooling you're using - it's possible a default setup is appalling.

I'm a software developer too. I've held senior engineering roles at Microsoft, Apple, and Intel (in that order). I grew up loving Windows but soon after getting really deep into development that love vanished.

Anything I do today (everything from Intel microcode, x86 assembly to C/C++) happens on macOS simply because I can do every single thing I need to in one place. Most devs I bump into that really hate macOS have no idea it is really just (open source) BSD with Apple's oddly unique visual facade. There is literally nothing I can't do on my Mac even during the times I run Visual Studio, VTune, etc. I also find it amusing when devs tell me that macOS isn't as customizable as Windows. Sure... sure... ;-)

That's the question: will you be able to continue developing for non-Apple CPUs on a Mac? That's not what Apple wants.

You seem to be conflating macOS with Macs. Have you tried running Linux on Apple hardware?

Not GP, but I'd like to share my perspective.

Apple creates their products for a specific customer (which, judging by its popularity among people who have enough money to buy their products, covers most of the population), but not for me. Using Macs or iThings feels like using devices which aren't designed for my workflow.

Macbooks are well-built, but they are optimized for Mac OS. Dells and Lenovos are good enough for me and work nicely with Linux. I like physical buttons and keyboards which are... well, it's hard to say what I don't like about Mac keyboards, but they don't agree with me.

What parts of your workflow do you think are uniquely enhanced by not being on Mac?

For me,

1. Getting packages with apt on ubuntu without installing homebrew. It's vastly simpler and natively supported by ubuntu. I can also update my entire system with sudo apt-get update && sudo apt-get upgrade.

2. Native docker support.

3. On mac, I always ask the question whether something is ctrl-c or cmd-c, say. On linux it's always ctrl.

4. I don't have to login to the apple store to get the software I need.

5. Linux has always been command-line first. On mac it's always been GUI first.

6. Macbooks are terribly built, and fail for a number of reasons. [1]

7. The Apple brand, while it used to be one of homebrew is now akin to one of fashion. It's kind of like seeing all the kids walk around with their "Hollister" t-shirts on.

I could go on I suppose but those are the meat of the issues.

1: Louis Rossmann's macbook repair channel. https://www.youtube.com/watch?v=sfrYOWlKJ_g

2 - docker works flawlessly on macos

4 - i can't think of a single application, outside of maybe xcode, a developer might need to sign into the app store for

6 - whatever reasons that youtuber might have for not using macbooks, i will never buy a non-mac notebook ever again. this 2013 13" mbp has been the best and most reliable computer I have ever had in 30 years and I bought it used!

7 - sure the apple brand is cool but i don't get your point? doesn't it mean that you care about the branding so much that you won't use it? do you see where I'm going?

> docker works flawlessly on macos

"Flawlessly" is a stretch. Docker on Mac runs through a VM and is thus not native. Containerization on macOS is impossible without a VM.

Exactly. On linux you have the entire CPU and memory space to use, as it is just a mapped namespace.

I will add that being able to see the containers' PIDs in ps and other tools is more useful than most mac/windows users realize.

xhyve died on the vine, it appears, but it did make things a bit better.

Containers, sandboxes and zones are all different names for the same thing. macOS and iOS actually have quite good sandboxing support.

6: Go watch a few videos. He repairs macbooks -- for a living. Apple just keeps sending business his way.

Pro-tip: If you spill water on the keyboard, shut the MBP off immediately and send it in to the apple repair shop to get cleaned. If the MBP is on while there's water on the logic board, corrosion will form, either short-circuiting chips or eating through the copper traces.

7: Apple used to be an amazing technology company giving the world wonderful computing devices. Today, it's a fashion accessory company. Mac OS doesn't run in the cloud. It is not the core OS that runs our infrastructure or space programs. And lastly it doesn't capture our imagination anymore. That throne has been passed to Elon Musk with Tesla, SpaceX and Hyperloop.

> 2 - docker works flawlessly on macos

Lol, nope. At the job I just left almost all of our team had MacBook Pros. We ran our entire dev stack with docker-compose and wasted so much time dealing with broken docker crap that ultimately stemmed from the fact that you're not running docker natively, but rather in a VM.

I wish I could still query the Jira database. I could throw out tons of specific issues.

I bought a 3000 dollar 2015 MBP with serious heating issues, which I was told is 'normal'. That pretty much killed any idea I had that I was dealing with quality devices.

I knew they were fragile; I didn't know that using the CPU at full performance for more than a few minutes would be a problem.

>On linux it's always ctrl.

Unless you want to copy from the terminal, where it is ctrl + shift + C.

I highlight the text and middle mouse button for paste.

Didn't you say you liked command-line interfaces, not GUIs?

I'm not sure how using a mouse to select text means that you're using a GUI. I spend most of my working day in an Emacs window, which is ostensibly text, but it doesn't mean that there aren't cases where selecting some text with the mouse is faster than moving the cursor to the text by retyping it. People have been using mice with computers before windowing systems were even invented.

And on a Mac, you do Cmd+C. Neither maintains uniformity.

On macOS, it's very simple, consistent, and intuitive:

GUI menu shortcuts are invoked with Cmd, and that includes copy/paste.

CLI shortcuts work as always (and there's no overlap, because of the extra Cmd key).

It's ctrl-C, not cmd-C, to exit a process.

I'm also a software developer and I also have a Linux/Mac mixed environment, and I feel no difference between the two.

To be honest, the Mac does many things really differently. Yes, it feels and sounds like double-think, but I do everything differently from my Linux desktop, and I can do the same things at the same speed, if not faster.

On average, I use them equally, and can do the same things on both.

One footnote: I don't install anything via Homebrew, or anything massive that installs itself deep into macOS. For that stuff I have a Linux VM, which I rarely fire up.

Maybe they also have an unpublicized reason to build their own - the backdoors on Intel/AMD/Qualcomm/IBM chips. But building their own isn't a guarantee that they won't put one in for "concerned" parties.

Find me a 15" 16:10 or 4:3 laptop with a *nix based OS, 4 cores, and 16GB of RAM and I'll switch any day. Unfortunately, there's only one choice for all those requirements right now.


This is what I have used since 2016 as my personal machine. Comparable in price to a MacBook Pro and easily outperforms one. Specs on mine:

Intel Core i7-6700HQ (4 cores, 3.5GHz)


15" 4K Display (1080p is standard, I upgraded. No regrets).

GTX 970M

256 GB SSD (system drive) + 1TB hard drive

Installing and running Ubuntu has been a breeze.

Like an MBP, it has an aluminum body and weighs less than 4 lbs.

The XPS 15 and Aero 15 are very *nix compatible. I think the XPS comes in a Developer Edition that runs Ubuntu by default.

The Dell 7520 DE?

Think pads?

My hero. Pretty much I'm the only one in the whole webdev team who prefers Linux over Mac :D Fortunately, now that we have Figma, nothing ties me to Windows or Mac.

I'm a fan of using a Mac and a Linux VM. I use "Spaces" to switch between macOS and Linux.

> Not much, at least, not much that benefits me as a customer.

Why would it matter to you, as a customer, what chip is inside your computer? As long as it can run your programs, why do you care?

See also: Microsoft and Windows on ARM

> in my experience apple software is flat out inferior and OSX is the worst

It is almost absurdly bad. The only OS that still hangs, freezes, and crashes regularly. It is like Windows 98 quality wise and seems to get worse instead of better.

The whole UX is also insane. Every feature is hidden behind some obscure keyboard shortcut that you have to google or you just get used to working with this useless toy os.

The terminal is garbage. Everything is slooooow as fk (typing, mouse, etc).

It is shocking that the internet industry has standardized on working on this garbage when they run Linux on their servers and would be far better off developing on the software they actually use.

It just demonstrates the cult mindset and horrible lack of real technical proficiency in the industry.

> The only OS that still hangs, freezes, and crashes regularly.

Tell that to my Windows 7 box. It crashed last week because I tried to copy a PNG on the desktop into an Outlook e-mail message.

(IT still hasn't cleared Win 10 for company-wide deployment)

> some obscure keyboard shortcut

FWIW, according to the old Human Interface Guidelines, basically everything was supposed to be discoverable via the menus, with the keyboard shortcut shown right there in the menu item.

The Keyboard pane in System Preferences lists additional keyboard shortcuts and lets you easily change them system-wide. But yes, granted, searching online reveals quite a few more shortcuts.

I'm just wondering how Windows or Linux are better in that regard in any way??

You can't easily share media between the Apple ecosystem and others. You can't access some Apple services at all from Linux.

There's no way to run userscripts/WebExtensions on iOS. Your device is no longer a user-focused tool to access media; your browser is closer to being a "smart TV" than a customizable information explorer and augmenter, essentially controlled by no one.

You're giving up an awful lot for the sake of convenience, trends that if amplified could irrevocably change the character of the Internet for the worse.

I also don't get what's missing from a modern Linux desktop, especially since nearly everything is on the web these days.

> what's missing from a modern Linux desktop

Networking, printer stuff, graphics stuff that works immediately after installation, without one having to search for various problems & fixes on the Internet. And that doesn't randomly break after kernel upgrades.

Because of running into networking & graphics driver problems every now and then (and having to revert to older kernel versions), and problems with printers & drivers, I feel I would never never never recommend Linux to people, unless they enjoy troubleshooting things and learning new stuff.

I have used Debian almost exclusively for ten years, not counting the last couple months where I've been doing Windows 10 for some contract work. I can't remember the last time I ran into problems with networking, printer setup, or really anything except suspend/hibernation, barring one astonishingly cheap ($168 at Walmart) machine that had a weird integrated Bluetooth/wireless/something else card without a free driver.

That being said, Windows 10 is great for regular-user stuff --- really great, in fact. It's only for development that it's sometimes a little awkward, and it's really not bad. If I were working with .NET/MSSQL/IIS more, I'd love it: Visual Studio is a nearly perfect IDE on a powerful machine. Debian is still better for me, though.

Well, I suppose there are lots of people who never ran into any problems. I've been using Ubuntu and Mint mainly, and usually that works fine for me too (with most laptops & desktops I've had). In one case though (maybe 3 years ago), on fairly new hardware IIRC, I had to test various kernel versions until I found one that was compatible with the laptop's graphics, and then take care not to accidentally auto-upgrade to another kernel version. In another case, networking in Ubuntu didn't work after installation; I installed another distro instead and then everything (incl. networking) worked fine directly. Finding printer drivers that don't just print random garbage characters is usually super frustrating, I think, largely because the printer companies' websites have terrible UX.

I think one is safer with slightly older hardware (laptops and printers), because then the Linux people have had time to look into bug reports and incompatibility problems and fix them.

Nowadays when buying a new laptop, I always websearch for the laptop name + "Linux problems" or something like that, to see what bugs & incompatibilities other people have reported already. And then maybe I decide to avoid that laptop. But ... I wouldn't expect my parents or most other people to do this. Instead they'll buy a "random" laptop with new "unknown" hardware, and then there'll be a 20% ? risk that networking or graphics won't work for them? And I think they'll need help to get the printer working. ... And with all that in mind I feel I need to slightly warn them about Linux.

Um, I have never had to download fixes for networking or graphics; Ubuntu worked out of the box for me.

With AMD's open drivers, or even Intel's, there are virtually no remaining graphics support issues on Linux.

I'd be amazed if you even managed to break networking after kernel upgrades in the recent 5 years.

You know that OSX and Linux both use the same printer stack (CUPS) right?

> I also don't get what's missing from a modern Linux desktop

Adobe software, for one.

I have also yet to find an IDE as nice as Visual Studio...

Many of JetBrains' IDEs are better, and they're cross-platform. I switched to Rider after getting sick of constant white screens of death, crashes, locking, and perf issues in VS.NET. Rider is on par with VS.NET/R# for C# and superior for pretty much everything else except visual GUI designers, although I haven't used one seriously in over a decade.

I'm very skeptical but I will try out JetBrains, thanks for the pointer! :)

EDIT: These all seem to be .NET development? I use Visual Studio mostly for C++... for that would you recommend CLion?

Yeah, you'd use CLion for C/C++ development (though AppCode includes C/C++ support as well). I haven't used that variant myself, but they're all built on the same high-quality core platform and share many components, so I'd expect it to provide a quality experience for C/C++ as well.

I just gave CLion a whirl. It's decent, but unless I'm missing something massive, it's nowhere close to Visual Studio in terms of being a good experience IMO. The debugging experience is so rough in comparison. Aside from the fact that the shortcuts are pretty non-standard (F5 is "copy" instead of "debug"??) here are some issues I can see right away:

1. Artificially long delay when I hover on variables before I can see their values. This drives me insane.

2. I can't view the "raw" values of a variable; I can only see whatever it decides to show me. (So if there's an std::string, I can't examine its fields; at least not by default, if there's any way.)

3. The first time I tried running the debugger, the process stays hanging in the background. I couldn't kill it; I had to manually detach it from the debugger. This was literally the "hello world" example it came with.

4. The completion database for their hello world (all it includes is <iostream>) took quite a while longer than Visual Studio's did. I can't imagine what it's like for actual projects.

5. It initially told me I can't debug because I didn't have a project? So I had to reload the CMake project. This makes no sense.

6. There seems to be no filtering of private variables that I can't access (in the completion UI)? Maybe I'm missing something here.

7. There seems to be no obvious "immediate mode" in the debugger. I'm not well-versed enough to know if GDB has this feature, but regardless, whatever you have to type into GDB to get this feature would be more painful than in Visual Studio (where you literally just type the expression into the Immediate tab and press Enter).

8. Even the cursor hiccups repeatedly. Not that Visual Studio's is perfectly innocent in this regard either, but CLion's stuttering is far more pronounced, far more frequent, and far less excusable (see https://imgur.com/a/Huo2V).

This is just after a very quick use. It seems to "work" in the same way Eclipse does -- the functionality is "there" (maybe even more than what Visual Studio has), but it's just clunky and doesn't feel smooth or "integrated" (the 'I' in IDE).
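On point 7, for what it's worth: GDB does have a rough equivalent of the Immediate window, it's just far less discoverable. A quick sketch, assuming a process stopped at a breakpoint (the variable and function names here are made up, and the std::string internals depend on your stdlib; these are libstdc++'s):

```
(gdb) print my_vec.size()             # evaluate an expression (may call into the inferior)
(gdb) call my_func(42)                # invoke a function directly
(gdb) print /x some_flags             # show the raw value, hex-formatted
(gdb) print my_str._M_dataplus._M_p   # poke at std::string's internal fields
```

That last one also partially answers point 2: you can get at the raw fields, but you have to know the implementation-specific member names, which is exactly the kind of friction the Immediate window avoids.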

I watched the CLion sales pitch video after reading the recommendation. Binging around for the features shown I then discovered Visual Assist[1] which seems to add comparably nifty code completion/generation etc features to Visual Studio. I'm demoing it just now, it's definitely worth a look if you have an itch that needs scratching.

It's better to add bells and whistles on top of a rock solid foundation such as VS than the other way around imo.

[1] https://www.wholetomato.com/

I used Visual Assist X with VS2008, actually. Functionally it made things better, but UX-wise it was a bit sluggish and glitchy from what I remember. If I get a chance I'll check out the newer versions (thanks for mentioning it), but somehow I expect something similar this time around, haha.

Interesting, okay thanks! I'll give it a try.

I don't know if it's still the case, but I can honestly say with a straight face that at various points in the past decade I've used NetBeans for C development and it was fantastic. My understanding is that a lot of the NetBeans C/C++ code was ported over from Sun Studio, which is why it is so solid and mature.

Microsoft released Visual Studio for Linux [1] a little while ago. I find it to be excellent; it gives comparable functionality to Atom but with a lot of the stuff you get with plugins already built in.

[1] https://code.visualstudio.com/

That's Visual Studio Code, not Visual Studio.

VSCode is a good text editor, but it doesn't compare to the full Visual Studio IDE.

Out of curiosity, what are the killer features of the full IDE over Code?

I'd probably suggest debugging being one of the main killer features.

I'm not a VS/VSCode user but I am a Sublime Text and PhpStorm user. You can debug in Sublime Text but I'm not sure you'd want it to be your primary debugger. I'd guess VSCode/VS might be similar.

And again taking Sublime Text and PhpStorm as an example, PhpStorm knows your code far, far better than Sublime Text ever would do and this makes navigating your code quicker and easier.

> There's no way to run userscripts/WebExtensions on IOS. Your device is no longer a user-focused tool to access media, your browser is closer to being a "smart tv" than a customize-able information explorer and augmenter essentially controlled by no one.

I have (personally) never had the desire to write/run a userscript on iOS. If my device being a “smart tv” lets me focus more on other areas of my life (i.e. projects, hobbies, and career) rather than fiddling with my tools, then that’s a trade off I’m more than happy to make.

I suspect many other people share this point of view.

As usual, a lot of the innovation you probably take for granted happens through side channels, user-favouring features that would never appear unless they can be "unauthorized" add-ons. Like ad blocking. I understand that some people value convenience above all else, but maybe you never experienced the bad old days when everything was locked down. It's good that Apple still has competition and others are conscious of what might happen.

>"unauthorized" add-ons. Like ad blocking.

There are iOS ad blockers on the Apple App Store, and extensions for MacOS Safari. I'm not sure how much more authorized it can get than that unless you want Tim Cook to hold your hand through the process.

The point I didn't make clearly enough is ad blockers probably wouldn't exist if others hadn't introduced this kind of feature through userscripts and add-ons. If only Apple can enable these abilities, it's up to other ecosystems to innovate and define access.

In my opinion, the most important fork in the road for the Web right now is whether content stays transparent or becomes opaque. Today I can filter, transform, organize, and augment content accessed through my browser, and even remix it, all enabled by the inherent openness of the technology, which suggests to the larger society that more can be done. It's an open question, though, and the technology could be changed to restrict what can be done with content.

That only works with Safari, of course, because they don't want Chrome having feature parity or, gasp, being better!

It also works with any app that uses the Safari View Controller - like Feedly.

If Google wanted to include their own ad blocker with Chrome they could - it is highly unlikely that they would want to.

For instance, Firefox for iOS's settings screen indicates it offers some sort of "tracking protection" in Incognito Mode by default. I remember reading about how this works on the desktop[1], but I've not had a chance to understand how well this is implemented on iOS.

Still, it's clear Chrome could offer an adblocker if they wanted to.

[1] https://support.mozilla.org/en-US/kb/tracking-protection

Ad blocking is something Safari (and Apple) does better than any other browser vendor, IMO.

A good ad blocker for Safari (on Mac or iOS) just uses the Content Blockers system, and provides essentially a JSON blob with a bunch of rules in it to Webkit, which will then take the requested actions (mostly block, sometimes hide elements, sometimes force a URL to https) internally - the Ad Blocker "app" never knows what sites you're visiting and is never involved in actual blocking.
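For context, that rule list really is just JSON. A minimal sketch (the domains and selectors here are hypothetical) of the kind of thing a content blocker hands to WebKit:

```json
[
  {
    "trigger": { "url-filter": "ads\\.example\\.com" },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": ".*", "if-domain": ["*example.org"] },
    "action": { "type": "css-display-none", "selector": ".banner-ad" }
  }
]
```

The `url-filter` values are regexes matched against request URLs, and actions are limited to a small declarative set (block, hide elements, upgrade to https, etc.), which is precisely why the blocker app never needs to see your browsing.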

> I also don't get what's missing from a modern Linux desktop

A user interface that isn't ugly and laggy.

You know not everyone actually likes the Apple desktop experience either.

What's ugly or laggy?

It's not like there is a de facto user interface, but there are plenty of them that sure don't lag or look ugly!

I bought a System76 Bonobo with their Pop!_OS Linux distro. I'm perfectly happy with my desktop experience.

- decent video calling

- imessage (signal is getting warmer, but the integration is still lacking)

- adobe creative suite

- video editing

- audio production apps

- photo management and post processing (darktable is still garbage)

- sleep/wake and connecting/disconnecting external displays

- not-ugly fonts

It’s kind of funny to compare Linux vs. macOS if your experience of macOS primarily comes from using a Hackintosh. (No sleep/wake support for me! Unplugging my HDMI cable causes a hard-reboot! Etc.)

Which is to say: Linux is pretty good on Macs, whereas macOS is pretty broken (after weeks of intense debugging with other Hackintosh folks) on hardware that would run Linux just fine.

That's to be expected, of course. Linux maintainers care about the fact that it can run on just about anything. macOS maintainers only care about whether it runs on Macs.

You admit you're using a hackintosh, and then blame Apple when stuff doesn't work?

That's one heck of a mental disconnect.

I didn’t say I blame Apple. It’s not Apple’s responsibility to write drivers for machines they don’t claim to support.

I’m more saying that—although Linux’s support for things like sleep/wake on random machines might be sometimes wonky, the best efforts of the Hackintosh community don’t get macOS to support those same machines any better. It’s the machines that are wonky. (Usually by having horrible off-spec ACPI tables that get patched over with drivers the manufacturer releases only for Windows.)

Which is further to say: it’s not really a failing of macOS or Linux if your hardware won’t sleep/wake correctly, or won’t connect/disconnect from external displays correctly. In either case, it’s because your machine is nonconformant to the specs the drivers were written to follow. The only way to succeed in such an environment is to spend man-decades (Linux) or at least man-years (Hackintosh) reverse-engineering the brokenness and writing heuristics into your drivers that patch over it—or by being such a monopoly player that the OEM does that for you (Windows.)

Breaking news: An OS with a limited set of drivers for the hardware it's specifically designed to be used on, doesn't work very well on hardware it was never intended to be used on. Full story at 11.

OP is nodding to infamous problems with actual macs freezing and rebooting when waking if external displays are connected, which is a pretty common occurrence on high sierra devices.

Audio production on Linux is pretty good; well, not for everyone, but it's got a lot to offer. I spent years with Logic / Ableton Live on OSX. I'm moving to a more barebones approach these days with Linux and SuperCollider.

Still nothing for the creative suite except for wine though.

You might be interested in looking at Bitwig Studio for audio production. I've heard it's similar conceptually to Ableton Live and it runs natively on Linux.


I agree that audio production on Linux is pretty good these days. Ardour is a pretty good DAW. The only problems are JACK being a PITA sometimes and the lack of plugins.

I got Bitwig as soon as it came out. It does some things fantastically! The apt-like package manager is badass. I haven't tried it again in about a year, but every time I've used it, the massive instability reminds me of the mediocre DAWs that drove me to Live in 2003.

The reason I went with Bitwig was to support anyone doing anything that wasn't quite Live, because everything is made with Ableton now and I can hear it. Also for the collaborative features, which didn't work when I last tried them. Moving to SuperCollider has solved all those issues for me. It's also cross-platform and free.

I suppose you only have looked at free Linux software, as opposed to paid-for software?

The only thing I have a hard time trusting Intel with is the way they have handled questions around their Management Engine. I think they have done a poor job convincing us that it's safe or at least benign.

It will be interesting to see if Apple includes a secretive management engine. If they do, then the speculation that it's required by one or more governments will be dialed up to 11.

With Apple's documented interest in owners' control of their devices and in delivering the required documentation for it, they won't tell us what is included in the chips and what's not.

Fortunately, Apple regularly publishes their iOS Security Guide as criddel mentioned below and there is a vibrant Apple chip reverse engineering community that does microscope analysis of Apple's chips [1].

1. https://www.anandtech.com/show/11596/techinsights-confirms-a...

Apple has done a very good job with documenting iOS security:


If they included a management engine and produced a document like that, I'd be satisfied.

I always see this document linked to on HN even though it's a (relatively) high level description of what is happening on the device.

Besides, there is absolutely no guarantee that Apple isn't just being "semi-open" with their security mechanisms. In other words, who is to say that there are no redundant, undocumented mechanisms in place?

But I will concede that Apple is ahead of everyone else in this regard. It's just that I think Apple being ahead doesn't mean that the information they're providing is enough.

You're right that a whitepaper isn't enough to be really sure. That takes disassembly of binaries and then you are still stuck with judging the trustworthiness of the hardware and I'm not sure how you solve that problem.

Apple is probably going to produce a similar document that details at a high level how the security works. If they follow the example that they set with iOS devices, what they won't do is provide the source code for these components.

I mean, of course, it's still their right to release or keep any source code they produce. We shouldn't be expecting them to release the source code, but even a whitepaper is better than what Intel is doing.

How's this different than going back to PowerPC architecture? When they switched to Intel, I recall there being a lot of talk of benefits to running on the same chips as everyone else.

From what I've been hearing lately about Apple, the impression I keep getting is that Apple doesn't think of Mac as a product with a specific target market, but more as a sort of just-a-really-really-big-ipad. The closed garden approach worked well with iOS, where for the most part people just play throwaway games, but I really don't see that being a viable underlying philosophy for laptops, where interop with digital plumbing is important.

Apple is in a much different state now. Making your own chip requires significant resources, and before, they didn't have the leverage or the talent to get the chips they wanted built. Now, they have a trillion dollar product (iPhone) and a mature chip development team that many consider the best in the industry, and they can throw tons of money at the problem.

The benefit of running the same chips as everyone else was about leveraging the investment of other products into the Intel architecture. Now they're doing that, just with their own chips.

From what I've been hearing lately about Apple, the impression I keep getting is that Apple doesn't think of Mac as a product with a specific target market, but more as a sort of just-a-really-really-big-ipad.

As a long-time Mac user who's still basically happy with the product, this is what worries me the most long-term. YMMV, different strokes, go in peace with Ubuntu, but for me personally, macOS has been my favorite desktop Unix, hands-down, for about fifteen years. There's nothing that I want to do on even a semi-regular basis that I can't do, and a lot that would require, well, heavy adjusting if I moved to Linux. (Yes, I've used Linux within the last couple of years.)

I also love iOS and the iPad, but it's not a general computing platform; it's a computing appliance. It's very good at what it does, but by design it's difficult-to-impossible to do things on it that Apple doesn't want you to in ways that aren't true for macOS. (I'm sorry to those of you who had to disable SIP to recompile your own version of Apache, or who are infuriated you can't replace Finder with ratpoison, but you know that's not what I'm talking about here, right?) If "Project Marzipan" is about creating a new UIKit/AppKit hybrid that allows developers to create codebases that run on both iOS and macOS, that's already a little worrisome; if it's about "letting iOS apps run on macOS," as some of Gurman's reporting has it, that's a lot worrisome. I have a lot of apps that exist for both macOS and iOS, and in every single case, the macOS version is more capable. And iOS's "sandbox everything" model--and, I suspect, attitudes it engenders--make every app feel like an island not just in terms of data but in terms of functionality: there's much less of the "learn the basics of one app, learn them all" feeling that makes macOS, well, macOS.

If the Mac line moves to Apple's A-series chips, that's...not necessarily bad, but if it's being done in conjunction with sweeping software changes, it makes me extremely uneasy about the line's future in a way that even the Touch Bar doesn't (and trust me, I do not take the Touch Bar as a good sign). I'm not planning to switch platforms any time soon, but I'm starting to wonder if maybe I should buy an inexpensive Linux-compatible laptop so I can, you know, practice. Just in case.

I'm not too worried about Marzipan in the short term. I'm thinking the goal of that project might be to get devs who make pro apps on the Mac side working on a codebase they could easily port over to the iPad side, seeing as the iPad Pro is seriously lacking in app support compared to the Mac because it's unprofitable to maintain those apps.

In the long term this is a play to push Cook's iPad Pro post-pc idea, yet again...

> When they switched to Intel, I recall there being a lot of talk of benefits to running on the same chips as everyone else.

They switched to Intel first and foremost because PowerPC was not going to get them the performance/power ratio that they wanted. It was holding them back in laptops. For example, Apple never shipped a laptop with a G5 chip because it took so much damn power to run a G5. Intel was basically their only option.

A secondary benefit, which they marketed heavily, was the opportunity to boot your machine natively into Windows. They thought this would help overcome the reluctance of "switchers" who were worried that they would miss something about Windows.

These days, the role of the OS as a gatekeeper is basically gone. There's greater diversity in client OS (Windows, macOS, Linux, ChromeOS, iOS, Android, etc.) and most anything important is available through a browser or a native app.

And Intel is no longer their only option. They've proven they can run big businesses making their own chips.

I do wonder how this will affect VM performance on Macs though. I don't see how it can be maintained, unless Apple chips are so much faster than Intel that they can absorb the translation overhead.

> most anything important is available through a browser or a native app

I guess that depends on what people think is "important". If by that you mean Facebook and Youtube, sure. But if we mean things like compilers, imho it would be a net loss if the only way to compile anything for any Apple OS was through Xcode/Swift.

OS as a gatekeeper is still present in a lot of corporate stuff. You'd be amazed the number of people who are still running IE because of one reporting package or what not.

That said, there is no reason in 2020 they couldn't dual-boot to MS's Windows on ARM, which is already shipping, supports 32-bit x86 apps, and should be polished enough to handle legacy software to the limited degree most users need.

I have to say this isn't how I expected Intel to go down. It is basically microcomputers killing minicomputers all over again (which was before my time, TBH).

The x86 emulation is far from reliable. It's far more likely that corporations will just get the Intel laptop, because the processor is simply a small fraction of the total price. When you're buying a Dell laptop that costs $1300, spending another $200 for the Intel version is a drop in the bucket.

Well, certainly I think part of it was that unifying on one chip with the rest of the industry provided some economies-of-scale advantages, though rumor has it that Apple has always paid a little bit more for its chips so it could guarantee certain things and dictate terms.

I believe the main driver, though, as pointed out in this CNET article https://www.cnet.com/news/four-years-later-why-did-apple-dro... was that IBM could not deliver a PowerPC chip powerful enough to also meet the other constraints. At the time, one of Apple's biggest markets was notebooks, and its sales there were exploding. IBM was unable to deliver a lower thermal envelope for its portable chipsets, on top of the performance issues.

I think this more than anything else pushed that reality.

It doesn't need to be a closed garden. Look at what Microsoft is doing with x86 emulation on ARM.

We should all not forget that if IBM, in the 80s, controlled its hardware and software stack like Apple does now, we would not have the modular desktop PC and we would not have Linux.

I consider the extreme vertical integration of Apple a bad development (Apple has recently been trying to control even rare-earth metals). I like the market to be "modular", with many different vendors making replaceable parts that fit together like in the desktop PC. And I wish for a future where I still have a choice, instead of being forced to use iDevices for everything from scientific computing to consumer media players.

...we would not have the modular desktop PC and we would not have Linux.

There’s probably no way to support that counter-factual. We could easily have ended up with some other open-architecture personal computer and open source operating system.

There is actually: BSD. In 1991, BSD transitioned to open source.


Which of course gave us FreeBSD in 1993

That's just one simple example I can think of. Intel, likewise, was never tied exclusively to IBM as a chip manufacturer; it could sell chips freely to IBM's competitors.

Also, the reverse engineering of the IBM platform is what made Compaq computer successful in the mid 80s.


I don't think even a quick examination of the histories involved here would yield any other result than open hardware and software was always going to be a big component of desktop computing.

These things ebb and flow anyway. Historically, what you usually see is that for at least a decade (sometimes more, a lot more, and sometimes dramatically less) vertically integrated solutions are favored, as they bring the most harmony to the average customer of a given platform ('it just works!' was a slogan of many firms, not just Apple). Then, as complexities are mitigated or eliminated, these things tend to open up.

I believe mobile devices are going through this phase. I think in another decade or two you will have a situation where you start seeing really usable open alternatives to the major platforms, even though right now it does not seem intuitive or obvious how that will work.

To play devil's advocate.. While you may trust them more, you can't deny they have had many significant software security issues of late. How could consumers trust the security of their chips?

A bug is bad. A business model based on selling your personal information to third parties without your explicit knowledge -- far worse. A bug is a mistake -- not excusable, but the _intent_ is much different.

The admin access without a password bug -- fixed in 24 hours, and it still required physical access to a machine.

That's not to diminish the seriousness of the flaw, but that's far different from Facebook's (and many others') nefarious, continual and, one might argue, malicious exploitation of personal data.

I feel like Apple actually cares deeply about privacy. Look at Tim Cook's personal life for example -- extremely private person -- someone, given his life story, who probably appreciates privacy more than most.

No company is perfect and no company should get a free pass for negligence, but I think Apple has earned the benefit of the doubt.

Everyone has had security problems since I started in the industry 25 years ago. -everybody-

Given software !== chips and their ARM chips are - hands down - the best on the market... yeah, I'd think people would trust them.

Given the amount of evidence you presented, how could anyone refute your opinion?

OP raised a valid argument - Apple's QC has been crap lately and trending downwards. There's a lot of feedback out on the net to support that line of thinking.

There's been Apple-bashing "feedback out on the net" since Apple was founded.

Most of it has been laughable. The stuff you're talking about is firmly within that laughable camp.

If Apple's QC were "crap" you'd see that in customer satisfaction ratings for their #1 product: the iPhone. But guess what? You don't see that. And there is no downward trend.

Shipping macOS so that anyone can just YOLO log in as root?


Storing the full disk encryption password in plaintext on the disk?


How do you explain these except as atrocious, downward-trending QC?

I do not deny that those were both terrible, awful mistakes (and I'm upvoting you for correctly including them in this thread). I would note that they were both related to what is (perhaps unfortunately, but them's the facts) a very minor product for Apple. And that they're both fixed. And that Apple's security overall is still unquestionably far superior to that of its competition. Ask anyone who has to administer both Macs and PCs for a living (like me).

My comments above still stand.

Software, you could make a case, that's been a bit of a mess lately.

Hardware, though, you've got nothing. Where have they dropped the ball on their chips?

It's all one company. The reputation, stellar or tarnished, affects the entire brand. Their software QC has been bad lately, and it makes me question their hardware QC too. Those bugs/issues are often harder to find, though -- if they exist -- they might not. But I think it's fair for me to question.

For a company that makes a large volume of a very narrow range of products they have a pretty good track record when it comes to physical hardware, and so far their chip work hasn't hit any major snags.

When Apple's custom CPU has its first F00F-type bug then we'll see how they handle a real issue.

> I think that long term they are much better off controlling their entire hardware stack.

But CPU/GPU aren't the only chips Apple is sourcing from 3rd party. Think wireless modems like WiFi, LTE, etc.

Close, but Apple already produces its own WiFi/BT chipsets (the S1/S2 etc. SoCs in Apple Watches, for example, and the W1 chipset in AirPods and new Beats headphones), and is rumored to be working on an LTE/GSM chipset.

I think Apple is intensely interested in controlling the entire stack.

Why? A lot of these parts are commoditized. Are they really getting that much better battery life from their own stuff?

The W1 chip seems to have better performance and power than any other chip. The A10 and A11 seem to have better performance and power than any other chip. I see a trend..

For performance. They can outcompete others by moving to bespoke hardware. My iPhone X's CPU is roughly on par with a 13” MBP in performance, which is bonkers.

Absolutely! That's why the Apple Watch, for example, is an unqualified success compared with any other wearable. Qualcomm, which is the only real rival in that space, hasn't come out with a new wearable-sized SOC in forever, leading to the downright abandonment of many wearable product lines from pretty much everyone except Apple.

No, that success is because Apple is literally a fashion brand. How many of the people wearing Apple Watches do you think could even tell you what hardware is inside? Sure, it runs smoothly, but that's not why they have it on. Nobody is streaming HD video or running heavy applications on it.

With the W1, battery life and bluetooth that actually works as it should have from the start.

Apple claims the W2 chip is 50% more power efficient than existing Bluetooth/Wifi chips on the market, and is 85% faster than the W1 chip for data throughput.

Existing Bluetooth hardware has so many hardware and software bugs that it would [hyperbole] a [large thing]. Apple should absolutely control the BT hardware/software stack, there is no other way they can offer the Apple Experience without it.

Produces? Are you sure you don’t mean designs?

Apple is as yet fabless, correct. By choice, clearly, as they have enough cash on hand to build and operate more than anyone else on the planet if they wished.

If Apple were to make their own LTE modem, does that mean we might get one in a MacBook Pro?

Isn't wireless hardware in general more like a IP/Patenting/Licensing issue?

Not if you want to do it better than the competition. We already have a zillion radio chips that all do about the same thing. If Apple (or any other company) wanted to be 'better', a superior design (think efficiency, power, speed, etc.) would give a unique advantage that cannot easily be copied by a competitor. The same goes for the RTOS (or lack thereof) that you need to run those. For some implementations, a softMAC vs. fullMAC design makes sense.

The Apple Watch now has a W2 chip that does WiFi and Bluetooth, and maybe even LTE (which is how they crammed it into such a tiny device).

Slowly but surely, they've put together all they'd need for a computer. The T2 chip in the current iMac Pro handles encryption and NVMe storage on the SSDs, as well as camera image processing, the Secure Enclave, and the SMC.

I don’t know what the current status is but they have definitely looked at making their own baseband chips before.

What I'd like is for a company like Apple to provide me (and others) the tools & APIs we need in order to take advantage of that integration, without locking us into their UI paradigms. I would like everything to Just Work™; I wouldn't mind a Macintosh laptop; I would hate to be stuck using the macOS UI.

A lot of what Apple really does is hardware integration. Let me use the hardware to run a free OS, and also take advantage of the excellent hardware integration. Let me use a free operating system, StumpWM, emacs and some nifty APIs which let me talk to the integrated hardware system.

Give me, in the words of Steve Jobs, a 'bicycle for the mind,' not a tram to nowhere.

It's the old story of convenience vs. freedom. I myself am willing to trade a bit of convenience, plus the need to tinker with the system, for a big chunk of freedom. Not only for me: by using Linux and making it better, hopefully it's also a (small) investment for future generations.

This way I hope they will still have the choice to run software they (can) trust on their own devices. If everyone just went for convenience, desktop Linux, at least, would disappear as an option.

I have a Mac; it's a great piece of hardware, and it works great for basic stuff -- surfing the web, responding to emails, writing documents -- but for my work I find Linux so much simpler to use.

Linux just works. Sure, it isn't pretty like macOS, but it lets you get your work done fast. I know perfectly how everything is put together; if there is a problem, I can usually fix it in seconds -- just edit some config file and done. It has far more tools for developers. If you need to install software, you have the package manager; sure, on macOS you have brew, but it has some problems, and a lot of developer tools either don't work on macOS, don't work correctly, or are similar but not quite the same as on Linux.

In fact, on my Mac, most of the time I work either with remote servers or with VMs on my computer, because for development work Linux is far better.

The problem is that Apple’s focus on this distracts from the software that defines the user experience. The MacOS ecosystem is both beautiful and deeply troubled.

On the other hand, with Marzipan (shared libraries on iOS/MacOS) and a single processor architecture, you could say that Apple is putting 'more wood behind fewer arrows' and will have more engineers developing software that reaches MacOS as a result.

> The MacOS ecosystem is both beautiful and deeply troubled.

So, shifting it to ARM and letting universal apps in the iOS App Store run on macOS in addition to iPad opens up the Mac to a metric fuckton of developers who are happy to make cross-platform apps, since the iOS App Store is SO lucrative.

Apple showed with the iPad, which was a totally new product category, that the sheer number of iOS developers could push it from a platform with zero apps to the best tablet in the world (with an incredible app ecosystem), with universal apps whose UIs adapt to the screen they're made for.

> I wouldn’t be surprised to even see them make their own display screens.

I think they're already working on that: https://www.bloomberg.com/news/articles/2018-03-19/apple-is-...

There was a post on here recently that said Apple was interested in making their own displays.

Maybe they’ll make a TV that doesn’t sell information on everything you watch and say to all and sundry.

interesting, i only use my ipad as a remote to the avr so i can play tidal from my hifi... it's gathering dust otherwise.
