Why I left Mac for Windows: Apple has given up (char.gd)
695 points by shlema on March 5, 2017 | hide | past | favorite | 759 comments



I used to be a hardcore Windows guy...

Then 10 years ago I got a Mac. I never went back...

But what am I saving money for right now? To build a nice PC again.

Mostly because of the exact reasons in the article.

I have a fondness for Apple... but they have definitely lost their way. First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era)... then they became a computer company who also made a phone. Then they became a computer company who also made a phone and a tablet. Then they became a phone company who also made computers and tablets.

Now they are a phone company that presides over the death throes of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "let's put this phone feature on the desktop".

It makes me sad, as a Mac fan. The hardware is getting worse. The decisions are getting dumber every time. I won't buy a laptop without a MagSafe or similar connection; I have kids and animals, and MagSafe has saved a laptop more than once. To remove something that was as core and identifiable a part of their computers was just a stupid move and served no purpose.

They don't listen to the industry or the consumers anymore, they stick their fingers in their ears and pretend to know best.

Jobs was hardheaded, but reasonable. Cook is trying to emulate the hardheadedness but fails to recognize the reasonability needed to balance that.


I too have become annoyed recently at really basic stuff not working in macOS, like I can't drag and drop a picture out of the Photos app into Pixelmator to edit. In fact, dragging and dropping a picture out of Photos into any application is completely broken and doesn't work. It's as if the Apple QA team no longer tests to make sure their applications play nice with each other in a way that has been fundamental to desktop computing since the desktop OS was invented.


At the same time, Windows 10 on my Dell laptop will heat up, spin the fan noticeably, and something called "Service Host: Local System" will consistently use 48-52% of CPU when the laptop is sitting doing nothing at all, with no applications open. I really can't understand what it's doing and have a hard time taking it seriously as an operating system.


Check the resource monitor to see which service that corresponds to: it's likely Windows Update, since they changed it to mine Bitcoin or something about a decade ago, when it went from being I/O-bound to single-core CPU-bound.


This is just a guess -- being very late to the conversation -- but in case someone even later is curious, this may be the well documented case of the .NET framework recompiling libraries in the background after an update.

It seems like it will never finish, but it does ... eventually. However, the lack of any UI, any indication of its presence or its progress, and also that killing it will just make it come back later are all pretty hostile.

(Google for more info.)
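If anyone wants to force that background recompilation to finish rather than wait it out, the .NET toolchain reportedly lets you drain the queue in the foreground with ngen.exe from an elevated prompt (the path below assumes 64-bit .NET 4.x; adjust for your installed framework version):

    %WINDIR%\Microsoft.NET\Framework64\v4.0.30319\ngen.exe executeQueuedItems

It still takes a while, but at least you can watch it work instead of wondering what is eating the CPU.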


I'm not sure what it is, but it's amazingly inefficient: I notice it most when downloading VMs from https://modern.ie because as soon as I start a test VM, the updater will run, and then it's multiple hours plus reboots before a browser in the VM is remotely usable.


Is this a serious statement, that windows update mines Bitcoin?


No - just my boggled speculation about the kind of work which would tie up a modern CPU for hours installing a few GB of files. (SHA-256 could hash each file in a couple of seconds.)
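For scale, that back-of-the-envelope claim is easy to check; a minimal Python sketch (the 100 MB buffer is an arbitrary stand-in for an update payload):

```python
import hashlib
import time

# Hash 100 MB of zeroes to gauge raw single-threaded SHA-256 throughput.
data = bytes(100 * 1024 * 1024)
start = time.perf_counter()
digest = hashlib.sha256(data).hexdigest()
elapsed = time.perf_counter() - start
print(f"hashed 100 MB in {elapsed:.2f}s (~{100 / elapsed:.0f} MB/s)")
```

On typical hardware this finishes in a second or two at most, which is the point: hashing alone can't account for hours of pegged CPU.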


An MS engineer posted a lengthy blog post on why this is the case. The ridiculous time has to do with insane system-state and dependency checking for every update and every one in history before it, since Windows Update allows you to add or remove every little update ever shipped for the OS.

The rollup updates that MS moved towards, even in Windows 7 now, are supposed to help with this. However, a completely new solution is really needed to replace this antiquated one, like many other lingering parts of Windows.


Decompression. They probably weighed compression time vs file size and realized that every byte not downloaded is money saved on bandwidth.


No.


Come to think of it, that's actually an interesting monetization scheme. "You can use our software for free, but when your computer idles, we use x% of its computing power for mining cryptocurrency" (or whatever profitable endeavor). It's of course disastrous from an environmental perspective, but still interesting.


And if it doesn't idle enough, we'll make it.


This is very likely it from my experience as well.



It happens on OS X. Kernel freezes as well.


Svchost.exe is a bonkers bit of design: by running multiple "services" as DLLs loaded into one process, it goes against the simple, basic separation of concerns that OS-level processes provide. This makes it extremely hard to pinpoint which service is faulty.

Apparently the magic runes are

    sc config wuauserv type= own


Just go into Task Manager, Details tab, right click on the columns and add "Command Line". Now you can see what each svchost does.
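The same service-to-process mapping is also available from the command line, if you prefer; tasklist can list which services each svchost instance is hosting:

    tasklist /svc /fi "imagename eq svchost.exe"

You can then match the PID shown there against the hungry process in Task Manager.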


That helps a little, but fundamentally the runtime of a module is still "billed" to the svchost.exe process as a whole, so if a particular one is at 100% of a CPU, you can't tell which module is causing the problem.


This kind of bug is not unique to Windows.


Seriously, as I am typing this right now on macOS, the "kernel_task" is using 50% CPU for no apparent reason and the fans have been running for > 20 minutes.


> I really can't understand what it's doing

Probably backing up your files to government servers :-)


I used and loved iPhoto, and then I had a great first impression of Photos until I tried to do the thing you describe: DRAG AND DROP. Apple forums provided an answer (I would call it a work-around) with their usual condescending tone: hold Alt while dragging, and you can at least get a copy of the photo/video of interest into a folder in the Finder.


I just tried it in Photos 2.0 (macOS Sierra 10.12.3) and it works as expected: drag'n'drop from Photos to Finder copies files into the selected Finder folder.


I recently discovered that the detailed and amazingly accessible Aqua Human Interface Guidelines from their golden years have been completely gutted and replaced with the macOS guidelines, which cover only the basics. They've abandoned years of accumulated wisdom. Luckily the old versions still float around in PDF form; the first few chapters are just good, timeless UI design advice.



Photos overall is a trainwreck. Options are hidden and hard to find, anything beyond "tag this image with your auntie's name" is way harder than it should be... and that's without even mentioning the dreadful "I'll quit when I want to, not when you tell me to" backend process.


I've noticed this as well. I use a "workaround" in the form of Yoink: https://itunes.apple.com/us/app/yoink/id457622435

Yoink seems to trigger a file export in Photos. Why that doesn't happen with other programs is anyone's guess.

I put "workaround" in quotation marks because this is really how I do all dragging and dropping between programs in macOS, since I tend to run stuff in full screen.


Is it even possible to choose a photo out of your photo library from a browser's file upload dialog? I don't think it is, when the photos are in iPhoto. How many steps does it take to get a picture off your camera, into iPhoto, then exported, and finally uploaded to Craigslist?


Scroll down to the 'Media' section in the sidebar.

Select 'Pictures'.

There you'll find all your pictures in Photos.


Photos should show up as an icon in the sidebar. Rather than navigating to the actual files, I think you're meant to access them this way. It's a good idea and used to work better in my memory; these days it can be a bit finicky to find the photo you want.


It is possible and very easy. Just drag the photo from iPhoto to the upload dialog.


You can't copy and paste an image out of Photos, either. It's really quite ridiculous.


1. Open Photos.

2. Select Photo.

3. Edit Menu -> Copy.

4. Open TextEdit.

5. Edit Menu -> Paste.

It is ridiculous. Ridiculously simple.


After some experimentation, it turns out my problem was that I expected - ridiculous, I know - that it would copy image data in some form that's comprehensible to Gimp, Firefox and Thunderbird. So I now rather suspect that Photos is using some new API, that Gimp/Firefox/Thunderbird aren't, and that this new API is, for no particular reason that I can see, not remotely backwards-compatible.

Feels like this forms a data point for the thread - since meanwhile, on Windows, I have a couple of programs in my bin folder that I use fairly regularly that I compiled in 2006.


Yes you can - I've just done it.


Instead of being condescending you could have explained how to do it.


Sorry, it really was a question of selecting a picture and then selecting copy from the menu. I wasn't trying to be an arse.


I know this isn't the point of the comment, but Yoink works perfectly as an intermediary for this. I use it dozens of times a day for various things but this one in particular is a lifesaver.


> really basic stuff not working

Try selecting text in a Windows dialog box, such as the "About" box. It doesn't work, it never did as far as I can tell.


But isn't it sad if macOS has to be compared to Windows to relativize its flaws? People were using it because they felt it was BETTER than Windows, not because "both are meh".

Also, arguably, selecting text in a dialog box (instead of copying the whole text) is a much less used feature than drag&drop of photos.


Ctrl-c works though.


Is dragging a photo to the Desktop and then into the app really a huge deal? Just curious, because that works fine for me.


It's not a big deal, but it's death by a thousand cuts. These little things that used to be part of the refined experience of OS X are turning the "just works" part of their slogan into "damnit, why can't I".


That's called a workaround. It's not a big deal in the grand scheme of things, but it's not something you'd call a desirable design trait for a desktop app.


And then deleting or moving the photo from the desktop afterwards. So, three steps as opposed to one. For a feature (drag n drop) that's all about convenience.


Ah, that Apple mantra. It's not hard because it's hard, it's hard because you don't do it OUR WAY. Do it OUR WAY and everything is easy.


Buy a breakaway USB cable? I've had MacBooks for 10 years, and I'm ecstatic to see MagSafe gone. Apple chargers have always been shit, and Apple laptops have never been compatible with off-the-shelf battery packs. Moving to USB-C and detachable, replaceable cables is a huge step forward.

Also, what PC alternative has a MagSafe equivalent?


People complain that Apple uses proprietary ports. They ditched MagSafe for USB-C and people still complained.


I think MagSafe was at the bottom of the list of "bad choices of proprietary ports by Apple".


I can now charge my MacBook with the same cable I use to charge my phone. I can now charge my laptop from both sides. I can trickle charge my laptop in my car now. It's making my cable life so much simpler. Yes, I have other devices that have yet to catch up to the C spec, but with the leading laptop manufacturer pushing so hard for the C port, I don't think it'll be that long.


MagSafe is the only proprietary port on my pre-touchbar MBP.


Good point, and it's because of the status quo bias. When an alternative is proposed, people look at what they lose, and since we don't like loss, we say, "No, I don't want that".

Thought experiment: say Macs had USB-C for 10 years, and the latest Macbooks switched to MagSafe. There'd be a lot of complaints about Apple locking us in to their walled garden, and how we have to buy a proprietary overpriced charger now, which can't charge other devices, moreover.

One technique I try to follow is an anti-knee-jerk reaction: if everyone has a knee-jerk reaction about something, I remind myself of the advantages of the alternative.

Ultimately, the only true test is time. Wait 2 years and let the emotions cool, and you'll know if the outcry was justified.


> Thought experiment: say Macs had USB-C for 10 years, and the latest Macbooks switched to MagSafe.

Then you would presumably still be able to use the remaining USB-C ports to charge it, with the MagSafe merely an option. That would not get many complaints. If they violated that they would deserve the complaints.


No, in this thought experiment, when they have MagSafe, they charge only via MagSafe.

The thought experiment is intended to be the same as reality, just in the opposite direction.


That's what my last sentence is for. If they go pure-magsafe and get rid of all USB-C charging ability then they deserve the complaints.

But if they add magsafe while also having USB-C charging then it's much better than either option on its own.

And keeping USB-C charging does not require extra ports, because you need the USB-C ports anyway to do USB stuff.


The question isn't whether they deserve the complaints, or about the minutiae of charging, but whether people think that going from X -> Y is bad and going from Y -> X is also bad. If so, they're just being change-averse: ignore them.


In general it's a good thing to think about.

But you were using it to make an argument that the complaints were invalid.

So I'm going to reply to that specific argument, by pointing out that when Apple changes the IO ports to USB-C and enables USB charging, it does not require them to remove the magsafe port.

In other words, apple went X -> Y. But adding Y reused IO ports and could have coexisted with X. Even though Y might be better than X, it was a false dichotomy in the first place.


There's no need for MagSafe if you have USB charging. It's redundant.

It also entrenches the old standard rather than making way for the new one. If you have a laptop with both USB-C and MagSafe, and you need a second charger, you might buy a MagSafe charger if it's cheaper or to use with an older Macbook. Whereas without MagSafe, you'll buy a USB-C charger — the new technology.

If you want to do a transition, you have to go all in. Having both the old and the new port merely delays the transition, causing more pain in the long-term. Get it over with.


That would be a valid argument except that it implied one port fewer, since you have to use one of the already scarce ports to charge. People like me were waiting for an extra port and got one less.


Apple could keep usb-c and MagSafe, but they didn't.


Microsoft's own Surface Book uses something very like MagSafe:

https://www.youtube.com/watch?v=5bnIIuS0New

http://www.cultofmac.com/174993/microsoft-steals-apples-mags...

I've also been using MacBooks since 2006, and I love MagSafe - or at least, MagSafe 1 (haven't tried the more recent MagSafe 2).


Try to keep MagSafe 2 connected in any circumstances other than flat and stable. Sitting on the couch or lying in bed? No way, it constantly disconnects.

Good riddance.


I never had a MagSafe 2 laptop. I love MagSafe 1. I have no idea why they went through with changing it.


Probably something to do with shaving a billionth of a millimetre off the thickness of the MacBooks.


Probably, but they could have changed the adapter so the cable left the connector sideways, like in MagSafe 1.


So that you have to buy new chargers.


Apple owns the patent on MagSafe. That was the one feature I actually liked on my MacBook Pro. I have seen too many chargers wear out, or wear out their ports. MagSafe also made it easier to connect the charger.

Think how nice it would be to have the same functionality with USB ports....


Currently at work we use (mostly T 4xx/5xx series) Lenovo laptops with rectangular power connectors [1]. I've had mine for ~2 years, and it was used for ~a year prior. I connect/disconnect the charger multiple times a day and both ports are as sturdy as new. Others report similar experiences.

[1]: http://shop.lenovo.com/ISS_Static/options/US/images/17912239...


FWIW, some appliance connectors have used magnets for well over a decade in Japan and elsewhere in Asia (it usually looks like https://www.liandung.com.tw/Japan-m/LT-515.html). I owned an electric kettle with one of those.

I don't understand how Apple could patent this and prevent anyone else from implementing it. In any event, there do seem to be 3rd party implementations for USB-C if you're keen on the feature: https://www.macrumors.com/2016/01/04/griffin-breaksafe-magne....


Apple patented MagSafe and how it works, specifically. This doesn't preclude other magnetic breakaway cables if they don't work exactly the same way as MagSafe.


Can you recommend magsafe-like USB cable that works with 2016 MBPs? When I looked for something like this I found only a cable made for the 12" macbook that could not charge my machine.


I pledged on Kickstarter to receive a pair of these (still in manufacturing), but maybe they'd help?

https://www.kickstarter.com/projects/branchusb/magneo-first-...


I don't mind the lack of MagSafe connector on the new MBP. What I do miss however is the green/orange light to know when it's fully charged.

That's something they could have kept (somehow) in the shipped USB-C charging cable.


The Surface Pro 4 line has a MagSafe equivalent (while avoiding the Apple patents), which also allows for distributing ports a la USB-C.


I was worried about the loss of MagSafe too. Now, with almost three months with the new MacBook Pro, I have become very happy with USB-C, and I hope that iPhones and iPads will make the switch too. A USB-C-only world would make so many things easier!


Switching the iPhone to USB-C would definitely be the best choice for the consumer.

It is, however, not the best choice for Apple.

I think it should be pretty obvious what I believe will happen.


If they included a breakaway USB cable, most people would have been happy. For some reason they didn't.


If they actually reinforced their cords so they didn't fray and fall apart within 6 months, I would be happy. Their stubbornness about correcting the cord situation is exasperating.


Proper strain relief goes against the 'make everything thin' ethos.


>Also, what PC alternative has a MagSafe equivalent?

The Surface Pro uses a MagSafe style charger.


My Surface Pro 4 has a MagSafe equivalent power supply. Works quite well, albeit not as well as the mac version.


Apple is still a computer company, because phones ARE computers.

For the majority of the population in developed countries, and almost all of the population of developing countries, smartphones are the only computer people have or need.

I agree that Apple have shifted focus away from "making tools for people to create things and solve problems" towards consumption-oriented mobile devices. But those devices are still computers, and they're wildly successful.

If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone than they would for the entire Mac lineup.


Sure, phones are computers. Also, most people use phones as their primary way of connecting to the Internet.

But what does that have to do with me (or, it seems, the GP poster)?

I want a great computer on which I can do what I usually do on my computer which is mostly programming, but I guess I could be fancy and say "content creation" instead.

The fact that phones are computers has exactly zero relevance when it comes to me choosing a new laptop.

I used to be a hardcore Mac user, because the computing environment was superior to any other choice, but that's not the case anymore so my next computer will be a Linux laptop.


> If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone than they would for the entire Mac lineup.

True. However, creators and developers (who, in my experience, almost always use a desktop) are important for the iOS platform. Someone has to write those native apps. Therefore it doesn't make sense to ignore them for too long.


Developers flock to the platform with more users. Which means Apple actually does need to focus on iOS to get more people to keep using Macs.

Those of us who are iOS developers will keep bitching about it but have no alternative, so we'll keep on using Macs as long as the iOS ecosystem is doing great.


While this is generally true, it's a little more nuanced. For example: Android has a larger market share (by a lot, globally), but apps are frequently developed for iOS first. Why? Because it's easier. Device fragmentation plays a large role in that, but part of the cost of a project is dealing with the development ecosystem (good or bad). As Apple makes development less appealing (because of substandard hardware / OS X), Android will seem like a better option.


Developers don't develop for iOS first because it is easier. They develop there first because iOS users pay money for things/apps. Android users do not.


I'm an Android user and I pay for things (?) and apps. Are you saying that I'm the only Android user who pays for apps and "things"? Or is your statement just a gratuitous cliche?


Statistically the average revenue per android user is lower than per iOS user. It's not a binary either/or.

Source: have been a mobile developer for 6 years.


Is the average revenue per user a relevant number though? When it comes to choosing what platform to develop for, wouldn't total revenue be a more important number?


The difference between total revenue and average revenue per user is important, but I think the answer is still iOS.

http://bgr.com/2016/07/20/ios-vs-android-developers-profits-...


It's well understood by many startups and established companies that developing for iOS often takes priority because there's a larger group of users willing to pay more money than their Android counterparts.


My point is not whether Android or iOS is better. It's that the key to keeping iOS developers around is NOT to make a better laptop but by putting more effort into iOS so that its market share grows.

No matter how great Xcode becomes, if everyone starts using Android, people will all jump ship to Android, which means many iOS developers will switch to Windows.


Sadly that's more a race to the bottom since Google still don't seem particularly interested in actually making the experience for Android developers any better.


iOS apps are also more profitable than Android.


No one makes money selling phone apps.


"Apple's App Store has paid over $50 billion to developers"[0]

[0]http://www.theverge.com/2016/8/3/12371006/app-store-50-billi...


You don't have to sell an app to make money from it


I live in developing countries. It's true that smartphones are the only computer people have or need, which is a problem.

Mobile devices are still consumption devices. It's hard to create complex, multi-layer structures on phones: the screen is too small, the processor is too underpowered... take your pick. But creating a good SaaS or a good UX on a phone is darn tough.

Here comes speculation:

Websites/services in developing countries are painfully bad.

This is due to a number of factors outside technology, but it's also true that universities here graduate people who never grew up with big screens and unwalled gardens, never grew up creating rather than consuming, and don't have a sense of what it's like to go from blank screen to working prototype to polished platform.

Neither do most people in developed countries.

But at least computers are widely available in developed countries so that the x% of kids who have an affinity for these things end up getting started early.

Someday mobile devices will match and surpass desktops and laptops. But that day is not today, and meanwhile developing countries are years behind not only in physical infrastructure, but in online infrastructure as well.


You're correct that phones are computers, however:

Solidworks, AutoCAD, etc. don't run on phones

Photoshop/Lightroom/Illustrator don't run on phones

SPICE, VHDL, Verilog don't run on phones

InDesign and other DTP programs don't run on phones

Emacs/Vi don't run on phones

There are a host of actually-useful programs that are completely unsuitable for running on phone-computers, but that do run well on laptop-computers and desktop-computers.



Yes, but on iOS you still can't compile or run any of the programs you write.


> Solidworks, AutoCAD, etc. don't run on phones

"AutoCAD" is a brand that applies to a large number of different CAD-related programs, some of which are mobile apps.

> Photoshop/Lightroom/Illustrator don't run on phones

Both Adobe Photoshop and Adobe Photoshop Lightroom are in the Google Play Store for Android phones.

> SPICE, VHDL, Verilog don't run on phones

Well, I can find circuit design apps that support a subset of verilog, but this is basically right.

> Emacs/Vi don't run on phones

There are ports of both on the Android App Store, supporting phones.

> There are a host of actually-useful programs that are completely unsuitable for run on phone-computers, that do run well on laptop-computers and desktop-computers.

There may be applications that phones don't have the processing capacity for, or that work best with certain I/O peripherals that phones aren't often used with, sure. But the array of classes of apps that are completely unavailable for phones is smaller than you seem to think.


SPICE, VHDL and Verilog typically run on some 56-core machine in a server room, at least once you are working on a sufficiently complex design.


> For the majority of the population in developed countries, and almost all of the population of developing countries, smartphones are the only computer people have or need.

Not true. I am from India, a developing country. There are 5 working people in my family. None of us can replace our notebooks with our phones. For all of us, the phone is for communication and the notebook is for work.


> because Phones ARE computers.

This is like saying trucks are cars, too.

Different form factor for different purpose.


Yes, this, thank you. The "but it's a computer!" sophistry falls apart the moment you take one look at the user interface. The reason people are starting to get annoyed with Apple is their continued phone-ification of the desktop.

It. Doesn't. Work.


Your first line is true but only in a trivial sense. In this sense Fitbit is also a computer company. That would be a stretch, but not by far.


Wholly agree with you -- in fact I think we're at a turning point in smartphones catching up to being full fledged computers.

I've written some thoughts about this previously, if you would care for a detailed elaboration: https://guan.sg/apples-2017/


Thanks, that is a nice write-up. I especially like "Imagine a world where the way desktops are meant to function is by plugging your ultraportable into any available set of peripherals: monitor, keyboard and mouse. The interchangeability and convenience of that necessarily implies that the brand and type of monitor / keyboard / mouse you plug your ultraportable into doesn’t matter, as long as they’re technically compatible." I have written on the same topic, and that is the future I see. Microsoft had the right idea with Continuum, and I expect Apple, Microsoft, and Google to all try to make this work on the mobile side while leaving it to 3rd-party manufacturers to develop USB-C compatible monitors that act as a hub for keyboard, mouse, USB drives, etc.


Thanks! Would love to read, if you have a link!


> Imagine a world where the way desktops are meant to function is by plugging your ultraportable into any available set of peripherals: monitor, keyboard and mouse.

I'm skeptical that this will work for everyone.

Everything is tradeoffs. When "must be tiny" is the top priority, performance and battery life necessarily suffer.

There will always be people - from gamers to video producers - whose top priority is performance. And for computer geeks generally, the weight difference between a laptop and a phone is not compelling, but having four times as many cores would be.

The portability difference between a mainframe and a laptop is immense - it changes your working life. The portability difference between a laptop and a phone is much smaller. It means you can do things spontaneously, because you can always have the phone with you. But if you're planning to work, the difference is minimal. Especially if the smaller form factor means you have to plan to have peripherals wherever you're going.

Eg, I can take my laptop to a cafe or a park and work. I couldn't do that with an "ultraportable" unless I bring my own peripherals. So it's actually less portable for the situations I care about.


Saying phones ARE computers is being completely pedantic.

Of course they are. So are microwaves at this point. So are toilets at this point.

I fail to believe that you just "missed" the real point, that the form factor and the user interface are (and need to be) quite different from a phone to a desktop.


> If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone than they would for the entire Mac lineup.

Consider:

- iPhones will suck if developers stop making software for them.

- You need Xcode to make software for iPhones

- Xcode only runs on macOS

If developers start abandoning Mac en masse, that's going to hurt iOS a lot.

Also, if photographers, videographers, animators, designers, audio engineers, etc start championing Windows or Linux as their platform of choice, the average person will follow eventually. "All the pros I know use X" persuades a lot of average people.

And if people aren't using Mac, there's a lot less reason to use iPhone as opposed to Android.

I hate to use the word "synergy", but that seems to be what they're risking here.


No, phones are TVs, books, consoles etc. Typical computer usage, at a guess, is an Excel file. I would wager that the majority of business usage is some kind of document-editing stuff, and a phone by design will never ever be able to do this.


> Phones ARE computers

Computers that need jail-breaking, though.


I switched to Mac at about the same time as you. I can't see myself ever going back to Windows but if Adobe ever ports their application suite to Linux, I'd switch immediately.


I'm on a similar timeline as well. Thought I'd never go back, but have you checked out Windows 10? The gap has closed a lot.


I think the gap has increased even more with Windows 10.


I can't agree here. Windows 8 was not usable as a web developer, for me. Windows 10 is something I don't mind using as my experimental box (I still use OSX for day-job tasks).


Windows 8 was worse than Windows 10, which is worse than 7. I should have specified: to me, Windows 8 and 10 are both poor, both much worse than 7. To be honest, I had an EASIER time with Windows 8 (company-provided laptop) than Windows 10.

Windows 7 was the pinnacle windows experience for me and it has gotten worse ever since.


What about Windows 7 was so great? The start menu? Windows 10 has a 5-second boot-up time, most all annoyances anyone has online can be configured away, and it is insanely stable (I haven't had it crash on me once yet).


It took a familiar UI and improved it. It added features in an intuitive way. And many of the features of Windows 10 could've easily been added to 7.

> Windows 10 has a 5 second boot up time

Man, I wish. My 10 system never boots up that fast. Meanwhile my Windows 7 desktop takes 15 seconds to boot up on an SSD. Those 10 seconds just aren't that much of a feature for me, especially since my laptop/desktop are typically in sleep mode anyway.

> most all annoyances anyone has online can be configured away,

Yes, because we should have to do work to eliminate baked in ads and processes that share my information with who knows who.

Also, you can't even intuitively FIND settings. The Control Panel has some settings the Settings app doesn't have, and vice versa. It's a mess. Why can't they all be in one place? Mac? One place.

I can't even get Windows 10 to update. Instead I have to constantly kill a rogue update process that decimates resources because I can't get a basic update to download and install properly.


Well, it crashed on me. Not even a blue screen of death, just spontaneous reboot.


This is one of those annoyances, probably not a crash but one of those infamous 3am forced reboots while you're playing CSGO no less. This is one of those things you need to google and configure away.


By "configured away" you surely mean installing third-party tools full of adware until one nearly does what you want.


That's quite an assumption


Thanks, I'm quite good at those.


I didn't necessarily mean that the gap has closed just due to improvements at Microsoft ...


Check out my Windows-based node dev environment:

http://i.imgur.com/nJhqlvX.jpg

I am far more productive now than I was on OSX. Switched over when the Windows Subsystem for Linux came out (you can see my ZSH shell running in the corner there)

Jelly yet?


>far more productive now than I was on OSX

Really? In what ways?

This is interesting to me, because the main reason I'm a mac zealot (for all but le screengames) is because it's mostly indistinguishable from working on a linux box, without any of the negative aspects of running linux on a workstation. Any time I try to do work on my Windows machine it feels very handicapped. WSL helps, but it's still one more layer of abstraction.

SSH keys/git always feel like a hassle on Windows, rsync was slow as fuck for me on WSL (but works fine in Cygwin), any kind of scripted behavior is a hassle. Package management outside of WSL is pretty annoying, choco is okay, but nothing compared to brew/apt.

At best I could see myself being equally productive on Windows, but not without significant effort.
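For concreteness, the day-to-day package story looks roughly like this on each platform (git is just an arbitrary example package here; this assumes Homebrew, apt, and Chocolatey are already installed, and choco generally wants an elevated prompt):

```shell
# Installing the same tool via each platform's usual package manager
brew install git        # macOS (Homebrew)
sudo apt install git    # WSL / Debian / Ubuntu
choco install git -y    # Windows (Chocolatey, from an admin shell)
```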


I'm on a beast of a machine that I could only dream of when I was stuck inside the Apple ecosystem.

Granted, there were a few annoyances I had to find workarounds for, but nothing insurmountable; we are devs after all. There was a lot I had to relearn the-windows-way, old habits I had to drop, new ones I had to adopt. But with experience comes expertise, and you get used to the new way. I now have that handicapped feeling when I use OSX, death by a thousand restrictions. Funny thing, I never noticed them before; I assumed that's just how things were.

You don't know how far you can push and customize an OS until you make it your main environment and force yourself to stick to it until you overcome and find your new workflow. Pussyfooting around with Windows while you use OSX on your main machine is not how you test the OS for fitness, nor how you change old habits. You have to go in it with an open mind, you're not going to find OSX in Windows, you WILL have to relearn new ways of doing things, then you have to dive into the deep end of the pool head-first and attempt to hit your stride.


I understand your point about having to give an honest shot at Windows - I bought a Win10 license some months ago. It's fine.

Until I wanted Emacs.

Until I wanted to uninstall software.

Until I wanted to setup a VM in VirtualBox (set up a Win2k12 Server VM tonight, and it simply failed to boot 3 consecutive times, then booted up in a "repair" mode, and then failed to boot again, just to boot up again after the 5-6th try. Wonderful).

Until I wanted a damned clock that keeps time (always have to manually stop/start automatic time in the options).

I really want to like Windows but I can't. It feels like it's going out of its way to annoy me. Booting up a fresh Debian install feels so much better... I just feel like I have to understand how things work much more. Linux is a sharp tool, and Windows feels like a clunky bicycle.


The only actual _windows_ problem of those you listed is that uninstalling software is a hassle. That said, it isn't a walk in the park on Mac either.

Can't use Emacs on Windows? Use any other editor. Can't set up a VM in VirtualBox? That's a VirtualBox problem (maybe).

If you go on Windows expecting the same environment as OSX or whatever OS you're on, you're going to have a bad time. Booting up a fresh Debian install feels better because you know what to do already.

Booting up a fresh Debian install, for anyone else, is likely to be an almost impossible undertaking without reading some kind of guide, if you want to set up a proper dev environment.

There are problems in any platform you chose, you're probably just subconsciously sidestepping those in your process of setting up, while the Windows ones stick out to you.

Example problems I notice on Mac that are fine on Windows:

- Docker is extremely slow; I need to run it inside a Linux VM for any kind of proper development
- Window management is horrible
- Installing software

Now, I'm no advocate for any OS, I love running Arch Linux with i3wm, I love OSX and I love Windows. I don't see any reason at all to hate any OS, I can setup my dev environment on practically any platform I could want with little or no difference. The only things that change are the things around my environment, the simplicity of i3wm, the task bar on OSX etc.

In my opinion Linux is the outlier here, which provides the greatest change in environment (not a bad one, mind you, just a difference). Mac and Windows are mostly interchangeable; I can switch between them with little overhead.


Alright, I was feeling quite whiny yesterday. My opinion of Windows is not that bad - I bought a license, which is enough said :)

I don't prefer OS X over OpenBSD or Debian. What I like is that I can mostly just hop from one to the other without doing a context-switch. Things (mostly) work as I expect them to from one box to the other. That's not true for me on Windows (but that is to be expected).

I learned about computers on Windows, from 95 to Vista (briefly touched it and then left for Unix). I used to memorize countless contextual menus and options and paths between each, so that I would see how to solve a problem when it arose and could diagnose it without access to a computer. I still do not have the same ease with Unix.

What I have gained by using Unix is real knowledge about how computers actually work, not only how the OS itself is built. And in my anecdotal experience, typical users of Windows (at work, college and friends) unequivocally understand and know less about computers than typical users of Linux do. That is true of Mac users in general but the effect is less pronounced than with Windows - most Mac users that I know have a basic understanding of the command line.

But take this for what it is: personal experience.


> Can't use Emacs on Windows? Use any other editor.

But... to Emacs users, there is no "any other" editor :)

// That being said, it's possible to run a native Windows build of Emacs, I remember doing that at some point. It wasn't overly nice though, and you have to delve into the whole msys/mingw/cygwin thing.


Stackoverflow answer for how to run Emacs on Windows' WSL.

http://stackoverflow.com/questions/39182483/how-to-use-x-win...
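The gist of that answer, as a sketch (this assumes a Windows-side X server, such as VcXsrv or Xming, is already running; the display number is just an example):

```shell
# Inside the WSL shell: install Emacs and point it at the Windows X server
sudo apt update && sudo apt install -y emacs
export DISPLAY=localhost:0.0   # the display served by the X server on Windows
emacs &                        # GUI Emacs appears as an ordinary window
```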

It's like they don't even try. After a lifetime of looking down at Windows users, most Mac users go into Windows looking for reasons to validate the way they already feel towards it (irrational hatred) and don't allow themselves to like anything about it. They hit a couple of bumps in the road and quit in frustration and use those as excuses for their decision to retreat back into their comfort zone.

Don't be that guy.

I am proud to say that I can fully configure a dev environment and work on any major OS out there. Linux (Debian and Redhat based), OSX, and Windows.

Currently, I'm glad I'm no longer bound to OSX (deprecated OS, overpriced hardware) or limited to Linux (no gaming and no Adobe suite). I'm on a constantly evolving OS run by a forward-thinking company, on hardware upgradable through the next decade. It can only get better from here. I've got no worries.


> Can't use Emacs on Windows? Use any other editor.

I don't understand what you are trying to say there.


My point was that not having Emacs in Windows is a non problem because the editor market is oversaturated. It's not like the Adobe suite where any other product is almost a downgrade, so running a system that can't run Adobe products would be a liability.

Not being able to run Emacs as a reason not to use Windows is like saying; Damn! PulseAudio doesn't work on Windows, guess I'm back to Ubuntu.


I am not the biggest fan of Emacs, but for what it is, this is more on a par with people saying "I don't move to Linux because it does not run Adobe products."

There is no alternative to Emacs, if GIMP does not count as an alternative to Photoshop.


If that's true, then not being able to run Adobe software in Linux is not a reason to not use Linux.


I run Emacs on Windows just fine. Cygwin Emacs + Cygwin Xserver works pretty well. Ubuntu on Bash on Windows is better by some measures though. You might like that one.


This window mess is how you work more productively? Maybe try something tiled?


I don't actually work with all the windows opened or even on the same desktop, they are arranged like that for the benefit of the screenshot.


Looks very awesome. What term app is that?


That's WSL+ZSH running in ConEmu (w/ Solarized Dark color theme).

I also have this theme installed on Windows 10, which is why everything is dark including the ConEmu's titlebar: http://www.cleodesktop.com/2016/08/after-dark-cc-theme-for-w...

I spend long hours staring at the screen; I need a dark theme for eyesight-preserving reasons. I had a dark theme on OSX until Apple decided to give a massive middle finger to the theming scene when they released El Capitan. The built-in dark mode just doesn't cut it. Yet another reason I'm glad I'm on Windows now.


Yes, exactly. I agree on Adobe opening their SaaS cloud to be used on Linux machines. I don't understand their hesitation, but I also don't see it happening any time soon.


>> The new "features" every cycle are more "lets put this phone feature on the desktop"

What is it missing? It's been a while since I really wanted a feature in the desktop OS. All of the 'phone' features (besides launchpad) have made my life a lot easier (continuity, handoff, notification centre, even Siri from time to time).


It's missing high-performance graphics drivers, in the form of either OpenGL or Metal.

It's missing proper support for eGPUs, which are currently the only way to add a high-performance GPU to any currently produced Mac.

Blender, for instance, can't even make use of the GPU in my MacBook Pro because the OpenCL support in macOS is not good enough.


Mac laptops never had that.


I don't miss features; I just miss refinement.

As an app developer who wanted to build an app that challenged what a Mac can do with regards to productivity, I basically had to leave the Mac App Store because they made it impossible for me to make it work in sandbox mode.


Shipping Unix tools (like Bash) from this decade would be a start.

Trying to get modern Unix software to run on OSX is getting more and more painful. Since development is the main activity I do on a laptop, this is quite important.


UNIX being part of OS X is a historical accident, Apple just wanted to save themselves from Copland's failure.


It was the main reason a lot of people (myself included) moved to OSX from Linux. I honestly believe that its Unix compatibility was one of the main reasons Macs got popular with developers in the first place.


Macs were already popular with developers.

Apparently, as someone who favors the Xerox PARC way of thinking, I am not a developer, since I don't embrace the UNIX religion.


That's not what I said. I never meant to give the impression that “developers” necessarily means Unix users. A lot of them are though, definitely enough to be a noticeable fraction. The number of people in this very thread proclaiming the greatness of the Linux subsystem in Windows 10 should confirm this.

You have to admit that around 2003 or so there was a huge influx of technical users to the Mac. My proposition is that a lot of those users were developers moving from Linux (or from an unhappy Windows life, wishing they were using a Unix-based operating system).

I base this on the general sentiment at the time, and I was one of the people who made the switch. I'm sure you can find old posts from me on Slashdot talking about how great the switch was.

Now I'm on Hacker News, talking about doing the exact opposite. I'm moving away from OSX because the Unix experience has become really bad.


I agree, but on these discussions, that subset of developers just gets packaged into developers as whole, as if there wasn't anything else.

Personally I know UNIX since Xenix days, do have a vast experience across UNIX variants, use it in some form in many projects, but rather use OS X and Windows environment.

Strange as it may seem, some of us are actually happier with the GUI developer tooling culture of OS X, Windows, Android, and iOS.


My experience with hardware that comes pre-loaded with Windows is that it is utter crap. Or so it seems - it works fine once you remove Windows and install Linux for example. I have kept none of my previous Windows machines. None. They all decay so fast, BSODs after months of use, constant reboots just because things don't act right, general instability, sluggishness, etc. I have only recently built myself a new Linux tower. Much more pleasant.

OTOH, ALL Apple hardware that I have bought (multiple iPads, iPhones, iPods, Macbook Pros, iMacs) still works as advertised and is in good shape (I always buy and use Apple Care). I have just recently begun to use my Linux desktop more because my 2009 MBP is starting to feel a bit sluggish.

People can whine all they want about Apple (I personally will not buy the new MBPs, wait until next year's model). The truth is they've set the bar so high that anything less than perfection (which IMO is mostly what they gave us during the last decade) is seen as unacceptable. The same cannot be said of Windows - at least according to my experience since Windows 95. I recently bought a brand new SSD and a license for Windows 10. One day after installation, Windows crashed and had to "repair" itself. To be fair, it has been running flawlessly since.


Do you just roll into WalMart and buy the first $300 HP you see?

That's literally the only way I can see your anecdotal experience being truthful.

The PC marketplace is open, like the Android one. There's a lot of cruft to sift through, but it should only take about 1-2 hours of research to find the best laptop at any given time and any given price range for any given use-case.


No. ~1200 PCs and laptops. I understand that it seems like I'm exaggerating. I'm not. Either I have been incredibly unlucky (or stupid in my choices), or Mac hardware is of better quality.

But you're right that it's anecdotal.


>First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era) ... then they became a Computer company who also made a phone. Then they became a computer company who also made a phone and a tablet. Then they became a phone company who also made computers and tablets.

>Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "lets put this phone feature on the desktop"

I don't know much about this area, other than the little I have read, but didn't MS do somewhat the same thing - "lets put this phone feature on the desktop" - with Windows 8? Not sure if they reversed that in Windows 10. Interested to know.


Windows 8 was ridiculous, and one of the things that gave MS a big wake-up call. It's now a different company. Windows 10 is leaps better than 8 in about every way (not without its own issues, of course). Disclosure: I am a MS MVP. MS will listen to me in almost every area I care about.


Interesting. Yes, though I've never used Win 8 (and don't want to), from the little I had read, it seemed like the idea of trying to merge the mobile and desktop paradigms was a bad idea from the start, and doomed to fail.


Windows 8 didn't merge the mobile and desktop paradigms, it put the mobile paradigm over the top of the desktop.

The apps stuff (Windows Runtime) was actually quite well done for a first try, but it was unfamiliar to Win32 users and got in the way. (Only one click away, but people are impatient and resist change.)

Windows 10 does an excellent job of merging apps and traditional programs, though it's not strictly correct to equate apps and mobile. Windows 10 is a mobile operating system that runs apps from an app store, but you can resize apps and run them seamlessly alongside traditional Win32 programs.

You can still run Windows 10 in tablet (Windows 8/8.1) mode if you want to. I don't know anyone who does.


I, strangely enough, sometimes run Windows in Tablet mode and use it as tiling window manager. Works surprisingly well. Try it sometime.


> Cook is trying to emulate the hardheadedness but fails to recognize the reasonability needed to balance that.

I don't think Cook is hardheaded. I think he has no idea how to run a product company. Pretty much everyone from the average HN user to Larry Ellison predicted this outcome. Cook is logistics / supply chain etc, Ballmer was sales, neither had any business at the top of their respective companies. Jobs did it on purpose, in my opinion, because he believed the single most important function initially was to keep Apple running smoothly and to complete the iPhone boom (ie the big product for the next decade was already in place). The only question is how long Apple will stay in the Cook era before getting a product leader replacement.


Death throes, not death throws, fwiw.


FWIW, I have a 2016 13" nTB and have had a few trip-over-the-power-cord incidents. The USB-C cord snaps right out, almost like MagSafe, for me... the laptop will get a nice tug and move an inch, but it comes out even on a slick countertop.


I had a similar experience with the 2015 MacBook. Yes, the laptop did move, but less than 2" before the cable detached.


The original MagSafe design ( http://www.iclarified.com/images/news/9014/31392/31392-500.j... ) was the most ideal design of all of them, in my opinion. I have seen MacBooks slide 20-30cm with the new designs without the MagSafe connector disengaging, all because of the direction of pull.

The only problem was that the original MagSafe design was prone to fraying, and it probably didn't look as sexy as having the cord at a right angle to the computer.


> I wont buy a laptop without a magsafe or similar connection

Check this out: https://griffintechnology.com/us/breaksafe-magnetic-usb-c-po...


I own one of these. It comes close, but it's just not the same.

I use my laptop, closed, hooked up to an external monitor. Nowadays, I spend a lot of time on video conferences. The number of times that my laptop has shut down mid-conference because I tapped the cable under my desk just enough for the connection to drop for a split second (combined with Apple's -- let's call it "interesting" -- decisions around power management) in the last week is way too high.


Careful: BreakSafe is rated up to 60 watts (20 volts @ 3 amps), and has been designed and tested to meet USB-C power standards.


I categorically refuse to pay more money to buy something to add on to an already overpriced laptop that sticks out from my computer's form factor and that should be built in.

Fortunately for me I got a new laptop not long before this new abomination came out. If Apple doesn't correct this gigantic mistake, the next time I get a new laptop it won't be made by Apple.


Man you would absolutely hate to work any job that requires a serial connection then. Because let me tell you about the serial-to-USB adapters I've used... I wish I had the luxury of saying that I categorically refuse to pay more for something that sticks out of my computer that I think should be built in.


If those are permanently-emplaced USB to serial adapters and you have problems with reliability, I recommend checking out Digi.com for their "Serial Servers." We have a couple of customers who've been running faxes from VMs to external modems over Digi One SP units for more years than I care to think about, and they're quite simply rock solid.


I do embedded dev; at any time I have 2-6 serial-USB adapters hooked up to my MBP. They are all plugged into an external 10 port USB hub though, would be a little harder to do it otherwise.

Anyway, back to your point, I most certainly do NOT want DB-9 or DB-25 ports on my laptop. ;)


The operative clause there is "should be built in". And of course I can say "should be" because it has been for years now. Literally just yesterday magsafe saved my computer from being thrown off the desk. The regressions in overall user experience with this latest MBP mean that I'm no longer willing to pay Apple the premium that I've been paying them.


No, the operative clause is "I think". You're talking about your opinion, while simultaneously telling me that my opinion is worth less than yours just because it's different.

It'd also be great if I could have a VGA output, but that ship has long since sailed, hasn't it? It'd also be great if I could have more than two USB ports, but I have a 2015 MacBook Pro. To get four USB ports I'd have to upgrade to the 2016 MacBook Pro. See? To me, the 2016 is better than that one I have. It lets me plug in more adapters, adapters that I've been used to carrying for years. I carry four with me to every client site I visit, and you don't see me whining about it on the Internet.

Regressions in user experience? It's double the number of ports! It's a god damn lifesaver!


The number of people who trip over power cords is almost definitely much larger than the number of people who need VGA output, or need four USB ports (given the easy availability of USB hubs).


I say the OS is getting worse; the quality of the software is nowhere near what it used to be. It's clear that they need to put more effort into polishing macOS.

Apple laptops are still successful for now; they sold wagons of the latest MacBook Pro. But it is not going to last if they don't fix the OS ASAP.


> First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era) ... then they became a Computer company who also made a phone.

You missed the most important stages in between: when they became a company that made cute, candy-colored computers that would match your Volkswagen New Beetle, then when they made a totally awesome mp3 player you could get to match it.


Good riddance to magsafe. I've had two laptops become unusable because the magsafe connection became flaky and wouldn't charge unless I fiddled with it until it rested just so (even contact cleaner wouldn't help). There was no hope of it lasting more than a few hours without constant tending by me to get the connection just right again. I've never in my life had any other computer that failed in that way.

Moving to USB-C was the right choice. Charge cable goes bad? Buy another one for $15.

Everything else is spot on. Apple quality is fast going downhill.


Many years ago, before the age of the iPhone, I had a meeting with about thirty people at Apple headquarters at the request of Steve Jobs. That was my very first visit to Apple HQ.

After half a day of conversations, as we left the building, my very first words to my team were: "Apple Computer is no more. This is a marketing company. They might as well stick their logo on washing machines, microwaves and refrigerators. They know how to market them to be cool and they'll sell millions of them"

Then came the iPhone.

And, yeah, they could have filled homes with Apple appliances. Not sure why they didn't go there. So easy. Not saying it would have been right, but they had the opening and the mindless following to make every home "Apple Cool".


> I wont buy a laptop without a magsafe or similar connection

Not a big deal nowadays; my mid-2014 ThinkPad has a similar connection (it won't ever drag the laptop when pulled). I'd think most laptops should have such connectors by now, "MagSafe" or otherwise?


Which adapter exactly are you referring to here? I have a 2015/2016 ThinkPad Yoga for work, and I don't think it's the same as, or similar at all to, the MagSafe adapter. MagSafe comes off with even the gentlest (but firm) tug, even at an angle. I can't say the same for my ThinkPad.


> the consumers anymore, they stick their fingers in their ears and pretend to know best.

They are though. Consumers will pay for hardware with batteries that die after a year and a half. They'll pay for vendor-lock in and to be locked into the walled garden experience. They'll pay for hardware that will be locked out of the walled garden and unsupported by vendor updates after 4-5 years. They'll pay extra for accessories that give basic functionality to their electronics. They'll pay a massive premium for accessories that are subpar for their asking price if they're endorsed by celebrities.

I'm not a fan of newer Apple hardware and software, but we're in the minority.


Agree. Have you noticed that their phones also suck now? I am locked into their ecosystem, but I can tell the quality of the software is just not there any more.


> Jobs was hardheaded, but reasonable. Cook is trying to emulate the hardheadedness but fails to recognize the reasonability needed to balance that.

I think we can be sure that under Jobs, they would not be in a situation where you cannot charge the current iPhone model with the current MacBook without a dongle. "Oh, it's not a dongle, it's just a cable." -> Get out of my face.


I'm in the same boat. I switched around 2008-2009 to jump on the iOS app wave, but am more interested in Windows since 10 and Azure.

I still think their rMBP 15s are all around the best you can buy. I haven't tried the Touch Bar version yet, though. Also, OS X (or whatever) on a rMBP is still great for surfing the web. The zoom with the trackpad is flawless.


You can try it on any Mac with https://github.com/bikkelbroeders/TouchBarDemoApp. However, the Touch Bar does not seem all that useful to me, and I think that the previous model (2015) is all around a better buy: higher-performance CPUs, standard USB 3-A, MagSafe...


"death throes"


> Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "lets put this phone feature on the desktop"

it's death throes.


> Then they became a phone company who also made computers and tablets.

You're describing the changing world we're living in. Apple is merely reflecting it.


> I wont buy a laptop without a magsafe or similar connection

Do Windows laptops have it, or you're going to use a 2014 MacBook Air forever..?


Recall that Apple Computer Inc became Apple Inc January 2007.


Under Steve Jobs's leadership, no less.


> I wont buy a laptop without a magsafe or similar connection

What about wireless charging? I see magsafe as deprecated and only Macbook air has it now.


Wireless charging is horribly inefficient...IIRC there aren't even any that go over 10W.

Magsafe may be deprecated by Apple, but that's a step backwards in functionality. One cannot deny that using USB C for charging is inferior in terms of protecting the device.


Thought the same thing, but then saw this:

https://www.kickstarter.com/projects/branchusb/magneo-first-...

Gotta say, the idea of a single USB-C adaptor carrying power and data, with a general-purpose MagSafe-like breakaway, is pretty compelling.


That looks massive though.


Massive?

.. there's no pleasing some folk.


Given it's intended to be a permanent fixture in the side of your laptop, at the size of a (keyboard) key I'd say it's pretty big, yes.

Add that to the one-and-a-half key-sized optionally permanent part (unless you're going to buy one for each cable, it's fixed until you need the safety feature) and it's a huge thing sticking out of the side!

To be honest, I question the usefulness anyway. I've pulled my MacBook Air off a table (accidentally) by its MagSafe power cable before, and trying it deliberately now, it only works with vertical movement. Side-to-side or directly 'out', it just moves my laptop, even as short and sharp as I can manage.

If I stepped directly on the cable it would be such a directly-out movement; if I tripped, a side-to-side one. Unless my laptop was positioned perfectly such that the MagSafe hung over the edge of the table, and I trod directly down, I can't imagine how it would have its intended effect.

I use it all the time for un/plugging deliberately, but I can live without that, and as above with this Kickstarter product without buying several I'd have to leave the whole dongle plugged in anyway, so any deliberate dis/connection would be a normal un/plug motion.


> intended to be a permanent fixture

Are you sure about that? I've ordered two and I've never thought about leaving it in: for me, it's purely a normal connector except that it could "break" safely.


How many devices did you break because they did not have MagSafe-like functionality? I did not break a single one.

MagSafe is a differentiating feature of Apple's with little to no daily-life-saving rationale for its existence. It's there to be different, not to be better.


3. 3 laptops, broken when I wasn't even there, from a family member or child or pet running by, snagging the cable and ripping it out. 2 were just rendered unchargeable; the third shattered the screen. I'm far from alone in that, as well. It's rather presumptuous on your part, don't you think?


Those computers are like $3000. Even if it only saves 10% of MacBooks from an untimely end, it's still $300's worth of insurance.


10% is a really high guess. I would suggest less than 1%.


Wireless charging needs to be strangled in the crib. As long as we're burning carbon for power, we have no business charging wirelessly until it's as efficient as wired.


You know what fans (the real ones) do? They try to know more about the thing they are fans of, and to understand it. Your comment shows nothing of the kind.


Sorry, but who wants to be some lame "fan"? We want the best tool for the job, to get that job done, and we're willing to pay for it. The tools are no longer up to scratch or functioning as expected, and we are voicing our concern.


I couldn't agree with you more on this statement. Said exactly as I feel.


[flagged]


We've banned this account for repeatedly violating the HN guidelines and ignoring our request to stop.


I'm sorry, I re-read GP and I'm still unclear what you're talking about. For as long as I've been using Apple, it has always been because of the superior experience. It's with regret that I have to say I am let down by (in particular) their software quality recently. I've gotten to the point where I'm fairly certain my next phone won't be an iPhone. I can't... understand your line of reasoning... that as an Apple user I should have some loyalty to them if what they're selling me doesn't fulfil my needs. I'm really interested in knowing where you're coming from, but your snarky tone is making that difficult.


>But recently, I realized I’d gotten tired of Apple’s attitude toward the desktop. The progress in macOS land has basically been dead since Yosemite, two years ago, and Apple’s updates to the platform have been incredibly small.

So, like creating a full blown new programming language (Swift)? Or a full new filesystem (AFS)? Integration of Cloud storage directly to the desktop? Siri on the desktop? Saving RAM through memory compression? Continuity to transfer work across desktop and mobile (and different desktops) seamlessly? All things added in the last few years, with a few of them still ongoing.

Sure -- they've totally abandoned it /s.

>Take a look at Sierra: the only feature of note is Siri, which is half-baked as it is, and the things that did get ported over from iOS are half-done too.

That's hardly "the only feature of note". But even so, I wouldn't want Apple to continue to change much in OS X, except refining things.

>and so I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro.

So, you've been merely 3 years on the platform, but have an opinion on how Apple "pivoted its attention" regarding OS releases based on just a couple of OSes? Because I've been here since 10.2, and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).

If it discussed the state of the Mac Pro and Mac Mini, the post would actually have a leg to stand on...


>So, like creating a full blown new programming language (Swift)?

Byproduct of Apple's work on mobile.

> Or a full new filesystem (AFS)?

Byproduct of Apple's work on mobile.

> Siri on the desktop?

Byproduct of mobile again! (By the way, ask Siri on Mac to set an alarm or interact with homekit!)

But, I take your point. I tried mostly to detail Apple's lack of attention to developers, but perhaps didn't place enough emphasis on that in the post. Microsoft is really trying with developers, and it's readily apparent.


"Byproduct of mobile." If you need evidence:

Swift appeared in iOS first and still is not great on the desktop at all.

APFS (not AFS) is the default in iOS, and still in feature preview and pretty unusable in 10.12. It's "expected" to be released for the desktop in 2017.

Siri on the desktop is mostly useless, as said here.


Right - that's simply a result of what I say in the post about Apple focusing on iOS first, then letting the byproduct of that filter down into OS X. That's an end result of the Mac team being merged into the iOS team. Maybe it's not bad, but it just means slower overall progress.


For APFS, I don't buy that it has anything to do with prioritizing one operating system or the other. APFS wasn't ready when Sierra was released, which is why it was included as a beta with limited functionality. It seems to be ready now, so it will be included in the next version of iOS (10.3), which is still in beta, and presumably the next major release of macOS.

The only difference is that it's being released in a minor update to iOS but will probably wait for a major update to macOS. But that makes sense, because macOS allows the user to customize partitions and filesystems, and allows running other operating systems - including Boot Camp, which Apple itself must provide drivers for, and Linux, which is waiting on Apple releasing documentation for their filesystem, as they have promised to do. By contrast, iOS has the same partition layout on every single device and no cross-OS compatibility concerns. Thus APFS on macOS requires more work and has greater risk of failure. For both reasons, regardless of Apple's priorities, it would be logical to start with iOS. Even if Apple did add stable APFS support to a minor macOS update, they probably want to auto-convert users to APFS eventually, but doing so in a minor update would definitely piss off technical users; they could make it opt-in to start with, but it's easier to just wait.

The next version of macOS is only a few months away; really not a big deal.

(I agree that Apple seems to be prioritizing iOS in some areas; I just don't think APFS is one of them.)


>Siri on the desktop is mostly useless

Which makes perfect sense: Siri belongs on mobile first. We talk to our phones all the time anyway, and hands-free is crucial. For the desktop, not so much.

>APFS (not AFS) is the default in iOS, and still in feature preview and pretty unusable in 10.12. It's "expected" to be released for the desktop in 2017.

Which makes perfect sense. A constrained environment without an exposed filesystem, like iOS, is easier to convert to a new FS. A full-blown desktop OS, not so much. That's why it needs much more testing and development to deliver the latter.

But they ARE doing this testing and development.

(Btw, APFS is not "the default in iOS" yet. It will be when 10.3 is released -- it is still in beta atm).


I agree it makes sense.

But it also indicates that Apple isn't really designing for the Mac as the author points out.

I mean, Siri is the biggest feature Apple is touting for Sierra (http://www.apple.com/macos/sierra/). And as you point out, it's something that doesn't even make all that much sense on the Mac.


I may be in the minority here, but I find Siri just about as annoying as the old Clippit thingie in Microsoft Office. It's the first thing I turn off.


>But it also indicates that Apple isn't really designing for the Mac as the author points out.

Maybe they don't but I don't see them doing anything major on the iOS side either. Both platforms are quite mature by now anyway.

(And Apple was never about revolutionary new designs and jumps. Back in the day of the iPod and early MBPs etc, we cheered and waited anxiously for at best incremental changes -- now it has USB, now it has a different touch wheel, now it has a color screen, now it has wifi, now it does video, etc, year over year).

>And as you point out, it's something that doesn't even make all that much sense on the Mac.

Yes, but people (and pundits) have been asking for it all the same to appear on the Mac anyway.

And "talking to your computer" has been a thing from the times of 50's sci-fi stories even.


I'm sure they'll figure it out at some point, but the mystique of Apple only releasing something when it was "ready" has been over for a while now. So not only did it come later, it was less evolved than the Windows implementation a year prior. Cortana felt useful from day one for various tasks. Siri? Only in a subset of cases.


Siri is a humiliating embarrassment. There is no other way to describe it. The only time I ever use it is for comedic relief, or to demonstrate the obvious superiority of either Google's, Microsoft's, or Amazon's implementations. To Apple, it's just another in a long series of marketing-driven features rolled out into some perceived white space in the competitive landscape and then abandoned. It's ridiculous that something so potentially game-changing to enable non-technical users, which is theoretically the market Apple thinks it owns, has gotten such short shrift at Apple, even in the face of growing (and superior) competition.


> The only time I ever use it is for comedic relief, or to demonstrate the obvious superiority of either Google's, Microsoft's, or Amazon's implementations.

Curious here - can you give some examples? Siri seems to work better (on friends' devices) than "OK Google" on my Moto X.


I don't know how well Cortana works, but Siri works very well for English. It is passable in Swedish for every-day language (like "påminn mig att köpa mjölk ägg och smör" will be fine) whereas anything remotely technical or heaven forbid English will be a jumbled mess. Google is understandably leaps and bounds ahead of Apple, and it's not surprising that there is essentially nothing in the field of machine learning coming out of Apple, whereas Google is basically _the_ ML company on Earth right now.


Apple was never about revolutionary new designs and jumps.

http://theoatmeal.com/comics/apple


Yeah, our desktops are just the thing we use to make those; no big deal, really.


> APFS (not AFS) is the default in IOS

Well, it will be when that beta iOS is released.

Besides, it makes sense for any sort of file system to be released first in such a limited manner.


Uh, you realize they make more phones than desktops, right? By like a factor of 10 at least :)

So the limited manner would be to release it to desktop?


In what sense is Swift not great on the desktop at all?


I think a pretty good summary is: about 3 years ago, Microsoft realized it had a couple of failed strategies.

1) Catering to casual users. The Xbox One launch is probably the best example. They wanted to become the home media center, so the entire launch and discussion was centered around sports streaming and social media. + Kinect.

2) Desperately trying to get a foothold in mobile.

I think they realized people weren't going for either. Sony focusing on games and hardware killed it initially in sales, and still kind of does. Apple and Android had two corners of the market and MS could make a space for a third.

They pivoted, targeted "professionals", and that's what you see in their approach for Xbox & PC. I think it's overall good, but saying Apple abandoned developers is sort of desperate fear-mongering.

Apple's strategy has always been pretty consistent, focus a lot of resources on making a good ecosystem. More recently it seems that's changed to "focus a lot of resources on making the best systems, and allow for a great ecosystem" with sponsoring third party monitors, and apple's home automation front.

I think it's still a pretty good strategy, but with Windows catching up with some "pro" features, like their own premium desktops and laptops, and adding in old features like "workspaces", people jump to say Apple "abandoned" them when it's still kind of sticking to the same successful strategy.


If you'll look even further back in the Mac OS history, you'll see that they didn't really care about developers back then. Their entire focus was media and publishing with education on the side. I think it hurt them in the long run as they've always lagged behind with software and gaming and they only managed to make some headway in the PC market when they devoted resources towards making life easier for developers.

I've always liked Mac OS better than Windows, and even though Windows 10 is finally a usable system for development, I still don't trust Microsoft. I've been burned too many times: by a promising system that then gets hobbled by OEMs adding their own adware and other BS, by Microsoft caving in to the bean counters and hobbling their systems in order to fight Chinese piracy, and by some C-suite manager's bright idea that they should waste resources on some "synergy" play instead of just making the OS, and other products, better.

At the core, you still have an ecosystem where the people making the computers can't make enough money on their own because Microsoft and the chip makers are squeezing them. Which makes for a sorry experience in the long run, even if Apple has its own problems.


I've been an enterprise developer on Windows since XP (now on 10), and I don't think they've come very far in terms of what's going on with your dev PC. I still wouldn't want to use my dev machine for anything but development because of how much shit the .NET dev environment does to your system.

Don't get me wrong, Azure is blazing amazing, and so is .NET in general, but when your enterprise IDE still hasn't found a way to function properly when the "Documents" folder is on a network share, then you've got a ways to go.


This is the entire reason I do the majority of my development in a VM. So much software needs to be installed, software that is deeply integrated into Windows, that I don't want any of it on my daily workhorse.


By that logic, isn't iOS a byproduct of OS X? I think people are overblowing the situation. I still remember Snow Leopard being the best desktop OS I have ever used, but Sierra is still far, far ahead of Windows, and I realized that just when I sat in front of a computer with Windows 10 on it.


> Microsoft is really trying with developers and it's readily apparent.

Which has been their approach for a very long time - let's not forget that Ballmer speech. From what I have heard, Apple puts minimal effort into Xcode; the iPhone lead dev here was throwing around all forms of cuss words trying to get CI set up.


Xcode is a sometimes excellent, sometimes shaky experience, still far ahead of Android Studio until the past 6 months or so. The API design of iOS is also still second to none, and part of the reason I continue to develop for Apple platforms. I use Xcode Server for CI without issue; another team uses Jenkins happily. Apple's hardware can be fairly neglected, but the experience of being a developer in their ecosystem is still very pleasant overall.


WSL is also a byproduct of their failed attempt to run Android apps on Windows Phone, and it's the most interesting thing they've released recently, at least for me.


"Byproduct of mobile" doesn't really say much -- if anything.

They support all three on desktop systems, and they have features and APIs that only make sense on desktop systems too.


It says a lot.

The desktop isn't a phone. Forcing phone features into a desktop and slowly turning the desktop into iOS is not a good thing.

We were just fine without an app store. Just fine. I can't think of any of the "use this mobile feature on your desktop" additions that provide any real value...


>Forcing phone features into a desktop and slowly turning the desktop into iOS is not a good thing.

Siri, maybe, but there's nothing about an FS or a programming language like Swift that makes them a "phone feature". Same for memory compression (which only exists on the Mac IIRC) and continuity (which would be useful even if it only worked from Mac to Mac).

If anything Apple has adamantly refused to "force phone features into a desktop", and only ports stuff that makes sense -- that's sort of what Microsoft did, trying to merge e.g. tablet and desktop platforms in the same UI.

>We were just fine without an app store. Just fine.

Well, I'm much much better WITH an app store. Why wouldn't I want one? I might want some more features out of the Mac App Store (like the ability to demo an app) but I very much want to have it.

And in no way is an app store a "mobile feature". App stores, (and app repos) existed way before they appeared on mobile phones, for one.


> Same for memory compression (which only exists on the Mac IIRC) and continuity (which would be useful even if it only worked from Mac to Mac).

Windows 10 has memory compression.


And Linux too since Dec 2012.
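For the curious, the win from memory compression comes from how compressible typical in-memory pages are. A toy sketch with Python's `zlib` (the real implementations use fast LZ-family compressors, e.g. WKdm on macOS and LZO/LZ4 for Linux zram, so this is only illustrative):

```python
import os
import zlib

# A stand-in for a ~4 KiB memory page with typical, repetitive contents.
# Zeroed or structured pages compress extremely well.
page = b"\x00" * 3072 + b"status=ok;retries=0;" * 51

compressed = zlib.compress(page)
print(f"{len(page)} bytes -> {len(compressed)} bytes")

# Incompressible (random-looking) pages gain nothing, which is why
# compressed-memory systems keep such pages uncompressed instead.
random_page = os.urandom(4096)
print(len(zlib.compress(random_page)) >= len(random_page))
```

The OS trades a little CPU time for keeping several "pages" in the space of one, deferring swap.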


Huh? Is that enabled automatically, or do I need to flip a switch?


And of course, Android (Cyanogenmod had it in 2010 or so)


"On the Mac" as opposed to "also on iOS".


> We were just fine without an app store

You were just fine. Don't speak for everyone.

Finding applications on the internet is a dangerous endeavour if you are inexperienced. Many people get tricked by popups suggesting their computer is broken and they need to download a "cleaner app". Many people get tricked by the fake Download buttons. Many people just use whatever the first link on Google is.

And then once they are tricked, they scatter their passwords, email addresses, credit cards, etc. all over the internet.

App Stores IMHO are a must have for most people.


If the Mac allowed the "store" to have more than one source, allowed usage via CLI and GUI, and didn't require a 30% cut, most of your software would probably come from the store, and it would be a good thing.


They'd have to remove the account requirement too. Just `appstore install foo`.


Maybe they could just offer a way to download the apps via the internet, and then open some kind of image container and move a file to the Applications folder.


This is inherently inferior, as you are requiring the user to make the distinction between safe and unsafe software each and every time they install something, which users are notoriously bad at.

Additionally, it makes it hard for two apps to share required libraries, meaning each must bundle whatever it requires, AND either updates are manual (read: nonexistent) or each app includes its own nagging update mechanism.


An account could be required for installation of non free software while still allowing installation of free software without it.


I'm a professional developer, I don't want a new toy language. I'm buying a powerful laptop, I don't want cloud storage. I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem. I don't want compressed memory, I want 32GB of RAM. And while I'm at it, I want a laptop with a real keyboard, not keys that have so little travel they feel like a touchscreen.

In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.


>I'm a professional developer, I don't want a new toy language.

First, there's nothing "toy" about Swift. It's a full featured language on par with Rust, Go, etc.

Second, whether some random web dev or embedded C dev or Java dev wants Swift or not is irrelevant to its intended audience and utility. It's not for "professional developers" in general, it's for Mac/iOS application developers, and thus, for Mac/iOS users (that benefit as a side-effect from developers having a modern/better language to implement stuff).

>I'm buying a powerful laptop, I don't want cloud storage.

So? Millions of users do want it, judging from the huge popularity of Dropbox, Google Drive, MS implementation of the same, etc.

>I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem.

Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+ and Apple obliged.

And whether you can "dual boot" or "share drives" is orthogonal to whether OS X has a new native filesystem. You can still format the rest of the hard disk with another FS to dual boot, and you can still share drives with other systems in NTFS, exFAT, etc.

>I don't want compressed memory, I want 32GB of RAM.

And I want a pony unicorn. But until Intel delivers boards that accept 32GB of low-power RAM of the kind that goes into Apple laptops, we're not gonna get it. And Intel gives late 2017 / early 2018 as the date for those. (Existing PC laptops with 32GB of RAM merely sacrifice battery life by using regular high-power-draw RAM -- and even those are few and far between.)

(And of course compressed memory helps whether you have 16 or eventually 32GB of ram)

>In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.

Well, as a 15 year Mac user and 20 years professional developer, I do want those things.

I don't particularly care for the touch strip though -- I'd prefer it to be OLED physical buttons giving both tactile feel AND changeable inscriptions.


> So? Millions of users do want it, judging from the huge popularity of Dropbox

Yet just about everyone uses the local FS, with the cloud for backup or for sharing a small subset. Giving us an odd cloud-first strategy seems more suited to a phone or to severely storage-constrained devices.

> Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+

I seem to remember that most of that conversation was about ZFS. They even got a long way into the ZFS port before abandoning it. HFS+ has been getting long in the tooth for years, so a better FS is long overdue! So now we're getting APFS, which explicitly doesn't checksum user data[0]! With current storage sizes, that's disappointing, to understate it hugely.

> until Intel delivers boards that accept 32GB low-power RAM

It's funny, if Apple had not significantly reduced the Wh of MBP batteries in the latest generation we could easily have had both. Probably a longer overall life. Teardowns show a lot of empty space around current batteries.

[0] http://dtrace.org/blogs/ahl/2016/06/19/apfs-part5/#apfs-data
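For context, what user-data checksumming buys you is end-to-end corruption detection on read-back. A hypothetical sketch of the mechanism using CRC-32 (filesystems like ZFS and btrfs use checksums such as fletcher4 or CRC-32C; the code here is purely illustrative):

```python
import zlib

# The filesystem stores a checksum alongside each data block it writes.
block = bytearray(b"user data the filesystem promised to keep " * 100)
stored_checksum = zlib.crc32(block)

# Years later, one bit silently flips on disk ("bitrot").
block[1234] ^= 0x01

# On read, a checksumming FS recomputes and compares before returning data;
# a mismatch means the block is corrupt (and a mirror/parity copy can heal it).
corrupted = zlib.crc32(block) != stored_checksum
print("corruption detected:", corrupted)  # -> True
```

Without a per-block checksum over user data, that flipped bit is simply returned to the application as if it were correct.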


>Giving us an odd cloud first strategy seems more suited for a phone or severely storage constrained devices.

What "odd cloud first strategy"? Cloud is just ANOTHER option, not a "first" or privileged one. To the point that Apple also includes a whole new local filesystem (in beta) with Sierra.

>Seem to remember that most of that conversation was of ZFS. They even got a long way into the ZFS port before abandoning.

Oracle bought Sun and poisoned the area with patents and threats.

>It's funny, if Apple had not significantly reduced the Wh of MBP batteries in the latest generation we could easily have had both.

No, we really couldn't. At best we'd have a 10% or so larger battery space. The impact of RAM (there whether you use 32GB or fewer for a task or not) is much larger.


To comment on Apple's "cloud first" strategy: while they do provide local file storage in their APIs, their guidelines suggest avoiding creating an option for storing local files, since users "expect all of their files to be available on all of their devices." (https://developer.apple.com/ios/human-interface-guidelines/i...)


> Yet just about everyone uses local FS with cloud for backup or sharing of a small subset.

In most cases, not because they want to, but because that's how current software works. Both Dropbox and Google Drive focus on having their own special folder that gets synced; syncing the OS desktop and documents folders is possible but only with special configuration (on both Windows and macOS).

Cloud first makes perfect sense no matter how much or how little storage you have. It prevents you from losing data if your device is lost or damaged, and if you have multiple devices it allows accessing all your data from any device. If anything, its usefulness depends more on internet connection speed.

I do think Apple's specific cloud storage offerings could be improved for large devices. Currently the highest tier of iCloud storage is 2TB, which costs $20/mo; that's only twice as large as my MacBook Pro's SSD, and fairly expensive. But it's the same price/GB as both Google Drive and Dropbox, so it can't be that much of a ripoff...


> You can still format the rest of the hard disk on another fs to dual boot, and you can still share drives with other systems in NTFS, xFAT, etc.

Until that other OS has support for the new proprietary filesystem, it won't be able to read it. Since Apple and Microsoft categorically refuse to implement any of the many featureful existing filesystems, one is stuck with archaic NTFS (with no file permissions) or FAT (with less than 4gb files) to keep data.


Why do you think there are no file permissions in NTFS?

In NTFS, each file or directory can have an arbitrary access control list specifying granted and denied permissions for users and groups.

Those lists are typically inherited down the directory hierarchy, but that inheritance can be stopped.


>Since Apple and Microsoft categorically refuse to implement any of the many featureful existing filesystems, one is stuck with archaic NTFS (with no file permissions) or FAT (with less than 4gb files) to keep data.

exFAT, which I already mentioned, is supported and doesn't have a 4GB limitation. You can have files up to 128 pebibytes (which should be enough for everybody: that's ~144 petabytes).
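A quick sanity check of the binary-vs-decimal conversion (128 PiB is 2^57 bytes; exact exFAT limits vary by implementation, the figure here is just the one quoted above):

```python
PIB = 2 ** 50   # one pebibyte (binary) in bytes
PB = 10 ** 15   # one petabyte (decimal) in bytes

limit = 128 * PIB
print(limit)       # 144115188075855872 bytes
print(limit / PB)  # ~144.1, i.e. "~144 petabytes"
```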


> which should be enough for everybody

I've heard these words so often that they've lost any meaning to me. :)


Glad you agree the keyboard sucks


> First, there's nothing "toy" about Swift. It's a full featured language on par with Rust, Go, etc.

A language with no promises of stability counts as a "toy" in my mind (I understand that this can be subjective, just providing my opinion).

Say what you want about Objective-C, but at least Apple stuck with basic syntax decisions.


That kind of oversimplifies things. From my reading of the Swift mailing list, many things you would consider "basic syntax decisions" will have a profound impact on the stabilisation of the ABI.

Migrating is a pain, but I'll take the Swift migration pain over the maintenance of old Objective-C code any day. Even on large projects.


As a Linux/Windows developer, Swift remains a toy language, as it cannot be used to develop useful applications for Linux and Windows, where I make all of my money.


>As a Linux/Windows developer

As a Linux/Windows developer you are not the target audience (which I already addressed).

Besides, just because you can't use it, it doesn't make it a toy language, just a language that doesn't have support (or full support) for Linux.

A toy language has a specific meaning; it's not a generic word for "language I can't use professionally on my platform/industry".

I can't use Forth to make web apps either, which is where I make all of my money, but it's not a toy language by any means.


Yes, it's wrong to be dismissive like that. But no, "toy language" does not have a specific meaning. It is a slang insult, not a proper technical term.


As a Linux/Windows developer, you probably are not the primary target audience of Swift.

As a developer for Apple platforms, am I right to consider C# or Java toy languages, too?


There are many Java apps for Mac and Apple used to provide its own implementation of the JRE.


There are thousands of Apple and Android C# apps (see Xamarin).


Which, again, is beside the point. The parent was mocking the idea that just because a language doesn't support a platform, it can be called a "toy language".

C#, which he mentioned, wouldn't be a toy language even if there were 0 Apple and Android C# apps. It would just be a totally professional Windows language with no Android/iOS support.

C# was professional even when it only supported Windows (and partially FreeBSD), before Mono came along...


> C# was professional even when it only supported Windows (and partially FreeBSD), before Mono came along...

As well as the decade-plus when Mono existed but wasn't owned by Microsoft, making cross-platform support merely incidental or even a negative from Microsoft's perspective.


Seems like a bit much to say that it's a toy language because it can't (currently) be used to write Linux or Windows applications. There's probably tens of thousands of developers at least making a living developing Swift applications. I bet there's more people making money writing Swift than Haskell and Lisp combined.


First, there's nothing "toy" about Swift. It's a full featured language on par with Rust, Go, etc.

That's some amazing Kool-Aid you've got there. But I'll grant you that it's on the same level as a bunch of other new languages that also aren't in general use for serious long-term products yet.

But until Intel delivers boards that accept 32GB low-power RAM of the kind that goes into Apple laptops, we're not gonna get it.

Yeah, wouldn't it be great if Apple made this happen? That's the kind of innovation I want to see.

As for the bulk of the rest of your points: developers and common Mac users have different priorities, and your opinions are not mine. It will take time to find out who better represents professional developers.


>Yeah, wouldn't it be great if Apple made this happen? That's the kind of innovation I want to see.

Adding something with marginal utility for the majority of their users (since even pros can do just fine with 16GB) just because some insignificant minority runs multiple VMs on their Macs?

That's not the kind of innovation Apple was ever, ever, interested in.


You want Apple to make their own CPUs?


No, I want Apple to do what they've done in the past and get Intel to do something for them. http://mrob.com/pub/comp/mac-models.html#intel_special


> I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem.

Then you're in luck. APFS isn't proprietary, as Apple's website claims that "Apple plans to document and publish the APFS volume format specification when Apple File System is released for macOS in 2017." So there will be nothing preventing drivers from being written for other operating systems, though of course it will require someone to actually do the work. In return you get a FS that improves on HFS+ in reliability, performance, and features (volumes). If there is enough interest I imagine it might even compete with btrfs for use as a root volume for Linux installations… though there probably won't be.

You may criticize Apple for not open sourcing their implementation of APFS, though I'm holding out hope they will do so eventually. An open source implementation would definitely be useful as a starting point for drivers for other operating systems, but to be fair, it would still require someone to spend the time to port it. Apple's HFS+ driver is open source as part of xnu, but Linux hfsplus doesn't support journaling which was added in 2002. That's not Apple's fault.


APFS isn't proprietary, as Apple's website claims that "Apple plans to document and publish the APFS volume format specification

Great. Anyone else still waiting for the open FaceTime API announced in public, on stage, by Steve Jobs?


>Great. ... anyone else still waiting for the open FaceTime API announced in public on stage by Steve Jobs?

Only people who missed the part where some other company took them to court over a BS patent on it and won, so they couldn't do it after all.

Meanwhile, they HAVE open-sourced other stuff, including Swift, so it's not like they have some big history of backtracking on open-sourcing promises...


That was six years ago :)

The rumor is that Apple changed its mind due to the VirnetX patent lawsuit, which forced it to switch FaceTime from P2P to relaying all calls through a central server. But who knows if that's the whole story. I'd still like to see an open FaceTime.


What about patents?


I haven't heard any statements suggesting that APFS is patented or that Apple wants to charge for licensing it, but it's possible. We'll see what happens when 10.13 is released...


> I'm a professional developer, I don't want a new toy language

Why do you think Swift is a toy language?


I'm way more than three years with the Mac platform, and I have similar concerns. The impression I get is that Apple are not particularly interested in the computer market as opposed to the mobile (phone/tablet) market.

I think it is a question of emphasis. You've focused on the OS, but not all the innovation was in the OS itself. Remember when they added iLife? That was a big deal. Software you had to pay for on Windows was bundled free, and it was good. Everyone used iPhoto, iMovie was nice, GarageBand was awesome, although probably not everyone needed it.

(Good lord, just checked, iLife debuted (minus GarageBand) on 10.1. Evidently, I've been using MacBooks longer than I realized.)

The sense that they have an exciting vision and plan for the future of how people use and relate to the computer is gone. That sort of vision and the innovation that goes with it is reserved for the mobile space now.

They are still fine computers, particularly the notebooks, but the mobile products are driving the process now. I do think that is a shame.


> Integration of Cloud storage directly to the desktop?

This was done extremely poorly, though. Among the big sync providers (Dropbox, Sync, Box, Google), iCloud Drive is easily the worst.

The iOS app looks like something an intern cooked up. Hardly any iOS apps can read or write from the drive. There's no way to share anything in your drive. There's no way to share anything into your drive, either.

While Apple introduced the ability for your Documents folder to be stored in iCloud, macOS has completely lost control of the Documents folder in a very Microsofty way: Mine has turned into a rubbish heap of various application data that I didn't ask for. Savegames, virtual machines, "My Music", "Microsoft User Data", etc. — it's so unusable that I ended up hiding the entire folder and creating a new one that I could use for, you know, documents.

The iCloud services that are hidden away from the user seem pretty great, but the way Apple decide to expose the Drive is terrible. It's almost like they don't want users to actually use the drive.


> "My Music", "Microsoft User Data", etc.

Neither of these are Apple's doing - obviously in the case of "Microsoft User Data", and I don't know what "My Music" is but Apple's software puts music in ~/Music. (Heh, it seems I have a "My Music" in Documents, as well as "My Documents", "My Pictures", "My Games"… not sure whether this is Wine or VMware's doing, or whether they're there for the same reason as on your system.)

Applications are not supposed to use random subdirectories of Documents (that's what Library/Application Support is for), and in the case of sandboxed Mac App Store apps, can't. But I don't know what you want Apple to do about random unsandboxed third party apps deciding to do it anyway. I guess if the Mac App Store had been more of a success, more apps would be sandboxed, but still…

I guess it would be nice if macOS made it easy to sync custom (existing) folders to iCloud rather than just desktop and documents. But if you're starting from scratch, you can just create the folder under iCloud Drive and drag it to Finder favorites. If you're a CLI user, add a symlink in the home directory or whatever.

I agree iCloud Drive on iOS is really inconsistent and awkward, and it would be nice if you could share folders with other users. I don't know what you mean by "no way to share into your drive", though. You can add arbitrary files to iCloud Drive from the share menu (and you can access arbitrary files from the iCloud Drive app).


My Music etc. are made by Apple. My Music is GarageBand, I believe. My Videos would be iMovie, I believe. I think savegames end up in Documents because they are considered to be "documents" as opposed to app state (which would go under ~/Library/Application Support), but I'm not sure what the Apple guidelines are.

"Microsoft User Data" is apparently from earlier versions of Office. Newer versions will look for it in ~/Library/Preferences, and you can actually move the folder there to get rid of it. [1]

Here [2] is my Documents folder. It's 100% crap that I did not ask for.

As for sharing: This discussion was about macOS, although the same limitation applies to iOS: There's no way to share an iCloud Drive file directly. If you do right-click -> Share -> Email, you get an attachment that you need to send, even though the file is already reachable through the cloud. Every other competitor (Dropbox etc.) adds a "Copy Link" option to the Finder context menu.

[1] https://www.atulhost.com/remove-microsoft-user-data-from-doc...

[2] http://i.imgur.com/PvXbmJk.png


The current version of GarageBand saves projects under ~/Music/GarageBand by default, and iMovie uses an opaque "iMovie Library" in ~/Movies. I'm not sure about older versions, although from a quick Google it seems unlikely to me that they used those directories.

Apple's documentation[1] implies that applications are not supposed to use the documents directory themselves (only when the user selects it in the file picker).

[1] https://developer.apple.com/library/content/documentation/Ge...


> Take a look at Sierra: the only feature of note is Siri, which is half-baked as it is, and the things that did get ported over from iOS are half-done too.

I agree 100% with your point. If you want pointless changes to the OS that means you have to relearn a lot just to keep doing what you were doing, then yes, Microsoft is what you want.

If you want small incremental improvements that don't move around common functions for no reason, then stick with Mac OS.


>>and so I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro.

>So, you're merely 3 years of the platform, but have an opinion on how Apple "pivoted its attention" regarding OS released based on just a couple of OSes? Because I've been here since 10.2 and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).

Exactly. Sigh. Kids these days. It's like the complaint about WebKit being the new IE, coming from someone who never actually experienced web development during the IE era, who falsely claimed the IE era means IE 7+, and who later acknowledged he wasn't even doing web development in IE7's time. Anyway......

There are lots of new fans in the Apple ecosystem since the iPhone. And without the RDF from Steve, they quickly lose sight of Apple's greatest strength: Apple is the master of iteration! Very rarely will you see a completely new product category being introduced. They just continue to iterate, sometimes taking two steps forward and one step back.

I agree Apple has spent less time on the Mac, and especially the desktop in general, but most of the points in the article were not convincing.


Amazing how people forget RAM compression and Metal. The latter prolonged my Air's life by two years!


Actually I care more about OS performance rather than new features. Of course the only reason I don't want new features is because Apple refuses to release a screenless iMac where you can upgrade the RAM and hard drives easily.


Isn't that what a Mac Pro is? Screenless and upgradable hardware.


It's screenless, but I wouldn't call it upgradable when you compare it to the old metal Mac Pro


All of the things you mention as proof of Apple caring about the desktop are things I don't give a shit about. I needed _none_ of the things you list. Nor do any of the people I work with.

True, there probably isn't much the average user would like to change in OSX. But it would be nice if stuff that was broken got fixed and stuff that was wonky got improved.


Er, isn't APFS still essentially vaporware at this point?


On iOS, Apple quietly switched it to being the default in a point release: https://arstechnica.com/apple/2017/01/ios-10-3-will-be-apple...

It actually shows marked performance improvements, but that's because it focuses on performance on the type of storage an iPhone uses.


>but that's because it focuses on performance on the type of storage an iPhone uses.

If that's solid state storage, then it's the same storage all Macs will also use in the future (and most models now).


No? Pretty sure with the initial release they specifically said that they were taking their time in testing it thoroughly, but that they wanted some initial feedback from people. It's been in a preview mode available in MacOS for just shy of two years if I remember correctly, but they explicitly said from the beginning that they didn't advise people to use it for their main OS partitions or anything important for a while.


No, it has essentially already shipped with Sierra (only in demo mode for developers), and it's already en route, with the final iOS 10.3 release (currently in beta), to become the official iOS FS.


No, it's almost ready. It's already in use in iOS 10.3, and signs point to a full rollout this year.


Oh please. Not this again.

I recently had the pleasure of checking in on Windows for the second time in 7 years. I had been a PC user since the DOS days, starting on a 286, and going all the way to Windows 8 when I switched to Macs around 2010. I've only ever needed Windows for games since then, and decided to install Boot Camp on the 15" 2016 MacBook Pro that I just got (which happens to be a pretty good machine [1], all in all, and runs Paragon [2] in high detail at 1920x1080 60 FPS; enough for me.)

There is still so much wrong with Windows I honestly don't know where to begin. The UI remains a convoluted Escher'esque nightmare. Apple and macOS are still FAR from fucking up badly enough to make me want to willingly head back to Microsoft.

I am running both Sierra and the latest version of Windows 10 side-by-side (with Parallels Desktop) so the differences are very glaring and obvious. Maybe I'll write up an in-depth comparison later.

[1] https://news.ycombinator.com/item?id=13790106

[2] https://www.epicgames.com/paragon/en-US/home


> The UI remains a convoluted Escher'esque nightmare.

Comments like this make it hard to take any side of the discussion seriously.

/Neither/ of the OSes are "Escher'esque nightmare[s]." My elderly mother can kludge her way through both of them, more or less reliably - admittedly, with more trouble than using her phone, but not a lot more.

Nothing that a tech-illiterate can reasonably use on a day-to-day basis qualifies as a "nightmare." Can we please stop pitching hyperbole-for-effect until it nullifies the entire discussion?


> Comments like this make it hard to take any side of the discussion seriously. /Neither/ of the OSes are "Escher'esque nightmare[s]."

• macOS Finder: http://i.imgur.com/9Y5hK0e.png

• Windows Explorer: http://i.imgur.com/ef7CnJ4.png

Alright. Maybe not nightmarish, but one of these IS objectively more convoluted than the other.

Let's start with something simple:

WHY does the "Options" button on the Windows Explorer Ribbon (in the top-right corner) drop-down to reveal a SINGLE menu item which DOES THE EXACT SAME THING as clicking on the button itself?

(Bonus: This is what happened when I tried to take a screenshot first: http://i.imgur.com/UsN554o.png — Explorer hung when trying to remove a shortcut to a now-missing network share.)


Perhaps you're failing to view things from the perspective of the average user?

I see the MacOS Finder and see unlabelled icons. My elderly folks are terrible at remembering what various icons do, and what error messages are for, etc. I've trained them to pause and read the words on the toolbar, read the words on the window, etc. as they're generally self-explanatory.

So the finder that is more convoluted for a more proficient user, is the one that my folks can actually more reasonably navigate.

Okay, your criticism is on point: the Explorer Options button has a redundant click. To me, that would matter if I ever used anything other than keyboard navigation. To my mother, father, and sister, "yay! It's f'n labeled!"

My nieces don't give a damn either way, because they're too young to care about one unnecessary click, and they don't forget what the icons mean.

Your point isn't wrong, it's just not that clear-cut an issue when dealing with different people with different levels of proficiency and needs.

But the fact that we're down to "labels vs. reasonably intuitive icons, plus or minus a redundant click" suggests we're well into the weeds of trivialities.


> I've trained them to pause and read the words on the toolbar, read the words on the window, etc. as they're generally self-explanatory.

You're a miracle worker. :)

Failure to read what's on the screen is one of the biggest reasons why "non computer" people don't get things done with computers.


Sometimes not just "non computer people". The number of times as a computer science TA (for a non-intro class) I had to read people their compiler error messages is staggering.


Redundant labeling is a kludge, not a hallmark of an easily accessible UI. If an element in your interface requires multiple labels to make itself clear, then you did not design it well in the first place.

In any case, the longer label could have been implemented as a simple mouse-hover tooltip, instead of an extra button alongside the primary button that expands to a menu with a single item (with its own icon too!)

I'm not just comparing it to macOS; the Windows UI is heavily flawed: not only is it inefficient in its use of system resources, it is inconsistent with itself, and baffling no matter which school of design you adhere to.

The option button was just one of the immediate examples I could point out in a quick reply. There is still a lot to pick at in that screenshot without comparing it to any other OS.

----

As an aside, I don't think that "average" or new users are a good gauge of how well-designed a system is; people can learn and put up with horribly unintuitive interfaces out of necessity when they don't have any better alternatives. Just look at computers and phones and other devices and systems of only 20 years ago. Many will make you go "how the hell did we live through that?!"


I see both sides of the argument, but I'm a power user and even I rarely need access to those icons. Let alone my grandma. Surely I can't be in the minority here. If you don't need it, I think it should be hidden by default, as Macs have done.


Right-click the toolbar in Finder and choose "Icon and Text" from the dropdown. One can discuss what the default should be.


There's a lot to not like about Windows's UI, but Explorer is worlds better than Finder, there's absolutely no comparison. Every day I still struggle to remember how to go up a level. Option+Up? Nope that selects the next folder up. Control+Up? No, that made all of my windows do something weird that I don't know what that is. Ah, it's Command+Up. As far as I can tell, there's no non-keyboard way to do it. I can't click a button and go up to the parent folder, all I can do is go back, which is confusing if I forget what the previous folder was.

Likewise, "All My Files" sorts based on... what? And which folders does it look in? Because very often I save a file from the Internet and it never shows up in All My Files. And when I search for something when I'm in a directory, I expect the search to be restricted to that directory by default. Instead it defaults to My Mac, which makes no sense. Is there also no way to get the current path as text? On Windows I can type "C:\Users\My Documents" by hand in Explorer. I don't see an option to do that in Finder.

Explorer isn't perfect, but Finder is just awful. Just awful.


> As far as I can tell, there's no non-keyboard way to do it. I can't click a button and go up to the parent folder, all I can do is go back, which is confusing if I forget what the previous folder was.

Right-click on the toolbar and select "Customize Toolbar...", then drag the "Path" button into the toolbar.


> Option+Up? Nope that selects the next folder up. Control+Up? No, that made all of my windows do something weird that I don't know what that is. Ah, it's Command+Up.

In macOS, generally:

⌘ Command = App-specific shortcuts. The primary modifier key.

⌃ Control = Global shortcuts. App-specific shortcuts will not generally have Control as the sole modifier.

⌥ Option/Alt = Something to do with the keyboard, selection, or a secondary modifier for a shortcut. Option+Up/Down is Page Up/Down.

⇧ Shift = Something to do with the keyboard, selection, or a secondary modifier for a shortcut.

Common shortcuts are way more consistent than in Windows, and you can search for them from the Help menu of any app, and even modify/create shortcuts for any app from System Preferences -> Keyboard.

----

> As far as I can tell, there's no non-keyboard way to do it.

Right-click on the folder icon in the title bar, use Columns view, or show the Path Bar from the View menu.

> Likewise, "All My Files" sorts based on... what?

Files in your home folder's hierarchy, that are indexed by Spotlight.

> And when I search for something when I'm in a directory, I expect the search to be restricted to that directory by default. Instead it defaults to My Mac, which makes no sense.

Keyword: defaults. You can change it.

> Is there also no way to get the current path as text? On Windows I can type "C:\Users\My Documents" by hand in Explorer. I don't see an option to do that in Finder.

"Go to Folder", it's under the Go menu.

To copy an item's path name, hold down Option/Alt to modify the "Copy" menu item.

----

Most of your complaints are basically "it doesn't do everything the way Windows does it so it's bad" or "I'm furious because I didn't bother to tweak the defaults."


And most of your solutions are incredibly non-obvious, just like Cmd+Up. Right click on the folder icon in the title bar? Hold down option to change the right click menus? This is exactly what I'm talking about. Explorer is terrible but it's better than Finder because you don't need to read a manual or tweak the defaults in order to make it work in a predictable manner.


You can right-click on the folder/document icon in the title bar in any app, and you can left-click it to drag it. The icon, not the window. What's bizarre is how Windows has trained people to see icons as non-interactive stickers instead of reactive proxies for the underlying object. The fact that many labels in Windows still don't update until you close and reopen the dialog is ridiculous, as is the continued use of "OK"/"Cancel" preference dialogs.


You can right click on the folder/document icon in the title bar in any app, and you can left click it to drag it. The icon, not the window.

"You are holding it wrong".

The only way to know this, it seems, is that someone told you.

There is no indication in the design of the element, and IIRC not even a subtle glow when you mouseover.

Sorry: as much as I like Apple, I don't waste my time on their software. Not saying it is bad, but for me I had to admit it was a waste of perfectly good working time and focus.

If you like it: good. I know people who seem to be more productive with Macs. And more Apple users is good news for both Linux users and Windows users since they force companies to think cross platform and force Microsoft to be more humble.


Not everything has to be equally discoverable. By that same logic, the majority of right click interactions are off the table. You can right click window icons in Windows too, with no visual cue either, they just act different and the surface area is much smaller. Titlebars also don't tell you you can drag them, and essential productivity shortcuts like alt-tab are not advertised either.

It's okay to have to be taught some things, it makes it better for pro users when the handholding is minimal.

Here's another one: hold alt when a menu is open to see some options change into alternate versions. Instant feedback, works on dialogs too. Keeps the UI uncluttered for noobs and power users alike without significantly impeding you.


> And most of your solutions are incredibly non-obvious, just like Cmd+Up.

HOW is Cmd+Up NOT an obvious shortcut for going UP one level? As I said, Cmd is already the primary modifier key for all app-specific shortcuts.

On the other hand, what does Windows use? BACKSPACE! A key people associate with deletion and editing text!

> Hold down option to change the right click menus?

Option modifies shortcuts, changing their behavior. "ALTernate." Again it's a behavior that's consistent across all macOS apps.

> Explorer is terrible but it's better than Finder because you don't need to read a manual

You don't need to read a manual for Finder, or almost any macOS app, because you can just click on the Help menu, and type the name of a command or action you expect that app to be able to perform, and it will automatically highlight the menu which contains that action:

http://i.imgur.com/HNUNqpN.png

I just typed "path" in the Finder help menu and it showed me the shortcut for "Copy as Pathname" including the Alt/Option modifier. And that works in EVERY macOS app. Try doing that in Windows!


In macOS, generally:

Back when I used it, 5 years ago, it was a solid mess IMO:

Jump one word forward or backwards?

Windows/Linux: ctrl + arrow

Mac: depends on which program you are currently using

IIRC our resident Mac enthusiast explained that it was because some apps were written in Carbon and some in Cocoa or something, but I don't care.

Opening a file upload dialog box in one browser window would block other windows from the same browser forcing me to use two different browsers: one for docs and one for the task at hand.

This went on for 3 years and in the end I looked as much forward to leave as I had once looked forward to get a Mac.

To each their own but Mac users talking about consistency and ease of use still baffles me.


> Jump one word forward or backwards?

> Windows/Linux: ctrl + arrow

> Mac: depends on which program you are currently using

I don't know what the deal is with Carbon - it was already dead 5 years ago, even Photoshop transitioned in 2010 - but everything I use uses option-arrow. Even the terminal.

> Opening a file upload dialog box in one browser window would block other windows from the same browser forcing me to use two different browsers: one for docs and one for the task at hand.

I just checked and at present in Safari, file upload dialogs block the current window (i.e. can't access other tabs) but not other windows. Though to be honest, I don't understand why you needed to keep file upload dialogs open… why would you click the button to pop the dialog if you weren't intending to immediately select a file? But I guess https://xkcd.com/1172/ applies :)


Though to be honest, I don't understand why you needed to keep file upload dialogs open… why would you click the button to pop the dialog if you weren't intending to immediately select a file?

Would finding and copying a somewhat long folder path from the internal wiki qualify as a good reason?


These are the things that bug me the most about macOS. See also: page up / page down, home / end. For an OS that seems to want to care about consistency, to have pretty much every app interpreting the most common keyboard navigation differently is incredibly annoying.


The complaint is that Apple software tends to be thoroughly lacking in the discoverability part of its UI. It is especially bad on iOS.


Which issues are you specifically referring to?

macOS has demonstrably more discoverability than Windows; Not only are shortcuts consistent across all apps, but the visuals of common UI elements are the same as well, so a new user will immediately know what does what, in different apps. Whereas on Windows, you have a multitude of menu, button, scrollbar etc. styles right out of the box, in Microsoft's own apps!

macOS also has a list of all mouse/trackpad gestures — complete with videos — in System Preferences. Likewise, all global keyboard shortcuts are visible in one place as well (Keyboard preferences, where you can modify and even create your own shortcuts for ANY app.)

Best of all however, you can click on the Help menu and just type the name of a command or action that you expect an app to have (for example a filter in an image editor) but don't know its shortcut or which menu it's under, and the unified menu bar subsystem will automatically highlight all matching menu items and their shortcuts for you:

http://i.imgur.com/HNUNqpN.png

Here in that screenshot I wondered about the ability to copy a folder or file's pathname, and the Help Search pointed it out to me! And it works in EVERY macOS app! Does Windows have anything which comes close to aiding in discoverability like that?

I will concede your point on iOS, however.


Speaking of the nuances of Finder.

In Windows, to jump to a file starting with 'docu', one can start typing d o c and Explorer will jump to the file.

Likewise, it's possible to tap d d d d to navigate between all files starting with 'd'.

Is there a way to do either of these with finder?


The way to do the first one in Finder is to start typing d o c, as you might expect.

Typing the same letter a second time in Finder will go to the last file starting with that letter, which could be a handy feature. But d, down, down, down seems almost as good as d d d.


I've learned to right-click the title on the toolbar, which shows a dropdown of folders above the current one, which you can select to go up.


Funny you call it a right click...


Just cmd-click the window title. Works in every document-based native macOS app btw.


By default the ribbon is hidden: http://i.imgur.com/5hxOIcb.png

I don't use Windows, but it's important to use fair comparisons.


> By default the ribbon is hidden: http://i.imgur.com/5hxOIcb.png

You have to expand it and be faced with that hodgepodge of cluttered buttons to perform many functions. Even with it collapsed, it's still four (!) toolbars (counting the buttons in the title bar and the ones in the bottom status bar) and a menu (?) style that is inconsistent with many other built-in Windows apps.


You don't really have to expand it for typical day-to-day operations - everything of interest is in the context menu.


So to prove your point, you take a non-standard folder view on OSX, and a non-standard view on Windows to make the mac look simple and the windows version look convoluted.

A+

While you can argue about whether or not the option to modify folder settings belongs within that window or on a title bar, taking screenshots like that is dishonest, at best.


This is the default view when you click on the Explorer icon in the Windows Taskbar on a new user account: http://i.imgur.com/TM8ZyLg.png

This is the default view for Finder on a new user account: http://i.imgur.com/Dni0Kpo.png

The only thing I changed in this Finder screenshot is collapse the auto-populated Devices, Shared and Tags list for privacy, so I didn't have to edit the image. Notice that it defaults to the "All My Files" view which is empty for a new user and would be even more of an unfair comparison to the default Windows view.


My mother would find the Windows one easier to use. There is text on the buttons, so if she presses something by mistake, she'll have more confidence to press other things to put it right again.

Which one is even the options button in Finder? The cog, I assume. If I'm helping over the phone, can a user describe the current state of Finder? Not really. But I could ask for the contents of the address bar in Explorer.


> If I'm helping over the phone, can a user describe the current state of Finder? Not really. But I could ask for the contents of the address bar in Explorer.

• Click on the first icon/happy face at the bottom of the screen (to make sure Finder is in focus).

• Click on "View" at the top of the screen [or press Cmd+Option+P] to show the Path Bar.

----

To guide someone to a specific folder:

• Click on "Go" at the top of the screen [or press Cmd+Shift+G].

• Type the address.


As a longtime user of both OS's, I think Explorer is a lot more usable than Finder and has been for years. Personal choice of course.


"macOS Finder": you are talking about that software that rename folders when you press the enter key? :p


Can you give a reason why Enter to open a file is ideal, other than just because it's how Windows does it?

The Windows Explorer uses Alt+Enter to show the properties of something, and in games and other apps that shortcut means Full Screen.

Also, why does Alt+F4 quit an app? What does F4 stand for? Compared to Cmd+Q to Quit or Cmd+W to close it's a lot less intuitive.

And so on and so forth.


> Can you give a reason why Enter to open a file is ideal

Because "Enter" has been used a key to activate/open the currently selected item in various UIs across various platforms for several decades now?

Also, possibly, because the button is labeled "Enter"? I guess on Macs it's "Return" though, but that doesn't really make any more sense.


Really, it doesn't make sense to you that enter would correspond to the most common, least destructive action?

Alt-F4 makes little sense, true, but in almost all windows apps control-q or control-w does exactly the same thing as on osx.

Also, having sane defaults for keys people are likely to press is much more important than /not/ having obscure keyboard shortcuts that users never have to know even exist.


I want to say ctrl+q is an exit command from within the application (graceful shutdown) whereas alt+f4 is more of a system level halt for whatever is in focus (hard reset).


That's not actually true. Alt+F4 is exactly the same as clicking the Close window button, and sends WM_CLOSE to the app, which can then handle it gracefully (or not) as it chooses.

In particular, if you have unsaved data, most apps will prompt to save on Alt+F4.


Did not know that, so if an app is non responsive but alt+f4 does the trick, does that mean the app maybe wasn't as FUBAR as I thought?


When Windows sends WM_CLOSE for any reason, it will wait for some time for the app to react (not necessarily close itself, just process the message). But if it does not react soon enough, Windows will mark the app as hanging, and you'll get that standard wait-or-close dialog.

However, Alt+F4 translating to WM_CLOSE is also handled through the app's message loop - i.e. Windows sends key-down events for those keys, and when the loop pumps and dispatches them, the default window procedure (DefWindowProc) is what turns the Alt+F4 keystroke into a close request (WM_SYSCOMMAND, then WM_CLOSE); TranslateMessage itself only handles the key-down -> WM_CHAR translation. So if the app is hanging, messages aren't pumped, and the shortcut doesn't work. To send WM_CLOSE directly, you'd need to use the close button on the window.

So if the app does close on Alt+F4, it was, at a minimum, still pumping and dispatching messages.


Thanks for the detailed reply! I learned something today.


Yes. If you want to kill an app in Windows, use Task Manager or powershell.


Well aware of that alternative :P I had a display glitch with a game client where it would take the whole screen but not really display anything, leaving me with sound and a working computer but no display even after alt+tab or even ctrl+alt+del. I basically had to blind type win+r > "cmd" > taskkill /f /im Gw2-64bit.exe whenever it happened :(


> the most common, least destructive action?

Launching processes with a single keypress and no confirmation can hardly be called "the least destructive action."


Why Cmd+Q and not Cmd+Esc then? It is frustrating to press Cmd+Q by accident when you intend to press Cmd+W, at least there is a physical separation between F4 and F5, so you can't get it wrong.


I still haven't figured out how to actually open a file/folder in Finder (with the keyboard).


There is even a shell command called "open" that opens its argument with the application Finder would use, or with whatever application you pass as an option.

(As a sidenote, you can drag documents from the finder into a Terminal window and their absolute paths appear at the cursor position).


I hear it's Command-O.


Oh! Thanks. ⌘↓ works as well (and ⌘↑ to navigate to parent directory).


Yup. Finder's still not the greatest for keyboard navigation, to be sure, but these shortcuts do make it more comfortable to use.


Don't worry, I don't use "Enter opening something" as my only criterion to judge the discoverability and ease of use of an interface -- and like you said, Windows is full of arbitrary and/or curious keyboard choices -- but so are pretty much all interfaces. (However, I still find enter => renaming really weird :p )

As for Cmd+Q and Cmd+W, those might be somewhat more intuitive, but only for some languages, and arguably only very vaguely intuitive for Cmd+W even for english speaking people.

But then again, I don't think there is a real notable difference in the end between most modern systems. It's more of a question of what we are used to.


Enter is just a huge fucking key on some keyboards, and easy to get in your way. It just feels super unintuitive to map rename onto it. I don't ever use Enter for opening things either, so it's not really about that.


> Can you give a reason why Enter to open a file is ideal

Lol. 10/10 troll.


Correction: Cmd+W to close a window.


That ribbon is hidden by default.


But it's not on the Finder, and the Finder defaults to the "All My Files" view which is completely empty on a new user account, so I changed one thing on each side to make the screenshots more comparable, but here are the absolute default views for each OS:

http://i.imgur.com/Dni0Kpo.png

http://i.imgur.com/TM8ZyLg.png

The Finder is still less cluttered, with all its buttons in 1 toolbar instead of spread across 4.


If you want to compare the two, at least use the default view. Pinning the ribbon and then calling it more convoluted is just trying to force your argument.

http://imgur.com/a/5AGBU


macOS Finder, for me, has always been one of the worst parts of macOS, btw. Try navigating to a different drive, or entering a path directly (without knowing the keyboard shortcut), or opening a shell window at a location, or doing anything useful... Explorer is so good at what it does. Maybe it's less pretty; form follows function?


Not to mention it gets special treatment as a program and the behavior of it in the task bar is weird and non-intuitive for inexperienced users.


You're trying to compare a simple case of Finder with an Explorer screen that's on the middle of reconfiguring the view. Ordinarily it's much simpler-looking than that. This is not a remotely fair comparison.


Are you talking about 1) or 2) in http://imgur.com/gallery/J3KPm

If it's 2, that's because other items can be added to that menu, such as LastPass and Bitdefender just to use the examples conveniently at hand.

And as a quick tip if you need to take a lot of screenshots on Windows - Greenshot is a great screenshotting program for Windows (and OSX, though I don't know how it is there), free, GPL, etc. Binds the PrtSc button and lets you select the area you want to capture as well as destination, format, etc.


Regarding the Options button, I think you're mistaking the tooltip for a menu item.

And as for when the ribbon is closed, assuming you're using an appropriate language selection I think I'd find "File/Home/Share/View" at least as informative as "4 squares/4 lines/3 boxes/multisize boxes" - and the text labels next to them on the Windows side when you're on the View ribbon to be much more informative.


> I think you're mistaking the tooltip for a menu item.

It's a bona-fide menu item, accessed via that extra dropdown button below the primary "Options" button.


When I moved out to a faraway country in the early 2000s, my non-English-speaking mom started using my localised Windows desktop as the means to keep in touch (hello Yahoo Messenger). In the 2 years that followed, there was not a month where I didn't receive an alarmed phone call from her about "The computer says I am at risk!" or "There is a message in English, I don't understand, it looks bad!". Switched to OS X and the tone changed to "there was an update and everything went fine". I have been using OS X (macOS) for the last 10 years, but I can still see in my corporate environment that localisation of the OS and third parties (hello mandatory anti-virus) is not a fixed issue.


I am tech-savvy and I have been using a Mac for 4 years. I honestly can't find where to change settings on Windows 10; I find myself looking at two different control panels all the time. One is the modern one designed for tablets, and the other is the one we've had since Windows 95, I believe. Some settings can be changed in one but not in the other.


> The UI remains a convoluted Escher'esque nightmare.

Totally disagree. I hopped from OS to OS for over a decade every other year or 3 (Win, OSX, Linuxes with various DMs/WMs) and it's always, always an adjustment period from "WTF is this" to "totally productive with this now", usually a matter of 1-2 weeks. Happy with Fedora/Gnome right now but have to say just Explorer itself beats Finder and Nautilus (rebranded as "Gnome Files" apparently now) for keyboard users when it comes to out-of-box file managers.

The weird new tiles/metro UI stuff they tried pushing from Win 8 is of course a pretty bad attempt to Microsoftify the "secondary desktop layer" concept that has been prevalent on the other OSes for longer, and better. It sucks, but also useless and thus not used for many/most power users I'd guess.

The rest of the "UI" question is in the hands of individual independent app developers, as it properly should be.


Linux file explorers (Nautilus, etc.) aren't exactly a pleasure to use, but then again, as a Linux user, if I have to open a GUI to do something with my files, I'm usually not happy about it in the first place.


> but then again, as a Linux user, if I have to open a GUI to do something with my files

Well, I'm kinda visual/directional when it comes to files and other data structures. Even before I had a PC that could run Windows, i.e. living in MS-DOS and certainly being aware of navigating and operating the file system via CLI, I much preferred the blocky-ASCII GUIs (with menus, "windows" and panes, usually both mouse- and keyboard-capable) of Borland, MS and many other software vendors, incl. "commander"-type file managers. So... probably a very longstanding "habit" thing.

The irony is that in those days the "power user" was generally adamant to get as quickly and as far away from the CLI as they could, even though they needed to know some basics to automate, script and launch stuff. Other than the odd GNU beard. Whereas the 2017 generation loves the CLI; I would never have predicted such a thing! =) I've found time and again that once one operates with more than a dozen or two commands, it's disturbingly often `command --help` or `man command` first, because there are only words (that vary from command to command), absent a program built on the rich set of commonly used, de-facto-standard spatial gestures and interaction primitives that have evolved over the years.


Ever use ranger?

Of course, not exactly comparable, but it's pretty amazing.


Oh, this looks interesting! I've never really tried out terminal-based file managers, but I could see myself using something like this. I'd definitely need to spend some time getting used to it though, not to mention learning how to do various things (show hidden files, etc.)


You, and others, might also find 'vimv' to be very useful ... it's not a file manager as you think of one, but it's very powerful.

Your pwd opens up in vi and you can edit it as you see fit. When you :wq! all of the filename changes you have made get written to the directory.
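The idea can be sketched in a few lines of shell. This is a hypothetical toy (`bulk_rename`), not the real vimv, and it assumes filenames contain no tabs or newlines:

```shell
# Toy version of the vimv/vidir workflow: dump filenames to a temp file,
# let $EDITOR modify that list, then apply the renames pairwise.
bulk_rename() {
    before=$(mktemp); after=$(mktemp)
    ls -1 > "$before"
    cp "$before" "$after"
    ${EDITOR:-vi} "$after"              # user edits the name list here
    paste "$before" "$after" |          # tab-separated old/new name pairs
    while IFS="$(printf '\t')" read -r old new; do
        if [ "$old" != "$new" ]; then
            mv -- "$old" "$new"
        fi
    done
    rm -f "$before" "$after"
}
```

The real tools handle edge cases (deleted lines, name collisions) that this sketch ignores.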


Dolphin is pretty nice!

It also has a well integrated terminal with directory tracking, so it is straightforward to switch between command line and graphical operations.


> Explorer itself beats Finder and Nautilus

Have you checked out Nemo, which is basically the good old, full-featured Nautilus?


I'll agree with most of this. We had a new hire come in with his personal Windows laptop. We tried to get him connected to the corporate WiFi, and into the corporate net with our own client certs and private CA.

Uh... no. No, you can't do that.

WiFi configuration was annoying. Lots of clicking. No consistency to the UI. And for a relatively new laptop, it was slow. OSX in comparison was simple, relatively straightforward, and responsive.

Certificates were a joke. We could import certificates and CAs. But once imported, they didn't do anything. Access to the corporate intranet was still blocked with "unknown CA". When we went back to see if the certs had been imported... nope. No certificates. OK, import them again? Nope, still no certificates on the machine.

We gave up and let him use a spare Mac Mini.

I hate where Apple has been going with OSX: Slow strangulation by oxygen deprivation. But until it becomes entirely unusable, it's still better than Windows for me.


The fact that I still had to do 3 hours' worth of updates and restarts after a fresh install is beyond me.

Install (only) MS Word: another cycle of updates. Restart, check updates, more updates!!! It's just insane. Why does it download a 100 MB Outlook spam filter update? I don't have Outlook installed; I only wanted freaking MS Word.


What version of windows are you on? (10 or 7)?


Just did fresh W10 install a few weeks ago and had to go through that pain.


That's weird. Whenever I have to reinstall Windows 10, I only get 1 update, namely to the latest version (and drivers, but those get installed in the background).


Until you install Word, apparently ;)


Sad but true. Apple may be on a slow descent, but they have years until their flight path intersects the nether realms of Windows.

Granted, Windows has a usable file manager (Explorer) and Finder is an embarrassment, but there's always Path Finder.


Path Finder is great, but I've always hated how on OS X you have to pay for 3rd party replacements of even the most basic software.


An in-depth comparison would be most welcome. I use Windows (7) all day writing code at work, and use Sierra at home nights and weekends. I do prefer MacOS, but I'm very curious what your issues are because I don't see them. If I really wanted as powerful a (desktop) machine as I could get for gaming, I'd definitely go Windows. No contest. (I don't want that.)

Apple's notebooks are excellent, no argument there.


I'm a huge fan of Windows 7. The UI is simple, everyone is familiar with it, it's relatively stable, but most importantly, basic tasks are so much easier to deal with. In my opinion, it is the best Windows OS made.

Unfortunately, Windows 8 and 10 are just not great OSes, in my opinion. A major theme on these threads is "Apple is trying to force mobile onto desktop computers", yet I see Windows doing it just as much, and not nearly as smoothly as Apple: Cortana, the weird Metro UI layout, terrible settings, confusing maintenance, updates that never work, slow response times, etc.

I gave up on Windows 10 when I got my father a new laptop and had to troubleshoot my sister's. The OS is a mess. Finding basic things is so difficult. Just give me a control panel. Now we have "Settings" too, which is just a mess. Windows Update never works properly. It constantly needs a troubleshooter to run and start and stop the services again.

My dad didn't want to log in with an email to his laptop. So I disabled it (which was difficult to find, so I had to google it.) Now it's suddenly back, and he's complaining about it. There's no way he manually did it purposefully.

Trying to disable things like Cortana never works right. The huge debacle of info being sent to microsoft and needing heavy workarounds to actually prevent it from occurring. And then being force fed apps with advertisements built into the OS...come on.

Apple simply needs to improve their hardware at their price point and 90% of these threads wouldn't be made. The OS needs some additional features and improvements, but the experience is just so much better than Win10 and 8 for me.

My current build uses Windows 7 and whenever it is officially unsupported I will move on from Windows completely.


> Just give me a control panel.

The old Control Panel is still there.

> Trying to disable things like Cortana never works right.

Cortana is a settable option. There's no reason to try to disable it.

> Now we have "Settings" too, which is just a mess.

Settings is a clean, well-organized and simple app that does what most people want. It saves ordinary users from having to grapple with the Control Panel.

You'd probably do better if you spent more time in learn mode instead of in angry mode.


> You'd probably do better if you spent more time in learn mode instead of in angry mode.

And you'd probably do better if you weren't condescending towards other people's experiences and preferences.

You say the Settings app is great, yet it's not great enough to replace the control panel? Redundant settings abound between the two?

There are many reasons to want to disable Cortana, not just turn it off, because turning it off doesn't actually turn it off.


There are an awful lot of settings, but actually, being on the preview version, you notice things move to the new Settings app with every update. Also, links to the Control Panel pop up in the Settings app, and the search usually finds the correct setting. They're getting better.


To me it's just pointless. Have all of the settings in the settings app now so I don't have to go to two places to change them. The problem with settings was that it didn't include everything I wanted/needed to change.

As far as search goes, my Windows 7 machine finds most settings too.


There are a lot of settings, and it would have taken too long to move them all at once.

The Control Panel remains because a lot of hardcore Windows users would have bitched if Microsoft had removed it.

As usual, Microsoft would have been criticized for whichever decision it made. That's life when you have more than 400 million Windows 10 users.


> And you'd probably do better if you weren't condescending towards other people's experiences and preferences.

It was simply intended as good advice.

If you think about it, there are good reasons for moving settings to a simpler, touch-sensitive, sandboxed app.

If you think about it, there are good reasons for moving settings from the Control Panel in stages, and for keeping the mouse-oriented Control Panel program around for people who have grown used to it.

If you've read the book, the anger is a System 1 rant and should be modified by System 2 thinking.


Settings in windows 10 is garbage. Why do we need yet another easy-mode for the control panel? I thought that was what the larger control-panel icon groups were for in XP?


It makes settings more accessible to more people.


Definitely not an indepth comparison, but for me the dealbreaker was when I "upgraded" to Windows 10 and it started giving me spammy notifications asking me to try out various Microsoft products. Apparently it's possible to turn all that crap off, but you have to mess with the settings of multiple applications and dig into the Windows registry. For me, it's easier to just avoid Windows when possible.


The UI is a mess of old and new in Windows 10. Generally, "advanced" options means switching to an old menu style (Win 7 and older).


"win 7 and older"? There are still plenty of configuration dialogues that have only seen the slightest of changes since Windows NT 4.0; I'm not exaggregating.

Windows is easily the most visually inconsistent OS out there -- even in comparison to a Linux desktop. That has it reasons of course, many features are in there that didn't need any kind of update in the last 15+ years, therefore their UI was never updated to conform to updated guidelines and so on.


Yes. Windows 10 is a commendable effort, but nothing more. Seeing multiple styles of typography in different windows is just sad. And ugly. macOS is untouchable, whatever sins Apple has committed.


> Oh please. Not this again.

Exactly. Honestly, it's linkbait nonsense. Sucks you in like a car crash on the side of the road or a hockey fight.


Escher would turn in his grave being compared to that.


I'm not 100% happy with where Apple is at, but I don't think I ever was. But I'm not going to switch.

I get the slickest hardware, great battery life, the best touchpad and keyboard, and a hi-resolution screen that doesn't have battery life, flicker, tint, or scaling issues.

Windows is still...Windows. The bash subsystem is half-baked. Windows itself is still the same mess. Docking is still a joke. Performance is terrible, laptop hardware is still 'almost-as-good', display scaling is a joke, and the ecosystem is still fragmented beyond belief.

Linux is still as bad as it ever was. It's a nice place to visit but I wouldn't want to live there. I've been using Linux on laptops since Redhat 7 (the first one from 2000) and a Dell Latitude C-series.

Somehow I shut my laptop off and it booted back up to a failure message; I had to make another bootable USB and go online to find out how to run fsck. I was using external monitors without issue for a month, and then one day the configuration changed to the point where I couldn't even use the laptop with its built-in display. The sound and volume controls are a joke. I was prevented from installing any packages because of an issue with a package from Google.

All fairly minor, fixable issues, but I forgot all about them because I haven't had to deal with crap like that since I switched to a Mac back in 2010.


> Somehow I shut my laptop off and it booted backup to a failure message, I had to make another bootable USB and go online to find out how to do a fsck. I was using external monitors without issue for a month, and then one day the configuration changed to a point where I couldn't even use the laptop with it's built-in display. The sound and volume controls are a joke. I was prevented from installing any packages because of an issue with a package from Google.

A bit of a rant ahead, but I am really tired of people still treating Linux like it's 1997 or expecting it to work like their old OS did and then running into silly problems that none of us, who actually spend our day in Linux, run into.

These problems sound more like inexperience in using Linux to me than anything else. fsck works automatically after a power failure, configuration doesn't override itself and installing a package from the web, while not a great way of doing things, definitely does not prevent you from installing other packages, unless you somehow broke a whole lot of other stuff by deleting dependencies without really understanding what you were doing.


With great power comes great responsibility.

If you want raw power, you can buy a car with 500 horsepower. But to handle all of that power, you'll need traction control and maybe stability control, and anti-lock brakes for when you need to stop it. But all of that limits the amount of power you have available, limits your control. Now, a race car driver would look at these driver-assist tools and say "I'm sick of people expecting race cars to work like their Jetta and then running into problems that us professional race car drivers don't run into". When you complain that the tires spin or you are having a hard time working the clutch and steering wheel without power assist, they might even tell you "these problems sound more like inexperience with driving race cars than anything else".

And they'd be 100% right. But I'm trying to get to the grocery store, not win the Daytona 500. I want a car with power steering and an automatic gearbox and anti-lock brakes and traction control and an airbag. You may say "computers were designed for power and Linux gives you maximum power!" like the other guy commenting did but I don't want that because when it inevitably breaks (because I'm inexperienced), I don't want to be stuck on the side of the road reading manpages and stack exchange posts with no answers and trying to navigate mailing lists. I have a job to do.

(Seriously, did saying "sounds like you're just inexperienced with Linux" sound good in your head? Because it certainly didn't sound good reading it.)


The thing is, you really don't need to "be stuck on the side of the road reading manpages" these days, and what drives me crazy is the outdated evaluation of Linux; it's like saying "but viruses and blue screens of death" about Windows in 2016.

Does it require some adjustment? Sure, just like any new OS. If you're going to try it, I think it's fair to require you to learn its ways. If you don't want to do that, why not stick with your old OS?

Mac requires adjustment too, has many problems, (ie WiFi drops, file system, temps etc.), however I feel the reason people are willing to adjust is because they paid solid money for it and so want to get the most out of their purchase. Because Linux is "free", there's no such incentive and thus it is judged much more harshly.

Hardware selection is another problem, with Mac you get custom tailored hardware, but with Linux, most people just purchase any random junk PC and expect it to work great, not really fair is it?

> Seriously, did saying "sounds like you're just inexperienced with Linux" sound good in your head? Because it certainly didn't sound good reading it

I am sorry if it sounded harsh, but constantly reading about novice problems I haven't had in a decade also doesn't "sound good reading".


> not really fair is it?

It's not fair. It's very very unfair because even in the PC world, hardware vendors write their own drivers for Windows, or release documentation only under NDA, while Linux is often dependent on reverse engineering by random contributors. And of course the latter can't begin until the hardware is actually available, delaying support.

Unfortunately, that still leaves hardware broken on a large fraction of systems, with only a handful of vendors explicitly designing systems for Linux. It's better than it used to be, but not great.


>> I am sorry if it sounded harsh, but constantly reading about novice problems I haven't had in a decade also doesn't "sound good reading".

It's not a novice problem. I know what fsck is, and I know you're not supposed to unmount filesystems improperly. But you shouldn't be presented with an error you can't recover from without a rescue USB just because your system lost power.

And the video configuration issues are just weird. Monitors plug and play perfectly but all of a sudden it got so screwed up, I had to poke around quite a bit to get things back to normal. Frustrating the heck out of me because I'm used to something that just worked.


I am not saying this didn't happen to you, but it looks like a fluke or a distro specific bug, since I specifically had power outages while running my Arch home server and haven't had to recover from USB.


Usually this happens when the system crashes/otherwise terminates during system upgrades. Kinda hard to boot if /sbin/init is a link to an empty file.


Well sure, but that is true on every system, hence Windows warns "Do not turn off your PC".


Actually Windows can and does use NTFS transactions (TxF) in the Installer and Updater code, so the probability of non-recoverable damage should be lower (which is also my experience; power off a Linux machine during updates and you are pretty much guaranteed to have many packages with empty files, partial file trees and so on; power off a Windows machine and it probably is still fine, but sometimes you get weird issues where it can't properly reinstall/deinstall some updates to try again).


I don't think your analogy is very compelling; if a regular person untrained in driving race cars bought a new race car, crashed it on the way to the grocery store, and then blamed the car rather than their lack of experience driving race cars, I'd definitely find it a bit silly.


Now imagine every race car driver on the Internet was saying "you should be driving a race car, your Nissan is just a toy".


It's strange that you think that's what's happening here; if anything, the comment that the grandparent was objecting to was saying not to use a race car if you don't know how to.


Plenty of people in this comment section are saying that, though, and you get it in every thread about Mac vs PC. When you've got basically the entire Internet telling you "Linux is a valid choice!" and then someone says "Linux isn't broken, you're just not good enough to use it"... well, that's not really going to drive Linux user adoption, is it?

It's a "you're holding it wrong" moment. It's a completely ridiculous argument.


The issue I have with your characterization is the forced equivalence of "plenty of people in this comment section" and "basically the entire Internet" part of it. Don't get me wrong, there is definitely a segment of Linux users that vocally claim their choice is the superior one, but you could say the same thing for Mac and Windows. Given that there's no way to measure the number of Linux (or Mac or Windows) users who aren't commenting, it's a little presumptive to just assert that all Linux users are telling you your choice is wrong.

As an aside, I can't speak for other Linux users, but my goal is most certainly not to "drive Linux user adoption". If someone is interested in learning to use Linux or wants advice about which distro to use, I would happily help them, but I actively try not to convert people who haven't expressed that interest. The reason I chimed in here is that I disagreed with the sentiment that Linux is a terrible choice. It's certainly not the right choice for most people, but there's a difference between being niche and being bad.


Seconding that. I have a hard time understanding why professionals would look down on an OS that gives the user pretty much all the power, which, let's remember was the seminal idea of personal computing. Interestingly, Linux comments are about 5% of the thread, which is also roughly the market share.


I'm kind of more confused at everyone's seeming aversion to learning. Friends ask me why I go through the trouble of running Windows and Linux side by side with virtualization, or why I'm bothering to write my own blog site in Python when Wordpress exists; why are you going the long way?

The answer is always "Because I'm learning cool new shit and enjoying myself".


Sure, but time is a finite resource.

You know when the last time I even thought about my OS was? Probably 8 yrs ago. I'd much rather fiddle with the stuff I'm building than with the stuff I just need to work in order for me to build what I'm building.

So I have no aversion to learning, but I want to focus my time learning the stuff I want/need to learn. And if there's an OS that runs well enough I don't need to think about it then that's fantastic.


Sure, but you had to learn something about this OS when it was new to you, so that now you don't have to think about it. When you switch to Linux, there's maybe a week of adjustment to learn how things work around town, which is really the same for Windows and macOS.

The argument is that if you're going to try Linux, it's only fair to give it a fair shot at adjustment, rather than complaining that it doesn't work like your old OS, because guess what - it's not supposed to.


Not even as a side project or something though, running on some old hardware you were never going to use again? I guess this whole thread is about working environments and precisely not what we play around with in our spare time though.

To be honest, I hadn't really given much thought to my OS for even longer before I decided to go looking myself, I'm glad I picked that fight though. Granted, 8 years ago I was 15, digging in to the guts of an OS for the first time is very different from focusing your time to build what needs building :)


> These problems sound more like inexperience in using Linux to me than anything else. ... unless you somehow broke a whole lot of other stuff by deleting dependencies without really understanding what you were doing.

That's the problem right here. There's a vanishingly small percentage of people who can afford a laptop who are also willing and able to spend the time to gain "experience in using Linux". Sure it's much easier than it used to be, but it used to be a completely unmitigated disaster (I remember things like having to teach myself enough C to fix a wireless driver that didn't want to compile under a newer gcc version) so that's not saying much.


> Linux is still as bad as it ever was. It's a nice place to visit but I wouldn't want to live there. I've been using Linux on laptops since Redhat 7 (the first one from 2000) and a Dell Latitude C-series.

This is so far removed from my world as a daily Linux user. I put Fedora 25 onto a Dell XPS 13 Developer Edition and everything worked perfectly out of the box. The OS looks gorgeous, and I now have the tooling so perfected to my workflow that I would be absolutely gutted if I had to change OS. In fact, I would likely turn down a job if it meant having to use Windows as a daily driver. macOS I could likely get on with OK, but I would miss the customizations I have on my Linux machine.


Absolutely. Maybe it's also just a Fedora thing: installed it a week ago on my laptop that only had Windows on it before, and from prior Linux experiences (Xubuntu & friends) just some ~3 years back, I was absolutely baffled how everything installed quickly without a hitch, without glitches. HiDPI graphics recognition with good default scaling: there, no config-file messing required. WiFi: there. NumPad, multi-touch trackpad, even audio!! You name it, it Just Worked. Still very minor, rare glitches throughout the Gnome environment, but no showstoppers, especially regarding the first-install experience. Major props to the Fedora devs for rounding out such a smooth distro.

If someone gave me a MacBook tomorrow, I'd install Fedora on it, no contest. At least I wouldn't have to get lost in "homebrew ports" and such, or some "Powershell" RedmondUbuntu2017 concoction ;))


I'm surprised HiDPI is working so well for you. See my other comment but I'm guessing you don't run anything in WINE. Yes Fedora is miles better than other distros regarding HiDPI but there's still no way to force per-application scaling in Wayland.

I mean, Windows doesn't get a pass here - you still can't scale HyperV, or anything involving mmc.exe, but Mac's HiDPI support is flawless.


> but I'm guessing you don't run anything in WINE

Now that is true. I can readily believe that Apple invests more efforts than the others in somehow ensuring even old no-longer-updated apps somehow render properly. It really is weird how the OS' font-scale factor seems to be ignored by so many old programs (or their GUI toolkits)!


For gaming, I will be the first to admit Linux has a way to go yet and the coverage is much better on Windows. It is certainly moving in the right direction though, and with recent developments such as vulkan.


Does the XPS Developer Edition have a 4K screen?

I have a standing offer to buy a developer a laptop with a high DPI screen [1] if they can make High DPI support suck less on any distribution.

At least Fedora auto-detected the screen and enabled 2x scaling. GRUB is still unreadable. WINE apps are unreadable. Qt older than 5.6 is unreadable. GTK is unreadable.

1: http://en.chuwi.com/product/items/Chuwi-HiBook-pro.html although there's a newer one I can't find
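For what it's worth, the usual per-toolkit workarounds are environment variables. These are the commonly documented knobs (values here are illustrative, and they won't help GRUB or WINE):

```shell
export GDK_SCALE=2          # GTK3: integer UI scale factor
export GDK_DPI_SCALE=0.5    # GTK3: shrink fonts back after doubling the UI
export QT_SCALE_FACTOR=2    # Qt 5.6+: fractional values are accepted too
# Older Qt/X11 apps mostly only respect a raised Xft DPI:
# echo 'Xft.dpi: 192' | xrdb -merge
```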


> I put Fedora 25 onto a Dell XPS 13 Developer Edition and everything worked perfectly out of the box

You have a 13" 1080p screen correct? Then all text will be absurdly small because Fedora only supports scaling at 1x and 2x but not at 1.5x, which is what that screen size / resolution combination requires for a comfortable DPI. Windows does support 1.5x scaling.


Maybe we have different definitions of 'unreadable'. I'm typing this on an XPS 13 with 1080p screen and 1x scaling (Fedora 25). The text is perfectly readable.


Are you sure you're working off of recent, in-depth Windows experience?

> I get the slickest hardware, great battery life, the best touchpad and keyboard, and a hi-resolution screen that doesn't have battery life, flicker, tint, or scaling issues.

None of these things are universal among Windows machines. I can't remember which model, but a Verge review recently said that a Windows laptop had surpassed the best Mac (in terms of hardware quality).

> display scaling is a joke

It's fine. Everything works and looks good. Sometimes you use legacy apps (which is a feature of Windows -- legacy apps still work) that don't handle scaling well, but the OS helps resolve those issues.

> Performance is terrible

Reviewers, people on this thread, and I have all reported Windows being much snappier than OS X. I actually switched to Windows for performance alone. I hate not having a bash terminal, but it's better than using buggy, slow OS X.


> None of these things are universal among Windows machines. I can't remember which model, but a Verge review recently said that a Windows laptop had surpassed the best Mac (in terms of hardware quality).

Doubtful. Apple does crazy stuff like custom ACPI for Thunderbolt controllers so they can be very efficient. Or having each of the fins on a fan be at a different angle so they have a different pitch and fan noise is distributed across the sound spectrum (reducing total perceived noise). They were also the first (not sure if the only one) with terraced batteries.

> It's fine. Everything works and looks good. Sometimes you use legacy apps (which is a feature of Windows -- legacy apps still work) that don't handle scaling well, but the OS helps resolve those issues.

It's universally known that display scaling is the forte of macOS, and has been for a long time.

> Reviewers, people on this thread, and I have all reported Windows being much snappier than OS X. I actually switched to Windows for performance alone. I hate not having a bash terminal, but it's better than using buggy, slow OS X.

Final Cut Pro works fine (read: smooth) on the anemic Macbook 2016. Try running Premiere on a Netbook and see how that goes. This is the power of macOS/Apple: they only have a handful of devices in each category, so they can hand-optimize every single one of them. So, wrong.


> It's universally known that display scaling is the forté of macOS, and has been for a long time

I have to interject here. Apple was the first one to have actually working and non-horrible implementation.

But the way they went about it has a big performance cost: they can only render 2x versions. So if you use the physical resolution, the one that is "as big" as 1280x800 on the 2560x1600 13" rMBP, they render it at 2560x1600 (1280x800 at 2x) and everything is fine.

But if you use a scaled resolution, say "as big" as 1440x900 on the 13" rMBP, they render it at 2880x1800 (1440x900 at 2x) and rescale it to 2560x1600, and this gets laggy at bigger resolutions, on older devices, or on devices without a dedicated GPU.

Meanwhile, other operating systems can do non-2x scaling and have no problem with resolutions like that.

(Personally I don't care what Windows can do - I'm not using that if I can help it. I just wanted to note that there's this issue with how Apple decided to do scaling.)
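The extra work in the scaled modes is easy to make concrete with a quick calculation (a sketch using the 13" rMBP figures from this thread; the mode names are informal):

```python
# Sketch of the pixel arithmetic behind macOS HiDPI modes on the
# 13" rMBP's 2560x1600 panel.  macOS always renders the UI at 2x the
# logical resolution, then downscales if that render doesn't match
# the panel's native size.

def render_pixels(logical_w, logical_h):
    """Pixels composited per frame: everything is drawn at 2x."""
    return (logical_w * 2) * (logical_h * 2)

native = 2560 * 1600  # pixels the panel actually displays

# Default "looks like 1280x800" mode: the 2x render IS the native
# resolution, so no rescale pass is needed.
default_mode = render_pixels(1280, 800)

# Scaled "looks like 1440x900" mode: rendered at 2880x1800, then
# downscaled to 2560x1600 -- ~27% more pixels drawn per frame, plus
# the cost of the rescale itself.
scaled_mode = render_pixels(1440, 900)

print(default_mode == native)          # True
print(round(scaled_mode / native, 2))  # 1.27
```

That 27% overhead (and the downscale pass) is the lag the parent comment is describing; a compositor that can render natively at 1.5x never pays it.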


> and this gets laggy on bigger resolutions or on older devices or devices without dedicated GPU

I have the original 2012 rMBP, which by all accounts has far too wimpy a GPU for retina, and I've been running it in the max scaled resolution (the 1920x1200@2x mode) since the day I bought it.

On first few versions of OS X after it was released, yes, the graphics often got very choppy (especially stuff like Expose), but Apple has really optimized the hell out of those drivers, and by now on Sierra, I never run into any noticeable 2D graphic lag. I'm sure it's terrible for 3D though.


Interesting. You have 15", based on resolution?

It might be different but my 2013 13" is definitely laggy on max resolution.


Yeah, the first generation was 15" only.

Interesting that later hardware was actually laggier. I even keep my machine stuck on the Intel 4000 graphics 99% of the time.


> Doubtful.

Honestly, it depends on what you're looking at.

In graphics power, for example, ALL Macs suck. Even when you spend $5000 on a workstation you barely get something that can hold its own against top-of-the-line PC hardware.

MBP: 2977 - http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+Pro+450...

iMac: 5714 - http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+R9+M395...

Mac Pro: 5178 - http://www.videocardbenchmark.net/gpu.php?gpu=FirePro+W9000&... (in D700 variant)

Mind, the last one is a high-accuracy CAD graphics card, but if you're looking for raw graphics performance for 3D development, for example, all of those are easily outperformed by a piece of kit costing $190:

Any PC: 8508 - http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+10...

And thanks to Apple being Apple, upgrading is not an option.
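The gap is easy to quantify from the benchmark scores quoted above (only the PC card's price is given here, so points-per-dollar is computed for it alone; a rough sketch, not a rigorous comparison):

```python
# PassMark-style scores quoted in the comment above (higher is better).
mbp, imac, mac_pro, gtx = 2977, 5714, 5178, 8508

print(round(gtx / mbp, 1))      # 2.9x the MacBook Pro's GPU
print(round(gtx / imac, 1))     # 1.5x the iMac's GPU
print(round(gtx / mac_pro, 1))  # 1.6x the Mac Pro's GPU

# Points per dollar for the $190 card alone:
print(round(gtx / 190))         # ~45 benchmark points per dollar
```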


Yes I know, but anyone that needs to do heavy graphics work hasn't been Apple's target market for a long time. That really started to show when they switched to only offering a discrete GPU on the most expensive 15inch MBP.

In general I feel that no one (for now) can touch Apple in the laptop market, although half of the reason is macOS working so damn well on MacBooks. But other companies have other markets on lockdown.. Windows is best for the desktop. Linux (or BSD) for servers. Android for phones.


And yet they were first to market with 5K, and although not exclusive, make a point of citing color management as a priority in displays and software.


That's for photographers and (static) graphical artists. I guess I should have clarified that by graphical workloads I primarily meant rendering.


> Or having each of the fins on a fan be at a different angle so they have a different pitch and fan noise is distributed across the sound spectrum (reducing total perceived noise).

Not trying to take away from the innovation of that idea, but IMO ventilation is one of the few areas where I think Macbooks are strictly worse than the PC laptops I've used. The Macbook I was given for work (13" Pro, 2015) constantly overheats, as there isn't really any vent for the fan to push the air out of. I'm sure it's a conscious design decision not to have a vent (either because it "looks bad" or because blowing out hot air is considered worse than overheating the laptop itself), but I'd personally have trouble counting the way that my Macbook ventilates as a net win in "hardware quality".


The rMBP 2015 vent is at the laptop hinge. If you get the fans to spin up, you can feel the air traveling across the face of your screen with your hand.

That being said, it's still awful ventilation. I tried playing video games on it through Windows and Apple's drivers left the laptop more hot to the touch than I would like. I ended up using a fan curve controller application to fix that.


Yeah, I probably could have been more precise and said "there really isn't any ventilation" rather than "there isn't really vent". The vent is there, it's just completely ineffective.


This, I regularly see 95C while compiling, which I'd imagine isn't doing any favours to a $2000 machine that I expect to last at least 2-3 more years.


I use my 2012 rMBP in a tropical climate in summer (ambient temp 30 C and up) and it's constantly throttling the CPU and it's survived well. Complete hell on the li-ion battery though (time for my second replacement)


> Final Cut Pro works fine (read: smooth) on the anemic Macbook 2016

Yeah, my girlfriend's MacBook would beg to differ, as it stutters on a couple of heavy websites in Safari tabs.


https://www.youtube.com/watch?v=my2I5ge8bqY

even 4k is workable, although not quite as smooth.

Desktop Safari is also notorious for not being as smooth and fast as other browsers because it's primarily built for (battery) efficiency, not performance.


Safari stutters regularly for me on a new laptop. It's just not a very good browser in my opinion.


> Performance is terrible

> all reported Windows being much snappier than OS X

It's odd that this is a discussion point. I have a MBP 17" circa 2011 or so. I have both OSes installed. They are both really snappy. I am not quite sure what all the fuss is about.


Cheap underpowered laptops are really common in the Windows world. My wife had an HP that we bought in 2015 that would take up to a minute to load Chrome on Win10, for example. Everything done on the laptop was an exercise in waiting. I would honestly dread having to help her with her laptop because of how slow every action was. She finally upgraded to a Yoga and it's snappy now.


I've tried to use an app developed in 2014 with Windows's display scaling. Some buttons you need to be able to do anything end up being placed outside of the window, so you cannot click them.


I am using bash extensively on the newest Insider build (so this is coming in the Creators Update); bash is working flawlessly.

I have no problems with scaling with a combination of 4k, 1080p and 2736x1824 monitors. (It's working great, though it's true it didn't work like that a year ago)

With a surface book you get hardware that's even better quality than any Apple laptop, and actually has a detachable screen I can write on (this isn't a gimmick, I'm using it daily as a software developer).

No idea what you mean with saying "Windows is still a mess", maybe give an argument for that?


I think the start menu on win10 looks like someone threw up a bunch of squares and trying to update settings in what was the control panel is now a special kind of hell.

I don't use it ever except when I need to help friends/family though.


This is the update where you get to specify 12 whole hours Microsoft won't forcibly upgrade your device, right?


I live in Linux world (Archville with visits over the border to Debiana). I admit getting things ideal takes a while and every 9 or 10 months something melts down but the cost is a couple of days here or there.

The cost of Mac or Windows is cumulatively huge for me: on both I regularly sit wondering when some incidental operation will end.


"great battery life"? really? that's why I gave up Apple, they aren't keeping up. What's the new Macbook 3-6 hours, maybe? Meanwhile Windows laptops are pushing past 12.


The bash system is incomplete when compared to native Linux, but it's still really good for anything that is purely userspace. I haven't had a userspace issue in many months, and it's only getting more complete over time.


Also - command line usage on Mac OS is far from perfect. While bash itself works fine (as it does on Windows), many command-line tools are differently organized or missing features. (Try adding a user on the command line on Mac OS.)


Same here.

Even though there are some things that are annoying or in bad shape, all in all macOS and the ecosystem make me more productive than Windows or Linux.

As a developer, I also notice that a lot of tools don't work well on Windows. Even things like Docker.


> Linux is still as bad as it ever was.

Not really. It may not be suitable for all use-cases, but it has certainly much improved in the last years.

To give a concrete example, I'm running ArchLinux + bspwm on an Asus Zenbook UX305UA. Thanks to ArchLinux I don't have all the bloat that I don't need, putting me at 10 hours of battery life. The tiling WM allows me to be extremely efficient with desktop and window management. This is why I find it really obnoxious to use both Windows and OS X when I have to.

If you don't need proprietary software that doesn't run on Linux I seriously suggest you give it a try.


> Linux is still as bad as it ever was.

I agree, but why do this? Has it ever led to anything in your experience that was not a pointless and tiresome flame war?


There's definitely a green tint that sometimes develops on my Mac after a long usage and a simple Google search shows that I am not alone.


Hmmm ... I've been running Linux on laptops and desktops since 1997 (yes, really). My current laptops are 1 Mac OSX and 1 old linux machine of similar spec. My current desktop is a big honking many core, huge ram machine, with lots of SSD/disk goodness, and a nice recent Nvidia card.

The only problem I've had with this desktop has been the occasional hissyfit of compiling the Nvidia driver against a modern kernel. That's a story for another post, but the Linux systems I've built and used for my day-to-day work are and have been rock solid, without me sitting there struggling to make things work.

On the laptops, I've not had an issue in like 5 or 6 years, apart from a single Wifi driver ... which also didn't work in windows. So I added a simple Panda nub and off to the races I went.

Really, from my own personal experience, the OSX interface is a pain in terms of how it maps against keyboards, plus there's no sloppy-focus equivalent in OSX.

From a programming standpoint, the toolchain on OSX is maddening (I am looking at you, Xcode ... every (&(&^&^&* update, you have to reapply the click-this-license-button). I use homebrew (having left MacPorts in the dust), as I want stuff to work. And then along comes an update which just breaks everything.

And then the SMB stack in OSX. Yeah, the SMB stack. Using a lowest common denominator for our office meant using crappy file system technology, but thanks to either licensing issues or a NIH mentality, OSX has an almost completely neutered SMB implementation. It is awful. Painfully slow even over fast connections.

Then there is the desktop experience. Many of the customers of my previous employer are creative types who used to use Macs for their video work. Most are migrating to Linux, some back to Windows. They are done fighting the battles (don't shoot the messenger, this is what they tell me).

I know the article was clickbait, and had a number of issues. But blasting Linux, which is a perfectly good, and very functional OS for many by rehashing issues from 10 years ago ... might not be the best approach for advocating your view.

FWIW: I run windows. In a window. Having had the joyful experience of self-destructing windows installations on wife/offspring's laptops, I've sworn to never again allow a windows OS to run installed on bare metal. I've taken wife/offspring's old laptops, virtualized them, and they can run their old stuff in a VM. They use Macs now, and I won't try to force them to change, because they just work for their use case.

Linux is a great OS for desktops (ignore the haters). Linux Mint is definitely one of the best user experiences I've had on a system so far, for day to day use.


Yeah, agreed. I use Mint nowadays, too. I started a couple of years before you did on Linux. It was painful until about Ubuntu 7.04. And then it was very slick pretty much from that day forward. Funny story: I built a gaming machine recently (specifically for Star Citizen), and Windows 10 failed to find my (onboard ASUS motherboard) NIC, and I had to iterate through a few versions of driver for my ATI/AMD 7790 GPU before I found one that was stable with Windows. I realized that day that I hadn't actually had a single hardware problem on Linux for something like eight years. It was a funny and strange feeling, hitting a hardware brick wall with Windows 10, of all OSes. Then you've got all those really annoying things about it, like the whole update thing. I paid for a "Home" (or whatever) license, but when I realized I couldn't stop Windows from crippling itself by automatically forcing updates of the fucking GPU driver I decided to go back to a pirated version of "Ultimate" (or whatever it's called) just so I could keep this shitty OS from killing itself. LOL!


More FOTM moaning about the state of Apple.

I can't believe people keep reading these "I left Apple and here's why you should care" articles by front end devs with no real idea of what goes into OS dev and where macOS has come from since the days of OS8.

Give me a break. Surely if you want to use Windows you don't need to write 2000 words in the form of some needlessly pronoun-heavy (because your opinion is really important) diatribe telling us all how detached Apple is from reality and how Windows is suddenly the 'place to be' because it now has bash support. Tell me again why I should drop a fully featured, mature *nix shell for one bolted on top of Windows.

Do people get paid to write articles like this, or are these devs so full of their own self-importance that they feel it's their duty to inform us of their opinions on the state of macOS and why it's suddenly so much worse than ever before, when in reality macOS has never been more stable or developer friendly?

It's honestly so predictable I could have guessed 90% of the content of this article just from the sensationalist headline alone.


Welcome to the Internet, where people express their opinions on matters that are often trivial in the grand scheme of things. I trust you are equally annoyed at posts praising Macs?

> macOS has never been more stable or developer friendly

The OS is fine, although largely in maintenance mode. The hardware value is poor and has gotten significantly worse in the last few years. I'm typing this on a retina iMac that I'm very happy with, but if I had to replace it there's no Mac that I would consider now that Windows has decent high-DPI support.


> now that Windows has decent high-DPI support

YMMV, but this is why I still use a Mac--because even Windows 10's high-DPI support is messy and not great. Applications act inconsistently and Windows is straight-up bad at handling different DPI on different devices (and when you've got two 27" 2560x1440 panels and the laptop's own panel, this is a pretty significant problem).


You're lucky with only 2560x1440; try a 5k screen on Windows 10. Most (all?) modern apps work fine with Win 10 DPI scaling, but a lot of the older or less supported things like headphone software (looking at you, Logitech) have zero DPI support and are TINY, or are scaled up but pixelated (looking at you, Steam! Steam has updates like every 2nd day, but has yet to update the UI for proper DPI scaling).

Logitech headphone software on 5k screen vs Chrome -> https://drive.google.com/file/d/0B4joq2oW_zHBLXJLa2d1MEhlbTJ...

Sorry, rant over, had to get it out.

But I do use both Mac and Windows regularly and they both have their quirks, but personally I find Mac to be more polished, but Windows is catching up and Apple has been slacking recently.


Right - I meant that the big monitors are standard DPI, the laptop isn't.

On OS X, I've literally never thought about DPI.


I have a 4k monitor and a 1080p monitor connected to my Win10 desktop I'm typing this from. The DPI scaling works pretty damn well, apps switch DPI settings the instant they pass the boundary between the monitors. Sure, sometimes it breaks when games etc change the resolution on the 4k monitor, causing a window to be unscaled on the 4k monitor, but that's fixed simply by moving the window to the 1080p screen and back.

What trouble did you have? It's pretty great for me.


Controls jumping around when windows move between displays. Windows spawning on the laptop and not rescaling when brought to the low-DPI displays. It's just a stack of problems.


> macOS has never been more stable or developer friendly

* I have to turn off gatekeeper to run unsigned apps.

* I can't write into /bin or /usr on my own machine without flipping some magic option

* I can't run gdb without some complicated signing dance I have to do every time I update it.

* I can't run dtrace without rebooting and switching some secret flag off. I have to tell other people to turn the same thing off so they can dtrace applications.

That's just straight off the top of my head. The dtrace and gdb things are particularly annoying, as it makes life harder for me to get other users to do simple debugging tasks, and there is no simple workaround, just complicated instructions.


> I have to turn off gatekeeper to run unsigned apps.

Right-click on the unsigned app and select "Open". You only need to do this the first time you run the app.


If you're a developer or tester, this means every single time. Because every start is the first time for that build...

Regarding the gdb signing, it's painful. Even more so is the privilege escalation--it's impossible to debug over ssh with gdb or lldb since the GUI prompt is on a different machine. Not having the prompt in the terminal where the debugger is being run is asinine. I had to switch to debugging on FreeBSD to avoid the pain of all this; it's madness.


I agree this is annoying, but regarding gdb: lldb works fine, and it's not Apple's fault that gdb hasn't automated the complicated signing dance. In my experience, gdb is broken on macOS in several other ways too...


The days of OS8 were over 20 years ago.

Even OS X 10.3, which was when OS X started to get really good, was released in late 2003, almost 14 YEARS AGO.

I know people who have been on Macs for 20 years, since OS8 or 9 days, who have bought non-Mac hardware and won't get another MacBook - that is significant and Apple should be paying attention to that.

Apple is not gated by their software costs - they clearly make a lot of money from their desktop, mini, and laptop sales.

So why not produce something truly excellent instead of merely adequate?


> where macOS has come from since the days of OS8.

Downhill in many ways that I care about. Someone else posted elsewhere in this thread that they can no longer drag a photo directly from the Photos app to Pixelmator. That would be unimaginable in the old days.

I care a lot about a consistent Finder UI, about the Clipboard and Dragboard. Those became a little less logical when OS X 10 was introduced. As you know, many other features became much more powerful, so overall it was easily forgivable.

Unfortunately, since Tim has been in charge, the lack of logic and consistency has become a real problem. Nobody at Apple seems to care about their own "Human Interface Guidelines" anymore.

It's like Apple compares Mac with iOS, and decides that mystery meat navigation, and inconsistent UI, are no big deal. As long as it's no worse on the Mac than on iOS, Apple is fine with things.


Oddly enough, I found that reading the article carefully made me consider switching to Windows LESS.

The author's primary issue seems to be the fact that Apple hasn't been packing the punch that is necessary for him, and that especially with his wanting to get started with VR the entire platform is a no-go for him atm. No arguments there.

But curiously, he spends the bulk of the article either explaining or talking about the things he had to compensate for or now has to deal with as part of his switch: the fact that apps aren't as polished (and neither is the OS itself), issues with installing drivers, Windows Bash that doesn't work quite 100%, etc.

I switched several years ago as I was finishing college, and like the author am extremely disappointed in the fact that my MBP can't really do it for me when it comes to VR. Yet, as per his article, it seems that in just about every other meaningful way the Windows experience for me (and him) is worse.

More broadly speaking, just about every article I have read touting how great it was to switch "back" to Windows seems to follow this general trend; it seems as though everyone who switches is glad to be using machines with more juice behind them, yet it is clear the computing experience itself is worse. I can't help but feel that had there just been better graphics/ram options on the current mac lineup, that pretty much would kill any of the justifications I see for the switch.


If there's anything people expect to stay constant about Apple, it's that they will try to tightly control their ecosystem and charge a premium for it.

If Apple just stopped doing those things, I'd consider running OSX, but they won't so it seems like an entirely pointless exercise.

It's the same iOS vs Android debate, Apple charge a premium but promise things will be significantly better, if they are only marginally better then it becomes harder to justify their premium and lock in.

I think the recent spate of articles are more about the fact that the gap has been narrowing significantly, partly due to Apple's neglect, partly due to Microsoft doing better, so the balance is changing for people on the margins.


You are right-- you can argue about which way to go, but it is undeniable that the gap has narrowed. Depending on the way VR/AR develops in the near future, that will likely be another battle-line over which they will compete, and it is clear that at least at the outset Microsoft hold a massive advantage.


I particularly found the "x has given up" quickly followed by "but Electron apps are making Windows better!" to be quite odd.


To be fair, when memory, processor speed, and battery life are not an issue, Electron apps are usually good enough. That said, I always cringe when I have to use them on a laptop, no matter the OS.


> The progress in macOS land has basically been dead since Yosemite, two years ago,

And the progress on Windows's side is what exactly?

For end users, as opposed to business customers, practically nothing changes. I swear, with every freaking version of Windows nearly everything about the user experience is the same as before. It looks nicer at first glance, but once you open some apps, particularly the built-in Windows tools like msconfig, Disk Management and such, you find yourself with UI from ten years ago.

Just look at the file copy window. What a joke. They made it look a bit nicer, they added a fancy animation while copying, but really, it didn't change at all. It still sucks majorly at giving you a proper estimate of the time it will take to copy a file.

The single biggest change for me in Windows 7 was the ability to use Win + <number> to switch quickly between apps. It's incredibly useful, and that's pretty much the ONLY real change in my day-to-day experience of Windows compared to earlier versions.


> And the progress on Windows's side is what exactly?

Safety, performance, stability, easier UI (for the average user).

Yeah, none of these things feature well in TV advertisements, but compared to its predecessors Windows 10 actually does excel in all of those. Heck, I installed 10 on my parents' cheapo PC and, with the same programs installed, it actually is more responsive.

> It stills sucks majorly at giving you a proper estimate of the time it will take to copy a file.

That's not a software problem, that's a hardware problem. Particularly on SSDs you have to deal with deletions being surprisingly slow, and in the copying process itself you often have a very fast phase at the beginning when it's just slurping the file into RAM; then it drops off when it runs into the write limit of the target medium and/or runs out of RAM to use as cache. If you're copying to the same medium you get an even stronger drop, since reads and writes hit the same device.

Predicting this is HARD.

The only way to get reliable predictions out of the copy dialog would be to disable memory caching while copying, and uh, you kinda don't want that. It would just be predictably slow.
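The burst-then-drop behaviour described above is exactly what defeats a naive ETA. A toy model (all throughput numbers invented purely for illustration) shows the estimate quadrupling the moment the cache runs out:

```python
# Toy model of a file copy whose first chunks land in the RAM write
# cache at burst speed before throughput collapses to the disk's
# sustained rate.  All numbers are made up for illustration.
total_mb = 1000
chunk = 100

def speed(copied_mb):
    """MB/s: first 200 MB hit the cache fast, the rest go at disk speed."""
    return 500 if copied_mb < 200 else 100

copied, elapsed, etas = 0, 0.0, []
while copied < total_mb:
    # Naive ETA: remaining data divided by the *current* throughput.
    etas.append((total_mb - copied) / speed(copied))
    elapsed += chunk / speed(copied)
    copied += chunk

print(round(etas[0], 1))   # 2.0 -- "2 seconds left!" during the burst
print(round(etas[2], 1))   # 8.0 -- balloons once the cache is exhausted
print(round(elapsed, 1))   # 8.4 -- actual total, 4x the initial estimate
```

No amount of smoothing fixes this, because the early samples carry no information about the disk's sustained rate; the dialog would have to either probe the target medium or, as the comment says, bypass the cache and be predictably slow.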


Unlike the article, I'm not as interested in new features. I really want to stop having to type killall Finder because I dragged too many files. It's like Apple doesn't actually use its own products for day-to-day work. The whole "Save As" fiasco really screwed a lot of workflows, which they sort of fixed later, but not for Preview. Networking has also been a constant issue. I just wish I didn't keep writing "still an issue" in my radars.

They need Snow Leopard 2, and someone who cares about PC hardware.


>I really want to stop having to type killall Finder because I dragged too many files.

For what it's worth, I think this will be addressed by the switch to APFS.


I don't think it's the filesystem. Finder itself seems to be the problem, and it started with Mavericks, around the same time AppleScript stopped updating Finder windows without a close/open.

Another fun problem is copying a bunch of files out of a directory that Finder has currently opened and watch Finder go catatonic. I get the feeling someone did a lot of programming in Finder without proper testing.


The question I'm left asking from posts like these is:

As a non-game developer, what practical advantages does Windows hardware and software provide over Mac?

The writing doesn't make it clear to me. Sure, there are some interesting facts noted, but there's no connection drawn or relationship established between the facts and the impact on the author's ability to work, or the work product itself.


Cheaper and faster, broadly, which has been the same for as long as I remember: for an equivalent price, you can get a much more powerful PC than you can a Mac (though there are exceptions: the surface book matches up closely with a 2015 macbook pro I believe).

After that, it depends on what you are doing. I develop using a PC, macbook (for iOS builds usually) and a Linux machine (running mint).

Linux is best for Node/JS heavy development, IMO, largely because of the long filenames in Nodeland and the general linux-first documentation and help. Recently I have found its also pretty good for Dotnet Core development. The OS is nice as well as in that sort of bare-metal coding, it kind of gets out of the way.

Windows, either with VS or VS Code, is for things that require a lot of power or are huge (usually related): Docker stuff is easier, solutions with hundreds of projects, etc. When I want to throw code against a tiny god, the PC is my go to.


> Cheaper and faster, broadly

Presumably the author has already purchased the Mac, in which case, s/he's already obtaining the benefit from that purchase. To rationally justify the replacement cost (which is not just the cost of the hardware itself, but the opportunity cost of learning the new platform, purchasing new software, etc.), the benefits of switching have to be greater.

It's those benefits--not necessarily of the platform itself, but of switching--that aren't clear to me.


My reading of the author was that he (Owen Williams) switched when he needed to upgrade. He had a Macbook which suited him for a few years, and now needed to decide to go with the late 2016 models or buy into PCland (which gave him much more bang-for-buck).


Gaming, VR, upcoming 3D and Cortana are a few reasons I can think of why someone might pick Windows over MacOS.

I think you're discounting the performance aspect as well - if workload X is too slow on the fastest mac, it may well still run on a windows machine since they can be specced higher. Cost isn't a factor for everyone - sometimes you literally just need the fastest possible machine, in which case Windows can be it.


>Gaming, VR, upcoming 3D

Am I missing something or does this seem like one thing? Or is there use case for VR that isn't gaming? And what is 'upcoming 3D' exactly?

>if workload X is too slow on the fastest mac, it may well still run on a windows machine

Can't argue with that, but what are such workloads if you are not game developer? Wouldn't cluster of Linux servers always be a better choice? Unless of course you are die hard Windows fan.


I was thinking media editing/consumption (e.g. Video production, music making etc.). Presumably the Machine Learning GPU crowd get some use out of Windows machines too


> Or is there use case for VR that isn't gaming?

Training and design.


> As a non-game developer...


3D/VR design and creative activities (or even just consumption), Cortana, higher performance media editing/creation... plenty of possibilities.


I can answer this. (Background: I was a PC user 1989-2007. MBP user 2008-2016. Back to PCs from 2017 onwards.)

I do machine learning development. It is very helpful to have a CUDA-compliant GPU (read: NVIDIA) on your local machine so you can develop and run things quickly. Yes, you can use AWS, but that is costly, you need to push your data into the cloud, and you definitely cannot do it w/ sensitive data such as health records/medical images with localization clauses. None of the major machine learning platforms work well on non-CUDA compliant GPUs, most are barely functional outside NVIDIA cards.

I waited over a year for the new MBP and was disappointed to see no NVIDIA GPU on them. I love the MBP, it is an absolute pleasure to use. But the new line had barely any horsepower improvements. It was almost a step back from the 2014 MBP I already owned. Further, the cost was obscene for the value/improvements I was getting.

I'm not happy about it, but I purchased an Alienware w/ a hefty NVIDIA card and 32GB RAM. It is a beast, but it meets my work needs. I do miss the agility of a MBP (e.g., carrying it in one hand to a cafe or onto a plane), but ultimately horsepower is needed more.


Why not spin it around the other way? What advantages does Mac now have over Windows, in a world where Microsoft is catering to developers from both sides?


I don't think it's "Mac vs Windows" specifically. It's more that the compelling advantages that the Mac once had over _every_ other platform are being eaten away.

My first Mac was in 1992 (a IIsi). Since then every computer I've bought (servers aside) has been a Mac.

Last week my MBP died. The battery overheated and 'burst', taking the case and trackpad with it. I'll still have a Mac desktop for development and cartography, but I need a laptop, and I simply can't justify £1250 for a MacBook.

Instead, for the first time in 25 years, I'm buying something else: a Chromebook, on which I'll install Ubuntu. It's pretty much the same weight (1.19kg vs 0.91kg) and size; it does word processing, web browsing, PDF viewing, and Ruby hacking just fine; and it costs £1,000 less. Sure, the performance is much, much worse, but for my uses that's fine.


* Actual UNIX, as opposed to a hacky compatibility layer.

* Generally nicer hardware even compared to things at the same price point. Don't forget that hardware is more than RAM and CPU specs. There are some notable exceptions with pretty nice hardware, like the surface pro tablets.

* More friendly UX all around. This is somewhat subjective, of course, but if you'll recall a while back there were some big corporate IT departments who explained that IT costs were much lower for Mac users. They chalked this up to a combination of increased user friendliness, lower virus risk, and increased reliability.

The only activity in my life where I actually choose to use Windows is for FPS games. However, that advantage is disappearing; over half my steam games support mac now.


Don't take the bait! Otherwise this thread will devolve into a petty OS war.


I work on Windows and have a Mac at home. Windows looks cool at first but when you actually need a workhorse the MacBook is still the best option in my view. Windows 10 and the Surface Book look really nice on paper but once you start using them they have a lot of little problems that make an unpleasant experience.

So in my book the Mac is still best but I also agree that the direction Apple is going doesn't look too promising.


I too use both. OSX at work, and Windows (7) at home. They are both bad, neither is the best.

I remember spending time trying to figure out how to disable the animation that plays when switching OSX virtual desktops - only to find out that you literally can't without code injection or a third party app - rendering the virtual desktops on OSX entirely useless.

The fact that this a) is the default behavior and b) is not configurable, is just another in a long line of awful OSX UX and it only gets worse with every update. El Capitan changed terminal font smoothing to something woeful which was also a struggle to revert.

Honestly the only way I have found development possible on native OSX and Windows is by stretching a terminal(/putty) to the full size of the desktop and handling all window management with tmux.


System Preferences > Accessibility > Display > Reduce motion. You're welcome.


This has always been an issue with both iPhone and my MBP (though I just got them last year). There are _so_ many options in Sys Prefs and otherwise, and there typically is an answer to problems you're having, but they feel hidden...

Pretty minor issue though, I still love both. I'm just not sure if I'll love the new versions everyone seems to be whining about :)


Because this is an article about abandoning the Mac, not Windows. If it were the other way around, then that would be an equally fair question.


If you "can" switch between Windows <-> Mac, why not give linux a spin? Like, if you're not building apps with native UI kits on those platforms, what is holding you on them?


Lack of compelling evidence that there's a *nix distro out there that will serve me better than macOS.


But are you switching to Windows? Or have you recently switched from Windows to Mac?


I really recommend looking outside the Ubuntu ecosystem, (Arch is a personal favourite), if you ever decide to try it.


While I don't want to hijack the op's thread, I would like to hear more about the advantages of the mac. I listen to some mac-centric podcasts and hear them complain about the current state of the mac so much but when they mention windows they just say vague things like "but windows is worse." So what is better about a mac?

To give more detail on my use case: I'm not a developer so things like terminal or unix don't really matter to me. And I don't think I've had a virus since the 90s.


> so things like terminal or unix don't really matter to me.

That's really what it's all about though. Many developers prefer a unix environment. Windows is often neglected in open source, so getting up and running with a working dev environment can be a challenge. So, for many, Windows just isn't really an option.

That leaves macOS as the only option that's a first class citizen in all regards -- hardware support from manufacturer, support in open source projects, support for proprietary applications (I.e MS Office), etc.

For non-programmers though, there really isn't a meaningful difference. Take your pick.


I wonder if and how quickly this will shift with Bash shell support in Windows 10. Unfortunately, there still isn't a great terminal emulator on Windows. ConEmu looks really sad next to iTerm2.


If I already have a Mac and I'm invested into the Apple ecosystem things on Windows needs to be 10-100 times better for me to make the switch.

If you are just starting out or you aren't in Apple's ecosystem (e.g. you don't have iDevices), then Windows might be the place for you. I'd still go with Linux personally, but that's just me. Or if you just need more power - though I'm guessing most people here don't actually NEED that much power.


Because why abandon anything if it's better than the alternatives?


It doesn't constantly give me "Windows Explorer has stopped working"


I don't think that's been a thing since 2006.


We get similar arguments on the Linux side, which haven't been true for a decade.


I get that every other day, but to be fair, it's on a remote machine connected via Citrix.


Sometimes macOS reminds me of those times when Explorer could stop working:

- If I have Sublime Text maximized on another display, my mouse cursor gets stuck as the default pointer, no longer changing with context.

- If I have the same Sublime Text maximized on another virtual screen, my mouse cursor gets stuck in text select.

- The option to reorder Dock icons via drag and drop goes away randomly.

- When I plug a display into my MBPro 2016, the menu bar has a white square where the tray goes.

- It gets stuck in a boot -> kernel panic -> restart -> boot loop when I start it with a display plugged in via their HDMI dongle.

- At random, Finder stops showing changes in the filesystem. This happened to me in Mavericks on my old laptop, and it keeps happening in Sierra on the new one.


A PC (yes, true, technically Macs are now PCs, but they're still special PCs, or at least Apple says that) has mainly the advantage that it's cheaper while being faster. Now, to be fair, I don't like to use laptops (they're slow, limited and annoying), so I mostly refer to desktop hardware (best bang/buck ratio, easy to make very, very silent, very fast, easily extended).

Quarrel over operating systems is pretty pointless, if it doesn't run natively, slap a hypervisor on it and run whatever you like in a VM; for me that'd be a Linux or BSD, I'm not really opinionated there (although certain things run faster in Linux due to experience).

(Obviously I don't grok the laptop lifestyle; I don't understand the "work everywhere" mentality, and in fact I don't want it at all. It's called a work place for a reason - at least for me. I do have a laptop (or half a dozen, who has time to keep count?); the mainly used one is a Getac unit with a carrying handle that weighs ~3 kilos or so. At least it's silent, has a superbright screen, and a battery that's larger than most ultrabooks by volume ;)


> A PC (yes, true, technically Macs are now PCs, but they're still special PCs).

You obviously don't understand what those two letters stand for (hint: it starts with Personal and ends with Computer).


well, if you do stuff in machine learning, then being able to put in some good graphics cards is a big deal.


One of the advantages is that you can install Linux on it. Generally, it is much easier to install Linux on a regular PC than on Apple hardware.


touch screen, convertibles, pens, price, performance, and better battery life (if comparing to the most recent macbook generation)


Touch screens and pen support, use the same device for development, creating and for browsing in bed.


Modularity, extensibility and price would be a few advantages.


1K $


Hilarious rant!

* talks about developing VR apps, shows screenshot of editing html

* complains about lack of innovation in macOS, has zero ideas of things he lacks

* praises slack for being a great windows app

* is impressed by bash for windows

* never considered using Linux

check, check & check!


It is so funny to read these Apple vs. Windows discussions when you live outside of North America...

The truth is that, for the rest of the world, "Apple platform" means only iOS, only the iPhone, nothing more.

In Asia, Africa, Latin America and some parts of Europe, Macs and OS X are a rare species; most people spend years without even seeing a Mac computer "in the wild". They are used only for graphics editing and iPhone development.

These discussions only show how exotic HN is, a Silicon Valley bubble.


How do tech conferences and meetup groups work without a sea of macs?

But it might be better to say HN is a tech bubble, because everywhere in the world that I've been, people who work on web tech use Macs; for most of us the job would be almost impossible without a *nix OS.

And this post was written (somewhat regretfully) from a Mac in Aotearoa/New Zealand, where non mac laptops are somewhat rare in the tech community (and are then usually Linux rather than Windows).


Here in Germany I see Mac products every day. Even in the train there are more MacBooks than Windows Laptops.


For "PC" (laptop and desktop), the market share of Apple worldwide is somewhere between 6% and 10%. Linux is ~ 1 to 2%. The rest is mainly Windows.

Interestingly, I could not find Germany on http://gs.statcounter.com/os-market-share/ but here is Switzerland, which has one of the highest OS X penetrations in the world (the highest?)

http://gs.statcounter.com/os-market-share/desktop/switzerlan...

I don't have the figures for laptops only, but I don't believe for one second that OS X is above Windows in any country (of non-trivial size).

Maybe in certain extremely local areas you see a higher concentration of Mac laptops. Even then, I challenge you to count carefully and come back to us if you still find that Mac laptops are more common than laptops running Windows. Of course there are some factors that can bias in favor of Mac laptops, depending on where you look. While that can be interesting from a socio-economic POV, keep in mind that Windows market share is, on average, consistently far larger than OS X's, on both desktop and laptop computers.


Germany has less Apple interest than the rest of Europe: http://gs.statcounter.com/os-market-share/desktop/germany


I'm sure some of this has to do with wealth and the ability to pay for intangible, but definitely positive, experience premia. Go into a cafe in New York City or a college library and you'll see a sea of MacBooks; I might say 90% MacBooks (I'm a part-time grad student). The situation is very different in the workplace, especially in lagging industries. In finance and management consulting, my past careers, I saw 99% PCs.


Apple has also been aiming heavily towards students, to the point of setting up mini-stores in university libraries (never mind offering discounts).


My casual observation: most commuters don't use a laptop in the train. Those who do, mostly Windows. Apples seem fairly rare, as in, every 20th unit or so.


That is not true. People clearly use more Windows laptops in Germany.


You can both be correct depending on sampling.

UK Birmingham: 0700 trains to London (corporate types), Thinkpads, Dells and a few HP laptops all with Excel/Outlook in operation. A few hours later (students and younger people), it changes to tablets and Macs. Mostly watching films or doing Web stuff. And lots of phones.

I'd imagine a large percentage of Windows licenses in any country are for desktops sitting in offices (1000+ in my fairly small organisation alone).


People use more Windows laptops than Mac laptops everywhere. Apple's marketshare is like 25%.


Way lower than 25% actually.


Uhm, no. Here in Norway, Apple is huge. Go to a major Asian city and there are MacBooks everywhere. Sure, there's a higher percentage of Apple users among some devs and in places with higher incomes, but to call a major product from one of the most valuable companies in the world "exotic" is absurd.


Again, I suspect it depends on when and where one looks around (never mind that every Mac has that "WITNESS ME!" spotlight on the lid).

There are certain industries that are very heavy Mac users, media production in particular. They crop up in the weirdest of ads because the production staff use them, for example. But for the rest of the nation it is Windows all the way.

The only people I know personally who own a Mac are musicians or people in the ads/marketing/graphics business. And I think it's the latter, as much as the *nix internals, that has made the Mac a web dev fixture.


I live in New Zealand (at last check, that was outside North America). I see Macs all the time.


Marketshare is similar to the US at around 20%

http://gs.statcounter.com/os-market-share/desktop/new-zealan...


To be fair to HN, Apple laptops are very common in American universities too. It's not a purely Silicon Valley thing.


This is the view you have in most universities in the US, Canada and the UK http://i.imgur.com/xzEyLQB.jpg


I don't think that's representative of the UK.

Students don't tend to have money to spare for a Mac, so they buy a cheaper Windows laptop instead.

A Macbook is 15% of a year's money, which is better spent on rent, food and beer.


For those unfamiliar with US geography, mizzou is not in California.


Have you ever been outside of North America? I see Macs all over Europe all the time, and not only in developer-friendly environments. Buses, trains, coffee shops...


I live outside North America and I don't see Macs all over this Europe: http://gs.statcounter.com/os-market-share/desktop/europe


Basically this. Never mind that you could walk past piles of Lenovos, Samsungs, Acers, Asuses, etc., and never notice. But Apple, thanks to that glowing fruit logo on the lid, basically screams "witness me!".


And in India where all the developers live - Apple MBPs are everywhere


What "India" are you talking about? The India I know is http://gs.statcounter.com/os-market-share/desktop/india


I think you need a reality check. Even here in South Africa, Apple's computer market share is steadily increasing. In general (and this applies to most countries outside the US), those who actually care about their computers (as opposed to seeing a computer as just a fancy typewriter) and have the budget to justify it, will often buy Macs. In terms of developers, I expect the market share to be even higher. In our company we have about 50 people and we're all on Macs (Airs and Pros).


PC penetration is about 10% in Asia and Africa. Is it exotic to talk about a gadget which 90% of the population does not own?


Huh? There's macs everywhere I look.


I live in Hungary, and I see MacBooks all over the place, albeit typically older models. The only Touch Bar MBP I've seen is mine.


macOS is at a standstill in terms of usability, but I feel that Windows 10 has gone backwards.

Microsoft has chosen to focus on enterprise. This is probably the right move for their culture and the future of the company, but the remainder of their consumer products seem to be turning into an ad-supported model heavily focused on legacy support.

Windows may have a new coat of paint, but the underlying OS just gets buried under more and more simplified UIs which, ironically, make it harder to use.

This is really desktop Linux's time to shine, but its usability is still sporadic.


Agreed.

I can't believe how much worse Windows 10 is than 7. And all the good features of 10 could easily be integrated with the 7 UI.

It's very frustrating that there isn't another player in this space. I don't mind fooling around with linux but sometimes you just want to buy a laptop and go without all the extra crap and most of the consumer space won't go through that anyway.


I feel like Fedora has a fairly good chance at being the distro to make it big (among other things, it actually works with GNOME software centre), as long as the community can actually unite behind it.


Agreed on Apple dropping the ball. But Windows 10, while certainly better than modern OS X or Windows 8 does not have the stability or adherence to well-tested design that Windows 7 had. W10 has seen serious quality assurance issues[1], partly as a result of Microsoft cutting large amounts of QA testers as it switched development methodologies.

This has led to an immense number of mistakes, bugs, and unhandled cases seeping all throughout Windows 10.

At its core, W10 is still a great OS, but the lack of proper testing (and its massive, scary levels of privacy invasion) is something to be aware of.

[1] http://www.computerworld.com/article/2859902/at-microsoft-qu...


I do not agree that Windows is better than OS X. In my own personal opinion, I feel strongly it is the other way around. However, that is only a personal opinion. The Windows alternative clearly wins when it comes to hardware: you can choose the components you want, at competitive prices, and you can update them whenever you want. Also, Apple hardware's aesthetic advantage is not as big as it used to be. Many manufacturers can create thin pretty laptops now.


What makes Windows 10 better than macOS Sierra, in your opinion? I use both every day for similar projects and find that Windows 10 is lacking compared to Sierra overall.


It is really difficult to have a constructive discussion about OS X and Windows, mostly because the two projects have mutually exclusive goals and users feel their interests are aligned with one side.

To name an example, most of the people who prefer OS X complain that Windows lacks aesthetics (i.e. font rendering, high-DPI support, etc.).

On the opposite side, most of the people who prefer Windows complain that OS X lacks support (i.e. a huge library of software, backwards compatibility, etc.).

Unfortunately, BC and aesthetics are mutually exclusive. Take support for high DPI under Windows as an example:

Only the modern stacks like WPF or UWP are DPI-aware by default, and the amount of software built with these stacks is relatively small. Most of the software in the wild is built with stacks like GDI/MFC or WinForms, and it will always look "ugly" in high-DPI configurations.
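To make the opt-in concrete: a classic Win32 app declares DPI awareness through its application manifest. A minimal sketch (element names follow Microsoft's manifest schema; apps that skip this declaration get bitmap-scaled by the OS, which is the blurriness people complain about):

```xml
<!-- app.manifest: declares a Win32 app DPI-aware so Windows
     doesn't bitmap-stretch its UI on high-DPI displays -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <application xmlns="urn:schemas-microsoft-com:asm.v3">
    <windowsSettings>
      <dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
    </windowsSettings>
  </application>
</assembly>
```

Retrofitting this onto a decades-old GDI/MFC codebase is the hard part, which is exactly why so much legacy Windows software never does it.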

Apple is the kind of company which would demand developers to update their software, Microsoft can't afford to alienate its developers.


>On the opposite side, most of the people who prefers Windows complains that OS X lacks support (ie. a huge library of software, backwards compatibility, etc.).

I've been very impressed with Mac backwards compatibility. Old Apple hardware (going back nearly a decade) still works great with the latest macOS Sierra, and I have decade-plus-old utilities and custom scripts working as fast as or faster than ever before.

There are some custom scripts I've had to tweak over time due to Apple's increasingly locked-down security measures within the OS, but that's very much worth the small amount of time I've spent tweaking them, and I appreciate the better overall security.

It's rarely the case that there's functionality in Windows that can't be found within the many hundreds of thousands of Mac apps available. There are more Mac apps than one could ever use in a lifetime. As a matter of fact, the problem I've run into with Windows is a lack of quality apps that can match the superior third-party Mac apps or built-in macOS functionality in many cases. Of course, on occasion the opposite is true, and I run Crossover and Parallels in Coherence mode for those.

There are also a lot of built-in, time-saving functionalities within macOS that third-party apps in Windows don't replicate well or at all. For example, spring-loaded folders, or a solid, fast alternative to Mission Control that works as seamlessly as it does in macOS.

I use Windows 10 and macOS in near daily production and consulting/support environments. Windows 10 has its advantages over the macOS, but Task View isn't one of them.

Mission Control on Mac in a production environment blows away Task View - which was only finally copied by MS from the macOS after already being in use for well over a decade for Apple users. Granted, there was some Windows third party apps that attempted to clone Exposé (former name of Mac's Mission Control), but they were terribly slow, clunky, crashy and buggy on Windows. That's why I was really happy to see Windows 10 finally copy Mission Control and incorporate it natively, but I was sorely disappointed after using it.

For example, I can use corner gestures with Mission Control that've been removed from Windows 10. Microsoft had corner gestures in Windows 8, but removed the option entirely in Win10.

Even after I brought corner gestures back to trigger Win10 Task View with a custom script running via a third-party app (the great AutoHotkey), it's still incredibly limited compared to Mission Control. The AutoHotkey script doesn't even trigger consistently right away, like the built-in macOS corner gestures always instantly and reliably do. I've wasted time with multiple third-party triggers and none work as well as the native, built-in macOS corner gestures.

On top of that, with the macOS (and I've been able to do this for about a decade with Exposé and now Mission Control) - I can drag any file to my corner gesture, then drop the file directly into a preferred Mission Control thumbnail window.

Try that in Windows 10 Task View. There's no integration with the file system in Win10 Task View at all, and that severely cripples its functionality. There's no third-party app that fills the void for this yet either. Granted, I often use launchers on both Mac & Windows to move files, but when there's a need for a more visual, GUI approach with dragging and dropping, Win10 fails badly: it also inexplicably lacks spring-loaded folders, and no reliable third-party app copies that functionality properly either.

I do enjoy the Task Bar thumbnails in Windows that the macOS lacks, but I just use a third party app called HyperDock that not only replicates the functionality, but much improves upon it - and HyperDock has never had any speed or stability issues against the macOS for me like many third party Windows apps tend to have.

That said, there's definitely various advantages to running Windows over Mac and that's why I work in a mixed environment at home and in my work tasks.


Despite my bellyaching, Windows is built on much more solid ground than macOS is these days. All the new stuff introduced in 8/10 - Metro stuff, control panel stuff, etc. - can be wacky, but the core OS is strong. It works, it doesn't crash, it acts the way you expect it to. That's all I really want from a desktop OS, and given that Apple has total control over their ecosystem, they are frighteningly bad at providing it.

What areas do you find W10 to be lacking in?


I switched from Mac to Windows a few months ago. My biggest annoyance is privacy. I do not feel my data is safe while using Windows. I had to switch off too many defaults (keylogging, for example!), and I really do not know if I missed any, or whether some update has sneaked in any new way for MS to spy on me.


I totally understand. Even with the bugginess, it's my single biggest issue with W10.


I find W10 lacking in consistency.

Poke around in the settings for a while and you will find remnants from the NT days.

The dark theme is another example, it works for a handful of their own apps, not even half of them.

Half-baked and unpolished. I'm a daily Windows 10, macOS Sierra, and Fedora Linux user. I develop on all three OSes and play games mostly on W10.


I agree 100% with that. Microsoft stacked new features without deprecating old ones. Nowadays you can do things in 32 different ways; some ways are the same as they were in Windows 2000 and XP, but maybe some "advanced tweaks" are not available in the "old interface", so you need to struggle to find two different interfaces that achieve the same essential function.

This is, IMHO, a backlash of Microsoft's long update cycle. The yearly updates Apple pushes to macOS, along with free upgrades, allow for an easier deprecate-then-remove approach that gently transitions users from the old approach to the new. It's hard to do the same when people have gotten used to an OS for many, many years. Maybe W10 with its "rolling" approach will succeed.


Yeah, that's exactly the problem.

Happily, though, the inconsistencies don't reach as far down as the kernel level. W10 looks weird and sometimes acts strangely when you try to use the new stuff, but its bones are stable, which is all I really need.


>the core OS is strong. It works, it doesn't crash,

Windows almost never crashes for me, but I haven't had a system crash on my Macs in close to a decade even after updating the OS numerous times without a clean install. Granted, I know to use combo updaters for Mac instead of the streaming updates, so that helps me quite a bit along with making sure I update third party apps first.

On the other hand, Windows 10 updates have caused all kinds of various issues and it's documented to be widespread. Killing many webcams is one major issue that comes to mind.

Then again, some people have had wifi issues with Mac updates, so no OS is perfect, that's for sure. However, to suggest that the macOS system core in Sierra isn't as strong as Win10's doesn't seem realistic to me.

macOS Sierra has been as rock solid as Win10, if not more in some cases.

>given that Apple has total control over their ecosystem, they are frighteningly bad at providing it.

That's a myth. I have Android phones integrated with Macs just fine, for example. I use the free MightyText to send & receive texts, and that's just one of several good options. The Google Keep app syncs notes across Mac & Android, and the list goes on and on.

If any professional power user wants to skip Gatekeeper on a Mac and install apps without any hoops (a simple right-click, basically), Apple made it as simple as this in Terminal so there's no hoop at all:

sudo spctl --master-disable

Done.

I use both Windows and Macs daily. I run anything and everything on my Mac I want and have done so for many years. I'm not trapped in some ecosystem at all on my Mac. If anything, I feel more trapped (privacy-wise) on Windows 10 than Mac and I despise how Microsoft forces updates on me that have crippled my workflow on occasion whereas Mac just puts up a daily reminder until you do it.

Now, the iOS devices are another story, but that's a huge can of worms when we're talking about phones and the need for security, etc. -- I'm not going to get into that here since we're talking about Mac vs. Windows -- not iOS vs. Android, etc. (my preference is Android for most of my use cases and iOS for some others).


That's from 2014. Have things changed since then?


I don't have hard evidence either way, but I'd argue no. Ars' recent article on the bizarreness of the upcoming Creators' Update is one example.[1]

[1] https://arstechnica.com/information-technology/2017/02/forth...


Creators update is not even released, what are you talking about??


...did you read the article? I'm not gonna summarize it for you, but typically, even insider builds have not been this buggy.


The article is about insider builds, i.e. alpha and beta releases.

They are supposed to be buggy as hell!


I've just switched to Mac a couple of weeks ago after over a decade of Linux as my desktop - Ubuntu and Gnome for the past 6 years or so.

I have a much better machine now (although there are nice Linux certified laptops out there) and a much nicer hardware integration with the OS (especially in terms of battery life).

But as a desktop Gnome is just as good if not better (that's a matter of taste after all), and as a Unix development machine it's sub-par compared to Linux.

I'm pretty happy with it, but I can see myself going back to Linux. But Windows - not anytime soon.


It's very clear that this is all about gaming and game development. Please 302 yourself to glorious /r/pcmasterrace.

> On the developer side? Nothing, unless you use XCode

> Their hardware is underpowered

> Gaming on Mac, which initially showed promising signs of life had started dying in 2015

> brings dedicated gaming features, full OS-level VR support, color customization

> NVIDIA GTX 1080 graphics card is an insane work-horse that can play any game

> On top of that? I can play recent games without the PC breaking a sweat, and I’ve started experimenting with VR


I think this is a bit hyperbolic. Apple certainly hasn't given up. It may be having some identity issues, but what I really think is happening is politics. Steve Jobs made Apple; he fought for the company so hard he was fired, then he got acqui-hired back. Let's face it, the guy was invincible (do you really think anyone in their right mind would try to tell Jobs what Apple was and what it was about?) and incredibly talented and focused. He balanced the powers. Now power grabs and politics are confounding simplicity.

Things that I loved and now I hate:

- Spaces (now a convoluted combination with "notifications"; no more far-left widget screen)

- universal zoom, you used to be able to zoom in on anything

- odd wifi issues (wifi was such a pain in the early 2000s on a PC that the OS X experience of it just working was amazing)

- terrible cloud features: iCloud and MobileMe always sucked, but when my computer started automatically updating and then switched to trying to store my local files on iCloud, it f'ed over my file structure - huge annoyance

- cmd-ctrl-alt-8 (defaulted off, but at least it can be enabled)

I'll still probably get a base model air for my next laptop and I run a custom build PC that runs Ubuntu 16 LTS, Win 7, Win 8 (Windows mostly for CAD and other such software still not available on mac, like proprietary 3D printer environments, looking at you Stratasys).


Some of those things that you loved are actually still there. They have just been moved or disabled by default.

- Dashboard is still there and can be enabled as a space or an overlay (System Preferences > Mission Control).

- You still can zoom in on anything. I use ctrl+scroll (System Preferences > Accessibility > Zoom).

- The invert colors shortcut (cmd-opt-ctrl-8) is still there. (System Preferences > Keyboard > Accessibility).

Another feature I love that used to be easy to find but got buried in settings is the three-finger drag (now found in System Preferences > Accessibility > Mouse & Trackpad > Trackpad Options...).

I question some of the changes in default settings too, but then I realize the defaults are most important for the non-power-users who can't or won't change them. I would also never want to store my Documents or Desktop contents in iCloud, but I can see why someone who doesn't understand the file system might.


Interesting story, and one I understand because I've been using Windows 10 a lot more since the WSL announcement/availability.

Basically, in 2006 when I joined Google I had a choice, MacBook or Linux laptop, and I chose a MacBook. Really liked it and used a MacBook for my more portable computing over the next 10 years. I also got a Linux desktop at Google and learned the various bits you had to know in order to run a Linux desktop full time, with my laptop filling in for things I couldn't get on my desktop. When the Surface Book was announced, the hardware was just amazing. I figured even if Windows sucked I could eventually get Linux working on it. When WSL became available, suddenly Linux was sort of just there and all my ARM development tools just worked. My home desktop had been a Windows 7/Linux dual boot (defaulting to Linux); I booted Windows 7 and took the free upgrade when it was offered, then went back to Linux. Then I needed to run my ECAD program, so I brought the Windows side up to snuff, and I've been running on the Win10 partition for the majority of the last 3 months.

At this point I use my Macbook Pro less and less.


I just discovered WSL today, and holy crap it's exactly what I've wanted since for ever!

I've been running an Ubuntu server VM for ages to get bits and pieces done, it's so nice to _just be there_.
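For anyone wondering what "just be there" means in practice: you open Bash on Ubuntu on Windows and ordinary Unix tooling behaves exactly as it would in that Ubuntu VM. A trivial sanity check (nothing WSL-specific about the commands themselves, which is the point):

```shell
# Standard coreutils pipeline, run unchanged from a WSL shell
printf 'alpha\nbeta\ngamma\n' > /tmp/wsl_demo.txt
grep -c 'a' /tmp/wsl_demo.txt        # every line contains an 'a', so this prints 3
sort -r /tmp/wsl_demo.txt | head -1  # prints "gamma"
```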

Pretty cool stuff!


The Mac is a victim of the iPhone's success - pure and simple. From a short & medium term perspective the business/ops guys will tell you that any engineering resources spent on Mac would yield better results in iPhone land.

It requires a fundamental "long-game" mindset to realise that once you lose the high-end Mac guys, you lose your best proponents, and then you go into a decline, a long and very profitable decline.

There is no excuse for the current line-up of Macs - super expensive, incremental and confusing half-baked features.


> The Mac is a victim of the iPhone's success - pure and simple. From a short & medium term perspective the business/ops guys will tell you that any engineering resources spent on Mac would yield better results in iPhone land.

Which is why I'm voicing my objections to the way that Apple is treating their Mac lineup by leaving iOS. I sold my iPhone and iPads (I had 2) and moved to Android.

I'm still a Mac user who will happily spend thousands when they offer compelling reasons to upgrade. But if their belief is that there is more money in iOS, I'm making damn sure that I'm not part of that equation.

Switching to Android also has the pleasant side effect of being far cheaper. I was able to get a brand new Nexus 5X for net-$20 after trading in my old iPhone 6, which barely held a charge anymore. And Google Fi is so much cheaper for my use case (I use about a half gig per month and do a lot of travel in other countries) that I'm saving around $20/mo and getting better service.

For those of you wanting Apple to focus more product development resources on macOS and Macs, abandoning the Mac will only confirm Apple's decision to focus on iOS. We need to abandon iOS if we want more focus on macOS.


It really depends on what you’re doing, I think.

For instance, if you are developing a game with Sprite Kit, you will find that around 95% of the code is IDENTICAL on Mac and iOS. Sprite Kit itself works the same on both. Sister frameworks like AVFoundation are mostly the same and they have peanut-buttered some #define values to allow “different” classes like UIColor and NSColor to be SKColor, etc. to make it easier to have code that does not needlessly vary. You end up having to clearly think about UI differences between the two but that is true anyway between desktop and mobile.
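That cross-platform color trick can be sketched with Swift conditional compilation. SpriteKit itself ships an `SKColor` alias along these lines; the `PlatformColor` alias below is a hypothetical example of applying the same pattern to your own shared code:

```swift
import SpriteKit

// SpriteKit already aliases SKColor to UIColor (iOS/tvOS) or NSColor (macOS).
// The same conditional-compilation pattern works for your own shared code:
#if os(iOS) || os(tvOS)
import UIKit
typealias PlatformColor = UIColor
#else
import AppKit
typealias PlatformColor = NSColor
#endif

// Shared game code can now be written once for both platforms:
let node = SKSpriteNode(color: PlatformColor.red,
                        size: CGSize(width: 32, height: 32))
```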

On the other hand, sure, a utility-style application is not that similar between iOS and macOS once you get past the likes of NSString and NSArray. Yet, utility applications often look and work quite differently between desktop and mobile so this may make sense. And if you rely on the cloud to implement part of the functionality (and share it between the two), you may find that again you are dealing with mainly a lot of UI-centered differences on the two platforms that would have been different anyway.

The iPad is the weird one. Here, Apple probably needs a middle API that really can use exactly the same constructs between mobile and desktop where they actually do end up looking or even working the same.


My first computer was an iBook G3. I loved it. I have never owned a PC, and reading this post makes me sad ... because it's true. The prospect that the Mac could be killed entirely within the next few years is an awful one, but one we should probably prepare ourselves for.


Even Apple's current leadership can't be stupid enough to do that. If they kill the Mac/MacBook, who is gonna write the apps for their mobile platforms?

Or maybe they can, judging by the last MBP release. Hopefully they turn it around.


> If they kill the Mac/MacBook, who is gonna write the apps for their mobile platforms?

I'd be surprised if Apple doesn't have Xcode running on both iOS and Windows internally.


My family is fully invested in the Mac and iPhone ecosystems for productivity. I recently purchased a 2012 Mac Pro. It was easy to justify as my wife is a photographer and uses the Mac versions of Photoshop and Lightroom.

I forgot how awesome it is to have a desktop/server online all the time at home and how much performance I had given up for the convenience of a laptop.

The lack of a used market for the 2nd generation (2013+) Mac Pro leaves me frustrated, but it's no surprise given that it was targeted for the designer niche and the rest of the desktop market has largely been subsumed by laptops.

I'm really encouraged by the growth of the Hackintosh community and all of the problems they have solved. That will likely be my next computer.


If someone wants a good development platform and is leaving the Mac, why not switch to Linux rather than Windows? Windows is IMHO not pleasant to use: the ads, the UI, the lack of speed. Linux is ad-free, pleasant & beautiful (with StumpWM anyway; Unity, GNOME & KDE aren't so great), and blazingly fast. Plus it's free software.


When I read "ads" I had to do a double-take. I find it really surprising that people are willing to tolerate an OS with built-in "malware". Is it because people simply aren't aware that there are alternatives?


No, it's because when there are three main alternatives, and the one with ads has massive market share, you are necessarily going to have some people using it.

Including people who don't actually tolerate the ads, but who need to target that platform.


There is an option to turn them off.


I've never seen an ad in Windows 10 -- but I installed a start menu replacement. Linux simply can't support the hardware in my laptop properly; I don't do much gaming, but I appreciate the option, and the Windows Subsystem for Linux is actually good enough for my needs. There are plenty of small issues that make Linux-as-primary-OS not a great solution.


I wouldn't say it's panic time yet, but I do get the feeling that Apple isn't terribly interested in desktop Mac hardware and software. Microsoft could have been said to have been there a few years ago, but they seem to be doing rather better on desktop now.

This article actually prodded me to try out the Windows Subsystem for Linux, and so far it seems pretty nice. Everything I've tried to install has worked so far, and worked fine with my standard Unix config files.


Instead of choosing one platform and making a big deal of it, why not just use whatever suits you at that moment and avoid features that tie you to only one platform, making it inconvenient to switch? For me, at least, it seems best to be aloof when it comes to operating systems. Loyalty to only one may have some minor benefits, but is likely to make you feel like you ought to have a say in the direction that platform is going.


I really don't understand the whole Windows vs. Mac thing; both operating systems largely do what we want. It 100% comes down to whether the software you need is on one platform or the other.

I'm in the market to upgrade at some point in the near future as my mac has developed a number of hardware issues and the linux subsystem for Windows and Docker support are both intriguing.


I'm in the same boat (with everyone, it seems). One remaining big differentiator, for me, is the resale value and service support of Macs. I can use a Mac for three full years, under warranty, and then turn around and sell it used the day AppleCare expires. And I can regularly get 50-60% of the original sale price (realizing that I've also sunk $300 for AppleCare and taxes). After three years, you'd be hard pressed to give away a Windows machine. This means, all things considered, the Mac premium isn't really a premium, but actually great value.


> After three years, you'd be hard pressed to give away a Windows machine

Because they actually update the hardware...


No, it's because there are new, low-end PCs that can be bought. There's no low-end Mac, so it means that people who don't want a Mac but don't want to spend as much buy used Macs.

If Apple sold a $400 Mac, you wouldn't be able to give away a 3-year-old Mac either.


While that has been true in general for Macs, is that going to be true for Touch Bar Macs?

They are more expensive to buy and not really “great” over their predecessors, and there is no doubt that some future-generation Touch Bar 2 with a feedback engine, etc. will be far superior to use (kind of like buying an iPad 1 with not enough memory, being too heavy, etc., and the iPad 2 blowing it out of the water, not to mention Apple unceremoniously ending OS support for the iPad 1). I think these first-generation Touch Bar laptops will age really poorly; I doubt they will resell as well as Macs typically have.


If you spent a comparable amount on a PC as you do on your mac, I guarantee you would not have a hard time selling it in three years.

I don't care about this whole windows/mac debate, as I run linux, but my 4 year old Asus gaming machine is still quite a powerful machine and hasn't been updated outside of SSDs. It would fetch a few hundred bucks, probably 40% of its original price, and it doesn't have a cult following willing to pay a premium for...whatever it is.

Used Macs remind me of used Jeeps.


I spent approx. $2,500 on a returned Lenovo workstation (the box hadn't been opened) maybe 1.5 years ago. It has 64 GB of ECC RAM and 20 physical cores, and it also has a lot of room for expansion (it can, theoretically, take 768 GB of RAM).

I don't need to sell this in 1.5 years, probably not in 3-4 years. After 5 years, it'll certainly be worth >$1000.