Video Pros Moving from Mac to Windows for High-End GPUs (mjtsai.com)
447 points by mpweiher on Feb 24, 2017 | 464 comments



Big surprise. Video Pros now, developers next, common users last.

While Apple is focusing on creating the thinnest notebook with every generation, other companies are actually making useful computers, laptops or otherwise.

Right now, I've decided to take the money I'd spend on the cheapest MacBook and buy a desktop system plus a Chromebook instead. I can have mobility and a lot of performance for a fraction of the price.

Even Windows is becoming more viable again, Ubuntu userland and all.


What's the viable alternative? I'm looking for a replacement for my 2013 rMBP. Every PC laptop has a fatal flaw. The Spectre x360 15 comes close, but has less battery life (despite having a slightly larger battery), is bigger and heavier, has only a dual core processor, and has a worse screen.

The XPS 15 is another option. Even with a significantly larger battery it doesn't seem to have better stamina than the MBP. It's apparently been plagued by QA problems, and offers a choice only between a 1080p and power-hungry 4K (can't get high-DPI and 10 hour battery life in the same package).

I suspect, looking at the sales numbers, that you're wrong. Apple is giving up on a certain group of users (who need fast GPUs). They gave up on those users years ago by shipping crappy OpenGL drivers. But they've correctly perceived that common users don't want ports or expandability. They want a fast machine with a great screen and all-day battery life, and they want to hit those metrics in the smallest package possible.


I recently trashed my 2011 MBP (oops) and spent a month without it. I decided to try using my corporate-provided Surface Pro 4 instead to see how I could get on with a similar-spec Windows 10 PC for personal use.

By the end of the month I had gone slightly loopy (seriously - MacBook withdrawal is real), gave up and bought a 2015 rMBP. Personally I find macOS, and its deep integration with Apple's hardware, too intuitive and 'invisible' to the way I work to give up on it.


All that tells me is that switching is hard.

I use both Linux Mint and MacOS on a daily basis and I can't say that one really is better than the other. I'm sure it would be the same with Windows if not for the fact that I need some unix-y stuff.

I do have a use case that I would really like to get on my primary laptop that neither Mac nor Linux currently offer: Detach screen and use stylus to annotate pdfs. If I was using windows on a surface this would be a major part of my workflow, and I would find it impossible to switch back. Doesn't prove that Windows is superior. (Maybe I will end up dualbooting Windows and Linux on a Surface or Yoga or some such thing eventually).


Have you tried the Windows Subsystem for Linux? You might be able to get by without a separate Linux install.

https://en.wikipedia.org/wiki/Windows_Subsystem_for_Linux


>> Have you tried the Windows Subsystem for Linux? You might be able to get by without a separate Linux install.

I have been meaning to try the Windows Subsystem for Linux since they announced it, but Cmder (ConEmu) has been working so well for me that I have gotten too complacent to even try it. Cmder was the one thing that truly made the transition painless when I switched from Mac -- I thought I would hate not having the OSX Terminal. I don't do super advanced unix shell stuff, and it was just perfect.


You can run WSL inside Conemu.
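If memory serves, it's just a matter of adding a ConEmu task whose command launches the WSL shell, something like:

  {Bash::WSL}  %windir%\system32\bash.exe ~ -cur_console:p

(The task name is arbitrary; the -cur_console switch and exact command are from memory, so check the ConEmu docs if it doesn't take.)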


Yes, but Conemu already provides access to a lot of unix-y tools.

If your needs are already being met, it's sometimes hard to make time to play with something that does the same.


I've been using it since it was available on the insider fast ring.

It works pretty well for command-line use. If you need X at all, you're probably SOL.

I've been doing mostly C++ & Python development. I've run into a couple of issues around networking and a couple of UI issues. The UI issues have been fixed after I reported an issue on GitHub.

All in all, it's been a fairly great experience, and the developers have been very responsive to issues reported on GitHub.


Not yet, but I'm following it closely. Currently I don't have a Windows install on any of my work machines, so it's something I will evaluate when I update computers next.


>> By the end of the month I had gone slightly loopy (seriously - MacBook withdrawl is real),

After being on Mac for about a decade, it only took a week of using Windows full time before going back on the Mac felt foreign to me.

If I spent a week or so full time on the Mac, the reverse would probably happen.

These days I only care that my main tools work on whatever operating system I use (it's a great era for end users now that so many tools are cross platform). There are things about every OS's main UI that bug me, so I don't find myself being loyal to any one particular OS.


Mac hardware is great! But I find using OS X is like trying to juggle with handcuffs compared to Windows.

IMHO, in their quest for simplicity, they have made a very good GUI for users, but hell for professional developers who DON'T only use the command line. One of my lifelong friends swears by his MBP, but he only uses the command line.


>> Mac hardware is great!

I think that depends on your view. IMO, Apple generally does make some of the best hardware in terms of quality and design, but there are exceptions.

The Mac Pro garbage can, for example. I don't know why they chose to make an art piece to replace the incredibly functional and expandable tower that existed before.

Another example is the early 2011 Macbook Pro 15" with discrete GPU, which had serious overheating problems that they just pretended didn't exist for more than three years.


This goes against my (and others') experience. With every recent OS release, Apple has dumbed down the UI, not just for devs, but for any user.


I feel like I'm in prison or an insane asylum when I have to use my Mac or iOS devices after enjoying the freedom that I have using Windows all day.

I could go very far into detail here and list all of the extremely annoying limitations that I run into, but instead I'll respond to your vague complaints with my own. Apple quite obviously wants absolute control over their devices and their software, whereas Microsoft lets me do whatever I want with my computers and my software.

I have to have a Mac to make iOS apps, but as soon as those are no longer a thing I'll toss all my Mac stuff straight into the garbage.


I've had the opposite experience. OSX is just weird BSD with a great window manager and UI framework. You can easily sidestep Gatekeeper, if that is your complaint.

The hardware fit and finish is also second to none, and runs windows just fine if that's your thing.

The alternative on windows is a machine that spies on me, has horrible ui bugs and inconsistencies I run into constantly, and decides to auto update and reset all my privacy settings in the middle of the night while I am using the machine.

Not to mention it is often used with some awful trackpad. I haven't tried them all but I have never seen a non-mac trackpad I could go back to.


> The alternative on windows is a machine that spies on me, has horrible ui bugs and inconsistencies I run into constantly, and decides to auto update and reset all my privacy settings in the middle of the night while I am using the machine.

This exactly. I'm still baffled by claims that Windows 10 has a good desktop UI when I see its iconography [1], huge click (touch) targets [2], and wildly inconsistent use of whitespace [3]. That's not even mentioning the forced updates and always-on telemetry. I'm not sure how one can say macOS is more of a walled garden than Windows at this point; at least macOS's security features will get out of your way if you ask nicely.

[1] http://www.intowindows.com/wp-content/uploads/2015/07/Change...

[2] https://cdn2.pcadvisor.co.uk/cmsdata/features/3632303/how_to...

[3] https://cdn.arstechnica.net/wp-content/uploads/sites/3/2017/...


> I'm still baffled by claims that Windows 10 has a good desktop UI when I see its iconography [1], huge click (touch) targets [2], and wildly inconsistent use of whitespace [3].

This almost reads as sarcasm. Some people really don't care about those things, and using them as examples of why you're baffled just highlights that disconnect between you and them. I don't use any bundled windows apps, and I'm rarely in the settings (and I just search for what setting I want), so iconography and whitespace design decisions in windows apps don't even factor into it for me.

Neither OS X nor Windows feel as comfortable as my customized FVWM config did, but windows gets a lot closer nowadays. I had to use OS X at work for a few years and it always grated.

> That's not even mentioning the forced updates

It's possible to disable them, you just have to put some effort into it (it requires regedit). I think this is the right decision. If you want to disable updates and you don't know how to change a registry setting, then for the good of us all, the answer is no.
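For the curious, the setting in question is a Windows Update policy value, something like this (from memory; I believe the Home edition ignores policy keys, and the value name may have shifted between builds):

  Windows Registry Editor Version 5.00

  [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
  "NoAutoUpdate"=dword:00000001

Import it as admin and reboot; delete the value to get automatic updates back.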

The tracking is a valid concern though.


I want automatic updates. I think this is a good thing. Chrome automatically updates whenever I restart it. This is great.

I don't want updates when I am in the middle of something full screen like a game, forcing a restart of the machine on me. This is madness.

I don't want ads for office 365, Cortana or edge on my desktop. I don't want to learn how to block them. I don't want to use an OS that feels like it is being milked for all it's worth in its dying breaths.


It always asks me, and I can delay it. You've actually had it force an update right then while playing a game, and without having told it "no, delay it" multiple consecutive times (I believe it will only let you delay it 2-3 times)?

Edit: Also, have you set your active hours? Windows allows you to define the times you use your system so it won't attempt to update during those times. Additionally, you can set a specific custom restart time for when it will restart.


> Edit: Also, have you set your active hours?

I tried to. I play games for a few hours either at night or early am -- say 6pm-1am or 5am-9am are my possible slots. Unfortunately, Windows Update will not accommodate this -- you are only allowed to set one window with an 8 (maybe 12) hour max, and it must be consecutive. I had to dig around to find this, and it still is not sufficient. I ultimately solved my problem by using the registry editor to convince Windows I was on a metered internet connection. Unfortunately, this broke updates altogether. I turned it back and now it's still broken -- apparently the magic auto-updater is the only way to get updates -- there's no button I can click to just download the latest update and install it? (at this point I gave up)

I mean... that's just insane.
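For reference, the value I edited was, from memory, something like this (part of why it's so easy to break things is that you first have to take ownership of the key away from TrustedInstaller):

  Windows Registry Editor Version 5.00

  [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\DefaultMediaCost]
  "Ethernet"=dword:00000002

As I understand it, 2 marks wired connections as fixed-cost (metered) and 1 restores the unmetered default, but don't take my word for it given how mine ended up.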


What about setting the specific update time to something like 3 AM?

I was just in the windows update settings and there was a way at the top to check for updates right then. I didn't use it so I'm not sure if there's some other gotchas involved with that.


I'll have to check that out, thank you for the tip. What I would really like is a shutdown button that actually means "check for updates, install if found, then turn off". I'd click that every time I was done using it.


Windows inconsistently interrupts my game. In some games it is able to rip me out mid-session to tell me to restart; other times it silently times out in the background despite my computer running at full blast.

I have set active hours, but for some reason my Windows partition - and not my Ubuntu partition, so it isn't a hardware issue - does not reliably remember my time zone. It is often reset without rhyme or reason to some random default (I think NYC). I don't always notice and change the time zone when it boots up because I have Steam launch in big picture mode.

Also, why does it even need to ask for active hours by default? I am using the machine at full throttle. That is a really easy metric for "maybe wait until later". It's already logging everything I do and sending it to Microsoft; it would be nice to see some usability features come out of all that data.

Working inconsistently is a common theme of my experiences with Windows. I am routinely baffled that I paid $100 for this experience, and I wish there was better Linux support for AAA game titles. I know; I throw my money that way whenever possible.


> why does it even need to ask for active hours by default? I am using the machine at full-throttle.

Some people run things all day long. My brother sometimes keeps the video game 7 Days to Die running all day at home while he's at work. Not updating when activity is detected is a good way to have it never update, and a good way to allow a virus to trigger a condition that prevents automatic patches to the holes it likes to use.

> I have set active hours but for some reason my windows partition - and not my ubunutu partition, so it isn't a hardware issue - does not reliably remember my time zone.

That's odd. Is it actually changing your time zone, or is it just off by a few hours? If it's just off by a few hours, my bet would be that it's a difference in how linux and windows set the system clock (one may prefer to keep the clock in UTC time, the other in the set time zone). If it's the actual time zone that's changing... I dunno, maybe some location service helper and a poorly mapped IP address? I haven't heard of that, but it does sound annoying.


I'm not arguing that postponing restarts until there is lower load wouldn't miss some people. I'm arguing that this shows less respect for the user and is a poor experience. It does not feel like my machine, contrary to great-great-grandparent. No other system I own forces restarts, and they all seem much more secure.

It's definitely the time zone not persisting. I've navigated through seas of menus to change it, to no avail.


Plenty of ways to disable the automatic restarts. Easiest is setting the Group Policy in a few clicks.

http://tunecomp.net/disable-automatic-reboot-after-updates-i...

I agree it's a good feature to ship enabled by default. Grandma, who leaves her computer on for years at a time, needs her security updates kept current.
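If you're on an edition without gpedit, the same policy can be set directly in the registry (from memory, so treat the value name as a starting point):

  Windows Registry Editor Version 5.00

  [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
  "NoAutoRebootWithLoggedOnUsers"=dword:00000001

This only stops automatic restarts while someone is logged in; updates still download and install.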


I wouldn't call myself an inexperienced user at all, but even after months of trying to disable the auto updates in W10 through all sorts of settings and tweaks, I gave up on it. Whatever I did never lasted for long. When I'm not in control of my own machine, what's the point? I have switched to a Linux distro and haven't regretted it.


"...runs windows just fine if that's your thing."

I have a 2015 MBP and used it for a year before switching it to Windows because I got fed up with the Apple dev environment.

It doesn't just run windows fine, it runs Windows with amazing speed compared to macOS. Everything feels (and is) snappier. I'm still left with a very expensive and under-powered machine.

PS. After installing Little Snitch, one becomes immediately aware that macOS does talk to base... a lot!


Sorry but that's just blatant lying. Apple takes great care of optimizing its OS and apps for its hardware, to the point where it's possible to use Final Cut Pro on the anemic MacBook 12 somewhat comfortably(!)


Yes, apple does optimize. I'm not disputing that; neither starting a silly "apple vs" debate. I am no fan boy of either mac or PC - I just use whatever is best.

The point is that "equivalent software" runs faster. I would wager that if Final Cut Pro existed for Windows and you ran it on the very same MacBook 12, it would run faster on Windows. That said, Final Cut Pro is a bad example, because it was developed by Apple itself (and can therefore be assumed to be highly optimized for the OS) and it is not available for Windows.

It is more useful to compare an equivalent 3rd-party application (made by neither Apple nor Microsoft). I'm a developer, so I use a lot more software than just "apps" -- maybe if you try a wider set of software, the difference between platforms will become apparent.

Admittedly, I don't have numbers. But I don't need numbers, because you can feel the difference as a user. Install it and see for yourself!


Microsoft takes great care for optimizing its OS and apps for any hardware, to the point where it's possible to run Windows 10 on 10 year old hardware somewhat comfortably. To the point where the Windows 10 Kernel and a lot of the Windows 10 Core (OneCore) runs on mobile and IoT hardware...


To me, MacOS just seems like the best desktop UI and UX out there. I'm talking about stuff like font rendering, gestures, multiple desktops, etc. All of these things on the Mac have clearly had a huge amount of thought put into them. And since these things mediate your entire interaction with the computer, it's a big factor for me.

Windows seems to be a combination of 90's era throwbacks and 2edgy4you Metro design. If they've managed to improve this stuff recently I would be interested in switching, since there's a huge tax on hardware specs with Apple computers.


How recent is recently? Windows 7 was the first windows version I didn't dread using for years, Windows 8 was a small refinement on that, and Windows 10 is a vast improvement on that.

Windows 10 feels like it just wants to get out of my way, which is really nice, since for me the gold standard is a customized FVWM config I refined over a decade to be minimalist and extremely usable for work.

If you haven't used Windows 10, it's probably worth at least a look (as long as you don't mind or have ways to mitigate the privacy concerns).


I'll take a look. Does the ubuntu subsystem make it work like a real computer for programming tasks?


I'm reading all of your comments here (not just yours) and I'm wondering. Do you spend that much time in OS alone?

I use MacOS, Windows, and Linux each and every day, interchangeably. What I do is start my productivity application(s) and spend my time in those. I see the OS when I'm copying files or when I'm in a shell doing shit. Even in a shell, with babun on Windows it's more or less the same experience across all three.

The only thing I want in all OSes that exists only in MacOS is Preview. That thing is damn awesome. Everything else is invisible to me.


You can install Gnome Sushi to have that in Linux.


Thanks for the tip! Now do Windows!!


> whereas Microsoft lets me to do whatever I want with my computers and my software.

Not in my experience. My Windows 10 automation/unscrew-up script alone is like 5 kloc. And I bet 70% of that script could be eliminated if I had real control of the system, like dropping a .config file in some folder instead of having to find hidden settings with nonsensical names deep down the regedit hole. Another example: you have to use some stupid hacks to make sure there are no Flash DLLs on your PC. No matter what you do, they always come back in some security update.


> And I bet 70% of that script could be replaced if I had real control of the system, like dropping a .config file in some folder instead of having to find hidden settings with nonsensical names deep down the regedit hole.

You should see my Ansible playbooks for our Windows Server systems. I don't mind PowerShell per se, but it takes a lot more effort to get anything done compared to my CentOS systems, where I can template a config file and be done with it.

I really hope our vendors start supporting .NET Core soon; the SDK for our ECM software is the only reason we're still stuck on the full framework and having to manage a bunch of Windows VMs for our integration software...


5000 lines for a fresh install? That's... a lot. Is that on github somewhere?


It's based on several scripts from Github. A lot of lines are just regex and lists (apps, services, tasks) of things to disable or remove.

I recommend you do your own script by choosing what you want from each type of script. I would release my script if I was sure it wouldn't break random people's computers, because IT WILL. I'm also running Windows 10 enterprise because I want as little telemetry and things shoved up my ass as possible.

Some Windows updates can change registry keys or disable certain policies. I monitor the commit log of other repos to know what I need to update, but they don't always cover everything. Feels like a lot of work but it's actually not.

Here's how I structured it:

  admin.ps1 calls
    admin-config.ps1 (policies, tweaks)
    disable-services.ps1
    remove-flash.ps1
    ...
  user.ps1 calls
    user-config.ps1
    disable-gamedrv.ps1
    disable-services.ps1
    ...

Because if you're using a regular user account (like you should) you need to run 3 things:

- admin.ps1 as admin

- user.ps1 as admin

- user.ps1 as your regular user

I gave up on using runAs or any of the things recommended on Stack Overflow; something always goes wrong, so it's easier to do it this way.

For a fresh install, I recommend that the first thing you do is update everything and let Windows install the 200 apps you don't want. Run the 3 things like I mentioned, reboot, run it again, reboot.

https://github.com/cluberti/VDI/blob/master/ConfigAsVDI.ps1
https://github.com/W4RH4WK/Debloat-Windows-10/
https://github.com/dfkt/win10-unfuck
https://gist.github.com/sven212/5febf372aaa6e4cc1fda71ad9637...

My installation is months old and it runs like new even after heavy usage, hardware changes, tons of apps and games installed/uninstalled (this kills Windows 7). Just be careful what you remove, don't ever install ccleaner or any shit. All you need is sysinternals tools.
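For illustration, the top-level admin.ps1 is basically just a logged loop over the child scripts -- this is a sketch rather than my exact file, and the child script names are the ones from the layout above:

  # admin.ps1 -- run from an elevated PowerShell prompt.
  #Requires -RunAsAdministrator
  $ErrorActionPreference = 'Stop'

  # Keep a transcript so I can see what ran when an update reverts something
  Start-Transcript -Path (Join-Path $PSScriptRoot "admin-$(Get-Date -Format yyyyMMdd-HHmmss).log")

  foreach ($script in 'admin-config.ps1', 'disable-services.ps1', 'remove-flash.ps1') {
      Write-Host "==> $script"
      & (Join-Path $PSScriptRoot $script)
  }

  Stop-Transcript

user.ps1 is the same shape with its own script list.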

I'm too lazy to proof-read/make this shorter, hope it helps somebody.


Thank you for taking time to share this!


>> whereas Microsoft lets me to do whatever I want with my computers and my software

Whatever is left after bricking it, I suppose?

You do know that there were actual lawsuits against Microsoft over the Windows 10 auto upgrade right?

https://www.amazon.com/Winning-Against-Windows-10-Microsoft-...

Even assuming that issue was overblown, I distinctly remember that Catbert-style perma-nag message appearing on each login asking me to upgrade. Microsoft doesn't even deem its users worthy of a simple "Don't bother me again" button.

Let us conservatively say that about 10% of the 300 million people who supposedly got the Windows 10 upgrade didn't actually want it. That's about 30 million folks who would disagree with the notion that Microsoft "lets me do whatever I want with my computers and my software".


I'd been a hardcore Mac user since '04.

The best thing Microsoft and Dell have going for them in the laptop space... is the 2016 MacBook Pro w/Touch Bar. It was a dud. The GPU issues were the final straw for me.

I thought that switching back to Windows would be a HUGE hassle. The Windows Subsystem for Linux took the pain out of it. That, and I no longer have to fight the we-don't-have-a-macOS-version of $APP issues.

If Apple doesn't care about their computers, why should their users? ¯\_(ツ)_/¯


> the 2016 MacBook Pro w/Touch Bar. It was a dud.

I thought their sales report indicated it is their highest selling macbook to date?


My favorite thing about Windows is I can usually fix it in the Registry

My least favorite thing is I have to use the Registry to fix it


My XPS 15 has been a nightmare. Endless driver problems. Build quality isn't even close to what Apple delivers... lots of false trackpad hits too.

I'm a little conflicted about my next Apple purchase as well, but for me, the XPS 15 hasn't been the answer.

Just one experience amidst many, I know...


I'm considering buying an XPS 15. Can you give more details on driver problems? Which OS are you using?


XPS15 + Ubuntu = External monitor hot swap not working, have to reboot the whole thing for it to detect the monitor. Total showstopper.


>> Every PC laptop has a fatal flaw.

Given that everyone's got a different use case, every laptop, regardless of maker, has some fatal flaw to someone.


There is a difference between "design tradeoff" and "design flaw." E.g. I'd rather have an SSD-only machine with a bigger internal battery than use that space for a 2.5" drive bay, but I don't consider that a "flaw" in my T450s. Coil whine (XPS), crappy screens with backlight bleeding and PWM (Lenovo), and not using Microsoft Precision touchpads (HP): those are flaws. That's not a trade-off between different people's use cases, just the manufacturer skimping on some part.


One person's tradeoff is another person's flaw and vice versa.

As someone who has a lot of trouble using any trackpad (including Apple's) that doesn't have physical buttons, should I view buttonless trackpads as a tradeoff or a flaw?

I don't care about the design tradeoffs made for esthetic reasons, I only care that I struggle with them and get stressed out by them. So to me, the lack of physical trackpad buttons on a laptop is a design flaw, even though it might be a completely sensible tradeoff for 99% of the population.


> One person's tradeoff is another person's flaw and vice versa.

If some people view a design decision positively, then it's a trade-off, not a flaw. For example, I hate moving parts on my computing devices (they break), so I'm a big fan of the non-clicking Force Touch trackpad.

What I'm talking about is flaws. Nobody prefers a laptop with coil whine to one without. Nobody prefers a cheap-o IPS display with an uneven backlight or bleed. Nobody prefers Synaptics drivers to Microsoft Precision drivers. Those are flaws, not trade-offs.


>> Nobody prefers a laptop with coil whine to one without. Nobody prefers a cheap-o IPS display with an uneven backlight or bleed. Nobody prefers Synaptics drivers to Microsoft Precision drivers. Those are flaws, not trade-offs.

But like anything, you have to be sensitized to these things as a negative. Clearly, you've got plenty of things you're sensitized to.

Virtually none of my non-technical friends would even think to complain about uneven backlights, bleed or coil whine, simply because they haven't been sensitized to them.

Coil whine and a little backlight bleed have never been dealbreakers for me in choosing a laptop. All other things being equal, maybe they would be dealbreakers. With the wide variety of laptop options out there, the "all other things" never tends to be equal anyways.

You can make the argument that I have low standards, and maybe that's true. But I am not you and you are not me. Let's not assume that everyone is sensitized to the same things as you or has the same tolerances as you.


Happy that someone pointed out the obvious. Was going to post a comment almost identical to yours :-)


If you aren't CPU constrained, max out the RAM and swap in an SSD... you should be fine for a while.

I'm running a 2014, and I'm so disappointed they soldered on the RAM; their SSD is also a weird format nobody else is using. I'm not planning on upgrading any time soon though.

I think the Apple/Mac pro market is really being abandoned in favor of laptops... but how much of their base can they piss off before it starts taking other segments with it?


I have a Lenovo T450s (since replaced with the T460s) and I am extremely impressed. It's an excellent dev machine.


I'm looking hard at the T570 (I've got a T450s docked at work, but 14" is too small for working on the go). It sucks that Lenovo's screens are such crap. Even /r/Thinkpad complains about Lenovo shipping cheap crappy screens.


I also use an HP ProBook 450 G2. This is another awesome machine where the screen is subpar. On the Lenovo I am not bothered by it, but the ProBook unplugged is annoying.


The upper end of the Surface line seems decent hardware-wise, and the detachable keyboards give the option to trade thinness vs. keyboard. One should be able to put Linux on them, most likely.


>> What's the viable alternative?

That would depend on your criteria. For example, my MSI laptop blows my MBPs out of the water performance-wise (I have two MBPs btw, a 2014 and a 2016, so no Mac hating here), but it's about 4 times heavier :) Then again, I don't have to haul it around too often.

If I had to haul a PC laptop around in an MBP-comparable form factor/weight, I'd look at high-end Razer laptops. I'm also not a fan of glossy screens; Razer has a matte screen option available (so do MSIs, but again, they are BIG).


The Razer is a good example of why I don't switch. The closest thing they have to something in the Mac's lineup is the Razer Blade Stealth: http://www.laptopmag.com/reviews/laptops/razer-blade-stealth.

Compared to the 13" rMBP, you get a smaller display and almost two hours less battery life (and people are up in arms about the 13" rMBP being a regression on the battery life front compared to the previous model). It's not a replacement for the rMBP, just a less good, but cheaper, alternative.


Not sure why you decided the Razer Stealth @ $1.4K is the closest to what the Mac has to offer when they have a "regular" Razer Blade in the same price range as the MBP, with the only upside for the MBP being better battery life (which is most likely due to a superior but hungrier video setup on the Blade).

Speaking of battery life - not sure that once you get into a 6 hrs vs 8 hrs type difference it's all that important.


I used to recommend the Razer Blade, but tons of horror stories on PCMR have veered me away. The only other laptop in the same class that I know of is the Gigabyte Aero.


I'm done with Apple. The only MBP I have is from my work, and I leave it there. My personal laptop is a Chromebook Pixel 2015, running Debian. I absolutely love it, but I'm planning to upgrade to a new Alienware 13. Yes, it's a gaming laptop, but it has removable components (SSD, RAM), high-end graphics (GTX 1060), and an OLED screen. All for less than $2000.


I'm researching new laptops, and the two laptops that keep bouncing between the #1 and #2 spots on my short list are the XPS15 9560 and the Alienware 13R3.

I think the Alienware 13 is an ugly, heavy brick (in fairness, you can get uglier), but it basically ticks off everything I need on paper.

My max acceptable weight for a laptop - irrespective of size - is around 5.5 lbs, so it barely makes that, but once you get past the esthetics, it's a fantastic machine. It's available with a quad core CPU with multiple options for GPUs (including the 1050Ti which kills the battery less than the 1060), a nice array of current and future ports, user replaceable wifi, RAM and SSD, and the option for an OLED screen. Oh, and it's also got physical touchpad buttons, which are something I've been missing for a long, long time.

Because my last two laptops have been quad core, I struggle with the idea of replacing my laptop with an ultrathin laptop that has an ultrabook processor that would be slower than what I currently have.

Every time I think the XPS15 is the one for me, I look at the Alienware 13 and change my mind.

The one thing I really hate about the Alienware 13R3 is that it's a "gaming" laptop with the matching gaming bling, but that's a rant for another day.


The XPS 15 without the touch screen is pretty good. I prefer working on it (home machine) to my work MBP.


ThinkPad T460p


I wish Apple would just admit the trash can Mac Pro was a mistake and bring back the "cheese grater" Mac.

Some people just need or want a beefy desktop built with upgradable components (in addition to a laptop).

Or, crazy idea here, they could licence their OS if they don't want to make pro gear anymore.


> Or, crazy idea here, they could licence their OS if they don't want to make pro gear anymore.

Maybe some are too young to remember 'Power Computing', the company that licensed MacOS in the 90s. That was juuuuust before Apple ran in to major problems, and had to file bankruptcy. The first thing Jobs did when he came back was borrow 55 million dollars from Bill Gates and buy back all the licenses and shut down production of all mac clones. One of the only true things Apple has is its reputation for quality. You want to bleed that out, you would do it by allowing clones, thus bringing about the destruction of the brand as a whole.


Power Computing clones were great, though. I had one and it was very reliable, excellent value for money. They were built like PCs and at the time, PCs were a lot better-made in many respects than low-end Macs. (High-end Macs were nice but very expensive, and still not as fast as high-end PCs.)

A clone program now would have a totally different dynamic, since Apple is incredibly successful rather than close to bankruptcy. I still don't know if it would make business sense for them, but for Mac users like me it would be absolutely terrific.


Sell the licenses directly to consumers then? Their reputation doesn't take a hit that way. Just sell it on a "use at your own risk" basis and just let people have it who want to use it on their own builds. I figure the support for the bazillion possible hardware configs would make it a nightmare though.


Or they could license the OS and review the products from licensees before they ship. Float a trial balloon: license to 5 or 10 hardware manufacturers, offer them a reference design, review the results and run with it.


> Apple ran in to major problems, and had to file bankruptcy.

Apple never filed for bankruptcy protection.

http://www.investopedia.com/articles/personal-finance/051115...


While they were in bad shape apparently they didn't file bankruptcy...

https://www.wired.com/2009/08/dayintech_0806/

From this source they borrowed 150 million from MS.


If Apple doesn't care about high-end desktops anymore (and it certainly seems that way), they could only license MacOS to third parties for desktop use. This wouldn't cannibalize their laptop sales, but would keep power users in their ecosystem. In the end they'd probably sell more laptops too.


>> If Apple doesn't care about high-end desktops anymore

I think Apple makes high end computers for consumers now.

They just don't cater to special audiences like they did before, which is unfortunate for people who don't want computers that cater to consumers (i.e., content creation pros, developers, etc).

I can't blame Apple for following the money, but it is what it is, I guess.


> crazy idea here, they could licence their OS

Another one to chime in that I don't consider this crazy...any more.

It didn't work the first time around partly because the cloners had much smaller volume and could therefore always scoop Apple on the most profitable high-end gear, which didn't have component availability.

The licensing model was always a gamble, with the hope that the loss of margin would be offset by an increased base. The problem was that it was "bet the company" stakes at the time, and when it didn't quite work out as hoped they had to shut it down or shut down the company.

Now the Mac line in general and the pro line in particular is a sufficiently insignificant part of the overall business that they could afford the gamble.


The question is whether the licensing fees + revenue from the app store would be enough to offset Apple's ludicrous hardware margins. Their market is shrinking, but it's still a profitable line of business for them since, just like the iPhone, their margins are way higher than the competition.


The mistake isn't the trash can Mac Pro (which is a perfectly fine machine that 100% fits my use case), it's not keeping it up to spec down the road (I'd buy one today if it was, even knowing I won't upgrade it for the next 3-5 years)


The mistake was the trash can, they locked themselves in to a physical design that doesn't allow upgrading easily. Graphics cards come in different shapes and sizes.


If they had sold enough of them, there would be a market for Mac Pro-shaped graphic card upgrades. Swapping out the AMD FirePro D300 looks to only need a Torx screwdriver set, so certainly do-able by a user.


>Some people just need or want a beefy desktop built with upgradable components (in addition to a laptop).

Apple is a hardware company. What you're suggesting doesn't make business sense for Apple. They rely on being able to sell new hardware, so allowing upgrades wouldn't make sense. A laptop lasts about five years for most people; being able to slap in extra memory, a new graphics card or just a new SSD would "rob" Apple of the sale of a new computer for an additional two or three years.


This might all be completely true, but it doesn't account for mindshare or loyalty. Apple has historically had a fanatical fan base, and should be spending a small fraction of their huge cash reserves to make sure they keep it.


Yes, Apple could license their operating system to just one high-tier desktop hardware company, and ensure that the power users will stay in the Apple ecosystem.

Somebody who will sell a high-end desktop system with: dual-socket Xeons, gobs of ECC memory, lots of storage, multiple 16x PCIe slots, and a very limited selection of high-end graphics cards.

That would satisfy the power users, and wouldn't cost them anything.


They would pay the same price Microsoft does to support random hardware. More importantly, it also wouldn't gain much. As others pointed out, the high-end workstation is barely a business for Apple.

Sure, they are losing customers, but compared to the hordes of students buying Apple laptops, or iPhone customers, it's not really an issue.


They have so many patents on hardware that they could licence the OS and clones would still not be able to have, for example, an Apple-like touchpad, Touch Bar, MagSafe or even Continuity (they could block it for licensed hardware). They will not do that because they realise that their fantastic, proprietary features mean nothing to the average pro user. Stuff like that is nice to have, but more important is being able to run all needed applications reliably. Currently the only people on Macs are iOS devs and people who hate Windows. That second group could go to Linux, if only the Adobe Suite were available on it.


>> .. MagSafe

If they licensed Magsafe to a clone manufacturer they would definitely lose sales. So many Mac users love the Magsafe.

It's really too bad they discontinued it.


and if desktop linux wasn't shit


I am pretty sure that is a subjective opinion, but you wouldn't be a productive member of the community anyway, no loss there.


You're a bit aggressive: I use it daily and it definitely works (but I work with it: no games, almost no 3D, just PyCharm, git, LibreOffice, Firefox, irssi and Claws Mail on the Openbox WM; scanning/printing a document from time to time, nothing fancy)

The latest problem I had was with the printer driver (for the Brother HL-1110; the driver doesn't seem to know that this printer has much less memory than advertised)


Speaking of printing in Linux, I was able to print without any problems on a printer that needed terrible drivers on mac and windows.


I think that's the key to Linux on the Desktop. It generally ranges from very limited to terrible for your average end user. Your video professional might fare slightly better, but realistically is only going to use Linux if they're working at a company that's invested in Linux for their video production stack.

Where the Linux desktop is truly unparalleled is for a developer. As a developer I'm comfortable enough with the technical details to handle the quirks (like doing things from the command line), and the strengths of Linux truly shine (again, the commandline, as well as choices in DE, software, etc). One great example is Arch. On Arch I can install any developer tool directly from the command line with very little effort. Anywhere else and you have to click through endless menus.


Yep I know that :-) My comment was a bit "anti" :-) But sure, it works as long as there's someone knowledgeable in charge (or at least a bit) :-)


The Mac Pro is barely a business for Apple. There's almost nothing about that line of business that makes any difference to Apple. Frankly, the entire macOS part of the business doesn't really matter much to Apple from a financial point of view.

I think they should give people what they want - a high powered upgradable machine. If they don't want to make it upgradable, then they should at least refresh it annually.


That seems to be Apple's thinking but I think this article shows the reason that that's a flawed approach.

Having content producers using macOS is a huge benefit to their brand and I think it's a big part of driving sales of their iOS devices. At the very least, if they lose the content producer market there will be significantly more friction to producing iOS apps.


I don't buy that.

Yes, there's a halo effect, but I don't think it's all that huge especially with respect to video pros and I bet it mostly works in the other direction - people like their iPhone so they are more likely to buy a macOS laptop or desktop. Nobody is buying a Mac Pro because they like their iPhone.

The driver for iOS development is profit and as long as iOS users are willing to pay more for apps, it will continue to dominate. And if you want to make an iOS app, you are going to buy a Mac, but probably not a Mac Pro.

The Mac Pro is like the iPod in that it exists in this weird space where it continues to be sold but is clearly not something that Apple thinks is important.


I agree, though I think persuading Apple's decision makers--particularly those who were present for the clone program in the 90s--would be hard as hell. It was a poorly designed program, and allowed clone makers to beat out Apple on price and speedy updates with new hardware. And those who remember it at Apple probably remember all of the problems it threw at the company's feet back then.

Personally, I think letting the trash can linger on for over 3 years is a pretty loud indicator that Apple's not exactly concerned with the line and that, internally at least, they're well aware of just how big a mistake the design was. Of course, it's possible that it's just a reflection of far larger problems with how the company views and supports its Mac engineers.[0] Plus, the Mac Pro's manufacturing here in the US is problematic for the company. That little political ploy can't be easily reversed, especially with the anti-trade FUD that's so prevalent now.

If Apple were willing to acknowledge the problems, they could certainly design a restrictive licensing program that would allow third-parties to build Macs that target higher-end professionals without cannibalizing their consumer lines too much while avoiding some of the problems from the 90s. Strict design requirements for case aesthetics, limitations on software bundling, and other requirements could be used to limit third-party machines to high-end machines to avoid eating their consumer and laptop sales.

Not that I expect it to happen, or even be considered. But something has to change if Apple wants to keep their hold on creative professionals who are now either being forced to switch, doing extensive upgrades on their old Mac Pros to bide as much time as possible, or building Hackintoshes.

0. https://www.bloomberg.com/news/articles/2016-12-20/how-apple...


Yeah, the "throw-away" $5,000 Mac Pro wasn't such a good idea.


> Or, crazy idea here, they could licence their OS if they don't want to make pro gear anymore.

This has already worked like gangbusters for them for monitors: http://www.macworld.com/article/3169308/macs/apple-stores-st...


Not so crazy: Apple licensed their OS before, and then ended up paying out the clone manufacturers' contracts years later, when they first released OSX. The model worked for Microsoft in the 80s (I bet Big Blue is still kicking itself for that one), but I don't feel confident it'd guarantee their survival now. For one, the operating system isn't nearly as separate a concept anymore -- it's all about the ecosystem.


If you're worried about Apple's survivability, I suggest you dig deep for where your model of how the world works broke.

Microsoft beat Apple for a decade, then Apple beat Microsoft. Both companies are still around, making fortunes. Even Yahoo's still around.


It's one of the main things I see resulting from the Touch Bar: for me, it's about Apple ensuring there's a natural connection between the software and hardware, making any kind of separation impossible.


Licensing could easily eat up their laptop revenues though.


You think? With what, the chinese clones?

I don't mind using macOS, it's a pretty nice experience, definitely nicer than Windows, but when I buy an Apple machine the thing I'm buying is definitely the hardware, and I'm not convinced the clones do a good enough job of matching that. Can I get 14 hours of battery life from the leading Air clone?


I am a Windows, Linux and macOS user (plus some PC-BSD for good measure). I wonder why you prefer macOS so much over Windows? I would agree that Apple has got 4K/Retina displays right, and that is much, much better than on Windows. Is there anything else you can identify as an advantage?


I did the transition from Windows to Mac last year after many years of having both and hating OSX while loving Windows. Now I love it.

The event that changed everything was getting a 34" Ultrawide monitor; the Mac experience was just so much more solid and respectful of the screen space. I can't quite put my finger on it, but I felt more in control.

I have a Surface Book on my desk, but my Macbook Air is what I use daily. For me, Mac wins with:

- Predictable window management - everything stays where you put it

- Tidier fonts and app bevels so it doesn't waste screen space

- Access to apps like Sketch

- No UI delays when using Adobe Premiere, despite the less capable hardware - definitely an issue on Windows

- Much better command line support (not used the Windows Linux subsystem)

- Apps are easier to manage, with less bloatware

- Notes app that is actually helpful

- Fast with an SSD - absolutely rubbish without (my mac mini is unusable)

- I have gotten used to clicking the window first before doing something in that app - so fewer accidental actions

But, I do still love Windows, it wins on the following:

- Better graphical support (ok, hardware specific)

- Windows Explorer is much better than Finder for me

- Window snapping is easier (but might be also frustrating, can't put my finger on it)

- I can open multiple calculator instances

- Wider app support, but less of an issue these days for what I use

Windows does have a lot more bugs, such as incorrectly scaled cursors in Premiere, or apps that are impossibly small or stupidly large when using a non-retina second monitor.


You can run multiple calculators from the terminal:

/Applications/Calculator.app/Contents/MacOS/Calculator&


While you're in the terminal, you might as well fire up `bc -l`, etc. I can't stand GUI calculators that don't display the whole formula.


I've used bc for decades. It's versatile, ubiquitous (Unix, Linux, MacOS), and easy to use. I find it so much better than some calculator I have to use with a mouse.

A developer friend I know used to always prefer the even older unix command line tool, dc. It operates with reverse Polish notation.

Now, I often just start the python repl (type 'python' at the command line) for quick calculations that might require slightly more power than bc.

To bring this back to the original topic, it is this easy access to the underlying unix tools that made me switch to Macs in the first place. Windows, IMHO, lost some points with the emphasis on touchscreen based UI and gained some points with the new Linux shell support.
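For anyone who hasn't tried them, here's roughly what those three approaches look like from the terminal (a sketch assuming GNU bc, dc, and Python 3 are installed):

```shell
# bc: pipe an expression in; -l loads the math library (sine, cosine,
# arctangent, log, exp) and sets 20 decimal places of precision
echo '2^10' | bc            # prints 1024
echo '4 * a(1)' | bc -l     # pi via arctangent

# dc: reverse Polish notation; push 2 and 10, exponentiate, print
echo '2 10 ^ p' | dc        # prints 1024

# python: a one-liner, or just run `python3` for the full REPL
python3 -c 'print(2**10)'   # prints 1024
```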


For quick calculations I just use spotlight or alfred, whichever I happen to launch. For more complex calculations I use Soulver.


Soulver is very similar to a project I started working on a few months ago. I guess I didn't do enough market research!


I think there is still place for similar projects. Although do look at Calca as well if you have not done so yet.

In regard to both, I suppose that there is an opening to something more complex.


Even easier:

    open -na Calculator


I did not know this one! Thank you!


I would describe the command line support in OSX to be sub-par to what is offered in Windows. The terminal itself feels like a toy compared to the Windows 10 command prompt. And posix doesn't seem to hold a candle to Powershell.

After having given Powershell a chance I don't want to go back to bash.


> I would describe the command line support in OSX to be sub-par to what is offered in Windows. The terminal itself feels like a toy compared to the Windows 10 command prompt.

Pretty much anyone I know on a Mac uses iTerm, which is definitely more capable than the Windows 10 command prompt.

> And posix doesn't seem to hold a candle to Powershell.

In what specific ways? It seems way more intuitive to me and powerful enough for any task that it's not worth bringing a language like Python/Ruby/Go etc. for.


Piping grep feels absolutely vintage to me.


Usually, grep itself is powerful enough to do what you want on its own, you only pipe grep if you do not know grep well enough.
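To give a couple of concrete examples (standard GNU grep flags, nothing exotic), many common pipelines collapse into a single grep invocation:

```shell
# a small sample file to grep against
printf 'alpha\nbeta\nALPHA\n' > /tmp/words.txt

# instead of `cat /tmp/words.txt | grep -i alpha | wc -l`:
grep -ci alpha /tmp/words.txt            # -c counts matches, -i ignores case

# instead of chaining two greps for two patterns:
grep -E 'alpha|beta' /tmp/words.txt      # extended-regex alternation

# instead of `find . -name '*.txt' | xargs grep alpha`:
grep -rl --include='*.txt' alpha /tmp    # recurse; -l lists matching files
```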


That's really interesting, thanks for taking the time to give a detailed reply.

I love the iOS platform, it just works and has never failed me (I am leaving iTunes out of this). However, even with the scaling/DPI issues I find Windows 10 fast and easy to use. I must be biased as I've used Windows more, but I thought I'd love macOS more than I do, having loved my iPhone & iPad experience. I do always have to put the taskbar at the top of the screen, but that I think goes back to my Amiga Workbench days!


>- I have gotten used to clicking the window first, before it doing something in that app - so fewer accidental actions

Jeez is that an actual thing on OSX? That would just make it feel unresponsive to me.


Not entirely. For text fields and buttons, clicking them will usually work even if their window is in the background. But clicking them will also focus the background window.

If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.


> If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.

Whoa. I had no idea!


>If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.

Damn, is there any equivalent functionality (be it integrated or third party) for windows? I'd love to be able to run games in fullscreen exclusive mode without having them lose focus every time I want to look at a different chrome tab on my secondary display.



My desktop computing happens on Windows because I need graphics card support (games at home) and because my work PC is Windows. I chose the MacBook Air for my travel machine because it's extremely light, its battery lasts more than a day and macOS is unix-y. I initially planned to install Debian on it, but I couldn't convince myself that the power management would be as good, so I left macOS on it.

Window management is vastly superior in my opinion. I frequently just full-screen everything and swipe between the full screens; this is particularly notable in my case since I almost exclusively use the keyboard (I get bad RSI from mice).

Updates feel like they have much less friction.

But basically it comes down the underlying unix-y system. I'm never going to do serious development on a laptop but if I pick up a macOS machine it has the unix core tools that I need to get stuff done, vim, ssh etc. (Though there are some pain points mostly they're fixed by brew for machines I'll use for any length of time).

Also, I helped someone set up Parallels the other day on their MBP and tried out air drop for the first time, show me friction-less 4GB ISO copying OTA between Windows laptops that aren't on the same network!


I wouldn't mind buying something like a Xiaomi Air if it had native macOS support. I'd be picky about display quality though. I've never gotten more than 7h of usage out of my new 2016 MBP 15".


As long as it doesn't hurt the iPhone, do they really care?


The design would have worked just fine if they'd supported it. They won't sell you an SSD upgrade, which is absurd. The design should have made CPU and GPU upgrades possible, even if they were only available from Apple, but those never materialized.


We live in very different bubbles if you see common users being the most sticky Mac users. Almost every common user I know, with the exception of a few students, considers Macs ridiculously overpriced and buys <£500 laptops. Apple are moving further away from those users by increasing prices, and I think they'd rather those users buy an iPad Pro anyway (though I've never heard anyone non-technical say they want one).


> We live in very different bubbles

HN is a North American bubble, mostly Silicon Valley. Apple computers are a thing only in North America and parts of Europe.


Just out of curiosity : do you know what's "the thing" in, say, China or India ?


While it doesn't tell the whole story, just looking at things like sales numbers and user agent stats gives you a pretty good idea of the relative popularity in different areas. Apple just has very little presence in some of these places, _especially_ for desktop.

Take a look at:

http://gs.statcounter.com/os-market-share/all/united-states-...

vs.

http://gs.statcounter.com/os-market-share/all/india


Interesting, but I don't consider iOS or Android a real operating system.


Probably pirated Windows!


This has nothing to do with the focus on laptops and everything to do with the simple fact that the new Nvidia cards are not supported, and there doesn't seem to be any indication from either Nvidia or Apple that they are committed to providing that support.

It's a strange problem, because there is plenty of demand for it, and I think we all appreciate the outsized role that GPUs play these days; to not offer support for 50-80% of the GPU market seems like a rather poor strategic decision. Particularly since you're really only talking about a team of 30 or 50 within Apple to help Nvidia, the drawbacks are minimal.

This has been an ongoing problem since the summer. Some have reverted to using several 9xx cards (which have spiked in price) while others have switched platforms. Lacking any real progress on this, I suspect many in this situation will abandon OSX permanently by the end of the year. And if you give up OSX on your desktop, the incentive to stay in that environment on your laptop, tablet, and phone goes way down.

This is a serious problem and the only outcomes are either a) Nvidia GPUs are supported, or b) OSX is abandoned, because the simple fact is that Nvidia GPUs are more important long-term than the entire sum of Apple's hardware; I can replace a tablet or desktop or laptop, but I can't replace a Pascal TITAN X.


Apple seems more interested in getting the higher margin from using AMD GPUs in at least some generations.


Supply chains being valued over product quality?

This fits the "operations guy running the company" narrative.


If/when I leave macOS, it'll be for Linux or BSD, not Windows. I have to use Windows 10 3-5 days a week, and rare are the days I don't curse at it.


Just curious, but have you tried the beta builds?


> While Apple is focusing on trying to create the thinnest notebook on every generation, other companies are actually making useful computers, laptops or otherwise.

I've purchased the second generation (new) Macbook this year, and must say, I'm delighted. I honestly don't get what all the fuss is about. It just works.

Why would I change something that works really well with something that might work well?


My new MacBook randomly crashes, and the Touch Bar is a real nuisance. The wretched USB-C-only ports are a real pain for me. It's slower than my former MacBook Pro (which is strange) and the battery life is worse too. The most ridiculous thing, though, is that I need a bunch of adapters to charge and sync my iPhone 7 with my machine. If only there were a positive side to it I would probably be less unhappy, but I cannot think of a single one. I do. not. understand what Apple was thinking when releasing it. I'm not the only one at the office that has these issues either. And no: we don't have any weird stuff on it. It leaves me super conflicted: MacBooks used to be the safe choice for a high-quality laptop. Now I don't know what my next device will be, but it certainly won't be an Apple product.


Maybe you should have done a little research before buying it? I replaced my 2012 retina MBP with the maxed-out 2016 one. It's a blast; the Touch Bar, while not that useful, just works and IS more useful than the old function keys. And it's fast; I don't see how it could be faster, actually.


There is this misconception among HN folks that a developer is a UNIX guy. Get real: there are many kinds of developers.

Many of us, developers, don't care 1 split second if the OS we are targeting has any kind of UNIX support.

Mac is still a good platform for developers writing OS X, iOS, tvOS and watchOS applications.


It is very painful for me to develop on Windows. The development tools seem very disjoint and not integrated with the OS. Some projects use Cygwin, some use Visual Studio. A lot of the Visual Studio options are tucked away in different configuration dialogs. Command Prompt is borderline useless. On *nix I can do everything I need to outside of an IDE, and if I run into a problem, there is documentation for just about every program. OSX is pretty nice with brew. At the end of the day though, Linux just feels like it was made for developers.


Why bring up Visual Studio and command prompt when you have VSCode and Windows Subsystem for Linux?


Those are fairly new and most projects do not use them. If a project was started before Bash on Windows was released, it will most likely be built with Visual Studio.


Being a developer doesn't mean using the command line.


I don't think that's what he meant. As someone who has done some development on Windows, doing anything was a pain: even getting .NET packages wasn't as easy and intuitive as a Linux package manager is, pretty much every JS framework is easier to get, update, and configure on Linux, etc. Even working with PATH on Windows seems harder than on Linux.

It's the overall experience.


As if Windows were the only alternative to GNU/Linux in what it means to be a software developer.


The others are all UNIX-based, having all the same fundamental advantages, so I don't see your point.


Everyone working on software products for:

- ClearPath

- IBM i

- IBM z/OS

- PS 3 and 4 OS

- WiiU OS

- XBox 360 and ONE OS

- Integrity

- µ-velOSity

- RTXC Quadros RTOS

- VxWorks

- L4

- ...

is also a developer and they aren't UNIX based OSes.


> PS 3 and 4 OS

At least in the PS4 case, it's based on FreeBSD, but that doesn't matter because you do not develop directly on it, but use a dev kit/SDK.

> WiiU OS & XBox 360 and ONE OS

You can't develop directly on these. You use Windows-based dev kits with custom SDKs mostly.

I am not even going to continue, since you clearly didn't want to understand OP's or my arguments, which was that Linux/UNIX is usually the best dev platform. Even if you target embedded, you actually develop on UNIX. There are exceptions for highly proprietary platforms of course, but that's not what most people on HN tend to develop for, (i.e. it's mostly web dev, mobile dev and such).


And yet you're arguing for this by replying to the top comment.

You have a point but I think it's undeniable that a significant share of developers wants a UNIX environment. Me included.


It is called visibility.

I don't deny there are developers that care about UNIX, and even I do care, occasionally.

What I don't accept is that for whatever reason now to be a developer one has to breathe UNIX, as if there wasn't anything else a developer might be.

This is yet another reason why sometimes I wish Apple had bought Be instead.


I kind of agree with you but at the same time I've yet to see a system that sucks less than UNIX style systems.

Unless we're talking game dev, real-time 3D graphics and such - then Windows is IMO unparalleled - I have not seen comparable tooling/driver support on other platforms - although I have not tried using Metal.


Thing is, a CLI is nice, but there's nothing it does that cannot be done better in language REPLs.

UNIX architecture is the "Worse is better" of OS architectures, and POSIX is irrelevant when one uses programming languages with rich runtimes.


>but nothing that cannot be done better in language repls.

You're thinking on different abstraction layers - sure given a library for runtime X a REPL will be better than CLI, but when you're talking about processes then having functionality exposed via CLI vs Windows way of monolithic apps with GUI only - it's a huge difference for automation/testing/workarounds/re-usability/etc. - and you don't really care what the underlying platform/runtime is.


Uh, most of Windows is exposed via PowerShell and can be run completely from the CLI. .NET, the primary development stack, is completely scriptable from the CLI. It's quite powerful. I'm not saying I love the PowerShell language, I don't. But I don't love Bash either.


I can do all of that with Python, Ruby, Powershell, AppleScript,.....

No need for UNIX CLI for it.


Again you're talking completely different abstraction layers - unless you're saying that you can use a scripting language to use something like windows automation API to click GUI and stuff.

My point is that Windows has a culture of black-box monolithic apps that expose functionality through the GUI (if not exclusively through it). Nothing is stopping you from exposing stuff like cmdlets for PS and composing them to build higher-level systems, but people don't really do this, and they do in UNIX systems; things end up being more transparent.

Nothing to do with any shell or anything really - more about UNIX philosophy of small focused apps composed together vs monolithic apps.

When you're developing stuff UNIX approach is IMO way preferable.
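A toy illustration of that philosophy: counting which login shells appear in some passwd-style records by composing four small tools, where the Windows idiom would more likely be one monolithic GUI report:

```shell
# field 7 of /etc/passwd-format records is the login shell
printf 'a:x:1:1::/h/a:/bin/bash\nb:x:2:2::/h/b:/bin/sh\nc:x:3:3::/h/c:/bin/bash\n' |
  cut -d: -f7 |   # keep only the shell field
  sort |          # group identical shells together...
  uniq -c |       # ...so uniq can count each run
  sort -rn        # most common shell first
```

Each stage can be tested in isolation, and swapping one stage (say, `sort -rn` for a grep) requires no changes elsewhere in the pipeline.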


Windows is not the only alternative to UNIX; there are plenty of OSes out there full of developers.

UNIX is also full of black-box monolithic apps with GUIs; there is a UNIX world out there beyond the GNU/Linux and *BSD FOSS culture.


Developing on Windows is only ok if you're a .Net, Java, or node.js developer; which arguably encompasses a lot of developers. However it's a pain if you're using anything else


Sure, but according to many on HN, being a .NET, Java, JavaScript, C++, Swift, Objective-C, Android, PL/SQL, Transact-SQL, PG/SQL, PS3, PS4, WiiU, XBox 360, XBox ONE, Cobol, or RPG developer isn't being a developer, as it implies for whatever magical reason having to use POSIX and a UNIX CLI.


I either may not be reading HN as much to see the trend or you may be confusing HN with reddit.

(btw most video game development is in C/C++)


> video game development is in C/C++

So what? It surely isn't done in GNU/Linux or UNIX.


My point is that there's no need to list 'PS3, PS4, WiiU, XBox 360, XBox ONE' separately when it's covered by 'C/C++'. Just being pedantic.

I also still don't see HN's bias against using Windows as a dev environment, especially when a lot of people on HN create stuff on .Net. Sure a lot of people like nix and Apple here, but it doesn't mean they look down on people who develop on Windows just because they don't want to do it themselves.


Why Node? I'd assume the 2nd class unix support would still make it a pain.


Because I'm assuming MS invested a lot in making both Node and npm first-class citizens on Windows. I was pleasantly surprised as well when I started working with it last year on Win 7; everything worked out of the box without any tweaks such as the use of Cygwin.

I was free to choose my tech stack but I couldn't choose my OS, not even Windows 10... which means I don't have access to Docker. I couldn't use a VM either because the app needed to run on Windows.


I am never going back to Windows -- of my own accord.

What I did is, buy an XPS 13 preinstalled with Ubuntu 16.04.

It is the best of both worlds: support for the latest hardware and a trusted Unix-like system.

cheers!


If you can get away with using a Chromebook for a laptop, you might not really be the target user for the Macbook Pros.

FWIW, I have found the new MacBook Pros fairly incredible. They are thinner and lighter, which is quite valuable for people who use notebooks on the go, while still being more powerful than their predecessors.

If I had one complaint, it's that they didn't keep around the old 15-inch chassis for a model with more RAM and a bigger battery. Instead they just have one model and aim it at a very specific niche.

But overall, they are very good notebooks. You can power multiple 5K displays from them, for God's sake.


Over the past 2 months I tried hard to "make the switch". I built up a nice shiny new PC, installed Windows and Arch Linux, and told myself I can do this. Two months later, I removed Windows from the box (Ubuntu on Windows is decent, but still doesn't run a default Rails app), and am running Arch exclusively. While I like the setup, I realized that I will ALWAYS be at least 20% less productive on a Windows/Linux box.

The keyboard shortcuts, the UI inconsistencies, and on Linux the lack of native apps (specifically Evernote) just didn't cut it for me. As trendy as it is to hate on Apple right now, macOS is far and away the best operating system for developers and power users. It's not even close.

Having a single main modifier key on apple (⌘) is SOO much faster and more productive than switching between (alt) and (ctrl) modifiers for switching windows, copy/pasting, new window/tab, etc. It just kills my workflow.

I love my PC build and Arch Linux is pretty good. But I'm currently scoping Craigslist for used iMac 5Ks. The productivity loss just isn't worth it.


I guess they are more popular with ordinary users, so they are focusing on that audience. No problem with that.


Except that new apple applications have to come from Apple users. If the devs aren't happy with what's available, that means fewer applications. Fewer applications means fewer users, fewer users means a less attractive base for devs... the spiral continues ever-downward.


Or maybe, if you are a Mac/iOS developer locked into the ecosystem, looking at the emojibar with a wtf face.

I mean, I knew what I got into; I just hoped Apple wouldn't neglect the people who made its platform desirable for the masses.

I hope this is just a wave serving the 'Starbucks pros', and next time we'll get a decent update.


> developers next,

Why would developers switch from mac to windows? If they're alienated by what Apple is doing, they're unlikely to be happy with Microsoft. Linux, sure.

Of course, I think this is exactly the sort of misprediction people on HN tend to make, where it turns out that (surprise) Apple actually knows more about market demands than random internet commentators. Remember how we were predicting that the new Touch Bar MacBook Pro would fail hard, but it ended up being the best selling MBP ever?


When the Linux Subsystem is finally finished, I personally won't see any reason to use Linux as a desktop environment.

As for MacOS looks - Microsoft is also releasing an update to Windows' visuals this year, which looks pretty darn aesthetically pleasing. [0]

[0] https://blogs-images.forbes.com/amitchowdhry/files/2017/02/P...


Oh come on, there are plenty of reasons; it's a case of whether they affect you enough personally. Telemetry, a closed-source kernel/base system, Microsoft's history of nasty practices in the computing industry, EEE, having to pay money vs. free as in beer, and outside of the VS ecosystem a generally poor development experience.


Well, the new Microsoft is actually one of the biggest open source contributors.

I'm a UNIX developer (mainly Go), and I use a Surface Pro 4 with Win 10 for development. Bash on Ubuntu on Windows already runs great, and I have the ability to sketch diagrams whenever I want to. Additionally, I have Docker running flawlessly and a system that just works 99.9% of the time (as opposed to Linux, where I have to hack my way through most things).


What software do you use to sketch diagrams? Pre-Surface I was switching between Visio and yEd. Now that I have a Surface I am using OneNote but have a feeling I could be using something better.


The Windows 10 app version of OneNote. Though I sometimes use Lucidchart.


For me that picture indicates all manner of problems. The main window does not follow the Windows user guidelines, which is bad enough in Windows 10 already. For decades, double-clicking the top-left corner of a window has meant that you want to close it - it was explicitly mentioned in the 3.11 manual. With 7 (or was it Vista?) they removed the icon from Explorer for no obvious reason, but kept the functionality: you could still double-click the invisible icon and it'd do what you expect.

With early Windows 10, the Settings app didn't even have a title bar, so there was no clear demarcation between the window grey and the titlebar grey - how would I know that the grey bar at the top was the way I moved the window but the grey bit beneath the grey bit at the top DIDN'T move the window??

With current Windows 10, the Settings app finally has a title bar with a colour like all other applications. But the paradigm we learned 20+ years ago (double-click top left to close the window) and the ability to get the window menu HAVE GONE. I can still press Ctrl-space and get the window menu, but the ability to get the window menu from a Metro app or a Metro-style app using the mouse has completely gone.

So now I have to learn another method of interacting with the basic building block of the system - windows. When each window behaves differently according to the "metro or not" measuring stick, this does not make for a great user experience. And this is me coming from using Windows since the 90s. How do you think new users and my mum are going to get on??

What's the point of having window guidelines if Microsoft themselves don't even follow them??

So although the screenshot is aesthetically pleasing, I remain very concerned about the direction usability is taking on the platform.


I don't know, when Apple decides to change things to go against the established norms they call it being courageous.

Somehow, I'm not bothered about ignoring learned paradigms from so many years ago. For example, my 12-year-old daughter could care less how you learned window management 24 years ago. She's learned to use Windows as it is today.


Just picking up an odd expression you use - are you American by chance? I notice this expression in films and interviews and online and it's wrong: "I could care less".

This expression implies that you currently have a high level of care about a subject, and that you have a range or distance to travel until you reach rock-bottom/zero regarding your level of care for the item.

In fact, if you were to care 80% or 100% about an item, the expression would still be true: eg. "I could care less about my health" which would mean I care GREATLY about my health and therefore have range to drop my level of care from its current position.

The expression that everyone should use is "I couldn't care less", or "I could NOT care less".

This implies that you are at 0% of your care about an item, thereby having an inability to move below your current level of care. It is already at rock bottom. It cannot get any less. It cannot go any lower.

Subsequently, I am assuming that you meant that your daughter COULD NOT care less about how I learned window management. i.e., she wouldn't care in the slightest.

I had to ask because I am hearing this expression more and more and it's incredibly irritating because it is wrong.

I never hear any native speakers here in the UK saying it, but judging how manners of speech drift over here I imagine it is only a matter of time before I hear people saying it, much the same way I now hear the word "like" sprinkled into sentences needlessly. When I hear this expression over here I think I'll suffer an apoplexy and drop down dead.

EDIT: By the way, I don't think Apple's changes have been courageous. I find that they stuck their head in the sand for too long and finally did sensible things. For example, on Snow Leopard and prior, you COULD NOT resize the window from any corner but the bottom right. This was officially STUPID. Thankfully they realised how stupid this was and enabled the ability to resize a window from any edge. Only took them 30 years.

Also in recent versions they have been making a big fuss about the ability to go full screen with applications. Windows have always been resizable, so the ability to get a window fullscreen and not see the menubar is not really a big feature. Nor is the ability to run two fullscreen apps side by side - surely we have been able to put windows side by side since the 1980s? Just because the menubar has gone does not make it a feature. This is not courageous, it is dumb.

They have also made regressions. For example, now when you want to use Mission Control you must move to the top of the screen to see the desktops. Hang on - just tried it again and now it works again. Hurray!

Their new filesystem is still significantly behind NTFS in features. Sad to believe we're all still hobbling along on HFS+, with big-endian to little-endian byte swapping on every read/write of metadata.

I am not saying that Apple does things any better with software development, but the feel across the OS is generally consistent. This is not the case on Windows 10, eg. how do you find out which build version you are on? Press Windows key + Break to see system properties.... oh it's not in there. I suppose I need to run the Settings app > System > About to display a duplicate window. How do I join a domain? I could use the Settings app and it'll mention joining a school. I could also use the duplicate domain settings from Advanced System Properties and join there, but there's no mention of the word "school". Why do the two sets of windows look different? Why is there even two sets of windows? What the heck is going on?

So now I have Control Panel, the Settings app, Microsoft Management Console and its snap-ins. They all do the same thing. Are the teams not talking to each other at Microsoft? Is nobody at the helm?

For extra fun, look at shell32.dll and see how many icon styles are inside that thing. That's not even an architectural change - that's just pictures in a single DLL, owned and controlled 100% exclusively by Microsoft. All of the icons are from different eras. If they can't get even little pictures to be consistent and uniform, I do not hold much hope of uniform system behaviour.

So you can see the cobbled-together nature that Windows 10 exudes, despite a fantastic opportunity to make the entire system coherent again. And I say this as a C++ developer on Windows as my day job.


Yeah, I totally agree, but it must be really hard to implement new guidelines for a monster (super large) company like MS.

They are moving slowly, but IMO in the right direction.


In fairness, while double clicking the upper left icon did work, it was non standard and redundant.


It's not whether Windows is aesthetically pleasing. It's that it's 2017 and some first-party Windows apps still upscale fonts rendered at half resolution. It's that there are two control panels. That kind of messiness is endemic to the platform.


I don't agree with the notion of blaming the platform because some devs refuse to update to modern standards of the platform. That's on the devs.


First party, as in, provided by Microsoft.

I just ran Win10 for the first time yesterday to work on a security guideline, and I was blown away by how janky the whole thing was. Using it is like crawling through 20 years of computer history; there are things from every phase of Windows since '95 in that UI, and I don't mean UI hints, but entire applications.

I bet somewhere in there there's still UI that dates back to Win3.1.


Well, I think it is quite possible that the devs on the first-party team are not the same devs on the platform team. First-party or not, if the platform provides the tools and the devs refuse to use them, it is on the devs. Not the platform, the devs. Unless you can show me where the company demands its devs do so for no good reason.

>> I bet somewhere in there there's still UI that dates back to Win3.1.

I'm sure there is; that's a mix of backwards compatibility and not messing with stuff that works. Some people want it, some people don't; you can't make everybody happy. Can I complain that OS X provides a terminal that looks like it dates back to the '70s?


No? Because that complaint doesn't make any sense? Especially since Terminal.app is far more modern than CMD.EXE?


> Especially since Terminal.app is far more modern than CMD.EXE?

To be fair, a lot of Terminal.app dates back to ancient Mac OS too. The menu bars, for example, are all still Carbon. There is still a full classic Mac OS main loop running alongside the Cocoa main loop in every macOS application. Apple's done a good job hiding it, but it's still there.

Anyway, cmd.exe is a bit of a special case. What you're really seeing is conhost.exe, which is kept the way it is because it's part of CSRSS, which occupies a similar place as PID 1 does, and Microsoft doesn't want to bring new UI into it for fear of increasing its dependencies [1].

[1]: https://blogs.msdn.microsoft.com/oldnewthing/20071231-00/?p=...


It does make sense if I insist we only consider my own personal subjective opinion on the matter, which is what most complaints along these lines tend to be.


The 3.1 UI that still exists is silly stuff like file open dialogs that aren't "desktop aware" or whatever. There's no reason at all to retain such "compat" because it instead requires users to learn where all such magic folders live in the filesystem.


I'm with you there. Windows 10 looks like developers designed it. Not meant to come off negatively to developers but sometimes we just get lazy and use solid colors.


Windows is beautiful compared to the Mac OS where the software all looks like it was inspired by a 1970's era stereo unit.

The ugly UI isn't even the worst part about the Mac OS though. The worst part is that it just doesn't even come close to offering the same sort of freedom that you get on Windows where Microsoft leaves hooks in to let developers actually do what they want.

Most of the problems with the Mac OS are by design, too. I think it's hilarious that Apple folks think it's a really good idea to hide the label on most buttons. I guess you have to "just know what it is" before clicking it, or hover over it and hope for a tooltip to pop up and tell you what the thing will do. Real efficient.

There's really no wonder in my mind why most people don't use Apple anything.


I have to agree despite your downvotes. From 8.1 onwards, mobile and desktop, Windows has been the most beautiful OS. Using my iPhone 7 feels like stepping back in time in terms of the visuals.


I dual boot Ubuntu and Windows on my desktop, and I only switch to Windows when I need to use an application that is Windows only. I find it to be a more pleasant and user friendly desktop experience. Switching back to Windows honestly feels like stepping into the past.


I dunno... part of why I use Linux or MacOS is the knowledge that under the hood they're well designed. With Windows, no matter how good it looks, I feel like it's an ugly system, not to mention insecure.


How so? It's definitely an unfamiliar system, for me, and there might or might not be vulnerabilities in the current implementation, but I have no reason to believe that the windows NT model is intrinsically less secure than the unix model.


Err, modern Windows has many good security measures, and in many cases Microsoft is ahead of the curve in adopting new security techniques.

For me, "insecure Windows" is an old trope and not particularly accurate any more.


The most insecure part of the operating system is the user itself. You can only do so much to stop people from doing dumb things.


The security infrastructure behind the Windows desktop is a decade ahead of the Linux desktop. Yes, kernels are okay in both cases, but e.g. Xorg is a security disaster.


That's why we have Wayland, (and yes you can use it today, just install Fedora 25 or Arch with GNOME).


Reading books like "Windows Internals" would reveal how things are actually designed.


I just want to randomly poke into the discussion to say that, of the three big OSes, the MacOS user interface is the most vomit-inducing. I know this feeling isn't shared by most.

Now that Windows has nice workspaces and an already usable UNIX environment (the Windows Subsystem still has things to iron out, but it's very nice for a beta), my personal ranking has become:

Linux > Windows > MacOS

I see nothing that MacOS offers that Windows doesn't, whereas Windows has potential for much better hardware specs, games, bigger user base.

The only thing left for Apple is their 16:10 screen and their per-monitor scaling.


Same ranking for me here. At home I use Linux and would never go back to the alternatives, but at work I had to use Mac and Windows. If the Mac did not have a good terminal (iTerm2 is what I used) then it would be no competition at all.

I do miss a good terminal in Windows, but I assume the Linux subsystem will fill that hole.


It's not the best terminal, but give Hyper a shot. It seems to work fairly well, plus it's cross-platform so you can take all your configs. Downside is it's an Electron app.


I'll have to check that out, thank you!


How can it be ugly if it looks good?


No package management on Windows? Even if they somehow managed to do an app store, it wouldn't be tailored for developers.


Windows already has an app store (although it's pretty terrible). It's had non-networked package management for decades (through .msi files that end up as line items in Control Panel). Nowadays there seems to be a move to nuget and chocolatey.


It kind of does now: https://chocolatey.org/


I use apt-get.


If you are dependent on the linux ecosystem, it seems counter-productive to awkwardly force your workflow on another one.


It's not awkward at all. I use cmder, which uses the Linux subsystem beneath it.

For example:

- apt-get install php

- php -S localhost:8081

- Launch Chrome on Windows, go to localhost:8081 (it works)

There is zero awkwardness or extra configuration there. Only problem is that some things do not work properly yet (because still beta). For example `go get github.com/mattn/go-sqlite3` will not work because of some ?gcc? compilation issue. But it's getting fixed on next release.


The walls of text on this lovely page [0] are a set of good reasons to consider before switching to Microsoft. If nothing else, EEE [1] is.

[0] - https://en.wikipedia.org/wiki/Criticism_of_Microsoft

[1] - https://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish


Oh, good old EEE. So that's why they open sourced and redesigned .NET Core and C#? To extinguish it?

Come on.


Yes, this is the "Embrace" Phase.


So what will they be extinguishing?


.NET, obviously! :'D


>EEE

end this FUD


Best-selling? Sure, I can believe that. Pent-up demand alone likely accounts for much of it.

Best-selling to developers? Not a chance.


Any statistics to back that up? Just because some random Hacker News commenters dislike the new MacBooks doesn't mean other developers do.

I find it hard to see any realistic alternatives to MacBooks for software development. They give you all the tools you need from the Windows world that are not available on Linux, like Office, Photoshop, etc., and it is still a UNIX system which can do all the stuff that Linux can without kernel panics, crashes, bugs, and other crap. Also, unlike Windows or Linux, a single macOS installation can last pretty much forever and doesn't need to be reinstalled every year because of lost performance like Windows. And last but not least, MacBooks are well optimised for the hardware they run on, which gives them 1.5-2.5 times the performance of the same hardware running Windows or Linux.

The whole complaining about the new MacBooks is the same thing as with iPhones. People complain because it's trendy to whine about anything Apple does, even if they were to create eternal world peace, and still everybody will buy them because there is nothing better or even similar on the market.


>Also unlike Windows or Linux, a single macOS installation can last pretty much forever and doesn't need to reinstalled every year because of the lost performance like Windows.

You need to update your arguments. It's 2017, not 1998. I haven't reinstalled Windows on my desktop system in years and it's running perfectly fine.


As a developer on windows, I'd have to disagree with most of what you said.

Windows hasn't been less stable than a Mac for years if you buy equivalent hardware (instead of scraping together a system with junkyard parts or getting a cheap Walmart laptop).

Windows does not need to be reinstalled every year if you don't treat it stupidly. My current work Windows install is 4 years old, has seen two machines (disk swap) and two major Windows versions (in-place upgrade), and it runs as well as when I first used it.

OS X is only faster than Windows on Mac hardware because Apple deliberately ships bad Windows drivers with Boot Camp. On equivalent hardware from other vendors it is just as fast.

It sounds to me like you don't actually know Windows all that well. I use a Mac mini at home and a Windows laptop at work, and I'd say they're pretty much equivalent for my purposes.


You mean direct, market-research-supplied numbers? No. I am just one of thousands of people who make a living in the tech world. What I can say is that:

1) A very large portion of other devs I know use(d) MacBooks, and

2) A very large portion of them said "F this, I'm out" when Apple released their most recent machine. Many switched to Surfaces.

Your hyperbole against Linux and Windows does your point no credit. Linux has been rock-stable for a very, very long time, and Windows 10 is a pleasure to use, particularly on touch-enabled devices. MS went through severe growing pains with Win8, but those days are past. And the OS/hardware combinations that have emerged since make Apple's offerings feel old and out of touch.


I agree with you except for "Linux has been rock-stable for a very, very long time". This depends on what you mean by "rock-stable" but things begin to get really annoying really quickly once you move on from relatively simple tasks. Even basic stuff like getting good battery life on laptops, or waking up from sleep/suspend/hibernate, using >1 monitors can be flakey or unpredictable. If they've worked fine for you then congrats, but when they don't it can be infuriating.

This is not to disparage the kernel developers, package maintainers or the distributions themselves - there's a lot of people doing excellent work without being compensated at all. However let's not kid ourselves - linux can still be tough.


I run Ubuntu 14.04 as my daily driver on a custom-built PC. I agree with you that Linux is not necessarily rock solid, but I would instead suggest that the compelling difference these days is that it can be if you put in the work. In years past, that wasn't even the case.

Unless you buy something from System76 or do very meticulous research on hardware support and build your own, Linux is not going to be seamless out of the box. But the real highlight is that it can be seamless at all these days.

My PC has four GPUs, 128GB of RAM and a deca-core CPU. I have four monitors, a bluetooth mouse, and I almost never turn it off. Instead, it automatically suspends. When I'm away from home I can ssh into my computer if needed. I have daily backups that are essentially seamless.

Now, it was difficult to get to this place. I would say that Linux is, and has for a long time been, rock solid if you're not using a GUI. In my opinion, the single greatest impediment to "Linux on the desktop" is xorg, which is terrible (to put it mildly). Nouveau is absolutely crap, especially since Nvidia puts out its own drivers. But my point here is that after a few days of careful installation and setup work, I have had a seamless desktop experience for something like eight months.

What needs to happen for Linux is for a software company to take something like Debian, fork it, and dramatically bring it up to wider compatibility with modern hardware across the market. In some sense that's Ubuntu, and it shows in how easy my daily experience has been.


Well, let me rephrase: Linux is as stable as you want it to be. If you want an ideal Linux laptop experience, you need to do the research and get a model made with Linux in mind. Otherwise, you're just rolling the dice.

That being said, I only use Linux for server-side work, and I can easily say that it is every bit as stable as any other server OS I've used, and has been for a very long time. There are plenty of weedy areas to play in if you want to, of course. But if your goal is It Just Works, you can do that, too. Perhaps most importantly, you can do it easily, with just a little foresight during the spec stage of planning your project.


> Many switched to Surfaces

They switched to Surfaces because they wanted power-user hardware?


They switched to Surfaces because they saw what I was doing with mine, wanted to be able to do the same, and Apple had no comparable answer. Simple as that.


But that is just the people around you. Anyone amazed by the Surface (tablet and touchscreen things) switched, but that is OK; it is not so much Apple's loss as MS's gain. I do not want a touchscreen, I don't need it. I need a classic laptop, and many Apple MacBook users want just that: a laptop, no touchscreen.


For what it is worth, I used to feel exactly the same way until I got a machine that had one. Now I miss it when touch is unavailable. Being able to use touch, mouse/trackpad, & keyboard all at the same time is, perhaps surprisingly, a very pleasant way to work.

To pick one example: being able to flick through long documents with a finger while keeping focus on other parts of the screen alone has been a huge productivity booster, and I do not say that lightly.


How is this different from being able to flick through long documents with a finger via a touchpad (aside from not having to reach up to the screen)?


It turns out that it is less effort to flick/move directly on the item you want to adjust than to do it via proxy on the touchpad.

Like I said before, I was firmly in the "touchscreens are a waste of time for me" camp until I actually started using my Surface. Not trying to sound like an ad, it just has been my actual experience.


A touchpad gets that for me. The only way I can see what you're saying being more "comfortable" is if you hold it in hand or horizontally on a table while reading.

Most of my "work" time I spend in a terminal: white text on black background, keeping my fingers on the home row. I cannot see a scenario where a touchscreen makes me more productive except in some edge cases, and btw those don't gain over what I already have now. But that's just me. To each their own; that is the only fair thing to say.

But making assumptions and claims based on your own point of view is not the best thing to do. With Apple especially, it is always the same story: customers leaving and rarely satisfied (there was a period around Snow Leopard, IIRC 2009-2010), but still I can't find anything better than a MacBook. Dell and Lenovo came close, but when I install Linux on them I make a few more sacrifices; summing everything up, Apple just works for me. (I won't even mention Windows. 100% not my cup of tea.)


I have yet to meet a developer who hasn't said he's jealous of me being able to sketch on my SP4.


Developers on HN still count. You know that Apple is at least losing some developers. Then there are people like me who are skipping this upgrade, waiting to see Apple's plans more clearly.

So the only question is whether they are winning over some who were on other systems. Objectively, the MBP has more or less stagnated performance-wise, it has regressed in battery life, and the new features it gained are at best neutral for developers. Developers are typically the slice of the population that will be affected by reviews, and the reviews are not favourable to this new MBP. Last but not least, the competition has not stagnated either. I don't believe there is a better laptop on the market for my usage pattern, but it took me much longer than before to come to that conclusion.

It is reasonable to think that the new MBP is less successful than the previous one with developers. It could still grow in raw numbers even with the developers, but it is reasonable to think its growth has been dampened a bit.

I'm not strongly disagreeing with you, but you are not making a better point than the parent. I do believe that if Apple has a spec-bump release of the MBP ready by the end of 2017 and is back on the regular 1-year cycle, the whinging will die down and you will win the argument.


I am not sure this was ever an issue on Linux, but I run a 3+ year old Arch installation with no problems here.


Most servers run Linux, which means many developers are bound to access them remotely, which makes the desktop OS a bit irrelevant. And many people may work faster with a Windows desktop (Win7 for me please, none of that tablet crap).


> Of course, I think this is exactly the sort of misprediction people on HN tend to make,

Says the guy who suggests Linux as a desktop.


Suggesting Linux as a viable desktop is not equivalent to predicting that $CURRENT_YEAR is the year of the Linux desktop. Being dismissive of the idea on those grounds is illogical.


Why would a developer be on a Mac in the first place? Cause ur a fanboy? Cause ur against monopolies?

If I need to build code that Apple says can only be built on a Mac, I simply spin up my Mac VM and point some tools at it; I don't need to drink any Kool-Aid to get the job done.

MDI + keyboard skills beat vim ninjas any day of the week.


OS X is very convenient to develop on. It's got the command-line power of Linux while being more user-friendly (personal opinion).


And now Windows has a literal Linux CLI.


Back when the first intel macs came out, they actually were made for developers and were great machines.


> they actually were made for developers

Apple has never, so far as I am aware, deliberately targeted developers as a primary audience of a laptop computer.

Apple's laptops became popular because they ran an operating system that was Unix under the hood, with a nice UI and good set of non-developer applications, which made them good as machines a developer could use for work and for non-work (Linux on the desktop being not so great at the second part of that). Apple has certainly been happy to make money from developers buying laptops as a result of this, but has consistently ignored the expressed wishes and desires of developers when updating its laptop lines, meaning Apple was popular in spite of Apple's indifference, rather than because Apple lovingly catered to this audience.

In other words: if you were looking for an executive to run around the stage screaming "DEVELOPERS! DEVELOPERS! DEVELOPERS!", I think that was probably some other company, and they got ridiculed for it...


Steve Jobs certainly understood the importance of developers:

https://youtu.be/4QrX047-v-s?t=459


"Understanding the importance of" is different from "develop product line with explicit goal of catering to".


I believe you are mistaken. Apple has certainly courted scientific and other developers before.

Here's an article about when they stopped targeting science.

https://www.macobserver.com/tmo/article/apple-drops-science-...


"Science" is not "developers". Apple wanted to sell them on a full stack all the way up to clusters of expensive Mac hardware for parallel computing. And the article you link even admits that companies which have targeted scientific users with hardware have mostly failed -- that market is more likely to buy either extremely specialized hardware or, more likely, commodity machines that they slap Linux on.


You started a platform and editor flamewar at the same time. Impressive!


Where did you get that Mac VM from and on which host OS are you running it? That should be a Mac, you know.


> Right now, I've decided to take the money I'd spend on the cheapest Macbook to buy a desktop system, plus a chromebook. I can have mobility and a lot of performance, for a fraction of the price.

This is EXACTLY what I did. I built my own PC (after more than a decade of not doing it) and bought an Acer R11. Couldn't be happier!


I did kinda the same thing, except I still keep my good ol' MacBook Air instead of a chromebook. Ubuntu on the desktop, I can run stuff on it remotely, and I get a lot more performance than if I were to spend the money on a new MacBook.


> developers next

Ehhhhh, as long as there's money to be had in iOS land, I'm not sure that's true.

Contrary to internet commenters complaining that it's impossible to launch new iOS apps, iOS shops still make a ton of money. ¯\_(ツ)_/¯


Most developers use Macs to write Java, Ruby, PHP etc.; these are the developers who would leave first, and that would already cause a lot of damage to the ecosystem.


Developers don't need to be on a Mac. They can run OSX inside a VM on any OS they want.


For sufficiently large firms, that is probably not true because the risk of getting sued for violating Apple's EULA is probably too high.

