While Apple is focusing on trying to create the thinnest notebook on every generation, other companies are actually making useful computers, laptops or otherwise.
Right now, I've decided to take the money I'd spend on the cheapest Macbook to buy a desktop system, plus a chromebook. I can have mobility and a lot of performance, for a fraction of the price.
Even Windows is becoming more viable again, ubuntu core and all.
The XPS 15 is another option. Even with a significantly larger battery it doesn't seem to have better stamina than the MBP. It's apparently been plagued by QA problems, and offers a choice only between a 1080p and power-hungry 4K (can't get high-DPI and 10 hour battery life in the same package).
I suspect, looking at the sales numbers, that you're wrong. Apple is giving up on a certain group of users (who need fast GPUs). They gave on those users years ago by shipping crappy OpenGL drivers. But they've correctly perceived that common users don't want ports, or expandability. They want a fast machine with a great screen and all-day battery life, and they want to hit those metrics in the smallest package possible.
By the end of the month I had gone slightly loopy (seriously - MacBook withdrawal is real), gave up and bought a 2015 rMBP. Personally I find macOS, and its deep integration with Apple's hardware, too intuitive and 'invisible' to the way I work to give up on it.
I use both Linux Mint and MacOS on a daily basis and I can't say that one really is better than the other. I'm sure it would be the same with Windows if not for the fact that I need some unix-y stuff.
I do have a use case that I would really like to get on my primary laptop that neither Mac nor Linux currently offer: Detach screen and use stylus to annotate pdfs. If I was using windows on a surface this would be a major part of my workflow, and I would find it impossible to switch back. Doesn't prove that Windows is superior. (Maybe I will end up dualbooting Windows and Linux on a Surface or Yoga or some such thing eventually).
I have been meaning to try the Windows Subsystem for Linux since they announced it, but Cmder (ConEmu) has been working so well for me that I have gotten too complacent to even try it. Cmder was the one thing that truly made the transition painless when I switched from Mac -- I thought I would hate not having the OSX Terminal. I don't do super advanced unix shell stuff, and it was just perfect.
If your needs are already being met, it's sometimes hard to make time to play with something that does the same.
It works pretty well for command-line use. If you need X at all, you're probably SOL.
I've been doing mostly C++ & Python development. I've run into a couple of issues around networking and a couple of UI issues. The UI issues have been fixed after I reported an issue on GitHub.
All in all, it's been a fairly great experience, and the developers have been very responsive to issues reported on GitHub.
After being on Mac for about a decade, it only took a week of using Windows full time before going back on the Mac felt foreign to me.
If I spent a week or so full time on the Mac, the reverse would probably happen.
These days I only care that my main tools work on whatever operating system I use (it's a great era for end users now that so many tools are cross platform). There are things about every OS'es main UI that bug me, so I don't find myself being loyal to any one particular OS.
IMHO, in their quest for simplicity, they have made a very good GUI for users, but hell for professional developers who DON'T only use the command line. One of my lifelong friends swears by his MBP, but he only uses the command line.
I think that depends on your view. IMO, Apple generally does make some of the best hardware in terms of quality and design, but there are exceptions.
The Mac Pro garbage can, for example. I don't know why they chose to make an art piece to replace the incredibly functional and expandable tower that existed before.
Another example is the early 2011 Macbook Pro 15" with discrete GPU, which had serious overheating problems that they just pretended didn't exist for more than three years.
I could go very far into detail here and list all of the extremely annoying limitations that I run into, but instead I'll respond to your vague complaints with my own. Apple quite obviously wants absolute control over their device and their software, whereas Microsoft lets me do whatever I want with my computers and my software.
I have to have a Mac to make iOS apps, but as soon as those are no longer a thing I'll toss all my Mac stuff straight into the garbage.
The hardware fit and finish is also second to none, and runs windows just fine if that's your thing.
The alternative on windows is a machine that spies on me, has horrible ui bugs and inconsistencies I run into constantly, and decides to auto update and reset all my privacy settings in the middle of the night while I am using the machine.
Not to mention it is often used with some awful trackpad. I haven't tried them all but I have never seen a non-mac trackpad I could go back to.
This exactly. I'm still baffled by claims that Windows 10 has a good desktop UI when I see its iconography, huge click (touch) targets, and wildly inconsistent use of whitespace. That's not even mentioning the forced updates and always-on telemetry. I'm not sure how one can say macOS is more of a walled garden than Windows at this point, at least macOS's security features will get out of your way if you ask nicely.
This almost reads as sarcasm. Some people really don't care about those things, and using them as examples of why you're baffled just highlights that disconnect between you and them. I don't use any bundled windows apps, and I'm rarely in the settings (and I just search for what setting I want), so iconography and whitespace design decisions in windows apps don't even factor into it for me.
Neither OS X nor Windows feel as comfortable as my customized FVWM config did, but windows gets a lot closer nowadays. I had to use OS X at work for a few years and it always grated.
> That's not even mentioning the forced updates
It's possible to disable them, you just have to put some effort into it (it requires regedit). I think this is the right decision. If you want to disable updates and you don't know how to change a registry setting, then for the good of us all, the answer is no.
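For reference, the registry setting alluded to here is presumably the standard Windows Update policy key; a sketch of the change as a .reg file (the key and value names follow the documented "Configure Automatic Updates" Group Policy mapping, but verify against your Windows edition before applying -- Home editions may ignore policy keys):

```reg
Windows Registry Editor Version 5.00

; Disables automatic updates via the Windows Update Group Policy key.
; Honored on Pro/Enterprise; double-check behavior on your build.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"NoAutoUpdate"=dword:00000001
```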
The tracking is a valid concern though.
I don't want updates when I am in the middle of something full screen like a game, forcing a restart of the machine on me. This is madness.
I don't want ads for office 365, Cortana or edge on my desktop. I don't want to learn how to block them. I don't want to use an OS that feels like it is being milked for all it's worth in its dying breaths.
Edit: Also, have you set your active hours? Windows allows you to define the times you use your system so it won't attempt to update during those times. Additionally, you can set a specific custom restart time for when it will restart.
I tried to. I play games for a few hours either at night or early am -- say 6pm-1am or 5am-9am are my possible slots. Unfortunately, windows update will not accommodate this -- you are only allowed to set one window with an 8 (maybe 12) hour max, and it must be consecutive. I had to dig around to find this, and it still is not sufficient. I ultimately solved my problem by using the registry editor to convince windows I was on a metered internet connection. Unfortunately, this broke update altogether. I turned it back and now it's still broken -- apparently the magic auto-updater is the only way to get updates -- there's no button I can click to just download the latest update and install it? (at this point I gave up)
I mean... that's just insane.
I was just in the windows update settings and there was a way at the top to check for updates right then. I didn't use it so I'm not sure if there's some other gotchas involved with that.
I have set active hours, but for some reason my windows partition - and not my ubuntu partition, so it isn't a hardware issue - does not reliably remember my time zone. It is often reset without rhyme or reason to some random default (I think NYC). I don't always notice and change the time zone when it boots up, because I have steam launch in big picture mode.
Also, why does it even need to ask for active hours by default? I am using the machine at full throttle. That is a really easy metric for "maybe wait until later". It's already logging everything I do and sending it to Microsoft; it would be nice to see some usability features come out of all that data.
Inconsistently working is a common theme of my experiences with windows. I am routinely baffled that I paid $100 for this experience, and I wish there was better Linux game support for AAA titles -- I throw my money that way whenever possible.
Some people run things all day long. My brother sometimes keeps the video game 7 days to die running all day at home while he's at work. Not updating when activity is detected is a good way to have it never update, and a good way to allow a virus to trigger a condition that may prevent automatic patches to holes it likes to use.
> I have set active hours but for some reason my windows partition - and not my ubunutu partition, so it isn't a hardware issue - does not reliably remember my time zone.
That's odd. Is it actually changing your time zone, or is it just off by a few hours? If it's just off by a few hours, my bet would be that it's a difference in how linux and windows set the system clock (one may prefer to keep the clock in UTC time, the other in the set time zone). If it's the actual time zone that's changing... I dunno, maybe some location service helper and a poorly mapped IP address? I haven't heard of that, but it does sound annoying.
It's definitely the time zone not persisting. I've navigated through seas of menus to change it, to no avail.
I agree it's a good feature to ship enabled by default. Grandma who leaves her computer on for years at a time needs to have her security updates up to date.
I have a 2015 MBP and used it for 1 year before switching it to Windows because I got fed up with the Apple dev environment.
It doesn't just run windows fine, it runs Windows with amazing speed compared to macOS. Everything feels (and is) snappier. I'm still left with a very expensive and under-powered machine.
PS. Installing Little Snitch, one becomes immediately aware that macOS does talk to base... a lot!
The point is that "equivalent software" runs faster. I would wager that if Final Cut Pro existed for Windows and you ran it on the very same MacBook, it would run faster on Windows. That said, Final Cut Pro is a bad example, because it was developed by Apple itself (and can therefore be assumed to be highly optimized for the OS) and it is not available for Windows.
It is more useful to compare an equivalent 3rd-party application (made by neither Apple nor Microsoft). I'm a developer, so I use a lot more software than just "apps" -- maybe if you try a wider set of software, the difference in platforms will become apparent.
Admittedly, I don't have numbers. But I don't need numbers, because you can feel the difference as a user. Install it and see for yourself!
Windows seems to be a combination of 90's era throwbacks and 2edgy4you Metro design. If they've managed to improve this stuff recently I would be interested in switching, since there's a huge tax on hardware specs with Apple computers.
Windows 10 feels like it just wants to get out of my way, which is really nice, since for me the gold standard is a customized FVWM config I refined over a decade to be minimalist and extremely usable for work.
If you haven't used Windows 10, it's probably worth at least a look (as long as you don't mind or have ways to mitigate the privacy concerns).
I use MacOS, Windows, and Linux every day, interchangeably. What I do is start my productivity application(s) and spend my time in those. I see the OS when I'm copying files or when I'm in a shell doing shit. Even in a shell, with Babun on Windows it's more or less the same experience across all three.
The only thing I want on all OSes that is only in MacOS is Preview. That thing is damn awesome. Everything else is invisible to me.
Not in my experience. My Windows 10 automation/unscrew-up script alone is like 5 kloc. And I bet 70% of that script could be replaced if I had real control of the system -- like dropping a .config file in some folder instead of having to find hidden settings with nonsensical names deep down the regedit hole. Another example: you have to use some stupid hacks to make sure there are no Flash DLLs on your PC. No matter what you do, they always come back in some security update.
You should see my ansible playbooks for our Windows Server systems, I don't mind PowerShell per-se but it takes a lot more effort to get anything done compared to my CentOS systems where I can template a config file and be done with it.
I really hope our vendors start supporting .Net Core soon, the SDK for our ECM software is the only reason we're still stuck on the full framework and having to manage a bunch of Windows VM's for our integration software...
I recommend you do your own script by choosing what you want from each type of script. I would release my script if I was sure it wouldn't break random people's computers, because IT WILL. I'm also running Windows 10 enterprise because I want as little telemetry and things shoved up my ass as possible.
Some Windows updates can change registry keys or disable certain policies. I monitor the commit log of other repos to know what I need to update, but they don't always cover everything. Feels like a lot of work but it's actually not.
Here's how I structured it:
- admin.ps1 calls admin-config.ps1 (policies, tweaks)
- user.ps1 calls a corresponding user-level config
Because if you're using a regular user account (like you should) you need to run 3 things:
- admin.ps1 as admin
- user.ps1 as admin
- user.ps1 as your regular user
I gave up on using runAs or any of the things recommended on stackoverflow; something always goes wrong, so it's easier to do it this way.
For a fresh install, I recommend that the first thing you do is update everything and let Windows install the 200 apps you don't want. Run the 3 things like I mentioned, reboot, run it again, reboot.
My installation is months old and it runs like new even after heavy usage, hardware changes, tons of apps and games installed/uninstalled (this kills Windows 7). Just be careful what you remove, don't ever install ccleaner or any shit. All you need is sysinternals tools.
I'm too lazy to proof-read/make this shorter, hope it helps somebody.
Whatever is left after bricking it, I suppose?
You do know that there were actual lawsuits against Microsoft over the Windows 10 auto upgrade right?
Even assuming that issue was overblown, I distinctly remember that Catbert-style perma-nag message appearing on each login asking me to upgrade. Microsoft doesn't even deem its users worthy of a simple "Don't bother me again" close window.
Let us conservatively say about 10% of the 300 million people who supposedly got the Windows 10 update didn't actually want it. That's about 30 million folks who would disagree with this notion that Microsoft "lets me do whatever I want with my computers and my software".
The best thing Microsoft and Dell have going for them in the laptop space... the 2016 MacBook Pro w/Touch Bar. It was a dud. The GPU issues were the final straw for me.
I thought that switching back to Windows would be a HUGE hassle. The Windows Subsystem for Linux took the pain out of it. That, and I no longer have to fight the we-don't-have-a-macOS-version of $APP issues.
If Apple doesn't care about their computers, why should their users? ¯\_(ツ)_/¯
I thought their sales report indicated it is their highest selling macbook to date?
My least favorite thing is I have to use the Registry to fix it
I'm a little conflicted about my next Apple purchase as well, but for me, the XPS 15 hasn't been the answer.
Just one experience amidst many, I know...
Given that everyone's got a different use case, every laptop, regardless of maker, has some fatal flaw to someone.
As someone who has a lot of trouble using any trackpad (including Apple's) that doesn't have physical buttons, should I view buttonless trackpads as a tradeoff or a flaw?
I don't care about the design tradeoffs made for esthetic reasons, I only care that I struggle with them and get stressed out by them. So to me, the lack of physical trackpad buttons on a laptop is a design flaw, even though it might be a completely sensible tradeoff for 99% of the population.
If some people view a design decision positively, then its a trade-off, not a flaw. For example, I hate moving parts on my computing devices (they break). So I'm a big fan of the non-clicking force touchpad.
What I'm talking about is flaws. Nobody prefers a laptop with coil whine to one without. Nobody prefers a cheap-o IPS display with an uneven backlight or bleed. Nobody prefers Synaptics drivers to Microsoft Precision drivers. Those are flaws, not trade-offs.
But like anything, you have to be sensitized to these things as a negative. Clearly, you've got plenty of things you're sensitized to.
Virtually none of my non-technical friends would even think to complain about uneven backlights, bleed or coil whine, simply because they haven't been sensitized to them.
Coil whine and a little backlight bleed have never been dealbreakers for me in choosing a laptop. All other things being equal, maybe they would be dealbreakers. With the wide variety of laptop options out there, the "all other things" never tends to be equal anyways.
You can make the argument that I have low standards, and maybe that's true. But I am not you and you are not me. Let's not assume that everyone is sensitized to the same things as you or has the same tolerances as you.
I'm running a 2014 and am so disappointed they soldered on the RAM, and their SSD is a weird format nobody else is using. I'm not planning on upgrading any time soon though.
I think the Apple/Mac pro market is really being abandoned in favor of the laptops.. but how much of their base can they piss off before it starts taking other segments with it?
That would depend on your criteria. For example, my MSI laptop blows my MBPs out of the water performance-wise (I have 2 MBPs btw, 2014 and 2016 -- so no Mac hating here), but it's about 4 times heavier :) but I don't have to haul it around too often.
If I had to haul a PC laptop around -- in an MBP-comparable form factor/weight -- I'd look at high-end Razer laptops. I'm also not a fan of glossy screens; Razer has a matte screen option available (so do MSIs, but again, they are BIG).
Compared to the 13" rMBP, you get a smaller display and almost two hours less battery life (and people are up in arms about the 13" rMBP being a regression on the battery life front compared to the previous model). It's not a replacement for the rMBP, just a less good, but cheaper, alternative.
Speaking of battery life -- not sure that once you get into a 6 hrs vs 8 hrs type of difference it's all that important.
I think the Alienware 13 is an ugly, heavy brick (in fairness, you can get uglier), but it basically ticks off everything I need on paper.
My max acceptable weight for a laptop - irrespective of size - is around 5.5 lbs, so it barely makes that, but once you get past the esthetics, it's a fantastic machine. It's available with a quad core CPU with multiple options for GPUs (including the 1050Ti which kills the battery less than the 1060), a nice array of current and future ports, user replaceable wifi, RAM and SSD, and the option for an OLED screen. Oh, and it's also got physical touchpad buttons, which are something I've been missing for a long, long time.
Because my last two laptops have been quad core, I struggle with the idea of replacing my laptop with an ultrathin laptop that has an ultrabook processor that would be slower than what I currently have.
Every time I think the XPS15 is the one for me, I look at the Alienware 13 and change my mind.
The one thing I really hate about the Alienware 13R3 is that it's a "gaming" laptop and the matching gaming bling, but that's a rant for another day.
Some people just need or want a beefy desktop built with upgradable components (in addition to a laptop).
Or, crazy idea here, they could licence their OS if they don't want to make pro gear anymore.
Maybe some are too young to remember 'Power Computing', the company that licensed MacOS in the 90s. That was juuuuust before Apple ran into major problems and had to file for bankruptcy. The first thing Jobs did when he came back was borrow 55 million dollars from Bill Gates and buy back all the licenses, shutting down production of all mac clones. One of the only true things Apple has is its reputation for quality. You want to bleed that out, you would do it by allowing clones, thus bringing about the destruction of the brand as a whole.
A clone program now would have a totally different dynamic, since Apple is incredibly successful rather than close to bankruptcy. I still don't know if it would make business sense for them, but for Mac users like me it would be absolutely terrific.
Apple never filed for bankruptcy protection.
From this source they borrowed 150 million from MS.
I think Apple makes high end computers for consumers now.
They just don't cater to special audiences like they did before, which is unfortunate for people who don't want computers that cater to consumers (i.e., content creation pros, developers, etc).
I can't blame Apple for following the money, but it is what it is, I guess.
Another one to chime in that I don't consider this crazy...any more.
It didn't work the first time around partly because the cloners had much smaller volume and could therefore always scoop Apple on the most profitable high-end gear, which didn't have component availability.
The licensing model was always a gamble, with the hope that the loss of margin would be offset by an increased base. The problem was that it was "bet the company" stakes at the time, and when it didn't quite work out as hoped they had to shut it down or shut down the company.
Now the Mac line in general and the pro line in particular is a sufficiently insignificant part of the overall business that they could afford the gamble.
Apple is a hardware company. What you're suggesting doesn't make business sense for Apple. They rely on being able to sell new hardware, so allowing upgrades wouldn't make sense. A laptop lasts for five years for most people; being able to slap in extra memory, a new graphics card or just a new SSD would "rob" Apple of a sale of a new computer for an additional two or three years.
Somebody who will sell a high-end desktop system with: dual-socket Xeons, gobs of ECC memory, lots of storage, multiple 16x PCIe slots, and a very limited selection of high-end graphics cards.
That would satisfy the power users, and wouldn't cost them anything.
Sure, they are losing customers, but compared to the hordes of students buying Apple laptops, or iPhone customers, it's not really an issue.
If they licensed Magsafe to a clone manufacturer they would definitely lose sales. So many Mac users love the Magsafe.
It's really too bad they discontinued it.
The latest problem I had was with a printer driver (Brother HL-1110; the driver doesn't seem to know that this printer has much less memory than advertised).
Where the Linux desktop is truly unparalleled is for a developer. As a developer I'm comfortable enough with the technical details to handle the quirks (like doing things from the command line), and the strengths of Linux truly shine (again, the commandline, as well as choices in DE, software, etc). One great example is Arch. On Arch I can install any developer tool directly from the command line with very little effort. Anywhere else and you have to click through endless menus.
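As a sketch of what that looks like in practice (the package names here are just illustrative examples, not anything specified above):

```shell
# Install a batch of dev tools from Arch's official repos in one shot.
# -S installs, --needed skips anything already installed.
sudo pacman -S --needed git base-devel python

# Search the same repos from the command line too.
pacman -Ss postgresql
```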
I think they should give people what they want - a high powered upgradable machine. If they don't want to make it upgradable, then they should at least refresh it annually.
Having content producers using macOS is a huge benefit to their brand and I think it's a big part of driving sales of their iOS devices. At the very least, if they lose the content producer market there will be significantly more friction to producing iOS apps.
Yes, there's a halo effect, but I don't think it's all that huge especially with respect to video pros and I bet it mostly works in the other direction - people like their iPhone so they are more likely to buy a macOS laptop or desktop. Nobody is buying a Mac Pro because they like their iPhone.
The driver for iOS development is profit and as long as iOS users are willing to pay more for apps, it will continue to dominate. And if you want to make an iOS app, you are going to buy a Mac, but probably not a Mac Pro.
The Mac Pro is like the iPod in that it exists in this weird space where it continues to be sold but is clearly not something that Apple thinks is important.
Personally, I think letting the trash can linger on for over 3 years is a pretty loud indicator that Apple's not exactly concerned with the line and that, internally at least, they're well aware of just how big a mistake the design was. Of course, it's possible that it's just a reflection of far larger problems with how the company views and supports its Mac engineers. Plus, the Mac Pro's manufacturing here in the US is problematic for the company. That little political ploy can't be easily reversed, especially with the anti-trade FUD that's so prevalent now.
If Apple were willing to acknowledge the problems, they could certainly design a restrictive licensing program that would allow third-parties to build Macs that target higher-end professionals without cannibalizing their consumer lines too much while avoiding some of the problems from the 90s. Strict design requirements for case aesthetics, limitations on software bundling, and other requirements could be used to limit third-party machines to high-end machines to avoid eating their consumer and laptop sales.
Not that I expect it to happen, or even be considered. But something has to change if Apple wants to keep their hold on creative professionals who are now either being forced to switch, doing extensive upgrades on their old Mac Pros to bide as much time as possible, or building Hackintoshes.
This has already worked like gangbusters for them for monitors: http://www.macworld.com/article/3169308/macs/apple-stores-st...
Microsoft beat Apple for a decade, then Apple beat Microsoft. Both companies are still around, making fortunes. Even Yahoo's still around.
I don't mind using macOS -- it's a pretty nice experience, definitely nicer than windows -- but when I buy an Apple machine, the thing I'm buying is definitely the hardware, and I'm not convinced the clones do a good enough job of matching that. Can I get 14 hours of battery life from the leading air-clone?
The event that changed everything was getting a 34" Ultrawide monitor; the Mac experience was just so much more solid and respectful of the screen space. I can't quite put my finger on it, but I felt more in control.
I have a Surface Book on my desk, but my Macbook Air is what I use daily. For me, Mac wins with:
- Predictable window management - everything stays where you put it
- Tidier fonts and app bevels so it doesn't waste screen space
- Access to apps like Sketch
- No UI delays when using Adobe Premiere, despite the less capable hardware - definitely an issue on Windows
- Much better command line support (not used the Windows Linux subsystem)
- Apps are easier to handle, and there's less bloatware
- Notes app that is actually helpful
- Fast with an SSD - absolutely rubbish without (my mac mini is unusable)
- I have gotten used to clicking the window first, before doing something in that app - so fewer accidental actions
But, I do still love Windows, it wins on the following:
- Better graphical support (ok, hardware specific)
- Windows Explorer is much better than Finder for me
- Window snapping is easier (but might be also frustrating, can't put my finger on it)
- I can open multiple calculator instances
- Wider app support, but less of an issue these days for what I use
Windows does have a lot more bugs, such as incorrectly scaled cursors in Premiere, or apps that are impossibly small or stupidly large when using a non-retina second monitor.
A developer friend I know used to always prefer the even older unix command line tool, dc. It operates with reverse Polish notation.
Now, I often just start the python repl (type 'python' at the command line) for quick calculations that might require slightly more power than bc.
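When even starting the repl feels like overkill, the same trick works as a shell one-liner, with Python's arbitrary-precision integers for free:

```shell
# bc-style quick math from the shell, but with Python's full numeric tower.
python3 -c 'print(2**100)'
# → 1267650600228229401496703205376
```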
To bring this back to the original topic, it is this easy access to the underlying unix tools that made me switch to Macs in the first place. Windows, IMHO, lost some points with the emphasis on touchscreen based UI and gained some points with the new Linux shell support.
In regard to both, I suppose that there is an opening to something more complex.
open -na Calculator  (-n launches a new instance even if the app is already running; -a specifies the application)
After having given Powershell a chance I don't want to go back to bash.
Pretty much anyone I know on a Mac uses iTerm, which is definitely more capable than the Windows 10 command prompt.
> And posix doesn't seem to hold a candle to Powershell.
In what specific ways? It seems way more intuitive to me and powerful enough for any task that it's not worth bringing a language like Python/Ruby/Go etc. for.
I love the iOS platform; it just works and has never failed me (I am leaving iTunes out of this). However, even with the scaling/DPI issues, I find Windows 10 fast and easy to use. I must be biased as I've used Windows more, but I thought I'd love macOS more than I do, having loved my iPhone & iPad experience. I do always have to put the taskbar at the top of the screen, but that I think goes back to my Amiga Workbench days!
Jeez is that an actual thing on OSX? That would just make it feel unresponsive to me.
If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.
Whoa. I had no idea!
Damn, is there any equivalent functionality (be it integrated or third party) for windows? I'd love to be able to run games in fullscreen exclusive mode without having them lose focus every time I want to look at a different chrome tab on my secondary display.
Kinda. Just for scrolling apparently.
Window management is vastly superior in my opinion; I frequently just full-screen everything and swipe between the full screens. This is particularly notable in my case since I almost exclusively use the keyboard (I get bad RSI from mice).
Updates feel like they have much less friction.
But basically it comes down to the underlying unix-y system.
I'm never going to do serious development on a laptop but if I pick up a macOS machine it has the unix core tools that I need to get stuff done, vim, ssh etc. (Though there are some pain points mostly they're fixed by brew for machines I'll use for any length of time).
Also, I helped someone set up Parallels the other day on their MBP and tried out air drop for the first time, show me friction-less 4GB ISO copying OTA between Windows laptops that aren't on the same network!
HN is a North American bubble, mostly Silicon Valley. Apple computers are a thing only in North America and parts of Europe.
Take a look at:
It's a strange problem, because there is plenty of demand for it, and I think we all appreciate the outsized role that GPUs play these days; to not offer support for 50-80% of the GPU market seems like a rather poor strategic decision. Particularly since you're really only talking about a team of 30 or 50 within Apple to help Nvidia, the drawbacks are minimal.
This has been an ongoing problem since the summer. Some have reverted back to using several 9xx cards (which have spiked in price) while others have switched platforms. Lacking any real progress on this, I would suspect many in this situation would abandon OSX permanently by the end of the year. And if you give up OSX on your desktop, the incentive to stay in that environment on your laptop, tablet, and phone go way down.
This is a serious problem and the only outcomes are either a) Nvidia GPUs are supported, or b) OSX is abandoned, because the simple fact is that Nvidia GPUs are more important long-term than the entire sum of Apple's hardware; I can replace a tablet or desktop or laptop, but I can't replace a Pascal TITAN X.
This fits the "operations guy running the company" narrative.
I've purchased the second generation (new) Macbook this year, and must say, I'm delighted. I honestly don't get what all the fuss is about. It just works.
Why would I change something that works really well with something that might work well?
Many of us developers don't care one split second if the OS we are targeting has any kind of UNIX support.
Mac is still a good platform for developers writing OS X, iOS, tvOS and watchOS applications.
It's the overall experience.
- IBM i
- IBM z/OS
- PS 3 and 4 OS
- WiiU OS
- XBox 360 and ONE OS
- RTXC Quadros RTOS
Anyone targeting those platforms is also a developer, and they aren't UNIX-based OSes.
At least in the PS4 case, it's based on FreeBSD, but that doesn't matter because you do not develop directly on it, but use a dev kit/SDK.
> WiiU OS & XBox 360 and ONE OS
You can't develop directly on these. You use Windows-based dev kits with custom SDKs mostly.
I am not even going to continue, since you clearly didn't want to understand OP's or my arguments, which were that Linux/UNIX is usually the best dev platform. Even if you target embedded, you actually develop on UNIX. There are exceptions for highly proprietary platforms of course, but that's not what most people on HN tend to develop for (i.e. it's mostly web dev, mobile dev and such).
You have a point but I think it's undeniable that a significant share of developers wants a UNIX environment. Me included.
I don't deny there are developers that care about UNIX, and even I do care, occasionally.
What I don't accept is that for whatever reason now to be a developer one has to breathe UNIX, as if there wasn't anything else a developer might be.
This is yet another reason why sometimes I wish Apple had bought Be instead.
Unless we're talking game dev, real-time 3D graphics and such - then Windows is IMO unparalleled - I have not seen comparable tooling/driver support on other platforms - although I have not tried using Metal.
UNIX architecture is the "Worse is better" from OS architectures and POSIX is irrelevant when one uses programming languages with rich runtimes.
You're thinking on different abstraction layers - sure given a library for runtime X a REPL will be better than CLI, but when you're talking about processes then having functionality exposed via CLI vs Windows way of monolithic apps with GUI only - it's a huge difference for automation/testing/workarounds/re-usability/etc. - and you don't really care what the underlying platform/runtime is.
No need for UNIX CLI for it.
My point is that Windows has a culture of black-box monolithic apps that expose functionality through a GUI (if not exclusively through a GUI). Nothing stops you from exposing stuff like cmdlets for PowerShell and composing them to build higher-level systems, but people don't really do this. They do in UNIX systems, so things end up being more transparent.
Nothing to do with any shell or anything really - more about UNIX philosophy of small focused apps composed together vs monolithic apps.
When you're developing stuff, the UNIX approach is IMO way preferable.
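The composition being argued for here fits in a few lines. A minimal sketch (the sample records and field layout are invented for illustration): each stage is a small single-purpose tool, and the pipeline is the "higher level system".

```shell
# UNIX-style composition: each tool does one narrow job, pipes glue them.
# Count which login shells appear most often in passwd-style records.
printf 'alice:/bin/bash\nbob:/bin/zsh\ncarol:/bin/bash\n' |
  cut -d: -f2 |   # keep only the shell field
  sort |          # group identical lines together
  uniq -c |       # count each group
  sort -rn        # most frequent first
```

Any stage can be swapped (e.g. `awk -F: '{print $2}'` in place of the `cut`) without disturbing the rest, which is exactly the transparency and re-usability the GUI-monolith model gives up.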
UNIX is also full of black-box monolithic apps with GUIs; there is a UNIX world out there beyond the GNU/Linux and *BSD FOSS culture.
(btw most video game development is in C/C++)
So what? It surely isn't done in GNU/Linux or UNIX.
I also still don't see HN's bias against using Windows as a dev environment, especially when a lot of people on HN create stuff on .Net. Sure a lot of people like nix and Apple here, but it doesn't mean they look down on people who develop on Windows just because they don't want to do it themselves.
I was free in choosing my tech stack but I couldn't choose my OS, even Windows 10... which means I don't have access to Docker. I couldn't use a VM either because the app needed to run on Windows.
What I did is, buy an XPS 13 preinstalled with Ubuntu 16.04.
It is the best of both worlds: support for the latest hardware and a trusted Unix-like system.
FWIW, I have found the new MacBook Pros fairly incredible. They are thinner and lighter, which is quite valuable for people who use notebooks on the go, while still being more powerful than their predecessors.
If I had one complaint, it's that they didn't keep around the old 15-inch chassis for a model with more RAM and a bigger battery. Instead they just have one model and aim it at a very specific niche.
But overall, they are very good notebooks. You can power multiple 5K displays from them, for God's sake.
The keyboard shortcuts, the UI inconsistencies, and (on Linux) the lack of native apps (specifically Evernote) just didn't cut it for me. As trendy as it is to hate on Apple right now, macOS is far and away the best operating system for developers and power users. It's not even close.
Having a single main modifier key (⌘) on Apple is SO much faster and more productive than switching between Alt and Ctrl modifiers for switching windows, copy/pasting, opening a new window/tab, etc. That switching just kills my workflow.
I love my PC build, and Arch Linux is pretty good. But I'm currently scoping Craigslist for used 5K iMacs. The productivity loss just isn't worth it.
I mean, I knew what I was getting into; I just hoped Apple wouldn't neglect the people who made its platform desirable for the masses.
I hope this is just a wave serving the 'Starbucks pros', and next time we get a decent update.
Why would developers switch from mac to windows? If they're alienated by what Apple is doing, they're unlikely to be happy with Microsoft. Linux, sure.
Of course, I think this is exactly the sort of misprediction people on HN tend to make, where it turns out that (surprise) Apple actually knows more about market demands than random internet commentators. Remember how we were predicting that the new Touch Bar MacBook Pro would fail hard, but it ended up being the best selling MBP ever?
As for macOS looks: Microsoft is also releasing an update to Windows' visuals this year, and it looks pretty darn aesthetically pleasing.
I'm a UNIX developer (mainly Go), and I use a surface pro 4 with win 10 for development. Bash on Ubuntu on Windows already runs great, and I have the ability to sketch diagrams whenever I want to. Additionally I have docker running flawlessly and a system that just works 99.9% of the time. (as opposed to Linux where I have to hack my way through most of the things)
With early Windows 10, the Settings app didn't even have a title bar so there was no clear demarcation from the window grey to the titlebar grey - how would I know that the grey bar at the top was the way I moved the window but the grey bit beneath the grey bit at the top DIDN'T move the window??
With current Windows 10, the Settings app finally has a title bar that has a colour like all other applications. But the paradigm we learned 20+ years ago (double click top left to close the window) and the ability to get the window menu HAS GONE. I can still press Ctrl-space and get the window menu but the ability to get the window menu from a Metro app or a metro-style app using the mouse has completely gone.
So now I have to learn another method of interacting with the basic building block of the system - windows. When each window behaves differently according to the "metro or not" measuring stick, this does not make for a great user experience. And this is me coming from using Windows since the 90s. How do you think new users and my mum are going to get on??
What's the point of having window guidelines if Microsoft themselves don't even follow them??
So although the screenshot is aesthetically pleasing, I remain very concerned about the direction usability is taking on the platform.
Somehow, I'm not bothered about ignoring learned paradigms from so many years ago. For example, my 12 year old daughter could care less how you learned window management 24 years ago. She's learned to use Windows as it is today.
This expression implies that you currently have a high level of care about a subject, and that you have a range or distance to travel until you reach rock-bottom/zero regarding your level of care for the item.
In fact, if you were to care 80% or 100% about an item, the expression would still be true: eg. "I could care less about my health" which would mean I care GREATLY about my health and therefore have range to drop my level of care from its current position.
The expression that everyone should use is "I couldn't care less", or "I could NOT care less".
This implies that you are at 0% of your care about an item, thereby having an inability to move below your current level of care. It is already at rock bottom. It cannot get any less. It cannot go any lower.
Subsequently, I am assuming that you meant that your daughter COULD NOT care less about how I learned window management. i.e., she wouldn't care in the slightest.
I had to ask because I am hearing this expression more and more and it's incredibly irritating because it is wrong.
I never hear any native speakers here in the UK saying it, but judging how manners of speech drift over here I imagine it is only a matter of time before I hear people saying it, much the same way I now hear the word "like" sprinkled into sentences needlessly. When I hear this expression over here I think I'll suffer an apoplexy and drop down dead.
EDIT: By the way, I don't think Apple's changes have been courageous. I find that they stuck their head in the sand for too long and finally did sensible things. For example, on Snow Leopard and prior, you COULD NOT resize the window from any corner but the bottom right. This was officially STUPID. Thankfully they realised how stupid this was and enabled the ability to resize a window from any edge. Only took them 30 years.
Also in recent versions they have been making a big fuss about the ability to go full screen with applications. Windows have always been resizable, so the ability to get a window fullscreen and not see the menubar is not really a big feature. Nor is the ability to run two fullscreen apps side by side - surely we have been able to put windows side by side since the 1980s? Just because the menubar has gone does not make it a feature. This is not courageous, it is dumb.
They have also made regressions. For example, now when you want to use Mission Control you must move to the top of the screen to see the desktops. Hang on - just tried it again and now it works again. Hurray!
Their new filesystem is still significantly behind NTFS in features. Sad to think we're all still hobbling along on HFS+, byte-swapping big-endian metadata to little-endian on every metadata read/write.
I am not saying that Apple does things any better with software development, but the feel across the OS is generally consistent. This is not the case on Windows 10. E.g. how do you find out which build version you are on? Press Windows key + Break to see system properties... oh, it's not in there. I suppose I need to run the Settings app > System > About to display a duplicate window. How do I join a domain? I could use the Settings app and it'll mention joining a school. I could also use the duplicate domain settings from Advanced System Properties and join there, but there's no mention of the word "school". Why do the two sets of windows look different? Why are there even two sets of windows? What the heck is going on?
So now I have Control Panel, the Settings app, Microsoft Management Console and its snap-ins. They all do the same thing. Are the teams not talking to each other at Microsoft? Is nobody at the helm?
For extra fun, look at shell32.dll and see how many icon styles are inside that thing. That's not even an architectural change - that's just pictures in a single DLL, owned and controlled 100% exclusively by Microsoft. All of the icons are from different eras. If they can't get even little pictures to be consistent and uniform, I do not hold much hope of uniform system behaviour.
So you can see the cobbled-together nature that Windows 10 exudes, despite a fantastic opportunity to make the entire system coherent again. And I say this as a C++ developer on Windows as my day job.
They are moving slowly, but IMO in the right direction.
I just ran Win10 for the first time yesterday to work on a security guideline and I was blown away by how janky the whole thing was. Using it is like crawling through 20 years of computer history; there's things from every phase of Windows since '95 in that UI, and I don't mean UI hints, but entire applications.
I bet somewhere in there there's still UI that dates back to Win3.1.
>> I bet somewhere in there there's still UI that dates back to Win3.1.
I'm sure there is, that's a mix of backwards compatibility and don't mess with stuff that works. Some people want it, some people don't; can't make everybody happy. Can I complain that OS X provides a terminal that looks like it dates back to the 70's?
To be fair, a lot of Terminal.app dates back to ancient Mac OS too. The menu bars, for example, are all still Carbon. There is still a full classic Mac OS main loop running alongside the Cocoa main loop in every macOS application. Apple's done a good job hiding it, but it's still there.
Anyway, cmd.exe is a bit of a special case. What you're really seeing is conhost.exe, which is kept the way it is because it's part of CSRSS, which occupies a similar place as PID 1 does, and Microsoft doesn't want to bring new UI into it for fear of increasing its dependencies.
The ugly UI isn't even the worst part about the Mac OS though. The worst part is that it just doesn't even come close to offering the same sort of freedom that you get on Windows where Microsoft leaves hooks in to let developers actually do what they want.
Most of the problems with the Mac OS are by design too. I think it's hilarious that Apple folks think it's a really good idea to hide the label on most buttons. I guess you have to "just know what it is" before clicking it, or hover over it and hope for a tooltip to pop up and tell you what the thing will do. Real efficient.
There's really no wonder in my mind why most people don't use Apple anything.
For me, "insecure Windows" is an old trope and not particularly accurate any more.
Now that Windows has nice workspaces and an already usable UNIX environment (the Windows Subsystem still has things to iron out, but it's very nice for a beta), my personal ranking became:
Linux > Windows > MacOS
I see nothing that MacOS offers that Windows doesn't, whereas Windows has potential for much better hardware specs, games, bigger user base.
The only thing left for Apple is their 16:10 screen and their per-monitor scaling.
I do miss a good terminal in Windows, but I assume the Linux subsystem will fill that hole.
- apt-get install php
- php -S localhost:8081
- Launch Chrome on Windows, go to localhost:8081 (it works)
There is zero awkwardness or extra configuration there. Only problem is that some things do not work properly yet (because still beta). For example `go get github.com/mattn/go-sqlite3` will not work because of some ?gcc? compilation issue. But it's getting fixed on next release.
 - https://en.wikipedia.org/wiki/Criticism_of_Microsoft
 - https://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish
end this FUD
Best-selling to developers? Not a chance.
I find it hard to see any realistic alternatives to MacBooks for software development. They give you all the tools you need from the Windows world that are not available on Linux, like Office, Photoshop, etc., and it is still a UNIX system which can do all the stuff that Linux can without kernel panics, crashes, bugs and other crap. Also, unlike Windows or Linux, a single macOS installation can last pretty much forever and doesn't need to be reinstalled every year because of lost performance, like Windows. And last but not least, MacBooks are well optimised for the hardware they run on, which gives them 1.5-2.5 times the performance of the same hardware on Windows or Linux.
The whole thing about complaining about new MacBooks is same thing as with iPhones. People complain because it's trendy to whine about anything Apple does even if they would create eternal world peace and still everybody will buy them because there is nothing better or even similar on the market.
You need to update your arguments. It's 2017, not 1998. I haven't reinstalled Windows on my desktop system in years and it's running perfectly fine.
Windows hasn't been less stable than a mac for years if you buy equivalent hardware (instead of scrapping together a system with junkyard parts or getting a cheap walmart laptop).
Windows does not need to be reinstalled every year if you don't treat it stupidly. My current work Windows install is 4 years old, has seen two machines (disk swap) and two major Windows versions (in-place upgrade), and it runs as well as when I first used it.
OS X is only faster than windows on mac hardware because apple ships bad windows drivers on bootcamp deliberately. On equivalent hardware from other vendors it is just as fast.
It sounds to me like you don't actually know windows all that well. I use a mac mini at home and a windows laptop at work, and I'd say that they're pretty much equivalent for my purposes.
1) A very large portion of other devs I know use(d) MacBooks, and
2) A very large portion of them said "F this, I'm out" when Apple released their most recent machine. Many switched to Surfaces.
Your hyperbole against Linux and Windows does your point no credit. Linux has been rock-stable for a very, very long time, and Windows 10 is a pleasure to use, particularly on touch-enabled devices. MS went through severe growing pains with Win8, but those days are past. And the OS/hardware combinations that have emerged since make Apple's offerings feel old and out of touch.
This is not to disparage the kernel developers, package maintainers or the distributions themselves - there's a lot of people doing excellent work without being compensated at all. However let's not kid ourselves - linux can still be tough.
Unless you buy something from System76 or do very meticulous research on hardware support and build your own, Linux is not going to be seamless out of the box. But the real highlight is that it can be seamless at all these days.
My PC has four GPUs, 128GB of RAM and a deca-core CPU. I have four monitors, a bluetooth mouse, and I almost never turn it off. Instead, it automatically suspends. When I'm away from home I can ssh into my computer if needed. I have daily backups that are essentially seamless.
Now, it was difficult to get to this place. I would say that Linux is, and has for a long time been, rock solid if you're not using a GUI. In my opinion, the single greatest impediment to "Linux on the desktop" is xorg, which is terrible (to put it mildly). Nouveau is absolutely crap, especially since Nvidia puts out its own drivers. But my point here is that after a few days of careful installation and setup work, I have had a seamless desktop experience for something like eight months.
What needs to happen for Linux is a software company that takes something like Debian, forks it, and dramatically brings it up to wider compatibility with modern hardware across the market. In some sense that's Ubuntu, and it shows in how easy my daily experience has been.
That being said, I only use Linux for server-side work, and I can easily say that it is every bit as stable as any other server OS I've used, and has been for a very long time. There are plenty of weedy areas to play in if you want to, of course. But if your goal is It Just Works, you can do that, too. Perhaps most importantly, you can do it easily, with just a little foresight during the spec stage of planning your project.
They switched to Surfaces because they wanted power-user hardware?
To pick one example: being able to flick through long documents with a finger while keeping focus on other parts of the screen alone has been a huge productivity booster, and I do not say that lightly.
Like I said before, I was firmly in the "touchscreens are a waste of time for me" camp until I actually started using my Surface. Not trying to sound like an ad, it just has been my actual experience.
Most of my "work" time I spend in a terminal: white text on a black background, fingers on the home row. I cannot see a scenario where a touchscreen makes me more productive except in some edge cases, and even then it doesn't gain over what I already have now. But that's just me. Each to their own; that is the only fair thing to say.
But making assumptions and claims based on your own point of view is not the best thing to do. With Apple especially, it is always the same story: customers leaving and rarely satisfied (there was a period like that around Snow Leopard, IIRC 2009-2010), but still I can't find anything better than a MacBook. Dell and Lenovo came close, but when I install Linux on them I make a few more sacrifices; summing everything up, Apple just works for me. (I won't even mention Windows. 100% not my cup of tea.)
So the only question is whether they are winning over some users that were on other systems. Objectively, the MBP has more or less stagnated from a performance point of view, it has regressed in battery life, and the new features it gained are at best neutral for developers. Developers are typically the slice of the population that will be affected by reviews, and the reviews are not favourable to this new MBP. Last but not least, the competition has not stagnated either. I don't believe there is a better laptop on the market for my usage pattern, but it took me much longer before I came to that conclusion.
It is reasonable to think that the new MBP is less successful than the previous one with developers. It could still grow in raw numbers even with the developers, but it is reasonable to think its growth has been dampened a bit.
I'm not strongly disagreeing with you, but you are not making a better point than parent. I do believe that if Apple has a spec bump release of the MBP ready end of 2017 and is back in the regular 1 year cycle the whinging will die and you will win the argument.
Says the guy who suggests Linux as a desktop.
If I need to build code that Apple says can only be built on a Mac, I simply spin up my Mac VM and point some tools at it; I don't need to drink any kool-aid to get the job done.
MDI + keyboard skills beat vim ninjas any day of the week.
Apple has never, so far as I am aware, deliberately targeted developers as a primary audience of a laptop computer.
Apple's laptops became popular because they ran an operating system that was Unix under the hood, with a nice UI and good set of non-developer applications, which made them good as machines a developer could use for work and for non-work (Linux on the desktop being not so great at the second part of that). Apple has certainly been happy to make money from developers buying laptops as a result of this, but has consistently ignored the expressed wishes and desires of developers when updating its laptop lines, meaning Apple was popular in spite of Apple's indifference, rather than because Apple lovingly catered to this audience.
In other words: if you were looking for an executive to run around the stage screaming "DEVELOPERS! DEVELOPERS! DEVELOPERS!", I think that was probably some other company, and they got ridiculed for it...
Here's an article about when they stopped targeting science.
This is EXACTLY what I did. I built my own PC (after more than a decade of not doing it) and bought an Acer R11. Couldn't be happier!
Ehhhhh, as long as there's money to be had in iOS land, I'm not sure that's true.
Contrary to internet commenters complaining that it's impossible to launch new iOS apps, iOS shops still make a ton of money. ¯\_(ツ)_/¯