When I switched in 2005 I had to convince friends I wasn't crazy. I advocated for the platform. I suffered weird incompatibilities. I dealt with bad updates. I found workarounds for non-multi-platform tools I had to run. But look how stable, how much less restrictive the platform was, I said! And it was!
The platform I switched to in 2005 wasn't Linux. It was OS X.
In 10 years the world has changed and today if you are in most of the "creative" professions, not using a Mac is the exception. You can today make a lot of arguments against desktop linux, yet for the past several years I've run it exclusively on all my main systems (laptops and desktops, work and home).
10 years ago, plenty of Windows users couldn't imagine a world in which Macs were where they are today. I mean couldn't imagine. The way Mac enthusiasts can't imagine desktop linux today. The parallels aren't exact. They don't need to be. I am neither predicting success or failure. I am simply pointing out that imagination fails and it is a mistake to say "that will probably never change."
And those machines were terrible. Teradici thin clients streaming Red Hat from a closet somewhere. The latency on those things was significant, and the fonts unreadable. The hardware was really buggy, the monitors were really low resolution, and there was some irony in typing on a broken Dell keyboard in the Steve Jobs building. The crappiness of having to sit in front of those machines for 10 hours a day was a significant part of my decision when I left Pixar.
How so? I was working with animation tools developed entirely internally -- last I checked, C++ and Qt work fine on OS X.
Would anyone have ditched IRIX to run Linux on their MIPS kit? Of course not, because it was never about the OS in this space; it was about FLOPS/$.
What exactly did Sun do wrong with their x86 and x86_64 versions of Solaris that you call "so short-sighted"?
What actually happened was that x86 hardware eclipsed SPARC hardware... And organisations that would have been happy to refresh the hardware and keep the same OS and just recompile their apps found that they couldn't, so they jumped to Linux instead... And the rest, along with Sun itself, is history. I was working for a big Sun customer during this transition, we begged them to make Solaris x86 a first-class citizen, and we had a lot of support within Sun too but the SPARC Mafia won the day and with regret, our next big order was (IIRC) with Dell and SUSE...
Granted, with Linux you will often have to do a bit of setup initially, but after that you don't need to touch the innards of the OS again if you don't want to.
I don't think MacOS is a superior OS compared to Linux anymore. However, MacOS does have an ecosystem of some very good apps that have no equivalents on Linux. Hopefully that too will change.
I only really miss Adobe, but I was a heavy user of it. VMs/Wine aren't the same.
what about battery usage?
what about heat?
I don't think having a Mac in '05 made you too much of an outlier --- at least, not if you worked in technology. By 2005, Apple had already started the switch to Intel!
This would be nitpicking, except that the timeline matters here. It's not a 10 year gap, just by your own numbers. But it's not 12 years, either. It's more like 16.
Let's also remember that Linux has been an unreasonable mass-market desktop choice for a lot longer than 16 years. "The year of Linux on the desktop" is a nerd joke for a reason.
My much more powerful Macbook Pro provided to me from my job sits and collects dust as I use something that I actually feel more productive in.
By the way, I am a software engineer. People use software to reach their goals and if they choose Mac, Windows, Linux or BSD or whatever, God bless them.
I'm way more productive on my Linux Mint Cinnamon setup than I am on anything else. And I can make it look beautiful with 5 minutes of tweaking.
I keep an OS X partition on my MacBook Pro, just for the off chance that I might want to do iOS development at some point.
My theory is that the Mac Pro hasn't seen an update because Apple knows that its current thermal design is a lemon, and they don't really want to sell any more of these because the replacement rate is so high.
Also, I think Apple will have to come out with a new display to go with the Mac Pro.
I must be missing something. If a new Mac Pro ever came out, wouldn't it use USB-C and be fine on the new LG displays Apple sells (price dropped, by the way)?
MacPro "SubZero Edition". :)
Depending on which pro needs we're talking about, I don't believe this is true anymore.
I was a Linux user for many years, then an OS X user for over a decade, and earlier this year I'd had enough with being annoyed by OS X & switched back to Linux. I am extremely happy I did.
If you want beautiful GUIs or do video/audio/graphics editing then, yes, macOS is still superior. If you are a developer who spends most of your time split between the terminal and the browser, then Linux is not just an acceptable substitute, but can in fact be a superior replacement.
Adobe porting their Creative Suite to Linux and maybe even releasing their own branded AdobeOS Linux distribution would seem to be a massive existential threat to Apple and Microsoft both, and would give Adobe a possible "out" to the world they live in right now, where Apple tries to do their best to keep Adobe "in check". Seriously: every single one of these threads ends up in the holding pattern of "we could all be using Linux tomorrow if we had [software which is almost entirely controlled by Adobe and which is already designed by them in a way where the UI seems to be an in-house toolkit and which would be trivial for them to port to Linux]"; that would effectively just leave "office productivity" as the only class of software where Microsoft (and to a lesser but still noticeable extent, Apple) would be able to hold people on their platform (and OMG: a future where the next Adobe CS release was a word processor... that would be brutal).
And if my aunty had bollocks she'd be my uncle.
It feels like when I first moved from Windows to OS X. Makes sense, since back then OS X felt minimal compared to Windows, and a bare tiling manager over Linux is the ultimate minimalism.
I used dwm at first when I switched back to Linux, but found it still doesn't handle multi-head very well, so now I use XMonad (with dmenu), and have been very content with it.
I still have a MacBook Air lying around that I use when I'm on the road. The amount of visual clutter, annoying UI animations, the nagging from notifications and alerts surprises me every time. I didn't notice it before I started using my current XMonad setup. So either Apple is making it worse or you just don't see it when you don't have a clean and minimal system as a comparison.
I guess we are also missing professional photo editing to match Photoshop (Krita isn't trying to be one, and Gimp is kind of a mess) and 2D animation (in a world of Toon Boom and Flash, Synfig is a poor substitute). FreeCAD is also not a very effective substitute for CAD workstations, but to my knowledge those users never left Windows for OSX in the first place.
Ardour is only half the equation for audio production on Linux, even if it didn't have serious shortcomings. Professional users need an extensive suite of plugins, the vast majority of which are proprietary and have no good Linux alternative. There are no good pitch-correction plugins for Linux, no Kontakt-compatible sampler, no mastering suite, no restoration and repair suite, no good metering tools.
I've not seen sleep issues on MacOS or Windows for over ten years. You close the lid, it sleeps; you open the lid, it wakes. Seems reliable.
The user wasn't able to connect to our guest 802.1X (EAP, 802.11n) wireless network using their Ubuntu laptop. MacOS clients and Windows clients have no issue (it is a guest network, non-employees use it often).
That was a pain to set up, and I switched pretty quickly to their alternative network that permitted whitelisted MAC addresses.
Also, I've noticed that recently, when I connect to certain guest networks at office buildings and libraries and so forth, there is a "login" page where you consent to not look at porn or whatever on their connection that you are supposed to be automatically redirected to when you first connect and try to load a page.
Sometimes the redirect doesn't work, and you are forbidden from loading pages until you've clicked the button on the page it failed to send you to. Going to the gateway address (192.168.1.1 or some variant) usually finds it.
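For what it's worth, that login page almost always lives at the network's default gateway, so you can look it up instead of guessing. A minimal shell sketch, assuming a Linux machine with iproute2 (the exact route-output format is an assumption):

```shell
# Print the default gateway, which is usually where the captive-portal
# login page is served, then build a URL you can paste into a browser.
gw=$(ip route show default | awk '/^default/ {print $3; exit}')
echo "http://$gw/"
```

On a typical home or office network this prints something like http://192.168.1.1/, which you can then open manually if the automatic redirect never fires.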
But for the most part! I have not had any issues with normal "type the password into the WPA network" or "associate with the unsecured network". See my other comment in this chain: https://news.ycombinator.com/item?id=12881305
And in the worst case, I always have my phone with me, rooted with unlimited 4G and a USB cable to tether it with.
These kinds of things are also part of why I switched to OS X in 2003-ish, so I understand the hesitation.
In my experience, Linux is perfectly fine nowadays for these things. I've been traveling a bit recently (and so have been using a lot more random WiFi access points, tethering with my phone, and relying on battery for power), and everything's pretty much just worked for me.
My company's development team is just about 50/50 Linux/Mac now, and I haven't seen any real difference between the two sides with issues for WiFi. In fact, I've probably seen more difficulty on the Mac side recently: I feel like the last few versions of OS X have had the habit of randomly dropping WiFi connections sometimes. One of my friends has an MBP that will drop a connection & refuse to reconnect until restarted (or at least he hasn't yet found a solution that doesn't involve restarting).
It's possible a lot of the issues people report might be down to hardware/distribution choices: not all hardware support is equal, and not all Linux distributions are equal. FWIW I use Arch Linux on a ThinkPad.
That all might sound a little "works on my machine," but I'm not trying to say that the contrary stories of people having a terrible time aren't valid. What I am saying is that many Linux users have perfectly stable and pleasant experiences, and many Mac/Windows users don't, so ultimately YMMV regardless of what setup you choose.
The flip side is that for the work I do nowadays, I usually use a Linux VM on something like DigitalOcean. At one point I was using a 2011 11" MacBook Air for dev work.
If you have Nvidia or AMD GPUs, or Wi-Fi chips from certain vendors (cough, Broadcom, cough), then there can be issues due to drivers, firmware, binary blobs, etc. They manifest as suspend/resume issues, wifi flakiness, and so on.
So yes, you can avoid most of this by selecting component vendors carefully, but not 100% of systems will be robust.
What? I'm running Arch with i3 right now, and networkmanager (with nm-applet for my tray) is doing just fine at finding and associating with networks. `systemctl suspend` suspends my system as desired. I'm sure if I spent some time looking, I could bind it to my power button or laptop lid.
If your complaint is that users shouldn't have to use the command line to suspend their device: I've used Ubuntu versions since 12.04 and they all have this sort of thing built in and intuitively accessible.
Were you referring to something besides network and power management?
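For anyone wanting the non-desktop-environment version of this, here is a minimal sketch, assuming systemd-logind (as on Arch or any recent Ubuntu); the values shown are illustrative choices, not the only sensible ones:

```shell
# Suspend on demand from any shell:
systemctl suspend

# To have the lid switch and power button trigger suspend with no
# desktop environment involved, set these keys under the [Login]
# section of /etc/systemd/logind.conf:
#   HandleLidSwitch=suspend
#   HandlePowerKey=suspend
# then reload logind to pick up the change:
sudo systemctl restart systemd-logind
```

With that in place, a bare window manager like i3 or dwm gets the same lid-close behavior people expect from a full desktop environment.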
Suspend works fine.
(IIRC suspend issues are mostly due to proprietary drivers, which basically means nVidia, and to a lesser extent AMD; AMD's open source drivers work well.)
That was why I was asking if the state of things for Linux has improved now that it is 2016. Thanks for sharing.
My company is a heavy Docker user, though: dealing with docker-machine & running Docker via a VM on OS X was one of the final straws that caused me to switch. Given that FreeBSD's Docker support is pretty recent (and was still considered beta when I was deciding to switch), and OpenBSD doesn't have any native support at all as far as I know, I felt that the probability of trading one annoyance for a similar one was too high, so Linux was the right choice for me.
So, if you have everything already on macOS (I still hate that name change), I can get not wanting to switch. But what on Earth is it doing so much better than Windows that at minimum a decade would be required for it to catch up?
* Constantly changing APIs based on Microsoft's internal politics vs. the very stable Cocoa
* HiDPI that always works or has good fallbacks
* Touchpad / gesture support that is actually pleasant and precise to use
* Battery life
* POSIX scripting (that one is slowly coming to windows now)
* Can be restored from another computer in a matter of minutes (see also the first point)
* PDF support baked into OS
Really? I can run Windows apps I bought 10 years ago easily, whereas many Mac apps seem to break with every OS update. It's also extremely hard to build apps for older versions of Mac OS X, as Apple hide the older versions of the SDK.
Cocoa has been a long path, but my Windows friends have had a constant churn of the latest API to use.
1) I am talking about the stuff we are supposed to use above the base Win32.
Whether it's WPF, WinForms, WinRT, or whatever the new flavour of the day is, it's always a layer on top. And that layer is never perfect, because you still have to do the occasional [DllImport("Kernel32.dll")] interop reference.
Here are the currently supported ways to deal with windows on Windows:
On OSX, you use Cocoa. When Apple introduced Swift, they didn't have to layer on top of the API accessed by ObjC. When you write Swift code, you call the exact same APIs as you would have done with your ObjC code.
Here are the currently supported ways to deal with windows on the Mac:
There was the Object Pascal Toolbox, replaced by C Toolbox (MacApp), replaced by MPW, replaced by PowerPlant, replaced by Carbon, replaced by Cocoa.
I can list many other examples, even Cocoa has lots of deprecated APIs since OS X 10.0.
While Apple floundered during the 90s with their OS strategy, their API strategy in OS X has been absolutely rock solid.
It's now 2016, Carbon has come and gone as it was intended to and Cocoa is still around and is still the API you use for working with the Mac. If you're trying to do something on the Mac, there's going to be a single officially supported way to do it and that's it (if it's supported at all).
Meanwhile, there are three totally different frameworks on Windows that you can use to do the same thing, all of which are officially supported and none of which are deprecated.
There are quite a few 32bit apps making use of it and they will never be ported to Cocoa.
> Meanwhile, there are three totally different frameworks on Windows that you can use to do the same thing, all of which are officially supported and none of which are deprecated.
Win32 and MFC are deprecated for new applications, the way forward is UWP with XAML.
Project Centennial is just for porting existing code into the UWP world.
Windows Forms was officially deprecated at BUILD 2014.
That's not a bad thing. It means if I wrote an app 5, or 10, or 15 years ago, I can keep working on it and updating it without having to throw most of the code away and start again.
2) Applications written in the 90s still work on Windows 10. There was a short period (around Windows 8) where MS started dropping all their old APIs, but Windows 10 has stabilized that.
3) Yeah, screen resolution stuff is finicky on Windows but this is largely due to legacy apps. Modern apps don't have issues with this.
4) The Microsoft Precision touchpad APIs solve this problem. Admittedly far too many manufacturers are still using the old system (which pretends the touchpad is a USB-connected mouse), but the ones that have switched to the Precision APIs are really nice, and keep improving with each Windows update. I expect very few non-Precision trackpads to exist come 2017 models.
5) Windows laptops have as much, if not more, battery life than Mac laptops. (The Surface Book gives you a 16-hour option.) I don't think this is very valid anymore.
6) MultiDesktop - If you mean virtual desktops, Windows's Virtual desktops are no worse than MacOS's. Neither are close to what Linux offers. And if you are talking about multiple monitors, simply the presence of Win+Direction Keys makes Windows superior.
7) POSIX scripting - This is huge. It's why I usually run a Linux VM in my Win10 machine for most of my programming. Good thing that MS is improving, but I am not convinced the Linux Subsystem will ever be as good as being a POSIX OS in itself.
8) I have to strongly disagree with this one. Time Machine corrupts far too easily. You can theoretically restore from elsewhere, but the failure rate is way too high for anyone to trust Time Machine as a backup system. Windows backup, while far less user-friendly, is more reliable. If you are referring to more UNIXy solutions (which might be the case considering you refer to point 1), you may be correct.
9) Windows 10 has native PDF support.
Bit late but just to chime in on this, to my knowledge on Windows 10 you cannot:
- Save desktop arrangement so your setup persists through shutdowns.
- Assign apps to specific desktops so they always open straight on that desktop.
- Assign keyboard shortcuts so you can jump to specific desktops rapidly without sliding through or using the mouse, e.g. ctrl + 1-9.
If you can do any of these, then I would be genuinely grateful to hear how, as I'm evaluating returning to Windows as my main OS. Without these, Windows virtual desktops are more of a proof of concept and lack the tools needed to make them actually useful and productive.
> Windows 10 has native PDF support.
All of my PDFs seem to want to open in Edge by default, which is a web browser. On macOS they open in Preview, or Quick Look.
Seems like I would take a super powerful Windows or linux server any day over a lower powered Mac Pro just because of the OS if hardware specs were that critical.
Taking 10 vs 1 minute to render would be way more important.
BTW, Windows 10 has multi desktop and most apps work fine with high DPI.
Every single app integrates well and the scaling is flawless even for the few that don't support 2x mode. I have never dragged a window between screens on the Mac and seen it appear at the wrong scale.
I've done it about 50 times in the past week at my day job on my Windows machine. To be fair, that Windows machine drives a VR headset that you just can't use on OSX because no Macs have the compute power needed - there are definite wins to going with Windows.
It's just that the Windows mentality is to get most things working and assume the rest is something most users won't care about, whereas the Mac mentality is to get it all working together seamlessly.
I enjoy both OSs (the UX is basically 90% the same, by the way); they're fine. And there are areas where one works better than the other. Focusing on the issues with one of them without acknowledging the issues of the other just shows that you're probably not familiar with the other.
I'm not disputing your point, just showing your partiality. I've literally not had to "manually" resize or drag a window in Windows since Win7, but I do it everyday on mac. The Windows model for window management is just better, but you'd never hear it in these comments.
I use Windows more than OS X these days.
But yes, some Windows apps still leave a lot to be desired on 4K if they haven't been updated.
* My Razer Blade 2015 has no problem driving its hidpi screen and my lodpi monitor, even at the same time
* My battery lasts 8 hours with a VM running, which is about as good as any MacBook I've ever had
* Multi desktop exists in Win10, Win+Tab lets you switch desktops
Windows isn't perfect, and I spent a long time on a combination of Linux and Mac machines, but my daily driver is Windows with a Linux VM for programming now and I'm perfectly happy.
One of the developers at my work can't do fine tuning of our styles because his Windows multi-display setup won't re-render apps at 1x when he drags them to the lodpi monitor. Instead it renders the app at 2x and downscales it to 1x, blurring everything.
Really? Mac OS also has a very long history of deprecated and thrown out of the window APIs and SDKs.
Now fill the other 9 years and 10 months, minus two or three developers that are still working on some of the bigger items for another couple of months at most.
Edit: maybe it would help you to level out your perspective by thinking about some reasons why almost every business on the planet runs Windows and not Mac.
Every is a bit strong, but...
Well, IBM wanted to use an OS called CP/M and was buying a lot of products from Microsoft. Something stupid happened, and then IBM bought their OS from Microsoft. Microsoft was allowed to license it, and Compaq figured out a workaround for the BIOS issues. Thus DOS became the big thing because of IBM, then all the other PC companies. Apple totally flubbed the Apple III, and Commodore didn't retain their leading market share. IBM then completely failed to follow up with OS/2 (it seems working with Microsoft to compete against Microsoft Windows is not a healthy move).
DOS begat Windows which begat Windows NT, and the software was good enough. I can buy a PC from more than one company and they have a lot of non-technical, business support. Microsoft doing a bit of cheating probably helped. It actually helps sell support contracts to have some problematic areas in your software which generates an amazing ecosystem that businesses understand (paying someone to be responsible is understandable).
A lot more people drive Kia than BMW, but I really wouldn't argue Kia is a better built car.
He's right with these.
I now solely use the touchpad on my MBPr. I cannot see myself ever switching back to a mouse. Having been a mouse user for 16 years!
The ability to do a 3 finger swipe and switch to another virtual desktop in a split second is awesome. Is this possible on Windows? I don't know.
I was working on a Windows 10 machine yesterday and PDFs opened with Edge. Again, I don't know if it's right or not.
Edit: Every business on the planet runs Office, and they used to all run Exchange, Outlook and SharePoint. They got Windows as a side effect.
I agree. :)
Maybe those other shitty operating systems makers should try harder if they want to replace Windows. It won't happen though because none of them care about their business customers as much as Microsoft does.
Here it is again: everything that person stated was either incorrect or extremely insignificant, because obviously none of those reasons have affected Windows' very significant market share.
EDIT: What I don't have time for anymore is refuting every person's laundry list of nit-picks.
As for web development, there is nothing you can do on a Mac that can't be done on Linux or BSD apart from testing in Safari, and Safari has a smaller user base than IE/Edge. Hell, even Windows is improving in this space.
I guess one could argue that macOS is a more aesthetically pleasing environment to work in, to which I counter that I'd rather work in BeOS/Haiku as I find it more aesthetically appealing than any other OS. In other words, that's purely subjective.
* Better touchpad support throughout the entire OS.
* Vastly superior HiDPI support (especially if you're using a multi-screen, multi-DPI setup). I cannot stress how poorly Windows 10 performs in this regard. This has been a solved problem on the Mac for years.
* Better colour management through the whole UI stack.
* Better tablet support (OSX has had integrated support for tablet events since 10.4, on the Windows side there were still apps that broke when the Surface 3 launched almost a decade later due to using a third-party API for tablet events that needed custom drivers)
* Core Audio. OS X has had the same set of audio APIs since 10.3, and they've been well regarded for years and years. It wasn't until Windows 10 that you could argue Microsoft had finally caught up, and there's still people with ASIO driver compatibility issues. That's literally 12 years it took for Microsoft to catch up.
* OS X had scrolling of inactive windows for over a decade before Windows caught up. I actually think it's something that was there in 10.0 but I only started using Macs at 10.3 so I can't be sure.
* Spotlight. Spotlight is almost the perfect analogy of the difference between OS X and Windows, and I'm just going to look at one tiny feature of it. If you want to do some math on Windows at a single keypress, you need to turn on Cortana, which only became available in Windows 10. Cortana isn't even available in every country. Meanwhile OSX has had that built into Spotlight since 10.4 and it works on every Mac ever shipped since then, no matter what country you're in.
* File tagging. OSX has had this for over a decade, and you still can't do it in Windows.
* An integrated C/C++ runtime library, like every other Unix ever. It wasn't until Windows 10 that you could ship C/C++ code without worrying about whether your end user had the right msvcr/msvcp DLL installed on their machine.
I'll stop here, because there are so many things that OSX has had for a decade or more that Windows still hasn't caught up with that I wouldn't be surprised if Windows still hasn't caught up after another decade.
* Precision Touchpads are equally well supported in Windows.
* HiDPI is better on the Mac
* I don't know much about this.
* Are you referring to Wacom style tablets? Not sure about that, but if tablets in general, OSX tablets don't even exist, so it's irrelevant.
* Windows Audio APIs are as good as Core Audio now.
* This really isn't that big a deal. In Windows it leads to different benefits, where my mouse could be anywhere, but I could still scroll the page I was reading.
* Cortana is better than Spotlight now. I didn't know that Cortana would be completely disabled in countries that aren't supported. That sucks.
* Windows has had file tagging since Win 7.
There really isn't that much of a difference. There are places where Mac OS is better, and there are places where Windows is better.
* The APIs are there now, but the hardware isn't on many machines.
* Yes, I'm referring to graphics tablets. I've had to deal with the APIs for that stuff myself and what was about 10 lines of extra event handlers on OSX ended up being well over 200 lines on Windows.
* There's still driver issues that aren't present on OSX.
* You've got to realise that wherever you think "this really isn't that big a deal," it's actually a huge deal to the people who use OSX daily and want a better Mac Pro, precisely because Windows doesn't care about the small things.
* Cortana isn't better than Siri though, which is what it's actually meant to compete with.
* To be fair, the feature was so undiscoverable that it took someone on Hacker News telling me about it before I knew it existed.
A lot of my post was pointing out things that have been better on OSX for literally more than a decade before Windows finally caught up in a few areas.
HiDPI is probably the biggest thing that Windows just won't catch up on for a while. API compatibility requires Windows to support APIs that stop HiDPI from working well, because old Windows allowed setting a DPI that affected certain UI components but not everything.
My point about the scrolling isn't that it's not a big deal because it's a minor deficiency. My point is that it's a design choice, and many prefer the Windows method. Neither choice is inherently better.
Despite having come in much later, Cortana is at worst about as good as Siri. For one thing, on the desktop, I can actually type my Cortana requests. The fact that Cortana combines Siri and Spotlight makes it far more powerful and useful. It's kind of insane that Apple thinks people want different information when they type a question than when they speak it.
Windows has been a lot better than OSX in many ways as well. Especially since Windows 7. For one thing, it will take until next year before Apple gets a decent File System, instead of a kludgy mess.
What's worrying for Apple is that Windows is improving a lot faster than OSX. The gap between them is narrowed very significantly, and personally, for the past few months, Windows 10 is a much nicer environment than OSX (and I've been using OSX exclusively for over a decade).
I think the idea that Windows is so far behind OSX is just not true. Part of it is that this was true for so long, and Windows 8 was so bad, that people believe it instinctively. Part of it is that people moving to Windows want it to be like OSX, and anything that is different is considered worse.
When I've wanted to do some quick calculations on a Windows PC at a single keypress, I've pressed the calculator button on my keyboard and a calculator has come up. And this has been true since the turn of the century (for me personally; for Windows for several years longer than that).
Each one of them comes with their own particular set of quirks/warts and I can't objectively say one is so superior that I'd rather use it to the exclusion of the others.
The WSL will never replace a native Linux OS. Period. But it will be pretty useful for some lighter use cases. As a dev, I'm still not convinced it's good enough, given the recent issues with Ruby and Node.
But I kind of agree, actually; the only real issue was an advanced socket problem with PhantomJS. For some reason, Capybara/Poltergeist can't connect to it. That's the only reason I am back on MacOS, but I'll switch back as soon as it's fixed. It should be soon.
I'm running current stable (full disclosure: MS FTE here), and there are still a few hang-ups that I know are being addressed (and some already fixed) in the Insider Previews.
I had to leave the Previews because I needed to pin down a few things, and mostly use Docker for Windows instead of WSL for work. WSL will not run Docker containers directly (for instance), but I managed to get the CLI to run in WSL to control them inside the Docker VM. Well, until the last Docker upgrade reset the configs... :)
Meanwhile Windows now has everything Ubuntu does.
Thanks Open Source!
I'm actually surprised at the question, considering the number of devs who work on macOS (and I'm not talking about front-end folk).
It does similar things with boot times (measuring and reporting slow startup apps) and IE add-ons. I want my OS to make suggestions about changes I could make to improve my experience.
Well, this time it's Intel and not IBM, except Apple owns their own chip designs and has 200 billion dollars laying around.
The kernel would run on an ARM processor, along with all the supported software. Any x86 binary would be offloaded to the Intel CPU. Sharing RAM and even the remaining hardware would be nearly impossible, I guess. Not sure, though. This way it could work like the integrated/discrete GPU design, and the switch could be done a lot less painfully.
Yes of course, we were doing it back in the 1980s with hybrid 6502/Z80 systems, the Z80 side running CP/M.
The territory that they'd concede, though, is expandability: people buying CPUs, RAM, drives, and GPUs from non-Apple vendors. But that would buy them continuing to tie professionals to macOS and then selling them laptops and building hype. Instead I think they're going to ditch macOS and switch to iOS on desktops and laptops, and you'll see their laptop and desktop sales shrink, and they'll become entirely an iPhone/iPad company. I don't think this has anything to do with supply chain; this is entirely self-inflicted damage because they're prioritizing their walled-garden strategy above all else.
And? If any of the handful of people on the entire planet capable of building an ARM core that rivals Intel's current x86 designs were hired by Apple, it would be all over the tech industry news sites instantly. 200 billion doesn't mean a thing if it isn't applied properly.
The only piece missing is creating iOS apps on iOS devices, and for that they've already created a language and an app for learning programming. At that point, everything we do and know and learn just to keep our computers running becomes legacy stuff.
The only issue will be major apps like Adobe CS and Office, but even those have iOS incarnations that can potentially reach parity with their macOS counterparts.
I suspect a Mac running a keyboard/mouse- and windowing-friendly iOS would make a really nice machine.
Sure you can. Or rather, you can install the [free] VMWare ESXi hypervisor on said hideous box, and then install macOS on that. The ESXi hypervisor is an officially-supported macOS hardware configuration. Apple want you to only run macOS guests through ESXi if you're on macOS hardware, but there's literally nothing stopping you from doing otherwise.
Default new USB device attachments to the macOS VM; add a USB Bluetooth dongle; use SR-IOV to feed the VM a dedicated video card, and plug your monitor into that. You'll never even know ESXi is running.
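For reference, the VM-side half of the GPU passthrough lives in the guest's .vmx file. A sketch from memory (the PCI address is a placeholder; look up your own card's address in the ESXi host's passthrough config, and check VMware's DirectPath I/O documentation for the exact keys on your version):

```ini
pciPassthru0.present = "TRUE"
pciPassthru0.id = "0000:01:00.0"
usb.present = "TRUE"
```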
Side benefit: cheap sibling Linux VMs!
ON APPLE HARDWARE. If you run macOS on non-Apple hardware it isn't officially supported at all (it's effectively piracy), and you need to hack ESXi to make it work at all (ESXi Unlocker, etc.).
> but there's literally nothing stopping you from doing otherwise.
Except the license agreements (both Apple and VMWare), and the fact that ESXi won't do it unless modified.
By "officially supported", what I mean is that Apple supports VMWare in developing drivers for ESXi, so neither new versions of ESXi, nor macOS updates, will suddenly break macOS ESXi guests. You can put the same expectations on the stability of the ESXi hypervisor in running Apple software as you can on Apple hardware in running Apple software. (This contrasts with the Hackintosh driver ecosystem, where every time macOS updates you have to figure your compatibility out all over again.)
Now, the other half of that—the stability of your hardware in running ESXi, etc.—is up to you. Obviously, you can't take your Dell into an Apple Store and expect support.
VMWare sort of does support you, though, insofar as it's the free-tier support (you can get better if you pay for VSphere), and insofar as whatever problem you're facing is a problem with ESXi—or a problem in your hardware's ESXi compatibility—instead of a problem in how ESXi handles macOS guests. But, as I said above, the latter is Apple's problem to worry about; such bugs get fixed, to support the people who are virtualizing macOS on Apple hardware.
> the fact that ESXi won't do it unless modified
Not actually true! ESXi won't create a macOS guest without hacks. It's perfectly happy, however, to import a macOS VM created using VMWare Fusion. There's even a quick menu-action to upload VMs directly from Fusion into ESXi.
> Except the license agreements (both Apple and VMWare)
A license agreement doesn't stop you from doing anything. It doesn't have hands to restrain you from typing the commands that will install macOS. You stop yourself, because of a license agreement. Or you don't. Up to you, really. It's not like either Apple or VMWare is going to stop selling you their products, let alone sue you, for what you do with their software on your own computer.
(Now, if you're an employee of a company, then the licence agreement might indeed have "hands" to stop you with: the hands of the lawyers of the company, enforcing the company's agreement with VMware/Apple. This is why https://macminicolo.net/ exists, instead of just being an ESXi farm.)
First, a point of note: one funny thing is that the VSphere ecosystem assumes the ops people deploying VSphere use Windows. So the tooling around VSphere—even the unofficial community tooling—consists of Windows programs.
Besides the server computer I was setting up, I only have a MBP (with no Boot Camp partition), so, "step zero": I installed VMWare Fusion, and downloaded a Windows evaluation guest image from Microsoft.
The rest of this assumes you have Windows:
1. Install http://www.v-front.de/p/esxi-customizer-ps.html, and use it to generate an ESXi ISO customized for your machine. (Add network-card and storage drivers, essentially.)
The generated ISO doesn't "burn" onto a USB correctly using regular tools for some reason. The recommended Windows tool is called Rufus. I couldn't use that from a VM without a lot of hassle, but it turns out the ISO is essentially an ISO9660 export of what was a FAT filesystem, including UEFI boot files. So:
2. Extract the ISO onto a FAT-formatted USB stick. Plug it into the new machine, and pick the UEFI file in your BIOS boot manager. The installation proceeds a lot like an ncurses Linux install (e.g. Ubuntu Server.)
3. Once the machine is running, it shows a URL you can reach its web client interface at. For managing the hypervisor, this is mostly fine. (You can also enable SSH from this web console, and then SSH into the hypervisor.) For setting up new VMs, however, I found the [free] VSphere Client (https://kb.vmware.com/selfservice/microsites/search.do?langu...) a lot more powerful. Install that if you like. Again, though, it's Windows-only.
4. Create a macOS VM in VMWare Fusion. (It assists you in doing this; it just requires that you have a recovery partition on your Mac.)
5. Find the menu option in VMWare Fusion to "Connect to an ESXi server"; specify the IP the server console provides, and the admin user/password you provided during installation.
6. Go to the VM library window, right-click the macOS VM, and one of the options should be "Upload to an ESXi server". You can then pick one of your connected servers.
One thing VMWare wants you to do, that you shouldn't do (for a home lab, at least), is to install their Platform Management Controller, "VSphere VCenter." (You might be tempted, because that's the only way to get their modern [horribly-janky] web-app interface that exposes all the VM configuration options like the Windows client does.) The VCenter VM appliance is ridiculously bulky: it takes ~16GB of RAM and two dedicated vCPUs just to run it. It's designed to consume one whole ESXi host node of a VSphere cluster, not to assist in the maintenance of a single-node ESXi setup.
It's not to say that I don't care about performance, I have a MacBook Pro bought by my work that's about 5 months old and it's great.
But honestly, I run the hell out of this thing and I never really notice a slowdown.
I can have Chrome open with a bunch of tabs (I probably limit it to a dozen before I force myself to start reading/closing them), multiple VMs spun up with Vagrant/VirtualBox, a Windows 10 VM via Parallels, an entire Linux software stack via docker-compose, Sublime Text, Slack, SourceTree, Postman, 4 or 5 terminals, a VPN manager, and multiple web servers, and not notice any slowdown whatsoever.
If a new MBP came out and offered 32GB of RAM, I'd take the option. But do I really have any reason based on reality that I need more than 16GB of RAM? No.
Would I ever buy a 4K or 5K Mac Pro? Unless I had some sort of big data or scientific computing need, no. And even then, wouldn't I have a cluster somewhere else for that? Probably.
Miserable is a strong word. Ubuntu is hardly miserable. Chromebooks are pretty much guaranteed to be able to run Linux without worry, and installing Ubuntu on them shouldn't be much of a hassle.
I personally run Fedora on an XPS 15 and have had very few issues with it, which is crazy since I'm bleeding edge everything by that count (hardware and software).
And RIP Apple hardware, I give you the Hackintosh!
I would be surprised if he's ever really given Ubuntu or Fedora a serious chance. I think he thinks Google's Android and Kindle Fire have the same UI. I listened to him once bitch about Windows and it was so obvious he hadn't installed it properly.
I'm pretty sure I won't get it (or worse, that I'll have to pay through the nose for it at Mac Pro-grade prices), but I'd really like to get a more modular Mac.
I don't think that Apple will do it, though. Jony "VP of Narration" Ive has a thing about unbroken surfaces, and even though I like Apple's overall aesthetics, I don't think there's any way a modular system fits into the Mac "look".
They're not going to keep making that much money from the iPhone forever; it will decline eventually as competitors keep getting better and better. Google is entering the market with products that finally rival iPhones, and Microsoft/HP/Dell with Surface/Spectre/XPS products that do rival MacBooks.
This is Apple; they want total control of the entire stack, and the only way to get that is to do it themselves. macOS will not survive on other hardware with questionable QA. I started hating W10 simply because of its driver issues and the automatic brightness stuff that I can't turn off on my SP4. The same Win10 on my Macs works much better and more consistently. There's no way Apple will do this to macOS.
And when Linux breaks, there is a wealth of knowledge on how to fix it. I have found OS X's community to be plagued by the opposite.
> Nobody else can make macOS hardware. If Apple doesn’t address someone’s hardware needs, there’s no alternative.
So that's a bad thing right, the closed ecosystem?
Oh no wait.
> Microsoft is boldly experimenting with PC hardware, but [even] if Microsoft did everything right, it would take Windows at least a decade to catch up
Speaking of which, that decade is an entirely unfounded claim. It doesn't even attempt to make arguments, instead waiting for the Mac vs. Windows flamewar to start. Then, to add some fuel, it says Linux sucks as well, not backing up that claim either, inviting anyone using a Linux-based desktop (and by extension, the rest of the open source desktop community) to add to the flamewar.
And it's all for nothing. The article goes on about how bad it would be not to get another Mac Pro, using a hundred arguments that come down to "I like OS X software and software that runs on OS X, and the pro hardware is better than the consumer version [duh]". Apple cannot possibly not have thought of this already.
PS. My memory must be wrong but I thought I remembered a different company boldly (bravely?) experimenting with hardware.
I take this as a hint that they quietly tolerate hackintoshes at a small scale. They can't endorse it outright, and anyway it's a non-issue for most Mac customers (who want the "it just works" experience), but some people in the "pro" segment who need the extra power and don't fear some hacking can be satisfied that way. I'm thinking especially of small iOS or macOS developers, or people who work with audio/video, who need a beefy macOS machine.
And it would make a lot of sense to go back to a bit more "PC" like design like the predecessor of the current machine. That would mean that Apple has less pressure to update it frequently, it would be sufficient to have updates whenever Intel releases new CPUs, but all the other stuff, especially graphics cards, could be updated independent of Apple releases.
I disagree! A Linux desktop gets as good as your customization.
For me nothing can replace my setup: Debian + custom Stumpwm window manager
Pro users want flexibility.
Not saying there's anything wrong with people enjoying tinkering with their setup, but I guess the advantage OSX offers is that most of that is taken care of for you on the desktop; you just install your apps on top.
They discontinued the rackable servers, yet require developers to do software testing and compiling on macOS. On what? Racks of MacBooks and Mac Minis, apparently.
What backend are we supposed to render on? A single, 3 year old Mac pro?
I get wanting to only sell in highly profitable segments, but maybe they should give developers a little sugar
They make some really nice computers, but the whole point is that there are others. I really miss the times when I would assemble my own computer. Obviously you can't do this with laptops, but I still miss the freedom it gave me.
I am exploring how to get my work done on something else; to start with, a nice Linux machine on the desktop.
I tried several times in the past, but unless we actually use Linux, there will never be a real need to fix the issues it has.
Is this really true? Yes, there are currently support issues, but this would seem to be because Apple exclusively makes its own hardware and doesn't legally permit running their OS on anything else. Might macOS thrive if it was retargeted to (or at least officially permitted use of) high-end 3rd-party hardware? Microsoft doesn't make their own high-end workstations, and Windows seems to be doing just fine.
But that is irrelevant for a lot other use cases.
Hackintoshes are getting better, but Marco is fundamentally right. Any OS update could break your system, which means you need to wait before installing an update, which opens you up to security and/or professional risks (for example, if it takes a couple of months for the newest macOS to be made suitable for a hackintosh, that's a couple of months during which you cannot update your iOS apps with the latest features, since Xcode versions also tend to be tied to OS versions).
Which is not true. I have gotten my iMessage to work without having to clone a serial number.
A whole decade to catch up? I use Windows and macOS on a daily basis for web development and various other tasks. I find macOS inferior to Windows in many ways. For example, window management on macOS is terrible and the transitions are nothing but annoying (personal preference). You can't cut and paste with the keyboard, you can't lock your screen with the keyboard, and Finder, iTunes, and the task manager are all garbage. There are many other examples. That's not to say there aren't many things that are better than Windows or other operating systems, but the statement above is exaggerated.
Uh, are you using some weird KVM or something? There are keyboard shortcuts for both of those.