I never had success with Linux. My MacBook Pro has had its share of problems (Wi-Fi issues that were later resolved by a system update), but it's nowhere near my experience trying to install Linux and battling driver issues.
Anyone figuring out the Linux/Laptop problem is re-inventing the Macbook Pro/OS X.
Here are things I'd pay $1,000 for on top of the current MacBook Pro:
- Longer Battery Life (5+ hours)
- 32/64GB RAM
For OS X:
- Less clutter (i.e. remove preinstalled apps like Siri and let the user decide what to install)
- Native Package Manager
That's about it. I'll be buying the new MacBook Pro in a month. But if Apple releases something like the above, I'm more than happy to drop $5-8k USD on it.
If MacOS had: up-to-date OpenGL support, Nvidia made drivers that supported new cards for it, was not locked into Mac hardware - that would be tremendous.
On the other hand, if Windows had a proper shell and CLI tools - like Cygwin with zsh, but native, not an Ubuntu layer inside - that would be tremendous.
If Linux, any desktop variant (Fedora my poison), had Adobe's support for their DCC apps and great battery management for laptops - that would be tremendous.
If Windows and Linux had the above + Preview from MacOS - that would be tremendous.
It is PowerShell, and it really is. Until recently I thought Windows had poor CLI support; then I discovered PowerShell, and now I favor it even over bash.
Am I crazy? Possibly, but PowerShell is truly a gem in CLI history. It is a thoughtfully crafted take on what a "command-line interface" should look like.
> It is a thoughtfully crafted product regarding what "command-line interface" should look like.
I've never appreciated what's actually good about powershell... you pass objects around? So...? Is that a thing that's useful?
If you want to just automate a task, having intermediate objects that are serializable (eg. strings) so you can `foo ... > blah` and inspect the value of blah before continuing (`cat blah | command2...`) has always seemed far more tangibly useful.
Having methods on an object you can invoke like a REPL for the OS sounds like a good idea, but I've never actually found it useful. It's like the Python REPL: useful for prototyping, and for doing stuff after you've imported the 50 packages and set up the environment - but once you open a new instance, you've got to spend that time again before you can do any work, and it's useless for scripting.
...but, powershell gets a lot of love from people; so what do you actually find it useful for?
Honestly curious, I've only ever touched it briefly and then swapped over to other things.
String parsing is the bane of my command line scripting experience. Even when targeting "identical" environments, all it takes is one changed installation default altering the output of one of my many commands for my scripts to break - usually in some non-obvious way that requires a good hour to get to the bottom of, rework, and fix. To keep such changes from forcing me to rewrite entire scripts, I try to centralize the text parsing and munging in one place, "deserializing" those strings once and feeding the result to the rest of the system. Shipping this deserialized state around in command line scripting languages can be so awkward at times as to warrant rewriting the entire thing in a proper programming language. Inter-operating between your new program and your existing scripts will, of course, require even more text parsing.
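To make that "deserialize once" idea concrete, here's a minimal sketch. The `df`-style sample input and the `parse_df` helper are both invented for illustration; the point is that only one function ever touches the raw text, and everything downstream reads stable tab-separated fields:

```shell
# All brittle text parsing lives in one function; downstream code only
# ever sees stable TSV, so a changed output format means fixing one spot.
parse_df() {
    # Skip the header; emit "filesystem<TAB>use%<TAB>mountpoint".
    awk 'NR > 1 { gsub(/%/, "", $5); printf "%s\t%s\t%s\n", $1, $5, $6 }'
}

# Stand-in for real `df -P` output (values are made up).
sample='Filesystem 1024-blocks Used Available Capacity Mounted on
/dev/sda1 1000000 900000 100000 90% /
tmpfs 500000 1000 499000 1% /tmp'

# The rest of the script consumes fields, never raw text.
echo "$sample" | parse_df | while read -r fs pct mnt; do
    if [ "$pct" -ge 90 ]; then
        echo "WARN: $mnt is ${pct}% full"
    fi
done
```

If the tool's output format shifts, only `parse_df` needs rework; the consuming loop never changes.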
Don't get me wrong, sometimes munging text is your least horrible option. Powershell still lets you do that.
But Powershell's objects also let you, with great frequency, skip the "try to 'deserialize' text that was really formatted for humans and isn't versioned, can be ambiguous, and otherwise was never written with machine consumption in mind" step. If I feel the need for a 'proper' programming language for parts of my script, I can write C# modules and use them from powershell without writing a bunch of text (de)serialization code on either end. This singlehandedly eliminates entire swaths of the most brittle, opaque, and otherwise obnoxious code to ever grace my scripts.
TL;DR: Strings shot my dog.
Sure, I could write a Python script right now that would read me the last lines of a log file on a remote server. Or, I could just type something like this and hit enter: ssh user@server "tail /path/to/log"
I think what GP is trying to say is that PS is in an awkward position between the two. It has a deeper understanding than bash/zsh/whatever, but it also requires more typing. Yes, PS will fix issues like the ones you have described, but typing in long names (at least for me) defeats the purpose of using a shell in the first place. I don't want to type in "Get-Item" or whatever a million times, nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
It's my experience that the latter eventually morphs into the former "without question".
And go figure, the build server doesn't have python installed. Or only has python 2. Or only python 3. Ditto for a coworker - this being game development, a lot of those coworkers aren't programmers, and won't be able to debug "hey python is missing" on their own - sucking up IT and developer time.
> It has a deeper understanding than bash/zsh/whatever, but it also requires more typing.
Aliases, tab completion... you're not wrong, but I've not found it an issue in practice. In fact, rather the opposite: I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using. This is perhaps because I'll script anything that gets annoying. I don't spend a huge amount of time doing bespoke commands in a shell, though.
> nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
I suppose we do vastly different things with our shells. Looking through my history, it's mostly things like "cd", "ls", "vi", "make", etc., and my longest bash script that's stood the test of time is 12 lines long, with the most complex part of it being an if statement in a string (trust me, there's a reason for that). I've run much, much longer shell scripts, but I almost never write a shell script longer than 20 lines.
> And go figure, the build server doesn't have python installed [...] this being game development, a lot of those coworkers aren't programmers
AHHHHH ok we definitely do work in very different atmospheres! I suppose in instances where "coworkers aren't programmers, and won't be able to debug", I would just write a Python script and use PyInstaller so they could just double-click on a .exe
But if I'm on someone else's computer and they don't have Python or anything like it, then I would honestly just install Python. But I definitely see how you or anyone else would object to this, and I can totally understand the view that it's much better to use PS in this instance.
> Aliases, tab completion... you're not wrong
You're right, there are aliases and tab completion, just like on bash/zsh, but on PS I have to remember both "gi" and "Get-Item". Sure, I'd use something like "gi" all the time, but whenever I look something up and see a StackOverflow answer that says "Get-Item", I have to know what that means - which means I have to memorize both the long and the short versions of a lot of things. On Linux shells, I feel like I only memorize one short thing like "cat". Sure, I also have to know what it does, but the same applies to PS.
> I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using
The letter flags part is a fair criticism. But don't all shells suffer that? It's the cost of writing quickly. I could Google "what is gi" but instead I choose to google "what does set -E do?"
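Since `set -E` came up as the example: here's a quick self-contained sketch of what it does in bash - it makes an ERR trap fire inside functions too, where it is normally suppressed. The temp file path is arbitrary:

```shell
# Write a small bash script to a temp file and run it with bash
# explicitly, since `set -E` (errtrace) and ERR traps are bash features.
cat > /tmp/set_E_demo.sh <<'EOF'
trap 'echo "trap: failure in ${FUNCNAME[0]:-main}"' ERR
f() { false; echo "f kept going"; }

f        # without set -E, the ERR trap is NOT inherited by f: no trap output
set -E
f        # with set -E, the trap fires at the `false` inside f
EOF
bash /tmp/set_E_demo.sh
```

Running it prints `f kept going`, then `trap: failure in f`, then `f kept going` again - the trap only fires inside the function once `set -E` is in effect.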
As for the part about "reading documentation to decode bash/zsh scripts", I think that this discussion sums up why what you're saying is true for PS as well: https://news.ycombinator.com/item?id=14034414
> I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds.
I was working on two programs. One would do stuff and print JSON to stdout, and the other would take JSON from stdin and process it. I had a Linux VM running inside Windows. From my VM, I ran something like `program1 > file.json` and then I ran `cat file.json | program2`. This way I could inspect the JSON file at any time in case something went wrong in one of the two, independent programs. Everything was working just fine.
Then I stopped writing code and testing it in my VM. I decided to go the Windows route: update my code outside of my VM, then run it in PS. I ran `program1 > file.json` and it worked like a charm. Then I ran `cat file.json | program2` or whatever you run in PS (it's been a while) - but it didn't work. So I assumed it was my fault. Time to debug. I looked at `file.json` line-by-line, and it was just fine, so program1 was fine. I looked at program2 line-by-line, and it was just fine, so program2 was fine. I went to my VM and ran `program1 | program2` and everything worked fine.
How was it possible that my code worked just fine in Linux, but not in Windows? It turns out that when I ran `program1 > file.json` in PS, it messed up my JSON file in a way that was nearly undetectable. I ran `program1` in PS, selected the output, copy-pasted it into a text editor, and saved the file as file.json. Then I could run `cat file.json | program2` or whatever from Windows and it worked like a charm.
To this day I am not sure what happened. Also, program2 supports a file name as an argument which it will then open and read, so some of the commands I listed may be slightly different from what I actually typed, but the gist of it working perfectly in bash on Linux but not on PS was enough to destroy me. Perhaps the issue was something about encoding? Sorry if what I'm saying does not seem very concrete. Here are some links that demonstrate (possibly different) issues people have using redirection:
There seems to be a solution to all of the problems, but that debugging session did quite a number on me.
The problem is scaling this to many coworkers. At some point it becomes "wait for I.T. to get around to automating the install across the fleet" or make your scripts install python, ninja-like. But it sounds like you're more able to rely on python, so that probably makes more sense for you (if only so you don't have to rewrite the same script for non-Windows boxes.)
> Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds
It sounds bad enough I can totally get where you're coming from. Heck, it's basically the exact same place I'm coming from with the "strings shot my dog" quip ;)
> Perhaps the issue was something about encoding?
Something to do with e.g. UTF-8 BOMs or line endings (\r\n vs \n) would be top of my paranoia list. I'd break out a hex editor or binary diffing tool (I've used 010 Editor a couple times for this) if you find yourself in the same situation again. Understanding exactly when I have a single string with newlines vs when I have an array of strings with implicit newlines when joined isn't something I've got my head perfectly wrapped around yet in powershell, and could be another possible cause.
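For anyone hitting the same wall, both of the usual suspects are detectable from a Unix shell in a couple of lines. Below I fabricate a "broken" file with a UTF-8 BOM plus a CRLF ending and then check for both; as far as I recall, older Windows PowerShell's `>` actually wrote UTF-16LE (BOM bytes `ff fe`), but the detection idea is the same:

```shell
# Fabricate a JSON file with a UTF-8 BOM (ef bb bf) and a CRLF line
# ending - the kind of invisible damage a Windows-side redirect can add.
printf '\357\273\277{"ok": true}\r\n' > /tmp/suspect.json

# Check 1: do the first three bytes match a UTF-8 BOM (ef bb bf)?
head -c 3 /tmp/suspect.json | od -An -tx1

# Check 2: any carriage returns? A nonzero count means CRLF endings.
grep -c "$(printf '\r')" /tmp/suspect.json
```

A hex dump of the first few bytes (`od`, `xxd`, or a hex editor) would have exposed the "undetectable" corruption in the horror story above in seconds.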
> Sorry if what I'm saying does not seem very concrete.
You're offering what you know, and I appreciate it :). Sorry for the short reply (I need to be somewhere...)
But boy is it ugly. And the worst thing is that all the really nice functionality .NET provides with LINQ is implemented halfway at best, if at all...
1. It's amazing because Windows-only admins (or predominantly Windows admins with almost no Linux experience) have never seen anything like it before. Until PowerShell, the state of the art was VB scripting or batch files, both of which are (objectively) garbage. Regardless of how long the rest of us have been working with shell scripts, Python scripting, etc., Windows users have never had the opportunity to do similar things with similar tools which are included with the OS.
2. It's amazing because it does a lot of great things that even bash scripting can't do. The idea of passing around structured data can be super handy for a lot of common tasks. For example, on Linux I have to use 'ip addr list' to get the interfaces, grep to get just the lines with addresses, and awk to extract the IPs - and only then do I have a list of IP addresses. It's a huge stupid hassle that I have to go through every single time I want to write a script that works with IP addresses.
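For the curious, the string-parsing dance looks something like this. The `ip -o -4 addr` sample output below is canned (real output has more columns and varies between iproute2 versions), but the shape of the problem is the same:

```shell
# Canned one-line-per-address output, roughly what `ip -o -4 addr` emits.
sample='1: lo    inet 127.0.0.1/8 scope host lo
2: eth0    inet 192.168.1.23/24 brd 192.168.1.255 scope global eth0'

# Drop loopback, take field 4 ("addr/prefix"), strip the prefix length.
echo "$sample" | awk '$2 != "lo" { split($4, a, "/"); print a[1] }'
# prints: 192.168.1.23
```

Every assumption here (field positions, the interface name in column 2, the `/prefix` suffix) is exactly the kind of thing that silently breaks when the tool's output format changes.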
Making everything a string makes sense when it's 1970 and you want everything to be compatible, but when basically none of the tools I use on a day-to-day basis provide the option for easily machine-parse-able output, it ends up very frustrating. The (theoretical?) promise of Powershell is that all output is machine-parse-able.
The benefit of passing objects around is that you could do things like Get me a list of network interfaces | filter by interfaces which are up | which have IP addresses | just show me the IP addresses. The few examples I've seen make it feel like your shell is some sort of half-bash/half-SQL system where you can filter, process, and loop over objects.
I can't count how many shell scripts I've had to write which parse output to get the list of data I want, then go back over that same output again to do actual work on it. You can hack a lot of stuff together with ugly hacks; getting all the interfaces on a MacOS machine with IPs except loopback? Maybe 'ifconfig | egrep "^[a-z]|inet[^6]" | grep -B1 'inet' | grep '^[a-z]' | grep -v lo' would do it. In most cases. Probably there's a better way to do it, but if you just want to get something written then you can hack it in like this, or loop over 'ifconfig -lu' (which, on my machine, shows 13 'up' network interfaces), etc.
Another thing that bothers me is the lack of single-line composability. In most Unix shells you can pipe things around with abandon; it's not pretty, but it works. On more than one occasion while working with PS, I've had to create a cmdlet because there was no way (or I didn't know a way) to store intermediate values between cmdlets. One example was processing things in a loop: I had to store the current value in a variable and then process that variable on another line. I know someone here will give a solution, but I looked for an hour before giving up and creating a script file.
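For contrast, the Unix-shell version of "a loop in the middle of a pipeline": the loop is itself just another pipeline stage, so no intermediate variable or script file is needed (toy data, obviously):

```shell
# A while-read loop can sit mid-pipeline: sort feeds the loop, and the
# loop's output feeds tail, all in one composed command line.
printf '%s\n' 3 1 2 \
  | sort -n \
  | while read -r n; do echo "item=$n"; done \
  | tail -n 1
# prints: item=3
```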
On top of everything else, the cmdlets from Microsoft have differing switches for the same thing. One command might use -ServerName while another command will use -ComputerName. So, basically, you end up looking everything up before you can use it. I know Bash isn't much better, but at least I can expect that the tools are separate and not really designed to work together. I was expecting more consistency from PowerShell.
Windows is supposed to have better font support than Linux.
Also, if someone here has an answer: Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
Windows' way of registering program executables is the holy Registry. It's called 'Application Registration' and was introduced to reduce the need to modify the system-wide PATH variable. (They thought it was a bad idea to modify a system variable so frequently, and I partially agree.)
You can find registered applications in `HKLM\Software\Microsoft\Windows\CurrentVersion\App Paths`. Very few programs use that feature, which is unfortunate, but popular applications like Chrome and Firefox register themselves in it. That's why you can invoke `chrome` in the 'Run' dialog.
Edit: More context: at the time App Paths was added, modifying PATH meant editing AUTOEXEC.BAT by hand, which was painful. Not only that, PATH had a length limit of 128 characters. You can find more details on Raymond Chen's blog, as useful as always.
I should note there are still length limits - in practice you'll run into issues with as few as 2047 characters:
Debugging this is really annoying, as one of my coworkers found out when one too many applications decided to add multiple paths to PATH (for example, nVidia CodeWorks has added no less than 8 subdirectories of C:\NVPACK\ to PATH to support Android development - for gradle, ant, jdk, ndk, and the android SDK's support, build-tools, platform-tools, and regular tools.)
Said coworker ended up spending some time using directory junctions to shorten the paths in PATH to the point where his dev environment was useful again.
Oddly it appears the python2.7 installer uses this (global setting), but not the python3.x one (It would seem that python2 could register python.exe (as it currently does), and python3 could register python3.exe (as it currently does not)).
It certainly doesn't seem to make much sense for python3 to have an option to add itself to the path environment variable, and an option to change the path length limit - but apparently not an option to use this "modern" way of registering itself? (Unless, python2 and python3 installers, when fighting it out, default to only registering python2... which makes sense, but is painful).
But based on a windows hyper-v vm with only python3 installed, it looks like python3 does not use this setting.
This gets to the real issue of what makes PowerShell so horrible. It's not that anyone loves bash scripts; it's that there is a ton of great utilities that bash scripts tie together. Windows doesn't have this.
I'd be more open to Windows if trying to manage Unix boxes from it wasn't like trying to build a ship in a bottle. I always feel like I have one hand tied behind my back trying to do my job in Windows.
For simple web browsing and office work and such, it's fine.
As a side note, like you said, Cygwin is slow. I once measured how much time compiling things took on both Cygwin/MSYS2 and WSL: `./configure` was 3 times faster on WSL, and `make` was 2 times faster. I assume the reason is WSL's process management being lighter. This is another reason I'm looking forward to seeing the improvements to WSL's native integration.
Same here. I might be biased because I never completely mastered bash, as I don't use it that much. But after the first bit of the learning curve, the rest just seems to go automatically, with seemingly way less searching the internet: things are just easier to discover and figure out by yourself, and that also makes them easier to remember. Plus you can visually debug it. Also, some of the bash things I'm addicted to (autojump and fzf) have pretty good clones for PS, namely ZLocation and PSFzf; those are real timesavers for navigation and history search.
Sure, there has not been much progress with terminal emulators in the past couple decades, but in Windows there has literally been none at all.
I think if I liked the rest of the .Net toolkit more, I'd be more enthused about it. It makes things livable on Windows, which is better than the dark days of Vista/7.
(although that doesn't account for personal preference, of course)
With my understanding of the Windows architecture, such as it is, there isn't a great distinction between "native" and "Ubuntu layer" because the "native" is really already a "'native' layer". If they continue to polish the Linux support, it will basically become as native as the core of Windows already is.
Then again, developing for Apple currently means more compatibility with old hardware and drivers, but OpenGL 4.3 (which Apple doesn't support) includes Compute Shaders, something I would really like to explore, but can't as Apple supports only 4.2 with a limited amount of extensions.
It's all about benchmarks and cost/benefit analysis.
I've seen code that runs on iOS 20-100x faster than a desktop equivalent, because the iOS version is done using Metal and the desktop one is barely-vectorized CPU code. In some edge cases a "server farm" of iPads might outperform an equivalent spend on Xeon-based servers.
Also, OS X users are a relatively small group compared to Windows users, so if you're making, for example, VR applications, limiting yourself to the Mac is a losing move.
It's sad though, as I really love OS X as a development environment.
- Most apps would look tiny; others gigantic, pixelated (LibreOffice... looking at you), and plain unusable;
- External monitors have to be at the same DPI...
Also, the lack of proper graphic software like Keynote, Sketch, Pixelmator is the other reason I have to stick to macOS.
Finder is even more horrible. It reminds me of the Windows 95 Explorer. A network drive hangs? Good luck with Finder..., the whole system halts.
ps- if anyone has any suggestions or recommendations, I'd love to hear them!
I don't remember changing this setting, but it's always possible I did years ago. I have never been a big user of Spaces and somehow missed out knowing Split View was ever a thing, but it's actually pretty neat.
as long as someone does the job correctly
While I think that Apple should have just copied Microsoft's approach (i.e. what Cinch does) the wealth of options for window management means that everybody can find something that they like.
I use BetterSnapTool instead, which seems to have a better collection of settings and solves my (admittedly minor) gripe with Spectacle.
I know I'm being selfish here but it would be nice if Apple covered this sort of stuff instead of concentrating on emoji and Siri. Both of which I have no real use for.
Not sure about your window size problems with external monitors; I find that OS X handles this amazingly well.
It has probably been my favorite utility for Mac
But I do love the amazing level of configuration and scripting that Hammerspoon allows you to do.
Also you can drag left/right to snap to one half of a screen.
Anyway I tried to replicate this behavior on Mac with spectacle. It's meh. If you fuck up and fullscreen something and then try to half screen it, it seems to break horribly.
It can do everything but make lunch.
Admittedly it's annoying that this feature is hidden and requires dragging with the mouse, but it is possible.
Actually, the Finder has interactive breadcrumb-style paths that aren't hidden at all (just not on by default):
View -> Show Path Bar
It also has a "path dropdown" with all the directories up to the current path, shown if you command-click on the current folder's icon+name on the top-center of the Finder.
I've never found macOS's window management to be that bad, but to be fair I've been using Moom for a decade or so, and lately have been using the "split full screen app" trick a lot. (Someone else mentioned the long press on the green "full screen" dot for that, but you can also do it just by making an app full screen, going to Mission Control--which I do with a four-finger swipe--and dragging a second app on top of the full screen one.) That's not as useful for 27" monitors--in most cases I prefer to actually have untiled windows I can rearrange and resize with the pointer--but it's terrific for laptop screens.
Not sure what "isn't interactive" means, but the OS X "path bar" lets you do the exact same thing (as you describe for the command-click on the folder icon): by clicking on any folder along the path you can navigate to it.
Note that it takes a double-click for that though. Perhaps you were only single-clicking?
What is so horrible about it? I have the exact same reaction every time I have to use Microsoft Windows. Window management is one of the strong points of macOS.
> A network drive hangs? Good luck with Finder..., the whole system halts.
I've never encountered this problem with either AFP or SMB shares.
I am typing this on a Macbook Pro so I am not some Windows fan. OS X has fallen way behind in its desktop incarnation. The only reason I use OS X is due to its Unix shell for development. I don't think anyone can honestly say that as a GUI OS X is better than recent Windows.
I've never lost focus with fullscreen/2 monitors. So?
Can you give an example of this? I regularly switch between macos, windows and ubuntu and I have a hard time coming up with much positive to say about macos window management (unless a third-party tool is used).
I never understood this type of workflow and when I was in school it was everywhere, it's like apple was pushing for people to have little windows strewn out over an otherwise gorgeous display.
Also, it's nice for spatial awareness. I feel a little bit lost when a window takes over the entire screen and blanks everything else out.
It's even better in ubuntu because desktop switching isn't just left/right, it's up down. So I can really think geographically. I guess that's just the kind of brain I have - for example, if I want to look for tickets for a movie I want to see, I search "movie theater" in google maps and click the theater in the location I want to go to, then navigate through ticket buying etc. shrug who knows man
Uh... because with three 30" monitors, it would be like sitting in the front row at an IMAX theater?
While I'm at it, I'll also plug https://manytricks.com/witch which I find invaluable for switching between windows and just got an update.
I am specifically saying "Ubuntu" and not "Linux" because I don't care that it's a bloated distro, I don't care about tweaking things just right. I don't care about the freedom and flexibility that "hardcore" distros give you. I just want to turn on my computer and get to work, and Ubuntu gives me exactly that.
Your experience may be different but I'd be interested in hearing it.
Out of genuine curiosity, what are the ways it's not superior?
I still prefer using Ubuntu for my dev environment, tho.
I finally switched to same-resolution Dell units on my desk and am really happy about that.
There's also something to be said about the consistency of the old Apple displays. I've seen dozens at this point and none had noticeable backlight issues, and they look good right out of the box instead of requiring the user to switch off 15 gimmick settings to get a proper picture.
They had their issues but they did several things right.
It's a personal preference, however, so we let people at work make their own choices.
Apple displays have long been very high quality with accurate colors. They also usually come with a built in hub and ability to power a laptop, which is very handy. It allows the monitor to essentially be a dock.
No, it's more that the glare means no bloody anti-glare coating (and hence more sharpness); the colors (saturation etc.) were often reviewed and measured best-in-class, and the same goes for viewing angles, brightness, etc.
Oh, and portrait mode, while nice, it's at best a niche use.
The problem is that Apple almost never lowers their prices unless they come out with a "new improved" model of something. So, as a given product continues to be sold without an update or a price drop, the more steadily outrageous its price point seems. To the point of the original linked article, this is a serious problem for the Mac Pro. It was expensive at introduction, but it's been downright absurd for the last two years. This is an issue across most of the Mac product line currently, though.
This would probably also work with other monitors via MCCS/DDC, but it seems to me that all operating systems just ignore it.
At least that is the reasoning. They can complain to their local Apple Store to make it work.
Most end users have no clue, and they don't enjoy researching, so they don't know they could get something better for less money.
No, as a pro, my reasoning is they are great for photo/video work, and even affordable compared to competitive solutions (I mean at the time, they don't make them anymore, but the 5K iMac screens are excellent).
I just don't understand scrimping on something you will spend all day staring at. Getting something "almost as good" for half the price is a terrible deal.
I have never used an Apple monitor, but I have used several other 4k monitors and they are great. My Dell monitor is just fraught with compatibility issues and needs special software to work with win 7. I use Ubuntu and it barely works there, my coworkers have to treat theirs like special snowflakes.
Why are you still using windows 7?
On Ubuntu the monitor needed much finagling to get working right, unlike my AOC or Asus 4k monitors which both worked when hotplugged using HDMI.
My coworkers, who aren't all devs, have more work to do in windows and they needed the special software.
They also perform incredibly as monitors (or did, when they were newly updated). There's that too.
A thinkpad + ubuntu I think, will hit all your needs. Lenovo preinstalls Ubuntu on some of their thinkpads in some cases (afaict, large enterprises). The thinkpad line IIRC works to use well-supported hardware for Linux. You can check the certification list to be sure: https://certification.ubuntu.com/certification/make/Lenovo/
Dell also ships XPS models that ship with Linux. http://www.dell.com/en-us/shop/productdetails/xps-13-linux
I was on the mac a long time, but eventually switched to Linux and am much happier.
EDIT: Thinkpad P series has 16/32gb of RAM, donno about battery/size/weight, as there are a few models, and you'll have to figure out what your preferences are in trade-offs.
The only front this laptop falls extremely short is with battery life and weight. The P50 is really good if you need the power and I usually get around 5 hours of battery life without Optimus because it interferes with my workflow under Linux.
In 2015 the macbook pro had a 99.5 watt-hour battery (100 watts is the limit to take on airplanes). Now it has 76 watt hours.
So you could get approximately 30% more battery life had they not prioritized thin and light.
For phones, it's an easy product decision, thinner/lighter is always better. You give the most benefit to the most users, and those who really need more battery life can get a battery case and pay the weight/thickness costs alone.
For a MacBook, it's a closer decision, but I think the fact that they still reached the same battery life as the older laptops made the decision reasonable. Remember that it wasn't that long ago that 7 hours of laptop battery life was extraordinary; 10 hours should be plenty for most users. And there are battery packs you can get if it isn't.
I heard that Apple wasn't able to get a custom fitted 85 watt-hour battery ready in time for the release, and I expect they will refresh the MBP lineup with it when it's ready, giving an 11-12 hour battery life.
Then I build some C++ under Xcode and battery life is not so good...
Probably great for web browsing and casual use as used by 98% of users though.
Just because that's what you want does not mean it's what everyone wants. I, for one, would love a solid inch-thick brick of a phone with a replaceable battery, and a 2-inch-thick laptop with sturdy replaceable parts.
Hmm, where'd the downvote button go?
I know one is discouraged from questioning whether a commenter read the article, but I'll point out that Gruber is on about the Mac Pro, which is a desktop machine. I doubt the new one will have "longer battery life".
> most of their pro users use MacBooks and most of the rest use iMacs — and that they have big plans in store for the pro segment of both of those product lines
I think this makes talk about the Macbook Pro on topic.
Lately I even tried elementaryOS, and it's worse than Ubuntu. They keep saying how it's not a copy of OS X, and it evidently isn't as far as user experience is concerned, but on top of that they're obviously inspired by a design that's now completely outdated. At least Ubuntu is looking ahead and thinking of touch interfaces.
Ubuntu is genuinely the only somewhat passable option for people who don't know, nor should have to know, what a process or thread is, or even how many cores are in their CPU. Ubuntu has a somewhat consistent UI but still suffers from all kinds of major bloopers. I mean, what the fuck. It's 2017, and it still doesn't save the last window size and position in most apps. It drives me mad. Some do, some don't, so it ends up worse than not supporting it at all.
I got fed up with Windows and Ubuntu so I bought a five year old Mac Mini. Sierra looks amazing, and it runs silky smooth. I don't AAA game and this will most likely serve me very well for web development. Came with a big SSD drive too.
It's kinda sad nobody can compete with Apple. But if anybody will I don't think it's the "free software" world.
Also here is a Linux joke for you:
If you don't like certain things, just fork it and do your own thing.
Dual monitors should be plug & play. I shouldn't have to add a new repository to apt-get, I shouldn't have to choose between 15 Nouveau drivers and 15 potentially system-breaking nvidia drivers. It should "just work".
Dual monitors, and just display output in general... this is 2017, this is very basic, expected functionality. It doesn't matter how complicated it is to implement - the user doesn't give a shit, they just want two monitors.
Evidently it does matter how complicated it is to implement, or it would just work by now.
I use a desktop (i3), package manager (nix), editor (Emacs), programming language (GHC Haskell), file system (ZFS), bidirectional sync (unison) and security (gnupg) that are collectively far more innovative, powerful and stable than anything Apple have ever produced.
Of course, if you are specifically looking for consumer tech, ease of use and support, then open source probably isn't for you.
If you can look past your smugness a bit, why? Why is it that if I want "ease of use", open source isn't for me? Do you not see the problem here? I can't see how you can argue your favorite projects are more "innovative & stable than anything Apple have ever produced", but in the same breath say that open source isn't for someone who wants ease of use. What exactly does "stable" mean to you?
My point was that there are quality open-source projects out there, after you appeared to assert otherwise. But ease-of-use is not something developers/startups seem to be interested in spending time on.
If you want me to qualify stable, then let's compare Apple's bidirectional iCloud sync to Unison, or Time Machine to ZFS snapshots.
I eventually got fed up with all of this and went to OS X, with Windows alongside, after 15 years of Linux use, and that's from the Red Hat 5.0 and 6.2 days. No, not RHEL: Red Hat.
The "ease of use" argument is sad, and precisely what some forget when developing software: it's there to be easily used, or else nobody will use it. The computer is there to work for YOU, not for YOU to work for it (i.e., spend hours fighting with it).
You only have to look at Windows 8 to see that "ease of use" was abandoned on the Start menu and see what a mess that was.
Very thoughtful point.
Why is that a problem? You're not entitled to anything, easy to use or otherwise.
i3wm, the window manager the post above is talking about, is incredibly complicated, but provides efficiency and a sense of accomplishment when learned. That reward from learning something complicated is where the smugness of most open-source enthusiasts comes from. Don't read too much into it.
Open-source can compete on many fronts and offers many other advantages (i.e. freedom), but on a pure ease-of-use assessment, I do not agree with you that Gnome or KDE could sway the parent, if Ubuntu completely failed to do so.
When you use a proprietary OS, you rely entirely on its creators to create a system that does what you want. When something in Windows or OS X is not what you want (or is broken), you can't do anything about it. When something in a free OS is not what you want, you always have the option to use something else.
Free is a promise, a promise that I do what I say. If the software doesn't, it is for all the world to see my deficiencies.
Free is also about not being an asshole. It is about accepting the fact that, just because the users use my software, I don't get to control their lives.
Or to modify it yourself, or even hire someone to change it to your liking.
Specifically? People typically point to the telemetry and forced updates, but I've managed to disable both, using what were admittedly much-too-difficult procedures or third-party software. It's annoying, but not that annoying.
I really like Finder and Spotlight, but not enough to be tied, via licensing, to any specific hardware.
Less cluttering--I can uninstall apps on windows
Native package manager-- I guess Windows has Chocolatey and I use npm for dev work.
* HiDPI support in all apps, whether they are aware of it or not.
* Built-in PDF editing and creation from all printable content.
* POSIX scripting and CLI.
* Very clean and consistent configuration system (defaults). Reset an app to factory state? Delete one plist file and possibly an app support folder. Got a new Mac? You could even boot it from the old hard disk. Good luck doing that with the Windows registry.
* Touchpad support.
* Systemwide fulltext search with indexing and complex search terms. Somehow MS still hasn't caught up with 10.4 Tiger it seems to me.
* Superb discoverability of power user features with in-app help system and hotkeys displayed in the menu.
* A consistent menu system.
* Powerful and system wide screenshots.
* Very good screen calibration out of the box
* Cmd-C / V work everywhere, including the Terminal.
* Very good terminal with good color schemes, tabs, unicode and even emoji support.
On the other hand Windows has:
* The best keyboard-only UI (although ribbons were a big step backwards in that regard - very hard to discover now)
* The best graphics drivers
* The best Office version (although Google Docs has mostly replaced the need for me)
* Windows-P, I really like that menu
* The Windows 10 task manager, pretty neat.
* Pen and Touchscreen support.
Overall MacOS beats it hands down for me when it comes to productivity.
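The "reset an app to factory" point above is easy to sketch in shell terms. The paths and the bundle id `com.example.MyApp` are hypothetical stand-ins; a real app keeps its state under its own identifier in `~/Library`:

```shell
# Simulate an app's on-disk state: one preferences plist plus an
# Application Support folder (hypothetical paths for illustration).
PREFS="$PWD/Library/Preferences"
SUPPORT="$PWD/Library/Application Support/MyApp"
mkdir -p "$PREFS" "$SUPPORT"
touch "$PREFS/com.example.MyApp.plist"

# A "factory reset" is nothing more than deleting those two things:
rm -f  "$PREFS/com.example.MyApp.plist"
rm -rf "$SUPPORT"
```

No registry entries, no uninstaller, nothing else to hunt down.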
I think most people who haven't used an Apple laptop extensively are not aware that the touchpad actually works, and you do not need to carry an external mouse to use the device comfortably.
The power of gestures is incredible. Managing macOS windows without a touchpad is awful. With a touchpad it is the best.
We have those at my work, and we also have an iMac with a magic trackpad, and several of us have macbook pros.
It's a huge improvement, but it's maybe 75% of the way there. And the points where it's NOT there are very noticeable and annoying. There are still times when the trackpad just gets totally confused and you can't move the cursor for 1-2 seconds. Mehhhhhh
* Drag-and-drop automation, scripting and workflow tools. I love folder actions.
* Built-in screen recording in QuickTime Player, which also works with attached iOS devices
* Time Machine backups. So easy to use and has saved my bacon over and over
* iMovie and Photos. I don't use the rest of the bundled apps, both those two are essentials for me.
Use the Alt key.
For example, open Word and press Alt to show the keyboard commands. If you want the keyboard commands for the Home tab, press H as shown.
Alternatively, to open the Ribbon and show the keyboard commands for the Home tab, press Alt-H.
So, if you want to center some selected text in Word, press Alt, H, then AC
If you want to insert an image, press Alt, N, P and so on.
The Ribbon makes Office programs much easier to use, so you probably won't want to learn many of these key sequences. However, the ones you already know will almost certainly work.
For the record, I much prefer Windows 10. However, the fact is that Apple doesn't sell any of the hardware I use. It doesn't make a proper tower desktop and it doesn't make a small rotating-screen laptop that doubles as a touch tablet.
Even if I was willing to compromise on hardware, less-functional Apple products would cost 2x to 4x more.
Please note that I still consider the Windows version of Office the best nevertheless, but for different reasons.
Of course, there's also personal taste, and you are perfectly entitled to prefer whichever menu system you like. However, the Ribbon won a decade ago, so at this stage, it would probably be more useful to learn how to make better use of it. My Alt tip is just one example.
Windows has WSL which gets better in the new Creator's update.
>* Touchpad support.
I don't use laptops, but I've heard good things about the touchpad in the Surface line, and Dell XPS.
>* Superb discoverability of power user features with in-app help system and hotkeys displayed in the menu.
Windows has had hotkeys in menus for as long as I can remember. I know they were there in 3.1. If you press Alt in Explorer you'll get overlays with hotkeys over the buttons and menus.
>* A consistent menu system.
I'm guessing you mean that the ribbon is inconsistent. Most programs use the regular menus, and the ribbon is just a glorified toolbar with tabs. I don't see the big deal.
>* Powerful and system wide screenshots.
Windows 10 has Win+Print screen to save fullscreen screenshots as a file. For more control, there's the snipping tool that's been included for years now.
>* Cmd-C / V work everywhere, including the Terminal.
Ctrl+C/V works in Windows terminal too.
>* Very good terminal with good color schemes, tabs, unicode and even emoji support.
I usually use ConEmu. I just tested, it does support emoji, but I don't see the point. I just tried "mkdir " (edit: seems like HN eats my emoji, but that's supposed to be a directory with an emoji in the name), and it worked as expected. If I use the built in terminal in VSCode it even looks nice, with colors, but it's just two blank rectangles in both cmd and powershell.
The built in terminal supports 24-bit color now though: https://blogs.msdn.microsoft.com/commandline/2016/09/22/24-b...
If you mean Command Prompt by "Windows Terminal", this is not the case (if not, I'd really like to know what "Windows Terminal" is - I use Command Prompt for DOS/Windows things and Kitty for *nix related things). At least I have had to turn it on explicitly in the Command Prompt options (quick edit mode), and it's one of the first things I do on a Windows machine after setup.
The Alt key in Windows does nowhere near what Help on Macs does. You're looking for a command, or forgot where it was in the menu, or want some documentation: open up Help, type a query in the unified search, and you're presented with docs as well as commands. Highlighting a command shows you its full path in the menu by opening it up, and now you can even see the hotkey. That's what I mean by discoverability. Every programmer who uses GUIs should have a look at how that works; I consider it mandatory homework.
Obviously Windows has screenshots, but you overlooked the word powerful. macOS has all the features of the Snipping Tool right there on system-wide hotkeys, no need to open up an app first, including delayed shots and area selections. It's not a big deal, but it saves enough time that it's a total no-brainer for me to provide screenshots for whatever question someone has (even when it's just a distraction from my actual task), while on Windows it takes a crucial 10-15 seconds longer per shot and would disrupt my workflow.
> WSL / Terminal
I do acknowledge that things are getting better there and this is a great development - if/when it gets there I'll consider windows among my primary PC choices again.
Btw. it's telling that Windows-only users always overlook my point about the registry in these discussions. I use all three desktops and I can tell you, not having a central registry in an OS is a huge productivity win. With Windows I spend days every two years getting to a fresh state again, while on the Mac I can just copy over the file system (using automated tools that work over Thunderbolt cables, copy half a TB in 30 minutes, and are again built in) and start working after a coffee break. Yes, there's imaging in Windows, but then you have to regularly keep those images up to date, and in the end you spend even more time if you only manage a handful of PCs.
And I say this as Mac user.
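The "just copy the file system" migration described above works because macOS app state is plain files. Here's a toy sketch: the fabricated "old home", the paths, and the bundle id are all hypothetical, and Apple's Migration Assistant of course does far more than this.

```shell
# Fabricate a tiny "old Mac" home directory with per-app state:
SRC="$PWD/old_home"
DEST="$PWD/new_home"
mkdir -p "$SRC/Library/Preferences" "$SRC/Library/Application Support/MyApp"
echo 'theme=dark' > "$SRC/Library/Preferences/com.example.MyApp.plist"

# Migration is a single recursive copy; settings travel with the files,
# and there is no central registry to rebuild on the destination.
mkdir -p "$DEST"
cp -a "$SRC/Library" "$DEST/"
```

After the copy, the "new machine" has every app's preferences in place without any per-app reinstall or re-registration step.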
Btw. in what way is NTFS ahead of APFS?
The MFT is very helpful in that it is on the disk itself, as opposed to Apple's solution of getting a separate utility to index the disk (Spotlight) and generating a giant database file on the filesystem in your hidden Spotlight-V100 directory. Spotlight is on the filesystem, not IN the filesystem.
The NTFS page on Wikipedia lists all of the wonderful features of NTFS. Also Windows Internals 6 details some in wonderous detail.
In any case, APFS is far better than HFS+ (which has to flip all metadata's endianness as it is stored in big-endian format). It also has single-threaded access to this metadata, from what I recall. John Siracusa's review 6 years ago of OSX 10.7 Lion detailed the poor state of HFS+: https://arstechnica.com/apple/2011/07/mac-os-x-10-7/12/
Makes for sad reading.
To be fair, the built-in Terminal.app sucks; you have to get iTerm2, which is a third-party app, but at least it's not another $40 replacement for an app that shouldn't suck by default (the same cannot be said for Finder and its replacements).
That said, almost identical behaviour can be had in fullscreen with two Terminal windows sharing a full screen desktop space.
I also enjoy having my terminal be a slightly-transparent black rectangular slate, with no title bars, corner curving, etc., but that's purely preference and has no impact on functionality.
> Specifically? People typically point to the telemetry and forced updates, but I've managed to disable both, using what were admittedly much-too-difficult procedures or third-party software. It's annoying, but not that annoying.
Before I switched to macOS about 8 years ago I was used to the mindset of "oh this doesn't work the way I want but I'm smart enough to figure out how to fix it" and I took great pride in being able to fix my PC no matter what happened. I'd dig into the registry, I'd futz with inf files and drivers. I didn't mind it too much, it wasn't "that annoying". And then I switched to a MBP my freshman year of college, mainly because it meant I could use Windows/Linux/macOS, and I loved how solid they felt and had people around me rave about the hardware and longevity of the machines themselves. I played with macOS and found that after getting used to it, it was a joy to work with. It took a little longer for me to realize that I wasn't spending all my time making sure my computer kept working; it just worked on its own. The OS that I had belittled and mocked for years for being "a toy" or "dumbed down" actually was insanely powerful under the hood, extremely intuitive, and looked beautiful. That last point may sound stupid, I know I used to think it was, but it's a big deal. You are going to be staring at this for 8+ hours a day. Trust me, it is way more enjoyable to look at something pretty than something not. When it comes down to it, for me macOS is built on a rock-solid core and makes switching between it and the Linux servers I work on a breeze, with beautiful apps and a beautiful UI, all of which JustWorks (tm). For that I am more than willing to pay the MacTax (tm).
Of course, under the hood they've rewritten everything like DNS and actually using the hosts file etc. but from a usability perspective you are right that you end up fighting with the OS less (so I have found).
I dunno, I've done all kinds of stuff to my windows 10 machine to keep Candy Crush Soda from reinstalling itself. Yet every few weeks/months that King garbage ends up on my start menu.
I use windows 10 because I'm a .NET developer and a gamer. If I could use OSX on my desktop instead, and have access to the same steam games - I can't think of a reason I would stay.
> Less cluttering--I can uninstall apps on windows
On OSX you typically don't need to 'uninstall' just drag to the trash. Yeah, some apps leave some garbage behind - but the same happens on windows when you 'uninstall'
> Native package manager-- I guess Windows has Chocolatey and I use npm for dev work.
I think the biggest detriment on this point for windows is that the command line interface on windows is not friendly. I recently worked on a project where half the team didn't know powershell, and the other half really loved powershell. We also used some stuff from the node ecosystem. We had scripts that would only run in cmd.exe, powershell scripts, and scripts that only worked in bash. In the end http://cmder.net/ saved my butt, since I could have all three shells open.
Another anecdote, I right clicked and clicked "unpin from start" and never saw it again.
It bothers me. It doesn't bother you. That's fine.
Microsoft lost points with me, doesn't mean it has to affect you in any way.
Besides, I thought only the Anniversary Update had CC? How would it get reinstalled?
Most of these things take up a trivial amount of disk space -- some of them are just placeholders -- so they're not worth the time taken to worry about them.
You never explicitly asked for it to be added there, did you?
What about the pernicious inclusion of Edie Brickell's Good Times in Windows 95? ;-) https://www.youtube.com/watch?v=iqL1BLzn3qc
Candy Crush Saga was actually one of the benefits of Windows 10 that Microsoft promoted, for two reasons: (1) Candy Crush was enormously popular, and (2) it provides practice in touch operations, just as Solitaire got people used to using a mouse.
The Good Times video in Windows 95 also sold Edie a lot of CDs, but it would be equally stupid to criticize that. In both cases, Microsoft is providing something entertaining that demonstrates some benefit of the operating system.
If you don't see this as part of a bigger trend I don't think I can help you do so.
"Solitaire. Hearts. Minesweeper. These are games that have been played millions of times over the years in Windows. And they are coming back in Windows 10. If you’re a Windows Insider, you can check out a preview of the new Microsoft Solitaire Collection that’s included in the latest build of the Windows 10 Insider Preview (Build 10074). In addition to these games, we’re also working with partners to bring some of their great games to Windows 10 too. And we’re excited to be able to announce today that King will bring their game, Candy Crush Saga, to Windows 10. Candy Crush Saga will be automatically installed for customers that upgrade to or download Windows 10 during the launch! Over time, other popular and awesome King game titles will be available for Windows 10. Ever since Candy Crush Saga arrived for Windows Phone, I’ve spent countless hours of fun matching candies. I’m really looking to playing Candy Crush Saga and King’s other game titles on Windows 10."
For you, maybe. For me, I have zero intention of going through the hassle when I can just use Linux instead. Any hassles here are easily learned and generally don't _require_ the use of third party stuff doing black magic under the hood.
Chocolatey is nowhere near as good as e.g. pacman.
You mentioned disabling telemetry, but the fact that the OS you paid for spies on you and shows ads is a big turn off.
Your concerns sound like they come from many years ago and that maybe you haven't taken a look lately.
Need to SSH/SFTP? Download.
Need to edit code? Download.
A reasonable terminal? Download.
Yes, I might have to go to the Mac App Store and get command line tools for C, C++, Objective-C, etc. It is unfortunate that XQuartz is no longer installed by default, granted. (PowerShell is very nice to work with though, I give MS full credit for that.)
I could live with having to download Python, Perl, etc., but someone please explain why the absolutely fundamental ability to edit text, securely connect to work, and transfer data to and from remote machines is something any OS should ship without?
While that is true, it's a bit of a moot point IMO. I have yet to see an OS which had everything I needed for actual work installed out of the box. As such, the way to deal with this is to have a script download/install everything for you (and eventually copy or link all configuration files). Using PowerShell, getting any of the examples you mention is basically a one-liner, just like with other package management tools: something like `Install-Package openssh, miniconda, strawberryperl, conemu`. Or you can go more of a DSC route with PowerShell DSC.
And starting with Windows 10 a metro app called 'Code Writer' seems to be installed by default for coding. (At least it's there in my installation.) I didn't try it though.
It still befuddles me why some of this basic functionality cannot be a standard part of Windows. Not every Mac user uses emacs, or even knows how to open the Terminal app, but the fact that basic tools are supported means that there is just a baseline level of infrastructure to work with.
The ability to edit text _does_ ship with Windows: Notepad (not that I'd recommend it, but it does exist). Also, OS X doesn't ship with anything better.
FTP isn't a fundamental requirement for everyone, I don't have an FTP client installed on my machine, and don't have any intention of installing one.
You are correct, I would never suggest anyone have an FTP client; I have that service turned off on every machine I administer. SSH / SFTP are fundamental tools however.
They are provided out of the box (and newer versions than OSX ships with) by WSL. As is vim, emacs, apt, etc.
Pro-tip. Pin PowerShell to your taskbar and drag all the way left. Now you can open it with win+1.
Edit: now with no mouse required!
Edit: for reference, Home also doesn't come with BitLocker (!?), domain join, Group Policy, client Hyper-V, and others. Don't buy Home to do work.
> I would never suggest anyone have an FTP client; I have that service turned off on every machine I administer. SSH / SFTP are fundamental tools however.
Pedantry at its finest.
I have neither ssh nor sftp on my workstation and I have no need for either.
Windows also ships with WordPad, which is a simple word processor.
I'm pretty sure you didn't try PowerShell. It is at least as good as bash, if not better.
> you need to use the mouse too much
I find this more problematic on macOS. I was forced to use a mouse much more on macOS. It doesn't even allow me to answer a yes/no dialog using only the keyboard. On Windows nearly every item has a shortcut, sometimes even better than GNOME (but worse than KDE, IMO).
Who needs ls when you have Get-ChildItem?
Who needs grep -r 'pattern' when you have Select-String -Path c:\ -Pattern pattern?
Now that I think about it, Powershell has a conceptual similarity to Applescript. A proprietary, verbose, and hard-to-discover English-like syntax belying a great amount of power over the target platform.
cmd+first letter of the dialog option.
I actually find OS X far superior with shortcuts than Windows: More discoverable, actually configurable and more consistent.
One of the only things I miss now I've switched to Win.
In Windows the shortcuts have always seemed less logical to me, though the newer Windows key ones are an improvement.
After some googling just now, it seems like there's now an Invoke-WebRequest command, but that too looks like a hell of a lot of typing compared to Linux shells.
There's also the fact that lots of stuff in windows wasn't designed to be accessible through the cli. You can't, for example, make a powershell script to toggle an audio output device on and off, as that's only available through the GUI.
I blame the documentation. Not the official documentation (they're not excellent, but OK), but various outdated resources residing in many blogs and sites, including Stack Overflow. I mean, what's wrong with
iwr REMOTE_URL -o LOCAL_FILENAME
> There's also the fact that lots of stuff in windows wasn't designed to be accessible through the cli. You can't, for example, make a powershell script to toggle an audio output device on and off
This is just one example. On the other hand, I found nearly every thing I had done using GUI could be replaced by a few lines of PowerShell code. Actually automating my day-to-day GUI operations was my way to learn PowerShell, and mostly it worked great. Microsoft is adding tons of commands each release to expose more system functionalities. There are exceptions of course, but "lots of stuff" is a bit exaggerated.
Toggles on highlighting for all UI elements so you can Tab / Shift-Tab through them. It's usually the first thing I do when I touch a new Mac.
Even if you don't, learning it is a fun experience. It's like learning Haskell just to feel another way to program, even though you're not using it in practice.
The biggest annoyance was configuring the system to not get in my way: never ever put something in front of the app I'm working in, forcing a change of context; don't use up all of the bandwidth when downloading updates in the background; etc. This was much less of a hassle the last time I used OS X. Apple might give you fewer options for configuration, but at least out of the box it is/was much less disruptive.
Overall I like the look and feel of windows 10, and some of the problems were caused by third party software. I do think a lot of the defaults actually make sense for the mass market, but they should be much more transparent and easier to change.
For example, I expected disabling Cortana during installation would, well, actually disable it.
Just the fact that I can download some new software and still see a 2001 era file dialog or such (because it was made with an older, but still supported ancient UI lib), is enough to put me off windows for life...
Best look for a cheese-grater Mac as a workhorse. The Xeons may be old, but they're not to be sniffed at.
I can understand lighter, but thinner? Why do you wish for it to be thinner? I consider it to be thin enough.
I'm genuinely curious - why do you need 64 GB of RAM on a laptop? Are there industries where this is a necessity? At that point, wouldn't you be better off having remote machines?
I see this question often, and I totally don't get it. I do use 64GB for work, and in light of this news that no new Mac Pro will arrive any time soon, I'm considering bumping that to 128GB. But I don't do high-frequency corporate hegemony work, or video editing, or any of that — I'm just a programmer.
Moreover, you could delete IDEA and Xcode and my 5 browsers and 7 text editors and my 30 terminal windows and git client and all the other work-related stuff open right now, and I'd still easily use 64GB just fucking around.
I wonder: how is it that people don't use 64GB of RAM? Do they reboot their machines every week? Do they fastidiously quit applications even though they'll probably use the app again within a few days? Are they just all like, "modern memory-swapping technology is so awesome compared to 1990s System 7 'Virtual Memory' that I love to watch it work, even though things run an order of magnitude slower in many critical sections"?
I really don't get it. Terabytes of RAM? Yeah, that might be hard to make use of today. But 64GB is definitely not too much, not for me and probably not for you, or even for your mom.
I don't have 64GB in my laptop, but only because I can't and have to settle for 16GB (unless I switch to a different OS, which is on balance a worse tradeoff currently, and yeah yeah I'm rooting for Linux but come on, one can only maintain hope for a couple decades and that mark is fast approaching...)
Cheap RAM is one of the things that keeps hope alive in this increasingly degenerate era.
It's an old habit of mine that stems from spending most of my pre-adult life with old equipment, and using others' equipment with <1gb ram running Norton antivirus.... I'm so glad those days are over, but I haven't completely recovered. That being said, I do occasionally just leave everything running for as long as I can stand to. My system can never tell the difference.
As for using less than 64gb of ram... Ha! I have never had more than 8gb of ram! I have never needed more than 8gb of ram! 8gb is a lot of memory! 64gb? Are you kidding me? I've considered several times over the past few years getting another 8gb (for 16gb total), and ended up realizing I would never use it.
I used to keep AWS GPU nodes for that, but those are expensive, and it is infinitely more convenient to keep your data close to your workspace, instead of constantly uploading/downloading work batches.
But back to your 32/64GB question. I need it to run my Docker containers. I do trading and stuff.
How about this? For when you want to run VMs that are non-trivial.
I bought this Samsung Chromebook 2 to test out a 13.3" laptop size and the keys were close to being "counter sunk" into the frame (not really but they were noticeably low) and seeing one of the new Macbook Pro's in person at a store I was like "WTF is up with those keys."
Yeah, I love the software optimization behind the long battery life on Apple's part, but the whole "gaming on Windows, and software like CAD/SolidWorks on Windows" thing, I don't know. As I said, I run three operating systems, with Linux as my primary OS to develop on. I'm not looking forward to learning Apple's OS; at least Visual Studio Code is on there.
I'm kind of dumb though in some ways, I keep thinking "If I have a computer like a Macbook Air I can develop on the go" but I'm most productive on my desk, two monitor, desktop. But I want to get setup for a moving/traveling digital nomad lifestyle so one device and barely any possessions would be great, particularly in the event of theft my device is outdated/protected enough that it will destroy itself and I can replace it relatively easily... but that's far from my current situation in life as a mere peasant.
This autobiography brought to you by, schizophrenia, you gotta love it.
Just warez or what?
I mean visual representations: something like SketchUp is great, free, and easy to use. My friend has a 3D printer and can produce STL files. I've used SolidWorks before. Yeah, I don't know, I'm just rambling, excuse me.
I've heard of Warez.
I'm also aware of free options like FreeCAD which I loaded on Ubuntu, pretty cool.
That might be the lock-in talking, Apple would have to screw the MacBook line over as badly as they did the Mac Pro line to get me to switch.
Apple hardware is ok, except for the fact that it is insanely overpriced.
> Being that the quality is superior
Exactly how? Honest question.
The keyboards are excellent, better than the norm. I consider ThinkPad keyboards to be outstanding, slightly superior to Apple's; but Apple keyboards are still way better than the average laptop's, the second-best laptop keyboards out there. I haven't used the latest Lenovo ThinkPads, and I've read that their laptops have degraded over the years, so maybe Apple's is now the best. All I know is that it does its job without any fuss, and that's very important to me. The backlighting could get dimmer in low-light conditions, but I understand that Apple fixed that in the newest model.
Moving on, the batteries are outstanding. Apple manages to get everything right. Charging works well, though the chargers themselves are sub-par due to Apple's ill-considered decision not to use strain relief. I never have to worry about whether I'm charging my laptop too much or not enough; Apple builds all those decisions into the circuitry of the charging system. It's one more thing, like the trackpad, that I don't have to worry about when I use Apple and that I always miss when I start using other kinds of laptops.
There's the screen. I'm sure there are similar-quality screens out there, but Apple's is outstanding. With f.lux, I can use it in lighting conditions ranging from very dark all the way to just shy of direct sunlight on a bright day.
There's the solidity of the aluminum construction. I wouldn't exactly say I'm careless with my laptops, but I don't use a case and I bring them to the bar. With the lid closed, it's practically impervious to spills; I've relied on this more than once. When I did spill liquid on it with the lid open, all that needed to be replaced was the keyboard and trackpad; the mainboard wasn't damaged.
Finally, there's Apple's support ecosystem. I've never left the Apple Store unsatisfied. Their reps are helpful and knowledgeable in a way that you really miss when you stray outside the ecosystem. One time I didn't want to wait for Apple after I spilled water on my laptop on a Sunday, so I took it to a Micro Center that was listed on Apple's website. The difference in professionalism was night and day: Micro Center made me fill out paper forms and mis-transcribed my phone number, so I never got any notifications.
Literally everything about Apple's laptops is a cut above in terms of quality, and some things, like the trackpad and support, are spectacularly so. Other companies can get close to Apple on a few things, but only Apple consistently does everything right; you're always going to be missing something if you go elsewhere. Apple hardware looks overpriced compared to a run-of-the-mill machine, but at the high end of the laptop market, prices all look very similar. When you're actually comparing apples to Apples (see what I did there?), against similar-quality laptops like the ThinkPad X1 Carbon, Apple comes out only slightly more expensive. I consider the premium very much worth it. I can see how a more price-sensitive customer could find it expensive, but to me that's like comparing Toyotas and Hondas to Mercedes and BMWs.
If there's one area where I like Windows and Windows applications a lot more than macOS, it's support for keyboard shortcuts (all the Alt+ and Ctrl+ combinations). Both the apps and the OS on macOS are severely lacking in keyboard shortcut support and depend more on a mouse or trackpad. I know I can define my own shortcuts for application/system menu items in macOS easily, but that's a big chore.
All the praise of macOS aside, I avoid Apple's own apps for anything where I need longer-term availability. Take iWork, for example: I never use it for anything that's not a throwaway project. Apple could, at any point, decide that it's not worth it, junk support for all the files you've created (in its proprietary format), and start afresh. So I use LibreOffice for all my longer-term spreadsheet needs, Thunderbird for mail, and so on. The shelf life of Apple's own applications, with their proprietary formats and cryptic file organization, is much shorter than that of FOSS offerings or even Microsoft's. Apple's consumer-side iLife apps have also hit some people hard in the past with data corruption and data loss (IIRC, iPhoto was notorious for that). Since I don't upgrade my hardware every few years, these factors hit me harder on the Mac side.
Linux is a lot more time consuming to manage for me (even though I'm fairly tech savvy). But a good combination of UI (looks, readability, fonts), usability along with perfect hardware support would be a great thing to have.
In my experience, Linux has this. Especially fonts. IMHO, fonts in Linux are slightly better than OS X, and worlds ahead of Windows. As far as UI, it depends what you want. There are a lot of options, and some of them look fantastic, some of them are very usable, and some fall into both categories.
Well, you seem to have isolated yourself from counterargument. I was given a 2016 MacBook Pro from work; my normal dev workstation is Ubuntu 16.10. I truly do prefer Ubuntu to OSX. I like the customisation, and even the default look and feel of Ubuntu just works better for me.
Also, as a developer doing mostly Haskell and Python, derping with Elm, and rendering documents with Pandoc + LaTeX, all of my tools are perfectly at home on Linux. Everything works with OSX too, but it is usually a bit easier to get things working on Linux. I prefer apt to brew. I don't know what else there is to say, but here is a counterargument: if OSX and Ubuntu were both for-pay products and cost the same, I would pick Ubuntu.
Oh, and the XPS is better looking and less hypey than the touchbar MacBook.
I really suggest you give the Fedora livecd a try.
>- Longer Battery Life (5+ hours)
>- 32/64GB RAM
These goals are diametrically opposed.
I disagree. I moved from a MBP to a notebook with Arch and i3. The UX of a tiling window manager such as i3 is in my opinion much better and more efficient than the floating wm on OSX. There is no real tiling wm for OSX (don't mention divvy here, it's a nice tool but far away from a real tiling wm).
The vast majority of the world disagrees. The only reason anybody uses OS X is because it's Unix. Other than that, the UI is atrocious and severely lacking. It doesn't even come close to the robust utility that Windows offers.
Someone who is processing lots of video or audio for money may spend most of their week at a desk in one or two apps and the faster the throughput of their machines the easier it is to meet client deadlines. And hardware starts to matter a lot and throwing hardware at the problem is often a good idea...the logic of rendering farms is the same as server farms.
Not any more. Mac OS looks and feels very antiquated compared to Windows today.
As a developer, Windows has caught up to MacOS for the shells and beats it in pretty much all the other areas (UI of the OS, keyboard shortcuts everywhere, great file manager(s), much more productivity tools available, etc...).
And of course, Windows laptops are anywhere between 1/3rd to half the price of the equivalent Mac laptop.
These days I use both Mac and Windows to develop, but once I can no longer use my MacBook Pro, I won't be getting another Apple laptop; it's going to be Windows all the way for a few years until Apple catches up again, if they ever do.
It seems silly, but to me two big annoyances are
1. I can't figure out how to easily open the bash/Ubuntu thing where I need it. I'd love to have a dedicated "github stuff" folder that it opens into by default (and can edit). I always seem to put stuff where bash-on-Ubuntu isn't allowed to touch, or I'm manually cd'ing around until I find that weird place where the C drive is mounted. I mean, yeah, I can google it, but I'd have to do it every time.
2. Pardon my french but it's fuckugly. The colors, the fonts, lack of transparency. Copying/pasting/etc all suck. How can I make this less sucky?
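For the first annoyance, a small sketch that may help, assuming Windows Subsystem for Linux, where the Windows drives show up under /mnt (the folder path here is a hypothetical example):

```shell
# Sketch for a ~/.bashrc addition, assuming WSL mounts the Windows
# C: drive at /mnt/c. The path below is a hypothetical example;
# point it at wherever your "github stuff" folder actually lives.
if [ -d "/mnt/c/Users/you/github" ]; then
    cd "/mnt/c/Users/you/github"
fi
```

With something like that in place, every new shell starts in that folder instead of the Linux home directory, and files created there are visible to both Windows and the Ubuntu userspace.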
The main takeaways:
- I share all my dot files (.bash_profile, etc...) between Windows and Mac OS. They are in a Google Drive folder and whenever I move to a new machine, I just copy them all verbatim and they work right away.
- Bash, git and ssh work out of the box on these Windows/UNIX shells.
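A possible refinement on the copy-them-verbatim approach, sketched below: symlink the dotfiles out of the synced folder instead of copying, so edits made on any machine flow back automatically. `link_dotfiles` and both paths are hypothetical names for illustration.

```shell
#!/bin/sh
# Sketch: symlink every dotfile from a synced folder into a target
# directory (normally $HOME), so there is one canonical copy.
link_dotfiles() {
    sync_dir=$1
    target_dir=$2
    # Glob matches entries starting with a dot, excluding . and ..
    for f in "$sync_dir"/.[!.]*; do
        [ -f "$f" ] || continue   # skip directories and non-matches
        ln -sf "$f" "$target_dir/$(basename "$f")"
    done
}

# Example usage (paths are hypothetical):
# link_dotfiles "$HOME/GoogleDrive/dotfiles" "$HOME"
```

The trade-off versus plain copying is that a broken sync folder breaks your live configs too, so some people prefer copying plus a periodic diff.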
I agree with the colors, fonts and the copy/paste interaction on Windows. Not great, but tolerable. I want to experiment with more consoles since there are so many to choose from on Windows.
Oh and Cmder has transparency and you can configure copy/paste to be by line instead of by block, at least.
OOOOooooh this is fucking smart. How have I not thought of this?!
Mostly I use VS Code and open it from where I am in Explorer. It has its own terminal that opens in the current directory, which I change to use git-bash -l, adding in the git prompt script(s). Overall it works very well; IMHO better than Bash on Windows (Ubuntu userspace).
2. Colors are better supported in the Creators Update coming out in 3 weeks, and transparency and copy/paste can be set up via the Properties menu (right-click on the logo in the command prompt title bar and go to Properties).
No kidding. cmd has not meaningfully changed since its original version. It just sucks. No matter how nice the shells are.
No, you're not the only one. I can't stand non-OS X anymore.
The only thing that bothers me is the lack of window-based Alt+tab (Cmd + ~ and Cmd + tab are not equivalent).
> And please, don't tell me Ubuntu or other linux flavors. They look good (and are good if you are programming on them) but the UX is still lacking a lot.
There certainly are some specific areas, like 4K support, which are still being worked on (and which is in fact worse on Windows), but other than that, I cannot see what macOS offers over a modern GNOME desktop, UX-wise. Can you offer specifics?
On my MBP, I always feel constrained. Any heavier compilation slows it to a crawl and sends the temps to the very edge of what the CPU can handle; there's not even a point in having a CPU that can turbo up to 4GHz, since I never, ever saw it happen under macOS. Finder is a joke (Path Finder is good, but you can get there with built-in file managers on Linux), and Xcode is a joke of an IDE: stuff like syntax highlighting and autocompletion randomly disappears every couple of minutes, etc.
My main problem with macOS and the hardware it runs on is that even on a top quad-core model, you never really feel like "this thing has power to spare", and feeling constrained is the last thing I want when working on a computer that costs over 2k.
> god forbid you have a problem (especially a hardware problem) and then try to debug it. Good luck searching online for a resolution.
I don't think that's really true anymore; places like the Arch wiki and forums are a sure way to get almost anything resolved. On the other hand, try having a hardware issue (I've personally experienced WiFi drops, dead pixels, and GPU glitches) on macOS: nobody even tries to resolve it themselves, you're just told to ship it back to Apple.
> Never mind the confusion of the different flavors, packaging systems, and configurations
Not really a problem in practice (though a very tired talking point); you just stick to your distro's 'ecosystem' and be done with it.
> Anyone figuring out the Linux/Laptop problem is re-inventing the Macbook Pro/OS X.
There are surely people doing just that, but I don't think that's the majority.
Linux with GNOME just 'clicks' better with my workflow, the system feels a lot snappier, the package manager is in charge of every update and systemd is powerful and easy to use, so I always have a very clear picture of the services running on my system and their health.
Additionally, my Linux laptop is approximately 3x as powerful as the MBP, for less money.
The only thing I really wish my Linux laptop had was the MBP's superior build quality, but things like the XPS line are catching up fast, so there's hope...
I don't think the built-in apps count as clutter. If you don't use them, what's cluttering about them? I'd rather have them than not.
Things Mac OS is missing compared to Windows:
- Stealthy updates
There may be more.
How can plaintext configs be confusing for a competent software engineer?
On Linux, you can make your OS work for you just the way you like it.
Good luck changing anything significant in Apple's walled garden
Nowadays I'm on a Xubuntu box, and I've managed to have zero system-wide configuration. I've made a metapackage that depends on the software I use (over the years I've trimmed that down to a couple dozen packages) and installs a package repo for itself. Until recently I was running a FreeBSD box where all the desktop setup was my own (VTWM, dunst, many little programs; I ran Arch for a couple of years before that). That's nice, but it becomes baggage quickly: so many points of failure, and only me to maintain it. Now that I'm back on (X)Ubuntu, all I have to do is configure my shell, git, mercurial, and my emacs.d. It isn't anywhere near as pleasurable as running BSD or Arch was, but the minutiae are tiresome. I'd rather keep configuration as small as possible and focus on my actual work and indispensable tools (Emacs, VCS, shell, in descending order) instead.
The usual Unix level of customisation is good on servers, but for daily use it's hard, and the configuration files, with their lack of conventions, make it harder (one has to know tens of dialects and languages).
In most cases, you don't even need to customize much.
I was arguing about the ability to customize that's just not present in OSX and W10.
Knowing that you can always change things is liberating.
I do believe that people go overboard sometimes, but that's not the fault of plaintext configs at all.
I've been running XFCE on Arch with just a handful of visual tweaks for 3 years now. For the most part, everything just works.
But currently, on the French layout, you can't even type the pipe character.
A Macbook dual booted with a normal keyboard would be my dream machine I think.
The US-International input is very simple: to do an à you do ` then a, to do a ç you do ' then c. This input mode is however quite annoying when writing code, so I switch between US and US-International input mode depending on what I'm doing.
The normal US keyboard has alt-e (´), alt-`(`), alt-u (¨), alt-i (ˆ), alt-c (ç), alt-n (˜) for the combinable version of accents.
I agree Canonical should have first-party laptops and demand quality.
But Apple not having an open package management ecosystem, where people can manage/create their own repositories, kills it for me.
It's not from Apple, but macOS has MacPorts and Homebrew for this, just like Windows has Chocolatey.
Unity makes the Linux desktop usable now, but I still wouldn't call it slick. Even if it leaned more on KDE instead of Gnome it still doesn't feel right compared to either Mac OS or Windows.
Going off on a tangent, I still wish Apple would release a screenless iMac. Not that many people want or need a screenless MacBook Pro (i.e. the Mac Mini: the 16 GB RAM limit is annoying, and Thunderbolt is lacking compared to plain SATA), and not that many people can justify or afford a modern Mac Pro. If Apple wanted to gauge the market demand for such a machine, they should try tracking sales of used old-gen Mac Pros on eBay.
The package manager only manages a subset of user-installed software.
I honestly don't care much, since I don't even use OS X anymore, but I wanted to clarify this as one of many reasons why.
The screen keeps feeling smaller every year, but I cannot complain about the performance of this nearly 7-year-old computer.
But it was a real dog with a hard disk. I shoved in a 1TB SSD and all is good (and that's SATA3, not NVMe). It should continue working for another 5 years, no problem.
Edit: and that doesn't include the resale price. I give my old hardware to my mother, so I take that out of the equation.
Does that $10K hardware save you 30 minutes/day over what you'd see with $3K worth of hardware (or keeping your $10K hardware for another year)?
For some, the answer is clearly yes, but for others, maybe not.
I second this. Just take a look at prices on Ebay for old metal Mac Pros.
You can build a much more powerful computer and put OSX on it.
A lot of the other things you list are in direct conflict with thinner/lighter, and Apple's obsession with thinness is probably why you don't have them.
I feel like they could keep things the way they are, and offer an AOSP-like vanilla version for people who generally know what they're doing (Google does a pretty good job with limiting it).
I'd pay hundreds of dollars more for this software option (I've previously considered hiring someone to do it on a new PC, but I've had bad experiences with PC repair shops).
I bought my spouse a new PC and literally had to spend HOURS removing software and decoupling McAfee from Windows. An i5-based system was crippled out of the box while it downloaded updates and software from the Windows Store that I didn't even want to begin with.
OSX is markedly better, but I don't need siri, icloud, chess, ilife, dvd player, photo booth... the list goes on.
(The only exception might be Garage Band with its huge sound files, but you can simply find and delete those.)
OS X is a bloated version of BSD. Why run that when you can use Linux for free? Install the software you use and nothing else; your package manager will keep all of it up to date for you. You won't be running an absurd antivirus program all the time...
Before anyone even starts — GIMP is not viable in enterprise workflows.
I understand completely. GIMP has never been very good (though quite usable in many cases). GTK (Gimp ToolKit) is a nice library, though. Krita is much better, but is very focused on painting.
At least you can run a real OS whenever you aren't using that specific software. Any reasonable Linux distro will use <10 GB including all the software you really use, and will have a nice automated installer to shrink your Windows/Mac partition and install in the empty space.
Sure it's not ideal to have to reboot, but with solid state drives, rebooting isn't very much hassle anymore.
I spend much of my OS X time in terminal.
This is excellent news, I'm chugging along on a 2010 Mac Pro and was very disappointed when the Apple Displays were cancelled last year. They are a staple of the lineup and always look gorgeous compared to what's on the market. I will definitely be buying the new Mac Pro and 2 displays to go with it. There is absolutely no way I will ever use Windows and having to downscale to an iMac or, worse, a MacBook Pro, when my Mac Pro is finally too old was filling me with dread. By the sounds of things, they are working on making it modular and expandable, also very good news as I like to keep my workhorse computer for a long time.
Overall very excited to see what Apple announce next year!
Deciding to use Macs for your work is not only about the devices themselves -- you're buying into the whole Apple ecosystem. That's not something that's easy to change in a heartbeat according to which vendor currently has the best offering. And the more concerned you are about the future of Apple's products for professionals, the more tempting it is to look long and hard at Windows or Linux, even if Apple's current devices fill your requirements just fine.
Have you looked at what's on the market? There are many great displays out there now, all far better than Apple's offerings. Apple's displays were good for about a year before the competition surpassed them.
There is a third choice: Linux. I've been using it as my desktop for 18 years now, and it really is wonderful. With a tiling window manager (no GNOME, no KDE, just X11) the UI gets out of my way and lets me work.
Yeah, I do kinda sorta wish that there were lighter or prettier laptops around, but on Linux I can get work done: on macOS or Windows my productivity is severely hampered.
Look, I use Apple products all day, damn near every computing device I have is made by them. It's just astounding how they've let this languish. Very bad.
The days of performance doubling every 18 months are completely over. Now it's more about power efficiency and more cores rather than peak single-thread performance.
I don't think the tower will be that hard to design. I think the issue will be in making a new motherboard. Or do you think they'll just use one of Intel's designs?
This is something many comments nowadays capitalise on. First, I'd rather read criticism than praise, to see for myself whether all the good in something is outweighed by the criticism or not. Second, your statement is false: there are as many positive comments as negative ones. If the negative ones are higher up, that means people upvote them, i.e. agree with those negative comments, which is a right they're entitled to.
Just a few months ago, I spent somewhere around $4500 (all-in) putting together a new workstation. It runs Linux instead of OS X and this has led to me using my (4-year-old) ThinkPad more than my (18-month-old) MacBook Pro (when I'm "on the go"). I actually plan on selling the MBP; I just haven't gotten around to it yet.
I'm sure this is great news -- and long-awaited -- to many people... but some of us got tired of waiting.
I would have been willing to spend several thousand £ on a Mac Pro that met my needs. Instead, my needs ended up being met by a PC with a GTX 1080 that cost less than the baseline iMac, and within 15 minutes of 3D rendering using CUDA and Octane Render I asked myself why I didn't do this years ago. My wallet thanks me.
Now I don't really see the point in going back to Apple, feels like they've only just decided to care about pro users again and this was probably a decision that came out of a meeting where it was 50/50 if the line was killed or rebooted.
I really don't understand how there can be vim users that haven't remapped escape.
Like, if you care about ergonomics enough to complain about making a rarely used key into a touch key, how can you not care enough to remap that key if you use it more often?
And Mac OS even has built-in support to remap caps-lock to escape, and has had for a long time I think.
Funny side-note: I've broken the keycap for my escape key (on a TypeMatrix keyboard).. yet I haven't bothered to try to fix it. It's just not worth the trouble.
I had never gotten in the habit of using "^]". And I, like some others, do like my caps-lock right where it is these days, because I use it a lot.
Takes 10s to write into a .vimrc on any new computer.
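For reference, the remap really is tiny. One common sketch (the `jk` pair is just a popular choice; any sequence you rarely type works):

```vim
" Map 'jk' in insert mode to Escape, keeping hands on the home row.
" The physical Escape key keeps working as before.
inoremap jk <Esc>
```

The alternative mentioned above, remapping Caps Lock to Escape at the OS level, has the advantage of working in every application, not just vim.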
Neat, I hope AMD sells a lot of these chips to get back on its feet.
State of Linux on the MacBook Pro 2016
• Audio input & output
• Keyboard backlight
• Suspend & Hibernation
• Touch ID
I've had a similar experience with the Air but it works well with the newer kernels (including the SSD, which was the main problem for me).
But if there's one good thing at play, it's that there's a fairly active community that tries to tackle these problems.
I've had very good luck with my conversion on my old MBP so far. I have a late 2009 kicking around the house, basically a web browsing computer or backup machine if I leave my newer model in the office, etc.
Since I couldn't get the latest macOS on it I figured I'd just wipe it and install a Linux distro- all told I was maybe an hour into the process and had Ubuntu 16.10 running like a champ. The built in Ethernet was a life saver because I had to replace the WiFi drivers and there were a few power settings I tweaked.
A few weeks post-install I've tweaked touchpad settings and various other things; except for the WiFi, everything worked nearly straight out of the gate, and it was certainly functional immediately post-install.
Known limitations that affect me are pretty much limited to Thunderbolt hot-swap (you've got to boot with the TB hardware plugged in), lid close/sleep being pretty finicky (I feel like this has always been the case with every Linux laptop I've ever run), and the iSight drivers for the built-in webcam having some issues, which prevents some smart screen-dimming functionality. It sounds like this is resolvable if I take the time to patch the drivers, but I really just don't care.
Battery life is tough to gauge because it's so old and I didn't benchmark it pre-Linux but I'm getting about 2-2.5 hours out of it while writing code. Obviously not great, but I doubt I was getting that much more w/ OS X. This machine was my daily workhorse for a couple of years, it's got plenty of cycles on the battery. With auto screen dimming or GPU optimizations, this could likely get improved.
Really, the only thing I'm finding now is that I'm not a huge fan of the default Ubuntu 16.10 desktop and I'll probably start hopping distros since it's been a while since I was running a Linux desktop as a main OS.
I would say that from a completely subjective standpoint it feels every bit as snappy in standard use as OS X did and it will likely keep this machine quite useful for as long as it holds together.
That's an essential insight for anybody that ever wants to run a business, and it is why sales people get paid the big bucks and purchasers are only as good as the discounts they get.
Unrelated, but nice to have interacted with you on HN! I've seen your handle and comments hundreds of times by now.
On second thought, I guess if you can just sell it to some service. Or say, Best Buy, then there's a point there for reduced price (at a high degree :p).
We have some big Linux advocates in our company, so I've also tried switching to Linux twice before going to OS X, a couple of years apart. The claim from them was always the same, Linux is ready for the Desktop. The problem is that it's been "ready for the desktop" for ten years. I can't imagine ever going to Linux over Mac OS.
I've still got my Macbook Pro, so we'll see how things develop, but I've already started the process of migrating away, and if we're looking at 2018 it'll be pretty far along by that time.
... and that's how Apple lost the professionals. The desk is a clean space for a huge monitor / keyboard / mouse, and MY work. It's good to make a workstation that looks nice, but it's ten times as good to make one that's flexible and powerful. The only people who want that workstation on the desktop are the designers at Apple, and stroking your own ego isn't on the path to making a great product.
Every great designer knows that form follows function.
And not only that, even Apple with its vast resources cannot keep the machine up to date because the components are so tied to the case design.
- I want to configure a laptop with a 1TB spinning hard disc, so that I get a lot of space without paying $3000. I can get such a Dell for $1500 with a UHD screen. I'd prioritise disk space over weight and battery life.
- I want to configure the Mac Mini with a 6TB spinning desktop hard disc, rather than a portable one, which is costlier and has limited capacity. To reduce footprint, make the machine vertical. I also want to be able to configure the Mac Mini to drive a 5k monitor via Thunderbolt 3, and 32GB memory. All of this should be upgradeable. It's a shame that the Mac Mini isn't. It doesn't even have the "thin and light" excuse laptops have.
None of this is sexy. It won't make tech reviewers go ooh and aah. It won't earn a place in the MoMA. But it will be more useful to customers.
Based on how long a product takes to bring to market in a large company, it may well have been the public reaction to the MacBook Pro release last autumn that woke them up. Even today's spec bump of the Can takes something like 6 months of preparation and planning. And that would fit with the true renewal being about a year in the future from today.
It might be too late for some, but I am so glad this is happening. Apple can make great hardware when they're trying, and it sounds like they are trying again, so I am very curious what they can create.
It's a 'hackintosh', sure enough, but it's fantastic. First time in 30 years I don't own a mac, that's telling. Their fault, too.
One thing I didn't like about the Hackintosh was maintaining it during updates, having to wait for the right NVidia web drivers, and then remembering how to update all the stuff specifically.
What I did like was that I had a computer that I built for $1100 that scored about 16,000 on the Geekbench, without overclocking.
Note: I still use a Mac Book Air 11" because it's the best laptop I've owned. I think the trackpad and sleep/wake functionality that just works is the best ever, and makes it really convenient. Obviously, these two killer features for me matter a lot less on a desktop.
Second Note: I built mine with an aluminum Rosewill Legacy U3-S – Silver/Aluminum Case that was both compact, and really good-looking.
[Edit: updated the computer cost from $110 -> $1100]
For anything earlier, Nvidia periodically releases drivers about a month later than the OS version bumps. There are drivers for 10.12.4.
Really wish more PC component companies focus on high quality good looking components without the gamer branding.
If you get a golden build and get it working it's almost like a real mac. It's possible to get everything working 100% but it's time consuming and even in the best case scenario macOS updates will probably be a pain.
Another aspect to consider is noise. If you come from the Mac world, you are probably spoiled by low-noise or silent computers. PCs are very noisy, and you will probably have to spend time on research and money on expensive silent parts to solve this.
Finally, a common problem with hackintoshes is wifi. I've tried everything, believe me. If you intend on using Wifi your best bet is buying the same chip Apple uses with an adapter. Many manufacturers offer their Wifi drivers for macOS but those are usually finicky or outdated.
This is expensive, but it's IMO the best choice:
If you just randomly build a beige box hackintosh, it will be noisy. If you spend some quality time on silentpcreview, you can get one that is silent even under load. I know, I have one.
Still, I'd very much like to get a decent Mac Pro instead. Devil's in the details, let's wait and see. My hackintosh isn't becoming obsolete any time soon.
My point is simply that one has to put special effort when building a PC to even get silence at idle.
I briefly ran a hackintosh when my Biostar motherboard died and I was frustrated with how slowly my old MacBook Air was running. It was a pretty simple process: I replaced the board with a Gigabyte one, downloaded an installer image from my Mac, and used the TonyMac tools to set it up. Unfortunately I wasn't able to get Xcode to work reliably; everything else seemed to work, but Xcode was a crashfest.
I've certainly had Xcode crash once in a while, but this was something else. Tried reinstalling it and no change. I don't think I even got it to finish a build.
There was also a great hackintosh discussion on HN a couple days ago:
- It's all about making it easy for yourself. In short, motherboard and GPU choice change the experience from 'almost effortless' to 'never gonna work'. Get a $20 USB external soundcard to get rid of any audio configuration quagmires. Get the most compatible, widely used motherboard and Nvidia GPU, so you can get support on Tonymac should you need it. Stick to wired Ethernet to avoid meddling with bluetooth and wi-fi if you can. Everything else (PSU, CPU, RAM, hard drives and case) isn't an issue.
- The tradeoff is time invested into initially understanding how it all fits vs. money saved and increased knowledge of how MacOS ticks (a good thing regardless, if you're a power user). How much time depends on how much of a PC tinkerer you are already. If you already built PCs and tried getting Linux distros going it's gonna be second nature.
- You'll still need to check out Tonymac when a point update comes out for tips and warnings. The easiest solution is to install the fully up to date next-to-last MacOS version (install El Capitan 10.11.6 now that we are in the Sierra cycle, for example) and keep it going until you're forced to upgrade. Staying a generation behind, both in hardware and software, is the safest strategy. Hackintosh and being on the bleeding edge don't really mix.
I would agree with the overall sentiment though and add that even on a legitimate Mac it's usually a good idea to hold back a bit on updating. I work with audio a lot and it's common for audio units (and Pro Tools if you use that) to break initially.
Also, some of the original setup requires a bit of tinkering with drivers to start with (especially if you use 'recent' hardware) but it's really not more complicated than installing linux these days (and then fiddle a bit with the settings).
To get solid bluetooth support I bought a card from http://www.osxwifi.com/ , which makes peripherals like the magic trackpad work smoothly.
Odd kernel panics used to happen about once every 3 months: black text would scroll down the screen and you'd have to hard-reset the system.
I would keep a legit backup around but the power/price ratio cannot be beat. I'm looking at building another soon.
It feels incredibly un-Apple.
After the critical reception of the new MacBook Pro range amongst the Apple community, especially pro users, everyone was questioning Apple's commitment to pro users, and the Mac as a whole.
I'm glad to see Apple doing this, but I can't help thinking this was totally reactive and pre-emptive damage control. If Apple had just released the Mac Pro speed bump, there would be even more of an outcry that Apple has given up on anything more than incremental changes.
The pessimist in me thinks that Apple simply had no idea that there was this sort of demand for pro Macs. Phil and Craig mentioned the iMac numerous times, as though to say "hey pros, there is a great Mac you can use", so they can still claim that they do make great, high-performance pro devices.
My guess is "next year" means "Holidays 2018", and my guess is that Apple has only recently started work on this. Apple hasn't been about modular design or expandability for a long time. With the rise in adoption of VR, there has been even more discussion about upgrading graphics cards (remember Oculus comments about the Mac?), and the age of the CPUs is another criticism. I just can't imagine Apple starting work on this some time ago.
They talk in the article about being backed into a thermal corner and that's easy to believe. They simply didn't know that GPU technology was going to accelerate in advances in the way it did, and particularly not in single-GPU workloads.
I've heard again and again stories about power users that are editing videos (4K these days) or doing other content creation (3D animation, etc) which can really benefit from high-end hardware.
I'm thinking it was more that they couldn't figure out how to build a high-end, upgradable Mac Pro that kept their beefy profit margins and didn't become a support nightmare, at a somewhat reasonable price point (i.e. less than $10K USD).
Probably, but I've read Gruber's piece twice and I couldn't help but feel as if the talk had been given by a bunch of young geeks fresh off a Kickstarter project.
Maybe that's Apple's new approach to PR, but it sounds like they tried to downplay the impact of bad management that ultimately led to a failure that must have cost them millions.
I was admittedly disappointed at first, but then realised there isn't really anything else they could've done to interest me anyway. I'm genuinely very happy with my existing 2014 MacBook Pro. I just also think it's telling that I, someone who's spent far too much money on Apple hardware, also use a hackintosh desktop.
Sure, IMO they should increase the minimum past 128, but many people don't come close to using that, and the $1,300 price point is attractive with the 2.7GHz i5.
A lot of pros will be thinking: "So why should I buy a new Mac Pro if they're only going to let the line stagnate again?"
Once you've spent five years transitioning to, say, a Windows setup, there's a significant additional cost to going with Apple, and it's not like Apple has felt like a deluxe experience lately.
I genuinely love macOS and I'm happy with Apple's offerings on the whole, but I think there's plenty of people who just don't care and are happy with whatever works. If you're using say, Pro Tools or Avid Media Composer or whatever software you use to produce content, odds are it works just fine on Windows too. I think many are going to try Windows, realise this and then simply not bother switching back.
But it's like when you're on the freeway and see a billboard for your favorite restaurant 40 miles away: it gives you the motivation to not settle for something else.
It might work.
For Apple, that normally is "you're holding the phone wrong."
Snark aside, I have a mid 2010 mac pro at home that is still going strong (due to upgrades, SSD, more RAM.) However, I would like to get a new GPU for the machine, but I'm not about to spend $500 on a better, but ancient GPU that's compatible with a 7 year old machine. I've been wondering what my upgrade path would be. No way in hell I'm buying a trashcan mac without any upgradability. And I'm certainly not going to buy a MBP with the touchbar (I need a real escape key) and under powered specs.
I'm really hoping this is true, otherwise in the next year or two I'm going to be seriously considering building a PC like I used to and deal with Windows 10. The rest of my family uses apple and it makes support for their devices easier being on the same platform, but I need better performance for photo editing and audio production.
As for a "top of the line" MBP, the 2016 15" is already the fastest MBP ever overall, even if you count the few benchmarks which put it slightly behind 2015 models in one or two metrics.
The damning benchmarks seem to be related to AMD vs. Nvidia. I don't know much about that but I'd put it down to optimization issues in a few apps. The 2016 15" definitely plays all games better than any MacBook before it.
I had an HD 4870 running strong until it died a couple of months ago. I replaced it with a GT 640 I had on hand; it works fine. Your mileage may vary.
It is also exactly what you would think Apple would say if they know this is going to be an extremely expensive device. Much more so than the current Mac Pro.
Why? Just bring back the cheese grater case and call it a day. No pro needs or wants these fancy designs.
I'm just guessing by modular design they're talking about something much more complex and unique.
> We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do.
That's silly. They could take their previous generation Mac Pro chassis, stuff it with a dual Xeon board, 128gigs of RAM, a pair of SSDs and a pair of Nvidia 1080s - and after some nominal quality testing/driver tweaking sell it at their regular ridiculous mark-up.
Those old cases are so convenient, I've considered buying a used one just for a regular pc workstation build. Easy to get to the internals, nice airflow. Roomy. Looks perfectly fine:
Like many others, I put together a decent "tiny" Hackintosh (https://taoofmac.com/space/blog/2016/12/17/1840), but ended up converting it to a Linux workstation because it turned out it was more useful to me as a VM host.
So I'm still using my ancient mini as an instant-on desktop (can't quite beat Apple's BT keyboards, really), and am looking forward to upgrading it - I just hope Apple realizes that it serves "semi-pro" uses well enough to gift it with at least as much CPU and RAM as the current MacBook Pro range...
Brutal to all the MBP hot takes from this fall.
Here's a good running list of them.
Not really, what are people supposed to do? Move their entire ecosystem to Windows?
Just because it sells doesn't mean it's a great product. Ballmer's reign is testament to that, so I'm getting pretty tired of (non-shareholder) people pointing at spreadsheet numbers as proof that Tim Cook is doing a good job.
How about if the people it makes those record sales to love it?
"According to Brand Keys’ 2017 Customer Loyalty Engagement Index, Apple delivers a best in class user experience across every single category in which it competes, from smartphones to music streaming."
But what about people who are fine with the specs and just don't like the details? You really need to have bought an MBP to know whether you like the Touch Bar and the flat/loud keyboard, or how well the battery works for you (given that the benchmarks are all over the place).
I'd love to see the percentage of returned machines instead.
I don't think they've updated the iMac GPU since 2015.
If you build a hackintosh you can at least run an Nvidia GPU and get CUDA support.
The industry is not "moving to one big GPU".
The industry has already moved to 4, 6, or 8 "big" GPUs per station/node.
I am absolutely curious as to what you're talking about because I have never seen this before, please correct me because I want to learn more.
Apple is talking about prosumers, not the rendering firms that need such massive GPU workloads; those firms would be better off rendering that much work on server farms. No one is going to work next to a noisy workstation whose fans have to disperse 2,000 watts' worth of heat at full load.
Seemed more of a case that no one wants to actually code for OpenCL on AMD...
My guess is that the Mac Pro, the lack of an iMac update, and the MacBook Pro with Touch Bar had many pro users worried and starting to freak out, and Apple thought they needed to do something fast because they don't have anything to show in the short term. I also wonder why the Touch Bar wasn't questioned in the interview, and whether they think it was a mistake. Touch ID is great; the Touch Bar is not.
In five years' time, by 2022, we are very likely to have 7nm from Intel and 5nm from TSMC. There is no reason why, within the same thermal budget, we can't fit a 16-core CPU, a GPU equal to or faster than today's top GPU, 128GB of RAM, and a PCIe 4.0 SSD. That iMac would be faster than many of the Mac Pros sold today, and would likely cover the majority of pro uses.
That is why I am surprised that Apple continues to support the Mac Pro. It is highly likely that Mac Pro sales will continue to shrink.
I hope Apple considers rack usage as one of the factors in its design. I see the Mac Pro in server racks as one way to greatly increase its sales.
Interesting that VR gets a mention there. Federighi chooses his words carefully, and they're aware of the rumours surrounding Apple and VR.
The fact that the new Mac Pros will not be out till next year makes it sound like they've only started working on it quite recently, perhaps in response to the furore over their treatment of their professional users.
I hope the situation is not that the market of individual pros has moved to the point of just throwing the whole machine away on an upgrade.
The Gruber article hints that Apple might have user-serviceable maintainability and upgradability on a priority list, but I wonder what that means.
Unfortunately, I'm anticipating something with an outrageous design premium, so hyperspecced at every turn that it won't make much sense for me. It's a shame: shouldn't the company that can make a $300 iPad be able to make a dream $2,500 workhorse for the amateur musician/videographer? I've been reading this kind of wishful rant since the 90s, so I know not to expect much.
Obi-Wan: "That boy is our last hope."
Yoda: "No...there is another..."
That a Linux based system is an alternative for a powerful Unixy system?
As I noted elsethread, it really is. I used to use Macs exclusively, and I'll never willingly leave Linux now: it really is that much better. The problem is that Windows and macOS are too constrained by their installed base: they don't have the freedom to be really revolutionary in their UIs, nor can they afford to support deep customisation (GNOME and KDE have similar problems, but one needn't use either to use Linux). Linux, meanwhile, offers the user true freedom: I can use a tiling WM, and I can write code and bind keys to do anything I want. A Linux box running StumpWM and emacs is the closest thing the modern world offers to a Lisp machine, and it's awesome.
I do always need a macOS machine around to build iOS apps, but the current Mac Pro is too expensive and limited for what it is, not really customizable or upgradable. I also want a pro machine that isn't an iMac, which is really what you have to settle for now, because I want a separate screen that I don't have to toss (or can donate) when the iMac dies in 3-4 years.
A major problem is that Apple doesn't even make their cinema displays anymore, they were once the best screens and beautiful. LG is their monitor seller now. Why?
Apple is just missing the pro users who like to customize and modernize their machines. I have since moved back to custom PCs for my main power/pro machines, and just do iOS builds on the 2012 Mac Pro now. The worst part is that the next couple of versions of macOS might not even run on that Pro, because Apple force-EOLs the hardware in the OS: not because of a lack of power (there hasn't been much progression there at all), but just to EOL hardware when there isn't even a good new machine to move to. I also feel a little disappointed in the new MacBook Pros. Some people I know are back on PC and just bought Mac Minis to compile their iOS apps.
macOS really is a great Unix-backed OS, and the best-looking one for dev work. Macs became hugely useful around 2006, when they went Intel and software started building around that and new web tech (canvas, WebKit/WebGL, Khronos funding). Unity pulled me into the Apple world again; for a time Unity was Mac-only. Great things were happening after 2006, including the iPhone pushing developers to more Macs. Apple has squandered that. They now seem apt to kill all of it and go totally proprietary, with machines that are one sealed block. That isn't going to attract pro users or developers like it did a decade ago. They are losing their developers and their pro creative/video/interactive users, and that should be scary to them. Pro users are saying to Apple, "we'll believe it when we see it; for now we'll be over here".
A big chunk of their pro market just wants a pile of RAM and CPU cores. They could offer that now, with an Intel integrated GPU, for $1-3K. It also wouldn't surprise me if Intel GPUs can already drive 5K, so you'd be able to actually plug it into the nice LG Mac monitor (unlike the thing they are shipping). If not, they could sell a high-end but single-GPU config, which is still a waste of a GPU, but at least it could drive current Apple-approved monitors.
Also, delaying the entire line 12+ months for a heatsink is madness. Surely they could slap together a water cooler or something for a single high end gpu config.
In other words, VRAM sizes and max single-card speed are the driving factors in GPU adoption. That makes sense, because you only get the large increases in GPU speed if you can fit everything in memory, and throughput is not yet high enough to ignore the cost of copying data to VRAM.
This is a tacit admission that Pro users are increasingly _not_ concerned with RAM capacity or CPU speed, but with GPU power.
The other mistake made was their assumption that Thunderbolt would allow nearly unlimited peripheral expansion, which turned out to be a complete bust.
Neither of these things is solvable without completely changing the dimensions/specs of the Pro.
I can see how photographers want faithful color reproduction with a wide gamut, good image consistency across the whole display, good resolution, and a decent size (maybe 32" tops?)
But AFAIK, monitors with those qualities are already available. The only thing I can think of that would make them better suited for professional work is to bring their prices down. (Because even pros have a budget.)
The last Thunderbolt Display had USB ports, Ethernet, additional Thunderbolt, even FireWire, and I use all of them. I can scarcely find a third-party Thunderbolt (not DisplayPort, Thunderbolt) display that has more than a couple of USB ports, let alone the rest.
Believe me, I'd love to settle for a cheaper Dell, but as soon as you throw in a serious Thunderbolt dock (they start at $200), you're spending nearly as much, and now you've got some clunky lunchbox on top of your desk to boot.
I don't see any signs that the market won't repeat this pattern.
That's this one, for anyone wondering: https://en.wikipedia.org/wiki/IMac_G4
Apparently they had quite considerable problems with the arm becoming a bit loose!
OK, I'll bite. I'll talk about the Mac Pro news.
First, I'll state the obvious and applaud Apple for conducting a "Mac State of the Union" with a Congress of the Apple press. Bravo.
Looking at the details, I think we should be careful when assessing the use of iMacs, laptops, and Minis by the pro user base. Specifically, were these folks running to the pro merits of these other devices, or running away from the lack of a viable Mac Pro option? I think the latter.
I'm a Mac Pro user (2009 5.1) and devotee. I also own a top-line 5K iMac. I'd much rather do heavy computing on the Pro. For one thing, even something as simple as playing a 1080p video on the iMac wreaks thermal hell and sends the fans into overdrive. My oldish (and well-liked) Macbook Pro did similar things.
But what I truly like about the Pro is its easy expandability. In fact, I wish that it had more slots! Mine are always full (and I still lack a fast flash disk card).
So I am very, very psyched about the announcements. I am hopeful that the 2018 offerings are not outrageously priced.
Finally, as an aside, those who just can't wait until 2018 have an option. Very decent Mac Pro towers can be had used in the $1K-$2K price bracket. Maybe less if you mine Craigslist. If I were in the market for a non-laptop Mac, that's what I'd do.
Note that I am not trying to criticize non-Pro Macs. I just wanted to contrast them with the Pro, hopefully highlighting the latter's merits.
I'm looking forward to being able to land on moons in Elite Dangerous Horizons in VR on a Mac at some point.
In the meantime I got a Zotac EN1060, a machine the size of a Mac Mini that runs any game I throw at it at 60 frames per second, which makes booting into Windows tolerable.
Graphic design and coding work remains in the realm of a 2014 Macbook Pro.
The Touch Bar seems like a pretty apparent failure in that even most of the "until death" apologists can't or won't defend it, but Apple's not going to give up this quickly. I think you'll get your wish, but not soon.
The troublesome thing with this article is that even if Apple put out a Mac Pro with the latest and greatest guts in the old cheese grater chassis today, pros would be extremely happy. FWIW, I still think the cheese grater chassis is a great design.
Sure, design matters, but the livelihood of pros depends on their ability to get sh!t done. Just giving them access to "less pretty" hardware that does what they need today is better than making them wait another year or so.
Of all the Macs, the Mac Pro is probably the easiest to design. What people basically need are modular PCs that can run OSX on the latest hardware and give the user a choice of GPU manufacturer. There's less of a need to make it the smallest possible computer, because smaller makes it harder to do upgrades.
The Osborne effect is a term referring to the unintended consequences of a company announcing a future product, unaware of the risks involved or when the timing is misjudged, which ends up having a negative impact on the sales of the current product. This is often the case when a product is announced too long before its actual availability. This has the immediate effect of customers canceling or deferring orders for the current product, knowing that it will soon be obsolete, and any unexpected delays often mean the new product comes to be perceived as vaporware, damaging the company's credibility and profitability.
It's not like killing the current Mac Pro sales is exactly going to put Apple in trouble - they're raking in money from the iPhone and MacBook (Pro) lines.
This is a carefully thought out plan from Apple for sure, unlike Osborne's blunder.
Example of what I'm talking about? Check out this Mac Pro server rack setup:
Also glad to hear the Mac Mini's not being killed off.
It's a pro level desktop, size (within reason) constraints aren't really an issue.
While the iPhone and iPad businesses are solid, they can be cyclical, and they are nowhere near as dependable as the faithful core of Mac users who have stuck with Apple for decades.
Few professionals are going to muck about with a hackintosh, and I mean no disrespect to those who do. I have done it myself, and it's great as a curiosity and a way to learn, but at some point you just need to get things done and have proper, seamless hardware support for all the peripherals needed for pro-level work.
Vapour Race 2017: Next Gen MacBook Pro or Surface Pro 5.
It would be nice to have a real GPU again....
I happen to use Windows today, because it runs the most apps that I want to run (in particular, some Steam stuff not available on Linux or Mac), but with the help of dual boot, I could just as easily be on Linux or Mac (all other things being equal, I'd prefer single booting).
Personally, I left Mac after a bad experience with the 2011 MBP and not liking the design direction of Apple hardware, but I still consider Apple hardware (for the most part) to be the gold standard in build quality. Macs may not suit my use case, but thankfully there are plenty of other fish in the sea.
Specific to the SP3:
- Use the SP4 keyboard; it is much better and completely compatible.
- The dock is expensive but worth it: two DisplayPort connectors, so I have the Surface display, a 4K monitor, and a 1K monitor attached and working at the same time, plus wired gigabit Ethernet.
- The magnetic connector for the dock/power is super convenient, and the power adapter has USB charging.
- I use the new Surface Ergonomic keyboard and a Logitech MX Master mouse at home, and the Surface foldable mouse on the road (the best portable mouse I have used).
- I keep the Surface on my desk so I can use it with my pen; I just flip the keyboard underneath it.
- On the road, the microSD slot provides all the extra storage I need (movies on the plane), and it has a real USB 3 port.
- Battery life is around 5 hours, better if you are only watching movies or reading.
- It runs VMs effortlessly with the i7 processor.
The biggest problem is power: sometimes it sleeps and won't wake up. This happens far more often than it should, but I haven't heard the same complaint from people with Surface Pro 4s.
Hope that helps.
I have an SP3, and it's nice, but I didn't like using the Type Cover. I much prefer being able to keep it in my lap, with the screen suspended from the keyboard. I'm sure it works great with a dock, but I haven't used one.
This is an interesting, non-Apple PR approach. They took time to talk to their top journalist connections in order to halt the "Apple doesn't care about pros anymore" articles. I applaud the additional transparency; smart. What I'm not convinced about is that they are building the right solutions. How did their product teams miss this gap several years back? If Apple saw developers as a growing user base, why did the new MacBook launch with a touch strip? They should know which form factor (laptop vs. desktop) developers prefer. Did the MacBook and Mac Pro leadership change? Was it a personnel issue?
Overall an interesting situation I would love to better understand. If all revenue and growth projections are positive, why appease a vocal minority? I assume they do believe their "early adopter developer creative types" are a vocal minority that can sway a large consumer base's brand perception.
> Apple’s research shows that 15 percent of all Mac users use at least one “pro” app frequently. These are apps for things like music creation, video editing, graphic design, and software development.
> Schiller, on Apple’s own pro apps: “I just want to reiterate our strong commitment there, as well. Both with Final Cut Pro 10 and Logic 10, there are teams on those software products that are completely dedicated to delivering great pro software to our customers.
As a few people have noticed (Recall Arment and possibly even Gruber pointing this out) for a while now almost all their Pro user footage used in advertising and announcements has either been photographers, directors or people sketching down ideas on iPad Pros.
Kinda seems to be ignoring the intersections of creativity and computing that the original Mac pioneered.
It's going to be an iPod, a Phone, AND an internet communications device
Not that I'm not satisfied with Mac OS or Apple hardware. But, they refresh too infrequently. I love the mac mini, my late-2012 is still chugging along nicely. But, I was able to expand the memory and swap the HDD for a SSD.
I don't need the Retina iMac; I'd rather put the money into more memory and faster storage. For the same reason I'm not a customer for the Pro; I don't need the Xeon or enhanced GPU. So where do I fit into Apple's product line?
I installed Gimp for basic photo editing (cropping, basic toning), and it's really not too bad once you get used to it. I have a long history with Adobe products too.
Not sure Apple fits the description here.
But seriously, Apple has a history much closer to the opposite of that. Never admit fault. If something really isn't working, pretend that it is right up until a replacement is launched. Just like PowerPC to Intel. Just like the Mac Pro until today. Just like the Apple Watch and Touch Bar (admittedly these last two are a little more speculative on my part, but I think it'll happen).
I'm using chromebook hardware (high end, but $200-500) for daily use most of the time, for reasons which will be clear in a while. (I still use iOS for mobile, though.)
For high-end computing, I just got an Acer Predator 17X "gaming" laptop (it was a toss up between that and a Dell PWS 7720). $2850, 32GB/512GB SSD/1TB, 17" 4K screen, GTX 1080, great keyboard, external mouse, 1-3h of battery life under hard use. Add 2 more NVMe SSDs and 32GB RAM (64GB total), with Linux/Win10 dualboot. It's pretty amazing. The alternative Dell PWS was about $4500 for a similar config and a tiny bit better in some ways.
Not true at all: it had a G3 almost as fast as Apple's PowerMac of the time. The only "problem" was that USB was slow compared to SCSI.
But I used it for pro work for years. Great value for the money.
Interestingly enough, they also broke secrecy in 2013, when they gave a sneak peek of the new Mac Pro at WWDC; it only shipped later that year. They probably only announce products when they're ready to ship so as not to hurt sales of the existing models, but I'm guessing that won't make much of a difference with the current Mac Pro.
What graphics professional isn't assembling their own rig though? $3k is a LOT of money to spend on a desktop.
I'm old enough to remember dedicated SGI workstations, and while a decade ago I felt OS X had some advantages in graphics, I can't think of any that exist today, at least none that justify such a huge markup on parts I could buy myself.
I'm genuinely curious; I feel like I'm missing out. Those of you who would consider paying $3k for a computer made out of easily obtained components: why is it worth it to you?
Right now she uses Windows 7 Pro. We can't stay on it long-term because of security (and eventually driver) issues.
Migrating to Windows 10 (non-enterprise) isn't an option because the forced updates are an unacceptable risk to downtime, especially at certain points in her business calendar (e.g., highschool yearbook photo season).
We may end up migrating to a new Mac Pro, but it won't be because the hardware is awesome. Truth be told, it will likely be overkill for her needs, and definitely overpriced.
We'd migrate her business to a Mac Pro because OS X doesn't have Windows 10's problems, and because we can probably recover from a hardware failure quickly due to the system's expected modularity.
> Photoshop and Lightroom are essential tools
Those are in the cloud now, and performance is mostly based on your GPU.
>Truth be told, it will likely be overkill for her needs, and definitely overpriced.
>We'd migrate her business to a Mac Pro because OS X doesn't have Windows 10's problems
This is insightful, thank you.
>because we can probably recover from a hardware failure quickly due to the system's expected modularity
Every hand-built PC will be just as modular, or more so. If that's your concern, look at, say, the Mac Minis with the RAM soldered in. I love Apple's industrial design, but I don't like how hard it is to upgrade the hardware, generally.
There's one number to call, and they'll repair any part of the machine up to and including swapping out the whole thing. Anyone buying a machine that expensive they really need would(and should) be looking at 24 hour turn around on site service anyways.
And the only way to really get that is to buy a whole prebuilt machine from $BIGCOMPANY
Depending on your area you might at some point have to do lengthy computations or you start working with deep neural nets and then want to have Nvidia hardware.
I think it's great if you can stay on your favorite platform and have everything on one system.
Also: continuous integration for iOS. If these machines are reasonably priced, it will mean faster test build times for many engineers out there.
So I'm very happy about this unexpected announcement.
This is the problem: Apple has just lost user trust with this one.
All they ever needed to do was offer something like the pre-cylinder Mac Pro: something PC-like, a case with replaceable parts and lots of expansion slots. That's it.
Have any Mac Pro users long since moved on to Windows (or even Linux) or gone down the hackintosh route? Who is going to trust the Mac Pro at this point?
Obviously the trash can was a huge design effort and I get the feeling they want to be just as revolutionary this time if they spend so much time, when they are obviously in a hurry.
Shouldn't they just be making a new 2 socket cheese grater tower? As simple as possible? The USP of the Mac is Mac OS, not that it uses a custom power supply.
As the article points out, Apple felt the need to make this announcement even though the new Mac Pro won't be released until next year, because they don't want more pro users to abandon the platform.
That's a pretty sorry state of affairs. I don't know if there are good statistics about this, but I wouldn't be surprised if quite a few users have already abandoned Macs for Windows. Microsoft is making a big push to attract creative professionals, who are heavy Mac users as far as I know.
Easy for a relationship abuser to quickly "fix" the problems then go back to their old ways.
Recent Linux kernels have added the ability to pass through PCI devices to QEMU guests.
I run NixOS with Windows 10, Windows 7, and Ubuntu guests. Only one guest can run at a time. I have a second GPU with a second screen that gets passed directly to the VM. I use numactl and nice to surrender half my CPU cores to the VM, and static huge pages to hand over a chunk of memory. The VM also gets an entire dedicated SSD passed through for application storage. I believe libvirt can do most of this automatically now. I pass in a separate Bluetooth controller for the keyboard and mouse.
Performance-wise you're missing some CPU power and RAM, but otherwise everything runs at full speed. Applications run flawlessly; it's my dream development machine. I don't game much, but I get the same FPS from the VM as from bare-metal Windows on the same machine, since CPU and RAM are not my bottleneck.
If I had a supported GPU, or if the Pascal drivers dropped, I could get macOS running and it'd be a one-stop shop.
The best source to get started I've found is the arch wiki: https://wiki.archlinux.org/index.php/PCI_passthrough_via_OVM...
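To make the setup concrete, here is a rough sketch of the pieces involved. All PCI addresses, device IDs, and paths below are placeholders for illustration (substitute your own from `lspci -nn`); the Arch wiki page above covers the real procedure in detail.

```shell
# /etc/modprobe.d/vfio.conf -- claim the guest GPU with vfio-pci at boot
# so the host driver never binds it (vendor:device IDs are placeholders):
#   options vfio-pci ids=10de:1b80,10de:10f0

# Reserve static huge pages for the guest (8192 x 2MiB = 16GiB):
echo 8192 | sudo tee /proc/sys/vm/nr_hugepages

# Launch the guest pinned to NUMA node 0, passing through the GPU
# (video + audio functions), a whole SSD, and a USB bluetooth dongle:
sudo numactl --cpunodebind=0 --membind=0 \
  qemu-system-x86_64 \
    -enable-kvm -machine q35 -cpu host -smp 4 -m 16G \
    -mem-path /dev/hugepages \
    -device vfio-pci,host=01:00.0,multifunction=on \
    -device vfio-pci,host=01:00.1 \
    -drive file=/dev/disk/by-id/ata-EXAMPLE-SSD,format=raw,if=virtio \
    -usb -device usb-host,vendorid=0x0a12,productid=0x0001
```

If you'd rather manage this declaratively, libvirt's domain XML wraps the same vfio-pci assignments in `<hostdev>` elements, as mentioned above.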
My keyboard and mouse can pair to multiple different machines at the same time and switch with a button press so I just do that.
Synergy also works fine, you just need to temporarily pass a real keyboard in to set it up.
There is also a newish feature in QEMU that allows the physical devices to be swapped back and forth by pressing both Ctrl keys. It works reasonably well, with a few quirks (extra mouse buttons don't work in the guest, for example).
The same thing applies to the screen: I surrender an entire monitor to whatever VM is running, but you could use a KVM switch.
It's a little work to get set up and tuned, but it is 100% a dual-boot killer. The next time I reconfigure this machine, I won't be installing a native Windows partition; I simply don't need it anymore.
Here's a video demoing and explaining it a bit: https://youtu.be/16dbAUrtMX4
Apple has lost its way. Their support has become horrible if not arrogant. Their updates keep bricking devices more often than not. Their hardware fails more than it used to.
If they thought the Mac Pro was a mistake, how is the MacBook Pro a success?
Can't innovate anymore, my arse, you bet you can't!
Is this because some companies are running rendering farms on a whole bunch of Mac Pros?
In so many words he uses it as a data management interface for his many TBs of external hard drives and that's about it. This struck me pretty hard in framing my opinion of what the Mac Pro really is to those who have it.
I am hoping for a serious memory upgrade whenever they get around to it.
I am curious what others are using for external storage? I did not splurge on a big internal drive, but I am finding any video work really consumes a ton of space.
Then a Synology NAS for long-term storage - it also contains my entire iMovie library, since gigabit is plenty fast for SD video.
Why might they not do this?
a) makes them look bad
b) well... is there a CPU architecture transition coming up? Last time there was this performance block, it was time to move from PowerPC.
Perhaps I'm just a Luddite, but what I really want from my desktop Macs is, basically, what I already have in my 2008 Mac Pro and my very ancient Mac Mini, just updated, because those machines won't last forever.
The 2008 Mac Pro has the giant aluminum case. It has four drive bays. That machine has been an absolute warhorse for me. It's pretty much been running every day since 2008. I've produced a lot of video clips and multi-track songs using Logic. The only things I haven't liked about it have been (1) Snow Leopard was more reliable for audio, on this box, than later releases, (2) Apple has gradually walked away from things I wanted to do with the server subsystem, like maintain a usable current version of Apache in the OS distribution, and (3) the box is quite loud for use in a recording studio.
With the Mac Mini I am actually planning to buy some newer refurbished Mac Minis with SSDs. These machines are almost perfect for use in recording situations.
Apple is so obsessed with design -- the tin can design of the modern Pro, whatever "modular" design they are cooking up with the future Pro design -- that it doesn't sound like they will consider that the old Pro had an industrial design that was almost ideal, with the exception in my view of the noise level. A honking big case with a lot of thermal mass and big fans and a lot of room for hot memory and drives is in fact perfect. It is beautiful to me because it is so simple and reliable. It doesn't need to be tiny. Another option would be a rackable version. And that's it. That's all I need from a high-end computer.
For the low-end utility machine like the Mini I want it to be small and _silent_.
The iMac probably has enough CPU but I don't think it is quiet or expandable enough and I need 4-16 terabytes of storage right in the box and an easy way to back it up _to physical drives in my own house_, not the cloud.
I'm still resentful of the thousands of hours of work I put in to Aperture projects. Tens of thousands of photos, many with a lot of delicate editing, which don't even render on the screen correctly anymore. I'm still resentful of all the projects I had built in iMovie which don't work in later versions because of the features that Apple jettisoned. If Apple has a solution for the two big needs I've got -- the small ultra-quiet media "capture" capability (for audio), and a big honking _simple_ Pro for editing and production, _and_ it appears they are serious about maintaining Logic, then I'll stay with them and probably buy more Macs. If Logic goes, I'm gone. (Mac user and on-and-off developer since 1985... Apple user since 1977...)
So perhaps starting in 2016, or 2015, or 2014 would have been a good idea?
... and they treat it like sh*t.
Harsh words, but three years to update a PC?
None of these has anything to do with Thermals.
I am betting Apple will switch to AMD's Ryzen CPU in their next iMac.
It's strange they don't understand that doing nothing but keeping the Pro up to date spec-wise would've signaled mild disregard already. Keeping it frozen for four years is a clear, prolonged, 'we don't care about you' message to pros.
And everyone could've told them the trashcan design made no sense - in fact, many did when it debuted. No need to wait four years for that. Probably Ive really, really liked it and nobody managed to stop him.
J/K This is great news. I <3 OSX, but lately the hardware has been heading into reverse innovation