Then 10 years ago I got a Mac. I never went back.
But what am I saving money for right now? To build a nice PC again.
Mostly because of the exact reasons in the article.
I have a fondness for Apple... but they have definitely lost their way. First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era) ... then they became a computer company who also made a phone. Then they became a computer company who also made a phone and a tablet. Then they became a phone company who also made computers and tablets.
Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "let's put this phone feature on the desktop".
It makes me sad, as a Mac fan. The hardware is getting worse. The decisions are getting dumber every time. I won't buy a laptop without a MagSafe or similar connection; I have kids and animals, and the MagSafe has saved a laptop more than once. To remove something that was as core and identifiable a part of their computers was just a stupid move and served no purpose.
They don't listen to the industry or the consumers anymore, they stick their fingers in their ears and pretend to know best.
Jobs was hardheaded, but reasonable. Cook is trying to emulate the hardheadedness but fails to recognize the reasonableness needed to balance it.
It seems like it will never finish, but it does ... eventually. However, the lack of any UI, any indication of its presence or its progress, and also that killing it will just make it come back later are all pretty hostile.
(Google for more info.)
The rollup updates that MS moved towards, even in Windows 7 now, are supposed to help this. However, a completely new solution is really needed to replace this antiquated one, like many other lingering parts of Windows.
There are plenty of other examples.
Apparently the magic runes are
sc config wuauserv type= own
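For anyone else fighting the same update process, here's my reading of what that command does, along with a couple of related commands (exact behavior may vary by Windows version). `type= own` tells the Service Control Manager to run the Windows Update service (`wuauserv`) in its own process instead of inside a shared `svchost.exe`, so it shows up as a separate entry in Task Manager where it can be identified and killed individually. The space after `type=` is required by `sc`'s quirky syntax.

```shell
:: Run from an elevated (Administrator) command prompt.

:: Run the Windows Update service in its own process rather than a
:: shared svchost.exe, so it appears as its own Task Manager entry.
sc config wuauserv type= own

:: Check the service's current state.
sc query wuauserv

:: Stop it manually if it's eating resources right now.
net stop wuauserv
```

To revert, `sc config wuauserv type= share` puts the service back in a shared host process.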
Probably backing up your files to government servers :-)
Yoink seems to trigger a file export in Photos. Why that doesn't happen with other programs is anyone's guess.
I put "workaround" in quotation marks because this is really how I do all dragging and dropping between programs in macOS, since I tend to run stuff in full screen.
There you'll find all your pictures in Photos
2. Select Photo.
3. Edit Menu -> Copy.
4. Open TextEdit.
5. Edit Menu -> Paste.
It is ridiculous. Ridiculously simple.
Feels like this forms a data point for the thread - since meanwhile, on Windows, I have a couple of programs in my bin folder that I use fairly regularly that I compiled in 2006.
Try selecting text in a Windows dialog box, such as the "About" box. It doesn't work; it never has, as far as I can tell.
Also, arguably, selecting text in a dialog box (instead of copying the whole text) is a much less used feature than drag-and-drop of photos.
Also, what PC alternative has a MagSafe equivalent?
Thought experiment: say Macs had had USB-C for 10 years, and the latest MacBooks switched to MagSafe. There'd be a lot of complaints about Apple locking us into their walled garden, and how we now have to buy a proprietary, overpriced charger which, moreover, can't charge other devices.
One technique I try to follow is an anti-knee-jerk reaction: if everyone has a knee-jerk reaction about something, I remind myself of the advantages of the alternative.
Ultimately, the only true test is time. Wait 2 years and let the emotions cool, and you'll know if the outcry was justified.
Then you would presumably still be able to use the remaining USB-C ports to charge it, with the MagSafe merely an option. That would not get many complaints. If they violated that they would deserve the complaints.
The thought experiment is intended to be the same as reality, just in the opposite direction.
But if they add magsafe while also having USB-C charging then it's much better than either option on its own.
And keeping USB-C charging does not require extra ports, because you need the USB-C ports anyway to do USB stuff.
But you were using it to make an argument that the complaints were invalid.
So I'm going to reply to that specific argument, by pointing out that when Apple changes the IO ports to USB-C and enables USB charging, it does not require them to remove the magsafe port.
In other words, apple went X -> Y. But adding Y reused IO ports and could have coexisted with X. Even though Y might be better than X, it was a false dichotomy in the first place.
It also entrenches the old standard rather than making way for the new one. If you have a laptop with both USB-C and MagSafe, and you need a second charger, you might buy a MagSafe charger if it's cheaper or to use with an older Macbook. Whereas without MagSafe, you'll buy a USB-C charger — the new technology.
If you want to do a transition, you have to go all in. Having both the old and the new port merely delays the transition, causing more pain in the long term. Get it over with.
I've also been using MacBooks since 2006, and I love MagSafe - or at least, MagSafe 1 (haven't tried the more recent MagSafe 2).
Think how nice it would be to have the same functionality with USB ports....
I don't understand how Apple could patent this and prevent anyone else from implementing it. In any event, there do seem to be 3rd party implementations for USB-C if you're keen on the feature: https://www.macrumors.com/2016/01/04/griffin-breaksafe-magne....
That's something they could have kept (somehow) in the shipped USB-C charging cable.
It is, however, not the best choice for Apple.
I think it should be pretty obvious what I believe will happen.
The Surface Pro uses a MagSafe style charger.
For the majority of the population in developed countries, and almost all of the population of developing countries, smartphones are the only computer people have or need.
I agree that Apple have shifted focus away from "making tools for people to create things and solve problems" towards consumption-oriented mobile devices. But those devices are still computers, and they're wildly successful.
If Apple devoted their focus to products in proportion to their revenue, they would be putting 12x as much effort into the iPhone as into the entire Mac lineup.
But what does that have to do with me (or, it seems, the GP poster)?
I want a great computer on which I can do what I usually do on my computer which is mostly programming, but I guess I could be fancy and say "content creation" instead.
The fact that phones are computers has exactly zero relevance when it comes to me choosing a new laptop.
I used to be a hardcore Mac user, because the computing environment was superior to any other choice, but that's not the case anymore so my next computer will be a Linux laptop.
True. However, creators and developers (who, in my experience, almost always use a desktop) are important for the iOS platform. Someone has to write those native apps. Therefore it doesn't make sense to ignore them for too long.
Those of us who are iOS developers will keep bitching about it, but we have no alternative, so we'll keep on using Macs as long as the iOS ecosystem is doing great.
Source: have been a mobile developer for 6 years.
No matter how great Xcode becomes, if everyone starts using Android, developers will all jump ship to Android, which means many iOS developers will switch to Windows.
Mobile devices are still consumption devices. It's hard to create complex, multi-layer structures on phones: the screen is too small, the processor is too underpowered... take your pick. But creating a good SaaS or a good UX on a phone is darn tough.
Here comes speculation:
Websites/services in developing countries are painfully bad.
This is due to a number of factors outside technology, but it's also true that universities here graduate people who never grew up with big screens and unwalled gardens, never grew up creating rather than consuming, and don't have a sense of what it's like to go from blank screen to working prototype to polished platform.
Neither do most people in developed countries.
But at least computers are widely available in developed countries so that the x% of kids who have an affinity for these things end up getting started early.
Someday mobile devices will match and surpass desktops and laptops. But that day is not today, and meanwhile developing countries are years behind not only in physical infrastructure, but in online infrastructure as well.
Solidworks, AutoCAD, etc. don't run on phones
Photoshop/Lightroom/Illustrator don't run on phones
SPICE, VHDL, Verilog don't run on phones
InDesign and other DTP programs don't run on phones
Emacs/Vi don't run on phones
There are a host of actually-useful programs that are completely unsuitable for running on phone-computers, that do run well on laptop-computers and desktop-computers.
"AutoCAD" is a brand that applies to a large number of different CAD-related programs, some of which are mobile apps.
> Photoshop/Lightroom/Illustrator don't run on phones
Both Adobe Photoshop and Adobe Photoshop Lightroom are in the Google Play Store for Android phones.
> SPICE, VHDL, Verilog don't run on phones
Well, I can find circuit design apps that support a subset of verilog, but this is basically right.
> Emacs/Vi don't run on phones
There are ports of both in the Google Play Store, supporting phones.
> There are a host of actually-useful programs that are completely unsuitable for running on phone-computers, that do run well on laptop-computers and desktop-computers.
There may be applications that phones don't have the processing capacity for, or that work best with certain I/O peripherals that phones aren't often used with, sure. But the array of classes of apps that are completely unavailable for phones is smaller than you seem to think.
Not true. I am from India, a developing country. There are 5 working people in my family. None of us can replace our notebooks with our phones. For all of us, the phone is for communication and the notebook for work.
This is like saying trucks are cars, too.
Different form factor for different purpose.
It. Doesn't. Work.
- iPhones will suck if developers stop making software for them.
- You need Xcode to make software for iPhones
- Xcode only runs on macOS
If developers start abandoning Mac en masse, that's going to hurt iOS a lot.
Also, if photographers, videographers, animators, designers, audio engineers, etc start championing Windows or Linux as their platform of choice, the average person will follow eventually. "All the pros I know use X" persuades a lot of average people.
And if people aren't using Mac, there's a lot less reason to use iPhone as opposed to Android.
I hate to use the word "synergy", but that seems to be what they're risking here.
I've written some thoughts about this previously, if you would care for a detailed elaboration: https://guan.sg/apples-2017/
I'm skeptical that this will work for everyone.
Everything is tradeoffs. When "must be tiny" is the top priority, performance and battery life necessarily suffer.
There will always be people - from gamers to video producers - whose top priority is performance. And for computer geeks generally, the weight difference between a laptop and a phone is not compelling, but having four times as many cores would be.
The portability difference between a mainframe and a laptop is immense - it changes your working life. The portability difference between a laptop and a phone is much smaller. It means you can do things spontaneously, because you can always have the phone with you. But if you're planning to work, the difference is minimal. Especially if the smaller form factor means you have to plan to have peripherals wherever you're going.
E.g., I can take my laptop to a cafe or a park and work. I couldn't do that with an "ultraportable" unless I brought my own peripherals. So it's actually less portable for the situations I care about.
Of course they are. So are microwaves at this point. So are toilets at this point.
I fail to believe that you just "missed" the real point, that the form factor and the user interface are (and need to be) quite different from a phone to a desktop.
Computers that need jail-breaking, though.
Windows 7 was the pinnacle Windows experience for me, and it has gotten worse ever since.
> Windows 10 has a 5 second boot up time
Man, I wish. My Windows 10 system never boots up that fast. Meanwhile my Windows 7 desktop takes 15 seconds to boot from an SSD. Those 10 seconds just aren't that much of a feature for me, especially since my laptop and desktop are typically in sleep mode anyway.
> most all annoyances anyone has online can be configured away,
Yes, because we should have to do work to eliminate baked-in ads and processes that share my information with who knows whom.
Also, you can't even intuitively FIND settings. The Control Panel has some settings the Settings app doesn't have, and vice versa. It's a mess. Why can't they all be in one place? Mac? One place.
I can't even get Windows 10 to update. Instead I have to constantly kill a rogue update process that decimates resources because I can't get a basic update to download and install properly.
I am far more productive now than I was on OSX. Switched over when the Windows Subsystem for Linux came out (you can see my ZSH shell running in the corner there).
Really? In what ways?
This is interesting to me, because the main reason I'm a mac zealot (for all but le screengames) is because it's mostly indistinguishable from working on a linux box, without any of the negative aspects of running linux on a workstation. Any time I try to do work on my Windows machine it feels very handicapped. WSL helps, but it's still one more layer of abstraction.
SSH keys/git always feel like a hassle on Windows; rsync was slow as fuck for me on WSL (but works fine in Cygwin); any kind of scripted behavior is a hassle. Package management outside of WSL is pretty annoying; choco is okay, but nothing compared to brew/apt.
At best I could see myself being equally productive on Windows, but not without significant effort.
Granted, there were a few annoyances I had to find workarounds for, but nothing insurmountable; we are devs after all. There was a lot I had to relearn the-windows-way, old habits I had to drop, new ones I had to adopt. But with experience comes expertise, and you get used to the new way. I now have that handicapped feeling when I use OSX, death by a thousand restrictions. Funny thing, I never noticed them before; I assumed that's just how things were.
You don't know how far you can push and customize an OS until you make it your main environment and force yourself to stick to it until you overcome and find your new workflow. Pussyfooting around with Windows while you use OSX on your main machine is not how you test the OS for fitness, nor how you change old habits. You have to go in it with an open mind, you're not going to find OSX in Windows, you WILL have to relearn new ways of doing things, then you have to dive into the deep end of the pool head-first and attempt to hit your stride.
Until I wanted Emacs.
Until I wanted to uninstall software.
Until I wanted to setup a VM in VirtualBox (set up a Win2k12 Server VM tonight, and it simply failed to boot 3 consecutive times, then booted up in a "repair" mode, and then failed to boot again, just to boot up again after the 5-6th try. Wonderful).
Until I wanted a damned clock that keeps time (always have to manually stop/start automatic time in the options).
I really want to like Windows but I can't. It feels like it's going out of its way to annoy me. Booting up a fresh Debian install feels so much better... I just feel like I have to understand how things work much more. Linux is a sharp tool, and Windows feels like a clunky bicycle.
Can't use Emacs on Windows? Use any other editor.
Can't set up a VM in VirtualBox? That's (maybe) a VirtualBox problem.
If you go on Windows expecting the same environment as OSX or whatever OS you're on, you're going to have a bad time. Booting up a fresh Debian install feels better because you know what to do already.
Booting up a fresh Debian install is likely to be an almost impossible undertaking for anyone else without reading some kind of guide, if you want to set up a proper dev environment.
There are problems on any platform you choose; you're probably just subconsciously sidestepping those in your process of setting up, while the Windows ones stick out to you.
Example problems I notice on Mac that are fine on Windows:
Docker is extremely slow; I need to run it inside a Linux VM for any kind of proper development
Window management is horrible
Now, I'm no advocate for any OS, I love running Arch Linux with i3wm, I love OSX and I love Windows. I don't see any reason at all to hate any OS, I can setup my dev environment on practically any platform I could want with little or no difference. The only things that change are the things around my environment, the simplicity of i3wm, the task bar on OSX etc.
In my opinion Linux is the outlier here, which provides the greatest change in environment; not a bad one, mind you, just a difference. Mac and Windows are mostly interchangeable; I can switch between them with little overhead.
I don't prefer OS X over OpenBSD or Debian. What I like is that I can mostly just hop from one to the other without doing a context-switch. Things (mostly) work as I expect them to from one box to the other. That's not true for me on Windows (but that is to be expected).
I learned about computers on Windows, from 95 to Vista (briefly touched it and then left for Unix). I used to memorize countless contextual menus and options and paths between each, so that I would see how to solve a problem when it arose and could diagnose it without access to a computer. I still do not have the same ease with Unix.
What I have gained by using Unix is real knowledge about how computers actually work, not only how the OS itself is built. And in my anecdotal experience, typical users of Windows (at work, in college, and among friends) unequivocally understand and know less about computers than typical users of Linux do. That is true of Mac users in general too, but the effect is less pronounced than with Windows; most Mac users that I know have a basic understanding of the command line.
But take this for what it is: personal experience.
But... to Emacs users, there is no "any other" editor :)
// That being said, it's possible to run a native Windows build of Emacs, I remember doing that at some point. It wasn't overly nice though, and you have to delve into the whole msys/mingw/cygwin thing.
It's like they don't even try. After a lifetime of looking down at Windows users, most Mac users go into Windows looking for reasons to validate the way they already feel towards it (irrational hatred) and don't allow themselves to like anything about it. They hit a couple of bumps in the road and quit in frustration and use those as excuses for their decision to retreat back into their comfort zone.
Don't be that guy.
I am proud to say that I can fully configure a dev environment and work on any major OS out there. Linux (Debian and Redhat based), OSX, and Windows.
Currently, I'm glad I'm no longer bound to OSX (deprecated OS, overpriced hardware) or limited to Linux (no gaming and no Adobe suite). I'm on a constantly evolving OS run by a forward-thinking company, on hardware upgradable through the next decade. It can only get better from here. I got no worries.
I don't understand what you are trying to say there.
Not being able to run Emacs as a reason not to use Windows is like saying: Damn! PulseAudio doesn't work on Windows, guess I'm back to Ubuntu.
There is no alternative to Emacs if GIMP does not count as an alternative to Photoshop.
I also have this theme installed on Windows 10, which is why everything is dark, including ConEmu's title bar:
I spend long hours staring at the screen; I need a dark theme to preserve my eyesight. I had a dark theme on OSX until Apple decided to give a massive middle finger to the theming scene when they released El Capitan. The built-in dark mode just doesn't cut it. Yet another reason I'm glad I'm on Windows now.
What is it missing? It's been a while since I really wanted a feature in the desktop OS. All of the 'phone' features (besides launchpad) have made my life a lot easier (continuity, handoff, notification centre, even Siri from time to time).
It's missing proper support for eGPU cards, which is currently the only way to add a high performance GPU to any currently produced Mac.
Blender, for instance, can't even make use of the GPU in my MacBook Pro because the OpenCL support in macOS is not good enough.
As an app developer who wanted to build an app that challenged what a Mac can do with regards to productivity, I basically had to leave the Mac App Store because they made it impossible for me to make it work in sandbox mode.
Trying to get modern Unix software to run on OSX is getting more and more painful. Since development is the main activity I do on a laptop, this is quite important.
Apparently, as someone who favors the Xerox PARC way of thinking, I am not a developer, since I don't embrace the UNIX religion.
You have to admit that around 2003 or so there was a huge influx of technical users to the Mac. My proposition is that a lot of those users were developers moving from Linux (or from an unhappy Windows life, wishing they were using a Unix-based operating system).
I base this on the general sentiment at the time, and I was one of the people who made the switch. I'm sure you can find old posts from me on Slashdot talking about how great the switch was.
Now I'm on Hacker News, talking about doing the exact opposite. I'm moving away from OSX because the Unix experience has become really bad.
Personally, I have known UNIX since the Xenix days, have vast experience across UNIX variants, and use it in some form in many projects, but I'd rather use the OS X and Windows environments.
Strange as it may seem, some of us are actually happier with the GUI developer tooling culture of OS X, Windows, Android, and iOS.
OTOH, ALL Apple hardware that I have bought (multiple iPads, iPhones, iPods, Macbook Pros, iMacs) still works as advertised and is in good shape (I always buy and use Apple Care). I have just recently begun to use my Linux desktop more because my 2009 MBP is starting to feel a bit sluggish.
People can whine all they want about Apple (I personally will not buy the new MBPs; I'll wait until next year's model). The truth is they've set the bar so high that anything less than perfection (which, IMO, is mostly what they gave us during the last decade) is seen as unacceptable. The same cannot be said of Windows - at least in my experience since Windows 95. I recently bought a brand new SSD and a license for Windows 10. One day after installation, Windows crashed and had to "repair" itself. To be fair, it has been running flawlessly since.
That's literally the only way I can see your anecdotal experience being truthful.
The PC marketplace is open, like the Android one. There's a lot of cruft to sift through, but it should only take about 1-2 hours of research to find the best laptop at any given time and any given price range for any given use-case.
But you're right that it's anecdotal.
>Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "let's put this phone feature on the desktop"
I don't know much about this area, other than the little I have read, but didn't MS do somewhat the same thing - "lets put this phone feature on the desktop" - with Windows 8? Not sure if they reversed that in Windows 10. Interested to know.
The apps stuff (Windows Runtime) was actually quite well done for a first try, but it was unfamiliar to Win32 users and got in the way. (Only one click away, but people are impatient and resist change.)
Windows 10 does an excellent job of merging apps and traditional programs, though it's not strictly correct to equate apps and mobile. Windows 10 is a mobile operating system that runs apps from an app store, but you can resize apps and run them seamlessly alongside traditional Win32 programs.
You can still run Windows 10 in tablet (Windows 8/8.1) mode if you want to. I don't know anyone who does.
I don't think Cook is hardheaded. I think he has no idea how to run a product company. Pretty much everyone from the average HN user to Larry Ellison predicted this outcome. Cook is logistics / supply chain etc, Ballmer was sales, neither had any business at the top of their respective companies. Jobs did it on purpose, in my opinion, because he believed the single most important function initially was to keep Apple running smoothly and to complete the iPhone boom (ie the big product for the next decade was already in place). The only question is how long Apple will stay in the Cook era before getting a product leader replacement.
The only problem was that the original MagSafe design was prone to fraying, and probably didn't look as sexy as having the cord at a right angle to the computer.
Check this out: https://griffintechnology.com/us/breaksafe-magnetic-usb-c-po...
I use my laptop, closed, hooked up to an external monitor. Nowadays, I spend a lot of time on video conferences. The number of times that my laptop has shut down mid-conference because I tapped the cable under my desk just enough for the connection to drop for a split second (combined with Apple's -- let's call it "interesting" -- decisions around power management) in the last week is way too high.
Fortunately for me I got a new laptop not long before this new abomination came out. If Apple doesn't correct this gigantic mistake, the next time I get a new laptop it won't be made by Apple.
Anyway, back to your point, I most certainly do NOT want DB-9 or DB-25 ports on my laptop. ;)
It'd also be great if I could have a VGA output, but that ship has long since sailed, hasn't it? It'd also be great if I could have more than two USB ports, but I have a 2015 MacBook Pro. To get four USB ports I'd have to upgrade to the 2016 MacBook Pro. See? To me, the 2016 is better than that one I have. It lets me plug in more adapters, adapters that I've been used to carrying for years. I carry four with me to every client site I visit, and you don't see me whining about it on the Internet.
Regressions in user experience? It's double the number of ports! It's a god damn lifesaver!
Apple laptops are still successful for now; they sold wagons of the latest MacBook Pro, but it is not going to last if they don't fix the OS ASAP.
You missed the most important stages in between: when they became a company that made cute, candy-colored computers that would match your Volkswagen New Beetle, then when they made a totally awesome mp3 player you could get to match it.
Moving to USB-C was the right choice. Charge cable goes bad? Buy another one for $15.
Everything else is spot on. Apple quality is fast going downhill.
They are though. Consumers will pay for hardware with batteries that die after a year and a half. They'll pay for vendor-lock in and to be locked into the walled garden experience. They'll pay for hardware that will be locked out of the walled garden and unsupported by vendor updates after 4-5 years. They'll pay extra for accessories that give basic functionality to their electronics. They'll pay a massive premium for accessories that are subpar for their asking price if they're endorsed by celebrities.
I'm not a fan of newer Apple hardware and software, but we're in the minority.
After half a day of conversations, as we left the building, my very first words to my team were: "Apple Computer is no more. This is a marketing company. They might as well stick their logo on washing machines, microwaves and refrigerators. They know how to market them to be cool and they'll sell millions of them"
Then came the iPhone.
And, yeah, they could have filled homes with Apple appliances. Not sure why they didn't go there. So easy. Not saying it would have been right, but they had the opening and the mindless following to make every home "Apple Cool".
Not a big deal nowadays; my mid-2014 ThinkPad has a similar connection (it won't ever drag the laptop when pulled). I'd think most laptops should have such connectors by now, "mag-safe" or otherwise?
I think we can be sure that under Jobs, they would not be in a situation where you cannot charge the current iPhone model with the current MacBook without a dongle. "Oh, it's not a dongle, it's just a cable." -> Get out of my face.
I still think their rMBP 15s are all around the best you can buy. I haven't tried the Touch Bar version yet, though. Also, OS X (or whatever) on a rMBP is still great for surfing the web. The zoom with the trackpad is flawless.
It's "death throes," not "death throws."
You're describing the changing world we're living in. Apple is merely reflecting it.
Do Windows laptops have it, or you're going to use a 2014 MacBook Air forever..?
What about wireless charging? I see MagSafe as deprecated, and only the MacBook Air has it now.
MagSafe may be deprecated by Apple, but that's a step backwards in functionality. One cannot deny that using USB-C for charging is inferior in terms of protecting the device.
Gotta say, the idea of a single USB-C adapter carrying power and data, and having a general-purpose MagSafe-like connector, is pretty compelling.
.. there's no pleasing some folk.
Add to that the one-and-a-half-key-sized, optionally permanent part (unless you're going to buy one for each cable, it's fixed until you need the safety feature), and it's a huge thing sticking out of the side!
To be honest, I question the usefulness anyway - I've accidentally pulled my MacBook Air off a table by its MagSafe power cable before, and trying it deliberately now, it only works with vertical movement. Side-to-side or directly 'out', it just moves my laptop, even as short and sharp as I can make it.
If I stepped directly on the cable it would be such a directly-out movement; if I tripped, a side-to-side one. Unless my laptop was positioned perfectly such that the MagSafe hung over the edge of the table, and I trod directly down, I can't imagine how it would have its intended effect.
I use it all the time for un/plugging deliberately, but I can live without that, and as above with this Kickstarter product without buying several I'd have to leave the whole dongle plugged in anyway, so any deliberate dis/connection would be a normal un/plug motion.
Are you sure about that? I've ordered two and I've never thought about leaving it in: for me, it's purely a normal connector except that it could "break" safely.
MagSafe is a differentiating feature of Apple's with little to no daily life-saving rationale for its existence. It's there to be different, not to be better.
So, like creating a full blown new programming language (Swift)? Or a full new filesystem (AFS)? Integration of Cloud storage directly to the desktop? Siri on the desktop? Saving RAM through memory compression? Continuity to transfer work across desktop and mobile (and different desktops) seamlessly? All things added in the last few years, with few of them still ongoing.
Sure -- they've totally abandoned it /s.
>Take a look at Sierra: the only feature of note is Siri, which is half-baked as it is, and the things that did get ported over from iOS are half-done too.
That's hardly "the only feature of note". But even so, I wouldn't want Apple to continue to change much in OS X, except refining things.
>and so I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro.
So, you've been merely 3 years on the platform, but have an opinion on how Apple "pivoted its attention" regarding OS releases based on just a couple of OSes? Because I've been here since 10.2, and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).
If it discussed the state of the Mac Pro and Mac Mini, the post would actually have a leg to stand on...
Byproduct of Apple's work on mobile.
> Or a full new filesystem (AFS)?
> Siri on the desktop?
Byproduct of mobile again! (By the way, ask Siri on Mac to set an alarm or interact with homekit!)
But, I take your point. I tried to mostly detail Apple's lack of attention for developers, but perhaps missed the emphasis on that there in the post. Microsoft is really trying with developers and it's readily apparent.
Swift appeared on iOS first and still is not great on the desktop at all.
APFS (not AFS) is the default in iOS, and is still in feature preview and pretty unusable in 10.12. It's "expected" that it will be released for the desktop in 2017.
Siri on the desktop is mostly useless, as said here.
The only difference is that it's being released in a minor update to iOS but will probably wait for a major update to macOS. But that makes sense, because macOS allows the user to customize partitions and filesystems, and allows running other operating systems - including Boot Camp, which Apple itself must provide drivers for, and Linux, which is waiting on Apple releasing documentation for their filesystem, as they have promised to do. By contrast, iOS has the same partition layout on every single device and no cross-OS compatibility concerns. Thus APFS on macOS requires more work and has greater risk of failure. For both reasons, regardless of Apple's priorities, it would be logical to start with iOS. Even if Apple did add stable APFS support to a minor macOS update, they probably want to auto-convert users to APFS eventually, but doing so in a minor update would definitely piss off technical users; they could make it opt-in to start with, but it's easier to just wait.
The next version of macOS is only a few months away; really not a big deal.
(I agree that Apple seems to be prioritizing iOS in some areas; I just don't think APFS is one of them.)
Which makes perfect sense. We talk to our phones all the time anyway, and hands-free is crucial. For desktop, not so much.
>APFS (not AFS) is the default in iOS, and is still in feature preview and pretty unusable in 10.12. It's "expected" that it will be released for the desktop in 2017
Which makes perfect sense. A constrained environment without an exposed filesystem like iOS is easier to convert to a new FS. A full blown desktop OS not so much. That's why it needs much more testing and development to deliver the latter.
But they ARE doing this testing and development.
(Btw, APFS is not "the default in iOS" yet. It will be when 10.3 is released -- it is still in beta atm).
But it also indicates that Apple isn't really designing for the Mac as the author points out.
I mean, Siri is the biggest feature Apple is touting for Sierra (http://www.apple.com/macos/sierra/). And as you point out, it's something that doesn't even make all that much sense on the Mac.
Maybe they don't but I don't see them doing anything major on the iOS side either. Both platforms are quite mature by now anyway.
(And Apple was never about revolutionary new designs and jumps. Back in the day of the iPod and early MBPs etc, we cheered and waited anxiously for at best incremental changes -- now it has USB, now it has a different touch wheel, now it has a color screen, now it has wifi, now it does video, etc, year over year).
>And as you point out, it's something that doesn't even make all that much sense on the Mac.
Yes, but people (and pundits) have been asking for it all the same to appear on the Mac anyway.
And "talking to your computer" has been a thing from the times of 50's sci-fi stories even.
Curious here - can you give some examples? Siri seems to work better (on friends' devices) than "OK Google" on my Moto X.
Well, it will be when that beta iOS is released.
Besides, it makes sense for any sort of file system to be released first in such a limited manner.
So the limited manner would be to release it to desktop?
1) Catering to casuals. The Xbox One launch is probably the best example. They wanted to become the home media center, so the entire launch and discussion was centered around sports streaming and social media. + Kinect.
2) Desperately trying to get a foothold in mobile.
I think they realized people weren't going for either. Sony focusing on games and hardware killed it initially in sales, and still kind of does. Apple and Android had two corners of the market and MS could make a space for a third.
They pivoted, targeted "professionals", and that's what you see in their approach for Xbox & PC. I think it's overall good, but saying Apple abandoned developers is sort of desperate fear mongering.
Apple's strategy has always been pretty consistent, focus a lot of resources on making a good ecosystem. More recently it seems that's changed to "focus a lot of resources on making the best systems, and allow for a great ecosystem" with sponsoring third party monitors, and apple's home automation front.
I think it's still a pretty good strategy, but with windows catching up with some "pro" features like their own premium desktops and laptops and adding in old features like "workspaces" people jump to say Apple "abandoned" them when they're still kind of sticking to the same successful strategy.
I've always liked Mac OS better than Windows, and even though Windows 10 is finally a usable system for development, I still don't trust Microsoft. I've been burned too many times by a promising system that then gets hobbled: by OEMs adding their own adware and other BS, by Microsoft caving in to the bean counters and hobbling their systems in order to fight Chinese piracy, and by some C-suite manager's bright idea that they should waste resources on some "Synergy" play instead of just making the OS, and other products, better.
At the core, you still have an ecosystem where the people making the computers can't make enough money on their own because Microsoft and the chip makers are squeezing them. Which makes for a sorry experience in the long run, even if Apple has its own problems.
Don't get me wrong, Azure is blazing amazing, and so is .NET in general, but when your enterprise IDE still hasn't found a way to function properly when the "Documents" folder is on a network share, then you've still got a ways to go.
Which has been their approach for a very long time - let's not forget that Ballmer speech. From what I have heard, Apple puts minimal effort into Xcode; the iPhone lead dev here was throwing around all forms of cuss words trying to get CI set up.
They support all three on Desktop systems, and they have features and APIs for both that only make sense on desktop systems too.
The desktop isn't a phone. Forcing phone features onto a desktop and slowly turning the desktop into iOS is not a good thing.
We were just fine without an app store. Just fine. I can't think of any of the "use this mobile feature on your desktop" additions that provide any real value...
Siri, maybe, but there's nothing about an FS or a programming language like Swift that makes them a "phone feature". Same for memory compression (which only exists on the Mac, IIRC) and Continuity (which would be useful even if it only worked from Mac to Mac).
If anything Apple has adamantly refused to "force phone features into a desktop", and only ports stuff that makes sense -- that's sort of what Microsoft did, trying to merge e.g. tablet and desktop platforms in the same UI.
>We were just fine without an app store. Just fine.
Well, I'm much much better WITH an app store. Why wouldn't I want one? I might want some more features out of the Mac App Store (like the ability to demo an app) but I very much want to have it.
And in no way is an app store a "mobile feature". App stores, (and app repos) existed way before they appeared on mobile phones, for one.
Windows 10 has memory compression.
You were just fine. Don't speak for everyone.
Finding applications on the internet is a dangerous endeavour if you are inexperienced. Many people get tricked by popups suggesting their computer is broken and they need to download a "cleaner app". Many people get tricked by the fake Download buttons. Many people just use whatever the first link on Google is.
And then once they are tricked they then proliferate their passwords, email addresses, credit cards etc all over the internet.
App Stores IMHO are a must have for most people.
Additionally, it makes it hard for two apps to share required libraries, meaning each must bundle whatever it requires, AND either updates are manual (read: nonexistent) or each app includes its own nagging update mechanism.
In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.
First, there's nothing "toy" about Swift. It's a full-featured language on par with Rust, Go, etc.
Second, whether some random web dev or embedded C dev or Java dev wants Swift or not is irrelevant to its intended audience and utility. It's not for "professional developers" in general, it's for Mac/iOS application developers, and thus, for Mac/iOS users (that benefit as a side-effect from developers having a modern/better language to implement stuff).
>I'm buying a powerful laptop, I don't want cloud storage.
So? Millions of users do want it, judging from the huge popularity of Dropbox, Google Drive, MS implementation of the same, etc.
>I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem.
Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+ and Apple obliged.
And whether you can "dual boot" or "share drives" is orthogonal to whether OS X has a new native filesystem. You can still format the rest of the hard disk with another FS to dual boot, and you can still share drives with other systems in NTFS, exFAT, etc.
>I don't want compressed memory, I want 32GB of RAM.
And I want a pony unicorn. But until Intel delivers boards that accept 32GB of low-power RAM of the kind that goes into Apple laptops, we're not gonna get it. And Intel gives late 2017 / early 2018 as the date for those. (Existing PC laptops with 32GB of RAM merely sacrifice battery life by using regular high-power-draw RAM -- and even those are few and far between.)
(And of course compressed memory helps whether you have 16 or eventually 32GB of ram)
>In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.
Well, as a 15 year Mac user and 20 years professional developer, I do want those things.
I don't particularly care for the touch strip though -- I'd prefer it to be OLED physical buttons giving both tactile feel AND changeable inscriptions.
Yet just about everyone uses local FS with cloud for backup or sharing of a small subset. Giving us an odd cloud first strategy seems more suited for a phone or severely storage constrained devices.
> Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+
I seem to remember that most of that conversation was about ZFS. They even got a long way into the ZFS port before abandoning it. HFS+ has been getting long in the tooth for years, so a better FS is long overdue! So now we're getting APFS, which explicitly doesn't checksum user data! With current storage sizes, that's disappointing, to understate it hugely.
> until Intel delivers boards that accept 32GB low-power RAM
It's funny: if Apple had not significantly reduced the Wh of the MBP batteries in the latest generation, we could easily have had both. Probably a longer overall battery life too. Teardowns show a lot of empty space around the current batteries.
What "odd cloud first strategy"? Cloud is just ANOTHER option, not a "first" or privileged one. To the point that Apple also includes a whole new local filesystem (in beta) with Sierra.
>Seem to remember that most of that conversation was of ZFS. They even got a long way into the ZFS port before abandoning.
Oracle bought Sun and poisoned the area with patents and threats.
>It's funny, if Apple had not significantly reduced the Wh of MBP batteries in the latest generation we could easily have had both.
No, we really couldn't. At best we'd have a 10% or so larger battery space. The impact of RAM (there whether you use 32GB or fewer for a task or not) is much larger.
In most cases, not because they want to, but because that's how current software works. Both Dropbox and Google Drive focus on having their own special folder that gets synced; syncing the OS desktop and documents folders is possible but only with special configuration (on both Windows and macOS).
Cloud first makes perfect sense no matter how much or how little storage you have. It prevents you from losing data if your device is lost or damaged, and if you have multiple devices it allows accessing all your data from any device. If anything, its usefulness depends more on internet connection speed.
I do think Apple's specific cloud storage offerings could be improved for large devices. Currently the highest tier of iCloud storage is 2TB, which costs $20/mo; that's only twice as large as my MacBook Pro's SSD, and fairly expensive. But it's the same price/GB as both Google Drive and Dropbox, so it can't be that much of a ripoff...
Until that other OS has support for the new proprietary filesystem, it won't be able to read it. Since Apple and Microsoft categorically refuse to implement any of the many featureful existing filesystems, one is stuck with archaic NTFS (with no file permissions) or FAT (with its less-than-4GB file limit) to keep data on.
In NTFS, each file or directory can have arbitrary access control list specifying granted and denied permissions for users and groups.
Those lists are typically inherited down the directory hierarchy, but that inheritance can be stopped.
exFAT, which I already mentioned, is supported and doesn't have a 4GB limitation. You can have files up to 128 pebibytes (which should be enough for everybody: that's ~144 petabytes).
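For the curious, the pebibyte-to-petabyte conversion above is easy to check (a quick sketch of the unit arithmetic, nothing exFAT-specific):

```python
# 128 PiB (binary prefix) expressed in PB (decimal prefix).
PIB = 2 ** 50   # bytes in one pebibyte
PB = 10 ** 15   # bytes in one petabyte

limit_bytes = 128 * PIB
print(limit_bytes)                  # 144115188075855872
print(round(limit_bytes / PB, 1))   # 144.1, i.e. ~144 petabytes
```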
I've heard these words so often that they've lost any meaning to me. :)
A language with no promises of stability counts as a "toy" in my mind (I understand that this can be subjective, just providing my opinion).
Say what you want about Objective-C, but at least Apple stuck with basic syntax decisions.
Migrating is a pain, but I'll take the Swift migration pain over the maintenance of old Objective-C code any day. Even on large projects.
As a Linux/Windows developer you are not the target audience (which I already addressed).
Besides, just because you can't use it, it doesn't make it a toy language, just a language that doesn't have support (or full support) for Linux.
A toy language has a specific meaning; it's not a generic word for "language I can't use professionally on my platform/industry".
I can't use Forth to make web apps either, which is where I make all of my money, but it's not a toy language by any means.
As a developer for Apple platforms, am I right to consider C# or Java toy languages, too?
The C# that he mentioned wouldn't be a toy language even if there were 0 Apple and Android C# apps. It would just be a totally professional Windows language with no Android/iOS support.
C# was professional even when it only supported Windows (and partially FreeBSD), before Mono came along...
As well as the decade-plus when Mono existed but wasn't owned by Microsoft, making cross-platform support merely incidental or even a negative from Microsoft's perspective.
That's some amazing Kool-Aid you've got there. But I'll grant you that it's on the same level as a bunch of other new languages that also aren't in general use for serious long-term products yet.
> But until Intel delivers boards that accept 32GB low-power RAM of the kind that goes into Apple laptops, we're not gonna get it.
Yeah, wouldn't it be great if Apple made this happen? That's the kind of innovation I want to see.
As for the bulk of the rest of your points: developers and common Mac users have different priorities, and your opinions are not mine. It will take time to find out who better represents professional developers.
Adding something with marginal utility for the majority of their users (since even pros can do just fine with 16GB) just because some insignificant minority runs multiple VMs on their Macs?
That's not the kind of innovation Apple was ever, ever, interested in.
Then you're in luck. APFS isn't proprietary, as Apple's website claims that "Apple plans to document and publish the APFS volume format specification when Apple File System is released for macOS in 2017." So there will be nothing preventing drivers from being written for other operating systems, though of course it will require someone to actually do the work. In return you get a FS that improves on HFS+ in reliability, performance, and features (volumes). If there is enough interest I imagine it might even compete with btrfs for use as a root volume for Linux installations… though there probably won't be.
You may criticize Apple for not open sourcing their implementation of APFS, though I'm holding out hope they will do so eventually. An open source implementation would definitely be useful as a starting point for drivers for other operating systems, but to be fair, it would still require someone to spend the time to port it. Apple's HFS+ driver is open source as part of xnu, but Linux hfsplus doesn't support journaling which was added in 2002. That's not Apple's fault.
Great. ... anyone else still waiting for the open FaceTime API announced in public on stage by Steve Jobs?
Only people who missed the part where some other company took them to court over a BS patent on it and won, so they couldn't do it in the end.
Meanwhile, they HAVE open sourced other stuff, including Swift, so it's not like they have some big history of backtracking on open sourcing promises...
The rumor is that Apple changed its mind due to the VirnetX patent lawsuit, which forced it to switch FaceTime from P2P to relaying all calls through a central server. But who knows if that's the whole story. I'd still like to see an open FaceTime.
Why do you think Swift is a toy language?
I think it is a question of emphasis. You've focused on the OS, but not all the innovation was in the OS itself. Remember when they added iLife? That was a big deal. Software you had to pay for on Windows was bundled free, and it was good. Everyone used iPhoto, iMovie was nice, GarageBand was awesome, although probably not everyone needed it.
(Good lord, just checked, iLife debuted (minus GarageBand) on 10.1. Evidently, I've been using MacBooks longer than I realized.)
The sense that they have an exciting vision and plan for the future of how people use and relate to the computer is gone. That sort of vision and the innovation that goes with it is reserved for the mobile space now.
They are still fine computers, particularly the notebooks, but the mobile products are driving the process now. I do think that is a shame.
This was done extremely poorly, though. Among the big sync providers (Dropbox, Sync, Box, Google), iCloud Drive is easily the worst.
The iOS app looks like something an intern cooked up. Hardly any iOS apps can read or write from the drive. There's no way to share anything in your drive. There's no way to share anything into your drive, either.
While Apple introduced the ability for your Documents folder to be stored in iCloud, macOS has completely lost control of the Documents folder in a very Microsofty way: Mine has turned into a rubbish heap of various application data that I didn't ask for. Savegames, virtual machines, "My Music", "Microsoft User Data", etc. — it's so unusable that I ended up hiding the entire folder and creating a new one that I could use for, you know, documents.
The iCloud services that are hidden away from the user seem pretty great, but the way Apple decide to expose the Drive is terrible. It's almost like they don't want users to actually use the drive.
Neither of these are Apple's doing - obviously in the case of "Microsoft User Data", and I don't know what "My Music" is but Apple's software puts music in ~/Music. (Heh, it seems I have a "My Music" in Documents, as well as "My Documents", "My Pictures", "My Games"… not sure whether this is Wine or VMware's doing, or whether they're there for the same reason as on your system.)
Applications are not supposed to use random subdirectories of Documents (that's what Library/Application Support is for), and in the case of sandboxed Mac App Store apps, can't. But I don't know what you want Apple to do about random unsandboxed third party apps deciding to do it anyway. I guess if the Mac App Store had been more of a success, more apps would be sandboxed, but still…
I guess it would be nice if macOS made it easy to sync custom (existing) folders to iCloud rather than just desktop and documents. But if you're starting from scratch, you can just create the folder under iCloud Drive and drag it to Finder favorites. If you're a CLI user, add a symlink in the home directory or whatever.
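For CLI users, that symlink is a one-liner. Here's a minimal sketch in Python; the iCloud Drive location below is where current macOS versions keep it, but treat the exact path as an assumption to verify on your own system:

```python
import os

# Assumed location of the iCloud Drive folder on current macOS versions.
icloud = os.path.expanduser("~/Library/Mobile Documents/com~apple~CloudDocs")
link = os.path.expanduser("~/iCloud")

# Create ~/iCloud -> iCloud Drive, skipping if something is already there.
if not os.path.lexists(link):
    os.symlink(icloud, link)
```

The same thing in Terminal is just `ln -s` with those two paths; either way, the folder then shows up in `ls ~` and can be dragged into Finder favorites.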
I agree iCloud Drive on iOS is really inconsistent and awkward, and it would be nice if you could share folders with other users. I don't know what you mean by "no way to share into your drive", though. You can add arbitrary files to iCloud Drive from the share menu (and you can access arbitrary files from the iCloud Drive app).
"Microsoft User Data" is apparently from earlier versions of Office. Newer versions will look for it in ~/Library/Preferences, and you can actually move the folder there to get rid of it. 
Here is my Documents folder. It's 100% crap that I did not ask for.
As for sharing: This discussion was about macOS, although the same limitation applies to iOS: There's no way to share an iCloud Drive file directly. If you do right-click -> Share -> Email, you get an attachment that you need to send, even though the file is already reachable through the cloud. Every other competitor (Dropbox etc.) adds a "Copy Link" option to the Finder context menu.
Apple's documentation implies that applications are not supposed to use the documents directory themselves (only when the user selects it in the file picker).
>So, you're merely 3 years on the platform, but have an opinion on how Apple "pivoted its attention" regarding OS releases based on just a couple of them? Because I've been here since 10.2, and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).
Exactly. Sigh. Kids these days. It's like the complaint about WebKit being the new IE from someone who never actually experienced web development during the IE era, who falsely claimed the IE era means IE 7+, and then later acknowledged he wasn't even doing web development in IE7's time! Anyway......
There are lots of new fans in the Apple ecosystem since the iPhone. And without the RDF from Steve, they quickly lose sight of Apple's greatest strength: Apple is the master of iteration! Very rarely will you see a completely new product category introduced. They just continue to iterate. Sometimes taking two steps forward and one step back.
I agree Apple has spent less time on the Mac, and especially the desktop in general, but most of the points in the article were not convincing.
I agree 100% with your point. If you want pointless changes to the OS that means you have to relearn a lot just to keep doing what you were doing, then yes, Microsoft is what you want.
If you want small incremental improvements that don't move around common functions for no reason, then stick with Mac OS.
True, there probably isn't much the average user would like to change in OSX. But it would be nice if stuff that was broken got fixed and stuff that was wonky got improved.
It actually shows marked performance improvements, but that's because it focuses on performance on the type of storage an iPhone uses.
If that's solid state storage, then it's the same storage all Macs will also use in the future (and most models now).
I recently had the pleasure of checking in on Windows for the second time in 7 years. I had been a PC user since the DOS days, starting on a 286 and going all the way to Windows 8, when I switched to Macs around 2010. I've only ever needed Windows for games since then, and decided to install Boot Camp on the 15" 2016 MacBook Pro that I just got (which happens to be a pretty good machine, all in all, and runs Paragon in high detail at 1920x1080 at 60 FPS; enough for me).
There is still so much wrong with Windows I honestly don't know where to begin. The UI remains a convoluted Escher'esque nightmare. Apple and macOS are still FAR from fucking up badly enough to make me want to willingly head back to Microsoft.
I am running both Sierra and the latest version of Windows 10 side-by-side (with Parallels Desktop) so the differences are very glaring and obvious. Maybe I'll write up an in-depth comparison later.
Comments like this make it hard to take any side of the discussion seriously.
/Neither/ of the OSes is an "Escher'esque nightmare." My elderly mother can kludge her way through both of them, more or less reliably - admittedly, with more trouble than using her phone, but not a lot more.
Nothing that a tech-illiterate can reasonably use on a day-to-day basis qualifies as a "nightmare." Can we please stop pitching hyperbole-for-effect until it nullifies the entire discussion?
• macOS Finder: http://i.imgur.com/9Y5hK0e.png
• Windows Explorer: http://i.imgur.com/ef7CnJ4.png
Alright. Maybe not nightmarish, but one of these IS objectively more convoluted than the other.
Let's start with something simple:
WHY does the "Options" button on the Windows Explorer Ribbon (in the top-right corner) drop-down to reveal a SINGLE menu item which DOES THE EXACT SAME THING as clicking on the button itself?
(Bonus: This is what happened when I tried to take a screenshot first: http://i.imgur.com/UsN554o.png — Explorer hung when trying to remove a shortcut to a now-missing network share.)
I look at the macOS Finder and see unlabelled icons. My elderly folks are terrible at remembering what various icons do, what error messages are for, etc. I've trained them to pause and read the words on the toolbar, read the words in the window, etc., as they're generally self-explanatory.
So the file manager that is more convoluted for a more proficient user is the one that my folks can actually navigate more reasonably.
Okay, your criticism is on point: the Explorer Options button has a redundant click. To me, that would matter if I ever used anything other than keyboard navigation. To my mother, father, and sister, "yay! It's f'n labeled!"
My nieces don't give a damn either way, because they're too young to care about one unnecessary click, and they don't forget what the icons mean.
Your point isn't wrong, it's just not that clear-cut an issue when dealing with different people with different levels of proficiency and needs.
But the fact that we're down to "labels vs. reasonably intuitive icons, plus or minus a redundant click" suggests we're well into the weeds of trivialities.
You're a miracle worker. :)
Failure to read what's on the screen is one of the biggest reasons why "non computer" people don't get things done with computers.
In any case, the longer label could have been implemented as a simple mouse-hover tooltip, instead of an extra button alongside the primary button that expands to a menu with a single item (with its own icon too!)
I'm not just comparing it to macOS; the Windows UI is heavily flawed not only in terms of efficiency in usage of system resources, it is inconsistent with itself, and baffling no matter which school of design you adhere to.
The option button was just one of the immediate examples I could point out in a quick reply. There is still a lot to pick at in that screenshot without comparing it to any other OS.
As an aside, I don't think that "average" or new users are a good gauge of how well-designed a system is; people can learn and put up with horribly unintuitive interfaces out of necessity when they don't have any better alternatives. Just look at computers and phones and other devices and systems of only 20 years ago. Many will make you go "how the hell did we live through that?!"
Likewise, "All My Files" sorts based on... what? And which folders does it look in? Because very often I save a file from the Internet and it never shows up in All My Files. And when I search for something when I'm in a directory, I expect the search to be restricted to that directory by default. Instead it defaults to My Mac, which makes no sense. Is there also no way to get the current path as text? On Windows I can type "C:\Users\My Documents" by hand in Explorer. I don't see an option to do that in Finder.
Explorer isn't perfect, but Finder is just awful. Just awful.
Right-click on the toolbar and select "Customize Toolbar...", then drag the "Path" button into the toolbar.
In macOS, generally:
⌘ Command = App-specific shortcuts. The primary modifier key.
⌃ Control = Global shortcuts. App-specific shortcuts will not generally have Control as the sole modifier.
⌥ Option/Alt = Something to do with the keyboard, selection, or a secondary modifier for a shortcut. Option+Up/Down is Page Up/Down.
⇧ Shift = Something to do with the keyboard, selection, or a secondary modifier for a shortcut.
Common shortcuts are way more consistent than in Windows, and you can search for them from the Help menu of any app, and even modify/create shortcuts for any app from System Preferences -> Keyboard.
> As far as I can tell, there's no non-keyboard way to do it.
Right-click on the folder icon in the title bar, use Columns view, or show the Path Bar from the View menu.
> Likewise, "All My Files" sorts based on... what?
Files in your home folder's hierarchy that are indexed by Spotlight.
> And when I search for something when I'm in a directory, I expect the search to be restricted to that directory by default. Instead it defaults to My Mac, which makes no sense.
Keyword: defaults. You can change it.
> Is there also no way to get the current path as text? On Windows I can type "C:\Users\My Documents" by hand in Explorer. I don't see an option to do that in Finder.
"Go to Folder", it's under the Go menu.
To copy an item's path name, hold down Option/Alt to modify the "Copy" menu item.
Most of your complaints are basically "it doesn't do everything the way Windows does it so it's bad" or "I'm furious because I didn't bother to tweak the defaults."
"You are holding it wrong".
The only way to know this, it seems, is that someone has told you.
There is no indication in the design of the element, and IIRC not even a subtle glow on mouseover.
Sorry: as much as I like Apple, I don't waste my time on their software. I'm not saying it is bad, but for me I had to admit it was a waste of perfectly good working time and focus.
If you like it: good. I know people who seem to be more productive with Macs. And more Apple users is good news for both Linux users and Windows users, since they force companies to think cross-platform and force Microsoft to be more humble.
It's okay to have to be taught some things, it makes it better for pro users when the handholding is minimal.
Here's another one: hold alt when a menu is open to see some options change into alternate versions. Instant feedback, works on dialogs too. Keeps the UI uncluttered for noobs and power users alike without significantly impeding you.
HOW is Cmd+Up NOT an obvious shortcut for going UP one level? As I said, Cmd is already the primary modifier key for all app-specific shortcuts.
On the other hand, what does Windows use? BACKSPACE! A key people associate with deletion and editing text!
> Hold down option to change the right click menus?
Option modifies shortcuts, changing their behavior. "ALTernate." Again it's a behavior that's consistent across all macOS apps.
> Explorer is terrible but it's better than Finder because you don't need to read a manual
You don't need to read a manual for Finder, or almost any macOS app, because you can just click on the Help menu, and type the name of a command or action you expect that app to be able to perform, and it will automatically highlight the menu which contains that action:
I just typed "path" in the Finder help menu and it showed me the shortcut for "Copy as Pathname" including the Alt/Option modifier. And that works in EVERY macOS app. Try doing that in Windows!
Back when I used it, 5 years ago, it was a solid mess IMO:
Jump one word forward or backwards?
Windows/Linux: ctrl + arrow
Mac: depends on which program you are currently using
IIRC our resident Mac enthusiast explained that it was because some apps were written in Carbon and some in Cocoa or something, but I don't care.
Opening a file upload dialog box in one browser window would block other windows from the same browser forcing me to use two different browsers: one for docs and one for the task at hand.
This went on for 3 years, and in the end I looked forward to leaving as much as I had once looked forward to getting a Mac.
To each their own but Mac users talking about consistency and ease of use still baffles me.
> Windows/Linux: ctrl + arrow
> Mac: depends on which program you are currently using
I don't know what the deal is with Carbon - it was already dead 5 years ago, even Photoshop transitioned in 2010 - but everything I use uses option-arrow. Even the terminal.
> Opening a file upload dialog box in one browser window would block other windows from the same browser forcing me to use two different browsers: one for docs and one for the task at hand.
I just checked and at present in Safari, file upload dialogs block the current window (i.e. can't access other tabs) but not other windows. Though to be honest, I don't understand why you needed to keep file upload dialogs open… why would you click the button to pop the dialog if you weren't intending to immediately select a file? But I guess https://xkcd.com/1172/ applies :)
Would finding and copying a somewhat long folder path from the internal wiki qualify as a good reason?
macOS has demonstrably more discoverability than Windows: not only are shortcuts consistent across all apps, but the visuals of common UI elements are the same as well, so a new user will immediately know what does what in different apps. Whereas on Windows, you have a multitude of menu, button, scrollbar etc. styles right out of the box, in Microsoft's own apps!
macOS also has a list of all mouse/trackpad gestures — complete with videos — in System Preferences. Likewise, all global keyboard shortcuts are visible in one place as well (Keyboard preferences, where you can modify and even create your own shortcuts for ANY app.)
Best of all however, you can click on the Help menu and just type the name of a command or action that you expect an app to have (for example a filter in an image editor) but don't know its shortcut or which menu it's under, and the unified menu bar subsystem will automatically highlight all matching menu items and their shortcuts for you:
Here in that screenshot I wondered about the ability to copy a folder or file's pathname, and the Help Search pointed it out to me! And it works in EVERY macOS app! Does Windows have anything which comes close to aiding in discoverability like that?
I will concede your point on iOS, however.
In Windows, to jump to a file starting with 'docu', one can start typing d o c and Explorer will jump to the file.
Likewise, it's possible to tap d d d d to navigate between all files starting with 'd'.
Is there a way to do either of these with finder?
Typing the same letter a second time in Finder will go to the next file starting with that letter, which could be a handy feature. But d, down, down, down seems almost as good as d d d.
I don't use Windows, but it's important to use fair comparisons.
You have to expand it and be faced with that hodgepodge of cluttered buttons to perform many functions. Even with it collapsed, it's still 4! toolbars (counting the buttons in the title bar and the ones in the bottom status bar) and a menu (?) style that is inconsistent with many other built-in Windows apps.
While you can argue about whether or not the option to modify folder settings belongs within that window or on a title bar, taking screenshots like that is dishonest, at best.
This is the default view for Finder on a new user account: http://i.imgur.com/Dni0Kpo.png
The only thing I changed in this Finder screenshot is collapse the auto-populated Devices, Shared and Tags list for privacy, so I didn't have to edit the image. Notice that it defaults to the "All My Files" view which is empty for a new user and would be even more of an unfair comparison to the default Windows view.
Which one is even the options button in Finder? The cog, I assume. If I'm helping someone over the phone, can a user describe the current state of Finder? Not really. But I could ask for the contents of the address bar in Explorer.
• Click on the first icon/happy face at the bottom of the screen (to make sure Finder is in focus).
• Click on "View" at the top of the screen [or press Cmd+Option+P] to show the Path Bar.
To guide someone to a specific folder:
• Click on "Go" at the top of the screen [or press Cmd+Shift+G].
• Type the address.
The Windows Explorer uses Alt+Enter to show the properties of something, and in games and other apps that shortcut means Full Screen.
Also, why does Alt+F4 quit an app? What does F4 stand for? Compared to Cmd+Q to Quit or Cmd+W to close it's a lot less intuitive.
And so on and so forth.
Because "Enter" has been used as a key to activate/open the currently selected item in various UIs across various platforms for several decades now?
Also, possibly, because the button is labeled "Enter"? I guess on Macs it's "Return" though, but that doesn't really make any more sense.
Alt-F4 makes little sense, true, but in almost all Windows apps Ctrl+Q or Ctrl+W does exactly the same thing as on OS X.
Also, having sane defaults for keys people are likely to press is much more important than /not/ having obscure keyboard shortcuts that users never have to know even exist.
In particular, if you have unsaved data, most apps will prompt to save on Alt+F4.
However, Alt+F4 turning into WM_CLOSE is also handled through the app's message loop - i.e. Windows posts WM_SYSKEYDOWN events for those keys, and they only take effect once the app's loop pumps them (TranslateMessage/DispatchMessage, with DefWindowProc ultimately generating WM_SYSCOMMAND/SC_CLOSE and then WM_CLOSE). So if the app is hanging, messages aren't pumped, and the shortcut doesn't work. To send WM_CLOSE, you'd need to use the close button on the window.
So if the app does close on Alt+F4, it was, at the minimum, still pumping and dispatching its messages.
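The dependency on a live message loop can be sketched with a toy model. This is plain Python with invented names (`Window`, `post_key`, `pump`), not real Win32, where the actual calls are GetMessage/TranslateMessage/DispatchMessage; the point it illustrates is only that the shortcut arrives as a key event and becomes a close request inside the app's own loop:

```python
from collections import deque

class Window:
    def __init__(self):
        self.queue = deque()      # stands in for the thread's message queue
        self.closed = False

    def post_key(self, key, alt=False):
        # The OS posts a WM_KEYDOWN-style event; it does NOT post a
        # close message for Alt+F4 directly.
        self.queue.append(("keydown", key, alt))

    def translate(self, msg):
        # Stand-in for the translation/dispatch step that recognizes
        # the Alt+F4 chord and turns it into a close request.
        kind, key, alt = msg
        if kind == "keydown" and key == "F4" and alt:
            return ("close",)
        return msg

    def pump(self):
        # The app's message loop. If the app is hanging, this never runs,
        # so the queued Alt+F4 is never translated and nothing closes.
        while self.queue:
            if self.translate(self.queue.popleft()) == ("close",):
                self.closed = True

w = Window()
w.post_key("F4", alt=True)
assert not w.closed   # event is queued; a hung app stays in this state forever
w.pump()
assert w.closed       # a responsive app's loop translates it into a close
```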
Launching processes with a single keypress and no confirmation can hardly be called "the least destructive action."
(As a sidenote, you can drag documents from the finder into a Terminal window and their absolute paths appear at the cursor position).
As for Cmd+Q and Cmd+W, those might be somewhat more intuitive, but only for some languages, and arguably only very vaguely intuitive for Cmd+W even for english speaking people.
But then again, I don't think there is a real notable difference in the end between most modern systems. It's more of a question of what we are used to.
Lol. 10/10 troll.
The Finder is still less cluttered, with all its buttons in 1 toolbar instead of spread across 4.
If it's 2, that's because other items can be added to that menu, such as LastPass and Bitdefender just to use the examples conveniently at hand.
And as a quick tip if you need to take a lot of screenshots on Windows - Greenshot is a great screenshotting program for Windows (and OSX, though I don't know how it is there), free, GPL, etc. Binds the PrtSc button and lets you select the area you want to capture as well as destination, format, etc.
And as for when the ribbon is closed, assuming you're using an appropriate language selection I think I'd find "File/Home/Share/View" at least as informative as "4 squares/4 lines/3 boxes/multisize boxes" - and the text labels next to them on the Windows side when you're on the View ribbon to be much more informative.
It's a bona-fide menu item, accessed via that extra dropdown button below the primary "Options" button.
Totally disagree. I hopped from OS to OS for over a decade every other year or 3 (Win, OSX, Linuxes with various DMs/WMs) and it's always, always an adjustment period from "WTF is this" to "totally productive with this now", usually a matter of 1-2 weeks. Happy with Fedora/Gnome right now but have to say just Explorer itself beats Finder and Nautilus (rebranded as "Gnome Files" apparently now) for keyboard users when it comes to out-of-box file managers.
The weird new tiles/Metro UI stuff they tried pushing from Win 8 is of course a pretty bad attempt to Microsoftify the "secondary desktop layer" concept that has been prevalent on the other OSes for longer, and better. It sucks, but it's also useless and thus not used by many/most power users, I'd guess.
The rest of the "UI" question is in the hands of individual independent app developers, as it properly should be.
Well I'm kinda visual/directional when it comes to files and other data structures. Even before I had a PC that could run Windows, i.e. living in MS-DOS and certainly being aware of navigating and operating the file system via CLI, I much preferred the blocky-ASCII GUIs (with menus and "windows" and panes, usually both mouse- and keyboard-capable) of Borland, MS and many other software vendors, incl. "commander"-type file managers. So.. probably a very longstanding "habit" thing.
The irony is that in those days the "power user" was generally adamant about getting as quickly and as far away from the CLI as they could, even though they needed to know some basics to automate, script, and launch stuff. Other than the odd GNU beard. Whereas the 2017 generation loves the CLI; I would never have predicted such a thing! =) I've found time and again that once one operates with more than one or two dozen commands, it's disturbingly often `command --help` or `man command` first, because there are only words (that vary from command to command), absent a program built on the rich set of commonly used, de-facto-standard spatial gestures and interaction primitives that have evolved over the years.
Of course, not exactly comparable, but it's pretty amazing.
Your pwd opens up in vi and you can edit it as you see fit. When you :wq!, all of the filename changes you have made get written to the directory.
It also has a well integrated terminal with directory tracking, so it is straightforward to switch between command line and graphical operations.
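The rename-by-editing mechanic described above (essentially what moreutils' `vidir` does) can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation; `edit_directory` and its `editor` hook are invented names, and a real tool would also handle deletions and name collisions:

```python
import os
import subprocess
import tempfile

def edit_directory(path, editor=None):
    """Bulk-rename files by editing their names as text.

    Writes the directory listing to a temp file, lets `editor` modify it,
    then renames entries whose lines changed. `editor` is a callable taking
    the temp-file path; by default it spawns $EDITOR (or vi).
    """
    names = sorted(os.listdir(path))
    with tempfile.NamedTemporaryFile("w+", suffix=".txt", delete=False) as f:
        f.write("\n".join(names))
        tmp = f.name
    try:
        if editor is None:
            subprocess.run([os.environ.get("EDITOR", "vi"), tmp], check=True)
        else:
            editor(tmp)
        with open(tmp) as f:
            new_names = f.read().splitlines()
        if len(new_names) != len(names):
            raise ValueError("line count changed; refusing to guess renames")
        for old, new in zip(names, new_names):
            if old != new:
                os.rename(os.path.join(path, old), os.path.join(path, new))
    finally:
        os.unlink(tmp)
```

Keeping the listing as plain text is what makes this pleasant: any editor trick (regex substitution, block edit, macros) becomes a batch-rename tool for free.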
Have you checked out Nemo, which is basically the good old, full-featured Nautilus?
Uh... no. No, you can't do that.
WiFi configuration was annoying. Lots of clicking. No consistency to the UI. And for a relatively new laptop, it was slow. OSX in comparison was simple, relatively straightforward, and responsive.
Certificates were a joke. We could import certificates and CAs. But once imported, they didn't do anything. Access to the corporate intranet was still blocked with "unknown CA". When we went back to see if the certs had been imported... nope. No certificates. OK, import them again? Nope, still no certificates on the machine.
We gave up and let him use a spare Mac Mini.
I hate where Apple has been going with OSX: Slow strangulation by oxygen deprivation. But until it becomes entirely unusable, it's still better than Windows for me.
Install (only) MS Word, and there's another cycle of updates. Restart, check updates, more updates!!! It's just insane. Why does it download a 100 MB Outlook spam filter update? I don't have Outlook installed; I only wanted freaking MS Word.
Granted, Windows has a usable file manager (Explorer) and Finder is an embarrassment, but there's always Path Finder.
Apple's notebooks are excellent, no argument there.
Unfortunately Windows 8 and 10 are just not great OSs, in my opinion. A major theme on these threads is "Apple is trying to force mobile onto desktop computers," yet I see Windows doing it just as much, and not nearly as smoothly as Apple: Cortana, the weird Metro UI layout, terrible settings, confusing maintenance, updates that never work, slow response times, etc.
I gave up on Windows 10 when I got my father a new laptop and had to troubleshoot my sister's. The OS is a mess. Finding basic things is so difficult. Just give me a control panel. Now we have "Settings" too, which is just a mess. Windows Update never works properly. It constantly needs a troubleshooter to run and start and stop the services again.
My dad didn't want to log in with an email to his laptop. So I disabled it (which was difficult to find, so I had to google it.) Now it's suddenly back, and he's complaining about it. There's no way he manually did it purposefully.
Trying to disable things like Cortana never works right. The huge debacle of info being sent to microsoft and needing heavy workarounds to actually prevent it from occurring. And then being force fed apps with advertisements built into the OS...come on.
Apple simply needs to improve their hardware at their price point and 90% of these threads wouldn't be made. The OS needs some additional features and improvements, but the experience is just so much better than Win10 and 8 for me.
My current build uses Windows 7 and whenever it is officially unsupported I will move on from Windows completely.
The old Control Panel is still there.
> Trying to disable things like Cortana never works right.
Cortana is a settable option. There's no reason to try to disable it.
> Now we have "Settings" too, which is just a mess.
Settings is a clean, well-organized and simple app that does what most people want. It saves ordinary users from having to grapple with the Control Panel.
You'd probably do better if you spent more time in learn mode instead of in angry mode.
And you'd probably do better if you weren't condescending towards other people's experiences and preferences.
You say the Settings app is great, yet it's not great enough to replace the control panel? Redundant settings abound between the two?
There are many reasons to want to disable Cortana, not just turn it off, because turning it off doesn't actually stop it.
As far as search goes, my Windows 7 machine finds most settings too.
The Control Panel remains because a lot of hardcore Windows users would have bitched if Microsoft had removed it.
As usual, Microsoft would have been criticized for whichever decision it made. That's life when you have more than 400 million Windows 10 users.
It was simply intended as good advice.
If you think about it, there are good reasons for moving settings to a simpler, touch-sensitive, sandboxed app.
If you think about it, there are good reasons for moving settings from the Control Panel in stages, and for keeping the mouse-oriented Control Panel program around for people who have grown used to it.
If you've read the book, the anger is a System 1 rant and should be modified by System 2 thinking.
Windows is easily the most visually inconsistent OS out there -- even in comparison to a Linux desktop. That has its reasons, of course: many features in there haven't needed any kind of update in the last 15+ years, so their UI was never updated to conform to updated guidelines, and so on.
Exactly. Honestly it's linkbait nonsense. Sucks you in like a car crash on the side of the road or a hockey fight.
I get the slickest hardware, great battery life, the best touchpad and keyboard, and a hi-resolution screen that doesn't have battery life, flicker, tint, or scaling issues.
Windows is still...Windows. The bash subsystem is half-baked. Windows itself is still the same mess. Docking is still a joke. Performance is terrible, laptop hardware is still 'almost-as-good', display scaling is a joke, and the ecosystem is still fragmented beyond belief.
Linux is still as bad as it ever was. It's a nice place to visit but I wouldn't want to live there. I've been using Linux on laptops since Redhat 7 (the first one from 2000) and a Dell Latitude C-series.
Somehow I shut my laptop off and it booted back up to a failure message; I had to make another bootable USB and go online to find out how to do an fsck. I was using external monitors without issue for a month, and then one day the configuration changed to the point where I couldn't even use the laptop with its built-in display. The sound and volume controls are a joke. I was prevented from installing any packages because of an issue with a package from Google.
All fairly minor, fixable issues, but I forgot all about them because I haven't had to deal with crap like that since I switched to a Mac back in 2010.
A bit of a rant ahead, but I am really tired of people still treating Linux like it's 1997 or expecting it to work like their old OS did and then running into silly problems that none of us, who actually spend our day in Linux, run into.
These problems sound more like inexperience in using Linux to me than anything else. fsck works automatically after a power failure, configuration doesn't override itself and installing a package from the web, while not a great way of doing things, definitely does not prevent you from installing other packages, unless you somehow broke a whole lot of other stuff by deleting dependencies without really understanding what you were doing.
If you want raw power, you can buy a car with 500 horsepower. But to handle all of that power, you'll need traction control and maybe stability control, and anti-lock brakes for when you need to stop it. But all of that limits the amount of power you have available, limits your control. Now, a race car driver would look at these driver-assist tools and say "I'm sick of people expecting race cars to work like their Jetta and then running into problems that us professional race car drivers don't run into". When you complain that the tires spin or you are having a hard time working the clutch and steering wheel without power assist, they might even tell you "these problems sound more like inexperience with driving race cars than anything else".
And they'd be 100% right. But I'm trying to get to the grocery store, not win the Daytona 500. I want a car with power steering and an automatic gearbox and anti-lock brakes and traction control and an airbag. You may say "computers were designed for power and Linux gives you maximum power!" like the other guy commenting did but I don't want that because when it inevitably breaks (because I'm inexperienced), I don't want to be stuck on the side of the road reading manpages and stack exchange posts with no answers and trying to navigate mailing lists. I have a job to do.
(Seriously, did saying "sounds like you're just inexperienced with Linux" sound good in your head? Because it certainly didn't sound good reading it.)
Does it require some adjustment? Sure, just like any new OS. If you're going to try it, I think it's fair to require you to learn its ways. If you don't want to do that, why not stick with your old OS?
Mac requires adjustment too, and has many problems (i.e. WiFi drops, file system, temps, etc.), but I feel the reason people are willing to adjust is that they paid solid money for it and so want to get the most out of their purchase. Because Linux is "free", there's no such incentive, and thus it is judged much more harshly.
Hardware selection is another problem, with Mac you get custom tailored hardware, but with Linux, most people just purchase any random junk PC and expect it to work great, not really fair is it?
> Seriously, did saying "sounds like you're just inexperienced with Linux" sound good in your head? Because it certainly didn't sound good reading it
I am sorry if it sounded harsh, but constantly reading about novice problems I haven't had in a decade also doesn't "sound good reading".
It's not fair. It's very very unfair because even in the PC world, hardware vendors write their own drivers for Windows, or release documentation only under NDA, while Linux is often dependent on reverse engineering by random contributors. And of course the latter can't begin until the hardware is actually available, delaying support.
Unfortunately, that still leaves hardware broken on a large fraction of systems, with only a handful of vendors explicitly designing systems for Linux. It's better than it used to be, but not great.
It's not a novice problem. I know what fsck is and I know you're not supposed to unmount filesystems improperly. But you shouldn't be presented with an error you can't recover from without a rescue USB just because your system lost power.
And the video configuration issues are just weird. Monitors plug and play perfectly but all of a sudden it got so screwed up, I had to poke around quite a bit to get things back to normal. Frustrating the heck out of me because I'm used to something that just worked.
It's a "you're holding it wrong" moment. It's a completely ridiculous argument.
As an aside, I can't speak for other Linux users, but my goal is most certainly not to "drive Linux user adoption". If someone is interested in learning to use Linux or wants advice about which distro to use, I would happily help them, but I actively try not to convert people who haven't expressed that interest. The reason I chimed in here is that I disagreed with the sentiment that Linux is a terrible choice. It's certainly not the right choice for most people, but there's a difference between being niche and being bad.
The answer is always "Because I'm learning cool new shit and enjoying myself".
You know when the last time I even thought about my OS was? Probably 8 yrs ago. I'd much rather fiddle with the stuff I'm building than with the stuff I just need to work in order for me to build what I'm building.
So I have no aversion to learning, but I want to focus my time learning the stuff I want/need to learn. And if there's an OS that runs well enough I don't need to think about it then that's fantastic.
The argument is that if you're going to try Linux, it's only fair to give it a fair shot at adjustment, rather than complaining that it doesn't work like your old OS, because guess what: it's not supposed to.
To be honest, I hadn't really given much thought to my OS for even longer before I decided to go looking myself, I'm glad I picked that fight though. Granted, 8 years ago I was 15, digging in to the guts of an OS for the first time is very different from focusing your time to build what needs building :)
That's the problem right here. There's a vanishingly small percentage of people who can afford a laptop who are also willing and able to spend the time to gain "experience in using Linux". Sure it's much easier than it used to be, but it used to be a completely unmitigated disaster (I remember things like having to teach myself enough C to fix a wireless driver that didn't want to compile under a newer gcc version) so that's not saying much.
This is so far removed from my world as a daily Linux user. I put Fedora 25 onto a Dell XPS 13 Developer Edition and everything worked perfectly out of the box. The OS looks gorgeous, and I now have the tooling so perfected to my workflow that I would be absolutely gutted if I had to change OS. In fact, I would likely turn down a job if it meant having to use Windows as a daily driver. macOS I could likely get on with OK, but I would miss the customizations I have on my Linux machine.
If someone gave me a MacBook tomorrow, I'd install Fedora on it, no contest. At least I wouldn't have to get lost in "homebrew ports" and such, or some "Powershell" RedmondUbuntu2017 concoction ;))
I mean, Windows doesn't get a pass here - you still can't scale HyperV, or anything involving mmc.exe, but Mac's HiDPI support is flawless.
Now that is true. I can readily believe that Apple invests more efforts than the others in somehow ensuring even old no-longer-updated apps somehow render properly. It really is weird how the OS' font-scale factor seems to be ignored by so many old programs (or their GUI toolkits)!
I have a standing offer to buy a developer a laptop with a high DPI screen  if they can make High DPI support suck less on any distribution.
At least Fedora auto detected the screen and enabled 2x scaling. GRUB is still unreadable. WINE apps are unreadable. QT less than 5.6 is unreadable. GTK is unreadable.
1: http://en.chuwi.com/product/items/Chuwi-HiBook-pro.html although there's a newer one I can't find
You have a 13" 1080p screen correct? Then all text will be absurdly small because Fedora only supports scaling at 1x and 2x but not at 1.5x, which is what that screen size / resolution combination requires for a comfortable DPI. Windows does support 1.5x scaling.
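The arithmetic behind that claim is easy to check. Assuming a 13.3" panel (the comment only says 13", so the exact diagonal is a guess), the pixel density lands awkwardly between the 1x and 2x factors Fedora offered at the time:

```python
import math

w_px, h_px, diag_in = 1920, 1080, 13.3
dpi = math.hypot(w_px, h_px) / diag_in
assert 160 < dpi < 170   # ~166 DPI, far above the traditional 96 DPI baseline

# Logical desktop size at each integer factor:
assert (w_px // 1, h_px // 1) == (1920, 1080)   # 1x: UI elements tiny
assert (w_px // 2, h_px // 2) == (960, 540)     # 2x: huge UI, almost no workspace
# 1.5x lands at a comfortable 1280x720 logical size, hence the complaint:
assert (int(w_px / 1.5), int(h_px / 1.5)) == (1280, 720)
```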
> I get the slickest hardware, great battery life, the best touchpad and keyboard, and a hi-resolution screen that doesn't have battery life, flicker, tint, or scaling issues.
None of these things are universal among Windows machines. I can't remember which model, but a Verge review recently said that a Windows laptop had surpassed the best Mac (in terms of hardware quality).
> display scaling is a joke
It's fine. Everything works and looks good. Sometimes you use legacy apps (which is a feature of Windows -- legacy apps still work) that don't handle scaling well, but the OS helps resolve those issues.
> Performance is terrible
Reviewers, people on this thread, and I have all reported Windows being much snappier than OS X. I actually switched to Windows for performance alone. I hate not having a bash terminal, but it's better than using buggy, slow OS X.
Doubtful. Apple does crazy stuff like custom ACPI for Thunderbolt controllers so they can be very efficient. Or having each of the fins on a fan be at a different angle so they have a different pitch and fan noise is distributed across the sound spectrum (reducing total perceived noise). They were also the first (not sure if the only one) with terraced batteries.
> It's fine. Everything works and looks good. Sometimes you use legacy apps (which is a feature of Windows -- legacy apps still work) that don't handle scaling well, but the OS helps resolve those issues.
It's universally known that display scaling is the forte of macOS, and has been for a long time.
> Reviewers, people on this thread, and I have all reported Windows being much snappier than OS X. I actually switched to Windows for performance alone. I hate not having a bash terminal, but it's better than using buggy, slow OS X.
Final Cut Pro works fine (read: smooth) on the anemic Macbook 2016. Try running Premiere on a Netbook and see how that goes. This is the power of macOS/Apple: they only have a handful of devices in each category, so they can hand-optimize every single one of them. So, wrong.
I have to interject here. Apple was the first one to have actually working and non-horrible implementation.
But the way they went about it has a big performance cost: they can only render at 2x. So if you use the native mode, the one that is "as big" as 1280x800 on the 2560x1600 13" rMBP, they render it at 1280x2 by 800x2 (2560x1600) and everything is fine.
But if you use a scaled resolution, say one "as big" as 1440x900 on the 13" rMBP, they render it at 1440x2 by 900x2 (2880x1800) and rescale it down to 2560x1600, and this gets laggy at bigger resolutions, on older devices, or on devices without a dedicated GPU.
Meanwhile, other operating systems can do non-2x scaling and have no problem with resolutions like that.
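A quick back-of-the-envelope for that cost, using the 13" rMBP numbers (2560x1600 panel; `backing_pixels` is just an invented helper name):

```python
# macOS renders everything at an integer factor of the logical size,
# then (for non-native modes) resamples the buffer down to the panel.
def backing_pixels(logical_w, logical_h, factor=2):
    return (logical_w * factor) * (logical_h * factor)

native = backing_pixels(1280, 800)   # 2560x1600 buffer, matches the panel 1:1
scaled = backing_pixels(1440, 900)   # 2880x1800 buffer, downscaled to 2560x1600
assert native == 2560 * 1600
assert scaled == 2880 * 1800
# The scaled mode pushes ~27% more pixels per frame, plus a resample pass:
assert round(scaled / native, 2) == 1.27
```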
(Personally I don't care what Windows can do - I'm not using that if I can help it. I just wanted to note that there's this issue with how Apple decided to do scaling.)
I have the original 2012 rMBP, which by all accounts has far too wimpy a GPU for retina, and I've been running it in the max scaled resolution (the 1920x1200@2x mode) since the day I bought it.
On first few versions of OS X after it was released, yes, the graphics often got very choppy (especially stuff like Expose), but Apple has really optimized the hell out of those drivers, and by now on Sierra, I never run into any noticeable 2D graphic lag. I'm sure it's terrible for 3D though.
It might be different but my 2013 13" is definitely laggy on max resolution.
Interesting that later hardware was actually laggier. I even keep my machine stuck on the Intel 4000 graphics 99% of the time.
Honestly, it depends on what you're looking at.
In graphics power, for example, ALL Macs suck. Even when you spend $5,000 on a workstation you barely get something that can hold its own against top-of-the-line PC hardware.
MBP: 2977 - http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+Pro+450...
iMac: 5714 - http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+R9+M395...
Mac Pro: 5178 - http://www.videocardbenchmark.net/gpu.php?gpu=FirePro+W9000&... (in D700 variant)
Mind, the last one is a high-accuracy CAD graphics card, but if you're looking for raw graphics performance for 3D development, for example, all of those are easily outperformed by a piece of kit costing $190:
Any PC: 8508 - http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+10...
And thanks to Apple being Apple, upgrading is not an option.
In general I feel that no one (for now) can touch Apple in the laptop market, although half of the reason is macOS working so damn well on MacBooks. But other companies have other markets on lockdown.. Windows is best for the desktop. Linux (or BSD) for servers. Android for phones.
Not trying to take away from the innovation of that idea, but IMO ventilation is one of the few areas where I think Macbooks are strictly worse than the PC laptops I've used. The Macbook I was given for work (13" Pro, 2015) constantly overheats, as there isn't really any vent for the fan to push the air out of. I'm sure it's a conscious design decision not to have a vent (either because it "looks bad" or because blowing out hot air is considered worse than overheating the laptop itself), but I'd personally have trouble counting the way that my Macbook ventilates as a net win in "hardware quality".
That being said, it's still awful ventilation. I tried playing video games on it through Windows and Apple's drivers left the laptop more hot to the touch than I would like. I ended up using a fan curve controller application to fix that.
Yeah, my girlfriend's MacBook would beg to differ, as it stutters on a couple of heavy websites in Safari tabs.
even 4k is workable, although not quite as smooth.
Desktop Safari is also notorious for not being as smooth and fast as other browser because its primarily built for (battery) efficiency, not performance.
It's odd that this is a discussion point. I have a 17" MBP circa 2011 or so. I have both OSes installed. They are both really snappy. I am not quite sure what all the fuss is about.
I have no problems with scaling with a combination of 4k, 1080p and 2736x1824 monitors. (It's working great, though it's true it didn't work like that a year ago)
With a surface book you get hardware that's even better quality than any Apple laptop, and actually has a detachable screen I can write on (this isn't a gimmick, I'm using it daily as a software developer).
No idea what you mean with saying "Windows is still a mess", maybe give an argument for that?
I don't use it ever except when I need to help friends/family though.
The cost of Mac or Windows is cumulatively huge for me: on both I regularly sit wondering when some incidental operation will end.
Even though some things are annoying or in bad shape, all in all macOS and its ecosystem make me more productive than Windows or Linux.
As a developer, I also notice that a lot of tools don't work well on Windows. Even things like Docker.
Not really. It may not be suitable for all use-cases, but it has certainly much improved in the last years.
To give a concrete example, I'm running ArchLinux + bspwm on an Asus Zenbook UX305UA. Thanks to ArchLinux I don't have all the bloat that I don't need, putting me at 10 hours of battery life. The tiling WM allows me to be extremely efficient with desktop and window management. This is why I find it really obnoxious to use both Windows and OS X when I have to.
If you don't need proprietary software that doesn't run on Linux I seriously suggest you give it a try.
I agree, but why do this? Has it ever led to anything in your experience that was not a pointless and tiresome flame war?
The only problem I've had with this desktop has been the occasional hissyfit of compiling the Nvidia driver against a modern kernel. That's a story for another post, but the Linux systems I've built and used for my day-to-day work are and have been rock solid, without me sitting there struggling to make things work.
On the laptops, I've not had an issue in like 5 or 6 years, apart from a single Wifi driver ... which also didn't work in windows. So I added a simple Panda nub and off to the races I went.
Really, from my own personal experience, the OSX interface is a pain in terms of how it maps against keyboards, plus the lack of a sloppy-focus equivalent in OSX.
From a programming scenario, the toolchain is maddening (I am looking at you Xcode ... every (&(&^&^&* update, you have to reapply the click-this-license-button) on OSX. I use homebrew (having left macports in the dust), as I want stuff to work. And then along comes an update which just breaks everything.
And then the SMB stack in OSX. Yeah, the SMB stack. Using a lowest common denominator for our office meant using crappy file system technology, but thanks to either licensing issues or a NIH mentality, OSX has an almost completely neutered SMB implementation. It is awful. Painfully slow even over fast connections.
Then there is the desktop experience. Many of the customers of my previous employer are creative types who used to use Macs for their video bits. Most are migrating to Linux, some back to Windows. They are done fighting the battles (don't shoot the messenger, this is what they tell me).
I know the article was clickbait, and had a number of issues. But blasting Linux, which is a perfectly good, and very functional OS for many by rehashing issues from 10 years ago ... might not be the best approach for advocating your view.
FWIW: I run windows. In a window. Having had the joyful experience of self-destructing windows installations on wife/offspring's laptops, I've sworn to never again allow a windows OS to run installed on bare metal. I've taken wife/offspring's old laptops, virtualized them, and they can run their old stuff in a VM. They use Macs now, and I won't try to force them to change, because they just work for their use case.
Linux is a great OS for desktops (ignore the haters). Linux Mint is definitely one of the best user experiences I've had on a system so far, for day to day use.
I can't believe people keep reading these "I left Apple and here's why you should care" articles by front end devs with no real idea of what goes into OS dev and where macOS has come from since the days of OS8.
Give me a break, surely if you want to use Windows you don't need to write 2000 words in the form of some needlessly pronoun-heavy (because your opinion is really important) diatribe telling us all about how detached Apple is from reality and how Windows is suddenly the 'place to be' because it has bash support. Tell me again why I should drop a fully featured, mature *nix shell for one bolted on top of Windows.
Do people get paid to write articles specifically like this, or are these devs so full of their own self-importance that they feel it's their duty to inform us of their opinions on the state of macOS and why it's suddenly so much worse than ever before, when in reality macOS has never been more stable or developer friendly?
It's honestly so predictable I could have guessed 90% of the content of this article just from the sensationalist headline alone.
macOS has never been more stable or developer friendly
The OS is fine, although largely in maintenance mode. The hardware value is poor and has gotten significantly worse in the last few years. I'm typing this on a retina iMac that I'm very happy with, but if I had to replace it there's no Mac that I would consider now that Windows has decent high-DPI support.
YMMV, but this is why I still use a Mac--because even Windows 10's high-DPI support is messy and not great. Applications act inconsistently and Windows is straight-up bad at handling different DPI on different devices (and when you've got two 27" 2560x1440 panels and the laptop's own panel, this is a pretty significant problem).
Logitech headphone software on 5k screen vs Chrome -> https://drive.google.com/file/d/0B4joq2oW_zHBLXJLa2d1MEhlbTJ...
Sorry, rant over, had to get it out.
But I do use both Mac and Windows regularly and they both have their quirks, but personally I find Mac to be more polished, but Windows is catching up and Apple has been slacking recently.
On OS X, I've literally never thought about DPI.
What trouble did you have? It's pretty great for me.
* I have to turn off gatekeeper to run unsigned apps.
* I can't write into /bin or /usr on my own machine without flipping some magic option
* I can't run gdb without some complicated signing dance I have to do every time I update it.
* I can't run dtrace without rebooting and switching some secret flag off. I have to tell other people to turn the same thing off so they can dtrace applications.
That's just straight off the top of my head. The dtrace and gdb things are particularly annoying, as it makes life harder for me to get other users to do simple debugging tasks, and there is no simple workaround, just complicated instructions.
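Roughly, the incantations involved look like this (macOS-only; the certificate name is whatever you created in Keychain Access, so treat this as an illustrative sketch of my setup rather than gospel):

```shell
# gdb: after creating a self-signed code-signing certificate in Keychain
# Access (here called "gdb-cert"), re-sign the binary after every update:
codesign --force --sign gdb-cert "$(which gdb)"

# dtrace: System Integrity Protection blocks it; from Recovery Mode you
# can relax SIP for dtrace only:
csrutil enable --without dtrace

# ...and /bin, /usr etc. are writable only with SIP fully off:
csrutil disable
```

Telling a non-expert user to reboot into Recovery Mode just to run a debugger is exactly the kind of friction I mean.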
Right-click on the unsigned app and select "Open". You only need to do this the first time you run the app.
Regarding the gdb signing, it's painful. Even more so is the privilege escalation--it's impossible to debug over ssh with gdb or lldb since the GUI prompt is on a different machine. Not having the prompt in the terminal where the debugger is being run is asinine. I had to switch to debugging on FreeBSD to avoid the pain of all this; it's madness.
even OSX 10.3, which was when OS X started to get really good, was released in late 2003, almost 14 YEARS AGO.
I know people who have been on Macs for 20 years, since OS8 or 9 days, who have bought non-Mac hardware and won't get another MacBook - that is significant and Apple should be paying attention to that.
Apple is not gated by their software costs - they clearly make a lot of money from their desktop, mini, and laptop sales.
So why not produce something truly excellent instead of merely adequate?
> where macOS has come from since the days of OS8.
I care a lot about a consistent Finder UI, about the Clipboard and Dragboard. Those became a little less logical when OS X was introduced. As you know, many other features became much more powerful, so overall it was easily forgivable.
Unfortunately, since Tim has been in charge, the lack of logic and consistency has become a real problem. Nobody at Apple seems to care about their own "Human Interface Guidelines" anymore.
It's like Apple compares Mac with iOS, and decides that mystery meat navigation, and inconsistent UI, are no big deal. As long as it's no worse on the Mac than on iOS, Apple is fine with things.
The author's primary issue seems to be the fact that Apple hasn't been packing the punch that is necessary for him, and that especially with his wanting to get started with VR the entire platform is a no-go for him atm. No arguments there.
But curiously, he spends the bulk of the article either explaining or talking about the things that he had to compensate for or now has to deal with as part of his switch: the fact that apps aren't as polished (and neither is the OS itself), issues with installing drivers, Windows Bash that doesn't work quite 100%, etc., etc.
I switched several years ago as I was finishing college, and like the author am extremely disappointed in the fact that my MBP can't really do it for me when it comes to VR. Yet, as per his article, it seems that in just about every other meaningful way the Windows experience for me (and him) is worse.
More broadly speaking, just about every article I have read touting how great it was to switch "back" to Windows seems to follow this general trend; it seems as though everyone who switches is glad to be using machines with more juice behind them, yet it is clear the computing experience itself is worse. I can't help but feel that had there just been better graphics/ram options on the current mac lineup, that pretty much would kill any of the justifications I see for the switch.
If Apple just stopped doing those things, I'd consider running OSX, but they won't so it seems like an entirely pointless exercise.
It's the same iOS vs Android debate, Apple charge a premium but promise things will be significantly better, if they are only marginally better then it becomes harder to justify their premium and lock in.
I think the recent spate of articles are more about the fact that the gap has been narrowing significantly, partly due to Apple's neglect, partly due to Microsoft doing better, so the balance is changing for people on the margins.
And the progress on Windows's side is what exactly?
For the end users, as opposed to business customers, practically nothing changes. I swear, with every freaking version of Windows, nearly everything about the user experience is the same as before. It looks nicer at first glance, but once you open some apps, particularly the built-in ones like msconfig, disk management and such, you find yourself with UI from ten years ago.
Just look at the file copy window. What a joke. They made it look a bit nicer, they added a fancy animation while copying, but really, it didn't change at all. It still sucks majorly at giving you a proper estimate of the time it will take to copy a file.
The single biggest change for me in Windows 7 was the ability to use Win + <number> to switch quickly between apps. It's incredibly useful, and that's pretty much the ONLY real change in my day-to-day experience of Windows compared to earlier versions.
Safety, performance, stability, easier UI (for the average user).
Yeah, none of these things feature well in TV advertisements, but compared to its predecessors Windows 10 actually does excel in all of those. Heck, I installed 10 on my parents' cheapo PC and with the same programs and such installed it actually is more responsive.
> It still sucks majorly at giving you a proper estimate of the time it will take to copy a file.
That's not a software problem, that's a hardware problem. Particularly on SSDs you have to deal with deletions being surprisingly slow, and in the copying process itself you often have a very fast phase at the beginning when it's just slurping the file into ram; and then it drops off when it runs into the write limit of the target medium and/or runs out of ram to use cache. If you're copying to the same medium you get an even stronger drop due to read/write happening on the same thing.
Predicting this is HARD.
The only way to get reliable predictions out of the copy dialog would be to disable memory caching while copying, and uh, you kinda don't want that. It would just be predictably slow.
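To illustrate (with made-up numbers), here's a toy sketch of why the dialog's naive instantaneous estimate whipsaws: the first few samples are the cache-slurping burst, the rest are the medium's sustained write speed. An exponentially weighted moving average at least drifts toward the sustained rate instead of tracking the burst:

```shell
# Simulated per-second throughput samples (MB/s): a fast page-cache phase,
# then the target medium's real sustained write speed.
samples="900 850 800 120 110 100 105 95"

# Naive estimator = the raw sample; smoothed estimator = EWMA with alpha=0.2.
echo "$samples" | tr ' ' '\n' | awk '
  BEGIN { alpha = 0.2 }
  {
    ewma = (NR == 1) ? $1 : alpha * $1 + (1 - alpha) * ewma
    printf "sample=%4d MB/s  ewma=%7.1f MB/s\n", $1, ewma
  }'
```

Even the smoothed estimate is still far above the sustained ~100 MB/s after eight samples, which is exactly why the dialog's ETA keeps lurching around on real copies.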
They need Snow Leopard 2, and someone who cares about PC hardware.
For what it's worth, I think this will be addressed by the switch to APFS.
Another fun problem is copying a bunch of files out of a directory that Finder has currently opened and watch Finder go catatonic. I get the feeling someone did a lot of programming in Finder without proper testing.
As a non-game developer, what practical advantages does Windows hardware and software provide over Mac?
The writing doesn't make it clear to me. Sure, there are some interesting facts noted, but there's no connection drawn or relationship established between the facts and the impact on the author's ability to work, or the work product itself.
After that, it depends on what you are doing. I develop using a PC, macbook (for iOS builds usually) and a Linux machine (running mint).
Linux is best for Node/JS-heavy development, IMO, largely because of the long file paths in Nodeland and the general Linux-first documentation and help. Recently I have found it's also pretty good for Dotnet Core development. The OS is nice as well in that, for that sort of bare-metal coding, it kind of gets out of the way.
Windows, either with VS or VS Code, is for things that require a lot of power or are huge (usually related): Docker stuff is easier, solutions with hundreds of projects, etc. When I want to throw code against a tiny god, the PC is my go to.
Presumably the author has already purchased the Mac, in which case, s/he's already obtaining the benefit from that purchase. To rationally justify the replacement cost (which is not just the cost of the hardware itself, but the opportunity cost of learning the new platform, purchasing new software, etc.), the benefits of switching have to be greater.
It's those benefits--not necessarily of the platform itself, but of switching--that aren't clear to me.
I think you're discounting the performance aspect as well - if workload X is too slow on the fastest mac, it may well still run on a windows machine since they can be specced higher. Cost isn't a factor for everyone - sometimes you literally just need the fastest possible machine, in which case Windows can be it.
Am I missing something or does this seem like one thing? Or is there use case for VR that isn't gaming? And what is 'upcoming 3D' exactly?
>if workload X is too slow on the fastest mac, it may well still run on a windows machine
Can't argue with that, but what are such workloads if you are not game developer? Wouldn't cluster of Linux servers always be a better choice? Unless of course you are die hard Windows fan.
Training and design.
I do machine learning development. It is very helpful to have a CUDA-compliant GPU (read: NVIDIA) on your local machine so you can develop and run things quickly. Yes, you can use AWS, but that is costly, you need to push your data into the cloud, and you definitely cannot do it w/ sensitive data such as health records/medical images with localization clauses. None of the major machine learning platforms work well on non-CUDA compliant GPUs, most are barely functional outside NVIDIA cards.
I waited over a year for the new MBP and was disappointed to see no NVIDIA GPU on them. I love the MBP; it is an absolute pleasure to use. But the new line had barely any horsepower improvements. It was almost a step back from the 2014 MBP I already owned. Further, the cost was obscene for the value/improvements I was getting.
I'm not happy about it, but I purchased an Alienware w/ a hefty NVIDIA card and 32GB RAM. It is a beast, but it meets my work needs. I do miss the agility of a MBP (e.g., carrying it in one hand to a cafe or onto a plane), but ultimately horsepower is needed more.
My first Mac was in 1992 (a IIsi). Since then every computer I've bought (servers aside) has been a Mac.
Last week my MBP died. The battery overheated and 'burst', taking the case and trackpad with it. I'll still have a Mac desktop for development and cartography, but I need a laptop, and I simply can't justify £1250 for a MacBook.
Instead, for the first time in 25 years, I'm buying something else: a Chromebook, on which I'll install Ubuntu. It's pretty much the same weight (1.19kg vs 0.91kg) and size; it does word processing, web browsing, PDF viewing, and Ruby hacking just fine; and it costs £1,000 less. Sure, the performance is much, much worse, but for my uses that's fine.
* Generally nicer hardware even compared to things at the same price point. Don't forget that hardware is more than RAM and CPU specs. There are some notable exceptions with pretty nice hardware, like the surface pro tablets.
* More friendly UX all around. This is somewhat subjective, of course, but if you'll recall a while back there were some big corporate IT departments who explained that IT costs were much lower for Mac users. They chalked this up to a combination of increased user friendliness, lower virus risk, and increased reliability.
The only activity in my life where I actually choose to use Windows is for FPS games. However, that advantage is disappearing; over half my steam games support mac now.
So in my book the Mac is still best but I also agree that the direction Apple is going doesn't look too promising.
I remember spending time trying to figure out how to disable the animation that plays when switching OSX virtual desktops - only to find out that you literally can't without code injection or a third party app - rendering the virtual desktops on OSX entirely useless.
The fact that this a) is the default behavior and b) is not configurable, is just another in a long line of awful OSX UX and it only gets worse with every update. El Capitan changed terminal font smoothing to something woeful which was also a struggle to revert.
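For anyone fighting the same thing, the knob I eventually found was the old AppleFontSmoothing global default (macOS-only, and Apple has shuffled this behavior between releases, so consider it a sketch rather than a guarantee):

```shell
# heavier font smoothing, closer to the pre-El Capitan look (0 = off, 1-3 = light..strong)
defaults write -g AppleFontSmoothing -int 2

# restart the app (or log out) for it to take effect; delete to revert:
defaults delete -g AppleFontSmoothing
```

That a per-user preference this basic requires a hidden `defaults` key rather than a checkbox is the UX complaint in a nutshell.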
Honestly the only way I have found development possible on native OSX and Windows is by stretching a terminal(/putty) to the full size of the desktop and handling all window management with tmux.
Pretty minor issue though, I still love both. I'm just not sure if I'll love the new versions everyone seems to be whining about :)
To give more detail on my use case: I'm not a developer so things like terminal or unix don't really matter to me. And I don't think I've had a virus since the 90s.
That's really what it's all about though. Many developers prefer a unix environment. Windows is often neglected in open source, so getting up and running with a working dev environment can be a challenge. So, for many, Windows just isn't really an option.
That leaves macOS as the only option that's a first class citizen in all regards -- hardware support from manufacturer, support in open source projects, support for proprietary applications (I.e MS Office), etc.
For non-programmers though, there really isn't a meaningful difference. Take your pick.
If you are just starting out or you aren't in the Apple's ecosystem (e.g. you don't have Idevices) then Windows might be the place for you. I'd still go with Linux personally, but that's just me. Or if you just need more power, but I'm guessing most people here don't actually NEED that much power
- If I have sublimetext maximized on other display, my mouse cursor gets stuck at default pointer, no longer changing on context.
- If I have the same sublimetext maximized on another virtual screen, my mouse cursor gets stuck in text select.
- Option to reorder icons in dock via drag and drop goes away randomly.
- When I plug display to my MBPro 2016, menubar has white square where tray goes.
- It gets stuck in boot -> kernel panic -> restart -> boot loop when I am starting it with display plugged in via their HDMI dongle.
- On random, finder stops showing changes in filesystem. I've had this happen to me in Mavericks on old laptop, it keeps happening in Sierra on new one.
Quarrel over operating systems is pretty pointless, if it doesn't run natively, slap a hypervisor on it and run whatever you like in a VM; for me that'd be a Linux or BSD, I'm not really opinionated there (although certain things run faster in Linux due to experience).
(Obviously I don't grok the laptop lifestyle, I don't understand the "work everywhere" mentality, in fact, I don't want it at all. It's called work place for a reason - at least for me. I do have a laptop (or half a dozen, who has time to keep record?), the mainly used one is a Getac unit with a carrying handle that weights ~3 kilos or so; at least it's silent, has a superbright screen and a battery that's larger than most ultrabooks by volume ;)
You obviously don't understand what those 2 letters stand for (hint: it starts with Personal and ends with Computer).
* talks about developing VR apps, shows screenshot of editing html
* complains about lack of innovation in macOS, has zero ideas of things he lacks
* praises slack for being a great windows app
* is impressed by bash for windows
* never considered using Linux
check, check & check!
The truth is that, for the rest of the world, "Apple platform" means only iOS, only the iPhone, nothing more.
In Asia, Africa, Latin America and some parts of Europe, Macs and OSX are a rare species; most people spend years without even seeing a Mac computer "in the wild". They are used only for graphics editing and iPhone development.
These discussions only show how exotic HN is, a Silicon Valley bubble.
But, it might be better to say HN is a tech bubble, because everywhere in the world that I've been people who work on web tech use macs, for most of us the job would be almost impossible without a *nix OS.
And this post was written (somewhat regretfully) from a Mac in Aotearoa/New Zealand, where non mac laptops are somewhat rare in the tech community (and are then usually Linux rather than Windows).
Interestingly, I could not find Germany on http://gs.statcounter.com/os-market-share/ but here is Switzerland, which has one of the highest OS X penetrations in the world (the highest?)
I don't have the figures for laptops only, but I don't believe for one second that OS X is above Windows in any country (of a non-trivial size).
Maybe in certain extremely local areas you see a higher concentration of Mac laptops. Even then, I challenge you to count carefully and come back to us if you still find that Mac laptops are more common than laptops running Windows. Of course there are some factors that can bias things in favor of Mac laptops, depending on where you look. However, while that can be interesting from a socio-economic POV, keep in mind that Windows market share is, on average, consistently far higher than OS X's, on desktop and laptop computers alike.
UK Birmingham: 0700 trains to London (corporate types), Thinkpads, Dells and a few HP laptops all with Excel/Outlook in operation. A few hours later (students and younger people), it changes to tablets and Macs. Mostly watching films or doing Web stuff. And lots of phones.
I'd imagine a large percentage of Windows licenses in any country are for desktops sitting in offices (1000+ in my fairly small organisation alone).
There are certain industries that are very heavy Mac users, media production in particular. They crop up in the weirdest of ads because the production staff use them, for example. But for the rest of the nation it is Windows all the way.
The only people I know personally that own a Mac are musicians or people in the ads/marketing/graphics business. And I think it's the latter, as much as the *nix internals, that has made the Mac a web dev fixture.
Students don't tend to have money to spare for a Mac, so they buy a cheaper Windows laptop instead.
A Macbook is 15% of a year's money, which is better spent on rent, food and beer.
Microsoft has chosen to focus on enterprise. This is probably the right move for their culture and the future of the company, but the remainder of their consumer products seem to be turning into an ad-supported model heavily focused on legacy support.
Windows may have a new coat of paint, but the underlying OS just gets more buried under more and more simplified UIs, which ironically make it harder to use.
This is really desktop Linux's time to shine, but its usability is still sporadic.
I can't believe how much worse Windows 10 is than 7. And all the good features of 10 could easily be integrated with the 7 UI.
It's very frustrating that there isn't another player in this space. I don't mind fooling around with linux but sometimes you just want to buy a laptop and go without all the extra crap and most of the consumer space won't go through that anyway.
This has led to an immense number of mistakes, bugs, and unhandled cases seeping all throughout Windows 10.
At its core, W10 is still a great OS, but the lack of proper testing (and its massive, scary levels of privacy invasion) is something to be aware of.
To name an example, most of the people who prefer OS X complain that Windows lacks aesthetics (i.e. font rendering, high-DPI support, etc.).
On the opposite side, most of the people who prefer Windows complain that OS X lacks support (i.e. a huge library of software, backwards compatibility, etc.).
Unfortunately, BC and aesthetics are mutually exclusive, taking support for High DPI under Windows as an example:
Only the modern stacks like WPF or UWP are DPI-aware by default, but the amount of software built with them is relatively small. Most of the software out in the wild is built with stacks like GDI/MFC or WinForms, and it will always look "ugly" in high-DPI configurations.
Apple is the kind of company which would demand developers to update their software, Microsoft can't afford to alienate its developers.
I've been very impressed with Mac backwards compatibility. Old Apple hardware (going back nearly a decade) still works great with latest macOS Sierra and I have decade+ old utilities and custom scripts working as fast or faster than ever before.
There's some of my custom scripts I've had to tweak over time due to Apple's increasingly locked-down security measures within the OS, but that's very much worth the small amount of time I've spent tweaking them and I appreciate the better, overall security.
It's rarely the case that there's functionality in Windows that can't be found within the many hundreds of thousands of Mac apps available. There's more Mac apps than one could ever use in a lifetime. As a matter of fact, the problem I've run into with Windows is the lack of quality apps that can match the superior third party Mac apps or built-in macOS functionality in many cases. Of course, there's occasions the opposite is true and I run Crossover and Parallels in Coherence mode for those.
There's also a lot of built-in, time-saving functionalities within the macOS that third party apps in Windows don't replicate well or at all. For example, spring-loaded folders or a solid, fast alternative to Mission Control in Windows that works as seamlessly as it does in macOS.
I use Windows 10 and macOS in near daily production and consulting/support environments. Windows 10 has its advantages over the macOS, but Task View isn't one of them.
Mission Control on Mac in a production environment blows away Task View - which was only finally copied by MS from the macOS after already being in use for well over a decade for Apple users. Granted, there was some Windows third party apps that attempted to clone Exposé (former name of Mac's Mission Control), but they were terribly slow, clunky, crashy and buggy on Windows. That's why I was really happy to see Windows 10 finally copy Mission Control and incorporate it natively, but I was sorely disappointed after using it.
For example, I can use corner gestures with Mission Control that've been removed from Windows 10. Microsoft had corner gestures in Windows 8, but removed the option entirely in Win10.
Even after I brought corner gestures back to trigger Win10 Task View with a custom script that works via a third party app (the great AutoHotkey), it's still incredibly limited compared to Mission Control. The AutoHotkey app doesn't even trigger itself right away consistently like the built-in macOS corner gestures always instantly and reliably does. I've wasted time with multiple third party triggers and none work as well as the native, built-in macOS corner gestures.
On top of that, with the macOS (and I've been able to do this for about a decade with Exposé and now Mission Control) - I can drag any file to my corner gesture, then drop the file directly into a preferred Mission Control thumbnail window.
Try that in Windows 10 Task View. There's no integration with the file system in Win10 Task View at all and that severely cripples its functionality. There's no third party app that fills the void yet for this either. Granted, I often use launchers on both Mac & Windows to move files, but when there's a need to have a more GUI, visual approach with dragging and dropping, Win10 fails badly because it also inexplicably doesn't have spring-loaded folders in Win10 and no reliable third party app copies that functionality properly either.
I do enjoy the Task Bar thumbnails in Windows that the macOS lacks, but I just use a third party app called HyperDock that not only replicates the functionality, but much improves upon it - and HyperDock has never had any speed or stability issues against the macOS for me like many third party Windows apps tend to have.
That said, there's definitely various advantages to running Windows over Mac and that's why I work in a mixed environment at home and in my work tasks.
What areas do you find W10 to be lacking in?
Poke around in the settings for a while and you will find remnants from the NT days.
The dark theme is another example, it works for a handful of their own apps, not even half of them.
Half baked and unpolished.
I'm a daily W10, macOS sierra and Fedora Linux user.
I develop on all the OSes and play games mostly on W10.
This is, IMHO, a backlash against Microsoft's long update cycle. The yearly, free updates Apple pushes to macOS allow for an easier deprecate-then-remove approach that gently transitions users from the old way to the new. It's hard to do the same when people have gotten used to an OS for many, many years. Maybe W10 with its "rolling" approach will succeed, btw.
Happily, though, the inconsistencies don't reach as far down as the kernel level. W10 looks weird and sometimes acts strangely when you try to use the new stuff, but its bones are stable, which is all I really need.
Windows almost never crashes for me, but I haven't had a system crash on my Macs in close to a decade even after updating the OS numerous times without a clean install. Granted, I know to use combo updaters for Mac instead of the streaming updates, so that helps me quite a bit along with making sure I update third party apps first.
On the other hand, Windows 10 updates have caused all kinds of various issues and it's documented to be widespread. Killing many webcams is one major issue that comes to mind.
Then again, some people have had wifi issues with Mac updates, so no OS is perfect, that's for sure. However, to suggest that the macOS system core with Sierra isn't as strong as Win10 doesn't seem realistic to me.
macOS Sierra has been as rock solid as Win10, if not more in some cases.
>given that Apple has total control over their ecosystem, they are frighteningly bad at providing it.
That's a myth. I have Android phones integrated with Macs just fine, for example. I use the free MightyText to send & receive texts and that's just one of several good options. Google Keep app to sync notes across Mac & Android and the list goes on and on.
If any professional power user wants to skip Gatekeeper on a Mac and install apps without any hoops (a simple right-click, basically), Apple made it as simple as this in Terminal so there's no hoop at all:
sudo spctl --master-disable
I use both Windows and Macs daily. I run anything and everything on my Mac I want and have done so for many years. I'm not trapped in some ecosystem at all on my Mac. If anything, I feel more trapped (privacy-wise) on Windows 10 than Mac and I despise how Microsoft forces updates on me that have crippled my workflow on occasion whereas Mac just puts up a daily reminder until you do it.
Now, the iOS devices are another story, but that's a huge can of worms when we're talking about phones and the need for security, etc. -- I'm not going to get into that here since we're talking about Mac vs. Windows -- not iOS vs. Android, etc. (my preference is Android for most of my use cases and iOS for some others).
They are supposed to be buggy as hell!
I have a much better machine now (although there are nice Linux certified laptops out there) and a much nicer hardware integration with the OS (especially in terms of battery life).
But as a desktop Gnome is just as good if not better (that's a matter of taste after all), and as a Unix development machine it's sub-par compared to Linux.
I'm pretty happy with it, but I can see myself going back to Linux. But Windows - not anytime soon.
But saying that Windows 10 is so much better than Sierra doesn't hold water for me. It's laden with spyware that is difficult to turn off, and even when you manage to, the next update turns it back on and changes what you have to do to disable it. Windows 10 is the most user-abusing operating system in history. It's just that it does it in ways that are less obvious than Vista or Windows 8; on the surface (no pun intended) it seems all is good.
Microsoft is doing some good things with open source and making things like VS Code. But I can't abide the aggressive spyware.
> On the developer side? Nothing, unless you use XCode
> Their hardware is underpowered
> Gaming on Mac, which initially showed promising signs of life had started dying in 2015
> brings dedicated gaming features, full OS-level VR support, color customization
> NVIDIA GTX 1080 graphics card is an insane work-horse that can play any game
> On top of that? I can play recent games without the PC breaking a sweat, and I’ve started experimenting with VR
Things that I loved and now I hate:
- Spaces (now a convoluted combination with "notifications"; no more far-left widget screen)
- universal zoom, you used to be able to zoom in on anything
- odd wifi issues (wifi was such a pain in the early 2000s on a PC that the OS X experience of it just working was amazing)
- terrible cloud features; iCloud and MobileMe always sucked, but when my computer started automatically updating and then switched to trying to store my local files in iCloud, it f'ed over my file structure. Huge annoyance.
- cmd-ctrl-alt-8 (defaults to off, but can at least be re-enabled)
I'll still probably get a base model Air for my next laptop, and I run a custom-built PC with Ubuntu 16 LTS, Win 7, and Win 8 (Windows mostly for CAD and other such software still not available on Mac, like proprietary 3D printer environments, looking at you Stratasys).
- Dashboard is still there and can be enabled as a space or an overlay (System Preferences > Mission Control).
- You still can zoom in on anything. I use ctrl+scroll (System Preferences > Accessibility > Zoom).
- The invert colors shortcut (cmd-opt-ctrl-8) is still there. (System Preferences > Keyboard > Accessibility).
Another feature I love that used to be easy to find but got buried in settings is the three-finger drag (now found in System Preferences > Accessibility > Mouse & Trackpad > Trackpad Options...).
I question some of the changes in default settings too, but then I realize the defaults are most important for the non-power-users who can't or won't change them. I would also never want to store my Documents or Desktop contents in iCloud, but I can see why someone who doesn't understand the file system might.
Basically, in 2006 when I joined Google I had a choice, MacBook or Linux laptop, and I chose a MacBook. I really liked it and used a MacBook for more portable computing for the next 10 years. I also got a Linux desktop at Google and learned the various bits you had to know in order to run a Linux desktop full time, with my laptop filling in for things I couldn't get on my desktop. When the Surface Book was announced, the hardware was just amazing. I figured even if Windows sucked I could eventually get Linux working on it. When WSL became available, suddenly Linux was sort of just there, and all my ARM development tools just worked. My home desktop had been a Windows 7/Linux dual boot (defaulting to Linux); I booted Windows 7, took the free upgrade when it was offered, then went back to Linux. Then I needed to run my ECAD program, so I brought the Windows side up to snuff, and I have been running on the Win10 partition for the majority of the last 3 months.
At this point I use my Macbook Pro less and less.
I've been running an Ubuntu server VM for ages to get bits and pieces done, it's so nice to _just be there_.
Pretty cool stuff!
It requires a fundamentally "long-game" mindset to realise that once you lose the high-end Mac guys (you lose your best proponents), you go into a decline, a long and very profitable decline.
There is no excuse for the current line-up of Macs - super expensive, incremental and confusing half-baked features.
Which is why I'm voicing my objections to the way that Apple is treating their Mac lineup by leaving iOS. I sold my iPhone and iPads (I had 2) and moved to Android.
I'm still a Mac user who will happily spend thousands when they offer compelling reasons to upgrade. But if their belief is that there is more money in iOS, I'm making damn sure that I'm not part of that equation.
Switching to Android also has the pleasant side effect of being far cheaper. I was able to get a brand new Nexus 5X for net-$20 after trading in my old iPhone 6, which barely held a charge anymore. And Google Fi is so much cheaper for my use case (I use about a half gig per month and do a lot of travel in other countries) that I'm saving around $20/mo and getting better service.
For those of you wanting Apple to focus more product development resources on macOS and Macs: staying on iOS will only confirm Apple's decision to focus on iOS. We need to abandon iOS if we want more focus on macOS.
For instance, if you are developing a game with Sprite Kit, you will find that around 95% of the code is IDENTICAL on Mac and iOS. Sprite Kit itself works the same on both. Sister frameworks like AVFoundation are mostly the same and they have peanut-buttered some #define values to allow “different” classes like UIColor and NSColor to be SKColor, etc. to make it easier to have code that does not needlessly vary. You end up having to clearly think about UI differences between the two but that is true anyway between desktop and mobile.
On the other hand, sure, a utility-style application is not that similar between iOS and macOS once you get past the likes of NSString and NSArray. Yet, utility applications often look and work quite differently between desktop and mobile so this may make sense. And if you rely on the cloud to implement part of the functionality (and share it between the two), you may find that again you are dealing with mainly a lot of UI-centered differences on the two platforms that would have been different anyway.
The iPad is the weird one. Here, Apple probably needs a middle API that really can use exactly the same constructs between mobile and desktop where they actually do end up looking or even working the same.
Or maybe they can, judging by the last MBP release. Hopefully they turn it around.
I'd be surprised if Apple doesn't have Xcode running on both iOS and Windows internally.
I forgot how awesome it is to have a desktop/server online all the time at home and how much performance I had given up for the convenience of a laptop.
The lack of a used market for the 2nd generation (2013+) Mac Pro leaves me frustrated, but it's no surprise given that it was targeted for the designer niche and the rest of the desktop market has largely been subsumed by laptops.
I'm really encouraged by the growth of the Hackintosh community and all of the problems they have solved. That will likely be my next computer.
Including people who don't actually tolerate the ads, but who need to target that platform.
This article actually prodded me to try out the Windows Subsystem for Linux, and so far it seems pretty nice. Everything I've tried to install has worked so far, and worked fine with my standard Unix config files.
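As a sketch of what "just works" means here, this is a hypothetical Bash on Windows session; the package names are stock Ubuntu ones and the /mnt/c path is illustrative:

```shell
# Inside Bash on Windows (WSL), the Ubuntu userland behaves like
# an ordinary Linux box, dotfiles and all:
uname -a                                   # reports a Linux kernel string
sudo apt-get update
sudo apt-get install -y git build-essential
cp /mnt/c/Users/me/dotfiles/.vimrc ~/      # Windows drives are mounted under /mnt
git --version
```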
I'm in the market to upgrade at some point in the near future as my mac has developed a number of hardware issues and the linux subsystem for Windows and Docker support are both intriguing.
Because they actually update the hardware...
If Apple sold a $400 Mac, you wouldn't be able to give away a 3-year-old Mac either.
They are more expensive to buy and not really "great" over their predecessors, and there is no doubt that some future-generation Touch Bar 2 with a feedback engine, etc. will be far superior to use (kind of like buying an iPad 1 with not enough memory and too much weight, then having the iPad 2 blow it out of the water, not to mention Apple unceremoniously ending OS support for the iPad 1). These first-generation Touch Bar laptops will age poorly, I think; I doubt they will resell as well as Macs typically have.
I don't care about this whole windows/mac debate, as I run linux, but my 4 year old Asus gaming machine is still quite a powerful machine and hasn't been updated outside of SSDs. It would fetch a few hundred bucks, probably 40% of its original price, and it doesn't have a cult following willing to pay a premium for...whatever it is.
Used Macs remind me of used Jeeps.
I don't need to sell this in 1.5 years, probably not in 3-4 years. After 5 years, it'll certainly be worth >$1000.
Not at 50% of the original price, but still...
- Why I left Mac for Windows: Apple has given up
- I got a Google Home and finally understand the future of computing
- Apple just told the world it has no idea who the Mac is for
- The reason I keep going back to Apple might surprise you
- Google's software is putting Apple to shame
- Google buying Twitter would be the best result for everyone
- iOS 10: the beginning of the end for apps as you know them
This is the profile of the newsletter editor (http://char.gd/announcements/welcome-to-the-new-charged/):
Owen was previously Editor at The Next Web and now runs digital at VanMoof in Amsterdam. He created Charged newsletter, and is probably far too obsessed with keeping up with everything in tech.
Currently I have a Thinkpad X1 Carbon running Fedora 25 (Gnome 3) and do not miss OS X at all. Gnome 3 is very clean.
I have a Google Pixel XL. All I can say about it is: wow.
I purchased an Intel Skylake-based system and 99% of stuff worked out of the box with the MultiBeast tool. I had to fiddle a while to get HDMI audio working; other than that I had no problems.
But what Microsoft has done with mandatory updates and forced telemetry is unacceptable to me. And because of that, Windows 10 is the first version of Windows that I have not installed since the release of version 3.0.
What's BeOS like?
As for BeOS, I really liked it. It was very stable and fast if the hardware you had was supported. It had great multi-CPU support, came with a Bash shell, and also BeFS, whose support for arbitrary user-defined metadata made things like email and music queries very interesting.
The downside of BeOS was that it didn't attract any big third party apps and that it was a single user operating system.
I ran BeOS on a Gateway Performance 450 with the following (from memory) specs: 450MHz Pentium III, 384MB of RAM, an nVidia TNT graphics card, and a Promise Ultra 66 IDE card. I eventually moved away from it when I upgraded to a Pentium 4 computer with RDRAM.
But since 2006, every time I look at buying a Mac laptop, I can't hit the buy button because they are more expensive than I want to pay and end up buying a Dell.
But even during that time, I've still bought multiple iPods, iPhones, and tablets.
All that being said, I can see a Mac in my future once I decide to buckle down and be a true hard core, front end developer and start using all of the stuff the cool kids use and/or an iOS developer.
But what would I buy? The Mac Mini would be good enough, but I would feel bad buying outdated tech even if in day-to-day use I couldn't tell the difference. I don't want an all-in-one iMac, and the laptops still either cost more than I want to spend or are underpowered (the MacBook).
Most of mine is audio mixing for live and live-to-tape video, so I could move to Ableton, but it'll be a lot of reworking stuff that already works.
Be flexible, use the right tool for the job.
Don't be a hater.
How does it feel to belong to the Blackberry of Apple? Do they get laughed at during lunch breaks?
If I had to take a wild guess, they've been going stir crazy waiting for the new HQ to be built. From what I know, they've got a lot of employees crammed in lots of temporary offices. They may also have temporarily slowed hiring.
All of that could add up to multiple problems all over the company. Like a skeleton crew struggling to get the ship back to port for refitting.
I noticed that the author seems to have stuck with a MacBook as his on the go machine. I've considered that route as well; but, I'm wondering if any company has yet started making portable hardware that compares well with Apple products. The last I checked a couple of years ago, no one felt close.
In the past year I have been spending the majority of my time in Windows 10 on my PC, as well as booting into Linux. I went all-in on Mac in 2001 for me and my family, buying quite a few Macs since then but I'm pretty sure I've bought the last Apple computer. They are not the win they once were.
I run both Mac OS (natively) and virtualized Windows (via VMware). The Mac OS side is more polished in terms of drivers/hardware integration. I'm a huge fan of that. Mac OS also handles website browsing and media entertainment. I like the way Bluetooth speakers just work when I power them on, BUT the overall media experience went downhill with recent Mac OS releases. Sudden disconnects from AirPlay without my consent, a flaky mess between system and iTunes target audio devices. It gets worse with every release. Dubious improvements in the macOS audio subsystem introduce more evil than good for the end consumer. That's why I'm pretty sure macOS won't last in the long run. The countdown started when Steve Jobs passed away.
The Windows part of my system is used during day-to-day business, and I mostly do all my development there (with a tiny bit of help from macOS / Homebrew / Unix utilities like curl).
To sum this up: to me, Apple is like a Titan destined to decay. At the same time, Windows and the PC are the result of the best efforts of thousands of companies. That's why the PC has much more inertia and better long-term stability. In other words, the PC has a natural status quo, while macOS lives in the bubble of a single company, so the end is nigh.
It looks like my next computer will be a PC. It's so hard to leave the Mac's polished hardware, but presumably I won't have a choice.
The incremental process allows one to improve their machine without massive (well, most of the time) capital outlays, and you end up building something that you love, that you created, and that can be phenomenally fast and customised to your specific needs (and a pleasure to look at, the more LEDs the better, right?).
Anyway, I realise this might be quite specific to me and the other "enthusiasts", and I am sure a lot of people just glance at tempered glass PCs with water cooling and think "I could never do that", or "I couldn't be bothered to put so much effort into that, I need something that just works.". Once you start, it becomes apparent how simple and modular it all is.
I am not sure where I am going with this, but Owen Williams built an awesome computer, so I thought I would comment.
I also find that Microsoft is at least trying to make quality products, even if they will never be mainstream (like the Surface Book). Very happy to have made the switch.
One of the things I like most about OS X/macOS is how its development has been a gradual evolution, from the time Apple bought NeXT and OpenStep in 1997 and started working on Rhapsody, until macOS Sierra. It may have a lot to do with my using it every day and having gotten so used to the OS after more than a decade, but my Mac in 2017 totally gets out of the way and lets me concentrate on my development work. The familiar UNIX underpinnings and developments like e.g. Homebrew help a lot with that. Also, printers, multiple external screens etc. just work.
When I look at Windows, what I really cannot understand is how an operating system that is used by the majority of people and businesses gets away with having each major version looking and behaving totally different from its predecessor, as if there was a constant need to fix the UI design errors that have been made in the previous version. And still (or thus?) it's not getting to the point where it feels natural to use.
And I too just purchased my first Windows PC in ten years.
My new 2016 MacBook Pro with Touch Bar hangs on resume. Whether a USB-C connected external monitor will work or not is a total crapshoot. The TouchBar display becomes decomposed intermittently. The battery runs down to zero once a month, forcing a total electrical disconnect and a hot computer. I am afraid to install flash for Safari, fearing it will cause a black hole to form in my office, so I have to run Safari or Chrome depending on the situation. Each gives me a very disappointing 3.5 hours of battery runtime. Opening the computer is a gamble: will the keyboard backlighting work or not? It's impossible to know in advance.
On top of all of that, Apple's refusal to push on component quality means that this computer is just plain slow, despite being maxed on RAM and CPU. Also it was $3000.
If you want a computer that is snappy and just works these days, you probably want a Windows computer.
I haven't really paid any attention to the feature set, since I've only used Windows for games for the past 5 years and gave up on Windows completely ~6 months ago, but let's say it gives you everything Linux gives you. You can just use rvm.io to install your Ruby and then use gems to install everything else you need (the author used Jekyll as an example). What is the real gain to be had?
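For reference, the setup described here really is just a few commands on any Unix-like system. This is a hypothetical session; the install one-liner is the one from rvm.io's homepage, and Jekyll's local preview defaults are assumed:

```shell
# Install rvm, a recent Ruby, and Jekyll (the author's example):
\curl -sSL https://get.rvm.io | bash -s stable
source ~/.rvm/scripts/rvm
rvm install ruby              # latest stable Ruby known to rvm
gem install jekyll

# Scaffold and preview a site locally:
jekyll new my-site && cd my-site
jekyll serve                  # serves a local preview (port 4000 by default)
```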
I cannot see a single reason to switch over to Windows from a perfectly working OS just because Windows is catching up. The only argument you can reasonably make is that Apple has neglected the Mac and you can build a faster/more powerful PC and run Windows on it, but what front-end stuff needs that much power? IMO, unless you are working with 3D or something like a game meant to be played on Windows, there is no benefit to switching to a more powerful machine.
Maybe I'm missing something obvious, maybe the author was upgrading from early 00s model of Mac/MacBook, but if you have anything even recent-ish from Apple, I see no benefit in switching.
The old "640k should be enough for everyone" argument. I'm amazed after decades of continuous performance improvement in computing people are still making this argument.
Really? Why do you need lots of VMs? Back when I did front-end stuff I used Vagrant, so that's one, but maybe I wasn't a real developer.
Nowadays I work with embedded stuff, and the only reason some people run VMs is that some clients use VPNs that require Windows, so they are running Linux on top of Windows, but again, that's just one VM.
But that is also precisely why Apple is still WAY ahead of Microsoft and co. They will be able to make the desktop and tablets converge where you can carry your computer in your bag, and when you're home you put it in a dock and you have a full blown powerful desktop with the 5K Retina screen and maybe even a dedicated GPU in the dock fit for triple A gaming. I'm pretty sure that is where they are headed, since obviously you can't make a single "form factor" for both uses. And for people who don't need the full blown desktops you won't even need the dock, external screen, just the keyboard and you'll be able to do most computer tasks.
And that will be possible because they are already thinking ahead, while Microsoft will be stuck with their turd of a desktop and making funny hardware that kinda works for some uses, but can't quite decide what it does.
I'm sure Surface Pro is a great device, and it answers the needs of a certain demographic but "windows on the tablet" is not the answer to the computer of tomorrow. For what it's worth I heard from artists the Surface isn't a replacement for the Wacom as it has its problems.
I assume you're referring to the fact that iPad Pro doesn't accept USB keys, or lets you access filesystem etc. But what Apple achieved with the iPad Pro is to make a tablet first and foremost, with a fantastic touch interface. Not a hybrid device.
"But an idea is not enough. It is never enough. And even its successful execution as a piece of technology is not enough if the company refuses to join with the rest of the world and its own customers in cooperative exploitation."
Except for the original Shuffle, which is the perfect jogging companion.
> If you ask anyone who knows me, I’m probably the biggest Apple fan they know.
> I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro. That machine was my first real taste of Apple’s world, and I loved it.
I rest my case.
But saying that one OS is better than another because it takes 5s to 'boot' instead of 10s is bikeshedding. (Windows cheats here anyway, because it shows you the desktop before it's ready to be interacted with.)
That's not to say I haven't been researching other options in the meantime. The good thing about all of this is that the people (myself included) who have never even considered leaving are getting a chance to see what else is available in the market and make a more educated decision to stay or go.
I think I would even miss small things like the Finder listing options. It's just that macOS was many years ahead, and Microsoft is only now playing catch-up but has too much to catch up to, so I will wait at least several more years to even consider Windows viable.
Even if macOS stagnates for the following years, it's still the better OS for me. I am on a Hackintosh btw, so the hardware advantages are non existent.
1 - https://github.com/jkbrzt/httpie
On Windows there's ConEmu, Bash on Windows and Powershell.
On Linux, there's a bazillion shells like zsh & fish and terminal emulators like Terminator & Terminix.
Also, Finder is probably the least-featured file manager out there, and you could replace its features with pretty much anything else.
MS was successful on desktop but failed on mobile, so MS is adapting their desktop OS for mobile devices.
For Apple, it's the other way around. Apple was successful on mobile but failed to take over the desktop. So they are adapting their mobile OS for the desktop.
Their end-game is the same: to create a larger ecosystem via universal apps that work in both mobile and desktop.
The funny thing is perception. Apple's moves are viewed negatively but MS moves are viewed positively (at least here on HN).
Win8 killed momentum, stagnation ensued.
But, then GPU kicked it up a notch. The new 1080 is insane, but in itself that is not that cool.
The new killer upgrade is the display. After a whole decade of nothing suddenly there was a huge bump.
I just upgraded to an Ultra-Widescreen Acer Predator.
Combine it with a 1080 (or the new 1080Ti) and your PC makes a whole generational leap forward.
And right at this moment in time, Apple stops making displays and chooses shitty LG as its maker of choice.
Yes, Apple is losing the desktop, 100%.
So Apple has made some controversial design decision with their newest line of laptops...I still will be purchasing one in two or three years...whenever this one becomes unusable.
But for desktop PCs, where Apple only has iMacs anymore?
The big ultrawidescreens need powerful GPUs and a DisplayPort (not USB-C yet).
Use an ultrawide for a day, high res and high Hz, then try to go back to a classic LCD. It's like going to Retina on phones.
Even worse, apparently both Thunar and Nautilus dropped their implementations.
I use Windowmaker largely without the GNUstep file manager -- bash is my preferred interface. But it's there, and you can run it freestanding under any desktop or window manager you want.
Part of the beauty and/or aesthetic confusion, as well as freedom, of Linux/BSD.
How has your experience with Elementary been?
I've been thinking of getting a MacBook for my mom in a few months. In the meantime, I had to reinstall her linux laptop, so I thought I'd give her Elementary because I heard good things about it, and from the screenshots I figured it would be like halfway between linux and mac, so might ease the transition a little.
But in the end, I got so frustrated with it I switched it back to Xubuntu!
And it's Ubuntu, so you can apt-get install build-essential and whatever else you need. It doesn't do upstart or /etc/init.d services (like if you wanted to run mongo as a service), but it runs mongo instances fine for me while I develop locally.
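To make the mongo point concrete: without upstart or init.d, you just launch the daemon by hand against a local data directory. A hypothetical session, with illustrative paths:

```shell
# Run mongod directly instead of as an init.d service, using a
# throwaway data directory for local development:
mkdir -p ~/mongo-data
mongod --dbpath ~/mongo-data --port 27017 &

# Sanity check from the mongo shell:
mongo --eval 'db.runCommand({ ping: 1 })'
```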
I spent my first full day developing on a Win10 box this weekend, and it went fine. But since I've switched to WebStorm for most of my development, what OS I'm running hardly matters outside of keyboard shortcuts.
For anyone considering switching, I'd recommend buying the "Pro" version, as you can use the Services feature to turn off things like Cortana or the automatically restarting Windows Defender. Windows 10 wants to do a whole lot of spying by default; I'm surprised people are comfortable with that.
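For what it's worth, the command-line route works on any edition via the service controller in an elevated Command Prompt. This is a hedged sketch: DiagTrack is the internal name of the "Connected User Experiences and Telemetry" service on current builds, but service names can change between updates, and note that sc's odd syntax requires the space after `start=`:

```shell
:: Query, stop, and disable the telemetry service by its internal name:
sc query DiagTrack
sc stop DiagTrack
sc config DiagTrack start= disabled
```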
But as many in this discussion have said, please give Linux (and especially Gnome 3) a shot. It's a pleasure to work with, and if I could stream legal TV services and run Photoshop, there's no question it's what I'd run.
I still far prefer macOS to Windows for development because it's an actual Unix, the UI is still much better, and the quality of apps available is quite a large difference.
My dream though, is a Linux with an actually decent GUI, desktop environment, and good apps. But one can only dream...
Also, you’ve conveniently left out a hell of a lot about what’s wrong with Windows 10, starting with the utterly insane way that Microsoft shoehorned it onto most people’s systems. Microsoft has done well, and they have made some reasonable moves for developers but you have some fog-covered glasses here.
Things like Homebrew have given Microsoft a very clear feature roadmap for achieving parity with Apple in most open source development stuff.
At this rate it won't be long before nearly anyone using open source tooling will likely be indifferent between OSX and Windows 10. This is great news for everyone.
Linux can definitely fare better as a server OS, allowing you to hit higher concurrent connections, etc. But while it has a strong focus as far as providing a fast and robust platform, it completely lacks any focus, coherence and direction in being a tool used by humans, i.e. a Desktop OS. I kind of like dealing with it and make good money doing so, but I would never recommend anyone switch to Linux.
At this point it's a bit like that old joke about Stroustrup making C++ complicated because C was too easy and the jobs didn't pay enough - if I wasn't 100% certain of the sheer hubris present in everyone involved, I'd chalk up all the problems to "we want to minimize the number of Linux devs so they can be higher paid".
Note, I'm not calling anyone stupid - all the people involved are highly intelligent, but it works out like that experiment where you get 100 monkeys banging randomly at typewriters to produce the works of Shakespeare; except here the monkeys are professors with PhDs in literature, each of which has a different, very well thought-out idea of how best to fix it. Then some poor sods bundle it all up and you get Fedorio and Debliet with instructions on the cover like "if you want to read about Romeo's sidequest of slaying a dragon, turn to page 1257" and a few of the pages are blank so you can stitch up the story in exactly the way you want it. Some people like that. The rest of you, please save your sanity, just go to the library and check out Winter's Tale or Macbeth.
It is not very user friendly and requires manual work to set up, unlike, say, Ubuntu.
Trying to get recent versions of development packages like Ruby, Java, npm, etc. can be a bitch on Ubuntu, where the repositories lag years behind Arch's.
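The usual workaround on Ubuntu is to bypass the distro repositories with a language-specific version manager. A hypothetical session using nvm for Node (the install URL is the one from nvm's README; the version tag is illustrative, so check their page for the current one):

```shell
# Install nvm, then a current Node, independent of apt's ancient version:
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
export NVM_DIR="$HOME/.nvm"
. "$NVM_DIR/nvm.sh"

nvm install node      # latest Node release known to nvm
node --version
npm --version
```

The same idea applies per language: rvm or rbenv for Ruby, SDKMAN for Java, and so on.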
I use Fedora, and the upgrade process tends to work. I remember there was a time when Ubuntu wasn't so great at upgrades in various situations, but I'd hoped that sort of thing would be fixed by now.
- Pacman (with pacaur on top of it)
- The Arch Wiki
- Bleeding-edge software
I'm sure iPad also runs OneNote but you simply cannot efficiently use an iPad as a full time desktop experience, from what I've seen.
Currently, a license for Windows 10 Pro costs €279 in the German Microsoft Store, while macOS Sierra is free for everyone who can run it. (Some say the Mac is the most expensive dongle.)
Also he mentions this in his post
> (I now use a 12" MacBook for on-the-go productivity)
Seems a bit strange to spend 10 paragraphs criticising Apple and then still buying their products anyway.
I can't process this sentence
I won't be switching to windows anytime soon (probably never).