"True Apple fashion" has only recently been about resisting power users. Before about five years ago, they were all about enabling power users while still being friendly to regular people. This was much more compelling than their current direction. Remember, this is the company that built their OS on top of UNIX and shipped a terminal app with it standard. It all changed around when the iPhone came out, though.
OS X is going down the same path as iOS, too: Mountain Lion scolds users for downloading apps without going through the App Store; this can be disabled by digging around in the system settings, but I can envision a day when you'll need to open a terminal to set a system property from the command line, and then a day when they simply prevent unsigned binaries from running altogether.
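For what it's worth, that terminal step more or less exists already: Gatekeeper's policy can be inspected and changed with the `spctl` tool that ships with Mountain Lion (a sketch; needs an admin account):

```shell
# Check whether Gatekeeper is currently enforcing its policy
spctl --status

# Allow apps from anywhere -- the same setting the System Preferences
# pane exposes, just from the command line (requires admin rights)
sudo spctl --master-disable

# Restore the default "App Store and identified developers" enforcement
sudo spctl --master-enable
```

So the "dig around in system settings" escape hatch and the "set it from a terminal" escape hatch are, for now, the same switch.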
What is Apple going to use for their own work? Their developers and designers are power users.
It's certainly possible that the Macbook Air could switch to ARM in the next few years, but until you can write software for an iPad on an iPad, there's no getting away from OS X and serious processors.
I don't see how power users have been left out. If you mean they try to keep you out of the OS itself, I wouldn't consider that unfriendly to power users; that's just how they've always been. If you mean the whole dumbing-down of the UI, I'd say that's pretty irrelevant: as a power user, there's no reason you couldn't just turn certain stuff back on. Plus it's still the same BSD underneath, and they still give you the terminal. I keep hearing that they're being unfriendly to power users, but power users are the ones who don't really need friends. I've been using Macs since 10.4, and each time the OS is upgraded it's maybe a little annoying for an hour, but then I remember I'm a power user and I know how to open a terminal window and do whatever the hell it is I want to do.
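Case in point: most of the "hidden" behavior can be flipped back with `defaults write`. A sketch (these preference keys have been stable for a while, but the exact names can vary between OS releases):

```shell
# Make the Finder show hidden files (dotfiles, flagged folders, etc.)
defaults write com.apple.finder AppleShowAllFiles -bool true

# Show the full POSIX path in Finder window title bars
defaults write com.apple.finder _FXShowPosixPathInTitle -bool true

# Restart the Finder so the new preferences take effect
killall Finder
```

An hour with `defaults` and a new Mac behaves like the old one again.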
My iPad won't let me tether, functionality that's built into the iPad but that can be enabled/disabled by the carrier. My Android phone, from the same carrier, lets me do pretty much whatever I want.
iOS doesn't have a terminal. And considering the recent moves with the Mac app store, how long before terminals will stop shipping with OS X?
> iOS doesn't have a terminal. And considering the recent moves with the Mac app store, how long before terminals will stop shipping with OS X?
Why not take this illogical train of thought even further? iOS tries to hide the filesystem from the user, so how long will it be before OSX does the same? iOS doesn't allow generic USB devices like 3G modems, so when can we expect that support to be removed from OSX?
I purchased an iMac for my mother two years ago, and she never could quite get the hang of it. The iPad that replaced it last Christmas has been working out much better. The same qualities that make iOS shitty for power users make it simpler and easier to use for the average populace.
OS X already hides the ~/Library folder; it's not far-fetched to think they might hide everything but the Photos, Documents, etc. folders.
I wouldn't say iOS is a terrible product, just that it's terrible for someone that knows how to use a computer. Apple is trying to get users that don't know what they are doing at the expense of the experience for users that do. I know I shall never buy another Apple computer if the trend continues.
You'd have a better argument if they got rid of the Library folder completely. Just because it's not a single click away anymore doesn't justify the slippery slope argument. OS X is not Linux. It's targeted to the general computer using populace and happens to be quite popular and useful for a very small minority of people like us. They still allow us to do everything we used to, it's just that they've hidden a few things that confused the normals out there. Big whoop.
Hold down Alt while clicking the Go menu to see your Library folder.
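Or undo the hiding permanently from a terminal. On 10.7/10.8, ~/Library isn't gone, it's just marked with a filesystem "hidden" flag, which you can clear (a sketch):

```shell
# ~/Library is hidden via a filesystem flag, not removed; clear the flag
chflags nohidden ~/Library

# Or skip unhiding entirely and just open it directly in the Finder
open ~/Library
```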
Your assertion that it isn't a slippery slope is subjective, as is my assertion that it is. Both viewpoints are valid.
However, I reject your solution. It's my machine, and I want it to be a pleasant experience to develop on; I don't want to memorize a bunch of workarounds for trivial problems that I have to apply to every machine I use.
I'm taking the defensive position of not investing too much of my time in their products because I think they will remove access to those folders, or lock down on application installs, or otherwise make the experience wretched for me sometime in the future.
The problem with this line of thought is that you're talking about the older generation (our mothers and grandmothers).
Teenagers today are power users, except for those who have much bigger problems than poor technical skills (like being freaking illiterate).
So shouldn't we optimize for our children instead? Isn't it plain stupid that we spend so much time worrying about our mothers and grandmas?
The side effect is that we, as a society, are making efforts in keeping people dumb. Reading, as a skill, is hard to learn and it was considered optional and for power users even in the 17th century. Even today, I find it so stupid that movies are dubbed around the world, as if people can't be bothered to read freaking subtitles. That's how I learned English btw, something which would have never happened if I lived in Spain or Italy.
Do you really think teens today are power users? Every generation since I was a kid thought their kids were tech geniuses; having interacted with them, I know it wasn't true. So kids today use Facebook and Instagram and Snapchat. And Tumblr. I haven't seen any kind of trend for teens using Terminal or rolling their own. They appear to be proficient because the tools for normal people have gotten so much better.
Yes, I really think teens today are power users. I interact with plenty of teens, and my wife works at a kindergarten - she has 4-year-olds that know their way around a PC, enough to open their game or a browser.
Being a power user doesn't necessarily mean usage of a terminal, especially since most teens today use Windows on their PC, which has the shittiest terminal experience of them all. Heck, when I was a Windows user I wasn't using the terminal either, even though I was doing programming. Even if you are using the terminal in Windows, you don't really have much need for it since the whole freaking OS is anti-terminal and you can't do much with it. It's easy to forget this if you're a Mac or a Linux user.
This is not about teens being smarter btw. Older people simply don't have the patience to learn anymore, unless they see the real value in doing it. My folks also have a huge language barrier - they never learned English, as they were taught Russian and French in school and they forgot everything due to a lack of practice. For my mother, it doesn't matter how easy to use the device is, if it isn't localized.
Our tools definitely got better, but the only truly meaningful thing that has changed since the late '90s is the availability of the Internet. When I was in high school, in the year 2000, we had no Facebook or Twitter, but I still had classmates that were communicating a lot over IRC and email. But they were doing so from school, because home Internet connections were expensive and shitty.
Seriously? iOS doesn't have a terminal? What are you going to do with a terminal on an iPad? Run operations on the non-existent file system? The iPad is not meant for computing. It's an entertainment device. That statement is almost like saying "TV manufacturers are forgetting about the power users! My new Sony flatscreen doesn't even come with a terminal so I could... uhh... adjust the picture, color, brightness, etc. from the command line instead of just using the real simple buttons on the side".
The Mac App Store doesn't really have any relationship with the terminal. There's no reason to believe that it's in Apple's interest in any way to take away the terminal. It's still Unix underneath, they still provide developer tools, and they still need developers to write applications for them. In addition, Macs are still huge in the design/developer community because they're well suited to graphic design work with their appearance, focus on large screens and high resolutions, and performance. They're pretty big in the developer community too, as you get a great "point and click" kind of OS with full and easy access to the terminal and most of the goodies you get with a full-fledged Linux machine.
Finally, your argument is that Apple is ignoring power users but you use an example where it's actually your carrier that's stopping you, not Apple. You said it yourself, the iPad does support tethering but your carrier is the one who enables/disables it. Seems more like your carrier is against power users, not Apple.
In the end, just because Apple makes the OS more "point and click" friendly and comes with a pretty opinionated set of defaults for non-power users doesn't mean they're trying to keep power users out. By definition, if you're a power user, these things they're doing should be a minor annoyance when you get a new Mac and after a couple of hours you should have your machine how you like it because... drumroll please... you're a power user and know how to do that stuff! I personally don't see much difference between OS X and some of the more "user-friendly" Linux distros. They've both got the same underlying tools and are working hard to make it so your grandma can pick it up and get emails of her grandchildren within an hour. What I think the real problem people have, which maybe they just don't see, is that they just don't like change in general. New versions of OS X come out and they hid an option somewhere and everyone goes nuts and says "Who moved my cheese! This is the worst computer ever!"
Actually the iPad does have a file system. You just don't have access to it.
> That statement is almost like saying "TV manufacturers are forgetting about the power users! My new Sony flatscreen doesn't even come with a terminal so I could... uhh...
Incidentally, one of the reasons TV is dying is that it's just a dumb consumption device. I use my laptop, my Android and my iPad for 10 to 12 hours per day. I use my enormous flat-screen that's sitting in my room only for streaming movies from my laptop and yes, while connected to it sometimes I open the terminal.
> it's actually your carrier that's stopping you, not Apple
BULLSHIT. This is a device-level configuration setting that the carrier can remotely send to you. The device wasn't even bought from that carrier. It wasn't on a contract or anything like that.
It's my device, and I find it unacceptable that the carrier can tell it what it can and cannot do. It's Apple's fault for giving them the option.
> By definition, if you're a power user, these things they're doing should be a minor annoyance
Actually it's a big annoyance because I'm the customer that pays money and why in the world would I pay for devices that are defective by design when I could be supporting companies that respect me and my needs? My current retina-enabled and shiny iPad is the last Apple product I'll ever buy.
I got a 17" Macbook Pro in 2009 (when they were still riding the wave of the iPhone) because it was a solid piece of hardware and gave me a sweet spot of down-the-road-choice (I exchanged the cd bay with another hdd, added Ram, installed Linux). I was looking forward to buying further Macbooks in the future.
All that was shattered with their new Macbook lineup (which are pretty much just beefier MB Airs). They flat out killed the 17" (which I still consider the perfect on-the-go workstation).
I considered Apple very relevant around the iPhone release. The current direction is not the iPhone direction. It is the iPad direction. That's when they started to go somewhat batshit on driving away professional users. They could have maintained both camps pretty handily in my opinion. Both camps were quite happy and got along great. Why they decided to kill off one is beyond me. Sure, there is more money in everyday clients, but I doubt they were actually hurting their business with power users.
Consider this: When I - a staunch defender of FOSS, user of Kubuntu, Free Software programmer, ardent antagonist of everything Microsoft - got my Macbook, I actually started recommending Macs as a choice to others. It actually did seem to me a better choice than going with Microsoft Windows. These days, I recommend Windows 7.
I also got a 17" MacBook Pro around the time you did. I really wish I hadn't though, it's way too heavy to really be useful for anything but using around my apartment. I've tried to travel with it a few times and really regretted it.
While I don't relish the idea of a non-upgradable system, I do appreciate Apple trying to shave off every last possible gram from their laptops. My next laptop will probably be a 13" Air for that reason.
Wow, I'm in almost exactly the same boat. I'm on my third and apparently final 17" MBP. I have started resigning myself to switching back to Ubuntu but I'm not enjoying being forced off a beautiful platform after having made friends and family switch. I had been an ardent Linux user for a long time before I went Mac and it looks like I'll be returning sometime in the next few years.
You can still get 13" and 15" "traditional" (non Retina display) machines which are part of the "new" lineup. My early 2011 15" with matte display, SSD, and 16GB RAM seems to be a good compromise of still being under my control, yet gaining most of the performance benefits of the 15" rMBP.
>While we're keeping history in context, let's also not forget this is also the company that was borderline irrelevant prior to their current direction.
You're retconning. Apple hasn't been 'borderline irrelevant' since before they launched the iPod in 2001. There's a six year gap there in between the iPod and the iPhone, and imho they didn't really change their direction until 2010 or 2011, with the release of OSX Lion, neglect of the Mac Pro, and the killing off of the 17" MBP.
I won't dispute that they're far more relevant today, but I disagree that they were borderline irrelevant circa 2005. They were quite successful selling Macs and had a credible alternative to the Windows monopoly.
The iOS stuff certainly moved them to a whole new plane of success, but I don't have to like it.
As a power user I don't 'like' it either. But I don't 'hate' it. I just use third party services when Apple's offerings don't fit.
The only bit I took issue with, is holding up Apple as having 'failed' simply because their focus is on other types of users. Particularly when they're serving those users at least as well as any alternative. And when those users are far, far more numerous than users like myself and their needs far, far easier to meet in an engineering and support sense.
Counter narrative: nothing has changed. Apple has just expanded their business. If Apple made toasters, whether they were locked down or not wouldn't affect my opinion of what was likely to happen to my Mac. iPhones are toasters, not computers.
I can buy the idea that iOS being locked down doesn't tell you what's likely to happen to your Mac. But once Apple started bringing iOS-like features across, then it becomes pretty reasonable to compare the two to guess at what they might do next.
OS X and iOS share a lot of the same core code base, so of course there is going to be cross-pollination features-wise. Why should the migration of features to solve problems common to both platforms indicate you should start worrying about Apple locking down the Mac?
Guess what: when Macs get touch screens, more Launchpad is going to make more sense, and full screen is going to be even better. That doesn't mean that Apple is going to start locking down the Mac. Indeed, as time goes on and their less technical users migrate to iOS instead, they have even less incentive to further lock down the Mac.
When one of the features they migrate makes it so that the default state of a Mac is to obstruct running any software not approved by Apple, it starts to make sense to think in this direction. Full screen/launchpad are irrelevant.
You know that Windows and (some) Linux desktop environments also have that feature, right? It's not "obstruct running any software not approved by Apple," it's "obstruct running any software marked with the 'downloaded from the Internet' taint flag, unless signed with a certificate in the OS's keychain."
It's a very sensible default for people who can't be trusted to not click on banners telling them to download a "FREE CAT SCREENSAVER", thus the universal adoption. And it doesn't hinder anything like software development at all, since programs you compile yourself aren't marked as tainted. (And you can just pop open a Terminal and drop the taint xattr from any file.)
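Concretely, the "taint" is just an extended attribute that the downloading application sets on the file, and you can inspect and strip it by hand (a sketch; the app path is hypothetical):

```shell
# List extended attributes on a downloaded app bundle;
# quarantined files carry com.apple.quarantine
xattr /Applications/SomeApp.app

# Remove the quarantine attribute recursively so Gatekeeper
# never assesses the bundle at launch
xattr -d -r com.apple.quarantine /Applications/SomeApp.app
```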
Nevertheless, in all cases, in all these OSes, the Gatekeeper/Smartscreen-like system can be turned off, and always will be. Otherwise, how would programs get deploy-tested? [You can't require signing with individual device deploy keys like for iOS deploy testing, because IBM-compatible PCs have no structural identity--there's nothing equivalent to the UDID to tell them apart by. You could try using a fingerprint with the CPU model, MAC address, etc.--but all those can be faked. Unless we get something like a TPM-based PC UDID, trying to do device keying on PCs is moot, and no OS vendor will bother.]
Actually, come to think of it, Linux also has this at an even more fundamental level: you can't install a DEB/RPM from the Internet as an automatic dependency unless its signing key is in your keychain, full stop. There are actual programs I've installed (for example, ESL's distribution of Erlang) which require the user to "curl http://example.com/key.asc | sudo apt-key add -". Ubuntu's PPA system (using apt-add-repository et al) doesn't get around this, it just automates it with a prompt for whether you trust the key.
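The whole dance looks roughly like this (a sketch with hypothetical repository and PPA names; the key URL is a placeholder):

```shell
# Trust the vendor's signing key (placeholder URL)
curl http://example.com/key.asc | sudo apt-key add -

# Add the repository; apt will now accept packages signed by that key
echo "deb http://example.com/debian stable main" | \
    sudo tee /etc/apt/sources.list.d/example.list
sudo apt-get update
sudo apt-get install some-package

# Ubuntu's PPA helper automates the same steps, adding the PPA's
# signing key on your behalf
sudo apt-add-repository ppa:some-vendor/some-ppa
```

The point being: the default is "refuse unsigned/untrusted packages," but the user can always extend the trust list themselves.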
You are talking about something different than what most people are worried about. Sure, locking down what can run at the behest of the administrator is a feature.
What people are talking about is locking down what can run at the behest of the vendor. Like how iOS is. Like how Mac OS X would be, if you couldn't disable Gatekeeper.
You think users "always will be" able to disable Gatekeeper, but I don't think there is any evidence to support that. It's entirely up to Apple, and if they want to implement a TPM-based (or other) Mac UDID and lock Macs down to Apple-approved software, they will go right ahead and do whatever the fuck they want to do.
But that's the thing. OSX, Linux, Windows--they're PC operating systems, and they run on PCs. Any PC. Which also includes virtual machine environments that emulate PCs. Apple could lock Mac hardware down, yes, but they can't stop a Hackintosh from running whatever it likes--because you wouldn't build a TPM chip into your Hackintosh.
Now, if your argument is that Apple is going to take OSX and make it into something that doesn't run on generic PCs, but rather a specific, closed environment that loosely resembles PCs [thus killing all ability to do Hackintosh builds, run OSX in a VM, etc.], I agree that there's a very slight possibility of that.
But Apple has a heavy incentive to keep OSX running on generic PCs. For one thing, it's required to maintain backward compatibility with all the current hardware, which is just generic PCs. For another, it gives them the ability to test their software using generic VM products, rather than a specialized "simulator." For a third, it allows them to just construct a new prototype Mac in the lab out of the newest off-the-shelf components (picture an empty Mac Pro case with random hardware inside), and then use it to write and test drivers for those components, instead of waiting for a specialized mobo to be produced for them that supports all those technologies and carries their special, needed OSX TPM chip.
Sure, Apple could push the industry to standardize a UDID-carrying TPM chip for all devices (this is basically the dystopia everyone was scared would happen with Palladium), so that Apple could use off-the-shelf hardware and still do device-key deploys to it.
And sure, Apple could write their own machine simulator.
And sure, Apple could just make the device-deploy-keys feature optional until an OSX release where all the old hardware is no longer supported.
But why? What advantage does this give them? It sounds like a lot of hassle to create a world where it's harder for everyone--including Apple's own in-house developers--to develop, test, and distribute Mac software. A world where fewer developers want to develop for OSX. A world where it's impossible for enterprises (yes, Apple has enterprise customers) to deploy their own internal software over their networks.
Now, look out below, for :itisacaranalogy: --
If you're a car company who makes sedans [iOS devices] for "consumer driving", and trucks [Macs] for "utility driving", what purpose would it serve to turn all your products into cars? Especially if your own employees require a truck, as part of their job, to haul loads around the workplace?
As far as I can see, Macs are going to diverge from iOS, not converge. The more consumers who buy sedans [instead of buying a truck they don't need and then complaining when it doesn't have heated seats], the more "trucky" the trucks can become without impacting sales. Macbook Pros and Mac Minis--both "trucks"--are here to stay.
On the other hand, iMacs and Macbook Airs--both "sedans"--might just get locked down, run iOS, and probably have touchscreens one day. But that's just fine, isn't it?
The MBP looks like it's going to keep getting lighter until there's no need for a separate "Air" category any more; if they keep the brand after that, it'll be for an iOS device with a keyboard attached.
And the iMac is already a redundant competitor to (Mac Mini + Cinema Display); so it will probably make more sense as a big iOS touchscreen "kiosk." Instead of having a Mac built in, it'll have an Apple TV built in. (I imagine the Cinema Display would also get touchscreen capabilities, and then you'd get the same experience as an iMac by hooking an external Apple TV up to it instead of a Mac Mini.)
...and note that everything I just said could apply equally well to Microsoft. They have all the same choices available to them, and there's already the same "nervousness" surrounding the Surface RT. It's just simpler to do the analysis with Apple, since their long-term hardware strategy is more obvious.
In short: I hope you are right, because since 2006, Macs have been far and away the best general-purpose PCs ('trucks' in your parlance) on the market, and migrating off of the Mac and/or jailbreaking and bootlegging the OS and then running it on my own unsupported hardware will be a major pain in the ass. Either option sucks.
But yep, Apple could do every single thing you say. Without breaking a sweat.
As for why? I think Apple would prefer that OS X not run on commodity PCs. They already take half-assed measures to control running OS X in a VM, and to prevent booting OS X on non-Apple hardware. If they could do that more reliably, they wouldn't care about their slightly higher internal costs, and they definitely don't care about making life miserable for their developers (as I've witnessed being one for the past 12 years). But it's just a hard problem for them, and a hard sell to existing users used to PCs being wide-open. But with every single iOS user they add, that sell gets one user easier.
I'd bet that within five years, the percentage of users running unapproved software [EDIT: somehow deleted 2nd half of this sentence:] on new Mac hardware will be about the same as it is on iOS today. It probably won't be impossible, just hard enough not to be feasible for most normal/busy people.
OK, that wasn't short, but in summary: The fact that Mac OS X has been the best power user OS for the last several years wasn't by design, it was just an accident of history and where they got their OS from. Apple doesn't give a fuck about power users, and Apple doesn't give a fuck about trucks. That market is just way too small for Apple to care about -- which is sad for those of us currently in that market.
Because if/when Apple finally abandons Intel and power users (timing that makes sense to me), it will be years before Ubuntu or any other plausible player is anywhere near as good as Mac OS X 10.7. 10.8 still has too many bugs and stability issues, but it will get there. Probably 10.9, too. But after that? I don't think anybody knows, but I am very skeptical.
(I think Microsoft will move in this direction, too, so those Surface RT users are probably right to worry.)
> What people are talking about is locking down what can run at the behest of the vendor. Like how iOS is. Like how Mac OS X would be, if you couldn't disable Gatekeeper.
Apple doesn't control what is signed by devs, though they do control handing out certs to devs. If Gatekeeper were permanently on, it wouldn't mean you could only use Apple-approved apps (ie, the app store), it just means you could only use signed apps (ie, random stuff you download from the internet).
Ubuntu is going all touchy, Windows 8 (although confused) is touch-enabled, the new ChromeBook Pixel looks very touch-centric; it seems that Apple is really falling behind the eight-ball on something they purportedly pioneered.
I can only assume they'll release their new OS (OSXI, OSX.I, X.I.OS, etc.) fairly soon, as OSX in its current form is about as touch friendly as Windows 7 or KDE.
I've heard tales of woe from old geezers of having to toggle their pdp8 bootloader in with panel-switches, in the snow, uphill, both ways. In a related matter, I have it on good authority that this very website runs on a computer only slightly more modern than the pdp8. http://news.ycombinator.com/item?id=5229488