If his site were still up, I would read it and then write this comment. But instead, I will make some sweeping assumptions about what it says based on the title.
We were never their target market. It is a happy accident that what works for their consumers also works great for developers. There were always pain points in adopting the Mac as your development platform. For example, when it first came out, the JVM available for it was woefully old and out of date; you generally had to wait six months to a year to get the latest GA version. Now that the latest is built nightly by Oracle, this is a huge improvement. On Mountain Lion, almost every command line utility (that isn't GPL v3) has been updated: git, ruby, svn, python, etc. At the end of the day, this developer loves that it is still a unix command line with a pretty face. More than I can say for any version of Windows or Linux.
I used to work at CERN in the early Mac OS X days. Apple came around with a presentation to convince us that Mac OS X would be a much nicer upgrade path than Scientific Linux, given the BSD roots, the equal footing between Java and Objective-C for Cocoa development, and the nice GUI applications for scientific work.
I switched from various Linux distributions to Mac OS X in 2003, not so much because of the looks, but because it had support for Linux tools in addition to native support for apps like Photoshop, Illustrator, Cubase, Flash, and many more - basically the creative app building blocks. I still follow Linux news because I liked it as an OS; it just lacked some of the apps that were crucial for me. Nowadays app support has grown (Gimp is still no Photoshop though), but I can't say that I find any Linux distribution particularly visually attractive. I don't like elementary OS much either - I think the current Ubuntu 12+ looks much better.
So for me, using a Mac (despite that nowadays I write Mac & iOS apps) was and is mostly about the software that wasn't available on other operating systems.
That dock showing favourites and applications that are open (with a dot below those that are) and that top notification bar are strangely familiar... Trying to place where I've seen them before.
Unlike the early days of Apple when the company embraced and found refuge in the hacker mentality, Apple has emerged as a consumer company thanks to the wide reach and accessibility of its iOS products.
The thesis is that Mac OS and being hacker friendly are becoming increasingly less relevant to Apple, and consumers are now the focus. That's broadly true; however, as a long-time user I'd say the focus on consumers (not developers/hackers) has been there since at least the late 90s, and possibly the 80s (with a gap in the middle where they floundered).
Re Mac OS, I fully expect them to migrate desktops to iOS completely at some point and just have a few interface tweaks for the mouse/window paradigm. The move of AppleTV to iOS foreshadows this, and the numbers above make it almost inevitable, given the cost of maintaining two ecosystems, and the fact they now have an operations guy in charge. It'll be interesting to see just how far they take developer restrictions and sandboxing on the desktop, and whether that affects their market share at all. I suspect as they are a consumer company that it won't matter to them if they lose the hackers/developers to some extent, and they will continue to attract enough developers to survive at least in the short term, while pulling them ever closer into the Apple orbit. There is a great and growing tension there though between the interests of developers (who ideally would like a cross platform solution and the flexibility to use any tools) and Apple (who want lock-in to their ecosystem).
The broader problem of developing for an ecosystem like this is that it is in the control of one vendor, and you must play by their rules - the Amazon, Google, Microsoft and even Twitter platforms come with similar problems - either you adopt their chosen technology this year, and accept the restrictions they wish to impose, or you're suddenly frozen out and may fail as a result. It's a lesson for anyone building a business on someone else's ecosystem - it's hard to avoid, but does come with dangers.
I'll be interested to see if the open web has a second renaissance as people recognise the deep difficulties of controlled ecosystems - its one great advantage is that it sidesteps the question of control by one platform owner, which sets it apart from all the binary or closed web platforms currently being touted as the future.
I have always used Macs in passing, but in the past year I have really been using OSX a lot, and from what I have used I get the impression that Apple never catered to developers - or else their whole gui system would be very different.
Has OSX ever really been a developer friendly environment? Every single actual gui OS feature I have used is terrible and skin deep.
Having a terrible gui and slapping a linux console on it is not the same as having a real visual OS. Yes having a linux console is wonderful, but it is not an excuse for how completely lacking in features the actual visual interface is. You know, the part Apple is responsible for.
What I mean by skin deep is that it works well for a consumer that isn't very computer literate really and just wants to write documents and use the internet, but if you try to do something deeper than that you basically discover that you can't or it is terrible.
Finder is awful across the board. Yes, you can get a replacement, but what the hell is a gui operating system for if not looking at files in the file tree? I could go into detail about why it is awful, but if you have used it then you really already know. I don't think windows 7 is the greatest os or has a perfect file explorer, but it is crazy how much better it is than the one in osx.
Outputting to multiple monitors and the management of the output is terrible. Currently my understanding is that you basically have to go into the linux console to have any control over it. (Like turning off your laptop monitor, or any kind of management of the individual monitors past which one is located on the right.)
Apple seems to not give a fuck about backwards compatibility across ios and osx. If you have an older mac (not even very old really, less than 4-5 years) and you want to update the OS to the newest one the device can handle, which is an OSX version before Mountain Lion, for some reason the only way to do this is to call apple on the phone and read them your credit card info so they can ship it to you, which will take weeks. This even though those versions used to be available in the app store and the physical store. Literally the day Mountain Lion came out, they basically discontinued any distribution of Lion outside of this crazy call-Apple method. Also, new SDKs for ios break code in undocumented ways, including in features that Apple provides (like the video controller). The simulator can give wildly different output than the actual devices, and each iteration of the ipad/iphone can give its own different output on the same code. This can be, and often is, app-breaking stuff.
XCode is much worse than Visual Studio in too many ways to list. Even something basic like tabs for different files is terrible, and the vi plugin for it is not good either. Also, imo obj-c is a pretty unpleasant language, but that is trivial.
I don't love windows or microsoft at all, but apple does not cater to developers in any way. Even the way they deal with developers being in their little ecosystems is very different. Microsoft XBLIG/WP/Win 8 sign up and publishing system is no cake walk but it is orders of magnitude better than how Apple treats us.
Apple makes great hardware, beautiful amazing hardware. However their software is really awful and basically only caters to consumers, and really unsophisticated ones at that. (Note: Apple did not make linux; any features of OSX you are using from a console are really not something Apple is responsible for. Apple is responsible for the gui, xcode, itunes, ios, and any other little app they made that comes with osx.)
Note: the terminal (Terminal.app) has very little to do with the Linux kernel. It is not a 'linux console'.
Currently my understanding is that you basically have to go into the linux console to have any control over it.
I use multiple monitors (or a projector) daily. For me it has always worked, and it's easy to toggle mirroring, indicate where the Dock should be, change the layout, etc. via the Display Preferences.
Apple seems to not give a fuck about backwards compatibility across ios and osx.
Compared to...? Windows, yes. Linux? OS X is definitely better there. Usually the only manner to guarantee compatibility on Linux is to compile software on an older distribution and/or including the shared libraries with your application. On OS X, I can at least trivially compile for older versions.
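To make the point concrete, here is a rough sketch of how targeting an older OS X release works from the command line. It assumes Xcode's command line tools are installed; `hello.c` is a placeholder source file.

```shell
# Build against the current SDK, but set the deployment target so the
# resulting binary can run on 10.6 and later:
clang -mmacosx-version-min=10.6 -o hello hello.c

# Inspect the minimum OS version recorded in the binary's load commands:
otool -l hello | grep -A 2 VERSION_MIN
```

There is no real Linux equivalent to a single deployment-target flag; you typically end up building on an old distribution or bundling shared libraries.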
for some reason the only way to do this is to call apple on the phone and read them your credit card info so they can ship it to you which will take weeks
What does that have to do with backward compatibility?
XCode is much worse than Visual Studio in too many ways to list.
At least they provide a C99 compiler ;). (And good C++11 support.)
I don't love windows or microsoft at all, but apple does not cater to developers in any way.
clang? LLVM? Instruments? DTrace? Out of the box support for Python and Ruby?
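For what it's worth, all of these are usable straight from a fresh install (plus the free command line tools); a few illustrative one-liners, assuming nothing beyond the stock toolchain:

```shell
clang --version           # Apple's LLVM-based C/C++/Obj-C compiler
python -c 'print "hi"'    # system Python, no install needed
ruby -e 'puts "hi"'       # system Ruby, ditto

# DTrace one-liner: count system calls per process (needs root):
sudo dtrace -n 'syscall:::entry { @[execname] = count(); }'
```

That last one alone is something neither stock Windows nor most Linux distributions could match at the time.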
I don't love windows or microsoft at all, but apple does not cater to developers in any way.
It's funny that you compare Apple, which gives all its development tools away for free, with a company that charges 615 Euro to get a Pro version of their development tools, and suggest that the latter is developer-friendly.
Multiple monitor support is one of the bits of MacOS that most annoys me. Some quick sample problems:
* if you have a laptop connected to an external monitor, and you disconnect the monitor then all your windows migrate to the laptop display. But they don't migrate back again when you plug the monitor in again, so you're stuck with a lot of manual window rearrangement (or third party utilities to work around it)
* the menu bar only appears on one monitor, so if you're dealing with a window on the other monitor there's a huge trek with the mouse to get to the menu. (this is kind of fallout from the fundamental choice that the menu doesn't attach to the window, so I can see why it works the way it does, but it suggests that a UI which worked on the Mac Classic might be being a bit strained in a modern world of multiple large monitors)
* new windows/dialog boxes often appear on the "wrong" monitor
* fullscreening an app on monitor A removes all the app windows on monitor B
So I can see why the parent might be irritated too.
While they do now have a good compiler (clang), there was an (in my opinion) unforgivable period of 4 years or so where we were stuck with a buggy g++ 4.2.
Xcode, Interface Builder and Instruments are free. For a few weeks last year you had to buy them for $0.99, but Apple changed their mind and now they're free again (as they had been previously).
...Apple, which gives all its development tools away for free...
I have to purchase some overpriced PC hardware to run Xcode though. Meanwhile, I can run a complete Windows development environment in a remote hypervisor VM on commodity hardware (and I do).
On OS X, I can at least trivially compile for older versions.
I can write a program today on Windows 7 and go install it on Windows XP, a system that was released over 11 years ago. I can also make a program that will run on Windows 95 or 98. Can you do that from OS X to Mac OS 8 or 9?
How well do Mac OS 8/9 programs run in OS X? Do you really think they run as well as Win 95/98/XP programs run on Windows 7 or 8? Do you actually run programs from that era? Which ones?
It's funny that you compare Apple, which gives all its development tools away for free [minus the cost of a Mac], with a company that charges 615 Euro to get a Pro version of their development tools, and suggest that the latter is developer-friendly.
That's not why people often say that Apple is not developer friendly. There are many reasons but I believe the primary one is that Apple often locks out developers where Microsoft does not. For instance, only Apple can affect NSScreen visibleFrame. Nobody else. That's why every Dock replacement on OS X has the same bug: windows end up beneath the replacement dock when "maximized".
Microsoft also creates tools and frameworks for all types of developer stories where Apple simply has nothing: corporate developers, game developers and add-in developers. (Apple doesn't even have official APIs for developing add-ins for its products, whereas Microsoft created entire markets for Visual Studio add-ins and Office add-ins.)
I have to purchase some overpriced PC hardware to run Xcode though.
That has very little to do with developer-friendliness, since it applies to any Mac or iOS-only application.
I can write a program today on Windows 7 and go install it on Windows XP, a system that was released over 11 years ago.
I know, that's why I asked what he is comparing the backward compatibility to. Microsoft goes to great pains to guarantee backward compatibility; I wish Apple would go so far. But backward compatibility on OS X is definitely better than e.g. Linux.
There are many reasons but I believe the primary one is that Apple often locks out developers where Microsoft does not. For instance, only Apple can affect NSScreen visibleFrame.
Microsoft also has a history of private APIs. In fact, it is well known that Microsoft used APIs only available to them to get an edge over competition. Even Windows RT has such limits. E.g. Microsoft is the only developer that can use the desktop mode on Surface RT tablets.
Microsoft also creates tools and frameworks for all types of developer stories where Apple simply has nothing.
Well, it's no secret that Apple is not really in the enterprise server market. So, no, you'll not find an equivalent of Microsoft's enterprise .NET stack. Fortunately, you can use Java or any mixture of widely-used open source libraries, which are often far easier to compile and use, given that OS X is a fairly regular UNIX.
That has very little to do with developer-friendliness...
The ability to run my development OS in a virtual machine, preferably on a hypervisor, is paramount. Am I supposed to buy one physical machine per configuration that I want to test?
Apple has recently gotten a little friendlier in this regard. At least now you can run OS X in a virtual machine at all, even if it does have to be on Mac hardware.
Microsoft also has a history of private APIs.
Not really...and especially not on the level that Apple does. (Can you name some off the top of your head that are not security related?) It's not just about APIs either. It's usually an utter lack of functionality on Apple's part.
For example, look at Finder vs Explorer. Does Finder have an official/supported plug-in architecture yet? Why not? Power users and developers have been begging for this for over ten years.
What do you suppose Apple's reason is for only allowing the Dock program to affect NSScreen visibleFrame? I believe it's the same reason that Apple took forever to allow people to resize windows from all sides. It's the same reason they never admitted that a single-button mouse is not better - they think they know better than any other developers out there. That's unfriendly.
Fortunately, you can use Java or any mixture of widely-used open source libraries, which are often far easier to compile and use, given that OS X is a fairly regular UNIX.
Apple is not very inviting to a large portion of developers, yet there's (sometimes) an alternative, so that means Apple is friendly to developers? I know that all of the *nix devs out there are on Macs now because Linux failed miserably as a desktop OS...but that still doesn't make Apple friendly to developers.
There's a reason that Apple has this bad reputation with developers, particularly in comparison with Microsoft. Part of it has to do with their business model, part of it is simply hubris and in any case, there's a long history to consult here.
Mac OS X (and iOS) is based on a Unix (NeXTSTEP, which built on BSD), not Linux. This is an important point, as many people expect it to be 'just like linux'. It's not, and it won't be, though they do use some similar GNU tools.
I don't believe they have a terrible UI, though there are some weak points. To claim that every single gui feature is terrible is hyperbole and not very persuasive I'm afraid.
The Finder is not the best file manager, and always has been subpar, though it has been improving. This is a hard thing to do right though given all the different file systems which might underlie each view and the potential for lags with io/network access.
Outputting to multiple monitors is pretty elegant - there's a GUI in system settings which lets you drag monitors around to set positions. This is fairly basic stuff, and if you haven't discovered it I'd submit you need to spend more time with Mac OS before dismissing it as awful. You don't have to use terminal.app to do this.
Re the iOS simulator, I haven't found this too bad (any simulator will not match the device, and it's remarkable how little you notice given this is a different binary) but overall yes their dev tools can be frustrating, glitchy, and surprisingly buggy compared to the polished consumer experience, and I agree xcode is a bit of a mess, though it is improving. Their software for consumers is generally very polished and mostly matches their hardware in my opinion, so it can be a disappointment to find that the tools for developers lack polish.
I have found the display options. As far as I am aware you have to use the console to disable a monitor and you have to give it a bunch of crazy commands.
Some of the navigating between open programs stuff is good. Really nothing else I can think of is. Other parts are acceptable, but really the control panel is wanting (like with the monitors), file navigation and dev tools are bad, removing usb drives is annoying, and I hate itunes for a variety of reasons. Media playback is about as bad as windows; vlc for mac is worse than on windows, but I don't blame apple for that except for the fullscreen issues relating back to monitor control. I also find decompressing files annoying, but on windows that is 3rd party as well if you want something good. I haven't found anything on the mac yet that is as robust as the 3rd party windows tools for decompression.
There is more but just in general the gui is skin deep just like all the software apple makes.
On a laptop, close the laptop lid, or unplug the monitor; some apps also blank the second screen (games, for example). To decompress files you just double click them. Your requirements sound... unusual, so I'm not surprised you are not satisfied.
The way you feel about Mac OS is not a universal truth or even a common experience, and your claims about a skin-deep UI are off the mark - Apple has historically paid more attention to a consistent and predictable UI than any other vendor.
Unplugging the monitor/closing the lid is the most ridiculous solution I have ever heard. Maybe I want to use the laptop keyboard, or the trackpad. This isn't even theoretical, I actually want to do this all the time.
What if I have a bunch of rar files on a usb stick that I want to extract. They can't extract in place because of fat32 file size restrictions. This happened to me yesterday.
If you need to extract some files and can't extract them in place, you can just copy them to someplace where you do have space, or if this is really a common problem for you and you have huge files, change the built-in Archive Utility settings to set the destination (Archive Utility > Preferences). Not a big deal.
Re disabling the monitor, the solution above works for me (for use with an external monitor, where I have a keyboard). Instead of ranting, you should try to explain what you actually want to do. Moving to a different OS requires adapting to the paradigms of that OS, there's no way around that, and Mac OS does not have this concept of leaving a monitor plugged in but calling it 'disabled'. Most people really don't miss it. What problem does leaving a monitor on give you?
Pointing out minor glitches is easy, and there is no OS which has a monopoly on minor irritations.
That is not true. Mac OS absolutely has the ability to do this, but it has to be done on the command line because the gui is shallow.
Sometimes I would like to disable the monitor to improve performance, sometimes I just don't want windows to hide on the laptop screen. Sometimes I want to output video to my tv and not have the laptop screen on. It really doesn't matter.
This is not a "glitch" because the feature exists. It is just a shallow UI.
Because on the Mac, writes to those devices are fast instead of slow. On the Mac, write caching is enabled by default; on Windows it is disabled by default. You can actually use a USB stick on a Mac almost like a normal disk. On Windows it is really too slow, because write caching is disabled so that you can pull the stick out at will when you aren't currently writing.
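The flip side of enabled write caching is that you do need to flush before pulling the stick. A minimal sketch ("UNTITLED" is a placeholder volume name):

```shell
# Force any cached writes out to disk:
sync

# Or do a proper eject, which flushes, unmounts, and detaches in one step:
diskutil eject /Volumes/UNTITLED
```

Dragging the volume to the Trash in the Finder does the same thing as the `diskutil` call.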
I agree with you especially about the Finder. The Windows explorer is much better. The problems in the Finder are:
- unable to see the dimensions of an image
- unable to change MP3 tags and other file properties
- cut/paste is possible but difficult
- in column mode (the one I always use), I am constantly resizing the columns. The Windows shortcut Ctrl + + to automatically resize the columns is something I really miss on OSX
Really? Unable to see dimensions - how about Cmd + I? And you use column view exclusively? Then just select the image; the column on the right has all the file info (I'm using Lion, so not sure about previous versions).
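The bundled command line tools can pull the same info as well; a quick sketch (`photo.jpg` is a placeholder filename):

```shell
# sips ships with OS X; -g reads a named image property:
sips -g pixelWidth -g pixelHeight photo.jpg

# Or query the Spotlight metadata store directly:
mdls -name kMDItemPixelWidth -name kMDItemPixelHeight photo.jpg
```

Not a Finder fix, but it shows the data is readily exposed by the OS.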
Others have addressed some other parts that are nonsense. I will tackle some gaps.
> Also new SDKs for ios break code in undocumented ways including in features that Apple provides(like the video controller).
You make this statement as though it is a widespread issue. But it isn't. You may have simply hit one bug. The fact is that you can still take an iOS 1.0 codebase, change the SDK to 6.0, recompile, and be done. All of the newer features, e.g. ARC, are optional.
> The simulator can give wildly different output than the actual devices and each iteration of the ipad/iphone can give its own different output on the same code.
This is complete rubbish. Common sense suggests that if it gave wildly different output then nobody would be using it. But yet everybody does. And likewise if each iPhone executed code differently people would be buying up old iPhones in droves to test. But again they don't.
> However their software is really awful and basically only caters to consumers, and really unsophisticated ones at that.
You should be careful about who you call unsophisticated when you clearly don't even understand the fundamentals of the OS. Hint: it's Darwin not Linux. And I would say that much of their 'sophisticated' software has been pretty good e.g. Clang/LLVM, GCD, LaunchD, WebKit, OpenCL.
"This is complete rubbish. Common sense suggests that if it gave wildly different output then nobody would be using it. But yet everybody does. And likewise if each iPhone executed code differently people would be buying up old iPhones in droves to test. But again they don't."
We use old iPhones and older versions of iOS to test on...
I don't really see how you could get by if you didn't.
Yep. Also, a lot of kernel code originates from Mach via NeXTStep. Some parts of the userland are from NetBSD as well.
It's really a mixture of Mach, FreeBSD, NetBSD, GNU and Apple's own additions and modifications. So I guess the only proper names for this beast are XNU and Darwin.
It's not just developers. It's pro users of any kind. Graphic designers and video professionals haven't been catered to either. Heck, we're still waiting for a new Mac workhorse that's not an iMac or laptop. Video editors have been duped by the new, iOS-like Final Cut Pro that was not backwards compatible.
Developers are a relatively small percentage, and in some ways not an overly profitable one, as whilst we want the best, we're also prone to pushing machines as hard as possible, for as long as possible.
I've been developing on OS X for something like 8 years now, ranging from C++ dev work up to RoR and some other bits. I've never had any real issues with it as a development platform, never felt like the OS was getting in my way or anything similar. I also heavily use Linux day-to-day, and that's definitely more developer focussed, but it can also be way less productive at times. ML is a solid consumer focussed OS but it's still a very nice development platform as well, but that's my opinion.
I have no empirical evidence that Apple's turnaround was based on developer advocacy, but my friends and family that can afford Macs use Macs due to my recommendations in the mid 2000s.
One thing I've noticed about fellow British developers is many of us don't actually have Macs at home because we simply can't afford them. You'll see us with work-purchased Apple laptops at events, but back home we have our trusty PCs, serving triple duty as games machines and entertainment devices. They might use Linux, Windows, or dual boot; but you won't find many of us dropping £1000+ for a personal Apple laptop.
So now those friends and family with ageing Macs are buying tablets, because they do 90% of what they need to do at less than half the price. Meanwhile, Chromebooks have a certain appeal, and I think Google should look at adapting them to suit developers (beyond switching on Developer Mode and installing another Linux distribution).
I think there is hope for this whole mobile and tablet future if, and only if, it becomes possible to plug your phone/tablet into something with a keyboard, mouse and two large monitors on your desk and work on that (and then take it elsewhere and do the same there). On top of that, the device optionally needs to allow you to install and run any software you want without restrictions, and basically give you unix shells and a desktop UI when plugged into the gear mentioned above. Of course, the CPUs and RAM also need to be on par with current desktops, but it's kind of realistic that this can happen, and the actual desktop box could be reserved for things that require some extra computing power.
sounds a lot like the Surface, minus the unix shells, that is..
I guess the real question is whether it is possible to simplify computing for the general customers while at the same time preserving the ability to tinker with the computer comfortably for the developers..
So things have actually improved tremendously since then, with free and improving dev tools, open-source Unix underpinnings etc. The end of the world will be postponed...
How do you create software on a tablet? (Not for a tablet.)
Consumerism is all the touch interface support, dropping the desktops and laptops, dropping the workstations, and paving the way for developer-unfriendly tablets, glasses, cellphones, you name it.
The problem is, tablets do not provide the native, feel-at-home environment for programmers. To me, and probably to any serious developer out there, desktops (maybe with a tiling WM) provide the best dev environment.
If you don't believe it... Try to give your desktop a tablet like interface with something like Gnome3 or metroUI and then see the difference ;) .
The touch interface is not designed for power users. It is designed for the people who like to have beer and listen to music.
Having been persuaded in another discussion on HN that Chromebooks should not aim to be for developers, I now read complaints that Apple does not target developers either. I am curious: how is the situation with Win 8? What are the future machines for developers?
This is a facetious question if I ever saw one. Apple machines should be developer-oriented; the author is complaining that they've lost/given up on the plot for the power users who catapulted iOS to popularity by developing for it. If you are developing for Mac or iOS, use a Mac. If you're developing for the web, maybe a Mac.
The thing is, you were complaining about the lack of an octocore, 8GB RAM chromebook, which is at odds with the purpose of a chromebook. You can develop web apps on a Mac. You can even SSH to a Mac or Linux box from your chromebook. The chromebook model is to provide the bare minimum a user needs to get into the Google ecosystem, so that the hardware is attainable for low-income users, and becomes a complete commodity for businesses. Nowhere in there do they account for power users; it's not part of their plan, and they don't need developers on their platform to succeed.
"The chromebook model is to provide the bare minimum a user needs to get into the Google ecosystem, so that the hardware is attainable for low-income users, and becomes a complete commodity for businesses." Could you point me to where you get this, if it is not your own idea? On the contrary, I always thought ChromeOS is the model where the web is the app, and all you need is a web browser to do anything you want - which includes "development". As a developer using Firefox/Firebug, Chrome Dev Tools, and online editors like Ace for my everyday job, I am myself developing on a cloud IDE to do anything I want online.
The question is, why can't I use only a browser (and then why not a Chromebook) to do development? And what is facetious here?
You're conflating this post, which is about Apple more or less mistreating developers, with Google's conscious decision to produce $200 ARM thin clients. They're not really the same; Apple still wants you to develop on their hardware - they more or less shackle you to it.
Based on what you described your development stack is supported on Chromebooks. That's cool. Eclipse used to run on my 11" netbook as well, it doesn't mean it was a good way to develop, nor was it designed to be.