The fact that you have to build an iOS app on a Mac, that you can't even set up a CI build on commodity hardware, or in the cloud, is pretty fundamentally developer-hostile. I'll start to feel wooed when they take steps toward fixing that.
Thank you for bringing this up since it's been massively painful for years. As an example, when I first started developing for iOS back in 2009, I had to buy a $500 used macbook before I could even get started coding. That hardware was obsolete within about 3 years (couldn't update to the latest OSX or XCode), so then I had to pony up another $800 for a refurbished mac mini just to still be able to develop. Meanwhile, Android Studio runs on... actually I don't know what it doesn't run on.
And yes, I know that leasing time on a mac for dev work is an option, but that gets rather pricey over time ($1 per hour or $20/month is not cheap if you need to work on and maintain apps for multiple years). And an RDC experience will never be as good as having the actual computer in front of you, even on a fast connection.
> That hardware was obsolete within about 3 years (couldn't update to the latest OSX or XCode)
I call shenanigans - latest OSX and XCode running on a 2008 Macbook right here on my desk. I plan to get 10 years usage out of this machine - that's value for money right there.
In my experience Apple laptops tend to hold their value pretty well so I'm guessing a $500 machine was 4-5 years old when purchased. So the fact OP could use it for 3 more seems about right. Personally I upgrade around every 4 years and the laptop is still in great condition for most people and sells for well over half what I paid for it.
Unfortunately this seems to be less and less true over the years. When I got my MacBook about 4 or 5 years ago, it only had 4GB of RAM and a pretty slow hard disk. I have since upgraded it to 12GB of RAM and a pretty good SSD, which is why I can still use such an old MacBook. For the new ones, I have to decide on the RAM and disk upfront and can't replace them anymore (this is especially worrying given the limited lifetime of SSDs).
I wish Apple made their pro products more bulky so we can upgrade the hardware and let the air products be the sexy, thin ones for Starbucks hipsters.
As a long time mac user, I do agree that the lack of upgradability in their Pro line is a problem, and it probably should be the defining factor of Pro versus the Air/Straight Macbook line.
That being said, Apple has gone to pretty crazy lengths to ensure that its modern OSes work on legacy hardware; I was a bit shocked to hear from my parents that they were able to install El Capitan on my dad's 2008 iMac, as I was wary about putting it on my 2012 Macbook Air due to performance concerns. I don't know if this was a conscious decision by Apple to cater to the idea of "Apple = Longevity", but it's very nice to know that a single purchase takes you pretty far. I think Apple could take it a step further and really win over more people if they added the millimeter or two to the Pro line and allowed for the upgrades.
It's really a back and forth thing for me; on the one hand, I wish I had more options and the ability to customize later on. On the other hand, since ~2010, the appliance like nature of the portables has worked so seamlessly that my only complaint comes from the lack of an option to upgrade, not the need for an upgrade. For my previous Apple laptop, my own incompetence when trying to do a repair hurt the laptop's longevity; had I not had butterfingers when working on the insides, I'm sure that I could have had another 3 years with my already 5 year old Macbook.
Performance has essentially plateaued for the last decade in laptops - aside from sticking SSDs and more memory in them, there isn't a ton of difference in specs. I have a 2006 Windows laptop that runs better now on Win 10 than it did originally on Vista. Except for power consumption, it's the same as my two year old laptop.
Of course, with a Windows PC, you've got a variety of options, and can buy something that is actually upgradable, rather than being locked into what Apple deigns to give you.
I believe the SSD is a pretty large aside, isn't it? GPUs have also vastly improved. It's just CPUs that stagnate at the "good enough" level.
While I do agree with most things, there will be a need to upgrade something in the future.
When I bought my MacBook, 4GB of RAM was plenty, but it isn't anymore. Chrome takes too much RAM, and with 16GB I don't have to worry about anything. Furthermore, Apple's new laptops are all dual core (until you buy the bigger one). This is also a dealbreaker for me – quad cores are really awesome when using VMs. Sure, a CPU cannot be upgraded on most laptops (and certainly not on Apple's, but I can't really blame them for that).
My point is that I'm afraid Apple's Pro series has drifted into a state where lifespans will shorten while prices go up.
And Apple's OS performs really well until you use third-party applications that expect better performance than your sad 5-year-old MacBook can deliver.
Oh I'm not suggesting that it will never peak higher than what Apple offers; again, I do agree that it would be nice if the Pro line offered a little more to users looking for more oomph.
It's just that, again, by and large with everything in the Mac lineup right now, you throw something at it and you're good to go. Even my 2012 Air can do moderately advanced video editing in Premiere, the only real bottlenecks being local storage and the CPU taking a bit to render video.
The current line up is not a perfect solution for absolutely everything, and Apple's power option in the Mac Pro line is just unacceptable for most people, even if it does bring a lot to the table. But in most cases, Apple's line ups do extremely well for a large majority of things you'd want to use a computer for. I too would enjoy a quad core in the Air/Pro line, or at least the Pro line, though I suspect the issue is heat, and as I said earlier, I'd happily take a bit of thickness to accommodate a heftier processor.
I certainly don't endorse letting Apple tell me what hardware I need; I also don't have too many complaints with what they've given. Price exaggerations aside, I find the price tag for the Macbook Air/Pro lines appropriate for the base models, and comparable to other 13" sized machines with similar specs and amenities (for lack of a better word). Such is sort of the nature with Apple and being a long time user - there's a lot Apple does that bugs you, but at the same time, there is a lot of good you get with the hardware and the OS.
The question is why the software takes up so much more. Firefox is the pain point for me on my 2008 Macbook, for everything else, it's perfectly snappy. I block ads, don't play games or look at animation heavy sites, but even doing nothing but displaying pages, it sits at 2.5G RAM (out of 4) and 100% of one of my two CPU cores. I'd run an old version if it wasn't for the security updates. Utterly ridiculous.
I think this sentiment (pro users want to upgrade their hardware) is inaccurate for most professionals. Hobbyists and freelance/self-employed may want to upgrade their hardware but a professional generally doesn't care. They just want a machine that works and a reasonable path to a new machine that doesn't affect their productivity. That upgrade path is really smooth on OS X. Migration assistant plus thunderbolt cable plus target disk mode makes it really easy.
8 years from a laptop is still more than you would expect from almost any other manufacturer, save maybe IBM/Lenovo, though I have heard the build quality of Thinkpads has really deteriorated lately.
2009 Thinkpad T400 still working fine, although with updated RAM and SSD. Another 2012 T430s also working fine.
Refreshing current Apple computers won't be that easy. Even if you are capable of opening it, compatible M.2 drives won't be easy to find and you won't be able to swap soldered RAM. I'm not sure that my 2015 rMBP will be able to serve so long as that T430s will.
Not shenanigans, it happened to me as well: couldn't upgrade OS X, so couldn't upgrade to the latest Xcode, and Apple required using the latest Xcode... so I had to buy a new Mac.
That was a long time ago though, now Apple seems to support HW for much longer.
This is shenanigans. You only need the latest XCode to take advantage of bleeding edge features released in the latest iOS. 90% of apps never take advantage of the bleeding edge; you can continue to develop apps for all versions of iOS - including the latest release - with a 5 year old laptop.
I have MacMini Late 2009 and it can hardly run the OS itself, let alone something else :) It's practically unusable. It doesn't have much memory though, only 2 GB.
I still use my 21" iMac from 2007. It does the job as a household computer; usually my mom uses it for web browsing and email, but I can do dev work relatively nicely on it (El Capitan installed, btw).
It will hit the 10 year mark in May next year. I shelled out $1.6k at the time for it; 2 years ago I upgraded the RAM (from OWC), installed an SSD, and upgraded to a Magic Mouse, and it works really nicely! I think I've used the hell out of it, and it is an amazing machine!
While your concerns and gripes about the Apple dev environment are 100% true, there are some reasons why they are so. And in a few years I think that will change; Swift will change it! Imagine, you can compile Swift code on Linux now! It's nowhere near usable for serious development work, but I think with Swift, Apple addressed that problem too, so in 5 years you won't need OS X exclusively to develop for Apple devices. I'm sure it would still be the best way to do it, but who knows, we shall see.
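To make the "compile Swift code on Linux" bit concrete, here is a trivial sketch; the compile command in the comment assumes the open-source toolchain from swift.org is installed, and anything UIKit/AppKit-related is of course still Apple-platform-only.

    // hello.swift - builds on Linux with the open-source Swift toolchain:
    //   swiftc hello.swift -o hello && ./hello
    import Foundation

    print("Hello from Swift on \(ProcessInfo.processInfo.operatingSystemVersionString)")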
> And in a few years I think that will change, Swift will change it!
The compilers for Swift and Objective-C are command-line tools that are (relatively) easy to port to Linux.
But for a real iOS dev environment or CI server you'll need the iPhone Simulator, which is built on a large stack of proprietary frameworks. I don't see Apple porting it to other platforms anytime soon.
The trivial fix would be to allow virtualisation of OS X on commodity hardware.
On the other hand Apple makes virtually no hardware that would be a good fit for a datacenter with many VMs on powerful multiprocessor hardware available in a compact form factor.
If you wanted to program for Windows you still had to buy a computer. You could argue that you could get a Windows computer cheaper than a Mac but in the long run that difference is insignificant when you're spending many years with the same hardware.
I'm still programming iOS apps on a 2012 Macbook Pro because I'm not rich. Performance is still pretty good!
Android Studio runs on a lot of hardware but it's slow unless you have good hardware, which costs money.
That's true, although Windows can be run in a VM. I'm not saying that would be the ideal means of developing for Windows, but it's an available option, and MS even offers free images for testing on various OS versions. Meanwhile OSX can only be run in a VM on Apple hardware.
So, I haven't tried actually doing this, but is there a real reason you can't load OS X up on a VirtualBox VM, etc., or is that just what Apple tells you you can do with the software you bought from them?
OS X basically prevents this with the default install; you would have to get a hacked OS X image to do it, and that is a legal grey area depending on your country.
Forgive the rambling but it's getting late and I'm sleepy:
I'm just saying that people are complaining too much about having to buy a Mac to write code for iOS. Even if it's not strictly a must, you should buy a Windows machine to develop for Windows, but you don't hear anyone complain about that. You have to buy an Oculus Rift to develop Oculus Rift software. You have to buy an Android device to test your Android program, etc. You don't have to buy a Linux machine to write code for Android, but if you don't have a computer at all then you have to buy one. There's stuff you have to pay for to get into programming... to build software that will make you money. Windows/Linux developers are complaining that Apple didn't release free tools for their platform, and they're overreacting.
You can easily write good Windows applications on Linux and OS X, and Microsoft distributes standard virtualization images for free which you can fire up in a VM to test your Windows applications. In addition to those virtual machines, Wine implements much of the Win32 library and can be used to test your Windows applications with some minor limitations. For iOS you cannot run your build process on a remote machine without jumping through hoops, you cannot build without OS X at all, and you cannot sign code for your own device without their IDE and a certificate from their website.
I will stand up and be prickly here and say: please do not judge the world by your (apparently) Apple-centric circumstance.
I (personally) am a nomad. I can not carry 27 computers with me. My world would be best served by having one computer that does all the jobs. Apple hinders me in this; it does not help.
Yes, I can (and do) do Hackintosh VMs, but it makes me feel dirty.
Whenever I have to deploy on an iphone, to test I buy a bank of iphones (and give them to the neighborhood children when I am done). That should be good enough.
Run Windows on a MacBook? They're pretty excellent for travelling. Just the other day I dumped half a Caipirinha over the keyboard and it booted right up after two hours in the Cuban shade :)
You need some precise combination of Windows and Visual Studio to target a given version of Windows Phone. At least if you want to use the emulators, you'll need Windows 8 to target WP8 (otherwise Win7 is fine, IIRC).
Windows 10 is required to build Win10M apps targeting UWP. You can still use WP8 tools (thus Win8) to make WinPRT or Silverlight apps.
That's the theory. In practice, the latest insider builds have a bug which prevents app deployment, and for some builds now the Silverlight app runtime has had its own bugs (keyboard-related, for example).
Having used these tools since WP7, I really feel a decline in quality in the mobile OS and I wouldn't recommend bothering anymore.
I've never heard of leasing, but $20/month is a steal. At $20/month it would take 7.5 years to pay off an $1800 laptop. If you were leasing, in that time you would have upgraded a few times.
I would note that, if you want to Hackintosh specifically for the sake of compiling OSX/iOS executables, it's far easier to set up a computer to run a bare-metal hypervisor that knows how to run OSX virtual machines (e.g. VMWare ESXi) than it is to set up the same computer to run OSX directly. (Source: did both over the last month.)
IIRC I had "build platform not supported" errors coming back from iTunes Connect when I tried submitting a build from my Hackintosh.
I guess Apple can detect if you're running on a Hackintosh. For example Netflix won't play on my Hackintosh without installing Silverlight. Invalid DRM keys or something.
Yes, I did. As others have pointed out, that tends to go poorly if you plan to actually develop, test, and publish OSX or iOS apps for a few reasons. I've read reports of some people getting it to work, but it's generally a hugely painful process and tends to break whenever there's an update.
I don't even develop platform-specific apps and I'm happy to pay the cost of a new machine every few years. Three machines, purchased new, have provided my sole income source for over a decade, at a cost of about $1000 per year. Not too bad considering what I profited after paying that ghastly fanboy tax...
Well, you should be able to recoup those $1300 you spent on the Macs pretty easily if you do professional iOS development.
And if you're not in it for the money but just doing it as a hobby: then don't develop for the iPhone if you don't like the costs. (Even though $1300 over 5 or 6 years is a pretty cheap hobby in my books.)
I don't disagree with your general point that it's ridiculous how locked down the tooling is, but I think you can build an iOS app on a CI in the cloud, can't you? TravisCI and CircleCI at least offer options for this, I believe.
They said CI build on commodity hardware. Yes folks like Circle have CI for iOS and OS X apps, but (1) it's on Apple hardware and (2) as a result, way more expensive.
Agreed that it's a pain. I only have experience with OS X builds (not for Cocoa apps, but for a CLI tool) on TravisCI, but I can tell you that their build box offerings are much weaker than what you can get for Linux (presumably because of the cost).
I completely agree, just wanted to add that there are options for iOS / macOS CI in the cloud, like ours ( https://www.bitrise.io - we have a free plan for freelancers ) or Travis / Circle. Not on commodity hardware of course, and you most likely won't have macOS as an option on AWS/GCE/Azure in the near future either.. (we use a private vSphere system right now)
Cross-compiling Windows apps from Linux using Mingw has been a thing for a very long time. Mingw itself is 18 years old. There are also options for using LLVM or compatibility layers like Midipix. So, no, Windows is not required and hasn't been for a long time.
As far as Visual Studio, it's not required, either. 3rd party compilers have always been a thing for MS operating systems. In fact, for a long time, you weren't even able to use VS to build drivers for Windows. Microsoft is also currently going out of its way to create neutral tools to enable developers to use the workflow and tools they desire. Omnisharp exists for providing intellisense to editors and IDEs. VS Code is a new, cross-platform IDE that works for both node and C#, among other languages. MonoDevelop still exists and can be used to write Windows-compatible applications.
But for Apple, you have to own a mac or, as is common, lease a mac to build and sign on. So, no, these things aren't even in the same realm of control. MS treats Windows as its primary platform for innovation. Apple treats OSX as the only platform in existence.
I have a feeling it wouldn't be that difficult to do codesigning on another platform. It's a case of SHA-ing all the things, signing the digests with the cert's private key, and rendering a plist with the signatures. IIRC there is a code signature load command in the Mach-O headers too?
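For flavor, the rough shape of that flow might look like the Swift sketch below. This is purely conceptual: it uses CryptoKit and an ECDSA key for illustration, whereas real Mach-O signing produces a CMS blob referenced by the LC_CODE_SIGNATURE load command, and none of these names come from Apple's tooling.

    // Conceptual sketch only, not Apple's codesign: hash the binary page by page,
    // sign the digests with the signing identity's private key, and serialize the
    // result as a plist.
    import Foundation
    import CryptoKit

    func sketchSign(binary: Data, key: P256.Signing.PrivateKey) throws -> Data {
        let pageSize = 4096
        var pageHashes: [Data] = []
        var offset = 0
        while offset < binary.count {
            let page = binary[offset ..< min(offset + pageSize, binary.count)]
            pageHashes.append(Data(SHA256.hash(data: page)))
            offset += pageSize
        }
        // Sign the concatenated page hashes (codesign signs a structured CodeDirectory).
        let signature = try key.signature(for: pageHashes.reduce(Data(), +))
        // Render a plist carrying the hashes and the signature (illustrative format).
        let plist: [String: Any] = [
            "pageHashes": pageHashes.map { $0.base64EncodedString() },
            "signature": signature.derRepresentation.base64EncodedString(),
        ]
        return try PropertyListSerialization.data(fromPropertyList: plist, format: .xml, options: 0)
    }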
The issue is not so much technical as it is about legality. Codesign is itself open source (Darwin), but Apple TOS forbids you from building OSX/IOS apps to be distributed on the App Store on non Apple Hardware. I guess you could do it anyway, but I don't think people would risk it for the price of a mac mini...
> Cross-compiling Windows apps from Linux using Mingw has been a thing for a very long time. Mingw itself is 18 years old. There are also options for using LLVM or compatibility layers like Midipix. So, no, Windows is not required and hasn't been for a long time.
Sure, develop for Windows without using Windows. But why spend time developing for a platform that you don't use regularly? What about testing? How far can you go without Windows?
> As far as Visual Studio, it's not required...
VS or Windows is not required but the majority of developers develop using VS on Windows. You could cheap-out by avoiding these but you're being penny-wise and pound-foolish.
> But for Apple, you have to own a mac
But for Windows/Linux you still have to buy an Intel machine. You just don't want to buy another machine to develop iOS or OSX apps.
Anyhow, I think Windows/Linux owners are complaining they have to spend more money to develop on a different platform. You never hear this from Mac owners who want to develop for Windows or Linux.[Edit: Not trying to start a flame war. I'm really trying to understand the outrage...]
Because neither Windows nor Linux restricts which hardware or VMs you can install them on like OS X does. Obviously you won't hear it from Mac owners, because they can still develop for Windows or Linux; the inverse isn't true thanks to Apple's artificial restrictions.
>I am developing iOS apps on a 2012 Macbook Pro. Those aren't expensive nowadays.
I would be willing to bet that a used Thinkpad is even cheaper, and has the benefit of having more upgrade options.
To make a car analogy, your argument boils down to, "Well, you need a car anyway, so why not buy a Lexus?" I think a lot of the people replying to your posts are arguing that requiring luxury hardware (and, let's face it, Macs are luxury hardware) adds a pretty huge barrier to entry for software development. Trying to argue that the price difference between Apple hardware and Windows/Linux hardware is "insignificant" in the long run just doesn't cut it. Even the cheapest Apple laptop is a thousand dollars in the US. It costs even more overseas. Meanwhile, you can get similarly spec'd Windows machines for a half to three quarters of the price.
There is a Mac premium, and it is a barrier to entry for developers. You can't make that barrier disappear by blindly asserting that it doesn't matter.
I'd like to hear more about this. Are you saying you've set up an OSX VM on non-apple hardware? I haven't tried in at least a year, so maybe things have changed, but I have put a significant amount of effort in trying to get an OSX VM running on non-apple hardware and was not even close to successful.
Thanks for this. Searching around for the title of that article[1] opened up plenty more resources as well. I ended up grabbing a refurb'd Mini last time I needed to do OSX-specific testing locally, but it's great to know progress is being made to open the possibilities on non-Apple hardware.
I've gone down that path; it was a big pain—Apple just doesn't bother to make drivers for what most hypervisors consider "generic" hardware. You basically need to do the same stuff Hackintosh builders do, to your VM, to get even some of the features working.
By contrast, you can just create an OSX VM in VMWare Fusion in OSX and then copy it over to VMWare Workstation on Windows (or an ESXi server, or whatever else) and it'll work fine after being convinced[1] that it's allowed to try.
You can set up build servers for your app on Microsoft Azure, or AWS' Windows Server instances. Microsoft produces articles on how to do Windows development on a Mac (Bootcamp, Parallels, etc.) [1]. Microsoft has shown commitment to bringing its core offerings to other platforms. Apple has in no way done that. You literally, absolutely need an Apple computer for every step of the development process for iOS. Your comparison is not valid.
Is it really developer hostile that Apple is not bending over backwards to provide free tools to Windows and Linux users? I think it's prudent of them not to waste money on such a small segment, some of which vehemently hate Apple. And when you talk about build servers, you are still spending money on some machine to build your iOS app. You can get pretty good used Mac hardware if you really want to. As for all those developers sitting on the sidelines refusing to develop iOS apps because they can't set up a build server on Azure, etc.: well, you're just making excuses.
I've been in computers since the early 90's and as far as I can see, Apple has always been hostile towards developers (and users).
> For all those developers sitting on the sidelines refusing to develop iOS apps...
I've made a ton of money building Ionic and Xamarin apps and recommend those to anybody building for iOS because both of them will let you spend as little time using OS X as possible. The way my workflow is now, I could totally rent a cloud Mac and just compile and publish stuff from there.
Anyway, even though I think OS X and iOS are quite possibly the worst operating systems I've ever had the displeasure of using and even though I know for a fact that Apple is hostile to developers and users, I don't mind making apps for them because I figure anyone who uses their stuff must not care about wasting money and it should be easy for me to grab some...and it is :)
And, like Spotify, all of my apps are subscription based with out-of-app signup, so Apple doesn't get 30% of my money. I also only buy used Macs because Apple doesn't get your money that way either. Right now I'm running a Mid-2012 Mac Pro with 32GB RAM that I got for ~$1,200. It was about double the money that I spent for a similar PC server, but I made all that back by taking money out of Apple's ecosystem and not putting it back in, so I feel like that's a win for everybody (except Apple of course, but hey - fuck them - if they're going to be hostile to me, I'm going to be hostile to them.)
Well, Xcode is free. Apple Dev membership is free. You need a computer to develop software on. Windows machines aren't free. A legit Windows OS license is more expensive than OSX.
A developer account is free, the $99 is to publish in the App Store, receive technical support, and receive free versions of various things (e.g. OS X Server).
> the tool chain is incredibly expensive, obsoleted frequently, and requires lots of maintenance.
The toolchain isn't expensive or obsoleted frequently. A new Mac mini is $499 and can last more than five years. When you consider a professional iOS developer can easily make $100/hr, a cost of $100/yr for equipment is beyond negligible.
Not sure where you get "requires lots of maintenance" from – he doesn't seem to have said anything about that?
$499 is a big investment for the majority of the population and assumes you have a monitor and an iOS device already. It's not a sum many people will drop just to try it out. For a lot of people it means completely adopting the platform.
A new Apple device could potentially last you 5 years but Apple has also been known to obsolete hardware in as little as a year.
The majority of developers making iOS apps do not make $100/hr, and that's not assured income, it's potential.
If you happen to work for a professional shop with a team, chances are you want to do CI or automation. It's a pain. If your IT department manages your machines then they need someone dedicated to maintaining enterprise security policies on your Macs, it's a pain for the user and the admin.
> $499 is a big investment for the majority of the population
I think labelling a $499 computer as "incredibly expensive" isn't accurate when you're talking about typical Western countries. That's a pretty normal cost for a computer.
> and assumes you have a monitor and an iOS device already.
Most platforms require you to have an instance of the platform available to you. Yes, people who want to develop for iOS should reasonably be expected to have an iOS device, just like people who want to develop for Android should reasonably be expected to have an Android device, and people who want to develop for BlackBerry should reasonably be expected to have a BlackBerry device. Apple are not unusual in this respect – you appear to be holding them to a different standard to other platforms here.
> Apple has also been known to obsolete hardware in as little as a year.
I can't think of a case where this has happened – can you give an example?
> The majority of developers making iOS apps do not make $100/hr, and that's not assured income, it's potential.
I'm trying to put the cost/benefit into perspective. The cost of iOS development is minuscule in context.
> If you happen to work for a professional shop with a team, chances are you want to do CI or automation. It's a pain. If your IT department manages your machines then they need someone dedicated to maintaining enterprise security policies on your Macs, it's a pain for the user and the admin.
Again, you seem to be holding Apple to different standards than other platforms. If you have employees using computers, then IT needs to manage them. Yes, of course – but this is true for Windows developers, Android developers, and all other developers as well.
That might be the normal cost of one of 50 different Dell laptops but for Apple it's their lowest end unit and it lacks a display.
> Most platforms require you to have an instance of the platform available to you.
Nope they don't, but it is better if you do. It's also considerably cheaper.
> I can't think of a case where this has happened – can you give an example?
The PPC MacBook after the Intel Switch, the first gen Core Duo MacBook after the 32-64bit switch, non-retina iDevices... I'm on a phone or I'd keep going.
> I'm trying to put the cost/benefit into perspective. The cost of iOS development is minuscule in context.
If your context is "successful developer with a hit app" then yes, but the reality is that the majority of devs won't be anywhere near that successful.
> Again, you seem to be holding Apple to different standards than other platforms.
See that's where you're confused. I'm holding them to the same standards. Microsoft offers tons of free tools for managing large pools of enterprise machines, maintaining their patch levels, group policies, access control, etc. Linux has similar support.
Apple, on the other hand, makes you invest in countless 3rd party tools to achieve the same level of control. And at the end of the day it's still flaky and error-prone.
> That might be the normal cost of one of 50 different Dell laptops but for Apple it's their lowest end unit and it lacks a display.
Now you're shifting the goal from the cost of the toolchain to the cost of Apple's lineup as a whole. It doesn't matter what the top of the range Apple lineup costs, to get started with iOS development, you just need the cheapest Mac mini.
> The PPC MacBook after the Intel Switch, the first gen Core Duo MacBook after the 32-64bit switch, non-retina iDevices... I'm on a phone or I'd keep going.
Wait, you've moved the goalposts again. We're talking about the ability to buy a computer and continue to use it for development for a reasonable period of time. We're not talking about your computer not being the latest model. Just because Apple introduced retina screens on the iPhone 4, it doesn't mean that you had to go out and buy a new Mac to develop for it.
> If your context is "successful developer with a hit app" then yes, but the reality is that the majority of devs won't be anywhere near that successful.
I'm talking about typical professional iOS developers. The majority of them are successful. I suspect you're talking about app entrepreneurs, which is a whole different ballgame. Most iOS developers aren't making their own applications, they are making applications for other people. And yes, the cost of iOS development is trivial for them.
It's true, lots of applications don't make money – that's because they fit into the business in other ways. For instance, your bank's application is free, but I can assure you that the developers who built it aren't worried about the cost of an entry-level Mac mini.
The context is this: If you do this professionally, then you can afford the developer toolchain easily. If you're not employed as an iOS developer and you're just making applications for yourself? Sure, I can see how you'd consider it a moderately expensive hobby if you didn't already own a Mac. But we're really talking about two different things there.
> I'm holding them to the same standards. Microsoft offers tons of free tools for managing large pools of enterprise machines, maintaining their patch levels, group policies, access control, etc. Linux has similar support.
You've shifted the goalposts again. You were talking about having to hire somebody to manage Apple computers. IT manpower is required for any platform, so singling Apple out as special for requiring IT manpower is holding them to a different standard.
I really don't think the enterprise is relevant to your argument here. If a company employs enough iOS developers to require large pools of machines and more than a basic $2/month MDM solution, then they really aren't going to consider a $499 machine "incredibly expensive", whether it comes with or without a monitor.
Wouldn't it be a more fair comparison with Android? You can develop Android apps on Windows, Mac, and Linux (though in my experience the Mac is the best supported of the three).
I'm running into this problem right now for my app, and there are reasonably priced cloud options out there. Mac OS doesn't run on commodity hardware so it isn't surprising that you can't set up a CI build on commodity hardware. Put another way, Mac OS is not like Windows and Linux: the software is inextricably linked to the hardware.
You are correct. I started Android development because I did not want to work on Apple hardware.
I don't like the "just different" attitude concerning some basic things (Why can't I order a MacBook with fully printed special-character keyboards AND none of those non-standard key bindings (especially horrible on German keyboards)? Why can't they offer a version of Finder that is not insanely infuriating to people used to Windows or Ubuntu?) that are just hostile to people who don't drink their brand of kool-aid.
I'm happily developing on Windows and if I ever feel badly served by this, switching to Ubuntu is a very warm well trodden road.
I really don't get what people see in Macs, other than the tribe factor.
You can develop your application with Qt on whatever platform you want, but you'll still need Apple hardware when it comes time to build and test on iOS, because Qt still has to build and link against Apple's SDKs and use Apple's codesigning tools to produce builds that are actually runnable.
Given that the Swift toolchain is now open-source and being ported to other platforms, I would expect that the building and linking issues will be solved fairly shortly. I wouldn't be surprised if there was something official announced at WWDC.
That's a feature, not a bug. What possible benefit could there be to helping people who are hostile to their platforms develop for them? If you want to develop for Apple platforms, you do so from Apple platforms. Fair and logical.
I think that if the hurdle of using OSX to develop for Apple platforms is so high for someone that they'd describe it as "fundamentally developer hostile", then yes.
It just doesn't make sense to me why anyone thinks they're entitled to have Apple build and support a complex developer toolchain on their platform of choice, and why, if Apple doesn't, it's a sign of "hostility".
I don't think Apple wants that person building for their platforms. As an Apple user and developer, neither do I.
> What possible benefit could there be to helping people who are hostile to their platforms develop for them? If you want to develop for Apple platforms, you do so from Apple platforms.
That suggests you think every developer who wants to publish iOS apps should either do so from a Mac or be considered automatically hostile towards Apple. That's the point I was touching on.
I personally wouldn't call Apple's choices hostile and I don't think anyone is entitled to have cross-platform tools. But you made it clear you think I'm an "enemy" as long as I'm not on a Mac, and there would be zero benefits in allowing me to publish apps to my phone.
They may not be "hostile" but they are most certainly more likely to treat iOS as a second-class citizen or an afterthought, which is obviously not all that desirable. Depending on who you ask a half hearted port is worse than nothing at all.
By requiring a Mac to develop for iOS, they're forcing a certain level of investment which makes it more likely that developers targeting the platform make their apps worth the time spent (for both the dev and the consumer). By opening it up they're welcoming those who port just to tick a box.
Sure, somebody who doesn't have an iPhone and yet wants to develop apps for the iPhone is likely to treat iOS as a second class citizen.
But you are saying that if somebody bought the iPhone on launch day, has upgraded to every new model as soon as it came out, and is obsessed with all things iOS... then he is still gonna treat iOS as a second-class citizen because he doesn't own a Mac running OS X, which is different hardware and a different OS from the device he wants to make an app for??
That's a much more reasonable argument. It's also likely the real reason there's no cross-development support.
It does open another thread of discussion, though: as far as I'm aware, there's no proven correlation between buying a Mac and caring about the quality of your iOS product.
I could just as well say that companies that just want to tick a box with their sub-par app farms invest in several platforms much more easily than starting devs who can't afford multiple setups and pride themselves on craftsmanship.
Which one is closer to reality or has the bigger impact, I have no idea.
It's not an arbitrary restriction. All the tools would have to be reimplemented. Take the simulator, for example. It's implemented by translating iOS system calls into OS X system calls; it's not an entire OS running on an emulated CPU. That won't work on Windows. Would Apple have to implement an emulator? But that would be slower for developers using a Mac. Should they have to support both?
There are significant downsides. You can argue that the upside is still worth it, but you can't dismiss the downsides as "completely arbitrary".
AFAIK this is incorrect: the simulator runs x86/x86-64 binaries and links against x86/x86-64 versions of Apple's frameworks (including UIKit). There is no translation of syscalls happening; the instruction for a syscall, e.g. SVC 0x80 on ARM, wouldn't be happening in your binary but in a framework your binary (indirectly) links to.
Also, the simulator is basically an x86 version of iOS; look inside for yourself and see. Someone correct me here, but I don't see why it would be impossible for Apple to ship a simulator for Windows; n.b. how MS has shipped Linux ABI translation.
The simulator doesn't run a full x86 version of iOS; it runs an x86 version of the iOS userland on top of the host kernel (Darwin) and components shared between OS X and iOS.
A port of the simulator would be possible but it'd be a good deal heavier than its OS X counterpart since it'd be running the entire OS instead of just the topmost layer.
You're again missing the point that that would be heavier and so perhaps not as good for developers that choose to use macOS.
The question is not whether "the simulator is basically an x86 iOS". The point is that requiring macOS to build iOS apps is not an arbitrary restriction — it's a tradeoff. You can argue for or against Apple's decision, but you can't argue that it's an arbitrary restriction on Apple's part.
Why is the simulator not simply a VM loaded with iOS, like with Android, or Windows Phone, or most of the other targets that you can cross-compile and test for from Windows or Linux?
I don't think that Android would have had better apps if Android developer tools were Mac-only. The reason behind the Play Store/App Store quality difference has more to do with the way the store is run, the system frameworks (Core Animation vs. the Java frameworks, etc.), and the inclination of users to purchase apps.
- App Store ads: if you have an app marketing budget, this does nothing besides shift who you spend it with. We'll see if the ROI is better, but you'll be competing with some heavy spenders for the limited inventory.
- The new subscriptions pay model with the 15% cut after one year. In my opinion Apple doesn't deserve any cut of the recurring money since they don't have to do anything to earn it.
> In my opinion Apple doesn't deserve any cut of the recurring money since they have to do nothing to earn it.
Aside from host the infrastructure that delivers the apps. Are they doing this to make money? Of course. But saying they haven't earned it is flat out wrong.
Indeed. Here is a list I made a while back of things they do for their cut.
* Maintains a user database including authentication, plus all the support costs (password support is very high)
* Payment handling (no one does it for free) including keeping up with tax authorities and legal systems in much of the world
* Almost unlimited (re)downloads for users to a reasonable number of devices. Bandwidth is cheap but not free.
* Backup and restore for application data. (That functionality is available to apps without extra fees.)
* A curated walled garden including a review process. Apple keeping the dregs out and avoiding the place turning into a cesspool means users can be more confident about the apps and that halo effect helps all apps in the store.
* Mechanisms to extend your app such as IAP, and an advertising solution etc
* Access to a large user base, with reasonably fair rules that everyone has to play by
You didn't even mention the fact that they keep the OS and platform updated, something critical to the longer term viability and success of apps you'd want to use a subscription model on.
> A curated walled garden including a review process. Apple keeping the dregs out and avoiding the place turning into a cesspool means users can be more confident about the apps and that halo effect helps all apps in the store.
I switched from Android to iOS and I just don't see this at all. Yes I'm more confident installing from the app store because of the security model, but android is getting that too. I'm no more confident that an app or particularly game will be good or worth downloading or buying though, and I don't perceive apps to be high quality in general.
Note that I am talking about a comparison with no app store at all, not comparing Google to Apple. If there wasn't an app store at all, you'd just download apps from random web sites instead, like for much Windows software. Now have a random person go download VLC for Windows, or for that matter something like Java. Between hijacked installers, "download" sites, toolbars, dark patterns in the installers, and who knows what else, the experience is very negative and would make users far less likely to install stuff. The app store helps alleviate those issues and increases user confidence in apps, which in turn helps everyone offering apps in the app store.
I've deployed apps to the app store, and on one occasion a bug was caught in review that saved it from going out to users. It's far from perfect, but it does have an effect. I think that, given the sheer number of apps & users, some shit is inevitable :(
After a user pays 99 cents, Apple will provide all that for free indefinitely. Subscriptions have nothing to do with Apple. A fair fee for just credit card processing is 1-4% max.
> Aside from host the infrastructure that delivers the apps
Under that logic they should charge for upgrades. But subscriptions is not really related to upgrades. Subscription implies that your app is tied to your own backend service that you keep up and running without Apple's help.
>> "Subscription implies that your app is tied to your own backend service that you keep up and running without Apple's help."
You could be using iCloud as your backend. You're definitely using their subscription billing infrastructure. You're getting access to their users' credit cards without having to ask for them.
This is debatable. Recent changes and interviews with Schiller seem to imply that _any_ application can use the subscription model, even if it is only to pay for updates.
> The new subscriptions pay model with the 15% cut after one year. In my opinion Apple doesn't deserve any cut of the recurring money since they don't have to do anything to earn it.
Maybe it isn't a benefit to developers, in which case we'll find out, because no one will be using it. Is that really what you expect will happen, though?
It's ridiculous to argue that it's a bad deal or somehow unfair because Apple doesn't incur costs in proportion to the cut they are taking. Welcome to the free market where prices are set according to what customers are willing to pay, and not according to how much it costs to provide the particular service.
I used to work at a company whose flagship product is an iOS app. I have typically worked on web things and was working on the API used by the product. But it always struck me as strange how different the perceptions of iOS devs were from those of most developers I've met about what constituted "reasonable requirements" from platform providers.
I get that you play ball with these things and pay what you owe to get access to the market of Apple customers. I do get that. But I also think that relationship is mutualistic, and that app developers bring a tremendous amount of value to the platform that causes people to buy Apple products in the first place. And I would argue that their value has not been assessed fairly by Apple given the amount of cut they require and the amount of stipulations they create about permissible ideas. Yes, a certain amount of these rules do make sense, but some are unbelievably arcane and arbitrary. For instance:
"Apps that unlock or enable additional features or functionality with mechanisms other than the App Store will be rejected"
It's absurd that something like this would be in your platform's rules. That eliminates a huge number of possibilities in app development that would have absolutely no negative impact on Apple's reputation or the safety of the App Store.
And it would be another thing if these rules weren't given much credence, but from what I've seen, developers are generally worried about being caught for these kinds of things during app review, and they take the policies very seriously.
I don't know if the iOS developers I've known are representative or not, but in any case I'm happy that some of them are starting to push back.
How doesn't that make sense? What if app developers started saying you could unlock 100 gold coins for free if you visit some website and click 3 ads or some other shitty experience? Maybe having that restriction is bad for developers but developers aren't Apple's only constituency, customers are too (Apple has definitely shown they care more about their customers than the needs of developers).
From a pure business strategy of commoditize-your-complements, encouraging app developers to adopt subscription pricing works against the interests of their customers (who want to minimize their costs). Yet they're obviously doing it because the long-term interest of customers lies in having sustainable businesses develop those apps/services.
Note I didn't say that there weren't examples of applications that could be abusive if such a rule didn't exist. I'm saying that the rule is far too general and covers far too many benign ideas that could end up becoming incredible products. Many iOS apps have an admin console that can only be unlocked by special designators in their account on the server side, yet by this rule that would be disallowed. In fact the company I worked for disabled their admin panel for this very reason, despite the fact that it was very useful for debugging and field testing a released build.
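Just to illustrate how benign the swept-up pattern usually is, a server-gated admin panel is typically nothing more exotic than the sketch below; the endpoint, the isAdmin flag, and every name here are hypothetical.

    import Foundation

    // Hypothetical account payload returned by the app's own backend.
    struct Account: Decodable {
        let username: String
        let isAdmin: Bool   // the server-side "special designator"
    }

    // Fetch the signed-in account and decide whether to expose the admin console.
    // Nothing is unlocked through the App Store; the gate lives entirely on the
    // developer's server, which is the pattern the quoted rule can be read to forbid.
    func loadAccount(from url: URL, completion: @escaping (Account?) -> Void) {
        URLSession.shared.dataTask(with: url) { data, _, _ in
            let account = data.flatMap { try? JSONDecoder().decode(Account.self, from: $0) }
            completion(account)
        }.resume()
    }

    // Usage (hypothetical URL):
    // loadAccount(from: URL(string: "https://api.example.com/me")!) { account in
    //     showAdminConsole(account?.isAdmin == true)
    // }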
EDIT: Also is there any evidence that companies in the App Store are more sustainable than elsewhere? I can see how this is possible, but it's also unintuitive that companies paying higher costs would end up keeping their apps maintained for longer.
I don't think it's a surprise that Apple's rules chafe developers; I also don't think it's a surprise that Apple has a bias towards its customers, which is why it's willing to have restrictive rules.
Ultimately it comes down to leverage, and they have plenty of it because they've aggregated a very nice demographic of users that are willing to spend money.
This is exactly the kind of "Apple is infallible" logic I'm talking about. How exactly does it benefit customers to limit access to safe, cool apps that violate some arcane rule they have? If they actually wanted to do right by their customers it stands to reason that they should work with developers to come up with a more precise set of rules that still keeps their reputation intact and their customers happy so that customers have access to more varied kinds of apps. I'm an iOS customer (and not an app developer) who is irked by this, so there's at least an existence proof of people that feel the opposite.
> If they actually wanted to do right by their customers it stands to reason that they should work with developers to come up with a more precise set of rules that still keeps their reputation intact and their customers happy so that customers have access to more varied kinds of apps. I'm an iOS customer (and not an app developer)
Apple's general MO around restrictions against abuse is to make them too strong, see what kind of useful apps get developed in other app stores that would be against Apple's rules, and then change the rules to allow for just those examples. So if there's an overly strict Apple rule, but no other good apps come out elsewhere, then the rule will stay.
What apps are being made elsewhere that aren't being made for iOS? And how successful/popular are they? Saying there are plenty, without evidence, is a counterfactual that can't be proven.
There may be some apps that aren't made on the margins, but on the whole the rules make the App Store a better experience, and it's their prerogative to choose that over the marginal app.
Fundamentally the reason they have that rule is to prevent apps from doing an end run around Apple's 30% cut. For example, to unlock the "Pro" features, go to the developer's website and enter some payment info.
The rule has nothing to do with preventing crappy ad-driven experiences. If you had to click 3 ads in app to get 100 gold coins, Apple would be fine with that even though it's crappy - they're fine with it because the action is still happening within their garden walls and if it involves any exchange of real $$ they have enough control to either get their 30% or block the app.
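For contrast, staying inside the garden means routing the unlock through StoreKit's In-App Purchase API, roughly like the minimal sketch below; the product identifier is made up, and receipt validation and restore handling are omitted.

    import Foundation
    import StoreKit

    // Minimal sketch of unlocking "Pro" through Apple's In-App Purchase API.
    final class ProUnlocker: NSObject, SKProductsRequestDelegate, SKPaymentTransactionObserver {
        private var request: SKProductsRequest?

        func buyPro() {
            SKPaymentQueue.default().add(self)
            let request = SKProductsRequest(productIdentifiers: ["com.example.app.pro"])
            request.delegate = self
            self.request = request   // keep a strong reference until the response arrives
            request.start()
        }

        // Queue the payment once the App Store returns the product.
        func productsRequest(_ request: SKProductsRequest, didReceive response: SKProductsResponse) {
            guard let product = response.products.first else { return }
            SKPaymentQueue.default().add(SKPayment(product: product))
        }

        // Unlock the feature only when Apple reports the transaction as purchased.
        func paymentQueue(_ queue: SKPaymentQueue, updatedTransactions transactions: [SKPaymentTransaction]) {
            for transaction in transactions where transaction.transactionState == .purchased {
                UserDefaults.standard.set(true, forKey: "proUnlocked")
                queue.finishTransaction(transaction)
            }
        }
    }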
Here is the beef that I have with Apple. Hands down, the app store is infrastructure that's rotten to the core for app developers. In my experience, app discovery is so bad that it shows everything but the app you were looking for, unless the app you were looking for is in the top 10.
Most indie apps are brutally left out in the cold. As a result, app downloads are stagnating.
So it boggles the mind why, going forward, we should invest months into the development of an app when the only benefit the app store brings is payment (for a free app) and versioning.
Apple is in dire need of an app store disruption.
These are complaints about visibility. Can you market it yourself? Then the app store provides cryptographically-signed peace of mind to users that they can use your app without their contacts stolen or their phones hijacked.
The problem with visibility is the following. Let's say you make a commercial/web-ad/etc for App "A".
The customer will click the link with maybe 0.05% probability, but perhaps remembers the name.
So now, in their spare time, they think: let's look for this app "A". But they input the name and only "B", "C", and "D", which have nothing to do with "A", come up. That's not good, right?
A real-world equivalent would be: you are in a supermarket and want to buy beer "A" because you heard the radio commercial, and the assistant guides you to the vegetable, ice-cream, and raw-meat sections but neither to "A" nor to the beer section. So before you can ever discover "A" in the beer section, you already need to know that there is a beer section and that "A" is the beer to try. If there are 20,000 beers, that is your problem right there.
In comparison, on Amazon it is more effective to find products through recommendations and "similar products" which help customers find new products.
The app discovery path does not work in the app store.
> peace of mind to users that they can use your app without their contacts stolen or their phones hijacked
I think you can get that part without requiring the bottleneck of a single, hand-curated app store. The OS sandboxes app processes and blocks them from accessing user information without getting permission first - if that's done well, no app review is needed to ensure apps can't hijack your data.
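As a concrete example of that OS-level gate, here's a minimal Swift sketch of asking for contacts access; it assumes the app declares the usual contacts usage-description string in its Info.plist.

    import Contacts

    // The OS-level gate: CNContactStore hands nothing over until the user has
    // explicitly approved the prompt, independent of any App Store review.
    let store = CNContactStore()
    store.requestAccess(for: .contacts) { granted, error in
        guard granted else {
            print("Contacts access denied:", error?.localizedDescription ?? "user declined")
            return
        }
        // Only now can the app enumerate contacts.
        let keys = [CNContactGivenNameKey as CNKeyDescriptor]
        let request = CNContactFetchRequest(keysToFetch: keys)
        try? store.enumerateContacts(with: request) { contact, _ in
            print(contact.givenName)
        }
    }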
Unfortunately this is far from true. It's trivial to use private APIs and beat App Store static analysis. There is also a MITM attack on FairPlay DRM that has enabled malware to be installed via App Store apps.
Thanks, I wasn't aware of the latter, but looks like it requires a user to install a Windows client called 爱思助手 (Aisi Helper), purportedly for jailbreaking etc., then connect their iOS device to a Windows PC running this software. So it's hardly a typical case.
Reference: http://researchcenter.paloaltonetworks.com/2016/03/acedeceiv...
Regarding the former, yes it may be possible to hardcode addresses for private APIs and target specific devices running specific iOS versions, but Apple routinely weeds these out.
Edit: I assume Apple weeds these out. I have no reference for this.
I realize it's not bulletproof, but it's still good enough to trust for me as an iOS user compared to the Android phone I was using until last year.
I may represent a minority, but I only use the app store to search for and install an app that I already know that I want, either because a friend showed me or because I found it on the web. To me, discovery via the app store is completely uninteresting and if I was developing an app, I would not care about it.
Here's my thing: how many apps do you actually need? My phone is first and foremost, a telephone, then an SMS device, then a GPS, and finally, if I really am in a pinch and can't get to a real computer, a web browser. That's all I do with it.
If I could get rid of the crapware bundled on my phone without rooting it, I'd have maybe a half dozen apps installed.
I have dozens of apps (but anecdotally many fewer than average) for a bunch of things. I have 3 book readers from different vendors. I usually use one, but if the book I want isn't available, I try the others. I have several streaming video apps which I use when traveling. I have a metronome and various audio recording apps for practicing music. I have apps for various airlines for when I'm traveling. I have a few restaurant apps for ordering on the go, but I don't usually use them. I also have several games. Then there are simple info apps like the standard stock app, weather app, maps, etc. Also calendar and email. Oh, and I take lots of photos with my phone even though I have a very nice DSLR (which I also use a lot). It's a really useful tool.
I think I'm in the minority in how few apps I use, so it seems to me that you're even more in the minority. And that's fine, but don't assume everyone else uses their device like you do!
I really feel that at this point the biggest issue is Apple's lack of openness.
My biggest issue with Android is the UI. For whatever reason it's fragmented between devices, and when I worked in an area that needed to support both iOS and Android devices, I preferred helping those with iOS because it was much, much easier to support them.
However, in terms of troubleshooting beyond a certain point I discovered that Android devices were by far superior. On an Android I can get access to log files, low level network features and a host of other goodies (like the filesystem!) much more easily. Heck, some of the devices we eventually ended up using had great emulators and remote access tools - something iOS devices just don't have.
The company I worked for had a custom mobile app-based solution. Like any solution it had bugs, but we fixed more bugs in the Android version in shorter timeframes than we did for iPhones - largely because we had more options for people in the field who experienced intermittent and hard to track down problems.
The fact that I can't view an iPhone app's logs on my iPhone seems ridiculous. And not being able to set up network filtering on the device... well, I know why Apple doesn't want that, but it's not for any good technical reason - it's only to "protect" their ecosystem.
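For what it's worth, the workaround many teams end up with for field debugging is having the app keep its own exportable log; a rough sketch, where the file name and format are placeholders:

    import Foundation

    // A simple app-side log file that field users can export (e.g. via a share
    // sheet or a support email) when an intermittent bug strikes.
    final class FieldLogger {
        static let shared = FieldLogger()
        private let fileURL: URL = {
            let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            return dir.appendingPathComponent("field.log")
        }()
        private let queue = DispatchQueue(label: "field.logger")

        func log(_ message: String) {
            queue.async {
                let line = "\(Date()) \(message)\n"
                if let handle = try? FileHandle(forWritingTo: self.fileURL) {
                    handle.seekToEndOfFile()
                    handle.write(line.data(using: .utf8)!)
                    handle.closeFile()
                } else {
                    // First write: the file doesn't exist yet.
                    try? line.write(to: self.fileURL, atomically: true, encoding: .utf8)
                }
            }
        }

        // Hand this URL to a share sheet so the user can send the log back.
        var exportURL: URL { return fileURL }
    }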
The complaint by Fanify is ridiculous. It's very clear the app "violated Apple’s requirement that all in-app payments be routed through iTunes" so you can't pay through paypal.
Sounds like poor planning by Fanify to submit the app only 10 days before their planned launch, when they clearly didn't bother following Apple's T&Cs.
It does sound like poor planning, given the known realities of the app store. But I've never understood why Apple feels entitled to a cut of in-app purchases. They aren't providing any added value and in some cases (like w/ Kindle) they seem to be begging for an anti-trust suit.
One reason is that without it, there'd be nearly no paid apps. Every app would be free with some minute amount of functionality (to pass the review process) and an in-app purchase to unlock the "pro" version, with Apple getting nothing.
Apple built a platform and wants to extract value from that platform. It seems perfectly reasonable to me.
A good rule of thumb is that a consumer electronics product should retail for about 2.5x the COGS (cost of goods sold). That's pretty much where the iPhone sits.
I don't see evidence of excessive margins on hardware; I see a company that's planned for and executed on proper margins.
>Every app would be free with some minute amount of functionality
Maybe. But that already happens w/ free and "pro", etc.. I expect that the marketplace would sort that out. When everything is a come-on genuine offerings start to have more appeal.
>Apple built a platform and wants to extract value from that platform.
Sure. No problems with that as long as there is some value they are providing. At least in the case of a paid app, once that sale is made Apple shouldn't be involved unless they are doing something useful.
Don't you find the App Store review process somewhat arbitrary and opaque though? Have they really improved their processes in terms of responsiveness?
The time to review & release is definitely faster. I wonder if quality will suffer because of it. One time when I had a rejection, I asked for some more details and got a fairly automated response. So it does feel a bit inhuman unfortunately.
I think this is part of the new changes as well. I was dealing with a rejection a few days ago because of the new IPv6-only restriction and someone from App Store Review actually called me and we discussed and resolved the issue over the phone. After only ever communicating with them via contact forms and automated responses, I was in disbelief. Speaking to a real person from App Store Review was so absurd in my mind that it may as well have been God calling to discuss my prayers.
The App Store has reached maturity, and at this point breaking through the noise is nearly impossible if you don't have millions to begin with. There are those odd apps that, like in the first couple of years, tear through to the top after being built in a month by an independent dev, but that is rare. The app store is nowhere close to as exciting as it was. It's basically become like the PC software market now, though with less liberty.
"At one point, Apple raised a fundamental objection,
saying that Fanify’s method for tipping artists, which
used the online payment services Venmo and PayPal,
violated Apple’s requirement that all in-app payments
be routed through iTunes so the company could take
its 30 percent cut."
Duh Fanify.. I am amazed at developers who complain about their apps not getting accepted when they clearly never read the App Store Review Guidelines.
"11.2 Apps utilizing a system other than the In-App Purchase API (IAP)
to purchase content, functionality, or services in an App will
be rejected."
More like "Apple does a PR piece to convince developers that everyone else is being wooed so they should be wooed too". I miss the days when Apple used to generate hype simply based on great product, instead of artificial pieces like this.
I am very surprised that it has taken Apple years to finally decide to release a Siri API (supposedly). The benefit of a speech assistant for apps seems huge. And had it been released earlier, Apple could have collected much more speech data as well, which might have helped build the next-gen Siri.
I don't really get the whole wedge of the article. Isn't giving developers a Siri Kit exactly them trying to make it easier for them? Apple has continually improved developers' access to the different hardware and software on the iPhone.
Same thing for the improvement of app review times. It's great they did it.
I don't see the connection between that and "stagnating" iPhone sales.