The future of iOS is 64-bit only: Apple to stop support of 32-bit apps (macworld.com)
282 points by intuzhq 52 days ago | 294 comments

The Ars Technica article on this issue cuts to the heart of why this is so devastating: there is tons of software--software which was really interesting and I dare say "seminal" for this important era of computing; and which is not old or outdated by any sane standard--that this destroys access to going forward, for essentially no benefit.

Apple insisted that they get to curate something of critical value, but they don't comprehend the moral weight of that responsibility, and now want to just go around burning down their Apple-branded libraries. It is absolutely sickening, and is the kind of thing we probably need to solve using regulation (which, of course, is likely never going to happen, given how even companies we would have assumed were on our side often have lobbyists fighting for the ability to lock down ecosystems).


> The Ars Technica article on this issue cuts to the heart of why this is so devastating: there is tons of software--software which was really interesting and I dare say "seminal" for this important era of computing; and which is not old or outdated by any sane standard--that this destroys access to going forward, for essentially no benefit.

It's not zero benefit. Supporting multiple ABIs means you need multiple copies of system libraries paged in at a time when they're in use, which is a significant amount of memory (with resulting power and price costs). It's possibly an even bigger problem on iOS than on Android, because Apple changed the floating-point ABI between ARMv6 and ARMv7, so they have to support multiple different 32-bit ABIs.

So do what Microsoft does for 16-bit/DOS support in the most recent 32-bit Windows, and let NTVDM be an optional download triggered on first use. Do the same for the armv6/armv7 dyld files. Maybe they could even freeze the armv6/armv7 libs at the iOS 10 API level, since old apps won't be requiring new APIs. There are important apps out there that people paid for (either the app itself or the hardware), and I think those users are more interested in a working app that uses some memory than in a blocked app and lots of free memory.

The reasons that Windows sucks are often linked to its long backwards compatibility. Remember when the first Surface came out and they couldn't sell an entry-level version with 16 gigs of storage because the OS took 13 gigs?

At the same time this reliable backwards compatibility is what keeps Windows in the market position it has, in particular in the corporate world.

Something a number of FOSS projects should take to heart when they fret about the "year of the Linux desktop".

Even if the compatibility layer is optional, Apple still has to maintain it and port it to new OS versions.

> Apple still has to maintain it and port it to new OS versions.

Apple decided that they don't want to open-source it. That is the reason why Apple would have to maintain it.

Ok, I love open source, but let's not pretend it's that easy... "Just open source it" is something I hear from people who have never open-sourced a large code base developed internally. Look at RethinkDB: I never used it, but I had wanted to on multiple occasions, and it looks really awesome. They were open source from the start, and yet now that the company is gone, its future is uncertain. All of this ignores the fact that the compatibility layer is much more than just one application; it's an integral part of the OS. I seriously doubt Apple plans to release tools to make development of said layer easy for whoever clones the project off GitHub.

With something like RethinkDB though, you're relying on it in production.

With a compatibility layer or a game, you're just toying around, trying to make it run something. If it works - awesome, if it doesn't, you're not losing money over it.

I think there's a substantial difference here.

Apple shouldn't have to open-source anything they don't see fit to open-source for their own benefit.

With the resources that Apple has, it would be silly to throw out so much backwards compatibility just because "uh, effort".

I don't think size and resources can effectively combat this sort of accrued technical debt. Once you have to maintain so much cruft it becomes emotionally difficult to write good software (at least, I experience this).

I have a few personal projects that I have maintained for years, but the motivation required to refactor, clean and maintain the code gets bigger every year.

You have to be passionate about the software you write. And unless you can find a team that is passionate about maintaining a 32 bit compatibility layer for old App Store apps, I think it would be a permanent burden on Apple.

This is the reasoning that made Windows so crappy for so many years.

It's also the reasoning that has left Windows so dominant for so many years. There are businesses that would go bankrupt overnight if Windows broke compatibility with 32-bit VB6 applications built at least a decade and a half ago.

That only matters if you are using 32-bit software. Deleting the 32-bit software is not somehow a benefit with relation to that goal. You are saying that it is better to run all 64-bit software. I am arguing that there is essentially no benefit to a device that is incapable of running 32-bit software when it clearly still has the hardware.

I'm saying that supporting multiple ABIs has a nonnegligible cost, regardless of whether the hardware supports it or not. On my MacBook (I haven't touched an iOS device in the last 5 years), the 32-bit dyld_shared_cache is 300 MB, and the 64-bit one is over 600. There might be baked in assets that are larger on the laptop form factor, and libraries that don't exist on the phone, and demand paging might buy you some of that back, but it's still going to be an absolutely massive chunk of code.

I'm not actually arguing that this is a net win for the consumer of the device. If your choice is between a device that supports 32-bit and a device that doesn't, with literally zero difference in performance, I would totally agree with you (modulo engineering effort it takes to maintain them all, but that probably isn't too high). I'm just saying that there are real quantifiable benefits to what they're doing.

edit: and that's not considering the actual advantages of forcing 64-bit like improved security via better ASLR

What you say is true, but can you imagine trying to explain to the average user that the reason their phone is running slower than necessary is because they are using applications which necessitate multiple libraries be loaded simultaneously?

From Apple's perspective, the app market is so saturated that any missing functionality will quickly be replaced. This way, everyone gets a performant system without needing to know the difference between a 32-bit app and a 64-bit app.

They already show a "this app may slow down your device" alert occasionally when you open a 32-bit app. This was added with iOS 10.

Oh, is that why you get that message? Uhm, I have a few of those apps then :(

I think you're assuming the upcoming A11 chip will also be capable of running 32-bit software. The hardware may be changed, for the sake of simplicity.

As far as anyone is reporting, Apple is intending to remove support for existing devices that get updated to new versions of the firmware, not just for new hardware.

Lack of fragmentation is a good thing.

As others have said important software will be made available. Apple has the luxury of a dedicated userbase and can afford to make these calls.

I imagine they are dropping the support in new hardware too.

If I may: if all your pointers are now 64-bit, the memory used by your program will typically grow significantly. I'm not sure that, all in, reduced memory consumption is a benefit of switching to 64-bit.

64-bit allows Objective-C programs to use things like tagged pointers (allowing boxed numbers, dates, small strings etc. to be stored inline in the pointer) and non-pointer isa (storing things like an object's refcount in the same word as the pointer to the object's class) which can reduce memory usage significantly.

The larger pointers will certainly cause some programs to increase their memory usage, but these changes will cause others to shrink, and the net effect seems to be more or less a wash.

iOS apps have been 64-bit for several years. Apple's tools build dual 32/64-bit binaries, and the App Store will not accept uploads where the 64-bit part is missing.

So, any modern iOS device has been running pretty much only 64-bit code for a long while.

The only thing affected by this is the ability to run really old binaries that have not been updated in years. But those will often be suffering from other compatibility issues anyway, such as display sizes and not meeting modern security standards like Apple's TLS mandate.

Among these 32-bit applications is my beloved Tetris app, which doesn't need modern security standards or the extra half-inch of screen real estate. If I have to choose between yet another iOS update (with the usual harassment to use Apple Pay, Apple Music, iCloud, and all that shit every time I upgrade) and Tetris, I choose Tetris.

I agree, so let's yell at EA and get them to update the app. I suppose they have no interest because they now have a "free" version in the App Store...

> I suppose they have no interest because they now have a "free" version in the App Store

Not "free" as in "freedom". :-(

If you think you're being "harassed" by the device offering to use essential features of the platform then perhaps it's not the device for you? I've used iOS for years and have never found any prompts that are remotely harassing about anything iirc. It's been nearly two years since they started requiring all apps and updates to be 64 bit, it's not a huge surprise this is coming.

> the device offering to use essential features of the platform

If you think "apple pay, apple music, icloud" are essential features of the platform, you are misinformed. They are attempts to lock you into a platform by providing an (inferior but embedded) alternative to products in the respective market.

> I've used iOS for years and have never found any prompts that are remotely harassing about anything

Then you have never tried to avoid uploading all your personal information to Apple's iCloud. Harassment at every minor dot-dot update.

I think cm2187's use of the word "harassed" is grossly overstated. Only one of the three things he mentions even has a prompt at initial setup, but even then, skipping iCloud setup leads to absolutely zero prompts from then on.

No, Apple Pay, Apple Wallet, and iCloud each get a confirmation prompt (so you have to decline twice each) on pretty much every iOS upgrade.

I do think that "harassed" is the appropriate term for this repetitive nagging.

You mean the upgrade that involves the setup wizard that shows users new features?

I am not sure if it is the setup wizard, but when it has updated iOS, it reboots, and on first boot I am greeted with multiple prompts to start using each of these services, much as if it were a new iPhone. But this is not about new features. The same prompts come back at every update, even if I have opted out many times before.

I would not call an update and setup screen "harassment." Pop-up ads would be harassment; iCloud prompts without an accompanying update would be harassment.

> The only thing affected by this is the ability to run really old binaries that have not been updated in years. But those will often be suffering from other compatibility issues anyway

There are numerous applications — mostly games — which haven't been updated in years (and show the 32-bit warning) yet work just fine: Trainyard, Auditorium, Drop7, Eliss, Tilt to Live, geoDefense, Waking Mars, Canabalt, …

Just a side note, the rollout of ATS requirements on iOS/AppStore has been delayed until further notice.


Why do you and many others treat software as if it is a sacred artefact that must be kept and preserved and be able to be run for all future generations?

Humans are producing so much content right now in the form of software, art, movies, writing. Does it really matter if we leave some behind? Do we need to preserve every piece of information we generate?

I have software on the App Store that I cherish that unfortunately will never be updated past 32 bit (too much work). But I don't care, nor do I feel the need to preserve it.

You say there is no benefit to this move. But this move significantly reduces Apple's technical debt and support requirements and is likely to be worth the tradeoff for future developers and future software on the platform.

> Humans are producing so much content right now in the form of software, art, movies, writing. Does it really matter if we leave some behind? Do we need to preserve every piece of information we generate?

Everybody should be able to decide for themselves what they consider important and what should be preserved. So I strongly oppose it when the benevolent/malevolent dictator decides that you have to leave this piece behind.

> I have software on the App Store that I cherish that unfortunately will never be updated past 32 bit (too much work). But I don't care, nor do I feel the need to preserve it.

But others perhaps care. So I strongly encourage you to open-source it if you don't care about it anymore.

> So I strongly oppose when the benevolent/malevolent dictator decides that you have to leave this piece behind.

There is no one dictating that. The source code is still in the hands of developers, and your advice to me was to "open source it." All developers of 32 bit apps have that option, and they also have the option to maintain their source going forward.

Apple's decision to shed technical debt shouldn't be seen as them eschewing some moral responsibility to support old software.

> But others perhaps care. So I strongly encourage you to open-source it if you don't care about it anymore.

I would consider it if I still had the source. But you should realise that saying "open source it" trivialises the amount of effort and emotion required to do so. Sharing your work with the world (especially if you are embarrassed by old code you have written) is not something that everyone finds easy to do. I would feel obligated to update the code, write build instructions and a readme file at the very least.

That would take a good few days of my time to make the code "presentable" before I would be willing to share it. I don't have the mental or emotional energy to devote to that task.

> > So I strongly oppose when the benevolent/malevolent dictator decides that you have to leave this piece behind.

> There is no one dictating that.

Apple is.

> dictator decides that you have to leave this piece behind

They aren't dictating that you have to leave this piece behind.

They are dictating that you have nine months to either port your code to 64-bit, leave it behind, open-source it, or do whatever else you want with it.

They are dictating that the end user leave this piece behind. The developer could be dead or out of business.

Granted, they have been doing this for a long time -- you can't easily install older versions of software, for example.

You can very easily install older versions of software. If you make a local iTunes backup, you can keep the apps and reinstall them locally. If you have an older device that doesn't support the newest version of the app, you will be prompted that there is an older version available that is compatible and be allowed to download it.

The former option works on all iOS devices ever made, the latter works at least as far back as iOS 5 devices.

The former stopped working in iOS 9.

You can't restore local copies of apps that you backed up using iTunes in iOS 9? But you can download older copies of apps on a first gen iPad? That's weird if that's what you are saying. I was able to download older apps on my 1st gen iPad about a month ago.


Are you sure it stopped working? You can't download an older version of an app on a non supported OS unless you either previously downloaded a version before on a compatible device or used iTunes and bought the app on a computer.

Edit 2:

Apparently you can still download the app to your computer directly using iTunes, but you can't copy it back to your device.


No, they shouldn't. Those people should either update their software or force their users to stay on legacy devices. It should not be Apple's responsibility to hold back progress and keep complicating their environment for the benefit of the few.

> So I strongly oppose when the benevolent/malevolent dictator decides that you have to leave this piece behind.

You don't have to leave it, you just cannot sell it on the App Store or run it on modern devices. By all means, open-source the code, publish it, archive it... do whatever you want with it. It's not the end of the world, and it's a sensible decision.

> But others perhaps care. So I strongly encourage you to open-source it if you don't care about it anymore.

This is wrongheaded thinking, to my mind. Others may care, but that does not mean they have the right to guilt someone into releasing the source for _their creation_.

On a grander scale, nothing lasts forever. Not even stars. So we don't need to preserve everything; that strikes me as very ego-driven thinking. It is OK to let things die and be forgotten. Software, art, ideas, songs, and poems included.

If the creator of these things decides to not release it to the world, that is _their_ prerogative.

For the same reason you treat cinema and games that way: it's not worth destroying all access to The Godfather just because "there are so many other films coming out nowadays."

He isn't saying that. He is saying we shouldn't have to continue supporting tape cassettes just because the creator of The Godfather didn't want to update it to DVD.

I think the difference is that film has archival formats to keep it around. If you have a film reel, you can rescan it for a new format, whether that's VHS, Betamax, DVD, Blu-ray, HD DVD, or whatever is next. For software we have the source code, but given Apple's iOS ecosystem you are likely tied into that platform (especially for early software, when there was little cross-platform tooling).

The problem with the App Store is that neither users nor developers have a good method for keeping their apps around (other than releasing a binary for jailbroken devices). You can't just create a DMG and host it on S3 to let any old iOS device install it, because the platform is locked down. The issue is that a lot of the history of mobile gaming is going to be lost (and some already has been, with the purging of non-updated apps that is already happening) because the App Store is the arbiter for users obtaining software.

The "simple" solution is for apple to allow users to install IPA files outside of the App Store, like Android can install APK files. Developers can build their app in an old version of Xcode and put it up for install on their own terms.

Apple does allow developer users to install IPA files outside of the App Store. I can easily take an app off Github, build it with Xcode, and install it on my iPhone, for example. There are even scripts to do this similar to a "configure; make; make install", eg: https://github.com/phonegap/ios-deploy

This doesn't require the $99 it used to either, anyone can do it, though it does require you to get a free signed developer certificate from Apple which signs the app (you'll also need to tell your device to trust your cert).

You can even distribute an IPA file and install it via iTunes and not the App Store, but it needs to be signed by a trusted authority (your company; your dev cert; or most likely Apple, as long as it is a non-revoked cert).

> This doesn't require the $99 it used to either, anyone can do it, though it does require you to get a free signed developer certificate from Apple which signs the app (you'll also need to tell your device to trust your cert).

There's a limit of 3 installs per device, though. And you'll have to re-sign the app every 7 days if you choose to go down this path.

> You can even distribute an IPA file and install it via iTunes and not the App Store, but it needs to be signed by a trusted authority (your company; your dev cert; or most likely Apple, as long as it is a non-revoked cert).

Doesn't this specifically require an Enterprise Developer account?

Seriously? That's awesome, I'm really glad Apple's taken that step in the right direction.

Question though, has it had a large impact on popularity of pirated apps if you no longer have to jailbreak to use them?

It should be noted that it's quite limited. You can only install a few of these apps on a given device, and they expire quickly. It's not really practical for sideloading in the general case.

In practice, what people do is they pay something like $5 a year to get added to a shared pool of developer accounts. In China they also had a really slick scam going for a long time involving pools of otherwise-unused enterprise certificates (though I haven't tracked this one recently, and so maybe they got hit even harder than the resigning services have been).

I will also point out that for many of the use cases of piracy, which involves games and other content where the user rapidly get "bored" and moves on to other content, a limit of three at a time for only seven days doesn't matter: people can pirate a few games and play them for a weekend, and then move on to pirate other games.

(If you are curious, I'm of course excited by these things and like explaining them, as it further weakens the argument some people try to make that jailbreaking leads to mass piracy, something even the US Copyright Office doesn't believe; but this line of analysis makes the whole situation more obvious ;P)

Stretching the metaphor a bit: but what if the creator is dead? Or can't afford to update it? Does that make The Godfather worthless because of its medium?

The Mona Lisa is deteriorating. Should we also stop efforts to preserve it, because there's so many other paintings out there?

As long as there are people willing to preserve it, then we will preserve it. Few people would ever "stop" efforts to preserve something, it's just going to stop being preserved when there is no one around who cares about it anymore.

There are many millions of paintings that have been lost to time, and there will be many more.

Just because something isn't preserved doesn't mean it is worthless. The transience of human creativity is part of what makes it so beautiful.

If the Godfather was to be removed from existence this instant it would still have value because it has already made a mark on the world and the memories of the people in the world. And will indirectly influence generations to come. Value accrues in the humans who experience the work, not in the work itself.

> As long as there are people willing to preserve it, then we will preserve it.

As long as there are people willing to preserve it, there are still laws (e.g. the DMCA) that prohibit people from doing so.

Those laws exist to protect the rights of the creator for a period of time; if they want a thing to pass into history, without openly publishing the source, that's their prerogative.

Even before copyright expiration, anyone can clean-room reimplement a thing in open source if they wanted to, and not be violating copyright. This has been common in OSS. (Samba for example)

The period is effectively infinite now, since congress keeps retroactively extending it.

Also, the SMB protocol does not contain any copyrightable material, like music, sprites, icons, game assets, etc. Copyright law prevents you from faithfully cloning (pixel for pixel) most GUIs and games (so most phone apps)

The Sonny Bono Act was passed 19 years ago. 100 years of protection isn't infinite. It also requires an entity to WANT to enforce that copyright. If they exist, then you really don't have a right to pixel-clone something; they can keep it buried.

There are plenty of game variants created that capture the feel of prior games without being IP violations

This is a poor analogy because the customer can turn a Godfather tape into a DVD at home. Or a DVD into an mp4. Or an mp4 into whatever the future holds, and be able to watch it on newer equipment. If there's DRM, it's typically trivially broken or worked around via HDMI converters or even a basic analog output.

As a customer of a 32-bit app, I have no such power. I can't convert it to 64-bit. That's the problem.

If you have a 32-bit app that you are not willing to bring up to 64-bit, then the platform on which it can be used will stay in the past. So if I don't update my iPhone, I can continue to use your app, and I will have to begin finding ways to preserve your app for myself and for people like me who enjoy it.

I don't believe that you can be said to cherish something if you don't care if it goes and you don't feel the need to preserve it.

cherish (verb):

​ 1. to love, protect, and care for someone or something that is important to you:

Although I cherish my children, I do allow them their independence.

Her most cherished possession is a 1926 letter from F. Scott Fitzgerald.

Freedom of speech is a cherished (= carefully protected) right in this country.

2. to keep hopes, memories, or ideas in your mind because they are important to you and bring you pleasure:

I cherish the memories of the time we spent together.

Why not? I have fond memories of it. I have the memories of sleeping in my car for weeks on end when I was building the software. I have memories of the two friends I built it with. I have fond memories of the bucket-based OpenGL render engine I wrote for it, the dynamic water system, the physically-driven character rig, the 3D models I made, memories of the art I painted.

I also remember the blog posts I wrote about some of the complexities and fun problems we solved. I remember the reviews written about the product.

I cherish it. I just don't care if I have the source code or a working binary (though I still have the latter until this 32 bit support is dropped!). I have the memories and experience and the desire to create more.

Your quote says it all:

> I cherish the memories of the time we spent together.

I cherish the time I spent creating my software and putting it out into the world. Now I'm creating something else, and I'm happy with the memories.

So you cherish the memories, not the thing

I don't know if there is really a difference there.

It is absolutely possible to care about, love, and cherish something while also being willing to let it go so you can move on to something else. And it's software, not a human child or something. These are absolutely not in conflict, at all. Why does this even need to be said? Are half the people here robots? Apparently they are, because the moment someone talks about throwing away some software, people begin making allusions to child rearing.

Bringing out dictionaries is just a way of being an annoying pedant; nitpicking phrasing like "... and care for" in a dictionary definition does not make you sound enlightened, "loving", or in tune. It makes you sound like that insufferable jackass everyone avoids, the guy who has to remind everyone "but what about the pedantry" any time any complaint exists in the public sphere.

I'm not trying to sound anything at all. I'm trying to understand his point of view. I broke out the dictionary to help establish whether he was using the word as I understood it, not as an attempt at irritating pedantry.

The discussion is about whether it matters that these 32-bit apps (and more generally, all software eventually) will be lost to us. In that context, whether you cherish the thing or the memory of the thing is a crucial distinction.

As for being loving or "in tune", jumping on strangers who are trying to understand somebody's point of view is neither. It is precisely the behaviour of a jackass.

Some people are genuinely upset when they lose the use of a cherished thing, be it a pen, a car, or the iOS games or other apps they enjoyed using, and ignoring that or dismissing it out of hand betrays a lack of empathy.

> Does it really matter if we leave some behind?

Who decides? Popularity and economic benefit? So we get to keep Call of Duty video games, but apps that benefit a niche industry or are simply not very popular but considered important should be left behind? In other words, my Mona Lisa isn't your Mona Lisa.

Not too long ago, maybe the mid/late 1990's, we had this battle with deteriorating celluloid from the earlier stages of Hollywood. The studios were happy to let many old movies that were never transferred to VHS or DVD just wither away because the cost of transfer would never be recouped. I remember a lot of petitions and outrage and the studios reversed their policy (perhaps with grants or other funding, I don't recall). It was a small controversy and most of those movies weren't critically acclaimed, but some were and they were all part of history.

Not sure how well this analogy transfers to software, but it always makes sense from a practical and moral perspective to save everything rather than to curate a few, especially in the age of automation and cheap computing.

We're not just talking about art here, and we're not talking about geological timescales either. I think it's perfectly reasonable to expect that if I compile a program for an operating system, it will still run okay in twenty years on the latest version of that operating system. The Linux ecosystem manages this pretty well (and Microsoft is in a lot of ways the best at this), and I think they're getting better at it. That's why people can safely build things like nuclear control systems on Linux or Windows and not have to worry about recompiling every single binary every three years.

(Now mind you, they should be recompiling every binary monthly, but OS vendors shouldn't be breaking everyone's builds unless security demands it.)

It's a capricious decision that eliminates a bunch of actual useful applications with no consideration.

Meanwhile, there are 10,000 fart apps.

I'd just like to interject for a moment...

I wonder how long it is going to take for people to realize the importance of libre software. Sure, strong copyleft can be inconvenient for developers, but that inconvenience must be borne to enable users to control their technology. Otherwise we end up with the nonsense we are discussing here: software people want or need that they are no longer permitted to use, because Apple said so.


Quite. When you accept a dependency on proprietary software, you open yourself to the risk of the vendor withdrawing support. Looks like that risk materialized for some users today.

This is an "I told you so" moment for a lot of Free software advocates.

FWIW, as a serious advocate of Free Software (someone who constantly travels around to developer conferences, college hackathons, and even sci-fi conventions giving talks that stress the benefits of the GPLv3), this doesn't help and almost becomes a non sequitur, as it makes it sound like a business decision related to productivity applications. I doubt anyone would say they "rely" on Trainyard, and the Trainyard developer did "rely" on iOS, but they probably liked that, and for all I know they are dead now (I realize they almost certainly aren't, but that is the extent to which they aren't relevant).

As a free software advocate, there isn't a group of "some users" whom I get to point at and say "I told you so": in fact, most of the people who played Trainyard and enjoyed it probably don't even care that it is going to be inaccessible. And while I sort of care myself about some specific applications, it isn't about people like me either. The reality is that this purge of 32-bit software, combined with the centralized control of the computers capable of running it, is a devastating blow to society as a whole, and the people who are causing it are the people who either don't directly feel the downsides (as historical preservation is an externality, like pollution) or are actually in the minority of people who experience an upside.

Advocating for free software is good as it can be a weapon in this fight, but in practice Apple just isn't feeling the burn: when they wanted a new compiler, they hired a ton of engineers and wrote one so they didn't have to rely on gcc anymore. There is only so much power in the lever of free software to force societal change in a way that doesn't continually lead to a handful of resistance fighters looking at the carnage of the land and thinking "I know what will make me feel better: I'll point at the majority of people who don't even understand this enough to notice and the handful of people who are happy this happened and yell 'I told you so'", particularly when developers like Linus Torvalds, who have critically important positions, aren't fighting for the same battle.

My claim: we need regulation. In the same sense that we have regulation about pollution--to keep a handful of people who benefit from pollution enough to not care about its downsides to society (and them) from doing horrible things--we need to regulate companies like Apple to force them to provide open platforms and to have some reasonable set of responsibilities when it comes to sunsetting older hardware. The issue of closed vs. open ecosystems is related to free software, but isn't the same, and we need to be paying a lot more attention to the former and avoid a lot of knee-jerk reaction to the latter if we want to prevent some of the worst-case scenarios.

"we need to regulate companies like Apple to force them to provide open platforms and to have some reasonable set of responsibilities when it comes to sunsetting older hardware"

This sort of regulation is terrible, and in my opinion should be resisted. It institutionalizes the worst tendencies of open source - free riding and slowing progress due to community restrictions.

At best, a community will be a thriving organic set of users and contributors. At worst, they are abusive parasites to the core maintainers.

The former has its own future guaranteed - customers care about a thing and are investing in it. The latter however is far more common - everyone wants something for nothing.

The world is built on organizations that need revenue to continue their mission and fulfill customer needs. Forcing companies to allocate resources to the past, the obsolete, that which customers do not want to pay for, hurts society far more than letting products die of obsolescence.

A compromise might exist where a portion of copyright law and/or taxes are allocated to preserve certain works, which would capture a balance between the economic need to create the future and the social need to preserve tradition and past practices, but that's a tricky level of nuance for policy.


I also see this somewhat as a 'own your st' scenario. Apple makes all of these systems and platforms, and then proceeds to abandon them, producing waste (in both materials and developer time for the apps that get updated for nothing but compatibility reasons).

They update their APIs, deprecate the old stuff, at a breakneck pace, and they leave a great mess behind. They should be forced to think more about the mess they are making, and be unable to leave it without penalties. One way for them to not leave it would be to open it up. Another way would be to support older versions for a longer timeframe, and allow users to continue to download and use them.

Either way, it's not in Apple's interests to do this - they're much more content to force this crap down our throats, and rely on us just dealing with it because there really isn't any choice.

The mess they are leaving behind are outdated, crappy apps. They don't adapt to modern screen sizes, use memory efficiently, download efficiently (bitcode), etc.

Apple is continuously improving and modernizing its platform. They should be banning hundreds of thousands of zombie apps that haven't been touched in many years, because they are not valuable and are misleading to customers. But they don't; they allow you to keep your apps available on the store no matter how old or how poorly written. Finally they have their first platform change that's going to break them, and they can automatically boot them. That's a very good thing.

For another example, a thread today brought up Winamp. Winamp's last major updates happened, quite literally, a decade ago.

This is a player that works just fine on Windows 10, and there's definitely a case to be made that it's better [overall] than any more modern alternative.

This type of software can never happen on iOS, and is quickly becoming a relic on Mac OS.

The core principles of good software haven't changed much over the years. UI (in HCI terms) has in many ways not moved forward much at all. Apple are changing their platform it seems, more for the sake of change, than for the sake of real progress. And they're forcing some really good software out because of it.

And among the hundreds of thousands of older apps are apps built by smaller teams to supplement other things i.e. web services that struggle to keep everything maintained.

It's far more important that the apps remain usable to their end users than it is that they adapt perfectly to a slightly larger screen size :s

Apple's development platform moves too quickly for many developers to keep pace with. It's not that all these apps are not valuable or misleading to customers.

> My claim: we need regulation

We already have this. Copyright gives Apple a legal monopoly on iOS. It isn't supposed to give Apple a monopoly on the distribution of third party iOS apps. That's just the sort of monopoly leveraging the antitrust laws are intended to deal with.

They built it, it's their property. If you don't like it don't buy an iDevice. For those who do buy iDevices, they can rely on Apple to ensure the quality of the apps and platform. Look at Android, it's the wild wild west, insecure and full of scammers.

They built it, it's their property. If you don't like it don't buy service from AT&T in 1981. For those who do buy AT&T service, they can rely on AT&T to ensure the quality of the phones and platform. Look at The Internet, it's the wild wild west, insecure and full of scammers.

And if you don't like AT&T, all you have to do is sell your house and move out of their service area. Unless you make phones or modems or are an ISP; then even moving doesn't help you because you can't move your customers.

Don't forget Google: gcc has been shown the door in Android's NDK.

Regulations like the DMCA are what got us into this situation, by creating an environment where companies can declare that it's illegal to use "your" hardware in ways they don't approve of. We could start by getting rid of those restrictions, rather than creating even more surface area for regulatory capture.

Regulations cut both ways: regulations like "people can enforce contracts" are required to make things sane. Without regulation, DRM happens and wins. As one of the people extremely affected by the DMCA and often involved in official complaining about it (look me up), the DMCA is actually a red herring: whether it is legal or not is irrelevant if it isn't possible, and we are rapidly running into a world where the biggest issue isn't whether it is legal (I mean, even when we have exemptions, we still don't have exploits). Moving to a world without laws doesn't make things safer or happier.

Pollution is a "tragedy of the commons", the atmosphere and ground water are shared resources that if degraded affects all of us. But you don't have to regulate how I clean my house, if I don't want to live in filth I'll do it.

If you don't like Apple's house, you don't have to live there. There are many free software projects that you can use for your phone, pc, etc.

There are many arguments for libre software, but the support argument is the weakest. I'm much more confident of getting support from someone I've paid than someone I haven't. At least any support I'm entitled to will be in a contract.

The reality of most libre software is that scads of it goes out of support all the time, and only users who are members of the developer elite (and happen to be skilled in the relevant platform, libraries and dev tools) have even a chance of doing anything about it.

> The reality of most libre software is that scads of it goes out of support all the time, and only users who are members of the developer elite (and happen to be skilled in the relevant platform, libraries and dev tools) have even a chance of doing anything about it.

That's only true for things for which there isn't even one user who is a developer, who fixes your problem because it was also their problem.

You have the same problem with free software, only that you can step in and maintain it yourself (which costs a lot of time/money).

> This is an "I told you so" moment for a lot of Free software advocates.

Is this though? How much of the impacted software would exist as free software?

Again. And again. And again. But we'll keep saying it.

I wonder how long it will take for people like you to realize what users want. Hint: they don't want to control, there are already too many things to control. They want to use.

Control and use go hand in hand. If you don't control your stuff, then it's just about inevitable that whoever does will eventually block your ability to use it.

> I wonder how long it will take for people like you to realize what users want. Hint: they don't want to control, there are already too many things to control. They want to use.

I wonder how long it will take for people to realize that "using" means "controlling" (i.e. "using" without the ability to "control" will sooner or later lead to a loss of the ability to "use").

Your comment doesn't make any sense, in this context or otherwise.

I thought it made sense.

I know lots of non technically inclined people who use software every day. Most of them have no idea what free software is. If you asked them, they'd say it's software they don't have to pay for.

If you explained it to them, they wouldn't care. They neither care about nor want the control and freedom granted by free software, nor are they going to start wanting it even if we spell out for them why it would benefit them. They'll take it if it means no effort or sacrifice on their part, but they're not going to ask for it or advocate for it or fight for it in any way.

What the grandparent was saying is that the vast majority of people neither care about nor want the additional control they'd have in a free software ecosystem. And I think he or she is correct.

It's not just free software, but the ability to control which set of binaries you use. The final straw that pushed me away from iOS devices was when the ability to pull and back up apps from devices was removed in iOS 8 (IIRC). Case in point, I had been holding out on a particular build of the official Twitter app for iPad (v4.3.2) for some time, as I detest the subsequent major updates for being ugly and hard to use. From that point onwards, if I lose that signed copy stored on my computer it is goodbye forever, since I won't be able to pull it from my tablet if I wanted to.

Maybe I overreacted, since there is a good chance that I was in violation of whatever EULA I had agreed to, or the app might be broken by another random update, but enough was enough.

I interpret "that users want" to be a typo of "what users want".

Correct. Fixed, thank you.

While I agree on the value of libre software, I think we should not exaggerate what's happening here.

1. Most people are not aware there is such a thing as libre software, let alone why it would benefit them. Most people therefore will not channel their frustration at this move into a demand for libre. In fact, most people won't even realize Apple did anything. They'll blame the original developer.

2. In Apple's view, if there is 32-bit software people need or want, some developer will fill that gap. Apps to them are easily replaced and ephemeral. That is not necessarily wrong or in conflict with the view that all software should be Free.

So I don't think this decision matters much or will have much consequence. But it still is regrettable.

You don't really need copyleft protected software. The BSD community is very much alive.

The important point is to actually use the type of software you care about. If you care about copyleft, use as much copyleft software as you can; if you care about BSD, use BSD systems.

In particular, if an open source version is available but you take the closed source version because it is somehow nicer, you are sending a message that you don't actually care.

And the only way we can keep open source alive is if we care. Just using it helps. Reporting bugs helps.

Apple is focused on offering a secure, fast, modern platform, and they keep moving it forward. These dusty apps aren't being banned, the authors just need to recompile them and re-release them. Blame the authors for never maintaining, not Apple.

Android lets you and virus makers and scammers do whatever you want. Why not focus on that, if freedom is a bigger end goal than quality?

For most users this is an annoyance, but not enough of an annoyance to turn them on to free software ideology. It's hard for me to imagine what sort of closed-source irritation could outweigh the horrendous irritation of administering (and, in fact, just using) most free software.

The only user-facing freedom in free software is the ability to get and redistribute the software for free. But that's not what free software is about. For all the other freedoms you need to be able to program, which makes you a developer.

Imagine your business depends on it. You can hire someone who works on it. Try that with proprietary software.

Then, as a business, you: A) buy a newer app that does what you need it to do, B) pay for software support, or C) develop your own version.

Hinging one's business on software whose future isn't so certain is a potentially stupid risk.

there's also option D: pay the developer to continue support.

Sadly there are ways for monied interests to hijack the direction of even that kind of software for their own goals.

One play we are seeing right now is code churn: get developers you payroll to churn out so many changes in central parts of the stack that people either have to accept your goals or fork the whole stack.

It will take very long at the least, and probably will never happen, unless we change our own nature as a species. This obvious fact alone is a hint that FS is invalid as an ideology-for-everything. Why advocate at large scale, then?

unless we change our own nature as species

says who? libre software is only uncommon for consumers. Google, Facebook, Amazon, Red Hat, Oracle, et al, all depend on and contribute to libre software. a lot of other companies develop libre software and manage to turn a profit. personally, i have used libre software to build production systems at every step of my career.

That was in the context of "to enable users to control their technology" and "software people want/need, that they are no longer permitted to use because Apple said so". It is about consumers.

How is this different from console manufacturers not making their software backwards compatible?

If you want to preserve old iOS software that you bought, do a local iTunes backup and your apps are on your hard drive. For the foreseeable future, you will be able to buy used 32-bit iOS devices - just as people buy used consoles to play old games.

That's the worst case; the best case is that Apple keeps allowing older devices to run older versions of purchased software.

I recently charged up my 1st gen iPad (last update 5 OS versions ago) and I was able to download older versions of apps -- Hulu, Netflix, Crackle, Plex, Amazon Video, Kindle, Spotify and Google Drive -- from the App Store.

The iPad 1 is too old and too low on memory (256 MB) to run Safari without crashing constantly, but the apps I mentioned still run well, as do the built-in Mail and Calendar apps.

It works with Bluetooth and Spotify doesn't force you to listen in shuffle mode on tablets if you're not a subscriber.

Anyone have any examples of a "seminal" iOS app which is still 32 bit?

You won't hear any good answer because there isn't one.

Who are you or anyone else to tell me that the app that I paid for is not "seminal" and therefore Apple gets to remove my ability to use it?

You might suggest that I don't have to upgrade iOS. But I have two responses to that. First is that Apple has begun popping up frequent nag dialogs on the devices telling you to upgrade. Second, upgrades to iOS are the only way to get security fixes. It's one thing to give people a choice to opt out of the newest UI style vs. running older apps. It's something entirely different to require them to forego security fixes. Security issues should by rights be considered defects in the products, with Apple being given a choice of repairing the defects on every platform the defect exists on, or buying back the old devices and paying the costs of switching for the people who bought the products in good faith.

It's bullshit that fully functional hardware is completely unsafe to use because Apple chooses not to fix security bugs in the iOS versions that run on those devices. E.g. the first generation iPad is not safe to use online because Apple has abandoned support.

Why do you care about security fixes when updating the OS but you don't care about security fixes of apps? If an app is 2 years old it is probably using libraries with security issues. I would blame the developer for not updating the app, not apple. (I am saying this as an app developer).

Not all apps involve connecting to an arbitrary set of network services or provide a listening port as an attack vector.

Regardless, these are not exclusive values. We don't have to have one or the other. If Apple ships a device that is defective, it's bullshit for them to say "that piece of hardware is two years old, it's not going to get fixed."

Ask yourself how many of the goods you have are less than two years old. I'm not saying they should bring every feature from all later releases to the older devices, but actual bugs should be fixed for longer than two years.

I don't understand your comment. I am saying you should have an updated OS and updated apps. Apple is in charge of updating the OS, and the app developers of updating their apps.

I'm saying that I should have a choice not to upgrade to the latest features -- including the dropping of support for 32-bit apps -- without having to give up security fixes, which are fundamentally defects in the product.

Imagine if you bought an mp3 & aac player, amassed a collection of mp3 formatted audio, and then the vendor said "oh, we have to do an update because if we don't the player will explode. and we're removing the ability to play mp3 formatted music, but that's ok, you can play aacs." No one wants their audio player to explode, but by the same token it was sold as a device that plays mp3 and aac.

You can choose not to upgrade, but then you will have security issues.

It is very hard to offer support for legacy systems. You shouldn't expect more than two years of support. Software evolves really fast; companies cannot sit and wait. They will lose their hype and money.

You are giving the developer a free ride for never maintaining that app you paid them for, and blaming Apple for greatly improving the operating system?

No. Your sentence presents a false dichotomy. There is nothing inherent in software design that binds security fixes and new features. I get it, no amount of software updates will put a camera in a first generation ipad, but there is absolutely nothing about the hardware that prevents the correcting of buffer and stack overflows.

No, no they don't. Any app that's still 32-bit hasn't been updated in almost two years. How crucial could it be?

How about an app that controls stuff, where the iPad (or iPod) is just used as a "cheap" HMI? For example, iPads are used in stage control with dedicated apps.

The hardware, which the app controls, does not change. But the iPad itself has to be replaced with a newer one, or eventually you can't buy old ones at all. Think about building control, which is built for 20+ years.

If you're using it in that situation and you're blindly updating the OS, you're doing it wrong. You have a very explicit dependency, the application. The OS shouldn't get touched unless you know the application is supported by it.

We have similar setups at work for our test stations, though with Windows hosting the interface application, essentially replacing walls of control panels with one application and computer. We do not push OS or major library updates to those computers until we have verified that the application will either be unaffected or has been updated to correct whatever issues the new OS or library introduce.

What if the hardware fails and you have to buy new hardware? Not a problem with Windows, but you can't downgrade iOS.

That should be a major consideration before purchasing it, then. We have dependencies on VAX systems, so I'm fairly well aware of this risk. We work deliberately, now, to future-proof our systems. Depending on a very closed platform like iOS and a specific iPad version is the antithesis of future-proofing and any risk analysis should reveal the pitfalls. If someone still chose to use it, they have little ground to complain about being unable to replace that hardware/software easily in 3-5 years, let alone 20.

Field Runners

Cooking Mama iOS

Wolfenstein RPG

Those were just a few I tried.

Think of all the web sites optimized for IE6.

If those apps are so seminal, let's hope they are open source and available for study, porting, and upgrading.

Those websites should still work perfectly well on modern browsers (unless they use ActiveX components of course).

Said software can still be run using emulators, can it not? There is lots of (well, some) potentially useful software out there that only runs on machines no longer in production. A good solution is also to make source code freely available, then the best programs will not die.

I happen to know someone who had managed to build an emulator for iOS, but it could only emulate one particular old device (the key thing to understand is that we had a bootrom exploit that let us dump a lot of the code and encryption keys from these older devices). The issue with Apple goes well beyond distribution of apps: they also control the hardware, and they use that control to shroud the software; and unlike with a typical game console, there is a ton of software running on the device as part of a network of dependencies.

But like, we can go further: as it stands, without exploits for security vulnerabilities for these devices, it isn't even possible to get a copy of the apps to run in an emulator, due to the as-yet uncracked DRM used by Apple (which may or may not use hardware AES keys; I still haven't figured this out). So even if you had an iPhone emulator (and you don't, and might not for a very long time, if ever), you also need to obtain an unencrypted copy of Trainyard (the app in the banner image of this article) to run on it, as if they really are using hardware AES keys for that, your emulator is going to be missing those.

Essentially, I would argue that we are moving into a post-emulator reality. It was really easy to write a Nintendo emulator, and it was sort of OK to write a Nintendo 64 emulator. The existence of Wii and PS3 emulators is only due to bugs in security of systems, often the kind that are caused by people too often using unsafe languages like C. Imagine a world where people with intent to lock things down and the experience to know how to do it well are also coding in secure-by-default languages: you get to a point where even a scanning tunneling microscope isn't enough to get you access to the code either from the console or from the game.

Imagine a world where people with intent to lock things down and the experience to know how to do it well are also coding in secure-by-default languages: you get to a point where even a scanning tunneling microscope isn't enough to get you access to the code either from the console or from the game.

This is why I've always found these "safe" languages and the general security trend rather worrisome: they can effectively secure our devices against attackers, but also against their owners. Personally I think these issues need to be solved before advocating for these more secure systems, as otherwise we'll just be locking ourselves out.

I've been looking at the FairPlay DRM this week. I followed the path from the memory pager, through IOTextCrypter et al, through to fairplayd via com.apple.unfreed IPC. Static analysis of that binary looks impracticable (a lot of effort appears to have been made to ensure this), but have you considered dynamic analysis using Unicorn?

> moving into a post-emulator reality

Also known as the death of the general purpose computer. Free software advocates have been warning about this slow-rolling catastrophe for ages.


So... we have to wait for quantum computers to break the AES keys? :)

Quantum computers only give a quadratic (Grover) speedup against symmetric-key encryption - AES-128 still requires ~2^64 quantum operations.

Such is progress. And there is plenty of benefit: a cleaner, saner iOS, and software that's been updated can take advantage of the latest Apple technology.

Perhaps it would be more worthwhile to identify the apps/frameworks you value and bug their authors to modernize them to meet current technological standards. It's not like Apple is springing this on people. They're giving them ~18 months of warning.

How can a user tell whether an iOS app is 64-bit? I use some old apps that are no longer maintained and worry that they will just stop working.

If you are on iOS 10.1 or later, you should get a dialog box that claims the application "may slow down your phone" (this dialog's appearance is remembered, and I believe it only shows up occasionally per app).


Is there any way to tell before you buy an app? I just bought one yesterday, actually paid money for it, and it gave me that warning. While six months is a long time, if I buy it the day before the cutoff, I'd be pretty disappointed when I lose access 24 hours later.

I also wish there was a way to see which apps support retina displays before you buy them, I've been tricked a few times.

Ask the developer if they are going to fix them. If not, get a refund. You can do it through your iTunes account or directly with


Say product didn't work as advertised.

> Is there any way to tell before you buy an app?

I believe that most apps built to support "iOS 8 or later" (this is shown in the app's Information section) are 64-bit.

> While six months is a long time, if I buy it the day before the cutoff, I'd be pretty disappointed when I lose access 24 hours later.

I just looked it up, and Apple has been asking developers to build against the iOS 8 SDK for 2 years now:


Ah so that's what that warning is about.

There is a warning, when you first open the app. The warning is that the app developer should update their app. It started appearing either in 10.1 or in 10.2

> is the kind of thing we probably need to solve using regulation

No, no, no, no, no. This is absurd. Government has absolutely nothing to do with this. This kind of "regulate everything that exists" attitude is why we have some of the problems we do today. You can't go up to someone or some organization and force them at gunpoint to do something like this by law.

I'm okay with Apple switching to 64-bit only. I wish Google did that a while ago, too. Now, the vast majority of low-end phones use Cortex-A53 (64-bit) chips anyway.

However, it's frustrating to see them prioritizing this move over the HTTPS-only one, which they've recently delayed. I'd much rather they push that through, whether developers like it or not, than this.


This is the price of using Apple and Google gardens. Would you like reasonable freedom? Build a web app.

I would be very hesitant to legislate Apple supporting 32bit apps. That sounds absurd on its face.

Software enthusiasts and power-users will still find a way, whether with unlocking, modding, or emulation. Communities will surely spring up that cater to those looking for old iOS / Android software and sandboxing tools will allow these to be safely run. Do not fret :)

It is ... the kind of thing we probably need to solve using regulation

Apple already have a technical solution: bitcode.

In the future, this should mean the App Store can automatically recompile old binaries for new CPU architectures.

No. It is not that abstract or general. It enables tuning, but several aspects of the target architecture are assumed in the bitcode. It allows for CPU-aware optimization passes, such as - for example - reconfiguring math to use faster instructions when possible, but it is specifically not an abstract machine-independent bytecode.

True, bitcode is not completely abstract, and things like calling conventions are baked into the bitcode.

But it is abstract enough that Apple could use it to recompile binaries for, say, a new 64-bit CPU architecture.

Yes, they might have to jump through a few hoops to make it work, but given the control that Apple have over their platforms (they can define things like calling conventions) it shouldn't be too difficult.

Apple did JIT compilation of binaries during the PowerPC -> Intel transition years ago. Recompiling bitcode on the App Store would be easy by comparison!

Apple's PPC emulation was carried out at the process level. That means that not only the program was PPC, but also all libraries it used. And that means that Apple had to keep shipping PPC versions of all their system frameworks.

When it comes to 32-bit ARM, that's basically the situation Apple is already in. It's just that instead of emulation, the 32-bit code is run natively by the CPU.

They could replace that with an actual emulator, but it would gain you almost nothing. 32-bit support in the CPU doesn't cost much. The real reason Apple wants to dump 32-bit support is so they can stop shipping 32-bit versions of all their system libraries. Doing that will save disk space, memory, and development effort.

It's really, really hard to create an emulator which will run a 32-bit program that talks to 64-bit libraries, and bitcode doesn't help.

I was not suggesting that bitcode could be used for 32-bit -> 64-bit translation, but rather that bitcode means that Apple may be able to avoid similar situations in the future.

In any case, these (32-bit only) binaries predate bitcode by several years, so it's not relevant.

Lattner disputed Apple's ability to recompile bitcode into alternate architectures in his recent ATP interview.

Lattner said that bitcode is "not a panacea that makes everything magically portable"[1], which of course is true.

But that doesn't mean it couldn't be recompiled for different architectures, given reasonable bounds like both 64-bit, same endian, similar calling conventions, etc.

I'm not suggesting Apple is considering that, just that it would be technically possible if there were ever a compelling reason.

It's unlikely they would ever move away from ARM on iOS, but on something like the Apple Watch, with its highly custom SiP and ultra-low-power requirements, you never know...

[1] http://atp.fm/205-chris-lattner-interview-transcript/#bitcod...

He actually references that pretty specifically saying that indeed it can't be used to move 32bit -> 64bit [0]

> Bitcode is not [12:30] a magic solution, though. You can't take a 32-bit app, for example, and run it on a 64-bit device. That kind of portability isn’t something that Bitcode can give you, notably because that is something that's visible in C. As you're writing C code, you can write #ifdef pointer size equals 32, and that’s something that Bitcode can't abstract over. It's useful for very specific, low-level kinds of enhancements, but it isn't a panacea that makes everything [13:00] magically portable

[0]: http://atp.fm/205-chris-lattner-interview-transcript/#portab...

I suppose I also don't comprehend the moral weight of the responsibility for running 32-bit code. What sort of problems do you foresee happening?

Did we read the same article? I am seeing a whole lot of Cunningham's words about how this move by Apple is a good thing...

Even without having read the article (the site is down for me) I can say this is a wise decision.

Currently the 64-bit ARM CPU can also run 32-bit code. To support this it contains a 32-bit compatibility layer that also exists in a few other places, such as caches. By removing that, Apple will be able to:

1. Shrink the CPU, hence making room for more cores or bigger caches.

2. Reduce power usage by removing a significant number of gates.

3. Simplify the OS code significantly, which will probably improve performance and security.

Yes, it would be a smart move if they can actually drop the 32 bit mode from the CPU/SoC but I wouldn't be so sure about that. Having the 32 bit mode may still be required for the firmware/bootloader/OS early boot environment.

The 32 bit mode may be partially or fully implemented using microcode and the actual silicon area that could be saved by this might not be that big.

As for the OS and firmware: yes, they could be simpler, but most of the code is probably there already. Getting rid of it would mean more engineering work, not less (in the short term). On the other hand, removing those parts would reduce the maintenance burden in the long run. And Apple doesn't care about supporting old hardware anyway, so they might as well drop it.

I work with ARM SoC's but the above is no more than an educated guess as I don't know details about their CPU architecture or firmware/bootloader/OS software, take it with a grain of salt.

Presumably they can write their firmware using 64-bit code too? Is there a reason it would still be 32-bit?

Yes, iOS 64-bit devices use a full 64-bit bootchain

If it's all their code, yes they can do it (it might be a lot of engineering, though). But it might contain 3rd party code or dependencies to something that's non trivial to port to 64 bits.

If you go look at the boot procedure of a modern ARM SoC, it's pretty complex (also it's different for every SoC, there's no governing standard like the "PC is for x86"). The CPU may go through several operating modes and the procedure of booting up the cores is non-trivial.

I do not think it's impossible, but the benefits for all that engineering work may not be as great as GP suggests.

They are in a pretty unique position as they control all the hardware and the software, though.

There is UEFI+ACPI on ARM also, that's how Windows on ARM works. Windows on ARM devices use a far more desktop-like bootchain than anything else

Yes, but that's in addition to the dozens of proprietary boot procedures commonly used by Android tablets and smartphones and other ARM devices.

Actually there are more or less only two big ones used in 2017: the well engineered open source boot solution that ARM started and is now backed by Linux and that piece of garbage that Qualcomm uses.

Hopefully next year this time there will be only one.

UEFI with ACPI on ARM was started by Microsoft first, but is now the ARM standard :)

I'm not sure that this means what the article says it means. Apple was selling the iPhone 5C in India until less than a year ago. Dropping support for the new OS that soon would be uncharacteristic for the iPhone.

Instead, they may simply be dropping support for 32-bit apps on a 64-bit CPU. Having to support 64-bit and 32-bit apps on a single device forces them to ship two versions of every shared library, and is probably annoying for them in various types of interprocess communication because, for example, CGFloat and integer types are different sizes.

My guess is that they will release a 32 bit version of iOS and a 64 bit version, and each one will only run apps for the native processor word size.

They did something like this for the first-gen iPads. Two years and support was dropped. I believe they've also done it for Mac hardware too. So it wouldn't surprise me at all if they did it for an economy model of the iPhone.

It had 256 megs of ram. I'm running the latest OS on an iPhone 5c that is 4 and a half years old.

The pace of change in phones/tablets has been tremendous since the iPhone and iPad were released. Current versions are more than 10x faster and ship with much more ram. There aren't infinite resources at any company (unless you want the army of programmers failed model from IBM of yore), focusing limited engineering resources on most important issues is crucial to making progress.

You got an iPad that worked well. They stopped enhancing it after 2 years. When did your car maker stop enhancing your car?

The trouble with the car analogy is that internet-connected devices need constant security updates, whereas cars don't. A car built ten years ago is as safe as it ever was, minus whatever problems arise from wear and tear. A device running iPhone OS 1.0 will probably not survive long on the internet of 2017.

So developers have roughly 9 months to update their apps or they won't run on 5 year old hardware? This doesn't sound as devastating as some of the tweets and headlines would imply (that seems to be common these days).

> This doesn't sound as devastating as some of the tweets and headlines would imply (that seems to be common these days).

Most of these applications are games, the developer may not even be alive anymore at this point and there's no ROI in updating working complete games just because they are old.

Your basic actual game (not the F2P/IAP garbage) just creates an OpenGL context and handles touch input; that stuff only breaks when the OS itself breaks a core API. I've got many games from my early 3G/iPhoneOS2 days which still run fine on a 7 running iOS 10 (though not all of them).

If only the authors of abandonware would open source their projects.

Complaining about one closed source company that is actively growing and managing the platform we rely on for increasing performance and security because other closed source companies abandoned their software seems rather confusing to me.

Consider the following situation:

* an application was an early 32-bit iOS port; the codebase was not 64-bit clean at the time of the port

* elsewhere, the world moves forward: new DirectX APIs for the desktop, new graphics and asset pipeline techniques, etc. Development, accordingly, moves on; the application ships on other platforms, and large reusable sections of the code - the engine - have been updated, rewritten, made 64-bit clean, used in other applications etc; however none of this work was migrated to the branch for the original iOS port: at the time, that had shipped and was stable, large sweeping changes are expensive and risky, with (at the time) little visible benefit for an existing working stable iOS application to justify that.

* 2015: Apple pulls the rug out from under updates by mandating 64-bit support. The developer is faced with a stark choice: pay the porting cost to iOS again for the new codebase, or redo the 64-bit cleanup work from current trunk on the ancient branch (a cost comparable to porting again, at this point), or suck it up and say there won't be any more updates. Since the app is several years old and sales are down to a negligible trickle, the third option is picked - the other two are not affordable.

* 2017: Apple pulls the rug out from 32-bit-only apps by releasing an OS update that no longer runs them. Users that bought the application can no longer use it. The developer can't afford to update it or port it, but is (a) still selling it on other platforms and (b) is still using the engine in other applications that are selling, so would be crippled by making it open source.

Each individual decision is sensible at the time, but it all adds up to checkmate.

I'm personally aware of at least three developers that are in this trap right now.

If the developer is dead, what man alive now could ever figure out how to open it in Xcode 8 and click "compile"?

Many other requirements have been instituted from iOS version to iOS version that require substantial work to bring old code up to current App Store standards. Among them are things like screen size support and retina-resolution assets. Meanwhile, many of these apps still work perfectly fine on iOS 10, even if Apple won't accept those submissions today.

Oh dear, note that while your interpretation makes perfect sense upon re-reading my comment I really meant "developer" as the developing company rather than the actual person, and "dead" as in folded/gone, rather than the actual physical death of a living human being.

Not really, the iPhone 5C was sold worldwide up until September 2015 and was sold in India until February 2016 so anyone still using it will be stuck.

The article makes a claim that Apple will drop 32 bit in the OS that Apple has never said. I'm fairly certain that is wrong.

All Apple has said is that it will remove apps from the App store that don't support 64-bit.

Journalists shouldn't make stuff like that up.

Yeah, that changes everything. I’m all for Apple forcing developers to include 64bit support in their builds.

I agree totally; the point is cleaning up the App Store. Most of those legacy apps are already buggy or consume too much power.

Update the apps or they won't run on anything BUT 5 year old hardware.

> So developers have roughly 9 months to update their apps or they won't run on 5 year old hardware? This doesn't sound as devastating as some of the tweets and headlines would imply

Older apps that users have paid money for a while back, that work and are in use right now, but that developers haven't received much income for in years since everyone that wanted them has already bought them, will stop working for users that upgrade their device.

Cue user complaints, but no extra money coming to deal with them.

Apple have unilaterally decided it's time to burn something; developers have roughly 9 months to choose whether it is their time and money or their users' possessions that get burned.

This really, really sucks for people who paid good money for 32bit apps. Or even expensive external hardware items that require ancient apps to be useful. Even if the apps are removed from the app store (and Apple has gone on a rampage removing old apps), existing customers always could keep the apps installed until now. If apps stop working and disappear, they will lose a whole lot of the trust that people had in the app store. I'm also curious about how this will affect the usually impressive rate of upgrades to the latest iOS version among active app-purchasing users.

The fact that those apps are still 32-bit means they're unmaintained. The fact that they're unmaintained means that they're likely to break at some arbitrary OS update anyway. Even common apps like Tweetbot, by reputable developers, will break on a new major version, so your app that hasn't been updated in years is probably going to die off soon anyway.

The only way to make sure to keep those apps open is to keep from updating iOS, and if you do that this doesn't affect you anyway.

There's no reason apps should break on arbitrary OS updates as long as they are well-behaved and stick to documented UIKit APIs and so on. I have apps from 2009 that work perfectly well for the task they need to do. The companies have long since folded and closed down their app store account. Others have similar apps, perhaps controlling $1000+ worth of DVR equipment. I think (and hope) Apple may be underestimating the effect of a vocal minority still requiring these apps, and the chilling effect this may have on paid apps or even free apps for the made-for-iphone hardware program in the app store in general.

And a lot of games hardly even use UIKit, relying only on OpenGL ES and some input handling logic: these are the kinds of apps I would expect to essentially never break (and would usually assume a platform bug if they did break).

I run iOS public betas and many games break during the beta period of new iOS versions. Even highly rated apps like Rayman Run have been completely broken for weeks for me: the audio cuts out, touch input is not accepted in some menus, etc. At some point I couldn't even start the game. Then an update arrives and everything is fine.

That is only true if Apple doesn't change behavior or deprecate APIs. They don't do that often but they do do that. (Yes, I said doodoo).

This is not a counterargument; it is, in fact, the definition of the problem.

Removing them from store doesn't mean you can't still download them. Apple has always been good about making sure you can still download stuff you've bought.

If it's a fear, don't upgrade beyond iOS 10. But Apple HAS to start removing all those crappy old apps that developers aren't maintaining, zombie apps are near useless and are misleading to customers.

Which apps would those be? Also, it will have zero effect on iOS upgrades, unless the number of "users with expensive external hardware items that require ancient apps" is in double percentage digits. In reality I guess they don't even reach a thousandth of a percent.

> Ever launch an app on your iPhone and then get a pop-up warning that says the app may slow down your iPhone?

Noooooo :( Please say it ain't so...

"PDF Highlighter" by omz:software is one of my favorite apps on the iPad. The only PDF app that doesn't have a million things I don't need, and it has a beautiful summary of all the highlights you make in your PDFs... plus Dropbox... plus OCR. Oh, and highlight colors that don't look like they were put in by a 3-year-old, jesus. What's up with all those annotation apps using FF0 yellow or F00 red? What happened to design? Why do all these PDF apps have to suck so bad as a reading experience?



For Apple this must be a dream -- drop everything that's explicitly 32-bit. Make some compiler improvements easier, possibly drop some silicon costs, remove lots of #ifdef __LP64__, simplify the kernel, etc.

Don't forget that the newest Apple Watch is 32-bit.

Apple Watch (from a development perspective) runs bitcode, not armv6/armv7/armv7s.

32bit bitcode. You can't go from LLVM bitcode to both 32bit and 64bit as far as I understand. You need "bit-specific" (architecture-specific) bitcode.

"This includes the iPhone 5, 5c, and older, the standard version of the iPad (so not the Air or the Pro), and the first iPad mini."

This implies there is still a "standard" iPad for sale. In fact, the iPad Air was simply the successor to the iPad 4, and the iPad Pro was the successor to the iPad Air 2. There is no "standard iPad" on sale anymore, and there hasn't been for years.

This won't directly affect any device Apple has sold in some time. The iPhone 5C is the last iPhone they released that was 32bit, and was likely to be dropped for software updates with the release of iOS 11 anyway.

In short, there's nothing to worry about, except if you want to run an app that hasn't been updated in 2+ years

My dad still uses an iPhone 5. Will he be unable to download any app updates until he gets a new phone?

I see no reason Apple would prevent people from continuing to upload dual 32 and 64 bit apps for now, as they currently do.

Depends on individual developers and how easily Xcode 9 will let devs target older devices.

For example, the iPad 2 can't upgrade to iOS 10, but plenty of apps still support iOS 9.

The iPhone 5 is four and a half years old. He'll be fine.

• He probably won't be able to update to iOS 11.

• He'll still be able to run iOS 10.

• He will still be able to download new apps.

• He will still be able to download his old apps, even 32-bit ones, because he owns them.

Yep. Nor will he be able to get the latest version of iOS. So just like any other upgrade that removed older devices from the supported list.

Based on what evidence? Apple still accepts 32-bit slices if you want to submit them. 64-bit slices are required.

I'm upset that all of my Llamasoft/Jeff Minter games will go by the wayside. Minter did not find it profitable to make iOS games, so he let his Apple Developer subscription lapse; there won't be a 64-bit recompile of Gridrunner for iOS.

I might have to get an Android device just for classic games, like his.


I'd be mad at Jeff Minter. Not Apple.

> Since a 64-bit processor processes more data at a time than a 32-bit processor, you get faster performance.

Is this true? If I recall correctly, 32-bit to 64-bit doesn't always necessarily mean "end-user performance improvements."

In x86, there is even an ABI for using 64-bit instruction set features (like more registers) while sticking with 32-bit pointers to keep the memory footprint small and fast. (This limits you to 4GB of ram per process, but there are a lot of applications where that's not particularly important.)



x86 is a special case because 32-bit x86 was exceptionally register-starved, causing a performance penalty.

(That said, the "x32" ABI is not widely used)


Technically it doesn't.

In practice only two architectures matter: x86 and ARM. Both brought cleaned-up ISAs, expanded the number of general-purpose registers, made some features required, and made other improvements during the transition.

So for all practical purposes: YES, the 64-bit transition has actual real benefits to end users besides the larger address space.

Assuming you don't need the extra address space, what are those? End-user improvements that is... Performance wise, I doubt that the performance gains from extra registers make up for the higher cache miss rate from the larger code and data sizes.

Assuming the most performance intensive tasks most users do is browse the web, it's probably a win. Webkit's garbage collector is conservative-- it scans all memory looking for possible pointers-- so it runs more efficiently on 64-bit architectures where valid pointers are unlikely to randomly occur. Lots of JIT tricks like tagged pointers really work best on 64-bit architectures. https://webkit.org/blog/7122/introducing-riptide-webkits-ret...

It's not just extra registers and address space. ARM64 is a large-scale restructuring to better enable modern processor design, including things like removing barrel shifting on every instruction (expensive and not very useful) and removing predication on every instruction (expensive and makes OoO machinery slower). http://stackoverflow.com/a/26841196

On x86, it's just the extra registers, and indeed 32-bit can be faster. But AArch64 is essentially a completely different ISA from AArch32. Here's an old benchmark from Apple's first 64-bit CPU that shows a significant improvement on benchmarks in 64-bit versus 32-bit mode on the same device:


Of course, this may be an idiosyncrasy of that processor implementation, or it may not generalize well to real applications which tend to stress the instruction cache more...

Being able to rely on at least SSE2 is a big win for 64-bit on x86.

Technically you could use 32-bit addresses on a 64-bit processor and not have your data be any bigger, but most languages don't offer a good way to do this (IIRC the HotSpot JVM's UseCompressedOops option does something along these lines).

This is actually implemented on Linux/glibc: https://en.wikipedia.org/wiki/X32_ABI

You can compile C/C++ programs using this ABI, and get all 64-bit benefits without 64-bit pointers.

>higher cache miss rate from the larger code and data sizes.

Can you explain how these are related? What do you mean by "larger code"? I'm assuming you're talking about the binaries, rather than actual code.

I'd agree with you on the large data size but I really don't think 32->64 bit increase will make much difference overall unless someone is using a lot of existing numerical data. Most of the data is taken up by multimedia resources and this won't really be affected by the architecture.

More registers are always good, unless you have a terrible compiler. And since Apple uses LLVM and actually takes care of the compiler backend, I don't really see caching problems. AFAIK there will be more registers to store data in, so caches will actually be used less than usual.

Pointer size doubles, so any data structure (linked list, tree, etc, etc) which uses raw pointers directly will need more memory to store the same items.

The normal way of negating this increase is to use uint32_t offsets into a master array of the items the pointers represent, which reduces the size again and has a side benefit of possibly making the items more coherent (localised) in memory.

I believe there are actually some cool things they do with the extra space in the Objective-C runtime. It's been a while so I may be mistaken but I think they store the reference count in the pointer to avoid a lookup to a separate table. There might also be some trickery in storing NSNumber values in the pointer itself as well.

EDIT: Grammar

As the 64-bit CPUs used on the iPhone actually only have 33-bits of address space (undermining the benefits of having more address space ;P), Apple does use the extra space in a pointer to store "tagged" values in some cases, but for the most part this can be seen as a "mitigation": as the size of a pointer is now 64-bits, and as the alignment of a lot of things is now 64-bits because of that, processes are larger and heavier than on 32-bit systems, and thereby require more memory bandwidth. Sharing some of the bits of a pointer is essentially required to win back this lost performance. (Another great option is a 32-bit ABI using the 64-bit instruction set, as with x32 on Intel, but Apple did not go that direction.)

Improved security with care for performance, little things like SGX, MPX, IO and GPU DMA in hypervisors, for example.

How about coming up with an example for ARM, one that doesn't come from using the new chipset but which requires the new instruction set (and particularly one which somehow precludes supporting older applications). AFAIK, ARM64 maybe offers some minor benefits to floating point operations, but mostly just provides access to some more registers... an advantage which sometimes matters but is often so dubious that on 32-bit ARM a lot of performance oriented code would be compiled to Thumb-1 even though it had half as many registers, as the code would load faster and take up less space in the cache.

Parent stated:

> In practice only two architectures matter: x86 and ARM.

So I replied about x86, because that is the architecture I know best.

> So for all practical purposes: YES, the 64-bit transition has actual real benefits to end users besides the larger address space

This is technically true, but it's a very bland statement vs "64-bit yields end-user performance improvements" or even "64-bit is a net positive".

It's false - if you want to process many bits at once, you use SIMD instructions, which you have on 32-bit too.

Generally, other things equal, going from 32 to 64 bit word size makes things slower because pointers suddenly double in size, and you can fit fewer program objects in your caches.

In practice there are a slew of miscellaneous improvements in the new 64-bit revision of an instruction set that makes up for some of the slowdown. This is less pronounced on ARM, but was big on x86 because a big flaw, the register count, was fixed.

However, ARMv8 also increased the power of the SIMD portion of the chip, going from a 64 bit wide set of registers to one that's 128 bits wide. So SIMD-heavy code will see a decent speedup because you can process more data per clock.

128-bit NEON has been available on 32-bit ARM for a long time, since Cortex-A8 (2005) or so.

Not completely true, AArch64 performs better because it has more registers (means less fetch from memory), and more advanced instructions. Here is an article about this:


Yep. If it's just 32->64 there may even be a slow down as you need to pull bigger things through the various busses. However, Armv8 also brought a huge improvement to the instruction set, so recompiling with the proper optimizations will bring performance increases in a lot of programs.

Sure, but don't forget that there also exists aarch64 with ilp32 (int, long and pointers are 32bit) as opposed to the default lp64. Which is basically the new instruction set but restricted to 32bit memory space. In fact some of the specint benchmarks are faster with ilp32.

It is a big fat "it depends". If you don't need the extra data width then in theory 32-bit code could be faster as smaller units are being loaded from cache or main memory. But depending on your architecture there could be alignment related performance issues that 32-bit instructions will run into but 64-bit ones will not because the memory fetch portions of the processor and related subsystems are optimised for 64-bit alignment.

A 32-bit application is going to be older too and won't be using non-64-bit-specific enhancements present in the processor families that may help it more than being able to deal with bigger integers. More registers being available for instance, additional SIMD instructions (for doing more with collections of smaller values) or wider versions of existing SIMD instructions.

There may be other architecture dependant factors to consider too.

Other considerations come from running 32-bit apps on systems that support both 64 and 32: you end up with two versions of some libraries in RAM, which could give a one-off hit upon loading an app but also means more memory pressure, so other stuff could be pushed out and will need to be reloaded from slower storage when next called. That won't matter for libraries that only one app is using anyway, but for shared system libraries it could be significant. Also, if the 32-bit libraries are really just a translation layer calling the 64-bit ones, then there is extra work per call that isn't related to how many bits you can process in one instruction. The extra memory pressure may slow down other apps when you switch back to them (via user action, or due to them having a background service portion that needs to respond to an event) as more pages need to be reloaded from slower storage instead of already being in RAM, meaning the slowdown the warning is talking about may not be immediate or affect the current app at all.

"64-bit gets faster performance because you can do twice as much in one step" is a useful explanation for the general public, one that is true (or at least not completely false), easy to understand, and much simpler than the larger collection of real details. A bit like in GCSE physics when you are told "yeah, what we said in science lessons before is a simplification and on some levels actually wrong, this is what really happens" and in A-Level physics you are told "yeah, what we said at GCSE level is a simplification and on some levels actually wrong, this is what really happens" and so on up your education as far as you take it. The simplification is needed to get the message across to those who don't need or care to know more, but doesn't hold as much water upon more detailed inspection.

I think talking about the performance hit is a mistaken way of explaining the matter to users, though, because for a great many cases on an iPhone it is unlikely to be significant, as the reporter here notes. And even if there is an effect overall, the common user will not notice an immediate effect and will assume going forward that there is no effect at all and that the warning was being oversensitive. A better way to get the message over on an iDevice is to point out that more battery power will be consumed. This is equally true (if not more so) and people are likely to care more, IME.

Arch Linux also recently announced phasing out the 32-bit distribution (though multilib will still be available).[1] Perhaps the start of a trend.

[1]: https://www.archlinux.org/news/phasing-out-i686-support/

That's a different kettle of fish.

Multilib means you can still run 32-bit binaries. Although given Linux doesn't have a stable ABI, I haven't found that 32-bit Linux binaries age well (e.g. if you run something from 10 years ago, it's probably using OSS, enjoy trying to wrap that to alsa...).

Arch is deprecating the i686 only distribution. Which IMHO is a good idea. Multilib support is generally excellent, and to my knowledge no one who is shipping i686 systems (e.g. industrial PCs with a 10 year availability) is using Arch Linux anyway.

From my experience, the aoss wrapper works pretty well. I had the UT99 Loki/UTPG port running with it with no problems.

It would be good to start some kind of initiative to get in contact with developers of unmaintained apps and try to convince them to do one last build. Or to release the software as open source, or at least work with somebody to keep it going. Many will likely not respond, but if there is a chance to salvage at least some of these apps it would be a win.

“One last build” generally isn’t enough because software might have many issues, including:

— Dependencies on other 32-bit libraries that have not changed.

— Dependencies on entire OS frameworks that have not migrated to 64-bit (and Apple has at least one of these).

— Having a “plug-in” feature, whereby cherished plug-ins have not all migrated to 64-bit so the user must then choose a 64-bit subset.

— Any number of problems in the code itself, such as improper assumptions made about the sizes of data structures or creative solutions that only made sense on 32-bit platforms. After a simple recompile, the code may crash in places or consume much more memory. Floating-point values may produce subtly different results, creating issues all over the place.

It feels really weird to me that now 32 bit CPUs aren't considered good enough even for a telephone. Maybe I'm getting old.

Did you write this just to be snarky and call it a "telephone"?

An iPhone is more complex than the desktop computer you wrote this comment on and you know it.

So basically 4.5 years support max. for your devices (iPhone 5 was September 2012; iPad 4 October 2012)? Also aren't there a lot of these "old" iPads out there with people "refusing to update"?

I think technically it makes a lot of sense, but for customers this is pretty bad. Compare Microsoft's 15 years of extended support for an OS (IIRC, 5 is standard). Ubuntu LTS is 5 years. Sure, there's a difference when you're both the hardware vendor and the OS vendor, but still, I'm not sure this is a great move.

If this is bad for customers, pretty much every Android device except the Nexus must be a living hell. How many Android handsets (including "flagships") launch with an already outdated OS version and never get updated?

Even though the OS itself is not updated, most of the libraries etc gets updated via the google services package.

Not that I am defending not updating the OS, but its not nearly as bad as removing the support completely.

I still have a HTC Desire HD (bought in 2011) that gets updates in this way. I still keep it around to see how long it will live (that's one solid phone), but my main phone is a nexus

This is markedly different because most applications written for the first versions of Android still work even on the newest version of the OS. Apple is explicitly breaking 32bit applications from running on the device/OS. Remember we are talking about application compatibility which is not much related to OS updates.

Unless the developer recompiles. Which is simple to do.

Yeah, that's true, but Google is pretty good at backwards compatibility and most of the applications out there still support at least Android 4.1, released 5 years ago.

If I read this correctly, Apple will stop letting people upload and distribute apps for iPhone 5C, which is significantly newer, right?

I still use an iPhone 4s and a first generation iPad. The devices I have are in perfect working condition, and they fulfill my needs. It's not a stubborn "refusing to update", but simply a "there's nothing about an iPhone 7 that's worth $649 to me".

This shouldn't affect you at all. Your devices are 32-bit so will continue to run the 32-bit apps you have already bought or downloaded for free.

Apple keeps old copies of apps for old versions of the OS almost forever. My youngest was using my old iPhone 3GS until last September. I was still able to download an app from the App Store I bought for it long ago, but which hasn't been available to buy in the store for years and only supported ancient versions of the OS.

There's a lot about a used iPhone 6s that's worth $350 though.

They are supporting them at least 4 years, not 2. That's Google.

Edit: Well now you edited your response and mine doesn't make sense :/

4.5 years of official smartphone updates is actually considered excellent by modern standards. That's several new generations' worth.

It feels surreal to read this line of complaints about this when Apple is in such a high regard when it comes to precisely smartphone support.

MSFT can go 15 years of support because the PC's pace of change is far slower than the smartphone's. I'm pretty sure MSFT dropped 286 support long ago.

And Windows' compatibility with so much old software is a big reason why it sucks. There are too many APIs that should have been deprecated, and hacks to support specific apps that bloat it up.

It would be nice if they published a list of the software which will no longer be supported.

There is a definite opportunity for app developers here.

One of the consequences of this move will be to flush out tens or hundreds of thousands of applications from the App Store.


It's a simple matter of economics: Creating apps for iOS is, and has been, for a long time, unprofitable for most developers. Last I looked, most can't earn a living, which means they are working for far less than what they could otherwise earn doing something else.

A good deal of the existing 32 bit apps are simply going to die off as developers abandon them.

This could be good, of course, as it might improve discoverability in certain categories. Time will tell.

The first apps I bought for my iPhone were dictionaries. The owners of the dictionary material sold the rights to the app developers but didn't extend them, so the developer doesn't even have the right to publish an updated version, even if it would only be a recompile with the newer compilers.

Which means that even though I've paid for the content, now I'll have to purchase it again from another developer only because some arbitrary limitation was introduced, and even those purchases could soon disappear again.

So glad this is happening; it'll be a nice forced purge of old, outdated apps. Also, there is no good reason to have a 32-bit app in 2017...

Does anyone have a comprehensive list of the advantages (and disadvantages besides: "Old software won't work anymore.") of this decision?

Does it reduce engineering/research effort, production costs, chip size, size of binaries, duration of compilation, programming effort and improve performance: If yes, by how much?

Online banking app of my bank is 32-bit, so for me it means "don't update anymore".

Nobody at the bank could be bothered to push the "Build" button in Xcode since JUNE 2015? (when they stopped accepting 32-bit apps) Sounds like you need to upgrade your bank...

I'm guessing it's more because they want to support customers who might still be on iOS 6. The 64-bit compiler only came with the iOS 7 SDK.

Five years doesn't seem like a long time at banks, even though iOS 6 probably has a whole bunch of privilege escalation exploits that wouldn't be very nice to deal with.

Seems to be a good reason to have a mobile website as well as apps. Though I've never bothered to see if my main bank even has a decent mobile website because I always use the app. But another bank I use has only a mobile website and has no app.

How often do you upgrade your bank just because your phone asks you to?

Years ago TomTom released an update for their GPS app that wouldn't run on older iOS versions, and they had to add a warning in the description so people wouldn't blindly update and get locked out forever until they bought a new phone.

I believe there is a mechanism in place now that serves the last working version to older devices.

So the Squeezebox remote app is finally going to break. How annoying.

Say what you will about legacy support or planned obsolescence, but Apple has been constantly pushing the computing envelope exactly because they don't have to be Microsoft and get held back by older technology.

Yes, only Apple could get away with this, and it's great that they're willing to sacrifice a few short-term $ for the long-term benefit. Sure, a lot of software will become useless. Same as happened with all the Mac architecture changes over the past couple of decades. Apple developers should expect to stay on the ball, and users should expect not to start depending on old unsupported software.

If you prefer longer backward compatibility, and no rush to keep updating everything, use Windows, but pay the price in ugliness.

I'll be glad when all 32 bit platforms are out of use. There's so much duplicated code around and time wasted debugging 32/64 bit mixups.

> Yes, only Apple could get away with this

Android phone vendors drop support for their devices and software like a stone all the time, but it's just considered business as usual. Apple twitches in a vaguely unexpected way and the internet burns down in paranoid outrage. Personally, I'm going to wait until we actually find out what this means.

So, some people are talking about iOS itself possibly dropping support for 32-bit hardware, while others (including myself) are talking about issues with existing hardware not even being able to run things that will soon no longer be downloadable, and are taking this as an opportunity to lament over the lack of historical preservation of software. FWIW, you might mean the former, but I'm going to address the latter.

In one of the few really legitimately "open" things Google does with Android, they provide emulators for their devices and downloadable historic stock firmwares; and because they don't have control over their hardware, they don't have the ability to implement "actually good" DRM on their application archive files. Apple has set up an environment where when they sunset something, it will actually fade from existence: that is not the case with Android.

TIL there are still 32-bit apps for apples.

this translates to: swift will be included with the next iOS

That makes no sense. You can already develop iOS (or tvOS) applications in Swift, the runtime bits are provided by the pre-existing Objective-C runtime.

Every app that uses Swift embeds its own copy of the Swift runtime libraries, since they are not part of the system (the ABI isn't stable yet).

I wonder if they will at some point remove the crap that's needed for 32 bit mode from the die. That could shrink the die size a tiny bit.

iPhone 5 users are plentiful and will be pretty livid if they are locked out of a future major version over this issue. Apple just can't get out of their own way these days. If I were an iPhone 5 user and was forced into a twinked device like the iPhone SE or iPhone 7 I would probably leave the entire ecosystem and hundreds of dollars worth of DRM purchases behind.

The iPhone 5 was likely to be dropped with iOS 11 anyway, regardless of 64-bit support. It'll be 5 years old at that time. It will still be able to run apps that were compiled with 32 and 64 bit support. It's not necessarily going to be any different to when any previous iPhone stopped getting software updates. I don't see any restriction here on Apple allowing devs to still support 32 bit, they just have to support 64. Someone correct me if I'm wrong.

Do you have stats for how plentiful iPhone 5 users are?

The Apple iPhone 5 was released in 2012. Apple has provided those users with 5 years of updates (assuming September is when they will release the new iPhone and also update the major iOS version number).

5 years is a long time in the mobile handset business... I doubt too many people will be livid about anything.

You assume everyone purchased the iPhone 5 in 2012. The iPhone 5c was still selling new in India less than a year ago.

Also new ones still listed for sale direct from Walmart it appears: https://www.walmart.com/ip/Apple-iPhone-5C-8GB-AT-T-Locked/4...

If you purchased a new device and in less than a year support is dropped, yeah you might be livid.

Standard practice in the world of Android phones. To add even more salt to that wound the OS was probably a couple years out of date when you bought it as well.

I keep an eye on David Smith’s published iOS version stats[0] which say around 23% of his users are running a 32bit CPU. This includes iPhone 5/5C and older iPads.


"twinked device like the iPhone SE"

Yes, such a bad purchase: a far faster phone with a better camera, better wifi and cellular connectivity, and perhaps most importantly, far, far better security.

If all this does is help encourage people to get onto Secure Enclave-equipped iOS devices, it's a winner.

The iPhone SE has its deficiencies in comparison with the 5s: it is darker (contrast ratio 804:1 vs. the 5s's 1219:1), and it does not have Corning Gorilla Glass or an oleophobic coating: http://www.gsmarena.com/compare.php3?idPhone1=5685&idPhone2=...

There are literally dozens of us!

I can only presume part of a long ramp up for an iOS Macbook play.

If you have no 32 bit processes on the system, then 32-bit shared libraries don't have to be loaded (saving RAM), can be omitted from the system entirely (saving disk space), and don't have to enter into the test matrix (saving engineering time). I suspect it's as simple as those optimizations.

Yep, it really is that simple. It reduces an entire category of issues. This is also why some people entirely disable multilib in their Gentoo systems.

Also on Arch Linux, you typically only activate the multilib repo if you need one of the applications in it (most notably Wine, Steam, and a bunch of emulators).

With hardware so cheap, and given how little RAM and disk space these take up, it would be self-destructive for Apple to ostracize so many customers over something that is literally dirt cheap to accommodate.

No, the 32bit libs will cause havoc with the CPU cache. Good to drop them.

Yeah, sure, burn more customer goodwill after the last batch of MacBook Pros... Apple is in customer maintenance mode, Tim Cook especially. Technically this might make sense; just be prepared for customer backlash at a time when Apple really doesn't need any more of it.

Tell that to everyone who has trouble finding enough free space to do an iOS upgrade.

UIKit and AppKit are pretty different; I don't see them converging anytime soon.
