
Mac Apps That Use Garbage Collection Must Move to ARC - tachion
https://developer.apple.com/news/?id=02202015a
======
zbowling
To head off some of the misconceptions here, this is not about garbage
collection in general but Objective-C built-in garbage collection.

Apple NEEDS to do this for their own sanity. I'm surprised they waited this
long, actually.

It's a serious PITA to support GC and non-GC at the same time in a framework.
Having to add finalizer methods instead of relying on dealloc methods, and
dealing with non-deterministic object lifetimes alongside deterministic ones,
was a huge problem.

ARC doesn't have these problems because it wraps traditional manual reference
counting (to pick nits, it actually bypasses it when it can safely but this is
an optimization detail that you shouldn't need to worry about).

For Apple and 3rd-party framework developers, it's painful to support GC and
non-GC paths in the same code at the same time. It's not really possible to
use a non-GC-compatible framework in a GC app.

It's also impossible to use ARC and still support the GC in the same
framework. Apple has been completely _prevented_ from using ARC in its own
frameworks by the need to keep GC-compatible apps working, and making this
change will allow them to start ARC-ifying their own code.

It also means that a significant chunk of the ObjC runtime can be simplified.

There are also some great advancements in the ObjC runtime that Apple is
making (like tagged pointers and shadow objects) that the reference Boehm
garbage collector can't really deal with; these were all completely disabled
in the ObjC runtime when you had a GC app.

This is a good thing. Not many apps actually shipped with the ObjC garbage
collector and ran well.

Now you are free to use a GC for your own needs in your app. That has nothing
to do with ObjC. Feel free to add Java or Ruby or whatever language you like
to your own app and submit that to the Mac App Store.

~~~
LeoNatan25
The question is, are they removing support from the OS frameworks themselves
or just from the Mac App Store? There are many applications that still use
GC, many of which are not in the store. Does that mean they'll cut support
for these in a future 10.10 update, or in 10.11? I doubt it.

~~~
realityking
I'd bet on 10.11 or 10.12. If they don't eventually remove it, there's little
point in deprecating it or forcing it off the App Store, as they cannot reap
the benefits in their own code.

~~~
TazeTSchnitzel
Well, deprecating it would mean they're free not to support it in newer
frameworks, right?

------
jeffsco
What Apple calls "automatic reference counting" is what is usually known as
just "reference counting." They call it "automatic" to distinguish it from the
previous, even cruder, system where counts were manipulated directly by the
programmer. Obviously this is an improvement, but personally I am a supporter
of GC. It is a myth that reference counting is not subject to arbitrary
pauses:
[http://www.hboehm.info/gc/myths.ps](http://www.hboehm.info/gc/myths.ps)

~~~
TheLoneWolfling
Incorrect.

You can, although I haven't seen systems that do this, enqueue objects to be
deallocated and then have either the main thread every once in a while or a
background thread periodically pop objects off the queue and do the normal
reference-counting dance.

~~~
gchpaco
Yup, the problem with this is that it renders the other claimed advantage of
reference counting moot (viz. there is now little or no temporal linkage
between when the object's ref count hits 0 and when its destructor/finalizer
is called).

The other problem with reference counting, which I believe is insoluble, is
that it is inherently O(alldata); that is to say you must perform an operation
each time a piece of memory dies, at the very least. This makes it good for
collecting older generations in a generational GC (where O(alldata) ~=
O(livedata)) but bad for the nursery (where the generational assumption means
that O(livedata) << O(alldata)).

~~~
TheLoneWolfling
Any computer processor can do a finite number of operations per time period,
and an (effectively) arbitrary number of objects can be marked for
deallocation simultaneously. So yes, you can have arbitrary delays.

But this is _always_ the case. Refcounting, GC, manual memory management, etc.
But I'd say you've gone one step too far in saying that there is little or no
temporal linkage with refcounting. If you're using a queue, objects will be
cleaned up as soon as they can be while remaining inside the time limitations
you've allowed.

If you want to get fancy, you can even have the compiler rearrange objects
such that pointers that are likely to point to objects with destructors are
first in line.

And yes, refcounting is O(alldata). But that's ignoring that freeing is
relatively fast, and that you can easily implement bunches and bunches of
optimizations to prevent refcounting from having to do work at all. (For
example: if you acquire a reference to an object on the same thread and then
release it, you don't have to do anything. If you create an object and can
determine at runtime that the reference never escaped, you can free it
immediately. If you acquire multiple references to an object, you only need
to check for references reaching zero once. Etc. All of these are things that
theoretically can be done with a standard GC, but good luck actually doing
so.)

~~~
pjmlp
So in the end you implement a poor man's GC.

~~~
TheLoneWolfling
Reference counting _is_ GC - or rather it solves the same problem that GC
solves. I don't see why people don't count it as one. Sure, basic reference
counting cannot handle cycles - but there are extensions that can.

~~~
pjmlp
Sure it is GC; the very first chapter in most CS books about automatic memory
management starts with reference counting and then proceeds to more elaborate
algorithms.

It is usually presented as a quick solution for when the resources for doing
more powerful GC algorithms aren't available, either in computing power or
engineering.

------
philippnagel
Any other languages that use Automatic Reference Counting? Is ARC strictly
better than GC?

(Sorry for the question, I am not an iOS, Mac dev.)

~~~
mikaraento
Perl, Visual Basic 6 and to some extent Python are examples of languages using
the moral equivalent of ARC.

~~~
jimrandomh
Python has had a cycle-collecting garbage collector since 2.0, layered on top
of reference counting.

(Edit: Perl is garbage collected since version 6, and Visual Basic is garbage
collected since version 7. I am not aware of any language besides Objective C
moving in the opposite direction.)

~~~
rsynnott
C++ and Rust both provide reference counting pointer-like-things (shared_ptr
and Rc respectively). C++ has been around for quite a while, of course, but
Rust is pretty new, and deliberately eschewed garbage collection.

~~~
pjmlp
C++ RC requires good behavior from all developers on the team and from
third-party libraries.

It is hard to get right with existing code bases.

Additionally C++11 got a GC API, but requires explicit programmer control.

[http://www.stroustrup.com/C++11FAQ.html#gc-abi](http://www.stroustrup.com/C++11FAQ.html#gc-abi)

C++'s memory model is constrained by compatibility with C, so it is quite
limited in the GC algorithms that can be implemented.

~~~
coliveira
In C++, reference counting is done on a per-object basis. There is no need
for cooperation from other libraries or code bases at all.

~~~
pjmlp
Really?!?

How can you guarantee that the component library that your company just
bought, available only in binary form isn't doing something like this?

    
    
        void cool_stuff_with_widget(const std::shared_ptr<const UI>& ptr)
        {
           // shared_ptr::get() hands out the raw pointer; const_cast strips const
           UI* evilPtr = const_cast<UI*>(ptr.get());
           // now store evilPtr somewhere else and use it any time the library
           // feels like it, even after the last shared_ptr releases the object
        }
    
    

No one is going through disassembly to check such behaviors.

C++ RC is a very welcome addition to the language, but it only works if
everyone plays by the rules, 100% of the time.

~~~
tonfa
It's not really an issue: if the library is buggy and leaks or keeps
references to things it's not supposed to, it will be the same whether it's
RC or just plain manual memory management.

It's just better documented and easier to spot the bugs if you're doing RC.

~~~
pjmlp
It's not about bugs, but about C++'s lack of support for enforcing that RC
is not misused, so programmers cannot grab RC internals and corrupt the data
structures.

------
danbruc
What is the reasoning behind that? Given the advances over the last decade or
two automatic garbage collection is good enough for a lot of scenarios and
this looks like a step back. Why not use automatic garbage collection as the
default but offer additional means for manual resource management in case it
is really needed?

~~~
bitwize
Everything GC can do, ARC can do under strict space and time constraints.

~~~
danbruc
Plain (automatic) reference counting cannot handle cycles without support
from developers.

~~~
blub
I find that a good thing, because it trains developers to use weak references
and to think about object lifetimes.

Android developers tend not to think so much about lifetimes, which leads to
memory leaks because Dalvik apparently can't figure out some of those cycles
by itself. Just do a search for "Android memory leak" to see some examples.

~~~
smrtinsert
A platform with 80% of mobile installations has developers who don't
understand object lifecycles. Got it.

~~~
kasabali
I guess he meant Android application developers, not the people who wrote
Dalvik.

------
amit_m
Does ARC work like reference counting in python? (i.e. a reference count is
stored for every object)

~~~
gilgoomesh
With one difference: the reference count is usually stored inside the
object's pointer (taking advantage of the fact that allocations are 16-byte
aligned, so the lowest 4 bits are otherwise idle). This means that there's no
storage overhead for the reference count.

~~~
tomjen3
So if you end up with more than a certain number of allocations you can write
a memory pointer to any place you want, then execute the code there?

Because that does not at all sound like a huge exploit.

------
falcolas
I wonder how many applications will start leaking memory as a result of
this? Cyclic references are not that uncommon in many algorithms, and
identifying and breaking them will take time for applications that relied on
the collector to find them automatically.

~~~
jjoonathan
When it comes to overall ease of use, the importance of tooling dwarfs the
importance of GC vs RC, and Apple has had really solid RC tooling for a while.
It graphs memory usage, searches out leaked objects (by injected GC or by
diffs), lists them, lists the retain/release events with timestamps, and lets
you jump to the responsible code with a single click. Compare that to the
nightmare of trying to get a misbehaving GC to cooperate and suddenly it
doesn't seem so crazy.

EDIT: Or at least they had such tools 5-6 years ago, regression is certainly
possible. For unrelated reasons I've been playing in other playgrounds for the
last 5-6 years so I can't say for sure.

------
jakobegger
This is a pity. I have an app that uses ARC on 64-bit and GC on 32-bit. It
allowed me to ditch manual memory management without dropping support for
32-bit machines.

~~~
FigBug
Is there still demand for 32-bit apps? The last 32-bit Mac was released in
2006 and the last 32-bit OS in 2009.

~~~
delinka
The most recent version of Microsoft Office for OS X is still 32-bit, still
uses the deprecated event system, and still uses deprecated file I/O functions
... Remember resource forks?

~~~
FigBug
If I wasn't clear, my question was the opposite: not whether there is still
demand to run 32-bit software, but whether there is demand for new 32-bit
software from people who can't run 64-bit.

I've assumed anybody with an almost 10-year-old computer doesn't get that
much new software. I tend to release supporting the last 2 or 3 OS releases,
64-bit only, and haven't received any complaints.

~~~
delinka
I did indeed misinterpret. To answer that way, I'm not personally aware of
anyone demanding new apps targeted to 32-bit on OS X. As an OS X developer,
I'm averse to the idea of starting a new app with that as a target.

------
userbinator
I'm not so familiar with the Mac ecosystem, but is it really true that Apple
decides what languages you can use to write your apps in?

~~~
parasubvert
No. You can write it in whatever you want (even on iOS these days) though you
can't depend on any third party pre-installations. IOW you need to be able to
vendor in your 3rd party language and library dependencies, and use a thin
Objective-C wrapper to make the whole thing natively executable.

This announcement is more about a feature in Objective-C being actively
discouraged by refusing to distribute your app on their store. You can still
use it if you distribute your app on your own.

~~~
jimrandomh
> No. You can write it in whatever you want (even on iOS these days)

This is somewhat misleading. Apple bans all use of JIT compilation on iOS,
which de facto eliminates languages that depend on it.

~~~
greggman
I thought they just banned the use of code not embedded in your app. You are
free to JIT, just not files downloaded from the net; only files embedded in
your app.

~~~
wging
For those who are wondering, you can still use an embedded webview, which
does end up doing just-in-time compilation of JavaScript.

~~~
m_mueller
Isn't JIT deactivated in webviews? I'm not even entirely sure it's activated
in JavaScriptCore.

~~~
snowwrestler
As of iOS 8, webviews have the same JavaScript engine as Safari does, called
Nitro, which I believe does do JIT compilation of JavaScript.

~~~
firloop
Well, UIWebView (the standard web view, pre-iOS 8) does not have Nitro, but
WKWebView (introduced in iOS 8) does. The only problem is that while they
are similar, the WKWebView API is not feature-complete with UIWebView,
leading to a lack of adoption from big players such as Chromium [0].

[0]:
[https://code.google.com/p/chromium/issues/detail?id=423444](https://code.google.com/p/chromium/issues/detail?id=423444)

------
zomgbbq
[https://developer.apple.com/library/ios/documentation/Swift/...](https://developer.apple.com/library/ios/documentation/Swift/Conceptual/Swift_Programming_Language/AutomaticReferenceCounting.html)

------
tacos
Awesome. Now all they gotta do is stop treating strings like we're in a Pascal
Intro to Computer Science class circa 1987 and we gots ourselves a kick ass
development environment here. :)

------
jimrandomh
What the fuck? This is a _really_ bad technical decision. As written, it
sounds like they're planning to reject garbage collection in-general, which
would be a ban on most major programming languages. If you assume they only
mean garbage collection in Objective C, it's still a head-scratchingly stupid
decision. Modifying an app to not use garbage collection is, in many cases, a
major project that will introduce a lot of bugs. And the benefit is... very
dubious at best.

~~~
tptacek
This is not a ban on garbage-collected programming languages.

~~~
ZenoArrow
Perhaps, but I see no reference to 'Objective-C apps', the current wording of
this summary is broad enough to include other languages as well. Do you have
additional information that exempts Java, Mono, etc...?

~~~
parasubvert
All Mac native apps need an Objective-C wrapper at minimum to give them an
executable entry point (all apps are actually directory structures). On the
App Store you also need to vendor in your dependencies. There's no language
restriction in play. You can even still use the old GC if you want, just not
on their store. The Mac App Store isn't the only game in town.

~~~
ZenoArrow
Thank you for the information, useful to know about the requirement to build
an application entry point in Objective-C.

