
One of the reasons why Lisp is fading into irrelevance is that GC is obsolete tech. You should be using RAII, smart pointers, or ARC (depending on language and toolset) for all dynamic-lifetime resource management, giving completely deterministic object lifetimes.
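For illustration, a minimal Swift sketch of what "deterministic lifetimes" means under ARC (the Connection type is made up for this example):

    class Connection {
        init() { print("opened") }
        deinit { print("closed") }  // runs the instant the last strong reference goes away
    }

    do {
        let c = Connection()
        print("using \(c)")
    }  // "closed" prints exactly here; no collector pause, no finalizer delay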

This is also a factor in why Android phones are orders of magnitude slower than equivalently-specced iPhones, even with the new ART.



> One of the reasons why Lisp is fading into irrelevance is that GC is obsolete tech.

RAII, smart pointers and ARC aren't free lunches, which is why people moved from them to GC in the first place.

Of the top 10 languages on GitHub [1] that require memory management, 7 use GC [2].

> This is also a factor in why Android phones are orders of magnitude slower than equivalently-specced iPhones

That's a pretty nutty assertion to make without data.

1: http://adambard.com/blog/top-github-languages-2014/

2: Excluding CSS and shell since they don't require memory management. Python's default implementation (CPython) actually uses both reference counting and a cycle-collecting GC.


> RAII, smart pointers and ARC aren't free lunches, which is why people moved from them to GC in the first place.

Yet GC solves maybe 10% of the problem that RAII solves, so at best I would call them "similar". GC has the advantage of being much more tolerant of sloppy programming, but if you compare the complexity of GC with the complexity of RAII, I'd say RAII has a clear advantage.


Most RAII use cases can be covered with using/with/try-with-resources/lambdas; see the sketch below.
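A minimal sketch of that pattern in Swift, using defer as the analogue of using/with (Foundation's FileHandle; the firstLine helper is made up for illustration):

    import Foundation

    func firstLine(of path: String) -> String? {
        guard let handle = FileHandle(forReadingAtPath: path) else { return nil }
        defer { handle.closeFile() }  // runs on every exit path, like using/with
        let data = handle.readData(ofLength: 4096)
        return String(data: data, encoding: .utf8)?
            .components(separatedBy: "\n").first
    }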


> are orders of magnitude slower

If this were true, an Android phone would refresh less than once per second: two orders of magnitude below 60 fps is 0.6 fps. How about "why Android phones are maybe 20% slower", or some other number actually within the bounds of reason?


It's funny that whenever Apple does something, it shifts fashion. Reference counting is suddenly hip, when garbage collectors were invented to solve problems with reference counting, and those problems are still there with ARC.

I'm waiting for the day when Apple will introduce an optional GC as an alternative to ARC on iOS, along with proclaiming red as the new purple.


Apple did introduce optional GC into its Mac OS X runtime. It received little uptake and was subsequently removed because Apple and its developer base realized ARC was the better solution.

If your code is leaking because of reference cycles, maybe that's a problem with your code, not ARC. (I.e., structure your data as an acyclic graph or tree and/or use weak references rather than expecting the runtime to compensate for your sloppiness.)
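A minimal Swift sketch of that advice (hypothetical Node type): the parent pointer is weak, so parent and child never form a cycle:

    class Node {
        var children: [Node] = []
        weak var parent: Node?  // weak: a child does not keep its parent alive
        init(parent: Node? = nil) {
            self.parent = parent
            parent?.children.append(self)
        }
        deinit { print("node freed") }  // fires deterministically under ARC
    }

    do {
        let root = Node()
        _ = Node(parent: root)
    }  // both nodes are freed here; with a strong parent reference they would leak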


There are plenty of situations in which reference cycles would be perfectly sensible (and not at all "sloppy") ways to organize your data if it weren't for reference counting.

For example: suppose you are writing some code that analyses a strategy game. You have (let's say) an object representing a position in the game; each position object has references to the positions you can move to from there. If repeated positions are possible, then you have a reference cycle.

For example: you have a tree-like structure in which things contain other things. It's convenient for each thing to have a reference to its parent, and for each thing to have references to its children. Boom, reference cycles everywhere.

For example: you are implementing a programming language that has closures. So each closure object has a reference to its lexical environment, and some of the things in that environment may themselves be closures defined in that environment. Reference cycles. (The same happens if you have classes, and methods have references to the class where they're defined.)
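A minimal Swift sketch of the closure case (hypothetical Environment type): the environment holds a closure that captures the environment, and under pure reference counting the pair is never reclaimed:

    class Environment {
        var bindings: [String: () -> Int] = [:]
        deinit { print("environment freed") }
    }

    do {
        let env = Environment()
        env.bindings["f"] = { env.bindings.count }  // closure strongly captures env
    }  // deinit never runs: ARC cannot reclaim the cycle; a tracing GC could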

Of course, all these things can be avoided. Often you can pick some subset of the references (e.g., the parent pointers) and make them weak references, or you can restructure your code to make some of the references go away (e.g., positions don't have references to their successors, they have methods/functions for generating them). But the only reason to do those things is that you need to avoid reference cycles. Refcounting isn't (in these cases) kindly helping you improve your code by forcing you to avoid sloppy constructs; it's taking what would otherwise be perfectly sensible code, making it sloppy, and then forcing you to do something else to avoid the resulting memory leaks.


> Apple did introduce optional GC into its Mac OS X runtime. It received little uptake and was subsequently removed because Apple and its developer base realized ARC was the better solution.

You are telling the story wrong.

The reality was that the GC never worked properly, especially when mixing frameworks compiled with and without GC, leading to core dumps.

Then there was a list of corner cases caused by having C as part of Objective-C.

Apple did not introduce ARC because the GC had little uptake.

They introduced it because the GC never worked properly, so devs had better things to do than debug GC core dumps, and because ARC is a better fit for Objective-C's semantics.



