
Arc is deterministic: you get the same results on multiple runs (minus unsynchronized async fun, of course), you can profile it, etc. - you don't have that luxury with GC.



Not if the run depends on external data, which might happen to create a data forest with a stop-the-world effect caused by a domino effect of deletions.


Of course, what is deterministic is deterministic. If your allocations are driven by nondeterministic input, then you won't see deterministic deallocations - I think this is obvious and doesn't have to be spelled out.

With GC, a deterministic runtime creates nondeterministic deallocations.

With ARC, a deterministic runtime creates deterministic deallocations - reproducible behavior that can be profiled and optimized.


How is it reproducible if you cannot control your input data?

There are GC languages like D, where you can have C++-style RAII with deterministic deallocations.

Learn to use the features.




