
Swift impressions - gmac
http://www.evanmiller.org/swift-impressions.html
======
cromwellian
First of all, regarding "no garbage collection": technically, reference
counting is garbage collection; it's automatic memory management by definition.

Is ARC really a win over non-refcounted GC + escape analysis? Full manual
control over stack vs heap allocations and lifetime, I get, especially for
games. But reference counting is not deterministic, and can also generate
pauses.

@Override has been in Java since Java 5.

~~~
lxcid
ARC is less likely to pause, as memory releases are amortised rather than
gathered and swept.

Also, LLVM's static analysis is so good at figuring out your code paths that
it can effectively insert the retains and releases for you.

Cyclic references are its only flaw, but that's something that can easily be
solved through programmer awareness.
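
For the cycle case, a minimal sketch of the usual fix, marking the
back-reference `weak` (class names are illustrative):

    class Parent {
        var child: Child?
    }

    class Child {
        // Without `weak`, Parent and Child would retain each other
        // and neither would ever be deallocated under ARC.
        weak var parent: Parent?
    }

    let parent = Parent()
    let child = Child()
    parent.child = child
    child.parent = parent   // safe: the back-reference is weak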

~~~
cromwellian
Whatever LLVM static analysis can do to figure out object lifecycles also
applies to other GC algorithms.

ARC may be less likely to pause on release, but ARC doesn't compact free
memory, so fragmentation can make object allocations (malloc) more expensive,
possibly leading to pauses.

Even among non-copying GCs that don't compact (ARC doesn't compact either),
there are low-pause GC algorithms out there that give pretty good bounds
(e.g. a 5ms pause).

I'd like to see actual benchmarks of both ARC and, say, mark-and-sweep on a
mobile device rather than speculation and opinion, and both benchmarks must
get the same static compilation treatment. That is, if escape analysis tells
you that the lifetime of the object is bounded by the current stack frame,
then allocate it on the stack, and don't penalize the non-reference-counted GC
by allocating 100% of everything on the heap.

~~~
sitharus
ARC may be slower than a good GC, but what it has is deterministic behaviour.
A 10ms pause could be tolerable if you know when it will happen.

A fair few years ago I worked in the murky world of J2ME games, and a common
pattern was byte[] variables = new byte[1024] so you could ensure there were
no GC pauses.

~~~
pcwalton
> A fair few years ago I worked in the murky world of J2ME games, and a common
> pattern was byte[] variables = new byte[1024] so you could ensure there
> were no GC pauses.

That's not the same as ARC, though. The Java equivalent to ARC would be
something like having an array of volatile reference counts to go with your
variables and fiddling with those at most accesses.

~~~
sitharus
Not the same, but they had the same problem: GC pauses when it likes, not
when you like. If you know calling move_hero() takes 23ms, 15ms of which is
ARC, you can plan for that. You can't plan for move_hero() taking 7ms most of
the time, except when a GC happens to kick in.

~~~
cromwellian
Depending on the VM, Java developers can plan where the pauses happen. One
way they do that is object pooling; another is using DirectBuffers/off-heap
memory. A third way is scoped heaps.

It's not like tons of games haven't shipped with non-refcounting GC.
Minecraft is the most famous, but games on Unity3D/Mono can also use a
non-refcounting GC, and Lua, which ships in tons of game engines, uses
classic GC.

Regardless of whether you are using automatic GC or manual malloc/free, you
can't write a high-performance game without carefully working
around memory issues. Even C/C++ games like Max Payne have shipped with frame
hiccups caused by poor malloc/free behavior that had to be fixed with custom
allocators.

If you're writing a game, you have to pay attention to what you're doing. But
do non-framerate-limited games and regular apps need ref-count GC? I suggest
no, they do not.

~~~
rsynnott
> But do non-framerate-limited games and regular apps need ref-count GC? I
> suggest no, they do not.

Practically the first piece of advice usually given when someone asks "how do
I make my Android app non-laggy" is to avoid allocation wherever possible;
even a single frame drop due to a GC pause when the user is scrolling a list,
say, can be noticeable.

~~~
cromwellian
Android's GC is not optimal/best of breed. ART is introducing much lower pause
times for Android AFAIK.

Low-pause GC is possible. See
[http://openjdk.java.net/jeps/189](http://openjdk.java.net/jeps/189) or
Azul's Zing (an absolutely pauseless GC, or so they claim).

~~~
pcwalton
Azul C4 uses a custom kernel extension. It's not really a general purpose GC
for user-level apps. (There are kernel-extension-free variations of it, but
they suffer from reduced throughput over the HotSpot GC.)

~~~
cromwellian
That's true, but on a mobile device you can control the kernel, and it may
be, from a user-experience point of view, that reducing pause latency is
better than high throughput. I dunno, has anyone ever considered using
Azul-like tricks on mobile kernels? Does ARM have the required hardware to
support it efficiently?

What's FirefoxOS doing for its GC?

------
seiji
Anybody interested in an ebook about using all of Swift's strict typing
features for better app development?

I _may_ be in the process of writing one of those. It _may_ be up to 100 pages
so far with only a chapter or two remaining. It _may_ be available for release
in a day or two.

~~~
Twisol
Definitely. I think that would be fantastic.

------
lxcid
Swift seems to be a very pragmatic language. It borrows and learns heavily
from other languages, but it doesn't try to promise anything big. It just
introduces opinionated defaults that are safe and tend to match at least 80%
of the implementor's intent, while providing alternatives for the other 20%.

An example is the switch-case fallthrough behaviour of C. In Swift, by
default, switch cases no longer fall through, but if you need that behaviour,
you can always add the fallthrough keyword at the end of a case.
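
A minimal sketch, in current Swift syntax (the ranges and strings are
illustrative):

    let score = 85
    switch score {
    case 90...100:
        print("A")
    case 80..<90:
        print("B")
        fallthrough   // explicitly opt in to C-style fallthrough
    case 70..<80:
        print("at least a C")
    default:
        print("below C")
    }
    // Prints "B" then "at least a C": only the case marked with
    // `fallthrough` continues into the next case body.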

When you think about the above behaviour, it seems so much more natural, and
you wonder why we have kept letting ourselves be bitten by this C construct,
developing muscle memory to prevent it by adding break, sometimes with a
scope (all cases in a C switch share the same scope, if I remember
correctly).

Part of the switch example is solved in other languages (sensible defaults
and scoping), of course. But the subtlety of the fallthrough keyword made a
strong impression on me. It feels like fallthrough should have been opt-in
all along, rather than opt-out as it has always been.

With my limited command of English, I feel I would do the language little
justice trying to explain some of its concepts, like optionals and mutability
hints. Some of these concepts modify the existing solutions only slightly,
but meaningfully enough to be impactful.
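
For readers who haven't seen them, a quick sketch of both ideas in current
Swift syntax:

    // Optionals: "might be absent" is part of the type.
    var name: String? = "Swift"
    if let unwrapped = name {
        print(unwrapped)    // runs only when a value is present
    }

    // Mutability hints: `let` declares a constant, `var` a variable.
    let fixed = 10          // reassigning `fixed` is a compile error
    var counter = 0
    counter += 1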

It also doesn't take a stance in the OO-versus-functional debate. It is both
OO and functional, staying as pragmatic as possible.

Finally, Swift is like a wolf in sheep's clothing. It's a very, very strongly
typed language, and it is very specific about its types. But half the time
you can ignore the types and write it like Python or Ruby. The magic is type
inference; it's kinda weird, but I like it.
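
A small illustration of that magic (current syntax):

    var answer = 42                 // inferred as Int
    let versions = ["Swift": 1.0]   // inferred as [String: Double]
    // answer = "forty-two"         // rejected: answer is an Int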

You can almost feel that Chris Lattner reached the point of diminishing
returns while optimising LLVM for Objective-C; after all, it is a very old
language. He has done a great job with LLVM all this while, but sometimes the
problem is something deeper. Swift is kinda like Objective-C 3.0 without the
backward compatibility.

(P.S. Objective-C was my favourite language before Swift.)

If a programming language is a sword, Swift is just another sword, albeit a
very sharp one.

~~~
xkarga00
Go already has the same _switch_ behavior (plus the _fallthrough_ keyword) and
is very specific about its types while having type inference.

~~~
altyus
Haha, these answers are cracking me up. Who cares? Do people develop iPhone
apps using Go? Swift never claimed to invent new paradigms that didn't exist
in other languages; it simply brought them to Cocoa.

~~~
xkarga00
My comment was specific to lxcid's comment and his surprise to see all these
features in a language. I don't understand why my reply got downvoted.

> Who cares? Do people develop iPhone apps using Go?

There is no built-in iOS support anyway, so there's nothing for people to
start caring about in the first place. But Go is fundamentally better suited
to server-side code than to the kind of applications Swift is aimed at.

On a different note, Apple has started hiring Go engineers. But who cares, huh?
[https://jobs.apple.com/us/search?jobFunction=MTLMF#&ss=33773...](https://jobs.apple.com/us/search?jobFunction=MTLMF#&ss=33773001&t=0&so=&j=MTLMF&lo=0*USA&pN=0&openJobId=33773001)

------
gmac
I'm with him on the arcane rules about Array copying: IIRC, arrays are copied
when you modify them in a way that changes the length, but not otherwise.
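
If memory serves, the beta-era behaviour being described looked roughly like
this (Apple later moved Array to full value semantics):

    var a = [1, 2, 3]
    var b = a       // no copy yet
    b[0] = 99       // in the beta, this was visible through `a` too
    b.append(4)     // changing the length forces a copy; from here
                    // on, `a` and `b` are independent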

~~~
Someone
I do not understand that, either. Also, I find it weird that dictionaries are
value types. If I started programming in Swift, I think I would soon have a
generic class named 'Dict' that has a single field storing a Dictionary and
the same interface as the Dictionary struct.

Does anybody know the reason they chose this non-standard behavior for the
standard containers?
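
A hedged sketch of that hypothetical wrapper (the name `Dict` and its
interface are the commenter's idea, not a real API):

    final class Dict<Key: Hashable, Value> {
        private var storage: [Key: Value] = [:]

        subscript(key: Key) -> Value? {
            get { return storage[key] }
            set { storage[key] = newValue }
        }
    }

    let shared = Dict<Int, String>()
    let alias = shared          // a reference, not a copy
    alias[42] = "answer"
    print(shared[42] ?? "")     // "answer": both names share storage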

~~~
lohankin
My guess is that they want you to declare map/array arguments as "inout" to
be able to add/change things, so that the changes are visible to the caller.
It might not be a bad idea after all.
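
A minimal sketch of that pattern (function and key names are illustrative):

    func addDefaults(_ settings: inout [String: String]) {
        settings["theme"] = settings["theme"] ?? "light"
    }

    var settings = ["font": "Menlo"]
    addDefaults(&settings)      // & marks the argument as inout
    print(settings["theme"]!)   // "light": the mutation reached the caller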

~~~
Someone
That makes some, but not much sense to me. Firstly, that is not what I read in
the book "The Swift Programming Language":

 _" Whenever you assign a Dictionary instance to a constant or variable, or
pass a Dictionary instance as an argument to a function or method call, the
dictionary is copied at the point that the assignment or call takes place."_

Even if I assume that that is an error (they didn't think of _inout_ arguments
when writing that), the behavior still doesn't make much sense to me.

Say that I want to do a few things with someObject.someField.someDict. In most
languages, I would introduce a helper variable: _var items =
someObject.someField.someDict_ and do _items[42] = 346; items[423] = 356;
etc_. That won't work, as it clones the dictionary (shallowly). I find it is
just too easy to accidentally clone arrays or dictionaries in Swift.
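
A sketch of the pitfall (the nesting mirrors the hypothetical
someObject.someField.someDict above):

    struct Field { var someDict: [Int: Int] = [:] }
    class SomeObject { var someField = Field() }

    let someObject = SomeObject()
    var items = someObject.someField.someDict   // value copy happens here
    items[42] = 346
    items[423] = 356
    // The writes went to the copy; the original is untouched:
    print(someObject.someField.someDict[42] as Any)   // nil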

~~~
wooster
The idea is you shouldn't be modifying someObject.someField.someDict in the
first place since you don't own the dict, someField does.

~~~
Someone
That's true in the majority of cases. However, the code could be written as it
is for performance reasons. If someObject is a field in the current object,
that isn't that bad, especially if it also is an instance of a nested class.

Also, weirdly, that isn't something the language forbids. It doesn't even make
it possible for library writers to prevent it.

I still think we will see some language changes in this area before the
official release, because naive users will run into too many weird issues
with the current behavior.

For example, if the goal is to have normal function arguments immutable, I
think it would be more natural to forbid functions from calling mutating
methods on their arguments, unless they are specified as _inout_.

On the other hand, good compiler warnings might be enough for preventing
programmers from accidentally copying containers.

------
mercurial
I'd be curious to get a summary of what Swift has that is not a mix of
traditional OO and ML-family features.

~~~
AlexanderDhoore
What other language families are there, apart from C and ML languages? Lisp,
maybe? Maybe dataflow or logic programming?

~~~
Dewie
Pascal?

~~~
nextos
This is a very interesting and complex question. CTM (Van Roy and Haridi's
"Concepts, Techniques, and Models of Computer Programming") provides some
answers, summarized in
[http://www.info.ucl.ac.be/~pvr/VanRoyChapter.pdf](http://www.info.ucl.ac.be/~pvr/VanRoyChapter.pdf)

------
flym4n
> At its core, the language is designed to eliminate bugs, but not in the
> academic way that, say, Haskell eliminates bugs by preventing normal people
> from writing code in it

Shots fired

------
klickverbot
D ([http://dlang.org](http://dlang.org)) has had mandatory override for a long
time.

------
spicyj
> Override functions must use the `override` keyword. Now when you rename the
> superclass's function, all the subclass functions fail to compile because
> they're not overriding anything. Why did no one think of this before?
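
For concreteness, a minimal sketch of the behaviour the quote describes
(class names are illustrative):

    class Animal {
        func speak() { print("...") }
    }

    class Dog: Animal {
        override func speak() { print("Woof") }   // `override` is mandatory
    }

    // If Animal.speak() were renamed, Dog.speak() would fail to compile
    // with an error along the lines of "method does not override any
    // method from its superclass".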

As I understand it, Java has a similar `@Override` annotation:

[http://stackoverflow.com/q/94361/49485](http://stackoverflow.com/q/94361/49485)

~~~
pjmlp
Eiffel, Object Pascal, Modula-3, Component Pascal and many others do as well.

I guess the author is not well versed in programming language history.

~~~
alayne
I read the article as being in the context of Objective-C bugs that were
solved by Swift.

------
foobarraboof
wow! almost like C# from, what, 10 years ago?

~~~
sitharus
Except with pattern matching, built-in immutability, extensible classes,
curried functions, and tagged unions.

More like F# 2.0 in feature set.
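
A short sketch of a few of those features in current Swift syntax (names are
illustrative):

    // Tagged union (an enum with associated values) plus pattern matching.
    enum Shape {
        case circle(radius: Double)
        case rectangle(width: Double, height: Double)
    }

    func area(_ shape: Shape) -> Double {
        switch shape {
        case .circle(let radius):
            return Double.pi * radius * radius
        case .rectangle(let width, let height):
            return width * height
        }
    }

    // Currying, written as a function returning a function.
    func scale(by factor: Double) -> (Double) -> Double {
        return { value in value * factor }
    }

    let double = scale(by: 2)
    print(area(.circle(radius: 1)))   // 3.14159...
    print(double(21))                 // 42.0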

