
Swift 2.2 Released - nostrademons
https://swift.org/blog/swift-2-2-released/
======
AsyncAwait
As someone who refused to sink any significant time into Swift until they
open-sourced it, I have found it to be the most productive statically-typed,
native language out there. Its type system feels like the best of C# and Rust,
while the time it takes to pick it up is not much greater than Go's, despite
Swift being a much richer language. (I started learning in January and am now
at the level where I can take on a fairly complex Swift project and feel
comfortable about being able to finish it on time; plus I now use Swift for
almost any programming task, 90% of them at least.) You also have to give
Apple credit for getting the open-source release right (GitHub, pull requests,
proposals, mailing lists, frequent dev builds, Linux support). I guess Chris
Lattner has been the one driving this, but I have to give credit to Craig
Federighi and Tim Cook for letting him do it.

The only part that frustrates me is that SourceKit seems to crash every other
minute for no apparent reason - but that is no fault of the language itself.

Swift is the first piece of Apple software I'm genuinely excited about.
Congrats to everyone who contributed to the 2.2 release.

~~~
wyldfire
> the best of C# and Rust

e.g.? I've only just scratched the surface of Rust and dismissed C# as a Java
clone (but maybe I shouldn't have?). What are some examples of the best of
Swift that you see?

~~~
chadzawistowski
One case where C# is far superior to Java is generics.

Generics were a cutting-edge feature in 2005. Microsoft took the opportunity
to start from scratch and released the .NET Framework 2.0 without backwards
compatibility, but with full support for generics. A List<String> is actually
a List of String.

When Sun released Java 5, they didn't want to disrupt the established Java
ecosystem. Their implementation is called "type erasure". As an example,
List<String> is actually just List<Object>, with automatic runtime typecasting
to String.
[https://docs.oracle.com/javase/tutorial/java/generics/erasur...](https://docs.oracle.com/javase/tutorial/java/generics/erasure.html)

Java's solution works well enough, but it breaks down in certain scenarios,
such as multiple levels of generics. For example, when getting an element from
a List<List<String>>, Object is not cast to List<String> (which doesn't really
exist), but merely List.

C#'s implementation of generics became even more sophisticated with C# 4, which
introduced covariance and contravariance.
[https://msdn.microsoft.com/en-us/library/ee207183.aspx](https://msdn.microsoft.com/en-us/library/ee207183.aspx)

~~~
pjmlp
> Generics were a cutting-edge feature in 2005.

I was already using C++ templates in 1994 with Borland C++ for Windows 3.1!

Ada, CLU and ML go back to the late 70's / early 80's.

~~~
pkolaczk
Templates are not proper generics. Templates can be used to emulate generics,
but they are not the same thing at all. Templates are not first-class citizens
of the type system (they don't survive to the type-checking phase), while
generics are. You cannot write a generic list implementation in C++ and have
it type-checked as such.

~~~
pjmlp
Now I am curious about an example of something generics can do that templates,
which are Turing-complete, cannot.

~~~
chadzawistowski
I think of templates as an automated copy+paste mechanism. C++ templates can
bloat the compiled executable pretty significantly... If you define a generic
Foo<String> and Foo<Widget>, C++ templates will generate two separate binary
implementations. C# will use the same implementation, and perform type checks
using reflection at runtime.

Further support for the idea that C++ templates are an automated copy+paste
mechanism is that you're allowed to give templates arguments that are not
types.

Further reading:
[https://msdn.microsoft.com/en-us/library/c6cyy67b.aspx](https://msdn.microsoft.com/en-us/library/c6cyy67b.aspx)

~~~
pjmlp
It is not about how you think of them, but rather what is possible to do with
their expressiveness.

C#'s runtime type checks imply a performance hit and are an implementation
detail; nothing prevents a C# compiler from choosing another approach. It has
nothing to do with generics vs. templates.

For example Modula-3 and Ada compilers use the same approach as C++ for code
generation of their generics.

So where is that example of something that can only be written with C#
generics, but not with C++ templates?

A github gist maybe?

~~~
pkolaczk
It is not a question of what can only be written, but of what can be written
and _typechecked_. C# generics offer much stronger type safety for library
writers than C++ templates do. C++ templates are not type-checked. C# (and
Java, and Scala, and Haskell) generics are.

~~~
pjmlp
Yes they are type checked.

That is what enable_if, type traits, if constexpr and eventually concepts
allow for.

~~~
pkolaczk
No, they are not. This code compiles fine, despite an obvious non-generic
type error. You may add any number of type traits, constexprs, etc. there,
and it will still compile fine as long as the syntax is OK.

    
    
      #include <string>
    
      template <typename T>
      void foo(const T& arg) {
        int x = std::string("int expected");
      }
    
      $ g++ -c template.cpp
      $ // see, no error
    

Templates are only syntax-checked. The type checker is not even executed on
the non-generic code inside of templates. The type checker will be executed
_after_ the template is instantiated (expanded). At that point, generic types
do not exist. There exist only concrete types.

~~~
pjmlp
They are still type-checked when they are instantiated, and the code is as
generic as ever.

Uninstantiated templates are dead code that will be stripped anyway.

~~~
pkolaczk
Did I write anything different?

Sure, you can type-check them for particular type arguments. But you can't
type-check them generally at the generic level and you have absolutely no
guarantee that your library code is free of type errors. Not a problem if the
goal is to produce the final executable, but a big problem for a library
writer, who doesn't control the inputs (in this case: the types provided by
the user of the library).

And AFAIK C++17 concepts do not really address this problem. I can constrain
my input types with them, and the calling code will be forced to conform to
the constraints (and the compiler will present a nice error message if it
doesn't), but there is still no checking on the other side, that is, whether
the template implementation is correct assuming those constraints. I can
publicly declare that my code requires T.bar(), then shamelessly call T.baz()
in the template implementation, and the compiler will not catch it.

This is like a Python program. You don't know if it type-checks before you run
it, but even if you run it once or twice and it was fine, that doesn't
guarantee there are no type errors.

~~~
pjmlp
Yes.

The code line _int x = std::string( "int expected");_ has zero relation to
any of the template arguments.

So it is a programmer error that doesn't have anything to do with the types
provided for the template.

Sure, it will be caught in C#, because its build model assumes the existence
of modules and everything is compiled to binary.

In C++, a similar compilation error would occur when compiling the translation
unit into a library where the template is instantiated.

Of course, since most templates are header-only, the error will only occur
when instantiating the template, but it will still blow up at compilation and
prevent the final exe or dll from being produced.

Also, catching semantic type errors that are independent of the template
arguments is relatively easy with unit tests.

I still fail to see how a generics system that doesn't offer partial
specialization or meta-programming is more powerful than templates.

~~~
pkolaczk
I never said more powerful. I said stronger. In C++ you need unit tests to
check something I can have statically proven in C#. And obviously you can't
instantiate your template for all possible valid input types, because this
number can be infinite.

------
brudgers
Looks like two potentially breaking changes at source level:

[https://github.com/apple/swift-evolution/blob/master/proposa...](https://github.com/apple/swift-evolution/blob/master/proposals/0011-replace-typealias-associated.md)

[https://github.com/apple/swift-evolution/blob/master/proposa...](https://github.com/apple/swift-evolution/blob/master/proposals/0022-objc-selectors.md)

~~~
alblue
Don't forget the removal of the C-style for loop, along with the increment ++
and decrement -- operators.

[https://github.com/apple/swift-evolution/blob/master/proposa...](https://github.com/apple/swift-evolution/blob/master/proposals/0007-remove-c-style-for-loops.md)

[https://github.com/apple/swift-evolution/blob/master/proposa...](https://github.com/apple/swift-evolution/blob/master/proposals/0004-remove-pre-post-inc-decrement.md)

~~~
whiddershins
I am baffled by the for loop removal thing.

I just spent a while reading about both stride and for ... in, and I feel like
there are any number of cases where I would rather use a C-style for loop.

~~~
xxpor
I'm not familiar with Swift, but Python doesn't have C-style for loops either.
People just write:

for i in range(10):

instead.

~~~
vram22
More generally, people write:

for item in sequence:

which is somewhat general (sequences include lists, dicts, strings, tuples,
generators, text files and more [2]), and it covers a wide range [1] of use
cases.

[1] Pun not intended.

[2] Basically, any iterable. This includes custom iterables you can define,
which can then be iterated over using the same standard for loop. A unifying
feature of the language:

"The use of iterators pervades and unifies Python."

[https://docs.python.org/2/tutorial/classes.html#iterators](https://docs.python.org/2/tutorial/classes.html#iterators)

------
iliketosleep
I can't believe how much people here on HN love Swift! Do you really love
stuff like implicitly unwrapped optionals, and awkward syntax where you need
exclamation marks all over the place? Not to mention the insanely complicated
use of enumerations. And what about a completely half-baked standard library,
where even basic string operations such as taking a substring are
unnecessarily complicated and verbose? I simply cannot see any real basis for
all the love for Swift.

~~~
danappelxx
>implicitly unwrapped optionals

This only exists for Objective-C interop, and the creators are actively trying
to find ways to get rid of it. On the other hand, in nearly every other
language all pointers would be considered implicitly unwrapped optionals,
while in Swift you get proper optionals built in (food for thought).

>awkward syntax where you need exclamation marks all

Oh come on, syntax does not define a language. If they used a different
character or an operator, would you be happier? Either way, it does take some
getting used to, but eventually it feels natural.

>insanely complicated use of enumerations

What?! This is one of my favorite features! You can represent so many things
with an enum and it can make your code much safer. My favorite two examples
are representing JSON as an enum, and the fact that Swift's Optional is
actually just an enum with cases .none and .some(T).

>basic string operations such as doing a substring are unnecessarily
complicated and verbose

I somewhat agree with you on this. They have their reasons (mostly because
characters can have varying sizes, so indexing into a UTF-8 string by integer,
for example, can give unexpected results), but I still wish they did something
about it.

------
seivan
Happy about the renaming of typealias to associatedtype; now I don't have to
constantly explain why I am using typealiases when dealing with bugs.

------
pjmlp
Sad Ubuntu LTS user:

> error: opening import file for module 'SwiftShims': No such file or
> directory

Apparently the problem with GCC ABI break hasn't been tackled yet.

[https://bugs.swift.org/browse/SR-23](https://bugs.swift.org/browse/SR-23)

For the record, the only workaround is to rebuild Swift from scratch.

------
timelincoln
Now they just need to open-source Xcode...

~~~
mightykan
I’m not sure you’d really want the crap-fest that is Xcode’s code. It has so
many bugs and issues that it’s a better idea to completely dump it and either
start over or give up and adopt something like AppCode, like Google did with
Android Studio. Making developer tools is just not Apple's forte, and Xcode's
sad state of quality and performance is a clear indication of that.

I’m thoroughly convinced that Apple has an internal-only IDE that they use
that is far better than Xcode and is actually functional, much like the
internal Radar tool. There is no way Apple's engineers could get any work done
if they used Xcode.

Or, they should start charging money for it so they could justify spending
resources on the dev tools. One of the reasons (among many) that Visual Studio
is still the king of all IDEs is because Microsoft charges money for the
serious bits of it (Pro and higher).

~~~
feelix
>I’m thoroughly convinced that Apple has an internal-only IDE that they use
that is far better than Xcode and is actually functional, much like the
internal Radar tool.

That's an interesting point, and it makes a lot of sense. With the scale of
the codebases a lot of them are working on, I don't see how they could be
productive working in Xcode. Also, Xcode could not be this buggy and broken if
it had internal developers working with it and constantly reporting bugs.

~~~
thought_alarm
The experienced developers at Apple are no different than the experienced
developers elsewhere. They use the LLDB command-line and write their code in
Emacs.

I've never liked Visual Studio, and the last time I worked on a serious
Windows product most of the experienced developers used WinDbg and cordbg
rather than the neutered Visual Studio debugger. And some guys still wrote
their code in Visual Studio 6 rather than the newer .NET IDEs.

My problem with Xcode today is that it's too much like Visual Studio. I'll
take Xcode 3 with LLDB any day.

~~~
makecheck
Command-line tools are great but graphical displays can be useful, too. The
real problem is that pane/window management on OSes is years behind where it
should be.

What I _want_ is a way to take any view I please, from any application, and
arrange it _anywhere_ in a standard way with keyboard support and sensible
omissions of chrome. If this means 65% of my screen is terminals, 20% web
browsers, 10% some graphical view from Xcode, and 5% notifications, that
should be perfectly fine. Instead, the most the Mac has been able to cobble
together is a simple split-screen view, and that only in Full Screen.

And, we also have: Xcode with its own completely custom and quirky pane/tab
management scheme, terminals with their own pane/tab scheme, browsers with
their own tabs, etc. Individual applications continue to feel some need to
over-engineer their own pane/window management to compensate for lack of
system support.

There are some signs of hope though. The direction Apple is going with Mac
view controllers could theoretically put them in a position to finally turn
individual views into first-class citizens that would be feasible to
interleave arbitrarily across applications. At that point, command-line tools
could integrate very nicely in arbitrary ways with elements that really
benefit from being graphical. We’ll see.

~~~
dcgoss
I have been using
[https://www.spectacleapp.com/](https://www.spectacleapp.com/) for a while
now, and it has performed like a champ. Easy window resizing and organization
with keyboard shortcuts.

