
My misalignment with Apple's love affair with Swift - mgrayson
https://rant.monkeydom.de/posts/2018/06/10/on-my-misalignment-with-apple_s-love-affair-with-swift
======
chadcmulligan
It's interesting how languages divide opinion. For me, Swift is the best
language around. One of the issues not addressed in the article (I believe) is
performance: Swift is faster than Objective-C because of the static type
checking, which is a good thing.

It has the advantages of C# as a language but no garbage collection. Granted,
the reference counting in Swift can be problematic, but it seems to be
improving with each release.

I find the functional programming aspects of Swift useful. There are a number
of features in Swift that decrease the amount of code - e.g. if let/guard.
The use of optional types increases program safety.
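
A minimal sketch of those two features together (toy function, not from the
thread):

    // Int(_:) returns Int?, so the nil case must be handled explicitly
    func doubled(_ text: String) -> Int? {
        guard let n = Int(text) else { return nil }  // early exit on bad input
        return n * 2
    }

    if let result = doubled("21") {   // unwraps only on success
        print(result)                 // 42
    }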

The fact that you can use classes declared elsewhere without any consideration
of dependency I find very powerful, and of course with great power comes great
responsibility.

The power of Swift collections is fantastic, and the syntax is so succinct
compared to Objective-C.
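
One toy illustration of the succinctness claim:

    let squaresOfEvens = (1...10).filter { $0 % 2 == 0 }.map { $0 * $0 }
    // [4, 16, 36, 64, 100] -- roughly what an NSMutableArray plus an
    // enumeration loop would do in Objective-C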

Many more...

~~~
sheeshkebab
Faster than ObjC? Unless your objc code is written using nothing but id and
reflection, I'm not sure how this can be the case...

~~~
saagarjha
> Unless your objc code is written using nothing but id and reflection

That's how _all_ Objective-C code is written–every method call goes through
objc_msgSend.

~~~
sitharus
To be fair to objc_msgSend, it's orders of magnitude faster than most
reflection APIs. [https://www.mikeash.com/pyblog/performance-comparisons-of-
co...](https://www.mikeash.com/pyblog/performance-comparisons-of-common-
operations.html) has some benchmarks; it's a bit old, but I don't think
objc_msgSend has gotten slower since.

~~~
saagarjha
Yup, I'm not saying objc_msgSend is slow: it's just that it's not faster than
a direct call, and I believe it's about on par with a virtual function call.
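
In Swift terms, the three dispatch flavors being compared look roughly like
this (illustrative names; Swift 4.x semantics):

    import Foundation

    class Greeter: NSObject {
        @objc dynamic func viaMessageSend() {}  // always goes through objc_msgSend
        func viaVTable() {}                     // vtable dispatch, like a C++ virtual call
                                                // (the optimizer may still devirtualize)
        final func direct() {}                  // final permits a direct, static call
    }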

------
jordansmithnz
As someone with a significant amount of experience in both ObjC and Swift (4+
full-time years in each), I don't agree with this article. Yes, Swift isn't
perfect; it has its flaws. Totally agree there. It's not perfectly integrated
with Cocoa either, as the author mentions.

However, to say that Swift is a regression over Objective-C seems very short-
sighted to me. Can you imagine if Apple had continued with Objective-C? There
would be no way to reverse decade-old decisions that simply aren't the right
decisions anymore. Every new feature would need to be built on top of code
from 1984. Swift was very future-focused from the get-go, and is a bid to
ensure that developing for Apple devices in 2025 is not an archaic mess.

Like anything new, it's not perfect at first. That is the world we live in
now. However, Swift is getting incrementally better at a very good pace. There
are some reasons left to prefer Objective-C right now, but I'm sure that in a
few years these reasons will be far fewer.

My own opinion is that I'm much more productive writing Swift, after spending
a similar amount of time with each language. If you think the same is true of
Objective-C, then that's great - there are certainly still some upsides to
developing with Objective-C. However, to say that Swift is a mistake is
something I can't agree with, since it's one of Apple's most forward-thinking
decisions to date.

~~~
AnthonyMouse
> There would be no way to reverse decade old decisions that simply aren't the
> right decision anymore. Every new feature would need to be built on top of
> code from 1984.

Of all the reasons to prefer a new language, this is the worst one from a
platform perspective.

C and POSIX allowed Unix to conquer the world. They allow code written since
the 1980s to run with minimal modifications even on modern systems. The other
major operating system family -- Windows -- is also strongly associated with
backward compatibility.

Everybody always wants to throw away the legacy code, because maintaining it
is expensive. But throwing it all away and starting over from scratch can be
more expensive. Especially when you're dealing with millions of lines of third
party code. Which is why the only platforms that are still popular after 30+
years are the ones that didn't force everybody to do that.

~~~
pjmlp
You mean allowed security firms and black hat hackers to prosper, turning the
Internet into a Swiss cheese of security, with an endless stream of compiler
sanitizers, process sandboxing, CPU extensions, and CS research trying to get
us out of the mess.

~~~
AnthonyMouse
> You mean allowed security firms and black hat hackers to prosper, turning
> the Internet into a Swiss cheese of security, with an endless stream of
> compiler sanitizers, process sandboxing, CPU extensions, and CS research
> trying to get us out of the mess.

I mean _take over the world_.

I know it's popular to hate C these days, but remember the context. The
competition in the 80s and 90s were operating systems that didn't even have
memory protection or the concept of user accounts. Unix was never going to
lose over _security_.

And we're not talking about the language to write daemons or kernels in, just
the system API. A system should make it easy to write your code in any
language you want regardless of whether any other part of the system uses it.
There isn't supposed to be a One True Language. System APIs that advantage any
specific language are problematic -- which is why the C family is popular for
APIs. It's simple enough that it's easy to wrap using nearly any other
language.

For example, Java was the Next Big Thing around the time when IPv6 was new,
and Java is "safer" than C. That would have been an opportunity for an
operating system to deprecate the BSD sockets API, never extend it to support
IPv6, and require internet-facing code to use a Java API in order to use IPv6.

The problem is that's just making trouble for everyone. People who want to
keep using their C programs, or use any other language (like Rust) that isn't
Java, will either get saddled with some awkward and slow kludge to use the
Java API from another language, or will just not bother to support IPv6,
making IPv6 less popular than it is already. People who do want to use IPv6
but not Java will prefer other operating systems, making a system that does
that less popular in general.

It's no use to make something nominally more secure in a way that just causes
people to not use it.

~~~
pjmlp
> The competition in the 80s and 90s were operating systems that didn't even
> have memory protection or the concept of user accounts. Unix was never going
> to lose over security.

Really? Apparently we lived on different planets.

VAX/VMS, OS/360, Burroughs, AS/400, ....

UNIX was available with source code to universities for a symbolic price; that
was the difference.

~~~
AnthonyMouse
> VAX/VMS, OS/360, Burroughs, AS/400, ....

On mainframe hardware similar in price to a single-family house. The
competition on hardware individuals could hope to afford was the likes of DOS
and MacOS Classic. And even on mainframes, the ones that have survived, like
OS/400, are the ones obsessed with backward compatibility.

------
rockshassa
IMO Swift is a language written by a compiler guy to solve compiler problems.
The syntax is dense because it forces you to make a lot of decisions that
could otherwise go unmade in objc.

If a variable is read-only, you're forced to think about that by deciding
between let/var. Contrast this with objc, where all variables are writable
unless you do the extra work of adding the const keyword. Objc makes us do
more work to get the faster (and safer) behavior.
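
A minimal sketch of that forced choice:

    let limit = 10   // read-only: `limit = 11` would be a compile-time error
    var count = 0    // explicitly opted in to mutation
    count += 1
    // In objc the default is reversed: everything is writable unless
    // you add const yourself.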

It's similar with Optionals: you're now forced to decide right away whether a
parameter can ever be nil, whereas with objc you didn't need to declare
nullability. Again, it makes the safest and fastest choice the easier one to
make, and allowing nullability is actually more work.
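
Sketched with a couple of hypothetical signatures:

    func greet(name: String) -> String {   // name can never be nil
        return "Hello, \(name)"
    }

    func lookup(id: Int) -> String? {      // callers must handle nil
        return id == 1 ? "Alice" : nil
    }

    // greet(name: nil) would not compile; nullability is part of the type.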

Generally, Swift forces you to pass as much information to the compiler as
possible at compile time, and it does so with a delightfully readable syntax.
This theme repeats itself throughout the language. More information at compile
time is always going to result in safer and more predictable behavior.

Full disclosure: I'm an iOS dev working in objc and Swift. I love both
languages, and I think Swift is the obvious path forward.

~~~
beureum
You've hit the nail on the head, but I take the opposite conclusion.

Swift is designed around premature optimization. In my experience, improving
programmer productivity is more important. More time spent on compiler-
enforced busywork is less time spent in Instruments. The same goes for compile
times. If you want people to write fast code, then help them iterate
quickly.

~~~
valuearb
Most developer time is spent finding and fixing bugs. That’s what makes
developing in Swift much more productive than in Objective C.

~~~
beureum
Well yeah pretty much anything is better than ObjC.

------
ardit33
I agree with the author.

"So now contrast that to Swift. First of all: Which question did it desire to
answer? Think about it. There is no one clean answer."

I feel that Swift is mostly a language designed as a syntax swap of
Objective-C, making a few things better, yet some others worse.

Sure, it might be more welcoming to people used to Java/JavaScript-style
syntax, but overall I think it is two steps backward in language design and
functionality.

To folks that are proficient with Objective-C,

1) How do you like working in Swift?

2) Are you more productive (for medium or large projects)?

3) Do you enjoy it more?

For my part (if you are already very proficient with Objective-C), those
answers are no. If you are new to iOS, though, Swift is easier to get started
with.

[https://www.hackingwithswift.com/articles/27/why-many-
develo...](https://www.hackingwithswift.com/articles/27/why-many-developers-
still-prefer-objective-c-to-swift)

~~~
veidr
I programmed mainly in Objective-C from ~2000 (Rhapsody beta) and resisted
even learning Swift until Apple finally confirmed they were definitely going
to open-source it. So about a decade of objc.

1.) I love working in Swift. It would not occur to me to start a new project
in Objective-C. The only reason I could see myself writing more objc code in
this life would be to make a quick fix on some old project. (Incidentally,
this is the exact same kind of B-completely-replaces-A-for-me language
experience I had with JavaScript→TypeScript. Can't think of another.)

2.) Swift confers upon me a quantum-leap kind of productivity increase -- the
kind of "programmer time" optimization the Ruby guys used to talk about over
Java. The core language is great. Swift enums concisely solve lots of problems
that are easy-but-annoying in Objective-C. Protocol extensions are an absolute
godsend and I hate the feeling (when programming in something else) of seeing
how a protocol extension would solve the problem at hand, but not having them.

Incidentally, I remember vividly when there was talk of adding "concrete
protocols" to Objective-C 2.0. (They let you do a similar thing; adopt a
protocol to gain a bunch of implementation behavior, not just declare that you
support an interface). I was really, really excited about that, too. But the
rumor didn't pan out and concrete protocols were never added.

3.) Yep, love it. I changed jobs in 2016 and don't even do much Apple platform
development anymore; I am mainly using Swift on Linux.

My comments apply to current Swift, so right now that means 4.2. As you go
back in time to older versions, I liked it less. In the first year doing
Swift, there were still times I enjoyed Objective-C more, but not anymore.

Aww, it's kind of sad when I think about it; after all those years,
Objective-C is now dead to me, like NewtonScript...

~~~
mpweiher
> Swift enums concisely solve lots of problems that are easy-but-annoying in
> Objective-C.

Curious as to what those might be, can you shed some light?

I was also puzzled by the emphasis on enums in Swift, as they are a language
feature I rarely if ever use in Objective-C - not because I want to use them
and find them inadequate, but because I generally don't see much of a need for
them.

~~~
mercutio2
Swift enums are effectively discriminated unions, which are really, really
different from C (and thus Obj-C) enums.

For example, in Swift you can have an enum that says its value holds
exactly one of:

A) several specific semantically meaningful strings

B) some other object

C) several objects

In Obj-C, you would probably express this as a single object, and write a
bunch of error-prone logic to handle which of these states you're in, possibly
missing an important last-minute addition in one of your methods for walking
over all the possible states.
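
A rough Swift sketch of that A/B/C shape (names are illustrative):

    enum Payload {
        case keyword(String)       // A) one of several meaningful strings
        case object(AnyObject)     // B) some other object
        case objects([AnyObject])  // C) several objects
    }

    func describe(_ p: Payload) -> String {
        switch p {                      // must be exhaustive: adding a case
        case .keyword(let s):           // later flags every switch that
            return "keyword: \(s)"      // forgot to handle it
        case .object:
            return "one object"
        case .objects(let all):
            return "\(all.count) objects"
        }
    }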

~~~
mpweiher
Thanks!

In any OO language, I'd write different classes for cases A, B, and C that
share a common message protocol (whether formal or informal) and use them
uniformly/polymorphically.
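
Rendered in Swift for comparison, that class-per-case approach might look
something like this (hypothetical names):

    protocol Payload {
        func describe() -> String
    }

    final class Keyword: Payload {
        let value: String
        init(_ value: String) { self.value = value }
        func describe() -> String { return "keyword: \(value)" }
    }

    final class Batch: Payload {
        let objects: [AnyObject]
        init(_ objects: [AnyObject]) { self.objects = objects }
        func describe() -> String { return "\(objects.count) objects" }
    }

    // Callers stay uniform; each case carries its own behavior.
    let items: [Payload] = [Keyword("hello"), Batch([])]
    for item in items { print(item.describe()) }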

------
elcritch
> It seems to have been driven by the needs of the compiler and the gaps that
> needed to be filled for the static analyzer. Those seem to have been super-
> charged instead of catering to app developer's actual needs: efficient,
> hassle free, productive (iOS) App development.

This. Even then, Swift (the last version I really looked at was v2) was full
of special one-offs and smacked of design by committee.

It would've been much more interesting if they'd worked on making an updated
Objective-C language - maybe even breaking the C-superset constraint and
adding some new syntax. It's even more sad given that ObjC message passing can
support distributed systems.

Probably time to move back to Linux. Too bad GNUstep never took off. Or Etoile
(?).

~~~
saagarjha
> This. Even then, Swift (the last version I really looked at was v2) was
> full of special one-offs and smacked of design by committee.

How so?

~~~
elcritch
Here's one link [1]. It's dated, so hopefully many of these have been
resolved. Swift doesn't solve any problems for me, so I haven't really looked
at it again in a long while.

Mainly, it seemed that instead of supporting full functional concepts, they
one-offed features that _looked_ functional but weren't.
Unfortunately I can't find the original article that reviewed the Swift type
system.

1: [https://www.quora.com/Which-features-overcomplicate-Swift-
Wh...](https://www.quora.com/Which-features-overcomplicate-Swift-What-should-
be-removed/answer/Rob-Rix)

~~~
saagarjha
A lot of those are either fixed, or being actively worked on.

------
bsaul
As a mobile & fullstack developer, my road also takes me to different
languages, but for very different reasons. To me, Swift is at its core the
best language by far. I love the way it lets me model pretty much every
problem using combinations of enums and structs, while keeping everything
value-based and type safe. If you add null safety, there is no other
(mainstream) language that ticks all those boxes.

But the OP is right in that Swift aims at being a silver bullet, yet the pace
at which it evolves toward that goal is absolutely scary. Concurrency hasn't
moved at all, and in the meantime server-side performance is a total disaster
([https://www.techempower.com/benchmarks/#section=data-r16&hw=...](https://www.techempower.com/benchmarks/#section=data-r16&hw=ph&test=fortune)).

There is also nothing for cross-platform development ( _real_ cross-platform,
not just iOS and macOS).

So, personally, my next experiment is going to be full stack dart.

~~~
seandougall
That benchmark seems to list only Kitura. I wonder what it would look like
with Perfect, Vapor, or Zewo.

> Concurrency hasn't moved at all

Yeah, it hasn't moved because GCD is already pretty mature, and they don't
seem to have made any effort at other concurrency paradigms. I'd love to see
how something more like CSP would work (Zewo has an implementation, but I
haven't had a chance to work with it yet).

> There is also nothing for cross-platform development (real cross-platform,
> not just iOS and macOS).

Not yet; it doesn't quite seem like it's ready to be a practical goal. I'm
curious to see if that changes when they manage ABI stability.

~~~
bsaul
GCD is a library, not a language construct. As such it is extremely raw, and
i've got a strong suspicion that the fact that it wasn't built for server-side
style concurrency is one of the reason kitura is dead slow in the benchmarks.
Those kind benchmarks usually spawn thousands of simultaneous connexions,
which is very far from what an app generally requires (but i could be wrong).
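
For context, a minimal sketch of what "a library, not a language construct"
means in practice: concurrency here is ordinary method calls on
DispatchQueue, with no dedicated syntax.

    import Dispatch

    let worker = DispatchQueue(label: "worker")
    worker.async {
        let sum = (1...1_000).reduce(0, +)    // background work
        DispatchQueue.main.async {
            print("sum: \(sum)")              // hop back to the main queue
        }
    }
    dispatchMain()   // command-line programs must keep the process alive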

It also doesn't provide more I/O parallelism than OS-based threading. Go has
lightweight threads, which makes context switching fast, and Node has async
I/O all over the place. Maybe SwiftNIO would make a difference, but it would
only bring Swift up to what the first version of Node.js provided.

My point was that this manifesto
[https://gist.github.com/lattner/31ed37682ef1576b16bca1432ea9...](https://gist.github.com/lattner/31ed37682ef1576b16bca1432ea9f782)
was written a year ago, and nothing has been implemented or announced at WWDC
(but maybe I missed a talk...).

~~~
jrs95
Vapor generally kicks ass in benchmarks FWIW, and it's using this now, so
clearly Apple is beginning to pay attention to server-side
concurrency: [https://github.com/apple/swift-
nio](https://github.com/apple/swift-nio)

Although as I mentioned in another comment, if you're looking for something
more mature, Kotlin is a great option. Vert.x has the widest database support
and is lean and fast, so this is probably what you're looking for. But if just
Mongo/Cassandra/Redis is acceptable, Spring Boot w/ Reactive Web looks pretty
cool, too.

------
majewsky
> Swift code might end up being more correct in the end. It also might alert
> you to edge cases early on. However, the flip side is it inhibits your
> creativity while writing. When starting out to program, the way I enjoy
> working, I don't yet know how the API is best expressed. So I try out
> different ways to express it until I find a sweet spot. Then I go back and
> unify accordingly.

> Swift actively distracts me in that endeavor by making me answer questions I
> really don't want to answer right now. Yes, stuff might be less correct in
> the meantime, but heck that is what I want during the design phase. Find my
> concept, sweet spot, iterate, pivot quickly.

I have much of the same feelings with Rust.

~~~
deltron3030
It might make sense to differentiate between prototyping and implementation
languages. As a design-first company, they don't prototype in code; they use
code to implement prototypes. With Swift they're just serving their own
methodology.

This also fits their "we don't need to collect data" mindset: they seem to
iterate on assumptions, not direct feedback from customers. They create many
prototypes and then implement the most viable one, and at that point they know
exactly what they need.

~~~
majewsky
That's a fine approach when you prototype a UI or a process, but it doesn't
help if, for instance, you want to try out a specific Rust library.

------
protomyth
I find myself in the same boat. I've programmed in Objective-C since 1995 on
NeXTSTEP. I love Objective-C, but realize it has quite a lot of warts.

My problem with Swift is my feeling that the people who developed it don't
like Objective-C. It really feels like the Java direction Apple tried to take
in the late 1990s. You can argue about nil messaging, but it's part of the
landscape.

Plus, I find the way they adapted message passing to the C++/JavaScript-like
syntax really weird. A simple obj.(selector: value selector2:value) would have
made it much easier to go back and forth. This mismatch affects things. Never
mind the added punctuation that need not be there.
I should be much more productive, but I'm not and that sucks.

------
lxcid
I love Swift, but at 4.2 it's still some distance from being usable for me.

Swift is supposed to be modern, only to be burdened by API compatibility work,
which is non-trivial.

This significantly slows certain important developments, like ABI stability
(5.0), full generics (5.0), and concurrency (maybe 6.0?).

A year after the 4.0 release, in the coming few months 4.2 will be released,
not 5.0. This means the timeline for 6.0 gets pushed even further back.

While Swift is modern in areas like optionality, first-class immutable
structs, and (my favourite feature) enums with associated values, it lacks
many other modern features we've come to expect from a modern language; e.g.,
callbacks are still the way to handle async control flow (one of the regrets
of Ryan Dahl in his JSConf EU talk
[https://www.youtube.com/watch?v=M3BM9TB-8yA](https://www.youtube.com/watch?v=M3BM9TB-8yA)).
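
A minimal sketch of that callback style (hypothetical function, using the
Cocoa-convention (value, error) completion handler):

    import Dispatch

    func fetchGreeting(completion: @escaping (String?, Error?) -> Void) {
        DispatchQueue.global().async {
            completion("hello", nil)       // or (nil, someError) on failure
        }
    }

    fetchGreeting { greeting, error in     // control flow lives in closures,
        if let greeting = greeting {       // nesting further with each
            print(greeting)                // dependent async step
        }
    }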

1.0 to 3.0 was spent getting the API right. This is a significant positive
investment in the long run, but as someone who has to maintain code, it was
not pleasant at all, and I still have code stuck in the 1.0/2.0 eras. I have
crashes from getting conditional conformance working with generics; some code
that wasn't crashing on 1.0 or 2.0 crashes on 3.0. Swift clearly is a WIP.

\---

At the same time, TypeScript happened. TypeScript turns JavaScript into an
optionally typed language. I see JavaScript and Objective-C in a similar
light. Since Objective-C started getting some syntactic sugar (generics,
nullability), I wonder what would have happened if they had taken the
TypeScript approach instead.

TypeScript had no choice but to be pragmatic (probably after seeing how Dart
was not adopted by the larger community for going the Swift way).

Apple basically acts like a benevolent dictator: whatever direction they take
is more or less the future, and we have to figure out how to work around the
new "world order", which gets updated every June.

The best iOS/Mac developers thrive in this environment and get handsomely
rewarded (App Store ranking, recognition from Apple); I tried and failed
miserably.

------
nextstep
This post is lacking in details. What does this mean?

>> “It wants to be strongly typed, but at the same time so convenient in type
inference, that it falls over its own feet while trying to grasp simple
expressions, and they become too complex to manage. By design.”

What is an example of an expression that is tripping over its own feet?

~~~
mpweiher
This is one example:

    
    
    time swiftc too-complex.swift
    too-complex.swift:1:49: error: expression was too complex to be solved in reasonable time; consider breaking up the expression into distinct sub-expressions
    let a:[Int] = [1] + [2] + [3] + [4] + [5] + [6] + [7]
                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~

    real	1m20.639s
    user	1m12.459s
    sys	0m5.249s

    cat too-complex.swift
    let a:[Int] = [1] + [2] + [3] + [4] + [5] + [6] + [7]

------
pkulak
I think the problem it solves is developer PR with Objective C, which looks
like absolute hell the first time you see it. Swift looks like a sleek
scripting language.

~~~
noecrnt34mw
I never understood this point of view. The first time I saw it, I was like,
"Brackets. That's slightly different than I'm used to. They look like arrays
or something. Oh well. It seems to work." And from then on it was a non-issue.
Why do developers get so hung up on this?

~~~
ChrisLTD
All the extra symbols did make ObjC more laborious to read and write.
¯\\_(ツ)_/¯

~~~
Apocryphon
Objective-C [] over the :: and <> overload of C++, imo

~~~
pjmlp
I prefer the :: and <> overload of C++ to having to type @ everywhere.

------
lifeisstillgood
> Swift actively distracts me in that endeavor by making me answer questions I
> really don't want to answer right now. Yes, stuff might be less correct in
> the meantime, but heck that is what I want during the design phase. Find my
> concept, sweet spot, iterate, pivot quickly.

This is also what annoys me about TDD - writing code is a process of
refinement. If I know exactly what the API I am going to program against looks
like, if I know the API I will offer, and I know what the internals of my code
are _before_ I write the code, then hell, I have done it before.

Oftentimes coding is exploring.

~~~
vbezhenar
The Eclipse Java compiler had a neat feature: it could compile incorrect Java
code into working bytecode, with the wrong constructs replaced by code that
throws an exception. So you could run an incorrect program (it wasn't 100%
bulletproof, of course; a curly-brace mismatch could render your program
absolutely useless, but it worked in the majority of cases). This greatly
helps with development: I can stop my unfinished work in one place and test
another, for example.

------
Apocryphon
Steven Sinofsky thread here:
[https://twitter.com/stevesi/status/1005848814048047105](https://twitter.com/stevesi/status/1005848814048047105)

Some other good ones tracked by Michael Tsai:
[https://mjtsai.com/blog/2018/06/10/on-my-misalignment-
with-a...](https://mjtsai.com/blog/2018/06/10/on-my-misalignment-with-apples-
love-affair-with-swift/)

~~~
pjmlp
Given the way Steven Sinofsky's team contributed to the political mess of
Longhorn/Vista, followed by the WinRT split and the mess it brought to the
Windows and .NET ecosystem, he is probably not the best authority to listen to
about programming languages and ecosystems.

------
bla2
There have always been heated arguments about which language is better, but
not many languages get as many "I don't like it" blog posts as Swift (maybe
Go). It's probably just because people are usually free to choose a language,
but are getting their arm twisted to use Swift on iOS. That's fine for people
who like Swift, but leaves the others no option but to complain.

~~~
gurkendoktor
I also think it's a question of community management. Before Swift was
introduced, the iOS dev ecosystem was naturally composed of people who were
comfortable enough with Objective-C. Then Apple suddenly released a language
that reversed almost every single design choice. Of course that was going to
tear the community apart.

------
saagarjha
The author seems to not be up-to-date with modern Swift:

> Adding compile time static dispatch, and making dynamic dispatch and message
> passing a second class citizen and introspection a non-feature.

There was a proposal recently to make this first-class, and not just limited
to Objective-C.

> Classify the implicit optionality of objects purely as a source of bugs.

Not purely, but often optionality is a source of bugs.

> At the same time it failed to attack, solve and support some of the most
> prominent current and future computing problems, which in my book all would
> be more important than most of the areas it tries to be good in:

> concurrency

Concurrency is being hashed out, and will probably be available in a later
version of Swift.

> overall API interaction complexity

The Swift API is refreshingly small. Are you talking about Foundation?

> debug-ability

What's wrong with the debug-ability, especially with solutions like
playgrounds?

> actual App and UI development

…this is literally the main use of Swift.

> developer productivity

Personally, I feel it's much improved.

> While Apple did a great job on exposing Cocoa/Foundation as graspable into
> Swift as they could, there is still great tension in the way Swift wants to
> see the world, and the design paradigms that created the existing
> frameworks. That tension is not resolved yet, and since it is a design
> conflict, essentially can't be resolved. Just mitigated. From old
> foundational design patterns of Cocoa, like delegation, data sources, flat
> class hierarchies, over to the way the collection classes work, and how
> forgiving the API in general should be.

This is almost a non-issue these days, unless you're interacting with C APIs.

> Just imagine a world where Objective‑C would have gotten the same amount of
> drive and attention Swift got from Apple?

The ten years before Swift existed?

~~~
EpicEng
> The author seems to not be up-to-date with modern Swift:

>> Adding compile time static dispatch, and making dynamic dispatch and
>> message passing a second class citizen and introspection a non-feature.

> There was a proposal recently to make this first-class, and not just limited
> to Objective-C.

>> concurrency

> Concurrency is being hashed out, and will probably be available in a later
> version of Swift.

Sounds to me like "modern Swift" doesn't actually have any of these things.
Did you mean to say that the author wasn't familiar with some future version
of Swift that doesn't yet exist?

~~~
saagarjha
Dynamic calls are available today. Concurrency is not.

------
fpoling
A friend of mine has been porting some Android and web apps to the iPhone. I
recently asked his opinion of Swift. He said he had no idea, as the essential
libraries the apps use are in C or Objective-C, so there was no choice but to
use Objective-C for the apps. This was surprising, as I thought that calling
Objective-C from Swift should be transparent. But it turned out it was not, as
many small but important details make the whole thing rather messy. So I also
wondered what exactly Apple wanted to get with Swift if it lacks even in
Objective-C interoperability.

~~~
seandougall
I'm not sure where your friend was getting that information, or what small
details he was referring to. Calling Objective-C from within Swift is
remarkably transparent -- where you might have called `[foo bar:baz];`, you
call `foo.bar(baz)`.

In some cases, wordy Objective-C names will get mapped to Swift's tidier
conventions, but the public APIs are all documented in both languages, and
Xcode's auto-complete is pretty good at helping you figure it out.
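
One real example of that mapping: Foundation's
-stringByReplacingOccurrencesOfString:withString: imports into Swift under a
tidier name, but it's the same Objective-C method underneath.

    import Foundation

    let fruit = "banana"
    let swapped = fruit.replacingOccurrences(of: "a", with: "o")
    print(swapped)   // "bonono"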

~~~
fpoling
The issue was with exceptions.

------
seandougall
These complaints largely seem to come down to Swift not really being what the
author expected it to be. His list of speculations as to Apple's motivations,
in particular:

> It should scale from App/UI language down to system language.

Did I miss something where Apple or the Swift team said anything about writing
a kernel in Swift? They wanted to make it fast enough for performance-critical
situations, yes, but there are plenty of those in userspace -- and I haven't
found Obj-C's inability to perform in those situations to be an asset.

> It should inter-op with existing Foundation and Cocoa, but still bring its
> own wide reaching standard library, adding a lot of duplication.

This is missing the forest for the trees. Bringing Swift-native Foundation
APIs into the standard library means they're available on non-Mac platforms,
which is huge. I'm also not sure what alternative would be preferable --
should it _not_ interop with Foundation/Cocoa, and totally fragment the
ecosystem? Or should it _not_ reimplement them with native calls, so that it
inherits the performance penalty of objc_msgSend?

> It is functional, object-oriented and protocol oriented all at the same
> time.

Apple describes Swift as protocol-oriented. It's flexible enough to support
other paradigms, but I don't see how that's a liability, and I haven't seen
Apple claim "Swift is an FP language".

> It wants to be strongly typed, but at the same time so convenient in type
> inference, that it falls over its own feet while trying to grasp simple
> expressions, and they become too complex to manage. By design.

[citation needed]

This complaint is so vague as to be impossible to address, except to say that
I haven't managed to come across a case of it yet. An example would really
help make the case.

> It is compiled and static, but emphasized the REPL and playground face that
> makes it want to look like a great scripting solution. Which it isn't.

Again, I haven't heard any claim that "Swift is a scripting language". Sure,
it's not, but neither is Objective-C, so... I'm not really sure what the
complaint is here?

> It seems to have been driven by the needs of the compiler and the gaps that
> needed to be filled for the static analyzer. Those seem to have been super-
> charged instead of catering to app developer's actual needs: efficient,
> hassle free, productive (iOS) App development.

This sounds like a complaint about strong typing and static dispatch in
general. Yes, you have to code more carefully, but the flipside is that a huge
number of errors that crop up at runtime in Objective-C become compile-time
errors in Swift. I'd much rather bang my head against the compiler for a while
before I ship than have reams of undebuggable crash logs pour in.

> It is meant offer progressive disclosure and be simple, to be used in
> playgrounds and learning. At the same time learning and reading through the
> Swift book and standard library is more akin to mastering C++. It is quite
> unforgiving, harsh, and complex.

That's true, but that's also kind of a big part of why playgrounds exist. So
again, what's the complaint? Is the fact that playgrounds make the language
more accessible a problem somehow?

Also, the only way I can view Objective-C as "simple" is by ignoring the C
underpinnings and looking only at the relatively thin layer of classes and
message passing on top. For somebody coming to coding with fresh eyes, both
Swift and Objective-C are going to have a learning curve; the difference is
that with Swift you don't have to pick up K&R first.

------
mmjaa
I agree with the author on all points, except one: there isn't any reason not
to just use Lua for everything.

Yes, that's right. Lua for everything. Lua on iOS, Lua on MacOS, Lua on Linux.
Lua on Windows.

I absolutely love the freedom, flexibility, and downright sexiness of using
one language on all of the platforms. It's a beautiful, difficult, lonely
place to be - but if you haven't tried it, you can't really knock it.

~~~
lifeisstillgood
I am unfamiliar with mobile coding, but I presume there are really good
reasons to use the "blessed" language on iOS/Android - I would assume any
scripting interfaces leave many useful (security) areas unavailable? Could you
access the Secure Enclave, for example, in Python or Lua?

~~~
mmjaa
Yes, you sure can. Anything you can do in Objective-C or Swift, you can do in
Lua.

------
pjmlp
> However, the flip side is it inhibits your creativity while writing.

The old adage of C devs against those of us on the strongly typed side of the
Algol family fence.

We all know how the freedom for creativity ends up.

[https://www.cvedetails.com/vulnerability-
list/opmemc-1/memor...](https://www.cvedetails.com/vulnerability-
list/opmemc-1/memory-corruption.html)

~~~
erichocean
> _We all know how the freedom for creativity ends up._

You mean, it completely dominates the programming world?

C is a _massive success story_ , no other language even comes close.

~~~
valuearb
C is garbage, a loaded shotgun with no safety. I say this as someone who spent
twenty years writing Objective-C code. The only way we survived was to build
safety into a massive set of core libraries and use them religiously.
------
dwaite
> So now contrast that to Swift. First of all: Which question did it desire to
> answer?

From Swift.org: Swift makes it easy to write software that is incredibly fast
and safe by design. Our goals for Swift are ambitious: we want to make
programming simple things easy, and difficult things possible.

> It should scale from App/UI language down to system language.

I don't see how, say, Go can be a system language and Swift can't.

Much of Apple's investment has been around making App and Framework
development more productive on their platforms, but Swift is open and other
companies like IBM have been focused on things like Web frameworks.

> It should inter-op with existing Foundation and Cocoa, but still bring its
> own wide reaching standard library, adding a lot of duplication.

I assume they mean the standard Swift package as the standard library, since
they listed Foundation separately. The Swift standard library is _tiny_,
basically holding core types like Int, String, Optional, and pointers,
collections (Dictionary, Set, and Array), ranges like 1..<10, and essential
common protocols like Equatable and Hashable.

It isn't until you get to Foundation that you get things like I/O, networking,
or binary data types.

> It is functional, object-oriented and protocol oriented all at the same
> time.

Several of the languages the article lists as positives (like Ruby) are also
multi-paradigm. Swift is hardly a functional language; it just has functional
influences - like nearly _all_ the languages listed.

> It wants to be strongly typed, but at the same time so convenient in type
> inference, that it falls over its own feet while trying to grasp simple
> expressions, and they become too complex to manage. By design.

Since there wasn't an example given, all I can really say is that there's
nothing that _requires_ you to use type inference. My experience from multiple
languages is that usually such an expression is wrong, and the compiler cannot
figure out any suggestions for how to fix it because a broken expression
doesn't give any type inference hints. A dynamic language would just let it
crash and burn at runtime, which (if you have a test suite capturing the
issue) is such a different debugging process from fixing type inference at
compile time that they are hard to compare.
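
For what it's worth, the workaround the compiler's own diagnostic suggests -
splitting the expression into sub-expressions - does sidestep the blowup
(sketch based on the example earlier in the thread):

    // Each statement now type-checks independently and quickly:
    var a: [Int] = [1]
    a += [2]
    a += [3]
    // ...and so on, instead of one long chain of +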

> It is compiled and static, but emphasized the REPL and playground face that
> makes it want to look like a great scripting solution. Which it isn't.

REPL and Playgrounds are both for experimentation, not necessarily for
scripting. You can do command-line swift scripts, but generally the compiler
makes them a bit heavyweight for many things.

> It seems to have been driven by the needs of the compiler and the gaps that
> needed to be filled for the static analyzer. Those seem to have been super-
> charged instead of catering to app developer's actual needs: efficient,
> hassle free, productive (iOS) App development.

Can't say much here, except that having a computer analyze your code is meant
to be a boon, not a hindrance. My experience is that when I write Java
projects now, I wish I could get back the expressiveness, performance, and
personal productivity I have writing Swift code.

> It is meant offer progressive disclosure and be simple, to be used in
> playgrounds and learning. At the same time learning and reading through the
> Swift book and standard library is more akin to mastering C++. It is quite
> unforgiving, harsh, and complex.

If nothing else, someone is underestimating the effort of mastering C++.

~~~
mpweiher
> incredibly fast and safe by design.

Except it _actually_ is incredibly _slow_ , not fast. The only thing fast
about it is the marketing, and that is mostly fast and loose.

I ran a ton of benchmarks for my performance book, and despite the fact that I
didn't expect much, I was still surprised by just how badly it misses on the
performance front.

As an example see how Kitura compares to other web frameworks in performance:
[https://www.techempower.com/benchmarks/#section=data-r16&hw=...](https://www.techempower.com/benchmarks/#section=data-r16&hw=ph&test=fortune)

And Kitura actually implements the central HTTP parser in C. Porting that same
code to Swift made it (that code, not Kitura) 10x slower.

As a second example, see JSON parsing. Putting Swift on top of
NSJSONSerialization typically adds a 25x performance overhead. Yes, that's
twenty-five. And NSJSONSerialization uses NSDictionary and friends to
represent the parse result, which is already very, very slow. 25x on top of
that is no mean feat.

Anyway, the good folks from Big Nerd Ranch decided the problem was the interop
and wrote a parser in pure Swift, called Freddy. It was a "success": Freddy is
only 8x slower than NSJSONSerialization.

Oh, and if you think Swift serialization fixes this: it doesn't. Still around
an order of magnitude slower.

And I didn't even go into the role of the optimizer. Unoptimized code is much,
much slower relatively speaking, sometimes 100x to 1000x (yes, that's a
_thousand_ ) slower than optimized code. Note that the default debug mode in
Xcode is unoptimized and that 1000x is the difference between 1 second and 20
minutes. And of course you don't get a diagnostic if an optimization is not
applied.

The "Swift is fast" meme is pure Apple marketing at its most deceptive. It has
never been true in the past and isn't true now.

As to safety: there is very little to no evidence that static typing actually
increases safety, studies go either way and when there is a positive effect it
is _tiny_. See also: [http://blog.metaobject.com/2014/06/the-safyness-of-
static-ty...](http://blog.metaobject.com/2014/06/the-safyness-of-static-
typing.html)

> [type inference falling over] usually such an expression is wrong
    
    
    time swiftc too-complex.swift
    too-complex.swift:1:49: error: expression was too complex to be solved in reasonable time; consider breaking up the expression into distinct sub-expressions
    let a:[Int] = [1] + [2] + [3] + [4] + [5] + [6] + [7]
                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~

    real	1m20.639s
    user	1m12.459s
    sys	0m5.249s
    

This compiles fine if you reduce the number of elements. But that's of course
just one of the most egregious examples. Overall, compile times are
horrendously slow, again we are talking order(s) of magnitude, and that is
with the horrible C compilation model and its ridiculous header files as a
baseline.

~~~
YouAreGreat

        let a:[Int] = [1] + [2] + [3] + [4] + [5] + [6] + [7]
    

What's the reason - something like exponentially many candidates for overload
resolution?

> there is very little to no evidence that static typing actually increases
> safety

Well, C is statically typed; Python and Common Lisp aren't. We need better
categories than "static/dynamic".

------
khitchdee
Call me a purist, but I find no reason to have gone beyond C. Not C++, not
Objective-C, nor anything else that followed. I find C++ and Objective-C add a
lot of complexity, and the benefits of adding this complexity are very
limited. I think it would be better to stick with C and pursue other ways to
improve developer tools.

~~~
sjwright
You’re not a purist, because there’s nothing inherently pure about C — it’s
just another language.

~~~
khitchdee
Well, there is. It's the first "high level" language that replaced assembly
programming. All the other languages have added even more high-level features,
yet none of them have made its existence obsolete. The basic compiler
translates from C; most other languages add a translational front end to a C
compiler.

~~~
pjmlp
That is an urban myth from the C fan club.

There were quite a few system languages that replaced Assembly, some of them
even 10 years before C was invented.

[https://en.wikipedia.org/wiki/Burroughs_large_systems](https://en.wikipedia.org/wiki/Burroughs_large_systems)

[https://en.wikipedia.org/wiki/PL/8](https://en.wikipedia.org/wiki/PL/8)

[https://en.wikipedia.org/wiki/IBM_PL/S](https://en.wikipedia.org/wiki/IBM_PL/S)

~~~
khitchdee
Sorry, I meant in the microprocessor era when programming became mainstream
and stepped out of academia. Before that, there were so few programmers that
languages didn't need to develop.

~~~
pjmlp
Burroughs, Olivetti, IBM and DEC were hardly academia and their languages were
quite advanced.

~~~
khitchdee
In terms of the number of programmers programming them compared to now, I'd
guess a factor of 10,000 at least.

