
Funny how people here mention source-breaking changes as the main issue with the language. I think it's because they haven't used Swift on a large codebase. The main issues once you start working on a big project are compiler crashes ( https://github.com/practicalswift/swift-compiler-crashes ) and compilation time.

I really don't understand how a compiler can crash that much (I've coded in many client and server languages before, and this is the first time I've seen a compiler crash while I code). It makes me really nervous, because it shows that the unit tests aren't good enough, and that the language is really still at an alpha stage, no more.

As for compilation time, you may think it's just a matter of optimisation. But the problem is that until those optimisations arrive, you can't rely too much on type inference (the main source of slowdowns), which takes away a lot of the beauty of the code.
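A minimal sketch of the workaround the parent is hinting at (the expression is my own illustration, not from the thread): chained, heavily overloaded operations on untyped literals can send the type checker into a long search, and the fix is to split the expression and annotate types explicitly, trading inference for predictable compile times.

```swift
// Expressions like
//     let n = [1, 2, 3].map { $0 * 2 }.reduce(0, +) + 4.5
// were notorious in early Swift for multi-second type checks, or outright
// "expression was too complex" errors.

// Workaround: break the chain apart and annotate every step explicitly.
let doubled: [Int] = [1, 2, 3].map { (n: Int) -> Int in n * 2 }
let total: Int = doubled.reduce(0) { (acc: Int, n: Int) -> Int in acc + n }
print(total)  // 12
```

The annotated version is wordier, which is exactly the "diminished beauty" trade-off the comment describes.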

Now that was for client side. Server side probably adds its share of issues.

I used to be very optimistic about this language, but I'd say the Swift team only has one more shot to make it really production grade before word spreads that this language is, for now, just a joke (maybe Apple using it for its own apps would help allocate more resources).




Yes, as part of a project that has used Swift for the last year, migrating away from Objective-C on a decent-sized codebase, I can say it is not ready for primetime.

Swift 2.2 + Xcode 7 wasn't great, but it was livable.

Constant SourceKit crashes make Xcode essentially a text editor, and not a good one. All indexing and highlighting, essentially all IDE functionality, is lost.

This is the worst development experience that I've seen in 20+ years as a developer.

I thought the CoreData / CloudKit debacle from 3 years ago was bad, but oh my God, I just want to jump ship and go to Android, switch to Xamarin, or just leave mobile at this point.

It would be nice to have some level of optimism and say this is growing pains, but I don't have any faith that the Apple developers are competent in making this better.


Most of my problems seem to be with Xcode. The crashes can be pretty frustrating, and most I encounter are repeatable (which makes me think they should have been caught in testing).

The failure in releasing Xcode 8.2/iOS 10.2, but not updating the iTunes Connect backend to allow iOS 10.2 as the max, was pretty disheartening. All apps were automatically rejected for half the day. How does something like that slip through the cracks? To find out if it was fixed I had to periodically check Twitter :\. There was no blog post, no status page I could check; my only hope was some unlucky dev I found who didn't even work on the iTunes Connect/App Submission team.

I think the consensus is that we need nothing but bug fixes on the core stuff (Xcode, SourceKit, etc).


Indeed. I think Apple has a general software-quality issue and should now think about hiring some senior devs from Microsoft to help them sort out their process.


How big is this decent-sized codebase?


397 Swift files, 2K Obj-C.


1000 times this. I've been really enthusiastic about the idea of a modern, compiled, type-safe systems language backed by a major tech juggernaut.

But the compiler crashes and sourcekit instability are just breathtaking (even in 3.0+). I'm surprised I haven't seen this get more attention.

I'd (somewhat) expected something like this for the first few releases, but two years in I'm starting to think the team bit off more than they can chew with the type system.


SourceKit is unbelievable! When using Xcode, most of the time (as in >50%) it causes very high CPU load, and it's not uncommon for it to use >3GB of RAM.


SourceKit issues fall into two categories: either the source causes the compiler itself to crash (due to one of those compiler defects mentioned elsewhere in the thread), or there's some issue with the build configuration.

The latter category is especially nasty, and SourceKit won't inform you of it, except perhaps by crashing. Oh, and that dumb bar in Xcode.

You can compile all the Swift source in your target successfully but still have SourceKit choke on your build configuration. Accidentally introduce a duplicate set of headers in HEADER_SEARCH_PATHS? Degraded performance at best, and at worst the compiler crashes every time you edit text and SourceKit invokes the compiler. In cases like this the SOURCEKIT_LOGGING environment variable is your friend. Have fun combing through those logs to tease out which build option is causing your issue.

I spent the better part of two days sifting through SourceKit logs to figure out why a project's autocomplete wouldn't work. Yet, I wouldn't want to go back to writing Objective-C primarily. Because Swift as a language rocks.


I actually love the source breaking changes. The alternative is easier and much worse, and it takes boldness to not roll with the alternative, e.g. add language bloat and keep suboptimal decisions as part of the language just to support code that's already shipped.

C++ is the prime example of what the alternative looks like.


I'm not sure I agree.

My girlfriend has decided to learn programming. She started with Swift because she wanted to write iOS apps. The course she's using targets Swift 2 but her install is Swift 3, or something like that (I haven't looked into it much).

Xcode constantly flags stuff like x++ being replaced with x += 1. Really? Surely that's the kind of decision you make before version one of a language, not years after release. Why this pointless churn?

If these sorts of totally-indecisive deprecations were rare then you could overlook them, as otherwise Swift is a rather nice language. But they're everywhere. Even in trivial examples intended for beginners line after line of code gets the yellow deprecation warnings. "What does deprecated mean" was literally one of the first questions my girl asked me as she started out learning programming, which is ludicrous. You shouldn't be encountering deprecation warnings over and over when targeting a brand new platform using a teaching course barely a year old.


If your girlfriend wants a stable language for iOS development she should use Objective-C. Apple have been very clear about the fact that Swift is evolving and that there will be breaking changes.

Changes to Swift are hardly indecisive or pointless. A very clear rationale for removing the ++ operator was laid out here: https://github.com/apple/swift-evolution/blob/master/proposa....

While these changes may be painful now (though the removal of the ++ operator seems minor to me), they should result in a stronger and simpler language in the long run.


The course she bought (without my input btw, she's pretty independent!) uses Swift. And that's probably right. Swift is a much nicer language than Objective-C, much closer to modern languages and has far less of the C heritage poking through. Even with the deprecation warnings, it's probably easier to deal with.

Removing x++ is indeed minor, which is why it's so curious. Sure, there are arguments for removing it. But there are also arguments for minimising language churn and just living with these things.


You could give your girlfriend a little help. How about installing the Swift version her course requires with swiftenv? That way she could learn the basics without getting nagged by Xcode. Later she could read up on the differences and port her code for practice.


Living with x++ wouldn't be terrible, but I don't mind having to make changes to my code now that Swift is still very new if it ends up becoming a better language. Teaching x++ and ++x to beginners can be quite a pain.
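The teaching pain can be sketched in a few lines (the comments show the old Swift 2/C behaviour; only the `+=` form survives in Swift 3):

```swift
// Swift 2 (like C) had two increment forms with different return values:
//   var a = 5; let x = a++   // x == 5, a == 6 (post-increment)
//   var b = 5; let y = ++b   // y == 6, b == 6 (pre-increment)
// Swift 3 keeps only the statement form, which returns Void, so there is
// no pre/post distinction left to trip beginners up:
var count = 5
count += 1
print(count)  // 6
```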


It feels backward to modify the language in a way that makes it less functional: replacing a construct that returns a value with one that returns Void.

Definitely going against the general language trend, there.

And the justifications to remove ++ are downright bizarre in my opinion.


The C-style for loops were also dropped. Now you do `for idx in 0...5`, or you can do `for idx in 0..<5`. I think this makes it clearer whether you want to include or exclude the last index in the loop.
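The two range operators side by side, in a small runnable sketch (the array is just an example):

```swift
// Closed range: the upper bound is included.
let closed = Array(0...5)     // [0, 1, 2, 3, 4, 5]
// Half-open range: the upper bound is excluded, natural for array indices.
let halfOpen = Array(0..<5)   // [0, 1, 2, 3, 4]

let names = ["ada", "brian", "chris"]
for idx in 0..<names.count {
    print("\(idx): \(names[idx])")
}
```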

I think this kind of break is not that bad. Swift is a young language, and there are some things you only realize could be better after a lot of real-world use.

Swift is 2.5 years old and has been open source for just one year, so I don't see these breaking changes as a big deal. It's not in a situation like Python 3. The compiler even helps you fix things, and Xcode is very helpful when translating Swift 2 to Swift 3, so it's not that bad.


Swift development started in 2010, so it's about 6.5 years old at this point.


Which is still younger than Rust or Go, it's probably the youngest widely used language.


Rust didn't start real development until 2010, it was mostly notes and incomplete PoCs before then (which I'm sure Lattner was doing for Swift before 2010, and Pike for Go as well).


As I understand it, July 2010 was literally the first commit for Swift in any form (https://techcrunch.com/2014/06/04/apples-new-programming-lan...), as a personal project for Chris Lattner. I think the complete history is on Github now.

Of course, age only tells you so much about the amount of work put into the project, and Rust in particular took a huge amount of conceptual work to get its current model of ownership.


Source?


Wikipedia.


A month ago, I tried the getting started on Apple's website. Even that had deprecated syntax =:)

Btw, Xcode still doesn't support refactoring Swift code


No, it is Java that is the prime example of what the alternative looks like.

And in spite of all problems with Java and its standard library, there's something very liberating about having code written in 1997 that still compiles and works in 2017.

And the beauty of Java is that it became a platform, so if you hate the language, you can pick another that's closer to what you think programming should be, like Scala, Clojure, JRuby, Groovy, Kotlin, etc. and still benefit from that piece of code written in 1997, all made possible due to Sun's fanatical devotion to backwards compatibility.


I've been a heavy Java platform user for years, mainly Java language but also with some Groovy and Scala. I agree with what you're saying to as far as it goes.

Where Swift excels is that it has the performance of C for the most part, and it will continue to improve. Another strength is ARC over Java style GC. As the slides point out, to get the same performance with GC, you need 4x the memory. With only 2x the memory, you get 70% lower performance with Java. Most important, though, is determinism. GC is simply not suitable for real time, hard or soft.

I'm very excited about Swift, and the current version of Xcode is working pretty well for me. I love the aggressive goal of replacing the C family of languages down the road...


> As the slides point out, to get the same performance with GC, you need 4x the memory.

I'd take that claim with a pretty big grain of salt. The paper they referenced used an experimental GC that never saw production use. Who knows how well or badly the default collector in the JVM would perform.

Also, it is not exactly recent. Hardware changes could have shifted the result in either direction.


The exact numbers are worth a grain of salt. The general point aligns well with my JVM experience. Requiring at least double the working set for decent performing GC doesn't seem congruent with running the leanest possible data centers.

Regardless, determinism is the really big win. :-)


Scala and Apache Groovy benefit from the JVM's backwards compatibility, but programmers can't benefit from backwards compatibility in those languages. Groovy even broke compatibility between various 1.x releases.


Which is a pity. We take backwards-incompatible changes too lightly. I work with Scala, for example, and love the language, but it breaks binary compatibility between major versions because Scala's features don't map that well to the JVM's bytecode, and the encoding of things like traits is fragile. The community is kind of coping with it by having libraries support multiple major versions at the same time. It's not much, but at least the tools have evolved to support this (e.g. SBT, IDEs, etc.).

That said, if you're looking for a language that has kept backwards compatibility, that's Clojure. Rich Hickey has a recent keynote that I found very interesting: https://www.youtube.com/watch?v=oyLBGkS5ICk

I don't know how to feel about it. On one hand this means Clojure will probably never fix some obvious mistakes in its standard library, without changing its name (like what they did with ClojureScript). On the other hand, as I said, it is liberating to have code written in Clojure 1.0 still working in 1.8.


Python is the prime example of what breaking a language looks like.


If anything, Python is a good example of not breaking enough in the main language.


> it takes boldness to not roll with the alternative,

Yes, like the "courage" it takes to remove the 3.5mm headphone jack.


> compiler crash

That list is quite impressive, hundreds of crashing cases discovered manually, and nearly thirty thousand crashes fuzzed.


Could someone define "compiler crash" and explain what it looks like? Does Xcode itself crash?


Your code stops compiling with a pretty unhelpful error message. Y'know, like Segmentation Fault 11, and a stack trace into the Swift compiler source code if you're lucky.

Xcode doesn't always crash (though it does, frequently), but syntax highlighting goes away at the first hint of an issue.



