Hacker News
Swift: Not quite there, but too far gone too (studentf.wordpress.com)
32 points by neop on June 4, 2014 | 48 comments



I started writing an ORM (that's how I learn a new language). Very quickly I hit problems. The keynote and the State of the Union say Swift is ready for production, but it's not there yet. The coding experience is buggy, the REPL violates the language (it's possible to change a constant to another type!), importing other Obj-C code is sometimes broken (i.e. it doesn't work at all, or it does), a lot of things are not documented (for example, how do you write a generator?), etc.

Still, I hope this language matures and becomes popular, because I badly want to stop using Obj-C.


Can you provide more details? All of the issues you mention seem like typical early beta expectations.


Other issues are, for example, how to import code into a playground:

https://devforums.apple.com/thread/227949?tstart=0

and when importing Obj-C code https://devforums.apple.com/message/972445#972445 it suddenly can't refer to it.

How much of this is about Xcode and how much is the language, I don't know.

The thing is, I'm an avid supporter of alternative languages. When everyone used VB, I used FoxPro. When it was C, I used Delphi. I went with iOS instead of Android, and Python instead of PHP.

But with Swift I hit, in less than half an hour, several crashes & weird behaviors (not just "I don't know how to use this language") that I rarely experience when using other languages.

But I'm still continuing with it, because the language, despite its flaws (no exceptions? why...? and other stuff), seems enjoyable.


Some of the issues are in https://devforums.apple.com/community/tools/languages/swift

For example, when the REPL is invoked with /Applications/Xcode6-Beta.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/swift, it is possible to write:

let a = 1

This creates a constant integer. According to the docs, it is a constant and can't change.

However, you can write:

a = 2

And it works. But if you write it in Xcode, it is shown as an error.
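A minimal sketch of the documented behavior (the REPL bug was that it accepted the reassignment anyway):

```swift
let a = 1
// a = 2   // per the docs this is a compile-time error, and Xcode flags it;
//         // the beta REPL incorrectly accepted it

var b = 1  // a var, by contrast, can be reassigned
b = 2
print(a, b)
```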


Why the downvote?


Maybe because neither the keynote nor the State of the Union said Swift is ready for production.


Did you see the videos? They expect to get apps made with Swift at the launch of iOS 8, and they have rewritten at least the WWDC videos app as a showcase.

P.S.: This remark was about 32 minutes into the State of the Union video.


The launch of iOS is several months in the future.

Swift, like Xcode 6, is still beta.


Yep, but in language terms, that's just around the corner!


No idea why someone would have downvoted you. Your post was the first one that has given me insight into what it's actually like to work with the language at this early stage.


grammar maybe?


The article suffers a bit from looking at things from strictly theoretical knowledge (i.e. the first example is not valid Swift code), but it does bring up a few interesting points.


I agree that instances being pass-by-reference is weird, with everything else having copy semantics. It seems like yet another odd artifact of the design constraint that Swift must be ObjC-runtime compatible.

I sort of wish they had created a special type annotation to support legacy ObjC classes, and gone for Go-style structs for the non-legacy OO.


It's more than that. I don't think having non-reference objects is entirely impossible (e.g. Objective-C blocks are already stack-based objects until they are copied), but it doesn't work for Swift's design goals. Objects have to be reference types because Swift needs to interact with Cocoa and Cocoa Touch, which assume reference-sharing in many places.
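A minimal sketch of the value-vs-reference split being discussed (illustrative types, not from the article):

```swift
struct PointValue { var x = 0 }     // value type: assignment copies
final class PointRef { var x = 0 }  // reference type: assignment shares

var v1 = PointValue()
var v2 = v1
v2.x = 99        // v1.x is still 0: v2 is an independent copy

let r1 = PointRef()
let r2 = r1
r2.x = 99        // r1.x is now 99: both names refer to the same instance
print(v1.x, r1.x)
```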


> Except that you can still make changes that don’t affect length (immutArray[3] = "Whoopsie")

That sounds like a bug to me. I would expect the compiler to detect it as an error (it doesn't). Can anyone point me to the relevant bit of the spec that says you can do that?


"Immutability has a slightly different meaning for arrays, however. You are still not allowed to perform any action that has the potential to change the size of an immutable array, but you are allowed to set a new value for an existing index in the array. This enables Swift’s Array type to provide optimal performance for array operations when the size of an array is fixed."

https://developer.apple.com/library/prerelease/ios/documenta...
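In concrete terms, the behavior the quote describes looks like this (a sketch of the quoted beta semantics; the element assignment was accepted at the time, while anything size-changing was rejected):

```swift
let immutArray = ["a", "b", "c", "d"]
// immutArray[3] = "Whoopsie"   // allowed by the beta, per the docs above
// immutArray.append("e")       // rejected: has the potential to change the size
print(immutArray.count)
```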


So immutability is a code safety feature, except in the case of arrays, where it's a performance optimization. That seems counterintuitive.


Thanks. I think it sounds like a bad decision; it can be checked at compile time, so it won't cause performance issues. I wonder what the cost is in the mutable case for the fact that the length can vary.


Optional types are not the only use of algebraic data types. I agree that it's weird to have them and optional operators in the same language, but I don't see it as a big issue.
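For instance, a non-Optional use of the same machinery, sketched with an illustrative enum:

```swift
// Enums with associated values are Swift's general algebraic data type;
// Optional is just the special case with a `none` and a `some(T)` case.
enum Shape {
    case circle(radius: Double)
    case rect(width: Double, height: Double)
}

func area(_ s: Shape) -> Double {
    switch s {
    case .circle(let r):        return 3.14159 * r * r
    case .rect(let w, let h):   return w * h
    }
}
```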


"... but this language, once released, will be fixed for a decade at least. Something with that lifespan should be great from day one."

I'm not sure why the author believes that once the language ships it cannot change. Surely all languages evolve and change over time, it would be foolish to think Swift as it ships in September will not change for a decade after that.


A language that is constantly changing in fundamental ways is not usable for serious software. Languages are largely constrained by the choices that were made when they were released. See Python 3 for an example of what happens when a language tries to make breaking changes — and Python's were relatively minor! No language could thrive while going through that sort of strife constantly.


Objective-C has added several features in recent years; things like bindings, dot notation, etc. are relatively new, but pretty substantial changes in terms of code clarity.

As for Python 3, the Python community never intended for people to move over to Python 3 immediately, nor was it intended for people to move their Python projects from Python 2 to Python 3 unless they had some reason to (e.g. Django, public libraries, etc).


Objective-C has added things, but it has largely not changed existing things. As I said in another comment, languages generally do not change — they just accrete.

If you feel like a language is missing something, that might be fixable — but if you feel like a language either added the wrong thing or did something the wrong way (which is the OP's concern), that is a much more difficult problem, because after release you can't take away what's already there without making people angry.


Languages generally don't change -- they get added-to and very rarely removed-from but they don't often change. For example, it's usually very hard to add new keywords. So C++ for example, uses every possible keyword and symbol overloaded to provide new features to the point of ridiculousness. Java is on the same road. Python changed a few things and the split between 2.x and 3.x is still ongoing.


This was about the pace of language development once upon a time. C++ changed glacially for its first 20 years, for example. Perl, Ruby, and Python also took pretty conservative approaches to language evolution (but kudos to Python for finding a way to encode versioning information into the code itself). I think Objective-C didn't really change all that much until the mid-2000s, probably in prep for iOS?

So the expectation seems like it's probably based on real history, but things seem different now. Languages are evolving faster than before, even some of the ones that previously moved very slowly.


By and large, languages don't really change — they just accrete. It is very rare for something that used to work one way to later work another way. For example, languages generally do not go from being statement-based to being expression-based, or go from something being mutable to immutable, or eliminate operators. (MzScheme did the second one — it went from mutable to immutable defaults — and it was considered so significant that they stopped calling their language Scheme and renamed it Racket to avoid confusion!)


I agree this has historically been true, but again, I think this is changing. Ruby 1.9 and Python 3 both did more than accrete, they actively broke existing code in quite significant ways.

C++ has so far avoided completely breaking changes, but with all the accretion it's doing now it's probably only a matter of time before some significant breaking changes happen lest it become even more ridiculously complex than it is now.

Go has had breaking changes as well, I believe, but they have a smart upgrade tool to help with it. This is probably something that will catch on for other rapidly evolving languages.

I think we'll see a lot more of this kind of thing in the future.


Ruby and Python each did one release where they were willing to do significant changes. That's it — they're not doing it again for a good long while now. It's an isolated incident, not a trend in those languages' development practices that we can project into the future.

Go did breaking changes pre-1.0, but they are now committed to providing a stable platform that only accretes features (http://golang.org/doc/go1compat).


I don't expect either of them to do it for a while either. The trends I'm talking about are in PL development in general. More breaking evolution is taking place post-initial-development than ever has before in languages both new and old.

Note that the Go compat wiki you link to acknowledges a future Go 2 that may break compatibility. That's actually a pretty strongly pro-evolution statement compared to past languages.


The statement you were questioning is that Swift will be "fixed for a decade at least" after release. If you agree that it will be at least four years before Python makes any more changes like Python 3, then you are agreeing with the OP.


I'm noting an acceleration in recent years and believing in the possibility of further acceleration going forward. Part of that being newer languages being more willing to undergo breaking changes sooner in their evolution than older ones were. Thus, I agree that Python might stay at a big change every ten years (which would still be faster than historical language evolution!) while still believing that swift or go might go for faster than that.

It's also worth noting that Swift isn't even at 1.0 yet, and they've said there will be changes before release. So I also disagree with a somewhat hysterical "we'll be stuck with this!!11!!" right now.

It's all just guesses, though. We'll see.


Python 3 is also not meant to run Python 2.x code. It's pretty trivial to port most code from Python 2.x to Python 3, but there's no real reason to do so unless you're writing a library or other project for other people to take advantage of.

It was intentionally done as a 'clean break' release; 2.x keeps doing the same stuff it used to do, and Python 3 changes a bunch of stuff which, in hindsight, makes sense (such as a distinction between 'stream of bytes' and 'string of text', vs. 'stream of bytes which may or may not be ascii text' and 'string of unicode text').

That said, there's no real benefit to moving an existing project/codebase from 2.x to 3.x, and it was never intended that there would be one. Python 3 is for new projects; Python 2.x is for existing projects, or new projects which need deployment in older environments.


Which is one of the reasons I think it's a sign towards acceleration. People are developing strategies to deal with language evolution. This, "import from __future__", and gofmt are all tooling that helps you deal with a language that's still willing to evolve.

That said, I don't think this part of the python3 effort entirely succeeded. Unfortunately they share a package source, so a lot of projects do need to support both. But whatever failings there have been in the python jump to 3.0, they're nothing compared to the disaster that was ruby 1.9, even though I think 1.8.7 is truly relegated to legacy now.

As long as they learn from those issues, though, I think the future is bright for non-stagnating languages.


This isn't right about the renaming. The switch from mutable pairs to immutable pairs happened in release 4.0 of PLT Scheme, in June 2008. The name change was in May 2010.


I don't understand the obsession with mutability. When is it a problem that a variable is mutable?


It is a very different way of thinking about programming, but immutable inputs remove a whole class of bugs that can happen.

It is analogous to global variables. We know they are bad, because any other piece of code can change them and break our code.

A variable having state is similar. When you think about what a function does, or write tests, a variable that can be in many unknown states increases the complexity.

In code we write everyday, a variable might be undefined, null, a valid phone number as a string, etc. But in immutable code, if your input comes from a function that returns either None or a valid phone number as a string and no other code can tweak this... then your code becomes much easier to think about and write tests for.
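A sketch of that phone-number point, with illustrative names:

```swift
// The input is either nil or a valid number string, and nothing else can
// change it after the call, so there are only two states to handle.
func label(for phone: String?) -> String {
    guard let number = phone else { return "no number on file" }
    return "call \(number)"
}

print(label(for: nil))
print(label(for: "555-1234"))
```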


Check out The Value of Values, a talk by Rich Hickey (the creator of Clojure). He makes a good case for immutability.

http://www.confreaks.com/videos/1830-jaxconf2012-keynote-the...


When it surprises you. It is easier to reason about code when you know that values will not change - an immutable value cannot change so you don't have to worry about passing it around, using it in asynchronous processes, changing it without persisting the change, etc.


Probably because it's hard to track the chain of calls that could have changed the data. If it's immutable, you know exactly what you have at hand.

But the article points to an inconsistent mutable/immutable behavior that is more problematic.


[deleted]


Most modern compilers transform source code into a single-static assignment form, rendering (im)mutability notations useless from an optimization standpoint. (Ignoring for now the case of non-local assignment, e.g. through aliased pointers.)

What you actually gain from (im)mutability notation is documentation and sanity checks: it's a huge win to tell the compiler "yell at me if I ever try to modify this variable", which unambiguously indicates to a human "don't worry, this variable is never modified in this code".


> So this is a list of things I don’t like about Swift based on things I’ve read in the book, without having experimented with the language myself.

I stopped reading about here.


Yeah, this immediately made me picture Steve Carell's character from "The 40-Year-Old Virgin" trying to describe what sex must be like.


The good part is that the following smooth transition is now possible for the core iOS developers:

ObjectiveC -> Swift -> C#: iOS, Android, Windows Phone etc.


FTFA;

>"So this is a list of things I don’t like about Swift based on things I’ve read in the book, without having experimented with the language myself..."

Question; why post this then?


IMHO he made some valid points, all from reading the spec. Experimenting with the language wouldn't really invalidate them.


Because one would hope that the official Swift book does a sufficient job of describing the language?


Drop the snark. I doubt that it does a sufficient job to support a critique of a language that the OP admits they haven't tried! This article is nothing but hearsay and as such is worthless.


That wasn't snark; it was an answer. The fact that you have some unexplained "doubt" about whether it is correct does not make it snark.

But let's delve in. Why is it impossible to analyze a language from a programming language theory perspective without having used it? Obviously having used it will give a fuller perspective, but I don't see what you think is so deficient about the extensive documentation Apple has put out that it's impossible to comment from a theoretical standpoint on that basis.

To put it another way, I think you should drop the snark. If you have an actual problem with the critique or with the Swift documentation, you can bring it up. A snarky dismissal like "Question; why post this then?" does not add anything.



