Swift 2.1 (russbishop.net)
81 points by ingve on Sept 24, 2015 | 50 comments



Wake me up when they open source it like they said they were going to. Proprietary languages are evil acts of lock-in committed against unwary junior developers. Swift is a great language, but I'd really like my skills to be somewhat transportable.


"Swift is a great language, but I'd really like my skills to be somewhat transportable."

If your skills are locked in to a certain language then you're doing it wrong.


9/10, given two coders of the same talent, I'll always hire the one with experience in the language/framework. But I hire for velocity, i.e., people who can develop productized solutions (bug free) in a short period of time. People who know the theory but not the in-the-trenches reality? Well, you don't need that many of those folks.


Apple open sourcing Swift is not magically going to make your skills transportable.

The only person who can make your skills transportable is yourself.


Yep! And the way to do that is stay away from proprietary technologies wherever possible.


Then you're limiting the amount of skills you possess.

Learn open source and proprietary technologies alike, now you become someone with a lot of diverse skills.

This open source fanboyism makes no sense if you want to be professional about your craft.


You have to limit what you are going to learn; otherwise you will have no time to build anything. So you have to pick carefully.

Generally, the open technologies have proved to be useful in the long term, and even when they change, they change evolutionarily. You don't get big bumps and complete U-turns in your progress.

There is also no one who can suddenly decide that the technology you invested in is no longer profitable for them and cease development. Open source technologies die only when they are no longer useful to anyone.

So basically, a strong preference for open technologies is pragmatic risk mitigation and investment protection.


Hello German friend,

> So basically, strong preference for open technologies is a pragmatic risk mitigation and investment protection.

.NET and Microsoft technologies completely invalidate that claim, don't you think?


Nope, I won't touch them either, for the same reason. Although MS is starting to get its act together and open sourcing parts of its ecosystem. Anyway, who would write a .NET application on Windows? They would need their head seeing to.


Perhaps at the Swift conference at the end of October?

Coupled with the amazing NeXTBSD work that's spun up, I've found myself surprisingly excited about what Swift might become.


I'm guessing they'll announce open source Swift at the next LLVM conference this fall.


Which makes me think you aren't doing any iOS or Mac OS X development anyway, so the state of Swift's openness doesn't matter.

So go pick OCaml, Haskell, Rust, Idris, ... instead.


>Swift is a great language, but I'd really like my skills to be somewhat transportable.

Huh?


I think the OP is talking about multi-platform code. Swift isn't "write once, compile anywhere" like C or C++. This is also the reason I haven't used it yet, although I've read the Swift book and I'm looking forward to multi-platform capabilities (if that were to happen).


Looks like a good set of changes.

A question for the more switched-on among us: has there been any word on when Swift 2 will be open-sourced?


Author here; my understanding is that the compiler team is hard at work doing some necessary housecleaning and reworking. It's something they've wanted to do for some time.

No official word from Apple though.


Chris Lattner said on Reddit that they're on track to deliver when they originally promised: by the end of the year.


Link?



Ah good news. Hopefully there is a standards track for it as well.



[flagged]


This comment breaks the HN guidelines. We ban accounts that do this repeatedly. Please comment civilly and substantively, or not at all.

https://news.ycombinator.com/newsguidelines.html


You might consider posting a little more substantial a warning if you're going to leave it publicly. Your comment on the flagged comment doesn't leave much context for anyone else coming to this.


Sorry if that wasn't obvious. I usually try to make it clear what the problem was. In this case the offending comment was both uncivil ("Get over yourself, kid.") and unsubstantive ("PC masterrace, amirite guise? Hurrr"). Uncivil, of course, is the worse of the two.

If you or anyone wants to read flagged comments or comments killed for any reason (other than the author deleting them), you can set 'showdead' to 'yes' in your profile.


It was an insubstantial reply to an insubstantial comment.

The way I see it, the comment received a warning because it was contentless and offensive without reason. Does there need to be more context?


Apple is evolving Swift quite fast. It's a great language. I like that they adopted Go's _ to ignore unused variables. Xcode 7 also recommends changing vars to lets where you don't mutate the value after the initial assignment.

Anyway, let's hope the developer community jumps on board for the ride. For anyone interested in learning Swift, I've accumulated just under 1,800 Swift URLs in the past 15 months:

http://www.h4labs.com/dev/ios/swift.html
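A minimal sketch of both features mentioned above; the function and names here are invented for illustration:

```swift
// Discarding values with the wildcard pattern, as in Go.
func fetchCount() -> Int { return 42 }
_ = fetchCount()                    // result deliberately ignored

let (name, _) = ("Swift", "2.1")    // ignore one tuple element

// The var-to-let suggestion: this binding is never mutated after
// assignment, so `let` is the right choice.
let greeting = "Hello, \(name)"
```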


Not to be antagonistically pedantic or anything, but just FYI, Go was definitely not the first to use an underscore to ignore variables.


If you don't mind me asking, what was? I thought it was too.


Erlang got it from Prolog and Prolog's been around since 1972.


We truly stand on the shoulders of giants.


Haskell and Erlang come to mind. They probably weren't the first either.


In python it is quite commonly used too.


Clojure idioms use it as well. Lots of languages do.


Actually, some of those features are still buggy when crossing into Obj-C land :)

      //Warning if you don't use let :-) 
      let array = NSMutableArray(capacity: 50)
      array.addObject(notification)

I am aware that the let essentially means I can't reassign the variable and has nothing to do with the mutability of the class, since it's a class and not a struct.

However, what I am saying is that Apple needs to somehow add the keywords to the container classes imported from Cocoa; this would be manual work, I guess.

I wonder if adding a mutating keyword to the declaration would help:

    public mutating func addObject(anObject: AnyObject)
That way it should throw a warning on a let usage. But that keyword only exists in Swift. They would have to go through all of them like they did with the nullable keyword.

If you're wondering why I am using CFArray: performance. Structs are pretty slow, since you'd have to copy the value out of the dictionary to mutate it and then insert it into the dictionary again.

Though those numbers were with Swift 1. I will try again with Swift 2, but given the nature of immutability, I suspect it will still be slower.

Now that I think about it, I might make my own generic structs that use NSArray/NSDictionary internally. Though I wonder if you can pass structs by reference instead of by copy; that would be more useful.
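On passing structs by reference: `inout` parameters give roughly that effect. A minimal sketch, using current parameter syntax (Swift 2 wrote `func bump(inout c: Counter)`); the `Counter` type is invented for illustration:

```swift
struct Counter {
    var value = 0
}

// `inout` has copy-in/copy-out semantics; in practice the compiler
// usually passes the struct by address, so no wholesale copy is made.
func bump(c: inout Counter) {
    c.value += 1
}

var counter = Counter()
bump(c: &counter)       // mutates the caller's struct in place
```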


Remember that Swift collection types have copy-on-write semantics. Under the hood they actually only copy if the collection is non-uniquely referenced.

An Array<T> is a struct but uses a reference type for storage under the covers. In theory the standard library is actually free to make common slices of arrays share storage, so pure additions wouldn't even require copying, but I have no idea if it is actually implemented that way (neither String nor Array guarantees contiguous storage).
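The copy-on-write behavior described above can be seen in a small sketch:

```swift
// Value semantics with copy-on-write: assignment is a logical copy,
// but the two arrays share storage until one of them mutates.
let a = [1, 2, 3]
var b = a          // no elements are copied yet; storage is shared
b.append(4)        // b copies its storage here, just before mutating
// a is unaffected: the "copy" only materializes on write
```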


It definitely changes how a developer works in a language if they know they can depend on this kind of implementation. In Clojure, knowing up front that all data structures have shared persistence allows you to do things that you'd never consider in a language where this wasn't guaranteed.

I think Swift could greatly enhance its power in the functional paradigm if indeed it did what you describe, but also making this clearly known to everyone who uses it.


Swift seems to distinguish 'mutating' from 'changing'. The name 'array' in your example keeps pointing to the same object; it's just that that object is being internally mutated. I'm not sure there'd be a foolproof way to catch all possible mutations, so that may be the best they can do.


For objects passed by reference, a Swift "let" signifies immutability of the reference (not the value); thus you can append to the NSMutableArray but not reassign it to a different NSMutableArray. On the other hand, a native Swift Array assigned via "let" would be immutable from both a value and an assignment perspective.
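A minimal sketch of the distinction, using an invented `Bag` class as a stand-in for NSMutableArray:

```swift
// A class has reference semantics, like NSMutableArray.
final class Bag {
    var items: [Int] = []
}

let bag = Bag()
bag.items.append(1)   // allowed: the object mutates, the reference doesn't
// bag = Bag()        // error: cannot assign to a `let` binding

// A native Swift Array under `let` is frozen in both senses.
let frozen = [1, 2, 3]
// frozen.append(4)   // error: cannot mutate a `let` array
```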


That's not a bug. Your array reference isn't changing, just the data in the object. Scala (which uses var and val for mutable and immutable) would do exactly the same thing.

Heck, even Java (using final to mark a reference immutable) would work like this. This is basically how every OO language works.


I think the poster understands that; it's just that in Swift, if you assign a Swift Array (not an NSArray) using let, it is immutable both in reference and in whether or not items can be added or removed from it.

That said, it's a quirk of the interop, and I don't think Apple should spend time fixing it; rather, developers should avoid NSArray within Swift.


I can't avoid it for everything, which is why I'm considering wrapping.


I wouldn't call that a bug. That's like saying you can't do

  let a = Foo()
  a.addItem(i)


It's a bug in the sense that NSArray, NSSet, and NSDictionary (the mutable ones) should have the keywords set up when they are imported via the Swift headers.

Naturally, that would be manual work, though, like the way they added nonnull/nullable (or whatever it's called) to Obj-C.


Why would they have keywords added? Adding `mutating` to a func in a class is invalid, because the very concept makes no sense. The `mutating` keyword can only apply to value types, because it means "this func mutates the value in-place, so it needs mutable storage to invoke it."

The "bug," if there can be said to be one at all (I don't see it), is that Swift collection types are value types, while Objective-C collection types are reference types. But given this difference, the lack of `mutating` on the mutating methods makes complete sense, and adding it would be bizarre, since it would be an error if you did it in your own code.
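A small sketch of why `mutating` only makes sense on value types (the `Tally` struct is invented for illustration):

```swift
// `mutating` is meaningful only on value types: it says the method
// rewrites self in place, so the compiler demands mutable storage.
struct Tally {
    private(set) var count = 0
    mutating func bump() { count += 1 }
}

var t = Tally()
t.bump()              // fine: `t` lives in var storage

// let locked = Tally()
// locked.bump()      // error: cannot use mutating member on a `let`
```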


_ for unused variables isn't a Go innovation. Haskell's been around for over 20 years and it has it. I don't think Haskell was first to do it, either.


Does anyone know of a safe way to use Selector in Swift? I am not a fan of using strings.


I've been experimenting with Python scripts in my build process to create structs of static strings for Apple APIs that require strings (e.g., storyboard filenames/identifiers, Core Data entities/properties, image filenames) for compile-time safety.

I haven't done selectors yet, partially because I use relatively few of them and partially because parsing code is slightly harder than file-system walks and XML parsing.
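A hypothetical sketch of what such a generated file might look like; every name below is invented, not actual generated output:

```swift
// String-typed API arguments become compile-checked constants, so a
// typo is a build error instead of a runtime crash.
struct StoryboardID {
    static let login = "LoginViewController"
    static let settings = "SettingsViewController"
}

struct SelectorName {
    static let didTapSave = "didTapSave:"   // string later wrapped in a Selector
}

let id = StoryboardID.login   // misspelling `login` would fail to compile
```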


Swift is a language designed from its core to build apps that work well for Apple hardware. I understand why developers gag at its quirky paradigms, but at the end of the day, if you play by the rules, you end up with an extremely valuable product. The positives outweigh the negatives, open source or not.


The problem is that it's actually very unlikely you'll wind up with a valuable product. The odds of getting on a user's home screen get longer every day and if you're not on the home screen you might as well not be on the phone at all.

I would like to see Swift eventually become a general-purpose programming language useful outside of Apple's niche, but it's got enough baggage from its ties to UIKit and Obj-C that I'm not sure how likely this is to happen.


>String interpolation now allows string literals.

But do editors highlight this properly? (JOE 4.1 does).
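A minimal example of the quoted feature (a string literal inside an interpolation segment):

```swift
// Swift 2.1's change in action: the literal "guest" sits inside
// the \( ... ) interpolation.
let user: String? = nil
let display = "Hello, \(user ?? "guest")"
```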



