Chris Lattner on Swift (nondot.org)
405 points by tambourine_man on June 3, 2014 | 198 comments



Chris demonstrated Swift and Playgrounds at WWDC 2014:

"I can build anything with Swift... from a social media application, all the way up to a high-performance, 3D game using Metal."

https://www.youtube.com/watch?v=nKMAV6owYh4#t=6436

He wrote this chapter (entitled LLVM) in the book, "The Architecture of Open Source Applications":

http://aosabook.org/en/llvm.html

* * *

On the general topic, I wrote this [1] a little earlier in another thread; I'm just impressed with how Apple is becoming a gaming powerhouse.

[1] https://news.ycombinator.com/item?id=7841744


Did he mention anywhere Swift will be open sourced, the way LLVM has been? If not, why?


I find that when Apple employees leave something important out, like licensing, it's because the decision hasn't yet been made -- in other words, it's not final. I personally would have expected that if Apple were not making the code open source, they would mention its exclusivity and emphasize it as a training program for schools, etc. Realistically, it's just a matter of time -- as with the LLVM improvements or the past work on Objective-C, my guess is everything up to the REPL as open source, with the playgrounds and docs under a proprietary license or completely closed. If not now, then some time in the future, as .NET open sources further or if they want to position Swift as a JS competitor.


Or the decision has been made but it's not ready to be announced yet, for whatever reason.

Apple is notoriously secretive. They hate saying anything ahead of time if they don't have to. They could have committed (internally) to open sourcing Swift when they started on it four years ago, but they still probably wouldn't say anything until the day it happens. If they thought they could get away with it, they'd have kept the very existence of Swift secret until the public release, but they need feedback from third-party developers at this point.


According to Lattner himself, it looks like the decision to open source has not been made yet.

"@Ahti333 right now we are focused on finishing it up for the final release this fall."

https://twitter.com/clattner_llvm/status/473907124288770050



Does it really matter? While I think Swift is an amazing language, it would be much less useful outside the closed ecosystem of Apple. The same thing happened to C#: it never really took off outside Windows.


It's true that C# is not as prevalent on non-Microsoft platforms as it is on Windows, but there are a handful of mid-size to big multiplatform projects using C#.

e.g.

- Xamarin (multiplatform mobile apps)

- Unity3d (multiplatform games)

- Monogame (multiplatform games)

- Unreal Engine 4 (build system)

I guess Swift could fill the same gaps. I would especially love to be able to develop multiplatform mobile apps with Swift, since switching between ObjC and Java all the time is quite taxing.


You could share code with Android and Windows Phone targets.


The reason why C# (edited out C3) didn't take off on Linux was:

1) It did take off like wildfire on Linux as Mono

2) People claimed Mono was a trap and not to use it, because you can never trust Micro$oft

3) I'm personally sad this happened to an open-standard language


C# is extremely popular in the gaming industry.


A small sector perhaps, but it isn't that popular if you look at the entire industry, i.e., AAA games. No one is writing CoD or Witcher in C#.


I respectfully disagree. You can't downplay the massive indie game libraries or even the AA games out there. They're not breaking records, but they account for a large volume of sales.

There are several massive communities that use Monogame (some even still use XNA) and Unity. It's huge in the indie community, and is basically how the XB1, PS4, and Vita expect you to make games for their systems. The Vita toolchain is mostly C#.

Sure, no one is writing AAA games that require maximum graphical capabilities in C#; those teams use C++ exclusively (often with a scripting language on top). At that point, C++ is really the only language AAA games are written in if they are made from scratch; otherwise they use an engine already written in C++.

Bastion, Magicka, A.R.E.S., Dust, Fez, Rogue Legacy, Reus, and Terraria are just a few made in XNA/Mono. And looking at total sales, some of those games definitely had revenues over a million dollars.

That's not including Unity games like Shadowrun Returns, Rust, Wasteland 2, and Hearthstone. These games made very good amounts of money; again, not 100% AAA, but still made by industry veterans.


I suppose you're right, I shouldn't have downplayed them. I just think you need to segregate the two groups as they are almost completely different industries. To say that "C# is really popular in the video game industry" is misleading.


I can mostly see your argument. It makes sense. The indie and AAA industry are two different beasts, but they are the same industry.

It feels like one of those big-business vs. small-business issues. If you have the manpower and the resources, you can go much farther much faster. But not every starting company can afford the overhead of going full C++ and engine-heavy.

So while saying C# is 'really' popular on the AAA side of the industry is a tad misleading (though it is used as a language on top of the engine, for tools (where it is probably most used), and for the toolchain), discounting it based only on the usage of the top 33% of companies is a bit unfair to the quite large and growing indie and AA crowd.


Does it really matter if a language is open source or not? Surely the most important thing is a versioned language specification?


Yes, it does matter. I have little to no interest in Swift if it's walled off into the Apple infrastructure.

I would be very interested if it was opened up and available for Linux.


If there is a standard available then it doesn't matter so much. With a spec, anyone may write their own implementation. Of course, whether they will becomes the question then, and whether people will adopt it on non-Apple platforms.


That is a nice view in theory, but it doesn't mean much in reality. Even if they release a spec, who is going to be able to build a comparable implementation? It would end up like GNUstep. Even languages that are open, like Python, have trouble maintaining language compatibility across multiple versions. Written specs are also riddled with ambiguities and implicit assumptions, which makes the creation of a "perfect" clone very hard.


Which languages that can produce a high performance 3D game wouldn't be able to also produce a social media application?


If you put it that way, then yes. You could even use assembly to do both things. Or even CPU microcode.

What they mean is that the language works at a high enough level of abstraction and feels appropriate for writing both kinds of apps.

I wouldn't want to do a social media app with Fortran, even if it's perfectly possible.


C++ would be absolutely terrible for social media applications. There, the barrier isn't usually the speed but correctness and maintenance overhead.


Is Objective C really that much better? If you have decent APIs/libraries, C++ would be just fine.


As an example, OKCupid uses C++ and runs on a custom webserver.

http://www.okcupid.com/about/technology


I don't understand. Why would you build everything in-house, especially for a site like OkCupid, which is not exactly powering rocket engines?


They were in much the same boat as I was back then: whatever you could install out of the box was simply too slow. Inefficiency would have caused us to die on hosting costs so we had to build our own webserver and so on. It's gotten a lot easier over the last decade and a half to build a performant website.

If OkCupid were started today, I highly doubt they would do it the same way. But given when they did start, I figure they literally did not have a choice.


They did a blog post on it that I can't find right now. Essentially, they made it quite a while ago (2002? I don't recall exactly), and all other options were too slow. Their backend apparently has to do a lot of work calculating match scores when users search.


Found a post by one of the developers describing the decision in some detail: http://www.brightjourney.com/q/okcupid-write-web-server

They've also open sourced their server: https://github.com/okws/okws


Objective C is head and shoulders better to program in than C++.

It's still got the same C warts that C++ does in a lot of places, but it is far simpler and less leaky than C++.

C++ is still faster and more cross-platform than Objective-C, though.

  --Have done both professionally


C++ can be just fine. It can also be used to write software bad enough to quit over, which dozens of people will then maintain for decades. And they talk about it. I don't think Objective C has that problem.


To be fair, Objective C hasn't run billion-dollar companies for decades yet either (en masse).


Objective-C up until the last couple years was powering the payments infrastructure for the Apple Online Store.


Which isn't exposing Objective-C to a wide range of engineers in the industry, to the point where people can relate and swap horror stories about it.


Yes, that is my point but I didn't make it very well. If Objective C were routinely subjected to these sorts of projects, I think its over-all reputation would be a lot worse.


No, ObjC isn't better. While it is fine for UI stuff, I wouldn't really want to build a web app with it. What I've read of Swift is more interesting.


Funny, NeXT thought otherwise. WebObjects was a very good server platform.


No, but according to Chris Lattner, Swift is.


And Swift would be equally terrible for a high-performance 3D game. (It's not true just because the guy from Apple said it.)

There's a reason why every single AAA studio uses C++ for their engines and titles. And that's the same reason why it is terrible for social media applications.


Every single AAA studio uses C++ to write the smallest portion of their game that they can get away with.

Then, in many places, they put a sane, modern language on top of it -- ActionScript at EA, Python at many other places, Lua at others, C# at others -- and write in a high-level language that doesn't leak abstractions like a nut milk bag.


Mostly because game studios don't write compilers, so they use whatever the respective OS vendors' SDKs offer.


Is the balloons playground he demoed in the keynote available for download?


It's amazing that Apple managed to go from hatching the idea in mid-2010 to releasing a fully working language and toolchain 4 years later, with tight IDE integration, a huge amount of testing, and compatibility, all without a single leak (that I've heard of).


No doubt. Also amazing is that they promoted a compiler architect to eventually lead the whole tools group. This is good news.


The existence of a programming language named Swift under internal development at Apple had leaked. What it was for wasn't clear – it certainly wasn't clear that it would replace Objective-C.


can you give a reference?


Source? There is no mention of "Swift" on MacRumors before June 2.


http://www.reddit.com/r/swift was actually created 6 months ago.


Doesn't it use the same frameworks as Objective-C?


It does (direct interop with ObjC was obviously a design constraint; just look at function signatures), but there was still a tremendous amount of work to make a language one guy started on in 2010 into a beta today.


It's not clear how much code is actually written in Swift so far, so I'm not sure how much testing/compatibility was required.


If they're willing to allow developers to release an app written in Swift when the beta ends, I'd say they're pretty comfortable with it in production. If Swift is really four years old already, who knows what components could be already in Swift. And it's not like Apple jumps ship overnight - iTunes spent nearly a decade as a Carbon app after introducing Cocoa.


Not that it's particularly complicated, but they mentioned yesterday that the updated version of the WWDC app (which they released right after the keynote) is written in Swift.

I don't know if that means "entirely written" or some parts, but they at least called attention to it.


Which implicitly means that Swift code can be compiled for iOS 7. Which isn't surprising, since the compiled code uses the ObjC runtime.


Swift code compiles and runs on 10.5 and up. Tested it myself.


When they talked about the WWDC app being written in Swift they also explicitly mentioned that Swift will run on iOS 7 and 8 as well as OS X Mavericks and Yosemite.


Great to see the credit to Bret Victor and Chris Granger's Light Table. Apple's resources can really help move forward these new ideas of what an IDE can be. If Swift is successful, a whole generation of young developers will use and improve on these ideas. Very exciting to see what happens.


Why do Bret V and Chris G get all the credit for something invented in the 90s? E.g. Steering programs with time travel (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.33....).


> The approach to steering described in this paper is implemented in our research prototype, which runs on Sun and Hewlett-Packard color workstations, using Lucid Common Lisp and the Garnet user interface development system [Myers et al. 1990].


Because they made some impressive demos to reintroduce the concepts at a time when the audience is ready for them.


Fine, but if this guy was just inspired by and only knew about Light Table and Bret's demos, he is missing a lot of good ideas and experience that have been gathered over the last 30 years of doing similar things. They are destined to relearn the past.


I agree, and frankly I think Bret Victor in particular hasn't until recently referenced his influences enough.


Because many young developers seem not to care to learn about our history.


Speaking of giving credit and new ideas, see this 20 year old work: http://web.media.mit.edu/~lieber/Lieberary/ZStep/ZStep.html


I honestly expected Bret Victor to do the demo on Swift after seeing the realtime playground etc.


fwiw, he had no direct involvement in it. http://www.quora.com/Did-Bret-Victor-work-with-Apple-on-the-...


I love that his account's URL is http://www.quora.com/Bret-Victor-1, because http://www.quora.com/Bret-Victor was already dedicated to his name as a topic.


You are correct, he was not involved. But the language creator specifically calls him and Light Table out as inspiration.


He does not have a beard. I'm very skeptical this language will be successful.


I have a beard. Wish me luck with my new language.


I think the beard test is for OS developers.


Matz with a beard: http://www.rubyist.net/~matz/images/matz-beard.jpg

He grew it specifically as a response to the "beard test" for programming languages.




Whoops! You're right.


It is nice that he mentioned Light Table. I would not be surprised if Swift ends up really benefitting Clojure adoption indirectly. I think one of the big hangups for newcomers is that, if you don't have experience with a Lisp, it's often difficult to understand the benefits of interactive development. If a large amount of new programmers become exposed to it, they'll be more open to other options that provide similar or better interactivity.


The only thing that makes me sad is that because of their focus on secrecy, they're doomed to relearn all that we learned along the way. Having played with the swift playground stuff, just an hour long conversation could've made a big difference.

Such is the way of Apple though.


> because of their focus on secrecy, they're doomed to relearn all that we learned

That's wrong, condescending and misses the point. Wrong because the developers were not cut off from learning what was learned by others. Condescending in the way it implies that there's a royal "we" of people that should be consulted whenever any programming language is conceived. And it misses the point because some types of learning are learned better by making the mistake yourself rather than accepting the wisdom of authorities.

Anyway, I don't see how Swift's life cycle is any different from that of most other languages, except that its inevitable celebrity has been tempered by its parents' protectiveness. The number of people expressing interest in nurturing a language in its formative stages grows in proportion to its viability. Swift went from experiment to viable as soon as it got chosen as the path forward from ObjC.

> just an hour long conversation could've made a big difference

Apple has huge numbers of programmers on staff, including many compiler hackers, many kernel hackers, many with practical experience building and maintaining APIs and developer tools. They employ literally thousands of programmers who will be end users of Swift. To suggest that your opinion is more valid than theirs is hubris.


Hm. I said it was sad, not wrong.

Let's say you take direct inspiration for something you're working on and you know someone's been there and thought a lot about it. Wouldn't it make sense to ask them about it? It's certainly true that some mistakes are better learned by making them yourself, but a whole lot aren't. Hell, half the time it's just stuff you're too close to see anymore.

The Swift playground is strikingly similar to many of the things we've done in Light Table. It's wonderful that Apple is taking that and running with it and I want these things to end up out there and make things better for devs. But I could've helped them skip some of the crap along the way and I've worked with other large organizations to help them do exactly that.

> To suggest that your opinion is more valid than theirs is hubris.

My opinion is no more valid, but given that their work looks fairly like our own, I certainly have the benefit of past experience. This isn't about hubris. I have a unique perspective in this particular case, one that no one else will have, as the creator of one of the things they were "heavily influenced" by. I'd rather they took advantage of that so that they can continue to push things even further and not fall into some traps that we did at Microsoft and with LT itself.

In any case, I'm sorry I seem to have offended you. My goal is not self aggrandizement, it's just to help do my part in making things better for us all.


What is light table's equivalent to playground's timeline assistant?


You have some valid points, but they explicitly said Light Table was an inspiration, so Chris's stance should not be confused with hubris.


Well, don't hesitate to give them feedback now. They're saying it's still in quite a fluid state and will change before release. Plus I'm sure they'd always be grateful for feedback, and if it improves the lot of millions of developers (they say there are now 9M registered Apple devs), you've done the world a great favor.

“There is no limit to what a man can do so long as he does not care a straw who gets the credit for it.” Charles Edward Montague


On the other hand by keeping it secret and internal they avoided making bad decisions due to defensiveness and stubbornness.

An example of this is Go; they don't actually take input from outsiders (generics), and when they have had to give in to outside ideas, they implement them poorly just to be different (exceptions as panic). They would rather people put in a no-op printf to avoid the unused-import error than let it be a warning, because of their own self-righteousness.

Apple's secretive process may not be ideal, but there are far worse processes out there.


Wow, I'm not sure how to begin responding to that.

Go has taken a lot of suggestions from the open source community. Check the Go 1 mailing list discussions for examples. But panic existed before the open-source release and is not a substitute for exceptions, and nobody has actually proposed a viable generics implementation. The unused variable/package thing is fundamental to the project's goal of working well at scale, and tools like goimports alleviate the pain (sometimes the answer is tools, not language changes).

Overall I'm pretty dismayed by your characterisation of Go as an open source project. We have a lot of great contributors from outside Google and your ignorant comments do them a great disservice.


I realize I'm in the Lion's Den. I feel that if you stepped outside the Google-employee and Hacker News bubble, you would see Go stubbornly refusing to adopt modern features, as I do.

I've been programming for 30 years and have written code that is in every Linux distro, and yes I did read the Go mailing list occasionally in the early days. But if it makes you feel better to call my opinions ignorant then I hope that's working out for you.


I agree with you. There is a lot to like in Go, but even I can see (and I've only been programming for ~10 years) that it ignored a lot of good stuff from other languages.

They keep saying they would add generics if they felt they could "do it right", but I'm starting not to believe that. I like that the language tooling is extremely disciplined and enforces even small things, like style, but this rankles a bit considering other parts of the language design seem a bit inconsistent or haphazard. Just IMO, of course.


> I've been programming for 30 years and have written code that is in every Linux distro

A very strange argument from authority considering Ken Thompson is one of Go's designers...

Considering the fact that 700 people attended the inaugural Go conference, and the variety of speakers (http://gophercon.com/schedule/), I think it's safe to say that Go has gained traction far beyond the Google employee and HN "bubbles".


The argument from authority of Go's designers is actually my point. By designing the language in public (even before the source was available), they now don't want to admit their errors, and neither do their supporters, because they are supposedly authorities. I'm thinking of Pike in particular.

By keeping these discussions private, the Apple team did not need to protect their egos or their authority. We do not know who is responsible for which decisions at which points in the design, and they don't feel the need to defend their choices in public.


The Go team has admitted several errors publicly (e.g. var binding for range loops, some standard library design) on multiple occasions. Those errors won't be changed because we consider preserving backward compatibility more important. You could say that holding on to backward compatibility is "defensiveness and stubbornness", but it's really just making a different decision to what you may have done in the same situation.

In fact, your premise is almost entirely incorrect. Probably 95% of Go's language design happened before it was open sourced, so it's in exactly the same situation as Swift. The Go language hasn't changed a lot since November 2009.


I wasn't calling your opinions ignorant. I made several factual corrections about your post. You also made further incorrect statements about the development timeline of go (dsymonds corrected you there). No offense was intended, but in general I'm of the opinion that you should get your facts straight when criticising something publicly.

Also I've gotta laugh at "lions den". I've been reading HNers (like yourself) broadly condemn Go for nearly five years.


> Also I've gotta laugh at "lions den".

And are you going to laugh all the way over to r/programming? There is some criticism on HN. Where there are no holds barred it's really brutal. I don't blame you as a Go developer for avoiding those forums.

> incorrect statements about the development timeline of go (dsymonds corrected you there).

Go's authors retconned their decision-making process in talks, on lists, and on the web, which in terms of feeling defensive about it now amounts to the same thing.


Suggesting "retconning" here makes no sense.

You said the Go team "[designed] the language in public". I pointed out that there's only a small amount of difference between the first public unveiling of Go and its current state, which you can verify for yourself.

You said the Go team "now don't want to admit their errors". I pointed out that we've admitted lots of errors, publicly, which you can verify for yourself.

Your criticisms seem scattershot and just don't make sense.


I stopped regularly visiting /r/programming before I started working on Go. The standard of discourse is too low; too many people eager to dickwave by sharing their negative opinions. HN can be bad but at least I can have a vaguely sane conversation here from time to time.


People can be defensive and stubborn regardless of whether there is outside input.

Your eagerness to bash on Go has managed to upend your grasp of logic.


> On the other hand by keeping it secret and internal they avoided making bad decisions due to defensiveness and stubbornness.

That is a non-sequitur.

The rest of your comment appears to be low-grade trolling.


> The only thing that makes me sad is that because of their focus on secrecy, they're doomed to relearn all that we learned along the way.

But you're not focused on secrecy, so you can publish the lessons you've learned along the way publicly. That way, everyone, including Apple, can skip some of the mistakes. Wouldn't that be the case?


That's also a good thing. They'll have learned more from making experimental mistakes than from having been told something.


I would be very interested to hear what you think might have benefitted from the experiences of LightTable. Are they things you think could evolve convergently in the future, or are they mutually exclusive design choices?


When he said they hope to redefine how computer science is taught, that's a great goal (and to an outsider Swift looks great), but have they spoken to any academics about that idea?

If they take the Microsoft approach of being nice to schools, that'd be awesome -- much better to teach kids Swift than how to use Office. So if they're talking about schools, there are probably some private ones that would be interested, but it'd take a crazy effort to crack teaching in schools on a wide scale.

Universities, not a chance. It might be great, but it's not going to be an academic's choice of teaching language, however good it might be for that.


A load of colleges (e.g. Stanford) already teach iOS development. Swift and playgrounds will automatically change how this is taught.


And with those nice closures, generics, monadic patterns, etc., they even have a nice way to introduce useful abstract concepts in lower-level classes (not that they couldn't manage without... it's just more fun, and a bigger incentive, if you can build iOS apps out of it).
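
For instance, a generic function plus a trailing closure already demonstrate two of those concepts in a few lines -- a hypothetical teaching snippet, using the beta-era syntax:

    // A generic function: works for any element type T
    func swapValues<T>(inout a: T, inout b: T) {
        let tmp = a
        a = b
        b = tmp
    }

    // A closure passed with trailing syntax
    let doubled = [1, 2, 3].map { $0 * 2 }    // [2, 4, 6]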


Don't generalize, there are lots of good universities out there.


> if you don't have experience with a Lisp, it's often difficult to understand the benefits of interactive development

Also, with JavaScript and browsers' built-in consoles, you can have a good time developing interactively.


Yes. Although I am not a big fan of web apps, modern browser developer tools bring back a familiar feel of the Lisp and Smalltalk development experience.


It looks like Rust was a major influence. It has been interesting seeing all these posts claiming Swift borrowed from _their_ language.

Obviously there is a lot of feature influence between languages but it was interesting to see the ones he called out explicitly.


It's a bit of a Rorschach test, sometimes. People see what's familiar to them.


Great way of putting it. I see Ruby in it personally, and loads and loads of C#, but I have heard the Rust thing before from other people.


Personally, I see nothing from Rust in it (except what Rust took from other languages, like ML).


His take on the "x years of swift experience" meme is pretty great:

"Looking forward to next month: I'll be the first and only guy with 4 years of swift programming experience :-)"

https://mobile.twitter.com/clattner_llvm/status/473835365137...


> I hope that by making programming more approachable and fun, we'll appeal to the next generation of programmers and to help redefine how Computer Science is taught.

I'll be cynical here: can this be done if the language ends up being restricted to apple devices?


To me, this is going to be the only thing that matters: will this language be used for non-Apple platform coding? I've been looking for a language just like this for a very long time, on the server side for example.


>I'll be cynical here: can this be done if the language ends up being restricted to apple devices?

Sure it can. Students and school-children have been taught (and inspired) with proprietary systems and languages for ages.

Sure, it might not get to Windows or Linux users, but it still can reach tens of millions of kids and even universities (e.g. Stanford already offers a well-known iOS course with Objective-C).


Really excited for Swift. I've tried and failed many times to learn Objective-C and felt that the barrier to entry was a bit too high for me. As someone who writes JavaScript for a living, Swift is very inviting.


Not trying to be rude, but if you've tried and actually FAILED to learn Objective-C multiple times, then you will have a much bigger problem actually learning the Cocoa framework itself. Objective-C does have a non-standard syntax, but its barrier to entry is purely psychological, not technical.


My problem with "learning Objective-C" was always the frameworks, not so much the language itself.


I suggest http://www.raywenderlich.com/ and his fantastic books with examples. While I actually think I've been doing Objective-C longer than most of his writers, when there is some area I haven't worked in yet, they usually have a high-quality, simple-to-follow example of how to do a thing -- a great jumping-off point for getting going and then understanding further once you have the main thing in front of you.


Maybe the mistake lies in starting with Objective-C without first learning C? I'm thinking that a reasonably solid C foundation helps a lot in understanding why and how Obj-C works.


I went straight from being a CFML/Java/JavaScript web developer to picking up Objective-C in a week. I had some coaching from a C/C++ expert, who learned ObjC on the fly and wrote a first draft of an app I designed, but from there I worked out how to solve my own problems and eventually became a moderately high-skilled ObjC developer.

I have subsequently leveraged that ObjC experience to become somewhat proficient in plain C, a skill I appreciate today and something I might never have learned otherwise.


I learned Objective-C before I learned C, back in the early 90s. Before that I'd learned Basic and Pascal. Objective-C clicked with my brain pretty easily.


While knowing C certainly helps, it is not essential to learning Objective C, at least it wasn't for me. If you try first to understand it at a high level, then it is very doable (and fun!).


Cocoa, the API behind iOS and Mac Objective-C development, is the hardest part, and it is still there.

I suggest going through some detailed tutorials on building an X, so you get the feel for how views, view controllers, objects and everything all interact.

The new playgrounds look EXCELLENT for doing that! Best of luck


Programming languages are like vehicles: if you can drive one, you mostly know how to drive another. JavaScript has a lot more gotchas than Objective-C. If you feel comfortable with the weird "this", closures, and prototypes, Objective-C shouldn't be hard. Pick up a copy of K&R to get familiar with C and then read the Objective-C docs.



Site seems to be buckling under load. Google cache: http://webcache.googleusercontent.com/search?q=cache:http://...


Here's a screenshot of the page http://i.imgur.com/C51LXnz.png


I'm not sure if it's Xcode / playgrounds acting up, but array equality is broken for me: `[1, 2, 3] == [1, 2, 3]` returns false. This seems to contradict the core declarations, though:

    /// Returns true if these arrays contain the same elements.
    func ==<T : Equatable>(lhs: T[], rhs: T[]) -> Bool

Works alright for dictionaries. Is there a bug tracker for Swift anywhere to report this?

Edit:

Whoops, copied the wrong declaration. ContiguousArrays actually work fine, but require an extra cast, e.g.

    ContiguousArray([1, 2, 3]) == ContiguousArray([1, 2, 3])


    var one = [1, 2, 3]
    var two = [1, 2, 3]
    one == two

That's because Swift checks for safety first (variable declaration).


Can you explain what you mean by `checks for safety first (variable declaration)`?

This issue only seems to affect arrays. Strings, numbers, and dictionaries are fine.


Another interesting thing to note is that because Swift really began in roughly mid-2010, Steve Jobs probably had his hand at least slightly in it (obviously not as a programmer).

I guess my point is that there was likely some more specific reason why it gained momentum later.


I dunno, it's possible but OTOH it's hard to imagine a guy like Steve Jobs, for whom (as far as I know) coding wasn't a primary concern, and who was dealing with cancer and much bigger fish in the i* products, being any more than tangentially involved in what would have been at the time a fledgling research language project. He was probably aware that it was happening, but I doubt that he played much of a role. Apple is after all a massive company filled with many smart people and who knows how many research projects.


I think it very likely Apple have all kinds of 1, 2, 5 and 10 year road maps for all their products (software and hardware).

Given that almost all of them rely so heavily on Objective-C, I expect they've had a road map for that (and XCode and the whole tool chain) for a very long time.

Given the plan is now to phase out ObjC and replace it entirely with Swift, I think it's extremely unlikely Steve didn't know about it, and endorse it.


He did say that it wasn't a major Apple focus until last year, when they actually got together and did all the end work.


It might not have been a major focus until last year, but I'll bet it was on road maps for many, many years preceding that.


I feel like Swift is really a good preview of the future of programming. And it seems that in the future we will have two really different kinds of software engineers. As we make programming mainstream and easy, we will see new people able to use languages like Swift and develop good apps without having the slightest idea of what is happening underneath. We used to have at least a common background between software engineers, but I think that is going to slowly disappear. Is that good or bad? I can't make up my mind yet, but I'm considering more and more going back to lower-level languages, as I feel the upper levels are going to be crowded by the younger generations.


"Underneath" just means "the level below where I'm writing". This has pretty much always been true, the way systems are layered. You write install scripts? Apps? Libraries? rendering engines? Drivers? OS? Firmware? Chip layout? You probably don't really know what goes on 'underneath' your layer.


Sometimes the occasion arises where it's useful to look at the assembly language being generated by your compiler.

The vast majority of developers I know completely freak out at the idea. I might as well suggest that they deadlift a car.

A few take it calmly, but don't really grasp what the stuff does or what it means.

I can probably count on one hand (maybe two?) the number of developers I know who can actually do something useful in that situation.

And that's just assembly language. Not even machine code, let alone actual hardware.

Most developers don't understand what goes on below wherever they work, and that's how it's been for a very long time.


Agreed. I spend half my debugging time with the disassembly turned on - removes ALL the ambiguity. It's like pulling teeth to get other developers to try it. Once you do, you kind of get addicted to it.


I agree, but I feel like Swift takes this to a new extent, and it really doesn't push you toward learning any other layer. Swift + playgrounds feels like the new Dreamweaver to me. Either people are going to write ugly code that shows them what they want, or they'll keep building quality apps -- I don't know; I guess the future will tell. But my main point was just that where I personally would try to reduce the gap between the different layers, so that we developers know what happens from A to Z (at least on the software side), the industry is clearly adding layer upon layer in such a way that it's nearly impossible to control and understand the whole thing.


Our entire software industry is built on the power of abstractions. While it's always great to know as much as you can about the layers you are building on, solid abstractions that let you get away with NOT knowing them unless you want to, or need to optimize to extremes, are a great thing.


Off the top of my head, these are the only times not knowing what's going on at the layer below has been a problem:

1. The layer below has a bug that manifests at the top layer. You're not sure if your code is broken or if the stuff your code is built on is broken. I see this a lot in the Java world, where people use frameworks/libraries they're sometimes not even aware they're using, and one of them has a bug.

2. You run into performance problems because of the things you're using at the layer below you. C++ STL containers are a good example; they're pretty much black magic -- have you ever looked at the implementation? I also see this a lot in the enterprise Java world. One of the selling points of Java is that it's monkey-coder friendly -- except when it breaks and the monkey doesn't know how things work under the hood.

Other than that, you should be fine not knowing the layer below you.


I can't afford to ignore it as it matures, but with my dev setup right now, I just can't adopt it yet. Hoping to see this work well on Linux and Windows.


Xcode 6 runs alongside Xcode 5; you can safely play with playgrounds and build apps in the simulator, and even install iOS 8 on a dev device and deploy there, without affecting your toolchain.


Swift "greatly benefited from the experiences hard-won by many other languages in the field, drawing ideas from Objective-C, Rust, Haskell, Ruby, Python, C#, CLU, and far too many others to list." I have been wondering if in fact this language will be Open Source, since it is said to have taken from other programming languages, some of which are Open Source, yet the decision to make it Open Source has not been made? Lock down.


In general, a _language_ is not open source; its _implementation_ is.


Right, well, let's see what happens. I kinda agree with Gordon Haff (http://www.cnet.com/news/apples-new-swift-coding-language-ho...) in being disappointed that Apple didn't choose a more open platform, but I suppose this can change in time, as with Microsoft's recent changes.


Extremely excited about this language's versatility. Also great to see the work of my school's alumni and professors going into production. This is the third time I've come across the LLVM compiler in industry use (albeit nothing at the scale of iOS's language) -- anecdotally, during my internship search this last semester. It will be very interesting to take Professor Adve's compiler course soon.


Apple really took a big step. Waiting to see the amazing stuff done with Swift.


Looking forward, here's a dream roadmap, if this WWDC is any indication:

Playgrounds for iPad, Interface Builder for iPad, and finally Xcode itself.


I like js, but not swift.


Swift is a little complex.

Does it compile to Objective-C?


No, it compiles from source to LLVM IR to native instructions, just like clang or rustc.


No, but it can call it.


I wish Apple had selected a different name: https://www.ci.uchicago.edu/research-projects/swift


Name a letter or a word and there is some project out there that had it before. If you come to grips with this now, your adult life will be much easier.


Google similarly displaced another programming language called Go.


Funny how he somehow forgot to mention Go, when that seems to be one of the bigger inspirations for the language.


I don't understand why people keep comparing Swift and Go, as they have diametrically opposed philosophies. As stated succinctly by Bryan O'Sullivan (one of the most prominent people in the Haskell community) on Twitter:

"It's interesting to compare Swift and Go. One has caught up to the 1990s in language design, with the other firmly in the 60s."

Swift includes many of the language features that have been touted by the academic functional programming community for years but which have not made a large impact in industry, like algebraic data types, pattern matching, and Optionals. The designers of Go apparently looked at the past few decades of language research and decided that none of it interested them. The only really innovative part of Go is its concurrency support, which Swift seems to be completely lacking.
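
For the unfamiliar, a rough sketch of those three features in the beta syntax (identifiers made up):

    // An algebraic data type with associated values...
    enum Shape {
        case Circle(Double)           // radius
        case Rect(Double, Double)     // width, height
    }

    // ...consumed via pattern matching
    func area(shape: Shape) -> Double {
        switch shape {
        case .Circle(let r):
            return 3.14159 * r * r
        case .Rect(let w, let h):
            return w * h
        }
    }

    // Optionals: toInt() returns Int?, which is nil on failure
    if let n = "42".toInt() {
        println(n + 1)    // prints 43
    }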


The concurrency problem was partially and briefly covered in the Platforms State of the Union keynote. Since closures in Swift are compatible with Apple's C blocks extension, all of the libdispatch API works with it out of the box, and the last-argument-is-a-closure syntax from Ruby makes it really nice:

    dispatch_apply(arr.count, queue) { index in
        arr[index] = expensive_func(arr[index])
    }

Obviously nice wrappers can be written around code like this to give you safe parallel programming. You could also quite easily make futures/async stuff.
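
For example, a toy future can be built on a private serial queue in a dozen lines. A sketch only -- no error handling, beta-era GCD bridging assumed, and slowComputation is made up:

    // A hypothetical minimal future: run work in the background,
    // block on get() until the result is ready.
    class Future<T> {
        let queue = dispatch_queue_create("future", nil)   // serial queue
        var result: T?
        init(work: () -> T) {
            dispatch_async(queue) { self.result = work() }
        }
        func get() -> T {
            dispatch_sync(queue) {}    // waits until the work block has run
            return result!
        }
    }

    let f = Future(work: { slowComputation() })   // slowComputation: hypothetical
    // ... do other things concurrently ...
    let value = f.get()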


As kryptiskt mentions in a separate comment below, it's not that Go's designers ignored language research. Rather, it's that they took a decidedly minimalistic approach to language features.

What really surprises me is that Apple didn't put any concurrency features into Swift. Having a language with better block/closure support will certainly help, but imagine what they could have done with some additional work. Would have loved to have seen some support for channels and tasks a la Rust/Go/etc.

Edit: personZ's comment above says what I wanted to say, much more clearly.


Apple has Grand Central Dispatch. It seems that Swift plays well with it, and I won't be surprised if more concurrency-model-as-a-library stuff pops up in the future. It's a sound approach and avoids tying the language itself to any single model.


If you see my comment above [1] I've got an example of just how nicely GCD/libdispatch works with Swift. It's quite exciting really.

[1] https://news.ycombinator.com/item?id=7844219


Thanks for the link. When I mentioned "better block/closure support will certainly help", this is what I was referring to, but it's nice to see an example. Still, some in-language support could have made things a lot nicer, and I find it odd that a "modern" language omitted it.


I think it's because Swift is so new that only a small percentage of people have actually looked at the language spec.

The first thing I thought of when I saw all the func keywords, curly braces, and arrows (->) was "this looks like something in between Lua and Go."

After starting to dig into the book, it's pretty apparent that isn't the case.


It's bizarre how focused programmers are on syntax.

It seems like half the comments about Swift have been comparing it to languages which it superficially resembles at the syntax level. And the other half of the comments are along the lines of, "I can't stand Objective-C's brackets, this looks much better."

It's like that scene from The Matrix: "I don't even see the code. All I see is blonde, brunette, red-head." I'd have expected experienced programmers to have reached that point long ago.


Wadler's Law: "In any language design, the total time spent discussing a feature in this list is proportional to two raised to the power of its position:

  0. Semantics
  1. Syntax
  2. Lexical syntax
  3. Lexical syntax of comments"


Speaking of which, I was quite pleased to see the inclusion of nested comments. It makes getting rid of currently incorrect code during development much easier.
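
A quick illustration (identifiers made up): in C-family languages the inner */ would terminate the outer comment, but Swift's block comments nest, so this works:

    /* Temporarily disable this whole block:
    let size = computeSize()   /* TODO: cache this */
    resize(size)
    */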


No different from learning a human language I think.

The first thing you're exposed to is its sounds and symbols.

They look & sound strange and your brain instinctively tries to make sense of them by comparing it to something you are already familiar with.

Then when you dig into it you begin learning vocabulary and grammar. You focus on that for a long time until the sounds sound less like gibberish and resemble something like what you've been practicing putting on paper.

Once you get comfortable with the constructs and stop focusing on them, you can start conversing -- putting more effort into what you're trying to say than into how you would go about saying it.

After that then you can start picking up all the idioms, colloquialisms, cliches, double entendres, etc

Finally you can start inventing your own.


Most of the commenting on a new language, particularly here (as opposed to on e.g. Lambda The Ultimate) is from people who aren't going to write a single line in it in anger; it's not surprising that idle commentary would focus on syntax. It's of a piece with all internet commentary, which rarely gets below the outermost layer of thinking about anything.


I've been seeing much the same from programmers invested in the Apple ecosystem who will likely be spending all day writing nothing but Swift within a year.


Lots of people are having a hard time understanding the difference between dynamic typing and static typing with type inference.


Definitely. I suppose it's understandable if this is your first exposure to type inference. Superficially it looks exactly like the "assign anything to anything" languages like Python or JavaScript.
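
The difference shows up the moment you try to reassign across types -- a quick sketch:

    var n = 42         // type inferred as Int, fixed at compile time
    n = 43             // fine
    // n = "hello"     // compile error: a String can't go in an Int variable;
                       // in Python or JavaScript the equivalent line is legal,
                       // since the type belongs to the value, not the variable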


Well, in Swift I think concurrency is expected to rely on system libraries, whereas in Go it is built in and even has special syntax. If Swift ever gets a good macro system, though, I'm guessing one could create nice syntax around the system concurrency libraries.
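
Even without macros, trailing closures get you part of the way. A hypothetical go-style helper over libdispatch (name made up, beta GCD bridging assumed):

    // spawn: run a block concurrently on a global queue
    func spawn(block: () -> ()) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), block)
    }

    spawn { println("runs off the main thread") }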


Lack of CS knowledge, I guess, especially in programming-language history.

They see things like func and think Go was the first language to come up with it, when many others already had it.


> The only really innovative part of Go is its concurrency support, which Swift seems to be completely lacking.

Worth noting that one of the greatest problems in software development is concurrency support and its ease of use/robustness.

It is a critical and growing issue.

In contrast, having or not having generics, optionals (in C# these are nullable types), or algebraic data types (which in Go is the interface, albeit minus the type-set checking) makes a marginal, vanishingly small difference in programmer productivity or application performance/stability.

99% of the articles about the profound importance of generics are by people building nothing of interest to anyone, and it is exactly that vagueness of design that makes generics seem so important.

Concurrency, however, is everyone's problem. It is the modern programming problem. Nothing is more important.

And FWIW, all languages draw from each other, and of course they should learn from each other. It is very likely that Go influenced Swift in subtle ways, but it is obvious why that wouldn't be referenced.


> 99% of the articles about the profound importance of generics are people building nothing of interest for anyone, and it is exactly that vaguery of design that makes generics seem so important.

I think it's quite a stretch to say that people who have written articles about generics are almost never building anything of importance. The browser you used to post this comment makes extensive use of generics via C++ templates, which were well documented by their authors in order to fill a very important role (smart pointers for reference counted objects, for one).


> I think it's quite a stretch to say that people who have written articles about generics

I said 99%. The vast majority of language observations online are by language tourists, and the observations are seldom practical or rational, but are instead of the form "having kicked the tires and built nothing practical, here are my thoughts."

Those sorts of posts dominate.

Generics have a place, but their importance is...overstated. Though I would quite broadly disagree with the notion that auto_ptr -- a shim on C++ -- justifies the notion of generics.


People don't write articles discussing how their whole complex project works, because it would take too long. Those articles are written to highlight what these features are and how they have improved the authors' ability to make software efficiently, reduce errors, etc.

I do not agree that the importance of generics is overrated and it's one of the primary reasons I won't be using Go. I need to know what type of object I'm working with to feel comfortable when programming, and I like the reassurance from the compiler that it agrees with me that that is actually what I'm working with. I would really like Go if it had generics, but it's just too unsafe for me to use in its current form.


Well, I have written big projects in other languages, and I wouldn't go back to a language without generics.

Don't know where you got the arbitrary "99%" statistic from.

It's not like generics are some novel, marginal concept. It's 2014 already.


Genuine question: what does Go provide in terms of concurrency that Swift + GCD doesn't?


I would argue that Go has it more "baked in."

But I'm actually curious as well. Given that Objective-C (and now Swift) have GCD, what language features are needed to bring concurrency up to par with Go?

I'm sort of reminded of what Microsoft did after the Task Parallel Library came out for C#. People didn't pay much attention to what MS was building until async/await came out (along with tooling support). But async/await built upon the TPL. So it's probably a matter of time before Apple does something similar on top of GCD, if they're going to do anything. But I doubt it will look like Go.


Go provides straight-ahead, blocking-is-OK concurrency. GCD provides callback-based concurrency, like node.js, but with several worker threads instead of just one.
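
Concretely, the GCD style chains callbacks rather than blocking; fetchData and show here are hypothetical:

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        let data = fetchData()                      // slow work, off the main thread
        dispatch_async(dispatch_get_main_queue()) {
            show(data)                              // hop back to the main thread
        }
    }

In Go you'd just call fetchData inside a goroutine and let it block; the runtime multiplexes goroutines onto threads for you.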


Apple already has GCD/libdispatch, and is presumably content to continue handling this with libraries, at least for now. Concurrency's certainly important, but I don't think it's entirely clear yet that _language level support_ for it is.


That's a strange accusation, because Go is avowedly minimalist, while Swift has cribbed most everything from modern statically typed languages, like ADTs, pattern matching, and generics.


The language has nothing from Go.

Almost all Go features have existed in languages since the early '80s, outside the C family.


Yeah? Interesting. Because all popular languages that I've used, up until Go, fail at concurrency at such a basic level, that it's almost as if their authors don't understand what concurrency is. Except Erlang. But then again, I said popular.

C/C++/Ruby/JavaScript/Python/Java/C# -- all fail miserably, utterly and completely at concurrency. What Go accomplishes with channels, goroutines, and the select statement has been an eye-opener for me.

Watching people struggle with threads/mutexes/shared memory in the 21st century, or with some half-baked library, is truly sad.

I'm really surprised that all Go features have been present since the '80s -- it's strange that I utterly missed a feature as basic as concurrency in those! Very interesting.


Go's goroutines trace back to Modula-2 coroutines and Ada tasks, just to cite two possible examples.

Go's channels draw on CSP theory, whose first language implementation was Occam.


Ah, Modula-2. Yes. Very popular in the corporate world; huge hit on Github/Bitbucket as well. Lots of libraries, web frameworks....up-and-comer, for sure. I hear we'll soon be writing iOS apps in it?

Occam? No comment. At least Ada is being actively used in something, but Modula-2 and Occam? Seriously?

You can find any idea currently in use that was probably first academically tested out and "proven" in "Obscure-language-nobody-ever-uses-anymore-on-any-real-projects".

The fact that Go, this supposedly boring, unoriginal language is actually taking off in popularity, is for some reason a huge pain point for certain people who refuse to acknowledge that the "plebs" who don't bow before Haskell or similar are allowed to program in something they enjoy and that brings something new to the table, for them. New for them. New for corporate America, new for real projects, new for actual people getting actual work done, on actual projects, in actual companies, getting paid actual money for it.

How did Go haters ever survive the rise of Ruby, for example? A language which brought absolutely nothing interesting to the table, yet after people experienced RoR became more popular, in one month, than Occam, CSP, Modula-2, Oberon, combined, times 1000? You guys must have been foaming at the mouth for years.

Occam? Modula-2? Jesus Christ.

You think Go's success is undeserved? I can't wait to see your reactions when Swift surpasses "My-favorite-obscure-language-X" in number of people/libraries/projects using it before the next WWDC.


>Ah, Modula-2. Yes. Very popular in the corporate world; huge hit on Github/Bitbucket as well. Lots of libraries, web frameworks....up-and-comer, for sure. I hear we'll soon be writing iOS apps in it? Occam? No comment. At least Ada is being actively used in something, but Modula-2 and Occam? Seriously?

Are you just trolling or simply acting defensively?

Nobody said anything about it being found in languages that are in popular use today. What the parent said was that those features already existed in several languages.

That those languages also had to be "popular" is just a random restriction you added. In fact it has nothing to do with the origins of the features.

Oh, and Modula-2 and Occam are hardly obscure, unknown languages (especially back in the day). In fact, they are some of the most well-known and copied languages in PL history. Even Java copied a lot from Modula-2, by its designers' own admission. And Occam has inspired several modern languages.

>How did Go haters ever survive the rise of Ruby, for example? A language which brought absolutely nothing interesting to the table, yet after people experienced RoR became more popular, in one month, than Occam, CSP, Modula-2, Oberon, combined, times 1000? You guys must have been foaming at the mouth for years.

Juvenile stuff. Moving along.

(Btw, CSP is not a language. It's a decades-old mechanism for concurrency, the one Go itself uses -- and it is available in other modern languages too, including Clojure. That shows how much you know about these things.)


>Yeah? Interesting. Because all popular languages that I've used, up until Go, fail at concurrency at such a basic level, that it's almost as if their authors don't understand what concurrency is. Except Erlang. But then again, I said popular.

You added "popular" now, to deflect from what he said: that those features already existed outside of Go. If a feature exists for 20+ years, having it in your language is not a sign that you "copied it" from a 5 year old language.

Not to mention that he also said "almost all" features, and was replying in the context that Swift copied Go. In that context Go's concurrency is irrelevant, since Swift doesn't have the same mechanisms.


So you're saying golang is web scale?


Surprising, but nonetheless true.


I like Chris and what he has done, but...

>>The Swift language is the product of tireless effort from a team of language experts, documentation gurus, compiler optimization ninjas...

Seriously? What is a "documentation guru"? What is a "compiler optimization ninja"? I find language like this so distracting and asinine that I had to stop reading. Am I the only one?


Yes, you are the only one. Replace "guru" with "expert" and "ninja" with "expert" and continue reading. There are already no real ninjas in the world and soon there will be no real gurus. No one thinks twice when someone is called a marketing wiz (there are no real wizards any more). Language changes and metaphors using archaic terms are commonly introduced.


That's funny, because when such language is used in job postings, they are overwhelmingly frowned upon by HN folks. But I guess it's OK when someone like Chris does it...

edit: judging by the downvotes, he seems to have a lot of fans. :)


Likely downvoted because this is the stupidest, most pointless comment thread on an otherwise decent page. Contributing to it is what's overwhelmingly frowned upon by HN.

(Yes, I know. I could use some downvotes.)


It is quite possible that he was referring to people who he believed had taught him interesting aspects of the spirit of documentation, or a philosophical way to view compiler optimisation. In which case the words would be apt.


Hilarious.


If you're going to let it get to you that much, you may want to step back a tad.


Well, regarding "what is a documentation guru?", the documentation system for Swift is extremely impressive; all example pages are literate programs which can be interacted with with live results a la the playground. That's some bad ass documentation if you ask me.



