Hacker News

I'm not even an iOS developer but this is by far the most exciting thing I heard in the keynote.

As an amateur/hobbyist programmer who's self-taught with Ruby, JavaScript, etc., the one thing keeping me from experimenting with iOS apps was Objective-C. I know I could tackle it, but it's been hard to take the plunge.

I don't know much about Swift yet, but from what I've seen it looks very exciting. So if Apple's goal was to get new devs into the iOS world, at least from 10k feet, it's working.

I'm excited!

I'm not really that impressed--it looks like a hodgepodge of ideas from ES6, Ruby, Go, and maybe Rust, with a bit of backend work done to let it work on their existing infrastructure.

I dislike that Apple has continued the special-snowflake approach: that for some reason we as developers need to learn yet another different-but-almost-the-same language to develop for them, instead of just getting proper support and documentation for an existing language. Why not just let us use ES6, or plain C/C++, or Java?

But instead, now there's yet another language without great innovation that is probably going to be badly supported outside of the Apple ecosystem but still will have enough fandom to keep it alive and make life annoying.

At least Google had the decency to pick a language everybody was already using and use that.


I feel bad for all the engineers stuck in those pixel mines, not allowed to talk about what they're doing, doomed to reinvent things that are on the way out just as they come in.

There are already MacRuby and RubyMotion. They tried using Java years ago. It failed; developers didn't like it. Existing stuff simply doesn't mix that well with Cocoa and that style of programming. That is why something like Swift was needed.

I really don't get how you can bring up languages such as Rust and Go and then complain about Apple's special-snowflake approach. Suddenly Apple is doing something developers have been demanding from them for years, and something lots of other companies like Google, Mozilla, and Microsoft have already done. But oh no, because it is Apple, it is all wrong.

It's unfair to lump Mozilla in with the rest, since Rust isn't at all proprietary. It has been open source for a long, long time: https://github.com/mozilla/rust

That is not quite right.

The Java/Objective-C bridge existed in the early days as they weren't sure if developers would pick Objective-C, so they decided to bet on two horses.

As Objective-C eventually won the hearts of Mac OS X developers, the bridge was deprecated, and a few years later so was full Java support.

> Suddenly Apple is doing something developers have been demanding from them for years and something lots of other companies like Google, Mozilla and Microsoft have already done.

And yet they've decided to do it again, with yet another incompatible language! Joy of joys!

(And as for Java, it was my understanding that Apple had hobbled it by refusing to release updates on a timely basis.)

> Apple had hobbled it by refusing to release updates on a timely basis.

I can see how they could get tired of being forced to ship almost-monthly updates just to support an extra language with very limited adoption. If you have to make that sort of effort, you'll probably do it for your native tools only (like Microsoft does with .NET). Besides, Java apps on OS X looked better than Java apps on Windows, but they were still recognizably different from Obj-C ones.

I wish somebody would write an OS in Python 3...

"(And as for Java, it was my understanding that Apple had hobbled it by refusing to release updates on a timely basis.)"

That's a different, later issue.

Early on in the life of OS X, Apple offered a Java interface to the Cocoa class frameworks. In theory, you could write OS X applications using Java, calling into the Apple frameworks instead of using Swing or whatever Java frameworks.

This wasn't all that well supported, didn't perform well, and wasn't popular.

Sun should simply have hired some Mac people and done it themselves. Entrusting the success of your entire company (they changed their ticker symbol to JAVA!) to a third-party vendor's whims was, and is, silly.

Agreed that the lack of using an existing (and open-source!) language is annoying and frustrating to deal with (think of where we'd be if they invested that time and effort into improving Ruby/Python/whatever instead!). But because of the desire for compatibility with Objective-C, and Apple's general desire to call all the shots regarding their ecosystem, this move doesn't surprise me in the least.

The fact that this has static typing is a huge difference from "just improving" Ruby/Python. That approach couldn't come close to the same early-error-catching dev experience, or the same performance. And among static languages, Apple was hardly going to recommend C++ as the simple option, were they? Rust and D are also quite low-level, and neither has the Objective-C legacy to consider. So really, you're probably left with C# (or maybe Java), and those are so old and large (especially the libraries) by now that they're unlikely to port naturally to Apple's environment.
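A minimal sketch of that early-error-catching point (written in present-day Swift syntax, which differs slightly from the 2014 release; the function is made up for illustration): the compiler checks types at compile time and forces you to handle nil before use.

```swift
// Optionals must be unwrapped before use, so a whole class of
// nil-dereference errors surfaces at compile time, not at runtime.
func length(of s: String?) -> Int {
    if let s = s {        // the compiler forces this nil check
        return s.count
    }
    return 0
}

print(length(of: "Swift"))  // 5
print(length(of: nil))      // 0
```

Passing a plain `Int` where a `String?` is expected simply refuses to compile, which is the difference a dynamic Ruby/Python-style approach can't match.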

Frankly, a bit of a clean up every decade or two is not exactly often, right?

Apple consistently represents a step backwards for both developers and users, in terms of engineering and freedom, but they've amassed enough capital at this point that hoping they simply wither on the vine and die off is probably futile.

At least Microsoft and Google show off their new projects and code so everyone can learn from them and read their research.

any proof to back up those claims?

Hint: one of these things is not like the other...see if you can figure out which using only the power of curl.

What about the special-snowflake projects of Google, Mozilla, or Sun? Apple's language development is no less valid than Google developing Go or Mozilla developing Rust. This just shows your inherent bias.

I've been amazed recently at how many of the open-source projects that we rolled into our Linux product were Apple-sourced: LLVM, Clang, libdispatch, WebKit, OpenCL, zeroconf. Can't think of anything Google has done for me recently.

And if there is anyone who will knock this out of the park, it's Chris Lattner. LLVM, Clang, and OpenCL are all him. He has done more for compiler tech than anyone in 30 years.

>At least Google had the decency to pick a language everybody was already using and use that.

If you think Java is remotely comparable in power and expressiveness to Objective-C, you should probably reconsider your line of work.

The rise in popularity of Java nearly drove me from the industry; it is such a verbose, half-baked pile of garbage. I could fill your browser with things you can do in Objective-C that you cannot do in Java at all, and this incredible flexibility is why Apple is such an agile company with such limited head count.

I don't get the hate. Yeah, the syntax is unfamiliar, but once I got used to it I began to really enjoy Objective-C. YMMV etc., but it's now one of my favorite languages - though I guess this is mostly due to Cocoa.

I also really like Obj-C now that I am familiar with it. I think the biggest pain point with iOS apps is understanding the way to build apps within the context of the iPhone (how to structure views, and the custom things like alert sheets, etc...) particularly if you are coming from a web app background. The syntax is quite nice (although sometimes verbose) once you get used to it.

I never understood what the fuss was all about either.

If you know one other language really well, Objective-C should take a week or two to get used to.

Understanding all the design patterns, the Apple HIG, Xcode, profiling, libraries, debugging, app submission, etc. - these combined are where you'll sink your time learning iOS development. IMO, Objective-C is the easy part.

I recently translated one of my Apps from Android to iPhone.

I had zero Objective-C experience, but I made it work. It was a bit of a frustrating experience: many times I found myself writing Objective-C boilerplate-ish code without a clue what it was doing. Considering this is a hobby/for-fun project, I just wanted it working.

It's not easy to google the answer to, "Why did I just add this new keyword after this colon in this random .h file.."

I didn't want to spend the next month reading Objective-C for beginners, I know what a for loop is, I also know what constructors are. I just wanted to use the language.

You may know what a constructor is, but maybe not know what a designated initializer does. ;-)
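For anyone wondering, the designated/convenience initializer distinction carried straight into Swift. A minimal sketch (the class and names are made up for illustration):

```swift
class Document {
    let name: String

    // Designated initializer: must fully initialize every stored property.
    init(name: String) {
        self.name = name
    }

    // Convenience initializer: must delegate (directly or indirectly)
    // to a designated initializer of the same class.
    convenience init() {
        self.init(name: "Untitled")
    }
}

let doc = Document()
print(doc.name)  // Untitled
```

The split is what lets the compiler prove every stored property is initialized exactly once, something a plain "constructor" concept doesn't capture.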

I felt the same when working on iOS. I felt I was writing way too much boilerplate code, while Android and Windows Phone just gave me a lot more "for free".

You've just described exactly what it feels like transitioning from iOS to Android development, too.

You may not hate Objective-C, but I doubt you love it either. Have you / would you ever use Objective-C to write a web back-end? To write a command-line tool?

I got started with WebObjects, a NeXT product, a couple of years before Apple bought them. Yes, I've written wonderfully powerful web applications in Objective-C, back when the rest of the web was being built with CGI and Perl scripts.

I loved Smalltalk and I love Objective-C at a deep level. The Objective-C runtime is incredibly powerful and its method dispatch is astonishingly efficient considering what it does. It is not as fast as vtables, but it isn't as fragile either.

It might well interest you to know that WebObjects (I'm talking 1997 here) ran on HP-UX, SunOS, AIX, and one other popular Unix of the day that slips my mind. It also shipped with a lively scripting language called WebScript, which was not so different from a minimal Swift today.

The thing is, once you dig into the Objective-C runtime and spend a bit of time trying to write an interpreter, you start to realize that the interpreter almost writes itself. Swift is far from the first language built atop the Objective-C runtime.

Consider FScript (http://www.fscript.org): it has been around for well over a decade and does more or less the same thing, except it gives you something closer to Smalltalk than JavaScript, and it includes some advanced matrix-manipulation goodies as well.

The majority of the people squealing with glee over the introduction of Swift seem to be the sort of people I wouldn't care to work with. If a bit of syntax puts you off so much, lord help you when a truly new paradigm hits.

Swift looks to have some nice features, but it seems to be missing the low level access to the runtime that advanced developers can use like default message handlers (forwardInvocation:/doesNotUnderstand:/methodForSelector: kinds of stuff) and the ability to fiddle method dicts at runtime which can be very useful for intercepting strange errors and unexpected code paths.

So, yes, I do LOVE Objective-C. It is my second-favorite language to work in, after Smalltalk. And to those claiming that Swift will help them move over from Android because it is less verbose: let's remember Java is the most boilerplate-per-capability language I've seen since COBOL. I don't know what those people are talking about.

I've done both, they were fun projects :)

The only thing that got in the way was the difficulty using the code away from OS X or iOS, and the fact that a lot of libraries for things like database access (especially those intended for iOS) were never intended to be used in a long running process. I found slow (3 week) memory leaks that someone writing an iOS app would never have hit.

I actually really like Objective-C and would totally use it as a back end language if there were good libraries to make use of. I've also written a couple of command line tools in Obj-C.

My dislike is that it uses [] for method calls. It's like making Objective-English where we swap Z and A and j for o, just for the hell of it.

If thzt sjunds like fun tj yju, thzn gj fjr Jboective-C.

It's not for the hell of it.

[ ] does not mean method call, it is the syntax for a message send.

Objective-C is a superset of C, adding a Smalltalk-like object system to C. The delimiters say "I am sending a message", which is different from a method call. Also, without them the language would be much more difficult to parse, and future changes to C could break the language. It's lasted well (first appeared in 1993). Not as long as Lisp; perhaps it needs more [ ] :)

> It's lasted well (first appeared in 1993).

1983, actually.

Thanks - I felt I should type 1983, but it felt wrong! I still had my Apple ][ back then.

Thanks. Just read up on messaging and now I like it even less :(

In Smalltalk and Objective-C, the target of a message is resolved at runtime, with the receiving object itself interpreting the message. ... A consequence of this is that the message-passing system has no type checking.


This is exactly what gives you the ability to easily wire up standard UI components and do things like KVO. KVO is really difficult in something like C++ (for example, it's practically impossible to do in Qt without a lot of templating/boilerplate code).

This is in my opinion the best thing about Objective-C; it clearly delineates the object/class and C dichotomy, making it easier for a C programmer (or a Smalltalk programmer!) to pick up. For years, the only changes from vanilla C were the brackets, "#import" and the @ literal syntax (IIRC).

Actually, if you ask me today, after dealing with Scala's idea of how the Option type should work, I might say that nil propagation is the best thing about Objective-C.
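Swift's optional chaining gives roughly the same effect as Objective-C's nil propagation. A sketch with made-up types (present-day Swift syntax):

```swift
class Person {
    var manager: Person?
    var name: String?
}

let intern = Person()

// If any link in the chain is nil, the whole expression evaluates to
// nil - no crash, no explicit nil checks - echoing Objective-C's
// "messages to nil return nil" behavior.
let bossName = intern.manager?.name?.uppercased()
print(bossName ?? "no manager")  // no manager
```

The difference from Objective-C is that the nil-ness shows up in the type (`String?`), so the compiler still forces you to deal with it eventually.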

That's how I always felt. I liked the clear differentiation between C function calls and method calls on objects.

very genius response!

It's not hate, but Objective-C can be intimidating.

I just spent the past two months learning Obj-C, and I'm about to release my first app - and boom, out goes Obj-C. My luck.

90% of what you learned are Cocoa frameworks and Apple-flavored OOP patterns that will be totally applicable to apps written in Swift. Fear not!

I don't know very much at all about Objective-C, but the way these things generally work is that you will benefit from the experience as you learn new languages, as it will be an anchor of context against which you can base fresh perceptions.

You'll always be able to contribute to NeXTSTEP. It's not dead yet!

No worries, Objective-C is faaar from deprecated.

Objective-C isn't going anywhere.

Swift is shit. I suspect it will die in a couple years, like the misguided effort to get people to adopt the Java bridge or WebScript before that.

I don't think syntax is really the issue. Using objc these days is clunky for reasons besides syntax.

Like dealing with ARC, which is still clunky:

    @lazy var asHTML: () -> String = {
        [unowned self] in
        if let text = self.text {
            return "<\(self.name)>\(text)</\(self.name)>"
        } else {
            return "<\(self.name) />"
        }
    }

Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks. https://itun.es/us/jEUH0.l

To someone on the outside of ObjC, it's just SO DAMN VERBOSE. It's unapproachable the same way Java is unapproachable.

I understand why ObjC's syntax makes some people bristle, but I've never felt that way myself. It's sort of like the people that really hate Python for no other reason than the meaningful whitespace. It's unconventional, but once you understand the rationale for it it makes sense in a way that is at least forgivable if not likable.

There have been a lot of C-based object-oriented APIs over the years. GObject has a C API. On the Mac, there's Core Foundation and a bunch of other OS X APIs that are built on top of it. For over a decade on X11, before gtk and Qt even existed, the closest thing there was to a standard graphical environment was Motif (the corresponding desktop environment was CDE), and Motif was built on top of Xt. Xt was yet another C-based object system, although it was specialized for designing UI components.

This is all well and good but you end up with a ton of boilerplate code that does nothing but manage the lifecycles of the object instances (retain/release for example), and lends itself to extremely verbose function calls in place of object methods.

One possible solution is to put together some really elaborate preprocessor macros to make it look like you have extended the C language to include special syntax for your object system, so you can at least replace this:

obj foo = obj_factory(); int c = obj_getNumberOfElements(foo);

...with something more compact like this:

obj foo = [Obj new]; int c = [foo numberOfElements];

(the second example is ObjC-ish but the former is nothing in particular other than just what the typical C object APIs tend to look like)

The only catch is that the little mini-language you are extending C with using macros can't use existing C syntax, because you can only add to the language, not alter the behavior of existing operators. So, you can't just do method calls using a dot syntax on the instance (such as foo.numberOfElements()). So, you have to come up with something new. Maybe you always liked Smalltalk, and maybe you even based much of behavior of your object system on how Smalltalk objects behave and interact? If so, you might settle on the bracket notation. This has the added benefit of making it very clear when a chunk of code is run-of-the-mill C versus when the code is triggering the syntactic sugar you created with macros to add support for your object system to the C language.

C++ doesn't exist yet, or else you might've just gone with that instead of rolling your own thing. Eventually C++ does exist, and you start to feel a little primitive for sticking with the weird macro language. You eventually build your mini-language into a C compiler so you don't have to use the macros anymore. You experiment with some new alternatives to the syntax that are more conventional, but no one uses them. Many developers like that the non-C-ish syntax makes it easy to distinguish between straight C code and interactions with the object system, which has its own set of rules and conventions.

Anyway, that's mostly speculation, but something like that story is how I've always thought Objective-C evolved over the years. I don't mind it nearly as much as long as I don't think of it as a separate programming language from C (like C++ or Java or pretty much anything else these days), but rather think of it as C with some useful syntactic sugar that gets rid of a ton of boilerplate code for a particular C-based object-oriented API.

According to http://en.wikipedia.org/wiki/Objective-C#History, that's actually almost exactly how it came to be. (Apple even experimented with changing the syntax: http://en.wikipedia.org/wiki/Objective-C#.22Modern.22_Object...)

It really reeks of the 80s. I'd rather program in plain C.

I spent a lot of time trying to do stuff with Objective-C, but I just hated the syntax. That's been the biggest thing keeping me from developing Mac OS X apps; I just prefer Ruby's simplicity. I'm going to seriously give Swift a try.

Yep, same here. It looks pretty JavaScript-y, which is familiar at least. I think this is a good move on Apple's part.

It's probably a wise decision to have an "Algol-patterned" language. No non-Algol-patterned language has ever become a mainstream programming language, to my knowledge.

I am not a programming language wonk; so I imagine most languages I am familiar-with/know-of are necessarily Algol patterned. What are some non-Algol patterned languages?

Lisp, Forth, Prolog (and Erlang), Smalltalk, Haskell, and Tcl all come to mind.

In particular, Obj-C = Smalltalk + C. If you subtract C from Obj-C, you'd most easily just end up with Smalltalk. But that's not the right move for mass adoption.

I agree with the first, but disagree with the second part:

COBOL, Fortran, JCL (not Turing complete, AFAIK), SQL, Excel, DOS batch files all were (fairly) mainstream at some time.

Fortran came before Algol and arguably influenced it[1]. I agree with COBOL and SQL in particular, though.

[1] http://www.digibarn.com/collections/posters/tongues/Computer...

The correctness of that image can be discussed. Fortran was specified in 1954, but the first compiler shipped in April 1957 (http://en.wikipedia.org/wiki/Fortran#History). That is earlier than Algol 58 (first two implementations in 1958 (http://en.wikipedia.org/wiki/ALGOL_58#Time_line_of_implement...), but close.

More importantly, "inspired by" does not imply that Fortran 58 is Algol-like (that same picture would declare Fortran Lisp-like, too)

For me, http://en.wikipedia.org/wiki/Fortran#Simple_FORTRAN_II_progr... certainly is nothing like Algol.

Ruby is simple and beautiful, isn't it? Too bad it never got the shower of money from big backers that JavaScript, PHP, and now Swift got blessed with.

Beauty is in the eye of the beholder, but Ruby is anything but simple. It has one of the most complicated syntaxes of any programming language in common use.

Perl and C++ are still in the lead, but with stuff like the gratuitous introduction of alternate hash syntax, new-style lambdas, etc., Ruby is catching up.

Ruby's grammar is complex, but its object model is incredibly simple.

Introduction of a new hash syntax wasn't gratuitous really. I think the point was to make up for the lack of proper keyword arguments. Now that they're available, it's true that it doesn't have a reason to stand on its own, but it does make the code more readable and concise, as does the stabby lambda syntax. Though I do agree with your point on simplicity really, the language does offer way too many ways to do the same thing sometimes.

Agreed. I would go so far as to say that this was "one more thing" worthy.

It's definitely more exciting than something like an incremental update to the Apple TV.

My dad tuned out as the keynote got to this point, but for me (as a web developer... for now!) this was the highlight.

I feel the exact same way. For a while now I've been looking at other ways to develop for iOS, such as HTML5 with PhoneGap or C# with Xamarin, but it's always been a kludge.

Swift looks amazing and I'm really excited to try it out tonight! Great job Apple devs.

  > So if Apple's goal was to get new devs into the iOS world, at least
  > from 10k feet, it's working
They just announced Swift, at a conference for Apple developers, with live streaming that is only easily accessed from an iOS device. I think it is probably premature to pop the corks and celebrate the efficacy of the get-new-developers initiative.

As someone wise mentioned to me, Objective-C was 20% of the problem; Apple's silly rules and controls around app distribution are the other 80%. As someone who had their app in the App Store for nearly 8 months, including 3 approved updates, before it was (seemingly) arbitrarily rejected, I feel the pain of that other 80%.

How else are they supposed to announce it? It's simply that, an announcement. People are talking about it now and there's info on the Apple site. I see this as a huge push forward for new developers.

The announcement was fine; it is the "it's working" part that is odd, considering it is less than a day old. Let's see if it actually attracts new developers before we declare it a mighty success.

Well, based on the promise of immediate acceptance into the App Store and a very well-thought-out book about the language available for free, I'd say they're doing rather well so far.

You mentioned things that are likely to bring about the desired result of creating new iOS developers. I am not disagreeing about the likelihood of success. I am simply saying that T + 8h is probably too soon to conclude that the program is successfully creating new iOS developers. To be honest, I think it is absurd to expect that such a program from any company could create new developers in less than eight hours.
