Creating a Swift syntax extension: the Lisp 'cond' function (appventure.me)
28 points by terhechte on June 8, 2014 | 13 comments

Interesting idea. We can support unlimited cases by using tuples and varargs, though the fallback has to be the first argument if we don't want to return an optional:

    func cond<T>(#fallback: T, testsAndExprs: (test: @auto_closure () -> Bool, expr: @auto_closure () -> T)...) -> T {
        for (t, e) in testsAndExprs {
            if t() {
                return e()
            }
        }
        return fallback
    }
And in use:

    // y is assigned "0 == 0, of course"
    let y = cond(fallback: "fallback", (test: false, expr: "not this branch"), (test: 0 == 0, expr: "0 == 0, of course"))

I was about to post something very similar, but was trying to check that type annotations can be applied to tuple components before posting.

I do like the idea of @auto_closure, but I bet it will confuse the hell out of people reading Swift code unless the IDE highlights it in some way.

That's a smart idea. Another option would be to use an Array, though that might make the types a bit ugly, since the () -> Bool and () -> T would have to live together in one array.
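For what it's worth, here is a minimal sketch of that Array-based alternative in current Swift syntax (@autoclosure replaced the 2014-era @auto_closure, and tuple elements can't carry that attribute, so the callers wrap values in plain closures themselves):

```swift
// cond taking an array of (test, expr) closure pairs instead of varargs.
func cond<T>(fallback: T, _ cases: [(test: () -> Bool, expr: () -> T)]) -> T {
    for c in cases where c.test() {
        return c.expr()
    }
    return fallback
}

let y = cond(fallback: "fallback", [
    (test: { false },  expr: { "not this branch" }),
    (test: { 0 == 0 }, expr: { "0 == 0, of course" }),
])
// y == "0 == 0, of course"
```

As the comment predicts, the explicit `{ ... }` wrappers at the call site are exactly the noise that @auto_closure was hiding.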

Incredibly cool hack.

But I'm going to be the first one to point out that having function arguments not evaluated at call time is very, very unintuitive. Anyone with a background in traditional (procedural/OO) languages is reasonably going to expect both `perform_side_effects` calls to be invoked here.

    var ctest = cond_1(a == 1, 2 + perform_side_effects1(),
                       a == 2, 3 + perform_side_effects2())
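To make the surprise concrete, here is a small self-contained sketch (modern @autoclosure spelling; the function and variable names are made up for illustration). The argument expression is silently wrapped in a closure, so its side effect never runs unless the branch is taken:

```swift
var sideEffectRan = false

func performSideEffects() -> Int {
    sideEffectRan = true
    return 100
}

// Looks like a normal two-argument function at the call site,
// but the second argument is only evaluated on demand.
func firstIfPositive(_ a: Int, _ b: @autoclosure () -> Int) -> Int {
    return a > 0 ? a : b()
}

let r = firstIfPositive(1, 2 + performSideEffects())
// r == 1, and sideEffectRan is still false: the call never happened
```

Nothing at the call site hints that `performSideEffects()` might be skipped, which is precisely the complaint.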

This is why the Rust people require a bang in macro names. The behavior is just too confusing if what looks like a normal function actually evaluates its arguments in a non-strict manner.

Scala is another language that supports declaration-site opt-in for non-strict evaluation, very similarly to how Swift seems to be doing it. I'm not the biggest fan of Scala, but my limited experience with it has left me with the impression that you can't trust any normal-looking code to actually be normal until you look at its documentation and/or source.

To play devil's advocate, they already have to worry about:

- exceptions short circuiting one of the side effects (C++, C#, etc.)

- the indeterminate order of execution of the side effects (C, C++)

- macros with conditional evaluation behavior (such as assert depending on... _DEBUG? NDEBUG?)

So is everything fair game, then?

No, but macro names give a lot more context about whether an argument will be evaluated than function names give about side effects and exceptions.

> In Lisp, cond works as follows:

> ...

> Much like a switch statement,

No. cond is nothing like a switch statement. cond is Lisp's version of if..else if...else if..else.

The distinction being, of course, that if you have a contiguous range of cases in your switch statement, a proper C compiler will compile it to an O(1) jump table, rather than an O(n) set of tests like an extended if-statement. This is so crucial and basic a distinction that I am astounded every time I find another person claiming that cond==switch.

The closest thing that Lisp has to a switch statement (and this is a significant failing in the language) is CASE. And that's only if you are careful and have a very smart compiler -- not sure if SBCL will jump-table it for you, for example.

That @auto_closure annotation seems to work exactly like D's `lazy` keyword, transparently wrapping the passed-in value in a closure. Pretty cool that more languages are getting built-in support for laziness.
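The canonical use case for this kind of transparent wrapping is a user-defined short-circuiting operator. A minimal sketch in modern Swift syntax (the names here are invented for the example):

```swift
var rhsEvaluated = false

func expensiveCheck() -> Bool {
    rhsEvaluated = true
    return true
}

// A hand-rolled ||: the right operand is auto-wrapped in a closure
// and only called when the left operand is false.
func myOr(_ lhs: Bool, _ rhs: @autoclosure () -> Bool) -> Bool {
    return lhs ? true : rhs()
}

let ok = myOr(true, expensiveCheck())
// ok == true, and rhsEvaluated is still false: the right side was skipped
```

This is how Swift's own && and || are declared in the standard library, which is presumably why the attribute exists at all.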

And... shit like that is why features like operator overloading or this @auto_closure attribute are a bad idea.

I sympathize with the idea of creating a language mostly defined as a library of itself.

It's nerdy, it's neat, it's minimal, it's very Lattnerish.

But it has a fatal flaw: it assumes people will use that feature set in an intelligent, restrained, minimal way. For good.

And let's face it. Most of us devs are fucking morons. It took people only a few days to start abusing it. I wonder where we'll be a year from now.

Capabilities do not equal best practices. And most devs are not morons, especially the ones who write frameworks for others.

The idea that writing a framework means you're smart doesn't explain why there's such a proliferation of badly written frameworks, particularly in easy-to-pick-up languages like JavaScript, Python and PHP.

We're not special. Like everything else in the world, developers follow Sturgeon's law.
