
Clojure 1.7 is now available - Spendar89
http://blog.cognitect.com/blog/2015/6/30/clojure-17
======
logicchains
I'm really impressed by how backwards compatible it is. I just changed the
Clojure version from "1.6.0" to "1.7.0" for one of my side projects, without
updating any library versions, ran it, all the tests passed, and it seems to
work perfectly. It also didn't break my Emacs setup, which is a breath of
fresh air compared to how much work it was to get the Haskell tooling working
with the new 7.10 GHC release (GHC-mod for instance still doesn't have a
compatible release, although the trunk mostly works with 7.10). Similarly,
even though it's months after the 7.10 release there are still libraries that
don't support it, like reactive-banana-wx, whilst a couple of the Clojure
libraries I'm using haven't been updated in over a year yet still work fine on
1.7, and none of the libraries I'm using break on 1.7.

To be fair, GHC and the Haskell ecosystem are far more complex than Clojure and
its ecosystem/standard library. Nevertheless, it's pleasant how easy Clojure
was to upgrade (although of course this stability is nothing special: more
conservative languages like Go and Java generally break almost nothing on
upgrade).

~~~
kul_
Yes, this is something Clojure users take for granted. I don't know the state
of Scala right now, but a year back even a minor version bump was horrible in
Scala in terms of backward compatibility. Really impressive job by the Clojure
core devs in maintaining such stable releases.

~~~
xixixao
Is the job simpler because Clojure is not statically typed?

~~~
arohner
I'm not a Scala user, but Clojure the language tends to be extremely stable.
Existing stdlib functions almost never change, except to add new
(backwards-compatible) arities.

Releases tend to consist mostly of new features, and a small number of
bugfixes.

------
sharms
If you want to build a website in Clojure, I highly recommend checking out
[http://luminusweb.net](http://luminusweb.net) - the documentation is
amazing, and it incorporates nearly all of the best practices I have seen.

Making an API in Clojure using Swagger gives you a full, interactive UI and
documentation for your API, while also having a schema which makes sure you
know what is submitted and that it validates (i.e. is that a string or a
number?)
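As a sketch of the kind of check being described (hand-rolled with core predicates for illustration; a real Swagger stack would use a schema library, and the field names here are invented):

```clojure
;; A toy schema: map of key -> predicate.
(def item-schema {:name string? :count integer?})

;; Check each declared key's value against its predicate.
(defn validate [schema m]
  (every? (fn [[k pred]] (pred (get m k))) schema))

(validate item-schema {:name "widget" :count 3})   ;; => true
(validate item-schema {:name "widget" :count "3"}) ;; => false (string, not number)
```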

~~~
yogthos
Luminus author here, thanks for the kind words. :)

As a note, making a Swagger app with Luminus is as simple as:

    lein new luminus myapp +swagger
    cd myapp
    lein run

once the server starts, browse to
[http://localhost:3000/swagger-ui/index.html](http://localhost:3000/swagger-ui/index.html)
to see your Swagger API

------
27182818284
>Transducers are composable algorithmic transformations. They are independent
from the context of their input and output sources and specify only the
essence of the transformation in terms of an individual element. Because
transducers are decoupled from input or output sources,

The biggest thing holding me back from learning Clojure is that I fear it will
take me a decade to become remotely competent in it.

~~~
enoch_r
If you have an understanding of functions like map, filter, and reduce,
transducers are actually pretty easy.

Say you have `(map inc [1 2])`. You can run that, and get `'(2 3)`.

A transducer is the `(map inc)` part of that call (slightly confusingly, this
isn't partial application or currying). You can apply it to something like `[1
2]`, but you can also compose with it, by _combining_ it with say, `(filter
even?)` to get something that represents the process of incrementing
everything, then removing odd numbers. Or you can put in things that aren't
collections, like asynchronous channels, and get back a new channel with the
values modified accordingly.

That's pretty much it.
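A quick sketch of the above (the numbers are arbitrary):

```clojure
;; (map inc) with no collection is a transducer; comp chains transducers.
(def xf (comp (map inc) (filter even?)))

;; Apply it eagerly into a vector:
(into [] xf [1 2 3 4])
;; => [2 4]

;; Or lazily:
(sequence xf [1 2 3 4])
;; => (2 4)
```

The same `xf` would also work with a core.async channel, which is the "decoupled from input or output sources" part.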

What I think I love most about Clojure is that there are fantastic, esoteric,
academic ideas that, when I read about them in a context like this for the
first time, I a) do not understand them, and b) have no idea how they would be
useful. Then I read an example or two, and suddenly it's apparent that the
tool is really as simple as it can be--there's very little accidental
complexity--and is extremely useful.

~~~
aoeuasdf1
The way you explain it, it's no different from functions and function
composition; in which case, why invent new vocabulary?

I do remember looking into them before and translating them into Haskell and
they ended up not being identical to functions in the trivial sense that you
suggest, but I forget how.

~~~
kazinator
Transducers are functions. The thing is that they are functions that are
designed to serve as the functional argument to reduce. And they pair with
ordinary functions which are not transducers.

For instance if we have (map inc [1 2]), there exists a transducer function T
such that:

    (reduce T [1 2])  ==  (map inc [1 2])

I.e. we can somehow do "map inc" using reduce.

Okay?

Now, the clever thing is this: why don't we allow map to be called without the
list argument? Just let it be called like this:

    (map inc)

This looks like partial application, right? Now what would partial application
do? It would curry the "inc", returning a function of one argument that takes
a list, i.e.:

    ;; if it were partial application, then:
    ((map inc) [1 2])  ;; same as (map inc [1 2])

But Hickey did something clever; he overloaded functions like map so that (map
inc) returns T!

    (reduce (map inc) [1 2]) ;; same as (map inc [1 2])

The cool thing is that this (map inc) composes with other functions of its
kind. So you can compose together the transducers of list processing
operations, which are then put into effect inside a single reduce, and the
behavior is like the composition of the original list processors.

It's like a linear operator, like the Laplace transform: composition of entire
list operations in the regular domain corresponds to composition of operations
on individual elements in the "t-domain".

~~~
ds300
> (reduce (map inc) [1 2]) ;; same as (map inc [1 2])

This is wrong. You've missed the point.

    (map inc [1 2])

is actually roughly equivalent to

    (reduce ((map inc) conj) [] [1 2])

which, due to the use of `reduce`, is eager. To get laziness back:

    (sequence (map inc) [1 2])

Transducers are not reducing functions, they _return_ reducing functions when
applied to reducing functions. `((map inc) conj)` is a version of `conj` that
calls `inc` on all the rhs args before `conj`ing them into the lhs arg.
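To make that concrete, a sketch of what `((map inc) conj)` actually does:

```clojure
;; (map inc) is a transducer: apply it to conj to get a new reducing fn.
(def inc-then-conj ((map inc) conj))

;; It incs the incoming element before conjing it on:
(inc-then-conj [1 2] 3)
;; => [1 2 4]

;; Which is why this reduce behaves like (map inc ...):
(reduce inc-then-conj [] [1 2])
;; => [2 3]
```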

~~~
kazinator
I suspected I wasn't quite understanding something, because why would we want,
say, a map based on reduce that chokes on lazy lists, in a language where
laziness is important?

------
icey
It would be great to see a refactor of some code using transducers to get a
better sense of what they're useful for. From an abstraction standpoint, I can
see the attraction; but I am having a tough time imagining how it would
improve code in practice.

~~~
birdsbolt
It improves performance by fusing some of the operations, doing them in one
pass instead of multiple passes, and it does so generically - operations are
composed regardless of the source, and the implementation isn't looking at the
type of the source at all.

I believe one of the arguments was also that these couldn't be written in
statically typed languages, though I don't know whether that turned out to be
true.

~~~
puredanger
Re the last sentence, that is not true, nor was it a claim.

~~~
birdsbolt
I think I saw a talk, or a statement somewhere, made by Rich Hickey -
[https://news.ycombinator.com/item?id=8342718](https://news.ycombinator.com/item?id=8342718)
. I believe it was in this talk.

Not that I was being negative about it.

~~~
puredanger
Rich didn't claim you couldn't implement it. He just claimed that some aspects
of transducers are difficult to represent _as types_. Or at least, that's how
I took it.

------
giancarlostoro
My only negativity with Clojure isn't really with the language itself, but
with the Debian / Ubuntu packages: they're way outdated. I'm not sure who
maintained them, or why they stopped doing so, 1.4 being the last version.
Outside of that, for anyone wanting to check it out, you could try downloading
LightTable and using ClojureScript. It seems close enough that I'm able to use
Clojure books with ClojureScript. I'm not entirely sure of its differences or
how backwards compatible one is meant to be with the other, though I suspect
they're meant to be near identical aside from their available libraries.

~~~
ffreire
Clojure is often billed as "just another JAR" and in my experience I've never
felt the need to install via any package manager. My workflow often consists
of a new leiningen project where I define the version of Clojure that I want
to use for the current project, along with whatever else I need for the task
at hand. Likewise, if I just need a quick repl to test something out, I'll
simply use `lein repl` and fire away.

~~~
nerd_stuff
To expand on this:

If you're familiar with Python, Leiningen takes the place of both pip and
virtualenv. Every project has a project.clj file where you declare your
project dependencies and running "lein deps" from your project root handles it
from there. This includes libraries and the version of Clojure you're
targeting. When you add/delete/change a dependency in project.clj you simply
run "lein deps" again. You never have to run "pip freeze" or make a
requirements.txt file because project.clj serves that purpose as well.

As an example, Leiningen's own project.clj:
[https://github.com/technomancy/leiningen/blob/master/project...](https://github.com/technomancy/leiningen/blob/master/project.clj)
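And a more minimal sketch of what a project.clj looks like (the project name, library, and versions here are invented for illustration):

```clojure
(defproject myapp "0.1.0-SNAPSHOT"
  :description "Example project"
  ;; The Clojure version is declared like any other dependency:
  :dependencies [[org.clojure/clojure "1.7.0"]
                 [ring/ring-core "1.4.0"]]
  :main myapp.core)
```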

------
pjmlp
Was the performance related work postponed?

~~~
susi22
If you mean the compilation fsync stuff: yeah, postponed until 1.8

~~~
pjmlp
I mean the compilation and startup time that people blame on the JVM, CLR,
Dalvik and ART when trying to run Clojure code.

~~~
puredanger
It sounds like you are referring to various ideas around delayed var loading
to improve startup performance - most of that was delayed, although there was
one change that improved compilation speed and the performance of some
projects.

~~~
pjmlp
Yeah that was it, thanks.

