
A Lisper's first impression of Julia - ananthrk
http://p-cos.blogspot.com/2014/07/a-lispers-first-impression-of-julia.html
======
idunning
Comprehensive! Covers the Lisp-y influences of Julia in great depth.

My perspective on Julia is that it has 3 ingredients:

1. A principled design that derives from the experience of past programming
languages, particularly the creator's experiences with Lisps. This is where
a lot of the "magic" comes from: multiple dispatch, the type system,
metaprogramming, etc. The article covers this aspect.

2. A need to be accessible to those transitioning from other languages, like
MATLAB and Python. MATLAB, for example, has guided function naming (although
NumPy also has similar names for similar reasons). The author mentions the
lack of distinction between creating a variable and changing its binding: I'd
suggest this is an example of this design point at work.

3. A need to be fast. The author brings up the Int vs BigInt distinction.
Python, for example, allows ints to get as big as you want, but at a cost:
adding two ints is not simply an add instruction; you must do a lot more work.
Julia, falling on the side of performance, elects to distinguish between
arbitrary-precision BigInts and machine Ints.
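The trade-off described above is easy to see from the Python side: every Python int is arbitrary-precision, so arithmetic never overflows, but each operation goes through object machinery rather than compiling down to a single machine add. A minimal illustration (plain Python, nothing assumed beyond the standard semantics):

```python
# Python promotes silently past the 64-bit machine-word range:
x = 2 ** 62          # still fits in a signed 64-bit word
y = x * 4            # 2**64 -- no longer fits, Python keeps going anyway
print(y)             # 18446744073709551616, computed exactly

# In Julia the same operation on an Int64 would wrap around;
# exact results require opting into BigInt explicitly.
```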

~~~
StefanKarpinski
Regarding the Int / BigInt distinction: besides performance, there are other
issues that haven't historically been prominent considerations in new language
designs, namely interoperability and transparency. In the current design, a
Vector{Int} always has the same in-memory representation as it would in C or
Fortran – you can take a pointer to the first array element and pass it to a
library function using the C ABI and it will just work. You also know exactly
how your data is represented and can reason about it. You know, for example,
that a Vector{Int} definitely does not require any additional heap allocation
besides the inline Int values and that arithmetic operations on Ints will just
be machine arithmetic ops. I think that the transparency of the C data and
performance models has been one of the major reasons for C's long-lived
success. One of the design goals of Julia is to have similarly transparent
data and performance models.
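A rough Python analogue of the layout guarantee described above is the standard-library `array` module, which, like a `Vector{Int}`, stores machine integers inline in one contiguous buffer that can be handed to C code; a sketch:

```python
from array import array

a = array('q', [1, 2, 3])        # 'q' = signed 64-bit integers, stored inline
addr, nelems = a.buffer_info()   # raw address + length, passable to a C library
print(a.itemsize)                # 8 bytes per element, no per-element boxing
```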

~~~
phkahler
IMHO it is a total failure to have int depend on the machine architecture.
C99 fixed this behavior with types for specific sizes so people could finally
write portable code. Julia should adopt 64-bit integers by default, given its
intended audience and the reality that even some phones have 64-bit processors.
int64_t works on 32-bit processors too, but with a performance penalty. Having
the range of a variable depend on the machine architecture really went out of
style a long time ago.

~~~
StefanKarpinski
We considered that, but even though 64-bit ints work on 32-bit machines, they
are dog slow. Insisting that integers are 64-bit everywhere is basically
saying that you want slow for loops, slow array indexing – slow everything –
on 32-bit systems. Clearly that's unacceptable in a language that is meant to
be fast. So Julia has Int32 and Int64 when you want a specific bit size and
Int is always the same size as your pointers. This arrangement is considerably
simpler to deal with than C's "integers are whatever size I want them to be!
[evil cackle]" approach. In particular, default integers and pointers are
always the same size – which is not always the case in C (I'm looking at you,
Win64) – so there's only one system-dependent size to worry about.
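The "pointer-sized default integer" point can be probed from any language with C bindings; here is a small check using Python's `ctypes` (the exact numbers depend on your platform, which is precisely the point being made about C):

```python
import ctypes

ptr = ctypes.sizeof(ctypes.c_void_p)   # pointer size: 4 on 32-bit, 8 on 64-bit
lng = ctypes.sizeof(ctypes.c_long)     # C's long: 4 even on Win64 (LLP64)
print(ptr, lng)                        # in C the two need not agree
```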

------
mjburgess
This is a refreshingly specific post. Many articles of this kind ham-fistedly
define various philosophical criteria throughout the post and make sketchy
judgements within them. Here, however, there is just: "here are the main
comparative languages, here are the differences, here's where there may be
issues".

NB. I'm very much in favour of a principled (qua philosophical) approach to
language comparison (etc.), but it's rarely done well.

~~~
nabla9
It's always great to see a post from a real Lisper instead of yet another Lisp
philosopher.

------
616c
As an aside (and please do not take it as a flame), this is a very neat
article that shows a class of languages in a paradigm I had never considered:
Lispy languages (semantically) without Lisp morpho-syntax. I had heard of
Julia, of course, and see a few mentions here and there of Dylan. It is
interesting that Dylan attracted so little interest, and that there were so few
similar projects, because everyone complains about Lisp syntax (as I see here;
I am an amateur Lisper and I understand its history and appreciate it) but
bemoans not having other languages with the power of homoiconicity and the
other core parts on which the macro system and other gems are based (I forget
who said it: keep adding features to a language and you get a much shittier
Lisp).

Why did these languages not take off (at least pre-Julia)? I have heard other
people "debate" (and I use the word here to mean disagreement on principle, not
on the details of said debate) that Ruby and other languages are Lisp-like, but
fall short. Dylan seems to have been Lisp (proper) without Lisp syntax on
purpose (a deliberate choice made during the design phase). So why do languages
with such powerful expressiveness (for your value of the word; I do not want to
start that discussion either) never take off, Dylan or otherwise? It seems
that is what all programmers, at least the ones more advanced than me, clamor
for.

~~~
rayiner
Aside from Apple having abandoned the language, the basic issue is that Dylan
projects were very ambitious. Dylan was aimed at C++, so Harlequin and CMU
spent a huge amount of time developing sophisticated native code compilers,
thread-safe GC, compilation to native executables, etc. Harlequin also did a
whole IDE, with GUI toolkit and Emacs-like editor, all written in Dylan and
self-hosted. Ruby, Python, etc, showed there was a market for simple dumb
implementations that were nonetheless useful, and got to market quickly
because it was easy to do a little C interpreter that did a dictionary lookup
every other operation.

There's a renaissance in native-compiled languages now, mainly thanks to LLVM
and the JVM. Having a fast optimizing compiler back end that generates
binaries on many platforms is a huge head start, and goes a long way to making
the language immediately useful. The JVM gives you those and then some.

~~~
pjmlp
> There's a renaissance in native-compiled languages now, mainly thanks to
> LLVM and the JVM

No, technology just goes in circles.

Like 30 years ago, when people started to realize P-Code and other VM
approaches were too slow and resource-hungry to be useful when targeting
minicomputers.

Now mobiles and high electricity costs are making developers reach the same
conclusions again.

~~~
phkahler
> Now mobiles and high electricity costs are making developers reach the same
> conclusions again.

But rather than fall back on existing compiled languages, they are now trying
to build something that has it all.

~~~
pjmlp
That is also not new. While the minicomputers struggled with VMs and got back
to AOT compilation, research labs' workstations already had mixed mode.

The first JITs were targeted at Lisp and Smalltalk environments, and commercial
Lisps always had JIT + AOT compilation support.

As for going for something new, it is hard to bring people back to
technologies that are no longer mainstream, without adding something new to
it.

------
andrewflnr
It seems weird to try to characterize Julia in terms of object-oriented
programming. Is that just me? Julia's approach to subtyping and multiple
dispatch is sufficiently different from the C++ and Python approaches to OOP
that I don't even put them in the same bucket, and it seems about as far
from CL's object system as well. Julia doesn't really advertise itself as OO;
you can't even find the word "object" on the front page of their site. So I
wouldn't try to think of it that way.

A lot of the comparisons in this article seem like that to me. Julia and
Common Lisp are apparently just close enough to make a point-by-point
comparison like this plausible, but things are not aligned closely enough
to make it work. It's still a good article with a lot of solid meat in it, but
I think the topic would have been better served by going up the abstraction
ladder a bit and talking about how the different paradigms of each language
motivated the differences between them.

Disclaimer: I'm only somewhat familiar with Julia and not at all with Common
Lisp.

~~~
PuercoPop
It only seems weird insofar as you are ignorant (aren't we all?) of the
generic-function approach to object orientation.

In the message-passing approach you dispatch based on the type of the first
argument to the method. Because it would be redundant to write that argument
down explicitly, it is commonly elided syntactically (though not fully in
Python), and you get a coupling of the methods to the object. This is not a
fundamental property of OO but an accidental feature found in most OO
languages.

In the generic-function approach, dispatch is extended so one can dispatch
on the type (among other things) of (ideally) all of the arguments. Julia
follows this approach, AFAIK, and if one is familiar with the generic-function
approach, the claim is not controversial at all.

Erik Naggum explains it here in more depth:
[http://www.xach.com/naggum/articles/3243735416407529@naggum....](http://www.xach.com/naggum/articles/3243735416407529@naggum.no.html)
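For readers who haven't met generic functions, the idea above can be sketched as a deliberately tiny method table in Python, keyed by the types of *all* arguments (no subtype resolution or method combination, which real CLOS/Julia dispatch adds on top; all names here are made up for illustration):

```python
# Toy multiple dispatch: one table of methods per (name, argument types).
_methods = {}

def defmethod(name, argtypes, fn):
    _methods[(name, argtypes)] = fn

def call(name, *args):
    key = (name, tuple(type(a) for a in args))
    if key not in _methods:
        raise TypeError(f"no applicable method {name} for {key[1]}")
    return _methods[key](*args)

# The method chosen depends on both arguments, not just the first:
defmethod("combine", (int, int), lambda a, b: a + b)
defmethod("combine", (str, int), lambda a, b: a * b)

print(call("combine", 2, 3))     # 5
print(call("combine", "ab", 3))  # 'ababab'
```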

------
wirrbel
I would very much like to read a comparison of CL and Clojure from the author
at some point. As it seems he is offering a fair comparison.

~~~
LeonidasXIV
> I would very much like to read a comparison of CL and Clojure from the
> author at some point. As it seems he is offering a fair comparison.

I suspect the author's main problem with Clojure is that it is a mostly
functional language which heavily emphasizes doing things in the functional
way and discourages imperative programming, whereas CL is more of a true
multiparadigm language.

I used to think that multiparadigm is best, but after migrating from Scheme,
which is mostly functional but has a lot of mutation and a sad lack of
interesting data structures apart from lists, to Clojure, which has good
support for dicts and persistent data structures, I think I prefer a community
that is more focused on one approach.

~~~
klibertp
> I suspect the author's main problem with Clojure

That's not a very nice thing to do, suspecting people without any kind of
evidence. Not to mention the fact that there is a `set!` form in Clojure,
which makes it entirely possible to write very imperative code (and
thread-local semantics don't matter in single-threaded programs).

Anyway, "problems with Clojure" can be very different for different people. I
like Clojure's design as a language - even its interop with the OO host's
features is very neat - but then when I want to hack on some simple script in
a REPL I not only need to write this:

    $ rlwrap java -cp "clojure-1.5.1.jar" clojure.main

but then I need to wait for freaking 6 seconds for the prompt to appear. 6
seconds. I don't know what more I could write here, so I'll just paste this
(Chicken Scheme):

    $ time csi -e '(exit)'
    csi -e '(exit)'  0,01s user 0,00s system 81% cpu 0,007 total

So that's my problem with Clojure, nothing to do with "functional way", right?

~~~
LeonidasXIV
> So that's my problem with Clojure, nothing to do with "functional way",
> right?

Sure. For a solution to your particular problem, maybe ClojureScript will
bring some improvement for CLI use, since Node.js tends to start faster than a
whole JVM.

------
klibertp
> I would like to see something like Oberon’s support for renaming identifiers
> on import in some Lisp dialect someday

I couldn't find any specifics on how it's done in the linked PDF (modules are
described at the end, in section 11), but I think both Clojure and Racket do
this already. The `require` mini-language in Racket is very rich and allows for
prefixing, renaming, selective import of identifiers, and so on:
[http://docs.racket-lang.org/reference/require.html#%28form._...](http://docs.racket-lang.org/reference/require.html#%28form._%28%28lib._racket%2Fprivate%2Fbase..rkt%29._require%29%29)
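For comparison, Python's import system offers the renaming part of this (though nothing like Racket's full `require` sub-language); a small sketch:

```python
# Rename identifiers on import so local names need not collide:
from json import dumps as to_json
import collections.abc as abc_types

print(to_json({"a": 1}))           # {"a": 1}
print(abc_types.Mapping.__name__)  # Mapping
```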

------
peterashford
What is the ecosystem for Julia like? Could it be considered for systems
programming tasks?

~~~
StefanKarpinski
Define systems programming. If you mean could you write servers in it, then
yes (there's a web stack for it already). If you want to write an OS kernel in
Julia, then you probably could, but I'm not sure you'd want to.

~~~
ics
There are times when having an English word for "no, well– yes, but I don't
know why you'd want to" would be very useful. It'd have to be pretty short to
save breath every time a computer scientist must answer the question, "But is
it a _systems_ programming language?"

~~~
LeonidasXIV
Maybe 無 (mu) [0], which is something philosophically in between yes and no.

[0]:
[https://en.wikipedia.org/wiki/Mu_(negative)#.22Unasking.22_t...](https://en.wikipedia.org/wiki/Mu_\(negative\)#.22Unasking.22_the_question)

~~~
peterashford
Mu is more like "you've made a category error" or "the question makes no
sense" rather than "technically yes, but you wouldn't want to", which is
what the post you were responding to seemed to be aiming for.

------
zercool
Wow. I really appreciate how thorough this post is!

------
gct
I just wish they hadn't ruined the entire concept by going with one-based
indexing.

