Switching from Common Lisp to Julia (2017) (tamaspapp.eu)
118 points by mindB 78 days ago | 81 comments



Considering how close Julia is to Lisp, it could be interesting to have an actual Lisp targeting the Julia compiler, for people who don't want to compromise on s-exprs but would still want a performance-oriented Lisp with access to a lot of scientific computing and ML libraries (and probably better interop than, say, Clojure has with Java). The only work I know of toward this is:

https://github.com/swadey/LispSyntax.jl


I'm working my way towards doing that. I've got more than a decade of experience with Scheme and would like to be able to code Scheme in the Julia environment. I also recently started learning Clojure (and implementing Clojure in Scheme, although I'm not sure yet I'll finish that), maybe covering both languages will be feasible. I recently implemented a Scheme interpreter in C++ to access Qt (needs more work, and I'm thinking of moving to generating C++ from annotated Scheme code first), and I implemented some Scheme AST & some optimizer stages (see the corescheme* files at [1]). I wrote pretty much all of the infrastructure for coding these things on top of R5RS myself, so I've got a library I can pull from to build the basic environment (to at least bootstrap it easily).

I recently became unemployed and am looking to get back into contracting (Scheme, Clojure, Haskell), so that I can work part-time and have enough time to work on this.

I'm somewhat hoping to modify the Julia compiler/runtime to add first-class continuation support (I have implemented it in the interpreter I wrote in C++, so I do have some experience with that). I'd also like optional TCO, that shouldn't be hard (LLVM supports it, should be just a case of the compiler annotating it correspondingly).

If someone would like to join the effort, please tell me.

[1] https://github.com/pflanze/chj-schemelib


Jeff (creator of Julia) created femtolisp [0], which is used in the Julia parser.

[0]: https://github.com/JeffBezanson/femtolisp


This is a wonderful quote:

"This is what I do for fun, because it is the exact opposite of the kind of thing people will pay for: an obscure implementation of a programming language everybody hates."

Great attitude!


You can access Python libs easily from CL: https://github.com/CodyReichert/awesome-cl#python and there's a Numpy clone: https://github.com/numcl/numcl


The interop is a great stopgap (like Julia with Python and R), but it doesn't replace a native ecosystem. A Lisp-flavored Julia would have native access to the libraries; it could define new types, new function implementations, and new algorithms, all natively and with no performance cost, which would be hard to do for a C/C++/Fortran library wrapped in Python wrapped in CL.

Also, Julia doesn't have a NumPy clone; the native n-dimensional arrays are fast enough that one isn't necessary. That means everything supports them, the very opposite of a Lisp-curse scenario in which everyone could have ended up making their own incompatible version (and thankfully the DataFrame implementations, which are not part of the language, all comply with a single Tables interface, so you can switch between them).
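As a sketch of that Tables interface (assuming the DataFrames and Tables packages are installed; the column names here are made up for illustration):

    using DataFrames, Tables

    df = DataFrame(a = [1, 2, 3], b = [4.0, 5.0, 6.0])

    # Generic code consumes any Tables.jl-compatible source the same way:
    for row in Tables.rows(df)
        println(row.a + row.b)
    end

The same loop works unchanged for any other table type that implements the interface.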


Julia ships with femtolisp. I've never used it. To start it, run:

    julia --lisp


Yes, but this doesn't expose any Julia functionality, so it isn't really usable. I think the parent was looking for something akin to Lisp Flavored Erlang, where the whole language is usable with a Lisp-inspired syntax.


Is femtolisp useful? Does it expose functions like filesystem interactions?


Tamas is a very active participant in the Julia ecosystem, so it seems that the switch went pretty well!

Julia 1.0 was released a little over a year ago at JuliaCon 2018. At this year's JuliaCon there was, IMO, an even more positive "vibe" than in past years, which were already full of excitement. My guess is that everyone has been really happy to have a year of not breaking things and building cool new stuff on top of the solid foundation of the Julia 1.x series.


I'm using Julia instead of Python/Pandas for a machine learning project and I'm happy with how things are going. Multiple dispatch is very cool. The project was previously written in R, but the performance and lack of programming clarity inspired a rewrite in Julia.
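A toy sketch of what multiple dispatch buys you (illustrative names only):

    # The method is chosen on the runtime types of *all* arguments:
    combine(x::Int, y::Int)       = x + y
    combine(x::String, y::String) = x * y          # * concatenates strings
    combine(x::Int, y::String)    = string(x) * y

    combine(1, 2)        # 3
    combine("a", "b")    # "ab"
    combine(1, "b")      # "1b"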

There is something satisfying about writing code in a high-performance language, you just know you won't be kicking yourself because it would run 10x faster in another language.


Julia is one of my favorite languages. For numerical computing, neither Python + NumPy nor MATLAB comes even close. The interop is nuts.

To call, say, NumPy's FFT, you just do:

    using PyCall

    np = pyimport("numpy")
    res = np.fft.fft(rand(ComplexF64, 10))

No casting back and forth. This is a toy example; Julia of course has FFTW bindings.
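For comparison, the native route might look like this (a sketch assuming the FFTW.jl package):

    using FFTW

    res = fft(rand(ComplexF64, 10))  # same call, no Python in the loop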


As of today, how would you compare Julia to the Python ecosystem? Could I use Julia as a "drop-in" replacement for the python scientific stack already?


As I mention above, the interop is really good so you can literally start using it right now without rewriting anything.

The ecosystem is smaller, but the average package quality is higher. It's also easier to contribute to packages: in Python you'd sometimes just end up writing C, while Julia packages are generally pure Julia.


It can depend on which specific Python packages you're using, but generally speaking the Julia ecosystem is pretty mature for scientific computing and data analysis. And as the GP post demonstrated, you can fill in the gaps by using the awesome PyCall interface to directly call Python code.


For what I do (scientific computing mixed with differential equations, so lots of things like neural partial differential equations which utilize lots of special numerical linear algebra and sparsity handling), the packages exist in Julia but not in Python, MATLAB, or R.


It seems little known, but MATLAB has had Python interop since 2015; your code would just be

    res = py.numpy.fft.fft(rand(10,1))


Common Lisp folks should really start designing a more modern version of CL.

I'm pretty sure some old Lispers will come and say 'Common Lisp's stability is a great feature' or 'There are lots of new dialects', and while they are right in some respects, Common Lisp isn't appealing to anyone these days. (I mean, even C and C++ are improving. Why shouldn't CL?)

Even for people who 'get' the ideas of s-exps and macros, there are too many quirks in CL to appeal to people, and most go to Clojure (where, for example, the interactive experience is at best moderate and debugging support/error handling is a mess).

I believe the hypothetical CL++ should at least:

* Iron out incompatibilities between implementations, especially the numerous ways of making an image or binary, and the FFI parts.

* Design a better CLOS. CLOS feels like an object system thought of afterwards (well, it is an afterthought, after all) and bolted onto CL with duct tape. I would appreciate a CLOS-based type system (built with macros, much like CLOS itself) that integrates well with CL.

* Remove unused functions and fix some inconsistencies in the standard library.

* A bigger, much much bigger standard library. Crucial, community-standard libraries like asdf & uiop, syntactic sugar like the arrow/diamond macros, and reader macros that allow infix syntax (the last currently exists in the standard, but nobody uses it, partly because most Lisp users don't really care about infix and partly because its extensibility is bad) should be in the std.

* A compatibility layer. Providing a 'cl' and 'cl-user' package with the usual CL symbols will be sufficient.

to appeal to the users.

This really isn't sufficient for the hypothetical CL++ itself; there really should be beginner materials that aren't PCL. Lispers shouldn't hate infix reader macros or syntactic sugar like LOOP (IMHO these show the real power of macros; after all, macros are another tool for writing readable code with great abstractions), and beginner materials should use, or at least introduce, them.

Just a rant about the language after reading the post and becoming sad.


I absolutely love developing in Common Lisp, and I agree with you.

Some suggestions:

- Get rid of upcasing; 21st-century languages should be case-sensitive and case-preserving. Allegro’s modern mode is the Right Thing™ here IMHO.

- Fix things like order-of-argument inconsistencies in the standard library.

- Consider adding types to streams to indicate whether they are input, output or both, and character, byte or both.

- Base the package system off of Go’s.

- More fully specify pathnames.

- Specify a standard library roughly equivalent to Go’s. At this point I think it’s the gold standard (Python’s used to be, but no longer).


On upcasing:

    (setf *print-case* :downcase)


Oh definitely — that is a must! But it doesn’t do you any good if e.g. you want to turn read symbols into shell commands. You can play some games with :invert, but at the end of the day it’s a hack.

I don’t blame them for upcasing back in the 90s, but … it’s not the 90s anymore.


Not downcase: modern mode, i.e. case-preserving and case-sensitive, e.g. to be able to define native FFI calls into CamelCase functions. Downcasing destroys information; modern mode keeps the original symbol names.


Some bits are here:

- https://github.com/alex-gutev/generic-cl/ gives generic equality, predicates and other base functions.

- the https://lispcookbook.github.io/cl-cookbook/ has nice topics and is growing.

- there is also http://cl21.org/, which works, but is stagnating.

- to make an image, I think asdf:make is all one needs (see the Cookbook).

- Coalton is bringing ML-style types on top of CL.

- …

and I love CLOS.

> there are too many quirks in CL to appeal to people

I saw quirks at first; whether that was just part of the learning process, I do wonder.

Last, there are many things we cannot do in Julia, like the Next browser :) https://github.com/next-browser/next


I have doubts. In particular, from my vantage, what common lisp lacked was a heavy sponsor from the big players in industry. Or academia, even.

What it has is a ton of people who are more than willing to voice what they don't like about it, which most languages have. It just doesn't have folks also posting good blogs about it often.


> what common lisp lacked was a heavy sponsor from the big players in industry. Or academia, even.

I believe CL can thrive even without sponsors; most of the new languages in town don't have sponsors (like Julia, Nim, and Ruby, for example).

> a ton of people that are more than willing to voice what they don't like about it.

The CL community can ignore people who just whine about the ((parens)). What the CL community shouldn't ignore is constructive criticism from within.

(For the record, I use CL as my side-project language, and use Clojure as my main language.)

> Just doesn't have folks also posting good blogs about it often.

IMHO, the CL community is too small to have much. The percentage of people who blog about CL is quite high; but the denominator is too small.


My doubts were meant specifically against your points.

CLOS is often lauded as one of the best object systems. To see it as one of the complaints feels odd.

Cruft in the standard library? Examples would be good here, but this is hard to get behind. Nobody would be really sad to see Java's Date class leave. That said, many of us are happy we don't have to touch some code that is working fine with it.

And this is next to a demand for a bigger standard library. Which will only lead to more cruft.

Not sure what you mean about the compatibility layer. Care to expand?

My point was that if CL had the press of any of the alternatives, that would outweigh the negatives you gave. Instead of chasing the removal of stuff, marketing would make more of a difference.


> CLOS is often lauded as one of the best object systems. To see it as one of the complaints feels odd.

CLOS is awesome, better than the rest. But there are some weird seams between types and classes. And in places CLOS doesn't go far enough; e.g. it'd be nice to specialise methods based on element types.
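For contrast, Julia's parametric types allow exactly that kind of element-type specialization (a minimal sketch with made-up names):

    describe(v::AbstractVector{<:Integer})       = "a vector of integers"
    describe(v::AbstractVector{<:AbstractFloat}) = "a vector of floats"

    describe([1, 2, 3])     # "a vector of integers"
    describe([1.0, 2.0])    # "a vector of floats"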


But this criticism wants a new system. Not just fixing some areas. That sounds off.

I'd be interested in seeing some of the extensions you have in mind. I've only dabbled in CLOS, and I have to confess structures fit my case way better.


> most of the new languages in town don't have sponsors (like Julia, Nim, and Ruby, for example)

I thought Julia came out of MIT research (financed with various research money) and has now moved to Julia Computing, Inc.

https://economictimes.indiatimes.com/small-biz/startups/why-...

The employees of Julia Computing, Inc. need to be paid somehow... ;-) Looks like they have a good start:

https://juliacomputing.com/about-us


Maybe it lacks one now, but it DID have heavy sponsors (Carnegie Mellon University and NASA are two that come to mind). The implementations we have now are based on this work.


I would have said they were involved, but they didn't market it. So, sponsor isn't right. I'm specifically thinking how Sun pushed Java.

That said, I concede I could just be wrong here on the history. :(


Original Lisp vendors (Lucid, Symbolics, TI, LMI, Xerox, Franz, Harlequin, Goldhill, ...) and other companies selling CL (HP, DEC, IBM, Apple, SUN, Tektronix, ...) did market it, but that ended in the mid 90s with many of them exiting the business.


I don't want to ruin the language just so that it appeals to some people, it's already fine for me.


> I don't want to ruin the language just so that it appeals to some people, it's already fine for me.

To an outsider his comments seem pretty reasonable. Why would they ruin Common Lisp?


* There are larger incompatibilities between implementations in other popular languages, like C, C++ and Java.

* It's easy to suggest designing an OO system better than CLOS. It's harder to point out any examples tho.

* Remove unused functions, what?

* "Make FOO standard" but a bunch of popular languages don't even have a spec, why is this an issue even?


That other languages are worse than Lisp is no reason not to improve.

I didn't read him as suggesting that CLOS be replaced, but that it be embraced.

I have no idea what he meant about unused functions. My experience has been that when I think something in Common Lisp is pointless … I'm wrong.

One of the great advantages of Lisp in the 90s was that it standardised things that other languages left undefined (indeed, this is still an advantage Common Lisp has over Scheme). It seems like a good idea to keep on expanding the standardised area, as we gain experience & understanding.


All good points, but this discussion appears to be about possible reasons people migrate to other languages, and I just can't see how addressing these (rather arbitrary) points would reverse that.

Some people just don't like CL much, despite solid foundation and steady progress with community projects. That's okay, we all have one or another language we are not very fond of.


> ruin the language just so that it appeals to some people

May I ask why this can be considered ‘ruin’ing? These sort of changes will be useful for all CL users IMO.

> it's already fine for me

Maybe to you, but not for me or many, many people who went to Clojure apparently.


I don't mean this in a hostile way, but seriously, just go, then. If Clojure is best for you, vaya con dios :)


Okay, maybe I should clarify. I'm not a Clojure-lover; I use CL as my side-project language and I love CL. I was just suggesting ways that CL could be a more appealing language and reach more adoption (as CL has strengths of its own that Clojure can't match).


Speaking as another Lisper, I am glad that you're sticking with Lisp, not moving to Clojure.

… and maybe someday you can propose your own project, something like CL21!


That's exactly the way I feel about it :)


Don't forget a package installer/dependency manager as in NPM/pip/CPAN. Nowadays I cannot look at a programming language without one of those... it is just too painful to look for libraries, download them, set them up in a separate folder, etc.


There is quicklisp [1], works fine for me.

[1] https://www.quicklisp.org/beta/


I thought that it's not official, and that it was largely criticized due to the lack of test cases, among other problems.


It's fine and you'd know that if you ever used it.

I guess there are some metrics by which you could criticise it, but not if NPM or CPAN are your yardsticks.


Quicklisp is great (and better than npm/pip) because it works with distributions, aka releases. It's more stable, I never had dependencies of dependencies that break because they were not pinned carefully. Upgrade when you want. If you want to do otherwise, see Qlot and Ultralisp (a distribution that builds every 5 minutes).


Seems like this:

    You can of course branch on the array element types and maybe even 
    paper over the whole mess with sufficient macrology (which is what 
    LLA ended up doing), but this approach is not very extensible, as 
    eventually you end up hardcoding a few special types for which your 
    functions will be "fast", otherwise they have to fall back to a 
    generic, boxed type. 
is exactly what numpy/scipy does (s/macrology/C functions/), but they are successful. Granted, Julia has more ground-up support for these things, but it doesn't seem like an impediment to scientific computing per se, provided a sufficient user base.
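For contrast, a minimal sketch of that ground-up support in Julia: one generic definition, and the compiler emits specialized native code per concrete element type, with no manual branching or macrology:

    function mysum(v::AbstractVector{T}) where {T<:Number}
        s = zero(T)
        for x in v
            s += x
        end
        return s
    end

    mysum([1, 2, 3])        # compiled specialized for Vector{Int}
    mysum([1.0, 2.0, 3.0])  # compiled specialized for Vector{Float64}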


This is now a little out of date given 1.0+ has been released. Still a decent article though.


I was thinking about this the other day: how can I expose my Common Lisp code to Julia?

1. You could take the Clasp path by exposing CL to C++ (https://youtu.be/8X69_42Mj-g). This means taking / writing a CL in Julia.

2. Julia has an AST representation written in femtolisp, so it would be an interesting target for CL, or for some glue middle language like Shen (http://www.shenlanguage.org/, https://youtu.be/lMcRBdSdO_U).


> The standard does not guarantee that this gives you an array of double-float: it may (if the implementation provides them), otherwise you get an array of element type T. This turned out to be a major difficulty for implementing portable scientific code in Common Lisp.

Well, so? Just use a single CL implementation that does give you the guarantee.

After all you are now switching to use a single Julia implementation (there's no other anyway).

If portability is a concern, at least with CL you can always port 95% of your code to several implementations, and then write special code for those kinds of incompatibilities. With Julia, one platform is all you get.

So how's that better, from the point of this complaint?


For numeric work in CL, one might like Numcl, a clone of Numpy: https://github.com/numcl/numcl


I don't know why no one seems to think about symbolic programming, which was the original killer feature of Lisp and one of the main reasons for having macros.

In math-oriented programming this should be one of the most powerful features and skills for simplifying and modifying calculations. Sure, you can implement that with macros and ASTs, but that is not at the level of s-expressions.

Everything else is no more important than any other competition among DSLs.


That was the strength of Lisp as the original 'AI' language, but the following AI summers were always more numerically oriented (neural networks, Bayesian inference, SVMs, neural networks part 2), so that particular strength was not much needed. And in the end, the most important killer feature of an 'AI' language turned out to be being approachable and easy to use, not "power", which is why Julia goes with "Looks like Python, feels like Lisp, runs like Fortran".

And like you said, even without s-expr you can do it with macros (especially in a homoiconic language like Julia), for example:

https://github.com/chakravala/Reduce.jl

https://github.com/korsbo/Latexify.jl


Somehow you moved the discussion back to AI comparisons and macros again, instead of the point of solving/simplifying math problems by using abstract symbols and grammars.

Although Julia can be given a homoiconic representation, no one sane can believe that Julia is homoiconic. Unless such a person is so optimistic that they could believe things such as Europe-Asia-Africa being one big giant island.


There seems to be a market for symbolic maths, too.

See: http://wolfram.com/


In my experience doing lots of computational science, the analytical/symbolic part of the problem solving is typically done separately, as it really only needs to be done once, and (mostly) by hand so that you understand the problem and solution better.

Then, once you've gotten the problem and solution framed the way you like, you start implementing the numerical solution of your problem.

There are a few exceptions, particularly when things do get hairy. For example, one of my colleagues has created a radial basis function finite difference solver [1]. In some of the modules, it manipulates the RBFs using SymPy and then compiles that to C, which is then executed by Cython as part of the solution process.

[1]: https://github.com/treverhines/RBF/


How is it that the "easy things" are done by humans, often on paper, while the "hairy" problems are done using a computer? Why not let the computer do all the simplifications and precalculations and precomputations?

I am an outsider and maybe I see things a bit differently, but if I had to write a decimal number I would prefer to keep sqrt(2) as-is in my code rather than writing an approximately equivalent float. And I see formula simplification and analysis the same way... work to feed the compiler.


Generally it seems people prefer to keep symbolic and numerical computation separate. Perhaps it's due to the cultural schism between pure and applied mathematics, or perhaps it's something else.

Having said that, I think people tend to prefer to work out their final equations using symbolic software before transcribing the functions into a numerical system. Maybe it's too hard to reason about what's going on if your machine is both deriving the equations and computing the numerical result in one go?


We can do this in Julia quite well. ModelingToolkit.jl for example takes in numerical Julia functions and symbolically calculates and builds the function for computing the Jacobian. Example:

https://github.com/JuliaDiffEq/ModelingToolkit.jl/pull/155

The JuliaDiffEq team will be pushing symbolics for automatic transformations and automated model order reductions over the next few years. It's already working really well.


I'm going to speculate wildly and suggest that most math-oriented code nowadays is not analytical, but numerical, where the advantage of having a symbolic system is limited.


As far as I know, parts of the Julia parser are still written in a Lisp variant called femtolisp.

Also, one can run femtolisp by executing:

    julia --lisp



Off topic, but:

Why keep reinventing the wheel when there's an ML language family already? Why do people keep giving up these juicy Hindley-Milner-ish type systems and these brief and concise equations for anything? It just doesn't make sense to me.


In both SML and OCaml writing a function that's polymorphic over numeric types is a huge pain in the ass, so I'm not sure it's a good choice for numeric programming. (Unless you count Haskell as a member of the ML family?) Though it could be that generics aren't that important here, and I'm just a weirdo who liked to do things like implementing a dual number type to get forward-mode automatic differentiation of standard functions for free...
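In Julia, for comparison, that trick is short because operators are generic functions one can extend (a minimal sketch, not a full AD implementation):

    # Carry (value, derivative) pairs through arithmetic:
    struct Dual <: Number
        val::Float64
        der::Float64
    end

    Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
    Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)

    f(x) = x * x + x
    f(Dual(3.0, 1.0))   # Dual(12.0, 7.0), i.e. f(3) == 12 and f'(3) == 7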


> In both SML and OCaml writing a function that's polymorphic over numeric types is a huge pain in the ass, so I'm not sure it's a good choice for numeric programming.

What? In numeric programming we usually want the opposite: to write a function for a specific type, and to make it run fast.

Also, in OCaml you can simply make your computation a parametric module of some algebraic structure, something like

    module type Ring = sig
       type t
       val zero : t
       val one : t
       val (+) : t -> t -> t
       ...
    end

    module Numeric_staff (N : Ring) = struct
       ...
And have a completely static numeric computation without any dynamic dispatch. That resembles C++ templates.

You also can use classes if you want dynamic dispatch

    let (+) x y = x#add y


Haskell is its direct descendant, so of course it counts. It has all the traits required to be called that, and I was implicitly talking towards it in the initial comment while still respecting the family. And as far as I know, OCaml has generics, so I'm not exactly sure what you mean here.


I haven't ever written any OCaml, but I thought it has separate operators for ints and floats. I've written some SML, and it has some special hacks to allow overloading on built-in operators. While I do understand that ML-style modules are theoretically more powerful than Haskell's typeclasses, they always struck me as much clunkier for everyday use (that's why I personally consider the ML family and the Haskell family separate; the approach to ad-hoc polymorphism is an important difference in my opinion).


> they always struck me as much clunkier for everyday use

They are simply explicit. What's so clunky about modular implicits, for example?

Personally, I prefer explicit modules and hate typeclasses, because I can't look at the code and say if this operation is a primitive operator or some dynamically dispatched class method.

Besides, modules are way more powerful, and have a way broader use. They can encapsulate state, types, can be parametric etc.


Each language will have different design priorities. For example auto-currying is great but it wouldn't make sense in a multiple dispatch language since you have to evaluate every argument to dispatch (which makes n-arity functions necessarily first class, not a composition of 1-arity functions). Another example is that Julia's type system exists to make it more dynamic (while maintaining high performance), while ML languages focus on static properties.

But previous designs are great for inspiration (like you say, Julia takes a lot from them, like sum and product types, subtyping, and parametric polymorphism), in the same way it also borrows from Lisp (CLOS). If you're curious how the language ended up this way, you can probably find out in the literature/issues/discussions, for example:

https://arxiv.org/abs/1808.03370


> auto-currying is great but it wouldn't make sense in a multiple dispatch language since you have to evaluate every argument to dispatch (which makes n-arity functions necessarily first class, not a composition of 1-arity functions).

I have a question on this. I'll note up front that this is not an area of expertise for me.

It seems like knowledge of the types is/could be embedded in the process of currying. Multiple dispatch of a 2-ary function depends on the type of arg1 and arg2. If I call this with just the first arg, then the curried function "has knowledge" of the type of arg1. So now I have a 1-ary function that can do single dispatch on the type of its argument.

This seems possible to me to inductively extend to n-ary functions, since the process of consuming one argument by currying embeds knowledge of the type of that arg in the newly created function of n-1 arity.

Am I way off base here?


I'm very far from an expert as well, but as I see it, you could have automatic currying in the same way that I can write x -> f(x, 10) in Julia (manual currying using an anonymous function), but automatically for every function. But the language would have to disallow using them for dispatch purposes:

    f :: (Int -> Double) -> Int -> Double

That would have to dispatch to

    f(::Function(::Int, ::Double), ::Int)::Double

Now the dispatch would have to be recursive (it would have to apply to each argument and the arguments' arguments to evaluate which is more specialized), making finding the most specialized version of a function even more chaotic. I'm trying not to think about what dispatching on return types means for the compiler, though (and that's not necessary for the auto-currying).

And because of the currying, if you apply the first argument you'd have a (g :: Int -> Double) which has the memory of receiving an (Int -> Double). But now, when trying to dispatch on another function with that partial, will it dispatch as if it were a normal Int -> Double, or will the "knowledge" of having received a previous argument change which method it picks?

To be honest I have no idea, it's probably possible but it feels like it would be something very different from Julia.
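For reference, the manual version mentioned above, plus the helper Base actually ships for fixing one argument (a sketch):

    f(x::Int, y::Float64) = x + y

    g = y -> f(2, y)      # close over the first argument by hand
    g(3.5)                # 5.5

    h = Base.Fix1(f, 2)   # Base's partial-application helper
    h(3.5)                # 5.5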


> which makes n-arity functions necessarily first class

Excuse me, but tuples. You still have to observe them to evaluate them, but that's all you need to simulate n-arity in the way you want.

> more dynamic

Well, Hindley-Milner is how you do type-safe stuff with inference that annotates for you. For any generic purpose you just put a typevar with constraints, so I don't see a problem here and can't really tell what you gain. MLs are more about proper composition.


Do you know of an ML language that's widely used in scientific computing? Julia was made to compete with R, Python, C++, and Fortran, combining high performance with ease of use and general computing.


That is exactly the thing that baffles me most. Procedural languages that are inherently distanced from math are used for math for whatever reason. That's basically what I said in an earlier comment.


Mathematica and Maxima are much more functional, for what it's worth. Mathematica is pretty much a redressed Lisp (much lispier than Julia), and Maxima is not even redressed (it uses s-expressions).

>Procedural languages that are inherently distanced from math are used for math for whatever reason.

I think that historically it wasn't easy to make an efficient functional language. Numerical languages all trace back to Fortran after all (which is still alive and well). Nowadays, I think it's mostly because of inertia, and researchers having better things to do than learning a completely different programming language.


In Julia, the math part is mostly functional. Structs/Data are immutable by default (which cover most basic numeric types) and while arrays are mutable it's very common to write operations in the form of broadcasting (vectorized math) which will not mutate (and you have all higher order functions). In general most languages are this way to some extent (I don't know languages with mutable base numbers or math operators, outside of stuff like +=).
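A small sketch of that style:

    v = [1.0, 2.0, 3.0]
    w = sin.(v) .+ 2 .* v   # elementwise, allocates a new array; v is untouched
    v .= 0.0                # mutation is explicit opt-in, via .=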

But mutability is also a great tool to have, for example for efficient dataframe handling and neural network weights.


Julia's type system is based on subtyping.


OK, but I'm talking about ones based on composition.


I think Edi uses Julia too, now


In 2018 he published a mathematics book with examples in Python.



