I recently became unemployed and am looking to get back into contracting (Scheme, Clojure, Haskell), so that I can work part time and have enough time to work on this.
I'm somewhat hoping to modify the Julia compiler/runtime to add first-class continuation support (I have implemented it in the interpreter I wrote in C++, so I do have some experience with that). I'd also like optional TCO; that shouldn't be hard (LLVM supports it, so it should just be a case of the compiler annotating calls accordingly).
If someone would like to join the effort, please tell me.
"This is what I do for fun, because it is the exact opposite of the kind of thing people will pay for: an obscure implementation of a programming language everybody hates."
Also, Julia doesn't have a numpy clone; the native n-dimensional arrays are fast enough that it's not necessary. That means everything supports them, the very opposite of a Lisp-curse scenario in which everyone could have ended up making their own incompatible version (and thankfully the dataframe packages, which are not part of the language, all comply with a single Tables.jl interface, so you can switch between them).
Julia 1.0 was released a little over a year ago at JuliaCon 2018. At this year's JuliaCon, there was, imo, an even more positive "vibe" than in past years, which were already full of excitement. My guess is that it may be because everyone has been really happy to have a year of not breaking things and building cool new stuff on top of the solid foundation of the Julia 1.x series.
There is something satisfying about writing code in a high-performance language: you just know you won't be kicking yourself because it would have run 10x faster in another language.
To call, say, numpy's FFT, you just do:
using PyCall
np = pyimport("numpy")
res = np.fft.fft(rand(ComplexF64, 10))
No casting back and forth. This is a toy example; Julia of course has FFTW bindings.
The ecosystem is smaller, but the average package quality is higher. It's also easier to contribute to packages: in Python, you'd sometimes just end up writing C, while Julia packages are generally pure Julia.
res = py.numpy.fft.fft(rand(10,1))
I'm pretty sure some old Lispers would come and say, 'Common Lisp's stability is a great feature', 'There are lots of new dialects', and while they are right in some respects, Common Lisp isn't appealing to newcomers these days. (I mean, even C and C++ are improving. Why shouldn't CL?)
Even for people who 'get' the ideas of s-exprs and macros, there are too many quirks in CL for it to appeal to people, and most go to Clojure instead (where, e.g., the interactive experience is at best moderate and debugging support/error handling is a mess).
I believe the hypothetical CL++ should at least:
* Straighten out incompatibilities between implementations, especially the numerous different ways to make an image or binary, and the FFI parts.
* Design a better CLOS. CLOS feels like an object system thought of afterwards (well, it is a system that was thought of afterwards) and bolted onto CL with duct tape. I would appreciate a CLOS-based type system (built on macros, much like CLOS itself) that integrates well with CL.
* Remove unused functions, and fix some inconsistencies in the standard library.
* A bigger, much much bigger standard library. Crucial community-standard libraries like asdf & uiop, syntactic sugar like the arrow/diamond macros, and reader macros that allow infix syntax (this currently exists in the standard, but nobody uses it, partly because most Lisp users don't really care about infix and partly because its extensibility is bad) should be in the std.
* A compatibility layer. Providing 'cl' and 'cl-user' packages with the usual CL symbols would be sufficient.
to appeal to the users.
Even this really isn't sufficient for the hypothetical CL++ itself: there should be beginner materials other than PCL; Lispers shouldn't hate infix reader macros or syntactic sugar like LOOP (IMHO these show the real power of macros; after all, macros are just another tool for writing readable code with great abstractions); and beginner materials should use, or at least introduce, them (infix reader macros and other syntactic sugar).
Just a rant about the language after reading the post and becoming sad.
- Get rid of upcasing; 21st-century languages should be case-sensitive and case-preserving. Allegro’s modern mode is the Right Thing™ here IMHO.
- Fix things like order-of-argument inconsistencies in the standard library.
- Consider adding types to streams to indicate whether they are input, output or both, and character, byte or both.
- Base the package system off of Go’s.
- More fully specify pathnames.
- Specify a standard library roughly equivalent to Go’s. At this point I think it’s the gold standard (Python’s used to be, but no longer).
(setf *print-case* :downcase)
I don’t blame them for upcasing back in the 90s, but … it’s not the 90s anymore.
- https://github.com/alex-gutev/generic-cl/ gives generic equality, predicates and other base functions.
- the https://lispcookbook.github.io/cl-cookbook/ has nice topics and is growing.
- there is also http://cl21.org/, which works, but is stagnating.
- to make an image, I think asdf:make is all one needs (see the Cookbook).
- Coalton is bringing ML-style types on top of CL.
and I love CLOS.
> there are too many quirks in CL to appeal to people
I saw quirks at first too. Whether that was just part of the learning process, I do wonder.
Last, there are many things we cannot do in Julia, like the Next browser :) https://github.com/next-browser/next
What it has is a ton of people who are more than willing to voice what they don't like about it, which most languages have. It just doesn't also have folks posting good blogs about it often.
I believe CL can thrive even without sponsors; most of the new languages in town don't have sponsors (Julia, Nim, and Ruby, for example).
> a ton of people that are more than willing to voice what they don't like about it.
The CL community can ignore people that just whine about the ((parens)).
What the CL community shouldn’t ignore is constructive criticism from within.
(For the record, I use CL as my side-project language, and use Clojure as my main language.)
> Just doesn't have folks also posting good blogs about it often.
IMHO, the CL community is too small to have much.
The percentage of people who blog about CL is quite high; but the denominator is too small.
CLOS is often lauded as one of the best object systems. To see it as one of the complaints feels odd.
Cruft in the standard library? Examples would be good here, but this is hard to get behind. Nobody would be really sad to see Java's Date leave; that said, many of us are happy we don't have to touch some code that is working fine with it.
And this is next to a demand for a bigger standard library. Which will only lead to more cruft.
Not sure what you mean about the compatibility layer. Care to expand?
My point was that if CL had the press of any of the alternatives you named, that would outweigh the negatives you gave. Instead of chasing the removal of stuff, marketing would make more of a difference.
CLOS is awesome, better than the rest. But there are some weird seams between types and classes, and in places CLOS doesn’t go far enough; e.g., it’d be nice to specialise methods based on element types.
I'd be interested in seeing some of the extensions you have in mind. I've only dabbled in CLOS, and I have to confess structures fit my use case way better.
I thought Julia came out of MIT research (financed with various research money) and has now moved to Julia Computing, Inc.
The employees of Julia Computing, Inc. need to be paid somehow... ;-) Looks like they have a good start:
That said, I concede I could just be wrong here on the history. :(
To an outsider his comments seem pretty reasonable. Why would they ruin Common Lisp?
* It's easy to suggest designing an OO system better than CLOS. It's harder to point to any examples, though.
* Remove unused functions, what?
* "Make FOO standard" but a bunch of popular languages don't even have a spec, why is this an issue even?
I didn't read him as suggesting that CLOS be replaced, but that it be embraced.
I have no idea what he meant about unused functions. My experience has been that when I think something in Common Lisp is pointless … I'm wrong.
One of the great advantages of Lisp in the 90s was that it standardised things that other languages left undefined (indeed, this is still an advantage Common Lisp has over Scheme). It seems like a good idea to keep on expanding the standardised area, as we gain experience & understanding.
Some people just don't like CL much, despite solid foundation and steady progress with community projects. That's okay, we all have one or another language we are not very fond of.
May I ask why this can be considered ‘ruin’ing? These sort of changes will be useful for all CL users IMO.
> it's already fine for me
Maybe to you, but not for me or many, many people who went to Clojure apparently.
… and maybe someday you can propose your own project, something like CL21!
I guess there are some metrics by which you could criticise it, but not if NPM or CPAN are your yardsticks.
You can of course branch on the array element types and maybe even paper over the whole mess with sufficient macrology (which is what LLA ended up doing), but this approach is not very extensible, as eventually you end up hardcoding a few special types for which your functions will be "fast", otherwise they have to fall back to a generic, boxed type.
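For contrast, here's a minimal Julia sketch (my own illustration, not taken from LLA): one parametric method, which the compiler specializes per concrete element type, with no hand-written fast-path branching and no boxed fallback for types you forgot to list.

```julia
# One generic definition; Julia compiles a specialized version of the
# loop body for each concrete element type T it is called with.
function mysum(xs::AbstractVector{T}) where {T<:Number}
    s = zero(T)
    for x in xs
        s += x
    end
    return s
end

mysum([1.0, 2.0, 3.0])   # specialized for Float64, returns 6.0
mysum(Int32[1, 2, 3])    # specialized for Int32, no extra code needed
```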
I was thinking about this the other day. How can I expose my Common Lisp code to Julia.
This means taking / writing a CL in Julia.
Well, so? Just use a single CL implementation that does give you the guarantee.
After all you are now switching to use a single Julia implementation (there's no other anyway).
If portability is a concern, at least with CL you can always port 95% of your code across several implementations, and then write special code for those kinds of incompatibilities. With Julia, one platform is all you get.
So how's that better, from the point of this complaint?
In math-oriented programming this should be one of the most powerful features and skills for simplifying and transforming calculations. Sure, you can implement that with macros and an AST, but that is not at the level of S-expressions.
Everything else is no more important than any other competition between DSLs.
And like you said, even without s-expr you can do it with macros (especially in a homoiconic language like Julia), for example:
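A minimal sketch of what that looks like (the `swapargs` macro is a toy of my own invention, not a standard one): quoted Julia code is an ordinary `Expr` tree you can inspect and rewrite, even though the surface syntax isn't s-expressions.

```julia
ex = :(a + b * c)   # quoting yields an Expr tree: (call + a (call * b c))
ex.head             # :call
ex.args             # Any[:+, :a, :(b * c)]

# Toy macro: swap the two operands of a binary call at macro-expansion time.
macro swapargs(e)
    e.head === :call && length(e.args) == 3 || error("expected a binary call")
    return esc(Expr(:call, e.args[1], e.args[3], e.args[2]))
end

@swapargs(10 - 3)   # rewritten to 3 - 10, i.e. -7
```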
Although Julia can be given a homoiconic representation, no one sane can believe that Julia is homoiconic, unless that person is so optimistic they could believe things such as Europe-Asia-Africa being one big giant island.
Then, once you've gotten the problem and solution framed the way you like, you start implementing the numerical solution of your problem.
There are a few exceptions, particularly when things do get hairy. For example one of my colleagues has created a radial basis function finite difference solver. In some of the modules, it manipulates the RBFs using SymPy and then compiles that to C which is then executed by Cython as part of the solution process.
I am an outsider and maybe I see things a bit differently, but if I had to write a decimal number I would prefer to keep sqrt(2) as-is in my code rather than writing an approximate, almost-equivalent float. And I see formula simplification and analysis the same way: work to feed the compiler.
Having said that, I think people tend to prefer to work out their final equations using symbolic software before transcribing the functions into a numerical system. Maybe it's too hard to reason about what's going on if your machine is both deriving the equations and computing the numerical result in one go?
The JuliaDiffEq team will be pushing symbolics for automatic transformations and automated model order reductions over the next few years. It's already working really well.
Also one can run Femtolisp by executing `julia --lisp`.
Why keep reinventing the wheel when there's an ML language family already? Why do people keep giving up these juicy Hindley–Milner-ish type systems and these brief, concise equations? It just doesn't make sense to me.
What? In numeric programming we usually want the opposite, to write a function for specific type, and to make it run fast.
Also, in OCaml you can simply make your computation a parametric module of some algebraic structure, something like
module type Ring = sig
  type t
  val zero : t
  val one : t
  val ( + ) : t -> t -> t
end

module Numeric_staff (N : Ring) = struct
  (* ... code parameterized over N.t ... *)
end
You can also use classes if you want dynamic dispatch:
let (+) x y = x#add y
They are simply explicit. What's so clunky about modular implicits, for example?
Personally, I prefer explicit modules and hate typeclasses, because I can't look at the code and say if this operation is a primitive operator or some dynamically dispatched class method.
Besides, modules are way more powerful, and have a way broader use. They can encapsulate state, types, can be parametric etc.
But previous designs are great for inspiration (like you say, Julia takes a lot from them, like sum and product types, subtyping, and parametric polymorphism), in the same way it also borrows from Lisp (CLOS). If you're curious how the language ended up this way, you can probably find out in the literature/issues/discussions, for example:
I have a question on this. I'll note up front that this is not an area of expertise for me.
It seems like knowledge of the types is/could be embedded in the process of currying. Multiple dispatch of a 2-ary function depends on the type of arg1 and arg2. If I call this with just the first arg, then the curried function "has knowledge" of the type of arg1. So now I have a 1-ary function that can do single dispatch on the type of its argument.
This seems possible to me to inductively extend to n-ary functions, since the process of consuming one argument by currying embeds knowledge of the type of that arg in the newly created function of n-1 arity.
Am I way off base here?
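For the simple case, the idea can be sketched in today's Julia with a closure (`curry` here is a hypothetical helper of mine, not a language feature): the closure captures the first argument along with its concrete type, so applying it performs the remaining dispatch on the second argument.

```julia
combine(x::Int, y::Int)    = x + y
combine(x::Int, y::String) = string(x, y)

# The closure "remembers" x (and hence its concrete type), so the
# remaining call is effectively single dispatch on y.
curry(f, x) = y -> f(x, y)

g = curry(combine, 2)
g(3)     # dispatches to combine(::Int, ::Int), returns 5
g("a")   # dispatches to combine(::Int, ::String), returns "2a"
```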
f :: (Int -> Double) -> Int -> Double
That would have to dispatch to
f(::Function(::Int, ::Double), ::Int)::Double
Now the dispatch will have to be recursive (it will have to apply to each argument, and to the arguments' arguments, to evaluate which method is more specialized), making finding the most specialized version of a function even more chaotic. I'm trying not to think about what dispatching on return types means for the compiler, though (and that's not necessary for the auto-currying).
And because of the currying, if you apply the first argument you'd have a (g :: Int -> Double) which has the memory of receiving an (Int -> Double). But now, when trying to dispatch on another function with that partial, will it dispatch as if it were a normal Int -> Double, or will the "knowledge" of having received a previous argument change which method it picks?
To be honest I have no idea, it's probably possible but it feels like it would be something very different from Julia.
Excuse me, but: tuples. You still have to observe them to evaluate them, but that's all you need to simulate n-arity in the way you want.
Well, Hindley–Milner is how you do type-safe stuff with inference that annotates for you. For any generic purpose you just put a typevar with constraints, so I don't see a problem here and can't really see what you gain. MLs are more about proper composition.
> Procedural languages that are inherently distanced from math are used for math for whatever reason.
I think that historically it wasn't easy to make an efficient functional language. Numerical languages all trace back to Fortran after all (which is still alive and well). Nowadays, I think it's mostly because of inertia, and researchers having better things to do than learning a completely different programming language.
But mutability is also a great tool to have, for example for efficient dataframe handling and neural network weights.