
JuliaLang: The Ingredients for a Composable Programming Language - mindB
https://white.ucc.asn.au/2020/02/09/whycompositionaljulia.html
======
StefanKarpinski
I think the top-level take away here is not that Julia is a great language
(although it is) and that they should use it for all the things (although
that's not the worst idea), but that its design has hit on _something_ that
has made a major step forwards in terms of our ability to achieve code reuse.
It is actually the case in Julia that you can take generic algorithms that
were written by one person and custom types that were written by other people
and just use them together efficiently and effectively. This majorly raises
the table stakes for code reuse in programming languages. Language designers
should not copy all the features of Julia, but they should at the very least
understand why this works so well, and be able to accomplish this level of
code reuse in future designs.

~~~
pdexter
Interesting. Can you give an example of generic algorithms plus custom types,
in practice? Off the top of my head I thought that any dynamic language or
static language with good generics would have this property, but maybe there's
something that Julia does differently.

~~~
ddragon
As an additional example, I really like the combination of unitful and diffeq
[1]. But you're right that the core feature that allows this stuff is duck
typing; by itself, though, it's not enough. Your notduck not only has to quack,
but other animals have to look at it and treat it like a duck sometimes and
like a notduck when you want it to do more than a duck. Multiple dispatch
(plus parametric subtyping) lets you trivially define both notduck + A
(notduck.+(A) in OOP languages) and A + notduck (extending A, whatever A is),
and it's really fast. That allows for the core Julia concept of
specialization: easily customizing the particular behavior of any agent at any
point to get both the common and the extended behavior right.

For static languages you can implement part of it with, for example,
interfaces (you'll face the same restrictions if the language is single
dispatch), but even if you can extend the interface freely for already
existing objects, there must be an agreement between the multiple packages to
comply with the same interfaces (and you might either end up with tons of
interfaces, since there are tons of possible behaviors for each entity and
purpose, or giant interfaces that try to fit all). In Julia you can use specialization
to surgically smooth the integration between two packages that had no
knowledge of each other and didn't even decide to comply to any (informal)
interface (which do exist in Julia, like the Julia Array and Tables.jl
interfaces).

[1]
[https://tutorials.juliadiffeq.org/html/type_handling/03-unit...](https://tutorials.juliadiffeq.org/html/type_handling/03-unitful.html)
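
As a sketch of that "both directions" point (the `NotDuck` type and its field
are made up purely for illustration):

```julia
# A hypothetical wrapper type, standing in for some package's notduck.
struct NotDuck
    value::Float64
end

# Multiple dispatch lets you define both argument orders directly,
# without editing Float64's or NotDuck's "class":
Base.:+(d::NotDuck, x::Number) = NotDuck(d.value + x)
Base.:+(x::Number, d::NotDuck) = NotDuck(x + d.value)

NotDuck(1.0) + 2.0   # NotDuck(3.0)
2.0 + NotDuck(1.0)   # NotDuck(3.0)
```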

------
UncleOxidant
I really like Julia a lot and actually used it in a work project a few years
back.

However, there's the debugger issue. There are several debugger alternatives,
and it's tough to figure out which one is canonical (or whether any of them
is). The one that seems to be used most at this point is Debugger.jl. However,
it's exceedingly slow if you're debugging sizeable operations (matrix
multiplies, for example) - I'm talking hitting 'n' to step to the next line
and then waiting several minutes for it to get there. There's
also Rebugger.jl, MagneticReadHead.jl (IIRC) and Infiltrator.jl among others.
I finally found that Infiltrator.jl was a lot faster for that machine learning
program I was trying to debug, but it's rather limited in features (the only
way to set breakpoints it seems is by editing your source, for example).

And this isn't the only case where there are multiple packages for achieving
some task and you're not quite sure which one is the one that's the most
usable. I think what the Julia community needs to do is maybe add some kind of
rating system for packages so you can see which packages have the highest
rating.

~~~
nwvg_7257
Couple comments on the Debugger situation:

1\. Debugger.jl is A LOT smoother if you run it in compiled mode, which is a
checkbox in the Juno interface. I've found that stepping to next line is
instant in compiled mode, but takes forever without it.

2\. Infiltrator.jl is great at what it's designed for, which is to dump you in
a REPL deep within a call stack and let you see what's going on. But, Debugger
in compiled mode also does this well.

~~~
UncleOxidant
> 1\. Debugger.jl is A LOT smoother if you run it in compiled mode, which is a
> checkbox in the Juno interface. I've found that stepping to next line is
> instant in compiled mode, but takes forever without it.

Is there a way to do this if I'm not running Juno? I'd guess there must be
some parameter that can be passed to @enter or @run?

~~~
Sukera
Sure! 'C' in the debug REPL enters compiled mode.

[https://github.com/JuliaDebug/Debugger.jl#compiled-
mode](https://github.com/JuliaDebug/Debugger.jl#compiled-mode)
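
For anyone following along outside Juno, a minimal session might look like
this (`f` is a made-up stand-in for a heavier function):

```julia
using Debugger

f(x) = x * x'   # toy function standing in for a sizeable matrix operation

@enter f(rand(100, 100))
# At the 1|debug> prompt:
#   C   toggles compiled mode (stepped-over code runs at full, compiled
#       speed, so breakpoints inside it won't trigger)
#   n   steps to the next line
```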

~~~
UncleOxidant
I guess what I don't understand is this part (regarding the compiled mode):

"The drawback is of course that breakpoints in code that is stepped over are
missed."

what exactly does that mean?

~~~
StefanKarpinski
It means that if you would have hit a breakpoint in code that is run in
compiled mode, the breakpoint doesn't trigger—because it's being run normally
at full speed without breakpoints, not being interpreted in the debugger
(which knows about breakpoints).

~~~
UncleOxidant
I read this as in compiled mode breakpoints don't trigger at all. Is that
correct?

Edit: Ok, I tried out compiled mode and it does stop at my breakpoint. The
verbiage in the documentation is a bit difficult to understand on this point.
I'd guess you need to set your breakpoints before going into compiled mode?

~~~
mkborregaard
Learning a lot from this thread, seems really useful

------
yahyaheee
There are parts of Julia I really like but it has some problems.

* Multiple dispatch is an odd design pattern that seems to overcomplicate things. I know there are people that love it and claim it’s better, but after working with it for some time I just want a struct with methods. It’s much easier to reason about.

* The packaging is cumbersome. I understand their reasoning behind it but in practice it’s just more of a pain than denoting functions as public one way or another.

* The tooling is poor. I work with Go for my day job and its toolchain is an absolute pleasure. The Julia toolchain isn’t in the same arena.

* The JIT is slowwww to boot. I was amazed the first time running a Julia program how slow it was. You could practically compile it faster.

* Editor support has never been great.

* As others have mentioned type checks don’t go deep enough

I think it has some neat ideas and in certain scientific arenas it will be
useful, but IMO they need to focus a bit more on making it a better general
purpose language.

~~~
ViralBShah
While I don't agree with many of these points, I do agree that some of these
can be substantially improved. We continue to work hard at it. Some are
research problems, while others need elbow grease.

Just to present the other side, here's a recent thread on Julia discourse
about why people love Julia. Many chiming in there are recent users of Julia
and I think it is insightful.

[https://discourse.julialang.org/t/in-as-few-lines-as-
possibl...](https://discourse.julialang.org/t/in-as-few-lines-as-possible-
describe-why-you-love-julia/33179/9)

One that I particularly enjoyed reading about:

[https://discourse.julialang.org/t/in-as-few-lines-as-
possibl...](https://discourse.julialang.org/t/in-as-few-lines-as-possible-
describe-why-you-love-julia/33179/24?u=viralbshah)

~~~
yahyaheee
Yeah, I really wanted to like Julia, and many of the parts I like about
it are mentioned in this thread. I think it's apparent we need a better
numeric language than Python; I just wish Julia would focus a bit more on
utility.

------
fishmaster
I've been using Julia along with Python and PyTorch - not yet for machine
learning (until Flux is more mature), but for NLP scripts - and I have to say
that I'm starting to like it. Multiple dispatch, linear algebra and numpy-like
arrays built in, a dynamic language but with optional types, user-defined
types, etc.

------
xvilka
I hope Julia will become more popular in bioinformatics. Personally, I have
high hopes for BioJulia[1][2][3] and the amazing AI framework FluxML[4][5] +
Turing.jl[6][7]. Apart from the speed, they offer some interesting concepts
too - I recommend checking them out.

[1] [https://biojulia.net/](https://biojulia.net/)

[2] [https://github.com/BioJulia](https://github.com/BioJulia)

[3] [https://github.com/BioJulia](https://github.com/BioJulia)

[4] [https://fluxml.ai/](https://fluxml.ai/)

[5] [https://github.com/FluxML/](https://github.com/FluxML/)

[6] [https://turing.ml/dev/](https://turing.ml/dev/)

[7] [https://github.com/TuringLang](https://github.com/TuringLang)

~~~
CreRecombinase
How much of the BioJulia stuff would you say currently works? It looks like a
lot of repos have been created, and the scope is pretty impressive (looks like
there are repos for everything from structural bioinformatics to population
genetics), but a lot of them look to be basically empty
([https://github.com/BioJulia/PopGen.jl](https://github.com/BioJulia/PopGen.jl))
or have really scary-looking issues (e.g.
[https://github.com/BioJulia/GeneticVariation.jl/issues/25](https://github.com/BioJulia/GeneticVariation.jl/issues/25)).

~~~
kescobo
% of the repos in the org on github? That number is lower than I'd like. % of
the repos that are actively maintained? Much higher.

One of the great things about julia is that it's really easy to throw together
a package and register it. One of the bad things about julia is how easy it is
for those one-off projects or idea dumps to pollute the space. We could
definitely do a better job labeling the repos that are no longer being
maintained or that aren't actually ready for prime time. There's a tradition
in julia of a lot of really functional libraries staying < v1.0, because we
all take semver seriously, and if the interface is still in a bit of flux,
making the switch to 1.0 is a big deal (DataFrames.jl, looking at you). But it
does make it hard for new users to distinguish between a super robust package
and someone's weekend hobby.

------
smabie
Julia is great. It’s significantly simpler than Python while also being much
more expressive. It’s too bad the typing is only for dispatch, but hopefully
someone will write a typechecker someday. I’ve found it surprisingly
refreshing to not have to think about classes and just add functions on
objects wherever I want. Some languages solve this with monkey patching (which
is bad), others like Scala with an extension class (reasonable, but you still
don’t get access to private properties), but the Julia approach is cleaner.

I wouldn’t use Julia for a non-scientific computing app as I don’t think it’s
suitable, but for anything data science related, it’s great! And with the
Python interop, I don’t really think there’s any reason _not_ to use Julia for
your next data science project. I suspect that over the next 5 years Python
will no longer be used for these applications at all.

~~~
FranzFerdiNaN
Python will still be used 20 years from now. The clear advantage of Python is
the enormous ecosystem that is available, the millions of questions on SO
giving solutions to every problem you can run into, the books and learning
materials etc, programmers and corporations having invested loads of time and
effort in building, maintaining and battle-testing libraries.

Don't get me wrong, I think Julia is an amazing language, but being an amazing
language is neither necessary nor sufficient to succeed. R shows how you can
succeed just fine with a kinda weird language.

~~~
awb
> Python will still be used 20 years from now

So much technology has come and gone since 2000. If innovation continues to
accelerate I'd be hesitant to predict what 2040 will look like.

Programming might become so simple and automated that we won't need
StackOverflow to the extent we do today.

It's really hard to predict the next year or two let alone the next 20.

~~~
goatlover
Programming languages from 2000 remain, though. C, C++, Java, JS, Python, etc.
Even Fortran, Cobol and Lisp remain in use. There have been attempts since the
80s to popularize visual and higher-level approaches to programming, but
the traditional languages still dominate. And the newer ones like Go, Elixir
and Julia are like the traditional C, Lisp and Fortran.

~~~
awb
Sure, some do, but past adoption is not a guarantee of future adoption. 20
years in the future is a long time. Just look at Flash. Who knows, maybe a
Quantum OS will dominate that doesn't support Python.

So yes, if someone will still be using Python in 20 years, I'm sure you're
right. But Python just as easily could be relegated to a niche domain while
other languages take over a broader range of applications.

Especially if someone's able to build a programming translator where you can
easily port your code base from one language to another.

~~~
goatlover
I'm more responding to the idea that programming might become simple and
automated, and we can't even predict technological changes a year or two out.
You're right that Python could be relegated to a niche language in 20 years.
That does happen.

What hasn't happened is a fundamental change to programming languages since
the 1960s, despite the incredible increase in computing power, ubiquity of
computers and much better tooling and environments for programming languages.
There's no evidence that this is going to change anytime soon. All the popular
new languages are similar to the popular older languages. If people don't want
to program in JS, they transpile from TypeScript. Even WebAssembly is a means
to use a language like Rust on the web.

There's no simple, automated PL on the horizon that is going to replace JS,
Java, Python, etc. There isn't one for spreadsheets, either, which is a
technology from the 80s.

~~~
JadeNB
> What hasn't happened is a fundamental change to programming languages since
> the 1960s

I think the rise of HLLs is an example of a fundamental change. It was still
reasonable to be hand-writing assembler for lots of applications in the '60s.

~~~
goatlover
Sure, but HLLs are decades old and ubiquitous by the 80s. There's no reason to
think PLs are going to fundamentally change in 20 years. They could, but it's
just speculation at this point.

~~~
edw
The argument over whether C was fast enough to obviate the need to write in
assembly raged across the pages of Dr Dobb's Journal and Byte Magazine well
into the mid '80s.

~~~
pjmlp
Which while true for home micros, was already a proven fact since the early
60's in the big warehouse mainframes, with the Algol derived systems
programming languages.

------
tgflynn
Julia is a language I really wanted to like, and to a certain extent I do.
However after spending some time working with it and hanging out on the
Discourse channels (they certainly have a very friendly and open community,
which I think is a big plus), I've come to the tentative conclusion that its
application domains are going to be more limited than I would have hoped.

This article hits on some of the issues that the community tends to see as
advantages but that I think will prove limiting in the long run.

> Missing features like:

> Weak conventions about namespace pollution

> Never got around to making it easy to use local modules, outside of packages

> A type system that can’t be used to check correctness

These are some of my biggest gripes about Julia, especially the last two. To
these I would add:

* Lack of support for formally defined interfaces.

* Lack of support for implementation inheritance.

Together with Julia's many strengths I think these design choices and
community philosophy lead to a language that is very good for small scale and
experimental work but will have major issues scaling to very complex systems
development projects and will be ill-suited to mission critical applications.

In short I think Julia may be a great language for prototyping an object
detection algorithm, but I wouldn't want to use it to develop the control
system for a self-driving car.

Unfortunately this means that Julia probably isn't really going to solve the
"2 language problem" because in most cases you're still going to need to
rewrite your prototypes in a different language just like you would previously
in going from, for example, a Matlab prototype to a C++ system in production.

~~~
short_sells_poo
You touch upon some interesting pain points. I really like Julia and working
with it is a pleasure.

Except the Module system, which feels unnecessarily arcane. I'm happy to be
educated on why, but it seems to successfully combine the awkwardness of
C-style #include with the mess of a free-form module system. The end result is
a Frankenstein monster where technically everything is possible, everything
could be included anywhere, there are no boundaries or even conventions. It
makes for a frustrating experience for a newbie.

Say you have a package, and inside is a file called `xyz.jl`. You open the
file and it defines a module called Xyz. But this tells you absolutely nothing
about where in the package Xyz will appear. It could be included somewhere
deep in the hierarchy, or it could be a submodule. It could be included
multiple times in multiple places! That's bad design for sure, but the
language places no boundaries on you. You open another file `abc.jl`, and see
no modules at all, just a bunch of functions, which in turn call other
functions that are defined God knows where. A julia file does not have to
contain any information about where the symbols it's using come from, since it
will be just pasted in verbatim to some location somewhere.

The whole module system feels like one big spaghetti of spooky action at a
distance.

It's a shame too, because the rest of the language is very neat. Once one gets
over the hurdle of the modules, it is possible to establish conventions to
bring some sanity in there, but it's a hurdle that many people will probably
not want to deal with.

~~~
improbable22
How would you like modules to work?

It seems great to me that paths & source files are mostly irrelevant, you're
free to re-organise without changing anything. And that `using Xyz` is always
talking to the package manager. You can make sub-modules and say `using .Xyz`,
but there's very little need to do so, and few packages do.

You can shoot yourself in the foot by including source twice, as you can by
generating it with a macro, or simply copy-pasting wrong.

~~~
Seanny123
I mean, I'd like them to work like Python, Ruby and TypeScript, but you're
right to say I can't describe _why_ I want this.

Is there some guide I could read about structuring a large Julia project? It
was pretty easy to intuit with Python, wherein I would put related files in a
folder. But with Julia, everything is everywhere and I'm baffled.

~~~
short_sells_poo
> But with Julia, everything is everywhere and I'm baffled.

This is exactly it. Julia allows you to import and include anything, anywhere.
You open a file and it doesn't say anything about where the dependencies are
coming from and where this particular piece of code will go. Both of those are
defined at the place where this file is included, which itself could be
anywhere. It could be a different directory, different file, tucked away in a
module. It could be in a dozen other files, or no files at all, and you can't
tell from looking at just the source of the file.

~~~
oxinabox
In practice most packages:

1\. Never use submodules (they don't add much)

2\. Only use `include` within the main `src/MyPackage.jl` file

------
classified
Is it possible yet to compile ahead-of-time to a stand-alone binary
executable?

~~~
cmcaine
The other replies are slightly out of date and imprecise.

PackageCompilerX replaced PackageCompiler (and there's a PR open that will
pull all the X work in soon).

The binaries produced bundle the whole Julia sysimage by default and they're
quite big, but they are quite fast!

A more traditional static compilation approach is being tried with
StaticCompiler.jl by tshort, but it's in early development.

~~~
cosmojg
> A more traditional static compilation approach is being tried with
> StaticCompiler.jl by tshort, but it's in early development.

The moment this ships, there will no longer be any reason to use any other
garbage-collected language.

~~~
eigenspace
What does this have to do with garbage collection?

If your concern is about latency in real-time systems, garbage collection
isn't the problem; heap-allocating memory is. Julia makes it easy to never
allocate anything on the heap (unlike most garbage-collected languages).

Check out this talk that discusses this in detail in the context of robotics:
[https://www.youtube.com/watch?v=dmWQtI3DFFo](https://www.youtube.com/watch?v=dmWQtI3DFFo)

~~~
snicker7
I think the above poster is referring to languages like
Go/Java/C#/Nim/whatever. Like Julia, these languages have a significant
runtime. Unlike Julia, their apps can be compiled AOT into standalone
executables.

~~~
cosmojg
My point exactly! Thanks for clarifying.

~~~
eigenspace
Right, but you said

> there will no longer be any reason to use any other _garbage-collected
> language_.

(emphasis mine). I was just curious as to why you specified garbage collected
there.

~~~
cosmojg
I equate garbage collection with larger static binaries. Julia's static
binaries are especially large since they package the entire sysimage. I'm
excited about StaticCompiler.jl because, by cutting out the sysimage, it
promises to make Julia more competitive with other garbage-collected languages
like Nim, Go, Crystal, etc., all of which produce relatively small static
binaries.

However, like these other garbage-collected languages, Julia is unlikely to
ever compete with the likes of Zig, Rust, C++, etc. which produce _even
smaller_ static binaries since they don't have to ship a GC runtime. That's
why I specified garbage-collected languages as, in my opinion, the addition of
proper static compilation will allow Julia to completely supersede other
garbage-collected general programming languages but not manually managed
systems programming languages.

~~~
eigenspace
I see, thanks for the explanation!

I think with StaticCompiler.jl, if you create a binary where the compiled code
knows there won't be any GC, for instance, say you only ever work with Tuples
and floating point numbers, I don't think there will be any GC runtime in the
compiled binary.

I could be wrong about this though.

~~~
cmcaine
Julia can stack allocate all sorts of values. I think it currently stack
allocates all immutable values, and also every mutable value that it can prove
does not escape its function.

Kind of beside the point, though, because the GC is probably not that heavy
anyway. Lots of fear of GC is not based on benchmarks.

------
byt143
Anyone know how this compares to swift and its protocols etc?

~~~
socialdemocrat
I am a big fan of Julia, but Swift is perhaps the only statically typed
object-oriented language (apart from Objective-C) I have found that offers
some similarity in flexibility to Julia's way of dealing with types.

With Swift's ability to add an extension conforming to a particular protocol
to an existing class, you gain some of the same flexibility in Swift as in
Julia.

It means you can take an existing class which was not designed in particular
for some kind of abstraction and add conformance to an abstraction (protocol)
you later added.

That is kind of what Julia gives you, with the ability to easily add functions
dispatching on an existing type.

Say you got a `Polygon` and `Circle` type in Swift and Julia which you want to
add serialization to without either one having been designed for it
originally. In Swift I would define a `Serializable` protocol with a
`serialize` method taking an `IO` object to serialize to. Then I would extend
`Polygon` and `Circle` to implement this protocol.

In Julia I would simply add two functions:

    
    
       serialize(io::IO, poly::Polygon)
       serialize(io::IO, circle::Circle)
    

The challenge in Julia is that I might want to specify that only objects of
type `Shape` can be serialized, but if `Polygon` and `Circle` were not already
defined as subtypes of `Shape`, I cannot do anything about that without
changing source code. Swift has an advantage in this case.

My only alternative in Julia would be to create a Union type of all the types
I want to be serializable.

~~~
ken
> With Swift's ability to add an extension conforming to a particular
> protocol to an existing class, you gain some of the same flexibility in
> Swift as in Julia.

You can add methods or computed properties, but that's about it. That's only
one axis of flexibility, and it's really only syntactic sugar for writing and
calling your own functions. You can't add any other kinds of features, unless
they chose to use protocols in their interfaces -- which they usually didn't.

For example, that page gives the example of adding precision to numbers in
Julia. I'm not sure how you could do something analogous in Swift, short of
writing your own numeric tower from scratch. In Swift 4 they did add a Numeric
protocol, but it's not used much. It's probably hard to retcon this sort of
interface onto a framework which was built around concrete structs from the
start.

~~~
byt143
So it's a package, not a language issue?

Here's a revamp based on protocols [https://github.com/apple/swift-
numerics](https://github.com/apple/swift-numerics)

------
huijzer
Julia is by far the favorite language I have written code in. It is extremely
expressive, while also being easy to read. Most design decisions are spot on.
For example, the language has syntactic sugar, but not too much. Everything in
the base library makes sense and seems to be there for a purpose.

Other niceties are the metaprogramming capabilities, which allow for things
like inspecting the LLVM code, or printing a variable name `x` plus its value
by typing only `@show x`. Then there is the fact that function definitions can
actually look like how you would write a math function! (That is, `f(x) =
2x` is a valid function definition, as is `f(x) = 2π`.)
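
A quick illustration of the two niceties mentioned:

```julia
x = 21
@show 2x    # prints: 2x = 42

f(x) = 2x   # short-form definition, reads like math
f(3)        # 6
```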

However, there is one thing I do not like at all: the loading time of
packages. When starting julia and running `@time using DataFrames` it takes
about 38 seconds when recompiling some stale cache. If all caches are good,
the load times for some common packages still add up to 1.1 + 4.5 + 1.1
seconds according to `@time using Test; @time using CSV; @time using Dates`.
Therefore, nowadays I prefer to use R. For most of my use cases R outperforms
Julia by a factor of 10.

------
inamberclad
I never was able to get julia to do what I want. If I were a data scientist
who developed and maintained large libraries, then it would probably be great,
but I'm not. I just want to quickly visualize and modify data, or maybe see
how a model compares. It's much more difficult to do simple things like that
in Julia than in Octave/Matlab.

~~~
sgillen
That’s interesting; I would have thought it would be the opposite: Julia being
good at smaller experimental work and maybe creaking when things get scaled up
and put into production. Coming from a matlab background, I found Julia much
more natural to pick up than, say, python.

------
ablekh
Interesting post and excellent discussion. I have the following question to
all Julia and/or Python experts here. What strategy for developing a cloud-
native {science and engineering}-focused platform would be better, in your
opinion, and why: A) develop an MVP and then relevant production platform in
Python, spending some saved time and efforts (due to simplicity, [as of now]
better tooling and much wider pool of experts) on development of more and/or
better features; B) develop an MVP in Python, then rewrite it for production
in Julia for much better native performance, GPU integration and potential use
of macros for an embedded DSL; C) take more time initially to master Julia and
develop an MVP and then the corresponding production platform in Julia from
scratch?

EDIT: Forgot to mention that HPC workloads would represent a significant,
albeit non-unique, aspect of the platform in question.

~~~
eigenspace
I’m a bit biased as someone who switched from Python to Julia for my physics
research (I.e. I’m biased to believe I made a good choice and others should
follow my decision making), but I think any extra effort you spend in the
beginning to get it working in Julia will pay big dividends.

In the scientific world, Julia’s package ecosystem is already more developed
than Python in some fields (Differential equations being one, but there are
others), so you may not find that limiting you.

Furthermore, for the reasons laid out in the article, Julia is highly
composable and empirically, has a far greater ratio of code re-use than
Python. I believe you’ll see greater bang for your buck in Julia because code
you write is more likely to be re-used, especially in scientific domains.

~~~
ablekh
Your blazingly fast and thoughtful comment is much appreciated. Let's see what
others have to say ... :-)

One clarification that I would like to make (which IMO diminishes your second
point's potential value) is that the B2B SaaS platform that I plan to build
would stay away from implementing a myriad of domain-specific scientific
methods and algorithms (though I plan to provide a valuable core) and rather
would enable users to integrate their own implementations (through a plug-in
architecture). Thus, Julia's package ecosystem seems to be a factor of
somewhat lesser importance.

------
ColanR
I've been meaning to learn Julia. Is there a recommended text for doing so?

~~~
cosmojg
Check out Think Julia:
[https://benlauwens.github.io/ThinkJulia.jl/latest/book.html](https://benlauwens.github.io/ThinkJulia.jl/latest/book.html)

------
dzonga
Julia is faster than Python, though. Does anyone mind explaining why Python
can't have a JIT?

~~~
eigenspace
As others have mentioned, Python does have JIT compilers. The problem is that
having a JIT doesn't solve the problem.

PyPy is often a factor of 10 behind Julia performance, and projects like Numba,
PyTorch (the PyTorch people had to build their own Python JIT compiler,
yikes!), etc. will always have a more restricted scope than a project like
Julia, because Python's very semantics make many optimizations impossible.

Here's a great talk from the author of the famous Flask library in Python:
[https://www.youtube.com/watch?v=qCGofLIzX6g](https://www.youtube.com/watch?v=qCGofLIzX6g)
where he discusses the fundamental problems with Python's semantics.

If you fix these problems, you will end up changing Python so fundamentally
that you'll really have a new language. Generic CPython 3.x code will
certainly not be compatible with it.

~~~
dzonga
So in this case, should one migrate to Julia once, say, the Julia ecosystem
grows? Or wait for optimizations to be done, e.g. have pandas, numpy etc.
handle multi-core processors?

~~~
ChrisRackauckas
There are quite fundamental optimizations that go missing if separately
compiled pieces cannot be optimized together. These barriers disallow many
things. Additionally, anything that relies on higher-order functions provided
by the user, such as optimization or differential equation solvers, will run
into issues, because those functions cannot be optimized ahead of time by some
super good package developer but instead have to come from the user.

This blog post highlights the massive performance advantages Julia can have
when you start integrating optimized packages and allow for optimizing across
these barriers in Julia, and also the pros and cons of allowing these kinds of
cross-function optimizations.

[https://www.stochasticlifestyle.com/why-numba-and-cython-
are...](https://www.stochasticlifestyle.com/why-numba-and-cython-are-not-
substitutes-for-julia/)

------
lightsighter
Except they have no story for building composable libraries in a distributed
setting. The story for distributed execution in Julia today is just to use MPI,
which is a terrible answer. Anyone who has ever used libraries backed by MPI in
any language knows that they are inherently not composable. You can't just
take an object partitioned across multiple nodes one way by one library and
pass it into a second library that expects it to be partitioned a different
way. As far as I can tell the Julia language has nothing to say about that,
and that makes them a non-starter today for anyone trying to build composable
libraries for distributed memory machines.

~~~
xiaodai
Just curious, is there anything that's composable in the distributed world?

------
Iwan-Zotow
Composable? Julia and composable?!?

They decided on sequence range [1...N], instead of [0...N) like Python.

Try to compose that.

~~~
socialdemocrat
Depends on what you work on. When doing more computer science like stuff, such
as computing memory offsets etc, then 0-based indexing is practical. But for
numerical work 1-based indexing is usually easier to work with. Mathematical
texts are already using 1-based indexing and hence that is what people are
used to when thinking about math.

I work with both and I never found this a big problem. This is on par with
complaining about whitespace in Python. I prefer languages to not be
whitespace sensitive but it is not a big problem.

Although I pretty sure you will accidentally get more problem from Python
whitespace usage than from Julia 1-based indexing.

And frankly since Julia can use any indexing, you can use A[begin:end] to
refer to the whole range of an object. If you want the second item in any
array you can just write A[begin+1].

~~~
Iwan-Zotow
Whatever I'm doing, this is wrong. Simple example - I have a Python sequence
[0...N) to process. If members are independent, I can split it with ease in
Python and process it in parallel: [0...N) -> [0...M) + [M...N) for any M
between 0 and N. Basically, Python sequences/ranges/etc. are monoids in many
cases. Simplest composition rule - a monoid, with unit element and associative
composition. In Julia it really looks ugly; you have to make some effort to do
it right.

For me, Julia people don't get range/sequence composition right, so ...

~~~
JustFinishedBSG
You're inventing problems that don't exist. Everything you said applies
exactly the same with 1-, 2-, or whatever-based indexing. It literally doesn't
matter.

Julia's sequences/arrays are monoids too; I don't see what it has to do with
anything.

~~~
Iwan-Zotow
> Everything you said applies exactly the same with 1-, 2-, or whatever-based
> indexing.

read my statement again, it is about open vs closed interval. Python
ranges/sequences are composable, Julia ones are not.

> Julia's sequences/arrays are monoids too

really?

suppose you compute sum, and got two results: S(1, M) and S(M, N)

Could you compose them in Julia?

~~~
newen
You compute S(1, M-1) and S(M, N) instead and compose them. That's it. This is
not complicated stuff.
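
For what it's worth, the split described above is just as mechanical in Julia;
a sketch with 1-based closed ranges:

```julia
A = collect(1:10)
M = 4

# Split the closed range 1:N at M: left half is 1:M, right half is M+1:N.
left, right = A[1:M], A[M+1:end]

@assert vcat(left, right) == A             # concatenation recovers A (monoid)
@assert sum(left) + sum(right) == sum(A)   # partial sums compose
```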

