
Julialang Antipatterns - ViralBShah
https://white.ucc.asn.au/2020/04/19/Julia-Antipatterns.html
======
ohsonice
Appreciate this post. I remember when I first started using Julia, I wanted to
type every argument to every function because I thought static typing made me
hip. I ran into a lot of problems with my types not being wide enough, etc.,
and it had no performance benefit anyway.
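For anyone new to this, the over-typing issue above can be sketched like so (function names are made up for illustration; this is not from the linked post):

```julia
# The tightly typed version adds no speed: Julia already compiles a
# specialized method for each concrete argument type it sees.
total_strict(xs::Vector{Float64}) = sum(xs)

# The untyped version is just as fast, and also accepts Float32 vectors,
# integer vectors, ranges, generators, etc.
total(xs) = sum(xs)

total([1.0, 2.0, 3.0])   # works with both definitions
total(1:10)              # works; total_strict(1:10) throws a MethodError
```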

Also, good to know about the NamedTuple. I've been away from Julia for about a
year and am starting to get back into some development with it.

On another note, I just found out today that my department's HPC is still
running Julia 0.4 and since we are in between IT people are not going to
update it. Considering rewriting my project in Fortran or C++, waiting for the
day when Julia is a first-class language

~~~
cbkeller
Julia 1.5 and 1.6 are going to have some really nice features performance-wise
(the 1.5 beta has cut a lot of allocations from my code and feels noticeably
snappier than older versions).

Once they come out it might be worth having a push to get your cluster
updated. I'm sure folks in the Julia slack (check out #HPC and #distributed)
would be happy to help if they can, even if it's just at the level of building
a local install in your user directory.
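For reference, a user-directory install needs no root access. A hedged sketch (the version number and tarball URL are illustrative; check julialang.org/downloads for the current release and the right build for your cluster's architecture):

```shell
# Download and unpack a Julia binary tarball into your home directory
# (URL/version are placeholders -- copy the real one from the downloads page):
cd "$HOME"
curl -LO https://julialang-s3.julialang.org/bin/linux/x64/1.5/julia-1.5.0-linux-x86_64.tar.gz
tar -xzf julia-1.5.0-linux-x86_64.tar.gz

# Put it on your PATH (add this line to ~/.bashrc to make it stick):
export PATH="$HOME/julia-1.5.0/bin:$PATH"
julia --version
```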

~~~
ohsonice
Thanks for the reply! Yes I've pushed for it to get updated but haven't seen
any results due to IT turnover. Always excited to hear more performance
improvements, yay :) shout out to the Julia dev team for doing great,
intentional work.

They supposedly have the HPC set up so you can install your own version locally
and submit it to the cluster, but nothing happens when I do that (the job is
marked successful after one second, but no print or save statements produce
anything). I may take your suggestion to ask on the Julia Slack, thanks.

~~~
cbkeller
Debugging slurm/pbs/cluster issues is no joke, but there are a few folks on
Slack with relevant experience.

~~~
ohsonice
I was actually able to solve this myself. As often happens, I went to ask
for help on Slack but first wanted to understand the problem better. Turns
out, our old IT director (who was not loved in the dept because he was
constantly taking away permissions) decided to make life 'easy' for us by
creating a bash script that generated the sbatch file for us. Debugging that
script showed that it did not handle local installs correctly. I was able to
simply write my own sbatch file.
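A hand-written sbatch file of the kind described above can be quite short. A sketch (job name, time limit, and the Julia path are placeholders, not the poster's actual setup):

```shell
#!/bin/bash
#SBATCH --job-name=myjob
#SBATCH --time=01:00:00
#SBATCH --ntasks=1
#SBATCH --output=myjob_%j.out   # print statements land in this file

# Point at the locally installed Julia rather than a system module:
"$HOME/julia-1.5.0/bin/julia" --project=. myscript.jl
```

Submitting is then just `sbatch myjob.sbatch`.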

I submitted the fix to the IT team but I don't even know if anyone has the
permissions to get in and fix it right now.

------
bionhoward
Thanks for posting! I hit an issue recently that this would have prevented: the
goal was to connect neural nets to probability distributions, but the nn
library took Float32 and the distribution library took Float64.
Double-precision uncertainty seems counter to the purpose of uncertainty!

Better not to constrain the types of the inputs unless leaving them open
causes a real problem.

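A tiny illustration of why leaving the inputs untyped helps in that Float32/Float64 situation (the function is hypothetical, not from either library): generic arithmetic simply stays at whatever precision the caller provides.

```julia
# An unannotated Gaussian-style kernel: no type constraints, so it
# computes in single precision for Float32 inputs and double for Float64.
gauss_kernel(x, μ, σ) = exp(-(x - μ)^2 / (2σ^2)) / σ

gauss_kernel(0.0f0, 0.0f0, 1.0f0) isa Float32   # true: stays single precision
gauss_kernel(0.0, 0.0, 1.0) isa Float64         # true: doubles stay double
```

Had `x`, `μ`, or `σ` been annotated `::Float64`, the Float32 call would throw a MethodError instead of just working.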
