A while back I was prototyping let's just say... unusual binary datatype representations for numbers. All I had to do was reimplement a handful of operations (+, -, x, /, one, zero) and I got everything from Fourier transforms to matrix solving for free. Comparing numerical performance with standard IEEE representations was then easy, and I had confidence that my comparisons were legit, since I was literally calling the same function against both numerical types.
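The pattern is easy to sketch. Julia's multiple dispatch is what made it effortless there, but the same idea can be shown in Python with operator overloading: the `Fixed` 16.16 fixed-point type and the `horner` routine below are hypothetical illustrations, not the actual code, and stand in for whatever "unusual" representation you pick.

```python
class Fixed:
    """Toy 16.16 fixed-point number (an illustrative 'unusual' representation)."""
    SCALE = 1 << 16

    def __init__(self, value, _raw=None):
        # Store the value as a scaled integer.
        self.raw = _raw if _raw is not None else round(value * self.SCALE)

    def __add__(self, other):
        return Fixed(0, _raw=self.raw + other.raw)

    def __sub__(self, other):
        return Fixed(0, _raw=self.raw - other.raw)

    def __mul__(self, other):
        # Product of two 16.16 values needs one rescale.
        return Fixed(0, _raw=(self.raw * other.raw) >> 16)

    def __truediv__(self, other):
        return Fixed(0, _raw=(self.raw << 16) // other.raw)

    def __float__(self):
        return self.raw / self.SCALE


def horner(coeffs, x):
    """Generic polynomial evaluation: works for any type with * and +."""
    acc = coeffs[0]
    for c in coeffs[1:]:
        acc = acc * x + c
    return acc


# The SAME function evaluates x^2 + 2x + 3 at x = 0.5 under both
# representations, so a numerical comparison exercises identical code paths.
exact = horner([1.0, 2.0, 3.0], 0.5)                              # 4.25
approx = float(horner([Fixed(1), Fixed(2), Fixed(3)], Fixed(0.5)))
```

With only the arithmetic operators defined once, any generic algorithm written against them (polynomial evaluation here, FFTs or linear solves in the real thing) runs on the new type unchanged.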
More recently I wanted to play around with Galois fields, following Mary Wootters's impressive work, and was able to test some ideas very quickly (and trivially deploy on a supercomputer cluster) in a few lines of code using Julia.
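The Galois-field case follows the same recipe. As a hedged sketch (again in Python rather than Julia, and not the code I actually ran): define +, -, *, / on a prime-field element, and generic polynomial routines of the kind that underlie coding-theory experiments work over the field for free. The `GF` class and `horner_gf` below are illustrative names.

```python
class GF:
    """Element of the prime field GF(p), for prime p."""
    def __init__(self, value, p):
        self.p = p
        self.v = value % p

    def __add__(self, other):
        return GF(self.v + other.v, self.p)

    def __sub__(self, other):
        return GF(self.v - other.v, self.p)

    def __mul__(self, other):
        return GF(self.v * other.v, self.p)

    def __truediv__(self, other):
        # Inverse via Fermat's little theorem: x^(p-2) mod p, valid for prime p.
        return GF(self.v * pow(other.v, self.p - 2, self.p), self.p)


def horner_gf(coeffs, x):
    """Generic polynomial evaluation; only needs * and + on its arguments."""
    acc = coeffs[0]
    for c in coeffs[1:]:
        acc = acc * x + c
    return acc


# Evaluate x^2 + 2x + 3 at x = 5 in GF(7): 25 + 10 + 3 = 38, and 38 mod 7 = 3.
r = horner_gf([GF(1, 7), GF(2, 7), GF(3, 7)], GF(5, 7))
```

In Julia the payoff is larger because library code you didn't write (matrix factorizations, interpolation) dispatches on the new element type the same way.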
That's not a typical use case, but it's a thing. I am thinking about playing around with complex numbers in deep learning (when I get some free time, which increasingly seems like 'never'), and for similar reasons Julia will be an obvious choice.