D shows that you can stay in one language and still get very fast performance. With its numerical library you can get both productivity and performance that even beats OpenBLAS (which Matlab, NumPy, and Julia are based on).
These libraries have been incorporated in various spots in the machine learning, scientific machine learning, differential equation, and other ecosystems because they often outperform BLAS implementations like OpenBLAS. While the core language still links to OpenBLAS by default, this is up for debate for future versions of Julia given these developments (and work is being done in the standard library to better support swapping in different BLAS implementations).
As far as I can tell from glancing through it, Gaius.jl is no longer maintained, and both it and Tullio still consistently lose to MKL across a wide range of matrix sizes and usually lose to OpenBLAS on larger matrices. Moreover, most victories for Tullio seem to be very machine-dependent, with different users reporting that OpenBLAS still beats Tullio even on smaller matrix sizes.
It has mostly been replaced by PaddedMatrices.jl.
> Moreover, most victories for Tullio seem to be very machine-dependent, with different users reporting that OpenBLAS still beats Tullio even on smaller matrix sizes.
On kernels which OpenBLAS directly implements, yes. But Tullio is building more general kernels and has a pretty big win on tensor calculations that are not just one GEMM.
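To make the "not just one GEMM" point concrete, here is a sketch in Python/NumPy (the thread is about Julia's Tullio, but the idea is the same): a three-operand contraction can be expressed as a single fused kernel, whereas routing it through BLAS forces explicit reshapes, a temporary matrix, and a separate reduction. The shapes and values below are made up purely for illustration.

```python
import numpy as np

# Hypothetical shapes for illustration: a contraction mixing three
# operands over shared indices, which has no single-GEMM form.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5, 6))   # A[i, j, k]
B = rng.standard_normal((6, 7))      # B[k, l]
C = rng.standard_normal((5, 7))      # C[j, l]

# One fused contraction: out[i] = sum_{j,k,l} A[i,j,k] * B[k,l] * C[j,l]
out = np.einsum('ijk,kl,jl->i', A, B, C)

# The BLAS route needs explicit reshapes and a temporary:
tmp = A.reshape(4 * 5, 6) @ B                       # one GEMM: (i*j, l)
ref = (tmp.reshape(4, 5, 7) * C).sum(axis=(1, 2))   # separate reduction

assert np.allclose(out, ref)
```

A kernel generator like Tullio can fuse the whole contraction into one loop nest, avoiding the intermediate `tmp` array that the BLAS decomposition materializes.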
Besides handling operations that aren't standard kernels, handling unusual number types efficiently would be nice. (Being able to compile lighter-weight Julia images without BLAS libraries might be nice too, for other purposes.)
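To illustrate the "unusual number types" point: a BLAS is hard-wired to machine floats (and complex floats), while a generic kernel works for any type with `+` and `*`. A minimal pure-Python sketch using exact rationals (the Julia libraries in question do the analogous thing with compiled generic code, so this is only a model of the idea):

```python
from fractions import Fraction

def matmul_generic(A, B):
    # Naive generic matmul over lists of lists: works for any element
    # type supporting + and *, which no BLAS interface can accept.
    n, k = len(A), len(A[0])
    m = len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

# Exact rational matrices -- outside what any BLAS can represent.
A = [[Fraction(1, 3), Fraction(2, 3)],
     [Fraction(1, 2), Fraction(1, 4)]]
B = [[Fraction(3), Fraction(0)],
     [Fraction(0), Fraction(4)]]

C = matmul_generic(A, B)
assert C == [[Fraction(1), Fraction(8, 3)],
             [Fraction(3, 2), Fraction(1)]]
```

The appeal of the Julia approach is that the *same* source generates a fast kernel for `Float64` and a correct one for rationals, dual numbers, intervals, and so on, instead of falling back to a slow generic path only when BLAS can't be called.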