Unfortunately, in scientific computing the only niche I see for D is writing custom performance-critical algorithms, but even there it is more straightforward for university folk to use C++ or C, since there are plenty of code snippets lying around.
There is the excellent Mir library, but its documentation is subpar: no tutorials and hardly any examples. There is Netflix's Vectorflow, a small deep learning library, but it is CPU-only and supports only feed-forward networks, so it covers one narrow use case. There is TSV Utilities, reputedly the fastest CSV/TSV parsing toolkit around, from one of the eBay engineers, but I only learned about it when I started looking through the D website's resources; again, no tutorials. The tools are there, but using them requires more time investment than the alternatives.
I actually have quite a lot of tools available that I've built up over the years. I wish I had more time to work on them. I have a matrix algebra library that, in my opinion, is extremely convenient to use, and I'm preparing to release it, complete with documentation, in the next couple of months.
I have everything available in R at my disposal, because it's easy to embed an R interpreter inside a D program. Note that this need not mean poor performance, because the R code itself calls into compiled code, or you can call that compiled code directly through the R interface.
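For context, here is a hedged sketch of what embedding R in a D program can look like. It uses R's documented C embedding API (`Rf_initEmbeddedR`, `R_ParseVector`, `R_tryEval` from Rembedded.h/Rinternals.h); the hand-written bindings below are my assumption about one way to do it, not necessarily the setup described above, and they require linking against R's shared library:

```d
import std.stdio : writeln;
import std.string : toStringz;

alias SEXP = void*;  // opaque handle to an R object

extern (C) {
    int Rf_initEmbeddedR(int argc, const(char)** argv);
    void Rf_endEmbeddedR(int fatal);
    SEXP Rf_mkString(const(char)* s);
    SEXP Rf_protect(SEXP);
    void Rf_unprotect(int);
    SEXP R_ParseVector(SEXP text, int n, int* status, SEXP srcfile);
    SEXP R_tryEval(SEXP expr, SEXP env, int* error);
    SEXP VECTOR_ELT(SEXP, long);
    double* REAL(SEXP);
    extern __gshared SEXP R_GlobalEnv;
    extern __gshared SEXP R_NilValue;
}

// Parse and evaluate a snippet of R code, returning its first numeric result
double evalR(string code) {
    int status, err;
    SEXP txt = Rf_protect(Rf_mkString(code.toStringz));
    SEXP parsed = Rf_protect(R_ParseVector(txt, -1, &status, R_NilValue));
    SEXP result = R_tryEval(VECTOR_ELT(parsed, 0), R_GlobalEnv, &err);
    double x = REAL(result)[0];
    Rf_unprotect(2);
    return x;
}

void main() {
    const(char)*[3] args = ["R".ptr, "--no-save".ptr, "--silent".ptr];
    Rf_initEmbeddedR(3, args.ptr);
    writeln(evalR("mean(rnorm(1000))"));  // run arbitrary R from D
    Rf_endEmbeddedR(0);
}
```

The point is that once the interpreter is initialized, any R package on the system is one `evalR` call away, which is why the overhead argument below holds.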
For numerical optimization, I call the R optimization routines directly (i.e., the C library functions).
For basic model estimation, I call the Gretl library.
For statistical functions (evaluating distributions and such) I can call into the R API or Gretl.
For random number generation, I can call into Gretl, R or GSL. I have parallel random number generation code that I ported from Java.
For machine learning (I do a limited amount like lasso) I call R functions. The overhead with that is so low that there's no point in not just calling the R functions directly.
So things are there. It's just a matter of finding the time to turn it into something others can use. Right now I'm focused on doing that with my linear algebra library.
Oh hello friend, you've piqued my interest as someone interested in D, numerical optimization, and linear algebra. I have some questions though.
How do you do numerical optimization in D? Do you somehow wrap Coin-OR's CBC C++ library or lp_solve? What does it mean to call the R functions directly? Do you have an example? I'm going to guess it won't be able to handle the massive, time-critical models I use, but I'm still curious.
How do you do linear algebra? Are you binding to BLAS, LAPACK, Armadillo? Or did you write some routines from scratch?
A couple of points before I give my answer. You'll want to check out the Mir project linked in the other comment; it's a well-designed library (though maybe lacking documentation). The other thing is that you can #include C headers directly in a D file using https://github.com/atilaneves/dpp, so you can always costlessly add a C library to your project.
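To illustrate the dpp workflow: per its README, you put `#include` directives and D code in a `.dpp` file and compile with the `d++` driver, which translates the headers and invokes the D compiler. A minimal sketch (the file name is just an example):

```d
// example.dpp — compiled with: d++ example.dpp
#include <math.h>
#include <stdio.h>

void main() {
    // The C declarations pulled in above are usable as ordinary D symbols,
    // so a C library's whole API is available without hand-written bindings.
    printf("sqrt(2) = %f\n", sqrt(2.0));
}
```

This is what makes "costlessly add a C library" literal: no binding-generation step beyond pointing `d++` at the headers.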
For optimization, I was referring to calling into the R API, which exposes the optimization routines in base R (Nelder-Mead, BFGS, Conjugate Gradient, and bounds-constrained BFGS). In terms of what it can handle, I guess that's entirely up to what R can handle. Here's the project page, but it looks like I haven't committed to that repo in three years: https://bitbucket.org/bachmeil/dmdoptim/src/master/
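As a concrete sketch of what "calling the R optimization routines directly" can look like: R exports its Nelder-Mead driver as the C function `nmmin` in R_ext/Applic.h, which can be bound from D. The signature below is my reading of that header, and the tolerance/step constants are the conventional defaults; verify against your R installation before relying on it:

```d
import std.stdio : writeln;

extern (C) {
    // optimfn from R_ext/Applic.h: objective taking (n, par, ex)
    alias optimfn = double function(int, double*, void*);
    void nmmin(int n, double* Bvec, double* X, double* Fmin, optimfn fn,
               int* fail, double abstol, double intol, void* ex,
               double alpha, double bet, double gamm, int trace,
               int* fncount, int maxit);
}

// Example objective: (x0 - 1)^2 + (x1 + 2)^2, minimized at (1, -2)
extern (C) double objective(int n, double* par, void* ex) {
    return (par[0] - 1.0) ^^ 2 + (par[1] + 2.0) ^^ 2;
}

void main() {
    double[2] start = [0.0, 0.0];
    double[2] xmin;           // receives the minimizer
    double fmin;              // receives the minimum value
    int fail, fncount;
    nmmin(2, start.ptr, xmin.ptr, &fmin, &objective, &fail,
          -double.max, 1e-8, null,
          1.0, 0.5, 2.0,      // standard reflection/contraction/expansion
          0, &fncount, 500);
    writeln(xmin);            // should approach [1, -2]
}
```

Linking against R's shared library is required, so the practical limits are exactly R's limits, as noted above.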
If you do try it and have problems with anything, please create an issue so I can fix it or add proper documentation.
There were two reasons for that. First, it offered a really simple LAPACK interface when I was starting out with D, and second, it offers a lot more than just linear algebra.
Is this something I'd recommend to others? I don't know. I built my infrastructure over a period of several years while waiting for my son at his many practices and activities. I also optimize for programmer convenience rather than performance at all costs. The time it takes me to write correct, performant code is worth far more than code that runs 15% faster.
This has a lot of potential, but ultimately I'm paid to do other things, meaning those things become the priority...
This is great, please do! It would be nice to share it beyond the dlang forum too. How does it compare to scid, btw?
I like R, but I generally use it for basic stat tasks and plotting instead of Python. It would be awesome if you could share your experience setting it up with D, in a blog post or whatever form you find useful.
I've been writing up a summary of how I use D to put on my website. Maybe this is the push I need to finish it.
About scid[0], I looked at it when I started, but it seemed to be largely inactive by that time, it didn't do what I needed, and the documentation wasn't good enough. I was also turned off by the excessively generic nature of everything: there were just too many templates. At least that's what I recall.