I'm curious, what makes Java an especially poor choice for numerically intensive computing? Is the JIT penalty too high, or is it something else entirely?
My guess is that part of the problem is that all objects live on the heap, so an array of objects is really an array of pointers (bad cache locality, extra indirection), and there are probably other factors too.
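To make my guess concrete, here's a toy example of the kind of thing I mean: summing a primitive `double[]` versus a boxed `Double[]`. The boxed version stores each element as a separate heap object behind a pointer, which I'd expect to be noticeably slower (this is just a rough sketch, not a proper JMH benchmark):

```java
public class BoxingDemo {
    // Primitive array: values stored contiguously, read directly
    static double sumPrimitive(double[] values) {
        double total = 0;
        for (double v : values) total += v;
        return total;
    }

    // Boxed array: each element is a heap object; every read unboxes
    static double sumBoxed(Double[] values) {
        double total = 0;
        for (Double v : values) total += v;
        return total;
    }

    public static void main(String[] args) {
        int n = 10_000_000;
        double[] primitives = new double[n];
        Double[] boxed = new Double[n];
        for (int i = 0; i < n; i++) {
            primitives[i] = i;
            boxed[i] = (double) i; // autoboxing allocates (or reuses) a Double object
        }

        long t0 = System.nanoTime();
        double s1 = sumPrimitive(primitives);
        long t1 = System.nanoTime();
        double s2 = sumBoxed(boxed);
        long t2 = System.nanoTime();

        System.out.printf("primitive sum: %.0f in %d ms%n", s1, (t1 - t0) / 1_000_000);
        System.out.printf("boxed sum:     %.0f in %d ms%n", s2, (t2 - t1) / 1_000_000);
    }
}
```

Both loops compute the same sum; the difference I'd expect is purely in memory layout and allocation, which is why I suspect the heap model matters more than the JIT itself.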