
The software improvements are on the order of 1000x greater than the hardware improvements.

I.e. the software matters more. As in, today's programmers know more about this subject.




I'd say on average programmers know less about writing optimal code now, but certainly the best in the field know a lot more.


I'd say that the algorithm matters more. As in, today's mathematicians know more about the subject.

I suspect the actual implementations of said algorithms achieve a lower % of peak performance than the older ones did (though to be fair they are /much/ more complex algorithms).


>I suspect the actual implementations of said algorithms achieve a lower % of peak performance than the older ones did

In my experience I have found the opposite to be the case. Most old maths libraries are written in FORTRAN (generally an order of magnitude or more slower than comparable C/C++), and their implementations of standard algorithms are often sub-optimal and naive. I got the same impression when I compared arctangent (Taylor series) implementations in old π programs, and Minimax in chess (see Bernstein's program for the IBM 704). I would guess that in the worst case they were 100x slower than what those machines could theoretically achieve.

The C+inline-asm libraries of today might be 2-10x slower at worst, and some are even bottlenecked by memory. Given that, I doubt the programs discussed in the 1989 paper, which were written in Prolog and FORTRAN, are exceptions to this.
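For concreteness, here is a minimal sketch of the kind of term-by-term Taylor-series arctangent I mean. This is my own illustration in C using Machin's formula, not code taken from any of those programs, and it works in doubles rather than the big-integer arithmetic a real π program would use:

    /* Sketch only: arctan(x) via the plain Taylor series
       x - x^3/3 + x^5/5 - ..., then pi via Machin's formula
       16*atan(1/5) - 4*atan(1/239). */
    #include <stdio.h>

    static double atan_taylor(double x) {
        double sum = 0.0, term = x, x2 = x * x;
        for (int n = 0; n < 60; n++) {    /* fixed term count, fine for |x| <= 1/5 */
            sum += term / (2 * n + 1);
            term *= -x2;                   /* next odd power, alternating sign */
        }
        return sum;
    }

    int main(void) {
        double pi = 16.0 * atan_taylor(1.0 / 5.0) - 4.0 * atan_taylor(1.0 / 239.0);
        printf("%.15f\n", pi);
        return 0;
    }

A modern libm-style arctangent would instead do argument reduction and evaluate a short minimax polynomial, which gets much closer to the machine's peak than summing the series term by term like this.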



