Hacker News
Algorithms for the 21st Century (2006) (yaccman.com)
14 points by pncnmnp 59 days ago | 4 comments



This paper is very enlightening, but I don’t think an algorithms class should address these sorts of problems. When I went to school (15 years ago), we had a separate machine architecture class series that went over these sorts of practical concerns. We had labs where we would time memory accesses, test aliased variables, and manually decompile simple functions.

In my humble opinion as a volunteer educator, algorithms are already complicated enough, and students being introduced to them don’t need to concern themselves with this stuff at the same time.


Nifty paper. I kind of wish I were in a problem space that had that sort of issue!

I just spent a week shaving 98% off the latency of a single RPC call by going from O(N^2) to O(1). Data locality is not a problem I have. :)
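
As a purely hypothetical sketch of that kind of change (the comment doesn't say what the actual code was), the classic move is to replace a per-request linear scan with a prebuilt hash index, so each lookup drops from O(N) to amortized O(1). All names here (Record, lookup_slow, build_index, lookup_fast) are made up for illustration:

    #include <cstdio>
    #include <string>
    #include <unordered_map>
    #include <vector>

    struct Record { std::string key; int value; };

    // Before: each query scans the whole table -- O(N) per query, O(N^2) overall.
    int lookup_slow(const std::vector<Record>& records, const std::string& key) {
        for (const auto& r : records)
            if (r.key == key) return r.value;
        return -1;
    }

    // After: build the index once; each lookup is then O(1) on average.
    std::unordered_map<std::string, int>
    build_index(const std::vector<Record>& records) {
        std::unordered_map<std::string, int> index;
        index.reserve(records.size());
        for (const auto& r : records) index.emplace(r.key, r.value);
        return index;
    }

    int lookup_fast(const std::unordered_map<std::string, int>& index,
                    const std::string& key) {
        auto it = index.find(key);
        return it == index.end() ? -1 : it->second;
    }

    int main() {
        std::vector<Record> records = {{"alpha", 1}, {"beta", 2}};
        auto index = build_index(records);
        std::printf("%d %d\n", lookup_slow(records, "beta"),
                               lookup_fast(index, "beta"));
        return 0;
    }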


> We can also examine what happens when we write data repeatedly.

I wonder whether an optimizing compiler can detect and optimize sequential writes of the same value, possibly by replacing them with something like a blitter. It would be interesting to see the difference in speed if the data to be written were random rather than the same value (1.0) every time.
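
A rough illustration of the experiment being suggested (not taken from the paper): time a constant fill against writes of pre-generated random values. Whether the constant loop gets lowered to something memset-like depends on the compiler and the value's byte pattern (0.0 is all-zero bytes and commonly becomes memset; 1.0 is not, so it usually just gets vectorized). Note the random-write loop also reads a source array, so it carries an extra read stream and isn't a perfectly fair comparison.

    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        const std::size_t n = std::size_t(1) << 24;  // 128 MB of doubles, far larger than cache
        std::vector<double> a(n), b(n), src(n);

        // Pre-generate the random data so RNG cost stays out of the timed loops.
        std::mt19937_64 rng(42);
        std::uniform_real_distribution<double> dist(0.0, 1.0);
        for (auto& x : src) x = dist(rng);

        auto t0 = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < n; ++i)
            a[i] = 1.0;      // same value every iteration; the compiler may vectorize
                             // this, or emit memset when the byte pattern is uniform
        auto t1 = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < n; ++i)
            b[i] = src[i];   // a different value each iteration (adds a read stream)
        auto t2 = std::chrono::steady_clock::now();

        auto ms = [](auto d) {
            return std::chrono::duration_cast<std::chrono::milliseconds>(d).count();
        };
        std::printf("constant writes: %lld ms\n", (long long)ms(t1 - t0));
        std::printf("random writes:   %lld ms\n", (long long)ms(t2 - t1));
        std::printf("%f %f\n", a[n / 2], b[n / 2]);  // keep the stores observable
        return 0;
    }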


This is a good paper. I would label it "Caveats for Big O" or "the map is not the territory."



