This is a pet peeve of mine. Hackers, when measuring time for software performance, please use something smaller than milliseconds. 0ms is a Dirty Lie!
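A minimal Python sketch of the kind of reporting I mean (time.perf_counter_ns gives a monotonic, nanosecond-resolution clock; do_work here is just a stand-in for whatever stage you're measuring):

    import time

    def do_work():
        # stand-in for the stage being measured
        time.sleep(0.0004)   # ~0.4 ms

    start = time.perf_counter_ns()              # monotonic, nanosecond-resolution clock
    do_work()
    elapsed_ns = time.perf_counter_ns() - start

    # Report microseconds instead of truncating to whole milliseconds,
    # so a ~400us stage shows up as ~400us rather than "0ms".
    print(f"stage took {elapsed_ns / 1_000:.0f}us ({elapsed_ns / 1_000_000:.3f}ms)")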


In this case the performance measure in question is whether the latency is perceptible to humans or not. Milliseconds is the appropriate unit. Nobody is going to notice differences of less than 1ms.


Accumulation.

If x is the unit of measure for the end result and we have n components whose times add together, then we need a resolution of at least x/n when measuring each component. x/(2*n) is more reasonable, so that after rounding the total doesn't deviate from the target performance by more than one x even in the worst case. A rough sketch of that worst case is below.
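Sketch with made-up numbers (x = 16ms, n = 20), assuming each stage's time is truncated to the reporting resolution; round-to-nearest halves these figures:

    # Worst-case gap between reported and real totals when each stage's time
    # is truncated to the reporting resolution (made-up numbers: x = 16ms, n = 20).
    x = 16.0          # unit of measure for the end result, in ms
    n = 20            # number of components whose times add together

    for resolution in (x / n, x / (2 * n)):
        # Truncation can drop almost one full `resolution` per component,
        # so the reported total can undershoot the real total by up to:
        worst_total_error = n * resolution
        print(f"resolution {resolution:.2f}ms -> reported total can be "
              f"up to {worst_total_error:.1f}ms below the real total")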


Yes. But if some of your stages have delays of tens of milliseconds, there's no point knowing exactly how big the delays are in the stages that take less than 1 millisecond.

Once they're all below 1ms, it's worth increasing the resolution.


Well, e.g. 20 stages of 0.4ms (each displayed as 0ms) add up to 8ms, which is going to be noticed quite soon.
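As a sketch with made-up per-stage times:

    # 20 stages of ~0.4ms each: every per-stage line displays as 0ms,
    # yet the pipeline as a whole takes ~8ms.
    stage_ms = [0.4] * 20

    displayed = [round(t) for t in stage_ms]     # what a whole-ms readout shows
    print("per-stage display (ms):", displayed)              # [0, 0, ..., 0]
    print("sum of displayed values:", sum(displayed), "ms")  # 0 ms
    print(f"actual total: {sum(stage_ms):.1f} ms")           # 8.0 ms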


Their goal is to produce an "easily digestible overview", which necessarily means removing a lot of the detail that a hardcore performance-measurement person would find interesting. They were right to stick with milliseconds throughout.



