
Why doesn't anyone mention what Big O (of anything) actually means? Instead of jumping directly to a CS-related example, shouldn't the definition be made clear first?

http://en.wikipedia.org/wiki/Big_O_notation
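
For what it's worth, the core of that article's definition fits in one line (c and n_0 are just the usual bounding constants):

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{such that}\ |f(n)| \le c\,|g(n)| \ \text{for all}\ n \ge n_0

In words: beyond some point n_0, f never grows faster than a constant multiple of g.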




Nobody explains Big O because the OP already demonstrates an understanding of it: he gives the complexity of a piece of code and explains how that complexity would change if the code were changed.

The OP is confused about O(log n) specifically, not about Big O in general.
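
For a concrete picture of where O(log n) comes from, here's a quick sketch using binary search (just an illustrative example, not the OP's code):

    def binary_search(sorted_items, target):
        # Each iteration halves the remaining search range, so the loop
        # runs at most about log2(len(sorted_items)) times: O(log n).
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

Doubling n only adds one more halving step; that's exactly the behaviour O(log n) describes.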

-----



