
With 25 years of experience in the computer industry, you should be the one explaining to us all how to estimate how long a DDoS attack will take :).

Anyway, we are talking here about the complexity of the algorithm, not its actual speed or performance, which depends on a lot of factors. Complexity tells you, for example, how the number of steps the algorithm needs grows with the size of the input. The algorithm can be run by humans or by supercomputers, so performance can vary from days to microseconds, but the complexity stays the same.
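A minimal sketch of that point in Python (the step counter and the example numbers are my own illustration, not something from the thread): binary search needs on the order of log2(n) comparisons whether a human or a CPU executes it; only the time per step changes.

    # Binary search over a sorted list, counting comparisons.
    # The comparison count grows like log2(len(items)) regardless of the
    # hardware's speed; only the wall-clock time per step differs.
    def binary_search(items, target):
        lo, hi, steps = 0, len(items) - 1, 0
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid, steps
            elif items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, steps

    # 1,000,000 sorted numbers: at most ~20 comparisons, on any machine.
    index, steps = binary_search(list(range(1_000_000)), 765_432)
    print(index, steps)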




I think I covered that in this ebook: http://www.smashwords.com/books/view/314107 :) BTW, it is a joke written with MIT's SCIgen software, to show how hard it is to tell whether a thesis is serious or a parody generated by clever software. Even though I mark it as parody and Computer Science Fiction, some people take it seriously, even when it says things like "added 27 25GHz Intel 386 chips to the network". :) The 386 chip does not run anywhere near that fast.

OK, complexity. Then why is "polynomial time" used to describe it? I know some compilers can optimize machine code better than others; how does that affect the complexity? If performance can vary from days to microseconds, that is a very big gap. Video gamers always complain about 'lag', and days, well, that is one heck of a 'lag'! :)




