Anyway, we are talking here about the complexity of the algorithm, not its actual speed or performance, which depends on many factors. Complexity tells you, for example, how many steps the algorithm takes to solve the problem as the input grows. The algorithm can be run by humans or by supercomputers, so performance can vary from days to microseconds, but the complexity stays the same.
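To make the distinction concrete, here is a minimal sketch (the function name `linear_search_steps` is just for illustration) that counts the steps of a linear search instead of timing it. The step count depends only on the input size, not on who or what executes it:

```python
def linear_search_steps(items, target):
    """Return (found, number_of_comparisons) for a plain linear search."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            return True, steps
    return False, steps

# Worst case: the target is absent, so every element is examined.
for n in (10, 100, 1000):
    _, steps = linear_search_steps(list(range(n)), -1)
    print(n, steps)  # steps == n: the count grows linearly with input size
```

Run this on a laptop or work it out by hand on paper: the wall-clock time differs wildly, but the step count for a given input is identical, which is exactly what complexity measures.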
OK, complexity then, but why is polynomial time used to describe it? I know some compilers can optimize machine code better than others; how does that affect the complexity? If the performance can vary from days to microseconds, that is a very big gap. Video gamers always complain about 'lag', and days, well, that is one heck of a 'lag'! :)