Ultimate Limits to Computation
If the universe is a simulation, shouldn't this limit be understood as the host's processor clock speed?
(I've not read much of Nick Bostrom's work, nor do I claim to know much about the reasoning that the universe is in fact a computer simulation.)
So if we were in a simulation, I'm skeptical that we could guess much about the properties of the 'host' system.
I was watching a presentation (Bostrom's, I think?) where the vocabulary used to explain Einstein's "c" (the speed of light) included words like "maximum speed of causality". That blew my mind. Even though light moves as fast as it theoretically can (many Newtonians believed there was no upper bound to the speed of light), it still cannot breach the absolute limit of 299,792,458 m/s. That speed limit is embedded directly into the action-reaction architecture of our universe.
(Anyway, didn't mean to hijack)
We could determine the maximum speed within our terms, but again that doesn't necessarily translate to how things happen in the underlying substrate.
Thought experiment: let's imagine we create our own simulated environment and put a sentient AI into it (one implemented in the physics of that environment). The AI gets curious, does some experiments, and figures out the maximum speed at which causality can propagate in its environment. Suppose the speed of light in the simulation is as fast as the speed of light in our environment, so something that takes 1 second in the simulation takes 1 second in ours. Cool. Now, suppose you and I as the developers put a 'sleep' call into the event loop or whatever, so that the simulation runs at 1/10 the speed of real time. Light in the simulation is now 10 times slower than it was by our clock. The code operates considerably slower - but no entity in the code would perceive the slowdown, because it has slowed down too in exactly the same way.
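A minimal sketch of that 'sleep in the event loop' idea (the tick-based physics, the variable names, and the 10x factor are all just illustrative):

```python
import time

SLOWDOWN = 10  # run the simulation at 1/10 of real time (hypothetical knob)

def step(state):
    # One tick of the simulated physics: light advances one cell per tick,
    # so observers inside always measure the same in-simulation light speed.
    return {"tick": state["tick"] + 1, "light_pos": state["light_pos"] + 1}

state = {"tick": 0, "light_pos": 0}
start = time.monotonic()
for _ in range(5):
    state = step(state)
    time.sleep(0.01 * SLOWDOWN)  # the host-side sleep; invisible from inside
elapsed = time.monotonic() - start

# From the inside, light moved light_pos cells in tick ticks: exactly
# 1 cell/tick, no matter what SLOWDOWN is set to. Only the host sees
# the wall-clock time per tick change.
print(state["light_pos"] / state["tick"])  # in-simulation speed: 1.0
print(elapsed)                             # host wall-clock time, grows with SLOWDOWN
```

Changing `SLOWDOWN` changes only the second number; the first is invariant, which is the whole point of the thought experiment.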
Same thing - any slowdown in the simulation would not be noticeable to us because it's also a slowdown in us.
Assuming our universe is being simulated on an Earth-sized computer (this is mind-bogglingly ridiculous to me) and that the simulation meets Bremermann's limit (also ridiculous, since we're 40 orders of magnitude away, according to this: http://arxiv.org/pdf/quant-ph/9908043v3.pdf), then it would take on the order of a whole Earth day to simulate a single time step.
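For anyone who wants to plug in numbers: a back-of-envelope sketch of what Bremermann's limit gives an Earth-mass computer. This only computes the raw limit (m*c^2/h bits per second); it doesn't try to pin down what counts as "a single time step", so it doesn't verify the day-per-step claim.

```python
# Rough physical constants (SI units)
C = 2.998e8           # speed of light, m/s
H = 6.626e-34         # Planck constant, J*s
EARTH_MASS = 5.97e24  # mass of the Earth, kg

# Bremermann's limit: a mass m can process at most m*c^2/h bits per second.
bits_per_sec_per_kg = C**2 / H               # ~1.36e50 bits/s per kg
earth_rate = EARTH_MASS * bits_per_sec_per_kg

print(f"{earth_rate:.2e} bits/s")            # ~8e74 bits/s for an Earth-mass machine
```

How long a "time step" of our universe takes at that rate then depends entirely on how many bits you think one step requires, which is where estimates like the one in the linked paper come in.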
Of course, perhaps the host universe doesn't have the same limitations. Such speculation seems pointless. Maybe solipsism is true. Maybe the simulation runs on magical fairy dust.
I believe that Bremermann's limit should also apply to quantum computers, but you are right that the example given for breaking cryptography is way off if you allow quantum computers.