> Also, don't overlook the possibility of pointing a
> telescope at an illuminated LED being powered from the
> motherboard. It will vary in brightness just as (or even
> more) accurately as the supply voltage from a USB port.
That is probably technically difficult from any significant distance, because tiny differences in air temperature slightly alter the refractive index, and add random error to your measured brightness that most likely would dwarf the tiny fluctuations in light output.
On top of that, light output is a quantum phenomenon, and photon statistics complicate any analysis from a significant distance.
For example, suppose a 630 nm red LED attached to the computer emits 2 mW of light on average (the right order of magnitude for a 2 mA indicator LED), and the clock speed is 1 GHz. Each photon carries 6.626e-34 * 299792458 / 630e-9 ≈ 3.15e-19 J of energy, so the LED emits about 6.34e15 photons per second, or 6.34e6 per clock cycle. Assuming those photons are radiated uniformly into a hemisphere, a telescope with a 15 cm aperture radius at a distance of 100 m collects 6.34e6 * (pi * 0.15^2) / (2 * pi * 100^2) ≈ 7.14 photons per clock cycle.
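Here's that photon budget as a quick back-of-envelope script. The 2 mW optical output, hemispherical radiation pattern, and 15 cm aperture radius are my stated assumptions, not measurements:

    # Back-of-envelope photon budget for the LED example above.
    import math

    H = 6.62607015e-34        # Planck constant, J*s
    C = 299792458.0           # speed of light, m/s

    wavelength = 630e-9       # m
    optical_power = 2e-3      # W (assumed optical output, not electrical drive power)
    clock_hz = 1e9            # 1 GHz CPU clock
    aperture_radius = 0.15    # m (assumed aperture radius)
    distance = 100.0          # m

    photon_energy = H * C / wavelength                            # ~3.15e-19 J
    photons_per_cycle = optical_power / photon_energy / clock_hz  # ~6.34e6

    # Fraction of the hemisphere's area covered by the aperture.
    fraction = (math.pi * aperture_radius**2) / (2 * math.pi * distance**2)

    print(f"collected per clock cycle: {photons_per_cycle * fraction:.2f}")
    # -> about 7.14 photons per clock cycle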
From such a small number of photons, detecting the brightness change from a <0.002% voltage fluctuation seems infeasible unless the attacker can make the computer repeat the exact same sequence of operations, with predictable timing, enough times for an averaging procedure to work.
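To put a rough number on "enough times": photon arrivals are Poisson, so shot noise on N photons is sqrt(N), and a fractional signal delta only clears the noise once delta * sqrt(N) is comfortably above 1. A sketch, assuming (optimistically) that the brightness tracks the voltage linearly:

    # How much averaging does shot noise demand?
    delta = 2e-5              # the <0.002% fluctuation, taken as a fraction
    photons_per_cycle = 7.14  # from the estimate above
    snr_target = 3.0          # modest detection threshold

    n_photons = (snr_target / delta) ** 2     # solve delta * sqrt(N) = SNR for N
    n_cycles = n_photons / photons_per_cycle

    print(f"photons needed: {n_photons:.2e}")  # ~2.3e10
    print(f"clock cycles:   {n_cycles:.2e}")   # ~3e9 repeats of the same cycle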
Obviously, if the attacker can get closer or use a massive telescope aperture, the attack might be more feasible.
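The scaling works in the attacker's favor quadratically: collected photons go as (aperture radius / distance)^2, so halving the distance or doubling the aperture each buys a factor of four. A quick sweep, reusing the emission figure from above:

    emitted_per_cycle = 6.34e6   # photons per clock cycle, from the LED example

    for distance_m in (10, 30, 100):
        for radius_m in (0.075, 0.15, 0.5):
            frac = radius_m**2 / (2 * distance_m**2)   # the pi's cancel
            print(f"d={distance_m:3d} m, r={radius_m:5.3f} m: "
                  f"{emitted_per_cycle * frac:8.1f} photons/cycle")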
But the leaked information doesn't need to be collected at anything close to clock-cycle sampling periods. For example, there have been successful timing attacks demonstrated from sampling only the time taken by the overall private-key operation (milliseconds). Note that these researchers demonstrated a side-channel attack with an ultrasonic microphone.
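A toy simulation of why millisecond-resolution sampling can still work: a small key-dependent difference buried in much larger measurement jitter emerges once you average enough whole-operation timings. All numbers below are invented for illustration:

    import random, statistics

    def op_duration_ms(key_bit):
        base = 20.0                       # nominal private-key op time, ms
        leak = 0.05 if key_bit else 0.0   # hypothetical data-dependent extra work, ms
        return base + leak + random.gauss(0, 1.0)   # 1 ms of measurement jitter

    for bit in (0, 1):
        mean = statistics.fmean(op_duration_ms(bit) for _ in range(100_000))
        print(f"key bit {bit}: mean {mean:.4f} ms")
    # With 1e5 samples the standard error is ~1/sqrt(1e5) ~ 0.003 ms,
    # well below the 0.05 ms difference.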
Yes, the standard assumption is that an attacker can prompt the server to perform a large number of private-key operations at predictable times. This is not unrealistic; it usually takes only a half-dozen TCP packets or so to get a web server to do it immediately. (e.g. http://thehackerschoice.wordpress.com/2011/10/24/thc-ssl-dos... )
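As a minimal sketch of that, using Python's standard ssl module against a hypothetical hostname: each TLS handshake forces the server to do one private-key operation, and the client gets a wall-clock timing sample for free:

    import socket, ssl, statistics, time

    HOST, PORT = "target.example.com", 443   # placeholder target
    samples = []

    for _ in range(100):
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        t0 = time.perf_counter()
        with socket.create_connection((HOST, PORT), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST):
                pass   # handshake completes inside wrap_socket()
        samples.append(time.perf_counter() - t0)

    print(f"median handshake time: {statistics.median(samples) * 1e3:.2f} ms")

(thc-ssl-dos goes further and abuses renegotiation, so it doesn't even pay the TCP setup cost per operation.)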
"A practical upper boundary on data rates by optical emanations was estimated at 10 Mbps (Megabits per second), but they thought that greater data rates may be feasible."