Can you think of a way to model a non-instantaneous propagation of this change that doesn't involve simulating oodles of photons wandering around the world?
Light from source A travels at the speed set by the most recent speed-wavefront that has passed it. You can model this either by storing the current speed on each source, or by scanning all gathered orbs in the world and picking the most recent one whose wavefront has reached the source — a wavefront from an orb picked up at <time picked up>, at <distance from the source when picked up>, expanding at <speed of light before pickup>, has reached the source once elapsed time × that speed covers that distance.
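As a rough sketch of that lookup (names like `current_speed_at` and the orb tuple layout are my own invention, not anything from the original):

```python
import math

def current_speed_at(source_pos, now, orbs, base_speed):
    """Speed of light at source_pos at time `now`.

    Each orb is (pickup_time, pickup_pos, old_speed, new_speed).
    Its wavefront expands at old_speed (the speed of light just
    before pickup; as noted below, which one you pick barely matters).
    The source's speed is the new_speed of the most recently picked-up
    orb whose wavefront has already reached it.
    """
    speed = base_speed
    latest = -math.inf
    for t, pos, old_speed, new_speed in orbs:
        dist = math.dist(pos, source_pos)
        # Wavefront has reached the source if elapsed time * expansion
        # speed covers the pickup distance.
        if (now - t) * old_speed >= dist and t > latest:
            latest = t
            speed = new_speed
    return speed
```

For example, an orb picked up at t=0 at the origin (dropping the speed from 2 to 1) reaches a source 10 units away at t=5, so queries before then still see speed 2.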
Each new wave is entirely contained within the previous one, because nothing travels faster than light and the speed only ever decreases. So the only speed that matters for a wavefront is the speed of light just before its orb was picked up — that's the speed the wavefront expands at. (Or does it expand at the new speed? Eh, pick either one; it doesn't matter.)
Then, for each orb picked up after the most recent one that has already affected the source, the light's travel time to the observer is built up segment by segment: the stretch from <distance to orb N's wavefront> to <distance to orb N−1's wavefront> is traversed at the speed <before orb N>. Repeat for orb N−1, and so on until you run out of orbs. Here N is the oldest orb whose wavefront has not yet passed the source.
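The segment-by-segment sum above could look something like this, assuming you've already worked out where the light ray crosses each wavefront (the breakpoint/speed-list representation is my framing, and this treats the wavefront positions as frozen during the light's transit):

```python
def piecewise_travel_time(breakpoints, speeds, total_distance):
    """Time for light to cross total_distance through piecewise-constant
    speed regions.

    breakpoints: sorted distances from the source where the speed changes
                 (where the ray crosses each orb's wavefront)
    speeds:      speeds[i] applies up to breakpoints[i]; speeds[-1]
                 applies beyond the last breakpoint, so
                 len(speeds) == len(breakpoints) + 1
    """
    time = 0.0
    prev = 0.0
    for bp, v in zip(breakpoints, speeds):
        edge = min(bp, total_distance)
        if edge > prev:
            time += (edge - prev) / v  # this segment at its local speed
        prev = edge
    if total_distance > prev:
        time += (total_distance - prev) / speeds[-1]  # final segment
    return time
```

So light crossing 20 units, with the speed jumping from 1 to 2 at a wavefront 10 units out, takes 10/1 + 10/2 = 15 time units.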
It's a piecewise calculation (the reverse of what's described above), so it would be slower than modeling instantaneous changes, but there's no need to simulate photons. And I'd guess there's a reasonably efficient way to compute this on a GPU, so it could probably run in real time. Or it will in a year or two.