Can anyone explain why this change didn't cause the client to think something was wrong with the server, and start applying local corrections? It seems like NTP was designed to prevent this sort of thing. (I don't really know anything about NTP :)
Why would the client think something was wrong with the server? From the client's perspective, this smear/fudge would be no different from any other clock adjustment.
All of the client machines would be querying lower-stratum servers that agreed on the smear factor. The clients don't query public NTP servers; they only query internal Google servers, all of which serve the same smeared time. Since every source agrees and each individual adjustment is only a few milliseconds per update, the clients just slew their clocks the same way they would for ordinary drift.
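For a concrete picture: Google's 2011 blog post describes a cosine smear, where over a window of w seconds before the leap second each server reports time shifted by lie(t) = (1 - cos(pi*t/w))/2 seconds. A rough Python sketch of that formula (the function names and 20-hour window are illustrative choices of mine, not Google's actual config):

    import math

    def smear_offset(elapsed, window):
        # Fraction of the inserted leap second added at `elapsed` seconds
        # into the smear window (cosine smear from Google's 2011 post).
        return (1.0 - math.cos(math.pi * elapsed / window)) / 2.0

    # Over a 20-hour window, the extra time served grows smoothly from
    # 0 s at the start to a full 1 s at the leap second itself:
    window = 20 * 3600
    for hours in (0, 5, 10, 15, 20):
        print(hours, round(smear_offset(hours * 3600, window), 3))
    # 0 -> 0.0, 5 -> 0.146, 10 -> 0.5, 15 -> 0.854, 20 -> 1.0

The curve is flat at both ends, so the rate change at any given poll is tiny, well below anything a client would flag as a broken server.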