Say there's some worldwide computer glitch and the clocks have to be reset. Is it possible that we'd lose track of exactly how many seconds have passed since the epoch?
Actual answer: The VLBI radio telescope Celestial Reference System measures the Earth's rotational position to 10 microarcseconds, which is good to within two thirds of a millionth of a second of sidereal time-of-day. The position of distant quasars is very stable. https://en.wikipedia.org/wiki/International_Celestial_Refere...
Put the question another way: Suppose you want to determine the moment of midnight to within 0.5s without using any clock more accurate than counting the number of days. You are, however, allowed to use telescopes, reference materials and perform arbitrary amounts of calculations. And you're allowed to wait for ideal observing conditions.
The general kind of instrument you want is the https://en.wikipedia.org/wiki/Transit_instrument -- if you can measure the transit of a star to 1 arcsecond, that's better than 1 second/day. That article says there are transit instruments that work down to 0.01 arcsecond.
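To sanity-check the figures above: the Earth's hour angle advances 360 degrees per 86400 seconds of time-of-day, i.e. 15 arcseconds of rotation per second of time. A quick sketch of the conversion (the numbers are just the ones quoted in these comments):

```python
# 360 degrees * 3600 arcsec/degree over 86400 seconds of time-of-day
ARCSEC_PER_TIME_SECOND = 360 * 3600 / 86400  # = 15.0

def angle_to_time(arcsec):
    """Convert a rotational-angle uncertainty (arcseconds)
    to a time-of-day uncertainty (seconds)."""
    return arcsec / ARCSEC_PER_TIME_SECOND

# VLBI: 10 microarcseconds of angle
print(angle_to_time(10e-6))  # ~6.7e-7 s, i.e. two thirds of a microsecond

# A 1-arcsecond transit measurement
print(angle_to_time(1.0))    # ~0.067 s, far better than 1 s/day
```

This is why the 10-microarcsecond VLBI figure works out to "two thirds of a millionth of a second" of time-of-day, and why a 1-arcsecond transit is already well inside the 0.5 s budget.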
Unless you have to go a very long time just counting days (long enough for UT1 and UTC to drift apart by half a second), it seems pretty clear you could recover the correct UTC second.
You mention computer glitches; there also exist non-transistorized clocks that are accurate to much better than 1 s/day. In fact, the Wikipedia article about the Elgin Observatory describes how a transit instrument was used to discipline a mechanical clock to within 0.01 s https://en.wikipedia.org/wiki/Elgin_National_Watch_Company_O... -- though I doubt such a system is in continuous use today.
You'd be able to get it back from GPS satellites, but barring that, from the same place synchronized time originally came from: astronomical observations, and records of astronomical observations.
I didn't understand everything, but it looks like they have a fleet of atomic clocks for UTC and are resilient to any one of the clocks failing. I guess it's possible that if all of those clocks went down, the world would lose track of time. I wouldn't want to be responsible for that cluster :)
I think it depends on what you mean by 'accurate'. Losing computer time doesn't lose the ability to measure time; what you've lost is a reference and synchronization. You get the reference back from wherever the original reference came from, and go from there.
Back down on earth, though, most computer clocks drift anyway. Very few devices need super-precise time (GPS does, for example, partly because of relativistic corrections). Your home computer's clock is subject to network time fluctuations of some number of milliseconds; NTP over the public internet is typically only accurate to within a few tens of milliseconds.
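For what it's worth, the way NTP estimates that offset is simple arithmetic over four timestamps (client send, server receive, server send, client receive), per RFC 5905. A minimal sketch with made-up timestamps (the 50 ms offset and 20 ms latency are hypothetical numbers, not measurements):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """RFC 5905 on-wire arithmetic.
    t1: client transmit, t2: server receive,
    t3: server transmit, t4: client receive (seconds)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2  # estimated clock offset
    delay = (t4 - t1) - (t3 - t2)         # round-trip network delay
    return offset, delay

# Hypothetical exchange: client clock 0.050 s behind the server,
# 0.020 s of symmetric one-way network latency.
t1 = 100.000   # client sends (client clock)
t2 = 100.070   # server receives (server clock)
t3 = 100.071   # server replies 1 ms later (server clock)
t4 = 100.041   # client receives (client clock)
offset, delay = ntp_offset_delay(t1, t2, t3, t4)
print(offset, delay)  # ~0.05 s offset, ~0.04 s round-trip delay
```

The catch is that the offset estimate is only as good as the assumption that the network latency is symmetric, which over the public internet it often isn't -- hence the tens-of-milliseconds figure.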
If you have two databases running on two different computers, there's no guarantee their timestamps are in sync relative to each other or relative to a particular atomic clock's epoch. Is it within a second? Probably, on a modern operating system. Same millisecond? Very unlikely.