You want to validate the age of a person, so you remove one of their teeth or take a tissue sample from their eye?
There's not much in the tooth apatite that would be useful for radiometric dating on the time scale of a single human lifespan, though. I don't think you could do it with radiometrics, no matter what you sampled. To resolve one year within a range of 100, you'd need an isotope with a half-life short enough that a measurable fraction decays each year, and a short half-life means high specific activity: anything that decayed that fast would probably kill the animal with its radiation, even if it were naturally common enough to be taken up by living organisms.
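As a rough illustration of that tradeoff (a minimal sketch, not a dosimetry calculation; the shorter half-lives are purely hypothetical stand-ins, only the ~5730-year figure for C14 is real):

```python
import math

# Fraction of a radioisotope remaining after t years, given its half-life.
def remaining_fraction(t_years, half_life_years):
    return 2.0 ** (-t_years / half_life_years)

# C-14 is real (~5730 y); the shorter half-lives are hypothetical tracers.
for half_life in (5730.0, 100.0, 10.0, 1.0):
    per_year = 1.0 - remaining_fraction(1.0, half_life)
    print(f"half-life {half_life:>7.1f} y: {per_year:.4%} decays per year")

# Output (approximately):
#   half-life  5730.0 y: 0.0121% decays per year   -> far too slow to resolve 1 year
#   half-life   100.0 y: 0.6908% decays per year
#   half-life    10.0 y: 6.6967% decays per year
#   half-life     1.0 y: 50.0000% decays per year  -> resolvable, but intensely radioactive
```

The per-year decay fraction is the signal you'd have to measure, and anything decaying fast enough to make that fraction distinguishable year over year is also irradiating its host at a correspondingly high rate.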
A tooth may be an easy source of original DNA, though, and approximate age might be extrapolated from the accumulation of copy-error mutations in different cell types, since the mutation count depends on how often each cell lineage has divided. By comparing DNA from the tooth mineral against DNA from the tooth's nerve cells and blood supply, and referencing mean mutation rates, it might be possible to narrow the range of plausible whole-organism ages, then narrow it further by checking telomere lengths. That would still vary somewhat by individual and by their history of radiation exposure.
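A minimal sketch of that kind of extrapolation; every number below (mutation counts, division rates, mutations per division) is an illustrative assumption, not measured data:

```python
# Hypothetical back-of-envelope age estimate from somatic mutation counts.
# All parameters are made up for illustration.

MUTATIONS_PER_DIVISION = 3.0      # assumed mean new mutations per cell division
DIVISIONS_PER_YEAR = {            # assumed division frequency per cell type
    "blood": 12.0,                # fast-dividing lineage
    "nerve": 0.1,                 # nearly post-mitotic lineage
}
observed_mutations = {            # hypothetical counts from sequencing
    "blood": 2400.0,
    "nerve": 22.0,
}

estimates = []
for cell_type, count in observed_mutations.items():
    divisions = count / MUTATIONS_PER_DIVISION
    age = divisions / DIVISIONS_PER_YEAR[cell_type]
    estimates.append(age)
    print(f"{cell_type}: ~{age:.0f} years")

# Each tissue gives an independent estimate; the spread between them
# hints at how wide the real uncertainty would be.
print(f"estimated age range: {min(estimates):.0f}-{max(estimates):.0f} years")
```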
That wouldn’t help solve these cases.
That said, radiocarbon dating generally has error bars spanning multiple decades, and it measures the time since death (when new carbon stops being integrated into the body), not the time since birth, so it would not be useful for this.
That said, the error bars are simply too large to do anything useful for dates prior to 1955, before atmospheric nuclear testing sharply raised C14 levels (the "bomb pulse"). The difference between 80 years and 110 years of decay is 99.0% vs. 98.7% of the original C14 concentration... and while you may be able to measure the remaining C14 very precisely, you also need to know the baseline from that era very precisely to determine the percentage. Typical radiocarbon error bars are thus at least ±60 years.
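The arithmetic is easy to check (a quick sketch using the standard ~5730-year half-life of C14):

```python
# Fraction of original C-14 remaining after a given number of years,
# using the standard half-life of ~5730 years.
C14_HALF_LIFE_YEARS = 5730.0

def c14_remaining(age_years):
    return 2.0 ** (-age_years / C14_HALF_LIFE_YEARS)

for age in (80, 110):
    print(f"{age} years: {c14_remaining(age):.1%} of original C-14 remains")

# Output:
#   80 years: 99.0% of original C-14 remains
#   110 years: 98.7% of original C-14 remains
# A 30-year age difference shifts the concentration by only ~0.3%,
# which is why uncertainty in the era's baseline swamps the signal.
```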