The distribution of an individual's IQ (as a quantile in the group's aggregate, normalized IQ distribution) is not a Dirac delta: it isn't pinned to a single value for an entire lifespan.
I think there would also be significant, noticeable losses in IQ due to social factors, and to a much greater extent than most people seem to believe when they cite the heritability of IQ and then treat that as IQ determinism.
For example, someone could test at 115 IQ, become homeless, and then test much closer to 100. (Don't use "but someone that smart would never become..." as a cop-out.)
Nonsense. If I administered an IQ test to 1,000 people in their 20s and then retested them in their 60s, I'd be running a classic within-subject design, which would let me assess whether test performance increases, decreases, or doesn't change significantly.
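Roughly the comparison I have in mind, as a sketch in Python (the raw scores and the size of the decline are invented; scipy's ttest_rel is the paired, within-subject test):

    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(0)

    # Hypothetical raw scores for the same 1,000 people at both ages
    raw_20s = rng.normal(loc=40, scale=6, size=1000)
    raw_60s = raw_20s + rng.normal(loc=-3, scale=4, size=1000)  # assumed average decline of ~3 raw points

    # Paired (within-subject) comparison of raw performance
    result = ttest_rel(raw_20s, raw_60s)
    print(f"mean change: {np.mean(raw_60s - raw_20s):+.2f} raw points, p = {result.pvalue:.3g}")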
My point was this: say all 1,000 people got every question correct in their 20s, and when you retested them in their 60s they all got every question wrong. Assuming the sample is representative of the overall population, the group's average IQ would still be 100 both times, because IQ is scored relative to the group rather than as an absolute measure.
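To make the arithmetic explicit, a small sketch (raw scores are invented; the point is that norm-referencing pins the sample mean at 100 no matter what the raw performance is):

    import numpy as np

    def norm_reference(raw):
        # Map raw scores to IQ-style scores: mean 100, SD 15 within this sample
        raw = np.asarray(raw, dtype=float)
        return 100 + 15 * (raw - raw.mean()) / raw.std()

    # Invented raw scores: near-perfect in the 20s, near-total failure in the 60s
    # (spread out slightly, since literally identical scores would leave the SD at zero)
    raw_20s = np.array([50, 49, 50, 48, 50, 49])
    raw_60s = np.array([2, 0, 3, 1, 0, 2])

    print(norm_reference(raw_20s).mean())   # 100.0 (up to floating point)
    print(norm_reference(raw_60s).mean())   # still 100.0, despite the collapse in raw scores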
That's not quite accurate. The norm adjustments are periodic, not continuous: the mean of 100 was set during a norming some years ago and will be reset at some point in the future. It is not constantly changing to track current scores.
Absolutely, the IQ test and the relative results are constantly changing with the population's performance, but if you take a snapshot by using the original test and scoring rubric, you can determine whether your IQ changed from your 30s to your 60s relative to the original distribution.
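Something like this is what I mean by a snapshot, as a sketch (the norming constants and raw scores are invented):

    # Norming constants frozen from the original administration (assumed values)
    ORIG_MEAN = 42.0
    ORIG_SD = 6.0

    def iq_on_original_norms(raw):
        # Score a raw result against the original distribution, not the current one
        return 100 + 15 * (raw - ORIG_MEAN) / ORIG_SD

    # One person's raw scores at the two ages
    print(iq_on_original_norms(48))   # 115.0 in their 30s
    print(iq_on_original_norms(38))   # 90.0 in their 60s, relative to the original norms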
It doesn't work that way, because IQ is defined on a bell curve: you literally can't compute your score until you already know the distribution.