Radiation levels due to nuclear testing were elevated 7% over normal.
The major source of radioactivity in steel is cobalt-60, which has a half-life of 5.27 years.
In which case one could just wait a year and the radioactivity of your steel would drop by roughly 12%, more than making up for the 7% elevation from nuclear testing contamination. Put another way, steel from 1944 has been around for some 10 half-lives of cobalt-60, meaning it has 1/2^10th (roughly 1/1000th) as much Co-60 radiation as when it was made. Why would it matter if the radioactivity was 1/2^10th or 1.07/2^10th as much as the background radiation?
I'm sure there are other isotopes which make this more of a problem, but the facts as presented in this article don't make much sense.
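To put rough numbers on the half-life argument, here's a quick sketch using the standard exponential-decay formula and the 5.27-year half-life quoted above (the calculation is mine, not from the article):

    # Fraction of Co-60 activity remaining after `years`,
    # given a 5.27-year half-life.
    t_half = 5.27

    def remaining(years):
        return 0.5 ** (years / t_half)

    print(1 - remaining(1))        # ~0.12: a one-year wait removes ~12% of the activity
    print(remaining(10 * t_half))  # ~0.00098: about 1/1024 left after ten half-lives

So a 7% excess from fallout-era contamination is erased by roughly six months of ordinary decay, and after ten half-lives the absolute difference between 1/2^10 and 1.07/2^10 of the original activity is negligible.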
Even so, there are still some rare cases where it's easier to use older steel to ensure that there is much lower background radiation. Co-60 isn't the only source of radiation, and even 1/1,000th of peak levels can still be fairly high for applications like neutrino detectors.
I've half a mind to write a bot that submits a random article every 48 hours.
The article says background radiation levels peaked at 0.15 mSv in 1963. Looking at the Wikipedia page on the sievert, I am trying to compare this to other radiation examples, but I'm not sure how to draw a comparison.
Would a human standing outside be receiving 0.15 mSv per hour? Per year? Total?
I think the 0.15 mSv the article mentions is per year. The next sentence says per year explicitly.
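For scale, assuming the 0.15 mSv figure is indeed per year (the ~2.4 mSv/yr world-average natural background is a commonly cited UNSCEAR estimate, not something from the article):

    fallout_peak = 0.15        # mSv/year, the article's 1963 peak
    natural_background = 2.4   # mSv/year, approximate world-average natural dose

    print(fallout_peak / natural_background)  # ~0.06

That works out to roughly a 6% bump over natural background, which lines up with the "7% over normal" figure quoted at the top of the thread.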
From the article it looks like the most important factor for that would be the atmospheric concentration of cobalt-60 (in ppm or ppb).