I've terribly butchered this since the details completely escape me, but you get what I mean. It feels like it could be true, which is... neat. This sort of thing certainly happened several times with gaming consoles, where developers squeeze every ounce of performance from the hardware at the very end of its generation.
2000x is believable, but that doesn't mean the latest algorithm will run on an Apple II. Algorithmic speedup is often hardware-relative. For example, better cache locality matters far less on older hardware that lacks deep cache hierarchies.
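To make the hardware-relative point concrete, here's a minimal sketch (in Python, purely illustrative): summing a matrix row-by-row versus column-by-column computes the same result, but the strided version tends to be slower on modern cached CPUs because it has poor locality. On a machine with no real cache hierarchy, like the Apple II's 6502, the access order would make little difference, so the "optimization" buys you nothing there. The matrix size and timing here are arbitrary choices for demonstration.

```python
import time

N = 1000
matrix = [[1] * N for _ in range(N)]

def sum_row_major(m):
    # Visits each row's elements in order: sequential, cache-friendly access.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Jumps to a different row on every access: strided, cache-unfriendly.
    total = 0
    for j in range(len(m[0])):
        for i in range(len(m)):
            total += m[i][j]
    return total

t0 = time.perf_counter()
a = sum_row_major(matrix)
t1 = time.perf_counter()
b = sum_col_major(matrix)
t2 = time.perf_counter()

# Same answer either way; only the traversal order (and thus locality) differs.
assert a == b == N * N
print(f"row-major: {t1 - t0:.4f}s, col-major: {t2 - t1:.4f}s")
```

The exact gap varies by machine and language (CPython adds overhead that mutes it; in C the difference is usually much starker), which is itself the point: the speedup from such an algorithmic change depends on the memory system underneath it.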
On an actual Apple II at the time of release, you probably did not get a configuration that maxed out RAM and storage, because it just cost too darn much. But as you got into the '80s, prices came down, and the Apple II could expand to fit, with more memory and multiple disks, without architectural changes.
As such, a lot of early microcomputer software assumed RAM and storage starvation, and its algorithms had to be slow and simple to make the most of memory. But Apple developers could push the machine hard if they ignored the market forces that demanded downsizing; it was a solid workstation platform. When the 16-bit micros came around, the baseline memory configuration expanded remarkably, so completely different algorithms became feasible for all software, which made for a substantial generation gap.
By the '90s, the "free lunch" scenario was in full swing and caching became normalized, so everything has been about faster runtimes. But at systemic scale, with layers of cache, it's often hard to pinpoint the true bottleneck.
No one, no site I have read, whether tabloid type (like Wccftech) or professional type, ever claims the end of Moore's law equals the end of transistor improvement. (It just means we won't be getting the improvements as quickly or as much.)
And yet somehow, somewhere, some marketing reckons the masses think Moore's Law equals transistor improvement. So Moore's Law is now being "redefined" to mean transistor improvement, and therefore declared not dead, as shown at the recent Intel conference.
Which sort of makes any discussion on the topic pointless.