Imagine you're a Rocketdyne engineer in the '60s and one day, some guy walks out of a time warp into your assembly area, pulls out some weird-looking cameras, takes a few pictures of your engine, plugs some cables into some unknown equipment, and then, looking bored, sits down and directs his attention to a thin metal and glass slab for a while. When a light changes on the unknown equipment, the guy gets up, sticks his hand inside, and pulls out a tool which takes apart your engine.
That would be magic. You'd think the guy was from the 25th century or something. But no, only 45 years.
> [T]he power output of the Saturn first stage was 60 gigawatts. This happens to be very similar to the peak electricity demand of the United Kingdom.
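That figure can be roughly sanity-checked. The numbers below are assumed from commonly published F-1 specs (not from the article), and "power output" can be defined several ways; the kinetic power of the exhaust alone comes out in the tens of gigawatts, with the quoted 60 GW presumably counting the full chemical energy release:

```python
# Rough sanity check of the ~60 GW claim, using assumed public F-1 specs.
THRUST_N = 6.77e6    # sea-level thrust of one F-1, ~6.77 MN (assumed)
MDOT_KG_S = 2578.0   # propellant mass flow per engine, ~2578 kg/s (assumed)
ENGINES = 5          # the S-IC first stage carried five F-1s

v_e = THRUST_N / MDOT_KG_S          # effective exhaust velocity, m/s
jet_power = 0.5 * THRUST_N * v_e    # kinetic power of one engine's exhaust jet
total_gw = ENGINES * jet_power / 1e9

print(f"exhaust velocity ~ {v_e:.0f} m/s")
print(f"stage jet power  ~ {total_gw:.0f} GW")
```

With these inputs the jet power lands around 44 GW for the stage, so the 60 GW figure is at least the right order of magnitude.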
What struck me as odd was that they would have to reverse engineer something that they themselves built. But the article clarifies this point:
> A typical design document for something like the F-1, though, was produced under intense deadline pressure and lacked even the barest forms of computerized design aids. Such a document simply cannot tell the entire story of the hardware. Each F-1 engine was uniquely built by hand, and each has its own undocumented quirks. In addition, the design process used in the 1960s was necessarily iterative: engineers would design a component, fabricate it, test it, and see how it performed. Then they would modify the design, build the new version, and test it again. This would continue until the design was "good enough."
Making a model by laser scanning with structured light was a known process in the early 1970s. It was used by Ford Motor to get from clay models of cars to metal dies used to stamp out body parts. Previous approaches involved plaster casts and mechanical tracing machines.
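The geometric core of that kind of laser/structured-light scanning is plain triangulation: a light source and a camera at a known baseline both sight the same surface point, and the two angles fix its depth. A minimal sketch, with illustrative numbers (not from any particular scanner):

```python
import math

def triangulate_depth(baseline_m, laser_angle_rad, camera_angle_rad):
    """Depth of the lit surface point below the laser-camera baseline.

    Angles are measured between the baseline and each sight line.
    """
    # Law of sines in the laser / surface-point / camera triangle.
    apex = math.pi - laser_angle_rad - camera_angle_rad
    range_from_camera = baseline_m * math.sin(laser_angle_rad) / math.sin(apex)
    # Perpendicular distance from the baseline to the point.
    return range_from_camera * math.sin(camera_angle_rad)

d = triangulate_depth(0.5, math.radians(60), math.radians(70))
print(f"depth ~ {d:.3f} m")
```

Sweeping the stripe across the surface and repeating this per pixel yields the point cloud that replaces the old plaster casts and tracing machines.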
However, a computer tablet was beyond 1970s technology to even analyze, let alone duplicate.
I doubt they'd be able to duplicate it, since there's a whole manufacturing infrastructure they'd have to duplicate first. But I doubt they'd have any trouble analyzing it. By 1970 they had integrated circuits, and there were people who were already looking ahead to the possibilities. It's not that hard to pop the plastic off of an IC and look at it with a microscope.
Based on the gestalt of the time, they'd probably look at the tablet and think you should have something more advanced by 2015.
And wonder why the UI was so inconsistent ^_^.
As for ICs, the "standard" 7400 series dates back to the mid-60s: https://en.wikipedia.org/wiki/7400_series They'd be impressed by the surface mounting, but that's not a great leap from through-hole DIPs. They'd be really impressed by the CPU, since computing resources were so hard to get back then, but putting it all on a single chip is obvious, and e.g. the 4004 dates back to late 1971. Even DRAM dates back to the mid-60s (IBM), with the 1103 (Intel's first DRAM, a whopping 1024 bits) being sold starting in late 1970 (albeit with low yields for a while).