This is really, really cool. I've been dreaming of a system like this for years (with better persistent context and a voice interface) - a proper Star Trek computer!
I think the implication is that it has read a _lot_ of information on the web - their training data is (in some part) based on reddit links - and that information is then just passively encoded in the model weights. Some of it comes from the few-shot examples, at least in terms of what meta-form an 'answer' takes, but the implication is that GPT-3 really is just that knowledgeable.