Yes, of course it's different. Google can point you to a real Wikipedia article, while GPT-3 will generate links to fake ones (or you could ask it to generate a fake article). It's the difference between truth and fiction.
Or at least, a first-level approximation. Wikipedia articles can be wrong.
At the level of abstraction I believe your friend was talking about, yes, this is effectively a condensed model of "The Internet" and the results returned by Google.
For those saying it can generate new things: it is interpolating between existing data points in the corpus, mushed together through a bunch of math.
Yes, but also, mixing together different articles on the Internet by different authors results in a complicated mixture of truth and fiction. What you get back depends on the query and some amount of randomness (depending on settings; see the sketch below).
By not mixing the articles together, a search engine lets you see where the information came from.
These are both useful things to do, but one is more useful for fiction and the other for nonfiction.
I'm not sure "interpolation" is the right word for these mixtures, though. Transformer output seems more creative than that.
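On the "some amount of randomness (depending on settings)" point above, here is a minimal Python sketch of temperature sampling, the usual setting behind that randomness; the function name and the example logits are made up for illustration, not taken from any real model.

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample one token index from raw logits; temperature controls how random the pick is."""
    rng = rng or np.random.default_rng()
    # Low temperature sharpens the distribution (nearly greedy); high temperature flattens it.
    scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-8)
    probs = np.exp(scaled - scaled.max())  # softmax, shifted by the max for numerical stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Made-up logits for three candidate next tokens.
logits = [2.0, 1.0, 0.1]
print(sample_token(logits, temperature=0.2))  # almost always the top candidate
print(sample_token(logits, temperature=1.5))  # mixes the candidates more often
```

At low temperature the same prompt returns nearly the same continuation every time; higher values make the mixing more unpredictable.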
As a creative person, I think creativity is overrated. But don't let me speak for everyone.
It is a straight line in a higher-dimensional space. We vibe with the space, or it chooses us by prior experience. Creativity is, in some respect, in the eye of the beholder. I think the creativity we see in ML models is not unlike the creativity we ascribe to humans.
Interpolation is the right word, and the right concept and metaphor: a mixing between two concepts that creates a path between them.
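For the curious, a minimal Python sketch of that idea: walking a straight line between two vectors, with toy three-dimensional "concept" vectors standing in for real model embeddings (the names and numbers are invented for illustration).

```python
import numpy as np

def interpolate(a, b, steps=5):
    """Points along the straight line from vector a to vector b."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return [(1 - t) * a + t * b for t in np.linspace(0.0, 1.0, steps)]

# Toy "concept" vectors; real embeddings have hundreds or thousands of dimensions.
cat = [0.9, 0.1, 0.3]
dog = [0.2, 0.8, 0.4]
for point in interpolate(cat, dog, steps=3):
    print(point)  # the middle point is an even blend of the two endpoints
```

Each intermediate point is a weighted blend of the two endpoints, which is exactly the "path between the two" described above.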