I think the GPT-3 claim is wrong, but it's not impossible. You can always fine-tune an existing neural net to do new things, right? I'm not sure how they'd handle the different whitespace tokens, though.
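For context on the whitespace point: Codex extended the GPT-3 tokenizer with dedicated tokens for runs of spaces, so indentation compresses into single tokens instead of many. Here's a toy sketch of that idea (the token names and the max run length of 24 are made up for illustration, not OpenAI's actual vocabulary):

```python
def tokenize_indent(line):
    """Toy Codex-style tokenization: collapse leading spaces into
    whitespace-run tokens, then emit the rest of the line as-is.
    Token names like <|8-spaces|> are invented for this sketch."""
    tokens = []
    n = len(line) - len(line.lstrip(" "))  # count leading spaces
    i = n
    while n > 0:
        take = min(n, 24)  # 24 = longest run token in this toy vocab
        tokens.append(f"<|{take}-spaces|>")
        n -= take
    if line[i:]:
        tokens.append(line[i:])
    return tokens

print(tokenize_indent("        return x"))
# 8 leading spaces collapse into a single <|8-spaces|> token
```

The practical issue is that adding tokens like these to an already-trained model means growing its embedding table and training the new rows, which is doable but not free.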
Today's Copilot X release says it's using GPT-4: "With chat and terminal interfaces, support for pull requests, and early adoption of OpenAI's GPT-4" ~ https://github.com/features/preview/copilot-x
They only explicitly say they're using GPT-4 for the new pull-request feature (and one other feature that I forgot). It's unclear which model they're using for the main Copilot code-completion feature. It may be that they're only using GPT-4 for features that need the larger context window.