If that's the case, there's an obvious moat (perhaps not an incredibly deep one) in being better at prompt engineering than your competitors, dedicating R&D effort to discovering new prompt engineering tricks/principles, etc.
I could see this as being kind of like an advanced form of SEO.
I don't think it's going to be about a single prompt; reverse engineering multiple prompts that interact with each other is hard. There are a lot of cool things to be done with:
(a) creating a pipeline of prompts that combines the outputs of previous prompts into new prompts in a predefined manner
and (b) designing prompts that generate other prompts
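A minimal sketch of both ideas, assuming some LLM API behind a hypothetical `call_model` function (stubbed out here, since the comment doesn't name a specific API):

```python
def call_model(prompt: str) -> str:
    # Stub standing in for a real completion API (e.g. GPT-3);
    # a real implementation would send the prompt and return the completion.
    return f"<completion of: {prompt!r}>"

def pipeline(text: str) -> str:
    # (a) A pipeline of prompts: each stage's output is spliced
    # into the next prompt in a predefined manner.
    summary = call_model(f"Summarize the following text: {text}")
    critique = call_model(f"List weaknesses in this summary: {summary}")
    return call_model(
        f"Rewrite this summary: {summary}\nAddressing these weaknesses: {critique}"
    )

def meta_prompt(task: str) -> str:
    # (b) A prompt that generates another prompt: the model writes
    # the prompt, and we then run that generated prompt.
    generated = call_model(f"Write an effective prompt for this task: {task}")
    return call_model(generated)
```

Note that with real model calls, each stage compounds the error of the previous one, which is part of why reverse engineering such a chain from its outputs is hard.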
With the right type of online learning, and possibly some of the weights frozen, GPT-3 could gain effectively unlimited memory instead of its fixed 2,048-token context window.
True, but unless there is a clear leader in your market, lots of good-enough products built on GPT-3 will appear, and to compete with them you will need a product that is at least 5-10x better; 2x won't suffice. So it will probably come down to who has the bigger marketing budget.