
Isn’t this just GPT-3 under the hood? Other similar things do exist (e.g., copy.ai).

Not sure whether there’s much of a difference between all these GPT-3-powered services when all that distinguishes you from the competition is some (slick) UI and the 500-1000 extra words of “training” you prepend to the prompt.
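
Concretely, the whole "product" is often little more than a canned prompt prefix stapled onto an API call. A minimal sketch of the pattern (the prefix wording and parameters here are invented, and it assumes the pre-1.0 openai Python client):

    import openai  # pip install "openai<1.0"

    # The "secret sauce": a few hundred words of canned instructions
    # and examples prepended to whatever the user types.
    PROMPT_PREFIX = (
        "You are an expert copywriter. Given a product name, write a "
        "short, punchy product description.\n\nProduct:"
    )

    def generate_copy(user_input):
        # The frozen model does all the work; the service supplies
        # only this prefix and a slick UI around the call.
        response = openai.Completion.create(
            engine="davinci",
            prompt=PROMPT_PREFIX + " " + user_input + "\nDescription:",
            max_tokens=150,
            temperature=0.7,
        )
        return response.choices[0].text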

Gwern [1], who has spent quite some time with GPT-3 and previous models, seems to think that coming up with the right 500-1000 words can be a subtle business.

Now, what's the prompt that will get GPT-3 to generate good prompts for us?

We'll call this technique Promptception.
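
Taken literally, that's just one more completion call: ask the model to write the prompt, then feed its output back in. A toy sketch (same pre-1.0 openai client as above; the metaprompt wording is invented):

    import openai

    # Invented metaprompt: the model completes it with a candidate prompt.
    METAPROMPT = (
        "Here is an example of an excellent GPT-3 prompt for producing "
        "high-quality marketing copy:\n\n"
    )

    def generate_prompt():
        response = openai.Completion.create(
            engine="davinci",
            prompt=METAPROMPT,
            max_tokens=200,
            temperature=0.9,
        )
        # The completion is itself a prompt, to be ranked by a human
        # (or by the model) and then fed back into GPT-3.
        return response.choices[0].text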

[1] https://www.gwern.net/GPT-3#prompts-as-programming


https://arxiv.org/abs/2102.07350 already calls it a "metaprompt" :) I gave it a quick stab a while ago, but I think prompt programming is too new, and demonstrations are too hard to cram into an existing prompt, for it to work really well yet. It's more promising to train models on examples of carrying out tasks from instructions, or to directly optimize prompts for a goal (https://arxiv.org/abs/2101.00190) - the model is differentiable and a whitebox, so use that power!
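
For the "use the gradients" direction, here's a minimal sketch of soft-prompt optimization. GPT-3's weights aren't public, so GPT-2 stands in; and this trains input embeddings (prompt tuning) rather than the per-layer prefixes of the linked prefix-tuning paper, but the core idea is the same: freeze the model and backprop into a handful of "virtual token" embeddings.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # Frozen LM; only the soft prefix gets gradient updates.
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    for p in model.parameters():
        p.requires_grad = False

    # 10 trainable virtual-token embeddings, prepended to every input.
    n_prefix, emb_dim = 10, model.config.n_embd
    prefix = torch.nn.Parameter(torch.randn(1, n_prefix, emb_dim) * 0.02)
    optimizer = torch.optim.Adam([prefix], lr=1e-3)

    def train_step(text):
        ids = tokenizer(text, return_tensors="pt").input_ids
        tok_emb = model.transformer.wte(ids)          # (1, T, emb_dim)
        inputs = torch.cat([prefix, tok_emb], dim=1)  # prepend soft prompt
        # A label of -100 masks the prefix positions out of the LM loss.
        pad = torch.full((1, n_prefix), -100, dtype=torch.long)
        labels = torch.cat([pad, ids], dim=1)
        loss = model(inputs_embeds=inputs, labels=labels).loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Run train_step over (instruction, demonstration) strings and the learned prefix ends up playing the role of those 500-1000 hand-written words, except found by gradient descent instead of trial and error.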
