
Using GPT-3 to generate user interfaces - Samin100
https://twitter.com/sharifshameem/status/1282676454690451457
======
gas9S9zw3P9c
As someone working in the field I have a few thoughts:

1\. The way I think about these huge models is no longer as something that
makes predictions, but rather as a kind of huge knowledge base (compressed
training data as representations) with smart query capabilities, which come
from the fact that the model is constrained to create syntactically correct
output. This could be the next generation of search. There is a fine line
between search and creating novel outputs, since most novelty is just a
combination of old things with constraints on correct syntax.
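
For concreteness, here is a toy sketch of that "search with syntax constraints" idea: a greedy decoder that always picks the highest-scoring token allowed by a validity check. The scores table stands in for a language model; everything here is made up for illustration.

```python
# Toy sketch: greedy decoding where a syntax constraint filters candidates.
# The scores table is a stand-in for a language model, not a real network.

def decode(scores, is_valid, steps):
    """Greedily pick the best-scoring token that keeps the output valid."""
    out = []
    for _ in range(steps):
        for tok in sorted(scores, key=scores.get, reverse=True):
            if is_valid(out + [tok]):
                out.append(tok)
                break
    return out

def balanced_prefix(seq, total=4):
    """Constraint: seq must be a prefix of a balanced paren string of length total."""
    opens, closes = seq.count("("), seq.count(")")
    return closes <= opens and opens <= total // 2

# The "model" prefers ")" everywhere, but the constraint forces balanced
# output - the structure comes from the constraint, not from the scores.
tokens = decode({"(": 1.0, ")": 2.0}, balanced_prefix, steps=4)
print("".join(tokens))  # "()()"
```

The point being: the raw scores alone would emit nonsense, and it's the syntactic constraint that makes the output look "novel but correct".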

2\. The shown results are likely a result of massive cherry-picking in the
typical "pitch deck demo" fashion. You'll see a lot more failures when trying
to use this and you need to get the query "just right" for it to produce good
output, which is a skill in itself. Also, let's not forget that the model must
be fed fine-tuning examples, which also need to be "just right". That's why I
believe these kinds of models aren't that useful for fully automating things -
for something to be used in production it should be 100% correct - but they
are very useful as a search function and alternative to e.g. StackOverflow in
this case. Querying the model will give you good enough results that you can
use as a starting point for your own use case.

~~~
dang
That's interesting. Are there examples of these techniques being used for
searchy things?

~~~
gas9S9zw3P9c
Well, I think it comes down to how you define search. People are amazed at
how good GPT-3 is at generating text, but looking at it through a search lens,
you could argue that rather than "generating" it searches for the most
relevant pieces to the input, combines them, and then enforces correct
grammatical rules on the output. A recent example I saw on Twitter was someone
asking GPT-3 questions about human evolution, and the model answers with a
bunch of book references and links - that's something you would have done in
Google before.

The fact that the Transformer attention mechanism works based on matching
queries and keys/values is probably why the model is so good at finding
relevant information to your input - that's kind of what it was trained to do.
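
To make the query/key/value matching concrete, here is scaled dot-product attention in miniature, in pure Python over toy vectors (all shapes and numbers are invented for illustration, not taken from any real model):

```python
import math

def attention(query, keys, values):
    """Scaled dot-product attention: score each key against the query,
    softmax the scores, and return the weighted sum of the values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(scores)  # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy "retrieval": the query matches the first key, so the output is
# dominated by the first value - soft lookup rather than hard lookup.
out = attention(query=[1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[10.0], [0.0]])
```

Because the weighting is a softmax rather than an argmax, it behaves like a soft database lookup: mostly the best-matching value, blended with the rest.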

------
CJefferson
This is a really cool demo, and congrats to the creator for building it.

However, while these systems based on GPT-3, and similar techniques, can
produce some amazing outputs given the right inputs, you don't have to poke
them very hard before they start falling apart.

At the end of the day, this isn't really any more "clever" than a very, very
advanced and long Markov chain; there isn't any intelligence.
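
For readers who haven't seen one, this is what a (bigram) Markov chain text generator looks like; the corpus and output are invented for illustration, and whether the comparison to GPT-3 is fair is exactly what's being debated here.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count word -> next-word transitions from a toy corpus."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def generate(table, start, length, seed=0):
    """Sample a chain: each word depends only on the previous word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = table.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

table = train_bigrams("the model makes text and the model makes code")
sample = generate(table, "the", 5)
```

The contrast with a Transformer is that here the "context" is a single previous word, whereas attention conditions on the whole input; how much that difference amounts to "intelligence" is the open question.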

~~~
rbinv
> At the end of the day, this isn't really any more "clever" than a very, very
> advanced and long Markov chain; there isn't any intelligence.

[https://en.wikipedia.org/wiki/AI_effect](https://en.wikipedia.org/wiki/AI_effect)
\- at least sort of

I get what you're saying, but is this really as fragile as you claim? I've
been rather impressed by (GPT) input handling.

~~~
CJefferson
I would never have thought this type of thing was clever. I don't want to pull
rank, but I've been an AI researcher for 20 years, so thinking this stuff isn't
clever isn't new for me :)

If anything, this is "less clever" than older techniques like tree search, as
it is very easy to confuse and will quickly start producing nonsense.

------
escardin
What I find most interesting about this is that it has just as much trouble as
humans in translating requirements into code. Admittedly it makes some fairly
basic mistakes as well, but always in the under-specified areas,

e.g.:

You didn't say what colour 'welcome to my newsletter' should be, so I picked
white. The numbers 1-5 are a perfectly possible random selection from the 1-10
range. I find the crying-laughing emoji really ugly, so it's the worst one.

------
applecrazy
This is so cool. I want to feed recipes into GPT-3 to generate novel (and
humorous) food items.

------
jonas_kgomo
My actual worry about this is: will I be obsolete as a front-end engineer? It
seems that AI will handle web optimization and conversational NLP programming,
so basically there will be no need for a front-end engineer. Perhaps I am
wrong.

------
jayjagtap
A blue color Login Button and a green color sign up button

------
londons_explore
How was this made?

The GPT-3 model hasn't been published, and training it yourself would probably
cost millions of dollars...

~~~
captn3m0
OpenAI is offering an API for it, currently in private beta.

------
zuhayeer
This is so cool – looking forward to using GPT-3 on random things like song
lyrics and internal company Slacks to see what happens.

------
Der_Einzige
I was one of the people who didn't think that transformers would be powerful
enough to automate software engineers out of a job.

I was wrong. It's coming for front-end engineers and it's coming soon.

