
GPT-3 can generate a React app from natural language description - andreaorru
https://twitter.com/sharifshameem/status/1284421499915403264
======
weeksie
Soon machines will steal all the copy-pasting-from-Stack-Overflow jobs.

But really, this is super impressive for ML. It's not software development,
though. It's analyzing a corpus and performing a search.

The nasty bit is that if that's all it takes to do most software dev jobs
(and hey, we might find that most software dev really is just pattern matching
and regurgitating those patterns), then as the project gets more complex, the
specifications will have to be more complex too, and we'll end up mostly back
where we started.

~~~
derision
This is great for all the people who hate gluing together boilerplate and
pipes. In my opinion, the hardest part of development is the "natural language
description": gathering requirements, making sure everyone is on the same page
with what is being requested, etc. If we're able to quickly generate
prototypes from that language, that greatly reduces the cycle time from
ideation to prototype. What I foresee is that companies will need fewer
developers overall, but more companies will bring in technical talent they
wouldn't have otherwise, knowing it's that much easier to get working code.

~~~
pmart123
The trick will be whether the code generation works well enough most of the
time, or whether it generates some 1% weirdness that becomes a needle in a
haystack. It seems to me it will be great for product manager/business person
prototypes, and potentially helpful as a "hyper auto-complete" for a
developer, but likely not for automating mission-critical code.

~~~
derision
we just need an adversarial network to generate unit tests
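Tongue-in-cheek or not, the underlying idea is roughly adversarial or property-based testing. A minimal sketch in plain JavaScript, with every name invented for illustration: one side generates random inputs, and the candidate (standing in for model-generated code) must agree with a trusted reference.

```javascript
// Stand-in for model-generated code; here, just a working sort.
function generatedSort(xs) {
  return [...xs].sort((a, b) => a - b);
}

// The "adversary": hammer the candidate with random inputs and
// compare each result against a trusted reference implementation.
function adversary(candidate, rounds = 1000) {
  for (let i = 0; i < rounds; i++) {
    const input = Array.from({ length: 10 }, () =>
      Math.floor(Math.random() * 100) - 50
    );
    const expected = [...input].sort((a, b) => a - b);
    if (JSON.stringify(candidate(input)) !== JSON.stringify(expected)) {
      return { ok: false, counterexample: input }; // found a failing case
    }
  }
  return { ok: true };
}

console.log(adversary(generatedSort).ok); // true: no counterexample found
```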

------
aerovistae
I don't really understand. How is GPT-3 able to do this? Understanding English
is one thing, but how did it learn JavaScript, HTML, CSS, and the React
framework?

~~~
andreaorru
Apparently it was trained on GitHub repositories.

[https://youtu.be/fZSFNUT6iY8](https://youtu.be/fZSFNUT6iY8)

~~~
aerovistae
What I'm wondering, though, is how it matched up the natural English
description with the repository content.

~~~
taylorfinley
I'd like to see this done with tests as the input, and the working app as the
output. Then writing an app would be as simple as defining the tests it must
pass.

~~~
FabioBertone
> Then writing an app would be as simple as defining the tests it must pass

In most cases, if you can define the tests, you have already figured out 80%
of the solution.

------
hnarayanan
The twist to this is that no matter what you ask the model, it will make you a
TODO app because that’s all it’s been trained on.

------
dbbk
Fun gimmick, but nothing more.

~~~
aerovistae
This comment seems short-sighted. Code that can generate a working set of
interlinked page elements from natural English? How on earth is that a
gimmick? This is incredible. This is the Pong to the Crysis of 25 years from
now, when we can describe entire applications into existence.

I can't fathom the mindset that dismisses this as a gimmick. Is the first step
on every journey meaningless? And this isn't even the first step: this is
miles in!

~~~
otabdeveloper4
It's only a tiny step above searching Github and copy-pasting.

Searching and copy-pasting isn't the hard part of software development.
Guaranteeing validity, safety and performance is the hard part.

This doesn't get us any closer to the goal. (In fact it's the opposite, a big
step back.)

~~~
aerovistae
This is something nobody's ever done before, and which nobody _could_ have
done before. How can you dismiss it so easily? If this is a step back, what
does a step forward even look like?

Generating code with guaranteed validity, safety, and performance is a worthy
goal, but how can you work on it without having an AI that can generate any
code in the first place?

I'm just endlessly confused by this mindset. If you wanted a palace and
someone built you the foundation, would you deride it as being nothing like a
palace and in fact a step AWAY from a palace?

~~~
danShumway
> If you wanted a palace and someone built you the foundation, would you
> deride it as being nothing like a palace and in fact a step AWAY from a
> palace?

Depends on whether it's actually possible to build a palace on top of the
foundation. If I know the foundation is going to crumble under the extra
weight of the walls, then yes, it's a step backwards.

This is a _fantastic_ achievement, and I think people who are saying it's
nothing new are kind of being obstinate. But there is real debate over whether
GPT is a foundation we can build on.

It's not as simple as just saying, "generate the code, and then we'll come up
with ways to make sure the code is correct." Fundamentally, that might require
us to generate the code differently than GPT does. GPT's foundation might
crumble under the weight when we try to put walls on top of it.

This is particularly worrisome with GPT because it's still a very active area
of research, so we don't know for sure that GPT's weaknesses aren't intrinsic
to its design. We could end up devoting a ton of time to pushing GPT to its
limits only to find out that the entire process has to be scrapped and that
we'll need to start over from the beginning.

I think people have a tendency to see something new and either only see the
capabilities or only see the weaknesses. There have been some startlingly
impressive things coming out of GPT-3, in particular the 'infinite' text
adventure someone posted a while ago. But all of those projects have also had
substantial weaknesses, and the weaknesses are forming a pattern across all of
the projects. There are certain tendencies that GPT seems to universally have,
around recycling content and going off on weird asides, that should have
proponents at least slightly worried, even while they rightly praise its
advances.

~~~
p1esk
_We could end up devoting a ton of time to pushing GPT to its limits only to
find out that the entire process has to be scrapped and that we'll need to
start over from the beginning._

This is perfectly fine. In fact it's the only way forward currently. Unless
you have some alternative that is more promising? GPT models are like large
convolutional networks in 2012: they were so much better than all existing CV
approaches at the time that it didn't make any sense to keep working on those
other approaches.

------
haltingproblem
This will work really, really well for schema design and generation: a very
structured problem with a reasonably small set of good solutions. You can
already do something similar with schema design tools, where you draw a
schema and the tool produces it. But doing it through a textual description
sounds even more impressive, especially if it can auto-magically fill in all
the fields I _might_ possibly need. Again, these are not seismic game changers
but productivity enhancers.
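To make the idea concrete, here is a toy rule-based sketch (not GPT-3; every name and pattern below is invented for the example) that turns a textual description into a schema skeleton. A language model would go further, guessing field types and the fields you forgot to mention:

```javascript
// Toy "description to schema" extractor: pull out patterns like
// "posts (title, body)" and build a table -> fields map.
function schemaFromDescription(description) {
  const schema = {};
  const tableRe = /(\w+)\s*\(([^)]*)\)/g;
  let m;
  while ((m = tableRe.exec(description)) !== null) {
    const [, table, fields] = m;
    schema[table] = fields.split(",").map((f) => f.trim());
  }
  return schema;
}

const schema = schemaFromDescription(
  "a blog with posts (title, body) and comments (author, text)"
);
console.log(schema);
// schema maps each table name to its list of fields,
// e.g. schema.posts is ['title', 'body']
```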

------
darepublic
I've seen numerous tweets from this guy and they frankly annoy me. If you are
about to bring down the hammer on my career, using, by the way, exclusive
access to an API that you have through connections, then just do it already.
Stop with this teasing-video bullshit that we've seen many times, as if you
were stirring up a frenzy at a games con with demo reels. Show me the money
or take your smug Twitter avatar and gtfo.

