And how much does Discord commit to paying in damages if my face scan or ID scan leaks from their servers? Via security vulnerabilities or employees making some money on the side?
I've noticed this in my small-scale tests. Basically, the larger the prompt gets (and it includes all the previously generated code, because that's what you want to add features to), the more likely it is that the LLM will go off the rails. Or forget the beginning of the context. Or go into a loop.
Now if you're using a lot of separate prompts where you draw from whatever the network was trained on and not from code that's in the prompt, you can get usable stuff out of it. But that won't build you the whole application.
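To make the failure mode above concrete, here's a minimal sketch of that workflow: each feature request re-sends all previously generated code, so the prompt grows monotonically until it hits context limits. Everything here is hypothetical; `call_llm` is a stand-in for whatever chat-completion API you'd actually use.

```python
def call_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM API here.
    # We just echo the last line so the sketch is runnable.
    return f"# code for: {prompt.splitlines()[-1]}\n"

def build_app(features):
    code = ""
    prompt_sizes = []
    for feature in features:
        # The whole codebase so far rides along in every prompt.
        prompt = code + "\n# Add this feature: " + feature
        prompt_sizes.append(len(prompt))
        code += call_llm(prompt)
    return code, prompt_sizes

code, sizes = build_app(["login", "billing", "search"])
# Prompt size strictly increases with every feature added.
assert sizes == sorted(sizes) and len(set(sizes)) == len(sizes)
```

Separate, self-contained prompts avoid this growth, which is why they tend to work better; the trade-off is that the model then only has its training data to go on, not your actual code.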
> In fact, LLMs will be better than humans in learning new frameworks.
LLMs don't learn. The neural networks are trained once, before release, and it's a -ing expensive process.
Have you tried using one on your existing code base, which is basically a framework for whatever business problem you're solving? Did it figure it out automagically?
They know react.js and nest.js and next.js and whatever.js because they had humans correct them and billions of lines of public code to train on.
Wouldn't there be a chicken and egg problem once humans stop writing new code directly? Who would write the code using this new framework? Are the examples written by the creators of the framework enough to train an AI?
There's tooling out there that is 100% vibe coded and used by tens of thousands of devs daily. If that codebase found its way into training data, would it somehow ruin everything? I don't think this is really a problem. The real problem will be that people need to distinguish good codebases from bad ones; pointing out which code is bad during training makes a difference. There's a LOT of writing out there about how to write better code that I'm sure is already part of the training data.
How much proprietary business logic is on public github repos?
I'm not talking about "do me this solo founder saas little thing". I'm talking about working on existing codebases running specialized stuff for a functional company or companies.
Hmm? Where I live you have to renew your license (basically pass a few medical exams) every 10 years since the day you get your first license. Why wait until 70?
And in your mind NOW always means "since GenAI is a thing"?
Most of the time, when people realize something, it happens NOW. Also, AI isn't mentioned in the headline at all, and not even in the first part of the article. It's just used as one hint that it might be a scam, then followed up with further evidence.