My experience working with AI agents is that, unless directed explicitly, they tend to be verbose and to overcomplicate things by default. That may be the reason for this discrepancy.
There is a notable difference between, say, doing long division on a calculator and prompting an AI to compute the derivative of a simple continuous function. Taking the derivative requires _understanding_ of the function, while the AI just skips the understanding and returns the required derivative.
One is just a means of skipping labor-intensive, repetitive work, while the other skips the entire point of _why_ you are calculating in the first place. What is the point of dividing two numbers if you don't even understand the reason behind it?
I'm not quite sure I understand the logic of this, or how people don't see that "well, now everyone is going to be dumber because they don't learn" has been a refrain during literally every major technological or industrial revolution. Computers? The internet? Calculators?
The skills we needed before are just no longer as relevant. That doesn't mean the world will get dumber; it will adapt to the new tooling and paradigm we're in. There are always people who dislike the big paradigm change and are convinced it's the end of the "right" way of doing things, but those complaints always age terribly.
I find I learn an incredible amount from using AI + coding agents. It's a _different_ experience, and I would argue a much more efficient way to understand your craft.
100%. I have been learning so much faster as the models get better at both understanding the world and explaining it to me at whatever level I'm ready for.
Using AI as just a generator is really missing out on a lot.
Integration and differentiation, even before LLMs, were already tasks you were usually better off handing to a machine. It's far more important to understand what the operations represent than to derive the exact closed form of the result yourself, because the actual process of doing it is almost always tedious and mechanical and doesn't give you much insight into the equation you are working with.
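For concreteness, here is a minimal sketch of what "getting a machine to do it" looks like, using sympy (a Python computer algebra system); the function f(x) = x^2 * sin(x) is just an arbitrary example, not something from the thread:

```python
# Minimal sketch: let a computer algebra system do the mechanical work.
# f(x) = x**2 * sin(x) is an arbitrary example function.
import sympy as sp

x = sp.symbols('x')
f = x**2 * sp.sin(x)

df = sp.diff(f, x)      # product rule, done for you: x**2*cos(x) + 2*x*sin(x)
F = sp.integrate(f, x)  # integration by parts: -x**2*cos(x) + 2*x*sin(x) + 2*cos(x)

print(df)
print(F)
```

The machine grinds through the product rule and integration by parts; knowing that the derivative is a rate of change and the integral an accumulation is the part that still has to live in your head.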
I'm not knowledgeable about how transformers work, but what if we humans just do the same thing in our minds? What if our feeling of "understanding" is merely the emotional response to pattern matching, as you just said?
Yeah, you said it. It is the _feeling_ of understanding, and feeling/sensing implies consciousness. Why does it matter? I don't know. All I know is that it's not the same thing, because a chunk of metal cannot feel, so I don't want it to be called by the same name.
When AI marketing (ab)uses the word, it's to project the appearance of human equivalence, and I'd rather not fall for it.