"It’s about maintaining the human element in a craft that’s increasingly automated."
I mean, what can anyone do, anyway? We’ve been on a "quest" toward the total automation of work for decades! And unfortunately these reflections are coming far too late.
Didn’t anyone notice what was happening all these years?
A musician friend of mine pointed out that today, studying, producing, and releasing music is almost volunteer work: the vast majority of artists will likely see no return on their investment, especially with AI flooding the music platforms. I really expect the same to happen to many other jobs.
> majority of artists will likely see no return on their investment
I wonder if music is the best example, because if I recall it has always been like this for musicians. Never have I heard, in my time or my parents' or grandparents' time, that musician was a career you would get into for the money.
When I was young I got to meet a lot of the aging jazz musicians of the 1930s in Kansas City. It absolutely was a career here. Granted, that’s a distant memory for most people.
Going this route, what’s the point of learning anything if everything is instantly accessible from an AI with working solutions? So no learning means no teaching, or teaching that feels useless. That’s a weird and dangerous road. Everyone should own this technology for the situation to be balanced, not private companies or individual countries. Because we make ourselves kind of useless in the process, we lose leverage and value, and we end up at the mercy of the powerful ones.
Volterra also contributed to materials science, more precisely with dislocations in crystals. It always amazes me how people in the past could make a huge impact in totally different fields.
There's a "Altered or synthetic content" notice in the description. You can also look at the rest of the channel's output and draw some conclusions about their output rate.
(To be clear, I have no problem with AI-generated music. I think a lot of the commenters would be surprised to hear of its origin, though.)
We certainly improve productivity, but that is not necessarily good for humanity. It could even make things worse.
E.g.: my company already expects less time for some tasks, given that they _know_ I'll probably use some AI to do them. Which means I can humanly handle more context in a given week if the metric is "labour", but you end up with your brain completely melted.
We certainly produce more output, but if it's overall lower quality than before, is that really "improved productivity"?
There has to be a tipping point somewhere, where faster output of low-quality work actually decreases productivity, due to the effort now required to keep the tower of garbage from toppling.
I am a programmer, and my opinion is that all of the AI tooling my company is making me use gets in the way about as often as it helps. It's probably a net negative overall, because any code it produces takes me longer to review and verify than it would take to just write it myself.
> It's not up for debate. Ask any programmer if LLMs improve productivity and the answer is 100% yes.
Programmer here. The answer is 100% no. The programmers who think they're saving time are racking up debts they'll pay later.
The debts will come due when they find they've learned nothing about a problem space and failed to become experts in it, despite having "written" the feature dealing with it and despite owning it.
Or they'll come due as their failure to hone their skills in technical problem solving catches up to them.
Or they'll come due when they have to fix a bug that the LLM produced: either they'll have no idea how, or they'll manage to fix it but then have to explain, to a manager or customer, that they committed code to the codebase that they didn't understand.
I think the core of the 'improved productivity' question will ultimately be impossible to answer. We would want to know whether productivity improved over the lifetime of a society, perhaps hundreds of years, and we will have no clean A/B test from which to draw causal relationships.
This is exactly right. It also depends on how all the AGI promises shake out. If AGI really does emerge soon, it might not matter anymore whether students have any foundational knowledge. On the other hand, if you still need people to know stuff in the future, we might be creating a generation of citizens incapable of doing the job. That could be catastrophic in the long term.