If that's true, it would be a sad outcome; I believe people would react against such an artificial world.
In a music DJ context: even if an AI was able to mimic the dopest turntablist moves, factor in layers of depth and groove, and create unique mixes, it would still be an artificial mix made by AI, and so not as valuable or worthwhile as one by a human DJ. That doesn't mean AI DJs or musicians won't be successful; they just won't be human and can never be human, and that means something.
Jeffy's story is a cautionary tale for our times, and apparently goes like this: a young, talented programmer makes something that hits the zeitgeist, but it quickly unravels as the lure of riches corrupts.
Zerebro was building an AI agent framework (+ the obligatory coin) and also released AI-generated music. The music is lowest common denominator AI pop with cringe lyrics [1] but at one point achieved a respectable ~40k monthly listeners on Spotify.
Jeffy hit the crypto podcast scene (see his Blocmates interview [2]) and came across as quite earnest, but there was a lot of hype too, for sure.
One thing these thread comments may be overlooking is that Jeffy may have been genuinely frightened of being harmed, perhaps even kidnapped; hence the fake death. Crypto kidnappings are more prevalent in Europe, so maybe they're going to take off in the USA now too ...
I know that large organizations are often slow to adapt to changing circumstances, but come on! Apple appear to be suffering from a severe case of decision paralysis with respect to AI.
> James [...] sent along work at the last minute with no connection to the artist involved (a random tape of gabba from his studio was handed over to an Atlantic rep for a Lemonheads remix)
... but I think we may be heading for a new 'golden age' of web animation and gratuitous creativity. Personally, I'm happy to see more crazy animated stuff, it's the corporate dark patterns and bad UX that I hate.
Thanks! Sounds like a reasonable prediction. To me, crazy animated stuff in the wrong context is a component of bad UX. Though I learned web design by interning under a literal Nazi, so my design opinions may be a bit...extreme.
Perhaps I could make room in my heart some day for animated cats on personal sites. Clippy is still pushing it, though more because of a bunch of bad memories of trying to support people who were infuriated by it, or on a few occasions having to go to the trouble of opening Word just to disable it on several machines in a day, than because of its actual aesthetics. In my memory it looks more like an image search for "evil Clippy" (didn't think to try that until now, some pretty funny stuff).
Completely agree that corporate dark patterns are a much greater concern. That's why, except for Clippy, I like this project. It puts the tool directly in people's hands with no need for tech skills or cloud gatekeepers and spying.
Tangentially, I just realized that this nicely self-contained Clippy might be able to copy itself. It doesn't have to be able to write an LLM, just copy (or worse yet upload) one file and execute it. Like Agent Smith. But Clippy.
Sounds like you were a bit traumatized; I can see how Clippy could become the stuff of nightmares!
Recently I've been playing with WebGL, Three.js, SVG, and CSS animations. It's all so accessible with AI coding that one's creativity naturally becomes the main thing. Sometimes I'm even vibe coding on my phone. It's a lot of fun, and so yes, I'm sure we're also going to get a lot more annoying stuff that gets in the way.
You're not entirely wrong, though I was also playing it up a little. Meant to be read more like a standup comedy bit, some part honest opinion and some part exaggeration. Someone told me a long time ago that laughing at my own jokes isn't funny, and ever since, nobody including me is ever quite sure how serious I am or not. :)
I'm not so opposed to vibe coding as recreation. Though if you're ever interested, those are all pretty easy and fun things to work with directly in my opinion, at least at hobby scale. Well, maybe not bare webgl, but that's why three.js is in the list.
AI sure does it a lot faster than I can though. I totally get your point that it lowers the bar to entry, and that the speed and near-instant mutability is more conducive to creativity. I'm more opposed to the semantically inscrutable term "vibe coding", but it seems pretty well entrenched already.
This is how I feel too. For me it feels great to have a powerful camera in my pocket wherever I go. I enjoy being 'present' within my surroundings and open and observant for potential images. This practice feels almost like a meditation, and capturing photographs in my local neighborhood as well as anywhere else is part of the experience of living my best life.
It's been obvious since ChatGPT blew up in early 2023 that educators had to rethink how they educate.
I agree that the situation the author outlines is unsatisfactory, but it's mostly the fault of the education system (and by extension the post author). With a class writing exercise like the author describes, of course the students are going to use an LLM; they would be stupid not to if their classmates are using it.
The onus should be on the educators to reframe how they teach and how they test. It's strange how the author can't see this.
Universities and schools must change how they do things with respect to AI, otherwise they are failing the students. I am aware that AI has many potential and actual problems for society but AI, if embraced correctly, also has the potential to transform the educational experience in positive ways.
> they would be stupid not to if their classmates are using it.
Why would they be stupid? Were people before LLMs stupid for not asking smarter classmate/parent/paid contractor to solve the homework for them?
A large part of education is learning things that can be easily automated, because you can't learn hard things without learning easy things. Nothing has conceptually changed in this regard; Wolfram Alpha, for example, didn't change the way differentiation is taught.
Agreed, bad choice of words on my part. I really meant 'stupid' from a slightly ironic, competitive point of view. It's like the pressure to cheat in professional sport: it's obviously so intense that I'm sure a lot of a cheat's motivation is simply to remain competitive. Their colleagues are cheating, so they feel they have to as well, otherwise they lose.
YMMV; at my uni, cheating on homework has more or less zero impact in most classes, since homework carries little weight in the final grade and is graded very generously.
I agree that it's not ideal when assignments that weren't designed with external sources in mind significantly impact the final grade. But I think this is a minor and easily fixable point rather than some failure of the whole education system.
What's funny about this is that if you're an athlete who cheats constantly while growing up, you'll never develop the skills to make it as a pro. It's interesting how students don't see the same situation with their homework.
> Were people before LLMs stupid for not asking smarter classmate/parent/paid contractor to solve the homework for them?
In American universities where your GPA from your in-class assessments forms part of your final grade? Yes, absolutely.
Where I came from you do your learning in class and your assessment in a small, short set of exams (and perhaps one graded essay) at the end of each year. That seems far more conducive to learning things without having to juggle two competing objectives the whole time.
> In American universities where your GPA from your in-class assessments forms part of your final grade? Yes, absolutely.
Whether not doing everything to maximize your GPA is "stupid" (literally or figuratively) is a good question too.
But even if your assignments influence your GPA it's rarely the only thing that does, and not doing assignments will harm your ability to perform in midterm/exam/whatever.
Amusingly, when I asked o3 to propose changes to the education system which address the author's complaints wrt writing assignments, one of the first things it suggested was transparent prompt logging (basically what the author proposes).
It seems to me that the little sister is acting rationally. Yes, it's a bit lazy, but then: when was the last time you used a calculator for a calculation you could have done in your head? Do you really need to be able to do mental arithmetic, to work out what 24 + 7 is in your head?
I don't know what the answer is. I'm old school, if it was up to me I'd bring back slide rules and log tables, because that's such a visual and tactile way of getting to know mathematics and numbers.
It's interesting to consider how AI is affecting humans' cognition skills. Is it going to make us stupid or free us up to use our mental capacities for higher level activities? Or both?
One thing missing from the reddit story is how the author knows their sister is using ChatGPT to obtain the answer as opposed to validating work. I'll often punch a calculation I've already done in my head into a calculator to confirm for myself I landed on the right answer. Or come up with a solution to a problem, find myself wondering if there's a better solution or just feel like it doesn't quite look right and search for solutions online to compare and contrast. I'm not using the calculator or search to replace my work, I'm using it as the sniff test. But you couldn't determine that just by knowing my inputs to the calculator and google. You'd have to see the entire process.
Only in a few fields. You can have a successful career publishing papers in the social sciences, or justifying decisions in middle-management, without being able to know whether an effect size passes the sniff test - actually not knowing will probably help you make convenient mistakes.
Answers of 41, or 94, or 3 and 4/7 for 24 plus 7 are all obviously wrong, but only if you have developed number sense. I've had students who wouldn't bat an eye at any of those solutions.
> With a class writing exercise like the author describes, of course the students are going to use an LLM, they would be stupid not to if their classmates are using it.
It's only stupid if you optimize for the wrong things (finishing quickly, just getting a pass).
I'd say it's very smart if you don't rely on LLMs, copy the homework from someone else, or similar; because you're optimizing for learning, which will help you more than the various shortcuts.
Why isn't both a valid option? Can't one be at a university both to learn and to get a degree and GPA showing they did well at it? In any case, why does a student's choosing to be there to learn negate the university's responsibility to provide the best curricula for doing so?
The assumption you are making is that the best curriculum requires students to consistently use LLMs. That's a big assumption. There may be educational value in forcing students to learn how to do things on their own that an LLM could do for them. In that case, it's the student failing their education if they circumvent it with an LLM, not the institution failing them.
At times there is value in that kind of approach, especially when the learning is specifically about those kinds of lower layers instead of higher concepts. As such, it's not that certain tools should be used in every assignment or never ever used, just whether they should be used commonly.
For most continued learning it's better if the university uses calculators, compilers, prepared learning materials, and other things that do stuff on behalf of the students instead of setting the bar permanently to "the student should want to engage everything at a base level or they must not be here to learn". It allows much more advanced learning to be done in the long run.
> For most continued learning it's better if the university uses calculators, compilers, prepared learning materials, and other things that do stuff on behalf of the students instead of setting the bar permanently to "the student should want to engage everything at a base level or they must not be here to learn".
IMHO, the example of using calculators in a learning environment is a great topic to explore.
Using calculators in a university setting is entirely reasonable, as students are expected to have already mastered the math that calculators automate. Formula calculators are also fine since, again, the expectation is that a student capable of defining the formulae understands what they are and when to use them.
Now, contrast the above with using calculators in elementary school, where basic math is an entirely new concept and the subject being taught. Here, the expectation is students learn how to perform the operations themselves through varied exercises, questions to the instructor, and practice.
> It allows much more advanced learning to be done in the long run.
Only if the fundamentals have already been established. Which leads back to my original question:
While I agree with nearly everything you say here, 90% of college isn't about sticking to the fundamentals the way elementary students initially avoid calculators. This leads to the disagreement at the heart of this discussion, the statement "Am I here to learn or to get a passing grade?". We both agree students need to be there to learn to get anything useful out of the university; what we're disagreeing on is how they best do that for the majority of university-level content.
Universities do not provide knowledge, this is a romanticization of an ideal. They are about getting a passing grade and a certificate so that you can enter the workforce. The idea that you go to college to get an education is just a polite fiction to appease students and their parents
> Universities do not provide knowledge, this is a romanticization[sic] of an ideal. They are about getting a passing grade and a certificate so that you can enter the workforce. The idea that you go to college to get an education is just a polite fiction to appease students and their parents
University is like a supermarket.
For some, they go there with a loose idea of what they want only to find ingredients not previously considered, often ending up with a better dining experience because of it.
For others, it is aisle after aisle of crap "Uber Eats" can deliver already made and without the hassle of having to cook it.
Seems like you missed the point of the article. The author is saying that if you treat the class/lesson only as a means to an end, where the goal is to get a diploma, then you're not actually getting an education. If you're using an LLM to do the work for you, even if the other students are too, then you're just following all the other lemmings off the cliff.
I think it makes a lot of sense to employ various specialized LLMs in the software development lifecycle: one that's good at ideation and product development, one that fronts the organizational knowledge base, one for testing code, one (or more) for coding, etc, maybe even one whose job it is to always question your assumptions.
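A minimal sketch of what that could look like, just to illustrate the shape of a role-specialized setup. Everything here is hypothetical: the roles, the system prompts, and the `route` helper are invented for this comment, and a real system would wire each agent to an actual model endpoint rather than a dict lookup.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Agent:
    """A role-specialized LLM: could be the same base model with different framing."""
    role: str
    system_prompt: str


# Hypothetical roles from the comment above; prompts are illustrative only.
AGENTS = {
    "ideation": Agent("ideation", "Brainstorm product directions; favor breadth over polish."),
    "knowledge": Agent("knowledge", "Answer only from the org knowledge base; cite sources."),
    "coding": Agent("coding", "Write small, idiomatic, well-commented changes."),
    "testing": Agent("testing", "Generate edge-case tests; assume the code under test is wrong."),
    "critic": Agent("critic", "Question every assumption the user or other agents have made."),
}


def route(task_type: str) -> Agent:
    """Pick the specialized agent for a task, defaulting to the coder."""
    return AGENTS.get(task_type, AGENTS["coding"])


if __name__ == "__main__":
    print(route("critic").system_prompt)
```

The interesting design question is less the dispatch (trivial, as above) and more whether the "critic" agent gets to see and veto the output of the others.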