
If it's reliably generating working code then that isn't bullshit! (Ignoring, of course, other things about the code that might be relevant to the assignment, like coding style or efficiency.) What I'm saying is that if you are looking at the AI's output and judging that it's bullshit, and if you can't distinguish that output from your students' satisfactory essays, then that by definition means that the assignment was to produce bullshit.



This is a pretty dumb take and it's repeated in this thread. The goal of the assignment is not the result. I mean, the professor can probably write a better essay than some kid who just learned that this subject exists. The point of the exercise is to have the student learn how to do the work, so they can do it when it's not a simulated exercise.


You’re missing the point. Even if you disagree with the point, it’s important to understand it.

If the goal is “to have the student learn how to do the work”, and there is a tool they can use to do so, then using the tool is doing the work.

Your position only makes sense if you define “the work” to also include exactly the process you personally learned. No fewer tools (did you learn on a word processor?), no more tools (is spellcheck OK?).


Even if language models exist that can generate text for you, it is still very useful to be literate yourself. At least it is for me.


I think you're missing the elephant in the room: we learn _to_ learn and be able to adapt, not to "do the work".


Um. So kids adapting to use LLMs and learning how to prompt them to get the desired results is evidence that they aren’t learning or adapting?

Doesn’t that feel a little... odd?


When you are told to write an essay about WW2, it's not because your teacher needs info about WW2 but because they want you to read, parse, search, and filter information and organise it in a logical manner. If all you do is type a question into ChatGPT, you learn none of these things, which will be very useful in life in many situations where you won't be able to ask your AI overlord for a quick answer.

You go from being a Swiss Army knife to being a butter knife without a handle. It's all fun and games as long as you're asked to cut butter, but when you have to cut a steak or open a beer bottle you'll have a hard time.


That's not how learning works in the slightest. Both outputs may be of similarly low quality, but if the student learned a few things in the process of writing that low-quality essay then it might have been worth it anyway. Maybe the student didn't fully reach the knowledge or understanding needed for the highest grade, but the attempt still improved the way they think about the subject. Maybe after a few days in the back of their head it will click, or maybe it will click after reading something tangentially related, or maybe it will help them find gaps in their knowledge they weren't even aware of before trying to actually complete the assignment (gaps they couldn't have spotted just by reading the homework tasks).



