Here's ChatGPT answering what the word "puppeteer" means in this article:

The metaphor of the "puppeteer" in the story could be interpreted in a variety of ways. The author doesn't explicitly state what the puppeteer represents, but given the context of the story and the themes it explores, there are a few possibilities.

One interpretation is that the puppeteer represents societal and cultural expectations that dictate how a person is expected to present themselves. The story touches on themes of family dynamics, pressure to conform and the negative consequences of living a fabricated life, which aligns with the experiences of many people who may feel compelled to hide their true selves in order to fit in.

Another possible interpretation is that the puppeteer represents addiction or substance abuse: the story mentions Jeff's use of magic mushrooms, and the narrator talks about the family's history with alcoholism and the pressure to conform. In this interpretation, the puppeteer could be seen as the addiction controlling Jeff's actions and behavior, and the way he presents himself to others.

It is also worth mentioning that the author could have intended for the metaphor to be interpreted in a more open-ended way, allowing readers to draw their own interpretations based on their own experiences and perspectives.




I haven't quite kept up with all the ChatGPT comings and goings, but examples like this are just stunning. This is the most incredible cognition "act" I've ever seen. It's impressive not just because it's convincing, but mainly because the answer is far superior to the one I'd give for the same prompt.

I am so impressed and so terrified of what will happen when the ne'er-do-wells start using this for whatever nefarious purpose they conjure.


It could make it much easier for a bad actor to use social engineering: pretending to be someone in a role that requires a certain amount of domain-specific knowledge and/or lingo.


I suspect that's simply a stock, regurgitated PR-speak answer that says nothing, with <insert keyword here> replaced by "puppeteer".

I've seen a zillion responses like this from lazy students (although, to be fair, the grammar here is quite a bit better). It should get a "D", but honestly I'd have a hard time throwing that down given the coherence of the grammar. I'd probably have to give it a "C" in comparison to the rest of the class.

This is why I really don't understand the fuss about GPT. It's just not that great.


Wow, talk about being this wrong, and holding this much of a minority opinion lol

dude it’s real. and it’s good. and you’re terrible.


Oh ChatGPT, what a nightmare you’re going to turn grading papers into for teachers.


Maybe the teachers can feed the papers back into ChatGPT and ask it what their grades should be, saving a great deal of time.
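Tongue-in-cheek, maybe, but the loop is easy to wire up. Here's a minimal sketch in Python, assuming the OpenAI client library with an API key in the environment; the model name, rubric, and suggest_grade helper are all hypothetical, not anything from the thread:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    RUBRIC = ("Thesis clarity (0-5), use of evidence (0-5), "
              "organization (0-5), grammar and style (0-5).")

    def suggest_grade(essay_text: str) -> str:
        """Ask the model for a rubric-based score suggestion, not a final grade."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; any chat-capable model works
            messages=[
                {"role": "system",
                 "content": "You are a teaching assistant. Score the essay "
                            "against the rubric and justify each score briefly."},
                {"role": "user",
                 "content": f"Rubric:\n{RUBRIC}\n\nEssay:\n{essay_text}"},
            ],
        )
        return resp.choices[0].message.content

    print(suggest_grade(open("essay.txt").read()))

A teacher would still have to sanity-check every suggestion, which is rather the point of the joke.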


Also a boon. I asked it for a 50 question multiple choice test on Canadian history and it assembled a remarkably good test.
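For anyone who wants to reproduce this, here's a rough sketch of the same idea through the API, asking for machine-readable output so the quiz can be reformatted or spot-checked. It assumes the OpenAI Python client; the JSON schema in the prompt is my own invention:

    import json
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},  # ask for parseable JSON
        messages=[{
            "role": "user",
            "content": (
                "Write a 10-question multiple-choice quiz on Canadian history. "
                'Return JSON: {"questions": [{"question": str, '
                '"options": [four strings], "answer_index": int}]}'
            ),
        }],
    )

    quiz = json.loads(resp.choices[0].message.content)
    for i, q in enumerate(quiz["questions"], 1):
        print(f"{i}. {q['question']}")
        for letter, option in zip("ABCD", q["options"]):
            print(f"   {letter}. {option}")

The answer key still needs a human pass; as a reply below notes, the model will happily mark a wrong answer as correct.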


A couple of nights ago my wife and I used ChatGPT to generate a bunch of report card comments. The results were fairly good as a starting point, but what was really interesting is that ChatGPT corrected terminology errors I had made in the prompts.

For example, I wrote "ppositional defiance," and in the response, ChatGPT corrected the term to "Oppositional Defiant Disorder (ODD)" along with a brief description of the disorder.

And, in response to my jokingly using the term "little monster" to describe a disruptive student, ChatGPT chided me for using the expression, calling it "utterly inappropriate."


It makes one wonder: do students need to learn how to write? The question has been asked before in a "simpler" context: do students need to learn long division? It seems that, if not now, then soon, the answer will be no.


Except in the narrowest sense of "writing" meaning to put words on a page in grammatical order, it can't really be compared to learning to perform a mechanical chore like long division. It's more comparable to learning numeracy. To learn to write is to learn how to structure thoughts in human language, just as to learn math is to learn to structure thoughts in a universal framework built on logic.

One could make the question more explicit: If machines can think, is there a need for students to learn how to think? If not, is there a need to teach anything at all?

Just my personal view: The presence of reasoning machines makes it all the more urgent for our species to learn reasoning skills. The passing on of knowledge has perhaps been our key evolutionary advantage. Without the ability to write (and to structure thoughts), future humans would be defenseless against whoever or whatever could do so.


Sure, basic writing is probably still important. But do students really still need to learn more nuanced things, for instance the form of an argumentative essay? That is (or was) a good skill to have. But now, or in the near future, one could just say "I want an argumentative essay for why we should build this feature" and have ChatGPT write it.


Eh, learning form in argument is exactly the same thing as learning to order one's thoughts. Would you do away with a lawyer's need to parse the law, or an engineer's need to learn control structures, because a computer could write a paper they wouldn't understand? We need to be teaching people to spot the flaws in AI arguments, and to spot when other people have used AI to write those arguments. There's no way to do that unless the students are competent at writing excellent arguments themselves.


My mother-in-law has ranted many times about how they don’t teach cursive anymore. She doesn’t know how to use a computer.

I think you raise a very interesting (and likely very controversial) thought to explore.

My quick opinion is that we can only teach so much. And the list of things that could be taught is forever growing. We need to be pragmatic about priority.


I just hate that we optimized for putting info into the computer (typing vs. cursive), when computers can now simply read handwriting. Why are we still so focused on typing? Make computers adapt to humans; don't train humans to accommodate machines.


Why do you think typewriters exist?

Print and cursive take about the same amount of effort, and both take far more effort than hitting keys. The average person can type 40 words per minute but handwrite only about 40 letters per minute; at roughly five letters per word, that's 200 letters per minute typed versus 40 written.


I agree, but then, I teach and remediate handwriting. Nowadays, more and more of my work is teaching humans to do what computers have learned to do: namely, to read cursive, whether or not they also write that way.

Kate Gladstone, author of READ CURSIVE FAST (https://nationalautismresources.com/read-cursive-fast/)


Typing is easier for most humans than writing, and it's not like writing is any less artificial.


I just tried a quiz of 15 multiple choice questions on New Zealand history, and I asked it to indicate the correct answer. It made a couple of glaring errors, which was disappointing.


So you’re saying you had to think, and that the AI did not take your job, but possibly alleviated some of the work in it?


No I am not saying that.


Just add “None of the Above” to each. ;)


I find ChatGPT throws out multiple options in almost every response, and they become formulaic, as in this example with three options. Sometimes the options are even numbered.


I never saw this answer until now, probably because it got flagged at first. But ChatGPT never ceases to amaze lately, even when it makes errors.


Mind sharing the specific prompt?


Why did this comment get flagged? It's great.


ChatGPT is what SWEs came up with to understand literature lol



