
My wife is a university professor in the States. She said it's incredibly obvious who has started to use a GPT for the work they turn in. There is a lot of wrestling going on among the staff in her department over what they should do; some people want to outright ban anything that "seems" like it was written by an LLM. As my wife discussed with me, it's hard to know how much was written by GPT and how much was edited by the student: maybe they asked the LLM 50 questions and then copy/pasted the answers together themselves, or maybe they used it for the outline and filled in the details themselves. The lines can get pretty blurry.

One thing it seems most of the profs find annoying is... they're aware that GPT can invent convincing-sounding sources, so they're doing a lot more work verifying sources.

I commented this ^^ the other day on the post about UK students being allowed to use LLMs: https://news.ycombinator.com/item?id=35028224




I think your wife's colleagues are missing the forest for the trees here. This isn't going away. You can't stop it. Her university can't win an AI arms race, nor should it even try. I've already integrated ChatGPT into my workflow, and not only is my boss thrilled about it, he's actively embracing it and encouraging the rest of the people at work to do the same.

We're discussing approaching OpenAI about enterprise pricing so everyone can have the Plus version.

I'm able to take the bullshit grunt work of my job and essentially outsource it to ChatGPT. Then I go in and fix everything it fucks up - not just code, but poor wording, incoherent/incongruent sentences, etc.

University professors, high school teachers, middle school teachers - everyone - are all in for a rude awakening. You're going to have to get to know your students. You're going to have to scale down class sizes, because you have to now invest time in learning someone's writing style to know what's ChatGPT and what's their own voice. You're going to have to actually test for understanding, not just rote memorization. Our species is moving past that, thank God.

And more importantly, we're going to see college scaled back like it should have been already. Most people are not only not college material, they don't need to be college material. The vast, vast majority of jobs in the world can easily be done without a four-year degree and with a few weeks to a few months of on-the-job training.

We're long overdue to pass the buck back to companies to pay for training their workers, not dumping a $25,000 to $250,000 responsibility on would-be workers in the hopes they can stand out. ChatGPT will hopefully be a huge catalyst for this, and I think paired with crippling student loans, we will start to see a massive paradigm shift soon.


I agree with you that it's here to stay, and I'm sure my wife does too... but it's not so simple.

I can mostly figure out how to improve my work with modern ML, and I'm fairly tech literate... I think it's just a painful time for them. It's easy to say they're in for a rude awakening because it's obvious to us that they are. However, she's a history professor who can hardly use a computer. Between dealing with the bureaucracy of an inherently protectionist work environment, reading a zillion books, research, teaching, conferences, publishing, etc., etc., plus never really having focused on becoming technology literate, this is a lot to process.

My wife is still young relative to her colleagues, so I think she can deal with the shift, but that doesn't negate the fact that she's a (junior) member of a team, and this conversation is happening in every department, from architecture to philosophy to math. It's going to be interesting to see what happens.


Although my first post may not sound like it, I am in fact quite sympathetic to your wife and her situation (and I think she'll be fine if she adopts what I'm about to post).

> However, she's a history professor who can hardly use a computer. Between dealing with bureaucracy of an inherently protectionist work environment, reading a zillion books, research, teaching, conferences, publishing, etc, etc, plus never really focusing on becoming technology literate, this is a lot to process.

This is no longer an option for our species.

We've managed to dumb down computers to a level that people can't even figure out what operating system they're using (Windows 10/11 or macOS).

That's a problem. I don't care what anyone says; they're all objectively wrong. You *need* to have some idea of how the technology that shapes your life works. You don't need to be able to tear down a car, and you don't need to be able to explain how ChatGPT works, but just as you understand that the accelerator makes your car go and the brake makes it stop, you need some base idea of what's happening with your computer and the software you use.

We see this in just about every aspect of our lives. I wouldn't be able to file taxes for a multi-million dollar business with 300 employees, but I can file taxes for myself with a little bit of effort and reading (in fact, even trained professionals struggle with filing taxes, so maybe this wasn't the best example...).

I think you should have your wife make an account on OpenAI, and subscribe to ChatGPT, and try it out. Sometimes I literally just talk to it when I want to learn things.

I've never had to use Microsoft Outlook, ever, until I started working at my current job. I didn't know what an OST file was, or why they get fucked up so often, so instead of Googling and reading a bunch of shit, I just asked ChatGPT, "What is an Outlook OST file?" and it told me. And I went down an Outlook rabbit hole from there.

And the 4.0 and 5.0 versions are just going to be that much more spectacular. I think it's better for her to embrace this right now and get aboard this train than be caught on the tracks when it arrives at her station.


> I just asked ChatGPT, "What is an Outlook OST file?" and it told me.

Wow, I'm happy for you.

You realize though that the kinds of questions academic research tries to answer are a bit more nuanced than that, right? And not only that -- almost by definition -- when doing this kind of research, you don't just want to copy-pasta whatever turns up at the top of any automated search you do (whether it be through a search engine or a chatbot).

Right?


I think it's you that may not be realizing some things. There are a lot of bullshit degrees and bullshit fields that ChatGPT is going to destroy. And it cannot happen soon enough.

For the actual rigorous fields that require some work and effort, ChatGPT wasn't designed to replace those researchers. It wasn't designed to make sense of the data that that kind of academic research produces. It's designed to help you take the bullshit work out of the sense you've already made out of this research.

It's an assistant. That's all it is... right now.

It will eventually, and probably sooner than later, be much, much more.


> University professors, high school teachers, middle school teachers - everyone - are all in for a rude awakening. You're going to have to get to know your students. You're going to have to scale down class sizes, because you have to now invest time in learning someone's writing style to know what's ChatGPT and what's their own voice.

I’m pretty sure this is not a rude awakening. Everyone would like to work more closely with fewer students. The ones in for the rude awakening are the ones who will be paying for more instructors.


> The ones in for the rude awakening are the ones who will be paying for more instructors.

That's not going to happen. Too many free alternatives that can get the job done, and people are waking up to that.

*Administrators* might have to go find a way to be productive for once... but people are fed up with the insanely high costs of higher education. Especially given what it's producing.


> I've already integrated ChatGPT into my workflow

And your line of work is?


Some of the cheating using AI methods right now is very obvious because the writing is so poor. (For example, it's making things up, falling flat on argument, or using a grade school essay style.)

But what happens if generated poor content like this becomes a high percentage of what people are reading?

Will people start unconsciously mimicking these styles in their own writing? (This might be at least a temporary effect, as a technical writer mentioned they'd seen in their own work during a period when they were reading Hunter S. Thompson heavily. Getting a bit punchy for tech doc.)

And will people's cognitive and expression abilities outright decline? (Perhaps because they're exposed to lots of generated examples of argument and other communication, or because they leaned on an AI-method tool, rather than go through the thinking and perhaps learning exercise themselves?)

Maybe these effects will begin, and even be noticed, but rather than try to fix them, society will just lower its standards, and then others will try to profit from the problem existing and continuing to exist? (Look at journalism and political discourse. Or at software development.)


I truly think this will happen.

Most people I know, smart people included, really struggle with writing. People struggle to clearly communicate ideas, express themselves, form sentences and more.

I'd say the majority of people I work with are actually poor communicators, especially via text (email, SMS, etc.). Writing is a skill, and no one is "perfect" at it, including me. It only gets worse without practice. I know this because I'm learning another language, and if I take even a few weeks off, I lose it pretty quickly, even being at an intermediate level.

There was a period in my life when I picked up a pen to write something and realized it had been so long since I'd used one that I was actually intimidated and struggled. I was too used to keyboards. I don't see outsourcing writing itself to ChatGPT being much different.

I cringe a little when I read all these messages like, "Oh, I just use ChatGPT to respond to everything now and write all my code, etc." Well, it's good that it's saving you time, but you're also no longer exercising your ability to think, and that will be detrimental. Many of those people probably didn't have a yardstick to measure their own writing ability with in the first place, and now they just assume they're better off outsourcing to ChatGPT. It's a questionable position to take.

There might be a time in 10 years where people are told they should write because it's good for their brain, like how we have to tell people who don't do physical labor they need to exercise...funny stuff.

Lastly, there must truly be something to be said for the fact that ChatGPT, at least currently (maybe it will train against itself soon), is trained on human inputs. If that trend continues and everything is ChatGPT, where does that leave us? With a stagnant, flat way of communicating and stale ideas?

I feel like everything is becoming a little dystopian lately, ha!


I'd be quite worried if I could automate my job with ChatGPT. ChatGPT will never be worse than it is now, after all, and never less user-friendly either. If I were a glorified UI for ChatGPT, I guess my boss would be looking into cutting out the middleman.


I'm in grad school right now. I use ChatGPT to convert first drafts into polished final drafts by asking the following questions:

- Is this text convincing? How can it be improved?

- rewrite the following passage for clarity. Target an audience of X level.

It's like a more efficient version of Grammarly; I don't have any qualms with this use. It saves hours of debating phrasing, and it acts as a TA for getting initial feedback on whether you're missing the mark.
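If anyone wants to script that instead of pasting into the chat window, here's roughly what it looks like against the API. This is just a rough sketch written against the 2023-era openai Python package; the model choice, the draft.txt file name, and the audience level are placeholders for whatever you actually use.

    import openai  # pip install openai (the 2023-era SDK)

    openai.api_key = "sk-..."  # placeholder; substitute your own key

    def ask(prompt: str) -> str:
        # One chat turn; gpt-3.5-turbo is just an example model choice.
        resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp["choices"][0]["message"]["content"]

    draft = open("draft.txt").read()  # hypothetical file holding the first draft

    # Pass 1: TA-style feedback on whether the draft misses the mark.
    feedback = ask(f"Is this text convincing? How can it be improved?\n\n{draft}")

    # Pass 2: the actual rewrite, pitched at a stated audience level.
    polished = ask("Rewrite the following passage for clarity. "
                   f"Target an audience of graduate-committee level.\n\n{draft}")

    print(feedback)
    print("---")
    print(polished)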


If you are in graduate school, you should learn how to write. You will not always have the luxury of an internet connection when you need to make your point clear to your audience.


This argument is basically the same as the teachers' old argument that "you won't always have a calculator in your pocket." And I suspect it will age equally badly.


That argument hasn't really aged badly, though. People do still study math in school... you still need to learn arithmetic. Not because you'll necessarily need to be able to do arithmetic without a calculator, but for the cognitive skills it trains you in.


I was also terrible at arithmetic; my major was physics :shrug:. I was also terrible at cursive, and prone to getting lost prior to GPS. I also never learned to carve a spear without modern power tools.

I'm either a fool who's able to get by with modern technology - or we're freeing up mental capacity to work on new ideas. ChatGPT is yet another tool, one which is demonstrating productivity improvements comparable to the launch of the internet IMO.


This has not aged badly at all; you now have people who cannot quickly add numbers together, or even figure out fractions. Example:

My mom (Rhonda, 64) is the shop steward at a deli counter and had this conversation with her fellow employee Janice (25):

Janice: "Hey Rhonda, they want 2/3 of a pound of cheese. How much is that on the scale?"

Rhonda (mom): "Janice, that's .66 on the scale."

Janice: "Yea but why is it .66 on the scale if they want two thirds?"

Rhonda: "Janice, you don't have to worry about that, just know that two thirds is .66."

Janice: "Yeah, but HOW do you know that?"

Rhonda: "..."

I'm sure someone will say, "Your mom is wrong; she should teach her fractions! What a huge jerk!" Note that this person has gone through high school, and children do this math regularly. What I am saying is: being numerate is useful at all levels of life, not just as an academic exercise.
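(For what it's worth, the whole mystery Janice was stuck on fits in a few lines of a Python session; the .66 is presumably just the scale truncating two thirds to two decimals:)

    >>> 2 / 3
    0.6666666666666666
    >>> round(2 / 3, 2)          # rounded to two decimal places
    0.67
    >>> int(2 / 3 * 100) / 100   # truncated, like the scale's .66
    0.66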


People should learn how to use their computer everywhere.


Sure, let me bust my phone calculator out in a deli with customers while wearing food service gloves.

The mental model is very easy to break down, and you only have to learn it ONCE.

Here's the difference: you know fractions everywhere, and you don't have to think about them for more than 2s without your phone, because you are math literate. OR: every time someone asks you something, you take 5s to pull out your phone, type in the numbers (hopefully correctly), and get a decimal answer that you can hopefully interpret.

Which do you think is a better employee overall?

I have a particular axe to grind on this one, as the response "well I'll just look it up!" is so glib it throws the entire house out with the bathwater and the baby inside :).

For day to day math, needing a calculator to do the basics is like walking instead of flying. Mastery of the basics really pays off, especially for young students.


I'd argue that using language is more innate than calculating things, and that practicing it may well contribute to being able to form more coherent thoughts/ideas. After all, language seems to be one of the biggest things setting humans apart from animals.


It's funny you're trying to use this argument with me :) The value of mental math was instilled in me as a young man. When I'm out with family buying things and they read out fractions or add up costs, they've developed the habit of deferring to me for the mental arithmetic, because taking out their phones to add things is tedious. Mental arithmetic is easier and requires no time punching in numbers.

I have never used a calculator app on a phone - ever; at least, I don't remember the last time I did. On the rare occasions I need to add up a large set of numbers, I get in front of a computer and use a Python REPL or a spreadsheet, but even then I "pre-stage" the calculation by doing approximations in my head, so I'm sort of ready.
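To make "pre-staging" concrete (made-up numbers, obviously): I round each figure in my head first, so the REPL result just confirms what I already expect:

    >>> prices = [18.49, 21.95, 4.10, 37.00]
    >>> # head math first: 18 + 22 + 4 + 37 is roughly 81
    >>> round(sum(prices), 2)
    81.54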

Not letting your tools think for you is how you distinguish a professional from someone who can just push buttons. Don't do yourself the disservice of never really learning to the limit of your abilities.


"because taking out their phones to add things is tedious. Mental arithmetic is easier and requires no time punching in numbers."

This is a hilariously bad take. I'm sorry, how long does it take to "punch in numbers"? Is it possible to fail touch-typing class on a phone?

The value in understanding mathematics is not the ability to be able to recite the quadratic formula or to be able to perform long division in your head, it's knowing when to use the formulas and understanding their applications in your daily life and work.


Yes, it is quicker... I don't know if it's because you don't have such an experience to compare with, but it is quicker and easier. If your hands are full of things, then it really does save a few seconds.

>The value in understanding mathematics is not the ability to be able to recite the quadratic formula or to be able to perform long division in your head, it's knowing when to use the formulas and understanding their applications in your daily life and work.

You can do both, really. You should be able to recite the mathematics you use in your daily work without having to look it up. I've never met a good scientist who wasn't able to.

EDIT: okay, maybe one exception is something like the tedious calculations from QFT, but you should be able to gesture at the general parts of it, and you should be able to recite certain simple equations or particular approximations. If you have to refamiliarize yourself with your work every time you look at it, I don't know how you can advance in your research.


I thought the same thing, but then I also realized that it might be nice if people actually enjoyed writing, enjoyed the process, enjoyed getting better at it, and, who knows, even... used their brains a little bit more.

Even though I find ChatGPT impressive, it still doesn't write what I want it to write in the way I want it written.

In the same way, DALL-E and Midjourney don't really draw what I want them to draw. They draw something, and I can accept that it's "good enough", but it's a different experience from creating something yourself.

ChatGPT writes something, but it's not me. While it might save me time, it's not as fun.

I think learning to enjoy the process is important because, honestly, that is all there is to life.


That is what I consider part of the process. You learn to write because it is a requisite skill and part of your craft, but eventually, as a professional, it becomes part of what you use to distinguish yourself. It is a part of your professional persona, in the same way writing in a creative sense (blogs, novels, etc.) is.

It's one thing to automate dry responses (email bots have been doing that for decades now), but automating the writing in your career work leaves little of your career left, and further, it dilutes the value you add for everyone else you work with.


Wow, props on getting that username!


Sounds like they're just using it to touch up an existing draft. They've come up with the fundamental ideas and to me the writing is merely ornamentation and window dressing.

I used to be an ESL instructor in another life and as I've always said to my students, "it doesn't matter how many languages you know if you've got nothing to say."


aye - I learned that lesson the hard way many years ago. As with all things ChatGPT, it does better with an expert user. Prior to ChatGPT, I learned how to write well, both for my job and then for grad school. ChatGPT is just a time saver.


LOL, anywhere on the planet where you would actually need to clarify a point to an audience, you'll have an Internet connection.

Do you honestly think you'll get into a philosophical debate so incredibly challenging that you'd need to fall back on ChatGPT to argue on your behalf in the middle of, say, the Australian bush?

And even if you did, RV Starlink is there to save the day.

ChatGPT doesn't "do work for you". ChatGPT is like having a subject matter expert by your side to guide you. And like every single person who has access to a subject matter expert, you're going to take away whatever you want, and leave whatever you don't. Lazy shitbags will be lazy shitbags with or without ChatGPT. But inquisitive people who want to soar with the eagles have an impossibly powerful tool at their disposal.


Your committee will be real impressed if you whip out a phone during your oral examination and ask to talk to ChatGPT first.


Then they will argue that the examination method is outdated


An oral exam or dissertation defense is a high-stakes version of things you tend to do a lot: giving talks, teaching, etc.


Until your phone dies. The point is some skills you either have, or you don't.


It sounds to me like they are learning how to write; ChatGPT is the tool they are using to do so.


I think they know how to write if they're in graduate school.


> some people want to outright ban anything that "seems" it was written by a LLM

... The false positives won't be pretty. Especially for ESL students whose grasp of the language might make them sound a little too artificial.

> it's hard to know how much was written by GPT and how much was edited by the student

The smartest students won't even be suspected of using an AI model. They'll use the AI to do directed work, proofread and re-write as needed. Not really different than going through three revisions with different mentors for their college entrance essay...


As a recent graduate, I read plenty of my peers' papers that were obviously written in an hour at 10pm, talking in circles with no substance. This is the level of output of these LLMs, and frankly I think all of them should be getting C's.


> The smartest students won't even be suspected of using an AI model. They'll use the AI to do directed work, proofread and re-write as needed. Not really different than going through three revisions with different mentors for their college entrance essay

If that’s what they’re using it for I propose that they’re meeting the learning objectives just as much as someone who did those things in a study group with their friends. Of course how acceptable that is depends on the context of the work.


I think one of the other things that could be valuable would be asking students to write fewer, longer-form assignments. ChatGPT and friends aren't very good at staying on topic for very long without becoming repetitive; I think that's probably a side effect of the number of tokens available. I don't think ChatGPT could reliably write a detailed essay of 5k words or more. It might be able to produce some useful snippets to copy and paste, but I think the tone would be weirdly disconnected, which might make it more obviously not written by a person.

I’d be interested if anyone has tried this and what their experiences were.


I found it interesting that best practices are starting to emerge for professors and teachers on how to deal with ChatGPT in the classroom [1].

One I really liked a lot:

> And if you don't trust the AI to output correct content, teach your students critical thinking by using Assisted Teaching to write history essays and let your students find mistakes in it.

Another way (if you want to go a more traditional route) is the Flipped Classroom [2] method, in which students self-teach at home and do the "homework" in the classroom.

I agree with others here that there is no way to ban these technologies; there are just better and worse ways of dealing with them.

[1]: https://assistedeverything.substack.com/p/the-age-of-assiste... [2]: https://en.wikipedia.org/wiki/Flipped_classroom


But there is also quillbot.com, which has been around for some time and is used by students to paraphrase text they copy & paste from other sources. IMO this is worse.


Students should be rewarded for checking sources, and (moderately?) penalized for not doing so. I would say that, at a minimum, every fictitious source knocks off a full letter grade. And make sure students know this. They will adapt. And it might get them to look at the sources, at least superficially (yay?), to check for existence/non-existence.


Interview the student.

That's literally all it takes... but oh no, it's too expensive, must figure out how to automate testing... while fighting against the automation of answering.



