Prior to GPT, word processors would already catch when your grammar was off across multiple words and offer suggestions to fix it. As students start using the new GPT-enhanced word processors, those suggestions are only going to get more powerful.
Professors have a tough job ahead, and they aren't always smart enough to use these GPT detectors[1].
I think one solution would be a word processor that records the process of writing a paper, so you turn in your paper along with the recording. Of course this is going to create added stress, but what else can we do? There are going to be GPT scramblers that remove watermarks.
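As a rough sketch of what I mean (the log format here is entirely made up, just to illustrate the idea):

```python
# Minimal sketch of the "recording word processor" idea. The log format
# is invented here purely for illustration.
import json, time

class RecordingBuffer:
    def __init__(self):
        self.text = ""
        self.log = []  # one entry per edit, with a wall-clock timestamp

    def apply(self, pos: int, delete: int, insert: str):
        """Record an edit, then apply it to the document text."""
        self.log.append({"t": time.time(), "pos": pos,
                         "del": delete, "ins": insert})
        self.text = self.text[:pos] + insert + self.text[pos + delete:]

    def export(self, path: str):
        """Write the full edit history to turn in alongside the paper."""
        with open(path, "w") as f:
            json.dump(self.log, f)

buf = RecordingBuffer()
buf.apply(0, 0, "Drafts usually start rough")
buf.apply(7, 7, "often")  # revise "usually" -> "often"; the log keeps both
buf.export("paper.edits.json")
print(buf.text)  # "Drafts often start rough"
```

The graded artifact then becomes the paper plus its edit history, which is much harder to fake wholesale than the final text.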
> Or, instead of a surveillance dystopia, we just start having in-person tests again, if we want to know what a person on their own is capable of.
Why would I want to test someone's performance at, say, writing an email without the benefit of things like a spell checker, Grammarly-style tools, or ChatGPT? They'll have access to all of this in the real world.
How about we test realistic scenarios instead? Rather than asking someone to calculate 96*451 by hand, ask them to work out the difference between buying a $700 phone with a $5/month SIM contract vs. a $30/month contract over 3 years, with RPI inflation. Give them a pop quiz to choose which of a dozen screaming offers in the supermarket aisle is the best deal, with 10 seconds to decide; for added realism, throw in a couple of kids causing distractions.
For added credit, have them consider the opportunity cost of that $700 upfront payment, not just what they'd get with it sitting in a bank account. Instead of having them write a 1500-word essay by hand, give them 2 hours to write 1500 words explaining the benefits of X using normal tools: a modern word processor, Google, Wikipedia, ChatGPT, etc. If they don't use the tools at all, you'd probably need to mark them down.
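To be concrete about the phone example, this is roughly the reasoning I'd want tested (the 3% RPI and 4% savings rate are numbers I've picked for illustration, not part of the scenario):

```python
# Illustrative sketch of the phone-contract comparison above.
# Assumed inputs: 3% annual RPI and a 4% savings rate are my own
# picks, purely for the example.

MONTHS = 36
RPI_ANNUAL = 0.03        # assumed inflation rate
SAVINGS_ANNUAL = 0.04    # assumed rate for the opportunity-cost question

def inflated_total(monthly: float, months: int, annual_rate: float) -> float:
    """Sum a monthly payment that rises with inflation each year."""
    total = 0.0
    for m in range(months):
        total += monthly * (1 + annual_rate) ** (m // 12)
    return total

# Option A: $700 phone upfront plus a $5/month SIM-only contract.
option_a = 700 + inflated_total(5, MONTHS, RPI_ANNUAL)

# Option B: $30/month bundled contract.
option_b = inflated_total(30, MONTHS, RPI_ANNUAL)

# Opportunity cost: what the $700 would have earned in a savings account.
foregone_interest = 700 * ((1 + SAVINGS_ANNUAL) ** 3 - 1)

print(f"Option A (buy phone): ${option_a:,.2f} "
      f"(+${foregone_interest:,.2f} foregone interest)")
print(f"Option B (contract):  ${option_b:,.2f}")
```

Whether they do it in code, a spreadsheet, or on the back of an envelope doesn't matter; it's the reasoning I'd want to test, not the long multiplication.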
This is a false dichotomy. I feel like this well-worn talking point might be outliving its usefulness even faster with the invention of LLMs. We teach the mechanics of arithmetic and of spelling and grammar because that's what education is for: teaching how things work so there's an understanding of the fundamental mechanics. It starts with the basics and builds on top of them toward more advanced topics, in order to deliver a well-rounded and deep understanding.

Note that the logical extension of your argument is to let ChatGPT, or the next AI, or the one after that, start explaining your examples too, and let humans ignore everything the computer can do, which now includes writing 1500-word essays. After all, just like spell checkers and calculators, access to AI is what people have in the real world.
We don’t need to choose between teaching multiplication by hand and teaching how interest works, because we already teach both: one in a basic arithmetic class, and the other later in algebra and calculus. Same for spelling vs. grammar vs. writing: spelling and grammar happen in elementary school, and essay writing happens in high school and college. Teachers already allow calculators in algebra and calculus. That has no bearing on whether we should allow calculators in arithmetic class based on the vague notion that having access to calculators at all times is a ‘realistic scenario’.
I feel like this kind of thinking is what leads people to try to cheat in the first place: a lack of understanding of the value of basic skills, and the misguided assumption that learning something only has value if you can demonstrate you’d need to use it all day, every day, in a job right now. The problem is that it doesn’t ever get better or easier if you skip the basic mechanics that computers can do; in fact, it only gets harder to learn the subjects that actually matter, the ones you will be using, when you have foundational gaps.
More than ever, what we are going to need from the education system from here on out is people who know at least a little about what AI is on the inside, just so we can use it effectively, not to mention build it, maintain it, control it, set legal policy, fix it when it’s wrong, etc.
Right. Schools will become more like tech job interviews. Let’s be frank: that’s what higher education is anyway, an employment filter.
So students will whiteboard their term papers and do other absurdities totally removed from any actual real-world job requirements, because in the real world everyone will just use GPT.
How about judging the content, no matter which tools the students are using? If ChatGPT output cannot be distinguished from a student's output, then the respective profession / line of study is soon going to be obsolete anyway. If there is a difference (and there should be!), then the LLM is being used as a tool, which should be allowed for most purposes. Universities don't disallow style- and grammar-correction software like Grammarly either.
What I can see happening in the humanities, at least, is that evaluation will soon move away from subjective and superficial criteria of form and focus on the quality of the content again. (That's me being optimistic. The other alternative is that AI will soon be evaluating the output of other AIs.)
Is Dr. Jared Mumm still employed? How did they ever get a teaching position when they can't spell, conceptualize, or appreciate failure?
I think your solution would work, but the idea of even less privacy, especially around the thought process, is worse than the problem it solves. I'm guessing this will be the solution anyway, and it will send the data to Google and/or Microsoft. Terrible.
They can also just use a computer belonging to any family member. In a better world there would be cameras everywhere, so we could detect a student who isn't studying as much as the results they turn in would require.
Ideally, though, mind probes should come into play to find out whether they really produced the work.
Honestly after some of the ill-advised-in-retrospect all-nighters at university during which I wrote assignments, I don't think a mind probe would be able to tell much at all!
You can actually detect this. Chess.com detects this sort of thing to catch cheating chess players: people who are cheating at chess with a computer have several identifying characteristics even if you ignore the moves themselves, things like the time spent per move, switching to a different window or tab, etc.
You could probably do something similar with recorded sessions like what GP suggested. Even someone doing what you suggested could leave behind a distinct profile in the shape of the session.
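As a toy illustration of what a session-shape check could look like (the features, weights, and thresholds here are invented for the example, nothing like what Chess.com or anyone else actually uses):

```python
# Toy sketch of profiling a recorded writing session. All features and
# weights are made up for illustration.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class SessionEvent:
    t: float          # seconds since session start
    kind: str         # "keystroke", "paste", "focus_lost"
    chars: int = 1    # characters added by this event

def suspicion_score(events: list[SessionEvent]) -> float:
    """Crude heuristic: large pastes and long idle gaps followed by
    bursts push the score up; steady typing keeps it low."""
    pasted = sum(e.chars for e in events if e.kind == "paste")
    typed = sum(e.chars for e in events if e.kind == "keystroke")
    gaps = [b.t - a.t for a, b in zip(events, events[1:])]
    burstiness = stdev(gaps) / mean(gaps) if len(gaps) > 1 else 0.0
    paste_ratio = pasted / max(pasted + typed, 1)
    return 0.7 * paste_ratio + 0.3 * min(burstiness / 5, 1.0)

# A session that is one giant paste after a long idle gap scores high.
events = [
    SessionEvent(0.0, "keystroke"),
    SessionEvent(1.2, "keystroke"),
    SessionEvent(600.0, "paste", chars=1500),
]
print(f"suspicion: {suspicion_score(events):.2f}")
```

The point isn't the exact score; it's that a one-big-paste-after-a-long-gap session looks measurably different from a session where the text accretes through typing and revision.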
[1] https://www.businessinsider.com/professor-fails-students-aft...