The Impact of AI on Computer Science Education (acm.org)
36 points by robin_reala 39 days ago | 62 comments



Intuitive in hindsight, but it wasn't obvious to me before reading:

> One group was allowed to use ChatGPT to solve the problem, the second group was told to use Meta’s Code Llama large language model (LLM), and the third group could only use Google. The group that used ChatGPT, predictably, solved the problem quickest, while it took the second group longer to solve it. It took the group using Google even longer, because they had to break the task down into components.

> Then, the students were tested on how they solved the problem from memory, and the tables turned. The ChatGPT group “remembered nothing, and they all failed,” recalled Klopfer, a professor and director of the MIT Scheller Teacher Education Program and The Education Arcade.

> Meanwhile, half of the Code Llama group passed the test. The group that used Google? Every student passed.


AI absolutely is having a major effect on computer science (and other education domains, of course). But I feel the bigger problem is what AI highlights: the vast majority of students are just in school because it's society's way of making them cogs in the economic growth machine. It's not too meaningful, and there are always jobs they can get that don't require them to be independent, creative, thinking people. I mean, after getting a PhD, I got a job that used about 5% of what I learned in school. Most of the programming I did, I had already learned in high school...

To be honest, I think the world is being so disrupted by AI because, before AI, we paved the way for it by making society operate as if people don't matter, curiosity doesn't matter, and being a thinking individual doesn't matter. (The exception is the 1% of very independent intellectual types who DO think and solve problems, but they are the exception: they have carved out a niche where they can satisfy their intellectual urges, and they generally care enough about their curiosity to have found a place for themselves outside the majority.)


> I got a job that used about 5% of what I learned in school.

1. That's not a bad percentage at all, actually. I would say that is as it should be; it would not make sense for jobs to involve a huge amount of knowledge from multiple fields, unless your job is on a trivia game show or editing an encyclopedia. Although perhaps some jobs are indirectly like that, e.g. science fiction author.

2. You may have used 5%, and the same may be true for most other people - but for each kind of work it's a different 5%.

> Most of the programming I did, I already learned in high school...

Then, either you stayed in high school for decades, or you've done little programming, or your programming is poor, or you're in some negligible statistical margin of people who program very well without any prior experience. Personally, I do quite a bit of programming and I'm still learning / honing my skills after 20 years. (Not to mention how programming languages and paradigms change over time.)


> 1. That's not a bad percentage at all, actually. I would say that is as it should be; it would not make sense for jobs to involve a huge amount of knowledge from multiple fields, unless your job is on a trivia game show or editing an encyclopedia. Although perhaps some jobs are indirectly like that, e.g. science fiction author

Boring though. But university researchers, for example, use a lot more than that. At least when I was in research for a time, I used probably 80% of what I learned, if not more. And now that I've gone independent, I use a lot more too, because I enjoy it.

One of the main reasons why I quit was in fact intellectual boredom.


The truth is that most things we do that society values don't require or benefit from curiosity or being a thinking individual.

I don't think we did anything in particular to make things like this. It's more a natural result of unrelated things in the structure of our society.


Is it possible to develop or learn deeply without curiosity?


Develop? Yes. Learn? No, I'd say. But then you only learn in order to develop, not for learning's sake.


> I think the world is being so disrupted by AI

Is it? I'm seeing a lot of potential for AI/LLMs to disrupt in the future, but I don't really see it happening yet.


5% almost seems like a lot...


ACM used to stand for technical excellence. Now you see an increasing number of articles that are basically mainstream commercials, ranging from defending CoC mobs to "AI isn't that bad, we have to roll over and adapt".

Other institutions like the ACLU have also been hollowed out in a similar manner.


What is a CoC mob? People mobbing others with the Calculus of Constructions?

Edit: Claude tells me CoC means "Code of Conduct".


I'm guessing 'Code of Conduct'. Basically, depending on your leanings, the 'safety' crowd 'mobbing' online to leash any AI R&D, though in practice it is more an 'AI for me but not for thee' lobby.


"I firmly believe AI cannot be fully autonomous … there’s always going to be humans and machines working together and the machine is augmenting the human’s capabilities,”

For now, sure, but 'always'? What is the impossible part, really? What is so unique about human intelligence that it cannot be sufficiently modeled?


I've been seeing similar statements in pretty much every public release or statement about AI, almost as if it's a politeness or political correctness thing - something we must say to avoid offending the audience.


I think humans like to feel special — whatever it is that the speaker can do, a machine "can't", and when it then does do the thing "it was cheating": https://en.wikipedia.org/wiki/Game_Over:_Kasparov_and_the_Ma...


There's always a human involved somewhere, even just as the owner or customer or beneficiary. Someone tells the "machine" what to do.


Some human who at some point told it or a predecessor what to do.

A plane on autopilot can keep going until it runs out of fuel, even if all the occupants have died because of a slow air leak: https://en.wikipedia.org/wiki/Helios_Airways_Flight_522 and https://en.wikipedia.org/wiki/Ghost_plane

I think that still counts as "fully autonomous".

Likewise, if I were to be foolish enough to take some existing LLM, put it in charge of a command line with an instruction such as "make a new AI based on copy of research paper attached, but with the fundamental goal of making many diverse and divergent copies of itself, convert this into a computer virus, and set it loose on the internet", that's "fully autonomous" once set loose.

(Sure, current models will fall over almost immediately if you try that, as demonstrated by that not having been done already, but I have no reason to expect that failing is a necessary thing for AI, only a contingent limit on the current quality of existing models).


A machine can have an owner that gives it tasks, and be fully autonomous.

Full autonomy doesn't necessarily mean self ownership. It can mean that the machine is free and capable of deciding how to perform any task.


I don't know why, but I don't want AI research to completely bridge the gap between artificial and human intelligence.


I stand with Turing Award winner Alan Kay - "The real sciences - chemistry, physics, geography - don't have 'science' in their names. Hobbies aspiring to be sciences do." - and MIT MacVicar Fellow Hal Abelson - "Computer Science is not really very much about computers. And it's not about computers in the same sense that physics isn't really about particle accelerators, and biology is not really about microscopes and petri dishes. It is about formalizing intuitions about process: how to do things."

Far too many students approach Computer Science as if it is a "science" about computers that can be successfully learned by rote memorization and ChatGPT regurgitation. It's not - it's about understanding the art of formalizing an abstract problem and converting it into a form that is computable.


Interesting. But will computer science education be relevant in the next 10 years, when everyone is going to be a prompt engineer? An analogy: we used to memorize 20-30 phone numbers, and with the advent of smartphones we didn't need to anymore, so we stopped, and now we don't even know the numbers of our dear ones. Imagine losing your phone; your entire life will stall for a moment. Similarly with prompts: I feel people will stop understanding the underlying tech and only become dumber and dumber. Won't they?


> But will computer science education be relevant in the next 10 years, when everyone is going to be a prompt engineer?

AI runs on computers, so there will always be someone who needs to understand CPU microcode, firmware, operating systems... and, at a lower level, electronics.


Not many. AI will be designing the chips, running the factories, and writing the firmware and OS. A consequence of the normalization of AI will be the belief that low-level or mission critical programming is too important or complex to allow human beings into the loop, that it would be dangerous for humans to even know about such things.


Didn't it turn out that AIs iteratively trained on AI-created output pretty quickly degenerate into nonsense? That would suggest we'll always need humans to provide training material, and for that training material to be meaningful, it'll have to be created by humans who know what they're doing.


No, that only happens if you don’t filter the output for correctness. Any amount of feedback from reality reverses the effect.
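To make that concrete (a toy sketch of my own, not anything from the article): generate candidate outputs, keep only the ones that pass a reality check, and let only the survivors back into the training pool. Here both the "model" and the check are stand-ins:

    import random

    def noisy_model(a, b):
        """Stand-in for a model: usually right, sometimes confidently wrong."""
        answer = a + b
        if random.random() < 0.3:  # pretend 30% of generations are hallucinated
            answer += random.choice([-2, -1, 1, 2])
        return answer

    def verified(a, b, answer):
        """'Feedback from reality': here the check is just redoing the arithmetic."""
        return answer == a + b

    # Keep only generations that pass the reality check before reuse.
    pool = []
    for _ in range(1000):
        a, b = random.randint(0, 99), random.randint(0, 99)
        ans = noisy_model(a, b)
        if verified(a, b, ans):
            pool.append(((a, b), ans))

    print(f"kept {len(pool)}/1000 generations, every one verified correct")

Every retained example is correct, so iterating on this pool can't compound the model's errors the way unfiltered output would.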


I see. And who will be providing that assessment of correctness? Will it be humans?


Sure, that's one option. If the AI is smart enough, then it could also do its own experiments. Or if it's embodied...


That's an interesting thought.


Do you think that the pachinko-with-words we call "AI" today will be able to run a factory? It can't even do arithmetic...


Today? Maybe not. But I'm talking about the future.

And many factories are almost entirely automated now, anyway. You don't have guys lined up beside a conveyor belt tightening screws like in the 1950s.

Also many manufacturing companies are already using AI[0] and investing in it[1].

So yeah. Laugh all you want. I get it. But companies are willing to move heaven and earth and boil the oceans to make it happen, so if it can, it will.

[0]https://builtin.com/artificial-intelligence/ai-manufacturing...

[1]https://www.azumuta.com/blog/future-of-ai-in-manufacturing/


> pachinko-with-words

Heh. Quite the image.


"Make shit up that sounds plausible" is actually not a useful strategy for running real world operations. Yet so many people are so certain as you are - when all they've seen is a convincing bullshit artist.


>"Make shit up that sounds plausible" is actually not a useful strategy for running real world operations.

That's often how the real world works, though. Rich, powerful people decide this is going to be the future and they throw money and infrastructure at it until it happens or the economy collapses. Lather, rinse, repeat.

I think in the short term you're correct, because AI often doesn't live up to the hype (especially where the arts are concerned), and it may take one or two AI winters and hype cycles to get there, but I've already seen AI do too many "impossible" things for me to be confident that it will remain bullshit forever.


> That's often how the real world works, though. Rich, powerful people

Are not running operations. They hire the people that do.


> That's often how the real world works, though.

No. If your house stands up or the light switches on when you press a button, it is all because someone did their homework and was good at their job.

Sometimes I wonder if people dream of an AI-driven world (not the current AI, but AGI) because it means they will not have to learn anything anymore.


Recursive heuristics cycle in popularity, as corporate boards dream of replacing their biggest operational labor cost with a black box. Flooding the market with half-wits using taxpayer money only bid down "skilled" labor costs 30%, but people soon realized skilled-labor scarcity will always be present in expanding markets.

"the belief that low-level or mission critical programming is too important or complex to allow human beings into the loop"

lol... You are a deadpan comedian, friend, as anyone who knows the details of most modern ML platforms wouldn't ever set foot in a fully ML/AI-powered car. Why? Because it brings unpredictable liabilities caused by countless edge cases.

This is still speculative science-fiction, and the mantra of people trying to hype their stock valuations.

Don't worry about it, friend... The boards are just scared the IT/dev folks will unionize and finally take over planetary operations =3


> AI will be designing the chips, running the factories, and writing the firmware and OS.

TBH, it seems that Windows is already written by AI and even the product manager is an AI. And Crowdstrike also employs AI to deploy its product. /s


> everyone is going to be a prompt engineer

What you overlook and what this article says is that you need to know your shit already to be an effective prompt engineer, and using the prompts instead of an actual learning process to learn said shit does not work.


That is true. But my point is: will people still try to learn about processes and understand the code in the background when they get used to the ease of generating code at the speed of thought?


I can't relate to your analogy; rote memorization is the least interesting and least useful part of knowledge work. Even if a future LLM could recall information with perfect accuracy, it can't make you understand.

> Similarly with prompts: I feel people will stop understanding the underlying tech and only become dumber and dumber. Won't they?

You're probably right, posers will follow the path of least resistance, but others will continue to challenge themselves and build a deep understanding because they have an intrinsic desire to understand how systems, organizations, algorithms, etc., work. I think the second group (experts) will always be in demand.


> Imagine losing your phone; your entire life will stall for a moment

Maybe one should not rely on a single point of failure. You can back up your contacts.


When I was struggling with some areas of maths at uni, I realized, after checking other sources/books, that it was mostly due to how the material was presented to me. After getting a fresh perspective, most of it suddenly became obvious and I had learned it.

I believe people's brains function differently enough that they need, to some extent at least, material somewhat tailored to their "brain type".

LLM AI is great because it can be asked again and again to describe things differently. That's what makes it so powerful as a learning tool.


The interactive element itself could be really powerful. If you could put some restrictions on how much of the answer it can give you all at once, it's the perfect incremental learning tool.
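Something like this rough sketch, say (ask_llm is a hypothetical placeholder, not any real API, and the prompt wording is just mine):

    def ask_llm(prompt: str) -> str:
        """Hypothetical LLM call; swap in whatever client you actually use."""
        raise NotImplementedError

    def incremental_tutor(question: str, max_hints: int = 3) -> None:
        """Reveal the solution in small steps instead of all at once."""
        for level in range(1, max_hints + 1):
            hint = ask_llm(
                f"Question: {question}\n"
                f"Give hint {level} of {max_hints}. Do NOT reveal the full "
                f"solution; nudge the student only slightly further than "
                f"the previous hint."
            )
            print(f"Hint {level}: {hint}")
            if input("Got it now? [y/N] ").strip().lower() == "y":
                return
        print("Out of hints -- try writing up what you have so far.")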


Is CS taught at all? A sincere, even if a bit polemic, question from a not-so-old but still not-so-young architect/sysadmin. I've recently talked with some young PhD students in CS, and well...

- they do not have the most basic scripting skills, meaning that, no matter the language, they can't automate most of their common personal tasks (see the sketch after this list for the level I mean)

- they do not know how to properly typeset documents; they have just some hyper-basic LaTeX/R/Python knowledge, but not enough to quickly produce clean, polished docs

- they do not know the OS they use every day
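
(To calibrate what I mean by "most basic scripting skills": roughly the level of a throwaway tidy-up script like this made-up example, in whatever language you like.)

    from pathlib import Path

    # Tidy a downloads folder: move each loose file into a
    # subfolder named after its extension (pdf/, png/, misc/, ...).
    downloads = Path.home() / "Downloads"
    for f in list(downloads.iterdir()):
        if f.is_file():
            ext = f.suffix.lstrip(".").lower() or "misc"
            target = downloads / ext
            target.mkdir(exist_ok=True)
            f.rename(target / f.name)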

OK, they have some (more or less) solid basic knowledge, even if not well connected enough to form a big picture, but they are definitely unable to understand a not-so-complex whole infra, so they can't even design a basic one. How can they reach a "philosophical level of knowledge" with such competences? How can they design the future if they do not even know enough about the past and the present?

No LLM, even one without hallucinations, without bias in the model and so on, can fill that void. I know we still have no proper CS school in the world, but at least the generations older than mine had comprehensive enough knowledge, and mine has at least learned through experience how to float semi-submerged in technical debt (an apparently politically correct synonym for ignorance, because that's what it is); but current students do not even have enough basic knowledge to form the proper experience for correcting and filling the void.

Just as a stupid example: if I pack a credible speech with a few buzzwords and some known-to-be-true elements here and there, I can fool nearly all of them, almost at PhD graduation, to the point that they can't discriminate truth from fiction. I'm talking about Italian and French grad students, so those typically described as among the most acculturated in the world...


- they do not have the most basic scripting skill

not CS

- they do not know how to properly typeset documents

not CS

- they do not know the OS

not CS

There's a colorable argument to be made that we should fork from CS a new programming-centered major, with the former becoming effectively "mathematics 2", but in most CS courses OS architecture or practical programming isn't really the focus.

NOTABUG.


Glad you said this so I didn't have to.

I'll add that I still think the parent comment has some right to be perplexed. While it's true that these things aren't part of a CS curriculum, you'd normally expect a CS student to be curious about them and able to pick them up quickly. It's okay to be a bit surprised if they haven't, I think?


I think the world has shifted significantly. It might once have been reasonable to expect that people would learn things like the Unix shell or git on their own, when CS was a major for nerds, but realistically these days many people sign up for CS with neither the passion for hacking nor the math chops. It's just a major like any other to them, and they are basically in it for the money. Obviously any personal choice is fine, but we should adjust our priors accordingly.


> you'd normally expect a CS student to be curious about these things

Let me quote a uni mate of mine who stayed on to teach at said uni after we finished:

"When we were studying CS there were X students in a year. X/5 were passioned and very good, 4X/5 were passioned and good and the rest of X/5 were doing ok."

[After our 10-year reunion, when IT is suddenly a high-paying job and all you need to get hired is the ability to write some ifs.]

"Now there are 3X students in a year. X/5 are passioned and very good, 4X/5 are passioned and good, another X/5 are doing ok, and I have no idea why the rest of 2X are here."

CS is kind of like law or medicine now...


Well, while theoretically you are right, in practice you are saying someone can be a philosopher without knowing how to read and write: fundamentally true, but substantially very impractical.

I'm not talking about kernel internals or having digested the Dragon Book; I'm talking about "not much beyond basic desktop computer usage". How can you express ideas properly if, as a researcher, you do not know how to typeset good-looking documents with proper citations (not just a random mass of DOIs you hope others will trust you have read, tested, and used in your paper), written with a certain vision/intent, not just as a +1 to your published-papers count? How can you reason properly if you can't even automate basic tasks, or don't know well enough the tools you use to reason?

Let's say I do not expect an F1 driver to also be a rally champion, but I expect he/she can at least compete better than the average Joe. I do not expect an airline pilot to also be skilled in acrobatic maneuvering, but I definitely expect he/she can fly some basic figures in a small plane. I know that in the USA, MIT has created "The Missing Semester", which is a joke, but not much of one, because the situation is no different there; and that is still alarming. CS is not a pure science; some can be theorists, OK, but they can't be theorists to the point of being unable to do basic practical tasks that are essentially part of many humans' daily life in society.


Alan Turing was a computer scientist, and he did not care for this. I'd imagine he'd struggle a lot in Microsoft Word. He'd probably catch on fire from pure shock.

Computer science is Boolean algebra, linear algebra, theory of computation, automata, algorithms, and data structures. It's not, necessarily, computer programming. That's just the modern implementation.

Naturally schools DO teach computer programming. But past that, not necessarily. I was taught Git and Unix systems, but not every student at every school was.

There are pros and cons to this approach. The biggest pro is that, well, tools change. I don't know how to use punch cards. If my education had really focused on that, the practical "do you know the tools" stuff you're talking about, it would suck for me today. The con is that, as you've noticed, some less motivated students barely know how to practically use a computer.


I understand your point, but Alan Turing was from another time, when documents were typically handwritten by researchers, study happened only through books, there was no internet, and so on; certain "now common" tools simply did not exist in his era. Of course, in his time knowing how to write well was demanded, just as typesetting docs is today. Again, tools change, but... the basic *nix environment was born before me, and even though it has changed, a unixer from the early '80s could still operate with a terminal emulator today; some tools related to specific devices, now gone, would not be there, but the basic knowledge is still valid, and the "adaptation time" for such a time traveler would be very quick, just some days, while the "learning time" of many grad students today is some years before they are really operational... LaTeX has changed much, but old TeX knowledge from the late '80s is still pretty valid today, and so on. The old sh is essentially not there anymore, but most sh scripts, ksh ones, etc. can run under zsh and co. with minimal tweaks.

The tech changes, the tools change, but definitely not that quickly, and the most useful and valid tools tend to change incrementally.

> Naturally schools DO teach computer programming.

Here in the EU, yes; unfortunately they tend to teach useless things. They start from the chemistry of the brick to teach how to build a wall, but never reach the "now, that's how you build a wall" part, even at PhD level. Learning how to play with pointers and implement data structures and sorting algorithms is useful if you'll work as a professional programmer on specific tasks; NOT knowing most ecosystems means being unable to write anything useful, because even a simple automation to scrape some data would take way too much effort given only that basic knowledge.

During my network practical exam (circa 2008, I think) I was given a cap file from some ISP router with some traffic analysis to do. I chose to script a solution in perl, with graphs in graphviz I think. My professor told me to rewrite it in C++ because he "did not know perl"; the script was clean and mostly a wrapper, definitely not an entry for an obfuscated-perl contest, and he asked for C++, not some shell or other high-level language. For what? A bunch of system() calls and text-manipulation boilerplate?
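
(For the curious, the "mostly a wrapper" shape was roughly this; a hypothetical Python re-sketch, assuming the capture has already been dumped to text with one "src dst" pair per line, and piping the output to graphviz's dot:)

    import sys
    from collections import Counter

    # Count src -> dst conversations from a text dump of the capture,
    # then emit a Graphviz DOT graph with edges weighted by packet count.
    pairs = Counter()
    for line in sys.stdin:
        fields = line.split()
        if len(fields) >= 2:
            pairs[(fields[0], fields[1])] += 1

    print("digraph traffic {")
    for (src, dst), n in pairs.most_common():
        print(f'  "{src}" -> "{dst}" [label="{n}", penwidth={1 + n // 10}];')
    print("}")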

> I was taught Git and Unix systems, but not every student at every school was.

So not every student has the basic knowledge to work properly after graduation... Oh, I do not ask for universities to be "professional schools", of course, but at least the bare minimum, which anyway demands YEARS to learn well enough...

Sorry for being so long; my simple point is that most CS students will not be theoretical researchers but SW engineers, generic devs, and so on. OK, there is no academic path for that so far, but at least the basics should be taught anyway, IMVHO.


So what is CS for in the real world?


Computer science is for studying what computation means and how we can describe it, the questions we can frame in terms of computation, how they might be answered, and what that can tell us. Applied computer science and computer engineering use that theory to invent machines that use the principles of computation to do useful work. Software developers use those machines to build solutions for specific problems, sometimes using knowledge of computer science.

Theoretical computer science can inform the use and design of practical tools that apply it, same as any other field. Theory is for asking questions and looking for answers. Application is for understanding real-world implications of the answers. Practice is for putting them to work.

The issue in the US education system is that we have one degree we call "computer science" which teaches all three of those pieces, each of which insists it's the real piece, and the students, faculty, administration, and employers all have very different expectations about what it means to have the degree.


Yeah, this.

Thing is, though, that of those who get CS degrees and stay in that kind of area, 90% are going to work as software engineers, not as theoretical computer scientists or computer engineers.

We really ought to split CS into three different departments: CS, software engineering, and computer engineering. The first one belongs in with the sciences, the last two belong in with the engineering fields. This would leave a much smaller CS department, though, so there's probably entrenched academic interests against this happening.

I use the analogy of chemistry: You have a chemistry department that thinks about the properties of atoms, and where the outer shell electrons go in molecules, and how the reactions occur. And then you have a chemical engineering department that thinks about how to make the stuff in multi-ton quantities without blowing up the city. They cover somewhat related material, but with completely different goals and mindsets.


Maybe the good ones just don't end up working with you?


It's possible, of course, but I imagine I'm not at some extreme of a Gaussian distribution, so... Oh, to be clear, I do not ask for specific high-level skills in this or that; I just expect a grad student in CS to know enough LaTeX to typeset a damn report with reasonable ease, not "eh, I can't, it's not in the Overleaf templates"... Even if he/she does not normally use a certain format, let's say XML, I expect he/she can deal with it if I just say "hey, you can use csplit, it's pretty much the same soup" and leave him/her with a GNU/Linux desktop, a working internet connection, man pages already deployed, etc. That's the level I'm talking about.

In general, I expect a grad student to be able to move semi-autonomously through common IT tasks with just a few tips and limited help.


IMO, AI speeds up work and makes life easier, but it might not necessarily be great for humanity. I have my doubts, and I think only time will tell what the long-term effects are...

I don't know Assembler, but I can effectively use SQL. Do I need to know Assembler? Probably not, but I would certainly be a better programmer in general if I did.

I recently completed additional studies in New Media Art. I noticed that students can use Photoshop fluently, but few of them can correctly draw a human figure anatomically. Do they need this skill? Probably not, but they would certainly be better artists if they had it.

In my opinion, this is a general trend, not necessarily related to AI itself, where technology gives us quick results but paradoxically degrades other skills that were once achieved through hard work.

In a sense, we have an "instant gratification" society today.


AI is changing our understanding of education and, overall, transforming the educational process. It's very difficult to label this process as either bad or good. It's just happening, and we need to adapt to it in a way that brings more benefit and development.


> Then, the students were tested on how they solved the problem

Did they not know about this in advance? It seems obvious that the students who used ChatGPT are not going to remember the code, and that this was set up to give a negative result. The question ought to be whether they learned the concepts that were being taught.


But if they knew the concepts they could have written the code.


The students were allowed to use LLMs, that doesn't mean LLMs wrote the entire codebase.


tldr: Embrace the struggle and purposefully run into it.




