
These types of sites are great. BUT. The concept hasn't properly evolved since the 1970s-80s, when I assume it first became popular.

I think the reason 99% of people study computer science is to get a computer programming job. Almost all programming jobs are actually software engineering jobs these days.

So I believe that at least one third of a degree program like this should actually be software engineering.

And the most important part of software engineering is the outer loop with clients or end users (or some stakeholders). So they should train on multiple project iterations with some external group as customers.

So: the interaction with customers, defining requirements, the basic loop of iterating, maintaining software, evolving designs, managing technical debt, and somewhat larger codebases that make modules/components/classes etc. more important. A lot of that should be integrated, and it should probably even replace some of the lower-level material that would have been much more relevant in the 1980s for most application programmers.

It should be multiple projects that get evolved over the entire course.

Another thing: a key tool to add to an education like this would be active Google searching, and learning how to do that should be at least a small part of the curriculum. As of the last few months, another key tool, at least as important, is ChatGPT.

The technology landscape changes rapidly. Education should keep up. Especially if it's self-education, there's no reason to be stuck in the late 80s or early 90s.




> As of the last few months, another key tool, at least as important, is ChatGPT.

Could you elaborate on this? Why do you think it's such an important tool for software developers, and how would one go about learning or teaching it?

Also, what should be an appropriate interval between the appearance of a technology and its inclusion in a curriculum? Curricula, by their nature, tend to be fundamental and conservative.


Because it can literally write the code for you in many cases. It can help you find bugs in your code or refactor it. Do a code review for you instantly. So even if you believe for some reason that you should never use a tool like this to actually write code, it can make suggestions for improving it, in seconds.

Curricula should not be so conservative in this age of high technology. Especially in a highly technical field.


> Because it can literally write the code for you in many cases. It can help you find bugs in your code or refactor it. Do a code review for you instantly. So even if you believe for some reason that you should never use a tool like this to actually write code, it can make suggestions for improving it, in seconds.

In that case, wouldn't it be more useful to teach students regular software engineering skills that will allow them to check whether the code generated by something like ChatGPT is correct / appropriate / fit for purpose?

I guess I'm wondering what is it that a CS course would need to teach specifically about ChatGPT.


That's a very dangerous tip. Someone who is learning will not understand when ChatGPT is wrong, and neither will ChatGPT itself.


BLTBASS - Blind leading the blind (as a service)... suits this day and age...


From a business perspective, the most valuable aspect of written code is not the code itself, but that you have managed to retain the author.


What? You mean prevent talent from going to the competition?


Agreed, theoretical computer science education can be considered a luxury, as it is not typically required for the average programming job. Even if it is needed, you will still need to invest time in revisiting it. It might be more efficient to learn the theory from scratch as needed, as that time could be better spent learning practical skills if your goal is to find employment.


You can't just learn what you need today without the background culture to understand what you're reading.

Dumb example: counting sort is fast! OK, let's use it. Then your data happens to be an array of 2 elements, -10000000 and +10000000.
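
A minimal sketch of the trap, assuming a textbook counting sort (the implementation details are illustrative):

    def counting_sort(arr):
        # O(n + k), where k is the range of values; great when k is small.
        lo, hi = min(arr), max(arr)
        counts = [0] * (hi - lo + 1)
        for x in arr:
            counts[x - lo] += 1
        out = []
        for v, c in enumerate(counts):
            out.extend([v + lo] * c)
        return out

    # n = 2 but k = 20,000,001: this allocates and walks ~20 million buckets
    # to sort two numbers any comparison sort would handle instantly.
    counting_sort([-10_000_000, 10_000_000])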

Or the classic "let's parse a math expression with regular expressions".
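
And a quick sketch of the regex failure mode (the pattern is a made-up example): balanced nesting is not a regular language, so a pattern that handles a flat expression silently fails the moment expressions nest.

    import re

    pattern = re.compile(r"\((\d+) ([+*]) (\d+)\)")

    print(pattern.match("(1 + 2)"))        # matches
    print(pattern.match("(1 + (2 * 3))"))  # None: nesting breaks the pattern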


I mean, 2-element arrays will be fast to sort.


The 2 element array in my example will be very slow to sort with counting sort, and you didn't notice.

Exactly the point I was trying to make when saying that knowing theory is useful :)


>I think the reason 99% of people study computer science is to get a computer programming job. Almost all programming jobs are actually software engineering jobs these days.

In my recent encounters with newcomers and junior employees, this has been a big commonality.

> And the most important part of software engineering is the outer loop with clients or end users (or some stakeholders). So they should train on multiple project iterations with some external group as customers.

100%. I've become an advocate for this in my recent roles. Something as fundamental as applied communication across different contexts is underrated in a lot of the curricula I've heard about.


>So they should train on multiple project iterations with some external group as customers.

That would almost be a psychology course. Include stakeholders who don't really know what they want, who think they know what they want but constantly "rearrange the furniture," and who try to give you the solution rather than the problem. For this to be effective, it would probably have to be 2 semesters: 1 for theory and 1 for practice. It's a great idea though. Get rid of some of those Calc/Physics classes.

I had a professor tell me one time that the "meeting of the minds" between stakeholders and the developer was the hardest part of building software, and that getting specs was like "pulling teeth." He wasn't wrong.


> So I believe that at least one third of a degree program like this should actually be software engineering.

How about an actual software engineering degree instead?

As others have said, it should teach interfacing with non-tech people, extracting requirements, reacting to changing requirements, and so on. But it should also teach things like source code control systems, bug databases, dealing with large long-lived code bases (go find this bug in this 1,000,000 line code base, say), what languages work best for which kinds of problems, and so on.

Maybe one third of that degree could be classical CS.


>Almost all programming jobs are actually software engineering jobs these days.

Not quite. Nearly all programming jobs are translation jobs, where you take some business requirement and put it into code. Which is why GPT models are going to render most of those jobs obsolete.


Once an AI can just "understand some business requirement," it will probably be able to replace most roles in a company!


That depends on how trivial and stereotypical the tasks fulfilled by that role are. Sure, writing simple functions, accessing databases, and loading or unloading form-based user interfaces are easy enough to specify. These will be automated first.

But designing a user interface that's intuitive or chooses sensible default values, especially one that isn't "typical" (where a business model or user use case already exists), or one that's not trivial to specify, or complex to integrate into a workflow: these use cases will require iteration to specify usably. And revising a specification is an ability where language-based specification tools like GPT have yet to prove themselves, as with activities such as interactive debugging or performance tuning of an implementation.

How do you describe a task to ChatGPT that isn't yet well defined and still requires iterative refinement with the user to nail down? Until ChatGPT can perform Q&A to "fill in the blanks" and resolve the requirements of a custom task, a human in the loop will still be needed.

Think of ChatGPT as a way to code using templates bigger than the ones used today. Initially its templates will be smaller than the average function. It's hard to know how long it will take before its templates grow sufficiently in size and complexity to build a full-blown app, unless that app is pretty trivial and requires no customization. I'm guessing it'll be years before it creates apps.


I think of ChatGPT as a way to dynamically generate Grails scaffolds.

A scaffold is basically a code template that gets you started with writing a particular class so you don't have to start from zero.
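
For the unfamiliar, a sketch of what a generated scaffold might look like (the class and method names are illustrative): a CRUD skeleton you fill in rather than code you write from zero.

    class BookController:
        def index(self): ...             # list all books
        def show(self, id): ...          # display one book
        def create(self, data): ...      # persist a new book
        def update(self, id, data): ...  # apply edits
        def delete(self, id): ...        # remove a book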

ChatGPT is an amazing scaffold generator, which isn't surprising because that is one of the defining features of an LLM. But people extrapolate this and say absurd things that simply trigger my bullshit detector.


But ChatGPT hardly listens to my instructions, and its severe case of Alzheimer's is quite annoying. And even then, who is going to write the prompts? It's not like your project managers/customers actually write clear requirements or have any actual knowledge of how their infrastructure works.

Prompting an AI to write text is also quite a slow back-and-forth process. It took me two hours for a basic class I could have written in 5 minutes, but since the AI can answer with bullshit in five seconds, I am now "obsolete," even though it needed multiple iterations, reading the code was the bottleneck, and I practically did all the work. (I integrated the code into my project and doubled the lines of code myself, because asking it to make the modifications and additions is just way too slow; typing is just too damn fast to bother. Maybe teach your developers touch typing so they don't suck versus AI?)


Translation where the context is a whole app, spread over dozens or hundreds of files with complex interdependencies, is also a form of integration. And ChatGPT doesn't have the context size to do it; maybe the next GPT will. But 4,000 tokens is not enough. 50-100K tokens would be more like it.


Well, yes, it is a stretch to use it on a large codebase, but especially if the files are all relatively small/medium-sized, a directory listing can get you pretty far in selecting the relevant files. You can also do a vector (embedding) search to find relevant functions/files and feed that into the prompt.
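
A minimal sketch of that retrieval step, assuming an embed() helper that wraps whatever embedding API you use (the helper and its signature are hypothetical):

    import numpy as np

    def cosine(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    def top_k_files(question, files, embed, k=3):
        # files: dict of path -> source text; embed: text -> np.ndarray
        q = embed(question)
        scored = [(cosine(q, embed(text)), path) for path, text in files.items()]
        return [path for _, path in sorted(scored, reverse=True)[:k]]

The returned paths (and their contents) are what you'd paste into the prompt as context.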

Also, the OpenAI coding model code-davinci-002 has an 8,000-token max, not 4,000 like text-davinci-003.


You can ask ChatGPT the following right now: "Given this JSON as input, and this JSON as output, write code that transforms the input to the output," and it will get it right. Try it out sometime, and then realize that that is like 80% of backend processing.
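
For concreteness, the kind of transform being described (the input/output shapes here are hypothetical):

    # Input:  {"user": {"first": "Ada", "last": "Lovelace"}, "active": 1}
    # Output: {"name": "Ada Lovelace", "is_active": true}
    def transform(inp: dict) -> dict:
        return {
            "name": f"{inp['user']['first']} {inp['user']['last']}",
            "is_active": bool(inp["active"]),
        }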

There just needs to be more targeted training, and some system built around it to write complete service code, and you could replace a good number of jobs through that alone.


20 YOE. Never had to write code to transform JSON into JSON in a pure way. Maybe XML to XML… once. Good ol' XSLT!

By "pure" I mean with no other requirements, such as accessing another data store or running a bunch of rules decided by having several meetings with various people to find out what is actually required… a bit like what ChatGPT doesn't do.


Ya, it's tricky to map that comment to a useful scenario. Maybe getting a JSON document from an external service and saying "write a function that parses this JSON and returns this TS structure"? But I think that would be a rare efficiency killer for me.


Generally, transforming the data is not the hard bit; it's specifying the shape. You're not replacing any meaningful jobs by getting GPT to automatically translate one schema to another; you're improving the productivity of existing devs.


If it were only that simple. I assure you, those jobs would already be obsolete.


Not always: a well-written software contract will generate additional fees/income when analysts or businesses fail to stipulate all requirements.

On the point of:

> Almost all programming jobs are actually software engineering jobs these days.

That's a very narrow view of computer science. It ignores how software and hardware can be exploited in air-gapped scenarios, exploiting what the military have traditionally called signals intelligence, which is not something taught in any university or online, as far as I'm aware.

Metadata that is undetectable by human senses, because of restrictions like our limited range of audible sound, our inability to detect tiny fluctuations of electromagnetic radiation, and our lack of knowledge of a device's abilities, makes most computer science graduates somewhat blinkered, restricted in perspective, and highly exploitable, with hubris being the number one attribute for most.

IMO computer science should be viewed more as a natural science, incorporating things like physics, biology, psychology, and chemistry along with what's currently taught in a stereotypical CS course. I'm reminded of the fact that my language debugger is an excellent colour-blindness test operating in plain sight. When you become wise to these additional points of interest, you start to separate the wheat from the chaff, who's good and who's not, because Only the Paranoid Survive!


It sounds like you’re considering a very narrow view of computer science.

How many people will come into contact with an air gapped system in their entire lives?


Central heating and hot-water systems, some home security lighting systems, some vehicles, many electronic devices with a CPU of sorts inside. I think it's really quite common when you think about it.

Are you a bot trying to resource burn me? If a bot, would you even know you are a bot?


A system being air-gapped implies a particular security setup that being incidentally not connected to the internet doesn't.

> Are you a bot trying to resource burn me?

Seek help.


When I was young, most systems were air gapped :D


Schizo offering an actual joke to you: drunk drivers don't kill people; drunk crashers kill people.


You've clearly never developed software beyond hello world, or never used ChatGPT.


Funny, to me you are both saying the same thing with slightly different wording.


ChatGPT and outsourcing will take all the jobs. I would not take a CSE course if I had to choose now.


I've been told that my CSE degree was obsolete 10 years ago.

Funny, seems that this take is more obsolete than the degree itself :)


If ChatGPT is so good at taking computer scientists' jobs, then it should also be good at taking lots of other jobs.


> 99% of people study computer science is to get a computer programming job

software engineering: https://amspub.abet.org/aps/name-search?searchType=program&k...

vs computer science: https://amspub.abet.org/aps/name-search?searchType=program&k...


> the basic loop of iterating, maintaining software, evolving designs, managing technical debt, and somewhat larger codebases that make modules/components/classes etc. more important.

Can anyone recommend a good book for this? I'm getting the hang of lots of other "TYCS" topics, but I'm mostly building little toys for myself to try out cool algorithms I see in my books/resources. Mention CI/CD, large module & class design, or other "big software" stuff, though, and I get very intimidated.



