
I'd love to get y'all's feedback on a related question: did you receive any training in your CS degrees that had anything to do with testing, dev-ops, continuous deployment, and the like? I teach in an MIS department and have started to include some of these concepts in the capstone course, which was originally more of a project management course.

From talking to the employers who hire our students (and the CS department's students), I got the impression that CS students are taking four years of programming classes but are never taught how to write tests or even to use version control. Is that the way it is? Is that the way it should be?



I went through Georgia Tech, and there was never any formal introduction to version control. If you're exposed to it, it's through a job or a friend (who in turn picked it up from a friend or their job). If you don't know it exists, you'll never seek it out and learn it.

Testing came up in some courses. We had a software development class that involved UML, TDD, and everything "proper". Some TAs also provide tests as a way of pre-checking your homework: they might give you a basic set of tests for your linked-list homework (adding, removing, etc.), but their grading script will also check off-by-one cases and the like. How to write those tests yourself isn't necessarily taught in those classes.
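To make that concrete, here's a minimal sketch in Python of that split: the basic tests a TA might hand out versus the boundary cases a grading script would add. The module and method names (a `LinkedList` class with `add`, `get`, `remove`) are assumptions for illustration, not any course's real API.

    import unittest

    # Hypothetical student-implemented class; the module and method names
    # here are assumptions for illustration, not any course's actual API.
    from linked_list import LinkedList


    class ProvidedTests(unittest.TestCase):
        """The basic, happy-path tests a TA might hand out with the assignment."""

        def test_add_and_get(self):
            ll = LinkedList()
            ll.add("a")
            ll.add("b")
            self.assertEqual(ll.get(0), "a")
            self.assertEqual(ll.get(1), "b")

        def test_remove(self):
            ll = LinkedList()
            ll.add("a")
            ll.add("b")
            ll.remove("a")
            self.assertEqual(ll.get(0), "b")


    class GradingScriptTests(unittest.TestCase):
        """Boundary cases a grading script might add to catch off-by-one bugs."""

        def test_get_one_past_end_raises(self):
            ll = LinkedList()
            ll.add("a")
            with self.assertRaises(IndexError):
                ll.get(1)  # one past the last valid index

        def test_remove_missing_element_raises(self):
            ll = LinkedList()
            with self.assertRaises(ValueError):
                ll.remove("not there")


    if __name__ == "__main__":
        unittest.main()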

Continuous deployment is never touched. If you co-op/intern you might be exposed to it.

No one that I knew of learned anything related to "ops", unless they happened to take an interest in it.


No chemical engineer I know learned how to walk the assembly line and take samples for quality control, either. And arguably that's a good thing. You pay to go to school to learn things that are useful across the board in many different situations, not to help the local big employer save a few bucks per new hire on induction training.


This is a fair point, but it raises the question: should a CS degree include learning about software project management and software development processes, or not? I teach in the business school (an MIS department), so it's not controversial that we'd teach project management and operations management here. I don't know what the prevailing opinion is in engineering schools.


I think we have reached the point where we should admit that a career in software development requires far more learning than can possibly happen in school (at any level).

The current state of affairs holds that no CS graduate worth their salt should be ignorant of software engineering methodologies (agile, waterfall, etc.) or of specific development processes and tools (source control, peer review, unit testing, etc.). At the same time it holds the more or less contradictory belief that a CS curriculum should be concerned mostly with computer science proper and not say much about the practical side of software development.

The ideal is that a good program will present the student with multiple opportunities to pick up practical skills on the go. No formal class and no grades, but exposure through lab work. The reality is that this requires everyone involved (students, instructors, and various other support roles) to work longer and harder to meet these tacit expectations.


If people's comments here are any indication, yes, SADLY, that is kind of the way it is.

Does it have to be? Definitely not.

There are tons of ways to inject practical skill building moments into an otherwise rigorous and principled CS degree.

For example, professors could require students to turn in their coding assignments by pushing to a Git repo. Since they have to push anyway, they might as well version their work as they go. Boom! Some exposure to version control.

Or professors could provide students with a suite of unit tests that their assignments eventually have to pass. Now students are coding against test cases from the very start of an assignment and probably picking up some practical knowledge about testing frameworks along the way.

Both of these tactics could be injected into just about any course, from Compilers to Data Structures.
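To show how little instructor-side tooling the first tactic actually needs, here's a rough sketch that clones each student's repo at the deadline using ordinary git commands. The roster file name and its one-line-per-student layout are made up for illustration; any Git host would do.

    import subprocess
    import sys
    from pathlib import Path


    def collect_submissions(roster_path: str, dest_dir: str) -> None:
        """Clone each student's repo listed in the roster into dest_dir.

        The roster is assumed to be one line per student: "<username> <repo URL>".
        """
        dest = Path(dest_dir)
        dest.mkdir(parents=True, exist_ok=True)
        for line in Path(roster_path).read_text().splitlines():
            if not line.strip():
                continue
            username, repo_url = line.split(maxsplit=1)
            target = dest / username
            # A plain `git clone` works against whatever host the class settles on.
            result = subprocess.run(["git", "clone", repo_url, str(target)])
            if result.returncode != 0:
                print(f"warning: could not clone {username}'s repo", file=sys.stderr)


    if __name__ == "__main__":
        # e.g. python collect.py roster.txt submissions/
        collect_submissions(sys.argv[1], sys.argv[2])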


This is an orthogonal issue to the one you brought up originally, and on this one I am 100% with you.

In Programming 101, you require all assignments to be turned in by pushing to a <insert your favorite version control system here> repository.

In Data Structures, you require each assignment to include at least 5 versions that show the evolution of the work (not just 4 empty skeletons and one final push of the completed assignment at the end; a small checker sketch for this follows below). Then, as you assess the quality of those versions, you introduce this other soft technology called "peer review".

In later courses, like Compilers, you can have the whole group work on a single code base, each student branching and merging back their own part of the work. That's how you teach working in teams.

etc, etc...
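For the "at least 5 versions" rule above, the check can be a few lines of scripting on the grader's side. A sketch, assuming the submission is already cloned locally; the threshold is a placeholder and the git commands are standard ones:

    import subprocess
    import sys

    MIN_COMMITS = 5  # placeholder policy from the example above


    def commit_count(repo_path: str) -> int:
        """Number of commits reachable from HEAD in the submitted repo."""
        out = subprocess.run(
            ["git", "-C", repo_path, "rev-list", "--count", "HEAD"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip())


    def history_summary(repo_path: str) -> list[str]:
        """One-line commit summaries, oldest first, so a grader can skim the evolution."""
        out = subprocess.run(
            ["git", "-C", repo_path, "log", "--reverse", "--oneline"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.splitlines()


    if __name__ == "__main__":
        repo = sys.argv[1] if len(sys.argv) > 1 else "."
        for line in history_summary(repo):
            print(line)
        n = commit_count(repo)
        if n < MIN_COMMITS:
            sys.exit(f"FAIL: only {n} commits; at least {MIN_COMMITS} expected")
        print(f"OK: {n} commits")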

The choke point here is that instructors themselves are, more often than not, not familiar with these kinds of techniques and technologies. And they are unlikely to learn them without some serious institutional support. Of course the odd extraordinary professor will be a regular contributor to open source projects and the like, but the rest will not know what to do.

If you are interested in pursuing this beyond a simple online discussion, I think that's where you should start: try to talk your local CS department head into this kind of thing and see where you can get from there.


Testing and Version Control: Yes. Far more than we needed to.

Dev-ops and continuous delivery are much harder to learn in a college setting because, IMO, you only start to see the benefits on large, long-term projects or fast-moving, poorly documented ones.



