In the Real World(tm) of building software, you will work on systems built over years by other people, many of whom have long since left the project or company. The level of integration tests, unit tests, documentation, and automation will depend entirely on the culture of the organization.
Databases, low-level I/O, sorting operations, and the like are normally either provided by the standard library or sourced from the community.
Infrastructure will rule your world. Good infrastructure, with smooth provisioning and deployment, will make your day a joy. Poor infrastructure, with lots of manual work and sign-offs from ten degrees of sysadmins, will do the opposite.
You will also work as a member of a team, with a wide variety in terms of background and experience levels, making clear communication, in all forms, an absolutely essential skill.
In college, very little of the above is actually covered.
Sure, you'll learn lots of stuff about operating systems, algorithms, compilers, data structures, and the like. But with little real context for how any of it is actually used.
But the place where computer science education really falls down is collaboration. There aren't many programs, perhaps none, where you are required, as a member of a small team, to maintain and modify code built three years ago by other people, and are graded on your ability to produce work that future students can continue to develop.
Instead, the focus is on solving relatively straightforward problems as quickly as possible, with little or no attention paid to user experience, readability, testing, and any of the other things that make up the "engineering" side of the practice of building software.
This is true. The things of value (from an industry POV) that you actually learn at university are:
1) You learn to learn. You learn to pick up an entirely new area of knowledge (albeit sometimes based on prior learning) in six months to the point where you're proficient enough to pass an exam in it.
2) You learn to work to sometimes-unreasonable deadlines with minimal supervision (at least, if you graduate you've probably learned this).
3) You learn to stick with something for 3+ years even when it stops being fun.
4) You meet a bunch of people who form the start of your professional network.
Anything else you pick up is a bonus, and your first 3-6 months on the job are where you actually learn what you need for that job.
I can’t stand it when an engineer wastes hours looking for a library instead of even considering actually writing the code themselves. Many just can’t, because they’re so accustomed to gluing together other people’s code.
Of course, I’ve worked with academic types who can never stop theorizing long enough to actually be productive, and who have zero discipline.
There has to be a balance. But yes, fresh out of school == fairly worthless for anything I’ve worked on ever.
That's too little theoretical background, not too much practical experience. If you take someone with a good grasp of the basics, you will never remove that grasp by giving them more practical experience.
The choice depends not only on how complex your own code would be, but also how mature and well maintained the library is (e.g. maybe a library that's just right is some solo developer's pet project, whereas a slightly more awkward fit is actively developed by a big company). That is the sort of thing where making the right choice really requires a lot of experience.
There is little point trying to teach work-related skills in a non-work-related environment. If I wanted to teach a bunch of kids how to work, the university system would not be a good starting point. It is the blind leading the blind: potentially literally thousands of people with at best limited work experience, all crammed in together. That is not a sane environment for teaching people how to add value to others' lives through work.
University is supposed to provide networking opportunities, cross-pollinate interesting theoretical ideas that have limited practical application, and provide a feeder ramp into the research community. If universities try to teach employable skills, it will be hugely inefficient compared to teaching people those skills in entry-level jobs.
Anyone trying to learn teamwork in a team of university students is not going to understand teamwork. In real teams it is not unusual to have 1 or 2 centuries of pooled experience in the problem domain being worked on and that creates a completely different dynamic for how the work is divided up and work gets done.
Not to sound flippant, but if this is the case, why do we bother requiring a university degree for really any job?
Or even the college's own toy systems?
Every year from the second year of a computer science or software engineering program onward, students should be required to take one class where they spend at least one full day per week contributing to their choice of one of the following:
(1) An open-source project approved by the instructor.
(2) Production software used within the university, excluding systems dealing with student or faculty PII or sensitive information (grades, schedules, etc). Lots of opportunities here to contribute across the sciences!
(3) Non-commercial software benefitting the community or government at any level (from local to national).
The key difference between university and work is that in the former you are supposed to receive a pretty fine-grained assessment of your own skill, which is isolated and repeatable. In work, it's understood that promotion and slotting is a much more arbitrary affair and the marking criteria may not even be documented, let alone applied consistently. Also if your project fails because of bad work done by others, well, tough luck, that's what your salary is for. In university you're paying to be there, not the other way around, so the amount of random failure tolerable is much lower.
- customer-oriented thinking (you're not working for artistic beauty; balance technical craft against feature-set ROI)
- pragmatism (even though they'll assign a few "solve it, then make it fast" exercises, you don't really get to feel the real pressure of aiming right at a good-enough solution)
- optimization (there aren't enough iterations on a piece of logic to make it faster, or to improve whatever metric you need to optimize)
Things may have changed in college, but when people very early in their careers start with me, one of their first tasks is to book a 15-minute meeting with me.
Learning Outlook/GCal/? and how to find and book a meeting room is a base-level skill.
Like, I'm working with a recent cum laude CS grad, and he was never taught how SSL really works, nor DNS, nor TCP/IP,
which makes it really hard to actually operate in the real world beyond the IDE.
- low level was for the electronics/CPU architecture track
- the things you mention were for the network engineer track
- the mainstream track was application developer, and you get to learn Java and Eclipse
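For what it's worth, the three layers that comment lists (DNS, TCP, SSL/TLS) can each be poked at in a few lines of Python's standard library. This is a minimal sketch, not a tutorial; `example.com` is just a placeholder host:

```python
import socket
import ssl

def resolve(host):
    """DNS step: ask the system resolver to map a name to IP addresses."""
    return sorted({info[4][0] for info in socket.getaddrinfo(host, None)})

def tls_handshake(host, port=443, timeout=5):
    """TCP step: open a connection; TLS step: negotiate a secure channel
    and verify the server's certificate against the host name."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as tcp:
        with ctx.wrap_socket(tcp, server_hostname=host) as tls:
            return tls.version(), tls.getpeercert()["subject"]

# resolve("example.com")       -> exercises DNS
# tls_handshake("example.com") -> exercises TCP connect + the SSL/TLS handshake
```

Seeing those layers called out separately like this, rather than hidden behind an HTTP client, is roughly the gap the comment is describing.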