Candidly, I would say that college does not even provide the base skill-set.

In the Real World(tm) of building software, you will work on systems built over years by other people, many of whom have long since left the project or company. The level of integration tests, unit tests, documentation, and automation will depend entirely on the culture of the organization.

Databases, low-level I/O, sorting operations, and such are nominally provided either by the standard library or sourced from the community.

Infrastructure will rule your world. Good infrastructure, with smooth provisioning and deployment, will make your day a joy. Poor infrastructure, with lots of manual work and sign-offs from ten degrees of sysadmins, will do the opposite.

You will also work as a member of a team, with a wide variety of backgrounds and experience levels, making clear communication, in all forms, an absolutely essential skill.

In college, very little of the above is actually covered.

Sure, you'll learn lots of stuff about operating systems, algorithms, compilers, data structures, and the like, but without any real context for what that stuff is actually used for.

But the place where computer science education really falls down is on the collaboration front. There aren't many programs, perhaps none, where you are required, as a member of a small team, to maintain and modify code built three years ago by other people, and are graded on your ability to produce work that future students can continue to develop.

Instead, the focus is on solving relatively straightforward problems as quickly as possible, with little or no attention paid to user experience, readability, testing, or any of the other things that make up the "engineering" side of the practice of building software.




> Candidly, I would say that college does not even provide the base skill-set.

This is true. The things of value (from an industry POV) that you actually learn at university are:

1) You learn to learn. You learn to pick up an entirely new area of knowledge (albeit sometimes based on prior learning) in six months to the point where you're proficient enough to pass an exam in it.

2) You learn to work to sometimes-unreasonable deadlines with minimal supervision (at least, if you graduate you've probably learned this).

3) You learn to stick with something for 3+ years even when it stops being fun.

4) You meet a bunch of people who form the start of your professional network.

Anything else you pick up is a bonus, and your first 3-6 months on the job are where you actually learn what you need for that job.


Most companies don't give grads the opportunity to learn on the job. They want experienced professionals.


You just mean they don't hire junior developers. Everybody learns on the job, even experienced professionals.


Conversely, too much practical experience can lead to a lack of fundamental understanding, a lack of innovation, and a kind of helplessness.

I can’t stand it when an engineer wastes hours looking for a library, instead of even considering actually writing the code themselves. Many just can’t, because they’re so accustomed to gluing together other people’s code.

Of course, I’ve worked with academic types who can never stop theorizing long enough to actually be productive, and who have zero discipline.

There has to be a balance. But yes, fresh out of school == fairly worthless for anything I’ve ever worked on.


> too much practical experience can lead to a lack of fundamental understanding

That's too little theoretical background, not too much practical experience. If you take someone with a good grasp of the basics, you will never remove that grasp by giving them more practical experience.


I meant in the context of schooling & learning. Coding boot camps can sometimes result in some behaviors that require unlearning.


I often see the opposite: people wasting hours writing code when they could have found the appropriate library with a few minutes of searching. Often the library is actually already installed with the project or even part of the standard library.


The most difficult case is when writing the code from scratch actually is a lot faster than choosing and learning the API for a library, but in the long term takes more effort to maintain (higher TCO).

The choice depends not only on how complex your own code would be, but also how mature and well maintained the library is (e.g. maybe a library that's just right is some solo developer's pet project, whereas a slightly more awkward fit is actively developed by a big company). That is the sort of thing where making the right choice really requires a lot of experience.


I’m mostly talking about trivial things. For example: adding a third-party dependency to evaluate Euclidean distance, i.e. A^2 + B^2 == C^2.
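For scale, here is roughly what that kind of dependency replaces, as a minimal sketch in Python (the function name and the 2D assumption are mine, purely for illustration):

    import math

    def euclidean_distance(ax, ay, bx, by):
        # Pythagoras: c = sqrt(a^2 + b^2), applied to the coordinate deltas
        return math.hypot(bx - ax, by - ay)

    print(euclidean_distance(0, 0, 3, 4))  # 5.0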


Theoretically, is this not all intentional?

There is little point trying to teach work-related skills in a non-work-related environment. If I wanted to teach a bunch of kids how to work, the university system would not be a good starting point. It is the blind leading the blind: there are potentially thousands of people with, at best, limited work experience all crammed in together. That is not a sane environment for teaching people how to add value to others' lives through working.

University is supposed to provide networking opportunities, cross-pollinate interesting theoretical ideas that have limited practical application, and provide a feeder ramp into the research community. If universities try to teach employable skills it will be hugely inefficient compared to teaching people skills in entry-level jobs.

Anyone trying to learn teamwork in a team of university students is not going to understand teamwork. In real teams it is not unusual to have one or two centuries of pooled experience in the problem domain being worked on, and that creates a completely different dynamic for how the work is divided up and how it gets done.


> University is supposed to provide networking opportunities, cross-pollinate interesting theoretical ideas that have limited practical application, and provide a feeder ramp into the research community. If universities try to teach employable skills it will be hugely inefficient compared to teaching people skills in entry-level jobs.

Not to sound flippant, but if this is the case, why do we bother requiring a university degree for virtually any job?


Maybe make them work on open source projects?

Or even the college's own toy systems?


There's a wealth of opportunities here.

Every year from the second year of a computer science or software engineering program onward, students should be required to take one class where they spend at least one full day per week contributing to their choice of one of the following:

(1) An open-source project approved by the instructor.

(2) Production software used within the university, excluding systems dealing with student or faculty PII or sensitive information (grades, schedules, etc). Lots of opportunities here to contribute across the sciences!

(3) Non-commercial software benefitting the community or government at any level (from local to national).


That's exactly what many UROP schemes are (or were; this is anecdotal, after all). (Undergraduate Research Opportunity Placement, I think. Speaking from the UK, but I believe it's a pan-European scheme.)


The problem is then their grade ends up being heavily affected by the quality of work done by other people. It'd be impossible to objectively grade such a task.

The key difference between university and work is that in the former you are supposed to receive a pretty fine-grained assessment of your own skill, which is isolated and repeatable. In work, it's understood that promotion and slotting is a much more arbitrary affair and the marking criteria may not even be documented, let alone applied consistently. Also if your project fails because of bad work done by others, well, tough luck, that's what your salary is for. In university you're paying to be there, not the other way around, so the amount of random failure tolerable is much lower.


Maybe for this it's less about the quality of your code and more about how you do on estimating your time, hitting your schedule, and collaboration metrics (response time, tickets closed, etc.)?


By those metrics many of the industry's top companies and engineers would get a fail grade.


In my case what college didn't teach me:

- customer-oriented thinking (you're not working for artistic beauty; balance the technical art against feature-set ROI)

- pragmatism (even though they'll throw a few "solve it, then make it fast" exercises at you, you don't really get to feel the real pressure of aiming right at a good-enough solution)

- optimization: you don't work through enough variations on a piece of logic to make it faster, or to improve whatever metric you need to optimize


> Candidly, I would say that college does not even provide the base skill-set.

Things may have changed in college, but when people very early in their career start with me, one of their first tasks is to book a 15-minute meeting with me.

Learning Outlook/GCal/? and how to find and book a meeting room is a base-level skill.


college doesn't even provide the basics

like, I'm working with a recent cum laude CS grad, and he was never told how SSL really works, nor DNS, nor TCP/IP

which makes it really hard to actually operate in the real world beyond the IDE
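For concreteness, the three layers named above show up in just a handful of lines: a minimal sketch in Python, with example.com as a placeholder host, indicating where DNS, TCP, and SSL/TLS each come in.

    import socket, ssl

    host = "example.com"                              # placeholder host for illustration
    ip = socket.getaddrinfo(host, 443)[0][4][0]       # DNS: resolve the name to an address
    raw = socket.create_connection((ip, 443))         # TCP: handshake, reliable byte stream
    ctx = ssl.create_default_context()
    tls = ctx.wrap_socket(raw, server_hostname=host)  # SSL/TLS: handshake, certificate check, encryption
    print(tls.version())                              # e.g. 'TLSv1.3'
    tls.close()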


my university split these into different paths.

- low-level was for the electronics/CPU architecture path

- the things you mention were for the network engineering path

- the mainstream was application developer, and you get to learn Java and Eclipse


yeah, except now when you're hired into a full-stack job you're expected to operate, or at least understand, various cloud services and other components



