
> you can throw out everything you learned in class and start over by Googling it and following the best practices of today, like we do for everything else

Yep, everything newer is automatically better. No need to have any grounding in objective utility as long as you're up-to-date on the latest web framework!

> You might have an idea how a basic OS kernel works from your OS class, or how...or maybe...None of that is useful at all, other than maybe to get you interacting with your machine, in the hopes that you'll learn how to use a computer (which is a completely different set of knowledge)

Any decent computer science program is both theoretical and hands-on, with projects that give you direct experience with the concepts you just learned. Not all of us can read Data Structures and Algorithms and implement a search algorithm as an 18-year-old. No, not everyone needs to know this. Yet some people do, and for those people, their work wouldn't be possible without it.
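
For concreteness, the kind of exercise being described here, e.g. binary search straight out of a data structures course, is only a few lines (this is a generic textbook sketch, not any particular course's assignment):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the remaining search window
        if items[mid] == target:
            return mid
        elif items[mid] < target:     # target must be in the upper half
            lo = mid + 1
        else:                         # target must be in the lower half
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # prints 3
```

Simple to state, but getting the boundary conditions right unaided is exactly the kind of thing coursework drills.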

> If you go into your CS degree as a young hacker, adept at nix with a penchant for assembly language, with enough experience to appreciate the CS concepts, you're going to be disappointed. There's a strong chance that you'll know more than the professors.

True, but (especially now) I'd be surprised if this is <2-3% of the CS undergrad freshmen population. I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.

Finally, as others have pointed out, (and as much as I hate admitting to this phrase that was so often spouted out by my college professors, perhaps I have drank the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections and weighing solutions. I found my CS classes (mostly my discrete math/logic classes) were 10x better at making me a general problem solver than my pre-med classes.




> Yep, everything newer is automatically better.

I didn't say newer is better; I'm saying that your recollection of how a compiler worked in university will not be of any use to you if you need to write your own language for production today. I've also never heard anyone say "oh yeah, I did this once in college, here's how you do that".

> For those people who do need to know, their work wouldn't be possible without it.

Who? Seriously, who would ever be employed to author a search algorithm using the knowledge they obtained in a 4-year computer science program? Extremely few people are involved in implementing anything that low-level, and thinking you understand what's going on under the hood is just tricking yourself. For example, your 4-year CS degree holder would surely understand the trade-offs between a linked list and an array, right? Then you find out none of that holds in practice: https://dzone.com/articles/performance-of-array-vs-linked-li...
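
You can see the gap between textbook analysis and measured reality yourself. Here's a rough, admittedly unscientific sketch in Python comparing traversal of a contiguous list against a node-based linked list; the cache-locality effects the article discusses are far more dramatic in languages like C or Java, but the shape of the experiment is the same:

```python
import time

class Node:
    """A minimal singly linked list node."""
    __slots__ = ("value", "next")
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

N = 200_000
array = list(range(N))            # contiguous storage

# Build the linked list back-to-front so it ends up in ascending order.
head = None
for v in reversed(array):
    head = Node(v, head)

def sum_array(xs):
    total = 0
    for x in xs:
        total += x
    return total

def sum_linked(node):
    total = 0
    while node is not None:
        total += node.value
        node = node.next
    return total

t0 = time.perf_counter()
a = sum_array(array)
t1 = time.perf_counter()
b = sum_linked(head)
t2 = time.perf_counter()

assert a == b == N * (N - 1) // 2   # both traversals see the same data
print(f"array: {t1 - t0:.4f}s  linked: {t2 - t1:.4f}s")
```

Both loops are O(n), which is all the classroom analysis tells you; which one is actually faster, and by how much, is something you only learn by benchmarking on a real machine.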

> I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.

I'm not saying only the l33t can do CS, I'm just giving the example that if you actually do know what's going on and you have to take all those classes, it's very obvious that the professors have nothing of value to add that you can't learn faster by yourself online. I grew up cracking windows software for fun and I had to watch a professor stumble through the basics of x86 assembly. It was an obvious waste of time for all parties. Educational resources today are vast, and if people want to learn something, they can just go learn it.

Also, have you looked at professor salaries vs engineer salaries? I know there are some people who teach at night or do it for the passion of it, but the reality is that it's not going to attract the most ambitious minds in our society. Meanwhile, so many major pieces of software are available for free online, and you can actually talk to the teams doing the work, and they'll let you contribute and give you feedback... for free!

> Finally, as others have pointed out, (and as much as I hate admitting to this phrase that was so often spouted out by my college professors, perhaps I have drank the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections and weighing solutions.

This is my primary objection, and perhaps you have drank the Kool-Aid. If you get a CS degree at any major institution, you are not learning how to learn. You're both preventing learning and picking up bad learning habits. Doing a pre-built lab in a CS course is so much worse than contributing to literally anything on github. Studying how algorithms used to work in the 70s is useless compared to diving into any modern piece of code and benchmarking and learning how to optimize.

Look at what is actually in a CS program from a respected school: https://cse.engin.umich.edu/wp-content/uploads/sites/3/2019/...

There's a 4th-year course just called "Algorithms", check it out: http://www.eecs.umich.edu/courses/eecs477/f02/syl.html

They might as well have called it "Inefficient implementations of already solved problems".

I can't imagine why there's a "Databases" class and a "Web Databases" class, but I think you see my point. Nobody is implementing a database with anything they learned in uni, and nobody learns how to use a database better in school than they would with experimentation and online explanation.

Maybe the fact that you're graded on learning these things, coupled with the enormous price, drives a student to independently research each of the topics presented throughout the 4 years? Then, through research and play, they would gain understanding of each topic. But if the school is simply serving as an extremely expensive prompt for self-learning, then what value is the school really adding?

> I found my CS classes were 10x better at making me a general problem solver than my pre-med classes (mostly my discrete math/logic classes)

Every time I hear that reasoning, I think it's a rationalization for spending huge amounts of time and money doing something pointless. I also like "the college experience" as a good reason it was valuable.

Maybe general problem solving is being taught and is valuable, but that's not what these CS degrees are being sold as. To pick on umich again, look at the "Student outcomes": https://cse.engin.umich.edu/academics/undergraduate/computer...

So if you graduate from this program, you'll be able to "Analyze a complex computing problem and to apply principles of computing and other relevant disciplines to identify solutions" and "Apply computer science theory and software development fundamentals to produce computing-based solutions"? Maybe, if you study independently while also stressing about passing your exams and doing your homework, you'll be able to do those things, but you'll get there more slowly than if you just started building software and researching as you went. I just looked at a few of the syllabuses for this program as an example, and there's no way it's going to deliver what they're promising.


> If you get a CS degree at any major institution, you are not learning how to learn. You're both preventing learning and picking up bad learning habits.

What planet am I on right now?


Exactly!


Did you mean to write ">2-3%"?



