Hacker News

The key problem with higher education, I think, is that we force students to take classes in the name of liberal education. If no one is required to take a class, then students might not take it, which calls that professor's usefulness, and perhaps career, into question. If anything, universities are increasing not only the number of classes a student has to take but also the number of seemingly pointless labs and other classes taught by graduate students, so that they can then pay those GTAs. It's almost like a pyramid scheme.

That's a side effect of Caplan's main objection, that a college degree has become a credential, used for signaling, and not indicating any real merit or learning. Once students recognize that (and they have), they put in the bare minimum to get the credential, and administrators respond by forcing new minimums. It's still possible to get a good education if you choose your institution, classes and professors wisely, but it's frustrating for motivated students to be surrounded by slackers. In a good learning environment, you're going to learn as much or more from your peers as from your teacher.

I highly recommend his book: The Case Against Education - https://www.amazon.com/dp/B076ZY8S8J.

An important thing to note is that in many areas college may not be the best way to learn a subject. I personally am fine learning from old blog posts and online documentation for most everything computer-related, and I even slightly prefer it to college classes. However, as you mentioned, I need a credential, so I sit through classes to get it and do near the minimum to keep a good GPA and pass all the classes. There is simply no need to take challenging classes that may not teach exactly what I am interested in, risking upwards of 1,000 dollars on the class and a permanent stain on my credential if I misjudge the difficulty and fail. It is simply less risky, far less stressful, and typically easier for me to avoid challenging myself at college wherever possible. Especially with education as cheap (in many cases free) and accessible as it is online, I find it hard not to view anyone who, instead of slacking, attempts to learn much at college as a fool.

I don't presume my institution is representative (nor that it isn't), but for what it's worth, most students I see are not slackers at all. Students often choose the harder courses, and many of them require a significant amount of work (a typical CS course requires 12 hours of work outside lecture and many require much more).

I wasn't a slacker until I took a computer graphics class that consumed more of my time than my other four classes combined, and I still failed it on a technicality. (I was out for a job interview the day partners were assigned for the final project. I wasn't allowed to do the project alone, because there was an odd number of people in the class. If you fail any "section" of the class, you fail the whole class. After back-and-forth with the professor, I contacted my class dean. He said he'd contact the professor, didn't, told me it was too late to change the grade, and then quit.)

After that I realized grades were fairly arbitrary and explicitly aimed for Ds in classes I didn't care about and settled for whatever in the classes I did care about.

I've been doing graphics since high school, it was one of the few classes I was really excited about, and it's what I'm currently getting paid to do. I've never failed any other class.

In my opinion, the first two years of CS actually matter. Fundamentals like data structures, algorithms, and maybe even operating systems classes are great. Beyond that, most CS programs tend to be severely outdated - you'll learn more from internships and co-ops than another 2 years of classes.

Of course, take that with a grain of salt since I did know exactly what I wanted to do coming into college.

> I still failed it on a technicality (out for a job interview the day partners were assigned for the final project. Wasn't allowed to do the project alone, there were an odd number of people in the class.)

University professor here.

Perhaps it's too late now, but I would encourage you to make a strenuous effort to get your professor in as much trouble with the administration as you can. I don't know who the "class dean" is -- but contact this person's department chair, the departmental undergraduate director, the dean of engineering, the dean of students, the provost, anybody, everybody. Whoever will listen.

What you experienced is not okay. I'm sure it's not an isolated incident, but it's also not the norm.

One thing I learned about the two bad professors in the engineering school I went to was that the chair was beyond sick of their shit.

I had a similar experience when I finally made an effort to get to know the higher-ups in my own department. They often already know who the problem faculty are and are more than happy to help you deal with them. Unfortunately, I think most college students are unaware of how intra-department politics work and/or are just unwilling to escalate things.

Sorry, forgot my audience. :) STEM folks tend not to be slackers (in their subject). Caplan also mentions this is where most of the measurable learning takes place. And there are non-slackers, to greater or lesser degree, in every class. But I saw a lot of them in the required classes ("core" or "diversity"), language classes and business classes. Most slackers self-select out of STEM, except as required.

For a CS degree, the slackers can read the slides at home, do the assignments from home, and only come in to do the exams. You're not going to see them until the graduation ceremony. (At least, I didn't.)

How does doing work remotely make one a slacker (other than not physically transporting yourself)? And if you're capable of completing a course without being present, what does that say about the course and/or your capabilities?

It doesn't; what I meant to say is that "just because you don't see slackers doesn't mean they're not there".

As for the courses and the professors' capabilities, there are a bunch of systemic issues there. One is that "number of students who fail out" is a metric that is used to measure program performance, so first year is usually full of courses that are review for like half a given cohort. Another is that one often gets tenure for research, not teaching ability, so some professors optimize accordingly.

For what it's worth, there are plenty of folks who will tell you that you should "slack off" on a degree and just start a startup instead.

Eric Weinstein talked about the same thing in an interview a while back: “I think that what happened, if we think about it historically, is that we had this beautiful period of fairly reliable, high, evenly distributed technologically led growth after World War II, up until about 1970. We predicated all of our institutions on this expectation that this growth was some sort of normal thing we could depend upon in the future.”

But then, that growth ran out, and as Weinstein says, “We were left with all of our institutions looking in some form or another like a Ponzi scheme.”

Was from here: https://youtu.be/LruYnDjkOgU?t=895

100% this. I think half of my undergraduate degree simply consisted of waste-of-time classes that have absolutely zero benefit to my career. Those courses should be replaced with two things: general skills (like reading skills and math skills up to the algebra level), and general career skills. The latter are the skills your field of study specifically requires - public speaking, laboratory report writing, technical writing, etc. It varies from major to major.

For the general skills stuff, that is all remedial material that should be acquired prior to entering formal college. Students should have mastery of those subjects before even being admitted. Have community colleges do remedial courses exclusively (or offer some way to self-study online and test out).

For those 60 units of old "general education," replace them with 30 units of general career skills and 30 units of applied career skills. So the course track would go like this: the first 30 units are exclusively your general career skills, the next 60 units are the traditional academic track that most people currently take for their major, and the last 30 units are used exclusively for applying those skills to various project-based work.

For all those old courses that are currently considered general education, make them all optional rather than required.

The issue with a liberal arts education is not that certain classes are mandatory: it's that the mandatory classes are simply not useful or general enough.

An easy solution would just be to make replacements: have students take introductory classes like data science, logic and reasoning, ethics, public policy, etc.

The most frustrating thing that happened to me in college was my school fucking up my graduation requirements on their online portal and basically not telling me until my final quarter that I actually had to take three more electives.

I was prevented from graduating on time because I needed to take 9 credits of... whatever. Why? Am I really a better person for being made to spend the extra time and money taking some bullshit class I barely remember?

A professor's usefulness is their research. Teaching is an added benefit for the public. Universities are not trade schools.

That's the Platonic ideal of a professor's utility, yes. In practice it's more of a mixed bag: once you leave the top ~100 colleges, the scales stop being tipped toward research nearly as much.

Not all universities are primarily research institutions. And I think you'll find that the majority of courses are taught by adjuncts or TAs who are being paid exclusively to teach.
