Yes, I think loan policy changes are our most practical knob to turn, since many of the current problems stem from making college appear too cheap through easy loans and deferred payments. Cracking down on for-profit colleges is a good first step, even though we've somewhat slowed down on that lately.

I'd favor a less prescriptive approach than "study this, not that" and a more responsive one: a school's access to loans is in danger if its students' debt-to-income ratios are bad, or they aren't employed, etc. "Employed" is probably fine as long as debt-vs-income is checked too, rather than requiring "employed in the field," as I do find the liberal arts valuable for critical-thinking skills even when they aren't directly tied to specific careers. That's the five-minute version, and it doubtless needs some tweaks, but I wouldn't want to over-target it for today's specifics.
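To be concrete about the shape of that rule, here's a rough sketch in Python; the debt-to-income cap and employment floor used here are made-up numbers purely for illustration, not actual proposed thresholds:

    # Rough sketch of the "responsive" loan-eligibility check described above.
    # The thresholds (0.15 and 0.70) are hypothetical, illustrative values.
    def program_keeps_loan_access(median_annual_debt_payment: float,
                                  median_graduate_income: float,
                                  employment_rate: float) -> bool:
        debt_to_income = median_annual_debt_payment / median_graduate_income
        return debt_to_income <= 0.15 and employment_rate >= 0.70

Note it checks employment in general rather than employment "in the field," per the point about the liberal arts above.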

On the point of whether university education, as currently practiced, is needed more or less:

> The reason employers don't train workers is that it's all risk no reward. If the person drops out without finishing, or worse quits the day after you train them, you lose money. It's easier to just pay high salaries to qualified graduates from normal schools, which is what they do.

That just makes me skeptical that they really need the workers that much, though. Emerging industries can't work like this - someone has to do the work before the universities catch up - so how do we quantify e.g. how desperate we truly are for programmers? Heck, we even have people with degrees we generally don't want to hire because they somehow still fail fizzbuzz! Doesn't that speak terribly of our ability to produce high-demand graduates?

I'm sure there's a middle ground between "the risk of training this person and them leaving is small enough compared to the potential return they'd give us since the field is so ripe" and "nah, we're totally good," but how do we find out where in that space we are?




> Emerging industries can't work like this - someone has to do the work before the universities catch up

Creating something new and teaching people the current state of the art are two separate things. And universities do both of them.

The things invented at Bell Labs in the 1950s are taught in universities today, but the researchers at Bell Labs had college degrees. It's a feedback loop.

> so how do we quantify e.g. how desperate we truly are for programmers?

Median salary is a very strong quantification, especially accounting for sub-specialization. It's possible we need more DBAs but not more web designers.

> I'm sure there's a middle ground between "the risk of training this person and them leaving is small enough compared to the potential return they'd give us since the field is so ripe" and "nah, we're totally good," but how do we find out where in that space we are?

Companies just don't benefit from training at all.

Suppose you pay $150,000 to train someone with the expectation that you'll pay them $50,000/year for five years and the training will allow them to do the work of an $80,000/year employee, so you'll break even (ignoring time value of money). Except that once they can do the work of an $80,000/year employee, they can immediately quit and get a job paying $80,000/year and you can't retain them by paying $50,000/year.
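To spell out that arithmetic, here's a minimal sketch using the hypothetical numbers above, still ignoring the time value of money:

    # Break-even arithmetic for the hypothetical training deal above.
    training_cost = 150_000    # up-front cost to train the hire
    trained_salary = 50_000    # what you plan to pay them per year afterward
    market_value = 80_000      # what their trained work is worth per year
    years_retained = 5         # how long they must stay for you to recoup it

    surplus_per_year = market_value - trained_salary   # 30,000
    recouped = surplus_per_year * years_retained       # 150,000
    print(recouped >= training_cost)   # True, but only if they stay all five years

The catch is exactly that last comment: the moment they're worth $80,000/year on the open market, the five-year retention assumption stops holding.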

The alternative is to loan them the money, so if they leave they still have to pay you back for the training. But now you've just reinvented universities and student loans and companies don't get anything out of doing it in house that they wouldn't from hiring graduates from normal schools who borrow money from normal banks.

Companies know that they aren't any better at being a university than a university is and they aren't any better at being a bank than a bank is, so they don't try to be.


> Emerging industries can't work like this - someone has to do the work before the universities catch up

Often, in fields that are just emerging into commercial viability, the people doing the new work before it gets taught in universities or commercially applied are researchers at universities (professors leading, graduate students often doing the heavy lifting, and undergraduates working in supporting roles), who then also take it to industry for further development, both via publication and through personal contacts and advisory councils.

So, yeah, while someone is doing work before university classroom instruction catches up, a lot of that advance work is done in universities, across the whole professor-to-undergraduate academic hierarchy.



