I attended a state school. I just checked the website and found no promises of a job. In fact, the only pages that include the word "career" are for career services, a free office that helps students and recent graduates find work. I checked a few other area schools, and they make no such promises either. Hardly a comprehensive poll, but...
Where did you get the idea that colleges ever promised jobs after earning a degree?
Every single guidance counselor and parent across the country.
I'm not sure how old you are, but it's borderline brainwashing. My mother literally used to tell me "Work hard in school, so you can go to college, and get a good job." It's not hard to see why people think that way.
Yes, some people go because they want to expand their horizons, learn more about the world, network, and make friends. However, most people go so they can eventually get a job.
I attended a supposed liberal arts school and was NEVER informed of such goals or anything close to them. Nor did I ever feel that my schooling was geared toward, or assisted with, much of that.
Interesting you say that. Around here, the for-profit colleges and cooking schools and whatnot all have fine print in their TV ads that says "XYZ School does not guarantee a job or job placement."